Friedrichshafen, 15 April 2020. CIMON-2, the updated version of the CIMON astronaut assistant, developed and built by Airbus for the German Aerospace Center Space Administration (DLR), has now demonstrated its capabilities during initial tests on the International Space Station (ISS). The free-flying, spherical technology demonstrator with artificial intelligence (AI) showed off a number of its features during interactions with ESA astronaut Luca Parmitano. CIMON-2 started its journey to the ISS on 05 December 2019, launching with the CRS-19 supply mission from the Kennedy Space Center in Cape Canaveral, Florida. It is scheduled to stay on the ISS for up to three years. Just shy of two months after CIMON-2's successful first use, the project team has now received the analysis of its performance.

A number of tests have now been carried out on CIMON-2, for example of its autonomous flight capabilities, voice-controlled navigation, and its ability to understand and complete various tasks. It also managed to fly to a specific point in the ISS Columbus module for the first time. Thanks to its absolute navigation capabilities, CIMON-2 was able to follow verbal commands to move to a particular location, regardless of where it started from. For example, while starting up its new hardware and software, ESA astronaut Luca Parmitano asked CIMON-2 to fly to the Biological Experiment Laboratory (Biolab) inside the Columbus module.
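To make the idea of absolute, voice-commanded navigation more concrete, the following minimal Python sketch shows how a transcribed command could be mapped to a fixed, module-referenced target that does not depend on the robot's current position. The location names, coordinates and interfaces are purely illustrative assumptions, not part of CIMON's actual software.

```python
# Illustrative sketch only: maps a transcribed voice command to a waypoint
# defined in module-fixed coordinates (i.e. "absolute" navigation).
from dataclasses import dataclass
from typing import Optional


@dataclass
class Waypoint:
    x: float  # position in an assumed module-fixed frame, metres
    y: float
    z: float


# Named locations inside a module (hypothetical coordinates for illustration)
KNOWN_LOCATIONS = {
    "biolab": Waypoint(1.2, 0.4, 0.0),
    "workbench": Waypoint(2.5, -0.3, 0.2),
}


def handle_command(transcript: str) -> Optional[Waypoint]:
    """Return the module-fixed target named in a transcribed command.

    Because the target is expressed in absolute (module-fixed) coordinates,
    interpreting the command does not depend on where the robot currently
    is; only the motion planner needs the current pose.
    """
    text = transcript.lower()
    for name, waypoint in KNOWN_LOCATIONS.items():
        if name in text:
            return waypoint
    return None  # the command did not reference a known location


# Example: the same command yields the same target from any starting point.
print(handle_command("CIMON, please fly to the Biolab"))
```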

It was also given the task of taking photos and videos in the European ISS module on request – and then showing these to the astronaut. Using these capabilities, CIMON-2 will be able to help with future scientific experiments on the ISS.  

The microphones of the current version of the technology demonstrator are more sensitive than its predecessor's (CIMON), and it has a more advanced sense of direction. Its AI capabilities and the stability of its complex software applications have also been significantly improved. The degree of autonomy of the battery-powered assistant has been increased by around 30%. Astronauts can also activate a feature on CIMON-2 that allows it to analyse emotion in language and show empathy when interacting with them.

In addition, the project aims to research whether intelligent assistants such as CIMON could help reduce stress. As a partner and assistant, CIMON could support astronauts with their high workload of experiments, maintenance and repair work, thereby reducing their exposure to stress. CIMON lays the foundations for social assistance systems that could reduce stress resulting from isolation or group dynamics during long-term missions. Such systems could also help to minimise similar problems back on Earth.

With the new, improved hardware and complex software working so well, the CIMON team from DLR, Airbus, IBM, Ludwig Maximilian University in Munich (LMU) and the ESA User Support Centre Biotesc in Lucerne (Switzerland) is extremely satisfied with CIMON-2's performance. This continued success of the CIMON project is yet another pioneering achievement in the use of AI in human space flight.

The CIMON ‘family’
The interactive astronaut assistant CIMON was developed and built by Airbus in Friedrichshafen and Bremen on behalf of the German Aerospace Center Space Administration (Deutsches Zentrum für Luft- und Raumfahrt – DLR) and funded by the German Federal Ministry for Economic Affairs and Energy. Watson AI technology from IBM Cloud provides the voice-controlled artificial intelligence. Scientists from the Ludwig-Maximilian University Hospital in Munich (LMU) helped develop and oversee the human aspects of the assistance system. Biotesc at the University of Lucerne ensures that CIMON works perfectly in the Columbus module of the ISS and supports the astronauts' interaction with CIMON from the ground.

Starting in August 2016, an approximately 50-strong project team from DLR, Airbus, IBM and LMU worked on the implementation of CIMON-1 for around two years. The prototype of the technology experiment flew on board the ISS from 02 July 2018 to 27 August 2019, and made its 90-minute debut – a world first – on 15 November 2018 with German ESA astronaut Alexander Gerst. It is no coincidence that CIMON's name is reminiscent of 'Professor Simon Wright', the robotic assistant – or 'flying brain' – from the Japanese science fiction series 'Captain Future'. Following the successful CIMON-1 mission, the first European autonomous robot in human space flight was declared a German cultural asset and returned to Earth. The work on CIMON-2 was completed in less than a year by 20 employees from the CIMON 'family'.

CIMON – the idea
Developed and built in Germany, CIMON is a technology experiment to support astronauts and increase the efficiency of their work. CIMON is able to show and explain information and instructions for scientific experiments and repairs. Voice-controlled access to documents and media is an advantage, as the astronauts can keep both hands free. It can also be used as a mobile camera to save astronaut crew time. In particular, CIMON could be used to perform routine tasks, such as documenting experiments, searching for objects and taking inventory.

CIMON can also see, hear, understand and speak. It can orientate itself using its 'eyes' – a stereo camera and a high-resolution camera that it uses for facial recognition – as well as two other cameras fitted to its sides that it uses for photos and video documentation. Ultrasound sensors measure distances to prevent potential collisions. Its 'ears' consist of eight microphones for identifying the direction of sounds, plus an additional directional microphone to improve voice recognition. Its 'mouth' is a loudspeaker that it can use to speak or play music.

At the heart of the AI for language understanding is IBM Watson AI technology from IBM Cloud. CIMON has not been equipped with self-learning capabilities and requires active human instruction. The AI used for autonomous navigation was provided by Airbus and is designed for movement planning and object recognition. Twelve internal rotors allow CIMON to move and rotate freely in all directions. This means it can turn towards the astronaut when addressed, nod and shake its head, and follow the astronaut – either autonomously or on command.
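As a purely illustrative sketch of how the ultrasound distance sensors described above could feed into collision avoidance, the short Python example below scales a commanded velocity down as the nearest measured obstacle gets closer and stops motion inside a safety margin. The sensor names, thresholds and velocity interface are assumptions made for this example, not CIMON's actual flight software.

```python
# Minimal sketch of a range-based collision guard (assumed thresholds).
from typing import Dict, Tuple

SLOW_DOWN_AT_M = 0.8   # begin reducing speed below this range (assumption)
STOP_AT_M = 0.25       # stop completely below this range (assumption)


def scale_velocity(command: Tuple[float, float, float],
                   ranges_m: Dict[str, float]) -> Tuple[float, ...]:
    """Scale a commanded velocity based on the closest ultrasound range.

    ranges_m maps sensor positions (e.g. 'front', 'left') to measured
    distances in metres. The commanded velocity is reduced linearly as
    the nearest obstacle approaches and zeroed inside the stop margin.
    """
    nearest = min(ranges_m.values())
    if nearest <= STOP_AT_M:
        factor = 0.0
    elif nearest >= SLOW_DOWN_AT_M:
        factor = 1.0
    else:
        factor = (nearest - STOP_AT_M) / (SLOW_DOWN_AT_M - STOP_AT_M)
    return tuple(v * factor for v in command)


# Example: an obstacle 0.5 m ahead reduces the commanded forward speed.
print(scale_velocity((0.1, 0.0, 0.0), {"front": 0.5, "left": 1.4}))
```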