This morning, the Artemis 1 mission was scheduled to depart for the moon, but the flight was postponed after engine problems. When the massive Space Launch System (SLS) does lift off, this uncrewed test mission will advance NASA's Artemis program, which aims to eventually return astronauts to the moon. Orion, the spacecraft atop the SLS for the test, won't have people on board, but three companies — Lockheed Martin, Cisco, and Amazon — have teamed up on a technology package that could give NASA additional information about what happens on board.
The collaboration, a four-year project that bears the name Callisto, was revealed to the public earlier this year. The demo will test whether Alexa can be used to control certain spacecraft functions, like the lights, or provide hands-free telemetry, and whether Webex can deliver a secure and stable connection for videoconferencing and more.
Here’s a look at how the companies got their Star Trek-inspired technology ready for space.
A beta test in space
After being presented with the idea by Lockheed Martin in September 2018, both Cisco and the Alexa team created a game plan and started official work in 2019. From there, both teams developed designs and worked their way through hardware and software issues before putting on the finishing touches.
“When I first started working with Lockheed Martin, I found it interesting that their approach to this project was very customer-centric,” says Justin Nikolaus, Lead Voice UX Designer for Alexa. “Astronauts’ time in space is very expensive, it’s very scripted, and they wanted to make it as efficient as possible. They are looking for different ways for astronauts to interact with the vehicle. Voice is one of those mechanisms.”
From the beginning, the goal was to make the lives of the end users – the astronauts – easier, safer, and more enjoyable. Nikolaus describes the current attempt as “probably a kind of beta test.”
But the journey of adapting Earth technology to space has been fraught with unique challenges.
“At every major milestone, we started getting more hardware. I know modeling, I know machine learning, but space is a whole new frontier for me,” said Clement Chung, manager of applied science at Alexa. “We could plan and guess what it would look like, but when we got the actual hardware and the spacecraft to start testing, a lot of our assumptions went out the window.”
[Related: The big new updates to Alexa, and Amazon’s pursuit of ambient AI]
The acoustic noise in the space capsule, the enclosed environment of the spacecraft, and what astronauts would say or how they would want to interact with Alexa were all factors they had to consider in their initial design. In addition, the team needed to integrate Alexa and the Amazon cloud with NASA’s Deep Space Network (DSN) and add local processing capabilities to reduce latency.
Initially, the team worked to adapt and optimize Alexa’s capabilities for a different audience in a different environment.
Using space sounds provided by Lockheed Martin, and accounting for the materials used on board, they retrained their models because the signal-to-noise ratio was different. Cisco also used its background noise cancellation software to help circumvent this challenge.
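Neither company has published its retraining pipeline, but the general technique the passage describes — mixing recorded background noise into clean speech at a controlled signal-to-noise ratio to produce new training audio — can be sketched like this. The synthetic tone standing in for speech and the SNR value are illustrative assumptions.

```python
import numpy as np

def mix_at_snr(speech: np.ndarray, noise: np.ndarray, snr_db: float) -> np.ndarray:
    """Mix background noise into clean speech at a target signal-to-noise ratio."""
    # Loop or trim the noise clip so it matches the speech length.
    reps = int(np.ceil(len(speech) / len(noise)))
    noise = np.tile(noise, reps)[: len(speech)]

    speech_power = np.mean(speech ** 2)
    noise_power = np.mean(noise ** 2)

    # Scale the noise so 10*log10(speech_power / scaled_noise_power) == snr_db.
    target_noise_power = speech_power / (10 ** (snr_db / 10))
    noise = noise * np.sqrt(target_noise_power / noise_power)
    return speech + noise

# Illustrative data: one second of a 440 Hz tone standing in for speech,
# plus white noise standing in for capsule background sound, at 5 dB SNR.
sr = 16000
t = np.arange(sr) / sr
speech = 0.5 * np.sin(2 * np.pi * 440 * t)
noise = np.random.default_rng(0).normal(0, 0.1, sr)
noisy = mix_at_snr(speech, noise, snr_db=5.0)
```

In a real pipeline, a corpus of utterances would be mixed against many noise clips at a range of SNRs, and the wake-word and recognition models retrained on the result.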
Alexa, the voice assistant, also had to acquire a lot of new knowledge. “Just from a content point of view, I interviewed former astronauts and flight controllers to understand what astronauts would want to ask and how to convey that information,” says Nikolaus. For example, Alexa has access to 100,000 telemetry endpoints on Orion, from the scientific data it collects on its mission, to the temperature and status of various parts of the spacecraft, to its position and flight path.
“I needed to turn that technical data into an easy-to-digest sentence so Alexa could talk to the crew members,” says Nikolaus. “There are many nuances for this environment and this particular user that we needed to learn and explore and make appropriate for the flight ahead.”
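Amazon hasn't described how this translation works internally, but the idea Nikolaus outlines — mapping a raw telemetry reading to an easy-to-digest spoken sentence — can be sketched with simple templates. The endpoint names, values, and phrasings below are all hypothetical, not Orion's real telemetry.

```python
# Hypothetical telemetry endpoints and phrasing templates; the real Orion
# endpoint names and Alexa's actual wording are not public.
TELEMETRY = {
    "cabin_temperature_c": 22.4,
    "battery_charge_pct": 87,
    "distance_to_moon_km": 312_450,
}

TEMPLATES = {
    "cabin_temperature_c": "The cabin temperature is {:.1f} degrees Celsius.",
    "battery_charge_pct": "The batteries are at {} percent charge.",
    "distance_to_moon_km": "Orion is {:,} kilometers from the Moon.",
}

def speak(endpoint: str) -> str:
    """Turn a raw telemetry reading into an easy-to-digest sentence."""
    return TEMPLATES[endpoint].format(TELEMETRY[endpoint])

print(speak("cabin_temperature_c"))  # The cabin temperature is 22.4 degrees Celsius.
```

The hard part Nikolaus describes isn't the templating itself but choosing, for each of the 100,000 endpoints, which details matter to an astronaut and how to phrase them.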
Above the clouds and beyond the cloud
In addition, the team had to work out the larger infrastructure issues. The device had to be robust as a repair team could not easily update it. “When you fly into space, you have to consider radiation, shock, vibration, temperature control and the components within,” says Philippe Lantin, lead solution architect for the Alexa voice service. “You really can’t have batteries, for example. I thought this would be a great learning experience for us to design things a certain way that might be more resilient.”
Once the device was built, they had to figure out how best to split Alexa’s capabilities between the cloud and an on-board computer. “We had to figure out how to get the voice of the VIP who is on the ground to the spacecraft. So we had to develop this technology from scratch,” says Lantin. A major focus of the project was creating a standalone version of Alexa that doesn’t rely on the cloud. “We are only allowed to use a very limited bandwidth for this project. The total time between speaking and hearing an answer can be 13 seconds.”
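Some back-of-envelope arithmetic shows why a cloud round trip is so costly from lunar distance, and why the quoted 13-second figure dwarfs anything on Earth. The average Earth–Moon distance used here is a standard approximation, not a figure from the project.

```python
# Back-of-envelope light-time to the Moon. The distance is the commonly
# cited average; the actual figure varies over the orbit.
SPEED_OF_LIGHT_KM_S = 299_792.458
AVG_EARTH_MOON_KM = 384_400

one_way_s = AVG_EARTH_MOON_KM / SPEED_OF_LIGHT_KM_S
round_trip_s = 2 * one_way_s
print(f"One-way light time: {one_way_s:.2f} s")       # ~1.28 s
print(f"Round-trip light time: {round_trip_s:.2f} s")  # ~2.56 s

# Even at the speed of light, a question-and-answer round trip costs about
# 2.6 seconds before any DSN scheduling, ground processing, or bandwidth
# limits are counted -- which is why the total can stretch to 13 seconds,
# and why a standalone, onboard Alexa was worth building.
```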
Adjusting for bandwidth availability and latency was also an issue Cisco worked to solve. In addition to videoconferencing, Cisco placed cameras around the spacecraft to act as eyes. “Right now we’re firing off megabit-scale video, and that’s just not a reality in space exploration, where we rely on NASA’s deep space network,” said Jono Luk, VP of Product Management at Cisco.
Amazon engineers determined that using a cloud-based service for Alexa, which involves contacting Amazon servers to get an answer, would not be practical in space. So they expanded a feature called local voice control, found in certain Echo family devices, for tasks like getting the time and date and for smart home commands like turning lights on and off and changing their color.
While people on the ground are used to asking Alexa to turn on the lights, in space Alexa will do something similar: it can control the connected devices aboard Orion. There is a set of lights Alexa can control, and Webex cameras will be used to confirm whether Alexa actually turns them on and off.
But Alexa won’t be alone in space the whole time. The final question the team is testing is whether Alexa can pass specific requests back to a secured cloud on Earth for processing. That would come into play if astronauts wanted to ask about sports scores or keep up with everything happening on Earth. “Some things you just can’t know beforehand,” says Nikolaus.
If the tech demo goes well on the uncrewed mission, the companies envision future applications where this technology could help a human crew on board, even with small things like setting timers, schedules, and reminders.
If the Webex technology holds up well, Cisco would consider testing other features, such as annotating images to give astronauts directions from the ground, for example on how to proceed with an experiment in space. Luk is even more ambitious about integrating more immersive experiences into future tests, such as augmented reality or holograms.