NASA’s upcoming Artemis 1 mission, scheduled for launch on Monday (August 29), will carry a wide range of scientific payloads and spacecraft designed to advance our understanding of our corner of the cosmos and how to explore it.
That’s not all the massive Space Launch System rocket and Orion spacecraft will carry, however. Aboard Orion will be a revolutionary new technology demonstrator designed to change the way astronauts interact with their spacecraft: Callisto.
The Callisto system, developed by Lockheed Martin in partnership with Cisco and Amazon, leverages the same voice-activated capabilities offered by consumer virtual assistant technologies such as Amazon’s Alexa or Apple’s Siri. Instead of warning when packages have been delivered or turning lights on and off, Callisto aims to one day provide real-time data, improved connectivity and mission-specific feedback to astronauts aboard future Orion missions.
Related: Artemis 1 CubeSats: The 10 tiny satellites going to the moon on a NASA ride
Callisto, named after a character from Greek mythology who was a devotee of Artemis, the Greek goddess of the moon and the hunt, responds to natural-language voice commands like other Alexa-enabled products, but focuses specifically on the needs of a space crew and their mission controllers on the ground.
The Callisto device fits into a briefcase-style case measuring 45 by 30 cm (about 1.5 by 1 foot) and roughly 12.7 cm (5 inches) deep. Engineers from Lockheed Martin and Amazon had to ensure the device could withstand the vibration and extreme shock of a rocket launch. Additionally, Callisto’s electronics have been hardened against the radiation environment of space.
A NASA statement released in January 2022 says Callisto is “intended to show how commercial technology could assist future astronauts on space missions” and will demonstrate how “human-machine interface technology” can make spaceflight easier, safer and more efficient for astronauts and ground crews alike.
“I can envision a future where astronauts can access information about flight status and telemetry — like spacecraft orientation, water supply levels, or battery voltage status — through simple voice commands,” Howard Hu, associate Orion program manager at NASA’s Johnson Space Center (JSC) in Houston, said in the statement. “Orion is already the most advanced spacecraft ever designed to launch astronauts to the moon, and voice-activated technology could take it to the next level, enabling the interactive computing systems of science fiction spacecraft to become reality for a new generation of explorers.”
According to a Lockheed Martin statement, Callisto will demonstrate “how voice technology, AI and portable tablet-based video conferencing can help improve efficiency and situational awareness for those on board the spacecraft.” In addition, the voice assistant will provide “access to real-time mission information and a virtual connection to people and information on Earth.”
Integrated Cisco Webex video conferencing technology will allow mission leaders at JSC to interact with Callisto from the ground, with “access [to] flight status and telemetry, and the ability to control connected devices aboard Orion,” according to a Cisco statement.
Lockheed Martin says the goal of the Callisto technology demonstration is “to explore how these commercial technologies can assist astronauts on future space missions to the moon and beyond.”
During a NASA conference call on Artemis technology demonstrations and solar system science payloads held on August 16, Rob Chambers, director of commercial civil space strategy at Lockheed Martin, said the Callisto team was inspired by science fiction.
“We’ve looked at what technologies are out there, but also some kind of common vision that we’ve seen with the companies [Cisco and Amazon],” Chambers said. “And I’ll be honest, we picked up the phone and called Amazon and it was one of those ‘Hey, how about flying an Alexa on the other side of the moon’ kind of things, and I was expecting to get a very confused response. But instead they explained that one of the — and this is their story, but they often share it — one of the precursors to Alexa, one of the bots they had, if you will, was the Starship Enterprise’s voice computer.
“Once we started talking to them and talking about some of their local activities here on Earth that require them to run Alexa without internet, we realized that this was a really great test case of what they wanted to do,” Chambers continued. “So the response has been so strong, and the technologies and the knowledge have been, and our kind of commitment to mission and leadership has been really strong from the start, [so] that we just never looked back.”
Unlike Alexa-enabled consumer devices that rely on an internet connection, Callisto will connect to mission controllers via NASA’s Deep Space Network and feature technology called Local Voice Control, which “allows Alexa to process voice commands locally, instead of sending information to the cloud,” according to an Amazon statement.
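The idea behind local processing can be illustrated with a minimal sketch. To be clear, this is a purely hypothetical toy, not Amazon’s actual Local Voice Control implementation (which is proprietary); the intent table and function names here are invented for illustration. Commands with onboard handlers resolve locally with no round trip, and only unmatched requests would need the long-latency deep-space link:

```python
# Toy illustration of "local-first" voice command handling.
# LOCAL_INTENTS and handle_command are hypothetical names, not a real Alexa API.

LOCAL_INTENTS = {
    "what is the battery voltage": "battery_voltage",
    "show spacecraft orientation": "orientation",
    "what is the water supply level": "water_level",
}

def handle_command(utterance: str, link_available: bool) -> str:
    """Resolve a command onboard when possible; otherwise fall back to a
    remote service, but only if the deep-space link is currently available."""
    intent = LOCAL_INTENTS.get(utterance.strip().lower())
    if intent is not None:
        return f"local:{intent}"      # handled onboard, no round trip to Earth
    if link_available:
        return "remote:forwarded"     # would go over the Deep Space Network
    return "error:no-link"            # no local match and no connectivity
```

The design point the sketch captures is simply that a spacecraft assistant cannot assume cloud connectivity: round trips to Earth take seconds and the link may drop entirely, so frequently used commands must work from an onboard lookup alone.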
If current weather forecasts hold and all goes according to plan, NASA’s Artemis 1 mission will launch Monday (August 29) and send an unmanned Orion spacecraft to lunar orbit and back, Callisto and all.