Captain Kirk, Spock, and the rest of the Star Trek crew were in constant dialogue with their starship's onboard computer, asking it questions about the spacecraft and its extraterrestrial surroundings.
With NASA set to revive its crewed space exploration program through Artemis within days, it seems only natural that the real-world astronauts of the 2020s who will crew the upcoming missions would do the same. After all, it could get lonely boldly going where no one has gone before, and an AI companion could be helpful on those long journeys.
When Lockheed Martin, the company that built NASA's new Orion spacecraft, first considered adding a talking computer, engineers figured they could just strap an Amazon Echo Dot and a laptop to the dashboard and call it a day. It wasn't nearly that easy, said Rob Chambers, Lockheed's director of commercial civil space strategy.
Technical limitations aside, they had to overcome the ominous portrayal of an onboard computer in Stanley Kubrick's 2001: A Space Odyssey. Unlike the collegial computer in Star Trek, "HAL" turns on the crew, taking control of the spacecraft and then fighting their attempts to shut it down.
This isn’t just a concern raised by science fiction. This summer, AI developer Blake Lemoine, formerly of Google, went public with his belief that a chatbot he helped create had become sentient. The story sparked a worldwide discussion about whether an artificial intelligence is — or could be — conscious.
William Shatner as Captain James T. Kirk in Star Trek, talking to the starship Enterprise's computer.
Credit: CBS Photo Archive / Getty Images
Such claims reinforce fears long ingrained in popular culture: that one day the advanced technology that lets humans achieve extraordinary things may become too clever, perhaps leading to machines that are self-aware and want to harm people.
"We don't want the HAL 9000: 'I'm sorry, Dave. I can't open the pod bay doors,'" Chambers told Mashable. "That's the first thing everyone said when we first proposed this."
Rather, Lockheed Martin and its partners believe it would be more convenient for astronauts to have a voice-activated virtual assistant and video calls aboard the spacecraft, because those tools give them access to information away from the crew console. That flexibility could even make them safer, engineers say.
An experiment to test the technology will fly on Artemis' first spaceflight, which could launch as early as August 29. Named Callisto, after one of the goddess Artemis' favorite hunting companions in Greek mythology, the project is designed to provide live responses to the crew about the spacecraft's flight status and other data, such as water and battery levels. The technology is paid for by the companies, not by NASA.
A custom Alexa system built specifically for the spacecraft will have access to around 120,000 data readouts, more than astronauts have ever had before, including some bonus information previously available only within Houston's Mission Control.
Howard Hu, NASA assistant Orion program manager, and Brian Jones, Lockheed Martin’s chief engineer for the Callisto project, observe signals from the Orion spacecraft during a connectivity test at NASA’s Kennedy Space Center in Florida.
Photo credit: NASA
On this first mission, no astronaut will actually be aboard Orion, unless the mannequin in the cockpit counts. But the roughly 42-day spaceflight, which will test different orbits and re-entry into the atmosphere, will pave the way for NASA to send a crew on subsequent missions. Whether a virtual assistant gets integrated into the spacecraft for those expeditions depends on a successful demonstration during Artemis I.
To test their Alexa, mission control will use Cisco's Webex video conferencing software, running on an iPad in the capsule, to ask questions and issue verbal commands inside the spacecraft from the ground. Cameras placed throughout Orion will monitor how the system performs.
Most of the time, the virtual assistant will answer questions like "Alexa, how fast is Orion going?" and "Alexa, what's the temperature in the cabin?" The only thing the system can actually control is the lights, said Justin Nikolaus, an Alexa voice designer on the project.
“In terms of controlling the vehicle, we don’t have access to critical components or mission-critical software on board,” Nikolaus told Mashable. “We’re safely housed in a sandbox in Orion.”
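That sandboxing can be pictured as a strict command whitelist: read-only telemetry queries and light control are permitted, and everything else is refused. The sketch below is purely illustrative; the command names and structure are invented for this article and are not the actual Callisto design.

```python
# Hypothetical sketch of a sandboxed voice-command handler.
# The query and control names here are invented, not real Callisto commands.
ALLOWED_QUERIES = {"velocity", "cabin_temperature", "water_level", "battery_level"}
ALLOWED_CONTROLS = {"lights"}  # the only subsystem the assistant may actuate

def handle_command(kind: str, target: str) -> str:
    """Accept read-only telemetry queries and light control; reject all else."""
    if kind == "query" and target in ALLOWED_QUERIES:
        return f"reading {target}"
    if kind == "control" and target in ALLOWED_CONTROLS:
        return f"toggling {target}"
    return "rejected: outside sandbox"
```

Under a design like this, even a misheard or malicious request for a flight-critical system never reaches the vehicle: `handle_command("control", "pod_bay_doors")` simply returns the rejection.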
The space-faring Alexa might not seem so advanced. But the engineers had to figure out how to get the device to recognize a voice inside a tin can. Orion's acoustics, with its mostly metal surfaces, were unlike anything the developers had encountered before. What they learned from the project will now be applied to other challenging sound environments on Earth, such as recognizing speech in a moving car with the windows down, Nikolaus said.
The most significant change from Amazon's off-the-shelf devices is that the system introduces new technology the company calls "local voice control," which allows Alexa to work without an internet connection. Back on Earth, Alexa works in the cloud, running over the internet on computer servers housed in data centers.
In space, with Orion hundreds of thousands of miles away, the time delay in reaching the cloud would be, well, astronomical. Looking further ahead, that delay could stretch from seconds to as much as an hour to relay messages back and forth with a spacecraft en route to Mars, some 96 million miles from Earth.
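Those delays follow directly from the speed of light. A rough calculation, using approximate distances (Mars' distance from Earth varies widely over time), shows why the Moon is workable in near real time while Mars is not:

```python
# Minimum round-trip signal delay, from light travel time alone.
# Distances are approximate illustrations, not mission figures.
SPEED_OF_LIGHT_MILES_S = 186_282  # miles per second

def round_trip_delay_s(distance_miles: float) -> float:
    """Best-case round-trip delay in seconds for a signal over this distance."""
    return 2 * distance_miles / SPEED_OF_LIGHT_MILES_S

moon = round_trip_delay_s(238_900)        # Moon: about 2.6 seconds
mars_near = round_trip_delay_s(96_000_000)   # Mars at ~96M miles: about 17 minutes
mars_far = round_trip_delay_s(250_000_000)   # Mars near its farthest: about 45 minutes
```

These are floors set by physics; real cloud round trips would add ground-network and processing time on top, which is why the onboard, offline approach matters.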
That’s why engineers built a spacecraft computer to handle the data processing, Chambers said.
"It's not canned. It's real real-time processing," he said. "All of that intelligence has to be on the spacecraft, because we didn't want to suffer the time lag of going from the spacecraft back to Earth and back again."
NASA added a new 111-foot beam waveguide antenna to the Deep Space Network at its Madrid ground station in February 2022.
For the questions Alexa can't answer offline, Callisto will use the Deep Space Network, the system of radio dishes NASA uses to communicate with its most distant spacecraft, to relay signals to the cloud on Earth. That connection could let Callisto support a wider range of requests, such as reading the news or reporting sports scores.
Or order more toilet paper and bin liners — seriously.
The designers built in the ability for astronauts to buy things from Amazon. Overnight delivery to the moon won't be an option, but sending flowers to a spouse on Earth for a special occasion would be.
Cisco will also use the Deep Space Network to offer video conferencing. Engineers say astronauts could use the tool for "whiteboarding" sessions with their colleagues in Houston. Imagine how handy that would have been for the Apollo 13 crew, when NASA had to talk them through fitting a square air filter into a round hole without any visual aids.
Transmitting high-resolution video across the solar system is not easy, especially with such limited data capacity. One of the reasons Lockheed Martin chose Cisco as a partner was the company's expertise in video compression, Chambers said. When video travels through space, the data can become garbled, so Cisco has been working on error-correction technology to smooth out the transmissions.
"One of my colleagues at Cisco describes this as trying to do high-bandwidth 4K and Gigabit Ethernet with a dial-up modem from the 1980s," he said. "Obviously the Deep Space Network is very, very powerful, but we're trying to do modern video conferencing."
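Cisco's actual techniques aren't described in detail, but error correction in general works by adding redundancy to a signal so the receiver can repair corrupted bits on its own, without waiting many minutes for a retransmission. As a toy illustration only, a triple-repetition code sends each bit three times and decodes by majority vote, surviving one flipped bit per triplet:

```python
# Toy forward error correction: a triple-repetition code.
# Illustrative only; real deep-space links use far more efficient codes.
def fec_encode(bits: list[int]) -> list[int]:
    """Repeat every bit three times before transmission."""
    return [b for b in bits for _ in range(3)]

def fec_decode(coded: list[int]) -> list[int]:
    """Majority-vote each group of three, correcting one flip per group."""
    return [1 if sum(coded[i:i + 3]) >= 2 else 0 for i in range(0, len(coded), 3)]

sent = fec_encode([1, 0, 1])          # [1, 1, 1, 0, 0, 0, 1, 1, 1]
sent[4] = 1                            # simulate one bit corrupted in transit
assert fec_decode(sent) == [1, 0, 1]   # the original message is recovered
```

The trade-off is exactly the one Chambers describes: redundancy triples the data volume here, which is painful on a link already likened to a dial-up modem, so practical systems use codes that buy much more protection per extra bit.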
To create the custom virtual assistant, staff spent time interviewing astronauts. Among other things, the astronauts wanted a dictation service, Nikolaus said: their notepads and pens tend to float away, and using a computer in a zero-gravity environment is difficult.
“When you walk up to a keyboard and you’re not used to microgravity and you start typing, your force on the keyboard pushes your body away from it,” Nikolaus said.
So: Alexa, can you fly me to the moon?
Yes, if you'd like a little Frank Sinatra crooning through the cabin.
Alexa, can you open or close the cabin doors?
Thankfully, no. The system can't do anything that would put the astronauts at risk, Chambers said.
"We think about it a lot, not necessarily about it becoming sentient and, you know, the rise of the machines, and [becoming] our software overlords," he said.
But software is complex, and strange behaviors can emerge from unexpected interactions between activities, he said: "We build the system in such a way that it is actually not possible for this device to communicate with that other device."
So, if all goes according to plan, perhaps the biggest mischief this real-life HAL could cause is pranking an astronaut's family with an unwanted Amazon Fresh pizza delivery.