Captain Kirk talked to the Enterprise's computer

NASA astronauts on Artemis will be able to talk to their spacecraft's computer, too

Captain Kirk, Spock, and the rest of the Star Trek gang have long talked to the Starship Enterprise's computer, asking it questions about the ship and its alien surroundings.

With NASA resuming its human space exploration program through Artemis in a matter of days, it seems fitting that real astronauts of the 2020s will get the same ability on upcoming missions. After all, boldly going where no one has gone before can be lonely, and an AI assistant might help on those long journeys.

When Lockheed Martin, the company building the new Orion spacecraft for NASA, first dreamed up a talking onboard computer, engineers figured they could just strap an Amazon Echo Dot to the dashboard next to a laptop. But it's not that simple, said Rob Chambers, director of commercial civil space strategy at Lockheed Martin.

In addition to the technical limitations, they also had to overcome the menacing reputation of onboard space computers established by Stanley Kubrick's 2001: A Space Odyssey. Unlike the collegial computer of Star Trek, Kubrick's "HAL" began to malfunction, took control of the spacecraft, and then fought the crew's attempts to shut it down.

This isn't a question raised only by science fiction. This summer, former Google AI developer Blake Lemoine said publicly that he believes the chatbot he helped build has become sentient. The story sparked a global discussion about whether some AIs are, or could ever be, conscious.

William Shatner in Star Trek as Capt. James T. Kirk talking to the Starship Enterprise computer.
Image credit: CBS Photo Archive/Getty Images

The claim reinforced a long-standing fear in popular culture: that the advanced technology enabling humans to achieve extraordinary feats may one day become so smart that machines grow self-aware and want to harm humans.

"We don't want the HAL 9000: 'Sorry, Dave, I can't open the pod bay doors,'" Chambers told Mashable. "That was the first thing everyone said when we first made this proposal."


Instead, Lockheed Martin and its collaborators believe it would be more convenient for astronauts to have voice-activated virtual assistants and video calls in the spacecraft, allowing them to access information away from the crew consoles. That flexibility could even make them safer, engineers say.

An experiment to test the technology will accompany Artemis on its first spaceflight, possibly as early as August 29. Named Callisto after one of Artemis’ favorite hunting companions in Greek mythology, the project aims to provide crews with live answers about the spacecraft’s flight status and other data, such as water and battery levels. The technology is being paid for by the company, not NASA.

A custom Alexa system built specifically for the spacecraft will have access to about 120,000 data readings, more than astronauts have had on board before, including some information previously available only in Mission Control in Houston.

Testing the Callisto payload on Earth

Howard Hu, associate manager of NASA’s Orion program, and Brian Jones, lead engineer for Lockheed Martin’s Callisto program, observe signals from the Orion spacecraft at NASA’s Kennedy Space Center in Florida during a connection test.
Credit: NASA

No astronauts will actually fly on Orion's first mission, unless the mannequins in the cockpit count. But the initial 42-day spaceflight, which will test various orbits and atmospheric reentry, will clear the way for NASA to send astronauts on follow-on missions. Whether virtual assistants are integrated into those crewed spacecraft depends on a successful demonstration during Artemis I.

To test the Alexa system, Mission Control will use videoconferencing software from Cisco Webex to ask questions and give verbal commands inside the spacecraft. Cisco's software will run on an iPad in the capsule, and cameras installed throughout Orion will monitor how it all performs.


In most cases, the virtual assistant will answer questions like "Alexa, how fast is Orion traveling?" and "Alexa, what's the temperature in the cabin?" said Justin Nikolaus, the project's Alexa voice designer. The only thing the system can actually control is the lights.

"As far as control of the vehicle is concerned, we don't have access to any critical components or mission-critical software onboard," Nikolaus told Mashable. "We are safely sandboxed within Orion."

A spaceflight Alexa might not sound all that advanced. But engineers had to figure out how to get the device to recognize speech inside what is essentially a tin can. Orion's acoustics, shaped by its mostly metal surfaces, are unlike anything the developers had encountered before. Lessons from the project are already being applied to other challenging acoustic environments on Earth, such as detecting speech in a moving car with the windows rolled down, Nikolaus said.

The most notable change from off-the-shelf Amazon devices is that the system will feature a new technology the company calls "local voice control," which allows Alexa to work without an internet connection. Back on Earth, Alexa runs in the cloud, relying on internet-connected servers housed in data centers.
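As a rough illustration of that "local" approach, here is a minimal sketch of on-device intent matching: commands are recognized and answered entirely on the spacecraft, with no round trip to cloud servers. The telemetry strings and matching rules are made up for this example; this is not Amazon's actual local voice control implementation.

```python
# Illustrative on-device command handling: every request below is answered
# locally, without any network round trip. Values are hypothetical.

LOCAL_TELEMETRY = {
    "speed": "Orion is traveling at 24,500 miles per hour.",
    "temperature": "The cabin temperature is 72 degrees Fahrenheit.",
}

def handle_utterance(text: str) -> str:
    """Match a spoken request to a locally stored answer."""
    text = text.lower()
    if "how fast" in text or "speed" in text:
        return LOCAL_TELEMETRY["speed"]
    if "temperature" in text:
        return LOCAL_TELEMETRY["temperature"]
    # Anything unrecognized would need the cloud, or a polite fallback.
    return "I can't help with that without a connection to Earth."

print(handle_utterance("Alexa, how fast is Orion traveling?"))
```

The point of the design is that the device never blocks on a distant server for the questions astronauts ask most.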

In deep space, with Orion hundreds of thousands of miles away, the delay in reaching the cloud is, shall we say, astronomical. That lag could range from a couple of seconds when talking to the moon to the better part of an hour for a round trip to Mars, which is about 96 million miles from Earth.
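For a sense of scale, the one-way light-time behind those delays is simple to estimate. The sketch below uses the 96-million-mile Mars figure cited above and an approximate average Earth-moon distance as illustrative inputs:

```python
# Back-of-the-envelope one-way light-time for a radio signal.
SPEED_OF_LIGHT_MILES_PER_SEC = 186_282  # speed of light in vacuum

def one_way_delay_seconds(distance_miles: float) -> float:
    """Seconds for a radio signal to cross the given distance."""
    return distance_miles / SPEED_OF_LIGHT_MILES_PER_SEC

moon = one_way_delay_seconds(239_000)      # approx. average Earth-moon distance
mars = one_way_delay_seconds(96_000_000)   # Mars figure cited in the article
print(f"Moon: {moon:.1f} s one way")       # about 1.3 seconds
print(f"Mars: {mars / 60:.1f} min one way")  # about 8.6 minutes
```

Double those numbers for a round trip, and it is clear why a cloud-dependent assistant would be unusable in deep space.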

That’s why engineers build spacecraft computers to handle data processing, Chambers said.

"This is not canned. This is actual real-time processing," he said. "All this intelligence has to be on the spacecraft, because we don't want to suffer the time delay of going from the spacecraft, back to Earth, and back again."


New radio antenna to support NASA's Deep Space Network

In February 2022, NASA added a new 111-foot beam waveguide antenna to the Deep Space Network at its ground station in Madrid.
Image credit: NASA/JPL-Caltech

For issues that Alexa can’t handle offline, Callisto will tap the Deep Space Network, the radio antenna system NASA uses to communicate with its farthest spacecraft, and route the signal to the cloud on Earth. This allows Callisto to support broader requests, such as reading news or reporting sports scores.

Or order more toilet paper and trash bags – seriously.

Designers built in the ability for astronauts to buy things on Amazon. Overnighting flowers to the moon isn't an option, but sending flowers to a spouse on Earth for a special occasion is.

Cisco will also use the Deep Space Network to provide videoconferencing calls. Astronauts will be able to use the tool to conduct "whiteboard" meetings with colleagues in Houston, the engineers said. Imagine how convenient that would have been for the Apollo 13 crew as NASA tried to tell them how to fit a square air filter into a round hole without any visual aids.

Broadcasting high-resolution pictures across the solar system is not easy, especially with such limited data capacity. One of the reasons Lockheed Martin chose Cisco as a partner was the company's expertise in video compression, Chambers said. As video travels through space, the data can become garbled; Cisco's error-correction technology works to smooth out the transmission.
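Error correction in general works by adding redundancy before transmission so the receiver can detect and repair corrupted bits. As a toy illustration only, and not Cisco's actual technology, a triple-repetition code can fix any single flipped bit per group:

```python
# Toy error-correcting code: send each bit three times, then majority-vote
# on the receiving end. Real deep-space links use far more efficient codes,
# but the principle of recovering data via redundancy is the same.

def encode(bits):
    """Repeat each bit three times."""
    return [b for b in bits for _ in range(3)]

def decode(received):
    """Majority-vote each group of three received bits."""
    out = []
    for i in range(0, len(received), 3):
        chunk = received[i:i + 3]
        out.append(1 if sum(chunk) >= 2 else 0)
    return out

msg = [1, 0, 1, 1]
sent = encode(msg)
sent[4] ^= 1                 # simulate one bit corrupted in transit
assert decode(sent) == msg   # the single-bit error is corrected
```

The trade-off is bandwidth: this scheme triples the data sent, which is why practical systems use much denser codes.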

"One of my colleagues at Cisco called this trying to do 4K, high-bandwidth, gigabit-type Ethernet over a 1980s dial-up modem," he said. "Obviously the Deep Space Network is very, very capable, but we're trying to do modern videoconferencing over it."


To create the custom virtual assistant, the collaborators spent time interviewing astronauts. One of the things they asked for, Nikolaus said, was a dictation service. Notepads and pens tend to float away, and using a computer in a weightless environment is difficult, too.

"If you're using a keyboard and you're not used to microgravity and you start typing, the force you put on the keyboard pushes your body away from it," Nikolaus said.

But: Alexa, can you fly me to the moon?

Yes, if all you want is a little Frank Sinatra crooning in the cabin.

Alexa, can you open or close the pod door?

Fortunately, no. Chambers said the system can't put astronauts at risk.

"We think about it a lot. It's not necessarily that they're going to become sentient, you know, the rise of the machines, and [become] our software overlords," he said.

But software is complicated. “What we’ve done is build systems that make it practically impossible for this device to communicate with another device,” he said.

So, if all goes according to plan, the biggest havoc a real-life HAL could wreak might be pranking astronauts' families with unwanted Amazon Fresh pizza deliveries.
