AI system inspects astronauts’ gloves for damage in real-time

Microsoft and Hewlett Packard Enterprise (HPE) are collaborating with NASA scientists to create an artificial intelligence system for checking astronaut gloves.

Space is a harsh environment, and equipment failures can have disastrous consequences. Gloves are especially vulnerable to wear and tear because they’re used for so many things, including repairing and installing new equipment.

Currently, astronauts send photos of their gloves back to Earth, where NASA analysts evaluate them manually.

“This process gets the job done with the ISS’s low orbit distance of about 250 miles from Earth, but things will be different when NASA once again sends people to the moon, and then to Mars – 140 million miles away from Earth,” explains Tom Keane, Corporate Vice President of Mission Engineering at Microsoft, in a blog post.

Harnessing the power of HPE’s Spaceborne Computer-2, the teams from the three companies are developing an AI system that can quickly detect even small signs of wear and tear on astronauts’ gloves that could end up compromising their safety.

The gloves worn by astronauts are made of five layers and are designed to be durable. The first line of defence is the outer layer, which has a rubber coating for grip. The Vectran® layer, a cut-resistant substance, comes next. The final three layers maintain pressure and provide protection from space’s severe temperatures.

However, space will do everything it can to get past these barriers, and issues may arise once the Vectran layer is reached. Aside from the normal wear and tear that occurs even when wearing gloves on Earth, astronauts’ gloves are exposed to a variety of additional dangers.

Micrometeorites, for example, produce sharp edges on handrails and other components. And because surfaces on the Moon and Mars are not smoothed by natural erosion, rock particles there resemble broken glass rather than sand.

To build the glove analyser, the project team started with photographs of new, undamaged gloves and of gloves showing wear and tear from spacewalks and terrestrial training. Using Azure Cognitive Services’ Custom Vision, NASA engineers combed through the images and tagged the various types of damage.

The data was used to train a cloud-based AI system, and the results matched NASA’s own damage reports. The programme generates a probability score for damage on each glove.

In space, images would be taken of astronauts’ gloves while they remove their equipment in the airlock. These images would then be analysed locally using HPE’s Spaceborne Computer-2 for signs of damage and, if any is detected, a message will be sent to Earth with areas highlighted for additional human review by NASA engineers.
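The airlock workflow described above can be sketched in a few lines of Python. This is purely illustrative: the function and variable names (`inspect_glove`, `predict_damage_regions`, `DAMAGE_THRESHOLD`) are assumptions, not NASA's or Microsoft's actual interfaces, and the stub stands in for the trained vision model running on Spaceborne Computer-2.

```python
# Illustrative sketch of the on-board glove inspection flow.
# All names here are hypothetical, not the real system's API.

DAMAGE_THRESHOLD = 0.5  # assumed probability cutoff for flagging a region


def predict_damage_regions(image):
    """Stand-in for the on-board damage model.

    Returns (region, probability) pairs, one per analysed patch of
    the glove image. A real system would run a trained vision model
    here; this stub just reads scores attached to the input.
    """
    return image["regions"]


def inspect_glove(image):
    """Analyse a glove image locally and build a downlink message.

    Only regions whose damage probability exceeds the threshold are
    highlighted for human review on Earth; if nothing is flagged,
    no message needs to be sent.
    """
    flagged = [(region, p) for region, p in predict_damage_regions(image)
               if p >= DAMAGE_THRESHOLD]
    if not flagged:
        return None  # no damage detected, nothing to downlink
    return {
        "alert": "possible glove damage",
        "highlighted_regions": [region for region, _ in flagged],
        "scores": {region: p for region, p in flagged},
    }
```

The key design point the article highlights is that the analysis happens locally, next to the astronaut, so only the small alert message (not the full image set) needs to cross the link to Earth.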

“What we demonstrated is that we can perform AI and edge processing on the ISS and analyse gloves in real-time,” said Ryan Campbell, senior software engineer at Microsoft Azure Space.

“Because we’re literally next to the astronaut when we’re processing, we can run our tests faster than the images can be sent to the ground.”

The project serves as a great example of the power of combining AI with edge computing in environments with connectivity as limited as space.

Going forward, the project could extend to detecting early damage to other areas like docking hatches before they become a serious problem. Microsoft even envisions that a device like HoloLens 2 or a successor could be used to enable astronauts to visually scan for damage in real-time.

“Bringing cloud computing power to the ultimate edge through projects like this allows us to think about and prepare for what we can safely do next – as we expect longer-range human spaceflights in the future and as we collectively begin pushing that edge further out,” concludes Jennifer Ott, Data and AI Specialist at Microsoft.
