AI system inspects astronauts’ gloves for damage in real-time
Microsoft and Hewlett Packard Enterprise (HPE) are working with NASA scientists to develop an AI system for inspecting astronauts’ gloves.
Space is an unforgiving environment and equipment failures can be catastrophic. Gloves are particularly prone to wear and tear because astronauts use them for just about everything, from repairing hardware to installing new equipment.
Currently, astronauts send images of their gloves back to Earth, where NASA analysts examine them manually.
“This process gets the job done with the ISS’s low orbit distance of about 250 miles from Earth, but things will be different when NASA once again sends people to the moon, and then to Mars – 140 million miles away from Earth,” explains Tom Keane, Corporate Vice President of Mission Engineering at Microsoft, in a blog post.
Harnessing the power of HPE’s Spaceborne Computer-2, the teams from the three companies are developing an AI system that can quickly detect even small signs of wear and tear on astronauts’ gloves that could end up compromising their safety.
Astronauts’ gloves are built to be robust, with five layers. The outer layer features a rubber coating for grip and serves as the first line of defence. Next is a layer of Vectran®, a cut-resistant material. The final three layers maintain pressure and protect against the extreme temperatures of space.
However, space does all it can to breach these defences, and problems arise once the Vectran® layer is reached. Beyond the day-to-day wear that gloves suffer even here on Earth, astronauts’ gloves face a variety of additional hazards.
Micrometeorites, for example, leave numerous sharp edges on handrails and other components. And in destinations like the moon and Mars, the lack of natural erosion means rock particles are more like broken glass than sand.
To create the glove analyser, the project team started with images of new, undamaged gloves alongside gloves showing wear and tear from spacewalks and terrestrial training. NASA engineers went through the images and tagged specific types of wear using Azure Cognitive Services’ Custom Vision.
A cloud-based AI system was trained on this data, and its results were comparable to NASA’s own damage reports. The tool generates a probability score for damage to each area of a glove.
In space, images of astronauts’ gloves would be taken as they remove their equipment in the airlock. These images would then be analysed locally on HPE’s Spaceborne Computer-2 for signs of damage and, if any is detected, a message would be sent to Earth with the affected areas highlighted for additional review by NASA engineers.
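The airlock workflow above boils down to a simple triage loop: score each region of a glove image locally, and downlink only the regions whose damage probability crosses a threshold. A minimal sketch of that logic follows; `damage_probability`, the region names, and the 0.5 threshold are all illustrative assumptions standing in for the trained Custom Vision model, not NASA’s or Microsoft’s actual code.

```python
# Hypothetical sketch of the on-board triage described above.
# The stub model below stands in for the trained glove-damage classifier.

def damage_probability(patch):
    """Stub for the trained model: returns a damage score in [0, 1]
    for one glove-image region. Here: the fraction of 'worn' pixels."""
    if not patch:
        return 0.0
    return sum(1 for px in patch if px == "worn") / len(patch)

def triage_glove_images(patches, threshold=0.5):
    """Run inference locally (as on Spaceborne Computer-2) and return
    only the regions worth downlinking for human review on Earth."""
    flagged = []
    for region, patch in patches.items():
        score = damage_probability(patch)
        if score >= threshold:
            flagged.append((region, round(score, 2)))
    return flagged  # an empty list means nothing is sent to Earth

# Example: two regions, only one shows enough wear to be flagged.
patches = {
    "right-thumb": ["worn", "worn", "ok", "worn"],
    "left-palm": ["ok", "ok", "ok", "worn"],
}
print(triage_glove_images(patches))  # → [('right-thumb', 0.75)]
```

The design point is that the expensive step (inference) and the filtering both happen at the edge, so bandwidth to Earth is spent only on images that actually need human eyes.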
“What we demonstrated is that we can perform AI and edge processing on the ISS and analyse gloves in real-time,” said Ryan Campbell, senior software engineer at Microsoft Azure Space.
“Because we’re literally next to the astronaut when we’re processing, we can run our tests faster than the images can be sent to the ground.”
The project serves as a great example of the power of AI combined with edge computing in environments with connectivity as limited as space’s.
Going forward, the project could extend to detecting early signs of damage to other critical areas, such as docking hatches, before problems become serious. Microsoft even envisions that a device like HoloLens 2, or a successor, could enable astronauts to visually scan for damage in real-time.
“Bringing cloud computing power to the ultimate edge through projects like this allows us to think about and prepare for what we can safely do next – as we expect longer-range human spaceflights in the future and as we collectively begin pushing that edge further out,” concludes Jennifer Ott, Data and AI Specialist at Microsoft.