The expanse of star-filled space seems calm and peaceful from afar, but the reality is that outer space is a harsh and unforgiving environment. Astronauts are relatively safe aboard a spacecraft or the International Space Station (ISS), but from time to time they must venture outside to make repairs or modifications. When astronauts perform extravehicular activities (EVAs), more commonly known as spacewalks, the spacesuit they wear is the only thing separating them from extreme temperatures and inhospitable radiation, so ensuring the integrity of the spacesuit is a matter of life or death. Microsoft and HPE have partnered with NASA to leverage artificial intelligence (AI) to perform crucial spacesuit glove inspections.
Wear and Tear of Spacesuit Gloves
The integrity of the entire spacesuit is crucial, but the part that wears the most and is most likely to have problems is the gloves. When astronauts perform spacewalks, the sharp edges of handrails can cause rips and cuts, and when working with tools and equipment, the area between the thumb and index finger is vulnerable to damage.
A Microsoft blog post explains: “Astronaut gloves have five layers. The outer layer consists of a rubberized coating that provides grip and acts as the first layer of defense. Next comes a layer of a cut-resistant material called Vectran®. The three additional layers maintain the pressure of the suit and protect against extreme temperatures in space, which can range from minus 180 degrees Fahrenheit to 235 degrees Fahrenheit.”
Inspecting Spacesuit Gloves
I had the opportunity to speak with a NASA team working on this issue. Jordan Lindsey is an integrated test specialist and works in the Extravehicular Activity and Human Surface Mobility Program at Johnson Space Center in Houston. Lindsey explained that the existing glove inspection process has been in place for 20 years at NASA.
“The crew, after a spacewalk, will take a series of photos, and this is a very prescribed way of doing it. There is a procedure in place for the crew to take these photos. These photos are then transmitted to the ground, where a team of experts examines them and decides whether the gloves are good and can be reused, or not.” If the gloves are judged unusable, the crew must use a pair of emergency gloves.
The process has been good enough so far, but Lindsey and a colleague began to wonder how the glove inspection process will work as we move away from the ISS and spacecraft in low Earth orbit and travel farther, to the Moon, Mars, and beyond.
Microsoft’s blog post points out: “From Mars, it will take up to 20 minutes to say ‘hello’ to someone on Earth, and another 20 minutes for someone on Earth to say ‘hello’ back. That means it could take at least 40 minutes in total to determine whether an astronaut’s glove is still safe to use, which is just too long to wait.”
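The delay quoted above follows directly from light-speed travel time over the Earth–Mars distance, which varies with the planets' orbital positions. A quick back-of-the-envelope sketch (the distance figures are approximate published values, not from the article):

```python
# Back-of-the-envelope check of the Earth-Mars signal delay.
# Distances are approximate published values; the real figure
# varies continuously with orbital position.
C_KM_PER_S = 299_792.458  # speed of light in vacuum, km/s

def one_way_delay_minutes(distance_km: float) -> float:
    """Light travel time for a radio signal over the given distance, in minutes."""
    return distance_km / C_KM_PER_S / 60

closest = one_way_delay_minutes(54.6e6)   # Mars near closest approach, ~54.6 million km
farthest = one_way_delay_minutes(401e6)   # Mars near farthest separation, ~401 million km

print(f"one-way delay: {closest:.1f} to {farthest:.1f} minutes")
print(f"round trip at farthest: {2 * farthest:.1f} minutes")
```

At the widest separation the one-way delay lands in the low twenties of minutes, which is where the article's "up to 20 minutes each way, at least 40 minutes round trip" figure comes from.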
Automation of glove inspection with AI/ML
Martin Garcia, a computer engineer working in the Chief Information Officer’s Office for Artificial Intelligence and Machine Learning at Johnson Space Center, is one of the driving forces behind the integration of AI and ML at NASA. He shared that he received funding from the digital transformation team and set a goal to integrate Microsoft Azure and Microsoft AI/ML tools. The initial performance and security checks passed with flying colors, so the next step was to identify a project to put these tools to the test.
“I was looking at a lot of proposals, and one that came across my desk happened to be the glove inspection project, which I immediately fell in love with. I thought, ‘It’s mission-driven, and there’s a lot of value to be gained from it.’ So now we’re here. We’ve gotten to the point where, in less than a year from concept to deployment, we’ve deployed to the ISS not once but twice, so we’re very excited about that.”
The “here” Garcia is referring to is that his team worked with Microsoft and HPE to develop a working model to automate the glove inspection process using AI and ML. Part of what enabled such rapid progress and results is that the team already had an existing data set.
Lindsey shared that they were able to use the historical database of photos from previous missions. “There were somewhere between 2,000 and 3,500 original images taken from each of these previous EVAs that helped build the International Space Station. We’re using that as the initial training dataset.”
In addition to the images, NASA also had a record of the reports accompanying each inspection. This allowed them to identify areas of concern and help train the ML model on known-good and known-bad glove images. To expand the dataset, they also edited and manipulated the images by rotating them 90 degrees, blurring parts, flipping them, and more, to create a larger library and ensure that the ML training was as comprehensive as possible.
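The augmentation steps described above (rotations, flips, blurring) are a standard way to multiply a small labeled image set. A minimal sketch of the idea using plain NumPy arrays as stand-in grayscale images; the exact transforms and tooling NASA used are not specified in the article, so this is illustrative only:

```python
import numpy as np

def augment(image: np.ndarray) -> list[np.ndarray]:
    """Return simple variants of a grayscale image array:
    90-degree rotations, horizontal/vertical flips, and a light box blur."""
    variants = [
        np.rot90(image, k=1),   # rotate 90 degrees
        np.rot90(image, k=2),   # rotate 180 degrees
        np.rot90(image, k=3),   # rotate 270 degrees
        np.fliplr(image),       # mirror left-right
        np.flipud(image),       # mirror top-bottom
    ]
    # Naive 3x3 box blur: pad the edges, then average each pixel's neighborhood.
    padded = np.pad(image.astype(float), 1, mode="edge")
    blurred = sum(
        padded[dy:dy + image.shape[0], dx:dx + image.shape[1]]
        for dy in range(3) for dx in range(3)
    ) / 9.0
    variants.append(blurred)
    return variants

img = np.arange(16, dtype=float).reshape(4, 4)
print(len(augment(img)))  # 6 variants per source image
```

Applied to a few thousand original EVA photos, even this handful of transforms yields a training library several times larger, which is the effect the team was after.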
The project is still in the research and development phase. Although the team was able to deploy twice to the ISS and perform AI/ML model testing, the traditional process is still the one the astronauts actually use and rely on. The team simply piggybacks on the same images the crew shares with NASA, allowing them to test the system in a real environment and compare the AI’s results and assessments to the findings of NASA’s human analysts.
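This "piggyback" arrangement is essentially a shadow-mode evaluation: the AI issues verdicts on the same photos the analysts review, and the two sets of calls are compared after the fact. A hypothetical sketch of that comparison step; the verdict labels and function names here are illustrative, not NASA's actual schema:

```python
# Hypothetical shadow-mode comparison: AI verdicts vs. analyst verdicts
# for the same set of glove photos. Labels are illustrative only.
from collections import Counter

def compare_verdicts(ai: list[str], human: list[str]):
    """Tally (ai_verdict, human_verdict) pairs and return the agreement rate."""
    tally = Counter(zip(ai, human))
    agree = sum(n for (a, h), n in tally.items() if a == h)
    return agree / len(human), tally

ai_calls    = ["good", "good", "damaged", "good", "damaged"]
human_calls = ["good", "damaged", "damaged", "good", "damaged"]

rate, tally = compare_verdicts(ai_calls, human_calls)
print(f"agreement: {rate:.0%}")  # 4 of 5 verdicts match here
```

Tracking where the tallies disagree (for example, gloves the AI passed but analysts flagged) is what tells the team whether the model is ready to move beyond shadow mode.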
Microsoft and HPE
Microsoft and HPE are providing the processing power and artificial intelligence platform NASA uses to perform glove inspections aboard the ISS. Microsoft and HPE worked with NASA to develop the Spaceborne Computer-2. This computer aboard the ISS allows astronauts to process data at the edge, 227 miles above the planet, to provide analysis and insights in minutes instead of months.
“NASA is incredibly intentional about which partnerships they talk about publicly, so we’re very happy to be able to share the work we do with NASA,” said Tom Keane, vice president of Azure at Microsoft.
I have spoken with Keane before about the work Microsoft is doing in space in terms of connectivity and analytics. This latest news is all about developers, and about expanding and improving what developers can do from space.
Keane told me, “The way I describe Azure Orbital, which is our space technology, is as our cloud fabric in space. I think a year ago I was talking about space as an edge scenario. It absolutely is, but as connectivity continues to grow, it’s always connected, and it’s really a cloud fabric that operates in orbit.”
Keane emphasized that Microsoft is not a “space company”. He explained that what makes Microsoft unique is its focus on partnering to create technologies that all other companies can use to build their solutions. “We are a technology company powering a group of space companies.”
To infinity and beyond
With visions of a manned mission to Mars on the horizon, there’s a limited timeframe to get these things up and running, but it’s still pretty early in the process. Fortunately, progress looks promising.
For now, this project is still in R&D. NASA will continue to test the glove inspection AI in parallel with the existing manual process. There is no specific timeline for the changeover. The team plans to keep moving forward with the initiative, with the eventual goal of operationalizing the technology. At the same time, Garcia and his team are exploring other opportunities to integrate AI and ML to streamline operations and support the mission, both in the near term and as we venture to Mars and beyond.
Keane summed it up nicely. “The reason space is so exciting is that you look at a lot of these scenarios and how technology can completely change them. Then you look at the foundation that you’re building on and its vintage. There are some pretty amazing possibilities as more digital technology comes into these space scenarios to completely reinvent and rethink the way things are done, and I think we’re very early in this modernization, if you will, of space. I really think we’re just getting started.”