University of Houston’s Extended Reality Simulation Advances Space Architecture Education
- The simulation is the first in the world to combine real-time virtual reality in a greenroom with traditional space simulation.
- The extended reality (XR) frame simulates three methodologies: design validation, regular habitat operations, and regular and emergency outside operations.
- Sasakawa International Center for Space Architecture Director Olga Bannova said the goal is for the simulation to become part of every space architect’s thesis defense.
Students studying space architecture at the University of Houston (UH) are now utilizing a revolutionary advancement in their curriculum: technology that allows them to accurately design, build, and operate structures on Mars and the moon from a lab on Earth.
Three researchers at the university’s Sasakawa International Center for Space Architecture (SICSA) operate the world’s only space simulation frame that marries the physical and virtual worlds to replicate space environment conditions. Students can test designs and operations with altered visuals and even gravity.
BestColleges spoke with the team to find out what the frame is capable of, how it could change the future of space architecture, and how it’s taught in the classroom.
What Does the Extended Reality (XR) Frame Simulate?
The frame supports three space architecture simulation methodologies: design validation, regular habitat operations, and regular and emergency outside operations.
Validation simulations test whether a design works within a simulated environment.
Olga Bannova, the director of SICSA and of the world’s only master of science program in space architecture, said the team wanted to see how efficient a design would be for offloading cargo from a rover to the habitat to a cargo airlock and then back to the primary habitat.
“It’s very efficient when you design, let’s say, a space station or commercial station as we’re talking about tourism and talking about tourists, even going, doing [extravehicular activities],” Bannova said.
It’s a delicate job to identify the features that best enhance the simulation and then integrate their physical and virtual characteristics, added Vittorio Netti, a 2021 graduate of UH’s space architecture program who stayed at SICSA as the researcher responsible for XR development.
Students build the primary living habitat in virtual reality (VR), since constructing it physically would take too much time and money.
The team can take physical objects and digitally bind virtual objects to trackers on the physical objects. For example, a virtual drill can be mapped onto a prop to “take a sample” of the moon’s surface.
The second methodology is routine operations — doing everything from eating to sleeping to rearranging the primary habitat.
The third methodology uses the spacesuit and a crane that lifts the user in the air. It simulates any outside operation, maintenance, or emergency.
The team has simulated low Earth orbit, lunar orbit, the lunar surface, Martian orbit, and the Martian surface. Within those environments, students can also simulate Martian dust storms or sunlight so bright it is nearly blinding.
How the Extended Reality Frame Works
Netti said the easiest and fastest way to test whether certain simulation features will work is to take existing software and tailor it to your specific needs.
“I was amazed how many of the technologies that I wanted to build were already existing in a different form,” Netti told BestColleges.
“And so it was all a lot of work refitting, repurposing those technologies. And it’s crazy for me how many of these solutions — for example, internal especially of software, but also some parts of the hardware — they directly come from the entertainment and gaming industries.”
Netti said it is fascinating how new technologies have gotten cheaper as coding and VR have become mainstream, and how game developers and streamers have helped those tools spread.
The first simulations were standing simulations on Mars or the moon, where the user moved around the environment with controllers. It was immersive, but not as immersive as the crane.
Users in crane simulations hang belly-down and, with tools and mockups, maneuver and work while balancing, explained Paolo Mangili, an Italian international student in space architecture. He started UH’s program in 2022 and has been a research assistant since this spring.
The bulky suit that a user wears simulates the inhibited movement and range of motion astronauts experience, he said. It’s not meant to be pretty; it just needs to mimic the real thing.
Before the first crane test, Mangili, whose primary responsibilities relate to modeling and prototyping, worried the suit would be uncomfortable. He had to come back down after only five minutes, and the second time, he got dizzy. He hadn’t even put on the spacesuit yet, a heavy and constricting mockup secured by a separate harness to distance it from his body.
“I thought to myself, this is going to kill me.”
However, it was much more comfortable than he expected, and the suit’s bulkiness, combined with the virtual reality goggles, made for an overwhelmingly immersive experience.
“So the whole idea is actually to trick yourself, your mind, that at least you can simulate that to the microgravity conditions,” Bannova told BestColleges.
“Again, we cannot turn off gravity even underwater; it’s still gravity, right? But we combine this virtual input that the brain receives with the feeling that you don’t have, this usual support in your feet when you stand up.”
The crane idea stemmed from a technique NASA has been using for decades, but the UH team is the first to have a crane combined with a virtual reality simulator housed inside a greenroom where Netti can alter the background and have a partial view of the “astronaut” in the simulation.
“Design is always about combining things. It’s not always to design things from scratch,” said Mangili. “We always have the picture of the scientist that invents and creates everything from nothing.”
Bannova added, “But, it’s also using things beyond their obvious or original intention. Yes. So that’s part of the creative approach.”
How the Extended Reality Frame Started
The simulation frame is a spinoff from a study for Boeing about lunar surface architectures and mission operations analysis in tight environments. The team proposed using virtual reality and mixed reality (MR) as design evaluation tools.
Bannova said Netti was the principal engineer, and Mangili joined later when Boeing approved and sponsored the next stage of their studies.
Netti said XR is a broad family of technologies that includes virtual reality and augmented reality. He has always been interested in XR because it is an excellent way to enable next-level immersion in simulations.
While working on his master’s thesis, Netti interviewed many people across the space industry and found that everyone recognized the need to integrate this type of simulation, but nobody knew what to do with it.
The lack of an industry standard for space simulations motivated him to create the frame.
Bannova said the biggest challenges started during the COVID-19 pandemic — it took extra time to get materials and build the hardware.
One of the team’s most significant accomplishments was maximizing the confined space for the virtual reality lab frame. The second was finalizing the spacesuit mockup.
“We figured that we actually could have done it probably more efficiently. Well, it’s related to any process. Usually, you’re like, ‘OK, the second time I would do it, and it’ll be way faster and more efficient.’ But you have to do something to be able to learn from it and progress and see how you can do it next time better.”
The Future of Space Architecture Education
Bannova said the idea is for the simulation space to become part of every space architect’s thesis defense. If a project is about designing a habitat or station, she said, the final deliverable should include evaluating the design within the virtual reality simulation frame.
She said it would require a few additional skills for students to work in the lab, but most students are already familiar with VR.
“Pretty much everyone is familiar with the systems. But combining it again with learning how to design is part of what they’re learning in the program, and also being able to produce a 3D environment that will be possible to combine and put in a VR environment. And then to test it, at least from your own perspective.”
Bannova said Netti was the first to defend his thesis with the frame. He built a mockup of a robotic assistant and demonstrated it during his defense. This virtual integration will become a more common element of education and of validating designs before they are built and used.
“It’ll expedite the design process, which is really critical for commercial space flight and human space flight,” Bannova said.
“So a tool that will shorten the design and validation time, but also the cost,” Netti added, “especially in, as we say, instead of building these enormous, very expensive mockups, enable a really iterative process.”
Bannova said technology is constantly advancing, so next year there could be a new generation of VR goggles. She hopes to add haptic gloves, which let users feel virtual environments and may one day evolve into full haptic suits.
Bannova said that despite advances in XR technology, nothing can replace going to space.
“Of course not, because we only know when we go there, and we can make all sorts of assumptions and simulations and think what it would be,” she told BestColleges. “But we won’t know until we are really there. So we are just trying to prepare as much as we can.”
“We’re just making it more hopefully safe for those who one day will be there.”