
Student-created AR/VR tools help everywhere from the classroom to the operating room

REALITY CHECK: Students from the Introduction to Augmented and Virtual Reality class offer hands-on demos of their AR/VR projects developed using Snap Spectacles. (University of Rochester photo / J. Adam Fenster)

Transdisciplinary class empowers emerging technologists to develop AR/VR solutions using AI and the latest hardware from Meta and Snap Inc.

Could augmented or virtual reality be the key to debugging robotic systems for medical procedures? Can AR/VR help improve classroom engagement and interaction? Can it be used to teach piano?

This semester, dozens of University of Rochester students set out to answer these questions—and many more—as part of the Introduction to Augmented and Virtual Reality class taught by Daniel Nikolov, an adjunct professor and research engineer at the Institute of Optics.

Drawn from optics, computer science, brain and cognitive sciences, and other disciplines, the students not only received a holistic look at the rapidly developing technologies but also got a chance to use the latest hardware, including Meta Quest headsets and Snap Spectacles. The immersive experiences that emerged were then presented in a transdisciplinary showcase at the Mary Ann Mavrinac Studio X.

HITTING ALL THE RIGHT NOTES: One of the many projects developed in the class uses Meta Quest headsets to teach users how to play the piano. (University of Rochester photo / J. Adam Fenster)

“This class gives us a broad perspective of AR/VR,” says Jiamu Tang ’26, a data science student minoring in computer science and philosophy. “Some really interesting guest lectures from researchers in optics, philosophy, eye perception, and other fields have given us the basic knowledge we need to conduct research or develop games and applications in AR/VR. It’s been amazing.”

Tang developed an application for the Meta Quest she calls LinguaLens to help people learn a new language as they observe the world around them. She used the Quest’s forward-facing cameras to present users with a mixed reality view that labels their surroundings with relevant vocabulary.

“The goal is to create new opportunities for people to learn a new language,” says Tang. “When they are walking on the street, they can talk to the digital assistant, which will give them the context they need to learn new words and phrases. It’s designed to be very easy.”
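The article doesn’t share LinguaLens’s code, which on the Quest would be built in Unity, but the core loop it describes, pairing objects a camera detects with target-language vocabulary, can be sketched in a few lines. The Python below is purely illustrative: the `Detection` type, the tiny vocabulary, and the screen coordinates are hypothetical stand-ins, not anything from the actual app.

```python
# Illustrative sketch only (not the student's Unity code): map objects
# detected in a camera frame to target-language vocabulary labels.
from dataclasses import dataclass

# Tiny English -> Spanish vocabulary for demonstration; a real app would
# use a translation service or a much larger dictionary.
VOCAB = {"chair": "la silla", "window": "la ventana", "tree": "el árbol"}

@dataclass
class Detection:
    name: str   # object class reported by a (hypothetical) detector
    x: float    # screen-space position where the label would anchor
    y: float

def label_frame(detections: list[Detection],
                vocab: dict[str, str]) -> list[tuple[str, float, float]]:
    """Return (label, x, y) tuples to overlay on the mixed reality view."""
    labels = []
    for d in detections:
        word = vocab.get(d.name)
        if word:  # only label objects we have vocabulary for
            labels.append((f"{d.name} = {word}", d.x, d.y))
    return labels

# Example: two detections from a made-up forward-facing camera frame
frame = [Detection("chair", 0.4, 0.6), Detection("window", 0.7, 0.2)]
for text, x, y in label_frame(frame, VOCAB):
    print(f"overlay '{text}' at ({x:.1f}, {y:.1f})")
```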

Tang says Studio X provided students with critical support for the class, not only giving them access to the Meta Quest hardware but also offering help learning the Unity software used to develop AR/VR applications.

Broadening the scope

The Intro to AR/VR class started six years ago with funding from the National Science Foundation Research Traineeship program and has continually evolved since. The grant ended earlier this year, but the course now continues through the Center for Extended Reality (CXR), recently launched through URochester’s Boundless Possibility: 2030 Strategic Plan to connect and enhance the University’s efforts in AR, VR, and artificial intelligence.

Projects in previous years of the Intro to AR/VR class have focused largely on creating virtual reality experiences, but Nikolov and his colleagues wanted to broaden opportunities for students.

“We didn’t really have an option for students to work on projects in augmented reality, and there were plenty of ideas from students for these projects,” says Nikolov. “It wasn’t due to a lack of creativity; we mostly lacked the hardware to distribute to a whole class.”

SNAP TO IT: Professor of Optical Engineering Jannick Rolland tries out Snap Spectacles during the transdisciplinary showcase at the Mary Ann Mavrinac Studio X. (University of Rochester photo / J. Adam Fenster)

Nikolov and the University addressed the limitation by pursuing a partnership with Snap Inc. as one of CXR’s first major achievements. Snap Inc. supplied new Snap Spectacles for Nikolov’s class, as well as for a human-computer interaction course taught by Assistant Professor Yukang Yan this fall and an AR/VR interaction design course to be taught in the spring by Assistant Professor Zhen Bai, the Biggar Family Fellow in Data Science.

Addison Black, Spectacles producer at Snap Inc., says the partnership has been a win-win.

“We know that the platform of AR and Snap Spectacles can only really be made great by fantastic developers bringing new ideas and pushing the technology,” says Black. “We’ve found that the most successful and interesting use cases for Snap Spectacles have come from those new to the industry and students who were pushing the boundaries of what’s possible. And so we were excited when we found the opportunity to collaborate with the University of Rochester and see the projects coming in.”

Anakin de la Cruz Flecha ’25 (biomedical engineering), who’s pursuing his master’s through the Center for Medical Technology and Innovation program, was excited by the chance to work with Snap Spectacles. He created a memory-based object tracker he named Recallis AR to help physicians in an operating room keep track of their tools.

“The Snap Spectacles are obviously very new and cool, and you can do so much with them,” says Flecha. “Before this class, I didn’t know anything about AR, and it’s been fun to learn about it and apply it to medical devices.”
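Flecha’s source code isn’t shown, but a “memory-based object tracker” suggests a simple underlying pattern: record where each tool was last seen so the wearer can be guided back to it even after it leaves the field of view. Here is a minimal, hypothetical Python sketch of that idea; the class name, tool names, and coordinates are illustrative assumptions, not Recallis AR’s actual implementation.

```python
# Hypothetical sketch of a memory-based tool tracker: remember the last
# observed position of each tool so it can be recalled later.
import time

class ToolMemory:
    def __init__(self):
        self._last_seen = {}  # tool name -> (position, timestamp)

    def observe(self, tool: str, position: tuple[float, float, float]):
        """Update memory whenever the detector sees a tool."""
        self._last_seen[tool] = (position, time.time())

    def recall(self, tool: str):
        """Return the last known position and how long ago it was seen."""
        if tool not in self._last_seen:
            return None
        position, seen_at = self._last_seen[tool]
        return position, time.time() - seen_at

# Example usage with made-up coordinates
memory = ToolMemory()
memory.observe("scalpel", (0.2, -0.1, 0.5))
result = memory.recall("scalpel")
if result:
    pos, age = result
    print(f"scalpel last seen at {pos}, {age:.1f}s ago")
```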

DOMO ARIGATO MR ROBOTO: George Kassis, an MD-PhD student in biomedical engineering, shows iZone employee Samantha Monaghan how an AI-powered robotic arm, which he designed to perform ultrasound in rural areas, can be controlled remotely using VR. (University of Rochester photo / J. Adam Fenster)

Augmenting research

For George Kassis, the class provided a new dimension to enhance a long-term project he’s developing for his MD-PhD program. Under the guidance of Professor Benjamín Castañeda ’09 (PhD) from the Department of Biomedical Engineering, Kassis is creating a low-cost, autonomous, AI-powered robotic arm to perform ultrasound in rural areas without access to trained ultrasound professionals. It’s designed to be 3D printable while still being mechanically robust.

Kassis says VR can be used to remotely control the robotic arm, safely train new users with digital twins, and provide important training data to help debug and refine the machine.

“I’d like to have expert physicians use VR to control the robot and actually do the ultrasound scan, so then we can use the data to train the robot,” says Kassis. “There’s a big gap in ultrasound specialists across the country, and this could free up sonographers to focus on the more complex cases and give the simple, routine tasks to the robot.”
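Kassis’s pipeline itself isn’t published here, but the data-collection idea he describes, logging an expert’s VR commands alongside what the ultrasound probe sees so the pairs can later train the robot, can be sketched generically. Everything in this Python snippet (the device-reading functions, the sampling rate, the file format) is a hypothetical placeholder.

```python
# Hypothetical sketch: while an expert drives the robot arm from VR, log
# paired (controller pose, ultrasound frame) samples for later training.
import json
import time

def read_vr_pose():
    """Placeholder for reading the VR controller pose (position + rotation)."""
    return {"pos": [0.0, 0.0, 0.0], "rot": [0.0, 0.0, 0.0, 1.0]}

def read_ultrasound_frame_id():
    """Placeholder: a real system would grab the current image frame."""
    return int(time.time() * 1000)

def record_session(num_samples: int, out_path: str):
    """Log synchronized pose/frame samples to a JSON file."""
    samples = []
    for _ in range(num_samples):
        samples.append({
            "t": time.time(),
            "pose": read_vr_pose(),               # what the expert commanded
            "frame": read_ultrasound_frame_id(),  # what the probe saw
        })
        time.sleep(0.05)  # ~20 Hz logging
    with open(out_path, "w") as f:
        json.dump(samples, f)

record_session(5, "teleop_demo.json")
```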