DALL·E-generated header image: stillshot of a cybernetic person in an augmented reality headset.

AtomXR

The first natural-language-driven XR application builder

A video demo of AtomXR in action.
Goals

Our goals for the project evolved significantly, becoming more ambitious with each success. Here is what we wanted to accomplish to feel satisfied with our work:

  1. Users can create experiences entirely through natural language.

  2. Users can develop experiences in the headset without needing to go to a desktop.

  3. The development cycle must be near-instant: no lengthy compiling, building, or deploying.

Outcomes
  1. We completed all three goals. Using natural language queries like "place this cube here" in the headset, users can create experiences very quickly.

  2. We created an interpreted programming language, AtomScript, that enables runtime development of experiences while in-game.

  3. AtomXR has been used by dozens of Harvard engineering students to build XR experiences rapidly and easily.

  4. We are currently working with Harvard SEAS professor Dr. Elena Glassman on a paper for publication.

Timeline

July 2022: AtomXR starts as an entry in Microsoft's mixed reality hackathon. Given its potential novelty, the team continues pursuing the project for weeks after the hackathon ends.

October 2022: After a several-month hiatus, AtomXR is revived. The team completes a minimum viable version of the product and begins working with an HCI (Human-Computer Interaction) professor at Harvard to pursue publication.

November 2022: The team begins sandbox tests for AtomXR. These tests gather general feedback before proceeding to the more serious user studies.

December 2022-present: The team begins work on a paper for submission to a journal.

Aryan, a friend of the group, tests an emulator version of AtomXR in the Harvard AR/VR Club studio. This is part of a sandbox test, used to gather rapid feedback before launching more serious user studies.
Alice, one of the main developers on AtomXR, shows off a demo to a class of HCI students.
How We Built It

AtomXR was built in Unity and deployed to Microsoft's HoloLens 2. We send natural language commands (e.g., "Create a big sea turtle") to a web server, which uses a fine-tuned version of GPT-3 to convert the natural language into AtomScript code. The AtomScript is then interpreted at runtime with the help of the ANTLR library.
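
To make the pipeline concrete, below is a minimal sketch of what the server side could look like. This is illustrative Python, not our actual code: the /translate route, the fine-tuned model name, and the prompt and stop conventions are all hypothetical placeholders.

import os
import requests
from flask import Flask, jsonify, request

app = Flask(__name__)
# Legacy OpenAI completions endpoint, as used by GPT-3 fine-tunes
OPENAI_URL = "https://api.openai.com/v1/completions"

@app.route("/translate", methods=["POST"])  # hypothetical route the headset calls
def translate():
    # Expected payload, e.g. {"command": "Create a big sea turtle"}
    command = request.get_json()["command"]
    resp = requests.post(
        OPENAI_URL,
        headers={"Authorization": f"Bearer {os.environ['OPENAI_API_KEY']}"},
        json={
            "model": "davinci:ft-atomxr",       # placeholder fine-tune name
            "prompt": command + "\n\n###\n\n",  # assumed prompt separator from fine-tuning
            "max_tokens": 256,
            "temperature": 0.0,                 # deterministic output for code generation
            "stop": ["END"],                    # assumed end-of-program token
        },
        timeout=30,
    )
    resp.raise_for_status()
    atomscript = resp.json()["choices"][0]["text"]
    # The headset interprets the returned AtomScript at runtime (via ANTLR),
    # so no compile/build/deploy step is needed.
    return jsonify({"atomscript": atomscript})

if __name__ == "__main__":
    app.run(port=8000)

In a setup like this, the headset client simply POSTs the transcribed command to /translate and hands the returned AtomScript straight to the runtime interpreter, which is what keeps the development loop near-instant.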

Future Plans

Many plans have been proposed for the future of AtomXR. We do not currently plan to commercialize; instead, we are collaborating with UX developers in the Harvard AR/VR Club on a design overhaul. Right now, the UI leaves much to be desired, and a redesign will make the platform far more usable.

We envision AtomXR as a standard tool for novices and experts alike to develop in-headset. With our connections to the Harvard AR/VR community, we would like to onboard more of our local community onto the platform. Based on their feedback, we can then expand to the AR/VR community at large.
