
Highlighted Projects

We focus on producing high-impact work in the AR/VR field.

Alice, one of the main developers on AtomXR, shows off a demo to a class of HCI students.
A video demo of AtomXR in action.

AtomXR: The first natural-language driven in-HMD game engine

Extended reality (XR) game development suffers from three major inefficiencies and entry barriers:

  1. the inaccessibility and steep learning curve of game engines and IDEs for non-technical users,

  2. the mismatch between the 2D screens of the development environment and the final 3D user experience inside head-mounted displays (HMDs), and

  3. the long deployment cycles required to test and iterate on target HMDs.

To address these inefficiencies, we introduce AtomXR, the first natural language-driven, at-runtime game development platform. Designed with low floors, AtomXR lets anyone create applications through intuitive, non-technical interactions, entirely at runtime.

Users can design 3D environments, spawn objects, and add functionality through natural language input, complemented by physical interactions to redesign, reposition, and resize components. Like industry-standard game engines, AtomXR provides out-of-the-box functionality for buttons, user interfaces, collectible items, pathfinding, and more. Beyond this, users can add game logic of arbitrary complexity (boolean operators, loops, conditionals); anything that can be described in natural language can be implemented with AtomXR.
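As a rough, hypothetical sketch of this interaction model (not AtomXR's actual implementation), a spoken command might first be parsed into a structured scene action that the engine executes at runtime. The SceneAction fields and the pattern-based parser below are invented for illustration:

    import re
    from dataclasses import dataclass

    @dataclass
    class SceneAction:
        verb: str      # e.g. "spawn", "move", "resize"
        obj: str       # e.g. "cube", "button"
        position: str  # "here" would resolve to the user's gaze or controller point

    def parse_command(text: str) -> SceneAction:
        # Tiny pattern-based parser standing in for the real natural-language pipeline.
        match = re.match(r"(place|spawn) (?:this |a )?(\w+) (here|there)", text.lower())
        if not match:
            raise ValueError(f"unrecognized command: {text}")
        _, obj, position = match.groups()
        return SceneAction(verb="spawn", obj=obj, position=position)

    print(parse_command("place this cube here"))
    # -> SceneAction(verb='spawn', obj='cube', position='here')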

Aryan, a friend of the group, tests out an emulator version of AtomXR in the Harvard AR/VR Club studio. This is part of a sandbox test, used to gather rapid feedback before launching more formal user studies.

Goals

Our goals for the project evolved significantly, becoming more ambitious with each success. Here is what we set out to accomplish:

  1. Users can create experiences entirely through natural language.

  2. Users can develop experiences in the headset without needing to go to a desktop.

  3. The development cycle must be near-instant: no long compiling, building, and deploying.

Outcomes

  1. We completed all three goals. Using natural language queries like "place this cube here" in the headset, users can create experiences very quickly.

  2. We created an interpreted programming language, AtomScript, that allows for runtime development of experiences while in-game (see the sketch after this list).

  3. AtomXR has been used by dozens of engineering students at Harvard to make XR experiences rapidly and easily.

  4. We are currently working with Harvard SEAS professor Dr. Elena Glassman on a paper for publication.
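To make the AtomScript outcome above concrete, here is a deliberately tiny, hypothetical sketch of runtime interpretation: each statement is dispatched to a handler the moment it arrives, so new behavior takes effect without a compile-build-deploy cycle. The statement syntax and the spawn handler are invented and are not AtomScript's real grammar:

    # Minimal interpreter: statements run immediately against the live scene,
    # so changes take effect in-headset without rebuilding.
    scene = []  # stand-in for the live 3D scene graph

    def spawn_object(name):
        scene.append(name)
        print(f"spawned {name}; scene is now {scene}")

    HANDLERS = {"spawn": spawn_object}

    def run(script: str):
        for line in script.strip().splitlines():
            command, arg = line.split(maxsplit=1)
            HANDLERS[command](arg)  # interpreted on the spot

    run("spawn cube\nspawn collectible_coin")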

Timeline

July 2022: AtomXR started as a project for Microsoft's mixed reality hackathon in the summer of 2022. Because of the project's potential novelty, the team continued to pursue it for weeks after the hackathon ended.

October 2022: After a several-month hiatus, AtomXR is revived. The team completes a minimum viable version of the product and begins working with an HCI (Human-Computer Interaction) professor at Harvard to pursue publication.

November 2022: The team begins sandbox tests for AtomXR to gather general feedback before proceeding to more formal user studies.

December 2022-present: The team begins work on a paper for submission to a journal.


Current version of the AR glasses.

DIY AR Glasses

Building utility-focused AR glasses from scratch

Current commercial AR glasses cost hundreds of dollars and offer little day-to-day utility for the average person. We are developing DIY AR glasses that are affordable, easy to build, and actually useful. To achieve this, we are taking apart a phone and building its electronics into the frames. This gives the device a fully supported OS and app library, with native mobile AR built in, and because it retains all the functionality of a phone, a user could replace their phone with AR entirely.
 

Project Goals

  1. Develop AR glasses from scratch.

  2. Develop guides on fabrication and software integration processes.

  3. Build software systems (e.g. life documentation) on top of the AR glasses.

Digital Humans

Digital characters, friends, clones, and more

Today's extended reality worlds lack consistent, synchronous interaction and therefore tend to feel empty. From open-ended conversations to decision-making, XRAgents are dynamic parts of your world that can serve as digital friends, clones, fictional characters, and more.
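A minimal, hypothetical sketch of what a conversational XRAgent loop could look like; the XRAgent class and its stubbed reply logic below are placeholders for a language-model-driven backend, not the project's actual code:

    from dataclasses import dataclass, field

    @dataclass
    class XRAgent:
        """Placeholder digital character: a persona plus running dialogue memory."""
        persona: str
        history: list = field(default_factory=list)

        def respond(self, user_utterance: str) -> str:
            self.history.append(("user", user_utterance))
            # A real system would call a language model with the persona and
            # history; a stub keeps this example self-contained.
            reply = f"[{self.persona}] I heard: {user_utterance}"
            self.history.append(("agent", reply))
            return reply

    friend = XRAgent(persona="digital friend")
    print(friend.respond("Want to explore the new world with me?"))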


Life Documentation

Software systems for effective life documentation


Life is finite, time is unidirectional, and we itch to immortalize ourselves through documenting our lives. The Life Documentation project is a collection of software modules integrated with XR hardware that enables both automated and human-in-the-loop processing of life documentation data.
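As one hypothetical illustration of such a module, a captured moment could be wrapped into a structured event automatically and then surfaced in a review queue for the human-in-the-loop step; the LifeEvent fields and helper functions below are invented placeholders:

    from dataclasses import dataclass
    from datetime import datetime

    @dataclass
    class LifeEvent:
        timestamp: datetime
        source: str       # e.g. "headset_camera", "voice_note"
        summary: str
        reviewed: bool = False

    def auto_tag(raw_caption: str, source: str) -> LifeEvent:
        """Automated step: wrap a captured caption into a structured event."""
        return LifeEvent(timestamp=datetime.now(), source=source, summary=raw_caption)

    def review_queue(events):
        """Human-in-the-loop step: surface unreviewed events for confirmation."""
        return [e for e in events if not e.reviewed]

    events = [auto_tag("Coffee with Alice after the demo", "voice_note")]
    print(review_queue(events))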


 

John Harvard Avatar

Convert an existing mesh of the John Harvard Statue into a rigged Unreal Engine MetaHuman avatar.

 

Harvard AI Avatar

Attach the MetaHuman avatar to a GPT-3 AI driven by the HOLLIS library information database about Harvard.
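A rough, hypothetical sketch of how the avatar's answers could be grounded in library data: retrieve the most relevant passage, then build the prompt that a GPT-3 call would complete. The passages and keyword-overlap retrieval below are invented placeholders, and the actual GPT-3 call is elided:

    # Hypothetical mini knowledge base standing in for HOLLIS records.
    HOLLIS_PASSAGES = [
        "John Harvard (1607-1638) was an English minister whose bequest helped found Harvard College.",
        "The John Harvard Statue by Daniel Chester French was dedicated in 1884.",
    ]

    def retrieve(question: str, passages: list[str]) -> str:
        """Toy keyword-overlap retrieval; a real system would use proper search."""
        q_words = set(question.lower().split())
        return max(passages, key=lambda p: len(q_words & set(p.lower().split())))

    def build_prompt(question: str) -> str:
        context = retrieve(question, HOLLIS_PASSAGES)
        # In the project, a prompt like this would be sent to GPT-3 and the
        # reply spoken by the MetaHuman avatar; here we only construct it.
        return f"Answer using this context:\n{context}\n\nQuestion: {question}\nAnswer:"

    print(build_prompt("When was the John Harvard Statue dedicated?"))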

 

Harvard AR Info App

Create an AR app that uses the avatar to provide phone-based information both at the statue and anywhere on campus.

 

Charlestown House 3D Model

Construct a 3D graphical version of John Harvard's 17th-century Charlestown house for the avatar that will work for a related VR app.

Projects with VizLab
