
Campus Virtual Tour

Campus tours meet the Metaverse: What if exploring a university could be gamified to the point that it feels like you're actually walking around in person?


Campus tours have traditionally been the staple way for high school seniors and incoming first-year students to learn about a college. With the surge of digital communication since the COVID-19 pandemic, it is no surprise that virtual tours have become increasingly popular: over 700 universities in the US alone use virtual tours to attract prospective students.

Today's virtual tours are mostly 360-degree-image-based experiences with point-and-click arrows that let users "teleport" between key areas of campus. With new developments such as the Metaverse and the growth of Mixed Reality applications, we sought to take campus tours to the next level by combining several of these emerging technologies with industry-standard marketing techniques.

A screenshot of the current 360° image tour.

Current virtual tours focus on facts, information, and key campus areas. While this is a good step toward mirroring real-life tours, it can become unengaging after a while due to the lack of interaction. We wanted to upgrade the experience by giving users extra flexibility when walking around our virtual campus world, while also making the tour feel like a dialogue with a tour guide to make up for the lack of real human interaction. We are currently working with our partners at Matherium.com on implementing a fully interactive virtual campus guide.

A wireframe + shaded screenshot of the interactive virtual campus (final stages).

For this case study, we sought to replicate the area around the center of the University of Massachusetts Amherst campus. Development aimed to create a fully functional "mini tour" of UMass that correctly represents the virtual tour's user experience, visuals, and functionality while avoiding the overhead of building out a fully-fledged campus tour.

The real-life Old Chapel and campus Library.

We decided to focus on the area around the campus library and the Old Chapel, which included buildings such as South College, Goodell Hall, Bartlett Hall, Machmer Hall, Thompson Hall, the Student Union, the Integrated Learning Center, and the Fine Arts Center. However, not all of these buildings were modeled in complete detail; we focused only on the areas visible from the tour path.

Top view of a virtual tour mockup. The walkable area is shown in yellow, while the red area represents the core of the experience where most details are focused.

The experience had to feel effortless, like an extension of real life, so the Metaverse concept inspired a big part of our design philosophy. This meant 3D-modeling most buildings, hand-texturing every detail, sculpting the terrain, and designing the tour path and interactions. From the start, we wanted the virtual world to feel as accurate as possible, so we put a great deal of attention to detail into the visuals, particularly the lighting.

Terrain Sculpting

The terrain topography had to be sculpted entirely by hand, since Geographic Information System (GIS) data is not accurate enough for our level of detail and requires heavy noise smoothing. We used Google Maps and Street View data, together with photographs we took ourselves, to replicate the terrain. While the elevation around the Library is not as dramatic as much of the surrounding Amherst hillside, every elevation detail had to be carefully measured, since the differences are very noticeable when walking in person. If a slope is too steep, the user will notice it when walking around the virtual world.
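As a rough illustration of why raw elevation data needs that kind of cleanup, here is a minimal sketch of box-blur smoothing over a heightmap grid. The grid layout, kernel size, and function names are assumptions for illustration, not our actual terrain pipeline.

```typescript
// Minimal sketch: smoothing a noisy elevation grid with a box blur.
// The heightmap layout and kernel radius are illustrative assumptions.

type Heightmap = number[][]; // heights in meters, row-major grid

function smoothHeightmap(src: Heightmap, radius = 2): Heightmap {
  const rows = src.length;
  const cols = src[0].length;
  const out: Heightmap = src.map(row => row.slice());

  for (let r = 0; r < rows; r++) {
    for (let c = 0; c < cols; c++) {
      let sum = 0;
      let count = 0;
      // Average all samples within the kernel window around (r, c).
      for (let dr = -radius; dr <= radius; dr++) {
        for (let dc = -radius; dc <= radius; dc++) {
          const rr = r + dr;
          const cc = c + dc;
          if (rr >= 0 && rr < rows && cc >= 0 && cc < cols) {
            sum += src[rr][cc];
            count++;
          }
        }
      }
      out[r][c] = sum / count;
    }
  }
  return out;
}
```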

Typical Nissan Leaf used by the campus’ IT department.

The attention to detail went beyond the terrain, though. It was essential to keep the visuals consistent so that nothing felt out of place: the vegetation, the roads, the banners, and even the vehicles matched the design, measurements, and models found across the actual campus. While most digital models were built by hand, the mascot statue was captured with photogrammetry, a technique that generates 3D models from photographs. We took nearly 300 pictures from slightly different angles to create a point mesh with over 3 million vertices.

These are some of the photographs taken of the statue.
The results looked astonishingly close to the actual statue.

However, since that point mesh exceeded the vertex budget for the entire project, we had to dramatically reduce the amount of 3D geometry used to represent the statue. We used specialized software to generate a series of optimized copies of the source model, each with a progressively lower vertex count.

The different levels of detail of the Sam statue in wireframe mode. Notice the drastic reduction in geometry complexity.

Each of these models is called a Level of Detail (LOD), a concept from real-time computer graphics that lets us switch models depending on the distance between the camera and the object. The farther away the camera is, the less detailed the model shown; this keeps the simulation running smoothly without sacrificing visual fidelity.
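To make the switching logic concrete, here is a minimal sketch of distance-based LOD selection. The thresholds and mesh identifiers are illustrative assumptions; real-time engines typically provide this switching out of the box.

```typescript
// Minimal sketch of distance-based LOD switching. Distances and
// mesh names are illustrative, not the project's actual values.

interface LodLevel {
  mesh: string;        // identifier of the mesh variant to draw
  maxDistance: number; // show this level while the camera is closer than this (meters)
}

// Ordered from most to least detailed, e.g. for the statue.
const statueLods: LodLevel[] = [
  { mesh: "statue_lod0", maxDistance: 15 },
  { mesh: "statue_lod1", maxDistance: 60 },
  { mesh: "statue_lod2", maxDistance: Infinity },
];

function selectLod(cameraDistance: number, lods: LodLevel[]): string {
  for (const level of lods) {
    if (cameraDistance < level.maxDistance) return level.mesh;
  }
  // Fallback: farthest (least detailed) level.
  return lods[lods.length - 1].mesh;
}

// e.g. selectLod(42, statueLods) returns "statue_lod1"
```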

From most detailed to least detailed. The left-most model is displayed when the user is very close, while the right-most model is only shown when the user is very far away.

The buildings, on the other hand, had to be polygon-modeled using computer-aided design (CAD) software, much like in 3D animated films and architectural visualizations. Under ideal circumstances, we would have had access to floor plans to model the buildings with precise measurements. Since we didn't, we used Google satellite imagery to measure each building's footprint while carefully eyeballing the heights.


A low-polygon distant building, Herter Hall, shows how we used satellite imagery.

As with the photogrammetry model, we created three variations for the LODs. This ensured we could optimize the entire scene and target any device, from high-end computers to budget smartphones. Only highly detailed models received LODs; far-away background buildings used a single model.

South College LODs, from most detailed (left) to least detailed (right).

One of the trickier problems in building open virtual worlds is giving the sensation that they are infinite; this is especially important for making the user feel immersed. If the virtual world has clear, defined bounds, it makes the user feel like they are navigating a cardboard cutout. This meant modeling distant buildings and details, placing forests, and expanding the terrain well beyond the area proposed for the case study.

The entire virtual world. The walkable area is highlighted in red in the center. Notice the tall buildings are placed far away.

The terrain is approximately twenty-five times larger than the walkable area to ensure that the world looks infinite. We also made sure to model signature tall buildings (in this case, the Southwest high-rises and the Lederle Research Tower), because even though they are far away, they would be visible to the player from their location. We also used techniques such as billboarding to mimic the continuity of campus grounds in all directions.

Billboards from up close. These are just several "flat" polygons with a texture (usually a photograph) wrapped on top.
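For illustration, a cylindrical billboard simply rotates around its vertical axis each frame so its textured face points at the camera. The sketch below shows that idea; the vector type and angle convention are assumptions rather than our exact implementation.

```typescript
// Minimal sketch of Y-axis (cylindrical) billboarding: each flat quad
// is rotated around the vertical axis to face the camera.

interface Vec3 { x: number; y: number; z: number; }

// Returns the yaw (rotation around Y, in radians) that turns a quad
// at `billboardPos` toward `cameraPos`, ignoring height differences.
function billboardYaw(billboardPos: Vec3, cameraPos: Vec3): number {
  const dx = cameraPos.x - billboardPos.x;
  const dz = cameraPos.z - billboardPos.z;
  return Math.atan2(dx, dz);
}

// Applied every frame to each billboard, this keeps distant trees and
// buildings looking solid from any direction along the tour path.
```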

Beyond the virtual world, we wanted to make the tour feel as close to touring a college campus as possible. One of the trickier parts of this was fulfilling the role of a tour guide. We had a couple of options: we could guide the player using a more streamlined approach, such as 3D waypoints, or try replicating the tour guide via a 3D model or avatar.

We decided to combine these: we created the guide from photographs of real people rather than a 3D avatar, but she can still talk to the user and guide them as part of the virtual world. Furthermore, we decided to write a screenplay and characters, together with an interactive branching dialogue. After reaching each point of interest, the tour lets the player interactively ask questions or move on.

Lucy, the tour guide, prompts the user on how to proceed.
The player’s possible dialogues are displayed at the bottom.

For the tour, we decided to create four characters:

Lucy: the tour guide and protagonist. She's a witty college student who's very playful and friendly and will engage with the user in whichever way they choose to steer the tour.

Connor: a calmer, more down-to-earth student the user bumps into later in the tour. He helps counterbalance Lucy's wittiness.

Dr. Young: a professor who taught Lucy in one of his classes. He bumps into the tour group and decides to join for a bit.

Emma: a grad student who helps Dr. Young describe classes and other aspects of college. She serves as a more straightforward, calming character to contrast with the two hyperactive undergraduates.

We also added two more voices to represent the user's choices: when the user selects a dialogue option, their own voice can be heard. At the beginning of the tour, the user chooses their gender, and the voice reflects this choice.

Branching dialogues add another dimension of interactivity, making users feel they have complete control over what happens next. While in this case the dialogue choices only influence how the characters respond, in other scenarios we use them to change how the application behaves beyond the conversation, adding easter eggs and hidden content, and having the characters adapt to the personality the user exhibits.
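As an illustration of how a branching dialogue can be represented, here is a minimal sketch of a node-and-choice structure. The node ids, sample lines, and traversal helper are hypothetical and much simplified compared to the full screenplay system.

```typescript
// Minimal sketch of a branching dialogue graph. Node ids and lines
// are illustrative placeholders, not the tour's actual script.

interface DialogueChoice {
  text: string; // shown at the bottom of the screen
  next: string; // id of the node this choice leads to
}

interface DialogueNode {
  id: string;
  speaker: string;           // e.g. "Lucy", "Connor", "Dr. Young", "Emma"
  line: string;              // the voiced line
  choices: DialogueChoice[]; // empty means the tour simply moves on
}

const libraryStop: DialogueNode[] = [
  {
    id: "library.intro",
    speaker: "Lucy",
    line: "Want a few quick facts about the library, or should we keep walking?",
    choices: [
      { text: "Tell me more!", next: "library.facts" },
      { text: "Let's keep going.", next: "chapel.intro" },
    ],
  },
];

// Resolve the node the player's selection leads to.
function nextNode(
  nodes: DialogueNode[],
  current: DialogueNode,
  choiceIndex: number
): DialogueNode | undefined {
  const choice = current.choices[choiceIndex];
  return nodes.find(n => n.id === choice?.next);
}
```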


Beyond the individual character interactions, we also wanted the tour to include built-in tools that help college administrators and admissions officers gauge how potential students view the school and how they spend their time in the tour. We can measure engagement through analytics: what the user interacts with, which areas they visit, and which dialogue choices they make with the characters.
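To give a sense of what those engagement analytics can capture, here is a minimal sketch of a session event log. The event names and fields are assumptions, not the product's actual schema.

```typescript
// Minimal sketch of engagement events recorded during a tour session.
// Event kinds and fields are illustrative assumptions.

type TourEvent =
  | { kind: "area_visited"; area: string; secondsSpent: number }
  | { kind: "object_interacted"; objectId: string }
  | { kind: "dialogue_choice"; nodeId: string; choiceText: string };

const sessionEvents: TourEvent[] = [];

function track(event: TourEvent): void {
  sessionEvents.push(event);
  // In practice these would be batched and sent to an analytics backend.
}

// e.g. track({ kind: "area_visited", area: "Old Chapel", secondsSpent: 95 });
```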

The characters adapt to the user’s choice of field of study, impacting how we showcase the University.

When the tour ends, it prompts the user for their contact information, enabling seamless CRM integration, and every contact is relayed automatically. But we didn't stop there: since the tour guide already asks the user what field they are interested in, we can forward all of this information to the University without anyone having to fill in tedious contact forms.

Seamless CRM integration means you can save and connect with prospective students who engaged with the tour.
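As a rough sketch of what that hand-off could look like, the snippet below forwards a lead record to a CRM endpoint. The field names and URL are hypothetical; the actual integration depends on each school's CRM.

```typescript
// Minimal sketch of forwarding a prospective-student lead at the end
// of the tour. The endpoint and record shape are hypothetical.

interface ProspectLead {
  name: string;
  email: string;
  fieldOfStudy: string;   // collected conversationally by the tour guide
  areasVisited: string[]; // pulled from the engagement analytics
}

async function forwardLead(lead: ProspectLead): Promise<void> {
  // Hypothetical endpoint; a real integration would target the CRM's API.
  await fetch("https://example.com/crm/leads", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(lead),
  });
}
```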

Overall, we found this project very exciting, and we believe this is the future of campus tours and digital strategies. The UMass Tour concept opened a world of possibilities in the new era of the Metaverse: can this concept replace current virtual campus tours? We believe so.