Visualizing Complete Street designs in VR: A Higher Education R&D project

The ArcGIS Maps SDK for Unity provides a powerful method of bringing real-world GIS data into the Unity game engine, allowing for immersive visualizations, high-quality rendering and visual effects, and custom interactivity. In this post, we discuss how we created a VR app using the Maps SDK for Unity to visualize proposed designs for Complete Streets within the City of Toronto.

This blog post was jointly written by Michael Luubert and Jonathan Van Dusen of the Education and Research group.

The ArcGIS Maps SDK for Unity was released in June 2022 and allows layers published on ArcGIS Online to be used in the game engine. This provides the opportunity to create vivid visualizations and simulations by combining Unity’s physics system, lighting, and visual effects with existing GIS layers. The Maps SDK provided a way to bring our group’s past work modelling Complete Streets to the next level.  

The Education and Research group's involvement with Complete Streets began several years ago with our participation in iCity, a research project exploring transportation in Toronto. We created Unity-generated animations showing a pedestrian's viewpoint while walking along the sidewalk in different street scenarios. In collaboration with researchers at the University of Toronto's Transportation Research Institute (UTTRI), these animated videos were used in a web survey to rank Complete Street designs. We also worked with OCAD University during the iCity project to create a web dashboard that allowed users to interactively configure a Complete Street scenario and view it in a 3D web scene.

Last year, we used the new Maps SDK for Unity in a research project to visualize Complete Street designs with virtual reality headsets. We began by creating a scene of our study area with buildings from Toronto's Open Data Portal. Next, we used the road data from the Complete Streets dashboard to create 3D street-name labels that rotate to always face the user as they navigate the scene. Finally, we connected the action of selecting a road to loading the corresponding Complete Street 3D model.
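To illustrate the label behaviour, here is a minimal C# billboard sketch (the component and field names are our own for illustration, not part of the Maps SDK). Attached to each street-name label, it rotates the label every frame to face whichever camera is currently active:

```csharp
using UnityEngine;

// Minimal billboard sketch: keeps a world-space street-name label
// facing the active camera. Names here are illustrative only.
public class BillboardLabel : MonoBehaviour
{
    private Camera targetCamera;

    void LateUpdate()
    {
        // Camera.main returns the enabled camera tagged "MainCamera",
        // so this works whether the VR or non-VR camera is active.
        if (targetCamera == null)
        {
            targetCamera = Camera.main;
            if (targetCamera == null) return;
        }

        // Face the camera, but keep the label upright so the text stays level.
        Vector3 away = transform.position - targetCamera.transform.position;
        away.y = 0f;
        if (away.sqrMagnitude > 0.0001f)
        {
            transform.rotation = Quaternion.LookRotation(away);
        }
    }
}
```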

Design Considerations

We made several design choices both in the planning stages of the project and based on our experience implementing the app with the Maps SDK for Unity.

VR and Non-VR Users

One of the project requirements was that the Unity application would work with a mouse and keyboard for users without a headset, so each action in the game engine required two event listeners: one for the keyboard or mouse, and another for the VR controller. For example, selecting a road segment in VR involves pointing the controller's virtual ray at the street name and pressing the trigger button, whereas non-VR users can simply click on the street name with the mouse cursor.
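The non-VR half of that pairing boils down to a screen-point raycast. The sketch below uses assumed names; the VR path would call the same SelectRoad handler from the controller's ray-and-trigger event:

```csharp
using UnityEngine;

// Sketch of the mouse-based selection path. The VR controller's
// ray interaction would invoke the same SelectRoad handler.
public class MouseRoadSelector : MonoBehaviour
{
    [SerializeField] private Camera desktopCamera;

    void Update()
    {
        if (!Input.GetMouseButtonDown(0)) return;

        // Cast a ray from the cursor into the 3D scene.
        Ray ray = desktopCamera.ScreenPointToRay(Input.mousePosition);
        if (Physics.Raycast(ray, out RaycastHit hit))
        {
            SelectRoad(hit.collider.gameObject);
        }
    }

    // Placeholder for the shared handler that loads the corresponding
    // Complete Street 3D model for the selected street-name label.
    private void SelectRoad(GameObject streetLabel)
    {
        Debug.Log($"Selected road: {streetLabel.name}");
    }
}
```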

Two cameras are present in the scene, one for VR and the other for non-VR users. When the application opens, a script checks to see if a VR headset is detected and enables the appropriate camera. This simplifies the logic of enabling the correct controls since the keyboard and mouse camera controller is attached to the non-VR camera.
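A startup check along these lines can be written with Unity's XR APIs. This sketch assumes the legacy XRSettings API; projects using XR Plugin Management would query XRGeneralSettings instead:

```csharp
using UnityEngine;
using UnityEngine.XR;

// On startup, enable the VR rig if a headset is detected; otherwise
// enable the desktop camera, which carries the mouse/keyboard controller.
public class CameraModeSelector : MonoBehaviour
{
    [SerializeField] private GameObject vrCameraRig;
    [SerializeField] private GameObject desktopCamera;

    void Start()
    {
        bool headsetActive = XRSettings.isDeviceActive;
        vrCameraRig.SetActive(headsetActive);
        desktopCamera.SetActive(!headsetActive);
    }
}
```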

Improving Scene Layers

Once the basic viewing experience was functional, focus shifted to improving issues with the building and basemap layers.

For the building layers, many of the more detailed models contained overlapping faces that caused flickering (known as z-fighting), which was very distracting when experienced in VR. This was resolved by editing the problematic buildings in ArcGIS CityEngine and either deleting overlapping faces or shifting them a few centimetres.

A capture of the flickering faces on the roof of a building, shown in ArcGIS CityEngine with the camera zooming in and out.

An example of the overlapping faces that needed to be removed in ArcGIS CityEngine

Another problem with the building layer was that the mesh geometries of iconic buildings were oversimplified under certain conditions. This was due to scene layers having multiple levels of detail to reduce the amount of geometry that needs to be drawn when a building is viewed from a distance. This was very noticeable for the CN Tower, and it was resolved by generating a scene layer in ArcGIS CityEngine containing only the CN Tower mesh. When this single-building layer was loaded into the Maps SDK for Unity, a more detailed mesh that preserved the rounded shape of the observation deck was displayed when the tower appeared in the distance.

A side-by-side comparison of the CN Tower as viewed from a distance in the app, with the simplified mesh view on the left and the view with the full building geometry on the right.

Comparing the lower level of detail CN Tower mesh with the updated scene layer

The base layer we used at the start of the project was Esri’s Streets basemap. This worked well when viewing the city from above but looked out of place at ground level. We ended up creating a custom tiled image layer using road and sidewalk polygons with a colour scheme that better matched the Complete Street 3D models.

UX and UI design

Prior to this project, we had developed several 3D GIS applications in the Unity game engine, but had not developed any VR applications, so we researched the human factors involved when designing for this medium.

One key finding was the need to avoid VR motion sickness, caused when movements in the VR environment do not correspond to movements by the user in the real world. Specifically, “flying” through a scene using the VR game controllers can cause motion sickness, primarily while accelerating or decelerating. A commonly used alternative in VR is teleportation; however, this can be disorienting and cause users to lose spatial context between locations.

For this app, we allow users to explore the 3D world by flying, and to travel directly between streets using teleportation. To reduce the likelihood of motion sickness while flying, we start the user at a low speed, and allow them to gradually increase and decrease the speed. As well, to avoid initial feelings of vertigo, we start the user at ground level, instead of starting with an elevated view of the city.
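As a rough sketch of those comfort settings (the key bindings and speed values here are illustrative, not taken from our app), a fly controller can start at its minimum speed and adjust in small steps:

```csharp
using UnityEngine;

// Comfort-oriented fly controls: start slow and let the user nudge
// the speed up or down in small increments. Values are illustrative.
public class ComfortFlyController : MonoBehaviour
{
    [SerializeField] private float minSpeed = 1f;   // metres per second
    [SerializeField] private float maxSpeed = 50f;
    [SerializeField] private float speedStep = 1f;

    private float currentSpeed;

    void Start()
    {
        currentSpeed = minSpeed; // always begin at the slowest setting
    }

    void Update()
    {
        // Example bindings; a VR build would map these to controller buttons.
        if (Input.GetKeyDown(KeyCode.Equals))
            currentSpeed = Mathf.Min(currentSpeed + speedStep, maxSpeed);
        if (Input.GetKeyDown(KeyCode.Minus))
            currentSpeed = Mathf.Max(currentSpeed - speedStep, minSpeed);

        // Fly along the view direction at the current speed.
        float forward = Input.GetAxis("Vertical"); // W/S keys or thumbstick
        transform.position += transform.forward * (forward * currentSpeed * Time.deltaTime);
    }
}
```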

We also learned about the need to place user interfaces within the 3D environment, similar to signs or billboards in the real world. This can increase immersion as well as comfort, allowing users to view the interfaces from any angle and walk around them naturally. Other key findings included the need for limited text and larger font sizes, since text appears more pixelated in VR headsets, as well as positioning UI elements to avoid arm and neck strain.
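In Unity, this placement typically comes down to a world-space canvas, which is usually configured in the editor. The equivalent in code, with illustrative sizing values, might look like this:

```csharp
using UnityEngine;

// Places a UI panel in the 3D world like a physical sign. A large pixel
// resolution scaled far down keeps the text crisp in the headset.
[RequireComponent(typeof(Canvas))]
public class WorldSpacePanel : MonoBehaviour
{
    void Start()
    {
        Canvas canvas = GetComponent<Canvas>();
        canvas.renderMode = RenderMode.WorldSpace;

        var rect = canvas.GetComponent<RectTransform>();
        rect.sizeDelta = new Vector2(1000f, 600f);   // pixels
        rect.localScale = Vector3.one * 0.002f;      // ~2 m x 1.2 m panel
    }
}
```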

After conducting this research, we created low-fidelity paper sketches of the user interface, allowing us to experiment with UI layouts and flows before developing the app. Some examples are presented below, showing ideas for how the user could select a street and scenario.

User-interface sketches showing potential dialog boxes and workflows for selecting a street in Toronto and a scenario for the selected street

A sampling of our UI sketches for the Complete Streets VR app

To learn more about UI sketching, please see the instructional video on our Higher Education Resource Finder.

The app

The video below presents an overview of the current version of the Complete Streets VR app. It shows an example of how a user can interact with the app using a VR headset and controllers, including selecting a street and a scenario, viewing the scenario at ground level, and switching between scenarios.

Future work

In 2023, we are planning to improve the Complete Streets VR app in several areas:

  • Integrate the VR app with the online Complete Streets Dashboard. This will allow users to sign into the VR app with their username and password from the online dashboard, open the Complete Street scenarios that they created and configured online, and view these in VR.
  • Make the controls more discoverable and provide more visual feedback. While using the VR headset, if the user raises the handheld VR controllers in front of them, they can see 3D models of these controllers within the 3D world. We plan to label the buttons of these on-screen controllers, so that users can learn the function of each button more easily. We also plan to add on-screen information such as the user’s speed, the model’s scale, and tips for new users.
  • Integrate the user interfaces more fully into the VR environment. While the user interfaces have been placed into 3D space, they still use 2D panels similar to a traditional video game. We would like to take greater advantage of the immersion and interactions offered by VR, such as presenting scenario options as miniature 3D models and allowing users to change scene settings directly from the controllers.
  • Add textures to some of the buildings. Currently, the 3D buildings for the City of Toronto are displayed with plain white textures. Since one key benefit of VR is the ability to immerse oneself in an environment, we would like to add more realistic building textures along a few street segments to see if this enhances the feeling of immersion for users.

Conclusion

In this project, we demonstrated how the ArcGIS Maps SDK for Unity can be used in an internal R&D project to visualize Complete Street designs in Toronto in VR. Our intention is for urban planners and members of the public (e.g., at public-consultation meetings) to use the app to experience and evaluate Complete Street scenarios in an immersive way before the designs are implemented in real life. Since the app can be used with or without a VR headset, anyone with a desktop or laptop computer would be able to participate in the planning process.

To gain experience using the ArcGIS Maps SDK for Unity, you can follow this tutorial presented at the 2023 GIS in Education and Research conference.

About the Author

Jonathan Van Dusen is a Higher Education Specialist in the Education and Research group at Esri Canada. He develops learning resources and workshops for university and college students, provides assistance with app development and user-interface design for higher-education research projects, and leads training sessions at the annual Indigenous Mapping Workshop. Jonathan completed a Bachelor of Environmental Studies with a Computer Science minor at the University of Waterloo, and also has a certificate in User Experience Design from the University of Toronto. A lifelong fan of maps, geography and technology, Jonathan is passionate about combining these through GIS to help solve current social and environmental challenges, as well as designing technology appropriately to meet user needs.
