Visualizing Complete Street designs in VR, Part II

Two years ago, we started a research project to visualize Complete Street designs in VR. This blog post will outline the additional features and enhancements added to the app since our last post on the project. This work includes a new mini-map to view the larger scene context, integration with the Complete Streets Dashboard, and user interface improvements.

This blog post was jointly written by Jonathan Van Dusen and Michael Luubert of the Education and Research group.

The Complete Streets VR App lets users preview different complete street designs with the immersive experience of a VR headset. It is built with the ArcGIS Maps SDK for Unity and is an internal R&D project we worked on over the past two years as an extension of the iCity research project. This R&D project is now essentially complete, and this blog post describes the enhancements added over the past year. We focused on making the VR app more easily learnable for new users and more useful for an urban planning audience.

Mini-map

To provide better situational context at ground level, we added a mini-map: a scaled-down 3D cutout of the area surrounding the user. It appears fixed to the left controller, so the user can bring the model closer or rotate it simply by moving the hand holding the controller. Its visibility is toggled with a button on the left controller, and a red arrow on the mini-map indicates the direction the user is facing and their current elevation.
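The app itself is built in Unity, but the arrow's heading boils down to simple vector math: project the camera's forward vector onto the ground plane and measure its yaw. Here is a minimal, language-neutral sketch in Python (the function name and the coordinate convention of north = +z, east = +x are our illustrative assumptions, not the app's actual code):

```python
import math

def arrow_yaw_degrees(forward_x, forward_z):
    """Yaw of the mini-map arrow, clockwise from north (+z), in [0, 360).

    Only the horizontal components of the camera's forward vector are
    used; the vertical component is ignored so the arrow lies flat on
    the mini-map regardless of whether the user is looking up or down.
    """
    yaw = math.degrees(math.atan2(forward_x, forward_z))
    return yaw % 360.0
```

For example, a forward vector pointing along +x (due east) yields a yaw of 90 degrees, so the arrow rotates a quarter turn clockwise from north.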

Connection with Complete Streets Dashboard and Integration of Scenario Metrics

One of our major goals for the second phase of our work was to make the Complete Streets VR app more useful to an urban planning audience by integrating the app with the Complete Streets Dashboard. In the previous blog post, we mentioned our group’s collaboration on the iCity research project, which included working with students from OCAD University to create an online Complete Streets Dashboard. The Complete Streets Dashboard allows users to view Level of Service (LOS) scores for Toronto streets, including scores for walking, cycling, transit, goods transportation, automobiles, and a sense of place. The LOS scores are based on calculations from the US National Cooperative Highway Research Program (see report).

Through the dashboard’s inclusion of the open-source Streetmix app, users are able to create their own complete-street designs by interactively adding and removing different lane types for each street and modifying the lane widths. Users can also view an animated 3D model of their new street design using an embedded ArcGIS Maps SDK for JavaScript viewer, with the 3D model generated behind the scenes on a server running the ArcGIS CityEngine SDK. We wanted planners to be able to configure a scenario in the Complete Streets Dashboard using the Streetmix interface, and then immediately view their custom scenario in the VR app. We also wanted to incorporate the LOS scores and other metrics such as road widths into the VR environment and provide a way for users to learn more about what the LOS scores and metrics represent.

The video below shows the results of the integration, based on an example workflow where a planner or other interested citizen can (1) inspect a street segment in the Complete Streets VR app, (2) view the LOS scores and lane widths in VR, (3) modify the scenario in the Complete Streets Dashboard, and (4) view their changes in the Complete Streets VR app, including the effects on the LOS scores.

Sun-angle controls

In the initial version of the Complete Streets VR app, we had included the ability for users to modify the sun angle by dragging the sun around in the sky using the mouse or the VR controller. While fun to use, and a great example of direct manipulation in user interfaces, this was not always suitable for urban planning use cases, since it allowed sun angles that would never occur in real life at Toronto's latitude. Our co-op student from the University of Waterloo, Ben Woodward, implemented controls that limit the sun to realistic positions, using the open-source SunCalcNet library to calculate the sun's altitude and azimuth for Toronto's latitude and longitude. The user can specify the date and time using one slider for the month of the year (using the 21st day of each month to make the solstices available as options) and a second slider for the time of day. Together, these sliders allow users to view realistic lighting conditions for the Complete Streets scenarios, including shadows from adjacent buildings.
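SunCalcNet implements a much more precise astronomical algorithm, but the underlying idea can be illustrated with textbook formulas: the sun's declination follows the day of year, the hour angle follows the time of day, and together with the observer's latitude they determine altitude and azimuth. A simplified sketch in Python (the function name, the approximate declination formula, and the use of local solar time are our assumptions for illustration, not the library's API):

```python
import math

TORONTO_LAT = 43.65  # degrees north

def sun_position(day_of_year, solar_hour, lat_deg=TORONTO_LAT):
    """Approximate solar altitude and azimuth in degrees.

    Uses a simplified model: declination from the day of year and hour
    angle from local solar time. Accurate to within a degree or two,
    which is plenty for clamping a VR sun to plausible positions.
    """
    lat = math.radians(lat_deg)
    # Approximate solar declination: +/-23.44 degrees at the solstices.
    decl = math.radians(
        -23.44 * math.cos(math.radians(360.0 / 365.0 * (day_of_year + 10))))
    # Hour angle: the sun moves 15 degrees per hour from solar noon.
    hour_angle = math.radians(15.0 * (solar_hour - 12.0))

    # Altitude above the horizon.
    sin_alt = (math.sin(lat) * math.sin(decl)
               + math.cos(lat) * math.cos(decl) * math.cos(hour_angle))
    alt = math.asin(sin_alt)

    # Azimuth measured clockwise from north; mirror it after solar noon.
    cos_az = ((math.sin(decl) - math.sin(alt) * math.sin(lat))
              / (math.cos(alt) * math.cos(lat)))
    az = math.acos(max(-1.0, min(1.0, cos_az)))
    if hour_angle > 0:  # afternoon: sun is west of due south
        az = 2.0 * math.pi - az
    return math.degrees(alt), math.degrees(az)
```

For Toronto this gives a noon sun roughly 70 degrees above the horizon on June 21 but only about 23 degrees on December 21, which is exactly the kind of constraint the sliders enforce.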

For a video demonstration of the sun-angle functionality, please see Ben’s blog post, Transit Equity and Virtual Reality: Outtakes from my Co-op with Esri Canada.

UI Updates

In addition to the interface for changing the sun angle, we also added new interfaces to display the user's speed as they "fly" through the scene with the VR controllers and to display the current scaling factor for the model. These appear as small contextual UIs while the user is accelerating, decelerating, or changing the model scale, and are anchored to the left VR controller. These indicators make the status of the application more visible to users (which is the first item in Jakob Nielsen's classic list of 10 usability heuristics).

Animation showing the VR controllers in a VR scene in downtown Toronto. The user is flying through the scene, and when they accelerate or decelerate, the value in the speed indicator changes. Next, they change the scale of the model to make it smaller, and then change it back to normal, and the value in the scale indicator changes accordingly.
Demonstration of the speed and scale indicators while in a VR scene

As well, we added on-screen labels for each button on the VR controllers so that users could learn the controllers’ functionality more easily. Users can toggle the labels on and off at any point while using the VR app.

Animation showing the VR controllers in a VR scene in downtown Toronto. The controllers show one button label by default, telling the user that they can press a button to show controller labels. The user presses this button and all button labels are visible, and then the user presses the button again to go back to the default state.
Demonstration of toggling the controller labels on and off while in a VR scene

Tutorial level

Ben Woodward also added a tutorial level that appears when users open the app and walks them through how to use the VR headset’s controllers (or their mouse and keyboard) to navigate the 3D environment, teleport between streets, and use the menu UI. Although we had already added labels for the controller buttons as described above, we found in our testing that users still struggled to familiarize themselves with the layout of the VR controllers, as well as how to navigate the VR scenes, so we felt that it would be useful to introduce this functionality more gradually.

The idea of a tutorial level had actually been included in our earliest UI sketches for the VR app, as presented in the previous blog post, so we are glad that Ben was able to make this a (virtual) reality.

A set of screenshots showing the tutorial panel in a VR scene. The panel is black with white text, appearing towards the bottom of the user’s view, and angled away from the user slightly. The panel contains instructions as described in the image caption.
Screenshots of example steps from the tutorial level, showing instructions for how to move downwards in a scene and how to teleport to a street

Building textures

To enhance the level of immersion in the app, we added the textured buildings used in the iCity 3D Toronto model. Apart from iconic buildings such as the CN Tower, the buildings were textured in CityEngine using a procedural rule. The textured buildings can be toggled on and off in the app to change the level of immersion at street level. Below you can see a comparison of the textured and untextured models and a video showing the view from street level.

Screenshot comparing the textured and untextured downtown buildings from a view above Lake Ontario.

Conclusion

One overarching theme for our work on the Complete Streets VR app in 2023 was to make the app more relevant to an urban planning audience, which led us to integrate scenario metrics, refine the sun-angle controls, and add building textures. At the same time, we wanted to make the app more easily accessible to a citizen audience, specifically those without VR expertise, leading us to add the mini-map, tutorial level, controller labels, and speed and scale indicators.

If you’re interested in using the Complete Streets VR app, a modified version is available to download in the form of the Scene Viewer VR app. This app allows you to view any public scenes from ArcGIS Online in VR, using the same interface as the Complete Streets VR app, and is compatible with Windows PCs, Meta Quest headsets, and Windows Mixed Reality headsets. You can learn more about the Scene Viewer VR app in a blog post by Michael Luubert and download the app from its GitHub repo.

Beyond the VR apps, we plan to continue our work with the ArcGIS SDKs for Game Engines and to participate in iCity 2.0, the continuation of the iCity research project that originally led to our work on Complete Streets. Through this work and our other higher education R&D projects, we expect that there will be many more opportunities to push the limits of the 3D capabilities offered throughout the ArcGIS ecosystem!

About the Author

Michael Luubert is a software developer in the Education and Research department at Esri Canada. He undertakes research in 3D GIS, including developing web apps with the ArcGIS API for JavaScript and using ArcGIS CityEngine and game engines to create immersive urban and campus simulations. Michael graduated from the University of Waterloo with a Bachelor of Computer Science, a Minor in Geography and Environmental Management and a Diploma of Excellence in GIS. He enjoys playing soccer and ultimate frisbee.