Visualizing Complete Street designs in VR, Part II
Two years ago, we started a research project to visualize Complete Street designs in VR. This blog post will outline the additional features and enhancements added to the app since our last post on the project. This work includes a new mini-map to view the larger scene context, integration with the Complete Streets Dashboard, and user interface improvements.
This blog post was jointly written by Jonathan Van Dusen and Michael Luubert of the Education and Research group.
The Complete Streets VR App lets users preview different complete street designs with the immersive experience of a VR headset. It is built with the ArcGIS Maps SDK for Unity and is an internal R&D project we worked on over the past two years as an extension of the iCity research project. The project is now essentially complete, and the enhancements described below, added over the past year, focused on making the VR app easier for new users to learn and more useful for an urban planning audience.
Mini-map
To provide better situational context at ground level, we added a mini-map: a scaled-down 3D cutout of the area surrounding the user. It appears fixed to the left controller, so the user can move the model closer or rotate it simply by moving the hand holding the controller, and its visibility can be toggled with a button on the left controller. A red arrow on the mini-map indicates the direction the user is facing and their current elevation.
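For readers curious about the mechanics, here is a minimal Unity C# sketch of how a mini-map like this can be anchored to a controller. The field names and button mapping are illustrative assumptions, not the app's actual implementation.

```csharp
using UnityEngine;

// Illustrative sketch: anchors a scaled-down map cutout to the left VR
// controller and toggles its visibility with a controller button.
// All field names and the "MiniMapToggle" button mapping are assumptions.
public class MiniMapAnchor : MonoBehaviour
{
    [SerializeField] private Transform leftController;  // tracked controller transform
    [SerializeField] private GameObject miniMapRoot;    // the 3D cutout of the surrounding area
    [SerializeField] private Transform userArrow;       // red arrow marking the user's heading
    [SerializeField] private Transform userCamera;      // the VR headset camera

    private void Update()
    {
        // Follow the controller: the user repositions or rotates the map
        // simply by moving the hand that holds the controller.
        miniMapRoot.transform.SetPositionAndRotation(
            leftController.position, leftController.rotation);

        // Point the red arrow in the direction the user is facing.
        float heading = userCamera.eulerAngles.y;
        userArrow.localRotation = Quaternion.Euler(0f, heading, 0f);

        // Toggle visibility with a button (this mapping would need to be
        // defined in the project's input settings).
        if (Input.GetButtonDown("MiniMapToggle"))
            miniMapRoot.SetActive(!miniMapRoot.activeSelf);
    }
}
```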
Connection with Complete Streets Dashboard and Integration of Scenario Metrics
One of our major goals for the second phase of our work was to make the Complete Streets VR app more useful to an urban planning audience by integrating it with the Complete Streets Dashboard. In the previous blog post, we mentioned our group’s collaboration on the iCity research project, which included working with students from OCAD University to create this online dashboard. The dashboard lets users view Level of Service (LOS) scores for Toronto streets, including scores for walking, cycling, transit, goods transportation, automobiles, and a sense of place. The LOS scores are based on calculations from the US National Cooperative Highway Research Program (see report).
The dashboard incorporates the open-source Streetmix app, which lets users create their own complete street designs by interactively adding and removing lane types for each street and modifying the lane widths. Users can also view an animated 3D model of their new street design using an embedded ArcGIS Maps SDK for JavaScript viewer, with the 3D model generated behind the scenes on a server running the ArcGIS CityEngine SDK. We wanted planners to be able to configure a scenario in the Complete Streets Dashboard using the Streetmix interface and then immediately view their custom scenario in the VR app. We also wanted to incorporate the LOS scores and other metrics, such as road widths, into the VR environment and provide a way for users to learn more about what these scores and metrics represent.
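The exact integration plumbing is beyond the scope of this post, but the sketch below shows one way a Unity client could pull scenario metrics from a dashboard. The endpoint URL and JSON shape here are hypothetical placeholders, not the dashboard's actual API; only the Unity calls (UnityWebRequest, JsonUtility) are standard.

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.Networking;

// Hypothetical JSON shape for a scenario's metrics; the real payload
// would depend on the dashboard's API.
[System.Serializable]
public class ScenarioMetrics
{
    public float walkingLos;
    public float cyclingLos;
    public float transitLos;
    public float[] laneWidths;
}

public class ScenarioMetricsLoader : MonoBehaviour
{
    // Placeholder REST endpoint exposing the current Streetmix scenario.
    private const string MetricsUrl =
        "https://example.com/complete-streets/api/scenario/metrics";

    public IEnumerator FetchMetrics(System.Action<ScenarioMetrics> onLoaded)
    {
        using (UnityWebRequest request = UnityWebRequest.Get(MetricsUrl))
        {
            yield return request.SendWebRequest();

            if (request.result != UnityWebRequest.Result.Success)
            {
                Debug.LogWarning($"Metrics request failed: {request.error}");
                yield break;
            }

            // Parse the JSON payload into the serializable class above.
            var metrics = JsonUtility.FromJson<ScenarioMetrics>(
                request.downloadHandler.text);
            onLoaded(metrics);
        }
    }
}
```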
The video below shows the results of the integration, based on an example workflow where a planner or other interested citizen can (1) inspect a street segment in the Complete Streets VR app, (2) view the LOS scores and lane widths in VR, (3) modify the scenario in the Complete Streets Dashboard, and (4) view their changes in the Complete Streets VR app, including the effects on the LOS scores.
Sun-angle controls
In the initial version of the Complete Streets VR app, users could modify the sun angle by dragging the sun around in the sky using the mouse or the VR controller. While fun to use, and a great example of direct manipulation in user interfaces, this was not always suitable for urban planning use cases, since it allowed sun angles that would never occur in real life at Toronto’s latitude and longitude. Our co-op student from the University of Waterloo, Ben Woodward, implemented controls that limit the sun’s angle to realistic ranges for Toronto, using the open-source SunCalcNet library to calculate the sun’s altitude and azimuth. The user specifies the date and time with two sliders: one for the month of the year (fixed to the 21st day of each month so the solstices are available as options) and one for the time of day. Together, these sliders allow users to view realistic lighting conditions for the Complete Streets scenarios, including shadows from adjacent buildings.
For a video demonstration of the sun-angle functionality, please see Ben’s blog post, Transit Equity and Virtual Reality: Outtakes from my Co-op with Esri Canada.
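To give a sense of how this can work, here is a simplified sketch of the sun-angle logic, assuming SunCalcNet's SunCalc.GetSunPosition(date, lat, lng) returns altitude and azimuth in radians (as in the suncalc.js library it ports). The slider ranges, fixed year, and rotation conventions are our illustrative choices, not the app's actual code.

```csharp
using System;
using SunCalcNet;   // open-source port of suncalc.js
using UnityEngine;

// Sketch of sun-angle controls limited to realistic positions for Toronto.
public class SunAngleController : MonoBehaviour
{
    private const double TorontoLat = 43.65;   // degrees north
    private const double TorontoLng = -79.38;  // degrees east

    [SerializeField] private Light sunLight;        // the scene's directional light
    [Range(1, 12)] public int month = 6;            // month-of-year slider
    [Range(0f, 24f)] public float hourOfDay = 12f;  // time-of-day slider

    private void Update()
    {
        // Use the 21st of the month so the solstices are selectable.
        // This assumes the machine's local time zone matches Toronto's.
        var date = new DateTime(2023, month, 21)
            .AddHours(hourOfDay)
            .ToUniversalTime();

        var sunPos = SunCalc.GetSunPosition(date, TorontoLat, TorontoLng);

        float altitudeDeg = (float)(sunPos.Altitude * Mathf.Rad2Deg);
        float azimuthDeg = (float)(sunPos.Azimuth * Mathf.Rad2Deg);

        // suncalc.js measures azimuth from south; add 180° so the light's
        // Y rotation is measured from north (an assumption about conventions).
        sunLight.transform.rotation =
            Quaternion.Euler(altitudeDeg, azimuthDeg + 180f, 0f);
    }
}
```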
UI Updates
In addition to the interface for changing the sun angle, we added new interfaces to display the user’s speed as they “fly” through the scene with the VR controllers and to display the current scaling factor for the model. These appear as small contextual UIs, anchored to the left VR controller, while the user is accelerating, decelerating, or changing the model scale. These indicators make the status of the application more visible to users; visibility of system status is the first item in Jakob Nielsen’s classic list of 10 usability heuristics.
Demonstration of the speed and scale indicators while in a VR scene
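A contextual indicator like this can be implemented with a simple show-then-hide pattern. The sketch below is a minimal illustration; the field names and the 1.5-second timeout are chosen for the example rather than taken from the app.

```csharp
using UnityEngine;
using UnityEngine.UI;

// Shows a value on a world-space label while it is changing,
// then hides the label after a short delay.
public class ContextualIndicator : MonoBehaviour
{
    [SerializeField] private Text label;             // world-space UI text on the left controller
    [SerializeField] private float hideDelay = 1.5f; // seconds to linger after the last change

    private float lastValue;
    private float hideTimer;

    public void Report(string format, float value)
    {
        // Only surface the indicator while the value is actually changing.
        if (!Mathf.Approximately(value, lastValue))
        {
            label.text = string.Format(format, value);
            label.enabled = true;
            hideTimer = hideDelay;
            lastValue = value;
        }
    }

    private void Update()
    {
        if (!label.enabled) return;
        hideTimer -= Time.deltaTime;
        if (hideTimer <= 0f)
            label.enabled = false;
    }
}
```

A movement script would then call something like indicator.Report("Speed: {0:F1} m/s", currentSpeed) each frame, so the label appears only while the user is speeding up or slowing down.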
As well, we added on-screen labels for each button on the VR controllers so that users could learn the controllers’ functionality more easily. Users can toggle the labels on and off at any point while using the VR app.
Demonstration of toggling the controller labels on and off while in a VR scene
Tutorial level
Ben Woodward also added a tutorial level that appears when users open the app and walks them through using the VR headset’s controllers (or their mouse and keyboard) to navigate the 3D environment, teleport between streets, and use the menu UI. Although we had already added labels for the controller buttons as described above, our testing showed that users still struggled to learn the layout of the VR controllers and how to navigate the VR scenes, so we felt it would be useful to introduce this functionality more gradually.
The idea of a tutorial level had actually been included in our earliest UI sketches for the VR app, as presented in the previous blog post, so we are glad that Ben was able to make this a (virtual) reality.
Screenshots of example steps from the tutorial level, showing instructions for how to move downwards in a scene and how to teleport to a street
Building textures
To enhance the level of immersion in the app, we added the textured buildings from the iCity 3D Toronto model. Apart from iconic buildings such as the CN Tower, the buildings were textured in CityEngine using a procedural rule. The textured buildings can be toggled on and off in the app to change the level of immersion at street level. Below you can see a comparison of the textured and untextured models and a video showing the view from street level.
Conclusion
One overarching theme for our work on the Complete Streets VR app in 2023 was to make the app more relevant to an urban planning audience, which led us to integrate scenario metrics, refine the sun-angle controls, and add building textures. At the same time, we wanted to make the app more easily accessible to a citizen audience, specifically those without VR expertise, leading us to add the mini-map, tutorial level, controller labels, and speed and scale indicators.
If you’re interested in using the Complete Streets VR app, a modified version is available to download in the form of the Scene Viewer VR app. This app allows you to view any public scenes from ArcGIS Online in VR, using the same interface as the Complete Streets VR app, and is compatible with Windows PCs, Meta Quest headsets, and Windows Mixed Reality headsets. You can learn more about the Scene Viewer VR app in a blog post by Michael Luubert and download the app from its GitHub repo.
Beyond the VR apps, we plan to continue our work with the ArcGIS SDKs for Game Engines and to participate in iCity 2.0, the continuation of the iCity research project that originally led to our work on Complete Streets. Through this work and our other higher education R&D projects, we expect that there will be many more opportunities to push the limits of the 3D capabilities offered throughout the ArcGIS ecosystem!