
AR for the Future of Public Transit
Designing intuitive AR experiences for a futuristic hybrid public transit system
Interaction Design
Augmented Reality
Embodied Interaction
Project Type
Academic Project
Goal
Enhance the everyday transit experience of local passengers, from booking to transfer, using AR
Target Audience
Bloomington residents using a hybrid pod–bus public transit system
Role
Co-designer
Team
4 co-designers
Tools
Physical Prototyping
Date
Feb 2025
Duration
Overview
We designed augmented reality (AR) interactions to support a speculative future public transit system in Bloomington that blends human-driven buses and autonomous pods. The goal was to enhance the everyday experience of local passengers, from booking to transfer, while ensuring clarity, safety, and continuity throughout the journey.
Methods
Sketching
Bodystorming
Cognitive walkthrough
Physical prototyping
Highlights
1. Guide users through a seamless, multi-stage journey
To support Bloomington residents navigating a hybrid pod–bus transit system, we designed an end-to-end AR experience. Our goal was to reduce cognitive load and clearly signal what to do next—whether booking a ride, boarding a pod, or transferring to a bus.
Designed an AR-based interface showing users their live journey progress using layered visual cues (e.g., color-coded segments for walking, pod, and bus phases); a brief sketch of this segment model follows the list below.
Developed contextual prompts like directional arrows, lock/unlock icons, and journey progress bars to support transitions between ride stages.
Evaluated the clarity of visual cues and feedback during bodystorming sessions, improving labels and iconography (e.g., “Your pod is here,” exit direction arrows).
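
To make the layered-cue idea concrete, here is a minimal sketch (in TypeScript) of how the color-coded journey segments might be modeled. Everything here is hypothetical: the Segment and journeyProgress names, the colors, and the durations are illustrative, since the project itself was explored through sketches and physical prototypes rather than code.

```typescript
// Hypothetical journey model: a trip is an ordered list of phases,
// each with its own AR color so the progress bar reads at a glance.
type Phase = "walk" | "pod" | "bus";

interface Segment {
  phase: Phase;
  durationMin: number; // estimated duration of this leg
}

// Color-coding assumed for illustration only.
const PHASE_COLORS: Record<Phase, string> = {
  walk: "#8E8E93", // gray for walking legs
  pod: "#34C759",  // green for the autonomous pod
  bus: "#007AFF",  // blue for the human-driven bus
};

// Given elapsed minutes, find the active segment and overall progress (0..1),
// which would drive both the journey bar and prompts like "Your pod is here."
function journeyProgress(segments: Segment[], elapsedMin: number) {
  const total = segments.reduce((sum, s) => sum + s.durationMin, 0);
  let remaining = elapsedMin;
  for (const segment of segments) {
    if (remaining < segment.durationMin) {
      return {
        active: segment,
        color: PHASE_COLORS[segment.phase],
        overall: Math.min(elapsedMin / total, 1),
      };
    }
    remaining -= segment.durationMin;
  }
  return { active: null, color: null, overall: 1 }; // journey complete
}

// Example trip: walk to the pod stop, ride the pod, transfer to the bus.
const trip: Segment[] = [
  { phase: "walk", durationMin: 5 },
  { phase: "pod", durationMin: 12 },
  { phase: "bus", durationMin: 20 },
];
console.log(journeyProgress(trip, 10)); // active pod leg, ~27% overall
```

Keeping each leg as an explicit segment is what lets the progress bar stay color-coded end to end, rather than treating the trip as one undifferentiated timeline.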
2. Minimize friction in AR interactions during daily activities
AR transit interactions needed to integrate smoothly into users’ routines without overwhelming them or requiring full attention.
Created a gesture-driven interface that lets users check schedules, book rides, and confirm actions using subtle finger swipes, taps, and voice input (see the gesture-mapping sketch after this list).
Tested these micro-interactions using sketch-based prototypes in a cognitive walkthrough, adjusting icon placement and gesture feedback based on confusion points (e.g., lock icons initially misread as device settings).
Refined the UI to avoid overloading users, balancing persistent visuals (journey bar) with light-touch notifications (arrival indicators).
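
A minimal sketch of the gesture-to-action mapping described above. The gesture names, actions, and confirmation rule are assumptions for illustration; the actual micro-interactions were validated on paper via cognitive walkthroughs, not implemented.

```typescript
// Hypothetical gesture map: each micro-interaction resolves to one transit
// action, keeping the AR layer usable without demanding full attention.
type Gesture = "swipe-left" | "swipe-right" | "tap" | "double-tap" | "voice-confirm";
type TransitAction = "showSchedule" | "dismissOverlay" | "bookRide" | "confirmBoarding";

const GESTURE_MAP: Record<Gesture, TransitAction> = {
  "swipe-left": "showSchedule",
  "swipe-right": "dismissOverlay",
  "tap": "bookRide",
  "double-tap": "confirmBoarding",
  "voice-confirm": "confirmBoarding", // voice as a redundant channel for confirmations
};

// Dispatcher sketch: committing actions (booking, boarding) surface
// explicit feedback before executing, so a stray swipe cannot book a ride.
function handleGesture(gesture: Gesture, onAction: (a: TransitAction) => void) {
  const action = GESTURE_MAP[gesture];
  const needsConfirmation = action === "bookRide" || action === "confirmBoarding";
  if (needsConfirmation) {
    console.log(`Show haptic + icon feedback before committing: ${action}`);
  }
  onAction(action);
}

handleGesture("tap", (a) => console.log(`Executing: ${a}`));
```

Routing voice and double-tap to the same confirmation action reflects the low-friction goal: users pick whichever channel fits what their hands and eyes are already doing.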
3. Communicate contextual information without distraction
Because users might be watching content or navigating other apps, our AR interface needed to show only the most relevant information at the right time.
Introduced a multi-layered visual hierarchy: static overlays for always-relevant info (trip stage, time), spatial overlays for directional cues (boarding arrows), and notifications for time-sensitive actions (exit prompts); a layer-priority sketch follows the list below.
Mapped user activity alongside trip phases to identify opportune moments for interaction vs. passive updates.
Tested and iterated based on feedback that journey summaries were unclear, leading us to visually anchor summaries to booking visuals for better cohesion.
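
To pin down the three-tier hierarchy, here is a small sketch of how overlay visibility might be decided from layer type and user attention. The layer model and priority rule are illustrative assumptions, not the project's actual logic.

```typescript
// Hypothetical layer model for the three-tier visual hierarchy:
// static overlays persist, spatial overlays appear when attention allows,
// and notifications interrupt only for time-sensitive actions.
type Layer = "static" | "spatial" | "notification";

interface OverlayItem {
  layer: Layer;
  content: string;
  // Time-sensitive items may interrupt the user's current activity (e.g., video).
  interruptsActivity: boolean;
}

// Decide what to render given whether the user is engaged in another activity.
function visibleOverlays(items: OverlayItem[], userIsBusy: boolean): OverlayItem[] {
  return items.filter((item) => {
    if (item.layer === "static") return true; // always-relevant info stays up
    if (item.layer === "notification") return item.interruptsActivity; // exit prompts etc.
    return !userIsBusy; // spatial cues wait for the user's attention
  });
}

const overlays: OverlayItem[] = [
  { layer: "static", content: "Pod leg · 12 min remaining", interruptsActivity: false },
  { layer: "spatial", content: "Boarding arrow toward platform B", interruptsActivity: false },
  { layer: "notification", content: "Exit at next stop", interruptsActivity: true },
];
console.log(visibleOverlays(overlays, true)); // static info + exit prompt only
```

Keeping the decision in one small rule mirrors the design intent: always-relevant info persists, and interruptions are reserved for moments where missing them would break the journey.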
Reflection
Designing AR interactions required granular attention to context, motion, and clarity, far more than static UI design does. Early-stage sketching helped us align on a shared vision, and bodystorming revealed edge cases we had initially overlooked. Had we scoped the project more tightly or involved users earlier through co-design, we could have tested specific features in greater depth. Still, this project reinforced how strong interaction design can make even a speculative future feel intuitive.