
Background
When Roger (our tech developer) told us, "Let's make an AR app!", we were all very pumped. We started to look into AR applications on the market and at our client list.
A proof of concept is a way to win potential new projects from clients and to keep the team growing with the newest technology.
This project opened a new door for me: creating UI on top of the real world. Let's take a look!
I. Intro
It seems the number of lights, symbols, and buttons in automobiles keeps increasing. While these new features may keep us safer, it can be tricky to figure out what they all mean.

This app is a proposal that uses AR technology to help drivers understand new features and car issues in a timely way.
Type of Project
Proof of Concept for Hyundai
Open Source Project

My Role
User Research, UI/UX

Tech Support
iOS AR, Azure Cloud, Flutter

Team
Project Manager, Technology Architect, Developer, UX Designer
II. User Stories
1. Flashing icon: "I'm not sure what that means!"
​
Mary is a new driver. While driving, she saw a flashing icon on her dashboard. She stopped her car and went through pages of the manual to figure out what the alert meant.
2. Unfamiliar New Car
​
John purchased a new car online to replace his old 1997 4Runner. He realized that the car was too new for him to understand all of the buttons, so he read through the user manual and watched YouTube videos about it.
III. Design Highlights
Main Menu
Based on the user scenarios and the technology's possibilities, we mapped three main function options into the user flow.
Following the branding guide, I designed the sign-in and menu pages with the current app color palette. To carry users into the AR world, I made a transition animation. I wanted the app to be used in landscape orientation so that the camera can capture a wider view.
​

When scanning starts, the screen is darkened to create higher contrast with the white dots.
Since the steps are simple, the instruction box is hidden; the user can get more help if needed.
​
Instruction Display
For first-time users, the instructions should appear on the screen.
App Interfaces
Once an information window is open, it stays on the screen even when the phone moves.
The User’s Movement
​
- Sitting in the driver's seat, operating the AR application.
- Going somewhere, probably in a hurry.
​
The user's motion includes moving the phone around the steering wheel and getting closer to and farther from the instrument panel.

- When the camera zooms in and out, the UI should move with the user's movement, as in the sketch below.
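Here is a minimal sketch, in Dart, of how an overlay could follow a detected dashboard icon as the phone moves or zooms. The Detection class and screenRectFor helper are hypothetical stand-ins, not the team's actual implementation.

    // A minimal sketch: keep the AR overlay attached to a detected icon as the
    // phone moves. `Detection` and `screenRectFor` are hypothetical names.
    import 'dart:ui' show Rect, Size;

    /// One result from the object-detection model, in normalized image
    /// coordinates (0.0-1.0) so it is independent of the camera resolution.
    class Detection {
      final String label;        // e.g. "tire_pressure_warning"
      final Rect normalizedBox;  // left/top/width/height in the 0-1 range
      const Detection(this.label, this.normalizedBox);
    }

    /// Convert the latest detection into screen-space pixels. Running this on
    /// every camera frame keeps the anchored UI moving with the user's
    /// movement: when the phone gets closer, the box (and the window pinned
    /// to it) grows and shifts accordingly.
    Rect screenRectFor(Detection d, Size screenSize) {
      return Rect.fromLTWH(
        d.normalizedBox.left * screenSize.width,
        d.normalizedBox.top * screenSize.height,
        d.normalizedBox.width * screenSize.width,
        d.normalizedBox.height * screenSize.height,
      );
    }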




The User’s Environment & Camera View
​
This car interior has a dark-colored steering wheel and dashboard.

- The UI needs to display information with high contrast.

The driver is looking at the dashboard and steering wheel, which already contain a lot of information.

- On top of that, the UI shouldn't overwhelm the user.

White dots on a dark overlay are visible across a variety of interior colors.
The dots are sized according to the distance between the object and the phone; a small sketch of this sizing rule follows.
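As an illustration of the sizing rule, here is a small Dart sketch. The distance would come from the AR depth/anchor data; dotRadiusFor and its pixel constants are assumptions for illustration only.

    // A minimal sketch of "dots sized by distance": near objects get larger
    // dots, far objects smaller ones, clamped so they stay visible but never
    // overwhelming. The constants are placeholder assumptions.
    import 'dart:math' as math;

    double dotRadiusFor(double distanceMeters) {
      const double minRadius = 4.0;          // far away: small but visible
      const double maxRadius = 14.0;         // very close: large, easy to tap
      const double referenceDistance = 0.3;  // ~30 cm, a typical in-car distance
      final double radius =
          maxRadius * referenceDistance / math.max(distanceMeters, 0.01);
      return radius.clamp(minRadius, maxRadius).toDouble();
    }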
​
Normal Scanning
Tap to focus on target areas. The white dashed line identifies the focus area. Relevant info is shown.
​
Alert Detected
When a red icon is detected, it is highlighted in red with an animation to draw attention. It also acts as a button for more information.

Screenshots: object is close vs. object is farther away.
When there is an alert, the flashing icon increases its salience on the screen. The instruction box pops out and directs the user to tap the red icon. To avoid distraction, no white dots are shown in this case.
IV. Understanding the Technology
Learning from the developers helped me understand the technical possibilities.
​​
​
- The system is integrated with machine learning.
- New models give state-of-the-art performance in real-time object detection.
- Models may use a variety of methods with different resolutions and accuracy, each fitting different use cases.
- There are four primary ways of detecting items:
​

Our application's model is used to identify icons and buttons. Here are the steps to build up the AI:
​
- Taking many pictures of the different possible icons that can appear on the dashboard
- Training our models to identify those images in 3D space
- Hooking those models into an AR view in Flutter and responding to correct identifications
​
Combining AR and machine learning on top of a Flutter application gives the customer the ability to learn about the features of their car through the camera, instead of having to dig through a user manual or a website's help section. A rough sketch of this flow appears below.
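The sketch below, in Flutter/Dart, shows one way this pipeline could be wired together. IconDetector, DashboardDetection, and the camera preview widget are hypothetical stand-ins for the real model and AR plugin, not the project's actual code.

    // A simplified sketch: detections from a trained icon model are drawn as an
    // overlay on top of the camera view. The detector and its stream are
    // hypothetical stand-ins for the real model/AR plumbing.
    import 'package:flutter/material.dart';

    class DashboardDetection {
      final String label;        // e.g. "check_engine"
      final Rect normalizedBox;  // 0-1 coordinates within the camera frame
      final bool isAlert;        // red warning icons get the alert treatment
      const DashboardDetection(this.label, this.normalizedBox, this.isAlert);
    }

    /// Hypothetical wrapper around the trained model.
    abstract class IconDetector {
      Stream<List<DashboardDetection>> get detections;
    }

    class ArDashboardOverlay extends StatelessWidget {
      final IconDetector detector;
      final Widget cameraPreview; // whatever widget the camera/AR plugin provides
      const ArDashboardOverlay(
          {super.key, required this.detector, required this.cameraPreview});

      @override
      Widget build(BuildContext context) {
        return StreamBuilder<List<DashboardDetection>>(
          stream: detector.detections,
          builder: (context, snapshot) {
            final detections = snapshot.data ?? const <DashboardDetection>[];
            final size = MediaQuery.of(context).size;
            return Stack(
              children: [
                cameraPreview, // darkened camera feed behind the overlay
                for (final d in detections)
                  Positioned(
                    left: d.normalizedBox.left * size.width,
                    top: d.normalizedBox.top * size.height,
                    // Alerts are red and draw attention; normal items are the
                    // white dots described in the design highlights.
                    child: Icon(Icons.circle,
                        size: 10, color: d.isAlert ? Colors.red : Colors.white),
                  ),
              ],
            );
          },
        );
      }
    }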
V. UI Variations
I applied different layouts to find the best UI for the car interior.

The instructions are very visible, but they take up too much space. The dots are all the same size.

The instructions are hidden, and the dots are rendered at different sizes. Within 5 inches, the name also shows on the screen.
​

The dots are sized by distance, and the instructions are hidden.
VI. Testing
The first demo helped me test the UI in the car environment and gave us an idea of how fast the model can detect the icons and buttons.
​
We only built the horizontal (landscape) view of this app.
​
We tested at night; when there is enough light, the model can recognize the objects.


VII. What's next?
​
1. The horizontal view narrows the field of view, but it keeps the object in focus. The app should be designed for both orientations.
​
2. Create different UI styles for the app and try them with real user testing.
​
3. Testing in different lighting conditions is needed; create an instruction box for dark environments.
VIII. Reflection
AR is about the camera; it is about adding information on top of the environment.
​
The user research in the car was key to this project. Sitting in the driver's seat and looking through the camera, I learned that the surroundings are complicated and confusing. On top of that, I wanted to create a new point of focus on the screen with a clean UI.
​
Accessibility for many people and their needs is the app's main purpose. I thought through natural behaviors to shape the AR experience. When the phone moves, the AR has to react quickly and reliably. The important information has to stay on the screen.
When a driver needs help on the road, they need to know where to go to fix the problem. The content was designed with advice from executives.
​
This was my first AR app, and I learned a lot about designing for the camera and what it captures.