
EatARy Assistant


Waiting at restaurants is annoying. Choosing a restaurant is difficult, especially when you want to avoid long, irritating waits.

EatARy Assistant is my multimedia research project, in which I try to reduce the time it takes to choose a restaurant while taking long waiting times into account.

Project duration: 4 months

Project Type: Individual

My Role:

  • User Research

  • UX & UI Design

  • Visual Design

  • Usability Testing

  • AR App Development


Research Question

How can the time spent selecting a restaurant be minimized so that waiting time is avoided?


Research Insights

To answer the research question, I interviewed different people. I also collected opinions from people who were actually standing in waiting queues outside restaurants.


Success Goals

  • Encouraging: new resources beyond online ordering and pick-up

  • Efficiency: fewer, easier steps

  • User-centric: the main focus is waiting time

Proposed Design Solution

I proposed creating an Augmented Reality application that shows restaurant names, waiting times, and the time to reach each restaurant through the phone's camera.



The proposed design solution required considering several factors: driving distance to the location, driving time to reach the restaurant, waiting time after reaching the restaurant, and waiting time including driving time. It was difficult to achieve the desired result within the 4-month timeframe with all of these factors in scope.

  • Driving distance to the location

  • Driving time to the location

  • Waiting time after reaching the restaurant

  • Waiting time including driving time

Changes to overcome Challenges

For the first stage of the prototype, I narrowed the scope to a single factor: showing only results within walking distance.


  • Walking distance to the restaurant

  • Time to reach the restaurant by walking

  • Waiting time after reaching the restaurant
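The narrowed scope above boils down to a simple calculation. As a sketch (in plain C#, not code from the project), walking time can be estimated from the straight-line (haversine) distance; the ~5 km/h walking speed and the straight-line approximation are my assumptions:

```csharp
using System;

// Sketch: estimating walking time to a restaurant from straight-line
// (haversine) distance. The 5 km/h speed and the straight-line
// approximation are illustrative assumptions.
static class WalkingTime
{
    const double EarthRadiusKm = 6371.0;
    const double WalkingSpeedKmh = 5.0;

    static double ToRad(double deg) => deg * Math.PI / 180.0;

    // Great-circle distance in kilometers between two lat/lon points.
    public static double HaversineKm(double lat1, double lon1,
                                     double lat2, double lon2)
    {
        double dLat = ToRad(lat2 - lat1);
        double dLon = ToRad(lon2 - lon1);
        double a = Math.Sin(dLat / 2) * Math.Sin(dLat / 2)
                 + Math.Cos(ToRad(lat1)) * Math.Cos(ToRad(lat2))
                 * Math.Sin(dLon / 2) * Math.Sin(dLon / 2);
        return EarthRadiusKm * 2 * Math.Atan2(Math.Sqrt(a), Math.Sqrt(1 - a));
    }

    // Walking minutes = distance / speed, rounded up.
    public static int WalkingMinutes(double distanceKm) =>
        (int)Math.Ceiling(distanceKm / WalkingSpeedKmh * 60.0);
}
```

The number a hungry user ultimately cares about is this walking time plus the restaurant's current waiting time.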


Design Solution 

EatARy Assistant is an Augmented Reality (AR) application that shows the waiting time at restaurants and the walking time to reach them in the real world, using the camera of your smartphone. The goal of the application is to reduce the steps needed to find out where to get food faster by avoiding long waiting times.

Consider Bella's journey as an example:

Task flow one / On-boarding process

Bella launches EatARy Assistant, and the on-boarding starts by showing the purpose of the app.

The main purpose is to avoid "waiting time" when you are hungry. To let EatARy Assistant serve that purpose, Bella allows GPS and camera access. Then she sees how the results appear in the application by moving her phone camera through 360 degrees.

Task flow one / On-boarding process
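The GPS and camera access step in this flow maps onto standard Unity permission APIs. A minimal sketch (Android path shown; the class and field names below are stock Unity APIs, not code from the project):

```csharp
using UnityEngine;
using UnityEngine.Android;

// Sketch of the on-boarding step where the app asks for GPS and
// camera access before showing AR results.
public class OnboardingPermissions : MonoBehaviour
{
    void Start()
    {
        // Camera permission is needed for the AR view.
        if (!Permission.HasUserAuthorizedPermission(Permission.Camera))
            Permission.RequestUserPermission(Permission.Camera);

        // Fine location is needed to find nearby restaurants.
        if (!Permission.HasUserAuthorizedPermission(Permission.FineLocation))
            Permission.RequestUserPermission(Permission.FineLocation);

        // Start Unity's location service once the user has enabled it.
        if (Input.location.isEnabledByUser)
            Input.location.Start();
    }
}
```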

Task flow two / See the results and find the restaurant with the least waiting time

Bella taps the phone icon on the screen to learn how to interpret the results. The biggest square represents the nearest restaurant and the smallest square the farthest. Once Bella understands how to interpret the results, she starts using the application.

Task flow two / How to interpret results


Flow for EatARy Assistant
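The square-size rule (biggest square = nearest restaurant) can be sketched as a distance-driven scale on each AR square. The min/max distances and scales below are illustrative, not values from the project:

```csharp
using UnityEngine;

// Sketch: scale each restaurant's AR square inversely with its
// distance from the user, so near = big and far = small.
public class ResultSquare : MonoBehaviour
{
    public Transform arCamera;          // the AR session's camera rig
    const float MinDistance = 50f;      // meters: at/inside this, full size
    const float MaxDistance = 500f;     // meters: at/beyond this, smallest
    const float MaxScale = 1.0f;
    const float MinScale = 0.2f;

    void Update()
    {
        float d = Vector3.Distance(arCamera.position, transform.position);
        // t is 0 for the nearest distance, 1 for the farthest.
        float t = Mathf.InverseLerp(MinDistance, MaxDistance, d);
        transform.localScale = Vector3.one * Mathf.Lerp(MaxScale, MinScale, t);
        // Face the user so the square stays readable.
        transform.LookAt(arCamera);
    }
}
```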

Design Decisions for Success Goals

To achieve the three success goals: Encouraging, Efficiency, and User-centric, I made several design decisions, including the use of a new technology, Augmented Reality (AR), and a simplified user experience.


My Process

  • Research: read articles and blogs; used existing mobile AR applications; studied the technology part of the research

  • Usability Testing: tested the paper prototype with different users

  • Design: sketched UI screens; designed graphics and made animations; made an InVision Studio prototype

  • Development: AR with markers and markerless AR; combined live restaurant data and animations with GPS and a 3D map


For the research, I started by using existing AR mobile applications and reading blogs and articles about AR design. That helped me understand the key points of AR application user experience and development.


I also researched the technical side, as I was not familiar with the AR development process and had to learn the coding language and platforms needed to build a functional prototype.

Usability Testing

I started user testing with my paper prototype to create a refined version for the actual development.


First Usability Testing:

First, I did user testing with friends who were not aware of my project at all and received feedback about improving the application's visuals.

They also suggested creating a more detailed on-boarding process.

Second Usability Testing:

I refined the prototype and did user testing again with my classmates.

They suggested minor changes to the symbols and the hierarchy of the steps.


All the collected feedback and my research helped me create a better version of my prototype.


Pictures of Usability Tests


For the design, I started by creating the graphics for the on-boarding process, as I wanted users to properly understand how they would use the application.

So I created human figures for better correlation, and other graphics for waiting and walking time.


Graphics for on-boarding process of an application

Apart from that, I also made a few animations and motion graphics to show the actual results.


Motion graphics for showing results

Color Scheme: The first three colors are primary; the others are used as secondary colors.










Fonts: For the instructions and for the AR artifacts, I used two different fonts so that they can be easily told apart.

For On-boarding instructions:

Futura PT Bold

Futura PT Book


For AR squares to show the results:

Gibson (Semibold)



For the development part, I started with simple marker-based Augmented Reality, as I was not familiar with AR development at the beginning.


Augmented Reality with Markers
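The write-up does not name the marker library used for this step; as one common way to do marker-based AR in Unity, AR Foundation's image tracking can spawn content on a recognized marker. A hedged sketch (prefab and field names are illustrative):

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Sketch of marker-based AR: spawn content anchored to each
// recognized marker image, using AR Foundation's image tracking.
public class MarkerSpawner : MonoBehaviour
{
    public ARTrackedImageManager imageManager;
    public GameObject contentPrefab;    // e.g. a waiting-time card

    void OnEnable()  => imageManager.trackedImagesChanged += OnChanged;
    void OnDisable() => imageManager.trackedImagesChanged -= OnChanged;

    void OnChanged(ARTrackedImagesChangedEventArgs args)
    {
        foreach (ARTrackedImage image in args.added)
        {
            // Parent the content to the detected marker's pose so it
            // follows the marker as tracking updates.
            Instantiate(contentPrefab, image.transform);
        }
    }
}
```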

After the successful attempt at marker-based AR, I moved on to markerless AR, where I learned to use the Mapbox SDK to load geo-location-based maps.


Augmented Reality with Markerless Geo-Location

For the next step, I integrated the phone's GPS and used it to load the 3D map.

Then I placed points of interest, such as restaurant names, on the 3D map using the Mapbox SDK and the AR Foundation library.


Augmented Reality with Markerless Geo-Location

(Showing Restaurants' names near California State University, East Bay)
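Placing a restaurant label at a latitude/longitude on the Mapbox 3D map can be sketched with the Mapbox Unity SDK's `AbstractMap.GeoToWorldPosition`, which converts a geo-coordinate into Unity world space. The coordinates and prefab below are placeholders, not data from the project:

```csharp
using UnityEngine;
using Mapbox.Unity.Map;
using Mapbox.Utils;

// Sketch: convert a restaurant's lat/lon into a world position on the
// Mapbox 3D map and drop a label prefab there.
public class RestaurantLabelPlacer : MonoBehaviour
{
    public AbstractMap map;             // the Mapbox map in the scene
    public GameObject labelPrefab;      // e.g. a TextMesh with the name

    void Start()
    {
        // Placeholder coordinates, not real restaurant data.
        var latLon = new Vector2d(37.6564, -122.0553);
        Vector3 worldPos = map.GeoToWorldPosition(latLon, true);
        Instantiate(labelPrefab, worldPos, Quaternion.identity);
    }
}
```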

Then I used physics fundamentals in Unity, such as raycasting and collisions, to make the motion graphics appear where the actual restaurant is.

Final Prototype of showing results based on the Location

In the prototype, the waiting time and the walking time to reach the restaurant are not based on any dataset; they are arbitrary numbers.


Challenges:

  • Integrating points of interest (restaurants) into the 3D AR map with the Mapbox SDK and World Scale AR technology.

  • Making the animation appear at the actual location of the restaurant.

Solutions:

  • Switched to the AR Foundation library instead of World Scale AR with the Mapbox SDK.

  • Applied physics fundamentals in Unity: when the ray from the camera intersects the Box Collider attached to the restaurant's board, the animation appears; otherwise it disappears.
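The ray/collider trick described above can be sketched as follows: cast a ray from the center of the AR camera each frame and show the waiting-time animation only while the ray hits this restaurant's Box Collider. The field names and max distance are illustrative:

```csharp
using UnityEngine;

// Sketch: toggle a restaurant's motion graphic based on whether the
// ray from the camera's center currently hits its Box Collider.
public class GazeRevealAnimation : MonoBehaviour
{
    public Camera arCamera;
    public GameObject animationRoot;    // the motion graphic to toggle
    const float MaxRayDistance = 500f;

    void Update()
    {
        // Ray through the middle of the screen.
        Ray ray = arCamera.ViewportPointToRay(new Vector3(0.5f, 0.5f, 0f));
        bool lookedAt =
            Physics.Raycast(ray, out RaycastHit hit, MaxRayDistance)
            && hit.collider.gameObject == gameObject;

        // Appear while looked at, disappear otherwise.
        animationRoot.SetActive(lookedAt);
    }
}
```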

