AR application – Setting up basic functionalities

  1. UI
    The first thing the player encounters upon opening the application is an instructions panel. The player must press the “Ok” button to proceed to the AR experience.
  2. Plane Detection
    The application detects a horizontal plane via the smartphone’s camera and automatically instantiates trash objects.
    Initially the player had to tap the screen to instantiate the trash, but I decided to make this automatic for two reasons. The first is that the tap felt unnecessary and was an extra task the player had to remember to do. The second is that the OK button in the UI interfered with plane detection: while testing the app I found myself instantiating trash as I pressed OK, which led to misplaced trash. A rough sketch of the automatic spawning approach is included after this list.
  3. Image Tracking
    The player uses the target image to generate a virtual net that is used to pick up the virtual trash. The player must move the net so that it collides with the trash. Upon collision, the trash item disappears, signifying that it is now inside the net.
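
The post does not include the project’s actual code, and it does not say which engine or AR framework the app is built with. Purely as an illustration of the automatic spawning described above, here is a minimal sketch assuming a native ARKit/SceneKit setup; the TrashSpawner class, the placeholder sphere geometry, the spawn count, and the node name "trash" are all my own placeholders rather than anything from the app.

```swift
import ARKit
import SceneKit

/// Spawns trash automatically on the first horizontal plane ARKit detects,
/// so the player never has to tap the screen to place it.
class TrashSpawner: NSObject, ARSCNViewDelegate {

    let sceneView: ARSCNView
    private var hasSpawnedTrash = false

    init(sceneView: ARSCNView) {
        self.sceneView = sceneView
        super.init()
        sceneView.delegate = self
    }

    /// Called (for example from the OK button handler that dismisses the
    /// instructions panel) to start horizontal plane detection.
    func startPlaneDetection() {
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = [.horizontal]
        sceneView.session.run(configuration)
    }

    // ARKit adds a node for every new anchor; when that anchor is a plane,
    // scatter a few placeholder trash objects across it.
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard let planeAnchor = anchor as? ARPlaneAnchor, !hasSpawnedTrash else { return }
        hasSpawnedTrash = true

        for _ in 0..<5 {
            let trash = SCNNode(geometry: SCNSphere(radius: 0.03)) // placeholder trash model
            trash.name = "trash"
            // Random offset within the detected plane's extent.
            let x = Float.random(in: -planeAnchor.extent.x / 2 ... planeAnchor.extent.x / 2)
            let z = Float.random(in: -planeAnchor.extent.z / 2 ... planeAnchor.extent.z / 2)
            trash.position = SCNVector3(planeAnchor.center.x + x, 0, planeAnchor.center.z + z)
            node.addChildNode(trash)
        }
    }
}
```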

Initial Screenshots of the three basic functionalities of the app:

User Interface: The instructions panel.
The instructions have since been updated, as there is no longer any need to tap the screen to generate the trash.
Plane Detection: Trash on the detected plane.
The sizing and the type of trash are not representative. [Although the objects do get larger when the player moves closer, they are still small.]
Image Tracking:
The image of a real net was displayed on the laptop screen. The smartphone detected that image and generated a cube. The cube represents the virtual net that will be used to pick up trash.
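
Again as a hedged illustration rather than the project’s actual implementation, the sketch below continues the ARKit/SceneKit assumption from the earlier snippet: when the reference image of the net is recognised, a cube is attached to the image anchor, and a physics contact callback removes a trash item as soon as the net touches it. The asset group name "AR Targets", the CollisionCategory values, and the node names "net" and "trash" are assumptions of mine.

```swift
import ARKit
import SceneKit

/// Physics categories used to detect net/trash contacts (placeholder values).
struct CollisionCategory {
    static let net: Int = 1 << 0
    static let trash: Int = 1 << 1
}

class NetTracker: NSObject, ARSCNViewDelegate, SCNPhysicsContactDelegate {

    let sceneView: ARSCNView

    init(sceneView: ARSCNView) {
        self.sceneView = sceneView
        super.init()
        sceneView.delegate = self
        sceneView.scene.physicsWorld.contactDelegate = self
    }

    /// Runs world tracking with both plane detection and image detection,
    /// assuming the net target image lives in an asset group called "AR Targets".
    func startTracking() {
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = [.horizontal]
        configuration.detectionImages =
            ARReferenceImage.referenceImages(inGroupNamed: "AR Targets", bundle: .main) ?? []
        sceneView.session.run(configuration)
    }

    // When ARKit recognises the net image, attach a cube (the "virtual net")
    // to the image anchor and give it a kinematic physics body.
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard anchor is ARImageAnchor else { return }

        let net = SCNNode(geometry: SCNBox(width: 0.1, height: 0.1, length: 0.1, chamferRadius: 0))
        net.name = "net"
        net.physicsBody = SCNPhysicsBody(type: .kinematic, shape: nil)
        net.physicsBody?.categoryBitMask = CollisionCategory.net
        net.physicsBody?.contactTestBitMask = CollisionCategory.trash
        node.addChildNode(net)
    }

    // Remove a trash item as soon as the net touches it, i.e. it is "in the net".
    func physicsWorld(_ world: SCNPhysicsWorld, didBegin contact: SCNPhysicsContact) {
        let nodes = [contact.nodeA, contact.nodeB]
        guard nodes.contains(where: { $0.name == "net" }) else { return }
        nodes.first { $0.name == "trash" }?.removeFromParentNode()
    }
}
```

For the contact callback to fire, the trash nodes spawned in the earlier sketch would also need physics bodies tagged with the trash category and a matching contact test mask.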
