Title
Using Facial Features to Produce an Augmented Reality System
Author
Sharma, Pragya
Date
2022
Keywords
Electrical engineering
Computer engineering
Computer vision
Convolutional neural networks
Eye tracking
Facial recognition
Human-computer interaction
Machine learning
School of Engineering
Abstract
Under the direction of Anthony Choi, Ph.D. In the following work, an augmented reality system is proposed that uses the gaze of the eye, assisted by facial movements, as an input that lets the user interact with a sample restaurant menu. Currently, a customer ordering food at a restaurant must either handle a physical menu or touch a shared screen. In either case, the customer interacts with surfaces shared by many individuals and risks contracting any number of illnesses. The problem is magnified during a pandemic, when contact with shared surfaces poses a higher risk of exposure to the virus. Using a pre-trained neural network built on pre-identified facial landmarks that every user possesses, the program tracks the user's gaze and displays the positions of both the left and right pupils. Input begins when the user opens their mouth past a certain threshold. The user then guides the cursor by moving their face within the green box shown on screen, and selects a menu item by stopping facial movement and winking their left eye. During user testing of the displayed coordinates, the program reported the correct coordinates nine times out of ten. With practice, users grew accustomed to guiding the cursor with their facial movements, although the cursor control speed could be better adjusted in future testing, and the required exaggeration of facial movements could be reduced so that users do not feel awkward utilizing the system.
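The mouth-open activation and left-eye-wink selection described in the abstract are commonly implemented with aspect ratios computed over facial landmarks. The following is a minimal sketch of that idea, not the thesis's actual code: the threshold values, the six-point eye layout (as in the 68-point dlib convention), and the `interpret_frame` helper are all illustrative assumptions.

```python
import math

# Illustrative thresholds, not values from the thesis; they would need
# tuning against the actual landmark model's output scale.
MOUTH_OPEN_THRESH = 0.6   # mouth aspect ratio above this starts input
WINK_EAR_THRESH = 0.20    # eye aspect ratio below this counts as closed

def _dist(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

def eye_aspect_ratio(eye):
    """eye: six (x, y) landmarks around the eye contour, ordered as in
    the common 68-point dlib layout (corner, top x2, corner, bottom x2)."""
    a = _dist(eye[1], eye[5])          # vertical distance 1
    b = _dist(eye[2], eye[4])          # vertical distance 2
    c = _dist(eye[0], eye[3])          # horizontal eye width
    return (a + b) / (2.0 * c)

def mouth_aspect_ratio(top, bottom, left, right):
    """Ratio of mouth opening height to mouth width."""
    return _dist(top, bottom) / _dist(left, right)

def interpret_frame(left_eye, right_eye, mouth):
    """Map one frame of landmarks to a UI action (hypothetical helper)."""
    if mouth_aspect_ratio(*mouth) > MOUTH_OPEN_THRESH:
        return "start_input"           # open mouth activates reading
    left_closed = eye_aspect_ratio(left_eye) < WINK_EAR_THRESH
    right_open = eye_aspect_ratio(right_eye) >= WINK_EAR_THRESH
    if left_closed and right_open:
        return "select"                # left-eye wink selects the item
    return "track"                     # otherwise keep guiding the cursor
```

A wink is distinguished from a blink by requiring the other eye to stay open; in practice both ratios would also be smoothed over a few frames to reject noise.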