
dc.contributor.author: Sharma, Pragya
dc.date: 2022
dc.date.accessioned: 2022-05-02T15:22:51Z
dc.date.available: 2022-05-02T15:22:51Z
dc.identifier.uri: http://hdl.handle.net/10898/13576
dc.description.abstract: Under the direction of Anthony Choi, Ph.D. This work proposes an augmented reality system that takes eye gaze, assisted by facial movements, as input so that a user can interact with a sample restaurant menu. At present, a customer orders food at a restaurant either by handling a physical menu or by touching a shared screen; in either case, the customer must interact with surfaces shared by many individuals and risks contracting any number of illnesses. This problem is magnified during a pandemic, when contact with shared surfaces poses a higher risk of exposure to the virus. Using a pre-trained neural network built on facial landmarks that every user possesses, the program tracks the user's gaze and displays the positions of the left and right pupils. The program begins reading input once the user opens their mouth past a set threshold. The user then guides the cursor by moving their face within the green box shown on screen and, after holding the face still, selects a menu item by winking the left eye. In user testing of the displayed coordinates, the program reported the correct coordinates nine times out of ten. With practice, users grew accustomed to guiding the cursor with their facial movements, although cursor control speed could be better tuned in future testing, and the required exaggeration of facial movements could be reduced so that the user does not feel awkward using the system.
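The abstract outlines a concrete input pipeline: a pre-trained facial-landmark network tracks the face, an open mouth past a threshold arms input, head movement inside a green on-screen box drives a cursor, and a left-eye wink selects. Below is a minimal, hypothetical sketch of that logic, assuming dlib's 68-point landmark model and OpenCV for capture; the threshold values, the use of the nose tip as the cursor reference, and the eye/mouth aspect-ratio tests are illustrative assumptions, not the thesis code.

```python
import cv2
import dlib
import numpy as np

MOUTH_OPEN_THRESHOLD = 0.6        # assumed mouth aspect ratio that arms input
WINK_EAR_THRESHOLD = 0.20         # assumed eye aspect ratio treated as "closed"
GUIDE_BOX = (200, 120, 440, 360)  # assumed green guide box (x0, y0, x1, y1)

detector = dlib.get_frontal_face_detector()
predictor = dlib.shape_predictor("shape_predictor_68_face_landmarks.dat")

def eye_aspect_ratio(eye):
    """Soukupova-Cech EAR over 6 eye landmarks: small when the eye is closed."""
    v = np.linalg.norm(eye[1] - eye[5]) + np.linalg.norm(eye[2] - eye[4])
    return v / (2.0 * np.linalg.norm(eye[0] - eye[3]))

cap = cv2.VideoCapture(0)
armed = False  # becomes True once the mouth-open gesture is detected
while True:
    ok, frame = cap.read()
    if not ok:
        break
    x0, y0, x1, y1 = GUIDE_BOX
    cv2.rectangle(frame, (x0, y0), (x1, y1), (0, 255, 0), 2)  # green guide box
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for face in detector(gray):
        pts = np.array([[p.x, p.y] for p in predictor(gray, face).parts()])
        # 68-point indexing: 36-41 right eye, 42-47 left eye (subject's left),
        # 60-67 inner mouth, 30 nose tip.
        mar = (np.linalg.norm(pts[62] - pts[66])
               / np.linalg.norm(pts[60] - pts[64]))
        if mar > MOUTH_OPEN_THRESHOLD:
            armed = True  # mouth opened past the threshold: start reading input
        if armed:
            cursor = tuple(pts[30])  # nose tip inside the box drives the cursor
            left_ear = eye_aspect_ratio(pts[42:48])
            right_ear = eye_aspect_ratio(pts[36:42])
            # One eye closed while the other stays open counts as a wink/select.
            if left_ear < WINK_EAR_THRESHOLD < right_ear:
                print("select at", cursor)
    cv2.imshow("gaze-menu sketch", frame)
    if cv2.waitKey(1) & 0xFF == 27:  # Esc quits
        break
cap.release()
cv2.destroyAllWindows()
```

Aspect-ratio thresholds of this kind are a common, lightweight stand-in for a trained blink/wink classifier; the thesis may implement these gestures differently.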
dc.publisher: Mercer University
dc.subject: Electrical engineering
dc.subject: Computer engineering
dc.subject: Computer vision
dc.subject: School of Engineering
dc.subject: Convolutional neural networks
dc.subject: Eye tracking
dc.subject: Facial recognition
dc.subject: Human-computer interaction
dc.subject: Machine learning
dc.title: Using Facial Features to Produce an Augmented Reality System
dc.type: dissertation en_US
dc.date.updated: 2022-04-28T16:04:36Z
dc.language.rfc3066: en
refterms.dateFOA: 2022-05-02T15:22:52Z
dc.contributor.department: School of Engineering
dc.description.advisor: Choi, Anthony
dc.description.committee: Barnett, Kevin
dc.description.committee: Schultz, Scott
dc.description.degree: M.Sc.Eng.


Files in this item

Name: Sharma_mercer_1160N_10380.pdf
Size: 1.221 MB
Format: PDF
