Eye Controlled Electric Wheelchair: Proof of Concept Using an Arduino Robotic Car

Authors

  • Arjun Koirala Department of Electronics and Computer Engineering, Sagarmatha Engineering College, Kathmandu, Nepal
  • Gokul Subedi Department of Electronics and Computer Engineering, Sagarmatha Engineering College, Kathmandu, Nepal
  • Sushmit Paudel Department of Electronics and Computer Engineering, Sagarmatha Engineering College, Kathmandu, Nepal
  • Bipin Thapa Magar Department of Electronics and Computer Engineering, Sagarmatha Engineering College, Kathmandu, Nepal

DOI:

https://doi.org/10.3126/injet.v2i1.72569

Keywords:

ResNet-18, Eye-control, Deep Learning, Improved mobility

Abstract

This research aims to develop an electric wheelchair that can be controlled by eye gestures, providing an intuitive and accessible interface for individuals with mobility impairments. The eye gesture is captured with a camera, and the live image is preprocessed and fed to a CNN-based model for classification. The gesture is classified into 4 classes, namely, forward, left, right, and stop. Based on the classification, a signal is sent to the wheelchair through Bluetooth, and the control system of the wheelchair operates as per the instruction. For model training, a dataset has been collected with image frames of eye movement representing the different control commands. The dataset is then used to train a ResNet-18-based CNN model. The model is deployed on a mobile device, which captures the image of the user's eyes from the camera and runs inference on it to determine the eye movement. The movement is then recognized, and the appropriate control signal is generated and transmitted to the wheelchair through Bluetooth. The receiver in the wheelchair maps the transmitted signal to the corresponding movement and turns the actuators in the appropriate directions. Hence, the wheelchair can interpret eye movement captured by the camera, accurately recognize pupil movement, and translate it into control signals mapped from the predicted values given by the model, ultimately empowering users with improved mobility and independence. When tested under various lighting conditions and with different users, the control system showed 90.25% accuracy, and the overall system performed excellently in following the user's eye movement in real time. However, under very low-light conditions, especially at night, the system cannot perform as expected and often produces random, incorrect predictions.
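The abstract describes a pipeline in which the model's 4-class prediction is mapped to a control signal sent over Bluetooth. A minimal sketch of that mapping step is shown below; the class order, command bytes, and function name are illustrative assumptions, not the authors' actual code.

```python
# Illustrative sketch (assumed names, not the authors' implementation):
# map the classifier's 4-class output to a one-byte Bluetooth command,
# which the wheelchair's receiver would decode into a motor action.

CLASSES = ["forward", "left", "right", "stop"]          # assumed class order
COMMANDS = {"forward": b"F", "left": b"L", "right": b"R", "stop": b"S"}

def command_from_logits(logits):
    """Pick the highest-scoring class and return its command byte."""
    idx = max(range(len(logits)), key=lambda i: logits[i])
    return COMMANDS[CLASSES[idx]]

# Example: a prediction that favours the "left" class
print(command_from_logits([0.1, 2.3, 0.4, -1.0]))  # b'L'
```

On the wheelchair side, the Arduino would read this byte over its serial Bluetooth link and switch the motor driver outputs accordingly.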

Published

2024-12-16

How to Cite

Koirala, A., Subedi, G., Paudel, S., & Magar, B. T. (2024). Eye Controlled Electric Wheelchair: Proof of Concept Using an Arduino Robotic Car. International Journal on Engineering Technology, 2(1), 176–186. https://doi.org/10.3126/injet.v2i1.72569

Issue

Section

Articles