Gesture Controlled Virtual Eye for Blind
This article presents a hand-gesture-controlled vehicle model and identifies trends in its technology, application, and usability. The system integrates real-time obstacle detection, gesture-based regulation of vehicle movement, and manipulation of the vehicle through the user's hand movements. For people deprived of vision, such a device could replace the traditional stick with a modern smart navigator, one that acts as a barrier between their feet and the obstacles in their way and lets them say goodbye to stumbling.
The wearable adapter is a three-axis accelerometer. As the person moves their hand, the accelerometer moves with it; the gesture is captured by the accelerometer and then processed. Human-machine interaction today is moving away from the mouse and pen and becoming more advanced, and with the advent of new technologies the distance between machines and humans shrinks every day. A future generation of advanced robotic cars, shaped and built to assist the visually impaired in road crossings and other chores, could be controlled easily by hand gestures alone.
This project is also an attempt to bring brightness into the lives of visually impaired people, who have always sought a helping hand to guide them even in the most trivial matters. The blind mode would not only protect them from falling but would lift them and their spirits, proving a support unlike any they have known before.
The proposed work is achieved using an Arduino microcontroller, an accelerometer, and an RF transmitter and receiver (transceiver). Two main contributions are presented in this work. First, we show that the car can be controlled with hand gestures according to the movement and position of the hand. Second, the proposed car system is extended with ultrasonic sensors that detect obstacles in real time and respond through a callback, producing a vibration on the wrist. This automatic obstacle-detection system is introduced to improve safety and avoid hazards. The proposed system is built as a lab-scale prototype to experimentally validate its efficiency, accuracy, and affordability. We note that the system could be implemented under real conditions at large scale in the future, with applications in automobiles and robotics.
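The gesture-to-command step described above can be sketched in plain C++. This is an illustrative model only: the threshold value, axis orientation, and the command names are assumptions for the sketch, not the project's actual firmware.

```cpp
// Maps tilt readings from a three-axis accelerometer (such as the ADXL
// series used here) to a drive command. Readings are assumed to be in g,
// centred at 0 when the hand is held level; the 0.3 g dead zone is an
// assumed value chosen so small hand tremors do not trigger motion.
enum Command { STOP, FORWARD, BACKWARD, LEFT, RIGHT };

Command classifyGesture(float ax, float ay) {
    const float kThreshold = 0.3f;           // assumed dead zone around level
    if (ay > kThreshold)  return FORWARD;    // hand tilted forward
    if (ay < -kThreshold) return BACKWARD;   // hand tilted back
    if (ax > kThreshold)  return RIGHT;      // hand tilted right
    if (ax < -kThreshold) return LEFT;       // hand tilted left
    return STOP;                             // hand held level
}
```

On the real transmitter, the same classification would run inside the Arduino `loop()` with readings taken from the accelerometer's analog pins.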
In this method, a gesture-driven robotic vehicle is built in which the movement, manipulation, and control of the vehicle depend on the user's gesture. The gesture is captured by the accelerometer and processed in software written in the Arduino IDE; the resulting parameters are passed to the microcontroller and encoder circuit and transmitted by the NRF24L01 transceiver (transmitter section). In the receiver section, an NRF24L01 transceiver module paired with the microcontroller receives these parameters and passes them to the robotic vehicle so that it behaves according to the gesture. Through this method, long-distance operation can be accomplished. It would be an efficient way to reduce the social difficulties faced by people with physiological challenges, and it speaks directly to the community's social significance: mobility assistance that helps challenged individuals expand where they can go. Many in this population are bedridden and find movement difficult; it is very hard for them to move from one place to another. The main motivation behind this project is to lend a helping hand to challenged people who cannot get out of bed because of their limited movement. To make daily life easier and more effortless, we must take the help of technology, whether autonomous or manual. Life becomes smoother still when the technology is hands-free: if humans can control machines entirely through voice, gestures, and other activities, interaction becomes easier. Many high-tech robots can already do these things, but ordinary people remain far from such technology, mainly because of its high price.
Our target was to make low-cost smart products that can easily be bought and are easy to operate. Our dream was to build bots that, although made of wires and circuits, act like a personal assistant in a more realistic way. A gesture-controlled robotic bot is a robotic car controlled by hand gestures rather than old-fashioned buttons. The user only needs to wear a small transmitting device, which includes an accelerometer, on the hand; it transmits the appropriate command to the robot so that it does whatever we want.
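The command sent from the wearable to the car can be modelled as a small packet. The field layout and checksum below are assumptions for illustration, not the NRF24L01's own protocol (the radio adds its own addressing and CRC framing); on the real hardware the struct would simply be handed to the RF24 library's write call.

```cpp
#include <cstdint>

// A one-byte drive command plus a simple checksum so the receiver can
// reject a corrupted payload before driving the motors. The 0xA5 mask
// is an arbitrary assumed constant.
struct Packet {
    uint8_t command;   // 0=stop, 1=forward, 2=backward, 3=left, 4=right
    uint8_t checksum;  // command XOR 0xA5
};

Packet encode(uint8_t command) {
    return Packet{command, static_cast<uint8_t>(command ^ 0xA5)};
}

// Returns true and stores the command only if the checksum matches.
bool decode(const Packet& p, uint8_t& commandOut) {
    if (static_cast<uint8_t>(p.command ^ 0xA5) != p.checksum) return false;
    commandOut = p.command;
    return true;
}
```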
Aim of the project
The main aim of this project is to design an autonomous robot controlled by gestures for remote environment surveillance, and to assist visually impaired people in situations where it is difficult for them to operate.
Objectives of the project
The following are the main objectives of the project:
• To design a robotic platform controlled using gestures.
• To perform autonomous obstacle avoidance.
• To provide an aid for visually impaired people.
Scope of Work
Insight for the blind is an adaptation that helps visually impaired people navigate with confidence by sensing nearby obstacles and drops, using a gesture-controlled robotic car that emits ultrasonic waves and assists the user through vibration. It allows the user to walk freely by detecting obstacles and changes in pitch; the user only needs a wearable to control the car.
According to the World Health Organization, an estimated 39 million people worldwide are blind. They have relied on traditional means of assistance, and this new approach would make them independent of human help in a daily life full of hardships. The affected have used the traditional white cane for many years; although effective, it has many disadvantages. Another option, a guide animal such as a dog, is very expensive. The aim of this project is therefore to develop a cheaper and more efficient way to help the visually impaired navigate with greater comfort, speed, and confidence.
Figure 2
The tools and technologies used in the project
Transmitter Side:
• Arduino Nano
• Transceiver (NRF24L01)
• Vibration motor
• Nano Shield
• ADXL
Receiver Side Car:
• Arduino Uno
• Transceiver (NRF24L01)
• Motor Driver (L298N)
• Chassis
• Ultrasonic Sensor
• Lithium cells
• Button
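On the receiver side, the L298N motor driver turns a received command into motor rotation through its four input pins. A minimal sketch of that mapping is below; the pin polarity and wiring are assumptions for illustration, and on the real car each value would be written out with `digitalWrite()`.

```cpp
// Desired logic levels for the L298N inputs: IN1/IN2 drive the left
// motor, IN3/IN4 the right motor. true = pin HIGH. Turning is done by
// spinning the two sides in opposite directions (an assumed skid-steer
// layout).
struct MotorPins { bool in1, in2, in3, in4; };

MotorPins drive(int command) {
    switch (command) {
        case 1: return {true,  false, true,  false};  // forward: both ahead
        case 2: return {false, true,  false, true};   // backward: both reversed
        case 3: return {false, true,  true,  false};  // left: left back, right ahead
        case 4: return {true,  false, false, true};   // right: left ahead, right back
        default: return {false, false, false, false}; // stop: all pins low
    }
}
```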
Conclusion
So far the results are satisfactory and accurate. The ultrasonic sensor reliably detects objects from a set distance and sends the callback response that produces a vibration on the wrist, while obstacles to the sides are signalled through a piezo buzzer. Transmission and reception work cleanly through the NRF24L01. The ultrasonic sensors at the front and back both detect objects from a set distance, thus avoiding collisions.
Code:
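A minimal sketch of the obstacle check follows. An HC-SR04-style ultrasonic sensor reports the echo round-trip time in microseconds, and distance follows from the speed of sound (about 343 m/s, i.e. 0.0343 cm/µs, halved for the round trip). The 50 cm alert threshold is an assumed value for illustration, not the project's tuned setting.

```cpp
// Converts an ultrasonic echo round-trip time (microseconds) to a
// distance in centimetres: time * speed of sound / 2 for the round trip.
float echoToDistanceCm(unsigned long echoMicros) {
    return echoMicros * 0.0343f / 2.0f;
}

// Decides whether to fire the wrist vibration motor. The 50 cm alert
// distance is an assumption for this sketch.
bool shouldVibrate(unsigned long echoMicros) {
    const float kAlertCm = 50.0f;
    return echoToDistanceCm(echoMicros) < kAlertCm;
}
```

On the car, `echoMicros` would come from `pulseIn()` on the sensor's echo pin, and a positive result would be sent back over the NRF24L01 to trigger the wearable's vibration motor.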
Further developments can be carried out toward greater accuracy and precision.
Email: easymathsforyou3@gmail.com


