
Planet Arduino

Archive for the ‘assistive technology’ Category

While many of the things we interact with every day have become more usable by people with disabilities, the kitchen remains one important area of our lives that still lacks many accessibility features. One commonplace appliance is the coffee maker, with its array of small buttons, or even a touchscreen, that can be hard to see or press. Orlie on Instructables has developed a set of wireless buttons and an accompanying receiver that translate simple actions into an easy, end-to-end brewing experience.

Each button started as a custom 3D-printed shell with compartments for an AA battery holder, a large arcade button, and a perfboard carrying an ESP8266 microcontroller. In this system, the ESP8266 communicates with the Arduino GIGA R1 WiFi board via Wi-Fi and an MQTT message broker running on a host PC. This enables each button to be assigned a unique message that dictates the desired task to be performed.
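
The write-up includes the full firmware; as a rough illustration of the button side, here is a minimal sketch using the widely used PubSubClient MQTT library. The Wi-Fi credentials, broker address, pin number, topic, and payload are placeholders, not the project's actual values.

    #include <ESP8266WiFi.h>
    #include <PubSubClient.h>

    // Placeholder credentials and broker address (the broker runs on a host PC).
    const char* WIFI_SSID = "home-network";
    const char* WIFI_PASS = "secret";
    const char* BROKER_IP = "192.168.1.10";
    const int BUTTON_PIN = 4;               // arcade button, wired active-low

    WiFiClient wifi;
    PubSubClient mqtt(wifi);

    void setup() {
      pinMode(BUTTON_PIN, INPUT_PULLUP);
      WiFi.begin(WIFI_SSID, WIFI_PASS);
      while (WiFi.status() != WL_CONNECTED) delay(100);
      mqtt.setServer(BROKER_IP, 1883);
    }

    void loop() {
      if (!mqtt.connected()) mqtt.connect("brew-button-1");
      mqtt.loop();
      if (digitalRead(BUTTON_PIN) == LOW) {              // button pressed
        mqtt.publish("coffee/commands", "brew_large");   // this button's unique message
        delay(500);                                      // crude debounce
      }
    }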

At the coffee maker, the GIGA R1 WiFi was wired into a pair of ULN2003 stepper motor driver modules that move a gantry across a set of linear rails and eventually push the corresponding buttons once the correct position has been reached. Ultimately, this allows for those with less mobility and/or dexterity to select what they want from anywhere in the house — all over Wi-Fi.
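
On the receiver side, the Stepper library that ships with the Arduino IDE is enough to sketch the idea. Everything below, including pin numbers, step counts, and button positions, is hypothetical; the real project's geometry and firmware are in the write-up.

    #include <Stepper.h>

    // A 28BYJ-48-style geared stepper on a ULN2003 board: ~2048 steps per revolution.
    const int STEPS_PER_REV = 2048;

    // Note the IN1-IN3-IN2-IN4 pin order the Stepper library expects.
    Stepper gantry(STEPS_PER_REV, 8, 10, 9, 11);

    // Hypothetical button positions, in steps from the gantry's home position.
    const long POSITIONS[] = {0, 1200, 2600, 4100};
    long currentPos = 0;

    void moveToButton(int index) {
      gantry.step(POSITIONS[index] - currentPos);   // blocks until the move completes
      currentPos = POSITIONS[index];
      // ...trigger the button-press mechanism here, then retract...
    }

    void setup() {
      gantry.setSpeed(10);   // RPM; geared steppers need modest speeds
    }

    void loop() {
      // In the real project the index comes from the received MQTT message;
      // here we simply cycle through the positions for demonstration.
      for (int i = 0; i < 4; i++) {
        moveToButton(i);
        delay(2000);
      }
    }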

To see how this project was built in greater detail, you can read Orlie’s write-up here on Instructables.

The post This Arduino GIGA R1 WiFi project turns a coffee maker into a more accessible appliance appeared first on Arduino Blog.

Henry Evans suffered a brain-stem stroke 20 years ago that left him paralyzed with quadriplegia. He can move his head, but other than a small amount of movement in his left thumb, he can’t control the rest of his body. To help Evans live a more independent life, researchers from Carnegie Mellon University’s School of Computer Science developed a motion control interface that lets him operate a mobile robot.

The robot is a Stretch model from Hello Robot, which can navigate a home on its mobile base, interact with objects using its arm and gripper, and provide a live view through a pair of cameras (one on its head and one on its gripper). But this telepresence robot doesn’t have any provisions for operation by a person with quadriplegia like Evans. That’s where the SCS team came in.

They created a head-worn motion control interface consisting of an Arduino Nano board, a Bosch BNO055 IMU and an HC-05 Bluetooth module. The Arduino monitors Evans’s head movement with the IMU, then sends cursor movement commands over Bluetooth to the computer running the software that controls the Stretch robot. That lets Evans move the cursor on the screen, and then he can click a mouse button thanks to the limited movement of his left thumb.
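
The post doesn't include the HAT firmware, but the read-and-report loop is straightforward to sketch with Adafruit's BNO055 library and SoftwareSerial for the HC-05. The pin choices, the scaling, and the comma-separated "dx,dy" message format are all assumptions for illustration; the CMU software defines its own protocol.

    #include <Wire.h>
    #include <Adafruit_Sensor.h>
    #include <Adafruit_BNO055.h>
    #include <SoftwareSerial.h>

    Adafruit_BNO055 bno(55);     // BNO055 on the Nano's I2C pins
    SoftwareSerial bt(10, 11);   // RX, TX to the HC-05 (hypothetical pins)

    void setup() {
      bt.begin(9600);            // HC-05 default baud rate
      bno.begin();
    }

    void loop() {
      sensors_event_t event;
      bno.getEvent(&event);      // Euler angles in degrees

      // Map head yaw (wrapped to +/-180) and pitch to small cursor deltas.
      int dx = (int)(event.orientation.x > 180 ? event.orientation.x - 360
                                               : event.orientation.x);
      int dy = (int)event.orientation.y;
      bt.print(dx / 4);          // invented "dx,dy" text protocol
      bt.print(',');
      bt.println(dy / 4);
      delay(20);                 // roughly 50 Hz updates
    }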

During a week-long testing session, Evans successfully used this system to perform many tasks around his home. He was able to use the robot to pick up tissues and bring them to his face, and even to adjust the blinds on his bedroom window. Clever “Drivers Assistance” software lets the robot operate semi-autonomously in order to complete tasks that would have been difficult for Evans to accomplish through manual control.

While the Stretch robot is expensive at about $25,000, the HAT (Head-worn Assistive Teleoperation) control interface is affordable. This is just a prototype, but a device like this could help many people around the world living with quadriplegia and other conditions that affect motor control.

The post Motion control interface facilitates robot operation for those with paralysis appeared first on Arduino Blog.

As a society, we have decided to enact some measures to make our world more accessible to those with disabilities. Wheelchair ramps, for example, are often legal requirements for businesses in many countries. But we tend to drop the ball when it comes to things that aren't necessities. Entertainment options, for instance, are an afterthought much of the time. That's why Alain Mauer developed this LED gaming platform for people with special needs.

This device offers a lot of flexibility so that builders can tailor it to a specific individual’s own needs and tastes. Mauer designed it for his son, who is 17 years old and lives with non-verbal autism. Entertainment options intended for neurotypical people don’t engage the teen, but toys designed for children fail to hold his interest for long. This game, dubbed “Scott’s Arcade,” is simple to understand and interact with, while still offering a lot of replayability. It is also durable and able to withstand rough handling.

Scott’s Arcade consists of a “screen” made up of individually addressable RGB LEDs and a faceplate with shape cutouts that act as masks for the LEDs. An Arduino Nano controls the lights and responds to presses of the large buttons beneath the screen. It can trigger sound effects through a DFRobot DFPlayer Mini MP3 player as well.
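
Mauer's code is in his project files; a stripped-down sketch of the same building blocks, using the Adafruit_NeoPixel and DFRobotDFPlayerMini libraries, might look like the following. The pin numbers, LED count, and track number are placeholders.

    #include <Adafruit_NeoPixel.h>
    #include <SoftwareSerial.h>
    #include <DFRobotDFPlayerMini.h>

    const int LED_PIN = 6, NUM_LEDS = 64, BTN_PIN = 2;   // hypothetical wiring
    Adafruit_NeoPixel screen(NUM_LEDS, LED_PIN, NEO_GRB + NEO_KHZ800);
    SoftwareSerial mp3Serial(10, 11);                    // to the DFPlayer Mini
    DFRobotDFPlayerMini mp3;

    void setup() {
      pinMode(BTN_PIN, INPUT_PULLUP);
      screen.begin();
      mp3Serial.begin(9600);
      mp3.begin(mp3Serial);
      mp3.volume(20);
    }

    void loop() {
      if (digitalRead(BTN_PIN) == LOW) {
        // Light the LEDs behind one cutout and play a reward sound.
        for (int i = 0; i < 8; i++) screen.setPixelColor(i, screen.Color(0, 150, 0));
        screen.show();
        mp3.play(1);   // track 0001.mp3 on the SD card
        delay(500);
      }
    }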

Mauer programmed a few simple games for the device, such as a matching game that challenges the player to find the circle of the same color as the triangle. When they succeed, they’re rewarded with fanfare sound effects and flashing lights. Makers can also program their own games to suit the players’ abilities and interests. 

The post A gaming platform tailored to those with special needs appeared first on Arduino Blog.

Almost all modern video games require either a gamepad or a keyboard and mouse, which means that they’re inaccessible to many people with disabilities that affect manual dexterity. Bob Hammell’s voice-enabled controller lets some of those people experience the joy of video games.

This is a simplified video game controller with a minimal number of physical buttons, but with special voice-activated virtual buttons to make up the difference. The gamepad only has six physical buttons, plus an analog joystick. That makes it much easier to handle than a typical modern controller, which might have a dozen buttons and two joysticks. If the player has the ability, they can utilize the physical controls and then speak commands to activate the game functions not covered by those buttons.

The controller’s brain is an Arduino Micro board, which Hammell selected because it can be configured to show up as a standard USB HID gamepad or keyboard when connected to a PC. The physical controls are an Adafruit analog two-axis joystick and tactile switches. An Adafruit 1.3″ OLED screen displays information, including the status of the voice activation.
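
As a sketch of that idea (not Hammell's actual firmware, which may use a full gamepad descriptor), the Micro's built-in Keyboard library can map each physical button to a keystroke. The pins and key choices below are hypothetical.

    #include <Keyboard.h>

    // Hypothetical pin assignments for the six physical buttons.
    const int BTN_PINS[6] = {2, 3, 4, 5, 6, 7};
    const char BTN_KEYS[6] = {'z', 'x', 'c', 'a', 's', 'd'};   // keys a game expects

    void setup() {
      for (int i = 0; i < 6; i++) pinMode(BTN_PINS[i], INPUT_PULLUP);
      Keyboard.begin();   // enumerate as a USB HID keyboard
    }

    void loop() {
      for (int i = 0; i < 6; i++) {
        if (digitalRead(BTN_PINS[i]) == LOW) Keyboard.press(BTN_KEYS[i]);
        else Keyboard.release(BTN_KEYS[i]);
      }
      delay(10);
    }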

An Elechouse V3 Voice Recognition Module performs the voice recognition, and it can understand up to 80 different commands. When it recognizes a command, like “menu,” it tells the Arduino to send the corresponding virtual button press to the connected computer. Speaking a command takes time, so voice commands are best suited to functions that players don’t use very often.
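
Hooking recognized commands to virtual button presses could look roughly like this, using Elechouse's published VoiceRecognitionV3 library. The pin wiring, trained record numbers, and command-to-key mapping are invented for illustration.

    #include <SoftwareSerial.h>
    #include <VoiceRecognitionV3.h>   // Elechouse's library for the V3 module
    #include <Keyboard.h>

    VR voiceModule(8, 9);             // RX, TX pins to the module (hypothetical)
    uint8_t buf[64];

    void setup() {
      voiceModule.begin(9600);
      Keyboard.begin();
      uint8_t records[] = {0, 1};     // pre-trained records: 0 = "menu", 1 = "map"
      voiceModule.load(records, 2);   // load them into the recognizer
    }

    void loop() {
      int ret = voiceModule.recognize(buf, 50);   // poll with a 50 ms timeout
      if (ret > 0) {
        switch (buf[1]) {             // buf[1] holds the recognized record number
          case 0: Keyboard.write(KEY_ESC); break;   // "menu" -> Escape key
          case 1: Keyboard.write('m');     break;   // "map"  -> 'm' key
        }
      }
    }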

If you know someone that would benefit from a controller like this, Hammell posted a full tutorial and all of the necessary files to Hackster.io so you can build your own.

The post Voice-enabled controller makes video games more accessible appeared first on Arduino Blog.

Modern devices rely heavily on touchscreens because they allow for dynamic interfaces that aren’t possible with conventional tactile buttons. But those interfaces present an issue for people with certain disabilities. A person with vision loss, for example, might not be able to see the screen’s content or its virtual buttons at all. To make touchscreens more accessible, a team of engineers from the University of Michigan developed this special phone case called BrushLens.

The case augments a smartphone with a matrix of actuators or capacitive touch-simulator pads. The former work with all types of touchscreens (including resistive), while the latter only work with capacitive touchscreens, though those are the most common type today. The smartphone’s own camera and sensors let it detect its position on a larger touchscreen, so it can guide a user to a virtual button and then press that button itself.

The prototype hardware includes an Arduino Nano 33 IoT board to control the actuators and/or capacitive touch points. It receives its commands from the smartphone via Bluetooth® Low Energy.
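
A minimal sketch of the Arduino side, using the official ArduinoBLE library, could look like this. The service and characteristic UUIDs, pin map, and one-byte "pad index" protocol are assumptions; the BrushLens firmware defines its own.

    #include <ArduinoBLE.h>

    // Hypothetical UUIDs and pin map for illustration.
    BLEService tapService("0000fff0-0000-1000-8000-00805f9b34fb");
    BLEByteCharacteristic tapIndex("0000fff1-0000-1000-8000-00805f9b34fb", BLEWrite);
    const int PAD_PINS[4] = {2, 3, 4, 5};   // capacitive-pad driver outputs

    void setup() {
      for (int i = 0; i < 4; i++) pinMode(PAD_PINS[i], OUTPUT);
      BLE.begin();
      BLE.setLocalName("BrushLens");
      BLE.setAdvertisedService(tapService);
      tapService.addCharacteristic(tapIndex);
      BLE.addService(tapService);
      BLE.advertise();
    }

    void loop() {
      BLEDevice central = BLE.central();   // wait for the phone to connect
      while (central && central.connected()) {
        if (tapIndex.written()) {
          int pad = tapIndex.value();      // which pad the phone wants fired
          if (pad < 4) {
            digitalWrite(PAD_PINS[pad], HIGH);   // simulate a fingertip touch
            delay(100);
            digitalWrite(PAD_PINS[pad], LOW);
          }
        }
      }
    }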

For that to work, the smartphone must understand the target touchscreen and communicate its content to the user. That communication is possible using existing text-to-speech techniques, but analyzing target touchscreens is more difficult. Ideally, UI designers would include some sort of identifier so the user’s smartphone could query screen content and button positions. However, that is an added expense and would require rebuilding existing interfaces. For that reason, BrushLens includes some ability to analyze touchscreens and their content on its own.

This is a very early prototype, but the concept has a great deal of potential for making a world full of touchscreens more accessible to those living with disabilities.

Image credit: Chen Liang, doctoral student, University of Michigan’s Computer Science and Engineering

The post Tapping without seeing: Making touchscreens accessible appeared first on Arduino Blog.

One of the many realities of living with cerebral palsy is limited upper body dexterity, which means that almost every activity requires the help of a caregiver. That includes something most of us take for granted: drinking water. To restore at least that little bit of independence, Rice University engineering students Thomas Kutcher and Rafe Neathery designed the RoboCup.

A typical solution for letting people with cerebral palsy drink without assistance is a “giraffe bottle.” That is a water bottle with a long gooseneck straw that extends in front of the user’s mouth. But while that does give them the ability to drink on their own, it is obtrusive and leaves a bulky straw in front of their face. RoboCup eliminates that issue by rotating the straw out of the way when it isn’t in use. To take a drink, the user just needs to push a button or move their finger over a sensor. The straw will then rotate back over to their mouth.

The best part is that RoboCup is open source, so anyone with a 3D printer and some basic electronics skills can build one for around $100. The key component is an Arduino Nano board. It monitors the tactile button or distance sensor (whichever suits the user’s capability) and controls a servo motor that rotates the straw. Power comes from a small rechargeable battery, and all of the components, aside from the 3D-printed parts, are off-the-shelf and readily available.
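
The project files contain the real firmware; the core behavior, in the push-button variant, reduces to a few lines with the Servo library. The pin numbers, angles, and timing below are placeholders to be tuned per user.

    #include <Servo.h>

    const int TRIGGER_PIN = 2;     // tactile button (or distance-sensor output)
    const int SERVO_PIN   = 9;
    const int AWAY_ANGLE  = 10;    // straw parked out of the way
    const int DRINK_ANGLE = 120;   // straw at the user's mouth

    Servo strawServo;

    void setup() {
      pinMode(TRIGGER_PIN, INPUT_PULLUP);
      strawServo.attach(SERVO_PIN);
      strawServo.write(AWAY_ANGLE);
    }

    void loop() {
      if (digitalRead(TRIGGER_PIN) == LOW) {   // user requests a drink
        strawServo.write(DRINK_ANGLE);
        while (digitalRead(TRIGGER_PIN) == LOW) delay(10);
        delay(5000);                           // linger long enough to drink
        strawServo.write(AWAY_ANGLE);
      }
    }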

More details on the RoboCup along with instructions are available on the project’s page here.

The post RoboCup is an assistive drinking device for people living with cerebral palsy appeared first on Arduino Blog.

People with visual impairments also enjoy going out to a restaurant for a nice meal, which is why it is common for wait staff to place the salt and pepper shakers in a consistent fashion: salt on the right and pepper on the left. That helps visually impaired diners quickly find the spice they’re looking for, and a similar arrangement works for utensils. But what about after the diner sets down a utensil in the middle of a meal? The ForkLocator is an AI system that can help them locate the utensil again.

This is a wearable device meant for people with visual impairments. It uses object recognition and haptic cues to help the user locate their fork. The current prototype, built by Revoxdyna, only works with forks. But it would be possible to expand the system to work with the full range of utensils. Haptic cues come from four servo motors, which prod the user’s arm to indicate the direction in which they should move their hand to find the fork.

The user’s smartphone performs the object recognition and should be worn or positioned in such a way that its camera faces the table. The smartphone app looks for the plate, the fork, and the user’s hand. It then calculates a vector from the hand to the fork and tells an Arduino board to actuate the servo motors corresponding to that direction. Those servos and the Arduino attach to a 3D-printed frame that straps to the user’s upper arm.
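
A sketch of the Arduino side might look like the following, with one servo per direction and a one-character command protocol. That protocol is invented for illustration; the post doesn't document the actual phone-to-Arduino message format.

    #include <Servo.h>

    // Four servos around the upper arm: 0 = up, 1 = down, 2 = left, 3 = right.
    Servo prods[4];
    const int SERVO_PINS[4] = {3, 5, 6, 9};   // hypothetical pin map

    void setup() {
      Serial.begin(9600);
      for (int i = 0; i < 4; i++) {
        prods[i].attach(SERVO_PINS[i]);
        prods[i].write(0);   // retracted
      }
    }

    void prod(int dir) {
      prods[dir].write(60);  // press lightly against the arm
      delay(300);
      prods[dir].write(0);
    }

    void loop() {
      if (Serial.available()) {
        switch (Serial.read()) {
          case 'U': prod(0); break;
          case 'D': prod(1); break;
          case 'L': prod(2); break;
          case 'R': prod(3); break;
        }
      }
    }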

A lot more development is necessary before a system like the ForkLocator would be ready for the consumer market, but the accessibility benefits are something to applaud.

The post This AI system helps visually impaired people locate dining utensils appeared first on Arduino Blog.

It is no secret that visual impairments — even those that don’t result in complete blindness — make it very difficult for people to live their lives. White canes can help people get around, but they require physical contact. Seeing eye dogs provide very valuable assistance, but they’re expensive and need care of their own. That’s why Nilay Roy Choudhury designed the Walk-Bot device to help people with visual impairments navigate safely.

Walk-Bot is a wearable navigation device that uses audible cues and haptic feedback to give visually impaired people a sense of their immediate environment. It has a host of sensors that let it identify nearby obstacles at any height from the floor to the ceiling. Walk-Bot performs onboard trigonometry to determine the distance to any obstacles that might interfere with its user’s ability to walk safely. And it is affordable and easy to build with common components.

Those components include an Arduino Nano board, two HC-SR04 ultrasonic sensors, a GP2Y0A02YK0F infrared sensor, a vibration motor, a buzzer, an MPU6050 gyroscope, and an HC-05 Bluetooth module. Those all fit inside a 3D-printed wearable enclosure.

One ultrasonic sensor faces upwards at a 45-degree angle to detect high obstacles. The second ultrasonic sensor faces directly forwards. The infrared sensor points downwards at a 45-degree angle to detect low obstacles and was chosen because ultrasonic sensors struggle with some common floor surfaces. The gyroscope lets Walk-Bot determine its own orientation in space. When it detects an obstacle, Walk-Bot sounds the buzzer and activates the vibration motor. It also includes a panic button that will tell Walk-Bot to connect to the user’s smartphone through the Bluetooth module to message a chosen contact in the event of an emergency.
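
The onboard trigonometry amounts to projecting each angled sensor's slant range onto the walking direction. Below is a minimal sketch for the upward-angled HC-SR04; the pins and alert threshold are hypothetical, and the real device drives the buzzer and vibration motor rather than printing to serial.

    const int TRIG_PIN = 7, ECHO_PIN = 8;     // the upward-angled HC-SR04
    const float SENSOR_ANGLE = 45.0 * PI / 180.0;
    const float ALERT_CM = 80.0;              // placeholder alert threshold

    float readDistanceCm() {
      digitalWrite(TRIG_PIN, LOW);  delayMicroseconds(2);
      digitalWrite(TRIG_PIN, HIGH); delayMicroseconds(10);
      digitalWrite(TRIG_PIN, LOW);
      long us = pulseIn(ECHO_PIN, HIGH, 30000);   // echo time, 30 ms timeout
      return us * 0.0343 / 2.0;                   // speed of sound: 343 m/s
    }

    void setup() {
      pinMode(TRIG_PIN, OUTPUT);
      pinMode(ECHO_PIN, INPUT);
      Serial.begin(9600);
    }

    void loop() {
      float slant = readDistanceCm();
      // The sensor looks up at 45 degrees, so the obstacle's horizontal
      // distance is the slant range projected onto the walking direction.
      float forward = slant * cos(SENSOR_ANGLE);
      if (slant > 0 && forward < ALERT_CM) {
        Serial.println("High obstacle ahead");
      }
      delay(100);
    }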

The post Walk-Bot helps people with visual impairments navigate safely appeared first on Arduino Blog.

As people age, they naturally tend to lose some or even most of their mobility, creating a need for a wheelchair, walker, or other assistive device. This led hitesh.boghani to submit his project, which he calls the smartChair, to element14’s Design for a Cause 2021 contest. The build features a sort of pseudo-walker that enables a user to transition from sitting to standing with some motorized assistance. Beyond that primary use, Hitesh also wanted to create a “summon” mode that would allow the walker to move on its own to where it’s needed.

As with every other project submitted to the contest, this too makes use of the Arduino Nano 33 IoT to handle both motor control and communication with a client device. In order to lift the walker from a compacted state to an expanded one, Hitesh began by assembling a wooden frame and then placed a brushless DC motor in line with some gearing to increase torque and reduce the speed. Next, an L293D motor driver IC was connected to a breadboard and a Nano 33 IoT for receiving input signals. And finally, a bit of code was written that spins the motor for a certain number of turns depending on the speed and direction requested.
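
That last step might look something like the following sketch, which drives the motor through the L293D as a simple DC drive and approximates "turns" by running for a speed-scaled time. The pin numbers and milliseconds-per-turn constant are placeholders; the real build would calibrate them against the gearing.

    // L293D wiring (hypothetical pins): EN1 = PWM speed, IN1/IN2 = direction.
    const int EN1 = 9, IN1 = 7, IN2 = 8;

    // Placeholder value; would be measured for the actual gearing.
    const unsigned long MS_PER_TURN = 400;

    void spinMotor(int speedPwm, bool forward, int turns) {
      digitalWrite(IN1, forward ? HIGH : LOW);
      digitalWrite(IN2, forward ? LOW : HIGH);
      analogWrite(EN1, speedPwm);                       // 0-255 duty cycle
      delay(turns * MS_PER_TURN * 255UL / speedPwm);    // slower speed, longer run
      analogWrite(EN1, 0);                              // stop
    }

    void setup() {
      pinMode(EN1, OUTPUT); pinMode(IN1, OUTPUT); pinMode(IN2, OUTPUT);
    }

    void loop() {
      spinMotor(200, true, 5);    // lift: 5 turns forward at ~80% speed
      delay(5000);
      spinMotor(200, false, 5);   // lower back down
      delay(5000);
    }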

Unfortunately, time ran out before the summon feature could be completed, so Hitesh plans to keep improving the project by adding a camera, a motorized base, and a basic smartphone app for controlling the whole thing. But even in its current state, the smartChair is a great assistive tool for anyone who needs extra help getting up from a sitting position.

The post The smartChair is a Nano 33 IoT-based stand-up and walking aid appeared first on Arduino Blog.


