
Planet Arduino

Archive for the ‘Wearables’ Category

We recently showed you Becky Stern’s recreation of the “computer book” carried by Penny in the Inspector Gadget cartoon, but Stern didn’t stop there. She also built a replica of Penny’s most iconic gadget: her watch. Penny was a trendsetter, rocking a smartwatch decades before the Apple Watch hit the market. Stern’s replica looks just like the cartoon version and even has some of the same features.

The centerpiece of this project is an Arduino Nicla Voice board. The Arduino team designed that board specifically for speech recognition on the edge, which made it perfect for recognizing Penny’s signature “come in, Brain!” voice command. Stern used Edge Impulse to train a model to recognize that phrase as a wake word. When the Nicla Voice board hears it, it swaps the image on the smartwatch screen for a new picture of Brain the dog.

The Nicla Voice board and an Adafruit 1.69″ color IPS TFT screen fit inside a 3D-printed enclosure modeled on Penny’s watch from the cartoon. The enclosure even has a clever 3D-printed watch band, with links connected by lengths of fresh filament. Power comes from a small lithium battery that also fits inside the enclosure.

This watch and Stern’s computer book will both be part of an Inspector Gadget display put on by Digi-Key at Maker Faire Rome, so you can see it in person if you attend.

The post Building the OG smartwatch from Inspector Gadget appeared first on Arduino Blog.

Most people today rely on technology to navigate through the world. That is practical thanks to the reliability of modern GPS. But receiving directions can be difficult for people with certain disabilities. People who are blind, for instance, cannot look at a map on a smartphone. People with missing limbs may not even be able to hold a smartphone. To help those people, Rice University engineers have developed a lightweight wearable device that uses pressurized air to provide directions.

Instead of displaying a graphical map, this device indicates to the user when they should make a turn. It does so through pneumatic haptic feedback. The device’s electronic components attach to wearable textiles, so it is out of the way. Pneumatic hoses run from the device to nozzles pointed at the user’s skin. The locations of those nozzles can be tailored to suit the user’s preferences and physiology. When the device needs to alert the user, such as when a turn is coming up, it will open a valve to the appropriate hose so air blows on their skin.

The Rice engineers designed a prototype to test this concept, which uses an Arduino Nano board for control. The Arduino opens the solenoid valves through MOSFETs and it receives commands from an external device, like a computer, via a four-channel 433MHz receiver. Air comes from canisters of compressed CO2 through a pressure regulator.
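As an illustration, the turn-signaling logic can be reduced to a pure function that maps a received channel to the pin driving one solenoid valve. The pin numbers and four-channel layout below are assumptions for this sketch, not details from the Rice prototype:

```cpp
#include <cstdint>

// Hypothetical mapping from a 433MHz receiver channel (0-3) to the
// MOSFET gate pin switching that channel's solenoid valve. The pin
// numbers are illustrative, not taken from the published build.
constexpr uint8_t kValvePins[4] = {2, 3, 4, 5};

// Returns the pin to pulse for a given receiver channel, or -1 if
// the channel is out of range so the firmware can ignore noise.
int valvePinForChannel(int channel) {
    if (channel < 0 || channel >= 4) return -1;
    return kValvePins[channel];
}
```

Keeping the mapping pure like this makes it easy to test the logic off-target before wiring up the valves.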

In testing, subjects were able to correctly interpret the pneumatic haptic feedback 87% of the time.

The post This wearable device uses air to provide directions appeared first on Arduino Blog.

People with visual impairments also enjoy going out to a restaurant for a nice meal, which is why it is common for wait staff to place the salt and pepper shakers in a consistent arrangement: salt on the right and pepper on the left. That helps visually impaired diners quickly find the spice they’re looking for, and a similar arrangement works for utensils. But what about after a diner sets down a utensil in the middle of a meal? The ForkLocator is an AI system that can help them locate the utensil again.

This is a wearable device meant for people with visual impairments. It uses object recognition and haptic cues to help the user locate their fork. The current prototype, built by Revoxdyna, only works with forks. But it would be possible to expand the system to work with the full range of utensils. Haptic cues come from four servo motors, which prod the user’s arm to indicate the direction in which they should move their hand to find the fork.

The user’s smartphone performs the object recognition and should be worn or positioned in such a way that its camera faces the table. The smartphone app looks for the plate, the fork, and the user’s hand. It then calculates a vector from the hand to the fork and tells an Arduino board to actuate the servo motors corresponding to that direction. Those servos and the Arduino attach to a 3D-printed frame that straps to the user’s upper arm.
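To make the direction step concrete, here is a minimal sketch of how the hand-to-fork vector might be reduced to one of the four servos. The servo numbering and axis conventions are assumptions for illustration, not details from the ForkLocator firmware:

```cpp
#include <cmath>

// Reduce the hand-to-fork vector (in image coordinates, y growing
// downward) to one of four haptic servos by its dominant axis.
// Numbering here is assumed: 0 = up, 1 = right, 2 = down, 3 = left.
int servoForDirection(float dx, float dy) {
    if (std::fabs(dx) >= std::fabs(dy)) {
        return dx >= 0 ? 1 : 3;  // horizontal component dominates
    }
    return dy >= 0 ? 2 : 0;      // vertical component dominates
}
```

The smartphone app would compute `dx` and `dy` from the detected hand and fork positions and send only the resulting servo index over to the Arduino.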

A lot more development is necessary before a system like the ForkLocator would be ready for the consumer market, but the accessibility benefits are something to applaud.

The post This AI system helps visually impaired people locate dining utensils appeared first on Arduino Blog.

Many people (especially those with autism spectrum disorder) have difficulty communicating with others around them. That is always a challenge, but it becomes particularly noticeable when someone cannot convey their emotions through body language. If they can’t show that they’re not in the mood to talk, confusing interactions may follow. To help people communicate their emotions, University of Stuttgart students Clara Blum and Mohammad Jafari came up with this wearable device that makes those emotions visible.

The aptly named Emotion Aid sits on the user’s shoulders like a small backpack. The prototype was designed to attach to a bra, but it could be tweaked to be worn by those who don’t use bras. It has two functions: detecting the user’s emotions and communicating them. An array of sensors measures biometric indicators, such as temperature, pulse, and sweat, to try to determine the user’s emotional state. The device then conveys that state to the surrounding world with an actuated fan-like apparatus.

An Arduino Uno Rev3 handles these functions. Input comes from a capacitive moisture sensor, a temperature sensor, and a pulse sensor. The Arduino actuates the fan mechanism using a small hobby servo motor. Power comes from a 9V battery. The assembly process is highly dependent on the way the device is to be worn, but the write-up illustrates how to attach the various sensors to a bra. There are many possible variations, so the creators of the Emotion Aid encourage people to experiment with the idea.
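One way to fuse those three inputs is to score them and map the score onto the fan servo. The thresholds, weights, and the single “arousal” score below are invented for this sketch; the Emotion Aid’s actual calibration isn’t published in this summary:

```cpp
// Fuse the three biometric inputs into a 0-100 "arousal" score.
// All thresholds and weights here are assumptions for illustration.
int arousalScore(float tempC, int pulseBpm, int moistureRaw) {
    int score = 0;
    if (tempC > 37.0f) score += 30;      // elevated skin temperature
    if (pulseBpm > 100) score += 40;     // elevated heart rate
    if (moistureRaw > 600) score += 30;  // sweating (raw ADC units)
    return score;
}

// Hobby servos accept 0-180 degrees; spread the fan proportionally.
int fanAngle(int score) {
    if (score < 0) score = 0;
    if (score > 100) score = 100;
    return (score * 180) / 100;
}
```

In a sketch, the loop would read the sensors, compute the score, and write `fanAngle(score)` to the servo.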

The post The Emotion Aid is a wearable device that communicates the user’s emotions appeared first on Arduino Blog.

A great number of activities require the precise application of force with the fingertips. When playing a guitar, for example, you must exert the proper amount of force to push a string against the fret board. Training is difficult, because new guitarists don’t know how much force to apply. This system controls fingertip force to help users learn how to perform new activities.

Developed by NTT Corporation researchers, the system needs two parts to enable fingertip force control: stimulation and feedback. EMS (electrical muscle stimulation) handles the former by pulsing a small amount of electric current through the user’s muscles, forcing them to contract. That is commonplace technology today, with uses ranging from legitimate medical therapy to more dubious home remedies. For feedback, the system utilizes bioacoustic technology (a transducer and piezoelectric sensor) to determine the amount of force applied by a user’s finger.

An Arduino Uno Rev3 board paired with a function generator gives the system precise control over the EMS unit, allowing it to adjust muscle stimulation as necessary. It does so in real-time in response to fingertip force estimated by a machine-learning regression model. An expert in the activity could use the system to train it on the proper amount of force for an action, then the system could provide the amount of stimulation necessary for a new student to replicate the expert’s force. With practice, the student would gain a feel for the force and then could perform the activity on their own without the aid of the system.
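The closed loop could be as simple as a proportional update: nudge the stimulation amplitude toward the expert’s target force using the force estimated by the regression model. The gain and the normalized 0-to-1 amplitude range are assumptions for this sketch; the NTT paper defines the real control loop:

```cpp
// One proportional step of the EMS control loop. The amplitude is
// normalized to 0..1; gain and clamping are illustrative values.
float nextStimulation(float current, float targetForce,
                      float estimatedForce, float gain = 0.1f) {
    float next = current + gain * (targetForce - estimatedForce);
    if (next < 0.0f) next = 0.0f;  // never drive a negative amplitude
    if (next > 1.0f) next = 1.0f;  // clamp to full-scale output
    return next;
}
```

Called once per force estimate, this drives the stimulation up when the student presses too softly and down when they press too hard.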

Additional details on the project can be found in the researchers’ paper here.

The post Fingertip force control aids in sports and musical training appeared first on Arduino Blog.

Modern consumer devices are fantastic at providing visual and auditory stimulation, but they fail to excite any of the other senses. At most, we get some tactile sensation in the form of haptic feedback. But those coarse vibrations do little more than indicate that something is happening, which is why researchers look for alternatives. Developed by a team of City University of Hong Kong researchers, Emoband provides a new kind of tactile feedback by stroking and squeezing the user’s wrist.

Emoband looks a bit like an oversized smartwatch with three bands. Two of them are ordinary straps that secure the device to the user’s wrist. The third band, in the middle, can be made of several different materials. It attaches to two spools on the device, which can reel the material in or out. If both spools reel in, the band squeezes the user’s wrist. If one reels in while the other reels out, the band strokes the user’s wrist. Depending on the material, those sensations may elicit different emotional responses from the user.

The prototype Emoband unit uses an Arduino Mega 2560 board to control two servo motors that turn the spools for the material band. A laptop communicates with the Arduino through serial, telling it how to move the band to mirror the onscreen content. Two load cells provide feedback on the amount of squeezing pressure. The prototype device’s frame and spools were 3D-printed.
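The two gestures can be expressed as a pair of spool commands. The sign convention (+1 reels in, -1 reels out, 0 holds) and the left/right layout are assumptions for this sketch, not the researchers’ actual protocol:

```cpp
// A command for the two servo-driven spools: +1 reels material in,
// -1 reels it out, 0 holds position. Layout is assumed for the sketch.
struct SpoolCommand { int left; int right; };

// Both spools reel in, tightening the band around the wrist.
SpoolCommand squeeze() { return {+1, +1}; }

// One spool feeds out what the other takes in, so the band slides
// across the skin at roughly constant tension.
SpoolCommand stroke(bool leftward) {
    return leftward ? SpoolCommand{+1, -1} : SpoolCommand{-1, +1};
}
```

The laptop would send one of these gestures over serial, and the Arduino would translate the command into servo motion while watching the load cells to cap the squeezing pressure.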

In the future, it could be possible to integrate this functionality into the smartwatches that people already wear, if the general public decides it wants this kind of tactile feedback. Initial testing showed that users certainly noticed the feedback, but it isn’t clear whether they found it worthwhile or practical. More details on the project can be found in the researchers’ paper here.

The post Emoband strokes and squeezes your wrist appeared first on Arduino Blog.

For those aged 65 and over, falls are among the most serious health risks, due to reduced mobility and declining coordination. Recognizing this, Naveen Kumar set out to build a wearable fall-detection device that speeds up detection by using a Transformer-based model rather than a more traditional recurrent neural network (RNN).

Because this project needed to be fast while drawing very little current, Kumar went with the new Arduino GIGA R1 WiFi thanks to its STM32H747XI dual-core Arm CPU, onboard WiFi/Bluetooth®, and ability to interface with a wide variety of sensors. After connecting an ADXL345 three-axis accelerometer, he realized that collecting many hours of samples by hand would be far too time-consuming. Instead, he downloaded the SisFall dataset, ran a Python script to parse the sample data into an Edge Impulse-compatible format, and uploaded the resulting JSON files into a new project. Once that was done, he used the API to split each sample into four-second segments and then used the Keras block edit feature to build a reduced-size Transformer model.

The result after training was a 202KB model that could correctly determine whether a fall had occurred 96% of the time. Deployment was then as simple as using the Arduino library feature within a sketch to run an inference and display the result via an LED, though future iterations could leverage the GIGA R1 WiFi’s connectivity to send out alert notifications when an accident is detected. More information can be found here in Kumar’s write-up.
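A common way to turn a per-window fall probability into a stable alert is to require it to stay high for consecutive windows. The 0.8 threshold and two-window debounce below are assumptions for this sketch, not values from Kumar’s project:

```cpp
// Latch a fall alert only after the classifier's "fall" probability
// exceeds a threshold for two consecutive inference windows, which
// suppresses one-off spikes. Threshold and count are illustrative.
class FallLatch {
  public:
    bool update(float fallProbability) {
        if (fallProbability > 0.8f) {
            if (++consecutive_ >= 2) return true;
        } else {
            consecutive_ = 0;
        }
        return false;
    }

  private:
    int consecutive_ = 0;
};
```

In the sketch's loop, each inference result would be fed to `update()`, and the LED (or a future alert notification) would fire only when it returns true.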

The post This GIGA R1 WiFi-powered wearable detects falls using a Transformer model appeared first on Arduino Blog.

Few things are worse than going to exercise, coming back home, and then realizing that you have been nose blind the entire time to your own odor. In order to detect the potential stench before anyone else does, Luke Berndt and his daughter, Elena, teamed up to create the Smelling Fresh, Feeling Fresh! project.

Their idea was to take a Nicla Sense ME board along with one of K-Way’s jackets as part of our recent collaboration and use it to recognize when the outerwear developed a foul smell. Data was gathered using already stinky clothes from dirty laundry bins and trash, with the BME688 four-in-one gas sensor picking up the slight differences in CO2, humidity, and volatile organic compounds (VOCs) between clean and smelly samples. All of the data was then uploaded to the Edge Impulse Studio and used to train a model, and after a few more rounds of gathering more data, it was finally accurate enough to deploy.

The original plan involved sending an alert over Bluetooth® Low Energy to an accompanying phone app and displaying the message to the user, but this proved too difficult because of low-memory issues. So instead, the duo simply made the code illuminate the onboard RGB LED red, yellow, or green to indicate the current air cleanliness.
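That traffic-light mapping boils down to two cut-off points on the model’s “smelly” probability. The thresholds below are invented for this sketch; the write-up doesn’t publish the actual values:

```cpp
// Map the classifier's "smelly" probability to one of the three LED
// colors. The 0.3 and 0.7 cut-offs are assumptions for illustration.
enum class Freshness { Green, Yellow, Red };

Freshness classify(float smellyProbability) {
    if (smellyProbability < 0.3f) return Freshness::Green;
    if (smellyProbability < 0.7f) return Freshness::Yellow;
    return Freshness::Red;
}
```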

For more details, you can check out their proof of concept on the Arduino Project Hub.

The post Making jackets smarter by letting them smell appeared first on Arduino Blog.

Bone density, strength, and coordination all decrease as we age, which can lead to serious consequences in the form of slips, falls, and other accidents. In Finland, falling is the most common type of accidental death among those aged 65 and over, accounting for around 1,200 deaths per year. But Thomas Vikstrom hopes to decrease this number by detecting falls the moment they occur, using the Arduino Nicla Sense ME’s accelerometer together with a K-Way jacket and a smartwatch.

At first, Vikstrom tried to gather and label data for all kinds of activities, including sitting, walking, running, driving, etc., but later realized anomaly detection would be much better suited for this application. After collecting around 80 seconds of data with Edge Impulse Studio, he trained an anomaly detection model to detect when any out-of-the-ordinary events occur. The model was then deployed to the Nicla Sense ME by integrating the inferencing code with a BLE service that outputs a positive value when a fall is detected, as well as illuminating the onboard LED.
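The BLE characteristic described above only needs to carry a single value that is positive when a fall is detected. Here is one hedged way such an encoding might look; the 0.5 threshold and severity scaling are assumptions for this sketch, not Vikstrom’s actual code:

```cpp
#include <cstdint>

// Encode the anomaly score into one byte for a BLE characteristic:
// 0 while movement looks normal, 1-255 once the score crosses the
// threshold, scaled so the watch could also show severity. The
// threshold and scaling are illustrative assumptions.
uint8_t bleValueForAnomaly(float anomalyScore) {
    if (anomalyScore <= 0.5f) return 0;
    float excess = (anomalyScore - 0.5f) / 0.5f;  // 0..1 for scores up to 1.0
    if (excess > 1.0f) excess = 1.0f;
    return static_cast<uint8_t>(1.0f + excess * 254.0f);
}
```

The smartwatch would subscribe to the characteristic and start its call-for-help countdown whenever it receives a nonzero value.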

To receive this information, Vikstrom added a Bangle.js smartwatch to the system, which automatically calls an emergency number if the wearer fails to intervene. For more information about this project, you can check out his Edge Impulse docs page here. Although only a proof of concept, this K-Way jacket demonstrates how tinyML-powered outerwear can detect falls and, together with a cellular-connected device, send for help when the user is immobile.

The post Detecting falls by embedding ML into clothing appeared first on Arduino Blog.

Spectroscopy is a field of study that utilizes the measurement of electromagnetic radiation (often visible light) as it reflects off of or passes through a substance. It can, for instance, help researchers determine the composition of a material, as that composition influences how the material reflects light. Spectroscopy is also used in medicine, but traditionally requires that patients visit a lab. To enable long-term spectroscopic analysis, a team of engineers built a wearable spectroscopy sensor called Lumos.

Lumos comes in two forms: a smartwatch-like wristband and a fingertip model that resembles the pulse oximeters that nurses clip on your finger during a checkup. The latter is meant for use in doctors’ offices and labs, while the former was designed for patients to wear as they go about their daily lives. It would continue to collect spectroscopic data as they do, which could provide valuable insight. Such long-term data collection would help physicians observe how conditions progress or catch conditions that don’t present consistently.

The engineers chose an AS7341 spectral sensor for Lumos because it is compact but still has a large sensing range. An Arduino Nano 33 IoT development board provides power to the AS7341, receives data from it over an I2C connection, and then sends the data to a base station via WiFi. Power comes from a 400mAh lithium-ion battery, which lasts around five hours before it needs recharging. That’s five hours of spectroscopic data to analyze, far more than can be gathered using traditional in-lab instruments.
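Before the channel counts are useful spectroscopically, they are typically normalized against a white-reference reading taken with the same integration settings. The 8-channel layout below mirrors a typical multispectral sensor and is shown for illustration only; it is not the exact Lumos pipeline:

```cpp
#include <array>
#include <cstddef>
#include <cstdint>

// Convert raw spectral channel counts to reflectance by dividing each
// channel by a white-reference reading captured with identical
// integration settings. Dead channels (zero reference) yield 0.
std::array<float, 8> toReflectance(const std::array<uint16_t, 8>& sample,
                                   const std::array<uint16_t, 8>& whiteRef) {
    std::array<float, 8> out{};
    for (std::size_t i = 0; i < 8; ++i) {
        out[i] = whiteRef[i] ? static_cast<float>(sample[i]) / whiteRef[i]
                             : 0.0f;
    }
    return out;
}
```

Whether this normalization runs on the Nano 33 IoT or at the base station is a design choice; doing it at the base station keeps the on-device loop lean.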

Image credit: Watson and Kendel et al.

The post Lumos finally enables wearable spectroscopy research appeared first on Arduino Blog.


