
Planet Arduino

Archive for the ‘Wearable Computing’ Category

For his school science fair, Mars Kapadia decided to take things up a notch and create his own pair of smart glasses.

The wearable device, which went on to place in the state competition, uses a transparent OLED display to show info from Retro Watch software running on an Android phone. It's controlled by an Arduino Nano Every, with an HC-05 Bluetooth module to communicate with the mobile app, and powered by a LiPo battery.

One unusual feature is that the darkened lenses can be flipped down for sun protection outdoors, then back up for easy viewing in darker areas. In the video below, Kapadia demonstrates how his glasses work and discusses the technology behind them.
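
For a sense of how little glue code such a display needs, here is a minimal sketch along the same lines, assuming a 128×64 SSD1306-compatible transparent OLED, the Adafruit display libraries, and the HC-05 on the Nano Every's hardware Serial1; Kapadia's actual firmware and the Retro Watch protocol differ.

```cpp
#include <Wire.h>
#include <Adafruit_GFX.h>
#include <Adafruit_SSD1306.h>

// Assumptions: a 128x64 SSD1306-compatible transparent OLED on I2C at 0x3C,
// and the HC-05 wired to the Nano Every's Serial1 (pins 0/1).
Adafruit_SSD1306 display(128, 64, &Wire, -1);

void setup() {
  Serial1.begin(9600);                        // HC-05 default baud rate
  display.begin(SSD1306_SWITCHCAPVCC, 0x3C);
  display.clearDisplay();
  display.setTextSize(1);
  display.setTextColor(SSD1306_WHITE);
  display.display();
}

void loop() {
  // Show each newline-terminated string sent by the phone app.
  if (Serial1.available()) {
    String line = Serial1.readStringUntil('\n');
    display.clearDisplay();
    display.setCursor(0, 0);
    display.print(line);
    display.display();
  }
}
```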

When wearing a face mask nowadays, you can't show expressions in the ways we're all accustomed to. As a possible solution to this problem, programmer Tyler Glaiel decided to create a custom covering with an 8×8 LED matrix that picks up his voice and imitates his mouth moving. It even lets him smile, by sensing when he makes a "pop" sound.

The build is entirely self-contained, with an Arduino Nano, 9V power supply, and electret microphone embedded in the mask’s translucent black cloth.
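
Glaiel's blog post has his actual code; as a rough sketch of the idea, the loop below assumes a MAX7219-driven matrix (via the LedControl library) and an electret mic module on A0, mapping loudness to how far the "mouth" opens.

```cpp
#include <LedControl.h>

// Assumed wiring: MAX7219 on pins 12 (DIN), 11 (CLK), 10 (CS); mic on A0.
LedControl lc(12, 11, 10, 1);

void setup() {
  lc.shutdown(0, false);   // wake the MAX7219 from power-saving mode
  lc.setIntensity(0, 8);
  lc.clearDisplay(0);
}

void loop() {
  // Sample the mic for ~50 ms and track the peak-to-peak amplitude.
  unsigned long start = millis();
  int lo = 1023, hi = 0;
  while (millis() - start < 50) {
    int v = analogRead(A0);
    if (v < lo) lo = v;
    if (v > hi) hi = v;
  }
  // Map loudness to how far the mouth opens (0-3 rows above/below center).
  int open_ = constrain(map(hi - lo, 10, 300, 0, 3), 0, 3);
  lc.clearDisplay(0);
  lc.setRow(0, 3 - open_, B00111100);  // upper lip
  lc.setRow(0, 4 + open_, B00111100);  // lower lip
}
```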

Instructions on how to make your own are available in Glaiel's blog post, though he is quick to note that it's not guaranteed to inhibit virus transmission and is meant as something of a novelty.

Typing with your thumbs on a smartphone has become an everyday activity for many, but what if you could enter text by simply tapping on your index fingers? With BiTipText, that may soon be a reality. 

The researchers’ prototype consists of an interactive skin overlay made out of flexible PCB material, allowing an Arduino Uno and MPR121 sensor chip to read capacitive signals from both digits. 

In testing, users were able to enter text at over 23 WPM, with a 0.03% uncorrected error rate. Notably, the two-handed implementation means that software can determine not only the position of presses, but the sequence of left/right inputs to help with word interpretation.
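
The MPR121 does the heavy lifting on the sensing side. A minimal reading loop using the Adafruit_MPR121 library looks something like the following; the electrode layout and word-prediction logic in BiTipText's prototype are, of course, far more involved.

```cpp
#include <Wire.h>
#include <Adafruit_MPR121.h>

Adafruit_MPR121 cap;
uint16_t lastTouched = 0;

void setup() {
  Serial.begin(115200);
  if (!cap.begin(0x5A)) {       // default MPR121 I2C address
    Serial.println("MPR121 not found");
    while (1);
  }
}

void loop() {
  uint16_t touched = cap.touched();   // bitmask of the 12 electrodes
  for (uint8_t i = 0; i < 12; i++) {
    // Report newly pressed electrodes; a decoder like BiTipText's would
    // combine position with the left/right sequence to predict words.
    if ((touched & _BV(i)) && !(lastTouched & _BV(i))) {
      Serial.print("electrode ");
      Serial.println(i);
    }
  }
  lastTouched = touched;
}
```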

More details on the bimanual text input method can be found in the team’s paper here.

Researchers at UNIST in South Korea have developed a novel system for smart device input using touch-sensitive fingernails, called Nailz.

As noted in the team’s paper, fingernails have long been augmented for cosmetic purposes and are extremely accessible, making them a perfect unobtrusive input platform.

The nail system. (Image: Lee et al.)

A total of 144 thumb-to-finger gestures were identified for the setup, with 29 selected as the most practical. In testing, the method achieved 94.3 percent recognition accuracy. The fingernails were augmented using flexible PCB material along with an MPR121 capacitive sensing chip, and an accelerometer was used to detect wrist movement. Input data was collected by an Arduino MKR WiFi 1010 and sent to a PC for further processing.
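
As a sketch of that data path, the MKR WiFi 1010 could stream raw touch states to the PC over UDP as below. The network credentials, PC address, and two-byte packet format are placeholders, and the Adafruit_MPR121 library is an assumption; the team's gesture classification happens on the PC side.

```cpp
#include <WiFiNINA.h>
#include <WiFiUdp.h>
#include <Wire.h>
#include <Adafruit_MPR121.h>

const char SSID[] = "my-network";       // placeholder credentials
const char PASS[] = "my-password";
IPAddress pcAddress(192, 168, 1, 50);   // the PC running the classifier
const uint16_t PC_PORT = 4210;

Adafruit_MPR121 cap;
WiFiUDP udp;

void setup() {
  while (WiFi.begin(SSID, PASS) != WL_CONNECTED) delay(1000);
  udp.begin(PC_PORT);
  cap.begin(0x5A);
}

void loop() {
  uint16_t touched = cap.touched();     // 12-electrode bitmask
  uint8_t packet[2] = { highByte(touched), lowByte(touched) };
  udp.beginPacket(pcAddress, PC_PORT);
  udp.write(packet, 2);
  udp.endPacket();
  delay(20);                            // ~50 Hz sample stream
}
```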

More details on the project can be found here.

When you get a notification on your smartphone, more often than not you're busy with something more pressing. You silence the alert, and perhaps forget about it. Nick Bild, however, has created a pair of smart glasses that take a new "look" at things, instead notifying you when you're staring at a relevant item.

For instance, as seen in the demo below, if your calendar says to "Go for a walk," the Newrons would light up when you're glancing at a pair of sneakers.

The prototype is controlled by an Arduino Nano 33 IoT, which connects to the Google Calendar API over WiFi to view your schedule. Object recognition is taken care of by a JeVois A33 machine vision camera, and notifications are shown on an LED.
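
A simplified version of that logic might look like the following, where the target label is hard-coded rather than fetched from the Google Calendar API, and the serial message format is an assumption (it varies by JeVois vision module):

```cpp
// Watch the JeVois camera's serial output for an object label matching the
// current calendar task, and light the LED on a match.
const int LED_PIN = LED_BUILTIN;
String target = "sneakers";   // would come from the calendar event text

void setup() {
  pinMode(LED_PIN, OUTPUT);
  Serial1.begin(115200);      // JeVois UART output on the Nano 33 IoT
}

void loop() {
  if (Serial1.available()) {
    String msg = Serial1.readStringUntil('\n');
    // Light up when the recognized object appears in the message.
    digitalWrite(LED_PIN, msg.indexOf(target) >= 0 ? HIGH : LOW);
  }
}
```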

More details can be found in Bild's write-up here.

Researchers at the University of Waterloo in Canada have developed a novel hand-based input technique called Tip-Tap that amazingly requires no batteries. 

The wearable device uses a series of three custom RFID tags spanning the thumb and index finger, with half of each antenna on each digit. When the fingertips are touched together, a signal is sent to the computer indicating where the thumb and index finger intersect, which is mapped as a position on a 2D grid.

Usability experiments were carried out using an Arduino Mega, both with and without on-screen visual feedback. Possible applications include the medical field, where Tip-Tap could be added to disposable gloves, enabling surgeons to access a laptop without dictating inputs to an assistant or creating sterilization issues.

We describe Tip-Tap, a wearable input technique that can be implemented without batteries using a custom RFID tag. It recognizes 2-dimensional discrete touch events by sensing the intersection between two arrays of contact points: one array along the index fingertip and the other along the thumb tip. A formative study identifies locations on the index finger that are reachable by different parts of the thumb tip, and the results determine the pattern of contact points used for the technique. Using a reconfigurable 3×3 evaluation device, a second study shows eyes-free accuracy is 86% after a very short period, and adding bumpy or magnetic passive haptic feedback to contacts is not necessary. Finally, two battery-free prototypes using a new RFID tag design demonstrate how Tip-Tap can be implemented in a glove or tattoo form factor.
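
To make the 2D-grid idea concrete, here is an illustrative scan for a wired 3×3 arrangement like the paper's evaluation device: three drive lines along the thumb, three sense lines along the index finger. Pin choices on the Mega are arbitrary, and the battery-free RFID version works quite differently.

```cpp
// Touching thumb contact r to index contact c reads as cell (r, c).
const int thumbPins[3] = {22, 24, 26};   // driven low one at a time
const int indexPins[3] = {30, 32, 34};   // inputs with pull-ups

void setup() {
  Serial.begin(115200);
  for (int r = 0; r < 3; r++) {
    pinMode(thumbPins[r], OUTPUT);
    digitalWrite(thumbPins[r], HIGH);
  }
  for (int c = 0; c < 3; c++) pinMode(indexPins[c], INPUT_PULLUP);
}

void loop() {
  for (int r = 0; r < 3; r++) {
    digitalWrite(thumbPins[r], LOW);            // select one thumb contact
    for (int c = 0; c < 3; c++) {
      if (digitalRead(indexPins[c]) == LOW) {   // circuit closed = touch
        Serial.print("cell ");
        Serial.print(r);
        Serial.print(",");
        Serial.println(c);
      }
    }
    digitalWrite(thumbPins[r], HIGH);
  }
  delay(50);
}
```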

A stretchable light-emitting device becomes an epidermal stopwatch.
Image: Adapted from ACS Materials Letters 2019

Imagine if your watch wasn't mounted on your wrist, but was instead integrated into a sort of temporary tattoo on the back of your hand. Such an idea is now one step closer to reality, thanks to new research into alternating-current electroluminescent (ACEL) display technology.

While such displays normally require well over 100 VAC to produce sufficient brightness, scientists have worked to get this figure down into the 10-35 V range, allowing them to be used in much closer proximity to human skin.

To demonstrate this technology, the team constructed a 4-digit 7-segment display that can be applied to one’s hand, using an Arduino Mega and driver circuitry to turn it into a digital timepiece.
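
The timekeeping and digit-multiplexing logic is ordinary 7-segment code; a minimal sketch might look like the following, with the caveat that the Mega's pins would feed the high-voltage AC driver circuitry rather than the ACEL segments directly, and all pin assignments here are invented.

```cpp
const int segPins[7] = {2, 3, 4, 5, 6, 7, 8};    // segments a-g
const int digitPins[4] = {9, 10, 11, 12};        // digit-select lines

// Segment patterns for 0-9, bits ordered a (LSB) through g.
const byte DIGITS[10] = {
  0x3F, 0x06, 0x5B, 0x4F, 0x66, 0x6D, 0x7D, 0x07, 0x7F, 0x6F
};

void setup() {
  for (int i = 0; i < 7; i++) pinMode(segPins[i], OUTPUT);
  for (int i = 0; i < 4; i++) pinMode(digitPins[i], OUTPUT);
}

void showDigit(int pos, int value) {
  for (int i = 0; i < 4; i++) digitalWrite(digitPins[i], i == pos);
  for (int i = 0; i < 7; i++)
    digitalWrite(segPins[i], bitRead(DIGITS[value], i));
}

void loop() {
  // Elapsed time as MM:SS, multiplexed one digit at a time.
  unsigned long secs = millis() / 1000UL;
  int digits[4] = { int((secs / 60) / 10 % 10), int((secs / 60) % 10),
                    int((secs % 60) / 10), int(secs % 10) };
  for (int pos = 0; pos < 4; pos++) {
    showDigit(pos, digits[pos]);
    delay(2);                           // ~125 Hz refresh per digit
  }
}
```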

More information can be found in the researchers’ paper published in ACS Materials Letters.

Imagine if you had whiskers. Obviously, this would make you something of an oddity in today’s society. On the other hand, you’d be able to sense nearby objects via the transmission of force through these hair structures.

In order to explore this concept, Chris Hill has created a whisker assembly for sensory augmentation, substituting flex sensors for the stiff hairs that we as humans don’t possess. The sensors—four are used here—vary resistance when bent, furnishing information about their status to the Arduino Uno that controls the wearable device. Forehead-mounted vibratory motors are pulsed via PWM outputs in response, allowing the user to feel what’s going on in the surrounding environment.
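
A minimal version of that sensor-to-motor mapping, with illustrative pins and thresholds, could be as simple as:

```cpp
// Four flex sensors in voltage dividers on A0-A3, four vibration motors
// on PWM pins. Calibration values below are placeholders.
const int flexPins[4] = {A0, A1, A2, A3};
const int motorPins[4] = {3, 5, 6, 9};    // PWM-capable on an Uno

void setup() {
  for (int i = 0; i < 4; i++) pinMode(motorPins[i], OUTPUT);
}

void loop() {
  for (int i = 0; i < 4; i++) {
    int bend = analogRead(flexPins[i]);
    // More bend -> stronger pulse on the matching forehead motor.
    int strength = map(bend, 300, 700, 0, 255);   // calibrate per sensor
    analogWrite(motorPins[i], constrain(strength, 0, 255));
  }
  delay(20);
}
```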

If this looks familiar, Hill is quick to credit Nicholas Gonyea’s Whisker Sensory Extension Wearable as the basis for this project. He hopes his take on things improves the original, making it lighter, more cost-effective, and easier to construct. 

The purpose of this project was to focus on the creation of novel, computationally-enriched "sensory extensions" that allow for augmented-sensing of the natural world. My major effort with this project was devoted to the fabrication and implementation of sensory augmentations that will extend a sense through sensors and respond with a tactile output for the user. The intent is to enable anyone to fabricate their own sensory extensions, and thus map intrinsically human/animal senses onto hardware, effectively extending our senses in new and exciting ways that will lead to a better understanding of how our brain is able to adapt to new external senses.

While you can get a very good workout on your own, it’s ideal if you have someone else watching over your form. This, of course, isn’t always practical, so researchers at the University of Auckland’s Augmented Human Lab have prototyped a wearable system called GymSoles to help. 

GymSoles consists of a pressure-sensitive insole that is used to determine a foot’s center of pressure, and thus infer whether or not the participant is keeping the weights in the proper position relative to his or her body—perfect for exercises like squats and deadlifts. 

Feedback is provided both visually and through eight vibrating motors, allowing participants to modify their technique without having to focus on a screen. The device is controlled by an Arduino Uno with motor drivers and an I2C multiplexer, connected to a computer.
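
As an illustration of the center-of-pressure math, the sketch below reads four hypothetical force-sensitive resistors placed heel-to-toe and nudges the wearer with just two motors; the actual prototype uses eight motors behind motor drivers and an I2C multiplexer, and its target posture model is more sophisticated.

```cpp
const int fsrPins[4] = {A0, A1, A2, A3};
const float fsrPos[4] = {0.0, 0.33, 0.66, 1.0};  // heel = 0, toe = 1
const int motorBack = 5, motorFront = 6;         // PWM pins

void setup() {
  pinMode(motorBack, OUTPUT);
  pinMode(motorFront, OUTPUT);
}

void loop() {
  // Center of pressure = force-weighted average of sensor positions.
  float total = 0, weighted = 0;
  for (int i = 0; i < 4; i++) {
    int f = analogRead(fsrPins[i]);
    total += f;
    weighted += f * fsrPos[i];
  }
  if (total > 100) {                       // ignore an unloaded insole
    float cop = weighted / total;          // 0 (heel) .. 1 (toe)
    // Vibrate toward where the pressure should move; assume a mid-foot
    // target (0.5) as for a squat.
    float err = cop - 0.5;
    analogWrite(motorFront, err < -0.1 ? 200 : 0);  // shift forward
    analogWrite(motorBack,  err >  0.1 ? 200 : 0);  // shift back
  } else {
    analogWrite(motorFront, 0);
    analogWrite(motorBack, 0);
  }
  delay(50);
}
```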

The correct execution of exercises, such as squats and dead-lifts, is essential to prevent various bodily injuries. Existing solutions either rely on expensive motion tracking or on multiple Inertial Measurement Unit (IMU) systems that require an extensive set-up and individual calibration. This paper introduces a proof of concept, GymSoles, an insole prototype that provides feedback on the Centre of Pressure (CoP) at the feet to assist users with maintaining the correct body posture while performing squats and dead-lifts. GymSoles was evaluated with 13 users in three conditions: 1) no feedback, 2) vibrotactile feedback, and 3) visual feedback. It has shown that solely providing feedback on the current CoP results in a significantly improved body posture.

Instructables author Daniel Quintana loves mountain biking, but after one too many rides interrupted just to check the time, he did what any normal teenager would do in this situation: he created his own Google Glass-like headset from scratch.

His DIY AR device, called “Uware,” takes the form of a 3D-printed enclosure with a tiny 0.49″ OLED screen stuffed inside, along with an HC-06 Bluetooth module, an APDS-9960 gesture sensor, a 3.7V battery, and of course, a tiny Arduino Pro Mini for control.

In normal usage, the wearable displays the time and text messages transmitted from Quintana’s phone over Bluetooth via a custom app that he wrote. Swiping right in front of the gesture sensor puts it into camera mode, allowing him to capture the environment hands-free!
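
A minimal sketch of that mode switching, assuming the Adafruit APDS9960 library (Quintana's firmware and phone protocol are his own), might be:

```cpp
#include <Adafruit_APDS9960.h>

Adafruit_APDS9960 apds;
bool cameraMode = false;

void setup() {
  Serial.begin(9600);           // stands in for the HC-06 link
  apds.begin();
  apds.enableProximity(true);   // the gesture engine needs proximity on
  apds.enableGesture(true);
}

void loop() {
  uint8_t gesture = apds.readGesture();
  if (gesture == APDS9960_RIGHT) {        // swipe right toggles camera mode
    cameraMode = !cameraMode;
    Serial.println(cameraMode ? "CAMERA" : "WATCH");
  }
  // In watch mode, the display would show the time and texts from the
  // phone here.
}
```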

Want to see more? You can find Quintana’s write-up here, or check out Uware’s prototype electronics setup and custom magnetic charging rig in the videos below!


