
Planet Arduino

Archive for the ‘Wearable Computing’ Category

When you get a notification on your smartphone, more often than not you’re doing something more pressing. You silence the alert, then perhaps forget about it entirely. Nick Bild, however, has created a pair of smart glasses that take a new “look” at things by instead notifying you when you’re staring at a relevant item.

For instance, as shown in the demo below, if your calendar says “Go for a walk,” the Newrons light up when you glance at a pair of sneakers.

The prototype is controlled by an Arduino Nano 33 IoT, which connects to the Google Calendar API over WiFi to read your schedule. Object recognition is handled by a JeVois A33 machine vision camera, and notifications are shown on an LED.
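The post doesn’t include source, but the JeVois camera can stream detected-object labels over its serial port, which the Nano could match against a keyword pulled from the current calendar entry. A minimal sketch of that matching step, written here as plain C++ for clarity (the keyword scheme is an assumption, not Bild’s actual code):

```cpp
#include <algorithm>
#include <cassert>
#include <cctype>
#include <string>

// Lower-case a string so label/keyword comparison is case-insensitive.
static std::string toLower(std::string s) {
    std::transform(s.begin(), s.end(), s.begin(),
                   [](unsigned char c) { return std::tolower(c); });
    return s;
}

// Decide whether a detected-object label satisfies the current calendar task,
// e.g. task "Go for a walk" tagged with keyword "sneakers" matches the
// label "Sneakers" reported by the vision camera.
bool labelMatchesKeyword(const std::string& label, const std::string& keyword) {
    return toLower(label).find(toLower(keyword)) != std::string::npos;
}
```

On the Nano, a match like this would simply drive the notification LED high.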

More details can be found in Bild’s write-up.

Researchers at the University of Waterloo in Canada have developed a novel hand-based input technique called Tip-Tap that amazingly requires no batteries. 

The wearable device uses a series of three custom RFID tags on both the thumb and index finger with half an antenna on each digit. When the fingertips are touched together, a signal is sent to the computer indicating where the thumb and index finger intersect, which is mapped as a position on a 2D grid.

Usability experiments were carried out using an Arduino Mega, both with and without on-screen visual feedback. Possible applications include the medical field, where Tip-Tap could be added to disposable gloves, enabling surgeons to operate a laptop without dictating inputs to an assistant or compromising sterility.
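The heart of the technique is turning a single closed (thumb contact, finger contact) pair into one of nine grid cells. A minimal sketch of that mapping in plain C++; the row-major indexing is an assumption on my part, not taken from the paper:

```cpp
#include <cassert>

// Tip-Tap's 2D mapping (assumed indexing): three contacts along the thumb
// tip act as rows and three along the index fingertip as columns. A touch
// closes exactly one (thumb, finger) pair, selecting one cell of a 3x3 grid.
int tipTapCell(int thumbContact, int fingerContact) {
    if (thumbContact < 0 || thumbContact > 2 ||
        fingerContact < 0 || fingerContact > 2) {
        return -1;  // no valid intersection sensed
    }
    return thumbContact * 3 + fingerContact;  // cells 0..8, row-major
}
```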

We describe Tip-Tap, a wearable input technique that can be implemented without batteries using a custom RFID tag. It recognizes 2-dimensional discrete touch events by sensing the intersection between two arrays of contact points: one array along the index fingertip and the other along the thumb tip. A formative study identifies locations on the index finger that are reachable by different parts of the thumb tip, and the results determine the pattern of contact points used for the technique. Using a reconfigurable 3×3 evaluation device, a second study shows eyes-free accuracy is 86% after a very short period, and adding bumpy or magnetic passive haptic feedback to contacts is not necessary. Finally, two battery-free prototypes using a new RFID tag design demonstrate how Tip-Tap can be implemented in a glove or tattoo form factor.

A stretchable light-emitting device becomes an epidermal stopwatch.
Image: Adapted from ACS Materials Letters 2019

What if your watch weren’t mounted on your wrist, but instead integrated into a sort of temporary tattoo on the back of your hand? Such an idea is now one step closer to reality, thanks to new research into alternating-current electroluminescent (ACEL) display technology.

While such displays normally require well over 100 VAC to produce sufficient brightness, scientists have worked to bring this figure down into the 10–35 V range, allowing the displays to operate in much closer proximity to human skin.

To demonstrate this technology, the team constructed a 4-digit 7-segment display that can be applied to one’s hand, using an Arduino Mega and driver circuitry to turn it into a digital timepiece.
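The ACEL driver electronics are the novel part, but the timekeeping side is ordinary 7-segment logic: split the elapsed time into four digits and look up each digit’s segment pattern. A hedged sketch in plain C++; the segment bit order is an assumed conventional wiring, not the researchers’ circuit:

```cpp
#include <array>
#include <cassert>

// Conventional 7-segment patterns for digits 0-9, bit 0 = segment a through
// bit 6 = segment g (assumed wiring; the ACEL drive circuitry differs from
// a typical LED display).
constexpr std::array<unsigned char, 10> kDigitSegments = {
    0x3F, 0x06, 0x5B, 0x4F, 0x66, 0x6D, 0x7D, 0x07, 0x7F, 0x6F};

// Split elapsed seconds into the four digits of an MM:SS stopwatch readout.
std::array<int, 4> stopwatchDigits(unsigned long elapsedSeconds) {
    unsigned long m = (elapsedSeconds / 60) % 100;  // wrap at 100 minutes
    unsigned long s = elapsedSeconds % 60;
    return {int(m / 10), int(m % 10), int(s / 10), int(s % 10)};
}
```

On the Mega, each refresh would push `kDigitSegments[digit]` to the driver circuitry for the active digit.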

More information can be found in the researchers’ paper published in ACS Materials Letters.

Imagine if you had whiskers. Obviously, this would make you something of an oddity in today’s society. On the other hand, you’d be able to sense nearby objects via the transmission of force through these hair structures.

In order to explore this concept, Chris Hill has created a whisker assembly for sensory augmentation, substituting flex sensors for the stiff hairs that we as humans don’t possess. The sensors—four are used here—vary resistance when bent, furnishing information about their status to the Arduino Uno that controls the wearable device. Forehead-mounted vibratory motors are pulsed via PWM outputs in response, allowing the user to feel what’s going on in the surrounding environment.
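A plausible core of such a sketch is just a mapping from a flex sensor’s ADC reading to a motor PWM duty, with a dead band so the motor stays off while the whisker is at rest. Shown as plain C++ for clarity; the threshold values are illustrative assumptions, not Hill’s calibration:

```cpp
#include <cassert>

// Map a flex-sensor reading (assumed 10-bit ADC, 0..1023) to a vibration
// motor PWM duty (0..255), ignoring small bends below a rest threshold.
int whiskerToPwm(int adcReading, int restLevel = 200, int maxLevel = 900) {
    if (adcReading <= restLevel) return 0;    // whisker at rest: motor off
    if (adcReading >= maxLevel) return 255;   // fully deflected: full power
    long span = maxLevel - restLevel;
    return int((long)(adcReading - restLevel) * 255 / span);
}
```

On the Uno this would run once per loop for each of the four whiskers, roughly as `analogWrite(motorPin, whiskerToPwm(analogRead(A0)))`.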

If this looks familiar, Hill is quick to credit Nicholas Gonyea’s Whisker Sensory Extension Wearable as the basis for this project. He hopes his take on things improves the original, making it lighter, more cost-effective, and easier to construct. 

The purpose of this project was to focus on the creation of novel, computationally enriched “sensory extensions” that allow for augmented sensing of the natural world. My major effort with this project was devoted to the fabrication and implementation of sensory augmentations that will extend a sense through sensors and respond with a tactile output for the user. The intent is to enable anyone to fabricate their own sensory extensions, and thus map intrinsically human/animal senses onto hardware, effectively extending our senses in new and exciting ways that will lead to a better understanding of how our brain is able to adapt to new external senses.

While you can get a very good workout on your own, it’s ideal if you have someone else watching over your form. This, of course, isn’t always practical, so researchers at the University of Auckland’s Augmented Human Lab have prototyped a wearable system called GymSoles to help. 

GymSoles consists of a pressure-sensitive insole that is used to determine a foot’s center of pressure, and thus infer whether or not the participant is keeping the weights in the proper position relative to his or her body—perfect for exercises like squats and deadlifts. 

Feedback is provided both visually and through eight vibrating motors, allowing participants to correct their technique without having to focus on a screen. The insole is driven by an Arduino Uno with motor drivers and an I2C multiplexer, connected to a controlling computer.

The correct execution of exercises, such as squats and dead-lifts, is essential to prevent various bodily injuries. Existing solutions either rely on expensive motion-tracking systems or on multiple Inertial Measurement Units (IMUs), which require an extensive set-up and individual calibration. This paper introduces a proof of concept, GymSoles, an insole prototype that provides feedback on the Centre of Pressure (CoP) at the feet to assist users with maintaining the correct body posture while performing squats and dead-lifts. GymSoles was evaluated with 13 users in three conditions: 1) no feedback, 2) vibrotactile feedback, and 3) visual feedback. It has shown that solely providing feedback on the current CoP results in a significantly improved body posture.

Instructables author Daniel Quintana loves mountain biking, but after having to interrupt a ride to continuously check the time, he did what any normal teenager would do in this situation: he created his own Google Glass-like headset from scratch.

His DIY AR device, called “Uware,” takes the form of a 3D-printed enclosure with a tiny 0.49″ OLED screen stuffed inside, along with an HC-06 Bluetooth module, an APDS-9960 gesture sensor, a 3.7V battery, and of course, a tiny Arduino Pro Mini for control.

In normal usage, the wearable displays the time and text messages transmitted from Quintana’s phone over Bluetooth via a custom app that he wrote. Swiping right in front of the gesture sensor puts it into camera mode, allowing him to capture the environment hands-free!
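While the sketch itself isn’t published, the mode switching the post describes amounts to a tiny state machine fed by gesture events from the APDS-9960. A hedged sketch of that logic in plain C++; the left-swipe exit is my assumption, since the post only mentions the right swipe entering camera mode:

```cpp
#include <cassert>

enum Mode { CLOCK, CAMERA };
enum Gesture { NONE, SWIPE_LEFT, SWIPE_RIGHT };

// Assumed mode logic for the headset: a right swipe in front of the gesture
// sensor enters camera mode; a left swipe (assumption) returns to the clock
// face; anything else keeps the current mode.
Mode nextMode(Mode current, Gesture g) {
    if (g == SWIPE_RIGHT) return CAMERA;
    if (g == SWIPE_LEFT) return CLOCK;
    return current;
}
```

On the Pro Mini, `g` would come from polling the gesture sensor’s library each pass through loop().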

Want to see more? You can find Quintana’s write-up here, or check out Uware’s prototype electronics setup and custom magnetic charging rig in the videos below!

To address the limitations of today’s fixed-face watches, researchers have come up with an actuated smartwatch concept that physically moves itself using an Arduino Due, Bluetooth and several motors.

Receiving Internet notifications has gone from sitting at a computer, to checking them on your smartphone, to now simply seeing them arrive on your wearable device. Even so, you still have to rotate your wrist into the right position to see the screen. Worse yet, if you want to show others what is on your wrist, you may even have to twist your arm awkwardly.

Fortunately, there is a possible solution to this scourge in the form of Cito, which bills itself as “An Actuated Smartwatch for Extended Interactions.” The design can move in five different ways (rotating, hinging, translating, orbiting and rising), potentially making viewing more convenient or even providing haptic feedback. Prototype electronics are housed inside a control box on the upper arm, but would presumably become much smaller in a production version.

You can see the team’s entire paper here, or read this write-up for a more involved summary.

Photo: Jun Gong

After much experimentation, researchers at Fraunhofer Institute for Computer Graphics Research in Rostock and the University of Cologne in Germany have developed an electronically-augmented earplug that can read facial expressions and convert them into controls for your smartphone. For example, you may soon be able to answer a call with a wink or launch an app by moving your head to one side.

The prototype of this EarFieldSensing, or EarFS, technology consists of the earbud itself, a reference electrode attached to the user’s earlobe, and an Arduino along with four sensing shields in a companion bag.

Currently, the system can recognize five expressions (winking, smiling, opening your mouth, making a ‘shh’ sound, and turning your head to the right) with over 85% accuracy while walking, and even better while sitting. Hands-free emojis would be an obvious use case, but perhaps it could be employed for covert signaling as well. Was that a nice smile, or are you calling in backup? It could also be quite useful while driving, or for those with disabilities.

You can read more about EarFS in the team’s paper and in this New Scientist article.

Photo: Denys J.C. Matthies / Daily Mail

Developed by a team of UC Berkeley students, Skintillates is a wearable technology that mimics tattoos.

When you think of temporary tattoos, you likely think of something that comes out of a gumball dispenser, or perhaps “art” that you got on a spring break trip. As interesting as those may be, Skintillates is taking things to the next level.

These “epidermal wearable interactive devices” can serve as everything from passive and active on-skin displays, to capacitive and resistive sensors for controlling gadgets, to strain gauges for posture detection.

Using several layers allows these designs to stick to the skin, integrate various electronics such as sensors or even LEDs, and carry visible art for others to see. In at least one case, the lights are programmed to flash along with the beat of music, driven by an Arduino hidden under the wearer’s clothing.
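The post doesn’t say how the beat sync works, but a common approach is to threshold the audio envelope and then hold off briefly so each beat fires the LEDs only once. A sketch of that idea in plain C++, with the threshold and hold-off values purely illustrative:

```cpp
#include <cassert>
#include <vector>

// Assumed beat detector for music-synced LEDs: flag a beat when the audio
// envelope crosses a threshold, then ignore further crossings for a
// refractory window so one beat is not counted twice.
std::vector<int> detectBeats(const std::vector<int>& envelope,
                             int threshold, int holdoffSamples) {
    std::vector<int> beats;
    int lastBeat = -holdoffSamples - 1;
    for (int i = 0; i < (int)envelope.size(); ++i) {
        if (envelope[i] >= threshold && i - lastBeat > holdoffSamples) {
            beats.push_back(i);  // pulse the tattoo LEDs at this sample
            lastBeat = i;
        }
    }
    return beats;
}
```

On an Arduino, the envelope would come from repeatedly sampling a microphone module’s analog output, pulsing the LED pin whenever a beat is flagged.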

Just like the traditional temporary tattoos often worn by children and adults alike, Skintillates flex naturally with the user’s skin. Our simple fabrication technique also enables users to freely design and print with a full range of colors to create application-specific customized designs.

You can find more on this project on the Hybrid Ecologies Lab page and read the team’s entire paper here.

(Photos: Eric Paulos)


