Planet Arduino

Archive for the ‘Wearables’ Category

Your leather jacket might look cool, but one thing it’s missing—unless you’re maker “abetusk” or perhaps a Japanese musician—is lasers! 

After seeing Yoshii Kazuya’s laser-spiked outfit, ‘tusk decided to create an excellent version of the getup by embedding 128 laser diodes in his own jacket. These lasers are driven by an Arduino Nano, along with a pair of I2C PWM output boards, allowing them to be switched in sets of four. 

The lasers can be controlled by joystick, set to react to sound via a microphone, or run in a looping ‘twinkle’ pattern. 
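The write-up doesn’t name the exact PWM boards, but the 16-channel PCA9685 is a common choice for this kind of I2C expansion, and two of them switching four diodes per channel would cover all 128 lasers. The snippet below is only a rough sketch of how the ‘twinkle’ mode could work using the Adafruit PWM Servo Driver library; the board addresses and timing are assumptions, not values from the project’s firmware.

```cpp
// Hypothetical "twinkle" sketch: pulse random groups of four laser diodes
// through two PCA9685 I2C PWM boards (16 channels each). The 0x40/0x41
// addresses and the on-times are illustrative assumptions.
#include <Wire.h>
#include <Adafruit_PWMServoDriver.h>

Adafruit_PWMServoDriver pwmA(0x40);  // first 16 laser groups
Adafruit_PWMServoDriver pwmB(0x41);  // second 16 laser groups

void setup() {
  Wire.begin();
  pwmA.begin();
  pwmB.begin();
  pwmA.setPWMFreq(1000);  // fast enough that any flicker is invisible
  pwmB.setPWMFreq(1000);
}

void loop() {
  // Pick a random group of four lasers and pulse it briefly.
  uint8_t channel = random(32);
  Adafruit_PWMServoDriver &board = (channel < 16) ? pwmA : pwmB;
  uint8_t ch = channel % 16;

  board.setPWM(ch, 0, 4095);  // group fully on
  delay(random(50, 200));
  board.setPWM(ch, 0, 0);     // and off again
}
```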

More information on the project is available in this write-up as well as on GitHub, which includes Arduino code and other files needed to build your own.

After seeing Wei Chieh Shi’s laser jacket design, I wanted to create my own. These instructions show how to modify a jacket to add laser diodes and control them electronically to produce different laser light patterns. The laser diodes give the jacket an appearance of being “spiky”, like having metal spikes but with red laser light. The effect is especially striking in environments with fog or smoke as the laser light path shows a trail from where it originates.

The concept and execution are relatively simple, but care has to be taken to make sure that the electronics, wiring, and other aspects of the jacket don’t fail in use. Much of the subtlety and complexity of the project lies in routing the wires properly and providing strain relief for the electronics and connections, so that the jacket is resilient under normal wear.

Assuming the basic tools are available (soldering iron, multimeter, wire strippers, laser cutter, etc.), I would estimate that this project requires about $300 in raw materials and about 20 hours of labor.

Depending on the battery used, the jacket can operate for about an hour or two continuously. Spare batteries can be carried around and used to replace the depleted batteries if need be.

Those familiar with the Dragon Ball Z franchise will recognize the head-mounted Scouter computer often seen adorning character faces. As part of his Goku costume, Marcin Poblocki made an impressive replica of this device, featuring a see-through lens that shows the “strength” of the person he’s looking at, based on a distance measurement taken using a VL53L0X sensor. 

An Arduino Nano provides processing power for the headset, and light from a small OLED display is reflected on the lens for AR-style viewing.

It’s not exactly a perfect copy, but it’s an actual working device. Inspired by Google’s virtual glasses, I made a virtual distance sensor.

I used an Arduino Nano, an OLED screen, and a laser distance sensor. The laser sensor takes readings (not calibrated yet) and displays the number on the OLED screen. A Perspex mirror reflects the image (at 45 degrees) to the lens (taken from cheap Google Cardboard virtual glasses), and it’s then projected onto a clear Perspex screen.

So you will still see everything, but in the clear Perspex you will also see the distance to the object you’re looking at. On the OLED screen I typed ‘Power’ instead of distance, because that’s what this device is supposed to measure in DBZ. 😀
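For a sense of how little code that display loop needs, here is a minimal sketch in the same spirit: read the VL53L0X and print the raw range as “Power” on a small SSD1306 OLED. The Adafruit VL53L0X and SSD1306 libraries, the 128×32 resolution, and the 0x3C I2C address are assumptions rather than details from Poblocki’s build (his actual code is on Thingiverse).

```cpp
// Minimal scouter-style sketch: show the VL53L0X range as "Power" on an
// OLED. Library choices, display size, and I2C address are assumptions.
#include <Wire.h>
#include <Adafruit_VL53L0X.h>
#include <Adafruit_GFX.h>
#include <Adafruit_SSD1306.h>

Adafruit_VL53L0X lox;
Adafruit_SSD1306 display(128, 32, &Wire, -1);   // no reset pin

void setup() {
  lox.begin();                                  // sensor at default 0x29
  display.begin(SSD1306_SWITCHCAPVCC, 0x3C);
  display.setTextSize(2);
  display.setTextColor(SSD1306_WHITE);
}

void loop() {
  VL53L0X_RangingMeasurementData_t measure;
  lox.rangingTest(&measure, false);             // one uncalibrated reading

  display.clearDisplay();
  display.setCursor(0, 0);
  display.print("Power:");
  display.setCursor(0, 16);
  if (measure.RangeStatus != 4) {               // 4 means out of range
    display.print(measure.RangeMilliMeter);     // raw distance in mm
  } else {
    display.print("----");                      // nothing in range
  }
  display.display();
}
```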

Print files, as well as the code and circuit diagram needed to hook this head-mounted device up, are available on Thingiverse. For those that don’t have a DBZ costume in their immediate future, the concept could be expanded to a wide variety of other sci-fi and real-world applications.

Smartwatches can keep us informed of incoming information at a glance, but responding still requires the use of another hand, which may be occupied by other tasks. Researchers at Dartmouth College are trying to change that with their new WrisText system.

The device divides the outside of a Ticwatch 2 into six sections of letters, selected by the movement of one’s wrist. As letters are chosen, possible words are displayed on the screen, which are then selected automatically, or by rubbing and tapping gestures between one’s finger and thumb. 

The prototype employs an Arduino Due to pass information to a computer, along with proximity and piezo sensors to detect hand and finger movements. 

We present WrisText – a one-handed text entry technique for smartwatches using the joystick-like motion of the wrist. A user enters text by whirling the wrist of the watch hand towards six directions, each of which represents a key in a circular keyboard where the letters are distributed in alphabetical order. The design of WrisText was an iterative process, where we first conducted a study to investigate optimal key size, and found that keys needed to be 55° or wider to achieve over 90% striking accuracy. We then computed an optimal keyboard layout, considering a joint optimization problem of striking accuracy, striking comfort, and word disambiguation. We evaluated the performance of WrisText through a five-day study with 10 participants in two text entry scenarios: hand-up and hand-down. On average, participants achieved a text entry speed of 9.9 WPM across all sessions, and were able to type as fast as 15.2 WPM by the end of the last day.
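At its core, the input step is just binning a wrist-deflection angle into one of six sectors of the circular keyboard. The sketch below illustrates only that step; reading the deflection from an analog input and splitting the alphabet evenly across the keys are simplifying assumptions, since the actual prototype derives wrist motion from proximity and piezo sensors and uses a jointly optimized letter layout.

```cpp
// Illustration only: map a wrist angle onto one of six 60-degree keys of a
// WrisText-style circular keyboard and report the choice over serial.
// The analog-pin "angle sensor" and the alphabetical key groups are
// placeholder assumptions, not the published design.
const char* KEYS[6] = {"ABCDE", "FGHIJ", "KLMNO", "PQRST", "UVWXY", "Z"};

void setup() {
  Serial.begin(115200);
}

void loop() {
  // Map a 0-1023 analog reading onto 0-359 degrees (placeholder sensing).
  int raw = analogRead(A0);
  int angle = map(raw, 0, 1023, 0, 359);

  int key = angle / 60;              // six sectors of 60 degrees each
  Serial.print("sector ");
  Serial.print(key);
  Serial.print(" -> letters ");
  Serial.println(KEYS[key]);

  delay(100);
}
```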

More information can be found in the project’s research paper, or you can see it demonstrated in the video below.

Most of what we see on the wearable tech front is built around traditional textiles, like adding turn signals to a jacket for safer bike riding, or wiring up a scarf with RGB LEDs and a color sensor to make it match any outfit. Although we’ve seen the odd light-up hair accessory here and there, we’ve never seen anything quite like these Bluetooth-enabled, shape-shifting, touch-sensing hair extensions created by UC Berkeley students [Sarah], [Molly], and [Christine].

HairIO is based on the idea that hair is an important part of self-expression, and that it can be a natural platform for sandboxing wearable interactivity. Each hair extension is braided up with nitinol wire, which holds one shape at room temperature and changes to a different shape when heated. The idea is that you could walk around with a straight braid that curls up when you get a text, or lifts up to guide the way when a friend sends directions. You could even use the braid to wrap up your hair in a bun for work, and then literally let it down at 5:00 by sending a signal to straighten out the braid. There’s a slick video after the break that demonstrates the possibilities.

HairIO is controlled with an Arduino Nano and a custom PCB that combines the Nano, a Bluetooth module, and BJTs that drive the braid. Each braid circuit also has a thermistor to keep the heat under control. The team also adapted the swept-frequency capacitive sensing of Disney’s Touché project to make HairIO extensions respond to complex touches. Our favorite part has to be that they chalked some of the artificial tresses with thermochromic pigment powder so they change color with heat. Makes us wish we still had our Hypercolor t-shirt.
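The drive side of that circuit is conceptually simple: PWM the nitinol braid through a BJT and back off when the thermistor reports it is getting too hot. Here is a minimal sketch of that heat-and-protect loop; the pin assignments, the 10k NTC divider constants, and the 60 °C cutoff are illustrative assumptions rather than values from the HairIO build.

```cpp
// Hypothetical nitinol-braid driver: heat via PWM through a BJT, stop when
// the braid's thermistor exceeds a safety threshold. Pins, divider values,
// and the temperature limit are assumptions for illustration.
const int BRAID_PIN = 3;        // PWM pin driving the BJT base
const int THERM_PIN = A0;       // 10k NTC between A0 and GND, 10k to Vcc
const float MAX_TEMP_C = 60.0;  // assumed safe limit for the braid

float readTempC() {
  // Beta-equation conversion for an assumed 10k / B=3950 thermistor.
  int raw = constrain(analogRead(THERM_PIN), 1, 1022);
  float r = 10000.0 * raw / (1023.0 - raw);
  float tK = 1.0 / (1.0 / 298.15 + log(r / 10000.0) / 3950.0);
  return tK - 273.15;
}

void setup() {
  pinMode(BRAID_PIN, OUTPUT);
}

void loop() {
  if (readTempC() < MAX_TEMP_C) {
    analogWrite(BRAID_PIN, 180);  // heat the braid so it changes shape
  } else {
    analogWrite(BRAID_PIN, 0);    // too hot: let it cool and relax
  }
  delay(50);
}
```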

Nitinol wire is nifty stuff. You can use it to retract the landing gear on an RC plane, or make a marker dance to Duke Nukem.

Keep track of each high five for a whole weekend!

Read more on MAKE

The post Hacking Together A Smart Glove to Count High Fives at World Maker Faire appeared first on Make: DIY Projects and Ideas for Makers.

Instructables author Daniel Quintana loves mountain biking, but after having to continuously interrupt rides to check the time, he did what any normal teenager would do in this situation: he created his own Google Glass-like headset from scratch.

His DIY AR device, called “Uware,” takes the form of a 3D-printed enclosure with a tiny 0.49″ OLED screen stuffed inside, along with an HC-06 Bluetooth module, an APDS-9960 gesture sensor, a 3.7V battery, and of course, a tiny Arduino Pro Mini for control.

In normal usage, the wearable displays the time and text messages transmitted from Quintana’s phone over Bluetooth via a custom app that he wrote. Swiping right in front of the gesture sensor puts it into camera mode, allowing him to capture the environment hands-free!
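For a sense of how that gesture handling might look, here is a minimal sketch that polls an APDS-9960 for a right swipe and pings the phone over the HC-06 serial link. The Adafruit APDS9960 library, the SoftwareSerial pins, and the single-byte commands understood by the phone app are all assumptions, not details from Quintana’s code.

```cpp
// Hypothetical gesture-to-mode sketch: a right swipe over the APDS-9960
// toggles camera mode and notifies the phone via the HC-06. Wiring and the
// command bytes are assumptions for illustration only.
#include <Wire.h>
#include <SoftwareSerial.h>
#include <Adafruit_APDS9960.h>

Adafruit_APDS9960 apds;
SoftwareSerial bt(10, 11);     // RX, TX to the HC-06 (assumed wiring)
bool cameraMode = false;

void setup() {
  bt.begin(9600);              // HC-06 default baud rate
  apds.begin();
  apds.enableProximity(true);  // proximity engine is needed for gestures
  apds.enableGesture(true);
}

void loop() {
  uint8_t gesture = apds.readGesture();   // returns 0 when nothing pending
  if (gesture == APDS9960_RIGHT) {
    cameraMode = !cameraMode;
    bt.write(cameraMode ? 'C' : 'T');     // 'C' = camera, 'T' = time/texts
  }
}
```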

Want to see more? You can find Quintana’s write-up here, or check out Uware’s prototype electronics setup and custom magnetic charging rig in the videos below!

Join us this Friday at noon PDT for a Hack Chat with Tenaya Hurst of Arduino. If you’ve been to one of the big Maker Faires over the last few years (or innumerable other live events) and stopped by the Arduino area, you’ve probably met Tenaya. She is the Education Accounts Manager for Arduino and loves working with wearable electronics.

Come and discuss maker education and the role Arduino is playing in getting our students excited about electronics, and STEAM education in general. Tenaya will also be discussing a new wearable tech kit she’s been working on. We hope to see the gear in person at Bay Area Maker Faire next week.

Here’s How To Take Part:

Our Hack Chats are live community events in the Hackaday.io Hack Chat group messaging.

Log into Hackaday.io, visit that page, and look for the ‘Join this Project’ button. Once you’re part of the project, the button will change to ‘Team Messaging’, which takes you directly to the Hack Chat.

You don’t have to wait until Friday; join whenever you want and you can see what the community is talking about.


Filed under: Arduino Hacks, Hackaday Columns, wearable hacks

After much experimentation, researchers at Fraunhofer Institute for Computer Graphics Research in Rostock and the University of Cologne in Germany have developed an electronically-augmented earplug that can read facial expressions and convert them into controls for your smartphone. For example, you may soon be able to answer a call with a wink or launch an app by moving your head to one side.

The prototype of this EarFieldSensing, or EarFS, technology consists of the earbud itself, a reference electrode attached to the user’s earlobe, and an Arduino along with four sensing shields in a companion bag.

Currently, the system can recognize five expressions–winking, smiling, opening your mouth, making a ‘shh’ sound, and turning your head to the right–with over 85% accuracy while walking, and even better when sitting. Hands-free emojis would be an obvious use case, but perhaps it could be employed for covert signaling as well. Was that a nice smile, or are you calling in backup? It could also be quite useful while driving or for those with disabilities.

You can read more about EarFS in the team’s paper and in this New Scientist article.

Photo: Denys J.C. Matthies / Daily Mail


