
Planet Arduino

Archive for the ‘Wearable Technology’ Category

While you can get a very good workout on your own, it’s ideal if you have someone else watching over your form. This, of course, isn’t always practical, so researchers at the University of Auckland’s Augmented Human Lab have prototyped a wearable system called GymSoles to help. 

GymSoles consists of a pressure-sensitive insole that is used to determine a foot’s center of pressure, and thus infer whether or not the participant is keeping the weights in the proper position relative to his or her body—perfect for exercises like squats and deadlifts. 

Feedback is provided visually as well as through eight vibrating motors, allowing participants to correct their technique without having to focus on a screen. The device is driven by an Arduino Uno connected to motor drivers through an I2C multiplexer and controlled from a computer.

The correct execution of exercises, such as squats and dead-lifts, is essential to prevent various bodily injuries. Existing solutions either rely on expensive motion tracking systems or on multiple Inertial Measurement Units (IMUs) that require an extensive set-up and individual calibration. This paper introduces a proof of concept, GymSoles, an insole prototype that provides feedback on the Centre of Pressure (CoP) at the feet to assist users in maintaining the correct body posture while performing squats and dead-lifts. GymSoles was evaluated with 13 users in three conditions: 1) no feedback, 2) vibrotactile feedback, and 3) visual feedback. It was shown that solely providing feedback on the current CoP results in a significantly improved body posture.
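To give a rough idea of how such vibrotactile feedback could be wired up, here is a minimal Arduino sketch. It is only an illustration, not the team's firmware: it assumes two pressure sensors (heel and toe) on A0/A1 and eight DRV2605-style haptic drivers sitting behind a TCA9548A I2C multiplexer, all of which are our own guesses at the hardware details.

// Illustrative only; the GymSoles paper does not publish its firmware.
// Assumed hardware: heel/toe pressure sensors on A0/A1, eight DRV2605
// haptic drivers behind a TCA9548A I2C multiplexer at address 0x70.
#include <Wire.h>
#include "Adafruit_DRV2605.h"

Adafruit_DRV2605 drv;

// Route the shared I2C bus to one multiplexer channel (0-7).
void tcaSelect(uint8_t channel) {
  Wire.beginTransmission(0x70);
  Wire.write(1 << channel);
  Wire.endTransmission();
}

// Fire a single haptic click on the motor behind the given channel.
void buzz(uint8_t channel) {
  tcaSelect(channel);
  drv.setWaveform(0, 47);   // "strong click" from the built-in effect library
  drv.setWaveform(1, 0);    // end of sequence
  drv.go();
}

void setup() {
  Wire.begin();
  for (uint8_t ch = 0; ch < 8; ch++) {   // initialise every driver once
    tcaSelect(ch);
    drv.begin();
    drv.selectLibrary(1);
    drv.setMode(DRV2605_MODE_INTTRIG);
  }
}

void loop() {
  int heel = analogRead(A0);
  int toe  = analogRead(A1);
  // Normalised centre of pressure along the foot: -1 = heel, +1 = toes.
  float cop = (float)(toe - heel) / (toe + heel + 1);

  if (cop > 0.3)  buzz(0);   // weight drifting onto the toes: pulse a front motor
  if (cop < -0.3) buzz(4);   // weight drifting onto the heel: pulse a rear motor
  delay(100);
}

Because identical haptic drivers share one fixed I2C address, a multiplexer is what lets a single Uno talk to several of them one channel at a time, which is presumably the role it plays in the real device.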

After much experimentation, researchers at Fraunhofer Institute for Computer Graphics Research in Rostock and the University of Cologne in Germany have developed an electronically-augmented earplug that can read facial expressions and convert them into controls for your smartphone. For example, you may soon be able to answer a call with a wink or launch an app by moving your head to one side.

The prototype of this EarFieldSensing, or EarFS, technology consists of the earbud itself, a reference electrode attached to the user’s earlobe, and an Arduino along with four sensing shields in a companion bag.

Currently, the system can recognize five expressions (winking, smiling, opening your mouth, making a ‘shh’ sound, and turning your head to the right) with over 85% accuracy while walking, and even better when sitting. Hands-free emojis would be an obvious use case, but perhaps it could be employed for covert signaling as well. Was that a nice smile, or are you calling in backup? It could also be quite useful while driving or for those with disabilities.
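The expression recognition itself depends on the team's custom sensing shields and signal processing, so any code here is necessarily a guess. As a minimal hedged sketch, the Arduino side could do little more than sample the amplified ear-electrode signal and stream it to a host for classification; the pin and the roughly 1 kHz sampling rate below are assumptions, not details from the paper.

// Hypothetical front end only: sample the amplified in-ear electrode signal
// and stream it over serial so a host machine can do the classification.
const int ELECTRODE_PIN = A0;   // assumed amplifier output pin

void setup() {
  Serial.begin(115200);
}

void loop() {
  unsigned long start = micros();
  int sample = analogRead(ELECTRODE_PIN);   // raw 0-1023 reading
  Serial.println(sample);
  while (micros() - start < 1000) { }       // crude pacing to ~1 kHz
}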

You can read more about EarFS in the team’s paper and in this New Scientist article.

Photo: Denys J.C. Matthies / Daily Mail

Developed by a team of UC Berkeley students, Skintillates is a wearable technology that mimics tattoos.

When you think of temporary tattoos, you likely think of something that comes out of a gumball dispenser, or perhaps “art” that you got on a spring break trip. As interesting as those may be, Skintillates is taking things to the next level.

These “epidermal wearable interactive devices” can serve as everything from passive and active on-skin displays, to capacitive and resistive sensors for controlling gadgets, to strain gauges for posture detection.

A layered construction lets these designs stick to the skin, integrate electronics, and still show visible artwork to others. Those electronics can include sensors, or even LEDs. In at least one case, the lights are programmed to flash along with the beat of music, driven by an Arduino hidden under the wearer’s clothing.
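As a rough sketch of how a music-reactive tattoo LED could be driven, the snippet below flashes an LED whenever the signal from a sound-sensor module jumps well above its running average. The pins and the simple envelope threshold are illustrative assumptions, not the authors' code.

// Assumed wiring: a sound-sensor module on A0, the tattoo's LED on pin 9.
const int MIC_PIN = A0;
const int LED_PIN = 9;
long baseline = 0;

void setup() {
  pinMode(LED_PIN, OUTPUT);
}

void loop() {
  int level = abs(analogRead(MIC_PIN) - 512);   // rectify around mid-scale
  baseline = (baseline * 15 + level) / 16;      // slow-moving average loudness
  if (level > baseline * 2 + 20) {              // a loud transient, roughly a beat
    digitalWrite(LED_PIN, HIGH);                // flash the tattoo LED
    delay(60);
    digitalWrite(LED_PIN, LOW);
  }
}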

Just like the traditional temporary tattoos often worn by children and adults alike, Skintillates flex naturally with the user’s skin. Our simple fabrication technique also enables users to freely design and print with a full range of colors to create application-specific customized designs.

You can find more on this project on the Hybrid Ecologies Lab page and read the team’s entire paper here.

(Photos: Eric Paulos)


Cosmic Bitcasting is a digital art and science project born from the idea of connecting the human body with the cosmos: a wearable device with embedded light, sound, and vibration provides sensory information on the invisible cosmic radiation that surrounds us. This open-source project works by detecting secondary muons, generated when cosmic rays hit the Earth’s atmosphere, as they pass through the body.

Artist Afroditi Psarra and experimental physicist Cécile Lapoire worked together to develop a prototype of the wearable cosmic ray detector during a one-month residency at Etopia in Zaragoza. The prototype is currently on display at the Etopia Center for Art and Technology as part of the exhibition REVERBERADAS.


Cosmic Bitcasting consists of an Arduino LilyPad, Karl Grimm High Flex 3981 conductive thread (7×1 bare copper), Pure Copper Polyester Taffeta fabric from Less EMF, white SMD LEDs, a coin cell vibration motor, and an IRL3103 MOSFET with a 100 Ohm resistor to drive the motor.
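The project’s firmware isn’t included in the post, but the feedback side could look roughly like the sketch below. It assumes the muon detector delivers a short digital pulse on pin 2 and that the SMD LEDs and the IRL3103 gate sit on pins 5 and 6; all three pin choices are hypothetical.

// Feedback side only, under assumed wiring: detector pulse on pin 2,
// SMD LEDs on pin 5, IRL3103 MOSFET gate (vibration motor) on pin 6.
const int DETECTOR_PIN = 2;
const int LED_PIN      = 5;
const int MOTOR_PIN    = 6;
volatile bool hit = false;

void onMuon() { hit = true; }   // called from the external interrupt

void setup() {
  pinMode(DETECTOR_PIN, INPUT);
  pinMode(LED_PIN, OUTPUT);
  pinMode(MOTOR_PIN, OUTPUT);
  attachInterrupt(digitalPinToInterrupt(DETECTOR_PIN), onMuon, RISING);
}

void loop() {
  if (hit) {
    hit = false;
    digitalWrite(LED_PIN, HIGH);   // flash the LEDs for each detected muon
    analogWrite(MOTOR_PIN, 180);   // buzz the coin-cell motor via the MOSFET
    delay(150);
    digitalWrite(LED_PIN, LOW);
    analogWrite(MOTOR_PIN, 0);
  }
}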

Intrigued? Take a look at the video below and read the diary of the residency to learn more!

 


Leyla is an interactive niqab that reveals facial expressions on textile by recreating the movement of the facial muscles involved in smiling and frowning. The project was created by Patrizia Sciglitano and sent to us through our blog submission form. We got in touch with her to learn more about it.

How did you come to start working on this project?

I started my BA graduation project in February 2012. I’m not Muslim, but I’ve always lived in an environment influenced by Islamic culture and I’ve been fascinated by it. Some months ago I participated in a workshop in Prato about wearable technology with Riccardo Marchesi of Plug&Wear, and I began to understand this new technology and to find real answers to my questions.

Leyla - circuit diagram

How does it work?
Leyla’s circuit is composed of two facial-muscle sensors that detect micro facial movements. The Arduino LilyPad receives data from them and drives the Nitinol (muscle) wires sewn into the fabric, which curl the textile to recreate the expressions hidden under the veil.
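As a simplified illustration of that loop, the sketch below reads two analog muscle sensors and briefly heats the corresponding Nitinol wire through a transistor when a contraction is detected. The pins, the threshold, and the pulse timing are hypothetical, and real muscle wire needs careful current limiting and duty-cycling.

// Simplified sketch under assumed wiring: two muscle sensors on A2/A3,
// two transistor-switched Nitinol wires on pins 9/10.
const int SENSOR_SMILE = A2;
const int SENSOR_FROWN = A3;
const int WIRE_SMILE   = 9;
const int WIRE_FROWN   = 10;
const int THRESHOLD    = 400;   // illustrative value; needs per-wearer tuning

void pulseWire(int pin) {
  digitalWrite(pin, HIGH);      // heat the muscle wire so it contracts
  delay(800);                   // keep the pulse short to avoid overheating
  digitalWrite(pin, LOW);
}

void setup() {
  pinMode(WIRE_SMILE, OUTPUT);
  pinMode(WIRE_FROWN, OUTPUT);
}

void loop() {
  if (analogRead(SENSOR_SMILE) > THRESHOLD) pulseWire(WIRE_SMILE);
  if (analogRead(SENSOR_FROWN) > THRESHOLD) pulseWire(WIRE_FROWN);
  delay(50);
}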

Leyla - inside

Have you had any reactions yet from women who wear the veil?

I kept working on my research project while attending an association for non-EU women in my city, run by a Muslim friend I’ve known since childhood. I met several women there, both young and old, who helped me better understand their culture. I explained the project to them and, from the very first concept ideas, I received positive feedback.
Designers rarely create accessories suited to their needs, and through this object they could gain more “emotional communication” capability while maintaining their modesty; this new opportunity made them very happy.
They were intrigued both by the new technology I showed them (muscle wires) and by how I was materializing my idea of communication. The Muslim women thought the idea was very cool: it was a chance to give voice to a new way of communicating their emotions without needing to “undress”.

So far I haven’t had the chance to test “Leyla” in Saudi Arabia, although I would love to in the future. Thanks to a friend of mine, however, I was able to show “Leyla” to some women wearing the niqab who were staying in Istanbul on an Erasmus program: they even asked me if I was selling it!

——

In the video and picture below you can see the result, from left to right: relaxed muscle, contracted muscle (smile), relaxed muscle, contracted muscle (anger).

Leyla - expressions


