
Planet Arduino

Archive for the ‘Processing’ Category

This Arduino-based project creates interesting tumbling patterns using a system that tilts a plane in a controlled manner while deforming its surface.

NEOANALOG, a “studio for hybrid things and spaces,” was commissioned to build the Particle Flow installation, which explores how granules tumble under the control of gravity. The mechanism takes the form of a large hexagon held at three corners by linkages pushed up and down by NEMA 24 stepper motors. As these rods are lifted, the granules inside the “arena” are steered toward the opposite side.

Inside the main hexagon are 19 smaller hexagons, each controlled by a servo to lift an individual section of the rolling surface up and down. The entire system is controlled by a PC running Processing, which sends commands over Ethernet to an Arduino Mega for the servos, and to an Arduino Uno with three motor drivers for the steppers.
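To get a feel for the geometry, here is a minimal sketch (ours, not NEOANALOG's code) of how a controller might compute the three rod heights needed to tilt a plate toward a chosen direction. The corner azimuths, radius and units are assumptions for illustration:

```python
import math

def corner_heights(tilt_deg, azimuth_deg, radius=0.5, base=0.2):
    """Heights (m) for three lift rods supporting a plate at 120-degree spacing.

    The corner facing `azimuth_deg` (the downhill direction) is lowered and
    the opposite side raised, producing a uniform tilt of `tilt_deg`.
    Geometry and names are illustrative, not the installation's actual code.
    """
    slope = math.tan(math.radians(tilt_deg))
    heights = []
    for corner in (90, 210, 330):  # corner azimuths in degrees
        delta = math.radians(corner - azimuth_deg)
        heights.append(base - slope * radius * math.cos(delta))
    return heights
```

Because the cosines of three angles 120° apart sum to zero, the average plate height stays constant while it tilts, which keeps the granules from being lifted or dropped as a whole.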

A moving slanted plane and a grid of motorized stamps control the elements to form endless variations of behaviors and patterns. The result is a zen-like experience that is both fascinating and contemplative. Software-controlled motion follows a complex choreography and enables precise steering of the physical particles in a variety of ways: from subtle to obvious, from slow to fast-paced, from seemingly random to symmetric.

Intrigued? Be sure to check out Creative Applications Network’s write-up on this piece as well as NEOANALOG’s page for more details.


[Mr_GreenCoat] is studying engineering. His thermodynamics teacher agreed with the stance that engineering is best learned through experimentation, and tasked [Mr_GreenCoat]’s group with the construction of a vacuum chamber to prove that the boiling point of a liquid goes down with the pressure it is exposed to.

His group used black PVC pipe to construct their chamber, with an air compressor generating the vacuum. The lid is a sheet of Lexan with a silicone disk. We’ve covered these sorts of designs before. Since a vacuum chamber will see at most roughly 14.7 psi (one atmosphere) of distributed load on the outside, there’s no real worry of their design going too horribly wrong.

The interesting part of the build is the hardware and software used to boil the water and log the temperatures and pressures. Science isn’t done until something is written down, after all. They have a power resistor and a temperature probe inside the chamber, and the temperature over time is logged using an Arduino and a bit of Processing code.

In the end their experiment matched what they had been learning in class. The current laws of thermodynamics are still in effect — all is right in the universe — and these poor students can probably save some money and get along with an old edition of the textbook. Video after the break.
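The effect the students measured is easy to estimate on paper. Here is a rough sketch using the Clausius–Clapeyron relation, referenced to water boiling at 100 °C at one atmosphere; treating the latent heat as constant is a simplifying assumption, so expect a degree or two of error:

```python
import math

def water_boiling_point_c(pressure_kpa):
    """Estimate water's boiling point (deg C) at a given pressure using the
    Clausius-Clapeyron relation, referenced to 100 C at 101.325 kPa.
    A back-of-the-envelope model of the physics the experiment demonstrates."""
    L = 40660.0   # latent heat of vaporization of water, J/mol (approx.)
    R = 8.314     # gas constant, J/(mol*K)
    t0, p0 = 373.15, 101.325  # reference boiling point (K) and pressure (kPa)
    t = 1.0 / (1.0 / t0 - (R / L) * math.log(pressure_kpa / p0))
    return t - 273.15
```

At half an atmosphere this predicts water boiling at roughly 81 °C, which is the kind of drop the group's temperature log should show as the compressor pulls the chamber down.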


Filed under: Arduino Hacks, tool hacks


Data Cocktail is a device, running on an Arduino Due and an Arduino Pro Mini, that translates Twitter activity into a tasty drink. When you want a cocktail, the machine looks for the five latest messages around the world quoting one of the available ingredients. These messages define the drink’s composition, and Data Cocktail not only provides a unique kind of drink, it also prints the cocktail’s recipe along with the corresponding tweets.
Once the cocktail mix is done, Data Cocktail thanks the tweeters who have unknowingly helped make the recipe. Check the video below to see how it works:
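A minimal sketch of what that selection step could look like (our illustration, not the builders' code): scan the newest tweets, keep the first five that mention a known ingredient, and tally one dose per matching tweet:

```python
def drink_from_tweets(tweets, ingredients, parts=5):
    """Sketch of a Data Cocktail-style selection rule (assumed, not the
    authors' code): walk the most recent tweets, keep the first `parts`
    that mention a known ingredient, and count one dose of that
    ingredient per matching tweet."""
    recipe = {}   # ingredient -> number of doses
    used = []     # the tweets that shaped the drink (to print and thank)
    for tweet in tweets:                      # assumed newest-first order
        text = tweet.lower()
        for ing in ingredients:
            if ing in text:
                recipe[ing] = recipe.get(ing, 0) + 1
                used.append(tweet)
                break                         # one ingredient per tweet
        if len(used) == parts:
            break
    return recipe, used
```

Keeping the matched tweets around is what lets the machine print the recipe alongside the messages and thank their authors afterwards.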

Data Cocktail was created during a workshop held at Stereolux in Nantes by a team composed of Bertille Masse, Manon Le Moal-Joubel, Sébastien Maury, Clément Gault & Thibaut Métivier.

They made it using Processing and Arduino:

A first application, developed in Processing, pilots the device. The requests are performed using the Twitter4J library, then the application processes the data and controls the device, i.e. the robot, the solenoid valves and the light. The robot itself is based on a modified Zumo frame, an Arduino Pro, a Motor Shield and a Bluetooth module. The solenoid valves and the LEDs are controlled by an Arduino Due connected via USB. The printing is handled by Automator.

To prepare a cocktail, the machine can take up to a minute and may combine up to six different ingredients!

Mar
04


‘Social Vibes’ is a Master’s degree (MSc) project in Interactive Media by Cian McLysaght at the University of Limerick, Ireland. He shared the project with us: running on an Arduino Uno, it is built around a physical artifact designed and created specifically for the installation, adopting the fundamental sound mechanisms of a vibraphone, also known as a ‘vibe’:

The instrument consists of twelve musical tones of different pitches. The music created on the instrument is derived from a continuous stream of input from multiple users on Twitter, along with explicit interaction from Twitter users tweeting the instrument directly at the project’s “@vibe_experiment” account. Data associated with the emotional status of Twitter users is mined from the network via Twitter’s open application programming interface (API).

For example, if a user tweets “The sun is out, I’m happy”, the code I’ve written will strip out key words and strings associated with the user’s emotional state within the tweet, i.e. “I’m happy”, and translate this to a musical notation. Mining Twitter’s API allows a continuous stream of data. These emotional states are then mapped to specific notes on the physical instrument, located in a public space. The tempo of the musical expression is based entirely upon the speed and volume of the incoming tweets on the Twitter API.
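A toy version of that keyword-to-note mapping might look like the following; the phrases and tone numbers are made up for illustration and are not mined from the actual project:

```python
# Hypothetical mapping in the spirit of the Vibe: phrases tied to an
# emotional state are matched in a tweet's text and mapped to one of the
# instrument's twelve tones (phrase list and tone indices are assumptions).
EMOTION_NOTES = {
    "i'm happy": 0, "i am happy": 0,
    "i'm sad": 7, "i am sad": 7,
    "i'm excited": 4,
    "i'm angry": 10,
}

def notes_for_tweet(tweet):
    """Return the tone indices (0-11) triggered by phrases in a tweet."""
    text = tweet.lower()
    return [note for phrase, note in EMOTION_NOTES.items() if phrase in text]
```

Tweets with no recognized phrase simply trigger nothing, which matches the description of tempo rising and falling with the volume of matching tweets.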

Twitter users, whether or not they follow the instrument’s account (@vibe_experiment), can tweet directly to the instrument, and this direct interaction is given precedence, allowing users who tweet directly to have their emotional state ‘played’. This lets users hijack or take over the instrument and experiment with it in a playful manner, but also gives those with musical knowledge the potential to compose simple musical arrangements. When users are not tweeting the instrument directly, it reverts to mining the Twitter API.

To entice users to interact and observe the instrument in action, there is a live streaming broadcast of the instrument via Twitcam on the @vibe_experiment account. Twitcam is Twitter’s built-in live-streaming platform, requiring only a webcam and a valid Twitter account.

The instrument constantly tweets updates to its own Twitter account, not only to report its general status but also to encourage users to interact directly with the ‘Vibe’.

Feb
28

Computers Playing Flappy Bird. Skynet Imminent. Humans Flapping Arms.

android hacks, arduino, arduino hacks, Flappy Bird, kinect, misc hacks, Processing Comments Off on Computers Playing Flappy Bird. Skynet Imminent. Humans Flapping Arms. 


After viral popularity, developer rage quits, and crazy eBay auctions, the world at large is just about done with Flappy Bird. Here at Hackaday, we can’t let it go without showcasing two more hacks. The first is the one that we’ve all been waiting for: a robot that will play the damn game for us. Your eyes don’t deceive you in that title image. The Flappy Bird bot is up to 147 points and going strong. [Shi Xuekun] and [Liu Yang], two hackers from China, have taken full responsibility for this hack. They used OpenCV with a webcam on Ubuntu to determine the position of both the bird and the pipes. Once positions are known, the computer calculates the next move. When it’s time to flap, a signal is sent to an Arduino Mega 2560. The genius of this hack is the actuator. Most servos or motors would have been too slow for this application. [Shi] and [Liu] used the Arduino and a motor driver to activate a hard drive voice coil. The voice coil was fast enough to touch the screen at exactly the right time, but not so powerful as to smash their tablet.
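The core decision is simple once the vision side has done its work. A toy version of the flap logic (thresholds invented for illustration, not [Shi] and [Liu]'s code) might be:

```python
def should_flap(bird_y, gap_center_y, margin=10):
    """Decide whether to fire the voice coil, given the bird's vertical
    position and the centre of the next pipe gap, in screen pixels with
    y growing downward. A toy stand-in for the decision an OpenCV
    pipeline could feed to the Arduino; the margin is made up."""
    # The bird only rises when it flaps, so flap whenever it has sunk
    # more than `margin` pixels below the gap centre.
    return bird_y > gap_center_y + margin
```

The hard part, as the hackers found, is not this comparison but closing the loop fast enough, which is exactly why a hard drive voice coil beats a hobby servo here.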

If you would like to make flapping a bit more of a physical affair, [Jérémie] created Flappy Bird with Kinect. He wrote a quick Processing sketch which uses the Microsoft Kinect to look for humans flapping their arms. If flapping is detected, a command is sent to an Android tablet. [Jérémie] initially wanted to use Android Debug Bridge (ADB) to send the touch commands, but found it was too laggy for this sort of hardcore gaming. The workaround is to use a serial-connected Arduino as a mouse. The Processing sketch sends a ‘#’ to the Arduino via serial. The Arduino then sends a mouse click to the computer, which is running hidclient. Hidclient finally sends Bluetooth mouse clicks to the tablet. Admittedly, this is a bit of a Rube Goldberg approach, but it does add an Arduino to a Flappy Bird hack, which we think is a perfect pairing.

[Thanks Parker!]


Filed under: Android Hacks, Arduino Hacks, misc hacks
Nov
24

Hack-a-Day Logo Laser Light Show

arduino hacks, laser, laser hacks, logo, Processing, scanning, servo, software hacks Comments Off on Hack-a-Day Logo Laser Light Show 


The Hack-a-Day logo challenge keeps on bearing fruit. This tip comes from [Enrico Lamperti] from Argentina, who posted his missteps as well as his success creating a Hack-a-Day logo using a home-built scanning laser projector.

The build consists of a couple of small servos, a hacked-up pen laser, and an Arduino with some stored coordinates to draw out the image. As usual, the first challenge is powering external peripheral devices like servos. [Enrico] tackled this problem using six Ni-MH batteries and an LM2596 Simple Switcher power converter. The servos and Arduino get power directly from the battery pack, and the Arduino controls the PWM signals to the servos as they trace out the stored coordinate data. The laser is mounted on the servo assembly and is engaged and powered from an Arduino pin via an NPN transistor. He also incorporated a potentiometer to adjust the servo calibration point.

His first attempt, using coordinate data generated by a Python script, was not very successful. He later used Processing with an SVG file to turn a click-made path into map data the Arduino could use to draw the Hack-a-Day logo. Photographing the completed drawing requires a long exposure in a dark room, but the results are impressive.

It’s an excellent project where [Enrico] shares what he learned about using Servo.writeMicroseconds() instead of Servo.write() for performance along with several other tweaks. He also shared the BOM, Fritzing diagram, Processing Creator and Simulator tools and serial commands on GitHub. He wraps up with some options that he thinks would improve his device, and he requests any help others may want to provide for better performance. And if you want you could step it up a notch and create a laser video projector with an ATMega16 AVR microcontroller and some clever spinning tilted mirrors.
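For readers wondering why writeMicroseconds() helps: Servo.write() quantizes to whole degrees, while writeMicroseconds() addresses the pulse width directly. A quick sketch of the degree-to-pulse conversion, using the Arduino Servo library's stock 544–2400 µs range:

```python
def angle_to_microseconds(angle_deg, min_us=544, max_us=2400):
    """Convert a servo angle (0-180 degrees) to a pulse width in
    microseconds, mirroring the Arduino Servo library's default range
    (544-2400 us). Servo.writeMicroseconds() accepts such values
    directly, giving roughly 10x finer steps than write()'s whole
    degrees - which matters when tracing a logo with a laser dot."""
    angle = max(0.0, min(180.0, angle_deg))  # clamp to the valid range
    return min_us + (max_us - min_us) * angle / 180.0
```

With about 1856 µs spanning 180°, each whole degree is roughly 10 µs of pulse width, so driving the pulse width directly lets the sketch interpolate positions between the coarse steps write() can reach.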


Filed under: Arduino Hacks, laser hacks, software hacks
Oct
07

An Arduino “Radar” Installation

arduino, Processing, project, radar, Range Finder, Robotics, ultrasonic Comments Off on An Arduino “Radar” Installation 

An ultrasonic range finder mounted on a servo motor, controlled by an Arduino, with a Processing radar-like visualisation.

Read more on MAKE

Jul
15

Meet the maker – Afroditi experiments with embroidery, soft circuits and diy electronics

arduino, embroidery, Interview, Lilypad, MakerFaire, music, Processing, Synthesizer Comments Off on Meet the maker – Afroditi experiments with embroidery, soft circuits and diy electronics 

afroditi psarra

The work of Afroditi Psarra includes experimentation with embroidery, soft circuits and DIY electronics. I got in touch with her after discovering she was holding a workshop in Barcelona on sound performance using the LilyPad Arduino, along with a really cool embroidered synthesizer (…and also submitting her project to Maker Faire Rome!).

Even though her background is in fine arts, she got interested in creative ways of expression as a little girl: she was lucky enough to have all sorts of after-school activities that included painting and theater games, but also learning how to program using LOGO and QBasic. That was in the days of black-and-white terminals and MS-DOS commands:

I still remember the excitement of not knowing what to expect at the opposite side of the screen. So for me, technology has always been a major part of my life.

Lilytron

Below you can find my questions to her:

Zoe Romano: How did you start mixing art, technology and craft?
Afroditi Psarra: I had the chance to spend a year in Madrid as an ERASMUS student, and there I encountered the work being done at Medialab-Prado and had my first physical media art experience at the “The making of Balkan Wars: The Game” exhibition. Two years later I went back to Madrid to do a post-graduate course on Image, Technology and Design, and there I got familiar with Processing. I started working on interactive applets, but after some time I felt like I was missing the manual, hands-on labour of creating, so while I was coding I was also working on simple embroideries oriented around women and technology. These embroidery skills were passed on to me by my grandmother, who taught me everything about knitting.

How did you get to know Arduino?
At the various media art workshops that I attended at the Medialab-Prado I was always hearing about Arduino, but for me electronics was something totally unknown and was always connected to robotics and automation processes. About two years ago a friend and very talented media-artist, Maria Varela, who was studying in London told me that she had attended a LilyPad Arduino workshop and that this was an Arduino implementation designed to be used with conductive threads instead of wires.

I was really excited by the idea that this would allow me to combine my work in embroidery with coding, so I bought myself a kit and started to experiment with some basic examples and tutorials I found in Instructables and started to follow the work of Hannah Perner-Wilson (Plusea, Kobakant), Lynne Bruning and Becky Stern. At the time I was still living in Madrid so me and another girl from Medialab, Francesca Mereu, formed a small group called SmartcraftLab and posted our experiments on-line.

Lilykorg

I remember that one of my first experiments was using the conductive thread as a pressure sensor that created tones, and when I heard that primitive digital sound I instantly felt that it was something that I wished to explore further. I think that this interest in physical computing, e-textiles and sound brought all of the things that I was working on earlier together, and the Arduino allowed me to do that.

As for the production of my projects, it is always done by me, but I often look to the Arduino community for solutions to problems that I may encounter and ask for other people’s help on hardware and software issues. I do not see myself as a very skilled programmer just yet, but I certainly am evolving. After all, I believe that workshops, hands-on experience and collaborations with other people are the things that allow you to grow as a Maker.
afroditi psarra

A couple of years ago Paola Antonelli, senior curator in the Department of Architecture and Design at the Museum of Modern Art, said “The two most important introductions for art in the past 20 years have been the Arduino and Processing”, how do you see it?

I totally agree with the quote. Processing and Arduino are the two things that have allowed artists with no previous background in computing and electronics to work with tools that were only available to specialists before. These two languages have created a tendency towards interactive art, and we are now experiencing a revolution in DIY digital fabrication, hacking and tinkering on so many different levels. I think that the increasing spread of Medialabs, Hackerspaces and Fab Labs around the world is living proof of that.

In which ways are you experimenting with the Lilypad?
The LilyPad has allowed me to explore the relation between crafts connected with women’s labour such as knitting, sewing and embroidery, with electronics and creative coding, as well as the creation of soft interfaces of control. In my project Lilytronica I am currently using the LilyPad to create my own embroidered synthesizers that I use to perform live.

Considering that the LilyPad is not designed for creating sound, and you only have digital outputs and 8 MHz clock speed, the result is a very rough, primitive sound quality, which I personally like a lot. In my interactive performance Idoru() I am exploring the body as an interface of control of sound though the use of wearables. In this project the LilyPad acts as a controller, and the sound is produced in SuperCollider.
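That rough sound comes from bit-banging square waves on a digital pin. A small sketch of the timing arithmetic such a LilyPad synth might use with delayMicroseconds() (our illustration, not Afroditi's code):

```python
def half_period_us(freq_hz):
    """Microseconds to hold a digital pin high (then low) to square-wave
    a tone at freq_hz - the bit-banging approach a LilyPad sketch might
    pair with digitalWrite() and delayMicroseconds(). Purely digital
    output means a square wave, hence the raw, primitive timbre."""
    return int(round(1_000_000 / (2 * freq_hz)))
```

Rounding to whole microseconds, plus the overhead of an 8 MHz loop, slightly detunes each note, which is part of why the result sounds so pleasingly rough.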

Idoru - data flow

I am also participating in conferences around open source technologies and organizing workshops on e-textiles and the use of the LilyPad, because I want to transmit my passion and because I want to get more people involved in this exciting new artistic field.

Are you releasing your work in open source?
Ever since I started to work with the Arduino I try to publish my work on-line so that I can have feedback on everything and until now I have been releasing the code on my personal website, but I am thinking of creating a Github account and releasing the code there so that everything is easily accessed by anyone interested. I firmly believe in releasing one’s work in open source, because this way you can evolve your work more rapidly and share your creation process with other like-minded individuals.

Where do you see wearable computing most interesting developments going towards?
I think it is a bit early to tell. Technology evolves at a very fast pace, and multinationals sometimes reject certain developments because of their lack of economic interest. Seeing all the fuss around Google Glass, one would argue that wearable computing is heading towards connecting the physical body with the Internet of Things. I personally feel that we can certainly expect developments around wearables and locative media, and various medical applications.

Noisepad

For now, the most interesting applications in wearables are around fashion, art and music, and they require a certain craftsmanship to be made. As Kobakant argue in their paper ”Future Master Craftsmanship: where we want electronic textile crafts to go“  we never know what can happen when industrial automation kicks in. When our skills become devalued because machines can produce work faster, cheaper and better, we will still enjoy the craft process. But instead of sitting back to become E-Textile grandmothers, perhaps competition from the automated machines will encourage us to move on.

Pictures courtesy of Afroditi Psarra

May
29

An interactive installation showing the exciting diversity of a city

arduino, installation, Interaction Design, music, music installation, Processing Comments Off on An interactive installation showing the exciting diversity of a city 

Global Sounds

Global Sounds is an interactive installation by Rebecca Gischel. It consists of a series of acrylic-glass pyramids installed in a square in Edinburgh, each programmed to play a different instrumental section of a song when interacted with.

The composition, which was written especially for the project and includes a mix of instruments symbolic of different cultures, such as the koto and didgeridoo, alludes to the multicultural richness migrants have brought to the UK and Europe by bringing parts of their own culture with them.



The song combines seven instruments across seven pyramids. At first, none of the instruments plays. When someone stands beside a pyramid, one instrument starts to play; the more people come together, the more instruments join in. Each pyramid has a light bulb inside which acts as an equalizer for one instrument. When at least seven people play with the installation together, the square becomes a play of sound and light, and with all pyramids working together they compose a harmonious musical piece in its entirety.
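That layering rule is simple to sketch (our illustration of the behavior described, not Rebecca's code):

```python
def ensemble_state(occupied):
    """Given per-pyramid presence flags (booleans from the webcam blob
    detection), return which instrument stems should play and whether
    the full seven-piece ensemble is active. Names are illustrative."""
    playing = [i for i, present in enumerate(occupied) if present]
    full_ensemble = len(playing) == len(occupied)
    return playing, full_ensemble
```

Each returned index maps one-to-one to an instrument stem and to the light bulb pin for that pyramid, so sound and light always agree.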

Rebecca wrote us:

I used the Arduino Uno. I have one webcam with a fisheye lens on top of each pyramid. I used the Flob library with Processing to detect if someone is standing beside a pyramid. If so, one instrument starts to play and Processing sends the digital values of the equalizer to the Arduino (photo ‘Arduino picture 1′, this was my first test of the equalizer with normal LEDs). I wanted to use real light bulbs instead of LEDs, so I built a transformer (photo ‘Arduino picture 3′) which translates the Arduino input into a 12V output for the light bulbs, which are powered by a car battery. I used 7 pins, one pin per light bulb.

Arduino - picture 1

Arduino - picture 3


