
Planet Arduino

Archive for the ‘Processing’ Category

A lot of consumer gadgets use touch sensors now. Touch sensing is a cheap and reliable way to replace a variety of knobs and switches on everything from headphones to automobiles. However, creating a custom touch controller for a one-off project can be daunting. A recent ACM paper shows how just about any capacitive sensor can work as a multitouch sensor with nothing more than an Arduino, although a PC running Processing interprets the data for higher-level functions.

The key is that the Arduino excites the grid using PWM and then examines the signal coming out of the grid. A finger poking at the grid changes the response quite a bit, and the Arduino can sense it using its onboard analog-to-digital converters. You can find the actual software kit online. The tutorial document is probably more interesting than the ACM paper if you only want to use the kit.

The optimum drive frequency is 10 MHz, and the examples rely on harmonics of a lower-frequency PWM signal to get there. The analog conversion, of course, isn't that fast, but since your finger's touch rate is relatively slow, they treat the signal as an amplitude-modulated input, which is very easy to decode.
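To make that concrete, here is a toy sketch of the excite-and-measure loop for a single sensor line. The pin choices are hypothetical, and the real kit scans a whole grid with a much faster PWM carrier than analogWrite()'s default; this is just the shape of the idea.

// Toy sketch of the excite-and-measure loop for one sensor line.
// Pins are hypothetical; the actual kit multiplexes a grid and
// runs the PWM far faster than analogWrite()'s default rate.
const int TX_PIN = 9;   // PWM excitation out to one grid line
const int RX_PIN = A0;  // coupled signal back from a crossing line

void setup() {
  pinMode(TX_PIN, OUTPUT);
  analogWrite(TX_PIN, 128);  // 50% duty square wave, rich in harmonics
  Serial.begin(115200);
}

void loop() {
  // A finger changes the coupling, so the received amplitude shifts.
  // Estimate peak-to-peak amplitude over a short sampling window.
  int lo = 1023, hi = 0;
  for (int i = 0; i < 200; i++) {
    int v = analogRead(RX_PIN);
    if (v < lo) lo = v;
    if (v > hi) hi = v;
  }
  Serial.println(hi - lo);  // envelope value; the PC side does the rest
  delay(10);
}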

The sensors can be conductive ink, thread, or copper strips. There are several example applications, including a 3D printed bunny you can pet, a control panel on a sleeve, and an interactive greeting card.

The sensor forms an image and OpenCV detects the actual touch configuration. It appears you can use the raw data from the Arduino, too, but it might be a little harder.

We imagine aluminum foil would work with this technique. If you get to the point of laying out a PCB, this might come in handy.

All the cool projects can connect to a computer or phone for control now, right? But it is a pain to create an app that runs on several different platforms just to talk to your project. [Kevin Darrah] says you don't have to, and shows how you can use Google Chrome to do the dirty work. He takes a garden-variety Arduino and a cheap Bluetooth interface board and then controls it from Chrome. You can see the video below.

The HM-10 board is cheap and can connect to nearly anything. The control application uses Processing, the language the Arduino IDE derives from. So how do you get from Processing to Chrome? Easy: the p5.js library brings Processing-style sketches into the browser, and there's also a Bluetooth BLE library for p5.

Once you know about those libraries, you can probably figure the rest out, but [Kevin] shows a nice example that you could easily replicate. The Arduino and Bluetooth code aren't very hard to follow. The Processing program looks a lot like an Arduino program, with a setup and a loop function, but it also has canvases, buttons, and other things you don't usually find in an Arduino sketch.
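For flavor, the Arduino end of a setup like this can be only a few lines. The sketch below is a guess at the pattern, not [Kevin]'s actual code: it assumes the HM-10 is wired to pins 2 and 3, and the single-character commands are invented.

#include <SoftwareSerial.h>

// Hypothetical receiver: the browser sends '1' or '0' over BLE and
// the HM-10 relays the bytes out its serial port to the Arduino.
SoftwareSerial ble(2, 3);  // RX, TX — HM-10 TX to pin 2, RX to pin 3

void setup() {
  pinMode(LED_BUILTIN, OUTPUT);
  ble.begin(9600);  // the HM-10 default baud rate
}

void loop() {
  if (ble.available()) {
    char c = ble.read();
    if (c == '1') digitalWrite(LED_BUILTIN, HIGH);       // "on" from Chrome
    else if (c == '0') digitalWrite(LED_BUILTIN, LOW);   // "off" from Chrome
  }
}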

It is surprisingly easy to create a Chrome app that talks to your hardware. Our usual go-to for phone apps is Blynk. We even used it as a joystick for a robot.

This Arduino-based project creates interesting tumbling patterns using a system that tilts a plane in a controlled manner while deforming its surface.

NEOANALOG, a “studio for hybrid things and spaces,” was commissioned to build the Particle Flow installation, which explores how granules tumble under the control of gravity. The mechanism takes the form of a large hexagon held at three corners by linkages pushed up and down by NEMA 24 stepper motors. As these rods are lifted, the granules inside the “arena” are steered over to the opposite side, producing a zen-like experience.

Inside the main hexagon are 19 smaller hexagons, each controlled by servos to lift an individual section of the rolling surface up and down. Control of the entire system is accomplished via a PC running Processing, which sends commands over Ethernet to an Arduino Mega for the servos and to an Arduino Uno with three motor drivers for the steppers.
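The write-ups don't include firmware, but the Mega's half of that arrangement might look something like this minimal sketch, assuming one UDP packet carrying a position byte for each of the 19 hexagons. The MAC, IP, port, and packet format are all invented for illustration.

#include <SPI.h>
#include <Ethernet.h>
#include <EthernetUdp.h>
#include <Servo.h>

// Hypothetical receiver: Processing sends one UDP packet per frame
// with a 0-255 position byte for each small hexagon.
const int NUM_SERVOS = 19;          // one per small hexagon
Servo servos[NUM_SERVOS];
byte mac[] = {0xDE, 0xAD, 0xBE, 0xEF, 0xFE, 0xED};
EthernetUDP udp;

void setup() {
  Ethernet.begin(mac, IPAddress(192, 168, 1, 50));
  udp.begin(8888);
  for (int i = 0; i < NUM_SERVOS; i++) {
    servos[i].attach(22 + i);       // Mega pins 22-40
  }
}

void loop() {
  if (udp.parsePacket() >= NUM_SERVOS) {
    byte positions[NUM_SERVOS];
    udp.read(positions, NUM_SERVOS);
    for (int i = 0; i < NUM_SERVOS; i++) {
      servos[i].write(map(positions[i], 0, 255, 0, 180));  // byte to angle
    }
  }
}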

A moving slanted plane and a grid of motorized stamps control the elements to form infinite variations of behaviors and patterns. The result is a zen-like experience that is both fascinating and contemplative. Software-controlled motion follows a complex choreography and enables precise steering of physical particles in a variety of ways: from subtle to obvious, from slow to fast-paced, from random-like to symmetric.

Intrigued? Be sure to check out Creative Applications Network’s write-up on this piece as well as NEOANALOG’s page for more details.


[Mr_GreenCoat] is studying engineering. His thermodynamics teacher agreed with the stance that engineering is best learned through experimentation, and tasked [Mr_GreenCoat]’s group with the construction of a vacuum chamber to prove that the boiling point of a liquid goes down with the pressure it is exposed to.

His group used black PVC pipe to construct their chamber, and an air compressor to generate the vacuum. The lid is a sheet of Lexan with a silicone disk. We've covered these sorts of designs before. Since a vacuum chamber will, at most, see about 14.7 psi of distributed load on the outside, there's no real worry of their design going too horribly wrong.
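To put a number on that, here's the back-of-the-envelope check, assuming a hypothetical lid four inches in diameter:

F = P × A = 14.7 psi × π × (2 in)² ≈ 185 lbf

That sounds like a lot, but it's a distributed load over the whole lid, which is exactly the kind of loading a flat sheet of Lexan handles well.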

The interesting part of the build is the hardware and software built to boil the water and log the temperatures and pressures. Science isn't done until something is written down, after all. They have a power resistor and a temperature probe inside the chamber. The temperature over time is logged using an Arduino and a bit of Processing code.
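The logging side can be as simple as a sketch along these lines. We're assuming a TMP36-style analog probe on A0, since the write-up doesn't name the sensor; the Processing side just timestamps and plots whatever arrives.

// Minimal temperature logger, assuming a TMP36-style analog probe.
const int TEMP_PIN = A0;

void setup() {
  Serial.begin(9600);
}

void loop() {
  float volts = analogRead(TEMP_PIN) * 5.0 / 1023.0;
  float tempC = (volts - 0.5) * 100.0;  // TMP36: 10 mV per °C, 500 mV offset
  Serial.print(millis());               // timestamp for the logger
  Serial.print(",");
  Serial.println(tempC);                // CSV line for Processing to record
  delay(1000);                          // one sample per second
}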

In the end their experiment matched what they had been learning in class. The current laws of thermodynamics are still in effect — all is right in the universe — and these poor students can probably save some money and get along with an old edition of the textbook. Video after the break.


Filed under: Arduino Hacks, tool hacks


Data Cocktail is a device, running on an Arduino Due and an Arduino Pro Mini, that translates Twitter activity into a tasty drink. When you want a cocktail, the machine looks for the five latest messages around the world quoting one of the available ingredients. These messages define the drink's composition, and Data Cocktail not only provides a unique kind of drink, but also prints the cocktail's recipe along with the corresponding tweets.
Once the cocktail mix is done, Data Cocktail thanks the tweeters who unknowingly helped make the recipe. Check the video below to see how it works:

Data Cocktail was created in a workshop held at Stereolux in Nantes by a team composed of Bertille Masse, Manon Le Moal-Joubel, Sébastien Maury, Clément Gault, and Thibaut Métivier.

They made it using Processing and Arduino:

A first application, developed in Processing, pilots the device. The requests are performed using the Twitter4J library; the application then processes the data and controls the device, i.e. the robot, the solenoid valves, and the light. The robot itself is based on a modified Zumo frame, an Arduino Pro, a Motor Shield, and a Bluetooth module. The solenoid valves and the LEDs are controlled by an Arduino Due connected via USB. The printing is handled by Automator.

To prepare a cocktail, the machine can take up to a minute and may dispense up to six different ingredients!
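The Due's side of that split is easy to picture. Here's a hypothetical sketch with an invented serial command format ("v<valve> <milliseconds>") and pin mapping; the team's actual protocol isn't published.

// Hypothetical valve controller: Processing sends 'v' followed by a
// valve index and an open time in milliseconds over USB serial.
const int NUM_VALVES = 6;
const int VALVE_PINS[NUM_VALVES] = {2, 3, 4, 5, 6, 7};  // one pin per ingredient

void setup() {
  Serial.begin(115200);
  for (int i = 0; i < NUM_VALVES; i++) pinMode(VALVE_PINS[i], OUTPUT);
}

void loop() {
  if (Serial.available() && Serial.read() == 'v') {
    int valve = Serial.parseInt();   // which ingredient to pour
    int ms = Serial.parseInt();      // how long to hold the valve open
    if (valve >= 0 && valve < NUM_VALVES && ms > 0) {
      digitalWrite(VALVE_PINS[valve], HIGH);
      delay(ms);
      digitalWrite(VALVE_PINS[valve], LOW);
    }
  }
}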

Mar 04


‘Social Vibes’ is a master’s degree (MSc) project in Interactive Media by Cian McLysaght at the University of Limerick, Ireland. He shared the project with us: running on an Arduino Uno, it is a physical artifact designed and created specifically for an installation, adopting the fundamental sound mechanisms used in a vibraphone, also known as a ‘vibe’:

The instrument consists of twelve musical tones of different pitches. The music created on the instrument is derived from a continuous stream of input from multiple users on Twitter, plus the explicit interaction of Twitter users tweeting the instrument directly at the project’s “@vibe_experiment” Twitter account. Data associated with the emotional status of Twitter users is mined from the Twitter network via Twitter’s open application programming interface (API).

For example, if a user tweets “The sun is out, I’m happy”, the code I’ve written will strip out key words and strings associated with the user’s emotional state within the tweet, i.e. “I’m happy”, and translate this to musical notation. Mining Twitter’s API allows a continuous stream of data. These emotional states are then mapped to specific notes on the physical musical instrument, located in a public space. The tempo of the musical expression is based entirely upon the speed and volume of the incoming tweets from the Twitter API.

Twitter users, whether followers of the instrument’s Twitter account (@vibe_experiment) or not, can tweet directly to the instrument, and this direct interaction is given precedence, allowing users who tweet directly to have their emotional state ‘played’. This allows users to hijack or take over the instrument and experiment with it in a playful manner, but it also gives those with musical knowledge the potential to compose simple musical arrangements. When users are not tweeting the instrument directly, the instrument reverts to mining the Twitter API.
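The write-up doesn't include code, but the instrument end probably looks something like this: the Twitter-mining application sends a note index over serial, and the Uno pulses a striker on the matching tone bar. Everything here, pins and protocol included, is an assumption.

// Hypothetical striker driver: one byte per note arrives over serial.
const int NUM_NOTES = 12;
const int STRIKER_PINS[NUM_NOTES] = {2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13};

void setup() {
  Serial.begin(9600);
  for (int i = 0; i < NUM_NOTES; i++) pinMode(STRIKER_PINS[i], OUTPUT);
}

void loop() {
  if (Serial.available()) {
    int note = Serial.read();        // note index 0-11 from the PC
    if (note >= 0 && note < NUM_NOTES) {
      digitalWrite(STRIKER_PINS[note], HIGH);
      delay(20);                     // short pulse to strike the tone bar
      digitalWrite(STRIKER_PINS[note], LOW);
    }
  }
}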

To entice users to interact with and observe the instrument, there is a live streaming broadcast of it via Twitcam on the @vibe_experiment account. Twitcam is Twitter’s built-in live-streaming platform, and simply requires a webcam and a valid Twitter account.

The instrument constantly tweets updates back to its own Twitter account, not only to inform people of its general status but also to engage users to interact directly with the ‘Vibe’.

Feb 28

Computers Playing Flappy Bird. Skynet Imminent. Humans Flapping Arms.

android hacks, arduino, arduino hacks, Flappy Bird, kinect, misc hacks, Processing


After viral popularity, developer rage quits, and crazy eBay auctions, the world at large is just about done with Flappy Bird. Here at Hackaday, we can’t let it go without showcasing two more hacks. The first is the one that we’ve all been waiting for: a robot that will play the damn game for us. Your eyes don’t deceive you in that title image. The Flappy Bird bot is up to 147 points and going strong. [Shi Xuekun] and [Liu Yang], two hackers from China, have taken full responsibility for this hack. They used OpenCV with a webcam on Ubuntu to determine the position of both the bird and the pipes. Once positions are known, the computer calculates the next move. When it’s time to flap, a signal is sent to an Arduino Mega 2560. The genius of this hack is the actuator. Most servos or motors would have been too slow for this application. [Shi] and [Liu] used the Arduino and a motor driver to activate a hard drive voice coil. The voice coil was fast enough to touch the screen at exactly the right time, but not so powerful as to smash their tablet.
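The Arduino end of that chain can be almost trivially simple. Here's a hedged guess at the shape of the firmware; the trigger byte, pin, and pulse width are all invented, and getting the pulse width right is exactly the tuning [Shi] and [Liu] would have had to do.

// Hypothetical flapper: the PC's OpenCV script sends one byte per flap
// and the Mega pulses the motor driver feeding the voice coil.
const int COIL_PIN = 8;  // motor driver input (invented pin)

void setup() {
  Serial.begin(115200);
  pinMode(COIL_PIN, OUTPUT);
}

void loop() {
  if (Serial.available() && Serial.read() == 'f') {  // 'f' = flap now
    digitalWrite(COIL_PIN, HIGH);
    delay(5);                // just long enough to tap the screen
    digitalWrite(COIL_PIN, LOW);
  }
}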

If you would like to make flapping a bit more of a physical affair, [Jérémie] created Flappy Bird with Kinect. He wrote a quick Processing sketch that uses the Microsoft Kinect to look for humans flapping their arms. If flapping is detected, a command is sent to an Android tablet. [Jérémie] initially wanted to use the Android Debug Bridge (ADB) to send the touch commands, but found it was too laggy for this sort of hardcore gaming. The workaround is to use a serial-connected Arduino as a mouse. The Processing sketch sends a ‘#’ to the Arduino via serial, the Arduino sends a mouse click to the computer, which is running hidclient, and hidclient finally sends Bluetooth mouse clicks to the tablet. Admittedly, this is a bit of a Rube Goldberg approach, but it does add an Arduino to a Flappy Bird hack, which we think is a perfect pairing.

[Thanks Parker!]


Filed under: Android Hacks, Arduino Hacks, misc hacks
Nov 24

Hack-a-Day Logo Laser Light Show

arduino hacks, laser, laser hacks, logo, Processing, scanning, servo, software hacks


The Hack-a-Day logo challenge keeps on bearing fruit. This tip comes from [Enrico Lamperti] from Argentina, who posted his follies as well as his success at creating a Hack-a-Day logo using a home-built scanning laser projector.

The build consists of a couple of small servos, a hacked-up pen laser, and an Arduino with some stored coordinates to draw out the image. As usual, the first challenge is powering your external peripheral devices, like servos. [Enrico] tackled this problem using six Ni-MH batteries and an LM2596 simple switcher power converter. The servos and Arduino get power directly from the battery pack, and the Arduino controls the PWM signals to the servos as they trace out the stored coordinate data. The laser is mounted on the servo assembly and is engaged and powered from an Arduino pin via an NPN transistor. He also incorporated a potentiometer to adjust the servo calibration point.

His first attempt, using coordinate data generated with a Python script, was not very successful, but he later used Processing with an SVG file to turn a click-made path into map data the Arduino could use to draw the Hack-a-Day logo. Photographing the completed drawing requires a long exposure in a dark room, but the results are impressive.
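A boiled-down version of the drawing loop, using the writeMicroseconds() trick he mentions below, might look like the sketch here. The coordinate table is a tiny stand-in for his real path data, and the pins are assumptions.

#include <Servo.h>

// Stripped-down tracing loop; the point table stands in for the real path.
Servo panServo, tiltServo;
const int LASER_PIN = 4;            // drives the NPN transistor base

const int NUM_POINTS = 3;           // the real logo path has many more points
const int PAN_US[NUM_POINTS]  = {1200, 1500, 1800};
const int TILT_US[NUM_POINTS] = {1500, 1300, 1500};

void setup() {
  panServo.attach(9);
  tiltServo.attach(10);
  pinMode(LASER_PIN, OUTPUT);
}

void loop() {
  digitalWrite(LASER_PIN, HIGH);    // laser on while tracing
  for (int i = 0; i < NUM_POINTS; i++) {
    panServo.writeMicroseconds(PAN_US[i]);   // finer resolution than write()
    tiltServo.writeMicroseconds(TILT_US[i]);
    delay(15);                      // let the servos settle
  }
  digitalWrite(LASER_PIN, LOW);     // blank the beam on the way back
  delay(50);
}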

It’s an excellent project, in which [Enrico] shares what he learned about using Servo.writeMicroseconds() instead of Servo.write() for better performance, along with several other tweaks. He also shared the BOM, a Fritzing diagram, his Processing Creator and Simulator tools, and the serial commands on GitHub. He wraps up with some options he thinks would improve the device, and requests any help others may want to offer for better performance. And if you want, you could step it up a notch and create a laser video projector with an ATmega16 AVR microcontroller and some clever spinning tilted mirrors.


Filed under: Arduino Hacks, laser hacks, software hacks
Oct 07

An Arduino “Radar” Installation

arduino, Processing, project, radar, Range Finder, Robotics, ultrasonic

An ultrasonic range finder mounted on a servo motor, controlled by an Arduino, with a radar-like visualisation in Processing.
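The classic recipe for this kind of build, assuming an HC-SR04-style ultrasonic module (the project's exact parts aren't listed here), is a sweep-and-ping loop that prints angle and distance pairs for Processing to plot:

#include <Servo.h>

// Sweep-and-ping loop: one "angle,distance" line per position.
const int TRIG_PIN = 7;
const int ECHO_PIN = 8;
Servo sweepServo;

long readDistanceCm() {
  digitalWrite(TRIG_PIN, LOW);
  delayMicroseconds(2);
  digitalWrite(TRIG_PIN, HIGH);               // 10 µs trigger pulse
  delayMicroseconds(10);
  digitalWrite(TRIG_PIN, LOW);
  long us = pulseIn(ECHO_PIN, HIGH, 30000);   // echo time, 30 ms timeout
  return us / 58;                             // ~58 µs per cm, round trip
}

void setup() {
  Serial.begin(9600);
  pinMode(TRIG_PIN, OUTPUT);
  pinMode(ECHO_PIN, INPUT);
  sweepServo.attach(9);
}

void loop() {
  for (int angle = 15; angle <= 165; angle++) {
    sweepServo.write(angle);
    delay(30);                                // give the servo time to move
    Serial.print(angle);
    Serial.print(",");
    Serial.println(readDistanceCm());         // Processing draws the sweep
  }
}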

Read more on MAKE


