
Planet Arduino

Archive for the ‘Processing’ Category

In many respects we think of artificial intelligence as being all-encompassing: one AI will do any task we ask of it. But in reality, even when AI reaches the advanced levels we envision, it won’t automatically be able to do everything. The Fraunhofer Institute for Microelectronic Circuits and Systems has been giving this a lot of thought.

AI gesture training

Okay, so you’ve got an AI. Now you need it to learn the tasks you want it to perform. Even today this isn’t an uncommon exercise. But the challenge that Fraunhofer IMS set itself was training an AI without any additional computers.

As a test case, an Arduino Nano 33 BLE Sense was employed to build a demonstration device. Using only the onboard 9-axis motion sensor, the team built an untethered gesture recognition controller. When a button is pressed, the user draws a number in the air, and corresponding commands are wirelessly sent to peripherals. In this case, a robotic arm.
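Fraunhofer IMS hasn’t published its training code here, but the capture stage is easy to picture. The following is a minimal sketch of that stage only, assuming the Nano 33 BLE Sense’s onboard LSM9DS1 IMU (via the standard Arduino_LSM9DS1 library) and a hypothetical push button on pin 2:

```cpp
#include <Arduino_LSM9DS1.h>  // standard library for the Nano 33 BLE Sense IMU

const int BUTTON_PIN = 2;  // hypothetical gesture button

void setup() {
  Serial.begin(115200);
  pinMode(BUTTON_PIN, INPUT_PULLUP);
  if (!IMU.begin()) {
    Serial.println("Failed to initialize IMU!");
    while (true);  // halt if the sensor is missing
  }
}

void loop() {
  // Stream accelerometer samples only while the user holds the
  // button and "draws" the gesture in the air.
  if (digitalRead(BUTTON_PIN) == LOW && IMU.accelerationAvailable()) {
    float x, y, z;
    IMU.readAcceleration(x, y, z);  // units of g
    Serial.print(x); Serial.print('\t');
    Serial.print(y); Serial.print('\t');
    Serial.println(z);
  }
}
```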

Embedded intelligence

At first glance this might not seem overly advanced. But consider that it’s running entirely on the device, with just a small amount of memory and an Arduino Nano. Fraunhofer IMS calls this “embedded intelligence,” as it’s not the robot arm that’s clever, but the controller itself.

This is achieved by training the device with a “feature extraction” algorithm. When a gesture is executed, the artificial neural network (ANN) picks out only the relevant information. This allows for impressive data reduction and a very efficient, compact AI.
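The exact feature extraction algorithm isn’t published in the post, but the general idea, condensing a window of raw samples into a handful of numbers before they ever reach the network, can be sketched like this (the choice of features here is ours, purely for illustration):

```cpp
// Summary features for one axis of a recorded gesture window.
struct Features {
  float mean;
  float variance;
  float minV;
  float maxV;
};

// Condense n raw samples into four numbers: this kind of reduction is
// what lets a small ANN run in the Nano's limited memory.
Features extract(const float *samples, int n) {
  Features f = {0.0f, 0.0f, samples[0], samples[0]};
  for (int i = 0; i < n; i++) {
    f.mean += samples[i];
    if (samples[i] < f.minV) f.minV = samples[i];
    if (samples[i] > f.maxV) f.maxV = samples[i];
  }
  f.mean /= n;
  for (int i = 0; i < n; i++) {
    float d = samples[i] - f.mean;
    f.variance += d * d;
  }
  f.variance /= n;
  return f;
}
```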

Fraunhofer IMS Arduino Nano with Gesture Recognition

Obviously this is just an example use case, and it’s easy to see the massive potential that this kind of compact, learning AI could have, whether in edge control, industrial applications, wearables, or maker projects. If you can train a device to do the job you want, it can offer amazing embedded intelligence with very few resources.

The post Customizable artificial intelligence and gesture recognition appeared first on Arduino Blog.

As an open-source company, Arduino aims to ensure that open-source continues to thrive and remains sustainable for the long term. The Arduino Donation Program is intended to fund projects and institutions that make a lasting difference in the worldwide open-source community. 

Arduino’s corporate giving efforts are focused on not-for-profit and charitable organizations supporting the free and open-source software movement. Arduino Donation Program recipients have been selected according to the importance of their project, and above all, their dedication to making technology accessible to everyone.

A giving back program

Free and open-source software is created as a collaborative effort in which programmers improve upon the code and share the changes within the community. Arduino endorses the philosophy of creating free tools that allow users to focus on “what” they are developing rather than the “how.”

Arduino continuously releases open-source products and code; thanks to community members buying original products, Arduino can continue to invest in R&D and develop innovative new hardware and software. Arduino also benefits from the continuous contributions of the Arduino community and many other projects. We are infinitely grateful for these efforts, and are aware that the rich and diverse Arduino ecosystem would not exist without them.

From now on, Arduino will donate to the free software and open-source projects that it collaborates with as well as those that embody the Arduino approach and philosophy. 

Arduino has donated $55,000 to date in 2020. The institutions that have received a $5,000 grant from Arduino are:

  • The Processing Foundation promotes software literacy within the visual arts, and visual literacy within technology-related fields — and makes these fields accessible to diverse communities. The Processing software is free and open-source.
  • Creative Commons is a non-profit organization devoted to expanding the range of creative works available for others to build upon legally and to share. The Creative Commons licenses let creators communicate which rights they reserve and which rights they waive for the benefit of recipients or other creators.
  • Founded in 2015, the RISC-V Foundation supports RISC-V, a free and open ISA enabling a new era of processor innovation through open standard collaboration.
  • The Free Software Foundation is a charity that empowers users to control technology. Free Software gives everybody the rights to use, understand, adapt, and share software. These rights help support other fundamental freedoms like freedom of speech, press and privacy.
  • The Linux Foundation is dedicated to building sustainable ecosystems around open-source projects to accelerate technology development and industry adoption. Founded in 2000, it provides support for open-source communities through financial and intellectual resources, infrastructure, services, events, and training. 
  • The Open Source Security Foundation (OpenSSF) is a cross-industry effort hosted by the Linux Foundation to improve the security of open-source software. The foundation includes technical initiatives and working groups that address vulnerability disclosures, security tooling, security best practices, and the identification of security threats to open-source projects.

At Arduino, we really hope that more companies involved in open-source hardware and software will follow this example.

“Open-source exists if all of us participate,” said Arduino co-founder Massimo Banzi. “The open-source creators have to be supported but also incentivized: effectively doing open-source is a lot of work. There are multiple ways to keep open-source alive; one thing that we reflected on is that we decided to take 50,000 dollars and donate back to a bunch of open-source projects, and I am sort of challenging other companies whose business model benefits from open-source to also donate to such causes. If we all donate, these open-source projects can thrive and grow to the benefit of all.”

If you need more information about the program, please contact press@arduino.cc.

A lot of consumer gadgets use touch sensors now. It is a cheap and reliable way to replace a variety of knobs and switches on everything from headphones to automobiles. However, creating a custom touch controller for a one-off project can be daunting. A recent ACM paper shows how just about any capacitive sensor can work as a multitouch sensor with nothing more than an Arduino, although a PC running Processing interprets the data for higher-level functions.

The key is that the Arduino excites the grid using PWM and then examines the signal coming out of the grid. A finger poke changes the response quite a bit, and the Arduino can sense it using its onboard analog-to-digital converters. You can find the actual software kit online. The tutorial document is probably more interesting than the ACM paper if you only want to use the kit.

The optimum drive frequency is 10 MHz, and the examples rely on harmonics of a lower-frequency PWM signal to get there. The analog conversion, of course, isn’t that fast, but since your finger’s touch rate is relatively slow, they treat the signal as an amplitude-modulated input, which is very easy to decode.
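The kit’s timer tricks aren’t reproduced here, but the basic excite-and-sample loop looks something like the sketch below, assuming a stock Uno with one transmit line on PWM pin 9 and one receive line on A0 (pin choices and the filter constant are ours; the real kit pushes the PWM frequency well beyond the default):

```cpp
const int TX_PIN = 9;    // PWM output exciting the sensor grid
const int RX_PIN = A0;   // ADC input reading the coupled signal
float envelope = 0.0;

void setup() {
  Serial.begin(115200);
  pinMode(TX_PIN, OUTPUT);
  analogWrite(TX_PIN, 128);  // 50% duty square wave; its harmonics excite the grid
}

void loop() {
  // A finger changes the coupled amplitude. Because touch dynamics are
  // far slower than the carrier, a simple exponential filter is enough
  // to recover the amplitude-modulated envelope.
  int raw = analogRead(RX_PIN);
  envelope = 0.95 * envelope + 0.05 * raw;
  Serial.println(envelope);
}
```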

The sensors can be conductive ink, thread, or copper strips. There are several example applications, including a 3D printed bunny you can pet, a control panel on a sleeve, and an interactive greeting card.

The sensor forms an image and OpenCV detects the actual touch configuration. It appears you can use the raw data from the Arduino, too, but it might be a little harder.

We imagine aluminum foil would work with this technique. If you get to the point of laying out a PCB, this might come in handy.

All the cool projects now can connect to a computer or phone for control, right? But it is a pain to create an app to run on different platforms to talk to your project. [Kevin Darrah] says no and shows how you can use Google Chrome to do the dirty work. He takes a garden-variety Arduino and a cheap Bluetooth interface board and then controls it from Chrome. You can see the video below.

The HM-10 board is cheap and can connect to nearly anything. The control application uses Processing, the software environment the Arduino IDE derives from. So how do you get to Chrome from Processing? Easy. The p5.js library allows Processing sketches to run within Chrome, and there’s also a Bluetooth BLE library for p5.

Once you know about those libraries, you can probably figure the rest out. But [Kevin] shows a nice example that you could easily replicate. The Arduino and Bluetooth code aren’t very hard to follow. The Processing program looks a lot like an Arduino program with a setup and loop function, but it also has canvases, buttons, and other things you don’t usually have in an Arduino sketch.
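The video has the real code; as a flavor of how little the Arduino side needs, here is a minimal sketch assuming an Uno with the HM-10 on pins 10/11 via SoftwareSerial (the pin choice and one-character protocol are ours):

```cpp
#include <SoftwareSerial.h>

SoftwareSerial ble(10, 11);  // RX, TX wired to the HM-10

void setup() {
  pinMode(LED_BUILTIN, OUTPUT);
  ble.begin(9600);  // HM-10 modules default to 9600 baud
}

void loop() {
  // Single characters arriving over BLE toggle the onboard LED.
  if (ble.available()) {
    char c = ble.read();
    if (c == '1') digitalWrite(LED_BUILTIN, HIGH);
    if (c == '0') digitalWrite(LED_BUILTIN, LOW);
  }
}
```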

It is surprisingly easy to create a Chrome app that talks to the hardware. Our usual go-to for phone apps is Blynk. We even used it as a joystick for a robot.

This Arduino-based project creates interesting tumbling patterns using a system that tilts a plane in a controlled manner while deforming its surface.

NEOANALOG, a “studio for hybrid things and spaces,” was commissioned to build the Particle Flow installation, which explores how granules tumble under the control of gravity. This mechanism takes the form of a large hexagon held in three corners by linkages pushed up and down by NEMA 24 stepper motors. As these rods are lifted, the granules inside the “arena” are steered over to the opposite side.

Inside the main hexagon are 19 smaller hexagons, each controlled by a servo to lift an individual section of the rolling surface up and down. Control of the entire system is accomplished via a PC running Processing, which sends commands over Ethernet to an Arduino Mega for the servos and to an Arduino Uno with three motor drivers for the steppers.
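NEOANALOG hasn’t documented the wire protocol, but the Mega’s side of such a setup could be as simple as the sketch below, which assumes a UDP packet carrying one servo angle per hexagon (the packet format, port, and pin assignments are our illustration):

```cpp
#include <Ethernet.h>
#include <EthernetUdp.h>
#include <Servo.h>

byte mac[] = {0xDE, 0xAD, 0xBE, 0xEF, 0xFE, 0xED};
IPAddress ip(192, 168, 1, 50);
EthernetUDP Udp;
const unsigned int PORT = 8888;
const int NUM_HEX = 19;
Servo servos[NUM_HEX];

void setup() {
  Ethernet.begin(mac, ip);
  Udp.begin(PORT);
  // The Mega has enough pins for all 19 servos; use 22 through 40.
  for (int i = 0; i < NUM_HEX; i++) servos[i].attach(22 + i);
}

void loop() {
  // Expect one byte per hexagon: a servo angle from 0 to 180.
  if (Udp.parsePacket() >= NUM_HEX) {
    byte angles[NUM_HEX];
    Udp.read(angles, NUM_HEX);
    for (int i = 0; i < NUM_HEX; i++) servos[i].write(angles[i]);
  }
}
```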

A moving slanted plane and a grid of motorized stamps control the elements to form infinite variations of behaviors and patterns. The result is a zen-like experience that is both fascinating and contemplative. Software-controlled motion follows a complex choreography and enables precise steering of physical particles in a variety of ways: from subtle to obvious, from slow to high-paced, from random-like to symmetric.

Intrigued? Be sure to check out Creative Applications Network’s write-up on this piece as well as NEOANALOG’s page for more details.

[Mr_GreenCoat] is studying engineering. His thermodynamics teacher agreed with the stance that engineering is best learned through experimentation, and tasked [Mr_GreenCoat]’s group with the construction of a vacuum chamber to prove that the boiling point of a liquid goes down with the pressure it is exposed to.
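The effect they set out to measure follows directly from the Clausius-Clapeyron relation. As a rough worked example using textbook values (not numbers from the video):

```latex
\ln\frac{P_2}{P_1} = -\frac{\Delta H_{\mathrm{vap}}}{R}\left(\frac{1}{T_2} - \frac{1}{T_1}\right)
```

Taking water’s heat of vaporization as roughly 40.7 kJ/mol, with T1 = 373 K at P1 = 1 atm, halving the pressure to P2 = 0.5 atm gives T2 of about 354 K, so water boils near 81 °C in a half-atmosphere chamber.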

His group used black PVC pipe to construct their chamber, with an air compressor to pull the vacuum. The lid is a sheet of Lexan with a silicone disk. We’ve covered these sorts of designs before. Since a vacuum chamber will at most see roughly 14.7 psi (one atmosphere) of distributed load on the outside, there’s no real worry of their design going too horribly wrong.

The interesting part of the build is the hardware and software built to boil the water and log the temperatures and pressures. Science isn’t done until something is written down, after all. They have a power resistor and a temperature probe inside the chamber, and the temperature over time is logged using an Arduino and a bit of Processing code.
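The write-up doesn’t name the probe, so as a sketch only, here is what the Arduino side of such a logger might look like, assuming an LM35-style analog temperature sensor on A0 with timestamped readings streamed over serial for a Processing sketch to plot:

```cpp
const int TEMP_PIN = A0;  // hypothetical LM35 analog temperature probe

void setup() {
  Serial.begin(9600);
}

void loop() {
  int raw = analogRead(TEMP_PIN);
  // LM35 outputs 10 mV per degree C; scale the 10-bit ADC reading.
  float celsius = raw * (5.0 / 1023.0) * 100.0;
  Serial.print(millis());   // timestamp in milliseconds
  Serial.print(',');
  Serial.println(celsius);  // CSV line: time,temperature
  delay(1000);              // one sample per second
}
```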

In the end their experiment matched what they had been learning in class. The current laws of thermodynamics are still in effect — all is right in the universe — and these poor students can probably save some money and get along with an old edition of the textbook. Video after the break.




Data Cocktail is a device that translates Twitter activity into tasty drinks, running on an Arduino Due and an Arduino Pro Mini. When you want a cocktail, the machine looks for the five latest messages around the world quoting one of the available ingredients. These messages define the drink’s composition, and Data Cocktail not only provides a unique kind of drink, but also prints the cocktail’s recipe along with the corresponding tweets.
Once the cocktail mix is done, Data Cocktail thanks the tweeters who unknowingly helped create the recipe. Check out the video below to see how it works:

Data Cocktail was created in a workshop held at Stereolux in Nantes by a team composed of Bertille Masse, Manon Le Moal-Joubel, Sébastien Maury, Clément Gault, and Thibaut Métivier.

They made it using Processing and Arduino:

A first application, developed in Processing, pilots the device. The requests are performed using the Twitter4J library, then the application processes the data and controls the device, i.e. the robot, the solenoid valves, and the light. The robot itself is based on a modified Zumo chassis, an Arduino Pro, a motor shield, and a Bluetooth module. The solenoid valves and the LEDs are controlled by an Arduino Due connected via USB. The printing is handled by Automator.

To prepare a cocktail, the machine can take up to a minute, and may use up to six different ingredients!
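The project’s own firmware isn’t published in the post, but the Due’s role, receiving ingredient commands over USB and pulsing solenoid valves, might look roughly like this (the one-letter protocol, pin map, and pour time are our assumptions):

```cpp
const int NUM_VALVES = 6;
const int VALVE_PINS[NUM_VALVES] = {2, 3, 4, 5, 6, 7};  // one pin per ingredient
const unsigned long POUR_MS = 500;                      // dose per command

void setup() {
  Serial.begin(115200);
  for (int i = 0; i < NUM_VALVES; i++) {
    pinMode(VALVE_PINS[i], OUTPUT);
    digitalWrite(VALVE_PINS[i], LOW);  // all valves closed at start
  }
}

void loop() {
  // Letters 'a' through 'f' select one of the six valves.
  if (Serial.available()) {
    char c = Serial.read();
    if (c >= 'a' && c < 'a' + NUM_VALVES) {
      int v = c - 'a';
      digitalWrite(VALVE_PINS[v], HIGH);
      delay(POUR_MS);
      digitalWrite(VALVE_PINS[v], LOW);
    }
  }
}
```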


‘Social Vibes’ is a master’s degree (MSc) project in Interactive Media by Cian McLysaght at the University of Limerick, Ireland. They shared with us their project, running on an Arduino Uno, built around a physical artifact designed and created specifically for an installation, adopting the fundamental sound mechanisms used in a vibraphone, also known as a ‘vibe’:

The instrument consists of twelve musical tones of different pitches. The music created on the instrument is derived from a continuous stream of input via multiple users on Twitter, and from the explicit interaction of Twitter users tweeting the instrument directly at the project’s “@vibe_experiment” Twitter account. Data associated with the emotional status of Twitter users is mined from the Twitter network via Twitter’s application programming interface (API).

For example, if a user tweets “The sun is out, I’m happy”, the code I’ve written will strip out key words and strings associated with the user’s emotional state within the tweet, i.e. “I’m happy”, and translate this to a musical notation. Mining Twitter’s API allows a continuous stream of data. These emotional states are then mapped to specific notes on the physical musical instrument, located in a public space. The tempo of the musical expression is entirely based upon the speed and volume of the incoming tweets from the Twitter API.
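McLysaght’s actual code isn’t included in the post; as a sketch of just the mapping idea, keywords could arrive over serial from the PC-side Twitter code and select a note, with tone() on a piezo standing in for the real instrument’s strikers (the word-to-note table is ours):

```cpp
const int PIEZO_PIN = 8;  // piezo stand-in for the vibraphone strikers
const int NUM_WORDS = 4;
const char *WORDS[NUM_WORDS] = {"happy", "sad", "angry", "calm"};
const int NOTES[NUM_WORDS]  = {523, 392, 330, 262};  // C5, G4, E4, C4 in Hz

void setup() {
  Serial.begin(9600);
}

void loop() {
  if (Serial.available()) {
    String msg = Serial.readStringUntil('\n');
    msg.trim();  // strip stray whitespace or carriage returns
    for (int i = 0; i < NUM_WORDS; i++) {
      if (msg == WORDS[i]) {
        tone(PIEZO_PIN, NOTES[i], 250);  // play the mapped note for 250 ms
      }
    }
  }
}
```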

Twitter users, both followers and non-followers of the instrument’s Twitter account (@vibe_experiment), can tweet directly to the instrument, and this direct interaction is given precedence, allowing users who tweet directly to have their emotional state ‘played’. This lets users hijack or take over the instrument and experiment with it in a playful manner, but also allows those with musical knowledge the potential to compose simple musical arrangements. When users are not tweeting the instrument directly, the instrument reverts to mining the Twitter API.

To entice users to interact and observe the instrument in action, there is a live streaming broadcast of the instrument via Twitcam on the @vibe_experiment account. Twitcam, Twitter’s built-in live-streaming platform, simply requires a webcam and a valid Twitter account.

The instrument constantly tweets updates back to its own Twitter account, not only to inform people of its general status but also to engage users to interact directly with the ‘Vibe’.


Computers Playing Flappy Bird. Skynet Imminent. Humans Flapping Arms.



After viral popularity, developer rage quits, and crazy eBay auctions, the world at large is just about done with Flappy Bird. Here at Hackaday, we can’t let it go without showcasing two more hacks. The first is the one that we’ve all been waiting for: a robot that will play the damn game for us. Your eyes don’t deceive you in that title image. The Flappy Bird bot is up to 147 points and going strong. [Shi Xuekun] and [Liu Yang], two hackers from China, have taken full responsibility for this hack. They used OpenCV with a webcam on Ubuntu to determine the position of both the bird and the pipes. Once positions are known, the computer calculates the next move. When it’s time to flap, a signal is sent to an Arduino Mega 2560. The genius of this hack is the actuator. Most servos or motors would have been too slow for this application. [Shi] and [Liu] used the Arduino and a motor driver to activate a hard drive voice coil. The voice coil was fast enough to touch the screen at exactly the right time, but not so powerful as to smash their tablet.
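Their code isn’t reproduced in the post, but the Arduino’s job is simple enough to sketch: wait for a flap command from the vision software and pulse the motor driver feeding the voice coil (the trigger byte, pin, and pulse width here are illustrative):

```cpp
const int COIL_PIN = 7;              // input of the motor driver
const unsigned int PULSE_US = 8000;  // a short ~8 ms tap

void setup() {
  Serial.begin(115200);
  pinMode(COIL_PIN, OUTPUT);
}

void loop() {
  // 'F' from the OpenCV side means "flap now": fire the coil briefly
  // so it taps the screen without slamming it.
  if (Serial.available() && Serial.read() == 'F') {
    digitalWrite(COIL_PIN, HIGH);
    delayMicroseconds(PULSE_US);
    digitalWrite(COIL_PIN, LOW);
  }
}
```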

If you would like to make flapping a bit more of a physical affair, [Jérémie] created Flappy Bird with Kinect. He wrote a quick Processing sketch that uses the Microsoft Kinect to look for humans flapping their arms. If flapping is detected, a command is sent to an Android tablet. [Jérémie] initially wanted to use Android Debug Bridge (ADB) to send the touch commands, but found it was too laggy for this sort of hardcore gaming. The workaround is to use a serial-connected Arduino as a mouse. The Processing sketch sends a ‘#’ to the Arduino via serial. The Arduino then sends a mouse click to the computer, which is running hidclient. Hidclient finally sends Bluetooth mouse clicks to the tablet. Admittedly, this is a bit of a Rube Goldberg approach, but it does add an Arduino to a Flappy Bird hack, which we think is a perfect pairing.
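To be clear, the following is not [Jérémie]’s serial-plus-hidclient chain, just a simpler stand-in: a board with native USB, such as a Leonardo, can act as the mouse itself using the stock Mouse library, clicking once per ‘#’ received over serial:

```cpp
#include <Mouse.h>  // native USB HID support (Leonardo, Micro, Due)

void setup() {
  Serial.begin(9600);
  Mouse.begin();
}

void loop() {
  // One '#' from the Processing sketch equals one flap.
  if (Serial.available() && Serial.read() == '#') {
    Mouse.click();
  }
}
```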

[Thanks Parker!]



