
Planet Arduino

Archive for the ‘research’ Category

In many respects we think of artificial intelligence as all-encompassing: one AI will do any task we ask of it. But in reality, even when AI reaches the advanced levels we envision, it won't automatically be able to do everything. The Fraunhofer Institute for Microelectronic Circuits and Systems (Fraunhofer IMS) has been giving this a lot of thought.

AI gesture training

Okay, so you’ve got an AI. Now you need it to learn the tasks you want it to perform. Even today this isn’t an uncommon exercise. But the challenge that Fraunhofer IMS set itself was training an AI without any additional computers.

As a test case, an Arduino Nano 33 BLE Sense was used to build a demonstration device. Using only the onboard 9-axis motion sensor, the team built an untethered gesture recognition controller. When a button is pressed, the user draws a number in the air, and the corresponding command is sent wirelessly to a peripheral, in this case a robotic arm.
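To give a sense of the capture side of such a controller, here is a minimal sketch for the Nano 33 BLE Sense that buffers accelerometer samples while a button is held, using the Arduino_LSM9DS1 library. The button pin, sample rate, and buffer size are assumptions; the wireless link and the classifier itself are left out.

```cpp
// Minimal gesture-capture sketch (assumptions: push button on pin 3,
// Arduino_LSM9DS1 library for the Nano 33 BLE Sense IMU).
#include <Arduino_LSM9DS1.h>

const int BUTTON_PIN = 3;      // hypothetical wiring
const int MAX_SAMPLES = 200;   // roughly 2 s of motion at ~100 Hz
float ax[MAX_SAMPLES], ay[MAX_SAMPLES], az[MAX_SAMPLES];
int sampleCount = 0;

void setup() {
  Serial.begin(115200);
  pinMode(BUTTON_PIN, INPUT_PULLUP);
  if (!IMU.begin()) {
    Serial.println("Failed to initialize IMU!");
    while (true);
  }
}

void loop() {
  if (digitalRead(BUTTON_PIN) == LOW) {   // button held: record the gesture
    if (sampleCount < MAX_SAMPLES && IMU.accelerationAvailable()) {
      IMU.readAcceleration(ax[sampleCount], ay[sampleCount], az[sampleCount]);
      sampleCount++;
    }
  } else if (sampleCount > 0) {           // button released: gesture complete
    Serial.print("Captured samples: ");
    Serial.println(sampleCount);
    // ...here the buffered window would go to feature extraction and the ANN
    sampleCount = 0;
  }
}
```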

Embedded intelligence

At first glance this might not seem overly advanced. But consider that it's running entirely on the device, with just a small amount of memory and an Arduino Nano. Fraunhofer IMS calls this "embedded intelligence," as it's not the robot arm that's clever, but the controller itself.

This is achieved by training the device with a "feature extraction" algorithm. When a gesture is executed, the artificial neural network (ANN) picks out only the relevant information, allowing for impressive data reduction and a very efficient, compact AI.
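Fraunhofer IMS has not published its exact algorithm, but the general idea of feature extraction can be sketched as reducing a buffered window of samples to a few summary values per axis, for example:

```cpp
// Illustrative only: one simple way to compress a window of accelerometer
// samples into per-axis features (mean, min, max) before classification.
struct AxisFeatures {
  float mean, minimum, maximum;
};

AxisFeatures extractFeatures(const float *samples, int n) {
  AxisFeatures f = { 0.0f, samples[0], samples[0] };
  for (int i = 0; i < n; i++) {
    f.mean += samples[i];
    if (samples[i] < f.minimum) f.minimum = samples[i];
    if (samples[i] > f.maximum) f.maximum = samples[i];
  }
  f.mean /= n;
  return f;
}
// Three axes times three features gives a 9-value vector,
// small enough to feed a tiny on-device ANN.
```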

Fraunhofer IMS Arduino Nano with Gesture Recognition

Obviously this is just an example use case. It's easy to see the massive potential this kind of compact, learning AI could have, whether in edge control, industrial applications, wearables, or maker projects. If you can train a device to do the job you want, it can offer amazing embedded intelligence with very few resources.

The post Customizable artificial intelligence and gesture recognition appeared first on Arduino Blog.

We recently sponsored one of the labs at Lulea University in Sweden, the INSPIRE (INstrumentation for Space and Planetary Investigation, Resources and Exploration) Lab. It is not just any lab: it is the lab of Prof. Mari Paz Zorzano and Prof. Javier Martín, both known for their work on the possibility of liquid water on Mars' surface, published in a Nature article in 2015, among other places.

What I learned rather quickly, thanks to my interactions with both professors over the last couple of years, is that Arduino has been a basic component in the countless projects made in their lab. The Mega and Due are their students' favorites: the former for its number of available pins and its robustness, the latter for its floating-point performance, its analog-to-digital converters, and its general suitability for instrumentation.

This article is the first in a series highlighting how the Lulea lab is using Arduino for instruments, real-life experiments, zero-gravity tests, low-orbit missions, and general teaching. We hope these stories will inspire many to follow in their footsteps and look at the stars with a renewed interest in science and technology.

Meet the players

Mari Paz and Javier were known to me before I actually got to meet them in person. As a researcher, I had heard of the article in Nature; who hadn't? Plus, since both of them come from Spain (as I do), you can imagine that the national press covered their finding extensively when it was published. Funnily enough, they knew about Arduino because, like many researchers, they needed to find ways to finance their experiments more efficiently, and Arduino is known for being affordable as well as technically capable of running many of their tests. I should confess that, by the time we all got in touch, I was already trying to figure out how to reach them.

In November 2016, Mari Paz and Javier had just opened their lab in Kiruna. Their discovery had given them new positions at a new university (Lulea University, which runs the Kiruna campus, closer to the launch station), a new team, and access to a lot more resources, and so they got back to work. I was invited to give a talk as part of their seminar series and later host a short workshop, mainly for master's and PhD students. The Kiruna campus in November is completely surrounded by snow; for several months of the year you can even ski there, something I was told people sometimes do. However, the city of Kiruna is about to go through major transformations (the city center will be moved about 3 km due to the mine that is literally under it), and the professors have decided to move their lab to Lulea's main campus for the time being. Below are descriptions of some of the projects developed there.

Project 1: PVT-Gamers

One of the biggest challenges for spacecraft is how to weigh the remaining propellant (fuel) in the absence of gravity. With contemporary, reusable space vehicles in mind, this has become one of the most economically critical limitations to take into account. PVT-Gamers is the acronym for the 'Improved Pressure-Volume-Temperature Gauging Method for Electric-Propulsion Systems' experiment designed at the INSPIRE Lab. It explores the use of pressurized propellants, like xenon, monitoring how they are used and how much is left to keep the spacecraft moving.

PVT-Gamers has been chosen by the European Space Agency (ESA) to fly on board the Airbus A310 ZERO-G airplane. For those not familiar with it, this aircraft flies parabolic trajectories that produce brief periods of near-zero gravity, and is therefore used to simulate space conditions. PVT-Gamers was selected within the ESA program "Fly Your Thesis! 2018," which will give the research team the chance to test their assumptions in a real-world scenario. A new method will be applied to small pressurized xenon gas containers under hyper/micro-gravity cycles during stationary cooling. Arduino boards, specifically six Mega 2560s, are instrumental in collecting all the data, such as temperature, pressure, deformation, and acceleration. This will make it possible to reproduce on-orbit, thrust-phase, external-acceleration, and fuel-transfer conditions for a propellant tank at its End of Life (EOL) stage, when there is almost no propellant left.
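To give an idea of the kind of data acquisition a Mega 2560 handles in such a rack, here is a hedged sketch that samples a few analog channels and streams them as CSV over serial. The sensor types and pin assignments are placeholders, not the actual PVT-Gamers wiring.

```cpp
// Placeholder data-logging sketch for a Mega 2560: calibration and the
// real sensor interfaces would differ in the actual experiment.
const int PRESSURE_PIN = A0;   // hypothetical analog pressure transducer
const int TEMP_PIN     = A1;   // hypothetical analog temperature sensor
const int ACCEL_X_PIN  = A2;   // hypothetical analog accelerometer axis

void setup() {
  Serial.begin(115200);        // stream samples to a logging PC
}

void loop() {
  unsigned long t = millis();
  int pressureRaw = analogRead(PRESSURE_PIN);
  int tempRaw     = analogRead(TEMP_PIN);
  int accelRaw    = analogRead(ACCEL_X_PIN);

  // CSV line: timestamp, raw readings (calibration happens offline)
  Serial.print(t);           Serial.print(',');
  Serial.print(pressureRaw); Serial.print(',');
  Serial.print(tempRaw);     Serial.print(',');
  Serial.println(accelRaw);

  delay(10);                 // roughly 100 Hz sample rate
}
```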

The potential applications of this experiment may provide the upcoming generation of spacecraft with a fuel measuring and control method that could be a turning point for long-term space missions. It can be applied to CubeSats and telecommunication satellites, as well as to large future projects using electric propulsion such as the lunar space station "Deep Space Gateway" or the Mercury mission BepiColombo.

Current design of the PVT-Gamers experiment rack configuration to be attached to the A310 ZERO-G cabin. Photo credit: PVT-Gamers

Simulation of the velocity magnitude distribution within a spacecraft propellant tank as a consequence of external heating. Photo credit: PVT-Gamers

A310 ZERO-G cabin during a micro-gravity stage. Photo credit: ESA

Closing with a reflection: Why is this so important?

You might wonder: why should Arduino be so interested in building machines aimed at the exploration of space? The answer is threefold. First, space is the ultimate frontier: the conditions are very tough, shipping electronics out of the atmosphere is expensive, engineers are forced to be very creative, and reusability is key (a part has to serve more than one purpose, even the hardware components). For Arduino, proving that our boards and choice of materials, while still inexpensive, are good enough to take part in the space race is of course of vital importance. If it works in space, it works on Earth, and for industry too.

Second, the constraints are such that many of the designs become very useful in everyday situations. A greenhouse made for Mars would also work in the Arctic, or for poor villagers in the mountains anywhere in the world. Isn't that a good enough excuse to build a machine that will help improve people's lives?

Third, in education we need role models to follow and experiences to replicate. Those of Mari Paz, Javier, and their team will surely awaken a scientific vocation in many young people. Helping science is helping education!


Leonardo Lupori and Raffaele Mazziotti work in neuroscience at Tommaso Pizzorusso's lab at the Neuroscience Institute CNR of Pisa, as a molecular biologist and an experimental psychologist respectively. They created an Arduino-based, MATLAB-controlled tool called IOSIC (Intrinsic Optical Signal Imaging Chamber), powered by an Arduino Micro, which forms part of an intrinsic optical signal (IOS) imaging apparatus used to run experiments on the plasticity of the brain.

Intrinsic optical signal (IOS) imaging is a functional imaging technique that has revolutionized our understanding of cortical functional organization and plasticity since it was first implemented around 30 years ago. The IOS is produced by the brain when it processes information; it is similar to the signal recorded by a plethysmograph (the instrument that measures heart rate from a finger) and is useful for investigating how the brain works. The researchers are especially interested in how the brain adapts to the environment, both to store information and to acquire new skills, and these studies are very useful for understanding what happens to the brain in good health and during disease.


Even though their lab has long-standing expertise in electrophysiological studies, they decided to develop a fully functional IOS apparatus from low-cost, readily available tools:

To set up the entire system we used a mix of commercially available and custom-made components. The most expensive tool we used is an imaging camera from Hamamatsu (necessary because we need to analyze data quantitatively), but you can also use a cheaper camera (at least a CCD chip with 12-bit depth is recommended). The rest is equipment collected from old tools in the lab. For example, the microscope, in our case an old Olympus confocal microscope, was already in the lab and is also used for other purposes; any transmitted-light microscope or macroscope should be fine. For illumination, we used a custom-made crown-shaped LED holder that can be attached to the objective and provides a really stable light source.

Afterwards, we wrote a MATLAB script to control the camera, and then we built an imaging chamber to hold the animal preparation. The imaging chamber is essential to keep the animal stable during the imaging session (about 7 minutes) and to maintain its physiological temperature over the time course of anesthesia. An additional feature of the chamber is the ability to change the animal's visual field automatically, allowing us to measure rapidly, efficiently, and repeatedly a very important parameter of plasticity called ocular dominance. The chamber is composed of a 3D-printed structure on which an Arduino Micro, two servo motors, a heating pad, an IR thermometer, and a magnetic ring have been installed. We are currently using this system with success and we hope to discover something really relevant.

You can download the IOSIC code for the Arduino Micro here. The code uses two third-party libraries: TMP006 and Servo. The MATLAB code to control the shutters is available here.
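As a rough illustration of how the chamber's parts could be tied together on the Arduino Micro, here is a sketch combining a TMP006 IR thermometer (via the Adafruit_TMP006 library), two shutter servos, and a heating pad output. The pins, target temperature, serial commands, and servo angles are assumptions rather than the published IOSIC code.

```cpp
// Hedged sketch of the chamber hardware: wiring and behavior are illustrative.
#include <Wire.h>
#include <Servo.h>
#include <Adafruit_TMP006.h>

Adafruit_TMP006 tmp006;          // contactless IR thermometer over I2C
Servo shutterLeft, shutterRight; // servos that occlude one eye at a time
const int HEAT_PAD_PIN = 9;      // hypothetical output driving the heating pad
const float TARGET_TEMP_C = 37.0;

void setup() {
  Serial.begin(115200);
  pinMode(HEAT_PAD_PIN, OUTPUT);
  shutterLeft.attach(5);         // hypothetical servo pins
  shutterRight.attach(6);
  if (!tmp006.begin()) {
    Serial.println("No TMP006 sensor found");
    while (true);
  }
}

void loop() {
  // Simple on/off thermostat to hold physiological temperature under anesthesia
  float bodyTemp = tmp006.readObjTempC();
  digitalWrite(HEAT_PAD_PIN, bodyTemp < TARGET_TEMP_C ? HIGH : LOW);

  // Single-character serial commands (e.g. from MATLAB) switch the occluded eye
  if (Serial.available()) {
    char cmd = Serial.read();
    if (cmd == 'L') { shutterLeft.write(90); shutterRight.write(0); }
    if (cmd == 'R') { shutterLeft.write(0);  shutterRight.write(90); }
  }
  delay(500);
}
```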

May 20

Are you a developer? Take a 10-min survey and shape a new dev report


How will IoT play out in your ecosystem? Is HTML vs. native still relevant? Are you using AWS, Azure or Google Cloud? Which are the hottest IoT verticals? These are some of the questions that researchers at VisionMobile address in the 9th edition of their Developer Economics research, launched at the beginning of this month. You can make your voice heard by taking the 10-minute Developer Skill Census survey, and later read the key insights, given back to the community as a free download in late July.

The Developer Economics research program tracks developer sentiment across platforms, revenues, apps, tools, APIs, segments, and regions, tackling some of 2015's most commonly asked questions. It is the largest and most global app developer research and engagement program, reaching up to 10,000 developers in over 140 countries, and we believe open source developers could offer an interesting point of view on the topic!

After taking the survey, you can immediately download a free chapter from one of VisionMobile's premium paid reports, which takes a close look at app profits and costs, and also enter a draw to win prizes such as an iPhone 6, an Apple Watch Sport, an Oculus Rift Dev Kit, and many more.

Jun 24


The Lesser Kestrel (Falco naumanni) is a small falcon at the center of HORUS, a project aiming to develop a system for automatic real-time monitoring of colonial falcons at Doñana Biological Station, a public Research Institute in Spain.

The falcons breed in nest-boxes on the window sills, which the research team converted into "smart nest-boxes": they have sensors that identify the falcons entering the box via RFID tags, as well as cameras and other equipment controlled by an Arduino Mega.


Horus project

Here's how they use it:

This board is based on the ATmega2560, an economical, low-power, and robust microcontroller. It controls and processes the nest's sensor information: it communicates with the sensors and other components, and processes the collected information, which is sent to the process server over the communication interface.
The program implemented on the microcontroller performs the following tasks:
- Communicates with the process server over a communication interface and synchronizes its clock with it.
- Checks the infra-red barriers. Each nest-box has two infra-red barriers, one at each end of the corridor; the sequence in which they are triggered indicates whether a bird is entering or leaving the nest-box.
- Checks whether the RFID reader has read a code from a ringed kestrel.
- Obtains the body mass measurement from a digital balance.
- Reads the temperature and humidity of the nest.
- Controls the RFID reader to identify individuals.
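For illustration, a greatly simplified sketch of such a monitoring loop might look like the following; the pin assignments, the RFID serial protocol, the DHT22 environment sensor, and the reporting format are assumptions, not the actual HORUS firmware.

```cpp
// Simplified nest-box monitoring loop on an Arduino Mega (illustrative only).
#include <DHT.h>                 // assuming a DHT22-type temperature/humidity sensor

const int BARRIER_OUTER_PIN = 2; // IR barrier at the corridor entrance
const int BARRIER_INNER_PIN = 3; // IR barrier at the nest-box end
const int DHT_PIN = 4;
DHT dht(DHT_PIN, DHT22);

void setup() {
  Serial.begin(115200);          // link to the process server
  Serial1.begin(9600);           // RFID reader on the Mega's second UART
  pinMode(BARRIER_OUTER_PIN, INPUT);
  pinMode(BARRIER_INNER_PIN, INPUT);
  dht.begin();
}

void loop() {
  // Very simplified direction hint: which beam is broken while the other is clear
  bool outer = digitalRead(BARRIER_OUTER_PIN) == LOW;   // LOW = beam broken (assumed)
  bool inner = digitalRead(BARRIER_INNER_PIN) == LOW;
  if (outer && !inner) Serial.println("EVENT,entering");
  if (inner && !outer) Serial.println("EVENT,leaving");

  // Forward any tag code read from a ringed kestrel
  if (Serial1.available()) {
    String tag = Serial1.readStringUntil('\n');
    Serial.print("RFID,");
    Serial.println(tag);
  }

  // Periodic environment report for the nest
  Serial.print("ENV,");
  Serial.print(dht.readTemperature());
  Serial.print(',');
  Serial.println(dht.readHumidity());
  delay(1000);
}
```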

You can follow daily updates on their Facebook page and find all the info on the Horus project page and on the Wiki.


Jan 08

Project feature: Accessing YQL from Arduino


The Ethernet Shield opens up a lot of possibilities for Arduino, one of which has been explored by Sudar: he has found a way to make YQL calls, and even parse the JSON response, using an Arduino and an Ethernet Shield.

So what is YQL?

YQL stands for Yahoo Query Language. It is an expressive SQL-like language that lets you query, filter, and join data across Web services. You can read more about YQL from the Yahoo Developer network page.
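For context, here is a minimal sketch of the idea (not Sudar's code): an HTTP GET to the public YQL endpoint as it existed at the time, using the standard Ethernet library, with the raw JSON dumped to serial. The MAC address and query are placeholders, and parsing the response is what the tutorial covers.

```cpp
// Minimal YQL request over an Ethernet Shield (illustrative placeholders).
#include <SPI.h>
#include <Ethernet.h>

byte mac[] = { 0xDE, 0xAD, 0xBE, 0xEF, 0xFE, 0xED };  // placeholder MAC address
EthernetClient client;

void setup() {
  Serial.begin(9600);
  if (Ethernet.begin(mac) == 0) {                     // obtain an address via DHCP
    Serial.println("DHCP failed");
    while (true);
  }
  if (client.connect("query.yahooapis.com", 80)) {
    // URL-encoded example query: select * from weather.forecast where woeid=2295424
    client.println("GET /v1/public/yql?q=select%20*%20from%20weather.forecast%20"
                   "where%20woeid%3D2295424&format=json HTTP/1.1");
    client.println("Host: query.yahooapis.com");
    client.println("Connection: close");
    client.println();
  }
}

void loop() {
  // Dump the raw JSON response; parsing it is the interesting part of the tutorial
  while (client.available()) {
    Serial.write(client.read());
  }
}
```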

Check out the tutorial and the source code on his blog, hardwarefun.com.

He is a research engineer at Yahoo Research Labs India by profession, and a hardware hacker by passion. More of his projects can be found here.


