
Planet Arduino

Archive for the ‘kinect’ Category

It is easy to imagine projects where it would be useful to have a robot follow you. For example, we've always wanted luggage that would trail us through the airport, and we've seen several coolers that will follow their owners. [Madmax95] apparently dreamed of having a medical cart that follows a patient, though, and that's a good application too. But how do you do it? [Max's] method was to strip down a Roomba and build a work table and electronics on top of it. An Arduino controls the motors and communicates with a PC. The PC reads video from a Kinect camera on the robot and uses special tracking software to follow the patient.
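The post doesn't include the firmware, but the Arduino end of a link like this can be very small. Here's a minimal, hypothetical sketch assuming the PC sends plain-text speed commands over USB serial and the drive motors sit behind a generic dual H-bridge; the pin numbers and the "L<left>,R<right>" protocol are our own invention, not [Madmax95]'s:

```cpp
// Hypothetical drive firmware: parse "L<left>,R<right>\n" lines from the
// tracking PC (e.g. "L120,R-90") and drive a generic dual H-bridge.
// Pins and protocol are assumptions, not taken from [Madmax95]'s build.
const int L_PWM = 5, L_DIR = 4;   // left motor: PWM + direction
const int R_PWM = 6, R_DIR = 7;   // right motor: PWM + direction

void setMotor(int pwmPin, int dirPin, int speed) {
  digitalWrite(dirPin, speed >= 0 ? HIGH : LOW);        // sign sets direction
  analogWrite(pwmPin, constrain(abs(speed), 0, 255));
}

void setup() {
  pinMode(L_DIR, OUTPUT);
  pinMode(R_DIR, OUTPUT);
  Serial.begin(115200);           // USB link to the PC running the tracker
}

void loop() {
  if (Serial.available()) {
    String line = Serial.readStringUntil('\n');         // one command per line
    int comma = line.indexOf(',');
    if (line.startsWith("L") && comma > 0) {
      int left  = line.substring(1, comma).toInt();
      int right = line.substring(comma + 2).toInt();    // skip ",R"
      setMotor(L_PWM, L_DIR, left);
      setMotor(R_PWM, R_DIR, right);
    }
  }
}
```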

We could easily imagine building every part of this project except the tracking. That depends on a service called Nuitrack. There is a free version that only works for three minutes at a time, but practical use requires a paid license. Even so, it would still be cheaper than rolling your own skeleton tracker if your time has any value.

Nuitrack can do body tracking, face tracking, and gesture recognition, so it is easy to imagine commanding the robot with some sort of Jedi-style gesture. Using it appears to make the project significantly easier than you might expect.

The project also uses Thingsboard to create a simple control panel. This is another solution that requires a subscription, but there is a free community edition you can host locally.

Overall, we probably won't duplicate this robot ourselves, but we were interested in learning how to pull off something similar for other projects. We've seen variations on this done with tools like OpenCV. Our suitcase idea, by the way, isn't original, but we do wonder how much packing volume you would lose to batteries and electronics, and we imagine the airlines would be unhappy about stowing one.

[Aldric Negrier] wanted to make 3D-scanning a person streamlined and simple. To that end, he created this voice-controlled 3D-scanning rig.

[Aldric] used a variety of hacking skills to make this project, and his thorough Instructable illustrates this nicely. Everything from CNC milling to Arduino programming to 3D printing went into the rig. Plywood was used to construct the base and the large toothed gear. A 12″ lazy Susan bearing was attached to this gear to allow smooth rotation. To automate the rig, a 12V DC geared motor was attached to a smaller 3D-printed gear and positioned on the base. When the motor is on, the smaller gear's teeth take the larger gear for a spin. A custom dual H-bridge motor driver, made by a friend, drives the motor and is connected to an Arduino Nano. The Nano is also connected to a Bluetooth module and an ultrasonic range finder. When an object is detected between 1 and 35 cm from the sensor for 3 seconds, the motor starts to spin, and it stops when the object is no longer detected. A typical scan takes about 60 seconds.
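The detection logic is easy to replicate. Below is a rough sketch of how it might look, assuming an HC-SR04-style ultrasonic sensor and a PWM-controlled H-bridge enable pin; the pins, the 3-second dwell, and the speed value are our guesses rather than [Aldric]'s actual firmware:

```cpp
// Hypothetical auto-start logic for the turntable: an HC-SR04-style sensor
// watches the platform, and the motor is enabled only after something has
// sat in the 1-35 cm window for 3 seconds. Pins are assumptions.
const int TRIG = 9, ECHO = 8;
const int MOTOR_EN = 5;                  // PWM to the H-bridge enable pin
unsigned long firstSeen = 0;             // when the object first appeared

long readDistanceCm() {
  digitalWrite(TRIG, LOW);  delayMicroseconds(2);
  digitalWrite(TRIG, HIGH); delayMicroseconds(10);   // 10 us trigger pulse
  digitalWrite(TRIG, LOW);
  long us = pulseIn(ECHO, HIGH, 30000);  // time out after 30 ms (no echo)
  return us / 58;                        // ~58 us per cm, round trip
}

void setup() {
  pinMode(TRIG, OUTPUT);
  pinMode(ECHO, INPUT);
  pinMode(MOTOR_EN, OUTPUT);
}

void loop() {
  long cm = readDistanceCm();
  bool present = (cm >= 1 && cm <= 35);

  if (!present) {
    firstSeen = 0;
    analogWrite(MOTOR_EN, 0);            // stop as soon as the rig is empty
  } else {
    if (firstSeen == 0) firstSeen = millis();
    if (millis() - firstSeen >= 3000) analogWrite(MOTOR_EN, 180);  // spin up
  }
  delay(100);
}
```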

This alone would have been a great project, but [Aldric] did not stop there. He wanted to be able to step on the rig and issue commands while being scanned. That makes sense if you want to scan yourself: get on the rig, assume the desired position, and then start the scan. He used the Windows speech recognition SDK to develop an application that issues commands via Bluetooth to Skanect, a piece of 3D-scanning software. The commands are as simple as saying "Start Skanect." You can also tell the motor to switch on or off, or change its speed or direction, without breaking form. [Aldric] used an Asus Xtion as the 3D scanner, but a Kinect will also work. Afterwards, he smoothed his scans using MeshMixer, a program featured in previous hacks.
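The post doesn't spell out the Bluetooth protocol, but with an HC-05-class module wired to the Nano, a single-character command set is the usual trick. Something like the following would do it; the particular characters and pins are our own invention:

```cpp
// Hypothetical handler for motor commands arriving over Bluetooth from the
// speech-recognition app. The single-character command set is a guess.
#include <SoftwareSerial.h>

SoftwareSerial bt(10, 11);                 // RX, TX to the Bluetooth module
const int MOTOR_PWM = 5, MOTOR_DIR = 4;
int speedVal = 180;
bool running = false, reversed = false;

void apply() {
  digitalWrite(MOTOR_DIR, reversed ? HIGH : LOW);
  analogWrite(MOTOR_PWM, running ? speedVal : 0);
}

void setup() {
  pinMode(MOTOR_DIR, OUTPUT);
  bt.begin(9600);                          // HC-05 default baud rate
}

void loop() {
  if (bt.available()) {
    char c = bt.read();
    if (c == '1') running = true;                           // motor on
    else if (c == '0') running = false;                     // motor off
    else if (c == '+') speedVal = min(speedVal + 25, 255);  // faster
    else if (c == '-') speedVal = max(speedVal - 25, 0);    // slower
    else if (c == 'r') reversed = !reversed;                // flip direction
    apply();
  }
}
```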

Check out the videos of the rig after the break. The voice commands are hard to make out over the background music in one of the videos, but if you listen carefully you can pick them out. You can also see more of [Aldric's] projects here or on his YouTube channel.

[Thanks for the tip, MERover!]


Filed under: 3d Printer hacks, Arduino Hacks
Feb 28

Computers Playing Flappy Bird. Skynet Imminent. Humans Flapping Arms.

android hacks, arduino, arduino hacks, Flappy Bird, kinect, misc hacks, Processing


After viral popularity, developer rage quits, and crazy eBay auctions, the world at large is just about done with Flappy Bird. Here at Hackaday, we can’t let it go without showcasing two more hacks. The first is the one that we’ve all been waiting for: a robot that will play the damn game for us. Your eyes don’t deceive you in that title image. The Flappy Bird bot is up to 147 points and going strong. [Shi Xuekun] and [Liu Yang], two hackers from China, have taken full responsibility for this hack. They used OpenCV with a webcam on Ubuntu to determine the position of both the bird and the pipes. Once positions are known, the computer calculates the next move. When it’s time to flap, a signal is sent to an Arduino Mega 2560. The genius of this hack is the actuator. Most servos or motors would have been too slow for this application. [Shi] and [Liu] used the Arduino and a motor driver to activate a hard drive voice coil. The voice coil was fast enough to touch the screen at exactly the right time, but not so powerful as to smash their tablet.
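Since all of the intelligence lives on the PC, the Arduino's job is just to fire the coil on cue. As a hedged sketch of what that side might look like, assuming the vision script sends one byte per flap and the voice coil hangs off an ordinary H-bridge motor driver (the pins and pulse timings below are guesses, not [Shi] and [Liu]'s values):

```cpp
// Hypothetical Mega 2560 firmware for the screen tapper: wait for an 'f'
// byte from the OpenCV script, then pulse the hard drive voice coil just
// long enough to register a touch. Pins and timings are assumptions.
const int COIL_PWM = 9;           // PWM input of the motor driver
const int COIL_DIR = 8;           // direction input (push vs. retract)

void setup() {
  pinMode(COIL_PWM, OUTPUT);
  pinMode(COIL_DIR, OUTPUT);
  Serial.begin(115200);           // USB link to the vision PC
}

void tap() {
  digitalWrite(COIL_DIR, HIGH);   // swing the arm toward the screen
  analogWrite(COIL_PWM, 200);     // hard enough to register a touch...
  delay(15);                      // ...but only for a few milliseconds
  digitalWrite(COIL_DIR, LOW);    // then actively pull the arm back
  analogWrite(COIL_PWM, 150);
  delay(15);
  analogWrite(COIL_PWM, 0);       // and let it coast
}

void loop() {
  if (Serial.available() && Serial.read() == 'f') tap();
}
```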

If you would like to make flapping a bit more of a physical affair, [Jérémie] created Flappy Bird with Kinect. He wrote a quick Processing sketch which uses the Microsoft Kinect to look for humans flapping their arms. If flapping is detected, a command is sent to an Android tablet. [Jérémie] initially wanted to use Android Debug Bridge (ADB) to send the touch commands, but found it was too laggy for this sort of hardcore gaming. The workaround is to use a serial-connected Arduino as a mouse. The Processing sketch sends a '#' to the Arduino via serial. The Arduino then sends a mouse click to the computer, which is running hidclient. Hidclient finally sends Bluetooth mouse clicks to the tablet. Admittedly, this is a bit of a Rube Goldberg approach, but it does add an Arduino to a Flappy Bird hack, which we think is a perfect pairing.
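The Arduino half of that chain is the easy part to reproduce. Assuming a Leonardo-class board with native USB HID (the post doesn't say exactly which Arduino [Jérémie] used), the whole serial-to-mouse bridge boils down to something like this:

```cpp
// Hypothetical serial-to-mouse bridge, assuming a Leonardo-class board.
// Every '#' from the Processing sketch becomes one left click, which the
// host (running hidclient) then relays to the tablet over Bluetooth.
#include <Mouse.h>

void setup() {
  Serial.begin(9600);   // CDC serial from the Processing sketch
  Mouse.begin();        // enumerate as a USB mouse at the same time
}

void loop() {
  if (Serial.available() && Serial.read() == '#') {
    Mouse.click(MOUSE_LEFT);   // one flap = one click = one flap in-game
  }
}
```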

[Thanks Parker!]


Filed under: Android Hacks, Arduino Hacks, misc hacks
Feb 01

Autonomous Lighting with Intelligence

arduino hacks, kinect, Kinect hacks, lighting, xtion


Getting into home automation usually starts with lighting: hacking your lights to turn on automatically when motion is detected, adding timer controls, or tying everything into an app on your smartphone. [Ken] took things to a completely different level by giving his lighting intelligence.

The system is called 'Myra', and it works by detecting what you're doing in the room; based on that, robotic lights adjust themselves to suit the activity. For example, if you're walking through the room, the system will attempt to illuminate your path as you walk. Other activities are detected as well, like reading a book, watching TV, or just standing still.

At the heart of the 'Myra' system is an RGBD sensor (Microsoft Kinect or Asus Xtion). A PC runs an application that processes the sensor data from the room to determine the current 'activity'. Wireless robotic lights are strategically placed around the room, each with a 2-servo system and a standalone Arduino. The PC sends each light a command containing the angles for the two axes and the light intensity. The lights receive this command wirelessly via a 315 MHz receiver, and the Arduino then 'aims' the beam accordingly.
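[Ken] hasn't published the light firmware, but the receiving end of a setup like this is straightforward. Here is a speculative sketch assuming a three-byte pan/tilt/intensity packet and the RadioHead ASK driver for the 315 MHz receiver. One caveat: on AVR boards both RH_ASK and the Servo library default to Timer1, so RadioHead has to be switched to Timer2 (the RH_ASK_ARDUINO_USE_TIMER2 option in RH_ASK.cpp) for the two to coexist:

```cpp
// Hypothetical firmware for one robotic light: a 3-byte packet
// (pan, tilt, intensity) arrives over the 315 MHz ASK receiver and the two
// servos plus the lamp driver follow it. Packet format, pins, and the use
// of RadioHead are assumptions; [Ken]'s actual code isn't published.
// Note: build RadioHead with RH_ASK_ARDUINO_USE_TIMER2 so it doesn't
// fight the Servo library for Timer1 on AVR boards.
#include <RH_ASK.h>
#include <SPI.h>      // RadioHead needs this include to compile
#include <Servo.h>

RH_ASK radio(2000, 11);   // 2000 bps, receiver data pin on D11
Servo pan, tilt;
const int LIGHT_PWM = 6;  // PWM to the lamp driver

void setup() {
  pan.attach(9);
  tilt.attach(10);
  radio.init();
}

void loop() {
  uint8_t buf[3];
  uint8_t len = sizeof(buf);
  if (radio.recv(buf, &len) && len == 3) {
    pan.write(constrain(buf[0], 0, 180));    // pan angle in degrees
    tilt.write(constrain(buf[1], 0, 180));   // tilt angle in degrees
    analogWrite(LIGHT_PWM, buf[2]);          // beam intensity, 0-255
  }
}
```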

This isn't the first time we've seen [Ken's] work; a couple of years ago we saw his one-of-a-kind 'real life' weather display. The 'Myra' system is still a work in progress, so we can't wait to see how it ends up. Be sure to check out the video after the break for a demo of the system.


Filed under: Arduino Hacks, Kinect hacks
Aided by affordable materials, 3D printers, and open source technology, the merging of human and machine is a thriving subset of the maker community. Next week's World Maker Faire New York will showcase a number of these projects and the makers behind them. These projects are also a testament to the best impulses of human nature: once we possess new skills and technology, we look for ways to use them as a force for good and to share them with others.

Read more on MAKE


