
Planet Arduino

Archive for the ‘robots hacks’ Category

[Ivan] seems to enjoy making 3D printed vehicles with tracks. His latest one uses 50 servo motors to draw patterns in the sand at the beach. You can see it work in the video below. Well, more accurately you can see it not work and then work as the first iteration didn’t go exactly as planned.

An Arduino Mega 2560 provides the brains and the whole unit weighs in at almost 31 pounds, including the batteries. We didn't see [Ivan]'s design files, although it wouldn't be hard to do your own take on the robot.

Speaking of the weight, we were amused at [Ivan’s] quick and dirty trailer he built to haul the thing around. We wondered if he had those wheels sitting around or if he had to source them from somewhere for this project.

The robot more or less moves in a straight line and the servos either drag a pointy part into the sand or lift the pointy part up so the sand is undisturbed in that area. The robot isn’t perfect. Not only did it not work the first time, but it also looked like it dropped at least one pointy part during the second test run. The tracks seemed to provide good traction, but we would not want to bet that the motion was completely straight.
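
If you wanted to roll your own, the servo side is the easy part. Here's a minimal sketch of the idea, assuming the sand "pens" hang off PCA9685 servo driver boards; [Ivan] hasn't published his code, so the I2C addresses, pulse widths, and pattern format below are purely our guesses:

```cpp
// Hypothetical sketch: raising and lowering a row of sand pens with PCA9685
// servo driver boards. Board count, addresses, and pulse widths are assumptions,
// not taken from [Ivan]'s build.
#include <Wire.h>
#include <Adafruit_PWMServoDriver.h>

const uint8_t NUM_BOARDS = 4;                 // 4 x 16 channels = 64 possible servos
const uint8_t NUM_PENS   = 50;                // only 50 are populated
const uint16_t PEN_UP    = 200;               // ~1.0 ms pulse at 50 Hz, tune per servo
const uint16_t PEN_DOWN  = 400;               // ~2.0 ms pulse, drags the point in the sand

Adafruit_PWMServoDriver boards[NUM_BOARDS] = {
  Adafruit_PWMServoDriver(0x40), Adafruit_PWMServoDriver(0x41),
  Adafruit_PWMServoDriver(0x42), Adafruit_PWMServoDriver(0x43)
};

// One row of the pattern: a '1' means that pen digs into the sand.
const char row[NUM_PENS + 1] =
  "10101010101010101010101010101010101010101010101010";

void setPen(uint8_t pen, bool down) {
  boards[pen / 16].setPWM(pen % 16, 0, down ? PEN_DOWN : PEN_UP);
}

void setup() {
  for (uint8_t b = 0; b < NUM_BOARDS; b++) {
    boards[b].begin();
    boards[b].setPWMFreq(50);                 // standard 50 Hz servo refresh
  }
}

void loop() {
  // As the tracks crawl forward, write the current pattern row into the sand.
  for (uint8_t p = 0; p < NUM_PENS; p++) {
    setPen(p, row[p] == '1');
  }
  delay(1000);                                // placeholder: advance to the next row here
}
```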

On the other hand, it did get the job done. It was a lot of wiring and we suspect that’s why it was made all in one piece. Making it break down into sections would have been nice for transport. You might even be able to make it take a varying number of sections if you did it right. However, it would take a lot of connectors and a way for those connectors to support the weight of the beam, so that would be a much tougher problem.

We wish the design files were posted, but we still thought this was a neat enough idea and easy enough to figure out. We aren’t likely to build a 30-pound robot, but we might think about replicating it on a smaller scale to take to our local beach next summer.

We couldn’t help but remember Skryf, the robot that didn’t draw in the sand but drew with sand. Then there’s also SandBot.

[Harrison] has been busy finding the sweeter side of quarantine by building a voice-controlled, face-tracking M&M launcher. Not only does this carefully designed candy launcher have control over the angle, direction, and velocity of its ammunition, it also locates and locks on to targets by itself.

Here comes the science: [Harrison] tricked Alexa into thinking the Raspberry Pi inside the machine is a smart TV named [Chocolate]. He just tells an Echo to increase the volume by however many candy-colored projectiles he wants launched at his face. Simply knowing the secret language isn’t enough, though. Thanks to a little face-based security, you pretty much have to be [Harrison] or his doppelgänger to get any candy.

The Pi takes a picture, looks for faces, and rotates the turret base in that direction using three servos driven by Arduino Nanos. Then the Pi does facial landmark detection to find the target’s mouth hole before calculating the perfect parabola and firing. As [Harrison] notes in the excellent build video below, this machine uses a flywheel driven by a DC motor instead of being spring-loaded. M&Ms travel a short distance from the chute and hit a flexible, spinning disc that flings them like a pitching machine.
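
The turret end of a rig like this doesn't need much brainpower. Here's a rough sketch of how a Nano could take aim commands from the Pi over serial; the pins, angles, and the comma-separated message format are our own assumptions, not [Harrison]'s actual protocol:

```cpp
// Hypothetical sketch for the Nano side of the turret: the Pi does the face
// detection and sends "pan,tilt,fire" lines over USB serial. Pins, ranges,
// and the message format are illustrative guesses, not [Harrison]'s code.
#include <Servo.h>

Servo panServo, tiltServo, feedServo;
const int FLYWHEEL_PIN = 5;            // flywheel DC motor through a driver (assumed)

void setup() {
  Serial.begin(115200);
  panServo.attach(9);
  tiltServo.attach(10);
  feedServo.attach(11);
  pinMode(FLYWHEEL_PIN, OUTPUT);
}

void loop() {
  // Expect lines such as "90,45,1\n": pan angle, tilt angle, fire flag.
  if (Serial.available()) {
    int pan  = Serial.parseInt();
    int tilt = Serial.parseInt();
    int fire = Serial.parseInt();
    if (Serial.read() == '\n') {
      panServo.write(constrain(pan, 0, 180));
      tiltServo.write(constrain(tilt, 0, 180));
      if (fire) {
        analogWrite(FLYWHEEL_PIN, 255);   // spin up the pitching disc
        feedServo.write(120);             // push one M&M into the flywheel
        delay(300);
        feedServo.write(60);              // retract the feeder
        analogWrite(FLYWHEEL_PIN, 0);
      }
    }
  }
}
```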

We would understand if you didn’t want your face involved in a build with Alexa. It’s okay — you can still have a voice-controlled candy cannon.

After this pandemic thing is all said and done, historians will look back on this period from many different perspectives. The one we’re most interested in, of course, will concern the creativity that flourished in the petri dish of anxiety, stress, and boredom that have come as unwanted side dishes to stay-at-home orders.

[Hunter Irving] and his brother were really missing their friends, so they held a very exclusive hackathon and built a terrifying telepresence robot that looks like a mash-up of Wilson from Cast Away and that swirly-cheeked tricycle-riding thing from the Saw movies. Oh, and to make things even worse, it’s made of glow-in-the-dark PLA.

Now when they video chat with friends, TELEBOT is there to make it feel as though that person is in the room with them. The Arduino Uno behind its servo-manipulated vintage doll eyes uses the friend’s voice input to control the wind-up teeth based on their volume levels. As you might imagine, their friends had some uncanny valley issues with TELEBOT, so they printed a set of tiny hats that actually do kind of make it all better. Check out the build/demo video after the break if you think you can handle it.
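
For anyone tempted to build their own talking horror, the jaw control can be as simple as mapping a volume reading to a servo angle. The sketch below assumes the friend's audio reaches the Uno as an analog envelope on A0 (say, from a cheap sound-sensor module); the pins and thresholds are guesses rather than [Hunter]'s actual code:

```cpp
// Hypothetical sketch mapping an incoming voice level to the wind-up teeth.
// Assumes the remote friend's audio arrives as an analog envelope on A0;
// pin numbers, thresholds, and jaw angles are placeholders.
#include <Servo.h>

Servo jawServo;
const int LEVEL_PIN  = A0;    // audio envelope / sound-sensor output (assumed)
const int JAW_CLOSED = 20;    // servo angle with the teeth shut
const int JAW_OPEN   = 70;    // servo angle with the teeth fully open

void setup() {
  jawServo.attach(9);
  jawServo.write(JAW_CLOSED);
}

void loop() {
  // Take a short burst of readings and keep the loudest one as the "volume".
  int peak = 0;
  for (int i = 0; i < 50; i++) {
    peak = max(peak, analogRead(LEVEL_PIN));
  }
  // Louder speech opens the jaw wider; silence lets it close again.
  int angle = map(constrain(peak, 50, 600), 50, 600, JAW_CLOSED, JAW_OPEN);
  jawServo.write(angle);
  delay(30);
}
```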

Not creepy enough for you? Try building your own eyes from the ground up.


First the robots took our jobs, then they came for our video games. This dystopian future is brought to you by [Little French Kev] who designed this adorable 3D-printed robot arm to interface with an Xbox One controller joystick. He shows it off in the video after the break, controlling a ball-balancing physics demonstration written in Unity.

Hats off to him on the quality of the design. There are two parts that nestle the knob of the thumbstick from either side. He mates those pieces with each other using screws, firmly hugging the stick. Bearings are used at the joints for smooth action of the two servo motors that control the arm. The base of the robotic appendage is zip-tied to the controller itself.

The build targets experimentation with machine learning. Since the computer can control the arm via an Arduino, and the computer has access to metrics of what’s happening in the virtual environment, it’s a perfect setup for training a neural network. Are you thinking what we’re thinking? This is the beginning of hardware speed-running your favorite video games like [SethBling] did for Super Mario World half a decade ago. It will be even more impressive since this would be done by automating the mechanical bit of the controller rather than operating purely in the software realm. You’ll just need to do your own hack to implement button control.
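
The Arduino's role in a rig like this can stay almost trivially small. Something along these lines would do, assuming the PC streams two servo angles per update over serial; the line format and pin choices are our own invention, not [Little French Kev]'s code:

```cpp
// Hypothetical Arduino side of the link: the PC (Unity or the learning agent)
// sends two servo angles per frame and the Arduino just applies them.
// The "X,Y\n" line format and pin choices are assumptions for illustration.
#include <Servo.h>

Servo stickX, stickY;   // the two joints that tilt the thumbstick

void setup() {
  Serial.begin(115200);
  stickX.attach(9);
  stickY.attach(10);
  stickX.write(90);     // start with the stick centered
  stickY.write(90);
}

void loop() {
  if (Serial.available()) {
    int x = Serial.parseInt();
    int y = Serial.parseInt();
    if (Serial.read() == '\n') {
      stickX.write(constrain(x, 0, 180));
      stickY.write(constrain(y, 0, 180));
    }
  }
}
```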

Building a robot arm is fun, but no longer the challenge it once was. You can find lots of plans and kits, and driving the motors is a solved problem. However, there is always one decision you have to make that can be a challenge: what effector to put on the end of it. If you are [MertArduino], the answer is to put suction at the end. If you need to grab the right things, this could be just the ticket for reliably lifting and letting go. You can see a video of the arm in action below.

The arm itself is steel with four servo motors and comes in a kit. The video shows the arm making a sandwich under manual control. We suspect he might have put it under Arduino control but there’s no sudo for making sandwiches.

An air pump and a solenoid valve round out the arm. An Arduino reads some pots to control the servo motors on the arm. However, the air pickup is manually controlled. It wouldn’t be very hard to use a FET or a transistor to put that under Arduino control, as well.
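
If you wanted to take that last step, a sketch along these lines would cover both the pot-driven servos and a MOSFET-switched valve. The pin assignments, the pushbutton, and the MOSFET itself are our additions rather than anything from [MertArduino]'s build:

```cpp
// Hypothetical sketch: four pots steer the four arm servos, and, following the
// suggestion above, a logic-level MOSFET on pin 7 switches the solenoid valve
// so the suction pickup also comes under Arduino control. All pins are guesses.
#include <Servo.h>

const int NUM_JOINTS = 4;
const int POT_PINS[NUM_JOINTS]   = {A0, A1, A2, A3};
const int SERVO_PINS[NUM_JOINTS] = {3, 5, 6, 9};
const int VALVE_PIN  = 7;     // MOSFET gate driving the solenoid valve
const int BUTTON_PIN = 2;     // pushbutton to toggle suction (active low)

Servo joints[NUM_JOINTS];

void setup() {
  for (int i = 0; i < NUM_JOINTS; i++) {
    joints[i].attach(SERVO_PINS[i]);
  }
  pinMode(VALVE_PIN, OUTPUT);
  pinMode(BUTTON_PIN, INPUT_PULLUP);
}

void loop() {
  // Each pot position maps directly to one joint angle.
  for (int i = 0; i < NUM_JOINTS; i++) {
    joints[i].write(map(analogRead(POT_PINS[i]), 0, 1023, 0, 180));
  }
  // Hold the button to open the valve and grip; release to let go.
  digitalWrite(VALVE_PIN, digitalRead(BUTTON_PIN) == LOW ? HIGH : LOW);
  delay(20);
}
```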

This made us think of air tweezer designs we’ve seen in the past. We also wondered if the arm was robust enough for a pick and place setup.

Fighting fire with robots may take jobs away from humans, but it can also save lives. [Mell Bell Electronics] has built a (supervised) kid-friendly version of a firefighting robot that extinguishes flames by chasing them down and blowing them out.

This hyper-vigilant robot is always on the lookout for fire, and doesn’t waste movement on anything else. As soon as it detects the presence of a flame, it centers itself on the source and speeds over to snuff it out with a fan made from a propeller and a DC motor.

Here comes the science: fire emits infrared light, and hobbyist flame sensors use IR to, well, detect fire. This fire bot has three of these flame sensors across the front that output digital data to what has got to be the world’s smallest Arduino – the ATmega32U4-based PICO board that [Mell Bell] just so happens to sell. Cover your mouth and nose and crawl along the floor toward the break to see how responsive this thing is.
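
The chase-and-blow logic itself is short enough to sketch from the description alone. Everything below, from the pin numbers to the two-pins-per-motor driver wiring, is assumed rather than lifted from [Mell Bell]'s code:

```cpp
// Hypothetical sketch of the firefighting behavior: three digital flame sensors
// (left, center, right) steer the drive motors toward the fire, and the fan
// spins whenever the flame is dead ahead. Pin numbers and the motor-driver
// wiring are assumptions, not from the original build.
const int FLAME_L = 2, FLAME_C = 3, FLAME_R = 4;   // LOW = flame seen (typical modules)
const int LEFT_FWD = 5, LEFT_REV = 6;
const int RIGHT_FWD = 9, RIGHT_REV = 10;
const int FAN_PIN = 11;

void drive(int leftDir, int rightDir) {            // 1 = forward, -1 = reverse, 0 = stop
  digitalWrite(LEFT_FWD,  leftDir  > 0);
  digitalWrite(LEFT_REV,  leftDir  < 0);
  digitalWrite(RIGHT_FWD, rightDir > 0);
  digitalWrite(RIGHT_REV, rightDir < 0);
}

void setup() {
  int inputs[]  = {FLAME_L, FLAME_C, FLAME_R};
  int outputs[] = {LEFT_FWD, LEFT_REV, RIGHT_FWD, RIGHT_REV, FAN_PIN};
  for (int p : inputs)  pinMode(p, INPUT);
  for (int p : outputs) pinMode(p, OUTPUT);
}

void loop() {
  bool left   = digitalRead(FLAME_L) == LOW;
  bool center = digitalRead(FLAME_C) == LOW;
  bool right  = digitalRead(FLAME_R) == LOW;

  if (center) {
    drive(1, 1);                       // flame straight ahead: charge and blow
    digitalWrite(FAN_PIN, HIGH);
  } else if (left) {
    drive(-1, 1);                      // pivot left toward the flame
    digitalWrite(FAN_PIN, LOW);
  } else if (right) {
    drive(1, -1);                      // pivot right toward the flame
    digitalWrite(FAN_PIN, LOW);
  } else {
    drive(0, 0);                       // nothing to see: stay put
    digitalWrite(FAN_PIN, LOW);
  }
  delay(20);
}
```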

Firefighters aren’t the only brave humans involved in the process of keeping the forests standing, or who feel the rising pressure of automation. Hackaday’s own [Tom Nardi] wrote a piece on a dying breed called fire lookouts that will no doubt ignite your interest.

[Will] wanted to build some animatronic eyes that didn’t require high-precision 3D printing. He wound up with a forgiving design that uses an Arduino and six servo motors. You can see the eyes moving around in the video below.

The bill of materials is pretty simple and features an Arduino, a driver board, and a joystick. The 3D-printed parts are easy to print without supports and will work with PLA. Other than opening up holes there wasn’t much post-processing required, though he did sand the actual eyeballs, which sounds painful.

The result is a nice tight package to hold six motors, and the response time of the eye motion is very impressive. This would be great as part of a prop or even a robot in place of the conventional googly eyes.
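
If you want a starting point for the joystick control, something like this would do the job. Which servo handles which motion, the pins, and the angle limits are all guesses on our part; [Will]'s own driver board and code may differ:

```cpp
// Hypothetical sketch for the joystick control: X/Y move the eyes, the stick's
// push button blinks. Servo roles, pin mapping, and angle limits are assumed.
#include <Servo.h>

Servo panServo, tiltServo;
Servo lids[4];                                 // upper/lower lid servos, left and right
const int LID_PINS[4]  = {5, 6, 10, 11};
const int LID_OPEN[4]  = {60, 120, 60, 120};   // per-servo "open" angles (tune to taste)
const int LID_SHUT[4]  = {110, 70, 110, 70};
const int JOY_X = A0, JOY_Y = A1, JOY_BTN = 2;

void setup() {
  panServo.attach(3);
  tiltServo.attach(9);
  for (int i = 0; i < 4; i++) {
    lids[i].attach(LID_PINS[i]);
    lids[i].write(LID_OPEN[i]);
  }
  pinMode(JOY_BTN, INPUT_PULLUP);
}

void loop() {
  // Joystick position maps to a limited gaze range so the eyes never over-travel.
  panServo.write(map(analogRead(JOY_X), 0, 1023, 60, 120));
  tiltServo.write(map(analogRead(JOY_Y), 0, 1023, 70, 110));

  // Pressing the stick closes all four lids for a blink.
  bool blink = digitalRead(JOY_BTN) == LOW;
  for (int i = 0; i < 4; i++) {
    lids[i].write(blink ? LID_SHUT[i] : LID_OPEN[i]);
  }
  delay(15);
}
```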

While the joystick is nice, we’d like to see an ultrasonic sensor connected so the eyes track you as you walk across the room. Maybe they could be mounted behind an old portrait for next Halloween. Then again, perhaps a skull would be even better. If you want a refresher about servos, start with a laser turret tutorial.

Building a future where robots work alongside humans relies heavily on soft robotics. Typically this means there will be an air compressor or a hydraulic system nearby, taking up precious space. But it doesn’t have to.

Engineers at the UC San Diego Jacobs School of Engineering have created a soft robotics system that uses electricity to control flexible actuators, much like our brains move our muscles. It works like this: sheets of heat-sensitive liquid crystal elastomer are sandwiched between two layers of standard elastomer. These layers are rolled into cylinders that can twist and bend in different directions depending on which of their six elements get electricity. Light up all six, and the tube contracts, forming the foundation for a good gripper. The team also built a tiny walker, pictured above.

The project is still in its infancy, so the actuators are slow to bend and even slower to return to their original shape, but it’s still a great start. Imagine all the soft robotic projects that can get off the ground without being shackled by the bulk and weight of an air compressor or fluid handling system. Watch it do various sped-up things after the break, like claw-machine gripping a bottle of chocolate rocks.

Speaking of delicious candy, edible soft robotics is totally a thing.

Via Arduino blog

[Miller] wanted to practice a bit with some wireless modules and wound up creating a robotic hand he could teleoperate with the help of a haptic glove. It looks highly reproducible, as you can see in the video below the break.

The glove uses an Arduino’s analog to digital converter to read some flex sensors. Commercial flex sensors are pretty expensive, so he experimented with some homemade sensors. The ones with tin foil and graphite didn’t work well, but using some bent can metal worked better despite not having good resolution.

The wireless communications setup was pretty easy thanks to the NRF24L01 modules. The hard part was sewing the flex sensors into the glove. We thought some of the circuitry looked precarious on the glove, too.
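
The glove side really boils down to a handful of analogRead calls and one radio write. Here's a rough transmitter sketch using the common RF24 library; the pin wiring, pipe address, and five-finger packet layout are our assumptions, not [Miller]'s code:

```cpp
// Hypothetical glove-side sketch: read the flex sensors as voltage dividers on
// the analog pins and push the values out over an NRF24L01 with the RF24
// library. Wiring, pipe address, and packet layout are assumptions.
#include <SPI.h>
#include <nRF24L01.h>
#include <RF24.h>

RF24 radio(9, 10);                        // CE, CSN (assumed wiring)
const byte PIPE_ADDRESS[6] = "hand1";
const int FINGER_PINS[5] = {A0, A1, A2, A3, A4};

struct GlovePacket {
  int fingers[5];                         // raw 0-1023 readings, one per finger
};

void setup() {
  radio.begin();
  radio.openWritingPipe(PIPE_ADDRESS);
  radio.stopListening();                  // this node only transmits
}

void loop() {
  GlovePacket packet;
  for (int i = 0; i < 5; i++) {
    packet.fingers[i] = analogRead(FINGER_PINS[i]);
  }
  radio.write(&packet, sizeof(packet));   // receiver maps these to finger movement
  delay(20);
}
```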

For the robot hand, he used balsa wood and hinges for each joint. Flexible thread provided the return force, like a spring. The hand was surprisingly artistic in a primitive sort of way.

While this is a cool demo, the hand isn’t likely to be practical for much as it is. Nerve impulses are better but harder. The glove reminded us a little of one we’d seen before.

If you have a Roomba, you know they are handy. However, they do have a habit of getting into places you’d rather they avoid. You can get virtual walls, which are just little IR beacons, but it is certainly possible to roll your own. That’s what [MKme] did and it was surprisingly simple, although it could be the springboard to something more complicated. You can see a video about the build below.

As Arduino projects go, this could hardly be simpler: an IR LED, a resistor, and a handful of code that calls into an IR remote library. If that’s all you wanted, the Arduino is a bit overkill, although it is certainly easy enough and cheap.
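
[MKme] leaned on an IR remote library, but just to show how little is going on, here is a bare-bones bit-banged version. The 162 "virtual wall" code and the 1 ms / 3 ms bit timing come from commonly published reverse engineering of the Roomba remote protocol, so treat them as assumptions to verify against your own robot:

```cpp
// A minimal bit-banged take on the virtual wall. The 38 kHz carrier is standard
// for consumer IR receivers; the 162 code and the 1 ms / 3 ms bit timing are
// assumptions based on hobbyist write-ups of the Roomba remote protocol.
const int IR_PIN = 3;                      // IR LED through a current-limiting resistor
const byte VIRTUAL_WALL = 162;

// Emit the 38 kHz carrier for roughly 'microseconds'.
void carrier(unsigned long microseconds) {
  unsigned long end = micros() + microseconds;
  while (micros() < end) {
    digitalWrite(IR_PIN, HIGH);
    delayMicroseconds(13);
    digitalWrite(IR_PIN, LOW);
    delayMicroseconds(13);
  }
}

void sendByte(byte value) {
  for (int bit = 7; bit >= 0; bit--) {     // MSB first
    if (value & (1 << bit)) {
      carrier(3000); delayMicroseconds(1000);   // "1" = long mark, short space
    } else {
      carrier(1000); delayMicroseconds(3000);   // "0" = short mark, long space
    }
  }
}

void setup() {
  pinMode(IR_PIN, OUTPUT);
}

void loop() {
  sendByte(VIRTUAL_WALL);                  // keep repeating so the Roomba always sees it
  delay(50);
}
```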

We know that’s not much, but we were impressed with some of the other information associated with the project for future directions. For example, there’s this project that adds an ultrasonic sensor to a Roomba using the serial port built under the handle. The interface and protocol for that port is even nicely documented.

That got us thinking. You could probably use some ultrasonic sensors for two-way communication to do custom walls. For example, you could use one to send a set number of pulses per second and have another device on the Roomba to receive them and count. You could program rules like a particular wall is only really a wall between 8 AM and 5 PM, for example.
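
Purely as a sketch of that counting rule, and leaving the ultrasonic front end as an exercise, the receiver side might look something like this; the expected pulse rate, the input pin, and the stubbed-out clock are all placeholders we made up:

```cpp
// A rough sketch of the counting idea only: it assumes some front end turns each
// received pulse into a rising edge on pin 2, and treats the beacon as a wall
// only when the pulse rate and the time of day both match the rule.
const int PULSE_PIN = 2;
const int WALL_RATE = 10;                      // expected pulses per second (assumed)

volatile unsigned int pulseCount = 0;
void onPulse() { pulseCount++; }

bool duringWorkHours() {
  // Placeholder: a real build would read an RTC module here (8 AM to 5 PM rule).
  return true;
}

void setup() {
  Serial.begin(9600);
  pinMode(PULSE_PIN, INPUT);
  attachInterrupt(digitalPinToInterrupt(PULSE_PIN), onPulse, RISING);
}

void loop() {
  pulseCount = 0;
  delay(1000);                                 // count edges for one second
  bool rateMatches = abs((int)pulseCount - WALL_RATE) <= 1;
  bool treatAsWall = rateMatches && duringWorkHours();
  Serial.println(treatAsWall ? "wall active" : "no wall");
}
```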

We’ve seen some people use the Roomba as a general-purpose robot platform. We still wish we could find a sensor in the DigiKey catalog to help avoid this common problem.


