
Planet Arduino

Archive for the ‘Robotics’ Category

Poisonous plants, like poison ivy, can really ruin your day. In an effort to combat this “green menace,” YouTuber Sciencish decided to create his own quadruped robot.

The robotic dog is equipped with two servos per leg, for a total of eight, which enable it to move its shoulders and elbows back and forth.

An Arduino Uno controller determines leg positions via trigonometric calculation, and when in position, it dispenses weed killer via a relay and aquarium pump setup. The reservoir can also be used to hold other liquids, whether for watering duties or even to provide extra fuel to a fire.
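For the dispensing step, the trigger can be as simple as switching a relay once the legs have settled over the target. Below is a minimal sketch of that idea, assuming an active-high relay module; the pin number and spray duration are placeholders rather than details from Sciencish’s build.

```cpp
// Minimal relay-and-pump trigger: energize the relay to run the aquarium
// pump for a fixed burst. Pin and timing values are assumptions.
const int RELAY_PIN = 7;              // relay module switching the pump
const unsigned long SPRAY_MS = 1500;  // how long to run the pump per burst

void setup() {
  pinMode(RELAY_PIN, OUTPUT);
  digitalWrite(RELAY_PIN, LOW);       // pump off (assumes active-high relay)
}

void sprayTarget() {
  digitalWrite(RELAY_PIN, HIGH);      // relay closes, pump pushes liquid out
  delay(SPRAY_MS);
  digitalWrite(RELAY_PIN, LOW);       // stop the pump
}

void loop() {
  // ...position the legs over the plant first, then:
  sprayTarget();
  while (true) {}                     // demo: spray once and halt
}
```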

If you’d like to build a walking biped robot, this 3D-printed design by Technovation looks like a fantastic place to start. Each leg features three servos that actuate it at the hip, knee, and ankle for a total of six degrees of freedom.

Control is handled by an Arduino Uno board that rides on top of the legs, along with a perfboard to connect to the servos directly.

Movements are calculated via inverse kinematics, meaning one simply has to input the x and z positions, and the Arduino calculates the proper servo angles. The bot is even able to take steps between two and 10 centimeters without falling over.
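As a rough illustration of that step, a two-link planar solve (hip and knee angles from an x/z foot target) looks something like the code below. The link lengths, sign conventions, and level-foot ankle rule are assumptions for the sketch, not Technovation’s actual firmware.

```cpp
#include <math.h>

// Two-link inverse kinematics sketch: x is forward, z is down, both measured
// from the hip joint. Link lengths are placeholder values in millimeters.
const float L1 = 55.0;  // thigh length (assumed)
const float L2 = 55.0;  // shin length (assumed)

bool solveLeg(float x, float z, float &hipDeg, float &kneeDeg, float &ankleDeg) {
  float d2 = x * x + z * z;                            // squared hip-to-foot distance
  float c = (d2 - L1 * L1 - L2 * L2) / (2 * L1 * L2);  // law of cosines for the knee
  if (c < -1.0 || c > 1.0) return false;               // target out of reach
  float knee = acos(c);                                // 0 = straight leg
  float hip  = atan2(x, z) - atan2(L2 * sin(knee), L1 + L2 * cos(knee));
  hipDeg   = degrees(hip);
  kneeDeg  = degrees(knee);
  ankleDeg = -(hipDeg + kneeDeg);                      // keep the foot roughly level
  return true;
}
```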

The Internet has been perhaps more important than ever to keep us connected these days. Available technology, however, apparently wasn’t good enough for brothers Hunter and Josh Irving, who built their own telepresence robot using parts on-hand during their own two-person hackathon.

The robot they came up with, dubbed TELEBOT, features a partially 3D-printed face along with a set of chattering teeth and eyes recycled from an antique doll. An Arduino Uno is used to take audio signals from remote “guests,” simulating their facial expressions with servos that drive its mouth and LED-lit eyes. 
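One way such an audio-to-face mapping can work is to read an audio envelope on an analog pin and scale it onto the jaw servo and eye LEDs. The sketch below only illustrates that idea; the pins, ranges, and update rate are assumptions, not details from the TELEBOT build.

```cpp
#include <Servo.h>

const int AUDIO_PIN = A0;   // envelope of the remote guest's audio (assumed wiring)
const int JAW_PIN   = 9;    // servo driving the chattering teeth
const int EYE_PIN   = 6;    // PWM pin lighting the doll eyes

Servo jaw;

void setup() {
  jaw.attach(JAW_PIN);
  pinMode(EYE_PIN, OUTPUT);
}

void loop() {
  int level = analogRead(AUDIO_PIN);                    // 0-1023 audio envelope
  int angle = constrain(map(level, 0, 1023, 10, 60), 10, 60);
  jaw.write(angle);                                     // louder speech opens the jaw wider
  analogWrite(EYE_PIN, map(level, 0, 1023, 20, 255));   // eyes brighten with the voice
  delay(20);                                            // ~50 updates per second
}
```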

The duo designed TELEBOT’s “body” out of an adjustable lamp for manual movement. And, as an added bonus, the device is capable of glowing in the dark and can be customized with a wizard, cowboy or top hat. 

While it might not be the most comforting robot you’ve ever seen, it looks like a fun build! 

Daniel Hingston wanted to build a four-legged walking robot for several years, and with current coronavirus restrictions he finally got his chance. His 3D-printed robodog, dubbed “GoodBoy,” is reminiscent of a miniature version of Boston Dynamics’ Spot, which helped inspire the project.

It’s extremely clean, with wiring integrated into the legs mid-print. Two micro servos per leg move it in a forward direction, controlled by an Arduino Uno.

Obstacle avoidance is provided by a pair of ultrasonic sensor “eyes,” allowing it to stop when something is in its path. An LDR sensor is also included; when its human minder covers it, GoodBoy presents its paw for shaking.
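The sensing logic might look roughly like this, assuming an HC-SR04-style ultrasonic module and a simple LDR threshold; the pins, distances, and servo angles are placeholders rather than values from GoodBoy itself.

```cpp
#include <Servo.h>

const int TRIG_PIN = 2, ECHO_PIN = 3;  // ultrasonic "eyes"
const int LDR_PIN  = A0;               // light-dependent resistor on the head
const int STOP_CM  = 15;               // halt if something is closer than this
const int DARK_THRESHOLD = 200;        // a covered LDR reads low

Servo pawServo;

long readDistanceCm() {
  digitalWrite(TRIG_PIN, LOW);  delayMicroseconds(2);
  digitalWrite(TRIG_PIN, HIGH); delayMicroseconds(10);
  digitalWrite(TRIG_PIN, LOW);
  long us = pulseIn(ECHO_PIN, HIGH, 30000);   // echo time, ~5 m timeout
  return us / 58;                             // microseconds to centimeters
}

void setup() {
  pinMode(TRIG_PIN, OUTPUT);
  pinMode(ECHO_PIN, INPUT);
  pawServo.attach(9);
}

void loop() {
  long cm = readDistanceCm();
  if (cm > 0 && cm < STOP_CM) {
    // obstacle ahead: the gait routine would freeze here
  } else if (analogRead(LDR_PIN) < DARK_THRESHOLD) {
    pawServo.write(120);                      // hand over the sensor: offer the paw
    delay(1000);
    pawServo.write(60);                       // lower it again
  } else {
    // otherwise the gait routine would take the next step
  }
  delay(50);
}
```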

Be sure to check out a short demo of GoodBoy below! 

Will Cogley, known for his awesome animatronics, has created a robotic mouth that’s already a work of art and could form the basis of something even more amazing. 

The device features an array of servo mechanisms to actuate its jaw, lips, cheeks, and a tongue. The cheek assemblies are particularly interesting, employing two servos each and a linkage system that allows them to move into a variety of positions.

For control, the project uses a Python program to break typed sentences up into individual sounds. It then sends these to an Arduino, which poses the mouth in sequence. Cogley has also experimented with microphone input and hopes to explore motion capture with it in the future.
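On the Arduino side, one plausible arrangement is to receive a code for each sound over serial and look up a stored mouth pose for it. The sketch below illustrates that pattern; the serial protocol, sound codes, and angles are invented for illustration and aren’t taken from Cogley’s project.

```cpp
#include <Servo.h>

// Each incoming character is assumed to stand for one sound; the table maps
// it to a jaw and tongue pose. Codes and angles are made up for the example.
struct Pose { char code; int jawAngle; int tongueAngle; };

const Pose POSES[] = {
  {'a', 70, 90},   // open vowel: wide jaw, neutral tongue
  {'m', 10, 90},   // closed lips
  {'l', 40, 140},  // tongue raised toward the palate
  {'o', 55, 90},   // rounded vowel
};

Servo jaw, tongue;

void setup() {
  Serial.begin(115200);   // fed by the Python program breaking up sentences
  jaw.attach(9);
  tongue.attach(10);
}

void loop() {
  if (Serial.available()) {
    char c = Serial.read();
    for (const Pose &p : POSES) {
      if (p.code == c) {          // pose the mouth for this sound
        jaw.write(p.jawAngle);
        tongue.write(p.tongueAngle);
      }
    }
  }
}
```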

If you’d like to build your own vaguely humanoid robot, but don’t care about it getting around, then look no further than Aster.

The 3D-printed bot is controlled by an Arduino Uno, with a servo shield to actuate its 16 servo motors. This enables it to move its arms quite dramatically, as seen in the video below, along with its head. The legs also appear to be capable of movement, though the robot isn’t meant to walk and is supported by a column in the middle of its structure.
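Driving that many servos from an Uno typically goes through an I2C PWM expander. The sketch below assumes a PCA9685-based 16-channel servo shield (such as Adafruit’s) and its Adafruit_PWMServoDriver library; the post doesn’t specify which shield Aster uses, and the pulse limits are placeholders.

```cpp
#include <Wire.h>
#include <Adafruit_PWMServoDriver.h>

Adafruit_PWMServoDriver pwm;   // default I2C address 0x40

const int SERVO_MIN = 150;     // pulse count for ~0 degrees (tune per servo)
const int SERVO_MAX = 600;     // pulse count for ~180 degrees

void setServo(uint8_t channel, int angle) {
  int pulse = map(angle, 0, 180, SERVO_MIN, SERVO_MAX);
  pwm.setPWM(channel, 0, pulse);        // one of the 16 shield channels
}

void setup() {
  pwm.begin();
  pwm.setPWMFreq(50);                   // standard 50 Hz servo update rate
}

void loop() {
  setServo(0, 30);                      // e.g., swing one arm out
  setServo(1, 150);                     // ...and the other
  delay(1000);
  setServo(0, 90);                      // back to neutral
  setServo(1, 90);
  delay(1000);
}
```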

Aster’s head display is made out of an old smartphone, and in the demo it shows its eyes as green geometric objects, an animated sketch, and then, somewhat shockingly, as different humans. Print files for the project are available here and the design is actually based on the more expensive Poppy Humanoid.

As robotics advance, the future could certainly involve humans and automated elements working together as a team. The question then becomes: how do you design such an interaction? A team of researchers from Purdue University attempts to provide a solution with their GhostAR system.

The setup records human movements for playback later in augmented reality, while a robotic partner is programmed to work around this “ghost.” This enables a user to then plan out how to collaborate with the robot and work out kinks before actually performing a task.

GhostAR’s hardware includes an Oculus Rift headset and IR LED tracking, along with actual robots used in development. Simulation hardware consists of a six-axis Tinkerkit Braccio robot, as well as an Arduino-controlled omni-wheel base that can mount either a robot arm or a camera as needed.

More information on the project can be found in the team’s research paper here.

We present GhostAR, a time-space editor for authoring and acting Human-Robot-Collaborative (HRC) tasks in-situ. Our system adopts an embodied authoring approach in Augmented Reality (AR), for spatially editing the actions and programming the robots through demonstrative role-playing. We propose a novel HRC workflow that externalizes user’s authoring as demonstrative and editable AR ghost, allowing for spatially situated visual referencing, realistic animated simulation, and collaborative action guidance. We develop a dynamic time warping (DTW) based collaboration model which takes the real-time captured motion as inputs, maps it to the previously authored human actions, and outputs the corresponding robot actions to achieve adaptive collaboration. We emphasize an in-situ authoring and rapid iterations of joint plans without an offline training process. Further, we demonstrate and evaluate the effectiveness of our workflow through HRC use cases and a three-session user study.
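The dynamic time warping at the heart of that collaboration model can be sketched generically: compare a captured motion trace against each authored action and pick the closest match. The function below is a textbook DTW distance over 1-D traces in plain C++, not the team’s implementation, which works on richer motion data.

```cpp
#include <vector>
#include <cmath>
#include <algorithm>
#include <limits>

// Classic dynamic time warping distance between two 1-D traces. A live motion
// trace can be compared against each authored action; the robot action paired
// with the lowest-distance match is then played back.
double dtwDistance(const std::vector<double>& a, const std::vector<double>& b) {
  const size_t n = a.size(), m = b.size();
  const double INF = std::numeric_limits<double>::infinity();
  std::vector<std::vector<double>> cost(n + 1, std::vector<double>(m + 1, INF));
  cost[0][0] = 0.0;
  for (size_t i = 1; i <= n; ++i) {
    for (size_t j = 1; j <= m; ++j) {
      double d = std::fabs(a[i - 1] - b[j - 1]);        // local distance
      cost[i][j] = d + std::min({cost[i - 1][j],        // insertion
                                 cost[i][j - 1],        // deletion
                                 cost[i - 1][j - 1]});  // match
    }
  }
  return cost[n][m];
}
```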

While it’s yet to make its premiere, Matt Denton has already built the D-O droid from Star Wars: The Rise of Skywalker using a MKR WiFi 1010 for control, along with a MKR IMU Shield and a MKR Motor Carrier.

The droid scoots around on what appears to be one large wheel, which conceals the Arduino boards as well as other electronics, batteries, and mechanical components. Denton’s wheel design is a bit more complicated mechanically than it first appears, as it’s split into a center section with thin drive wheels on the sides that enable differential steering.
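Differential steering of those two thin drive wheels comes down to mixing a throttle and a steering input into left and right wheel speeds, roughly as below; how the resulting speeds are handed to the MKR Motor Carrier depends on its library, so that part is left out.

```cpp
// Generic differential mix: throttle and steering in the range -100..100
// become per-wheel speeds. Positive steering speeds the left wheel and slows
// the right one, turning the droid to the right.
void mixDifferential(int throttle, int steering, int &left, int &right) {
  left  = constrain(throttle + steering, -100, 100);
  right = constrain(throttle - steering, -100, 100);
}
```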

On top, a cone-shaped head provides sounds and movement, giving the little RC D-O a ton of personality. The droid isn’t quite finished as of the video below, but given how well it works there, the end product should be amazing!

MOREbot is an Arduino-powered educational robotic platform that’s currently available for pre-order. While the base kit is geared (literally and figuratively) towards building a small two-motor robot, MORE Technologies CEO Canon Reeves shows off how it can be reconfigured into an RC zip lining device in the video below.

The project uses the kit’s DC motors for traversing the cable, with the O-rings that normally form the tires removed so the wheels can grip the top of a paracord. Everything is controlled by an Arduino Uno and a motor shield, while a Bluetooth module provides wireless connectivity. Control is via an iPad app, which simply rotates both motors at the same time as needed.
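The control loop for something like this can be very small: read single-character commands from the Bluetooth module over serial and drive both motor channels together. The sketch below assumes the Arduino Motor Shield Rev3 pin mapping and an invented one-letter command scheme; the MOREbot kit’s shield and app protocol may differ.

```cpp
const int DIR_A = 12, PWM_A = 3;   // channel A direction / speed (Rev3 mapping)
const int DIR_B = 13, PWM_B = 11;  // channel B direction / speed

void driveBoth(bool forward, int speed) {
  digitalWrite(DIR_A, forward ? HIGH : LOW);
  digitalWrite(DIR_B, forward ? HIGH : LOW);
  analogWrite(PWM_A, speed);       // both wheels turn together along the cable
  analogWrite(PWM_B, speed);
}

void setup() {
  pinMode(DIR_A, OUTPUT);
  pinMode(DIR_B, OUTPUT);
  Serial.begin(9600);              // Bluetooth module wired to the serial pins
}

void loop() {
  if (Serial.available()) {
    char cmd = Serial.read();
    if (cmd == 'f')      driveBoth(true, 200);   // move along the line
    else if (cmd == 'b') driveBoth(false, 200);  // reverse
    else if (cmd == 's') driveBoth(true, 0);     // stop
  }
}
```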

Since the parts are all modular, Reeves is planning on adding a few other attachments including a GoPro camera mount and perhaps even a servo that lets him drop a payload like a water balloon from it.

For the Warman Design and Build Competition in Sydney last month, Redditor Travman_16 and team created an excellent Arduino-powered entry. The contest involved picking up 20 payloads (AKA balls) from a trough and delivering them to a target trough several feet away in under 60 seconds.

Their autonomous project uses Mecanum wheels to move in any direction, plus a four-servo arm to collect balls in a box-like scoop made out of aluminum sheet. 

An Arduino Mega controls four DC gear motors via four IBT-4 drivers, while a Nano handles the servos. As seen in the video, it pops out of the starting area, sweeps up the balls, and places them in the correct area in an impressive ~15 seconds.
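Mecanum drive boils down to a standard mixing formula that turns strafe, forward, and rotation commands into four wheel speeds, as sketched below; the IBT-4 driver wiring and PWM scaling are left out, and this isn’t the team’s code.

```cpp
// Standard mecanum mix: x = strafe, y = forward, rot = rotation, each -1..1.
// Results are normalized back into -1..1 so no wheel command clips.
void mixMecanum(float x, float y, float rot, float speeds[4]) {
  speeds[0] = y + x + rot;   // front-left
  speeds[1] = y - x - rot;   // front-right
  speeds[2] = y - x + rot;   // rear-left
  speeds[3] = y + x - rot;   // rear-right
  float maxMag = 1.0;
  for (int i = 0; i < 4; i++) maxMag = max(maxMag, (float)fabs(speeds[i]));
  for (int i = 0; i < 4; i++) speeds[i] /= maxMag;
}
```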

It manages to secure all but one ball on this run, and although that small omission was frustrating, the robot was still able to take fifth out of 19 teams. 


