Planet Arduino

MOREbot is an Arduino-powered educational robotic platform that’s currently available for pre-order. While the base kit is geared (literally and figuratively) towards building a small two-motor robot, MORE Technologies CEO Canon Reeves shows off how it can be reconfigured into an RC zip-lining device in the video below.

The project uses the kit’s DC motors for traversing the cable, with the O-rings that normally form the tires taken off in order to grip the top of a paracord line. Everything is controlled by an Arduino Uno and a motor shield, while a Bluetooth module provides wireless connectivity. Control is via an iPad app, which simply rotates both motors at the same time as needed.
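
We haven’t seen the app’s protocol published, but the Arduino side could be as simple as the sketch below. The Adafruit Motor Shield V2 and a serial Bluetooth module are stand-ins for whatever the kit actually ships with:

```
// A minimal sketch of the idea, assuming an Adafruit Motor Shield V2 and a
// Bluetooth module on hardware serial; the real MOREbot shield/app may differ.
#include <Adafruit_MotorShield.h>

Adafruit_MotorShield AFMS = Adafruit_MotorShield();
Adafruit_DCMotor *left  = AFMS.getMotor(1);
Adafruit_DCMotor *right = AFMS.getMotor(2);

void setup() {
  Serial.begin(9600);    // Bluetooth module bridged to hardware serial
  AFMS.begin();
  left->setSpeed(200);
  right->setSpeed(200);
}

void loop() {
  if (Serial.available()) {
    char c = Serial.read();
    // 'f' = drive along the line, 'b' = back up, anything else = stop
    if (c == 'f')      { left->run(FORWARD);  right->run(FORWARD);  }
    else if (c == 'b') { left->run(BACKWARD); right->run(BACKWARD); }
    else               { left->run(RELEASE);  right->run(RELEASE);  }
  }
}
```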

Since the parts are all modular, Reeves is planning on adding a few other attachments including a GoPro camera mount and perhaps even a servo that lets him drop a payload like a water balloon from it.

People who were subscribed to updates on the Alexa Connect Kit (ACK) recently received an email informing them that the kit is now available for sale. When we last covered the ACK back in September of 2018, ‘release’ really meant ‘preview’, and there wasn’t any hardware one could actually purchase.

Over a year later, it seems that we can now finally get our grubby mitts on this kit that should enable us to make any of our projects Alexa-enabled. What this basically means is that one can spend close to 200 US dollars on an Arduino Zero and an Arduino shield-mounted WM-BN-MT-52 module from USI (not listed on their site, though similar to the WM-BN-BM-22?) that integrates a 192 MHz Cortex-M MCU and a WiFi/Bluetooth module, as summarized on the Amazon Developer page for the ACK.

Getting Started with ACK

The idea behind the kit is that one uses the Arduino IDE to program the Cortex-M0+-based Arduino board with the application firmware. The fully assembled kit will listen on the network for any service discovery broadcast from an Alexa app (on a smartphone or similar), responding to such a broadcast with a summary of its capabilities, following the Smart Home Skill API protocol. This is essentially an application of mDNS with DNS-SD (DNS Service Discovery).
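
The ACK SDK handles this handshake for you, but a generic sketch shows the flavor of the mechanism. The example below assumes entirely different hardware (an ESP8266 and its stock mDNS library), purely to illustrate advertising a discoverable service:

```
// Not the ACK SDK: a generic illustration of mDNS/DNS-SD device discovery,
// here on an ESP8266 with the stock ESP8266mDNS library.
#include <ESP8266WiFi.h>
#include <ESP8266mDNS.h>

const char* SSID = "my-network";   // placeholder credentials
const char* PASS = "my-password";

void setup() {
  WiFi.begin(SSID, PASS);
  while (WiFi.status() != WL_CONNECTED) delay(100);

  // Answer discovery queries as "smart-fan.local" and advertise a service
  // with a TXT record describing capabilities, much as an ACK device answers
  // the Alexa app with its Smart Home capabilities.
  MDNS.begin("smart-fan");
  MDNS.addService("http", "tcp", 80);
  MDNS.addServiceTxt("http", "tcp", "capabilities", "power,speed");
}

void loop() {
  MDNS.update();  // keep responding to mDNS queries
}
```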

After the Alexa app on one’s smartphone has found all Alexa-enabled devices, you can then use the Alexa voice interface to control those devices, such as turning them on and off, or adjusting parameters like the speed of a PWM-controlled fan. The Amazon Developer site provides an overview of what kinds of devices are supported by the Alexa system for reference.
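
The actual callbacks are defined by the ACK SDK; the handler below is a made-up name, just to illustrate how little glue sits between an Alexa directive and an analogWrite():

```
// Purely illustrative: the handler name is invented, not part of the ACK SDK.
const int FAN_PIN = 9;  // PWM-capable pin driving the fan

void setup() {
  pinMode(FAN_PIN, OUTPUT);
}

// Imagine the SDK invoking this when Alexa hears "set the fan to 75 percent".
void onPowerLevelChanged(int percent) {
  analogWrite(FAN_PIN, map(percent, 0, 100, 0, 255));
}

void loop() {
  onPowerLevelChanged(75);  // stand-in for a real callback dispatch
  delay(1000);
}
```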

Welcome to the Amazon Walled Garden

Those who already rushed out to get an ACK will have run into the unfortunate realization that the ACK is not merely a fun piece of hardware to play around with. By purchasing it, you are literally signing up to become a part of the Amazon ecosystem, starting with registering an Amazon Developer Account. As noted by the intrepid reporters over at The Register last year, part of the cost of the ACK is you paying for the Amazon cloud services that enable the ACK to work, with Amazon’s Alexa servers doing the heavy lifting of interpreting customer utterances for you.

The WiFi/Bluetooth module that one gets with the ACK also seems rather secretive, with no datasheet or detailed information available on the internet at the time of writing. It appears to be limited to 802.11b/g/n (2.4 GHz single-band) WiFi, with no mention of anything newer than Bluetooth 4.1, meaning it misses out on the improvements that Bluetooth 5 brought to BLE.

On one hand, it’s pretty cool that with a bit of Arduino wrangling and the use of the Alexa Android app (or that Echo in your living room), you can make that smart toaster you have always dreamed of, allowing you to burn toast with a simple voice command. On the other hand, it means that you fully rely on Amazon’s Alexa infrastructure and the continued existence of ACK support.

We Have Been Here Before

Those with a few years of Internet-of-Things news under their belt may remember Apple’s HomeKit, of which the Alexa Connect Kit looks, from a distance at least, to be a carbon copy: Apple-blessed hardware and an SDK that manufacturers had to integrate into their products to enable Smart Home goodness. HomeKit is now pretty much on life support.

Apple HomeKit on iPad, iPhone, and Apple Watch.

Apple decided to throw in the proprietary towel last year, instead joining the Thread Group, which was started by Google and ARM, and which focuses on creating a low-power wireless networking protocol suitable for connecting smart devices within the home. Thread is built around IEEE 802.15.4, which specifies a low-rate wireless personal area network (LR-WPAN); the same standard underlies Zigbee. Networks based on it are low-power, with data rates below 1 Mbps (802.15.4 tops out at 250 kbps in the 2.4 GHz band).

The skeptical view, then, is that WiFi-based home automation like the ACK offers is flogging the same dead horse that Apple did with HomeKit, merely with Alexa instead of Siri. The same skeptic is also likely to note that the Thread protocol is not the open and free panacea some may see it as: one has to be a (paying) member of the Thread Group to have any input on its development, or to be allowed to ship Thread-enabled devices. But if you’re still itching to jump on the Alexa-enabled bandwagon and can live with the spectre of Amazon rule, the door is now open.

If you’ve ever used a VR system and thought what was really missing is the feeling of being hit in the face, then a team of researchers at National Taiwan University may have just the solution.

ElastImpact takes the form of a head-mounted display with two impact drivers situated roughly parallel to one’s eyes for normal — straight-on — impacts, and another that rotates about the front of your face for side blows.

Each impact driver first stretches an elastic band using a gearmotor, then releases it with a micro servo when an impact is required. The system is controlled by an Arduino Mega, along with a pair of TB6612FNG motor drivers. 
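
The firing sequence for a single impactor might look something like the sketch below; pin assignments, servo angles, and wind-up timing are all our own guesses rather than the team’s values:

```
// One impactor's wind-and-release cycle, sketched from the description above.
// The real build drives three of these from a Mega via two TB6612FNG drivers.
#include <Servo.h>

const int AIN1 = 4, AIN2 = 5, PWMA = 6, STBY = 7;  // TB6612FNG channel A (assumed pins)
const int BRAKE_OPEN = 20, BRAKE_CLOSED = 90;      // servo angles (assumed)

Servo brake;

void setup() {
  pinMode(AIN1, OUTPUT); pinMode(AIN2, OUTPUT);
  pinMode(PWMA, OUTPUT); pinMode(STBY, OUTPUT);
  digitalWrite(STBY, HIGH);
  brake.attach(9);
  brake.write(BRAKE_CLOSED);   // brake holds the elastic band
}

void fireImpact() {
  // Run the gearmotor to stretch the elastic band against the brake...
  digitalWrite(AIN1, HIGH); digitalWrite(AIN2, LOW);
  analogWrite(PWMA, 255);
  delay(800);                  // stretch time sets the stored impact energy
  analogWrite(PWMA, 0);
  // ...then release the brake for an instant impact.
  brake.write(BRAKE_OPEN);
  delay(200);
  brake.write(BRAKE_CLOSED);
}

void loop() {
  fireImpact();
  delay(3000);
}
```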

Impact is a common effect in both daily life and virtual reality (VR) experiences, e.g., being punched, hit or bumped. Impact force is instantly produced, which is distinct from other force feedback, e.g., push and pull. We propose ElastImpact to provide 2.5D instant impact on a head-mounted display (HMD) for realistic and versatile VR experiences. ElastImpact consists of three impact devices, also called impactors. Each impactor blocks an elastic band with a mechanical brake using a servo motor and extending it using a DC motor to store the impact power. When releasing the brake, it provides impact instantly. Two impactors are affixed on both sides of the head and connected with the HMD to provide the normal direction impact toward the face (i.e., 0.5D in z-axis). The other impactor is connected with a proxy collider in a barrel in front of the HMD and rotated by a DC motor in the tangential plane of the face to provide 2D impact (i.e., xy-plane). By performing a just-noticeable difference (JND) study, we realize users’ impact force perception distinguishability on the heads in the normal direction and tangential plane, separately. Based on the results, we combine normal and tangential impact as 2.5D impact, and performed a VR experience study to verify that the proposed 2.5D impact significantly enhances realism.

Today, when you get a text, you can respond with a message via an on-screen keyboard. Looking into the future, however, how would you interact unobtrusively with a device that’s integrated into eyeglasses, contacts, or perhaps something else entirely?

TipText is one solution, envisioned by researchers at Dartmouth College, which uses an MPR121 capacitive touch sensor wrapped around one’s index finger as a tiny QWERTY keyboard arranged in a 2×3 grid.

The setup incorporates an Arduino to process inputs on the grid and propose a number of possible words on a wrist-mounted display that the user can select by swiping right with the thumb. A new word is automatically started when the next text entry tap is received, allowing for a typing speed of around 12-13 words per minute.
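
The paper doesn’t publish its firmware, but reading taps off such a grid is straightforward with Adafruit’s MPR121 library; here’s a rough sketch of our own that maps the first six electrodes to 2×3 grid cells:

```
// A rough sketch of the input side (not the researchers' code): an MPR121's
// first six electrodes treated as a 2x3 grid, reporting each new tap.
#include <Wire.h>
#include <Adafruit_MPR121.h>

Adafruit_MPR121 cap = Adafruit_MPR121();
uint16_t lastTouched = 0;

void setup() {
  Serial.begin(9600);
  if (!cap.begin(0x5A)) {      // default I2C address
    Serial.println("MPR121 not found");
    while (1) delay(10);
  }
}

void loop() {
  uint16_t touched = cap.touched();
  for (uint8_t i = 0; i < 6; i++) {
    // report on the rising edge of each electrode
    if ((touched & (1 << i)) && !(lastTouched & (1 << i))) {
      Serial.print("tap at row ");
      Serial.print(i / 3);     // electrodes 0-2 = row 0, 3-5 = row 1
      Serial.print(", col ");
      Serial.println(i % 3);   // each cell covers several QWERTY letters
    }
  }
  lastTouched = touched;
}
```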

Ever hear of Microsoft Soundscape? We hadn’t, either. But apparently it and similar apps like Blindsquare provide people who have vision problems with context about their surroundings. The app is made to run in the background of the user’s mobile device and respond to media controls, but if you are navigating around with a cane, getting to media controls on a phone or even a headset might not be very convenient. [Jazzang] set out to build buttons to control apps like this that could be integrated with a cane or otherwise placed in a convenient location.

There are four buttons of interest: Play/Pause, Next, Back, and Home. There’s also a mute button and an additional button you can use with the phone’s accessibility settings. Each button has a special function for Soundscape. For example, Next will describe the point of interest in front of you. Soundscape runs on an iPhone, so Bluetooth is the obvious choice for creating the buttons.

To simplify things, the project uses an Adafruit Feather nRF52 Bluefruit board. Given that it’s Arduino compatible and provides a Bluetooth Human Interface Device (HID) out of the box, there’s almost nothing else to do for the hardware but wire up the switches and some pull-up resistors. That would make the circuit easy to stick almost anywhere.

Software-wise, things aren’t too hard either. The library provides all the Bluetooth HID device trappings you need, and once that’s set up, it is pretty simple to send keys to the phone. This is a great example of how simple so many tasks have become due to the availability of abstractions that handle all of the details. Since a Bluetooth HID device is just a keyboard, you can probably think of many other uses for this setup with just small changes in the software.
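
Here’s a minimal sketch of the idea using the Bluefruit library’s BLE HID support, assuming a single play/pause button wired between A0 and ground:

```
// Minimal BLE media-key sketch for the Feather nRF52 Bluefruit (our own
// illustration); one button on A0 sends a consumer-control Play/Pause.
#include <bluefruit.h>

BLEDis bledis;           // device information service
BLEHidAdafruit blehid;   // BLE HID (keyboard/consumer control)

const int PLAY_PIN = A0; // button to ground, using the internal pull-up

void setup() {
  pinMode(PLAY_PIN, INPUT_PULLUP);

  Bluefruit.begin();
  bledis.begin();
  blehid.begin();

  // Advertise as a HID keyboard so the phone offers to pair
  Bluefruit.Advertising.addFlags(BLE_GAP_ADV_FLAGS_LE_ONLY_GENERAL_DISC_MODE);
  Bluefruit.Advertising.addAppearance(BLE_APPEARANCE_HID_KEYBOARD);
  Bluefruit.Advertising.addService(blehid);
  Bluefruit.Advertising.addName();
  Bluefruit.Advertising.start(0);   // 0 = advertise forever
}

void loop() {
  if (digitalRead(PLAY_PIN) == LOW) {   // button pressed
    blehid.consumerKeyPress(0x00CD);    // HID consumer usage: Play/Pause
    delay(50);
    blehid.consumerKeyRelease();
    delay(250);                         // crude debounce
  }
}
```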

We covered the Bluefruit back when it first appeared. We don’t know about mounting this to a cane, but we do remember something similar attached to a sword.

While flexible electronics can offer certain advantages, they often have a rather short lifespan and can be difficult if not impossible to repair. As a solution to this conundrum, researchers from Carnegie Mellon University and the University of Tokyo have been exploring a novel material that can fuse itself back together automatically and conduct electricity.

The composite material is called MWCNTs-PBS: multi-walled carbon nanotubes embedded in polyborosiloxane, a flexible self-healing polymer. When two sections need to be attached, they’re simply pressed together and, like magic, they form an electrical and mechanical bond.

Tests performed with the help of an Arduino Mega include pressure and touch sensing, as well as cut detection. It will be interesting to see how this technology advances in the future, perhaps leading to a day when devices just ‘heal’ themselves!
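
The researchers’ test code isn’t public, but cut detection with an Arduino can be as simple as watching the resistance of the composite in a voltage divider. The sketch below is our own illustration, with an assumed 10 kΩ fixed resistor:

```
// Our own illustration, not the researchers' code: the MWCNTs-PBS strip sits
// between A0 and GND, with a fixed resistor from VCC to A0.
const int SENSE_PIN = A0;
const float R_FIXED = 10000.0;   // assumed fixed resistor, ohms
const float VCC = 5.0;

void setup() { Serial.begin(9600); }

void loop() {
  int raw = analogRead(SENSE_PIN);
  float v = raw * VCC / 1023.0;
  if (raw >= 1020) {
    // Composite resistance has gone open-circuit: the trace was cut.
    Serial.println("Cut detected (open circuit)");
  } else {
    // Composite resistance falls under pressure as the nanotube network
    // compresses, so this value doubles as a crude pressure/touch reading.
    float r = R_FIXED * v / (VCC - v);
    Serial.println(r);
  }
  delay(100);
}
```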

Living things in nature have long been utilizing the ability to “heal” their wounds on the soft bodies to survive in the outer environment. In order to impart this self-healing property to our daily life interface, we propose Self-healing UI, a soft-bodied interface that can intrinsically self-heal damages without external stimuli or glue. The key material to achieving Self-healing UI is MWCNTs-PBS, a composite material of a self-healing polymer polyborosiloxane (PBS) and a filler material multi-walled carbon nanotubes (MWCNTs), which retains mechanical and electrical self-healability. We developed a hybrid model that combines PBS, MWCNTs-PBS, and other common soft materials including fabric and silicone to build interface devices with self-healing, sensing, and actuation capability. These devices were implemented by layer-by-layer stacking fabrication without glue or any post-processing, by leveraging the materials’ inherent self-healing property between two layers. We then demonstrated sensing primitives and interactive applications that extend the design space of shape-changing interfaces with their ability to transform, conform, reconfigure, heal, and fuse, which we believe can enrich the toolbox of human-computer interaction (HCI).

For the Warman Design and Build Competition in Sydney last month, Redditor Travman_16 and team created an excellent Arduino-powered entry. The contest involved picking up 20 payloads (AKA balls) from a trough and delivering them to a target trough several feet away in under 60 seconds.

Their autonomous project uses Mecanum wheels to move in any direction, plus a four-servo arm to collect balls in a box-like scoop made out of aluminum sheet. 

An Arduino Mega controls four DC gear motors via four IBT-4 drivers, while a Nano handles the servos. As seen in the video, it pops out of the starting area, sweeps up the balls, and places them in the correct area in an impressive ~15 seconds.

It manages to secure all but one ball on this run, and although that small omission was frustrating, the robot was still able to take fifth out of 19 teams. 

SCARA robots are often used in industrial settings to move components to the proper location. To demonstrate the concept to students, Nicholas Schwankl has come up with a simple unit that employs three servos and 3D-printed parts to dispense 4.5 mm bearings.

The device runs on an Arduino Mega (though an Uno or other model would work) and, as seen in the video below, it twists its ‘shoulder’ and ‘elbow’ joints to position its dispenser tube. Once in place, a micro servo releases a bearing, allowing the tiny steel ball to drop into an empty slot.
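
For the curious, positioning a two-joint arm like this comes down to planar inverse kinematics. Here’s a sketch of the math with made-up link lengths and pin numbers (Schwankl’s actual code is in the write-up):

```
// Two-link planar arm IK, a sketch under assumed dimensions; not the
// project's own code.
#include <Servo.h>
#include <math.h>

const float L1 = 80.0;   // shoulder-to-elbow length, mm (assumed)
const float L2 = 80.0;   // elbow-to-dispenser length, mm (assumed)

Servo shoulder, elbow;

// Compute joint angles to place the dispenser tube at (x, y), in mm.
bool moveTo(float x, float y) {
  float d2 = x * x + y * y;
  float c2 = (d2 - L1 * L1 - L2 * L2) / (2 * L1 * L2);  // cos(elbow angle)
  if (c2 < -1.0 || c2 > 1.0) return false;              // out of reach
  float elbowRad = acos(c2);                            // elbow-down solution
  float shoulderRad = atan2(y, x)
                    - atan2(L2 * sin(elbowRad), L1 + L2 * cos(elbowRad));
  shoulder.write(degrees(shoulderRad));  // assumes 0-180 deg maps to joint range
  elbow.write(degrees(elbowRad));
  return true;
}

void setup() {
  shoulder.attach(9);
  elbow.attach(10);
  moveTo(100.0, 60.0);   // example target
}

void loop() {}
```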

STL files, a parts list, and Arduino code are available in Schwankl’s write-up.

What can you do with ferromagnetic PLA? [TheMixedSignal] used it to give new meaning to the term ‘musicians’ gear’. He’s made a proof of concept for a DIY tone generator, which is the same revolutionary system that made the Hammond organ sing.

Whereas the Hammond has one tonewheel per note, this project uses an Arduino to drive a stepper at varying speeds to produce different notes. Like we said, it’s a proof of concept. [TheMixedSignal] is proving that tonewheels can be printed, pickups can be wound at home, and together they will produce audible frequencies. The principle is otherwise the same — the protruding teeth of the gear induce changes in the magnetic field of the pickup.
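
The math is pleasingly simple: the pickup sees one cycle per tooth, so pitch equals teeth times revolutions per second. One quick way to prototype that with a step/dir stepper driver is to abuse tone() as a step-pulse generator, as in this illustrative sketch (the actual project’s drive code may differ):

```
// Our own illustration: driving a step/dir stepper driver's STEP pin with
// tone() to spin a printed tonewheel at a pitch-determined rate.
const int STEP_PIN = 8;
const int TEETH = 24;            // teeth on the printed tonewheel (assumed)
const int STEPS_PER_REV = 200;   // typical 1.8-degree stepper, full-stepping

// Pickup frequency = TEETH * revolutions per second, so sounding a note at
// f Hz needs a step rate of f * STEPS_PER_REV / TEETH steps per second.
void playNote(float f) {
  unsigned int stepRate = (unsigned int)(f * STEPS_PER_REV / TEETH);
  tone(STEP_PIN, stepRate);
}

void setup() {
  playNote(220.0);  // A3: 220 * 200 / 24 = ~1833 steps/s, ~9.2 rev/s
}

void loop() {}
```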

[TheMixedSignal] fully intends to expand on this project by adding more tone wheels, trying different gear profiles, and replacing the stepper with a brushless motor. We can’t wait to hear him play “Karn Evil 9”. In the meantime, put on those cans and check out the demo/build video after the break.

We don’t have to tell you how great Hammond organs are for making music. But did you know they can also encode secret messages?

Via the Arduino blog.


