
Planet Arduino

Archive for the ‘Nicla’ Category

Having constant, reliable access to a working HVAC system is vital to our way of living, as it provides a steady supply of fresh, conditioned air. In an effort to decrease downtime and maintenance costs from failures, Yunior González and Danelis Guillan have developed a prototype device that leverages edge machine learning to predict issues before they occur.

The duo went with a Nicla Sense ME due to its onboard accelerometer, and after collecting many readings from each of the three axes at a 10Hz sampling rate, they imported the data into Edge Impulse to create the model. This time, rather than using a classifier, they utilized a K-means clustering algorithm — which is great at detecting anomalous readings, such as a motor spinning erratically, compared to a steady baseline.
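As a rough illustration of the idea (not the pair's actual code), a sketch like the one below could score each accelerometer sample by its distance from a learned cluster centroid and flag readings that stray too far from the normal baseline. The centroid, scale factor, and threshold are placeholder values; a real deployment would use the parameters exported by Edge Impulse.

```cpp
#include "Arduino_BHY2.h"

SensorXYZ accel(SENSOR_ID_ACC);

// Hypothetical cluster centroid (in g) learned from normal motor operation
const float centroid[3] = {0.02f, 0.01f, 1.00f};
const float anomalyThreshold = 0.35f; // distance beyond which we flag a fault

float distanceToCentroid(float x, float y, float z) {
  float dx = x - centroid[0], dy = y - centroid[1], dz = z - centroid[2];
  return sqrtf(dx * dx + dy * dy + dz * dz);
}

void setup() {
  Serial.begin(115200);
  BHY2.begin();
  accel.begin();
}

void loop() {
  BHY2.update();
  // Convert raw readings to g (placeholder scale factor)
  float x = accel.x() / 4096.0f;
  float y = accel.y() / 4096.0f;
  float z = accel.z() / 4096.0f;
  if (distanceToCentroid(x, y, z) > anomalyThreshold) {
    Serial.println("Anomaly detected");
  }
  delay(100); // ~10Hz, matching the project's sampling rate
}
```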

Once the Nicla Sense ME had detected an anomaly, it needed a way to send this data somewhere else and generate an alert. González and Guillan’s setup accomplishes the goal by connecting a Microchip AVR-IoT Cellular Mini board to the Sense ME along with a screen, and upon receiving a digital signal from the Sense ME, the AVR-IoT Cellular Mini logs a failure in an Azure Cosmos DB instance where it can be viewed later on a web app.
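On the Sense ME side, flagging a fault can be as simple as driving a shared GPIO line high. A minimal sketch of the receiving side on the AVR-IoT Cellular Mini might look like the following, with the pin choice assumed and the Azure Cosmos DB call left as a stub:

```cpp
const int ALERT_IN = 3;            // assumed pin wired to the Nicla Sense ME
volatile bool anomalyFlag = false;

void onAlert() { anomalyFlag = true; }

void setup() {
  pinMode(ALERT_IN, INPUT);        // assumes an external pulldown on the line
  attachInterrupt(digitalPinToInterrupt(ALERT_IN), onAlert, RISING);
}

void loop() {
  if (anomalyFlag) {
    anomalyFlag = false;
    // ...POST a failure record to Azure Cosmos DB and update the screen...
  }
}
```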

To read more about this predictive maintenance project, check out the pair’s write-up here on Hackster.io.

The post Detecting HVAC failures early with an Arduino Nicla Sense ME and edge ML appeared first on Arduino Blog.

Whether it’s an elf that stealthily watches from across the room or an all-knowing Santa Claus that keeps a list of one’s actions, spying during the holidays is nothing new. But when it comes time to receive presents, the more eager among us might want to know a few days in advance what awaits under the tree, which is what prompted element14 Presents host Milos Rasic to build a robotic ornament equipped with vision and a compact movement system.

On the hardware side, Rasic went with an Arduino Nicla Vision board as it contains a camera and the ability to livestream the video feed over the network. A pair of continuous servo motors allow the mobile robot platform to drive along the ground while another set of servos open the ornament’s trapdoor to expose the wheels and carefully lower it from the tree through a clever system of bands and thread.

The livestreaming portion of the project was based on an existing MJPEG RTP example that exposes a web API endpoint for fetching the latest frame from the Nicla’s onboard camera and delivering it via Wi-Fi. To control the robot, including winching, driving, and toggling the lights, Rasic created a Node-RED interface that sends MQTT messages to the Nicla.
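The command side on the Nicla could resemble the hedged sketch below, built on the ArduinoMqttClient library; the broker address, topic name, credentials, and command strings are invented for illustration rather than taken from Rasic’s build.

```cpp
#include <WiFi.h>
#include <ArduinoMqttClient.h>

WiFiClient wifiClient;
MqttClient mqttClient(wifiClient);

const char broker[] = "192.168.1.50";  // hypothetical Node-RED host
const char topic[]  = "ornament/cmd";  // hypothetical command topic

void setup() {
  WiFi.begin("MyNetwork", "MyPassword"); // placeholder credentials
  while (WiFi.status() != WL_CONNECTED) delay(500);
  mqttClient.connect(broker, 1883);
  mqttClient.subscribe(topic);
}

void loop() {
  int messageSize = mqttClient.parseMessage();
  if (messageSize) {
    String command = "";
    while (mqttClient.available()) command += (char)mqttClient.read();
    // Map incoming commands to actions
    if (command == "lower")  { /* unwind the thread winch */ }
    if (command == "drive")  { /* start the continuous servos */ }
    if (command == "lights") { /* toggle the LEDs */ }
  }
}
```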

To see more about how this creative device was designed, watch Rasic’s video below or read his full write-up here.

The post This Nicla Vision-powered ornament covertly spies on the presents below appeared first on Arduino Blog.

As Jallson Suryo discusses in his project, adding voice controls to our appliances typically involves an internet connection and a smart assistant device such as Amazon Alexa or Google Assistant. This means extra latency, security concerns, and increased expenses due to the additional hardware and bandwidth requirements. This is why he created a prototype based on an Arduino Nicla Voice that can switch up to four outlets using just a voice command.

Suryo gathered a dataset by repeating the words “one,” “two,” “three,” “four,” “on,” and “off” into his phone and then uploaded the recordings to an Edge Impulse project. From here, he split the files into individual words before rebalancing his dataset to ensure each label was equally represented. The classifier model was trained for keyword spotting using settings optimized for the Syntiant NDP120, yielding an accuracy of around 80%.

Apart from the Nicla Voice, Suryo incorporated a Pro Micro board to handle switching the bank of relays on or off. When the Nicla Voice detects the relay number, such as “one” or “three”, it then waits until the follow-up “on” or “off” keyword is detected. With both the number and state now known, it sends an I2C transmission to the accompanying Pro Micro which decodes the command and switches the correct relay.
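The Pro Micro’s side of that link could look something like the sketch below, using the standard Wire library; the I2C address, two-byte message format, and relay pins are assumptions rather than Suryo’s exact protocol.

```cpp
#include <Wire.h>

const int relayPins[4] = {4, 5, 6, 7}; // hypothetical relay driver pins

void receiveCommand(int numBytes) {
  if (numBytes < 2) return;
  uint8_t relay = Wire.read(); // 1-4, from "one" through "four"
  uint8_t state = Wire.read(); // 1 = "on", 0 = "off"
  if (relay >= 1 && relay <= 4) {
    digitalWrite(relayPins[relay - 1], state ? HIGH : LOW);
  }
}

void setup() {
  for (int i = 0; i < 4; i++) pinMode(relayPins[i], OUTPUT);
  Wire.begin(0x08);               // join the bus as a peripheral
  Wire.onReceive(receiveCommand); // decode each two-byte command
}

void loop() {}
```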

To see more about this voice-controlled power strip, be sure to check out Suryo’s Edge Impulse tutorial.

The post Controlling a power strip with a keyword spotting model and the Nicla Voice appeared first on Arduino Blog.

We recently showed you Becky Stern’s recreation of the “computer book” carried by Penny in the Inspector Gadget cartoon, but Stern didn’t stop there. She also built a replica of Penny’s most iconic gadget: her watch. Penny was a trendsetter, rocking a smartwatch decades before the Apple Watch hit the market. Stern’s replica looks just like the cartoon version and even has some of the same features.

The centerpiece of this project is an Arduino Nicla Voice board. The Arduino team designed that board specifically for speech recognition on the edge, which made it perfect for recognizing Penny’s signature “come in, Brain!” voice command. Stern used Edge Impulse to train an AI to recognize that phrase as a wake word. When the Nicla Voice board hears that, it changes the image on the smart watch screen to a new picture of Brain the dog.

The Nicla Voice board and an Adafruit 1.69″ color IPS TFT screen fit inside a 3D-printed enclosure modeled on Penny’s watch from the cartoon. The enclosure even has a clever 3D-printed watch band with links connected by short lengths of fresh filament. Power comes from a small lithium battery that also fits inside the enclosure.
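A minimal sketch of the screen update, assuming an SPI-connected ST7789 panel driven with Adafruit’s GFX and ST7789 libraries: wakeWordDetected() is a placeholder for the Nicla Voice’s keyword match, and a text stand-in replaces the actual bitmap of Brain to keep the example self-contained.

```cpp
#include <Adafruit_GFX.h>
#include <Adafruit_ST7789.h>

#define TFT_CS  10 // assumed wiring
#define TFT_DC   9
#define TFT_RST  8
Adafruit_ST7789 tft(TFT_CS, TFT_DC, TFT_RST);

// Placeholder: in the real build this reflects the NDP120's match result
bool wakeWordDetected() { return false; }

void setup() {
  tft.init(240, 280); // 1.69" ST7789 panel resolution
  tft.fillScreen(ST77XX_BLACK);
}

void loop() {
  if (wakeWordDetected()) {
    // The real watch swaps in a picture of Brain the dog here
    tft.fillScreen(ST77XX_BLACK);
    tft.setTextColor(ST77XX_GREEN);
    tft.setTextSize(4);
    tft.setCursor(40, 120);
    tft.print("BRAIN!");
  }
}
```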

This watch and Stern’s computer book will both be part of an Inspector Gadget display put on by Digi-Key at Maker Faire Rome, so you can see it in person if you attend.

The post Building the OG smartwatch from Inspector Gadget appeared first on Arduino Blog.

When dealing with indoor climate controls, there are several variables to consider, such as the outside weather, people’s tolerance to hot or cold temperatures, and the desired level of energy savings. Windows can make this extra challenging, as they let in large amounts of light/heat and can create poorly insulated regions, which is why Jallson Suryo developed a prototype that aims to balance these needs automatically through edge AI techniques.

Suryo’s smart building ventilation system utilizes two separate boards, with an Arduino Nano 33 BLE Sense handling environmental sensor fusion and a Nicla Voice listening for certain ambient sounds. Rain and thunder noises were uploaded from an existing dataset, split and labeled accordingly, and then used to train a Syntiant audio classification model for the Nicla Voice’s NDP120 processor. Meanwhile, weather and ambient light data was gathered using the Nano’s onboard sensors and combined into time-series samples with labels for sunny/cloudy, humid, comfortable, and dry conditions.
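The Nano side of that data collection might be as simple as printing one CSV row per sample, as in this minimal sketch using the onboard HTS221 and APDS9960 sensors (the windowing and labeling happen later in Edge Impulse):

```cpp
#include <Arduino_HTS221.h>   // temperature and humidity
#include <Arduino_APDS9960.h> // ambient light

void setup() {
  Serial.begin(115200);
  HTS.begin();
  APDS.begin();
}

void loop() {
  int r, g, b, ambient;
  if (APDS.colorAvailable()) {
    APDS.readColor(r, g, b, ambient);
    Serial.print(HTS.readTemperature());
    Serial.print(',');
    Serial.print(HTS.readHumidity());
    Serial.print(',');
    Serial.println(ambient); // one CSV row per sample for dataset capture
  }
  delay(1000);
}
```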

After deploying the boards’ respective classification models, Suryo added code that writes I2C data from the Nicla Voice to the Nano indicating whether rain or thunderstorm sounds are present. If they are, the Nano can automatically close the window via servo motors, while other environmental factors set the position of the blinds. With this multi-sensor technique, a higher level of accuracy can be achieved for more precise control over a building’s windows and blinds, helping to lower HVAC costs.
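A rough sketch of that actuation logic follows; the servo pins, angles, and light range are assumptions, and rainDetected would be set by an I2C receive handler that is omitted here.

```cpp
#include <Servo.h>

Servo windowServo;
Servo blindsServo;
volatile bool rainDetected = false; // set by a Wire.onReceive handler (omitted)

void setup() {
  windowServo.attach(9);  // assumed pins
  blindsServo.attach(10);
}

void updateActuators(int ambientLight) {
  if (rainDetected) {
    windowServo.write(0); // close the window fully
  }
  // Brighter conditions lower the blinds further to cut heat gain
  int blindsAngle = map(constrain(ambientLight, 0, 4000), 0, 4000, 0, 160);
  blindsServo.write(blindsAngle);
}

void loop() {
  // In the full build the light level comes from the APDS9960
  updateActuators(1200); // hypothetical reading
  delay(2000);
}
```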

More information about Suryo’s project can be found here on its Edge Impulse docs page.

The post Improving comfort and energy efficiency in buildings with automated windows and blinds appeared first on Arduino Blog.

Your dog has nerve endings covering its entire body, giving it a sense of touch. It can feel the ground through its paws and use that information to gain better traction or detect harmful terrain. For robots to perform as well as their biological counterparts, they need a similar level of sensory input. In pursuit of that goal, the Autonomous Robots Lab designed TRACEPaw for legged robots.

TRACEPaw (Terrain Recognition And Contact force Estimation Paw) is a sensorized foot for robot dogs that includes all of the hardware necessary to calculate force and classify terrain. Most systems like this use direct sensor readings, such as those from force sensors. But TRACEPaw is unique in that it infers this information from indirect data. The actual foot is a deformable silicone hemisphere. A camera watches the hemisphere and calculates force from the deformation it sees. In a similar way, a microphone listens to the sound of contact and uses that to judge the type of terrain, like gravel or dirt.

To keep TRACEPaw self-contained, Autonomous Robots Lab chose to utilize an Arduino Nicla Vision board. That has an integrated camera, microphone, six-axis motion sensor, and enough processing power for onboard machine learning. Using OpenMV and TensorFlow Lite, TRACEPaw can estimate the force on the silicone pad based on how much it deforms during a step. It can also analyze the audio signal from the microphone to guess the terrain, as the silicone pad sounds different when touching asphalt than it does when touching loose soil.
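TRACEPaw’s force estimate comes from a learned vision model, but the core intuition (more deformation means more force) can be sketched with a simple power-law calibration; the constants below are entirely hypothetical.

```cpp
#include <math.h>

// Hypothetical constants from fitting force against contact-patch radius
const float kGain     = 12.0f;
const float kExponent = 3.0f;

// contactRadiusMm: radius of the flattened patch the camera sees on the
// silicone hemisphere during a step
float estimateForceNewtons(float contactRadiusMm) {
  return kGain * powf(contactRadiusMm, kExponent);
}
```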

More details on the project are available on GitHub.

The post Helping robot dogs feel through their paws appeared first on Arduino Blog.

The traditional method for changing a diaper starts when someone smells or feels that the diaper has been soiled, and while it isn’t the greatest process, removing the soiled diaper as soon as possible is important for avoiding rashes and infections. Justin Lutz has created an intelligent solution to this situation by designing a small device that alerts people over Bluetooth® when the diaper is ready to be changed.

Because a dirty diaper gives off volatile organic compounds (VOCs) and small particulates, Lutz realized he could use the Arduino Nicla Sense ME’s built-in BME688 sensor, which can measure VOCs, temperature/humidity, and air quality. After gathering 29 minutes of gas and air quality measurements in the Edge Impulse Studio for both clean and soiled diapers, he trained a classification model for 300 epochs, resulting in a model with 95% accuracy.
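Reading those quantities on the Nicla Sense ME is straightforward with the Arduino_BHY2 library’s BSEC sensor, as in this minimal sampling sketch (the output format is our own, not Lutz’s):

```cpp
#include "Arduino_BHY2.h"

SensorBSEC bsec(SENSOR_ID_BSEC);

void setup() {
  Serial.begin(115200);
  BHY2.begin();
  bsec.begin();
}

void loop() {
  BHY2.update();
  Serial.print("IAQ: ");
  Serial.print(bsec.iaq());       // indoor air quality index
  Serial.print("  bVOC-eq: ");
  Serial.println(bsec.b_voc_eq()); // breath VOC equivalent
  delay(1000);
}
```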

Based on his prior experience with the Nicla Sense ME’s BLE capabilities and MIT App Inventor, Lutz used the two to devise a small gadget that wirelessly connects to a phone app so it can send notifications when it’s time for a new diaper.
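The BLE side could be structured like the sketch below using the ArduinoBLE library; the UUIDs and device name are placeholders, and the classifier result is stubbed out.

```cpp
#include <ArduinoBLE.h>

BLEService diaperService("180A");                              // placeholder UUID
BLEByteCharacteristic statusChar("2A56", BLERead | BLENotify); // placeholder UUID

void setup() {
  BLE.begin();
  BLE.setLocalName("DiaperMonitor"); // hypothetical device name
  BLE.setAdvertisedService(diaperService);
  diaperService.addCharacteristic(statusChar);
  BLE.addService(diaperService);
  statusChar.writeValue(0);
  BLE.advertise();
}

void loop() {
  BLE.poll();
  bool soiled = false; // would come from the Edge Impulse classifier
  if (soiled) {
    statusChar.writeValue(1); // notify the phone app it's time for a change
  }
}
```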

To read more about this project, you can check out Lutz’s write-up here on the Edge Impulse docs page.

The post This smart diaper knows when it is ready to be changed appeared first on Arduino Blog.

With the planet ever warming due to climate change and prolonged droughts greatly increasing the chance of wildfires, being able to quickly detect when a fire has broken out is vital for responding while it’s still in a containable stage. But one major hurdle to collecting machine learning datasets on these types of events is that they can be quite sporadic. In his proof-of-concept system, engineer Shakhizat Nurgaliyev shows how he leveraged NVIDIA Omniverse Replicator to create an entirely synthetic dataset and then deploy a model trained on that data to an Arduino Nicla Vision board.

The project started out as a simple fire animation inside of Omniverse, which was soon followed by a Python script that produces a pair of virtual cameras and randomizes the ground plane before capturing images. Once enough images had been generated, Nurgaliyev utilized the zero-shot object detection application Grounding DINO to automatically draw bounding boxes around the virtual flames. Lastly, each image was brought into an Edge Impulse project and used to develop a FOMO-based object detection model.

By taking this approach, the model achieved an F1 score of nearly 87% while also only needing a max of 239KB of RAM and a mere 56KB of flash storage. Once deployed as an OpenMV library, Nurgaliyev shows in his video below how the MicroPython sketch running on a Nicla Vision within the OpenMV IDE detects and bounds flames. More information about this system can be found here on Hackster.io.
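Nurgaliyev deployed the model as an OpenMV/MicroPython library; to keep the examples here in one language, this is a rough C++ analogue using the SDK that Edge Impulse exports alongside it, iterating over the FOMO bounding boxes.

```cpp
#include "edge-impulse-sdk/classifier/ei_run_classifier.h"

// signal wraps a captured camera frame in the SDK's expected format
void detectFlames(signal_t *signal) {
  ei_impulse_result_t result;
  if (run_classifier(signal, &result, false) != EI_IMPULSE_OK) return;

  for (size_t i = 0; i < result.bounding_boxes_count; i++) {
    ei_impulse_result_bounding_box_t bb = result.bounding_boxes[i];
    if (bb.value == 0) continue; // skip empty detections
    ei_printf("%s (%.2f) at x=%u y=%u\n",
              bb.label, bb.value, bb.x, bb.y);
  }
}
```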

The post This Nicla Vision-based fire detector was trained entirely on synthetic data appeared first on Arduino Blog.

Despite snoring itself being a relatively harmless condition, those who snore while asleep may also be suffering from sleep apnea, a potentially serious disorder that causes the airway to repeatedly close and block oxygen from getting to the lungs. In an effort to alert those who might be unaware they have sleep apnea, Naveen Kumar devised a small device using an Arduino Pro Nicla Voice to detect when a person is snoring and gently alert them via haptic feedback in their pillow.

Although many boards have microphones and can run sound recognition machine learning models, the Nicla Voice contains a Syntiant NDP120 Neural Decision Processor that is specifically designed to accelerate deep learning workloads while also decreasing the amount of power needed to do so. Apart from the board, Kumar added an Adafruit DRV2605L haptic motor driver and haptic motor as a way to wake up the user without disturbing others nearby.

The model was created by first downloading a snoring dataset that contains hundreds of short samples of either snoring or non-snoring. After adding them to the Edge Impulse Studio, Kumar constructed an impulse from the Syntiant Audio blocks and trained a model that achieved a 94.6% accuracy against the test dataset. The code integrating the model continuously collects new audio samples from the microphone, passes them to the NDP120 for classification, and triggers the haptic motor if snoring is sensed.
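The haptic side maps naturally onto Adafruit’s DRV2605 library, sketched below; the effect number and cooldown are our own choices, and snoreDetected() stands in for the NDP120’s classification result.

```cpp
#include <Wire.h>
#include "Adafruit_DRV2605.h"

Adafruit_DRV2605 drv;

// Placeholder: in the real build this reflects the Syntiant match result
bool snoreDetected() { return false; }

void setup() {
  Wire.begin();
  drv.begin();
  drv.selectLibrary(1);              // use the built-in ERM effect library
  drv.setMode(DRV2605_MODE_INTTRIG); // fire effects from software
}

void loop() {
  if (snoreDetected()) {
    drv.setWaveform(0, 47); // a buzz effect from the ROM library
    drv.setWaveform(1, 0);  // end of sequence
    drv.go();               // gently vibrate the pillow
    delay(2000);            // don't retrigger immediately
  }
}
```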

To read more about this project, you can check out Kumar’s write-up here.

The post A snore-no-more device designed to help those with sleep apnea appeared first on Arduino Blog.


