Adding real-time local voice controls to a SMARS Quad Mod robot with an Arduino RP2040 Connect

Robotics kits like the Screwless/Screwed Modular Assemblable Robotic System (SMARS) are great tools for learning more about how electronics, mechanics, and software can combine to perform useful tasks in the physical world. And in his latest project, Edge Impulse’s senior embedded software engineer Dmitry Maslov shows how he was able to take this platform and give it both speech recognition and Wi-Fi control capabilities using an Arduino Nano RP2040 Connect.

Constructed from an array of 3D-printed parts and eight servo motors, the SMARS Quad Mod robot is a small, blocky quadruped that uses two LiPo battery cells, a step-down converter, and an IO expansion board to move based on simple directional commands such as “forward” and “left.” Normally, these would come from an IR remote or a preprogrammed sequence, but by leveraging AI at the edge, the robot can respond in real time to spoken commands. To achieve this, Maslov imported a dataset containing many samples of people saying directions, along with background noise, and then trained a keyword spotting model on it.

Once exported as a C++ library, the model was embedded into the robot’s sketch. Thanks to the RP2040’s dual-core architecture, the first core continuously reads new data from the microphone, performs inferencing, and sends the result to the second core when available. Then when the value is received, the other core maps the direction to a sequence of leg movements.
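The post doesn’t include the sketch itself, but the second core’s job — turning a recognized keyword into leg movements — amounts to a simple lookup. Everything below (the LegStep struct, servo indices, and angles) is a hypothetical sketch, not Maslov’s actual code:

```cpp
#include <cassert>
#include <cstdint>
#include <string>
#include <vector>

// Hypothetical gait step: which servo to move and the target angle.
struct LegStep { uint8_t servo; uint8_t angle; };

// Map a recognized keyword to a (simplified) sequence of leg movements.
// The real robot drives eight servos; these sequences are placeholders.
std::vector<LegStep> gaitFor(const std::string& keyword) {
    if (keyword == "forward") return {{0, 45}, {2, 45}, {1, 45}, {3, 45}};
    if (keyword == "left")    return {{0, 60}, {1, 30}};
    if (keyword == "right")   return {{1, 60}, {0, 30}};
    if (keyword == "stop")    return {};  // hold position
    return {};                            // unknown keyword: do nothing
}
```

On the RP2040 itself, core 0 would push the winning label’s index through the chip’s inter-core FIFO, and core 1 would pop it and run the matching sequence.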

For more information about this project, check out Maslov’s tutorial on Hackster.io and see the dataset and model in the Edge Impulse Studio.

The post Adding real-time local voice controls to a SMARS Quad Mod robot with an Arduino RP2040 Connect appeared first on Arduino Blog.

Amassing a mobile Minion militia

Channeling his inner Gru, YouTuber Electo built a robotic minion army to terrorize and amuse the public in local shopping malls.

Building one minion robot is, in theory, pretty straightforward. That is especially true when, like these, that robot isn’t actually bipedal and instead rolls around on little wheels attached to the feet. But creating 10 robots is more of a challenge. Assuming a limited budget, the robots would have to be relatively inexpensive. So, how could Electo give them the ability to run around causing mayhem?

Electo’s solution was to make one smart minion, called King Bob, to lead all of the other minions of lesser intelligence. The basic design consists of an Arduino that controls the two drive motors and that can communicate with other Arduino boards via radio transceiver modules. Those components fit inside a 3D-printed shell and this basic minion is pretty affordable to construct.

But King Bob has more advanced hardware and special abilities. He can receive explicit movement commands from Electo’s radio transmitter controller, but also has some intelligence thanks to a single-board computer and a camera. That lets it run a computer vision application to detect and follow specific things that it sees. In this case, that is a banana.

King Bob could follow explicit commands or a banana, but what about the other minions? Electo gave them the ability to follow their leader by simply mimicking its movements. Any movement that King Bob makes is also transmitted over radio to the other minions, so they can make the same movements. This is intentionally clumsy (because minions), but lets the group move together in an entertaining way as they traverse shopping malls and movie theaters.
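The mimicry scheme boils down to broadcasting the leader’s motor commands and replaying them verbatim on each follower. A minimal sketch of that idea, with a made-up packet format (Electo’s actual radio protocol isn’t published):

```cpp
#include <cassert>
#include <cstdint>

// Hypothetical drive packet broadcast by King Bob over the radio link.
struct DrivePacket { int8_t left; int8_t right; uint8_t seq; };

// Leader side: capture whatever King Bob's motors are doing right now.
DrivePacket makePacket(int8_t leftSpeed, int8_t rightSpeed, uint8_t seq) {
    return DrivePacket{leftSpeed, rightSpeed, seq};
}

// Follower side: apply the same speeds, ignoring duplicate packets.
bool applyPacket(const DrivePacket& p, uint8_t& lastSeq,
                 int8_t& left, int8_t& right) {
    if (p.seq == lastSeq) return false;  // already handled this packet
    lastSeq = p.seq;
    left = p.left;
    right = p.right;
    return true;
}
```

Because every follower applies the same packet, the whole group drifts along together — with exactly the clumsy synchronization the video shows.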

The post Amassing a mobile Minion militia appeared first on Arduino Blog.

This beautiful table creates art in the sand

Kinetic sand art tables are pretty hot right now, because they look really cool. They’re like zen gardens that rake themselves in intricate patterns. But most of the builds we’ve seen use a conventional cartesian CNC layout or polar layout. This table by Newsons Electronics takes a different approach inspired by spirograph drawing machines.

A spirograph is a drawing template mechanism made up of at least two gears (and often several). By placing a pen in one of the holes, the user can draw a line that traces the path created by the gear movement. That path varies based on the gear parameters and can be extremely intricate. The geometric beauty is appealing, and this table produces those patterns in sand.
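The curves a two-gear spirograph traces are hypotrochoids, which have a closed form. A small sketch of that math (the table’s actual gear parameters aren’t given, so the values in the test are illustrative):

```cpp
#include <cassert>
#include <cmath>

// Classic spirograph (hypotrochoid) path: a small gear of radius r rolls
// inside a ring of radius R, with the pen offset d from the small gear's
// center:
//   x(t) = (R - r)*cos(t) + d*cos(((R - r)/r) * t)
//   y(t) = (R - r)*sin(t) - d*sin(((R - r)/r) * t)
struct Point { double x, y; };

Point hypotrochoid(double R, double r, double d, double t) {
    double k = (R - r) / r;
    return { (R - r) * std::cos(t) + d * std::cos(k * t),
             (R - r) * std::sin(t) - d * std::sin(k * t) };
}
```

Varying R, r, and d — which is what the table’s rack-and-pinion effectively does — is what produces the different patterns.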

Like other kinetic art tables, this draws in the sand by using a magnet to pull a ball bearing through the sand. In this case, that magnet attaches to a motor-driven spirograph mechanism underneath the table. One motor rotates the mechanism, while another motor actuates a rack-and-pinion that affects the path and ultimately the drawn pattern.

Those are both stepper motors and an Arduino UNO Rev3 board controls them through a stepper shield. The Arduino also controls the LED accent lighting, with potentiometer knobs to adjust brightness and the speed of animated transitions.

Newsons Electronics designed the table’s structure and frame to be made from stacked sheets of plywood cut out with a laser for precision, but it would be possible to make the parts with a CNC router or even a scroll saw. The result is a gorgeous piece of kinetic art.

The post This beautiful table creates art in the sand appeared first on Arduino Blog.

An Arduino-powered robotic ukulele that plays itself

The ukulele has a bit of a reputation for being quaint, but it is a legitimate instrument like any other and that means it takes a lot of practice to play competently. Zeroshot is too busy building cool stuff to bother with all of that, so he put his skills to use constructing this robotic ukulele that plays itself.

Like a guitarist, a ukulelist can play a note by strumming multiple strings at once or by picking individual strings. More exotic techniques are also possible, but uncommon and outside the scope of this project. The key to Zeroshot’s design is the mechanism that can both pick and strum. It does so by using two actuators: a servo motor to lift and drop the pick, and a stepper to slide the pick back and forth perpendicular to the strings.

An Arduino UNO Rev3 board controls those motors through a HiLetgo L293D motor shield, with a TMC2208 driver module for the stepper. The Arduino can lower the pick and strum it across all of the strings, or it can move to a specific string and pluck just that one. 

But it would be limited to only a handful of songs if it could only play open strings, so Zeroshot also needed to add hardware to hold the strings down on the fretboard. He chose solenoids for that job, held in a 3D-printed mount. With power coming from the motor shield, the Arduino can extend the solenoids to play any required notes.

Zeroshot designed the mount to accommodate up to 16 solenoids, for the first four frets across the four strings. When including open strings, that would give the robot up to 20 notes to work with. But a lot of songs only require a handful of solenoids, as Zeroshot demonstrated by performing Celine Dion’s “My Heart Will Go On.”
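That 20-note arithmetic maps naturally onto a (string, fret) addressing scheme for the solenoids. The indexing below is an assumed string-major layout for illustration, not Zeroshot’s actual firmware:

```cpp
#include <cassert>

// Four strings, four fretted positions per string; fret 0 means an open
// string (no solenoid energized). Solenoids are indexed 0..15.
constexpr int kStrings = 4;
constexpr int kFrets = 4;

// Total distinct notes: open + four fretted positions per string.
constexpr int totalNotes() { return kStrings * (kFrets + 1); }

// Solenoid index for a fretted note, or -1 for an open string.
int solenoidIndex(int stringNum, int fret) {
    if (fret == 0) return -1;  // open string: nothing to energize
    return stringNum * kFrets + (fret - 1);
}
```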

The post An Arduino-powered robotic ukulele that plays itself appeared first on Arduino Blog.

Texas Instruments signs preliminary agreement to receive up to $1.6 billion in CHIPS and Science Act proposed funding for semiconductor manufacturing in Texas and Utah

Proposed funding, coupled with an estimated $6 billion to $8 billion in investment tax credit, will help TI provide geopolitically dependable, 300mm capacity for analog and embedded processing semiconductors

This UNO R4 WiFi-controlled device streamlines a restaurant’s online order system

Most successful restaurants operating today have to take advantage of online ordering, as a huge chunk of customers have switched to takeout and delivery. But point-of-sale (POS) systems don’t always integrate well into a kitchen’s workflow and that can lead to missed orders — one of the worst things a restaurant can do. To help streamline a POS for a friend’s fried chicken takeout restaurant, Redditor UncleBobbyTO developed this affordable notification bot.

UncleBobbyTO’s friend uses a Square system in her restaurant, which has an online interface and sends an email for each new order. But the kitchen staff is busy and they sometimes fail to notice the emails. This device solves that problem. It can sit in the kitchen or by the expo window and connects to the Square API, checking for new orders every three minutes. When the device detects a new order, it lights up green and displays basic information about that transaction. Staff can then look up the order and press a button on the device to clear the notification.
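The core of the device is just order-ID bookkeeping: remember which orders have been seen, alert on anything new, and clear the alert on a button press. A host-side sketch of that logic (UncleBobbyTO’s actual code isn’t shown; the names here are illustrative):

```cpp
#include <cassert>
#include <set>
#include <string>
#include <vector>

// Track which Square order IDs have already been seen; the real device
// fetches the IDs from the Square API every three minutes.
struct OrderTracker {
    std::set<std::string> seen;
    bool alertActive = false;

    // Called with each poll's result; raises the alert if anything is new.
    std::vector<std::string> poll(const std::vector<std::string>& orderIds) {
        std::vector<std::string> fresh;
        for (const auto& id : orderIds)
            if (seen.insert(id).second)  // insert() reports true when new
                fresh.push_back(id);
        if (!fresh.empty()) alertActive = true;  // light the LED green
        return fresh;
    }

    // The physical button press clears the notification.
    void acknowledge() { alertActive = false; }
};
```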

That’s all possible because the device contains an Arduino UNO R4 WiFi board, whose built-in Wi-Fi capabilities let it connect to the internet and the Square API. It resides inside a sturdy 3D-printed enclosure that also contains an RGB LED strip and a 16×2 character LCD screen.

Now UncleBobbyTO’s friend can run her restaurant without worrying that staff might miss an order. 

The post This UNO R4 WiFi-controlled device streamlines a restaurant’s online order system appeared first on Arduino Blog.

Reimagining the chicken coop with predator detection, Wi-Fi control, and more

The traditional backyard chicken coop is a very simple structure that typically consists of a nesting area, an egg-retrieval panel, and a way to provide food and water as needed. Realizing that some aspects of raising chickens are too labor-intensive, the Coders Cafe crew decided to automate most of the daily care process by bringing some IoT smarts to the traditional hen house.

Controlled and actuated by an Arduino UNO R4 WiFi and a stepper motor, respectively, the front door of the coop relies on a rack-and-pinion mechanism to quickly open or close at the scheduled times. After the chickens have entered the coop to rest or lay eggs, they can be fed using a pair of fully automatic dispensers. Each one is a hopper with a screw at the bottom that, with the help of gravity, pulls in the food and gently distributes it onto the ground. And like the door, feedings can be scheduled in advance through the team’s custom app and the UNO R4’s integrated Wi-Fi chipset.
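The scheduling behind the door and feeders reduces to time-of-day comparisons. A minimal sketch, using illustrative times rather than whatever defaults the team’s app actually sends:

```cpp
#include <cassert>
#include <vector>

// Decide whether the coop door should be open, given minutes since
// midnight. The open/close times would come from the app over Wi-Fi.
bool doorShouldBeOpen(int nowMin, int openMin, int closeMin) {
    return nowMin >= openMin && nowMin < closeMin;
}

// True when a scheduled feeding falls on the current one-minute tick,
// meaning the screw in the hopper should run once.
bool feedDueNow(int nowMin, const std::vector<int>& feedTimes) {
    for (int t : feedTimes)
        if (t == nowMin) return true;
    return false;
}
```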

The last and most advanced feature is the AI predator detection system. Thanks to a DFRobot HuskyLens vision module and its built-in training process, images of predatory animals can be captured and used to train the HuskyLens on when to generate an alert. Once an animal has been detected, the module tells the UNO R4 over I2C, which, in turn, sends an SMS message via Twilio.

More details about the project can be found in Coders Cafe’s Instructables writeup.

The post Reimagining the chicken coop with predator detection, Wi-Fi control, and more appeared first on Arduino Blog.

Adjusting office chair height with simple voice commands

A month ago, ElectronicLab modified his office chair with an electric car jack, giving it motorized height adjustment. That worked well, but required that he push buttons to raise or lower the seat. Pushing those buttons is a hassle when one’s hands are full, so ElectronicLab went back to the workbench to add voice control capabilities.

ElectronicLab was using an Arduino Nano to control the electric jack motor in response to button presses, so he already had most of the hardware necessary to make the system smarter. He just needed the Arduino to recognize specific voice commands, which he was able to achieve using an ELECHOUSE Voice Recognition Module V3.

That voice recognition module supports up to 80 voice commands, but ElectronicLab only needed a few — just enough to tell the chair which direction to move and how far to go. The module came with a microphone, which ElectronicLab was able to attach outside of the 3D-printed enclosure where it could pick up his voice.
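The module reports which trained record it matched, so the sketch only has to map that record number to a movement. The record numbering and distances below are hypothetical, just to show the dispatch shape:

```cpp
#include <cassert>
#include <cstdint>

// Map an ELECHOUSE V3 record number to a height change in millimeters.
// Positive = raise the seat, negative = lower it. The numbering here is
// an assumption; records are whatever order the commands were trained in.
int heightChangeFor(uint8_t record) {
    switch (record) {
        case 0: return  30;   // "up"
        case 1: return -30;   // "down"
        case 2: return  60;   // "up a lot"
        case 3: return -60;   // "down a lot"
        default: return 0;    // unrecognized: stay put
    }
}
```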

But there was still one problem: the movement was very slow. The jack was designed to lift a car, so it uses a high-torque motor with a 10:1 planetary gearset to drive a hydraulic pump. ElectronicLab didn’t need that much torque, so he welded the planetary gears to give the motor a direct 1:1 ratio. Sadly, that was a mistake. The hydraulic oil can’t flow fast enough to keep up, so the motor pulls way too much current for the driver.

Still, the voice control was a success and so ElectronicLab can simply swap out the motor.

Perhaps in the future ElectronicLab can even consolidate the components using the speech recognition-capable Nano 33 BLE Sense Rev2 or Nano RP2040 Connect.

The post Adjusting office chair height with simple voice commands appeared first on Arduino Blog.

Making fire detection more accurate with ML sensor fusion

The mere presence of a flame in a controlled environment, such as a candle, is perfectly acceptable, but when tasked with determining if there is cause for alarm solely using vision data, embedded ML models can struggle with false positives. Solomon Githu’s project aims to lower the rate of incorrect detections with a multi-input sensor fusion technique wherein image and temperature data points are used by a model to alert if there’s a potentially dangerous blaze.

Gathering both kinds of data is the Arduino TinyML Kit’s Nano 33 BLE Sense. Using the kit, Githu could capture a wide variety of images with the OV7675 camera module and temperature readings with the Nano 33 BLE Sense’s onboard HTS221 sensor. After exporting a large dataset of samples with and without fire, alongside a range of ambient temperatures, he leveraged Google Colab to train the model before importing it into the Edge Impulse Studio. There, the model’s memory footprint was reduced further to fit onto the Nano 33 BLE Sense.

The inferencing sketch polls the camera for a new frame and, once it has been resized, merges its frame data with a fresh sample from the temperature sensor and sends the result through the model, which outputs either “fire” or “safe_environment.” As detailed in Githu’s project post, the system accurately classified several scenarios in which a flame combined with elevated temperatures resulted in a positive detection.
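The fusion step itself is just concatenation: the resized frame’s pixels plus one temperature value form the model’s input vector. A sketch of that merge (the temperature scaling and frame size here are assumptions, not Githu’s exact preprocessing):

```cpp
#include <cassert>
#include <vector>

// Fuse one resized grayscale frame with one temperature reading into a
// single feature vector, in the style an Edge Impulse model consumes.
// Scaling the temperature into the 0..1 pixel range is an assumption.
std::vector<float> fuseFeatures(const std::vector<float>& pixels,
                                float tempC) {
    std::vector<float> features = pixels;  // image features first
    features.push_back(tempC / 100.0f);    // then the temperature channel
    return features;
}
```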

The post Making fire detection more accurate with ML sensor fusion appeared first on Arduino Blog.