By Rosie Hattersley.

Those of us with an aversion to the company of spiders may find ourselves a little uncomfortable with the design of the latest AI robot created by the Sparklers, a group of computer science engineers in India. Not only does the robot specialise in watching its subject’s every move, but it has multiple legs and the ability to move in unexpected ways.

Team leader Arijit Das explains that SPY-DER is so named because of its ability to walk like an arachnid – much to the consternation of some observers. The robot’s main purpose is live video surveillance, streaming footage that can be viewed remotely in a web browser. SPY-DER can be controlled in two ways: through voice commands or from the web interface. Riffing on one of the favourite items of prey for real-life spiders, Arijit uses ‘bumblebee’ as the robot’s wake word, prompting its LED eyes to light up.

“Whenever I call it by its name, it starts listening to me and then, based on my voice command, it will act,” he explains. These commands rely on both speech recognition and ‘intent detection’. “I can give SPY-DER the same command in different ways, e.g. ‘Wave your hands’ or ‘Say hello’. Both commands make it wave its legs,” says Arijit. The online control panel, meanwhile, lets the user trigger all of SPY-DER’s actions. If such movements don’t make you tense up, you can take a look at SPY-DER in action.

Evolutionary process

The first version of SPY-DER was an Arduino Nano-based device, controllable via an Android smartphone or web browser, but lacking the voice recognition aspect Arijit was keen to add. “Obviously I needed a small computer here,” he comments. Several of his previous projects, including the Sudoku‑solving robot we featured in The MagPi #98, were based around Raspberry Pi. Size constraints led Arijit to choose Raspberry Pi Zero W this time. Wireless connectivity and the ability to connect to a Raspberry Pi Camera Module were also critical features. 

The limited RAM of Raspberry Pi Zero W presents a challenge for Arijit’s hope of eventually implementing a local speech and intent recognition system from scratch. The speech recognition system would need to be highly optimised, he acknowledges, otherwise it would take too long to recognise speech. For now, SPY-DER uses an amended version of Picovoice speech recognition, but in time Arijit also aims to add image processing and AI-based features, such as object tracking and face recognition.
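In the real build, Picovoice handles the wake-word detection; the plain-Python stand-in below only illustrates the listen/act cycle the article describes, where speech is ignored until the wake word is heard and the next utterance is treated as a command:

```python
# Simplified illustration (not SPY-DER's actual code) of wake-word
# gating: ignore everything until 'bumblebee', then pass the next
# utterance on as a command and go back to sleep.
WAKE_WORD = "bumblebee"

def process_stream(utterances):
    """Yield only the commands that follow the wake word."""
    awake = False
    for utterance in utterances:
        text = utterance.lower().strip()
        if not awake:
            if WAKE_WORD in text:
                awake = True   # on the robot, the LED eyes light up here
        else:
            yield text         # hand this off to intent detection
            awake = False      # sleep again until the next wake word
```

Gating like this is what keeps the robot from reacting to background chatter: only speech that immediately follows its name is ever interpreted.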

Building it 

Rather than designing SPY-DER’s body from scratch, and having recently got his first 3D printer, Arijit adapted an existing spider robot chassis, changing its dimensions to accommodate Raspberry Pi Zero W, the microphone, and the Camera Module. He tweaked the spider robot’s Arduino code, and wrote Python code enabling Raspberry Pi to control SPY-DER. The web controls use the Flask framework, with a web page coded in HTML, CSS, and jQuery. “For the live video streaming, I used RPi-Cam-Web-Interface, due to the fact that the latency is very low here,” Arijit explains. Helpfully, he provides diagrams and code.
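A Flask control panel of this kind typically exposes each robot action as a small HTTP endpoint that the page’s jQuery can call. The route name and the `move()` stub below are assumptions – the article only says the panel is built with Flask, HTML, CSS, and jQuery:

```python
# Minimal sketch of a Flask control endpoint for a robot like SPY-DER.
# The route layout and move() stub are hypothetical.
from flask import Flask

app = Flask(__name__)

ACTIONS = {"forward", "backward", "left", "right", "wave"}

def move(action: str) -> None:
    """Stand-in for the code that would drive the servo legs."""
    print(f"performing: {action}")

@app.route("/action/<name>")
def action(name: str):
    if name not in ACTIONS:
        return ("unknown action", 404)
    move(name)
    return ("ok", 200)

# On the robot, app.run(host="0.0.0.0") would serve the control page
# over the Pi Zero W's wireless connection.
```

Each button on the web page then just fires a request such as `/action/wave`, keeping the browser side trivially simple.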
