Trying to think of something to write can be hard (a fact that feels very much true right now, given that this is the fifth stab at penning an intro!). But imagine how much harder writing would be if you had a peppy paper clip in your eyeline chirpily offering hints and tips. “Do you need assistance?” it would say, as soon as you so much as tapped a single letter. And if you didn’t need help right at that moment, then it’d be sure to ask you again later!

For users of Microsoft’s Office 97, this became a reality courtesy of an ‘intelligent’ user interface called Clippy which, it’s fair to say, split opinion. But even though TIME magazine reckoned it was one of the 50 worst inventions of all time, there have been revivals. In fact, as well as being able to relive the memories of Clippy via a sticker pack for Microsoft Teams, you can now also enjoy a good old natter with the character, thanks to David Packman and a Raspberry Pi computer.

By combining Clippy with ChatGPT, the AI chatbot that is still taking the world by storm, David has brought Microsoft’s character back to life in a physical form and given users the ability to hold a conversation with it. The result is a distinctive, fun, AI-powered digital assistant. “All the memes about Clippy being ChatGPT in disguise convinced me that the first ChatGPT-empowered companion bot I should build would be a desktop Clippy,” David says.

Chatty man

David has long been interested in robotics and chatbots. “I’ve been building robots as a side project, with a focus on companion or social robotics, for a few years now, and one of the key components that has been hard to implement is open-ended conversational capability,” he says. In the past, he found it challenging to hard-code a manually curated chatbot into a robotic build, but says ChatGPT was the answer.

“Conversational large language models like ChatGPT are a game-changer, especially for social robotics, so I was keenly interested in integrating this new tool into a robotics build,” he says. “I wanted to use it as a test case to find the best approach for doing this on a budget. It all stems from my desire to create helpful robots that can help people who have difficulty socialising, or cannot leave home due to illness or other situations.”

Part of this drive is personal. Giving people a way to make friends, or creating robots that can help with everyday tasks, could be of assistance to his son, who has special needs and finds socialising difficult. “I’ve always wanted to help him make friends,” David says. “A conversant chatbot using an inexpensive API really advances that aim, so I had to figure out how and then share that with others.”

Say it again

At the robot’s heart is a Raspberry Pi 3 Model B computer, which he took from an older robot build knowing it was compatible with other components he owned. One such add-on was the Adafruit Crickit HAT, which makes it possible to build robot friends in Python by driving motors, servos, or solenoids. “There were some software library dependencies that made it easier to use Raspberry Pi OS as well,” he adds.

To handle speech, David turned to Microsoft Azure’s speech services. These allowed him to convert speech to text and vice versa, and David found the free tier offered all the capability he needed for a speech interface to ChatGPT. “It also has the benefit of recognising an offline wake word,” he says. “So nothing is recorded or sent over the internet until I specifically activate it.”
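In outline, the speech side of such a build could look something like the rough sketch below, which uses the Azure Speech SDK for Python (pip install azure-cognitiveservices-speech). The key and region are placeholders, and the offline wake-word step is left out for brevity; this is an illustration of the approach, not David’s actual script.

# Minimal speech-to-text and text-to-speech helpers using the Azure Speech SDK.
# The subscription key and region below are placeholders, not real credentials.
import azure.cognitiveservices.speech as speechsdk

speech_config = speechsdk.SpeechConfig(
    subscription="YOUR_AZURE_SPEECH_KEY",  # placeholder
    region="YOUR_REGION",                  # e.g. "eastus"
)

def listen() -> str:
    """Capture one utterance from the default microphone and return its text."""
    recognizer = speechsdk.SpeechRecognizer(speech_config=speech_config)
    result = recognizer.recognize_once()
    if result.reason == speechsdk.ResultReason.RecognizedSpeech:
        return result.text
    return ""

def speak(text: str) -> None:
    """Synthesise a spoken reply through the default speaker."""
    synthesizer = speechsdk.SpeechSynthesizer(speech_config=speech_config)
    synthesizer.speak_text_async(text).get()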

David also made use of the OpenAI Python library to run the ChatGPT queries. “The OpenAI API calls are the only expense, but it’s pretty cheap for a build like this,” he says. “One day of heavy usage was only eight cents. Also, the Python SpeechRecognition library is largely broken, so it [Azure’s speech service] was a very appealing alternative with extra benefits.”
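The ChatGPT side could be as short as the sketch below, which assumes a recent (1.x) version of the openai package with an API key in the OPENAI_API_KEY environment variable; the model name and Clippy persona prompt are illustrative choices rather than David’s exact code, and older library versions use a different interface.

# A rough sketch of a ChatGPT query helper using the OpenAI Python library.
# The system prompt and model name are assumptions for illustration only.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment
history = [{"role": "system",
            "content": "You are Clippy, a cheerful desktop assistant."}]

def ask_clippy(user_text: str) -> str:
    """Send the conversation so far to the chat model and return its reply."""
    history.append({"role": "user", "content": user_text})
    response = client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=history,
    )
    reply = response.choices[0].message.content
    history.append({"role": "assistant", "content": reply})
    return reply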

Getting animated

A key part of the whole build has been the animatronics. Clippy’s speech is output to a speech bubble (essentially a 3D-printed case surrounding a 5-inch HDMI display, fitted with a speaker salvaged from an Amazon Echo Dot), while the paper clip itself has wiggling eyebrows that help bring this retro cultural icon to life.

“The animatronics are pretty simplistic, although it was a bit of a bodge to get the eyebrow mechanism to work, since the directional force has to wind through some curved passages inside Clippy,” David says. “In the end, I used a technique that we use to build 3D-printed assistive device hands, which is fishing line.

“To counter the pull of the lines, I also had to design a spring mechanism to return the movement to a start position after the servos release the tension on the lines. I couldn’t use bands since that would pull the mechanism out of the sockets in the back, so I resorted to the old standby of using compression springs from old retractable ink pens. From a coding perspective, adding movement to the conversational aspect is relatively easy.”
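Driving that movement from Python can indeed be compact. The snippet below is a simple sketch of an eyebrow wiggle using the Crickit HAT’s servo outputs via the adafruit-circuitpython-crickit library; the servo channel, angles, and timing are guesses for illustration, not the values used in David’s build.

# Wiggle an eyebrow servo attached to the Crickit HAT's servo 1 output.
# Angles and delays are illustrative placeholders.
import time
from adafruit_crickit import crickit

REST_ANGLE = 90     # assumed neutral eyebrow position
RAISED_ANGLE = 120  # assumed raised position

def wiggle_eyebrows(times: int = 3) -> None:
    """Raise and lower the eyebrow servo a few times while Clippy speaks."""
    for _ in range(times):
        crickit.servo_1.angle = RAISED_ANGLE
        time.sleep(0.2)
        crickit.servo_1.angle = REST_ANGLE
        time.sleep(0.2)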

Looking ahead

In all, the build took David a week to complete. He was able to do this with a single Python script. “And I did it in a way that made it really easy to integrate into any project, in order to build something that was more engaging and personable than just a canned speaker with a microphone.”
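As a speculative outline only, a single script along these lines could tie the earlier sketches together, with the wake-word check here simplified to spotting a phrase in the transcript rather than Azure’s offline keyword recognition.

# Hypothetical main loop reusing the listen(), speak(), ask_clippy(), and
# wiggle_eyebrows() sketches above. The wake phrase is a placeholder.
import threading

WAKE_WORD = "hey clippy"

def main() -> None:
    while True:
        heard = listen().lower()
        if WAKE_WORD not in heard:
            continue
        question = listen()        # the next utterance is the actual query
        reply = ask_clippy(question)
        # Animate and speak at the same time so Clippy feels alive.
        threading.Thread(target=wiggle_eyebrows, daemon=True).start()
        speak(reply)

if __name__ == "__main__":
    main()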

David says he is now integrating the techniques into more complex robotic builds, such as a conversational companion bot based on BMO from Cartoon Network’s Adventure Time animated television series, and a counter-top kitchen robot called Whiskrz.

“That will include some machine vision to help identify ingredients, suggest recipes based on what you show it, and do simple tasks such as mixing,” David adds. “I’ve left the build simple so that it’s easy for others to adjust, allowing them to make more cool things with it. There are lots of improvements that can be made based on what you need.”
