Bachelor's final year project: Charlie, a social robot

 

Actuation: Arm joints - DC motors with encoders;
Fingers, neck, eyes - hobby servo motors;
Controllers: On-board - 4 AVR ATmega microcontrollers, 1 Raspberry Pi;
Programming language: Python;
The torso was designed in Catia V5, and the face was borrowed from InMoov, an open-source humanoid platform. We could of course have used the InMoov torso as well, but it is designed around large, high-torque and expensive servo motors. On a tight budget, geared DC motors with optical encoders proved far more affordable, though they took a bit more effort: the torso had to be designed from scratch around the available motors. Besides, 3D modelling is always exciting, and this was a great opportunity to test and improve my CAD skills.

The four arm-joint motors and their encoders are controlled by individual AVR microcontrollers, which are all mastered by the on-board Raspberry Pi through an SPI bus. A potentiometer at each joint serves as an end-stop switch, which the robot uses to home all joints when it first boots. Once homed, the respective motor drivers and AVRs take over position control. Next, the on-board computer establishes a TCP connection with the off-board computer to receive tracking data (for example, the XY image coordinates of the object or person Charlie is focused on) so that the head and eye motors can be adjusted accordingly. A sketch of how the Raspberry Pi might command the joint controllers follows below.
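To make the SPI command flow concrete, here is a minimal sketch of the Raspberry Pi side using the spidev library. The command bytes, the 16-bit setpoint framing and the one-chip-select-per-AVR wiring are assumptions for illustration; the actual protocol is defined by the AVR firmware.

import spidev

# Hypothetical command bytes; the real values are defined by the AVR firmware.
CMD_HOME = 0x01     # drive the joint until its potentiometer end stop trips
CMD_SET_POS = 0x02  # move to an absolute encoder position

def open_joint(bus, cs):
    """Open the SPI device for one joint's AVR (one chip select per AVR)."""
    spi = spidev.SpiDev()
    spi.open(bus, cs)
    spi.max_speed_hz = 500000
    spi.mode = 0
    return spi

def home_joint(spi):
    """Ask the AVR to run its homing routine against the pot end stop."""
    spi.xfer2([CMD_HOME, 0x00, 0x00])

def set_position(spi, ticks):
    """Send a 16-bit encoder setpoint; the AVR closes the position loop."""
    spi.xfer2([CMD_SET_POS, (ticks >> 8) & 0xFF, ticks & 0xFF])

shoulder = open_joint(bus=0, cs=0)
home_joint(shoulder)
set_position(shoulder, 1200)  # example setpoint, in encoder ticks
shoulder.close()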
Charlie uses a web-based API to produce natural responses, sandwiched between speech-to-text and text-to-speech services (a sketch of this loop closes the section). While the on-board Raspberry Pi controls all joint movements, an off-board computer (a laptop) processes the image streams from the webcams, using a series of SSDs (Single Shot Detectors) and a convolutional neural network for face detection and recognition.
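The vision loop on the laptop can be sketched as follows: detect the most confident face with OpenCV's DNN module and stream its centre to the Raspberry Pi over TCP. The ResNet-10 SSD face model, the host address and the two-int16 wire format here are placeholders, not necessarily what we actually ran.

import socket
import struct
import cv2

PROTO = "deploy.prototxt"                           # placeholder model files
MODEL = "res10_300x300_ssd_iter_140000.caffemodel"
RPI_ADDR = ("192.168.1.10", 5005)                   # placeholder RPi address

net = cv2.dnn.readNetFromCaffe(PROTO, MODEL)
cap = cv2.VideoCapture(0)
sock = socket.create_connection(RPI_ADDR)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    h, w = frame.shape[:2]
    blob = cv2.dnn.blobFromImage(cv2.resize(frame, (300, 300)), 1.0,
                                 (300, 300), (104.0, 177.0, 123.0))
    net.setInput(blob)
    detections = net.forward()  # shape (1, 1, N, 7): id, class, score, box

    # Pick the single most confident detection above a threshold.
    best = max(range(detections.shape[2]),
               key=lambda i: detections[0, 0, i, 2])
    if detections[0, 0, best, 2] > 0.5:
        x1, y1, x2, y2 = detections[0, 0, best, 3:7] * [w, h, w, h]
        cx, cy = int((x1 + x2) / 2), int((y1 + y2) / 2)
        # Send the face centre as two 16-bit ints; the RPi converts this
        # into head and eye motor setpoints.
        sock.sendall(struct.pack("!hh", cx, cy))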

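And the speech loop, in outline: transcribe the microphone audio, post the text to a chat endpoint, and speak the reply. The endpoint URL and JSON fields are hypothetical stand-ins for the web API, and the SpeechRecognition/pyttsx3 choices are illustrative rather than a record of the exact stack.

import requests
import speech_recognition as sr
import pyttsx3

CHAT_URL = "https://example.com/api/chat"  # hypothetical chat endpoint

recognizer = sr.Recognizer()
tts = pyttsx3.init()

def listen():
    """Capture one utterance and return its transcript (speech-to-text)."""
    with sr.Microphone() as source:
        audio = recognizer.listen(source)
    return recognizer.recognize_google(audio)

def respond(text):
    """Fetch a natural-language reply from the web API (fields assumed)."""
    reply = requests.post(CHAT_URL, json={"message": text}, timeout=10)
    return reply.json()["answer"]

while True:
    try:
        heard = listen()
    except sr.UnknownValueError:
        continue  # could not transcribe; listen again
    answer = respond(heard)
    tts.say(answer)  # text-to-speech
    tts.runAndWait()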