Having issues with my Dynamixels on the head/neck, so I'm moving on to other stuff while I'm waiting for a USB2AX to arrive. A couple weeks back I picked up a Craigslist special: a Quickie P-190 wheelchair for $50. So I'm switching out Artie's Power Wheels base for the wheelchair motors. Still using the HB-25s for now, but contemplating moving to a Sabertooth 2x25.
Anyway, with this update I'm showing that I stripped down the base, removed the electric brakes, and prototyped Artie's body sitting on top of the new chassis (zip ties for the win). I got him driving down my driveway, but wasn't able to hold the camera and drive him at the same time.
Finally figured out how to control the Dynamixel servos with the Xbox controller through my Java program. Using a USB2Dynamixel dongle and the simple-dynamixel Processing library (https://code.google.com/p/simple-dynamixel/), I was able to get pan working. Unfortunately, I found an issue with the mechanical design of the tilt mechanism, so I have to go back to the drawing board on that one.
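The heart of the pan control is just mapping a thumbstick axis to a servo goal position. Here's a minimal sketch of that mapping, assuming an AX-12-style goal-position register (0-1023 over roughly 300 degrees) and a made-up ±200-count pan range; in the real program the resulting value would be written out through the simple-dynamixel library rather than printed.

```java
// Sketch: map an Xbox thumbstick axis to a Dynamixel goal position.
// The 0-1023 register range is standard for the AX-12; the +/-200-count
// pan travel is a hypothetical limit, not the robot's actual one.
public class PanMapper {
    static final int CENTER = 512; // mid-travel of the 0-1023 range
    static final int RANGE  = 200; // hypothetical allowed pan swing

    // axis is the thumbstick reading in [-1.0, 1.0]
    static int axisToGoalPosition(double axis) {
        double clamped = Math.max(-1.0, Math.min(1.0, axis));
        return CENTER + (int) Math.round(clamped * RANGE);
    }

    public static void main(String[] args) {
        System.out.println(axisToGoalPosition(0.0));  // centered stick -> 512
        System.out.println(axisToGoalPosition(1.0));  // full deflection -> 712
        System.out.println(axisToGoalPosition(-1.5)); // out-of-range input clamps -> 312
    }
}
```

Clamping the axis first means a noisy or miscalibrated controller can never command the servo past the mechanical limits.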
He speaks! Redid the head, using a bunch of 3D-printed corner brackets to hold it together instead of glue like before. I installed a speaker from a spare computer speaker set I had, and built an access panel on the top of the head (versus before, where I just pulled off the top that was wedged in). I also made the Arduino interface a little cleaner, using a prototyping shield that sits on top instead of the separate breadboard I was using before.
The voice in the video is not his eventual voice. It will sound more high-pitched and robotic, as I'm going to feed the output through a modified kids' voice changer toy. Also, the audio in the video is just a pre-recorded sample from the AT&T Natural Voices website. The plan is to actually use FreeTTS for speech synthesis. But I wanted to test out the system before going to bed, and of course wanted to show it off here!
Posting a photo of RT at the SXSW Create event. Wish I got more photos/videos, but I was too busy driving the robot to man the camera.
Hooked up the NeoPixel ring lights and posting a video of them going through the different patterns.
The eyes can do 10 different patterns and 8 different colors. When not in "attract" mode, the patterns and colors are set using serial commands over USB from the host computer.
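On the host side, setting a pattern or color is just writing a short command string to the Arduino's serial port. The single-letter command format below ("P" for pattern, "C" for color) is a hypothetical example of such a protocol; only the ranges (10 patterns, 8 colors) come from the build log.

```java
// Sketch of the host side of a serial eye-control protocol.
// "P<n>" and "C<n>" are made-up command formats for illustration.
public class EyeCommands {
    static String patternCommand(int pattern) {
        if (pattern < 0 || pattern >= 10)
            throw new IllegalArgumentException("pattern must be 0-9");
        return "P" + pattern + "\n";
    }

    static String colorCommand(int color) {
        if (color < 0 || color >= 8)
            throw new IllegalArgumentException("color must be 0-7");
        return "C" + color + "\n";
    }

    public static void main(String[] args) {
        // In the real program these strings would be written to the
        // Arduino's serial port (e.g. via a library like jSSC or RXTX);
        // here we just print them.
        System.out.print(patternCommand(3));
        System.out.print(colorCommand(5));
    }
}
```

Terminating each command with a newline makes the Arduino-side parser trivial: read until '\n', switch on the first character, parse the rest as a number.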
Update - 03/03/2014
More updates. Just received a pan-tilt system hand-machined by a friend. Thought some of you might be interested in photos of it, so I'm adding a bunch of different views. You can see the NeoPixel LED ring holders on the head now and the ultrasonic sensors on the front (positioning will change when the second layer is added on). Also, I plan to cut the PVC pipe for the neck in half to aid with balance issues.
New images of the latest design of RT-01. Using only one section for the torso instead of two until the second layer is actually needed.
The RT Series robot will be a 5' tall robot. It uses two geartrains from a Power Wheels ride-on toy, and two Parallax HB-25 motor controllers for motor control. An Arduino currently serves as the interface between the Acer One netbook (the same one from Geoffrey) and the motor controllers.
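The HB-25 is commanded like a hobby servo: per the Parallax documentation, a pulse around 1.0 ms is full reverse, 1.5 ms is stop, and 2.0 ms is full forward. A minimal sketch of the speed-to-pulse mapping the host/Arduino chain would apply (the normalized -1..1 speed input is my assumption, not the robot's actual command format):

```java
// Sketch: convert a normalized drive speed into an HB-25 pulse width.
// Pulse range (1000-2000 us, 1500 us = stop) follows the Parallax
// HB-25 documentation; check the datasheet for exact limits.
public class Hb25 {
    static int speedToPulseMicros(double speed) {
        double clamped = Math.max(-1.0, Math.min(1.0, speed));
        return 1500 + (int) Math.round(clamped * 500);
    }

    public static void main(String[] args) {
        System.out.println(speedToPulseMicros(0.0));  // stop -> 1500
        System.out.println(speedToPulseMicros(1.0));  // full forward -> 2000
        System.out.println(speedToPulseMicros(-0.5)); // half reverse -> 1250
    }
}
```

On the Arduino side this maps directly onto the Servo library's writeMicroseconds(), which is a common way to drive HB-25s.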
Eventually the RT-01 robot will be jam-packed with processing power and sensors. Currently slated are:
- 1 Microsoft Kinect w/ depth sensor, RGB camera, and 4 microphones
- 4 other web cameras (2 in the eyes, 1 in the back, and 1 pointing down at the ground to track motion)
- 1 speaker unit (located in the "mouth" area)
- 8 HC-SR04 ultrasonic sensors
- An array of bump sensors (number and kind TBD)
- 2 microphones located on the sides of the head
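Of those sensors, the HC-SR04s have a well-known conversion from echo time to range: the sensor reports the round-trip time of an ultrasonic ping, and with sound at roughly 343 m/s (0.0343 cm/µs) the distance is half the round trip. A small sketch of that conversion (the function name is mine, not from any particular library):

```java
// Sketch: standard HC-SR04 echo-time-to-distance conversion.
// The echo pin reports round-trip time, so divide by 2; 0.0343 cm/us
// is the approximate speed of sound at room temperature.
public class Ultrasonic {
    static double echoMicrosToCm(long echoMicros) {
        return echoMicros * 0.0343 / 2.0;
    }

    public static void main(String[] args) {
        // A ~5830 us echo corresponds to roughly 1 meter.
        System.out.println(echoMicrosToCm(5830));
    }
}
```

With 8 sensors, the firmware would typically trigger them one at a time (or in non-adjacent pairs) so the pings don't cross-talk.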
For processing power:
- 1 Intel 1.7 GHz processor with 2 GB RAM and 80 GB HDD
- 1 Arduino Uno @ 16 MHz
- 1 Arduino Mega 2560
- 1 Raspberry Pi
- 1 Dual-core Android tablet @ 1.2 GHz
These processing systems will be connected together to provide modularity.
The system will be open source, and both the hardware and the software will be documented for replication.