Let's Make Robots!


Telepresence. Human/Computer interaction (speech, movement). Autonomous mode using A* in a controlled environment.

Update 09/28/2014

Having issues with my Dynamixels on the head/neck, so I'm moving on to other stuff while I'm waiting for a USB2AX to arrive.  A couple weeks back I picked up a Craigslist special: a Quickie P-190 for $50.  So I'm switching out Artie's Power Wheels base for the wheelchair motors.  Still using the HB-25s for now, but contemplating moving to a Sabertooth 2x25.

Anyways, with this update, I'm showing that I stripped down the base, removed the electric brakes, and prototyped Artie's body sitting on top of the new chassis (zip ties for the win).  I got him driving down my driveway, but wasn't able to hold the camera and drive him at the same time.


Update 08/18/2014

Finally figured out how to get the Dynamixel servos to be controlled by the XBox controller through my Java program.  Using a USB2Dynamixel dongle and the simpledynamixel processing library (https://code.google.com/p/simple-dynamixel/), I was able to get pan working.  Unfortunately, I found an issue with the mechanical design of the tilt mechanism, so I have to go back to the drawing board on that one.
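Conceptually, the Java side just turns a stick axis reading into an AX-12 goal position (the AX-12 takes goal positions 0–1023 across roughly 300 degrees).  Here's a minimal sketch of that mapping; the `setGoalPosition`-style call in the comment is an assumed name for illustration, not simple-dynamixel's verified API — only the mapping math is shown for real:

```java
// Sketch: map an XBox stick axis in [-1.0, 1.0] to a Dynamixel AX-12
// goal position tick. Pan is clamped to +/-300 ticks (~88 degrees) around
// center so the neck can't slam into its mechanical limits.
public class PanMapper {
    static final int CENTER = 512;  // AX-12 mid position (0..1023 range)
    static final int RANGE  = 300;  // max deflection in ticks

    public static int axisToGoal(double axis) {
        if (axis > 1.0) axis = 1.0;     // clamp noisy/overdriven input
        if (axis < -1.0) axis = -1.0;
        return CENTER + (int) Math.round(axis * RANGE);
    }

    public static void main(String[] args) {
        System.out.println(axisToGoal(0.0));  // 512 (centered)
        System.out.println(axisToGoal(1.0));  // 812 (full deflection)
        // In the real program this would feed something like:
        // dynamixel.setGoalPosition(PAN_ID, axisToGoal(stickX)); // assumed call
    }
}
```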

Update 03/24/2014

He speaks!  Re-did the head, using a bunch of 3D printed corner brackets to hold it together vs glue like before.  Also installed a speaker from a spare computer speaker set I had, as well as built an access panel on the top of the head (vs before where I just pulled off the top that was wedged in).  I also made the Arduino interface a little cleaner, using a prototyping shield to sit on top, vs the separate breadboard I was using before.

The voice in the video is not his eventual voice.  It will sound more high-pitched and robotic, as I'm going to feed the input through a modified kids' voice-changer toy.  Also, the audio in the video is just a pre-recorded sample from the AT&T Natural Voices website.  The plan is to actually use FreeTTS for speech synthesis.  But I wanted to test out the system before going to bed, and of course wanted to show it off here!

Update 03/11/2014

Posting a photo of RT at the SXSW Create event.  Wish I got more photos/videos, but I was too busy driving the robot to man the camera.


Update 03/07/2014

Hooked up Neopixel ring lights and posting a video of it going through the different patterns.

The eyes can do 10 different patterns and 8 different colors.  When not in "attract" mode, the patterns and colors are set using serial commands over usb from the host computer.
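The exact byte format of those serial commands isn't documented here, so as a purely hypothetical example, suppose the host sends short ASCII commands like "P3" (pattern 3) or "C5" (color 5).  The parsing side then reduces to range-checking against the 10 patterns and 8 colors:

```java
// Hypothetical host->eye command scheme (the real protocol may differ):
// "P<n>" selects pattern n in 0..9, "C<n>" selects color n in 0..7.
// parse() returns the validated index, or -1 for anything malformed.
public class EyeCommand {
    public static int parse(String cmd) {
        if (cmd == null || cmd.length() < 2) return -1;
        char kind = cmd.charAt(0);
        int n;
        try {
            n = Integer.parseInt(cmd.substring(1));
        } catch (NumberFormatException e) {
            return -1;  // non-numeric payload
        }
        if (kind == 'P' && n >= 0 && n <= 9) return n;  // valid pattern
        if (kind == 'C' && n >= 0 && n <= 7) return n;  // valid color
        return -1;  // out of range or unknown command letter
    }
}
```

Rejecting anything out of range on the Arduino side keeps a garbled serial byte from putting the rings into an undefined state.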

Update - 03/03/2014

More updates.  Just received a pan-tilt system hand-machined by a friend.  Thought some of you might be interested in photos of it, so I'm adding a bunch of different views.  You can see the NeoPixel LED ring holders on the head now and the ultrasonic sensors on the front (positioning will change when the second layer is added on).  Also, I plan to cut the PVC pipe for the neck in half to aid with balance issues.

New images of the latest design of RT-01.  Using only one section for the torso instead of two until the second layer is actually needed. 

RT with paint job


The RT Series robot will stand about 5' tall.  It uses two geartrains from a Power Wheels ride-on toy and two Parallax HB-25 motor controllers for motor control.  An Arduino currently serves as the interface between the Acer One netbook (the same one on Geoffrey) and the motor controllers.
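The HB-25 is driven like a hobby servo (roughly 1.0 ms pulse = full reverse, 1.5 ms = stop, 2.0 ms = full forward), so most of the drive logic boils down to mixing throttle and turn into two pulse widths.  A sketch of that mixing, written in Java to match the host-side code — the exact pulse endpoints are nominal values, not measured from Artie's HB-25s:

```java
// Differential-drive mixing for two HB-25 controllers driven with
// servo-style pulses: 1000 us = full reverse, 1500 us = stop,
// 2000 us = full forward (nominal values).
public class HB25Mixer {
    // Convert a normalized speed in [-1, 1] to a pulse width in microseconds.
    static int toPulse(double speed) {
        if (speed > 1) speed = 1;
        if (speed < -1) speed = -1;
        return 1500 + (int) Math.round(speed * 500);
    }

    // Mix throttle and turn into {left, right} pulse widths.
    public static int[] mix(double throttle, double turn) {
        return new int[] { toPulse(throttle + turn), toPulse(throttle - turn) };
    }
}
```

On the real robot the Arduino would emit these pulse widths (e.g. via the Servo library's `writeMicroseconds`); mixing on the host keeps the Arduino side dumb and simple.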
Eventually the RT-01 robot will be jam-packed with processing power and sensors.  Currently slated are:

  • 1 Microsoft Kinect w/ depth sensor, RGB camera, and 4 microphones
  • 4 other web cameras (2 in the eyes, 1 in the back, and one pointing down to the ground to track motion)
  • 1 speaker unit (located in the "mouth" area)
  • 8 HC-SR04 Ultrasonic sensors
  • An array of bump sensors (amount and kind tbd)
  • 2 microphones located on the sides of the head.
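The HC-SR04s in the list report range as the width of their echo pulse: sound travels about 0.0343 cm per microsecond, and the pulse covers the round trip, so distance is roughly echo time divided by 58.  A one-liner worth having on whichever processor reads them:

```java
// HC-SR04 range conversion: the sensor's echo pin goes high for the
// round-trip time of the ping. distance_cm = (echo_us * 0.0343) / 2,
// i.e. approximately echo_us / 58.
public class Sonar {
    public static double echoToCm(long echoMicros) {
        return (echoMicros * 0.0343) / 2.0;
    }
}
```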

For processing power:

  • 1 - Intel 1.7 GHz processor with 2 GB RAM and 80 GB HDD
  • 1 - Arduino Uno @ 16 MHz
  • 1 - Arduino Mega 2560
  • 1 - Raspberry Pi
  • 1 - Dual-core Android tablet @ 1.2 GHz

These processing systems will be connected together to provide modularity.
The system will be open source, and both the hardware and software process will be documented for replication.


We seem to share similar building styles and enjoy personality from our machines... I will be keeping an eye on your future gadgets!


Nice.  I wanted to switch mine to an XBox controller...tried an IR remote and also an RC set.  I don't like either.  I suspect the software on the robot side for interpreting the RC stuff creates timing issues with the Arduino running everything else, as it had to measure duration of RC pulses.

Does the XBox controller send out IR signals or something else?

RT is getting more distinguished looking every time you post.  Perhaps you are planning dinner party appearances.

I am hoping to have some brain code for you by end of year.  Like to find a way to get your design input sometime.




I'm not quite sure what the XBox controller sends out.  I believe it is proprietary 2.4 GHz RF.  But I can explain what I do.

The XBox controller I use is this.


It comes with a USB dongle that plugs right into the PC.  I then use a library called jinput.


I use it to initialize the library and read the input.  I then communicate with the servos using the simple dynamixel library.
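Roughly, that looks like the sketch below.  The jinput calls are shown only in comments because they need the controller plugged in plus jinput's native libraries; the dead-zone helper, which keeps stick noise from twitching the servos, is the part that runs standalone:

```java
// Sketch of the jinput polling loop with a dead-zone helper.
// applyDeadzone() zeroes readings below the threshold and rescales the
// remainder so output still spans the full [-1, 1] range.
public class StickReader {
    public static double applyDeadzone(double v, double dz) {
        if (Math.abs(v) < dz) return 0.0;           // inside dead zone
        double sign = v > 0 ? 1.0 : -1.0;
        return sign * (Math.abs(v) - dz) / (1.0 - dz);
    }

    /* Outline of the jinput side (requires the gamepad and native libs):
       Controller[] cs =
           ControllerEnvironment.getDefaultEnvironment().getControllers();
       // find the gamepad among cs, then each frame:
       pad.poll();
       float x = pad.getComponent(Component.Identifier.Axis.X).getPollData();
       double pan = applyDeadzone(x, 0.15);  // feed to the servo code
    */
}
```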

However, you aren't alone in wanting to use an XBox controller with an Arduino.  If you don't have any luck, I've seen some articles about using a Playstation controller with the Arduino (both have similar buttons).

Good luck!

Looking better and better.  Got any video of A* Pathfinding in action?

Can't believe I didn't notice this post.

No A* yet.  I decided to pursue a different strategy.  Artie is too big to find a reliable space to debug A*, so I'm going to experiment with my iRobot Create instead.  The plan is to control the Create via a Bluetooth link, with all of the processing off-board on the laptop.  This way I can have enough space to generate a decent "map" in my garage (especially when I sell my motorcycle and we finish painting the new house).  I'm still worried about odometry drift errors, but I feel at least starting in a direction will help motivate me.
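For anyone curious what the planner itself involves, here's a toy version of grid A* of the kind I'm aiming for — a 4-connected occupancy grid with a Manhattan heuristic.  This is just a sketch of the algorithm, not code running on the Create yet:

```java
import java.util.*;

// Toy grid A*: '#' cells are obstacles, '.' cells are free.
// Returns the length of the shortest 4-connected path from (sr,sc)
// to (gr,gc), or -1 if the goal is unreachable.
public class AStar {
    public static int pathLength(String[] grid, int sr, int sc, int gr, int gc) {
        int rows = grid.length, cols = grid[0].length();
        int[][] g = new int[rows][cols];              // best cost-so-far
        for (int[] row : g) Arrays.fill(row, Integer.MAX_VALUE);
        // queue entries: {f, r, c} where f = g + Manhattan heuristic
        PriorityQueue<int[]> open = new PriorityQueue<>((a, b) -> a[0] - b[0]);
        g[sr][sc] = 0;
        open.add(new int[]{Math.abs(sr - gr) + Math.abs(sc - gc), sr, sc});
        int[][] moves = {{1, 0}, {-1, 0}, {0, 1}, {0, -1}};
        while (!open.isEmpty()) {
            int[] cur = open.poll();
            int r = cur[1], c = cur[2];
            if (r == gr && c == gc) return g[r][c];   // heuristic is consistent,
            for (int[] m : moves) {                   // so first pop is optimal
                int nr = r + m[0], nc = c + m[1];
                if (nr < 0 || nc < 0 || nr >= rows || nc >= cols) continue;
                if (grid[nr].charAt(nc) == '#') continue;  // blocked cell
                if (g[r][c] + 1 < g[nr][nc]) {
                    g[nr][nc] = g[r][c] + 1;
                    open.add(new int[]{
                        g[nr][nc] + Math.abs(nr - gr) + Math.abs(nc - gc), nr, nc});
                }
            }
        }
        return -1;
    }
}
```

The hard part on a real robot isn't this search — it's keeping the grid honest while odometry drifts, which is exactly the worry above.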

I think RT-01 might single-handedly bring back disco!

How's that Pan&Tilt working out?  Will your friend make another and for how much?  Seriously.

Fun stuff indeed.  Watch out Travolta, RT-01 is coming to town.  - Martin

The pan/tilt shouldn't be too far out.  My friend couldn't find his USB2Dynamixel for me to use, but I'm going to be borrowing a CM-5, which I can control via serial interface to run the dynamixel servos.

Now that I've had the bot requested for a birthday party for the 29th, I have a deadline to get it up and working (even if it's hacked together to run for one night).

My friend is a machinist by trade, so he only does extra work for stuff that he's personally invested in.  But I could ping him again in a few months.


Weird how much personality the neopixels around the "eyes" add.

Did you ever get your NeoPixel rings up and running?

I'm curious how they look as robot eyes.

I've had fun using NeoPixel rings but I haven't thought of a good way of using them as robot eyes yet. 

Thanks for posting.

Just updated with the rings mocked up.  Not the best video quality but it was late at night.  The important part is seeing the blinking.  I think it adds a lot of personality.