This is my first experiment with artificial intelligence using a simple Doodle Bot with an RTC and SD card for extra memory. When I say artificial intelligence I am not talking about a supercomputer that you can discuss philosophy with over a cold beer, simply a robot that will try to learn from past experience as previously discussed in my blog: http://letsmakerobots.com/node/34177
The challenge: This simple robot will pretend it is cleaning a floor. To clean a floor efficiently the robot needs to cover the entire area in a meaningful pattern without missing any spots. The experiment will be done using a whiteboard as the floor to be cleaned. The robot will trace its path with a whiteboard marker so we can see where it has been and whether it is learning or not. Obstacles will be put in the robot's path to test its learning ability.
The only navigation sensors are 2 low resolution (8 counts per wheel revolution) wheel encoders and the IR receiver which will use the docking station as a beacon. The 3-axis accelerometer will be used to detect collisions.
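With only 8 counts per wheel revolution the encoders give fairly coarse dead reckoning, but they are still the basis of position tracking between beacon fixes. As a rough sketch of how the encoder counts become a position estimate, here is standard differential-drive odometry; the wheel diameter and wheelbase below are hypothetical placeholders, not the Doodle Bot's real dimensions:

```cpp
#include <cmath>
#include <cassert>

// Hypothetical geometry -- substitute the Doodle Bot's real measurements.
const double WHEEL_DIAMETER_MM = 40.0;  // assumed wheel diameter
const double WHEEL_BASE_MM     = 90.0;  // assumed distance between wheels
const double COUNTS_PER_REV    = 8.0;   // 8 counts per wheel revolution
const double MM_PER_COUNT = (M_PI * WHEEL_DIAMETER_MM) / COUNTS_PER_REV;

struct Pose { double x, y, theta; };    // position in mm, heading in radians

// Update the pose from the encoder counts accumulated since the last call.
void updatePose(Pose &p, long leftCounts, long rightCounts) {
    double dl = leftCounts  * MM_PER_COUNT;      // left wheel travel
    double dr = rightCounts * MM_PER_COUNT;      // right wheel travel
    double d  = (dl + dr) / 2.0;                 // travel of the centre point
    double dTheta = (dr - dl) / WHEEL_BASE_MM;   // change in heading
    p.x += d * cos(p.theta + dTheta / 2.0);
    p.y += d * sin(p.theta + dTheta / 2.0);
    p.theta += dTheta;
}
```

At this resolution the estimate drifts quickly, which is exactly why the IR beacon fixes discussed later are needed to correct it.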
The A.I. theory: This robot does not use a neural net, nor does it borrow the brain power of a laptop or PC. It simply remembers what has happened to it before and uses those previous experiences to try and guide present actions. When its battery starts to get flat it will use its IR receiver to locate the docking station and charge its batteries. During this time it will sort through its memory looking for ways to improve its decision making process and compress the memories for storage as long term memory.
Anyone who has programmed a robot before will tell me that this is a difficult task to achieve, and I agree. I plan to break the experiment down into different steps. At the end of each step I hope to have some useful functions that can be used by others.
Step 1: Get the robot to store its raw sensor data onto the SD card as short term memory with a time stamp of when the event occurred. Organize and compress the data when charging the batteries. At this point only a simple "bump, backup and turn" routine will be used to gather the data. As a bonus, the data gathered should be useful for making a map of the area to be cleaned for the second step.
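The "bump, backup and turn" routine boils down to a three-state machine. A minimal sketch of the decision logic, kept separate from the motor code so it can be tested on its own (the impact threshold is a made-up placeholder to be tuned against real accelerometer readings):

```cpp
#include <cassert>

// States of the simple exploration routine.
enum Mode { DRIVE, BACKUP, TURN };

// Hypothetical threshold -- tune against the real accelerometer output.
const int IMPACT_THRESHOLD = 200;

// Decide the next mode from the current mode and the sensor readings.
// 'timerDone' is true when the backup or turn timer has expired.
Mode nextMode(Mode current, int impactMagnitude, bool timerDone) {
    switch (current) {
        case DRIVE:  return (impactMagnitude > IMPACT_THRESHOLD) ? BACKUP : DRIVE;
        case BACKUP: return timerDone ? TURN : BACKUP;
        case TURN:   return timerDone ? DRIVE : TURN;
    }
    return DRIVE;
}
```

The transition out of DRIVE is the natural point to write a memory event to the SD card, since that is the moment something significant happened.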
Encountered my first problem: the SD library is huge! By the time I have the SD, Wire and microM libraries installed and initialized, 9642 bytes are used. Without the SD library only 3078 bytes are used. I have already planned to avoid using a library for the DS1307 (which incorporates the Wire library) because of memory constraints.
As the Micro Magician only has 16K (assume no bootloader) this does not leave much room. Fortunately I should receive a sample of the Micro Magician Pro next week which uses the ATmega328P and also has a 5V regulator which will be better for the DS1307 RTC which goes into low power mode once the battery voltage drops below about 3.8V.
As a fun side note, the Wire library has been updated for V1.0 of the Arduino IDE, and as they have changed some names, the instructions on the net are no longer suitable for older versions. It was possible this might also cripple the DS1307 library, which uses the Wire library and would also need updating, but I looked at the library code and it has been updated to work with both older and newer versions of the Wire library.
For now I am proceeding with STEP 1 as I think the ATmega168 still has enough memory. Later steps will definitely require an ATmega328P or better. I have found some code for accessing the SD card in a raw format. This code is a lot smaller but would not allow the card to be read by a PC.
Ok, I haven't had a lot of time for coding and so far I am just experimenting with a new I2C library.
The good news is that I now have a prototype of the new Micro Magician Pro to play with. The two big advantages for this project are that I now have an ATmega328P processor with 32K of flash memory to help cope with the libraries, and a 5V regulator for the DS1307 RTC.
The RTC will communicate quite happily at 3.3V, as any signal above 2.2V is considered a logical 1. The problem is that Vcc must be at least 1.25 × Vbat, otherwise the chip goes into low power mode and stops communicating on the I2C lines.
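The arithmetic makes the 5V regulator's advantage obvious. With a 3.0V backup cell the switchover point works out to 3.75V, which is roughly the "about 3.8V" figure mentioned earlier, so a 3.3V rail sits below the threshold while a 5V rail clears it comfortably:

```cpp
#include <cassert>

// The DS1307 switches over to its backup battery (and stops responding
// on the I2C bus) when Vcc falls below roughly 1.25 x Vbat.
double minVccFor(double vbat) {
    return 1.25 * vbat;
}
```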
I am now thinking that for my robot to successfully cover all the floor without missing any spots it will need more than the docking station homing beacon for guidance, so I am going to upgrade to 3 IR beacons and attempt to get the robot to triangulate its position.
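One possible way to turn three beacon readings into a position is classic 2D trilateration. This sketch assumes the robot can estimate its distance to each beacon (for example from signal strength), which is an assumption on my part, not something the IR receiver gives for free; with bearings instead of distances a three-point resection would be needed. Subtracting the three circle equations pairwise leaves a small linear system in x and y:

```cpp
#include <cmath>
#include <cassert>

struct Point { double x, y; };

// Solve for the robot's position from distances r1..r3 to three beacons
// at known positions b1..b3. Returns false if the beacons are collinear,
// in which case the position cannot be determined.
bool trilaterate(Point b1, double r1, Point b2, double r2,
                 Point b3, double r3, Point &out) {
    double a11 = 2 * (b2.x - b1.x), a12 = 2 * (b2.y - b1.y);
    double a21 = 2 * (b3.x - b1.x), a22 = 2 * (b3.y - b1.y);
    double c1 = r1*r1 - r2*r2 + b2.x*b2.x - b1.x*b1.x + b2.y*b2.y - b1.y*b1.y;
    double c2 = r1*r1 - r3*r3 + b3.x*b3.x - b1.x*b1.x + b3.y*b3.y - b1.y*b1.y;
    double det = a11 * a22 - a12 * a21;
    if (fabs(det) < 1e-9) return false;
    out.x = (c1 * a22 - c2 * a12) / det;
    out.y = (a11 * c2 - a21 * c1) / det;
    return true;
}
```

Placing the three beacons in an L shape around the whiteboard (never in a straight line) keeps the system solvable everywhere on the board.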
The IR beacon (shown here without the IR LEDs) is basically the same as a TV remote except it just sends the same code out over and over again. Dip switches 1-7 are used to set the 7 bit code. Switch 8 is the power switch. These beacons do not use an MCU, just a simple CMOS circuit which helps minimize power consumption. This PCB is the size of an AAA battery and will work with any voltage from 3V to 6V, making it perfect for running off a single AAA Li-ion rechargeable battery.
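For reference, a 7-bit code means up to 128 distinct beacon IDs. A small sketch of how the switch settings map onto the transmitted code; I am assuming here that switch 1 is the least significant bit, which should be checked against the beacon's actual documentation:

```cpp
#include <cassert>

// Compute the beacon ID that the seven DIP switches would transmit.
// switches[i] is true when DIP switch i+1 is closed. Switch 1 is
// treated as the least significant bit (an assumption).
unsigned char beaconCode(const bool switches[7]) {
    unsigned char code = 0;
    for (int i = 0; i < 7; i++)
        if (switches[i]) code |= (1 << i);
    return code;   // 0..127
}
```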
I've started on the actual coding of step 1 after sorting out some issues with my I2C. You can download the I2C library I am using from here: http://www.dsscircuits.com/articles/arduino-i2c-master-library.html
At the moment it is not much more than "bump and turn" code to generate data to be stored, the basic storage to the SD card and timestamp generation. I won't have anything worthy of video until at least next week when my company has a 5 day holiday.
Here is the current structure of a single "memory event". The number at the left is an index number used for selecting specific information from the event.
Note index 0: this is a quick reference byte that records the reason for the event being recorded, e.g. impact detected or motor stalled. This can be used later when the robot is looking for memories of similar occurrences. For our robot to be intelligent it must be able to cross reference and analyze these memories.
AI short memory data structure

 0     memory trigger event      byte
 1     year                      byte
 2     month                     byte
 3     date                      byte
 4     day                       byte
 5     hour                      byte
 6     minute                    byte
 7     second                    byte
 8     magnitude                 msb
 9     magnitude                 lsb
10     deltx                     msb
11     deltx                     lsb
12     delty                     msb
13     delty                     lsb
14     deltz                     msb
15     deltz                     lsb
16     x-axis                    msb
17     x-axis                    lsb
18     y-axis                    msb
19     y-axis                    lsb
20     z-axis                    msb
21     z-axis                    lsb
22     0G time milliseconds      msb
23     0G time milliseconds      lsb
24     left encoder count        msb
25     left encoder count        lsb
26     right encoder count       msb
27     right encoder count       lsb
28     left speed                msb
29     left speed                lsb
30     right speed               msb
31     right speed               lsb
32     left & right brake        byte
33     left & right stall        byte
34     servo position            msb
35     servo position            lsb
36     battery voltage           msb
37     battery voltage           lsb
38     IR command received       byte
39     current mode              byte
40     position X co-ordinate    msb
41     position X co-ordinate    lsb
42     position Y co-ordinate    msb
43     position Y co-ordinate    lsb
44     position Z co-ordinate    msb
45     position Z co-ordinate    lsb
46     direction                 byte
47     action taken              byte
48     reason for action         byte
49-63  spare                     bytes
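In code, each memory event is just a 64-byte buffer addressed through the index numbers above. A minimal sketch of the packing helpers, with 16-bit values stored msb first as listed (the index constants shown are a small sample, not the full set):

```cpp
#include <cstdint>
#include <cassert>

// One memory event is a 64-byte record, matching the index table above.
const int EVENT_SIZE = 64;

// A few of the index constants from the table.
const int IDX_TRIGGER   = 0;   // reason the event was recorded
const int IDX_YEAR      = 1;
const int IDX_MAGNITUDE = 8;   // 16-bit: msb at index 8, lsb at index 9
const int IDX_POS_X     = 40;  // 16-bit position X co-ordinate

// Store a 16-bit value at the given index, msb first.
void putU16(uint8_t *event, int idx, uint16_t value) {
    event[idx]     = value >> 8;
    event[idx + 1] = value & 0xFF;
}

// Read a 16-bit value back from the event buffer.
uint16_t getU16(const uint8_t *event, int idx) {
    return (uint16_t)(event[idx] << 8) | event[idx + 1];
}
```

Keeping every event exactly 64 bytes means event number N always starts at byte N × 64 on the SD card, so no index table is needed to find a memory later.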
A lot of this data is not really relevant and can ultimately be eliminated from a memory event. For example, 0G fall time is only relevant if the robot actually falls. The problem is that our processor is no supercomputer, and trying to sort through this information while the robot is functioning would significantly slow down its reaction time.
Another problem is that until the data is analyzed we do not know what data is important. If for example an impact occurs then we need to know the reason. Was the impact the result of a simple collision or did the robot fall off the table? Did the robot hit a wall (maybe the motors stalled) or did something else hit the robot (the angle of impact was not the angle of travel).
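The questions above can be turned into a classification step that runs later, during charging. A rough sketch of that logic, with every threshold a hypothetical placeholder: a long 0G time suggests a fall, a stalled motor or an impact along the direction of travel suggests the robot hit something, and an impact from another angle suggests something hit the robot.

```cpp
#include <cmath>
#include <cassert>

enum ImpactCause { HIT_WALL, WAS_HIT, FELL };

// Classify an impact from data recorded in a memory event. Angles are
// in radians; all thresholds are made-up placeholders to be tuned.
ImpactCause classifyImpact(double impactAngle, double travelAngle,
                           bool motorStalled, unsigned zeroGms) {
    if (zeroGms > 100) return FELL;   // long free-fall: fell off the table
    // Smallest difference between the two angles, handling wrap-around.
    double diff = fabs(atan2(sin(impactAngle - travelAngle),
                             cos(impactAngle - travelAngle)));
    if (motorStalled || diff < 0.5) return HIT_WALL;  // impact along travel
    return WAS_HIT;   // impact came from a different direction
}
```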
This is why we record it all and sort it later. In order to sort the information I need to create several algorithms, just as Patrick's maze solving robot used one algorithm to follow a wall and another to eliminate wrong turns. I need to generate algorithms to perform functions such as mapping, navigation, collision avoidance and position triangulation.
So far this is raw sensor data and about 15 bytes spare for anything I haven't thought of yet. The idea is that every time something significant occurs an entry will be made in the short term memory. I have a more detailed description of the theory in my blog: http://letsmakerobots.com/node/34177