Final Video

Below is the final video of the robot drawing.

Fast

Slow


Progress Update

Here is the bot! We have IR sensors, ultrasonic sensors, and encoders working in good condition. We have done a lot of work fine-tuning the encoders to give us proper data, and they probably still need more work. There is also some kind of grounding problem that desyncs the wheels when they stop; we fixed it once, but it was reintroduced when we added a bigger breadboard.

As shown in the video above, we are using the encoders to determine how far to go forward and how much to turn right. The objective was to drive a square. The infrared and ultrasonic sensors are not used in this example, since they were not necessary for a simple square, but they are fully functional if we need them. As stated earlier, there has been an issue with our reverse pulse, where one of the four wheels does not respond to it. Next week we are planning to move from the breadboard to the Arduino's proto shield.
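As a rough sketch of that idea (not our actual code), the encoder-counted square can be structured like the following; the pin assignments, tick targets, and motor helper functions are placeholders for illustration only.

```
// Hypothetical Arduino sketch of the encoder-counted square (illustrative only).
// Pin numbers and tick targets are placeholders, not our real values.

const byte L_IN1 = 4, L_IN2 = 5;   // left H-bridge inputs (assumed pins)
const byte R_IN1 = 6, R_IN2 = 7;   // right H-bridge inputs (assumed pins)

// byte counters keep reads atomic on the AVR while the interrupts run
volatile byte leftTicks = 0;
volatile byte rightTicks = 0;

const byte TICKS_PER_SIDE = 40;    // encoder ticks for one side of the square
const byte TICKS_PER_TURN = 10;    // ticks for a 90-degree pivot

void leftISR()  { leftTicks++; }
void rightISR() { rightTicks++; }

void drive(int l1, int l2, int r1, int r2) {
  digitalWrite(L_IN1, l1); digitalWrite(L_IN2, l2);
  digitalWrite(R_IN1, r1); digitalWrite(R_IN2, r2);
}

void forward()    { drive(HIGH, LOW, HIGH, LOW); }
void pivotRight() { drive(HIGH, LOW, LOW, HIGH); }  // left forward, right reverse
void stopBot()    { drive(LOW, LOW, LOW, LOW); }

void setup() {
  pinMode(L_IN1, OUTPUT); pinMode(L_IN2, OUTPUT);
  pinMode(R_IN1, OUTPUT); pinMode(R_IN2, OUTPUT);
  attachInterrupt(0, leftISR,  RISING);   // interrupt 0 = pin 2 on an Uno
  attachInterrupt(1, rightISR, RISING);   // interrupt 1 = pin 3 on an Uno

  for (int side = 0; side < 4; side++) {
    leftTicks = rightTicks = 0;
    forward();
    while (leftTicks < TICKS_PER_SIDE) { }   // wait until the side is covered
    stopBot();
    delay(250);

    leftTicks = rightTicks = 0;
    pivotRight();
    while (leftTicks < TICKS_PER_TURN) { }   // wait until the 90-degree turn completes
    stopBot();
    delay(250);
  }
}

void loop() { }  // square finished; nothing more to do
```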

One issue we had earlier with the encoders was that during high acceleration they triggered spurious interrupts, throwing off our incremented variable. For example, the wheels should complete one rotation in about 0.8 seconds, which with a 10-sprocket encoder means there should be an interrupt every 0.07–0.08 seconds. Monitoring the output over the serial cable, the incremented variable would sometimes jump 2–4 counts when it should have incremented only once. There are two ways of solving this problem. One is a software fix: add a period after each interrupt during which another interrupt cannot trigger. The other is a hardware fix: use two NAND gates to debounce the signal. Cody implemented the software solution and Adam implemented the hardware solution using a 74LS00 quad NAND chip. Below is a diagram of the two-NAND-gate hardware debounce.
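To show the shape of the software fix, here is a minimal sketch of an interrupt handler with a lockout window; the 5 ms window and the pin choice are assumptions, not our tuned values.

```
// Illustrative software debounce for one encoder channel (assumed values).
volatile unsigned long encoderCount  = 0;
volatile unsigned long lastTickMicros = 0;

// Ignore any interrupt arriving sooner than this after the previous one.
// With a valid tick expected only every ~70-80 ms, 5000 us is a conservative guess.
const unsigned long LOCKOUT_US = 5000;

void encoderISR() {
  unsigned long now = micros();
  if (now - lastTickMicros >= LOCKOUT_US) {  // outside the lockout window
    encoderCount++;
    lastTickMicros = now;
  }
}

void setup() {
  Serial.begin(9600);
  attachInterrupt(0, encoderISR, RISING);  // interrupt 0 = pin 2 on an Uno
}

void loop() {
  Serial.println(encoderCount);  // watch the count over the serial cable
  delay(100);
}
```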

Applying both of these solutions did improve the interrupt accuracy. It's not perfect yet, but with some more adjusting I am sure we will improve it further. We have not yet decided whether to use both when moving to the proto shield, because the Arduino Proto Shield didn't have enough room for two H-bridge chips and a NAND gate chip.

Nannett has calculated the power dissipation of each component and how long the batteries will last on a full charge, shown in the spreadsheet at the following link: Power Dissipation

From her calculations, the batteries should last up to two hours on a full charge.
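The actual figures are in the spreadsheet; purely to illustrate the arithmetic behind an estimate like this (runtime is roughly battery capacity divided by average current draw), here is a toy sketch with placeholder numbers, not Nannett's values.

```
// Toy illustration of the runtime estimate; the capacity and current values
// below are placeholders, not the numbers from the spreadsheet.
const float CAPACITY_MAH   = 2000.0;  // assumed battery pack capacity (mAh)
const float AVG_CURRENT_MA = 1000.0;  // assumed average draw of motors + electronics (mA)

void setup() {
  Serial.begin(9600);
  float hours = CAPACITY_MAH / AVG_CURRENT_MA;  // runtime = capacity / average current
  Serial.print("Estimated runtime (hours): ");
  Serial.println(hours);  // prints 2.00 with these placeholder numbers
}

void loop() { }
```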

Cody has written a good blog post on the encoders and interrupts: http://cneuburg.wordpress.com/2011/10/28/encoders/

Progress Update

It has been a few weeks since the past post and some progress has been made.

First, the robot is about 90% assembled. We still need to connect the H-bridges, which will let us drive the wheels forward and backward.
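To show what the H-bridges give us once they are wired in, here is a minimal hedged sketch of driving one motor forward and backward; the pin numbers are placeholders, not our actual wiring.

```
// Illustrative H-bridge direction control for one motor (assumed pins).
const byte IN1 = 8;   // H-bridge input 1
const byte IN2 = 9;   // H-bridge input 2

void setup() {
  pinMode(IN1, OUTPUT);
  pinMode(IN2, OUTPUT);
}

void forward() { digitalWrite(IN1, HIGH); digitalWrite(IN2, LOW);  }
void reverse() { digitalWrite(IN1, LOW);  digitalWrite(IN2, HIGH); }
void brake()   { digitalWrite(IN1, LOW);  digitalWrite(IN2, LOW);  }

void loop() {
  forward();  delay(1000);   // drive forward for a second
  brake();    delay(500);
  reverse();  delay(1000);   // then back up
  brake();    delay(500);
}
```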

Second, we have reached one of our five milestones, which is to create a computer-simulated API. Cody, our programmer, created a simulator in C# with a very nice top-down design. It is a working simulator where you can tell a robot to turn right or left by a given number of degrees and move forward a desired distance. Below is an example of the image we are trying to make. One improvement we still need on the simulation side is controlling the right and left wheels individually, so the code can be converted to the robot more easily.
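Cody's simulator is written in C#; purely to illustrate the shape of that turn/move API (not his actual code), a minimal C++ sketch of the robot model might look like this. The class name, units, and heading convention are assumptions.

```
// Minimal illustration of a turn/move simulator API (not the actual C# code).
#include <cmath>
#include <cstdio>

const double PI = 3.14159265358979;

class SimRobot {
public:
  // Positive degrees turn right (clockwise); negative degrees turn left.
  void turn(double degrees) { headingDeg += degrees; }

  // Move forward along the current heading by the given distance (arbitrary units).
  void moveForward(double distance) {
    double rad = headingDeg * PI / 180.0;
    x += distance * std::sin(rad);
    y += distance * std::cos(rad);
  }

  void printPose() const {
    std::printf("x=%.2f  y=%.2f  heading=%.1f deg\n", x, y, headingDeg);
  }

private:
  double x = 0.0, y = 0.0, headingDeg = 0.0;  // start at the origin facing "up"
};

int main() {
  SimRobot bot;
  for (int side = 0; side < 4; side++) {  // trace a square, like the real robot
    bot.moveForward(10.0);
    bot.turn(90.0);
  }
  bot.printPose();  // ends back near the origin
  return 0;
}
```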

Our next task is to get more familiar with the Arduino microcontroller and do some experiments with it.


Processing diagram

For one of our assignments we needed to create a small diagram/simulation in a program called Processing. Processing is used by artists to create drawings using functions.

Coming up with the symbol was somewhat challenging because we wanted something original and different from any of the other groups. We ended up taking our initials, which spell the word "CAN", and looked up the Japanese character for the word, which gave us this symbol: "缶". Below is an image of our Processing model.

Key:
Black lines – no ink laid down
Green lines – ink laid down
Blue lines – robot goes over an already-inked line

Welcome to CAN B0T

This blog will document our progress in our Embedded Robotics class, Fall 2011.

Our mission is to complete an autonomous robot that will recognize its location in a grid and perform a sequence of moves, laying paint that will result in a desired diagram.

Mini goals/ Milestones:

1. Assemble the robot to a running state.
2. Create a computer simulation (API).
3. Convert simulation to actual testing.
4. Have the robot complete a simple circle.
5. Have the robot complete a complex diagram.

Within these mini goals we will also need to research communication between robots and autonomous localization within the given grid.