Robotics page

Tuesday.
We got to program Lego robots. We were told to pair up with a partner, and I paired with Aaron, the guy I partnered with last time in game programming. Our robot had a basic design that we got off the Lego Mindstorms instruction page; its name was the 5 Minute Bot because it probably takes only 5 minutes to build. This is what it looked like before I put the add-ons on, and unfortunately I don't have a picture of the robot with the add-ons on.

We programmed the robot to move in a simple square, but it turns out it's not that simple, because we had to account for the width of the robot, the motor power of the robot, and the wheel size.
The program that was used to make the robot move in a square is what you might expect: move forward x rotations, then turn right 90 degrees, and repeat that four times. It wasn't perfect, but it was close enough.
Here is a picture of the program that I used.
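In case the picture is hard to read, here is roughly the same idea written out as a Python-style sketch. None of these functions are real NXT calls; drive() and spin() are made-up stand-ins for the move blocks, and the wheel and width numbers are just example values.

    import math

    WHEEL_DIAMETER_CM = 5.6   # example wheel size, not necessarily ours
    TRACK_WIDTH_CM = 11.0     # example distance between the two wheels

    def drive(rotations):
        pass                  # stand-in for the move block: both motors forward

    def spin(wheel_degrees):
        pass                  # stand-in for a spin turn: motors in opposite directions

    def wheel_degrees_for_turn(robot_degrees):
        # This is where the robot's width and wheel size come in: in a spin
        # turn each wheel rolls along a circle whose diameter is the track
        # width, so the wheels have to turn farther than the robot does.
        arc_cm = math.pi * TRACK_WIDTH_CM * (robot_degrees / 360.0)
        return arc_cm / (math.pi * WHEEL_DIAMETER_CM) * 360.0

    def drive_square(side_rotations=2):
        for _ in range(4):                    # four sides of the square
            drive(side_rotations)             # move forward "x" rotations
            spin(wheel_degrees_for_turn(90))  # turn right 90 degrees

With those example numbers, each wheel has to turn about 177 degrees just to turn the robot 90 degrees, which is why the square takes some trial and error to get right.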



Wednesday.
Today, we got to attach a touch sensor, which sends a signal to the robot that triggers a program, and a sight sensor, which does the same thing as the touch sensor but relies on sight instead of touch. This would allow our robot to see or feel if there was an obstacle in the way. Programming it was a bit tricky, but I guess that was because I'm so used to using the NXT program for programming a Lego robot. Our program simply had the robot go forward forever unless the touch sensor got touched by an obstacle; if the sensor got touched, it would back up, turn right about 100 degrees, then move forward in the other direction. The sight sensor program was basically the same thing, except we had to tell it how far it could see so that it wouldn't keep backing up even though the obstacle was 20 ft away.
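Here is the same kind of Python-style sketch for that program. All of the functions are made-up stand-ins for the blocks we wired together, and the 25 cm sight limit is just an example number.

    SEE_LIMIT_CM = 25.0        # example cut-off for how far the robot can "see"

    def touch_pressed():
        return False           # stand-in for reading the touch sensor

    def distance_cm():
        return 100.0           # stand-in for reading the sight sensor

    def forward():
        pass                   # stand-in: both drive motors on, forward

    def back_up():
        pass                   # stand-in: both drive motors reversed briefly

    def turn_right(robot_degrees):
        pass                   # stand-in: spin turn to the right

    def avoid_obstacles():
        while True:
            if touch_pressed() or distance_cm() < SEE_LIMIT_CM:
                back_up()            # something is in the way, so back off
                turn_right(100)      # turn right about 100 degrees
            else:
                forward()            # nothing close, keep going forward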
Thursday.
On the third day, we got these BGR (blue, green, red) color sensors. Our first objective was to make our robots find a red ball. What we did was simply set the robot to its normal obstacle-avoidance program and set it off into the room; once it got a hold of the ball it would wave its claw arm. It wasn't meant to capture the ball, just to say that it had the ball. To give you an idea of what the claw looked like, here is a picture of one.

The long arm part of the robot above is what we used on ours.
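Here is a sketch of the ball hunt in the same style. Everything is a made-up stand-in again; the real program was just the normal avoidance behavior with a check on the color sensor added.

    def color_seen():
        return "none"          # stand-in for reading the color sensor

    def obstacle_in_the_way():
        return False           # stand-in: touch sensor pressed or something close

    def forward():
        pass                   # stand-in: both drive motors on

    def back_up():
        pass                   # stand-in: reverse briefly

    def turn_right(robot_degrees):
        pass                   # stand-in: spin turn to the right

    def stop():
        pass                   # stand-in: all motors off

    def wave_claw():
        pass                   # stand-in: lift and drop the claw arm motor

    def hunt_red_ball():
        while True:
            if color_seen() == "red":
                stop()
                wave_claw()          # don't grab the ball, just signal we found it
                return
            if obstacle_in_the_way():
                back_up()            # same avoidance moves as before
                turn_right(100)
            else:
                forward()            # keep wandering around the room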
Monday.
The next Monday, we had our robot do a Mars mission simulation. We used a light sensor to navigate the board; for example, we used the color sensor to detect if the robot was about to go off the edge by sensing black, and if it sensed blue it would stop. We could have gotten our robot to work, but it kept sensing its own shadow.
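Here is a rough sketch of that logic, with made-up stand-in functions again; the exact reaction to seeing black is only an example.

    def color_seen():
        return "white"         # stand-in for the downward-facing color sensor

    def forward():
        pass                   # stand-in: both drive motors on

    def back_up():
        pass                   # stand-in: reverse briefly

    def stop():
        pass                   # stand-in: all motors off

    def mars_mission():
        while True:
            color = color_seen()
            if color == "black":      # black means the edge of the board
                # This is where the shadow got us: the robot's own shadow can
                # darken the reading enough to look like black, so it reacts
                # even when it is nowhere near the edge.
                back_up()
            elif color == "blue":     # blue means stop
                stop()
                return
            else:
                forward()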