Robotics

Monday November 17, 2014 (day 1, week 3)

Today was the first day back to expeditions and we got right to work. This week our focus is on robots, and more specifically, how to program Lego Mindstorm robots. To begin, I had some problems trying to get Python 2.6 to run, but when it finally started working it went pretty smoothly for most of the first period. The first thing I had to do was find the original Pybot document that Dr. Mark Miller made. Once I found that, Jimmy and I set Python aside and tried to build our robot. It took several tries to make our own design, but pretty soon we just went with the 5-minute robot from the link that Mr. Farrell used.

But we decided to add several different sensors, like the ultrasonic sensor and the touch sensor. When we started to control the robot with Python, one problem I kept running into was how I defined my inputs. After several tries I made a simple command that still needs some adjustments.
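I don't have the actual command saved, but it was along these lines. This is just a rough sketch written with the open-source nxt-python library rather than the real Pybot names, and the motor ports and the drive_forward name are my own placeholders:

import time
import nxt.locator
from nxt.motor import Motor, PORT_B, PORT_C

brick = nxt.locator.find_one_brick()   # connect to the NXT brick
left = Motor(brick, PORT_B)            # assuming the left wheel motor is on port B
right = Motor(brick, PORT_C)           # assuming the right wheel motor is on port C

def drive_forward(power, seconds):
    # The inputs are the motor power and how long to drive for.
    left.run(power)
    right.run(power)
    time.sleep(seconds)
    left.brake()
    right.brake()

drive_forward(75, 2)   # example call: drive at power 75 for 2 seconds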

What I wanted my input to do was to have the robot drive in a square, but because it took a while to build a robot that could actually stand, we ran out of time to test my new code. However, Jimmy was able to write multiple commands that used the ultrasonic sensor to guide the robot, which was pretty cool because it would go around the corner of the class, but whenever it got too close to the wall it stopped and turned.
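I never got to test it, but the plan for the square was just four repeats of drive forward, then turn about a quarter of the way around. A rough sketch, reusing the motors and the drive_forward helper from the sketch above (the turn time is a guess that would have needed tuning on the real robot):

def turn_right(power, seconds):
    # Spin in place by running the wheels in opposite directions.
    left.run(power)
    right.run(-power)
    time.sleep(seconds)
    left.brake()
    right.brake()

for side in range(4):        # four sides of the square
    drive_forward(75, 2)
    turn_right(60, 0.8)      # 0.8 seconds is only a guess at roughly 90 degrees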

Tuesday November 18, 2014 (day 2, week 3)

Today we continued to work on our robots, but we had a new challenge: to guide our robot with a set of code that it followed on its own. This was actually decently challenging, hard but not too hard, and fun. It was fun because we could see our little Lego robots go in circles or run into each other, and it was challenging because we had to make the color sensor tell us what color was in front of it. When it came to the final challenge, where we went up against other people, we totally lost. We lost because even though the sensor could detect light, we were unable to make a loop that continually checked for the color. One if statement we could have used was to tell the robot that if it detected a certain color, it should stop. The code we used to sense the color was the one below.
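I don't have the original screenshot anymore, so this is a rough reconstruction using nxt-python names; our actual variable names and port were different, and the number for blue is a guess we would have had to check:

import nxt.locator
from nxt.sensor import Color20, PORT_3

brick = nxt.locator.find_one_brick()
color = Color20(brick, PORT_3)   # assuming the color sensor was plugged into port 3

reading = color.get_sample()     # gives back a small number code for the color it sees
print(reading)

BLUE = 2                         # guess at the code the sensor reports for blue
if reading == BLUE:
    # this is where the if statement would stop the drive motors
    pass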

These inputs were already in the bot code, but we had to change the name and the ports so that it would work on our robot. After everyone finished the challenge, everything got crazy, because before we knew it people were having robot fights. Because our robot originally moved away from any object in front of it, we had to command it manually instead of with a loop. One smart thing some people did was change the names of the commands so they could control the bot faster. I don't have a picture of this, but basically they would shorten a command like forward to a single letter like w or f.
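In Python that trick is just giving an existing function a second, shorter name, like this, using the drive_forward and turn_right helpers from Monday's sketches as stand-ins for the real Pybot commands:

w = drive_forward   # now typing w(75, 2) does the same thing as drive_forward(75, 2)
d = turn_right      # one letter is a lot faster to type in the middle of a robot fight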

Wednesday 11/19/2014

Today we were again working on robotics. However, today we had the challenge of making the bot follow a specific path. The path was similar to one a working robot has to follow, but it was made out of blue masking tape. This challenge was pretty difficult because the bot had to move in a straight line. One solution our group figured out was to find the maximum power at which the robot could travel evenly. The highest power value the motors would take was 127, and we found this out by looking in the Pybot program that Chris Miller made.

When it got to the challenge of following the line, we were able to find a command that let us scan the color on the floor. The idea was to have the robot move forward only if the ground was blue, and to rotate until it found the blue line again. Our code looked like this.
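I don't have the screenshot, but it was close to this sketch, again in nxt-python style; the port numbers and the number code for blue are the parts we had to work out for our own robot:

import nxt.locator
from nxt.motor import Motor, PORT_B, PORT_C
from nxt.sensor import Color20, PORT_3

brick = nxt.locator.find_one_brick()
left = Motor(brick, PORT_B)
right = Motor(brick, PORT_C)
color = Color20(brick, PORT_3)   # color sensor pointing down at the floor

BLUE = 2          # guess at the sensor's number code for blue; we found ours by testing
MAX_POWER = 127   # the highest power value the motors would take

while True:
    if color.get_sample() == BLUE:
        # On the tape: drive straight by giving both wheels the same power.
        left.run(MAX_POWER)
        right.run(MAX_POWER)
    else:
        # Off the tape: spin slowly until the sensor sees blue again.
        left.run(50)
        right.run(-50)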

The robot was able to follow the color but had trouble when it got too close to the edge of the tape. It would drive off the tape and was not able to get back on. Our group ran out of time to adjust it, because we had struggled earlier trying to figure out a way for the robot to get back on track.

Thursday 11/20/2014

The challenge today was to get a robot to "scatter". This meant that the robot had to avoid running into other robots, desks, or walls. In order to do this the robot needed an ultrasonic sensor and a touch sensor.


The code we used was similar to the code from the rest of the week, but because we added the sensors we had to define which ports they were in.
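The lines we added looked roughly like this (nxt-python style; the port numbers are just where we happened to plug things in), together with a simple version of the scatter behavior itself:

import time
import nxt.locator
from nxt.motor import Motor, PORT_B, PORT_C
from nxt.sensor import Touch, Ultrasonic, PORT_1, PORT_4

brick = nxt.locator.find_one_brick()
left = Motor(brick, PORT_B)         # drive motors, same ports as the rest of the week
right = Motor(brick, PORT_C)
touch = Touch(brick, PORT_1)        # touch sensor plugged into port 1
sonar = Ultrasonic(brick, PORT_4)   # ultrasonic sensor plugged into port 4

while True:
    if touch.get_sample() or sonar.get_sample() < 20:
        # Bumped into something, or something is closer than about 20 cm:
        # back up for a moment and spin away from it.
        left.run(-80)
        right.run(-80)
        time.sleep(0.5)
        left.run(80)
        right.run(-80)
        time.sleep(0.7)
    else:
        # Nothing nearby, keep wandering forward.
        left.run(80)
        right.run(80)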



As you can see, each port is labelled, so when you plug a sensor into a port you need to make sure you plugged it into the right one. When our robot was tested it did work, since it had been working the rest of the week. After we adjusted it a little we had some time to remodel the bot. Because we thought our bot was going to be tested against other robots and that the competition would be tough, we decided to try to add another motor in front that could lift up other robots. Some of the other classmates had begun to weaponize their robots, so Jimmy and I decided to weaponize ours against them. But we found out that in the Pybot code, even though there were three different motors, only two of them were set up by default. That barrier gave us trouble, and even after we enabled all three motors in a copy of Pybot, we still had a problem. Fortunately Nick and Ramiero found out how to use the third motor. As it turned out, there was already code to move just the third motor, and it was defined as Up and Down.
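I don't have the exact Up and Down code, but the idea is just the third motor port with its own little move commands, something like this rough version (continuing from the setup above, with the lift arm assumed to be on port A):

from nxt.motor import Motor, PORT_A

arm = Motor(brick, PORT_A)   # the third motor, on the one port the drive wheels don't use

def up():
    arm.turn(60, 90)         # rotate the arm motor 90 degrees to lift

def down():
    arm.turn(-60, 90)        # rotate it back the other way to lower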



Unfortunately we still could not get it to work. Another cool thing Willy found out was that you can upload songs onto the robot, so once we downloaded them we could connect our bot to the computer and transfer the file to it. The Python code to get it to play the song was the line below.
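I don't remember it exactly, but in nxt-python it is a single call on the brick once the file has been transferred. The file name here is just an example, and the song has to be in the NXT's own .rso sound format:

import nxt.locator

brick = nxt.locator.find_one_brick()
brick.play_sound_file(False, 'song.rso')   # False means play it once instead of looping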


Friday 11/21/14

Today our challenge was to make a bot that could travel on a "Martian" landscape, so in other words it had to not fall off the desk. Our partner group, who made the undefeated champion bot "Moosen", was able to write the code to accomplish the challenge. One thing I noticed with their bot was that it had trouble near the edges. The fix they came up with was to mount the color sensor farther out in front of the bot so that it could sense the color of the table, and when it stopped sensing the table color, the bot would stop.
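I didn't copy their code, but the idea works out to something like this sketch: keep driving while the sensor still reads the table color, and stop the moment the reading changes. The number for the table color is whatever they measured for their own desk:

import nxt.locator
from nxt.motor import Motor, PORT_B, PORT_C
from nxt.sensor import Color20, PORT_3

brick = nxt.locator.find_one_brick()
left = Motor(brick, PORT_B)
right = Motor(brick, PORT_C)
color = Color20(brick, PORT_3)   # mounted out in front so it sees the edge early

TABLE_COLOR = 6                  # guess: the number the sensor reads for the desk surface

left.run(70)
right.run(70)
while color.get_sample() == TABLE_COLOR:
    pass                         # still over the table, keep driving

left.brake()                     # the reading changed, so we are at the edge
right.brake()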