Robotics


Using the modular Lego NXT 2.0 kits, we programmed little bots in Python.

Our base build is the official five-minute bot from the set: a chassis for the different motors and sensors the robot uses.

Parts

Brick - the computing part of the NXT kit
Servo Motor - the motors the brick uses to act in the physical world. A motor can be run at a given power, for a given time, or through a given number of degrees of rotation.
Ultrasonic Sensor - like sonar, it emits a sound above human hearing and measures distance by how long the echo takes to return. Samples range from 0 to 255.
Touch Sensor - senses whether the orange tab is depressed, returning True if it is and False if not.
Color Sensor - measures the amounts of red, green, and blue on a surface, returning either a detected color or the intensity of the sample.
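
The sample conventions above can be sketched with stand-in classes. This is a hypothetical illustration: on the real kit, the readings would come from nxt-python sensor objects via get_sample(), not from these fakes.

```python
# Hypothetical stand-ins mimicking the sample conventions described
# above; on real hardware these values would come from nxt-python
# sensor objects via get_sample().

class FakeUltrasonic:
    """Distance samples are integers in the 0-255 range."""
    def __init__(self, distance):
        self.distance = distance
    def get_sample(self):
        return max(0, min(255, int(self.distance)))

class FakeTouch:
    """Returns True while the orange tab is depressed, False otherwise."""
    def __init__(self, pressed=False):
        self.pressed = pressed
    def get_sample(self):
        return self.pressed

print(FakeUltrasonic(300).get_sample())  # -> 255 (capped at the range limit)
print(FakeTouch(True).get_sample())      # -> True
```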

Functions

Monday, November 3:
The first day was mostly an introduction to how to express motion in Python on the robot. We made some basic movement functions.

While we didn't need to control individual motors just to move, there were some new pieces of code that we used.
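
The original snippets from this day are lost, but the movement helpers probably looked something like this. FakeMotor is a stand-in so the logic runs without a brick; on hardware, left and right would be nxt.motor.Motor objects (ports B and C are an assumption) driven with run() and brake(), or paired so both wheels move as one.

```python
# Hypothetical reconstruction of the day-one movement helpers.
# FakeMotor stands in for nxt.motor.Motor so the functions can be
# exercised without a brick; it just records the last command.

class FakeMotor:
    def __init__(self):
        self.power = 0
    def run(self, power):
        self.power = power   # real motors spin until told otherwise
    def brake(self):
        self.power = 0       # real motors actively brake

left, right = FakeMotor(), FakeMotor()

def forward(power=75):
    left.run(power)
    right.run(power)

def turn_left(power=75):
    left.run(-power)         # counter-rotate the wheels to spin in place
    right.run(power)

def turn_right(power=75):
    left.run(power)
    right.run(-power)

def stop():
    left.brake()
    right.brake()
```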

Monday, November 10:
Today, we finally used sensor readings in conditionals to govern movement.


This is a modified version of the original script that stops within a certain range; the way to retrieve sensor input is as a sample, via [sensor].get_sample().
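
Since the script itself didn't survive, here is a hypothetical sketch of the stop-in-range idea: keep polling the ultrasonic sensor's sample and stop once an obstacle comes within a threshold (the threshold value is an assumption). The stub sensor replays canned readings so the loop can run without hardware.

```python
# Sketch of the stop-in-range loop (hypothetical reconstruction).

STOP_DISTANCE = 20  # assumed threshold, in sensor units

class ReplayUltrasonic:
    """Stand-in that replays a fixed list of distance samples."""
    def __init__(self, samples):
        self.samples = list(samples)
    def get_sample(self):
        return self.samples.pop(0)

def drive_until_close(sensor, stop_distance=STOP_DISTANCE):
    """Return how many polls pass before the bot would stop."""
    polls = 0
    while sensor.get_sample() > stop_distance:
        polls += 1   # real code would keep the motors running here
    return polls     # real code would brake the motors here

print(drive_until_close(ReplayUltrasonic([120, 80, 35, 15])))  # -> 3
```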

Instead of stopping, the robot acts as a scatterbot and tries to avoid contact. When the touch sensor goes off, it stops altogether.
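
The scatterbot's decision logic can be sketched as a pure function over one pair of sensor samples (a hypothetical reconstruction; the distance threshold and action names are assumptions):

```python
# Sketch of the scatterbot decision (hypothetical reconstruction):
# veer away from anything the ultrasonic sensor reports as close,
# and shut down entirely once the touch sensor fires.

TOO_CLOSE = 25  # assumed threshold

def scatter_step(distance, touched, too_close=TOO_CLOSE):
    """Map one pair of sensor samples to an action name."""
    if touched:
        return "stop"        # contact: halt altogether
    if distance < too_close:
        return "turn_away"   # obstacle nearby: avoid it
    return "forward"

print(scatter_step(100, False))  # -> forward
print(scatter_step(10, False))   # -> turn_away
print(scatter_step(10, True))    # -> stop
```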

Wednesday, November 12:
Extending the scatterbot scripts into today, our group made two main variations that each used random movement in a different way.

JJ's Method:



Using a list of functions for going forward, right, and left, JJ had his script choose an element at random and act on it whenever the bot was not too close to anything or had its touch sensor engaged.
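
A hypothetical sketch of JJ's approach, with the movement functions reduced to labels so the selection logic runs on its own:

```python
# Sketch of JJ's method (hypothetical reconstruction): keep a list
# of movement functions and pick one at random each cycle.

import random

def forward():
    return "forward"

def right():
    return "right"

def left():
    return "left"

MOVES = [forward, right, left]

def random_move(rng=random):
    """Choose one of the movement functions at random and call it."""
    return rng.choice(MOVES)()

print(random_move() in ("forward", "right", "left"))  # -> True
```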

My Method:


Instead of different movement commands, I just had my bot turn left or right by a random amount before proceeding.
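
My variation can be sketched like this (a hypothetical reconstruction; the angle range is an assumption):

```python
# Sketch of my method (hypothetical reconstruction): rather than
# picking among whole movements, pick a direction and a random
# turn angle, then proceed forward.

import random

def random_turn(rng=random):
    """Pick a turn: a direction and an angle between 15 and 180 degrees."""
    direction = rng.choice(["left", "right"])
    degrees = rng.randint(15, 180)   # assumed angle range
    return direction, degrees

direction, degrees = random_turn()
print(direction in ("left", "right") and 15 <= degrees <= 180)  # -> True
```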

Thursday, November 13:
With my work done, I spent the day making novelty programs and helping Khanh.


Jeremy swooped in at my desk to show me how to take keyboard input for movement (raw_input([insert prompt here])); I messed around until I got a nice feel for controlling the bot. Inputting a linear movement continues it until a different one, or the brake 'e', is entered, while turning comes in bursts of about 45 degrees.
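
The control scheme can be sketched as a key-to-command table. Everything here is a hypothetical reconstruction except 'e' as the brake, which the text names; the other bindings and the dispatch helper are assumptions.

```python
# Sketch of the keyboard-driven control (hypothetical key bindings
# except for 'e', the brake): linear moves persist until replaced,
# turns come in ~45-degree bursts.

KEY_COMMANDS = {
    "w": ("linear", "forward"),   # assumed binding
    "s": ("linear", "backward"),  # assumed binding
    "a": ("turn", -45),           # ~45-degree burst left (assumed binding)
    "d": ("turn", 45),            # ~45-degree burst right (assumed binding)
    "e": ("brake", None),         # 'e' stops the bot (from the text)
}

def handle_key(key, current):
    """Return the new persistent linear state after one keypress."""
    kind, arg = KEY_COMMANDS.get(key, ("ignore", None))
    if kind == "linear":
        return arg       # a new linear motion persists
    if kind == "brake":
        return None      # stop altogether
    return current       # turns don't change the linear state

state = None
for key in "wdde":       # forward, two right bursts, brake
    state = handle_key(key, state)
print(state)  # -> None
```

On the real bot, each loop iteration would read a character with raw_input and translate the command into motor calls.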

With some messing around with the robot, a color sensor, and extra structures, I managed to get a functioning penbot that wouldn't stray from a sheet of paper (although it tore the paper up pretty badly).
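
The penbot's stay-on-paper behavior can be sketched as a threshold test on the color sensor's intensity sample (a hypothetical reconstruction; the cutoff value and action names are assumptions):

```python
# Sketch of the penbot logic (hypothetical reconstruction): treat a
# bright sample as "still on the white paper" and turn back whenever
# the sensor sees the darker surface beyond the edge.

PAPER_THRESHOLD = 40  # assumed brightness cutoff

def penbot_step(light_sample, threshold=PAPER_THRESHOLD):
    """Map one color-sensor intensity sample to an action name."""
    if light_sample >= threshold:
        return "forward"    # still over the paper: keep drawing
    return "turn_back"      # off the edge: turn back onto the sheet

print(penbot_step(80))  # -> forward
print(penbot_step(10))  # -> turn_back
```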