Yesterday I had an opportunity to play with a Nao robot in a three-hour workshop at UCSC, run by Dr. Denise Szecsei of the University of Iowa. I found out about the workshops through an article in Santa Cruz Tech Beat, an on-line publication about local tech events. (Santa Cruz Tech Beat is worth reading, with a high signal-to-noise ratio and only about 30 articles a month.)
The basic idea that Denise was pushing is the use of the Nao robots in introductory programming courses—she created a course at the University of Iowa called Dancing Robots that has supposedly been successful in recruiting women into programming courses there (she did not give us detailed information, as the focus of the workshop was on getting us to experience programming the robots, not on academic justification). She was also looking for collaborators on either educational projects or art projects, so she was glad to have some grad students from the Digital Arts and New Media program at the workshop.
You can see an example of the results of the dancing robots courses on YouTube:
I’ve always thought that the Nao robots were cute, but at $9000 from the US distributor RobotsLab (and $890/year for warranty), they are too expensive for me to buy as a toy. So I welcomed the chance to play with one for 3 hours.
The workshop was intended as a brief tutorial on Choregraphe, the drag-and-drop scripting environment for the robots. That environment looks easy to use, with simple message passing along wires to trigger pre-written boxes for actions like “say” or “timeline” animation. Most of the dancing work is done by using the timeline like stop-frame animation (at 25 “frames” per second), with the software interpolating between target poses. The target poses are created by physically manipulating the robot, making the whole process accessible to 6th graders (though the fragility and cost of the robots make me think that you would need careful supervision even for high-school and college students).
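The timeline mechanics can be sketched as interpolation between keyframe poses. This is a minimal linear version—Choregraphe itself supports smoother interpolation curves between keyframes, and the two joint names here are just the ones involved in the weight shift described below:

```python
# Sketch of timeline playback: interpolate between target poses at
# 25 "frames" per second. Linear interpolation is an assumption here;
# Choregraphe offers smoother curves between keyframes.

FPS = 25  # the timeline's frame rate

def interpolate_pose(pose_a, pose_b, t):
    """Blend two poses (dicts of joint name -> angle in radians) at t in [0, 1]."""
    return {joint: (1 - t) * pose_a[joint] + t * pose_b[joint]
            for joint in pose_a}

def frames_between(pose_a, pose_b, seconds):
    """Generate the intermediate poses the timeline would play back."""
    n = int(seconds * FPS)
    return [interpolate_pose(pose_a, pose_b, i / n) for i in range(n + 1)]

stand = {"LHipRoll": 0.0, "LAnkleRoll": 0.0}
shifted = {"LHipRoll": 0.2, "LAnkleRoll": -0.2}
frames = frames_between(stand, shifted, 1.0)  # one second of motion
```

On the robot, each intermediate pose would be sent to the joint controllers; here the frames are just computed, which is enough to see how sparse keyframes become dense motion.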
What I wanted to do was to program one action—shifting the weight of the robot onto one leg, then picking up the other leg, so that the robot stood on one foot. I planned to do the weight shifting by coordinated motion of the hip roll and ankle roll actuators. Initially, I had thought to do it on just one leg, but I ended up doing it on both legs, since the starting position had the feet approximately the hip distance apart, so rotating both hip-roll actuators one way and both ankle-roll actuators the other way results in a parallelogram linkage, with the hips and torso staying level while moving sideways.
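The parallelogram effect can be captured with a little geometry: equal-and-opposite hip-roll and ankle-roll rotations cancel, so the torso stays level and simply translates sideways by about L·sin(θ). A small sketch, with a made-up leg length rather than Nao's actual dimensions:

```python
import math

LEG_LENGTH = 0.30  # hip-to-ankle distance in metres (hypothetical, not Nao's spec)

def parallelogram_shift(roll_angle):
    """Lateral displacement of the torso when both hip-roll actuators rotate
    by +roll_angle and both ankle-roll actuators by -roll_angle.

    With the feet hip-distance apart, the legs sweep like the sides of a
    parallelogram, so the torso translates sideways by L * sin(angle)."""
    return LEG_LENGTH * math.sin(roll_angle)

def torso_tilt(hip_roll, ankle_roll):
    """Net torso tilt; zero when the rolls cancel (hip_roll == -ankle_roll)."""
    return hip_roll + ankle_roll
```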
To detect the weight shift, I used the force resistors in the foot. There are several ways to access them through getData() calls: processed values like “leftFootTotalWeight” and “Device/SubDeviceList/LFoot/FSR/TotalWeight/Sensor/Value”, or the raw sensor values like “Device/SubDeviceList/LFoot/FSR/FrontLeft/Sensor/Value”, … . I ended up using “leftFootTotalWeight” or “rightFootTotalWeight”. The basic idea was to start a thread that moved the hips far to the left, and set up an interrupt routine triggered by the event footContactChanged. When that event fired, I checked the weight on the right foot, and stopped the motion if the weight was low enough (I think I used 300g, since the unloaded force resistor with the foot in the air was reporting something like 200g).
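The stop logic can be sketched independently of the robot. On the Nao this would poll getData("rightFootTotalWeight") from a callback on footContactChanged; here a stub list of readings stands in for the sensor so the control flow can run anywhere:

```python
# Sketch of the weight-shift stop logic. The readings list is a stand-in
# for repeated getData("rightFootTotalWeight") calls; move_step() stands
# in for one increment of the hip/ankle-roll motion.

UNLOAD_THRESHOLD_KG = 0.300  # ~300 g; the unloaded FSR read about 200 g

def shift_weight(right_foot_readings, move_step):
    """Keep moving until the right foot's reported load drops below the
    threshold, then stop. Returns the number of motion steps taken."""
    steps = 0
    for weight in right_foot_readings:
        if weight < UNLOAD_THRESHOLD_KG:
            break  # right foot is unloaded: stop the motion
        move_step()
        steps += 1
    return steps
```

For example, with simulated readings of 2.5, 1.8, 1.0, 0.5, 0.25 kg, the motion stops after four steps, when the load first dips under 300 g.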
I did not have time to add a further refinement to the weight shift, to adjust the weight to be centered over the supporting foot, using the center of pressure sensor. I had the robot speak the values of that sensor, but did not use it to tweak the hip and ankle angles of the supporting leg.
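That refinement would probably amount to a simple proportional correction: nudge the supporting leg's roll angles by an amount proportional to the center-of-pressure offset. A sketch—the gain is a made-up number, not anything tuned on the robot:

```python
# Sketch of centering the weight over the supporting foot using the
# center-of-pressure (CoP) reading. A proportional correction only;
# GAIN is hypothetical and would need tuning on the real robot.

GAIN = 0.5  # radians of roll correction per metre of CoP error (made up)

def ankle_correction(cop_y, target_y=0.0):
    """Roll-angle tweak from the lateral CoP offset (metres) on the support foot."""
    return GAIN * (target_y - cop_y)
```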
Once the weight had been shifted onto the left leg, I had Nao pick up the right leg by adjusting the hip pitch, knee pitch, and ankle pitch of that leg. The posing software in Choregraphe made it fairly easy to figure out the correct signs for the angles and to pick target values. The robot lurched a little bit as the right foot was picked up, probably because the foot had not been fully unweighted, but possibly because the foot was not lifted quite vertically.
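A two-link sketch (with hypothetical link lengths, not Nao's) makes the sign relationships concrete: with equal thigh and shank lengths, a knee bend of twice the hip pitch lifts the ankle straight up under the hip, and the ankle pitch must cancel the other two to keep the sole level:

```python
import math

THIGH, SHANK = 0.10, 0.10  # hypothetical link lengths in metres

def foot_position(hip_pitch, knee_pitch):
    """Ankle position in the hip's sagittal frame (x forward, z down)."""
    x = THIGH * math.sin(hip_pitch) + SHANK * math.sin(hip_pitch + knee_pitch)
    z = THIGH * math.cos(hip_pitch) + SHANK * math.cos(hip_pitch + knee_pitch)
    return x, z

def ankle_pitch_for_flat_foot(hip_pitch, knee_pitch):
    """Keep the foot sole parallel to the ground: the three pitches sum to zero."""
    return -(hip_pitch + knee_pitch)
```

With hip_pitch = -a and knee_pitch = 2a, the forward offset x cancels, so the foot rises vertically—if the real angles don't satisfy that relationship, the foot drags forward or back as it lifts, which may explain the lurch.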
If I’d had more time, I would have done the centering of the weight over the supporting leg before lifting the other foot. I would also have moved the motion of the supporting foot into a separate script, so that different gestures could be made from the one-legged stance. In doing that, I’d probably want to have any one-legged boxes run a balancing thread to keep the weight centered over the supporting foot, so that sticking out an arm or a leg doesn’t topple the robot. Either that, or have a one-legged balance box that runs in parallel with any gesture actions. It would probably take me a day or two of programming to create one-legged action boxes robust enough for someone else to use, and probably a few weeks of user testing before they would be worth adding to Choregraphe.
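The "balance box running in parallel" idea is basically a background thread that applies small corrections while a gesture runs and is stopped cleanly when the gesture finishes. A minimal sketch, with a placeholder correction function standing in for the real CoP-based adjustment:

```python
# Sketch of a parallel balance loop: a thread that repeatedly applies a
# correction until signalled to stop. correct_balance() is a placeholder
# for the real center-of-pressure adjustment on the supporting leg.

import threading
import time

def run_balance_loop(correct_balance, stop_event, period=0.04):
    """Call correct_balance() every `period` seconds until stop_event is set
    (40 ms matches the timeline's 25 frames per second)."""
    while not stop_event.is_set():
        correct_balance()
        stop_event.wait(period)  # wakes early if stop_event is set

stop = threading.Event()
ticks = []
balancer = threading.Thread(target=run_balance_loop,
                            args=(lambda: ticks.append(time.time()), stop))
balancer.start()
time.sleep(0.2)   # ... the one-legged gesture would run here ...
stop.set()        # gesture done: stop balancing
balancer.join()
```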
Working with the robots definitely needs two people, as one person needs to spot the robot any time it is moving (remember, fragile and expensive!).
It was very useful working with someone familiar with part of the programming interface, so that I did not have to waste much time figuring out how to create a new box or test it out. He was the one who suggested outputting a string to pass into a “say” box for reporting the foot sensor values for debugging, for example. I started out just reading the sensors and reporting the weight, then tried figuring out how to interrupt an ongoing motion by raising one arm very slowly, to be interrupted by any of the touch sensors. Once I had the basics of starting a parallel thread and stopping it on an interrupt, I programmed the weight shift. Once that was working I added lifting and lowering the non-supporting leg.
I fully expected that the spotter would be called on to catch the robot as it fell during my development of the program, but despite a little worrisome lurching as the unweighted foot was lifted, the robot never toppled.
I doubt that the Python SDK is fast enough to do a closed-loop feedback system for balance, but it was certainly good enough for a slow weight shift onto a single leg, and the feet are wide enough that no dynamic balancing was needed. It would be a fun programming environment for a Python course, as long as the students were into humanoid robotics.
The Choregraphe environment provides a reasonable interface for fairly simple sequencing and synchronization, but I suspect one hits its limits pretty quickly. Being able to create new blocks rapidly, by copying existing blocks and editing the Python in them, makes the system a lot more powerful, though I got the impression that the “Dancing Robots” courses rarely get that far.
The Nao robots were, as I expected, a lot of fun, but I couldn’t really recommend them for a beginning programming class. At $9000 for each pair of students, they are way too expensive and way too fragile. For beginning programmers, you really want things that students don’t have to worry about breaking if they make a little mistake. One can get fun robot toys for students to program for $100 each, since wheeled robots are much cheaper and easier to make than humanoid ones. Not only is the financial risk lower with cheap robots, but they can be made much more robust (though educational robots needn’t be built to the sturdiness required of combat bots).
To break into the consumer market or the school market, I think that Nao would have to come down in price by about a factor of 10, which would be difficult to manage, given what the robots contain. There are 25 actuators (2 in the head, 6 in each arm, and 6 in each leg, with one hip joint shared between the legs), plus cameras, an ultrasonic range finder, touch sensors, foot pressure sensors (4 per foot), a gyroscope, an accelerometer, LEDs, … . The motors and 12-bit joint angle sensors (magnetic rotary encoders) alone probably cost close to $1000.