Gas station without pumps

2016 August 12

Playing with Nao humanoid robot

Filed under: Robotics — gasstationwithoutpumps @ 11:09

Yesterday I had an opportunity to play with a Nao robot in a three-hour workshop at UCSC, run by Dr. Denise Szecsei of the University of Iowa. I found out about the workshops through an article in Santa Cruz Tech Beat, an on-line publication about local tech events. (Santa Cruz Tech Beat is worth reading, with a high signal-to-noise ratio and only about 30 articles a month.)

The basic idea that Denise was pushing is the use of the Nao robots in introductory programming courses—she created a course at the University of Iowa called Dancing Robots that has supposedly been successful in recruiting women into programming courses there (she did not give us detailed information, as the focus of the workshop was on getting us to experience programming the robots, not on academic justification). She was also looking for collaborators on either educational projects or art projects, so she was glad to have some grad students from the Digital Arts and New Media program at the workshop.

You can see an example of the results of the Dancing Robots courses on YouTube:

I’ve always thought that the Nao robots were cute, but at $9000 from the US distributor RobotsLab (and $890/year for warranty), they are too expensive for me to buy as a toy.  So I welcomed the chance to play with one for 3 hours.

The workshop was intended to be a brief tutorial on Choregraphe, the drag-and-drop scripting environment for the robots. That environment looks ok to use, with simple message passing along wires to trigger pre-written boxes for actions like “say” or “timeline” animation.  Most of the dancing work is done by using the timeline like stop-frame animation (at 25 “frames” per second), with the software interpolating between target poses.  The target poses are created by physically manipulating the robot, making the whole process accessible to 6th graders (though the fragility and cost of the robots make me think that you would need careful supervision even for high-school and college students).
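The timeline idea can be sketched in a few lines of Python. This is a minimal illustration assuming simple linear blending between keyframe poses—the real software offers smoother interpolation modes, and the joint name below is just an example:

```python
# Minimal sketch of timeline-style keyframe animation: linear
# interpolation between target poses at a fixed frame rate.
# (The real environment interpolates more smoothly; this only
# illustrates the idea.)

def interpolate_pose(pose_a, pose_b, t):
    """Blend two poses (dicts of joint name -> angle in radians) at t in [0, 1]."""
    return {joint: (1 - t) * pose_a[joint] + t * pose_b[joint] for joint in pose_a}

def timeline(keyframes, fps=25):
    """Expand (frame_number, pose) keyframes into one pose per frame."""
    frames = []
    for (f0, p0), (f1, p1) in zip(keyframes, keyframes[1:]):
        for f in range(f0, f1):
            t = (f - f0) / float(f1 - f0)
            frames.append(interpolate_pose(p0, p1, t))
    frames.append(keyframes[-1][1])
    return frames

# One second of motion at 25 fps: swing the left hip roll from 0 to 0.3 rad.
poses = [(0, {"LHipRoll": 0.0}), (25, {"LHipRoll": 0.3})]
frames = timeline(poses)  # one pose per frame, endpoints included
```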

I was not interested in the dance aspects of the robots, so I worked with one of the workshop staff (Denise’s son) on diving into the Python SDK (there are also C++, Java, and JavaScript interfaces, but the Python one is best integrated with Choregraphe and the best for rapid prototyping, which is all I had time for). I spent a little time the night before the workshop looking at the programming interface (which I did not really understand from the quick glances at the documentation) and at the capabilities of the robot in terms of sensors and actuators.

What I wanted to do was to program one action—shifting the weight of the robot onto one leg, then picking up the other leg, so that the robot stood on one foot.  I planned to do the weight shifting by coordinated motion of the hip roll and ankle roll actuators.  Initially, I had thought to do it on just one leg, but I ended up doing it on both legs, since the starting position had the feet approximately the hip distance apart, so rotating both hip-roll actuators one way and both ankle-roll actuators the other way results in a parallelogram linkage, with the hips and torso staying level while moving sideways.
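The parallelogram geometry can be made concrete with a little trigonometry. Rotating both hip-roll joints by θ and both ankle-roll joints by −θ translates the level torso sideways by roughly L·sin θ, where L is the hip-to-ankle distance. The leg length here is a made-up illustrative value, not a measured Nao dimension:

```python
import math

# Sketch of the parallelogram weight shift: both hip rolls rotate one
# way and both ankle rolls the other, so the torso stays level and
# translates sideways.  LEG_LENGTH_M is a hypothetical value for
# illustration only.

LEG_LENGTH_M = 0.30  # assumed hip-to-ankle distance, in meters

def lateral_shift(theta_rad, leg_length=LEG_LENGTH_M):
    """Sideways displacement of the hips for a roll angle theta."""
    return leg_length * math.sin(theta_rad)

def height_drop(theta_rad, leg_length=LEG_LENGTH_M):
    """The hips also drop slightly as the legs lean over."""
    return leg_length * (1 - math.cos(theta_rad))
```

A 10° roll on a 30 cm leg would shift the hips about 5 cm sideways while dropping them only about 5 mm, which is why the motion looks like a pure sideways slide.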

To detect the weight shift, I used the force-sensing resistors in the feet.  There are several ways to access them through getData() calls: the processed “leftFootTotalWeight” and “Device/SubDeviceList/LFoot/FSR/TotalWeight/Sensor/Value”, or the raw sensor values “Device/SubDeviceList/LFoot/FSR/FrontLeft/Sensor/Value”, … .  I ended up using “leftFootTotalWeight” or “rightFootTotalWeight”.  The basic idea was to start a thread moving the hips far to the left, and to set up an interrupt routine triggered by the event footContactChanged. When that event fired, I checked the weight on the right foot, and stopped the motion if the weight was low enough (I think I used 300g, since the unloaded force resistor with the foot in the air was reporting something like 200g).
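The control logic can be sketched as follows. On the real robot this would go through the naoqi Python SDK (an ALMemory proxy supplying getData() and the footContactChanged event); here a simple stub stands in for the robot so the logic is runnable on its own, and the stub's weight readings are made up:

```python
# Sketch of the weight-shift stopping logic.  StubMemory stands in for
# the robot's ALMemory proxy; on the real robot, getData() would return
# live force-sensor readings and the loop body would run from the
# footContactChanged event handler.

UNLOAD_THRESHOLD_G = 300  # stop shifting once the right foot carries < 300 g

class StubMemory:
    """Fake memory proxy returning a scripted sequence of weight readings."""
    def __init__(self, readings):
        self._readings = iter(readings)
    def getData(self, key):
        return next(self._readings)

def shift_until_unweighted(memory, max_steps=100):
    """Check the right-foot weight on each event; report the step at which
    the hip/ankle motion should be stopped, or None if never unweighted."""
    for step in range(max_steps):
        weight = memory.getData("rightFootTotalWeight")
        if weight < UNLOAD_THRESHOLD_G:
            return step  # the real code would stop the motion thread here
    return None

# Weight drains off the right foot as the hips move left:
memory = StubMemory([2100, 1500, 800, 250])
stopped_at = shift_until_unweighted(memory)
```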

I did not have time to add a further refinement to the weight shift, to adjust the weight to be centered over the supporting foot, using the center of pressure sensor. I had the robot speak the values of that sensor, but did not use it to tweak the hip and ankle angles of the supporting leg.
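Had there been time, the centering refinement would amount to a simple proportional correction: nudge the supporting leg's hip and ankle roll by an amount proportional to the center-of-pressure error. The gain and sign convention below are made-up illustrative values, not tuned for a real Nao:

```python
# Sketch of the unimplemented refinement: a proportional correction that
# steers the center of pressure toward the middle of the supporting foot.
# GAIN is a hypothetical value for illustration, not a tuned constant.

GAIN = 0.5  # assumed proportional gain, in radians per meter of CoP error

def roll_correction(cop_y_m, target_y_m=0.0, gain=GAIN):
    """Roll-angle adjustment that moves the CoP toward the target offset."""
    return gain * (target_y_m - cop_y_m)
```

The correction would be applied repeatedly (each sensor update) rather than once, so even a rough gain converges as long as it has the right sign and isn't large enough to overshoot.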

Once the weight had been shifted onto the left leg, I had Nao pick up the right leg by adjusting the hip pitch, knee pitch, and ankle pitch of that leg.  The posing software in Choregraphe made it fairly easy to figure out the correct signs for the angles and to pick target values. The robot lurched a little bit as the right foot was picked up, probably because the foot had not been fully unweighted, but possibly because the foot was not lifted quite vertically.

If I’d had more time, I would have done the centering of the weight over the supporting leg before lifting the other foot. I would also have moved the motion of the supporting foot into a separate script, so that different gestures could be made from the one-legged stance. In doing that, I’d probably want to have any one-legged boxes run a balancing thread to keep the weight centered over the supporting foot, so that sticking out an arm or a leg doesn’t topple the robot. Either that, or have a one-legged balance box that runs in parallel with any gesture actions.  It would probably take me a day or two of programming to create one-legged action boxes robust enough for someone else to use them, and probably a few weeks of use testing before they would be worth adding to Choregraphe.

Working with the robots definitely needs two people, as one person needs to spot the robot any time it is moving (remember, fragile and expensive!).

It was very useful working with someone familiar with part of the programming interface, so that I did not have to waste much time figuring out how to create a new box or test it out. He was the one who suggested outputting a string to pass into a “say” box for reporting the foot sensor values for debugging, for example. I started out just reading the sensors and reporting the weight, then tried figuring out how to interrupt an ongoing motion by raising one arm very slowly, to be interrupted by any of the touch sensors. Once I had the basics of starting a parallel thread and stopping it on an interrupt, I programmed the weight shift. Once that was working I added lifting and lowering the non-supporting leg.

I fully expected that the spotter would be called on to catch the robot as it fell during my development of the program, but despite a little worrisome lurching as the unweighted foot was lifted, the robot never toppled.

I doubt that the Python SDK is fast enough to do a closed-loop feedback system for balance, but it was certainly good enough for a slow weight shift onto a single leg, and the feet are wide enough that no dynamic balancing was needed. It would be a fun programming environment for a Python course, as long as the students were into humanoid robotics.

The Choregraphe environment provides a reasonable interface for fairly simple sequencing and synchronization, but I suspect one hits its limits pretty soon. Being able to create new blocks rapidly, by copying existing blocks and editing the Python in them, makes the system a lot more powerful, though I got the impression that the “Dancing Robots” courses rarely get that far.

The Nao robots were, as I expected, a lot of fun, but I couldn’t really recommend them for a beginning programming class. At $9000 for each pair of students, they are way too expensive and way too fragile.  For beginning programmers, you really want things that students don’t have to worry about breaking if they make a little mistake. One can get fun robot toys for students to program for $100 each, since wheeled robots are much cheaper and easier to make than humanoid ones. Not only is the financial risk lower with cheap robots, but they can be made much more robust (though educational robots needn’t be made to the sturdiness required of combat bots).

To break into the consumer market or school market, I think that Nao would have to come down in price by about a factor of 10, which would be difficult to manage, given what the robots contain.  There are 25 actuators (2 in the head, 6 in each arm, 5 in each leg, plus a shared hip joint), plus cameras, an ultrasonic range finder, touch sensors, foot pressure sensors (4 per foot), a gyroscope, an accelerometer, LEDs, … . The motors and 12-bit joint angle sensors (magnetic rotary encoders) alone probably cost close to $1000.


2013 June 30

Robots in physics

Filed under: Robotics — gasstationwithoutpumps @ 13:34

I just watched Matt Greenwolfe’s Global Physics Department presentation showing his use of Scribbler 2 robots as physics education tools.  See also Matt Greenwolfe’s blog, starting with The Robot Lab (formerly known as the Buggy Lab). It may be easier to get the content from Matt’s blog posts than from the Global Physics Department recording, which had a lot of technical difficulties.

Scribbler 2 robot. Picture copied from the Parallax web site that sells the robot.

He modified the Scribbler 2 code to provide precise control of the robots (velocity accurate to about 1 mm/s and acceleration to about 1 mm/s²) and made some nice graphical interfaces for students to control the robots with x vs. t, v vs. t, and a vs. t plots (plus one interface that does 2D motion).

One big problem with the Scribbler 2 was the limitation to about 18.5 cm/s velocity, which is pretty slow. The cool thing about them is that they have wheel encoders that allow 0.491 mm resolution with 507.4 counts per revolution. One limitation that is a complete deal killer for me is that the Scribbler 2 library is only available for Windows machines, so porting to a Mac platform would be a major effort.

I was looking to see whether one could easily build such a robot from easily available parts. One cool new part is an integrated wheel, motor, controller, and shaft encoder called the HUB-ee (available for $35 from SparkFun):

The HUB-ee is a type of robot servo but designed for wheels, in fact it is a wheel, but it is also a motor, a sensor and a motor controller. What’s that? Did we just blow your mind?
When you want to add wheels to your robot you would normally start with a whole collection of parts: The motor and gearbox, a motor driver board, and maybe some sensors for measuring wheel speed and a controller to count revolutions or provide closed loop speed control. Well, the folks over at Creative Robotics thought it would be handy if you could just buy a wheel that had all of those things built in, so they designed HUB-ee – just bolt it onto a chassis, apply power and away you go!
The HUB-ee is easy to mount, too! There are two threaded inserts for M3 bolts built in, there’s also a right angle bracket included for situations when you can’t go horizontal into the chassis. The mounting holes are even LEGO® lug compatible!! HUB-ee uses Micro-MaTch connectors to keep electrical connections tight and easily changed, check out the related items for mating connectors.

HUB-ee wheel picture copied from the Sparkfun web page, which says that images are licensed CC BY-NC-SA 3.0.

The HUB-ee has a resolution of 128 counts per revolution of the wheel (1.473 mm resolution, 3× the step size of the Scribbler 2). The HUB-ee runs at 120 RPM no-load at 7 V, which would be 37.7 cm/s, about twice the speed Matt reports for the Scribbler 2.
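These figures are easy to cross-check from the published specs: counts per revolution give the distance resolution, and no-load RPM gives the top speed. The 60 mm HUB-ee wheel diameter used below is implied by the numbers above (1.473 mm × 128 ≈ 188.5 mm circumference):

```python
import math

# Cross-checking the encoder-resolution and top-speed numbers from the
# wheel circumference, counts per revolution, and no-load RPM.

def resolution_mm(circumference_mm, counts_per_rev):
    """Distance traveled per encoder count."""
    return circumference_mm / counts_per_rev

def top_speed_cm_s(circumference_mm, rpm):
    """No-load ground speed in cm/s for a given wheel RPM."""
    return circumference_mm * rpm / 60.0 / 10.0

# Scribbler 2: 507.4 counts/rev at 0.491 mm/count implies ~249 mm circumference.
scribbler_circ = 507.4 * 0.491
# HUB-ee: 60 mm diameter wheel (implied), 128 counts/rev, 120 RPM no-load.
hubee_circ = math.pi * 60.0

hubee_res = resolution_mm(hubee_circ, 128)          # ~1.473 mm per count
hubee_speed = top_speed_cm_s(hubee_circ, 120)       # ~37.7 cm/s
scribbler_speed = top_speed_cm_s(scribbler_circ, 80)  # ~33.2 cm/s at spec'd 80 RPM
```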

Although Matt reports 18.5 cm/s, the Parallax spec for the Scribbler 2 claims up to 80 RPM, which would be 33.2 cm/s, but that is probably with a 12 V power source rather than 6 AA batteries. I suspect that Matt’s need for very precise control and operation on batteries limited the top speed he could use.  He does say that he would have liked a voltage regulator (which would have added a $3–15 part cost to the robot, so a $7–40 increase in retail cost, based on designs from TI’s WEBENCH tool or the PTN78000W module from TI) in order to have better speed control without having to worry so much about keeping the batteries fully charged.

The HUB-ee takes up several pins on a microcontroller (1 PWM pin, 2 output pins to control direction, 2 input pins for the quadrature encoder feedback) in addition to power and ground.  Two HUB-ee wheels would cost $70 and use 10 pins on a microcontroller—doable with an Arduino, but not leaving a lot of pins for other functions like sensor inputs.  There aren’t enough interrupt pins on the standard Arduinos to have all 4 wheel encoder pins triggering interrupts (which would be the highest-precision way to use the feedback information to get precise motor control).

Internally, the HUB-ee wheels use a Toshiba TB6593FNG motor controller, an H-bridge designed to work with 1.0 A average current, with an on-resistance of about 0.35 Ω for the low side.  The Toshiba data sheet doesn’t give the high-side on-resistance directly, but if I’m interpreting their “Vsat” parameter correctly, the on-resistance for each leg is about 0.5 Ω, about a sixth that of the popular L293D H-bridges.  At under $3 an H-bridge (in single units), the TB6593FNG does not look like a bad choice for a small H-bridge.

Of course, to use the HUB-ee, one would have to build the rest of the robot (chassis, microcontroller, battery, … ). The HUB-ee is designed to mount to Lego beams, which could make chassis building easy, at least for prototyping.

I wonder whether, in a couple of years, we’ll be seeing integrated wheel units like the HUB-ee with an SPI interface, with registers that say how many steps to make, with specified velocity and acceleration curves.  That would provide very simple interfacing with fewer wires and could allow much tighter servo loops, at the price of putting a microcontroller at each wheel (probably adding $10 to the retail price of the wheels).
