This is an edited extract from an episode of BBC Radio 4's Material World, which you can hear in full on the BBC website.
Transcript of the audio
M: We’ve got a meter.
M: Measure over that distance. Okay, ready.
M: One, two, three, go.
Interviewer: During the week, the 100 or so 21-to-69-year-olds on the course will get hands-on experience of five areas where technology is changing and arguably improving our world: monitoring water quality, energy conversion, bridge building, waste management and the area we’re going to focus on, robotics.
M: Do you want to do it again?
M: Yeah, go on, we’ll do an average. Four seconds that time.
John Rosewell: Well I’m John Rosewell, I’m a lecturer at the Open University, I lecture in the Department of Information & Communication Technologies, and I’m running this robotics activity here at summer school.
Interviewer: And what’s the thinking behind this particular bit of the course?
John Rosewell: Well it gives students an opportunity to do quite a difficult task: get some hands-on experience of robotics, a bit of programming, a bit of sensors, a bit of mechanics, a whole range of different experiences. That usually generates a fair bit of a buzz as well, so they come away enjoying it.
Interviewer: I’ve got one of the handouts in front of me here. There’s a bit about a mobility subsystem, a control subsystem, a sensory subsystem, a power subsystem, mechanical assembly. It sounds, when you read it like that, quite a daunting task to be giving to people when they’re just getting the hang of what robotics is.
John Rosewell: It is in a way, and their challenge is that in 24 hours they’ve got to build a robot that works like a rescue robot: it goes into a model of a collapsed building, finds a victim and comes out with it. That sounds quite difficult, but actually…
Interviewer: It does. It sounds like the sort of thing an industrial company might spend 20 years working on.
John Rosewell: We do it in 24 hours, yeah. Well we do make it easy for them because we’re working with a robotics prototyping system which is actually Lego, which is quite an easy way of putting stuff together to test things out. We’ve got our own software here which makes the programming quite easy for them as well, and we set it out as a series of little activities that they go through one after the other, which builds up towards that final solution if they can make it.
F: So we do it on three.
M: On three again.
F: Okay, so one, two, three. No, the sensor’s not going to pick it up because the remote control sensor’s at the front.
M: Of course, yes, okay. We can’t do it backwards.
John Rosewell: Well essentially once that’s programmed you want it to be autonomous, so the sensors feed back into here and that’s going to make the decisions about what to do. It’s no good coming back to the computer. And one of the reasons for that is that the connection between here is just a little infrared link, you know, like a remote control on the telly, so it could easily get blocked off as somebody leans across it or if it’s pointing the wrong way, so it’s not a reliable connection at all.
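John’s point about autonomy — the sensors feed into the brick and the brick itself decides what to do, with no live link back to the PC — can be sketched as a simple sense-decide-act loop. This is a hypothetical Python simulation, not the Open University’s own software or a real Lego API; the threshold and the readings are invented.

```python
# A minimal sense-decide-act loop: once downloaded, the "brick" runs
# autonomously, reading its own sensors and choosing an action each tick.
# Sensor values and the light threshold are invented for illustration.

def decide(light, touch):
    """Pick an action from the current sensor readings."""
    if touch:
        return "stop"     # bumped into an obstacle
    if light < 40:
        return "turn"     # dark reading, e.g. over a black line
    return "forward"

def run(readings):
    """Run the control loop over a sequence of (light, touch) readings."""
    actions = []
    for light, touch in readings:
        action = decide(light, touch)
        actions.append(action)
        if action == "stop":
            break
    return actions

# Example: drive forward, cross a dark patch, then hit an obstacle.
print(run([(60, False), (30, False), (60, True)]))   # → ['forward', 'turn', 'stop']
```

The point of the structure is exactly what John says: every decision is taken from local sensor readings, so a flaky infrared link to the PC never matters once the program is on board.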
M: But we can do it backwards, as long as you catch it.
F: Right. I’ll catch it.
M: We can send it this way, yeah, okay.
F: Okay, are you ready?
F: So on three: one, two, three. Stop. How many?
M: Just over four.
Interviewer: They’re building robots, but are they actually also learning to program them?
John Rosewell: Yes they will, because the robot is going to be an autonomous robot, so you’ve got to have a program in it that does all the different things. So they’ll have to develop their own programs, which we’ll do on the PC and then download.
Interviewer: Do you ever find that they come up with something that you’ve never thought of doing?
John Rosewell: They certainly do, usually with a great deal of confidence which sometimes works and sometimes there’s a great crash when the robot falls on the floor and it didn’t quite stop when it got to the edge of the table, yeah.
M: Right, whenever you start to press the button, right, I will stop whenever the robot actually stops, so that you’re checking for that two-second delay. Right? So you tell me when you’re going to press the button.
Interviewer: Now when you say they’re building a rescue robot to get a person from a tunnel, we need to make it clear that it’s not a real person trapped in a real tunnel.
John Rosewell: Yeah, maybe I should just go and get it. This actually is our victim. He’s called Ted the LED; he’s a little teddy bear who’s got some bright LEDs on his tummy.
Interviewer: Ah, hence LED, right okay.
John Rosewell: And so he will shine up quite nicely inside the tunnel. The robot that they built has got light sensors on it, and so they can drive it down and look for a little peak of light which shows where the victim is.
Interviewer: So this is kind of necessity is the mother of invention, in this case obviously the bear necessity is the mother of invention.
John Rosewell: That’s true, yes, I mean you’ve got to go in there and find the bear, yeah.
John Rosewell: Okay folks, I’m going to interrupt you once more I’m afraid because we’ve got another bit of stuff to get through. We’re going to move on now and start looking at more features of the robot in order to improve things. So in your little pots you will find a couple of sensors. Using the light sensor you’ll be able to detect the difference between black and white, and this one is a touch sensor: there’s a big wheel on the front with an axle poking out the back, and the point of that is simply that if it bumps against any obstacle it’ll press back and touch that tiny little on/off switch at the back.
M: Right, here we go. [laughs] Got too much power, hasn’t it?
M: It’s too slow, yeah.
M: Ahh. Yeah.
F: So how about if we do that, how about if we do that and then when it touches we get it…
John Rosewell: As soon as you touch it, it stops, but then it goes around again and you’re no longer touching it, so it sets off again. It’s lost its foot as well.
M: Yeah, try it now, it should just stop.
M: Yeah, that’s right.
M: I’ve just put the on up there.
F: So we’ve turned it on…
M: And then we’ve only...
F: And then it tells him.
John Rosewell: Oh, you only do that, right, yes. Okay guys, if I can have your attention please, I think we’re ready to move on to do something a little bit different. So your next task is going to be to see if you can write a program that will make your robot follow the line. Maggie’s standing there by the test track on top there, there’s an oval with a black line on it, and what I’d really like to see are some robots that will trundle round the outside of that track, okay? So using the light sensor pointing downwards you’re going to be able to detect the difference between black and white, so your task is to figure out a strategy that will make the robot go around and stay roughly on the line. There’s lots of different ways of doing it, so it’s worthwhile just taking a little bit of time to discuss how you’d go about the task.
Interviewer: Can you just talk us through what it’s doing and why it’s not quite doing what it needs to?
M: We think we need to reduce the time on the motor for turning, it’s driving too far forward. It’s supposed to stop the left wheel as well but it’s not doing that either.
Interviewer: Right, so by the time it notices it’s in the wrong position, it’s gone so far the sensor’s not where it needs to be?
F: We had some problems this morning and it didn’t stop at the black line and then go back and so we kind of had to get back up to speed again, and now we’re trying to make it hit the black line, turn away from it, at the next obstacle turn away from it and basically just work within a boundary.
M: If it’s in there, it’ll start turning the other way, so it’ll sort of bounce back and forth along the line, I think.
Interviewer: Can I just briefly derail your train of thought and ask what stage you’re at and what it is you’re trying to get it to do?
M: Yeah, we’re trying to get it to follow the black line on the floor. The idea is that if it’s on the line, it keeps turning to the left until it falls off. As soon as it’s off the line it turns to the right to come on again, so it doesn’t follow the middle of the line, it follows an edge of the line.
Interviewer: Right, so it’s using the edge to kind of autocorrect itself?
M: Yeah, that’s right.
Interviewer: I suppose it’s a combination of logic and trial and error.
M: Very much so, yes. At the moment it’s not doing it at all, it’s just ignoring the thing. It won’t even stop now.
M: We can only send it round one way. If it went over it, it’d get to the other side.
John Rosewell: So if you had a figure of eight, what would it do?
M: It would go round one side and then it’d go straight off the other side. Because we’ve only got one sensor, with one light sensor it can’t tell the difference between left and right of the line, so you can only run it to turn one way.
Interviewer: In other words, it’s perfect for the task it’s just been given but if you ask it to do anything slightly more complicated, you may have to go back to the beginning and start again.
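The edge-following strategy the student describes — turn left while on the line, turn right as soon as you fall off, so the robot hugs one edge rather than the middle — can be sketched against a toy one-dimensional world. The positions and line width here are invented; this is an illustration of the strategy, not the students’ actual program.

```python
# A toy 1-D model of single-sensor edge following. With one downward
# light sensor the robot can't tell left of the line from right, so it
# hugs one edge: steer left while reading black, right while reading white.
# The "world" is just a number line; positions 10-14 read as black.

LINE = range(10, 15)   # positions that read "black" in this toy world

def on_line(pos):
    return pos in LINE

def step(pos):
    """One control step: steer left (down) on black, right (up) on white."""
    return pos - 1 if on_line(pos) else pos + 1

def follow(pos, steps):
    track = [pos]
    for _ in range(steps):
        pos = step(pos)
        track.append(pos)
    return track

# Starting on the line, the robot settles into oscillating across the
# left-hand edge — exactly the bounce-along-the-edge behaviour described.
print(follow(12, 6))   # → [12, 11, 10, 9, 10, 9, 10]
```

It also shows why the figure-of-eight fails: the controller only ever corrects in one direction, so it can only go round the track one way.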
John Rosewell: Okay folks, can I have your attention again? I’ve got a last bit of stuff to tell you and then I’ll tell you about the final challenge. I’m afraid we’re beginning to run out of time so don’t worry if you don’t get to do the final challenge but you can have a go. The challenge is quite difficult and for everyone to do it is probably asking a bit much, but we’ll see. The final bit of stuff I need to tell you about is how to deal with a rather more complicated situation when you’re trying to log data and drive the robot at the same time. That means mixing up motor commands with log commands and it all gets a bit hairy. So there is a way of making life a little bit easier, which is to make the computer which is inside the brick do two different things at the same time. You’re probably used to having a Windows PC where you can play music in the background, surf the web and whatever, so what we need to do is simply make our brick do that.
Okay, that will give us a good intro to what our task is. Remember our challenge is to find a casualty inside the tunnels, alright; you can’t see inside them, but there is a network of tunnels under this box in which a casualty is hidden, and your task is to go and find the casualty. The practice one over there is just the same, so you can go and have a look to see how that’s arranged; basically it’s a straight run through with side tunnels, and there’s a wall at the end. So what you need to do is make a robot that will drive down to the end, hit the wall, reverse, come back again and, while it does that, keep looking down the side tunnels until it spots a casualty. And here’s the casualty, this is Ted the LED, and rather nicely he’s got a couple of bright lights on his tummy. These can be picked up by the light sensors that you’ve got, and really those are light sensors but they’re also infrared sensors, and this is actually very similar to the way a rescue robot might work: it may be able to detect infrared signals which relate to temperature, so a casualty may well show up as a brighter, warmer signal against the background. So this is just the same sort of thing that a real rescue robot might be doing.
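John’s earlier point about the brick doing two things at once — driving the motors while logging sensor data, like a PC playing music while you browse — can be sketched with two threads. This is a hypothetical Python simulation of the structure only; the drive and log "tasks" stand in for hardware calls that are not described in the programme.

```python
# Two concurrent tasks, as on the brick: one drives (here, ticks a
# position counter forward), the other logs a "sensor reading" at each
# sample. All hardware interaction is simulated.

import threading
import time

position = 0
log = []
running = True

def drive_task():
    """Stand-in for 'motors on, move forward five steps, motors off'."""
    global position, running
    for _ in range(5):
        position += 1
        time.sleep(0.01)
    running = False

def log_task():
    """Stand-in for 'sample the light sensor and store the reading'."""
    while running:
        log.append(position)
        time.sleep(0.005)

driver = threading.Thread(target=drive_task)
logger = threading.Thread(target=log_task)
driver.start(); logger.start()
driver.join(); logger.join()

print(position, len(log))   # drove 5 steps while logging samples
```

Separating the two tasks is what keeps the program from getting "a bit hairy": neither loop has to interleave motor commands with log commands by hand.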
Interviewer: So John, they now realise for the first time the full scope of the daunting task that faces them. So if I’ve got this right, at the moment, as the robot moves along its line, it will be looking to see where Ted the LED is, whereas the stage you might get to on some occasions would be to actually turn off towards him and effectively bump into him, as if it were able to pull Ted out.
John Rosewell: That’s right; in fact you can do this in three phases. The first phase is a reconnaissance: you just go in and out and you look, so we’ve given them two light sensors that can look to the left and to the right as the robot goes down. So they should be able to tell me how far in the casualty is and whether it’s to the left or to the right. Then they could, maybe using a different robot, send one in that would go in the same distance and then turn the right direction and go up to actually find the casualty. Then, if they were really good, they could build a claw or something on the front to actually grab the patient and drag it out. Well, that’s a bit difficult and I don’t think we’re going to get that at the end of this session.
Interviewer: And also there is a slight element there of being somewhat impractical, in that that would only work for stuffed patients, wouldn’t it? I mean in real life you wouldn’t want the trapped victim to be dragged by a claw in case it broke their arm.
John Rosewell: Well there’s some research going on at the moment in the States, funded by the American army I think, to develop battlefield robots which are humanoid which are capable of going into a danger situation and picking up a casualty and bringing them out.
Interviewer: Blimey. So again, anything you can think of will have an application somewhere down the line.
John Rosewell: Anything we can think of probably somebody else has thought of already.
Interviewer: Okay, final question now; you’ve given them less than an hour to do this final phase, where would you expect them to be, where would you hope they might be?
John Rosewell: I’d have hoped that most of them would get to do the reconnaissance mission. I don’t think today we’ve got a chance of doing the more elaborate things. Some of them might not get to the end, but enough of them will and we’ll be able to share what we’ve done, the different teams, so that they can show each other what the final stage ought to look like.
Interviewer: Let’s see. Update time, where have you got to?
M: We’re just trying to get it through the tunnel to the end, to stop for a second and then go back without falling off the end.
Interviewer: And from what I just saw, you’d managed most of that apart from the stopping at the end bit.
M: It stopped but then it wouldn’t go again, so we need to just have a look why, see if we can work it out and try it again.
Interviewer: So do you know why it’s not starting again and not quite stopping as fast as you want it to?
M: Not at the moment. We’re going to have a look, just test the pressure sensor on it and see if we can work it out from there.
Interviewer: Right, we’ll come back to you, we hope.
M: Yeah, it’s driving the wheels at a different speed even though they’re connected now.
Interviewer: It’s stopping in the right place.
M: We’ve got it on the rotation sensor so it counts from zero up to whatever number and then it’ll stop when it gets back to zero again because it’s reversing the same distance.
Interviewer: Oh I see, so it knows where it is or it knows where the count is.
M: Yeah, it counts.
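The rotation-sensor trick the student describes — count up on the way in, then reverse until the count is back to zero, so the robot retraces exactly the same distance — can be sketched as follows. The counts here are simulated, not read from real hardware.

```python
# Simulated rotation-sensor odometry: the count rises as the wheel turns
# forward and falls as it reverses; stopping at zero guarantees the
# return leg matches the outward leg exactly.

def out_and_back(ticks_out):
    count = 0
    trace = []
    # Drive forward: the rotation sensor counts up.
    for _ in range(ticks_out):
        count += 1
        trace.append(count)
    # Reverse: the count falls; stop when it reaches zero again.
    while count > 0:
        count -= 1
        trace.append(count)
    return trace

print(out_and_back(3))   # → [1, 2, 3, 2, 1, 0]
```

The appeal of the scheme is that it needs no map and no timing: however far the robot happened to drive before hitting the wall, reversing to a zero count brings it back to the start.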
Interviewer: So we’re plotting the graph of light against distance, so we can see…
M: So we can predict how far along the tunnel the bright light’s going to be or not.
Interviewer: Oh right, so it’s almost, it’s literally the light at the end of the tunnel is the light at the end of the tunnel. Or at the side of the tunnel, as the case may be.
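The reconnaissance analysis the students are doing — plot light against distance and look for the peak that marks the casualty — can be sketched as a small peak finder over the two side-sensor logs. The readings and threshold below are invented sample data, not the students’ actual measurements.

```python
# Locate the casualty from two logged light traces (left and right side
# sensors), sampled at equal distance steps down the tunnel. The casualty
# shows up as a bright peak above the ambient level.

def find_casualty(left_log, right_log, threshold=70):
    """Return (distance_index, side) of the brightest reading above
    threshold, or None if nothing bright enough was seen."""
    best = None
    for i, (l, r) in enumerate(zip(left_log, right_log)):
        for side, value in (("left", l), ("right", r)):
            if value > threshold and (best is None or value > best[2]):
                best = (i, side, value)
    return (best[0], best[1]) if best else None

# Ambient light everywhere except a bright spike on the left, 3 steps in.
left_trace  = [20, 22, 21, 95, 24, 20]
right_trace = [21, 20, 22, 23, 21, 20]
print(find_casualty(left_trace, right_trace))   # → (3, 'left')
```

This is exactly the output the teams report back: roughly how far in, and which side — "not very far, middle-ish, a long way".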
Interviewer: Okay there’s a whole queue of people here now, again I’m assessing the level of cockiness and confidence before you go in. How are you feeling?
M: Not much.
Interviewer: Not much?
M: Fingers crossed, did alright on the test one. It’s a bit more worrying when you can’t see all the way through.
F: It’s coming back.
Interviewer: Oh, yours actually stops before the edge, that’s bonus points for that.
M: We put that in, yeah.
Interviewer: I heard a great peal of laughter from over here as you examined your data, what did it reveal?
M: It’s right there.
Interviewer: And where is there?
M: Right hand in the middle.
Interviewer: Right hand middle, okay.
M: No, because that’s where it’s coming back.
M: Oh no, no, no, no, no, no.
M: That’s where it’s coming back.
M: So it’s at the end.
M: That’s right, yeah, it’s right at the end.
M: It’s right at the end.
M: That’s what I said. At the end of the middle.
M: At the end of the middle.
F: It’s at the end before it turns back.
Interviewer: Right at the end?
M: Yeah, on the right hand side.
Interviewer: So as the robot’s going down the tunnel, it’s on the right hand side?
M: That’s just a theory, you’re not going to influence anybody else.
Interviewer: Do you have a hypothesis?
Interviewer: Are you about to confirm it?
M: On the left side.
Interviewer: And how far?
M: Not very far.
Interviewer: Not very far, okay that’s fine, that’s all we want, those are the technical terms, not very far, middle-ish, a long way. Fine.
Interviewer: So we’re now seeing with a video camera attached and we’re looking for the tell-tale dots of light. There, there’s some light.
F: Spot a teddy.
Interviewer: Well done.
John Rosewell: I’ve seen quite a few people have a go at this so I think we’ve managed a bit of success. Let’s just have a quick whizz around and see who managed to actually identify where Teddy is. So let’s go around group by group, did you?
M: We didn’t.
John Rosewell: You didn’t manage, you did?
John Rosewell: Very good, next lot? No, not quite. Yes. Yes. Yes. Yes. So whereabouts was he? In that box, yeah, that’s a good answer, isn’t it? Left or right?
John Rosewell: First?
John Rosewell: Second?
John Rosewell: Third?
John Rosewell: So we’re all agreed he was on the left in the first tunnel.
John Rosewell: That’s really good; I think you’ve done really well. You’ve made a reconnaissance robot that’s gone in, done a rescue mission and located where the casualty is. So congratulations everyone, you’ve found Teddy, which was the goal of the challenge, so well done everyone.
Interviewer: John, were you happy with that in the end? I mean most of them managed not only a rescue attempt but managed to identify correctly where Teddy was.
John Rosewell: Yeah, that’s right. I think most of them managed to do it. It got a bit rushed at the end but most of them succeeded with that challenge, which is pretty good going.
Interviewer: It is impressive. If you said to somebody, okay, how long do you think it would take to go from knowing absolutely nothing about what a robot is to building a robot and programming it so it’s able to go into a dark enclosed environment, autonomously detect something else in there and tell you its position, I’m guessing they’re not going to say six or seven hours, which is what they’ve effectively had.
John Rosewell: Yeah, I think you’re probably right. But they can do it, so yeah, I’m very pleased with them.
Interviewer: Thanks to John Rosewell and to the students and their new Teddy rescuing robot chums on the Open University’s Technology in Action course, here at Bath University.