By Ceri Perkins | A village in south-west England will shortly be swarming with robots competing to show off their surveillance skills. The event is the UK Ministry of Defence’s (MoD) answer to the US DARPA Grand Challenge that set robotic cars against one another to encourage advances in autonomous vehicles.
The MoD Grand Challenge is instead designed to boost development of teams of small robots able to scout out hidden dangers in hostile urban areas.
Over 10 days in August, 11 teams of robots will compete to locate and identify four different threats hidden around a mock East German village used for urban warfare training, at Copehill Down, Wiltshire (see image, top right).
The robots must find snipers, armed vehicles, armed foot soldiers, and improvised explosive devices hidden around the village, and relay a real-time picture of what is happening back to a command post.
The robots will need to negotiate the complexity of an urban environment to find the threats. Hazards include unfamiliar terrain and buildings, trees, near-invisible overhead wires and other urban clutter.
Teams will earn points based on how many threats they locate in one hour, and on how autonomously they operate. A team will lose points, for example, if it uses remote control to direct its vehicles at any stage of the trial.
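The scoring rule amounts to a simple trade-off, which can be sketched as a toy calculation. The point values and penalty weighting below are hypothetical: the MoD has not published exact weightings, only that threats found within the hour earn points and remote-control interventions cost them.

```python
# Toy sketch of the Grand Challenge scoring trade-off described above.
# The specific numbers (10 points per threat, 5-point penalty) are
# invented for illustration; only the structure reflects the article.

def score(threats_found, remote_interventions,
          points_per_threat=10, penalty_per_intervention=5):
    """Return a team's score for one trial."""
    return (threats_found * points_per_threat
            - remote_interventions * penalty_per_intervention)

# A team that finds all four threats fully autonomously outscores
# one that finds them all but needed two manual take-overs.
autonomous = score(4, 0)  # 40
assisted = score(4, 2)    # 30
```

Under any such weighting, full autonomy is the tie-breaker between teams that find the same number of threats.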
The highest-scoring teams will be rewarded with the prospect of a lucrative MoD contract; the ministry hopes to see the best ideas rapidly developed to the point where UK forces can deploy them in places such as Afghanistan and southern Iraq.
“We are in no doubt that this is a difficult challenge,” says Grand Challenge programme leader, Andy Wallace.
Of the 23 initial entries from teams made up of private companies and universities, 11 were selected to take part in the final, with six thought promising enough to receive MoD funding.
One funded team, the Stellar Consortium, uses two aerial robots and one ground-based one.
An unmanned air vehicle (UAV) with a 3-metre wingspan will fly 65 metres above the village, using cameras to gather wide-area surveillance imagery. Mission-planning software uses that imagery to direct a smaller, 1-metre UAV flying at 20 metres and an unmanned ground vehicle (UGV) (see image, middle right).
Those two vehicles use thermal, visual, and radar sensors to make more detailed observations that can be reported back to the base station.
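Stellar's tiered arrangement — a high-altitude spotter cueing lower-level vehicles in for a closer look — can be sketched as a simple dispatch loop. Every class, field, and function name below is invented for illustration; the article describes only the broad division of labour, not the software itself.

```python
# Hypothetical sketch of a tiered surveillance loop like the one the
# Stellar Consortium describes: the high-flying UAV reports candidate
# threats, and ground-station software tasks the low-level UAV or the
# ground vehicle to inspect each one. All names here are assumptions.

from dataclasses import dataclass

@dataclass
class Sighting:
    x: float               # metres east of the command post
    y: float               # metres north of the command post
    airborne_access: bool  # can the low-level UAV reach it safely?

def dispatch(sightings):
    """Assign each wide-area sighting to the vehicle best placed to confirm it."""
    tasks = []
    for s in sightings:
        vehicle = "low-level UAV" if s.airborne_access else "ground vehicle"
        tasks.append((vehicle, (s.x, s.y)))
    return tasks

tasks = dispatch([Sighting(120.0, 40.0, True),
                  Sighting(30.0, 85.0, False)])
# The open sighting goes to the small UAV; the one hemmed in by
# buildings or overhead wires goes to the ground vehicle.
```

The appeal of this design is that the expensive decision-making stays at the ground station, matching Richardson's description of the vehicles as launched by hand but thereafter under full software control.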
“Physically, the vehicles all have to be launched by someone,” explains Julia Richardson, Director of Stellar Research, “but after that, the mission-planning software hosted at the ground station takes full control.”
A team called Swarm Systems uses more robots. “We need to gather as much sensory information as possible,” says team leader Stephen Crampton, “so we’re using eight vehicles. And we’re going by air because it gives you more viewing angles.”
Dubbed “Owls”, their battery-powered, Frisbee-sized vehicles weigh under a kilogram and have four small propellers (see image, right). Able to hover and dart like birds, they are GPS-guided and communicate with one another, and the base station, using Wi-Fi. Each Owl carries a trio of 5 megapixel cameras.
“Without giving too much away, the processing power on board each of these vehicles is pretty impressive,” adds Crampton. “They could run full-blown Windows Vista.”
A third team, Silicon Valley, has opted to rely less heavily on autonomous vehicles. It has used off-the-shelf hardware wherever possible, concentrating its development effort instead on image-recognition and analysis software.
“If you can automate that part, then you have a useful tool,” explains team leader, Norman Gregory. “What we intend to do is deploy various platforms, depending on what the scenario is.”
The team will use a mixture of ground- and air-based vehicles, although it is not yet releasing exact details. The main ground vehicle is the size of a ride-on lawnmower (see image, bottom right) and can be GPS-guided or remotely directed by a human.