Google's online video offering YouTube may not seem the obvious marketing showcase for highly sensitive military robotics, but there's an underground technological cult brewing when it comes to the art of making machines fight machines. More than five years after the doctrine of network-centric warfare emerged from the heavily shielded halls of the
Pentagon's Defense Advanced Research Projects Agency, cybernetic warriors are taking their first tentative steps to replace soldiers on the battleground.
Over the past year, a procession of bizarre-looking beasties has emerged into the light of day from developers such as Boston Dynamics, which has produced a petrol-powered robotic quadruped named Big Dog that traverses rubble and forest floors, and can even skitter precariously across ice.
One corporate video promoting the dexterity of the device, which is akin to an autonomous pack-mule, shows it receiving solid kung-fu-style kicks to the body from an unkind master, only to stumble for a few steps before it self-corrects and proceeds on its way.
But there is a deadly intent behind such seemingly comical contraptions, not least because they may soon allow those who possess them to gain a substantial tactical advantage over their adversaries in real world conflicts.
In the world of fighting machines, combatants that can topple opponents without putting their own troops in mortal danger are the most likely victors in either sporadic or sustained conflicts.
Known as "Automation of the Battlespace", military robotics is an area of technological development into which Australia has quietly invested millions of research dollars as the Defence Science and Technology Organisation seeks to stay one step ahead of its adversaries.
Australia has committed about $1.5 billion to various unmanned aerial vehicle projects in an effort to bolster capability around surveillance and weapons delivery systems, most of which will be delivered over the next ten years.
But while big-ticket machines such as Northrop Grumman's jet-powered Global Hawk have taken most of the immediate headlines, DSTO has been quietly working away on a series of smaller unmanned-vehicle projects that are as much about investigating the economics of putting robots onto the battlefield as about developing their fighting prowess.
"You have to be able to afford them," says the head of DSTO's Automation of the Battlespace initiative, Anthony Finn. "There is no free lunch here. It's an exploration of the cost [to] capability equation of small [automated vehicles] versus big."
An advantage of smaller, unmanned vehicles or devices, and particularly unmanned aerial vehicles, is that their lower price means they can be expendable, thereby increasing their value in difficult or highly hostile situations.
But the cheaper, smaller machines carry less equipment or weaponry than their bigger counterparts, meaning they must be ganged together into swarms - a job that requires substantial hardware and software integration.
"The reason you don't have robots roaming the streets is integration," Finn says, adding that a key reason military unmanned aerial vehicles have proved successful in the real world is they are operated in a relatively "uncluttered" environment - open space - compared to land vehicles.
Finn politely declines to say how many UAVs are in service, given their operational standing.
UAVs have found a clear niche in providing land-based troops with short-range imagery and sensory data of what lies ahead - a service particularly useful for navigating through awkward topography. Airborne sensors can detect and pinpoint the flash of weapons fire or heat from vehicle exhausts.
Other projects include the Wayamba (an Aboriginal word for turtle), an unmanned underwater vehicle that is being developed for rapid environmental assessment and to clear notoriously hard-to-detect sea mines.
But valuable as these machines are, they're a far cry from the self-sufficient fighting machines portrayed in movies such as Terminator 3.
Debate also still rages over whether the level of autonomy portrayed in science fiction is desirable in the real world.
Finn is adamant that the limitations of military robots need to be carefully taken into account. He is wary of the scenario in which, one day, all commanders have to do is push a button and let their machines do their fighting for them.
"Realistically, that is so far out. It's philosophical, but war is a human endeavour," he says.
Another issue in letting robots control themselves is the distinct ethical, legal and philosophical considerations that arise when the decision to apply lethal force is made.
One area under debate is who takes responsibility when an error, illegal action or war crime takes place, particularly if it is the result of equipment or software malfunction.
Who would front a tribunal or an international court - the soldiers using the device, their commanders, the equipment manufacturer or the software programmers?
Given some artificial intelligence and allowed off the leash, robots can also behave in ways that surprise their human masters.
Finn offers a real scenario in which an unmanned land vehicle was programmed to find the safest and fastest route between two points across difficult terrain that included a dam full of water, using data delivered to it from a UAV that had conducted a topographical scan with a laser.
Instead of going cross-country as anticipated, the machine drove in the opposite direction onto a bitumen road and then took the longer but ultimately faster and safer journey around the proving ground rather than through it.
"Sometimes, they can be smarter than we think," Finn says. "My personal opinion is that automation will be introduced to make things easier and safer."
Fairfax Business Media