What are you delivering, Dave?
A rather large retailer announced this week its intention to begin a programme with the UK to explore how package delivery trials with UAS could develop in the future.
The arrangement will see the company, according to the release, ‘explore three key innovations: beyond line of sight operations in rural and suburban areas, testing sensor performance to make sure the drones can identify and avoid obstacles, and flights where one person operates multiple highly-automated drones.’
It goes without saying that one operator will have to control, or more likely and accurately command, multiple platforms that will also have to be able to think on their own metaphorical feet.
Swarms of craft, not quite in the literal sense of the word but you get the picture, will at some point in the not-too-distant future be in a position to drop the latest bookstore fluff at your doorstep. Not a hardback edition of War and Peace, though; that’s far too heavy going.
Or deliver vital supplies or aid for medical emergencies, as has been trialled elsewhere.
The UK’s CAA is also part of the programme, and the country’s amenable UAS regulations are likely to have played a part in bringing all the parties together.
All of which means that, in order to make the future very much part of the present, with all the convenience, coolness and capability it promises, very clever people are going to have to write very long lines of code to put the structures in place and allow us to exploit this technology for all its vast worth.
Away from the commercial sector, the autonomisation of military applications is on the kind of upward curve that threatens to break the X and Y axes. How much these platforms should be allowed to do for themselves is up for discussion.
Meanwhile, one of the most pressing arguments for restricting autonomy in military applications is the rather dystopian perception of killer drones choosing to dump their cargo on innocent heads.
While it is certainly true that the notion of a human hand not having the final say sets alarm bells ringing, it should be noted that our species has proved perfectly capable of making morally ambiguous decisions with fatal consequences.
Indeed, the concept of autonomy has become something of a dirty word for manufacturers and their military customers who are now at pains to emphasise precisely where the meaty link fits in the unmanned food chain.
In a pure capability sense, if you have a stealthy platform able to infiltrate an otherwise denied area, asking it to stick its head above the parapet to request permission to engage risks giving the game away. Of course, demonstrate to this writer a computer program capable of anything more complex than saying ‘No, Podd can’t do that’ and the nerves do start jangling, bringing up memories of the worst that the HALs and Joshuas could do.
Flippancy aside, there are of course incredibly complex legal implications in computerising conflict, commercial delivery and all the other uses that autonomy will bring. But rest assured that the Rubicon will have to be crossed at some point, and it is far better to be prepared for that eventuality than not.