We Should Not Ban ‘Killer Robots’


During a Moral class in Form 4, the teacher threw this hypothetical question to the class: "Should the world ban military weapon development?" The whole class of 16-year-olds raised their hands in support; when the vote for "no" came along, I was the sole supporter.

I enjoy fucking around like this.

Mr. Guna, the teacher, ever the supportive and progressive person (I wish I could meet him again somehow), invited me to explain why. I wasn't articulate enough to say "guns don't kill people; people kill people", but I cited examples of modern advances that stemmed from scientific development done for military reasons.

Look no further than the internet itself. Mr. Guna was quite impressed with the reasoning and supported my view.

From time to time, an opinion comes along that is too obvious to be of much quality. Here's one.

A while back, an open letter presented at the International Joint Conference on Artificial Intelligence in Buenos Aires called for a “ban on offensive autonomous weapons.” A bunch of people signed it, including “more than 1,000 experts and leading robotics researchers.”

Of course they’d sign it, because who would seriously be for “killer robots”?

Well...

Fact is, no letter, UN declaration, or even a formal ban ratified by multiple nations is going to prevent people from being able to build autonomous, weaponized robots.

The better way forward, the author argues, is better Rules of Engagement that minimize human deaths.

On the surface, such a ban sounds like it's simply saying "robots shouldn't kill humans." But a deeper reading tells a different story: it's really about the potential value of armed robots.

That debate should be given a chance to happen.

Link