I love the forward thinking and radical approaches of Elon Musk. It's admirable that someone who doesn't need to do anything at all keeps pushing on so many different fronts. As an entrepreneur and computer scientist, I appreciate all that he has done. It's exceptional.
A while ago, Elon Musk called for a ban on autonomous killer robots. I agree, and I agree we have to keep pushing and voicing concerns about this threat. But I believe the need to project force at a distance is too strong an instinct for people, especially the "bad" people of the world, to adhere to such a ban. The problem is made harder by the ambiguity of what "robot" or "autonomous" really means.
As a computer scientist who has worked in artificial intelligence for the past 30 years, I've thought of many cool things that computers will eventually do for people. I'm an optimistic, positive person and tend to dwell on the good these technologies will bring. However, I'm mindful of the negative things the same technology can do as well. The world is, of course, full of people who like to do "negative" things for whatever reason (power, ego, lack of ego, insanity, etc.). And those doing bad things can even include good people, because sometimes the circumstances call for it, or at least appear to.
I spent two years in the Navy ROTC at Northwestern University and almost committed to becoming a naval officer. I opted out after two years because new computers like the IBM PC and the Macintosh were up-and-coming at the time, and I felt I would be left behind if I stayed in the Navy. What I did learn from my Navy time, my later work at Northrop Corp on the F-15 fighter jet, and subsequent observations is how militaries (good and bad) develop and use weapon systems. It boils down to one principle: projecting force at a distance is inherent to humans.
Human beings exist because we have the capability of building and using tools that help us adapt to the environment. These tools mainly protect us from other animals, entities, and people. Humans always want to be able to attack or defend against an opponent at a distance when possible. This started long ago, when our ancestors first picked up rocks and threw them at animals (or at each other). A rock is the simplest of tools, but compared to our bare hands at close range, it is formidable. Humans developed a great ability to throw many things accurately and at reasonable distances, saving us from hand-to-paw confrontations we'd probably lose.
This projection of force at a distance and with speed has been the driving motivation for most tool (weapon) advancement. Throwing rocks was followed by spears, swords of all kinds, bows and arrows, trebuchets, guns, cannons, ships, aircraft carriers, aircraft, missiles, ICBMs, aerial drones, rail guns, lasers, and other non-aerial drones (ship- and land-based). Each of these tools moved the human projecting the force further from the opponent, keeping the user safer, just like the rock did. We hardly even hear or think about the aerial drones used all over the world anymore. The drones are often operated by people in Washington, DC, while flying far away over Iraq or similar locations. Whether they are used for reconnaissance or bombing, they are projecting force. Aircraft carriers are a massive collection of tools used to project force without endangering people back in the USA.
Drones at this time are only partially controlled by people in DC. "Partially" controlled, because in order to fly they rely on a great deal of autonomous "robot" computer functions to move around on their own. True, they don't make the big decisions exactly, but they already make many micro-decisions. It is easy to increment the abilities of drones little by little so they operate more and more on their own, yet never call them autonomous. At some point the question becomes: what does it even mean to say drones are autonomous? They already kill. They do so without jeopardizing the person in charge of the drone. If a drone is capable of locating and targeting "bad people" accurately, it is only missing the human "go" or "no go" button press to finish the job. If "it" (the drone) makes that decision, is it then autonomous? I suppose so.
So, what's the point? The point is that this march of autonomous robots is inevitable, and it is hard to believe people will not weaponize them. Unfortunately, while I agree we shouldn't weaponize them, I don't know how we keep all countries from doing so, especially the typical rogue ones. The line will be too easy to blur and too ambiguous to define.
I'm still optimistic that we will work through this scary new set of "tools". Part of that will be taking stands, as Elon Musk and others are doing.