Reply To: Want to get involved? Advocate for registry reform!

#45003
Timothy D.A.Lawver

Once man decided it was appropriate behaviour to force another man into subservience to a machine, a computer’s database, man extinguished humanity itself in favor of the machine. https://www.auburn.edu/~vestmon/robotics.html

Even if we presume Asimov’s rules were wrong in practical application, as argued in this piece, “Isaac Asimov’s Laws of Robotics Are Wrong” by Peter W. Singer (https://www.brookings.edu/opinions/isaac-asimovs-laws-of-robotics-are-wrong/),
we nevertheless diminish humanity’s value in favor of machine value.

Mr. Singer makes his point by using the example of military drones. He writes:
“The most important reason for Asimov’s laws not being applied yet is how robots are being used in the real world. You don’t arm a Reaper drone with a Hellfire missile or put a machine gun on a MAARS (Modular Advanced Armed Robotic System) not to cause humans to come to harm. That is the very point!”

Yes! That is precisely Asimov’s point in developing the laws in the first place. Singer’s opinion piece does not diminish my point; rather, it fortifies it.

Asimov’s laws prescribe a particular regulatory regime upon man’s uses of machines. Asimov did so not “as a plot device to help advance his stories” but to serve as a warning mechanism. He did so out of his sense of morality. I doubt that our nation’s warmongers would even bother to consider the immorality of drone use. There is no money in it. From their perspective, morality and mortality are mutually exclusive and thereby irrelevant. Given registrants’ relationship to the state’s database, one of subservience, I have even less doubt about why such notions as Asimov puts forward are completely ignored in our civil discourse.

If NARSOL were to alter their tack to a more effective course, I suggest they consider making the machine, and unconscionable uses thereof, their focus. Indeed, this is a tack ALL HUMANS can empathise with. Man over machine.