In-Sight: The Need for Technological Morality

I receive a daily e-letter that keeps me connected to “the outside world,” and especially the high-tech world.  In today’s edition, one of the featured articles was on the growing need to figure out how to make machines “moral” as they become increasingly powerful.  The need for this is apparently obvious and imminent.

I am a low-tech guy, so this article fascinated me.  I cannot even imagine how engineers would go about doing this.  But the fact that some folks embedded in this business and industry see the need for a “moral” machine strikes me as significant.

From a spiritual formation standpoint, all I can think to note is that whatever “morality” ensues within the machine, it will necessarily flow from those who eventually program the device.  So, the real need is not so much to have  moral machines as it is to have moral people programming them.

And that’s where the challenge comes in.  We seem to lack consensus (and any means for achieving it) as to what “truth” (from which morality emerges) means, or even can mean.  We now seem deadlocked in a mounting conflict between alleged absolutes, which is what both religion and the new atheism are, with intermediate stops between these extremes.

So, some day (apparently soon), someone will sit down at the table and “enter a morality” into a machine.  In that moment, we will have turned a cosmic corner, and as of today, it’s impossible to say what kind of world we will be creating and sustaining.

From the vantage point of spiritual formation:  “Lord, have mercy!  Christ, have mercy!”

About jstevenharper

Retired seminary professor, who taught for 32 years in the disciplines of Spiritual Formation and Wesley Studies. Author and co-author of 31 books.
This entry was posted in In-Sight.

2 Responses to In-Sight: The Need for Technological Morality

  1. Larry White says:

    The problem point they are reaching is one that comes after programming. Machines are starting to learn, and as they learn, they go beyond their original programming, or they may even start to be able to alter their programming. They will need an “ethics” or “morality” — a set of parameters outside their regular programming that serves as guidelines for decision making. It is difficult to include these parameters in such a way that they, too, cannot be altered (which is the danger). We see what happens when good ethics in humans gets reprogrammed; imagine the speed of corruption in a machine with a simpler brain than a human’s.

  2. Donna Bogan says:

    Thanks for this, Steve. It’s going to be critical to remember the old computer adage: “Garbage in; garbage out.” We’ll just have to pray that the programmers live by “ethics in; ethics out.” Asimov covered this in his sci-fi books using the Three Laws of Robotics.
