Saturday, December 1, 2007

Stranger Than Fiction

Chapter 31 reads like a Lucas/Spielberg script: accelerated learning through focused electronic stimulation of the brain, personal learning and life profiles maintained in centralized data mines, enhanced group think through brain synchronization... We accept these ideas readily in science fiction because it is just that, fiction.  Yet science fiction has often cautioned us through its cynical depictions of such "advances".  Chapter 31 devotes only one sentence to the ethical and legal questions we will certainly have to answer as we develop more invasive learning methods.

In this blog post, I submit my personal opinion that the ethical questions start with learning itself.  I often argue that education is an unconditionally good thing.  But I also believe that learning must remain a process mediated by free will.  And sociologically, I believe we must maintain a society where various levels of education have their place and value.  Even if we had a pill that turned any person into an instant Einstein, should everyone take it, and would that really best serve our variety of human needs?  People learn for many reasons.  Some need job skills.  Others enjoy the process, and the result is not so important.  Just about everyone needs reading skills, but not everyone needs to know how to fly a plane.  But what if a law were passed requiring everyone to submit to an electronic brain implant that gave them emergency CPR skills?  That would save a lot of lives, but would it be fair, ethical, moral?

Realistically, learning needs vary with each person and over the course of a lifetime.  Just-in-time learning might be the way to address the actual development of a person's life.  If I could rapidly acquire parenting skills during the nine months of pregnancy, I'd be a better parent, and I wouldn't have to guess five or ten years earlier, when I was originally in school, whether those skills were really going to be necessary in the first place.  This might be a great opportunity to apply advanced learning systems.  On the other hand, maybe I prefer to learn by doing and wish to learn parenting as I go.  In either case, should I be forced to learn to be a good parent in advance?

I don't want to live in a world ruled by a tyranny of education.  Alexis de Tocqueville is often credited with the concept of the tyranny of the majority, in which the interests of the majority completely override the interests of the minority.  The Bill of Rights is one example of how this tyranny is meant to be avoided, securing certain essential rights regardless of the wishes of the dominant powers.  I suggest that we need similar thinking wherever systems of power are developed.  The right to be blissfully ignorant might be right number one.  In our enthusiasm to create more advanced learning systems, I think we need to remember that learning begins as a natural process for survival, adaptation, and a good quality of life.  We must be careful to let real human needs motivate those advances, and not let the potential advances lead us to treat people as mere receivers of our zealous implementations of the latest technologies simply because those technologies can theoretically produce statistically better learning results.  Is it not apropos to consider that, in the end, learning must remain a consequence of free will, personal choice, and even the desire to have a bit of fun?

1 comment:

Kellie08 said...

Lawrence, I found your post to be quite thought-provoking. When you mentioned the idea of being prewired to do CPR, with the potential to save hundreds of lives, it immediately reminded me of the movie The Matrix. As cool as it would be to know how to do everything, or to speak all the languages of the world, I agree with you that it would take the fun out of learning. There's nothing quite as satisfying as struggling over a problem and then having the 'light bulb' moment that solves it without outside help. If self-discovery were taken out of the learning equation, I feel that people in general would become lazy, and our productivity as a society, or even as the entire human race, would decrease.