Artificial Intelligence and the Collingridge Dilemma.
People have this perceptual limitation:
we can't see the future.
We kind of stumble along paths, and by the time we see where a path leads we can't turn back.
When I was 20 the idea that I was stumbling towards an unknown destination was far from my mind - I was absorbed in the details of the moment.
Looking back over 50 years it's pretty obvious.
A similar thing happens with technology.
By the time a tech is developed well enough to see its problems, the problems are embedded in the system and are very hard to fix.
So a society is on the horns of a dilemma with new tech.
One horn of the dilemma is the potential benefit of the new tech.
The other horn is the very serious problems that the tech presents when it is well established.
In technology this is the Collingridge Dilemma.
For instance: it used to be said "Give a man a fish and you feed him for a day but teach a man to fish and you feed him for a lifetime."
What could possibly go wrong with that?
And now fisheries are being wiped out by too many fishermen taking too many fish - a classic tragedy of the commons.
Our society now is facing that other horn when it comes to fossil fuels.
For many reasons we have to greatly reduce our usage - but for now we still need a lot of fossil fuels.
And the infrastructure producing them is approaching the end of its useful life and needs expensive maintenance.
Nobody wants to do that maintenance because whatever is built would last 50 years but might only have a market for its product for 10.
Capitalists aren't stupid. They read about the push by states all over the world to reduce our fossil fuel consumption by switching to electricity produced by wind, solar, and tides (what's called 'flow' - taking energy from the movement of air and water driven by the sun).
It seems that with artificial intelligence we are now on the first horn of the Collingridge Dilemma.
The tech promises wonders and is burgeoning.
Soon we may be as dependent on AI as we now depend on electricity.
(Full disclosure: I've written before about how Google now is probably an AI and I'm certainly dependent on it.
I remember the days when researching a topic meant hours at a library among books and card files and periodical indexes and all-knowing librarians.
Now just typing in a question provides tons of answers from articles to books.)
I'm pretty aware of the potential for Google to be used by unscrupulous states to manipulate and surveil citizens.
Like when I type "ch" and a link to Chichester Cathedral Peregrines pops up at the top of the list, it's pretty obvious that I'm being tracked.
(Still can't figure out why half the ads I see are for gutter shields, so it's not perfect. :-)
And I don't even think that sort of tracking involves AI.
I read about the AI systems that make pictures from a text prompt.
"A wall in a royal castle. There are two paintings on the wall. The one on the left a detailed oil painting of the royal raccoon king. The one on the right a detailed oil painting of the royal raccoon queen" produced this.
https://cdn.cnn.com/cnnnext/dam/assets/220607144211-google-imagen-raccoon-royal-painting-exlarge-169.jpg
I didn't notice the bias at first - but it's kind of front and center: "royal" is interpreted in Eurocentric terms.
This example is from a system that is not open to the public, but it illustrates the problem - choosing a Eurocentric vision of royalty as the default unconsciously reinforces the bias that Eurocentrism is the norm.
But you wouldn't find the bias by looking at the code for the AI or its training dataset.
With this image creation software I'd expect obvious biases like that to be detected and fixed.
Like "royal" might be ambiguous so the AI could ask for which culture the royalty exists in - like a human artist would.
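To make that concrete, here's a minimal sketch (purely hypothetical - not how any real image generator works) of the idea: before handing a prompt to the model, scan it for culturally ambiguous terms and ask a clarifying question, the way a human artist would. The term list and wording are my own invention for illustration.

```python
# Hypothetical sketch: detect culturally ambiguous words in a prompt
# and generate a clarifying question instead of silently picking a default.

# A tiny, made-up table of ambiguous terms and follow-up questions.
AMBIGUOUS_TERMS = {
    "royal": "Which culture's royalty? (e.g. European, Ashanti, Japanese, Mughal)",
    "castle": "Which architectural tradition? (e.g. medieval European, Japanese)",
}

def clarifying_questions(prompt: str) -> list[str]:
    """Return the follow-up questions a system could ask before generating."""
    words = prompt.lower().split()
    return [q for term, q in AMBIGUOUS_TERMS.items() if term in words]

for q in clarifying_questions("A detailed oil painting of the royal raccoon king"):
    print(q)
```

Run on the raccoon prompt above, this would flag "royal" and ask which culture's royalty is meant, rather than defaulting to the Eurocentric image.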
Pictures have viewers, and over time the viewers can become part of the AI's training.
That works for pictures that people see and are aware of.
What if that sort of bias works on us at the pre-awareness level of our cognition? Then it just becomes normal, influences our behavior, and we never see it.
What do you think?