
It’s not entirely PROGRAMMED TO GIVE biased information.
It GIVES biased information.
I think I see the disconnect. I see two things:
PROGRAMMING, which is telling a computer what TO DO with data.
DATA, which is just “stuff”.
To say it’s PROGRAMMED TO DO SO implies INTENT.
DATA has no intent. It’s just “there”.
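The distinction can be shown with a toy sketch (all names here are hypothetical, not from any real system): the program below contains no biased instruction at all; it only counts. Any skew in its output comes entirely from the data it is fed.

```python
from collections import Counter

def train(corpus):
    # PROGRAMMING: tell the computer what to do with data -- just count words.
    return Counter(corpus)

def predict(model):
    # Return the single most frequent word seen in training.
    return model.most_common(1)[0][0]

# DATA: deliberately skewed for illustration. The code above has no
# opinion; the preference for "good" lives entirely in this list.
data = ["good", "good", "good", "bad"]
model = train(data)
print(predict(model))
```

Running this prints "good", not because anyone programmed a preference, but because the data carried one.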

Of course it’s all biased; repeating the obvious is not a significant turn of events here.
My focus is on intent or no intent: acting with malice, acting with stupidity, or a mixture of both, and if a mixture, how much of each.

Peter: “Apparently Chat GPT is programmed with a lot of biases. Kenneth Udut”
Kenneth: “eh”
My “eh” was pointing at “programmed with”.
I read that as “the programmers intentionally included a lot of biases when there were other options”.
