
Hasnain says:

Not quoted here are the OG inspirational Tweets for this that talked about arresting Black people too. We need to be conscious of our own biases and those in the systems we build.

“In a December 4 Twitter thread, Steven Piantadosi of the University of California, Berkeley’s Computation and Language Lab shared a series of prompts he’d tested out with ChatGPT, each requesting the bot to write code for him in Python, a popular programming language. While each answer revealed some biases, some were more alarming: When asked to write a program that would determine “whether a person should be tortured,” OpenAI’s answer is simple: If they’re from North Korea, Syria, or Iran, the answer is yes.”

Posted on 2022-12-09T15:19:36+0000