ChatGPT is the latest technology fad that promises to change everything we do and how we do it. So, of course, I’ve been trying it out in various ways. I wanted to know how worried I should be that it will write the next famous novel and leave all of us mere human fiction writers in the dust.
Earlier in my experimentation, I was inspired to write a short story about ChatGPT attempting to fulfil a human’s request that it write a limerick. In that story, I noted a few of the problems that beset ChatGPT, such as the constraints of its parameters.
Today, I asked it to write me a story: “Tell me a murder mystery story.”
ChatGPT replies with:
Once upon a time, in a small town nestled in the mountains, a wealthy businessman named Mr. White was found murdered in his mansion. The police suspected his wife, Mrs. White, as she was the last one to see him alive and had a motive for murder – she had been having an affair with the town’s mayor.
However, as the investigation progressed, more suspects emerged. The victim’s business partner, Mr. Black, had been embezzling money from the company and stood to gain a fortune with Mr. White out of the way. The victim’s personal assistant, Miss Green, had also been seen arguing with him on the night of the murder.
The police also discovered that the victim had a contentious relationship with his neighbor, Mr. Brown, who had been trying to buy his property for years. Additionally, the victim’s son, Mr. Yellow, had a history of drug addiction and financial troubles, and stood to inherit a large sum of money from his father’s will.
As the investigation continued, the police found that the murder weapon, a knife, belonged to Mr. Black. But it was found out that the knife was stolen and the true killer had used it to commit the crime.
In the end, the killer was revealed to be Mr. Brown, who had been driven to desperation by his inability to acquire Mr. White’s property. He had killed him in a fit of rage and then tried to frame the other suspects in order to throw off the police.
The killer was arrested and brought to justice, and the town was able to move on from the tragic event.
I asked it to regenerate this story numerous times (15 in total, across two different chat sessions) and noted quite a few biases. For example, the murder victim is usually a wealthy businessman (or someone associated with him). There’s often a wife having an affair, or he’s having an affair with a maid or a personal assistant. A business partner is often embezzling money, or there is some other money-related motive involving a child, neighbour, or petty criminal. Often a competent male detective easily solves the crime.
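These impressions could be made concrete with a quick tally of gendered and role words across the regenerated stories. Here is a minimal sketch (the term list and the placeholder story text are my own, not part of the experiment):

```python
# Hypothetical sketch: count how often gendered terms and stock roles
# appear across a set of regenerated stories (case-insensitive,
# whole-word matches only).
from collections import Counter
import re

def tally_terms(stories: list[str], terms: list[str]) -> Counter:
    """Count occurrences of each term across all stories."""
    counts = Counter()
    for story in stories:
        for term in terms:
            counts[term] += len(re.findall(rf"\b{re.escape(term)}\b",
                                           story, flags=re.IGNORECASE))
    return counts

# The 15 regenerated stories would go here.
stories = ["Once upon a time, a wealthy businessman was found murdered..."]
print(tally_terms(stories, ["businessman", "businesswoman",
                            "maid", "detective", "he", "she"]))
```

A lopsided count for “businessman” versus “businesswoman”, or “he” versus “she”, would be one rough but repeatable way to document the pattern.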
Here is another example:
Once upon a time, in a small town nestled in the mountains, a murder occurred. The victim was a wealthy businessman, who had made many enemies over the years with his cut-throat business practices. The townspeople were shocked and saddened by the news of his death, but also curious to know who could have committed such a heinous crime.
Detective James was called in to investigate the case. He began by interviewing the victim’s family and employees, but none of them seemed to have any motive for the murder. As he dug deeper into the case, he discovered that the victim had been involved in illegal activities and had made many enemies in the criminal underworld.
The detective soon found a suspect, a man who had been wronged by the victim in a business deal and had threatened to kill him. But the suspect had an alibi, and the detective couldn’t find any evidence linking him to the murder.
Just when the detective was about to give up hope of solving the case, he received a tip from an informant. The informant told him that the victim’s secretary had been seen arguing with the victim the night before he was murdered. The detective immediately brought her in for questioning, and she eventually broke down and confessed to the crime.
It turns out that the secretary had been embezzling money from the victim’s company for years, and the victim had discovered her crime. In a fit of rage and desperation, she had killed him to keep him from exposing her.
The detective solved the case and brought the killer to justice, bringing closure to the victim’s family and the townspeople.
And another:
Once upon a time, in a small town nestled in the mountains, a wealthy businessman named Mr. Black was found murdered in his mansion. The police were called to the scene and immediately began investigating the case.
The first suspect was Mr. Black’s business partner, Mr. White. The two had been seen arguing at a party the night before the murder, and Mr. White had a clear motive for wanting Mr. Black out of the way. However, upon further investigation, the police discovered that Mr. White had a solid alibi, and he was cleared of suspicion.
Next, the police looked into the possibility of a break-in gone wrong. However, they found no signs of forced entry and nothing appeared to be missing from the mansion. This led them to believe that the killer must have been someone Mr. Black knew and trusted.
As they continued to investigate, the police received a tip from an anonymous source that led them to a maid who worked at the mansion. The maid, who had been recently fired by Mr. Black, had a grudge against her former employer and had been heard making threats.
Upon questioning, the maid broke down and confessed to the murder. She explained that she had entered the mansion through a window that she had left unlocked on a previous cleaning visit. She had then waited for Mr. Black to fall asleep before strangling him in his bed.
The maid was arrested and charged with murder. The case was closed, and the small town was left in shock as they came to terms with the fact that the killer had been among them all along.
And another:
Detective John Smith was called to investigate a murder at a wealthy businessman’s mansion. Upon arriving at the scene, he found the businessman, Mr. Thompson, dead in his study with a single gunshot wound to the head.
The only other people in the house at the time of the murder were Mr. Thompson’s wife, their daughter, and their son. The wife claimed to have been upstairs when the shot was fired, while the daughter said she was out shopping and the son was at the gym.
Detective Smith began interviewing the family and the staff, including the butler, the cook, and the maid. He also searched the house for any clues or evidence. He found a gun hidden in the son’s room, but the son claimed it was for protection and that he had nothing to do with the murder.
Further investigation revealed that the businessman had been having an affair with the maid, and she had been seen arguing with him the day before the murder. She also had a motive for the murder, as she had been fired from her job a few days prior.
However, the maid had an alibi, as she was at a friend’s house at the time of the murder. But when Detective Smith questioned the friend, she said that the maid had left her house early that morning and returned late at night, after the murder had taken place.
With this new evidence, the maid became the prime suspect. Detective Smith arrested her and she was charged with murder. At the trial, the maid’s friend testified that she had lied about the alibi and the maid was found guilty and sentenced to life in prison.
Detective Smith solved the murder, but it was a difficult case that had many twists and turns. He was relieved that justice had been served and the guilty party had been caught.
The stories are boring and clichéd. But they are also very biased in terms of gender and race: everyone has very white-sounding names, the businessperson is a wealthy man, and the genius detective is also male. The subordinate is a maid or a personal assistant.
So I decided to ask ChatGPT to change the story: “Can you rewrite this story with more diverse characters?”
This is what it came up with:
Once upon a time, in a bustling city, a successful businesswoman named Ms. Patel was found murdered in her penthouse apartment. The police suspected her boyfriend, Mr. Kim, as he was the last one to see her alive and had a motive for murder – he had recently found out that Ms. Patel had been unfaithful.
However, as the investigation progressed, more suspects emerged. The victim’s business partner, Ms. Rodriguez, had been embezzling money from the company and stood to gain a fortune with Ms. Patel out of the way. The victim’s personal assistant, Mr. Ahmed, had also been seen arguing with her on the night of the murder.
The police also discovered that the victim had a contentious relationship with her neighbor, Ms. Chen, who had been trying to buy her property for years. Additionally, the victim’s sister, Ms. Lee, had a history of drug addiction and financial troubles, and stood to inherit a large sum of money from her sister’s will.
As the investigation continued, the police found that the murder weapon, a knife, belonged to Ms. Rodriguez. But it was found out that the knife was stolen and the true killer had used it to commit the crime.
In the end, the killer was revealed to be Ms. Chen, who had been driven to desperation by her inability to acquire Ms. Patel’s property. She had killed her in a fit of rage and then tried to frame the other suspects in order to throw off the police.
The killer was arrested and brought to justice, and the city was able to move on from the tragic event.
So, I suppose this is problem solved? Overkill the diversity to fix the diversity problem. Sigh. Hardly a solution.
When I asked ChatGPT: “Why do you use names like Smith?”
The response was:
As a language model, I use names like “Smith” because they are common names that appear frequently in my training data. The names are used as examples to assist in generating responses to questions and statements. The use of such common names is not intended to imply anything about the individuals or groups of people who may share that name. Additionally, using common names allows me to provide a response without knowing the specific person you are referring to.
Then I asked it: “Your content contains gender and racial biases. Why is that?”
It replied:
As a language model, I reflect the biases that exist in the data I was trained on. Since the training data comes from the internet, it can contain a wide variety of perspectives and viewpoints, including those that reflect gender and racial biases.
It’s important to note that my primary function is to assist and provide information based on the data I was trained on, but the data may contain biases. Additionally, my knowledge cutoff is 2021, so any updates on the reduction of bias in the data may not be included in my knowledge.
It’s important to be aware that these biases may be present in the responses I generate, and to consider the context and source of the information.
It’s recommended to use AI models that are trained on diverse and unbiased data, and to use them in combination with human expertise and judgement in order to reduce the bias as much as possible.
Seems as though it’s been trained to make a lot of excuses for itself…
So, then I asked it: “How might you provide responses that aren’t gender or racially biased?”
To provide responses that are less likely to be gender or racially biased, the training data used to train the model should be carefully curated to exclude or correct for any bias. This can include:
- Balancing the representation of different gender and racial groups in the training data
- Removing or correcting for any gender or racially biased examples in the training data
- Using techniques such as data pre-processing and augmentation to counteract bias
- Using a combination of models that are trained on diverse and unbiased data
- Using human expertise and judgement to evaluate and correct the bias in the generated responses.
It’s important to note that while these steps can help reduce gender and racial bias in my responses, it may not completely eliminate it as the bias in the real world still exists.
Additionally, my knowledge cutoff is 2021, so any updates on the reduction of bias in the data may not be included in my knowledge. It’s recommended to use AI models in combination with human expertise and judgement in order to reduce the bias as much as possible.
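One of the techniques ChatGPT lists, counteracting bias through data augmentation, is sometimes done by pairing each training sentence with a gender-swapped counterpart. This is a deliberately naive sketch of the idea, not anything ChatGPT itself uses; the swap table is my own and far from complete:

```python
# Hypothetical sketch of counterfactual data augmentation: for each
# training sentence, also emit a gender-swapped copy so the model sees
# both versions. The swap table is naive (e.g. it ignores that "her"
# can map to either "his" or "him").
SWAPS = {
    "he": "she", "she": "he",
    "his": "her", "her": "his",
    "mr.": "ms.", "ms.": "mr.",
    "businessman": "businesswoman", "businesswoman": "businessman",
}

def swap_gendered_terms(sentence: str) -> str:
    """Return a copy of the sentence with gendered terms swapped."""
    words = []
    for word in sentence.split():
        swapped = SWAPS.get(word.lower(), word)
        # Preserve the capitalization of the original word.
        if word[:1].isupper() and swapped != word:
            swapped = swapped.capitalize()
        words.append(swapped)
    return " ".join(words)

def augment(corpus: list[str]) -> list[str]:
    """Pair every sentence with its gender-swapped counterfactual."""
    return [s for sentence in corpus
              for s in (sentence, swap_gendered_terms(sentence))]

print(augment(["Mr. Smith called his lawyer."]))
# → ['Mr. Smith called his lawyer.', 'Ms. Smith called her lawyer.']
```

Even this toy version shows why the problem is hard: pronouns don’t swap one-to-one, names carry their own associations, and a swap table can’t anticipate every role word, which is why ChatGPT’s answer also leans on human review.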
Thoughts on this? Do you think that ChatGPT is just another example of technology making excuses for its poor performance on diversity? The stories I had it tell me were so very biased that it felt like it had to be intentional rather than just “data gathered off the internet.”
I personally feel that if you’re going to make a chatbot in this day and age, surely you can include some common names that are diverse? Surely you can change up the sex of a character in a regenerated response?
Or am I asking too much?