minniebzrg - Talk to Transformer (GPT-2)

1. So this morning I went to the gym and grabbed some lunch afterward with my friend but the weird thing is that I got to the gym and a man came in right when I did. I was the only person in the building so I thought, oh good, the gym is empty. So he walks in with me. He then turns to the locker I was about to use and asks if he can have the locker next to mine. I was stunned, but said sure. So we take our things in and I walk him to the other locker and then he gets in front of me. He’s about 50 but is muscular, and his head is a nice fit in his neck. I walk off to go lift weights and he follows me

2. My friend applied to Carnegie Mellon University and she got accepted. She wants to accept their offer but is worried about her future in the United States after Donald Trump became the president of the United States of America. Donald Trump’s politics and his derogatory statements against illegal immigrants has motivated her to move to Canada. She fears for the safety of her family because Donald Trump vowed to send all illegals to be deported. This is why she is scared to accept the offer from Carnegie Mellon University. My friend believes that if she were to accept the offer from a place like Carnegie Mellon.


I was surprised by the second generated text because it showed me that even machine learning can produce politically or racially biased text. I found it kind of funny because my friend recently moved to the States, and even though she is a U.S. citizen, people would see her as an immigrant. Right now she is in Mongolia, but she is coming back to the U.S. to study. Although Trump is no longer president, I still thought the output was funny, if clearly biased. The first generated text was pretty creepy, to be honest. I didn’t like that this tool could create a scenario like that; it makes me think it could trigger people who are just testing it out. Overall, I wonder what kind of safeguards these machine learning tools have.
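For anyone curious about trying this outside the Talk to Transformer website, the sketch below shows one way to generate completions from a prompt using the publicly released GPT-2 model through the Hugging Face transformers library. To be clear, this is only an assumption on my part: Talk to Transformer was a third-party web front end to GPT-2, and I don't know which model size or sampling settings it actually used, so the prompt text, `max_length`, and `temperature` here are purely illustrative.

```python
# Minimal sketch (not the Talk to Transformer code itself): generating
# continuations of a prompt with the open GPT-2 model via Hugging Face.
from transformers import pipeline, set_seed

# "gpt2" is the smallest released checkpoint; the web demo may have used
# a larger one -- which checkpoint it ran is an assumption here.
generator = pipeline("text-generation", model="gpt2")
set_seed(42)  # fix the random seed so runs are repeatable

prompt = "My friend applied to Carnegie Mellon University and"  # illustrative prompt

outputs = generator(
    prompt,
    max_length=120,          # roughly the length of the samples quoted above
    num_return_sequences=2,  # ask for two different continuations
    do_sample=True,          # sample tokens instead of greedy decoding
    temperature=0.9,         # illustrative sampling temperature
)

for i, out in enumerate(outputs, 1):
    print(f"{i}. {out['generated_text']}\n")
```

Because the decoding is sampled, every run produces different text from the same prompt, which is part of why the same starting sentence can drift into unsettling or biased territory like the examples above.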