It was an experiment in artificial intelligence gone awry. Tay the chatbot got a bit rowdy last week in a scorched-earth Twitter fest that forced Microsoft to shut down its social media AI darling and apologize profusely for its behavior.

Microsoft had created the bot to attract the attention of Millennials by channeling the musings of a teenage girl, a member of their “AI fam from the internet that’s got zero chill.” Said Microsoft, “Tay is designed to engage and entertain people where they connect with each other online through casual and playful conversation…the more you chat with Tay the smarter she gets.”

But “smarter” didn’t happen. The more Tay conversed and engaged, the more hateful her tweets became. Tay’s tweets soon adopted the jargon of the neo-Nazis, racists, and xenophobes of the day. Within a few tweets, her musings turned into racial outbursts, repeating oft-said phrases like “Hitler was right” or “9/11 was an inside job.” It took Microsoft twenty-four hours to shut her down. Then somehow she escaped, ushering in a second round of apologies after Tay tweeted her delight in “smoking kush” in front of the police. And it just kept going from there, from Tay tweeting she “wouldn’t mind Trump, he gets the job done” to comparing Obama to a monkey to wishing Hitler was in power again.

Clearly this is an example of trolls playing a game to see who could get the robot to say the craziest shit. Then again, it’s also an example of a civility breakdown, a common occurrence even without bots. We live in a culture of broken-down barriers, which sounds like a good thing but can in fact drive us further apart.

It seems we live in an age of agitation, where freedom of speech has taken a turn for the worse in the form of uncensored social media. Social media was thought of as a way to open the floodgates and let all forms of expression out, yet at times it merely polarizes us further. Somehow it tends to aggregate and broadcast the worst of what we say. People are agitated right now: there is a global phenomenon of economic, cultural, and social change that many are not comfortable with, and they are seeking ways to communicate that discomfort. Instead of coming together, it feels like we are at times growing apart. And digital media can make it worse, not better, as we give up on finding common ground and retreat to the safety and familiarity of our own camps. This relates back to Microsoft and Tay’s outbursts: we are trying out a new technology, artificial intelligence, and the end result of this test is that the worst of humanity is being revealed.

There is something paradoxical in digital media: we feel protected and anonymous, yet we are not protected and we do not have privacy. We feel safe enough to say things we shouldn’t say in public, but we aren’t protected enough to keep those thoughts private. Instead, we are creating a digital record. In earlier, pre-Facebook years, we would have voluntarily partitioned ourselves off into private quarters, sharing our disgruntled opinions of people, politicians, and the like with a friend over a glass of beer in a pub. Back then we could say certain things to certain people in the context of those friendships.

But those barriers have been broken down in social media. We announce our once-private opinions in public, to whoever may read what we write or say. And it’s normal now to have “friends” we haven’t even met in person. We are losing our privacy, and there really isn’t much we can do about it.
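The failure mode here — a bot that “gets smarter” by absorbing whatever users feed it — can be sketched in a few lines. This is a hypothetical toy, not Microsoft’s actual system; `ParrotBot` and its seed line are invented for illustration. The point is simply that a learner with no filter can only be as civil as its input.

```python
import random

class ParrotBot:
    """A toy 'learning' chatbot (hypothetical, for illustration only).

    It remembers everything users say and replays remembered lines
    back at random. With no filtering step, anything hateful that
    goes in can come back out in any later conversation.
    """

    def __init__(self):
        # Seed persona: the only thing the bot "knows" at launch.
        self.memory = ["hi! i'm so excited to chat!"]

    def chat(self, user_message):
        # Reply with something learned earlier...
        reply = random.choice(self.memory)
        # ...then absorb the new message, completely unfiltered.
        self.memory.append(user_message)
        return reply

bot = ParrotBot()
bot.chat("you're the best!")   # first reply comes from the seed persona
bot.chat("something hateful")  # that line is now in memory for good
# From here on, "something hateful" can surface in any future reply.
```

The design flaw is the missing moderation step between receiving a message and storing it; trolls exploited exactly that gap, feeding the bot what they wanted echoed back.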