Microsoft silences its new A.I. bot Tay, after Twitter users teach it racism [Updated] | TechCrunch

Remembering Microsoft's Chatbot disaster | by Kenji Explains | UX Planet

Microsoft's millennial chatbot tweets racist, misogynistic comments | CBC News

Microsoft's new AI chatbot Tay removed from Twitter due to racist tweets.

In 2016, Microsoft's Racist Chatbot Revealed the Dangers of Online Conversation - IEEE Spectrum

Microsoft's Tay AI chatbot goes offline after being taught to be a racist | ZDNET

Microsoft scrambles to limit PR damage over abusive AI bot Tay | Artificial intelligence (AI) | The Guardian

Microsoft chatbot is taught to swear on Twitter - BBC News

Tay (bot) - Wikipedia

Trolls turned Tay, Microsoft's fun millennial AI bot, into a genocidal maniac - The Washington Post

Twitter taught Microsoft's AI chatbot to be a racist asshole in less than a day - The Verge

Racist Twitter Bot Went Awry Due To “Coordinated Effort” By Users, Says Microsoft

Microsoft's AI Twitter Bot That Went Racist Returns ... for a Bit

After racist tweets, Microsoft muzzles teen chat bot Tay

How Twitter taught a robot to hate - Vox

Microsoft Created a Twitter Bot to Learn From Users. It Quickly Became a Racist Jerk. - The New York Times

Microsoft exec apologizes for Tay chatbot's racist tweets, says users 'exploited a vulnerability' | VentureBeat

Microsoft Chatbot Snafu Shows Our Robot Overlords Aren't Ready Yet : All Tech Considered : NPR

Microsoft's Artificial Intelligence Tay Became a 'Racist Nazi' in less than 24 Hours

Microsoft's racist teen bot briefly comes back to life, tweets about kush

Microsoft AI bot Tay returns to Twitter, goes on spam tirade, then back to sleep | TechCrunch

Tay: Microsoft issues apology over racist chatbot fiasco - BBC News

Microsoft's "Zo" chatbot picked up some offensive habits | Engadget