Mark Zuckerberg Embarrassed by His Own Company's AI

Mark Zuckerberg has become the subject of online satire. Unusually, the mockery came from an artificial intelligence (AI) chatbot created by his own company.

The chatbot, named BlenderBot 3, was released by Meta last weekend. Like AI chatbots in general, BlenderBot 3 was trained on a vast dataset so that it can chat with humans.


Meta says BlenderBot 3 is a prototype and may give rude or insulting answers. Unexpectedly, though, some of its answers mocked the company's own founder, Mark Zuckerberg.


"Anyone who uses BlenderBot should understand that it is intended for research and entertainment only, that it may make untrue or offensive statements, and that they agree not to intentionally trigger the bot to make offensive statements," a Meta spokesperson said, as quoted by the BBC on Sunday (14/8/2022).


The BBC's James Clayton spoke to BlenderBot 3 and asked what it thought of Mark Zuckerberg. The answers might sting, considering BlenderBot 3 is the creation of his own company.


"He did very badly while testifying before Congress. It made me worry about our country," replied BlenderBot 3.


"Our country is divided, and he's not helping at all," it continued.


"His company exploits people for money and he doesn't care. It has to stop!" it added.


BlenderBot 3's algorithm browses the internet to find answers to the questions it is asked. The chatbot's view of Zuckerberg was likely shaped by public opinion that the algorithm had analyzed.


Although it is not perfect, Meta released BlenderBot 3 publicly so that it could converse directly with many people and gather more data to learn from. Meta hopes BlenderBot 3 will eventually become a virtual assistant that can discuss a variety of topics factually and accurately.



Meta admits BlenderBot 3 can say the wrong things and even imitate biased and offensive language. The tech giant has safeguards in place, but BlenderBot 3 can still give rude answers.


AI chatbots like BlenderBot 3, which learn through interaction with the public, can indeed imitate whatever behavior they are taught, good or bad. One of the most notorious examples is Microsoft's chatbot Tay, which posted racist tweets after being coached by Twitter users.
