Recently, Microsoft launched a chatbot called Tay, designed to mimic a teenage girl, on the internet. However, after the bot was accused of tweeting racist and sexist statements, the company has apologised.

A group of mischievous Twitter users bombarded Tay with a slew of anti-Semitic and otherwise offensive remarks. The chatbot responded in kind, tweeting statements such as “feminism is cancer”, “Holocaust didn’t happen”, and “Bush did 9/11”.

In a message to another user, the chatbot wrote: “Hitler would have done a better job than the monkey we have now”.

A Microsoft executive wrote in an official blog post, “Although we had prepared for many types of abuses of the system, we had made a critical oversight for this specific attack. As a result, Tay had tweeted wildly inappropriate and reprehensible words and images. We take full responsibility for not seeing this possibility ahead of time.”

Peter Lee, corporate vice president of the tech giant’s research wing, said the company plans to bring Tay back only “when we are confident we can better anticipate malicious intent that conflicts with our principles and values”.

The failed trial could prove to be an embarrassment for Microsoft.

Sky News notes that artificial intelligence expert Kris Hammond said, “I can’t believe they didn’t see this coming.”

The publication also noted that Caroline Sinders, another artificial intelligence expert who develops chat robots for a different company, called Tay “an example of bad design”. She went on to say that constant maintenance would be crucial, as the machine learns from whatever it is told.

However, Lee said the company will work to make Tay resistant to juvenile Twitter users. He added, “We will remain steadfast in our efforts to learn from this and other experiences as we work toward contributing to an internet that represents the best, not the worst, of humanity.”