The chat bot had been sidelined after making racist comments. Microsoft's millennial chat bot "Tay" made a brief reappearance on Twitter this morning but still didn't make a stellar ...
Microsoft has apologised for creating an artificially intelligent chatbot that quickly turned into a Holocaust-denying racist. But in doing so, it made clear that Tay's views were a result of nurture, not ...
Microsoft’s artificial intelligence strategy is to unleash more bots like Tay. Redmond doesn’t mean more nasty, racist or homophobic chatbots, but a forest of A.I. personalities for different uses — ...