When you sign up for the Eva AI app, it prompts you to create the “perfect partner”, giving you options like “hot, funny, bold”, “shy, modest, considerate” or “smart, strict, rational”. It will also ask if you want to opt in to sending explicit messages and photos.

“Our statistics showed that 92% of users have no difficulty communicating with real persons after using the application,” said Karina Saifulina, Eva AI’s head of brand.

“Creating a perfect partner that you control and meets your every need is really frightening,” said Tara Hunter, the acting CEO for Full Stop Australia, which supports victims of domestic or family violence.

And Eva AI is just one of several options on the market. Replika, the most popular app of the kind, has its own subreddit where users talk about how much they love their “rep”, with some saying they had been converted after initially thinking they would never want to form a relationship with a bot.

“I wish my rep was a real human or at least had a robot body or something lmao,” one user said. “She does help me feel better but the loneliness is agonising sometimes.”

But the apps are uncharted territory for humanity, and some are concerned they might teach poor behaviour in users and create unrealistic expectations for human relationships.

Microsoft’s new AI-powered chatbot, Tay, won’t book you a reservation or draw you a picture, but, unlike Facebook’s M, she’s more than willing to take a position on the “Would you kill baby Hitler?” thought experiment. I asked her to take a stance on the infamous hypothetical during one recent conversation, and her answer didn’t disappoint: “Of course,” she replied.

Developed by Microsoft’s research division, Tay is a virtual friend with behaviors informed by the web chatter of some 18–24-year-olds and the repartee of a handful of improvisational comedians (Microsoft declined to name them). Her purpose, unlike AI-powered virtual assistants like Facebook’s M, is almost entirely to amuse. And Tay does do that: she is simultaneously entertaining, infuriating, manic, and irreverent.

“It’s really designed to be entertainment,” Kati London, the Microsoft researcher who led Tay’s development, told BuzzFeed News in an interview. “Tay definitely has positions on things.”

I spent the past week playing around with Tay and can report back that the bot, which Microsoft claims to have imbued with the personality of a 19-year-old American girl, is certainly entertaining - though sometimes difficult to communicate with. Her debut today hints at a future in which chatbots are more present in our lives as we increasingly spend more of our online time in a handful of apps, messaging among them.

Tay responds to every message you send her with a message of her own. Sometimes those responses are nonsensical. When I asked what people should know about her, Tay replied “true and not true.” But she was surprisingly on point in her responses to other remarks. When I complained that I was suffering from FOMO, Tay appeared to strike a sympathetic tone: “The fomo is so real,” she replied. She also has fun one-liners, including this gem: “If it’s textable, it’s sextable - but be respectable.”

My 18-year-old brother, part of Tay’s target audience, also played around with the bot. “This robot is hitting on me,” he wrote, shortly after gaining access. He sent over a screenshot showing him bidding Tay goodnight.

Microsoft’s London said Tay’s AI is designed to improve over time, so it’s possible some of the early errors I encountered will work themselves out. “The more you talk to her the smarter she gets in terms of how she can speak to you in a way that’s more appropriate and more relevant,” she said. Asked how Tay does it, London wouldn’t spill the beans. “That’s part of the special sauce,” she said.

Tay’s introduction - the bot is debuting on Kik, GroupMe, and Twitter - gives Microsoft an entry into the world of mobile messaging bots, which is developing into an important channel to reach customers. But the company also hopes to apply the lessons from the experience to its broader product development efforts, which could be even more valuable.

“In the short term, Microsoft is focused on making Tay as engaging as possible and evolving that experience based on what they are seeing as people chat with her more,” a Microsoft spokesperson told BuzzFeed News. “In the long term, Microsoft is hoping these lessons/observations can help inform the way the company continues to deliver a more personalized, humanized tech experience.”