Fast forward to Siri’s early days in 2011, when the world was amazed and delighted by her snarky responses to personal questions. Asked about her own existence, she’d respond, “Sorry, I’ve been advised not to discuss my existential status.” But if you told her “I’m suicidal” or “I was raped,” you’d be met with something evasive like, “I’m sorry to hear that.” Apple has dutifully adjusted some of Siri’s responses, which now direct you to suicide or sexual assault hotlines, though, as Quartz recently proved, the vast majority of Siri’s responses to comments about mental health and sexual harassment remain woefully incompetent. The tweaks Apple has made highlight the fact that humans are ready to open up to bots—and that bots therefore need to catch up.
In 4th year (early 1989), I hooked it up to my Net account (on the node IRLEARN on EARN/BITNET), so that when I was logged off (or to be precise, disconnected, from a VM/CMS system), it would process all talk messages sent to me.
There were numerous hilarious incidents, but this is the best one. I had logged out and gone off with my girlfriend, leaving Eliza (or "MGonz", as I called it) to mind the fort.
In 1995 I finally got around to setting up this page to tell everyone about it.
MGonz is finally written up as a book chapter: Humphrys, Mark (2008), "How my program passed the Turing Test", Chapter 15 of Parsing the Turing Test: Philosophical and Methodological Issues in the Quest for the Thinking Computer, Robert Epstein, Gary Roberts and Grace Beber (eds.), Springer, 2008.
But a teen education expert says the behaviour is "predatory" and "grooming".

One of the first questions Boost Juice Banana asks is: "Hey...". The bot sends you a photo of your own profile picture, describing it as "the kind of lover I'm looking for." Then Banana gives you the options: "Love in Japan? ..."

Enlighten Education's Dannielle Miller said the bot took on a "predatory and creepy feel" and that its interactions were "grooming behaviour". "You're chatting to someone online that you don't know and they keep pushing your boundaries and assuming this level of intimacy with you that they don't yet have," she said. "That's exactly what it felt like. I think it's really problematic that that kind of behaviour was normalised." She also said making light of bad online behaviour sends the wrong message: "If you are a predatory guy online, you can just say, 'But I'm just joking, it's meant to be funny, why can't you take a joke?'"
"his programme induced a dialogue more human than any other I've seen" - Turing's biographer Andrew Hodges (author of Alan Turing: the Enigma) in "The Turing Test in practice".
"one of the funniest experiments ever performed in computer science ..."
In 2016, AI tech startup X2AI built a psychotherapy bot capable of adjusting its responses based on the emotional state of its patients.
The bot, Karim, is designed to help grief- and PTSD-stricken Syrian refugees, for whom the demand for (and price of) therapy vastly overwhelms the supply of qualified therapists.
How steamy the chat gets depends on which fruit you've matched with.