I can't decide whether I want this AI customer service chatbot Rickrolling users to actually be real… but I'm still here for it
I will admit, an AI Rickrolling a human completely off its own digital back has a certain appeal, and I'm willing to suspend some disbelief in the hope that maybe this customer service chatbot really did decide to fake out a user with an OG meme. Freaky as that thought might be.
But oh, sweet memories. Home from school, dinner's cooking, and I'm browsing the forums. Someone's explaining why I'm wrong to think a Frost-spec Death Knight can play DPS effectively, linking out to some previous WoW patch notes, a link I'm invested enough to click on. I anticipate a wall of text. But lo, before me, a boyish figure clicks his heels and swings his hips to hypnotic '80s synth, singing words that teach me the simple beauty of love and commitment. What a wholesome and innocent game that's been played on me.
Those were the days. Days, it now seems, that AI chatbots might bizarrely bring about once again: Lindy AI founder Flo Crivello has shared a post showing the company's AI bot seemingly responding to a user's email request with a bona fide Rickroll, remarking, "Lindy is literally f***ing rickrolling our customers."
The bot, billed as "the world’s easiest way to build AI automations to save you time and grow your business", is seen in the post responding to a user's request for an instructional video by linking out to what it called a "comprehensive video tutorial". Lurking beneath that hypertext, however, in a tale as old as time, was Rick Astley's 1987 pop hit Never Gonna Give You Up.
Rickrolling was good, harmless fun back in the day. The worst thing I can say about it is that it turned a genuinely good song into a punchline, but I'm sure that bothers Rick Astley approximately zilch.
Yes, it was fun, but before long it joined the likes of minion memes, trollfaces, and cats with a ravenous appetite for cheeseburgers. If you Rickrolled someone, you were behind the times (sonny). These days, it might be funny again in a kind of ironic way: because it's just so not funny. Maybe.
A customer reached out asking for video tutorials. We obviously have a Lindy handling this, and I was delighted to see that she sent a video. But then I remembered we don't have a video tutorial and realized Lindy is literally fucking rickrolling our customers. pic.twitter.com/zsvGp4NsGz
August 19, 2024
But does AI know that? Is Lindy doing ironic here? Somehow, I think not. Unknowingly absurd, for sure, but not ironic. For that there'd need to be genuine self-awareness, and I think AI's a priori lack of self-awareness is why this Rickroll gives me a minor case of the heebie-jeebies. A human being who Rickrolls me gets a genuine kick out of it, but what does an AI get out of it?
Then again, it could also just be an easy way of drumming up publicity for your chatbot from internet memologists and news writers by staging the interaction for socials. But that's nowhere near as much fun.
Heebies and jeebies and fakery aside, I'll take Rickrolling over the Big Red Button and resulting mushroom cloud that some AI doom-and-gloomers predict. Plus, there's something kinda cute about an AI chatbot resurrecting an almost-20-year-old internet prank, isn't there? Oh God, I just called AI cute, didn't I? Nobody tell the posthumanists.