When Tools Start Talking Back

Biju Neyyan
4 min read · Aug 23, 2024


So, picture this: I’m having a casual chat with a designer buddy of mine. You know, the usual small talk, swapping ideas, maybe a joke or two. In one of the exchanges, he said, “I’m working on a stove.”

Now, anyone would normally assume that he must be working on that great design project of his, solving the world’s problems and whatnot. But I, being a good old friend of his, decided to take his words quite literally and make a joke out of it. In my mind, I visualised my friend perched atop a stove, working hard to meet his deadline.

Naturally, I needed a visual. So, I turned to my trusty AI image generator, ready to craft the most absurd, meme-worthy image of a man hard at work… on a burning stove.

But here’s where things took a turn for the unexpected. My AI seemed to have zero sense of humour. The response? A firm, “Nope. That’s dangerous. Not gonna happen.” I was floored. My AI, the same one that once cheerfully created a cat riding a unicycle, had suddenly morphed into a digital safety inspector, wagging its metaphorical finger at me.

Now, I’m not one to back down from a challenge, especially when it comes to turning my silly ideas into reality. So, I went to another AI. We have plenty of them available for free these days. This one was a bit more accommodating. It did whip up the image, but not without a stern, “This is a dangerous situation, but okay, you asked for it.” The AI equivalent of “Fine, but don’t say I didn’t warn you.”

And there I was, staring at the screen, realising I was being lectured by my own creation. I mean, I get it: fire is bad, and people shouldn’t sit on stoves. But come on, AI! It’s just a joke! A few years ago, I could’ve drawn this on a napkin, no questions asked. But now, my tools are turning into overprotective nannies, making sure I don’t accidentally hurt myself with a virtual doodle.
This got me thinking: when did our tools start developing personalities? Remember when a hammer was just a hammer? You could use it to build a birdhouse or, I don’t know, smash a watermelon — no judgment. It didn’t pause to consider the ethics of smashing said watermelon. It didn’t worry that you might make a mess. It just did what it was designed to do.

But now, here we are, with AI that won’t let you create an image of a man sitting on a stove because “that’s dangerous.” What’s next? Will my coffee machine refuse to brew another cup because “that’s too much caffeine, Biju”? Will my vacuum cleaner stop mid-suction because “you should really sweep more often”?

The funny part is, this whole situation makes me nostalgic for a time when tools were just… tools. They didn’t care what you did with them — always up for whatever crazy idea you had in mind, no questions asked.

In the end, I did get my stove image, but not without a side of sass from the AI. And really, that’s the moral of the story. We’ve entered a new era where our tools are starting to think for themselves — and sometimes, they think they know better than we do. Whether that’s a good thing or just the beginning of an AI-led intervention, only time will tell.

But one thing’s for sure: Next time, I’m sticking to doodling on napkins. At least they won’t argue with me.

END

Of course, this was written with ChatGPT. You know, I’m not a writer; I just wanted to share an interesting observation from my personal experience, and ChatGPT was the easiest way to get it done and move on.

Yeah, they start yapping when they don’t like what I want them to write or create, but at the end of the day, I’m the editor; I decide what to add, what to keep, and what to snip out. Isn’t that the best way to live in today’s world?
