Many people have misgivings about AI, especially the generative flavour. It’s not really intelligent, they say. It has no feelings. Fine. I’ll cede those points without so much as a flinch.
But here’s the thing: some use cases don’t require intelligence, and feelings would only get in the way.
Take one of mine. I feed my manuscripts into various AIs – is that the accepted plural? – and ask them, “What does this read like? Who does it read like?” I want to know about content, flavour, format, cadence, posture, and gait.
A human could answer that too – if that human had read my manuscript, had read a million others, and could make the connexions without confusing me with their personal taste, petty grievances, or wine intake. An AI just pattern-matches. It doesn’t need a soul. It needs data and a difference engine.
Cue the ecologists, stage left, to witter on about climate change and saving the whales. Worthy causes, granted, but a separate argument. This is where the conversation slides from “AI is bad because…” to “Let’s move the goalposts so far they’re in another sport entirely.”
I’m not asking my AI to feel, or to virtue-signal, or to single-handedly fix the carbon cycle. I’m asking it to tell me whether my chapter reads like Woolf, Vonnegut, or the back of a cereal box. And for that, it’s already doing just fine.