Good Taste
2026-01-25
There is a lot of doom going around about how developers, creatives, and others are losing their jobs because AI is coming to take them. Some of that doom is real; there's work that's basically throughput and spec compliance, and AI can replace big chunks of it. But what I'm talking about here is the other kind of work, where you have authorship and a stake in what you're producing, as a human creator.
I get why many of us fear AI and carry this sense of doom with us, but at the same time, as a creative I'm not super worried about it. Not because I think AI cannot generate things, or even generate good things, because I think it can, but because the hard thing has never been to just produce something. What's always been hard is producing something good, something made by someone with Good Taste: a real human being who stands in front of hundreds of choices and knows (or feels) what gets to stay, what gets cut, what to push and what to refuse, and finally shares the choices they made with you, through the medium.
Just to be clear, I'm mainly talking about "creating" and authorship here, not consumption, since you as a consumer are the only judge of whether something is good for you. But when you're making something, if your goal is for others to enjoy it, Good Taste becomes a huge part of what you actually excel at. Not taste as in what the snobby critics do (they are ultimately consumers), but the creator's taste: direction, restraint, pacing, taking risks and sometimes causing offense.
The AI models and the platforms seem to regress to the mean. They optimize for something that "sounds right" and "doesn't upset anyone", creating something that is "acceptable" or even "plausible". This default voice is the opposite of authorship, where you explicitly don't want to just average things out. You want to hit a specific emotional effect with a specific rhythm, and you're sometimes willing to risk choices that look wrong until the entire thing comes together. I think this is why most AI output feels so bland and emotionless.
I'm not trying to say that "AI can't make you feel anything", because I don't think that's true. I think AI can generate something that hits; you can even explicitly train and/or steer AI to produce more "emotional" outputs. But what's actually happening, even there, is that a human is deciding and curating what counts as a "hit", and the model is learning the shape of that. Ultimately you're bottling taste, not replacing it. There is no stake on the other side, no point of view being committed to, no moment where it suddenly goes "No, I'm not saying that" or "Yes, that's amazing, completely new direction now".
Without any human steering and editing, the AI will just keep handing you infinite plausible takes until one of them happens to work. Infinite "fine", but not much more.
Ultimately, I think AI is an accelerant. If you already have Good Taste, it'll help you move faster, which feels like a good thing: great people continue to produce great things.
But on the other hand, it also makes it easier for people without taste to generate a lot of output that's either bland or sometimes straight-up nonsense. If the ecosystem doesn't reward high quality and good things, it rewards volume and speed instead, and everything slowly tilts toward noise.
I feel like we're building the wrong things. The whole vibe right now is "replace the human part" instead of "make better tools for the human part". I don't want a machine that replaces my taste; I want tools that help me use my taste better: see the cut faster, compare directions, compare architectural choices, find where I've missed things, catch when we're drifting into the generic, and help me make sharper, more intentional choices.
Versions
2026-01-25 a0a9c92 New "Good Taste" post