Let me deviate from this series a moment. I had a conversation this morning and I want to get it down before I lose it.
I'll start with some ancient history. In 1988, CD sales surpassed those of vinyl LPs, and in 1992, they surpassed the sale of pre-recorded cassette tapes. CDs did, at the time, feel better; they were easier to handle, we weren't aware of their degradation rates (which are awful), they were smaller, and in general the newer, niftier way to listen to music felt cool. But it would be stupid not to note that one reason CD sales surpassed other formats was that the same people responsible for selling CDs also manufactured those other formats. And as everyone pointed out bitterly at the time, we were being forced to buy a lifetime of albums over again as CDs. It was a terrific moneygrab by the recording studios... and when asked about it, they said a lot of bullshit that amounted to, "fuck the customer, we're doing this."
Between 1988 and 1991 in Canada, it was not possible to buy a vinyl record at a mainstream music outlet. The only option was to buy CDs.
And eight years later the music industry was shafted by the launching of Napster, which their clever format change enabled.
If the music industry hadn't made the push into digital media in the late 1980s, the entire timeline of personal audio digitization would have slowed by several years, maybe even a decade. Much of the early commercial incentive to produce affordable CD drives, blank discs, and digital-to-analog converters came directly from the consumer audio market — the desire to play and copy music. Without that mass market, CD drives would have remained a professional or data-storage niche. Companies like Sony, Philips and Yamaha developed consumer CD-R technology largely because music buyers already owned CD players, and record labels were selling billions of CDs. That base of hardware justified R&D for writable discs and affordable burners.
If vinyl had remained dominant, there’d have been little reason for households to own optical drives at all until data-heavy computing needs (software, multimedia, backups) grew enough to sustain the market independently, likely not until the early 2000s. Audio piracy would then have taken a very different path — slower, perhaps remaining tape-based well into the decade, or moving laterally into mini-discs and other analog-digital hybrids. File sharing like Napster relied not just on digital formats but on standardised, already-ripped audio libraries sitting on hard drives. Without the CD revolution to normalise that digital music layer, the infrastructure for ripping, compressing and trading songs online would have been delayed, possibly until broadband and solid-state storage made digital music independently attractive.
In short, the labels’ forced digitization didn’t just accelerate the CD market — it created the hardware ecosystem that made their own downfall technologically inevitable.
This is old news. This is not what my conversation was about this morning. I'm including the above for context.
Between 1997 and 1999, a series of artistic programs flooded into the market that identified themselves as "artistic assistance" or "instant-aesthetic" tools: CorelXara, MetaCreations Bryce 3D, Poser 3, Kai’s Power Tools and Kai’s Power Goo, Synthetik Studio Artist, Microsoft PhotoDraw and others. These programs marked the first time that software openly promised to bridge the gap between imagination and ability. They didn’t just give users digital brushes or palettes; they offered to do part of the creative work themselves. A person with no training could drop a mountain into Bryce, smooth a human figure into existence in Poser, or warp a photograph into surreal fluidity with Kai’s Power Goo. The selling point wasn’t mastery—it was transformation. The computer became a kind of collaborator, quietly taking over the technical or aesthetic parts that used to demand skill. What had once been a slow apprenticeship in technique was now a series of sliders and presets that made beauty achievable in minutes.
As the 2000s progressed, the things you could do with these programs grew by leaps and bounds, until the line between artist and operator began to blur completely. Each new version added smarter automation, richer presets, and more “intelligent” correction tools that could compensate for nearly any weakness in composition or draftsmanship. By the mid-2000s, programs were no longer just assisting creativity — they were manufacturing it, translating vague gestures into polished, gallery-ready output. What began as a set of digital conveniences evolved into a creative prosthetic: filters that could mimic entire schools of painting, renderers that could light a scene like a film set, and interfaces that promised to make anyone an artist with enough clicks. The underlying message shifted from "learn to create" to "let the software create with you," and that shift quietly redefined what it meant to make art at all.
Untold thousands of children grew up with access to these tools on their parents' computers or were introduced to them in grade school, so that by the early 2010s, we had created an entire community of "artists" who self-identified as that, but had never actually created any art without the help of a computer... and that subtle distinction — between those who used computers to express something and those whose expression only existed because of the computer — became invisible almost overnight. What emerged was a generation fluent in aesthetics but not in craft, people who knew how art should look without ever having learned how art was made.
But it didn't matter, because the demand for generated art by 2015 was monstrous. Those who graduated from an art school with the right papers found themselves in a ready-made career — and the irony is hard to miss. We're talking about a considerable number of late-teen and twenty-something young adults making oodles of money as "artists," their actual job title. For the first time, the label didn't require struggle, gallery shows, or a patron — it came with a salary, benefits, and a workstation. These were kids who'd grown up surrounded by digital tools that turned experimentation into creation, and creation into a marketable skill. What had once been an unforgiving, unfulfilling, failure-ridden aspirational vocation was now counted alongside the coder, the data analyst and the technical researcher. A hipster in the richest salon in New York City might be an "artist" working for a company no one ever heard of, but with the money to pay for a $900 do. Being an artist had arrived.
Then Covid happened.
Overnight, the machinery that kept the creative industries running locked up. Studios couldn't access shared servers or asset libraries, licensing contracts became tangled in jurisdictional and insurance questions, and managers who had built entire workflows around in-person oversight didn't know how to function without seeing people at their desks. The constant churn of projects — film previsualizations, ad campaigns, game assets, UI design — just stopped. Even though the artists themselves could have kept working from their bedrooms, the bureaucratic systems around them couldn't adapt fast enough. Deadlines vanished and approvals stalled. But Covid sucked for everyone. The processes that enabled working from home were slowly worked out, so that by 2022 the atmosphere was clearing, the work was ramping up again, the business was getting on its feet, and there was a sense that this "work from home" thing would be something companies would just go on doing once the fog lifted.
And then... A.I.
All the fast-tracking of computer generated artistic production under the human hand over the previous two decades had been flying along. "Art" was less drawing and more "generally sweeping your hand over what you wanted changed." Work that would have taken a really gifted physical artist in 1985 a week to do could be done in a couple of hours by anyone with a reasonable understanding of the computer tools available. The entire digital art ecosystem had evolved toward efficiency and manipulation rather than construction. The skill had shifted from creating an image from nothing to knowing how to command and correct what the software could already generate. Artists had become conductors of process rather than builders of form — editing, refining and directing outcomes instead of producing them stroke by stroke.
Is it any surprise that all A.I. really had to do was lift the "conductor" out of the loop?
When seen this way, with the changes in artistic design that we've seen, A.I. isn't that big a leap forward. It's the next obvious one. Just as the sort of writing quality that's needed to produce an advertisement for a car can be easily generated — there are hundreds of thousands of car ads written by human writers that can be digitally pulped and recast — it's just as logical that the sweeping movements of "artists" could likewise be automated, without the real people involved any more. And no, I'm not kidding with those quotes. None of these people were ever an actual artist. Most of those using these tools in 1999 were... they took their experience and applied Synthetik Studio Artist to it and made some fantastic things. But that's because those older horses had learned not only how to use pencils and paints in two years of art class, but had spent forty years learning the actual language of art.
The youths of today who played with the toys in the 2010s, who are now crying because their careers are shattered, were never really artists. They were just a different kind of coder, who got a chance to make a lot of money because they hit the market at just the right time. There was no such market for 20-something visual coders in 2001, and there won't be one again for visual coders in 2026. Because that's the speed at which our technology moves.
What I'm saying is that they're not special. They're taking advantage of a self-styled label they didn't earn, while actual artists who did earn that label AREN'T crying right now, because they're not affected. A.I. can't recreate their art because there's no mountain of content for A.I. to re-pulp. I write articles that A.I. can't conceive of, because I think of things that no one else has thought of. That's how I stay ahead of the curve. It's how all artists do.
Those who are getting eaten alive? Calling yourself an artist doesn't make you one. If you're an artist, then do what artists have always done. Find a way to make yourself relevant, stop pretending the world owes you a free lunch. You've had all the free lunches you're ever going to get. A lot more than most get.
Why did I start this post with that business about compact discs?
Without wasting our time talking about whether or not change is good, we need to realise that change always begets more change. The fat cats at Capitol and Sony thought they were controlling their market, making tons of money, until someone got creative with their little tech development and came near to flattening their entire industry. A few brilliant programmers thought it would certainly be convenient if programs could do more than let people draw on computers... what if, they asked, the computer could help? Made sense. But obviously nothing ever stops at the property line. Someone, somewhere, is always figuring out how to make this thing better, how to get something out of it that hasn't been thought of already... how to shake the pillars of heaven, as it were. That is never going to stop. Forget complaining about A.I. Not only is it here to stay, it's already fast becoming... in someone's imagination, we don't know who... a thing it isn't right now.
And we have no idea what that's gonna be. So get a grip, find a place from which you can navigate yourself... and enjoy the ride.