It's actually not as easy as you think. It "looks" easy because all you've seen is the result of survivorship bias, like Instagram people who don't post their failed shots. Seriously, go download a Stable Diffusion model, type in your prompt, and see how well you can actually direct that AI to get what you want. It's fucking work, and I'd bet a good photographer with a good model and a director can get whatever shot you want, quicker (even with greenscreen etc.).
I dabbled in Stable Diffusion a bit to see what it's like. On my machine (16GB VRAM), a 30-count batch generation only yields maybe 2~3 images that count as "okay", and those still need further photoshopping. And we're talking about resolution so low most games can't even use it as a texture (slightly bigger than 512x512, so usually mip 3 for a modern game engine). And I was already using the most popular photoreal model people had mixed together (now consider how much time people spent training that model to that point).
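For context on that "mip 3" remark: mip 0 is the full-resolution texture and each mip level halves both dimensions, so a ~512px SD output only fills mip 3 of a 4K (4096px) base texture. A quick sketch of that arithmetic (the function name is mine, not from any engine API):

```python
import math

def mip_level_for(source_px: int, base_px: int) -> int:
    """Which mip level of a base_px texture a source_px image covers.

    Mip 0 is the full-resolution texture; each successive level
    halves both dimensions (4096 -> 2048 -> 1024 -> 512 -> ...).
    """
    return int(math.log2(base_px // source_px))

# A 512px SD output measured against a 4K (4096px) base texture:
print(mip_level_for(512, 4096))   # 3
# Even a 1024px upscale only reaches mip 2:
print(mip_level_for(1024, 4096))  # 2
```

In other words, the raw output is three halvings below what a modern engine expects as the top-level texture, which is why it's unusable without upscaling.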
Just for graphic art/photo generative AI, it looks dangerous, but it's NOT there yet, very far from it. Okay, so how about the auto-coding stuff from LLMs? Welp, it's similar: the AI doesn't know about the mistakes it makes, especially with specific domain knowledge. If we had an AI trained on domain-specific journals and papers, one that actually understands how the math operates, then it would be a nice tool, because like with all generative AI stuff, you have to check the results and fix them.
The transition won't be as drastic as you think. It's more or less like other manufacturing: when the industry chases lower labour costs, local people find alternatives. And look at how the creative/tech industry tried outsourcing to lower-cost countries; it's really inefficient and sometimes costs more, with slower turnaround time. Now, post a job asking an artist to "photoshop AI results to production quality" and let's see how that goes. I'd bet 5 bucks the company gets blacklisted by artists, and all you attract are the really desperate or low-skilled, who give you subpar results.