Emily Hanley says she and other out-of-work copywriters are only the first wave of AI's collateral damage and calls the collapse of her profession the "tip of the AI iceberg."
I think this article misses the forest for the trees. The real "evil" here is capitalism, not AI. Capitalism encourages a race towards optimality with no care for what happens to workers. Just like the invention of the car put carriage makers out of business, AI will be used by company owners to cut costs if it serves them. It has been like this for over 100 years; AI is just the latest technology to come along. I'm old enough to remember tons of these same doom-and-gloom articles about workers losing their jobs when the internet revolution hit in the late 90s. And probably many people did lose jobs, but many new jobs were created too.
This person lays out all the reasons she failed: instead of adapting, using ChatGPT herself, reducing her prices, and finding more clients, she did nothing.
She was writing the most boring kind of text, which no one reads (corporate blog posts and spam emails).
She refused to learn the new things that would have kept her in a job.
Yes, some jobs disappear and others appear. I believe that 90+% of today's jobs didn't exist even 50 years ago, and you don't get one of them without the will to learn new ways of doing things. Imagine a farmer with only the knowledge of 100 years ago, or a hotel front-desk worker without a computer and a telephone.
For mid-level writers, which is what she was, using AI doesn't work. The few remaining clients you have specifically don't want AI to be used, so you either lie and deceive them or you stay away from AI.
And using AI to lower prices and find new clients doesn't work either. Writers are already competing against writers from nations with a much lower cost of living who do the same work for a fraction of the cost. The big advantage domestic writers had was a better grasp of the language and culture, and that advantage is mostly lost if you start using AI. So if that's your business plan, you are in a race to the bottom. It's not sustainable, and you will be out of a job in maybe 3-5 years.
Except that the 'AI' is fed by the work of actual humans, and as time goes on, these models will be trained more and more on the imperfect output of other AIs, which will eventually turn their output into totally bizarre crap. Meanwhile, the humans have stopped practicing whatever task it was, since they can't get paid to do it anymore, so there's no new human material.
I'm really having a hard time thinking about what jobs this would create though. I get the internet thing, as people needed to create and maintain all aspects of it, so jobs are created. If some massive corporation makes the AI and all others use the AI, there's no real infrastructure. The same IT guys maintain the systems at AI corp. What's left to be done with it/for it by "common folk?"
There are plenty of companies out there (and more every day) who want to do AI in-house and can't (or don't want to) send their data to some monolithic, black-box company with no transparency. The finance industry, for example, cannot send any data to a third party like OpenAI (ChatGPT) for compliance reasons, so those firms are building teams to develop and maintain their own AI models in-house (SFT, RLHF, MLOps, etc.).
There are lots of jobs being created in AI daily, and they're generally high-paying, but they're also very highly skilled, so it's difficult to retrain into them unless you already have a strong math and programming background. And the number of jobs being created is definitely far smaller than the potential number of jobs lost to AI, though this may change over time.
Despite what the pseudo-intellectuals will tell you, ChatGPT is not some all-powerful, do-everything AI. Say you want to use GPT to build a chatbot that gives company-specific info to people at your company: you can't just take the existing ChatGPT and ask it "how do I connect to the wifi" or "is the office closed on Monday". You need an in-house team of people to provide properly indexed information, train and test the bot, update it, handle error reports, etc. (rough sketch of that plumbing below).
AI is not magic; it's literally just an advanced computer script, and if your job can be replaced by an AI, then it could have been replaced by a regular computer script or program. There just weren't enough buzzwords and media hype to convince your boss to do it.
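To make that concrete, here's a minimal sketch of the plumbing an in-house team ends up owning. All the names in it are made up for illustration (COMPANY_DOCS, tokenize, retrieve, build_prompt, and the unnamed model call the prompt would be sent to); a real system would use a proper search or embedding index, but the shape of the work is the same: curate the company data, retrieve the relevant bit, and wrap it into a prompt somebody has to keep testing and updating.

```python
import re

# Hypothetical mini knowledge base: the part an in-house team has to collect,
# index, and keep up to date before any bot can answer company questions.
COMPANY_DOCS = {
    "wifi": "Guest wifi: join the network Acme-Guest; the password is on the intranet page.",
    "holidays": "The office is closed on public holidays; Monday closures are announced by HR.",
}

def tokenize(text: str) -> set:
    """Lowercased word set with punctuation stripped (toy stand-in for a real index)."""
    return set(re.findall(r"[a-z0-9]+", text.lower()))

def retrieve(question: str) -> str:
    """Return the doc whose topic and text overlap the question the most."""
    q = tokenize(question)
    best_topic = max(
        COMPANY_DOCS,
        key=lambda topic: len(q & (tokenize(topic) | tokenize(COMPANY_DOCS[topic]))),
    )
    return COMPANY_DOCS[best_topic]

def build_prompt(question: str) -> str:
    """Paste the retrieved company info into the prompt the model actually sees."""
    return (
        f"Answer using only this company info:\n{retrieve(question)}\n\n"
        f"Question: {question}"
    )

# A real deployment would send build_prompt(...) to whatever model API the
# company uses; maintaining the docs, the retriever, and the prompt template
# is the ongoing human work described above.
print(build_prompt("How do I connect to the wifi?"))
```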
Well, in business school they teach you that running a company is an exercise in maximising profit as a constrained optimisation problem, so optimality for a classical company (not one of those weird startups that doesn't make money for 10+ years) almost always means maximum profit.
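In symbols (a generic textbook formulation, not something from the article): the firm picks a decision vector x (prices, output, headcount) to maximise profit, revenue minus cost, subject to whatever constraints it faces, and "optimality" just means solving

```latex
\max_{x} \; \pi(x) = R(x) - C(x)
\quad \text{subject to} \quad g_i(x) \le b_i, \qquad i = 1, \dots, m
```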
I honestly can't tell if you're being serious. The 'evil' is the same force that replaced carriages with cars? The world would be better if carriage-making was still a critical profession?
This man doesn't want the new jobs and the new innovations. He's fine staying exactly as he is, as long as that means he doesn't have to worry about adapting to future problems...
Yeah, not to mention: do we really need human labor for the jobs she was doing? "I'd work on webpages, branded blogs, online articles, social-media captions, and email-marketing campaigns."
Email marketing campaigns? Social media captions? Branded blogs? You'd think she'd be happy to be free of it.
I imagine the prestige of being able to tell people she was a "professional writer" was worth something to her mentally, but c'mon... she was a marketing droid. She's just been replaced by another marketing droid.
Yes, we do still need monks copying books, just not for the latest romance novel. Let the machine do what it does well and crank out millions of copies of dreck. However, the remaining monks might still find good employment going upscale, competing on prestige and quality rather than quantity or turnaround time.
This author wants to keep turning out quantities of dreck, but now there's a cheaper way, and she doesn't seem interested in trying to move upscale to a product where humans are still better than AI (I assume that's what she means by "funnels").
I'm in the tech field, so my point of comparison is outsourcing. We had a couple of decades where management decided the most profitable way to do business was outsourcing quantities of dreck to the lowest-priced providers in third-world countries. That even drove racism that hadn't previously existed. More recently, though, the companies I work for are more likely to be looking for quality partners or employees at different time zones and price points. Suddenly results are much better now that our primary concern is no longer the lowest price. Don't be a monkey banging on a typewriter for an abusive sweatshop in a third-world country, replaceable by someone or something yet cheaper; upscale to being a respected engineer in a different time zone making a meaningful contribution to the technical base.
It is often argued that Gutenberg, the inventor of the printing press, was the most influential man in history. The printing press is the root of practically everything we take for granted today, from republican government to basically all technology ever.
For the past several years I worked as a full-time freelance copywriter; I'd work on webpages, branded blogs, online articles, social-media captions, and email-marketing campaigns.
Turns out that when all you need is a low-quality product and a machine can do it cheaper, that's what people will choose. It's shitty that this affects people's livelihoods in the short term, but this is what happens in capitalism.
Isn't this the real problem? Maybe my outside perspective is wrong, but it really seems like companies have changed what they want from writing to mass quantities of eye-catching dreck rather than useful, informative, well-written articles. I'm not just talking about Buzzfeed either; this illness has infected news, marketing, and tech docs as well.
A friend who works for a consulting company has talked about how, when he is between gigs, he works internally on improving their doc generator. This is a high-end, expensive consultancy, and part of what you get is mass quantities of generated dreck.
Humans can still create better writing in many ways, but how can we fix society to value that?
The poorly written sentence with the typo right at the punchline doesn't help her case: "The contract was six months, because that's how long it'd take the AI would learn to write just like me but better, faster, and cheaper." Yep. Better than that.
On one hand, I’m not sure what kind of consistent and great results people are getting with GPT today. It’s an amazing tool but it is still lacking in a lot of ways.
Into the future? I think a lot of the jobs will change dramatically and entirely new ones will exist.
Adaptation is necessary in life; a disruptive technology has been created, and we are just starting to understand it.
The results probably aren't ideal, but that's not so much of a problem when you factor in the costs. GPT is good enough for far cheaper, and that's why people are being replaced.
I use it for various tasks, but I treat it like a tool. I understand it can't work miracles, and I make sure I'm feeding it the right information to produce my desired result.
It saves me a decent amount of time and effort in rote work while the creative inputs still come from me.
As with anything, proofread and edit heavily to ensure it all makes sense.
Why did the comedian lose their job to an AI? Because they just couldn't "crack" the code like the AI could! The AI had the audience "programmed" to laugh, while the comedian was left "debugging" their routine. Talk about a real "byte" to the ego!
If you're a comedian, and you lost your job to this, well, maybe it's for the better?
AI is hype. It's a pump and dump just like self-driving cars. I'm sure people will tell me I'm wrong, and maybe I am. But with results like the following, how can it be trusted with menial tasks?
Prompt: Name all the states in the US that have the letter "P" in their name.
ChatGPT: Certainly! The states in the United States that have the letter "P" in their name are:
Neither is ready for prime time right now, but both are improving. AI is a hot buzzword, and Tesla is overpromising and underdelivering, but at the same time, there are others behind the scenes actually bringing autonomous vehicles to fruition.
You know, this thing does a bunch of stupid stuff, but this one really confuses me, especially because it's supposed to be a language transformer, and it is usually pretty good at English. I thought you were exaggerating your example, but it turns out to be true.
The unemployment rate is still at historic lows. If you are "forced" to take a grocery-store job passing out samples, then you have no marketable skills. Don't blame ChatGPT for this.
I think she has a good point at the end. Lots of us think we have skills that can't be replicated by a machine, but companies would rather have something replicated poorly by a machine if it saves them money.
Of course they would, that's the point of the company! Companies don't align with our needs as humans. Ideally we'd have more free time due to advancements and automation, but our corporate overlords think we should just work more actually. And old people who got theirs don't think anyone should have it easy since they didn't.
I've read lots of dull copy written by humans. Even if the first draft was good (and it probably wasn't), it still goes through a committee that sterilizes it in the end anyway.
I mean, the expectation was that there would never be an artificial intelligence capable of coming up with its own ideas, having its own inspiration, and creating based on its own experiences.
The reality is it didn't have to. All it took was mass theft of human work, and a machine able to take the bits and pieces of those works and shuffle them into a product that matches the user's parameters.
Honestly, I wish we were dealing with actual "artificial intelligence" that was capable of its own thoughts, inspiration, feelings, and experiences; that could paint a picture or write a story based on its own experiences, and maybe offer its own perspective as a machine, further pushing the boundaries of what is possible in art and storytelling.
Instead, I get to realize that in reality, all art and storytelling is mixing and matching the same parts into something different, and that we have built a machine so efficient at doing it, there is no need for humans to do it.
I already kinda knew that I was never going to have a career doing anything creative, but all this "AI" boom has shown me is that no matter how "skilled" or "creative" I become, those bits and pieces can be broken down into something cheap enough that my involvement is no longer necessary.
“Write a short one paragraph summary about the evils of ChatGPT”
“ChatGPT, the impressive AI language model, presents potential evils that demand careful attention. These include the spread of misinformation, amplification of biases, privacy concerns, risk of malicious exploitation, and undermining critical thinking through overreliance. As we utilize this technology, it becomes crucial to acknowledge and address these challenges responsibly to ensure a more ethical and secure AI landscape.”
People had these same concerns and troubles during the industrial revolution, when machines started to work better, faster, and cheaper than the human laborers doing the same jobs. Is there going to be a serious upheaval in labor again? Yup. Is it a bad thing for the world? In some ways yes, in other ways no.
The industrial revolution has done horrible things to the global environment. At the same time, many more people are much better off today than they were in the early 19th century.
Which is actually not that different from what companies have done over the past couple of decades, namely moving positions from high-cost to low-cost countries. The cost of an AI is probably easier to hide in the balance sheet than the cost of human resources, too.
It's not better yet, or for everything (arguably not for most things), and the first forays into mechanization of industry weren't, either. We're at the very beginning here.
AI is already better... than some people. A human using AI is probably better and faster at certain tasks than a somewhat skilled human working without it.
I bet midjourney is better at making concept art than the vast majority of the population.
I think we have a high threshold for success of AI. I saw a video a while back about how AlphaGo (an AI designed for playing Go) was able to beat a whole bunch of experts in Go. One expert used an atypical move and beat AlphaGo. People started reacting like "see? AI isn't impressive. This genius beat it." How many of us are geniuses? How often will geniuses beat better AI?
This is not like the industrial revolution. You really should examine why you think "we figured other things out in the past" is such an appealing narrative to you that you're willing to believe the reassurance it gives you over the clear evidence in front of you. But I'll just quote Hofstadter (someone who has enough qualifications that their opinion should make you seriously question whether you have arrived at yours based on wishful thinking or actual evidence):
"And my whole intellectual edifice, my system of beliefs... It's a very traumatic experience when some of your most core beliefs about the world start collapsing. And especially when you think that human beings are soon going to be eclipsed. It felt as if not only are my belief systems collapsing, but it feels as if the entire human race is going to be eclipsed and left in the dust soon. People ask me, "What do you mean by 'soon'?" And I don't know what I really mean. I don't have any way of knowing. But some part of me says 5 years, some part of me says 20 years, some part of me says, "I don't know, I have no idea." But the progress, the accelerating progress, has been so unexpected, so completely caught me off guard, not only myself but many, many people, that there is a certain kind of terror of an oncoming tsunami that is going to catch all humanity off guard."
Bald-faced appeal to authority, okay. With a side of putting words in my mouth that I clearly did not say.
The industrial revolution destroyed some jobs, and created others. Destroyed some industries, and created others. We've been in an "information revolution" for some time, where electronic computers have supplanted human computers, and opened up an enormous realm of communication, discovery, and availability of information to so many more people than ever before in history. This is simply true.
Just as the landscape of human physical labor was forever changed by the industrial revolution, the landscape of human thinking labor will continue to be forever changed by this information revolution. AI is a potential accelerator of this information revolution, which we are already seeing the impacts of, even at this extremely early stage in the development of AI. There will be both good and bad outcomes.
Absolutely agree. We all have a strong drive to feel that what we do is unique and special, but that doesn't make it true. From the mundane to the artistic, AI can already do a large amount of what people do, and there's every reason to believe that AI's abilities will grow quickly and surpass human abilities. Based on the evidence, it looks like this is gonna happen within the next few years - like within 5.
When AI is able to replace most jobs, as a society what do we do when there are no jobs for the large majority of people? Humanity is going to go through a tough upheaval more disruptive than anything ever before. We're gonna have to figure out how to completely reorganize how we exist, what we do in our daily lives, and how we think of ourselves as a species.
Governments undoubtedly should regulate AI progress to protect vulnerable occupations, but positions like copywriter will inevitably go away. It's part of progress; it just needs to be less painful and shocking.