Junior Dev VS Machine Learning
The sad thing is that no amount of mocking the current state of ML today will prevent it from taking all of our jobs tomorrow. Yes, there will be a phase where programmers like myself, who refuse to use LLMs as a tool to produce work faster, will be pushed out by those who will work with LLMs. However, I console myself with the belief that this phase will last not even a full generation; even those collaborative devs will find themselves made redundant, and we'll reach the same end without me having to eliminate the one enjoyable part of my job. I do not want to be reduced to being only a debugger for something else's code.
Thing is, at the point AI becomes self-improving, the last bastion of human-led development will fall.
I guess mocking and laughing now is about all we can do.
at the point AI becomes self-improving
This is not a foregone conclusion. Machines have mostly always been stronger and faster than humans, because humans are generally pretty weak and slow. Our strength is adaptability.
As anyone with a computer knows, if one tiny thing goes wrong it messes up everything. Computers are not adaptable to change. Most jobs require people to adapt to tiny changes in their routine every day. That's why you still can't replace accountants with spreadsheets, even though spreadsheets have existed in some form for 50 years.
It's just a tool. If you don't want to use it, that's kinda weird. You aren't just "debugging" things. You use it as a junior developer who can do basic things.
This is not a foregone conclusion.
Sure, I agree. There's many a slip 'twixt the cup and the lip. However, I've seen no evidence that it won't happen, or that humans hold any inherent advantage over AI (as nascent as it may be, in the rude forms of LLMs and deep learning it currently takes).
If you want something to reflect upon, your statement about how humans have an advantage of adaptability sounds exactly like the previous generation's grasping at an inherent human superiority that would be our salvation: creativity. It wasn't too long ago that people claimed that machines would never be able to compose a sonnet, or paint a "Starry Night," and yet creativity has been one of the first walls to fall. And anyone claiming that ML only copies and doesn't produce anything original has obviously never studied the history of fine art.
Since no one would now claim that machines will never surpass humans in art, the goals have shifted to adaptability? This is an even easier hurdle. Computer hardware is evolving at speeds enormously faster than human hardware. With the exception of the few brief years at the start of our lives, computer software is more easily modified, updated, and improved than our poor connective neural networks. It isn't even a competition: computers are vastly better equipped to adapt faster than we are. As soon as adaptability becomes a priority of focus, they'll easily exceed us.
I do agree, there are a lot of ways this future could not come to pass. Personally, I think it's most likely we'll drive ourselves extinct, or at least the society able to continue creating computers. However, we may hit hardware limits. Quantum computing could stall out. Or we may find that the way we create AI cripples it the same way we are crippled: with built-in biases, inefficiencies in thinking, or simply resource demands too high for complexity much beyond what two humans can create with far less effort and very little motivation.
creativity has been one of the first walls to fall
Uh, no? Unless you think unhinged nonsense without thought is "creative". Right now, these programs are like asking a particularly talented insane person to draw something for you.
Creativity is not just creation. It's creation with purpose. You can "create art" by breaking a vase. That doesn't mean it's good art.
Artwork is never the art.
And yet, I've been to an exhibit at the Philadelphia Museum of Fine Art that consisted of an installation that included a toilet, among other similarly inspired works of great art.
On a less absurd note, I don't have much admiration for Pollock either, but people pay absurd amounts of money for his stuff, too.
An art history class I once took posed the question: if you find a clearing in a wood with a really interesting pile of rocks that look suspiciously man-made, but you don't know if a person put it together or if it was just a random act of nature... is it art? Say you're convinced a person created it and so you call it art, but then discover it was an accident of nature, does it stop being art?
I fail to see any great difference. AI-created art is artificial, created with the intention of producing art; is it only not art because it wasn't drawn by a human?
If you're talking about
https://en.wikipedia.org/wiki/Fountain_(Duchamp)
that's a seminal work of avant-garde art. You are still talking about it 100 years later. It's obviously great art.
Art is a work of visual, auditory, or written media that makes you feel emotion. That's it. Does this pile of rocks make you feel happy or sad or anything? Then it's art.
AI makes pictures like a camera does. That doesn't make them art unless you make something that evokes emotion.
We're saying the same thing. AI can create art. My point was that we used to claim that art was a domain that was unassailable by machines, and this obviously is not true. So now humans - or the particular human to whom I was replying - have a new goalpost: adaptability.
We'll keep coming up with new goalposts where "humans have an edge" that will keep us relevant and ascendant over machines, and irreplaceable. I believe we'll run out of goalposts faster than many people would like.
You know, there is one small other hope I have: that, despite how we've raised them, our children will be better than us, and will stop the cycle of wealth concentration. It's unlikely, but it's the only chance I see.
I do have one notion of where we might still have an edge. The human brain is quite optimized for energy usage, as a consequence of natural selection. Meanwhile, IIRC, computers are optimized for speed, and so they often waste energy. Now let's see where this goes: will they be able to operate without guzzling energy?
Well, we could end capitalism, and demand that AI be applied to the betterment of humanity, rather than to increasing profits, enter a post-scarcity future, and then do whatever we want with our lives, rather than selling our time by the hour.
The only way I see that happening is if the entire economy collapses because nobody has jobs, which might actually happen pretty soon 🤷
Amen. Let's do that thing: you have my vote.
The best part is that dumbass devs are actively working on self-improving AI that will take their jobs.