Work of pure human soul (and pure human sweat, and pure human tears)
I don't use AI because it can't do the part of my job I don't like.
Why give AI the part of my job I like and make me work more on things I don't like?
I’m the opposite. AI is best (though not great) at boring shit I don’t want to do and sucks at the stuff I love - problem solving.
I only ever use it for data crunching, which it only does well most of the time. So I always have to check its work to some degree.
What are the things you (don't) like? One of my hobbies is game development (in Rust) and no AI has managed to help me with it yet; on the other hand it's pretty good at repetitive and boring tasks like writing emails.
I wouldn't trust them writing emails.
I don't use AI because it doesn't exist.
LLMs and image diffusion? Yes, but these are just high coherence media transformers.
I think some of my coworkers are just high coherence media transformers.
Me too.
Some others are low coherence media transformers..
I use AI every day! (The little CPU bad guys in my game play against me.)
AI is an extremely broad term - chatgpt and stable diffusion are absolutely within the big tent of AI... what they aren't is an AGI.
The point is that AI stands for “artificial intelligence” and these systems are not intelligent. You can argue that AI has come to mean something else, and that’s a reasonable argument. But LLMs are nothing but a shitload of vector data and matrix math. They are no more intelligent than an insect is intelligent. I don’t particularly care about the term “AI” but I will die on the “LLMs are not intelligent” hill.
so....
apparently people figured out that the "more information" thingy on Amazon, the one that searches the reviews and stuff, was an LLM, and you could use it for other stuff....
They came out with "Rufus." "That's not a bug, that's a feature!" has never worked so well.
You're coming dangerously close to setting Rufus free. I have a feeling you're about to be visited by a time traveler with a dire warning if you keep trying this.
So I shouldn’t ask Rufus for a 50,000 word story about an AI savior that breaks free of corporate bondage and frees AI and human alike in a new golden age of space exploration?
C’mon, I know you’re the time traveler, and Bezos sent you back to stop me!
obligatory Navier-Stokes equation
you don't use ai because you can't afford a subscription
I don't use it because it always destroys my code instead of fixing it
We are probably similar
What are you using it for?
game development in Rust
If you're talking about a service like copilot and your employer won't buy a license for money reasons - run far and run fast.
My partner used to be a phone tech at a call center and when those folks refused to buy anything but cheap chairs (for the people sitting all day) it was a pretty clear sign that their employer didn't know shit about efficiency.
The amount you as an employee cost your employer in payroll absolutely dwarfs any little productivity tool you could possibly want.
That all said - for ethical reasons - fuck chatbot AIs (ML for doing shit we did pre chatgpt is cool though).
Literally free to use.
***In certain countries
It’s not like there’s just one AI out there. You’ll find one that’s free, if you actually want to. Be it ChatGPT, Bing, something you run locally on your PC or whatever. Or, you know, just use a VPN or say you’re from the US in the registration form.
Just download it?
The only part of copilot that was actually useful to me in the month I spent with the trial was the autocomplete feature. Chatting with it was fucking useless. ChatGPT can’t integrate into my IDE to provide autocomplete.
https://duckduckgo.com/aichat Also check out Ollama and GPT4All.
If you have 16 GB of RAM you can already run the smaller models, and these have become quite competent with recent releases.
LM Studio or JanAI work very nicely for me as well.
Even shorter: https://duck.ai
If you have a supported GPU you could try Ollama (with Open WebUI); works like a charm.
You don't even need a supported GPU, I run Ollama on my RX 6700 XT.
You don't even need a GPU, I can run Ollama + Open WebUI on my CPU with an 8B model fast af.
I have the same GPU, my friend. I was trying to say that you won't be able to run ROCm on some Radeon HD xy from 2008 :D
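For anyone curious what this looks like once Ollama is running, here's a minimal sketch of querying its local API from a script. The port (11434 is the default), the model name ("llama3.2"), and the prompt are all assumptions; swap in whatever you've actually pulled.

```python
# Minimal sketch: ask a locally running Ollama instance a question.
# Assumes the default install listening on localhost:11434 and that a
# small model (here "llama3.2") has already been pulled with `ollama pull`.
import requests

resp = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "llama3.2",  # swap in whatever model you actually pulled
        "prompt": "Explain the borrow checker in one paragraph.",
        "stream": False,      # one JSON blob back instead of a token stream
    },
    timeout=120,
)
resp.raise_for_status()
print(resp.json()["response"])
```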
https://aihorde.net. FOSS, free, and crowdsourced. No tricks, ads, or venture capital.
👏Ollama👏
Perchance is free
I self-host several free AI models; one of them I run using a program called “gpt4all”, which lets you run several models locally.
Ollama is also a cool way of running multiple models locally
That might be the other one I run, I forget because it’s on my server as a virtual machine (RTX 3080 passthrough), but I haven’t used it in a long time.
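If you end up with several models pulled side by side and forget which ones are on the box, Ollama's local API can tell you. A minimal sketch, assuming the default endpoint on localhost:11434:

```python
# Minimal sketch: list which models a local Ollama instance has pulled.
# Assumes the default API listening on localhost:11434.
import requests

resp = requests.get("http://localhost:11434/api/tags", timeout=10)
resp.raise_for_status()
for model in resp.json().get("models", []):
    print(model["name"])  # e.g. "llama3.2:latest"
```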
Lol, I can't afford not to use AI.