What is the technical hill you are willing to die on in your industry?
Every industry is full of technical hills that people plant their flag on. What is yours?
For any non-trivial software project, spending time on code quality and a good architecture is worth the effort. Every hour I spend on that saves me two hours when I have to fix bugs or implement new features.
Years ago I had to review code from a different team and it was an absolute mess. They (and our boss) defended it with "That way they can get it done faster. We can clean up after the initial release". Guess what, that initial release took over three years instead of the planned six months.
In my team we manage two software components. One of them (A) has two devs, the other (B) approximately five.
Every time a feature needs to be added, B complains that it's going to take forever, while A is done in a fraction of the time.
The difference? B is a clusterfuck of a codebase that they have no time to refactor because they run low on time to implement the features.
I work in A, but I'm not going to steal the credit; when I entered the company, A already had a much cleaner codebase. It's not that my partner and I are 10x better than the ones working on B, they just have uglier code to deal with.
I can't comprehend why management doesn't see the reason A needs half the devs to do the job faster.
I can’t comprehend why management doesn’t see the reason
Management cannot see beyond the next quarter, it's a genetic predisposition of the species.
Not everything needs to be deployed to a cluster of georedundant K8s nodes, not everything needs to be a container, Docker is not always necessary. Just run the damn binary. Just build a .deb package.
(Disclaimer: yes, all those things can have merit and reasons. Doesn't mean you have to shove them into everything.)
But then how will I ship my machine seeing as it works for me?
Damn, I haven't thought of that! Looks like I have to use a subdirectory of your Homedir from now on.
I work in disability support. People in my industry fail to understand the distinction between duty of care and dignity of risk. When I go home after work I can choose to drink alcohol or smoke cigarettes. My clients who are disabled are able to make decisions including smoking and drinking, not to mention smoking pot or watching porn. It is disgusting to intrude on someone else's life and shit your own values all over them.
I don't drink or smoke but that is me. My clients can drink or smoke or whatever based on their own choices and my job is not to force them to do things I want them to do so they meet my moral standards.
My job is to support them in deciding what matters to them and then help them figure out how to achieve those goals and to support them in enacting that plan.
The moment I start deciding what is best for them is the moment I have dehumanised them and made them lesser. I see it all the time but my responsibility is to treat my clients as human beings first and foremost. If a support worker treated me the way some of my clients have been treated there would have been a stabbing.
Disabled people are so often treated like children and it just sucks.
Like you, I tend to feel that in general, people need to stop trying to force people to live the way they think is best. Unless there is a very real, very serious impact on others ("I enjoy driving through town while firing a machine gun randomly out my car windows"), people should be permitted to choose how to live as far as possible. Flip side is that they gotta accept potential negative consequences of doing so. Obviously, there's gonna be some line to draw on what constitutes "seriously affecting others", and there's going to be different people who have different positions on where that line should be. Does maybe spreading disease because you're not wearing a facemask during a pandemic count? What about others breathing sidestream smoke from a cigarette smoker in a restaurant? But I tend towards a position that society should generally be less restrictive on what people do as long as the harm is to themselves.
However.
I would also point out that in some areas, this comes up because someone is receiving some form of aid. Take food stamps. Those are designed to make it easy to obtain food, but hard to obtain alcohol. In that case, the aid is being provided by someone else. I think that it's reasonable for those other people to say "I am willing to buy you food, but I don't want to fund your alcohol habit. I should have the ability to make that decision." That is, they chose to provide food aid because food is a necessity, but alcohol isn't.
I think that there's a qualitative difference between saying "I don't want to pay to buy someone else alcohol" and "I want to pass a law prohibiting someone from consuming alcohol that they've bought themselves."
I disagree with restricting alcohol for food stamps. In fact, it shouldn't be food stamps, it should be cash. When you attach all these requirements and drug testing and restrictions you are destroying the autonomy of the person you are claiming to help.
It is like with housing. Many of the housing programs available require drug tests, job seeking documentation, separating men and women, and so on. In some cases this can make a little sense, given that men are much more likely than women to be domestic abusers, but other cases make less sense. If someone uses drugs to cope with their life and then you offer housing only if they stop the thing that is helping them cope they will not be helped, they will be harmed. They will not be able to take the housing and end up off the street in a secure place building a life, they will be still on the street and still on the drugs.
If I go and work a job and get paid should my employer be able to say "I'm fine with paying you so you can have housing and food, but alcohol? No, I don't want to pay for alcohol"? This would be insane. Your employer choosing what you can do with your money outside of work hours is authoritarian nonsense and yet when it comes to welfare or charity people think it is fine. I disagree vehemently.
If I give you money to alleviate your suffering, who am I to decide how you employ it? I want you to have more money precisely because it is fungible; you can do almost anything with money, so you can make choices. I want you to have more power to affect your life, not less.
I assume you are an American given your reference to food stamps. Where is the American spirit of independence? Of self determination? Of rugged individualism? It seems quite dead in the modern era of state capture and authoritarian oligarchy. It is a loss and a tragedy.
Patient autonomy!
Not strictly technical, although organizational science might be seen as a technical field on its own.
Regularly rotating people between teams is desirable.
Many companies just assign you to a team and that's where you're stuck forever until you quit. In slightly better places they will try to find a "perfect match" for you.
What I'm saying is that moving people around is even better:
You spread institutional knowledge around.
You keep everyone engaged. Typically on a new job you learn for the first few months, then you have a peak of productivity when you have all the new ideas. After about two years you either reach a plateau or complacency.
I'm in health sciences and I wish we would do more education days/conferences. I'm a med lab tech and I feel like no one knows what the lab actually does; they just send samples off and the magic lab gremlins divine these numbers/results. I feel the same way when another discipline discusses what they do, it's always interesting!
I'll allow it; institutional knowledge, while it sounds good, does cause business continuity problems.
Cleaning, organizing, and documentation are high priorities.
Every job I've worked at has had mountains of "The last guy didn't..." that you walk into, and it's always a huge pain in the ass. They didn't throw out useless things, they didn't bother consolidating storage rooms, and they never wrote down any of their processes, procedures, or rationale. I've spent many hours at each job just detangling messes because the other person was too busy or thought it unimportant and didn't bother to spend the time.
Make it a priority, allocate the time, and think long-term.
Starting a new job soon, and I’m paying for some holes in documentation as I prep my offboarding documentation for my current team. Definitely making it a priority to do better going forward! Being lazy in the moment is nice but the “stitch in time” adage is definitely true
Make it a priority, allocate the time, and think long-term.
In many jobs, someone with the power to fire you makes the priorities, allocates your time and does not think long-term.
I'm so hot for you right now.
AI is a fad, and when it collapses, it's going to do more damage than any perceived good it's done to date.
I can believe that LLMs might wind up being a technical dead end (or not; I could also imagine them being a component of a larger system). My own guess is that language, while important to thinking, won't be the base unit of how thought is processed the way it is on current LLMs.
Ditto for diffusion models used to generate images today.
I can also believe that there might be surges and declines in funding. We've seen that in the past.
But I am very confident that AI is not, over the long term, going to go away. I will confidently state that we will see systems that will use machine learning to increasingly perform human-like tasks over time.
And I'll say with lower, though still pretty high confidence, that the computation done by future AI will very probably be done on hardware oriented towards parallel processing. It might not look like the parallel hardware today. Maybe we find that we can deal with a lot more sparseness and dedicated subsystems that individually require less storage. Yes, neural nets approximate something that happens in the human brain, and our current systems use neural nets. But the human brain runs at something like a 90 Hz clock and definitely has specialized subsystems, so it's a substantially-different system from something like Nvidia's parallel compute hardware today (1,590,000,000 Hz and homogeneous hardware).
I think that the only real scenario where we have something that puts the kibosh on AI is if we reach a consensus that superintelligent AI is an unsolvable existential threat (and I think that we're likely to still go as far as we can on limited forms of AI while still trying to maintain enough of a buffer to not fall into the abyss).
EDIT: That being said, it may very well be that future AI won't be called AI, and that we think of it differently, not as some kind of special category based around a set of specific technologies. For example, OCR (optical character recognition) software or speech recognition software today both typically make use of machine learning --- those are established, general-use product categories that get used every day --- but we typically don't call them "AI" in popular use in 2025. When I call my credit card company, say, and navigate a menu system that uses a computer using speech recognition, I don't say that I'm "using AI". Same sort of way that we don't call semi trucks or sports cars "horseless carriages" in 2025, though they derive from devices that were once called that. We don't use the term "labor-saving device" any more --- I think of a dishwasher or a vacuum cleaner as distinct devices and don't really think of them as associated devices. But back when they were being invented, the idea of machines in the household that could automate human work using electricity did fall into a sort of bin like that.
I'm a bit more pessimistic. I fear that the LLM-pushers calling their bullshit-generators "AI" is going to drag other applications down with it. Because I'm pretty sure that when LLMs all collapse in a heap of unprofitable e-waste and take most of the stock market with them, the funding and capital for the rest of AI is going to die right along with LLMs.
And there are lots of useful AI applications in every scientific field, data interpretation with AI is extremely useful, and I'm very afraid it's going to suffer from OpenAI's death.
The issue that I take with AI is that it's having a similar effect on ignorance that the Internet created, but worse. It's information without understanding. Imagine a high-school dropout who is a self-proclaimed genius and a Google wizard; that is AI, at least at the moment.
Since people imagine AI as the super intelligence from movies, they believe that it's some kind of supreme being. It's really not. It's good at a few things, and you should still take its answers with skepticism and proofread its results before copy/pasting them into something.
React sucks. I'm sorry, I know it's popular, but for the love of glob, can we not use a technology that results in just as much goddamn spaghetti code as its closest ancestor, jQuery? (That last bit is inflammatory. I don't care. React components have no opinionated structure imposed on them, just like jQuery.)
They should stop teaching the OSI model and stick to the DOD TCP/IP model
In the world of computer networking you are constantly hammered about the OSI model and how computer communication fits into that model. But outside of specific legacy uses, nothing runs the OSI suite, everything runs TCP/IP.
Understanding that other protocols are possible is important. Sure, reality doesn't fit neatly into the OSI model, but it gives you a conceptual idea of everything that goes into a networking stack.
So does the TCP/IP model and that is what systems actually use.
I don't discount your point, but coming from the support side, it'll cause confusion to call something a layer 6 issue rather than a layer 8 issue. Layer 0 issues won't change at least, especially as long as sales gets in the middle of project planning.
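For reference, the two models line up roughly like this; a quick Python sketch of the standard textbook mapping (the layer names are the conventional ones, the code itself is just illustration):

```python
# Rough mapping between the 7-layer OSI model and the 4-layer
# DoD TCP/IP model, as commonly drawn in networking textbooks.
OSI_TO_TCPIP = {
    "Application":  "Application",  # OSI 7
    "Presentation": "Application",  # OSI 6
    "Session":      "Application",  # OSI 5
    "Transport":    "Transport",    # OSI 4
    "Network":      "Internet",     # OSI 3
    "Data Link":    "Link",         # OSI 2
    "Physical":     "Link",         # OSI 1
}

# TCP/IP collapses OSI 5-7 into a single Application layer, which is
# why "it's a layer 6 issue" has no direct TCP/IP equivalent.
tcpip_layers = sorted(set(OSI_TO_TCPIP.values()))
```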
Is there anybody on Lemmy that isn't a software engineer of some description? No? Anyone?
Yes, me. I am a network engineer with an expired CCNA
I'm a geologist!
Carpenter
Just because I'm not in a technical job doesn't mean I'm not a technology user.
I'm a machinist.
Shoe Cobbler
My field is HR.
As for technical hills, I'm not sure.
Low voltage electrical engineer!
I'm not, although I do write some code.
I do (workplace) safety, compliance and hazardous waste handling.
I do gynecology as a hobby.
Musician and Audio Engineer (so my 9-5 is in IT)
I do data engineering, so to software engineers, the answer is no. To non-software engineers, they'd probably say yes
I was thinking the same thing.
otoh, the post does say 'technical'
Lots of trades are starting to use more modern equipment, to the slow embrace of older, more "established" tradespeople.
An example from a couple decades ago: when auto-leveling tiling systems were coming out, more "prestigious" companies refused to use them because they thought it was cheating. It's like a calculator, though: a tool to make your life easier.
Anyways, now you won’t catch tilesetters NOT using them.
I am a former Solutions Architect. So yeah, the only code I would ever write is some hacky scripting.
Paper pusher!
IT restrictions should be much more conservatively applied (at least in comparison to what's happening in my neck of the woods). Hear me out.
Of course, if you restrict something in IT, you have a theoretical increase in security. You're reducing the attack surface in some way, shape or form. Usually at the cost of productivity. But also at the cost of the employees' good will towards the IT department and IT security. Which is an important aspect, since you will never be able to eliminate your attack surface, and employees with good will can be your eyes and ears on the ground.
At my company I've watched restrictions getting tighter and tighter. And yes, it's reduced the attack surface in theory, but holy shit has it ruined my colleagues' attitude towards IT security. "They're constantly finding things to make our job harder." "Honestly, I'm so sick of this shit, let's not bother reporting this, it's not my job anyway." "It will be fine, IT security is taking care of it anyway." "What can go wrong when our computers are so nailed shut?" It didn't used to be this way.
I'm not saying all restrictions are wrong, some definitely do make sense. But many of them have just pissed off my colleagues so much that I worry about their cooperation when shit ends up hitting the fan. "WTF were all these restrictions for that castrated our work then? Fix your shit yourself!"
you will never be able to eliminate your attack surface, and employees with good will can be your eyes and ears on the ground.
All the good will in the world won't make up for ignorance. Most people know basically next to nothing about IT security, and will just randomly click shit to make the annoying box go away and/or get to where they think they want to go. And if that involves installing a random virus they'll happily do it, and be annoyed that it requires their password.
You pay me to admin 400 servers on a couple million dollars worth of hardware. Let me install a fucking app on my own machine without 4 levels of bullshit.
The IT admin at my previous job and I had this understanding, as I dealt with field hardware and he dealt with the "normal" IT stuff.
Once a merger caused the corporate requirement of only allowing whitelisted apps to run, my laptop was simply disappeared from the requirement list. It made it easier for the both of us. I could be on the other side of the world in sudden need of running some proprietary BS software that had to be whitelisted, and nobody wanted me to have to wake someone up to whitelist stuff.
When you deal with network hardware that cost more than most PCs, and the server clusters cost more than a house, some leeway should be allowed.
Until you install SolarWinds between 2015 and 2020......
A major part of that is, I think, that desktop OSes are, by default, insecure against local software. Like, you install a program on the system, and it immediately has access to all of your data.
That wasn't an unreasonable model in the era when computers weren't all persistently connected to a network, but now, all it takes is someone getting one piece of malware on the computer, and it's trivial to exfiltrate all your data. Yes, there are technologies that let you stick software in a sandbox, on desktop OSes, but it's hard and requires technology knowledge. It's not a general solution for everyone.
Mobile OSes are better about this in that they have a concept of limiting access that an app has to only some data, but it's still got a lot of problems; I think that a lot of software shouldn't have network access at all, some information shouldn't be readily available, and there should be defense-in-depth, so that a single failure doesn't compromise everything. I really don't think that we've "solved" this yet, even on mobile OSes.
Sure, but the reason isn’t always just security.
We have government contracts and want more. But to get those, they insist on us doing a bunch of security things.
So it sucks for the users, but if we don’t implement the restrictions, we lose the contracts and thus the income.
And as a side benefit, holy shit we are pretty secure. Next annual pentest soon and I’m expecting good things from it!
Workplace safety is quickly turning from a factual and risk-based field into a vibes-based field, and that's a bad thing for 95% of real-world risks.
To elaborate a bit: the current trend in safety is "Safety Culture", meaning "Getting Betty to tell Alex that they should actually wear that helmet and not just carry it around". And at that level, that's a great thing. On-the-ground compliance is one of the hardest things to actually implement.
But that training is taking the place of actual, risk-based training. It's all well and good that you feel comfortable talking about safety, but if you don't know what you're talking about, you're not actually making things more safe. This is also a form of training that's completely useless at any level above the worksite. You can't make management-level choices based on feeling comfortable, you need to actually know some stuff.
I've run into numerous issues where people feel safe when they're not, and feel at risk when they're safe. Safety Culture is absolutely important, and feeling safe to talk about your problems is a good thing. But that should come AFTER being actually able to spot problems.
I'm always in favour of actually testing safety stuff.
Does that fall-arrest line actually work? Go walk that way until you can't.
Can this harness hold you without cutting circulation off to your legs? Go sit in it for an hour and see.
The mining industry emphasizes safety culture, just like you said, and a lot of it is focused on wearing PPE.
There are still too many preventable deaths and accidents.
I think safety is talked about and vibe-based to please investors.
Any tolerance on a part tighter than +/- 0.001 isn't real. If I can change the size of the part enough to blow it out of tolerance by putting my hand on it and transferring some of my body heat into it, then it's just not real.
Technisation and standardisation are good for the EMS sector.
The whole "it was better when we could do what we wanted, and back then we only had real calls with sicker people, and everything was good" attitude is fucking awful and hurting the profession.
Look, you fucking volunteer dick, I know you've been doing this 10 years longer than me (and I've been doing it for 25 now), but unlike you I did it full-time and probably had more shifts in one year than you've had in your life. Now my back is fucked because back then there was no electrohydraulic stretcher, no stair chair, the ventilator was twice as heavy (and could do basically nothing), and the defibrillator weighed so much we often had to switch who carried it after two floors up.
And we had just as many shit calls, but got attacked worse, because the shit 2 kg radios had next to zero coverage indoors, and so did cellphones, which could leave you unable to even call for backup.
And of course we had longer shifts, needed to work more hours, and the whole job market was even more fucked.
"But we didn't need this and that, we looked at the patient." Yeah, go fuck yourself. MANY more people died or took damage from that. So many things were missed. And it was all accepted as "yeah, that's how life is."
So fuck everyone in this field and their nostalgia.
In the medical system here, there is a trend toward imaging and other tests but no actual examination of the patient.
I have a friend whose injury didn't look too bad on MRI. But a lesser scan (CT?) that they don't value as much showed the actual problem and confirmed the complaint. Our greater trust in the new hotness, and our discounting of the older tools we relied on before, even when the patient begs, is not a perfect solution.
It seems we could be doing both and getting a better understanding.
I totally agree with everything you say about the heavy tools and bad radios - family was in rural EMS, and the bodily wear and tear seems to be prevalent among all the old peers.
Cognitive behavioral therapy/dialectical behavioral therapy are not the universal cure for everything and they need to stop being treated as such
I'll join you on this hill, soldier.
CBT is the only one they've tested, and they tested themselves, and of course they look great. It offloads all success and failure 100% to the victim, and so many failures don't reflect on the process; ever. It resembles a massive sham.
My counsellor friend calls it "Six Sigma for mental health" and notes that it's often not covered by insurance (even outside America's mercenary system), so it's a nice cash cow for the indu$try.
If people used a language that actually leverages the strengths of dynamic typing, they wouldn't dislike it so much.
I encourage every programmer to build a Smalltalk program from the ground up while it's running the entire time. It really is a joy
Should also try programming in Rockstar so you can actually say you are a rockstar developer.
build a Smalltalk program from the ground up while it's running the entire time.
Lisp works this way too. An editor can provide completions, documentation, and the like by introspection of the running program. Experimental code can be tested immediately against live state.
I'm puzzled that this approach isn't more common.
I use, say, bash quite happily. But I will also come down pretty firmly on the side of static typing for large software packages. It lets software handle a bunch of rigorous checking that otherwise eats up human time.
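As a small illustration of the parent's point, here's the kind of mistake a static checker catches before the code ever runs (a hypothetical Python function checked with a tool like mypy; the names are made up):

```python
def total_bytes(chunks: list[bytes]) -> int:
    """Sum the sizes of binary chunks."""
    return sum(len(c) for c in chunks)

# A static checker (e.g. mypy) flags this call at check time, because
# a str is not bytes; no human review time needed to spot it:
#     total_bytes(["not", "bytes"])   # error: list[str] vs list[bytes]

# Correct usage passes both the checker and the runtime:
size = total_bytes([b"abc", b"de"])  # 5
```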
I fucking hate AI in HR/hiring. I try so hard not to spread my personal data to LLM/AI ghouls, and the moment I apply for a job I need to survive, I have to accept that the HR department's AI sorting hat now knows a shit ton about me. I just hope these are closed systems. If anyone from an HR department knows more, please let me know.
Hardly a hot take really...
Weird I haven't seen this one yet: the cloud is just someone else's computers.
Hardly a hot take really...
Your favorite AI enabled LLM does a very, very good job of simulating language tests based on previous tests and there's no reason at all not to use it to study and prepare.
It can write you a poem, it can't write you a play.
It can't write you anything that hasn't been written a million times before, but it can give you a paragraph and tell you to find the verbs, and then mark the exercise to a shocking level of accuracy. It can explain what you did wrong and give accurate examples and details. Then you can say "I don't get it...I still don't get it" a hundred times and it will try and try to explain it to you, endlessly, and it will never get frustrated or impatient.
More like it can write you a poem but it won't make you a poet.
Take the time to do it right the first time but also don't waste time if it doesn't add value.
Having a process is great but if the process exceeds the value then the process not only harms profit margins but also erodes morale. If the reason a process exists is to counter bad behavior then it's an employee problem not a process problem.
Open office floorplans are a terrible idea!
Work from home shouldn't be considered a given based on the job tasks, but a privilege and benefit extended to those employees who have shown the discipline and reliability to work from home. But the in-office requirement shouldn't be forced on everyone just to satisfy a "butts in seats" policy or a manager's insecurity.
People are idiots, and it's the designer's duty to remove opportunities for an idiot to hurt themselves, up to and just short of impacting function.
Snapshot tests suck. That's a test that stores the DOM (or I guess any JSON-serializable thing) and, when you run the test again, compares what you have now to what it has saved.
No one is going to carefully examine a 300 line json diff. They're just going to say "well I updated the file so it makes sense it changed" and slap the update button.
Theoretically you could only feed it very small things, but if that's the case you could also just assert on what's important yourself.
Snapshots don't encode intent. They make everything look just as important as everything else. And then hotshot developers think they have 100% coverage
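To make the contrast concrete, here's a hypothetical sketch in Python; `render_user` and its output shape are invented, the point is the two testing styles:

```python
import json

def render_user(user):
    # Hypothetical component output: a dict standing in for a DOM tree.
    return {"tag": "div", "class": "user",
            "children": [user["name"], user["email"]]}

# Snapshot style: serialize the whole structure and diff it against a
# stored blob. A 300-line diff here just gets rubber-stamped with
# "update snapshot" and nobody reads it.
snapshot = json.dumps(render_user({"name": "Ada", "email": "ada@example.org"}),
                      sort_keys=True)

# Intent style: assert only what the test actually cares about.
out = render_user({"name": "Ada", "email": "ada@example.org"})
assert "Ada" in out["children"]   # the name is displayed
assert out["class"] == "user"     # the styling hook is present
```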
Abilify is a beautiful long term maintenance med but wholly inappropriate for an acutely agitated and combative patient.
I don't think an aesthetics opinion counts as a technical hill to die on.
Don’t fucking paste content from a word doc into your IDE. Some people I work with think it’s a time saver.
Do it via an actual text editor like Notepad++ to clear out all the bullshit.
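If you'd rather automate the cleanup than round-trip through an editor, a small script can do the same scrubbing; this Python sketch handles the usual Word offenders (the character list is illustrative, not exhaustive):

```python
# Word's "smart" punctuation that breaks source files: curly quotes,
# dashes, and non-breaking spaces, mapped back to plain ASCII.
SMART_CHARS = {
    "\u2018": "'", "\u2019": "'",   # curly single quotes
    "\u201c": '"', "\u201d": '"',   # curly double quotes
    "\u2013": "-", "\u2014": "--",  # en dash, em dash
    "\u00a0": " ",                  # non-breaking space
}

def clean_paste(text: str) -> str:
    """Replace smart punctuation with its plain-text equivalent."""
    return text.translate(str.maketrans(SMART_CHARS))

cleaned = clean_paste("\u201chello\u201d\u00a0\u2013 world")
# cleaned == '"hello" - world'
```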
Professionally: Waterfall release cycle kills innovation, and whoever advocates it should be fired on the spot. MVP releases and small, incremental changes and improvements are the way to go.
Personally: Don't use CSS if tables do what you need. Don't use Javascript for static Web pages. Don't overcomplicate things when building Web sites.
A dirty hack that exists now is infinitely better than a properly developed tool that has gone through all stages of approval and quality control at some theoretical point in the future.
My shitty report.pl script was heavily frowned upon when I put it on the production servers. Not only was it an undocumented script, but there was going to be a "proper" tool for that soon. Well, the proper tool never arrived and now three years later everyone is using my script because we are all too lazy to compile a list of warnings manually.
I always tell people "There is nothing more permanent than a temporary hack that works". It also means that sometimes I will refuse the hack because I know it will become permanent.
I think it depends on how dirty it is and how easy it is to replace. A decent solution now is better than waiting for the perfect solution that may never come.
Your Polish real estate maintenance script? 😆
A perl script which collects some basic stuff from a few servers in a cluster to get some statistics and compile a quick overview of things to keep an eye on.
Hack it together now, but then actually run the proper fix. Don't leave dirty hacks in production or test.
"installing a library" should not exist as a concept. A library is either so essential that the OS needs it (and therefore it is already installed), or is not essential enough that each program can have its own copy of the library.
"But I want all my 3 programs that use this random library to be updated at the same time in case a security flaw is found in it!" is no excuse for the millions of hours wasted looking for missing dependencies or dependencies not available for your system. If that library does have a security vulnerability, your package manager should just find your 3 programs that use it and update their copy of the library.
each program can have its own copy of the library.
Efficiency out the window...
I don't care about 10KB or even 100KB of disk space per installed program if it saves humanity the collective millions of hours wasted on .dll/.so issues.
If your program needs libcirnfucb to run, it should be in the same directory as your program, and you are responsible for putting it there for me. No other program in my computer needs libcirnfucb, there's no efficiency gains and now I have to go to some random website from the 90s and find where they put the damn download link and now I have to learn all about how libcirnfucb manages their versions and if I am in the correct webpage, because the project is abandonware that was formed 10 years ago and now it is in another 90s looking website that has a name completely unrelated to libcirnfucb.
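The update scheme described above can be sketched in a few lines; everything here (including `libcirnfucb` and the version numbers) is made up for illustration:

```python
# Sketch of the proposal: the package manager tracks which installed
# programs bundle which library, so a vulnerability in one library
# triggers an update of every program that ships a copy of it.
BUNDLED_LIBS = {
    "mailer":   {"libcirnfucb": "1.2.0", "zlib": "1.3"},
    "editor":   {"libcirnfucb": "1.1.9"},
    "terminal": {"zlib": "1.3"},
}

def programs_needing_update(lib, fixed_version):
    """Return programs whose bundled copy of `lib` predates the fix."""
    return sorted(
        prog for prog, libs in BUNDLED_LIBS.items()
        if lib in libs and libs[lib] < fixed_version
    )

affected = programs_needing_update("libcirnfucb", "1.2.1")
# affected == ["editor", "mailer"]
```

(A real implementation would compare versions properly rather than as strings, but the bookkeeping is the point.)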
Efficient code beats easy code, regardless of resources.
Electric vehicles are not a solution for environmental problems, they pollute when building the batteries and, unless nuclear energy is widespread, they will be powered by coal/gas making them pretty polluting.
Bonus: people should stop being lazy and learn to setup a server infrastructure instead of using "the cloud". Your data are safer, you save money and give less power to gargantuan cloud companies.
Weren't there multiple studies concluding that even an EV powered by a coal plant is better for the environment than an ICE vehicle?
If you mean gas, yeah, probably (diesel generators powering electric motors are used in big ships); coal, I highly doubt it.
But apart from pollution per se, an electric car used every day would require at least 50% of a household power budget to charge (2-3 kW). If every single ICE vehicle were immediately swapped for an electric one, I doubt many countries would be able to cope with the increased power consumption. That's why we need more energy infrastructure before a full switch. Or, you know, fewer cars and more public transport.
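A quick back-of-envelope check on that power-budget claim (all figures here are round-number assumptions, not measurements):

```python
# Rough daily charging load for one commuter EV.
daily_km = 40            # assumed typical daily driving
kwh_per_km = 0.18        # assumed mid-size EV consumption
charge_hours = 8         # assumed overnight charging window

daily_energy = daily_km * kwh_per_km               # ~7.2 kWh/day
avg_charge_power_kw = daily_energy / charge_hours  # ~0.9 kW sustained

# Spread over the night, the average draw is under 1 kW, but a typical
# home wallbox charges at 3.7-11 kW, so the *peak* load per car is
# indeed in the 2-3 kW-plus range the comment above worries about.
```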
Bonus: people should stop being lazy and learn to setup a server infrastructure instead of using “the cloud”. Your data are safer, you save money and give less power to gargantuan cloud companies.
If change happens here, I'm pretty sure that it's going to be in the form of some sort of zero-administration standardized server module that a company sells that has no phone-home capability and that you stick on your local network.
Society probably isn't going to make everyone a system and network administrator, in much the same way that it's not going to make everyone a medical doctor or an arborist. Would be expensive to provide everyone with that skillset.
My "everyone" was a bit too wide I think. I'm not talking about everyday people of course. I'm talking about 50+ employees companies, that would save money by hiring a sysadmin and running their own servers. I know of companies with thousands of employees that pay millions on Azure and AWS and have no in-house infrastructure. That's how you get to Amazon running half of the internet
Software engineering: Don't script stuff! Seriously, just stop, it's a huge waste of everybody's time. I don't fucking care if you think mygitwrapper.sh is the bomb. I want you to know your git commands by heart or just learn to use the damn history on your terminal.
Scripting is only allowed if it's part of the project's infrastructure. Stop faffing about.
The problem isn't mygitwrapper.sh. You need to know those git commands to write that wrapper. The problem is people taking someone else's mygitwrapper.sh and using that instead of learning git.
I had this fight at work once. Someone wanted to write a makefile to invoke pytest. I didn't want to do that because I wanted people to know how pytest works, so when something goes wrong they know they can do -vv or --pdb or whatever.
Scripts that cover trivial steps and obscure stuff people should know, I'm not a fan of.
Are you a fan of curl?
https://justuse.org/curl/
Best lib on the planets.
Agree but I’d add “unnecessarily” or something, because yeah many common aliases and smaller convenience functions offer meager cumulative time savings in trade for the skill atrophy, but script files can also contain seriously lengthy and/or complex logic that would simply be counterproductive to attempt typing line-by-line into a terminal without any mistakes, especially for scripts that are run often.
Every day, as I turn on the computer, I have to open a VM. I could go find and open VirtualBox, then select the VM and start it. Why would I do that when I can open the terminal and run a script that does it in a single action? Then I have to SSH into that machine, which always has the same IP. Why should I have to type the IP every time?
Scripts are good when used correctly. I don't need to know which vboxmanage command to run to do whatever I want; I just needed to search it once and then remove it from my brain.
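Something like the workflow described above, sketched in Python; the VM name and IP are placeholders, though `VBoxManage startvm` is the real VirtualBox CLI:

```python
import shlex

# Placeholders standing in for the real VM and its fixed address.
VM_NAME = "dev-vm"
VM_IP = "192.168.56.10"

def startup_commands():
    """The two commands the daily routine boils down to."""
    return [
        ["VBoxManage", "startvm", VM_NAME, "--type", "headless"],
        ["ssh", f"dev@{VM_IP}"],
    ]

# In the real script each command would be run via subprocess.run();
# here we just show what would execute.
for cmd in startup_commands():
    print(shlex.join(cmd))
```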
Kinda unrelated but:
I don't think you should know how to do everything from the git CLI. For 99% of use cases, the IDE already knows how to do what you want to do, with a simple button. For the rest, just search for the git command when you need it.