To add to this, I'd like standardization of qualifications and competencies - kind of like a license, so I don't have to "demonstrate" myself during interviews.
I hate being in a candidate pool where everyone has a degree and experience, we all go through a grueling interview process on college basics, and the "best one" gets picked. The company says "our interview process works great, look at the great candidates we hire." Like, duh, your candidate pool was already full of qualified engineers with degrees and experience - what did you expect to happen?
I'm betting you aren't involved in hiring? The number of engineers I've interviewed with graduate degrees from top universities who are fundamentally unable to actually write production quality code is mind-boggling. I would NEVER hire somebody without doing some panel with coding, architecture/systems design, and behavioral/social interviews.
> To add to this, I'd like standardization of qualifications and competencies - kind of like a license so I don't have to "demonstrate" myself during interviews.
I strongly disagree. There is already a standardization of qualifications and competencies in the form of cloud vendor certifications. They are all utter bullshit and a huge moneygrab, and do nothing to attest to someone's experience or competence.
Certifications also optimize for the wrong metric: they validate a "papers, please" attitude towards recruitment instead of actually demonstrating competence, skill, and experience.
Also, certifications validate the parasitic role of the IT recruiter, the kind who bars candidates for not having decades of experience in tech stacks they can't even spell and that were released just a few months ago. Relying on certifications empowers parasitic recruiters to go from clueless filterers to outright gatekeepers, and in the process validates business models built around circumventing their own certification requirements.
We already went down this road. It's a disaster. The only needs this approach meets are ladder-pulling by incompetent people who paid for irrelevant certifications, and having a legal mechanism to prevent extremely incompetent people from practicing - and the latter serves absolutely no purpose in software development.
Attention to and awareness of the ways in which modern technology is harming us.
We're providing people with the electronic equivalent of heroin, from a young age, completely rewiring our brains and detaching us from nature and each other.
The statistic that ~90% of American teens own an iPhone was shocking to me. It makes me think that from a young age, children are taught not to question but to just accept their cage. If closed source is all they grow up with, open source will be foreign to them. And in a way that's worse than growing up with Windows, which doesn't completely lock you in.
This! I feel it myself: my ADHD was much better when I stayed for a few weeks in a relatively natural setting with very little technology (I did some programming there, though, and boy was I focused on complex problems without medication - one of my best coding sessions, I think). I'm pretty sure that a lot of the ADHD, but also other psychiatric issues like autism or social anxiety, diagnosed these days is because of the unhealthy environment we have created. In other words, our modern technology promotes psychiatric issues such as ADHD, autism, and social anxiety.
Way more stuff publicly funded with no profit motive
Severe sanctions on US tech giants all around the world, with countries building up their own workforce and tech infrastructure. No more East India Company bullshit.
For the hell of it? Because they're inherently evil? Protectionism aimed at developing local industry?
I’ve worked for a few, but not the consumer giants most people think of. I haven’t found them evil, and they support employees across the world.
I’ll go even further with developing countries in particular. From my perspective, entire software industries were built on multi-national funding, and we still pay better than local companies. The biggest change over the last decade or two has been switching models from cheapest outsourcing to employing local talent everywhere
If you live in the western sphere, the US tech giants control half of your critical infrastructure and invade every aspect of your personal and professional life. If you live outside the US, they do not answer to you or to anyone you can vote for. They lean on your government for permission to turn your whole existence into a series of transactions, and then extract as much value as possible from each one. The money doesn't swirl around your community making everyone richer. Instead, 5% goes to pay a few nice salaries in your biggest city, and the rest gets funneled straight out of the country and into California.
Even Europe - their imperial mentor and favourite uncle - is treated like shit. Europe built half of their technology but controls none of it. There is not a single European tech giant. Every last one is American, with extensive ties to the US government and security apparatus.
More focus on the ability to maintain, repair, and perhaps even upgrade existing tech. So often people are pushed to upgrade constantly, and devices aren't really built to last anymore. For example, those yearly trade-in upgrade plans that cell phone providers do. It sucks knowing that, once the battery in my cell phone finally dies, the whole phone is essentially garbage and has to be replaced. I miss my older smartphones that still had replaceable batteries, because at least then it's just the battery that's garbage.
We're throwing so much of our very limited amount of resources right into landfills because of planned obsolescence.
I think the solution to this will come by itself: the supply chain will break down and people will have to learn to make do with what they have. It was like that in the Soviet Union, is like that in some parts of the world right now, and can easily return if we don't get climate change in check.
> once the battery in my cell phone finally dies, the whole phone is essentially garbage
I don’t get this. I understand they aren’t user replaceable but surely you can get it replaced? Given how good batteries are, they easily last 2-3 years. iPhones are supported for 5-6 years so you only ever need one replacement
Getting my iPhone battery replaced has typically cost about $75, not all that different from a decade ago spending $35 for a user replaceable battery for a flip phone
One major difference now is that at least iOS gives me a good measurement of battery health, so I can make data-driven decisions.
I realize most people would rather not pay for a service they currently have for free (which is partly due to the lack of transparency regarding our data usage).
It's possible that a donation-based society might work. However, I'm not sure how that can be achieved in parallel with a profit-based society (the one we mostly have to take part in).
IMO one way is to force the issue by making certain methods of profit impossible or not worth it in the long run. Something like "don't use it? you lose it" in terms of patents or proprietary solutions. For example, if a company stops producing and supporting something, then it has to release the designs, code, and intellectual property to the public.
Have developers be more mindful of the e-waste they're contributing to by indirectly deprecating CPUs when they skip over portions of their code and say "nah, it isn't worth it to optimize that thing + everyone today should have an X-core CPU / Y GB of RAM anyway". Complacency like that is why software that is as simple in functionality as equivalent software was one or two decades ago is 10 times more demanding today.
Yes!! I enjoy playing with retro tech and was actually surprised by how much you can do with an ancient Pentium II machine, and how responsive the software of the time was.
I really dislike how inefficient modern software is. Like stupid chat apps that use more RAM while sitting in the background than computers had 15-20 years ago...
Software obesity is a real thing. I think it has to do with developer machines being beefy: if you write something that runs fine on yours and don't have a shit machine to test it on, you don't know just how badly it actually performs.
But it also has to do with programming languages. It's much, much easier to prototype in Python or JavaScript, and often the prototype becomes the real thing. Who really has time (and/or money) to rewrite their now-functional program in a more performant language?
IMO there doesn't seem to be a clear solution.
I don't think the languages themselves are even the problem; it's the toolchain. While you can certainly design more performant systems if you go back to C or whatever, I think the problem overall stems from modern toolchains being kind of ridiculous. It is entirely common, in any language, to load in massive libraries that suck up hundreds of MB of RAM (if not gigs) to get a slightly nicer function to lowercase text or something.
The other confounding factor is "write once, run anywhere", which in practice means there is a lot of shared code that does nothing on your machine. The most obvious example is Electron. Pretty much all of the Electron apps I use regularly (which are mostly just Discord and Slack) are conceptually simple apps whose analogues used to run in a few hundred MB of storage and tens of MB of RAM.
Oh, one other sidenote - how many CPUs are wasting cycles on things that no one wants, like extremely complex ad tracking, data mining, etc.?
I know why this is the case, and ease of development does enable us to have software that we probably otherwise wouldn't, but this is a real blight on modern computing, and I think it's solvable. Probably the dumbest idea, but translation layers that produce platform-native code could be vastly improved. Especially in a world where we have generative AI, there has to be a way to say "hey, I've got this JavaScript function, I need this to work in Kotlin, Swift, C++, etc."
The death of the device and the return of the system.
A device is a sealed thing provided on a take it or leave it basis, often designed to oppose the interests of the person using it. Like hybrid corn, a device is infertile by design: you cannot use a device to develop, test, and program more devices.
A system is a curated collection of interchangeable hardware and software parts. Some parts are only compatible with certain other parts, but there is no part that cannot be replaced with an alternative from a different manufacturer. Like heirloom seeds, systems are fertile: systems can be used to design and program both other systems and devices.
A system is a liberatory technology for manipulating information, while a device is a carceral technology for manipulating people.
Pretty much anything that's only available via an app store. The difference with web apps is that I can also use them on a laptop/PC, and I have a bit more control over tracking (by using ad/tracking blockers).
Probably less elitism. "Oh you build it in x language? Well that's a shit language. You should use y language instead. We should be converting everything to y language because y language is the most superior language!"
(If this feels like a personal attack, Rust programmers, yes. But other languages as well)
Well sure, it depends on the context. If it's a shitpost on /c/programmer_humor, whatever, meaningless banter.
If it's a serious question (maybe from a beginner) asking how to do something in their language, and the response is "it would be a lot easier in y language", I don't think that's particularly helpful.
As someone who's quite vocal about my support for Rust, I can definitely see how it can go overboard.
But on the other end of the spectrum, saying that all languages are just as good or capable and it doesn't matter which one you use is definitely wrong. There are meaningful differences. It all comes down to what your needs are (and what you/your team knows already, unless you're willing to learn new stuff).
Yea, I kept my original comment language-agnostic (just referring to it as y language), but added the extra wink to Rust because they generally seem to be the worst offenders.
I have years of experience in loads of languages: PHP, Ruby, Java, Python, C#, C++, Rust - and that's probably how I'd order the level of elitism. PHP devs know everything they're doing is shit; Python devs should probably be next in the ranking of how shit they are, but they're not self-aware enough (sarcastic elitism aside here).
Anyways, besides that - at the far end of the elitism spectrum there seems to be Rust. Someone like me says something about Rust in a general, unrelated-to-Rust thread like this, a Rust enthusiast sees it, and it just devolves into a dumbass back-and-forth about how good Rust is.
But, I get what you're saying. I usually filter out this bullshit (because I'm a Rustacean myself 😜) but this doesn't mean that it is as easy for someone else as it is for me.
The cargo culting is always going to happen and turn into elitism. But it stems from real advantages of specific technologies, and sometimes you should actually consider that the tech you're using is irresponsible when better alternatives exist.
Data is a part of a person's individual self. Storing such data about another person is owning a part of that person. It is slavery for exploitation and manipulation, and it is wrong.
This distinction is the difference between a new age of feudalism with all of the same abuses that happened in the last one, and a future with citizens and democracy.
Never trust anyone with a part of yourself. Trust, no matter how well initially intentioned, always leads to abuse of power.
Honestly, just less waste. Wasted time, wasted hardware, etc. We spend so much time building devices that are meant to break and be unfixable, and making software that fights the user instead of helping. All in the name of profits or something.
We could be making so many cool things, but instead we're going back and forth not making any progress.
I’m not sold on user replaceable phone batteries, but USB-C was a long time coming.
I just wish they had moved faster on USB standardization - I’m trying to switch but my phone and Kindle are my only USB-C devices. Either I need to waste functioning products by updating everything else or I still need chargers for older stuff back to mini-USB. It’d be nice to standardize on USB-C charging blocks but even that would mean buying new cables or adapters for four different USB form factors
User first, non-profit software companies.
To maximize profits, software keeps sacrificing the user's happiness. I want to stop having the argument where I say the user would want X, only to hear that we can't do that because it would hurt profits.
Boot out corporate shitware, boot out adverts, and stop collecting data unless it is absolutely necessary, or alternatively just cancel the fucking product and don't do it.
Personally, I'm just sick and tired of modern UI design. Bring back density, put more information on the screen, eliminate the whitespace, use simple (and native!) widgets, get rid of those fucking sticky headers, and so on.
That's in addition to all the software freedom stuff, and so on. I also wish the GPL were more popular.
Yeeees, why do modern websites have so much horizontal whitespace? That 3-column design where 2 columns are empty. Just... why? Luckily Firefox has a reader mode. Makes news websites much more bearable.
It's inefficient, and there are many alternate layouts that are "better". I feel like AI is going to give us auto-fill that makes keyboard efficiency less important, though.
A friend of mine got asked if she had a boyfriend. She asked back "why that question". It was to know whether she would be likely to get pregnant and miss work.
I'm curious, are you in the USA? Working in Western Europe, so far I have never seen sexism (nor racism) happen at work. Outside of it, for sure though.
Are you a guy, by any chance? I also hadn't noticed until the day I asked a couple of my women colleagues. Turns out it can be very subtle but "effective". And it can also come from women.
As a guy, my manager kicks ass and we’re all extra motivated to make her look good. She used to be a peer but once she became manager, her true skills shone
I'll go further and say some sort of OpenStack-like thing should be mainstream. Why shouldn't home computers be able to deploy cloud-like services by default?
AFAIK, the way it currently works is by calling out via JavaScript. Ironically, the way strings are handled in the browser is also a major performance blocker, with Rust at least.
Accessibility and internationalization first. A lot of projects start without it and tack it on later. It's so much better to have good roots and promote diversity and inclusivity from the start.
Could you elaborate on what context and to what extent? I can agree that bigger companies with large user bases should have a focus on accessibility and internationalization -
But generally a lot of projects start with just one dev solving a problem they have themselves and making their solution open source. Anecdotally, I'm dumping solutions on GitHub that are already barely accessible to anyone somewhat tech-illiterate. No one is paying me anything for it. Why would I care whether it's accessible or internationalized for non-English speakers?
Internationalization isn't about the translation. It's about not hard coding the strings that display. Putting them somewhere that is easy to swap out would allow users to provide their own if they wanted.
As a solo developer, some things are out of scope, like writing translations or ensuring full compliance with accessibility standards. What's important is to have some knowledge of what blocks progress in these areas - for example, not treating all strings like ASCII, or preferring native widgets/HTML elements since those better support accessibility tools.
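To make the "don't hard-code strings" point concrete, here's a minimal sketch; the locales/ path, JSON format, and the "greeting" key are all made up for illustration, not a prescription for any particular i18n library:

```python
# Hypothetical sketch: look UI strings up by key from a per-locale catalog
# instead of baking English text into the code, and always read as UTF-8.
import json
from pathlib import Path

def load_catalog(locale: str) -> dict:
    """Load a translation catalog such as locales/de.json, or an empty one."""
    path = Path("locales") / f"{locale}.json"
    if path.exists():
        # Never assume ASCII; translators will hand you non-Latin scripts.
        return json.loads(path.read_text(encoding="utf-8"))
    return {}

CATALOG = load_catalog("de")

def t(key: str, default: str) -> str:
    """Return the translated string for `key`, falling back to the default."""
    return CATALOG.get(key, default)

print(t("greeting", "Hello, world"))  # prints the German string if a catalog exists
```

The point is just that once strings live behind a lookup, someone else can drop in a catalog later without touching your code.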
I love how we have free-to-use licenses (MIT, GPL, CC, etc.), and it would be really great to see the same idea used with terms of service and privacy policies! How great would it be to quickly see that a site uses a fair ToS and to understand what it includes? Maybe this would also nudge (at least smaller) companies toward not being horrible privacy-invading monsters.
That would probably be pretty hard, considering every service is different. Google Drive stores your data, so their ToS probably says you can't store pirated content, but that wouldn't make sense for most other services that you can't upload stuff to.
The disappearance of all these tech peacocks and web turkeys who focus on their number of followers and the quantity of talks rather than quality. The dev rel advocates made the atmosphere toxic
Stop forcing updates on the lower-level stuff that make people spend billions maintaining code. This way, we could return to a world where you can just buy software and use it for years without some update borking it.
Also outlawing financially motivated (i.e. greedy) retroactive ToS changes.
Any sort of "contract" with the user, including ToS, licensing agreements, etc. These consistently violate contract law since it's not a negotiation between peers, you don't have an opportunity to read before purchasing, and there are no direct quid pro quos for what you're giving up. By all rights these should be unenforceable.
Fucking always-on connectivity and security problems caused by it are the main reason why things can't just work. You need to be updated or else.
I visited a friend not that long ago and he was still using Windows XP, The Bat, and Opera around version 9. He knew every keyboard shortcut because he didn't have to relearn them every few years. He never got hacked; I just wonder when his bank will stop working because of TLS incompatibilities.
I mean it did change for a very good reason. Stuff gets hacked because everyone is online always. In "the good old days" it wasn't a problem because people weren't really online so there was pretty much zero risk of old software being used to exploit your machine. These days? It's a liability to have old stuff on your phone because someone could exploit it to steal stuff from a large number of users.
Small security updates when necessary would be fine, but all the time I just see software (especially with the web) be like, we're deprecating these features (that millions of websites use).
On the internet, more open standards and community-driven stuff. It's currently really, really annoying that on my Mastodon there are a lot of people sharing Bluesky codes, as if that's not just punting the ball for another couple of years. Although this will hopefully be a better outcome than the straight-up silos of old social media, the fediverse should still be the default way we think about connecting humanity (or something like it - the underlying tech isn't really that important). Also, far more things should just cost a dollar a month or whatever instead of having a massive amount of privacy-invading, user-experience-destroying ads.
In software in general, more privacy. It should be assumed that unless I explicitly opt in, my data is just that: mine. This is a tricky one because I remain hopeful about generative AI, and that needs data to improve the models, but I'm leery of sharing my data with it because so far the more pedestrian uses of data mining have not gone toward things I can really support. I remain extremely leery of GAI that isn't explicitly open source and can't be generally understood.
On the hardware side, computers have mostly been good enough for a while now. Tech will always get better, but I would like to see more of a focus on keeping working devices useful. At some point a technology product stops being practically useful because it can't run modern software, but we're leaving a lot of shit behind where that's not the case. Just about any device with an SSD and a processor from the last 10 years (including phones!) should be able to be easily repaired, supported longer, and, once support ends, opened up for community support.
FYI, the Bluesky protocol is open and there are plans to standardize it. It's also federated (the sandbox network is open to third parties).
There are lots of new privacy techniques from cryptography; stuff like differential privacy and MPC could help a lot with making collaboration tools easier and safer to use.
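For a sense of how simple the core idea of differential privacy can be, here's a minimal sketch of answering a counting query with Laplace noise; the function name and the epsilon value are arbitrary, and this is an illustration, not a production mechanism:

```python
# Hypothetical sketch: release a noisy count instead of the exact value.
import random

def dp_count(true_count: int, epsilon: float = 0.5) -> float:
    """Return a noisy count satisfying epsilon-differential privacy.

    A counting query has sensitivity 1, so Laplace noise with scale
    1/epsilon suffices. Smaller epsilon means more noise, more privacy.
    """
    scale = 1.0 / epsilon
    # The difference of two exponentials is Laplace-distributed with that scale.
    noise = random.expovariate(1.0 / scale) - random.expovariate(1.0 / scale)
    return true_count + noise

print(dp_count(42))  # e.g. ~44.7: close to the truth, but plausibly deniable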
I am skeptical of Bluesky. First, it's led by Jack, and we've already seen how that goes. Second, there isn't really a good technical reason for it to exist as its own protocol beyond the fact that they want to control it: Fedi/Mastodon was already there, and they could just as easily have contributed the things they wanted to that - they just wouldn't have had full control. As with Threads' promise to federate, I will be somewhat surprised if they ever do it.
Were Bluesky/Threads not corporate efforts, I have a feeling they would have followed a similar pattern to the fediverse - build the protocol and release that, then the clients follow. Bluesky still isn't federating even with its own protocol, and Threads isn't either. Given that this is stuff tiny teams with far, far fewer resources than the corps have accomplished, it's a little wild that neither has gotten there.
Especially with Bluesky, there doesn't seem to be a stated plan for how it's going to make money. And we're talking about a lot of the same people who destroyed the Twitter API and started locking things down even before Elon killed it completely - and now they're trying to convince us that they're pushing for an open environment.
Same thing for companies that go out of business. When you pay for something, there's a (sometimes tacit) agreement that bugs will be fixed. At least this would allow companies/users to do that themselves when needed.
I want my devices to run on an OS/framework which allows everything to be scriptable. Data should be visualized using a simple, consistent interface.
There would be events, actions, variables, data streams, etc., and the operating system should provide an easy interface to quickly create new programs which can:
- visualize data streams (filterable) using simple, configurable interfaces
- create scripts which define custom events or custom actions built on top of existing events/actions.
In such a system, the focus of apps should not be to add fancy interfaces for simple things, but to register new events, actions, data streams, and visualizers into the OS, and maybe provide new templates to use these additions - something like the sketch below.
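A minimal sketch of what that registration interface could feel like; everything here (Registry, register_action, the "mail.received" event) is invented for illustration, not part of any real OS API:

```python
# Hypothetical sketch: apps register named events/actions, user scripts wire them up.
from collections import defaultdict
from typing import Any, Callable

class Registry:
    """Toy stand-in for the imagined OS-level event/action interface."""

    def __init__(self) -> None:
        self._actions: dict[str, list[Callable[[Any], None]]] = defaultdict(list)

    def register_action(self, event: str, action: Callable[[Any], None]) -> None:
        # An app or a user script attaches an action to a named event.
        self._actions[event].append(action)

    def emit(self, event: str, payload: Any) -> None:
        # The "OS" fires the event; every registered action sees the payload.
        for action in self._actions[event]:
            action(payload)

registry = Registry()

# A mail app would register the event source; a user script adds a custom action.
registry.register_action("mail.received", lambda msg: print(f"Notify: {msg['subject']}"))
registry.emit("mail.received", {"subject": "Hello", "body": "..."})
```

The interesting part is that the app only contributes events and actions; what happens when they fire stays in the user's hands.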
I suppose... but when you have frameworks like Angular that update every 6 months, even the best efforts at backwards compatibility fall by the wayside.
Awareness of, if not prioritization of, energy efficiency. We rarely give any thought to services that run 24/7 even though they're only used 8/5, and we don't care how many CPU cycles a process uses even when it has no SLO for runtime. Most companies probably think it's a choice between dumping millions of carbon units into the atmosphere or becoming Luddites.
Developers should go back to writing efficient code in lower-level programming languages and stop wasting CPU cycles for stupid reasons, like not wanting to use types, or something even more stupid than that.
For personal projects and prototypes I believe it's fine, but when you consume the electricity of mid-size countries just because you prefer to write your production code in convenient languages, don't lecture others about ecology and climate change (I am not referring to you).
Less consumerism, more focus on real social aspects:
Macro: a robust (decentralized) political system that's not easily corruptible, e.g. via something like a blockchain
Micro: more focus on direct interaction with other people, not via something like a screen. As another post here already said, we're harming ourselves (promoting psychiatric issues, etc.) with the current state of technology (smartphone overuse). We have become much less social (in terms of direct interaction with others) because of this, I'm sure of it.
Actually, that's one of the few cases where a (distributed/decentralized) blockchain really makes sense (a trustless ledger which can be used for incorruptible/transparent political systems)...
We could end the era of the developer as a specialized caste. Our tools should be powerful enough that they allow people with problems to collaborate on software to solve those problems, without having to let that become their full time job.
I think there's a step in between: forcing proprietary solutions (hardware, software, designs, ...) to be open-sourced once they aren't maintained or supported anymore.
@onlinepersona I'd like to see less skinny-jeans male faux feminists in tech and more big-chested manly men. I would also like to see less dishonest feminized virtue-signalling from soydev bugmen about what great feminist egalitarians they are. We all know that's bullshit.