
What is something (feature, modes, settings...) you would like to see become a standard in video games?

I've been thinking about making this thread for a few days. Sometimes I play a game that has some very basic features that are just not in every other game, and I think to myself: "Why is this not standard?!" So I wanted to know what yours are.

I'm talking purely about in-game features. I'm not talking about wanting games to have no microtransactions or to be launched in an actually playable state, because, while I agree that problem is so large it's basically a selling point when it's absent... I think it's a different subject and not what I want this to be about, even if we could talk about that for hours too.

Anyway. For me, it would simply be this. Options. Options. Options. Just... give me more of those. I love me some more settings and ways to tweak my experience.

Here are a few things that immediately jump to my mind:

  • Let me move the HUD however I want it.
  • Take the Sony route and give me a ton of accessibility features, because not only is making sure everyone can enjoy your game cool, but at the end of the day these are not just accessibility features; they're just more options, and I often make use of them.
  • This one was actually the thing that made me want to make this post: for the love of everything, let me choose my languages! Let me pick which language I want for the voices and which language I want for the interface separately; don't make me change my whole Steam or console language just to get those, please!
  • For multiplayer games: let people host their own servers, just like it used to be. I'm so done with buying games that will inevitably die in five years, with no way of ever playing them again, because the company behind them shut down the servers. (Oh, and on that note, bring back server browsers as an option too.)

What about you? What feature, setting, mode or whatever did you encounter in a game that instantly made you wish it were in every other game?


EDIT:

I had a feeling a post like this would interest you. :3

I am glad you liked this post. It's gotten quite a lot of engagement, much more than I expected, and I expected it to do well, as it's an interesting topic. I want you to know that I appreciate all of you who took the time to interact with it. You've all had great suggestions for the most part, and it's been quite interesting to read what is important to you in video games.

I now have a newfound appreciation for some aspects of games that I completely ignored, and there are now quite a lot of things that I want to see become standard too. Especially for those of you who have trouble with accessibility: features like text being read aloud are not common enough.

Something that keeps on popping up is indeed more accessibility features. It makes me think we really need an online database for games that details, and allows filtering by, the types of accessibility features they have, as some features are quite rare to see but also kind of vital for some people to enjoy their games. That way, people wouldn't have to buy a game or do extensive research to see if it covers their needs. I'm leaving this here in the hope that someone smarter than me, with the knowledge of how to do this, could work on it. Or maybe it already exists, in which case I invite you to post it. :)

While I did not answer most of you, I did try and read the vast majority of the things that landed in my notifications.

There you go. I'm just really happy that you liked this post. :)

289 comments
  • Final Fantasy XVI's Active Time Lore. Being able to pause the game and have a list of relevant characters, places, and concepts for the scene you're in is so helpful for my ADHD, for when I take a break from a game and come back not knowing what's going on. I want to see this in every story heavy game.

    • oooooh i love that. Like Amazon Prime Video's X-ray feature (which i really wish other streaming services would adopt).

  • Story mode / Infinite lives / invincibility modes.

    Difficulty should not be a barrier to entry. I like how Insomniac games like Ratchet and Clank, and to a lesser extent Spider-Man, offer a really easy mode for those who just want to blast away or swing around New York.

    • I bought FFXVI on launch day and decided to go with the story difficulty. Best decision ever, and such an interesting way to do it. You basically get these special rings that make aspects of the game easier, like dodging and attack timing. You can always unequip them if you want to try the game with harder mechanics. The rings also take accessory slots, of which you only have 3, so you have to consider things like "Do I want this agility boost? Or my time-stop dodges?" It's interesting to trade game nerfs for stats or other effects.

      But yeah. Story modes are great. I played Horizon on easy. Had a blast and didn't get frustrated.

    • One of the worst arguments I had online was me saying that's great in single player but not unilaterally in multiplayer, and people got mad. I still think about it sometimes.

      But generally, yeah, agreed. Caves of Qud added a roleplay mode so dying sends you back to town instead of forcing a new game, and it's real nice even if it's not the traditional roguelike way.

      • I think that part of the problem in the case of Caves of Qud is that traditionally, the roguelike genre was aimed at having relatively-quick runs. So losing a run isn't such a big deal. Your current character is expendable. But many roguelike games -- like Caves of Qud -- have, as they've gotten ever-bigger and gotten ever-more-extensive late games, had much, much longer runs. Cataclysm: Dark Days Ahead can have a character easily last for weeks or even months of real time. If you sink that much time into a character, having them die becomes, I think, less-palatable to most players. So there's an incentive to shift towards the RPG model of "death is not permanent; it just throws you back to the last save".

        Just as some roguelikes have had longer runs, some games in the genre have intentionally headed in the direction of shorter runs -- the "coffee break roguelike". The problem there is that roguelikes have also historically had a lot of interacting game mechanics in building out a character, and if you put a ten-minute cap or so on a run, that sharply limits the degree of complexity that can come up over any given run for a character.

  • I like how in Breath of the Wild, when it tells you to press a button like ‘A’ or ‘Y’, it shows you where that button is relative to the others. This way, if you aren’t super familiar with the controller, you don’t need to take your eyes off the screen.

    • Games need to take into consideration people who are not used to playing. Games telling you "Press L3/R3" are the worst; most new players don't even know that the sticks can click!

      • Hmm. I don't know.

        I agree that it's a valid insight that a lot of basic input things are not explained and that it's not obvious to a first time user.

        But on the other hand, I think that the vast majority of players have, at this point, learned.

        I remember way back when the personal computer was getting going, the first (or maybe second) Macintosh came out with an audio tape that one could play in conjunction with an automated demo showing how to click on things and drag and so forth. What icons and menus were. Today, we just kind of assume that people know that, because they've picked them up on the way, so it's not like individual software packages have a tutorial telling someone what a window is and how to use it.

        And I remember being at a library where there was some "computer training for senior citizens" thing going on near me, and some elderly lady was having trouble figuring out double-clicking and the instructor there said "don't worry, double-clicking is one of the hardest things". I mentally kind of rolled my eyeballs, but then I thought about that. I mean, I'd been double-clicking for years, and I bet that the first time I started out, I probably dicked it up too.

        But I don't know if the way to do that is to have every game incorporate a tutorial on the console's hardware doing things like teaching players that the console sticks are clickable. Like, maybe the real answer is that the console should have a short tutorial. Most consoles these days seem to have an intrinsic concept of user accounts. When creating one, maybe run through the hardware tutorial.

    • Nintendo is very good about this in all their games. I think it's primarily because on the Switch, if you are using an individual JoyCon, the actual button names are not consistent, so you have to rely on the position of the button to convey which one you want players to press. I don't think you can control BOTW or TOTK with an individual JoyCon, but I imagine they have those assets just ready to go.

  • Here's a really small and easy to fix pet peeve of mine: graphics options that cycle through the levels of fidelity with inconsistent scales. I like to set my graphics to max, try it out, and then adjust down where needed. It's very annoying if a game doesn't stop where the max option is, so if it's currently at "High" I have no idea if the next option to the right is going to be "Very High" or "Low" again. So I often end up overshooting the highest setting and having to go back one, or purposefully going to the lowest setting and then one further.

    • Yup. Ideally there should always be some kind of indicator, like a bar, that lets you easily see how many steps there are and which one is selected.

      Also: If there are graphics presets available, if there's one that's called "highest" or "max" then that should actually crank everything to the highest possible setting.
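      A toy sketch of both ideas (hypothetical names, not any engine's real API): clamp the setting at the ends instead of wrapping back to "Low", and draw a simple bar so you can see where you are in the range:

```python
# Quality levels in ascending order of fidelity.
LEVELS = ["Low", "Medium", "High", "Very High"]

def step(index, direction):
    # Clamp at the ends rather than cycling back around to "Low".
    return max(0, min(len(LEVELS) - 1, index + direction))

def indicator(index):
    # e.g. index 2 -> "High [###-]": one mark per step, current level filled.
    filled = "#" * (index + 1)
    empty = "-" * (len(LEVELS) - index - 1)
    return f"{LEVELS[index]} [{filled}{empty}]"
```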

      • that should actually crank everything to the highest possible setting.

        While I can understand where you're coming from, one thing I wonder about -- I think that a lot of people want to use the max setting and expect it to work. It's not unreasonable for a developer to choose ranges such that a max setting doesn't run reasonably on any current hardware, as doing that may provide for scalability on future hardware. Like, it's easy for me to make a game that can scale up to future hardware -- e.g. try to keep more textures loaded in VRAM than exists on any hardware today, or have shadow resolutions that simply cannot be computed by existing hardware in a reasonable amount of time. But maybe in five years, the hardware can handle it.

        If a game developer has the highest across-the-board quality setting not work well on any existing system, then I think that you're going to wind up with people who buy a fancy PC, choose the "max" setting, and then complain "this game isn't optimized; I bought expensive hardware and it runs poorly on Ultra/Max/whatever mode".

        But if the game developer doesn't let the settings go higher, then they're hamstringing people who might be using the software five or ten years down the line.

        I think that one might need a "maximum reasonable on existing hardware" setting or something like that.

        I've occasionally seen "Insane" with a recommendation that effectively means something like that, "this doesn't run on any existing hardware well, but down the line, it might". But I suspect that there are people who are still going to choose that setting and be unhappy if it doesn't perform well.

  • No Denuvo
    DRM-free versions (fuck every AAA client, give me the setup files and piss off)
    Linux-friendly anti-cheat
    If your game has an online component, release the server files so the community can self-host!

    Basically, anything that preserves a game well beyond its prime.

    • Linux-friendly anti-cheat

      Anti-cheat systems in general tend to be fragile to changes in the game environment.

      Honestly, I used to want that, and I'll believe that game devs could do better than they do today, but honestly, I think that the problem is, end of the day, fundamentally not a technically-solvable one. The only way you're going to reasonably-reliably do anti-cheat stuff is going to be to have a trusted system, where the player can't do anything to their system.

      I'd say that it's one of the stronger arguments for consoles in general versus PC gaming. On a console, the playing field is pretty much level. Everyone has the same software running on their system, the same number of frames on their screen. Maybe there might be limited differences to the controller or better latency to a server, but that's it. It's hard to modify the system to get that edge. A console is pretty close to the ideal system for competitive multiplayer stuff. On a PC, in a (real-time) competitive multiplayer game, someone is always going to have some level of an edge. Like, the ability to get higher resolution or more frames per second, the ability of games to scale up to use better hardware, is fundamentally something of a pay-to-win baked into the system.

      There will always be a place for competitive multiplayer games, but I honestly think that a better route forward for many games is to improve game AI from where it is today and then use computer opponents more heavily. While humans make for a very smart enemy "AI" in a lot of ways, and using them may be a technically-easier problem than doing comparable enemy AI, there are also all kinds of baggage that fundamentally come with competitive multiplayer play:

      • Limited lifespan for the game. At some point, nobody (or not many people) will be playing the game any more, even if it doesn't depend on the game publisher to operate online servers. At that point, the game will head into the dustbin of history -- it'll be hard to get enough people together at any one time to play a game. Multiplayer games are mortal; single-player games are immortal.
      • You can't pause. Or, well, you can, but then that doesn't scale up to many players and can create its own set of problems. A lot of people need to change an infant's diaper or get the door or take a call. They can play against computers, but they can't (reasonably) play against other players.
      • Cheating.
      • Griefing.
      • Sometimes optimal human strategy isn't...all that much fun to actually play against. Like, I remember playing the original Team Fortress, and that a strategy was to have classes that could set up static defenses (pipe bombs, lasers, turrets, etc) set them up right atop spawn points. That may well be a good strategy in the game, but it's also not a lot of fun for the other players.
      • Immersion. It doesn't matter for all games, but for some it does. I don't expect humans to role-play or stay in character, because I know that it's work and I don't want to hassle with it myself. But, end of the day, playing against xxPussySlayer69xx is kind of immersion-breaking.
      • Latency is always going to be an issue. You can mitigate it a bit with prediction and engine improvements or more telecom infrastructure, but the laws of physics still place constraints on the speed of light. There are ways you can minimize it -- LAN parties, if you can get enough people. Regional servers, though that guy who lives in Hawaii is always gonna just have a hard time of it. But it's always going to be there; you're never going to truly have a level playing field.
      • The game is intrinsically mandatory-online. If you have a spotty or no connection, the game doesn't work.

      Another issue is the advance of technology. If it isn't there now, I can imagine a generic AI engine, something like Havok is for physics, becoming widespread. And as that improves, one can get more-and-more compelling AI. Plus, hardware is getting better. But humans are, well, human. Humanity isn't getting better at being a game opponent over the years. So my long-run bet is gonna be on game AI tending to edge in on humans as an opponent for human players.

      • Okay so I fully agree on the use of better AI in games as competitors. The AI in games, though sometimes complex, is lacking in a lot of major games and the difficulty setting just basically amps up their damage and health instead of causing them to outplay you.

        I think there are two solutions for better competitive games that reduce cheating, and they’re already somewhat at work.

        The first solution is implementing AI to detect cheating, which has been done, but in a very limited scope. This will require more data collection from the user, but I fully support that if you’re playing competitively and not casually. Why? Because in-person sports also collect plenty of data on you, often even more invasively, to make sure you aren’t cheating. This could be done in collaboration with Microsoft, actually, because they have the ability to lock down their OS in certain ways while playing competitive games. They just haven’t bothered because no one asks. Same with Linux, potentially, if someone wanted to make that.

        The second important improvement is to raise the stakes for anyone who plays an esports game. I’m reminded of Valve requiring a phone number for CSGO, because it’s easy to validate but raises the difficulty and price of cheating and bans. A higher price for competitive games is also entirely possible and likewise raises the stakes of cheating. The fewer accounts cheaters can buy, the better. Should it ask for a social security card? No. But I think that system bans based on hardware and IP are also important. You can also improve the value/time put into each account to make it more trustworthy. If a person plays CS for thousands of hours, make their account worth something.

        And a minor third improvement would be: match people with more matches/xp/hours with other people of similar dedication at similar skill levels. That means cheaters will decrease the more you play and a cheater would have to play for far longer with cheats undetected to get to that point.

        There’s plenty that can be done, companies are just doing almost nothing about the problem because cheaters make them money.

      • The only way you’re going to reasonably-reliably do anti-cheat stuff is going to be to have a trusted system, where the player can’t do anything to their system.

        Even then there are possible options (HDMI splitter, etc.).

  • @Plume Oh yeah, and this: start the game in a neutral area or room where you can test that the controls and sound are working properly and make sure the performance is right BEFORE the intro cutscene plays.

    • A number of PC games -- where the hardware's performance capabilities are going to change from player to player -- have a "benchmark" option accessible, usually in the video settings, that does a "fly-through" of some relatively-intensive levels, and then gives FPS statistics (I think usually an average count, though come to think of it, a 95% number would be nice too). Thinking of a recent example, Cyberpunk 2077 does this. The earliest game that I recall that had some similar feature was Quake, with the timedemo command, though that wasn't accessible outside of the console.

      That doesn't deal with testing controls, but it does deal with performance (and can hit a number of the engine's features), so it does part of what you want.

      • A benchmark for tweaking graphics settings is also something I think every game should have. Just let me run a benchmark and tweak the settings before starting the game.
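        The average-plus-percentile idea could be summarized like this (an illustrative sketch, not any game's actual benchmark code):

```python
# Summarize a benchmark run from per-frame render times in milliseconds.
def summarize(frame_times_ms):
    n = len(frame_times_ms)
    avg_fps = 1000.0 * n / sum(frame_times_ms)
    # 95th-percentile frame time: 95% of frames rendered at least this fast,
    # which exposes stutter that an average alone hides.
    p95_ms = sorted(frame_times_ms)[min(n - 1, int(0.95 * n))]
    return avg_fps, p95_ms

# 95 smooth frames at ~60 FPS plus 5 stutters at ~30 FPS:
avg, p95 = summarize([16.7] * 95 + [33.3] * 5)
```

        Here the average comes out around 57 FPS, but the 95th-percentile frame time of 33.3 ms reveals the stutters.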

  • Maybe not everywhere, because then it wouldn't be nearly as special, but I absolutely adored the "asynchronous multiplayer" aspects of Death Stranding.

    Viewing the "strand contracts" tab and looking at how many other actual humans used and "liked" the infrastructure you created or helped to create. Creating contracts with players who seem to appreciate your work, so that you see more of their structures and they see more of yours. Trying to find the most optimal place for a bridge or watchtower so that other players will appreciate it and give you "likes." That nice feeling of warmth you get when you finish building a road that others had started... And those are only a couple of examples.

    Just the whole freaking thing fits so well into the "we're all in this together, even if we're (forcibly) isolated" message the game is conveying. Working together with real people that you will never directly see or speak to, in order to make an incredibly arduous journey a bit easier for all. Amazing.

    At least I think that was one of the messages, Kojima can be cryptic at times lol.

    Again, I wouldn't want it to become the next "climb the tower to reveal part of the map" mechanic, and get ruined. You can't just shoe-horn it in, it has to make sense in context.

  • Give me a cheat menu or something after beating the game. Let me run around as God causing chaos and break the game. Easy extra hours.

  • My biggest one is robust modding support. I understand it's something that potentially needs a lot of extra effort to implement from the developers, but when I look at my collection of games that I love, almost all of them let me mod like crazy. Let me download 90 bugfixes and 40 QoL tweaks for a game from 2003.

    • One issue is that this can be a vector for malware. I kind of wish that game engines came standard with something like the JavaScript engine in browsers, with some sort of sandbox for mods. I'm not saying that that'd solve everything -- the game code that the mods invoke probably isn't hardened -- but it'd be better than just having arbitrary modifications go in. Especially with mod systems that auto-download new versions: even if the mod author is on the up-and-up, if someone compromises his account or computer, they've compromised all the computers using the mod.
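      To illustrate only the shape of that design (this is a hypothetical sketch; Python's `exec` is emphatically NOT a real sandbox, unlike a purpose-built embedded script engine): a mod loader that hands the mod script a small whitelisted API instead of full access:

```python
# Hypothetical mod loader: the mod script only sees a whitelisted API.
# WARNING: exec() with an emptied __builtins__ is NOT actually secure in
# Python -- it only illustrates the "limited API surface" idea; a real
# implementation needs a proper sandboxed engine, as browsers use for JS.
registered_items = []

def register_item(name):
    # The one capability we deliberately expose to mods in this sketch.
    registered_items.append(name)

ALLOWED_API = {"register_item": register_item, "log": print}

def load_mod(source):
    # The mod runs with only the whitelisted names in scope.
    exec(source, {"__builtins__": {}, **ALLOWED_API})

load_mod('register_item("rusty_sword")')
```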

      EDIT: This isn't just a problem specific to mods, either. A lot of online software library systems that provide auto-updates (pip for Python, rvm for Ruby, etc) can be a vector into systems. Providing auto-updates where many, many people have rights to push updates to computers is convenient in terms of getting software working, but unless the resulting code is running sandboxed, it's creating an awful lot of vectors to attack someone's system. This isn't to impugn any one author -- the vast bulk of people writing mods and open-source software are upstanding people. But it only takes one bad egg or one author who themselves has their system compromised to compromise a lot of other systems, and in practice, if you're saying "subscribe to this mod", you're doing something that may have a lot of security implications for your system.

      Consoles and phones already do a decent job of sandboxing games (well, as far as I know; I haven't been working on security for either of them, but from what I've seen of the systems, they at least aim to achieve that). So maybe someone can compromise an app, but there's a limited amount they can do aside from that. Maybe dump your name and location and such, but they can't get control of your other software. However, Linux, Windows, and MacOS don't have that kind of app sandboxing generally in place. I know that Linux has been working towards it -- that's one major reason for shifting to Wayland, among other things -- but it's definitely not there today.

      For servers, I think that part of the way that sysadmins have been trying to deal with this is running containers or VMs on a per-service basis. Looking at !homelab@lemmy.ml, I see a lot of people talking about containers or VMs. But that's not really an option today for desktop users who want to run games in a sandbox; it's not set up automatically, and 3D card support spanning containers is not great today, or at least wasn't last time I looked at it. I can run Ren'Py games in a firejail today successfully on Linux, but that's not out-of-box behavior, Steam definitely doesn't have it in place by default, I have no idea whether it's possible for WINE (which is important for a lot of Windows games that run on Linux), and at least some if not all of the mechanisms firejail uses for graphics won't permit access to the 3D hardware.

  • I think this might be a thing in modern games, but I don't play enough new releases to be sure: changing the accessibility settings before anything else in the game. The first time I encountered this was in The Division 2, a Ubisoft game of all things, and being able to tune my subtitles, visual cues, and sound options, among others, before even the "Press Start" screen is an incredibly comfortable feeling.

    A minor feature that is unfortunately underused is having an archive/library/compendium of characters, plot events and the like. The Yakuza series has entries for its major characters, which is bliss in games that are essentially soap operas introducing new families and plot twists with every new installment, and being able to catch up after a few days/weeks without playing is a relief.

    • It's a thing... here and there. Far Cry 6 has its voice-reader accessibility feature turned on by default, which is nice. Say what you will about Ubisoft, they're good with accessibility stuff.

  • Subtitles on by default.

    Or at the very least, the option to choose subtitles right away at the very start of the game.

    I fucking hate when games have intro scenes or full chapters where you can't pause or bring up the menu, so you cannot turn on subtitles. I just don't play games without subtitles (when the game has dialogue).

    • Just letting people pause cutscenes to access the menus would be a huge start.

    • I don't like when games just throw you into the action without giving you the chance to tweak settings before (or even until completing the tutorial) in the first place. Like, why?

  • Less a design choice and more a technical feat, but I'm hoping that we start to see the phase-out of loading screens and more of a push toward seamless gameplay. I was watching a video from the newest Spider-Man and it was pretty damn cool. Practical for all games? Maybe not for a while. But I certainly would like to see more investment in leveraging improvements in disk and memory capabilities going forward.

    • Most loading screens are just more of a nuisance than anything, but if they don't remove them, maybe they could get creative in how they work/look?

      The main series Danganronpa games did loading screens in a very creative way that made them feel special. The room and all the things inside would start popping up and build the room as it loaded in. More loading screens like that would be lovely if they aren't able to remove them.

    • I would guess that loading screens will never fully go away. Especially on consoles, where everyone has a fixed set of hardware resources, and the developer knows what that is and is aiming at optimizing for that target, being able to fully remove one area from memory before loading the next gives you potentially twice as much memory to work with. That's a big-enough gain that game developers are not going to want to give that up, since the alternative is being able to only have half (or less, if multiple areas are near each other) the complexity for their areas. If hardware gets more memory, at least some developers are going to want to increase the complexity of the environments they have rather than eliminating load screens. Otherwise, their scenes are going to look significantly-worse than their competitors who have loading screens.

      There may be specific games that eliminate loading screens, at least other than the initial startup of the game. Loading screens might be shorter, or might just consist of a brief fade. But I don't think that we'll ever reach the point that all developers decide that that tradeoff to fully-eliminate loading screens is one that they want to make.

      The shift from optical media and rotational drives to SSDs has reduced the relative cost of loading an area. But it hasn't eliminated it.

      I think that a necessary condition for loading screens going away is basically a shift to a memory architecture where only a single type of storage exists -- that is, you don't have fast-but-volatile primary storage and slow-but-nonvolatile secondary storage, but only a single form of non-volatile storage that is fast-enough to run from directly. We don't have that technology today. Even then, it might not kill loading screens, since you might want to have different representations (more-efficient but less-compact for the area surrounding the character, and less-efficient but more-compact for inactive areas).

      • See, I figured consoles might actually be more likely to cross that finish line first. My logic is that the controlled platforms would give developers a) potential access to a more bare-metal style of storage medium maybe not practical on PC, and b) a consistent performance target (no needing to account for people using those pesky hard drives!)

        I feel like we're maybe already starting to see this with the PlayStation 5, but it probably also depends on how much work actually goes into optimization for these development teams.

  • The ability to turn off various typical live service features. Hiding the store and annoying announcements would be awesome.

    • The fact that you can't is a feature... just not for you.

      • I mean, if a game publisher wants to try to offset the game price via adding advertisements or to try to market the game via your social network or whatever, fine. I'm not going to try to tell game publishers how to do their business.

        However, as a game consumer, I'd like to be informed before I buy a game whether game publishers are doing this in a game before I purchase it, so that I have the opportunity to opt out of buying it. Personally, I'd rather that they at least offer a "premium" version without stuff like this; the mobile video game industry often does an "adware and a premium no-ads" model.

        Steam defaults to notifying people on your friends list what games you are playing, though they let you turn it off. I doubt that any user wants that on, all else held equal, other than the specific case of multiplayer games where users play multiplayer games with their friends. It might help a game publisher market their game to other users, but I'd rather just pay whatever extra it takes to make up the difference. I'm not going to say that it's worth it to every user to pay a little more to maintain game immersion, but it is to me.

  • The option to skip puzzles and not get punished for it.

  • Independent difficulty options for things like exploration, combat, crafting, etc. Whatever the game has.
  • I want decent AA back gdi

    Ray tracing isn't worth how horrible TAA can make some games look, imo. We're getting close, but it's been years of this and I'm so tired of choosing between ghosting and jaggies. Or worse, some games that just force the ghosting TAA onto you anyway (cyberpunk you fuck)

    • I agree with you on the TAA part but what does that have to do with ray tracing?

      • RT being a thing, plus deferred rendering for larger and more complex scenes, caused rendering engines to change in ways that make AA work less well.

        Things like MSAA are now basically worthless due to these rendering changes, leading to TAA proliferation, as it's the best AA for its cost in modern engines.
