What's the deal with it being impossible to natively set up a PC to game with a gamepad on monitor 1 while the mouse is actively browsing the desktop or other open windows on monitor 2?

I enjoy playing modded games on PC. My IT competency is all informal and hands-on, so I apologize if this is a dumb question or it seems like I have no idea what I'm talking about. What I was looking to do, without using any 3rd party software, was to have a game playing in any type of windowed, borderless windowed, or full screen display on monitor 1 while also being able to navigate an open browser window, or even just my desktop, on display 2. This is all a very normal setup, I know, but the brick wall I hit happens because my ability with a keyboard is a huge detriment to my gaming. So what I wanted to set up was for the gamepad and gaming window, regardless of which monitor it's being used on, to remain "up front" and active, while my keyboard and mouse stay their normal operating devices for navigating my desktop or other active program windows simultaneously.

I have only looked into this briefly, because it was very evident after searching the internet that there is no way to natively set up my PC to let me game on one monitor using a gamepad while also freely navigating other open windows on monitor 2 using mouse and keyboard, with the gaming window remaining active.

I'm playing Witcher 3 currently, but earlier this year, on the same PC and Windows 11 setup, I swear I was able to set up Fallout London by editing the settings .ini file to do exactly what I described, no problem. Admittedly I'm usually as baked as an apple pie when playing with games or modding, but I flipping swear that editing the Fallout ini allowed me to set the window as always on top and active (meaning it didn't move to the background, or pause, or cut the gamepad off when the mouse was used on the other display). It was a series of like 3 lines of true/false display parameters I entered that let me choose how the program would respond when my input switched from the program window's gamepad to the PC's mouse/keyboard input.
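
If it helps, the kind of tweak I'm half-remembering would have looked roughly like the sketch below. I can't verify the exact lines anymore, so treat the key names, and which file they live in (Fallout4.ini vs Fallout4Prefs.ini), as my best recollection rather than gospel.

```ini
; Sketch from memory, not the exact lines I used. Key names and the file
; they belong in may differ on your install.

[General]
; keep the game simulating and accepting gamepad input when it loses focus
bAlwaysActive=1

[Display]
; borderless window instead of exclusive fullscreen, kept drawn on top
bBorderless=1
bFull Screen=0
bTopMostWindow=1
```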

Fast forward to me trying to get the same setup for a modded Witcher 3, and the result of my searching was that it's a universally accepted technical impossibility to use a gamepad exclusively in one open program window while also using a mouse to navigate the PC's desktop or other apps' open windows simultaneously.

My questions are all about understanding why this would be so difficult to implement as a standard option you can set in your OS settings. I'm certainly not qualified to actually be questioning this, but in my limited understanding of the devices and platforms in play, this really seems like a very doable and sought-after feature.

PCs with touch screens are basically doing everything I would like to happen, just using touch screen inputs and parameters instead of a gamepad's input. What am I missing in this workflow that is brickwalling the following from being an actual feature?

  1. Creating an OS-native setting where the user can choose how the OS handles active and sleeping windows. For example, option A keeps the default behavior of determining the active window based on actual cursor location. No clue how it's programmed, but basically keeping things as they currently are, where you can hover over an open window without clicking in it and still scroll it with the mouse wheel. Option B would let you disable the behavior where inactive windows are relegated to only running background processes.
  2. That setting would then require a subsequent option to pop up when the step 1 selection is set to have multiple open windows running simultaneously as separate active windows. Here the user would choose between using one input device to control the PC globally, as normal, or assigning an installed device to operate only within the limits of an assigned program's window while that window is open and active.

Basically I'm picturing this working similar to the way a touchscreen laptop lets you choose between standard desktop mode and tablet mode. The actual keyboard and mouse keep the global permissions they've always had, so they override the controller in its assigned program window, both for troubleshooting and as a safety net in case input device 2 (gamepad or whatever) fails mid-use. The secondary input device can only operate within the confines of its assigned program window.
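
From the little digging I've done, Windows does seem to already have a building block close to this: a program can register for "raw input" from a specific device class and ask to keep receiving it even while its window isn't the focused one. I haven't built this myself, so take the sketch below as an illustration of the idea rather than a working tool (the hidden window, the HID usage numbers, and the build line are all assumptions on my part).

```cpp
// Sketch: register for HID gamepad raw input with RIDEV_INPUTSINK so the
// WM_INPUT messages keep arriving even when another window has focus.
// Build line is an assumption: cl /EHsc rawpad.cpp user32.lib
#include <windows.h>
#include <cstdio>

static LRESULT CALLBACK WndProc(HWND hwnd, UINT msg, WPARAM wParam, LPARAM lParam) {
    if (msg == WM_INPUT) {
        // Query the size of the raw report; a real reader would allocate that
        // many bytes, call GetRawInputData again, and parse the HID report.
        UINT size = 0;
        GetRawInputData((HRAWINPUT)lParam, RID_INPUT, nullptr, &size,
                        sizeof(RAWINPUTHEADER));
        printf("WM_INPUT: %u-byte gamepad report, delivered without focus\n", size);
        return 0;
    }
    if (msg == WM_DESTROY) { PostQuitMessage(0); return 0; }
    return DefWindowProcW(hwnd, msg, wParam, lParam);
}

int main() {
    // A plain, never-shown window to receive the input messages.
    WNDCLASSW wc = {};
    wc.lpfnWndProc   = WndProc;
    wc.hInstance     = GetModuleHandleW(nullptr);
    wc.lpszClassName = L"RawPadSink";
    RegisterClassW(&wc);
    HWND hwnd = CreateWindowW(L"RawPadSink", L"", 0, 0, 0, 0, 0,
                              nullptr, nullptr, wc.hInstance, nullptr);

    // HID usage page 0x01, usage 0x05 = "Game Pad" (some pads enumerate as
    // 0x04 "Joystick" instead). RIDEV_INPUTSINK = deliver input even when
    // this window is not in the foreground.
    RAWINPUTDEVICE rid = {};
    rid.usUsagePage = 0x01;
    rid.usUsage     = 0x05;
    rid.dwFlags     = RIDEV_INPUTSINK;
    rid.hwndTarget  = hwnd;
    if (!RegisterRawInputDevices(&rid, 1, sizeof(rid))) {
        printf("RegisterRawInputDevices failed: %lu\n", GetLastError());
        return 1;
    }

    MSG msg;
    while (GetMessageW(&msg, nullptr, 0, 0) > 0) {
        TranslateMessage(&msg);
        DispatchMessageW(&msg);
    }
    return 0;
}
```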

It also seems like I have messed around with software or device settings in the past that are already doing this for stuff like Android Auto, augmented notepads and their styluses, the already mentioned touch screen displays, and, I'm guessing, some of the more accommodating accessibility options available for different types of handicapable input devices. I mean shit, don't they already have to make all computers capable of being used with only a keyboard or only a mouse?

I'm fully expecting the answer to be that Windows and Microsoft are too deep in the "fuck what users want" mindset to include a feature that would add any operating/programming costs. But like I said at the start, I know that I don't know enough to know whether there are major obstacles to engineering this.

9 comments
  • There really aren't. This is down to the developer of the game in question and not necessarily the OS. Windows does indeed allow for true multitasking, especially in these modern times of multicore processors being ubiquitous, etc. However, nearly all game developers assume that their game will have focus 100% of the time and pause some/most/all processing any time their application does not have focus.

    What Windows does do is assume only one window can have focus at a time, i.e. it is the window receiving default and direct input from the mouse and most especially the keyboard. Most other graphical environments make the same assumption. If your application or window does not have the focus it's also not a given that it's even visible, so it's not necessarily safe for the user to be able to interact with it anyway. It's not that you can't, but rather that it's risky to allow your program to do so. Imagine if you had e.g. a file manager that accepted input from shortcut keys even if it did not have focus, and deploying ECM in your flight simulator happens to be bound to the delete key. See what I mean?
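
    If you want to see that model directly, a throwaway console program like the sketch below just asks Windows which top-level window is currently in the foreground; at any instant there is exactly one answer (or none).

    ```cpp
    // Sketch: poll the single "foreground" window the OS tracks and print its
    // title, to make the one-focused-window-at-a-time model visible.
    #include <windows.h>
    #include <cstdio>

    int main() {
        for (;;) {
            HWND fg = GetForegroundWindow();       // one window, or NULL
            wchar_t title[256] = L"";
            if (fg) GetWindowTextW(fg, title, 256);
            wprintf(L"foreground window: %ls\n", title[0] ? title : L"(none/untitled)");
            Sleep(1000);
        }
    }
    ```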

    Gamepad input is actually not necessarily application exclusive on Windows, either. I know for sure that even as far back as the Windows 98 days I had several emulators that could continue running even when they did not have focus and would accept inputs from a controller, although not the mouse or keyboard. This allowed you to do exactly as you describe, keep playing the game (or allow someone else to keep playing the game) via the controller while you browse the internet using the mouse and keyboard, or whatever.
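
    XInput, for instance, is a pure polling API: whoever calls it gets the controller state, and there is no concept of window focus anywhere in it. A minimal sketch (the build line and the 60 Hz poll rate are just assumptions):

    ```cpp
    // Sketch: read controller 0 through XInput from a plain console program.
    // XInputGetState has no focus requirement, so this keeps reporting button
    // state while any other window owns the keyboard and mouse.
    // Build line is an assumption: cl /EHsc padpoll.cpp xinput.lib
    #include <windows.h>
    #include <xinput.h>
    #include <cstdio>

    #pragma comment(lib, "xinput.lib")

    int main() {
        for (;;) {
            XINPUT_STATE state = {};
            if (XInputGetState(0, &state) == ERROR_SUCCESS) {
                if (state.Gamepad.wButtons & XINPUT_GAMEPAD_A)
                    printf("A is held (and this console need not be focused)\n");
            }
            Sleep(16);  // roughly a 60 Hz poll, like a game loop
        }
    }
    ```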

    It's probably also safe for most game developers to assume that regardless of what the computer is capable of, the player's attention can only be directed in one place at a time. There's not much benefit in making sure your program can accept inputs even if the user's nose is buried in their web browser because it's unlikely they'd have the game in any state except paused while they look something up anyway. In Windows specifically you absolutely can even set up an API hook to intercept and read keyboard input and even relative mouse movements before the focused window gets it, but 99.999% of the time there is no non-nefarious reason to do so.
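
    For what it's worth, the hook I mean is a low-level keyboard hook; stripped down it looks like the sketch below, which is also why this mechanism gets treated with suspicion by anti-cheat and antivirus software.

    ```cpp
    // Sketch: a low-level keyboard hook sees every key event before the focused
    // window does. Legitimate uses (macro tools, accessibility software) exist,
    // but they are the exception.
    #include <windows.h>
    #include <cstdio>

    static LRESULT CALLBACK KbHook(int code, WPARAM wParam, LPARAM lParam) {
        if (code == HC_ACTION && wParam == WM_KEYDOWN) {
            const KBDLLHOOKSTRUCT* k = reinterpret_cast<const KBDLLHOOKSTRUCT*>(lParam);
            printf("vk 0x%02X pressed, regardless of which window has focus\n",
                   (unsigned)k->vkCode);
        }
        // Always pass the event along so the rest of the system still sees it.
        return CallNextHookEx(nullptr, code, wParam, lParam);
    }

    int main() {
        HHOOK hook = SetWindowsHookExW(WH_KEYBOARD_LL, KbHook,
                                       GetModuleHandleW(nullptr), 0);
        if (!hook) return 1;

        // Low-level hooks require a message loop on the installing thread.
        MSG msg;
        while (GetMessageW(&msg, nullptr, 0, 0) > 0) {
            TranslateMessage(&msg);
            DispatchMessageW(&msg);
        }
        UnhookWindowsHookEx(hook);
        return 0;
    }
    ```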

    • Awesome reply. Thank you. A couple questions I have:

      What Windows does do is assume only one window can have focus at a time, i.e. it is the window receiving default and direct input from the mouse and most especially the keyboard.

      Correct me if I'm wrong, but this seems like it may have changed around the time they added the snap function for windows, because I thought that was when you could start just hovering the mouse over a window and using the mouse wheel to scroll it without actually clicking or activating the window first. Would the window in this case still be considered to be running in the background, or does the system keep it in some sort of suspended but still active state? It is able to tell which window is overlapping another; for example, if I have two instances of File Explorer open but not maximized, it can determine which window is behind the other.

      If your application or window does not have the focus it's also not a given that it's even visible, so it's not necessarily safe for the user to be able to interact with it anyway.

      For this, it reminded me of when I was running Fallout London and one of the mods required custom button mapping for use with a gamepad. I can't remember the name of the software, but it was one of the most frequently used programs for mapping keyboard controls to a gamepad, and it let me set keyboard keys for the different mod features, then map those keyboard keys to the controller buttons. Basically I could have replaced my keyboard and mouse with a controller. I feel like letting a user use a 3rd party software to make that many changes to an input device would be way more of a security risk than it would be for them to design a means to do the same thing through the actual system.
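
      From what I can gather, that kind of remapper is basically a loop that reads the controller and fakes keyboard presses; something in the spirit of the sketch below, where the A-button-to-E-key mapping is made up purely for illustration.

      ```cpp
      // Sketch of how a controller-to-keyboard remapper of that sort typically
      // works: poll the pad with XInput, then synthesize keyboard events with
      // SendInput. Note that SendInput delivers to whichever window currently
      // has focus, which is exactly why remappers and window focus fight.
      #include <windows.h>
      #include <xinput.h>

      #pragma comment(lib, "xinput.lib")

      static void SendKey(WORD vk, bool down) {
          INPUT in = {};
          in.type       = INPUT_KEYBOARD;
          in.ki.wVk     = vk;
          in.ki.dwFlags = down ? 0 : KEYEVENTF_KEYUP;
          SendInput(1, &in, sizeof(in));
      }

      int main() {
          bool aWasDown = false;
          for (;;) {
              XINPUT_STATE state = {};
              if (XInputGetState(0, &state) == ERROR_SUCCESS) {
                  bool aDown = (state.Gamepad.wButtons & XINPUT_GAMEPAD_A) != 0;
                  if (aDown != aWasDown) {
                      SendKey('E', aDown);   // pad A acts like the E key (example mapping)
                      aWasDown = aDown;
                  }
              }
              Sleep(10);
          }
      }
      ```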

      It's probably also safe for most game developers to assume that regardless of what the computer is capable of, the player's attention can only be directed in one place at a time. There's not much benefit in making sure your program can accept inputs even if the user's nose is buried in their web browser because it's unlikely they'd have the game in any state except paused while they look something up anyway.

      Did not even consider this, but I think you hit the nail on the head with this part, and with the other part about developers wanting the least amount of variables that could distract you from focusing on their product.

      Seriously, awesome reply and appreciate all the information you were able to provide. 🍻

      • Re: scrolling unfocused windows, I believe this change happened circa Windows XP, but it may have been slightly earlier. Windows that do not have the focus can still receive window messages, scroll wheel inputs now being one of them. This does not grant focus to the window in question. Some old/poorly coded apps that ostensibly support scroll wheel input will to this day ignore it if they do not have focus, even if your mouse cursor is over them.
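
        The system-wide version of this is the "Scroll inactive windows when I hover over them" toggle in the mouse settings, which routes wheel messages to whatever window sits under the cursor. You can watch it happen with a bare window like the sketch below (the window setup is the usual boilerplate; the interesting part is that WM_MOUSEWHEEL arrives while GetFocus says we are not focused).

        ```cpp
        // Sketch: a bare window that logs WM_MOUSEWHEEL. With "scroll inactive
        // windows" enabled, wheel messages arrive while the window is hovered
        // but not focused, and handling them does not steal focus from anything.
        #include <windows.h>
        #include <cstdio>

        static LRESULT CALLBACK WndProc(HWND hwnd, UINT msg, WPARAM wParam, LPARAM lParam) {
            switch (msg) {
            case WM_MOUSEWHEEL:
                printf("wheel delta %d, focused: %s\n",
                       (int)GET_WHEEL_DELTA_WPARAM(wParam),
                       GetFocus() == hwnd ? "yes" : "no");
                return 0;
            case WM_DESTROY:
                PostQuitMessage(0);
                return 0;
            }
            return DefWindowProcW(hwnd, msg, wParam, lParam);
        }

        int main() {
            WNDCLASSW wc = {};
            wc.lpfnWndProc   = WndProc;
            wc.hInstance     = GetModuleHandleW(nullptr);
            wc.lpszClassName = L"WheelDemo";
            RegisterClassW(&wc);
            HWND hwnd = CreateWindowW(L"WheelDemo", L"Hover me, focus something else",
                                      WS_OVERLAPPEDWINDOW, CW_USEDEFAULT, CW_USEDEFAULT,
                                      400, 300, nullptr, nullptr, wc.hInstance, nullptr);
            ShowWindow(hwnd, SW_SHOW);

            MSG msg;
            while (GetMessageW(&msg, nullptr, 0, 0) > 0) {
                TranslateMessage(&msg);
                DispatchMessageW(&msg);
            }
            return 0;
        }
        ```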

  • The trouble is that most games nowadays are specifically set up not to receive any input when they aren't focused, so they don't start moving when you want to use your gamepad for something else. Many also reduce their framerate and mute the audio when not in focus, and you are lucky if you get a choice in that.
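
    Concretely, the usual pattern is that the game's window procedure watches the app-activation message and throttles everything off of it; roughly the fragment below (window creation and the render loop are omitted, the helper names in the last comment are placeholders, and the listed reactions are typical rather than what any specific game does).

    ```cpp
    // Fragment: the typical "pause when unfocused" pattern inside a game's
    // window procedure. WM_ACTIVATEAPP fires when the app gains or loses
    // foreground status; the game then gates input, audio and framerate on it.
    #include <windows.h>

    static bool g_hasFocus = true;

    LRESULT CALLBACK GameWndProc(HWND hwnd, UINT msg, WPARAM wParam, LPARAM lParam) {
        switch (msg) {
        case WM_ACTIVATEAPP:
            // wParam is TRUE when this app becomes active, FALSE when it loses it.
            g_hasFocus = (wParam != FALSE);
            // Typical reactions when g_hasFocus goes false: stop reading the pad,
            // mute the mixer, drop to a throttled render loop, open the pause menu.
            return 0;
        case WM_DESTROY:
            PostQuitMessage(0);
            return 0;
        }
        return DefWindowProcW(hwnd, msg, wParam, lParam);
    }

    // The main loop then gates everything on the same flag (placeholder names):
    //   if (g_hasFocus) { PollGamepad(); StepSimulation(); } else { Sleep(50); }
    ```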

    There might be a way to get what you want with Linux. But configuring that sounds like a nightmare.

    This is generally called multiseat. Every time I hear about it somewhere it seems to be a pain in the ass.