As a die-hard Windows hater who games (I haven’t had Windows installed on any PC I own since 2015): AAA games always get absolutely dogshit performance when they first come out. It was like that with Cyberpunk and with Hogwarts Legacy. Today, those games play just as well on Linux as on Windows. I’m sure they’ll eventually work it out.
I remember seeing a video comparing Starfield performance on Linux and Windows: average FPS was basically the same, but the 1% lows were better on Linux, so it might actually run better there.
I bet it works fine on AMD GPUs right now. If you’re on a 10-series Nvidia card, you’re fucked. On newer Nvidia cards it’s still kind of bad, but not every ProtonDB report involving a 3xxx or 4xxx card complains about performance, so I suspect there’s some performance fix for later Nvidia cards that isn’t well known yet.
The latest driver is 537.13, I think. Distro repositories usually only carry the multiple-of-five driver branches (530, 535, and so on), but someone familiar with the low-level parts could probably get a 537 driver working on Linux manually. No idea whether that would actually help; I haven’t seen anyone claim to have tried it yet.
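For what it’s worth, before attempting anything manual you can at least check which branch you’re actually on. Here’s a minimal sketch; it assumes `nvidia-smi` is on your PATH, and the 537 target is just the branch speculated about above, not a known-good value:

```python
# Minimal sketch: read the installed Nvidia driver version and compare
# it to a target branch. Assumes nvidia-smi is on PATH; the 537 target
# is just the branch discussed above, not a known-good value.
import subprocess

TARGET_BRANCH = 537  # assumption: the branch this thread is speculating about

version = subprocess.run(
    ["nvidia-smi", "--query-gpu=driver_version", "--format=csv,noheader"],
    capture_output=True, text=True, check=True,
).stdout.strip()

major = int(version.split(".")[0])
print(f"Installed driver: {version}")
if major >= TARGET_BRANCH:
    print("Already on (or past) the discussed branch.")
else:
    print(f"Older branch: {major} < {TARGET_BRANCH}")
```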
Cyberpunk’s performance was awful for everyone at launch: Windows, Linux, and consoles.
But it wasn’t? There were a lot of bugs, to be sure, but PC performance was not among them. Hell, I was on a 970 at the time, and it was still fine.
The console versions specifically were a shit show.
But as for running better on Linux, a lot of it tends to come down to shader pre-caching. Many of the stutters on Windows happen the first time a shader is compiled. That was definitely the case with Elden Ring.
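If you want to see whether pre-caching has actually happened for a game, the compiled caches land under your Steam library. A minimal sketch, assuming the default ~/.steam path, that Fossilize (.foz) and DXVK (.dxvk-cache) files are what you’re looking for, and using Starfield’s app ID as an example (all assumptions; adjust for your setup):

```python
# Minimal sketch: list the shader caches Steam has built for one game.
# Assumptions: default Steam install path, caches under steamapps/shadercache.
import sys
from pathlib import Path

appid = sys.argv[1] if len(sys.argv) > 1 else "1716740"  # Starfield's app ID (assumed)
cache_root = Path.home() / ".steam/steam/steamapps/shadercache" / appid

if not cache_root.is_dir():
    sys.exit(f"No shader cache at {cache_root}")

# .foz = Fossilize archives (Steam's precompiled Vulkan pipelines),
# .dxvk-cache = DXVK's own state cache.
for pattern in ("**/*.foz", "**/*.dxvk-cache"):
    for path in sorted(cache_root.glob(pattern)):
        size_kib = path.stat().st_size // 1024
        print(f"{path.relative_to(cache_root)}  ({size_kib} KiB)")
```

If the .foz files are non-trivial in size, pipelines were built before you launched the game, which is exactly what avoids those first-load stutters.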
Ah, here I found it, the Starfield comparison video: https://www.youtube.com/watch?v=zC6fb889qo4
As someone playing on Linux desktop, yes. It’s fine*.
*as fine as it can be, because it needs some general optimization
Edit: and yes, I’m on AMD (it’s the obvious choice for Linux gaming; drivers are in the kernel)
I get a lot of crashes on my RX 6700S, mainly when loading into Neon or The Well.
I’m on a 6600 XT and haven’t had a single crash. I wonder what the difference is. I’m using Pop!_OS with the Liquorix kernel.
I’m on Windows 11