• ocassionallyaduck@lemmy.world · 10 months ago

    I love that the game is such a CPU-hogging mess that LTT used it to test overclocking a brand-new AMD Threadripper, and the game still ran like garbage even on one of the fastest, most multithreaded CPUs in existence.

    I love Cities: Skylines, but whatever is happening in 2 is a three-alarm fire and needs to be fixed.

    • HolyDuckTurtle@kbin.social · 10 months ago

      I imagine LTT did that for meme purposes more than anything else. Threadrippers are not built for games; they’re built for production workloads, which don’t translate to gaming performance.

      That said, the point still stands. This game needs the most powerful gaming hardware (e.g., a Ryzen X3D-series CPU and an RTX 4090) at “recommended” settings and 1080p just to average above 60 fps, which is wild. There’s a rather dedicated fellow on Reddit who runs detailed performance tests after each patch.

        • Nythos@sh.itjust.works · 10 months ago

          I bought it for my girlfriend’s birthday and ended up refunding it because of just how poorly the game ran, even with everything set to minimum.

          • Xara@sh.itjust.works · 10 months ago

            Are you on a potato?

            My system is 8 years old and it plays this game just fine. Granted, I am not running 4K; I am still on 60 Hz monitors. I also haven’t gotten very far into the game, so I haven’t experienced any population over 30k.

      • LordKitsuna@lemmy.world · 10 months ago

        They did it because the developers said the game will use however many cores you can give it. And I mean, yeah, it maxed out all the cores. It was likely doing nothing but struggling to keep them synchronized, but it was using ’em.

      • yarr@feddit.nl · 10 months ago

        > I imagine LTT did that for meme purposes more than anything else. Threadrippers are not built for games; they’re built for production workloads, which don’t translate to gaming performance.

        What are some characteristics of modern, multi-threaded games that don’t match up to production workloads as far as the CPU is concerned? What do you consider a production workload? How does it differ from CS2’s simulation system?

    • Encrypt-Keeper@lemmy.world · 10 months ago

      Not sure why LTT or anyone else thought that would even help, considering simulation games like this rely heavily on single-core performance.

        • Encrypt-Keeper@lemmy.world · 10 months ago

          CS2 uses multiple cores for… something, but it’s a Unity game, and there’s only so much you can do to avoid dependence on a main thread. Your single-core performance is still going to be a limiting factor.
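
          To put rough numbers on that, here’s a toy Amdahl’s-law estimate; the serial fractions are made-up illustration values, not measurements of CS2:

          ```python
          # Amdahl's law: overall speedup = 1 / (serial + parallel / cores).
          # The serial fractions here are hypothetical illustration values,
          # not measurements of CS2's actual main-thread share.
          def speedup(serial_fraction: float, cores: int) -> float:
              parallel_fraction = 1.0 - serial_fraction
              return 1.0 / (serial_fraction + parallel_fraction / cores)

          for serial in (0.1, 0.3, 0.5):
              for cores in (8, 16, 64):
                  print(f"serial={serial:.0%}  cores={cores:3d}  "
                        f"speedup={speedup(serial, cores):.1f}x")
          # Even at 64 cores, a 30% serial share caps you near 3.2x,
          # which is why single-core speed still dominates.
          ```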

          • deur@feddit.nl · 10 months ago

            CS2 uses a design paradigm called Entity Component System (ECS), which allows for aggressive multi-core utilization by splitting game logic into self-contained “systems” that each operate on a subset of “components” per “entity”. This lets data dependencies be statically analyzed, so a scheduler can maximize CPU utilization across the well-separated workloads.
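
            As a rough sketch of the pattern (a toy ECS in Python for illustration; Unity’s actual DOTS API looks nothing like this):

            ```python
            # Toy ECS sketch: components live in per-type tables keyed by
            # entity id, and each "system" iterates over only the entities
            # that have the components it needs.
            positions = {0: [0.0, 0.0], 1: [5.0, 5.0]}   # Position table
            velocities = {0: [1.0, 0.0]}                 # Velocity table
            healths = {0: 100, 1: 80}                    # Health table

            def movement_system(dt: float) -> None:
                # Reads/writes only Position + Velocity; never touches Health.
                for entity in positions.keys() & velocities.keys():
                    positions[entity][0] += velocities[entity][0] * dt
                    positions[entity][1] += velocities[entity][1] * dt

            def regen_system() -> None:
                # Touches only Health, so it shares no data with
                # movement_system and could safely run on another core.
                for entity in healths:
                    healths[entity] = min(100, healths[entity] + 1)

            movement_system(dt=0.016)  # one ~60 fps frame step
            regen_system()
            print(positions, healths)
            ```

            Because each system declares which component tables it reads and writes, a scheduler can see that movement_system and regen_system share no data and run them on different cores.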

            It uses Unity’s DOTS to accomplish this. There is a small bottleneck in communicating that work back to the game’s renderer, but it is doing a lot of valuable work with all those cores. The communication with the renderer, and their rendering implementation in general, is what sucks right now, and that’s where the performance tanks.

            I am very aware that at some level there are fewer multi-core workloads involved, but a CPU core can do a metric shitload of work; it’s the RAM and GPU transfers that kill performance. We don’t need to blame Unity here; they are fucking this up 100% themselves.

            There’s a video that explains all this, but I can’t find it, and that’s pretty annoying, so whatever.

            • arin@lemmy.world · 10 months ago

              Wasn’t most of the frame latency caused by shaders in graphics? There was a deep-dive video, but I forgot the title and the YouTuber.

    • Kairos@lemmy.today · 10 months ago

      The game when it saw that CPU:

      It seems like we have more power than we know what to do with.

      That means we’re not cutting it close enough!

      Edit: I don’t remember the exact quote, but y’all get it.