• Catoblepas@lemmy.blahaj.zoneOP · 1 day ago

    Graphics are cool! I just also think the story, mechanics, and game difficulty balance should get an equal amount of consideration, which seems sorely lacking in a lot of modern AAA games. A mile wide and an inch deep is a saying for a reason.

    • tobis@lemm.ee · 1 day ago

      The real point where this argument falls apart is that modern AAA games almost exclusively use TAA, which ruins graphics. I’m so sick of shadows blurring and everything looking terrible and people saying it’s next level.

      • MudMan@fedia.io · 18 hours ago

        This is very weird and I am pretty sure it can be traced to some influencer ranting about something somewhere and I genuinely don’t have the energy to go trace it back.

        • tobis@lemm.ee · 11 hours ago

          If you are suggesting that I got this position from listening to an influencer, I’m afraid not. I earned this opinion through my own efforts.

          I had three computers in a row experience horrible shadow blurring and nothing seemed to fix it. I spent maybe 100 hours meticulously troubleshooting what the hell was wrong before asking a computer savvy friend to come over and take a look, and he was just like “Oh, that’s just TAA, everyone’s computer looks like that.” And lo and behold, if you turn anti-aliasing off, it disappears.

          A couple of examples I took while troubleshooting: https://imgur.com/a/cqdgIRq https://imgur.com/a/x6mTKx0

          It turns out everyone sees the same thing in almost every game, but until you notice it your brain just filters it out. In many modern games they don’t even give you the option to turn it off. I would have started a hate movement myself, but found out a r/fucktaa community already existed. No lemmy equivalent yet, I believe.

          • MudMan@fedia.io · 10 hours ago

            Aaaand there we go. Subreddit it is.

            FWIW, it’s hard to tell from gifs, but that amount of ghosting and frame-blending is neither TAA nor a normal thing that is happening on everybody’s computers without them noticing.
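            For context, the core of a TAA resolve is just an exponential blend of the current frame into reprojected history, plus a test that throws the history away where it is no longer valid. A minimal sketch of that step (numpy, with motion_ok standing in for whatever disocclusion/motion-vector validity check a given engine actually uses):

              import numpy as np

              def taa_resolve(history, current, motion_ok, alpha=0.1):
                  # history:   last frame's accumulated result, reprojected
                  # current:   this frame's jittered render
                  # motion_ok: boolean mask, True where the history sample is valid
                  blended = (1.0 - alpha) * history + alpha * current
                  # Rejecting stale history is the whole trick: when this step
                  # fails (bad motion vectors, disocclusion), old frames keep
                  # getting blended in and you get exactly those ghost trails.
                  return np.where(motion_ok, blended, current)

            When that rejection works, you get soft edges at worst, not trails.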

            I’m not entirely sure of what’s going on there, but your “computer savvy friend” was not right.

            Thanks for the source, though. I had been noticing a bunch of people being simultaneously unclear about the difference between temporal antialiasing, temporal upscaling and antialiasing while somehow being extremely opinionated about it, and I had no idea how it was going viral. “r/fucktaa” goes some way towards explaining it. Also why this is probably going to get even more annoying over time as people latch on to yet another random bit of misinfo and run with it.

            If those examples are really from your PC I am kinda curious to know what the hell is going on, though. What GPU are you using? Given the way you’re talking about it I’m assuming you aren’t running any custom post-processing via ReShade or any mods or anything like that?

            • tobis@lemm.ee · 5 hours ago

              Aaaand there we go. Subreddit it is.

              If you had bothered to actually read my post you would realize that I found the sub after the fact.

              FWIW, it’s hard to tell from gifs, but that amount of ghosting and frame-blending is neither TAA nor a normal thing that is happening on everybody’s computers without them noticing.

              This is both reproducible and repeatable. I can reliably make it happen in several games, and it goes away completely when I turn off TAA in all cases. It has done this on all 3 of my previous computers, and it happened on two of my friends’ computers (both of whom insisted it did not) when checked. I’m not running any custom post-processing. All of our cards were Nvidia, so it’s possibly an Nvidia-only thing, but even then the point stands.

              I’m much more inclined to believe the effect I’ve done my due diligence to investigate is real, and that it’s simply too mild in most cases for people to notice, than believe some rude stranger with an uninformed “nu uh” and nothing else.

              If you put some of that effort you put into sounding right into actually being right, you can find many clips of the same effect on youtube.

              • MudMan@fedia.io · 5 hours ago

                It’s not “an Nvidia-only thing”. It’s not a thing at all.

                I mean, ghosting artifacts are a thing. Normally not a TAA thing, they are typically a problem with older upscaling methods (your FSR 1s and whatnot). You caaaan get something like that with bad raytracing denoising, but it doesn’t look like that. And your examples are extreme, so it’s either an edge case with a particular situation and a particular configuration or something else entirely.

                This is one of those wild claims that become hard to disprove by being so detached from reality it’s hard to know where to start. How do I disprove that hundreds of millions of people who have been playing games that use TAA for about a decade are constantly ignoring vaseline-smeared visuals on par with the absolute worst artifacts of early DLSS? I mean, I can tell you I played multiple games today and none of them do that, that I’ve played a ton of Cyberpunk and it doesn’t do that, and that this is not the default state of a very well understood piece of technology.

                It feels weird to even try to be nice about it and bargain. You MAY have stumbled upon a particular bug in a particular config or game. You MAY just be mistaking “TAA” for temporal upscaling and using some weird iteration of it in a worst-case scenario. I mean, if you’re not outright trolling, I can see what you call “too mild in most cases” just being some DLSS ghosting, and you’re lumping several things that cause ghosting together as “TAA”. But all that is just… too much credit to the idea, if I’m being honest.

                I’d still like to know what specific GPUs you’re using and how you set up the games when it “happened” on all those computers. Direct video capture wouldn’t be a bad idea, either. I don’t know why I’m even entertaining this as anything other than some weird videogame iteration of flat earth stuff, but I’m still fascinated by how brazen it is and kinda want to know now.

                • tobis@lemm.ee · 5 hours ago

                  It feels weird to even try to be nice about it and bargain.

                  This is you being nice?

                  The issues with TAA are so widely known, I’m surprised you can be ignorant of them. People in the know generally acknowledge them, but consider the downsides worth it for efficient AA. 99% of the time the effects are much less severe than what I posted, as I had to put in effort to catch a moment that illustrated what was happening while diagnosing it, but once you see it you can’t unsee it.

                  Essentially it’s like if I was talking about screen tearing, and you were arguing that screen tearing didn’t exist because “hundreds of millions of people” weren’t experiencing it. Most people don’t even notice screen tearing until you tell them it’s happening. The TAA blurring is even harder to spot. Also, people ARE experiencing the blurring, which is why enough people talk about it to annoy you. They also have documented evidence of the exact same thing I’m talking about, if you actually cared to look.

                  To be honest I’m not convinced you’re here in good faith and not to troll me, so I’m going to block you and move on. If you actually are curious, google “what are the issues with TAA” and plenty of people will have clips just like mine taken with capture software with their specs and settings listed.

                  • MudMan@fedia.io · 5 hours ago

                    No, there are definitely tradeoffs with TAA. Just… not extreme ghosting trails like the stuff you posted unless something is kinda glitchy. Which is where the weird layers of misinformation seem to be creeping out. You have a layer of people talking about how they find soft looking TAA images annoying and what seems to be an expanding blob of people attributing a whole bunch of other stuff to the thing as if it was the standard, which it absolutely isn’t.

                    FWIW, I took a peek at that subreddit and it’s mostly relatively informed nerds obsessing over maxing out for a specific thing (edge sharpness, presumably) over anything else. I was pleasantly surprised to see they’re not as much of a cultish thing where soft edges or upscaling are anathema and instead they mostly seem interested in sharing examples of places where temporal upscaling works better/worse than TAA.

                    Most of them are doing so in video so compressed it’s impossible to tell what actually looks better or worse, but hey, it’s at least not entirely delusional.

              • MudMan@fedia.io · 7 hours ago

                That’s a very random question asked in a very random place.

                I don’t know, what year are we on and how am I feeling that day? I’ve played thousands of games, “my favorite” is entirely meaningless.

                Currently I’m trying to find time to get through Expedition 33, I just found out that there is apparently a Tetris the Grand Master 4, so I’m messing around with that. I’ve booted up Capcom Fighting Collection 2, and I am staring at the 80 bucks price point on Doom: The Dark Ages and reminding myself I won’t have time to play it for at least a few weeks and should wait. Steam says my most played games are Metaphor: ReFantazio, Dragon Ball FighterZ, Street Fighter 6 and Metal Gear Solid V. Nintendo says it’s Breath of the Wild. I have 100%-ed the Insomniac Spider-Man trilogy twice. I can beat Streets of Rage 2 in speedrun-worthy times and I’ve played through 4 a bunch.

                This is not a question, it’s an existential crisis.

                • mossy_@lemmy.world · 5 hours ago

                  It might be less strange than you think. I’d like to clarify that I’m not trying to take away your gamer creds or anything.

                  I just wanted to know where your taste lies, whether you have a preference for a certain type of graphical design, but it seems like your interests are very broad, with a (current) leaning towards modern games. I haven’t personally tried any of the games on your list because I can’t run them, but they should be fun.

                  • MudMan@fedia.io · 5 hours ago

                    Somebody once told me about cinephiles that “some people really like movies, other people really like the movies they like”.

                    I like games, man.

                    There are very few types of games I outright reject. At most I’ll tell you I’m pretty antisocial and I don’t like multiplayer stuff as much, but it’s not a hard rule.

    • MudMan@fedia.io · 1 day ago

      I’ve stopped acknowledging the term AAA because I’m increasingly convinced nobody knows what they mean when they use it beyond “games that look expensive and I don’t like”.

      I also don’t think there’s that many developers that don’t give “story, mechanics and game difficulty balance” an equal amount of consideration, mostly because those things are typically handled by entirely different people in any production that is bigger than a skeleton crew. It’s not like designers in big studios are just twiddling their thumbs waiting for the rendering engineers to finish the peach fuzz on people’s cheeks.

      The way people perceive opportunity cost in collaborative media is always weird to me.

      • Buddahriffic@lemmy.world · 8 hours ago

        For me, I shy away from AAA games in general because the bigger the studio and the higher the budget, the greater the chance that there are MBAs involved who will push design decisions that favour making more money over making a good game.

        I think some people correlate that with graphics, maybe because the diminishing returns on effort put into graphics mean those amazing graphics could have come at the cost of time spent on the gameplay elements, though I don’t personally think a great game and great graphics are mutually exclusive.

        • MudMan@fedia.io · 7 hours ago

          This view of things, where a guy in a suit is telling a bunch of passionate artists how to do their day-to-day jobs, is entirely detached from reality.

          Don’t get me wrong, there are plenty of big, mercenary operations out there, but this is a) not how those play out, and b) very easy to suss out without needing a roll call of the studio.

          Case in point, Baldur’s Gate 3, which people keep weirdly excluding from “AAA”, was made by a large team that ballooned into a huge team during the course of development. None of that ever seemed like a budget, size or graphics problem to me. Or a matter of what degree people in the studio happen to have, for that matter.

          If you don’t want to play hyper-compulsive live services built around a monetization strategy, that is perfectly legitimate. Gaming is very broad and not everybody needs to like everything. It’s just the categorizations people use to reference things they supposedly dislike that seem weird to me.

      • drosophila@lemmy.blahaj.zone · 1 day ago (edited)

        It’s not like designers in big studios are just twiddling their thumbs waiting for the rendering engineers to finish the peach fuzz on people’s cheeks.

        This is true to an extent, but the visual fidelity of a game also determines how much work it is to author assets and (more importantly) the interactions between those assets.

        If I want to make a new enemy in DOOM I have to make a series of 128x128 sprites that show 2-3 frames of walking animation from a few different angles, then add some simple programming for its abilities and AI and I’m done.

        If I want to do the same thing in a game with high visual fidelity I have to make a 3D model, rig it and make animations, worry about inverse kinematics, and make a bunch of textures and shaders to go on the model. And for anything extra I want the AI to do, or any extra gameplay element I want it to interact with, I have to worry about most of that stuff all over again.

        For example, let’s say I want to add a gun that freezes enemies to both types of games. In the case of the DOOM-like game I can make a semitransparent ice texture and overlay it on all the sprites I’ve already made to create new textures, although I could probably get away with just tinting them blue. Then I have to change the enemy code to make their AI and animations freeze when hit with the freeze gun and swap their sprites to the frozen textures.
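        To make that concrete, here is a minimal sketch of the sprite version of the feature (every name is made up, it’s just to show how little surface area it touches):

          from dataclasses import dataclass, field

          def frozen_tint(rgb):
              r, g, b = rgb
              # Cheap "iced" look: darken and push toward blue instead of
              # authoring a new sprite set.
              return (r // 2, g // 2, min(255, b + 80))

          @dataclass
          class SpriteEnemy:
              frames: list = field(default_factory=list)  # walk frames per angle
              frozen: bool = False

              def on_freeze_hit(self):
                  # The whole feature: tint the existing art, set one flag.
                  self.frozen = True
                  self.frames = [[frozen_tint(px) for px in frame]
                                 for frame in self.frames]

              def update(self, dt):
                  if self.frozen:
                      return  # AI and animation simply stop
                  # ... normal think/walk/attack logic ...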

        If I want to do that in a high visual fidelity game I have to think about how I’m going to cover the character in ice in the first place if I want the ice to be a 3D model. Sure, I can freeze their animations pretty easily, but if they can be in any pose (including a pose generated by inverse kinematics) when they get frozen, I’m going to have to write a system to dynamically cover the model in ice crystals. I’m also going to have to author materials and shaders for the ice, and worry about what that looks like in combination with the existing materials in different lighting conditions, not only for that enemy but for all the ones that can be frozen.
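        The same feature in the 3D case, sketched as stubs just to enumerate the systems it has to touch (again, every name here is hypothetical, not any real engine’s API):

          # Each stub below is a separate workstream in a high-fidelity pipeline.
          class FreezeEffect:
              def __init__(self, ice_material):
                  self.ice_material = ice_material  # authored shader/material work

              def apply(self, enemy):
                  pose = enemy.skeleton.current_pose()  # may be IK-generated, arbitrary
                  enemy.animator.pause()                # halt state machine and IK solver
                  enemy.ai.suspend()
                  shell = self.grow_ice_shell(enemy.mesh, pose)
                  shell.material = self.ice_material    # must hold up in every lighting setup

              def grow_ice_shell(self, mesh, pose):
                  # Fit crystals procedurally to an arbitrary skinned pose: the
                  # part with no sprite-era equivalent, and the expensive one.
                  raise NotImplementedError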

        This sort of stuff is mitigated somewhat by modern tooling, and mitigated even more by the production pipelines that large studios have, but it’s these same production pipelines that make impossible the sort of drive-by development and flexibility that you saw in the creation of a lot of earlier games like DOOM, Thief, and Half-Life 1. (And when lots of changes are made late in development you usually end up with horrific crunch and a bad game at the end.)

        Ultimately there’s a reason that low-budget indie games gravitate towards pixel art or low-poly art styles. Sometimes an indie game will come out that is very high fidelity, but these will generally be walking simulators with no visually present human characters (so, no interactions or characters to animate). And it’s for the same reason that games with the most complicated gameplay interactions (Dwarf Fortress, NetHack, etc.) or the most highly branching storylines (Fallen London) tend to be text based.

        EDIT: A couple of real world examples:

        This is a retrospective on the development of Anthem. While obviously a lot of bad things were going on there, including a lack of leadership or clear vision, I can’t help but think of that as an example of old style drive-by-development running up against its limits. Every time they’d change the traversal mechanic it would invalidate the world design another team already did, and how many other mechanics are tied to level design? Any time one part of the team couldn’t decide on what they wanted to do, other parts also had to redo their own work, and as a result they spun their wheels for years.

        My other example is this developer. He’s clearly passionate about his work and wants it to look as good as possible, but I worry that he doesn’t realize what a can of worms he’s opened by setting that standard for animations now. It’s not that he spent months learning how inverse kinematics works, it’s that he’s now committed to making sure every single enemy looks good when interacting with anything it can encounter, so that nothing stands out as janky or unfinished compared to its peers. A problem that’s made worse by the fact that it sounds like you can take control of any enemy and take it anywhere, meaning anything can interact with anything in the entire game. He has already run into a problem with his manticore enemy not being able to fit through doorways and has started talking about making an IK-driven animation where it tucks its wings in as the player approaches the door. What’s gonna happen if the player presses the fly or jump button when that’s happening? Now multiply that question by every gameplay mechanic, for every enemy in the game, and every weird situation that might require a custom animation.

        • MudMan@fedia.io · 19 hours ago

          This somewhat wall-of-text-y post is most interesting for going back to my original point about how this argument has been in place for decades with little change.

          I mean, seeing Half-Life 1 pop up as an example of drive-by game development is interesting, given that it’s THE game that debuted highly scripted interactive narrative sequences. I wasn’t there, but I can only imagine the sort of planning it must have taken to figure out the complex sequences where AI entities do things in tandem with the player. I do know for a fact that it took a long time to rig any sort of map for HL1 (because they did provide the dev tools for modding and I did mess around with them). HL1 was an early example of a game whose enemy AI required level designers to manually set up navigation nodes on the geometry (because those flippy ninja enemies later in the game needed to know where they could jump and take cover). I remember it being so cool but noticeably harder to do in your spare time than, say, the old Quake stuff.

          Anyway, nobody is saying there aren’t costs to complexity, but the focus on visual fidelity as a linear relationship to gameplay complexity is at best reductive. The problem is that end users can see graphics. So if something looks nice but just hasn’t figured out core aspects to gameplay it’s easier to assume one thing drove the other rather than, say, not having spent enough time prototyping, or having trend-chasing leadership change their minds multiple times about fundamental aspects of the game because something new got big last month or whatever other behind the scenes stuff may have sent development off the rails.

          I don’t have time to go over the other guy’s stuff, but I will say that I respect anybody who tries to do development with their whole ass out in public, especially if they’re learning. I get anxious just thinking about it.

          • drosophila@lemmy.blahaj.zone · 18 hours ago (edited)

            I don’t mean to completely disagree with you; I do think that graphics actually get somewhat of a bad rap. It’s just that the tradeoff is real, even if it doesn’t really scale linearly.

            I mean, seeing Half-Life 1 pop up as an example of drive-by game development is interesting

            It’s true that Half-Life 1 marked the turning point from systems-focused games to content-focused games with its scripted sequences and setpieces. It’s also where Valve created the “cabal” development process, which was supposed to be more organized than the development of games like Quake.

            I included it mainly because in the Half-Life 1 making-of documentary, the texture artist mentioned that whenever she made a new texture, oftentimes a level designer would just grab it and use it simply because it was new and exciting. The problem was that if every level used every texture then they started to look same-y, so she actually had to start labeling the files as groups and tell people to avoid mixing them together. And this was supposed to be more organized than earlier games lol, you can imagine how thrown together those must’ve been.

            I’m reminded of a similar story from the development of Deus Ex 1. There’s one level where you walk around an abandoned mansion searching for clues. Unlike the rest of the game, which is mainly stealth / fps, in that level you have to explore and solve a puzzle while listening to an NPC talk about her childhood growing up there. Apparently the guy making the level had to yell at his fellow level designers to stop adding enemies to the rooms of the mansion.

            Anyway, sorry if my posts are really long and rambling, I just like talking about games.

            • MudMan@fedia.io · 18 hours ago

              I mean, that all depends on the game, I suppose, but you’re assuming that sort of anecdotal interaction doesn’t happen now, when I’d argue it may just be that games don’t put out as much behind-the-scenes content right away as other media do.

              And if something killed it (big if), it was producers getting good, rather than graphics.

              Look, games have changed a lot, nobody is denying that. The spaces for flexibility and creativity have moved around in some places, but not necessarily disappeared. It’s also true that games are more diverse now than they used to be. There are also more of them, by a lot. I have a hard time making uniform blanket statements about these things. Which of course also makes it hard to push back against inaccurate but simple, consumable statements like “the pores on characters’ skins killed fun gameplay” or whatever.

              But there are tradeoffs in lots of directions here. People are talking a lot this month about Expedition 33 getting to those visuals with a small team by effectively using Epic’s third party tech very directly. That’s not wrong. You may have more moving pieces today, but you also don’t have to build a whole engine to render them or code each of them directly instead of having tools to set them up.

              I wish the general public was a bit more savvy about this, though, because there are plenty of modern development practices and attitudes that deserve more scrutiny. It loops back around to the behind the scenes access, though. Nobody has time or interest in sitting down and arguing about prototyping, or why the modern games industry sucks at having any concept of pre-production or whatever. Gamers have always been quick to anger driven by barely informed takes in ways disproportionate to their interest in how games are actually made, and that part of the industry hasn’t changed.

              • drosophila@lemmy.blahaj.zone · 17 hours ago

                You may have more moving pieces today, but you also don’t have to build a whole engine to render them or code each of them directly instead of having tools to set them up.

                That’s definitely true. Even with my ice gun example there’s actually a system in UE5 that does exactly what I was talking about with the 3D ice crystals (though, whether it works for animated objects with deforming meshes I don’t know).

      • luciole (he/him)@beehaw.org · 1 day ago

        Hey, love your take on this. Can you tell us more on what is “opportunity cost in collaborative media” and how it relates to games? I don’t understand these words, and I’d rather get the explanation from a human being :3

        • MudMan@fedia.io · 19 hours ago

          Right. Sorry about the overly nerdy way of saying that.

          Basically, people tend to think that any time or effort spent doing one thing means time and effort not spent doing another thing. So good graphics means they took time away from designing the gameplay or whatever.

          But that’s not how it works. In big projects everybody tends to be a highly specialized expert. So gameplay people are doing gameplay all the way through, modellers are modelling all the way through, rendering programmers are working on rendering features all the way through and so on.

          When you have big teams like that it can actually become harder to make sure everything is coordinated and organized so that nobody has to sit and wait for anybody else. Complexity does grow and changing things does become much harder, just… not in the ways people often think. It’s not like there are freely interchangeable “game units” that you can choose to spend in either gameplay or graphics but not both, you know?

      • Catoblepas@lemmy.blahaj.zoneOP · 1 day ago

        It’s not like designers in big studios are just twiddling their thumbs waiting for the rendering engineers to finish the peach fuzz on people’s cheeks.

        Okay! I don’t think that either. I think they’re underpaid and overworked like virtually everyone in the games industry and unable to put out quality products because of arbitrary deadlines. That kind of thing is much more common with AAA games (which studios don’t seem to know how to define either, given that now they’re going on about AAAA and AAAAA games) than it is with indie games.

        • MudMan@fedia.io · 1 day ago

          I’m gonna need some sourcing for that assertion, because man, there are no developers in the gaming industry more underpaid and overworked than indies living in a friend’s garage, working two jobs and coding all-nighters on a passion project.

          Crunch horror stories are real, but big “AAA” devs are more likely to have some type of overtime policy they can adhere to and a decent compensation package.

          I’d argue about arbitrary deadlines, too, but it’s a case by case basis there. In any case, both indies and larger devs are often working to the same type of deadline, that being “we’re running out of money”.

          • Catoblepas@lemmy.blahaj.zoneOP · 1 day ago (edited)

            Do you know people working in gaming or are you working in it yourself? Because “just ask for overtime and you’ll get it without any repercussions” absolutely doesn’t match the experience of anyone I know. Especially since people tend to jump from big studio to indie, not the other way around, for quality of life reasons.

            • MudMan@fedia.io · 1 day ago

              People tend to jump from big studio to indie by way of either getting laid off or having a game they want to do that won’t get greenlit in a big studio (mostly because very few people get to even bring up projects to a greenlight process in the first place).

              Working for ages with next to no financial security on the off chance that you pull off a minor miracle and get an investor backing you, or make your startup money back, is hardly what I call “quality of life”. Best case scenario you have the investment already lined up on your way out of a big dev, but that is getting harder these days.

              On the other question, if I wanted to share my resume I would not post under a pseudonym, so apply your best judgement there.

              I’ll say this, though, if that counts for something. I am NOT in the US.

              • Catoblepas@lemmy.blahaj.zoneOP · 1 day ago

                I wasn’t asking for personal information beyond whether or not you’re in or adjacent to the industry, or anything I hadn’t already shared myself, peace ✌️

                • MudMan@fedia.io · 1 day ago

                  Oh, I take no offense at the question, I just don’t like disclosing even trivial stuff. Even stuff you can sort of reverse-engineer from my post history. It’s more habit than anything else at this point.