I believe the article is based on a flawed assumption. The author argues that everything could look more realistic and that VFX could pop more with stronger HDR, but in my opinion it makes a lot of sense for a stylized cartoon game to stay stylized in its brightness choices as well.
When you drive towards the sun, what is more fun? A realistic HDR brightness that blinds you, or a „wrong“ brightness level that helps the background stay in the background without interrupting your flow? Similarly, should eye candy like little sparks grab your attention by being the brightest object on screen? I’d say no.
The hardware can handle full HDR and more brightness, but one could argue that the game is more fun with incorrect brightness scaling…
MBCook 66 days ago [-]
That’s not the problem.
The game should look like a normal Mario game at a minimum. It should use its additional color palette available in HDR to look better, and the additional brightness to make effects pop as you describe.
The problem is that’s not what it’s doing. Some things pop better, but it’s not because they’re using extra colors. It may be a bit of extra brightness, but mostly it’s that everything else just got toned down, so it looks kinda washed out.
If they did nothing but use the expanded color palette and did not use the additional brightness at all I would be a lot happier than with what we have right now.
I haven’t turned it back to SDR mode but I’m legitimately considering it. Because I suspect the game looks better that way.
BoorishBears 66 days ago [-]
Mario Kart World generally looks more saturated than previous games if anything, so it definitely meets that minimum.
And the article is about how they missed out on the optionality of using the additional gamut, but that additional gamut wouldn't intrinsically look better.
It's easy enough to edit a screenshot to show us what could have been, but even in that single screenshot there are things that look worse: like the flames gained saturation but lost the depth the smoke was adding, and some reasonable atmospheric haze vanished.
(similarly the game in the side-by-side has some downright awful looking elements, like the over-saturated red crystals that punch a hole through my HDR display...)
Given Nintendo's track record for stylization over raw image quality, I'm not sure why this isn't just as likely them intentionally prioritizing SDR quality and taking a modest-but-safe approach to HDR... especially when the built-in screen maxes out at 450 nits.
zapzupnz 65 days ago [-]
> Mario Kart World generally looks more saturated than previous games if anything, so it definitely meets that minimum.
Compare any of the retro tracks to their World counterpart, then say that again. The game’s general palette and design is so washed out and bland compared to the DS, Wii, 3DS, and Tour versions of those tracks.
BoorishBears 65 days ago [-]
I mean you included a title that launched on a frontlit TFT with 18-bit color and maybe 60% sRGB coverage, for example.
If there's a track that's actually less saturated than it was then, it's definitely not the result of an SDR-first workflow.
arghwhat 66 days ago [-]
> It should use its additional color palette available in HDR to look better, and the additional brightness to make effects pop as you describe.
It could, but "could" is different from "should". There could be good reasons for it, such as not wanting the primary gameplay to lie outside the color palette available to people playing on their TV in sRGB, or in the common complete-shit HDR modes that lie about their capabilities.
It would be neat if more was used, but nothing about being HDR means that you should rely on the maximum capabilities, or that it's even a good idea to.
> I haven’t turned it back to SDR mode but I’m legitimately considering it. Because I suspect the game looks better that way.
To be honest, without a TV with proper HDR, SDR mode will often look much better. The problem is that TVs are often quite awful when it comes to color volume, specular lighting and calibration. The SDR mode is often very untrue to the content, but stretches things within the TV capabilities to make it bright and vivid to look nice. The HDR mode on the other hand has to give up the charade, and in particular SDR tonemapped content, which if the TV didn't lie would have looked identical to SDR mode, looks really awful.
A favorite of mine is to switch Apple TVs between SDR and (tone-mapped) HDR mode and see how different the main menu and YouTube app look. I have yet to find a TV where the UI doesn't look muted and bland in HDR.
_carbyau_ 66 days ago [-]
Every game I first start requires a trip to turn off music, in-game VoIP, HDR, bloom, lensflare, screenshake if possible.
It's like a keyword bingo for usually poor implementations. I grant that maybe the implementation is good for any specific game you care to mention - but history has shaped my habits.
JohnBooty 65 days ago [-]
You automatically turn off music?
The presence of in-game music is a poor implementation indicator?
I am a big proponent of "there's no wrong way to enjoy a game" but wow. In nearly 50 years of gaming that's a new one for me. Congratulations. But do whatever works for you... and I mean that sincerely, not sarcastically or dismissively.
_carbyau_ 65 days ago [-]
Clearly, the mere presence of any of these things is not an indicator of the overall game's implementation. I mean, trying to avoid the mere presence of these options would severely limit game choice.
But any one of these aspects can individually be crap and often are.
If I have a need for ingame VoIP I'll turn it back on. But I don't want to default to hearing randoms on the internet espousing bullshit using mic-always-on.
If it turns out the game is good with music, I'll turn it on. Rocket League works.
If one of my friends is raving about the graphics, I'll turn HDR or bloom on - though I haven't ever agreed with them enough to leave it that way.
So by default, yes I turn those things down or off before I even start playing.
Further detail on music. It's often poorly implemented: repetitive, sounds similar to game sound effects, changes and gets louder when there's a rush or a horde. All distracting and immersion-ruining.
I quit playing Day of Defeat because of the end of round music that I couldn't do anything about. I either couldn't hear footsteps next to me (part of the gameplay) or I was deafened with the end of round blaring. I don't have time to put up with poor UX when there are plenty of other games to play.
As I get older and find it harder to discern different sounds it is just easier to focus on what I want - the game and gameplay. It's the same thing as drivers turning down the stereo when they want to pay attention to finding a park or driveway - it's a distraction to what I'm here for.
I like music, and like to listen to it, either as a dedicated thing to do or while doing house chores or cruising long distances in the car. But generally not while gaming.
Thankfully so far, these are still options I can set to taste. I guess when the "the way it's meant to be played" crowd takes over I'll do something else with my time.
zapzupnz 65 days ago [-]
I can’t believe that in 50 years, you’ve been unable to imagine people disabling music. I turn it off in games like Minecraft because it serves no purpose, and my husband routinely disables it in Factorio. Then you’ve got people with hearing difficulties who would rather hear the sound effects, as they help gameplay.
Truly, in 50 years, none of this has occurred to you or you’ve never witnessed it in friends or family? That seems hyperbolic.
roguetoasterer 65 days ago [-]
That's not what JohnBooty was responding to, though. Turning off game music because it's not doing much for you or because you have other things you'd rather listen to is not at all unusual. Listing game music alongside poorly implemented features that get turned off by default before you even start playing the game is, yes!
_carbyau_ 65 days ago [-]
I have a long rant responding to JohnBooty but ... what is it that game music is meant to do for me most of the time?
I understand that it helps some people get "into the flow" or something. But I don't have an issue with that. I occupy my mind trying to grasp the mechanics and be good at using them to play. If the gameplay doesn't have enough then papering over it with music doesn't do it for me.
And I'm not always looking for "intense" stuff. I like to chill out too. But I've played quite a few games over the years and so the gameplay has to have something to keep me entertained.
I enjoy music with Rocket League because I don't play competitive and so some music playing while I'm hanging out with others on my couch shooting-the-shit as it were is fine. It's more of a "social setting music" than "game music".
After all these years, I don't miss it at all.
zapzupnz 65 days ago [-]
I don't think it's unusual at all, depending on what genre of games they play. For example, they may be thinking of the likes of many PC strategy games where the music just comes in seemingly at random and distracts, or completely ruins the ambience that the game world is trying to set.
JohnBooty 65 days ago [-]
> That seems hyperbolic
I think maybe you skimmed instead of reading. I'm referring to the parent poster's practice of turning music off ASAP on first boot as standard operating practice, because they find it to be part of what they consider "keyword bingo for usually poor implementations". Here's what they said:
> Every game I first start requires a trip to turn off music, in-game VoIP, HDR, bloom, lensflare, screenshake if possible.
> It's like a keyword bingo for usually poor implementations
But as you suspected: yes! I can imagine turning off game music, in general. Thank you for believing in me!
I have turned the music off in many games.
user____name 64 days ago [-]
I also often have to turn off music because, for some weird reason, a lot of games have stopped providing a slider for music volume and replaced it with a checkbox. Game music is often too loud for my taste; I like it to remain in the background of the mix.
Infernal 66 days ago [-]
Don’t forget motion blur!
3eb7988a1663 66 days ago [-]
Motion blur is my only instant disable. I paid a lot of money and self-justification for this expensive GPU, and you want to make things look a blurry mess?
jasonjmcghee 64 days ago [-]
Motion blur can look fantastic / realistic. But almost never does. Especially when it's temporal.
_carbyau_ 65 days ago [-]
I had forgotten it for my comment, but not when I see it in the graphics options! Yes, that is an absolute off switch every time.
animal531 66 days ago [-]
I agree on the HDR blinding you, it's mostly just a gimmick. Where it works is, for example, in Diablo 4 when there is something really shiny lying on the ground, which makes it pop out.
Other games like Senua did actually manage to pull off an amazing sun/scene though. Because it's slower, they can use it to accentuate: for example, you walk around a corner and go from the darkness into full-on sunlight which is blinding, but then it falls off so as to become bearable.
mcdeltat 65 days ago [-]
Not to detract from your point, but ironically if they created the design with a more "HDR" (i.e. physically based) colour pipeline, it would probably be easier to tune the brightness of the sun correctly...
sim7c00 66 days ago [-]
HDR is a scam. You can do exactly the same with SDR. It's just a different format to express things.
The monitor can display what it can display. The transfer format doesn't change hardware capabilities, just how you express what you want to the hardware.
freeone3000 66 days ago [-]
SDR literally has fewer brightness bits. You can tonemap with the darkest point in a scene pulled to zero and the brightest to 1, but this is not usually what the color grader intended, so it isn't done automatically. And when you do this, you end up with gaps in the middle (like 18-bit color). So usually, a range "in the middle" is reserved for SDR, and the extra two (or four, or six) bits of brightness in HDR allow you to pull it darker or brighter.
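A rough numeric sketch of that "reserved middle" idea, using the PQ curve HDR10 uses (SMPTE ST 2084). The exact split depends on mastering choices, but 100-nit SDR white only lands around halfway up the 10-bit code range, leaving the upper codes for brighter highlights:

    # Where SDR-ish luminance levels land inside a 10-bit PQ (HDR10) signal.
    # PQ constants from SMPTE ST 2084.
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

    def pq_encode(nits: float) -> float:
        """Map absolute luminance (0..10000 nits) to a 0..1 PQ signal value."""
        y = max(nits, 0.0) / 10000.0
        return ((c1 + c2 * y ** m1) / (1 + c3 * y ** m1)) ** m2

    for nits in (1, 100, 203, 1000, 10000):
        code = round(pq_encode(nits) * 1023)  # 10-bit code value
        print(f"{nits:>5} nits -> PQ code {code:>4}")
    # 100-nit SDR white comes out near code ~520; everything above that is
    # the extra headroom an HDR signal has for brighter content.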
sim7c00 62 days ago [-]
thanks for your insights. my comment was overly cynical. in practice though, visually, on a good (really good) monitor, you really cannot see a difference. trained eyes cannot really see a real difference. the difference becomes visible more due to a monitor's own quirks not displaying accurately (mapping of this sdr/hdr to colourspace).
this was posted before here on HN and i don't think it's wrong. though of course, technically, there are differences, which might in some applications (but usually are not) be exploited to the viewer's benefit. (?? maybe?? someone with real good eyes :))
https://yedlin.net/DebunkingHDR/index.html
maybe my interpretation is wrong, but i don't think it is far off if it is. specification differences and differences in human perception are not the same thing
brookst 66 days ago [-]
Would you say 8 bit audio can express the same audio quality as 16 bit audio?
BobaFloutist 65 days ago [-]
It is weird, though, that they're different modes and you have to choose on both the device and the software which to use.
Imagine if every individual song, speaker/earbud, and MP3 player had a different implementation of 8/16 bit music, and it was up to you to compare/contrast the 8 variations to decide if 8 or 16 bits was more auditorially satisfying.
You get this with like levels and mixers, but not with such a fundamental quality as definition. Sure, you can chase hi-fi, but I feel like music has less artificial upscaling and usually hi-fi is a different product, not a toggle buried in various menus.
I don't know, it's kind of crazy.
brookst 65 days ago [-]
It’s the transport. Imagine if the D/A was in the speaker; you might set params differently then.
kalleboo 65 days ago [-]
We kind of did have an equivalent situation back in the day - you used to have A-law vs μ-law
sim7c00 62 days ago [-]
32-bit audio is higher resolution; it can have effects applied to it in higher detail, like subpixel coordinates might more clearly specify a pixel's colours...
anything beyond 16 bits (which is streamed to your speakers) is not audible though, in the sense of playing it to the speaker. it matters in processing (dsp). (at 44 or 48kHz, I guess?)
ac29 65 days ago [-]
Super Audio CD used 1 bit audio (at a very high sample rate) and sounded just fine
jldugger 66 days ago [-]
> But when Gamers in ESA surveys report that the quality of the graphics being the #2 factor in deciding when to purchase a game
Somehow I doubt this survey is representative of the typical Mario Kart player. And to those for whom it is a concern, I don't think SDR is high on the list relative to framerate, pop-in, and general "see where I'm going and need to go next" usability.
Loughla 66 days ago [-]
You are exactly right. I don't care if it's all blocks and squares. As long as I can not lag and see enough to destroy my children at the game.
mcphage 66 days ago [-]
> see enough to destroy my children at the game
That really is the joy of Mario Kart. You think you’re going to beat me, kid? You’re 12 and I’ve been playing Mario Kart for 30 years.
(And then they do… oof)
aranelsurion 66 days ago [-]
I personally find HDR to be one of the most impactful recent display technologies, but there are plenty of hoops to jump through before you get to that experience.
You need a real HDR display (800 nits+), FALD or OLED for contrast, some calibration, and software that uses it well (really hit and miss at least on Windows).
Once all the stars align, the experience is amazing. Doom Eternal has one of the best HDR implementations on PC, and I suggest trying it once on a good display before writing HDR off as a gimmick.
There’s something about how taillights of a car in a dark street in Cyberpunk look, and that just can’t be replicated on an SDR display afaict.
Then you have some games where it’s implemented terribly and it looks washed out and worse than SDR. Some people go through the pain and mod them to look right, or you just disable HDR with those.
I’d vouch for proper HDR any day, that being said I wouldn’t expect it to improve Mario Kart much even with a proper implementation. The art style of the game itself is super bright for that cheery mood, and no consumer display will be able to show 1000nits with 99% of the frame at full brightness. It’ll likely look almost the same as SDR.
tokinonagare 65 days ago [-]
On the other hand, I find it the lamest and most annoying "feature" of the last 10 years. The article itself is a good demonstration (at least on my M1 MBP): as soon as a single row of pixels from an image is displayed, the luminosity of the whole page fades, and the reverse happens when no image is in sight. The comparison video is the first time ever I've seen the tech doing anything other than changing the luminosity of the screen.
qingcharles 65 days ago [-]
This is the problem we have right now. Cheap nasty displays that put HDR on the box and are totally trash. Most people aren't buying $4000 Sony OLED TVs, they're buying the $300 SANGUULQ brand on Black Friday sale at Wal-mart.
One reason for keeping Apple hardware around is a decent display test bench. I do the best I can with image work, but once it leaves your hands it's a total lottery.
cco 65 days ago [-]
HDR means my phone will blind me at night and my laptop screen will be super bright in some parts and dim on others.
So far this is my experience of HDR.
freetime2 66 days ago [-]
Fair point I suppose, but honestly I don't really care. The game looks plenty bright and colorful and Mario-y, and I certainly never stopped to notice banding in the sky or lack of detail in the clouds.
There are about a thousand other things in any given game that matter more to me than HDR tone mapping, and I'm happy for developers to focus on those things. The one exception might be a game where you spend a lot of time in the dark - like Resident Evil or Luigi's mansion.
Looking at his example video where he compares Godfall Ultimate footage to Mario Kart - I quite dislike the HDR in Godfall Ultimate. Certain elements like health bars, red crystals, and sparks are emphasized way too much, to the detriment of character and environment design. I find Mario Kart to be much more tasteful. That's not to say that Mario Kart World couldn't be better looking in HDR, but the author doesn't really do a compelling job showing how. In the side-by-side examples with "real" HDR, I prefer the game as-is.
braiamp 66 days ago [-]
There's someone who actually did the work of figuring out how to make the Switch's HDR work, but it needs your display to support certain features to be correct: https://www.youtube.com/watch?v=X84e14oe6gs
MBCook 66 days ago [-]
That only helps some. I have a display that supports that feature but I still can’t get it to look right.
It’s a little better than I had it set. But it’s still a problem. As this article shows, it just wasn’t designed right.
mortenjorck 66 days ago [-]
The most unpleasant effect from cut-rate HDR is when graphics with bright backgrounds get lazy-mapped to HDR.
Perhaps the worst offender I've ever seen was the Mafia remake by Hangar 13, which loads every time with a sequence of studio logos with white backgrounds that cut from black. The RGB(255,255,255) backgrounds get stretched to maximum HDR nits, and the jump from RGB(0,0,0) (especially on an OLED) is absolutely eye-searing.
I literally had to close my eyes whenever I'd load the game.
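For a sense of the numbers (the 1000-nit peak here is an assumption, not a measured value): a lazy stretch maps sRGB code 255 straight to the panel's peak, while mapping it to the ~203-nit graphics white that ITU-R BT.2408 recommends keeps a white logo card at a sane level:

    # Why a lazily expanded logo screen sears the eyes: naive "SDR stretch"
    # vs. mapping SDR white to a reference level (ITU-R BT.2408 uses 203 nits).
    PEAK_NITS = 1000        # hypothetical panel peak
    REF_WHITE_NITS = 203    # BT.2408 graphics/reference white

    def srgb_to_linear(code: int) -> float:
        c = code / 255.0
        return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

    for code in (0, 128, 255):
        lin = srgb_to_linear(code)
        print(f"sRGB {code:>3}: naive stretch -> {lin * PEAK_NITS:6.1f} nits, "
              f"reference white -> {lin * REF_WHITE_NITS:6.1f} nits")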
xnx 66 days ago [-]
Even worse when you're in a dark room. The white flash when loading many otherwise dark mode websites and apps is the worst.
baobun 66 days ago [-]
At least for Firefox, having the browser chrome set to dark (typically by having it picked up from GTK theme preferences) should make the default page background dark too. And I think userchrome.css to override default bgcolor should still work.
Of course there are individual wonky sites which will still flash but if applicable, those two things should reduce the occurrences significantly.
theshackleford 66 days ago [-]
> (especially on an OLED)
Why would it be any more impactful on OLED than any given FALD display capable of putting out >1000 nits sustained?
noname120 66 days ago [-]
Contrast ratio
theshackleford 66 days ago [-]
> Contrast ratio
Perceived intensity in HDR is dominated by luminance, not just contrast ratios on paper.
OLEDs do have effectively infinite contrast (since black pixels are off) and it's why I love them, but that doesn’t inherently make white flashes more intense on them than on any other display type unless the peak brightness is also there to support it.
Or in other words, an 800-nit flash on OLED is going to be less intense than a 1600-nit one on a FALD LCD. Brightness is the bigger factor in how harsh or impactful that flash will feel, not just the contrast ratio.
It's not down to your panel technology in this case, but the limitation of any given panel's peak and sustained brightness capabilities.
noname120 66 days ago [-]
That comment was of course for a given nit level. The question doesn't make sense otherwise. Your pupils dilate when the overall ambient lighting is low. If there is no backlight then there is less ambient lighting, so your pupils are dilated to the maximum and a jump to a given luminance level blinds you more than the same luminance would on a screen with a backlight.
theshackleford 66 days ago [-]
FALD screens showing this kind of imaging would have the backlighting off in the dark sections.
This kind of scenario is in fact where FALD is strong. OLED really starts to pull ahead in more complex scenes where the zone counts simply can’t match up to per pixel control.
SirMaster 65 days ago [-]
I hate FALD though. I have yet to see a FALD LCD that doesn't exhibit distracting blooming around the brighter areas. Even the latest TVs with ~10,000 zones still aren't good enough to my eye.
theshackleford 65 days ago [-]
Fair enough, I get it. I was literally that person once myself; there was a time I had only OLEDs in my house and refused to use LCDs full stop. But this wasn’t about likes or dislikes. If it were, however...
I love OLED's motion clarity (I still use CRTs, that’s how much I care). But I dislike its inability to maintain brightness at larger window sizes, and VRR flicker is a dealbreaker across the board.
My FALD display, on the other hand, delivers the most jaw-dropping HDR image I've ever seen, producing an image so spectacular it's the biggest graphical/image jump I've seen in greater than a decade, but its motion resolution is garbage and yes, in some specific content, you'll get some minor blooming. It's nice that it doesn't have VRR flicker though.
My OLEDs win for motion, my FALD wins for HDR + lack of VRR flicker; it's very unfortunate that there’s no perfect display tech right now. Since most of my content is bright (games), I’m happy to trade some blooming in starfields for overall better performance across the other 80% of content. Other people's content will differ; perhaps they love horror games and will choose the OLED for the opposite reason, and I'd get that too.
I still use OLED in the living room though. It doesn't see the kind of abusive usage my monitors do, and OLED TVs are way brighter than OLED monitors. Not as bright as I'd like, but bright enough that I'm not gonna go out and replace it with a FALD, not until the new Sony based stuff drops at the very least.
https://www.theverge.com/news/628977/sony-rgb-led-backlight-...
HDR video/images in the macOS/iOS browser have to be one of the most confusing UX in recent memory.
Why are people able to craft an image/video that bypasses my screen brightness and color shift settings?
If I wanted to see the media in full fidelity, I wouldn't have my screen dimmed with nightshift turned on in my dark bedroom.
It's not OP's fault. My mind is just blown every time I see this behavior.
gloxkiqcza 66 days ago [-]
I consider this to be a bug. I want to view HDR content with brightness differences scaled in a way that the maximum value barely, if at all, exceeds the set display brightness.
The fact that it just goes full 1600 nits blast when you view a video from a sunny day is a terrible UX in most cases and it’s the reason why I have HDR turned off for video recording, even though I might miss it later. To make matters worse, it also applies to 3rd party views.
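A minimal sketch of the behavior I'd want instead (all numbers made up): scale the content so its brightest pixel lands only slightly above the brightness I've actually set, rather than jumping to the panel's full peak:

    # Scale HDR highlights relative to the user's chosen brightness instead of
    # blasting the panel's peak. USER_BRIGHTNESS_NITS and HEADROOM are
    # placeholder values, not anything a real OS exposes under these names.
    USER_BRIGHTNESS_NITS = 150   # what the brightness slider is set to
    HEADROOM = 1.25              # let highlights exceed it only slightly

    def scale_to_user_brightness(pixel_nits: float, content_peak_nits: float) -> float:
        target_peak = USER_BRIGHTNESS_NITS * HEADROOM
        return pixel_nits * min(1.0, target_peak / content_peak_nits)

    # A 1600-nit highlight in a sunny clip gets pulled down to ~187 nits:
    print(scale_to_user_brightness(1600, content_peak_nits=1600))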
eclipxe 66 days ago [-]
I really enjoy it
bastawhiz 66 days ago [-]
This is what happens on my Pixel 9 Pro. I'll scroll through some HDR photos and have to turn down my brightness so I'm not blinded in bed. Then I have to turn it back up once I've scrolled past. It's maddening, I don't want to manually adjust brightness because someone took a picture outside.
eclipxe 66 days ago [-]
No I love this behavior. Makes the content really stand out and more enjoyable to view.
rideontime 65 days ago [-]
Then you are free to turn up your brightness.
hombre_fatal 65 days ago [-]
You might be saying that you'd prefer if all media were at full nits and color fidelity. But that's not what's being discussed here.
The current implementation means that only the occasional image/video behaves that way, and only if it were crafted that way.
twoodfin 66 days ago [-]
The intrinsic research and analysis behind this article are great. I’m having a hard time, though, not tripping over the obvious (tell me I’m wrong!) ChatGPT “polish”. Multilevel tutorial outlines, bolded key points, “X is Y—not Z”, …
I can’t articulate why it bothers me. Except maybe the implied assumption that the author’s real voice & style benefit more than they are harmed from being submerged in what is ultimately mathematically derived mush.
isoprophlex 66 days ago [-]
I know why it bothers me. It's lazy. Just look at those telltale fucking bulleted lists. The same point, re-phrased slightly; the shit you thought was clever when you were 15 years old, bullshitting your way through a high school history exam.
If you consider your writing bad enough to warrant a LLM fluffing pass, I consider it no better than the 99% of worse than mediocre, lazy, attention grabbing bullshit without real intellectual quality that pollutes the internet.
kybernetyk 66 days ago [-]
Yeah, there are also those people who aren’t native speakers. I tend to run ChatGPT over most stuff I write because, despite writing in English for over 30 years now, I still can’t compare to a native speaker.
Call it lazy, but I think for the reader, reading a rephrased LLM article is more enjoyable than trying to parse some borderline “Kauderwelsch” (German for gibberish) :)
1123581321 65 days ago [-]
I’m just one reader but I enjoy reading English from non-native speakers. There’s often a terse, wry or precise quality to it that native English writers won’t make the time for, since they can roll out so many idioms. Your translations are legitimate contributions to the English-language community.
jlarcombe 66 days ago [-]
yes it's hard to understand why people feel the need to do this and I worry it's just going to become ubiquitous and ruin everything
numpad0 66 days ago [-]
> HDR is mainstream – From just a quick browsing of BestBuy, nearly all TVs over 42” are 4K and support HDR. 9th gen consoles are shipping with HDR on by default. The majority of your audience is HDR-equipped.
"Mainstream" or "majority" in context of Nintendo is a $20-40k/yr white collar household with 2 kids. The REAL mainstream. Some would have real ashtrays on a dining table. ~None of them had bought any of TVs over 42" with 4K resolution and HDR support in past 10 years.
Though, I do wonder how globally mainstream is such a household buying Nintendo hardware. Admittedly it could be somewhat of a local phenomenon.
thinkingtoilet 66 days ago [-]
I don't know how out of touch you have to be to think a family with two kids making $20k a year is affording a switch 2 on release.
numpad0 66 days ago [-]
Japanese average income is $32k, the JP special of the Switch 2 is $350, the Switch 1 sold 37m, and the Japanese working-age population is ~74m (out of 120m).
Either six-figure Nintendo gamers are hoarding boxes full of Switch 1 in the attic and completely destroying the statistics, or everyone at that income bracket is sophisticated enough to desire one.
Frankly, this is why I'm wondering how normal it is globally, because I believe Japan is not supposed to be a village of broke Vulcans. Maybe the nerds are in fact hoarding tons of Switches.
thinkingtoilet 65 days ago [-]
My bad. I didn't know you were talking about Japan. I assumed you were talking about America.
anon7000 66 days ago [-]
You’re probably underestimating how many “real” mainstream families still have a TV. A 4k HDR tv is cheaper than a Switch 2. Hell, you can get a 75” 4k HDR tv for the same price. The cheapest 55” 4k HDR tv at Walmart ($238) is almost half the price of a Switch 2 ($450)
TVs are a very cost effective home entertainment device, and 4k HDR is the default these days.
freetime2 66 days ago [-]
Lots of cheap LCD panels advertise HDR, but don’t really have the brightness or contrast to do it properly. I’ve got two HDR TVs and one HDR monitor where the claims of being HDR are an absolute gimmick.
BearOso 65 days ago [-]
I'd qualify that as "most" LCD panels can't do HDR properly. I'm likely grossly overestimating, but maybe 10% have enough local dimming zones and high enough peak brightness that they can demonstrate a passable effect with some glowing and slow transitions.
OLED has ABL problems, so they can do HDR400, but anything with a higher brightness than that is problematic.
I feel like HDR support is so divergent that you're not ever going to match "properly mastered" content in the way the article author wants. That's why no developers want to spend time on it.
epcoa 66 days ago [-]
Not sure why you would associate “white collar” with poverty. While technically there are poor office workers, this is not the typical association.
msk-lywenn 66 days ago [-]
That sentence raised my eyebrows too. What it actually demonstrates is that shops sell HDR; it in no way means that everyone has bought HDR screens.
tempaway43563 66 days ago [-]
Literally only makes a difference to the clouds. Nintendo know what they're doing and made the right call
MBCook 66 days ago [-]
It looks more washed out than it should. I’m not talking about “doesn’t have blowout colors and brightness“. I’m talking about it just looking bland.
How is that the right call?
jldugger 66 days ago [-]
1) They have a screen built into the console. It would be kinda derpy to treat portable gaming as a second class citizen. So "SDR first" it is.
2) More than past Mario Karts, World needs to visibly delineate the track into multiple sections: the track itself, the "rough" off track, the border between those two, and the copious number of rails you can ride and trick on. Rails in particular are commonly bright primary colors in order to better stand out, often more primary color coded and saturated than the track itself. Green Pipes, yellow electrical wire tie downs, red bridge rail guards, etc.
3) Bonus gamut for particle effects is kinda not required and probably distracting when drifting around a curve avoiding attacks.
4) It feels pretty good to me, but maybe I need to adjust some settings on my LG C1 to get the full bland experience?
MBCook 66 days ago [-]
The screen on it is supposed to be HDR, though I don’t know how good it is at that.
BearOso 65 days ago [-]
It has a peak brightness of 400cd/m^2 and no local dimming. It's only possibly HDR-compatible by virtue of accepting an HDR signal. It can't display HDR at all. The closest thing it can do would be content-adaptive fullscreen brightness changing.
MBCook 65 days ago [-]
It may still have the extended color gamut.
mcdeltat 65 days ago [-]
Interesting article and it's great that it touches on colour science concepts. IMO "HDR" is one of the most commonly misunderstood and butchered graphics terms out there. A lot of people who probably should understand it (artists, photographers, designers, computer graphics engineers) don't understand it. Like the author mentions, the industry has been stuck in SDR world for historical reasons and probably because SDR is a fudge that looks good enough most of the time. Hence a proliferation of poor understanding and poor colour pipelines. Also, interestingly, even with decent colour science there is still an art to it, so you can do a lot of things which are half correct and it doesn't obviously look bad to most people.
What surprised me is why a new game from a big studio, designed to support "HDR", would not be designed all in linear space to begin with. Because then doing tone mapping correctly for different display technologies becomes easy. However my knowledge is mostly from the photography world so perhaps someone with game knowledge can weigh in.
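A toy sketch of what I mean by designing in linear space (the plain Reinhard curve and the numbers are placeholders, not what any particular engine does): all lighting math stays in scene-referred linear values, and only the final step picks an SDR or HDR display transform:

    # Keep lighting in linear, scene-referred values; choose the display
    # transform (SDR vs. HDR) only at output time.
    def reinhard(x: float) -> float:
        return x / (1.0 + x)  # compress unbounded linear light into 0..1

    def display_transform(linear_value: float, mode: str) -> float:
        if mode == "sdr":
            return reinhard(linear_value) ** (1 / 2.2)  # tonemap, then gamma
        if mode == "hdr":
            # HDR keeps far more of the range; the exact encoding (PQ/HLG,
            # peak nits) is an output concern, not baked into the assets.
            return min(linear_value / 10.0, 1.0)  # toy normalization
        raise ValueError(mode)

    sun_glint = 8.0  # an over-bright linear value, e.g. a specular highlight
    print(display_transform(sun_glint, "sdr"), display_transform(sun_glint, "hdr"))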
danbolt 66 days ago [-]
> You might look at the HDR representations of this game and think “Wait, the game appears more colorful” and this is because of the Hunt Effect. The Hunt Effect describes how we think a brighter color is more saturated, but in reality, it’s just an optical illusion.
Sounds like an incredibly cost-effective optical illusion!
user____name 64 days ago [-]
Nintendo really seems to dislike dithering their intermediate buffers for some reason. Their first party titles often have quite visible banding artifacts. Dithering is easy and cheap and solves it completely.
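A tiny sketch of the fix being described (the ramp values are made up): add roughly half an LSB of noise before quantizing to 8 bits, and a hard band turns into unstructured noise the eye averages out:

    import random

    def quantize(value: float, dither: bool) -> int:
        # +/- half an LSB of noise before rounding, if dithering is on
        noise = (random.random() - 0.5) if dither else 0.0
        return max(0, min(255, round(value * 255 + noise)))

    ramp = [0.500 + 0.004 * i / 255 for i in range(256)]  # very shallow gradient
    banded = [quantize(v, dither=False) for v in ramp]
    dithered = [quantize(v, dither=True) for v in ramp]
    print(sorted(set(banded)))    # two flat plateaus -> one visible band edge
    print(sorted(set(dithered)))  # same few levels, interleaved as fine noise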
Don’t disagree with the findings wrt MKW specifically, but PSA that, in general, HDR on the Switch 2 can be substantially improved by enabling HGIG tonemapping on TVs which support it, then going through the HDR calibration steps again. HDTVTest covered it here: https://youtu.be/X84e14oe6gs?si=bh1U7OHxGlzzJO8w
colechristensen 66 days ago [-]
It reminds me of The Wizard of Oz, when color was new and being done poorly. Now Oz is nostalgic so it gets a pass because of its uniqueness, but new films that look like that would be a little absurd.
I like some of the choices on Mario Kart World with HDR, but a lot of it just needs to be toned down so the things which do blow out the colors are impressive but also fit instead of just everything being turned up to 11.
rohansood15 66 days ago [-]
I have never owned a gaming console, and I was actually considering getting the Switch 2 as a casual gamer to play with friends/family.
My first reaction when I saw the launch/gameplay video was why does this look so washed out? Now I kinda know why - thank you!
WhereIsTheTruth 66 days ago [-]
This is a game on a handheld
You want low latency and long battery life, HDR has an impact on the two
Have people forgotten what a handheld is supposed to be? portable device on a battery
whatevaa 66 days ago [-]
HDR doesn't impact processing much. Some games implement HDR toggles by rendering in HDR and using shaders to map to SDR when needed, apparently.
Firehawke 66 days ago [-]
Then why even have a HDR display as the handheld screen itself?
Come ON.
WhereIsTheTruth 66 days ago [-]
it perhaps was less expensive to produce on a mass scale
the game perhaps started development when they had a different screen planned for the console?
according to rumors, the console is at least 1 year late
i'm just pointing out a fact, i'm not saying that everything they do makes sense
simoncion 66 days ago [-]
> it perhaps was less expensive to produce on a mass scale
It might be true that an HDR-capable panel was cheaper than one that was only SDR-capable.
That doesn't mean that Nintendo was obligated to incur the (probably very modest) increase in energy drain from additional processing load and (probably quite a bit less modest) increase in energy drain from making the screen far brighter on average than when running in SDR mode. They could have just run the panel in SDR mode and limited HDR mode to when one was running on AC power in a dock. Or even never enabled HDR mode at all.
NOTE: I've not looked into whether or not enabling HDR in a given game has a significant effect on battery life. I'm also a big fan of manufacturers and game developers creating good HDR-capable hardware and publishing good HDR-capable software whenever they reasonably can. Higher contrast ratios and (-IMO- more importantly) wider color gamuts are just nice.
ge96 66 days ago [-]
Interesting how the images pop on that site while everything else is faded/lower opacity. Worked great; maybe more noticeable on retina monitors.
badc0ffee 66 days ago [-]
On macOS 15.5, Firefox 139 shows super dark images for me. Safari seems to work fine, though.
ziml77 66 days ago [-]
I'm curious if someone knows what's going on. It feels to me like Firefox is showing the image in HDR but with the wrong gamma curve applied.
GRiMe2D 66 days ago [-]
MacBook Pro can “display” HDR content on SDR displays.
macOS pushes the backlight slightly brighter than required and artificially (in software) maps absolute white (0xFFFFFF) to a greyish color (0xEEEEEE). When HDR content needs to be shown, it removes that mask around the content. Safari does this ideally; the reason tone mapping doesn't work well is probably on Firefox's side.
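Roughly, the trick looks like this (all nit values here are assumptions, not Apple's actual numbers):

    # SDR white is rendered a bit below the panel's current maximum, leaving
    # headroom that HDR pixels can use without changing the backlight.
    PANEL_MAX_NITS = 500
    SDR_WHITE_NITS = 400          # where 0xFFFFFF actually lands on screen

    headroom = PANEL_MAX_NITS / SDR_WHITE_NITS  # available HDR headroom

    def composite(pixel: float) -> float:
        """pixel value of 1.0 == SDR white; HDR content can exceed 1.0."""
        return min(pixel, headroom) * SDR_WHITE_NITS

    print(composite(1.0))    # SDR white -> 400 nits
    print(composite(1.25))   # HDR highlight -> clipped at the 500-nit panel max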
badc0ffee 66 days ago [-]
There's a HDR video on that page, and on my built-in MBP display, it's much brighter/has noticeably more range than the rest of the UI. Moving the window to my non-HDR external 4k monitor, it looks like a regular YouTube video.
The video looks the same in both Safari and Firefox, whereas the images are dim in Firefox on both my MBP display and external monitor.
genezeta 66 days ago [-]
Firefox does not have HDR support for images. It has some support for HDR video on Mac. Support for HDR images seems to have gained some traction in recent months but still seems distant.
Nintendo has never competed on graphics. They compete on having the most fun, accessible, entertaining games as possible. And say what you will about their business practices, they’ve probably done a better job of that than any other gaming company in history. As more devs bundle ever higher quality graphics with ever higher in-app purchases and pay to win schemes, Mario remains…Mario.
I seriously doubt many Switch users would bail on the system because of “fake” HDR. They probably don’t care about HDR at all. As long as Mario remains Mario, they’re happy.
Retr0id 66 days ago [-]
Nintendo graphics are rarely technically impressive, but their games do tend to look good. I'd expect their games not to have washed-out colours.
echelon 66 days ago [-]
The author's changes look more washed out to me than the original screenshots.
bitwize 66 days ago [-]
Nintendo has ABSOLUTELY competed on graphics. The NES, SNES, N64, and Gamecube were graphical powerhouses upon release, at or near the top of their generations in graphical performance. It was only with the Wii, when they chose to iterate on the Gamecube design rather than pair a powerful multicore processor with a powerful shader-capable GPU like the PS3 and Xbox 360 did, that Nintendo started going all "but muh lateral thinking with withered technology" and claimed they never intended to compete in that space.
Taek 66 days ago [-]
The GameCube was released 24 years ago. It's hardly fair to hold Nintendo accountable to a direction they haven't moved in for two and a half decades.
The visual difference between the N64 and GC was enough that it made sense to focus on upgraded graphics. When you play an N64 game, there's always the initial shock of "wow these graphics are a bit dated".
But you don't get that feeling when playing Melee, or Wind Waker, or many of the other artfully done GC games.
Essentially, somewhere around the GameCube era, graphics became good enough that the right art direction could leap a game into the "timeless graphics" category.
And so it makes sense that Nintendo said "let's stop chasing better graphics, and instead focus on art direction and gameplay".
pimeys 66 days ago [-]
I think the biggest issue with Nintendo games, until the Switch at least, has been the abysmal frame rates. We're not talking about dips just under 60fps; there are good examples of 15fps frame rates even in the best games, such as Tears of the Kingdom. I think they've finally fixed that issue with the Switch 2, but the horrible performance of the games has been a huge issue since forever.
And of course it does not matter, Nintendo still sells because it's Mawio (and I say this with all the love, I'm a huge Mario fan myself).
zimpenfish 66 days ago [-]
> the Wii [.vs.] a powerful shader-capable GPU like the PS3 and Xbox 360
Outsold both the PS3 and XBOX360 by 15M units though. Given the lower hardware costs of the Wii (I've seen estimates of ~$160 compared to $840 for the PS3 and $525 for the Xbox 360 - both higher than launch price btw!), I'd suggest Nintendo made the right choice.
theshackleford 66 days ago [-]
I'm a Nintendo purchaser. I absolutely care about HDR. Given they specifically advertised HDR, I suspect they expect me to care, otherwise why make noise about it?
jonhohle 66 days ago [-]
I don't think it's fair to say they _never_ competed on graphics. The Super Nintendo was comparable to the Genesis and surpassed it in some graphics areas. The Nintendo 64 was a 3D monster compared to other consoles at the time. On paper, the GameCube outperforms the PS2. It wasn't more powerful than the Xbox, but not a generation behind.
It wasn't until the Wii that Nintendo stepped out of the hardware race. Somehow this has been retconned into Nintendo never focusing on hardware.
If they thought it would sell more systems, they'd compete. The Switch 2 is evidence that it doesn't matter.
maratc 66 days ago [-]
The "Lateral thinking with withered technology" quote goes back to Gunpei Yokoi designing the Game Boy in the late 80s.
bigstrat2003 66 days ago [-]
> It wasn't until the Wii that Nintendo stepped out of the hardware race. Somehow this has been retconned into Nintendo never focusing on hardware.
Fair point, but on the other hand... that was 20 years ago. So it's easy to understand why that gets rounded off to "never".
jonhohle 65 days ago [-]
smh I feel so old
MBCook 66 days ago [-]
I’m heavily disappointed. I’ve always been a HUGE Nintendo fan.
If the system was SDR only I would be disappointed but fine.
But they made it HDR. They made a big deal about it. And it doesn’t work well. It’s impossible to calibrate and ends up just looking washed out.
It’s broken.
And I don’t appreciate the insinuation that Nintendo fans will buy any piece of junk they put out. See: Wii U.
badc0ffee 66 days ago [-]
The Wii U had terrible wifi, but I can't really say I hated it. There were some real classics on that console - Mario Kart 8 and Super Mario 3D World (although those were both eventually ported to the Switch). It played Wii games and supported all the original controllers, but output HDMI and had real power management. I still use mine to play Wii games.
MBCook 66 days ago [-]
I loved mine. It had real problems but great games.
It was just an easy at-hand example.
I also liked the Virtual Boy. But I bought it and a bunch of games from Blockbuster for $50 total when they gave up on it. So my value calibration was very different from those who paid retail.
HeyMeco 66 days ago [-]
Agree, with the HDR marketing for the Switch 2 I expected a proper implementation. Sad that they cheaped out on it but at least we got this great article out of it
whoisyc 66 days ago [-]
I am not a Nintendo fan (I do have a Switch 1) but this article is the first time I learned Nintendo added HDR graphics to the Switch 2, and this thread is the first time I learned HDR was actually being marketed. I genuinely doubt most Nintendo customers know about these features. It isn’t like Nintendo takes pride in technically impressive graphics anyway.
I read the above post, and honestly thought it was satire.
MBCook 66 days ago [-]
I’m glad to see this getting attention in the last day or two. HDTVTest did a video too.
I’m having a blast with Mario Kart but the track usually looks washed out. Some of the UI and other things have great color on them but most of the picture just looks like the saturation was turned down a bit.
Very disappointing, as a Mario game and its colorful aesthetic are the kind of thing that should be able to look great in HDR.
jekwoooooe 66 days ago [-]
Nintendo cheaped out just so they can resell the same thing to the people obsessed with their branding
Anything that isn’t an OLED simply cannot do HDR. It’s just physically impossible to get the real contrast.
Rubberducky1324 66 days ago [-]
LCD screens with FALD (Full Array Local Dimming) can do HDR just fine. Usually they even have higher sustained brightness due to the inherent limitations of OLED.
jekwoooooe 66 days ago [-]
With huge halos, sure. Go watch fireworks on an LCD with local dimming: even if it has thousands of dimming zones, you will see huge halos which ruin the contrast.
When you drive towards the sun, what is more fun? A realistic HDR brightness that blinds you, or a „wrong“ brightness level that helps the background stay in the background without interrupting your flow? Similarly, should eye candy like little sparks grab your attention by being the brightest object on screen? I’d say no.
The hardware can handle full HDR and more brightness, but one could argue that the game is more fun with incorrect brightness scaling…
The game should look like a normal Mario game at a minimum. It should use its additional color palette available in HDR to look better, and the additional brightness to make make effects pop as you describe.
The problem is that’s not what it’s doing. Some things pop better, but it’s not because they’re using extra colors. It may be a little brightness, but mostly it’s that everything else just got toned down so it looks kinda washed out.
If they did nothing but use the expanded color palette and did not use the additional brightness at all I would be a lot happier than with what we have right now.
I haven’t turned it back to SDR mode but I’m legitimately considering it. Because I suspect the game looks better that way.
And the article is about they missed out on the optionality of using the additional gamut, but that additional gamut wouldn't intrinsically look better.
It's easy enough to edit a screenshot to show us what could have been, but even in that single screenshot there are things that look worse: like the flames gained saturation but lost the depth the smoke was adding, and some reasonable atmospheric haze vanished.
(similarly the game in the side-by-side has some downright awful looking elements, like the over-saturated red crystals that punch a hole through my HDR display...)
Given Nintendo's track record for stylization over raw image quality, I'm not sure why this isn't just as likely them intentionally prioritizing SDR quality and taking a modest-but-safe approach to HDR... especially when the built-in screen maxes out at 450 nits.
Compare any of the retro tracks to their World counterpart, then say that again. The game’s general palette and design is so washed out and bland compared to the DS, Wii, 3DS, and Tour versions of those tracks.
If there's a track that's actually less saturated than it was then, it's definitely not the result of an SDR-first workflow.
It could, but that's different than shouldn't. There could be good reasons for it, such as not wanting the primary gameplay to lie outside a color palette available to people playing on their TV in sRGB or in the common complete shit HDR modes that lie about their capabilities.
It would be neat if more was used, but nothing about being HDR means that you should, or that it's even a good idea, to rely on the maximum capabilities.
> I haven’t turned it back to SDR mode but I’m legitimately considering it. Because I suspect the game looks better that way.
To be honest, without a TV with proper HDR, SDR mode will often look much better. The problem is that TVs are often quite awful when it comes to color volume, specular lighting and calibration. The SDR mode is often very untrue to the content, but stretches things within the TV capabilities to make it bright and vivid to look nice. The HDR mode on the other hand has to give up the charade, and in particular SDR tonemapped content, which if the TV didn't lie would have looked identical to SDR mode, looks really awful.
A favorite of mine is to switch Apple TV's between SDR and (tone-mapped) HDR mode and see how different the main menu and YouTube app looks. I have yet to find a TV where UI doesn't look muted and bland in HDR.
It's like a keyword bingo for usually poor implementations. I grant that maybe the implementation is good for any specific game you care to mention - but history has shaped my habits.
The presence of in-game music is a poor implementation indicator?
I am a big proponent of "there's no wrong way to enjoy a game" but wow. In nearly 50 years of gaming that's a new one for me. Congratulations. But do whatever works for you... and I mean that sincerely, not sarcastically or dismissively.
But any one of these aspects can individually be crap and often are.
If I have a need for ingame VoIP I'll turn it back on. But I don't want to default to hearing randoms on the internet espousing bullshit using mic-always-on.
If it turns out the game is good with music, I'll turn it on. Rocket League works.
If one of my friends is raving about the graphics, I'll turn HDR or bloom on - though I haven't ever agreed with them enough to leave it that way.
So by default, yes I turn those things down or off before I even start playing.
Further detail for music. Often poorly implemented: repetitive, sounds similar to game sound effects, changes and gets louder when there's a rush horde. All distracting and ruining immersion.
I quit playing Day of Defeat because of the end of round music that I couldn't do anything about. I either couldn't hear footsteps next to me (part of the gameplay) or I was deafened with the end of round blaring. I don't have time to put up with poor UX when there are plenty of other games to play.
As I get older and find it harder to discern different sounds it is just easier to focus on what I want - the game and gameplay. It's the same thing as drivers turning down the stereo when they want to pay attention to finding a park or driveway - it's a distraction to what I'm here for.
I like music, and like to listen to it, either as a dedicated thing to do or while doing house chores or cruising long distances in the car. But generally not while gaming.
Thankfully so far, these are still options I can set to taste. I guess when the "the way it's meant to be played" crowd takes over I'll do something else with my time.
Truly, in 50 years, none of this has occurred to you or you’ve never witnessed it in friends or family? That seems hyperbolic
I understand that it helps some people get "into the flow" or something. But I don't have an issue with that. I occupy my mind trying to grasp the mechanics and be good at using them to play. If the gameplay doesn't have enough then papering over it with music doesn't do it for me.
And I'm not always looking for "intense" stuff. I like to chill out too. But I've played quite a few games over the years and so the gameplay has to have something to keep me entertained.
I enjoy music with Rocket League because I don't play competitive and so some music playing while I'm hanging out with others on my couch shooting-the-shit as it were is fine. It's more of a "social setting music" than "game music".
After all these years, I don't miss it at all.
I have turned the music off in many games.
Other games like Senua did actually manage to pull off an amazing sun/scene though. Because its slower they can use it to accentuate, for example you walk around a corner and go from the darkness into full on sunlight which is blinding, but then falls off so as to become bearable.
the monitor can display what it can display. format of transfer doesn't change hardware capabilities, just how you express what u want towards them
this was posted before here on HN and i dont think its wrong. though ofcourse, technically, there are differences, which might be in some applicaitons ( but usually are not) exploited to the viewers benefit. (?? maybe?? someone with real good eyes :)) https://yedlin.net/DebunkingHDR/index.html
maybe my interpretation is wrong, but i dont think it is far off if it is. specification differences and differences in human perception are not the same thing
Imagine if every individual song, speaker/earbud, and MP3 player had a different implementation of 8/16 bit music, and it was up to you to compare/contrast the 8 variations to decide if 8 or 16 bits was more auditorially satisfying.
You get this with like levels and mixers, but not with such a fundamental quality as definition. Sure, you can chase hi-fi, but I feel like music has less artificial upscaling and usually hi-fi is a different product, not a toggle buried in various menus.
I don't know, it's kind of crazy.
anything beyond 16bit (which is streamed to your speakers) is not audible though in the sense of playing it to the speaker. it matters in processing (dsp). (44 or 48khz inguess?)
Somehow I doubt this survey is representative of the typical Mario Kart player. And to those for whom it is a concern, I don't think SDR is high on the list relative to framerate, pop-in, and general "see where I'm going and need to go next" usability.
That really is the joy of Mario Kart. You think you’re going to beat me, kid? You’re 12 and I’ve been playing Mario Kart for 30 years.
(And then they do… oof)
You need a real HDR display (800 nits+), FALD or OLED for contrast, some calibration, and software that uses it well (really hit and miss at least on Windows).
Once all the stars align, the experience is amazing. Doom Eternal has one of the best HDR implementations on PC, and I suggest trying it once on a good display before writing HDR off as a gimmick.
There’s something about how taillights of a car in a dark street in Cyberpunk look, and that just can’t be replicated on an SDR display afaict.
Then you have some games where it’s implemented terribly and it looks washed out and worse than SDR. Some people go through the pain and mod them to look right, or you just disable HDR with those.
I’d vouch for proper HDR any day, that being said I wouldn’t expect it to improve Mario Kart much even with a proper implementation. The art style of the game itself is super bright for that cheery mood, and no consumer display will be able to show 1000nits with 99% of the frame at full brightness. It’ll likely look almost the same as SDR.
One reason for keeping Apple hardware around is a decent display test bench. I do the best I can with image work, but once it leaves your hands it's a total lottery.
So far this is my experience of HDR.
There are about a thousand other things in any given game that matter more to me than HDR tone mapping, and I'm happy for developers to focus on those things. The one exception might be a game where you spend a lot of time in the dark - like Resident Evil or Luigi's mansion.
Looking at his example video where he compares Godfall Ultimate footage to Mario Kart - I quite dislike the HDR in Godfall Ultimate. Certain elements like health bars, red crystals, and sparks are emphasized way too much, to the detraction of character and environment design. I find Mario Kart to be much more tasteful. That's not to say that Mario Kart World couldn't be better looking in HDR, but the author doesn't really do a compelling job showing how. In the side-by-side examples with "real" HDR, I prefer the game as-is.
It’s a little better than I had it set. But it’s still a problem. As this article shows, it just wasn’t designed right.
Perhaps the worst offender I've ever seen was the Mafia remake by Hangar 17, which loads every time with a sequence of studio logos with white backgrounds that cut from black. The RGB(255,255,255) backgrounds get stretched to maximum HDR nits, and the jump from RGB(0,0,0) (especially on an OLED) is absolutely eye-searing.
I literally had to close my eyes whenever I'd load the game.
Of course there are individual wonky sites which will still flash but if applicable, those two things should reduce the occurrences significantly.
Why would it be any more impactful on OLED than any given FALD display capable of putting out >1000 nits sustained?
Perceived intensity in HDR is dominated by luminance, not just contrast ratios on paper.
OLEDs do have effectively infinite contrast (since black pixels are off) and it's why I love them, but that doesn't inherently make white flashes more intense on them than on any other display type unless the peak brightness is also there to support it.
In other words, an 800 nit flash on OLED is going to be less intense than a 1600 nit one on a FALD LCD. Brightness is the bigger factor in how harsh or impactful that flash will feel, not just the contrast ratio.
It's not down to your panel technology in this case, but the limitation of any given panel's peak and sustained brightness capabilities.
This kind of scenario is in fact where FALD is strong. OLED really starts to pull ahead in more complex scenes where the zone counts simply can't match up to per-pixel control.
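For a back-of-envelope comparison (my own toy numbers, not a vision model): the PQ curve that HDR signals are encoded with is designed to be roughly perceptually uniform, so equal signal steps are roughly equally visible, and you can use it to eyeball how big the jolt of a flash is:

    # Purely illustrative comparison via the PQ curve (SMPTE ST 2084).
    def pq_signal(nits):
        y = nits / 10000.0                       # PQ is defined over 0..10000 nits
        m1, m2 = 2610 / 16384, 2523 / 4096 * 128
        c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
        yp = y ** m1
        return ((c1 + c2 * yp) / (1 + c3 * yp)) ** m2

    print(pq_signal(800))     # ~0.73 of the perceptual range (OLED flash)
    print(pq_signal(1600))    # ~0.81 -- a noticeably bigger jolt (bright FALD flash)
    print(pq_signal(0.0005))  # ~0.00 -- "infinite contrast" black barely moves the scale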
I love OLED's motion clarity (I still use CRTs, that's how much I care). But I dislike its inability to maintain brightness at larger window sizes, and VRR flicker is a dealbreaker across the board.
My FALD display, on the other hand, delivers the most jaw-dropping HDR image I've ever seen, producing an image so spectacular it's the biggest graphical/image jump I've seen in more than a decade, but its motion resolution is garbage and yes, in some specific content, you'll get some minor blooming. It's nice that it doesn't have VRR flicker though.
My OLEDs win for motion, my FALD wins for HDR + lack of VRR flicker; it's very unfortunate that there's no perfect display tech right now. Since most of my content is bright (games), I'm happy to trade some blooming in starfields for overall better performance across the other 80% of content. Other people's content will differ: perhaps they love horror games and will choose the OLED for the opposite reason, and I'd get that too.
I still use OLED in the living room though; it doesn't see the kind of abusive usage my monitors do, and OLED TVs are way brighter than OLED monitors. Not as bright as I'd like, but bright enough that I'm not going to go out and replace it with a FALD, not until the new Sony-based stuff drops at the very least.
https://www.theverge.com/news/628977/sony-rgb-led-backlight-...
Why are people able to craft an image/video that bypasses my screen brightness and color shift settings?
If I wanted to see the media in full fidelity, I wouldn't have my screen dimmed with nightshift turned on in my dark bedroom.
It's not OP's fault. My mind is just blown every time I see this behavior.
The fact that it just goes full 1600 nits blast when you view a video shot on a sunny day is terrible UX in most cases, and it's the reason why I have HDR turned off for video recording, even though I might miss it later. To make matters worse, it also applies to third-party views.
The current implementation means that only the occasional image/video behaves that way, and only if it were crafted that way.
I can’t articulate why it bothers me. Except maybe the implied assumption that the author’s real voice & style benefit more than they are harmed from being submerged in what is ultimately mathematically derived mush.
If you consider your writing bad enough to warrant an LLM fluffing pass, I consider it no better than the 99% of worse-than-mediocre, lazy, attention-grabbing bullshit without real intellectual quality that pollutes the internet.
Call it lazy, but I think reading a rephrased LLM article is more enjoyable for the reader than trying to parse some borderline “Kauderwelsch” (German for gibberish) :)
"Mainstream" or "majority" in context of Nintendo is a $20-40k/yr white collar household with 2 kids. The REAL mainstream. Some would have real ashtrays on a dining table. ~None of them had bought any of TVs over 42" with 4K resolution and HDR support in past 10 years.
Though, I do wonder how globally mainstream is such a household buying Nintendo hardware. Admittedly it could be somewhat of a local phenomenon.
Either six-figure Nintendo gamers are hoarding boxes full of Switch 1 in the attic and completely destroying the statistics, or everyone at that income bracket is sophisticated enough to desire one.
Frankly, this is why I'm wondering how normal it is globally, because I believe Japan is not supposed to be a village of broke Vulcans. Maybe the nerds are in fact hoarding tons of Switches.
TVs are a very cost effective home entertainment device, and 4k HDR is the default these days.
OLED has ABL problems, so it can do HDR400, but anything brighter than that is problematic.
I feel like HDR support is so divergent that you're not ever going to match "properly mastered" content in the way the article author wants. That's why no developers want to spend time on it.
How is that the right call?
2) More than past Mario Karts, World needs to visibly delineate the track into multiple sections: the track itself, the "rough" off track, the border between those two, and the copious number of rails you can ride and trick on. Rails in particular are commonly bright primary colors in order to better stand out, often coded in more saturated primary colors than the track itself. Green Pipes, yellow electrical wire tie-downs, red bridge rail guards, etc.
3) Bonus gamut for particle effects is kinda not required and probably distracting when drifting around a curve avoiding attacks.
4) It feels pretty good to me, but maybe I need to adjust some settings on my LG C1 to get the full bland experience?
What surprised me is why a new game from a big studio, designed to support "HDR", would not be designed all in linear space to begin with, because then doing tone mapping correctly for different display technologies becomes easy. However, my knowledge is mostly from the photography world, so perhaps someone with game knowledge can weigh in.
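A toy sketch of what I mean (my background is photography, not engines, so treat the operators as placeholders - a plain Reinhard curve standing in for whatever a real renderer would use):

    # Author everything in scene-linear light; pick the output transform last.
    def tonemap_sdr(linear):               # linear: scene value, 1.0 = diffuse white
        x = linear / (1.0 + linear)        # Reinhard-style compression into 0..1
        return x ** (1.0 / 2.2)            # then a display gamma encode

    def tonemap_hdr(linear, white_nits=203.0, peak_nits=1000.0):
        # Let highlights exceed SDR white, up to the display's peak.
        # A real pipeline would then PQ-encode these nits for output.
        return min(linear * white_nits, peak_nits)

If the assets and lighting live in linear space, swapping that last step per target display is cheap; if the look was authored against an SDR transfer curve, retrofitting HDR presumably means un-baking it, which might be part of why it ends up looking washed out.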
Sounds like an incredibly cost-effective optical illusion!
Here's some good examples: https://www.shadertoy.com/view/MslGR8
I like some of the choices in Mario Kart World with HDR, but a lot of it just needs to be toned down, so the things that do blow out the colors are impressive but still fit, instead of everything being turned up to 11.
My first reaction when I saw the launch/gameplay video was why does this look so washed out? Now I kinda know why - thank you!
You want low latency and long battery life, and HDR has an impact on both.
Have people forgotten what a handheld is supposed to be? A portable device on a battery.
Come ON.
The game perhaps started development when they had a different screen planned for the console?
According to rumors, the console is at least 1 year late.
I'm just pointing out a fact; I'm not saying that everything they do makes sense.
It might be true that an HDR-capable panel was cheaper than one that was only SDR-capable.
That doesn't mean that Nintendo was obligated to incur the (probably very modest) increase in energy drain from additional processing load and (probably quite a bit less modest) increase in energy drain from making the screen far brighter on average than when running in SDR mode. They could have just run the panel in SDR mode and limited HDR mode to when one was running on AC power in a dock. Or even never enabled HDR mode at all.
NOTE: I've not looked into whether or not enabling HDR in a given game has a significant effect on battery life. I'm also a big fan of manufacturers and game developers creating good HDR-capable hardware and publishing good HDR-capable software whenever they reasonably can. Higher contrast ratios and (-IMO- more importantly) wider color gamuts are just nice.
macOS drives the panel at a slightly higher brightness than SDR requires and artificially (in software) maps absolute white (0xFFFFFF) to a greyish value (around 0xEEEEEE). When HDR content needs it, that mask is removed around the content. Safari does this ideally; it's probably on Firefox's side that tone mapping doesn't work well.
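Roughly the idea, with made-up numbers (I don't know Apple's exact values):

    # Sketch of EDR-style headroom: SDR white is driven below panel peak so HDR
    # content has room above it. Values are illustrative only.
    panel_peak_nits = 1000.0      # assumed panel capability
    sdr_white_nits = 500.0        # what 0xFFFFFF is actually driven at
    headroom = panel_peak_nits / sdr_white_nits   # 2.0x in this example

    def compose(value, is_hdr):
        # value: linear pixel value where 1.0 == SDR white
        limit = headroom if is_hdr else 1.0        # SDR stays pinned to SDR white
        return min(value, limit) * sdr_white_nits  # HDR may use the room above it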
The video looks the same in both Safari and Firefox, whereas the images are dim in Firefox on both my MBP display and external monitor.
https://bugzilla.mozilla.org/show_bug.cgi?id=hdr
I seriously doubt many Switch users would bail on the system because of “fake” HDR. They probably don’t care about HDR at all. As long as Mario remains Mario, they’re happy.
The visual difference between the N64 and GC was enough that it made sense to focus on upgraded graphics. When you play an N64 game, there's always the initial shock of "wow these graphics are a bit dated".
But you don't get that feeling when playing Melee, or Wind Waker, or many of the other artfully done GC games.
Essentially, somewhere around the GameCube era, graphics became good enough that the right artist direction could leap a game into the "timeless graphics" category.
And so it makes sense that Nintendo said "let's stop chasing better graphics, and instead focus on art direction and gameplay".
And of course it does not matter, Nintendo still sells because it's Mawio (and I say this with all the love, I'm a huge Mario fan myself).
Outsold both the PS3 and Xbox 360 by 15M units though. Given the lower hardware costs of the Wii (I've seen estimates of ~$160 compared to $840 for the PS3 and $525 for the Xbox 360 - both higher than launch price btw!), I'd suggest Nintendo made the right choice.
It wasn't until the Wii that Nintendo stepped out of the hardware race. Somehow this has been retconned into Nintendo never focusing on hardware.
If they thought it would sell more systems, they'd compete. The Switch 2 is evidence that it doesn't matter.
Fair point, but on the other hand... that was 20 years ago. So it's easy to understand why that gets rounded off to "never".
If the system was SDR only I would be disappointed but fine.
But they made it HDR. They made a big deal about it. And it doesn’t work well. It’s impossible to calibrate and ends up just looking washed out.
It’s broken.
And I don’t appreciate the insinuation that Nintendo fans will buy any piece of junk they put out. See: Wii U.
It was just an easy at-hand example.
I also liked the Virtual Boy. But I bought it and a bunch of games from Blockbuster for $50 total when they gave up on it. So my value calibration was very different from those who paid retail.
I’m having a blast with Mario Kart, but the track usually looks washed out. Some of the UI and other things have great color on them, but most of the picture just looks like the saturation was turned down a bit.
Very disappointing, since a Mario game and its colorful aesthetic are exactly the kind of thing that should be able to look great in HDR.
Anything that isn’t an OLED simply cannot do HDR. It’s just physically impossible to get the real contrast.