
It's a surprisingly common error where someone picks up an old 35mm print and assumes it is somehow canonical... Besides whatever the provenance of these prints is (this gets complicated), the reality is that these were also made to look as good as they could on typical movie theater projector systems in the 90s. Those bulbs were hot and bright, and there were many other considerations around what the final picture would look like on the screen. So yeah, if you digitize 35mm film today, it will look different, and different from how it's ever been displayed in a movie theater.


Agreed. It's a fine article but leaves half the story on the table. It is supposedly comparing what these movies looked like in the theater to the modern streaming and Blu-ray versions, but it is actually comparing what a film scan (scanner settings unspecified) looks like on a TV (or other unspecified screen) against the digital versions on (presumably) the same screen. And then we can ask: how were the comparison images captured and rendered to JPEG for the web before we, the readers, view them on our own screens? I'm not arguing Catmull and company didn't do a great job of rendering to film, but this comparison doesn't necessarily tell us anything.

Don't believe me? Download the comparison pictures in the article to your device and play with filters and settings. You can get almost anything you want and the same was true at every step in the render pipeline to your TV.

PS: and don't get me started on how my 60-year-old eyes see color now compared to what they perceived when I saw this in the theater.


It’s an interesting and valid point that the projectors of the time mean current scans of 35mm will look different too. However, taking the Aladdin screenshot in particular, the sky is COMPLETELY the wrong colour in the modern digital edition, so it seems to me that these 35mm scans, whilst not a perfect match for the 90s, are closer to correct than their digital counterparts.


And as someone who is part of the conservation communities that scan 35mm with donations to preserve the existing look: a lot of the people doing those projects are aware of this. They do some color adjustment to compensate for print fading, for the type of bulbs that were used in movie theatres back then (using a LUT), etc.
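
To make that concrete, here's a minimal sketch of the kind of per-channel LUT correction I mean. The curve values are invented for illustration only; the real projects derive their LUTs (often full 3D LUTs) from reference prints and the characteristics of period projector bulbs.

    # Minimal sketch: a per-channel 1D LUT to compensate for print fading.
    # The curves below are made up for illustration.
    import numpy as np

    def apply_1d_lut(frame, lut):
        """frame: uint8 array (H, W, 3); lut: (256, 3) array of output levels."""
        corrected = [lut[frame[..., c], c] for c in range(3)]
        return np.stack(corrected, axis=-1).astype(np.uint8)

    x = np.arange(256) / 255.0
    lut = np.stack([
        np.clip(255 * x ** 0.92, 0, 255),  # red: lift faded reds a little
        np.clip(255 * x, 0, 255),          # green: leave alone
        np.clip(255 * x ** 1.05, 0, 255),  # blue: pull down slightly
    ], axis=-1)

    frame = np.random.randint(0, 256, (480, 720, 3), dtype=np.uint8)  # stand-in for a scanned frame
    restored = apply_1d_lut(frame, lut)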

I do find that, often enough, commercial releases like Aladdin or Terminator 2 are done lazily and have completely different colors from what was historically shown. I think part of this is that studios don't necessarily recognise the importance of that legacy and don't want to spend money on it.


What's wrong with Terminator 2?

Are there like multiple digital releases, one with better colour than the other?


There's a 4K version out that does interesting things with colour grading; here's a post I found: https://www.reddit.com/r/Terminator/comments/d65pbi/terminat.... The one on the left is the remaster.

There was similar outrage (if that's the right word) about a Matrix remaster that either added or removed a green color filter, and there are several other examples where they did a Thing with colour grading / filtering in a remaster.


There are multiple versions of The Matrix on the trackers and the internet that I know of. The official releases all look kinda different from each other:

https://www.youtube.com/watch?v=1mhZ-13HqLQ

There's a 35mm scan floating around from a faded copy, with really weird colors in places:

https://www.youtube.com/watch?v=Ow1KDYc9XsE

And there's an Open Matte version, which I don't know the origin of:

https://www.youtube.com/watch?v=Z2eCmhBgsyI

For me, the Open Matte is the best version.


To me, that just looks like what happens when I try to play HDR content on a system that doesn't know about HDR. (It looks like you're watching it through sunglasses.)


That is precisely what this one was. The OP posted a follow-up comment at https://www.reddit.com/r/Terminator/comments/d65pbi/comment/...:

> I have an updated, I found out that T2 4K is an HDR movie that needs to be played with MadVR and enable HDR on the TV itself, now the colors are correct and I took a new screenshot: https://i.imgur.com/KTOn3Bw.jpg

> However when the TV is in HDR mode the 4K looks 100% correct, but when seeing the screenshot with HDR off then the screenshot looks still a bit wrong, here is a screenshot with correct colors: https://i.imgur.com/KTOn3Bw.jpg


I own the Blu-ray of Terminator 2, and briefly owned the 4K as well. The 4K looks like dogshit, with or without HDR enabled. This is largely because it uses DNR to remove film grain, which they did because the 4K transfer was created for the 2017 3D release (film grain is bad in 3D, I guess). The transfer is also much, much more blue.


See my top level comment for more info on this, but the Aladdin scan used in the article was from a 35mm trailer that's been scanned on an unknown scanner, and had unknown processing applied to it. It's not really possible to compare anything other than resolution and artefacts in the two images.


It sounds like in the case of Toy Story, the Pixar team were working toward a 35mm print as the final product, so that probably should be considered canonical: it's what the creative team set out to make.


This reminds me of how pre-LCD console games don't look as intended on modern displays, or how vinyl sounds different from CDs because mixing and mastering targeted physical media with limitations.


Wasn't CD more a case of cheaping out? Doing the mastering work once, mostly for radio, where the assumed listening scenario was the car or background listening, so less dynamic range allowed it to be louder on average.

The CD itself can reproduce the same dynamic range and more, but that doesn't sell extra copies.


The loudness war was a thing in all media. In the 80s most of us didn't have CD players but our vinyl and tapes of pop and rock were all recorded overly loud. Compared to the classical and jazz recordings, or perhaps the heyday of audiophile 70s rock, it was annoying and sad.


And it was made by a lab that made choices about processing and developing times, which can quite easily affect the resulting image. You hope that labs are reasonably standard across the board and calibrate frequently, but even processing two copies of the same material in the same lab, one after the other, will result in images that look different if projected side by side. This is why it's probably impossible to make new prints of 3-strip Cinerama films now; the knowledge and the number of labs that can do this are near zero.


> It's a surprisingly common error where someone picks up an old 35mm print and assumes it is somehow canonical

The same applies to people buying modern vinyl records believing them to be more authentic than a CD or (god forbid) online streaming.

Everything comes from a digital master, and arguably the vinyl copy adds artefacts and colour to the sound that are not part of the original recording. Additionally, the vinyl is not capturing more overtones just because it's analogue; there is no true analogue path in modern music any more.


I don't know if this is still true, but I know that in the 2000s vinyl releases were usually mastered better than the CDs. There was even a website comparing CD vs vinyl releases, where the person hosting it lamented this fact because objectively CDs have a much higher dynamic range than vinyl, although I can't find it now. CDs were a victim of the loudness war[0].

Allegedly, for a lot of music that is old enough, the best version to get (if you have the kind of hifi system that can make use of it) is an early 80s CD release, because it sits in a sweet spot predating the loudness war, when producers were actually using the dynamic range of the CD.

[0] https://en.wikipedia.org/wiki/Loudness_war


The loudness wars were mostly an artifact of the 90s-2010s, because consumers were listening on horrible plasticky iPod earbuds or cheap Logitech speakers and the music had to sound good on those.

Once better monitors became more commonplace, mastering became dynamic again.

This is most clear with Metallica's Death Magnetic, which is a brickwalled monstrosity on the 2008 release but was fixed on the 2015 release[0]. And you can see this all over, where albums from the 90s had a 2000s "10-year anniversary" remaster that is heavily compressed, but then a 2010s or 2020s remaster that is dynamic again.

[0] Interestingly enough between those dates, fans extracted the non-brickwalled Guitar Hero tracks and mastered them as well as they could. Fun times :).


Right, that makes sense. And it also makes sense that vinyl didn't suffer from this, because the people who bought it would play it at home on better speakers. Or that classical music CDs made great use of the dynamic range throughout the entire period, since that music is also more likely to be listened to on high-quality speakers.


Vinyl literally cannot be brickwalled because the needle can't handle it. That's also why vinyl can't handle heavy bass; it'll make the needle vibrate out of the groove. It has nothing to do with the speakers.

It was sort of a happy coincidence that vinyl's limitations forced more dynamic (but less bass-y) masters. Although if your artist didn't do vinyl releases (vinyl really was a dying medium until hipsters brought it back in the 2010s), you were hosed.


> Vinyl literally cannot be brickwalled because the needle can't handle it.

Interesting, I did not know this! I'm not doubting you, but I'm a little confused and curious about how the physics of that works out. Wouldn't being brickwalled mean the volume stays pretty constant, meaning there's less work for the needle? Or is there some kind of limit to how many overlapping waveforms a needle can pick up at once?


Bit of a lesson incoming, skip to the vinyl bit if you don't care for that:

"Dynamic range compression" is a bit of a misleading term because it sounds like you're taking an audio signal and and squeezing it.

What you're really doing is two things: reducing (compressing) the difference between the quietest (valleys) and loudest (peaks) parts, and then pushing the volume of the peaks up to or past 0dB. Technically, that second step isn't dynamic range compression, but in practice it is / was always done. They do this because, to human ears, louder sounds better. However, you lose dynamism. Imagine watching a movie where a whisper during a military night raid sounds as loud as the shouty conversation they had in the planning room.

Past 0dB, a signal will 'clip'[0], which means the loudest parts of the signal cannot be expressed properly and will be cut off, leading to signal loss. Basically, 0dB is the loudest you can get.
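
Here's a crude Python sketch of those two steps plus the clipping. It's a static waveshaper rather than a real compressor with attack and release, and the threshold/ratio/makeup numbers are arbitrary, but it shows the shape of the trick:

    # Squeeze the gap between quiet and loud parts, then push the level up;
    # anything past 0 dBFS (1.0 in float audio) simply clips.
    import numpy as np

    def compress_and_push(samples, threshold=0.3, ratio=4.0, makeup=2.5):
        out = samples.copy()
        over = np.abs(out) > threshold
        # Above the threshold, the signal only grows at 1/ratio of its excess.
        out[over] = np.sign(out[over]) * (threshold + (np.abs(out[over]) - threshold) / ratio)
        out *= makeup                      # "make-up gain": push everything louder
        return np.clip(out, -1.0, 1.0)     # past 0 dBFS it just clips

    loud_master = compress_and_push(np.sin(2 * np.pi * 440 * np.linspace(0, 1, 44100)))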

These days, in practice, music tracks get mastered so that the average value is -14dB because streaming sites will 'normalize' tracks so that the average dB is -14dB. Here[1] you can see why that makes brickwalling bad. If your track goes full tilt and has almost no valleys, the average dB per second is rather high, so your entire track gets squeezed to average out to -14dB. But if you have lots of valleys, you can have more peaks and the average will still be -14dB!
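
A rough sketch of why that rewards dynamic masters, using plain RMS as a crude stand-in for the LUFS measurement streaming services actually use:

    # Normalize two signals to an average level of -14 dB and compare peaks.
    import numpy as np

    TARGET_DB = -14.0

    def rms_db(samples):
        """Average level in dB relative to full scale (1.0)."""
        return 20 * np.log10(np.sqrt(np.mean(samples ** 2)) + 1e-12)

    def normalize(samples, target_db=TARGET_DB):
        """Apply a single gain so the average level lands on the target."""
        return samples * 10 ** ((target_db - rms_db(samples)) / 20)

    t = np.linspace(0, 1, 44100)
    dynamic = 0.9 * np.sin(2 * np.pi * 440 * t) * np.linspace(0.1, 1.0, t.size)  # has valleys
    brickwalled = np.clip(3.0 * np.sin(2 * np.pi * 440 * t), -1.0, 1.0)          # full tilt, clipped

    for name, sig in (("dynamic", dynamic), ("brickwalled", brickwalled)):
        print(name, "peak after normalization:", round(float(np.max(np.abs(normalize(sig)))), 2))
    # The brickwalled track gets turned down far more, so its peaks end up lower.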

RE: vinyl? Well, too much and / or too intense motion in the groove (the groove is effectively a physical waveform) makes the needle slightly skip out of the groove. "Too much" happens with brickwalling, "too intense" happens with very deep bass. Try to imagine the upcoming links I'm referring to as a physical groove a needle has to track, instead of a digital waveform.

Here[2] is the waveform of one Death Magnetic track, brickwalled original vs. fixed remastered release. It's not too bad. But then there is this[3] insanity.

[0] https://www.youtube.com/watch?v=SXptusF7Puo / https://www.youtube.com/watch?v=g7AbmhOsrPs

[1] https://cdn.shopify.com/s/files/1/0970/0050/files/46eacedf-c...

[2] https://happyhipster.wordpress.com/wp-content/uploads/2023/0...

[3] https://happyhipster.wordpress.com/wp-content/uploads/2023/0...


Thank you for the in-depth answers!


I dunno about authentic, but for a while (as another commenter pointed out) vinyl releases didn't have the loudness maxed out and/or had better dynamic range. That said, music quality aside, vinyl records IMO have better collectability than CDs. They feel less fragile, there's much more space for artwork and extras, etc.


I think the entire premise of the article should be challenged. Not only is 35mm not meant to be canonical, but the 35mm scans the author presented are not what we saw, at least for Aladdin.

I watched Aladdin more than any other movie as a child, and the Blu-ray screenshot is much more familiar to me than the 35mm scan. Aladdin always had the Velvia look.

> Early home releases were based on those 35 mm versions.

Here's the 35mm scan the author presents: https://www.youtube.com/watch?v=AuhNnovKXLA

Here's the VHS: https://www.youtube.com/watch?v=dpJB7YJEjD8


Famously, CRT TVs didn't show as much magenta, so in the 90s home VHS releases compensated by cranking up the magenta so that it would show correctly on the TVs of the time. It was a documented practice at the time.

So yes, the VHS is expected to have more magenta.
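
For what it's worth, "cranking up the magenta" in RGB terms just means raising red and blue relative to green, something like the sketch below. The gain values are invented; I have no idea what the transfer houses actually used.

    # Illustrative only: boost magenta by raising red and blue relative to green.
    import numpy as np

    def boost_magenta(frame, rb_gain=1.15, g_gain=0.95):
        out = frame.astype(np.float32)
        out[..., 0] *= rb_gain   # red up
        out[..., 1] *= g_gain    # green down
        out[..., 2] *= rb_gain   # blue up
        return np.clip(out, 0, 255).astype(np.uint8)

    frame = np.random.randint(0, 256, (480, 720, 3), dtype=np.uint8)  # stand-in for a film transfer frame
    vhs_master = boost_magenta(frame)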

Anecdotally, I remember watching Aladdin at the movie theatre when it came out and later on TV multiple times and the VHS you saw doesn't correspond to my memories at all.


The author here is asserting that VHS releases were based on the 35mm scans, and that the oversaturation is a digital phenomenon. Clearly, that's not true.

I can't challenge the vividness of your memory. That's all in our heads. I remember it one way, and you remember it another.


For sure, the author simplified things for the article. Anyway, in the case of VHS, they were indeed based on the 35mm scan but then had additional magenta added (as well as pan and scan to change the aspect ratio).

The author is not wrong that oversaturation is a source transfer phenomenon (which will always be different unless special care is taken to compare with the source material).

On most TVs, that magenta wouldn't have shown as strongly as the YouTube video suggests, because TVs tended to have weaker magentas. Of course, TVs weren't that uniformly calibrated back then, and there were variations between sets. So depending on the TV you had, the picture might have ended up with too much magenta, but that would usually have been on more expensive, more accurate TVs.

TLDR: transfers are hard; any link in the chain can be improperly calibrated, and historically some of the people in charge of transferring from one source to another compensated for perceived weak links in the chain.


The magenta thing is interesting; I learned something new. Reading the other comments, this seems to be as much a tale of color calibration as anything.

Regarding my memory, it becomes shakier the more I think about it. I do remember the purples, but my having watched the cartoon could have affected that.


NTSC: Never Twice the Same Color


At the same time, I think the nostalgia people feel for those versions isn't necessarily about accuracy; it's about emotional fidelity.



