To answer some of the questions here: the reason this hasn't been used before is that the technique requires access to the quad definitions (i.e., which four vertices make up each quad) on the GPU.
Until Mesh Shaders arrived recently, there was really no good way to send this data to the GPU and read back the barycentric coordinates you need in the fragment shader for each pixel.
The article offers several options to support older GPUs, like Geometry Shaders and Tessellation Shaders. That's good, but these are at best Terrible Hacks(tm). Proving that old extensions can be contorted into doing this is not proof of reasonable performance!
Notably, geometry shaders are notorious for poor performance, so the fact that they're listed as a viable strategy for older devices makes it pretty clear the authors are thinking about compatibility, not performance.
Still, I think this is very cool, and now that GPUs are becoming much more of a generic computing device with the ability to execute arbitrary code on random buffers, I think we are nearly at the point of being able to break from the triangle and fix this! We hit this triangulation issue several times on the last project, and it's a real pain.
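For anyone wondering why having the quad definition matters: once you have all four corners and a per-pixel (u, v), you can do true bilinear interpolation across the quad instead of the visible seam you get from splitting it into two triangles. A minimal sketch in Python (the function name and setup are mine, not from the article):

```python
def bilerp(a, b, c, d, u, v):
    """Bilinearly interpolate a per-corner attribute over a quad.

    Corners are laid out a=(0,0), b=(1,0), c=(1,1), d=(0,1) in quad
    (u, v) space; each corner is a tuple of channels (e.g. an RGB color).
    """
    return tuple(
        (1 - u) * (1 - v) * ca + u * (1 - v) * cb + u * v * cc + (1 - u) * v * cd
        for ca, cb, cc, cd in zip(a, b, c, d)
    )

# Example: interpolate a per-corner color attribute.
red   = (1.0, 0.0, 0.0)
green = (0.0, 1.0, 0.0)
blue  = (0.0, 0.0, 1.0)
white = (1.0, 1.0, 1.0)

corner = bilerp(red, green, blue, white, 0.0, 0.0)  # exactly the 'a' corner
center = bilerp(red, green, blue, white, 0.5, 0.5)  # equal blend of all four
```

A triangle split would instead interpolate the center from only three of the corners, and which three depends on the (arbitrary) diagonal chosen — that asymmetry is exactly the triangulation artifact being discussed.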
Yeah, B/W film was definitely not all that accurate. It was common practice to use unusual makeup, lighting, and colors to create a B/W image that looked correct (see chocolate syrup used for blood).
So in those cases, filmmakers had to counteract the limitations of the B/W transformation on the actual sets.
This is actually a very good point that nobody else has commented on. The very definition of lightness is somewhat ambiguous, and I would argue that vividness is its own axis, separate from saturation.
But which axis you care about is very context-specific.
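One concrete way to see that these are genuinely different axes (my own illustration, not from any particular color standard): in the HLS model, a dark red and a bright red can both be fully "saturated" even though one is far more vivid. Chroma (max channel minus min channel) captures that difference in vividness where HLS saturation does not.

```python
import colorsys

def chroma(r, g, b):
    # Chroma: absolute colorfulness, the spread between the
    # largest and smallest RGB channels.
    return max(r, g, b) - min(r, g, b)

bright_red = (1.0, 0.0, 0.0)
dark_red   = (0.2, 0.0, 0.0)

for rgb in (bright_red, dark_red):
    h, l, s = colorsys.rgb_to_hls(*rgb)
    print(f"{rgb}: HLS saturation={s:.2f}, chroma={chroma(*rgb):.2f}")
```

Both colors report HLS saturation 1.0, but their chroma differs by a factor of five — which axis "matters" depends entirely on what you're doing with the colors.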
1. I was not aware this was public.
2. I was told that all rankings should be taken in context, relative to others in the 2015 graduating class. Clearly none of these people is a contender for best in the world at anything.
The people I know on the list are certainly some of the smartest people I know, but take it with a grain of salt.
Made this app while I was learning AngularJS. It's quite useful for collecting the contact information of everyone in a group (instead of having to pass phones around and risk missing numbers, etc.).
Figured I'd throw it out here instead of letting it sit on my hard drive! Comments and feedback welcome!