I thought I had commented on that thread but ... apparently not.
Anyway. I built my own hand-rolled dither filter for my canvas library - the CodePen demo shows it attempting to dither a live video stream (warning: the site will ask for permission to use your device's camera before displaying the canvas) - https://codepen.io/kaliedarik/pen/OJOaOZz
The problem I have with the filter is that it's not consistent across video frames (the dither jiggles around, even when nothing is moving between frames). It's really annoying and I don't know how to solve it. Does anyone know of any research/solutions for this sort of thing?
I have seen that page before - an excellent writeup on how to morph the dither pattern to a sphere and match it to camera rotation to get rid of the dither jank as a character/camera moves through the game scene. But I don't think it applies to my particular issue of seeing dither jank on a webcam livestream.
Maybe there's a way of adapting my algorithm to give it some memory of previous frames, to help minimise the jank as the current frame's output is calculated. Or I could identify the static parts of the current frame by comparing it to the previous frame, and only recalculate the dither for pixels that have changed beyond a minimal color-distance threshold?
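For what it's worth, here's a rough sketch of that second idea - reuse the previous frame's dithered value for any pixel whose color hasn't moved past a threshold, and only re-dither the rest. This is just an illustration, not your library's code: it assumes RGBA byte arrays like those from `ctx.getImageData().data`, and a basic 4x4 Bayer ordered dither stands in for whatever dither algorithm you actually use.

```javascript
// Assumption: a 4x4 Bayer matrix as a stand-in dither pattern.
const BAYER_4 = [
  [ 0,  8,  2, 10],
  [12,  4, 14,  6],
  [ 3, 11,  1,  9],
  [15,  7, 13,  5],
];

// Squared RGB distance between two frames' pixels at byte offset i.
function colorDistSq(a, b, i) {
  const dr = a[i] - b[i], dg = a[i + 1] - b[i + 1], db = a[i + 2] - b[i + 2];
  return dr * dr + dg * dg + db * db;
}

// Simple 1-bit ordered dither for one pixel (stand-in algorithm).
function ditherPixel(data, i, x, y) {
  const lum = 0.299 * data[i] + 0.587 * data[i + 1] + 0.114 * data[i + 2];
  const threshold = (BAYER_4[y % 4][x % 4] + 0.5) * (255 / 16);
  return lum > threshold ? 255 : 0;
}

// Re-dither only pixels that changed beyond `threshold`; copy the
// previous frame's dithered value otherwise, so static regions stay put.
function temporalDither(curr, prev, prevOut, width, threshold = 12) {
  const out = new Uint8ClampedArray(curr.length);
  const tSq = threshold * threshold;
  for (let i = 0; i < curr.length; i += 4) {
    const p = i / 4, x = p % width, y = (p - x) / width;
    let v;
    if (prev && prevOut && colorDistSq(curr, prev, i) < tSq) {
      v = prevOut[i]; // static pixel: keep last frame's dither
    } else {
      v = ditherPixel(curr, i, x, y); // changed pixel: recompute
    }
    out[i] = out[i + 1] = out[i + 2] = v;
    out[i + 3] = 255;
  }
  return out;
}
```

In a render loop you'd keep the last frame's pixels and dithered output around, call `temporalDither(curr, prev, prevOut, width)` each frame, then rotate the buffers. The threshold needs tuning per source: too low and camera noise still triggers re-dithering everywhere; too high and slow changes leave stale dither behind.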
(Pointing this out/linking in good faith, for reference)