From the example in this video I’m not quite sure how you get anti-aliasing for free with sub-pixel sampling. I understand the idea in general, but in this example the light is reflected off a diffuse surface. Presumably the direction of the diffusely reflected ray has a large random component, so sub-pixel sampling provides very little benefit, if any. Is my take on this correct?
Consider a scenario where almost half of a pixel (but not the center of it) is covered by some diagonal black object that's in front of a white object.
If you ray trace and shoot one ray through the center of the pixel, it will hit the white object, and no matter how many rays you spawn from that intersection point, the part of the black object covering the pixel will never contribute to the final image. So with this technique the final pixel will be pure white.
With path tracing you shoot many initial rays at random sub-pixel positions; most miss the black object and hit the white one, but some hit the black one instead. So with this technique the final pixel will be gray, a mixture of the two objects' colors, and therefore anti-aliased.
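The difference is easy to demonstrate numerically. Here is a minimal sketch of the scenario above: a hypothetical scene where a diagonal black object covers roughly 40% of a pixel's footprint but misses its center, so a single center ray sees only white while jittered sub-pixel rays average out to gray. The scene geometry and function names are my own invention for illustration.

```python
import random

def shade(x, y):
    """Color hit by a ray through sub-pixel position (x, y) in [0, 1)^2.
    A hypothetical diagonal black object covers the corner region
    x + y < 0.9 (about 40% of the pixel); everything else is white.
    The pixel center (0.5, 0.5) lies just outside the black region."""
    return 0.0 if x + y < 0.9 else 1.0

def center_sample():
    # One ray through the pixel center: it misses the black object entirely,
    # so the pixel comes out pure white no matter what happens afterward.
    return shade(0.5, 0.5)

def subpixel_sample(n, rng):
    # Average n rays shot through random positions inside the pixel.
    return sum(shade(rng.random(), rng.random()) for _ in range(n)) / n

rng = random.Random(42)
print(center_sample())           # 1.0: edge information is lost
print(subpixel_sample(10000, rng))  # roughly 0.6: gray, proportional to coverage
```

The sub-pixel average converges to the white fraction of the pixel's area (here about 0.595), which is exactly the coverage-weighted blend an anti-aliased renderer should produce.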
While sub-pixel sampling provides little extra benefit for diffuse surfaces specifically, sampling the same pixel multiple times does smooth out noise across the surface of the object.
Since sub-pixel sampling is barely more expensive than repeatedly sampling the same point within the pixel, you might as well just use sub-pixel sampling.
Additionally, a pixel has no knowledge of its contents until you cast a ray. And even if the ray did hit a diffuse surface, you would still have to do extra math to determine whether an edge of that surface lies within the boundaries of the pixel. Might as well just use sub-pixel sampling.
When you shoot a ray from camera perspective, you've already chosen which pixel it's going to contribute to. "Platonic" pixels are infinitesimally small.
In reality, pixels on the screen and camera sensor have an area. If you choose a random position on that area, you get anti-aliasing "for free" because there's going to be variation in the direction of rays that contribute to the same pixel.
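To make the "variation in ray direction" concrete, here is a rough sketch of jittered primary-ray generation for a pinhole camera looking down -z (a common convention, assumed here; the function name and parameters are mine, not from the video):

```python
import math
import random

def camera_ray(px, py, width, height, fov_deg, rng):
    """Return a unit direction for a ray through pixel (px, py),
    jittered to a random position inside that pixel's area."""
    # Random sub-pixel position in [px, px+1) x [py, py+1).
    x = px + rng.random()
    y = py + rng.random()
    # Map to screen space for a pinhole camera with the given vertical FOV.
    aspect = width / height
    scale = math.tan(math.radians(fov_deg) / 2)
    dx = (2 * (x / width) - 1) * aspect * scale
    dy = (1 - 2 * (y / height)) * scale
    # Normalize the direction (camera looks down -z).
    length = math.sqrt(dx * dx + dy * dy + 1)
    return (dx / length, dy / length, -1 / length)

rng = random.Random(1)
# Two rays for the *same* pixel differ slightly in direction,
# which is where the free anti-aliasing comes from.
print(camera_ray(10, 10, 640, 480, 60, rng))
print(camera_ray(10, 10, 640, 480, 60, rng))
```

Averaging the radiance carried by many such rays per pixel integrates over the pixel's area, which is precisely the box-filtered anti-aliasing described above.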
You actually don’t get anti-aliasing for free on camera sensors. Many cameras put a low-pass filter in front of the sensor specifically to avoid aliasing. More recently, high-end cameras have been omitting the filter in order to improve resolution; this occasionally results in aliasing (which photographers call moiré) when photographing fine, repeating patterns.