How I made the Slit Scan video in Blender

While I did this in Blender, there's nothing about the technique that ties it to Blender; it could be done in any other package. The only requirement is that the renderer can do motion blur by some means other than faking it via motion vectors and blurring the finished image (much like you'd get by blurring a single image in Photoshop).
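
In Blender terms, that means Cycles' sampled motion blur rather than the Vector Blur compositor node. A rough sketch of the relevant settings from Python (property names are from recent Blender releases, and the shutter length is just a starting point, not a value from my scene):

    import bpy

    scene = bpy.context.scene
    scene.render.engine = 'CYCLES'               # path tracer that samples real motion during the shutter
    scene.render.use_motion_blur = True          # blur by sampling time, not by smearing a finished image
    scene.render.motion_blur_shutter = 0.9       # shutter length in frames (assumed value; tune to taste)
    scene.cycles.motion_blur_position = 'START'  # open the shutter on the frame so each frame owns its own sweep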

My rendering: https://youtu.be/MVCQr0uBCeQ

This is done using a pretty old technique called Slit Scan Photography. I hunted around for a description of the process and found a lot of illegible diagrams and vague descriptions. I finally found this video https://vimeo.com/391366006 which is just awesome. Now I know how the F/X folks made the star fields in Star Trek in 1966! The video does an excellent job of explaining the process and should, by itself, give you enough to recreate the effect in your 3D package. Here's a bit of a breakdown of the process:

The camera moves toward (or away from) your slit plane and exposes a single frame over the course of that entire movement. To avoid getting a smeared blur of the source image, you need to move the source while the camera is also moving. You also need to consider exactly how the camera is moving so you can "paint" the image into the blurred space the right way.
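
For reference, here is a rough sketch of how that camera move could be keyframed from Python, so the whole travel happens inside each frame's open shutter and then resets; the object name, distances, and axis are placeholders rather than my exact scene:

    import bpy

    scene = bpy.context.scene
    cam = bpy.data.objects["Camera"]              # assumed object name
    near_z, far_z = 0.2, 5.0                      # assumed start/end of the travel along the camera axis
    shutter = scene.render.motion_blur_shutter    # exposure length in frames (set in the earlier snippet)

    # Each rendered frame, the camera sweeps the whole distance while the shutter is open,
    # then snaps back to the start in time for the next frame's exposure.
    for f in range(scene.frame_start, scene.frame_end + 1):
        cam.location.z = near_z
        cam.keyframe_insert(data_path="location", index=2, frame=f)            # shutter open
        cam.location.z = far_z
        cam.keyframe_insert(data_path="location", index=2, frame=f + shutter)  # shutter close

    # Linear interpolation keeps the sweep even rather than eased at both ends.
    for fcu in cam.animation_data.action.fcurves:
        for kp in fcu.keyframe_points:
            kp.interpolation = 'LINEAR'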

For my setup, here's how the texture is positioned while the camera is close:

and a little further away:

and all the way out:

If you take notice of the yellow/white splotch with the red dot in it, you'll see it's sliding off to the right in these images. That happens to be up from the camera's perspective. The slide distance is actually fairly significant - about 20 times the distance the texture will advance for the next frame:

As you can see, it's nearly all the way back to where it was at the close side of the exposure in the previous frame. This kind of animation produces a result where the smear rendered in each frame appears to be coming at the viewer. Which way the texture moves (up or down) doesn't really matter. As long as each frame's start point is a little further along in the same direction the texture slides during the exposure, the pattern will appear to fly out toward the viewer from frame to frame. If each frame's start point is slightly behind where it was in the previous frame (relative to the slide direction during the exposure), the motion will appear to recede away from the viewer. The distance the source slides during the exposure controls how stretched out the texture appears within a frame, and the distance the source jumps from frame to frame controls the speed at which the texture appears to fly at the viewer.
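
To make those numbers concrete, here is a rough sketch of how the image plane's keyframes could be laid out under the same shutter settings; the object name, axis, and distances are placeholders, and the 20:1 ratio is just the one described above:

    import bpy

    scene = bpy.context.scene
    plane = bpy.data.objects["ImagePlane"]        # assumed name of the artwork plane behind the slit
    shutter = scene.render.motion_blur_shutter

    advance = 0.01          # per-frame jump of the start point; controls how fast the pattern flies at you
    slide = 20 * advance    # travel during each exposure; controls how stretched the texture looks
                            # (make the advance negative to have the motion recede instead)

    base_x = plane.location.x
    for f in range(scene.frame_start, scene.frame_end + 1):
        start = base_x + (f - scene.frame_start) * advance   # a little further along each frame
        plane.location.x = start
        plane.keyframe_insert(data_path="location", index=0, frame=f)            # shutter open
        plane.location.x = start + slide                                         # the big slide = the smear
        plane.keyframe_insert(data_path="location", index=0, frame=f + shutter)  # shutter close

    # As with the camera, linear interpolation keeps the slide even across the exposure.
    for fcu in plane.animation_data.action.fcurves:
        for kp in fcu.keyframe_points:
            kp.interpolation = 'LINEAR'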

Blender has a few implementation details (bugs? maybe) that make things a bit complicated. For starters, Cycles does not render motion blur for anything other than moving geometry. Lights do not blur, nor do animated textures. I first tried to do this by animating the texture's position on the object, and all that produced was big smears during the exposure, because the same texture slice was rendered across the entire exposure instead of slightly different parts of the texture along the way.
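
In other words, whatever carries the texture has to physically move. A tiny sketch of the distinction (object and node names are assumptions):

    import bpy

    plane = bpy.data.objects["ImagePlane"]  # assumed name

    # Blurs: the object's transform is animated, so Cycles sees moving geometry during the shutter.
    plane.location.x += 0.2
    plane.keyframe_insert(data_path="location", index=0, frame=10)

    # Does not blur: animating a Mapping node (or the UVs) only changes the texture lookup, not the
    # geometry, so the whole exposure renders one texture slice, smeared only by the camera's motion.
    # mapping = plane.active_material.node_tree.nodes["Mapping"]
    # mapping.inputs["Location"].keyframe_insert(data_path="default_value", frame=10)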

I wanted to avoid having the same texture render out the top slit as the bottom slit, which is what happens if you look closely at the Dr Who example in the video (the sliding texture appears first on the bottom half, and a moment later the very same pattern emerges on the top half). Avoiding that requires two different texture planes, but then I had the problem of obscuring them so that one is exposed only in the top slit and the other only in the bottom slit. I originally just split the UV space of a single plane and animated the UVs so that a) they were moving nicely and b) they sampled entirely different parts of the fractal and would look different. However, that runs into the motion blur limitation I mentioned. So I switched to two planes, with Boolean modifiers to hide the part of each plane that would be exposed to the other slit. That revealed the next limitation of Blender: the Booleans are all computed before the motion, or maybe at some point during the motion. At random, some frames partially exposed the wrong plane in each slit, so there was chaotic blinking from frame to frame where sometimes the wrong plane would expose in the top slit, and sometimes it wouldn't. You will need to have the slits far enough apart to avoid this problem, or just live with the result the BBC got when they made the Dr Who intro.

The slit plane needs to be as close as possible to the image plane(s); I have the image planes 0.0001 m behind the slit plane. I made the slits using a Boolean modifier on a simple plane, and the slit shapes themselves are simple planes with a Solidify modifier set to 1 mm thick. This gives me the flexibility to change the slits to other shapes as I like.
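
For anyone scripting it, a rough sketch of that construction (object names are assumptions; the thickness is the 1 mm mentioned above):

    import bpy

    slit_plane = bpy.data.objects["SlitPlane"]   # assumed: the opaque plane the camera looks through
    slit_shape = bpy.data.objects["SlitShape"]   # assumed: a small plane outlining one slit

    # Give the flat slit shape a little thickness so the Boolean has a volume to subtract.
    solid = slit_shape.modifiers.new(name="Solidify", type='SOLIDIFY')
    solid.thickness = 0.001   # 1 mm

    # Cut the slit out of the main plane; editing the slit shape later reshapes the slit.
    cut = slit_plane.modifiers.new(name="Slit", type='BOOLEAN')
    cut.operation = 'DIFFERENCE'
    cut.object = slit_shape
    slit_shape.hide_render = True   # keep the cutter itself out of the render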

I made the image planes emissive and quite bright.
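
If you prefer to build the material from Python, a rough sketch of a bright emissive setup (the texture path, strength value, and object name are placeholders):

    import bpy

    mat = bpy.data.materials.new(name="SlitScanEmit")
    mat.use_nodes = True
    nodes, links = mat.node_tree.nodes, mat.node_tree.links
    nodes.clear()

    tex = nodes.new('ShaderNodeTexImage')
    tex.image = bpy.data.images.load("//fractal.png")   # placeholder path to the source artwork

    emit = nodes.new('ShaderNodeEmission')
    emit.inputs['Strength'].default_value = 20.0        # "quite bright" - adjust to taste

    out = nodes.new('ShaderNodeOutputMaterial')
    links.new(tex.outputs['Color'], emit.inputs['Color'])
    links.new(emit.outputs['Emission'], out.inputs['Surface'])

    bpy.data.objects["ImagePlane"].data.materials.append(mat)   # assumed object name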

I will leave it as an exercise for the reader to figure out how to do the slow fade-in and fade-out.