How I made the Slit Scan video in Blender

While I did this in Blender, there's nothing about the technique that ties it to Blender; it could be done in any other package. The only requirement is that the renderer can do true motion blur by some means other than faking it with motion vectors and blurring the rendered image (much like you'd get using Photoshop on a single still).

My rendering: https://youtu.be/MVCQr0uBCeQ

This is done using a pretty old technique called Slit Scan Photography. I hunted around for a description of the process and found a lot of illegible diagrams and vague descriptions. I finally found this video https://vimeo.com/391366006 which is just awesome. Now I know how the F/X folks made the star fields in Star Trek in 1966! The video does an excellent job of explaining the process and should, by itself, give you enough to recreate the effect in your 3D package. Here's a bit of a breakdown of the process:

The camera is moving toward (or away from) your slit plane and exposing a single frame over the course of the entire movement. In order to not get a smeared blur of the source image, you need to move the source while the camera is also moving. You need to consider exactly how the camera is moving so you can "paint" the image in the blurred space the right way.

For my setup, here's how the texture is positioned while the camera is close:

and a little further away:

and all the way out:

If you take notice of the yellow/white splotch with the red dot in it, you'll see it's sliding off to the right in these images. That happens to be up from the camera's perspective. The slide distance is actually fairly significant - about 20 times the distance the texture will advance for the next frame:

As you can see, it's nearly all the way back to where it was at the close side of the exposure in the previous frame. This kind of animation produces a result where the smear rendered in each frame appears to be coming at the viewer. Which way the texture moves (up or down) doesn't really matter. As long as each frame starts a little further along in the same direction the texture slides during the exposure, the pattern will appear to fly out toward the viewer from frame to frame. If each frame instead starts slightly behind where the previous one started (relative to the slide direction during the exposure), the motion will appear to recede away from the viewer. The distance the source slides during the exposure controls how stretched out the texture appears in each frame. The distance the source jumps from frame to frame controls how fast the texture appears to fly at the viewer.
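To make the bookkeeping concrete, here's a tiny Python sketch of the slide/advance schedule described above. All the numbers are illustrative; the only figure taken from this post is that the slide during one exposure is roughly 20 times the per-frame advance.

```python
def texture_offsets(frame, advance_per_frame=0.05, slide_ratio=20.0):
    """Return (offset at shutter open, offset at shutter close) for one frame.

    Each frame starts advance_per_frame further along than the last, and during
    the exposure the texture slides slide_ratio * advance_per_frame in the same
    direction -- the roughly 20x figure mentioned above.
    """
    start = frame * advance_per_frame
    return start, start + slide_ratio * advance_per_frame

open0, close0 = texture_offsets(0)
open1, close1 = texture_offsets(1)
assert open1 > open0    # each frame starts a bit further along -> flies outward
assert open1 < close0   # yet well behind where the previous exposure ended
```

The two assertions at the end are the whole point: the per-frame start creeps forward in the slide direction (outward motion), while each new start sits far behind the previous exposure's end point, which is what the screenshots above show.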

Blender has a few implementation details (bugs? maybe.) which can make things a bit complicated. For starters, Cycles does not render motion blur for anything other than moving geometry. Lights do not blur, nor do animated textures. I first tried animating the texture's position on the object, and all that produced was big smears during the exposure. That's because Cycles rendered the same texture slice along the entire exposure instead of sampling slightly different parts of the texture along the way.

I wanted to avoid having the same texture render out the top slit as the bottom slit, which is what happens in the Dr Who example given in the video if you look closely (the sliding texture appears first on the bottom half, and a moment later the very same pattern emerges on the top half). Avoiding that requires two different texture planes, but then comes the problem of obscuring them so that one is exposed only in the top slit and the other only in the bottom slit.

I originally just split the UV space for the plane and animated the UVs so that a) they were moving nicely, and b) they covered entirely different parts of the fractal and would look different. However, see the motion blur bug I mentioned above. So I switched to two planes, each with a boolean modifier to hide the part of the plane that would be exposed to the other slit. That revealed the next limitation of Blender: the booleans are all computed before the motion, or maybe sometime during it. At random, some frames partially exposed the wrong plane in each slit, so there was chaotic blinking from frame to frame. You'll need to keep the slits far enough apart to avoid this problem, or just live with the result the BBC got when they made the Dr Who intro.

The slit plane needs to be as close as possible to the image plane(s). I have the image planes 0.0001m behind the slit plane. I made the slits using a boolean operator on a simple plane and the slit shapes themselves are simple planes with a Solidify modifier set to 1mm thick. This gives me the flexibility to alter the slits to other shapes as I like.

I made the image planes emissive and quite bright.

I will leave it as an exercise for the reader to figure out how to do the slow fade-in and fade-out.

Extended Downtime

Good golly gracious. Down for nearly two weeks. That really sucks. Sorry about that. I made the BIG mistake of attempting an in-place upgrade of my Windows Server 2008 R2 machine to Windows Server 2012 R2. Boy howdy, don't ever do that. I just about wrecked everything. Fortunately, I was able to piece it all back together, and everything is back up and functional again.

The PROPER way of upgrading your server infrastructure is to install new servers, migrate the service infrastructure onto those new servers, and then bring down the old ones. Despite the fact that Microsoft allows for in-place upgrades of operating systems, I have been sufficiently frightened away from ever trying that again.

Friends don't let friends perform in-place upgrades!

Just how does the password authorization and encryption work in the Remote Guide app/server?

As promised, here's a breakdown of the security system I implemented for my Remote Guide server. The basics are this: you enter your password on both the client (phone) and the server (PC). When the client connects, it requests a session token. The server generates a large random number and returns it to the client. Then both the client and the server generate a session key by hashing the password and the random number using the algorithm specified in RFC 2898 (PBKDF2). The astute among you may notice an opportunity for a man-in-the-middle denial-of-service attack. In other words, an evil-doer could intercept the random number returned by the server and replace it with another one before sending the response on to your phone. At that point, a very determined adversary could potentially determine your password once your client starts making further requests against the server. The more likely outcome is that your client will simply fail to communicate with your server, since it won't be keying off the right random source material.
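For the curious, here's a minimal Python sketch of that handshake using only the standard library. The iteration count, salt size, and key length are my assumptions; the post itself only specifies RFC 2898 over the password and the server's random number, with SHA256 as the hash.

```python
import hashlib
import secrets

def make_challenge() -> bytes:
    """Server side: the large random number returned to the client."""
    return secrets.token_bytes(32)

def derive_session_key(password: str, challenge: bytes) -> bytes:
    """Both sides: combine password and challenge per RFC 2898 (PBKDF2)."""
    return hashlib.pbkdf2_hmac("sha256", password.encode("utf-8"),
                               challenge, 10_000, dklen=32)

challenge = make_challenge()                 # sent to the client in the clear
server_key = derive_session_key("hunter2", challenge)
client_key = derive_session_key("hunter2", challenge)
assert server_key == client_key              # same key; the password itself
                                             # never crosses the wire
```

Note how the man-in-the-middle scenario falls out of this: if an attacker swaps the challenge in transit, the two sides simply derive different keys and nothing further works.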

This session key is used in the authorization header passed between the client and the server. For each request the client makes, the parameter data is hashed along with the session key, and the resulting hash is sent to the server. The server can then verify the hash and prove to itself that the client knows the password.
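Here's one plausible shape for that authorization check, sketched in Python. I'm using HMAC-SHA256 as the keyed hash; whether the real server uses HMAC or a plain hash over the concatenated key and parameters is an assumption on my part.

```python
import hashlib
import hmac

def sign_request(session_key: bytes, params: bytes) -> str:
    """Client: tag the request parameters with the session key."""
    return hmac.new(session_key, params, hashlib.sha256).hexdigest()

def verify_request(session_key: bytes, params: bytes, tag: str) -> bool:
    """Server: recompute the tag and compare in constant time."""
    return hmac.compare_digest(sign_request(session_key, params), tag)

key = b"\x01" * 32                     # session key from the earlier handshake
tag = sign_request(key, b"action=record&id=42")
assert verify_request(key, b"action=record&id=42", tag)
assert not verify_request(key, b"action=record&id=43", tag)  # tampering caught
```

Because the tag covers the parameter data, a valid tag proves both knowledge of the password and that the request wasn't altered in transit.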

All communication between the client and the server is also compressed and encrypted. For each message sent (either sent from the client or sent in response by the server), the sender generates a large random number and again uses this, along with the pre-arranged password, to generate a message key. The random number is sent in the clear, immediately followed by the compressed and encrypted message. The other side is then able to decrypt the message using the received random number and the pre-arranged password.

All encryption is done with AES, all hashing is done with SHA256.

This algorithm isn't bullet-proof, I'm sure. But it's sufficient to keep the casual eavesdropper from reading your TV guide data, and it doesn't expose your password. It also prevents mischief-makers from injecting false commands into the server, so at least you won't find your Media Center suddenly recording every episode of Barney and Friends.

Remote Guide Server for Windows 7/Windows 8.1 Media Center

This is the server component for my Remote Guide app for Windows Phone 8 and Windows 10. The app lets you browse the TV guide on your Media Center machine, schedule recordings, set up series recordings, see what’s been recorded, what will be recorded and so on. The app does not feature streaming of program data at this time. I wrote this app mostly for my own pleasure and so that I could quickly set a program to record while I was out and about. Given that I work for a living, remote viewing of my recorded shows wasn’t a priority – watching TV can wait until I get home. This server app will only run on Windows 7 or Windows 8.1 (and Windows 10, if you're clever) with Media Center. It won't work on earlier versions of Windows. Sorry. It will, however, run properly on both 32-bit and 64-bit operating systems.

The app features secure communications between the phone and the server. Well, secure enough. While other folks may go on and on about their wonderful proprietary security schemes, I'm smart enough to know that I'm not smart enough to invent an unbeatable security system. It is, however, good enough. Your password won't be exposed, and the data going over the air is all encrypted. I'm sure a determined enough hacker will figure out how to break the encryption, but I doubt anyone would really want to work that hard. Details of how it works can be seen here.

Installation of the server should be a snap: just copy the executable file to somewhere on your Media Center machine and run it. On Windows 8.1, you'll receive a warning that Windows has protected your PC because the app is unrecognized. Click 'more info', and then 'run anyway'. On Windows 7, you'll be prompted with a security warning. Uncheck the checkbox "Always ask before opening this file" and then click "Run". After you've passed the security warning, you'll then be prompted to elevate to administrative privileges. Enter the appropriate credentials so it can run as admin. The app will finally start up. It provides a simple interface for installing and configuring the service. When you first run it, you’ll see something like this:

image

The information about your IP address and host name will be filled in. If you’ve bought your own domain name, you’ll see that there. If not, you’ll probably see whatever your ISP has assigned to your IP address. If you have a dynamically assigned IP address, you’ll very likely want to go buy a domain name and then use one of the various free services out there to link your new domain name to your dynamic IP address.

To install the service, just click the Install button. Once installed, choose a port number (the default will likely work out fine) and give your server a password.

You should click the Test Connectivity button to confirm the server will be able to configure your home router properly. Most modern routers implement Universal Plug and Play (UPnP), a protocol which allows automatic configuration of router services. Unfortunately, most modern routers implement it very poorly, so your mileage may vary. If the connectivity test fails, it may mean you’ll need to manually configure your router to direct traffic on your chosen port back to your Media Center machine. If you have no idea what this means, well, you’re not going to have much luck with this product.

Once you’ve assigned a port and password, you can start the service. Once it’s running, you no longer need to have this UI running. The service will restart on its own if you should reboot the computer. Note that the service will automatically open the appropriate port in Windows Firewall and your router when the service starts. It’ll close those ports when the service is stopped (assuming, again, that UPnP is implemented correctly on your router).

Once your server is up and running, the next step is to install the app on your Windows Phone. Scan this with your Windows Phone to install the app:

Scan Me

You need to be running at least v8 of the phone operating system. Once the app is installed, start it up and then configure it to talk to your Media Center:

Setup

Just enter your server name (visible on the setup UI), the port number and password you chose, and you’re ready to go. After you've entered your server information, you can automatically set up Remote Wake for your computer. If your computer is set to put itself to sleep, the app can remotely wake it up so that you can still see the guide.

The main menu is the portal into the various features of the app:

Main

This should be pretty self-explanatory from here. You can browse the guide, see what you’ve got set to record, what’s going to record in the near future, what you’ve got recorded, and the ability to search for programs by various keywords. Many of the listing pages feature context menus. Just press and hold a list item to see a quick list of things you can do. You can schedule programs to record right from the program details page:

Program

Just push the single dot to record that show, or the triple fading dot to record the whole series. You can configure the series settings (say, what channels and times are permissible, whether you want first run only, etc.) by tapping the edit button or from the Requests page. Explore around and let me know how you like it.

One point of note: All times and dates displayed on the phone will be in the phone's timezone, not your server's timezone. So, if you happen to be travelling, the guide data you see will be local to where you are.

Download the server:

To update the version you're already running, you do NOT need to uninstall the service. Simply stop the service, quit the setup UI (if that's how you stopped the service), and copy the new version of the server app over your existing version. Then start the service back up again. That's it. Be sure to install the server version that matches your client version. Do not install a new server version unless you've installed that same client version on your phone. The server and client are tied together and cross-version functionality is expressly prevented.

On a related note, I've been having a heckuva time keeping the clock on my Media Center machine properly set. Windows is supposed to keep the clock in sync by itself, but for whatever reason, it is doing a rather poor job. I've found this service which will properly keep your clock synchronized. Hopefully, someday, Windows itself will properly implement this and the need for 3rd party solutions will vanish.

Rigging Hydraulic Pistons in Lightwave

I just read this (post #21) and figured I'd pontificate on what I've been able to do fairly quickly. Lightwave makes this easy with the built-in tools available. Now I have to be honest here: I don't remember if I read this technique in someone else's tutorial, or if I thought of it on my own.

Once upon a time I downloaded a model of the Imperial Probe Droid from Star Wars Episode V. The model I downloaded, however, was built for another package, and built pretty sloppily at that. I ended up completely rebuilding it (nothing original, though: I used the downloaded model as my template, so mine looks pretty much exactly like the original). One of the deficiencies of the download was that there was nothing to animate. I wanted the spider legs to fold and unfold. Here's a short flick of my rebuilt, fully rigged model.

Doing this easily requires a bit of planning in Modeler. You need to build each animatable part in its own layer, with its rotation point at 0,0,0 and facing along the positive Z axis. For example, here are all the parts for one leg:

image image image image image

The next step is to place in Layout each part in its proper relative position:

image

Also, set up a proper parenting hierarchy:

image

Now, the really easy part: Bring up the Motion Options panel for the Piston, and have it point at the Cylinder, and vice-versa:

image image

Suddenly, the two are now properly engaged, and as you rotate the lower arm, the whole piston assembly Just Works:

image image
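The mutual point-at targeting boils down to very simple math; here's a 2D Python sketch (coordinates and names are invented for illustration, not taken from the actual Lightwave scene) showing why the two halves always stay lined up:

```python
import math

def aim_angle(pivot, target):
    """Heading (degrees) that points a part from its pivot toward the target."""
    return math.degrees(math.atan2(target[1] - pivot[1], target[0] - pivot[0]))

piston_pivot   = (0.0, 0.0)   # parented to the upper arm
cylinder_pivot = (3.0, 4.0)   # parented to the lower arm, which is animating

# Each half aims at the other's pivot, so the assembly stays collinear and
# simply telescopes as the distance between the pivots changes.
a = aim_angle(piston_pivot, cylinder_pivot)
b = aim_angle(cylinder_pivot, piston_pivot)
assert abs(abs(a - b) - 180.0) < 1e-9     # exactly opposite headings
```

However the arm rotates, the two headings always differ by exactly 180 degrees, which is why the rig Just Works with no further keyframing.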

The next step is to assign reasonable rotation limits for each part of the arm. You don't need to worry about the piston assembly at all, since it's now fully automated. Finally, add an IK target for the arm and turn on IK on the end piece and each piece up the chain (be sure not to enable "Keep Goal Within Reach"):

image image

Now you can just animate the goal object and the whole leg does its thing.

I've switched to BlogEngine.Net

After a couple of years of running on the DasBlog engine, I've decided to try out something different. What's this mean to you? Probably not very much.

I had to hack the code a bit to accept the DasBlog style URLs, so anyone who has existing links to my site won't have to go updating their own site.

Now I just need to keep an eye out for comment spam. Wish me luck.

Complex UV Mapping The Easy Way

I have to admit, I didn't make this one up; I observed it in some great models I got from Al3d. I'm glad he left the leftovers in those models, or I'd probably never have thought of this.

Say you've got a model with various and sundry shapes and surfaces, none of which are precisely Planar, Cylindrical, or Spherical. Yet you want to set up some UVs on one of these surfaces. The built-in tools in Lightwave make it fairly difficult (granted, I'm not very experienced at this, so I could be missing something). You can select some polygons, click "Assign UVs", and then choose one of the built-in projection types. But what if the surface you want to project onto isn't quite facing the right way? I've tried transforming the UVs after assignment, and all I ended up with was horrible distortion.

The thing to remember is that UV coordinates are assigned to points. Once assigned, they're constant, no matter what transformations you apply to the points. The trick here is to apply a transformation before we assign the UVs. The best part is that the transformation is disposable (no need to get fancy with Undo or some funky plugin).

Just create a new morph-map for your object. Then select the morph map. Now rotate, move, scale, whatever the model so that a simple projection is possible. Select the polygons, assign the UVs, then dispose of the morph map. That's it!
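If it helps to see why projecting on the un-rotated pose avoids the distortion, here's a toy Python sketch of a planar Y projection. The functions, coordinates, and rotation amount are all invented for illustration; Lightwave's actual projection math isn't exposed this way.

```python
import math

def assign_uv_y_projection(points):
    """Planar Y projection: u comes from x, v from z, normalized over the bounds."""
    xs = [p[0] for p in points]
    zs = [p[2] for p in points]
    x0, sx = min(xs), (max(xs) - min(xs)) or 1.0
    z0, sz = min(zs), (max(zs) - min(zs)) or 1.0
    return [((x - x0) / sx, (z - z0) / sz) for x, y, z in points]

def rotate_x(points, deg):
    """Rotate points about the X axis -- our stand-in for the temporary morph."""
    c, s = math.cos(math.radians(deg)), math.sin(math.radians(deg))
    return [(x, y * c - z * s, y * s + z * c) for x, y, z in points]

banked = [(0.0, 0.0, 0.0), (2.0, 1.0, 0.5), (4.0, 2.0, 1.0)]  # wing, banked over
flat = rotate_x(banked, -30.0)     # the "morph": un-bank so a Y projection fits
uvs = assign_uv_y_projection(flat) # UVs are computed once, from these positions
# Throwing away the morph restores the banked shape, but the UVs, being stored
# per point, keep the undistorted values computed from the flattened pose.
assert len(uvs) == len(banked)
```

The projection only ever looks at point positions at the moment of assignment, which is exactly why a temporary morph is all you need.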

Here's an example: I've got a Viper model, from Battlestar Galactica. Here's an image of a wing:


image


I want to add the "NO STEP" image to where I've got green lines. The problem here is any planar projection is going to leave me with seriously distorted images. So, I make a morph-map. With the morph map selected, I'll rotate the wing so that it's not banked over:

image

Then rotate it so that the wing is aligned to my intended text alignment, and select the relevant polygons:

image

Create a new UV map with a simple Y projection (using defaults):

image

At this point, you can dispose of the morph map (or continue rotating for more applications). The UV map isn't scaled quite right just yet, so we'll tweak it a bit:

image image

I just scaled up the UVs, and moved them a little:

image


I find this technique substantially easier than trying to transform the UVs after the fact, and it's much easier than using UV Edit Pro.

Making "Battlestar" Ribs and Hull Plating

You've seen the show. You may even know the visual effects are all made with Lightwave 3D. Now, you'd like to make your own battlestar. Here's a quick method for making the ribs and hull plating just like you see it on the show. These techniques should translate fairly easily into other 3D packages, so read on.

First, you want to make the basic shape of your ship. I won't go into detail on how that's done. I'm going to start with a basic shape for the head of the ship:

image

Once you have that all squared away, we're going to make a copy. Copy it into the next layer. In the 2nd layer, use the Smooth Shift tool to inflate the object by some amount (I've built mine to scale, so I'm going to inflate it by 2 meters). The amount you shift out will be the height of the ribs over the main surface. The plates will then also sit on the ribs.

Then, put focus on the 3rd layer, and set the 2nd layer to be your background layer:

image

Now we're going to create an array of boxes to slice into this layer. Think about how wide you want each rib to be, and how far apart you want them. For my taste, I'm creating ribs which are 1m thick, and with a gap of 4m between. Make a single box on the left edge of the hull, 1m thick and any amount taller and wider than the ship itself:

image

Now clone it enough times (with that 4m gap in between) so it spans along the whole length of the ship:

image

And now the easy part: do a Boolean operation (Intersect) and change the surface to "Ribs":

image

Cut everything from this layer and paste it on layer 1:

image

There! Easy ribs. Now for the hull plates. Again, put focus on layer 3, with layer 2 in the background. Create a bunch of boxes in various shapes, sizes, and positions (I'm not working on accuracy to the studio model here - just giving an example of the process). Be careful about what parts of the surface on layer 2 they intersect:

image

Now go to layer 2, select layer 3 into the background and do Solid Drill with a stencil operation. For the new surface, choose "Plates."

image

Select the stuff that isn't "Plates" and delete it:

image

Because Lightwave's Smooth Shift operation doesn't leave the original geometry behind (a single plane, when shifted, becomes a box with an open end where the plane originally was), we need a couple of tricks here. First, flip all the polygons, then copy them to the clipboard. Flip them again and apply the Smooth Shift operator for, say, 1m. Then paste and merge points. You get (zoomed in):

image

Now cut all this and paste onto layer 1:

image

There, easy plates! For more aesthetically pleasing results, you may want to round off the sharp corners of everything. I hope this has been helpful and educational.

New Product from Apple

Hi, I'm Steve Jobs. Have you noticed a recent inability to, well, get things moving? Do you find yourself sitting on the toilet for what seems to be an eternity? Has your local Super Shopping Center lost business because of reduced sales of rolled paper? If so, you need the new

Apple iPoopTM

The Apple iPoop is not a drug. It's a technological device. You simply insert it into your bottom. You may feel a certain increase in personal coolness, but this is normal. It's an Apple product after all. Simply leave it in place for the duration of one music track and then you'll soon see why all the kids are using it.

Each iPoop is good for thousands of uses. Simply recover the device before flushing the toilet, rinse it off, and it's good for another day.

WARNING: Do not exceed two uses in any single twenty-four-hour period.

Tell your friends "Hey, iPoop!" The new Apple iPoop. Join the MovementTM.