I've only been playing with a webcam that does intervals, and trying different image rates and playback frame rates. Good cheap way to experiment and not worry about burning out a camera shutter in the process.
I don't expect anything better than the 640x480 resolution will produce, but it's useful for practice.
Here's the easy way to look at it.
Critical flicker frequency is what determines whether a series of images looks continuous or flickers like an old movie. (thus the term for movies... flicks)
Without getting deep into perception, light, rods and cones, and all that: at 12 frames per second, which is what early films ran at, persistence of vision was right on the edge of the roughly 1/20th of a second for which the eye retains an image.
8 and 16mm film was usually shot and projected at 16 or 18 frames per second. Super 8 went to 24fps I believe, and video is roughly 30fps.
35mm film, your "movie" film, runs at 24fps, but there's some other technology involved, like a spinning shutter doubling the flashes per frame, to make it look smoother. Plus there's how long each frame takes to go from dark to fully bright.
OK, so let's say 30 frames per second, since we don't have those limitations, sprockets, or all that complication.

The next part of the equation you'll want to figure is how many shots to take to capture either a slow or a quick change in the scene.
If you are shooting a plant and want to record the phototropic response, in June, on the solstice, from Sunrise until Sunset, you'll have 15 hours of light to record.
Since the plant is moving slowly, you don't need as many shots for that 15-hour period. So let's use a hypothetical one shot per minute, which gives us a total of 900 images.
900 images at 30fps = 30 seconds!
You want the whole day to take a minute? Shoot one frame every 30 seconds for 1800 frames, which plays back in 60 seconds at 30fps. Or just show the original 900 frames at 15fps instead of 30fps.
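If it helps to see the arithmetic spelled out, here's a minimal Python sketch of that plant example (the function name is just something I made up for illustration):

```python
def clip_length_seconds(shoot_hours, interval_seconds, playback_fps):
    """How long the finished time-lapse runs, given how long you shoot,
    how often you fire a frame, and the playback frame rate."""
    total_frames = (shoot_hours * 3600) / interval_seconds
    return total_frames / playback_fps

# 15 hours of daylight, one shot per minute, played back at 30fps
print(clip_length_seconds(15, 60, 30))   # 900 frames -> 30.0 seconds

# Same day, one shot every 30 seconds, still at 30fps
print(clip_length_seconds(15, 30, 30))   # 1800 frames -> 60.0 seconds

# Or keep the original 900 frames and slow playback to 15fps
print(clip_length_seconds(15, 60, 15))   # 900 frames -> 60.0 seconds
```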
Just like everything in photography, there isn't one answer for all problems. Clouds are much faster than plants, and don't always travel at the same speed, but let's say you are going to compress a cloud bank moving across the horizon and want one hour's time to be one minute.
Change any of the variables in the basic math, and one of the other numbers will change relative to it. 1800 frames displayed at 30fps is always going to be? One Minute! That's your starting point.
You need 1800 frames in one hour. That's 60 seconds x 60 minutes for 3600 seconds in an hour, so you'll need to take a frame every 2 seconds. (30 frames per second x 60 seconds of playback = 1800) That's a load of frames!
Taking the same clouds for an hour, but slowing the fps to 20, which may work just fine, I don't know... you only need to take 1200 frames, or one every 3 seconds.
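Working the math the other way, starting from the clip length you want and backing out the shooting interval, looks like this. Again just a sketch with a made-up function name:

```python
def shooting_plan(event_seconds, clip_seconds, playback_fps):
    """Given the real-time length of the event and the clip length you want,
    return how many frames you need and how often to shoot one."""
    frames_needed = clip_seconds * playback_fps
    interval_seconds = event_seconds / frames_needed
    return frames_needed, interval_seconds

# One hour of clouds compressed into a one-minute clip at 30fps
print(shooting_plan(3600, 60, 30))   # 1800 frames, one every 2.0 seconds

# Same hour, same one-minute clip, but played back at 20fps
print(shooting_plan(3600, 60, 20))   # 1200 frames, one every 3.0 seconds
```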
Don't go crazy. Just take your total frames, divide by frames per second, and you have your length in seconds.
Last one I did was the entire Super Bowl, at a bar, planning on four hours of shooting, and I wanted it to show in one minute. I picked 12fps for the display rate, just to make life easy.
4 hours = 14400 seconds, divided by 20 (a picture every 20 seconds) = 720 frames. 720 frames at 12fps = 60 seconds for the entire file to show.
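Plugging the Super Bowl numbers into the same kind of check (nothing here but the arithmetic from above):

```python
# Four hours of shooting, one picture every 20 seconds, played back at 12fps
event_seconds = 4 * 3600          # 14400 seconds of real time
interval_seconds = 20             # one picture every 20 seconds
playback_fps = 12

total_frames = event_seconds / interval_seconds   # 720 frames
clip_seconds = total_frames / playback_fps        # 60 seconds on screen

print(total_frames, clip_seconds)   # 720.0 60.0
```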
Hope that helped?