Think & Execute, Don’t Execute & Fix – A Session On Frame Rates In Perspective To Your Workflow
Going through a pre-production workflow assessment is an essential part of producing professional video content. One of the first steps is to establish where your final cut is destined to be played back (e.g. online streaming platform, cinema, conference room, TV broadcast, international distribution). Amongst the various shooting parameters (e.g. bitrate/compression, recording format, gamma curves), the frame rate plays a critical part in format delivery.
It represents the frequency at which frames are recorded, usually expressed in fps, which stands for ‘frames per second’. Changing your frame rate during production can really mess up your workflow (audio sync and editing artefacts) if you don’t know what you’re doing, so as a general rule, stick to it! To help you make an educated decision, I’ll take you through some condensed motion picture history and its evolution into the digital age.
35mm film was standardised around 1910, thanks to the work of Thomas Edison (American inventor and businessman). Its aspect ratio was named the “Academy” ratio (1.33:1), and it ran at a frame rate of only 16 frames per second. This pace was chosen because it kept film rolls limited in length: the higher the number of frames per second, the longer the film roll. Recording for one second would take up about 30cm of film length (versus roughly 45cm at 24fps).
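As a quick sanity check on those lengths, here is a sketch in Python. Note that the 19mm frame pitch (four perforations at 4.75mm each, for standard 4-perf 35mm) is my assumption and is not stated above:

```python
# Film length consumed per second of 35mm footage.
# Assumes the standard 4-perf 35mm frame pitch of 19mm
# (4 perforations at 4.75mm each) -- an assumption, not from the article.
FRAME_PITCH_MM = 19.0

def film_length_per_second(fps: int) -> float:
    """Centimetres of film used per second of recording."""
    return fps * FRAME_PITCH_MM / 10  # mm -> cm

print(film_length_per_second(16))  # ~30cm at the early 16fps standard
print(film_length_per_second(24))  # ~45.6cm at the 24fps sound standard
```

The numbers line up with the figures in the text: roughly half again as much film is consumed at 24fps as at 16fps.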
In 1927, talking cinema emerged and gave birth to the “Movietone” film format. An optical track for audio playback was added to the film strip, alongside the moving pictures track (see the image below). But at 16fps (as in the older Academy format) the film moved too slowly past the optical sound head to provide enough audio bandwidth for faithful reproduction, so the frame rate was increased from 16fps to 24fps1 to compensate. Since then, it has pretty much stayed the standard specification for film production.
(Source: Blog – Stephanie Caffrey)
Then video came to life with CRT (Cathode Ray Tube) monitors, and television standards were set around the 1930s. The scanning frequencies of TVs had to be in sync with the frequency of the electrical energy supplied to the general public. Therefore, countries distributing power @ 50Hz (e.g. Europe, Africa, Asia) naturally chose 50i2 (25fps), and countries distributing power @ 60Hz (e.g. USA, Japan) went for 60i (30fps). That’s when interlaced scanning was born. Let’s see what this means…
First television ever made according to WhenGuide
In CRT monitors, an electron beam travels left to right and top to bottom across a phosphorescent screen, hitting one picture element at a time (made up of red, green and blue phosphor dots). Once the whole screen has been scanned by the beam, one field (half of a frame) has been formed. Take a look at this video (jump @ 1m48s) to see what the scanning looks like in slow motion: How a TV Works in Slow Motion – The Slow Mo Guys.
(Source: Physics 420 – UBC Physics Demonstrations)
A frame is made of lines (line 1, line 2, line 3, line 4, etc…). As in the example below, the odd lines (first field) are displayed first, followed by the even lines (second field). This sequence repeats 25 times per second (for 50i).
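A minimal sketch of that odd/even field split, with a frame modelled simply as a list of scan lines:

```python
# Split a frame into its two interlaced fields.
# A frame is modelled as a list of scan lines (line 1 first).
frame = ["line 1", "line 2", "line 3", "line 4", "line 5", "line 6"]

odd_field = frame[0::2]   # lines 1, 3, 5 -> first field
even_field = frame[1::2]  # lines 2, 4, 6 -> second field

print(odd_field)   # ['line 1', 'line 3', 'line 5']
print(even_field)  # ['line 2', 'line 4', 'line 6']

# At 50i, the two fields alternate:
# odd, even, odd, even, ... = 50 fields/s = 25 full frames/s.
```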
Here’s an example of what an interlaced image looks like…
There are multiple reasons for interlacing frames:
Retinal persistence phenomenon
It refers to the optical illusion that occurs when visual perception of an object does not cease for some time after the rays of light proceeding from it have ceased to enter the eye. Our eyes perceive a sensation of continuity whilst watching moving images from about 10fps, but movement fluidity really kicks in at around 40fps.
Introduction of a flicker
Whilst playing back at 25 full frames per second on a CRT monitor, the initially lit pixels (at the top of the frame) would have faded by the time the electron beam scanned the bottom of the frame (you can actually see this happening in the slow motion video suggested earlier in this post).
Bandwidth limitation and refresh rate
The refresh rate for scanning the CRT screen would have had to be twice as fast (so 50 full frames per second) to avoid flicker, therefore increasing the amount of information to be transported. Given the bandwidth limitation, this was not possible.
In conclusion, engineers found a clever way to address these constraints: frames were halved and played one half-frame at a time, which enhanced the viewer’s motion perception. So instead of playing 25 full frames per second (in countries distributing power @ 50Hz), CRT monitors displayed 50 half frames or “interlaced frames” per second (and 60 half frames for countries distributing power @ 60Hz). And that’s what interlaced scanning is.
Then, as technology evolved, progressive scanning came along, and it is an integral part of today’s video realm. There is no “scanning” as such, though: every pixel is lit simultaneously (at different intensities, depending on each pixel’s given information) for each frame. This allows a single frame to be analysed and displayed in full, as opposed to the interlaced scanning method. Take a look at this video to see what progressive scanning looks like in slow motion (jump @ 4m17s): How a TV Works in Slow Motion – The Slow Mo Guys.
Interlaced scanning (analogue and digital video), used for analogue and digital broadcast encoding/transmission standards since the 1930s:

| Frame rate | Description |
| --- | --- |
| 50i | Interlaced frame rate standard for countries distributing power @ 50Hz. Initially emerging from the PAL/SECAM (analogue video) and now DVB (digital video) encoding standards. |
| 59.94i (or 60i) | Interlaced frame rate standard for countries distributing power @ 60Hz. Initially emerging from the NTSC (analogue video) and now ATSC (digital video) encoding standards. |

Progressive scanning (digital video):

| Frame rate | Description |
| --- | --- |
| 23.976p | Content shot at 24fps (film) has to go through a process called 3:2 pulldown to be converted into video and broadcast on television3 (in countries with power distribution at 60Hz). |
| 24p | Film standard since 1927. |
| 25p | Frame rate emerging from countries distributing power at 50Hz (e.g. Europe, Argentina, Middle East). |
| 29.97p | This norm was introduced by the SMPTE in 1953 for full backwards compatibility of colour signal transmission with black & white televisions. The line rate and frame rate were both reduced by about 0.1% in order to fit the “colour” information4 into the signal (because of bandwidth limitations). The chrominance and luminance modulation frequencies were spaced (still within the same signal) in a way that B&W TVs could read the luminance information only (filtering out chrominance) and colour TVs would read both. |
| 30p | Frame rate emerging from countries distributing power at 60Hz (e.g. Japan, Canada, US). |
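The fractional NTSC rates in the table above, and the 3:2 pulldown cadence, both come down to simple arithmetic. Here is a sketch (the alternating 2-3 field pattern shown is the classic pulldown cadence):

```python
from fractions import Fraction

# NTSC colour rates: the nominal rate scaled by 1000/1001
# (the ~0.1% reduction mentioned in the table).
ntsc_30 = Fraction(30) * Fraction(1000, 1001)
ntsc_24 = Fraction(24) * Fraction(1000, 1001)
print(float(ntsc_30))  # 29.97002997...
print(float(ntsc_24))  # 23.976023976...

# 3:2 pulldown: each film frame is alternately held for 2 then 3 video
# fields, so 4 film frames become 10 fields (5 interlaced video frames).
def pulldown_fields(film_frames):
    fields = []
    for i, frame in enumerate(film_frames):
        hold = 2 if i % 2 == 0 else 3
        fields.extend([frame] * hold)
    return fields

print(pulldown_fields(["A", "B", "C", "D"]))
# ['A', 'A', 'B', 'B', 'B', 'C', 'C', 'D', 'D', 'D']
# 24 film frames/s -> 60 fields/s = 30 video frames/s,
# then everything is scaled by 1000/1001 for NTSC colour.
```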
Filmmaking Temporal Effects:

| Frame rate | Effect | Description |
| --- | --- | --- |
| Typically under 2fps | Timelapse (also known as Quick Motion) and Hyperlapse | Images captured at a much slower rate than the usual standard speeds (e.g. 24fps, 25fps, 30fps). When played back at normal speed, they give the sensation that time is passing in accelerated motion. |
| | | These speeds can either be: … |
| Over 60fps (commonly 100fps, 120fps, 180fps, 200fps, 240fps) | Super slow motion | Content shot at high speed which, when played back at normal speed, introduces an effect of slow-paced motion. Commonly used in sports, scientific/medical/experimental imagery, action films and nature documentaries. Phantom cameras (specialist high-speed cameras) are able to shoot at up to 6600fps, 12500fps and 25000fps5. |
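The temporal effects in the table above reduce to the ratio between capture rate and playback rate. A minimal sketch, using example rates from the table:

```python
def speed_factor(capture_fps: float, playback_fps: float) -> float:
    """Ratio of capture to playback rate: >1 means slow motion,
    <1 means accelerated (timelapse) motion on playback."""
    return capture_fps / playback_fps

# Slow motion: shot at 120fps, played back at 24fps -> 5x slower.
print(speed_factor(120, 24))  # 5.0

# Timelapse: one frame every 2 seconds (0.5fps), played back at 25fps
# -> motion appears 50x faster.
print(1 / speed_factor(0.5, 25))  # 50.0
```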
Geography of Video Transmission Standards
Analog Video Transmission (roughly 1930s to 2000s – including B&W and Colour)
Digital Video Transmission (roughly since the 2000s)
Set your specifications from the start by planning ahead for your deliverables (where your content is destined to be released). Select your frame rate across your project’s array of recording instruments, and stick to it until the end of your project. If you start messing around with your frame rate (excluding intentional slow motion effects), it’s going to cost you far more than you expected: time, money and health concerns (e.g. gut wrenches, headaches and tears).
Unfortunately, for those using archive footage, you may run into content at various frame rates (and definitions, resolutions, codecs, etc…), so make sure you plan ahead and consider putting a line in your production budget for it if applicable. You should be able to find appropriate advice from a professionally qualified entity such as a post-production services company.
On a side note, the frame rate you choose shouldn’t be as critical (again, as long as you don’t switch frame rates halfway through production) if your single camera’s content is the only media present on your editing timeline. If it’s destined to be delivered on a popular streaming platform such as Vimeo, make sure to check their ingest/playback specs and match them with your media delivery format. Planning ahead helps you make educated decisions.
Finally, I’d invite you to share your experience (in the comments section) of painful multi-frame-rate projects, what you learned from them, and how you managed to get all your media onto the same timeline (ideally without artefacts). In the future, this may help us document and create a definitive matrix of the various frame rate conversion processes.
- In other words, with this adjustment, 1s of film length (24 frames) would match 1s of sound recording
- “i” refers to the interlaced scanning system (explained further down in this post). In reality, 50i means that screens were playing 25 frames per second. But, to avoid a flickering artefact, each frame was split into two, therefore playing 50 half frames per second (= 25fps). I suppose that filming directly at 50 full frames per second would have demanded too much bandwidth for transmission/broadcasting and playback of the video signal. Perhaps video cameras weren’t even ready yet to shoot at such speed
- For countries with power distribution at 50Hz (PAL), 24fps film playback was sped up by about 4% (25/24 ≈ 1.042) in order to reach 25fps
- Also referred to as “Chrominance”.
- Some cameras may have bandwidth limitations for recording images. This may depend on the type of media they record onto (SD, CF, CFast, SSD, etc), which all have different recording speeds (bitrates). Therefore, to still be able to record at much higher frame rates, the resolution of the recorded stream may be reduced