Once upon a time there was a lonely Alien who lived in a small house on the beach. It was the very cold winter of 1952 when he decided to repair his spaceship and travel in time to the future. He could not wait any longer, as his socks were soaking wet and he was very close to attempting suicide. After a few days of hard work, the little green creature managed to fix his spaceship and was ready to travel. As soon as he landed in the future, he found himself in a pink underground train. He was in London and the year was 2015.
The Alien lived a very happy life in the city. It was full of fun, joy and sex. He had nothing to worry about anymore, as he was making lots of money by selling beautiful bags for a living.
One workshop, led by Lev Kuleshov, attempted to
discover the general laws by which film communicates
meaning.
Through a study of American films, as well as a series of
experiments, Kuleshov concluded that the shot has two
distinct values:
1. its value as photographic image
2. the value it acquires when juxtaposed
with another shot - The Kuleshov Effect
The film was shown to an audience who believed that the expression on Mosjoukine's face was different each time he appeared, depending on whether he was "looking at" the plate of soup, the girl in the coffin, or the woman on the divan, showing an expression of hunger, grief or desire, respectively. The footage of Mosjoukine was actually the same shot each time.
Montage Styles
Sequential Analytical Montage
In Analytical montage you analyse an event for its thematic and
structural elements, select the essential elements, and then
synthesise them into an intensified screen event.
One of the characteristics of an analytical montage is that the
main event or its major theme is frequently implied, but not
shown or otherwise made explicit, leaving it up to the viewer to imagine.
The example above:
a girl riding her bike
another cyclist is seen crossing her path
their bikes collide
A woman (her mother?) is helping - looks like no one was seriously injured
The main event (accident) was not shown for effect, engaging and forcing the viewer to participate in the event and develop a response.
When the placements of the events are switched around the overall interpretation of the montage can be read differently.
The example above shows the following order:
proposal
marriage
a baby
child is growing up and riding a bike
Another example above illustrates the same shots, but in a different order:
proposal
baby
child growing
marriage
Sectional Analytical Montage
Sectional montage isolates a section of an event or moment. It is achieved by producing a shot that establishes the context and sets the tone. The following sectional pieces add to this, going into greater depth.
The shot above starts the montage from a student's point of view.
The picture above illustrates the same set of shots, but examines the moment from the perspective of the teacher, as it establishes his point of view first.
Idea-Associative Montage
The idea-associative montage brings together two disassociated events to create a third, principal concept. Like the analytical montage, the idea-associative montage has two types: comparison and collision.
Comparison Idea-Associative Montage
This montage type compares two or more similar themes, but combines them in a manner that expresses them differently.
As demonstrated in the picture above, the two shots represent the same theme, with the man and the dog both looking into the public bin. While the two events are similar, the comparison conveys how the man is living a dog's life, scavenging for food, highlighting the idea of desperation and the social degradation of the poor.
Collision Idea-Associative Montage
This montage type clashes two similar but opposing events to reinforce a basic feeling. An example would be a montage that juxtaposes a man eating out of a bin with a man sitting at a table eating a buffet of food. It is a powerful and insightful type, yet a conspicuous one, which can either enlighten the viewer or engage them emotionally.
The video below was produced by our group to present three montage types - sequential analytical, comparison idea-associative and collision idea-associative montage.
Sergei Eisenstein
Eisenstein studied briefly under Kuleshov and was a pioneer in the use of montage editing, arguing that it was the essence of cinema. His work on cinema came from an intellectual viewpoint, developing theories to communicate abstract ideas in a new and modern way.
These are Eisenstein’s 5 methods of montage:
- Metric
- Rhythmic
- Tonal
- Over-tonal
- Intellectual
Metric Montage
This is based purely on the length of the shot. It induces the most basic emotional response, whereby tempo can be raised or lowered for effect. An example of this would be editing that cuts after a specific number of frames, regardless of what is happening within the shot.
Rhythmic Montage
This is a lot like metric montage, in that it involves tempo, but is more concerned with what's inside the frame - effectively cutting in tempo with the action.
The cutting happens for the sake of continuity. This creates visual continuity, but it may also be used to keep up with the pace of the film. A good example of this is the legendary car/train chase scene in The French Connection.
Tonal Montage
A tonal montage uses the emotional meaning of the shots, not just the temporal length of the cuts or their rhythmic characteristics. The point of this is to elicit a reaction that is more complex than with rhythmic and metric montage. An example is the death of the character Vakulinchuk in Eisenstein's Battleship Potemkin.
Overtonal Montage
An accumulation of metric, rhythmic and tonal montage, synthesising their effects on the audience for an even more abstract and complicated effect.
Intellectual Montage
While the other methods focus on evoking an emotional response, intellectual montage seeks to express abstract ideas. It does this by creating relationships between opposing visual concepts. Intellectual montage was the method that most interested Eisenstein, who saw it as an alternative to continuity editing.
Continuity Editing
A constructed scene where everything flows in a consistent, orderly, smooth and sequential manner.
Editing involves selecting and sequencing those parts of an event that contribute most effectively to its clarification and intensification.
Continuity concentrates on the structuring of on- and off-screen
space and on establishing and maintaining the viewer’s mental
map.
Creating a Mental Map
A mental map helps the viewer make sense of where things
are, where they are going, or where they are supposed to be
in on- and off- screen space.
Continuity editing relies upon matching screen direction,
position, and temporal relations from shot to shot.
Continuity clarifies the event.
Types of Continuity Editing
Graphic Continuity
Rhythmic Continuity
Spatial Continuity
Temporal Continuity
Graphic Continuity
Graphic continuity is when two successive shots are joined so as to create a strong similarity of compositional elements (e.g. colour, shape, etc.).
Graphic matches can be used to make metaphorical
associations.
Rhythmic Continuity
Rhythm in film and video is the perceived rate and
regularity of sounds, series of shots and movements within
the shots. Rhythmic factors include beat (or pulse), accent
(or stress) and tempo (or pace).
Rhythmic editing occurs when the editor adjusts the length
of the shots in relation to each other to control their overall
duration on screen to create accent, beat and tempo.
Spatial Continuity
Spatial continuity occurs when the filmmaker connects any two points in space through similarity, difference or development, or even divides the whole space into component parts to make up the scene. Spatial relations are reinforced by the use of techniques such as:
180° Rule
The eyeline match
Shot Reverse Shot
Match on action
Motion vector continuity
The cheat cut
180-degree rule / axis of action / line of action
When two characters (or other elements) are in the same scene,
they should always have the same left/right relationship to each
other.
Shifting to the other side of the characters on a cut, so that
person B is now on the left side and person A is on the right, will
disorient the viewer, and break the flow of the scene.
Eyeline Match
If the person looks left, the following shot should imply that the
looker is offscreen right.
A cut obeying the axis of action principle, or 180° rule, in which the first shot shows a person looking in one direction and the second shows a nearby space containing what he or she sees.
Eyeline matches can be a very persuasive tool to construct
space in a film, real or imagined.
Shot reverse shot
Shot reverse shot is a film technique wherein one character is
shown looking (often off-screen) at another character, and then
the other character is shown looking "back" at the first character.
Since the characters are shown facing in opposite directions, the
viewer subconsciously assumes that they're looking at each
other (i.e. the 180 degree rule).
Shot reverse shot
Shot reverse shot is also often combined with creative
geography to create the sense that two characters are facing
each other, when in fact they're being filmed in completely
different locations or at completely different times.
Match on action / cutting on movement
A cut which splices together two different views of the same moment in the movement, making it seem to continue uninterrupted.
A match on action adds variety and dynamism to a scene, since it conveys two movements: the one that actually takes place on screen, and an implied one experienced by the viewer, whose position is shifted.
Match vector or Cutting on movement
A motion vector is created by an object actually moving in a specific
direction or an object that is perceived as moving on the screen.
When breaking down a sequence of shots depicting a continuous
action there are usually five questions faced by the editor:
What is visually interesting?
What part of a shot is necessary to advance the ‘story’?
How long can the sequence last?
Has the activity been adequately covered on camera? Is there a
sufficient variety of shots to serve the above requirements?
The Cheat Cut
When the camera is set up for a second shot at a different angle, it is possible to move things around a little to improve the new composition; the difference in perspective and angle between the two shots hides the fact that things are not exactly in the same place.
Temporal Continuity
Means of constructing the story in terms of time: order, duration, and frequency:
The order of the presentation of events - these can be manipulated in order to
reveal different story elements and play with the viewer (e.g. Memento).
The manipulation of duration of events in the story can create ellipsis. Elliptical
editing presents action in such a way that it consumes less time on screen than
in the actual story.
Presenting the same events a number of times (frequency) can build up tension, while adding elements that move the story closer to the climax.
Elliptical Continuity
The shortening of
plot duration achieved by omitting intervals of story duration.
Elliptical editing is creating shot transitions that omit parts of an event,
causing an ellipsis in plot and story duration.
Elliptical editing can be achieved with dissolves and jump cuts, in order both to shorten time and to suggest a character's emotional state. Elliptical editing need not be confined to the same place and time.
Jump Cut
An elliptical cut that appears to be an interruption of a single shot.
Basically, two similar shots cut together with a jump in continuity, camera
position or time.
Either the figures seem to change instantly against a constant background,
or the background changes instantly while the figures remain constant.
Jump cuts have historically featured widely in avant-garde and radical filmmaking, and more recently in music videos, video art and alternative filmmaking, such as Lars von Trier's Dogme films.
Sound
The human hearing range is commonly given as 20 to 20,000 Hz, though there is considerable variation between individuals, especially at high frequencies, and a gradual decline with age is considered normal.
Microphones
Omnidirectional Mic
Omni pattern provides maximum ambient pickup. Frequency
response: 20 - 18,000 Hz.
Operates on battery or phantom power
Ideal for group vocals, strings, acoustic guitar and piano.
Stereo Mic
Provides the spatial impact and realism of a live sound field.
Frequency response: 30 - 20,000 Hz.
Battery operation only.
Ideal for TV, DAT (digital audio tape) and Radio recording.
Ideal for close-up vocal, overheads, piano and strings.
Mini Cardioid Mic (Lavalier Mic)
Frequency response: 40 - 20,000 Hz
Operates on battery and phantom power
The mic is attached to a power and transmitter module
Provides crisp, full sounding voice and instrument pickup.
Unidirectional Mic (Shotgun)
Two range settings: “Normal” for close up and medium distance recording; “Tele” for long distance pickup
Frequency response: 70 - 18,000 Hz
Designed specially for voice recording with video cameras
Zoom H4 Stereo Recorder
Panning
Panning is the spread of a mono signal across a stereo or multi-channel sound field - it is critical to the makeup of the stereo image. Panning adds space to a mix by positioning instruments centre, left and right.
Mixer pan controls
Usually the most problematic area of the sound field is the centre, as this is normally the busiest place within a mix. It is advisable to keep the kick, snare, bass and vocal in the centre, as they provide the music with a solid grounding and help anchor the rhythm. Every other instrument, however, is best positioned to either side of the centre.
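The notes above describe where to place instruments but not how a pan control actually computes the two channel levels. A widely used approach (not named in the source) is the constant-power sine/cosine pan law, sketched below in Python; the function names are my own:

```python
import math

def constant_power_pan(sample, pan):
    """Pan a mono sample across a stereo field.

    pan: -1.0 = hard left, 0.0 = centre, +1.0 = hard right.
    The constant-power (sine/cosine) law keeps perceived loudness
    roughly even as the signal moves across the field.
    """
    angle = (pan + 1.0) * math.pi / 4.0   # map [-1, 1] onto [0, pi/2]
    left = sample * math.cos(angle)
    right = sample * math.sin(angle)
    return left, right

# At centre, each channel sits at ~0.707 (-3 dB), so the summed
# power left^2 + right^2 stays constant wherever the pan pot sits.
l, r = constant_power_pan(1.0, 0.0)
```

The -3 dB centre dip is the reason a centred vocal does not jump in loudness when panned away: the two channel gains trade off along a quarter circle rather than a straight line.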
Waves reflected from nearby objects take less time to be reflected back, while waves reflected by objects across the pool take longer.
Reverb
If this distance is short, such as in a room or theatre, the sound will be reflected back to the source in less than one-tenth of a second. This effect is reverberation. Because there is such a small delay in the sound repetition, sometimes only a few milliseconds, reverberation is often perceived by a listener as adding fullness to the original sound. Reverberation will often be added to recorded music to better simulate the sound of a live performance, or to enhance the tone by making a thin sound fuller.
Echo
Everyone has had the experience of calling out in a valley or between large buildings and hearing their voice repeated back to them. When reflected sound travels a greater distance, such as across a river valley, and takes more than one-tenth of a second to return, it is referred to as an echo. Echo does not add to the original sound as reverberation does, but is perceived as a distinct repetition of the sound, usually slightly fainter than the original. The sound is weaker because of the energy lost as the sound waves travel the greater distance; this is referred to as decay. Echo can be measured by the time lapse between repetitions, the strength of the repetitions (i.e. how loud each repetition is) and the decay of the sound.
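The one-tenth-second rule above implies a rough distance at which reverberation turns into echo. Taking the speed of sound in air as roughly 343 m/s (an assumed figure, not stated in the text), a short sketch:

```python
SPEED_OF_SOUND = 343.0  # m/s in air at ~20 °C (assumed value)

def reflection_delay(distance_m):
    """Seconds for sound to reach a surface and be reflected back."""
    return 2.0 * distance_m / SPEED_OF_SOUND

def is_echo(distance_m, threshold_s=0.1):
    """Apply the one-tenth-second rule: beyond ~0.1 s the reflection
    is heard as a distinct echo rather than as reverberation."""
    return reflection_delay(distance_m) > threshold_s

# A surface roughly 17 m away returns sound in about 0.1 s -
# the approximate boundary between reverb and echo.
```

So reflections inside an ordinary room or theatre (a few metres) arrive well under the threshold and fuse with the source, while a canyon wall tens of metres away produces a separately audible repeat.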
Equalisers
Equalisers are software or hardware filters that adjust the loudness of specific frequencies. As with all sound engineering, the basis is the human ear: certain frequencies sound louder than others to our ears, despite having the same or even more energy behind them.
Equalisers were originally developed for physical venues such as movie theatres and outdoor areas, places that aren’t designed with acoustics in mind, to “equalise” all of the sound frequencies. For example, some venues will respond better to bass frequencies, so the EQ can be turned down on that end to prevent feedback and turned slightly up on the higher end to even things out. In general, you equalise for the physical space, to account for the particular combination of the room and equipment.
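The text explains why equalisers exist but not how a single EQ band is commonly implemented in software. As an illustration, here is the standard "peaking" biquad from Robert Bristow-Johnson's Audio EQ Cookbook - a well-known published formula, not something taken from this source - with a helper to check its frequency response:

```python
import cmath
import math

def peaking_eq_coeffs(fs, f0, gain_db, q=1.0):
    """RBJ Audio EQ Cookbook peaking-filter coefficients.

    Boosts or cuts gain_db decibels in a band centred on f0 Hz,
    at sample rate fs. Returns (b, a) biquad coefficient tuples.
    """
    A = 10 ** (gain_db / 40)
    w0 = 2 * math.pi * f0 / fs
    alpha = math.sin(w0) / (2 * q)
    b = (1 + alpha * A, -2 * math.cos(w0), 1 - alpha * A)
    a = (1 + alpha / A, -2 * math.cos(w0), 1 - alpha / A)
    return b, a

def gain_at(freq, fs, b, a):
    """Magnitude response of the biquad at freq, in dB."""
    z = cmath.exp(1j * 2 * math.pi * freq / fs)
    h = (b[0] + b[1] / z + b[2] / z ** 2) / (a[0] + a[1] / z + a[2] / z ** 2)
    return 20 * math.log10(abs(h))

# A +6 dB boost centred on 1 kHz at a 48 kHz sample rate:
b, a = peaking_eq_coeffs(48000, 1000, 6.0)
```

A graphic equaliser in a venue is essentially a bank of such bands at fixed centre frequencies, each boosted or cut to compensate for how the room responds.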
The photograph below was taken with correct camera settings:
The same photo with a colour temperature set to 2500 K. Colour is shifted
to the blue end
of the spectrum:
If we set the colour temperature to 10,000 K, the colour is shifted to the orange end of the spectrum:
Colour shifts happen when you set your camera's colour temperature to the wrong end of the Kelvin spectrum - wrong meaning not the same as your scene's colour temperature.
3 Rules
Set the camera up for the appropriate colour temperature if you want your picture to look normal.
You can set up your camera at a higher or lower colour temperature if you want a particular cinematographic effect, or if you are shooting day for night.
You can mix colour temperatures for creative effects.
Automatic White Balance
The Auto setting helps in adjusting
the white balance automatically
according to the different lighting
conditions, but you can try other
modes to get better results.
Types of lights
HMI lights (5,600-6,000 K) - outdoor lights
Dedo and Redhead (tungsten) lights (3,200 K) - indoor lights
Gels
Gels can be used to achieve:
colour balance (blue, orange or green gels),
diffusion - diffusion gels soften the light in order to reduce or diffuse shadows and glare,
neutral density - brings the intensity of light down without affecting the colour temperature.
Reflectors
Reflectors are used to reflect light on an object or subject in a situation where:
there is not enough artificial light
the artificial light is too powerful
Reflectors have different colour temperature properties. The reflected light can be strong or soft.
Class Work - A short video production
The aim of this workshop was to shoot a short video with a specific mood created by lights.
Our group decided to showcase a young man sitting in a cinema and watching a scary movie.
To imitate the specific type of light seen in a movie theatre, we used two dedo lights. One of them was covered with a blue gel, the other with a purple gel. In addition, the light intensity was manipulated during the shoot.
The Nikon camera we used had its white balance set to fluorescent, which helped us achieve the desired colour temperature.
Once you have your final script for your film, promo video, etc., you as a producer/director will need to organise your shoot.
The benefits of scheduling are saving time, saving energy and, more importantly, saving money. Your shoot will be much faster and smoother, and your cast, crew and client will love you for it.
How does it work?
There is no exact formula for creating an effective shooting schedule. Every project has different parameters and considerations. For instance, you might need less time to shoot an hour-long corporate video consisting of interviews and demonstrations in a studio than you'll need to shoot a four-minute music video that requires lip-synching and shooting in varied locations.
What should we consider before scheduling?
the shooting script - action or dialogue scenes (an action scene usually involves a few locations, a dialogue scene usually one; how many shots for every scene).
the footage you need in order to complete the project.
the types of shots - e.g. long angle, drone shots, tripod shots, etc.
the location (indoors, outdoors).
shooting permission.
the equipment (cameras, lights, sound, jib, dolly, etc...)
jib - In cinematography, a jib is a boom device with a camera on one end, and a counterweight and camera controls on the other (image below)
camera dolly - a specialised piece of filmmaking and television production equipment designed to create smooth camera movements (image below). The camera is mounted on the dolly, and the camera operator and focus puller or camera assistant usually ride on the dolly to operate the camera. The dolly grip is the dedicated technician trained to operate the dolly.
Location scouting
indoors or outdoors
take photos of your shooting location for reference
plan your outdoor shots so you can work around weather and light