
Audio/Video Production 1, Semester A 21-22

In this lesson, you’ll describe the history and evolution of the various stages of audio-
video production. You’ll trace the development of audio, video, and motion-film
broadcasting as well as the history of the broadcasting industry. You’ll also list the
stages of audio-video production. Finally, you’ll analyze the effect of changing
technology on audio-video production.

Evolution of Audio Production


The history of audio production is the history of recording sound. The ancient Greeks
knew sound exists in the form of waves that travel through air. Through the ages,
various scientists contributed to the understanding and study of sound waves. The
earliest method of recording sound was to record directly onto a medium. A
diaphragm captured the sound and transmitted its vibrations to a stylus. A stylus was a
pointed needle-like structure that etched a groove in the recording medium. This was
called acoustic recording.
Acoustic recording: A Frenchman, Edouard-Leon Scott de Martinville, invented
the phonautograph in 1857. It could record airborne sound, but could not play it
back. In 1877, Edison recorded and played back his recital of "Mary Had a Little
Lamb" using a device he invented called a phonograph. The first phonograph used a
sheet of tinfoil as the medium on which you could record the sound waves. However, you
could play the foil only a few times. After that, the indentations made by the stylus wore out
or tore the foil. In 1885, Alexander Graham Bell invented the graphophone that used
wax-coated cardboard cylinders as a medium. These cylinders improved the durability
of the recording. In 1887, Emile Berliner patented a new device called
the gramophone that recorded sound on flat shellac discs. Early discs and cylinders had
the same audio quality. However, discs were easier to reproduce. In time, the disc
survived and evolved into the popular vinyl record of the 1940s.
Electrical recording: The invention of the microphone led to a significant
advancement in the evolution of audio production. The microphone converted the
mechanical vibrations of sound waves into an electrical signal. This signal, when
amplified, moved the recording stylus. The electrical recording system introduced in
1925 provided the ability to amplify the vibration signals sent to the
stylus. Consequently, the stylus cut the grooves much more precisely than in acoustic
recordings. Therefore, electrical recordings were louder and clearer. They also allowed
the recording of distant and feeble sounds. This was not possible using earlier acoustic
methods.
Because of the better quality of electrical recordings, it became feasible
to overdub. Overdubbing is a technique in which a performer listens to a previously
recorded performance and performs along with it. The performer records this
simultaneous performance as well. The final recording contains a mix of both
performances. Overdubbing can enhance a singer’s recording, originally accompanied
by only one instrument, with the sound of an entire orchestra.
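In today’s digital studios, overdubbing boils down to summing the samples of two recordings. The following sketch is a minimal, hypothetical Python illustration of that idea (the function name, gain parameter, and the assumption that samples lie in [-1.0, 1.0] are mine, not part of the lesson):

```python
# Illustrative sketch: mix a new performance onto an existing recording by
# summing their samples, the digital analogue of overdubbing.
# Assumption: sample values lie in the range [-1.0, 1.0].

def overdub(base_track, new_track, new_gain=0.5):
    """Mix a new performance onto an existing recording, sample by sample."""
    length = max(len(base_track), len(new_track))
    mixed = []
    for i in range(length):
        a = base_track[i] if i < len(base_track) else 0.0
        b = new_track[i] if i < len(new_track) else 0.0
        # Sum the two performances, then clamp to the valid sample range.
        mixed.append(max(-1.0, min(1.0, a + new_gain * b)))
    return mixed

vocal = [0.2, 0.4, -0.3]
orchestra = [0.5, 0.5, 0.5, 0.5]
print([round(x, 2) for x in overdub(vocal, orchestra)])  # [0.45, 0.65, -0.05, 0.25]
```

The gain on the new track models the mixing control a recording engineer would use to balance the two performances.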
Magnetic recording: In 1898, the Danish inventor Valdemar Poulsen invented a
magnetic recorder called a telegraphone to record telephone messages. Electrical
amplifiers developed in the 1920s were capable of amplifying an electrical signal. These
amplifiers enabled the evolution of the telegraphone into a wire recorder. It was more
difficult to reproduce recordings from wire recorders than from phonographic
cylinders. Therefore, people mainly used wire recorders for business voice
recordings. In the early 1930s, the industry began to use steel bands as the recording
medium. These bands improved the sound quality.
In 1928, Fritz Pfleumer, a German engineer, invented magnetic tape by coating
cellulose acetate, a plastic-like material used for photographic film, with iron-oxide
powder. In 1935, the German electronics company AEG produced the first tape
recorder called the Magnetophone. After World War II, the Americans discovered this
technology, modified and improved it, and by the late 1940s, commercial tape
recorders became available.
Magnetic tapes greatly enhanced sound quality. They also allowed you to record, re-
record, and erase sound from the same magnetic tape several times, without loss of
quality.

Magnetic tapes to digital recording: With the introduction of magnetic tapes, people
began to come up with several innovations in audio production. Engineers found it
easier to edit the tapes, because it was easier to cut and splice the tape. They cut out
flawed parts of recordings and re-recorded only those parts. Multitrack recording
became possible. It involved recording two or more simultaneous tracks on a single
tape. Multitrack recording also made it possible to record and reproduce stereophonic
sound. In 1963, Philips introduced the compact cassette tape. By the 1970s, it replaced
the vinyl LP (long playing) discs as the preferred means of recording and playing back
music. In 1980, Sony introduced the palm-sized cassette player, which it called
the Walkman.
In the early 1980s, digital sound recording and compact discs emerged. Digital
recorders gained popularity because they made it easy and relatively inexpensive to
record, edit, and mix songs. Compact discs provided better-quality sound. Therefore,
they gradually gained popularity with consumers and soon displaced
cassette tapes. Today, software on computer systems enables digital recording and
sharing of audio content through the Internet.

Evolution of Video Production


Early inventions: Technically, the term video refers to the recording and display of
moving images in an electronic format. However, video production has its origins in the
moving pictures (movies) recorded on film. Film, in turn, was an outcome of efforts in
the nineteenth century to simulate motion. The first device to simulate motion was
the Fantascope, invented around 1832. It depicted successive drawings of different
phases of an object in motion on a spinning disk or inside a rotating drum. The flip
book was a collection of pictures bound together. A person flipped through this book rapidly to
get the illusion of movement. As the art of photography developed, people replaced
the drawings in flip books with photographs.
Critical developments: True motion pictures only became possible once cameras
evolved. Cameras reduced the amount of time (the exposure) required to capture an
image. By the 1870s, exposure times had dropped to one-hundredth of a second from
the hour or so necessary for the early photographic processes. Another important
development was the idea of using celluloid, a flexible and unbreakable material,
instead of glass plates for photographic films.
In an effort to develop a camera that could capture motion, Thomas Edison and his
assistant William Dickson invented the Kinetograph in 1891. It took pictures in quick
succession (40 photographs in one second). You could play back the pictures at a speed
that created an illusion of motion. Edison also designed a viewing instrument
called kinetoscope. It was a cabinet with an eyepiece. Inside the cabinet, a motor ran the
film between a lamp and a lens. A person could peep through the eyepiece and view
the motion picture. In 1894, the Holland brothers opened the first Kinetoscope Parlor
in New York City. Soon, kinetoscope parlors opened in major cities across the United
States.
The first films: A kinetoscope exhibition in Paris inspired the Lumiere brothers to
invent the first projector. They called it the Cinematographe. It was a camera, printer,
and projector in one portable device. The Lumieres presented the world’s first
commercial and public exhibition of cinema in 1895. There were 20 short films, of
around 40 seconds each, that depicted everyday activity. They included scenes of
workers leaving a factory, a baby being fed, and bathers in the sea. In 1896, Edison
introduced the Vitascope projector. It was the first commercially successful movie
projector in the United States.
The talkies: Until 1927, films did not have a synchronized recorded sound because of
the technical difficulties in incorporating sound. In order to narrate the story, directors
often used intertitles. These were texts on slides that presented important dialogues or
explained the action to the audience. Silent films also featured live music to build up
the atmosphere and give the audience emotional cues.
The early sound films used the sound-on-disc technology, which consisted of
synchronizing a phonograph to play along with the projection of a film. Warner
Brothers developed Vitaphone, an example of sound-on-disc technology. The film Don
Juan appeared in 1926 and consisted of music and some sound effects but no spoken
dialogue. In 1927, The Jazz Singer was released. Many considered this film to be the first
sound film that included recorded dialogue. The next technology to emerge was sound-
on-film (also called optical sound). In this technology, the technicians physically
recorded sound directly on the photographic film.
In the 1970s, the industry introduced digital technology into sound. This technology
allowed greater convenience, easier manipulation, storage, and transmission of an
audio signal.

Evolution of Broadcast Media


Broadcasting is the process of distributing audio-video content electronically through
the air to several recipients simultaneously. Here is a look at the evolution of this
broadcast medium, which is a form of mass communication usually associated with the
radio and television.
Radio: Radio, the first broadcast medium, has its origins in the invention of the electric
telegraph. The telegraph made it possible to send coded electrical signals over long
distances. In 1895, Guglielmo Marconi invented wireless telegraphy. It broadly
transmitted a signal through the air so that any receiver within range (two miles in this
instance) could capture it. In 1900, Reginald Fessenden transmitted human speech
wirelessly over a distance of one mile using airwaves. Marconi’s invention was called a
radio telegraph because the signal moved radially from the transmission point.
Radiotelegraphy initially developed as a means for transatlantic ships to
communicate while at sea. Soon, amateur radio operators began building radio sets
and broadcasting news and music.

After World War I ended in 1918, several companies explored the idea of
manufacturing radio receivers for domestic use. In 1920, the Westinghouse Electric
Company of Pittsburgh established the first commercially owned radio station. People
knew it by the call letters KDKA. It offered a schedule of entertainment programs such
as recorded music. Soon, other companies started broadcasting as well. These
corporations broadcast programs free of charge. Their intention was to make money
through the manufacture and sale of radios to those who wanted to listen to these
programs. The strategy was successful. According to the National Association of
Broadcasters, the number of households with radios increased from 60,000 in 1922 to
10 million in 1929.

The broadcasters realized that the growth in radio purchases could not go on
indefinitely. Thus, they began to consider the idea of selling advertising time on
radio. By the late 1920s, commercial advertising paid for radio broadcasting. However,
there were also a few instances of non-commercial broadcasting. State universities in
the agricultural Midwest used the radio as a tool to broadcast educational programs,
with funds provided by the state legislature. Despite the Great Depression of the 1930s
that crippled the US economy, commercial radio broadcasting continued to
grow. During World War II, news broadcast over the radio helped people keep abreast
of developments on the war front.

A radio wave is an electromagnetic wave with a relatively large wavelength. A radio
station announces its broadcast frequency so that listeners can tune in from
different locations. The earliest radio broadcasts were of the AM (amplitude modulation)
type. In the 1930s, broadcasters introduced FM-type (frequency modulation)
broadcasts. AM broadcasts are subject to distortion from lightning
and other electromagnetic interference. FM broadcasts offer better sound quality and
stereo broadcasts.
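A station’s broadcast frequency and its wavelength are tied together by the speed of light: wavelength equals the speed of light divided by the frequency. A small illustrative sketch of that arithmetic (the function name and example frequencies are assumptions for illustration, not from the lesson):

```python
# Wavelength of a radio broadcast: wavelength = speed_of_light / frequency.

SPEED_OF_LIGHT = 299_792_458  # meters per second

def wavelength_m(frequency_hz):
    """Return the wavelength in meters for a given broadcast frequency in hertz."""
    return SPEED_OF_LIGHT / frequency_hz

# A typical FM station near 100 MHz has a wavelength of about 3 meters;
# an AM station near 1000 kHz has a wavelength of about 300 meters.
print(round(wavelength_m(100e6), 1))   # 3.0
print(round(wavelength_m(1000e3), 1))  # 299.8
```

This is why AM waves, hundreds of meters long, behave so differently from the few-meter FM waves.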

HD (high definition) radio is a new form that broadcasts a digital radio
signal. Sometimes, stations broadcast a mix of both digital and analog
signals. Nowadays, satellite radio is also a digital signal. Broadcasters beam the signal
to a satellite. The satellite sends the signal back to Earth, giving it a wider broadcast
range. Internet radio is the latest form of radio. People transmit audio files over the
Internet so that anyone can receive the transmission through their
computer. Technically, however, this is not radio, as no signal is sent via
electromagnetic waves. Therefore, people refer to it as simulated radio.
Television: Broadcasting images began as a mechanical technology with the Nipkow
disc designed by Paul Nipkow in 1884. The technology involved sending images over a
wire by rotating a perforated disc in front of a brightly illuminated image. Other
pioneers of the mechanical technology were the American Charles Jenkins and the
Scotsman John Logie Baird. In 1907, the English inventor A.A. Campbell-Swinton and the
Russian inventor Boris Rosing independently developed an electronic technology for
television using a cathode ray tube. In 1923, Vladimir Zworykin invented the iconoscope. The
iconoscope was a tube to capture television images. It broke down pictures
electronically into thousands of separate elements. Eventually, scientists and inventors
favored and developed this electronic system of television.
In 1927, technicians from Bell Laboratories held a long-distance transmission of a live
picture and voice simultaneously. It was the first public demonstration of television. In
1939, at the New York World’s Fair, attendees witnessed a live broadcast of President
Roosevelt’s welcome speech through a network of televisions set up at the fair ground.

In 1946, Peter Goldmark devised a mechanical color television system by rotating a
red-blue-green wheel in front of a cathode ray tube. At the New York World’s Fair in
1964, attendees saw themselves on color television as they were taped live.

In the 1940s, cable television systems originated with the invention of the coaxial
cable. A cable television system consists of a large community antenna to receive
broadcast signals. Coaxial cables then transmit these signals to individual homes. The
system developed as a solution to improve television broadcast reception in remote
and hilly areas. In large metropolitan areas, television reception began to degrade
because of the reflection of the signals from tall buildings. Thus, in the 1960s,
broadcasters introduced the system in these areas.
In 1998, television broadcasting began to transition from analog to digital. In
analog transmission, information travels as a continuously varying electric signal. In
digital transmission, information is sent in the two states of a binary format (0 and 1).
Digital television enables much clearer images by allowing the broadcast of
high-definition pictures. The next step in television broadcasting, 4K UHDTV
(ultra-high-definition television, with roughly 4,000 horizontal pixels), provides
better resolution and significantly improves image clarity.
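To make the "two states of binary format" concrete, here is a hedged sketch (my own illustration, not from the lesson) of how an analog level can be quantized into an 8-bit binary value, the kind of conversion every digital transmission performs:

```python
# Illustrative sketch: quantize an analog level in [0.0, 1.0] into an
# 8-bit binary string, the two-state (0 and 1) format digital TV uses.

def to_binary_sample(level, bits=8):
    """Quantize an analog level in [0.0, 1.0] to a binary bit string."""
    steps = (1 << bits) - 1          # 255 discrete levels for 8 bits
    value = round(max(0.0, min(1.0, level)) * steps)
    return format(value, f"0{bits}b")

print(to_binary_sample(0.0))  # 00000000
print(to_binary_sample(1.0))  # 11111111
print(to_binary_sample(0.5))  # 10000000
```

Real digital television adds compression and error correction on top, but the underlying representation is exactly this kind of bit pattern.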
Corporations used to broadcast radio entertainment programs free of charge. Initially,
the sale of radios generated income for the corporations. Later, corporations decided
to generate income by selling advertising time.
Stages of Audio-Visual Production


The various phases involved in the production of a media product are called the stages
of production. The term applies to all forms of media—audio, video, and film. There are
three basic stages of production. The details of these stages vary according to the type
of media that you are producing.
Preproduction: This stage comes before the actual shooting or recording
commences. For small productions, it includes activities such as meeting with the client,
research, set design, and location scouting. Rehearsals are an important part of the
preproduction stage in music production. Other processes specific to film production
include production scheduling, set construction, workshops, and script readings with
the cast and director.
Production: Production refers to the stage when the actual recording or filming
happens. The production phase is termed the point of no return when feature films are
involved. Once the production phase begins, it is no longer financially feasible to cancel
the project. In film production, it is important that the director captures all the
necessary shots. In music production, recording the basic track comes under the
production phase.
Postproduction: The final major phase in the production process is postproduction. One of the
major tasks in postproduction is editing. In the film industry, it sometimes becomes necessary to reshoot
a scene if the director notices errors or an unsatisfactory performance. Media professionals refer to a
shot captured in such a reshoot as a pickup shot. Today, films increasingly rely on computer
enhancements. Therefore, the postproduction phase lasts longer than the actual production
phase. Typical activities of postproduction in films include adding graphics, sound, and special
effects. Postproduction is an important stage in which technicians correct color and exposure. In music,
postproduction includes overdubbing, editing, and music mixing. Marketing and media strategy are
other important aspects of postproduction.

Impact of Changing Technology


Changing technology is the driving force behind the advances in the audio-video
production industry. Here are some specific instances where technology has impacted
audio-video production.
Audio production: When Edison created the phonograph, the lack of an amplifying
device limited the loudness of the playback that he could achieve. In 1906, Lee de Forest
invented the Audion valve. The Audion valve is an electronic valve that greatly amplifies
weak electric signals. This small electronic device enabled sound recording to move
from a purely mechanical process to an electrical process. Eventually, the sound
recording moved to a magnetic process.
With the introduction of magnetic recording technology, audio editing and features
such as stereophonic sound became possible. With digital technology, people can edit
audio files cheaply and quickly. Today, music makers do not require an ensemble of
instruments, huge mixing consoles, equalizers, and rolls of expensive tape. All you
need is a computer with the right software to record and make your own music.
Film production: Improvements in technology enabled films to transition from a silent
medium to one that combined sound with visuals. The synchronized combination of
sound and visuals replaced the background music recordings and interruptive
intertitles. The introduction of color provided another significant improvement in film
production. Earlier methods of coloring films included the tedious method of hand
painting each frame. Later, as technology to print films evolved, filmmakers used
specialized cameras and film-printing techniques to produce color films.
Using digital cameras to make films has enhanced the editing capabilities of
filmmakers. It also enables audiences to enjoy high-definition clarity in films. The use of
computers allows filmmakers to create special effects, too. These effects surpass what
physical sets and props alone can achieve. 3D technology has now enhanced the
audience’s viewing experience.

Television production: The first major impact of technology on television was Vladimir
Zworykin’s use of the cathode ray tube in television receivers. The focus of
television development shifted from a mechanical system to an electronic
system. Development of satellite technology in 1957 eventually led to the creation
of Telstar 1, the world’s first communications satellite. The first live global
broadcast occurred on July 23, 1962. Throughout Europe and North America, people
watched live images transmitted from across an ocean via satellite.
Early televisions that incorporated a cathode ray tube were bulky. However, with
advancement in technology, today, we have sleeker television sets. These television
sets were possible because of the development of LED and plasma technology. Apart
from a more aesthetic design, these television sets permit viewers to watch extremely
sharp and clear images.
Basic Camera Use
In this lesson, you’ll describe various types of video cameras and their parts. You’ll
compare and contrast professional and consumer cameras. You'll also learn to operate
various pieces of video equipment, including video cameras and tripods.

Types of Video Cameras


The cameras that you see today weren’t as advanced to begin with. The camera, as a
device, has had an eventful history. Let’s take a look.

French inventor Joseph Nicéphore Niépce began experimenting with the camera
obscura around 1814. The first photo he took with it required about eight hours of
exposure. The term camera obscura means a dark room or chamber in Latin, referring
to the darkened chamber into which the image was projected.
In 1835, an English inventor, William Talbot, designed the mousetrap camera. In later
years, he also invented many photochemical methods; these methods helped improve
the process of making prints of photographs. In 1839, French artist and chemist Louis
Jacques-Mandé Daguerre invented the Daguerreotype camera. Frederick Scott Archer
developed the new Collodion process in 1851. With this process, cameras only needed
a few seconds of light exposure to make a picture.
In 1867, William Lincoln patented the zoetrope, or the wheel of life. It was a
spinning drum with slits through which you could see moving drawings or
pictures. In 1948, the Polaroid camera marked a huge leap: it allowed people to take a
photo and see it immediately, right from the camera.

The French inventor, Léon Bouly, first invented his cinematograph, a motion picture
film camera, in 1892. In 1895, the Lumiere brothers recorded and projected motion
pictures using their Cinématographe Lumière. The Lumière camera was a three-in-
one. That is, it was a portable motion picture camera, a film-processing unit, and a
projector.

Over the centuries and decades, the camera has evolved. It offers greater flexibility and
functionality. Today, various types of cameras are available for different uses. Still
cameras capture still pictures. Digital cameras capture both still images and
videos. Video cameras capture electronic motion pictures. Based on their functions,
video cameras can be camcorders or professional video cameras. Professional video
cameras are cameras that videographers operate manually. Mainly, professionals use
these cameras to produce films and television programs. Camcorders consist of a
camera and VCR on which users record home videos.

Closed Circuit Television (CCTV) cameras are another type of video camera. These can
pan, tilt, and zoom. They can also function as security, surveillance, and monitoring
devices.

Let’s look at the main subtypes of professional video cameras and camcorders.

Professional Video Cameras

Professional cameras come with manual settings. They offer the user a lot of control
over the image quality. Let’s look at a few important types of professional video
cameras.

Studio cameras: Operators use studio cameras within television production
studios. Such cameras have no recording device. They send signals directly to the
production control room. Thus, operators can use the signals in collaboration with
other cameras. Operators mount studio cameras on large portable mechanical stands,
which have controls to adjust the zoom and focus.
Prosumer cameras: These cameras are a combination of a professional camera and a
consumer camera at a consumer price. They are full of features meant for serious
users. These cameras have better lenses, provide high-quality resolution, and are
compact. They also provide the user complete control over the photo shooting process.
ENG and EFP cameras: Electronic news gathering, or ENG, cameras are an entire
studio in a camera. These cameras can record on tape or digitally. They receive
multiple sound inputs through XLR cables. They have interchangeable lenses and
manual controls. These cameras are large and heavy. So, you need to mount them on a
tripod or a shoulder harness when you use them. An electronic field production, or
EFP, camera is similar to an ENG but operators use them for shooting in the field.
DSLR cameras: Digital single-lens reflex, or DSLR, cameras used to offer only still
photography, but now they also allow video shooting. With the introduction of HD
video, these cameras can create high-quality videos. They are extremely versatile. They
provide great shutter speed, image quality, and manual control. They are slightly bulky
but allow the user to see exactly what the camera can see through the viewfinder. With
a DSLR, you can capture images of still and moving objects—from slumbering puppies
to racing cars. People view them as the modern-day replacement of SLR cameras, with
added video capability.
Camcorders

A camcorder (camera recorder) is a portable device that can record live action and
sound. It is an ideal device to capture home videos. It consists of three important parts:
a lens that captures light; an imager that converts light into electric signals; and a
recorder that converts the electric signals into digital form for recording.

Let’s look at some important types of camcorders based on recording format.

MiniDV camcorders: These camcorders record video and audio on a MiniDV tape and
offer about 80 minutes of digital video. In high-resolution mode, they can record 60
minutes of footage. Using a Firewire cable, you can transfer the footage from your
MiniDV camcorder directly to a computer. These camcorders are lightweight, loaded
with features, and give excellent results; they’re particularly popular with documentary
filmmakers.
DVD camcorders: These camcorders use DVD for recording high-definition footage in
the AVCHD format. The footage can be ready for instant viewing if your DVD player can
play both mini DVD-R and mini DVD-RW discs.
Hard disk drive (HDD) camcorders: This is a new and popular type of camera. These
cameras have a built-in hard drive for recording. They’re extremely convenient for
editing and save you the need to buy and change tapes. They provide the highest-
quality videos and are more convenient than conventional camcorders that use a tape
or an optical disc. They’re smaller and lighter than MiniDV and HDV camcorders and
therefore easy to hold.
Flash memory camcorders: These camcorders use the same memory cards as digital
cameras. They record footage either to flash memory built into the camera or directly
to a removable flash memory card. They’re less likely to get damaged, but they’re more
expensive than hard drive cameras, and hard drives provide better-quality footage
than flash drives.
Combo models: The combo model offers the best of both worlds. You can use either a
hard drive or a flash drive for recording. These cameras provide excellent image quality
and flexible recording space. But they may cost a little more than the other types of
cameras.
Features of a Camera

Most consumer cameras can do everything automatically. Professional cameras offer
manual settings for various operations. The zoom function makes the subject appear
closer or farther away. The auto focus function is for amateur
photographers. Professional cameras have a manual focus ring at the front of the lens
to provide sharper images.
White balance (color balance) tells the camera what each color should look
like. Professionals adjust the white balance manually for different light situations.
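As a simplified sketch of what white balance does (an assumption for illustration, not the camera's actual algorithm): each color channel is scaled so that a reference patch the operator declares "white" comes out neutral.

```python
# Illustrative sketch of white balancing: scale each (r, g, b) channel so
# that a chosen reference white maps to equal channel values.

def white_balance(pixel, reference_white):
    """Scale an (r, g, b) pixel so reference_white becomes neutral gray/white."""
    target = max(reference_white)
    return tuple(
        min(255, round(channel * target / ref))
        for channel, ref in zip(pixel, reference_white)
    )

# Under warm indoor light, a white card might read (255, 230, 200).
print(white_balance((255, 230, 200), (255, 230, 200)))  # (255, 255, 255)
print(white_balance((200, 180, 160), (255, 230, 200)))  # (200, 200, 204)
```

Real cameras estimate the reference white from the scene or from a preset (daylight, tungsten, and so on), but the per-channel scaling idea is the same.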
Most consumer cameras have a built-in microphone. Professional cameras use
separate microphones that record better sound than the built-in ones used for home
videos. Thus, they provide better audio effects.
You can record data on tapes or go tapeless. Tapeless recording allows much higher bit
rates and higher-quality video recording. It also provides interesting camera
effects. One is over-cranking (or slow motion). Here, time appears to be moving slowly
in a video. Another option is time lapse. Here, time appears to be moving faster and
thus lapsing in a video.
Another notable feature of professional cameras is the ability to interchange
lenses. This feature provides image capture with full manual control.

In a multicam studio situation, professionals prefer cameras with timecode and
genlock. Timecode is a signal recorded with your video that identifies every frame of
your tape. It marks a time stamp in hours, minutes, seconds, and frames.
Genlock is a technique that synchronizes two different audio and video signals. This
allows you to mix images and graphics. You must have genlock and timecode for
a multi-camera shoot. This keeps the timecode in perfect sync for all the cameras while
recording.
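The frame-stamping idea behind timecode can be sketched in a few lines: a running frame count converts directly into hours, minutes, seconds, and frames. This is a minimal illustration assuming a 30 frames-per-second, non-drop-frame count (the function name and rate are assumptions; real NTSC timecode runs at 29.97 fps and uses drop-frame rules):

```python
# Illustrative sketch: convert a running frame number into an
# HH:MM:SS:FF timecode stamp, assuming 30 fps, non-drop-frame.

def to_timecode(frame_number, fps=30):
    """Return an HH:MM:SS:FF string for the given frame number."""
    frames = frame_number % fps
    total_seconds = frame_number // fps
    seconds = total_seconds % 60
    minutes = (total_seconds // 60) % 60
    hours = total_seconds // 3600
    return f"{hours:02d}:{minutes:02d}:{seconds:02d}:{frames:02d}"

print(to_timecode(0))       # 00:00:00:00
print(to_timecode(1799))    # 00:00:59:29  (last frame of the first minute)
print(to_timecode(108000))  # 01:00:00:00  (one hour at 30 fps)
```

Because every camera in a genlocked multicam shoot shares this same counting scheme, editors can line up footage from all cameras frame-for-frame.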

Parts of the Camera


Cameras consist of three major parts: the image sensor, the lens system, and a
recording device. Let’s look at them in detail.

Image Sensor

An image sensor is a special type of light-sensitive silicon chip. It helps capture the
perfect shot. The charge-coupled device (CCD) and the complementary metal oxide
semiconductor (CMOS) are the two major types of image sensors. CMOS sensors have
been replacing CCDs in newer camera models. When you take a picture, the image
sensor converts the light falling on it into electrical signals; the camera then converts
these into digital signals and transfers them to an onboard computer for
processing. The camera then calculates and stores the final image on the memory card.
Lens System

The lens contains several round glass elements that collect light and direct it to the
camera’s pickup chip. It has a zoom ring to change magnification. It also has a focus
ring to adjust focus. Lenses help create dynamic shots and emphasize depth of field.

Iris: Camera design is similar to the human eye structure. An important camera lens
part is the iris. It is an adjustable aperture, which controls the amount of light coming
through the lens. The more the iris opens, the more light the camera allows into
it. More light makes the scene brighter.
Shutter: A shutter is also a part of the lens. It controls the length of time that light
enters the camera. It may be located in the lens (a leaf shutter) or right in front of the
film (a focal-plane shutter). Shutter speed, defined as the amount of time the shutter
stays open, affects the quality of the image. The ideal shutter time is 1/60th of a second
or faster. At slower speeds, camera shake can blur the image.
You can choose from two basic types of lenses. One type of lens is built into the
camcorder. The other type consists of attachable accessory lenses. You can buy
accessory lenses to get certain effects while shooting. Point-and-shoot cameras have
a built-in lens system. They don’t give you the freedom to be creative with your
lens. DSLRs and professional cameras allow you to use various lenses for perfect shots.
Lens accessories also help protect lenses from external damage.
You can add various lens accessories to your lenses. A polarizer helps cut down
glare. Wide-angle and zoom-through adapters increase your angle of view and help fit
everything you want into the frame, while a tele-extender magnifies distant
subjects. Lens filters are clear pieces of glass that protect the lens from scratches and
block harmful UV radiation.

Viewfinder

The viewfinder is a small video monitor attached to the camera. It allows the camera
operator to view the images in the shot. Some viewfinders display zebra stripes on a
brightly lit object. 
You can use the viewfinder to set the brightness and contrast using the following steps:

1. Switch the camera to color bars.


2. Adjust the viewfinder brightness and contrast until you see a smooth grayscale
from peak white to black. You should be able to see a dividing line between each
bar.
3. Switch the camera to picture.
4. Check your exposure on a reliable monitor by connecting a cable from the
camera output or by doing a test record.
Battery Pack

The battery pack is part of the recording section. It is usually at the rear end, directly
opposite the lens. Without a battery, the camera would not function. Different camera
models and formats use different types of battery packs. Most batteries are
rechargeable.

Recording Device

Recording on a camcorder is similar to recording on a VCR. All camcorders come with a
built-in microphone, which allows you to record the sound and the image
together. Professional video making requires a separate microphone. A telescopic
microphone works best to record sound and images separately.
Professionals always keep the broadcast format in mind while recording. The National
Television System Committee (NTSC) standard is the broadcast format for the United
States. PAL stands for Phase Alternating Line, which is the standard broadcast format
in Europe, Australia, and Asia. If you try to play a movie from Europe in a DVD player in
America, the picture will be hazy and the sound quality will be poor. This is due to
electrical differences between the two broadcast standards.
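The practical difference between the two standards is numeric: NTSC runs at about 29.97 frames per second, PAL at exactly 25. The short sketch below (illustrative only; the figures are the well-known nominal values) shows how the frame rate changes playing time:

```python
# Nominal frame rates and scan-line counts for the two analog broadcast
# standards discussed above.
STANDARDS = {
    "NTSC": {"fps": 30000 / 1001, "lines": 525},  # ~29.97 frames per second
    "PAL":  {"fps": 25.0,         "lines": 625},
}

def duration_seconds(frames, standard):
    """Playing time of `frames` frames in the given standard."""
    return frames / STANDARDS[standard]["fps"]

# The same 1500 frames play faster on NTSC than on PAL:
print(round(duration_seconds(1500, "NTSC"), 2))  # 50.05
print(round(duration_seconds(1500, "PAL"), 2))   # 60.0
```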
Camera Recording Techniques

Camera angles, movement, and lighting help videographers capture amazing
shots. Let’s look at various camera techniques.

Pre roll: While recording a scene, your camera may take a few seconds to reach full
recording speed. Those few seconds of shooting may be vital to your scene, and you
may lose the initial action. Pre roll footage is what you shoot before your scene
begins; it allows you to capture the complete action.
End roll: With this technique, you continue to record after the scene has ended so that
the shot doesn’t end abruptly. This technique is important in the editing and
postproduction process. It allows the editors to cut your scene without cutting out any
action.
Pan: With this technique, you move the camera horizontally left or right. You can use
pan shots to follow a subject or show distance between two objects. Pan shots work
great for panoramic views; for example, a shot that sweeps from one corner of a large
hall to the other while the character moves.
Tilt: Moving the camera up or down without raising its position helps show the top and
bottom of a static object. With a tilt, you can also show how high something is. For
example, a slow tilt from a large tree’s roots upward, to the top of the tree, would show
its grandness and enormity.

Cameras Used in the Industry


Now let’s take a peek at how professionals in the audio-video industry use
cameras. Let’s first understand the Five Cs a camera must fulfill for great shots.
 Camera angles: The angle of the camera relates to the Point of Interest (POI). It
is the location or character the camera focuses on. The angle of the camera
helps establish the viewer’s emotional relationship with the POI.
 Continuity: This refers to building sequences that flow smoothly between shots.
 Cutting: Cutting means showing multiple views of the same action.
 Close-ups: This is a closer view of the object or person in the scene.
 Composition: Composition is the display of position, arrangement, and view of
the objects within the frame.

Now, let’s take a look at some popular types of cameras that professionals use for
recordings in studios and outdoor locations.

A studio or indoor camera is a television camera on a tripod, a three-legged stand. You
may also use a studio pedestal, which is huge and heavy, for exclusive use within the
studio. A studio camera comes with a camera control unit (CCU) that controls the signals
the camera sends to the control room. Studio cameras don’t need battery packs; they
receive their power directly through cables.
A professional camcorder is lightweight and portable. But it is not as small as a
consumer camcorder. It is a combination of a television camera and a recorder. Users
can easily carry it in the field, hold it on their shoulder, or place it on a tripod.
A convertible camera is a camera with a variety of accessory packages that users can use
both in the field and in the studio. It can adjust to diverse situations and is less
expensive than a studio camera.

Mounting the Camera

There are two ways to support a camera when shooting a scene: handheld shooting
and tripod shooting.
Handheld shooting: The camera operator holds the camera in the hands. The camera
sits on the operator’s right shoulder, and the operator’s left hand controls the
lens. Operators typically hold ENG cameras in this way. It gets tiring for the operator to
hold the camera for many hours. There are many handheld shots in the news. But the
image cuts from one shot to another as the operator moves in the field. The images
can also be jerky or shaky.
To avoid shaky images and get steady shots without using a tripod, operators use a
Glidecam. It is a body mount device. It attaches to a harness that the operator
wears. The harness takes the weight of the camera. The camera attaches to the
harness in the same way a camera attaches to a tripod. A spring-loaded and shock-
absorbing arm also attaches to the harness. The harness keeps the shot steady even
when the operator moves or climbs stairs.

Shooting with a Tripod

A tripod is a useful accessory for making the camera stable when shooting photos and
videos. The tripod is a three-legged stand that a camera attaches to. It has adjustable
legs that let the operator position the camera at varying heights.
Tripods come with a mounting head to couple with the camera. The head has several
handles and knobs to allow the operator to pan and tilt the camera. It is impossible to
use heavy studio cameras without placing them on a tripod. A tripod is very handy
when shooting for long hours in the field.

The tripod is essentially the legs of the camera. Early cameras were heavy and needed
the tripod even more than today. The tripod’s main function is to eliminate shakes and
vibrations that come from holding the camera in hand. A properly set tripod resists
wobble and stays balanced even on a hill slope.

Functions of a tripod

Let’s look at the important functions of a tripod.

 A tripod keeps the camera steady in a precise position.
 It prevents the picture from turning out blurred in low-light and low-shutter-speed
conditions.
 It increases the depth of field for sharper images.
 It allows the photographer to take a series of photos at different angles to
produce a digital panorama.
 It holds the camera still for a series of photos at different exposures.
 It helps in taking time-lapse photos to produce animation.
 It allows the photographer to set the right composition well in advance of an
event such as a sporting event.

Safety Rules
Professionals in the audio-video industry work with potentially hazardous
materials. Even a harmless object can pose a safety hazard if you ignore safety rules
and precautions.

Common Hazards

Let’s look at some common hazards in the film and video industries.

Fire hazards: Materials such as solvents, oily rags, and chemical oxidizers are potential
fire hazards. Solvents vaporize quickly. They combine with air to form an ignitable
mixture, which catches fire when a flame is present. Handle ignition sources (stoves
and matches) safely and store flammable materials far away from the studio.
Physical hazards: Intense vibration, noise, and some forms of light (ultraviolet and
infrared) are potentially dangerous for artists. Intense vibration that you feel when
operating a drill, for example, may stress muscles in the body. Excessive noise can
cause hearing loss and intense ringing in the ears. Ultraviolet light, which artists
use in ultraviolet photography, can cause cancer. Take care to avoid direct radiation,
and wear protection.
Chemical hazards: Many chemicals are toxic to humans. Humans can feel the effects
of a toxin immediately (acute) or after a long time (chronic). Some of the effects of
toxins are nausea, lightheadedness, and organ damage. Chemicals such as toluene (in
paint removers) can cause liver and kidney damage. Wear protective masks and gloves
when handling chemicals to avoid inhaling or accidentally ingesting them.

Safety Procedures

Let’s look at some safety procedures that you can follow to protect yourself from
potential hazards.

Ventilation: A studio should have adequate ventilation to dissipate fumes. If needed,
wear respirators to avoid inhaling harmful gases. Chemical safety cabinets help
prevent the spread of chemical fumes.
Personal protective equipment: Personal safety equipment will protect you from
harm while working. One of the most common pieces of safety equipment is the
glove. Rubber gloves will protect you from many chemicals; leather gloves will protect
you from heat. Use safety glasses to protect your eyes from flying material, chemical
splashes, and radiation. Use aprons and kneepads where required.
Disposal of waste: You must appropriately dispose of all waste that you
generate. Each institution and government has hazardous waste disposal systems and
rules.
Elements of Audio Production

In this lesson, you’ll identify various types of microphones and their pickup
patterns. You’ll determine the optimal placement of microphones. You’ll also describe
various audio formats, audio cables and connectors, and their industry applications.

Microphones and Their Optimal Placement

Sounds are vibrations that travel through air as sound waves, causing fluctuations in
atmospheric pressure. A microphone is a transducer that converts acoustical energy
into electrical energy. When sound is in its electrical form, it can be amplified, mixed,
and recorded. Every microphone contains a housing, a diaphragm, a magnetic field,
and a moving part within that field. Microphones have specific characteristics and cater
to different needs. There are three main types: dynamic, ribbon, and condenser.
Dynamic Microphones: These use electromagnetic induction to generate an output
signal voltage. They are durable, moisture resistant, and relatively inexpensive. Often
used for stage performances, these microphones are an ideal choice when working
with a relatively loud sound source that lacks high-frequency detail, such as drums
played on stage.
Advantages: They are simple to construct and easy to repair. They do not require
external power to operate as the movement of air generates energy. They come in
several sizes and can function as speakers.
Disadvantages: Their major drawback is their inability to capture multiple sounds that
play simultaneously. Moreover, their output current is very low and requires a lot of
amplification.
Ribbon Microphones: Ribbon microphones are one of the oldest and most fragile
types of microphones. They are bidirectional, which means that they can pick up
sounds from both sides of the microphone. They are capable of enhancing higher
frequencies of sound. They give the best output when placed close to the source of
sound. They tend to capture high frequency sounds better than the moving coil designs
of dynamic microphones. This is due to decreased mass as compared to a moving
coil. However, they are considerably more fragile than moving coil types. Hence, they
should not be exposed to wind blasts. Their output levels are significantly lower than
those of the moving coil designs.
Advantages: A ribbon microphone has excellent sonic characteristics. It offers warmth
to the recording sound. Moreover, the ribbon diaphragm has a unique way of
capturing sounds. It is an ideal option for capturing strings, brass, and percussion.
Disadvantages: Ribbon microphones are not very durable, and the ribbons tend to
break easily. They are not suitable for recording outdoor sounds as they lose sound
quality due to wind noise.

Condenser Microphones: A condenser stores energy in the form of an electrostatic
field. Condenser microphones get power from a battery or other external source. They
are more sensitive and responsive than dynamic microphones. Hence, they are best
suited for capturing subtle nuances in a sound. The resulting sound is natural, clean,
and clear, with excellent detail. They have inherently lower handling and mechanical
noise than dynamic microphones. They are smaller and lighter than dynamic
microphones. This makes them the logical choice for tie-clips, headsets, shotguns, and
other miniature microphones.
Advantages: Every condenser microphone model has its own distinctive
characteristics. Small-diaphragm condenser microphones capture sounds most
accurately. Large-diaphragm condensers make vocals warm, detailed, and full-range,
with full bass and a full-throated vocal quality.
Disadvantages: The condenser microphones are more sensitive than the dynamic or
ribbon microphones. In addition, these microphones are expensive and require
external power sources. They are not moisture-resistant.

Omnidirectional Microphone Pattern

Directionality is the sensitivity of a microphone to sounds from certain
directions. Microphones are designed to pick up sound from specific directions. Some
microphones pick up sounds from all directions, while others catch sounds only from
one particular direction. Let’s take a look at the various types of directional patterns
found in a microphone.

The word “omni” means “all.” An omnidirectional microphone can pick up sounds
equally from every direction. This type of microphone is the ideal choice for an on-
location TV interview. The interviewer need not hold the microphone in a particular
position. Hence, people standing in any direction can talk into the
microphone. However, an omnidirectional microphone is not a good choice for music
concerts because it does not offer any special proximity effect. It even picks up
unwanted background noise.
Bidirectional Microphone Pattern

Also called the “figure eight pattern,” the bidirectional pattern is most commonly
associated with ribbon microphones. This pattern captures sounds from two opposite
directions. It is an ideal choice for recording a conversation between two people sitting
opposite each other. This pattern does not offer any special proximity effect. Hence,
it is not a viable option for studio recording.

Unidirectional or Cardioid Microphone Pattern

Microphones with this pattern capture sound mostly from the front. They have a heart-
shaped pattern. Microphones with this kind of pattern do not record unwanted
background noise. Hence, such microphones are ideal for studio recordings. They are
good if you wish to capture clear voices as they have a proximity effect. They easily
capture sounds that are close to the microphone.

Supercardioid and Hypercardioid Microphone Patterns

These two types of microphone patterns may seem similar, but they have a slight
difference. The supercardioid microphone has lower directionality and a smaller rear
lobe as compared to the hypercardioid pattern. The hypercardioid pattern has higher
directionality and rejection of sounds from the sides than the supercardioids. Its
primary sensitivity lies in the front of the microphone.

Specialty Patterns

The shotgun microphone, which is hypercardioid, has a very narrow pattern. However,
it can capture sounds even from a distance of three feet. It is a good option for on-
location shoots, TV shows (when hung from a boom), and so on. The parabolic
microphone has a wider reach, as it can capture sounds from a distance of around 30
feet. It is the best option for capturing sounds at any sports event.

Optimal Placement of Microphones

You are now aware of the types of microphones and their polar patterns. Let’s take a
look at the placement of microphones. You can place microphones according to their
range and requirement. Microphone placement differs for various locations and
instruments. The selection and placement of a microphone plays an important role in
acoustic recording.
If a music track is recorded with good instruments by a skilled musician, the audio
will not need significant modification. Sometimes such an original, unaltered piece of
work sounds better than one edited several times to sound perfect. Here are a few
guidelines for audio recording.

When using multiple microphones, the distance between each of the microphones
must be three times the distance between the microphone and the source of
sound. For instance, if the distance between one microphone and the source of sound
is one foot, then the distance between the individual microphones must be around
three feet.
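This guideline is often called the 3:1 rule. A minimal sketch of the arithmetic (the function name is hypothetical, used only for illustration):

```python
# The 3:1 microphone-placement rule described above: keep each pair of
# microphones at least three times as far apart as each microphone is
# from its own sound source.

def min_mic_spacing(source_distance):
    """Minimum distance between two microphones, in the same units."""
    return 3.0 * source_distance

# A mic one foot from a singer should be at least three feet from the next mic.
print(min_mic_spacing(1.0))  # 3.0
```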

To avoid unwanted background noise and feedback, place the microphone close to the
source of sound. Moreover, try placing the microphone at various distances and
positions until you achieve the desired tone and sound.

If you are using a unidirectional microphone, point it toward the desired sound source.

To reduce handling noise and stand thumps, use an accessory shock
mount. Preferably, use an omnidirectional microphone, or a unidirectional
microphone that has an internal shock mount.

To avoid explosive breath sounds on the letters p, b, and t, place the microphone a
few inches farther from the mouth or off to one side.
To select a microphone, always consider its directional pickup. For example, if an
instrument radiates in more than one direction, use an omnidirectional microphone for
its recording.

Similar to the selection and placement of microphones, proper application of
microphones is also very important. Experts with a good understanding of
microphones obtain the best possible results. Let’s take a look at a few experts
involved in audio production.

Sound Engineer: They are responsible for the soundboard and other electrical
equipment in a recording studio. They create high quality music, speech, and sound
effects. They then mix the recorded tracks into the final piece of audio. They run the
recording session under the guidance of the producer.
Sound Technician: They are in charge of providing superior quality sound during live
performances. These technicians are also responsible for setting up the equipment
before a live concert. They must ensure that the equipment is functioning properly.
Acoustic Consultant: They are responsible for the complete audio, video, and acoustic
design services. They examine the performance, offer suggestions for any
improvement in equipment, and fix other audio related problems.
Audio Engineers for Videos: They are responsible for ensuring that the audio tracks
are coordinated with the video.
Live Sound Engineers: They control the soundboard during any live performance.
Mastering Engineers: They take the final mixes of recordings and add finishing
touches such as EQ (equalization), overall effects, and possibly compression, before
finalizing the output.

Audiotapes, Tapeless Carriers, and File Formats

The quality of digital audio depends on the sampling rate and bit depth settings. The
choice of compressed and uncompressed storage formats also makes a difference to
the audio quality. Carrier type is a classification that helps you identify the format of
the storage medium. It also lets you know the housing of a carrier. It helps you identify
the type of recording device needed to view, play, and run the content of a resource.
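As a rough illustration of how sampling rate and bit depth drive file size, the sketch below computes the size of uncompressed (PCM) audio; the figures used are the standard CD-audio values:

```python
# Uncompressed (PCM) audio size from sampling rate and bit depth, showing
# why these two settings affect both quality and storage.

def pcm_bytes(sample_rate, bit_depth, channels, seconds):
    """Size in bytes of raw PCM audio with the given settings."""
    return int(sample_rate * (bit_depth // 8) * channels * seconds)

# One minute of CD-quality stereo audio (44,100 Hz, 16-bit, 2 channels):
size = pcm_bytes(44_100, 16, 2, 60)
print(size)        # 10584000 bytes (about 10 MB)
print(size // 12)  # a rough MP3 size at roughly 1/12 compression
```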
First, let's look at some of the carriers used for audio recordings and reproduction.

Types of Carriers
Mechanical Carrier

Mechanical carriers, which are one of the oldest types of carriers, were invented for
dictation purposes. Over time, they were used for scholarly recordings of language and
ethnic music. The first recording system was the cylinder phonograph, invented in
1877. The introduction of gramophones led to a drastic drop in the use of these
cylinders. Mechanical disc formats were popular in the market from the late nineteenth
century until the 1980s. The recording technique for the mechanical medium is based
on the principle of air pressure. A rotating surface is engraved, which allows for sound
playback through needle contact with the recorded groove. The rate of deterioration in
mechanical formats due to regular use is generally high. Often, misalignments and
inexperienced operation cause damage to the carrier.

Magnetic Carrier

Invented in the nineteenth century, magnetic carriers were initially used on a small
scale, along with cylinders and gramophones. In the 1960s, several cassette formats
were developed using this form of audio recording. The 1980s saw the introduction of
digital audio recording on magnetic tapes. Information is stored on a narrow ribbon of
plastic tape. One side of this tape is coated with a magnetic material. Compared to
mechanical carriers, a magnetic tape can be played repeatedly without affecting its
quality. However, you must clean and de-magnetize these tapes regularly to avoid
damage to their surfaces. Further, the tape can be damaged or broken if it is pulled out
of the hard plastic housing that normally protects it.

Optical Carrier

Optical carriers are the newest carriers for audio and video storage. These carriers are
of three main types: mass-produced carriers, optical discs and tapes that you can use
only once, and re-recordable discs. Polycarbonate plastic is molded against a die that
contains tiny patterns of raised bumps along the surface. Data is stored in these
bumps and in the flat areas between them (lands). Replay causes no measurable
deterioration of an optical disc.

Let’s take a look at some of the most commonly used digital audio file formats.

MP3: MP3, short for MPEG-1 or MPEG-2 Audio Layer III, is one of the most popular file
formats. MP3 compresses a file to around one-twelfth of its original size. Files saved
in this format use the name extension “.mp3”. MP3 files are compatible with
Windows, Apple, and Android devices.
WAV: Waveform Audio File or WAV format stores audio. Any file saved in this format
will have the name extension of “.wav”. WAV files are compatible with Windows,
Macintosh, and Linux operating systems.
Audio Interchange File Format: Audio Interchange File Format, or AIFF, is widely used
in Macintosh operating systems. Also referred to as “Apple Interchange File Format,”
these files need not have any name extension when saved on a Mac. However, when
saving on any other operating system, the files will have the name extension of
“.aif”. Files saved in this format are compatible with Windows, Macintosh, and Linux
operating systems.

Every audio format faces the risk of corruption. A few factors that threaten the
longevity of an audio file are listed here.

Humidity: Water, present as humidity in the air, can damage audio carriers. Contact
with water causes oxidation, which is a type of chemical reaction. Direct contact
between water and any kind of audio carrier can have harmful effects. However, if
you thoroughly clean and dry the carrier, the data can often be saved.
Temperature: Temperature causes dimensional changes in carriers. High temperature
is harmful for lacquer discs, as it affects the lacquer coating. Another negative impact
of temperature is its influence on print-through in magnetic tapes, which increases with
rising temperatures.
Mechanical Deformation: Audio carriers are not sturdy and cannot withstand rough
handling. Cylinders and shellac discs can break, and all carriers are prone to surface
damage, which affects the sound quality. Discs should be kept scratch-free, while
cassettes and tapes must be stored in proper cases.

Editing Recorded Sound: Not every piece of sound recording is perfect on the first
attempt. For example, imagine you are recording a song in your house. You will first
have to eliminate all the unwanted background noise to make the singer’s voice more
audible. You may have to use various editing methods to ensure that the voice sounds
more melodious. Here are a few basic tasks that you should perform while editing an
audio file.
 When you record sounds for a video, you can split them into clips. You can work
on them individually to make sure that they are synchronized with the video.
 Sometimes, while recording audio, we do not realize how long the file has
become. You can edit the audio file to reduce its length. You can also edit out any
kind of unwanted sound or noise.
 For a video, the volume of a track is also important to create emphasis on a
certain point. You can adjust the sound volume according to the requirement of
the video.
 If a certain video clip needs extra effects to create a desired impact, you can add
the effects during the editing phase.
 While editing, two or more audio files can be cross-faded or transitioned
depending on the need of the audio track.
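The last task, cross-fading, can be sketched in a few lines. This is a simplified linear cross-fade over plain lists of samples, not how any particular editor implements it:

```python
# Linear cross-fade between two mono clips: fade clip_a out while fading
# clip_b in over `overlap` samples. Samples are floats in -1.0..1.0.
# `overlap` must be at least 2.

def crossfade(clip_a, clip_b, overlap):
    faded = []
    for i in range(overlap):
        t = i / (overlap - 1)                      # 0.0 -> 1.0 across the overlap
        a = clip_a[len(clip_a) - overlap + i] * (1 - t)  # outgoing clip fades down
        b = clip_b[i] * t                                # incoming clip fades up
        faded.append(a + b)
    return clip_a[:-overlap] + faded + clip_b[overlap:]

mixed = crossfade([1.0] * 6, [0.5] * 6, overlap=4)
print(len(mixed))  # 8 samples: 2 head + 4 overlap + 2 tail
```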

Signal Type        Voltage                                                Signal Strength
Mic level          2 millivolts (.002 volt)                               Weakest
Instrument level   50 millivolts (.050 volt)                              Weak
Line level         0.316 volt (–10 dBV) for unbalanced equipment;         Strong
                   1.23 volts (+4 dBu) for balanced equipment
Speaker level      1 to 1000 watts, or about 3 to 90 volts                Strong

Audio Cables and Connectors

Cables and connectors bring any electrical device to life. For audio-related electrical
equipment, every stretch of the cable, from the power line to the final transporting
cable, is important.
Cables are made up of several wires that transfer electric signals between various
components of an audio system. Connectors create a temporary link between two
components. Audio cables generally consist of one or two insulated wires called
conductors. The conductors within a cable are surrounded by a wire-mesh shield,
which reduces the “hum” sound. Cables are designed to carry either balanced or
unbalanced signals. A balanced signal requires two conductors. An unbalanced signal
uses a single conductor; in an unbalanced cable, the center conductor carries the
signal, and the shield can also carry signal.
The signals can be at one of the four signal levels, or voltages, shown in the table above.
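The decibel names attached to line level (–10 dBV and +4 dBu) follow directly from the voltages. A short sketch, using the standard reference voltages of 1.0 volt for dBV and 0.775 volt for dBu:

```python
import math

# Convert a voltage to its decibel value: dBV is referenced to 1.0 V,
# dBu to 0.775 V (the standard references).

def dbv(volts):
    return 20 * math.log10(volts / 1.0)

def dbu(volts):
    return 20 * math.log10(volts / 0.775)

print(round(dbv(0.316)))  # -10  (consumer line level, -10 dBV)
print(round(dbu(1.23)))   # 4    (professional line level, +4 dBu)
```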

Cable Types

Cables are categorized according to their function. Take a look at the different types of
cables.

Mic Cable: This cable consists of a female XLR and a male XLR connector. Between
these two connectors lies a two-conductor shielded cable. You can use this cable to
connect microphones to a mixer or a mic preamp. You can also use it to connect
professional balanced equipment together to pass line-level signals.
Guitar Cord: This cable has a phone plug at each end, with a 1-conductor shielded
cable between them. You can use it with an electric guitar, acoustic guitar pickup,
electric bass, or electronic keyboard.
Speaker Cable: You can use this cable to connect an amplifier to loudspeakers. It
typically has a banana plug at each end, with a 2-wire cord between them.
There are two main types of connectors: cable connectors and chassis
connectors. Cable connectors are attached to the cable itself. Chassis connectors are
built into the equipment chassis. Cable connectors are plugged into the chassis
connectors. Let’s now take a quick look at a few basic cable connectors.

RCA Connector: Also known as the “phono” or “cinch” connector, this connector
transfers audio and video signals. These connectors are cheap and easily available in
the market. They are also durable and have a very long life. However, a single
connector can carry only one audio channel, so you will need two or more of them
for stereo.
1/8-inch or 3.5 mm Connector: Generally used with iPods or other personal music
devices and most personal computers, this small-sized connector is suitable for pocket-
sized devices. However, at times, its size itself becomes a problem as the small wires
are fragile and easily damaged.

1/4-inch Connector: Musicians and audio professionals mostly use this connector,
since it is durable and tough. It is bigger, heavier, and more expensive than other
connectors, and it is not useful for small devices.
XLR Connector: This connector is circular and has three to seven pins. It is a high
quality connector with locking connections and large surface areas. Mostly used in
recording studios, it is a professional connector. It is heavy and expensive.
Cable connectors connect devices in a quick and easy way. The four connectors listed
here are the basic and most commonly used ones. The market is full of various types of
connectors. Of the thousands of connectors, audio connectors form only a tiny
fraction.

Industry Uses and Production Requirements

Many people believe that sound stirs the soul. Most people love music, and making
music can be a lot of fun. All you need to be a music producer is knowledge of the
equipment involved and a space for recording. Take a look at the things needed to
record an audio clip at home.

Audio interface: Your personal computer most likely has a microphone input
port. You might buy microphones and plug them directly into your sound
card. However, this will not give you high quality audio. For good quality sound, you
need an audio interface that allows you to use high quality microphones and studio
monitors. When buying an audio interface, keep the following in mind: it should be
compatible with your other systems, it must connect the microphone to the
computer, and it should have multiple input points.
Microphone: To record vocals or any acoustic instrument, you will need a
microphone. Every microphone is specialized in capturing sound in a unique
way. Consider the nature of your recording when you purchase a microphone and
select one that best accomplishes your specific task.

Studio Monitors: These provide clear and accurate sounds. You can find a variety of
studio monitors that achieve your desired sound.
MIDI Keyboard/Controller: If you are planning to create music with virtual
instruments, you will need a MIDI keyboard to play them.
MIDI Interface: A MIDI interface is optional; it connects the MIDI controller to the
computer and is needed only for computers that lack traditional MIDI ports.
Cables: These connect different types of hardware used for recording.

Audio mixer

An audio mixer modifies and shapes an audio file. When an audio mixer receives audio
from an external source, it adjusts the audio by modifying its volume and other
features. After modification, you can send the audio to another device. Let’s take a look
at a few elements involved in audio mixing.

Level: Level refers to the volume of the sound. If you want to hear something louder,
you have to turn up a fader. In any piece of music, the louder components easily grab
the listener’s attention.
EQ or Equalizer: EQ lets you shape the tonal quality of an audio piece by boosting or
cutting the volume of various frequency ranges. You can also use it to highlight
certain effects.

Panning: Panning separates the sounds of two instruments. It reduces the likelihood
of one instrument overshadowing another and makes the sound of every instrument
distinct and clear. You can also use it in a mix to create the impression that a source
is moving from one side of the sound stage to the other, perhaps in sync with a video.
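One common way mixers implement panning is the constant-power (sine/cosine) pan law, sketched below. The text above does not prescribe a particular law, so this is an illustrative choice:

```python
import math

# Constant-power pan law: `pan` runs from -1.0 (hard left) to +1.0
# (hard right). Equal power is maintained at every position.

def pan_gains(pan):
    """Return (left_gain, right_gain) for a mono source."""
    angle = (pan + 1) * math.pi / 4      # map -1..+1 onto 0..pi/2
    return math.cos(angle), math.sin(angle)

left, right = pan_gains(0.0)             # centered source
print(round(left, 3), round(right, 3))   # 0.707 0.707 (equal in both channels)
```

Sweeping `pan` from -1.0 to +1.0 over time produces the moving-source effect described above.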
Time-Based Effects: This feature adds effects, such as reverb and delay. They make an
instrument sound farther away, or even bigger than a normal instrument.
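As an illustration of level and panning, here is a minimal Python sketch. The function names `apply_level` and `pan` are hypothetical, and the constant-power pan law shown is one common convention rather than the only one:

```python
import math

def apply_level(samples, gain_db):
    """Scale samples by a fader gain given in decibels."""
    gain = 10 ** (gain_db / 20.0)  # convert dB to a linear amplitude factor
    return [s * gain for s in samples]

def pan(samples, position):
    """Constant-power pan: position -1.0 is hard left, +1.0 is hard right.

    Returns (left, right) channel sample lists. At the center (0.0), both
    channels get about 0.707 of the signal, keeping overall power constant.
    """
    angle = (position + 1.0) * math.pi / 4.0  # map [-1, 1] to [0, pi/2]
    return ([s * math.cos(angle) for s in samples],
            [s * math.sin(angle) for s in samples])

mono = [0.0, 0.5, 1.0, 0.5, 0.0]
louder = apply_level(mono, 6.0)   # +6 dB roughly doubles the amplitude
left, right = pan(mono, 0.0)      # centered: equal level in both channels
```

Turning up a fader corresponds to increasing `gain_db`, and sweeping `position` from -1.0 to +1.0 over successive buffers would create the moving-source impression described above.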
Audio mixers are essential for recording as they capture and play music
clearly. In digital form, audio signals consist of binary data that a computer can
alter. Since a digital mixer is essentially a specialized computer, it can help you make
changes to an audio project. Digital mixers are more efficient than analog mixers at
capturing and preserving sounds. A digital mixer converts the original sound into a
series of numbers that are transferred between the components of the mixer, which
keeps the signal intact. In an analog mixer, the signal itself passes through the
components of the mixer, which causes deterioration of the original sound.
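The idea of turning sound into a series of numbers can be sketched briefly. This is a generic illustration of digital audio sampling, not the internals of any particular mixer; the sample rate, bit depth, and function names are assumptions chosen for the example:

```python
import math

SAMPLE_RATE = 48000   # samples per second; a common studio rate
BIT_DEPTH = 16        # bits per sample, as on many digital recorders

def sample_sine(freq_hz, duration_s):
    """Measure a pure tone at regular intervals, turning sound into numbers."""
    count = int(SAMPLE_RATE * duration_s)
    return [math.sin(2 * math.pi * freq_hz * n / SAMPLE_RATE)
            for n in range(count)]

def quantize(sample):
    """Round a sample in [-1.0, 1.0] to the nearest 16-bit integer level."""
    max_level = 2 ** (BIT_DEPTH - 1) - 1  # 32767 for 16 bits
    return round(sample * max_level)

tone = sample_sine(440.0, 0.001)       # 1 ms of an A440 tone: 48 numbers
digital = [quantize(s) for s in tone]  # the numbers the mixer passes around
```

Because only these integers travel between the mixer's components, routing and copying them does not degrade the signal, which is the advantage over an analog path.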

Now let’s take a look at a few microphones based on the location of recording.

Indoor Sound Recording: When recording indoors, you may use the AKG D202
microphone. Its design is suitable for recording instruments, sound effects, and
narration, and its frequency response extends deep into the bass range and very
high into the treble range. The NTI microphone is mostly used to record voice and
instruments in a studio. It is a cardioid condenser microphone with a large capsule. Its
frequency response rises in sensitivity through the treble range, making it an ideal
choice for recording narration. The SM81 is one of the best alternatives for recording
plucked and hammered instruments. Its frequency response is very flat, with extended
bass and high-frequency response.
Outdoor Sound Recording: The AT835 model is a shotgun microphone. You can hold
the microphone two or three feet away from an on-camera actor while
shooting. The AT897 microphone is also a shotgun microphone that helps in video
production. It has a flat frequency response on-axis, unlike the side and back where it
has an uneven frequency. The MKH416 microphone is known for its exceptional
reach. You can place this microphone three or four feet away from an on-
camera actor. It is known for its high output level.
Concert Recording: When recording a large ensemble, you can use a stereo
microphone setup to accurately capture every sound. An omnidirectional microphone
is ideal for concerts, as it has equal sensitivity at every angle. Sound produced through
this microphone is equal, whether it is arriving at 0 degrees (at the front of the
microphone) or 180 degrees (at the rear of the microphone). The best feature of
this microphone is that it will pick up ambient sound as well as the sound you intend to
amplify or record. You can record the ensemble using multiple microphones that
adequately cover every section. For such a setup, the suggested equipment is as
follows:
 two cardioid-pattern condenser microphones
 microphone stands
 microphone cables with XLR connectors
 stereo microphone adapter
 a stereo microphone mixer with a minimum of two microphone inputs
 recording medium
 cables to connect the mixer to the recording medium
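The pickup patterns mentioned above can be described by simple sensitivity formulas: an omnidirectional microphone responds equally at every angle, while a first-order cardioid follows (1 + cos θ) / 2. The function below is a sketch of those textbook formulas, not a model of any specific microphone:

```python
import math

def sensitivity(pattern, angle_deg):
    """Relative sensitivity of a microphone to sound arriving at angle_deg,
    where 0 degrees is the front of the microphone and 180 is the rear."""
    theta = math.radians(angle_deg)
    if pattern == "omni":
        return 1.0                            # equal pickup in every direction
    if pattern == "cardioid":
        return (1.0 + math.cos(theta)) / 2.0  # full at front, zero at rear
    raise ValueError("unknown pattern: " + pattern)

sensitivity("omni", 180)       # rear ambience is captured at full level
sensitivity("cardioid", 180)   # rear sound is rejected
```

This is why an omnidirectional pair captures a concert hall's ambience along with the ensemble, while cardioid microphones favor the sound directly in front of them.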

Ambient sound refers to the background sound or noise at the place of recording. The
sounds of wind, water, birds, and traffic are common ambient sounds. Ambient sounds
are important to a video as they provide continuity between shots, create a certain
mood, and avoid unnatural silence in a scene. Here are a few types of ambient sounds
used in video production.

Matching Ambient Sound: For this technique, you create sounds separately from the
video. You must ensure that these sounds match the video. This process covers
mistakes made during filming. It also maintains consistency in the background score of
a video.
Wild Sound: This is audio recorded without any images, and is not synchronized with
the video footage. You can add wild sound to the video during post-production. You
can use this sound to accompany any new audio that you add to the scene.
Buzz Track: This refers to the general ambient sound that is the natural background
sound of a location. These sounds are often recorded on location before or after
shooting the main actions.
Room Tone: This refers to the sound of an empty room. Every room has its own
unique sound, which is generally subtle and low in volume.

Now let's take a look at the various problems that can occur during audio recording. The
setup for recording audio consists of a microphone, an amplifier, and speakers.

 Feedback is a problem that occurs when the sound leaves the microphone,
reaches the speakers, and makes its way back into the microphone. This sound
gets re-amplified, forming a loop, which is sent through the speakers again. This
loop makes a howling sound that disrupts a recording. To avoid this problem,
you must maintain proper distance between the microphone and the speakers.
 While recording, the microphone has a tendency to capture wind noise and
excessive air movement, which affects the clarity of the recording.
 “Phasing” occurs when two or more microphones are involved in the
recording. In such cases, the microphones’ signals can cancel each other,
weakening or even silencing the sound.
 When stray magnetic fields cause a vibration between the enclosure and
accessories of electrical connections that are close to the microphone, you will
hear a buzzing sound. This sound is known as the “electric hum.”
 Loose cable connections affect the sound quality of a recording.
 Failure to maintain a microphone’s sensitive diaphragm can damage it,
leading to poor sound quality.
 If a microphone is not protected from moisture, humidity, and temperature, its
functioning may be negatively affected.
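The feedback loop in the first bullet can be modeled as repeated multiplication by a loop gain on each pass through the microphone, amplifier, and speakers. The function and numbers below are a toy illustration, not measurements:

```python
def feedback_level(initial_level, loop_gain, passes):
    """Level of a sound after circulating repeatedly through the
    mic -> amplifier -> speaker -> mic loop."""
    level = initial_level
    for _ in range(passes):
        level *= loop_gain  # each trip around the loop re-amplifies the sound
    return level

howl = feedback_level(0.1, 1.2, 30)   # gain above 1: builds into a howl
decay = feedback_level(0.1, 0.8, 30)  # gain below 1: the loop dies away
```

Moving the microphone farther from the speakers (or turning the system down) lowers the loop gain below 1, which is why maintaining proper distance prevents the howl.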
Elements of Preproduction

In this lesson, you’ll describe the various steps and activities at the preproduction
planning stage, including identifying equipment, crew, cast, and set
requirements. You’ll also describe the audition process, the contents of a production
proposal, and steps to schedule the production process. Finally, you'll create a
production proposal.

Activities in the Preproduction Stage


Audio and video production includes stages such as preproduction, production, and
post-production, along with planning and scheduling. A preproduction plan includes various steps
that serve as a roadmap for your next stage of production. Once the team makes a
final decision on the project and its budget, the next step is to create a production
schedule.
When you are ready to launch a project, you must plan its entire production process
months in advance. You have to ensure that the required cast and crew are
available. You’ll need to schedule a list of tasks. Typically, you’ll need to scout locations,
work out production and set designs, and cast actors for each role. You also have to
arrange for the logistics and gear, rework the script if necessary, and so on.

The schedule should also describe the entire plan for the shoot and include everything
from start to wrap. Stick to the planned schedule to keep costs within the estimated
budget. Keep a checklist handy to mark completed tasks and to track the incomplete
ones. Always keep a backup plan to handle unforeseen events that can otherwise
upset your entire project.

Now let’s take a look at some of the important activities involved in the production
schedule.

Defining the look of the film and characterization: The screenwriter writes the
screenplay of the chosen script in discussion with the director. Later, the producer, the
director, and the screenwriter work together to decide the
project’s characterization (the personality of the characters) according to the
storyline. Casting directors cast the actors for most of the roles in the project. They
read the script and work closely with the writer, director, and producer to understand
the physical attributes, skills, and experience a particular role requires. Casting
directors then conduct auditions and shortlist potential candidates.
Identifying locations: A video shoot involves an initial site survey that helps the team
identify locations that suit the setting of the story. The production team conducts a
recce (or exploration), which involves a visit to a location to determine its suitability for
shooting. You’ll use the script to determine sites, such as natural areas, urban
landscapes, or historic places, that would best suit the story. You’ll need to consider
several elements before you select an appropriate site.

Getting legal permission: You must first get legal permission before you start any
work on a site.
Arranging power supply: When you shoot outdoors, arranging for a continuous
power supply can be a challenge. Make sure to equip the location with multiple camera
batteries, generators, and other power sources.
Checking for accessibility: The location you select must be easy for the cast and crew
to reach. Some scripts may require locations that may be in remote areas. In such
cases, you’ll need to arrange for temporary accommodation for the cast and crew.
Checking the lighting conditions: Lighting has a major impact on the final
appearance of the video. Indoor locations such as buildings or restaurants are likely to
have less natural light. So you’ll need extra lights to shoot in these spots. Outdoor
shoots require less artificial lighting. However, one disadvantage of shooting outdoors
is that you can’t control the lighting conditions. To get the perfect lighting condition for
a shot, synchronize your work with the available sunlight.

Types of lighting: Lighting is an integral part of any video because it adds aesthetic
value. Lights and shadows help create a certain mood, highlight certain elements, and
draw the viewer’s attention toward a certain object or a person. Different types of lights
serve different purposes. Consider a few types of lights and their uses.
The key light is the main light that focuses and highlights the subject of the scene. The
crew places this light in front of the subject, and at an angle with respect to the camera.
The fill light is less intense than the key light and fills the dark spots that the key light
creates. The crew generally places this light just above and as close as possible to the
camera.
The backlight creates a light ring around the subject’s edges, which makes the subject
stand out from the background and adds dimension to the frame’s outline. The crew
places this light behind the subject, and usually one to two feet above the subject’s
head.
The background light illuminates the background of a set behind the subject. The crew
places this light behind the subject.
Practical lights are the other sources of lights on the set that viewers see on the screen,
such as lamps.

To start any kind of project, you require funds. Without capital allocated for specific
tasks, it is difficult to start a project. This is why production scheduling starts only
after a budget is finalized. The preproduction process involves many activities, and it
is easy to overlook a minor one. A checklist helps you keep tabs on all the activities
and ensures you do not miss anything.

Crew, Cast, and Equipment Requirements

Every project involves people with varying levels of expertise, coming together and
working as a unit. Take a look at the crew requirements for a typical video production
and understand their roles.

Producer: Producers are responsible for the complete video production project. In a
filmmaking project, they oversee the entire process of filmmaking, from development
to completion of the film.
Director: Directors study the script and control the flow and content of a video. They
are involved primarily with the creative aspects of video production. They direct the
performance of the actors and decide camera positions and angles. They also decide the
look of a scene and the type of lighting it needs. They oversee every shot, the
sequences of scenes, and the entire video production process.
Screenwriter or Scriptwriter: Screenwriters do the research for a story and develop a
screenplay in a format that a producer will accept. They may first create a script and
then try to sell it to a producer, or a producer may hire a screenwriter to create a script
based on a specific storyline.

Casting Director: Casting directors or casting associates recommend the actors
required for a video. They conduct the auditions and negotiate the terms of a proposed
contract between the actor and the producer.
Production Designer: Production designers head the art department. They apply their
imagination and creativity, and through their sketches, create a world that brings the
director’s vision to life. They need to be ready to work on a variety of genres, such as
historical documentaries, urban stories, or science fiction.
Art Director: After the director approves the production designer’s sketches, the art
directors start building the sets. They ensure that the creation is in line with the
production designer’s expectations. They control the look and feel of the video. They
also supervise the costumes, make-up, and props.
Storyboard Artist: Storyboard artists read the story and sketch frames to outline the
scenes and characters. Directors typically hire freelancers who can work just on a
particular story. They may or may not work again with the same director.

Cast Requirements

The process of casting involves finding and hiring the right talent for a role. The
producer and director select the most appropriate professionals for a project. The
financial budget for the project is a main consideration when hiring the cast and
crew. For some projects, producers may hire an experienced cast; for others, they may
hire new talents. Moreover, if the script demands, producers may hire professionals for
specific tasks like creation of advanced special effects.

Making a video or a film is a collaborative effort that uses individual talent. The
outcome of the video project is the result of the team’s collective effort and
dedication. The producer and director work closely and guide the cast and crew
throughout the project. The coordination and cooperation within the team determines
the success of the final product.

Equipment Requirements

The crew uses a variety of equipment to shoot a video. One part of the preproduction
planning process is to arrange for the equipment. Depending on the budget for the
project, you can either rent or purchase this equipment. The crew chooses the
equipment according to the requirements of the project’s producer and director. Now
consider some of the basic equipment used in video production.

Video camera: The camera specifications depend entirely on the budget and the type
of video the crew is shooting. Generally, the director of photography heads the camera
and lighting department. However, the camera operator captures the scenes according
to the photography or video director’s instructions. The Steadicam operator holds
the Steadicam, a stabilizing rig that keeps the shots steady (especially if there is any
movement in the shot). It smooths the camera’s movement by reducing the impact of
any jerky movements the camera operator may make. The data wrangler transfers
(wrangles) data from the camera to a computer or hard drive.
Light: No matter what type of camera you use, onboard light is essential for almost
every type of shoot. Lights for indoor shoots differ from lights for outdoor shoots. The
chief lighting technician (the gaffer) is in charge of the electrical department. The best
boy assists the gaffer, and lighting technicians set up and control the lighting
equipment.
Sound: New filmmakers often do not pay much attention to the sound
department. However, sound is a very important part of any video. The crew must
correctly record the dialogues while shooting a video because it is time-consuming and
expensive to re-record them. The crew uses various types of microphones to record
voices for a video. The boom operator is responsible for the proper placement of the
microphone. The boom operator uses a boom pole to hold the microphone above, or
below, the actor but out of the camera’s frame. The production sound mixer heads the
sound department. This person selects microphones for the shoot, operates a sound
recording device, and, on some projects, manages the mix of audio signals in real time.
Wardrobe and make-up: The producer, director, and costume designer decide what
costumes to use for the main and supporting artists. The costume designer
understands the storyline of the video and the kind of look that will go with the actors
of the movie in a particular shot. The costume supervisor supplies the costumes,
manages the costume budget, and provides support staff. Hairdressers and make-
up artists collaborate with the costume designer to create an appropriate look for the
artists. They are responsible for making the actors look authentic in the roles they play.
A video project involves many elements. The preproduction team must create and
maintain harmony among crew members and ensure that all departments complete
their assigned work within the projected deadline.

Producers arrange for the main source of capital for the team. Every decision must be
made in consultation with the producers, as they control the budget and finances. The
make-up artist must coordinate with the hairdresser to ensure that the make-up suits
the chosen hairstyle.

The Audition Process


A casting director’s job is to assemble a cast for a film. They must have up-to-
date information about new and promising talents, as well as the schedules of
experienced actors in the industry. During the casting process, they break down
scripts, identify the different roles and profiles of the characters, and inform agents
about the available roles.
After the casting directors conduct interviews and auditions, they shortlist actors based
on several factors such as experience, reputation, and availability. Producers and
production houses may sometimes demand experienced and well-known actors. In
these cases, casting directors typically make arrangements for the look test of these
actors and check if they fulfill the director’s expectations. The producer and the director
typically select only the main lead actors. However, at times, a project may require a
fresh face for a specific role. Casting directors contact agencies that arrange for actors
per the producer’s specifications. If they can’t find a suitable main lead actor, casting
directors may conduct auditions in different cities in search of talent. Casting directors
must keep in mind the qualities they are looking for before they conduct auditions.

Casting directors must be prepared before the audition process starts. Following are
some useful tips if you are performing the role of a casting director:

 Be clear about the kind of person you are seeking. You must know the actor’s
previous work experience.
 Inform the potential candidates about the audition at least a month in
advance. The audition notice must specify the various documents and
photographs that the actors need to bring to the audition.
 Before the candidates enter the audition hall, give them an information form to
provide their contact details.
 Create several copies of the script and the character description for the roles.
 Give the actors freedom to display their talent, and provide props if required.
 After the audition, inform the candidates that your team will contact them if they
are shortlisted or selected.

There are differences in the criteria you would use to select on-camera actors
and stage actors. As a casting director, you need to keep these distinctions in mind.
The degree of expressiveness required for on-camera acting is different from that
required for stage acting. In front of a camera, an actor’s expressions need to be subtle
because the camera catches every slight physical movement of the actor. Conversely,
on stage, an actor can be more animated (and less concerned with subtle expressions)
to connect with the audience.
You must also note the volume of the actor’s voice. Stage actors must typically have a
loud voice because audiences at the back of an auditorium need to hear them. Actors
for a video production, however, don’t need to use a loud voice because the crew can
enhance the volume in a re-recording.

Content of a Production Proposal


A video production proposal contains information about the project. It includes details
such as program description, approach, characters involved, and equipment used. The
most important element of a production proposal is clarity. The production process is
complex, and the proposal therefore must clearly communicate the project to all the
people involved. These people include the producers, directors, sponsors, and anybody
else who is interested in launching the production.
For a project to begin, every person involved in the production process should agree to
the project proposal. The proposal follows the same structure as the video. It contains
details of the progress of video preparations, details of the parallel storyline, and the
climax of the video. The production proposal also details the project’s budget and a
statement of the allocation of funds for various expenses. Each production section
should have a separate budget.

A production proposal must follow a specific, easy-to-read format and cover all the
important elements related to preproduction. Now take a look at a sample proposal.
The video project proposal should start with the video title and the theme, which
describes the main topics in the video. The synopsis, a short description of the story,
follows. The video style, which explains the treatment and describes the types of light
and music to use, comes next. The proposal then mentions the target audience and the
duration of the video. The objectives describe the purpose of the video and the goals
that it wishes to achieve. The setting and resources section describes the set, along with
the equipment and budget required for the final production. The plausibility
statement explains why the project is a viable option. The proposal also contains the
estimated work schedule of the project.
Writing a Synopsis

You may need to create different types of synopses when you apply for funds. A one-
line synopsis is a short description of the story. It highlights the main sections of the
story. A one-paragraph synopsis provides the essence of the story and, in two to four
sentences, answers questions such as the following:
 Who is the protagonist?
 What is his or her story?
 What does he or she do about it?
 What is the conclusion of the whole story?
A one-page synopsis covers one or two pages. Apart from answering questions such as
“Who,” “What,” and “How” about the story, it also deals with the other major characters
and their conflicts and the major turning points. In short, a one-page synopsis must
include the event that highlights the beginning of the story, the protagonist’s decisions,
the characters’ goals and objectives, the obstacles they face (and how they overcome
them), and, finally, the end of the story.

Steps in Scheduling a Production


A preproduction checklist keeps track of the various important activities to complete
before the actual production begins. The checklist groups tasks by the various
departments involved in the preproduction planning and the number of days
remaining before the production. Now take a look at a few important highlights in the
checklist to ensure that the preproduction process works effectively.

You can use the first-phase checklist for tasks to complete when two months remain
before production. In this phase, complete tasks like writing the script, designing
costumes, identifying set designs and lighting, and selecting assistant directors.
Use the phase-two checklist when 50 days are left before production. In this phase,
complete tasks like arranging food for the cast and crew, revising the script, and
selecting the cast.
When 40 days are left before shooting begins, complete tasks like signing off on the
agreement with the actors (giving them advance payments), and recording the details
of all crew members.
When a month is left before production, complete tasks like booking or renting cars
and arranging the props and property. Create time cards for the cast and crew, sign
off on the shooting script, and prepare for the final shoot.
Twenty days before the shoot, complete tasks like releasing and signing all the
documents and permits for shooting. Conduct rehearsals; lock all the equipment for
production. Get all the contracts signed off, and pay off initial deposits and other
advances.
Ten days before the shoot, complete tasks like assembling all the equipment and
testing safety on the sets. Also, conduct a meeting of the final cast with the director
to resolve difficulties before the shoot.
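As a rough sketch, the phased checklist above can be organized as a structure keyed by the number of days remaining before production. Every name below is hypothetical, purely to illustrate how a team might track the phases in software:

```python
# Tasks grouped by how many days remain before production when they are due.
preproduction_checklist = {
    60: ["write script", "design costumes", "identify sets and lighting",
         "select assistant directors"],
    50: ["arrange food for cast and crew", "revise script", "select cast"],
    40: ["sign actor agreements and pay advances", "record crew details"],
    30: ["book or rent cars", "arrange props", "create time cards",
         "sign off on shooting script"],
    20: ["sign documents and permits", "conduct rehearsals", "lock equipment",
         "sign contracts and pay deposits"],
    10: ["assemble equipment", "test set safety",
         "hold final cast meeting with the director"],
}

def tasks_due(days_remaining):
    """Every task whose phase deadline has already arrived."""
    return [task
            for deadline, tasks in preproduction_checklist.items()
            if deadline >= days_remaining
            for task in tasks]
```

For example, with 45 days remaining, the 60-day and 50-day phases should already be complete, so `tasks_due(45)` lists their tasks for the checklist review.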
Your checklist should also include tasks for the set design. A set design, or production
design, is a layout that shows the shooting set. The set is a physical layout of the place
that the scriptwriter described in the story. For example, for a war-based movie, the
production team will create a set that looks like a battlefield. The team works hard to
make the movie look and feel like it belongs to a particular period. The actors’ make-
up and costumes also help create an authentic feel. However, it is difficult to re-
create certain locations entirely on a set. For example, an Indian village that set
designers create in an American studio does not look authentic. In such projects, the
cast and crew may travel to actual locations for the shoot.
Set design also includes props or accessories that help make the set look more
real. The smaller props are hand props, whereas the larger props decorate the sets
themselves. However, props and decorations are distinct items. A prop is an essential
item for the shoot because the actor uses it as part of a role. Decorations, on the other
hand, are not a part of the script—they just add a genuine feeling to the set.
Other important aspects of the checklist are the timetable and the daily call sheet for the
cast and crew. The assistant director maintains the daily call sheet. This sheet contains
the dates, times, and location details that the cast needs to report for shooting.
The timetable must be flexible to accommodate changes when shooting continues
without a break. When the cast and crew work in shifts to meet tight deadlines, create
a schedule to coordinate the overlap in time. Actors and directors typically work on
multiple projects. Their dates are blocked months in advance, and the shooting must
proceed accordingly. You should also inform the crew well in advance about the
working and non-working days. The call sheet also contains important contact details
of the crew members, hospitals, restaurants, and so on.
Genres and Subgenres
In this lesson, you’ll describe the various genres and subgenres in audio, video, and
motion films. You’ll also describe the development of various genres and
subgenres. Finally, you'll analyze the different genres in video and motion films and
explain how audiences react to different genres.

Genres in Films and Television


The word genre comes from a French word that means “type” or “class.” You can
categorize different works of art, literature, and audio-video media into groups
called genres. Let’s take a look at genres in two forms of audio-video media—films and
television.
You can recognize a genre on the basis of a set of common features. A commonality in
the theme, the setting, or the filming technique determines film genres. Broadly, a
film theme can be a drama (fiction), or a documentary (nonfiction). You may also group
films in a genre based on the emotions the films evoke. The horror genre evokes fear,
comedy evokes happiness, and romance evokes love. At times, the physical or social
setting in which the story of the film occurs determines the genre. Examples of such
genres are Westerns and war movies. The format  or  filming technique can also be the
basis for a genre. Film noir is a genre defined by sharp contrasts in lighting; its lighting
and low camera shots suggest dark psychological states. Animation films form another
genre. The animation technique combines drawings, paintings, or illustrations frame-
by-frame to simulate movement.
Documentary: A documentary is a nonfictional film that records some aspect of
reality. The purpose of a documentary is to instruct or to record events. The first film
that the Lumiere brothers created in 1895 showed the everyday life of ordinary
people. This was an early genre of films called actualities. It did not have any creative
storytelling, narrative, or staging. One of the early actualities documented workers
leaving a factory, and another recorded a train arriving at the station.
Filmmakers use different styles to make a documentary. The footage usually consists of
actual recordings of events. Another documentary style involves a reconstruction of
events that actors play out. The first instance of a recreation or staging to document a
real-life event was Sigmund Lubin’s The Unwritten Law (1907). It told the story of the
murder of the American architect, Stanford White.
At times, a voiceover is used to explain or question facts. Robert Flaherty’s Nanook of
the North (1922) is a good example of this style. Flaherty uses third-person narration
and combines it with a film of the actual events. He portrays a native person as the
hero to present the harsh life of the Canadian Inuit. Interviews are another common
feature of documentaries.
Drama: The main feature of a film of the drama genre is the detailed description of its
characters. Character development helps the viewers relate to the feelings and the
struggles that the characters are going through. Such films have the ability to bring out
strong human emotions in the viewers. When based on real-life issues and people,
dramas include fictionalized portrayals of the story.
Drama covers a wide variety of themes. The Grapes of Wrath (1940) depicted the effects
of the Great Depression of the 1930s. All About Eve (1950) focused on women and their
relationship with men. War dramas such as Patton (1970) and Apocalypse Now (1979)
educate viewers about the trials and hardships of war.
Comedy: Comedy films focus on the use of humor to draw laughter from the
audience. These films usually have a happy ending. Sometimes the themes of comedy
and tragedy appear together in one film (Life Is Beautiful, 1997). Comedies often involve
comic exaggerations of a situation, action, or character. They provide the viewers a
momentary escape from the problems and frustrations of daily life.
Actor-driven comedies rely on comic timing and the creative performance by
actors. They portray humor through well-timed jokes and visual actions, such as
harmless violence and horseplay. Charlie Chaplin, Buster Keaton, and the Marx
Brothers were some of the earliest comic artists in American cinema.

Action: In the action genre, the story is fast paced and full of successive high-energy
scenes. The main action centers on a protagonist who performs several physical
stunts. The hero or heroine is usually a resourceful character. They portray intense
levels of human endurance. They struggle and achieve victory against incredible odds,
life-threatening situations, and an evil villain. The James Bond movies and the Lara
Croft series are examples of action movies.
Adventure: The adventure films focus on the exploration and pursuit of the
unknown. These films tell adapted stories of historical, literary, or fictional adventure
heroes (Sinbad and Tarzan). While action films focus mainly on action and fighting,
adventure films often include searches for lost lands and treasure.
Crime: The films in the crime genre revolve around the sinister actions of criminals or
gangsters. The story may chronicle the rise and fall of a criminal (Scarface, 1983). Or it
may tell the story of a victim. It could also chronicle the life of a criminal investigator
(Donnie Brasco, 1997).
Westerns: Films of the western genre revolve around European white settlers’
westward expansion and settlement of America. This is one of the oldest and most
typical American film genres. These films feature gritty stories of gun-toting cowboys,
outlaws, bandits, and Native Americans. Some common themes of this genre are the
conquest of the Wild West; the conflict between white settlers and Native
Americans; gold prospecting; and the triumph of justice over lawlessness. Historians
consider The Great Train Robbery (1903) to be the first Western film.
Horror: Horror films are based on stories that evoke fear and cause their viewers to
experience panic. These stories focus on the dark side of life. They include mythical
creatures such as vampires and zombies, which attack and scare the characters in the
story. These films show a familiar everyday world that the viewers can relate to as their
own; and thus create an initial sense of security. The evil and destructive attack
disrupts the stability of the characters’ world, and the film draws the viewers into the
fears of the characters. Some examples are The Omen (1976) and Halloween (1978).
Thriller: A thriller generates a sense of intense suspense, anticipation, and tension in
the audience. In a typical thriller, the lead character faces life-threatening
situations. The story sustains a sense of uncertainty and excitement, and viewers are
led to wonder whether the hero or heroine will come out of the situation alive.
Romance: In romantic films, the plot is typically based on a love story between the
leading characters. The genre explores various themes about love—sentimental love;
love at first sight; forbidden love; sacrificial love; and unrequited love. Most romantic
films lead to a point where the lovers overcome many obstacles and live happily ever
after. However, some romantic films have tragic conclusions; and these, too, may
invoke strong emotions in the audience.
Musicals: Musicals tell a story primarily in the form of songs and dances. The early
musicals were just adaptations or filmed versions of stage hits of the time. The
Broadway Melody (1929) was the first musical that included songs and dances along
with a backstage plot. The songs in musicals either advance the story or simply
enrich the viewing experience.
Science fiction: Science fiction films are typically set in the future, either on Earth or
in space. They include elements of futuristic technology. The first science fiction film
was A Trip to the Moon (1902). These films may express concerns about the potential
impact of an uncontrolled new technology on society and the environment.
Fantasy: The fantasy genre incorporates films based on imagination and myth. It
differs from the science fiction genre, which is rooted to some degree in scientific
fact. Fantasies often involve imaginary lands and unreal worlds. They may include
magical fantasy creatures such as fairies, dragons, witches, and giant monsters.
Animation: Animated films (earlier called cartoons) get their name from the style of
filmmaking involved in making them. In these films, creators photograph individual
drawings, paintings, or illustrations frame-by-frame to simulate movements. In earlier
days, an illustrator had to hand draw each frame; today, computers do most animation
work. Animations are mostly devoted to children’s fantasies. In some cases, they have
plots with adult themes as well. Some examples are WALL-E, and Fantastic Mr. Fox.
Television Genres

Television stations broadcast different types of programs. You can sort them into
several genres based on the common conventions that they share. Let’s look at some
of these genres.

Commercials: Commercials are unique to television. They form an important genre
that supports the television industry in the United States. The advertisement fees that
corporations pay broadcast companies help the latter fund their operations.
News shows: News is another important genre of television. It updates viewers on
happenings around the world. These shows typically last for a half hour; but some
weekly news programs run for an hour. In earlier times, broadcasters offered news as
a public service; but in recent years, it has become an important source of revenue for
television stations.
Sports broadcasts: Sports broadcasts focus on competitions based on sports such as
tennis, baseball, basketball, and football. The broadcasts may be live or recordings of
previous sporting events.
Soap operas: Soap operas are ongoing serials that sometimes last for decades; for
example, Dynasty and The Bold and the Beautiful. When stations first introduced them, they
aired these programs in the daytime along with commercials for soap products to a
primarily female audience. The stories reveal the day-to-day lives of the characters and
the plots build up over time. The programs dramatize interpersonal and family
conflicts; most conflicts relate to deception, lies, greed, infidelity, and jealousy.
Sitcoms: Situational comedies (sitcoms) originated from radio. The programs feature a
regular cast of characters who interact in a common environment, such as an office or
a family home. The storyline places the characters in unusual situations such as
misunderstandings and embarrassing coincidences. The characters eventually resolve
the problem in a humorous way. Some well-known sitcoms are Friends, Frasier,
and Seinfeld.
Game shows: Game shows feature participants who play a game. Typical game shows
involve answering questions or solving puzzles. Successful participants
win money or prizes. Some shows, such as Wheel of Fortune, are based purely on luck;
others, such as Who Wants to Be a Millionaire, are based on competency.
Talk shows: Talk shows typically feature a host who interviews famous personalities;
they may also include a panel of experts who debate on a subject. Sometimes a live
audience interacts with the host or the guest. Examples of talk shows are The Tonight
Show, Dr. Phil, and The Ellen DeGeneres Show.
Cookery shows: Cookery shows feature a chef who shows how to make various dishes
from different cuisines. Some prominent cooking shows are America’s Test
Kitchen and Good Eats.
Reality shows: Reality shows make up a relatively new genre in television. They involve
programs in which ordinary people or celebrities are filmed living their everyday
lives. The shows are unscripted, but they are heavily edited to create some kind of
narrative thread. Some popular reality shows are Keeping Up with the
Kardashians and The Apprentice.

Subgenres in Films and Television


You can divide genres into subgenres to better describe movies and television
programs. A film such as Kramer vs. Kramer, described as a drama, informs viewers that
it will portray reality; but the addition of a subgenre—courtroom drama—describes the
film more specifically. Similarly, television genres have their own set of subgenres. For
example, news is a genre with subgenres such as financial news (describing stock
market conditions); sports news (describing the outcome of significant sporting events);
and political news (describing the political scene).
Subgenres in film: You can further categorize all film genres into
subgenres. Subgenres of documentary films include war documentaries such as The
Memphis Belle (1944), which presented real-life footage of Allied bombing missions by B-17
bombers during World War II. An example of a concert documentary is Madonna: Truth
or Dare (1991), which chronicles the life of the American singer and songwriter Madonna
during her 1990 tour.
Several subgenres exist in comedy. In slapstick, the humor is based on physical and
visual action, such as The Mask (1994). Verbal comedy builds humor into the dialogues,
such as the films that Woody Allen directs. A black or dark comedy, such
as M*A*S*H (1970), observes serious subjects such as war or discrimination in a
humorous way. Parodies or spoofs humorously mimic popular films; for example, the
films in the Austin Powers series that parody the James Bond spy movies.
The martial arts films make up a prominent subgenre of the action genre. This
subgenre became popular because of films starring Bruce Lee, such as Enter the
Dragon. Disaster films are another subgenre within the action genre. They focus on
manmade and natural disasters that threaten the characters' lives.
The earliest and most renowned subgenre of the adventure film was
the swashbuckler. It included lavish sets, costumes, and a brave and daring
protagonist engaged in a romantic relationship with a lady in distress. The action
included battle sequences involving antique weapons such as swords and
cutlasses. Films based on legends such as Zorro, Robin Hood, and The Three
Musketeers are typical of this subgenre. Another popular subgenre is the pirate film; an
example is the Pirates of the Caribbean series.
Crime subgenres include mob or gangster films that focus on organized crime or the
mafia; for example, Scarface (1932, 1983), Goodfellas (1990), and Gangs of New
York (2002). Film noir is another subgenre that is characterized by high-contrast or stark
lighting. It typically includes a cynical hero. It was a style of filmmaking that was popular
from the 1940s to the 1960s. Examples of film noir include The Maltese Falcon (1941)
and Sunset Boulevard (1950).
Subgenres of the traditional Western genre include spaghetti, contemporary, and
revisionist Westerns. Spaghetti Westerns (The Good, the Bad, and the Ugly) were low-
budget films. They were shot mainly in the arid regions of Spain; and mostly Italians
produced and directed them. They introduced more violence than the traditional
Westerns. Contemporary Westerns (Brokeback Mountain, Hud) use Western themes and
motifs; and the stories are set in present-day America. Revisionist Westerns (Dances with
Wolves) inspire viewers to re-examine elements of the traditional Western. For example,
they portray Native Americans in a sympathetic way, and not as savages.
Horror films have subgenres based on some source of fear or panic. Many of the early
horror films, such as Dracula and Frankenstein (1931) belonged to
the Gothic subgenre. The grotesque creatures, spooky mansions, and foggy and dark
locales used in this subgenre induced terror. Supernatural-horror includes films that
depict ghosts, demons, and other supernatural occurrences; some examples are
The Exorcist (1973) and Poltergeist (1982). Slasher is a subgenre that induces fear
through the rampage of a psychopath killer (Friday the 13th series). Zombie-
horror evokes fear in the form of grotesque beings raised from the dead (Night of the
Living Dead, 1968).

Subgenres in Television

You can divide the various genres in television into subgenres. Let’s look at some of the
subgenres within the television genres.
News programming is divided into subgenres by subject—general news, political,
business, sports, and entertainment news. Breaking news is another subgenre that has
grown into a genre of its own. It is news of any event that is happening and is reported
as it unfolds. The news must be of such significance that broadcasters consider it
necessary to interrupt a scheduled program or an ongoing news program to report its
details. Satellites and the Internet have enabled faster communication; in turn,
television stations can keep audiences informed continually with live updates. Some
typical items of breaking news are a major natural disaster or the death of a prominent
world figure.
The talk show genre has subgenres based on the timing of the show and its
content. The morning show is a subgenre aired in the morning that includes segments
of news and weather with discussions on various topics. Some examples are Good
Morning America and The Today Show. Shows that air during the day are called
the daytime talk shows. This subgenre includes shows such as The Oprah Winfrey Show
and The Jerry Springer Show. Talk shows that air in prime-time or late-night slots are a
subgenre of their own. Examples include The Tonight Show and The Late Show. Another
subgenre in talk shows is the political talk show. Examples include Crossfire and Face the
Nation.
The sitcom genre is divided into three subgenres. The actcom (action comedy) is the
original type of sitcom. The plots are based on the characters’ personal crises; their
verbal and physical actions evoke humor. Typically, the story is shallow, and there is no
verbal wit. Examples of actcoms include The Beverly Hillbillies and Night Court.
The domcom (domestic comedy) plots emphasize domestic crises. The characters are
complex, and capable of deep thought. They are more serious than actcoms and focus
on emotional and moral issues. Examples of domcoms include Diff'rent Strokes and The
Cosby Show.
The dramedy (dramatic comedy) is the rarest type of sitcom. The humor derives from
the way the characters handle the crisis; but the main plot always presents a serious
theme such as war. Examples of dramedy include M*A*S*H and The Simpsons.

Evolution of Genres and Subgenres


Just as fashion and cultural movements change over time, genres also evolve. Many
movie genres are as old as the movies. The Lumière brothers’ first film in 1895 (of
workers leaving a factory) laid the foundation for visual storytelling. This, and the
other actualities, documented everyday activity. They were the precursors of the
documentary genre. In a scene in The Sprinkler Sprinkled, a kid steps on a hose pipe and
stops the flow of water. He then steps off the hose just when the gardener looks down
the nozzle. This was cinema’s earliest comedy film, and it has survived to this day as an
example of the comedy genre.
Technology and cost play a major role in genre evolution. As cameras became more
compact, lighter, and more mobile, they enabled filmmakers to experiment with new
styles. In 1922, Robert Flaherty traveled to the Canadian Arctic and shot the film Nanook of
the North. This film was the first to be called a documentary, and it gave rise to an
important genre. With improvements in technology, filmmakers were able to introduce
sound and color in the 1930s. Musicals became possible as a new genre, with the
inclusion of sound.
Every genre goes through many phases. However, a genre never disappears—it
invariably follows a predictable evolutionary path. There is the initial phase, which can
typically be traced to a specific successful movie. For example, most film critics agree
that the subgenre of slasher-horror movies began with the 1978 film Halloween. Once a
particular movie is successful, film studios churn out more films that follow the
formula. This marks the period of growth and maturity of the genre. In time, however,
audiences get bored with the subject and the genre loses public appeal. Filmmakers
may, then, try to merge themes and create hybrid genres. The hybrid genres keep the
original genre alive and attract a wider audience. Western comedies such as The
Paleface (1948) and Cat Ballou (1965) are examples of such hybrid genres.
In recent times, technology has aided the widespread popularity of 3-D films. In the
1950s, filmmakers tried offering 3-D to viewers, but the goggles were a failure. They
created blurred images. They were also expensive and heavy. Today, 3-D goggles are
cheap and light. The films are clear and sharp. Computer-aided animated films have
enabled producers to create animation films rapidly. This has led to a spurt in the
number of animation films.

Technology has also changed television production. With satellite communication,
television has achieved a vast reach. With this reach, broadcasters feel
the need to cater to different audience preferences. Several channels broadcast only
specific genres. For example, HBO (Home Box Office) is a dedicated movie
channel; Nickelodeon and Discovery Kids focus on children.
Technology has enabled live coverage and updates of events; and thus boosted the
growth of a number of genres. Live telecast of sporting events attracts a vast
population of sports fans. Special events such as the State of the Nation addresses
make the political news subgenre more attractive. Similarly, updates on the weather
make the weather subgenre more relevant. They also increase the public’s viewership
and subsequent popularity of the show.
Film genres differ from television genres in that film genres have a broader
scope. Television subgenres cover almost every aspect of people’s interests,
ranging from religion to cooking, and thus they are far greater in number.

Influence of Genres on Audience


Audiences have always reacted strongly to cinema and television. People’s lives,
experiences, and concerns inspire cinema. In turn, films also influence people’s
thinking. Cinema is an attractive medium for distraction from the problems of everyday
life. It creates vast audiences who are hooked on films of various genres. This
phenomenon was most pronounced during the Great Depression, when people wished
to escape from the misery of their lives. During the war years, movies such as Mrs.
Miniver (1944) lifted the public spirit and instilled patriotism.
Musicals tell reassuring stories that focus on people going from rags to riches. The
musical genre became popular because of its uplifting nature. Genres in film and
television shape public opinion. Slavery as a subgenre has been topical for
decades. Some examples are Ben-Hur (1959), Amistad (1997), and 12 Years a
Slave (2013). Al Gore’s An Inconvenient Truth (2006) is a documentary campaign to
educate people across the world about global warming. Philadelphia (1993) and Dallas
Buyers Club (2013) are movies that have stayed in the public psyche. These movies have
helped raise awareness about homophobia and the AIDS epidemic.
An audience’s reaction to a genre varies, depending on the type of genre. Some genres,
such as comedies, fantasies, and musicals, can evoke positive emotions. Corporations
often opt for brand placement strategies in such film and television genres.

In the earlier days, television primarily focused on entertainment
programming, such as sitcoms and soaps. The news genre was a short 15-minute
slot. With growing interest in current affairs, television networks began to dedicate
more time to news. For example, news channels produced and telecast the presidential
debates between Kennedy and Nixon. A record 70 million viewers watched these
debates. This encouraged networks to move the genre to prime time in later years. The
audiences’ appetite for news has grown over the years, and this shift has given a boost
to this genre.

Genres such as erotica and action overstep the limits of decency with explicit, erotic
scenes, and extreme violence. Recognizing films’ and television’s influence on
audiences, the government has taken adequate steps. Censorship and legislation
ensure that the content adheres to a certain level of moral standards.
Musicals require sound to carry the effect of songs and compositions. Animation was
popular from the early days of film making; however, it has become a lot easier to
create animated movies today with the aid of computers. Film noir was a very popular
genre a few years after it emerged. Three-dimensional (3-D) films were not a success
for any length of time, and after a few years filmmakers stopped making them.
Writing for Different Formats
In this lesson, you'll describe the process of creating storyboards for movies and TV
shows. You'll describe the appropriate treatment note for each medium. You'll describe
how to write scripts for different formats such as news features, public service
announcements, commercial scripts, promotional scripts, biographies, and
documentaries. You'll also create a detailed storyboard that includes a shot list.

Audio Visual Media: Differences in Format
We live in an era of advanced technologies that enable us to choose from a wide
variety of entertainment options. You may laze around at home and watch TV, or plug
in to listen to news or enjoy your favorite music on radio when you travel or jog. Your
choice varies with your mood, the time available, and the portability of the medium.

Based on the Hierarchy of Needs postulated by American psychologist Abraham
Maslow, theorists Blumler and Katz noted that people choose to watch programs (such
as soaps and sitcoms) that make them feel good. Alternatively, they may choose to
listen to news that updates their knowledge of the world.
Producers plan and schedule their segments to match the preferences of viewers or
listeners who engage with the media. Producers, when creating a script, need to keep
in mind the complex nature of different audiences. People choose media that best
suits their varying needs.

Producers conduct research to understand the media preferences of their potential
audience. They plan the presentation of the script to match the demographic details of
the audience. An effective script is an essential element to connect with the
audience. Film scripts generally have a one-time engagement for 120 to 180 minutes. In
contrast, television characters and news presenters face their audience on a daily or
weekly cycle. Therefore, the challenge is to ensure that the audience reconnects when
they come back to a TV show.
Radio often gives you company when you are distracted by another activity. Therefore,
the content has to be simple and direct. Radio engages only your auditory sense. TV
engages your auditory as well as visual senses. Clarity of content is important for a TV
broadcast because the viewer is likely to hear and see it only once.

Storyboards for Different Formats


A storyboard is a visual representation of the script. It helps you share your ideas and
vision. It lays out a plan for production with a series of visual representations for
composing the shots. Radio does not have visual elements. Its storyboard typically
includes sound effects, presenter's comments, interviews, and commercials. The list of
sound effects, the song list, and so on, appears on the left. The presenter's comments
or dialogue appears on the right side of the storyboard. The radio newscast goes
sequentially. This means that the listener cannot make sense of the second story without
hearing the first one.

Therefore, all the chosen stories must hold the listener's interest. Music is one of the
most popular genres on radio. Each music show has a song list, or includes interviews
with people in the music business. Sometimes, radio jockeys choose a theme for the
show, for which they require a rough script. In a radio script, you need to repeat the
name of the interviewee or the characters often to ensure audience recall. For
television and films, you need not repeat the names of people because the audience
can see them.

Aspect Ratio   Usage               Type of TV
4:3            standard channels   old TVs
16:9           HD channels         majority of HDTVs
21:9           most movies         very few TVs

Keep in mind the distinctions between scripts for auditory (radio) and visual (TV or film
broadcast) media. TV and radio use a conversational writing style. Avoid using similar-
sounding words such as "weather" or "whether" when you create radio scripts. The use
of figures should be minimal, and the sentences should be brief.
Hundreds of people may be present on a set for TV or film productions. The storyboard
therefore needs to show the precise scene sequences, as well as information about
every frame. This includes details about the actors who will enact the scene, camera
angles, special effects, etc. The storyboard helps you organize the set before the shoot
begins.

When you develop a storyboard for TV or film production, the proportion of the width
and height of the screen is an important consideration. This proportion is the aspect
ratio. We express it in a W:H format (W is the width and H is the height). For example, a
16:9 aspect ratio implies a width of 16 units and a height of 9 units. Most television sets
have an aspect ratio of 16:9. This offers a perfect fit for high definition TV
shows. However, many films use a wider ratio, such as 21:9. When a widescreen movie is
scaled to fit the TV screen's narrower aspect ratio, black bars fill the leftover
space. This process is letterboxing. You'll see black bars above and below the picture
on your 16:9 screen.
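The letterboxing described above is simple arithmetic: scale the film to fill the screen's width, then split the leftover vertical space into two equal bars. Here is a minimal sketch in Python (the 1920x1080 resolution and the function name are illustrative assumptions, not part of the lesson):

```python
def letterbox_bars(screen_w, screen_h, film_w_ratio, film_h_ratio):
    """Return the scaled film height and the height of each black bar
    when a wider film is fitted to a narrower screen."""
    # Scale the film so its width fills the screen's width.
    film_height = screen_w * film_h_ratio / film_w_ratio
    leftover = screen_h - film_height
    return film_height, leftover / 2  # one bar above, one below

# A 21:9 film on a 1920x1080 (16:9) screen:
film_h, bar_h = letterbox_bars(1920, 1080, 21, 9)
print(round(film_h))  # about 823 pixels of picture
print(round(bar_h))   # about 129 pixels of black bar, top and bottom
```

When the film and screen share the same aspect ratio (for example, a 16:9 film on a 16:9 screen), the leftover space, and therefore each bar, is zero.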

Constructing a Storyboard

Once you draft a script, the next step is to create a storyboard, with an outline of each
shot. You can use professional storyboard paper, or draw a series of boxes on a sheet
of paper. The boxes represent video frames. The size and shape of these boxes differ
for films and TV. TV sets originally had a 4:3 aspect ratio, and thus storyboard panels
were nearly square. Even though modern HD TV sets have an aspect ratio of 16:9, writers
for TV continue to use the nearly square storyboard panels. A film storyboard typically has
rectangular panels. They are twice as wide as a TV screen. If you mount a thick black
border (approximately one-half inch in width) around each square or rectangle, you get
the illusion of a TV or theatre screen. This gives you an idea of the size of individual
shots.
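The panel shapes described above follow directly from the aspect ratios. A small sketch in Python, assuming you draw panels two inches tall (the panel height and function name are illustrative assumptions):

```python
def panel_width(height_in, ratio_w, ratio_h):
    """Width (in inches) of a storyboard panel of the given height,
    drawn to a chosen aspect ratio."""
    return height_in * ratio_w / ratio_h

# A 2-inch-tall panel at common ratios:
print(round(panel_width(2, 4, 3), 2))    # 4:3 TV panel -> 2.67 in
print(round(panel_width(2, 16, 9), 2))   # 16:9 HD panel -> 3.56 in
print(round(panel_width(2, 21, 9), 2))   # 21:9 film panel -> 4.67 in
```

This makes the contrast concrete: a 4:3 panel is close to square, while a widescreen film panel at the same height is nearly twice as wide.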
When you plan a shot panel, consider elements such as location, characters, camera
angle, lighting, and so on. You can also provide important information (in a box below
the panels) to describe the elements not present in the illustration. For news features,
the storyboard and script appear on the same sheet. One-third of the space is for
video, and two-thirds for audio.
Public Service Announcements (PSAs): PSAs are short, non-
commercial advertisements that a television or radio station airs for a cause or for
charity. Non-profit organizations (which generate most PSAs) typically use them to
promote their community-building activities. Broadcasters donate the airtime to PSAs
as part of their commitment to serve the public interest. Broadcasters allot free PSA
airtime (usually 10- to 60-second slots) to groups such as community foundations, non-
profit groups, and advocacy groups.
A PSA must contain information beneficial to all people. It should not include any
controversial or self-serving material. The choice of the medium used to broadcast a
PSA depends on its reach to the target audience, and your ability to produce the
spot. For example, to promote an organ donation campaign, you would most likely
choose a national radio station and a time slot when people are more likely to pay
attention. Most of the PSAs appear in slots when paid commercials or programs
are unavailable—typically late at night, or early in the morning. Therefore, the agency
providing the PSA must negotiate with public service directors to get the best time slots
to reach the appropriate target audiences.

Tips for Writing a PSA Script

Here are some tips to develop an effective PSA script:

1. Decide on the purpose of your PSA.


2. Survey the media outlets to reach the maximum target audience.
3. Keep your language simple and clear because you have only a few seconds to
convey your message.
4. Limit your script size to 75 words or less because most stations prefer a 30-
second slot.
5. The top of your PSA sheet must include the length (airtime) of the PSA, the name
of the agency or group that you represent, and the title of the PSA.
6. Type on only one side of the 8-1/2 x 11 inch paper. Triple space the entire PSA so
that it is easy to read.
7. Try to submit a live copy—or a ready-to-air script for radio. However, TV stations
rarely accept live copies.
8. For TV scripts, describe each shot in writing, and provide dialogue along with it.
9. Your PSA format for TV must include the audio portion on the right and the
video portion on the left.
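The word-count rule in tip 4 is easy to check automatically. Here is a minimal sketch in Python; the 75-word limit comes from the tips above, while the function name and the sample script are illustrative assumptions:

```python
def fits_psa_slot(script_text, max_words=75):
    """Check whether a PSA script fits a typical 30-second slot
    (most stations prefer scripts of 75 words or less)."""
    word_count = len(script_text.split())
    return word_count, word_count <= max_words

# A hypothetical PSA draft for an organ donation campaign:
draft = ("Every day, thousands of patients wait for an organ transplant. "
         "One donor can save up to eight lives. Register as an organ donor "
         "today at your local DMV or online. Give the gift of life.")
count, ok = fits_psa_slot(draft)
print(count, ok)  # 35 True
```

A 35-word draft fits comfortably; a script twice the limit would fail the check and need trimming before submission.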

News Formats: Breaking News, Reports and Features
Let's study how reporters create catchy leads for news stories.
A lead is the opening paragraph of a news story that grabs the readers' attention. It
gives a reader the sense of the story that follows. Reports use two basic types of
leads: direct and delayed. The direct lead tells the reader or the listener the most
important aspect of the story at once. Breaking news events usually use this type of
lead. The delayed lead entices the reader or listener to the story by hinting at its
contents. A delayed lead often appears with feature stories.
Reporters create leads based on three factors: their skill, the item's editorial
importance, and the duration given to the story. A direct lead has four essential
requirements. First, your lead must say something specific to the reader, and move
directly to the heart of the event. Next, include the time element in your lead. Then
identify the source of action or information. Finally, mention the place where the event
occurred. You can also combine the direct and delayed leads, and create a combination
lead. Here, the story begins with a few general sentences and then provides the reader
with facts at the end of the paragraph.

Radio and Television News Leads

Radio is not a visual medium. Your audience will connect with your story based on what
they hear. Therefore, for radio, you need to create leads that grab listeners' attention.

The lead style for TV news is quite similar to the style for radio news. As with radio
listeners, TV viewers cannot go back to review the facts already presented. TV adds new
complexities because it combines pictures and words. In TV stories, reporters use
a voice-over (spoken dialogue) to explain the contents of a screen. Editing decisions
depend on the kind of footage (audio and video clips) available. The news writers create
the script to match the shots that are available, and then edit the tape to fit the script.

Types of Leads

Summary lead: A summary lead is the most traditional form of writing. It is used to
announce breaking news. It mentions the "who," "what," "when," "where," "why," and
"how" aspects of the story. Breaking news highlights key events. It includes one or two
headlines that run in rotation, one after another. For example, the sudden onset of a
hurricane or a football team's match-winning goal can be summary leads. These are
unfolding events, and the newsroom doesn’t have the time to produce a full story
about them right away.
Question lead: A question lead begins with a question, making the readers think before
they connect with the story. An example of a question lead can be, "Are you willing to
put in an extra dollar for healthcare?"
Descriptive lead: A descriptive lead describes the event or scene in great detail, rather
than offering information about the event. For example, "The bridesmaids’ burgundy
gowns matched the wine that flowed at the reception."
Narrative lead: A narrative lead describes an interesting action in the story. It carries
the reader through the story and finishes with a surprising twist at the end. "At 1:30, I
was having lunch with my friends, who were trying to cheer me. I was sure I would
never be able to fulfill my dream of becoming an actor. At 2:30, I received a call that
changed my life forever," is a narrative lead.

Feature Stories

Unlike breaking news, feature stories do not need urgent reporting. They aim to build
up interest in the audience. They include in-depth stories about personalities, events,
and current trends. They typically use delayed leads that set the scene with an
associated incident. Feature stories have ample time to develop themes.
Here are some categories of feature stories:

1. Profiles of prominent or unique individuals of interest to the community
2. Reviews of books, films, music, restaurants, and so on
3. Q&A-type interviews
4. Background for certain events or traditions
5. Historical pieces that delve into the past
Examples include:

• Human interest: recruitment of young people by ISIS
• People profile: Malala Yousafzai, Nobel Peace Prize awardee
• Historical features: recession and global economy
• Behind the scenes: a story about the Kodak Theater on the eve of the Oscar
awards

Commercial and Persuasive Scripts


Commercial films are big-budget movies that aim to attract a large audience based on
their themes, actors, special effects, and so on. If you want to persuade the producer to
invest in your script, you must know how to pitch your idea. You present a treatment
note to the producer, which gives an overview of your script. If the producer is impressed with the treatment, you present the script. To showcase the key features of
your script, you must learn to create an effective storyboard.
Before you begin to create a storyboard, remember to first break down your script into
shots and create storyboard frames. The shot list will help you identify the important
scenes that need to be added and maintain a sequential flow of the story. Storyboards
for TV and films include frames that combine images and text. Evaluate each shot and
add the required information, such as location, actors, props, type of shot, special
effects, and so on to the text box below each image frame. Let's look at the details of
the art of effective scriptwriting.
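The per-frame information described above can be sketched as a simple record. The following is a minimal illustration in Python; every field name and value here is a hypothetical example, not a prescribed format:

```python
# Sketch of one storyboard frame's text box as a simple record.
# All field names and values here are hypothetical examples.
frame = {
    "shot_number": 12,
    "location": "Kitchen - interior",
    "actors": ["Maya", "Delivery driver"],
    "props": ["grocery bag", "phone"],
    "shot_type": "medium close up",
    "special_effects": "none",
}

# A shot list is simply these records kept in story order.
shot_list = [frame]
shot_list.sort(key=lambda f: f["shot_number"])
print(shot_list[0]["location"])  # Kitchen - interior
```

Keeping the shot list sorted by shot number is what lets you maintain the sequential flow of the story as you add frames.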

Format for Commercial Scripts


1. Decide whether you want to adapt a story or write an original one.
2. Decide the structure of the screenplay. The classic structure includes three
acts: "setup," "conflict," and "resolution."
3. Remember that TV and films differ in script format. TV scripts need to include
commercial breaks.
4. Format your script in 12-point Courier (not Courier New or Prestige Pica). This fixed-pitch font gives you ten characters per horizontal inch and six
lines per vertical inch. Preface a scene with a heading that indicates if it is
internal or external, the place, and time of the day, in that order. All dialogue
is center-justified and begins with the character's name in capitals. Descriptions
of the character appear between parentheses on a line just under the
character's name. Dialogue, without quote marks, comes next.
5. Submit your script with a cover letter, one to two lines that describe the story,
and one to seven pages of synopsis (short summary of key points of the story).
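To illustrate the formatting rules above, here is a short hypothetical scene fragment; the indentation approximates the centered dialogue column of a typical screenplay page:

```text
EXT. CITY PARK - DAY

Rosa jogs along the path, out of breath.

                         ROSA
                      (panting)
          I can't believe I signed up
          for this.
```

Note the scene heading order (external, place, time of day), the character name in capitals, the parenthetical description beneath it, and the dialogue without quote marks.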

Persuasive Writing

Advertisements are a form of persuasive writing. Their primary aim is to persuade people to buy a product or accept an idea. Your script needs to be attention-grabbing and persuasive enough to influence your audience.

Here are some characteristics of an effective persuasive script:

1. It mentions the virtues of the product or idea.
2. It includes the unique selling point of the product or idea.
3. It addresses the needs of the audience.
4. It sounds interesting.
5. It opens with a question, a quote, or a story.

Components of a Persuasive Message


Gain your readers' attention

Grab the reader's attention with a startling announcement or a testimonial. Offer something valuable to your reader. For example, "Tired of spending hours in a crowded and noisy gym, lifting heavy weights? Choose a healthy lifestyle, close to nature—with power yoga!"

Make it interesting for your reader

Build interest by describing the product or idea's benefits to the reader. For example,
"Stay young! Power yoga will help you shed that excess fat you have been struggling to
get rid of. No machines, no crash diets, no supplements. Learn the ancient secret of
fitness!"

Reduce resistance

Now that your script has attracted the audience's interest, you can reduce buying
resistance through persuasive techniques. For example, you can include guarantees or
free samples, or you can build credibility by citing awards that your product has won or
other performance test results. For example, "We offer you the flexibility to exercise at
your own convenience, with guaranteed results. Try our free demo offer and enjoy the
bliss of yoga and meditation. Speak to our satisfied clients, who have benefitted from
our programs."

Motivate your audience to try your product

Finally, close by repeating the unique selling point. You can prompt the reader to act
immediately with benefits such as free gifts, special price offers, etc. For example, "Join
today to enjoy the benefits of yoga and meditation. Don't miss the free demo. This
offer lasts for a limited period. What’s more, you can get a discount of 25 percent if you
act soon!"

Selling Your Idea


A good idea is one that can influence your audience to invest in your product. Highlight
and creatively present your product's unique selling points to capture audience
attention.
A treatment helps you pitch an idea to a financier. It is a summary of a screenplay,
movie, or TV show. Treatments can be as long as 40 pages, but typically, a short
document (of about two to five pages) is ideal. A treatment's effectiveness depends on
two factors: your ability to sell your story, and your ability to convey the story in the
fewest words. Keep in mind that you can copyright the execution of your idea, but not the idea itself.
A treatment should include the following:

1. a working title
2. the writer's name and contact information
3. the Writers Guild of America (WGA) Registration number
4. a short logline
5. an introduction to the main characters
6. the "who," "what," "when," "why," and "where" details
7. Act 1, Act 2, and Act 3—set the scene, introduce conflict, and state the resolution
Writing a Treatment
1. Find a title: Choose a name that matches your story. Remember, the first
contact a producer has with your script is its title. Choose a title that gives a clear
idea of the genre of the story.
2. Write a logline: Most writers provide a two-line or three-line description of their
story called a logline. It includes the main character, the antagonist, and the
genre of the story. The logline is a great marketing tool to highlight your story.
3. Write a synopsis: A synopsis is an extension of the logline into a three-
act story. It covers the introduction, the conflict, and the resolution.
4. Write a treatment: A writer can develop the synopsis to create a treatment by
communicating the story structure and adding details to the script.

Drafting a Documentary
1. A documentary is the easiest script format for beginners. It's a recording of any
event: people and cultures, road safety, wildlife habitats, etc.
2. The documentary needs careful planning. The process is the same: idea,
treatment, script, storyboard, and shoot.
3. Careful planning matters even more for documentaries because they typically have low budgets.
4. An effective plan reduces expenditure on shoots and editing, as well as camera and crew fees.
5. A plan helps you to think ahead and stay organized when you shoot. For
example, if you are shooting a film about road safety, you will need to set up
your camera and crew on the road during rush hour.
6. You can use your smartphone to take photos and make a documentary
storyboard.
7. A documentary storyboard format will display images on the left and the script
on the right.
8. A checklist will enable you to keep track of your activities as you shoot your
documentary.
Shooting a Video
In this lesson, you'll demonstrate the best practices in operating and maintaining video
cameras and equipment. You'll also explain ways to frame and compose pictures and
describe various focusing techniques. Finally, you'll demonstrate healthy behaviors and
safety skills as well as efficient work practices.

Camera Operation and Maintenance


To shoot a video, you must be familiar with the various buttons and functions of a
video camera. These functions differ from model to model. Though the button
positions vary, there are a few basic functions common to all the models. Let's take a
look at some of the functions in detail.

Power and Record: The power switch turns the camera on and off. It can also switch the device to power-saving mode or standby mode. Some cameras have a
single switch or knob for functions such as power, standby, and playback. The power
buttons in some cameras have a "lock" feature so that you don't switch the camera on
by accident.
The record button is typically a small red button located conveniently so you can reach
it when you hold the camera. Some cameras have the button on top or front for easy
reach when you're using a tripod.
Brightness and Contrast: You can adjust brightness and contrast on your camera's electronic viewfinder (EVF). On the EVF screen, you can see how your shot will turn out before you record it. Keep in mind that EVF adjustments change only what you see in the viewfinder, not the recorded video, so you must calibrate the EVF to judge your shots accurately. Most professional cameras use black-and-white EVFs. To calibrate, switch your camera to color bars, then adjust the brightness and contrast on the EVF until you get a consistent grayscale from white to black. You should be able to see each bar distinctly. Finally, switch back to picture mode.
Focus and Sharpness: Before you start shooting a video, you need to find a subject or
a location for your focus. Now adjust your camera on your target. You can do this
automatically or manually. Auto-focus is good for beginners, because the camera sets
the lens automatically. In Auto-focus, however, the focus keeps changing depending on
the number of moving objects in the frame. To overcome this problem, focus
automatically, and then switch to the manual setting. The focus will remain in place
until you change it again. You can use EVFs for focusing. First, zoom in on the subject
completely. Next, adjust the focus until the picture is sharp. Finally, zoom out to the
frame you require. The picture will remain in focus. Keep in mind that a subject with a
lot of contrast makes it easier for you to focus.
Audio: Cameras have built-in microphones to record audio. However, sometimes you
may need an external microphone to suit your requirement for a particular shot. Most
cameras have an input for this purpose. Additionally, you need to determine audio
levels for each shot. Most consumer cameras have an automatic function that enables
you to set the audio levels. However, professionals prefer to do it manually to get
better accuracy.
You need to make sure that the background noise is consistent when you shoot, unless
your subject requires that you include variations. Also, minimize or get rid of
background music. Lastly, you should be wary of wind noise, because it can ruin the
audio completely. A built-in "low-cut filter" can solve this issue to some extent, but the
best remedy is to block the wind. You can even use a unidirectional microphone with a
wind filter when shooting in exceptionally windy conditions. You can direct the
microphone right at the sound source to eliminate noise coming from other
directions. Keep a set of professional headphones handy, and check the sound
regularly.
Additional EVF Features

You know how to use a camera. However, a video shoot requires much more
knowledge. For example, you learned about the various uses of electronic
viewfinders. Here are some rules to remember when you use them. Viewfinders can
fog up if your eye is too close to the eyepiece, or if you sweat a lot. If this happens, take
a step back. Also, drink fewer fluids to avoid sweating. Some cameras have viewfinders
that flip open. This is a convenient feature when many people need to look at the EVF,
or if you're using a tripod.

The messages that appear on the viewfinder are informative for troubleshooting. Read
the manual to know what the symbols on the EVF mean.

Role of a Professional Camera Operator

Professional camera operators multitask to avoid the need to hire additional personnel. Apart from being an expert camera operator, you need to be imaginative and creative to be a professional. Plus, you need to have a good eye for detail. You
and creative to be a professional. Plus, you need to have a good eye for detail. You
should be technically proficient, and analyze every aspect of a shot. You need to ensure
that your equipment is in good working condition at all times, and know how to
improvise when required. Moreover, you must work in harmony with other
departments to ensure a good shot. For example, if you want to figure out what type of
lighting works best for a shot, you will need help from the lighting department.

A TV or film camera operator's job is similar to that of any other camera operator. You need creative insight into how to achieve your goal, especially as it relates to film and television. You also need physical stamina, good hand-eye coordination, directing ability, and strong eyesight. You should be self-reliant and seek to reduce dependence on others. Even to shoot a live event, you may need only two cameras, a video switcher, and an audio mixer. In fact, you can preset shots
on Pan/Tilt/Zoom (PTZ) cameras, and operate multiple cameras at once to capture the
details of a particular event.
Safety Precautions

For a video shoot, you may often need to travel to different places. You may face
situations where you have to deal with dust, heat, and moisture. You must remember
to take good care of your equipment and protect it under varying conditions. Treat
your equipment carefully and follow some basic rules to keep it safe and in good
working condition. Here are some of these rules.

 Charge your camera with the AC charger that came with the camera, or one that
is meant for your camera's make and model. Using any other charger may spoil
your device, void its warranty, or worse, short-circuit the battery and start a
fire. The same applies for rechargeable batteries.
 A damaged cable may cause a fire. Inspect all cables (USB, adapter, and so on)
before you use them.
 If your device stops working, don't try to open it and fix it yourself. This will most
likely void your warranty and may even damage a component.
 If you are not going to use your camera for a week or more, remove the batteries and store them separately to avoid acid leakage. While you store batteries, do
not allow them to come in contact with each other, or any metal object. This may
cause the batteries to short.

 Make sure your device is meant for extreme conditions. This includes extreme
temperatures and exposure to water. Sudden changes in temperature,
especially in high humidity conditions, may cause condensation inside the
camera, and spoil it.
 Never turn off the camera when it is actively performing a function. This may
result in damage to the device, loss of data, or both.
 Avoid touching the lens or the LCD screen with your fingers. This may leave hard
to remove smudges and fingerprints.
You need to be careful with your camera. This means cleaning the lens and inspecting
it regularly to maintain high video quality. Here are a few basic steps.

 Use a soft brush to remove the dust from your lens. Hold the camera with the
lens facing the ground, and gently dislodge the dust with the brush.
 Next, use a microfiber cloth to wipe the lens clean. You can also use it to wipe
other parts of the device.
 You can use a lens cleaning fluid if the above two steps aren’t enough to get rid
of all the dust. However, always put the fluid on a piece of cloth first, rather than
directly on the lens. Too much fluid is bad for the lens.
 If you have no other option, you may dampen a soft piece of cloth or tissue
paper with water, and use it to clean the lens.
 There are many special lens filters available to guard against dust. They fit at the
front of the lens and prevent dust particles from entering. You can also use
them in wet conditions. These filters don’t affect picture quality or color in any
way.

You should be very careful when you handle or transport your camera. Avoid carrying a
tripod with the camera attached to it, because it may damage either or both the
devices. Always carry the tripod in a bag to prevent damage to its parts, and to avoid
hurting others.

Mount your camera on the plate at the head of the tripod. To fix the camera, tighten
the knobs on the tripod that are intended to keep your camera in place. Avoid over
tightening the knobs. Lastly, take good care of the tripod’s feet and make sure they are
securely placed when shooting.

Focusing and Composing a Frame


You are now familiar with a video camera and its use. You also know all about its
functions and accessories. You are now ready to start shooting. However, you need to
learn some basic techniques about focus and how to compose a frame.

The auto-focus feature, for example, uses smart electronics to enable the camera to determine the object of focus. It is fairly convenient for basic shooting, because the
camera does most of the work. The problem, however, with an auto-focus camera is
that it does not know exactly what to focus on. It's easy if there is only one subject in
the frame, but if it sees additional movement, it gets confused. This causes the focus to
fluctuate repeatedly and often produces irritating delays. For example, if you're
shooting a video of a dog and a cat together, the camera will focus on both
automatically. But if the dog moves, and you want to focus on it, you will need to use
the pan feature and get the dog in focus manually.

Focusing and Depth of Field

Professionals typically use manual focusing for their shots. It's an easy and useful skill
to acquire. Your video camera will either have a "focus ring" on its lens, or an "external
knob" to adjust focus. The trick in either case is to do it patiently. Ideally, you should set
the focus before pushing the "record" button for every shot.

To experiment with focusing, you must understand depth of field. This technique allows you to determine the level of focus between the nearest and farthest objects in your
shot. You can either have a shallow or an extended depth of field, depending on your
preference for a particular shot. A shallow depth of field keeps only the main object in
focus. An extended depth of field keeps the entire area, from the lens to the farthest
point, in sharp focus.
An image of cars stuck in traffic, shot with a shallow depth of field (top), and an image of moving cars on a highway, shot with an extended depth of field (bottom).

Depth of field depends on three elements: camera aperture, focal length of the lens,
and the distance between the lens and the subject.

The aperture is the diameter of the iris of the lens. It changes with the amount of light in your shot: if you shoot in bright light, the aperture diameter decreases, which increases depth of field. Similarly, depth of field has an inverse relationship with the
focal length of the lens. However, the depth of field increases with the increase in
distance between the object in focus and the lens.
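These three relationships can be checked numerically with the standard thin-lens depth-of-field approximations. The sketch below is illustrative only; the 50 mm lens, the f-numbers, the subject distances, and the 0.03 mm circle of confusion are all assumed values:

```python
def depth_of_field(focal_mm, f_number, distance_mm, coc_mm=0.03):
    """Return (near_mm, far_mm, total_mm) of acceptably sharp focus,
    using the standard hyperfocal-distance approximation."""
    hyperfocal = focal_mm ** 2 / (f_number * coc_mm) + focal_mm
    near = hyperfocal * distance_mm / (hyperfocal + distance_mm - focal_mm)
    far = hyperfocal * distance_mm / (hyperfocal - distance_mm - focal_mm)
    return near, far, far - near

# 1. Smaller aperture (higher f-number) -> greater depth of field.
_, _, dof_f2 = depth_of_field(50, 2, 3000)   # 50 mm lens, subject 3 m away
_, _, dof_f8 = depth_of_field(50, 8, 3000)

# 2. Longer focal length -> shallower depth of field (inverse relationship).
_, _, dof_100mm = depth_of_field(100, 8, 3000)

# 3. Greater subject distance -> greater depth of field.
_, _, dof_6m = depth_of_field(50, 8, 6000)

print(f"f/2 at 3 m:   {dof_f2 / 1000:.2f} m")    # shallow
print(f"f/8 at 3 m:   {dof_f8 / 1000:.2f} m")    # deeper
print(f"100 mm, f/8:  {dof_100mm / 1000:.2f} m")
print(f"f/8 at 6 m:   {dof_6m / 1000:.2f} m")
```

Running the sketch confirms the text: stopping down from f/2 to f/8 extends the zone of sharp focus, switching to a 100 mm lens shrinks it, and moving the subject farther away extends it again.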

"Follow focus" is another useful technique. It is difficult to keep a constantly moving subject in focus. For this technique, you need an adequate depth of field. If it's
too shallow, your camera will have a tough time keeping the subject in focus. But, if you
increase the distance between the subject and the lens too much, the subject will
become part of the background. With a large depth of field, the subject will stay in
focus. Even if it strays, it will start to blur, and you will need only a minor adjustment to
fix the problem.

Creating a Shot

Even if you have all the technical details in place, you need to pay full attention to every
shot. You need to compose it with care, using the process of framing.
Framing a shot requires perception and an idea of what you want to convey to the
audience. For example, if you want a shot of people entering a room, you need to tell
them how fast you want them to walk, based on how long you want your shot to be.
Framing is a subjective art. One videographer may find a composition fascinating, while
another may find it boring. However, there are some basic tips that you can use to
create good results when you shoot.

Framing defines how you place a subject in a shot with respect to the background. It
helps you to clearly distinguish between the subject and the background. Most EVFs
have a mode to divide the screen into nine equal squares. You can place the subject
where any two of the lines intersect. For an ideal shot, you can place the subject at a
point that is one-third or two-thirds the way up or across the frame.
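As a minimal sketch, the four intersection points of that nine-square grid can be computed for a frame; the 1920 × 1080 size below is just an assumed HD example:

```python
# Compute the rule-of-thirds grid intersections for a frame.
# The 1920x1080 frame size is an assumed example (HD video).

def thirds_points(width, height):
    """Return the four points where the nine-square grid lines cross."""
    xs = (width // 3, 2 * width // 3)
    ys = (height // 3, 2 * height // 3)
    return [(x, y) for x in xs for y in ys]

points = thirds_points(1920, 1080)
print(points)  # [(640, 360), (640, 720), (1280, 360), (1280, 720)]
```

Placing the subject at any of these four points puts it one-third or two-thirds of the way up and across the frame, as the guideline describes.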
Another basic rule is to avoid tilting your frame, unless you wish to do it for
effect. Ensure all the horizontals are level and the verticals are straight up and down in your frame—the horizon, buildings, lamp posts, benches, etc.
Leave some room around the subject when you compose a shot. The space between
the subject's head and the top of the frame is called "head room." The space in front of
a moving subject is called "leading room." The space toward which the subject is looking is called "looking room." You should leave enough space for all three, or the shot won't look right. However, leaving too much space will make the shot look empty.

Pay close attention to other elements in your frame as well. Everything is important to highlight your subject and to enhance the desired effect. The background must be apt, and the lighting right. You also need to make sure that nothing in the frame will distract
and the lighting right. You also need to make sure that nothing in the frame will distract
the audience from the subject, or disrupt the continuity of the shot.

Always lock the tripod head after framing a shot.

Types of Shots

Photogenic subjects and beautiful locations aren't enough to compose ideal shots. You
need to understand the differences between some standard types of shots. There are
no strict parameters or dimensions for a perfect shot. However, the standard types are useful references for beginners. Here are some typical types of shots.

Extreme Wide Shot (EWS): The subject is not visible in the EWS, because it is taken
from a distance. This type of shot emphasizes the subject's surroundings rather than
the subject itself. The first shot of a new scene is usually an EWS.
Very Wide Shot (VWS): The VWS is similar to the EWS. The only difference is that the
subject may be visible in the VWS.

Wide Shot (WS): In the WS, the subject is in the frame, but not up close. This shot
enables the audience to view the subject in its entirety. For a person or an animal, it's
a full-body shot.
Mid Shot (MS): In the mid shot, the audience should feel that they are viewing the subject from a normal, conversational distance. For example, when you chat with people, you typically don't pay attention to their lower body. Therefore, that part is not important in this shot.
Medium Close Up (MCU): The MCU shot is used to show the subject more clearly, but
is not a total close up.
Close Up (CU): The CU shot concentrates on some part or feature of the subject. Films
typically use close up shots of actors' faces to highlight expressions or to capture
intense feelings.
Extreme Close Up (ECU): The ECU shot shows very high detail. For example, a single tear on an actor's cheek is an extreme close up. This shot is a useful way to portray emotion.
Cutaway (CA): Cutaway shots act as fillers between important or intense shots. You
can add them to help the editing process and to provide extra information or interest.
Cinematography and Composition
In this lesson, you’ll describe various cinematography techniques. You’ll demonstrate
ways to frame and maintain picture composition and focusing techniques. You’ll also
describe the importance of setting camera positions in a storyboard. Finally, you'll
define time code and describe how frame and time code relate to the Society of Motion
Picture and Television Engineers (SMPTE).
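A time code labels every frame as hours:minutes:seconds:frames. The following is a simple sketch of that mapping, assuming an even 30 fps with non-drop-frame counting; broadcast NTSC video actually runs at 29.97 fps and uses SMPTE drop-frame counting to stay aligned with the clock:

```python
# Sketch of a non-drop-frame SMPTE-style time code (HH:MM:SS:FF).
# Assumes an even 30 fps rate for simplicity.

def frames_to_timecode(frame, fps=30):
    """Convert an absolute frame count to an HH:MM:SS:FF label."""
    ff = frame % fps                  # frames within the current second
    total_seconds = frame // fps
    ss = total_seconds % 60
    mm = (total_seconds // 60) % 60
    hh = total_seconds // 3600
    return f"{hh:02d}:{mm:02d}:{ss:02d}:{ff:02d}"

print(frames_to_timecode(0))     # 00:00:00:00
print(frames_to_timecode(1800))  # 00:01:00:00 (one minute at 30 fps)
```

Because every frame gets a unique label, editors can refer to any instant of footage unambiguously.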

Techniques of Cinematography
Cinematography is the art of making a movie that stirs the emotions of
viewers. Cinematographers, or the directors of photography (DP), manage the process
of cinematography. They decide elements such as camera placement and
lighting. These elements help to portray a character’s actions and
emotions. Cinematographers also choose camera angles and set up the lighting in a
shot to eliminate shadows. These tasks require patience and an eye for detail. Plus,
these tasks take a lot of time. These considerations make cinematography one of the
most challenging areas in filmmaking.

The cinematographer’s team is typically the largest on any set. Gaffers and chief
electricians are an important part of the team. They are responsible for setting up
lights and operating dollies.

Framing is one of the most important elements in cinematography. Framing is the placement of the main subject in relation to the other objects on the screen. As a cinematographer, you typically focus on the subject’s eyes. However, sometimes the
script calls for the focus to be on something else. Eyes are a point of focus because
they communicate more effectively than words or gestures. Whatever the main subject
in a scene, you can use the same technique to bring it into focus. You first zoom in,
focus, lock focus, zoom out, and then compose the shot.

The composition of a frame is another important task in cinematography. The word composition means “bringing together.” In cinematography, composition refers to
the frame of the image. The term also refers to how you arrange the elements in the
frame. When you frame a composition, you shoot a scene from several angles. You
want to draw the viewer’s attention to a certain point in the frame. Composition is so
important because it lets you create a relationship between all the elements in the
frame, including the subject.

Types of Angles in Cinematography

High Angle

High angle refers to placing the camera above the main subject. The camera looks
downward at the main subject, whose face is usually looking down, too. This angle
makes the subject appear scared, troubled, or weak.

Low Angle

In a low angle, you place the camera below the eye level of the subject. The subject is
usually looking upward in this type of shot. This angle makes the subject look
dominating and aggressive.

Eye Level

In an eye level shot, you place the camera at the subject’s height. Filmmakers use this angle most often. The images are neutral and do not have any major dramatic effect.

Dutch Tilt

In the Dutch tilt or the canted angle, the camera slants sideways to create a more
dynamic composition. You can use this angle to add dramatic effects. This angle also
enables you to portray unstable states of mind such as disorientation and intoxication.

Other Shots

In addition to these shots, cinematographers use several other shots to achieve various
effects. An establishing shot creates a context or setting for a scene by showing the link
between its important figures and objects. You typically use this shot to show a scene’s
geographical location.
A master shot captures all the actors in a scene. It runs from the start to the end of the
scene. A two shot shows the view of two subjects in a frame. Likewise, a three shot has
three subjects in a frame. A reverse angle is a shot captured from the opposite direction
of the previous shot. A point-of-view (POV) shot directs the viewer’s attention to
something that the subject is looking at. 
Combination of Angles

In cinematography, you can use multiple camera angles to achieve the best shot for a
scene. For instance, a tilt shot combines an eye level shot with a high or low angle
frame. If you want to bring attention to the height of a massive structure, you can use a
tilt shot with a low angle. The combination of these two techniques makes the shot
look interesting.
You can use additional equipment such as a tripod, a dolly, and cranes. Their use can
enhance basic shots in some scenes. For example, sports events are challenging to
shoot because the players are constantly moving. You have to move along with them
without losing focus. In such shoots, you can use a spidercam to get top angles and
move both vertically and horizontally over a specific area. You can use the crane shot to
shoot an event that has a large crowd. In a crane shot, you typically sit up high on a
crane, along with the camera. The high position enables you to move both vertically
and horizontally while shooting from a higher angle. In this way, you can compose a
wider shot for a scene.

Use of Lighting in Cinematography

Every scene demands unique lighting. You can add different moods and effects to a
scene by changing the light exposure. For instance, if you shoot a scene in dim light,
you can create a sense of fear and suspense. Typically, dim lighting is used for a horror
movie or in a scene involving mystery. In cinematography, exposure refers to the
amount of light that the camera captures. The level of exposure in a shot determines
how dark or bright the scene will appear. You can adjust the camera settings to control
exposure. Exposure can be of three types: normal exposure, overexposure, and
underexposure.

Normal exposure refers to the regular exposure of light, without adjusting for extra brightness or dimness. Overexposure means letting too much light into the camera; it makes the image appear brighter than it would under normal exposure. Underexposure means letting too little light into the camera, so the image appears dark and hides detail from the viewer. When you shoot outdoors, you cannot control natural light. In such scenes, you need to either adjust your camera settings or change the location of the shot.

Factors Influencing Light Exposure

Now, let’s look at the factors that regulate light exposure.


Light source: The sun is the brightest source of natural light. If an image looks normal in sunlight, it may appear underexposed indoors under incandescent lightbulbs. With lighting adjustments, you can make an image appear either bright or dim.
Aperture: The aperture is the adjustable opening in the lens that admits or blocks light, controlling the amount of light exposure. If you open the aperture wider, the light exposure will be greater and the resulting image brighter. If you close it down, the light exposure will be less and the image darker.
Shutter: A shutter is the plate between the lens and the camera’s recording
surface. Shutter speed is the length of time the camera shutter remains open when
you capture an image. It determines the amount of light exposure for each frame. You
can adjust the speed at which this plate rotates and decide the exposure time for each
frame.
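The combined effect of aperture and shutter can be summarized with the standard exposure value formula, EV = log2(N²/t), where N is the f-number and t the shutter time in seconds. The numbers below are illustrative assumptions, not settings from any particular camera:

```python
import math

def exposure_value(f_number, shutter_seconds):
    """Standard exposure value: EV = log2(N^2 / t)."""
    return math.log2(f_number ** 2 / shutter_seconds)

# Roughly halving the shutter time (1/60 s -> 1/125 s) raises EV by
# about one stop, i.e., the sensor receives about half as much light.
ev_fast = exposure_value(8, 1 / 125)
ev_slow = exposure_value(8, 1 / 60)
print(f"EV at f/8, 1/125 s: {ev_fast:.1f}")
print(f"EV at f/8, 1/60 s:  {ev_slow:.1f}")

# Opening the aperture one stop (f/8 -> f/5.6) roughly compensates for
# the faster shutter, keeping overall exposure about the same.
ev_compensated = exposure_value(5.6, 1 / 125)
print(f"EV at f/5.6, 1/125 s: {ev_compensated:.1f}")
```

This trade-off between aperture and shutter speed is what lets you hold exposure constant while choosing the depth of field or motion blur a shot needs.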

Shooting Techniques
Over-the-Shoulder Shot

As the name suggests, you take this shot from over an actor’s shoulder, which is in the
foreground. This shot lets you show the subject’s point of view. It is effective because it
helps create a sense of intimacy in a scene.

Tilt Shot

You take a tilt shot by freeing the tilt lock on the tripod head, which allows the camera to pivot vertically up and down. This shot lets you create a movement that mimics the nodding of a person’s head. Cinematographers frequently use tilt shots to create a dramatic effect.

Panning Shot

Panning shots are similar to tilt shots. Tilt shots move up and down, whereas panning
shots cover horizontal left and right movements. You can use this technique to track
the horizontal movement of an object.

Zoom Shot

You adjust the focal length of the camera to take a zoom shot. To zoom in or out, you
do not need to move the camera. With this technique, you can make an object look
bigger or smaller.
Tracking Shot

A tracking shot requires a camera mounted on a dolly that moves along specially
positioned tracks. You can use this technique to shoot a scene where a subject moves
within a frame.

Crane Shot

Filmmakers place the camera on a crane for a crane shot. This technique
enables you to view actors or objects from above.

Track in shot

The track-in shot allows you to bring an object into focus as you move the camera
toward it for a medium or a tight close-up. For example, imagine a scene where a
stranger is standing at the door. However, the subject is unable to see the stranger
clearly. The subject now puts on a pair of glasses to look closely at the stranger. You
can bring the stranger into focus by slowly moving the camera in on the
subject. Another type of track-in shot consists of a foreground object. The object is
closer to the camera. So, its size increases as the camera moves closer. This technique
gives the subject a three-dimensional illusion. A variant of the track-in shot is the over-
the-shoulder shot, in which the camera moves in over the main subject’s shoulder.

Combination of focal lengths

Many cinematographers use the same lens for taking two complementary shots. That
is, if they use a 50mm lens to frame a shot, then they use a 50mm lens to shoot its
reverse shot. However, you can choose different lenses in combinations, especially if
one character has to appear more dominant than the other character in a scene.

Match Cut

Match cut, or graphic match, is a cut in editing between two elements. These elements
include shots, objects, or even spaces. These cuts help emphasize continuity in space
or time. This type of cut aids in continuity editing. It maintains a common element
between two shots. It further establishes continuity of action.

Rules of Video Composition


Several elements work together in the construction of a shot. In cinema, the how is as
important as the what. Everything—from the balance of lights, to space in focus, to the
use of empty space in a frame—makes a difference to a shot.
For any image, its edges form the frame. The frame plays a crucial part in including and
excluding important and unimportant elements in the shot. The aspect ratio is a ratio of
the horizontal to the vertical sides of an image. Until the 1950s, cinematographers
mostly opted for the 4:3 or 1.33:1 aspect ratio. However, filmmakers now are more
willing to experiment. They try shots with various types of aspect ratios.
When widescreen films are telecast or released on video, the format must be brought
down to a ratio that fits the TV screen. However, this alteration can make the objects in
the scene look crammed. In computer graphics, the aspect ratio refers to the shape of
a single pixel in a digitized image. Digital imaging systems generally use square
pixels. That is, the image is of the same resolution horizontally and vertically. However,
not all devices accommodate such resolution. A 2:1 aspect ratio describes a digital
image in which the horizontal resolution is twice its vertical resolution.
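
The ratio arithmetic described above can be sketched as a small calculation. The following function (an illustration, not from the course material; the pixel dimensions are hypothetical) shows how a widescreen frame fits inside a narrower screen, with the leftover area becoming black bars:

```python
def letterbox(src_w: int, src_h: int, dst_w: int, dst_h: int):
    """Fit a source frame inside a destination frame while
    preserving its aspect ratio; the leftover area becomes bars."""
    scale = min(dst_w / src_w, dst_h / src_h)
    new_w, new_h = round(src_w * scale), round(src_h * scale)
    bar_height = (dst_h - new_h) // 2  # top and bottom bars
    return new_w, new_h, bar_height

# A 2.39:1 widescreen frame (hypothetical 2048x858) shown on a
# 4:3 screen (640x480) keeps its shape but gains black bars:
print(letterbox(2048, 858, 640, 480))  # -> (640, 268, 106)
```

Letterboxing preserves the composition; the alternative, stretching the picture to fill the screen, is what makes objects look crammed.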

Framing Techniques

Framing techniques are subjective. You may think a shot is dramatic, but another
person may think the same shot is boring. However, cinematographers follow
guidelines to ensure that every shot is meaningful and creative. Now let’s look at some
important framing techniques:

Horizontal and vertical lines: When you frame a shot, check for horizontal and
vertical lines in the frame. You may use the horizon or poles in the shot as
references. The horizontal lines must be level and the verticals must be straight up and
down, unless you want the shot deliberately tilted at an angle.
Headroom and nose room: You should leave adequate nose room in front of the
subject and adequate headroom above it. Headroom is the amount of space
between the top of the subject's head and the frame. An image with no headroom
usually looks awkward. Nose room is the space between the edge of the frame and the
subject’s face seen in profile.
Rule of thirds: The rule of thirds refers to the practice of aligning the subject to some
guidelines and their intersecting points. According to this rule, you should divide an
image into nine equal sections, with two horizontal and two vertical lines.
The main point of interest or importance must appear at one-third or two-thirds of the
way up (or across) the frame, instead of at the center. Look at the adjoining
image. Note how the subject (the stone figure), seems to dominate the image when it is
at the side of the frame, rather than at the center.
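
The rule of thirds reduces to a simple calculation. This sketch (illustrative only, not from the course material) returns the four grid intersections where the main point of interest might be placed:

```python
def thirds_points(width: float, height: float):
    """Return the four intersections of the rule-of-thirds grid,
    formed by two horizontal and two vertical dividing lines."""
    xs = (width / 3, 2 * width / 3)
    ys = (height / 3, 2 * height / 3)
    return [(x, y) for y in ys for x in xs]

# For a 1920x1080 frame, the strong placement points are:
print(thirds_points(1920, 1080))
# -> [(640.0, 360.0), (1280.0, 360.0), (640.0, 720.0), (1280.0, 720.0)]
```

Placing the subject near any of these points, rather than at the center, usually yields the stronger composition the text describes.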

Negative spacing: Negative space (also known as white space) benefits an image, even
though it sounds like something you should avoid. Negative space is the space
surrounding the main subject. This space can help define the subject more clearly for
stronger emphasis. When used appropriately, negative space helps to create a balance
against the positive space in a scene.
Let’s look at an example of the effective use of negative space. In the picture, you can
see that the main subject (the boat) is almost at the bottom of the image frame. The rest
of the image is empty. The bright blue color used in this image gives you a feeling of
openness or solitude. It also creates a feeling of uncertainty. You do not know how big
the expanse of the water is, or where it may end.

Another technique for highlighting the subject is to create a frame within a frame. This
frame can be anything that enables you to emphasize the subject of the image. It might
be a doorframe or a tree branch.

Line of action: When you shoot any video, you need to stick to a line of action. This
line helps to maintain the consistency of direction and space. The line of action is an
imaginary line connecting the relative positions of two subjects in a frame. Without a
line of action, the audience becomes confused about the subjects’ directions.
For example, look at the adjoining image. The camera sits beside the two main
subjects. The director of photography (DOP) has created an imaginary line between the
two characters. This line acts as a reference line for the cameraperson. It also ensures
that there is consistency in the screen space and direction. However, the camera must
avoid crossing this line. Otherwise, it will reverse the spatial relations of the shot. Some
directors use reverse shots known as shot reverse shot  or counter shots. In these shots,
the characters face each other. Viewers assume that they are looking at each other,
because the characters are looking in opposite directions. Pickup shots and cutaways
are two other techniques that you can use to create inserts. Inserts are shots or scenes
recorded after the completion of the primary shoot. You can pre-plan pickup shots and
schedule them the same as any other production activity. After shooting the main
scenes (with the major actors), you may use the next day for shooting pickup shots that
do not include the characters. These shots could be close-up shots of props or any
other filler shots.

Shot List in Storyboards

A storyboard is the visual representation of a story or a script of a film. It is similar to a
comic book, with images relating to the story. A storyboard lets you develop a basic
idea of the story’s plot. You can use it to collect feedback and make any changes before
you go ahead with a shoot or an animation. This practice of pre-visualizing a story
helps save time and money. Otherwise, you may need to reshoot many scenes. Every
storyboard conveys a lot of useful information. Here are a few elements that go into a
storyboard:
 characters in the frame and their movements
 conversation (if any) between the characters in the frame
 time difference between two consecutive shots
 camera positions in the scenes
The storyboard must specify the exact placement of the camera. It must indicate the
camera’s distance from the subject. It indicates whether the subject is still or in
motion. However, you do not need a storyboard for every shot in a film. You may
choose to include the essential shots in the storyboard, and just add a description for
others.

A detailed storyboard also contains written content, along with the image. It describes
every shot in detail right from the type of angle to the actor’s reaction. If the storyboard
calls for the use of multiple angles, then you must plan to cover them through a variety
of shots. Here are some tips to create a storyboard:

 Avoid flat staging—use a 3-D perspective. A flat image is not useful or interesting
in a storyboard.
 Always add the elements that occupy the foreground and background of a shot.
 Add ground grids to identify the position of the subject as well as the camera. It
becomes difficult to locate the exact position of a subject without the ground
grids.
 Drawing a conversation between more than two people is difficult. In such a
scene, create groups of people to help make the cut easier.
 For adding more depth to the image, create an offset background instead of
placing objects parallel to the frame.

Every camera lens offers a specific type of range. A wide-angle lens has a focal length of
less than 35mm. It gives you an angle of view greater than 55 degrees across your
photo's widest dimension.
You can identify and distinguish lenses by their measurements. For example, a wide-
angle lens might be about 12mm. A normal lens is between 35mm and 50mm. A telephoto
lens is between 100mm and 200mm.
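
These focal-length ranges can be captured in a simple classifier. The exact boundaries vary by convention; the thresholds below (35mm and 70mm) are one common full-frame choice used for illustration, not a standard:

```python
def lens_category(focal_length_mm: float) -> str:
    """Rough lens category by focal length (full-frame terms).
    Boundaries are conventional and vary between sources."""
    if focal_length_mm < 35:
        return "wide-angle"
    if focal_length_mm <= 70:
        return "normal"
    return "telephoto"

print(lens_category(12))   # wide-angle
print(lens_category(50))   # normal
print(lens_category(135))  # telephoto
```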
When you shoot a 2-D image, you can push the perspective by adjusting the position of
vanishing points. A vanishing point is a point in an image where parallel lines seem to
intersect. You can use it to show depth and distance. It can also add some perspective
to an image.
Basic Shots for Beginners

Typically, you can shoot a video with the medium close-up, extreme close-up, and the
extreme wide shot. However, if you’re a beginner, you can learn video composition and
sequencing with the hand shot, face shot, and a combination of both shots. Let’s look
at these basic shots and learn how to create a sequence in a video shoot.

Hand shot: For your first video project, make sure that the subject does some hand
movements. Hand movements are easy to shoot and make a scene intriguing.
Face shot: The next shot can be a shot of the face of the subject whose hand first
appears on the screen. The face shot thereby answers the question that arises in the
viewers’ minds when they see the hand shot. (Whose hand is it?)
Hands and face combo shot: The above two shots create another question in the
mind of the viewer—about the activity that’s implied in the two shots. To answer that
question, you need to add a wide shot. This shot gives the viewer an idea of the activity
going on.
Over-the-shoulder shot: This shot enables you to show the subject’s point of
view. You can emphasize the subject’s point of view by including the character's
shoulder or the side of the head in the shot.
Any different shot: Once you’ve captured the above shots, try an entirely different
shot from a different angle. Be creative and try to think up a new way to take the next
shot.

Audio Visual Time Codes


The Society of Motion Picture and Television Engineers (SMPTE) is an international
group based in the United States. The group works to develop the profession and
support the advancement of technology in motion pictures and television engineering.

The SMPTE is a key governing body that implements standards, practices, and
guidelines for the motion pictures and television industry. It supervises the audio
involved with motion pictures. It publishes a monthly journal called the SMPTE
Journal. The journal contains information related to networking opportunities for its
members, academic conferences, and exhibitions.
SMPTE has developed a standard called the time code. The time code is a type of
electronic signal. It enables audio and video professionals to identify a specific location
in digital systems and on time-based media such as audio or video tapes. People working
in audio and video production use the time code for synchronization.
Time Codes

Audio-video editors across the world use several types of time codes. The two main
differences between the time codes are the reference frequency and the frame
count. The reference frequency is the frequency of the alternating-current mains supply:
60Hz in North America, South America, and Japan, and 50Hz in most of Europe, Asia,
and Africa. The frame count is the number of frames in a
single second of time code.
A video is made up of many frames. Each frame has a specific reference in time and its
own unique identifying number. The time code is a name stamp for every frame that is
present on the tape.

When you record audio and video using multiple devices, synchronizing the audio and
video becomes difficult. While transferring the audio to tape, the digital video (DV)
format compresses the audio. Therefore, audio technicians usually record the audio in
a separate device that offers better quality. Synchronization is essential in movie
theaters, where the audio is stored on a CD and video recordings on a film reel.

You can use time codes to synchronize audio to video. The process involves
writing time codes on the video camcorder and on the audio recording equipment. The
editor then looks for a point of reference in the video to match with the audio. The
point of reference could be any sound. It might be someone clapping or coughing or
any other distinct audible tone. The clapboard used before shooting every scene is a
reference point to synchronize the audio to the video.

The time code is a video synchronizing standard. It follows a 24-hour clock readout as
hh:mm:ss:ff. For example, you would read a time code of 01:20:40:09 as “1 hour, 20
minutes, 40 seconds, and 9 frames.” This readout can either be time of day time
code or relative time code. Time of day time code means that the material was recorded
at 20 past one in the morning. Relative time code means the recording occurred one
hour and 20 minutes into the tape. The maximum number that can occur in the frames
readout depends on the video standard that you use.
The National Television System Committee (NTSC) standard, followed in North America
and Japan, runs at 30 frames per second, so the frames counter may read any number
between 0 and 29. Phase alternating line (PAL), followed in much of Europe and Asia,
runs at 25 frames per second, so its counter reads between 0 and 24. Film runs at 24
pictures per second, so a film time code counter reads between 0 and 23. France and
some other regions follow sequential color with memory (SECAM). Footage shot in one
standard must be converted before you can use it in a region that follows another; for
example, footage shot in Europe must be converted before broadcast in the United
States, and vice versa. There are two types of time
codes: LTC and VITC. Linear or longitudinal time code (LTC) refers to encoding SMPTE
time code data in an audio signal. You record this series of square waves on an audio
track of the master, or the main device. Vertical interval time code (VITC) refers to the
data that is encoded into the blank vertical interval between video frames.
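
The hh:mm:ss:ff arithmetic above can be sketched in Python. This example assumes simple non-drop-frame counting (real NTSC actually runs at 29.97 fps and often uses drop-frame time code, which this sketch ignores):

```python
def timecode_to_frames(tc: str, fps: int) -> int:
    """Convert an hh:mm:ss:ff time code to an absolute frame count."""
    hh, mm, ss, ff = (int(part) for part in tc.split(":"))
    if ff >= fps:
        raise ValueError("frame field exceeds the frame rate")
    return ((hh * 60 + mm) * 60 + ss) * fps + ff

def frames_to_timecode(frames: int, fps: int) -> str:
    """Convert an absolute frame count back to hh:mm:ss:ff."""
    ss, ff = divmod(frames, fps)
    mm, ss = divmod(ss, 60)
    hh, mm = divmod(mm, 60)
    return f"{hh:02d}:{mm:02d}:{ss:02d}:{ff:02d}"

# "1 hour, 20 minutes, 40 seconds, and 9 frames" at 30 fps:
total = timecode_to_frames("01:20:40:09", 30)
print(total)                          # -> 145209
print(frames_to_timecode(total, 30))  # -> 01:20:40:09
```

Converting time codes to absolute frame counts like this is what lets an editor line up audio and video recorded on separate devices.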
Audio Techniques and Sound Mixing
In this lesson, you’ll identify and use appropriate microphones, identify optimal
microphone placement, and establish appropriate recording conditions for different
production needs. You'll use various audio techniques. You'll describe basic audio
levels and mixing techniques. You'll explain ways to use a variety of audio
elements. You'll also explain how music can create an emotional impact. Finally, you'll
describe various audio-mixing techniques.

Sound Production Using Microphones


Sound production of any kind is incomplete without the use of a suitable
microphone. Microphones are extremely helpful in recording appropriate sounds. For
example, while conducting an interview or recording a song, you can use a specialized
microphone to eliminate noise and capture clean sound. However, you cannot
use one microphone for all purposes. The choice of microphone depends on its
primary use.

Dynamic microphone: You can use a dynamic microphone for most purposes. It's
inexpensive and sturdy, and it doesn't require a battery or any external power. It
handles a wide range of sound pressure levels. Dynamic microphones can record
a musical concert, an interview, or any live action sequence for a film. However, they
are not as sensitive as condenser microphones.
Condenser microphone: This type of microphone is very sensitive and produces high-
quality sound. Its capsule is a capacitor that requires a polarizing voltage, so it needs
external (phantom) power. In addition, its output is louder than the output of a dynamic
microphone.

Microphone Placement

Getting the right microphone is not enough. You also need to learn to place it
appropriately and make sure that you get the sound output that you want.

Microphones differ in their directionality, that is, in the directions from which they pick
up sound. Omnidirectional microphones have equal sensitivity to sounds from every
angle. Unidirectional microphones catch sound from only one direction. Bidirectional
microphones catch sound from both the front and back of the microphone. You should
place microphones in a recording studio with this directionality in mind.

Suppose you have to record a school choir performance. What kind of sound recording
setup will you choose? For studio recording, you can set up the microphones at three
points. The first rule is to place the first microphone with the lead vocal, the second
with the backup vocals, and the third with the chorus vocals. A lead vocalist can use a
handheld microphone and keep it very close to the mouth. The backup vocalists can
use one microphone per singer. The choral vocalists can use only one
microphone. Keep this microphone two to three feet away from them and place it in
the middle of their row. For large vocal groups, you can use more than one directional
microphone.

Other placement rules: Let's look at some other rules of microphone placement using
the same choir-recording scenario. While recording, you can place the axis of the
microphone in front of the vocalist's mouth, maintaining a six- to twelve-inch
space. This minimizes external sound and noise. In addition, if you intend to
bring more clarity to the recording, you can place the microphone slightly off-axis to
remove sounds that breath blasts cause. This technique eliminates additional sounds
of consonants, such as "p," "b," "t," and "d," and is, therefore, the best option for a
single vocalist.
If you have to record the voices of more than one person, you can follow the 3-to-
1 rule: place each microphone about one foot from its vocalist's mouth, and keep any
two microphones at least three times that distance (three feet) apart from each other.
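
The 3-to-1 rule reduces to a simple distance check. This sketch (illustrative, not from the course material) tests whether a two-microphone setup satisfies the rule:

```python
def satisfies_three_to_one(source_to_mic: float, mic_to_mic: float) -> bool:
    """3-to-1 rule: the distance between two microphones should be
    at least three times each mic's distance to its own source."""
    return mic_to_mic >= 3 * source_to_mic

# One foot from mouth to mic, three feet between the two mics:
print(satisfies_three_to_one(1.0, 3.0))  # -> True
print(satisfies_three_to_one(1.0, 2.0))  # -> False
```

Keeping the ratio at 3-to-1 or greater reduces the phase cancellation that occurs when one voice bleeds into the other vocalist's microphone.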
You can reduce noise by using a pop filter in front of the microphone and placing the
microphone away from the vocalist. Good-quality sound also depends on the
positioning of the microphone. If you place a condenser microphone too far from the
subject, it might not record quality sound. Similarly, if you place a dynamic microphone
too close to the subject, it will capture breath noises and affect the recording.

After you place the microphones, the actual sound recording process begins. Here are
some tips to produce high-quality audio.
Maintain a constant recording level: The recording level determines the level of
volume of the sound that you can record. The volume must be constant for all
recordings of the same kind. For example, if you record a dialogue for a film, a
conversation between two people must be at the same level.
Adjust levels of volume: You also need to adjust a recording that has two levels of
volume. Bring down the volume of loud sounds, and increase the volume of low-
level sounds. This process equalizes the volume of the rest of the recording.
Outdoor noise: When you record outside a studio, wind and location noise may cause
disruption. These ambient noises occur naturally, and you can't control them. So, before
your shoot, note the weather conditions. If there is a thunderstorm, use a condenser
microphone. For pleasant weather, use a dynamic microphone.
Indoor noise: While recording indoors, you can record some blank audio. This blank
audio fills in gaps between dialogues and adds ambient sound.

Elements of Audio
The purpose of sound design in any audio-visual medium is to reinforce onscreen
emotions effectively. You can accomplish this by manipulating sound to emphasize or
dramatize the setting, place, action, character, and twists in the plot. Audio has four
basic elements: dialogues or narration, sound effects, music, and silence. You need to
bring them together to complete a video production.
The production team analyzes the dialogues, music, ambient noise, and sound effects
that they want to use for the video. Usually, they work on sound design alongside video
production. You can prerecord some elements. For example, you can record the
dialogues or the sound track (music accompanying shots) before the shoot begins. You
can also add sound effects during the postproduction stage. You can record other
sounds, including background sounds such as traffic noises, chirping birds, or market
noises, on the spot and add them to the video clips during postproduction. Similarly,
you can record ambience sounds in a loop. These sounds don’t distract the audience's
attention from an ongoing scene.

Let's examine the functions and roles of audio elements in the audio-visual medium.

Dialogues or narration: Dialogues or spoken words are an important audio
element. They add subtext to scenes and authenticate the speaker as a real
person. Narratives also use personal pronouns to lend perspective to a scene. Let's
look at a few popular types of narratives.
 The first-person  narrative is a technique of storytelling that uses the first-
person point of view. In this narrative style, you can use the pronoun I.
 In an interior monologue, narrators narrate their inner thoughts. This could be
from either the first-person or third-person point of view.
 In subjective narration, narrators are the central characters but they are aware of
the feelings of other characters.
 The detached autobiography is a technique of storytelling where narrators
objectively recount their past. They are detached from their own stories, and so
they deliver the story in the third-person point of view.
 In a memoir or observer narration, narrators are observers or eyewitnesses who
passively participate in the story. They observe events from the third-
person point of view.

Sound effects: After the production team records the narrative, they begin recording
sound effects to set the mood for a scene. These noises provide a realistic continuation
of scenes and cover up any choppy dialogue. You can add ambient noises for mood,
such as chirping birds, traffic, whistling wind, and crickets in the night. You can also add
sound effects to supplement the ambient sounds in the scene, such as a door slam, an
audience reaction, a baby's cry, a ringing telephone, or a ticking clock.
You can also use the room tone technique and record the silence (or blank) of an empty
room without any noise. This helps add background sounds to a scene shot in a room.
Sound technicians record most action sounds in the studio, but they capture ambient
noises on location. They then integrate these recordings seamlessly into different
scenes.

Music: Like sound effects, music is an important part of films. It adds emotional effects
to scenes and is sometimes more appropriate than dialogues. For example, in a tense
scene, it might be more effective to use tense music than to have an actor deliver a
dramatic dialogue. Most films hire music composers who use non-diegetic music to
create impact. They work with singers, musicians, and background scorers to create
music and background scores to create the desired effect.
The purpose of using sound effects and music is to add supra-reality, or a sense of
exaggerated reality, to scenes. This technique makes the audience notice the
characters' varying levels of emotion and behavior. These emotions may not
necessarily come across to the audience with just dialogues or visuals. Therefore,
music adds dimension to the plot of the film and brings a sense of continuity and
drama.
Silence: Silence is the final element of audio in films. It is a very dramatic element. At
key sections in a film, directors use silence to arrest the audience's attention. Silence
also highlights a change in story or an abrupt end to an action. Its use can also build up
anticipation or indicate an impending disaster.

Audio Techniques
For a sound technician, understanding basic audio levels is essential to producing good-
quality sound. Sound travels as pressure waves that alternate up and down around the
ambient air pressure. The rate at which the pressure varies determines the frequency of
the sound, and the size of the variation determines its pressure level, or basic audio
level. You can measure this level in decibels.
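
The decibel figure follows the standard sound-pressure-level formula, SPL = 20·log10(p/p0), with the reference pressure p0 = 20 micropascals (the approximate threshold of hearing). A sketch, not from the course material:

```python
import math

REFERENCE_PRESSURE = 20e-6  # 20 micropascals, threshold of hearing

def spl_db(pressure_pa: float) -> float:
    """Sound pressure level in decibels: 20 * log10(p / p0)."""
    return 20 * math.log10(pressure_pa / REFERENCE_PRESSURE)

print(round(spl_db(20e-6)))  # threshold of hearing -> 0 dB
print(round(spl_db(1.0)))    # 1 pascal -> about 94 dB
```

Because the scale is logarithmic, every tenfold increase in pressure adds 20 dB, which is why decibels cover the enormous range of audible sound levels compactly.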

The input level is the audio signal you receive at a particular voltage. You can divide
input into three primary levels: microphone level (low), line level (higher), and
loudspeaker level (very high). The output level, on the other hand, is the voltage at
which a device, such as a speaker or a cell phone, plays audio.

If the output level is excessive, the audio that you hear won’t be clear. Similarly, input
levels that are too high will clip the resulting output, no matter what playback level you
use. In this case, you can use a simple technique: simply reduce the level on the input,
or reduce the record level or the playback level.

Recording Attributes

Another way to check audio levels is to check the recording settings. When you set up a
file for recording audio, use the following settings:

 Set the sample rate to 44,100 Hz or 48,000 Hz.
 Set the bit depth to 24 bits.
 Set the channels to 2 for stereo.
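
One practical consequence of these settings is the data rate of the recording. This illustrative calculation (not from the course material) estimates the uncompressed PCM file size per minute:

```python
def audio_mb_per_minute(sample_rate: int, bit_depth: int, channels: int) -> float:
    """Uncompressed PCM data per minute of audio, in megabytes."""
    bytes_per_second = sample_rate * (bit_depth // 8) * channels
    return bytes_per_second * 60 / 1_000_000

# 48 kHz, 24-bit, stereo -- the settings listed above:
print(audio_mb_per_minute(48_000, 24, 2))  # -> 17.28 MB per minute
```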
Whereas the sound technician can monitor audio levels from the recorder, a boom
microphone operator can monitor from the mixer. Any good-sized mixer or console
will have a Gain, Trim, or Sens knob. Bring the white knob of the mixer to zero. Then,
use the Gain knob of the mixer channel to set levels for each channel. The mixer meter
will hit zero VU (which corresponds to -20dB on the digital recorder). After you set
these base levels, use the white knob to raise or lower individual channels, as needed.
While recording, keep in mind that distortion occurs when audio levels are too high and
the signal clips or loses clarity. In this case, lower the sound level to remove the
distortion, because you cannot rectify it in postproduction.

Sound Mixing

After you record your sound, you can move on to sound mixing. Let's look at the
elements of an audio mix in detail.

Balance: This element focuses on the arrangement of music, voice, and sound in an
audio recording. During the editing process, you can add elements of arranging. These
include clear voice, audible sound effects to match the visual, definitive musical sounds,
and compatible ambient sound.
You should avoid playing two sounds that occupy the same frequency range at the same
time. For example, a guitar and a lead vocal can mask each other. If there are clashing
sounds, you can rearrange or re-record the track, mute the offending sounds, lower the
level of the problem sound, or pan it in a different direction. This draws the listener's
attention to the music or dialogue. Balance, therefore, creates separation between the
various sounds that play simultaneously.
Frequency range: The next element is the frequency, or rate of repetition, of sound
waves. You can measure frequency in hertz (Hz). Typically, the human ear can hear
sounds between about 20 Hz and 20,000 Hz. During a sound mix, lower clashing
frequencies in the music for better sound. Also, make sure that you gradually increase
and blend the sounds for a smoother transition.
Panorama: A sound panorama is a specialized tool that uses stereo sound to create
realistic three-dimensional audio. With this tool, you can pan sound in three
dimensions. You can also select the point in space where you want to place this
sound. This tool also helps you create excitement by adding movement to the track, as
well as adding clarity to an idea by clearing other clashing sounds. With correct
panning, you can create bigger, wider, and deeper sounds.
Dimension: Sound dimension is the process of adding depth to a track in order to
spruce up a boring sound. You can do this by adding effects, such as reverb, echo, and
delay.
Dynamics: Dynamics processing controls volume, and it is widely used on CDs and in
the movies. Sound technicians usually compress the sound of modern records and
films. They do so by using automated level control to make every element as loud as
possible. In a video, this is essential for keeping dialogue at a constant volume. You can
also keep some audio muted until the track reaches a predetermined threshold
level (volume) and then let it through. For example, a noise gate can turn a track off
when the musician is not playing the instrument.
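
The noise-gate behavior described above can be sketched per sample. Real gates add attack and release envelopes so the muting isn't abrupt; this minimal illustration (not from the course material) simply mutes samples below a threshold:

```python
def noise_gate(samples, threshold):
    """Mute any sample whose magnitude falls below the threshold;
    pass louder samples through unchanged."""
    return [s if abs(s) >= threshold else 0.0 for s in samples]

# Quiet hiss mixed with louder notes; the gate removes the hiss:
quiet_hiss_and_notes = [0.01, -0.02, 0.6, -0.5, 0.015, 0.8]
print(noise_gate(quiet_hiss_and_notes, 0.1))
# -> [0.0, 0.0, 0.6, -0.5, 0.0, 0.8]
```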
Psycho-acoustic principle: The psycho-acoustic principle makes use of the brain's
ability to separate individual sounds. As a sound technician, you can determine the
range of pitch that works best alongside the other sounds that occur in the
scene.
Synchronous and asynchronous sounds: Synchronous sounds are those sounds that
match what you see on the screen. For example, if you see a man playing a guitar on
the screen, the sound you hear is the sound of a guitar.
Asynchronous sounds, on the other hand, are sounds that do not match what appears
on the screen. For example, a filmmaker might use the sound of an ambulance in a
scene where a young couple is arguing. You would use such sounds to project
emotional relevance in the scene.

The use of these two types of sounds began when filmmakers realized that dialogues
alone could not convey meaning. They began to employ sound effects to portray
emotion in various scenes.

Sound Editing
A superior quality sound mix adds the punch  that enhances a video
production. Usually, sound technicians mix different levels of sounds into a single
channel. They do so by using a sound mixer or a mixing console. This console receives
audio from an external source such as a computer. With its various buttons and
adjusters, you can change the volume and add various characters to the original
sound. You can also play your final mix and send it to another device.
Sound mixing becomes easier with practice, but it is difficult to get right at first. So
don't worry if your final track doesn't sound very professional. Begin by cleaning the
original track and removing unwanted noise. You can do this by using mutes or by
controlling the volume of the audio. Then mix the rhythm section separately, and adjust
the master volume to the level at which you want the listener to hear the mix.

You can start with the bass drum, and then add the rest of the percussion
instruments. After you prepare the rhythm section, add the bass line. Next, add the
vocals or lead instrument to this mix, and then add other instruments such as the
guitar. Finally, adjust the audio levels and mix the final sound.
Application of Sound Mixing

Now that you know the concept of sound mixing, you should understand its
application. As you know, movies and videos include clips, graphics, dialogues,
recorded voiceover, music, noises, and sound effects.

The mix that you prepare on the console helps set the scene in the movie. It creates the
right atmosphere and adds emotional meaning to the scene. It also serves as
background filler and creates continuity across different shots.

During scenes such as the climax, use sound effects and tracks to keep viewers on the
edge of their seats.
Learning to Edit
In this lesson, you'll explain and demonstrate mastery in the process of editing audio-
video productions and describe the use of software-based editing equipment. You'll
also explain how to analyze and modify raw footage. Additionally, you'll describe the
process of editing a video using an original storyboard and script.
Further, you'll explain the differences between linear and nonlinear systems. Finally,
you'll describe the use of control peripherals to capture or ingest media and describe
various digital platforms.

Software-Based Editing
Making a home video is a simple process. All you need to do is shoot the given subject,
and upload and edit the file on your computer. You can then give it a title and
distribute it to your friends. However, the process for a home video is very different
from making a feature film or an advertisement. Both employ highly
skilled video editors to make the end product look more appealing.

There are many editing software programs currently available in the market. They cater
to individual needs and purposes. To choose software that fits your needs, you must understand the basics of software-based editing.
With the advent of digital editing technology, software-based editing has become
accessible and easier. Video and audio editing is now just a click away for both amateur
and professional video editors.
When you select the editing software, you will observe four windows that enable a
smooth editing experience.

The first window primarily stores and lists the shooting material. This window includes
all the graphics and titles. It also includes all the versions of audio, sound effects, music,
and dialogues.

In the second window, you can preview the clips. This window allows you to edit the
required portions and apply visual effects to your video. You can also apply filters and
transitions using this window.

The third window allows you to view the final edited portions of your video.

The final, or fourth, primary window tracks all your edit changes. For example, this
window shows you the various layers of a video with transitions and effects. You can
also see the graphics and title layers, the various sound effects, music tracks with
transitions, as well as the dialogues on the video.

These four basic windows are important elements in any editing software. You place
the raw footage on the first window, add effects and transitions on the next, and view
them in the third window. Remember that all of these windows are
interdependent. Hence, all of them are equally important.

Therefore, once you have finished editing the desired video on these windows, you can
export the final video as a single file with one video and one audio track.

Once exported, you can use this file across all platforms in various formats.
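The four-window workflow can be pictured as a simple data model: a bin that stores the material, a timeline of layered tracks, and an export step that flattens everything into one video and one audio stream. The clip names and track labels below are invented for illustration; they do not come from any specific editing program.

```python
# Illustrative model of a non-linear editing project (names are hypothetical).
project_bin = ["interview.mov", "broll_city.mov", "title.png", "music.wav"]

# The timeline window holds layered tracks: graphics over video, audio beneath.
timeline = {
    "video_2": [("title.png", 0, 5)],          # (clip, start_sec, end_sec)
    "video_1": [("interview.mov", 0, 30), ("broll_city.mov", 30, 45)],
    "audio_1": [("music.wav", 0, 45)],
}

def export(timeline):
    """Flatten all tracks into one video and one audio stream, as on export."""
    video = [c for t, clips in timeline.items() if t.startswith("video") for c in clips]
    audio = [c for t, clips in timeline.items() if t.startswith("audio") for c in clips]
    return {"video": sorted(video, key=lambda c: c[1]), "audio": audio}

print(export(timeline))
```

The point of the sketch is the last step: however many layers, transitions, and tracks the timeline holds, the exported file carries a single combined video track and a single combined audio track.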

Peripherals for Capturing or Ingesting Media
To edit a video clip, first capture the raw footage on a computer. This process is
called video-in. A computer has several primary peripheral devices that facilitate
the video-in process. For an analog video, use the printer or the modem port. You can
also import analog files using the RCA (Radio Corporation of America) connectors, or
the phono or cinch connectors. RCA jacks can handle live video feed from camcorders,
VCRs, and disc players.
The next step up for analog video is the S-Video (Separate Video) cable connector. It separates the brightness (black-and-white) signal from the color signal to achieve a higher quality image. It provides better input and output quality than RCA cables.
For digital video-in, the most commonly used peripheral option is the USB port, which has a transfer rate of 12 Mbps in its original full-speed form. It can connect your computer or laptop to any digital recording device and incurs little data loss. The next option is the digital interface called IEEE 1394, or FireWire. It is the most capable technology for video-in and incurs very little data loss, with a transfer rate of 400 Mbps. Therefore, it gives the best quality video and audio input.

Now, look at the distinction in the quality of the video-in clips.


Understanding high definition and standard definition: The number of scan lines on screen determines the quality of a clip. Each video image is drawn as a stack of horizontal lines; you can notice these lines on older television screens as well.
Based on these scan lines, a video has a "definition," or level of visible detail in the video image. Thus, the number of horizontal lines in a single frame usually measures the quality of a video. In the United States and Japan, a video with 525 lines is called standard definition (SD). In most European countries, SD has 625 lines. Video that goes beyond SD, with 720 or 1080 scan lines, is called high definition (HD). The following shows ways to transfer saved video
clips from a tape, DVD, and camera to your computer.

Transferring Videos to a Computer

Sometimes, you may want to digitize, edit, and preserve files or change the format of a
video. In such cases, you can transfer video files from a Video Home System (VHS), Digital
Video Disk (DVD), or a camera onto a computer.
Videotape: To transfer data from a VHS tape, connect your VCR to a digital camera
using an RCA connector. As soon as you play the video on the VHS player, press the
record button on the digital camera. Continue recording the video until the VHS tape
stops playing. Now, transfer the recorded video from the digital camera onto your
computer. Here, the yellow RCA connector carries the video, and the red and white connectors carry the audio.
DVD: To transfer data from a DVD, copy the contents onto your computer. Do not open
the files while copying them. Create a folder with a label for the copied files and save
them.
Camera: You can transfer data from a camera just as you did from a DVD. You simply
have to connect the camera using a Firewire or a USB cable to the computer. As you
play the video on the digital camera, copy it simultaneously onto your computer. You
can do this by copying the video directly on video editing software. You can also create
a folder on your computer and copy the contents in it.
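The camera and DVD transfers described above amount to copying a source folder into a labeled project folder on your computer. A minimal sketch of that step is below; the paths and folder names are hypothetical, chosen only to illustrate the idea.

```python
import shutil
from pathlib import Path

def ingest(source: str, project: str, base: str = "~/videos") -> Path:
    """Copy a disc or camera folder into a labeled project folder (sketch)."""
    dest = Path(base).expanduser() / project
    dest.mkdir(parents=True, exist_ok=True)  # create the labeled folder
    shutil.copytree(source, dest / Path(source).name, dirs_exist_ok=True)
    return dest

# Example call with hypothetical paths:
# ingest("/media/dvd/VIDEO_TS", "family_trip_2021")
```

Keeping each transfer in its own labeled folder, as the lesson advises, makes the footage easy to find when you later load it into editing software.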

Analyze and Modify Raw Footage


Feature films usually have a run time of two hours or less. However, the shoot can
generate about 20 to 40 hours of raw footage. This footage comprises the original or
unedited shots. From these shots, the director and the editor select the best shots for
the film. These best shots are the rough cut. This rough cut contains information about
the source of the video. It also includes the recording date and the geographical
location. If you alter the original file, all this information would be lost.
Once you obtain the raw footage, music, sound track, dialogues, titles, and graphics,
you take it to the editor. The proportion of edited to raw footage used for the film
varies. In other words, the usage of edited and unedited scenes varies. This usage
depends on the process of video shooting and its usability. However, with the advent of in-phone camera technology and digital cameras, shooting has become easier. This has resulted in a higher proportion of unused, or raw, footage.

Screening, logging, and paper cutting: To make editing easier and faster, you can use the process
of screening and logging. In order to analyze the footage, transfer it onto a VHS tape or a
DVD. During this screening, note the beginning and end time of each scene. This timing also helps in
preparing the sequence of scenes.
To create a log, note the details of all the shots from the start to the end of each reel. Include a
description of each scene along with the dialogues. This process is very useful because it eliminates
the time spent in understanding the scene before its selection.
The next method is the paper-cut technique in which you, as editor, prepare a rough estimate of
the most efficient way to perform the edit. You can use transcripts of interview footage or dialogues to find sections in the tape that you may want to reorder.
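A shot log like the one described can be kept as simple structured records: one entry per shot, noting the reel, the in and out times, and a description. The scene names and time codes below are invented for illustration.

```python
# A minimal shot log: one record per shot, from screening the footage.
shot_log = [
    {"reel": 1, "start": "00:00:12", "end": "00:01:45",
     "scene": "Opening street scene", "usable": True},
    {"reel": 1, "start": "00:01:50", "end": "00:03:10",
     "scene": "Interview, take 1 (mic noise)", "usable": False},
    {"reel": 2, "start": "00:00:05", "end": "00:02:30",
     "scene": "Interview, take 2", "usable": True},
]

# A paper cut can then begin from only the usable shots, in story order.
paper_cut = [shot for shot in shot_log if shot["usable"]]
print(len(paper_cut), "usable shots")
```

This is exactly the time saving the lesson mentions: once the log exists, nobody has to re-watch a reel to remember what is on it or whether a take is worth using.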

Scripting, Storyboarding, and Editing


A script outlines all the audio and visual elements of a story. Based on this script, you
develop a storyboard. This storyboard provides a detailed sketch of the key frames of a
video like a film. The storyboard is a graphic organizer and includes information on
camera angles, light setup, character framing, and dialogues. The Walt Disney studio first came up with the concept of the storyboard, using it to create the Mickey Mouse cartoons in the 1930s.
A typical storyboard has two columns. The first column has sketches of a scene. The
second column describes the scene and includes dialogues. You may also find
storyboards with just photographs and short descriptions below them. The director can choose to improvise on the storyboard based on the location or scene. Animated feature films, by contrast, follow the storyboard exactly, which is why storyboards are most useful for creating them. Once the big idea
translates into a storyboard, the production team starts preparing documents such as
a proposal that includes creative treatment and the video style. These documents
present a way of sharing the intended message with the audience. An audience take-
out document is also important. This document describes the audiences' reaction to
the production. Therefore, all these documents ensure a smooth flow of the shooting
process.
Storyboard and video edit: A storyboard provides details of every shot in a panel
format. You can use it as a reference to remember the shot sequence. The storyboard,
however, differs from the actual shot as a 2D image or sketch would differ from a live
shot. This live shot is in 3D, with actual characters, set, lights, and camera movement.
Editors get the entire shot sequence clearly set in their minds with the help of the script
and the storyboard. This allows editors to work accordingly. This also lets editors
experiment with changes in the storyline to evoke a stronger reaction or interest. Using
the storyboard and script, editors can also judge what works best after each scene and
so edit accordingly. The storyboard also helps the editor keep the audio in mind while
making visual decisions. Thus, a storyboard acts as a roadmap of the film's final
appearance.

Linear and Non-Linear Editing Systems


Video editing has undergone radical transformation in the recent years. There was a
time when filmmakers used videotapes for editing purposes. Today, the digital editing
systems that use computer drives to store video images have replaced
the traditional editing methods. They are affordable and enable faster recording,
transferring, and editing of videos. However, the techniques of editing the videotape
still influence the process of acquiring, processing, and distributing video images.
Based on editing methods, you can divide video editing into linear and non-
linear editing. Linear editing involves tape-to-tape editing of the footage, that is, from
the source tape to the destination tape. You can edit the footage after selecting and
arranging the video images based on a pre-decided script. On the other hand, in non-
linear editing, the footage is loaded onto the computer and edited using digital video-
editing software. This is not the only difference, however, between these two editing
formats. Here are a few more differences.

Linear versus non-linear editing: You do linear editing in a tape-to-tape format. Here, you edit the images, sound, and videos in a pre-decided sequential order. It is the most
popular format of editing live-feed videotapes, commonly used by journalists. The
linear editing process allows journalists to digitally record and edit simultaneously, as
soon as the studio receives feeds from the field. The process begins by logging the
scene time and date from the source tape. Next, the editor selects the best takes and
copies them onto the destination tape. The disadvantage of this process is that you
cannot change the order of shots once you edit them.
Alternatively, a non-linear edit is more flexible. The editors copy the video from the
digital camera onto a computer. Then, they place it in the editing software. Here, they
add the raw footage onto the Timeline and choose the correct takes. They also add
effects and mix sounds in their edit. In non-linear editing, the editor can go back
and re-edit. It is easy to access and edit the files because they are on computer hard
drives. With just a click, editors can view the work-in-progress video in real time. In
linear editing, you have to rewind or fast-forward the tape to reach a shot. Linear editing does, however, provide access to all edited and non-edited videos.
Offline editing, online editing, and edit decision lists (EDL): Editors create multiple copies of the original footage before editing. Because most of the editing takes place on these copies, the original tapes do not degrade in quality through overuse. This is the offline editing technique.
Using the time code from the copy, editors create an edit decision list (EDL). This
document lists each cut-in, or the in-point (where the editor cuts the beginning of the
shot), and the cut-out, or the outpoint (where the editor cuts the end of the shot). Using
the EDL, they create a rough cut for the approval of the director or producer. Once
approved, they use the original footage to create a final version using online tools. This
is the online editing technique.
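An EDL is essentially an ordered list of events, each naming a source tape and the in-point and out-point time codes for one shot. The entries below are invented, but they show the shape of the information an online edit works from.

```python
# A toy edit decision list: each event names the source tape and the
# in-point (cut-in) and out-point (cut-out) time codes, in program order.
edl = [
    {"event": 1, "tape": "REEL_01", "in": "00:02:10:00", "out": "00:02:18:12"},
    {"event": 2, "tape": "REEL_03", "in": "00:00:44:05", "out": "00:01:02:00"},
    {"event": 3, "tape": "REEL_01", "in": "00:05:30:00", "out": "00:05:41:20"},
]

def tapes_needed(edl):
    """List which source tapes the online edit will have to load."""
    return sorted({event["tape"] for event in edl})

print(tapes_needed(edl))
```

Given such a list, the online editor can pull only the needed reels and recreate every cut from the original footage at full quality.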
A career in video editing has become lucrative with the advent of digital video
technology. Therefore, if you want to make it your profession, then you should be
aware of related computer technologies. In addition, video editing is growing because
of the Internet. So, if you have good knowledge of software and know about non-
linear editing setup, you could easily get a job in a television or website-
based production. You could also freelance as a video editor for advertising firms. This
career offers many options for you to explore.
Using Graphics and
Animation
In this lesson, you'll describe the application of animation effects to video. You'll
describe the use of character generators, fonts and colors, and principles of
composition. You'll explain the situations that warrant the use of captions or titles for
video and graphics. Finally, you'll describe the steps involved in creating full-screen
graphics.

Applying Animation Effects to a Video


Imagine you are creating a video that narrates a futuristic story of cities under the
sea. These cities do not exist in the real world. So, how would you depict them in your
movie? One possible solution is to use animation. Today, because of computer
technology, people rely increasingly on creating computer-generated  imagery (CGI).
Filming your story under the sea requires you to create computerized animations of
the underwater environment. You may also need to shoot a video of the actors
engaged in a live action scene. You may need to include certain scenes where you add
animation to a live action scene. For example, consider a sequence where the actors
are in a submersible vehicle. You would create the underwater environment visible
through the window using CGI elements.

The next step combines the animations with the live action video into a single
composite whole. Your final video should give viewers the impression that the actors
are physically present and interacting with the elements of the animated underwater
environment you have created.

There are several advantages to integrating CGI animation with video:

 It's usually cheaper than constructing physical sets. You can even use animation
to create a crowd effect. Thus, you need not hire and pay extras to create this
effect.
 Animation allows you to create images that would not be feasible using any
other method. You can create fairies, dragons, aliens, or a world with three
moons. Your imagination is the only limit.
 With just a few artists, you can create an entire film sequence. You don't need
actors, set designers, or other people's contribution to create the video.
Many videos and movies such as Jurassic Park and The Matrix series include both
animation and live action. Compositing is the process of combining these two
aspects. The success of the compositing image largely depends on two aspects. First,
the success depends on the quality of compositing, which, in turn, depends on the
technical tools that the technicians use such as the software. Second, the composition's
outcome depends on the artistic sensibilities and design skills of the technician.

Serif fonts have thick and thin strokes and decorative flourishes (serifs). Sans serif fonts do not have these decorative flourishes or variation in line thickness.

Character generators: A character generator (CG) is software that can generate static or animated text. Designers use them to insert text into videos. Modern character
generators can also generate graphics. Broadcast companies use character generators
in live television sports or news programs to rapidly insert text or graphics into the
video. They also use it to create the watermark-like station logo that appears in a
corner of the screen area.
Fonts: Fonts are a specific style, size, and weight of typeface—a set of characters that
share the same design. Some common typefaces are Arial, Times New Roman,
and Verdana. Arial bold and Arial italics are two different fonts of the Arial typeface. The
appearance of a font can create a certain mood or atmosphere that can alter the
effectiveness of your message. Fonts can also provide visual clues about what the text
contains. Research indicates that in electronic media, readability is easier with a sans
serif font.
Color: It's important to remember the psychological impact colors create on human
minds. If used appropriately in titles, color can evoke moods and emotions that
enhance the meaning of the image.
Tips for Effective Graphic Designing

Contrast: Use contrast while creating your image. If you plan to have a dark
background color, use a light colored font and vice versa.
Font palette: Select a variety of fonts to work with. Use different typefaces for the title
and the body. Choose a font that stands out for your title. Choose simple and readable
fonts for the subheadings and body text. If you use a serif font for the titles, vary the
serif with a sans serif font for the body text.
Color scheme: Colors affect the tone of the image. A pleasing color scheme, therefore,
should include contrasting colors. You can begin with two or three main colors.
Grids and frames: Use images that help your design look attractive. Avoid placing bare, unframed images in your design, however. Use a grid or frame to give your design a professional
look.
Keep it simple: It's important to incorporate a variety of colors and fonts in your
design. A cluttered design, however, will lose its appeal. Make it creative, but keep it
simple.

Elements of Composition: Creating Good Graphics

Composition refers to the visual structure and organization of elements within a design. Compositing is the process of combining distinct parts of different images to
form a complete design. Let's look at a few composition styles.

The S or Z composition: Using a zigzag curve, such as an S or Z curve, adds motion and harmony to the composition. The curved lines of an S shape add beauty and grace. They draw your attention to the composition. The straight lines of a Z shape denote energy and busyness.
The L composition: This compositional form separates one vertical and horizontal side
from the rest of the image. The form creates a sense of peace and serenity. The L shape composition works well whether the space left outside the L is large or small. A
composition can include several L shapes.
The diagonal line: Diagonal lines in a composition indicate direction, motion, activity,
and speed. Opposing diagonals that cross each other create paths for the eye to follow.
When diagonals converge to a point, they create a focal point.

The rule of thirds: Imagine your image divided by two vertical lines that make three
columns, and two horizontal lines that make three rows. The rule of thirds shows that
the focal elements and leading lines of your image should be on or near the imaginary
lines and at the point where the lines intersect.
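The rule of thirds can be made concrete with a little arithmetic: the guide lines fall at one-third and two-thirds of the frame's width and height, and the four intersections are the strongest positions for focal elements. The 1920x1080 frame used below is just an example size.

```python
def thirds_points(width, height):
    """Return the four rule-of-thirds intersection points for a frame."""
    xs = [width / 3, 2 * width / 3]   # the two vertical guide lines
    ys = [height / 3, 2 * height / 3] # the two horizontal guide lines
    return [(x, y) for x in xs for y in ys]

# For an HD frame (example dimensions):
for point in thirds_points(1920, 1080):
    print(point)
```

Placing a subject's eyes or a horizon on one of these lines or intersections, rather than dead center, usually yields the more pleasing, asymmetrical composition described next.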
Asymmetry and symmetry: Symmetry is the term for splitting a composition into equal parts to create mirror images. The results of such a composition are usually poor. Asymmetrical compositions are visually more pleasing. You can create these compositions by keeping lines that cross the image away from the center point. A perfectly balanced image, with its elements equally distributed, tends toward a "boring" composition. An imbalance in the symmetry, on the other hand, makes the composition more interesting.
Poor composition: It is usually not pleasing to view a symmetrical
composition. Dividing an image into halves in either the portrait or the landscape
orientation leads to a poor composition. Similarly, placing too many objects with too
much space between them makes a composition poor and unattractive to the viewer.

Principles of Design

The principles of design are the manner in which you arrange various design elements
such as lines, color, and shapes.

Balance: Balance gives an image a sense of equilibrium or equal visual weight within a composition. You achieve balance through the way in which you distribute the visual elements. A composition may be symmetrical, possessing a mirror-like effect. It may
be asymmetrical, in which a single large object balances several smaller ones.
Alternatively, the composition may also be radial, in which objects radiate from a
central point.
Proportion: Proportion is the relationship between the size or number of different
components in an image. Proportion helps indicate an object's size, distance, and
location.
Movement: Movement is the path that the viewer's eye takes when observing the
graphic or image. You can create a movement with lines, edges, shapes, or colors.
Emphasis: Emphasis is the part of the image that catches the viewer's attention. You
can achieve emphasis by contrasting elements through difference in size, shape, color,
or texture.
Rhythm: You can create rhythm in a design by repeating one or more visual elements
in it. Rhythm creates a sense of organized movement in a composition.
Variety: Variety is important because it creates interest in a design. It involves using
several visual elements together, and so adds interest to a composition. However, you
should be careful not to clutter the design.
Unity: Creating unity in a design involves using the visual elements so that they appear
to belong together. Unity gives a sense of completeness to the design. You can also
refer to unity as harmony.

Use of Graphics
Graphics are an integral part of video production. Let's look at some areas where you
can effectively use graphics in video production.

Titles and credits: Video titles and credits give additional information about the
video. They include the names of the actors, director, and producer. Graphics are an
integral part of the opening sequence and end credits in a video. Graphics may take the
form of text, small icons, symbols, or illustrations related to the subject or
topic. Graphics may also take the form of bands or highlights for the titles. The title or
end credit graphics may also be fully animated.
The choice of color and font is an important factor in designing a graphic. The graphics
used in the opening titles of a horror film will be different from those in a comedy
film. Often, the style of the graphics, including the font, color, and icons, becomes the identity of the film. The graphics, then, become useful for marketing and promotional purposes. Often, designers use the same pattern they chose for the opening titles throughout the video. This pattern then forms the identity of the video.
Subtitles: There are several instances in a video where it becomes necessary to add
subtitles to explain the events unfolding in the video.
 If the video is in a foreign language, there are usually subtitles added to the
bottom half of the screen. These subtitles can be in English or the native
language of the viewers.
 If the audio is unclear, the filmmaker usually incorporates the message of the
audio as a subtitle at the bottom of the screen.
 Sometimes, the filmmaker uses subtitles to indicate the use of archival footage,
or dramatized and recreated situations.
 Often, filmmakers use subtitles to indicate the name of places and
people. Subtitles are also useful to indicate a date or time.
 Sometimes, in a narrative, subtitles can be useful to indicate a leap of time (such
as into the future or past). At other times, subtitles can indicate a shift across
borders or large distances.
[Figure: a screen grab from a video. Note the use of subtitles to indicate the name of the place where the action is about to unfold.]

Visual Effects: Visual effects result from an unusual display of visuals that are not
possible to capture through normal photography. Explosions, fires, avalanches, and
mass depictions of historical architecture are just some of the places where you can
use visual effects. Graphics and animation play an important part in creating these
visual effects. The key point is that these visual effects should be invisible. The viewer
should not be able to detect a filmmaker's use of graphics in the video.
Apart from films and television programming, mixing graphics with live action also
creates augmented reality (AR). AR is a type of virtual reality in which CGI supplements
live views of the real physical world. There are many applications for this system. In
medicine, surgeons can view the physical body along with CGI images of the internal
organs. In the military, soldiers can use AR to supplement their view of a terrain with
views from a satellite or overhead drone.

Graphics in packaging: In marketing, packaging is an important aspect that helps the buyer identify the product and make the decision to buy it. Similarly, in a video production such as an advertisement, you can identify the product by the way the
video is packaged. The color scheme is important in packaging the video, because color
and graphics help to show uniformity among the different aspects of a video.
In television programming, there are common types that display similar title sequence
and graphics. This similarity helps identify the type of content that the program
contains. Television channels, too, use different fonts and graphics that give them an
identity. This combination of text in a particular color and font, with an associated
graphic, forms the logo of the channel. Users then associate the logo with that
particular channel. A promotional video always uses similar elements so that the public
easily recognizes the show on its release.

Generating Graphics
Image-editing software is a powerful tool that can help you create and edit your
designs. There are a number of image-editing software programs such
as Adobe Photoshop, Pixelmator, Pixlr, and GIMP. Here are a few of the important tools
that Pixlr offers. Some basic tools are common to all image-editing software. Once you
know the features of a particular tool, you can use it in any software.
Supported file formats: When working with image-editing software, the first thing you
need to know is the file formats that the software supports. Pixlr supports almost any
file format, such as JPEG, PNG, BMP, GIF, TIFF, PSD, and Mac PICT. However, when saving images, you can save in one of four formats: BMP, JPEG, PNG, and TIFF.
Save: Once you have created or edited your graphic, it's important to save it for web
use. Saving an image for the web reduces its file size so that a user can easily view it on
the web. In Pixlr, the Save option from the File menu opens a dialogue box that asks
where to save the file. It also gives the option to adjust its quality. A lower number
means a smaller file size and lower quality.
Getting started: To begin creating or editing an image, you can open an image that
you've already stored on your computer. Select Browse and then choose the image
from the location where you have saved it on your computer. The image will open on
the canvas of the photo-editing software.
Toolbar: The toolbar provides many different tools that enable you to edit your image
in a variety of ways. There are some basic tools, such as selection tools, blending tools, and special effects and filter tools.
 Selection tools: You can use these tools to select, move, and crop the desired
parts of your image. The marquee tool lets you select part of the image in a
particular shape. The lasso is a free form-selection tool. The wand tool selects
the spot you clicked and anything around it that is similar (such as color).
 Editing tools: Editing tools are a set of tools that help you make changes to the
selected image. Using the pencil, you can draw freehand. There is an eraser to
delete sections and a brush tool to paint, both of which come in different sizes
and styles.
 Blending tools: Blending tools enable you to blur, sharpen, and smudge areas
of your image. A sponge tool helps you saturate or desaturate
colors. The burn tool darkens areas of the image, while the dodge tool lightens
areas.
 Special effects and filter tools: These tools include the red eye tool that
enables you to remove red reflections from eyes. The bloat tool expands
areas of the image, while the pinch tool shrinks them.

Layers: Layers are an editing feature that lets you separate the different elements or effects. It's like drawing various aspects of an image on different sheets of plastic and
stacking them together to view the final image. You can use the Layers menu option to
perform several tasks. The Layers tool lets you see all the layers in your image file. The
tool lets you organize and arrange your layers. You can use this tool to blend the
features of two or more layers. You can also group and merge layers or make them
visible or opaque.
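The stacking idea can be illustrated with simple alpha blending: each layer's pixel is mixed over the pixel beneath it according to the layer's opacity. The pixel values and opacities here are made up for the example.

```python
def blend(bottom, top, opacity):
    """Blend one RGB pixel over another at the given opacity (0.0 to 1.0)."""
    return tuple(round(b * (1 - opacity) + t * opacity)
                 for b, t in zip(bottom, top))

background = (200, 180, 140)  # a sand color (example values)
text_layer = (20, 20, 20)     # dark title text

# A fully opaque layer hides the background; a half-opaque one mixes with it.
print(blend(background, text_layer, 1.0))  # -> (20, 20, 20)
print(blend(background, text_layer, 0.5))  # -> (110, 100, 80)
```

Image editors apply this kind of mix pixel by pixel, top layer down, which is why lowering a layer's opacity lets the layers beneath it show through.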
Channels: Digital images consist of pixels or tiny dots of color. The color of a pixel is a combination of three primary colors: red, green, and blue. You can think of a channel
as one layer of your image viewed with only one of these primary colors. Each color is
stored on a scale of 0 to 255. The image in each channel will appear in grayscale. The
darker parts of the image contain the least or no amount of that particular color. The
lighter or white parts contain the most amount of the color. Pixlr contains a color
channel tool that allows you to change the amount of color in each channel. This tool is
useful to create highlights and soften colors in an image.
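The channel idea can be demonstrated on a single pixel: each channel holds one of the three color values on the 0-to-255 scale, and viewing a channel alone is like viewing a grayscale image of just that value. The pixel below is an arbitrary example.

```python
# One pixel as (red, green, blue) values on a 0-255 scale (example values).
pixel = (210, 96, 30)  # a warm orange

channels = dict(zip(("red", "green", "blue"), pixel))

# In the red channel this pixel appears bright (lots of red);
# in the blue channel it appears dark (little blue).
for name, value in channels.items():
    print(f"{name} channel: {value} ({value / 255:.0%} of full intensity)")
```

Adjusting the amount of color in one channel, as the lesson describes, means scaling these per-channel values up or down across every pixel in the image.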

Color Palette: A palette is a group of tools that the software displays in a small window. The color palette lets you select and change colors with the help of brush
tools. It contains one large color area, which is in the color you are currently working
with. Below it are six smaller color areas, which are the most recent custom colors that
you have used. Clicking within the large color box will bring up a window that lets you
choose different colors. You can choose colors based on different color modes, such as
web and RGB (the red, green, and blue primary color mode). The color picker is one of
the editing tools in the toolbar menu. This tool replaces the main color in your color
palette with the color you select from the image.
Text: The Text tool enables you to create a box in which you can type. You can add
letters, characters, and numbers. This tool allows you to change the font and its style,
size, and color.
History: The History palette lets you go back to the changes that you've made to your
image earlier. It lets you undo these changes or make additional edits to a previous
action. You can access the history palette from the menu bar.
Menu: The menu bar appears on the top panel of the software. It contains several
tools and options to help you edit your image.
 File: This menu contains options to open, close, print, and save files. You can use
it to close the program as well.
 Edit: This menu contains the basic editing options, such as cut, copy, and paste.
 Image: This menu enables you to change the canvas or image size and the
orientation of the canvas.
 Layer: This menu provides different options to work with layers in your file.
 Adjustments and Filter: These menus give you the ability to add effects to your
image.
 View: This menu enables you to display various palettes. View also controls the
zoom and full screen modes.
 Language: This menu changes the language in which descriptions and
instructions appear.
 Help: This menu enables you to get assistance and view the FAQ and the blog
sections.
