
ACKNOWLEDGEMENT

This project has been a great working experience for me. I learnt about the
various aspects of the working of DOORDARSHAN. The topic of my project is Study of Television
Transmission and Broadcasting System. It helped me to know the steps involved, from the
inception of the video signal at the source, via the transmission medium, to its reception at television
sets. First of all, I would like to express my deepest gratitude to Mr. V. B. Patel, Head of
Vocational Training (Doordarshan Kendra, Lucknow), who provided me this opportunity to
work with Doordarshan. I am grateful to Mr. R. Naithani and Mr. Rohit Bhatt, who through
their deep knowledge of the topic helped me understand it in a simple and better way. I
am also thankful to all the employees of DOORDARSHAN KENDRA, Lucknow, without
whose support in providing the necessary materials this project report would not have been
possible. Finally, I am thankful to the Training and Placement Cell of my college, which gave me
the opportunity to pursue summer training at Doordarshan Kendra, Lucknow.

- SHIKHA SINGH

CONTENTS
1. Introduction
2. Analog Television System

3. Television Studio

Basics of picture
Interlaced Versus Progressive Scan
Composite/ CVBS Interface
Y/C Interface
Component Interface

Studio Floor
Production Control Room
Master Control Room
Other Facilities
4. The Camera Imaging Device
Charge Coupled Devices (CCDs)
Three CCDs camera
Main parts of a Camera
Studio Camera
ENG Camera
EFP Camera
Dock Camera
Lipstick Camera
7. Color Temperature and Color Balance
8. Satellite Communication

Features of Earth Station
System Layout
Specification
Video Audio Termination Panel
UP Converter
Audio Processing
Transmitting Antenna
Protection System

9. Conclusion

INTRODUCTION

Doordarshan is the public television broadcaster of India and a division of Prasar Bharati. It is
a public service broadcaster nominated by the Government of India. It is one of the largest
broadcasting organizations in the world in terms of the infrastructure of studios and
transmitters. Recently, it has also started Digital Terrestrial Transmitters. Doordarshan had a
modest beginning with the experimental telecast started in Delhi on 15 September 1959 with a
small transmitter and a makeshift studio. The regular daily transmission started in 1965 as a
part of All India Radio. The television service was extended to Bombay (now Mumbai) and
Amritsar in 1972. Until 1975, only seven Indian cities had a television service and Doordarshan
remained the sole provider of television in India. In 1982, it came into existence as the national
broadcaster.
Presently, Doordarshan operates 21 channels: two All India channels (DD National and DD
News), 11 Regional Language Satellite Channels (RLSC), four State Networks (SN), an
International channel, a sports channel (DD Sports) and two channels (Rajya Sabha TV and
DD-Lok Sabha) for live broadcast of parliamentary proceedings.
On DD National (DD-1), regional and local programmes are carried on a time-sharing basis.
The DD News channel, launched on 3 November 2003, replaced the DD Metro (DD-2)
Entertainment channel and provides a 24-hour news service.
The Regional Language Satellite channels have two components: the regional service for
the particular state, relayed by all terrestrial transmitters in the state, and additional programmes
in the regional language in prime time and non-prime time, available only through cable
operators. The DD Sports channel is exclusively devoted to the broadcasting of sporting events of
national and international importance.

ANALOG TELEVISION SYSTEMS


Television (TV) is a telecommunication medium for transmitting and receiving moving images
that can be monochromatic (shades of grey) or multi-coloured. Images are usually
accompanied by sound.
The word derives from mixed Latin and Greek roots, meaning "far sight":
Greek tele, far, and Latin visio, sight (from video, vis-, to see, or to view in the first person).
Broadcast TV is typically disseminated via radio transmissions on designated channels in the
54 to 890 MHz frequency band.[1] Signals are now often transmitted with stereo and/or surround
sound in many countries. Until the 2000s broadcast TV programs were generally transmitted as
an analogue television signal, but in recent years public and commercial broadcasters have
been progressively introducing digital television broadcasting technology.
All but one of the analog television systems began as monochrome systems. Each country, faced with
local political, technical, and economic issues, adopted a color system which was grafted onto
an existing monochrome system, using gaps in the video spectrum (explained below) to allow
color transmission information to fit in the existing channels allotted. The grafting of the color
transmission standards onto existing monochrome systems permitted existing monochrome
television receivers predating the changeover to color television to continue to operate as
monochrome television. Because of this compatibility requirement, color standards added a
second signal to the basic monochrome signal, which carries the color information. The color
information is called chrominance or C for short, while the black and white information is
called the luminance or Y for short. Monochrome television receivers only display the
luminance, while color receivers process both signals. Though in theory any monochrome
system could be adapted to a color system, in practice some of the original monochrome
systems proved impractical to adapt to color and were abandoned when the switch to color
broadcasting was made. All countries now use one of three color systems: NTSC, PAL, or
SECAM. In India PAL technique is used for television broadcasting.

Basics of picture
A television creates a continuous series of moving pictures on the screen. This section will
describe in detail how pictures are created in a television. A camera works on exactly the same
principle, applied the other way round.
A picture is "drawn" on a television or computer display screen by sweeping an electrical
signal horizontally across the display one line at a time. The amplitude of this signal versus
time represents the instantaneous brightness at that physical point on the display.
At the end of each line, there is a portion of the waveform (horizontal blanking interval) that
tells the scanning circuit in the display to retrace to the left edge of the display and then start

scanning the next line. Starting at the top, all of the lines on the display are scanned in this way.
One complete set of lines makes a picture. This is called a frame. Once the first complete
picture is scanned, there is another portion of the waveform (vertical blanking interval, not
shown) that tells the scanning circuit to retrace to the top of the display and start scanning the
next frame, or picture. This sequence is repeated at a fast enough rate so that the displayed
images are perceived to have continuous motion. This is the same principle as that behind the
"flip books" that you rapidly flip through to see a moving picture or cartoons that are drawn
and rapidly displayed one picture at a time.

Interlaced versus Progressive Scans


These are two different types of scanning systems. They differ in the technique used to cover
the area of the screen. Television signals and compatible displays are typically interlaced, and
computer signals and compatible displays are typically progressive (non-interlaced). These two
formats are incompatible with each other; one would need to be converted to the other before
any common processing could be done. Interlaced scanning is where each picture, referred to
as a frame, is divided into two separate sub-pictures, referred to as fields. Two fields make
up a frame. An interlaced picture is painted on the screen in two passes, by first scanning the
horizontal lines of the first field and then retracing to the top of the screen and then scanning
the horizontal lines for the second field in-between the first set. Field 1 consists of lines 1
through 262 1/2, and field 2 consists of lines 262 1/2 through 525. The interlaced principle is
illustrated in Figure 2. Only a few lines at the top and the bottom of each field are shown.
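The two-field structure described above can be sketched in Python (an illustrative toy, not broadcast code): one frame's lines are split into an odd-line field and an even-line field, and the two fields together account for every line of the frame.

```python
# Minimal sketch of interlaced scanning: a 525-line NTSC frame is
# delivered as two fields, one carrying the odd lines, one the even lines.
def split_into_fields(frame_lines):
    """frame_lines: list of scan lines (index 0 = line 1)."""
    field1 = frame_lines[0::2]  # lines 1, 3, 5, ...
    field2 = frame_lines[1::2]  # lines 2, 4, 6, ...
    return field1, field2

frame = [f"line {n}" for n in range(1, 526)]  # 525 total NTSC lines
f1, f2 = split_into_fields(frame)
print(len(f1), len(f2))  # 263 262 -- two fields make one frame
```

The odd field carries one extra half... in real NTSC each field is 262 1/2 lines; in this whole-line toy that shows up as 263 versus 262 lines.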

There are many different kinds of video signals, which can be divided into either television or
computer types. The format of television signals varies from country to country. In the United
States and Japan, the NTSC format is used. NTSC stands for National Television Systems
Committee, which is the name of the organization that developed the standard. In Europe, the
PAL format is common. PAL (phase alternating line), developed after NTSC, is an

improvement over NTSC. SECAM is used in France and stands for séquentiel couleur à
mémoire ("sequential color with memory"). It should be noted that there are about 15 different sub-formats contained within these three general formats. Each of the formats is generally not
compatible with the others. Although they all utilize the same basic scanning system and
represent color with a type of phase modulation, they differ in specific scanning frequencies,
number of scan lines, and color modulation techniques, among others. The various computer
formats (such as VGA, XGA, and UXGA) also differ substantially, with the primary difference
in the scan frequencies. These differences do not cause as much concern, because most
computer equipment is now designed to handle variable scan rates. This compatibility is a
major advantage for computer formats in that media, and content can be interchanged on a
global basis. In India we use the PAL system. It has 625 lines in each frame and uses interlaced
scanning.

Typical Frequencies for Common TV and Computer Video Formats

                          NTSC                  PAL                   HDTV/SDTV
Description               Television format     Television format     High-definition/
                          for North America     for most of Europe    standard-definition
                          and Japan             and South America;    digital television
                                                used in India         format
Vertical resolution       Approx. 480           Approx. 575           1080, 720 or 480;
(visible lines per frame) (525 total lines)     (625 total lines)     18 different formats
Horizontal resolution     Determined by         Determined by         1920, 704 or 640;
(visible pixels per line) bandwidth; ranges     bandwidth; ranges     18 different formats
                          from 320 to 650       from 320 to 720
Horizontal rate (kHz)     15.734                15.625                33.75-45
Vertical frame rate (Hz)  29.97                 25                    30-60
Highest frequency (MHz)   4.2                   5.5                   25
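The scanning frequencies in the table are not independent: the horizontal (line) rate is simply the total number of lines per frame multiplied by the frame rate. A quick check in Python (illustrative only):

```python
# The horizontal (line) rate follows from total lines x frame rate.
def line_rate_khz(total_lines, frame_rate_hz):
    return total_lines * frame_rate_hz / 1000.0

print(round(line_rate_khz(525, 30000 / 1001), 3))  # NTSC: 15.734 kHz
print(round(line_rate_khz(625, 25), 3))            # PAL:  15.625 kHz
```

Note that the NTSC frame rate is 30000/1001 (about 29.97) Hz, which is why its line rate is 15.734 kHz rather than a round 15.750 kHz.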
There are three basic levels of baseband signal interfaces. In order of increasing quality, they
are composite (or CVBS), which uses one wire pair; Y/C (or S-video), which uses two wire
pairs; and component, which uses three wire pairs. Each wire pair consists of a signal and a
ground. These three interfaces differ in their level of information combination (or encoding).
More encoding typically degrades the quality but allows the signal to be carried on fewer
wires. Component has the least amount of encoding, and composite the most.

Composite/CVBS Interface

Composite signals are the most commonly used analog video interface. Composite video is
also referred to as CVBS, which stands for color, video, blanking, and sync, or composite
video baseband signal. It combines the brightness information (luma), the color
information (chroma), and the synchronizing signals on just one cable. The connector is
typically an RCA jack. This is the same connector as that used for standard line-level audio
connections. A typical waveform of an all-white NTSC composite video signal is shown
in Figure.

This figure depicts the portion of the signal that represents one horizontal scan line. Each line
is made up of the active video portion and the horizontal blanking portion. The active video
portion contains the picture brightness (luma) and color (chroma) information. The brightness
information is the instantaneous amplitude at any point in time. From the figure, it can be seen
that the voltage during the active video portion would yield a bright-white picture for this
horizontal scan line, whereas the horizontal blanking portion would be displayed as black and
therefore not be seen on the screen. Color information is added on top of the luma signal and is
a sine wave with the colors identified by a specific phase difference between it and the color-burst reference phase. The amplitude of the modulation is proportional to the amount of color
(or saturation), and the phase information denotes the tint (or hue) of the color. The horizontal
blanking portion contains the horizontal synchronizing pulse (sync pulse) as well as the color
reference (color burst) located just after the rising edge of the sync pulse (called the "back
porch"). It is important to note here that the horizontal blanking portion of the signal is
positioned in time such that it is not visible on the display screen.
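The line structure just described (sync tip, blanking with the color burst on the back porch, then active video) can be sketched numerically. The levels and timings below are approximate textbook NTSC values used purely for illustration (sync at -40 IRE, blanking at 0 IRE, white at 100 IRE, a roughly 9-cycle burst); they are not a broadcast specification.

```python
import math

# Rough sketch of one all-white NTSC composite scan line, in IRE units.
FS = 40e6                    # sample rate (Hz), chosen only for illustration
LINE = 63.556e-6             # total line duration (s)
SYNC = 4.7e-6                # horizontal sync pulse width (s)
BACK_PORCH = 4.5e-6          # back porch, which carries the color burst (s)
FSC = 3.579545e6             # NTSC color subcarrier frequency (Hz)

n = int(LINE * FS)
line = []
for i in range(n):
    t = i / FS
    if t < SYNC:
        v = -40.0                                   # sync tip
    elif t < SYNC + BACK_PORCH:
        v = 0.0                                     # blanking level
        if SYNC + 0.6e-6 <= t < SYNC + 3.1e-6:      # ~9 cycles of burst
            v += 20.0 * math.sin(2 * math.pi * FSC * t)
    else:
        v = 100.0                                   # active video: white
    line.append(v)

print(min(line), max(line))   # -40.0 100.0
```

The sync tip is the most negative excursion, the burst swings symmetrically about the blanking level, and the all-white active portion sits at the top of the range, as in the figure.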

Y/C Interfaces
The Y/C signal is a video signal with less encoding. Brightness (luma), which is the Y signal,
and the color (chroma), the C signal, are carried on two separate sets of wires.

Component Interfaces

Component signal interfaces are the highest performance, because they have the least
encoding. The signals exist in a nearly native format. They always utilize three pairs of wires
that are typically in either a luma (Y) and two-color-difference-signals format or a red, green,
blue (RGB) format. RGB formats are almost always used in computer applications, whereas
color-difference formats are generally used in television applications. The Y signal contains the
brightness (luma) and synchronizing information, and the color-difference signals contain the
red (R) minus the Y signal and the blue (B) minus the Y signal. The theory behind this
combination is that each of the base R, G, and B components can be derived from these
difference signals. Common variations of these signals are as follows:
Y, B-Y, R-Y : Luma and color-difference signals.
Y, Pr, Pb : Pr and Pb are scaled versions of B-Y and R-Y. Commonly found in high-end
consumer equipment.
Y, Cr, Cb : Digital-signal equivalent of Y, Pr, Pb. Sometimes incorrectly used in place of
Y, Pr, Pb.
Y, U, V : Not an interface standard. These are intermediate, quadrature signals used in
the formation of composite and Y/C signals. Sometimes incorrectly referred to as a component
interface. Some important terms and their meanings in this context are listed below.
Aspect Ratio: Aspect ratio is the ratio of the visible picture width to its height. Standard
television and computers have an aspect ratio of 4:3 (1.33). HDTV has aspect ratios of either
4:3 or 16:9 (1.78). Additional aspect ratios such as 1.85:1 or 2.35:1 are used in cinema.
Blanking Interval: There are horizontal and vertical blanking intervals. Horizontal blanking
interval is the time period allocated for retrace of the signal from the right edge of the display
back to the left edge to start another scan line. Vertical blanking interval is the time period
allocated for retrace of the signal from the bottom back to the top to start another field or
frame. Synchronizing signals occupy a portion of the blanking interval.
Blanking Level: Used to describe a voltage level (blanking level). The blanking level is the
nominal voltage of a video waveform during the horizontal and vertical periods, excluding the
more negative voltage sync tips.
Chroma: The color portion of a video signal. This term is sometimes incorrectly called
"chrominance," which refers to the actual displayed color information.
Color Burst: The color burst, also commonly called the "color subcarrier," is 8 to 10 cycles of
the color reference frequency. It is positioned between the rising edge of sync and the start of
active video for a composite video signal.

Fields and Frames: A frame is one complete scan of a picture. In NTSC it consists of 525
horizontal scan lines. In interlaced scanning systems, a field is half of a frame; thus, two fields
make a frame.
Luma: The monochrome or black-and-white portion of a video signal. This term is sometimes
incorrectly called "luminance," which refers to the actual displayed brightness.
Monochrome: The luma (brightness) portion of a video signal without the color information.
Monochrome, commonly known as black-and-white, predates current color television.
PAL: Phase alternating line. PAL is used to refer to systems and signals that are compatible with
this specific modulation technique. Similar to NTSC, but uses subcarrier phase alternation to
reduce the sensitivity to phase errors that would be displayed as color errors. Commonly used
with 625-line, 50 Hz scanning systems with a subcarrier frequency of 4.43362 MHz.
Pixel: Picture element. A pixel is the smallest piece of display detail that has a unique
brightness and color. In a digital image, a pixel is an individual point in the image, represented
by a certain number of bits to indicate the brightness.
RGB: Stands for red, green, and blue. It is a component interface typically used in computer
graphics systems.
Sync Signals/Pulses: Sync signals, also known as sync pulses, are negative-going timing
pulses in video signals that are used by video-processing or display devices to synchronize the
horizontal and vertical portions of the display.
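The component representation described earlier, luma plus two color-difference signals, can be sketched in Python. The luma weights below are the Rec. 601 coefficients commonly used for standard-definition television; the report does not name specific coefficients, so treat them as an assumption for illustration.

```python
# Sketch of the component representation: Y plus two color-difference
# signals (B-Y and R-Y), using the Rec. 601 luma weights (an assumption).
def rgb_to_ydiff(r, g, b):
    y = 0.299 * r + 0.587 * g + 0.114 * b   # luma
    return y, b - y, r - y                  # Y, B-Y, R-Y

def ydiff_to_rgb(y, by, ry):
    r = ry + y
    b = by + y
    g = (y - 0.299 * r - 0.114 * b) / 0.587  # G recovered from Y, R and B
    return r, g, b

y, by, ry = rgb_to_ydiff(0.5, 0.25, 0.75)
r, g, b = ydiff_to_rgb(y, by, ry)
print(round(r, 6), round(g, 6), round(b, 6))  # 0.5 0.25 0.75
```

This is the "theory behind this combination" mentioned above: given Y and the two difference signals, all three base components (including green, which is never transmitted directly) can be recovered.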

TELEVISION STUDIO
A television studio is an installation in which television or video productions take place, either
for live television, for recording live to tape, or for the acquisition of raw footage for post-production. The design of a studio is similar to, and derived from, movie studios, with a few
amendments for the special requirements of television production. A professional television
studio generally has several rooms, which are kept separate for noise and practicality reasons.

These rooms are connected via intercom, and personnel will be divided among these
workplaces.

Studio Floor

Fig: Studio Room of a news channel in making

The studio floor is the actual stage on which the actions that will be recorded take place. A
studio floor has the following characteristics and installations:

decoration and/or sets
cameras (sometimes one, usually several) on pedestals
microphones
lighting rigs and the associated controlling equipment
several video monitors for visual feedback from the production control room
a small public address system for communication
a glass window between the production control room (PCR) and the studio floor for direct
visual contact (usually desired, but not always possible)

While a production is in progress, the following people work on the studio floor.

The on-screen "talent" themselves, and any guests - the subjects of the show.
A floor director or floor manager, who has overall charge of the studio area, and who
relays timing and other information from the director.
One or more camera operators who operate the television cameras, though in some
instances these can also be operated from PCR using remote heads.
Possibly a teleprompter operator, especially if this is a news broadcast

Production-Control Room
The production control room (PCR), also known as the "gallery" or Studio Control Room
(SCR), is the place in a television studio in which the composition of the outgoing program
takes place. Facilities in a PCR include:

A video monitor wall, with monitors for program, preview, VTRs, cameras, graphics
and other video sources. In some facilities, the monitor wall is a series of racks
containing physical television and computer monitors; in others, the monitor wall has
been replaced with a virtual monitor wall (sometimes called a "glass cockpit"), one or
more large video screens, each capable of displaying multiple sources in a simulation of
a monitor wall.
A vision mixer, a large control panel used to select the video sources to be seen on air
and, in many cases, in any monitors on the set. The term 'vision mixer' is primarily used
in Europe, while the term 'switcher' is usually used in North America.
An audio mixing console and other audio equipment such as effects devices.
A character generator, which creates the majority of the names and full screen graphics
that are inserted into the program
Digital video effects, or DVE, for manipulation of video sources. In newer vision
mixers, the DVE is integrated into the vision mixer; older models without built-in
DVE's can often control external DVE devices, or an external DVE can be manually
run by an operator.

Fig: A Production Control Room

A still store, or still frame, device for storage of graphics or other images. While the
name suggests that the device is only capable of storing still images, newer still stores
can store moving video clips.
The technical director's station, with waveform monitors, vector scopes and the CCUs
or remote control panels for the CCUs.
In some facilities, VTRs may also be located in the PCR, but are also often found in the
central machine room
Intercom and IFB equipment for communication with talent and crew

Master-Control Room

The master control room houses equipment that is too noisy or runs too hot for the production
control room. It also keeps wire runs and installation requirements within manageable
lengths, since most high-quality wiring runs only between devices in this room.
This can include:

The actual circuitry and connection boxes of the vision mixer, DVE and character
generator devices
camera control units
VTRs
Patch panels for reconfiguration of the wiring between the various pieces of equipment.

In a broadcast station in the US, master control room or "MCR" is the place where the on-air
signal is controlled. It may include controls to play back programs and commercials, switch
local or network feeds, record satellite feeds and monitor the transmitter(s), or these items may
be in an adjacent equipment rack room. The term "studio" usually refers to a place where a
particular local program is originated. If the program is broadcast live, the signal goes from the
production control room to MCR and then out to the transmitter.

Other Facilities:
A television studio usually has other rooms with no technical requirements beyond program
and audio monitors. Among them are:

One or more make-up and changing rooms


A reception area for crew, talent, and visitors, commonly called the green room.

THE CAMERA IMAGING DEVICE


The principal elements of a typical black-and-white television camera are the lens and the
camera imaging device. This used to be a camera tube (with its associated scanning and focusing
coils), but now is a CCD. The lens focuses the scene on the front end of the imaging device.

Charge Coupled Device:


Broadcasters have used charge-coupled devices (CCDs) for ENG cameras since the
early 1980s. Their light weight, low cost and high reliability allowed CCDs to gain rapid
acceptance. Manufacturers now produce these devices for use in professional and consumer
video camcorders. The first step in creating a camera image is to gather light. CCDs are rigidly
and permanently mounted, usually to the prism itself. There is no possibility for adjusting the
scanning process. Lens manufacturers, in turn, standardize their product to work under stringent
conditions.

How CCDs Work:


There are three sections in a typical CCD. An array of photodiodes is positioned at
the output of the prism. As varying amounts of light strike the diodes, those that are illuminated
become "forward biased", and currents flow that are proportional to the intensity of the light.

Fig: Layers of a CCD


The shift gate acts as a switch. This permits the current from each diode to be stored in a
solid state capacitor in the CCD. As we know, capacitors store voltages, and these little guys are
no exception.
The actual transfer of the voltages out to the real world is the key to why CCDs are so
ingenious. The CCD unit can transfer the voltage from cell to cell without any loss. This is
called charge coupling, which is how the CCD gets its name: Charge Coupled Device.
When the transfer gate of a CCD image sensor is activated, the CCD's clocking
circuitry moves the contents of each picture cell to the adjacent cell. Clocking the shift registers
in this manner transfers the light input value of each cell to the output, one value at a time. The
CCD chips provide their own scanning circuitry, in a way. The last cell in the chain sends its
voltage, in turn, to the output circuit of the chip. As an added bonus, cycling through all of the
cells this way will not only send out all of the stored voltages, but also discharges all of the
cells, too. Everything goes back to normal and the cells are ready to take in a new analog
voltage value.
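The charge-coupled readout described above can be modelled as a simple shift register (a toy sketch, not device physics): each clock pulse moves every cell's stored charge one place toward the output, so the values emerge one at a time, and the cells end up discharged and ready for the next exposure.

```python
# Toy model of CCD charge-coupled readout: each clock shifts every cell's
# charge one position toward the output; the last cell drives the output.
def read_out(cells):
    output = []
    for _ in range(len(cells)):
        output.append(cells[-1])     # last cell in the chain sends its value
        cells = [0] + cells[:-1]     # shift every charge one cell over
    return output, cells

charges = [3, 7, 1, 9]               # accumulated photo-charges
video, cells = read_out(charges)
print(video)   # [9, 1, 7, 3] -- values clocked out one at a time
print(cells)   # [0, 0, 0, 0] -- all cells discharged, ready for new light
```

As in the text, the readout both delivers all of the stored values and leaves the array reset, with no separate erase step needed.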

The CCD analog shift register deals with the charges coming from the capacitors. Each
of these registers has an address decoder that allows each portion of the image to be
individually addressed. The address decoder cycles through the field of photosensitive registers
and reads out the analog voltages for each pixel. The speed of operation of this decoder is
synchronized to the scan rate of television.

COLOUR CAMERAS
Three Chip Cameras:

Fig: Colour camera head end


A three-CCD camera is a camera whose imaging system uses three separate charge-coupled devices (CCDs), each one taking a separate measurement of red, green, or blue light.
Light coming into the lens is split by a trichroic prism assembly, which directs the appropriate
wavelength ranges of light to their respective CCDs. The system is employed by some still
cameras, video cameras, television systems and camcorders.
Compared to cameras with only one CCD, three-CCD cameras generally provide
superior image quality and resolution. By taking separate readings of red, green, and blue
values for each pixel, three-CCD cameras achieve much better precision than single-CCD
cameras. By contrast, almost all single-CCD cameras use a Bayer filter, which allows them to
detect only one-third of the color information for each pixel. The three electrical signals that
control the respective beams in the picture tube are produced in the colour television camera by
three CCD (Charge Coupled Device) integrated circuit chips. The camera has a single lens,
behind which a prism or a set of dichroic mirrors produces three images of the scene. These are
focused on the three CCDs. In front of each CCD is a colour filter; the filters pass respectively
only the red, green, or blue components of the light in the scene to the chips. The three signals
produced by the camera are transmitted (via colour encoding) to the respective electron guns in
the picture tube, where they re-create the scene.
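The "one-third of the color information" point above can be illustrated by counting measurements. The RGGB mosaic below is one common Bayer arrangement (an illustrative assumption): each photosite sits behind a single red, green, or blue filter, so an N-pixel Bayer sensor takes N color measurements, while a 3-CCD camera takes 3N.

```python
# Why a Bayer single-sensor camera captures one-third the color data of a
# 3-CCD camera: each photosite measures only one of R, G or B.
def bayer_pattern(width, height):
    """Return the filter color at each photosite for an RGGB mosaic."""
    rows = []
    for y in range(height):
        row = []
        for x in range(width):
            if y % 2 == 0:
                row.append('R' if x % 2 == 0 else 'G')
            else:
                row.append('G' if x % 2 == 0 else 'B')
        rows.append(row)
    return rows

w, h = 4, 4
mosaic = bayer_pattern(w, h)
samples_bayer = w * h        # one color value per pixel
samples_3ccd = 3 * w * h     # R, G and B all measured at every pixel
print(samples_bayer, samples_3ccd)  # 16 48 -- one-third the measurements
```

The mosaic also shows why green is sampled twice as densely as red or blue in a Bayer sensor: half of the photosites carry green filters, matching the eye's greater sensitivity to green.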

The combination of the three sensors can be done in the following ways:
Composite sampling, where the three sensors are perfectly aligned to avoid any color artifact
when recombining the information from the three color planes.
Pixel shifting, where the three sensors are shifted by a fraction of a pixel. After
recombining the information from the three sensors, higher spatial resolution can be achieved.[2]
Pixel shifting can be horizontal only, to provide higher horizontal resolution in a standard-resolution
camera, or horizontal and vertical, to provide a high-resolution image using
standard-resolution imagers. The alignment of the three sensors can be achieved by
micro-mechanical movements of the sensors relative to each other.
Arbitrary alignment, where the random alignment errors due to the optics are comparable
to or larger than the pixel size.

Main Parts of Camera:


Lens: The lens is the first component in the light path. The camcorder's optics generally has
one or more of the following adjustments:
Aperture or iris to regulate the exposure and to control depth of field;
Zoom to control the focal length and angle of view;
Shutter speed to regulate the exposure and to maintain desired motion portrayal;
Gain to amplify signal strength in low-light conditions;
Neutral density filter to regulate the exposure.
Imager: The imager converts light into electric signal. The camera lens projects an image onto
the imager surface, exposing the photosensitive array to light. The light exposure is converted
into electrical charge. At the end of the timed exposure, the imager converts the accumulated
charge into a continuous analog voltage at the imager's output terminals. After scan-out is
complete, the photosites are reset to start the exposure-process for the next video frame.
Recorder: The recorder is responsible for writing the video signal onto a recording medium
(such as magnetic videotape). The record function involves many signal-processing steps, and
historically, the recording-process introduced some distortion and noise into the stored video,
such that playback of the stored-signal may not retain the same characteristics/detail as the live
video feed.

Studio Cameras
Most studio cameras stand on the floor, usually with pneumatic or hydraulic mechanisms
called pedestals to adjust the height, and are usually on wheels. Any video camera when used
along with other video cameras in a studio setup is controlled by a device known as CCU
(camera control unit), to which they are connected via a Triax, Fibre Optic or the almost
obsolete Multicore cable. The camera control unit along with other equipment is installed in
the production control room often known as the Gallery of the television studio. When used
outside a studio, they are often on tripods that may or may not have wheels (depending on the
model of the tripod). Initial models used analog technology, but are now obsolete, supplanted
by digital models. Studio cameras are light and small enough to be taken off the pedestal and
the lens changed to a smaller size to be used on a cameraman's shoulder, but they still have no
recorder of their own and are cable-bound. Cameras can be mounted on a tripod, a dolly or a
crane, thus making the cameras much more versatile than previous generations of studio
cameras.
ENG Cameras: Though by definition ENG (Electronic News Gathering) video cameras were
originally designed for use by news camera operators, these have become the dominant style of
professional video camera for most productions, from dramas to documentaries, from music

videos to corporate training. While they have some similarities to the smaller consumer
camcorder, they differ in several regards:
ENG cameras are larger and heavier, and usually supported by a shoulder stock on the
cameraman's shoulder, taking the weight off the hand, which is freed to operate the lens zoom
control. The weight of the cameras also helps dampen small movements. 3 CCDs are used
instead of one, one for each primary color. They have interchangeable lenses.
All settings, white balance, focus, and iris can be manually adjusted, and automatics
can be completely disabled. The lens is focused manually and directly, without intermediate
servo controls. However the lens zoom and focus can be operated with remote controls in a
studio configuration. Professional BNC connectors for video and at least two XLR input
connectors for audio are included. A complete time code section is available, allowing time
code presets; and multiple cameras can be time code-synchronized with a cable. "Bars and
tone" are available in-camera (the color bars are SMPTE (Society of Motion Picture and
Television Engineers) Bars, a reference signal that simplifies calibration of monitors and
setting levels when duplicating and transmitting the picture.)
EFP Cameras: Electronic Field Production (EFP) cameras are similar to studio cameras in that
they are used primarily in multiple-camera switched configurations, but outside the studio
environment, for concerts, sports and live news coverage of special events. These versatile
cameras can be carried on the shoulder, or mounted on camera pedestals and cranes, with the
large, very long focal length zoom lenses made for studio camera mounting. These cameras
have no recording ability on their own, and transmit their signals back to the broadcast truck
through a triax, fibre optic or the virtually obsolete multicore cable.
Dock Cameras: Some manufacturers build camera heads which contain only the optical block,
the CCD sensors and the video encoder, and which can be used with a studio adapter for
connection to a CCU in EFP mode, or with various dock recorders for direct recording in the
preferred format, making them very versatile. This versatility does add some size and weight,
yet dock cameras are still favoured for EFP and low-budget studio use, because they tend to be
smaller, lighter and less expensive than most studio cameras.
Lipstick Cameras: "Lipstick cameras" are so called because the combined lens and sensor
block is similar in size and appearance to a lipstick container. These are either hard-mounted
in a small location, such as a race car, or fixed on the end of a boom pole. The sensor block
and lens are separated from the rest of the camera electronics, a small control box, by a long,
thin multi-conductor cable. The camera settings are manipulated from this box, while the lens
settings are normally set when the camera is mounted in place.

COLOR TEMPERATURE AND COLOR BALANCE


Color temperature is a characteristic of visible light that has important applications in
lighting, photography, videography, publishing, manufacturing, astrophysics, and other fields.
The color temperature of a light source is the temperature of an ideal black-body radiator that
radiates light of comparable hue to that of the light source. Color temperature is conventionally
stated in the unit of absolute temperature, the kelvin, having the unit symbol K.

Color temperatures over 5,000 K are called cool colors (bluish white), while lower
color temperatures (2,700-3,000 K) are called warm colors (yellowish white through red). In
photography and image processing, color balance is the global adjustment of the intensities of
the colors (typically the red, green, and blue primary colors). An important goal of this
adjustment is to render specific colors, particularly neutral colors, correctly; hence, the general
method is sometimes called gray balance, neutral balance, or white balance. Color balance
changes the overall mixture of colors in an image and is used for color correction; generalized
versions of color balance are used to make colors other than neutrals also appear correct or pleasing.
Image data acquired by sensors, either film or electronic image sensors, must be
transformed from the acquired values to new values that are appropriate for color reproduction
or display. Several aspects of the acquisition and display process make such color correction
essential, including the fact that the acquisition sensors do not match the sensors in the human
eye, that the properties of the display medium must be accounted for, and that the ambient
viewing conditions of the acquisition differ from the display viewing conditions.
The color balance operations in popular image-editing applications usually operate
directly on the red, green, and blue channel pixel values,[1][2] without respect to any color
sensing or reproduction model. When shooting film, color balance is typically achieved by
using color-correction filters over the lights or on the camera lens.
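A minimal "gray world" white-balance sketch makes the per-channel adjustment idea concrete: it assumes the average colour of a scene should be neutral and scales each channel accordingly. Real cameras and editors use more sophisticated models; the function names here are illustrative only:

```python
def gray_world_gains(avg_r, avg_g, avg_b):
    """Per-channel gains that make the image's mean colour neutral.
    Green is kept as the reference channel, a common convention."""
    return avg_g / avg_r, 1.0, avg_g / avg_b

def apply_gains(pixel, gains):
    """Scale one RGB pixel by the given gains, clipping at 8-bit full scale."""
    return tuple(min(255.0, v * g) for v, g in zip(pixel, gains))

# A warm (yellowish) cast: red high relative to blue.
gains = gray_world_gains(140.0, 120.0, 80.0)
balanced = apply_gains((140.0, 120.0, 80.0), gains)
# The mean-coloured pixel comes out neutral gray, approximately (120, 120, 120).
```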

SATELLITE COMMUNICATION
Frequency range: 5.85 GHz to 6.425 GHz for transmission and 3.625 GHz to 4.2 GHz for
reception.
The digital earth station operates in the frequency range of 5.85 GHz to 6.425 GHz for
transmission and 3.625 GHz to 4.2 GHz for reception of signals. The whole system operates to
DVB/MPEG-2 standards. The base band processor subsystem and base band monitoring
subsystem operate in the fully digital domain. An OFC (optical fibre cable) carries the digital
base band signal from the studio to the earth station site to minimize noise and interference.
The system is controlled by a PC called the NMS PC.
The compression segment has an MPEG encoder, a digital multiplexer and a digital
modulator. The monitoring and receiving segment comprises two digital receivers for
receiving and decoding programmes. The output of the modulator (70 MHz IF) is sent to an up
converter. The up-converted signal is sent to an HPA (high power amplifier), and then to a
PDA (parabolic dish antenna) for uplinking to the satellite. The uplinked signal is received
again by the same PDA for monitoring purposes. The link between the earth station and the
satellite requires a clear line of sight from earth to satellite. The uplink signal is fed from the
earth station by a large PDA. The satellite is equipped with its own dish antenna, which
receives the uplink signal and feeds it to a receiver. The signal is then amplified and shifted to
a different frequency, the downlink frequency; this is done to prevent interference between the
uplink and downlink signals. The downlinked signal is then sent to the satellite's transmitter,
which retransmits it towards earth. Each satellite has a transponder; a single antenna receives
all signals and another transmits all signals back. A satellite transmits signals towards earth in
a pattern called the satellite footprint. The footprint is strongest at its centre, and the footprint
map is used to judge whether an earth station location will be suitable for reception of the
desired signal. The parts of the DES (digital earth station) are the antenna subsystem
(including the LNA), the antenna control unit, the beacon tracking unit, the beacon tracking
receiver, the up converter system, the high power amplifier and the power system. The system
operates in 2+1 mode and is compliant with DVB/MPEG-2 standards. The base band processor
subsystem and base band monitoring system operate in the digital domain. An OFC carries the
digital base band signal from the studio to the earth station to minimize noise and interference.
The network management system (NMS) monitors and controls the baseband equipment,
compression equipment and test instruments such as the video/audio generator and video/audio
analyzer. These are provided to ensure quality of transmission and to help in troubleshooting.
The base band segment comprises baseband subsystems at the studio site and at the earth
station site, and processes two video programmes. It is monitored and controlled using a PC
placed near the base band earth station equipment, called the base band NMS PC. The
compression segment comprises MPEG encoders in 2+1 configuration for redundancy,
together with digital multiplexers and digital modulators in 1+1 configuration; it is monitored
and controlled by the compression NMS PC. The receive and monitoring segment consists of
two digital receivers for receiving and decoding the video programmes, and one ASI-to-SDI
decoder for decoding the transport stream so that the video programmes can be monitored at
the multiplexer output. The RF NMS PC is placed near the receive and monitoring segment,
and the video/audio generator is placed in the base band segment. For monitoring of video
programmes, a professional video monitor, an LCD video monitor and an audio level monitor
are provided in the base band segment. The operator console has one 14-inch professional
video monitor, a video/audio monitoring unit for quantitative monitoring of the video
programmes, and a personal computer for centralized monitoring and control of the earth
station subsystems.

Features of Earth Station

All major subsystems operate in redundant mode: in the event of a failure of the main chain,
the standby takes over immediately, without any noticeable break in service.
A fibre optic link transports two SDI video and two AES audio signals from the studio to the
earth station, separated by a distance of approximately 200 m.
System configuration is MCPC in 2+1 mode.
Base band processing is in the fully digital domain. In case the input video and audio are
analog, an A/D converter is used first to convert the analog signals into digital, to ensure
operation in the fully digital domain.
The digital encoding system is compliant with MPEG-2/DVB standards.
On-line troubleshooting is possible with the help of the converter, IRD and other associated
test and measuring equipment.
Exhaustive professional-quality measurement of video and audio.
Control and monitoring using the NMS.
Single-point remote monitoring and control at the console.

The physical configuration of the racks in the digital earth station is as follows:
Base band rack (studio)
Base band rack (earth station)
Compression rack
Receive and monitoring rack
Console
NMS

System Layout
All the above systems are located in the station as per the typical station layout, to give a
smooth flow of all signals (mainly video, audio, RF and control) and to reduce the cabling
length between racks. An OFC of 200 m length, with an NMS control cable (RG 5A), is
provided for base band signals between the studio and the earth station.

Specifications:
Electrical specifications:
System voltage: 230 V AC, single phase
Satellite communication system
System configuration: (2+1) mode with full redundancy

Transmitter:
Video/audio input parameters
No. of program inputs           : 2
Type of input format            : Analog or digital, 75 ohm
Input format (analog)           : 625-line PAL-B CCIR standard
Input level (analog)            : 1 Vpp +/-5%
A to D converter                : 10 bits
Video bandwidth                 : 5.5 MHz
Input format (digital)          : SMPTE 259M, 270 Mbps
Input level (digital)           : 800 mVpp +/-10%
No. of audio inputs             : Analog dual mono/normal stereo/joint stereo per program
Input frequency range           : 20 Hz to 20 kHz
Input standard                  : Balanced analog, 600 ohm
Input level                     : 0 dB with +/-10 dB adjustment
No. of audio inputs (digital)   : Single at specified program
Input standard                  : 110 ohm
Sampling rate                   : 32/44.1/48 kHz (selectable)
Data rate                       : 32-384 kbps

Video/Audio Compression Parameters:

Video compression     : MPEG-2 4:2:2@ML / 4:2:0@ML
Bit rate range        : 1.0 to 15 Mbps for 4:2:0, up to 50 Mbps for 4:2:2
Resolution            : 704x576 / 720x576 (selectable)
Audio coding          : MPEG Layer 2
Multiplexer O/P rate  : 1-80 Mbps
Modulation type       : QPSK (selectable)
FEC rate              : 2/3, 5/6, 7/8
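To get a feel for what the MPEG-2 encoder achieves: uncompressed SD video in the SDI domain runs at 270 Mbps, while the bit-rate range above goes down to a few Mbps. The 8 Mbps figure below is only an assumed example operating point, not a Doordarshan setting:

```python
UNCOMPRESSED_MBPS = 270.0      # SD-SDI serial rate (SMPTE 259M)
example_encoded_mbps = 8.0     # assumed operating point within the 1-15 Mbps 4:2:0 range

compression_ratio = UNCOMPRESSED_MBPS / example_encoded_mbps
# roughly 34:1 at this example rate
```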

Receiver
Down-conversion receiver frequency           : 3.6 to 4.2 GHz
C to L o/p frequency                         : 950 to 1750 MHz
Video/audio decoder and receiver monitoring  : L band
RF monitoring
IF (70 MHz) monitoring                       : using 70 MHz-to-L-band converter
L band monitoring                            : using IRD
C band monitoring                            : using downlink through satellite
Base band monitoring through router          : video and audio monitoring in transmit or receive path
RF measurement
RF parameters                                : spectrum analyzer

Video Generation and Monitoring

Video monitoring (digital)   : one 14-inch professional colour monitor and one 5.6-inch LCD
monitor in the base band rack for high-quality monitoring, and one 14-inch professional
monitor and one 4-inch LCD in the console for confidence monitoring
Video analyzer (SDI/analog)  : waveform monitor (WFM-601M) and VM-700
Video generator (SDI/analog) : TG-700
Environmental specifications
Temperature (operation) : 0 degC to 45 degC
Temperature (storage)   : -20 degC to 80 degC
Humidity                : 0% to 95%, non-condensing
Altitude                : 0 to 3000 m above MSL
NMS Functions
Monitoring of all the subsystems
Control of the subsystems
Configuration of all the subsystems
Separate monitor-and-control computers for the baseband and compression systems
Monitoring and control of the earth station subsystems from a remote computer mounted in
the console
The interface between the computer and the equipment is RS-232.

Video Audio Termination Panel

The base band segment of the system carries two programs from the studio to the earth station
equipment, separated by a distance of about 200 m. To cater to this, two video signals and two
audio signals (each one stereo) are processed. The video signals are handled in the digital
domain as SDI (serial 4:2:2 at 270 Mbps data rate) and the audio signals as AES/EBU per the
AES3-1992 standard. If the input signals are analog, A/D converters are used at the transmitter
end, which give SDI and AES outputs for operation in the fully digital domain. A spare A/D
card is mounted in the frame and wired up to the patch panel so that, in case of failure of the
main video A/D card, this spare card can take over.
The analog or digital inputs from the camera or VTR and from live events are fed to the
appropriate connectors on the video and audio termination panel, depending on whether the
signal is analog or digital. If the signal is analog, video and audio ADC cards perform the
analog-to-digital conversion of the incoming signals. The serial digital video and audio outputs
are then fed to the audio embedder through a patch panel. If the input video and audio signals
are digital, suitable patching is done on the video patch panel and audio patch panel to route
these inputs to the embedder. The dual-channel audio embedder can embed up to two
AES/EBU streams into a serial 4:2:2 video stream. In this earth station, one AES/EBU stream
is embedded into each digital video signal, so the cards serve the two program channels. The
embedder output is fed to the fibre optic transmitter.
The fibre optic transmitter takes the two embedded SDI inputs at 270 Mbps and provides a
multimode operation option for each input in accordance with SMPTE 297M. The output
signal from the optical transmitter is in optical form, which protects the signal from EM
interference and crosstalk. OFC loss is lower than coaxial loss, so the signal can travel longer
distances. In the earth station, one OFC is used to handle the two embedded SDI signals. The
channel A and channel B optical outputs from the unit are made available via SC connectors
with shutters. These two optical outputs are fed to the line interface unit and transported to the
earth station base band rack for further processing over the optical link.
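The 270 Mbps SD-SDI figure used throughout this section can be checked from first principles (a back-of-envelope calculation based on ITU-R BT.601 sampling, not on the station's documentation): luma is sampled at 13.5 MHz and each chroma component at 6.75 MHz (the 4:2:2 structure), with 10 bits per sample.

```python
Y_RATE_HZ = 13.5e6            # luma sampling rate (ITU-R BT.601)
C_RATE_HZ = 6.75e6            # each chroma component, half the luma rate (4:2:2)
BITS_PER_SAMPLE = 10

total_samples_per_sec = Y_RATE_HZ + 2 * C_RATE_HZ       # 27 Msamples/s
sdi_bit_rate = total_samples_per_sec * BITS_PER_SAMPLE  # 270 Mbps
```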
The video patch panel (2x24 way) employed in the system is a 2U unit suitable for digital
video. Two patch cords are used for making the connection through the patch panel, for either
analog or digital video input. The audio patch panel (2x24 way) is a 1U unit; again, two patch
cords are used to patch through either analog or digital input. Both patch panels are normally
configured for analog input, for video as well as audio. All the IQ modules from Snell &
Wilcox are housed in the IQH3A enclosure, which can accommodate 8 double-width or 16
single-width modules, or any combination, and is fitted with a RollCall gateway for the
2.5 Mbps RollNet network. The enclosure contains dual PSUs for redundancy; the maximum
power consumption of the unit is 225 VA. The BNC connector on the rear panel allows the
unit to be connected to the RollCall network. The bicolour LEDs V1 and V2 indicate the
positive and negative supplies: they are green if the PSU is supplying power and red otherwise.

Up Converter (1+1)
The up converter can be tuned to any frequency within the stated transmission bandwidth in
125 kHz steps. The IF input is intended for operation within an 80 MHz bandwidth centred at
70 MHz (i.e. +/-40 MHz). Due to its low phase noise and high frequency stability, the model
UC6M2D5 (Satellite Networks) meets INTELSAT, DOMSAT, EUTELSAT and regional
requirements. It can be used as a stand-alone up converter or in a 1:1 protection-switch option.
The uplink frequency for Trivandrum is 6036.5 MHz and the downlink is 3811.5 MHz.
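The uplink/downlink pair quoted above is consistent with the standard C-band transponder translation of 2225 MHz; the up converter's own job is to shift the 70 MHz IF up to the uplink frequency. A quick sanity check (the local-oscillator arithmetic is an illustration, not taken from the UC6M2D5 manual):

```python
UPLINK_MHZ = 6036.5
DOWNLINK_MHZ = 3811.5
C_BAND_TRANSLATION_MHZ = 2225.0   # standard C-band transponder shift
IF_MHZ = 70.0

# The satellite shifts the carrier down by the transponder translation.
assert UPLINK_MHZ - C_BAND_TRANSLATION_MHZ == DOWNLINK_MHZ

# A high-side-IF up converter would therefore mix the 70 MHz IF with an LO of:
lo_mhz = UPLINK_MHZ - IF_MHZ   # 5966.5 MHz
```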

Audio Processor
Designed specifically for the demands of television audio, the programmable OPTIMOD-TV
8282 digital audio processor meets the requirements of the various systems in use around the
world. It is impossible to characterize the listening quality of even the simplest limiter or
compressor on the basis of the usual specifications, because such specifications cannot
adequately describe the crucial dynamic processes that occur under program conditions.
Therefore, the only way to meaningfully evaluate the sound of an audio processor is by
subjective listening tests. Certain specifications are presented here to assure the engineer that
they are reasonable, to help plan the installation, and to help make certain comparisons with
other processing equipment. Some of the specifications are for features that are optional. The
transmitter's sampling rate can be synchronized with that of the audio processor, or can be
allowed to free-run at 32 kHz, 44.1 kHz or 48 kHz. The audio signal is sent to the digital I/O
cards and analog cards separately. These cards provide the pre-emphasis, truncation and
attenuation required on the digital signal before transmission.

Performance
Specifications for measurements from analog left/right input to analog left/right output are as
follows:
Frequency Response (all structures, measured below gain reduction and clipping thresholds,
high-pass filter off): Follows the standard 50-microsecond or 75-microsecond pre-emphasis
curve within +/-0.20 dB, 5 Hz to 15 kHz. Analog and digital left/right outputs can be
independently user-configured for flat or pre-emphasized output.
Noise: The output noise floor depends on how much gain reduction the processor is set for
(AGC and/or DENSITY), gating level, equalization, noise reduction, etc. It is primarily
governed by the dynamic range of the A/D converter. The dynamic range of the digital signal
processing is 144 dB.
Total System Distortion (de-emphasized, 100% modulation): Less than 0.01% THD, 20 Hz to
1 kHz, rising to less than 0.05% at 15 kHz. Less than 0.02% SMPTE IM distortion.
Total System Separation: Greater than 80 dB, 20 Hz to 15 kHz.
Polarity (PROTECTION or BYPASS structure): Absolute polarity maintained; a positive-going
signal on the input results in a positive-going signal on the output.
Impedance: 600 ohm or 10 kohm load impedance, electronically balanced, jumper-selectable.
Common Mode Rejection: Greater than 70 dB, 50-60 Hz; greater than 45 dB, 60 Hz to 15 kHz.
Sensitivity: -40 dBu to +20 dBu to produce 10 dB gain reduction at 1 kHz.
Maximum Output Level: +23.7 dBu into a 600 ohm or greater balanced load.
Connector: XLR-type, male, EMI-suppressed. Pin 1 chassis; pins 2 and 3 electronically
balanced, floating and symmetrical.
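The audio levels above are quoted in dBu, where 0 dBu corresponds to 0.7746 V rms (the voltage that dissipates 1 mW in 600 ohms). A small conversion helper, for orientation only:

```python
def dbu_to_vrms(dbu):
    """Convert a level in dBu to volts rms (0 dBu = 0.7746 V rms)."""
    return 0.7746 * 10 ** (dbu / 20.0)

# The +23.7 dBu maximum output level works out to roughly 11.9 V rms.
v_max = dbu_to_vrms(23.7)
```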

Transmitting Antenna
A 6.3 m diameter antenna with a simplified manual tracking device offers easy erection, ease
of maintenance and high reliability.
Reflector structure
The 6.3 m diameter antenna is made up of four quarter segments. Each quarter is made up of
10 panels fixed on five trusses. The panels fixed to the trusses are made of fine expanded
aluminium mesh, strengthened with channel sections and tee sections whose ends are fixed to
the backup structure. The trusses are composed of aluminium square tubes, and the welded
backup structure is made up of a hub and 20 trusses. The hub and trusses are constructed in
such a way that they contribute to a high level of surface accuracy.
Mount structure
A simple tubular steel space frame makes up most of the mount structure. It allows rotation
about the x-axis as well as the y-axis. The x-axis drive rod is connected between the top of the
mount structure and the concrete foundation. The y-axis drive rod is connected between the
base of the x-axis bearing mount and the reflector backup structure on the left-hand side, as
viewed from the rear of the antenna. The mount is rigidly attached to the north-facing concrete
base, such that it can survive wind speeds of up to 200 km/h.
Drive mechanism

It has a telescopic pipe arrangement with a screw rod within it, along with a manual handle.
Mechanical angle indicators along the screw rod indicate the exact position and angle of the
antenna with respect to both axes.
Most parts of the panels and antenna structure are made of an aluminium alloy which has good
corrosion resistance and yield strength. The reflector is treated in the following order before
installation:
(a) Etch primer is applied after caustic soda and acid treatment
(b) Painted with white matt paint
The mount is treated as follows:
(a) A hot dip which galvanizes all steel parts
(b) Etch primer treatment
(c) White enamel paint applied as a final coating
Fixing the feed onto the antenna
The feed is supported by a set of four pipes called a quadripod. It is fixed before the whole
antenna structure is hoisted; that is, it is fixed on the ground itself before the antenna structure
is erected. Care should be taken that the feed is at the exact focus of the reflector: a maximum
tolerance of +/-3 mm is allowed for the separation between the actual focus and the feed
position. The feed entrances and cable output ports are covered with waterproof Teflon sheet
to prevent the entry of moisture into the arrangement.
The LNBC (Low Noise Block Converter) and cables are connected to the feed output. The x-y
adjustment is then done and fixed. The bolts are tightened with care and the arrangement is set.
Care should be taken while lifting and fixing the whole apparatus to prevent any damage.
The signals received by the antenna are given to the feed, and from there to the LNB, from
where they are given to the receiver. The receiver changes the frequency band of the signal so
as to decrease losses through noise. The antenna pointing parameters for the two satellite
positions are:

Sat long   Long    Lat    Y angle   Y length   X angle   X length   Az       El
74.0       76.95   8.55   -3.46     2778.37    -10.0     3269.0     119.12   79.37
93.5       "       "      19.37     2424.36    -10.13    3265.4     116.58   68.23

These signals can now be
observed on a TV screen. This is the principle used in home dish antennas and by cable
operators for broadcasting in a small area. For retransmitting these signals over the air, some
changes must be made: the signals have to be set properly according to the given
specifications. The signal is therefore next fed to a control console. Here the different
programmes or channels are first selected, and then each channel's visual and aural properties
can be set properly before transmission. The visual properties can be seen on the video
waveform screen.
Video waveform modifications

As can be observed in the video waveform, 625 lines make up one frame of the video which
appears on the TV screen, divided into odd lines and even lines (the two fields) on either side
of the video waveform. In this waveform the peak-to-peak voltage is 1 V. The sync pulse,
which extends below the other parts of the waveform, has a voltage of 0.3 V; this is the
standard level for horizontal as well as vertical sync. The next part is the colour burst, which
controls the colour characteristics of the video. The remaining 0.7 V is the video level. Many
characteristics of the video signal, such as its brightness and chroma, can be modified at this
stage before transmission. A colour stability amplifier is used here to regenerate the sync,
colour burst and brightness level of the signal. Often the signal received from the antenna does
not conform to the standards, and it may need modification before transmission so that it can
be received uniformly by all viewers.
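The levels described above fit together as follows; the constants simply restate the figures in the text (a consistency check, not a signal-processing implementation):

```python
SYNC_V = 0.3          # sync tip region below blanking
VIDEO_V = 0.7         # blanking (black) to peak white
COMPOSITE_PP_V = SYNC_V + VIDEO_V   # full composite swing

LINES_PER_FRAME = 625  # PAL-B, interlaced into odd and even fields

assert abs(COMPOSITE_PP_V - 1.0) < 1e-9   # 1 V peak-to-peak, as stated
```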
The 5 kW and 10 kW transmitters of the TW200 HP series operate in Band I or Band III and
are equipped with two drivers operating in a passive reserve mode. The sound and vision
channels are amplified separately. The transmitter is designed to operate in all the negative-
modulation standards with PAL, NTSC and SECAM colour systems. Each transmitter is
designed for a precise output power and a specific frequency, but is built using a series of
common modules based on the same technology. This standardization has advantages:
maintenance personnel trained on one type can work with the other type as well, and spare
parts can be shared.
All amplifiers are wideband devices (170 to 230 MHz in Band III and 44 to 88 MHz in Band I)
and can operate in Band III and Band I for both sound and vision. In the driver, the audio and
video input signals are converted to sound and vision IF signals. These IF signals are processed
prior to conversion to the RF output frequencies and then amplified. The attenuated-sideband
pattern of the 5 and 10 kW transmitters is obtained through the use of a lithium niobate surface
acoustic wave (SAW) filter. Each amplifier is equipped with AGC. The driver also contains a
vision synchronization detection circuit used to switch the transmitter on and off automatically,
and the transmitter can be controlled locally or remotely. All IF and RF interconnections use
50 ohm coaxial links to simplify maintenance. Through the use of redundant amplifiers and
power supplies, broadcasting can continue at reduced power levels in the event of a failure of
several transistor amplifiers or a power supply. The man-machine interface ensures high user-
friendliness, both in terms of operation and of maintenance: system information and controls
are accessed through a touch screen controlled by a microprocessor.
Description of the transmitter
The transmitter is housed in a single cabinet with which the diplexer and filter assembly is
associated. As discussed above, it has two drivers, two RF amplification channels, power
supplies with an associated coordination and control system, a diplexer and an RF filter. All
amplifiers, power supplies and driver components are plug-in drawers, and the subassemblies
are designed for easy access and removal. The main switch is designed for use with all types
of three-phase mains, with or without neutral, at 208 V or 480 V.

Driver
This subassembly generates the vision and sound signals corresponding to the selected
standard from the input video and audio signals, performing the processing and conversion
required to generate the filtered vision and sound signals in the selected RF band. The driver
also provides phase and amplitude corrections to ensure that the linearity specifications comply
with the various standards, and it detects the presence or absence of the video and audio
signals applied to it. The driver consists of plug-in modules mounted in a single PCB rack, six
units high. Each driver has five modules connected to the motherboard, and each can be
replaced separately without changing the entire assembly. Maximum output power is 19 dBm
for vision signals and 13 dBm for sound signals.
Local driver controls are on the local frequency and interface board; in the maintenance mode
of the transmitter these controls are active. The two drivers and the associated passive reserve
relays are directly controlled by the control system. Each driver has a +/-12 V power supply
and its own internal oscillator; however, the drivers can be made to work with an external
frequency synthesizer, and in case of synthesizer failure the change to the internal oscillator
takes place automatically. In the dual-drive configuration the system automatically switches
over to the reserve driver.
The LCD screen provides control system monitoring and analysis. The amplifier drawers are
powered by plug-in high-power supplies (one power supply for two amplifiers). These highly
reliable units generate 50 V at 120 A. The RF outputs of the amplifiers are coupled with a
balanced Wilkinson combiner, ensuring isolation of approximately 18 dB between outputs.
This design makes it possible to remove an amplifier drawer while on the air without
disrupting the broadcast; a faulty amplifier can thus be replaced with a spare drawer, and a
sound amplifier can even be used in place of a vision amplifier.
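Why a failed amplifier reduces the output rather than killing it: for N identical amplifiers summed in an ideal lossless Wilkinson-style combiner, losing one drops the combined power to ((N-1)/N)^2 of nominal. This is a textbook result for ideal combining, shown here as an illustration rather than as this transmitter's measured behaviour:

```python
import math

def power_fraction_after_failures(n_amps, n_failed):
    """Fraction of nominal output power with n_failed of n_amps amplifiers
    dead, assuming ideal lossless combining."""
    working = n_amps - n_failed
    return (working / n_amps) ** 2

def fraction_to_db(frac):
    """Express a power fraction in decibels."""
    return 10 * math.log10(frac)

# With two combined amplifiers and one failed, output falls to 25% (-6 dB).
```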
CPU or Control System
This is a microprocessor board with an LCD screen coupled to it, providing a command and
control facility. Safety is achieved through hardwired systems to maintain operation, safety
precautions and optimum performance. The CPU can control:

Sound-to-vision ratio
System power
Type of pilot wave
Synthesizer frequency
Single drive or dual drive
Filtering assembly
It is formed by a diplexer rejecting the sound signal and an RF band-pass filter introducing
two rejectors. A wave counter reset signal is sent to sample the vision output signal.
Transmitter cooling
The amplifiers are cooled with pressurized air through an external vertical system that lets in
filtered air.

Protection Systems
Thermal protection: The transmitter is protected against excessive temperature rise. If the air
temperature exceeds 45 degC, the output power is reduced, and when the temperature exceeds
60 degC the transmitter is shut down.
SWR protection: This is independent for each high-gain amplifier. If a faulty amplifier is
detected, it can be restarted; if the failure causes the reflected power to rise, the transmitter is
cut off.
Power surge protection: The amplifier has a fast protection circuit that acts in the event of a
power surge at the amplifier locations.
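The thermal-protection behaviour described above can be written as a small state function; the thresholds come from the text, but the function itself is only an illustration, not the transmitter's real control logic:

```python
def thermal_action(air_temp_c):
    """Protective action for a given cooling-air temperature (deg C)."""
    if air_temp_c > 60:
        return "shutdown"       # transmitter switched off entirely
    if air_temp_c > 45:
        return "reduce_power"   # output power backed off
    return "normal"
```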

CONCLUSION
Television transmission consists of the inception of the signal, encoding, decoding and
reception at the required place. A standard television set comprises multiple internal electronic
circuits, including those for receiving and decoding broadcast signals. A visual display device
which lacks a tuner is properly called a monitor rather than a television. A television system
may use different technical standards such as digital television (DTV) and high-definition
television (HDTV). Television systems are also used for surveillance, industrial process
control, and guiding of weapons, in places where direct observation is difficult or dangerous.
Broadcasters using analog television systems encode their signal using NTSC, PAL or
SECAM analog encoding and then modulate this signal onto a VHF or UHF carrier. In India,
the Phase Alternating Line (PAL) technique is used for television broadcasting.
Broadcasting starts from the camera present in the studio, from where the signal goes to the
camera control units (CCUs). From the CCUs the signals move to the vision mixer, where
editing is done using a character generator; VTR output is also given to the vision mixer. The
signal then goes to the MSR and, through the earth station, is transmitted to the satellite (say,
INSAT-4B). The satellite signal is received by the TV tower, which transfers it to the antenna.
From the antenna the receiver receives the signal, and the process is complete.
