3rd Lectures in TLE7-Computer - 3rd Quarter Period
TOPIC: The Computer Monitor
What is a Computer Monitor?
A computer monitor is an output device that displays information in pictorial or textual form. A discrete monitor
comprises a visual display, support electronics, power supply, housing, electrical connectors, and external user
controls.
The display in modern monitors is typically an LCD with LED backlight, having by the 2010s replaced CCFL backlit LCDs.
Before the mid-2000s, most monitors used a cathode-ray tube (CRT) as the image output technology.[1] A monitor is
typically connected to its host computer via DisplayPort, HDMI, USB-C, DVI, or VGA. Less commonly, monitors use proprietary connectors and signals to connect to a computer.
Originally, computer monitors were used for data processing while television sets were used for video. From the 1980s
onward, computers (and their monitors) have been used for both data processing and video, while televisions have
implemented some computer functionality. In the 2000s, the typical display aspect ratio of both televisions and
computer monitors changed from 4:3 to 16:9.[2][3]
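The shift from 4:3 to 16:9 can be made concrete with a short sketch: dividing a resolution's width and height by their greatest common divisor yields the aspect ratio. The specific resolutions below are illustrative examples, not figures from this lecture.

```python
from math import gcd

def aspect_ratio(width, height):
    """Reduce a pixel resolution to its simplest width:height ratio."""
    d = gcd(width, height)
    return (width // d, height // d)

# A common pre-2000s monitor resolution reduces to 4:3:
print(aspect_ratio(1024, 768))   # (4, 3)
# A common modern widescreen resolution reduces to 16:9:
print(aspect_ratio(1920, 1080))  # (16, 9)
```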
Modern computer monitors are often functionally interchangeable with television sets and vice versa. As most
computer monitors do not include integrated speakers, TV tuners, or remote controls, external components such as
a DTA box may be needed to use a computer monitor as a TV set.
TYPES OF COMPUTER MONITORS/TECHNOLOGIES
Multiple technologies have been used for computer monitors. Until the 21st century, most used cathode-ray tubes (CRTs), but these have largely been superseded by LCD monitors.
1. Cathode-ray tube
The first computer monitors used cathode-ray tubes (CRTs). Prior to the advent of home computers in the late
1970s, it was common for a video display terminal (VDT) using a CRT to be physically integrated with a keyboard
and other components of the workstation in a single large chassis, typically limiting them to emulation of a paper
teletypewriter, thus the early epithet of 'glass TTY'. The display was monochromatic and far less sharp and detailed
than on a modern monitor, necessitating the use of relatively large text and severely limiting the amount of
information that could be displayed at one time. High-resolution CRT displays were developed for specialized
military, industrial and scientific applications, but they were far too costly for general use; wider commercial use became possible after the release of the slow but affordable Tektronix 4010 terminal in 1972.
Some of the earliest home computers (such as the TRS-80 and Commodore PET) were limited to monochrome CRT
displays, but color display was already available on a few MOS 6500 series-based machines (such as the Apple II computer and the Atari 2600 console, both introduced in 1977), and color output was a specialty of the more graphically sophisticated Atari 800 computer, introduced in 1979. These computers could be connected to the antenna terminals of an ordinary color TV set or used with a purpose-made color CRT monitor for optimum resolution and color quality. Lagging several years behind, IBM introduced the Color Graphics Adapter in 1981,
which could display four colors with a resolution of 320 × 200 pixels, or it could produce 640 × 200 pixels with two
colors. In 1984 IBM introduced the Enhanced Graphics Adapter which was capable of producing 16 colors and had
a resolution of 640 × 350.
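The CGA figures above trade resolution against color count because both graphics modes had to fit in the adapter's 16 KB of video memory, a well-known hardware fact. The sketch below shows that arithmetic in its simplest form; real adapters organized memory in more complicated layouts (for example, EGA used bit planes), so this is an estimate of raw pixel data only.

```python
from math import log2

def framebuffer_bytes(width, height, colors):
    """Bytes of raw pixel data for one frame at the given color count."""
    bits_per_pixel = int(log2(colors))  # e.g. 4 colors -> 2 bits per pixel
    return width * height * bits_per_pixel // 8

# Both CGA graphics modes fit the same 16,000-byte budget:
print(framebuffer_bytes(320, 200, 4))   # 16000
print(framebuffer_bytes(640, 200, 2))   # 16000
# EGA's 640 x 350 16-color mode needs far more pixel data:
print(framebuffer_bytes(640, 350, 16))  # 112000
```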
By the end of the 1980s, color progressive-scan CRT monitors were widely available and increasingly affordable, and the sharpest prosumer monitors could clearly display high-definition video. Meanwhile, the HDTV standardization efforts of the 1970s and 1980s repeatedly failed, leaving consumer SDTVs to stagnate increasingly far behind the capabilities of computer CRT monitors well into the 2000s. During the following decade, maximum display resolutions gradually increased and prices continued to fall, as CRT technology remained dominant in the PC monitor market into the new millennium, partly because it remained cheaper to produce.[9]
CRTs still offer color, grayscale, motion, and latency advantages over today's LCDs, but improvements to the latter have made these advantages much less obvious. The dynamic range of early LCD panels was very poor, and although text and other motionless graphics were sharper than on a CRT, an LCD characteristic known as pixel lag caused moving graphics to appear noticeably smeared and blurry.
2. Liquid-crystal display
Multiple technologies have been used to implement liquid-crystal displays (LCDs). Throughout the 1990s, the primary use of LCD technology in computer displays was in laptops, where the lower power consumption, lighter weight, and smaller physical size of LCDs justified their higher price versus a CRT. Commonly, the same laptop would be offered with an assortment of display options at increasing price points: (active or passive) monochrome, passive color, or active-matrix color (TFT). As volume and manufacturing capability improved, the monochrome and passive color technologies were dropped from most product lines.
TFT-LCD is a variant of LCD which is now the dominant technology used for computer monitors.[10]
The first standalone LCD monitors appeared in the mid-1990s, selling at high prices. As prices declined, they became more popular, and by 1997 they were competing with CRT monitors. Among the first desktop LCD computer monitors were
the Eizo FlexScan L66 in the mid-1990s, the SGI 1600SW, Apple Studio Display and the ViewSonic VP140[11] in
1998. In 2003, LCDs outsold CRTs for the first time, becoming the primary technology used for computer
monitors.[9] The physical advantages of LCD over CRT monitors are that LCDs are lighter, smaller, and consume
less power. In terms of performance, LCDs produce little or no flicker (reducing eyestrain),[12] a sharper image at native resolution, and better checkerboard contrast. On the other hand, CRT monitors have superior blacks, viewing angles, and response times, and can use arbitrarily lower resolutions without aliasing; their flicker can be reduced with higher refresh rates,[13] though this flicker can also be used to reduce motion blur compared to less flickery displays such as most LCDs.[14] Many specialized fields such as vision science remain dependent on CRTs: the best LCD monitors have achieved only moderate temporal accuracy, and so can be used only if their poor spatial accuracy is unimportant.
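The refresh-rate point above reduces to simple arithmetic: the time between screen refreshes is the reciprocal of the refresh rate, so higher rates shorten the interval in which a CRT's phosphor glow fades and flicker becomes visible. The rates below are merely illustrative.

```python
def frame_time_ms(refresh_hz):
    """Time between successive screen refreshes, in milliseconds."""
    return 1000 / refresh_hz

# Higher refresh rates mean shorter, less noticeable refresh intervals:
for hz in (60, 75, 100):
    print(f"{hz} Hz -> {frame_time_ms(hz):.1f} ms per refresh")
```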
High dynamic range (HDR)[13] has been implemented into high-end LCD monitors to improve grayscale accuracy.
Since around the late 2000s, widescreen LCD monitors have become popular, in part because television series, motion pictures, and video games have transitioned to widescreen, leaving squarer monitors ill-suited to displaying them correctly.
Organic light-emitting diode (OLED) monitors provide most of the benefits of both LCD and CRT monitors with few
of their drawbacks, though much like plasma panels or very early CRTs they suffer from burn-in, and remain very
expensive.