
Lectures in TLE7-Computer
TOPIC: The Computer Monitor
What is a Computer Monitor?
A computer monitor is an output device that displays information in pictorial or textual form. A discrete monitor
comprises a visual display, support electronics, power supply, housing, electrical connectors, and external user
controls.
The display in modern monitors is typically an LCD with an LED backlight, which by the 2010s had replaced CCFL-backlit LCDs.
Before the mid-2000s, most monitors used a cathode-ray tube (CRT) as the image output technology.[1] A monitor is
typically connected to its host computer via DisplayPort, HDMI, USB-C, DVI, or VGA. Less commonly, monitors use
proprietary connectors and signals to connect to a computer.

Originally, computer monitors were used for data processing while television sets were used for video. From the 1980s
onward, computers (and their monitors) have been used for both data processing and video, while televisions have
implemented some computer functionality. In the 2000s, the typical display aspect ratio of both televisions and
computer monitors changed from 4:3 to 16:9.[2][3]
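The relationship between a resolution and its aspect ratio is simple arithmetic: divide width and height by their greatest common divisor. A minimal sketch in Python (the function name is illustrative, not part of any standard API):

```python
from math import gcd

def aspect_ratio(width: int, height: int) -> str:
    """Reduce a pixel resolution to its simplest width:height ratio."""
    d = gcd(width, height)
    return f"{width // d}:{height // d}"

print(aspect_ratio(1024, 768))   # 4:3, a classic monitor resolution
print(aspect_ratio(1920, 1080))  # 16:9, a typical widescreen resolution
```

For example, 1024 × 768 reduces to 4:3, while 1920 × 1080 reduces to 16:9.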
Modern computer monitors are often functionally interchangeable with television sets and vice versa. As most
computer monitors do not include integrated speakers, TV tuners, or remote controls, external components such as
a DTA box may be needed to use a computer monitor as a TV set.
TYPES OF COMPUTER MONITORS/TECHNOLOGIES
Multiple technologies have been used for computer monitors. Until the 21st century most monitors used cathode-ray
tubes, but these have largely been superseded by LCD monitors.

1. Cathode-ray tube

The first computer monitors used cathode-ray tubes (CRTs). Prior to the advent of home computers in the late
1970s, it was common for a video display terminal (VDT) using a CRT to be physically integrated with a keyboard
and other components of the workstation in a single large chassis, typically limiting them to emulation of a paper
teletypewriter, thus the early epithet of 'glass TTY'. The display was monochromatic and far less sharp and detailed
than on a modern monitor, necessitating the use of relatively large text and severely limiting the amount of
information that could be displayed at one time. High-resolution CRT displays were developed for specialized
military, industrial and scientific applications but they were far too costly for general use; wider commercial use
became possible after the release of the slow but affordable Tektronix 4010 terminal in 1972.

Some of the earliest home computers (such as the TRS-80 and Commodore PET) were limited to monochrome CRT
displays, but color display capability was already available on a few MOS 6500 series-based machines (such as the
Apple II computer and the Atari 2600 console, both introduced in 1977), and color output was a specialty of the
more graphically sophisticated Atari 800 computer, introduced in 1979. These machines could be connected to the
antenna terminals of an ordinary color TV set or used with a purpose-made color CRT monitor for optimum
resolution and color quality.
resolution and color quality. Lagging several years behind, in 1981 IBM introduced the Color Graphics Adapter,
which could display four colors with a resolution of 320 × 200 pixels, or it could produce 640 × 200 pixels with two
colors. In 1984 IBM introduced the Enhanced Graphics Adapter which was capable of producing 16 colors and had
a resolution of 640 × 350.
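The memory these early adapters needed follows from the resolution and the number of colors: each pixel needs enough bits to index one of the available colors. A rough sketch in Python (the helper function is illustrative, not a real graphics API):

```python
import math

def framebuffer_bytes(width: int, height: int, colors: int) -> int:
    """Bytes of video memory for one frame, using the minimum
    whole number of bits per pixel for the given palette size."""
    bits_per_pixel = math.ceil(math.log2(colors))
    return width * height * bits_per_pixel // 8

print(framebuffer_bytes(320, 200, 4))   # CGA, 4 colors:  16000 bytes
print(framebuffer_bytes(640, 200, 2))   # CGA, 2 colors:  16000 bytes
print(framebuffer_bytes(640, 350, 16))  # EGA, 16 colors: 112000 bytes
```

Both CGA graphics modes need the same 16,000 bytes, which is why the adapter could trade resolution for color depth within the same video memory; the EGA mode requires considerably more.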

By the end of the 1980s, color progressive-scan CRT monitors were widely available and increasingly affordable, and
the sharpest prosumer monitors could clearly display high-definition video. Meanwhile, repeated attempts at HDTV
standardization from the 1970s through the 1980s failed, leaving consumer SDTVs to stagnate increasingly far
behind the capabilities of computer CRT monitors well into the 2000s. During the following decade, maximum
display resolutions gradually increased and prices continued to fall, as CRT technology remained dominant in the
PC monitor market into the new millennium, partly because it remained cheaper to produce.[9]
CRTs still offer color, grayscale, motion, and latency advantages over today's LCDs, but improvements to the latter
have made them much less obvious. The dynamic range of early LCD panels was very poor, and although text and
other motionless graphics were sharper than on a CRT, an LCD characteristic known as pixel lag caused moving
graphics to appear noticeably smeared and blurry.

2. Liquid-crystal display

Multiple technologies have been used to implement liquid-crystal displays (LCDs). Throughout the 1990s, the
primary use of LCDs in computing was in laptops, where the lower power consumption, lighter weight, and smaller
physical size of LCDs justified their higher price versus a CRT. Commonly, the same laptop would be offered with an
assortment of display options at increasing price points: (active or passive) monochrome, passive color, or
active-matrix color (TFT). As volume and manufacturing capability improved, the monochrome and passive color
technologies were dropped from most product lines.

TFT-LCD is a variant of LCD which is now the dominant technology used for computer monitors.[10]

The first standalone LCD monitors appeared in the mid-1990s, selling at high prices. As prices declined they became
more popular, and by 1997 they were competing with CRT monitors. Among the first desktop LCD computer monitors
were the Eizo FlexScan L66 in the mid-1990s, and the SGI 1600SW, Apple Studio Display, and ViewSonic VP140[11] in
1998. In 2003, LCDs outsold CRTs for the first time, becoming the primary technology used for computer
monitors.[9] The physical advantages of LCDs over CRT monitors are that LCDs are lighter, smaller, and consume
less power. In terms of performance, LCDs produce little or no flicker (reducing eyestrain),[12] a sharper image at
native resolution, and better checkerboard contrast. On the other hand, CRT monitors have superior blacks,
viewing angles, and response time; they can use arbitrary lower resolutions without aliasing; and their flicker can
be reduced with higher refresh rates,[13] though this flicker can also be exploited to reduce motion blur compared
to less flickery displays such as most LCDs.[14] Many specialized fields, such as vision science, remain dependent
on CRTs; the best LCD monitors have achieved only moderate temporal accuracy and can be used only when their poor
spatial accuracy is unimportant.
High dynamic range (HDR)[13] has been implemented in high-end LCD monitors to improve grayscale accuracy.
Since around the late 2000s, widescreen LCD monitors have become popular, in part because television series,
motion pictures, and video games have transitioned to widescreen, which makes squarer monitors unsuited to
displaying them correctly.

3. Organic light-emitting diode

Organic light-emitting diode (OLED) monitors provide most of the benefits of both LCD and CRT monitors with few
of their drawbacks, though much like plasma panels or very early CRTs they suffer from burn-in, and remain very
expensive.

KEY TERMS ABOUT THIS TOPIC


1. OUTPUT DEVICE - An output device is any piece of computer hardware that converts information or data into a
human-perceptible form or, historically, into a physical machine-readable form for use with other non-
computerized equipment.
2. ELECTRONIC VISUAL DISPLAY - An electronic visual display is a display device that can display images, video, or
text that is transmitted electronically. Electronic visual displays include television sets, computer monitors, and
digital signage.
3. POWER SUPPLY - A power supply is an electrical device that supplies electric power to an electrical load. The
main purpose of a power supply is to convert electric current from a source to the correct voltage, current, and
frequency to power the load.
4. ELECTRICAL CONNECTOR - Components of an electrical circuit are electrically connected if an electric current
can run between them through an electrical conductor. An electrical connector is an electromechanical device
used to create an electrical connection between parts of an electrical circuit, or between different electrical
circuits, thereby joining them into a larger circuit.
5. LED-BACKLIT LCD - An LED-backlit LCD is a liquid-crystal display that uses LEDs for backlighting instead of
traditional cold cathode fluorescent (CCFL) backlighting. LED-backlit displays use the same TFT LCD (thin-film-
transistor liquid-crystal display) technologies as CCFL-backlit LCDs, but offer a variety of advantages over them.
6. DISPLAYPORT - DisplayPort (DP) is a digital display interface developed by a consortium of PC and chip
manufacturers and standardized by the Video Electronics Standards Association (VESA). It is primarily used to
connect a video source to a display device such as a computer monitor. It can also carry audio, USB, and other
forms of data.
7. HDMI - High-Definition Multimedia Interface (HDMI) is a proprietary audio/video interface for transmitting
uncompressed video data and compressed or uncompressed digital audio data from an HDMI-compliant source
device, such as a display controller, to a compatible computer monitor, video projector, digital television, or
digital audio device.[3] HDMI is a digital replacement for analog video standards.
8. USB-C - USB-C, or USB Type-C, is a 24-pin connector (not a protocol) that supersedes previous USB connectors
and can carry audio, video, and other data, e.g., to drive multiple displays or store a backup to an external
drive. It can also provide and receive power, such as powering a laptop or a mobile phone. It is used not only
by USB technology but also by other protocols, including Thunderbolt, PCIe, HDMI, and DisplayPort, and it is
extensible to support future standards.
9. DVI - Digital Visual Interface (DVI) is a video display interface developed by the Digital Display Working Group
(DDWG). The digital interface is used to connect a video source, such as a video display controller, to a display
device, such as a computer monitor. It was developed with the intention of creating an industry standard for the
transfer of uncompressed digital video content.
10. VGA CONNECTOR - The Video Graphics Array (VGA) connector is a standard connector used for computer video
output. Originating with the 1987 IBM PS/2 and its VGA graphics system, the 15-pin connector went on to
become ubiquitous on PCs, as well as many monitors, projectors and high-definition television sets.
