Computer Monitors

      The computer monitor is an output device that is part of your computer's display
system. A cable connects the monitor to a video adapter (video card) that is installed in
an expansion slot on your computer’s motherboard. This system converts signals into text
and pictures and displays them on a TV-like screen (the monitor).

The computer sends a signal to the video adapter, telling it what character, image or
graphic to display. The video adapter converts that signal to a set of instructions that tell
the display device (monitor) how to draw the image on the screen.

Cathode Ray Tube (CRT)


The CRT, or Cathode Ray Tube, is the "picture tube" of your monitor.
Although it is a large vacuum tube, it's shaped more like a bottle. The
tube tapers near the back, where there's a negatively charged cathode,
or "electron gun". The electron gun shoots electrons at the back of the
positively charged screen, which is coated with phosphor. This excites
the phosphors, causing them to glow as individual dots called pixels
(picture elements). The image you see on the monitor's screen is made
up of thousands of these tiny dots (pixels). If you've ever seen a
child's Lite-Brite toy, then you have a good idea of the concept. The
distance between the pixels has a lot to do with the quality of the
image. If the distance between pixels on a monitor screen is too great, the picture will
appear "fuzzy", or grainy. The closer together the pixels are, the sharper the image on
screen. The distance between pixels on a computer monitor screen is called its dot pitch
and is measured in millimeters. You should try to get a monitor with a dot pitch of
.28 mm or less.
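
As a rough illustration of why dot pitch matters, here is a small back-of-the-envelope
sketch. The 320 mm viewable width is just an assumed example figure (not from this
article), and dividing the width by the pitch is only an approximation, but it shows why
a finer pitch lets the screen resolve more dots across.

    # Rough illustration only: estimate how many phosphor dots fit across a screen
    # of a given viewable width, and whether a 1024-pixel-wide image would stay sharp.
    # The 320 mm width below is an assumed example value.

    def dots_across(viewable_width_mm: float, dot_pitch_mm: float) -> int:
        """Approximate number of phosphor dots across the screen."""
        return int(viewable_width_mm / dot_pitch_mm)

    viewable_width_mm = 320.0   # assumed viewable width, roughly a 17-inch CRT
    for pitch_mm in (0.25, 0.28, 0.39):
        dots = dots_across(viewable_width_mm, pitch_mm)
        verdict = "fits" if dots >= 1024 else "starts to blur"
        print(f"{pitch_mm:.2f} mm pitch -> about {dots} dots across; "
              f"a 1024-wide image {verdict}")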

Note: From an environmental point of view, the monitor is the most difficult computer
peripheral to dispose of because of the lead it contains.

There are a couple of electromagnets (yokes) around the collar of the tube that actually
bend the beam of electrons. The beam scans (is bent) across the monitor from left to right
and top to bottom to create, or draw, the image line by line. The number of times in one
second that the electron gun redraws the entire image is called the refresh rate and is
measured in Hertz (Hz).
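
To get a feel for what the refresh rate means in time, here is a quick sketch. The
refresh rates and the 768-line screen are example values (not from this article), and
the line count ignores the blanking intervals between lines and frames.

    # Example values only: relate refresh rate (Hz) to time per frame and lines per second.
    visible_lines = 768                                # e.g. a 1024 X 768 display

    for refresh_hz in (60, 75, 85):
        frame_time_ms = 1000.0 / refresh_hz            # time the gun has to redraw the image
        lines_per_second = refresh_hz * visible_lines  # ignoring blanking intervals
        print(f"{refresh_hz} Hz: {frame_time_ms:.1f} ms per frame, "
              f"about {lines_per_second:,} visible lines per second")
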
If the scanning beam hits each and every line of pixels, in succession, on each pass, then
the monitor is known as a non-interlaced monitor. A non-interlaced monitor is preferred
over an interlaced monitor. The electron beam on an interlaced monitor scans the
odd-numbered lines on one pass, then scans the even-numbered lines on the second pass.
This results in an almost imperceptible flicker that can cause eye-strain. This type of
eye-strain can result in blurred vision, sore eyes, headaches and even nausea. Don't buy
an interlaced monitor; they can be a real pain in the ... ask your optometrist.
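
The difference between the two scanning schemes is easiest to see by listing the order
in which the lines get drawn. The toy 10-line screen below is just an illustration of
the odd-then-even pattern described above.

    # Toy illustration of non-interlaced vs. interlaced scan order on a 10-line screen.
    lines = list(range(1, 11))                         # line numbers 1..10

    progressive = lines                                # non-interlaced: every line, in order
    interlaced = [n for n in lines if n % 2 == 1] + [n for n in lines if n % 2 == 0]
    # interlaced: odd-numbered lines on the first pass, even-numbered on the second

    print("non-interlaced:", progressive)
    print("interlaced:    ", interlaced)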

Interlaced computer monitors are getting harder to find (good!), but they are still out
there, so keep that in mind when purchasing a monitor and watch out for that "steal of a
deal".

Video Technologies
      Video technologies differ in many ways. However, the two major differences are
resolution and the number of colors they can produce at those resolutions.

Resolution
      Resolution is the number of pixels that are used to draw an image on the screen. If
you could count the pixels in one horizontal row across the top of the screen, and the
number of pixels in one vertical column down the side, that would properly describe the
resolution that the monitor is displaying. It’s given as two numbers. If there were 800
pixels across and 600 pixels down the side, then the resolution would be 800 X 600.
Multiply 800 times 600 and you’ll get the number of pixels used to draw the image
(480,000 pixels in this example). A monitor must be matched with the video card in the
system. The monitor has to be capable of displaying the resolutions and colors that the
adapter can produce. It works the other way around too. If your monitor is capable of
displaying a resolution of 1,024 X 768 but your adapter can only produce 640 X 480,
then that’s all you’re going to get.
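
Carrying that 800 X 600 arithmetic a little further, the short sketch below multiplies
width by height for a few common resolutions and, as an extra assumed illustration (not a
figure from this article), shows roughly how much video memory one full-screen image
needs at one byte per pixel, i.e. in a 256-color mode.

    # Width times height gives the pixel count, as in the 800 X 600 example above.
    # The memory figure assumes a 256-color mode (1 byte per pixel).
    resolutions = [(640, 480), (800, 600), (1024, 768)]

    for width, height in resolutions:
        pixels = width * height
        kilobytes = pixels / 1024                      # 1 byte per pixel -> size in KB
        print(f"{width} X {height}: {pixels:,} pixels, "
              f"about {kilobytes:.0f} KB at 256 colors")
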
      When we talk about the different technologies, we’re talking about the video card and
monitor that make up that display system. Also, standards describe the basic number of
colors and resolutions for each technology, but individual manufacturers always take
liberties, providing options and enhancements that are designed to make their product
more appealing to the end user. This is, of course, how new standards come about.

Monochrome
      Monochrome monitors are very basic displays that produce only one color. The basic
text mode in DOS is 80 characters across and 25 down. When graphics were first
introduced, they were fairly rough by today's standards, and you had to manually type in
a command to change from text mode to graphics mode. A company called Hercules developed
a video adapter (the Hercules Graphics Card) that could do this for you. Not only could
it change from text to graphics, but it could do it on the fly whenever the application
required it. Today's adapters still basically use the same methods.

CGA/EGA
      The Color Graphics Adapter (CGA) introduced color to the personal computer. In APA
(all points addressable, or graphics) mode it could produce a resolution of 320 X 200 and
had a palette of 16 colors but could only display 4 at a time. With the introduction of
the IBM Enhanced Graphics Adapter (EGA), the proper monitor was capable of a resolution
of 640 X 350 pixels and could display 16 colors from a palette of 64.
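
Those color counts map straight onto bits per pixel: 4 simultaneous colors need 2 bits
per pixel, 16 need 4, and EGA's 64-color palette comes from 2 bits each of red, green and
blue. The sketch below just restates that arithmetic; it isn't code from any real adapter.

    import math

    # Simultaneous colors -> bits needed per pixel.
    for colors_on_screen in (4, 16):
        bits_per_pixel = int(math.log2(colors_on_screen))
        print(f"{colors_on_screen} colors on screen -> {bits_per_pixel} bits per pixel")

    # EGA palette: 2 bits per primary gives 4 levels each of red, green and blue.
    levels_per_primary = 2 ** 2
    print("EGA palette:", levels_per_primary ** 3, "colors")   # 4 * 4 * 4 = 64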

VGA
      Up until VGA, colors were produced digitally. Each electron beam could be either on
or off. There were three electron guns, one for each color: red, green and blue (RGB).
This combination could produce 8 colors. By cutting the intensity of the beams in half,
you could get 8 more colors for a total of 16. IBM came up with the idea of developing an
analog display system that could produce 64 different levels of intensity for each gun.
Their new Video Graphics Array adapter was capable of a resolution of 640 X 480 pixels
and could display up to 256 colors from a palette of 262,144 (more than 260,000). This
technology soon became the standard for almost every video card and monitor being
developed.
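
The VGA numbers follow from the same kind of arithmetic: three guns that are simply on
or off give 2^3 = 8 colors, the intensity trick doubles that to 16, 64 analog levels per
gun give 64^3 = 262,144 palette entries, and an 8-bit value picks 256 of them at a time.
The sketch below just works that out.

    # Digital RGB: each of the three guns is either on or off.
    digital_colors = 2 ** 3                  # 8 colors
    with_intensity = digital_colors * 2      # the half-intensity trick doubles it to 16

    # Analog VGA: 64 intensity levels per gun, 256 colors on screen at once.
    vga_palette = 64 ** 3                    # 262,144 possible colors
    on_screen = 2 ** 8                       # 256 simultaneous colors

    print(digital_colors, with_intensity, vga_palette, on_screen)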

SVGA
      Once again, manufacturers began to develop video adapters that added features and
enhancements to the VGA standard. Super-VGA is based on VGA standards and
describes display systems with several different resolutions and a varied number of
colors. When SVGA first came out, it could be defined as having capabilities of 800 X
600 with 256 colors or 1024 X 768 with 16 colors. However, these cards and monitors
are now capable of resolutions up to 1280 X 1024 with a palette of more than 16 million
colors.
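
A palette of "more than 16 million colors" is 24-bit color: 8 bits each of red, green and
blue, or 2^24 = 16,777,216 combinations. As a further assumed illustration (not a figure
from this article), the sketch below also estimates the raw framebuffer size of one
1280 X 1024 frame at that depth.

    # 24-bit "true color": 8 bits each for red, green and blue.
    colors_24bit = 2 ** 24
    print(f"{colors_24bit:,} colors")                       # 16,777,216

    # Raw size of one 1280 X 1024 frame at 3 bytes per pixel.
    width, height, bytes_per_pixel = 1280, 1024, 3
    frame_bytes = width * height * bytes_per_pixel
    print(f"about {frame_bytes / (1024 * 1024):.2f} MB per frame")   # roughly 3.75 MB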

XGA
      Extended Graphics Array was developed by IBM. It improved upon the VGA
standard (also developed by IBM) but was a proprietary adapter for use in Micro Channel
Architecture expansion slots. It had its own coprocessor and bus-mastering ability, which
means it could execute instructions independently of the CPU. It was also a 32-bit
adapter capable of increased data transfer speeds. XGA allowed for better performance and
could provide higher resolution and more colors than the VGA and SVGA cards of the time.
However, it was only available for IBM machines. Many of these features were later
incorporated by other video card manufacturers.
