
Contents

History
Technologies
    Cathode-ray tube
    Liquid-crystal display
    Organic light-emitting diode
Measurements of performance
    Size
    Aspect ratio
    Resolution
    Gamut
Additional features
    Universal features
    Consumer features
    Professional features
Mounting
    Desktop
    VESA mount
    Rack mount
    Panel mount
    Open frame
Security vulnerabilities
See also
References
External links

Computer monitor
Not to be confused with Computer terminal or Monitor (synchronization).

A flat-panel display (FPD) computer monitor

A cathode-ray tube (CRT) computer monitor


A computer monitor is an output device that displays information in pictorial or
textual form. A discrete monitor comprises a visual display, support electronics,
power supply, housing, electrical connectors, and external user controls.

The display in modern monitors is typically an LCD with LED backlight, having by the
2010s replaced CCFL backlit LCDs. Before the mid-2000s, most monitors used
a cathode-ray tube (CRT) as the image output technology.[1] A monitor is typically
connected to its host computer via DisplayPort, HDMI, USB-C, DVI, or VGA.
Less commonly, monitors use proprietary connectors and signals to connect to a computer.
Originally computer monitors were used for data processing while television
sets were used for video. From the 1980s onward, computers (and their monitors)
have been used for both data processing and video, while televisions have
implemented some computer functionality. Since 2010, the typical display aspect ratio of both televisions and computer monitors has changed from 4:3 to 16:9.[1]
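As an illustrative aside (not part of the original article), the practical effect of this ratio change follows from simple geometry: for the same diagonal, a 16:9 screen is wider but shorter than a 4:3 one. A minimal Python sketch, with a hypothetical helper name:

    import math

    def screen_dimensions(diagonal, ratio_w, ratio_h):
        # A ratio_w : ratio_h rectangle spans hypot(ratio_w, ratio_h)
        # ratio units along its diagonal; scale to the given diagonal.
        unit = diagonal / math.hypot(ratio_w, ratio_h)
        return ratio_w * unit, ratio_h * unit

    # The same 24-inch diagonal gives noticeably different shapes:
    print(screen_dimensions(24, 4, 3))    # about 19.2 x 14.4 inches
    print(screen_dimensions(24, 16, 9))   # about 20.9 x 11.8 inches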

Modern computer monitors are often functionally interchangeable with television sets
and vice versa. As most computer monitors do not include integrated speakers, TV
tuners, or remote controls, external components such as a DTA box may be needed
to use a computer monitor as a TV set.[2][3]

History
Early electronic computer front panels were fitted with an array of light bulbs where
the state of each particular bulb would indicate the on/off state of a particular register
bit inside the computer. This allowed the engineers operating the computer to
monitor the internal state of the machine, so this panel of lights came to be known as
the 'monitor'. As early monitors were only capable of displaying a very limited
amount of information and were very transient, they were rarely considered for
program output. Instead, a line printer was the primary output device, while the
monitor was limited to keeping track of the program's operation.[4]

Computer monitors were formerly known as visual display units (VDU), particularly
in British English.[5] This term mostly fell out of use by the 1990s.

Technologies
Further information: Comparison of CRT, LCD, plasma, and OLED
displays and History of display technology
Multiple technologies have been used for computer monitors. Until the 21st century, most used cathode-ray tubes, but these have largely been superseded by LCD monitors.

Cathode-ray tube
Main article: Cathode-ray tube
The first computer monitors used cathode-ray tubes (CRTs). Prior to the advent
of home computers in the late 1970s, it was common for a video display
terminal (VDT) using a CRT to be physically integrated with a keyboard and other
components of the workstation in a single large chassis, typically limiting them to
emulation of a paper teletypewriter, thus the early epithet of 'glass TTY'. The display
was monochromatic and far less sharp and detailed than on a modern monitor,
necessitating the use of relatively large text and severely limiting the amount of
information that could be displayed at one time. High-resolution CRT displays were
developed for specialized military, industrial and scientific applications but they were
far too costly for general use; wider commercial use became possible after the release of the slow but affordable Tektronix 4010 terminal in 1972.

Some of the earliest home computers (such as the TRS-80 and Commodore PET) were limited to monochrome CRT displays, but color display capability was already available on a few MOS 6500 series-based machines (such as the Apple II computer, introduced in 1977, or the Atari 2600 console), and color output was a specialty of the more graphically sophisticated Atari 8-bit computers, introduced in 1979. These computers could be connected to the antenna terminals of an ordinary color TV set or used with a purpose-made CRT color monitor for optimum resolution and color quality. Lagging several years behind, in 1981 IBM introduced the Color Graphics Adapter, which could display four colors at a resolution of 320 × 200 pixels, or two colors at 640 × 200 pixels. In 1984 IBM introduced the Enhanced Graphics Adapter, which was capable of producing 16 colors at a resolution of 640 × 350.[6]
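The two CGA modes above illustrate a fixed memory budget: halving the color depth doubles the horizontal resolution, so both modes occupy the same number of bits. A rough Python sketch of that arithmetic (the helper is hypothetical; CGA's 16 KB of video RAM is the standard hardware figure, and EGA's planar memory layout is ignored for simplicity):

    def framebuffer_bytes(width, height, colors):
        # Packed-pixel size: bits per pixel equals the bit length of (colors - 1).
        bits_per_pixel = (colors - 1).bit_length()
        return width * height * bits_per_pixel // 8

    # Both CGA modes fit the adapter's 16 KB of video memory:
    print(framebuffer_bytes(320, 200, 4))   # 16000 bytes (2 bpp)
    print(framebuffer_bytes(640, 200, 2))   # 16000 bytes (1 bpp)
    # EGA's 16-color 640 x 350 mode needs several times more:
    print(framebuffer_bytes(640, 350, 16))  # 112000 bytes (4 bpp)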

By the end of the 1980s, color progressive scan CRT monitors were widely available and increasingly affordable, and the sharpest prosumer monitors could clearly display high-definition video. With efforts at HDTV standardization from the 1970s to the 1980s failing repeatedly, consumer SDTVs stagnated increasingly far behind the capabilities of computer CRT monitors well into the 2000s. During the following decade, maximum display resolutions gradually increased and prices continued to fall as CRT technology remained dominant in the PC monitor market into the new millennium, partly because it remained cheaper to produce.[7] CRTs still offer color, grayscale, motion, and latency advantages over today's LCDs, but improvements to the latter have made these advantages much less obvious. The dynamic range of early LCD panels was very poor, and although text and other motionless graphics were sharper than on a CRT, an LCD characteristic known as pixel lag caused moving graphics to appear noticeably smeared and blurry.

Liquid-crystal display
Main articles: Liquid-crystal display and Thin-film-transistor liquid-crystal display
Multiple technologies have been used to implement liquid-crystal displays (LCDs). Throughout the 1990s, the primary use of LCD technology in computer displays was in laptops, where the lower power consumption, lighter weight, and smaller physical size of LCDs justified the higher price versus a CRT. Commonly, the same laptop would be offered with an assortment of display options at increasing price points: (active or passive) monochrome, passive color, or active matrix color (TFT). As volume and manufacturing capability improved, the monochrome and passive color technologies were dropped from most product lines.

TFT-LCD is a variant of LCD which is now the dominant technology used for
computer monitors.[8]

The first standalone LCDs appeared in the mid-1990s, selling at high prices. As prices declined, they became more popular, and by 1997 were competing with CRT
monitors. Among the first desktop LCD computer monitors were the Eizo FlexScan
L66 in the mid-1990s, the SGI 1600SW, Apple Studio Display and
the ViewSonic VP140[9] in 1998. In 2003, LCDs outsold CRTs for the first time,
becoming the primary technology used for computer monitors.[7] The physical
advantages of LCD over CRT monitors are that LCDs are lighter, smaller, and
consume less power. In terms of performance, LCDs produce little or no flicker (reducing eyestrain),[10] a sharper image at native resolution, and better checkerboard contrast. On the other hand, CRT monitors have superior blacks, viewing angles, and response time, and can use arbitrary lower resolutions without aliasing; their flicker can be reduced with higher refresh rates,[11] though this flicker can also be used to reduce motion blur compared to less flickery displays such as most LCDs.[12] Many specialized fields such as vision science remain dependent on CRTs; the best LCD monitors have achieved only moderate temporal accuracy and so can be used only if their poor spatial accuracy is unimportant.[13]

High dynamic range (HDR)[11] has been implemented in high-end LCD monitors to improve grayscale accuracy. Since around the late 2000s, widescreen LCD monitors have become popular, in part because television series, motion pictures, and video games have transitioned to widescreen, making squarer monitors ill-suited to displaying them correctly.

Organic light-emitting diode
