VGA is a video transmission standard introduced by IBM in 1987 along with the PS/2 line of computers, and it remains the longest-used video transmission method in the IT industry [2].
VGA cables can support a range of resolutions, from 320×400 px @ 70 Hz / 320×480 px @ 60 Hz (12.6 MHz of signal bandwidth) to 1280×1024 px (SXGA) @ 85 Hz (160 MHz) and up to 2048×1536 px (QXGA) @ 85 Hz (388 MHz). No standard defines the cable quality required for each resolution, but higher-quality cables often contain coaxial wiring and extra insulation, which makes them thicker. Shorter VGA cables are also less likely to introduce significant signal degradation. A good-quality cable should not suffer from crosstalk, in which signals in one wire induce unwanted currents, and hence ghosting, in adjacent wires. Ghosting also occurs when an impedance mismatch (the nominal impedance is 75 ohms, Ω) causes part of the signal to be reflected. With a long cable, however, ghosting may be caused by an improperly terminated device or a passive cable splitter rather than by the cable itself.
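The bandwidth figures above roughly track the pixel clock of each mode: active pixels × refresh rate, inflated by the horizontal and vertical blanking intervals. As a hedged sketch, the snippet below estimates this with an assumed ~1.4× blanking factor (exact timings come from standards such as VESA GTF/CVT, so the results only approximate the quoted numbers):

```python
# Rough estimate of the analog VGA signal bandwidth (pixel clock) for a mode.
# BLANKING_FACTOR is an assumed typical overhead for horizontal/vertical
# blanking; real values vary per timing standard.
BLANKING_FACTOR = 1.4

def approx_bandwidth_mhz(width, height, refresh_hz, factor=BLANKING_FACTOR):
    """Approximate pixel clock in MHz for an analog VGA mode."""
    return width * height * refresh_hz * factor / 1e6

for w, h, r in [(320, 400, 70), (1280, 1024, 85), (2048, 1536, 85)]:
    print(f"{w}x{h} @ {r} Hz ~= {approx_bandwidth_mhz(w, h, r):.1f} MHz")
```

Under this assumption the estimates land within a few percent of the bandwidths quoted above, which is why higher resolutions and refresh rates demand higher-quality cabling.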
Although high-definition digital standards such as HDMI and DisplayPort already exist, VGA can carry resolutions well beyond standard HD. For this reason, the command centers of major projects and large organizations (such as the military, police, and television stations) still use VGA technology to transmit ultra-high-resolution large-screen images.