How does a computer tell a video card what to display?

How does a computer communicate pixel values to the screen?

Is there a certain command in the processor's instruction set architecture that makes the processor communicate directly with the monitor, or does the software store the graphics information at a certain location in memory, from which it is automatically picked up and sent to the monitor? How does this work, and where does a graphics card come into play in this process?

bigblind

Posted 2012-05-20T20:37:47.650

Reputation: 233

Answers

3

Let's take DVI. According to Wikipedia's article on it, red, green, and blue are each transmitted on their own pair of wires, 8 bits per color per pixel, in an uncompressed rasterized fashion. That is, the top line is sent first, left to right, pixel by pixel, then the next line, and so on, with the RGB values lining up exactly because there is no compression. Each channel of data is encoded using TMDS, which is designed to account for various physical realities of high-frequency electrical signals. You could also look up VGA and HDMI for similar information.
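To make the raster order concrete, here is a small sketch (not real DVI code, and ignoring TMDS encoding and blanking intervals) of serializing a frame the way the paragraph describes: row by row from the top, left to right within each row, one 8-bit sample per color channel, no compression:

```c
#include <stddef.h>
#include <stdint.h>

/* Illustrative only: walk the frame in raster order and emit the raw
 * 8-bit R, G, B samples for each pixel, exactly in scan order. */
static size_t serialize_frame(const uint8_t (*frame)[3], int width, int height,
                              uint8_t *out)
{
    size_t n = 0;
    for (int y = 0; y < height; y++) {        /* top line first */
        for (int x = 0; x < width; x++) {     /* left to right */
            const uint8_t *px = frame[y * width + x];
            out[n++] = px[0];                 /* red channel   */
            out[n++] = px[1];                 /* green channel */
            out[n++] = px[2];                 /* blue channel  */
        }
    }
    return n;   /* total bytes: width * height * 3 */
}
```

Because nothing is compressed, the receiver can recover any pixel's RGB value purely from its position in the stream.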

However, the OS is far removed from sending the data to the monitor. That is handled by dedicated hardware in the video card. Roughly speaking, the video card driver gives the video card commands to draw the image. (In many 2D cases it will just be giving the video card the complete pixel by pixel image, while in 3D cases the video card will be doing a lot of the work itself.) In any event, I am reasonably confident that the video card ultimately places the pixel data for each frame in a buffer, from which a tiny piece of dedicated circuitry sends the signal to the monitor.
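A toy model of that buffer arrangement might look like the following. This is an assumption for illustration, not any real driver's API: the driver draws into a back buffer while the scan-out circuitry reads the front buffer, and a "flip" swaps the two between frames:

```c
#include <stdint.h>

#define W 4
#define H 4

/* Hypothetical double-buffered framebuffer: two pixel arrays, with an
 * index recording which one the scan-out hardware is currently reading. */
struct framebuffer {
    uint32_t buf[2][W * H];   /* pixels as 0xRRGGBB values */
    int front;                /* buffer currently being scanned out */
};

static void draw_pixel(struct framebuffer *fb, int x, int y, uint32_t rgb)
{
    fb->buf[1 - fb->front][y * W + x] = rgb;   /* render into the back buffer */
}

static void flip(struct framebuffer *fb)
{
    fb->front = 1 - fb->front;   /* next frame's scan-out reads the new image */
}

static uint32_t scan_out(const struct framebuffer *fb, int x, int y)
{
    return fb->buf[fb->front][y * W + x];   /* what would go to the monitor */
}
```

The key point matches the answer: the CPU and driver only fill the buffer; a separate piece of circuitry reads it out and drives the signal to the monitor.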

In general, it is considered good for dedicated hardware to handle I/O work under the control of the OS. Most of the OS's I/O work is just telling dedicated hardware what to do, and that hardware then knows the low-level protocol for external communication. As well as offloading work from the CPU, dedicated hardware means that a delay on the OS's part won't necessarily disrupt communication with the peripheral. For another example of this, take Ethernet. The OS tells the NIC where to put incoming packets, and the NIC signals the OS when it has placed a packet there; but if the OS doesn't do anything with the first packet right away, the NIC can still receive more packets until it runs out of the space allocated to it for incoming packets.
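The NIC/OS handoff described above can be sketched as a ring buffer. This is a simplified model, not real NIC code: the NIC keeps depositing packets into pre-allocated slots even while the OS is busy, and only drops packets once every slot is full:

```c
#include <stdbool.h>

#define SLOTS 4   /* space the OS allocated for incoming packets */

/* Toy receive ring shared between a NIC (producer) and the OS (consumer). */
struct rx_ring {
    int pkt[SLOTS];   /* stand-in for packet data */
    int head;         /* next slot the NIC writes  */
    int tail;         /* next slot the OS reads    */
    int count;        /* packets waiting for the OS */
};

/* NIC side: store an arriving packet, or drop it if the ring is full. */
static bool nic_receive(struct rx_ring *r, int pkt)
{
    if (r->count == SLOTS)
        return false;                 /* out of space: packet dropped */
    r->pkt[r->head] = pkt;
    r->head = (r->head + 1) % SLOTS;
    r->count++;
    return true;
}

/* OS side: consume the oldest waiting packet, whenever it gets around to it. */
static bool os_consume(struct rx_ring *r, int *pkt)
{
    if (r->count == 0)
        return false;
    *pkt = r->pkt[r->tail];
    r->tail = (r->tail + 1) % SLOTS;
    r->count--;
    return true;
}
```

Note that `nic_receive` succeeds several times in a row with no `os_consume` in between, which is exactly the decoupling the answer is pointing at.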

From How does a computer monitor work, OS-side? on the XKCD forums.

sgtbeano

Posted 2012-05-20T20:37:47.650

Reputation: 575

sorry, I guess I phrased my question differently when I searched on google. thx for the answer. – bigblind – 2012-05-20T20:48:47.870

No problem, it happens to me all the time! – sgtbeano – 2012-05-20T20:54:36.210

0

It depends on the hardware, but a common way is to use memory-mapped I/O: certain memory addresses are mapped to the device (for example, the video card), so that ordinary reads and writes to those addresses talk to the hardware.

See the Memory-mapped I/O article on Wikipedia, for example.
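As an illustrative sketch of the idea: on real hardware the mapped region sits at a fixed physical address chosen by the platform (classic VGA text memory at `0xB8000` is the textbook example), and the device reacts to stores at those addresses. Here an ordinary array stands in for the mapped region so the example can run anywhere:

```c
#include <stdint.h>

/* Stand-in for a device's memory-mapped region. On real hardware this would
 * be a fixed physical address, not an array the program owns. */
static uint8_t fake_mmio_region[64];

/* volatile tells the compiler every access matters and must not be
 * optimized away or reordered, since the "device" observes each store. */
static volatile uint8_t *video_memory = fake_mmio_region;

/* "Talking to the device" is then just a store to the mapped address. */
static void write_pixel(int offset, uint8_t value)
{
    video_memory[offset] = value;
}
```

No special instruction is needed for the transfer itself; the same load/store instructions that access RAM reach the device, because the address decoding routes them there.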

Mattias Isegran Bergander

Posted 2012-05-20T20:37:47.650

Reputation: 429