This depends entirely on how the monitor and the graphics card are configured.
For example, I can configure my graphics card to stretch smaller resolutions to fill the entire monitor, as AthomSfere described. I can also configure it to simply center the image on the monitor, leaving the unused pixels black (what a pixel being 'on' or 'off' means varies between LCD, LED, and OLED technologies).
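To make the centering case concrete, here's a minimal sketch (the 1920x1080 panel and 1280x720 signal are just assumed example values) of where a centered image ends up on the panel; everything outside that window stays black:

    # Hypothetical example: centering a 1280x720 signal on a 1920x1080 panel
    native_w, native_h = 1920, 1080   # panel's native resolution (assumed)
    src_w, src_h = 1280, 720          # incoming signal resolution (assumed)

    x_off = (native_w - src_w) // 2   # 320 px of black on the left and right
    y_off = (native_h - src_h) // 2   # 180 px of black on the top and bottom
    print(f"image occupies x {x_off}..{x_off + src_w}, y {y_off}..{y_off + src_h}")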
Some monitors also let you set this kind of behavior on the display itself, for when it receives a resolution lower than its native resolution.
To further clarify: the pixels on LCD/LED/OLED displays are fixed in position. If a resolution lower than the native resolution is stretched to fill the screen, those pixels don't move to take up the extra space. Instead, digital circuitry in either the monitor or the graphics card upscales the image to the monitor's native resolution, and this is why LCDs look terrible when they aren't running at their native resolution.
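A rough sketch of why this looks bad, assuming the simplest possible scaler (nearest-neighbor; real scalers usually interpolate, which trades blockiness for blur): at a non-integer scale ratio, some source pixels get duplicated and others don't, so detail comes out uneven.

    # Nearest-neighbor upscaling of one row of pixels (illustrative only).
    # Scaling 3 source pixels onto 4 fixed panel pixels (a 1.33x ratio)
    # duplicates some pixels but not others.
    def upscale_row(src, dst_len):
        # map each fixed panel pixel back to the nearest source pixel
        return [src[i * len(src) // dst_len] for i in range(dst_len)]

    print(upscale_row(["A", "B", "C"], 4))   # ['A', 'A', 'B', 'C']

Here 'A' ends up twice as wide as 'B' and 'C', which is exactly the kind of unevenness you see when an LCD runs below its native resolution.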
It's not specific to Windows. Both Linux and Mac OS X have been using subpixel antialiasing for years. (Not that it's of any relevance to the question...)
– user1686 – 2013-04-19T23:01:46.073