CRTC vs. VGA attribute controller: 8/4-bit depth with 256-color shift mode?

What effect does the display resolution programmed into the CRTC have when 256-color or 16-color output is used with (or without) the 8-bit bit set in the Attribute Mode Control register, compared to how that bit is normally used?

So what happens during:
Single shift mode with 8-bit set? (4-bit planar input, with two outputs combined into one pixel?)
Interleaved shift mode with 8-bit set? (Same behavior as single shift mode?)
256-color shift mode without 8-bit set? (Does every byte produce two 4-bit attribute pixels on screen? See the sketch after this list.)
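
To make the last case concrete, here is a minimal C sketch of my mental model of the 256-color shift path. The nibble order (high nibble first) and the two-dot-clock latching are assumptions on my part, not something I have verified against real hardware:

/* Sketch (my mental model, not verified against real hardware):
 * how the attribute controller's 8-bit ("PEL width") bit might combine
 * the 4-bit groups coming out of the 256-color shift register.
 */
#include <stdint.h>
#include <stdio.h>

/* One scanline's byte stream from video memory (mode 13h style). */
static const uint8_t vram[] = { 0x12, 0x34, 0x56 };

/* 256-color shift mode: each byte is emitted as two 4-bit groups,
 * high nibble first (an assumption on my part). */
static void shift256(int pel_width_8bit)
{
    int pending = -1;                     /* latched first nibble, if any */
    for (size_t i = 0; i < sizeof vram; i++) {
        uint8_t nib[2] = { (uint8_t)(vram[i] >> 4), (uint8_t)(vram[i] & 0x0F) };
        for (int n = 0; n < 2; n++) {
            if (pel_width_8bit) {
                if (pending < 0) {
                    pending = nib[n];     /* first half: latch, no pixel yet */
                } else {
                    /* second half: two nibbles -> one 8-bit DAC index */
                    printf("pixel %02X\n", (pending << 4) | nib[n]);
                    pending = -1;
                }
            } else {
                /* 8-bit bit clear: each nibble goes through the palette
                 * on its own, so one byte yields two 4-bit pixels */
                printf("pixel %X\n", nib[n]);
            }
        }
    }
}

int main(void)
{
    puts("8-bit set (normal mode 13h):");
    shift256(1);
    puts("8-bit clear (two attribute pixels per byte?):");
    shift256(0);
    return 0;
}

With the 8-bit bit set, the three bytes come out as three pixels (12, 34, 56), exactly like mode 13h; with it clear, the same bytes come out as six 4-bit pixels, which is what I suspect the hardware does.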

Also, how are pixels counted in the active display when these settings are applied? Is the width in pixels that the CRTC programs maintained, with any overflow simply not applied?

So does the screen produce the same number of pixels as specified in the CRTC registers, except that in the 256-color case half a byte can go unused (the low nibble of the final byte on a scanline)? And in the single/interleaved shift cases, do you get some kind of strange mixed mode, where two planar/interleaved pixels become one pixel on output? A rough pixel-count calculation follows below.
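
Here is a back-of-envelope sketch of how I currently count the active pixels from the CRTC. The Horizontal Display End value (0x4F) is the real mode 13h setting, but the 8-dot character clock and the idea that the 8-bit bit simply halves the pixel rate are my assumptions, and the leftover-nibble check at the end is just my hypothesis:

/* Back-of-envelope sketch of counting active pixels from the CRTC,
 * assuming 8 dots per character clock. */
#include <stdio.h>

int main(void)
{
    int hde = 0x4F;                 /* CRTC 01h: Horizontal Display End, mode 13h */
    int dots_per_cclk = 8;          /* assuming an 8-dot character clock */
    int dot_clocks = (hde + 1) * dots_per_cclk;   /* 80 * 8 = 640 dot clocks */

    /* 8-bit set: one pixel per two dot clocks -> 320 wide pixels   */
    /* 8-bit clear: one pixel per dot clock    -> 640 narrow pixels */
    printf("active dot clocks : %d\n", dot_clocks);
    printf("8-bit set  pixels : %d\n", dot_clocks / 2);
    printf("8-bit clear pixels: %d\n", dot_clocks);

    /* An odd dot count would strand half a byte (a lone nibble) at the
     * end of the line -- with 8-dot character clocks that can't happen,
     * so I suspect the CRTC-programmed width is always what you see. */
    if (dot_clocks % 2)
        printf("last byte's low nibble unused?\n");
    return 0;
}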