VGA emulation: how many pixels to render before giving the CPU time?

How does software get more colours out of the VGA by changing the palette at run-time (while the VGA is rendering scanlines)? Does it use some timing based on the number of pixels processed in a given amount of time (derived from the horizontal and vertical frequencies and the number of pixels per scanline of the display resolution)?
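For example, I imagine the emulator-side bookkeeping would look roughly like the sketch below. All the numbers are assumptions picked for illustration, not from any particular spec: the standard 640x480 timing (25.175 MHz dot clock, 800 total pixels per scanline including blanking, 525 total lines per frame) and a 4.77 MHz emulated CPU.

#include <stdint.h>

/* Illustrative timing constants (assumptions, chosen only for this sketch):
   standard 640x480 VGA timing and an 8088-class CPU clock. */
#define DOT_CLOCK_HZ    25175000.0   /* VGA pixel (dot) clock           */
#define CPU_CLOCK_HZ     4770000.0   /* emulated CPU clock              */
#define PIXELS_PER_LINE  800         /* visible + horizontal blanking   */
#define LINES_PER_FRAME  525         /* visible + vertical blanking     */

/* How many pixels the emulated VGA should have output while the emulated
   CPU executed this many cycles. */
uint64_t cycles_to_pixels(uint64_t cpu_cycles)
{
    return (uint64_t)(cpu_cycles * (DOT_CLOCK_HZ / CPU_CLOCK_HZ));
}

/* Turn a running pixel counter into a beam position (scanline number and
   pixel within that scanline), which is what the emulator needs to decide
   when the horizontal/vertical retrace status bits should read as active. */
void pixel_to_beam_position(uint64_t pixel, int *line, int *pixel_in_line)
{
    uint64_t in_frame = pixel % ((uint64_t)PIXELS_PER_LINE * LINES_PER_FRAME);
    *line          = (int)(in_frame / PIXELS_PER_LINE);
    *pixel_in_line = (int)(in_frame % PIXELS_PER_LINE);
}

The main loop would then run the CPU for a small slice of cycles, advance a running pixel counter by cycles_to_pixels(), and use the resulting beam position to decide when the retrace/blanking status bits that the guest program polls should flip.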

So if a game wants to change the palette or DAC on every scanline, how would old software do this? How would it know it's time to change the palette/DAC?
Old software would use an old API. That old API probably doesn't exist anymore, which is why old software has to be patched/ported.
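In practice, DOS-era programs usually bypassed any API for this and talked to the hardware directly: they polled the VGA Input Status Register #1 (port 0x3DA), whose bit 0 is set during horizontal/vertical blanking and bit 3 during vertical retrace, and reprogrammed the DAC through ports 0x3C8/0x3C9 while the beam was off-screen. Below is a rough, untested sketch of the idea, assuming a DOS compiler that provides outp()/inp() in <conio.h> (Borland and DJGPP spell them outportb()/inportb()).

#include <conio.h>   /* outp()/inp() on Watcom/Microsoft DOS compilers */

#define INPUT_STATUS_1  0x3DA  /* VGA Input Status Register #1             */
#define DAC_WRITE_INDEX 0x3C8  /* select which palette entry to write next */
#define DAC_DATA        0x3C9  /* write R, G, B (6 bits each) in sequence  */

#define VRETRACE 0x08          /* bit 3: vertical retrace in progress      */
#define DISP_OFF 0x01          /* bit 0: horizontal/vertical blanking      */

/* Change DAC entry 0 at the end of every visible scanline ("raster bars").
   rgb[i][0..2] holds the 6-bit R, G, B values for line i; call this once
   per frame to sustain the effect. */
void raster_bars(const unsigned char rgb[][3], int lines)
{
    int line;

    /* Synchronise to the start of vertical retrace so we begin at the
       top of the frame. */
    while (inp(INPUT_STATUS_1) & VRETRACE) {}     /* leave current retrace */
    while (!(inp(INPUT_STATUS_1) & VRETRACE)) {}  /* wait for the next one */

    for (line = 0; line < lines; ++line) {
        /* Wait until the visible part of the line is being drawn, then
           until horizontal blanking starts; the DAC write then happens
           while the beam is off-screen, so the change never shows up
           mid-line. */
        while (inp(INPUT_STATUS_1) & DISP_OFF) {}
        while (!(inp(INPUT_STATUS_1) & DISP_OFF)) {}

        outp(DAC_WRITE_INDEX, 0);
        outp(DAC_DATA, rgb[line][0]);  /* red   */
        outp(DAC_DATA, rgb[line][1]);  /* green */
        outp(DAC_DATA, rgb[line][2]);  /* blue  */
    }
}

How close the DAC write lands to the intended line depends on how tight the polling loop is, which is why these effects are so timing-sensitive, and why an emulator needs reasonably accurate per-line (ideally per-pixel) status bits for them to work.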