Debate over the theory of graphics output

I started a thread on OSdev which led to a debate that has had no closure or mutual agreement. Here's the link to the thread (the debate starts 5 or so posts in): http://forum.osdev.org/viewtopic.php?f=13&t=27853

The debate is over how graphics can be put on the screen directly, and whether or not a GPU is needed to do this. It's pretty interesting, and I want to post here because I know some of you people here have experience in lower-level programming.

Would like to know your ideas on this as well.
Disclaimer: I haven't read the thread all the way through...

But the actual answer is.... It depends on the hardware.

At some point there is some section of memory that contains data that the graphics card turns into electrical potentials which the display hardware understands to make pretty pictures.

The caveat is that that "section of memory" may be hooked in numerous ways. What the CPU thinks it is writing to may actually be redirected and modified before it gets to the final, actual volatile storage IC that we originally referred to as our section of memory.

One of the hooks is the GPU. The GPU is itself a rather vague term, because while it technically refers to a removable piece of hardware, it is commonly tightly integrated with the rest of the video hardware -- and we use the term to simply distinguish what function the hardware is performing.

It is always possible to "directly" write video memory (assuming you have the proper permissions) to perform basic VGA control. This is part of the required standards.

But remember our caveat: the "direct" write is actually passing through some filters to the actual storage chips.
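To make the "direct write" idea concrete, here is roughly what poking video memory looks like for the legacy VGA text mode. This is only a sketch under assumptions: it presumes a freestanding, bare-metal environment (a hobby kernel, say) where the standard VGA text buffer is mapped at physical address 0xB8000; on a normal hosted OS, the MMU and the kernel sit between your program and that memory, so this will not run as a user-mode program.

#include <cstddef>
#include <cstdint>

// Sketch only: assumes a freestanding/bare-metal environment where the
// legacy VGA text buffer is identity-mapped at physical address 0xB8000.
// Each character cell is two bytes: the ASCII code, then an attribute byte
// (foreground/background colour). The card's scan-out hardware reads this
// memory and turns it into the signal the monitor displays.
constexpr std::uintptr_t VGA_TEXT_BUFFER = 0xB8000;
constexpr std::size_t    VGA_COLS = 80;

void put_char_at(std::size_t row, std::size_t col, char c, std::uint8_t attr)
{
    volatile std::uint16_t* cells =
        reinterpret_cast<volatile std::uint16_t*>(VGA_TEXT_BUFFER);
    cells[row * VGA_COLS + col] =
        static_cast<std::uint16_t>(attr) << 8 | static_cast<std::uint8_t>(c);
}

void write_top_left(const char* msg)
{
    for (std::size_t i = 0; msg[i] != '\0' && i < VGA_COLS; ++i)
        put_char_at(0, i, msg[i], 0x0F);   // white on black, top row
}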

The OSDev OP was trying to do something special with those video filters without accessing them. Non sequitur.

Hope this helps.
Disclaimer: I am the OP of that thread, and I know I committed a non sequitur by taking the topic off track.

It helps a bit, but not enough to give me the full picture. Also, you say "the rest of the video hardware". Care to explain what the "rest of" this video hardware is?

Also, you're setting this aside ... "basic VGA control" is still writing to the screen, so assuming basic VGA has access somehow, why would extended resolutions not be? Why is Combuster on OSdev telling me that "no GPU is needed to drive data to the screen", and then telling me to use VESA BIOS extensions?

VESA BIOS extensions are provided by a GPU -- you can't use such features without it.

The point is, if you say it depends, I agree with you there, because the scope of engineering can vary widely (I am studying EE). However, you say the "required standards". That is also opening a new can of worms...

What does "required standards" mean, and why can those be accessed directly while people impose limits on HD? HD is just a higher pixel count, or a larger drawing area in memory.

If what one poster said in the discussion I linked, "you can set any video mode without a GPU", is true, then GPUs are only really useful for these things:

1. Hardware acceleration.

2. 3-D computations, processing power/support, GPGPU, etc.

3. Speed, extra memory, DMA.

Otherwise, one would figure that if you can use basic VGA resolution, any resolution should work.

PS: I can't stand that Combuster user, btw.
It sounds to me like the actual physical display hardware is being confused with the "screen".

The screen is an abstract construct created to assist programmers to manipulate the display hardware.

I suppose you could get some wires and hook them up to a number of unused ports and send the proper signals to those ports to make the display do what is wanted... but the whole point and purpose of the graphics hardware is to do that for you.

Each manufacturer has a vested interest in the way his hardware provides non-standard bells and whistles. And every manufacturer thinks his way is best. Meaning that NVIDIA's hardware will differ from ATI's, say.

By "non-standard" I mean anything beyond the VGA/VESA stuff that everyone in the industry must support. The way it is supported is two-fold, but first you need to remember that the VGA is a nearly 30-year-old piece of technology. PCs work by making everything at least look (in terms of hardware) the same, so that it will just work with any PC. But 30 years is pre-history when it comes to computers.

The first thing manufacturers must do is make their graphics card default to looking and acting just like one of those ancient VGA cards.

The problem there is that the VGA has a maximum resolution of 640x480 in 16 colors (unless you know how to play with the hardware just right, but the improvements aren't all that spectacular to us today).

Video card manufacturers began providing competing methods to display higher resolutions with more colors. This resulted in some very predictable confusion. If you wanted to support a higher resolution, you had to have an application-side software driver for a specific card. If you wanted your program to be usable by the general PC-using public, you had to distribute a significant number of drivers with your program to support as many video card manufacturers as you could.

VESA was born to help with that. They provided extensions to the standard VGA BIOS interrupts to help with higher resolutions with more colors. Now the card manufacturer had to provide the hardware (or software) to make it possible for anyone to use those higher modes by simply using the standard INT10 facilities.
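For what "using the standard INT10 facilities" looks like in practice, here is a rough sketch of a VBE mode set. The register values (AX = 0x4F02 selects the VBE "set mode" function; bit 14 of BX asks for the linear framebuffer; AX == 0x004F on return means success) come from the VBE specification, but bios_int10 below is a hypothetical helper standing in for however your environment issues a real-mode interrupt (a boot loader, a VM86 monitor, or an emulator). It is not something you can call from an ordinary application.

#include <cstdint>

// Hypothetical register block and dispatcher: a real project would issue
// INT 10h from real mode (or through a real-mode emulation layer). Only
// the register values below are taken from the VBE specification.
struct RealModeRegs {
    std::uint16_t ax, bx, cx, dx;
};
RealModeRegs bios_int10(RealModeRegs in);   // assumed to exist elsewhere

// Ask the VESA BIOS to switch to a VBE video mode.
bool vbe_set_mode(std::uint16_t mode)
{
    RealModeRegs r{};
    r.ax = 0x4F02;          // VBE function 02h: set video mode
    r.bx = mode | 0x4000;   // bit 14: use the linear framebuffer if available
    r = bios_int10(r);
    return r.ax == 0x004F;  // AL = 0x4F (supported), AH = 0x00 (succeeded)
}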

But today's displays can handle significantly higher resolution than even the VESA standard mandates.

HD is just a higher pixel count, or a larger drawing area in memory.

No, alas, it is not. Again, the video memory is mapped into the host PC's address space based upon ancient standards.

It is possible for some cards (maybe all of them today, I don't know) to map their entire address space into the host's memory space, but it is such a huge space that it is typically impractical to do that. What is more common is "windows" into that space -- again accessible via VESA standards.

Consider: my current display resolution is 1920x1080 at 24-bit color (what most people have been conditioned to believe is 32-bit color), which works out to 1920 x 1080 x 3 = 6,220,800 bytes of memory. That's bigger than a lot of executable programs.
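The arithmetic behind those two paragraphs is easy to check with a short, host-runnable program (no hardware involved). The 64 KiB window granularity used below is only the common case; the real value is reported per card through the VBE mode information.

#include <cstdint>
#include <cstdio>

// Bytes needed by a packed-pixel framebuffer of the given mode.
std::uint32_t framebuffer_bytes(std::uint32_t width, std::uint32_t height,
                                std::uint32_t bits_per_pixel)
{
    return width * height * (bits_per_pixel / 8);
}

int main()
{
    std::printf("1920x1080x24: %u bytes\n", framebuffer_bytes(1920, 1080, 24));
    std::printf("1920x1080x32: %u bytes\n", framebuffer_bytes(1920, 1080, 32));

    // With banked (windowed) VESA access only a small slice of that memory
    // is visible at once. Assuming the common 64 KiB granularity, find
    // which bank holds the pixel at (x, y) in a 1024x768, 8 bpp mode.
    const std::uint32_t bank_size = 64 * 1024;
    const std::uint32_t pitch = 1024;            // bytes per scanline at 8 bpp
    const std::uint32_t x = 500, y = 300;
    const std::uint32_t offset = y * pitch + x;
    std::printf("pixel (%u,%u): bank %u, offset in bank %u\n",
                x, y, offset / bank_size, offset % bank_size);
}

1920x1080x24: 6220800 bytes
1920x1080x32: 8294400 bytes
pixel (500,300): bank 4, offset in bank 45556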


Nowadays, however, manufacturers also follow the hardware standards dictated by Microsoft Windows and its plug-n-play architecture for these things.

The actual, physical hardware can be designed any way the manufacturer likes. It just has to appear and behave in certain ways when accessed via its software driver. That's why, when you don't have the proper driver installed for your video card on Windows, Windows thinks you have a rinky-dink SVGA.

It is the software driver that lets programs access all these advanced features in the hardware.


"you can set any video mode without a GPU"

No, you can't.

It may be possible for some individual models, but in general that cannot be true. (Nor should it be! Otherwise card manufacturers cannot create new cool stuff!)


GPUs are only really useful for these things

Like I already responded, the "GPU" is not necessarily distinct from the rest of the graphics hardware.

In order to play with the advanced hardware, you have to obey the rules governing that hardware. And unless you want to handle every piece of hardware from scratch, just use the software driver that comes with the thing.


That Combuster knows what he is talking about, but he is responding to the randomness of the thread with generalized specifics.

He is correct. If you want to play with the graphics card knowing nothing about it and not touching its driver software, you are limited to VESA functions.

Hope this helps.
It doesn't help, it is counterintuitive, and everyone seems to have a different opinion on it with no cold hard, undeniable facts presented.

In other words, I don't have the detailed answer I want yet. I can't confirm that Combuster knows what he's talking about 100%, and neither can I confirm you do either.

A software driver is just software that accesses hardware. In other words, anything can be considered a software driver if you vaguely refer to a device driver as any software that controls hardware.

But again, this doesn't help, sorry.

You and I don't seem to be on the same page here.

Manufacturers creating "cool new stuff" is not relevant at all. Might as well just end this discussion, because it's going to get nowhere like this.

I'll learn the facts as I progress through engineering and programming much further.
I'm not ready for this thread to die yet. I have to go home in a bit but I'll pick it back up when I get there if my time allows.

Interfacing with the video card is done through the video BIOS, and interfacing with the video BIOS is done through the software drivers; this is the only way to talk to the video buffer, which sits directly between you and the monitor. If you want to know whether it is possible to send raw electrical signals up a video cable and still get a visible output on the screen, then yes, you can, and yes, I have done it with a PCB and an oscilloscope (I finished my degree in EE a few years ago, by the way). But it would not be practical in any way to do this through another port on the PC that was not designed specifically for this kind of thing.

Limits are imposed by standards to make sure that everybody in the industry is on the same page. If manufacturer A says that HD is one thing and manufacturer B says that it is another, then the only thing they accomplish is having customers pissed off at both of them, because their system came from manufacturer C, who doesn't know what the heck is going on with either one.

Monitors all have a maximum resolution, refresh rate, contrast ratio, etc. These are determined automatically when the monitor is plugged in, through a process called negotiation. So Duoas is right: you cannot set the video mode without a GPU unless you want to re-write these negotiations by hand for every possible model on the market, and the ones that haven't been made yet.
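For the curious, that "negotiation" is mostly the monitor handing the card a small EDID block over the DDC wires in the cable; the card and the OS read the supported timings out of it. Below is a rough, host-runnable sketch of the idea: pulling the preferred resolution out of the first detailed timing descriptor of a 128-byte EDID base block. The offsets follow the standard EDID layout, but a real parser would also check the header and checksum and walk every descriptor and extension block.

#include <array>
#include <cstdint>
#include <cstdio>

struct Resolution { unsigned width, height; };

// Sketch: read the preferred mode from the first detailed timing
// descriptor (offset 0x36) of a 128-byte EDID base block.
Resolution preferred_mode(const std::array<std::uint8_t, 128>& edid)
{
    const std::uint8_t* dtd = edid.data() + 0x36;
    unsigned h = dtd[2] | (unsigned(dtd[4] & 0xF0) << 4);  // horizontal active pixels
    unsigned v = dtd[5] | (unsigned(dtd[7] & 0xF0) << 4);  // vertical active lines
    return {h, v};
}

int main()
{
    std::array<std::uint8_t, 128> edid{};   // stand-in data, not a real monitor's EDID
    // Fake a 1920x1080 descriptor: 1920 = 0x780, 1080 = 0x438.
    edid[0x36 + 2] = 0x80;  edid[0x36 + 4] = 0x70;
    edid[0x36 + 5] = 0x38;  edid[0x36 + 7] = 0x40;
    const Resolution r = preferred_mode(edid);
    std::printf("preferred mode: %ux%u\n", r.width, r.height);
}

preferred mode: 1920x1080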

I'm sorry if you couldn't find a merry band of uneducated sycophants to defer to you about a topic that they know nothing about. If that's the kind of thing you're looking for, then I'm pretty sure you know where Tumblr is.
LieutenantHacker wrote:
It doesn't help, it is counterintuitive, and everyone seems to have a different opinion on it with no cold hard, undeniable facts presented.

No, it is your own bias getting in the way. You can say the sky ain't blue all you want, but it doesn't change the cold, hard, undeniable facts that have been presented to you.

And to be frank, we've been taking it easy on you since you are asking about a whole lot of stuff you don't understand (yet, I hope).

At the moment, you seem unwilling to accept that your own conceptions about how computers work inside may not be entirely correct.

In other words, I don't have the detailed answer I want yet. I can't confirm that Combuster knows what he's talking about 100%, and neither can I confirm you do either.

Yes you can. We're not going to give you several 300-level university courses sufficient to do it, though.

What you do have is several apparently knowledgeable sources telling you the same thing. (Whether or not you recognize it as such.)

A software driver is just software that accesses hardware. In other words, anything can be considered a software driver if you vaguely refer to a device driver as any software that controls hardware.

Nah, that's just a circular generalization. A device driver is a specific type of software, identifiable by the system as such. But you've almost made my point for me:

The device driver software knows exactly which ports to play with and exactly how to play with them to make the hardware do what is desired. It is necessary because the hardware will only respond properly to the correct inputs in the correct places.

Another manufacturer's hardware will not respond the same way.

Hence the hardware-specific software is necessary, so other programs on the PC, which know absolutely nothing about the video hardware's specific requirements, can still use it. Why? Because the software presents an API that other programs do understand.
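In code terms, the point is only this: the driver hides the hardware-specific register pokes behind an interface the rest of the system already understands. A toy sketch (every name here is hypothetical, nothing vendor-specific):

#include <cstdint>

// Toy illustration of the idea, not any real driver model: applications
// and the OS talk to this interface; each vendor's driver implements it
// with whatever register writes its own hardware happens to need.
class DisplayDriver {
public:
    virtual ~DisplayDriver() = default;
    virtual bool set_mode(unsigned width, unsigned height, unsigned bpp) = 0;
    virtual void put_pixel(unsigned x, unsigned y, std::uint32_t rgb) = 0;
};

class VendorADriver : public DisplayDriver {
public:
    bool set_mode(unsigned, unsigned, unsigned) override {
        // ...program vendor A's mode-set registers here...
        return true;
    }
    void put_pixel(unsigned, unsigned, std::uint32_t) override {
        // ...write into vendor A's framebuffer layout here...
    }
};

// A program that only knows the interface works with any conforming driver.
void draw_dot(DisplayDriver& drv) { drv.put_pixel(10, 10, 0xFFFFFF); }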

You and I don't seem to be on the same page here.

You keep jumping off.

Manufacturers creating "cool new stuff" is not relevant at all. Might as well just end this discussion, because it's going to get nowhere like this.

Fine. Go away then, wagging your superior head at us ignoramuses.

Frankly, it is clear that you simply don't know enough to judge what is relevant and what is not.

I'll learn the facts as I progress through engineering and programming much further.

If that's what it takes.

You'll have a hard time as long as you believe everything looks and acts the same, and is as simple as doing the same thing to get the same result on everything you see.

As soon as you drop that preconception, and understand that there are about a quadrillion different ways to design functioning computer systems, you'll realize you can't just press magical buttons and expect it all to behave the same way.

Relevant:

http://www.xkcd.com/1349/


I really don't want you to run away with a sour taste in your mouth. All this stuff spins everyone's heads around at first. There is a whole friggin' lot of stuff to learn just to begin to get a more complete understanding of what previously seemed to be very basic.

Jump in, enjoy the ride, ask questions, and if something goes over your head the first two (or three, or four or five) times, just file it away for something later. Because you'll eventually get it.

Unless you walk in believing you already have a superior capacity for understanding compared to everyone else.


I still hope this helps.
I'm confused... isn't the whole purpose of the GPU to rasterize and place textures on stuff? I don't think it was ever used for just putting stuff on a monitor. We can do that just fine in software.

I clearly did not read the linked topic.
What the hell does it matter. That's really not your business to be posting things like that.
I know nothing about this topic, but I just wanted to say that yes, brownycup that is offensive.
I only got the basic bootloader and minimalist kernel before moving to another project.

@LieutenantHacker
You may not like Combuster, but he is a moderator on that site for a reason. He knows what he is talking about even if you don't want to listen to it (sounds familiar). If you are truly interested in OS development, then you have to level your expectations and listen to the people there because they do know what they are talking about. Jumping from forum to forum seeking those who will take your side on a discussion is about as sane as going from doctor to doctor looking for one that will tell you that you're sick when you really aren't.
closed account (N36fSL3A)
I don't see how moderator powers makes someone knowledgeable of OS-Development.
I agree. I'd prefer to see something he made as evidence of his abilities.
Well, you could look through this (I've not done it personally), but it appears to be OS code that Combuster is currently working on:
http://www.d-rift.nl/combuster/mos3/?p=browsesources

If you question the abilities of a moderator on another site, does that mean you question twicker's abilities too?
Yeah, it does. twicker is no exception and why should he be? Of course, it depends on the context but outside of managing this forum, I don't know the extent of his abilities. I have no intention of calling twicker out and saying he ain't shit, but that doesn't mean the opposite is true either.

Also, stop reporting people for moral based opinions. It's disgraceful at best.
You may not like Combuster, but he is a moderator on that site for a reason. He knows what he is talking about even if you don't want to listen to it (sounds familiar)


#include <iostream>
#include <type_traits>

#include "Moderation_Skills.h"
#include "Programming_Skills.h"

int main(){
    std::cout << std::boolalpha << std::is_same<Moderation_Skills, Programming_Skills>::value;

    return 0;
}
false


Edit:
Point - Non sequitur argument is non sequitur.
Let me ask you this: what does proof of abilities accomplish? Are you HR at a programming company looking to hire that programmer? Are you wanting to start a project with them? Those are the only two valid reasons I would bother showing proof of my abilities to anyone.

Looking at how some of the veteran members interact, it doesn't garner any respect to prove your abilities. Who are you (anyone who demands proof) to expect proof? I know none of you are Bjarne Stroustrup, so there is no benefit to proving oneself just to appease your curiosity about their abilities.

@Daleth
It is my understanding of that site, that the admin picks existing members of the site who have proved themselves knowledgeable and able to maintain an unbiased view in discussions to be moderators.

Also, stop reporting people for moral based opinions. It's disgraceful at best.

The only person I reported was Nathan2222, which prompted twicker to email me; I argued my case, and it ultimately ended with him being banned. I also report any spam threads I see pop up from time to time. Other than that, I am fine with his opinions (after all, he has said worse about me and I never bothered reporting him).
All that was stated was that his position as moderator probably isn't an indicator of his ability. You seemed to imply that it is.

But now your argument is that abilities don't matter since we're not HR... which begs the question of why should we take the opinion of the person in question seriously?

His abilities don't matter; I think you're right in that regard. It's our own job to determine whether what someone says is valid or not. Even if he were Superman (who was hyper-intelligent), there's no reason to assume that what someone says is automagically valid and correct.
You seemed to imply that it is.

When did I do this? My exact remark was:
Me wrote:
You may not like Combuster, but he is a moderator on that site for a reason. He knows what he is talking about even if you don't want to listen to it (sounds familiar).

Knowledge != abilities. For example, I have a cousin who is knowledgeable about C++, but has never touched a compiler nor written a line of code in his life, yet every time he is over he thinks it is his job to critique my code. He has it in his head that he can become a programming consultant without actually programming.

As for the HR remark, I said that was one of the two reasons I would be compelled to prove my abilities by posting projects.

For example, from another thread Fredbill said this to me:
Fredbill wrote:
You throw numbers out there, bragging about how much experience you have, but without hard evidence of the works you've made, I'm not going to believe a word you say.

This is why I don't understand the point of demanding proof. In the end it is his loss, not mine. If he chooses not to believe me then it means he is out the 20 years of knowledge I've gained and someone else will get it. That is why it makes no difference to me and I'm not running around saying "Oh No! I have to upload my projects for proof!"
That was clearly taken out of context. In that context, someone's abilities are in question, not whether what that person says is valid.

I often relate "abilities" to "knowledge". If you have knowledge about C++, then you have the ability to write lines of code. If you'd like to get ridiculous, we could question whether he has fingers to write code with or a working keyboard, but assuming he has functional limbs and a working computer, why, or in what case, would we assume that someone who is knowledgeable about C++ doesn't have the ability to code in C++, or vice versa?

You may not like Combuster, but he is a moderator on that site for a reason.


This seems to imply that because Combuster is a moderator on that site, he has knowledge on a subject outside of moderation.