Is it possible to make a Virtual Video Card in C++?

Hi! I'm trying to create a virtual video card that can render 3D graphics and games, something like NVIDIA CUDA, on any capable CPU in machines that don't have a video card installed (like some laptops), while limiting CPU use and running faster than render farms... Has anyone solved this problem? Can DirectX solve it?

Not sure what you mean at the end there, but laptops do have video cards. A laptop without a dedicated graphics card will use the one integrated into the CPU. For example, an i7-7700HQ comes with Intel HD Graphics 630 (I think that's the one, anyway).

If I understood correctly, you want to make a "virtual video card" that is really just the CPU rendering the graphics? NVIDIA CUDA cores and CPU cores are made for different purposes: the CPU is good at general-purpose, branchy work, while a graphics card is good at running the same simple calculation over millions of pixels in parallel.

You can always make a CPU do a GPU's job, but it would likely be slower than even the integrated graphics that comes with the CPU, which is never that good to begin with.
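To make that concrete, here is a minimal sketch (my own illustration, not driver code) of a CPU doing a GPU-style job: evaluating a per-pixel "shader" over a framebuffer, with the rows split across hardware threads the way a GPU dispatches one thread per pixel. The `shade` function and the row-interleaved scheduling are assumptions for the example.

```cpp
#include <algorithm>
#include <cstdint>
#include <thread>
#include <vector>

// Hypothetical per-pixel "shader": a simple horizontal gradient.
static uint8_t shade(int x, int /*y*/, int width) {
    return static_cast<uint8_t>(255 * x / (width - 1));
}

// Render width*height grey pixels, spreading rows over all hardware threads.
std::vector<uint8_t> render(int width, int height) {
    std::vector<uint8_t> framebuffer(static_cast<size_t>(width) * height);
    unsigned workers = std::max(1u, std::thread::hardware_concurrency());
    std::vector<std::thread> pool;
    for (unsigned t = 0; t < workers; ++t) {
        pool.emplace_back([&, t] {
            // Each worker takes every workers-th row (row interleaving).
            for (int y = static_cast<int>(t); y < height; y += static_cast<int>(workers))
                for (int x = 0; x < width; ++x)
                    framebuffer[static_cast<size_t>(y) * width + x] = shade(x, y, width);
        });
    }
    for (auto& th : pool) th.join();
    return framebuffer;
}
```

Even fully threaded like this, a handful of CPU cores is no match for the thousands of shader cores a GPU throws at the same loop.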

OVERALL, if you want rendered graphics, even through some fake graphics card, there still has to be ACTUAL hardware doing the rendering.
Hah! Marketing has it so easy; just add the current hype-word ("organic", "virtual", "AI", etc.) and the product will sell.


Computers drew images long before "the GPU" was invented.
Movie makers used CPU-based "render farms" to render movie frames.

Computing an image is simple math, but there is a lot of it. We know the math that is done "in the video card", so it is possible to implement the same operations in software. A program does not necessarily care whether software or hardware does the rendering.

In fact, that has been done. For example, the Mesa 3D Graphics Library: https://www.mesa3d.org/intro.html
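If you just want to try CPU rendering rather than write it yourself, Mesa's documented `LIBGL_ALWAYS_SOFTWARE` environment variable asks its libGL to use a software rasterizer (llvmpipe/softpipe) instead of the GPU driver. A minimal sketch, assuming a POSIX system with Mesa installed:

```cpp
#include <cstdlib>

// Request Mesa's software rasterizer. This must run before any OpenGL
// context is created, since the driver is chosen at initialization time.
void force_software_gl() {
    setenv("LIBGL_ALWAYS_SOFTWARE", "1", /*overwrite=*/1);
}
```

Call it at the top of `main()`, before creating the window and GL context; rendering then runs entirely on the CPU.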


Wait, what do you mean by "more faster than render farms"?
Yes and no. Yes, you can do this. No, it won't work very well. If you emulate a GPU with your own magic driver that makes your game think you have the latest 64 GB card with 5000 cores, or whatever they have these days, your typical 4-core CPU is going to render at seconds per frame instead of frames per second for a typical modern game.



Thanks for all the replies, guys. I appreciate it...