Program test

Would anyone mind running my program? I'm seeing odd performance patterns from OpenGL and I'd like to narrow down the cause.

http://www.mediafire.com/?bu0f0wiv4nvj0v1

Run it with something like Engine.exe > fps.txt, let it run for 30-60 seconds, then close it. Send me the redirected output along with details about your setup (OS version, graphics card, driver version).
Did you supply source code, or just an exe?
It's only the executable. The source code is not ready for distribution yet.
closed account (o1vk4iN6)
That's a pretty trippy pattern. Is it the sine curve going outward from the center? Yeah, I saw that post you made that you deleted; it even had a reply that got dumped :<.

You could also gather the necessary system specs yourself :P.

http://pastebin.com/rTB10DzD

Windows 7 Enterprise, Intel Q965/Q963 chipset, and the driver looks like the stock Windows one from 2009.

I'll run it again on my computer when I get home.
Windows 7 Ultimate x86-64, GTX 560 Ti, 9.18.13.697

http://pastebin.com/GexAnmJC
is it the sine curve going outward from the center? Yeah, I saw that post you made that you deleted; it even had a reply that got dumped :<.
Sorry about that.
It's a Bézier curve. Although my post was about sine, what I was really interested in was finding the arc length of curves in general.
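In case anyone's curious, the general approach is to integrate the speed, L = integral from 0 to 1 of |B'(t)| dt, which has no closed form for a cubic Bézier, so you approximate it numerically. A rough sketch of chord-length flattening (the control points and step count here are made-up examples, not from my program):

// Sketch: approximate the arc length of a cubic Bezier by flattening it
// into a polyline and summing chord lengths. Control points and step
// count below are made-up examples.
#include <cmath>
#include <cstdio>

struct Vec2 { double x, y; };

// Evaluate a cubic Bezier at parameter t in [0, 1].
Vec2 bezier(Vec2 p0, Vec2 p1, Vec2 p2, Vec2 p3, double t)
{
    double u = 1.0 - t;
    double b0 = u * u * u, b1 = 3 * u * u * t, b2 = 3 * u * t * t, b3 = t * t * t;
    return { b0 * p0.x + b1 * p1.x + b2 * p2.x + b3 * p3.x,
             b0 * p0.y + b1 * p1.y + b2 * p2.y + b3 * p3.y };
}

// L = integral from 0 to 1 of |B'(t)| dt, approximated by chord sums.
double arc_length(Vec2 p0, Vec2 p1, Vec2 p2, Vec2 p3, int steps)
{
    double length = 0.0;
    Vec2 prev = p0;
    for (int i = 1; i <= steps; ++i) {
        Vec2 cur = bezier(p0, p1, p2, p3, (double)i / steps);
        length += std::hypot(cur.x - prev.x, cur.y - prev.y);
        prev = cur;
    }
    return length;
}

int main()
{
    double len = arc_length({0, 0}, {1, 2}, {3, 2}, {4, 0}, 1000);
    std::printf("approx. length: %f\n", len);
}

More steps converge on the true length; adaptive subdivision is the usual refinement.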

You could also gather the necessary system specs yourself :P.
No, I'm probing the limits of the program's performance. So far I've managed to get 10000 particles on the screen at 60 fps, but I'm getting lag spikes from SwapBuffers() at regular intervals, and I'm trying to pin down the source.
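The measurement itself is nothing fancy; roughly this (a sketch, not my actual engine code; it assumes an HDC from the window the GL context uses):

// Sketch: time each SwapBuffers() call with QueryPerformanceCounter and
// print one line per frame; redirecting stdout produces the fps.txt log.
#include <windows.h>
#include <cstdio>

void frame_loop(HDC dc)  // dc: device context of the GL window
{
    LARGE_INTEGER freq, t0, t1;
    QueryPerformanceFrequency(&freq);
    for (;;) {
        // ... update and render the frame here ...
        QueryPerformanceCounter(&t0);
        SwapBuffers(dc);                       // the lag spikes show up here
        QueryPerformanceCounter(&t1);
        double ms = 1000.0 * (t1.QuadPart - t0.QuadPart) / (double)freq.QuadPart;
        std::printf("%f\n", ms);               // Engine.exe > fps.txt
    }
}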

See what I mean:
http://imgur.com/BpbJQ
The blue line is my run, the reddish one is yours (higher is better performance). Although on average it runs faster on my computer, the variance is three times as bad.

EDIT: Here's chrisname's run included:
http://imgur.com/cP3PI
His variance is pretty bad, too (2.55x relative to xerzi's), but not quite as bad as mine. It's possible Nvidia cards just do that.
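If anyone wants to check my numbers, the comparison comes from nothing more elaborate than this kind of calculation over the logs (a sketch; "fps.txt" is just an example name):

// Sketch: read one frame time per line from a redirected log and report
// mean, variance, and standard deviation.
#include <cmath>
#include <cstdio>
#include <fstream>
#include <vector>

int main()
{
    std::ifstream in("fps.txt");
    std::vector<double> v;
    for (double x; in >> x; )
        v.push_back(x);
    if (v.empty())
        return 1;
    double mean = 0.0;
    for (double d : v) mean += d;
    mean /= v.size();
    double var = 0.0;
    for (double d : v) var += (d - mean) * (d - mean);
    var /= v.size();
    std::printf("mean %f  variance %f  stddev %f\n", mean, var, std::sqrt(var));
}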
This looks a bit like a parody of one of those "try my game.exe" threads, though I can't recall whether you are among the "never download anything" crowd. Regardless: http://pastebin.com/SX43T7d2 although it might be useless? The setup is Wine (as Windows XP). The real OS is openSUSE 12.1 (not sure if that has any relevance). The card is a GeForce GT 525M. The driver is nouveau.
I can't use data from systems with vsync forced on.
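If you can, turn it off before running. On Windows that's the WGL_EXT_swap_control extension, something like the sketch below (the driver control panel can still force vsync back on, which is exactly what makes forced-vsync data unusable):

// Sketch: request no vsync through WGL_EXT_swap_control.
// Needs a current GL context.
#include <windows.h>
#include <GL/gl.h>

typedef BOOL (WINAPI *PFNWGLSWAPINTERVALEXTPROC)(int interval);

void disable_vsync()
{
    PFNWGLSWAPINTERVALEXTPROC wglSwapIntervalEXT =
        (PFNWGLSWAPINTERVALEXTPROC)wglGetProcAddress("wglSwapIntervalEXT");
    if (wglSwapIntervalEXT)
        wglSwapIntervalEXT(0);  // swap interval 0 = present immediately
}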
Do you have a version I can run on my Linux machine? Or is it only compiled for Windows?
It's only compiled for Windows. I don't have the time to set up a second build environment, sorry.

Plus, Linux is a nightmare when it comes to distributing binaries.
http://pastebin.com/9dbtB28M

Windows 7 Pro, i7 Q820, ATI Mobility Radeon 4670, driver version 8.902.0.0
Alright, I've verified that it's not my code doing this. Quake 3 Arena shows the same behavior when I turn vsync off. OpenGL just seems to do that for me, for some reason.
Dunno if you still need stuff:

http://pastebin.com/dPaVkkK7

------------------
System Information
------------------
Time of this report: 11/26/2012, 14:16:35
Machine name: ITSACOMPUTER
Operating System: Windows 7 Home Premium 64-bit (6.1, Build 7600) (7600.win7_gdr.120830-0334)
Language: English (Regional Setting: English)
System Manufacturer: TOSHIBA
System Model: Satellite A665
BIOS: Phoenix SecureCore Version 2.00
Processor: Intel(R) Core(TM) i3 CPU M 370 @ 2.40GHz (4 CPUs), ~2.4GHz
Memory: 4096MB RAM
Available OS Memory: 3890MB RAM
Page File: 1982MB used, 5796MB available
Windows Dir: C:\windows
DirectX Version: DirectX 11
DX Setup Parameters: Not found
User DPI Setting: Using System DPI
System DPI Setting: 96 DPI (100 percent)
DWM DPI Scaling: Disabled
DxDiag Version: 6.01.7600.16385 32bit Unicode
Thanks everyone for your help. I finally found the source of my problem.
Process Explorer can monitor GPU usage since version 15. Every time PE refreshed its state, it added a delay of approx. 15 ms to two or three frames in a row, which is huge: at 60 fps a frame only lasts 16.67 ms, so each affected frame took nearly double its budget.

Lesson learned. To run expensive OpenGL applications, turn PE off.

EDIT: Oh, and here's what my timings look like now: http://imgur.com/J3bwt
The y axis is milliseconds or thousands of sprites. The x axis is frame number.
The blue line is time spent that frame creating new objects.
The reddish line is time spent on the CPU that frame, including animating sprites and generating vertex data to send to the GPU, but excluding the time counted in the blue line.
The green line is sprite count in thousands.
The brown line is total time for the frame.
That's odd, because I don't have Process Explorer, but there was a lot of variance on my GPU as well.
It is odd. If you weren't running something in the background, I can't imagine what could cause the framerate to momentarily drop by 18x.