Unexpected test results?

Hello. I wrote a program to test which is faster: a static_cast plus an enum check, or a dynamic_cast. It runs a configurable number of tests over a configurable number of objects, averages the times, and prints the results. These are the results:

tests: 100
objects: 100

average dynamic time: 142(μs)
average static time: 1(μs)

tests: 100
objects: 1000

average dynamic time: 1376(μs)
average static time: 2(μs)

Does this seem off to you? Is it because static_cast doesn't do runtime checks? I don't see how such a big difference is possible. I'm going to run the program again with a much larger size (it takes a while), but I honestly don't expect it to change much.

Here is this program:

http://pastebin.com/1FchZGZT

It's very straightforward. Thank you for the read!

Oh, and one more question: would multithreading make the test less accurate? The tests would finish faster, but I wasn't sure whether that would taint the times.
You never create `Actor'(s), only `GameObject'(s);
however, ¿where are you setting their `type' variable?
If they happen to have type=actor then your static_cast is a bad cast, and using it is undefined behaviour.


By the way, ¿are you compiling with optimizations?
You never create `Actor'(s) only `GameObject'(s),


Right..! I changed it. It still didn't affect the static cast time.

however ¿where are you setting their `type' variable?


Oops, I had accidentally put it in the Actor constructor. It still didn't affect the static cast time : /

By the way, ¿are you compiling with optimizations?


I'm using VS 2013, so I'd assume so. Should I use inline or something else?

It seems like a breakpoint on the static_cast part isn't being hit, so I'm obviously doing something really wrong. I've updated the pastebin with my new code.
I'm using VS 2013, so I'd assume so...

So you are building and running the release target rather than the debug one?

Andy
No, I'm running debug. I don't have SFML set up for release right now.
No, I'm running debug.

Then your assumption about optimization is wrong -- the debug build in Visual Studio uses /Od for compilation -- that is, absolutely no optimization (O is for optimization and d for disabled).

Andy
With single simple inheritance, the static down-cast has no run-time overhead at all (not even the subtraction of an offset).

So the comparison is between:
a. the cost of the run-time comparison of two integers
b. the cost of a dynamic down-cast with simple inheritance.

No tests are required; the dynamic cast would be an order of magnitude slower (unless the optimiser is clever enough and the context is simple enough to allow the dynamic type to be inferred at compile time from static flow analysis).
Topic archived. No new replies allowed.