
Getting around the rounding error in binary division

How do x86 processors usually go about dividing numbers while getting around the rounding error of binary division? That's probably a broad question with a lot of answers, but if you know any techniques they use to divide numbers, I'd like to know.
They don't -- the rounding error propagates.
But how do computers deal with it? Why don't we just get numbers really close to the actual answer whenever a program does division? Rounding? But then, if the answer really was supposed to be that un-rounded number, the result would be wrong... Clearly I'm pretty ignorant on the subject. Haha.
Computers don't deal with it -- they are too stupid.

A smart programmer will organize his mathematical expressions in a way that keeps the error as small as he needs it to be.

It might be worth your time to read through this:
Topic archived. No new replies allowed.