Dictionary of Computer and Internet Terms: rounding error
rounding error
an error that occurs because the computer cannot store the true value of most real numbers; instead, it can store only an approximation consisting of a finite number of digits.
If you wish to write 1/3 as a decimal fraction, you can approximate it as 0.333333333, but expressing it exactly would require an infinite number of digits. The computer faces the same problem, except that it stores numbers internally in binary, and it can lose accuracy when converting between binary and decimal. For example, 0.1 has no exact binary representation; if you add the computer's representation of 0.1 to 0 ten times, you will not get exactly 1. To avoid rounding error, some computer programs represent numbers as decimal digits.
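The effect described above can be seen in a short Python sketch (assuming IEEE 754 double-precision floats, which standard Python uses); the `decimal` module illustrates the decimal-digit representation mentioned at the end of the entry:

```python
from decimal import Decimal

# Binary floating point: 0.1 is stored only approximately,
# so ten additions accumulate a small rounding error.
total = 0.0
for _ in range(10):
    total += 0.1
print(total)         # prints 0.9999999999999999, not 1.0
print(total == 1.0)  # prints False

# Decimal representation: 0.1 is stored exactly as a decimal digit string,
# so the same sum comes out exact.
dec_total = sum(Decimal("0.1") for _ in range(10))
print(dec_total == 1)  # prints True
```

Note that `Decimal("0.1")` is constructed from a string; writing `Decimal(0.1)` would first create the inexact binary float and defeat the purpose.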