Percent Error Calculator
Percent error quantifies how far an experimental measurement deviates from a known or accepted value, expressed as a percentage. It is a standard metric in chemistry, physics, and engineering for evaluating measurement accuracy. A lower percent error indicates a measurement that closely matches the accepted value, while a higher percent error reveals greater inaccuracy.
The formula takes the absolute difference between the measured and actual values, divides it by the absolute value of the actual value, and multiplies by 100. The absolute value ensures the result is always positive, reflecting the magnitude of deviation regardless of direction.
Three related values are calculated: the absolute error (the raw difference between measured and actual), the relative error (absolute error divided by the actual value), and the percent error (relative error times 100). Each tells a different part of the accuracy story.
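The three quantities above can be sketched in a few lines of Python (the helper name `error_metrics` is my own, chosen for illustration):

```python
def error_metrics(measured, actual):
    """Return (absolute error, relative error, percent error) for a measurement.

    absolute error = |measured - actual|
    relative error = absolute error / |actual|
    percent error  = relative error * 100
    """
    absolute = abs(measured - actual)
    relative = absolute / abs(actual)
    return absolute, relative, relative * 100

# The boiling-point example from this page:
abs_err, rel_err, pct_err = error_metrics(101.3, 100.0)
print(round(pct_err, 1))  # 1.3
```

Note that floating-point arithmetic can introduce tiny rounding artifacts (e.g. `1.2999999999999972` instead of `1.3`), so results are best rounded or compared with a tolerance rather than tested for exact equality.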
Suppose you conduct a lab experiment and record the boiling point of water as 101.3 degrees Celsius. The accepted boiling point at standard pressure is 100.0 degrees Celsius.
Measured Value = 101.3
Actual Value = 100.0
Absolute Error = |101.3 - 100.0| = 1.3
Relative Error = 1.3 / |100.0| = 0.013
Percent Error = 0.013 x 100 = 1.3%
A 1.3% error is generally acceptable for a classroom experiment, suggesting your thermometer and technique produced a reliable measurement.
Can percent error be negative?
By the standard formula, no. The absolute value in the numerator ensures the result is always zero or positive. Some instructors use a signed version (without the absolute value) to show whether the measurement was above or below the actual value, but the unsigned version is the most widely accepted definition.
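The two conventions differ only in whether the numerator keeps its sign. A minimal sketch (the `signed` parameter is my own way of expressing the instructor's variant):

```python
def percent_error(measured, actual, signed=False):
    """Standard (unsigned) percent error by default; signed=True keeps
    the sign so a negative result means the measurement was below actual."""
    diff = measured - actual
    if not signed:
        diff = abs(diff)
    return diff / abs(actual) * 100

print(percent_error(98.0, 100.0))               # 2.0  (standard, always >= 0)
print(percent_error(98.0, 100.0, signed=True))  # -2.0 (measured below actual)
```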
What happens if the actual value is zero?
The formula requires dividing by the actual value, so when the actual value is exactly zero the result is undefined (division by zero). In such cases, consider using absolute error alone or a different error metric like mean absolute error.
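In code, the zero case needs to be handled explicitly rather than allowed to raise a `ZeroDivisionError`. One way to sketch this (returning `None` for the undefined case is my own convention, not a standard):

```python
def percent_error_safe(measured, actual):
    """Percent error, or None when the actual value is zero (undefined)."""
    if actual == 0:
        return None  # division by zero: fall back to absolute error instead
    return abs(measured - actual) / abs(actual) * 100

print(percent_error_safe(0.5, 0))  # None
```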
What counts as an acceptable percent error?
There is no universal threshold. In a college chemistry lab, anything under 5% is typically considered acceptable. In precision manufacturing, tolerances may demand errors below 0.1%. The acceptable range depends entirely on the context and how sensitive the application is to measurement accuracy.
What is the difference between percent error and percent difference?
Percent error compares a measured value to a known standard (actual/theoretical value). Percent difference compares two measured values to each other using their average as the denominator. Use percent error when you have an accepted reference value; use percent difference when comparing two independent measurements.
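The distinction comes down to the denominator. A side-by-side sketch (function names are my own):

```python
def percent_error(measured, actual):
    """Deviation from a known reference value."""
    return abs(measured - actual) / abs(actual) * 100

def percent_difference(a, b):
    """Deviation between two measurements, relative to their mean."""
    return abs(a - b) / ((abs(a) + abs(b)) / 2) * 100

# Two independent measurements of the same quantity:
print(percent_difference(9.8, 10.2))   # ~4.0 (denominator is the mean, 10.0)
# One measurement against an accepted value:
print(percent_error(9.8, 10.0))        # ~2.0 (denominator is the actual, 10.0)
```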
Can percent error exceed 100%?
Yes. If the measured value deviates from the actual value by more than the actual value itself, percent error will exceed 100%. For example, measuring 250 when the actual value is 100 gives a percent error of 150%.
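Checking the example above with the standard formula:

```python
# Measuring 250 against an actual value of 100:
measured, actual = 250.0, 100.0
pct = abs(measured - actual) / abs(actual) * 100
print(pct)  # 150.0 -- the deviation (150) exceeds the actual value (100)
```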