What Is the Difference Between Accuracy and Precision?

Accuracy and precision are two important factors to consider when taking measurements. Accuracy reflects how close a measurement is to a known or accepted value, while precision reflects how close repeated measurements are to one another. In this article, let us learn about both in detail.

Precision is a measure of how close a series of measurements is to one another. Precise measurements are highly reproducible, even if they are not near the correct value. Darts thrown at a dartboard are a helpful illustration of accuracy and precision. The ISO applies a stricter definition, in which accuracy refers to a measurement with both true and consistent results; under this definition, an accurate measurement has no systematic error and no random error.


The top-left image shows the target hit with high precision and high accuracy. The top-right image shows the target hit with high accuracy but low precision. The bottom-left image shows the target hit with high precision but low accuracy.

If your scale gives you values of 49.8, 50.5, 51.0, and 49.6 for a 50.0 g standard, it is reasonably accurate but not very precise: the average of the measurements is 50.2, but there is a large range between them. A precise scale with a known offset would be the better one to use in the lab, provided you made an adjustment for its error. In other words, it is better to calibrate a precise instrument than to use an imprecise yet accurate one. In science and engineering, the accuracy of a measurement system is the degree of closeness of measurements of a quantity to that quantity’s true value. The precision of a measurement system, related to reproducibility and repeatability, is the degree to which repeated measurements under unchanged conditions show the same results.
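As a rough sketch, the two ideas can be separated numerically: the offset of the mean from a reference value gauges accuracy, while the spread of repeated readings gauges precision. The readings below are made up for illustration; balance B matches the values above.

```python
from statistics import mean, stdev

TRUE_VALUE = 50.0  # reference mass in grams

balance_a = [47.4, 47.5, 47.6, 47.5]  # tight spread, far from 50.0
balance_b = [49.8, 50.5, 51.0, 49.6]  # near 50.0, wider spread

for name, readings in [("A", balance_a), ("B", balance_b)]:
    bias = mean(readings) - TRUE_VALUE  # accuracy: closeness of the mean
    spread = stdev(readings)            # precision: agreement of repeats
    print(f"Balance {name}: bias = {bias:+.2f} g, spread = {spread:.2f} g")
```

Balance A is precise but not accurate; balance B is the reverse.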

The term is defined more tightly in certain fields; scientists, for example, distinguish accuracy from precision. You can think of accuracy and precision in terms of a basketball player. If the player always makes a basket, even though he strikes different portions of the rim, he has a high degree of accuracy. If he doesn’t make many baskets but always strikes the same portion of the rim, he has a high degree of precision.


Although the two words precision and accuracy can be synonymous in colloquial use, they are deliberately contrasted in the context of the scientific method. A cognitive process sometimes produces exactly the intended or desired output and sometimes produces output far from it; furthermore, repetitions of a cognitive process do not always produce the same output. Cognitive accuracy is the propensity of a cognitive process to produce the intended or desired output.


Measurements that are both precise and accurate are repeatable and very near true values. According to ISO, the general term "accuracy" is used to describe the closeness of a measurement to the true value. When the term is applied to sets of measurements of the same measurand, it involves a component of random error and a component of systematic error. In this case, trueness is the closeness of the mean of a set of measurement results to the actual value, and precision is the closeness of agreement among the results. While scales and balances might allow you to tare, or make an adjustment so that measurements are both accurate and precise, many instruments require calibration.
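Since the mean of repeated readings estimates trueness and their spread estimates precision, a precise but biased instrument can be corrected in software. A minimal sketch, assuming a known 50.0 g reference and made-up readings:

```python
from statistics import mean

def calibrate(readings, reference):
    """Estimate the systematic error (bias) against a known reference
    value, then return bias-corrected readings."""
    bias = mean(readings) - reference       # trueness component
    return [round(r - bias, 3) for r in readings]

# A precise but biased instrument: every reading runs about 2.5 g high.
raw = [52.4, 52.5, 52.6, 52.5]
corrected = calibrate(raw, reference=50.0)  # now centred on 50.0
```

Note that calibration removes only the systematic component; the random spread of the readings is unchanged.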

Where not explicitly stated, the margin of error is understood to be one-half the value of the last significant place. For instance, a recording of 843.6 m, 843.0 m, or 800.0 m would imply a margin of 0.05 m, while a recording of 843 m would imply a margin of error of 0.5 m. The difference between the mean of the measurements and the reference value is called the bias; establishing and correcting for bias is necessary for calibration. In military terms, accuracy refers primarily to the accuracy of fire: the precision of fire expressed by the closeness of a grouping of shots at and around the centre of the target. Alternatively, ISO defines accuracy as describing a combination of both types of observational error (random and systematic), so high accuracy requires both high precision and high trueness.
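The "half the last significant place" convention can be captured in a small helper. A sketch assuming the recording is given as a decimal string (trailing-zero ambiguity in values like 800 is ignored here):

```python
def implied_margin(recorded: str) -> float:
    """Margin of error implied by a recorded value: one-half the value
    of its last written decimal place."""
    if "." in recorded:
        decimals = len(recorded.split(".", 1)[1])
        return 0.5 * 10.0 ** (-decimals)
    return 0.5  # last written place is the ones place

# implied_margin("843.6") and implied_margin("800.0") give 0.05;
# implied_margin("843") gives 0.5.
```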


All processes should have a target value you wish to achieve; how close you come to hitting that target is the measure of your accuracy. Correct, accurate, exact, precise, nice, and right all mean conforming to fact, standard, or truth. In industrial instrumentation, accuracy is the measurement tolerance, or transmission, of the instrument, and it defines the limits of the errors made when the instrument is used in normal operating conditions.


In other words, accuracy is the extent to which the average of the measurements deviates from the true value. Decisions can be made highly effective if the measurement data have high accuracy and precision, and both can be achieved only if the variations in the measurement system are small. Accuracy is the degree to which an expression or measurement conforms to a true value.

Accuracy is also the extent to which a given measurement agrees with the standard value for that measurement. For example, suppose the weights of raw-material bags seemed to be lower than expected: while the weights from bag to bag showed little variation, the average was lower than the target weight provided by the vendor, so the process was precise but not accurate. Of the synonyms above, precise adds to exact an emphasis on sharpness of definition or delimitation, while exact stresses a very strict agreement with fact, standard, or truth. When talking about the accuracy of medical tests, doctors and researchers use the terms sensitivity and specificity.


If you use an unmarked flask to try to obtain 1 liter of liquid, you’re likely not going to be very accurate. If you use a 1-liter beaker, you’ll probably be accurate within several milliliters. If you use a volumetric flask, the accuracy of the measurement may be within a milliliter or two. Accurate measuring tools, such as a volumetric flask, are usually labeled so a scientist knows what level of accuracy to expect from the measurement. In math, science, and engineering, accuracy refers to how close a measurement is to the true value. In logic simulation, a common mistake in evaluation of accurate models is to compare a logic simulation model to a transistor circuit simulation model.

The validity of a measurement instrument or psychological test is established through experiment or correlation with behavior.

In image classification, a network’s class predictions are sorted by score; under top-5 accuracy, a classification is considered correct if the correct class falls anywhere within the top 5 predictions made by the network. Top-5 accuracy is usually at least as high as top-1 accuracy, since a correct prediction in the 2nd through 5th positions does not improve the top-1 score but does improve the top-5 score. A common convention in science and engineering is to express accuracy and/or precision implicitly by means of significant figures.
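The top-1 versus top-5 scoring just described can be sketched as a small function. The scores and labels below are made up for illustration:

```python
def topk_accuracy(scores, labels, k=5):
    """Fraction of samples whose true label is among the k
    highest-scoring predicted classes."""
    hits = 0
    for row, label in zip(scores, labels):
        ranked = sorted(range(len(row)), key=lambda i: row[i], reverse=True)
        if label in ranked[:k]:
            hits += 1
    return hits / len(labels)

# Toy example: 3 samples, 6 classes (scores are invented).
scores = [
    [0.1, 0.6, 0.1, 0.1, 0.05, 0.05],  # true class 1 ranked 1st
    [0.3, 0.2, 0.25, 0.1, 0.1, 0.05],  # true class 2 ranked 2nd
    [0.5, 0.2, 0.1, 0.1, 0.05, 0.05],  # true class 5 ranked last
]
labels = [1, 2, 5]
```

Here `topk_accuracy(scores, labels, k=1)` gives 1/3 while `k=5` gives 2/3, illustrating why the top-5 score is never lower than the top-1 score.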

Difference between Accuracy and Precision

She says she can type 85 words per minute with 90% accuracy. The difference between the actual value and the measured value is known as error. Basketball is one of those sports where you need to hit the target. A football field-goal kicker has room for some deviation from a straight line: in college and pro football there is an 18-foot-6-inch space for the ball to go through. In basketball, the basket is only 18 inches across and the ball is a little less than 10 inches across, leaving little room for error. The ball has to be on target in order to go into the basket and score.


The accuracy of an instrument at only a particular point on its scale is known as point accuracy. It is important to note that point accuracy does not give any information about the general accuracy of the instrument. Six Sigma professionals will agree that achieving precision should be the first goal of any improvement activity, and only then should you worry about accuracy.


In medicine, it is important to differentiate between these two measures of accuracy. For example, a test with low specificity would not make a useful screening tool: it would flag too many people without the disease as positive, causing them to undergo unnecessary diagnostic procedures. A test with 96% specificity, for every 100 people without the virus, would likely identify 96 correctly as virus-free and incorrectly return four positive results.
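Both measures fall out directly from the four outcome counts of a test. A minimal sketch; the counts below mirror the 96-in-100 specificity example:

```python
def sensitivity(true_pos, false_neg):
    """Of the people who have the condition, the fraction the test flags."""
    return true_pos / (true_pos + false_neg)

def specificity(true_neg, false_pos):
    """Of the people who do not have it, the fraction the test clears."""
    return true_neg / (true_neg + false_pos)

# For every 100 people without the virus: 96 correct negatives,
# 4 false positives.
spec = specificity(true_neg=96, false_pos=4)   # 0.96
```

A screening test wants high sensitivity (few missed cases); the follow-up diagnostic test then supplies the specificity.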

Key Takeaways: Accuracy Versus Precision

Accuracy is the degree of closeness of the measurements to the target or reference value. Precision, on the other hand, is the degree of closeness of the measurements to each other. Scientists often report the percent error of a measurement, which expresses how far a measured value is from the true value.

Precision is how consistent results are when measurements are repeated. Precise values differ from each other because of random error, which is a form of observational error. Percent error is used to assess whether a measurement is sufficiently accurate and precise. Accuracy is the proximity of measurement results to the accepted value; precision is the degree to which repeated measurements under unchanged conditions show the same results. In the first, more common definition of "accuracy" above, the concept is independent of "precision", so a particular set of data can be said to be accurate, precise, both, or neither.
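Percent error, mentioned above, is straightforward to compute. A minimal sketch (the 49.8 g reading of a 50.0 g standard is a made-up example):

```python
def percent_error(measured, true_value):
    """How far a measured value is from the true value,
    expressed as a percentage of the true value."""
    return abs(measured - true_value) / abs(true_value) * 100

error = percent_error(measured=49.8, true_value=50.0)   # about 0.4 %
```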

Essentially, the ISO advises that accurate be used when a measurement is both accurate and precise. You can think of accuracy and precision in terms of hitting a bull’s-eye. Accurately hitting the target means you are close to the center of the target, even if all the marks are on different sides of the center. Precisely hitting a target means all the hits are closely spaced, even if they are very far from the center of the target.

Having discussed in the previous few sections what each term means, let us now look at their differences.

The terminology is also applied to indirect measurements, that is, values obtained by a computational procedure from observed data. A high level of accuracy can provide a highly precise and stable measurement. Accuracy refers to the closeness of the measured value to a standard or true value. It is possible for a measurement to be accurate on occasion as a fluke; for a measurement to be consistently accurate, it should also be precise. Results can be precise without being accurate.

Posted in: Software development
