
What you should know about the accuracy and repeatability of UCI probes



Article highlights

  • "Accuracy" on its own is an imprecise term for Ultrasonic Contact Impedance (UCI) probes; it should be replaced by two parameters: measurement deviation and coefficient of variation
  • Measurement deviation (referred to by the standards as accuracy) and coefficient of variation (repeatability) should be used together to characterize UCI performance, as demanded by the most rigorous standards
  • The ASTM A1038 standard controls only measurement deviation and does not monitor repeatability
  • DIN 50159 and GB/T 34205 are the most rigorous UCI standards and ensure both accuracy and repeatability
  • Best practice is to use UCI probes calibrated against all three standards, so that measurements are not only accurate but also repeatable

After reading this five-minute article, you will understand the difference between accuracy and precision, why both are essential, and how to make an informed choice when selecting your equipment.



Many manufacturers' technical sheets speak of UCI accuracy and of how accurate their probes are, but this only adds to the confusion among users when it comes to the technique itself.

So why is "probe accuracy" an inaccurate statement? It refers to how accurate the technology and its components are, but it does not define what accuracy the probe can deliver across multiple measurement points.

Even more critical is that the UCI method is typically executed with a handheld device, so the operator's experience and handling contribute to the absolute values. For the UCI method, two parameters that directly describe the probe's performance are of much greater significance to the user: measurement deviation (referred to by the standards as accuracy) and coefficient of variation (repeatability). Both are used to calibrate devices complying with the most rigorous standards (DIN 50159 and GB/T 34205).


Measurement deviation (accuracy) and coefficient of variation (repeatability) 

How are these two parameters described and what do they mean?  

According to DIN 50159, ASTM A1038, and GB/T 34205, the measurement deviation (accuracy) is defined as follows:


E = (H̄ − H) / H × 100 %

(E – measurement deviation, H̄ – average value of n measurements, H – reference value, i.e. the certified value of the test block)

In other words: it describes how much the average value deviates from the reference value, on a percentage scale, and is strongly correlated with the quality of the reference block and of the calibration.
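The definition above can be sketched in a few lines of Python. The five readings and the 540 HV reference value are illustrative assumptions, not values taken from any of the standards:

```python
def measurement_deviation(readings, reference):
    """Measurement deviation E = (mean - reference) / reference * 100, in %."""
    mean = sum(readings) / len(readings)
    return (mean - reference) / reference * 100.0

# Five hypothetical UCI readings on a 540 HV reference test block:
readings = [538, 542, 545, 536, 541]
print(round(measurement_deviation(readings, 540.0), 2))  # → 0.07
```

The mean of the five readings is 540.4 HV, so the device reads on average 0.07 % above the reference value.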

The coefficient of variation of a UCI device (repeatability) is defined in DIN 50159 and GB/T 34205 and describes the difference between the highest and the lowest hardness value relative to the average:


r = (Hmax − Hmin) / H̄ × 100 %

(r – repeatability, Hmin and Hmax – the lowest and the highest hardness values respectively, H̄ – average value)

In other words: it describes how widely the measurement values are scattered around each other. Repeatability depends mainly on the quality of the instrument and is sometimes used interchangeably with the precision of the device.
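The coefficient of variation can likewise be computed in a few lines of Python; the readings are illustrative assumptions, not values from any standard:

```python
def repeatability(readings):
    """Coefficient of variation r = (Hmax - Hmin) / mean * 100, in %."""
    mean = sum(readings) / len(readings)
    return (max(readings) - min(readings)) / mean * 100.0

readings = [538, 542, 545, 536, 541]   # hypothetical UCI readings, HV
print(round(repeatability(readings), 2))  # → 1.67
```

Here the spread between the extreme readings (9 HV) amounts to 1.67 % of the 540.4 HV average: a device can thus show a near-zero measurement deviation while still scattering noticeably from reading to reading.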


Why is repeatability important?

To better illustrate the meaning of accuracy and repeatability, we use a simple target diagram. Typically, several measurements are executed to compute the average, which is then compared against the reference test block. In the example below, four possible measurement outcomes are compared and arranged in two columns: (a) low accuracy and (b) high accuracy.

In both columns a and b, the red point indicates the computed average, which is identical within each column. High accuracy combined with low repeatability means that a larger population of measurement points is needed to compute a reliable average, because the single data points are widely spread.



This is a problem for many hardness-testing applications, for example Heat-Affected Zones (HAZ), where a weld is inspected by collecting a hardness profile consisting of single measurements. In this case, the readings may be distorted to the extent that the border between the affected and unaffected zones becomes blurred or hard to spot. Furthermore, devices are calibrated in laboratories under highly controlled, high-precision conditions that minimize the user's influence on the measurement; in the field, users work on non-ideal surfaces and do not always hold the probe perpendicular to the tested surface, which is crucial. Hence, accurate but non-repeatable devices add unnecessary deviation, degrading the quality and reliability of the data.
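The HAZ argument can be made concrete with a toy simulation: assume a sharp 200-to-350 HV step at the weld boundary and compare a probe with small scatter against one with large scatter. All numbers here are illustrative assumptions, not measured data:

```python
import random

random.seed(0)

def haz_profile(scatter_hv):
    """Single hardness readings along a line crossing a sharp HAZ boundary at x = 5."""
    return [(200 if x < 5 else 350) + random.gauss(0, scatter_hv)
            for x in range(10)]

repeatable_probe = haz_profile(scatter_hv=5)    # step in the profile clearly visible
scattered_probe = haz_profile(scatter_hv=60)    # single readings can overlap across the boundary
```

With 60 HV of scatter, a base-material reading can exceed a heat-affected reading, so a profile built from single measurements no longer shows where the affected zone begins.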

What limits do manufacturers use, and why do those values depend on hardness and test force?

The following table serves as a guideline for the maximum tolerable measurement deviation and repeatability. Please note that these values are used for device calibration by the manufacturers, not as a basis for the daily verification conducted by the end user.


The UCI method uses a vibrating rod with a Vickers diamond at its tip to measure the indentation depth. The rod vibrates at a specific frequency (f0) that changes upon indentation (fi) in response to the contact area between the diamond and the material: the deeper the indentation, the greater the frequency change. Typically, the indentation depth varies between 5 and 35 microns.

Larger frequency changes can be measured more accurately, which means that the greater the contact area of the diamond (i.e. the deeper the indentation), the lower the uncertainty of the measurement.
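This reasoning can be illustrated with a toy calculation: assume the instrument resolves the frequency shift to a fixed ±1 Hz (a made-up figure, not taken from any standard). A small shift then carries a much larger relative uncertainty than a large one:

```python
def relative_uncertainty(delta_f_hz, resolution_hz=1.0):
    """Relative uncertainty (%) of a frequency-shift reading at fixed absolute resolution."""
    return resolution_hz / delta_f_hz * 100.0

print(relative_uncertainty(50.0))    # shallow indentation, small shift -> 2.0 %
print(relative_uncertainty(500.0))   # deep indentation, large shift   -> 0.2 %
```

The same ±1 Hz resolution costs ten times more, in relative terms, when the indentation is shallow, which is why tolerance limits widen for low test forces and hard materials.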

The higher tolerance values for lower HV scales and harder material ranges stem from the fact that at low scales such as HV 0.1 to HV 0.8 only a small force is applied, leading to shallower penetration of the material by the indenter.

Combined with other measurement uncertainties, such as a probe angle deviating from 90° or an unsteady hand, this can further increase the measurement deviation. The effect is even more pronounced for harder materials, where the penetration depth is lower still. In other words: the highest accuracy is expected for soft materials measured on the HV 10 scale, whereas hard materials measured with low force are more prone to errors.


What is the best practice?

This article has shown how accuracy and repeatability are computed and why repeatability matters to end users.

It is important to highlight that the ASTM standard does not demand repeatability during the calibration process, so users cannot rule out purchasing an accurate but non-repeatable instrument.

It is therefore advisable to use devices that are also controlled for repeatability, as demanded by the German DIN 50159 and Chinese GB/T 34205 standards. By using devices compliant with all three standards, end users ensure that their equipment is best-in-class not only in accuracy but also in repeatability and, above all, in the reliability of the collected data.


Note: This document shows only a fraction of the information described in DIN 50159, ASTM A1038, and GB/T 34205-2017. Screening Eagle Technologies has done everything in its power to translate the sections of the DIN 50159 and GB/T 34205-2017 standards accurately. For authorized translations or more information, interested readers are encouraged to consult the full versions of DIN 50159, ASTM A1038, and GB/T 34205-2017, available from the respective standards bodies.



Metallic materials – Hardness testing with the UCI method – Part 2: Verification and calibration of the hardness testing devices, DIN 50159-2:2015-01, 2015

Standard Test Method for Portable Hardness Testing by the Ultrasonic Contact Impedance Method, ASTM A1038-19, 2019

Metallic materials – Hardness testing – Ultrasonic contact impedance method, GB/T 34205-2017, 2017

Portable Hardness Testing: Theory, Practice, Applications, Guidelines. Burnat, D., Raj, L., Frank, S., Ott, T. Schwerzenbach: Screening Eagle Technologies AG, 2022.