All measuring devices have some inaccuracy; even the ones designated as standards come with specified uncertainties. The packaging usually lists an uncertainty (something like "+/- 0.1 grams"), but that figure assumes the device is properly calibrated to begin with, and chances are it's NOT. Calibrating a device boils down to comparing it to a known standard and adjusting it to read as close to that standard as possible. Only once it's properly calibrated does the stated uncertainty start to mean something.
Everything in our lives is subject to this. A great example is the oven in your kitchen. Just because the knob says 350 doesn't mean it's actually 350. Maybe you have a digital model with buttons and a built-in thermometer? Just because the display says 350 doesn't mean it's 350 either. One way to "calibrate" it is to put in an external thermometer you trust and measure the actual temperature. Keep adjusting the setting until the thermometer settles at 350. Now you know what to set your oven to!
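If you wanted to script that correction, a single-point calibration is just a constant offset. Here's a minimal sketch in Python; the specific numbers are made up for illustration:

```python
# Single-point calibration: treat the oven's error as one constant offset.
# These numbers are hypothetical, loosely matching the 350/380 example below.

target = 350           # temperature the recipe actually wants
trusted_reading = 320  # what an external thermometer read with the dial at 350

offset = target - trusted_reading  # oven runs 30 degrees cold

def dial_setting(desired_temp):
    """Return what to set the dial to, assuming the offset is constant."""
    return desired_temp + offset

print(dial_setting(350))  # 380
print(dial_setting(425))  # 455 -- only valid if the offset really is constant
```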
The other thing to consider is that a device can be less accurate in some parts of its range than in others. For example, your scale could have a range of 10 grams to 1000 grams. It might be well calibrated around 500 grams, but the further you get from that point, the larger the error. Back to the oven: just because you have to set it to 380 to get a true 350 doesn't mean every other temperature is off by exactly 30. So it's best to calibrate at several points along the range and interpolate between them; a quick sketch of that follows.
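To make that concrete, here's a rough Python sketch of a multi-point calibration table with linear interpolation between the measured points. The table entries are hypothetical; the idea is just mapping "temperature I want" to "setting that actually produces it".

```python
from bisect import bisect_left

# Hypothetical calibration table: (true temperature, dial setting that produced it),
# measured at a few points across the oven's range and sorted by true temperature.
CAL_POINTS = [
    (250, 265),
    (350, 380),
    (450, 470),
]

def dial_for(true_temp):
    """Linearly interpolate the dial setting for a desired true temperature."""
    trues = [t for t, _ in CAL_POINTS]
    if true_temp <= trues[0]:
        return CAL_POINTS[0][1]
    if true_temp >= trues[-1]:
        return CAL_POINTS[-1][1]
    i = bisect_left(trues, true_temp)
    (t0, d0), (t1, d1) = CAL_POINTS[i - 1], CAL_POINTS[i]
    frac = (true_temp - t0) / (t1 - t0)
    return d0 + frac * (d1 - d0)

print(dial_for(350))  # 380.0 -- exactly at a calibration point
print(dial_for(400))  # 425.0 -- interpolated between the 350 and 450 points
```

Outside the measured range it just clamps to the nearest calibration point, since extrapolating past your data is exactly the kind of guess calibration is supposed to avoid.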
I bet no one read this wall of text
