[Main Text]
Electrical noise appears in the output voltage, causing small instantaneous errors in the output. Even when the probe/target gap is perfectly constant, the output voltage of the driver has some small but measurable amount of noise that would seem to indicate that the gap is changing. This noise is inherent in electronic components and can be minimized, but never eliminated. If a driver has an output noise of 0.002 V with a sensitivity of 10 V/1 mm, then it has an output noise of 0.000,2 mm (0.2 µm). Noise rises with bandwidth: higher-bandwidth sensors can respond to faster changes in the gap; however, lower-bandwidth sensors will have reduced output noise, which means higher resolution. Some sensors provide selectable bandwidth to maximize either resolution or response time.

Resolution is defined as the smallest reliable measurement that a system can make. The resolution of a measurement system must be better than the final accuracy the measurement requires: if you need to know a measurement within some tolerance, the resolution of the measurement system must be better than that tolerance. Sensor bandwidth is defined as the frequency at which the output falls to −3 dB, a drop to roughly 70% of the signal, so a gap change that produces a ±1 V output at low frequency will only produce about a ±0.7 V output at the cutoff frequency.

A system's sensitivity is set during calibration. When sensitivity deviates from the ideal value, this is called sensitivity error, gain error, or scaling error. Since sensitivity is the slope of a line, sensitivity error is usually presented as a percentage of slope: a comparison of the ideal slope with the actual slope.

Offset error occurs when a constant value is added to the output voltage of the system. Capacitive gauging systems are usually zeroed during setup, eliminating any offset deviation from the original calibration. However, should the offset error change after the system is zeroed, error will be introduced into the measurement. Temperature change is the primary factor in offset error.

Sensitivity can vary slightly between any two points of data. The accumulated effect of this variation is called linearity error. The linearity specification is a measure of how far the output varies from a straight line. To calculate the linearity error, the calibration data are compared to the straight line that would best fit the points. This straight reference line is calculated from the calibration data using least-squares fitting. The amount of error at the point on the calibration curve furthest from this ideal line is the linearity error. Linearity error is usually expressed as a percentage of full scale (%F.S.). For example, if the error at the worst point is 0.001 mm and the full-scale range of the calibration is 1 mm, the linearity error is 0.1%.

Note that linearity error does not account for errors in sensitivity; it is only a measure of the straightness of the line, not of its slope. A system with gross sensitivity errors can still be very linear.

Error band accounts for the combination of linearity and sensitivity errors. It is the measurement of the worst-case absolute error in the calibrated range, calculated by comparing the output voltages at specific gaps to their expected values.
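As a sanity check on the noise arithmetic above, the short Python sketch below converts driver output noise into an equivalent gap error by dividing by the sensitivity; the numbers are the hypothetical ones from the example (0.002 V of noise at 10 V/mm).

```python
# Convert driver output noise (V) into equivalent gap noise (mm) via sensitivity.
# The values are the hypothetical ones from the example in the text.
output_noise_v = 0.002       # driver output noise, volts
sensitivity_v_per_mm = 10.0  # 10 V of output per 1 mm of gap change

gap_noise_mm = output_noise_v / sensitivity_v_per_mm
print(f"equivalent gap noise: {gap_noise_mm:.4f} mm "
      f"({gap_noise_mm * 1e3:.1f} um, {gap_noise_mm * 1e6:.0f} nm)")
# -> equivalent gap noise: 0.0002 mm (0.2 um, 200 nm)
```

This is the smallest gap change the system can reliably distinguish, which is why lower noise (or higher sensitivity) directly improves resolution.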
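The sensitivity-error, linearity-error, and error-band procedures described above reduce to a few array operations. The following is a minimal sketch in Python/NumPy under stated assumptions: the five-point calibration table is invented for illustration, the ideal sensitivity is taken as 10 V/mm, and `numpy.polyfit` provides the least-squares reference line the text describes.

```python
"""Minimal sketch of the sensitivity, linearity, and error-band calculations.

The calibration table is invented for illustration; a real calibration uses
many more points. Ideal sensitivity is assumed to be 10 V/mm.
"""
import numpy as np

# Hypothetical calibration data: gap (mm) vs. measured driver output (V).
gap_mm = np.array([0.00, 0.25, 0.50, 0.75, 1.00])
output_v = np.array([0.00, 2.52, 5.01, 7.53, 10.02])

IDEAL_SENSITIVITY = 10.0  # V/mm, the slope set during calibration

# Least-squares best-fit straight reference line, as described in the text.
slope, intercept = np.polyfit(gap_mm, output_v, 1)

# Sensitivity error: deviation of the actual slope from the ideal, in percent.
sensitivity_error_pct = 100.0 * (slope - IDEAL_SENSITIVITY) / IDEAL_SENSITIVITY

# Linearity error: worst deviation from the best-fit line, as a percent of
# full scale (%F.S.); full scale here is the output span of the ideal line.
fit_v = slope * gap_mm + intercept
full_scale_v = IDEAL_SENSITIVITY * (gap_mm.max() - gap_mm.min())
linearity_error_pct = 100.0 * np.max(np.abs(output_v - fit_v)) / full_scale_v

# Error band: worst-case absolute error in the calibrated range, found by
# comparing measured outputs at specific gaps to their expected (ideal-line)
# values, so it combines sensitivity and linearity effects.
expected_v = IDEAL_SENSITIVITY * gap_mm
error_band_v = np.max(np.abs(output_v - expected_v))

print(f"fitted sensitivity: {slope:.3f} V/mm")
print(f"sensitivity error : {sensitivity_error_pct:+.2f}%")
print(f"linearity error   : {linearity_error_pct:.3f} %F.S.")
print(f"error band        : {error_band_v:.3f} V worst case")
```

Note that the linearity figure is computed against the fitted line rather than the ideal one, which matches the text's point that a system with a gross sensitivity error can still be very linear.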