Summary

Read through the summary points below carefully. Then, as a last exercise, continue with the mixed exercises (link below), which cover everything that you (should) have learned today.

Error propagation

  1. All experiments contain experimental error.
  2. For many pieces of analytical equipment (glassware, balance, ...), the error is stated on the equipment itself.
  3. The error in a final answer can be calculated by using error propagation rules.
  4. Errors for multiplication and division have to be combined as relative errors, because the quantities involved often have different units.
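The rules above can be sketched in Python (the measurement values below are made up for illustration): absolute errors combine in quadrature for addition and subtraction, relative errors combine in quadrature for multiplication and division.

```python
import math

def add_err(errs):
    """Propagated error for addition/subtraction:
    absolute errors combine in quadrature."""
    return math.sqrt(sum(e ** 2 for e in errs))

def mul_err(value, terms):
    """Propagated error for multiplication/division:
    relative errors combine in quadrature.
    terms: list of (x, dx) pairs; value is the computed product/quotient."""
    rel = math.sqrt(sum((dx / x) ** 2 for x, dx in terms))
    return abs(value) * rel

# Hypothetical example: mass 2.500 g +/- 0.001 g, volume 25.00 mL +/- 0.03 mL
m, dm = 2.500, 0.001
V, dV = 25.00, 0.03
c = m / V                           # concentration in g/mL
dc = mul_err(c, [(m, dm), (V, dV)]) # propagated error of the quotient
print(f"c = {c:.4f} +/- {dc:.4f} g/mL")
```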

The normal distribution

  1. Many experimental errors are normally distributed.
  2. They can be characterised by a mean value and a standard deviation.
  3. The mean describes the location.
  4. The standard deviation describes the spread around the mean.
  5. Random errors are related to the standard deviation.
  6. Systematic errors are related to the difference between the mean and the (unknown) true value (the bias).
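A small simulation can make the distinction concrete (all numbers below are assumptions chosen for the example): measurements are drawn from a normal distribution, the standard deviation captures the random error, and the difference between the mean and the true value estimates the bias.

```python
import random
import statistics

random.seed(1)
true_value = 10.0   # the (in practice unknown) true value
bias = 0.2          # systematic error added to every measurement
sigma = 0.5         # spread of the random error

# Simulate 1000 measurements: true value + systematic error + random error
data = [true_value + bias + random.gauss(0, sigma) for _ in range(1000)]

mean = statistics.mean(data)   # location -> estimates true value + bias
sd = statistics.stdev(data)    # spread  -> estimates the random error
print(f"mean = {mean:.3f}, sd = {sd:.3f}, estimated bias = {mean - true_value:.3f}")
```

The mean lands near 10.2 (true value plus bias) and the standard deviation near 0.5, matching the simulated random error.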

Prediction & confidence intervals

  1. The larger the standard deviation (the spread around a central value), the wider a prediction interval will be.
  2. A 95% prediction interval will contain the next experimental value with a probability of 95%.
  3. A crude estimate of a 95% prediction interval is the mean plus or minus twice the standard deviation.
  4. A crude estimate of a 99% prediction interval is the mean plus or minus three times the standard deviation.
  5. Confidence intervals are narrower than their corresponding prediction intervals, since random errors partially cancel when averaging.
  6. The standard deviation of the mean is the standard deviation of the individual values divided by the square root of the number of measurements.
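The crude "mean plus or minus twice the standard deviation" rule from the points above can be sketched as follows, using made-up replicate measurements:

```python
import math
import statistics

# Hypothetical replicate measurements of one sample
data = [4.9, 5.1, 5.0, 5.2, 4.8, 5.0, 5.1, 4.9]
n = len(data)
mean = statistics.mean(data)
s = statistics.stdev(data)
sem = s / math.sqrt(n)   # standard deviation of the mean

# Crude 95% intervals using the "mean +/- 2 sd" rule of thumb
pi = (mean - 2 * s, mean + 2 * s)       # prediction interval (next value)
ci = (mean - 2 * sem, mean + 2 * sem)   # confidence interval (true mean)
print(f"prediction interval: {pi[0]:.2f} .. {pi[1]:.2f}")
print(f"confidence interval: {ci[0]:.2f} .. {ci[1]:.2f}")
```

Because the standard deviation of the mean is divided by the square root of n, the confidence interval comes out narrower than the prediction interval.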

Least squares regression

  1. The linear relationship between two variables x and y can be described by the y intercept b and the slope a of the optimal straight line.
  2. Standard errors for the intercept b and the slope a can be calculated, and therefore confidence intervals as well.
  3. It is assumed that the error in x is much smaller than the error in y.
  4. Residuals are assumed to be independent and normally distributed with constant variance.
  5. Regression lines may be used for calibration: the calibration line is set up using a set of calibration samples and the concentration in an unknown sample can be predicted.
  6. Regression lines can be used to compare methods.
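The calibration use case can be sketched in a few lines of Python. The calibration data below are invented for illustration; the formulas are the standard least squares estimates for slope, intercept, and their standard errors.

```python
import math

# Hypothetical calibration data: concentration x vs. instrument signal y
x = [0.0, 1.0, 2.0, 3.0, 4.0]
y = [0.02, 0.51, 1.03, 1.49, 2.01]
n = len(x)
xbar = sum(x) / n
ybar = sum(y) / n

# Least squares estimates of slope a and intercept b (y = a*x + b)
sxx = sum((xi - xbar) ** 2 for xi in x)
sxy = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
a = sxy / sxx
b = ybar - a * xbar

# Residual standard deviation and standard errors of slope and intercept
s_res = math.sqrt(sum((yi - (a * xi + b)) ** 2 for xi, yi in zip(x, y)) / (n - 2))
se_a = s_res / math.sqrt(sxx)
se_b = s_res * math.sqrt(1 / n + xbar ** 2 / sxx)

# Calibration: predict the concentration of an unknown from its signal
y_unknown = 1.25
x_pred = (y_unknown - b) / a
print(f"a = {a:.4f} +/- {se_a:.4f}, b = {b:.4f} +/- {se_b:.4f}")
print(f"predicted concentration: {x_pred:.3f}")
```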

As a final test, prepare yourself for the mixed exercises.