In UV-Vis spectroscopy, we usually assume Beer's law to be valid: the absorbance of a compound is proportional to its concentration. At higher concentrations this may no longer be true; the absorbances are then lower than expected. Suppose someone ignores this effect and calculates the best straight line from a set of calibration samples. What would you expect the plot of the residuals (true values minus fitted values) to look like when the highest concentrations cause Beer's law to fail? (A small simulation sketch after the answer options illustrates the situation.)
At the very low and very high concentrations, the calibration line will be too low. The residuals will appear to lie on a kind of parabola with its vertex pointing downwards (i.e., opening upwards).
The samples with a high concentration will distort the calibration line so that they themselves lie close to the line, while the samples with a low concentration are far off.
At high concentrations, the calibration line will be higher than the calibration samples. The residuals will grow in magnitude as the concentration increases.
At the very low and very high concentrations, the calibration line will be too high. The residuals will appear to lie on a kind of parabola with its peak pointing upwards (i.e., opening downwards).
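One way to check the expected residual shape is a quick simulation. The following is a minimal sketch in Python with NumPy; the saturating absorbance model and all parameter values are assumptions chosen only to make the curvature visible, not a claim about real instrument behaviour. It generates concave calibration data, fits a straight line by ordinary least squares, and prints the residuals.

```python
import numpy as np

# Concentrations of the calibration samples (arbitrary units).
conc = np.linspace(0.5, 10.0, 20)

# A concave "saturation" model (an assumption for illustration): nearly
# linear at low concentration, bending below the straight line at high
# concentration, as happens when Beer's law fails.
absorbance = 1.5 * (1 - np.exp(-0.15 * conc))

# Ordinary least-squares straight-line fit, ignoring the curvature.
slope, intercept = np.polyfit(conc, absorbance, 1)
fitted = slope * conc + intercept

# Residuals as defined in the question: true values minus fitted values.
residuals = absorbance - fitted

for c, r in zip(conc, residuals):
    print(f"conc = {c:5.2f}   residual = {r:+.4f}")
```

Under these assumptions the printed residuals come out negative at the lowest and highest concentrations and positive in the middle, tracing a parabola-like arc when plotted against concentration.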