The concept of limits is fundamental to calculus. Understanding it should help the reader develop (or doubt) the calculus of real-world quantities.

Definition:
The limit of a function f(x), as x tends to x0, is a number L such that f(x) gets arbitrarily close to L as x gets closer to x0.
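In standard notation, this is written as:

$$\lim_{x \to x_0} f(x) = L$$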
As per its dictionary meaning, the word 'Limit' refers to the limiting (or bounding) value of the function f(x) as the independent variable x takes values close to x0. Some interesting notes about the concept of limits:
- A limit is only an estimate, i.e. a tentative value of the function that is guessed or anticipated (how? - an interesting question) by studying values of f(x) for x close to x0; a numerical sketch of this estimation appears after this list.
- When x takes values less than x0, the limit is called the Left Hand Limit (L.H.L.); when x takes values greater than x0, it is called the Right Hand Limit (R.H.L.).
- The limit is the value unanimously estimated by the L.H.L. and the R.H.L. So if L.H.L. ≠ R.H.L., a limiting value, i.e. the limit, does not exist.
- The limit, by definition, is NOT the value of the function at x0. However, it may happen to equal the function value f(x0). A limit may even exist when the function is not defined at the point x0; the worked example below shows exactly this.
- An interesting historical account of the development of this concept is given in the 'History' section of the Wikipedia page on the ε-δ definition of limits. That is the more rigorous definition that emerged historically; one common phrasing of it is given at the end of this section.
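To ground these notes, consider a standard textbook example (chosen here for illustration; it is not from the original text):

$$f(x) = \frac{x^2 - 4}{x - 2}, \qquad x_0 = 2$$

f(2) is undefined (it gives 0/0), but for every x ≠ 2 we have f(x) = x + 2, so

$$\lim_{x \to 2^-} f(x) = 4 = \lim_{x \to 2^+} f(x), \qquad \text{hence} \qquad \lim_{x \to 2} f(x) = 4$$

Here L.H.L. = R.H.L. = 4, so the limit exists, even though the function has no value at x0 = 2.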
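The "estimation by studying nearby values" idea from the first note can be sketched in a few lines of Python (the function f and the point x0 below are the same illustrative choices as in the example above, not prescribed by the original text):

```python
# Estimate a limit numerically: evaluate f at points approaching x0
# from the left and from the right, and watch where the values head.

def f(x):
    return (x**2 - 4) / (x - 2)  # undefined at x = 2 itself (0/0)

x0 = 2.0
for k in range(1, 7):
    h = 10 ** -k
    print(f"h = {h:.0e}:  f(x0 - h) = {f(x0 - h):.6f},  f(x0 + h) = {f(x0 + h):.6f}")

# Both columns tend to 4, suggesting L.H.L. = R.H.L. = 4,
# even though f(2) is undefined.
```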
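For completeness, one common phrasing of that rigorous ε-δ definition is:

$$\lim_{x \to x_0} f(x) = L \iff \forall \varepsilon > 0 \;\exists \delta > 0 : 0 < |x - x_0| < \delta \implies |f(x) - L| < \varepsilon$$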