An introductory course that explores the fundamental concepts and techniques of mathematical analysis.
A sequence is just a list of numbers in a specific order, like \( 2, 4, 6, 8, \ldots \). Each number in the list is called a term.
The limit of a sequence is the value the numbers are getting closer and closer to as the list goes on forever. If \( a_n \) is your sequence, we say it converges to \( L \) if the terms eventually get as close as you want to \( L \).
We write:
\[
\lim_{n \to \infty} a_n = L
\]
if for every distance \( \varepsilon > 0 \), no matter how small, there is an index \( N \) after which all the terms are within \( \varepsilon \) of \( L \).
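In symbols, the phrase "there is a point after which all the terms are within \( \varepsilon \) of \( L \)" becomes:
\[
\forall \varepsilon > 0 \;\; \exists N \in \mathbb{N} \;\; \forall n \geq N : \; |a_n - L| < \varepsilon.
\]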
Limits help us make sense of infinite processes and are the core idea behind calculus. They let us define things like derivatives and integrals.
If you keep dividing 1 by larger and larger numbers — like \( 1, \frac{1}{2}, \frac{1}{3}, \frac{1}{4}, \ldots \) — the terms get closer and closer to zero. Zero is the limit: the sequence \( 1, \frac{1}{2}, \frac{1}{3}, \ldots \) converges to 0.
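This can be checked numerically (an illustration, not a proof): for the sequence \( a_n = \frac{1}{n} \) and a chosen \( \varepsilon \), the sketch below finds the first index \( N \) after which every term is within \( \varepsilon \) of 0. The function name `first_index_within` is just for illustration.

```python
def first_index_within(epsilon):
    """Smallest n such that |1/n - 0| < epsilon for a_n = 1/n."""
    n = 1
    while abs(1 / n - 0) >= epsilon:
        n += 1
    return n

print(first_index_within(0.1))    # 11: from n = 11 on, 1/n < 0.1
print(first_index_within(0.001))  # 1001: from n = 1001 on, 1/n < 0.001
```

Note that a smaller \( \varepsilon \) forces a larger \( N \), which is exactly what the definition requires.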
The sequence \( (-1)^n \) does not have a limit, because it bounces between 1 and -1 forever.
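Listing the first few terms makes the failure visible: the terms never settle near a single value, so no \( L \) can satisfy the definition for small \( \varepsilon \).

```python
# First eight terms of a_n = (-1)^n, starting at n = 1:
# they alternate forever and never approach one value.
terms = [(-1) ** n for n in range(1, 9)]
print(terms)  # [-1, 1, -1, 1, -1, 1, -1, 1]
```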
Sequences are ordered lists of numbers, and their limits show what they approach as the list continues forever.