Easy Ways to Remember the Series Convergence Tests Calc 2
Contents (Click to skip to that section):
- Series Convergence Tests in Alphabetical Order
- Abel's Test
- Absolute Convergence
- Alternating Series Convergence Tests
- Deleting the first N Terms
- Dirichlet's Test
- Direct Comparison Test
- Geometric Series Convergence Tests
- Integral Series Convergence Tests
- nth-Term Test for Divergence
- P series
- Ratio Test
- Root Test
- Taylor Series Convergence
- What Does "Converge" Mean?
- Absolutely Convergent
- Conditional Convergence
- Pointwise Convergence
- Rate of Convergence
- Radius and Interval of Convergence
- Uniform Convergence
- What does "Diverge" Mean?
Series Convergence Tests in Alphabetical Order
Often, you'll want to know whether a series converges (i.e. settles on a certain number) or diverges (does not converge). Figuring this out from scratch can be an extremely difficult task, something that's beyond the scope of even a calculus II course. Thankfully, mathematicians before you have worked out series convergence tests that settle the convergence or divergence of many common series. These tests let you figure out whether a particular series converges or diverges without computing its sum directly.
Abel's Test
Abel's test is a convergence test for infinite series; it tells us whether certain infinite series converge.
More info: Abel's test.
Absolute Convergence
If the series of absolute values, Σ|a_n|, converges, then the series Σa_n converges.
Alternating Series Convergence Tests
If, for all n, a_n is positive, non-increasing (i.e. 0 < a_(n+1) ≤ a_n) and approaches 0, then the alternating series test tells us that the alternating series a_1 − a_2 + a_3 − a_4 + … (i.e. Σ(−1)^(n+1) a_n) converges.
If the series converges, then the remainder R_N = S − S_N is bounded by |R_N| ≤ a_(N+1). Here S is the exact sum of the infinite series and S_N is the sum of the first N terms of the series.
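As a quick numerical illustration (a sketch I've added, not part of the original article), the script below checks the remainder bound |R_N| ≤ a_(N+1) for the alternating harmonic series, whose exact sum is ln 2:
    import math

    # Alternating harmonic series: sum of (-1)^(n+1) / n, which converges to ln(2).
    def partial_sum(N):
        return sum((-1) ** (n + 1) / n for n in range(1, N + 1))

    S = math.log(2)                    # exact sum of the series
    for N in (10, 100, 1000):
        R_N = S - partial_sum(N)       # remainder after N terms
        bound = 1 / (N + 1)            # a_(N+1) for this series
        print(N, abs(R_N) <= bound)    # prints True for each N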
Deleting the first N Terms
If N is a positive integer, then the series Σ(n=1 to ∞) a_n and the series Σ(n=N+1 to ∞) a_n (the original series with its first N terms deleted) either both converge or both diverge.
Series Convergence Tests: Dirichlet's Test
Dirichlet's test is a generalization of the alternating series test.
Dirichlet's test is one way to determine if an infinite series converges to a finite value. The test is named after 19th-century German mathematician Peter Gustav Lejeune Dirichlet.
Formally, Dirichlet's test states that the infinite series
a_1b_1 + a_2b_2 + … + a_nb_n + … converges if the following two statements are true:
- The sequence of partial sums s_n = a_1 + a_2 + … + a_n is bounded. In other words, there is a positive number K so that |s_n| ≤ K for all n.
- b_1, b_2, b_3, … is a monotonic decreasing sequence (i.e. a steadily decreasing sequence) converging to zero (i.e. b_(n+1) ≤ b_n and lim(n→∞) b_n = 0).
When to Use Dirichlet's Test
Dirichlet's test is one of the lesser known tests. In general, the common rules for convergence of series—the ones you learn in elementary calculus—suffice for testing the vast majority of series. But there are some specific cases where the "usual" tests just don't work.
For example, you can use the ratio test or root test to show that the following power series diverges for |z| > 1 and converges absolutely for |z| < 1.
However, neither of those tests tells you what happens when z = 1. For that, you can use Dirichlet's test to show that the series converges (Evans, 2009).
Example of Dirichlet's Test
Use Dirichlet's test to show that the following series converges:
Step 1: Rewrite the series into the form a_1b_1 + a_2b_2 + … + a_nb_n:
Step 2: Show that the sequence of partial sums s_n of the a_n terms is bounded. One way to tackle this is to evaluate the first few sums and see if there is a trend:
- s_2 = cos(2π) = 1
- s_3 = cos(2π) + cos(3π) = 1 − 1 = 0
- s_4 = cos(2π) + cos(3π) + cos(4π) = 1 − 1 + 1 = 1
It appears the sequence of partial sums is bounded (|s_n| ≤ 1).
Step 3: Evaluate b_n to see if it is decreasing. One way to do this is to graph the function (I used Desmos.com):
Clearly, the function (and therefore the sequence) is decreasing and the limit as n→∞ is 0. Therefore, this series converges.
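A quick numerical sanity check (my own sketch; since the article's b_n term isn't reproduced above, b_n = 1/n is assumed here purely for illustration) confirms that the partial sums of a_n = cos(nπ) stay bounded while the weighted series settles toward a limit:
    import math

    # a_n = cos(n*pi) alternates between -1 and +1, so its partial sums stay bounded.
    # b_n = 1/n (an assumed example weight) decreases monotonically to zero.
    partial, weighted, max_abs_partial = 0.0, 0.0, 0.0
    for n in range(1, 10001):
        a_n = math.cos(n * math.pi)
        partial += a_n                  # partial sum of a_n alone
        weighted += a_n / n             # partial sum of a_n * b_n
        max_abs_partial = max(max_abs_partial, abs(partial))

    print(max_abs_partial)   # stays near 1.0: the partial sums are bounded
    print(weighted)          # settles near -ln(2) ≈ -0.6931 for this choice of b_n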
Proof of Dirichlet's Test
Watch the following video for a proof of convergence using Dirichlet's test:
Proof that sum(sin(n)/n) Converges using Dirichlet's Test
Direct Comparison Test
In the direct comparison test, the following two rules apply if 0 ≤ a_n ≤ b_n for all n greater than some positive integer N: if the larger series Σb_n converges, then the smaller series Σa_n also converges; and if Σa_n diverges, then Σb_n also diverges.
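For instance (an illustrative sketch, not from the original article), 0 ≤ 1/(n² + 1) ≤ 1/n² for all n ≥ 1 and Σ1/n² converges, so Σ1/(n² + 1) converges too; the partial sums make the squeeze visible:
    # Compare partial sums of a_n = 1/(n^2 + 1) against b_n = 1/n^2.
    N = 100000
    sum_a = sum(1 / (n * n + 1) for n in range(1, N + 1))
    sum_b = sum(1 / (n * n) for n in range(1, N + 1))

    # a_n <= b_n term by term, so the partial sums of a_n sit below those of b_n,
    # and the b_n sums are bounded (the full sum of 1/n^2 is pi^2/6 ≈ 1.6449).
    print(sum_a, sum_b)   # roughly 1.077 and 1.645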
Geometric Series Convergence Tests
With the geometric series Σ(n=0 to ∞) r^n, if r is between −1 and 1 (i.e. |r| < 1), then the series converges to 1⁄(1 − r); otherwise it diverges.
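As a quick check (my own sketch), the partial sums for r = 0.5 approach 1/(1 − 0.5) = 2:
    # Partial sums of the geometric series 0.5^0 + 0.5^1 + 0.5^2 + ...
    r = 0.5
    partial = 0.0
    for n in range(50):
        partial += r ** n

    print(partial)        # ≈ 2.0
    print(1 / (1 - r))    # the closed-form limit, 2.0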
Integral Series Convergence Tests
The series Σa_n and the integral ∫(1 to ∞) f(x) dx either both converge or both diverge if, for all n ≥ 1, f(n) = a_n and f is positive, continuous and decreasing. If the series does converge, then the remainder R_N is bounded by 0 ≤ R_N ≤ ∫(N to ∞) f(x) dx.
See: Integral Series / Remainder Estimate.
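As an illustration (a sketch I've added, not from the original), take a_n = 1/n² with f(x) = 1/x²; the tail of the series after N terms is bounded by ∫(N to ∞) dx/x² = 1/N:
    import math

    # For a_n = 1/n^2, the exact sum is pi^2/6, so the remainder after N terms is known.
    N = 100
    partial = sum(1 / (n * n) for n in range(1, N + 1))
    remainder = math.pi ** 2 / 6 - partial

    integral_bound = 1 / N               # integral of 1/x^2 from N to infinity
    print(remainder <= integral_bound)   # True: the remainder estimate holds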
Limit Comparison Test
The limit comparison test states that if a_n, b_n > 0 and L = lim(n→∞) (a_n ⁄ b_n) is positive and finite, then the series Σa_n and Σb_n either both converge or both diverge.
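For example (an added sketch), a_n = 1/(2n² + n) and b_n = 1/n² give a_n ⁄ b_n → 1/2, a positive finite limit, so Σa_n converges because Σ1/n² does:
    # The ratio a_n / b_n should settle near a positive, finite constant (here 1/2).
    def a(n): return 1 / (2 * n * n + n)
    def b(n): return 1 / (n * n)

    for n in (10, 1000, 100000):
        print(n, a(n) / b(n))    # approaches 0.5 as n grows

    # Since sum(1/n^2) converges and the limit is positive and finite,
    # sum(1/(2n^2 + n)) converges too.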
nth-Term Test for Divergence
If the terms of a series do not approach 0 (i.e. lim(n→∞) a_n ≠ 0, or the limit doesn't exist), then the series Σa_n diverges. Note that the converse isn't true: a_n → 0 does not guarantee convergence (the harmonic series is the standard counterexample).
P series
If p > 1, then the p-series Σ1⁄n^p converges.
If 0 < p ≤ 1, then the series diverges.
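To see the contrast numerically (my own sketch), compare partial sums for p = 2, which level off near π²/6 ≈ 1.6449, with partial sums for p = 1 (the harmonic series), which keep growing:
    def p_series_partial(p, N):
        # Partial sum of 1/n^p for n = 1..N.
        return sum(1 / n ** p for n in range(1, N + 1))

    for N in (100, 10000, 1000000):
        print(N, p_series_partial(2, N), p_series_partial(1, N))
    # The p = 2 column settles near 1.6449; the p = 1 column grows like ln(N).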
Ratio Test
The following rules apply if a_n ≠ 0 for all n. Let L = lim(n→∞) |a_(n+1) ⁄ a_n|.
If L < 1, then the series Σa_n converges (absolutely).
If L > 1, then the series diverges.
If L = 1, then the ratio test is inconclusive.
More info: Ratio Test
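For example (an added sketch), the series Σ n⁄2^n has |a_(n+1) ⁄ a_n| = (n + 1)⁄(2n) → 1/2 < 1, so the ratio test says it converges; the successive ratios make this visible:
    # Ratio test applied numerically to a_n = n / 2^n.
    def a(n): return n / 2 ** n

    for n in (5, 50, 500):
        print(n, abs(a(n + 1) / a(n)))   # tends to 0.5, which is < 1

    # Since the limit of the ratios is below 1, sum(n / 2^n) converges
    # (its exact value happens to be 2).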
Root Test
Let L = lim(n→∞) |a_n|^(1/n).
If L < 1, then the series Σa_n converges.
If L > 1, then the series diverges.
If L = 1, then the test is inconclusive.
See: Root Test
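For instance (a sketch I've added), a_n = (2n⁄(3n + 1))^n gives |a_n|^(1/n) = 2n⁄(3n + 1) → 2/3 < 1, so the series converges:
    # Root test applied numerically to a_n = (2n / (3n + 1))^n.
    def a(n): return (2 * n / (3 * n + 1)) ** n

    for n in (10, 100, 1000):
        print(n, a(n) ** (1 / n))   # the nth roots approach 2/3 ≈ 0.667

    # The limit is below 1, so sum(a_n) converges by the root test.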
Taylor Series Convergence
The Taylor series of f converges to f if f has derivatives of all orders on an interval I centered at c and lim(N→∞) R_N = 0 for all x in I.
The Taylor series remainder R_N = S − S_N is equal to f^(N+1)(z)(x − c)^(N+1) ⁄ (N + 1)!, where z is some number between x and c.
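As a concrete check (an added sketch), take f(x) = e^x centered at c = 0 and evaluate at x = 1; every derivative of e^x on [0, 1] is at most e, so the remainder is bounded by e⁄(N + 1)!:
    import math

    # Degree-N Taylor polynomial of e^x about 0, evaluated at x = 1.
    def taylor_exp(x, N):
        return sum(x ** k / math.factorial(k) for k in range(N + 1))

    x = 1.0
    for N in (2, 5, 10):
        error = abs(math.exp(x) - taylor_exp(x, N))
        bound = math.e * x ** (N + 1) / math.factorial(N + 1)   # Lagrange remainder bound
        print(N, error <= bound)   # True each time, and the bound shrinks toward 0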
What Does "Converge" Mean?
Converge means to settle on a certain number. For example, the sequence 9, 5, 1, 0, 0, 0, … has settled, or converged, on the number 0.
Integrals, limits, series and sequences can all converge. For example, if a limit settles on a certain (finite) number, then the limit exists. The opposite is diverge, where the integral, limit, series or sequence fails to settle on a number. In the case of a limit, if it diverges, then it doesn't exist.
Pointwise Convergence
Pointwise convergence is where a sequence of functions converges onto a single function, called a limiting function (or limit function). A sequence of functions, denoted { f_n(x) }, is a family of functions indexed by the natural numbers (the whole, non-negative numbers we use to count: 1, 2, 3, …).
For example, the sequence of functions f_n(x) = x/n converges to the limiting function f(x) = 0 on the closed interval [0, 1], as shown in the following image:
Compared to uniform convergence, this is a fairly simple type of convergence. One of the main differences between the two is that the limiting function of a pointwise convergent sequence doesn't have to be continuous, while the limiting function of a uniformly convergent sequence of continuous functions does have to be continuous.
Series Convergence Tests: Formal Definition of Pointwise Convergence
Pointwise convergence is a relatively simple way to define convergence for a sequence of functions. So, you may be wondering why a formal definition is even needed. Although convergence seems to happen naturally (as with the sequence of functions f_n(x) = x/n shown above), not all sequences of functions are so well behaved. In order to show that a sequence of functions converges pointwise, you must prove that it meets the formal definition. That said, the definition is fairly straightforward:
A sequence of functions f_n converges pointwise to a function f on a set A if, for every x ∈ A, lim(n→∞) f_n(x) = f(x).
Rate of Convergence
Rate of convergence tells you how fast a sequence of real numbers converges to (reaches) a certain point or limit. It's used as a tool to compare the speed of algorithms, particularly when using iterative methods.
Many different ways exist for calculating the rate of convergence. One relatively simple definition uses the following limit (Senning, 2020; Hundley, 2020):
lim(n→∞) |x_(n+1) − r| ⁄ |x_n − r|^α = λ
Where:
- α = the order of convergence (a real number > 0) of the sequence; for example, 1 (linear), 2 (quadratic) or 3 (cubic),
- x_n = the sequence,
- λ = the asymptotic error constant (a positive real number),
- r = the value the sequence converges to.
In general, algorithms with a higher order of convergence reach their goal more quickly and require fewer iterations. See: Asymptotic error.
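In practice you can estimate α from successive errors (a sketch under my own assumptions, using Newton's method for √2 as the example sequence):
    import math

    # Newton's method for f(x) = x^2 - 2; the iterates converge to r = sqrt(2).
    r = math.sqrt(2)
    x = 1.0
    errors = []
    for _ in range(4):
        x = x - (x * x - 2) / (2 * x)    # Newton update
        errors.append(abs(x - r))

    # Estimate the order alpha from three consecutive errors:
    # alpha ≈ log(e_(n+1) / e_n) / log(e_n / e_(n-1))
    for i in range(1, len(errors) - 1):
        alpha = math.log(errors[i + 1] / errors[i]) / math.log(errors[i] / errors[i - 1])
        print(round(alpha, 2))   # prints values close to 2: quadratic convergence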
Radius and Interval of Convergence
A radius of convergence is associated with a power series, which will only converge for certain x-values. The interval where this convergence happens is called the interval of convergence, and is denoted by (-R, R). The letter R in this interval is called the radius of convergence. It's called a "radius" because if the coefficients are complex numbers, the values of x (if |x| < R) will form an open disk of radius R.
Absolutely & Conditionally Convergent
Although you can generally say that something converges if it settles on a number, convergence in calculus is usually defined more strictly, depending on whether the convergence is conditional or absolute.
A series is absolutely convergent if the series converges and it also converges when all terms in the series are replaced by their absolute values.
Conditional Convergence is a special kind of convergence where a series is convergent when seen as a whole, but the absolute values diverge. It's sometimes called semi-convergent.
A series is absolutely convergent if the series converges (approaches a certain number) and it also converges when all terms in the series are replaced by their absolute values. In other words,
…if |u_1| + |u_2| + … is convergent, then the series u_1 + u_2 + … is absolutely convergent.
This statement is usually written with the summation symbol:
if Σ|u_n| is convergent, then the series Σu_n has absolute convergence.
Positive Terms Series
If the series of positive terms (i.e. the series of absolute values) converges, then both that series and the corresponding alternating series (i.e. a series with alternating positive and negative terms) will converge.
If a convergent series is a set of positive terms, then that series is also absolutely convergent. That's because Σu_n and Σ|u_n| are the same series.
For example, the following geometric series is both:
Series with Positive and Negative Terms
If a convergent series has an infinite number of positive terms and an infinite number of negative terms, it only has absolute convergence if Σ|u_n| is also convergent.
Conditional Convergence
Conditional convergence is a special kind of convergence where a series is convergent (i.e. settles on a certain number) when seen as a whole. However, there's a catch:
- The sum of its positive terms goes to positive infinity and
- The sum of its negative terms goes to negative infinity.
It has a very special property, described by the Riemann series theorem: the series can be made to converge to any desired value, or to diverge, by simple rearrangement of the terms.
One Way to Identify a Conditionally Convergent Series
In order to find out if a series is conditionally convergent:
- Find out if the series converges, then
- Determine that it isn't absolutely convergent.
- The alternating series test tells us that if the terms of the series alternate in sign (e.g. −x, +x, −x, …), each term is smaller in absolute value than the one before it, and the terms approach zero, then the series converges.
- Take the absolute values of the terms of the alternating (converging) series. If the new (all positive term) series converges, then the original series is absolutely convergent. If that new series is not convergent, the original series was only conditionally convergent.
Example of Conditional Convergence
One example of a conditionally convergent series is the alternating harmonic series, which can be written as:
1 − 1/2 + 1/3 − 1/4 + … (i.e. Σ(n=1 to ∞) (−1)^(n+1) ⁄ n)
It converges to the limit ln 2 conditionally, but not absolutely: make a new series by taking the absolute value of each of the terms (the harmonic series) and that new series will diverge.
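A short numerical comparison (my own sketch) shows both behaviors at once: the signed partial sums settle near ln 2 ≈ 0.693, while the absolute-value partial sums keep growing:
    import math

    N = 100000
    signed = sum((-1) ** (n + 1) / n for n in range(1, N + 1))   # alternating harmonic
    absolute = sum(1 / n for n in range(1, N + 1))               # plain harmonic

    print(signed, math.log(2))   # the two values agree to several decimal places
    print(absolute)              # about 12.09, and it grows without bound as N increases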
Understanding the Riemann Series Theorem
It might seem counter-intuitive that a series can be made to converge to anything just by rearranging the terms. But if you have a well-defined limit you want it to approach, all you need to do is:
- Take enough positive terms to just barely exceed the desired limit, then
- Add enough negative terms to go below the desired limit, then
- Continue in this way.
Since all terms of the original series go to zero, the new, rearranged series will converge to the limit you chose.
As an example of the Riemann series theorem, consider the alternating harmonic series, which we looked at above. As written, it converges to ln 2. But can we make it converge to half of that, (ln 2)/2? Written the ordinary way, it is
1 − 1/2 + 1/3 − 1/4 + …
and every other term is negative. But if we rearrange it as (one positive term) + (two negative terms), we get this:
1 − 1/2 − 1/4 + 1/3 − 1/6 − 1/8 + …
We can rewrite this by combining each positive term with the negative term that follows it:
(1 − 1/2) − 1/4 + (1/3 − 1/6) − 1/8 + … = 1/2 − 1/4 + 1/6 − 1/8 + … = (1/2)(1 − 1/2 + 1/3 − 1/4 + …),
which is one half of what the original series converged to.
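The greedy rearrangement procedure described above is easy to simulate (a sketch I've added; the target value is arbitrary): take positive terms until the running sum exceeds the target, then negative terms until it drops below, and repeat.
    import math

    # Rearrange the terms of the alternating harmonic series (1, -1/2, 1/3, -1/4, ...)
    # so that the rearranged series homes in on an arbitrary target value.
    target = math.log(2) / 2                                   # could be any real number
    positives = (1 / n for n in range(1, 10 ** 7, 2))          # 1, 1/3, 1/5, ...
    negatives = (-1 / n for n in range(2, 10 ** 7, 2))         # -1/2, -1/4, -1/6, ...

    total = 0.0
    for _ in range(200000):
        # Greedy rule: climb with positive terms, then dip back down with negative ones.
        total += next(positives) if total <= target else next(negatives)

    print(total, target)   # both print roughly 0.3466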
Non-Absolute (Conditional) Convergence
A series is non-absolutely (conditionally) convergent if the series is convergent, but the series of absolute values diverges. This is also called semi-convergence, or conditional convergence. For example, the alternating harmonic series Σ(−1)^(n+1) ⁄ n converges.
However, the corresponding series of absolute values, the harmonic series Σ1⁄n, diverges.
Uniform Convergence
Uniform convergence is where a sequence of continuous functions converges to one particular function, f(x), called the limiting function. This type of convergence is defined more strictly than pointwise convergence.
The idea of uniform convergence is very similar to uniform continuity, where values must stay inside a defined "box" around the function. If you aren't familiar with what it means to be uniform, you may want to read about uniform continuity first.
What does it mean for a series of functions to converge?
As an example, the sequence of functions f_n(x) = x/n converges to f(x) = 0 on the closed interval [0, 1]:
Note how the slope of each function gets lower and lower, eventually converging on f(x) = 0 (which is, essentially, a function that runs along the x-axis).
Although these functions are converging on a limiting function (f(x) = 0, in the above example), the sequence may or may not be converging uniformly to that function. Uniform convergence is a particular type of convergence where, past some index N, every function in the sequence must stay within a narrow band around the limiting function: between f(x) − ε and f(x) + ε for an arbitrarily small ε.
Formal Definition of Uniform Convergence
A sequence of real-valued continuous functions (f_1, f_2, …, f_n, …), defined on a closed interval [a, b], converges uniformly to f if, for every ε > 0, the following inequality is true for all x in the domain:
|f_n(x) − f(x)| < ε for all x ∈ D whenever n ≥ N,
Where:
- N = a positive integer that only depends on ε,
- D = the domain,
- ∈ = "is an element of" (i.e. "is in the set")
The following image explains graphically what is happening here:
Pointwise Convergence vs. Uniform Convergence
If a function is uniformly convergent, then it is also pointwise convergent to the same limit (but note that this doesn't work the other way around). The main difference is in the values N is dependent on:
- Pointwise: N depends on ε and x. A single value x is chosen, then an arbitrary neighborhood is drawn around that point.
- Uniform: N depends only on ε. A neighborhood is drawn around the entire limiting function.
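For f_n(x) = x/n on [0, 1] the worst-case gap is sup|f_n(x) − f(x)| = 1/n, so a single N works for every x at once; the sketch below (my own addition) estimates that supremum numerically on a grid:
    # Uniform convergence of f_n(x) = x/n to f(x) = 0 on [0, 1]:
    # the worst-case gap over the whole interval shrinks to 0 as n grows.
    def sup_gap(n, points=1001):
        xs = [i / (points - 1) for i in range(points)]    # evenly spaced grid on [0, 1]
        return max(abs(x / n - 0.0) for x in xs)          # sup of |f_n(x) - f(x)|

    for n in (1, 10, 100, 1000):
        print(n, sup_gap(n))   # 1.0, 0.1, 0.01, 0.001: below any ε once n > 1/ε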
Series Convergence Tests for Uniform Convergence
You can test for uniform convergence with Abel's test or the Weierstrass M-test.
History
The term "uniform convergence" is thought to have been first used by Christopher Gudermann in his 1838 paper on elliptic functions. The term wasn't formally defined until later, when Karl Weierstrass wrote Zur Theorie der Potenzreihen in 1841 (Kadak, 2014).
What Does Diverge Mean?
"Diverge" generally means either:
- Settles on a certain number (i.e. has a limit), or
- Doesn't converge.
In some areas of math, diverge might simply mean "takes a different path" (for example, in KL Divergence in statistics). However, in calculus, it almost always pertains to limits or behavior of sequences and series.
Series and Sequences that Diverge (The Divergence Test)
Series and Sequences can also diverge. In a general sense, diverge means that the sequence or series doesn't settle on a particular number.
A divergent series will (usually) go on and on to infinity (i.e. these series don't have limits). For example, the series
9 + 11 + 13 …
will keep on growing forever.
Not every series behaves the same way, though: some series diverge no matter what, while others converge or diverge only under very specific circumstances. For example:
- Series that always diverge include every (non-trivial) infinite arithmetic series and the harmonic series.
- Series that sometimes converge include power series, which converge everywhere, only at their center point, or on an interval of convergence (outside of which the series diverges).
Proving divergence (or convergence) from scratch can be extremely challenging, with a few exceptions. For example, you can show that an infinite series diverges by showing that its sequence of partial sums diverges.
Series Convergence Tests: Related Articles
Weierstrass M-Test
Series Convergence Tests: References
Arfken, G. (1985). Mathematical Methods for Physicists, 3rd ed. Orlando, FL: Academic Press.
Boas, R. et al. (1996). A Primer of Real Functions. Cambridge University Press.
Browder, A. (1996). Mathematical Analysis: An Introduction. New York: Springer-Verlag.
Clapham, C. & Nicholson, J. (2014). The Concise Oxford Dictionary of Mathematics. OUP Oxford.
Evans, P. (2009). Math 140A Test 2. Retrieved September 18, 2020 from: http://math.ucsd.edu/~lni/math140/math140a_Midterm_Sample2.pdf
Hundley, D. Notes: Rate of Convergence. Retrieved September 8, 2020 from: http://people.whitman.edu/~hundledr/courses/M467F06/ConvAndError.pdf
Hunter, K. Sequences and Series of Functions.
Jeffreys, H. and Jeffreys, B. S. (1988). "Uniform Convergence of Sequences and Series" et seq. §1.112-1.1155 in Methods of Mathematical Physics, 3rd ed. Cambridge, England: Cambridge University Press, pp. 37-43.
Kadak, U. (2014). On Uniform Convergence of Sequences and Series of Fuzzy-Valued Functions. Retrieved February 10, 2020 from: https://www.hindawi.com/journals/jfs/2015/870179/
Kevrekidis, P. 132class13 (PDF). Retrieved December 14, 2018 from: http://people.math.umass.edu/~kevrekid/132_f10/132class13.pdf
Knopp, K. "Uniform Convergence." §18 in Theory of Functions Parts I and II, Two Volumes Bound as One, Part I. New York: Dover, pp. 71-73, 1996.
Kuratowski, K. (2014). Introduction to Calculus. Elsevier.
Mathonline. Dirichlet's Test for Convergence of Series of Real Numbers Examples 1. Retrieved September 18, 2020 from: http://mathonline.wikidot.com/dirichlet-s-test-for-convergence-examples-1
Nelson, D. (2008). The Penguin Dictionary of Mathematics. Penguin Books Limited.
Rudin, W. (1976). Principles of Mathematical Analysis, 3rd ed. New York: McGraw-Hill, pp. 147-148.
Senning, J. Computing and Estimating the Rate of Convergence. Retrieved September 8, 2020 from: http://www.math-cs.gordon.edu/courses/ma342/handouts/rate.pdf
Spivak, M. (2006). Calculus, 3rd edition. Cambridge University Press.
Vasishtha, A. Algebra & Trigonometry.
Vogel, T. Pointwise and Uniform Convergence of Sequences of Functions (7.1). Retrieved February 10, 2020 from: https://www.math.tamu.edu/~tvogel/410/sect71a.pdf
Wood, A. (2012) Absolute and Conditional Convergence. Retrieved December 14, 2018 from: https://resources.saylor.org/wwwresources/archived/site/wp-content/uploads/2012/09/MA102-5.4.6-Absolute-and-Conditional-Convergence.pdf
Infographic based on Professor Joe Kahlig's original graphic.
Source: https://www.calculushowto.com/sequence-and-series/series-convergence-tests/