- 3. In Probability Theory
- 4. In Real Analysis
In this post, a number of useful equations and inequalities are introduced, and most of them are proved formally.
3. In Probability Theory
3.1. Combinatorial Number Approximation
This subsection describes several ways to approximate the combinatorial numbers, which would be very useful for example when doing some analysis on binomial distribution.
3.1.1. By Power Function

$$\left(\frac{n}{k}\right)^k \le \binom{n}{k} \le \frac{n^k}{k!}$$

proof:

Lemma: for any integers $0 \le i < k \le n$,

$$\frac{n-i}{k-i} \ge \frac{n}{k}$$

proof:

Since $k \le n$ and $i \ge 0$,

$$k(n-i) - n(k-i) = i(n-k) \ge 0$$

Therefore,

$$\frac{n-i}{k-i} \ge \frac{n}{k}$$

Hence, applying the lemma to every factor,

$$\binom{n}{k} = \prod_{i=0}^{k-1}\frac{n-i}{k-i} \ge \prod_{i=0}^{k-1}\frac{n}{k} = \left(\frac{n}{k}\right)^k$$

and the upper bound follows from $\binom{n}{k} = \frac{1}{k!}\prod_{i=0}^{k-1}(n-i) \le \frac{n^k}{k!}$.
3.1.2. By Exponential Function (Upper)

$$\binom{n}{k} \le \left(\frac{en}{k}\right)^k$$

proof:

From the Taylor series of $e^x$,

$$e^k = \sum_{i=0}^{\infty}\frac{k^i}{i!} \ge \frac{k^k}{k!}, \quad\text{i.e.}\quad k! \ge \left(\frac{k}{e}\right)^k$$

According to the above inequality and the upper bound $\binom{n}{k} \le \frac{n^k}{k!}$ from the previous subsection, it is easy to obtain,

$$\binom{n}{k} \le \frac{n^k}{k!} \le n^k\left(\frac{e}{k}\right)^k = \left(\frac{en}{k}\right)^k$$
3.1.3. By Exponential Function (Lower)

$$\binom{n}{k} \ge \frac{2^{nH(k/n)}}{n+1}, \quad\text{where } H(p) = -p\log_2 p - (1-p)\log_2(1-p)$$

proof:

Let $p = k/n$ and consider the binomial terms $t_i = \binom{n}{i}p^i(1-p)^{n-i}$. As,

$$\frac{t_{i+1}}{t_i} = \frac{(n-i)\,p}{(i+1)(1-p)} \ge 1 \iff i \le np - (1-p)$$

the largest term is attained at $i = k = np$. Hence, for any $0 \le i \le n$,

$$t_i \le t_k = \binom{n}{k}p^k(1-p)^{n-k}$$

Therefore,

$$1 = \sum_{i=0}^{n} t_i \le (n+1)\binom{n}{k}p^k(1-p)^{n-k} = (n+1)\binom{n}{k}\,2^{-nH(p)}$$

which rearranges to the claimed lower bound.
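As a quick numerical sanity check, the bounds in this subsection can be verified for small $n$ and $k$ with Python's standard library (a sketch, not part of the original derivation; `binary_entropy` is my own helper name):

```python
import math

def binary_entropy(p):
    """Binary entropy H(p) in bits."""
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

eps = 1e-9  # slack for floating-point rounding
for n in range(2, 60):
    for k in range(1, n):
        c = math.comb(n, k)
        # power-function bounds: (n/k)^k <= C(n,k) <= n^k / k!
        assert (n / k) ** k <= c * (1 + eps)
        assert c <= n ** k / math.factorial(k) * (1 + eps)
        # exponential upper bound: C(n,k) <= (en/k)^k
        assert c <= (math.e * n / k) ** k * (1 + eps)
        # entropy lower bound: C(n,k) >= 2^{n H(k/n)} / (n+1)
        assert c * (1 + eps) >= 2 ** (n * binary_entropy(k / n)) / (n + 1)
```

The small multiplicative slack only guards against floating-point rounding at the equality cases (e.g. $k = 1$).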
3.2. Binomial Distribution

Given a binomial distribution $X \sim B(n, p)$, the summation over all odd values of $X$ is,

$$\sum_{\substack{k=1 \\ k\ \mathrm{odd}}}^{n} \binom{n}{k}p^k(1-p)^{n-k} = \frac{1 - (1-2p)^n}{2}$$

which follows from subtracting the binomial expansion of $((1-p) - p)^n$ from that of $((1-p) + p)^n = 1$ and halving: the even terms cancel and each odd term appears twice.
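This identity is easy to spot-check by comparing the exact odd-index sum against the closed form $(1-(1-2p)^n)/2$ (a Python sketch; `odd_mass` is a hypothetical helper, not from the post):

```python
import math

def odd_mass(n, p):
    """Exact probability that X ~ B(n, p) takes an odd value."""
    return sum(math.comb(n, k) * p ** k * (1 - p) ** (n - k)
               for k in range(1, n + 1, 2))

for n in (1, 5, 10, 31):
    for p in (0.0, 0.2, 0.5, 0.9):
        closed_form = (1 - (1 - 2 * p) ** n) / 2
        assert abs(odd_mass(n, p) - closed_form) < 1e-12
```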
3.3. Markov's Inequality

Markov's inequality gives an upper bound (a function of its expectation) on the probability that a non-negative random variable, or a non-negative function of a random variable, is no less than a positive constant.
3.3.1. Basic Version

Given any non-negative random variable $X$ and $a > 0$, we have,

$$\Pr[X \ge a] \le \frac{\mathbb{E}[X]}{a}$$

proof:

$$\mathbb{E}[X] \ge \mathbb{E}\left[X \cdot \mathbf{1}_{X \ge a}\right] \ge a\,\mathbb{E}\left[\mathbf{1}_{X \ge a}\right] = a \Pr[X \ge a]$$
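Markov's inequality holds in particular for the empirical distribution of any non-negative sample, so it can be illustrated numerically without any slack (a Python sketch, not part of the original post):

```python
import random

random.seed(0)
# exponential samples with rate 1, so the empirical mean is close to 1
samples = [random.expovariate(1.0) for _ in range(100_000)]
empirical_mean = sum(samples) / len(samples)

for a in (0.5, 1.0, 2.0, 5.0):
    tail = sum(x >= a for x in samples) / len(samples)
    # Markov applied to the empirical distribution holds exactly
    assert tail <= empirical_mean / a
```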
3.3.2. Extended Version

Given a monotonically increasing function $\varphi$ from the non-negative reals to the non-negative reals, a random variable $X$, $a \ge 0$, and $\varphi(a) > 0$, then,

$$\Pr[|X| \ge a] \le \frac{\mathbb{E}[\varphi(|X|)]}{\varphi(a)}$$

proof:

Since $\varphi$ is monotonically increasing, $|X| \ge a$ implies $\varphi(|X|) \ge \varphi(a)$. Applying the basic version to the non-negative random variable $\varphi(|X|)$,

$$\Pr[|X| \ge a] \le \Pr[\varphi(|X|) \ge \varphi(a)] \le \frac{\mathbb{E}[\varphi(|X|)]}{\varphi(a)}$$
3.4. Chebyshev's Inequality

Chebyshev's inequality is about how "far" the values of a distribution can deviate from its mean. Formally speaking, it guarantees that for any distribution, no more than $1/k^2$ of the distribution's values can be more than $k$ standard deviations away from the mean.
3.4.1. Basic Version

Let $X$ be a random variable with finite expected value $\mu$ and finite non-zero variance $\sigma^2$. Then for any real number $k > 0$,

$$\Pr[|X - \mu| \ge k\sigma] \le \frac{1}{k^2}$$

proof:

Apply the extended version of Markov's inequality with $\varphi(t) = t^2$ to $|X - \mu|$:

$$\Pr[|X - \mu| \ge k\sigma] \le \frac{\mathbb{E}\left[(X-\mu)^2\right]}{k^2\sigma^2} = \frac{\sigma^2}{k^2\sigma^2} = \frac{1}{k^2}$$
Another expression is as follows:

For any $\varepsilon > 0$, the above expression can also be written as,

$$\Pr[|X - \mu| \ge \varepsilon] \le \frac{\sigma^2}{\varepsilon^2}$$

proof:

Set $k = \varepsilon/\sigma$ in the basic version above.
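Like Markov's inequality, Chebyshev's inequality holds exactly for the empirical distribution of a finite sample, using the empirical mean and variance (a Python sketch, not part of the original post):

```python
import random

random.seed(1)
xs = [random.gauss(0.0, 1.0) for _ in range(50_000)]
n = len(xs)
mu = sum(xs) / n
var = sum((x - mu) ** 2 for x in xs) / n  # empirical variance

for eps in (0.5, 1.0, 2.0, 3.0):
    tail = sum(abs(x - mu) >= eps for x in xs) / n
    # Chebyshev applied to the empirical distribution holds exactly
    assert tail <= var / eps ** 2
```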
3.4.2. Extensions

- Asymmetric case: for any $l < u$, we have,
  $$\Pr[l < X < u] \ge 1 - \frac{4\left(\sigma^2 + \left(\mu - \frac{l+u}{2}\right)^2\right)}{(u-l)^2}$$
- Vector version: for a random vector $X$ with mean $\mu = \mathbb{E}[X]$, variance $\sigma^2 = \mathbb{E}\left[\lVert X - \mu \rVert^2\right]$ and an arbitrary norm $\lVert \cdot \rVert$, and any $k > 0$,
  $$\Pr[\lVert X - \mu \rVert \ge k\sigma] \le \frac{1}{k^2}$$
  proof: apply Markov's inequality to the non-negative random variable $\lVert X - \mu \rVert^2$, whose expectation is $\sigma^2$.
4. In Real Analysis
4.1. Hölder's Inequality

Suppose that $(a_1, \dots, a_n)$ and $(b_1, \dots, b_n)$ are non-negative numbers, let $p > 1$, and $q$ is the dual of $p$, that is,

$$\frac{1}{p} + \frac{1}{q} = 1$$

Then, we have,

$$\sum_{k=1}^{n} a_k b_k \le \left(\sum_{k=1}^{n} a_k^p\right)^{1/p}\left(\sum_{k=1}^{n} b_k^q\right)^{1/q}$$

Lemma (Young's inequality): Given any two non-negative numbers $a$ and $b$, and two positive numbers $p$ and $q$ such that,

$$\frac{1}{p} + \frac{1}{q} = 1,$$

then, we have,

$$ab \le \frac{a^p}{p} + \frac{b^q}{q}$$

Writing $A = \left(\sum_k a_k^p\right)^{1/p}$ and $B = \left(\sum_k b_k^q\right)^{1/q}$ (both assumed non-zero), applying the lemma to each pair $(a_k/A,\, b_k/B)$ and summing over $k$ gives $\frac{1}{AB}\sum_k a_k b_k \le \frac{1}{p} + \frac{1}{q} = 1$, which is Hölder's inequality.
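The finite-sum form of Hölder's inequality can be spot-checked numerically over random non-negative sequences and random exponents (a Python sketch, with a small multiplicative slack for floating-point rounding):

```python
import random

random.seed(2)
for _ in range(200):
    n = random.randint(1, 20)
    a = [random.random() for _ in range(n)]
    b = [random.random() for _ in range(n)]
    p = random.uniform(1.1, 5.0)
    q = p / (p - 1)  # dual exponent: 1/p + 1/q = 1
    lhs = sum(x * y for x, y in zip(a, b))
    rhs = (sum(x ** p for x in a) ** (1 / p)
           * sum(y ** q for y in b) ** (1 / q))
    assert lhs <= rhs * (1 + 1e-9)
```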
4.2. Minkowski's Inequality

Suppose that $(a_1, \dots, a_n)$ and $(b_1, \dots, b_n)$ are two non-negative sequences and $p \ge 1$, then,

$$\left(\sum_{k=1}^{n}(a_k + b_k)^p\right)^{1/p} \le \left(\sum_{k=1}^{n} a_k^p\right)^{1/p} + \left(\sum_{k=1}^{n} b_k^p\right)^{1/p}$$
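A numerical spot-check of Minkowski's inequality, in the same style as the one for Hölder's inequality (a Python sketch with floating-point slack):

```python
import random

random.seed(3)
for _ in range(200):
    n = random.randint(1, 20)
    a = [random.random() for _ in range(n)]
    b = [random.random() for _ in range(n)]
    p = random.uniform(1.0, 5.0)
    lhs = sum((x + y) ** p for x, y in zip(a, b)) ** (1 / p)
    rhs = (sum(x ** p for x in a) ** (1 / p)
           + sum(y ** p for y in b) ** (1 / p))
    assert lhs <= rhs * (1 + 1e-9)
```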
4.3. Infinite Norm

If $x = (x_1, \dots, x_n) \in \mathbb{R}^n$ and $p \ge 1$, then,

$$\lVert x \rVert_\infty \le \lVert x \rVert_p \le n^{1/p}\,\lVert x \rVert_\infty, \quad\text{where } \lVert x \rVert_\infty = \max_{1 \le k \le n} |x_k|$$

moreover,

$$\lim_{p \to \infty} \lVert x \rVert_p = \lVert x \rVert_\infty$$
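The sandwich bound and the limit are easy to see numerically: for a fixed vector, the $p$-norm is squeezed between the max-norm and $n^{1/p}$ times it, and approaches the max-norm as $p$ grows (a Python sketch; `p_norm` is my own helper name):

```python
def p_norm(x, p):
    """The l_p norm of a real vector x."""
    return sum(abs(v) ** p for v in x) ** (1 / p)

x = [3.0, -1.0, 2.5, 0.5]
inf_norm = max(abs(v) for v in x)

for p in (1, 2, 8, 64, 512):
    # sandwich bound: ||x||_inf <= ||x||_p <= n^(1/p) * ||x||_inf
    assert inf_norm <= p_norm(x, p) * (1 + 1e-9)
    assert p_norm(x, p) <= len(x) ** (1 / p) * inf_norm * (1 + 1e-9)

# the p-norm approaches the max-norm as p grows
assert abs(p_norm(x, 512) - inf_norm) < 1e-2
```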
4.4. Convergent Sequence & Cauchy Sequence

If a sequence in a metric space is convergent, then it is a Cauchy sequence.

proof:

Suppose $x_n \to x$ in a metric space $(M, d)$. For any $\varepsilon > 0$ there exists $N$ such that $d(x_n, x) < \varepsilon/2$ for all $n \ge N$. Then for all $m, n \ge N$, by the triangle inequality,

$$d(x_m, x_n) \le d(x_m, x) + d(x, x_n) < \varepsilon$$

so the sequence is Cauchy.