Chebyshev equation - Wikipedia
https://en.wikipedia.org/wiki/Chebyshev_equation
Chebyshev's equation is the second order linear differential equation

$(1 - x^2)\,\frac{d^2 y}{dx^2} - x\,\frac{dy}{dx} + p^2 y = 0,$

where p is a real (or complex) constant. The equation is named after Russian mathematician Pafnuty Chebyshev. The solutions can be obtained by power series

$y = \sum_{n=0}^{\infty} a_n x^n,$

where the coefficients obey the recurrence relation

$a_{n+2} = \frac{(n - p)(n + p)}{(n + 1)(n + 2)}\, a_n.$

The series converges for $|x| < 1$ (note, x may be complex), as may be seen by …
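The recurrence makes the series solution easy to compute numerically. Below is a minimal sketch (the function names and the choice p = 3 are illustrative assumptions, not taken from the article) that builds the coefficients $a_n$ from the recurrence and evaluates the truncated series; when p is a non-negative integer, one of the two independent series terminates and gives a polynomial solution.

```python
# Illustrative sketch: truncated power-series solution of Chebyshev's equation.
# Function names and parameter values are assumptions, not taken from the article.

def chebyshev_series_coeffs(p, a0, a1, n_terms):
    """Coefficients a_n of y = sum a_n x^n obeying
    a_{n+2} = (n - p)(n + p) / ((n + 1)(n + 2)) * a_n."""
    a = [0.0] * n_terms
    a[0], a[1] = a0, a1
    for n in range(n_terms - 2):
        a[n + 2] = (n - p) * (n + p) / ((n + 1) * (n + 2)) * a[n]
    return a

def series_value(coeffs, x):
    """Evaluate the truncated series at x (the full series converges for |x| < 1)."""
    return sum(c * x**n for n, c in enumerate(coeffs))

# For integer p one of the two series terminates: with p = 3, a0 = 0, a1 = 1
# the result is x - (4/3) x^3, a scalar multiple of the Chebyshev polynomial T_3.
coeffs = chebyshev_series_coeffs(p=3, a0=0.0, a1=1.0, n_terms=10)
print(series_value(coeffs, 0.5))  # 0.5 - (4/3) * 0.125 = 1/3
```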
Chebyshev polynomials - Wikipedia
https://en.wikipedia.org/wiki/Chebyshev_polynomials
The Chebyshev polynomials are two sequences of polynomials related to the cosine and sine functions, notated as $T_n(x)$ and $U_n(x)$. They can be defined several equivalent ways; in this article the polynomials are defined by starting with trigonometric functions: The Chebyshev polynomials of the first kind are given by $T_n(\cos\theta) = \cos(n\theta)$. Similarly, define the Chebyshev polynomials of the second kind as $U_n(\cos\theta)\,\sin\theta = \sin\bigl((n+1)\theta\bigr)$.
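A quick way to see the two definitions in action is to evaluate them directly from the trigonometric forms. The sketch below does that for x in (-1, 1); the three-term recurrence check is an added illustration, not part of the snippet.

```python
import math

# Trigonometric definitions of the Chebyshev polynomials, for x in (-1, 1).
# The recurrence check is an added illustration, not part of the article snippet.

def T(n, x):
    """First kind: T_n(cos theta) = cos(n * theta)."""
    return math.cos(n * math.acos(x))

def U(n, x):
    """Second kind: U_n(cos theta) * sin(theta) = sin((n + 1) * theta)."""
    theta = math.acos(x)
    return math.sin((n + 1) * theta) / math.sin(theta)

# Both families satisfy the same three-term recurrence P_{n+1}(x) = 2x P_n(x) - P_{n-1}(x).
x = 0.3
assert abs(T(4, x) - (2 * x * T(3, x) - T(2, x))) < 1e-12
assert abs(U(4, x) - (2 * x * U(3, x) - U(2, x))) < 1e-12
print(T(4, x), U(4, x))
```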
Chebyshev's inequality - Wikipedia
https://en.wikipedia.org/wiki/Chebyshev's_inequality
In probability theory, Chebyshev's inequality (also called the Bienaymé–Chebyshev inequality) guarantees that, for a wide class of probability distributions, no more than a certain fraction of values can be more than a certain distance from the mean. Specifically, no more than $1/k^2$ of the distribution's values can be k or more standard deviations away from the mean (or equivalently, at least $1 - 1/k^2$ of the distribution's values are within k standard deviations of the mean)…
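A small numerical check makes the bound concrete. The sketch below draws from an exponential distribution (my own choice of example, not mentioned in the article) and compares the observed tail fraction with the Chebyshev bound $1/k^2$.

```python
import random

# Monte Carlo check of Chebyshev's inequality: P(|X - mu| >= k*sigma) <= 1/k^2.
# The exponential(1) sample (mean 1, standard deviation 1) is an illustrative choice.

random.seed(0)
samples = [random.expovariate(1.0) for _ in range(100_000)]
mu, sigma = 1.0, 1.0

for k in (2, 3, 4):
    frac = sum(abs(x - mu) >= k * sigma for x in samples) / len(samples)
    print(f"k = {k}: observed fraction {frac:.4f} <= Chebyshev bound {1 / k**2:.4f}")
```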
Chebyshev's inequality - Wikipedia
https://en.wikipedia.org/wiki/Chebyshev's_inequality
In statistics, the rule is often called Chebyshev's theorem, about the range of standard deviations around the mean. The inequality has great utility because it can be applied to any probability distribution in which the mean and variance are defined. For example, it can be used to prove the weak law of large numbers.
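As a sketch of how that proof goes (the variance and tolerance values below are illustrative assumptions): applying Chebyshev's inequality to the sample mean of n independent observations with common variance sigma^2 gives $P(|\bar{X}_n - \mu| \ge \varepsilon) \le \sigma^2 / (n \varepsilon^2)$, which tends to 0 as n grows.

```python
# Chebyshev bound on the deviation of the sample mean, as used in the
# weak law of large numbers. sigma2 and eps are illustrative values.

sigma2 = 1.0   # variance of a single observation (assumed finite)
eps = 0.1      # tolerance around the true mean

for n in (100, 10_000, 1_000_000):
    bound = sigma2 / (n * eps**2)
    print(f"n = {n}: P(|sample mean - mu| >= {eps}) <= {bound:.6f}")
```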