# Wikidata:Property proposal/convergence rate

### order of convergence (changed from "convergence rate")

Originally proposed at Wikidata:Property proposal/Natural science

Withdrawn
• Description: the rate at which a sequence or numerical method converges
• Represents: rate of convergence (Q1783502)
• Data type: Item
• Domain: numerical method (Q24262840)
• PABLO QUINTERO (Q133250)
• Allowed values: instances of big O notation (Q269878)
• Examples:
  • Euler method (Q868454) → ${\displaystyle O(h^{2})}$, with qualifiers of (P642): local error (Q97796690), relative to (P2210)
  • Broyden–Fletcher–Goldfarb–Shanno algorithm (Q2877013) → quadratic? (to-do: look up convergence rate of BFGS)
  • gradient descent (Q1199743) → [1]
  • gradient descent (Q1199743) → [2]
  • Newton's method in optimization (Q17086396) → [3]
• See also: worst-case space complexity (P3755), average space complexity (P3757), best-case space complexity (P3756), worst-case time complexity (P3752), average time complexity (P3754), best-case time complexity (P3753), big O notation (Q269878)

#### Motivation

Well-posed numerical methods (theoretically) converge to the exact solution as ${\displaystyle n\to a}$. For discretized methods, ${\displaystyle n}$ is the grid size of the discretization and ${\displaystyle n\to \infty }$. For iterative methods, ${\displaystyle n}$ could be the number of iterations and ${\displaystyle n\to \infty }$. Alternatively, for iterative numerical methods for solving differential equations, ${\displaystyle n}$ is the time-step size and ${\displaystyle n\to 0}$. In each case, there is an upper bound on the error of the method as a function of ${\displaystyle n}$ as ${\displaystyle n\to a}$. It would be useful to have methods categorized according to this property.
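As an illustration of the error bound described above, the order of a discretized method can be estimated empirically by halving the step size and comparing errors. The sketch below (my own illustration, not part of the proposal; the function names are made up) does this for forward Euler on ${\displaystyle y'=y}$, whose global error is ${\displaystyle O(h)}$:

```python
import math

def euler_solve(h):
    """Forward Euler for y' = y, y(0) = 1, integrated to t = 1.

    The exact solution is y(1) = e; the global error is O(h).
    """
    y = 1.0
    for _ in range(round(1.0 / h)):
        y += h * y  # one Euler step: y_{k+1} = y_k + h * f(t_k, y_k)
    return y

# If error(h) ~ C * h^p, then p ~ log2(error(h) / error(h/2)).
h = 0.01
err_h = abs(euler_solve(h) - math.e)
err_h2 = abs(euler_solve(h / 2) - math.e)
p_estimate = math.log2(err_h / err_h2)  # close to 1 for forward Euler
```

Halving the step size roughly halves the error, so the estimated order comes out near 1, matching the known first-order global accuracy of forward Euler.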

Specifically, I am proposing that we add a property whose value is the big-O order of convergence of the method. That is, we would set <order of convergence> <g(n)>, where g(n) is a function such that ${\displaystyle f(x)=O(g(x)){\text{ as }}x\rightarrow a}$ if ${\displaystyle \limsup _{x\rightarrow a}\left|{\frac {f(x)}{g(x)}}\right|<\infty }$.
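To make the limsup criterion concrete, here is a small numerical sketch of my own (the particular f and g are just illustrative assumptions): ${\displaystyle f(h)=3h^{2}+h^{3}}$ is ${\displaystyle O(h^{2})}$ as ${\displaystyle h\to 0}$ because the ratio ${\displaystyle |f(h)/h^{2}|}$ stays bounded (it tends to 3).

```python
# Check boundedness of |f(h) / g(h)| for f(h) = 3h^2 + h^3, g(h) = h^2
# as h -> 0; the ratio tends to 3, so f = O(g).
ratios = [abs(3 * h**2 + h**3) / h**2 for h in (0.1, 0.01, 0.001, 0.0001)]
bounded = max(ratios) < 4  # the ratios are 3.1, 3.01, 3.001, 3.0001
```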

There are a lot of definitions that are used similarly, so I'm going to list them here and we'll try to sort them out.

• Rate of convergence: if a sequence converges linearly, ${\displaystyle \mu :=\lim _{k\rightarrow \infty }{\frac {\left|x_{k+1}-L\right|}{\left|x_{k}-L\right|}}}$  is called the rate of convergence
• If a sequence converges to a limit, then r
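The first definition above can be checked numerically. This sketch (my own example, not from the proposal) uses the fixed-point iteration ${\displaystyle x_{k+1}=0.5x_{k}+1}$, which converges linearly to ${\displaystyle L=2}$ with rate ${\displaystyle \mu =0.5}$:

```python
# x_{k+1} = 0.5 * x_k + 1 converges linearly to L = 2; the ratio
# |x_{k+1} - L| / |x_k - L| equals 0.5 at every step, so mu = 0.5.
L = 2.0
x = 0.0
ratios = []
for _ in range(20):
    x_next = 0.5 * x + 1.0
    ratios.append(abs(x_next - L) / abs(x - L))
    x = x_next
mu = ratios[-1]  # the rate of convergence, 0.5
```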

The following excerpt from [4] provides clear definitions of order and rate of convergence.

"whether this iteration will converge, and, if so, the rate of convergence. Specifically we use the following to represent how quickly the error ${\displaystyle e_{n}=x_{n}-x^{*}}$ converges to zero: ${\displaystyle \lim _{n\rightarrow \infty }{\frac {\left|e_{n+1}\right|}{\left|e_{n}\right|^{p}}}=\lim _{n\rightarrow \infty }{\frac {\left|x_{n+1}-x^{*}\right|}{\left|x_{n}-x^{*}\right|^{p}}}=\mu }$ Here ${\displaystyle p\geq 1}$ is called the order of convergence, the constant ${\displaystyle \mu }$ is the rate of convergence or asymptotic error constant."
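The definition in the excerpt can be verified on a classic example. This sketch (my own illustration, using the standard Newton iteration for ${\displaystyle x^{2}-2=0}$) shows order ${\displaystyle p=2}$ with asymptotic error constant ${\displaystyle \mu =1/(2{\sqrt {2}})\approx 0.354}$:

```python
import math

# Newton's method for f(x) = x^2 - 2 converges to x* = sqrt(2) with
# order p = 2; the ratio |e_{n+1}| / |e_n|^2 approaches the asymptotic
# error constant mu = 1 / (2 * sqrt(2)) ~ 0.3536.
x_star = math.sqrt(2.0)
x = 1.5
ratios = []
for _ in range(3):  # three steps already reach ~1e-12 error
    x_next = 0.5 * (x + 2.0 / x)  # Newton update for x^2 - 2 = 0
    e, e_next = abs(x - x_star), abs(x_next - x_star)
    ratios.append(e_next / e**2)
    x = x_next
mu = ratios[-1]  # close to 1 / (2 * sqrt(2))
```

Note the contrast with the linear case: here the error is squared at every step, so the ratio is only stable when the denominator is raised to the power p = 2.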

I think it makes sense to create an order of convergence (OC) property, but not a rate of convergence (RC) property. The reason is that OC is (1) a much more important property of an algorithm and (2) in practice it will take only a limited number of values: logarithmic, sublinear, linear, quadratic, cubic, quartic, and maybe a few more. RC, by contrast, could be any real number.

The-erinaceous-one (talk) 07:47, 31 July 2020 (UTC)