Order precision
Order precision, or order of accuracy, is a concept in numerical analysis used to describe how the error of a numerical method decreases as the discretization parameter is refined. If the global error satisfies |error| ≤ C h^p for small h, where h represents a typical discretization size (such as grid spacing or timestep) and C is a constant, the method is said to have order p. The exponent p is the order of precision. Higher p means faster convergence as the discretization is refined. In many contexts, the local truncation error is O(h^{p+1}) and the global error is O(h^p), given suitable smoothness of the exact solution.
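As a rough illustration of this scaling, the following minimal Python sketch (the function name centered_diff is chosen for illustration, not taken from any library) approximates the derivative of sin x at x = 1 with a second-order centered difference and prints the error as h is halved. The error should shrink by roughly a factor of 2^2 = 4 at each halving, consistent with p = 2.

```python
import math

def centered_diff(f, x, h):
    # Second-order centered difference approximation of f'(x).
    return (f(x + h) - f(x - h)) / (2.0 * h)

exact = math.cos(1.0)  # d/dx sin(x) at x = 1
for h in [0.1, 0.05, 0.025, 0.0125]:
    err = abs(centered_diff(math.sin, 1.0, h) - exact)
    print(f"h = {h:>7.4f}   error = {err:.3e}")
# Each halving of h reduces the error by roughly a factor of 4,
# as expected for a second-order (p = 2) approximation.
```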
In floating-point computation, precision refers instead to the number of significant digits used to represent numbers; this is a property of the number format, not of the discretization, and should not be confused with the order of accuracy of a method.
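In IEEE 754 double precision, for example, this amounts to roughly 15-16 significant decimal digits. A brief Python check, shown only to illustrate this floating-point sense of precision:

```python
import sys

# IEEE 754 double precision carries about 15-16 significant decimal digits.
print(sys.float_info.dig)      # 15: decimal digits always representable
print(sys.float_info.epsilon)  # ~2.22e-16: machine epsilon
print(0.1 + 0.2)               # 0.30000000000000004: rounding near the 17th digit
```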
Common examples illustrate typical orders of accuracy: the forward Euler method for ordinary differential equations is first-order accurate, the classical Runge–Kutta method is fourth-order accurate, and the centered finite-difference approximation of a first derivative is second-order accurate.
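The first-order behavior of forward Euler can be observed directly. The sketch below (Python, with an illustrative helper named euler_error) integrates y' = -y with y(0) = 1 up to t = 1 and reports the error against the exact value e^{-1} for a sequence of halved step sizes; the error should roughly halve each time.

```python
import math

def euler_error(h):
    # Integrate y' = -y, y(0) = 1 with forward Euler up to t = 1
    # and return the absolute error against the exact solution e^{-1}.
    n = round(1.0 / h)
    y = 1.0
    for _ in range(n):
        y += h * (-y)
    return abs(y - math.exp(-1.0))

for h in [0.1, 0.05, 0.025]:
    print(f"h = {h:>6.3f}   error = {euler_error(h):.3e}")
# The error roughly halves when h is halved, consistent with a
# first-order (p = 1) method.
```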
Practical assessment of order involves computing a solution with different step sizes (e.g., h and h/2) and comparing the resulting errors against a known exact solution or a highly resolved reference solution. The observed order can then be estimated as p ≈ log2(e(h) / e(h/2)), where e(h) denotes the error obtained with step size h.
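A minimal sketch of this estimate, assuming the errors at h and h/2 are already available (here the values printed by the forward Euler example above, for h = 0.1 and h = 0.05, are reused):

```python
import math

def observed_order(err_h, err_half_h):
    # Estimate the order of accuracy from errors at step sizes h and h/2:
    # p ~ log2(e(h) / e(h/2)).
    return math.log2(err_h / err_half_h)

# Errors from the forward Euler sketch above at h = 0.1 and h = 0.05.
e_h, e_half = 1.92e-2, 9.39e-3
print(f"observed order = {observed_order(e_h, e_half):.2f}")  # close to 1
```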