Peroperation
Peroperation is a theoretical cost model used in computer science to quantify the resource usage of an algorithm by assigning a fixed cost to each elementary operation. In a peroperation model, the total cost is the number of elementary operations performed, multiplied by a constant unit cost, typically normalized to one. This approach highlights the count of operations rather than data size or memory traffic and is commonly used to obtain simple, architecture-agnostic estimates of running time in asymptotic analysis.
Definition and scope: An elementary operation refers to a basic step such as an arithmetic calculation, a comparison, an assignment, or a single memory access. Each such step is charged the same constant cost, so the model deliberately abstracts away differences between individual instructions.
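A minimal sketch in Python can make the bookkeeping concrete. The instrumentation below, including the function name and the choice of which steps to count, is an illustrative assumption rather than a prescribed interface; the model only requires that every counted operation carry the same unit cost.

    # Illustrative peroperation counting with unit cost normalized to 1.
    # The counting scheme (what counts as "elementary") is an assumption.
    def summation_with_cost(values):
        ops = 0
        total = 0
        ops += 1              # the initial assignment
        for v in values:
            total += v
            ops += 2          # one addition and one assignment per item
        return total, ops     # total cost = ops * 1 under unit cost

    result, cost = summation_with_cost([3, 1, 4, 1, 5])
    print(f"sum = {result}, peroperation cost = {cost}")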
Applications: Peroperation analysis is used in teaching to illustrate how algorithms scale with operation counts, in comparing candidate algorithms before implementation, and in deriving asymptotic bounds where constant factors can be ignored.
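For instance, a classroom comparison might tabulate operation counts for a single pass versus an all-pairs examination of n items; the function names below are hypothetical, and the counts follow directly from the loop structure.

    # Hypothetical teaching example: counts alone reveal scaling behavior.
    def linear_scan_ops(n):
        # one comparison per item in a single pass
        return n

    def all_pairs_ops(n):
        # one comparison per unordered pair: n * (n - 1) / 2
        return n * (n - 1) // 2

    for n in (10, 100, 1000):
        print(f"n={n:5d}  linear: {linear_scan_ops(n):7d}  "
              f"pairs: {all_pairs_ops(n):7d}")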
Limitations: The unit-cost assumption is often unrealistic on modern hardware, where costs vary with memory access patterns, cache behavior, pipelining, and operand size; two computations with identical operation counts can therefore differ markedly in actual running time.
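The following sketch, an assumption-laden illustration rather than a definitive benchmark, traverses the same data in two index orders. Both traversals perform the same number of elementary operations, yet their wall-clock times can diverge on cached memory hierarchies; exact figures depend on the machine and interpreter.

    import random
    import time

    # Same peroperation cost, different real cost: the shuffled order
    # defeats spatial locality, so caches help less. Timings vary by machine.
    N = 1_000_000
    data = list(range(N))
    sequential = list(range(N))
    shuffled = sequential[:]
    random.shuffle(shuffled)

    def traverse(indices):
        # one read and one addition per index: 2N elementary operations
        total = 0
        for i in indices:
            total += data[i]
        return total

    for name, order in (("sequential", sequential), ("shuffled", shuffled)):
        start = time.perf_counter()
        traverse(order)
        print(f"{name:10s}: {time.perf_counter() - start:.3f} s "
              f"(same operation count)")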
Relation to other models: It is closely related to the unit-cost RAM model, differing mainly in how the set of chargeable elementary operations is delimited; it also contrasts with logarithmic-cost models, in which the cost of an operation grows with the bit length of its operands.
See also: computational complexity, RAM model, unit-cost model, operation counting.