Algorithms

An algorithm is a finite, well-defined set of instructions that, given input, produces output and terminates after a finite number of steps. Each instruction is unambiguous, and the sequence is designed to be executable by a computer or carried out by a person following the rules. Algorithms are the basis of computer programs and of many systematic procedures used in mathematics, data processing, and everyday tasks.
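As an illustration of this definition, Euclid's algorithm for the greatest common divisor is a classic example: each step is unambiguous, and it provably terminates because the remainder strictly decreases. A minimal sketch in Python:

```python
def gcd(a: int, b: int) -> int:
    """Euclid's algorithm: a finite, well-defined procedure that,
    given two integers, outputs their greatest common divisor."""
    while b != 0:
        # Replace (a, b) with (b, a mod b); the second value
        # strictly decreases, so the loop terminates.
        a, b = b, a % b
    return a

print(gcd(48, 18))  # → 6
```

The same procedure can be carried out by hand, which matches the point above that algorithms need not be tied to computers.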

The study of algorithms combines design methods with analysis of resource use. Their history stretches from ancient procedures for arithmetic to the formal theories developed in the 20th century by researchers such as Turing, Church, and von Neumann, which established formal models of computation and the framework for complexity and optimization.

Algorithms are evaluated by efficiency and correctness. Time complexity measures how the running time grows with input size, often expressed with Big-O notation. Space complexity measures memory use. Analyses consider worst-case, average-case, and sometimes best-case scenarios.

Algorithms are categorized by approach and domain. Common families include brute-force, divide and conquer, dynamic programming, greedy, and graph algorithms. Classic examples include sorting (quicksort, mergesort), searching (binary search), shortest path (Dijkstra, Bellman–Ford), and minimum spanning tree methods (Prim, Kruskal).

Applications span computing, databases, optimization, numerical simulation, cryptography, and artificial intelligence. Limitations arise from fundamental results such as undecidability and from problems believed not to admit efficient general solutions, known as NP-hard problems. The study of algorithms continues to evolve as new models of computation and new data challenges emerge.
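Binary search, one of the classic examples mentioned above, also illustrates the complexity analysis described earlier: each step halves the remaining interval, so its worst-case time is O(log n) on a sorted sequence. A minimal sketch:

```python
def binary_search(items, target):
    """Return the index of target in the sorted list items, or -1.
    Each iteration halves the search interval, giving O(log n) time
    and O(1) extra space."""
    lo, hi = 0, len(items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if items[mid] == target:
            return mid
        if items[mid] < target:
            lo = mid + 1  # target, if present, lies in the upper half
        else:
            hi = mid - 1  # target, if present, lies in the lower half
    return -1

print(binary_search([2, 5, 8, 12, 16, 23], 16))  # → 4
```

Compared with a brute-force linear scan, which is O(n) in the worst case, the logarithmic bound is what makes binary search practical on very large sorted data.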
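Mergesort, another of the classic examples above, shows the divide-and-conquer family in miniature: split the input, solve the halves recursively, and combine the results. A minimal sketch:

```python
def merge_sort(xs):
    """Divide and conquer: split the list, sort each half recursively,
    then merge the two sorted halves. Runs in O(n log n) time in the
    worst, average, and best cases."""
    if len(xs) <= 1:
        return xs  # base case: a list of 0 or 1 items is sorted
    mid = len(xs) // 2
    left, right = merge_sort(xs[:mid]), merge_sort(xs[mid:])
    # Merge the two sorted halves into one sorted list.
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    return merged + left[i:] + right[j:]

print(merge_sort([5, 2, 9, 1, 5, 6]))  # → [1, 2, 5, 5, 6, 9]
```

Unlike quicksort, whose worst case is O(n²), mergesort guarantees O(n log n), which is one reason analyses distinguish worst-case from average-case behavior.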