Concurrency

Concurrency is the ability of a system to manage multiple tasks that make progress over overlapping time periods. It enables a program to handle several sequences of operations at once, either by interleaving their execution on a single processor or by running them in parallel on multiple cores. Concurrency focuses on structure and coordination, while parallelism emphasizes actual simultaneous execution.

A typical concurrent system involves tasks, processes, or threads that share resources or communicate with each other. Models of concurrency include shared-memory concurrency, where tasks access a common address space, and message-passing concurrency, where tasks communicate through explicit messages. The actor model is a related approach in which independent entities (actors) exchange messages and make progress independently.

Synchronization and communication are central to correctness in concurrent programs. Tools such as mutexes, locks, semaphores, monitors, and condition variables help enforce mutual exclusion and ordering. Barriers coordinate phases of work. Proper memory models ensure visibility and ordering of updates across threads. Common problems include race conditions, deadlocks, livelocks, and starvation, as well as subtle memory visibility issues.

Programs use various techniques to manage concurrency, including preemptive or cooperative scheduling, thread pools, and asynchronous I/O with futures, promises, or async/await. Design patterns include shared-memory synchronization with locks, message-passing with channels, and the actor model, as well as data parallelism and pipelining.

Many languages provide built-in support for concurrency, such as threads in Java and C++, goroutines in Go, the event loop in Node.js, or processes in Erlang. Modern hardware with multi-core CPUs enables true parallelism, though concurrency can be exploited on a single core through interleaved execution.
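The message-passing model described above can be sketched in Python, where a `queue.Queue` plays the role of a channel: the threads never touch each other's data directly and communicate only through explicit messages. The worker function and sentinel convention here are illustrative choices, not a fixed protocol.

```python
import threading
import queue

# Message-passing style: the worker receives work items over one queue
# (the "channel") and sends results back over another, so the two
# threads share no mutable state directly.
tasks: "queue.Queue[int | None]" = queue.Queue()
results: "queue.Queue[int]" = queue.Queue()

def worker() -> None:
    while True:
        item = tasks.get()
        if item is None:          # sentinel message: no more work
            break
        results.put(item * item)  # reply with an explicit message

t = threading.Thread(target=worker)
t.start()
for n in [1, 2, 3]:
    tasks.put(n)
tasks.put(None)
t.join()

squares = sorted(results.get() for _ in range(3))
print(squares)  # [1, 4, 9]
```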
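The race conditions and mutual exclusion discussed above can be illustrated with a minimal sketch: several threads increment a shared counter, and a `threading.Lock` (a mutex) makes each read-modify-write step atomic so no update is lost.

```python
import threading

counter = 0
lock = threading.Lock()

def increment(times: int) -> None:
    global counter
    for _ in range(times):
        # Without the lock, the load-increment-store sequence from two
        # threads can interleave and lose updates (a race condition).
        with lock:
            counter += 1

threads = [threading.Thread(target=increment, args=(100_000,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(counter)  # 400000: every increment is preserved
```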
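A thread pool with futures, one of the techniques mentioned above, might look like this sketch using Python's `concurrent.futures`; `slow_square` is a hypothetical stand-in for a blocking operation such as a network request.

```python
from concurrent.futures import ThreadPoolExecutor

def slow_square(n: int) -> int:
    # Stand-in for blocking work, e.g. I/O.
    return n * n

# The pool schedules work across a fixed set of worker threads; submit()
# returns a Future immediately, and .result() blocks until that task is done.
with ThreadPoolExecutor(max_workers=3) as pool:
    futures = [pool.submit(slow_square, n) for n in range(5)]
    values = [f.result() for f in futures]

print(values)  # [0, 1, 4, 9, 16]
```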
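Interleaved execution on a single core, as with the Node.js event loop mentioned above, can be sketched with Python's `asyncio`: two coroutines run on one thread and yield at each `await`, so the event loop alternates between them with no parallelism involved.

```python
import asyncio

order: list[str] = []

async def task(name: str) -> None:
    for step in range(2):
        order.append(f"{name}{step}")
        await asyncio.sleep(0)  # yield control back to the event loop

async def main() -> None:
    # Both coroutines share one thread; the loop interleaves their steps.
    await asyncio.gather(task("a"), task("b"))

asyncio.run(main())
print(order)  # ['a0', 'b0', 'a1', 'b1']
```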