Ray Tracing

Ray tracing is a rendering technique that generates an image by simulating the paths of light as rays that travel through a virtual scene. In a basic setup, rays are traced from a virtual camera through each image pixel into the scene; their intersections with surfaces determine the color seen at that pixel.
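The camera-ray setup described above can be sketched in a few lines of Python. This is a minimal illustration, not a real renderer: the scene (one hard-coded sphere), the camera placement, and the function names are invented for the example.

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Return the distance t along the ray to the nearest sphere hit, or None.
    Solves |origin + t*direction - center|^2 = radius^2 for t."""
    oc = tuple(o - c for o, c in zip(origin, center))
    a = sum(d * d for d in direction)
    b = 2.0 * sum(o * d for o, d in zip(oc, direction))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4 * a * c
    if disc < 0:
        return None  # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / (2 * a)
    return t if t > 0 else None  # only accept hits in front of the camera

def trace_image(width, height):
    """Trace one primary ray per pixel from a camera at the origin through
    an image plane at z = -1; shade hits white (1.0) and misses black (0.0)."""
    sphere_center, sphere_radius = (0.0, 0.0, -3.0), 1.0
    image = []
    for j in range(height):
        row = []
        for i in range(width):
            # Map pixel (i, j) to [-1, 1] x [-1, 1] on the image plane.
            x = (i + 0.5) / width * 2 - 1
            y = 1 - (j + 0.5) / height * 2
            direction = (x, y, -1.0)
            hit = ray_sphere_hit((0.0, 0.0, 0.0), direction,
                                 sphere_center, sphere_radius)
            row.append(1.0 if hit is not None else 0.0)
        image.append(row)
    return image
```

With an 11x11 image, the ray through the center pixel points straight down the z-axis and hits the sphere, while corner rays miss it, producing a white disc on a black background.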

At each intersection, the algorithm computes the surface color from the material properties and the lighting in the scene. To model reflective and refractive effects, rays are spawned recursively to capture reflections, refractions, and shadows. The resulting images can exhibit realistic phenomena such as soft shadows, color bleeding, and caustics.

Variants include classic Whitted-style ray tracing, distributed/Monte Carlo path tracing, and bidirectional or Metropolis light transport. Path tracing estimates the rendering equation by averaging many sampled light paths, trading determinism for potentially more accurate global illumination.

Performance is improved by spatial data structures such as bounding volume hierarchies (BVHs) or kd-trees, which accelerate ray–scene intersection queries. Denoising and adaptive sampling help reduce noise.

Real-time ray tracing has become practical with hardware acceleration and specialized APIs. Modern GPUs include dedicated ray-tracing cores, and APIs such as DirectX Raytracing (DXR) and Vulkan Ray Tracing enable interactive applications. Many renderers mix rasterization for primary visibility with ray tracing for reflections, shadows, and global illumination.

Origins date to early computer graphics research in the 1960s and to Turner Whitted's 1980 recursive ray tracing algorithm, with
substantial advances in the 1990s and 2000s. It is widely used in film production, architectural visualization, and increasingly in video games and real-time visualization. Limitations include computational cost and residual noise in stochastic methods.
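The stochastic noise mentioned above comes from Monte Carlo averaging: path tracing estimates an integral by averaging random samples, and the error shrinks roughly as 1/sqrt(N). A toy estimator makes this concrete; the function name and the specific integral (the cosine term over the hemisphere, whose exact value is pi) are chosen for illustration and are not taken from any particular renderer.

```python
import math
import random

def estimate_hemisphere_cosine_integral(num_samples, rng):
    """Monte Carlo estimate of the integral of cos(theta) over the unit
    hemisphere (solid angle measure); the exact value is pi.
    Uses uniform hemisphere sampling, whose pdf is 1 / (2*pi)."""
    total = 0.0
    for _ in range(num_samples):
        # For uniformly distributed directions on the unit hemisphere,
        # cos(theta) -- the z component -- is uniform on [0, 1].
        cos_theta = rng.random()
        # Standard importance-sampling estimator: f(x) / pdf(x).
        total += cos_theta / (1.0 / (2.0 * math.pi))
    return total / num_samples
```

Running this with a few hundred samples gives a noisy value near pi; with a hundred thousand samples the estimate tightens considerably, mirroring how a path tracer's image noise falls as more samples per pixel are accumulated.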