
HCI

HCI, or human–computer interaction, is an interdisciplinary field that studies the design, evaluation, and implementation of interactive computing systems for human use. It examines the interfaces between people and technology across devices, software, and networks, with the goal of making systems usable, efficient, and engaging.

The field draws on ergonomics, cognitive psychology, and computer science. It gained prominence with graphical user interfaces in the 1980s and has evolved to cover a wide range of modalities and settings. Influential thinkers such as Ben Shneiderman and Jakob Nielsen promoted usability principles and evaluation methods, building on earlier work by pioneers like Douglas Engelbart.

Core concepts include usability (effectiveness, efficiency, and satisfaction), user-centered design, and accessibility. Designers consider affordances, feedback, mental models, and cognitive load to create inclusive experiences. HCI emphasizes user needs and contexts, iterative design, and empirical validation through observation and testing.

Common methods are prototyping, usability testing, heuristic evaluation, task analysis, and think-aloud protocols. Evaluation uses metrics such as task success, time on task, error rate, and standardized scales like the System Usability Scale. These methods guide iterative improvements and collaboration among developers, designers, and users.
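
As a concrete illustration of the evaluation metrics above, the short Python sketch below computes a binary task success rate and scores one respondent's System Usability Scale (SUS) questionnaire. The function names and sample data are illustrative rather than part of any standard tool, but the SUS scoring rule itself (odd items contribute answer minus 1, even items contribute 5 minus answer, and the sum is scaled by 2.5) follows the published scale.

```python
# Minimal sketch of two common usability evaluation metrics:
# binary task success rate and the System Usability Scale (SUS).

from typing import Sequence


def task_success_rate(outcomes: Sequence[bool]) -> float:
    """Fraction of attempted tasks completed successfully (0.0 to 1.0)."""
    return sum(outcomes) / len(outcomes)


def sus_score(responses: Sequence[int]) -> float:
    """Score one participant's SUS questionnaire.

    `responses` holds the ten answers in order, each on a 1-5 scale.
    Odd-numbered items (positively worded) contribute (answer - 1);
    even-numbered items (negatively worded) contribute (5 - answer).
    The sum (0-40) is multiplied by 2.5 to give a score out of 100.
    """
    if len(responses) != 10:
        raise ValueError("SUS uses exactly 10 items")
    total = 0
    for item, answer in enumerate(responses, start=1):
        total += (answer - 1) if item % 2 == 1 else (5 - answer)
    return total * 2.5


# Example: one participant's (hypothetical) answers and task outcomes.
print(sus_score([4, 2, 5, 1, 4, 2, 5, 1, 4, 2]))      # 85.0
print(task_success_rate([True, True, False, True]))    # 0.75
```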

Applications span consumer electronics, enterprise software, healthcare, education, and automotive interfaces. Emerging areas include voice and conversational interfaces, touch and gesture, virtual and augmented reality, and assistive technologies. Accessibility standards, such as WCAG, influence design and compliance, while ethical considerations increasingly shape research and practice.

Current trends emphasize multimodal interaction, AI-assisted design, privacy, and inclusive, equitable technology.
