Home

prompting

Prompting is the practice of crafting inputs, or prompts, to guide the behavior of a language model or other AI system. A prompt encodes the task, context, and desired format, and the model's output is conditioned on that input. Prompt design aims to elicit useful, accurate, and aligned behavior without explicit retraining, by leveraging the model's learned patterns and knowledge.

Common prompting paradigms include zero-shot prompting, where the model is asked to perform a task without

Techniques such as chain-of-thought prompting encourage the model to reveal reasoning steps, while structured prompts specify

Prompting intersects with broader issues of reliability and safety. Small changes in wording can lead to substantially

In practice, prompting underpins many applications, including chat assistants, writing aids, code generators, data extraction, and

examples;
few-shot
prompting,
which
includes
a
small
set
of
demonstrations;
and
instruction
prompting,
where
a
concise
directive
states
the
goal.
In-context
learning
places
examples
directly
in
the
prompt
to
steer
the
response.
System
or
role
prompts
set
global
behavior,
while
user
prompts
request
specific
outputs.
required
formats
such
as
lists,
code
blocks,
or
data
tables.
Prompt
templates
and
prompt
libraries
enable
reuse
and
standardization,
and
prompt
chaining
combines
multiple
steps
or
subtasks
into
a
sequence.
different
results,
affecting
accuracy
and
bias.
Risks
include
hallucinations,
unsafe
content,
and
prompt
injection,
in
which
adversaries
craft
prompts
or
inputs
to
bypass
safeguards
or
extract
sensitive
information.
tutoring.
It
is
often
used
alongside
fine-tuning,
retrieval
augmented
generation,
or
model
instruction
tuning
to
improve
robustness
and
controllability.