Neuron-centric

Neuron-centric is a term used in neuroscience and artificial intelligence to describe approaches that treat neurons as the primary units of information processing. It contrasts with views that prioritize networks, architectures, or algorithms, and emphasizes biological plausibility and local computation within individual neurons and their connections.

In neuroscience, neuron-centric models view neurons as nonlinear integrators of synaptic inputs that produce spikes when a threshold is exceeded. Classic models include integrate-and-fire and Hodgkin-Huxley neurons. Learning is often conceived as local synaptic plasticity, such as Hebbian rules or spike-timing-dependent plasticity (STDP), shaping connectivity based on activity.
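
The sketch below illustrates this picture with a minimal leaky integrate-and-fire neuron driven by a constant input current. The parameter values and the helper name simulate_lif are chosen purely for illustration, not taken from any particular study.

```python
# A minimal leaky integrate-and-fire (LIF) neuron, simulated with Euler steps.
# All parameter values below are illustrative defaults, not fitted to data.

def simulate_lif(input_current, dt=1.0, tau_m=20.0, v_rest=-65.0,
                 v_reset=-70.0, v_thresh=-50.0, r_m=10.0):
    """Return (voltage_trace, spike_times) for a single LIF neuron.

    Dynamics: dv/dt = (-(v - v_rest) + r_m * I(t)) / tau_m,
    with a spike and reset whenever v crosses v_thresh.
    """
    v = v_rest
    voltages, spikes = [], []
    for step, i_t in enumerate(input_current):
        v += dt * (-(v - v_rest) + r_m * i_t) / tau_m   # leaky integration of input
        if v >= v_thresh:                               # threshold crossing -> spike
            spikes.append(step * dt)
            v = v_reset                                  # reset membrane potential
        voltages.append(v)
    return voltages, spikes

# Drive the neuron with a constant suprathreshold current for 200 time steps.
trace, spike_times = simulate_lif([2.0] * 200)
print(len(spike_times), "spikes, first at t =", spike_times[:3])
```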

In AI and neuromorphic engineering, neuron-centric design motivates spiking neural networks and hardware that emulate neural elements. Such systems aim for energy-efficient, event-driven computation and temporal processing by implementing neurons and synapses in silicon. Training can combine local learning rules with global signals, or use surrogate gradients to make gradient-based optimization possible despite the non-differentiable spike function.
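
As a toy illustration of the surrogate-gradient idea, the sketch below trains a single spiking unit at one time step: the forward pass uses a hard threshold, while the backward pass substitutes a smooth fast-sigmoid derivative. The function names, loss, and constants are illustrative assumptions, not part of any specific framework.

```python
import numpy as np

# Toy surrogate-gradient update for one spiking unit at a single time step.
# The spike nonlinearity is a hard threshold in the forward pass; its true
# derivative is zero almost everywhere, so the backward pass uses a smooth
# "fast sigmoid" surrogate instead. All names and numbers are illustrative.

def spike(v, v_thresh=1.0):
    """Forward pass: emit 1.0 when the membrane potential crosses threshold."""
    return (v >= v_thresh).astype(float)

def surrogate_grad(v, v_thresh=1.0, beta=5.0):
    """Backward pass: d(spike)/dv ~= 1 / (1 + beta * |v - v_thresh|)^2."""
    return 1.0 / (1.0 + beta * np.abs(v - v_thresh)) ** 2

rng = np.random.default_rng(0)
x = rng.normal(size=8)            # presynaptic activity
w = rng.normal(size=8) * 0.1      # synaptic weights
target = 1.0                      # train the unit to spike

for step in range(50):
    v = w @ x                     # membrane potential (single step, no leak)
    s = spike(v)                  # non-differentiable forward output
    error = s - target            # gradient of the squared-error loss w.r.t. s
    grad_w = error * surrogate_grad(v) * x   # chain rule with the surrogate
    w -= 0.5 * grad_w             # gradient descent on the weights
print("final potential:", w @ x, "spike:", spike(w @ x))
```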

Critics contend that focusing solely on neurons can miss emergent properties that arise from large-scale networks and multi-scale interactions. Proponents argue that neuron-centric approaches provide a tractable, biologically grounded framework for understanding brain function and building efficient neuromorphic systems.

Related topics include neuron, spiking neural network, neuromorphic engineering, Hebbian learning, and spike-timing-dependent plasticity.
