ngdL

ngdL is an acronym used in the field of graph-based machine learning to denote a family of diffusion-driven learning approaches. The central idea of ngdL is to propagate information across the structure of a graph so that each node’s representation or prediction reflects the influence of its local neighborhood. This diffusion perspective can be implemented in various forms, from linear models that use diffusion kernels to neural architectures that integrate diffusion-like smoothing as a component of end-to-end training.
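The diffusion idea can be made concrete with a small sketch. The snippet below is illustrative only, not a reference ngdL implementation: it assumes an undirected graph given as adjacency lists and repeatedly blends each node's original feature with a degree-normalized sum over its neighbors (a symmetric-normalized propagation). All names here are invented for the example.

```python
import math

def diffuse_features(adj, features, steps=3, alpha=0.5):
    """Blend each node's original feature with a degree-normalized
    sum of its neighbors' current features, repeated for `steps` rounds.

    adj:      dict node -> list of neighbors (undirected graph)
    features: dict node -> float feature value
    alpha:    weight on the propagated (neighborhood) term
    """
    deg = {u: len(vs) for u, vs in adj.items()}
    h = dict(features)
    for _ in range(steps):
        h = {
            u: (1 - alpha) * features[u]
               # symmetric normalization: divide by sqrt(deg(u) * deg(v))
               + alpha * sum(h[v] / math.sqrt(deg[u] * deg[v]) for v in adj[u])
            for u in adj
        }
    return h

# Path graph 0 - 1 - 2 - 3 with a unit signal on node 0:
adj = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
feats = {0: 1.0, 1: 0.0, 2: 0.0, 3: 0.0}
smoothed = diffuse_features(adj, feats)
# The signal decays smoothly with distance from node 0.
```

After a few rounds, each node's value mixes its own signal with its neighborhood's, which is the smoothing behavior the diffusion perspective describes.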

In practice, ngdL methods construct a graph from data, choose a diffusion operator (such as a Laplacian-based kernel or a personalized PageRank scheme), and compute a smoothed or propagated representation over multiple steps or iterations. The resulting diffusion-enhanced features are then used in a learning objective, which may include supervised loss on labeled nodes, regularization terms that encourage consistency with the graph structure, or tasks such as node classification, link prediction, or clustering. Some ngdL variants combine diffusion with neural networks, enabling deep representations that respect the graph geometry while still benefiting from nonlinear modeling power.

Applications of ngdL span social networks, biological networks, knowledge graphs, and other domains where relational structure is important. Strengths include robustness to scarce labels and the ability to encode manifold or neighborhood information directly into learning. Limitations involve sensitivity to graph construction, scalability to large graphs, and the need to tune diffusion parameters and regularization choices. Related concepts include diffusion maps, spectral methods, and graph neural networks.

See also: graph neural networks, diffusion processes, manifold regularization, spectral clustering.
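One of the diffusion operators mentioned above, personalized PageRank, can be iterated to a fixed point and the resulting scores used directly for node classification with scarce labels. The sketch below is a hedged illustration under simple assumptions (undirected graph as adjacency lists, one labeled node per class), not a canonical ngdL algorithm; the graph and function names are invented for the example.

```python
def ppr_scores(adj, seeds, alpha=0.15, iters=100):
    """Personalized PageRank by power iteration:
    s <- alpha * seed + (1 - alpha) * (random-walk step toward s).

    adj:   dict node -> list of neighbors (undirected graph)
    seeds: dict node -> restart mass (1.0 on the labeled seed node)
    """
    deg = {u: len(vs) for u, vs in adj.items()}
    s = dict(seeds)
    for _ in range(iters):
        s = {
            u: alpha * seeds[u]
               # mass flowing into u from each neighbor's random walk
               + (1 - alpha) * sum(s[v] / deg[v] for v in adj[u])
            for u in adj
        }
    return s

# Two communities bridged by the edge 2 - 3; node 0 is labeled "A",
# node 5 is labeled "B", the rest are unlabeled.
adj = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3],
       3: [2, 4, 5], 4: [3, 5], 5: [3, 4]}
score_a = ppr_scores(adj, {u: 1.0 if u == 0 else 0.0 for u in adj})
score_b = ppr_scores(adj, {u: 1.0 if u == 5 else 0.0 for u in adj})
# Classify each node by its larger diffusion score.
labels = {u: "A" if score_a[u] > score_b[u] else "B" for u in adj}
```

Each class's seed mass diffuses outward and decays with graph distance, so unlabeled nodes adopt the label whose seed is "closer" under the diffusion, which is the scarce-label behavior described above.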