Plattformansvaret

Plattformansvaret (roughly, "platform responsibility") is a legal concept describing the responsibilities and, in some cases, liabilities of online platform operators for content and activity on their services. It encompasses how platforms moderate user-generated content, respond to illegal or harmful material, and protect users from abuse, misinformation, and other risks. Although the term itself is neutral, it reflects the recognition that platforms can influence public discourse and safety through their policies and tools.

In the European Union, the liability of intermediaries has historically rested on the E-Commerce Directive, which generally shields hosting providers from liability for user content as long as they do not have actual knowledge of illegality; upon receiving a notification, they must act expeditiously to remove or disable access to the content. The Digital Services Act (DSA) expands these duties, especially for very large online platforms, by imposing risk assessments, content moderation transparency, redress mechanisms, age-appropriate safeguards, and data access requirements for oversight and research.

In practice, plattformansvaret is implemented via terms of service, notice-and-takedown procedures, content moderation policies, user reporting mechanisms, and transparency reporting. Compliance costs and policy choices shape how platforms balance user expression with safety, and small platforms face different pressures compared with global, very large platforms.
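
The following is a minimal sketch of how a notice-and-takedown flow of the kind described above might fit together, assuming a hypothetical in-memory content store; the names (Notice, TakedownLog, handle_notice) are illustrative and do not correspond to any real platform's API or to any legally mandated design. It shows a notification being reviewed, access to the content being disabled, and the decision being logged so that it could later feed transparency reporting.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class Notice:
    """A takedown notification submitted against a piece of hosted content."""
    content_id: str
    reporter: str
    reason: str
    received_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

@dataclass
class TakedownLog:
    """Audit trail of moderation decisions; could later feed transparency reports."""
    entries: list = field(default_factory=list)

    def record(self, notice: Notice, action: str) -> None:
        self.entries.append({
            "content_id": notice.content_id,
            "reason": notice.reason,
            "action": action,
            "decided_at": datetime.now(timezone.utc).isoformat(),
        })

def handle_notice(notice: Notice, content_store: dict, log: TakedownLog) -> str:
    """On notification, review the report and, if it is actionable,
    disable access to the content 'expeditiously'; log every decision."""
    if notice.content_id not in content_store:
        log.record(notice, "not_found")
        return "not_found"
    # Placeholder review step: a real platform would apply its content
    # policies here, possibly with human review.
    actionable = bool(notice.reason.strip())
    if actionable:
        content_store[notice.content_id]["visible"] = False  # disable access
        log.record(notice, "disabled")
        return "disabled"
    log.record(notice, "rejected")
    return "rejected"

# Example: a report against a hosted item leads to it being hidden.
store = {"video123": {"visible": True}}
log = TakedownLog()
print(handle_notice(Notice("video123", "user42", "alleged copyright infringement"), store, log))
# -> disabled; store["video123"]["visible"] is now False, and log.entries holds the decision.
```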

Critics argue that the evolving framework can both curb illegal activity and risk over-censorship, while supporters emphasize improved accountability and user protection. The concept applies to sectors such as social media, video sharing, marketplaces, and app stores.
