fit_intercept

fit_intercept is a boolean parameter used by many linear models to control whether the model should learn an intercept (also called a bias term) as part of the fitting process. When the flag is enabled, the model estimates a constant term that shifts the decision boundary or linear prediction.

If fit_intercept is True, the predictive equation takes the form y = X w + b, where w is the vector of feature coefficients and b is the intercept. If fit_intercept is False, the model assumes the intercept is zero and fits y ≈ X w without an additive bias term. In that case, the interpretation of the coefficients changes: predictions are anchored to the origin in feature space.
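
To make this concrete, here is a minimal sketch (the synthetic data and the choice of LinearRegression are purely illustrative) comparing the two settings:

    import numpy as np
    from sklearn.linear_model import LinearRegression

    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 2))
    y = X @ np.array([1.5, -2.0]) + 3.0   # true relationship has a +3.0 offset

    with_b = LinearRegression(fit_intercept=True).fit(X, y)
    no_b = LinearRegression(fit_intercept=False).fit(X, y)

    # With an intercept, predictions follow y = X w + b and b is recovered (~3.0).
    print(with_b.coef_, with_b.intercept_)
    # Without one, the fit is forced through the origin: intercept_ stays at 0.0
    # and the coefficients must absorb the offset as best they can.
    print(no_b.coef_, no_b.intercept_)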

In practice, many linear models in libraries such as scikit-learn (for example LinearRegression, Ridge, Lasso, ElasticNet, and LogisticRegression) expose fit_intercept as a parameter. After fitting, the learned intercept is typically stored in an attribute named intercept_. When fit_intercept is False, intercept_ is usually zero.
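
For example, the attribute can be inspected directly after fitting; the estimators and made-up data below are illustrative choices, not the only ones that expose the parameter:

    import numpy as np
    from sklearn.linear_model import Ridge, LogisticRegression

    rng = np.random.default_rng(1)
    X = rng.normal(size=(200, 3))
    y_reg = X @ np.array([0.5, 1.0, -1.0]) + 2.0     # regression target with offset
    y_clf = (X[:, 0] + 0.5 > 0).astype(int)          # binary classification target

    ridge = Ridge(fit_intercept=True).fit(X, y_reg)
    print(ridge.intercept_)      # a scalar close to the true offset of 2.0

    logreg = LogisticRegression(fit_intercept=False).fit(X, y_clf)
    print(logreg.intercept_)     # [0.] -- no bias term was learned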

The setting also interacts with regularization and data preprocessing. Regularization typically applies to the coefficients, while the intercept is often not regularized. When fit_intercept is True, the model can accommodate a nonzero baseline, which can be important when the target variable has a nonzero mean or when features are not centered. Conversely, setting fit_intercept to False is appropriate when the data are centered or when domain knowledge indicates the relationship should pass through the origin.
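
The centering point can be illustrated with ordinary least squares, where the equivalence is exact (regularized models may behave somewhat differently); the data here are made up:

    import numpy as np
    from sklearn.linear_model import LinearRegression

    rng = np.random.default_rng(2)
    X = rng.normal(loc=5.0, size=(150, 2))        # uncentered features
    y = X @ np.array([2.0, -1.0]) + 10.0          # nonzero baseline

    raw = LinearRegression(fit_intercept=True).fit(X, y)

    # Center X and y manually, then fit through the origin.
    Xc = X - X.mean(axis=0)
    yc = y - y.mean()
    centered = LinearRegression(fit_intercept=False).fit(Xc, yc)

    print(np.allclose(raw.coef_, centered.coef_))   # True: identical coefficients
    print(raw.intercept_)                           # the baseline (about 10.0 here)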

Practical guidance: choose True if you expect a meaningful baseline offset; choose False if you know the relationship should have no intercept or if you have preprocessed data that makes an intercept unnecessary. Proper scaling and, if needed, centering should be considered to ensure meaningful and stable estimates.
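
One common pattern consistent with this guidance is to keep fit_intercept=True and let a preprocessing step handle scaling; the pipeline below is a sketch, and the scaler and estimator choices are assumptions rather than requirements:

    import numpy as np
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.linear_model import Ridge

    rng = np.random.default_rng(3)
    X = rng.normal(loc=100.0, scale=20.0, size=(300, 4))   # large, uncentered features
    y = X @ np.array([0.3, -0.2, 0.1, 0.0]) + 50.0

    # Scaling centers the features; the intercept then captures the baseline,
    # and the regularized coefficients stay on a comparable scale.
    model = make_pipeline(StandardScaler(), Ridge(fit_intercept=True))
    model.fit(X, y)
    print(model.named_steps["ridge"].intercept_)   # close to the mean of y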
