I have a use case where I want to train a linear regression model (could be any of ridge, lasso, elastic net) but instead of regularizing the weights such that the coefficients tend to zero, I would like to penalize coefficients for deviating from prior weights that I've passed when initializing the model.
From theoretical knowledge and similar datasets, I already know roughly what the weights should be (e.g., which variables should be positively and which negatively correlated), so I would like the model to use these weights unless the given dataset strongly suggests that a coefficient should be adapted.
Basically, instead of using
$$\min_w \|w\|^2 = \|w - 0\|^2$$
for the regularization term, I would like to use
$$\min_w \|w - w_{\text{prior}}\|^2$$
to encourage the weights to stay close to the $w_{\text{prior}}$ that I have given.
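For reference, one way to get this behavior with scikit-learn as-is is the usual change of variables $v = w - w_{\text{prior}}$: the objective $\|y - Xw\|^2 + \alpha\|w - w_{\text{prior}}\|^2$ becomes an ordinary ridge problem on the shifted target $y - Xw_{\text{prior}}$. Here is a minimal sketch, assuming the squared-L2 penalty above and no intercept; `ridge_with_prior` is just an illustrative name, not an existing API:

```python
import numpy as np
from sklearn.linear_model import Ridge

def ridge_with_prior(X, y, w_prior, alpha=1.0):
    """Solve min_w ||y - X w||^2 + alpha * ||w - w_prior||^2.

    Substituting v = w - w_prior turns this into a standard
    ridge problem on the shifted target y - X w_prior.
    """
    y_shifted = y - X @ w_prior           # absorb the prior's predictions into the target
    ridge = Ridge(alpha=alpha, fit_intercept=False)
    ridge.fit(X, y_shifted)               # solves min_v ||y_shifted - X v||^2 + alpha ||v||^2
    return ridge.coef_ + w_prior          # map back: w = v + w_prior

# Hypothetical usage with synthetic data:
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
y = X @ np.array([1.5, -2.0, 0.5]) + 0.1 * rng.normal(size=200)
w_prior = np.array([1.0, -1.0, 0.0])
print(ridge_with_prior(X, y, w_prior, alpha=10.0))
```

The same substitution should also work for Lasso and ElasticNet, since $\|w - w_{\text{prior}}\|_1 = \|v\|_1$; if an intercept is needed, the data would have to be centered first or the intercept column excluded from the shift.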