#overfitting — Public Fediverse posts
Live and recent posts from across the Fediverse tagged #overfitting, aggregated by home.social.
-
I was working this week on a UI to help users understand when they’re #overfitting their data. Coming up with ideas for the #UI/#UX was proving to be difficult.
My experience 2 months ago with #Claude was terrible. But this time the output from #ClaudeDesign was pretty good: training vs prediction, noise and model complexity controls, and comparing fits across degrees.
Needs improvement, but it's a good starting point. AI in design is going to be a thing.
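The "comparing fits across degrees" idea the post describes can be sketched in a few lines: fit polynomials of increasing degree to noisy data and compare training error against held-out error. This is a minimal illustration with made-up demo data (the sine function, noise level, and split are all assumptions, not the post's actual UI):

```python
import numpy as np

rng = np.random.default_rng(0)

# Noisy samples from a smooth underlying function (hypothetical demo data).
x = np.linspace(0.0, 1.0, 40)
y = np.sin(2 * np.pi * x) + rng.normal(scale=0.3, size=x.size)

# Hold out every other point as the "prediction" set.
x_train, y_train = x[::2], y[::2]
x_test, y_test = x[1::2], y[1::2]

def mse(degree):
    """Fit a polynomial of the given degree; return (train MSE, test MSE)."""
    coeffs = np.polyfit(x_train, y_train, degree)
    train_err = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
    test_err = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
    return train_err, test_err

for d in (1, 3, 9):
    tr, te = mse(d)
    print(f"degree {d}: train MSE {tr:.3f}, test MSE {te:.3f}")
```

Training error can only shrink as the degree grows, while held-out error eventually stops improving; the gap between the two curves is exactly the overfitting signal such a UI would visualise.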
-
Furthermore, we were discussing overfitting as another major problem with machine learning. Simply memorising the data doesn't help when you have to make predictions on unknown data. When overfitting, the model loses the ability to generalise...
#AI #lecture #machinelearning #KDAI2026 #overfitting #datascience #data @fiz_karlsruhe @fizise #knowledge
-
When Dimensionality Hurts: The Role of #LLM Embedding Compression for Noisy Regression Tasks https://d.repec.org/n?u=RePEc:arx:papers:2502.02199&r=&r=cmp
"… suggest that the optimal dimensionality is dependent on the signal-to-noise ratio, exposing the necessity of feature compression in high noise environments. The implication of the result is that researchers should consider the #noise of a task when making decisions about the dimensionality of text.… findings indicate that sentiment and emotion-based representations do not provide inherent advantages over learned latent features, implying that their previous success in similar tasks may be attributed to #regularisation effects rather than intrinsic informativeness."
#ML #autoencoders #Overfitting
-
One question for the #MachineLearning people: what approach do you use to decide whether a decision tree or a random forest should work better? Do you simply try both and use whichever seems to perform better?
From what I've read, decision trees are more prone to overfitting, while random forests are a more complex approach. Which means little to me 😅
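One common answer to the question above is indeed to try both, but with cross-validation rather than a single train/test split, so the comparison isn't itself an overfit to one split. A minimal sketch with scikit-learn, using one of its built-in toy datasets as a stand-in for your own `X, y`:

```python
# Compare a single decision tree against a random forest by
# 5-fold cross-validated accuracy on the same data.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)

tree = DecisionTreeClassifier(random_state=0)
forest = RandomForestClassifier(n_estimators=100, random_state=0)

tree_acc = cross_val_score(tree, X, y, cv=5).mean()
forest_acc = cross_val_score(forest, X, y, cv=5).mean()

print(f"decision tree : {tree_acc:.3f}")
print(f"random forest : {forest_acc:.3f}")
```

The intuition behind the "trees overfit more" claim: a single deep tree has high variance, while a forest averages many decorrelated trees, which reduces variance without raising bias much, so the forest usually (not always) cross-validates better.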
-
'The Implicit Bias of Benign Overfitting', by Ohad Shamir.
http://jmlr.org/papers/v24/22-0784.html
#overfitting #predictors #predictor
-
from the standpoint of model selection, parsimony often boils down to dimensionality reduction
#modelSelection #parsimony #OccamsRazor #dimensionalityReduction #degreesOfFreedom #complexity #informationTheory #biasVarianceTradeoff #overfitting #underfitting #optimization #parameterTuning #crossValidation #inverseProblems #inference #statisticalLearning #machineLearning #ML #dataScience #modeling #decisionTheory #fitting #regression #classification #residualError #costFunction #performanceLoss
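The "parsimony as dimensionality reduction" point can be made concrete with PCA: when data that nominally lives in many dimensions actually lies near a low-dimensional subspace, a few components capture nearly all the variance and the rest can be dropped. A small numpy sketch with synthetic data (the dimensions, noise level, and seed are assumptions for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)

# 200 points in 5 dimensions that really live near a 2-D subspace
# (hypothetical data: 2 latent factors mixed into 5 observed features).
latent = rng.normal(size=(200, 2))
mixing = rng.normal(size=(2, 5))
X = latent @ mixing + rng.normal(scale=0.05, size=(200, 5))

# PCA via SVD of the centred data matrix.
Xc = X - X.mean(axis=0)
_, s, _ = np.linalg.svd(Xc, full_matrices=False)
explained = s**2 / np.sum(s**2)

print("variance explained per component:", np.round(explained, 3))
```

Here the first two components carry almost all the variance, so the parsimonious model keeps two effective degrees of freedom instead of five; that is the Occam's-razor trade-off in miniature.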
-
Understand basic principles of underfitting and overfitting - by @dimid_ml
https://towardsdatascience.com/overfitting-and-underfitting-principles-ea8964d9c45c
#overfitting #underfitting #DataScience #MachineLearning #AI