home.social

#overfitting — Public Fediverse posts

Live and recent posts from across the Fediverse tagged #overfitting, aggregated by home.social.

  1. I was working this week on a UI to help users understand when they’re #overfitting their data. Coming up with ideas for the #UI/#UX was proving difficult.

    My experience 2 months ago with #Claude was terrible. But this time the output from #ClaudeDesign was pretty good: training vs prediction, noise and model complexity controls, and comparing fits across degrees.

    Needs improvement, but it's a good starting point. AI in design is going to be a thing.

    #ux #ui #datascience #prototyping

  2. Furthermore, we were discussing overfitting as another major problem with machine learning. Simply memorising the data doesn't help when you have to make predictions on unknown data. When overfitting, the model loses the ability to generalise...

    #AI #lecture #MachineLearning #KDAI2026 #overfitting #datascience #data @fiz_karlsruhe @fizise #knowledge
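The memorisation point above can be sketched in a few lines of plain Python. Everything here is invented for illustration (the threshold task, the 20% noise rate, and both "models"); it just shows that a model which memorises its training data scores perfectly there and collapses on unseen data, while a simple rule generalises:

```python
import random

random.seed(1)

def true_label(x):
    """The real pattern a model should generalise to: 1 iff x > 0.5."""
    return 1 if x > 0.5 else 0

def noisy(y):
    """Flip 20% of labels to simulate noise in the training data."""
    return 1 - y if random.random() < 0.2 else y

train = [(x, noisy(true_label(x))) for x in (random.random() for _ in range(500))]
test = [(x, true_label(x)) for x in (random.random() for _ in range(500))]

# "Overfit" model: a lookup table that memorises every training point,
# noise included, and has nothing to say about unseen inputs.
memo = dict(train)
def memoriser(x):
    return memo.get(x, 0)  # unseen input: fall back to a blind guess

# Simple model: a single fixed threshold.
def threshold_model(x):
    return 1 if x > 0.5 else 0

def accuracy(model, data):
    return sum(model(x) == y for x, y in data) / len(data)

print(accuracy(memoriser, train))       # 1.0: perfect recall of memorised data
print(accuracy(memoriser, test))        # ~0.5: no better than guessing
print(accuracy(threshold_model, test))  # 1.0: the simple rule generalises
```

The lookup table even memorises the flipped labels, which is exactly why its training score is misleading.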

  3. When Dimensionality Hurts: The Role of #LLM Embedding Compression for Noisy Regression Tasks d.repec.org/n?u=RePEc:arx:pape
    "… suggest that the optimal dimensionality is dependent on the signal-to-noise ratio, exposing the necessity of feature compression in high noise environments. The implication of the result is that researchers should consider the #noise of a task when making decisions about the dimensionality of text."

    "… findings indicate that sentiment and emotion-based representations do not provide inherent advantages over learned latent features, implying that their previous success in similar tasks may be attributed to #regularisation effects rather than intrinsic informativeness."
    #ML #autoencoders #Overfitting

  4. One question for the #MachineLearning people: what approach do you use to determine whether a decision tree or a random forest approach should work better? Do you simply try both approaches and use whatever seems to work better?

    According to what I read, decision trees are more prone to overfitting, while a random forest is a more complex approach, which means little to me 😅

    #ml #DecisionTrees #RandomForest #Overfitting
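A common answer to the question above is indeed "try both", but compared on held-out folds rather than training accuracy, since the overfit model wins on training data by construction. Below is a toy sketch in plain Python, not real library code: the 1-D task and noise rate are invented, a fully grown tree is stood in for by nearest-neighbour lookup (in 1-D, splitting until every leaf is pure behaves that way), and the "forest" bags small-leaf versions of it with majority voting:

```python
import random
from statistics import mean

random.seed(7)

def noisy_label(x):
    """True rule: 1 iff x > 0.5, with 20% of labels flipped as noise."""
    y = 1 if x > 0.5 else 0
    return 1 - y if random.random() < 0.2 else y

xs = [random.random() for _ in range(200)]
data = [(x, noisy_label(x)) for x in xs]

def fit_tree(train):
    """Stand-in for a fully grown decision tree on 1-D data: predicting
    the label of the nearest training point, noise and all."""
    def predict(x):
        return min(train, key=lambda p: abs(p[0] - x))[1]
    return predict

def fit_forest(train, n_trees=15, leaf=3):
    """Stand-in for a random forest: each 'tree' sees a bootstrap resample
    and stops at small leaves (majority label of the `leaf` nearest points);
    the ensemble answers by majority vote, averaging noise away."""
    def one_tree(sample):
        def predict(x):
            near = sorted(sample, key=lambda p: abs(p[0] - x))[:leaf]
            return 1 if sum(y for _, y in near) * 2 > leaf else 0
        return predict
    trees = [one_tree(random.choices(train, k=len(train))) for _ in range(n_trees)]
    def predict(x):
        return 1 if sum(t(x) for t in trees) * 2 > n_trees else 0
    return predict

def cv_accuracy(fit, data, k=5):
    """'Try both and compare', done with k-fold cross-validation."""
    folds = [data[i::k] for i in range(k)]
    scores = []
    for i in range(k):
        held_out = folds[i]
        train = [p for j, fold in enumerate(folds) if j != i for p in fold]
        model = fit(train)
        scores.append(mean(model(x) == y for x, y in held_out))
    return mean(scores)

print(f"single deep tree, CV accuracy: {cv_accuracy(fit_tree, data):.2f}")
print(f"bagged ensemble,  CV accuracy: {cv_accuracy(fit_forest, data):.2f}")
```

In practice you would reach for a library instead of hand-rolling this, e.g. scikit-learn's `cross_val_score` with a `DecisionTreeClassifier` and a `RandomForestClassifier`, which is exactly the "try both and compare" procedure with the comparison done properly.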