home.social

#distilbert — Public Fediverse posts

Live and recent posts from across the Fediverse tagged #distilbert, aggregated by home.social.

  1. Can we teach a #transformers #AI model such as #DistilBERT #math? After some initial failures, #Gemini 2.0 and I finally succeeded in teaching DistilBERT the concept of a mathematical operator. Here is a small sample of the training data and the training results. We used only about 400 training inputs. The data was generated by Gemini after I gave her some examples. Our next module is to teach DistilBERT the = sign and counting from 1 to 10.
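
    A training set like the one described could also be generated programmatically instead of by Gemini. A minimal sketch, assuming the task is framed as classifying which operator an expression uses (the function name, labels, and 0–9 operand range are illustrative, not from the post):

    ```python
    import random

    # Hypothetical label set: which operator concept an expression shows.
    OPERATORS = {"+": "addition", "-": "subtraction", "*": "multiplication"}

    def make_examples(n=400, seed=0):
        """Generate n (text, label) pairs such as ('3 + 7', 'addition')."""
        rng = random.Random(seed)
        examples = []
        for _ in range(n):
            op = rng.choice(list(OPERATORS))
            a, b = rng.randint(0, 9), rng.randint(0, 9)
            examples.append((f"{a} {op} {b}", OPERATORS[op]))
        return examples

    data = make_examples()
    print(len(data), data[0])
    ```

    Pairs like these can then be fed to a standard sequence-classification fine-tune of DistilBERT; the fixed seed keeps the data set reproducible.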

  2. The #BERT family of bidirectionally trained transformer models are minimalistic #language #AI models that humans can train for tasks unrelated to language. You can take advantage of their language ability to teach them math and other knowledge from the ground up, even on your personal computer. Guided learning is a more efficient, decentralized approach to training language AI for language-unrelated tasks. #Gemini and I are using this guided approach to teach #DistilBERT math.
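
    The idea that a small labeled set can teach an operator concept can be illustrated without any transformer at all. A minimal sketch, assuming a bag-of-tokens multiclass perceptron stands in for the fine-tuned classifier head (all names and the 400-example setup are illustrative, not from the post):

    ```python
    import random

    LABELS = ["addition", "subtraction", "multiplication"]
    OPS = dict(zip("+-*", LABELS))

    def featurize(text):
        return text.split()  # each whitespace token is a binary feature

    def predict(weights, text):
        scores = {lab: sum(w.get(t, 0) for t in featurize(text))
                  for lab, w in weights.items()}
        return max(scores, key=scores.get)

    def train(examples, epochs=5):
        # Multiclass perceptron: reward the gold label's tokens,
        # penalize the mispredicted label's tokens.
        weights = {lab: {} for lab in LABELS}
        for _ in range(epochs):
            for text, gold in examples:
                pred = predict(weights, text)
                if pred != gold:
                    for tok in featurize(text):
                        weights[gold][tok] = weights[gold].get(tok, 0) + 1
                        weights[pred][tok] = weights[pred].get(tok, 0) - 1
        return weights

    rng = random.Random(1)
    train_set = []
    for _ in range(400):
        op, lab = rng.choice(list(OPS.items()))
        train_set.append((f"{rng.randint(0, 9)} {op} {rng.randint(0, 9)}", lab))

    model = train(train_set)
    print(predict(model, "4 * 5"))
    ```

    The operator token alone separates the classes, so even this tiny model learns the concept from a few hundred examples; DistilBERT's pretrained language ability lets it start from rich token representations instead of from zero.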

  3. Run 🤗 Transformers in your browser! - github.com/xenova/transformers

    We currently support #BERT, #ALBERT, #DistilBERT, #T5, #T5v1.1, #FLANT5, #GPT2, #BART, #CodeGen, #Whisper, #CLIP, #VisionTransformer, and VisionEncoderDecoder models, for a variety of tasks....

    #webml