home.social

#wordembeddings — Public Fediverse posts

Live and recent posts from across the Fediverse tagged #wordembeddings, aggregated by home.social.

  1. 🧠📊 How can we measure imageability in literary texts?
    The authors examine how words evoke sensory experience and test whether multimodal #WordEmbeddings capture #imageability, #visuality, and #concreteness better than text-only models, from words to sentences to poems.
    #CCLS2025 #JCLS #CLS
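
A common way to probe an embedding space for a property like concreteness is to score each word by its similarity to hand-picked seed words. The sketch below illustrates the idea with tiny hand-crafted 3-d vectors; the vectors, words, and the `concreteness_score` helper are all illustrative assumptions, not the authors' actual method or data.

```python
import math

# Toy 3-d vectors standing in for real (possibly multimodal) word embeddings.
# Values are hand-crafted for illustration; real embeddings are learned.
vectors = {
    "stone":   [0.9, 0.1, 0.2],
    "dog":     [0.8, 0.3, 0.1],
    "idea":    [0.1, 0.9, 0.3],
    "hope":    [0.2, 0.8, 0.4],
    "apple":   [0.85, 0.2, 0.15],
    "justice": [0.15, 0.85, 0.35],
}

def cosine(a, b):
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def concreteness_score(word, concrete_seeds, abstract_seeds):
    """Mean similarity to concrete seeds minus mean similarity to abstract seeds."""
    v = vectors[word]
    sim_c = sum(cosine(v, vectors[s]) for s in concrete_seeds) / len(concrete_seeds)
    sim_a = sum(cosine(v, vectors[s]) for s in abstract_seeds) / len(abstract_seeds)
    return sim_c - sim_a

# A concrete word scores positive, an abstract word negative.
print(concreteness_score("apple", ["stone", "dog"], ["idea", "hope"]))
print(concreteness_score("justice", ["stone", "dog"], ["idea", "hope"]))
```

With real embeddings the same scoring scheme works unchanged; only the `vectors` lookup would be replaced by a trained model.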

  6. It's already the last talk of #CCLS2025 😱

    Yuri Bizzoni, Pascale Feldkamp, Kristoffer L. Nielbo: Encoding Imagism? Measuring Literary Imageability, Visuality and Concreteness via Multimodal Word Embeddings (doi.org/10.26083/tuprints-0003)
    #Measuring #LiteraryImageability #WordEmbeddings

  7. Published at #IRRJ: "Graph Embeddings to Empower Entity Retrieval" by Emma J. Gerritse, Faegheh Hasibi, and Arjen P. de Vries. #EntityRetrieval, #KnowledgeGraphEmbeddings, #WordEmbeddings

    doi.org/10.54195/irrj.19877

  8. Next stop in our NLP timeline is 2013 and the introduction of low-dimensional dense word vectors - so-called "word embeddings" - based on distributional semantics, such as word2vec by Mikolov et al. at Google, which enabled representation learning on text.

    T. Mikolov et al. (2013). Efficient Estimation of Word Representations in Vector Space.
    arxiv.org/abs/1301.3781

    #NLP #AI #wordembeddings #word2vec #ise2025 #historyofscience @fiz_karlsruhe @fizise @tabea @sourisnumerique @enorouzi
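
The hallmark of these dense vectors is that semantic relations show up as vector arithmetic (famously, king - man + woman lands near queen). The sketch below demonstrates the idea with tiny hand-crafted 3-d vectors; the dimensions and values are invented for illustration, whereas real word2vec vectors are learned from co-occurrence statistics and typically have 100-300 dimensions.

```python
import math

# Toy 3-d "embeddings" (made-up dimensions: royalty, maleness, femaleness).
emb = {
    "king":  [0.9, 0.8, 0.1],
    "queen": [0.9, 0.1, 0.8],
    "man":   [0.1, 0.8, 0.1],
    "woman": [0.1, 0.1, 0.8],
    "crown": [0.9, 0.4, 0.4],  # distractor candidate
}

def cosine(a, b):
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def analogy(a, b, c):
    """Return the word whose vector is closest to vec(a) - vec(b) + vec(c)."""
    target = [x - y + z for x, y, z in zip(emb[a], emb[b], emb[c])]
    candidates = (w for w in emb if w not in (a, b, c))
    return max(candidates, key=lambda w: cosine(emb[w], target))

print(analogy("king", "man", "woman"))  # -> queen
```

The same nearest-neighbour-to-offset query is how analogy evaluation is run against real trained embeddings.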

  9. Hi there! 😊 I'm Aleena, a junior computer science researcher at SINTEF AS since May 2021. In #Fakespeak,
    I primarily focus on collecting #Norwegian datasets and assisting with text analysis. My interests revolve
    around #NLP applications and contextualized #wordembeddings. Outside of work, I like to unwind with
    activities such as table tennis, cooking, enjoying music, and playing games.

  10. Something I have used a lot this year that is excellent: github.com/RichardScottOZ/geos - a fork of the original with a few updates. It is really well done and has pretty good models available too, with a Canada focus. I have put roughly a million documents through it.