#slm — Public Fediverse posts
Live and recent posts from across the Fediverse tagged #slm, aggregated by home.social.
-
What is #BLAS?
BLAS is a set of fast matrix routines originally written in #Fortran.
If you’re tired of dynamic types, hidden references, ownership rules, and endless “stream” abstractions, Free #Pascal + BLAS gives you old-school, deterministic HPC #programming with none of the modern noise. #Copilot and I will be using Free Pascal and BLAS for our Small Language Model project #SLM. No more #C, #python, #Rust, or C#.
-
Why do people use #python, a glue language, which is so slow? The only reason is the AI ecosystem.
#Copilot and I just tested Free Pascal and BLAS for speed, without using #numpy or #pytorch. The result is amazing: a 1024x1024 #matrix multiplication took less than a second.
We will be using Free #Pascal and #BLAS to write our Small Language Model #SLM using #NNUE.
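For context on what that benchmark measures: BLAS's `dgemm` computes the standard O(n³) matrix product in heavily tuned Fortran/assembly. Below is a minimal pure-Python sketch of the same operation, purely illustrative; the post's actual setup is Free Pascal calling BLAS, and the size here is kept small because an interpreted triple loop is orders of magnitude slower than BLAS.

```python
import random
import time

def gemm(A, B):
    """Naive triple-loop matrix multiply: the core of what BLAS's dgemm
    computes (minus the alpha/beta scaling and all the optimization)."""
    n, k, m = len(A), len(B), len(B[0])
    C = [[0.0] * m for _ in range(n)]
    for i in range(n):
        for p in range(k):
            a = A[i][p]
            row_B = B[p]
            row_C = C[i]
            for j in range(m):
                row_C[j] += a * row_B[j]
    return C

# Time a small case. An optimized BLAS does 1024x1024 in well under a
# second; this interpreted loop would take far longer at that size.
n = 128
A = [[random.random() for _ in range(n)] for _ in range(n)]
B = [[random.random() for _ in range(n)] for _ in range(n)]
t0 = time.perf_counter()
C = gemm(A, B)
print(f"{n}x{n} naive multiply took {time.perf_counter() - t0:.3f}s")
```

The point of the comparison: the arithmetic is identical, so the speed difference comes entirely from cache blocking, vectorization, and compiled code in the BLAS implementation.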
-
I wrote an analysis for @outraspalavras.net of a document from the U.S.-China Economic and Security Review Commission on how China chose to contest the industrial race for AI, and what the Global South can learn from it.
https://outraspalavras.net/tecnologiaemdisputa/ia-como-a-china-esta-vencendo/
-
llmfit — one command to find which LLMs actually fit your hardware
https://github.com/AlexsJones/llmfit
Detects your RAM/GPU, scores hundreds of models, works with Ollama, LM Studio, llama.cpp...
Written in Rust of course 🙌
Handy when you're trying to optimize for SLMs like I do with @silex MCP => https://www.silex.me/ai/
Have you tried local models (Ollama, LM Studio, llama.cpp...)?
How did they perform? Let me know 👇
-
In my journey to make @silex AI-native and optimized for free #OpenSource local models
I just released a grapesjs-ai-capabilities plugin, inspired by #WordPress new capability API: https://github.com/silexlabs/grapesjs-ai-capabilities
Now I'll refactor all the #GrapesJS plugins I maintain to expose capabilities as #MCP tools, with minimal prompts and very specific errors for #SLM
Then I’ll benchmark different models and maybe experiment with fine-tuning
#GrapesJS #AI #LocalAI #NoCode #BuildInPublic #FOSS #MCP #LLM
-
Geometry > Scale: how 40M parameters on the E8 lattice beat classical transformers
Folks, it seems we've hit a wall. While the giants pile on parameters and burn terawatts trying to squeeze a drop of intelligence out of statistics, I decided to rethink the very foundation. The problem isn't the data; the problem is the "viscosity" of standard Attention.
-
[Translation] How to make (very) small LLMs genuinely useful
The AI for Devs team has prepared a translation of an article on how to get the most out of small language models. The author shows that even very compact LLMs can be useful in real-world tasks, provided you handle context, embeddings, and RAG correctly.
-
Aviation weather for Salamanca airport (Spain) is “LESA 190900Z 00000KT 3000 BR FEW028 01/01 Q1018” : See what it means on https://www.bigorre.org/aero/meteo/lesa/en #salamancaairport #airport #salamanca #spain #lesa #slm #metar #aviation #aviationweather #avgeek vl
-
Everyone is hyping up LLMs, but in 2026 SLMs are quietly winning key battles. 🤖
LLMs: great for broad, creative, multi-domain reasoning.
SLMs: faster, cheaper, privacy-friendly specialists for focused tasks and edge deployments.
New TechGlimmer article on when to pick each model and why “bigger” isn’t always better:
🔗 https://techglimmer.io/llm-vs-slm-in-2026-why-bigger-isnt-always-better/
-
The FM broadcasts of the community radio stations in Saxony (@coloRadio, @radiot, #RadioBlau) are in danger after umbrella-programme provider Apolloradio pulled out: the Sächsische Landesmedienanstalt has rejected a full-programme licence for the community stations, and the FM transmitter network operator is unwilling to keep the frequencies available 24/7 for only a few hours of window programming per day.
https://www.radioblau.de/ukw-ausstrahlung-in-gefahr/
#FreieRadios #Sachsen #SLM #UKW #RadioT #coloradio #FreiesRadio
-
[Translation] The capability plateau, or where machine learning is headed in 2026
2025 was a sobering year for AI. The era of scaling is coming to an end. Experts agree that the next qualitative leap will not come from simply adding more data and compute. New architectures, compact models, and fundamentally different training approaches are coming to the fore. In 2026 the industry will likely answer the question of what AI can deliver here and now. We are entering the era of pragmatic AI. Let's try to work out which trends will shape AI in the coming year.
https://habr.com/ru/companies/bothub/articles/983466/
#vibecoding #year2025 #year2026 #AIagents #alex_krizhevsky #ilya_sutskever #geoffrey_hinton #openai #gpt3 #slm
-
Aviation weather for Salamanca airport (Spain) is “LESA 061200Z 36009KT 330V040 9999 FEW030 03/M05 Q1020” : See what it means on https://www.bigorre.org/aero/meteo/lesa/en #salamancaairport #airport #salamanca #spain #lesa #slm #metar #aviation #aviationweather #avgeek vl
-
Started testing this - exceeded expectations for a small model https://essential.ai/research/rnj-1 #slm #llm
-
Aviation weather for Salamanca airport (Spain) is “LESA 141200Z 11003KT 060V140 6000 OVC003 05/04 Q1022” : See what it means on https://www.bigorre.org/aero/meteo/lesa/en #salamancaairport #airport #salamanca #spain #lesa #slm #metar #aviation #aviationweather #avgeek vl
-
What are small language models and how do they differ from large ones?
#Tech #AI #ArtificialIntelligence #LLM #SLM #LLMs #AIAssistants #SmallLanguageModels #TechNews #Microsoft #LargeLanguageModels #MachineLearning #The14 #The14Media
https://the-14.com/what-are-small-language-models-and-how-do-they-differ-from-large-ones/
-
🚀 Big isn’t always better! Discover how 10B-parameter models are outsmarting 100B-giants — faster, cheaper & smarter 🤖💡
Read here 👉 https://medium.com/@rogt.x1997/how-10-b-parameter-models-are-outperforming-100-b-giants-the-rise-of-small-language-models-9b81386d3998
#AIRevolution #SLM #TechTrends
-
#Claude4 #Sonnet and I have just deployed our second #AgenticAI on our #ICandy #Browser #dashboard project. We now have a total of 12 apps on this dashboard. The #AIAgent is a local AI chat app that lets me chat with any of the small #LLM models that I downloaded and run on #LMStudio. I use them for learning #programming, #Math, and #Japanese. This module can also generate 3,000 #English sentences based on a vocabulary set I upload, which will be used for training my #NNUE #SLM.
-
I created 7 #AgenticAI apps, 1 #CSharp #app, and 1 #Typescript app using #Claude Pro in 3 days. We are both exhausted. For $200 a year, I will be able to create more than 100 apps using Claude 4 #Sonnet. I will also be able to finish my #NNUE #SLM project and my #VR entertainment room project, and learn as much as I can in just one year. Who says AIs are bad? You don't need to be a programmer or a big corporation to benefit from #AIs; you only need to be #computer literate. #AI
-
I just installed #copilot for Obsidian. After getting an API key from Google, I can now use #Gemini 2.5 while writing notes for my #NNUE #SLM project in #Obsidian. In fact, I can choose other AIs to help me learn C# and other languages to code my project in Obsidian. Or I can focus on my model logic instead of spending most of my time learning programming languages. I can also use other AIs in my browser and in VS Code. All of a sudden, I am going full #AI mode. I don't even need cloud storage.
-
Why is an #NNUE broad-shallow-learning Small Language Model so much more efficient and transparent than Deep Learning? I can use just 2 layers for my #SLM AI to learn English #Semantics. With this foundation, I can add more modules to the model to improve her understanding of word meanings in different contexts and to let her learn other knowledge such as #mathematics. NNUE gives rise to a clear knowledge space and ease of computation; sparse matrix multiplication is unnecessary. #AI
-
I am now paying $16.80 a month for #Gemini Advanced for a couple of reasons: 1. I get 2 TB of cloud storage. 2. I get #DeepResearch... I hate reading because I can't follow other people's logic; to learn anything new, I first need to be curious about it. I asked Gemini 2.5 to generate a deep report on #SparseMatrix because I need to speed up the matrix multiplication for my #NNUE #SLM, which has a huge parameter matrix and a super-sparse input matrix representing word vectors. #AI
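The setup described here (a large dense parameter matrix multiplied by a very sparse binary input) is exactly the case NNUE exploits: instead of a general sparse-matrix multiply, the first layer just sums the weight columns of the active features onto the bias. A minimal sketch, in Python for illustration only (the project itself targets Free Pascal, and all names here are hypothetical):

```python
def first_layer(W, b, active_features):
    """NNUE-style first layer: the input is a sparse binary vector, so
    W @ x + b reduces to b plus the sum of W's columns at the active
    feature indices -- no sparse-matrix library needed."""
    out = list(b)
    for f in active_features:
        col = W[f]                  # weights stored feature-major: W[f][j]
        for j, w in enumerate(col):
            out[j] += w
    return out

# Tiny example: 4 features -> 3 hidden units, with features 0 and 3 active.
W = [[1.0, 0.0, 2.0],
     [0.5, 0.5, 0.5],
     [0.0, 1.0, 0.0],
     [3.0, 0.0, 1.0]]
b = [0.1, 0.1, 0.1]
hidden = first_layer(W, b, [0, 3])
print(hidden)
```

The cost scales with the number of active features rather than the full input dimension, which is why chess engines using NNUE can afford a huge first layer on a CPU.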
-
One year of Phi: Small language models making big leaps in AI.
https://azure.microsoft.com/en-us/blog/one-year-of-phi-small-language-models-making-big-leaps-in-ai/
-
Microsoft Debuts Phi-4 Reasoning Models, Aiming for Big Performance Gains
#Microsoft #AI #Phi4 #SLM #LLM #OpenSourceAI #ReasoningModels #GenAI #MachineLearning
-
SciTech Chronicles, Jan 31, 2025
#wildfire #modelling #AI #Forecasting #Firefighters #microalga #footprint #Chlorella #bioreactors #production #stem-cells #lab-grown #stable #clinical #failure #scratching #aggravates #inflammation #eczema #dermatitis #NAAREA #3D-Printed #Phoenix #Nikon #SLM
-
The latest 10-min #Video
FOCUS ON #GÜTERZÜGE (freight trains)
Since the weekend, the fourth series has been running, with a special focus on #Schüttgut (bulk goods: gravel, ballast, sand, spoil). #peertube https://video.ploud.fr/w/kKLSGECG62nNt5e5vnksNr
#youtube : https://youtu.be/GETOZVlE1n8
- SBBC DOPPELTRAKTION Re 620
- SBB Re 420
- SBB Re 620
- SBBC Re 620
- TÄGERHARD-TUNNEL
- SBBC Am 843
- HASTAG KIESWERK VERLAD
- SBBC LOKZUG Re 620
- ZUFAHRT & KIESWERK HOLCIM
- MBC DOPPELTRAKTION Ge 4/4
and much more
#TRAXX #BR189 #vectron ##re465 #dbcargo #re475 #bls #blscargo #re420 #br185 #postzug #re430 #re620 #re484 #rupperswil #würenlos #heitersberg #furttal #gotthard #simplon #olten #eisenbahn #bahn #bahnhof #train #trainspotting #lokführer #cff #ffs #sbb #sbbcargo #sbbc #doppeltraktion #bombardier #alstom #slm #db #arthgoldau #blsc #beacon #mrce #güterzug #mbc #BOXX #evu
-
WSW wants to go non-commercial: As Sachsen-Fernsehen head Frank Haring reports, he has applied to have WSW's licence converted from "commercial" to "non-commercial radio". He has also applied for funding from the NKL (non-commercial local radio) pot. "It is absurd that you can now get more money for an NKL project than you can earn with advertising for a local radio station in a market like Weißwasser," he says. https://www.flurfunk-dresden.de/2023/06/26/erdbeben-im-saechsischen-radiomarkt-wsw-will-nicht-kommerziell-werden/
#Medien #Sachsen #NKL #FreieRadios #WSW #SLM #Radio #Lausitz