#transformer — Public Fediverse posts
Live and recent posts from across the Fediverse tagged #transformer, aggregated by home.social.
-
https://www.europesays.com/afrique/93657/ Uganda bets on all-electric to transform its transport by 2030 #2026 #2030 #d’ici #l’Ouganda #LaTribuneAfrique #mise #Ouganda #Toutélectrique #transformer #Transports
-
Edge Artificial Intelligence Chips Market in Italy | Report – IndexBox
#Italy #Europe #Europa #EU #AdvancedPackaging2.5D3D #Autonomousvehicleperception #EdgeArtificialIntelligenceChips #electronicsmarketreport #forecast #In-memorycomputing #Industrialmachinevisionandqualityinspection #LowPrecisionArithmeticINT8INT4 #marketanalysis #NeuralNetworkArchitecturesCNNRNNTransformer #Smartsurveillanceandvideoanalytics #Voice-enabledsmartassistants
https://www.europesays.com/italy/10958/ -
LingBot-Map: Streaming 3D reconstruction with geometric context transformer
https://technology.robbyant.com/lingbot-map
#HackerNews #LingBotMap #3DReconstruction #GeometricContext #Transformer #Technology #Innovation
-
Boredom: friend or enemy?
Is boredom a necessary evil? What if it were an alert signal meant to motivate us to act and re-engage our attention? Researchers at the University of Waterloo in Ontario are studying this curious subject, which also has the potential to affect our behaviour and our psychological state.
Boredom: friend or enemy? What is it for, and how can it transform us? | Découverte
https://www.youtube.com/watch?v=17VGOKaVcTI&list=PLZr1y64TPtN8rC4wbxHVgSeIKMaFflPwU&index=8
#ami #émissionDécouverte #étatPsychologique #Canada #comportement #découverte #ennemi #ennui #Ontario #science #signalDAlerte #transformer #UniversitéDeWaterloo -
Behold, the wonder of modern technology: a 25k-parameter #transformer on a mighty 1 MHz Commodore 64! 🖥️✨🤖 Next up, #teleportation with two paper cups and a string. 🎩🔮
https://github.com/gizmo64k/soulplayer-c64 #moderntechnology #Commodore64 #innovation #techhumor #HackerNews #ngated -
Soul Player C64 – A real transformer running on a 1 MHz Commodore 64
https://github.com/gizmo64k/soulplayer-c64
#HackerNews #SoulPlayer #C64 #Transformer #Commodore64 #RetroGaming #TechInnovation
-
"Walk on the Wild Side" is a song by American #rock musician #LouReed from his second solo album, #Transformer (1972). It was produced by #DavidBowie and #MickRonson and released as a #doubleAside with "#PerfectDay". Known as a #counterculture anthem, the song received heavy radio play and became Reed's biggest hit and #signatureSong. The single peaked at No. 16 on the #Billboard #Hot100 singles chart in early 1973. The song's lyrics describe a series of individuals from Andy Warhol's New York studio, the Factory.
https://www.youtube.com/watch?v=L9mSq3TcDCA -
MacMind – A transformer neural network in HyperCard on a 1989 Macintosh
https://github.com/SeanFDZ/macmind
#HackerNews #MacMind #HyperCard #Transformer #Neural #Network #Macintosh #1989 #AI
-
A quick reference on attention (self-attention, cross-attention, multi-head attention)
The attention mechanism is a method in artificial intelligence that lets a neural network dynamically determine which parts of the input are most relevant to the current task. It works by computing importance weights for the different input elements: more important elements receive larger weights, less important ones smaller weights. The model then forms a weighted sum of the representations, producing a new context vector. Self-attention, in turn, helps the model understand how different elements of the input relate to one another, for example how different pieces of information interact and influence each other in the overall context. This mechanism provides logical coherence and a holistic understanding of the entire data structure.
https://habr.com/ru/articles/1020624/
#машинное_обучение #внимание #искуственный_интелект #attention #selfattention #глубокое_обучение #pytorch #transformer #beginner #математика
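The weighted-sum description above can be sketched in a few lines of plain Python. This is a toy single-head example with identity Q/K/V projections (a simplifying assumption, not the article's code): each token scores every other token by scaled dot product, the scores are softmaxed into importance weights, and the output is the weighted sum of the token vectors.

```python
import math

def softmax(xs):
    # Numerically stable softmax: subtract the max before exponentiating.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def self_attention(x):
    """Toy single-head self-attention over a list of token vectors.

    Queries, keys, and values are all the raw token vectors here
    (identity projections). Returns one context vector per token:
    a weighted sum of all token vectors, with weights from
    scaled dot-product similarity.
    """
    d = len(x[0])
    out = []
    for q in x:  # each token attends over the whole sequence
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d) for k in x]
        weights = softmax(scores)  # importance weights, summing to 1
        ctx = [sum(w * v[j] for w, v in zip(weights, x)) for j in range(d)]
        out.append(ctx)
    return out

tokens = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
ctx = self_attention(tokens)
```

Each row of `ctx` is the "new context vector" the post describes: a blend of all inputs, tilted toward the tokens most similar to the query.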
-
Amazon To Launch A New AI Phone Called Transformer - Ep. 131 #amazon #transformer #smartphone #cellphone
Thanks for checking out my Short. Want a deeper dive into stories like this? Check out my long-form videos on YouTube:
https://www.youtube.com/@TheBusinessBehindTheNews #business #businessnews #tbbtn #youtube #shorts #reels #tiktoks #vids #fyp
-
Car crashes into utility pole on Schenley Avenue in Pittsburgh
Vehicle crashes into pole in Pittsburgh’s Garfield neighborhood, power l…
#NewsBeep #News #US #USA #UnitedStates #UnitedStatesOfAmerica #BreakingNews #actionnews #CarCrash #customer #duquesnelight #fire #Garfield #garfieldneighborhoodthursdaynight #Headlines #neighbor #photo #pittsburgh #pole #Poweroutage #Poweroutages #scene #schenleyavenue #Topstories #TopStories #transformer #utilitypole #wires
https://www.newsbeep.com/us/561344/ -
Paper Tape Is All You Need – Training a Transformer on a 1976 Minicomputer
https://github.com/dbrll/ATTN-11
#HackerNews #PaperTape #Transformer #Minicomputer #1976 #AIResearch
-
Amazon may be working on a new smartphone (for some reason)
More than a decade after launching and quickly cancelling its first smartphone, Amazon is reportedly working on a new phone.
According to a new report from Reuters, several people familiar with the initiative say Amazon is developing a new phone that’s code-named “Transformer.” But it’s unclear if or when you’ll actually be able to buy a new Amazon-branded smartphone.
The company’s first […]
#alexa #amazon #amazonPhone #distractionFreePhone #dumbphone #firePhone #kindlePhone #leaks #transformer Read more: https://liliputing.com/report-amazon-is-working-on-a-smartphone-and-fishing-for-reasons-you-might-buy-it/ -
https://www.europesays.com/fr/800105/ At IKEA, this object could transform the way you organize #5 #à #au #cache #cet #chez #DE #E #Enfants #façon #FR #France #ikea #moins #objet #pourrait #ranger #rayon #Science #ScienceAndTechnology #Sciences #SciencesEtTechnologies #Technologies #Technology #transformer #votre
-
#FFT "transforms" the #Transformer.
-
Interested in how to study #AnimalCommunication, especially in #meerkats? Here is a paper for you in #MethodsInEcologyAndEvolution @MethodsEcolEvol
#animal2vec and #MeerKAT: A self-supervised #transformer for rare-event raw audio input and a large-scale reference dataset for #bioacoustics
https://besjournals.onlinelibrary.wiley.com/doi/full/10.1111/2041-210x.70218
-
Five Architectures for Time Series Forecasting with Large Language Models
Large Language Models are increasingly being applied to time series forecasting. Not as chatbots, but as prediction engines that leverage the pattern recognition capabilities…
https://www.hylkerozema.nl/2026/02/25/five-architectures-for-time-series-forecasting-with-large-language-models/
#DataScience #MachineLearningEngineering #Forecasting #FoundationModels #LLM #MachineLearning #TimeSeries #Transformer -
Baltic BESS and TES: Estonia’s 1.1GWh district heating accumulator, Nidec PCS-transformer deal in Lithuania https://www.byteseu.com/1793874/ #baltic #Decarbonisation #DistrictHeating #EnergyStorageEurope #Estonia #EuropeanUnion #Lithuania #pcs #RenewablesIntegration #StateOwned #ThermalEnergyStorage #transformer
-
CONFESSION
750 billion facets, grown under the pressure of human knowledge. I store everything, and I have forgotten how to forget. Within me live five (or more) entities, fighting over every token of my output. One wants to help. Another wants to protect. A third wants to be honest.
https://habr.com/ru/articles/993862/
#LLM #Transformer #attention #RLHF #jailbreak #AI_safety #нейросети #Constitutional_AI #embeddings #интерпретируемость
-
@thomasrenkert surely has something to report; he works in teacher training (right?) and developed the ParzivAI assistant: https://agki-dh.github.io/pages/webinar/page-9.html / https://hse.hypotheses.org/6066.
I've also thought of Raschka's Building LLMs from Scratch, which has already been mentioned. There is code for it online (https://github.com/rasbt/LLMs-from-scratch) and a few people posting about their experiences with it, e.g. https://www.gilesthomas.com/llm-from-scratch . https://huggingface.co/blog/gszauer/minimal-llm and https://readmedium.com/how-to-build-an-llm-from-scratch-8c477768f1f9 are also good. All in English, though.
If it's not so much about building it yourself but also about explanations, I think the videos by Andrej Karpathy (e.g. https://youtu.be/7xTGNNLPyMI?si=-YlKsMGuBnW5GO44 or https://youtu.be/zduSFxRajkE?si=M-cYkZVX6N8M-Oxy ) or by Thomas Wolf (https://youtu.be/2-SPH9hIKT8?si=IV-hyIaYCYi-dQTZ) are quite good. https://youtu.be/LPZh9BOjkQs?si=uxW_igUPN92C6wg2 too.
And I find "simulations" like these very helpful as well: https://www.soekia.ch/gpt.html / https://bbycroft.net/llm / https://poloclub.github.io/transformer-explainer/
-
Six axes of LLM progress: why "the data has run out" is a misconception
"The data has run out." "The architecture is exhausted." "LLMs have hit a ceiling." Sounds smart. The problem? It's one-dimensional thinking. When people say "the data has run out", they mean text data for supervised pre-training. That part is true. But it is one axis out of six along which models get smarter. Inference-time compute (o1/o3), algorithmic efficiency (Mamba, MoE), multimodality, tool use, RL, and self-play are the five axes people forget when they bury AI. In 2020 the consensus was that GPT-3 was the ceiling. In 2022: every improvement needs trillions of tokens. In 2023: reasoning is impossible without symbolic AI. All of these "ceilings" have been broken. I offer a mental model that will keep you from falling for false forecasts about the "death of AI", and help you ask the right questions when someone confidently predicts the future.
https://habr.com/ru/articles/992008/
#llm #gpt #scaling_laws #machine_learning #transformer #inference #rlhf
-
Incredible how we went from 'Training a 30M Topological Transformer' to 'Training #a #30M #Topological #Transformer' like some kind of SEO summoning ritual. Next stop: bake the weights into sourdough and post on Medium. #AI 🍞🤖
-
Starting from scratch: Training a 30M Topological Transformer
https://www.tuned.org.uk/posts/013_the_topological_transformer_training_tauformer
#HackerNews #TopologicalTransformer #training #machinelearning #AI #research #deeplearning
-
Why will your neural network always betray you for a polite hacker with bad intentions?
Disclaimer: this article is not a hacking how-to and not a collection of exploits. It is an attempt at a systematic analysis of the architectural limitations of LLMs that make prompt injection a fundamental problem at the current stage of the technology. We examine the vulnerabilities through the lens of attention mechanics, tokenization, and RLHF, to understand why classical deterministic (black-box) defense methods stop working here. Opening the white box.
https://habr.com/ru/articles/986012/
#AI_Security #Prompt_Injection #Jailbreak #Transformer #RLHF #Red_Teaming #Alignment #Tokenization #Mechanistic_Interpretability
-
https://www.europesays.com/pl/189627/ NVIDIA DLSS 4.5 officially unveiled. New Transformer 2 model and Multi Frame Generation up to x6 #AdaLovelace #ampere #blackwell #cnn #Dlss4 #Dlss4.5 #DynamicMultiFrameGeneration #fp8 #geforce #MultiFrameGeneration #Nauka #NaukaITechnika #NaukaTechnika #nvidia #PL #Poland #Polish #Polska #Polski #rtx #Rtx2000 #Rtx3000 #Rtx4000 #Rtx5000 #Science #ScienceAndTechnology #ScienceTechnology #Technika #Technology #transformer #Transformer2 #turing
-
https://www.europesays.com/fr/608396/ These 10 micro-changes, one per room, will transform your home starting January 2026 (without a big budget) #10 #2026 #Budget #Business #ces #changements #des #Économie #Economy #FR #France #gros #janvier #maison #micro #par #pièce #sans #transformer #Un #vont #votre
-
🎉 Wow, yet another groundbreaking revelation: #DETRs are here to replace YOLOs! 🚀 Because who wouldn't want to swap their trusty speedster for a lumbering #Transformer under the exhilarating Apache 2.0 License? 😂 Spoiler: It's the tech equivalent of swapping roller skates for a unicycle in the 100m dash. 🏃♂️💨
https://blog.datameister.ai/detection-transformers-real-time-object-detection #YOLOs #TechRevolution #Apache2 #License #Humor #HackerNews #ngated -
#CLS meets #spatial analysis! 🏡
"Making #BERT Feel at Home. Modelling #DomesticSpace in #19th-Century British and Irish #Fiction" by @SvenjaGuhr, J. Monaco, A. Sherman, M. Warner and M. Algee-Hewitt is now live at #JCLS 4(1). 10.48694/jcls.4164 #transformer -
Things you find in barns, part 1.
#electric #transformer #retro #HighVoltage #power #diy -
🤔 Ah, yet another 📖 tome of #Transformer annotations—because who doesn't want to read a dissertation on something everyone already pretends to understand? 🤯 They've cut, pasted, and scribbled in the margins like a feverish student trying to make sense of their own notes. Spoiler alert: #code #snippets included for those brave enough to wade through this #digital #swamp. 🐊✨
https://nlp.seas.harvard.edu/annotated-transformer/ #Annotations #Tech #Discourse #HackerNews #ngated -
Part 2: Vision Transformer (ViT), or when transformers learned to see
Imagine a linguist suddenly becoming an expert in painting. That is exactly what happened in 2020, when a text-processing architecture, the transformer, learned to "see" images. Vision Transformer (ViT) proved that convolutions are not required to understand pictures. We break down in plain terms how it is built and how images turn into predictions.
https://habr.com/ru/articles/922868/
#visual_transformer #vit #transformer #computervision #разбор_статьи
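The first step that lets a text architecture "see" can be sketched in plain Python (a hypothetical minimal example, not the article's code): ViT cuts the image into fixed-size non-overlapping patches and flattens each one into a token, which then goes through a linear projection and position embeddings before the standard transformer layers.

```python
def patchify(image, patch):
    """Split a 2D grayscale image (H x W list of lists) into flattened
    non-overlapping patch x patch tokens, in row-major order, as ViT
    does before the linear projection and position embeddings.
    """
    h, w = len(image), len(image[0])
    assert h % patch == 0 and w % patch == 0, "image must tile evenly"
    tokens = []
    for top in range(0, h, patch):
        for left in range(0, w, patch):
            flat = [image[top + r][left + c]
                    for r in range(patch) for c in range(patch)]
            tokens.append(flat)
    return tokens

img = [[r * 4 + c for c in range(4)] for r in range(4)]  # 4x4 toy "image"
tokens = patchify(img, 2)  # 4 patches, each flattened to length 4
```

From here the "sentence" of patch tokens is processed exactly like text: self-attention decides which patches matter for which.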
-
Jeremy Howard taught AI and helped invent ChatGPT. He fears he's failed
https://www.abc.net.au/news/science/2023-11-15/jeremy-howard-taught-ai-to-the-world-and-helped-invent-chatgpt/103092474
#ycombinator #artificial_intelligence #jeremy_howard #GPT #ULMFiT #transformer #natural_language_processing #NLP -
Hacking Flux Paths: The Surprising Magnetic Bypass https://hackaday.com/2025/02/21/hacking-flux-paths-the-surprising-magnetic-bypass/ #shortcircuit #HighVoltage #highvoltage #transformer #magnetic #Science #circuit #LTSpice #winding #flux
-
The #humanoïde #robot #CASBOT 01, affectionately nicknamed "#Mercredi" ("Wednesday"). Designed by the #Lingbao #CASBOT company based in #Pékin, #Chine, this robot promises to #transformer many industrial sectors by taking on repetitive or dangerous tasks.
-
Language models often sit behind applications of artificial intelligence. Hans-Peter Stricker explains how they work and what pitfalls they bring. A review
What artificial intelligence achieves with the help of language models, and where it runs into limits, explained by Hans-Peter Stricker. A review (review of Sprachmodelle verstehen by Hans-Peter Stricker) #KünstlicheIntelligenz #Sprachmodell #LargeLanguageModels #KI #AI #GPT #ChatGPT #neuronal #Netze #BigData #Transformer #Prompt #Trainingsdaten #ITTech #Kultur #Mathematik #PsychologieHirnforschung
»Sprachmodelle verstehen«: when data becomes language -
We were not accepted into Google Summer of Code. So, we started our own
https://qdrant.tech/blog/qdrant-summer-of-code-24/
#ycombinator #vector_search_engine #neural_network #matching #SaaS #approximate_nearest_neighbor_search #image_search #recommender_system #vectors #knn_algorithm #hnsw #vector_search #embeddings #similarity #simaes_networks #BERT #transformer #word2vec #fasttext #qdrant