home.social

#billions — Public Fediverse posts

Live and recent posts from across the Fediverse tagged #billions, aggregated by home.social.

  1. According to a survey conducted by "Der Spiegel" among the relevant ministries, German states are expected to spend hundreds of millions of euros over the comin... news.osna.fm/?p=45958 | #news #billions #civil #constitutional #costs

  5. To comply with the guidelines set by the Federal Constitutional Court regarding adequate compensation for civil servants, Federal Interior Minister Alexander Do... news.osna.fm/?p=41896 | #news #berlins #billions #cost #debate

  9. So #Claude #Code is exactly the hot mess I expected it to be.

    The mental image I have of any "#Agent Layer" on top of #LLMs is always the same: a barrel battered by bullets fixed with (ineffective) #bandaids making it leak *a little slower*.

    The #injection safeties are as ridiculous as they are insightful, the "#sentiment analysis" is a blacklist... c'mon.

    Raising #billions for this is most definitely quite the #achievement.

  10. Federal Minister for Family Affairs, Karin Prien (CDU), plans a comprehensive overhaul of child and youth welfare. A draft proposal from her ministry, which "Po... news.osna.fm/?p=40099 | #news #aiming #billions #child #family

  14. The federal government plans to cut the air‑traffic tax in the middle of the year, according to a draft from the Finance Ministry reported by the FAZ. The measu... news.osna.fm/?p=38523 | #news #airtraffic #billions #cutting #fees

  19. nytimes.com/2026/02/14/us/cali

    How genuinely pathetic is this behavior? #billionaires would rather move elsewhere instead of paying a tax to support #healthcare for the #poor in #California - they would rather #profit from #stock #gains by spending #billions on #AI #genai - when they talk of #UBI - beware - they will not help a single person who loses their #job to AI

  20. Movie TV Tech Geeks #TVFeatures #Industry #Billions #PaulGiamatti If You Love ‘Industry,’ This Sleeper Hit Cat-and-Mouse Drama Is Your Next Binge With 49+ Million Hours Watched dlvr.it/TQRkqg

  21. LLMs contain a LOT of parameters. But what’s a parameter? – MIT Technology Review

    Artificial intelligence

    LLMs contain a LOT of parameters. But what’s a parameter?

    They’re the mysterious numbers that make your favorite AI models tick. What are they and what do they do?

    By Will Douglas Heaven

    January 7, 2026

    Photo Illustration by Sarah Rogers/MITTR | Photos Getty

    MIT Technology Review Explains: Let our writers untangle the complex, messy world of technology to help you understand what’s coming next. You can read more from the series here.

    I am writing this because one of my editors woke up in the middle of the night and scribbled on a bedside notepad: “What is a parameter?” Unlike a lot of thoughts that hit at 4 a.m., it’s a really good question—one that goes right to the heart of how large language models work. And I’m not just saying that because he’s my boss. (Hi, Boss!)

    A large language model’s parameters are often said to be the dials and levers that control how it behaves. Think of a planet-size pinball machine that sends its balls pinging from one end to the other via billions of paddles and bumpers set just so. Tweak those settings and the balls will behave in a different way.  

    OpenAI’s GPT-3, released in 2020, had 175 billion parameters. Google DeepMind’s latest LLM, Gemini 3, may have at least a trillion—some think it’s probably more like 7 trillion—but the company isn’t saying. (With competition now fierce, AI firms no longer share information about how their models are built.)

    But the basics of what parameters are and how they make LLMs do the remarkable things that they do are the same across different models. Ever wondered what makes an LLM really tick—what’s behind the colorful pinball-machine metaphors? Let’s dive in.  

    What is a parameter?

    Think back to middle school algebra, like 2a + b. Those letters are parameters: Assign them values and you get a result. In math or coding, parameters are used to set limits or determine output. The parameters inside LLMs work in a similar way, just on a mind-boggling scale. 
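    The algebra analogy above can be sketched in a few lines of Python. This is purely illustrative (the toy `model` function and its two parameters are my own example, not anything from the article): a model is just a function whose output is steered by its parameters, and changing them changes its behavior.

```python
# Toy "model" echoing the article's 2a + b example: two parameters, one input.
# Real LLMs work the same way in spirit, but with billions of such numbers,
# set by training rather than by hand.

def model(x, params):
    a, b = params          # the parameters: the tunable dials
    return a * x + b       # the computation they control

print(model(3, (2, 1)))    # parameters (2, 1): 2*3 + 1 = 7
print(model(3, (5, -1)))   # tweak the dials:   5*3 - 1 = 14
```

    Same input, different parameters, different output: that is, in miniature, what tweaking the pinball machine's settings means.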

    Editor’s Note: Read the rest of the story at the link below.

    Continue/Read Original Article Here: LLMs contain a LOT of parameters. But what’s a parameter? | MIT Technology Review

    Tags: Billions?, DeepMind, Gemini 3, Google, Large Language Models, LLMs, Lots of Parameters, MIT Technology Review, Parameter, Trillions?
    #Billions #DeepMind #Gemini3 #Google #LargeLanguageModels #LLMs #LotsOfParameters #MITTechnologyReview #Parameter #Trillions
  26. RE: flipboard.com/@lastampa/la-sta

    In Italy, the day after #mariodraghi recommends investing in AI, a small research center (30 researchers) focused on AI is closing, along with several projects, some against #fakenews and #misinformation.

    A €0.8M loss, while #OpenAI sits on #billions of profit-less investments.

    This is really a #pity and yet another bad omen for the #EU's future. After "missing the train" of the #internet #socialnetworks #e-economy #smartphones, the EU cannot afford to also lose the opportunity of #AI

  27. The dementia patient is dissembling freely again:

    > "Without Tariffs, and all of the TRILLIONS OF DOLLARS we have already taken in,
    > our Country would be completely destroyed, and our military power would be
    > instantly obliterated"

    Trillions? The actual figure for the whole of this administration is around 100 billion - and that's including tariffs, customs / duty, and excise taxes that existed pre-tantrum.

    He's mad judges have correctly ruled that his (limited) authority to levy tariffs in an emergency doesn't just let him do anything and everything he wants, forever.

    Someone please take the phone away from grandpa. He needs to sleep.

    #USPol #Trump #tariff #tariffs #revenue #customs #excise #trillions #billions #dissemble #lie #MentalPatient #dementia #demented #rant #raving #lunatic #USA #tantrum #TACO
