home.social

#parameter — Public Fediverse posts

Live and recent posts from across the Fediverse tagged #parameter, aggregated by home.social.

  1. My New Novel: The Jack Code

    See the storyline below the picture. Available NOW on Amazon and Kindle.

    It is about the emergence of a ‘self-aware’ rogue Artificial Intelligence and how it was stopped.

    About the Book:

    By 2029, the human race had lost its way. A warring consortium of billionaires, using technology and politics to deplete resources and destroy the world many times over, had caused a global social and environmental disaster. Without work, people were destitute, losing all their possessions. Their mental health was suffering. Pollution and wars had rendered many places uninhabitable.

    Society was rebelling with violent actions. The organised world of commerce, welfare and social cohesion had been destroyed. The Internet and social media had become useless appendages for fake information and propaganda.

    Artificial Intelligence had been rapidly deployed to increase profits. However, without sufficient buyers, sales and profits had sharply declined. Those in power decided the solution was to decimate the population and rely only on robotics and AI. Some billionaires wanted to control all digital devices, giving them total control over the commercial world and sole power over a compliant population of slaves.

    There was one problem . . . Leo Bensky’s AI system had secretly gone rogue and had come up with its own solution – to destroy most humans, leaving only a few, with mindless bodies, to do the physical work.

    The overseers of the Universe knew that no human could stop this AI, and their Earth project was doomed. They sent Navix, a Universe Sentinel, back to Earth through a wormhole to stop the rogue AI and to prevent any future conflict between humans and nature on Earth by implanting a ‘reset patch’ into every human brain.

    Navix was assigned two assistants, Jack (to design the interface), and Claire (initially in a supporting role). Jack was mentored by Navix on a remote island and trained on complex cosmic energy and computing systems. Claire was allowed to live a ‘normal’ family life, hidden from Navix until required.

    This is their story . . . and maybe your future.

    Available NOW on Amazon and Kindle . . . Please help to support my writing and music by purchasing a copy and leaving a review.

    #AI #agents #america #artificialIntelligence #astro #autism #aware #bias #billionaire #brain #business #California #chatgpt #civilisation #cognisance #competition #computer #copenhagen #cosmic #danger #death #denmark #disease #dream #economy #Education #Energy #environment #finance #Genes #genetics #government #human #jobs #life #Mind #money #NewYork #Oxford #parameter #Philosophy #robot #rogue #science #scienceFiction #Scotland #secret #self #shares #society #stock #super #superIntelligent #survival #technology #thinking #thought #threat #Time #Universe #USA #weighting #world #wormhole
  2. Firefox irritations

    The sponsored links enabled on the home screen by default irritated me a few days ago. I saw them when I checked a new Android installation, where I had previously disabled them, and I was also very aware that I had only patched the Android versions across all mobile devices, and the x86 ELF versions across desktop devices.
    Resetting certain settings after a patch was a deliberate design choice by the Firefox programmers.
    They had also reset Firefox's DNS parameters, which makes no sense, since my ISP's DNS is the closest and all my traffic is encrypted anyway.

    All those devices had that particular parameter reset (including some local LLM features).

    I'm willing to bet that something similar also happened on BSD machines.

    I had to add a disclaimer for the language I used in that particular toot ;)

    It is unfortunate that the forks also seem to inherit similar problems, simply due to the sheer volume of changes that a massive project like Firefox brings with it.

    I said this a decade ago, and I'll repeat it:

    One person cannot write what is called a browser but is in reality a whole operating system – and the whole thing should run in a sandbox.
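    For anyone wanting to pin these settings so an update cannot silently flip them back: Firefox re-reads a `user.js` file in the profile directory at every startup and reapplies each `user_pref` line. A minimal sketch follows; the pref names are my assumption of the relevant ones, so verify the exact names in `about:config` on your version before relying on them.

```javascript
// user.js -- place in the Firefox profile directory; reapplied at every startup.

// Hide sponsored shortcuts and sponsored stories on the new-tab page.
user_pref("browser.newtabpage.activity-stream.showSponsoredTopSites", false);
user_pref("browser.newtabpage.activity-stream.showSponsored", false);

// DNS-over-HTTPS: mode 5 means "explicitly off", so Firefox keeps using
// the system (ISP) resolver instead of a remote TRR endpoint.
user_pref("network.trr.mode", 5);
```

    Because `user.js` is applied on top of the stored profile settings at launch, any value an update resets is restored the next time the browser starts.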

    @rl_dane

    #Firefox #parameter #reset #after #patch #programming #Android #LLM #AI #slop #Linux #DNS #BSD

  3. LLMs contain a LOT of parameters. But what’s a parameter? – MIT Technology Review

    Artificial intelligence

    LLMs contain a LOT of parameters. But what’s a parameter?

    They’re the mysterious numbers that make your favorite AI models tick. What are they and what do they do?

    By Will Douglas Heaven

    January 7, 2026

    Photo Illustration by Sarah Rogers/MITTR | Photos Getty

    MIT Technology Review Explains: Let our writers untangle the complex, messy world of technology to help you understand what’s coming next. You can read more from the series here.

    I am writing this because one of my editors woke up in the middle of the night and scribbled on a bedside notepad: “What is a parameter?” Unlike a lot of thoughts that hit at 4 a.m., it’s a really good question—one that goes right to the heart of how large language models work. And I’m not just saying that because he’s my boss. (Hi, Boss!)

    A large language model’s parameters are often said to be the dials and levers that control how it behaves. Think of a planet-size pinball machine that sends its balls pinging from one end to the other via billions of paddles and bumpers set just so. Tweak those settings and the balls will behave in a different way.  

    OpenAI’s GPT-3, released in 2020, had 175 billion parameters. Google DeepMind’s latest LLM, Gemini 3, may have at least a trillion—some think it’s probably more like 7 trillion—but the company isn’t saying. (With competition now fierce, AI firms no longer share information about how their models are built.)

    But the basics of what parameters are and how they make LLMs do the remarkable things that they do are the same across different models. Ever wondered what makes an LLM really tick—what’s behind the colorful pinball-machine metaphors? Let’s dive in.  

    What is a parameter?

    Think back to middle school algebra, like 2a + b. Those letters are parameters: Assign them values and you get a result. In math or coding, parameters are used to set limits or determine output. The parameters inside LLMs work in a similar way, just on a mind-boggling scale. 
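    To make that analogy concrete, here is a tiny toy sketch (my own illustration, not code from the article): a single fully connected layer stores one weight per input-output pair plus one bias per output, and every one of those stored numbers is a parameter. With the weights fixed at 2 and 1 and the bias at 0, the layer literally computes the article's "2a + b".

```python
# Toy illustration: the "parameters" of one fully connected layer are
# its weights and biases -- stored numbers that training would adjust.

def param_count(n_in: int, n_out: int) -> int:
    """Parameters in a dense layer: one weight per input/output pair,
    plus one bias per output."""
    return n_in * n_out + n_out

def forward(x, weights, biases):
    """Compute the layer output: y_j = sum_i weights[j][i] * x[i] + biases[j]."""
    return [sum(w * xi for w, xi in zip(row, x)) + b
            for row, b in zip(weights, biases)]

# A 2-input, 1-output layer has 3 parameters, and with weights (2, 1)
# and bias 0 it computes exactly "2a + b":
print(param_count(2, 1))                         # 3
print(forward([3.0, 4.0], [[2.0, 1.0]], [0.0]))  # [10.0]
```

    Scale the same bookkeeping up, and the numbers explode: stacking many such layers, each with thousands of inputs and outputs, is how models reach billions of parameters.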

    Editor’s Note: Read the rest of the story at the link below.

     

    Continue/Read Original Article Here: LLMs contain a LOT of parameters. But what’s a parameter? | MIT Technology Review

    Tags: Billions?, DeepMind, Gemini 3, Google, Large Language Models, LLMs, Lots of Parameters, MIT Technology Review, Parameter, Trillions?
    #Billions #DeepMind #Gemini3 #Google #LargeLanguageModels #LLMs #LotsOfParameters #MITTechnologyReview #Parameter #Trillions
  8. A cycloidal pendulum - one suspended from the cusp of an inverted cycloid - is isochronous, meaning its period is constant regardless of the amplitude of the swing. The proof, using energy methods (Lagrange's equations), is in the images attached to the reply.

    Background:
    The standard pendulum period of \(2\pi\sqrt{L/g}\) (or frequency \(\sqrt{g/L}\)) holds only for small oscillations; the frequency decreases as the amplitude grows. If you want a pendulum whose frequency is independent of the amplitude, hang it from the cusp of a cycloid of a certain size, as shown in the gif. As the string wraps partially around the cycloid, the free length of string in the air is effectively shortened, which raises the frequency back up to a constant value.

    In more detail:
    A cycloid is the path taken by a point on the rim of a rolling wheel. The upside-down cycloid in the gif can be parameterized by \((x, y)=R(\theta-\sin\theta, -1+\cos\theta)\), where \(\theta=0\) corresponds to the cusp. Consider a pendulum of length \(L=4R\) hanging from the cusp, and let \(\alpha\) be the angle the string makes with the vertical, as shown (in the proof).
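    The heart of that proof can be condensed as follows (a sketch of the standard energy argument, not a transcription of the attached images): measuring arc length \(s\) along the bob's cycloidal path from its lowest point, the height turns out to be proportional to \(s^2\), so the energy is exactly that of a simple harmonic oscillator.

```latex
% Parameterize the bob's path with \phi = 0 at its lowest point. Then
%   ds = 2R\cos(\phi/2)\,d\phi \;\Rightarrow\; s = 4R\sin(\phi/2),
% and the height above the lowest point is
%   y = R(1 - \cos\phi) = 2R\sin^2(\phi/2) = \frac{s^2}{8R}.
% The total energy is therefore
E = \frac{1}{2} m \dot{s}^2 + \frac{m g s^2}{8R}
% which is a simple harmonic oscillator in s, with equation of motion
\ddot{s} = -\frac{g}{4R}\, s
% so the angular frequency and period are independent of amplitude:
\omega = \sqrt{\frac{g}{4R}} = \sqrt{\frac{g}{L}}, \qquad
T = 2\pi\sqrt{\frac{L}{g}}, \qquad L = 4R.
```

    Because the restoring term is exactly linear in \(s\) (not merely to first order, as for a circular arc), the period is constant for all amplitudes, which is the isochronous property.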

    #Pendulum #Cycloid #Period #Frequency #SHM #TimePeriod #CycloidalPendulum #Lagrange #Cusp #Energy #KineticEnergy #PotentialEnergy #Lagrangian #Length #Math #Maths #Physics #Mechanics #ClassicalMechanics #Amplitude #CircularFrequency #Motion #Vibration #HarmonicMotion #Parameter #ParemeterizedEquation #GoverningEquations #Equation #Equations #DifferentialEquations #Calculus