home.social

#mooreslaw — Public Fediverse posts

Live and recent posts from across the Fediverse tagged #mooreslaw, aggregated by home.social.

  1. The "AI" phenomenon and craze has been discussed endlessly, including the vast computational and energy requirements of dedicated #datacenters

    There is an important split betwen model building and inference of statistical models (and hence also #llm ) which means there is not one but two problems to solve.

    The first one is *highly* non-trivial. It bumps against #mooreslaw and requires #supercomputing networking tricks. The datacenter must operate as one giant machine

    openai.com/index/mrc-supercomp
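
The split the post describes can be made concrete with the common rough FLOP estimates: training an N-parameter model on D tokens costs about 6·N·D FLOPs, paid once, while inference costs about 2·N FLOPs per generated token, paid on every request. A minimal sketch; the model and token sizes below are illustrative assumptions, not figures from the post:

```python
# Rough FLOP estimates for the two problems: one-off training vs.
# per-query inference. Common approximations: training ~ 6*N*D FLOPs,
# inference ~ 2*N FLOPs per generated token. All sizes below are
# illustrative assumptions.

N = 1e12                   # assumed model parameters (1T)
D = 1e13                   # assumed training tokens (10T)
tokens_per_query = 1_000   # assumed output length per request

training_flops = 6 * N * D                   # paid once
inference_flops = 2 * N * tokens_per_query   # paid per request

print(f"training:  {training_flops:.1e} FLOPs, once")
print(f"inference: {inference_flops:.1e} FLOPs, per query")
# At these sizes one training run costs as much as ~3e10 queries,
# which is why the two problems call for different solutions.
```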

  2. The "AI" phenomenon and craze has been discussed endlessly, including the vast computational and energy requirements of dedicated #datacenters

    There is an important split betwen model building and inference of statistical models (and hence also #llm ) which means there is not one but two problems to solve.

    The first one is *highly* non-trivial. It bumps against #mooreslaw and requires #supercomputing networking tricks. The datacenter must operate as one giant machine

    openai.com/index/mrc-supercomp

  3. The "AI" phenomenon and craze has been discussed endlessly, including the vast computational and energy requirements of dedicated #datacenters

    There is an important split betwen model building and inference of statistical models (and hence also #llm ) which means there is not one but two problems to solve.

    The first one is *highly* non-trivial. It bumps against #mooreslaw and requires #supercomputing networking tricks. The datacenter must operate as one giant machine

    openai.com/index/mrc-supercomp

  4. The "AI" phenomenon and craze has been discussed endlessly, including the vast computational and energy requirements of dedicated #datacenters

    There is an important split betwen model building and inference of statistical models (and hence also #llm ) which means there is not one but two problems to solve.

    The first one is *highly* non-trivial. It bumps against #mooreslaw and requires #supercomputing networking tricks. The datacenter must operate as one giant machine

    openai.com/index/mrc-supercomp

  5. The "AI" phenomenon and craze has been discussed endlessly, including the vast computational and energy requirements of dedicated #datacenters

    There is an important split betwen model building and inference of statistical models (and hence also #llm ) which means there is not one but two problems to solve.

    The first one is *highly* non-trivial. It bumps against #mooreslaw and requires #supercomputing networking tricks. The datacenter must operate as one giant machine

    openai.com/index/mrc-supercomp

  2. Does your phone already feel old after just a year? 🤔

    On this day in 1965, Moore's Law was published! The 'law' that predicts that chip power doubles every two years. The real reason we just keep upgrading. 🚀

    Which 'prehistoric' device do you secretly still miss?

    #MooresLaw #Techweetje
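
For scale, here is what that doubling implies over the sixty years since the 1965 paper; a minimal sketch assuming a clean, fixed two-year doubling period:

```python
# Growth factor under Moore's Law: doubling every two years.
def moores_factor(years: float, doubling_period: float = 2.0) -> float:
    return 2 ** (years / doubling_period)

# 1965 -> 2025 is 60 years, i.e. 30 doublings: roughly a billionfold.
print(f"{moores_factor(60):.2e}x")  # ~1.07e+09x
```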

  3. @icing There's a thing called "the curse of dimensionality" and it applies to neural networks. I guess you could say that it's like a reverse Moore's Law but for neural nets. Basically (and this is just my mostly non-technical explanation), neural nets are huge multi-dimensional classifiers, and when you need to do backpropagation to train the net, it involves making small adjustments to localised areas of the classifier space. The problem (or curse) of having more dimensions is that it becomes harder and harder to localise the changes, because every local region ends up close to all the other points in every other subspace. This means exponentially higher training costs as these models scale.

    At least that's as I understand it. I'm not a mathematician, but I have read plenty of stuff relating to machine learning over the years (since the 90s) and I think I've got the above right...

    #MooresLaw #MachineLearning #Classifiers
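
The "everything ends up close to everything" effect described above can be demonstrated numerically: as dimension grows, pairwise distances between random points concentrate, so local neighbourhoods stop being meaningfully local. A small sketch; the point count and dimensions are arbitrary choices:

```python
# Distance concentration: the relative spread of pairwise distances
# between uniform random points shrinks as the dimension grows.
import numpy as np
from scipy.spatial.distance import pdist

rng = np.random.default_rng(0)
for dim in (2, 10, 100, 1000):
    points = rng.random((500, dim))  # 500 uniform random points
    dists = pdist(points)            # all pairwise Euclidean distances
    spread = (dists.max() - dists.min()) / dists.min()
    print(f"dim={dim:4d}  (max-min)/min distance: {spread:.2f}")
```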

  4. MARIA, the graphics chip in the Atari 7800, had 24k transistors. Atari sold about 3.8M units of that console, which adds up to 91.2B transistors.

    Coincidentally, GB202-300, the graphics chip in the NVIDIA 5090, has 92.2B transistors.

    Just sayin'

    #RetroComputing #Bloat #MooresLaw
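
The arithmetic checks out with the post's own figures:

```python
# Total transistors across every MARIA ever sold vs. one GB202-300 die.
maria = 24_000            # transistors per MARIA chip
units = 3_800_000         # approx. Atari 7800 consoles sold
gb202 = 92_200_000_000    # transistors in one GB202-300

print(f"all MARIAs combined: {maria * units:.3e}")  # 9.120e+10
print(f"one GB202-300:       {gb202:.3e}")          # 9.220e+10
```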

  5. Right now, capital is plowing $ into transistors that cannot get smaller. Why? I am someone who has taught AI for 25 years. #AI is NOT a #bubble, because compute is no longer on an exponential curve. When compute becomes additive, there are not enough means of production to go around. So what does #capital do? Put it in a data center (a factory) at all costs. Make compute a metered service at all costs. It is not the #LLM, not the #chat. It is the end of Moore's law we are witnessing. #datacenter #mooreslaw
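
The exponential-versus-additive contrast the post leans on, sketched with illustrative rates (doubling every two years versus building one extra unit of capacity per year; both rates are assumptions for illustration):

```python
# Compute growth: exponential (doubling every 2 years) vs. additive
# (one fixed extra unit of capacity per year). Rates are illustrative.
for year in range(0, 11, 2):
    exponential = 2 ** (year / 2)   # Moore's-Law-style scaling
    additive = 1 + year             # build one more unit per year
    print(f"year {year:2d}: exponential={exponential:5.1f}x  "
          f"additive={additive:3d}x")
```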

  6. ASML kept Moore's Law alive by shrinking 18th-century lithography to atomic scales. Transistor density: one rice grain (1959) to three tankers today. bryl.us/tccw #ASML #MooresLaw #SemiconductorTech #TechHistory

  7. @TMRO
    🚀
    I am hoping this time we do bootstrap a Lunar economy. It is 60 years since Robert A. Heinlein wrote "The Moon Is a Harsh Mistress", which had mass being launched from the Moon via a linear accelerator. I know it was #SciFi, but the physics was known, and I'd say it was hard #SciFi within the technical capabilities of the day. If we had continued visiting the Moon we could have built it. Admittedly, an AI with a slightly warped SOH 😜 is only being born now, after sufficient iterations of #MooresLaw, but we could have built an equivalent of the Antarctic bases during the last century. I really hope it does not stall again & it does end up being "For All Mankind".

  8. There should be a campaign to rename it to “Moore’s Suggestion” or “Moore’s Guideline” (fed) #MooresLaw

  9. @tomjennings
    It's amazing how processing and memory were expensive and had to be centralized, so each user had an "inexpensive" terminal, but only a few years later processing and memory became the least expensive parts of a calculator.
    #MooresLaw

  10. My #Thinkpad X200 (currently running #OpenBSD) is about to turn 16.
    I bought it when it was already a 9-year-old relic.

    Yet, I could just about daily-drive it today without really breaking a sweat. Especially using a BSD and an X11-based window manager, which leaves a lot of free RAM out of its 4GB available for other things, like actually running a web browser. XD

    That's just crazy to me.

    I remember when my belovèd old #Macintosh SE turned 16 (in 2006), it was just sitting in my closet gathering dust. It had a lot of sentimental value, but wasn't really usable to me.

    RIP #MooresLaw. ;)
