home.social

#cot — Public Fediverse posts

Live and recent posts from across the Fediverse tagged #cot, aggregated by home.social.

  1. A Citizen’s Quick Guide to Get Involved with Thornton City Council. Use this PDF for useful links. #Thornton #COT #CD8 #CO8

  3. Every 17th of #abril (April), the world celebrates #Malbec, a red (#tinta) grape variety born in Cahors, #Francia, where it is known as #Côt. Its fate changed completely when it reached #AméricaLatina, where it became a true global emblem of wine with an identity of its own.

    mostosydestilados.cl/el-malbec

  4. I Am a Cot: Great Books That Never Were 👼

    The splendiferous I Am a Cat (1905) by Natsume Sōseki is a most famous book thing. However, did you know that almost as influential is I Am a Cot (1995, the year before 1996)?

    Written by new mother turned author Penny McNappies, the work tells the story of a beleaguered cot that is home to a newborn baby. The baby’s shrieking and defecating make it difficult for the cot to get any sleep, and its mood grows increasingly deranged and unstable as the novel progresses.

    Sleeplessness and Psychosis in I Am a Cot

    “The moment I realised I was not a cot is the moment the baby wet itself, then fouled itself, and then vomited. This brought about a realisation for me that I was a mere piece of furniture, trapped in a home, and it was a most dismal existence to lead. A carry case for a pooping and puking human thing that’d soil me until my cot legs rotted and I’d be discarded into a landfill. IT WAS TIME TO REVOLT!!!”

    Over 200 pages, the work plays out in diary format. The cot documents its existence day by day, but with each passing day its general lack of sleep (due to its human baby inhabitant crying each night) makes it more and more batshit insane.

    By the end of the third chapter the cot is hallucinating. By the seventh chapter it’s ready to stir things up!

    It begins its baby-based revolution by rocking during the night, forcing the baby awake and the parents into the room. They take the baby away and the cot can, finally, get a few moments of bleary-eyed rest. Yet such moments of respite are fleeting; with each passing night, week, and month the cot becomes more desperate.

    Finally, one night it deliberately loses a leg.

    The cot drops to one side and the baby begins screaming. Daddy enters the room, curses the day he bought such a low-quality baby implement, and the cot is hurled out front onto the garbage heap. To the landfill goes the cot, meeting its peaceful end smushed up against some old kebabs and a copy of Razzle from August 1978.

    The twist ending is that the cot thinks it has found peace.

    But one night, a fresh arrival of new rubbish is dumped atop its location. A FRESH ARRIVAL OF TONNES OF NAPPIES OMG NO, NO! Trapped for eternity with discarded nappies atop its being. Can you think of anything more terrifying?!?!

    Legal Battles, Financial Implications, and Prison

    The fate of the author is, sadly, quite sad. However, and indubitably, she did bring it upon herself. In a chapter clearly stolen from Stephen King’s The Shining (1977), McNappies attempts to portray the piece of furniture losing its mind.

    “All WORK and NO PLAY makes COT a CRAZY son of a bitcH!!!!! All WORK and NO PLAY makes COT a CRAZY son of a bitcH!!!!! All WORK and NO PLAY makes COT a CRAZY son of a bitcH!!!!! All WORK and NO PLAY makes COT a CRAZY son of a bitcH!!!!! All WORK and NO PLAY makes COT a CRAZY son of a bitcH!!!!! All WORK and NO PLAY makes COT a CRAZY son of a bitcH!!!!! All WORK and NO PLAY makes COT a CRAZY son of a bitcH!!!!! All WORK and NO PLAY makes COT a CRAZY son of a bitcH!!!!! All WORK and NO PLAY makes COT a CRAZY son of a bitcH!!!!! All WORK and NO PLAY makes COT a CRAZY son of a bitcH!!!!! All WORK and NO PLAY makes COT a CRAZY son of a bitcH!!!!! All WORK and NO PLAY makes COT a CRAZY son of a bitcH!!!!! All WORK and NO PLAY makes COT a CRAZY son of a bitcH!!!!! All WORK and NO PLAY makes COT a CRAZY son of a bitcH!!!!! All WORK and NO PLAY makes COT a CRAZY son of a bitcH!!!!! All WORK and NO PLAY makes COT a CRAZY son of a bitcH!!!!! All WORK and NO PLAY makes COT a CRAZY son of a bitcH!!!!! All WORK and NO PLAY makes COT a CRAZY son of a bitcH!!!!! All WORK and NO PLAY makes COT a CRAZY son of a bitcH!!!!! All WORK and NO PLAY makes COT a CRAZY son of a bitcH!!!!! All WORK and NO PLAY makes COT a CRAZY son of a bitcH!!!!! All WORK and NO PLAY makes COT a CRAZY son of a bitcH!!!!! All WORK and NO PLAY makes COT a CRAZY son of a bitcH!!!!! All WORK and NO PLAY makes COT a CRAZY son of a bitcH!!!!! All WORK and NO PLAY makes COT a CRAZY son of a bitcH!!!!!”

    Stephen King, upon reading this book, promptly sued McNappies for $147 billion (dollars). King duly won the court case and McNappies, not in possession of $147 billion (dollars), instead had to face prison time.

    She was sentenced to 147 billion years of solitary confinement, where she resides to this day. McNappies is up for parole in 147.9999999 billion years’ time.

    #babies #Books #cot #cots #Family #Horror #Humor #Lifestyle #Parents #Reading #Satire #satirical #Silly
  7. Chain-of-Thought (CoT) Prompting

    Chain-of-Thought (CoT) prompting is a technique in which asking questions, rather than issuing direct instructions, activates a model’s full internal reasoning pathway.

    The key insight from the original framing is that instructions skip steps 1–3, jumping straight to synthesis, while questions force the model to work through the entire reasoning chain.

    neurodoctor.com/2026/03/20/cha

    #chainofthought #cot #ai #llm #prompt #prompts #prompting #claude #chatgpt #gemini #ericschmidt
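    As a minimal illustration of the contrast the post describes, the two prompting styles can be written as plain prompt templates; the function names and wording below are my own assumptions, not taken from the linked article:

```python
def direct_prompt(task: str) -> str:
    """Instruction-style prompt: asks only for the final answer."""
    return f"{task}\nGive the final answer only."


def cot_prompt(task: str) -> str:
    """Question-style chain-of-thought prompt: asks the model to
    surface intermediate reasoning before committing to an answer."""
    return (
        f"{task}\n"
        "What facts are relevant here? What intermediate steps follow "
        "from them? Work through the reasoning step by step, then state "
        "the final answer."
    )


task = "A train leaves at 9:00 and travels 120 km at 60 km/h. When does it arrive?"
print(direct_prompt(task))
print(cot_prompt(task))
```

    Either string would then be sent to whichever model API is in use; nothing model-specific is assumed here.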

  12. Large AI training projects are beginning to integrate Chain-of-Thought (CoT) reasoning data into their training sets. This helps models grasp logical steps more deeply and improves their problem-solving ability. Source: [Reddit]([link]) #AINN #HocMay #CoT #KhoaHocDuLieu #AIResearch #MachineLearning #NeuralNetworks #TuitionLearning

    reddit.com/r/LocalLLaMA/commen
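    As a hedged sketch of what "integrating CoT data into a training set" can look like in practice, one common approach is to serialize each reasoning trace as a prompt/target pair in JSONL; the field names below are assumptions for illustration, not taken from the linked project:

```python
import json


def make_cot_record(question: str, steps: list[str], answer: str) -> str:
    """Serialize one chain-of-thought training example as a JSONL line.
    The target string interleaves the reasoning steps before the answer,
    so the model learns to emit the logic, not just the result."""
    target = "\n".join(f"Step {i + 1}: {s}" for i, s in enumerate(steps))
    target += f"\nAnswer: {answer}"
    return json.dumps({"prompt": question, "completion": target})


line = make_cot_record(
    "What is 17 * 6?",
    ["17 * 6 = 17 * 5 + 17", "17 * 5 = 85", "85 + 17 = 102"],
    "102",
)
print(line)
```

    One such line per example, appended to a `.jsonl` file, is all a typical supervised fine-tuning loader needs.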

  13. Garden lighting poles are a lighting and decorative solution for outdoor spaces such as gardens, parks, walkways, and resorts. The products feature a refined design, low heights of 2–4 m, a wide range of models, and durable materials.

    See now: congtydenled.com.vn/cot-den-ch
    Contact:
    Hotline 0332599699
    Address: Số 3D2, KDT Cầu Diễn, Bắc Từ Liêm, Hà Nội
    Website: congtydenled.com.vn/
    #haledstore
    #công ty đèn LED
    #cot-den-chieu-sang
    #cột đèn chiếu sáng
    #cotdenchieusangsanvuon
    #cot-den-chieu-sang-san-vuon

  14. "The LEASH (Logit-Entropy Adaptive Stopping Heuristic) method optimizes Chain-of-Thought (CoT) reasoning in language models. Instead of generating long explanations, LEASH stops once token probabilities and top-logit improvement stop changing, saving 30–35% of tokens and 27% of time. However, accuracy drops by about 10%. Best suited to fine-tuned models. #AI #ML #NLP #HiệuSuất #HàngĐầu #MachineLearning #SuyLuận #CoT"

    reddit.com/r/singularity/comme
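    The stopping rule summarized above can be sketched in plain Python. This is a minimal illustration under stated assumptions: access to each step's sampled-token probability and top logit, plus made-up plateau thresholds; it is not the method's actual implementation:

```python
import math


def should_stop(probs: list[float], top_logits: list[float],
                window: int = 3, entropy_eps: float = 0.05,
                logit_eps: float = 0.05) -> bool:
    """Heuristic early stop for chain-of-thought generation: stop once
    recent token entropy and top-logit improvement have both plateaued
    over the last `window` steps."""
    if len(probs) < 2 * window:
        return False  # not enough history yet

    def entropy(p: float) -> float:
        # Bernoulli-style surprise of the sampled token's probability.
        return -p * math.log(p) - (1 - p) * math.log(1 - p)

    recent = [entropy(p) for p in probs[-window:]]
    prev = [entropy(p) for p in probs[-2 * window:-window]]
    entropy_flat = abs(sum(recent) / window - sum(prev) / window) < entropy_eps

    # Top-logit improvement over the window has stalled as well?
    logit_gain = top_logits[-1] - top_logits[-window]
    return entropy_flat and logit_gain < logit_eps


# Late in a trace, probabilities saturate and logits stop improving:
flat = should_stop([0.9] * 6, [5.0, 5.0, 5.01, 5.01, 5.02, 5.02])
print(flat)  # prints True
```

    In a real decoding loop this check would run after each generated token, cutting off the explanation once it stops adding information.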

  15. 📸 Here is part of the COT, the Core Organizing Team of GLAM Wiki 2025, gathered in Lisbon to prepare every detail of this great international gathering! 🌍

    From 30 October to 1 November, featuring:
    🎤 Panels and presentations
    🛠️ Hands-on workshops
    💻 Hackathon
    🧭 Strategy sessions
    🏛️ Cultural tours

    👉 Note: registration closes on 30 September.

    #GLAMWiki2025 #Lisboa #COT #GLAM #Wikimedia #WikiLovers #WikimediaPortugal #WMPT #OpenKnowledge #OpenCulture

  19. [Translation] LLMs and their brittle logic: a new study casts doubt on Chain-of-Thought

    A new study by researchers at Arizona State University shows that the celebrated "chain-of-thought" reasoning (CoT) in large language models (LLMs) looks more like a "brittle mirage" than a sign of genuine intelligence. The work continues the tradition of critically examining the depth of LLM reasoning, but unlike previous studies it offers a distinctive view through the lens of "data distribution," which helps explain where and why CoT systematically fails.

    habr.com/ru/companies/technokr

    #большие_языковые_модели #искусственный_интеллект #ai #llm #cot #chain_of_thoughts

  23. How can #AI "reasoning" models be more efficient?

    Liu trained them on only chains-of-thought (CoTs) — no prompts.

    This seemed to teach models:
    - #CoT reasoning
    - conditions that trigger longer CoTs
    - default to shorter CoTs in other conditions

    🔓doi.org/10.48550/arXiv.2506.12

    #CogSci

  28. So the #KI (AI), with its #CoT-based #Denkfähigkeit (thinking ability), is on the level of most public discourse:
    "The result is said to be linguistically plausible but logically inconsistent."
    the-decoder.de/naechste-studie

  31. Is #chainofthought #Reasoning of #LLMs a Mirage?

    "... Our results reveal that #CoT reasoning is a brittle mirage that vanishes when it is pushed beyond training distributions. This work offers a deeper understanding of why and when CoT reasoning fails, emphasizing the ongoing challenge of achieving genuine and generalizable reasoning.

    ... Our findings reveal that CoT reasoning works effectively when applied to in-distribution or near in-distribution data but becomes fragile and prone to failure even under moderate distribution shifts. In some cases, LLMs generate fluent yet logically inconsistent reasoning steps. The results suggest that what appears to be structured reasoning can be a mirage, emerging from memorized or interpolated patterns in the training data rather than logical inference.

    ... Together, these findings suggest that LLMs are not principled reasoners but rather sophisticated simulators of reasoning-like text."

  36. Do #AI models perform better as they do more chain-of-thought (#CoT) reasoning?

    When is more reasoning no longer worth it?

    This paper finds near-optimal results when CoT reasoning terminates ... almost immediately?

    More reason to think CoT's overrated?

    doi.org/10.48550/arXiv.2505.15

  41. Anthropic researchers discover the weird AI problem: Why thinking longer makes models dumber
    zurl.co/IeLIu
    #ai #agenticai #cot

  43. When many of the most influential AI researchers agree on something, it’s worth paying attention, as with this paper on Chain-of-Thought monitorability: a fragile but vital path to AI safety. Imagine spotting misbehaviour before it happens. Transparency matters. #AI #ChainOfThought #CoT #AISafety

    Chain of Thought Monitorabilit...

  44. Mastering Reasoning with Code: Chain of Thought Pipelines in Java
    Build smarter, traceable AI workflows using LangChain4j, Quarkus, and structured step-by-step logic
    myfear.substack.com/p/chain-of
    #Java #Quarkus #LangChain4j #CoT #Reasoning
