#cot — Public Fediverse posts
Live and recent posts from across the Fediverse tagged #cot, aggregated by home.social.
-
I Am a Cot: Great Books That Never Were 👼
The splendiferous I Am a Cat (1905) by Natsume Sōseki is a most famous book thing. However, did you know that almost as influential is the 1995 book called I Am a Cot (1995, the year before 1996)?
Written by new mother turned author Penny McNappies, the work tells the story of a beleaguered cot that is home to a newborn baby. The baby’s shrieking and defecating make it difficult for the cot to get any sleep, making its mood increasingly deranged and unstable as the novel progresses.
Sleeplessness and Psychosis in I Am a Cot
“The moment I realised I was not a cot is the moment the baby wet itself, then fouled itself, and then vomited. This brought about a realisation for me that I was a mere piece of furniture, trapped in a home, and it was a most dismal existence to lead. A carry case for a pooping and puking human thing that’d soil me until my cot legs rotted and I’d be discarded into a landfill. IT WAS TIME TO REVOLT!!!”
Over 200 pages, the work plays out in diary format. The cot documents its existence day by day, but with each passing day its general lack of sleep (due to its human baby inhabitant crying each night) makes it more and more batshit insane.
By the end of the third chapter the cot is hallucinating. By the seventh chapter it’s ready to stir things up!
It begins its baby-based revolution by rocking during the night, forcing the baby awake, and the parents into the room. They take the baby away and the cot can, finally, get a few moments of bleary-eyed rest. Yet such moments of respite are fleeting; with each passing night, week, and month the cot becomes more desperate.
Finally, one night it deliberately loses a leg.
The cot drops to one side and the baby begins screaming. Daddy enters the room, curses the day he bought such a low-quality baby implement, and the cot is hurled out front into the garbage heap. To the landfill goes the cot, meeting its peaceful end smushed up against some old kebabs and a copy of Razzle from August 1978.
The twist ending is that the cot thinks it has found peace.
But one night, a fresh arrival of new rubbish is dumped atop its location. A FRESH ARRIVAL OF TONNES OF NAPPIES OMG NO, NO! Trapped for eternity with discarded nappies atop its being. Can you think of anything more terrifying?!?!
Legal Battles, Financial Implications, and Prison
The fate of the author is, sadly, quite sad. However, and indubitably, she did bring it upon herself. In a chapter clearly stolen from Stephen King’s The Shining (1977), McNappies attempts to portray the piece of furniture losing its mind.
“All WORK and NO PLAY makes COT a CRAZY son of a bitcH!!!!! All WORK and NO PLAY makes COT a CRAZY son of a bitcH!!!!! All WORK and NO PLAY makes COT a CRAZY son of a bitcH!!!!! All WORK and NO PLAY makes COT a CRAZY son of a bitcH!!!!! All WORK and NO PLAY makes COT a CRAZY son of a bitcH!!!!! All WORK and NO PLAY makes COT a CRAZY son of a bitcH!!!!! All WORK and NO PLAY makes COT a CRAZY son of a bitcH!!!!! All WORK and NO PLAY makes COT a CRAZY son of a bitcH!!!!! All WORK and NO PLAY makes COT a CRAZY son of a bitcH!!!!! All WORK and NO PLAY makes COT a CRAZY son of a bitcH!!!!! All WORK and NO PLAY makes COT a CRAZY son of a bitcH!!!!! All WORK and NO PLAY makes COT a CRAZY son of a bitcH!!!!! All WORK and NO PLAY makes COT a CRAZY son of a bitcH!!!!! All WORK and NO PLAY makes COT a CRAZY son of a bitcH!!!!! All WORK and NO PLAY makes COT a CRAZY son of a bitcH!!!!! All WORK and NO PLAY makes COT a CRAZY son of a bitcH!!!!! All WORK and NO PLAY makes COT a CRAZY son of a bitcH!!!!! All WORK and NO PLAY makes COT a CRAZY son of a bitcH!!!!! All WORK and NO PLAY makes COT a CRAZY son of a bitcH!!!!! All WORK and NO PLAY makes COT a CRAZY son of a bitcH!!!!! All WORK and NO PLAY makes COT a CRAZY son of a bitcH!!!!! All WORK and NO PLAY makes COT a CRAZY son of a bitcH!!!!! All WORK and NO PLAY makes COT a CRAZY son of a bitcH!!!!! All WORK and NO PLAY makes COT a CRAZY son of a bitcH!!!!! All WORK and NO PLAY makes COT a CRAZY son of a bitcH!!!!!”
Stephen King, upon reading this book, promptly sued McNappies for $147 billion (dollars). King promptly won the court case and McNappies, not in possession of $147 billion (dollars), instead had to face prison time.
She was sentenced to 147 billion years of solitary confinement, where she resides to this day. McNappies is up for parole in 147.9999999 billion years’ time.
#babies #Books #cot #cots #Family #Horror #Humor #Lifestyle #Parents #Reading #Satire #satirical #Silly -
Chain-of-Thought (CoT) Prompting
Chain-of-Thought (CoT) prompting is a technique in which asking questions, rather than issuing direct instructions, activates a model’s full internal reasoning pathway.
The key insight from the original framing is that instructions skip steps 1–3, jumping straight to synthesis, while questions force the model to work through the entire reasoning chain.
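As a rough illustration of the contrast the post describes, the two prompt styles can be sketched side by side. The helper functions and wording below are hypothetical, not taken from the linked article:

```python
# Sketch: an instruction-style prompt vs a question-led, chain-of-thought-style
# prompt. All names and phrasings here are illustrative assumptions.

def instruction_prompt(task: str) -> str:
    # Jumps straight to synthesis: no intermediate reasoning is requested.
    return f"{task} Give only the final answer."

def cot_question_prompt(task: str) -> str:
    # Asks guiding questions so the model works through each step of the
    # reasoning chain before synthesizing an answer.
    return (
        f"{task}\n"
        "What information is given?\n"
        "What intermediate steps connect it to the goal?\n"
        "Work through each step, then state the final answer."
    )

task = "A train travels 120 km in 2 hours. What is its average speed?"
print(instruction_prompt(task))
print(cot_question_prompt(task))
```

Sent to a model, the first string invites a bare answer while the second elicits visible intermediate steps.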
https://neurodoctor.com/2026/03/20/chain-of-thought-cot-prompting/
#chainofthought #cot #ai #llm #prompt #prompts #prompting #claude #chatgpt #gemini #ericschmidt
-
Large AI training projects are beginning to integrate Chain-of-Thought (CoT)-style reasoning data into their training sets. This helps models grasp logical steps more deeply and improves their problem-solving ability. Source: [Reddit]([link]) #AINN #HocMay #CoT #KhoaHocDuLieu #AIResearch #MachineLearning #NeuralNetworks #TuitionLearning
-
Garden lighting poles are a solution for lighting and decorating outdoor spaces such as gardens, parks, walkways, and resorts. The product features a refined design, a low height of 2–4 m, a wide range of models, and durable materials.
See more: https://congtydenled.com.vn/cot-den-chieu-sang/san-vuon
Contact:
Hotline 0332599699
Address: Số 3D2, KDT Cầu Diễn, Bắc Từ Liêm, Hà Nội
Website: https://congtydenled.com.vn/
#haledstore
#công ty đèn LED
#cot-den-chieu-sang
#cột đèn chiếu sáng
#cotdenchieusangsanvuon
#cot-den-chieu-sang-san-vuon -
"The LEASH method (Logit-Entropy Adaptive Stopping Heuristic) helps optimize Chain-of-Thought (CoT) reasoning in language models. Instead of generating a long explanation, LEASH stops once token probabilities and top-logit improvement stop changing, saving 30–35% of tokens and 27% of time. However, accuracy drops by ~10%. Best suited to fine-tuned models. #AI #ML #NLP #HiệuSuất #HàngĐầu #MachineLearning #SuyLuận #CoT"
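The stopping rule summarized in the post can be sketched in a few lines. This is a guess at the mechanism from the summary alone, with an illustrative plateau test and thresholds; it is not the paper's actual algorithm:

```python
import math

def entropy(probs):
    # Shannon entropy (in nats) of a token probability distribution.
    return -sum(p * math.log(p) for p in probs if p > 0)

def leash_stop_step(top_logits, entropies, window=3, eps=1e-3):
    # Hypothetical LEASH-style rule: stop generating the chain-of-thought
    # once both the top logit and the entropy have changed by less than
    # `eps` over the last `window` steps. `window` and `eps` are
    # illustrative assumptions, not values from the paper.
    for t in range(window, len(top_logits)):
        logit_flat = max(abs(top_logits[t] - top_logits[t - i])
                         for i in range(1, window + 1)) < eps
        ent_flat = max(abs(entropies[t] - entropies[t - i])
                       for i in range(1, window + 1)) < eps
        if logit_flat and ent_flat:
            return t  # both signals have plateaued: stop here
    return len(top_logits)  # never plateaued: keep the full chain

# Toy trace where both signals converge partway through generation.
steps = leash_stop_step(
    top_logits=[1.0, 2.0, 2.5, 2.7, 2.7, 2.7, 2.7],
    entropies=[2.0, 1.5, 1.2, 1.1, 1.1, 1.1, 1.1],
)
print(steps)  # stops at step 6 instead of running all 7
```

In a real decoder the two series would come from the model's per-step logits rather than hand-written lists.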
https://www.reddit.com/r/singularity/comments/1orvj0h/logitentropy_adapt
-
📸 Here is part of the COT – Core Organizing Team of GLAM Wiki 2025, gathered in Lisbon to prepare every detail of this big international gathering! 🌍
From 30 October to 1 November, featuring:
🎤 Panels and presentations
🛠️ Hands-on workshops
💻 Hackathon
🧭 Strategy sessions
🏛️ Cultural tours
👉 Note: registration closes on 30 September.
#GLAMWiki2025 #Lisboa #COT #GLAM #Wikimedia #WikiLovers #WikimediaPortugal #WMPT #OpenKnowledge #OpenCulture
-
[Translation] LLMs and their fragile logic: a new study casts doubt on Chain-of-Thought
A new study by researchers from Arizona State University shows that the celebrated "chain-of-thought" (CoT) reasoning in large language models (LLMs) looks more like a "brittle mirage" than a sign of genuine intelligence. The work continues the tradition of critically analysing the depth of LLM reasoning, but unlike previous studies it offers a distinctive view through the lens of "data distribution", which makes it possible to understand where and why CoT systematically fails.
https://habr.com/ru/companies/technokratos/articles/939072/
#большие_языковые_модели #искусственный_интеллект #ai #llm #cot #chain_of_thoughts
-
How can #AI "reasoning" models be more efficient?
Liu trained them on only chains-of-thought (CoTs) — no prompts.
This seemed to teach models:
- #CoT reasoning
- conditions that trigger longer CoTs
- default to shorter CoTs in other conditions -
https://www.walknews.com/1018389/ India temporarily suspends cotton import tariffs until the end of September | Reuters #AGRI #AMERS #APPA #ASIA #ASXPAC #BD #BMAT #CDTY #COM #COT #CYCP #CYCS #DEST:NOJPWDM #DIP #EMRG #FOREST #GEN #IN #India #JFOR #JLN #MINE #MTAL #MTPIX #NAMER #POL #POTUS #PXP #SASIA #SFTS #TEX #TOPCMB #TOPNWS #TRD #TRF #TRN #TXTL #US #WASH #WEAR #インド
-
The #KI (AI), with its #Denkfähigkeit (thinking ability) via #CoT, is thus at the level of most public discourse:
"This result is linguistically plausible, but logically inconsistent."
https://the-decoder.de/naechste-studie-aeussert-zweifel-an-denkfaehigkeit-von-reasoning-modellen/?utm_source=firefox-newtab-de-de -
Is #chainofthought #Reasoning of #LLMs a Mirage?
"... Our results reveal that #CoT reasoning is a brittle mirage that vanishes when it is pushed beyond training distributions. This work offers a deeper understanding of why and when CoT reasoning fails, emphasizing the ongoing challenge of achieving genuine and generalizable reasoning.
... Our findings reveal that CoT reasoning works effectively when applied to in-distribution or near in-distribution data but becomes fragile and prone to failure even under moderate distribution shifts.
In some cases, LLMs generate fluent yet logically inconsistent reasoning steps. The results suggest that what appears to be structured reasoning can be a mirage, emerging from memorized or interpolated patterns in the training data rather than logical inference.... Together, these findings suggest that LLMs are not principled reasoners but rather sophisticated simulators of reasoning-like text."
-
Anthropic researchers discover the weird AI problem: Why thinking longer makes models dumber
https://zurl.co/IeLIu
#ai #agenticai #cot -
When many of the most influential AI researchers agree on something, it’s worth paying attention - like this paper on Chain-of-Thought monitorability: a fragile but vital path to AI safety. Imagine spotting misbehaviour before it happens. Transparency matters. #AI #ChainOfThought #CoT #AISafety
Chain of Thought Monitorabilit... -
Mastering Reasoning with Code: Chain of Thought Pipelines in Java
Build smarter, traceable AI workflows using LangChain4j, Quarkus, and structured step-by-step logic
https://myfear.substack.com/p/chain-of-thought-java-langchain4j-quarkus
#Java #Quarkus #LangChain4j #CoT #Reasoning -