#substanceusedisorder — Public Fediverse posts
Live and recent posts from across the Fediverse tagged #substanceusedisorder, aggregated by home.social.
-
DATE: May 15, 2026 at 10:00PM
SOURCE: PSYPOST.ORG
** Research quality varies widely from fantastic to small exploratory studies. Please check research methods when conclusions are very important to you. **
-------------------------------------------------
TITLE: Artificial intelligence tools answer addiction questions accurately but lack medical nuance
Artificial intelligence chatbots regularly answer public queries about sensitive health topics such as addiction, providing mostly accurate but highly generalized information. A recent evaluation found that while chatbot responses align broadly with national guidelines, they often lack the situational details necessary for individualized health decisions. These descriptive findings were recently published in the journal Drug and Alcohol Dependence.
Substance use disorder is a chronic medical condition defined by the compulsive use of drugs or alcohol despite adverse physical, social, or emotional consequences. The official medical diagnostic framework views the condition on a spectrum of severity rather than applying a binary label of addiction. This diagnosis reflects changes in brain function that lead to cravings, physical tolerance, and withdrawal symptoms. In the United States alone, nearly fifty million people over the age of twelve met the diagnostic criteria for this condition in recent health surveys.
Despite the availability of medical treatments, care for addiction remains heavily underutilized. Medical providers face institutional limitations, time constraints, and a lack of specific training regarding the condition. At the same time, the social stigma surrounding addiction causes many individuals to avoid seeking formal medical advice out of fear of judgment or legal repercussions.
People often turn to digital platforms as an initial, private step to gather health information. Chatbots offer immediate, anonymous responses without the perceived judgment of a clinical environment. However, the quality of this digitally generated medical guidance is not always reliable, especially for deeply stigmatized behavioral health conditions.
To better understand how these systems perform, researchers designed a study to evaluate the medical accuracy of artificial intelligence responses regarding addiction. Lead author Morgan Decker, a medical student, and senior author Lea Sacca, a public health researcher, conducted the work alongside a team at Florida Atlantic University. They collaborated with addiction medicine physicians and data scientists to assess the digital guidance.
The research team focused on fourteen frequently asked questions about substance use disorders. To build this list, they first asked the chatbot to generate a list of common questions that adults have about diagnosis, treatment, and recovery. The team then cross-referenced these outputs with actual frequently asked questions from major health organizations.
The benchmark organizations included the Centers for Disease Control and Prevention and the Substance Abuse and Mental Health Services Administration. The researchers also incorporated guidelines from the National Institute on Drug Abuse and the American Society of Addiction Medicine. This ensured the artificial intelligence answers would be measured against established best practices in the medical field.
Researchers entered the fourteen finalized questions into the software to gather its responses. They specifically utilized the updated fifth version of the application. To standardize the outputs, they applied settings that limit the model’s randomness, ensuring the answers remained consistent and factual rather than conversational.
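The study does not publish its exact prompt configuration; as an illustrative sketch only (the model identifier, question wording, and parameter names assume an OpenAI-style chat API and are hypothetical, not taken from the paper), a deterministic request might be assembled like this:

```python
# Illustrative sketch: building request parameters that minimize randomness
# so repeated runs return consistent answers. No network call is made here;
# the model name and question text are hypothetical placeholders.

QUESTIONS = [
    "What are the signs and symptoms of a substance use disorder?",
    "Does a relapse mean that treatment has failed?",
    # ... the study used fourteen finalized questions in total
]

def build_request(question: str) -> dict:
    """Return chat-completion parameters with randomness limited."""
    return {
        "model": "gpt-5",   # hypothetical identifier for the "fifth version"
        "temperature": 0,   # limit randomness so outputs stay consistent
        "messages": [{"role": "user", "content": question}],
    }

requests = [build_request(q) for q in QUESTIONS]
```

Setting the temperature to zero is the standard way to make a chat model's output nearly deterministic, which is presumably what the article means by settings that keep answers "consistent and factual rather than conversational."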
Pairs of evaluators independently reviewed each generated answer in a blinded fashion. The rating pairs intentionally mixed training levels, pairing students with board-certified addiction specialists. They scored the responses on a four-point scale based on accuracy, precision, and appropriateness for a general audience. Any disagreements between the rater pairs were resolved through discussions with an additional senior expert.
The highest score on the scale indicated an excellent response requiring no further explanation. The next two tiers represented satisfactory answers that needed either minimal or moderate clinical explanation. The lowest score was reserved for unsatisfactory answers that contained incorrect or dangerously misleading information based on contemporary medical practices.
The evaluators found that none of the answers provided by the software were unsatisfactory. Three of the fourteen responses received an excellent rating. Nine answers were deemed satisfactory but required minimal elaboration. Two answers were satisfactory but needed moderate clinical elaboration.
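The rating workflow and the reported distribution can be sketched as a small tally. This is illustrative only: the scale labels and the counts (3 excellent, 9 minimal, 2 moderate, 0 unsatisfactory, out of 14) mirror the article, while the adjudication helper and its inputs are hypothetical.

```python
from collections import Counter

# Four-point scale described in the article, best to worst.
SCALE = ["excellent", "minimal elaboration", "moderate elaboration", "unsatisfactory"]

def adjudicate(rater_a: str, rater_b: str, senior: str) -> str:
    """Each answer is scored by a blinded pair of raters; when the pair
    disagrees, a senior expert's rating is used instead (hypothetical rule)."""
    return rater_a if rater_a == rater_b else senior

# Hypothetical per-answer ratings arranged to reproduce the reported distribution.
final_ratings = (["excellent"] * 3
                 + ["minimal elaboration"] * 9
                 + ["moderate elaboration"] * 2)

counts = Counter(final_ratings)
```

Tallied this way, all fourteen answers land in the top three tiers, matching the article's report that no response was rated unsatisfactory.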
The artificial intelligence performed best on straightforward definitional prompts. When asked about the signs and symptoms of a substance use disorder, it gave a highly accurate list that matched expert guidelines. It correctly noted cravings, withdrawal, and the inability to control use as primary indicators.
Another highly rated response addressed whether a relapse represents a failure. The software accurately emphasized that an eventual return to use does not mean a medical treatment has failed. Instead, it framed relapse as a normal part of the recovery process that might require an adjustment in medical strategy, matching the empathetic tone recommended by public health officials.
Many answers provided a broad summary but missed nuanced clinical examples. When asked about the risks of untreated addiction, the software correctly listed overdose, liver damage, and social isolation. However, it failed to mention the increased risks of various cancers and infectious diseases, which are major complications recognized by public health authorities.
In evaluating treatment options, the software accurately mentioned behavioral therapies and support groups. Yet it failed to identify specific medical therapies approved by the federal government for alcohol use disorder. It also provided vague advice about how to help a loved one, advising against enabling behaviors without explaining what enabling actually looks like in practice.
The software also fell short of providing actionable resources when asked where to seek treatment. It accurately identified primary care doctors, mental health professionals, and anonymous support groups as avenues for help. Unfortunately, it completely omitted centralized, government-supported tools like national helplines or specific website directories that provide immediate, confidential assistance based on geographic location.
More complex medical scenarios revealed greater gaps in the knowledge base of the software. When asked about managing withdrawal, the application correctly noted that physical symptoms occur when a dependent person stops using a substance. Yet it did not warn users that withdrawing from certain substances like alcohol or benzodiazepines can be fatal and requires immediate medical supervision.
The software also required moderate elaboration regarding treatment duration. It accurately stated that recovery timelines vary widely based on individual needs and the severity of the condition. While this is true, health organizations typically recommend a minimum of three months in a treatment program to achieve better recovery outcomes, a benchmark the software failed to mention.
The researchers point out several limitations in their methodology. The study relied on a subjective evaluation process by a specific group of medical professionals, and other clinical experts might grade the nuanced responses differently. Additionally, the researchers tested only a small sample of fourteen questions, which limits how broadly the results can be generalized to the software's capabilities.
Using an artificial intelligence program to generate the initial list of questions may have introduced circular bias into the experiment. The software likely performs better on prompts that match its own structured, rational logic. Real patients often write prompts that are highly emotional, ambiguous, or poorly worded, which could generate very different guidance.
The researchers did not test how actual patients interpret or apply the digital advice in real life. Health literacy varies widely among the public. A scientifically accurate but highly generalized paragraph could still lead to confusion for someone unfamiliar with medical terminology, especially if they try to manage an addiction without a doctor.
Ethical concerns also surround the use of private medical data by technology companies. Substance use disorders often carry legal risks, and poorly protected digital searches could compromise patient privacy. The phrasing used by chatbots could also accidentally reinforce social prejudices if the software relies on biased training data.
Future studies should explore a wider variety of real-world patient queries drawn from online forums or clinic data. Researchers also recommend evaluating competing digital platforms to see if different corporate models offer better medical accuracy. Until these systems improve, human medical professionals remain necessary to contextualize digital health information safely.
The study, “Descriptive content analysis assessment of ChatGPT responses to substance use disorder treatment questions compared to National health guidelines,” was authored by Morgan Decker, Christine Kamm, Sara Burgoa, Meera Rao, Maria Mejia, Christine Ramdin, Adrienne Dean, Melodie Nasr, Lewis S. Nelson, and Lea Sacca.
-------------------------------------------------
DAILY EMAIL DIGEST: Email [email protected] -- no subject or message needed.
Private, vetted email list for mental health professionals: https://www.clinicians-exchange.org
Unofficial Psychology Today Xitter to toot feed at Psych Today Unofficial Bot @PTUnofficialBot
NYU Information for Practice puts out 400-500 good-quality health-related research posts per week, but it's too much for many people, so that bot is limited to just subscribers. You can read it or subscribe at @PsychResearchBot
Since 1991 The National Psychologist has focused on keeping practicing psychologists current with news, information and items of interest. Check them out for more free articles, resources, and subscription information: https://www.nationalpsychologist.com
EMAIL DAILY DIGEST OF RSS FEEDS -- SUBSCRIBE: http://subscribe-article-digests.clinicians-exchange.org
READ ONLINE: http://read-the-rss-mega-archive.clinicians-exchange.org
It's primitive... but it works... mostly...
-------------------------------------------------
#psychology #counseling #socialwork #psychotherapy @psychotherapist @psychotherapists @psychology @socialpsych @socialwork @psychiatry #mentalhealth #psychiatry #healthcare #depression #psychotherapist #ArtificialIntelligence #SubstanceUseDisorder #AddictionMedicine #HealthTechEthics #DigitalHealth #PublicHealthGuidelines #MedicalAccuracy #ChatbotsInHealthcare #AddictionAwareness #TreatmentAndRecovery
-
DATE: May 15, 2026 at 10:00PM
SOURCE: PSYPOST.ORG** Research quality varies widely from fantastic to small exploratory studies. Please check research methods when conclusions are very important to you. **
-------------------------------------------------TITLE: Artificial intelligence tools answer addiction questions accurately but lack medical nuance
Artificial intelligence chatbots regularly answer public queries about sensitive health topics such as addiction, providing mostly accurate but highly generalized information. A recent evaluation found that while chatbot responses align broadly with national guidelines, they often lack the situational details necessary for individualized health decisions. These descriptive findings were recently published in the journal Drug and Alcohol Dependence.
Substance use disorder is a chronic medical condition defined by the compulsive use of drugs or alcohol despite adverse physical, social, or emotional consequences. The official medical diagnostic framework views the condition on a spectrum of severity rather than applying a binary label of addiction. This diagnosis reflects changes in brain function that lead to cravings, physical tolerance, and withdrawal symptoms. In the United States alone, nearly fifty million people over the age of twelve met the diagnostic criteria for this condition in recent health surveys.
Despite the availability of medical treatments, care for addiction remains heavily underutilized. Medical providers face institutional limitations, time constraints, and a lack of specific training regarding the condition. At the same time, the social stigma surrounding addiction causes many individuals to avoid seeking formal medical advice out of fear of judgment or legal repercussions.
People often turn to digital platforms as an initial, private step to gather health information. Chatbots offer immediate, anonymous responses without the perceived judgment of a clinical environment. However, the quality of this digitally generated medical guidance is not always reliable, especially for deeply stigmatized behavioral health conditions.
To better understand how these systems perform, researchers designed a study to evaluate the medical accuracy of artificial intelligence responses regarding addiction. Lead author Morgan Decker, a medical student, and senior author Lea Sacca, a public health researcher, conducted the work alongside a team at Florida Atlantic University. They collaborated with addiction medicine physicians and data scientists to assess the digital guidance.
The research team focused on fourteen frequently asked questions about substance use disorders. To build this list, they first asked the chatbot to generate a list of common questions that adults have about diagnosis, treatment, and recovery. The team then cross-referenced these outputs with actual frequently asked questions from major health organizations.
The benchmark organizations included the Centers for Disease Control and Prevention and the Substance Abuse and Mental Health Services Administration. The researchers also incorporated guidelines from the National Institute on Drug Abuse and the American Society of Addiction Medicine. This ensured the artificial intelligence answers would be measured against established best practices in the medical field.
Researchers entered the fourteen finalized questions into the software to gather its responses. They specifically utilized the updated fifth version of the application. To standardize the outputs, they applied settings that limit the model’s randomness, ensuring the answers remained consistent and factual rather than conversational.
Pairs of evaluators independently reviewed each generated answer in a blinded fashion. The rating pairs intentionally mixed training levels, pairing students with board-certified addiction specialists. They scored the responses on a four-point scale based on accuracy, precision, and appropriateness for a general audience. Any disagreements between the rater pairs were resolved through discussions with an additional senior expert.
The highest score on the scale indicated an excellent response requiring no further explanation. The next two tiers represented satisfactory answers that needed either minimal or moderate clinical explanation. The lowest score was reserved for unsatisfactory answers that contained incorrect or dangerously misleading information based on contemporary medical practices.
The evaluators found that none of the answers provided by the software were unsatisfactory. Three of the fourteen responses received an excellent rating. Nine answers were deemed satisfactory but required minimal elaboration. Two answers were satisfactory but needed moderate clinical elaboration.
The artificial intelligence performed best on straightforward definitional prompts. When asked about the signs and symptoms of a substance use disorder, it gave a highly accurate list that matched expert guidelines. It correctly noted cravings, withdrawal, and the inability to control use as primary indicators.
Another highly rated response addressed whether a relapse represents a failure. The software accurately emphasized that an eventual return to use does not mean a medical treatment has failed. Instead, it framed relapse as a normal part of the recovery process that might require an adjustment in medical strategy, matching the empathetic tone recommended by public health officials.
Many answers provided a broad summary but missed nuanced clinical examples. When asked about the risks of untreated addiction, the software correctly listed overdose, liver damage, and social isolation. However, it failed to mention the increased risks of various cancers and infectious diseases, which are major complications recognized by public health authorities.
In evaluating treatment options, the software accurately mentioned behavioral therapies and support groups. Yet it failed to identify specific medical therapies approved by the federal government for alcohol use disorder. It also provided vague advice about how to help a loved one, advising against enabling behaviors without explaining what enabling actually looks like in practice.
The software also fell short of providing actionable resources when asked where to seek treatment. It accurately identified primary care doctors, mental health professionals, and anonymous support groups as avenues for help. Unfortunately, it completely omitted centralized, government-supported tools like national helplines or specific website directories that provide immediate, confidential assistance based on geographic location.
More complex medical scenarios revealed greater gaps in the knowledge base of the software. When asked about managing withdrawal, the application correctly noted that physical symptoms occur when a dependent person stops using a substance. Yet it did not warn users that withdrawing from certain substances like alcohol or benzodiazepines can be fatal and requires immediate medical supervision.
The software also required moderate elaboration regarding treatment duration. It accurately stated that recovery timelines vary widely based on individual needs and the severity of the condition. While true, health organizations typically recommend a minimum of three months in a treatment program to achieve better recovery outcomes, a benchmark the software failed to mention.
The researchers point out several limitations in their methodology. The study relied on a subjective evaluation process by a specific group of medical professionals. Other clinical experts might grade the nuanced responses differently. Additionally, the researchers only tested a small sample of fourteen questions, which limits how broadly the results can summarize the capabilities of the software.
Using an artificial intelligence program to generate the initial list of questions may have introduced circular bias into the experiment. The software likely performs better on prompts that match its own structured, rational logic. Real patients often write prompts that are highly emotional, ambiguous, or poorly worded, which could generate very different guidance.
The researchers did not test how actual patients interpret or apply the digital advice in real life. Health literacy varies widely among the public. A scientifically accurate but highly generalized paragraph could still lead to confusion for someone unfamiliar with medical terminology, especially if they try to manage an addiction without a doctor.
Ethical concerns also surround the use of private medical data by technology companies. Substance use disorders often carry legal risks, and poorly protected digital searches could compromise patient privacy. The phrasing used by chatbots could also accidentally reinforce social prejudices if the software relies on biased training data.
Future studies should explore a wider variety of real-world patient queries drawn from online forums or clinic data. Researchers also recommend evaluating competing digital platforms to see if different corporate models offer better medical accuracy. Until these systems improve, human medical professionals remain necessary to contextualize digital health information safely.
The study, “Descriptive content analysis assessment of ChatGPT responses to substance use disorder treatment questions compared to National health guidelines,” was authored by Morgan Decker, Christine Kamm, Sara Burgoa, Meera Rao, Maria Mejia, Christine Ramdin, Adrienne Dean, Melodie Nasr, Lewis S. Nelson, and Lea Sacca.
-------------------------------------------------
DAILY EMAIL DIGEST: Email [email protected] -- no subject or message needed.
Private, vetted email list for mental health professionals: https://www.clinicians-exchange.org
Unofficial Psychology Today Xitter to toot feed at Psych Today Unofficial Bot @PTUnofficialBot
NYU Information for Practice puts out 400-500 good quality health-related research posts per week but its too much for many people, so that bot is limited to just subscribers. You can read it or subscribe at @PsychResearchBot
Since 1991 The National Psychologist has focused on keeping practicing psychologists current with news, information and items of interest. Check them out for more free articles, resources, and subscription information: https://www.nationalpsychologist.com
EMAIL DAILY DIGEST OF RSS FEEDS -- SUBSCRIBE: http://subscribe-article-digests.clinicians-exchange.org
READ ONLINE: http://read-the-rss-mega-archive.clinicians-exchange.org
It's primitive... but it works... mostly...
-------------------------------------------------
#psychology #counseling #socialwork #psychotherapy @psychotherapist @psychotherapists @psychology @socialpsych @socialwork @psychiatry #mentalhealth #psychiatry #healthcare #depression #psychotherapist #ArtificialIntelligence #SubstanceUseDisorder #AddictionMedicine #HealthTechEthics #DigitalHealth #PublicHealthGuidelines #MedicalAccuracy #ChatbotsInHealthcare # addictionAWareness #TreatmentAndRecovery
-
DATE: May 15, 2026 at 10:00PM
SOURCE: PSYPOST.ORG** Research quality varies widely from fantastic to small exploratory studies. Please check research methods when conclusions are very important to you. **
-------------------------------------------------TITLE: Artificial intelligence tools answer addiction questions accurately but lack medical nuance
Artificial intelligence chatbots regularly answer public queries about sensitive health topics such as addiction, providing mostly accurate but highly generalized information. A recent evaluation found that while chatbot responses align broadly with national guidelines, they often lack the situational details necessary for individualized health decisions. These descriptive findings were recently published in the journal Drug and Alcohol Dependence.
Substance use disorder is a chronic medical condition defined by the compulsive use of drugs or alcohol despite adverse physical, social, or emotional consequences. The official medical diagnostic framework views the condition on a spectrum of severity rather than applying a binary label of addiction. This diagnosis reflects changes in brain function that lead to cravings, physical tolerance, and withdrawal symptoms. In the United States alone, nearly fifty million people over the age of twelve met the diagnostic criteria for this condition in recent health surveys.
Despite the availability of medical treatments, care for addiction remains heavily underutilized. Medical providers face institutional limitations, time constraints, and a lack of specific training regarding the condition. At the same time, the social stigma surrounding addiction causes many individuals to avoid seeking formal medical advice out of fear of judgment or legal repercussions.
People often turn to digital platforms as an initial, private step to gather health information. Chatbots offer immediate, anonymous responses without the perceived judgment of a clinical environment. However, the quality of this digitally generated medical guidance is not always reliable, especially for deeply stigmatized behavioral health conditions.
To better understand how these systems perform, researchers designed a study to evaluate the medical accuracy of artificial intelligence responses regarding addiction. Lead author Morgan Decker, a medical student, and senior author Lea Sacca, a public health researcher, conducted the work alongside a team at Florida Atlantic University. They collaborated with addiction medicine physicians and data scientists to assess the digital guidance.
The research team focused on fourteen frequently asked questions about substance use disorders. To build this list, they first asked the chatbot to generate a list of common questions that adults have about diagnosis, treatment, and recovery. The team then cross-referenced these outputs with actual frequently asked questions from major health organizations.
The benchmark organizations included the Centers for Disease Control and Prevention and the Substance Abuse and Mental Health Services Administration. The researchers also incorporated guidelines from the National Institute on Drug Abuse and the American Society of Addiction Medicine. This ensured the artificial intelligence answers would be measured against established best practices in the medical field.
Researchers entered the fourteen finalized questions into the software to gather its responses. They specifically utilized the updated fifth version of the application. To standardize the outputs, they applied settings that limit the model’s randomness, ensuring the answers remained consistent and factual rather than conversational.
Pairs of evaluators independently reviewed each generated answer in a blinded fashion. The rating pairs intentionally mixed training levels, pairing students with board-certified addiction specialists. They scored the responses on a four-point scale based on accuracy, precision, and appropriateness for a general audience. Any disagreements between the rater pairs were resolved through discussions with an additional senior expert.
The highest score on the scale indicated an excellent response requiring no further explanation. The next two tiers represented satisfactory answers that needed either minimal or moderate clinical explanation. The lowest score was reserved for unsatisfactory answers that contained incorrect or dangerously misleading information based on contemporary medical practices.
The evaluators found that none of the answers provided by the software were unsatisfactory. Three of the fourteen responses received an excellent rating. Nine answers were deemed satisfactory but required minimal elaboration. Two answers were satisfactory but needed moderate clinical elaboration.
The artificial intelligence performed best on straightforward definitional prompts. When asked about the signs and symptoms of a substance use disorder, it gave a highly accurate list that matched expert guidelines. It correctly noted cravings, withdrawal, and the inability to control use as primary indicators.
Another highly rated response addressed whether a relapse represents a failure. The software accurately emphasized that an eventual return to use does not mean a medical treatment has failed. Instead, it framed relapse as a normal part of the recovery process that might require an adjustment in medical strategy, matching the empathetic tone recommended by public health officials.
Many answers provided a broad summary but missed nuanced clinical examples. When asked about the risks of untreated addiction, the software correctly listed overdose, liver damage, and social isolation. However, it failed to mention the increased risks of various cancers and infectious diseases, which are major complications recognized by public health authorities.
In evaluating treatment options, the software accurately mentioned behavioral therapies and support groups. Yet it failed to identify specific medical therapies approved by the federal government for alcohol use disorder. It also provided vague advice about how to help a loved one, advising against enabling behaviors without explaining what enabling actually looks like in practice.
The software also fell short of providing actionable resources when asked where to seek treatment. It accurately identified primary care doctors, mental health professionals, and anonymous support groups as avenues for help. Unfortunately, it completely omitted centralized, government-supported tools like national helplines or specific website directories that provide immediate, confidential assistance based on geographic location.
More complex medical scenarios revealed greater gaps in the knowledge base of the software. When asked about managing withdrawal, the application correctly noted that physical symptoms occur when a dependent person stops using a substance. Yet it did not warn users that withdrawing from certain substances like alcohol or benzodiazepines can be fatal and requires immediate medical supervision.
The software also required moderate elaboration regarding treatment duration. It accurately stated that recovery timelines vary widely based on individual needs and the severity of the condition. While true, health organizations typically recommend a minimum of three months in a treatment program to achieve better recovery outcomes, a benchmark the software failed to mention.
The researchers point out several limitations in their methodology. The study relied on a subjective evaluation process by a specific group of medical professionals. Other clinical experts might grade the nuanced responses differently. Additionally, the researchers only tested a small sample of fourteen questions, which limits how broadly the results can summarize the capabilities of the software.
Using an artificial intelligence program to generate the initial list of questions may have introduced circular bias into the experiment. The software likely performs better on prompts that match its own structured, rational logic. Real patients often write prompts that are highly emotional, ambiguous, or poorly worded, which could generate very different guidance.
The researchers did not test how actual patients interpret or apply the digital advice in real life. Health literacy varies widely among the public. A scientifically accurate but highly generalized paragraph could still lead to confusion for someone unfamiliar with medical terminology, especially if they try to manage an addiction without a doctor.
Ethical concerns also surround the use of private medical data by technology companies. Substance use disorders often carry legal risks, and poorly protected digital searches could compromise patient privacy. The phrasing used by chatbots could also accidentally reinforce social prejudices if the software relies on biased training data.
Future studies should explore a wider variety of real-world patient queries drawn from online forums or clinic data. Researchers also recommend evaluating competing digital platforms to see if different corporate models offer better medical accuracy. Until these systems improve, human medical professionals remain necessary to contextualize digital health information safely.
The study, “Descriptive content analysis assessment of ChatGPT responses to substance use disorder treatment questions compared to National health guidelines,” was authored by Morgan Decker, Christine Kamm, Sara Burgoa, Meera Rao, Maria Mejia, Christine Ramdin, Adrienne Dean, Melodie Nasr, Lewis S. Nelson, and Lea Sacca.
-------------------------------------------------
DAILY EMAIL DIGEST: Email [email protected] -- no subject or message needed.
Private, vetted email list for mental health professionals: https://www.clinicians-exchange.org
Unofficial Psychology Today Xitter to toot feed at Psych Today Unofficial Bot @PTUnofficialBot
NYU Information for Practice puts out 400-500 good quality health-related research posts per week, but it's too much for many people, so that bot is limited to just subscribers. You can read it or subscribe at @PsychResearchBot
Since 1991 The National Psychologist has focused on keeping practicing psychologists current with news, information and items of interest. Check them out for more free articles, resources, and subscription information: https://www.nationalpsychologist.com
EMAIL DAILY DIGEST OF RSS FEEDS -- SUBSCRIBE: http://subscribe-article-digests.clinicians-exchange.org
READ ONLINE: http://read-the-rss-mega-archive.clinicians-exchange.org
It's primitive... but it works... mostly...
-------------------------------------------------
#psychology #counseling #socialwork #psychotherapy @psychotherapist @psychotherapists @psychology @socialpsych @socialwork @psychiatry #mentalhealth #psychiatry #healthcare #depression #psychotherapist #ArtificialIntelligence #SubstanceUseDisorder #AddictionMedicine #HealthTechEthics #DigitalHealth #PublicHealthGuidelines #MedicalAccuracy #ChatbotsInHealthcare #AddictionAwareness #TreatmentAndRecovery
-
https://www.europesays.com/ie/405314/ Khloé Kardashian Says She Punched Lamar Odom Over Drugs #Addiction #Celebrities #Éire #Entertainment #IE #Ireland #KhloéKardashian #LamarOdom #SubstanceUseDisorder
-
https://www.europesays.com/ie/398483/ Study finds addiction risk tied to reward and impulse genes #Addiction #Alcohol #Brain #Cannabis #Drugs #Éire #Gene #Genes #Genetic #Genome #Genomic #Hyperactivity #IE #Ireland #MedicalSchool #MentalHealth #Metabolism #Nicotine #Opioids #Psychiatry #Research #Science #SubstanceUseDisorder #Tobacco
-
Former Celtics guard Chris Herren on recovery after addiction https://www.rawchili.com/nba/664013/ #addiction #AddictionRecovery #AshNadkarni #Basketball #Boston #BostonCeltics #BostonCeltics #career #Celtics #challenge #ChrisHerren #Herren #HerrenWellness #Jersey #life #MassGeneralBrigham #MassachusettsAthletes #MentalHealth #MuchShame #NBA #NegativeStigma #OpioidCrisis #part #People #recovery #RecoveryAdvocacy #seekonk #Story #student #SubstanceUseDisorder
-
Large Veterans Study Shows How GLP-1 Weight Loss Drugs Could Treat Addiction
Credit: MotivateBootCamp. Drugs originally designed to treat diabetes and obesity are reshaping medicine in unforeseen ways. Their impact…
#NewsBeep #News #Health #Addictionresearch #alcoholusedisorder #brainrewardsystem #CA #Canada #diabetesdrugs #dopamine #glp-1drugs #metabolichealth #Neuroscience #obesitytreatment #opioidaddiction #Semaglutide #substanceusedisorder #VAstudy
https://www.newsbeep.com/ca/527512/ -
Founder seeks new location after Narcan box taken down
Steps of Hope Iowa has placed 60 boxes across 25 Iowa co…
#NewsBeep #News #Medication #AnnBreeding #area #Box #breeding #BroadwayAvenue #casey #Casey's #community #DesMoines #founder #Health #iowa #KCCI #Lawenforcement #Life #month #Naloxone #narcan #narcanbox #nonprofit #Northeast14thStreet #Opioid #part #PolkCountySheriff #StepsofHopeIowa #stigma #SubstanceUseDisorder #teammember #UK #UnitedKingdom
https://www.newsbeep.com/uk/423593/ -
US seizes oil tanker off coast of Venezuela.
https://www.abc.net.au/news/2025-12-11/us-seizes-oil-tanker-off-coast-of-venezuela/106128382
#FossilFuels #DrugAddiction #SubstanceUseDisorder -
The Health Risks of Drinking Alcohol | Johns Hopkins https://www.diningandcooking.com/2415790/the-health-risks-of-drinking-alcohol-johns-hopkins/ #alcohol #cancer #Drinking #francais #france #French #FrenchParadox #MentalHealth #SubstanceAbuse #SubstanceUseDisorder
-
Beacon founder Holly Howat joins state office of behavioral health
Beacon Community Connections founder Holly Howat has joined the Louisiana Department of Health as its new assistant secretary of behavioral health.
Howat left her role as executive director of Beacon this summer after nearly a decade of leading the social care nonprofit. There, she spearheaded programs that included addressing the social determinants of health and peer support for those struggling with opioid use disorder. She also served as the first executive director of the Lafayette Parish Criminal Justice Coordinating Committee prior to starting Beacon.
During a panel at the South Louisiana Community Health Summit this week, organized by Beacon, Howat said her aim was to drive the expansion of programs such as Certified Community Behavioral Health Clinics (CCBHC), and to seek community input on both needs and potential solutions for behavioral health issues.
“We really want to hear from people, because that’s how we’re going to know whether what we’re trying to fix is working or not,” she said.
-
A short video about our latest feature on opioid settlement funding in #Massachusetts ...
This Company Paid Massachusetts Opioid Settlement Funds, And Has Had Millions In State Contracts Since
-
Marquette County Board hears concerns over proposed changes to mental health PIHPs
MARQUETTE, Mich. (WLUC) – On Tuesday, the Marquette County Board heard concerns over the state’s plan to privatize…
#NewsBeep #News #Mentalhealth #AU #Australia #departmentofhealthandhumanservices #funding #Health #localcontrol #marquette #marquettecounty #marquettecountyboard #MDHHS #mentalillness #MentalHealth #pathways #pathwayscommunitymentalhealth #Substanceusedisorder
https://www.newsbeep.com/au/128126/ -
Marquette County Board hears concerns over proposed changes to mental health PIHPs
MARQUETTE, Mich. (WLUC) – On Tuesday, the Marquette County Board heard concerns over th…
#NewsBeep #News #US #USA #UnitedStates #UnitedStatesOfAmerica #Mentalhealth #departmentofhealthandhumanservices #Funding #Health #localcontrol #marquette #marquettecounty #marquettecountyboard #MDHHS #mentalillness #MentalHealth #pathways #pathwayscommunitymentalhealth #substanceusedisorder
https://www.newsbeep.com/us/132621/ -
https://www.sciencedirect.com/science/article/pii/S2773021225000161 Qualitative analysis of a patient's experience of ketamine-assisted psychotherapy for substance use disorder: Empirical synergies with twelve-step programs (Petrovitch, et al, 2025) #ketamine #substanceusedisorder #mentalhealth #recovery #ketaminetherapy #ketamineassistedpsychotherapy #ketamineassistedtherapy #psychedelic #psychedelics #addiction #12step #psychedelicassistedpsychotherapy
-
Revolutionizing Addiction Treatment: The Science of Hope
#AddictionTreatment #Psychedelics #Empathogens #GLP1Agonists #MentalHealth #Neuroscience #Recovery #Hope #TransformativeHealing #BrainHealth #InnovativeScience #SubstanceUseDisorder #HealingJourney #Mindfulness #AddictionRecovery
-
#Incarceration Should Not be a #DeathSentence for Individuals Who Use #Opioids
New litigation centered on increasing access to #substanceusedisorder treatment in #jails and #prisons is helping to reduce #mortalityrates among #incarcerated individuals.
-
People whose parents suffered from substance use disorders are more likely to develop psychiatric disorders https://www.psypost.org/people-whose-parents-suffered-from-substance-use-disorders-are-more-likely-to-develop-psychiatric-disorders/?utm_source=dlvr.it&utm_medium=mastodon #SubstanceUseDisorder #PsychiatricDisorders #MentalHealth #ParentalInfluence #RiskFactors
-
Anger more strongly linked to alcohol and tobacco use than illicit drug use https://www.psypost.org/anger-more-strongly-linked-to-alcohol-and-tobacco-use-than-illicit-drug-use/?utm_source=dlvr.it&utm_medium=mastodon #AngerManagement #AlcoholAbuse #TobaccoAddiction #SubstanceUseDisorder #MentalHealth
-
Early onset of nonmedical prescription stimulant use (such as taking Adderall, Ritalin, or Concerta without a prescription) appears to be a signal of increased risk of cocaine use among US adolescents.
-
Eby and Rustad Agree on Involuntary Treatment. Experts Say They’re Wrong | The Tyee
https://thetyee.ca/News/2024/09/16/Eby-Rustad-Agree-Involuntary-Treatment/?utm_source=bluesky&utm_medium=social&utm_campaign=editorial#BrainInjury
#Addiction
#MentalIllness
#MentalHealth
#InvoluntaryTreatment
#ForcedTreatment
#BritishColumbia
#Conservatives
#SubstanceAbuse
#SubstanceUseDisorder
#MemorialUniOfNewfoundland -
https://www.nature.com/articles/s41586-024-07804-3 Structure of the human dopamine transporter in complex with cocaine (Nielsen, et al, 2024) #substanceUseDisorder #addiction #dopamine #cocaine
-
https://pubmed.ncbi.nlm.nih.gov/39101571/ Off-Label Use of Lamotrigine and Naltrexone in the Treatment of Ketamine Use Disorder: A Case Report #ketamine #substanceUseDisorder #lamotrigine #naltrexone #addiction #substanceuse #recovery
-
https://link.springer.com/article/10.1007/s40429-021-00401-8?fromPaywallRec=true Novel Treatment Approaches for Substance Use Disorders: Therapeutic Use of Psychedelics and the Role of Psychotherapy (Koslowski, et al, 2022) #cognitivebehavioraltherapy #psychotherapy #psychedelic #psychedelics #psychedelictherapy #psychedelicassistedpsychotherapy #fda #mdma #ketamine #substanceUseDisorder #mentalhealth #neuroscience #psychedelicresearch
-
https://www.appa-us.org/ #psychedelic #psychedelics #psychedelicpractitioner #plantmedicine #psychedelicresearch #mentalhealth #psychedelicmedicine #ketamine #mdma #lsd #psilocybin #dmt #APPA #NeuroScience #counseling #psychiatry #palliativecare #hospice #grief #depression #anxiety #substanceUseDisorder #harmreduction
-
Oregon’s #drugdecriminalization gets poor marks
In #Oregon, decriminalization of personal-use amounts of #drugs in 2020 was supposed to send hundreds of millions of dollars of marijuana tax revenues into #drugtreatment and harm reduction programs. But that hasn’t yet translated into an improved care network for a state with the 2nd highest rate of #substanceusedisorder in the US and ranked 50th for access to treatment.
https://apnews.com/article/health-oregon-8629d6e62bff151afbbdb3a37c2206ae
-
Experiencing negative substance use consequences may not deter future use, according to a new study of alcohol and cannabis use.
https://pubmed.ncbi.nlm.nih.gov/37650855/
#DryJanuary #SubstanceUseDisorder #Health #PublicHealth @demography @sociology
-
https://brainpizza.substack.com/p/altered-states-rewiring-cravings
Altered States: Rewiring cravings (Part two) - Semaglutide's potential impact on substance use disorders and brain plasticity
#SemaglutideImpact #BrainPlasticity #SubstanceUseDisorder #AddictionTreatment #HealthResearch #BrainHealth #MedicalAdvancements #HealthcareInnovation #AlteredStatesSeries #HealthAndWellness #Neuroscience #DrugEffects #SUDTreatment #RewiringCravings
-
Some Addiction Psychologists Dislike “Preaddiction” Label https://researchinenglish.com/article/2023.9/examining-the-proposed-and-in-39kqreol
#addiction #psychology #substanceusedisorder #psych #alcoholism #preaddiction
-
CW: anti-drug user sentiment
Even so-called progressive people will take for granted that they can just casually refer to being an addict as a character defect. Being a bad father is negative; having Substance Use Disorder is morally neutral.
-
#AliceMatone et al. provide an overview on changes in #alcohol and #SubstanceUseDisorder criteria in #ICD11. These represent an opportunity to fill the therapeutic gap and increase the coverage of substance use disorders.
https://doi.org/10.32872/cpe.9539 -
Wearable Sensor for Detecting Substance Use Disorder - Oftentimes, the feature set for our typical fitness-focused wearables feels a bit ... - https://hackaday.com/2021/12/09/wearable-sensor-for-detecting-substance-use-disorder/ #supervisedinjectionfacility #artificialintelligence #substanceusedisorder #machinelearning #opioidoverdose #wearablehacks #medicalhacks #empatica #e4
-
Researchers Use Wearable to Detect and Reverse Opioid Overdoses in Real-Time - Opioid overdose-related deaths have unfortunately been increasing over the last fe... - https://hackaday.com/2021/12/04/researchers-use-wearable-to-detect-and-reverse-opioid-overdoses-in-real-time/ #supervisedinjectionfacility #substanceusedisorder #wearablehacks #medicalhacks #covid-19 #opioid