#chatbotgpt — Public Fediverse posts
Live and recent posts from across the Fediverse tagged #chatbotgpt, aggregated by home.social.
-
Private, vetted email list for mental health professionals: https://www.clinicians-exchange.org
Open Mastodon instance for all mental health workers: https://mastodon.clinicians-exchange.org
.
*Warning on AI and Data in mental health: ‘Patients are drowning’*
https://www.digitalhealth.net/2023/10/warning-on-ai-and-data-in-mental-health-patients-are-drowning/

I'm always a bit skeptical of presentations from tech company CEOs on how their product areas are necessary in the mental health field. That said, this article has a few good points:
"Umar Nizamani, CEO, International, at NiceDay, emphasised that AI will inevitably become an essential tool in mental health care: 'I am very confident AI will not replace therapists – but therapists using AI will replace therapists not using AI.'"
I am beginning to think this also -- for better or worse. I took a VERY fast 60-second look at NiceDay and it appears to be another all-encompassing EHR, but with a strong emphasis on data. Lots of tools and questionnaires and attractive graphs for therapists to monitor symptoms. (I need to take a longer look later.) So data-driven could be very good, if it does not crowd out the human touch.

"Nizamani said there had been suicides caused by AI, citing the case of a person in Belgium who died by suicide after downloading an anxiety app. The individual was anxious about climate change. The app suggested 'if you did not exist' it would help the planet, said Nizamani."
YIKES... So, yes, his point that care in implementation is needed is critical. I worry at the speed of the gold rush.

"He [Nizamani] called on the industry to come together to ensure that mental health systems using AI and data are 'explainable', 'transparent', and 'accountable'."
This has been my biggest focus so far, coming from an Internet security background when I was younger.

See: https://nicedaytherapy.com/
"Arden Tomison, CEO and founder of Thalamos" spoke on how his company automates and streamlines complex bureaucracy and paperwork, both to speed patients getting help and to extract the useful data from the forms for clinicians to use. More at: https://www.thalamos.co.uk

"Dr Stefano Goria, co-founder and CTO at Thymia, gave an example of 'frontier AI': 'mental health biomarkers' which are 'driving towards precision medicine' in mental health. Goria said thymia’s biomarkers (e.g. how someone sounds, or how they appear in a video) could help clinicians be aware of symptoms and diagnose conditions that are often missed."
Now **THIS** is how I'd like to receive my AI augmentation. Give me improved diagnostic tools rather than replacing me with chatbots or over-crowding the therapy process with too much automated data collection (some is good). I just want this to remain in the hands of the solo practitioner rather than becoming a performance monitor on us for insurance companies. I want to see empowered clinicians.

Take a look at: https://thymia.ai/#our-products
--
*Michael Reeder, LCPC*

*Hygeia Counseling Services : Baltimore / Mt. Washington Village location*

#AI #CollaborativeHumanAISystems #HumanAwareAI #chatbotgpt #chatgpt
#artificialintelligence #psychology #counseling #socialwork
#psychotherapy #EHR #medicalnotes #progressnotes
@psychotherapist @psychotherapists
@psychology @socialpsych @socialwork
@psychiatry #mentalhealth #technology #psychiatry #healthcare
#patientportal
#HIPAA #dataprotection #infosec @infosec #doctors #hospitals
#BAA #businessassociateagreement #NiceDay #NiceDayTherapy #Thalamos #Thymia
.
.
NYU Information for Practice puts out 400-500 good-quality health-related research posts per week, but that is too much for many people, so that bot is limited to subscribers. You can read it or subscribe at @PsychResearchBot
.
Since 1991 The National Psychologist has focused on keeping practicing psychologists current with news, information and items of interest. Check them out for more free articles, resources, and subscription information: https://www.nationalpsychologist.com
.
EMAIL DAILY DIGEST OF RSS FEEDS -- SUBSCRIBE:
http://subscribe-article-digests.clinicians-exchange.org
.
READ ONLINE: http://read-the-rss-mega-archive.clinicians-exchange.org
It's primitive... but it works... mostly...
.
TITLE: Iowa health system warns against using ChatGPT to draft patient letters

Apparently some people have to be told that using AI services in the cloud to compose medical letters is a violation of HIPAA.

Now what I would like to see with all the AI-assisted EHR systems currently being developed (EPIC, Oracle, Amazon, etc.) is not only BAA contracts in place with the tech companies, but also:

a) Separate AI systems that don't share data with the main AI system (so the hospital AI database would be separate from the general AI database), or

b) Much better: separate AI software and databases held internal to the hospital's own servers, with restricted Internet access to the outside.

This is wholly feasible, yet somehow I have a low trust level of it occurring.

For any private practice people out there playing with AI on a small office scale -- I'm not a lawyer, but what I would recommend are: a) AI systems that can be run on a desktop (not in the cloud), and b) cutting them off from the Internet, or severely restricting where those desktops can call out to, since you likely don't know what's in the code of the AI you downloaded!

~~~~
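The desktop-plus-restriction idea above can be sketched on a Linux machine: run the local model under a dedicated user account and block that account's outbound traffic with a firewall owner-match rule. This is only an illustrative sketch, not a vetted or legally reviewed configuration; the user name `localai` and the tool path are hypothetical placeholders, and you should adapt it to your own firewall and compliance requirements.

```shell
# Sketch only: run a locally installed AI tool under its own user account
# and block all outbound network traffic created by that account.
# Assumes Linux with iptables; 'localai' is a hypothetical user name.

# 1. Create a dedicated, unprivileged user for the AI software.
sudo useradd --system --create-home localai

# 2. Drop every outbound packet originating from that user.
#    (-m owner matches packets by the UID of the creating process.)
sudo iptables -A OUTPUT -m owner --uid-owner localai -j DROP

# 3. Run the model as that user; it can read local files but
#    cannot "call home" or upload anything.
sudo -u localai /path/to/local-ai-tool
```

Blocking at the firewall by UID (rather than trusting an "offline mode" setting inside the application) means even unknown code paths in a downloaded model runner cannot reach the Internet.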
*Iowa health system warns against using ChatGPT to draft patient letters*
https://www.beckershospitalreview.com/cybersecurity/iowa-health-system-warns-against-using-chatgpt-to-draft-patient-letters.html

"Iowa City-based University of Iowa Health Care is warning employees against the use of ChatGPT for its potential to violate HIPAA..."

--
#AI #CollaborativeHumanAISystems #HumanAwareAI #chatbotgpt #chatgpt
#artificialintelligence #psychology #counseling #socialwork
#psychotherapy #EHR #medicalnotes #progressnotes
@psychotherapist @psychotherapists
@psychology @socialpsych @socialwork
@psychiatry #mentalhealth #technology #psychiatry #healthcare
#patientportal
#HIPAA #dataprotection #infosec @infosec #doctors #hospitals
#BAA #businessassociateagreement
.
NYU Information for Practice puts out 400-500 good-quality health-related research posts per week, but that is too much for many people, so that bot is limited to subscribers. You can read it or subscribe at @PsychResearchBot
.
Since 1991 The National Psychologist has focused on keeping practicing psychologists current with news, information and items of interest. Check them out for more free articles, resources, and subscription information: https://www.nationalpsychologist.com
.
EMAIL DAILY DIGEST OF RSS FEEDS -- SUBSCRIBE:
http://subscribe-article-digests.clinicians-exchange.org
.
READ ONLINE: http://read-the-rss-mega-archive.clinicians-exchange.org
It's primitive... but it works... mostly...
Private, vetted email list for mental health professionals: https://www.clinicians-exchange.org
Open Mastodon instance for all mental health workers: https://mastodon.clinicians-exchange.org
.
TITLE: Iowa health system warns against using ChatGPT to draft patient
lettersApparently some people have to be told that using AI services in the
cloud to compose medical letters is a violation of HIPAA.Now what I would like to see with all the AI-assisted EHR systems
currently being developed (EPIC, Oracle, Amazon, etc.) is not only BAA
contracts in place with the tech companies, but also:a) Separate AI systems that don't share data with the main AI system.
(So the Hospital AI database would be separate from the general AI
database), orb) Much better: Separate AI software and databases that are held
internal to the Hospital's own computer servers with restricted Internet
access to the outside.This is wholly feasible, yet somehow I have a low trust level of it
occurring.For any private practice people out there playing with AI on a small
office scale, I'm not a lawyer, but what I would recommend are a) AI
systems that can be run on a desktop (not in the cloud), and b) cutting
them off from Internet or severe restrictions on where those desktops
can call out to since you likely don't know what's in the code of the AI
you downloaded!~~~~
*Iowa health system warns against using ChatGPT to draft patient letters*
https://www.beckershospitalreview.com/cybersecurity/iowa-health-system-warns-against-using-chatgpt-to-draft-patient-letters.html/Iowa City-based University of Iowa Health Care is warning employees
against the use of ChatGPT for its potential to violate HIPAA.../--
#AI #CollaborativeHumanAISystems #HumanAwareAI #chatbotgpt #chatgpt
#artificialintelligence #psychology #counseling #socialwork
#psychotherapy #EHR #medicalnotes #progressnotes
@psychotherapist @psychotherapists
@psychology @socialpsych @socialwork
@psychiatry #mentalhealth #technology #psychiatry #healthcare
#patientportal
#HIPAA #dataprotection #infosec @infosec #doctors #hospitals
#BAA #businessassociateagreement.
.
NYU Information for Practice puts out 400-500 good quality health-related research posts per week but its too much for many people, so that bot is limited to just subscribers. You can read it or subscribe at @PsychResearchBot
.
Since 1991 The National Psychologist has focused on keeping practicing psychologists current with news, information and items of interest. Check them out for more free articles, resources, and subscription information: https://www.nationalpsychologist.com
.
EMAIL DAILY DIGEST OF RSS FEEDS -- SUBSCRIBE:
http://subscribe-article-digests.clinicians-exchange.org
.
READ ONLINE: http://read-the-rss-mega-archive.clinicians-exchange.org
It's primitive... but it works... mostly... -
Private, vetted email list for mental health professionals: https://www.clinicians-exchange.org
Open Mastodon instance for all mental health workers: https://mastodon.clinicians-exchange.org
.
TITLE: Iowa health system warns against using ChatGPT to draft patient
lettersApparently some people have to be told that using AI services in the
cloud to compose medical letters is a violation of HIPAA.Now what I would like to see with all the AI-assisted EHR systems
currently being developed (EPIC, Oracle, Amazon, etc.) is not only BAA
contracts in place with the tech companies, but also:a) Separate AI systems that don't share data with the main AI system.
(So the Hospital AI database would be separate from the general AI
database), orb) Much better: Separate AI software and databases that are held
internal to the Hospital's own computer servers with restricted Internet
access to the outside.This is wholly feasible, yet somehow I have a low trust level of it
occurring.For any private practice people out there playing with AI on a small
office scale, I'm not a lawyer, but what I would recommend are a) AI
systems that can be run on a desktop (not in the cloud), and b) cutting
them off from Internet or severe restrictions on where those desktops
can call out to since you likely don't know what's in the code of the AI
you downloaded!~~~~
*Iowa health system warns against using ChatGPT to draft patient letters*
https://www.beckershospitalreview.com/cybersecurity/iowa-health-system-warns-against-using-chatgpt-to-draft-patient-letters.html/Iowa City-based University of Iowa Health Care is warning employees
against the use of ChatGPT for its potential to violate HIPAA.../--
#AI #CollaborativeHumanAISystems #HumanAwareAI #chatbotgpt #chatgpt
#artificialintelligence #psychology #counseling #socialwork
#psychotherapy #EHR #medicalnotes #progressnotes
@psychotherapist @psychotherapists
@psychology @socialpsych @socialwork
@psychiatry #mentalhealth #technology #psychiatry #healthcare
#patientportal
#HIPAA #dataprotection #infosec @infosec #doctors #hospitals
#BAA #businessassociateagreement.
.
NYU Information for Practice puts out 400-500 good quality health-related research posts per week but its too much for many people, so that bot is limited to just subscribers. You can read it or subscribe at @PsychResearchBot
.
Since 1991 The National Psychologist has focused on keeping practicing psychologists current with news, information and items of interest. Check them out for more free articles, resources, and subscription information: https://www.nationalpsychologist.com
.
EMAIL DAILY DIGEST OF RSS FEEDS -- SUBSCRIBE:
http://subscribe-article-digests.clinicians-exchange.org
.
READ ONLINE: http://read-the-rss-mega-archive.clinicians-exchange.org
It's primitive... but it works... mostly... -
Private, vetted email list for mental health professionals: https://www.clinicians-exchange.org
Open Mastodon instance for all mental health workers: https://mastodon.clinicians-exchange.org
.
TITLE: Iowa health system warns against using ChatGPT to draft patient letters

Apparently some people have to be told that using cloud AI services to compose medical letters is a HIPAA violation.

Now, what I would like to see with all the AI-assisted EHR systems currently being developed (Epic, Oracle, Amazon, etc.) is not only BAA contracts in place with the tech companies, but also:

a) Separate AI systems that don't share data with the vendor's main AI system (so the hospital AI database would be separate from the general AI database), or

b) Much better: separate AI software and databases held on the hospital's own servers, with restricted Internet access to the outside.

This is wholly feasible, yet somehow I have a low trust level of it occurring.

For any private-practice people out there playing with AI on a small office scale -- I'm not a lawyer, but what I would recommend is a) AI systems that can run on a desktop (not in the cloud), and b) cutting those desktops off from the Internet, or severely restricting where they can call out to, since you likely don't know what's in the code of the AI you downloaded!

~~~~
*Iowa health system warns against using ChatGPT to draft patient letters*
https://www.beckershospitalreview.com/cybersecurity/iowa-health-system-warns-against-using-chatgpt-to-draft-patient-letters.html

"Iowa City-based University of Iowa Health Care is warning employees against the use of ChatGPT for its potential to violate HIPAA..."
--
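For the desktop-scale, cut-off-from-the-Internet approach suggested in the post above, here is one minimal illustrative sketch, not a substitute for an actual OS firewall: a Python process can disable its own outbound sockets before any model code runs, as a defense-in-depth layer. The `load_local_model` call is a hypothetical placeholder, not a real API.

```python
# Illustrative sketch only: one defense-in-depth layer for running a locally
# downloaded AI model with no ability to phone home. A real deployment would
# also block traffic at the OS firewall; this only guards the current process.
import socket

def disable_network() -> None:
    """Replace socket constructors so any connection attempt fails loudly."""
    def _blocked(*args, **kwargs):
        raise OSError("outbound network disabled for this AI process")
    socket.socket = _blocked             # blocks raw socket creation
    socket.create_connection = _blocked  # blocks the common high-level helper

disable_network()
# model = load_local_model("model.gguf")  # hypothetical placeholder for your
#                                         # offline, locally stored model
```

The point is simply that "cut off from the Internet" can be layered: at the router, at the OS firewall, and inside the process itself.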
#AI #CollaborativeHumanAISystems #HumanAwareAI #chatbotgpt #chatgpt
#artificialintelligence #psychology #counseling #socialwork
#psychotherapy #EHR #medicalnotes #progressnotes
@psychotherapist @psychotherapists
@psychology @socialpsych @socialwork
@psychiatry #mentalhealth #technology #psychiatry #healthcare
#patientportal
#HIPAA #dataprotection #infosec @infosec #doctors #hospitals
#BAA #businessassociateagreement.
.
TITLE: Coming to a doc near you

*Oracle announces new generative AI services for healthcare organisations*
https://www.digitalhealth.net/2023/09/oracle-announces-new-generative-ai-services-for-healthcare-organisations/

This AI will follow along and take the session notes for the doctor by listening to the office visit. It will also bring up charts and records through voice command and prompt the doctor to do routine things during the visit. It's due out early next year.

This could be very helpful. However, I can imagine a few kinks in the office-visit process initially:

Patient: "Doctor, my knee hurts."
AI: "REMEMBER TO MAKE A FOLLOW-UP APPOINTMENT."
Patient: "What was that?!"
Doctor: "Oh, pay no attention -- that is just the new AI system everyone has to consent to for treatment. It will help us during the session."
AI: "HAVE YOU EXAMINED THE KNEE X-RAY YET?"
Doctor: "AI, pull up the knee x-ray."
Patient: "This is my first visit; there is no knee x-ray yet."
AI: "REMEMBER TO SCHEDULE A KNEE X-RAY."
Doctor & Patient together: "We don't know if we need a knee x-ray yet!"
Patient: "It started hurting yesterday."
Doctor: "Jump up on the table and I'll take a look at it."
AI: "SHALL I SUMMON A NURSE TO WATCH, TO GUARD AGAINST ALLEGATIONS OF IMPROPRIETY?"
Doctor: "NO!"
Doctor: "It does look a bit red. Does this hurt?"
Patient: "A bit when you touch there and when I bend it."
AI: "SHALL I SCHEDULE THE KNEE X-RAY NOW?"
Doctor: "SHUT UP! AI -- silent mode, now!"

Office visits are going to be fun for the next few years while this gets sorted out.
-- Michael
~~
EMAIL LIST: https://www.clinicians-exchange.org & LEMMY: https://lem.clinicians-exchange.org
.
I'm a bit behind on this news cycle, so you may have read about these issues. _My point is to tie them to data privacy and OUR clinical practices._

**THIS BELOW** is one of the main reasons I keep throwing a fit about data leaks from HIPAA BAA subcontractors -- whether or not the leaked data ends up legally qualifying as PHI, and despite the fact that not many therapists are interested in the topic.

*If an Attorney General is willing to go after unredacted medical records* in-state or out-of-state, then they are certainly *capable of getting data from data brokers and marketing firms* (or Google, Facebook, LinkedIn, Twitter, etc.).

Closer to home -- it's not too much of a stretch to wonder whether psychotherapists in blue states will get subpoenas for chart records pertaining to clients who moved to a red state shortly after counseling and then got in trouble over whatever the legal medical issue of the moment is (abortion, birth control, transgender concerns, fertility-clinic involvement, etc.).

*Here’s why Tennessee’s AG wants access to reproductive medical records — including for out-of-state abortions*
https://wpln.org/post/heres-why-tennessees-ag-wants-access-to-reproductive-medical-records-including-for-out-of-state-abortions/
"State attorneys general in 18 states — including Tennessee’s — are fighting with the Biden Administration over medical records related to reproductive care."

*Tennessee A.G. weaponizes private medical records in GOP campaign against trans people*
https://the-rachel-maddow-show.simplecast.com/episodes/tennessee-ag-weaponizes-private-medical-records-in-gop-campaign-against-trans-people
Maddow podcast recording. Discusses attorneys general from 16 states writing a letter to President Biden asserting their right to go after medical records located outside their states.

*Biden’s HIPAA expansion for abortion draws criticism, lawsuit threats*
https://www.politico.com/news/2023/07/18/biden-hipaa-expansion-abortion-00106694
The Biden administration is trying to shield abortion medical-record data located in blue states from red-state Attorney General probes.

In case you are interested, here are some of my past articles on medical data privacy and various vendors:

*hipaalink.net security initial testing*
https://lem.clinicians-exchange.org/post/49122

*Nearly All Hospital Websites Send Tracking Data to 3rd Parties, Endangering Patient Privacy — Common Recipients: Alphabet, Meta, Adobe, AT&T*
https://lem.clinicians-exchange.org/post/24598

*To become an Amazon Clinic patient, first you sign away some privacy. You agreed to what? The ‘HIPAA authorization’ for Amazon’s new low-cost clinic offers the tech giant more control over your health*
https://lem.clinicians-exchange.org/post/24603

*FTC, HHS warn health providers not to use tracking tech in websites, apps*
https://lem.clinicians-exchange.org/post/44657

*Would you want #AI used to help write a medical or psychotherapy chart note? (Ongoing Poll)*
https://mastodon.clinicians-exchange.org/@admin/110799586045837116

*AWS rolls out generative AI service for healthcare documentation software*
https://lem.clinicians-exchange.org/post/57450

I'm not posting this to be political (although it certainly is) -- *I'm posting it as a legitimate medical-records concern for all of us, regardless of each individual reader's political positions. We need -- as therapists -- to care about data leaks and privacy.*
+++++++++++
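One of the past articles above reports that nearly all hospital websites send tracking data to third parties. As a purely illustrative sketch of how easy that leakage is to audit, you can scan a page's HTML source for references to well-known tracker domains. The domain list below is a small assumed sample, not an authoritative tracker census.

```python
# Illustrative sketch: flag references to a few well-known third-party
# tracker domains in a page's HTML source. The list below is a small,
# assumed sample, not an exhaustive or authoritative one.
TRACKER_DOMAINS = [
    "google-analytics.com",
    "googletagmanager.com",
    "connect.facebook.net",
    "doubleclick.net",
]

def find_trackers(html: str) -> list[str]:
    """Return the tracker domains referenced anywhere in the HTML."""
    lowered = html.lower()
    return [d for d in TRACKER_DOMAINS if d in lowered]

# Example on a static snippet (a real audit would fetch the live page):
sample = '<script async src="https://www.googletagmanager.com/gtag/js?id=G-X"></script>'
print(find_trackers(sample))  # ['googletagmanager.com']
```

In practice you would fetch your own practice's or hospital's pages and run the same check; if a tracker domain shows up, patient browsing behavior is likely flowing to that third party.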
#AI #CollaborativeHumanAISystems #HumanAwareAI #chatbotgpt #chatgpt
#artificialintelligence #psychology #counseling #socialwork
#psychotherapy #EHR #medicalnotes #progressnotes #legal #lgbtq #abortion
#transgender
@psychotherapist @psychotherapists
@psychology @socialpsych @socialwork
@psychiatry #mentalhealth #technology #psychiatry #healthcare
#patientportal
#HIPAA #dataprotection #infosec @infosec #doctors #hospitals
#amazon #BAA #businessassociateagreement -
EMAIL LIST: https://www.clinicians-exchange.org & LEMMY: https://lem.clinicians-exchange.org
.
I'm a bit behind on this news cycle, so you may have read about these
issues. _My point is to tie them to data privacy and OUR clinical
practices._**THIS BELOW** is one of the main reasons /I keep throwing a fit about
data leaks from HIPAA BAA subcontractors/ - whether or not they end up
being legally PHI, and despite the fact that not too many therapists are
interested in the topic.*If an Attorney General is willing to go after unredacted medical
records* in-state or out-of-state, /then they are certainly *_capable of
getting data from data brokers and marketing firms_* (or Google,
Facebook, LinkedIn, Twitter, etc.)./Closer-to-home -- It's not too much of a stretch to speculate if
psychotherapists in blue states will get subpoenas for chart records
pertaining to clients who moved to a red state shortly after counseling,
then got in trouble for whatever the legal medical issue of the moment
is (abortion, birth control, transgender concerns, fertility clinic
involvement, etc.).*Here’s why Tennessee’s AG wants access to reproductive medical records
— including for out-of-state abortions**
*https://wpln.org/post/heres-why-tennessees-ag-wants-access-to-reproductive-medical-records-including-for-out-of-state-abortions/
/"State attorneys general in 18 states — including Tennessee’s — are
fighting with the Biden Administration over medical records related to
reproductive care."//
/
*Tennessee A.G. weaponizes private medical records in GOP campaign
against trans people**
*https://the-rachel-maddow-show.simplecast.com/episodes/tennessee-ag-weaponizes-private-medical-records-in-gop-campaign-against-trans-people
/Maddow podcast recording. Talks about attorneys general from 16 states
writing a letter to President Biden asserting their right to go after
medical records located outside their states.//
/
*Biden’s HIPAA expansion for abortion draws criticism, lawsuit threats **
*https://www.politico.com/news/2023/07/18/biden-hipaa-expansion-abortion-00106694
/Biden administration trying to shield abortion medical record data
located in blue states from red state Attorney General probes.//
/In case you are interested, here are some of my past articles on medical
data privacy and various vendors:
*
hipaalink.net security initial testing*
https://lem.clinicians-exchange.org/post/49122
*
Nearly All Hospital Websites Send Tracking Data to 3rd Parties,
Endangering Pt Privacy—Common Recipients: Alphabet, Meta, Adobe, AT&T*
https://lem.clinicians-exchange.org/post/24598
*
To become an Amazon Clinic patient, first you sign away some privacy,You
agreed to what? The ‘HIPAA authorization’ for Amazon’s new low-cost
clinic offers the tech giant more control over your health*
https://lem.clinicians-exchange.org/post/24603
*
FTC, HHS warn health providers not to use tracking tech in websites, apps*
https://lem.clinicians-exchange.org/post/44657
*
Would you want #AI used to help write a medical or psychotherapy chart
note?**(Ongoing Poll)*
https://mastodon.clinicians-exchange.org/@admin/110799586045837116
*
AWS rolls out generative AI service for healthcare documentation software*
https://lem.clinicians-exchange.org/post/57450I'm not posting this to be political (although it certainly is) --*I'm
posting it as a legit medical records concern for all of us regardless
of each individual reader's political positions. We need -- as
therapists -- to care about data leaks and privacy.*+++++++++++
#AI #CollaborativeHumanAISystems #HumanAwareAI #chatbotgpt #chatgpt
#artificialintelligence #psychology #counseling #socialwork
#psychotherapy #EHR #medicalnotes #progressnotes #legal #lgbtq #abortion
#transgender
@psychotherapist @psychotherapists
@psychology @socialpsych @socialwork
@psychiatry #mentalhealth #technology #psychiatry #healthcare
#patientportal
#HIPAA #dataprotection #infosec @infosec #doctors #hospitals
#amazon #BAA #businessassociateagreement -
EMAIL LIST: https://www.clinicians-exchange.org & LEMMY: https://lem.clinicians-exchange.org
.
I'm a bit behind on this news cycle, so you may have read about these
issues. _My point is to tie them to data privacy and OUR clinical
practices._**THIS BELOW** is one of the main reasons /I keep throwing a fit about
data leaks from HIPAA BAA subcontractors/ - whether or not they end up
being legally PHI, and despite the fact that not too many therapists are
interested in the topic.*If an Attorney General is willing to go after unredacted medical
records* in-state or out-of-state, /then they are certainly *_capable of
getting data from data brokers and marketing firms_* (or Google,
Facebook, LinkedIn, Twitter, etc.)./Closer-to-home -- It's not too much of a stretch to speculate if
psychotherapists in blue states will get subpoenas for chart records
pertaining to clients who moved to a red state shortly after counseling,
then got in trouble for whatever the legal medical issue of the moment
is (abortion, birth control, transgender concerns, fertility clinic
involvement, etc.).*Here’s why Tennessee’s AG wants access to reproductive medical records
— including for out-of-state abortions**
*https://wpln.org/post/heres-why-tennessees-ag-wants-access-to-reproductive-medical-records-including-for-out-of-state-abortions/
/"State attorneys general in 18 states — including Tennessee’s — are
fighting with the Biden Administration over medical records related to
reproductive care."//
/
*Tennessee A.G. weaponizes private medical records in GOP campaign
against trans people**
*https://the-rachel-maddow-show.simplecast.com/episodes/tennessee-ag-weaponizes-private-medical-records-in-gop-campaign-against-trans-people
/Maddow podcast recording. Talks about attorneys general from 16 states
writing a letter to President Biden asserting their right to go after
medical records located outside their states.//
/
*Biden’s HIPAA expansion for abortion draws criticism, lawsuit threats **
*https://www.politico.com/news/2023/07/18/biden-hipaa-expansion-abortion-00106694
/Biden administration trying to shield abortion medical record data
located in blue states from red state Attorney General probes.//
/In case you are interested, here are some of my past articles on medical
data privacy and various vendors:
*
hipaalink.net security initial testing*
https://lem.clinicians-exchange.org/post/49122
*
Nearly All Hospital Websites Send Tracking Data to 3rd Parties,
Endangering Pt Privacy—Common Recipients: Alphabet, Meta, Adobe, AT&T*
https://lem.clinicians-exchange.org/post/24598
*
To become an Amazon Clinic patient, first you sign away some privacy,You
agreed to what? The ‘HIPAA authorization’ for Amazon’s new low-cost
clinic offers the tech giant more control over your health*
https://lem.clinicians-exchange.org/post/24603
*
FTC, HHS warn health providers not to use tracking tech in websites, apps*
https://lem.clinicians-exchange.org/post/44657
*
Would you want #AI used to help write a medical or psychotherapy chart
note?**(Ongoing Poll)*
https://mastodon.clinicians-exchange.org/@admin/110799586045837116
*
AWS rolls out generative AI service for healthcare documentation software*
https://lem.clinicians-exchange.org/post/57450I'm not posting this to be political (although it certainly is) --*I'm
posting it as a legit medical records concern for all of us regardless
of each individual reader's political positions. We need -- as
therapists -- to care about data leaks and privacy.*+++++++++++
#AI #CollaborativeHumanAISystems #HumanAwareAI #chatbotgpt #chatgpt
#artificialintelligence #psychology #counseling #socialwork
#psychotherapy #EHR #medicalnotes #progressnotes #legal #lgbtq #abortion
#transgender
@psychotherapist @psychotherapists
@psychology @socialpsych @socialwork
@psychiatry #mentalhealth #technology #psychiatry #healthcare
#patientportal
#HIPAA #dataprotection #infosec @infosec #doctors #hospitals
#amazon #BAA #businessassociateagreement -
EMAIL LIST: https://www.clinicians-exchange.org & LEMMY: https://lem.clinicians-exchange.org
.
I'm a bit behind on this news cycle, so you may have read about these
issues. _My point is to tie them to data privacy and OUR clinical
practices._**THIS BELOW** is one of the main reasons /I keep throwing a fit about
data leaks from HIPAA BAA subcontractors/ - whether or not they end up
being legally PHI, and despite the fact that not too many therapists are
interested in the topic.*If an Attorney General is willing to go after unredacted medical
records* in-state or out-of-state, /then they are certainly *_capable of
getting data from data brokers and marketing firms_* (or Google,
Facebook, LinkedIn, Twitter, etc.)./Closer-to-home -- It's not too much of a stretch to speculate if
psychotherapists in blue states will get subpoenas for chart records
pertaining to clients who moved to a red state shortly after counseling,
then got in trouble for whatever the legal medical issue of the moment
is (abortion, birth control, transgender concerns, fertility clinic
involvement, etc.).*Here’s why Tennessee’s AG wants access to reproductive medical records
— including for out-of-state abortions**
*https://wpln.org/post/heres-why-tennessees-ag-wants-access-to-reproductive-medical-records-including-for-out-of-state-abortions/
/"State attorneys general in 18 states — including Tennessee’s — are
fighting with the Biden Administration over medical records related to
reproductive care."//
/
*Tennessee A.G. weaponizes private medical records in GOP campaign
against trans people**
*https://the-rachel-maddow-show.simplecast.com/episodes/tennessee-ag-weaponizes-private-medical-records-in-gop-campaign-against-trans-people
I'm a bit behind on this news cycle, so you may have read about these
issues. _My point is to tie them to data privacy and OUR clinical
practices._

**THIS BELOW** is one of the main reasons /I keep throwing a fit about
data leaks from HIPAA BAA subcontractors/ - whether or not they end up
being legally PHI, and despite the fact that not too many therapists are
interested in the topic.

*If an Attorney General is willing to go after unredacted medical
records* in-state or out-of-state, /then they are certainly *_capable of
getting data from data brokers and marketing firms_* (or Google,
Facebook, LinkedIn, Twitter, etc.)./

Closer to home -- it's not too much of a stretch to speculate whether
psychotherapists in blue states will get subpoenas for chart records
pertaining to clients who moved to a red state shortly after counseling,
then got in trouble for whatever the legal medical issue of the moment
is (abortion, birth control, transgender concerns, fertility clinic
involvement, etc.).

*Here’s why Tennessee’s AG wants access to reproductive medical records
— including for out-of-state abortions**
*https://wpln.org/post/heres-why-tennessees-ag-wants-access-to-reproductive-medical-records-including-for-out-of-state-abortions/
/"State attorneys general in 18 states — including Tennessee’s — are
fighting with the Biden Administration over medical records related to
reproductive care."//
/
*Tennessee A.G. weaponizes private medical records in GOP campaign
against trans people**
*https://the-rachel-maddow-show.simplecast.com/episodes/tennessee-ag-weaponizes-private-medical-records-in-gop-campaign-against-trans-people
/Maddow podcast recording. Talks about attorneys general from 16 states
writing a letter to President Biden asserting their right to go after
medical records located outside their states.//
/
*Biden’s HIPAA expansion for abortion draws criticism, lawsuit threats **
*https://www.politico.com/news/2023/07/18/biden-hipaa-expansion-abortion-00106694
/Biden administration trying to shield abortion medical record data
located in blue states from red state Attorney General probes.//
/In case you are interested, here are some of my past articles on medical
data privacy and various vendors:
*
hipaalink.net security initial testing*
https://lem.clinicians-exchange.org/post/49122
*
Nearly All Hospital Websites Send Tracking Data to 3rd Parties,
Endangering Pt Privacy—Common Recipients: Alphabet, Meta, Adobe, AT&T*
https://lem.clinicians-exchange.org/post/24598
*
To become an Amazon Clinic patient, first you sign away some privacy,You
agreed to what? The ‘HIPAA authorization’ for Amazon’s new low-cost
clinic offers the tech giant more control over your health*
https://lem.clinicians-exchange.org/post/24603
*
FTC, HHS warn health providers not to use tracking tech in websites, apps*
https://lem.clinicians-exchange.org/post/44657
*
Would you want #AI used to help write a medical or psychotherapy chart
note?**(Ongoing Poll)*
https://mastodon.clinicians-exchange.org/@admin/110799586045837116
*
AWS rolls out generative AI service for healthcare documentation software*
https://lem.clinicians-exchange.org/post/57450

I'm not posting this to be political (although it certainly is) -- *I'm
posting it as a legit medical records concern for all of us regardless
of each individual reader's political positions. We need -- as
therapists -- to care about data leaks and privacy.*

+++++++++++
#AI #CollaborativeHumanAISystems #HumanAwareAI #chatbotgpt #chatgpt
#artificialintelligence #psychology #counseling #socialwork
#psychotherapy #EHR #medicalnotes #progressnotes #legal #lgbtq #abortion
#transgender
@psychotherapist @psychotherapists
@psychology @socialpsych @socialwork
@psychiatry #mentalhealth #technology #psychiatry #healthcare
#patientportal
#HIPAA #dataprotection #infosec @infosec #doctors #hospitals
#amazon #BAA #businessassociateagreement -
EMAIL LIST: https://www.clinicians-exchange.org & LEMMY: https://lem.clinicians-exchange.org
.
TITLE: AWS rolls out generative AI service for healthcare documentation software

Yeah... If it's going to be worth using, it would have to listen to the
whole visit... But this needs more thought. In my past quick
experiments, typing directions took 90% of the effort of getting an AI to
generate a halfway-decent note. So the AI (I think) would have to listen
in and then write the note itself to be worth it.

Would we want this? Can we trust this?
--Michael
+++++++++
------------------------------------------------------------------------
"Amazon Web Services announced Wednesday a new AI-powered service for
healthcare software providers that will help clinicians with paperwork."

"AWS HealthScribe uses generative AI and speech recognition to help
doctors transcribe and analyze their conversations with patients and
drafts clinical notes, the company announced Wednesday at its AWS Summit
New York."

------------------------------------------------------------------------
Posted by:
Michael Reeder LCPC
Baltimore, MD

#AI #CollaborativeHumanAISystems #HumanAwareAI #chatbotgpt #chatgpt
#artificialintelligence #psychology #counseling #socialwork
#psychotherapy #EHR #medicalnotes #progressnotes #Amazon
@psychotherapist @psychotherapists
@psychology @socialpsych @socialwork
@psychiatry #mentalhealth #technology #psychiatry #healthcare
#patientportal -
EMAIL LIST: https://www.clinicians-exchange.org & LEMMY: https://lem.clinicians-exchange.org
Would you want #AI used to help write a medical or psychotherapy chart note?
Health professionals are under lots of time pressure -- often unpaid for charting time, and increasingly typing notes during appointments.
But there may be privacy and accuracy concerns with using AI in charting.
++++
#CollaborativeHumanAISystems #HumanAwareAI #chatbotgpt #chatgpt #artificialintelligence #psychology #counseling #socialwork #psychotherapy #EHR #EPIC #medicalnotes #progressnotes #Microsoft
@psychotherapist @psychotherapists @psychology @socialpsych @socialwork @psychiatry #mentalhealth #technology #psychiatry #healthcare #patientportal -
It occurred to me this morning that by the time I fill out the form and write two or three sentences, I've already done all the work that is needed for an official note (after adding start and end times, diagnosis, name, client age, and a few other elements to the form). There is no need to convert it all to narrative -- it can mostly stay in form format.
So -- while I want an AI I can trust to help with notes (and this one may grow into such a tool) -- right now the effort of getting it to create a note is roughly equal to the effort of just writing it myself.
@psychotherapist @psychotherapists @psychology @socialpsych @socialwork @psychiatry
#mentalhealth #technology #psychiatry #healthcare #patientportal #autonotesai #AI #CollaborativeHumanAISystems #HumanAwareAI #chatbotgpt #chatgpt #artificialintelligence #psychology #counseling #socialwork #psychotherapy #EHR #EPIC #medicalnotes #progressnotes #Microsoft -
*If* AutoNotes.ai just plugged a freely available AI from elsewhere into a front-end they created, I wonder what legalese governs use of the AI on the back end? I am not a lawyer, and none of us knows the licensing agreement or ownership of the AI that AutoNotes.ai is using.
And, well hey -- while I'm just busy making up wild speculations -- let's play with this a bit:
So I know AutoNotes.ai is sending SOME kind of information to Google tracking services because my browser plug-ins are preventing this data from being sent to Google and telling me so.
Let's just suppose for a moment that they are using Google's Bard AI on the back-end. Because -- why not -- there is no PHI being collected anyway...
Meanwhile, both the therapist and the client are using the Google Chrome web browser for televideo. Or maybe they are using Gmail and the Gmail text is being mined. Or the data input for the note is sent along to Google datamining regardless of whether or not the Bard AI is used...
Let's go further out on our hypothetical limb and say that the therapist sees only three clients that day. The therapist creates three notes in AutoNotes.ai that day...
It's now a better-than-fair chance that one of those three unnamed clients has Acute Stress Disorder (like in my example above). If Google has bothered to devote the tracking resources to it, they might know -- from Gmail, or Bard, or data aggregation -- the names of the clients the therapist saw that day.
Of course, I really am making this all up -- we are just not given enough data to know what's real and false anymore.
Here is a paragraph from the welcome message they emailed me:
"Here are a couple of simple suggestions, first, complete as thorough a Mental Status Exam (MSE) as possible, submit a few sentences related to the session and theme, and include treatment plan goals, objectives, and strategies; this will ensure the best possible clinical note. Please revise and submit your revised version inside the app! This will assist all of us in building the greatest tool on earth for the field!"
Well, okay -- I do want the AI to get better at its job...
But this DOES mean they are keeping a version of what you provide, doesn't it?
@psychotherapist @psychotherapists @psychology @socialpsych @socialwork @psychiatry
#mentalhealth #technology #psychiatry #healthcare #patientportal #autonotesai #AI #CollaborativeHumanAISystems #HumanAwareAI #chatbotgpt #chatgpt #artificialintelligence #psychology #counseling #socialwork #psychotherapy #EHR #EPIC #medicalnotes #progressnotes #Microsoft -
This is for a new product called AutoNotes.ai that will create progress notes for you for about $14+ per month.
I am tentatively interested in this and signed up for a free trial.
I have concerns (below) but am hopeful this continues to improve.
~~~
The system has no BAA agreement for HIPAA because they claim not to collect any PHI. This may be true.

Their terms of service and privacy policy may be amended at any time, and the only notification provided will be on their website. I am wary of this now that I have been burned by such stealth changes by one of my own BAA subcontractors.
Information for each client will have to be entered from scratch each time as they do not store data about clients. For the free demo, it takes about 60 seconds to generate each note. You then cut and paste it into your EHR.
While they claim to collect no PHI, they do send SOME data of SOME sort to several tracking systems and data aggregators. At best, they are tracking YOU -- the clinician. At worst -- they may be sending some of the data you enter to data aggregators which -- hopefully -- do not have data from other sources to be able to pin down the individual clients you are describing.
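For anyone who wants to spot-check this kind of thing themselves, here is a minimal sketch of a static audit: scan a saved copy of a page's HTML for script/resource URLs pointing at hosts other than the first-party domain. This is only a rough approximation of what Privacy Badger or Ghostery flags at runtime (it misses requests made dynamically by JavaScript), and the filename and first-party domain below are hypothetical placeholders, not anything AutoNotes.ai publishes.

```python
# Hedged sketch: list third-party hosts referenced in a page's HTML.
# A static approximation of runtime tracker-blockers; the sample HTML
# and FIRST_PARTY value are illustrative assumptions.
import re
from urllib.parse import urlparse

FIRST_PARTY = "autonotes.ai"  # assumed first-party domain

def third_party_hosts(html: str, first_party: str = FIRST_PARTY) -> set[str]:
    """Collect hostnames from src/href attributes that do not belong
    to the first-party domain."""
    hosts = set()
    for url in re.findall(r'(?:src|href)=["\'](https?://[^"\']+)', html):
        host = urlparse(url).hostname or ""
        if host and not host.endswith(first_party):
            hosts.add(host)
    return hosts

sample = (
    '<script src="https://connect.facebook.net/en_US/fbevents.js"></script>'
    '<script src="https://www.googletagmanager.com/gtm.js?id=X"></script>'
    '<link href="https://app.autonotes.ai/style.css" rel="stylesheet">'
)
print(sorted(third_party_hosts(sample)))
# ['connect.facebook.net', 'www.googletagmanager.com']
```

A browser's developer-tools Network tab gives the same answer at runtime, and catches the dynamic requests this static check misses.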
DATA IS SENT TO THE FOLLOWING OUTSIDE TRACKING COMPANIES ON BOTH THEIR DATA INPUT PAGE & THE GENERATED NOTE PAGE:
PRIVACY BADGER BLOCKED:
connect.facebook.net
www.googletagmanager.com
s.gravatar.com
snap.licdn.com
i0.wp.com

GHOSTERY BLOCKED:
Google Tag Manager
LinkedIn Analytics
Facebook Connect
Gravatar

The following is an example of the note generated for a fake client with Acute Stress Disorder:
%GREEN = IN WHAT I REPORTED%
&RED = INVENTED BY AI, NOT REPORTED BY CLINICIAN&
BLACK = I HAVE NO STRONG OPINION

Subjective:
Client &reported increased levels of stress, particularly in the& %workplace. Described experiencing anxious mood and labile mood, but was able to demonstrate appropriate behavior and normal speech throughout the session. Client was self-reflective% and &engaged& %during the 45-minute video session%, %which allowed for a comprehensive exploration of the current challenges they face&. The client recognized the need to address these issues in order to improve their overall well-being.

Objective:
During the session, several areas were discussed. &Client was open and honest& about %feeling overwhelmed and stressed at work, further precipitating their anxiety. They were able to identify specific work situations that were particularly difficult for them to navigate. It was noted that the client was able to maintain self-reflective behavior during discussions,% which will be beneficial for making progress in therapy.

Assessment:
%The client is showing signs of progress in therapy, taking active steps to address and work through their challenges with stress and anxiety.% Their willingness to participate and self-reflect in session indicates a strong commitment to achieving their therapeutic goals. %Current therapeutic interventions, which are directed at helping the client build skills and coping strategies, are having a positive impact on the client's progress.%

Plan:
%In order to lower workplace stress and continue advancing in therapy, the client agreed to set specific objectives for themselves. They plan on talking to their manager about ways to manage their workload or address any outstanding concerns. Additionally, they will begin practicing meditation four times a week during their lunch breaks, as a means of managing stress and promoting relaxation.% &Continued exploration of these& and other stress reduction &strategies will be a focus in future sessions.&

Hmmm... My take-away is that this needs more work (that's fine); I want to know why they have to report to LinkedIn, Facebook, Gravatar, and WordPress while I'm logged in, and what they report; and the system IS inventing minor elements that I did not tell it to add. For example, while I reported the client was overwhelmed and stressed, I did not say the client was open and honest about it. I told the system the client was "progressing", but never said that increased levels of stress were reported in this session.
#AI #CollaborativeHumanAISystems #HumanAwareAI #chatbotgpt #chatgpt #artificialintelligence #psychology #counseling #socialwork #psychotherapy #EHR #EPIC #medicalnotes #progressnotes #Microsoft
@psychotherapist @psychotherapists @psychology @socialpsych @socialwork @psychiatry #mentalhealth #technology #psychiatry #healthcare #patientportal #autonotesai -
It's so easy to project both hope and fear onto this bogeyman.
I'm hopeful that AI in healthcare will be implemented well, as in:
-- Save the staff time while NOT cutting off patients from the ability to access staff.
-- The right mix of AI and real human responses to portal questions. With humans looking over and improving AI responses before they are sent.
-- Safeguards to keep the AI from sharing client information outside the EPIC system (like to Microsoft in this case).
-- Help for the staff in terms of writing progress notes and getting treatment recommendations.
-- NOT double-checking staff-written notes for non-compliance with rigid hospital-imposed standards (there might be cases where something different needs to be noted).
-- NOT using the increased productivity to pile higher caseloads on staff. Hospital staff are already stretched VERY thin and need some relief.
@psychotherapist @psychotherapists @psychology @socialpsych @socialwork @psychiatry
#AI #CollaborativeHumanAISystems #HumanAwareAI #chatbotgpt #chatgpt
#artificialintelligence #psychology #counseling #socialwork
#psychotherapy #EHR #EPIC #medicalnotes #progressnotes #Microsoft
#mentalhealth #technology #psychiatry #healthcare
#patientportal -
TITLE: Epic taps Microsoft to integrate generative AI into EHRs with
Stanford, UC San Diego as early adopters

Hmmm... So the stated goal is to enhance provider productivity so they can focus on patients... Followed immediately by a sentence saying that the first use -- already deployed in a few hospitals -- is to automatically generate message responses...

While this could lead somewhere good, the cynic in me speculates about what those
responses might look like if sent in response to client EPIC portal
inquiries. Implemented especially badly, this could close the last direct route to speak with doctors.

I wonder if this will slow the trend of clients having to pay for message portal responses from medical staff?

-- Michael
#AI #CollaborativeHumanAISystems #HumanAwareAI #chatbotgpt #chatgpt
#artificialintelligence #psychology #counseling #socialwork
#psychotherapy #EHR #EPIC #medicalnotes #progressnotes #Microsoft
@psychotherapist @psychotherapists
@psychology @socialpsych @socialwork
@psychiatry #mentalhealth #technology #psychiatry #healthcare
#patientportal -
“…we now have the world's most used chatbot, governed by training data that nobody knows about, obeying an algorithm that is only hinted at, glorified by the media, and yet with ethical guardrails that only sorta kinda work and that are driven more by text similarity than any true moral calculus…”
Gary Marcus has a good piece in the #cacm about the #HYPE around #ChatBotGPT https://cacm.acm.org/blogs/blog-cacm/269854-inside-the-heart-of-chatgpts-darkness/fulltext