Curating the Internet: Science and technology digest for November 22, 2019
Upcoming uses for artificial intelligence in psychiatry; Deal between Verizon and Snapchat to showcase 5G and augmented reality; New colocation center in Nebraska to provide service for bitcoin mining and machine learning; MS Windows will support DNS over HTTPS (DoH) for encrypted name resolution; and a Steem post with photos and descriptions of the Bubble Nebula - NGC7635
Straight from my RSS feed | Whatever gets my attention |
Links and micro-summaries from my 1000+ daily headlines. I filter them so you don't have to.
- Artificial Intelligence Could Help Solve America's Impending Mental Health Crisis - By improving diagnostics and interpreting medical images, artificial intelligence (AI) has already begun reshaping the medical landscape. Lacking emotional intelligence, however, robots and AI have not had much impact in psychiatry. As the baby boomer generation continues retiring en masse, the medical industry is imagining new uses for technology. With AI's ability to recognize patterns, analyze data, and perform remote monitoring, experts say that it may be able to pick up on cues that humans would miss, especially given the limited time that clinicians are able to spend with patients. Although some AI apps, like Woebot, are entering the field, the article suggests that it will be 5-10 more years before AI systems are a mainstream part of mental health treatment, and it may be even longer. John Torous is quoted as saying that the inclusion of AI in psychiatry is going to be an uphill challenge, but Henry A. Nasrallah is more optimistic. Nasrallah notes that speech and mental health are closely linked, and two of AI's possible uses in psychiatry involve speech and language. Thus, he suggests that an AI system might be better than human clinicians at recognizing certain symptoms, like changes in intonation. Other possible uses of AI include distinguishing between real and fake suicide notes, or wearable devices that track sleep and physical activity. Torous cautions, however, that "Nothing has ever been a panacea" and warns against overblown expectations.
Here is a video that accompanies the article:
The Bubble Nebula captured with a focal length of 420mm. - photo by @astrophoto.kevin
Click through for the rest.
In order to help bring Steem's content to a new audience, if you think this post was informative, please consider sharing it through your other social media accounts.
And to help make Steem the go-to place for timely information on diverse topics, I invite you to discuss any of these links in the comments and/or your own response post.
Beneficiaries
- Burn Steem/SBD - @null - 5%
- Cited author(s) - @astrophoto.kevin - 10%
- Fundraising for the Rustin Golden Knights Marching Band - @rgkmb-unofficial - 10%
- Posting and/or scheduling service (steempeak.com) - @steempeak - 5%
- Steem/API services (anyx.io) - @anyx - 5%
- Steem/RSS services (steemrss.com) - @torrey.blog - 5%
- SteemWorld (steemworld.org) support - @steemchiller - 5%
About this series
Sharing a link does not imply endorsement or agreement, and I receive no incentives for sharing from any of the content creators.
Follow on steem: @remlaps-lite, @remlaps
If you are not on Steem yet, you can follow through RSS: remlaps-lite, remlaps.
Thanks to SteemRSS from philipkoon, doriitamar, and torrey.blog for the Steem RSS feeds!
Hello,
Your post has been manually curated by a @stem.curate curator.
We are dedicated to supporting great content, like yours, on the STEMGeeks tribe.
Please join us on discord.
AI is gonna be key to many tech developments, and particularly in decentralization. But I find psychiatric diagnosis by AI not only utterly ridiculous, but Orwellianly terrifying. I don't even think real human psychiatrists are better than guessing at what consciousness is and how it goes wrong, and now they're handing off their guesses to AI? Just let me check off boxes on a listicle.
Thanks!
Thanks for the reply!
I'd feel much better about psychiatry if they could eventually get to the level of identifying and treating bio-mechanical root causes, as I suspect that a number of totally unrelated physiological problems can cause similar seeming behavioral anomalies. I could definitely see AI helping with identifying those underlying root causes, although that's not what the article is about.
As it stands now, my observation of psychiatry over the years is that - in practice - the doctors seem to try medicines until something works, and as they do, they change diagnoses to match the medicine they're trying. Then, whatever medicine eventually seems to cause improvement becomes the basis for their ultimate diagnosis. So I don't worry about them adding an AI to the diagnostic toolbox. To me, it doesn't seem like the process can get much less accurate. I don't think an AI should be the only diagnostic decision-maker, but I am OK with it being another input for consideration by a human doctor. Of course, a risk with that is that doctors may come to trust the AIs too much, as we have seen with accidents that occurred during the emergence of autonomous vehicles, when their human observers were less than attentive.