Recently a friend who works in acute youth mental health told me that their workplace is being pushed to use AI transcription and summary services with their patients.

I may have ranted. These patients are minors, at a public health service, who can't go elsewhere, and might even be mandated by a court to use the service - they obviously can't reasonably give consent. These notes might decide their futures: they could be seen by a judge or similar decision-maker, and used to determine what future treatment they receive. And we *know* AI fails to understand cultural difference. It is likely to miss the nuances of slang, sarcasm, and over-dramatic (or under-dramatic) teen language.

The speed with which employers are falling over themselves to adopt AI, without concern for their patients, customers, or staff, is mind-boggling.

All this to say, get behind @drwaus (Digital Rights Watch) and their campaign: digitalrightswatch.org.au/camp

