For the record: Using a local model may be better than querying a cloud-based AI, but it doesn't resolve all of my objections because the environmental impact is only one of the problems with generative AI. Follow-up questions would include:

1. Was your model trained on other people's work without their permission?

2. Are you using it to avoid paying professionals for their work?

3. Are you using it for output that people expect to be, or might mistake for, human-authored work?

4. Does it transfer data to AI companies?

Being unable to answer any of those questions is indistinguishable from a "yes."

— @lrhodes (L. Rhodes) ⁂

The second one is tricky for me because ‘AI’ is often quite different to prior technologies that replaced humans. I don’t mind that lamp lighters are all out of a job because street lamps can turn themselves on and off automatically, for example. There are two places where I find ‘AI’ problematic here:

The first is replacing humans with something that does a less-good job. Translation is interesting here. Machine translation is great for low-stakes work: I can’t employ a translator to translate everything I read while travelling, and a mostly correct machine translation is far better than no translation. But machine translation is really bad at consistency, which is one of the hardest parts of translation. Do you translate the same term of art with the same terminology every time it appears in a document? Do you correctly recognise idiomatic usage and convey the same meaning? These things are really important in a lot of places, especially in any writing where misunderstandings can have serious consequences (legal documents, manuals), but also for enjoyment in fiction. Subtitles are similar: machine-generated ones are better than nothing but much worse than ones written by people who understood the film. And this gets worse, because normalising the mechanical form devalues the skilled labour of humans.

The second is where the work is intrinsically fulfilling or an exercise in creativity. Mechanisation should free people from drudgery. Eliminating the job of shelf stacker from supermarkets is great. Using forklift trucks instead of humans doing back-breaking labour is a huge improvement. Eliminating the job of artist is not. If there is a job that people would choose to do in a post-scarcity society, where all of their needs were met without reference to their occupation, then eliminating it is a societal harm. Closely related to this are activities that provide training for things that can’t be automated. Eliminating junior positions with ‘AI’ and hoping that you can still hire senior people is a large externality (you are relying on other people to do the training that you are no longer doing).

The vast majority of ‘AI’ uses trigger one of these, but there are a handful of things that do not.

