

Admittedly a very tough question. Here are some of the ideas I just came up with:
Make it easier to hold people or organizations liable for mistakes made because of haphazard reliance on LLMs.
Reparations for everyone ever sued for piracy, and completely do away with intellectual property protections for corporations, while letting independent artists keep them.
A public service announcement campaign aimed at making the general public less trustful of LLMs.
Strengthen consumer protection such that baseless claims of AI capabilities in advertising or product labeling are legally dangerous to make.
Fine companies for every verifiably inaccurate result given to a customer or end user by an LLM.
This is a case where “died by suicide” seems genuinely more effective at describing what happened than “killed herself”. She didn’t inflict on herself the conditions that led to suicide. Other people did that, and they are ultimately responsible for her death, even if it was technically by her own hand.
Those contemptible fucking bastard Nazi shitheads.