I found this very interesting, because I'm only a little aware of the emerging law in this area. I have a question that you might have insight on:
I imagine there have to be conditions on when something is required to be labeled vs. not. For example, if I make a birthday card for a friend with Generative AI and Photoshop, I would find it very weird if there were a law that technically said I had to label it as AI-generated. Or, for that matter, that I had to label it as using Photoshop, or digital editing, etc.
My question is: Do we have a sense yet of where that line is? I write blogs sometimes, and I use Generative AI images to illustrate the writing. Would I be expected to label those images as AI-generated? More important to me: if I were to write a self-published book and then used AI-assisted tools to create the book cover (putting my own ability to copyright it aside), does the book cover need to have some sort of watermark on it showing it's AI-generated?
In none of those cases is the AI-generated art particularly meaningful to the product's value. It's not tricking the person buying the book, and I would be surprised if telling them the cover was AI-assisted art caused them to put the book back. It seems strange to force arbitrary labeling onto all forms of generated art. Again, I'd feel equally weird if every piece of digital art needed to have a "Photoshop" label on it to show it was digitally created.
Obviously, this is much different if there's some sort of deception going on. If I hire an artist to create art, and they use generative AI to do it without telling me, that seems bad. Written information and articles - I can see the value of having them labeled the same way you label sourced materials. But there are many areas where it's much harder for me to articulate the harm in my having a more attractive book cover than I have the skill to produce on my own.
So yeah, your article got me thinking about it, so I just decided I'd ask. :)