You're simply repeating your earlier statement here, not refuting any of Kinsella's arguments as to why IP is illegitimate and shouldn't be considered an NAP violation.
You said, ignoring the ownership problems in your original quote: "Even ignoring the ownership debate, there's still the question of how far you can go to enforce IP protections without infringing on the NAP in other ways?"
So I was like, okay, if we ignore the ownership problem, once the NAP has been violated, you are allowed to retaliate. The NAP doesn't have much to say about what one can or can't do once the initial aggression has happened.
If anything, copyright law means you're telling Bart that he can't use his own paper, ink, and printing press to make his own copy of Adam's book — thus giving Adam some measure of control over property (Bart's paper, Bart's ink, and Bart's printing press) that isn't actually his by preventing Bart from using them the way he wants to. That's more of an NAP violation than breaking the IP law itself, if you ask me.
Eh, there are fair arguments on both sides here, but the thing you're discounting is the labor. Bart is using his own stuff, but he's also using Adam's labor (the words in the book) without Adam's permission, and that labor is a large part of the value of the book itself. When it comes to software and digital art, the idea becomes nearly the entire value of the piece.
Now if Bart independently came up with the idea, that's a different story. And this is the key difference Rothbard makes re: copyright law vs patent law. More here:
mises.org
While the idea of using certain tools to fish for original works used to train an AI model is "interesting", I'm not aware of any cases where that's actually been achieved. Now, I'll grant I could be wrong about that; and even if I'm right, technological advancement will make the means of screening such works out better than they currently are anyway.
It's very new, as this is a very new area of law. I have no idea where technology or the law will go here.
But even then, I'm not so sure your solution would be enforceable in the event the model's creator is an anonymous online personality who lives in Russia or something. Technically, I suppose you could apply discovery to find out who they are and bring them to court (if they're American), though as you admit, that'd require the government to intervene quite a bit more. In fact, even if it doesn't require much (if any) additional federal power, the government will still seize the opportunity to pass a slew of new laws anyway — whether we actually need them or not.
If this causes a further decentralization of power for AI, that's also a win, I'd say. Just in a different direction. As for this causing a slew of new laws: there are always a slew of new laws. The question at hand is 'is this one good?'
I also notice that you glossed over my other examples of how IP enforcement would expand to cover an array of other areas, such as bans on sharing certain files online (internet censorship and surveillance) and cracking down on illicitly manufactured patented goods (going after 3D printing and CNC machining). Frankly, I think unless Libertarians are okay with going after those (and sacrificing some of their best hopes of decentralizing the means of production along the way)... then they'd just be best served acknowledging that traditional IP is a lost cause and siding with the Copyleft Crowd.
It very much doesn't apply to those, as those are already subject to copyright. For example, one can already copyright a CAD design for 3D printing, then sell it online and DMCA-strike people who copy it without asking.
This is specifically for the question of "is the use of an artwork in an LLM's training data set a copyright infringement, and is it covered under fair use?" My new law clarifies that yes, it is an infringement.
Ultimately, it's one thing to advocate certain laws, but you also need to think of what it'd take to actually enforce them and what that cost-benefit analysis looks like. To me, the future doesn't look all that bright for the pro-IP side, so if I were a content creator, I'd try to stay ahead of the curve and make sure my business model adapts well to a world where traditional enforcement won't be viable for much longer. Big corpos and proprietarians clinging to the old paradigm won't adapt well at all, I fear.
Note that enforcement would be by private actors. And yes, people clinging to old paradigms will lose. But that doesn't excuse mass theft. Without this, we could end up with static art: if no new art is made, all AI will have to learn from in the future is other AI output.
You may think that's not what you're for, but that's nothing more than a pleasant-seeming delusion on your part (which isn't a surprise coming from you, seeing as many of your political positions have tended to be severely lacking in terms of being rooted in reality). Saying that "Using an artwork to train a data model is a violation if one does not have copyright permission" is no different from saying "Using an artwork to train a person is a violation if one does not have copyright permission."
First off, legally, they are distinct. But I get you are aiming towards morality here.
Second, yes, they are similar morally: both are morally bad. Using art you've obtained illegally to learn how to do something (or, for that matter, anything else) is wrong. Illegally downloading music in order to learn how to make music does violate IP laws. Your creations afterwards may or (more likely) may not violate them.
Note that it's very easy for a person to listen to music, look at art, or read a book legally. For music, there's the radio and online streaming services. For art, there are museums and online collections. For books, there are both libraries and online fiction of various types. I'd say that 99% of what 95% of artists consume in terms of art is legally acquired; they just never noticed it. Then there are a few artists (mostly musicians) who learned off of stolen IP.
Again, I'm not saying that an art program is stealing the art when a person types into the art generator that they'd like a picture of a cat with googly eyes. I'm saying that the art theft happened when the model was trained. I believe this covers the rest of the post as well.