I agree that AI work should not have copyright protection. Even with human intervention, the model still collects data, without express permission, from numerous sources.
This will actually protect smaller artists. It will prevent giant companies from profiting from their work without credit or payment.
Generative AI models could be trained using only public domain and royalty-free images. Should the output of those models be eligible for copyright, but not the output of models that also used unlicensed training data?
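To make that concrete, here is a minimal sketch of what “trained only on permissively licensed images” could mean in practice: filter the training set by its license metadata before training ever starts. The field names and license labels below are illustrative assumptions, not any real dataset’s schema.

```python
# Minimal sketch: keep only permissively licensed records before training.
# The "license" field and the label strings are assumptions for illustration.
ALLOWED_LICENSES = {"public-domain", "cc0", "royalty-free"}

def licensed_only(records):
    """Return only the records whose license metadata is in the allowed set."""
    return [r for r in records if r.get("license", "").lower() in ALLOWED_LICENSES]

dataset = [
    {"url": "https://example.com/a.png", "license": "CC0"},
    {"url": "https://example.com/b.png", "license": "all-rights-reserved"},
]

print(licensed_only(dataset))  # only the CC0 entry would ever reach the trainer
```

Under that setup, whether the output deserves copyright becomes a purely legal question, since the data-provenance dispute drops away.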
It seems there are two separate arguments being conflated in this debate. One is whether using copyrighted works as AI training data is fair use. The other is whether creative workers should be protected from displacement by AI.
But that won’t happen. Companies have money and, by extension, lobbyists. It doesn’t matter what the general consensus is; they will get their way.
So we kill open-source models while proprietary models like Adobe’s are fine, so they can be the only option and keep rent-seeking while independent artists eat dirt.
Whether or not the model learned from my art is probably not going to affect me in any way, shape, or form, unless I’m worried about being used as a prompt so people can use me as a compass while directing their new image aesthetic. Disney/Warner could already hire someone to do that 100% legally, so it’s just the other peasants I’m worried about. I don’t think the peasants are the problem when it comes to the wellbeing and support of artists.
OK, that’s entirely disingenuous. You can make an open-source AI model where you get real permission from artists instead of taking their work without permission and using it to replace them.
It’s entirely possible. Will the large orgs have more resources to collect art? No shit, yeah, and they’ll have better PCs to train on too. No matter what, whether they’re allowed to take from hard-working small artists while trying to make them irrelevant or not, the big corps will always have a head start here.
Unless, like so many other projects, an extremely complicated system benefits from collaborative work in an open environment. Shocking, I know: working on merit over money.
You don’t want a conversation, though; you just want “epic dunks.”
The amount of training data needed for a model is so huge that you’d have to use only artwork that was preemptively licensed for that purpose. Individually asking artists for permission to use their work would be far too expensive, even if they all agreed to let you use it for free.
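For a rough sense of that scale, here is a back-of-envelope sketch. Every number in it is an assumption chosen for illustration, not a figure from any actual model or dataset.

```python
# Back-of-envelope arithmetic with assumed, illustrative numbers only.
TRAINING_IMAGES = 2_000_000_000   # assumed size of a generalist training set
IMAGES_PER_ARTIST = 500           # assumed average portfolio size per artist
MINUTES_PER_REQUEST = 10          # assumed time to contact one artist and record an answer

artists_to_contact = TRAINING_IMAGES // IMAGES_PER_ARTIST
person_minutes = artists_to_contact * MINUTES_PER_REQUEST
person_years = person_minutes / 60 / 8 / 250   # 8-hour days, 250 working days a year

print(f"Artists to contact: {artists_to_contact:,}")
print(f"Person-years spent just asking: {person_years:,.0f}")
```

Even with generous assumptions, “just ask everyone individually” works out to hundreds of person-years of outreach before a single image is licensed, which is the point above.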
And why is that a problem?
Artists should have control over their work. It doesn’t matter to me whether it’s a big company or a small one stealing my work; I don’t want either of them to.
I no longer post anything online that I create because I’d rather nobody see it than have it stolen for AI training.
That is correct, though there could be campaigns to collect art another way. There are plenty of artists in the open-source world who could do it, and asking individuals to signal-boost those calls to action can get more push. Once more, no matter what, big corps will always have more monetary resources. The power of open source is volunteer manpower and passion. Even if that weren’t the case, the moral argument against using a person’s work to replace them without permission still stands.
Regardless of that, what this will do, if art isn’t protected, is cause stagnation in the field. Nobody’s going to share their art, their methods, or their ideas freely in a world where doing so lets a massive corp take them without permission and replace them. That kills the open distribution of art and art knowledge. It will become hidden, and new ideas and new art will not be available to see.
Allowing people to take without permission will only ever hurt the small artists. Disney will always be able to just “take” any art they make.
Also, you’re not entirely correct on that. Models made for specific purposes don’t actually need the absurd amount of data generalist models need. However, in the context of current expectations, yeah, you’re right on quantity.
I believe a person can still sell or market art that is AI-created. I just believe they shouldn’t have total ownership of the work.
Already, most creators don’t fret over fanart or fanfiction, so there is wiggle room for fair use. It’s a lot like the game modding scene: usually modders use pre-existing assets or code to create something new.
Let people play but not own AI work for now.
Big companies like Adobe and Google can get the rights to use material to train their models. If stricter laws get passed, they will only slightly inconvenience the larger companies but might completely destroy the smaller companies and open-source alternatives.
The anti-AI lawsuits aren’t going to stop AI art; they’ll just determine whether or not it’s completely controlled by the current tech giants.
Sadly, no matter what, the big media companies are going to have a huge advantage in everything because of decades of lobbying and so on.
I think people should still be able to profit from selling the image themselves; however, I don’t think we know enough yet about how AI will truly impact things. If it becomes a minor fad and is just a tool that helps speed up a process, I don’t think the law needs to change much.
If AI becomes the majority creator on projects, then we have to have this conversation about who owns what.
Closed models will probably be the future, much like stock photos, and people will have to pay to access them.
In the end, big business will always fuck us over, copyright or not.