

Author Topic: A.I. Legal cases  (Read 10236 times)


« on: April 15, 2023, 03:34 »
+5
Sky News today aired a story about gathering momentum among creative artists calling for a halt to illegal A.I. data-set usage, because work is being plucked directly off their private commercial websites.

 https://www.google.com/amp/s/news.sky.com/story/amp/ai-art-generators-face-backlash-from-artists-but-could-they-unlock-creative-potential-12857072

Someone should pay attention or it's going to be expensive down the line. The US Copyright Office decided in 2022 that A.I.-generated work wasn't eligible for copyright and began defending a lawsuit from a company claiming this decision was wrong. Nevertheless, at present A.I.-generated work isn't copyrighted. So anyone who wants to use it can do so free of charge.

They were open to exploring a change in the law if humans were involved, for instance if an artist or photographer used A.I. to enhance their work. But the problem is deciding at what percentage of A.I. involvement the copyright ceases to apply. They state they will look at the situation this year, which explains the gold rush to get it up and running.

The UK government is now looking at the illegal use of artists' (creators') work via data sets, which, after pressure from many trade bodies, it is aware have been used illegally, and it will be looking at changing the law. It states that initially this change in law will possibly require voluntary registration. But the big players will rush to join, and those who dig their heels in will pay the price, because that's how it always works, no matter what it is.

So Adobe ... look faster at a compensation model.

A paper on ResearchGate in January 2022 examined the potential for legal action against companies that use copyrighted work to train their A.I.

Using Generative Adversarial Networks (GANs) to create images from data sets, which is how many A.I. generators work, they looked at the legal position regarding copyright law, although newer methods exist such as CANs (Creative Adversarial Networks). The former focus on the predominant features of a data set: if a GAN was trained on animals, it created animal-like images, even when clouds containing animal-like features were added. CANs were created to remove human input from the creation process, but they nevertheless use human data sets.
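As a layman's illustration of why a GAN's output gravitates toward the predominant features of its training data, here is a toy adversarial loop in Python. This is my own sketch, not code from the paper: a one-parameter "generator" and a logistic "discriminator" compete over 1-D numbers instead of images, and the generator drifts toward the data set's mean.

```python
import numpy as np

# Toy GAN: a generator that shifts noise by a learned mean, versus a
# logistic discriminator that scores samples. The "training data" is
# drawn from N(4, 1.25); everything here is a deliberately tiny sketch.

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

g_mean = 0.0          # generator's only parameter
d_w, d_b = 1.0, 0.0   # discriminator's weight and bias
lr = 0.01

for step in range(2000):
    real = rng.normal(4.0, 1.25, size=32)        # samples from the data set
    fake = rng.normal(0.0, 1.0, size=32) + g_mean

    # Discriminator: push D(real) toward 1 and D(fake) toward 0
    # (gradient of the logistic loss w.r.t. the logit is p - label).
    for x, label in ((real, 1.0), (fake, 0.0)):
        p = sigmoid(d_w * x + d_b)
        d_w -= lr * np.mean((p - label) * x)
        d_b -= lr * np.mean(p - label)

    # Generator: push D(fake) toward 1 by nudging g_mean.
    p = sigmoid(d_w * fake + d_b)
    g_mean -= lr * np.mean((p - 1.0) * d_w)

# g_mean has drifted from 0 toward the data mean (4): the generator can
# only learn to mimic whatever the data set predominantly contains.
```

The generator never sees the real samples directly; it only learns from the discriminator's reaction, yet it still ends up reproducing the training data's character, which is exactly why the training set's copyright status matters.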

After examining copyright law in this framework, they state that copyright law may be triggered if the source data was copyrighted work. It varies from state to state, but generally this is a given via reproduction laws. Even partial use is covered and requires the authorisation of the author/artist/creator, which was never given. Implied authorisation by use of the site isn't enough, because you have to be aware your images are being used for this purpose.

Opinion - If the output of the A.I. isn't yet copyrighted, then anything it creates is free to use and requires no payment.

If people are paying for those created images, it shows that they believe their purchase is protected by usage laws. In this regard they believe their purchase is copyrighted and safe. And no doubt the Ts&Cs will assert as much, so a company selling these pictures must be offering to protect the images, because it is charging for them. This implies the sold A.I. output is copyrighted and protected by Adobe's legal framework.

If this is true, they must have paid for the licence to use copyrighted work to create the protected work they sell. If they didn't, they must give the A.I.-generated work away for free. They can't claim fair use, because they sell worldwide and are also governed by TDM (Text and Data Mining) rules in Europe, which exclude commercial gain.

You can form your own opinions, but compensation now will be much cheaper than compensation later, because now you pay for what you have used. Later you have to pay everyone, because you won't be able to prove whose work you did or didn't use.

ResearchGate source - https://www.researchgate.net/publication/357685384_Protection_of_AI_generated_photographs_under_copyright_law_pre_print_version_09_01_2022

Edit: I've just seen a payment today from Pond5 for data-set use of my work. SS have also paid for use of my work. As we can clearly see, regardless of the laws in place at present, stock agencies do not want to be the last one digging their heels in.





 



« Last Edit: April 15, 2023, 03:47 by Lowls »


« Reply #1 on: April 15, 2023, 03:57 »
0
So what are the conclusions? Stock agencies are not allowed to sell content to buyers on their own behalf?

« Reply #2 on: April 15, 2023, 04:41 »
+4
So what are the conclusions? Stock agencies are not allowed to sell content to buyers on their own behalf?

No, that's already happening, I believe. I'm not a buyer so I don't know. But the conclusion is that if the A.I.-generated product isn't copyrighted, and I believe it isn't, why do people need to pay for it? They could just take it. An A.I. can't bring a case against them, yet it is the creator, and its work, according to last year's decision, wasn't copyrighted. So who can claim copyright theft? Unless laws have changed since then, it appears the work can just be taken.

« Reply #3 on: April 15, 2023, 06:00 »
+2
So what are the conclusions? Stock agencies are not allowed to sell content to buyers on their own behalf?

No, that's already happening, I believe. I'm not a buyer so I don't know. But the conclusion is that if the A.I.-generated product isn't copyrighted, and I believe it isn't, why do people need to pay for it? They could just take it. An A.I. can't bring a case against them, yet it is the creator, and its work, according to last year's decision, wasn't copyrighted. So who can claim copyright theft? Unless laws have changed since then, it appears the work can just be taken.

It will be interesting; not sure we can conclude it can be taken. Taking property illegally, copyrighted or not, is still theft, whether downloading without paying (breach of terms) or removing watermarks (intention to defraud). I think the bigger issue is for buyers: if they use A.I.-generated material they can't copyright it, so this would be a big issue if they use it in trademark and copyright matters.

« Reply #4 on: April 15, 2023, 06:37 »
+3
This means that artificial intelligence will not deprive us of work, because responsible buyers simply will not download this artificial content on the stock sites.

« Reply #5 on: April 15, 2023, 07:08 »
+2
This means that artificial intelligence will not deprive us of work, because responsible buyers simply will not download this artificial content on the stock sites.

Well, I think buyers will want a product that is protected by copyright so that they can use it safely.

« Reply #6 on: April 15, 2023, 07:18 »
+2
So what are the conclusions? Stock agencies are not allowed to sell content to buyers on their own behalf?

No, that's already happening, I believe. I'm not a buyer so I don't know. But the conclusion is that if the A.I.-generated product isn't copyrighted, and I believe it isn't, why do people need to pay for it? They could just take it. An A.I. can't bring a case against them, yet it is the creator, and its work, according to last year's decision, wasn't copyrighted. So who can claim copyright theft? Unless laws have changed since then, it appears the work can just be taken.

It will be interesting; not sure we can conclude it can be taken. Taking property illegally, copyrighted or not, is still theft, whether downloading without paying (breach of terms) or removing watermarks (intention to defraud). I think the bigger issue is for buyers: if they use A.I.-generated material they can't copyright it, so this would be a big issue if they use it in trademark and copyright matters.

This is the point, though, isn't it? Watermarks can be removed because an A.I. created the work. Watermarks are for ownership protection, and they don't own it 😲. They didn't pay for the rights to use the data sets; they just took them. You cannot claim ownership of stolen goods.

If you steal a pair of dogs and use them to mate, the pups don't belong to you just because you introduced them.

If you steal a red car and paint it green it still isn't yours.

If you steal a Monet from a gallery and use its detail to paint a fake and sell it for millions ... you don't get to keep your millions.

And at least technically the items produced in those examples are, in theory, covered, because as per the law they have had human interaction.

A.I. cannot claim any of these traits.

« Reply #7 on: April 15, 2023, 09:32 »
+6
A few years ago there was an outcry when Getty Images was charging for downloads (from the agency web site) of public domain images. Getty's claim was that they were entitled to charge for the convenience of having scanned and hosted these in a convenient way for their customers. In other words, the fact that Getty didn't own the copyright in an image wasn't significant in offering it for sale. I'd assume the same would apply to licensing or sale of AI-generated imagery where no one held copyright in it.

https://www.latimes.com/business/hiltzik/la-fi-hiltzik-getty-photos-20160801-snap-story.html
https://will.illinois.edu/legalissuesinthenews/program/getty-images-and-fair-use
https://petapixel.com/2016/11/22/1-billion-getty-images-lawsuit-ends-not-bang-whimper/

« Reply #8 on: April 15, 2023, 10:43 »
0
A few years ago there was an outcry when Getty Images was charging for downloads (from the agency web site) of public domain images. Getty's claim was that they were entitled to charge for the convenience of having scanned and hosted these in a convenient way for their customers. In other words, the fact that Getty didn't own the copyright in an image wasn't significant in offering it for sale. I'd assume the same would apply to licensing or sale of AI-generated imagery where no one held copyright in it.

https://www.latimes.com/business/hiltzik/la-fi-hiltzik-getty-photos-20160801-snap-story.html
https://will.illinois.edu/legalissuesinthenews/program/getty-images-and-fair-use
https://petapixel.com/2016/11/22/1-billion-getty-images-lawsuit-ends-not-bang-whimper/

Thank you for that, Jo, that does indeed tally with what I assumed. Whilst Getty acquired the library, it didn't own the copyright, as the article states.

Whilst buyers purchase use of the image from Getty, that doesn't indemnify the purchaser from copyright infringement, and therefore Getty indemnified the purchaser themselves because the law couldn't. Hahahaha ... oh dear. Well, this is an extremely ugly can of worms, isn't it?

But also, we have not handed over use free of charge in any way, by way of donation to public use. So it's slightly different.
« Last Edit: April 15, 2023, 10:46 by Lowls »

Mir

« Reply #9 on: April 15, 2023, 17:32 »
0
I am not sure if this was already shared:
Class Action Filed Against Stable Diffusion, Midjourney, and DeviantArt
https://stablediffusionlitigation.com/
This is from January but I can't find any updates.

« Reply #10 on: April 16, 2023, 07:24 »
0
I am not sure if this was already shared:
Class Action Filed Against Stable Diffusion, Midjourney, and DeviantArt
https://stablediffusionlitigation.com/
This is from January but I can't find any updates.

https://www.google.com/amp/s/finance.yahoo.com/amphtml/news/generative-ai-heading-down-dangerous-164041633.html

Although the words "million-dollar lawsuit" have now been replaced with "trillion-dollar" and "billion-dollar" lawsuits in the various online articles regarding Getty and others. They have yet to respond to the lawsuits.
« Last Edit: April 16, 2023, 07:26 by Lowls »

Uncle Pete

  • Great Place by a Great Lake - My Home Port
« Reply #11 on: April 16, 2023, 12:06 »
+1
I am not sure if this was already shared:
Class Action Filed Against Stable Diffusion, Midjourney, and DeviantArt
https://stablediffusionlitigation.com/
This is from January but I can't find any updates.

It's filed in California, last update was March 14, 2023

https://unicourt.com/case/pc-db5-andersen-et-al-v-stability-ai-ltd-et-al-1380299


« Reply #12 on: April 19, 2023, 04:40 »
0
March 21st 2023

apparently we will be able to get paid, they told the media, and opt out ... if only they mentioned if/when/how ...

"Adobe Inc (ADBE.O) added artificial intelligence to some of its most popular software, including Adobe Photoshop and Adobe Illustrator, to speed the process of generating images and text effects, noting that creators whose work was used by the tools will be able to get paid."

"Nvidia trained the technology on images licensed from Getty Images, Shutterstock Inc (SSTK.N), and Adobe, and plans to pay royalties."

"Adobe's new AI-enhanced feature, called "Firefly," allows users to use words to describe the images, illustrations or videos that its software will create. Because the AI has been trained on Adobe Stock images, openly licensed content and older content where copyright has expired, the resulting creations are safe for commercial use, it said.

The company also is advocating for a universal "do not train" tag that would allow photographers to request that their content not be used to train models."

source https://www.reuters.com/technology/adobe-nvidia-ai-imagery-systems-aim-resolve-copyright-questions-2023-03-21/

« Reply #13 on: April 20, 2023, 07:13 »
0
Motion to dismiss filed in the class action lawsuit against Stability AI, DeviantArt, and Midjourney

https://twitter.com/technollama/status/1648981345924685824?cn=ZmxleGlibGVfcmVjcw%3D%3D&refsrc=email

Mir

« Reply #14 on: April 20, 2023, 07:28 »
+3
But it says 'The companies asked a San Francisco federal court to dismiss the artists' proposed class action lawsuit'.
Isn't the court supposed to decide whether to dismiss it or not?

https://www.reuters.com/legal/ai-companies-ask-us-court-dismiss-artists-copyright-lawsuit-2023-04-19/

Uncle Pete

  • Great Place by a Great Lake - My Home Port
« Reply #15 on: April 20, 2023, 13:33 »
+1
But it says 'The companies asked a San Francisco federal court to dismiss the artists' proposed class action lawsuit'.
Isn't the court supposed to decide whether to dismiss it or not?

https://www.reuters.com/legal/ai-companies-ask-us-court-dismiss-artists-copyright-lawsuit-2023-04-19/

Yes. A motion to dismiss is pretty standard, and it doesn't mean the judge will agree. My bold of the upcoming court date.

"PLEASE TAKE NOTICE that on July 19, 2023, at 2:00 p.m., or as soon thereafter as the matter may be heard, in the United States District Court for the Northern District of California, Courtroom 2, 17th Floor, located at 450 Golden Gate Ave., San Francisco, CA 94102, Defendants Stability AI Ltd. and Stability AI, Inc., through their undersigned counsel, will, and hereby do, move to dismiss Plaintiffs' Class Action Complaint ("Compl." or "Complaint") pursuant to Federal Rule of Civil Procedure ("FRCP") 12(b)(6)."

https://storage.courtlistener.com/recap/gov.uscourts.cand.407208/gov.uscourts.cand.407208.51.0.pdf

« Reply #16 on: April 20, 2023, 17:50 »
0
But it says 'The companies asked a San Francisco federal court to dismiss the artists' proposed class action lawsuit'.
Isn't the court supposed to decide whether to dismiss it or not?

https://www.reuters.com/legal/ai-companies-ask-us-court-dismiss-artists-copyright-lawsuit-2023-04-19/

Yes. Motion to dismiss is pretty standard and it doesn't mean the judge will say they agree. My bold of the upcoming court date.

"PLEASE TAKE NOTICE that on July 19, 2023, at 2:00 p.m., or as soon thereafter as the matter may be heard, in the United States District Court for the Northern District of California, Courtroom 2, 17th Floor, located at 450 Golden Gate Ave., San Francisco, CA 94102, Defendants Stability AI Ltd. and Stability AI, Inc., through their undersigned counsel, will, and hereby do, move to dismiss Plaintiffs' Class Action Complaint ("Compl." or "Complaint") pursuant to Federal Rule of Civil Procedure ("FRCP") 12(b)(6)."

https://storage.courtlistener.com/recap/gov.uscourts.cand.407208/gov.uscourts.cand.407208.51.0.pdf

And of course this isn't one of the high-profile cases. This is a collection of artists that have gone after them; this isn't Getty or future organisations. They should avoid wriggling out of this, because the hammer will fall on them, and how hard depends on what they do now. Already numerous legal bodies are scrambling to nail this down, and when they do ...

... further, other areas of society have begun recoiling from this tech: writers, painters, even the legal profession, songwriters and financial bodies. And now they are teaching ChatGPT to lie ... which will render it useless.

« Reply #17 on: May 19, 2023, 11:13 »
+1
... And so it begins ... Adobe this means you.

OpenAI CEO Sam Altman urged lawmakers to regulate artificial intelligence during a Senate panel hearing Tuesday, describing the technology's current boom as a potential "printing press moment" but one that required safeguards.

"I think that people should have the right to refuse to have their data (created content) used for it to be trained on"

Quickly followed by agreement, and talk of laws in the making to do just that.

Uncle Pete

  • Great Place by a Great Lake - My Home Port
« Reply #18 on: May 19, 2023, 11:45 »
0
... And so it begins ... Adobe this means you.

OpenAI CEO Sam Altman urged lawmakers to regulate artificial intelligence during a Senate panel hearing Tuesday, describing the technology's current boom as a potential "printing press moment" but one that required safeguards.

"I think that people should have the right to refuse to have their data (created content) used for it to be trained on"

Quickly followed by agreement, and talk of laws in the making to do just that.

This should be interesting and maybe bring some future returns for the use of data sets. We didn't create the art and photos to be paid nickels and dimes for their eternal use and re-use.


Used for training and I get paid this?

« Reply #19 on: May 19, 2023, 15:42 »
+1
... And so it begins ... Adobe this means you.

OpenAI CEO Sam Altman urged lawmakers to regulate artificial intelligence during a Senate panel hearing Tuesday, describing the technology's current boom as a potential "printing press moment" but one that required safeguards.

"I think that people should have the right to refuse to have their data (created content) used for it to be trained on"

Quickly followed by agreement, and talk of laws in the making to do just that.

He's obviously right, but that's a weird thing for him to say, right? He used people's data to train his ChatGPT.
It's already there, and it sounds like he wants to prevent competitors from getting access to the amount of data he had.
Or is he going to shut down ChatGPT and start all over again the "moral" way?

« Reply #20 on: May 20, 2023, 06:18 »
0
... And so it begins ... Adobe this means you.

OpenAI CEO Sam Altman urged lawmakers to regulate artificial intelligence during a Senate panel hearing Tuesday, describing the technology's current boom as a potential "printing press moment" but one that required safeguards.

"I think that people should have the right to refuse to have their data (created content) used for it to be trained on"

Quickly followed by agreement, and talk of laws in the making to do just that.

He's obviously right, but that's a weird thing for him to say, right? He used people's data to train his ChatGPT.
It's already there, and it sounds like he wants to prevent competitors from getting access to the amount of data he had.
Or is he going to shut down ChatGPT and start all over again the "moral" way?

I know, right. I couldn't listen any more because of flashbacks to Zuckerberg's BS. I say that because he stated, paraphrasing, "I think that government needs to make laws to protect creativity and jobs and people's livelihoods when this grows, and we have to decide what we do with our time", but then came his sickening smile as he said "but I am here to ask for that support and to work with the government to do this the right way, but I also have huge belief in the ability of humans to find new uses for ... blah blah blah".

So I think he has realised he can't get the s#it back in the horse. And he isn't as arrogant or as well protected as Zuckerberg, and he knows that when the hammer falls it's going to fall on him. So he's come running to get in with the in-crowd early doors. It's totally too late. He, I am sure, knows it's way too late, but I think he is banking on the government not knowing it's as late as it is. If he can hand the government the reins, it's their sh*tshow and he's just an adviser when it becomes apparent just how damaging this will be.

Drug companies bust as that thing examines every bit of data and works out the best chemical composition for everything.

Defence agencies bust as it examines all data and designs open-source hypersonic jets for the rich.

Secrets exposed as it crawls every chat, email, document, stock movement and utterance by anyone on any platform and joins the dots. A secret plane design shown on a patch of ground at White Sands, which gives the US superior air-power propaganda, gets matched up to a field in Texas belonging to a visual-effects artist, lol. Accountants out of work. Lawyers out of work because it wrote your whole defence for you while you ate a Pop-Tart.

This guy is a walking dead man. Of course he wants to help the government.

« Reply #21 on: May 20, 2023, 13:27 »
+1
"I think that people should have the right to refuse to have their data (created content) used for it to be trained on"
I give permission for education, but I do not give permission for the artificial intelligence to trade. Let the AI learn, but it cannot sell the works it creates. More precisely, in this case a database should be created, and each author the AI studied from should receive in future a percentage of the sales of everything the AI creates. And even now, the authors must give consent not only to AI training, but also to its business activities.

« Reply #22 on: May 20, 2023, 14:07 »
0

- there must be an opt-out
- and, in the case of Adobe, an undo and unlearn

it would be absurd getting some ridiculous compensation for training the final enemy, rich companies' cash cow, with one's hard work


« Reply #23 on: May 20, 2023, 16:11 »
0
it would be absurd getting some ridiculous compensation
At the moment, I am satisfied that AI pays good money for its training. I think that if AI has prospects and people buy its product, the compensation could also be very good money. As a result, one might no longer even need to work; the AI would work for the authors.  8) :)
« Last Edit: May 20, 2023, 16:14 by stoker2014 »

« Reply #24 on: May 20, 2023, 16:23 »
+2
it would be absurd getting some ridiculous compensation
At the moment, I am satisfied that AI pays good money for its training. I think that if AI has prospects and people buy its product, the compensation could also be very good money. As a result, one might no longer even need to work; the AI would work for the authors.  8) :)


 
