Author Topic: "Big payments"/lump sums... re: "ai training" - how to pay contributors in future


« on: September 13, 2023, 07:55 »
+8
Really don't like the way these companies basically say:

"Oh yah, we stole, erm, 'trained' our AI tool on your images... here's what we figure it's worth, now don't bother us."

It's theft, plain and simple. It doesn't matter how "they" justify it: if you did not explicitly give permission, it's theft. Some "justify" it by saying "well, we have rights in our license agreement" - eh, no. The very way they've "approached" it shows they KNOW it is wrong. Why would you have to hide what you are doing, and only after the fact say "oh, here's some random money, now don't bother us," if you didn't feel it was wrong? Because they KNOW - or at least strongly suspect - that if they said to contributors, "Hey guys! We want to make a tool that will make US more money in the long term, and you less - so we basically want to rip off your images to do that, but we'll pay you a couple bucks so you don't feel bad, how does that sound?" - MOST contributors would MOST LIKELY say, em, no.

Obviously dall-e/midjourney/etc. were the first to simply STEAL images... and one of the "pesky" little "problems" they have is getting rid of "watermarks"... hmm... how EVER could WATERMARKS have GOTTEN there? OH, the mystery!

(While I have used the tools - and I admit they are 'cool' - I think the approach to creating them is wrong. The companies should compensate artists for the hard work of creating the source images, PLUS pay future compensation for every single image generated from them. Programmatically, it is VERY EASY to set up such a system. RETROACTIVELY, it is ALSO possible - more work, but definitely doable.)

Then shutterstock basically ripped things off, then said "oh haha, yah, here's some money, SOOOOOOOreee! We already ripped it off, so you can't get it back, but here's what we randomly decided to pay you!"...

Sad to see what I would have considered better companies now following suit.

BY THE WAY...

CONTRARY to what these "AI" companies say (i.e., midjourney/dall-e/etc.), it actually IS possible to "backwards compensate"/retroactively pay contributors for the images they took, IF they chose to do so, because:
(a) They "tracked" which images they fed into their training set.
(b) People who generated images used certain 'neural nodes' to create each image.
(c) They keep EXTENSIVE track of EVERY SINGLE THING created with the software.

So it IS possible to write an algorithm/"AI" to do super micropayments (fractions of a cent) for EVERY SINGLE IMAGE created. THEN it is possible to find those contributors (i.e., via the images that had the 'pesky little watermarks') and compensate them - and then it IS POSSIBLE to PAY OUT for EVERY SINGLE IMAGE going forward based on those BASE IMAGES...

It may be a bit of work - but just an FYI - it IS possible, contrary to what "they" might say. You'd just have to write a computer algorithm to do so.

So going forward, for EVERY single "AI" image created, you could be compensated fractions of a cent for the 'neural node' inputs used to create it (i.e., $0.0001, because components of your image were used in making a new composite). With the hundreds of thousands (more likely millions) of images being created every day, that would quickly add up AND give you a nice, consistent future revenue stream.
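As a rough back-of-envelope sketch of how fractional-cent royalties could add up (all of these numbers are invented for illustration, not anyone's actual rates):

```python
# Hypothetical per-use royalty model: a contributor is credited a tiny
# amount every time a generated image draws on their work.
PER_USE_ROYALTY = 0.0001   # assumed dollars credited per qualifying generation
uses_per_day = 250_000     # assumed number of daily generations touching your images

daily = PER_USE_ROYALTY * uses_per_day
monthly = daily * 30

print(f"daily: ${daily:.2f}")      # daily: $25.00
print(f"monthly: ${monthly:.2f}")  # monthly: $750.00
```

The point is only that at generation volumes in the millions per day, even a $0.0001 rate produces a non-trivial recurring stream for widely-used source images.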


« Last Edit: September 13, 2023, 08:02 by SuperPhoto »


Mir

« Reply #1 on: September 13, 2023, 08:08 »
+1
A great selling point too - we will read all the news articles, forum posts, and video comments about how they compensated the contributors.
And all the people happily creating their AI content won't be needed soon; they are a useful tool for training the AI for now.

« Reply #2 on: September 13, 2023, 08:29 »
+1
I absolutely think artists should have the right to opt out of any training models.

Make it a legal requirement to get the artist's permission, and to immediately remove them from any training sets if they write to the "image collectors" and demand their content be removed.

Same for written text, music, software...

If the content is properly licensed, I have no problem with it.

And personally, I am paying for commercial use of AI content, so that money should somehow go back to the artists.

« Reply #3 on: September 14, 2023, 00:23 »
0
Quote from: SuperPhoto on September 13, 2023, 07:55
I see your point, but I don't think it is technically possible to tie output to the specific learning material that went into the model. So in fact you would need to distribute money broadly to creators, indiscriminately of the quality and usefulness of their work. If 100,000,000 images went into the training, the compensation would need to be split between all of them - and MidJourney doesn't even know who they are, because they scraped the internet.
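To put illustrative numbers on that objection (the pool size and image count here are invented): if per-output attribution is impossible and a royalty pool must be split equally across every training image, the per-creator share becomes tiny.

```python
# Hypothetical equal-split fallback when outputs can't be attributed
# to specific training images.
pool = 10_000_000              # assumed annual dollars set aside for contributors
training_images = 100_000_000  # assumed size of the scraped training set

per_image = pool / training_images
print(f"per image per year: ${per_image:.4f}")  # per image per year: $0.1000
```

So under equal splitting, even a ten-million-dollar pool works out to about a dime per training image per year - which is the dilution problem this reply is pointing at.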

« Reply #4 on: September 14, 2023, 15:25 »
+1
I for one think it is great... They gave me a few hundred bucks for free. Well, nothing is really free, but I will take it... No one else is giving me free money... Thank you Adobe...

« Reply #5 on: September 14, 2023, 16:55 »
0
Quote
I see your point, but I don't think it is technically possible to tie output to the specific learning material that went into the model. So in fact you would need to distribute money broadly to creators, indiscriminately of the quality and usefulness of their work. If 100,000,000 images went into the training, the compensation would need to be split between all of them - and MidJourney doesn't even know who they are, because they scraped the internet.

From a programming standpoint, it actually is very easy/very possible... Whether or not they do it is something entirely different, but it is certainly possible. Here's how you'd do it (just one way):

a) Most companies (midjourney included) keep EXTENSIVE server logs/etc. They also keep "snapshots" of their databases. (I believe they did this going from v4 to v5, because some artists were quite vocal, and some - if I recall what I read correctly - actually got their material removed from the database.)
b) If you wanted to pay the artists whose works were taken, you'd simply "re-scrape" the content and extract contact info. There's a good chance they have archives of the data they scraped, so they'd just have to process their archived data (no need to re-scrape the net).
c) Of course, maybe not "everyone" would have contact info - but enough would that one could contact them.

In terms of on-going perpetual compensation for using their assets -
d) Basically, it would require some tweaking of the current neural net algorithm, so that when it creates "datapoints" for images, each one includes an identifier for whose content it was. (Chances are they ALREADY have that - they just don't make it publicly known.) But it's relatively easy to do.
e) When an image is created from, say, 1000 datapoints (just using easy numbers here), each tagged artist gets a microfraction of compensation (say $0.00001). It may not seem like much, but when millions of images are generated daily, it adds up (i.e., if someone's image was used 1 million times in a day, that is $10).
f) You then compensate them.
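The steps (d) through (f) above can be sketched in a few lines - with the caveat that all names and the royalty rate here are invented, and this assumes (as the post does) that each "datapoint" really can carry a contributor ID:

```python
from collections import defaultdict

ROYALTY_PER_DATAPOINT = 0.00001  # assumed dollars credited per datapoint used

# Step (d): each datapoint is tagged with the contributor whose image produced it.
datapoint_owner = {
    "dp1": "artist_a",
    "dp2": "artist_a",
    "dp3": "artist_b",
}

ledger = defaultdict(float)  # contributor -> accrued dollars

def record_generation(used_datapoints):
    """Step (e): credit every tagged contributor whose datapoints were used."""
    for dp in used_datapoints:
        ledger[datapoint_owner[dp]] += ROYALTY_PER_DATAPOINT

# One generated image that drew on three datapoints:
record_generation(["dp1", "dp2", "dp3"])

# Step (f): pay out the accrued balances.
for artist, owed in sorted(ledger.items()):
    print(artist, f"${owed:.5f}")
```

Whether a real diffusion model's internals can be tagged this cleanly is exactly what the previous reply disputes; the sketch only shows that the bookkeeping side is trivial once attribution exists.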

Existing and future "AI" systems (not true "AI" - just a popular term nowadays for things people have been doing for 40+ years) can incorporate this type of "tagging" at image creation in order to properly compensate the artists whose works were used.

g) Opting out is also quite easy. You'd just tag certain assets and exclude them from image creation.

It may require a bit of tweaking of existing neural net structures, but it is EXTREMELY feasible, provided someone just DOES it - ESPECIALLY with the MILLIONS and MILLIONS in revenue most likely being generated daily.
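Step (g), the opt-out, could be as simple as filtering owner-tagged assets out of the training set before any training pass - again a sketch with invented names, assuming assets carry an owner tag as described above:

```python
# Hypothetical opt-out filter: drop assets whose owners have opted out
# before the training set is assembled.
opted_out = {"artist_b"}  # assumed list of contributors who demanded removal

training_assets = [
    {"id": "img1", "owner": "artist_a"},
    {"id": "img2", "owner": "artist_b"},
    {"id": "img3", "owner": "artist_c"},
]

usable = [a for a in training_assets if a["owner"] not in opted_out]
print([a["id"] for a in usable])  # ['img1', 'img3']
```

The hard part is not this filter but retraining or editing a model that has already ingested the opted-out material - the filter only prevents future use.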


 
