Author Topic: Now Adobe Stock keyword search excludes Generative AI contents at default?  (Read 6546 times)


« Reply #75 on: January 14, 2025, 02:08 »
+4
Here's another of Sydney from the top of the first page search results.

Since when did Circular Quay vanish?!  ;D ;D ;D

I've posted an actual photograph below where you can see the blatant, embarrassing error in this AI image, and the equally blatant, embarrassing acceptance of it into Adobe Stock's library...



« Last Edit: January 14, 2025, 02:12 by Pacesetter »


« Reply #76 on: January 14, 2025, 03:05 »
0
In the Adobe library this is clearly labelled as ai.

However, it should not have a description claiming it is a real location. But the customer can easily see it is a fantasy image.

On Shutterstock you find entire ports consisting of ai images that have been accepted as camera content or hand-made illustration, with no distinction. Customers cannot filter them out.

Adobe will certainly need to revisit their ai library and remove misleading content.

But if Adobe starts removing bad looking content, it will be removing lots of the ugly amateur photos first.

And all content is "sorted" by customer interactions.

You can always find ugly stuff to post here, but the reality is the ai collection is overall much better looking than the amateur content and mass duplicates being uploaded.

There is a reason ai sells so well.

And soon, you will have huge ai collections at all the other agencies.

It is inevitable.

« Reply #77 on: January 14, 2025, 04:29 »
+8
If you were the editor of Lonely Planet or National Geographic would you risk buying AI images?

AI images and texts are the result of probabilistic and not deterministic algorithms. Errors are always around the corner.

With Photoshop and retouching, errors were very rare, but when they happened, magazines were laughed at. With AI, the probability of running into these errors increases a lot.

« Reply #78 on: January 14, 2025, 04:30 »
+2
Here's another of Sydney from the top of the first page search results.

Since when did Circular Quay vanish?!  ;D ;D ;D

I've posted an actual photograph below where you can see the blatant, embarrassing error in this AI image, and the equally blatant, embarrassing acceptance of it into Adobe Stock's library...


OMG  :o ... but color and light are perfect  ;D

« Reply #79 on: January 14, 2025, 05:01 »
+1
If you were the editor of Lonely Planet or National Geographic would you risk buying AI images?

AI images and texts are the result of probabilistic and not deterministic algorithms. Errors are always around the corner.

With Photoshop and retouching, errors were very rare, but when they happened, magazines were laughed at. With AI, the probability of running into these errors increases a lot.

For specific locations, I would prefer camera content. Probably real editorial, to make sure nothing was edited.

But happy people hiking, at the beach, on a general city stroll, a close-up of a kid eating ice cream, families taking selfies against a generic summer background, an empty generic hotel room, a generic airport with people travelling - for all of that, ai will do perfectly well.

The reason ai is all over regular search results is that it gets a great response from customers.

Agencies always put customer interactions first.

That is why ai is mostly a "threat" to people who don't know how to shoot or process files properly.

No algo can force a customer to buy an ugly crappy "real camera" file if right next to it is a beautifully composed, well lit, great looking ai image.

But the editorial market will always be safe from ai and even with low quality amateur images people will still make money.

For generic commercial stock, however... if you don't know how to produce professional-looking content, the ai producers will eat your lunch.


« Last Edit: January 14, 2025, 05:59 by cobalt »

« Reply #80 on: January 14, 2025, 06:35 »
+1
That image of Big Ben, for example, can be corrected with a simple selection and generative fill; it takes no time at all.

AI is not suitable for creating images of specific, real locations, but it is better suited to other types of work.

AI is not the problem; the problem is who uses it.

Do you want to know how many AI images I created in the last week? (Probably not, but I'll tell you anyway!  :D )

I created 17 AI images in a week and now I'm working on the eighteenth.

I'll start by saying that I haven't been working many hours a day lately, but still, creating quality AI content (correcting the face, teeth and eyes, using generative fills, blurs and everything else) takes time. If you really want to create AI content that is interesting, different, good quality and marketable, you have to dedicate time to it.

In my opinion, creating quality AI content is more difficult than creating quality real content.

Creating videos, in my opinion, is easier than creating quality AI content; the only difference is that with videos you have to be at specific locations and you generally need suitable equipment (though not necessarily, and it still depends on the type of video, of course).

Good to see AI back on track in the search.

And the blue bar is back!  :)


« Reply #81 on: January 14, 2025, 07:03 »
+2

How will misleading content in the Adobe library be removed? Judging by the examples given here, everyone spots the errors in their own region.

I searched the Adobe library for "Istanbul" as artificial-intelligence content. I know the city very well because I live there. 85% of the images on the first page are misleading. How can an employee working at Adobe and living in another country know this?

And how can I know whether an image of "Hamburg", a city I have never visited, is misleading?

Actually, I wonder where exactly artificial-intelligence images are being used. AI images are definitely not used in magazines, newspapers, or press and publishing websites in Turkey. They appeared in stories from major newspapers a few times at the beginning. However, when models who did not look Turkish were used in a story about Turkish retirees, it was found quite ridiculous and the story was changed immediately.

I am sure that customers know what kind of image they want. Separating real photos and AI in searches is very good. But the filter could be in an easier-to-find place, for example among the options in the upper left corner.

(By the way, we can contribute from Turkey, but Adobe Stock is closed to Turkish customers. We can check our portfolios by changing the "tr" part of the site address to "fr". That's why the first page I see and the first pages you see may be different.)

« Reply #82 on: January 14, 2025, 08:36 »
+2
More and more customers are becoming aware of these horror stories, and I am 100% sure that they will turn away, or will be given instructions to stay away from this type of content, or from agencies that do not have a solid way of stopping this crap from slipping through.

Kudos so far to Getty for being very strict in this regard. They will kick out, in the blink of an eye, whoever tries to disregard these rules, and I think this time they are completely correct in protecting their library from those abuses. What they will do with the contaminated SS library we will see, but I guess that large clients who want assurance they will not get horrid samples like those posted here will have more reason to stay with the Getty library. In the end, the Adobe AI acceptance move might be much more problematic than first expected.

« Reply #83 on: January 14, 2025, 08:37 »
0

How will misleading content in the Adobe library be removed? Judging by the examples given here, everyone spots the errors in their own region.

I searched the Adobe library for "Istanbul" as artificial-intelligence content. I know the city very well because I live there. 85% of the images on the first page are misleading. How can an employee working at Adobe and living in another country know this?

And how can I know whether an image of "Hamburg", a city I have never visited, is misleading?

Actually, I wonder where exactly artificial-intelligence images are being used. AI images are definitely not used in magazines, newspapers, or press and publishing websites in Turkey. They appeared in stories from major newspapers a few times at the beginning. However, when models who did not look Turkish were used in a story about Turkish retirees, it was found quite ridiculous and the story was changed immediately.

I am sure that customers know what kind of image they want. Separating real photos and AI in searches is very good. But the filter could be in an easier-to-find place, for example among the options in the upper left corner.

(By the way, we can contribute from Turkey, but Adobe Stock is closed to Turkish customers. We can check our portfolios by changing the "tr" part of the site address to "fr". That's why the first page I see and the first pages you see may be different.)

The interesting question over AI is its potential impact on the stability and mental health of the general population. It has already been responsible for triggering civil disobedience via fake events generated, accidentally and intentionally, by local and foreign actors. These are the new battlegrounds of the future.

We're only at the beginning of AI really, so without some sort of safeguarding, control or regulation to protect jobs and the truth, the internet is at risk of ending up as a mass of information that no one believes anymore. A lot of the fact checkers we see today are driven by companies with political agendas that use them as a way to push their own version of the truth by bending it here and there. In other words, who is going to control and provide you with the truth in the future?

For instance, if we look at the amount of AI imagery available at the various agencies compared to real photographic imagery, and consider that AI images as well as real ones get used when creating new training datasets, the percentage of real imagery (correct data) will become smaller and smaller, and could produce even less accurate results if things are not managed correctly. If the same happens with written facts online, people will start to doubt everything they see, even when it is actually the truth.
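
A rough, purely illustrative sketch of that dilution effect (the mixing model and all numbers below are assumptions for illustration, not measurements from any agency): if each refresh of a training dataset draws on a library whose new uploads are partly AI-generated, the share of real camera imagery shrinks generation after generation.

# Hypothetical sketch: track the fraction of real (camera) imagery in a library
# that keeps absorbing new uploads, half of which are AI-generated.
def real_fraction_over_generations(initial_real=1.0, ai_share_of_new=0.5,
                                   new_vs_existing=0.3, generations=5):
    # initial_real:    starting fraction of real images in the library
    # ai_share_of_new: fraction of newly added images that are AI-generated
    # new_vs_existing: size of each batch of new uploads relative to the library
    real = initial_real
    history = [real]
    for _ in range(generations):
        # each refresh is a weighted mix of the existing library and the new uploads
        real = (real + new_vs_existing * (1.0 - ai_share_of_new)) / (1.0 + new_vs_existing)
        history.append(round(real, 3))
    return history

print(real_fraction_over_generations())
# [1.0, 0.885, 0.796, 0.728, 0.675, 0.635] -- the real share steadily declines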

AI has so much potential to do good things, to make a positive difference in life, but if we're not careful it could also do a lot of damage. Imagery, while important to us, is only a small element of the whole AI thing, and I worry that our current crop of politicians etc. are not really up to the job. They see $$$ signs first and fail to grasp the long-term damage it could cause if not handled correctly, probably because they know they won't be around when the proverbial hits the fan and it'll be for someone else to sort out.

For us older bods who only have a few years left to work it's probably not a major issue, but the younger generations who still have 20+ years left will see jobs being wiped out with minimal replacements in return, as businesses use the tech to slash more and more employees to improve their bottom line, and as politicians tax them and the remaining employees more and more to make up for the shortfall in the treasury coffers.

Of course, we don't really need to worry about it... our current crop of global leaders are bound to have it in hand 🙃

« Reply #84 on: January 14, 2025, 08:43 »
+1

How will misleading content in the Adobe library be removed? Judging by the examples given here, everyone spots the errors in their own region.

I searched the Adobe library for "Istanbul" as artificial-intelligence content. I know the city very well because I live there. 85% of the images on the first page are misleading. How can an employee working at Adobe and living in another country know this?

And how can I know whether an image of "Hamburg", a city I have never visited, is misleading?

Actually, I wonder where exactly artificial-intelligence images are being used. AI images are definitely not used in magazines, newspapers, or press and publishing websites in Turkey. They appeared in stories from major newspapers a few times at the beginning. However, when models who did not look Turkish were used in a story about Turkish retirees, it was found quite ridiculous and the story was changed immediately.

I am sure that customers know what kind of image they want. Separating real photos and AI in searches is very good. But the filter could be in an easier-to-find place, for example among the options in the upper left corner.

(By the way, we can contribute from Turkey, but Adobe Stock is closed to Turkish customers. We can check our portfolios by changing the "tr" part of the site address to "fr". That's why the first page I see and the first pages you see may be different.)

Adobe customers can also use generative edits to correct images however they want, and we also receive royalties if a customer edits and then downloads the result. To be clearer: if they modify your image and then download it, you still earn.

Actually, AI content is used everywhere, including in magazines and newspapers, and widely across the web.

Now, I don't know specifically about Türkiye. I'm in Italy, and I read the news every day and see AI-generated images every day.

Customers know what they are buying, because all AI content is labeled as AI, and anyone who doesn't label it will be blocked by Adobe before long.

« Reply #85 on: January 14, 2025, 09:03 »
+2
Adobe Stock should put a "Turn on AI-generated content" button above the search results. Most buyers may not use the filter panel.

No. Make it a "turn OFF" and leave it ON by default. Those that WANT it off will find it easily. Others will get the benefit of both.

Or flip that: those who need a legitimate image or asset benefit from having it off by default, and those who don't care whether it's artificial can easily find the turn-on switch.

« Reply #86 on: January 14, 2025, 09:09 »
0
I think the reason we see so little AI content in Turkey is that Adobe, which has the most AI content, does not sell to Turkish customers.

For this reason, almost all of the stock images used here are licensed through Shutterstock, iStock and Getty Images. Since there is no AI content in those libraries, we do not come across it either.

However, I have been looking at AI images related to Turkey for the last two hours. Most of them were not produced by Turkish users, but by citizens of other countries. Especially for tourism and travel images, they have had the AI create tourist sites with high sales potential just by typing in the words. But they may have generated images of places they have never seen, and submitted them without realizing they were misleading. I am sure the people who reviewed them accepted them because they did not know they were misleading either.

As Ramadan approaches, it seems we will see mosques with missing or extra minarets in many images. Any user can spot a kangaroo with six legs, but they will not notice a Hagia Sophia with three minarets instead of four as readily as they notice the kangaroo.

Or those who choose Bodrum in Turkey for their summer vacation will wonder "Why isn't it like the Game of Thrones castle on Adobe?" when they see the real Bodrum Castle.

It is only stock images, videos, etc. that Adobe does not sell in Turkey. Apart from that, we do buy Adobe's programs under license.

wds

« Reply #87 on: January 14, 2025, 09:11 »
0
Regarding the abundance of AI in the search results: I would think that is largely due to the unending flood of AI submissions... new items always start out high in the search. It would be an interesting stat to see the rate of new AI assets vs. the rate of new actual photo assets introduced into the collection.

« Reply #88 on: January 14, 2025, 09:14 »
+1
ai must be clearly labelled, then the customers can decide what they want to do.

And ai cannot be described or claimed to be of a specific location or editorial event.

It is forbidden under Adobe's rules and usually will not be accepted.

People who disobey the rules are at risk of having their ports closed for fraud, because that is what it is.

ai creation for stock is not designed to replace editorial photography.

Just like any photoshop art is not designed to replace editorial photography of a specific location.

---

Getty will take ai content when they are ready.

There is no question about that.

SS already has it, but made by their own team or customers. And it looks horrible so far. Then they have the ai from uploads that is not declared as ai and they don't seem to care...


I think Getty first needs to come up with an ai creator that can produce content on the same level as Midjourney. Adobe is trying to do that as well, but Firefly still cannot compete. Adobe is also more keen to integrate this tool into Photoshop, which makes perfect sense.

Once Getty has that, an ai tool as good as midjourney, they can start offering that app not just to commercial clients but also the general public that enjoys ai as a hobby.

That is a gigantic market, millions of people paying midjourney or other ai companies every month just to have fun with ai.

Then they can offer a commercial version to create stock, that probably can only be sold on istock/getty.

This way, they control the legal path and... every time such an ai image is sold, the original creators of the camera training content can also be paid a tiny amount of money.

At the moment they are testing this with the ai tool they offer: customers can amend files with the Getty ai tool, and original creators get tiny amounts based on how their work was included.
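
To make that pro-rata idea concrete, here is a minimal sketch of how such a split could work. The function, the 20% training-pool share and the weights are purely hypothetical assumptions for illustration; Getty has not published how its actual payments are calculated.

# Hypothetical illustration only: split one AI sale's royalty among contributors
# whose camera images were judged to have influenced the generated file.
def split_training_royalty(sale_price, contributor_weights, training_pool_share=0.20):
    pool = sale_price * training_pool_share          # assumed share set aside for training data
    total_weight = sum(contributor_weights.values())
    return {name: round(pool * weight / total_weight, 4)
            for name, weight in contributor_weights.items()}

# One $10 sale, three contributors whose work was weighted 5 : 3 : 2.
print(split_training_royalty(10.0, {"alice": 5, "bob": 3, "carol": 2}))
# {'alice': 1.0, 'bob': 0.6, 'carol': 0.4}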

But once ai content "made by getty" can be sold as stock on istock, it will become a very different thing.

Also they will make money from the huge group of ai producers who must then pay for the getty app to produce content.

And they can also forbid selling their ai on other platforms. So the Getty ai content will be exclusive to their platform.

Legally it is also much safer than content on Adobe, which is made with all kinds of platforms that did not license their training content properly.

Perhaps adobe or getty will also buy companies like midjourney, or sora or flux.

But ready-made ai content will be available on all platforms, just like pre-shot camera content.







« Last Edit: January 14, 2025, 09:23 by cobalt »

« Reply #89 on: January 14, 2025, 09:20 »
0
Regarding the abundance of AI in the search results: I would think that is largely due to the unending flood of AI submissions... new items always start out high in the search. It would be an interesting stat to see the rate of new AI assets vs. the rate of new actual photo assets introduced into the collection.

If the content were crappy or unwanted, it simply wouldn't sell and wouldn't stay on the first pages. I do test searches every day, and many files have held their positions for nearly two years now, or even keep rising.

The reality is a lot of ai content is very, very good and much better than what amateurs upload.

Only professionals can compete with good quality ai.

The amateurs are suffering most from the competition.

Whereas if I upload normal photo content, it sells just as well as before ai.

There is of course also a lot of stupid ai, but that doesn't rise in searches. The examples posted here are not from the first pages of a search.

« Reply #90 on: January 14, 2025, 09:36 »
+3

Only professionals can compete with good quality ai.

The amateurs are suffering most from the competition.


Yes, but amateurs are happy to use their stock earnings to buy a lens or a new camera, while professionals have to pay their bills.

So it becomes much easier for amateurs than professionals to compete against AI.

But if professionals can no longer make a living from it, what will happen to stock agencies? Will they only sell AI and snapshots of amateurs?

« Reply #91 on: January 14, 2025, 09:58 »
+3
My thought is that even in the stock industry we are going towards the Cory Doctorow neologism "enshittification"  8)

https://en.wikipedia.org/wiki/Enshittification


« Reply #92 on: January 14, 2025, 10:12 »
+1

Here's another of Sydney from the top of the first page search results.

Since when did Circular Quay vanish?!  ;D ;D ;D

I've posted an actual photograph below where you can see the blatant, embarrassing error in this AI image, and the equally blatant, embarrassing acceptance of it into Adobe Stock's library...





The funny thing is that in the submission requirements for AI, Adobe actually DOES explicitly state NOT to label content with "real place names, real landmarks".

So in this case it would be the fault of the contributor, and there is a very valid reason for removing that asset.

If they had called it "Simulation of Sydney" or "Australian-style landscape", that would be one thing. But if they are calling it "Sydney", which it obviously is not, then yes, that asset can be deleted/removed under the terms of submission they specified.

« Reply #93 on: January 14, 2025, 10:13 »
+1
Regarding the abundance of AI in the search results: I would think that is largely due to the unending flood of AI submissions... new items always start out high in the search. It would be an interesting stat to see the rate of new AI assets vs. the rate of new actual photo assets introduced into the collection.

There is of course also a lot of stupid ai, but that doesn't rise in searches. The examples posted here are not from the first pages of a search.
On the first page when searching "dragon": what is with its legs? What is with its tail? The spikes are a complete mess.

« Reply #94 on: January 14, 2025, 11:37 »
+1
If you search for Mont Saint-Michel, for example, and display the latest photos, you will find Mont Saint-Michel surrounded by mountains, by lavender fields, in the middle of rolling hills... and all of those I checked had the name of the real place in the description. They are labelled as AI, but still... I see them as misleading.

https://stock.adobe.com/fr/images/a-drones-eye-perspective-capturing-mont-saint-michel-and-its-reflection-in-calm-tidal-waters/1172265712

« Reply #95 on: January 14, 2025, 12:41 »
0
Personally, I'd prefer AI to be off by default, and I wish there was a way to turn off AI for social media, news, etc.

Maybe Adobe (and SS?) are hoping that at some point they can use AI to clean up the horrible AI images that completely litter the library. More likely it will be like spam keywords, where a few of the most egregious offenders get a little slap and the rest just sit there polluting everything. How can you expect them to fix AI when they can't even fix images that are labeled with multiple locations (like listing 10 national parks when the image clearly can't show more than one)?

I do wonder about the drift when AI is used to train AI. Careful AI producers look at the images at full resolution and fix all sorts of things; others just post anything that looks good at thumbnail (or worse) scale.

Uncle Pete

« Reply #96 on: January 14, 2025, 12:57 »
+1

Here's another of Sydney from the top of the first page search results.

Since when did Circular Quay vanish?!  ;D ;D ;D

I've posted an actual photograph below where you can see the blatant, embarrassing error in this AI image, and the equally blatant, embarrassing acceptance of it into Adobe Stock's library...



The funny thing is that in the submission requirements for AI, Adobe actually DOES explicitly state NOT to label content with "real place names, real landmarks".

So in this case it would be the fault of the contributor, and there is a very valid reason for removing that asset.

If they had called it "Simulation of Sydney" or "Australian-style landscape", that would be one thing. But if they are calling it "Sydney", which it obviously is not, then yes, that asset can be deleted/removed under the terms of submission they specified.

Not only should the image be removed, the artist should get a warning.

If you can't use "in the style of Picasso" or some other artist, why should "Australian-style landscape" be allowed? No names of real places, no names of real people, no names of artists' styles, and no using a real personal likeness, famous or otherwise, as the basis for any image.

Whether people here agree or disagree about AI as stock or art, I think most of us agree that labeling something AI with real tags, implying it's a real place or person, is supporting fakes and frauds. It's misleading.

My thought is that even in the stock industry we are going towards the Cory Doctorow neologism "enshittification"  8)

https://en.wikipedia.org/wiki/Enshittification


Not going towards, already there.

« Reply #97 on: January 14, 2025, 15:09 »
0
Regarding the abundance of AI in the search results: I would think that is largely due to the unending flood of AI submissions... new items always start out high in the search. It would be an interesting stat to see the rate of new AI assets vs. the rate of new actual photo assets introduced into the collection.

Go back to the previous page; I think Jo Ann has already answered this question.  :)

« Reply #98 on: January 14, 2025, 15:47 »
+2
@Uncle Pete

a difficult matter!

If you can't use the name of anything, how does anyone find this content?

It's different to write "in the style of Miyazaki" or "Australian style landscape", as you already know.

But if you can't write "tomato" because it's not a real tomato, how do you find the AI tomato?  :D

And the same also applies to the name of a city or a country.

Why set these limits?

If anyone wants to create a London with two Big Bens and insert the names "London", "Big Ben" and "Westminster" in the keywords, what's the harm?

It doesn't matter at all, because all the content is labeled as AI, so whoever buys it knows that it is AI content.

Man has always been looking for new lands to explore, new borders to overcome, new discoveries, to see what's on Mars, and to create gluten-free biscuits!

Text-to-image AIs are simply the answer to a need that is innate in human beings: the need for something different, something new.


« Last Edit: January 15, 2025, 05:24 by Injustice for all »

« Reply #99 on: January 15, 2025, 06:12 »
+1

Only professionals can compete with good quality ai.

The amateurs are suffering most from the competition.


Yes, but amateurs are happy to use their stock earnings to buy a lens or a new camera, while professionals have to pay their bills.

So it becomes much easier for amateurs than professionals to compete against AI.

But if professionals can no longer make a living from it, what will happen to stock agencies? Will they only sell AI and snapshots of amateurs?

buying gear and lenses does not make you a good photographer.

the amateurs are the ones who will be squeezed out by ai content.

the pros can always make it work and a lot of the best ai content comes from professional photographers or designers.

some amateurs are now switching to ai. this way they don't have to learn proper photography.

but I doubt that a newbie amateur photographer just starting his uploads of underexposed flowers, pets and overprocessed sunsets has any reasonable chance of getting sales.

in the old days, you could start with low quality and gradually improve your skills and improve your sales.

the only thing left with no photography skills is editorial.

preferably niches and places not yet covered by the pros.

pros also have the option of combining a work-for-hire shoot, for instance marketing material for a restaurant, with a stock shoot: the client pays less and they get material for their ports.

amateurs don't know how to do that.


 
