Author Topic: Shutterstock using some kind of software A.I. to review images?  (Read 13044 times)


« on: September 18, 2012, 16:44 »
0
Judging by the last couple of batches I have submitted (images with motion-blur that is clearly supposed to be there and images with shallow depth of field) that have been rejected for "focus" or other suspect reasons, I am kind of thinking that SS is using some sort of software A.I. to review images.

Has anyone else seen this type of review of their images?  What are your thoughts on companies using some type of software to review your images?


« Reply #1 on: September 18, 2012, 16:54 »
0
I have a couple of motion-blur shots of toys, on which I lavished a lot of time and which I think are quite good - and I won't even bother submitting them to the micros.

« Reply #2 on: September 18, 2012, 17:11 »
0
I think they have acknowledged (in their IPO filing?) that they use automated tools.

I've had shots that sell really well elsewhere and which I know are in focus rejected for focus. I suspect that something about those images runs afoul of the automated software they use.

I'd just move along - I doubt they are interested in doing anything about it (most unfortunately).

« Reply #3 on: September 18, 2012, 17:14 »
0
Judging by the last couple of batches I have submitted (images with motion-blur that is clearly supposed to be there and images with shallow depth of field) that have been rejected for "focus" or other suspect reasons, I am kind of thinking that SS is using some sort of software A.I. to review images.

Has anyone else seen this type of review of their images?  What are your thoughts on companies using some type of software to review your images?

It's been talked about for years. Judging by the number of forum posts on both this and their own forum, SS is aware of what's going on, but are quite happy with the results they are getting. So, nothing you can do, except submit elsewhere.

« Reply #4 on: September 19, 2012, 00:53 »
+1
Here's where Shutterstock talks about reviews in their IPO filing:

Quote
         The content we provide to our users is created by a community of contributors from around the world and is vetted by our specialized team of image and video reviewers. Whether photographers, videographers, illustrators or designers, our community of more than 35,000 approved contributors range from part-time enthusiasts to full-time professionals, and all of them must meet high standards in order to work with Shutterstock.

        In order to become a contributor, an individual must submit an application that includes a portfolio of images or videos. Of more than 375,000 contributor accounts that have been created, less than 40,000 contributors have been approved. Once accepted by Shutterstock's review team, contributors can upload as many images as they would like; however, every submitted image is reviewed and either accepted or rejected by our team to ensure that images in our library meet certain standards of aesthetic and technical quality. Approximately 38 million images have been submitted to our review team by approved contributors and, of those, only 20 million, or approximately 50%, were approved and made available in our marketplace. Each image that is rejected by our review team is tagged with at least one rejection reason that is communicated to the submitting contributor to help him or her to improve and to give insight into our review standards. Such rejection reasons include focus, composition, poor lighting, trademark infringement and limited commercial value. We combine proprietary technology and highly trained content review staff to deliver sophisticated yet efficient image review; we typically process images within 36 hours of upload.

        Contributors are required to associate keywords with each image they submit in order to make their images more easily found using our search algorithms. Keywords usually contain both descriptive terms that literally identify the content of an image (e.g., "padlock") and conceptual terms that describe what an image might convey (e.g., "security"). We have over 650 million contributor generated keywords in our database, an average of approximately 30 keywords per image.

        All images accepted into our collection are added to our website where they are available for search, selection, license and download. Contributors are paid monthly based on how many times their images have been licensed in the previous month. Contributors may choose to remove their images from our library at any time. Due to our large number of contributors, we do not have any material content supply concentration; the content contributed by our five highest-earning contributors was together responsible for less than 4% of downloads in 2011.

        Shutterstock provides different earnings structures for photographs, illustrations and vector art, and for video footage:

http://secfilings.nasdaq.com/edgar_conv_html/2012/08/30/0001047469-12-008610.html#A2210439ZS-1A_HTM_BG47301A_MAIN_TOC

They do refer to software in this sentence: "We combine proprietary technology and highly trained content review staff", but I think they are talking about the proprietary tech the review staff use, not some tech to review the images automatically. The rest of the write-up talks about each image being reviewed by a person.

« Reply #5 on: September 19, 2012, 01:24 »
0
Don't get me wrong, I'm not complaining, only mentioning that I think it is taking place more often than we might think (images being reviewed by a computer rather than an actual person).

I can see both sides of the argument, with the massive number of images that need to be reviewed, how can SS be more efficient, etc.

However, just like anything else, if an image doesn't pass the software review because it has a very shallow depth of field, then the image buyers are the ones who don't get to see it and decide for themselves whether the image has "real" value.

As always, some good images will be rejected and some bad images will be approved. It doesn't matter who, or what, reviews the images; it is what it is.

microstockphoto.co.uk

« Reply #6 on: September 19, 2012, 01:26 »
0
my grandma - who is highly untrained in photography - finds that photos with dull "perfect" lighting and flat DOF are better than more dynamic pictures with proper depth of field and some shadows;

so, if they are not using software, they are probably using their grandma

that's fine: their site, their choice
« Last Edit: September 19, 2012, 01:38 by microstockphoto.co.uk »

« Reply #7 on: September 19, 2012, 01:55 »
0
I've seen numerous threads where people say they have had a rejection, waited a bit and then resubmitted and got the image through. If the process was entirely automated it would always deliver the same result, therefore humans must be making the final decision.
I suspect their software flags stuff like blown highlights or digital noise. The software we use can spot that sort of thing and has been around for more than a decade.
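Checks like these are certainly automatable. The sketch below is purely illustrative (the function name and thresholds are made up, and nothing here reflects any agency's actual tooling); it shows how a first-pass filter could flag blown highlights and crude noise:

```python
import numpy as np

def flag_technical_issues(img, clip_frac=0.02, noise_sigma=8.0):
    """Flag an 8-bit grayscale image (2-D array) for blown highlights
    or heavy noise. Thresholds are illustrative guesses only."""
    img = np.asarray(img, dtype=np.float64)
    flags = []
    # Blown highlights: too large a fraction of pixels at or near pure white.
    if np.mean(img >= 250) > clip_frac:
        flags.append("blown_highlights")
    # Crude noise estimate: spread of differences between horizontal neighbours.
    if np.diff(img, axis=1).std() > noise_sigma:
        flags.append("noise")
    return flags

# A flat mid-grey frame raises no flags; a washed-out white one does.
clean = np.full((100, 100), 128, dtype=np.uint8)
blown = np.full((100, 100), 255, dtype=np.uint8)
print(flag_technical_issues(clean))  # []
print(flag_technical_issues(blown))  # ['blown_highlights']
```

A real pipeline would of course tune such thresholds on large labelled sets rather than hard-code them.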

« Reply #8 on: September 19, 2012, 04:08 »
0
Yesterday four of my pictures were reviewed: two of them were accepted and two declined. All four pictures are the same in resolution, subject and quality. The reason for the declined ones was: Trademark. Not even one has a trademark sign/logo on it.

This experience can mean two things:

1. SS is using some sort of software as a filter before review, and the ''filtered'' images end up in a second ''human'' review. In this case the software decided that two of my pics have trademark logos/brands and rejected them, and the human reviewer accepted the other 2.

2. The same batch of images is reviewed by multiple reviewers, and one person received two pics and another the other two.

I can't find another explanation in my case.

« Reply #9 on: September 19, 2012, 05:38 »
0
Yesterday four of my pictures were reviewed: two of them were accepted and two declined. All four pictures are the same in resolution, subject and quality. The reason for the declined ones was: Trademark. Not even one has a trademark sign/logo on it.

This experience can mean two things:

1. SS is using some sort of software as a filter before review, and the ''filtered'' images end up in a second ''human'' review. In this case the software decided that two of my pics have trademark logos/brands and rejected them, and the human reviewer accepted the other 2.

2. The same batch of images is reviewed by multiple reviewers, and one person received two pics and another the other two.

I can't find another explanation in my case.

Or, the automated filter accepted 2 and the human reviewer rejected 2  ;D

OM

« Reply #10 on: September 19, 2012, 05:45 »
0
I had a strange one in a recent submission. I got an email with most of the batch approved, but one shot had been rejected with no reason given. When I looked at the submission on the site, the 'rejected' image had actually been approved and started selling almost immediately. The only thing I can think of is that in a previous submission I already had 2 similar images, and that maybe a machine had rejected the third shot based on either keywords or optical similarity, but that decision had been overridden by a senior (human) reviewer. I thought the discrepancy a little strange, but as I'm fairly new to SS, I don't know how things are done there.

« Reply #11 on: September 19, 2012, 07:17 »
+4
To be on the safe side, I would suggest wearing a tinfoil helmet whenever you consider uploading to Shutterstock.

ruxpriencdiam

« Reply #12 on: September 19, 2012, 08:36 »
+1
Anthony says he stands by his reviewers and that they are qualified.

Shallow DOF can, does and will get accepted as long as it is done right and sometimes it is best to be safe and add a note with an explanation.

I doubt it is software, because some things are getting accepted that would be rejected if I or anyone else tried to get them approved.

And also as stated by someone else if you resubmit they usually pass without a problem.

So it is Attila that is at work.


   

And I won't be editing in 2 minutes either!

« Reply #13 on: September 19, 2012, 09:13 »
0
It's inevitable that software is going to be used to 'review' [cough] images, and used increasingly. Human reviewers are a big cost item and an obvious target for the new generation bean-counters now flooding into these agencies as they go public, are sold, or take on new investors.   And it's also inevitable that this software is often going to fail and misfire, in ways that will be frustrating, or hilarious, or occasionally spectacular. 

Like any industry, they want to automate and get rid of as many of  those problematic and expensive employees as possible, and they'll try to do it before the necessary technology is really ready. 

In other words this is only going to get worse.

« Reply #14 on: September 19, 2012, 09:26 »
0
It's inevitable that software is going to be used to 'review' [cough] images, and used increasingly. Human reviewers are a big cost item and an obvious target for the new generation bean-counters now flooding into these agencies as they go public, are sold, or take on new investors.   And it's also inevitable that this software is often going to fail and misfire, in ways that will be frustrating, or hilarious, or occasionally spectacular. 
Like any industry, they want to automate and get rid of as many of  those problematic and expensive employees as possible, and they'll try to do it before the necessary technology is really ready. 

In other words this is only going to get worse.

Generally speaking machines, computers, software, technology, etc are vastly better than humans at performing complex tasks. That's why we have them.

In the unlikely event that your image was rejected by an automated process it will be because it did not meet the set specification. It may be that the specification is wrong ... but the specification would have been defined and set by a human.

RacePhoto

« Reply #15 on: September 19, 2012, 09:45 »
0
I'd agree, it's not "AI bots" doing reviews. You can check things like size and format with a computer program and save the reviewer time, so they don't have to look. I don't believe that SS uses image recognition for reviews. There's nothing but forum rumors and suspicion that claims they do. One would think that by this stage of the game, someone on the inside would have told us the facts? Like yes they do, or no they don't.  ???

As for soft focus, blur or shallow depth of field: I don't bother anymore. Soft in back can pass; anything soft in front = rejection. It's their standards, and anything like that with any artistic exposure or motion blur is more likely to get refused. Same goes for shadows, even useful and intentional shadows! Frustrating.

My favorite part, which answers an often-asked question: "Of more than 375,000 contributor accounts that have been created, less than 40,000 contributors have been approved." What's that? About 10% pass the first review. And of those? 13,065 authors have portfolios with fewer than 50 works, which is 38.86% of the total.

Roughly 30% of the total approved contributors (10,000 people) have over 250 images on SS.

40% have fewer than 50 images, 30% more than 250, and 5% more than 2,000 images (people here are much of that 5%). This can also be read as: 70% of the people who joined and passed the test have fewer than 250 images on SS!  :o
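Those percentages check out as a bit of back-of-envelope arithmetic (all input figures come from the post above; the implied contributor total is an inference, not a published number):

```python
# Approval rate from the IPO figures quoted in this thread.
accounts_created = 375_000
contributors_approved = 40_000
print(f"approval rate: {contributors_approved / accounts_created:.1%}")  # 10.7%

# The 13,065-authors / 38.86% pairing implies the contributor base it was
# measured against.
under_50 = 13_065        # portfolios with fewer than 50 works
share_under_50 = 0.3886  # stated share of all approved contributors
print(f"implied contributor base: {under_50 / share_under_50:,.0f}")  # 33,621
```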


Here's where Shutterstock talks about reviews in their IPO filing:

Quote
         The content we provide to our users is created by a community of contributors from around the world and is vetted by our specialized team of image and video reviewers. Whether photographers, videographers, illustrators or designers, our community of more than 35,000 approved contributors range from part-time enthusiasts to full-time professionals, and all of them must meet high standards in order to work with Shutterstock.

        In order to become a contributor, an individual must submit an application that includes a portfolio of images or videos. Of more than 375,000 contributor accounts that have been created, less than 40,000 contributors have been approved. Once accepted by Shutterstock's review team, contributors can upload as many images as they would like; however, every submitted image is reviewed and either accepted or rejected by our team to ensure that images in our library meet certain standards of aesthetic and technical quality. Approximately 38 million images have been submitted to our review team by approved contributors and, of those, only 20 million, or approximately 50%, were approved and made available in our marketplace. Each image that is rejected by our review team is tagged with at least one rejection reason that is communicated to the submitting contributor to help him or her to improve and to give insight into our review standards. Such rejection reasons include focus, composition, poor lighting, trademark infringement and limited commercial value. We combine proprietary technology and highly trained content review staff to deliver sophisticated yet efficient image review; we typically process images within 36 hours of upload.

        Contributors are required to associate keywords with each image they submit in order to make their images more easily found using our search algorithms. Keywords usually contain both descriptive terms that literally identify the content of an image (e.g., "padlock") and conceptual terms that describe what an image might convey (e.g., "security"). We have over 650 million contributor generated keywords in our database, an average of approximately 30 keywords per image.

        All images accepted into our collection are added to our website where they are available for search, selection, license and download. Contributors are paid monthly based on how many times their images have been licensed in the previous month. Contributors may choose to remove their images from our library at any time. Due to our large number of contributors, we do not have any material content supply concentration; the content contributed by our five highest-earning contributors was together responsible for less than 4% of downloads in 2011.

        Shutterstock provides different earnings structures for photographs, illustrations and vector art, and for video footage:

http://secfilings.nasdaq.com/edgar_conv_html/2012/08/30/0001047469-12-008610.html#A2210439ZS-1A_HTM_BG47301A_MAIN_TOC

They do refer to software in this sentence: "We combine proprietary technology and highly trained content review staff", but I think they are talking about the proprietary tech the review staff use, not some tech to review the images automatically. The rest of the write-up talks about each image being reviewed by a person.

« Reply #16 on: September 19, 2012, 10:10 »
+2
Judging by the last couple of batches I have submitted (images with motion-blur that is clearly supposed to be there and images with shallow depth of field) that have been rejected for "focus" or other suspect reasons, I am kind of thinking that SS is using some sort of software A.I. to review images.

Post the examples instead of dreaming up conspiracies.


« Reply #17 on: September 19, 2012, 10:42 »
0
Here's an example that SS rejected twice - second time I downsized and I did note it was a resubmit. You can see the same image at DT here (as IS's zoom is busted)

I've had a lot of beach (tropical, colorful, shots that sell elsewhere) shots rejected by SS. It's possible that they treat beaches as an oversupplied subject and thus are picky, but the ones they do accept sell well there too. I really have no idea what they do or how they do it, but I do get focus rejections for images that are clearly in focus and mostly on the beach shots. I've also had "poor framing" rejections from them from images that sell well elsewhere - so perhaps buyers love poor framing?

I just move on, but it does irk me that the rejection reason is rubbish - they're entitled to say they don't want it but I wish they had a formal appeals process (and don't suggest the forum mosh pit, it's not the same thing).

« Reply #18 on: September 19, 2012, 11:40 »
0
.... It's possible that they treat beaches as an oversupplied subject and thus are picky, but the ones they do accept sell well there too.....

I believe they are much more picky about "oversupplied subject matter" and if even the ones that sell well weren't there, a buyer has plenty more to pick from so no sale lost (for SS).  I know this Ps off a lot of contributors but it makes sound business sense (again for SS).

ruxpriencdiam

« Reply #19 on: September 19, 2012, 12:28 »
0
You have way too much time on your hands!

BTW where did you find this info? Post it on the SS forums for them to actually see how many submitters there
really are!

I'd agree, it's not "AI bots" doing reviews. You can check things like size and format with a computer program and save the reviewer time, so they don't have to look. I don't believe that SS uses image recognition for reviews. There's nothing but forum rumors and suspicion that claims they do. One would think that by this stage of the game, someone on the inside would have told us the facts? Like yes they do, or no they don't.  ???

As for soft focus, blur or shallow depth of field: I don't bother anymore. Soft in back can pass; anything soft in front = rejection. It's their standards, and anything like that with any artistic exposure or motion blur is more likely to get refused. Same goes for shadows, even useful and intentional shadows! Frustrating.

My favorite part, which answers an often-asked question: "Of more than 375,000 contributor accounts that have been created, less than 40,000 contributors have been approved." What's that? About 10% pass the first review. And of those? 13,065 authors have portfolios with fewer than 50 works, which is 38.86% of the total.

Roughly 30% of the total approved contributors (10,000 people) have over 250 images on SS.

40% have fewer than 50 images, 30% more than 250, and 5% more than 2,000 images (people here are much of that 5%). This can also be read as: 70% of the people who joined and passed the test have fewer than 250 images on SS!  :o


Here's where Shutterstock talks about reviews in their IPO filing:

Quote
         The content we provide to our users is created by a community of contributors from around the world and is vetted by our specialized team of image and video reviewers. Whether photographers, videographers, illustrators or designers, our community of more than 35,000 approved contributors range from part-time enthusiasts to full-time professionals, and all of them must meet high standards in order to work with Shutterstock.

        In order to become a contributor, an individual must submit an application that includes a portfolio of images or videos. Of more than 375,000 contributor accounts that have been created, less than 40,000 contributors have been approved. Once accepted by Shutterstock's review team, contributors can upload as many images as they would like; however, every submitted image is reviewed and either accepted or rejected by our team to ensure that images in our library meet certain standards of aesthetic and technical quality. Approximately 38 million images have been submitted to our review team by approved contributors and, of those, only 20 million, or approximately 50%, were approved and made available in our marketplace. Each image that is rejected by our review team is tagged with at least one rejection reason that is communicated to the submitting contributor to help him or her to improve and to give insight into our review standards. Such rejection reasons include focus, composition, poor lighting, trademark infringement and limited commercial value. We combine proprietary technology and highly trained content review staff to deliver sophisticated yet efficient image review; we typically process images within 36 hours of upload.

        Contributors are required to associate keywords with each image they submit in order to make their images more easily found using our search algorithms. Keywords usually contain both descriptive terms that literally identify the content of an image (e.g., "padlock") and conceptual terms that describe what an image might convey (e.g., "security"). We have over 650 million contributor generated keywords in our database, an average of approximately 30 keywords per image.

        All images accepted into our collection are added to our website where they are available for search, selection, license and download. Contributors are paid monthly based on how many times their images have been licensed in the previous month. Contributors may choose to remove their images from our library at any time. Due to our large number of contributors, we do not have any material content supply concentration; the content contributed by our five highest-earning contributors was together responsible for less than 4% of downloads in 2011.

        Shutterstock provides different earnings structures for photographs, illustrations and vector art, and for video footage:

http://secfilings.nasdaq.com/edgar_conv_html/2012/08/30/0001047469-12-008610.html#A2210439ZS-1A_HTM_BG47301A_MAIN_TOC

They do refer to software in this sentence: "We combine proprietary technology and highly trained content review staff", but I think they are talking about the proprietary tech the review staff use, not some tech to review the images automatically. The rest of the write-up talks about each image being reviewed by a person.


RT


« Reply #20 on: September 19, 2012, 13:22 »
0
Bit OT, but how long are people waiting for reviews these days? I haven't uploaded there for a few months but have had some in the queue for 5 days now - is that the current norm?

ruxpriencdiam

« Reply #21 on: September 19, 2012, 13:31 »
0
SS reviews are running up to 10 days right now.

RT


« Reply #22 on: September 19, 2012, 13:32 »
0
^ Thanks

« Reply #23 on: September 19, 2012, 13:34 »
0
I've never known them to take so long at reviews as they do at the moment.

« Reply #24 on: September 19, 2012, 14:11 »
0
They're not using "AI to review images" because AI doesn't exist - it's just a computer science term that's been a speculation,  a dream, a prediction,  and now a marketing buzzword, without ever becoming a reality.   

To cut their (human) reviewing costs these agencies will be investing in software that can reject as a first pass, reducing the number of images that have to be seen by a human.  For example,  it's quite possible today to use software to determine what percentage of an image is in focus.  Not perfectly, of course, but good enough to weed out a lot of photos that look ok at reduced size but actually aren't sufficiently sharp at 100%.  This could save an agency a lot of money. 

Other sorts of screening are possible, and some might not be obvious. For example, the agencies obviously prefer high-key, light-toned images and nice pastels, and they don't want anything that's dark overall - easy enough to filter on this criterion. Fair? No, but when you have images pouring in like they're coming out of a fire hose, why care?

Software can pick out faces, and reject images of people without model releases.   

These companies won't care if the software is  "fair" or  it rejects some "perfectly good images".  It only has to increase profitability, i.e. pay for itself. 
« Last Edit: September 19, 2012, 14:32 by stockastic »
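The "percentage in focus" and "dark overall" screens described in that post are easy to prototype. A minimal sketch, assuming variance-of-Laplacian as the sharpness proxy (a standard choice, but the names and any thresholds a real system would use are unknown):

```python
import numpy as np

def sharpness_score(img):
    """Variance of a discrete Laplacian, a common proxy for focus.
    Low values suggest a soft image. Note this is a crude first pass:
    it cannot tell intentional motion blur from a missed focus."""
    img = np.asarray(img, dtype=np.float64)
    # 4-neighbour Laplacian over the interior pixels.
    lap = (-4 * img[1:-1, 1:-1]
           + img[:-2, 1:-1] + img[2:, 1:-1]
           + img[1:-1, :-2] + img[1:-1, 2:])
    return lap.var()

def mean_brightness(img):
    """Average luminance; an agency preferring high-key images could
    simply threshold on this."""
    return float(np.asarray(img, dtype=np.float64).mean())

# A checkerboard is full of hard edges; a flat frame has none.
yy, xx = np.mgrid[0:64, 0:64]
sharp = ((xx + yy) % 2 * 255).astype(np.uint8)
flat = np.full((64, 64), 128, dtype=np.uint8)
print(sharpness_score(sharp) > sharpness_score(flat))  # True
print(mean_brightness(flat))  # 128.0
```

Which is exactly why such a filter misfires on legitimate shallow-DOF or motion-blur work: the metric only measures edge energy, not intent.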

« Reply #25 on: September 19, 2012, 14:21 »
0
They could really speed things up w/o going totally auto by having the program recommend likely rejection reasons (if it finds technical problems); then, if the reviewer agrees, they just click one button and move on to the next. Of course a lazy reviewer will just do that, but a good reviewer could easily override the program's decision. They could also have some sort of collection value: a mix of the number of images with similar keywords and the number of images purchased using those keywords. That way, if an image is so-so and of little value to the collection (there are already a heap of similar images and it isn't in very high demand) it gets rejected, but if it is in a subject without a lot of images relative to the demand, it gets accepted.

Of course there will be plenty of images that get rejected or accepted that should be the opposite. Especially if they are my images that get rejected.
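The "collection value" idea above could be sketched as a demand/supply ratio over an image's keywords. Everything below (the function name, the counts) is hypothetical, just to make the proposal concrete:

```python
def collection_value(keywords, supply, demand):
    """Average demand/supply ratio across an image's keywords.
    `supply` maps keyword -> images already carrying it;
    `demand` maps keyword -> licences sold against it."""
    ratios = [demand.get(k, 0) / max(supply.get(k, 0), 1) for k in keywords]
    return sum(ratios) / len(ratios) if ratios else 0.0

# Made-up library statistics for two subjects.
supply = {"beach": 500_000, "security": 20_000}
demand = {"beach": 50_000, "security": 40_000}

# An oversupplied subject scores low; an undersupplied, in-demand one high.
print(collection_value(["beach"], supply, demand))     # 0.1
print(collection_value(["security"], supply, demand))  # 2.0
```

A reviewer tool could surface this score next to the technical flags, leaving the accept/reject click to the human.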

Poncke

« Reply #26 on: September 19, 2012, 14:40 »
0
The OP needs to post his photo.

My rejections are for composition and uneven lighting, never focus - something that can't be handled by software.

Alloy hats is what most stock togs need to wear.


« Reply #27 on: September 19, 2012, 15:36 »
0
It would seem a lot more fair if SS had an official way to resubmit, with explanation or comments.  Like "this image was rejected for focus but I feel the use of DOF enhances the composition."   If they really do have software looking at focus, they'll know perfectly well that there are some bad rejections of good photos, and might approve them on appeal.

Poncke

« Reply #28 on: September 19, 2012, 15:44 »
0
There is an official way to resubmit. It's in their instructions: you need to fix the photo and upload it again, with a note to the reviewer giving the number of the rejected photo and the fixes you applied.

« Reply #29 on: September 19, 2012, 15:52 »
0
There is an official way to resubmit. It's in their instructions: you need to fix the photo and upload it again, with a note to the reviewer giving the number of the rejected photo and the fixes you applied.

Thanks, didn't know about that - guess I'm out of date.

Poncke

« Reply #30 on: September 19, 2012, 15:56 »
0
There is an official way to resubmit. It's in their instructions: you need to fix the photo and upload it again, with a note to the reviewer giving the number of the rejected photo and the fixes you applied.


Thanks, didn't know about that - guess I'm out of date.


http://submit.shutterstock.com/faq.mhtml#How do I re-submit content that was previously approved?

OM

« Reply #31 on: September 19, 2012, 17:18 »
0
OT but FT is presumably using optical software to place its 'Infinity' series images on search pages.

http://en.fotolia.com/search?k=aids&filters[content_type%3Aall]=1&submit.x=0&submit.y=0

Search subject is 'aids' and on that page you'll find 3 'Infinity' images that have absolutely nothing to do with 'aids'. They're images of a round European table with seats. AIDS is not in the keywords either (unless 'assistance' counts). However, I reckon the table with seats around it looks to a machine like some models of an AIDS virus particle. That's the only explanation I can think of anyway.

« Reply #32 on: September 19, 2012, 17:59 »
0
OT but FT is presumably using optical software to place its 'Infinity' series images on search pages.

http://en.fotolia.com/search?k=aids&filters[content_type%3Aall]=1&submit.x=0&submit.y=0

Search subject is 'aids' and on that page you'll find 3 'Infinity' images that have absolutely nothing to do with 'aids'. They're images of a round European table with seats. AIDS is not in the keywords either (unless 'assistance' counts). However, I reckon the table with seats around it looks to a machine like some models of an AIDS virus particle. That's the only explanation I can think of anyway.


What on earth is 'optical software' when it's at home? FT just have a painfully crude search algorithm in which every keyword used by any image (however irrelevant) is apparently given equal weight in search results, except in the case of 'Infinity' images, which obviously have a major boost. It appears that FT's 'translation' device is equally crude, which explains the weird 'results' from images submitted by non-English-speaking contributors.

OM

« Reply #33 on: September 19, 2012, 19:19 »
0
What I meant by that was some sort of 'optical pattern' recognition that relates totally unrelated and unrequested images to one another as in 'suggested' alternatives for eg image of hamburger (although the search was for 'bread'). Or maybe it is a keyword thing in the previous example whereby the software relates 'aids' to 'assistance'. Mebbee they're the same in Spanish (contributor from Spain). I dunno.  :-\
« Last Edit: September 19, 2012, 19:23 by OM »

« Reply #34 on: September 19, 2012, 19:34 »
0
What I meant by that was some sort of 'optical pattern' recognition that relates totally unrelated and unrequested images to one another as in 'suggested' alternatives for eg image of hamburger (although the search was for 'bread'). Or maybe it is a keyword thing in the previous example whereby the software relates 'aids' to 'assistance'. Mebbee they're the same in Spanish (contributor from Spain). I dunno.  :-\

Wow. I'm amazed that you managed to construct such a complex conspiracy theory to 'explain' what is self-evidently merely technological inadequacy.

« Reply #35 on: September 20, 2012, 00:01 »
0
Yesterday four of my pictures were reviewed: two of them were accepted and two declined. All four pictures are the same in resolution, subject and quality. The reason for the declined ones was: Trademark. Not even one has a trademark sign/logo on it.

This experience can mean two things:

1. SS is using some sort of software as a filter before review, and the ''filtered'' images end up in a second ''human'' review. In this case the software decided that two of my pics have trademark logos/brands and rejected them, and the human reviewer accepted the other 2.

2. The same batch of images is reviewed by multiple reviewers, and one person received two pics and another the other two.

I can't find another explanation in my case.

Or, the automated filter accepted 2 and the human reviewer rejected 2  ;D

I don't believe SS is letting a soft to do all the job. OK, the software is filtering the images ( sharpness, resolution, colors etc.) but the final review is still made by people.

The reason is very simple .... how will a software judge the concept of a photography??? .... when all agencies are telling us  to came with new concepts.

« Reply #36 on: September 20, 2012, 02:55 »
0
Just yanking your chain with a 3rd option  ;)

I'd be fairly confident that someone's eyes are involved in acceptances.


« Reply #37 on: September 20, 2012, 05:06 »
0
While we're on conspiracy theories: is it Atilla inspecting from the clocktower, or some faceless men on the grassy knoll, rejecting our potential best sellers?

« Reply #38 on: September 20, 2012, 06:35 »
0
So, no post of examples by the OP?

OM

« Reply #39 on: September 20, 2012, 08:47 »
0
My intention was not to imply any form of conspiracy but merely to highlight the technological inadequacy of whatever algorithm they're using.

« Reply #40 on: September 20, 2012, 09:28 »
0
I'm sure the "related image" algorithm just uses keywords and if we see something that's really, really unrelated it's most likely caused by the same spam that results in searches pulling up stuff with no relationship to the search term.

« Reply #41 on: September 20, 2012, 10:55 »
0
My intention was not to imply any form of conspiracy but merely to highlight the technological inadequacy of whatever algorithm they're using.

Exactly.  Basically, these new, unproven and imperfect software screening technologies will cost us (contributors) money while reducing costs at the agencies.

WarrenPrice

« Reply #42 on: September 20, 2012, 11:05 »
0
My intention was not to imply any form of conspiracy but merely to highlight the technological inadequacy of whatever algorithm they're using.

Exactly.  Basically, these new, unproven and imperfect software screening technologies will cost us (contributors) money while reducing costs at the agencies.

Maybe ... but doesn't that contradict the argument that costing the contributor money is costing the agency money?
 :P

suwanneeredhead

  • O.I.D. Sufferer (Obsessive Illustration Disorder)
« Reply #43 on: September 20, 2012, 14:23 »
0
My intention was not to imply any form of conspiracy but merely to highlight the technological inadequacy of whatever algorithm they're using.

Exactly.  Basically, these new, unproven and imperfect software screening technologies will cost us (contributors) money while reducing costs at the agencies.

Maybe ... but doesn't that contradict the argument that costing the contributor money is costing the agency money?
 :P

Yes, Warren, and also could you answer me this, why are we conjuring up these wild theories? Because of "reviewer inconsistency" ?  Don't you think that if the images were reviewed by "software" or "artificial intelligence," that the reviews would be 100% consistent? Rock-solid consistency is what I think we'd get with artificial intelligence reviews, because the software would only understand objectivity and would do the same thing every time.  It's the reviewer inconsistency (at ALL stock sites) that PROVES human eyes do this work.

Amazing ... seems to me some here would be better served learning their craft, than bitching and inventing wild conspiracy theories.

WarrenPrice

« Reply #44 on: September 20, 2012, 14:50 »
0
My intention was not to imply any form of conspiracy but merely to highlight the technological inadequacy of whatever algorithm they're using.

Exactly.  Basically, these new, unproven and imperfect software screening technologies will cost us (contributors) money while reducing costs at the agencies.

Maybe ... but doesn't that contradict the argument that costing the contributor money is costing the agency money?
 :P

Yes, Warren, and also could you answer me this, why are we conjuring up these wild theories? Because of "reviewer inconsistency" ?  Don't you think that if the images were reviewed by "software" or "artificial intelligence," that the reviews would be 100% consistent? Rock-solid consistency is what I think we'd get with artificial intelligence reviews, because the software would only understand objectivity and would do the same thing every time.  It's the reviewer inconsistency (at ALL stock sites) that PROVES human eyes do this work.

Amazing ... seems to me some here would be better served learning their craft, than bitching and inventing wild conspiracy theories.

???Why Me???  Or did "YOU" mean the OP?  Or another conjurer?   ??? ;)

Poncke

« Reply #45 on: September 20, 2012, 15:26 »
0
My intention was not to imply any form of conspiracy but merely to highlight the technological inadequacy of whatever algorithm they're using.

Exactly.  Basically, these new, unproven and imperfect software screening technologies will cost us (contributors) money while reducing costs at the agencies.

Maybe ... but doesn't that contradict the argument that costing the contributor money is costing the agency money?
 :P

Yes, Warren, and also could you answer me this, why are we conjuring up these wild theories? Because of "reviewer inconsistency" ?  Don't you think that if the images were reviewed by "software" or "artificial intelligence," that the reviews would be 100% consistent? Rock-solid consistency is what I think we'd get with artificial intelligence reviews, because the software would only understand objectivity and would do the same thing every time.  It's the reviewer inconsistency (at ALL stock sites) that PROVES human eyes do this work.

Amazing ... seems to me some here would be better served learning their craft, than bitching and inventing wild conspiracy theories.

So tell me, how is this AI going to check for composition and uneven lighting and commercial value and white balance and copyright violations?

« Reply #46 on: September 20, 2012, 16:22 »
0
Give it a rest Sean,

I'm not going to post my images for debate because, for one, I don't care that much about these couple of images (stock sites approve and reject many, many images for what appear to be random reasons - I get that and just move on), and for two, I didn't intend to discuss my specific images, only that I got the impression they were using some sort of software to do either part or all of the review, and to ask what the group thinks about the possibility of stock sites using software to review images.

It's not a conspiracy theory, and I'm not complaining. And no, I don't need a foil hat - unless it will stop all the voices in my head that make me despise people who can't have a reasonable discussion about something they may or may not agree on.

The discussion is more about "Do you think stock sites are using software to review images, and what do you think about it if they are?"

That is all I was doing, opening up a dialog or discussion about "Your thought on software reviewing your photos."


« Reply #47 on: September 20, 2012, 16:33 »
0
So tell me, how is this AI going to check for composition and uneven lighting and commercial value and white balance and copyright violations?
Nothing like that is remotely possible today - well, maybe white balance, but only to some extent.  But the idea of a software 'focus' check is interesting.

I just submitted a closeup of some small glass objects - it's as in focus as it can be, but the objects are smooth, rounded, and translucent, with no real straight lines or sharp edges.  I'll bet it gets rejected for 'focus'.
« Last Edit: September 20, 2012, 17:14 by stockastic »
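A first-pass 'focus' screen of the kind being speculated about here could be as simple as measuring edge energy - for instance, the variance of a Laplacian filter response, a common sharpness proxy. This is purely a hypothetical sketch (numpy only; nothing SS has confirmed using), but it shows exactly the failure mode described above: a smooth, edge-free subject scores low even when it is perfectly in focus.

```python
import numpy as np

def laplacian_variance(gray):
    """Variance of the 4-neighbour Laplacian response.

    High values mean lots of strong edges; low values suggest blur -
    or simply a smooth subject with no edges for the filter to find.
    """
    g = gray.astype(np.float64)
    # 4-neighbour Laplacian built from shifted copies (no SciPy needed)
    lap = (-4 * g[1:-1, 1:-1]
           + g[:-2, 1:-1] + g[2:, 1:-1]
           + g[1:-1, :-2] + g[1:-1, 2:])
    return lap.var()

# A sharp checkerboard (many hard edges) vs a smooth linear gradient
# (perfectly "in focus" but edge-free): the naive metric cannot tell
# a soft *subject* from soft *focus*.
checker = np.indices((64, 64)).sum(axis=0) % 2 * 255
gradient = np.tile(np.linspace(0, 255, 64), (64, 1))
print(laplacian_variance(checker) > laplacian_variance(gradient))  # True
```

The gradient image scores essentially zero on this metric despite being tack-sharp, which is one plausible reason glass objects with no straight lines or sharp edges would trip an automated 'focus' rejection.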


suwanneeredhead

  • O.I.D. Sufferer (Obsessive Illustration Disorder)
« Reply #49 on: September 22, 2012, 16:23 »
0
My intention was not to imply any form of conspiracy but merely to highlight the technological inadequacy of whatever algorithm they're using.

Exactly.  Basically, these new, unproven and imperfect software screening technologies will cost us (contributors) money while reducing costs at the agencies.

Maybe ... but doesn't that contradict the argument that costing the contributor money is costing the agency money?
 :P

Yes, Warren, and also could you answer me this, why are we conjuring up these wild theories? Because of "reviewer inconsistency" ?  Don't you think that if the images were reviewed by "software" or "artificial intelligence," that the reviews would be 100% consistent? Rock-solid consistency is what I think we'd get with artificial intelligence reviews, because the software would only understand objectivity and would do the same thing every time.  It's the reviewer inconsistency (at ALL stock sites) that PROVES human eyes do this work.

Amazing ... seems to me some here would be better served learning their craft, than bitching and inventing wild conspiracy theories.

So tell me, how is this AI going to check for composition and uneven lighting and commercial value and white balance and copyright violations?
It'll do what it's programmed to do in each situation, according to what the software design team decides in terms of assigning objectivity to an inherently subjective topic.  Thus the consistency would be 100% -- and I'm sure it'd be dead wrong 99% of the time as well.

Sorry Warren, what I meant was, could SOMEBODY answer my question... you were right!

OM

« Reply #50 on: September 23, 2012, 10:41 »
0
I'm sure the "related image" algorithm just uses keywords and if we see something that's really, really unrelated it's most likely caused by the same spam that results in searches pulling up stuff with no relationship to the search term.

At SS I would agree it's keyword-based, and I generally find SS's 'suggestions' to be very good, but at FT I'm not so sure the 'suggestions' are keyword-based after finding this combination.

I'll say straight off that there are a few keywords in common between the main shot and the eggs-on-plate-with-knife vector (cutting, cutting board, and a few others), but looking at the images as a whole, the two have a few visual things in common: the rectangular background shapes and tones, the two faces and two egg yolks, and the knife in the vector could be recognised as the girl's arm in the main photo.
But whatever algorithms they're using, whether keyword-based or optical pattern recognition, they're making total nonsense of 'suggested' in this case. If you're looking for images of a couple in a kitchen, you probably don't want to be presented with most of the 'suggestions' offered here.
« Last Edit: September 23, 2012, 10:53 by OM »

RacePhoto

« Reply #51 on: September 24, 2012, 11:46 »
0
My intention was not to imply any form of conspiracy but merely to highlight the technological inadequacy of whatever algorithm they're using.

And the other option was: they aren't using any of this technology, so the whole contention of "technological inadequacy" and the forum debate is just a nice what-if scenario?

And for the others who asked: Illuminati, Men in Black, all that secret society stuff that "they" hide from us. That's what AI reviews on SS would be.  ;D

No comment on image searches; go to Google Images and look for similar images - you get some of the strangest things that have similar colors and shapes. And for the examples posted here, I see melons and melons.  ::)

« Reply #52 on: September 25, 2012, 09:48 »
0
Whatever they're doing, it's gotten slower.  I've had some photos sitting in the review queue for 6 days now, waiting for my 'focus' and 'lighting' rejections.

« Reply #53 on: September 26, 2012, 09:29 »
0
Well I got my rejections after a week.  4 of 11 turned down for 'focus' and 'lighting'.

These photos are perfectly in focus, and as good as or better than any I've had approved at SS in the past.  In fact there's no difference between the ones that were accepted and the ones that were rejected; I hadn't even moved the camera between them.

They were all closeups of fairly small objects. The micro inspectors have always had trouble with macro; SS seems to think infinite DOF is achievable, and IS mistakes the actual texture of an object for noise or 'artifacts'.

It does seem likely that they're using software to try to reject on 'focus', and that in these photos the software couldn't find enough of the obvious edges it needs. No doubt it works fine on ordinary subjects, but it isn't sophisticated enough to handle everything.




« Last Edit: September 26, 2012, 09:49 by stockastic »

« Reply #54 on: September 28, 2012, 15:25 »
0
Cool, I want this software in my camera so I can just wave it around and get pretty shots automatically!  :o

« Reply #55 on: September 29, 2012, 19:46 »
0
I don't think they use software. Shutterstock generally doesn't like images with motion blur. They'll take them, but it's harder to get them accepted.

« Reply #56 on: October 01, 2012, 17:49 »
0
I resubmitted, unmodified, one of the 4 that had been rejected for 'focus'.  In my note to the reviewer, I simply said I'd looked at it closely and believed the focus was good.   

It was approved.

So I'm not crazy, I do know what 'focus' is, and I'm becoming convinced that a first-pass screening for 'focus' is being done - by a piece of software not sophisticated enough to handle all sorts of subjects. 



Poncke

« Reply #57 on: October 02, 2012, 01:49 »
0
Two people, two different sets of eyes. One reviewer didn't like the focus, the other did. Your test doesn't prove anything, imo.

OM

« Reply #58 on: October 03, 2012, 17:08 »
0
My intention was not to imply any form of conspiracy but merely to highlight the technological inadequacy of whatever algorithm they're using.

And the other option was, they aren't using any of this technology so the whole contention of "technological inadequacy" and the forum debate is just a nice what if scenario?


They have to be using some technology; after all, there's not some geezer/geezeress sitting there behind a machine deciding which 'suggested' images to show you when you view an image.

« Reply #59 on: October 03, 2012, 18:48 »
0
They have to be using some  technology; after all there's not some geezer/geezeress sitting there behind a machine deciding which 'suggested' to show you when you view an image.

Huh? Any chance that can be translated into English?

« Reply #60 on: October 03, 2012, 18:59 »
0
So I'm not crazy, I do know what 'focus' is, and I'm becoming convinced that a first-pass screening for 'focus' is being done - by a piece of software not sophisticated enough to handle all sorts of subjects.


It's not software; it's just a very busy reviewer who is paid just a few cents (6c in 2007) for reviewing your image, and who is therefore going to make a very, very quick decision on the technical merits of your work.

It happens to me occasionally, and I always know it's a 'borderline' image when I submit it. It's not that the image is out of focus; the problem is that not enough of it is in focus for it to be a useful stock image (in the rapid opinion of said reviewer). This is often a problem with macro images, of which I shoot a lot. Just shrink the image down to, say, 3K x 2K pixels and it will almost always be approved.
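The "shrink it down" advice works because downsampling averages away pixel-level softness and noise, so the image looks cleaner when the reviewer views it at 100%. A minimal illustrative sketch (numpy only, assuming a grayscale array rather than any particular file format) of 2x2 block-average downscaling:

```python
import numpy as np

def downscale2x(gray):
    """Halve both dimensions by averaging each 2x2 block.

    Averaging four pixels suppresses per-pixel noise (roughly halving
    its standard deviation), which is why a downsized image often
    looks sharper and cleaner when inspected at 100%.
    """
    h, w = gray.shape
    g = gray[:h - h % 2, :w - w % 2].astype(np.float64)
    return (g[0::2, 0::2] + g[1::2, 0::2]
            + g[0::2, 1::2] + g[1::2, 1::2]) / 4.0

# A flat grey field with synthetic sensor noise: after downscaling,
# the noise level (std) drops noticeably while the mean is preserved.
noisy = np.full((100, 100), 128.0) + np.random.default_rng(0).normal(0, 10, (100, 100))
small = downscale2x(noisy)
print(noisy.std(), small.std())  # noise std roughly halves
```

In practice you'd use a proper resampling filter (e.g. Lanczos in an image editor) rather than plain block averaging, but the noise-suppression effect is the same.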

« Reply #61 on: October 03, 2012, 19:09 »
0
Guys.  This photo is a collection of small objects laid out flat on a light table.  Camera mounted about 2 feet above, leveled with a bubble level.  Small aperture, carefully focused ( in-focus indicator was lit).  Entire subject in one narrow plane at the same distance from the camera.  Nikon ED lens.   At 100%, it's sharp corner-to-corner.  Not one molecule of the subject is out of focus - nor could it be, given these parameters.   

You'd have to be blind.

« Last Edit: October 03, 2012, 19:39 by stockastic »

« Reply #62 on: October 03, 2012, 19:23 »
0
This photo was a collection of small objects laid out on a light table.  Camera mounted about 2 feet above, leveled with a bubble level.  Small aperture, carefully focused ( in-focus indicator was lit).  Entire subject in one narrow plane at the same distance from the camera.  Nikon ED lens.   At 100%, it's sharp corner-to-corner.  Not one molecule of the subject is out of focus - nor could it be, given these parameters.   

You'd have to be blind.

Post a link to the full-size image so we can see. I've had over 5K images approved at SS, with at least a 98% AR, and I consider their reviews pretty consistent. Not 'perfect' obviously as they are using humans and it is a subjective process.

ShadySue

« Reply #63 on: October 03, 2012, 19:40 »
0
Computational image aesthetic evaluation is actually a fairly hot area of academic research right now.  There are quite a few valuable applications for this beyond making a microstock reviewer's life easier.  For example, helping an amateur pick out their "best" smartphone photos from a vacation.

A number of high-level image features can be detected and evaluated together to predict aesthetic quality including foreground sharpness, background simplicity, color and luminosity contrast, depth of field, leading lines, symmetry, rule of thirds subject placement, pattern and texture detail, etc.

The current state of the art does not replace a human expert and perhaps never will, but automated aesthetic evaluation is definitely advancing rapidly and should be appearing in consumer applications, search engines and social networks in the very near future. 

Oh, goody. So all our photos will be homogenous. Whoopee.
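For what it's worth, the features listed in the quoted post really can be computed cheaply. Here is a toy sketch (illustrative only - not any agency's actual scoring, and both metrics are deliberately naive) of two of them: global luminosity contrast and rule-of-thirds subject placement via a brightness-weighted centroid.

```python
import numpy as np

def luminosity_contrast(gray):
    """RMS contrast: std of luminance, normalised to the 0-1 range."""
    g = gray.astype(np.float64) / 255.0
    return g.std()

def thirds_score(gray):
    """How close the brightness-weighted centroid sits to the nearest
    rule-of-thirds intersection (1.0 = exactly on an intersection)."""
    g = gray.astype(np.float64)
    total = g.sum() or 1.0
    ys, xs = np.indices(g.shape)
    cy, cx = (ys * g).sum() / total, (xs * g).sum() / total
    h, w = g.shape
    points = [(h * a, w * b) for a in (1/3, 2/3) for b in (1/3, 2/3)]
    d = min(np.hypot(cy - py, cx - px) for py, px in points)
    return 1.0 - d / np.hypot(h, w)

# A bright subject placed on the lower-right thirds intersection
# of an otherwise dark frame scores near the maximum.
img = np.zeros((90, 90))
img[58:62, 58:62] = 255
print(round(thirds_score(img), 2))  # 0.99
```

Real aesthetic models combine dozens of such hand-crafted features (or learn them from data), but even these two show how "composition" can be reduced to something a machine will score with perfect, and perfectly literal-minded, consistency.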

« Reply #64 on: October 09, 2012, 17:49 »
0
Sorry, had to revive this thread. 

Earlier I ranted about a bunch of recent "focus" rejections which were totally bogus.   I just submitted all of them to DT, and all were accepted.   For me, that settles it.   
« Last Edit: October 09, 2012, 17:55 by stockastic »

Poncke

« Reply #65 on: October 09, 2012, 17:59 »
0
It settled what? That DT has different reviewers than SS? I think you are on to something indeed.

« Reply #66 on: October 09, 2012, 18:16 »
+1
I've seen so many tales of rejected images that get accepted the 2nd time around.  Feed the same data into a program and you will always get the same result, so it's highly unlikely that software is making the decisions.


RacePhoto

« Reply #67 on: October 10, 2012, 10:30 »
0
I've seen so many tales of the rejected images that get accepted 2nd time around.  Feed the same data into a program and you will always get the same result so highly unlikely that software is making the decisions.


Well, maybe the software has an RNG and it's set to be harder to pass review on weekends? (just a bit of humor)

Yes, I agree it's humans; that's the only way we would get photos from the same shoot accepted and rejected, with random reasons why, depending on who looks at them. A flawed computer program would be consistently wrong.  :D

I file it under "This is MicroStock" the wild playground of change, bizarre agency behavior and the unexplained. It's the Twilight Zone of photography marketing. It's the Forrest Gump of business practices... You have entered!


Inconsistent is the rule

« Reply #68 on: July 17, 2018, 18:38 »
+2
***** OLD THREAD ALERT!!!!  ***** charliegnomes is on a roll with restarting threads!


 
