Show Posts

This section allows you to view all posts made by this member. Note that you can only see posts made in areas you currently have access to.


Messages - dam.everypixel.com

1
Was your experiment done on commercial photos only or commercial VS editorial?

I find editorial content to have more sales when it's very specific.

A photo production studio we are friendly with tested the algorithm on several commercial shoots.

Keywords for half of the images were selected with Everypixel DAM, and for the other half by distribution specialists who have been doing keywording for many years.

At the end of the month, the sales of the images with auto-keywording were higher.

2
so, while it might do ok with simple stock images, on the unique ones, the ones where we need assistance, it fails

We ran an experiment: we compared the sales of photos with automatically generated keywords against photos keyworded by a person. Sales of the photos with automatic keywords were at least 10% higher.

As we found out, this is because people are more likely to look for photos in general terms, rather than by specific names of locations and objects. As a result, the images containing general keywords received more sales than those containing original place names.

3
Thanks for your attention!

The FTP access doesn't work.

"Error -203: miscellaneous error occurred while trying to login to the host"

The FTP access was temporarily suspended because the limit of active users was exceeded. We've raised the limit, so FTP is working again :)

Some inaccurate keywords are - smiling, cheerful, blue, positive emotions, fashion, thinking, posing, backgrounds.

We understand that not every keyword will be relevant for your photography. That's why we have implemented a convenient metadata-editing interface: you can delete an inaccurate keyword in one click.

It seems all the inaccurate keywords are towards the bottom of the list, which seems to suggest the list is sorted based upon a confidence ranking. Might be easier to give the user an ability to filter out keywords that have a low confidence score. Google, Microsoft, Amazon's image recognition software all put a confidence number next to the keywords they produce. Such as Men (99), Casual Clothing (70), chair (12). In this example, anything below a 70 could probably be filtered out. 

A confidence score may be helpful in some situations, but it can also push users to be too strict when filtering keywords, as in your example of removing everything rated below 70. That would reduce the effectiveness of auto-keywording, because a low score doesn't always mean the keyword is unnecessary or doesn't correspond to the image.
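For what it's worth, here is a minimal sketch of the filtering you describe, assuming keywords arrive as (keyword, score) pairs like your Men (99) / chair (12) example; the data and threshold are illustrative and not actual Everypixel DAM output. Keeping low-score keywords in a separate review bucket, rather than deleting them, reflects the point above that a low score doesn't automatically mean a wrong keyword.

```python
# Hypothetical (keyword, score) pairs in the style of the example above;
# these values are illustrative, not Everypixel DAM output.
keywords = [("men", 99), ("casual clothing", 70), ("smiling", 45), ("chair", 12)]

THRESHOLD = 70  # hide, rather than delete, anything below this

kept = [kw for kw, score in keywords if score >= THRESHOLD]
review = [kw for kw, score in keywords if score < THRESHOLD]

print("kept:", kept)      # ['men', 'casual clothing']
print("review:", review)  # ['smiling', 'chair'] - a low score != a wrong keyword
```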

The software should have looked at the keywords within the file instead of ignoring them.

We're working on it ;)

4
We have launched a new version of Everypixel DAM, a service that cuts the time you spend on your microstock distribution routine.

You no longer need to spend several minutes inventing keywords for each file. Try auto-keywording! The Everypixel DAM neural networks will select up to 49 relevant keywords for your photos and videos in seconds. Just upload your shoot to the system and click the "Auto keywording" button to keyword the entire content of the folder automatically. You can also keyword each individual file and quickly edit the results: deleting a keyword takes only one click.

You can download these keywords in CSV format or send them to the photo banks via the Everypixel DAM platform. The free plan includes 3 GB of storage, 50 submits, and 50 auto-keywording requests.
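If you want to process the exported CSV in your own tools, a sketch like the following would work; it assumes a simple filename,keywords column layout with semicolon-separated keywords, which may differ from the actual export format.

```python
# Sketch of reading an exported keyword CSV, assuming a "filename,keywords"
# layout with keywords separated by semicolons; the real export may differ.
import csv

with open("keywords_export.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        filename = row["filename"]
        keywords = [kw.strip() for kw in row["keywords"].split(";") if kw.strip()]
        print(filename, len(keywords), "keywords")
```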

Try it right now!


5
Now, you can attribute files in a few clicks. Literally!

We have updated the Everypixel DAM interface to make content attribution simpler and faster. Just upload your photos or videos to the system and click the "Auto keywording" button to keyword the entire content of the folder automatically. You can also keyword each individual file and quickly edit the results: deleting a keyword takes only one click.

The update affects not only the interface but also the neural network for auto keywording. We've expanded its video recognition capabilities: now it can find from 20 to 49 keywords, depending on the content and duration of the video.

Try it right now: dam.everypixel.com



6
Everypixel DAM is getting easier to use!

The new release of Everypixel DAM adds web uploading. Now you can upload your photos directly through the system's interface. No FTP needed!

Just log in to your account on dam.everypixel.com, click the Add files button, and select the shoots you need to upload to the system. Files must be smaller than 50 MB; supported formats: jpg, jpeg, jpe, jfif, jif, tiff. After uploading, you can auto-keyword your photos and send the shoot to the stock sites.
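If you want to check a shoot against these limits before uploading, a quick local filter could look like this (the folder name is just an example):

```python
# Pre-upload check based on the limits stated above:
# files under 50 MB, formats jpg/jpeg/jpe/jfif/jif/tiff.
from pathlib import Path

ALLOWED = {".jpg", ".jpeg", ".jpe", ".jfif", ".jif", ".tiff"}
MAX_BYTES = 50 * 1024 * 1024  # 50 MB

def uploadable(path: Path) -> bool:
    """True if the file satisfies the web-upload constraints."""
    return path.suffix.lower() in ALLOWED and path.stat().st_size < MAX_BYTES

shoot = Path("my_shoot")  # example folder name
ready = [p for p in shoot.iterdir() if p.is_file() and uploadable(p)]
print(len(ready), "files ready for upload")
```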

We have also completely redesigned the interface for working with files and folders. Now it's much simpler!
Don't use Everypixel DAM yet? Sign up on dam.everypixel.com to try all the functions on the Free plan.

7
Just visual for now.

Just curious, is it technically challenging to combine both?

Because this way it would be much more accurate and since I'm sure the most important words are there anyway, then I may not need to waste time correcting the results.
It's technically possible but not easy :-)
We are working on it. In the future, our AI will probably be able to analyze the metadata and generate keywords based on it. But we can't promise specific dates.
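As a rough illustration of the idea (not our implementation), you could already merge the embedded description with the visual keywords yourself, for example by reading the EXIF ImageDescription field with Pillow:

```python
# Rough illustration only: read the EXIF ImageDescription with Pillow and
# merge its words with visually generated keywords. Not Everypixel DAM code.
from PIL import Image

def description_keywords(path: str) -> set[str]:
    exif = Image.open(path).getexif()
    description = str(exif.get(0x010E, ""))  # 0x010E = ImageDescription tag
    return {w.lower().strip(",.") for w in description.split() if len(w) > 2}

visual_keywords = {"beach", "sand", "vacation"}  # e.g. from auto-keywording
combined = visual_keywords | description_keywords("photo.jpg")
print(sorted(combined))
```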

8
Can your AI read existing metadata, for example description and generate keywords based on it too (+include words from description)? Or is it just visual?

Just visual for now.

9
Call me when it can positively identify Greenbuls.
Other than that, it's easier and quicker to do it myself. But then English is near enough my native language; it would be different if I were trying to keyword in a different language. But even then, I'd need to know enough of the language to be able to delete false suggestions, so not that much of a help.
We are not sure that anyone can really keyword even a couple dozen photos quickly and easily by themselves. On average, a person needs more than 3 minutes to come up with ~50 keywords for one simple photo like a beach landscape or an office scene. Our AI can describe it in seconds.

Now imagine that you need to keyword 50, 100, or 200 photos in one run; at 3 minutes per photo, 200 photos is roughly 10 hours of manual work. It will be hard even for native English speakers.

10
Why do I want free storage? I can understand I want to upload, but after that, I don't need to keep my files on your server? What am I missing?
There are two main use cases.
1. Some contributors (most often studios) prefer to describe their content in advance and upload it to stock sites gradually, according to an internal schedule. Until then, it's more convenient to keep the files directly in the service.
2. Many people keep their data in the cloud to avoid losing it, so we give our users the option to keep their content in the Everypixel DAM storage after it has been uploaded to the stock sites.

I'll assume anyone who has any business or uploads will want the $34.99 plan (why not just say $35? This isn't a Walmart) which is for how long? Expert plan doesn't have a time limit?
All plans are active for a month.

11
I guess if we keep training the computer by fixing what it comes up with eventually it will get better?

Yes, of course :-)
That's how AI works.

12
Interesting idea - I know why everyone would love to automate keyword generation - but it really isn't up to the job. At least my experiments with several of my own stock images showed that, in terms of keywording as well as the "stock photo score", the tool doesn't really know enough about what it's looking at to be useful.

I ran through a couple of kitchen remodel images  and a couple of external shots of places where the important information was where it was in the world, not just what was in it. As these images have all been sold - in one case just shy of one thousand times across several agencies - I have an idea of what the important keywords actually are, primarily using SS's sales info that tells me the keywords often used for purchases.

In no case did the tool identify the place on the exterior shots and in the kitchen remodel cases, it missed all the key what-is-going-on elements and just picked up the filler - Residential Building, Home Interior, Architecture, Wood - Material...

In the case of one image, that was of an island ferry dock, it concluded that there was a Pipeline, a Factory, Construction Industry and Fuel and Power Generation - all totally incorrect. Plus it put the scene in Europe (it was off the west coast of the USA). Adding keywords like Nautical Vessel for boat may work with Getty's CV, but it's worse than useless elsewhere. No user types in these awkward terms and other sites don't translate them into the type of English real people speak.

Oh, and one of my really solid long-term sellers rated a stock photo score of 11.6%.

I think I tried enough different images to give the tool a fair evaluation, but I wouldn't use it.

AI keywording is not a substitute for a human; it's a helper. The final decision about the content description still belongs to the person. Some will agree with the proposed keywords; some will edit a couple of them; some will delete them all.

We admit that the auto-keywords may not be relevant enough for you. That's why our neural network is still being trained. But the overwhelming majority of Everypixel DAM users are satisfied with the AI keywords and only slightly adjust them in some cases. Combined with the ability to upload content to several stock sites at the same time, it's a really noticeable time saving compared to manual tagging.

13
Sounds like a recipe for a lot more poorly keyworded spammed files.

You can estimate the quality of AI-keywording on the demo page: labs.everypixel.com/api/demo

We trained our neural network on a dataset of keywords that are commonly used on stock sites. That's why its results are more relevant than those of similar services.

that is fine if you understand the language you are keywording in and correct accordingly, but if you can do that you probably don't need this tool anyway

Our tool will be helpful for native speakers too. According to our research, people usually pay attention to fewer objects and characteristics than a neural network does. For example, someone may forget to describe the dominant colors or secondary people in the frame. The Everypixel DAM AI will not forget :-)

14
The Everypixel DAM pricing plans have been completely updated. Now our neural network for keyword generation is available to everyone! You can use computer vision to describe photos and videos and then upload them to the stock sites or download the metadata in CSV format.

Are you ready to try AI keywording? Use the Free plan. You will get 3 GB of file storage, 50 submits, and 50 AI-keywording requests. It's totally free!

On the Attribution plan, you will get 30 GB of file storage, 300 submits, 1,000 AI-keywording requests, and the ability to connect two people to your account. It costs $6.99.

On the Professional plan, you will get 200 GB of file storage, 1,800 submits, unlimited AI-keywording requests, and the ability to connect two people to your account. It costs $12.99.

On the Expert plan, you will get 500 GB of file storage, unlimited submits and AI-keywording requests, plus the ability to create two projects and to connect three users to your account. It costs $34.99.

Try AI keywording right now! Just sign up on dam.everypixel.com, it's free :-)

