Author Topic: Be careful if you're using Topaz AI (Rejections for "AI Modified")


« Reply #25 on: January 08, 2025, 09:51 »
Those scaling programs you recommended, Ashampoo Zoom and Zoom Pro, also have AI.
That is correct, but the training data used in Zoom #2 Pro and Ashampoo Zoom #2 consists only of self-made images.

How do I know that?
How do I best express this... I developed the Zoom #2 programme, so I can say with certainty that no data with third-party copyright was used.


« Reply #26 on: January 08, 2025, 10:43 »
So why wouldn't Topaz AI also have legally licensed the photos for its AI? The company is big and has a million users. A lot of people are making money from these AI tools, but iStock is still not clear about them.

I will be happy to test these Ashampoo programs; they are also interesting.

« Reply #27 on: January 08, 2025, 11:23 »
So why wouldn't Topaz AI also have legally licensed the photos for its AI? The company is big and has a million users. A lot of people are making money from these AI tools, but iStock is still not clear about them.

I will be happy to test these Ashampoo programs; they are also interesting.
The difference is in the way the AIs are used, but of course you can't see that from the outside.

Self-trained generative AI models require vast numbers of images to achieve good results with such a system; by that I mean several hundred million images as a lower limit.

The AI system used in Zoom #2 Pro is a non-generative system, which takes its information exclusively from the current image being scaled (plus training images that can be specified additionally). This means that no large training volumes are required, but also that no new details are generated into the image.
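For illustration, here is a minimal sketch of the non-generative idea using nothing more than classical interpolation (Pillow's Lanczos resampling). It is not the actual Zoom #2 algorithm, only an example of scaling that works purely from the existing pixels:

```python
# Minimal sketch: non-generative upscaling that uses only the pixels of the
# input image itself (classical Lanczos resampling via Pillow).
# This is NOT the actual Zoom #2 algorithm, just an illustration of the
# difference to a generative model: nothing is "invented", the output is a
# deterministic function of the existing pixels.
from PIL import Image

def upscale_non_generative(src_path: str, dst_path: str, factor: int = 2) -> None:
    img = Image.open(src_path)
    w, h = img.size
    # Lanczos interpolation only re-weights neighbouring source pixels;
    # no training data and no learned prior is involved.
    img.resize((w * factor, h * factor), Image.LANCZOS).save(dst_path)

upscale_non_generative("photo.jpg", "photo_2x.jpg")
```

Because the output is a deterministic function of the input pixels, nothing new is "invented" - which is exactly the distinction drawn above.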

Unfortunately, the topic is very complex and you need a lot of basic knowledge if you want to make well-founded statements here.

I did not mean to claim that the various providers acquired their training images illegally, but you should at least remain sceptical.
The large image datasets of 4-6 billion images that can be licensed from the relevant companies cost sums in the two- to three-digit millions - only the really big players can afford such datasets...

Actually, I just wanted to say: Remain sceptical - but I probably didn't manage that very well ;-)


« Reply #28 on: January 08, 2025, 12:32 »
Yes, I know it's worth being skeptical. Okay, but what will it change if, years later, it turns out that a well-known company that makes AI tools for image scaling and enhancement was illegally using a photo database?

Would we authors also be complicit, along with the stock agencies? After all, we would no longer remove the photos or videos improved with AI tools from our portfolios, which sell to customers, etc. By purchasing the AI utility program we should be released from liability - that's how I understand it. What is your opinion?

« Reply #29 on: January 08, 2025, 17:33 »
Yes, I know it's worth being skeptical. Okay, but what will it change if, years later, it turns out that a well-known company that makes AI tools for image scaling and enhancement was illegally using a photo database?

Would we authors also be complicit, along with the stock agencies? After all, we would no longer remove the photos or videos improved with AI tools from our portfolios, which sell to customers, etc. By purchasing the AI utility program we should be released from liability - that's how I understand it. What is your opinion?
It is currently very difficult or even impossible to answer this question. On the one hand, there are currently numerous lawsuits in court around the world concerning the use of images for AI training, some of which are being brought by artists, photographers, graphic designers, etc., because the big companies simply download the entire Internet using web crawlers and use it as training data - but the problem has already been discussed extensively here.

Then there is the problem of different regions: the EU, for example, has much stricter regulations than the USA. Does a company from the USA have to comply with the stricter rules for an EU customer of its software or not? Good question...

And what happens if the court cases (which I don't expect) really end with the companies being ordered to use only copyright-free images for training and to prove it? Then basically all images processed with AI tools that were trained on unauthorised images/videos would not be legally sound in the first place. In that case the software customer would have to claim the corresponding costs from the software provider who used the unauthorised data - what a mess that would be, even if it were the right way :)

And to avoid such problems with my software, I don't use such tools, even if you can get some of them for free on the net and integrate them into commercial products.
Just because you can do it and it's possible doesn't mean you should do it :)

« Reply #30 on: January 09, 2025, 04:22 »

And to avoid such problems with my software, I don't use such tools, even if you can get some of them for free on the net and integrate them into commercial products.
Just because you can do it and it's possible doesn't mean you should do it :)


So you wouldn't even trust RAW-file denoising programs like DxO PhotoLab or Pure Raw because of their AI tools? They are the market leaders, and they boast that they have databases of billions of photos.

« Reply #31 on: January 09, 2025, 05:34 »
So you wouldn't even trust RAW-file denoising programs like DxO PhotoLab or Pure Raw because of their AI tools? They are the market leaders, and they boast that they have databases of billions of photos.
If you ask me privately, I generally only believe what I can verify - but as a mathematician, that's my own personal quirk ^^ Basically, I see every generative AI (trained with billions of images) as a potential legal problem in the future, but I'm also very cautious.

The results of these generative AI algorithms are impressive and certainly usable in some cases (I would not denoise a night sky with stars using a generative AI, though - afterwards I'd have some new stars in the sky and, with luck, a few extra comets ;-)

At the moment, everything is unclear from a legal point of view (for example, it is not at all clear where Sora got all the videos for their training) - committing to something here is currently on shaky ground.

« Reply #32 on: January 09, 2025, 06:14 »
Yes, but can photos edited with DxO PhotoLab or Pure Raw be checked with any tools to see whether the AI tool leaves traces? The same goes for Topaz AI.

Now Topaz AI boasts that Gigapixel will be on iOS soon. Top brands such as NASA, Google and Tesla are also reportedly using this company's AI tools, which somehow makes Topaz AI seem more and more credible.

Yes, but Sora creates images from scratch based on other images, so the risk there is greater. I'm more interested in the safety of photos and videos improved with AI tools.

« Reply #33 on: January 09, 2025, 08:18 »
Yes, but can photos edited with DxO PhotoLab or Pure Raw be checked with any tools to see whether the AI tool leaves traces? The same goes for Topaz AI.
You can recognise from the image itself (with a certain degree of certainty) whether it was created by a generative AI or not. There are algorithms that search for typical artefacts in the image that result from the use of generative AI. This is therefore technically possible even without looking at Exif data or embedded watermarks - only the agency itself knows whether it actively uses this.
The trace is therefore the image itself, or rather the specific arrangement of certain structures.
And yes, in order to recognise this from the pixels, an AI is used that is trained to distinguish between a photo and an AI image - we have probably all experienced this before, where a real photo was rejected because it was supposedly made with AI. In those cases the recognition AI was simply wrong :)
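As a rough illustration of what such a recognition AI looks like in code, here is a minimal sketch of a binary photo-vs-AI classifier built on a pretrained backbone. The model choice, class labels and weights are purely illustrative assumptions, not any agency's actual detector:

```python
# Minimal sketch of the general idea described above: a binary classifier that
# is trained to distinguish real photos from AI-generated/AI-modified images.
# This is NOT any agency's actual detector; model, data and label order are
# illustrative assumptions only.
import torch
import torch.nn as nn
from torchvision import models, transforms
from PIL import Image

# Pretrained backbone, re-headed for two classes: "photo" vs "ai".
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, 2)
model.eval()  # assume fine-tuned weights would be loaded here in practice

preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize([0.485, 0.456, 0.406], [0.229, 0.224, 0.225]),
])

def ai_probability(path: str) -> float:
    """Return the model's estimated probability that the image is AI-made."""
    x = preprocess(Image.open(path).convert("RGB")).unsqueeze(0)
    with torch.no_grad():
        probs = torch.softmax(model(x), dim=1)
    return probs[0, 1].item()  # index 1 = assumed "ai" class

print(ai_probability("upload.jpg"))
```

Such a classifier only ever outputs a probability, which is exactly why false rejections of real photos can happen.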

Yes, but Sora creates images from scratch based on other images, so the risk there is greater. I'm more interested in the safety of photos and videos improved with AI tools.
Yes, that's right and something very similar happens when you denoise with generative AI.
Let me try to describe the process (very simplified) for denoising:
-> the process takes a 32x32-pixel block with image noise from the original, generates a block that is as similar as possible but without the noise, and inserts it there; then it continues with the next block in the original image, etc.

The resulting image is therefore completely newly generated and may well contain deviations from the original (see the earlier example of additional stars in the sky) - regardless of whether we denoise, sharpen or scale with these tools, the technology is always exactly the same, only the training data differs.
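A minimal sketch of that block-by-block process might look like the following, with a placeholder standing in for the generative model (which is of course the hard part):

```python
# Minimal sketch of the block-by-block process described above. The function
# generate_clean_block stands in for a generative model (hypothetical here);
# the rest shows that the output is assembled from newly generated 32x32
# tiles rather than taken from the original pixels.
import numpy as np

BLOCK = 32

def generate_clean_block(noisy_block: np.ndarray) -> np.ndarray:
    # Placeholder for the generative model: it would synthesise a block that
    # looks as similar as possible to the input but without the noise.
    # Here we just return a copy so the sketch runs.
    return noisy_block.copy()

def denoise_generative(noisy: np.ndarray) -> np.ndarray:
    h, w = noisy.shape[:2]
    out = np.zeros_like(noisy)          # the result is a completely new image
    for y in range(0, h, BLOCK):
        for x in range(0, w, BLOCK):
            tile = noisy[y:y + BLOCK, x:x + BLOCK]
            out[y:y + BLOCK, x:x + BLOCK] = generate_clean_block(tile)
    return out

# Example: a random "noisy" grayscale image
print(denoise_generative(np.random.rand(128, 128)).shape)
```

Real tools typically overlap and blend the tiles to avoid visible seams, but the principle is the same: every output pixel is newly generated.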

« Reply #34 on: January 09, 2025, 12:57 »
When it comes to AI images that are created from scratch in generators, there are certainly algorithms that detect them.

https://sightengine.com/detect-ai-generated-images

And I doubt that our photos or videos corrected with AI tools will be detected by any algorithm. Just add some noise to your photos or videos in a graphics or video program - no chance.
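For what it's worth, adding grain of the kind described is only a few lines with NumPy and Pillow; whether this actually hides AI processing from any detector is the poster's claim, not something verified here:

```python
# Minimal sketch of what is described above: adding a little Gaussian grain
# to an image with NumPy/Pillow. Whether this defeats any detection
# algorithm is the poster's claim, not a verified fact.
import numpy as np
from PIL import Image

def add_grain(src_path: str, dst_path: str, sigma: float = 6.0) -> None:
    img = np.asarray(Image.open(src_path).convert("RGB"), dtype=np.float32)
    noise = np.random.normal(0.0, sigma, img.shape)   # zero-mean Gaussian noise
    noisy = np.clip(img + noise, 0, 255).astype(np.uint8)
    Image.fromarray(noisy).save(dst_path)

add_grain("photo.jpg", "photo_grain.jpg")
```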

As for brands like Topaz AI or DxO PhotoLab, I doubt they want to expose themselves to damaging their creators or to any lawsuits. Creators have paid, so they use these AI tools to make their creative work easier and better. What would be the point of getting into trouble over some free AI tools?

« Reply #35 on: January 14, 2025, 08:40 »
Not so sure about that. Getty has already warned multiple times about using AI in submissions and has given plenty of advice about what can and cannot be done. I would not be surprised if they close accounts that violate their policies, no matter how many files are in their portfolios or how successful they are...

But next-generation cameras will have AI integrated into their software.
What happens then?

AI will be everywhere.  The agencies are fighting a losing battle.  I'm sure iStock and others are already flooded with AI-modified videos and images without them even knowing it.

« Reply #36 on: January 14, 2025, 08:59 »
What about graphic animators who buy textures or matcaps for their graphics and renders? How do they know whether those textures or matcaps were created in AI generators? Or are they taking a risk with respect to Getty?

« Reply #37 on: Yesterday at 15:15 »
Resubmitted one of the files without the "Topaz AI" suffix in the title, and it has been accepted.


 
