
> We plan to add account purging under very specific criteria (completely inactive users over x amount of time) in the future.
Definitely don't purge some of them, like ones that have uploaded stuff, etc.
> We plan to add account purging under very specific criteria (completely inactive users over x amount of time) in the future.
Yay!
> Some of those may be lurker accounts, though. You can opt to have your faves hidden.
Yes, but that is something the Admins would know.
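For illustration only, here is a minimal sketch of the kind of purge criterion described above (completely inactive accounts, excluding anyone who has uploaded content). The field names and the cutoff window are hypothetical; staff have not specified a schema or what "x amount of time" would be.

```python
from datetime import datetime, timedelta, timezone

# Hypothetical cutoff; the "x amount of time" above is deliberately unspecified by staff.
INACTIVITY_CUTOFF = timedelta(days=5 * 365)

def is_purgeable(account: dict, now: datetime | None = None) -> bool:
    """True only for completely inactive accounts: no uploads, no journals,
    and no logins within the cutoff window. Field names are illustrative."""
    now = now or datetime.now(timezone.utc)
    has_content = account["upload_count"] > 0 or account["journal_count"] > 0
    is_dormant = (now - account["last_login"]) > INACTIVITY_CUTOFF
    return is_dormant and not has_content
```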
We need a way to weed out bad eggs using proxies to further harass people. I came up with a suggestion to tackle it.
> An established user can set the minimum age of the accounts required to see their activity. It can be treated as if they are blocked.
This is something we want to work to improve, it's just not as high a priority as other things right now.
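As a rough sketch of the suggestion quoted above (not anything FA has committed to building), a minimum-account-age gate could be checked alongside an ordinary block list. The function and field names below are made up for illustration.

```python
from datetime import datetime, timedelta, timezone

def can_see_activity(viewer: dict, owner: dict, now: datetime | None = None) -> bool:
    """Treat accounts younger than the owner's chosen threshold as if they were blocked.
    'viewer' and 'owner' are illustrative records, not FA's actual data model."""
    now = now or datetime.now(timezone.utc)
    if viewer["username"] in owner["block_list"]:
        return False
    min_age = timedelta(days=owner.get("min_viewer_account_age_days", 0))
    return (now - viewer["created_at"]) >= min_age
```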
> Ban human child fetish art
> I submitted a report on what was clearly human child fetish art and it was denied due to "As there is no visible nudity or sexual content per our Upload Policy, this is permitted. Nudity is defined as the open display of genitalia or exposed female areola or nipples, and does not include bare buttocks. Thank you for your report and for being a part of Fur Affinity!"
Think of the children! Wait... Does that even make sense? Ah, whatever! I agree that fetishizing children is bad. I wish people would stop doing that.
> Think of the children! Wait... Does that even make sense? Ah, whatever! I agree that fetishizing children is bad. I wish people would stop doing that.
Quite literally, it's human child fetish art. If you're not going to enforce your website rules fairly, why enforce stuff at all?
> Ban human child fetish art
> I submitted a report on what was clearly human child fetish art and it was denied due to "As there is no visible nudity or sexual content per our Upload Policy, this is permitted. Nudity is defined as the open display of genitalia or exposed female areola or nipples, and does not include bare buttocks. Thank you for your report and for being a part of Fur Affinity!"
We recently revised our protocol for these situations so that we can remove more types of content. No policy change, just tighter restrictions when reviewing reports.
> We recently revised our protocol for these situations so that we can remove more types of content. No policy change, just tighter restrictions when reviewing reports.
In that case, can you re-review my report, please?
> In that case, can you re-review my report, please?
You will have to reopen the ticket by commenting on it. And the content has to be sexual or suggestive. It can't just be a kid in diapers; it has to be a kid in diapers with hyper scat or something similar, for example. That's a very, very lax example.
> You will have to reopen the ticket by commenting on it. And the content has to be sexual or suggestive. It can't just be a kid in diapers; it has to be a kid in diapers with hyper scat or something similar, for example. That's a very, very lax example.
Reopened. Quite frankly, it's extremely obviously fetish content: it looks like fetish content, it's clearly designed to be fetish content, and looking through their gallery, it's blatantly obvious.
> Reopened. Quite frankly, it's extremely obviously fetish content: it looks like fetish content, it's clearly designed to be fetish content, and looking through their gallery, it's blatantly obvious.
I see it. If you report an entire gallery, we will always ask for all submission links that you think break the rules, which is why this submission wasn't removed before. I'll have the moderator review this as normal. Thanks!
> This is something we want to work to improve, it's just not as high a priority as other things right now.
At least it is on that list. Hopefully it will come soon enough to stop my harassment issues.
> DeviantArt just added a flag system to prevent third-party AI datasets from using their users' content to make AI art. Is FA considering implementing something similar?
Come on, they've created their own AI, which you can use for a subscription, right? They just protected themselves so that they wouldn't be pelted with shit. A great plan: create a huge database, wait decades for technological progress, and then use it. Brilliant. If they hadn't banned the use of art for AI, they would have dug themselves a huge grave.
> Come on, they've created their own AI, which you can use for a subscription, right? They just protected themselves so that they wouldn't be pelted with shit. A great plan: create a huge database, wait decades for technological progress, and then use it. Brilliant. If they hadn't banned the use of art for AI, they would have dug themselves a huge grave.
Yes, they did, and you can opt out of it; creating your own AI is a sound move in the current situation where AI datasets are going to exist regardless, hence my question. I would love to know if user content on FA is going to get the same kind of protection DA has already implemented.
The idea of a public ban on the use of AI is great, but I have absolutely no idea how it could be implemented, imho.
> Come on, they've created their own AI, which you can use for a subscription, right? They just protected themselves so that they wouldn't be pelted with shit.
Not quite; they're using a fork of Stable Diffusion, already trained on a dataset that includes art scraped from dA. It doesn't matter if the "no AI" flag is set, because the training is already done.
> Reopened. Quite frankly, it's extremely obviously fetish content: it looks like fetish content, it's clearly designed to be fetish content, and looking through their gallery, it's blatantly obvious.
I have a suspicion of who you might be referring to and, if it is them, it seems like there is a fine line they're stopping just short of.
> DeviantArt just added a flag system to prevent third-party AI datasets from using their users' content to make AI art. Is FA considering implementing something similar?
Real talk: We'd love to, but probably don't have the resources. Nevertheless, I'll bring it up for discussion.
> Real talk: We'd love to, but probably don't have the resources. Nevertheless, I'll bring it up for discussion.
As I noted here, there's no evidence the flag itself actually does anything other than say "please don't." Provided dA hasn't completely bungled the attempt at introducing a standard for such a flag (by, say, failing to provide adequate documentation), my guess would be that it's relatively straightforward as features go.
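Purely as a sketch of what "relatively straightforward" could look like (not FA's plan, and the exact directive names are an assumption based on how DeviantArt's opt-out flag has been described), a per-user opt-out could be surfaced as a robots-style directive in both an HTTP header and a meta tag. Honouring it would still be entirely up to the scraper, as noted above.

```python
# Hypothetical helper: expose an artist's AI-scraping opt-out as robots-style
# directives. "noai"/"noimageai" mirror the convention attributed to DeviantArt;
# nothing here forces a scraper to comply, it only states the preference.

def ai_optout_markup(user_opted_out: bool) -> dict:
    if not user_opted_out:
        return {"header": None, "meta_tag": None}
    directives = "noai, noimageai"
    return {
        "header": ("X-Robots-Tag", directives),                      # sent with the page/image response
        "meta_tag": f'<meta name="robots" content="{directives}">',  # embedded in the page HTML
    }
```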