
FA download tool

Hello all,
I've written an automated download tool as a shell script, specifically in Zsh since it has some extra features that really help. It's made for Linux (Mac too, I guess, since you can install zsh on those).

The tool works well in my small-scale tests on my own gallery and faves. For each submission it saves the username, title, keywords, submission ID, submission file link, and description into a folder, and it creates a folder structure for the downloaded user: [user]/[gallery|scraps|favorites]/[submission].
On top of that, it can search the downloaded submissions by keywords, title, author, and description, and the search can be limited to faves or to the gallery/scraps.
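For illustration, the per-user layout described above could be built like this sketch. The example values and the metadata file name/fields are made up here, not taken from the actual script:

```shell
# Hypothetical example values; the real script derives these from the page.
user="exampleuser"
section="gallery"      # one of gallery|scraps|favorites
sub_id="12345678"

# One folder per submission: [user]/[gallery|scraps|favorites]/[submission]
mkdir -p "$user/$section/$sub_id"

# Saved metadata (field names are illustrative)
cat > "$user/$section/$sub_id/info.txt" <<EOF
username: $user
title: Example Title
id: $sub_id
EOF
```

Keeping one plain-text metadata file per submission folder is what makes a later grep/ag-based offline search straightforward.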

The problem is that it is fast, really fast.
It works by fetching the HTML source of a gallery/scraps/faves page and parsing it for the submission IDs on that page. The HTML source of each submission is then saved in a variable and parsed to extract the various pieces of information. It took me a while to code a fast way to isolate the description, but I eventually managed it.
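The ID-extraction step might look roughly like the sketch below. The HTML fragment and the `sid-` pattern are assumptions about the page markup, not the script's actual parsing:

```shell
# Stand-in for a fetched gallery page; the real script would get this with
# something like: page=$(curl -s "https://www.furaffinity.net/gallery/$user/")
page='<figure id="sid-41123456">...</figure><figure id="sid-41123457">...</figure>'

# Pull out the numeric submission IDs (the sid- pattern is an assumption)
ids=$(printf '%s' "$page" | grep -oE 'sid-[0-9]+' | sed 's/sid-//')
echo "$ids"
```

Each extracted ID can then be turned into a submission URL and fetched in a second pass.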
Bottom line: it takes around 0.7 seconds to download a submission, with an average speed of about 400 KB/s (could be more or less; it's calculated by simply timing the download and then checking how much was saved).
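The speed measurement described (time the transfer, then check how much was saved) can be sketched like this; the local file write stands in for the real curl/wget transfer:

```shell
start=$(date +%s)
head -c 200000 /dev/zero > submission.bin   # stand-in for the actual download
end=$(date +%s)

elapsed=$(( end - start ))
[ "$elapsed" -eq 0 ] && elapsed=1           # date +%s has second resolution; avoid dividing by zero

bytes=$(wc -c < submission.bin)
echo "average speed: $(( bytes / elapsed / 1024 )) KB/s"
```

With only second-level resolution this is a rough average, which matches the "could be more or less" caveat above.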

As you can all imagine, what I'm worried about is that such a fast tool generates a lot of traffic and could cause site performance problems, even though I doubt a single process could really do that. In any case, I've decided to limit the tool's speed.
Currently there are two ways I'm going about it:
1) limit the maximum speed of the programs I use to actually download the data (curl and wget) to 100 KB/s
2) add a pause of a few seconds after each downloaded submission, to avoid an almost continuous connection to the server
At the moment I'm using both methods together. It would take a long time to download a gallery, but one could simply leave a laptop downloading overnight. It would be like a user browsing faster than normal.
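Both throttles can be combined in the download loop, as in this sketch. Here download_one is a stub so the example is self-contained; the commented-out curl line shows where --limit-rate would apply, and the URL shape is an assumption:

```shell
# Stub standing in for the real fetch; the real call would be roughly:
#   curl --limit-rate 100k -s -o "$1.html" "https://www.furaffinity.net/view/$1/"
download_one() { echo "fetched $1" >> fetch.log; }

: > fetch.log                # start with an empty log
for id in 101 102 103; do
  download_one "$id"
  sleep 1                    # pause between submissions so the connection isn't continuous
done
```

curl's --limit-rate (and wget's --limit-rate) caps bandwidth within a transfer, while the sleep spaces the transfers out, which covers both methods above.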

What I'd like to ask for here are suggestions for making the tool use less bandwidth, and methods to ensure it cannot cause problems for FA. Since FA provides no information about such things, I'm going on the assumption that its servers prefer short connections separated by pauses. The two methods I've found so far should be enough for that, but I cannot know for sure.
For this reason I will not share the tool's source with anyone, as it could easily be modified to remove the restrictions and go too fast; if too many users ran it concurrently, it could seriously hurt site performance.
If I get confirmation that the slowdowns I've built in are enough, I'll try to create an executable for it, probably in C++ or Python, so that it can't be tampered with.

Thank you all for reading all of that and have a nice day/night! :)

ps: please, please, PLEASE, FA devs: make an API. All of this would be unnecessary if we had an API to use, instead of having to resort to artisanal methods that can potentially ruin the site experience for everyone.
pps: for those curious, the script uses ag (The Silver Searcher) to parse the pages and for the offline search. It's an incredibly useful tool with incredible speed and great functionality.
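The offline search can be reproduced with plain grep if ag isn't installed (ag's flags here are nearly identical); the folder names and metadata format in this sketch are illustrative, not the script's actual layout:

```shell
# Build a tiny downloaded-submission tree to search (illustrative content)
mkdir -p someuser/favorites/555
printf 'title: Sunset Dragon\nkeywords: dragon sunset sky\n' > someuser/favorites/555/info.txt

# List files whose metadata mentions "dragon", limited to the favorites folder
# (with ag this would be roughly: ag -il dragon someuser/favorites)
grep -ril 'dragon' someuser/favorites
```

Limiting the search to faves or gallery/scraps, as described above, falls out of the folder layout for free: you just point the search at the right subdirectory.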
ppps: I made the tool mostly as a coding challenge and to save the faves I like most; it was never meant to download all of FA, even though the sub-function I wrote that handles a single submission could easily be pointed at that.