
@@@ OFFICIAL SUGGESTION THREAD @@@

uncia2000

Member
yak said:
it would be very helpful to see the date the picture was made, and for that matter, the date it was submitted.
otherwise you find yourself in situations where you comment on mistakes in someone's work only to find out later that it was made like 13 years ago and the author already got over them. and you really feel stupid all of a sudden... i know some of you have signatures, but not all.
Yes, seconded, as an optional item: would also improve on our currently displayed (c)2006 text!
Was on my wishlist, too, for when I eventually type that up...

In addition, it would be useful if the actual submission date appeared under "Submission information:".
 

nrr

Member
Ronald Reagan said:
If I cannot watch myself, why is it even an option?
That, Mr. Reagan, is just the way the code happens to be written. Nobody ever thought of having an "inactive" state for the link to watch someone and to write in the logic to control it, so you have an active +Watch link on your own userpage when you look at it as yourself. QED.

Ronald Reagan said:
Also, under "Type of Artist" in Settings, there's no "None" or "Not an artist" option.
There really needs to be a "Curmudgeon" option there in addition to those.

Ronald Reagan said:
Additionally, why do the artists have to provide thumbnails? Honestly, it takes two lines of PHP code to satisfy all filetypes.
That's a mystery! Oh, no, wait. It isn't.

Jheryn doesn't know how to write properly formed PHP; he ended up reimplementing a good chunk of the standard PHP function set because he didn't know those functions already existed. Because of that same gap, he simply never wrote the code to take care of it, and from that we got this "artist-submitted thumbnail" kluge.
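For reference, generating a thumbnail server-side really isn't much work. Here's a rough, untested sketch using the stock GD extension (the file names and JPEG quality are arbitrary, not anything FA actually uses):

<?php
// rough sketch: build a thumbnail bounded to 150x150 from an uploaded image
function make_thumbnail($src, $dst, $max = 150)
{
    list($w, $h, $type) = getimagesize($src);
    switch ($type) {
        case IMAGETYPE_JPEG: $img = imagecreatefromjpeg($src); break;
        case IMAGETYPE_PNG:  $img = imagecreatefrompng($src);  break;
        case IMAGETYPE_GIF:  $img = imagecreatefromgif($src);  break;
        default: return false; // unsupported file type
    }
    $scale = min($max / $w, $max / $h, 1); // never upscale
    $tw = max(1, (int)($w * $scale));
    $th = max(1, (int)($h * $scale));
    $thumb = imagecreatetruecolor($tw, $th);
    imagecopyresampled($thumb, $img, 0, 0, 0, 0, $tw, $th, $w, $h);
    imagejpeg($thumb, $dst, 85); // write the thumbnail out as a JPEG
    imagedestroy($img);
    imagedestroy($thumb);
    return true;
}

make_thumbnail('upload.png', 'upload_thumb.jpg'); // placeholder file names

Something like that, run once at submission time, would at least make the artist-supplied thumbnail optional rather than mandatory.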

Hope this helps!
 

Torin_Darkflight

Active Member
Here's a possibly dumb suggestion. How about a light-weight, low-bandwidth scheme or setting that can be universally applied to all of FA to make it more friendly for slower connections (Such as dialup)? It would be a setting that can be enabled or disabled in the account preferences.

Here are some theoretical examples of what would happen with this setting enabled:

-Smaller thumbnails (Current limit is 150x150; perhaps the low-bandwidth limit could be 100x100)
-No animated thumbnails (Only the first frame would be shown)
-No animated avatars (Again, only the first frame would be shown)
-Fewer or smaller ornamental/non-functional graphics (Such as the FA title banner)
-No extra JavaScript or other client-side scripts/tasks beyond what is absolutely required for normal functionality

There is one more potential function this could perform, but I imagine it would be too difficult or too tedious to implement (But it's still worth suggesting). When the low-bandwidth setting is enabled, there would be a set limit on the number of avatars that will be displayed on a single page (For example, 20). If the number of avatars on the page exceeds this limit, the extras will not be displayed, leaving only the first 20 (Or whatever the limit is) with the rest just showing text usernames. This would greatly speed up the load times on pages with a lot of comments and replies and whatnot from other viewers.

Once these initial slowdown problems are solved, an overall low-bandwidth setting like this would further help the site load faster on dialup and other slow connections (I'm not the only one here still using dialup, am I?).

And as long as I am here, there's one more quick little suggestion. As it is right now, there is no easy direct way to navigate from a user's gallery to their front user page. For example, let's say I'm looking at "FireWolf's" art gallery. Now, if I want to go to their front user page, currently the only way to do this is to open one of the pictures in their gallery, then click on their avatar. A direct link available on the gallery page would make this easier.

That's it. Sorry if I annoyed anyone with my rambling.
 

vashdragon

Member
Scanned through and didn't see it posted, so i've got a small suggestion.

If there is one thing i liked about y-gallery, it's that you can make polls. People can actually vote on your little question. I honestly loved this feature and i think it would actually be rather cool to see it on FA.

And any other suggestions i may have are just, you know, better tags, better search options, and the ability to organize your own gallery into subfolders. The same things everyone else is asking for.
 

Foxiekins

New Member
Dunno if it's been suggested before, but when browsing someone's gallery, it would be handy to have a link to the images on either side of it... That way, sequential images could be easily viewed in order, without having to attempt going back to the gallery page, which currently seems to give the system fits...

That's when I get 99 percent of my errors, when trying to go back...
 

vashdragon

Member
I don't remember if i posted this before. I've looked around and can't find it, so i guess i will post it now. If it was deleted for some reason then my apologies.

Anyways, i noticed that while searching the ToS, FAQ, and Submission Guidelines, i can find no regulations on thumbnails.

I honestly believe that regulations should be added for this.

Basically, imagine someone drawing a general-audience picture and marking it as general, but upon uploading it, they give it a very adult thumbnail. How would you handle that? Technically it's not against the ToS, but such an act is practically a direct way to get around the mature content filter. Basically it can be used to piss people off. Without regulations, technically they can do this, can't they?

At any rate, i think regulations of some kind should be added to the ToS that cover the content of thumbnails.
 

revil

New Member
In user settings, for type of artist: I think "3d modeller" should be spelled "3d modeler". When I checked the dictionary, that's what it read.
 

yak

Site Developer
Administrator
after clicking on the 'full view' there's no way to get back to the thumbnail view except by pressing the 'back' button in the browser.

and it would be nice if the viewing process consisted of 3 stages - thumbnail view, screen-size view and full view (the screen size could be retrieved by JavaScript). because people often submit really huge pictures, and to simply view a close-up you have to download the entire pic and then scroll around it.

and a little PHP image processing - copying a small 'plus' image onto the lower left corner and a small 'full view' image onto the lower right corner of the thumbnail, then making it an 'image map' - will eliminate the need for the 'full view' link and add a bit of 'style' too. i do not believe i saw this suggested anywhere.
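something like this, i imagine (just an untested sketch with GD; the icon file names are made up):

<?php
// untested sketch: stamp a 'plus' icon into the lower left corner and a
// 'full view' icon into the lower right corner of an existing thumbnail
$thumb = imagecreatefromjpeg('thumb.jpg');          // made-up file names
$plus  = imagecreatefrompng('icon_plus.png');
$full  = imagecreatefrompng('icon_fullview.png');

$tw = imagesx($thumb); $th = imagesy($thumb);
$pw = imagesx($plus);  $ph = imagesy($plus);
$fw = imagesx($full);  $fh = imagesy($full);

imagecopy($thumb, $plus, 0, $th - $ph, 0, 0, $pw, $ph);          // lower left corner
imagecopy($thumb, $full, $tw - $fw, $th - $fh, 0, 0, $fw, $fh);  // lower right corner
imagejpeg($thumb, 'thumb_overlay.jpg', 85);

the thumbnail then gets a client-side image map so the two corners link to the screen-size view and the full view.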
 

Planeswalker

New Member
Browsing species'

Add 'any' or similar to the species browsing options for each species with sub-categories (e.g. felines). Only asking this because the current 'general' category seems to be more accurately 'unknown/unspecified' than the species in general (all the available sub-species plus the unspecified ones).

Asking this simply because I'd love to be able to browse felines/any (as an example) instead of browsing all the different feline sub-categories separately.
 

Vorotaev

New Member
Don't know if this has been suggested yet; apologies if it has.

It would be nice to put a notice on the settings page that you need Javascript enabled. By default, I disable scripting to prevent malicious sites from trying to abuse it and to prevent annoying pop-up windows. When I tried to use the buttons on the settings page, nothing happened, leading to a great deal of confusion.

I managed to guess that it was based on Javascript, but not everyone will realize that, so it might be a nice thing to mention.
 

Torin_Darkflight

Active Member
Forgive me if this has already been mentioned, but I still see several complaints from members about not being able to stay logged in, a majority of which stem from the same problem: the debacle of http://furaffinity.net vs http://www.furaffinity.net.

So, perhaps something should be added that automatically redirects visitors to a single URL. For example, if I try to access http://www.furaffinity.net, it'll automatically redirect me to http://furaffinity.net. By doing something like this, a vast majority of the "I can't stay logged in!" problems resulting from the different URLs will disappear.
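From what I understand, the redirect itself could be as simple as a few lines at the top of every page. This is just a rough guess at how it might look in PHP, not necessarily how it would actually be done on FA:

<?php
// rough guess: bounce www.furaffinity.net requests over to furaffinity.net
if (strtolower($_SERVER['HTTP_HOST']) == 'www.furaffinity.net') {
    header('HTTP/1.1 301 Moved Permanently');
    header('Location: http://furaffinity.net' . $_SERVER['REQUEST_URI']);
    exit;
}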
 

nrr

Member
Torin_Darkflight said:
For example, if I try to access http://www.furaffinity.net, it'll automatically redirect me to http://furaffinity.net. By doing something like this, a vast majority of the "I can't stay logged in!" problems resulting from the different URLs will disappear.
That solution wastes bandwidth (and computing resources in general). It also happens to be the reverse of what usually happens, but I won't touch upon that. If you insist on using a redirect, you'll end up sending one HTTP response just to tell the client to redirect, and when the client follows the redirect, you'll send a second HTTP response with the real data. Did I mention that this is wasteful?

The correct way to fix this, near as I can tell, is to set the cookie domain to be .furaffinity.net instead of www.furaffinity.net. Really, this should be a matter of running find . -iname "*.php" | xargs perl -i -pe 's/www.f/.f/g' or similar (most likely more intelligent!) against a branch of the source tree, testing it to make sure nothing broke, and shoving it live.
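In PHP terms, the change amounts to roughly this (a sketch only; I obviously don't know what FA's session bootstrap really looks like, and the cookie name below is invented):

<?php
// sketch: scope the session cookie to the parent domain so both hostnames
// share the same login session
session_set_cookie_params(0, '/', '.furaffinity.net');
session_start();

// the same idea applies to any cookie set by hand ('fa_login' is invented)
setcookie('fa_login', 'some-token', 0, '/', '.furaffinity.net');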

When FA can feasibly afford to be sending 10% more across the wire, I'll be the first one to back using redirects. However, in the meantime, it's just cheaper to throw programmer time at the problem.
 

Planeswalker

New Member
Disagreement with the redirect :p

@nrr

Sending about 100 bytes to each client that writes the hostname "wrong" at the start of a session is hardly a great loss when, in return, proxies and browsers can serve (some of) the pages properly from cache. As it stands, the same static content (thumbnails, images, etc.) is most likely fetched twice, once for each host, whenever a page is revisited under the other hostname. So in such cases we save (rather than waste) bandwidth and resources by redirecting.

Hmh.. I think I said that in a bit too confusing manner.. oh well, let's hope people understand -.-;

Anyways, to sum it up.
No redirect = caching for two hosts on the client side; two fetches for (some of) the same content IF both hosts are accessed.
Redirect = small overhead ONLY when the "wrong" hostname is used; only one host cached.

Non-caching clients will re-fetch the content anyway, even for a single host, wasting bandwidth on their side and on the server's side whether we redirect or not, so we can leave them out of this consideration (I think they're extremely rare, anyway).
 

nrr

Member
Re: Disagreement with the redirect :p

Planeswalker said:
Sending about 100 bytes to each client that writes the hostname "wrong" at the start of a session is hardly a great loss when, in return, proxies and browsers can serve (some of) the pages properly from cache. As it stands, the same static content (thumbnails, images, etc.) is most likely fetched twice, once for each host, whenever a page is revisited under the other hostname. So in such cases we save (rather than waste) bandwidth and resources by redirecting.
Emphasis mine. It's a good point that I feel needs discussing. :)

Yes, it is true that there are two distinct hosts serving static content, and that is just a little bit bigger a bandwidth waster than an HTTP redirect. However, the easiest and, perhaps, most sane way of fixing that issue is just to set up an alias record in DNS to point a name like static.furaffinity.net to someplace responsible for handling static content and evade the two distinct hosts problem entirely. In addition, this also future proofs things just a little bit because, then, a simple load balancer, should the need arise, can just drop in place to take care of whatever needs to be handled statically. If you want details on this, ask; I'm intentionally leaving them out as to narrow the focus on this post.

Now, as far as a cache goes, you have to be careful. PHP is passive with regard to caches, so if you want your dynamic pages to be cached, you have to write code to take advantage of the If-Modified-Since header field in an HTTP request and mangle the output appropriately. The static content is easy because it's, well, static. Even if an output filter for user-uploaded images is applied, you can take the file's mtime and determine whether or not the cache's copy has expired entirely based off of that. With dynamic pages, it's a little trickier. You have to store the mtime of the data used to generate the pages in a place that's easily accessible (read: allowing as little overhead as possible) and tell the cache about it.
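To illustrate what "taking advantage of If-Modified-Since" looks like on the PHP side, here's a sketch; the hard part, fetching $last_changed cheaply, is glossed over with a placeholder file:

<?php
// sketch: answer conditional requests for a dynamic page with a 304
// $last_changed should be the mtime of the data the page is built from;
// reading it from a file here is just a placeholder for a cheap lookup
$last_changed = filemtime('page_data.cache');

if (isset($_SERVER['HTTP_IF_MODIFIED_SINCE']) &&
    strtotime($_SERVER['HTTP_IF_MODIFIED_SINCE']) >= $last_changed) {
    header('HTTP/1.1 304 Not Modified');
    exit; // the cache's copy is still good, send nothing else
}

header('Last-Modified: ' . gmdate('D, d M Y H:i:s', $last_changed) . ' GMT');
// ...generate and send the page as usual...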

Planeswalker said:
Non-caching clients will re-fetch the content anyways, even for single host, wasting bandwidth on their and on the servers side no matter if we redirect or not, so we leave them out of this consideration (I think they're extremely rare, anyways).
I've learned to treat browser clients as non-caching clients; their caching algorithms are generally dumb. Moreover, a browser's cache behavior usually isn't able to be as finely tuned as an actual dedicated caching proxy.
 

Tiarhlu

AKA Tack
I haven't gone through all 13 pages to see if this was mentioned, but can we have the submissions in our message center viewable more than 15 at a time? It makes browsing and deleting so much easier. Right now I get things faster than I can look at them, and I'm overloaded with pictures.

Also, when submitting something, if we select Other for species, can we have an option to write something in? I don't expect a category for every single thing, and I think being able to write it in when we don't have the option normally would be great.
 

yak

Site Developer
Administrator
However, the easiest and, perhaps, most sane way of fixing that issue is just to set up an alias record in DNS to point a name like static.furaffinity.net to someplace responsible for handling static content and evade the two distinct hosts problem entirely. In addition, this also future proofs things just a little bit because, then, a simple load balancer, should the need arise, can just drop in place to take care of whatever needs to be handled statically.
i could not agree more. it is always great to separate the static content from the dynamic as much as possible - and to use as many headers/settings as possible to make the static content as cacheable as it can be. well, images seem pretty static to me. and it is only a coincidence that my browser caches them properly (wow! now that is something you don't see every day!), because FA is not sending any cache headers with them (i checked).
probably a good idea would be making a subdomain for the images, stories, music etc. - one for each submission type. one for status, one for login... they are all static.
in other words, use subdomains instead of subfolders. it may look masochistic, but it gives you a lot more control over caching, error handling, dividing the site into modules etc. - in my opinion.
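for the image-serving script, something as simple as this would already be a start (just a sketch; the path and the one-week lifetime are picked out of thin air):

<?php
// sketch: serve an image with cache headers so browsers/proxies keep it
$file  = '/files/submissions/example.jpg';   // made-up path
$mtime = filemtime($file);
$week  = 604800;                             // 7 days, in seconds

header('Content-Type: image/jpeg');
header('Last-Modified: ' . gmdate('D, d M Y H:i:s', $mtime) . ' GMT');
header('Expires: ' . gmdate('D, d M Y H:i:s', time() + $week) . ' GMT');
header('Cache-Control: public, max-age=' . $week);
readfile($file);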
I've learned to treat browser clients as non-caching clients;
same here. unless you explicitly tell them to cache stuff, they will not, or will do it in some weird, unpredictable, stupid way. maybe some will do it properly, but not all. it is just another problem like cross-browser compatibility, which FA is familiar with (i am talking about the 'description popups' for the recent submissions on the index page, viewable only in IE).

i could suggest rewriting FA from scratch, using the 'new stuff' that has been around for some time. like, *sigh* ajax for a truckload of stuff, ifox for full-context search, smarty for code/template separation, overlib for javascript popups, rethink the database structure, subdomains, add possibilities for future extensions, think _a_lot_ about the site's structure and make it as modularized as possible, thus as scalable as possible. and maybe even code some sort of CMS, since FTP/SSH access is somehow a problem.
i know how hard it is to maintain and update something that wasn't designed to be updated in the first place. you run into a lot of 'oh boy, ANCIENT code. how much reed did i/the author have to smoke to write something like this?'. sometimes it is a lot easier to just 'drop' the old version and think up the new one... it may take a while, but it will be worth it! so if it is possible, it is the preferred option...
if it wasn't for the final 2 months before my graduation from the technical university (IT specialization), i'd be showering you with suggestions and implementations. it is like a programmer's hell out here, to me at least...

and in the meantime, the redirect will work just fine. it is a temporary workaround, but i've learned that there is nothing more perpetual than a temporary solution, so... it is better than nothing.

<VirtualHost localhost:80>
DocumentRoot "W:/SVN/AIM_2_1_4/aim2/docroot"
ServerName aim2local
ServerAlias www.aim2local
ErrorLog logs/aim2local-error_log
CustomLog logs/aim2local-access_log common
</VirtualHost>
isn't the ServerAlias directive usually used to bind several DNS names to one host? or is there something i do not know or understand?
 

Arshes Nei

Masticates in Public
yak said:
i could suggest rewriting FA from scratch, using the 'new stuff' that has been around for some time. like, *sigh* ajax for a truckload of stuff, ifox for full-context search, smarty for code/template separation, overlib for javascript popups, rethink the database structure, subdomains, add possibilities for future extensions, think _a_lot_ about the site's structure and make it as modularized as possible, thus as scalable as possible. and maybe even code some sort of CMS, since FTP/SSH access is somehow a problem.

Ajax is a horrible solution for large database type sites. There isn't much support for it other than people thinking it's a good idea and it's really a fad thing at this point. The latest new thing isn't always the best solution. Many other sites with large databases function fine without using Ajax; it is simply a matter of streamlining and coding things properly. Also, it only works in new browsers anyway, and you're not saving anything or doing bandwidth any favors.
 

Guppy

New Member
Posted this in the wrong thread, so I deleted it and am moving it here.

I suggest that you remove the (376 users online) from the top of the page, and the associated database call.

If the call is being made to the user database every time a page loads, it has to scan the database and check who has been seen within a set amount of time. That's a costly call; I know it's the most painful database call on my site :).

If you really need that count, do it once every 5-10 minutes, save the text to a file and just include the file in the page. When the file timestamp gets too old, replace the file with the new count from a single database call.
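Something along these lines is all I mean (sketch only; the file name, the interval, and the counting function are placeholders, not anything from FA's code):

<?php
// sketch: cache the "users online" count in a flat file for 5 minutes
$cache = 'online_count.txt';                // arbitrary file name
$ttl   = 300;                               // refresh every 5 minutes

if (!file_exists($cache) || time() - filemtime($cache) > $ttl) {
    $count = get_online_user_count();       // hypothetical helper: the one expensive query
    file_put_contents($cache, $count);
} else {
    $count = (int)file_get_contents($cache);
}

echo "($count users online)";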

Mind, if you've already done this, kudos to you for forward thinking :)

I'm currently rewriting my site, and I installed timers around the database calls and around the page generation to find the slow points. By doing so I've noticed that indexing the time someone last hit a page is pricey and should be put on a timer, stored in the session and only done once in a while to cut down on database grind.
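The timers themselves are nothing fancy, roughly like this (a sketch; the query and the threshold are arbitrary, and it assumes the usual DB connection is already open):

<?php
// sketch: wrap a query in microtime() to find the slow spots
$sql     = 'SELECT ...';                    // whichever query you are measuring
$start   = microtime(true);
$result  = mysql_query($sql);
$elapsed = microtime(true) - $start;

if ($elapsed > 0.05) {                      // arbitrary "slow" threshold
    error_log("slow query ({$elapsed}s): $sql");
}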

Hope this info helps your programmers :)
 

yak

Site Developer
Administrator
Ajax is a horrible solution for large database type sites. There isn't much support for it other than people thinking it's a good idea and it's really a fad thing at this point.
well, i have to disagree on just about everything stated here.
first of all i'll say that EVERYTHING can be a horrible solution if implemented improperly or used too heavily - and there are no exceptions.
how can FA benefit from ajax? simple - faving, watching, commenting and deleting. adding it to something else would be an overkill at the moment. there is really no need to reload all the stuff on the page, when you can reload just the '+watch' link to become '-watch', remove the image container or add a text container at the bottom of a table.
so instead of executing all the SELECT queries on the huge DB, which eat _a_lot_ of RAM in the process, you only have to reload a small portion of the page, less than a kilobyte (even in the case of a comment), and execute far fewer queries, mostly the quick INSERT or UPDATE ones (the functional part), which can even be grouped into blocks by the server for faster execution. so it reduces the server load in just about any case it is used, and by a lot.
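for example, the entire server side of an ajax '+watch' click could be as small as this (a sketch; the table and column names are invented, and it assumes the usual DB connection is already open):

<?php
// sketch: the whole response to an ajax '+watch' click - one quick INSERT
// and a tiny reply, instead of rebuilding the entire userpage
session_start();
$watcher = (int)$_SESSION['user_id'];
$watched = (int)$_GET['user'];

mysql_query("INSERT INTO watches (watcher, watched) VALUES ($watcher, $watched)");

header('Content-Type: text/xml');
echo '<result><link>-watch</link></result>'; // the page swaps the link text client-side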

as for support - hey, this is going to be the next big thing to hit the internet since javascript, a revolution in application development - and these are not even my own words. just about every big company is investing in or interested in its development. i don't know how you could have missed the news.
it still has some unresolved issues (like the 'back' button not working), but they are being worked on, and solutions are available (cumbersome or crude, but solutions still).
i am no fan of this technology, i just find it extremely useful, small and fast - and really easy to implement. the new features it offers are worth a bit more time spent in the text editor.

The latest new thing isn't always the best solution.
using the 'new stuff' that has been around for some time to prove it is stable and cross-browser compatible. i thought i added that emphasized part, i really did. why it is missing is beyond me.... what i wanted to say is that the stuff i mentioned has proved to be stable, reliable, fast enough and _really_ useful for web-site development.

Also, it only works in new browsers anyway, and you're not saving anything or doing bandwidth any favors.
not true. it is supported even by very old browsers, 'cause the XMLHttpRequest class has been around with them like since the beginning of time... maybe the implementations you saw used some features of CSS 2.1 to render the received content (yeah, really ugly in opera), but trust me, the stuff it receives is just xml - render it however you want. other points are stated above.

PS: i know what i am talking about, i code PHP for a living.
 

yak

Site Developer
Administrator
If the call is being made to the user database every time a page loads, it has to scan the database and check who has been seen within a set amount of time. That's a costly call; I know it's the most painful database call on my site :).
i can suggest a lifetime of 3-5 sec, to keep the counter's info as 'up-2-date' as possible. this should probably save some tens of SQL queries.

my guess is that the 'cost' of this type of call is mostly on the processor. and processor power is something FA is veeeery abundant in :) 2x Xeons pack a lot of power. so i do not think the burden is very significant here, but it still saves MySQL connections for other users, and that is a good thing.
 

Guppy

New Member
yak said:
i can suggest a lifetime of 3-5 sec, to keep the counter's info as 'up-2-date' as possible. this should probably save some tens of SQL queries.

my guess is that the 'cost' of this type of call is mostly on the processor. and processor power is something FA is veeeery abundant in :) 2x Xeons pack a lot of power. so i do not think the burden is very significant here, but it still saves MySQL connections for other users, and that is a good thing.

3-5 seconds might even be a bit quick; this is a counter that in the long run doesn't really matter, especially when it's as high as it is. Perhaps once every 30 seconds, but yeah, that would free up a database call for every page that's loaded (apart from one every 30 seconds).

The call probably takes .015 seconds to complete, but that adds up when there are 400 people hitting the site at once (400 x .015 is about 6 seconds of database time). Not to mention that the queue of SQL commands stacks up as well, I'm sure.

I hope they consider this :)

Guppy
 

Guppy

New Member
Another hog is probably the message count at the top of every screen. Perhaps those counts ought to be stored in session variables and updated once every 5 minutes or when a message view is loaded.
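Roughly like this (a sketch; the helper function and the 5-minute interval are made up):

<?php
// sketch: keep the message counts in the session and only re-query every
// 5 minutes or when the message center itself is opened
session_start();

$stale = !isset($_SESSION['msg_counts']) ||
         time() - $_SESSION['msg_counts_time'] > 300;

if ($stale) {
    $_SESSION['msg_counts']      = fetch_message_counts($_SESSION['user_id']); // hypothetical helper
    $_SESSION['msg_counts_time'] = time();
}

$counts = $_SESSION['msg_counts'];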
 

yak

Site Developer
Administrator
i agree, use the 'event' type of handling - instead of polling the database for changes every time, store the intermediate data somewhere, use it, and update it only when it has changed. it takes some rethinking of how stuff works and a good deal of code tweaks, but it is a great performance gainer..
let's take the user's gallery for example - it should be regenerated only when a new pic is added or a description is changed. that whole part of the gallery page can be cached. now _that_ would save a lot of SQL requests. and if you do a bit of voodoo around a campfire, you can make the browser cache this part of the page too, but that is prone to a lot of headache..
what i want to say is that even on a dynamic site there are parts that can be considered static for some time, and if that time exceeds one minute, it is worth caching, and probably should be cached. page parts, SQL results, etc.
i've coded several picture galleries in my practice, mostly to organize my own collection of images the way i wanted them, and only one had this 'cache engine' thing (just for practice, cached gallery pages) - and i tell you what - it worked blazingly fast (duh!) compared to my previous stuff - and that is with a database of more than 750k images (i'm a huge art admirer..).

but i think it is a bad idea to store the data in a session in this case. session handling is a weak point of FA's code at the moment (logging out etc.), at least in my view... But what about simple txt files in the user's dir, or even better - serialized arrays and intermediate data? Yeah, as little SQL as possible. That should work.
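i mean something on the order of this (a sketch; the paths are made up and the rebuild step is just a placeholder):

<?php
// sketch: keep intermediate data as a serialized array in the user's dir
// instead of asking MySQL for it on every pageview
$cache = '/files/users/example_user/gallery.cache';   // made-up path

function cache_write($file, $data)
{
    file_put_contents($file, serialize($data));
}

function cache_read($file)
{
    return file_exists($file) ? unserialize(file_get_contents($file)) : false;
}

// regenerate only when a submission is added or a description is changed;
// every other pageview just does cache_read() - no SQL at all
if (($gallery = cache_read($cache)) === false) {
    $gallery = array();            // placeholder: rebuild from the database here
    cache_write($cache, $gallery);
}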

edit: typos
 