Recently, I was searching the web for some information on a subject (I won’t get into details; suffice to say it had to do with my sound card) when a promising link led me to a forum discussion (which I won’t name here for obvious reasons). A user had uploaded an image as an attachment to his message, so I naturally clicked on it, only to be greeted with an error that resembled the following (just a random page I retrieved from Google; nothing specific to Kaspersky): http://forum.kaspersky.com/index.php?&showuser=150963. I had to create an account, which would translate into providing my details, then trying to pass the usually incomprehensible case-sensitive CAPTCHA tests that seem to block humans alongside robots, only to wait for a verification e-mail that would never arrive.
Screw that noise; perhaps another post might help me find a textual answer to the problem. I searched using four terms (SB X-Fi Elite Pro), only to see that SB, X-Fi and Pro were removed from the search because they were too short. Okay, what about “Sound Blaster Elite”? That should yield a few results, right? Well, I had to wait forty-something seconds for my next search, in order to protect against flooding. And then I had a zen moment: “Man, forum software sucks!”
Let’s start with the inability to look at an attached image as an unregistered user. I understand that this is done to prevent hot-linking, but come on, use an HTTP Referer check and be done with it. Then comes the error message. Look at this goldmine:
“Sorry, an error occurred. If you are unsure on how to use a feature, or don’t know why you got this error message, try looking through the help files for more information.
The error returned was:
Sorry, but you do not have permission to use this feature. If you are not logged in, you may do so using the form below if available.”
Is this some kind of joke? The message begins with the statement that “an error has occurred” and urges me to look at the help files. That’s generic cover text for the following lines, which state “the error returned was:”. Returned from where? While I understand the concept of a method that returns a result, I’d hardly say such terminology is common for the average user. I only find out what “went wrong” and what the “error” actually was on the third line.
I understand that IPS (the creators of that particular forum software) wants to implement a generic way of handling errors. I’m also sure that their implementation would make architecture astronauts happy. Not having looked at their code, I’d imagine it somewhat like this: “Our authentication module returns an IResult implementation which has a field ErrorCode (or something) which is then added to the stack of request errors which are then presented to the client. It’s clever because it’s reusable; instead of having separate logic for authorization we’ve implemented it as a part of our error handling!” It’s not that it’s a bad idea necessarily, it’s just that the vast majority of users don’t and won’t care how cleverly you’ve implemented your infrastructure; they just want to see friendly error messages with one-click solutions (if possible).
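And friendly messages aren’t even hard. Here’s a minimal, hypothetical sketch (the codes and wording are mine, not IPS’s) of mapping internal error codes straight to one actionable message each, instead of nesting them inside a generic “an error occurred” wrapper:

```python
# Hypothetical sketch: each internal error code maps to one friendly,
# actionable message; the generic line is only a last resort.

FRIENDLY_MESSAGES = {
    "AUTH_REQUIRED": "Please log in to view attachments.",
    "NO_PERMISSION": "Your account doesn't have access to this feature.",
}

def render_error(code: str) -> str:
    # Fall back to a generic line only when we genuinely don't know better.
    return FRIENDLY_MESSAGES.get(code, "Something went wrong. Please try again.")

print(render_error("AUTH_REQUIRED"))
```

One message, one cause, one action; no “the error returned was” plumbing leaking into the user’s face.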
In that specific case, I’d either show a pop-up on mouseover on the attachment link saying that you have to log in (as YouTube does) or I would hide the links completely, showing only a hint that you have to log in in order to actually see them. Both of these options save you a server round-trip and they send a clearer message.
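As for the hot-link check I suggested earlier, it’s a few lines of code. A minimal sketch (the hostname is a placeholder), assuming the goal is only to stop casual hot-linking:

```python
# Hypothetical sketch of an HTTP Referer check for attachment requests:
# serve the image when the request comes from our own forum pages,
# reject obvious hot-linking without forcing a login.

from typing import Optional
from urllib.parse import urlparse

OUR_HOST = "forum.example.com"  # placeholder for the forum's own hostname

def is_hotlink(referer: Optional[str]) -> bool:
    # An absent Referer header (direct visits, privacy settings) is
    # usually allowed through; only a foreign hostname counts as hot-linking.
    if not referer:
        return False
    return urlparse(referer).hostname != OUR_HOST
```

The Referer header is trivially spoofable, of course, but that’s fine: the point is discouraging casual hot-linking, not defeating determined scrapers.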
It gets worse, though. The signup process I’ve described above appears to be common in almost every forum installation I’ve encountered. Why do I have to enter a CAPTCHA, especially since 1) most CAPTCHAs break easily, and a determined cracker will use effective neural-network implementations with a 95% success rate, and 2) there are far superior methods for detecting an automated signup, such as nonces, honeypot form fields, time-between-requests limits, etc.? This also eradicates the need to activate my account.
Or does it? Proponents of activation claim that, apart from bot control, activation verifies the e-mail address. But you don’t need my e-mail address. There are literally dozens of temporary mail services on the net; you can’t block them all (a forum I knew even blocked Gmail addresses in order to prevent spam; some people are clueless and don’t ask for advice). You may think that your administrative newsletter is witty and important, but I probably won’t care and it will end up in SpamBayes’s jaws anyway. Truth is, unless your business model absolutely requires a valid e-mail address, you probably have no realistic use for it apart from spamming me with useless notifications. If I need to use my e-mail productively with your website, I’ll be sure to type a valid address.
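For the record, two of the alternatives I mentioned, a honeypot field plus a timing check, fit in a dozen lines. A hypothetical sketch (the field name and threshold are my assumptions, not any forum’s actual implementation):

```python
# Hypothetical sketch of CAPTCHA-free bot detection on a signup form.

MIN_FORM_SECONDS = 3.0  # assumption: a human needs a few seconds to sign up

def looks_like_bot(form: dict, seconds_to_submit: float) -> bool:
    # Honeypot: the "website" field is hidden via CSS, so a real user
    # leaves it empty; naive bots fill in every field they find.
    if form.get("website"):
        return True
    # Timing: near-instant submissions are almost certainly automated.
    return seconds_to_submit < MIN_FORM_SECONDS
```

No puzzle for the human to solve, no case-sensitive squiggles, and the bot never even knows it was tested.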
Having cleared that out of the way, search is another gripe I have with most forum software. Unless it’s disabled for anonymous users (dear god!), you are probably limited to words that contain at least 4 letters, can’t use even rudimentary search techniques (e.g. phrase searching) and you usually can’t have dashes or symbols within a search term. The search feature ends up severely limited, unless you are lucky enough to use terms that fit those requirements.
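To see how badly this mangles a real query, here’s a sketch of the naive filtering described above: strip symbols, drop terms under four letters, and watch a perfectly good query evaporate.

```python
# Sketch of the naive term filtering many forum engines apply.

import re

def naive_filter(query: str, min_len: int = 4) -> list:
    # Strip anything that isn't a letter or digit, as many engines do,
    # then drop every term shorter than the minimum length.
    terms = (re.sub(r"[^A-Za-z0-9]", "", t) for t in query.split())
    return [t for t in terms if len(t) >= min_len]

print(naive_filter("SB X-Fi Elite Pro"))  # only ['Elite'] survives
```

“SB” and “Pro” are too short, and “X-Fi” loses its dash and shrinks below the limit too, so my four-term product name collapses into a single generic word.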
That wouldn’t be so unbearable if it weren’t for the layers of protection implemented on top of the search feature. It’s not uncommon to see incomprehensible case-sensitive CAPTCHAs guarding the search function for unregistered users. The sheer inconvenience of this is only matched by the sheer stupidity that’s called “flood protection”.
Now, to be fair, flood protection is an important feature which can greatly reduce server load under spam attacks. If a malicious user sent massive numbers of search requests and the target server required, say, 5 ms per request, in only one second it would execute 200 database requests (unless the results are cached somehow and it’s a cache hit), which can be costly. Suppose the server imposed a limit of no more than a single search request per user per 50 ms. The same user would now only manage to force the server to execute 20 requests per second, effectively slowing down the depletion of resources.
If we were to set a more sensible limit, say 1 second between search requests per user, that user would only get 1 search request per second – the problem? It’s gone! What’s clever about using a limit of 1 or 2 or even 5 seconds is that a) after a point (1 second sounds good), further increasing the waiting time offers diminishing returns, and b) a legitimate user who attempts consecutive searches (perhaps due to the poor results of the terms he used) will probably not notice. He’d need a few seconds at the very least to take in the page that just loaded, change the terms and press Search again.
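And this limit is nearly free to implement. A minimal in-memory sketch (hypothetical; a real forum would keep the timestamps in its session store or cache):

```python
# Sketch of a per-user search rate limit with a 1-second interval.

import time
from typing import Dict, Optional

MIN_INTERVAL = 1.0  # seconds between searches per user, per the argument above
_last_search: Dict[str, float] = {}

def may_search(user_id: str, now: Optional[float] = None) -> bool:
    # Allow the search only if at least MIN_INTERVAL seconds have passed
    # since this user's previous *accepted* search.
    now = time.monotonic() if now is None else now
    last = _last_search.get(user_id)
    if last is not None and now - last < MIN_INTERVAL:
        return False
    _last_search[user_id] = now
    return True
```

A handful of dictionary lookups per request, and the flood problem from the previous paragraph is capped at one query per user per second.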
Now compare these numbers with the ****s that set limits of 20, 40 and even 60 seconds! A common encounter I have with vBulletin installations: I’ll attempt to search for a term, then I’ll be transported to the search page because I also have to enter a CAPTCHA which was obviously not accessible from the search textbox in the sidebar. I’ll get the CAPTCHA wrong a few times because the admin enabled ALL the options that render it incomprehensible (phpBB suffers from this too), only to find out that my terms are invalid. Then, after another successful go against the CAPTCHA monster, I’ll be greeted with an error message informing me that I’ll have to wait 40 more seconds before I can search again. Meanwhile, I’ll probably have sent over 100 requests to the server (taking into account the images plus the creation of the CAPTCHA images), with my interactions probably costing more in resources than simply using a non-CAPTCHA method to fight bots and a saner search limit.
The point is?
Have a look around. Most sites won’t fall into the same traps (or at least, not all at the same time). Complex registration pages that require you to complete 20-30 fields are absent when creating a Twitter account, CAPTCHAs and pervasive time limits are missing from Google’s search, and the hate for unregistered users is almost non-existent (apart from very specific cases).
This is the whole point of Web 2.0, actually: realizing that your web site is not alone in the world; realizing that your users don’t want to be greeted with a boring registration procedure in order to find information and be converted. Forum software appears to have stayed in the past, and it doesn’t get any better when one takes into account the lack of information it provides. I’ve administered quite a few forums in my past, implementing policies that ranged from anarchic to draconian (depending on the requirements), and it was still difficult to maintain proper, on-point discussions. It was far too common for trolls, topic hijackers or even honest users to derail a topic. I might have been a bad admin; however, this is a trend one can clearly see in other forums as well.
Apart from that, forum software comes with a massive number of features that provide only cosmetic value (and which the users will exploit to the maximum), be it massive signatures, post counts, large avatar images, etc., which end up grabbing more space than the message itself!
In the end, I am not sure what a proper replacement for a forum engine might be, and so far it appears to be the prime method of supporting a community. Established social websites do help, but exchanging the whole of your on-site community for a Facebook one might be a mistake (depending on your site’s context). Still, you can minimize the damage by configuring your forum software properly. It may not be as open and accessible as other methods, but you can at least make sensible customization choices (some of which were mentioned above).
Massive PS. I have a bone to pick with a specific company. My latest attempt at creating a community for critics.gr was done using Community Server 2007 from Telligent. The pricing was hilariously expensive, but I did test their free version. I could have gone with a much cheaper solution, but I liked the fact that it was built on .NET technology, even though it did seem to lack features compared to other established solutions (even free ones). Telligent is notorious for not providing any pricing information on their website and for heavily jacking up the price between releases (their excuse is support costs; my explanation is that they’re jerks; read on). To put it simply, according to this page, the least expensive paid version you could buy back in 2009 cost $12,000! At that paltry price you only got 10 blogs & 10 forums, and that’s per-CPU licensing!
No matter how you name your product and sugar-coat its features, it’s still a forum engine (well, at least it was back then; by now I guess it has gone all collab-enabled in order to justify the pricing). Just because it’s called Community Server doesn’t mean it’s actually a server in the same vein as Windows or PostgreSQL. Also keep in mind that this started as a free product, the first ASP.NET forum software ever created.
Oh, and if by any chance you thought that 10 forums were rather limiting and you’d need, say, 51 forums, you’d have to purchase the Enterprise version at “only” $72K per CPU! For comparison, MSSQL Server 2008 R2 Enterprise costs $27K per CPU, about a third of the cost of a goddamn forum software! Apart from the fact that there’s no justification whatsoever for the different pricing tiers (why does a 50-forum license cost so much more than the 10-forum one? It’s not as if Telligent needs to put extra work into it), should you need to run it on a server farm, the costs would skyrocket! With that kind of money, you could probably hire enough people to mod an existing implementation and get much cheaper results soon enough. No matter how enterprisey they want their product to sound, the price increase is only due to their high-profile clients (such as MySpace), and it still remains forum software. The icing on the cake: asking for money that even Microsoft wouldn’t think of charging is insulting to the software business in general.
Please, think of the children; boycott them.