RSS and U

RSS is a great tool.  With minimal effort, you can keep track of your favorite websites and be notified when they update their content.  You can also see those changes (sometimes a portion of the new content or all of it) without having to go to the website at all.

Seems great.  Websites get a closer relationship with their readers, and users can keep on top of multiple sites with little effort.  Apart from never being able to get anything done at work, it seems like a win-win situation, right?  But there is a problem with RSS that website admins have to keep a close eye on.  Or rather, a problem with RSS readers.

RSS content is retrieved by RSS readers.  These programs poll website RSS feeds to see if new content has been added.  The trouble starts when badly coded RSS readers get into the hands of lots of users.
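A well-written reader polls with HTTP conditional requests, so an unchanged feed costs the server almost nothing.  Here's a minimal sketch in Python of the idea (the cache shape and function names are my own, not any particular reader's):

```python
import urllib.request
import urllib.error

def build_conditional_headers(etag=None, last_modified=None):
    """Build If-None-Match / If-Modified-Since headers from cached values."""
    headers = {}
    if etag:
        headers["If-None-Match"] = etag
    if last_modified:
        headers["If-Modified-Since"] = last_modified
    return headers

def poll_feed(url, cache):
    """Fetch the feed only if it changed; on 304 the server sends no body."""
    req = urllib.request.Request(
        url,
        headers=build_conditional_headers(cache.get("etag"),
                                          cache.get("last_modified")))
    try:
        with urllib.request.urlopen(req) as resp:
            # New content: remember the validators for next time.
            cache["etag"] = resp.headers.get("ETag")
            cache["last_modified"] = resp.headers.get("Last-Modified")
            return resp.read()
    except urllib.error.HTTPError as e:
        if e.code == 304:
            return None  # Not modified -- nothing was downloaded.
        raise
```

If the server supports ETag or Last-Modified, every poll after the first costs a few hundred bytes of headers instead of the whole feed.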

If these readers poll websites too frequently and are used by a large number of users, they essentially become a huge "Distributed Denial of Service" network.  Imagine 20 or 30 thousand people downloading your entire RSS feed every minute, whether they need it or not, 24 hours a day.  Do the math:

20,000 users x 1 KB RSS x 60 minutes x 24 hours = 28,800,000 KB, or roughly 29 gigabytes

That's nearly 29 gigs of traffic in one day.  Okay, that's an extreme example, but you get the point.  I could go on, but there's a great post on Coding Horror all about the subject.
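Running that scenario's numbers (20,000 users fetching a 1 KB feed once a minute, around the clock):

```python
users = 20_000
feed_kb = 1              # size of the RSS feed in kilobytes
polls_per_day = 60 * 24  # once a minute, 24 hours a day

total_kb = users * feed_kb * polls_per_day
total_gb = total_kb / 1_000_000  # KB -> GB (decimal)

print(f"{total_kb:,} KB per day, about {total_gb:.1f} GB")
```

And that's with a tiny 1 KB feed; a feed carrying full article content can easily run 50-100 KB, which multiplies everything above accordingly.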

So, website admins, if you offer RSS feeds, make sure to keep an eye on the bandwidth consumed retrieving them.  That includes checking user agent strings to see if a particular reader is behaving badly.  And if the bandwidth gets to be too much, you can always outsource your syndication to one of the free RSS hosts (who shall remain nameless, as they don't pay me) out there.
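One rough way to spot a misbehaving reader is to total feed bytes per user agent from your web server's access log.  A sketch assuming the common Apache/nginx "combined" log format and a feed at /index.rss (both stand-ins for your own setup):

```python
import re
from collections import Counter

# Combined log format ends: "METHOD /path HTTP/x.y" status bytes "referer" "user-agent"
LINE = re.compile(
    r'"(?:GET|HEAD) (?P<path>\S+) [^"]*" \d{3} (?P<bytes>\d+|-) '
    r'"[^"]*" "(?P<agent>[^"]*)"'
)

def bytes_per_agent(lines, feed_path="/index.rss"):
    """Sum response bytes for the feed, grouped by user agent string."""
    totals = Counter()
    for line in lines:
        m = LINE.search(line)
        if m and m.group("path") == feed_path and m.group("bytes") != "-":
            totals[m.group("agent")] += int(m.group("bytes"))
    return totals
```

Sorting `totals.most_common()` puts the hungriest readers on top; a polite reader that honors conditional GET should show up mostly as tiny 304 responses.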

Programming Post by: McGurk at 08:04 PM | Reply