Yesterday we discussed the scalability issues Twitter is having, and how the same design constraints can impact ColdFusion sites. Today seems like a good time to touch on one little-mentioned way of solving the problem.
When the word “scalability” pops up, everyone automatically thinks of adding more servers. This can be done in several ways, from simply load-balancing across multiple copies of the same site, to “segmentation”, where you divide servers up by functionality: this set serves pages from your site and that set handles order processing; this set runs your forum software, and that one is a media server, serving only images and files and leaving the nasty work to the big boys.
Unfortunately, the core portion of any dynamic web site is the database, and that sets a limit on the number of servers you can add. Too many web servers means too many simultaneous connections hitting your database servers, and everything bogs down. Caching will help, and we’ll discuss that later, but eventually you’ll still reach the limit.
In “Twitter, Ruby, and Scalability,” I talked about segmenting database functionality so that you could have multiple databases or clusters, each devoted to a specific area like membership or orders, much like the web server segmentation mentioned above.
Twitter’s problem is that, out of the box, Ruby on Rails (RoR) wants to store all of its data in a single database, which limits the potential solutions to installing a larger database server, setting up a cluster, or implementing some sort of master-slave arrangement.
In ColdFusion, many frameworks and ORMs suffer from the same limitation: they tie you to a single datasource, effectively placing you in the same situation as Twitter.
So, once again, the problem comes down to too many servers talking to a single database. The solution, then, is to limit the number of servers that talk to the database by introducing a middle tier. These servers are known as application servers.
In this arrangement, the web servers handle requests and render pages using data gathered from the application servers using web services, which return their data using XML, WDDX, or JSON. The application servers, in turn, talk to the database to do their work, and handle most of the business logic.
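To make the arrangement concrete, here is a minimal sketch in Python (rather than ColdFusion) of the two tiers: a tiny application server that owns the data and answers over HTTP with JSON, and a `fetch_member` call standing in for what a web server would do. The endpoint path, port handling, and member record are all made up for illustration.

```python
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class MemberHandler(BaseHTTPRequestHandler):
    """Hypothetical application-server endpoint: it alone talks to the
    database and exposes business data as JSON over HTTP."""
    def do_GET(self):
        # A real app server would query the database here; we return
        # a canned record to keep the sketch self-contained.
        body = json.dumps({"id": 42, "name": "example"}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # keep the demo quiet

def fetch_member(port):
    # What the web server does: call the app server, never the database.
    with urllib.request.urlopen(f"http://127.0.0.1:{port}/member/42") as resp:
        return json.loads(resp.read())

# Run the app server on an ephemeral port in a background thread.
server = HTTPServer(("127.0.0.1", 0), MemberHandler)
port = server.server_address[1]
threading.Thread(target=server.serve_forever, daemon=True).start()
member = fetch_member(port)
server.shutdown()
print(member["name"])  # → example
```

The point of the shape, not the code: the web tier renders pages from the JSON it receives, and only the application tier holds database connections.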
If an application server can handle a dozen web servers, and a database server can handle a dozen application servers, then this “fan-out” arrangement can handle 144 web servers. (All numbers made up, but you get the idea.)
Further, you can set up multiple application servers to handle, say, memberships and registrations, or user session management, spreading the fan out even further. Design the application servers to be stateless, and they can be load-balanced as well.
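Statelessness is what makes the load balancing trivial: if no request depends on server-local state, any application server can take any call. A round-robin sketch, with hypothetical host names, shows the idea:

```python
import itertools

# Hypothetical pool of stateless application servers.
pool = ["app1.example.com", "app2.example.com", "app3.example.com"]

# Because the servers hold no per-user state, a dumb round-robin
# is enough: each request simply goes to the next host in the cycle.
next_server = itertools.cycle(pool).__next__
targets = [next_server() for _ in range(6)]
print(targets)
```

Six requests land evenly, two per server; a real load balancer adds health checks and failover, but statelessness is what lets it stay this simple.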
Using application servers requires you to be smart about your application design, especially if you initially want or need to launch your site without them.