Very often, at the beginning of a project, website performance is assumed rather than designed into the site. Trying to improve performance after development is much more costly than thinking ahead and designing for performance from the start.
This is true for other non-functional requirements as well: security, privacy, accessibility, et cetera.
When designing a website, you have an audience in mind, and an idea of how that audience will interact with your website. But how large will your audience be in three years, and how intensively will they interact with it?
When discussing performance, there are two important angles: the responsiveness of the website (how fast do pages load) and the traffic load the website can handle (how many concurrent sessions can it serve with sufficient responsiveness).
Another aspect to take into consideration is this: will the load on the website be more or less evenly distributed throughout the day or week? Or should we count on peaks?
The planned website is aimed at German-speaking territories (Germany, Austria, Switzerland, Liechtenstein, Luxembourg and a bit of Belgium, maybe).
You offer professional products. In other words, your website will be a B2B, or business-to-business, website. Most interaction will take place during business hours (say, 40 hours per week).
Further, assume that the average user session (very simplified of course) consists of:
So, per session we have:
Assume that at a certain point we have 12,000 sessions per week on the website. Further assume that all interaction takes place during business hours and is evenly distributed (we will come back to that later). Since our audience is all in the same time zone, we end up with an average of 12,000 / 40 = 300 sessions per hour.
The product owner adds a 50% margin to factor in the uneven distribution over the work week. We end up with 450 sessions per hour.
So, per hour we have:
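The estimate above can be sketched as a quick back-of-the-envelope calculation; the figures are the ones from the text, and the 50% margin is the product owner's allowance for uneven distribution:

```python
# Capacity estimate for the B2B site described in the text.
WEEKLY_SESSIONS = 12_000   # sessions per week
BUSINESS_HOURS = 40        # business hours per week
PEAK_MARGIN = 0.5          # 50% allowance for uneven distribution

average_per_hour = WEEKLY_SESSIONS / BUSINESS_HOURS
peak_per_hour = average_per_hour * (1 + PEAK_MARGIN)

print(average_per_hour)  # 300.0
print(peak_per_hour)     # 450.0
```

The same arithmetic is easy to extend with per-session page views and requests once those figures are known.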
What is an acceptable responsiveness for the website? Many wars have been fought over this. Traditionally, a standard response time (page fully loaded in the browser) of 3 seconds was often used. I prefer a 2-second maximum, as users are getting ever more impatient with websites that perform badly.
What can we do to make sure our design will meet these expectations?
There are many, but let me list a few by following the process that occurs when a user clicks on a link or button on your website:
With profiling tools, you may be able to find the most processor-intensive steps in the process. In many cases, you should expect the execution of (macro) partial views to take up a relatively large share of the processor capacity.
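As a generic illustration of the profiling idea, the sketch below uses Python's built-in cProfile; `render_page` and its helpers are hypothetical stand-ins for the steps of a real request pipeline, not Umbraco code:

```python
# Profiling a simulated page render to find the most expensive step.
import cProfile
import io
import pstats

def query_database():
    # Stand-in for fetching content from the database.
    return sum(i * i for i in range(10_000))

def render_partial_view():
    # Stand-in for executing a (macro) partial view; often the costly step.
    return "".join(str(i) for i in range(20_000))

def render_page():
    query_database()
    render_partial_view()

profiler = cProfile.Profile()
profiler.enable()
render_page()
profiler.disable()

stream = io.StringIO()
pstats.Stats(profiler, stream=stream).sort_stats("cumulative").print_stats(5)
print(stream.getvalue())  # the functions with the highest cumulative time
```

A real profiler report looks different per tool, but the workflow is the same: measure, sort by cumulative time, and optimize the top entries first.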
Test performance both on the server and in the client.
Umbraco offers multiple ways to employ databases. First of all, you could use an SDF (SQL Server Compact database) file. This database format can hold up to 4 GB of information. Or you can make a connection with a separately installed instance of SQL Server or SQL Server Express. If database performance becomes an issue, you have more options with the full SQL Server version, as this is a very scalable environment.
Monitoring database performance is something every webmaster should do regularly. Which queries take relatively long? How much memory does the database use? How big is the database itself?
While still in the implementation phase, look at custom database structures. Of course, Umbraco and LINQ are highly optimized. Nevertheless, it can be appropriate to create additional tables or views. Many developers prefer to stay away from this aspect of the job, but it is part of what a "full stack" developer should definitely master.
Lightweight pages mean less data to transfer and a shorter loading time in the browser.
Often, you can make the output page more compact by removing comments and empty sections, but this generally does not make much of a difference. Images are a different story altogether. Make sure that images have the resolution that is necessary, but no more than that.
The easiest way to accomplish this is to give clear instructions to the editors of the website. There are also automated ways to scale images to the optimum resolution, either at upload time or at render time. My personal preference: upload in the original resolution, scale to the optimum resolution at render time, and cache the result for future use.
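The core of render-time scaling is computing target dimensions that fit the space on the page while preserving the aspect ratio. The sketch below illustrates that calculation; the function name and the figures are illustrative, not part of any Umbraco API:

```python
# Compute dimensions that fit an image inside a bounding box,
# preserving aspect ratio and never upscaling the original.
def fit_within(width, height, max_width, max_height):
    scale = min(max_width / width, max_height / height, 1.0)
    return round(width * scale), round(height * scale)

# An editor uploads a 4000x3000 original; the template needs at most 800x600.
print(fit_within(4000, 3000, 800, 600))  # (800, 600)
print(fit_within(640, 480, 800, 600))    # (640, 480): small originals stay as-is
```

The scaled result would then be cached, so the cost of resizing is paid only once per image and size.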
Another method to create lightweight pages is to not embed code (such as CSS) in your template, but to reference a separate CSS file instead. On the first page visited during a session, this CSS file will be downloaded, but subsequently visited pages can use the copy that is now available in the browser cache.
For some situations, lazy loading is a good strategy. On this page, we will implement some lazy loading in the Umbraco Starter Kit.
Umbraco offers ways to cache a macro.
So we have three options, and three questions to ask ourselves:
The first question depends on the nature of the content processed in the macro. If you are building a news site and creating a list of the five most recent news items, caching should be over a period of no more than a minute or so. But if you are processing news categories in this particular macro, caching over a period of an hour may be fine (you do not add new categories all the time).
The second question refers to the fact that a macro may appear on different pages. Caching the macro output by page is not necessary if you want to display all categories of news items, but if a macro displays the previous and next page in a tutorial, the outcome of the macro will be different for each page. In that case, choose caching by page.
A further step is caching for individual users. If a site is built with a high degree of personalization, using profile information of the user, the outcome of a macro may differ from user to user. In that case, caching per individual user may be a good idea.
Often I find that developers have made sure caching is off during development, because that makes debugging pages and scripts easier. Before going live, however, we have to make sure that client browsers contribute to performance by caching as much as possible, thus reducing the load on the website.
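Browser caching is steered with HTTP response headers. As a hedged sketch of the idea, the function below picks a `Cache-Control` value per request path; the concrete values are illustrative defaults, not a recommendation for every site:

```python
# Choose Cache-Control headers: long-lived caching for static assets,
# revalidation for HTML pages whose content may change.
def cache_headers(path):
    static_extensions = (".css", ".js", ".png", ".jpg", ".woff2")
    if path.endswith(static_extensions):
        # One year; safe if asset filenames change when their content changes.
        return {"Cache-Control": "public, max-age=31536000"}
    # Pages: let the browser keep a copy but revalidate on every visit.
    return {"Cache-Control": "no-cache"}

print(cache_headers("/css/site.css"))   # {'Cache-Control': 'public, max-age=31536000'}
print(cache_headers("/news/latest"))    # {'Cache-Control': 'no-cache'}
```

In an ASP.NET/Umbraco deployment the same values would typically be configured on the web server rather than in application code.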