Slow websites annoy users. Amazon famously calculated that every additional 100 ms of page-load time on their site cost them 1% in sales.
A fast, efficient website doesn’t just benefit end-users, however. It also reduces overheads: the site can run on more cost-effective hardware, or handle greater volumes of traffic on comparable hardware.
For a recent eCommerce client, these considerations placed a strong emphasis on performance. Performance targets were set for both Server Response Time (SRT) and the time required to load the DOM. The development team’s focus was on the SRT, and the areas of interest for performance improvements were grouped into four: the application, the database, third-party Web services, and the server and software configuration.
From an application perspective, we found a performance overhead in utilising an MVC framework. Despite the flexibility and structure an MVC framework offers from a coding perspective, the overhead of autoloading, bootstrapping, routing and serving content for simple pages was large. We therefore developed a technique whereby the application would immediately shut down the routing process and serve cached page content if a valid cache entry was found. Output caching of selected page elements was also used, with this cached content drawn into the page via AJAX when it was initially outside the viewport. Static code analysis tools such as PHPMD and CodeSniffer were used to identify suboptimal code, overcomplicated expressions and unused parameters, methods and properties.
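The early-exit idea can be sketched as a small check at the top of the front controller, before the framework bootstraps. This is a minimal illustration, assuming a file-based cache; the helper names, TTL and cache directory are ours, not the client’s actual implementation.

```php
<?php
// Front controller sketch: consult the full-page cache before the MVC
// framework autoloads, bootstraps or routes anything.

function pageCacheKey(string $uri): string
{
    // Normalise the request URI into a filesystem-safe cache key.
    return 'page_' . md5($uri);
}

function fetchCachedPage(string $key, string $cacheDir, int $ttl): ?string
{
    $file = $cacheDir . '/' . $key . '.html';
    if (is_file($file) && (time() - filemtime($file)) < $ttl) {
        return file_get_contents($file);
    }
    return null; // cache miss or stale entry
}

$cacheDir = __DIR__ . '/cache';           // illustrative location
$key = pageCacheKey($_SERVER['REQUEST_URI'] ?? '/');

if (($html = fetchCachedPage($key, $cacheDir, 300)) !== null) {
    // Valid cache entry: serve it and shut down before routing begins.
    echo $html;
    exit;
}

// Cache miss: fall through to the normal autoload/bootstrap/route cycle.
// require __DIR__ . '/vendor/autoload.php';
// $app->run();
```

On a hit, the request never pays the autoload, bootstrap or routing cost at all, which is what made this shortcut so effective for simple pages.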
From a database perspective, Xdebug profiling was useful. As database calls occur through functions within the application, the high cost of database-interacting functions was quickly identified. Further profiling and debugging were performed with the MySQL EXPLAIN command, which highlighted each query’s execution plan and the optimisations that could be made. Calls to the database were reduced wholesale by storing often-used data in the application cache or registry (a singleton for storing variables which can be retrieved throughout the application), or within a Redis cache.
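The registry described above can be sketched as a simple singleton key/value store. The class and method names here are illustrative, not the client’s actual code; a Redis client could back the same interface instead of an in-memory array.

```php
<?php
// A minimal registry: a singleton key/value store so frequently used data
// is fetched from the database once per request and reused everywhere.

final class Registry
{
    private static ?Registry $instance = null;

    /** @var array<string, mixed> */
    private array $values = [];

    private function __construct() {}

    public static function instance(): Registry
    {
        return self::$instance ??= new Registry();
    }

    public function set(string $key, mixed $value): void
    {
        $this->values[$key] = $value;
    }

    public function has(string $key): bool
    {
        return array_key_exists($key, $this->values);
    }

    public function get(string $key, mixed $default = null): mixed
    {
        return $this->values[$key] ?? $default;
    }

    /**
     * Fetch from the registry, or compute (e.g. run the query) and store.
     * This is where repeated database calls collapse into a single one.
     */
    public function remember(string $key, callable $producer): mixed
    {
        if (!$this->has($key)) {
            $this->set($key, $producer());
        }
        return $this->get($key);
    }
}
```

The first `remember()` call for a key runs the query; every later call in the same request is served from memory, which is how wholesale reductions in database calls were achieved.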
The application integrates closely with third-party Web services. The initial step was to identify any unnecessary calls to these Web services via Xdebug and application logging, and to remove or reduce them.
Next, we identified the Web services that returned suitably static data which could be reliably cached within Redis. However, due to the volatile, changing nature of some endpoints, their responses could not be cached, and these requests would often incur a 100–150% increase in SRT. To mitigate this, functionality that required ‘live’ data from a Web service was populated via AJAX after page load. This improved the initial SRT and, importantly, the perceived response time for the user.
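The caching side of this can be sketched as a thin wrapper around the Web-service call. The interface, key scheme and TTL below are illustrative assumptions; in production the cache would be backed by Redis (for example via the phpredis extension), while volatile endpoints bypass the wrapper entirely.

```php
<?php
// Wrap a Web-service request with a cache so suitably static responses
// are served without a round trip to the third party.

interface ResponseCache
{
    public function get(string $key): ?string;
    public function setex(string $key, int $ttl, string $value): void;
}

function cachedServiceCall(
    ResponseCache $cache,
    string $endpoint,
    callable $doRequest,   // performs the real HTTP request, returns body
    int $ttl = 600         // illustrative TTL in seconds
): string {
    $key = 'ws_' . md5($endpoint);
    if (($hit = $cache->get($key)) !== null) {
        return $hit; // cache hit: no call to the third party
    }
    $body = $doRequest($endpoint);
    $cache->setex($key, $ttl, $body);
    return $body;
}
```

Endpoints with volatile data skip this wrapper: the page renders immediately with a placeholder, and a small AJAX request fetches the live data after load, keeping the SRT low.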
From a server and software configuration perspective, the latest stable versions of PHP and MySQL were installed. The performance benefits of recent PHP versions are well documented, and alongside advances in security and functionality, the upgrade delivered welcome performance gains. OPcache was enabled to store precompiled script bytecode in shared memory, removing the need for PHP to load and parse scripts on each request. Smaller improvements were made by optimising PHP’s include_path setting.
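A typical configuration of this kind looks like the following php.ini fragment. The values are illustrative, not the client’s exact production settings.

```ini
; php.ini — OPcache enabled with sensible defaults (illustrative values)
opcache.enable=1
opcache.memory_consumption=128      ; MB of shared memory for bytecode
opcache.interned_strings_buffer=16
opcache.max_accelerated_files=10000
opcache.validate_timestamps=1       ; set to 0 and reset on deploy for max speed
opcache.revalidate_freq=60          ; seconds between timestamp checks

; Keep include_path short and ordered by hit frequency so PHP resolves
; relative includes with as few filesystem lookups as possible.
include_path=".:/var/www/app/lib"
```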
No single technique above was a ‘silver bullet’ for website performance, but combined they have created a website that performs well for its users. Pages are served quickly, which minimises the time it takes for a user to purchase products and helps improve conversion rates. The website can run on cost-effective hardware while serving large volumes of users, and our projections show that it will continue to perform for years to come.