
What is your process for optimizing page load times?

#1
Because one of my favourite things to do is optimizing pages for speed on the client side (which is often about as far as you can go with a shared hosting account), I thought I would start a thread where we could share tips, tricks and methods for getting a faster load time for our visitors.

It is very important, especially if you've monetized your website, to get the website loaded and interactive in less than 1000ms. A useful reference point is the time to first byte (TTFB): the delay before the client receives the first byte of the response, which includes the DNS lookup, the connection setup, and the server's own processing time. There are plenty of marketing websites online that talk about how increases in speed can lead directly to increases in conversion rate (if you weren't aware, "conversion" refers to converting a "lead" - a potential customer - into an actual paying customer).

One term I will be using a lot here is "client" or "client side"... in this context I am referring to the end user, and often just the browser, not the user themselves. You can contrast this with "server side", which refers to the goings-on on your web server - the time it takes to execute your PHP scripts, for instance, is a "server side" metric, whereas the time it takes a browser to parse and render your HTML/CSS/JavaScript is a "client side" metric.

So! With this out of the way... you want to know... what can I do NOW to make my website FASTER?

Step 0: Disable .htaccess. Frequently this is only possible on a VPS or dedicated server, though on some shared hosting accounts with friendly support staff, you may be able to have them turn it off for your account. Why? Read the freakin' manual:
Apache httpd Wrote:You should avoid using .htaccess files completely if you have access to httpd main server config file. Using .htaccess files slows down your Apache http server. Any directive that you can include in a .htaccess file is better set in a Directory block, as it will have the same effect with better performance.
Basically, if you enable .htaccess, Apache will search for a root .htaccess (usually in /home/public_html/ or /home/www/), and then for a .htaccess in every descendant folder of the path the client requests. Say the client requests /images/blah.jpg: Apache looks for the root .htaccess and for /images/.htaccess. If the client instead requests /assets/images/blah.jpg, that's Apache looking up 3 .htaccess files, not just one. Be careful though: a lot of software is written with the expectation that you are using .htaccess, and will break if you don't add the appropriate rules to your server config.
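As a sketch, the same rules can live in a <Directory> block in the main server config (the document root here is hypothetical; adjust it to your setup):

```apache
# In httpd.conf or a vhost file -- NOT in .htaccess.
<Directory "/home/public_html">
    Options -Indexes
    AllowOverride None    # stops Apache from looking for .htaccess files at all
    Require all granted
</Directory>
```

`AllowOverride None` is the directive that actually turns the .htaccess lookups off.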
___________________________

Step 1: Ensure your HTML/CSS is valid. Always validate as much of your code as is humanly possible before putting it into production. In the old days, invalid HTML could cause serious rendering issues. This is less of a problem today: as browsers have matured, they are frequently able to correct malformed HTML on the fly, saving your users a terrible experience. However, that error correction does cost a bit of performance on the client side. All in all, if you take web development seriously, you should be able to write compliant HTML by instinct. I have never used more than notepad-style programs to write CSS/HTML/JavaScript. For instance, you cannot place block-level elements inside elements that only allow inline content - a <form> nested in a <p> element is invalid (and XHTML doctypes are especially strict about this). The same goes for malformed CSS values.
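A quick illustration of that nesting rule (the form markup here is just a placeholder):

```html
<!-- Invalid: <p> may only contain inline (phrasing) content -->
<p>Subscribe: <form action="/subscribe" method="post">...</form></p>

<!-- Valid: the block-level <form> sits outside the paragraph -->
<p>Subscribe:</p>
<form action="/subscribe" method="post">...</form>
```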
___________________________

Step 2: Minimizing HTTP requests. This is important and often overlooked. How many external scripts does your index file load? The last time I installed WordPress (a few years ago now) I noticed the software, by default, references some 5 external JavaScript files in the header, and most custom themes have not remedied this. WHY?!? There is absolutely no reason to include 5 external files on every page of your website unless you are modularly including/excluding them based on the use case (for instance, you only have images in blog posts, so you only load your lazyloadimages.js script on pages in the /blog/ directory -- this is a great optimization). If you are including all 5 files on every single page of your website, it is much faster to combine all 5 files into a single payload.js file and include it on every page. This saves 4 HTTP requests on every page client side, and will lighten the load on your webserver at the same time.

In fact, many times when the jQuery library is included, it is just for a handful of design elements (hamburger menu, lazy-loaded images and some basic DOM manipulation) -- this is a waste of a ~30kb download for your client, as you can accomplish all of it with homebrew JavaScript in under 10kb. Plus, you have more flexibility when coding it yourself.
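As a sketch of what "homebrew" replacements can look like (the helper names are my own, not a library API):

```javascript
// toggleClass: the classic hamburger-menu toggle, instead of jQuery's .toggleClass()
function toggleClass(el, cls) {
  el.classList.toggle(cls);
}

// onReady: run a callback once the DOM is parsed, instead of jQuery's $(fn).
// The second argument exists only so the function is easy to test.
function onReady(fn, doc = document) {
  if (doc.readyState !== 'loading') {
    fn();
  } else {
    doc.addEventListener('DOMContentLoaded', fn);
  }
}
```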
___________________________

Step 3: Prioritizing "above the fold" content: Many designers today still do not understand what "above the fold" content is... to put it succinctly, "above the fold" refers to anything that is in view, without scrolling, the moment the page loads. If you are familiar with CSS3, it is roughly equivalent to 100vh - in fact, a very popular technique for controlling what will be "above the fold" on every page is to set a large "hero unit" at the top of every page and make it 100vh (if you aren't familiar with this unit, it's 100% of the viewport height).

In order to get the quickest "above the fold" experience, all the content "above the fold" should be available without ANY external http requests. This is almost impossible for high-resolution images, HOWEVER, it is possible to inline SVG icons and use them in place of "hero photography" (hero photography is extremely hard to optimize for, because it requires the best resolution you can muster). 

This means: 
- any css rules applied to above the fold content need to be inlined on the page (in the header, <style></style>).
- any icons (hamburger menu icon, "x" icon for closing notifications etc.) need to be created with CSS rather than using icon fonts (because those incur at least 1 external HTTP request) - this is way easier than you think it will be, and you can then animate the icons!
- any external HTTP requests in your header should be delayed until after the page load event fires. WHY?? Because most JavaScript and CSS is render blocking. If you include CSS and JavaScript files in the header AT ALL, and don't tell the browser to defer them, they will block the rendering of the rest of your page until the browser has downloaded and parsed them. See Varvy.com's guide on deferring JavaScript, and the Filament Group's loadCSS function for stylesheets.
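A minimal sketch of those deferral patterns (the file paths are hypothetical; the rel="preload" trick is the same idea loadCSS implements):

```html
<!-- Defer a non-critical stylesheet until after first render -->
<link rel="preload" href="/css/payload.css" as="style"
      onload="this.onload=null;this.rel='stylesheet'">
<noscript><link rel="stylesheet" href="/css/payload.css"></noscript>

<!-- Defer a script: 'defer' downloads in parallel and executes after parsing -->
<script src="/js/payload.js" defer></script>
```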
___________________________

By following steps 1-3 you should see a marked improvement in your PageSpeed Insights score. No, it's not necessary to get 100/100 from Insights - chasing the score blindly can actually lead to a worse experience. You need to temper your optimizations by testing the website yourself... an extra few points on PageSpeed doesn't necessarily mean the experience became noticeably faster for your users; it could, indeed, leave your page locked up for longer while it downloads all its external resources.
___________________________

For testing, I recommend gtmetrix.com and webpagetest.org; WebPageTest is extremely in-depth. Most browser developer tools also have very detailed timelines that can help you find performance bottlenecks, but they require a much greater understanding of the internal workings of the browser.
___________________________
Poweruser optimization techniques include:
- Responsive lazyloading of images
- Responsive LQIP techniques - one of Akamai's most popular services (LQIP stands for Low Quality Image Placeholder: the first file you serve for any image is a tiny, heavily compressed placeholder stretched to the display size, which is swapped for the full-resolution image after the page has fully loaded)
- Serving all static assets from a cookieless domain
- Minify and gzip all static assets (e.g. using mod_deflate)
- Restructuring JavaScript for performance (careful with your "for" loops!)
- Modular progressive enhancement (using feature-detection libraries like Modernizr, you can add or remove features at load time to keep users from downloading scripts their browser can't utilize... for example, if the browser doesn't support flexbox, send a stylesheet that doesn't use flexbox, and vice versa. Flexbox layouts draw significantly faster than non-flexbox layouts; however, if the browser doesn't support it, it's wasted bandwidth)
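For the mod_deflate item above, a minimal server-config sketch (this goes in httpd.conf or a vhost, per Step 0; the exact MIME-type list is up to you):

```apache
# mod_deflate: gzip text assets before sending them to the client
<IfModule mod_deflate.c>
    AddOutputFilterByType DEFLATE text/html text/css text/plain
    AddOutputFilterByType DEFLATE application/javascript application/json
    AddOutputFilterByType DEFLATE image/svg+xml
</IfModule>
```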
___________________________
If you have any questions or comments please do not be afraid to post. I am always open to discussion.

Also I would be very interested to hear what everyone else does to make their website as fast as possible.

Thank you!
#2
Place your <script> tags at the bottom of the page, just before </body>.
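For example (or keep them in the <head> with the defer attribute, which modern browsers support; the script path is just a placeholder):

```html
<body>
  <p>...page content...</p>
  <!-- scripts last, so they don't block rendering of the content above -->
  <script src="/js/payload.js"></script>
</body>
```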
#3
Actually I'm using Joomla for my websites. I've installed the JCH Optimize plugin, which automates a lot of the functions described above. With Admin Tools (another good component for Joomla) you can repair database tables and optimize the system.
Note that the JCH plugin writes its rules directly to the .htaccess file for best performance.
#4
Thanks. I hadn't disabled .htaccess because I didn't know it slows down websites, so thanks for that. Oh, and for more optimization I recommend using Cloudflare with Always Online for even more speed (if you've bought your own domain, because I don't think you can do this with subdomains from free hosting).
#5
If your webpage queries a database, such as MySQL or even SQLite, I would suggest double-checking the queries. You may find that you can build one larger query instead of multiple subsequent queries, which reduces the number of fetches against the website's hard drive or storage. Keep the memory hierarchy in mind: CPU registers are the quickest, then the CPU's cache memory, then RAM (orders of magnitude slower than cache), and finally non-volatile media such as a hard drive (orders of magnitude slower still).

To sum up, try using smarter queries that avoid reading from non-volatile storage more often than necessary.
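For example, with a hypothetical posts/users schema, the classic "one query per row" pattern collapses into a single JOIN:

```sql
-- Instead of one query per post to look up its author...
SELECT id, title, author_id FROM posts;
SELECT name FROM users WHERE id = 42;   -- repeated for every post

-- ...fetch everything in a single round-trip:
SELECT p.id, p.title, u.name
FROM posts AS p
JOIN users AS u ON u.id = p.author_id;
```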
  



