A blog where I talk about the stuff I make, do and more

Friday, December 4, 2015

PageSpeed Insights and optimizing websites

The day before yesterday I let Google have a look at my website via the PageSpeed Insights tool. In case you're not familiar with it: it's an online tool that analyzes a website and gives it a score based on how fast it loads, how well images are compressed, etc. It's really nice to get some ideas about which areas of your site may need some more work (just in case you thought you were done).

I knew my site was far from perfect but it actually didn't score that badly. I didn't write down the exact score, but it was above 50/100 for Speed and 100/100 for User Experience. The things I should improve on, according to Google, were:
  • Leverage browser caching
  • Minify html, css and javascript
  • Optimize images
  • Eliminate render-blocking JavaScript and CSS in above-the-fold content
I postponed the first and fourth points for now. Browser caching doesn't make much sense at this time because I'm still working on the site and things may change; a cache is your worst enemy in those scenarios. Regarding render-blocking javascript and css (it's loaded in the <head> section, so the <body> won't render until all the files have loaded): my site needs javascript as soon as any content is shown, so I have to load it before rendering. This could be solved by showing a splash/loading screen and waiting for the javascript to load asynchronously. But since minifying javascript makes it load faster, it also partially eliminates some of the loading time my website requires before it can show something. Css and fonts can be loaded asynchronously, so that's not a real issue, but I still have to do that.

So I had a look at minifying css, javascript and optimizing the images.

I like to try to fix problems my own way before I look at other possibilities (sorry, I'm an autodidact, it's what we do ;)). I created a small python project that runs through all the files in a directory (including subdirectories, of course) and optimizes what it can:
  • Javascript using Python-jsmin
  • Css using Python-csscompressor
  • Images using PIL (or Pillow) and later also PyImageOptimiser
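The core of such a project is just a directory walk that dispatches each file to the right optimizer. Here's a minimal sketch of that idea using only the standard library; the crude regex-based `minify_css` is a hypothetical stand-in for what csscompressor does (the real library is far more thorough), and the `.js` and image branches are left as comments:

```python
import os
import re

def minify_css(css):
    """Very rough css minifier: strips comments and collapses whitespace.
    A crude stand-in for csscompressor, purely for illustration."""
    css = re.sub(r"/\*.*?\*/", "", css, flags=re.S)   # remove /* ... */ comments
    css = re.sub(r"\s+", " ", css)                    # collapse runs of whitespace
    css = re.sub(r"\s*([{}:;,])\s*", r"\1", css)      # drop spaces around punctuation
    return css.strip()

def optimize_tree(root):
    """Walk a directory tree (including subdirectories) and minify what we
    recognize, overwriting files in place."""
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            if name.endswith(".css"):
                with open(path, encoding="utf-8") as f:
                    source = f.read()
                with open(path, "w", encoding="utf-8") as f:
                    f.write(minify_css(source))
            # .js files would go through jsmin, images through PIL/Pillow, etc.
```

In the real project you'd swap `minify_css` for `csscompressor.compress` and add a `jsmin.jsmin` branch for javascript.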
It all works pretty well. My css and javascript files are reduced by an average of 30 to 50%, and jpegs are re-encoded at 50% quality, which saves a lot, too. The only thing I still need to look into some more is png's: Google keeps saying I should optimize them, but I already compressed them with everything PIL has to give (compression level 9 out of 9), and when I try to optimize the palette (the available colors in the image) it produces larger images than the original files. So I guess Photoshop isn't all that stupid after all; it actually does a better job at optimizing png's than I do. I gave PyImageOptimiser a try and, after fixing some indentation errors in its code, it did a pretty good job. Still, Google keeps telling me I should optimize my images. I'm not sure if Google is wrong or just not updating the files it checks. Oh well, I'll look into that some more.
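For anyone who wants to reproduce that png experiment, here's a small sketch (assuming Pillow) that re-encodes the same image three ways so you can compare file sizes: as-is, with zlib at its maximum setting (the "9 out of 9"), and quantized down to a 256-color palette. The function name and the in-memory-buffer approach are my own illustration, not from the actual project:

```python
import io
from PIL import Image  # Pillow

def png_sizes(img):
    """Return (plain, max_compressed, quantized) PNG byte sizes for an image."""
    def size(image, **save_kwargs):
        buf = io.BytesIO()
        image.save(buf, format="PNG", **save_kwargs)
        return buf.tell()

    plain = size(img)
    # zlib at its highest setting: PIL's "9 out of 9"
    max_compressed = size(img, optimize=True, compress_level=9)
    # reduce the palette to at most 256 colors; for photo-like images the
    # palette chunk plus dithering can actually make the file *larger*
    quantized = size(img.convert("RGB").quantize(colors=256))
    return plain, max_compressed, quantized
```

Running this over a folder of png's (e.g. `png_sizes(Image.open("logo.png"))`) makes it easy to see which variant wins per image instead of assuming the palette version is always smaller.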

After testing it on my new website and seeing that it worked well, I decided it would be nice if it also worked for other projects. So I let it run through the webapp part of Icerrr and it reduced the size of the www folder by ~40%! The installer shrunk from ~6MB to a little less than 4MB. Nice!

I'll open source the project as soon as I've made it a little easier to configure (command line options, probably) and put it on my github. I'll also update this post with a link to the repo.

UPDATE:

Here's the github: https://github.com/rejhgadellaa/rgpy-tools/tree/master/rgpy-web-optimizer
