General Topics > SEO

CSS Management, pageload speed, & BLOAT


Geoffrey:
There are two reasons to seek optimal pageload speed:
1 - Faster pageloads = more retained visitors.
2 - Google defines what it considers optimal pageload speed and provides recommendations for achieving it.  It seems unlikely that pageload speed is not a factor in its ranking algorithm.

Following Google's guidelines, the biggest and easiest improvement I made to pageload speed was to combine and minify CSS. 
If you look in your head.tpl, you will see two things:
1 - Calls for 5 separate CSS files.
2 - AC's notes on how to merge and minify CSS into one file, and even a link to an essay on CSS management.

Of those five files, bootstrap, font-awesome, and style.css are BIG.  My site uses a third or less of the rules in each of them.
 
As a rule of thumb, CSS files are render-blocking, meaning the page cannot start rendering until every CSS file is loaded and parsed. 
It takes longer to fetch and parse multiple stylesheets than it takes to fetch and parse one.
It takes longer to parse an un-minified CSS file than a minified one. 
Obvious solution: use a text editor to create a new CSS file called "bigboy" or whatever you want.  Copy/paste the entire contents of each of the five OEM CSS files into bigboy, in the same order head.tpl loads them (order matters for the cascade), then minify it, save it, and upload it to your live stylesheet folder. 
Then edit your live head.tpl to comment out the calls for the five separate CSS files and replace them with a single call for the new bigboy file.
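The combine-and-minify step can be sketched from the command line.  The filenames below are stand-ins (use whatever head.tpl actually calls), and the sed/tr minifier is deliberately crude — a real minifier such as clean-css or cssnano is safer and shrinks far more:

```shell
# Demo with two tiny stand-in stylesheets; substitute the five real files
# (bootstrap.min.css, font-awesome.min.css, style.css, ...) on your site.
printf 'body { color: red; }\n/* a comment */\n' > style.css
printf 'h1 { font-size: 2em; }\n' > menu.css

# 1. Concatenate in the same order head.tpl loads them (cascade order matters).
cat style.css menu.css > bigboy.css

# 2. Crude minification: strip /* ... */ comments, collapse whitespace runs.
sed 's,/\*[^*]*\*/,,g' bigboy.css | tr -s ' \t\n' ' ' > bigboy.min.css
```

Then the five stylesheet calls in head.tpl get replaced by a single call for bigboy.min.css.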

This dramatically improves AC pageload speed. 

But the job isn't really done, because most AC sites are like mine - they are only using a quarter or less of the rules contained in the oem css files.

The rest is just bloat. 

There are 2 methods I know of to get rid of this bloat:
1 - Firefox's DustMe extension can identify unused rules in your CSS file, which you can then manually delete.  This is a labor-intensive process that must be performed carefully over time, but you can shrink your bigboy CSS file from 9,000 lines to 2,000 lines pretty easily.  This speeds up parsing and therefore pageload, and will also make you feel less bloated.

2 - There are scripts available on GitHub that screen the OEM CSS files to identify only the rules actually being used, then combine and minify just those rules into one small CSS file that runs the site.  You always make your changes within the original separate large CSS files; the script runs in the background to fold your modifications into the single small CSS file the site runs on.

Method 2 is much better for long-term site maintenance, because the de-bloating, combining, and minifying steps are automated, meaning you don't have to undo and redo the process each time you edit your site within the OEM files.

Which leads to my questions: 
How do you use these kinds of scripts with AC?   
Has anyone done it? 
Has anyone tested any of the various different opensource css management scripts that are available? 
Is there a tutorial available on how to install and run these scripts on AC? 
Would you like to write one? 

I'd do it, but I don't even know step 1.

Thx. 

Basara:
Hello.

First on the list should be your server/hosting!  Then configure it correctly — for example, at least enable the .htaccess optimizations.

The largest content on your site is usually images, not CSS, so consider serving all your images through a CDN.
And be careful when you remove CSS rules.  The default AbanteCart template is responsive, so a CSS rule not used on a desktop screen may still be used on an iPad or mobile screen.  Test well after you make changes.  ;)

http://docs.abantecart.com/pages/tips/performance.html

Geoffrey:
Hi, for the sake of future readers who, like me, are not web development specialists, I offer the following comments:

Basara provides advice pertaining to AC features that should be enabled on any site:

1 - If you want fast pageloads, use a quality hosting company that provides reliable and fast server service. 

2 - Rename the OEM .htaccess.txt file to .htaccess, then add or uncomment various RewriteCond or RewriteRule lines within the file to accomplish the following objectives: direct all forms of the URL to your preferred one (i.e. www or https, whichever you select as your primary site address), enable AC's Retina screen image management, and enable the AC SEO URL, compression, and cache settings.
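For illustration, here is the general shape of those directives.  These are generic Apache examples, NOT a copy of AbanteCart's shipped .htaccess.txt — the domain, paths, and rule details are assumptions to adjust for your own site:

```shell
# Generic Apache examples of the kinds of rules meant above.
# Requires mod_rewrite / mod_deflate / mod_expires on the server.
cat > .htaccess <<'EOF'
RewriteEngine On
# Send every form of the URL to one canonical address (https + www here).
RewriteCond %{HTTPS} off [OR]
RewriteCond %{HTTP_HOST} !^www\. [NC]
RewriteRule ^(.*)$ https://www.example.com/$1 [L,R=301]

# Compression and browser caching.
<IfModule mod_deflate.c>
  AddOutputFilterByType DEFLATE text/html text/css application/javascript
</IfModule>
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType text/css "access plus 1 month"
  ExpiresByType image/jpeg "access plus 1 month"
</IfModule>
EOF
```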

3 - Use a content delivery network to deliver your site content, instead of having everything served only by your site host.  If you are Amazon, you need a CDN — but you also buy web development, so none of my posts will ever mean anything to you.  If you are a small mom & pop with 30 products and 6 images per product, my opinion is that a CDN is not going to make a big difference in pageload speed for most of your customers, provided of course that you have resized your images to site-appropriate dimensions and then optimized them with 4:2:0 chroma subsampling at 85% compression.

Basara's final comments relate to my earlier paragraph describing the Firefox DustMe extension.  The AC stylesheet obviously must include rules for various screen-size breakpoints because AC is responsive.  As he says, identifying unused rules must be done over a period of time during which the site is thoroughly evaluated on every conceivable screen size.  This is why DustMe is actually not a very helpful tool for a site such as AC that uses nearly 10,000 lines of CSS rules (unminified).  The hours spent manually deleting CSS rules over time would outweigh the small gain resulting from reduced bloat.

Which all brings us back to the central point of my post:
There are scripts available that apparently do a really good job over time of shrinking your css.  I wonder if anyone on the AC forum has made use of these scripts?  I wonder if the AC devs have considered them at all? 

Google's webmaster tools include a speed assessment tool that measures the pageload speed of your site in both the mobile and desktop universes.  It is very simple to check your speed, change a site parameter, and check again to see the effect of the change.  Google even goes so far as to make recommendations on how to improve your speed. 
This is how I determined that combining and minifying my CSS bumped my score from the high 70s to 90.  My site is almost what Google considers fast.

I'm quite sure that if I shrink my CSS from 10,000 lines to 2,000, I'll land at around 94, and I will also have done everything reasonable to optimize my site.

Sooo, has anyone had any success with CSS management scripts on AC or any other CMS?

Geoffrey:
In head.tpl,

Line 48 is a comment: "Set $faster_browser_rendering == true; for loading tuning."

Does that mean set it to true and leave it set to true, or

set it to true for load tuning but back to false for production?

Geoffrey:
Follow-up:

grunt is a JavaScript task-running app.

node.js is a JS runtime environment you can install on a PC and use via the command prompt.

Xenu's Link Sleuth is an old broken-link finder app that can also spit out a sitemap if you want. 

grunt-uncss is a community-maintained grunt task suited to cleaning the CSS files for an HTML document.

grunt-contrib-cssmin is a grunt task that minifies CSS.

With node and Link Sleuth installed, you can create a sitemap of all site links, reformat the map into a URL list suitable for grunt, set some limiting parameters for uncss, load the URL list into grunt, designate the target CSS files, and then run a grunt task that cleans out unused CSS rules and outputs a minified file.
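The "reformat the map into a URL list" step might look like the sketch below.  The export format here is a stand-in (Link Sleuth's real reports are tab-separated and vary by export option), but the idea — keep only http(s) links, de-duplicate, one URL per line — carries over:

```shell
# Stand-in for a Link Sleuth export; real exports are tab-separated reports.
printf 'https://example.com/\tok\nhttps://example.com/contact\tok\nmailto:x@example.com\tok\n' > xenu-export.txt

# Keep only http(s) links, drop duplicates: one URL per line for grunt/uncss.
grep -o 'https\?://[^[:space:]]*' xenu-export.txt | sort -u > urls.txt
```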

Two considerations:

1 - test your responsive functions after a cleanup.

2 - The vast majority of unused CSS in an AC site is in the bootstrap and font-awesome files.  So if you tell grunt-uncss to ignore the other CSS files, it will clean only bootstrap and FA.  This will cut your CSS filesize roughly in half while minimizing the threat of nuking rules you actually need.

There are two possible approaches to item 2 above:
A - Target all CSS files, tell grunt-uncss to ignore everything except BS and FA, then let grunt clean and minify the batch. 
B - Target only BS and FA, and let grunt-uncss clean them.  Then batch the cleaned BS and FA files with the other CSS files, and run another grunt task to combine and minify them.
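A sketch of what approach B's Gruntfile might look like.  Every file name, path, and ignore pattern here is an assumption — check the grunt-uncss README for the real options before copying anything:

```shell
# Sketch of approach B: uncss cleans only bootstrap + font-awesome, then
# cssmin batches the result with the untouched files.  All names assumed.
cat > Gruntfile.js <<'EOF'
module.exports = function (grunt) {
  grunt.initConfig({
    uncss: {
      dist: {
        options: {
          // Only clean the two big frameworks; leave the rest alone.
          stylesheets: ['stylesheet/bootstrap.min.css', 'stylesheet/font-awesome.min.css'],
          // Classes toggled by JS never appear in static HTML -- keep them.
          ignore: [/\.modal/, /\.collapsing/, /\.open/]
        },
        // Pages to scan; a URL list can be fed via uncss's urls option instead.
        files: { 'build/cleaned.css': ['index.html', 'product.html'] }
      }
    },
    cssmin: {
      dist: {
        files: {
          'stylesheet/bigboy.min.css': ['build/cleaned.css', 'stylesheet/style.css']
        }
      }
    }
  });
  grunt.loadNpmTasks('grunt-uncss');
  grunt.loadNpmTasks('grunt-contrib-cssmin');
  grunt.registerTask('default', ['uncss', 'cssmin']);
};
EOF
# Then: npm install grunt grunt-uncss grunt-contrib-cssmin --save-dev
# and:  grunt     (runs uncss, then cssmin)
```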

I haven't done this yet.  Maybe next week.  I will probably try a couple of different iterations, conservative vs. aggressive, and test to see what breaks.
 
I think my goal will be to do one big clean-up, archive the cleaned individual CSS files locally, make future edits to those files on a local site that runs on the individual files, combine and minify when I'm finished with changes, and upload the new minified file to the live server.

Info:

i.   https://wireflare.com/blog/using-grunt-and-uncss-on-a-wordpress-site/
ii.   http://deanhume.com/home/blogpost/automatically-removing-unused-css-using-grunt/
iii.   https://github.com/addyosmani/grunt-uncss
iv.   https://gruntjs.com/getting-started
v.   https://nodejs.org/en/docs/guides/getting-started-guide/
vi.   https://moz.com/blog/xenu-link-sleuth-more-than-just-a-broken-links-finder

