
CSS and JS minification

I know there is already another discussion, but I can't find it... >:( [1]
Well, whatever.

After a question on an Italian board, I started thinking about this stuff (again).

One of the limitations of the current approach is that we may potentially end up loading different files for each different "type" of page.
For example: a pair for the board index, one for message indexes, one for topics, one for search, etc.

I tried to track down all the different hives a guest downloads at the moment, and I found 7 files (talking only about javascript), for a total of ~400 KiB.
Then I measured the size of the hives needed for browsing (board index + message index + topics): ~282 KiB.

So, I did some tests.
  • The first thing I did was pack the whole "scripts" directory (including jQuery and jQuery-UI)[2]
  • For the second test I removed jQuery and jQuery-UI and compacted everything else. The resulting file was ~380 KiB. Comparable to the sum of the various hives downloaded by a guest browsing the different pages of the website (see the ~400 KiB above).
  • Another step: I removed the admin.js file, obtaining a hive of ~347 KiB.
  • Then I removed the files that guests should never need: PersonalMessage, draft and draft.plugin; the result was a 330 KiB hive.
  • And as a final test, I removed anything related to the editor (dropAttachments.js, jquery.atwho.js, jquery.caret.min.js, jquery.sceditor.bbcode.min.js, jquery.sceditor.elkarte.js, jquery.sceditor.min.js, mentioning.js, mentioning.plugin.js, post.js, spellcheck.js, splittag.plugin.js), leaving only the files that are likely to be useful to a guest in a common configuration. In this setup, the resulting file was ~195 KiB, smaller than the 282 KiB a guest downloads when browsing this very website.
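The packing tests above boil down to "concatenate a subset of the scripts directory and measure the result". A minimal sketch of that, assuming a flat directory of .js files (the exclusion list and function name are mine, not ElkArte's build process):

```python
import os

# Example exclusions mirroring the tests above (admin, PM/draft files);
# purely illustrative, not the forum's actual file list.
EXCLUDE = {"admin.js", "PersonalMessage.js", "draft.js", "draft.plugin.js"}

def pack_scripts(src_dir, out_file, exclude=EXCLUDE):
    """Concatenate every .js file in src_dir (minus exclusions) into
    out_file and return the size of the resulting 'hive' in bytes."""
    out_name = os.path.basename(out_file)
    total = 0
    with open(out_file, "wb") as out:
        for name in sorted(os.listdir(src_dir)):
            # skip non-js files, excluded files, and the hive itself
            if not name.endswith(".js") or name in exclude or name == out_name:
                continue
            with open(os.path.join(src_dir, name), "rb") as f:
                data = f.read()
            out.write(data + b"\n")
            total += len(data) + 1
    return total
```

Running this repeatedly with a growing exclusion set reproduces the size ladder above (~700 → ~380 → ~347 → ~330 → ~195 KiB).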

Of course it's just a preliminary test, but based on that I would consider it worth exploring (in the future, probably 2.0) the possibility of creating "groups" of javascript and css files (and maybe code in general, I'm not sure on that part yet) and minifying them with a scheduled task in advance.
It may work somewhat like this:
  • a table in the db with schema | group | filename | type (js/css) | defer (true/false) |
  • each js/css file is added to the table (for example by an addon), and when one is added the scheduled task runs to rebuild the hives
  • from time to time the hives are rebuilt "just in case" (or maybe not)
  • each group would remain a separate file (so in the admin panel we'd be served at least two or three files: a hive for "guests", one for "registered users" and one for "admins")
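The table-plus-rebuild idea above can be sketched in a few lines. This is only a model of the proposal, with invented names: each row is (group, filename, type, defer), and the "scheduled task" concatenates the files of each (group, type, defer) combination into one hive.

```python
from collections import defaultdict

# Example rows for the proposed | group | filename | type | defer | table.
# File and group names are hypothetical.
rows = [
    ("guests", "script.js", "js", False),
    ("guests", "theme.css", "css", False),
    ("members", "post.js", "js", True),
]

def rebuild_hives(rows, read=lambda fn: "/* " + fn + " */"):
    """Group rows by (group, type, defer) and concatenate each bucket
    into a single hive. `read` stands in for reading the real file."""
    hives = defaultdict(list)
    for group, filename, ftype, defer in rows:
        hives[(group, ftype, defer)].append(read(filename))
    # one combined file per (group, type, defer) bucket
    return {key: "\n".join(parts) for key, parts in hives.items()}
```

Keeping defer and non-defer in separate buckets means each hive can be emitted with the right script attributes, instead of lumping everything together as in the preliminary test.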
The biggest problem I can see at the moment is how to bind the groups of javascript/css files to permissions. I'm thinking about posting, for example: a forum may want guests to be able to post, but that would require the "post" group to be served to guests as well, not only to registered members.
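One way to frame the permission problem: which hives a visitor gets would be driven by permissions, not by membership alone. A hedged sketch, where the group names and permission keys are assumptions for illustration:

```python
# Map each hive group to the permission that unlocks it (None = everyone).
# A guest with can_post would receive the "post" hive despite not being
# a registered member.
HIVE_REQUIRES = {
    "base": None,
    "post": "can_post",
    "admin": "is_admin",
}

def hives_for(permissions):
    """Return the list of hive groups to serve, given a permission map."""
    return [group for group, perm in HIVE_REQUIRES.items()
            if perm is None or permissions.get(perm)]
```

Under this model "guests who can post" is no longer a special case; the hive list just follows from whatever permissions the board grants.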

Dunno. Some food for thought.
[1] Unless I wrote it as a draft some time ago and then forgot to publish it... that has already happened twice, so it's a possibility worth considering. xD
[2] I didn't spend time separating defer and non-defer, I just put everything together; this gives a file of about 700 KiB. Quite big.
Bugs creator.
Features destroyer.
Template killer.

Re: CSS and JS minification

Reply #1

Good summary of the state of affairs.

I've gone back and forth on what's the most efficient approach as well, and I still have not come to any conclusions. What is good about today's way is that it is very flexible, especially for addons etc.

The downside is that a member may have several different hive packages to retrieve when the only difference between one and the other is a couple of bits, so they get hit with another 200 KiB file... once the browser cache is loaded it's invisible, but the first page loads do carry the payload.

I suppose doing it feature-based or member-group-based holds some merit. In an ideal world the member would download maybe a single 250 KiB file and be done for all pages, vs say 2 to 3 200 KiB ones; I'm just not sure how to get there.