Frontend resource optimisation: Requests vs caching
I'm working on a big site that has quite a bit of technical debt I need to work away. There is quite a bit of JS and CSS being loaded on the site. The files are aggregated and minified in layers: one layer is used on every page, while the other layers are only loaded on the pages that use them.
For example:

Page 1:
- default.css
- page1.css
- some-feature.css
- default.js
- page1.js
- some-feature.js

Page 2:
- default.css
- page2.css
- default.js
- page2.js

Page 3:
- default.css
- page3.css
- some-feature.css
- some-other-feature.css
- default.js
- page3.js
- some-feature.js
- some-other-feature.js
Now, besides these resources, there are a lot of external resources being loaded as well, used for tracking, advertising, social integration, etc.
I have a feeling these resources could be served faster (both on initial and subsequent requests) if they were aggregated and minified into one single JS and one single CSS file per page, for example page1.css + page1.js on page 1 and page2.css + page2.js on page 2. Although this would result in fewer requests, it would also end up loading the general content twice (like the original default.css).
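To make the tradeoff concrete, a quick back-of-the-envelope sketch (all sizes are made-up placeholders):

    // Back-of-the-envelope comparison of the two strategies (made-up sizes, KB gzipped).
    var sizes = { 'default': 80, page1: 10, page2: 10 };

    // Shared layers: default.* is downloaded once and then cached.
    var shared = sizes['default'] + sizes.page1 + sizes.page2;   // 100 KB total, more requests

    // One bundle per page: fewer requests, but default.* is inside both bundles.
    var perPage = (sizes['default'] + sizes.page1)
                + (sizes['default'] + sizes.page2);              // 180 KB total, 2 requests

    console.log('shared layers:', shared, 'KB; per-page bundles:', perPage, 'KB');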
What is the preferred way of loading these resources? And does anyone have test results on this?
TL;DR: most people prefer caching, because the page only has to survive the first page load with the full gzipped payload; after that, everything comes from cache.
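To make that caching actually stick, the usual approach is to fingerprint the aggregated file names and serve them with far-future cache headers. A minimal sketch, assuming an Express app and a build step that already writes content-hashed names like default.a1b2c3.css:

    // Serve fingerprinted assets with far-future caching (sketch, Express assumed).
    // File names are assumed to change whenever the content changes, so the browser
    // can safely cache them for a year and only re-download after a rebuild.
    var express = require('express');
    var app = express();

    app.use('/assets', express.static(__dirname + '/public/assets', {
      maxAge: '365d'  // long-lived cache; safe because file names are content-hashed
    }));

    app.listen(3000);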
Most of the projects I've seen had their front-end assets aggregated into single files. Gzip compression will take care of the rest; you might be amazed how huge the reduction in file size is.
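If you want to see the gzip effect for yourself, a small Node script can concatenate the layers and compare raw vs. compressed sizes. A sketch, using the file names from the question (adjust paths to your project):

    // Concatenate a page's asset layers and report raw vs. gzipped size (sketch).
    var fs = require('fs');
    var zlib = require('zlib');

    var layers = ['default.css', 'page1.css', 'some-feature.css'];
    var bundle = Buffer.concat(layers.map(function (file) {
      return fs.readFileSync(file);
    }));

    var gzipped = zlib.gzipSync(bundle);
    console.log('raw:     ' + bundle.length + ' bytes');
    console.log('gzipped: ' + gzipped.length + ' bytes');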
Consider outputting small amounts of page-specific CSS code as inline styling.
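A build-time sketch of that idea: read the small page-specific stylesheet and drop it into a style tag, so only the shared default.css remains an external request (the placeholder comment in the template is an assumption):

    // Inline a small page-specific stylesheet at build time (sketch).
    // Assumes the page template contains an <!-- inline-css --> placeholder.
    var fs = require('fs');

    var template = fs.readFileSync('page1.html', 'utf8');
    var pageCss = fs.readFileSync('page1.css', 'utf8');

    var html = template.replace(
      '<!-- inline-css -->',
      '<style>' + pageCss + '</style>'
    );

    fs.writeFileSync('page1.built.html', html);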
Regarding JavaScript, the best you can do is convert your assets into AMD modules and use RequireJS to handle dependencies and execution order.
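For instance, each feature becomes an AMD module and RequireJS pulls it in only on the pages that need it. A minimal sketch (module names match the question's file names; the jQuery dependency and init method are assumptions):

    // some-feature.js - an AMD module; RequireJS loads it only where required.
    define(['jquery'], function ($) {
      return {
        init: function () {
          $('.some-feature').addClass('enhanced');
        }
      };
    });

    // page1.js - the page entry point declares its dependencies;
    // RequireJS resolves load order, so script tag ordering no longer matters.
    require(['some-feature'], function (someFeature) {
      someFeature.init();
    });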
Inlining small pieces of JS code works great too; you can save a little on the main package size that way.
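The kind of snippet worth inlining is small, page-specific bootstrap code, such as configuration the main bundle reads on startup. A sketch (the window.pageConfig name is a hypothetical convention):

    // Inlined in a script tag in the page's HTML instead of the main bundle (sketch).
    // Keeps per-page values out of the shared, cached package.
    window.pageConfig = {
      page: 'page1',
      features: ['some-feature']
    };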
After all, having a huge amount of advertisements is a huge letdown for a front-ender, unless you can make the banners load asynchronously (see Postscribe).
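A sketch of async banner loading with Postscribe, which can run document.write-based ad tags after the page has rendered (assumes the Postscribe library is loaded on the page; the ad tag URL is a placeholder):

    // Load a document.write-based ad tag asynchronously with Postscribe (sketch).
    // Runs after the load event, so the banner can't block the main content.
    window.addEventListener('load', function () {
      postscribe('#ad-slot', '<script src="//ads.example.com/banner.js"><\/script>');
    });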
Consider using the PageSpeed tools from Google; it's a simple tool, but it might give you a tip or two on optimizing your payload.
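PageSpeed can also be driven from a script. A sketch against what I believe is the PageSpeed Insights v5 REST endpoint (the endpoint URL and response shape are assumptions from memory; verify against the current API docs):

    // Query PageSpeed Insights for a URL and print the performance score (sketch).
    var https = require('https');

    var url = 'https://www.googleapis.com/pagespeedonline/v5/runPagespeed' +
              '?url=' + encodeURIComponent('https://example.com');

    https.get(url, function (res) {
      var body = '';
      res.on('data', function (chunk) { body += chunk; });
      res.on('end', function () {
        var report = JSON.parse(body);
        // Response shape is an assumption based on the public v5 API.
        console.log('performance score:',
          report.lighthouseResult.categories.performance.score);
      });
    });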