Serve External JavaScript Files Locally for Increased Speed
One of the ways I speed up AskApache.com is by downloading the external JavaScript files to my server and serving them locally instead of from the external sites. Currently I'm using this method to serve the Google Analytics JavaScript file and the Quantcast JavaScript file.
Speed Benefit
The speed benefit occurs because normally a site visitor has to perform a DNS lookup for both the google-analytics.com and quantcast.com servers, open a connection to each server, and then download the files. By hosting these scripts on your own server you remove all the extra DNS lookups, and because every file now comes from a single server you can take advantage of HTTP/1.1 features like keep-alive and pipelining, which let multiple files be requested over a single connection.
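The persistent-connection behavior is controlled in your Apache configuration. Here is a minimal sketch; the values shown are common illustrative defaults, not necessarily the settings used on AskApache.com:

# httpd.conf sketch: let one connection carry many requests
KeepAlive On
# maximum number of requests allowed per connection
MaxKeepAliveRequests 100
# seconds to wait for the next request on the same connection
KeepAliveTimeout 5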
Cache and Compression Control
Serving these JavaScript files from your own server gives you complete control over the caching and compression settings for those files, which may not seem like a big deal... unless you know about the advanced caching and compression techniques available to make your site super-fast.
Caching Control
By using the mod_rewrite cache hack like I do, you can make sure that a fresh version of the .js file is used by your visitors.
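I won't reproduce the full cache hack here, but a minimal sketch of the general idea looks something like the .htaccess fragment below (the /j/ path, the one-month lifetime, and the application/x-javascript type are assumptions for this example, not necessarily the exact rules I use): give the local copies a far-future expiration, and embed a version number in the URL that mod_rewrite strips off, so updating the number in your HTML is all it takes to push a fresh copy to every visitor.

# .htaccess sketch: far-future caching plus a version number in the filename
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType application/x-javascript "access plus 1 month"
</IfModule>
<IfModule mod_rewrite.c>
  RewriteEngine On
  # /j/ga.20080930.js is served from /j/ga.js, so bumping the number in your
  # HTML forces visitors to download the fresh copy
  RewriteRule ^j/(.+)\.[0-9]+\.js$ /j/$1.js [L]
</IfModule>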
Compression Control
You can also apply advanced compression to these files, which could mean using Apache's mod_deflate or mod_gzip modules, or even running the files through a compression tool like Dojo's ShrinkSafe, Packer, JSMin, or the YUI Compressor to shrink the JavaScript further.
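For the mod_deflate route, a minimal sketch can be as small as the following .htaccess fragment; the MIME types listed are assumptions, so match them to whatever your server actually sends for .js files:

# .htaccess sketch: gzip the javascript on the fly with mod_deflate
<IfModule mod_deflate.c>
  AddOutputFilterByType DEFLATE text/javascript application/javascript application/x-javascript
</IfModule>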
Automated Downloading with Crontab
Using crontab, I tell my web server to run this shell script every day so that I always have up-to-date copies of all the files. Here's the crontab entry I use:
@daily /bin/bash /web/cron-scripts/grab_javascripts.sh 2>&1 &>/dev/null
Shell Script to Download JS
I created a simple Bash shell script to download the JavaScript files. All you need to do is modify the variables so the files are saved to your directory and your own site is sent as the referring URL, and you are good to go. You can add as many files as you want by adding them to the JSFILE array.
You can also download the shell-script: grab_javascripts.sh
#!/bin/bash
# 2008-09-30
umask 022
VER="2.0"

### Set as the referring url for downloads
SITE=https://www.askapache.com/

### Directory to save the downloaded files
JSD=$HOME/j/

### The files to download
JSFILE[0]=http://www.google-analytics.com/ga.js

### run script with args to turn on debugging
[[ $# -ne 0 ]] && set -o xtrace && set -o verbose

### SHELL OPTIONS
set +o noclobber   # allowed to clobber files
set +o noglob      # globbing on
set -e             # abort on first error
shopt -s extglob

#--=--=--=--=--=--=--=--=--=--=--#
#  pt
#==-==-==-==-==-==-==-==-==-==-==#
function pt(){
  case ${1:-d} in
    i) echo -e "${C6}=> ${C4}${2} ${C0}" ;;
    m) echo -en "\n\n${C2}>>> ${C4}${2} ${C0}\n\n" ;;
    *) echo -e "\n${C8} DONE ${C0} \n" ;;
  esac
}

#=# CATCH SCRIPT KILLED BY USER
trap 'kill -9 $$' SIGHUP SIGINT SIGTERM

#=# MAKE MAIN SCRIPT NICE AS POSSIBLE SINCE IT DOESNT DO MUCH
renice 19 -p $$ &>/dev/null

#=# TURNS ON COLORING ONLY FOR TERMS THAT CAN SUPPORT IT
C="\033[";C0=;C1=;C2=;C3=;C4=;C5=;C6=;C7=;C8=;C9=;
C0="${C}0m";C1="${C}1;30m";C2="${C}1;32m";C3="${C}0;32m";C4="${C}1;37m"
C5="${C}0;36m";C6="${C}1;35m";C7="${C}0;37m";C8="${C}30;42m";C9="${C}1;36m";

clear
echo -e "${C1} __________________________________________________________________________ "
echo -e "|  ${C2}   ___     __  ___                 __                      ${C1}          |"
echo -e "|  ${C2}  / _ | ___/ /_/ _ | ___  ___ ____/ /  ___                 ${C1}          |"
echo -e "|  ${C2} / __ |(_-<  '_/ __ |/ _ \/ _ \`/ __/ _ \/ -_)              ${C1}          |"
echo -e "|  ${C3}/_/ |_/___/_/\_\/_/ |_/ .__/\_,_/\__/_//_/\__/             ${C1}          |"
echo -e "|  ${C3}                      /_/                                  ${C1}          |"
echo -e "|                                                                          |"
echo -e "|  ${C4}          LOCAL JAVASCRIPT FILES SCRIPT   Version ${VER}   ${C1}          |"
echo -e "${C1} __________________________________________________________________________ ${C0} \n\n"

#=# BUILD INCOMING DIRS
[[ ! -d "${JSD}" ]] && pt m "BUILDING DIRS" && mkdir -p $JSD &>/dev/null && pt i "CREATED $JSD" && pt

#=# DOWNLOAD JAVASCRIPT FILES
cd $JSD && pt m "DOWNLOADING JAVASCRIPT FILES"
for theurl in "${JSFILE[@]}"; do
  pt i "${theurl}"
  curl -m 60 --connect-timeout 10 --retry 10 --retry-delay 180 -s -S -L -e "${SITE}" -A 'Mozilla/5.0' -O "${theurl}"
done
pt && cd $OLDPWD

######## curl options
# -S                 Show error. With -s, make curl show errors when they occur
# -s                 Silent mode. Don't output anything
# -e                 Referer URL (H)
# -H                 Custom header to pass to server (H)
# -L                 Follow Location: hints (H)
# -A                 User-Agent to send to server (H)
# -m                 Maximum time allowed for the transfer
# --connect-timeout  Maximum time allowed for connection
# --globoff          Disable URL sequences and ranges using {} and []
# -O                 Write output to a file named as the remote file
# --retry            Retry request <num> times if transient problems occur
# --retry-delay      When retrying, wait this many seconds between each
# --retry-max-time   Retry only within this period
#############################################################################
exit 0