Alright. So we have a lighttpd server serving up all of our images.
It keeps daily logs, and we're considering moving all those images to our S3 account for better load times, but before we can do that I need to get at least a "feel" for what our daily transfer will look like.
Right now we have the standard lighttpd access log:
accesslog.filename = "/var/log/lighttpd/images.access_log"
accesslog.format = "%h %V %u %t \"%r\" %>s %b \"%{Referer}i\" \"%{User-Agent}i\""
So I want to dump all the "bytes transferred" values from that log file, nice and easy. With that format, %b (bytes sent) lands in the 10th whitespace-separated field:
cat images.access_log | awk '{print $10}'
which kicks out output similar to this:
19547
6138
17782
8044
345
0
2727
2125
1838
1649
2127
3275
3653
0
16688
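(Side note: if you want to double-check that the 10th field really is %b and hasn't been shifted by the timestamp or the quoted request line, here's a quick sketch against the same log -- it just prints the status code next to the byte count:)

awk '{print $9, $10}' images.access_log | head
# expect lines like "200 19547" -- HTTP status, then bytes sent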
Now -- I've done a bit of googling, and maybe I'm forgetting something, but is there a command hidden away in Linux somewhere that will take all that output and just add it together for me? That way I can run one command each day and have it spit out an absurdly large number until I get a baseline of our bandwidth per day.
--- EDIT ---
I found https://stackoverflow.com/questions/450799/linux-command-to-sum-integers-one-per-line
Is there any way to get awk to return the full number, instead of the scientific notation shown below?
cat images.access_log | awk '{print $10}' | awk '{s+=$1} END {print s}'
9.48886e+10
You can combine these into one awk command. The scientific notation shows up because awk keeps the sum as a floating-point number, and once it's too large for awk to print as a plain integer, print falls back to the output format OFMT (%.6g by default). Using printf with an explicit format gets the full number back. If the 10th column has the numbers, try:

awk '{s += $10} END {printf "%.0f\n", s}' images.access_log
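If the end goal is a per-day baseline to compare against the S3 move, a rough sketch along the same lines (assuming the log path from the config above; daily_bytes.log is just a made-up destination) converts the total into something readable and keeps a dated history:

# total bytes sent for the day, reported in gigabytes (field 10 is %b in the format above)
awk '{s += $10} END {printf "%.2f GB\n", s / 1024 / 1024 / 1024}' /var/log/lighttpd/images.access_log

# append a dated raw total to a running history file, run once a day before the log rotates
echo "$(date +%F) $(awk '{s += $10} END {printf "%.0f", s}' /var/log/lighttpd/images.access_log)" >> /var/log/lighttpd/daily_bytes.log

A couple of weeks of those lines gives you the daily baseline without having to stare at raw byte counts.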