
Web page compression
Just tinkering with an idea. I want to play with compressing web pages on the server and then having JavaScript decompress the page on the client side. I figure someone must have devoted some effort to such things. The following snippet is along the lines of what I want to do:
<script>
   // the compression would take place on the server
   function compress(s) {
      return s.replace(/Hello HTML/g, "%1%");
   }
   var htmlsource = "<html>";
   htmlsource += "<head><title>Hello HTML</title></head>";
   htmlsource += "<body>Hello HTML</body>";
   htmlsource += "</html>";
   var compressedsource = compress(htmlsource);

   // decompression would take place on the client side
   function decompress(s) {
      return s.replace(/%1%/g, "Hello HTML");
   }
   document.write(decompress(compressedsource));
</script>

The compress and decompress functions are rather brain dead in this example. So what I'd like is to find a compression/decompression scheme that meets three criteria:
  • Efficient. Trying to get the size of the page transfer down to free up bandwidth. The one advantage we have is that the text being compressed is all ASCII text (no binary) and has some statistical regularity (being HTML and embedded JavaScript).
  • Fast. It's gotta happen pretty quickly, since the user is waiting on things whilst this is going on. It would be nice if the time spent decompressing were less than the transfer time saved by the smaller page.
  • Reliable. Not sure if internet connections can be considered to deliver reliable bits. Drop a character in HTML, and all you lose is that character. Drop a bit in the compressed bytes, and you get garbage. Perhaps some sort of CRC check on the compressed string?
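On the reliability point: TCP already checksums every segment, so corrupted bits rarely survive transfer in practice, but if you wanted an end-to-end check anyway, a CRC-32 over the compressed string is cheap to compute. A minimal sketch (the function names are mine, not from any library):

```javascript
// Compute a standard CRC-32 (polynomial 0xEDB88320) over an ASCII string.
function crc32(str) {
   // Build the 256-entry lookup table.
   var table = [];
   for (var n = 0; n < 256; n++) {
      var c = n;
      for (var k = 0; k < 8; k++) {
         c = (c & 1) ? (0xEDB88320 ^ (c >>> 1)) : (c >>> 1);
      }
      table[n] = c;
   }
   var crc = 0xFFFFFFFF;
   for (var i = 0; i < str.length; i++) {
      crc = (crc >>> 8) ^ table[(crc ^ str.charCodeAt(i)) & 0xFF];
   }
   return (crc ^ 0xFFFFFFFF) >>> 0;
}

// The server would send the payload along with crc32(payload);
// the client recomputes the CRC before trying to decompress.
function verified(payload, expectedCrc) {
   return crc32(payload) === expectedCrc;
}
```

If the CRCs don't match, the client can re-request the page rather than render garbage.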


Thanks.
Use mod_gzip or equivalent
To deny the indirect purchaser, who in this case is the ultimate purchaser, the right to seek relief from unlawful conduct, would essentially remove the word consumer from the Consumer Protection Act
- [link|http://www.techworld.com/opsys/news/index.cfm?NewsID=1246&Page=1&pagePos=20|Nebraska Supreme Court]
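For reference, mod_gzip targets Apache 1.3; on Apache 2.x the bundled equivalent is mod_deflate. A minimal sketch of the directives involved (the module path and MIME type list are assumptions that vary by install):

```apache
# Apache 2.x: compress text responses on the fly with mod_deflate.
LoadModule deflate_module modules/mod_deflate.so

# Only these MIME types get compressed; binary formats are left alone.
AddOutputFilterByType DEFLATE text/html text/plain text/css application/x-javascript

# Tell caches the response varies by the client's Accept-Encoding
# (requires mod_headers).
Header append Vary Accept-Encoding
```

The server only applies the filter for clients that advertise gzip support, so no application code changes are needed.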
That's the direction I'd like to take
[Edit:] Will be looking into it. Thanks.
10q 10q 10q
Exactly what I needed - and I didn't even know it. Must remember to upgrade to a decent network administrator so that I don't have to think about such things. For now, I can go back to writing absurdly long web pages and not caring about efficiency. :-)

Guess I should ask if there's a catch to such a free gift?
Yes...
Processor power is required.

On both ends.

Other than that, a browser that understands compressed content.
--
[link|mailto:greg@gregfolkert.net|greg],
[link|http://www.iwethey.org/ed_curry|REMEMBER ED CURRY!] @ iwethey

Give a man a match, he'll be warm for a minute.
Set him on fire, he'll be warm for the rest of his life!
The browser is not really required
Browsers that support it advertise that they do, and it only gets turned on for them. So ones that don't support it still work.

As long as most of the browsers going by support it, you see the overall bandwidth wins that you want.

Cheers,
Ben
True... but for the compression to actually work...
A browser that supports it is needed.
True. BTW CPU time on the server isn't always needed.
If you use mod_gunzip, then you can store static content in pre-compressed form, and the server only spends (minimal) extra CPU when a browser doesn't understand compressed data. (Which is fairly rare these days - support for compressed data was standardized back in the 4.x browser generation.)

With mod_gunzip you're saving local disk space and activity, bandwidth, and not even taking up server CPU!

The catch is that this ONLY works for static content.

Cheers,
Ben
Re: Web page compression
[link|http://www.webreference.com/internet/software/servers/http/compression/|http://www.webrefere...http/compression/]

I got called away while I was looking this up for you, or I would have had it here earlier. :-)
Regards,

-scott anderson

"Welcome to Rivendell, Mr. Anderson..."