Data compression is the process of encoding information with fewer bits than the original representation, so the compressed data takes up substantially less disk space and more content can be stored in the same amount of space. There are many compression algorithms that work in different ways. Some remove only redundant bits, so when the data is decompressed there is no loss of quality (lossless compression). Others discard bits deemed unnecessary, so decompressing the data later results in lower quality than the original (lossy compression). Compressing and decompressing content consumes a significant amount of system resources, particularly CPU time, so any web hosting platform that applies compression in real time must have ample processing power to support this feature. A simple example of compression is replacing a binary sequence such as 111111 with 6x1, i.e. storing the number of consecutive 1s or 0s instead of the sequence itself, a technique known as run-length encoding.
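The 6x1 idea above is run-length encoding. A minimal sketch in Python (the function names here are illustrative, not part of any particular product):

```python
from itertools import groupby

def rle_encode(bits: str) -> list[tuple[int, str]]:
    """Run-length encode a bit string: store (count, symbol) pairs
    instead of every repeated character."""
    return [(len(list(run)), symbol) for symbol, run in groupby(bits)]

def rle_decode(runs: list[tuple[int, str]]) -> str:
    """Reverse the encoding by expanding each run back out."""
    return "".join(symbol * count for count, symbol in runs)

data = "1111110000011"
encoded = rle_encode(data)
print(encoded)                      # [(6, '1'), (5, '0'), (2, '1')]
print(rle_decode(encoded) == data)  # True
```

Because only redundancy is removed, decoding restores the input exactly, which is what makes this kind of compression lossless.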
Data Compression in Shared Hosting
The compression algorithm that we employ on the cloud hosting platform where your new shared hosting account will be created is called LZ4, and it is built into the advanced ZFS file system that powers the platform. LZ4 outperforms the algorithms other file systems use: it achieves a higher compression ratio and processes data significantly faster. The speed advantage is most noticeable during decompression, which completes faster than the same data could be read from a hard disk. As a result, LZ4 improves the performance of every website stored on a server that uses the algorithm. We take full advantage of LZ4 in an additional way: its speed and compression ratio let us make multiple daily backups of the entire content of all accounts and keep them for a month. Not only do these backups take less space, but generating them does not slow the servers down, as often happens with other file systems.
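ZFS applies this compression transparently, so files are stored compressed and restored byte-for-byte on read. LZ4 itself is not in the Python standard library, so the sketch below uses the stdlib `zlib` module purely to illustrate the same lossless round trip and the space savings on redundant data; it is not the algorithm the platform uses:

```python
import zlib

# Web content tends to be highly redundant, so it compresses well
# under any general-purpose lossless algorithm.
original = b"<li>Sample page content</li>\n" * 500

compressed = zlib.compress(original)
restored = zlib.decompress(compressed)

# Lossless: the decompressed data is identical to the original.
assert restored == original
print(f"original:   {len(original)} bytes")
print(f"compressed: {len(compressed)} bytes")
```

The ratio varies with the data, but on repetitive input like this the compressed form is a small fraction of the original size, which is why compressed backups occupy far less space.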