Data compression reduces the number of bits needed to store or transmit information. Compressed data occupies considerably less disk space than the original, so more content fits in the same amount of storage. Many compression algorithms exist and they work in different ways. Lossless algorithms remove only redundant bits, so there is no loss of quality when the data is uncompressed; lossy algorithms discard bits deemed unnecessary, so uncompressing the data later yields lower quality than the original. Compressing and uncompressing content consumes significant system resources, especially CPU time, so any hosting platform that compresses data in real time must have enough processing power to support the feature. A simple example of compression is run-length encoding: a binary sequence such as 111111 is replaced with 6x1, i.e. the number of consecutive 1s or 0s is "remembered" instead of the literal bits being stored.
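The run-length idea described above can be sketched in a few lines of Python. This is a minimal illustration, not any particular file system's algorithm; the function names are our own:

```python
def rle_encode(bits: str) -> list[tuple[int, str]]:
    """Run-length encode a bit string, e.g. '111111' -> [(6, '1')]."""
    if not bits:
        return []
    runs = []
    count = 1
    # Walk adjacent pairs; a change of symbol closes the current run.
    for prev, cur in zip(bits, bits[1:]):
        if cur == prev:
            count += 1
        else:
            runs.append((count, prev))
            count = 1
    runs.append((count, bits[-1]))  # close the final run
    return runs

def rle_decode(runs: list[tuple[int, str]]) -> str:
    """Expand the (count, symbol) pairs back into the original string."""
    return "".join(symbol * count for count, symbol in runs)

encoded = rle_encode("1111110000011")
print(encoded)  # [(6, '1'), (5, '0'), (2, '1')]
assert rle_decode(encoded) == "1111110000011"  # lossless round trip
```

Because the round trip restores the input exactly, this is a lossless scheme: the longer the runs of identical bits, the greater the space saving.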

Data Compression in Shared Hosting

Our cloud web hosting platform runs on the ZFS file system, which uses the LZ4 compression algorithm. LZ4 can improve the performance of any site hosted in a shared hosting account with us: not only does it compress data more efficiently than the algorithms used by other file systems, it also uncompresses data faster than a hard disk can read it. This comes at the cost of considerable CPU time, which is not a problem for our platform because it uses clusters of powerful servers working together. Another advantage of LZ4 is that it lets us generate backups faster and store them in less disk space, so we can keep several daily backups of your files and databases without affecting server performance. That way, we can always restore content you may have deleted by accident.
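The lossless round trip that LZ4 performs can be illustrated with a short sketch. Python's standard library does not include LZ4, so zlib stands in here purely to demonstrate the principle of compressing redundant data and restoring it bit-for-bit:

```python
import zlib

# Highly redundant data, like repeated records or markup, compresses well.
original = b"AAAA" * 10_000

compressed = zlib.compress(original)   # zlib as a stand-in for LZ4
restored = zlib.decompress(compressed)

assert restored == original            # lossless: the data is unchanged
print(len(original), len(compressed))  # the compressed copy is far smaller
```

The same trade-off applies as in the article: the compress/decompress steps spend CPU time in exchange for less data read from and written to disk.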