Data compression is the process of encoding data using fewer bits than the original representation, so the compressed information takes up less disk space than the initial data and extra content can be stored in the same amount of space. Compression algorithms work in different ways: lossless algorithms remove only redundant bits, so the data can be restored to its exact original form when it is uncompressed, while lossy algorithms also discard bits deemed unnecessary, so uncompressing the data later results in lower quality compared with the original. Compressing and uncompressing content consumes a considerable amount of system resources, CPU processing time in particular, so any web hosting platform that uses compression in real time must have ample power to support that feature. A simple example of how information can be compressed is run-length encoding, which replaces a binary sequence such as 111111 with 6x1, i.e. it "remembers" the number of consecutive 1s or 0s instead of keeping the actual sequence.
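To make the run-length idea concrete, here is a minimal sketch in Python; the function names rle_encode and rle_decode are hypothetical, chosen only for this illustration, and the scheme shown is a simplified version of what real compressors do.

from itertools import groupby

def rle_encode(bits: str) -> list[tuple[int, str]]:
    # Collapse each run of identical characters into a (count, character) pair.
    return [(len(list(run)), char) for char, run in groupby(bits)]

def rle_decode(pairs: list[tuple[int, str]]) -> str:
    # Expand the (count, character) pairs back into the original string.
    return "".join(char * count for count, char in pairs)

original = "111111000011"
encoded = rle_encode(original)          # [(6, '1'), (4, '0'), (2, '1')]
assert rle_decode(encoded) == original  # lossless: the exact input is restored

Because no bits are discarded, decoding reproduces the input exactly; this is what distinguishes a lossless scheme from a lossy one.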
Data Compression in Hosting
The ZFS file system that runs on our cloud web hosting platform employs a compression algorithm called LZ4. LZ4 is considerably faster than most widely used algorithms, particularly at compressing and uncompressing non-binary data, i.e. web content. LZ4 can even uncompress data faster than the data can be read from a hard disk, which improves the overall performance of Internet sites hosted on ZFS-based platforms. Because the algorithm compresses data well and does so very quickly, we are able to generate several backups of all the content kept in the hosting accounts on our servers every day. Both your content and its backups require less space, and since both ZFS and LZ4 work very quickly, generating the backups does not affect the performance of the web servers where your content is stored.
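To get a feel for how well LZ4 handles web content, here is a minimal sketch using the third-party Python lz4 package (an assumption: it is not part of the standard library and must be installed separately, e.g. with pip install lz4); the sample HTML is made up for illustration.

import lz4.frame

# Repetitive markup compresses well, much like real web content does.
html = b"<ul>" + b"<li>item</li>" * 500 + b"</ul>"

compressed = lz4.frame.compress(html)
restored = lz4.frame.decompress(compressed)

assert restored == html  # lossless: the original bytes come back intact
print(f"original: {len(html)} bytes, compressed: {len(compressed)} bytes")

The highly repetitive markup shrinks to a small fraction of its original size, which is why compressing web content and its backups saves so much disk space in practice.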