Data compression is the reduction of the number of bits that need to be stored or transmitted, and it is very important in the web hosting field, since data kept on hard disk drives is often compressed to take up less space. There are many different compression algorithms, and their effectiveness depends on the type of content. Some of them remove only redundant bits, so no information is lost (lossless compression), while others discard bits deemed unnecessary, which degrades the quality once the data is uncompressed (lossy compression). Compression and decompression take processing time, so a hosting server has to be powerful enough to handle them in real time. One simple example of how binary data can be compressed losslessly is to record that there are five consecutive 1s, for instance, instead of storing all five 1s.
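
To make the lossless idea concrete, here is a minimal run-length encoding sketch in Python. The function names and the sample bit string are purely illustrative and are not part of any particular hosting platform's software.

def rle_encode(bits: str) -> list:
    """Compress a bit string into (bit, run_length) pairs,
    e.g. five consecutive 1s become ('1', 5)."""
    runs = []
    for bit in bits:
        if runs and runs[-1][0] == bit:
            runs[-1] = (bit, runs[-1][1] + 1)
        else:
            runs.append((bit, 1))
    return runs

def rle_decode(runs: list) -> str:
    """Reverse the encoding with no loss of information."""
    return "".join(bit * count for bit, count in runs)

original = "111110000011"
encoded = rle_encode(original)          # [('1', 5), ('0', 5), ('1', 2)]
assert rle_decode(encoded) == original  # lossless: the data round-trips exactly

Because the encoder only notes how long each run is, the original bit string can always be reconstructed exactly, which is what distinguishes lossless from lossy compression.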

Data Compression in Shared Hosting

The ZFS file system that runs on our cloud hosting platform uses a compression algorithm called LZ4. It is considerably faster and more efficient than comparable algorithms, especially for compressing and uncompressing non-binary data such as web content. LZ4 can even uncompress data faster than it can be read from a hard disk drive, which improves the overall performance of websites hosted on ZFS-based platforms. Because the algorithm compresses data well and does so quickly, we can generate several daily backups of all the content kept in the shared hosting accounts on our servers. Both your content and its backups take up less space, and since both ZFS and LZ4 work extremely fast, the backup generation will not affect the performance of the web servers where your content is stored.
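
As a rough illustration of how LZ4 handles repetitive, text-based content such as HTML, the short Python sketch below compresses and restores a sample page. It assumes the third-party lz4 package (installed with pip install lz4) rather than anything specific to our platform, and the sample markup is made up for the example.

# Sketch: LZ4 round-trip on repetitive web content (assumes the lz4 package).
import lz4.frame

html = b"<html><body>" + b"<p>Hello, visitor!</p>" * 1000 + b"</body></html>"

compressed = lz4.frame.compress(html)        # fast, lossless compression
restored = lz4.frame.decompress(compressed)  # equally fast decompression

assert restored == html                      # no information is lost
print("original:  ", len(html), "bytes")
print("compressed:", len(compressed), "bytes")

Repetitive markup like this compresses very well, which is why storing both live content and its backups on an LZ4-enabled file system requires noticeably less disk space.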