Data compression is the reduction of the number of bits needed to store or transmit data. It plays an important role in web hosting, since data stored on hard drives is usually compressed to occupy less space. There are various compression algorithms, and their efficiency depends on the content. Some of them remove only redundant bits, so no information is lost (lossless compression), while others discard bits deemed unnecessary, which degrades the quality of the data once it is uncompressed (lossy compression). Compression takes processing time, so a hosting server has to be powerful enough to compress and uncompress data on the fly. A simple example of how binary code can be compressed is to "remember" that there are five consecutive 1s instead of storing all five of them.
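The "remember five consecutive 1s" idea is the basis of run-length encoding, one of the simplest lossless schemes. A minimal sketch in Python (the function names are ours, chosen for illustration):

```python
def rle_encode(bits):
    """Run-length encode a bit string: store (bit, count) pairs
    instead of repeating the same bit over and over."""
    runs = []
    i = 0
    while i < len(bits):
        j = i
        # Extend j to the end of the current run of identical bits.
        while j < len(bits) and bits[j] == bits[i]:
            j += 1
        runs.append((bits[i], j - i))
        i = j
    return runs

def rle_decode(runs):
    """Reverse the encoding: expand each (bit, count) pair."""
    return "".join(bit * count for bit, count in runs)

encoded = rle_encode("11111000011")
print(encoded)  # [('1', 5), ('0', 4), ('1', 2)]
assert rle_decode(encoded) == "11111000011"  # nothing was lost
```

Because decoding restores the input exactly, this is lossless: well suited to data where every bit matters, and effective only when the input contains long runs of repeated values.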

Data Compression in Cloud Hosting

The ZFS file system that runs on our cloud hosting platform uses a compression algorithm called LZ4. It is considerably faster than comparable algorithms, particularly at compressing and uncompressing non-binary data, i.e. web content. LZ4 can even uncompress data quicker than it can be read from a hard disk drive, which improves the overall performance of sites hosted on ZFS-based platforms. Because the algorithm compresses data well and does so quickly, we can generate several backups per day of all the content stored in the cloud hosting accounts on our servers. Both your content and its backups take up less space, and since both ZFS and LZ4 work very fast, backup generation does not affect the performance of the web servers where your content is kept.
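The key property the paragraph relies on is that LZ4 is lossless and that repetitive web content (HTML, CSS, templated pages) compresses far better than opaque binary data. LZ4 itself is not in the Python standard library, so the sketch below uses zlib purely as a stand-in lossless codec to demonstrate the same round-trip guarantee:

```python
import os
import zlib

# zlib stands in for LZ4 here; both are lossless, so decompression
# must restore the exact original bytes.
html = b"<html><body><p>Hello</p></body></html>" * 100
random_bytes = os.urandom(len(html))  # incompressible by construction

for label, data in [("repetitive HTML", html), ("random bytes", random_bytes)]:
    packed = zlib.compress(data)
    assert zlib.decompress(packed) == data  # lossless round trip
    print(f"{label}: {len(data)} -> {len(packed)} bytes")
```

Running this shows the repetitive HTML shrinking to a small fraction of its size while the random bytes barely shrink at all, which is why compression pays off so well for web content specifically.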

Data Compression in Semi-dedicated Servers

If you host your websites in a semi-dedicated server account with our company, you can take advantage of LZ4, the compression algorithm employed by the ZFS file system behind our cloud web hosting platform. What sets LZ4 apart from other algorithms is that it combines a good compression ratio with much higher speed, particularly when uncompressing web content. It does this even quicker than uncompressed data can be read from a hard drive, so your sites will perform faster. This speed comes at the expense of CPU time, which is not a problem for our platform, as it consists of a large number of clusters working together. On top of the better performance, you will have multiple daily backups at your disposal, so you can restore any deleted content with a few clicks. The backups are kept for an entire month, and we can afford to store them because they take up much less space than conventional backups.
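On a ZFS system, LZ4 is enabled per dataset with standard OpenZFS commands. The snippet below is illustrative only (the dataset name tank/web is hypothetical, and the commands must run with root privileges on a host that has a ZFS pool):

```shell
# Enable LZ4 compression on a dataset; only data written
# after this point is compressed.
zfs set compression=lz4 tank/web

# Confirm the property took effect.
zfs get compression tank/web

# See how much space compression is actually saving.
zfs get compressratio tank/web
```

The compressratio property reports the achieved ratio (e.g. 2.00x), which is also what makes the month of backups described above affordable to store.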