Data compression is the process of reducing the number of bits needed to store or transmit data. Compressed content occupies considerably less disk space than the original, so much more of it can fit in the same amount of storage. There are many compression algorithms that work in different ways: lossless algorithms remove only redundant bits, so no quality is lost when the data is uncompressed, while lossy algorithms discard some information, so uncompressing the data at a later time yields lower quality than the original. Compressing and uncompressing content consumes significant system resources, particularly CPU processing time, so any Internet hosting platform that applies compression in real time must have adequate power to support that feature. A simple example of how data can be compressed is to substitute a binary sequence such as 111111 with 6x1, i.e. "remembering" how many consecutive 1s or 0s there are instead of storing the actual bits.
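The substitution described above is known as run-length encoding, one of the simplest lossless compression techniques. A minimal sketch in Python (the function names are illustrative, not from any particular library):

```python
def rle_encode(bits: str) -> list[tuple[int, str]]:
    """Collapse runs of identical characters into (count, char) pairs."""
    runs = []
    for ch in bits:
        if runs and runs[-1][1] == ch:
            # Same character as the current run: extend the run.
            runs[-1] = (runs[-1][0] + 1, ch)
        else:
            # Different character: start a new run of length 1.
            runs.append((1, ch))
    return runs

def rle_decode(runs: list[tuple[int, str]]) -> str:
    """Expand (count, char) pairs back into the original string."""
    return "".join(ch * count for count, ch in runs)

encoded = rle_encode("111111000011")
# encoded == [(6, '1'), (4, '0'), (2, '1')] -- "6x1, 4x0, 2x1"
assert rle_decode(encoded) == "111111000011"
```

Because decoding reproduces the input exactly, this scheme is lossless; it pays off only when the data contains long runs of repeated symbols, which is why practical algorithms combine it with other techniques.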