Compress and Split large files under Linux

One of the common problems I used to face is transferring large files from Linux through media that do not support large files, especially files larger than 100 GB, typically a virtual machine disk image or a backup file.

To compress and split a file under Linux, this command can be used:

tar -cvjf - large-file.dvi | split -b 1000m -d - "small-files.tar.bz."

This command compresses "large-file.dvi" and splits the output into pieces of 1000 MB each, named "small-files.tar.bz.00", "small-files.tar.bz.01", and so on (the -d option gives split numeric suffixes).
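As an optional step before transferring, checksums can be generated for each piece so the copies can be verified on the destination machine. A minimal sketch using sha256sum from GNU coreutils, with the same file names as above:

sha256sum small-files.tar.bz.* > small-files.sha256
sha256sum -c small-files.sha256

The first command records a checksum for every piece; the second, run after the transfer, reports OK or FAILED for each one.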

To join the files together again, this command can be used:

cat small-files.tar.bz.* > large-file.tar.bz
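Alternatively, if the joined archive is not needed as a separate file, the pieces can be streamed straight into tar, which avoids writing the intermediate file to disk (assuming the same file names as above):

cat small-files.tar.bz.* | tar -xvjf -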

and to extract the joined archive:

tar -xvjf large-file.tar.bz
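Before deleting the split pieces, it is worth confirming that the reassembled archive is readable. One way is to list its contents without extracting, for example:

tar -tvjf large-file.tar.bz > /dev/null

If this command finishes without errors, the archive was joined correctly.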