Copy a big file (100 GB or more) with resume within a Linux system

Copying a single very large file (100 GB or more) can be a problem. Such a copy can take several hours depending on the system, and you may not be able to leave the machine on for that long. What you need is a resumable copy, so that a single file can be copied across several sessions without starting over.
I use the "curl" command to copy single files larger than 100 GB.
Create and enter the destination folder, then run curl:
$ mkdir destination_folder
$ cd destination_folder
$ curl -C - -O file:///media/1TB_HDD/source_folder/100-GB-source-file.gz
This works well: you can stop the copy at any time and rerun the same command to resume it. The `-C -` option tells curl to continue from the end of the existing partial file, and `-O` keeps the original file name.
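The stop-and-resume workflow above can be sketched as a small script. This is a minimal, self-contained demo: it uses a 1 MB temporary file under /tmp as a stand-in for the 100 GB archive (the paths are illustrative, substitute your own), and wraps curl in a retry loop so an interrupted copy picks up where it stopped.

```shell
#!/bin/sh
# Demo of a resumable local copy with curl.
# The temp dirs and 1 MB file are stand-ins for the real source/destination.
set -e
src_dir=$(mktemp -d)
dst_dir=$(mktemp -d)
head -c 1048576 /dev/urandom > "$src_dir/source-file.gz"

cd "$dst_dir"
# -C - : continue from the end of any existing partial file
# -O   : keep the source file name
# Rerunning curl after an interruption resumes instead of restarting.
i=0
until curl -s -C - -O "file://$src_dir/source-file.gz"; do
    i=$((i+1))
    [ "$i" -lt 5 ] || break    # cap retries in this demo
    echo "copy interrupted, resuming..." >&2
done
# Byte-for-byte comparison confirms the copy is complete.
cmp "$src_dir/source-file.gz" "$dst_dir/source-file.gz"
```

In real use you would simply rerun the single curl command by hand after each interruption; the loop only illustrates that resuming is safe to repeat.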
To verify the copied file, generate a checksum on the source side:
$ cd /media/1TB_HDD/source_folder/
$ md5sum 100-GB-source-file.gz > test.checksum.md5
Then go back to the destination folder and check the copy against it:
$ cd -
$ md5sum -c /media/1TB_HDD/source_folder/test.checksum.md5
If md5sum reports OK, the file was copied correctly.
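The verification round-trip can be sketched end to end. This demo uses temporary stand-in paths instead of /media/1TB_HDD, and a plain cp in place of the long curl copy, since only the checksum steps are being illustrated.

```shell
#!/bin/sh
# Sketch of the md5sum verification round-trip with stand-in paths.
set -e
src_dir=$(mktemp -d)
dst_dir=$(mktemp -d)
head -c 65536 /dev/urandom > "$src_dir/100-GB-source-file.gz"
cp "$src_dir/100-GB-source-file.gz" "$dst_dir/"   # stands in for the curl copy

# On the source side: record the checksum.
cd "$src_dir"
md5sum 100-GB-source-file.gz > test.checksum.md5

# On the destination side: -c re-reads the local copy and compares.
# The checksum file stores a relative name, so md5sum -c checks the
# file in the current directory, i.e. the destination copy.
cd "$dst_dir"
md5sum -c "$src_dir/test.checksum.md5"   # prints "100-GB-source-file.gz: OK"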