Tiny Core Linux
Tiny Core Base => TCB Q&A Forum => Topic started by: mbertrand on July 18, 2012, 09:22:47 AM
-
I've noticed that the backup is very slow and that mydata.tgz is only 15 MB and located on sda1.
I don't have much installed, at least I don't think so. I only use .filetool.lst for persistence. No boot codes.
-
Use Control Panel, then System Stats; the default tab displays BigHomeFiles.
Use that to decide what to add to .xfiletool.lst. Likely it is browser database files.
Or open a Terminal window and type
$ filetool.sh -d
The -d is for dry run; it will display all the files that are included in your backup.
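For example, if the dry run shows browser cache files, you could exclude them by appending patterns to /opt/.xfiletool.lst (a sketch; the cache path here is only an illustration, use whatever paths the dry run actually prints):
$ echo 'home/tc/.cache' >> /opt/.xfiletool.lst
$ filetool.sh -d    # dry run again to confirm those files no longer appear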
-
34 MB uncompressed
15 MB compressed.
I have my two programs, which total 15 MB, and a Qt lib that is 12 MB; the rest is not worth mentioning, just basic stuff.
My computer is used in the automation industry for show control, so it is the bare minimum with X and ssh for my program.
-
Why would you include a static file, the Qt lib, in your backup?
Static content should be repackaged into an extension.
-
Well, as of now I don't know how to create extensions, and currently the repo only has Qt 4.7.4 and I need 4.8.2, so I'm managing it myself. Anyway, I believe the backup was fine with these files and then suddenly became slow, like really slow (2 minutes).
-
Visit the wiki; it is not difficult.
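The basic recipe is only a few commands. A rough sketch, assuming your Qt 4.8.2 lib landed under /usr/local/qt-4.8.2 (that path and the extension name are just illustrations):
$ mkdir -p /tmp/pkg/usr/local
$ cp -a /usr/local/qt-4.8.2 /tmp/pkg/usr/local/
$ mksquashfs /tmp/pkg /tmp/qt-4.8.2.tcz
$ cp /tmp/qt-4.8.2.tcz /etc/sysconfig/tcedir/optional/
$ echo 'qt-4.8.2.tcz' >> /etc/sysconfig/tcedir/onboot.lst
Then remove the lib from filetool.lst so it is no longer compressed at every shutdown.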
-
Of course I will, but do you think this is the cause of a 2-minute backup?
-
Well, is it a slow CPU? Compressing 34 MB of content can take a while.
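If you want to gauge raw compression speed separately from disk writes, something like this works (a sketch; /usr/local/qt-4.8.2 stands in for wherever your big files live):
$ time tar -czf - /usr/local/qt-4.8.2 2>/dev/null | wc -c
Piping into wc -c throws the archive away while still forcing all the compression work, so the reported time is essentially pure CPU.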
-
Hi mbertrand
"but do you think this is the cause of a 2-minute backup?"
You can easily answer that question yourself.
1. Copy your filetool.lst to filetool.bak.
2. Edit filetool.lst to remove any references to the two programs you are backing up.
3. Open a terminal and enter: time filetool.sh -b
4. Copy filetool.bak back to filetool.lst.
5. Run another backup so you don't lose your two programs.
Step three will report how long it took to run your backup.
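Spelled out as commands (a sketch, assuming your filetool.lst is in the default location under /opt; adjust the paths to match your setup):
$ cp /opt/filetool.lst /opt/filetool.bak
$ vi /opt/filetool.lst    # delete the lines naming the two programs and the Qt lib
$ time filetool.sh -b     # timed backup without the big files
$ cp /opt/filetool.bak /opt/filetool.lst
$ filetool.sh -b          # full backup again so nothing is lost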
-
My two programs and one Qt lib (my big files) total about 30 MB. The backup goes from 1m 40s to 0m 14s.
Wow, I'm surprised that compression would take that long. Well, thanks for your help.
-
You haven't provided any specs about either your CPU or your write throughput (bus + medium), but I can't see anything surprising in those values when compressing and writing such large binary files.
-
The fact remains that static content serves no useful purpose being re-written upon every shutdown. It should be read-only, as provided, by moving it out of any backup consideration and into an extension. That is the very purpose of extensions. Using the system in a way it was not intended will lead to poor results.
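To make that concrete (reusing the illustrative qt-4.8.2.tcz from the sketch above): load it once with
$ tce-load -i qt-4.8.2.tcz
then delete the lib's line from /opt/filetool.lst. The lib is mounted read-only at every boot and costs the backup nothing.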