Tiny Core Linux
Tiny Core Base => Corepure64 => Topic started by: tinycorelinux on December 13, 2020, 01:41:13 AM
-
As the title says, the system cannot download large files: whenever the file is too large, the system hangs and crashes before the download finishes.
Whether I download from the command line or from the file manager,
the system freezes and stops responding.
For example:
wget -T0 https://mirrors.tuna.tsinghua.edu.cn/qt/archive/qt/5.12/5.12.0/qt-opensource-linux-x64-5.12.0.run
NOTE: I'm just using Qt as an example. I know Qt is in the TCL repo.
-
You ran out of RAM and have no swap. That happens when you try to download files that are too large into RAM. Save them to your hard drive instead.
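A quick way to confirm this (a generic Linux check, not TCL-specific) is to look at whether /tmp is a RAM-backed tmpfs and how much memory is free while the download runs:

```shell
# Is /tmp RAM-backed? "tmpfs" in the Type column means writes to /tmp eat RAM.
df -hT /tmp

# How much RAM is left? Watch this shrink as a download fills /tmp.
free -m
```

If `free -m` shows almost nothing available and there is no swap line, a multi-gigabyte download into /tmp will lock the system up exactly as described.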
-
Hello forum.
Perhaps you could try it from a different source or public mirror that you know of.
Sometimes they have very fast systems indeed (I mean throughput and bandwidth).
thx
v
-
Hi vinceASPECT
It's not a speed issue, he just ran out of RAM.
-
Rich and forum,
Ahh right, that is what confused me - it's actually as you described, Rich.
I never really understood servers and clients, i.e. which part of the hierarchy sets the delivery speed versus what your ISP delivers the data at.
I guessed it was about traffic, right... amounts.
So it's really about where the file is stored - in the case above, RAM.
So I wonder whether the person can download it to solid-state storage (a pen drive) or another device
to solve it.
many thanks Rich,
Vince
-
"That happens when you try to download too large files into RAM. "???
"he just ran out of RAM."???
Forgive me, I'm confused - I did not download the file into memory,
I downloaded it to /tmp. Does /tmp use RAM instead of the hard disk?
-
Yes, /tmp is in RAM - you need to save large files to a mounted storage device, for example /mnt/sdb1.
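As a sketch of what that looks like in practice - assuming the storage partition is /dev/sdb1 with mount point /mnt/sdb1 (example names; check yours with `fdisk -l` and adjust DEV/DEST to match):

```shell
#!/bin/sh
# Example device/mount point and URL from the thread -- adjust to your setup.
DEV=/dev/sdb1
DEST=/mnt/sdb1
URL=https://mirrors.tuna.tsinghua.edu.cn/qt/archive/qt/5.12/5.12.0/qt-opensource-linux-x64-5.12.0.run

download_to_disk() {
    # mount the partition unless it is already mounted
    grep -qs " $DEST " /proc/mounts || mount "$DEV" "$DEST" || return 1
    # -P saves into the given directory, so the file lands on disk, not in RAM
    wget -P "$DEST" "$URL"
}

# only attempt the download when the block device actually exists
[ -b "$DEV" ] && download_to_disk || echo "device $DEV not present, adjust DEV/DEST"
```

The key point is the `-P "$DEST"` (or an equivalent `cd /mnt/sdb1` first): wget then writes to real storage instead of the RAM-backed /tmp.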
-
"Yes, /tmp is in ram."
Okay, I get it.