My experimental attempt at running AI locally was successful

{ As usual, I'm writing this with translation. }
Hardware test platform:
Processor: AMD A4-5300 (2012 model), 2 cores, with an integrated APU graphics unit,
but I specifically wanted an AI setup that runs without a GPU.
Motherboard with 16 GB RAM
8 GB USB flash drive (formatted as ext2)
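
Since everything runs on the CPU, it is worth knowing which SIMD instruction sets the processor offers; as far as I understand, the libggml-cpu-* variants in the directory listing further down are selected based on these flags. A quick, standard Linux check (nothing Ollama-specific):

# Print the SIMD-related CPU flags; ggml picks its libggml-cpu-* build variant from these.
grep -o -w -E 'sse4_2|avx|avx2|avx512f' /proc/cpuinfo | sort -u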
During the mentoring process with Google Gemini,
the slogan "Never give up" really kept me motivated.

I created chemistry solution scripts in Tcl.

I created the "run ai.sh" file,
The function of the script is to download and upload files from the internet.
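
Roughly, the idea behind "run ai.sh" is something like the sketch below. To be clear, this is only a minimal sketch and not my exact script: the /mnt/usb mount point and the commented-out download step are placeholders, while the tarball name, the log file names, the model directory and the qwen2.5:0.5b model all match the directory listing further down.

#!/bin/sh
# run ai.sh -- minimal sketch: unpack Ollama onto the USB drive,
# keep the models there too, and start the small qwen2.5:0.5b model.
set -e

USB=/mnt/usb                       # placeholder mount point for the 8 GB ext2 stick
cd "$USB"

# Download step (URL left out on purpose; take the release tarball
# from the official Ollama releases page), for example:
# wget -c <release-url>/ollama-linux-amd64.tar.zst

# Unpack the zstd-compressed tarball into ./ollama
mkdir -p ollama
tar --zstd -xf ollama-linux-amd64.tar.zst -C ollama

# Keep the model blobs and manifests on the USB drive as well
export OLLAMA_MODELS="$USB/ollama_models"

# Start the server in the background, logging to the files seen in the listing
./ollama/bin/ollama serve >ollama.log 2>ollama_error.log &
sleep 5

# Pull and run the 0.5B-parameter model -- small enough for CPU-only use
./ollama/bin/ollama run qwen2.5:0.5b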
Performance: in practice I can't say it responds faster than I do;
it outputs roughly one word per second (depending on processor power),
but I think its average code-writing speed is still faster than mine.
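
If you want an exact number instead of my rough one-word-per-second impression, Ollama itself can print timing statistics after each reply; the "eval rate" line is the generation speed in tokens per second. This is a standard Ollama flag, nothing specific to my setup:

# Print token-rate statistics ("eval rate") after the answer.
./ollama/bin/ollama run --verbose qwen2.5:0.5b "Write a short Tcl proc that adds two numbers."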

An excerpt of the tree output for the installation directory:
| |-- [ 17] libggml-base.so -> libggml-base.so.0
| |-- [ 21] libggml-base.so.0 -> libggml-base.so.0.0.0
| |-- [ 0] libggml-base.so.0.0.0
| |-- [ 0] libggml-cpu-alderlake.so
| |-- [ 0] libggml-cpu-haswell.so
| |-- [ 0] libggml-cpu-icelake.so
| |-- [ 0] libggml-cpu-sandybridge.so
| |-- [ 0] libggml-cpu-skylakex.so
| |-- [ 0] libggml-cpu-sse42.so
| `-- [ 0] libggml-cpu-x64.so
|-- [363M]  ollama
|   |-- [ 36M]  bin
|   |   `-- [ 36M]  ollama
|   `-- [327M]  lib
|       `-- [327M]  ollama
|           |-- [327M]  cuda_v12
|           |   |-- [  21]  libcublas.so.12 -> libcublas.so.12.8.4.1
|           |   |-- [111M]  libcublas.so.12.8.4.1
|           |   |-- [  23]  libcublasLt.so.12 -> libcublasLt.so.12.8.4.1
|           |   |-- [215M]  libcublasLt.so.12.8.4.1
|           |   |-- [  20]  libcudart.so.12 -> libcudart.so.12.8.90
|           |   |-- [712K]  libcudart.so.12.8.90
|           |   `-- [   0]  libggml-cuda.so
|           |-- [  17]  libggml-base.so -> libggml-base.so.0
|           |-- [  21]  libggml-base.so.0 -> libggml-base.so.0.0.0
|           |-- [   0]  libggml-base.so.0.0.0
|           |-- [   0]  libggml-cpu-alderlake.so
|           |-- [   0]  libggml-cpu-haswell.so
|           |-- [   0]  libggml-cpu-icelake.so
|           |-- [   0]  libggml-cpu-sandybridge.so
|           |-- [   0]  libggml-cpu-skylakex.so
|           |-- [   0]  libggml-cpu-sse42.so
|           `-- [   0]  libggml-cpu-x64.so
|-- [1.7G]  ollama-linux-amd64.tar.zst
|-- [ 18K]  ollama.log
|-- [ 123]  ollama_error.log
|-- [379M]  ollama_models
|   |-- [379M]  blobs
|   |   |-- [ 490]  sha256-0...
|   |   |-- [  68]  sha256-6...
|   |   |-- [ 11K]  sha256-8...
|   |   |-- [379M]  sha256-c...
|   |   `-- [1.4K]  sha256-e...
|   `-- [ 17K]  manifests
|       `-- [ 13K]  registry.ollama.ai
|           `-- [8.8K]  library
|               `-- [4.8K]  qwen2.5
|                   `-- [ 857]  0.5b
`-- [ 846]  txt.txt
2.9G used in 15 directories, 49 files
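
Everything the model needs is inside the ollama_models tree above: the blobs plus the qwen2.5/0.5b manifest. Once the server started by "run ai.sh" is running, it can also be queried over Ollama's local HTTP API instead of the interactive prompt; the endpoint below is standard Ollama, only the prompt text is my own example:

# Ollama listens on 127.0.0.1:11434 by default; /api/generate returns the answer as JSON.
curl http://localhost:11434/api/generate -d '{
  "model": "qwen2.5:0.5b",
  "prompt": "Write a short Tcl script that converts grams to moles, given the molar mass.",
  "stream": false
}'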