********************************************************************************
MAQAO 2025.1.1 - 3f8f629befdd61fbefd06c681d308ad28e2e93f8::20250630-093426 || 2025/06/30
/scratch/users/amazouz/Tools/x86_64/maqao/maqao.x86_64.2025.1.1/bin/maqao oneview -R1 -c=/scratch/users/amazouz/QAAS/service/Llama.cpp/sdp772511/175-793-6543/llama.cpp/run/oneview_runs/compilers/gcc_4/oneview_run_1757941614/config.json --with-FLOPS --replace xp=/scratch/users/amazouz/QAAS/service/Llama.cpp/sdp772511/175-793-6543/llama.cpp/run/oneview_runs/compilers/gcc_4/oneview_results_1757941614 -of=html 
CPY:  [true] /scratch/users/amazouz/QAAS/service/Llama.cpp/sdp772511/175-793-6543/llama.cpp/run/binaries/gcc_4/exec --> /scratch/users/amazouz/QAAS/service/Llama.cpp/sdp772511/175-793-6543/llama.cpp/run/oneview_runs/compilers/gcc_4/oneview_results_1757941614/binaries/exec
CPY:  [true] /scratch/users/amazouz/QAAS/service/Llama.cpp/sdp772511/175-793-6543/llama.cpp/build/llama.cpp/../gcc_4/bin/libggml-base.so --> /scratch/users/amazouz/QAAS/service/Llama.cpp/sdp772511/175-793-6543/llama.cpp/run/oneview_runs/compilers/gcc_4/oneview_results_1757941614/libs/libggml-base.so
CPY:  [true] /scratch/users/amazouz/QAAS/service/Llama.cpp/sdp772511/175-793-6543/llama.cpp/build/llama.cpp/../gcc_4/bin/libggml-blas.so --> /scratch/users/amazouz/QAAS/service/Llama.cpp/sdp772511/175-793-6543/llama.cpp/run/oneview_runs/compilers/gcc_4/oneview_results_1757941614/libs/libggml-blas.so
CPY:  [true] /scratch/users/amazouz/QAAS/service/Llama.cpp/sdp772511/175-793-6543/llama.cpp/build/llama.cpp/../gcc_4/bin/libggml-cpu.so --> /scratch/users/amazouz/QAAS/service/Llama.cpp/sdp772511/175-793-6543/llama.cpp/run/oneview_runs/compilers/gcc_4/oneview_results_1757941614/libs/libggml-cpu.so
CPY:  [true] /scratch/users/amazouz/QAAS/service/Llama.cpp/sdp772511/175-793-6543/llama.cpp/build/llama.cpp/../gcc_4/bin/libggml.so --> /scratch/users/amazouz/QAAS/service/Llama.cpp/sdp772511/175-793-6543/llama.cpp/run/oneview_runs/compilers/gcc_4/oneview_results_1757941614/libs/libggml.so
CPY:  [true] /scratch/users/amazouz/QAAS/service/Llama.cpp/sdp772511/175-793-6543/llama.cpp/build/llama.cpp/../gcc_4/bin/libllama.so --> /scratch/users/amazouz/QAAS/service/Llama.cpp/sdp772511/175-793-6543/llama.cpp/run/oneview_runs/compilers/gcc_4/oneview_results_1757941614/libs/libllama.so
CMD:   /scratch/users/amazouz/Tools/x86_64/maqao/maqao.x86_64.2025.1.1/bin/maqao lprof _caller=oneview  --xp="/scratch/users/amazouz/QAAS/service/Llama.cpp/sdp772511/175-793-6543/llama.cpp/run/oneview_runs/compilers/gcc_4/oneview_results_1757941614/tools/lprof_npsu_run_0" --mpi-command="mpirun -n 6  " --collect-CPU-time-intervals -p=SSE_AVX_FLOP  --collect-topology tpp=32  -ldi=libggml-base.so,libggml-blas.so,libggml-cpu.so,libggml.so,libllama.so  -- /scratch/users/amazouz/QAAS/service/Llama.cpp/sdp772511/175-793-6543/llama.cpp/run/oneview_runs/compilers/gcc_4/oneview_results_1757941614/binaries/exec -m meta-llama-3.1-8b-instruct-Q8_0.gguf -no-cnv -t 32 -n 512 -p \"what is a LLM?\" --seed 0
In run gcc_4, 69 loops were discarded from static analysis because their coverage
is lower than the object_coverage_threshold value (0.01%).
That represents 1.1205840732% of the execution time. To include them, change the value
in the experiment directory configuration file, then rerun the command with the additional parameter
--force-static-analysis
134 functions were discarded from static analysis because their coverage
is lower than the object_coverage_threshold value (0.01%).
That represents 2.1004228137172% of the execution time. To include them, change the value
in the experiment directory configuration file, then rerun the command with the additional parameter
--force-static-analysis
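The two notices above describe the same remediation: lower the coverage threshold in the experiment's configuration file and rerun with --force-static-analysis. A minimal sketch of that edit follows; it assumes the key is literally named object_coverage_threshold in a JSON configuration file (the real experiment-directory layout and key spelling should be checked against the MAQAO documentation), and it demonstrates the edit on a throwaway copy rather than the actual config.json from the run above.

```shell
#!/bin/sh
set -e

# Work on a throwaway copy; in practice you would edit the config file
# inside the experiment directory (oneview_results_*/...).
workdir=$(mktemp -d)
cat > "$workdir/config.json" <<'EOF'
{ "object_coverage_threshold": 0.01 }
EOF

# Lower the threshold (0.01% -> 0.001%) so the loops/functions that were
# discarded as "below coverage" are kept in static analysis.
# Plain sed without -i keeps this portable across GNU and BSD sed.
sed 's/"object_coverage_threshold": 0.01/"object_coverage_threshold": 0.001/' \
    "$workdir/config.json" > "$workdir/config.tmp.json"
mv "$workdir/config.tmp.json" "$workdir/config.json"
cat "$workdir/config.json"

# Then rerun OneView on the existing experiment directory, e.g.:
# maqao oneview -R1 -c=<experiment>/config.json --force-static-analysis ...
```

The only flag taken from the log itself is --force-static-analysis; everything else (file name, key, new threshold value) is illustrative.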