********************************************************************************
MAQAO 2025.1.2 - ad4b42c12cfbc289a7a711f3ded92abe2eb90c0a::20250917-142411 || 2025/09/17
/beegfs/hackathon/users/eoseret/MAQAO_ad4b42/bin/maqao oneview -R1 -c=/beegfs/hackathon/users/eoseret/qaas_runs_test/175-924-9259/intel/llama.cpp/run/oneview_runs/defaults/gcc/oneview_run_1759250338/config.json --with-FLOPS object-coverage-threshold=0.1 lprof_params=btm=fp --replace xp=/beegfs/hackathon/users/eoseret/qaas_runs_test/175-924-9259/intel/llama.cpp/run/oneview_runs/defaults/gcc/oneview_results_1759250338 -of=html
CPY: [true] /beegfs/hackathon/users/eoseret/qaas_runs_test/175-924-9259/intel/llama.cpp/run/base_runs/defaults/gcc/exec --> /beegfs/hackathon/users/eoseret/qaas_runs_test/175-924-9259/intel/llama.cpp/run/oneview_runs/defaults/gcc/oneview_results_1759250338/binaries/exec
CPY: [true] /beegfs/hackathon/users/eoseret/qaas_runs_test/175-924-9259/intel/llama.cpp/build/llama.cpp/../gcc/bin/libggml-base.so --> /beegfs/hackathon/users/eoseret/qaas_runs_test/175-924-9259/intel/llama.cpp/run/oneview_runs/defaults/gcc/oneview_results_1759250338/libs/libggml-base.so
CPY: [true] /beegfs/hackathon/users/eoseret/qaas_runs_test/175-924-9259/intel/llama.cpp/build/llama.cpp/../gcc/bin/libggml-blas.so --> /beegfs/hackathon/users/eoseret/qaas_runs_test/175-924-9259/intel/llama.cpp/run/oneview_runs/defaults/gcc/oneview_results_1759250338/libs/libggml-blas.so
CPY: [true] /beegfs/hackathon/users/eoseret/qaas_runs_test/175-924-9259/intel/llama.cpp/build/llama.cpp/../gcc/bin/libggml-cpu.so --> /beegfs/hackathon/users/eoseret/qaas_runs_test/175-924-9259/intel/llama.cpp/run/oneview_runs/defaults/gcc/oneview_results_1759250338/libs/libggml-cpu.so
CPY: [true] /beegfs/hackathon/users/eoseret/qaas_runs_test/175-924-9259/intel/llama.cpp/build/llama.cpp/../gcc/bin/libggml.so --> /beegfs/hackathon/users/eoseret/qaas_runs_test/175-924-9259/intel/llama.cpp/run/oneview_runs/defaults/gcc/oneview_results_1759250338/libs/libggml.so
CPY: [true] /beegfs/hackathon/users/eoseret/qaas_runs_test/175-924-9259/intel/llama.cpp/build/llama.cpp/../gcc/bin/libllama.so --> /beegfs/hackathon/users/eoseret/qaas_runs_test/175-924-9259/intel/llama.cpp/run/oneview_runs/defaults/gcc/oneview_results_1759250338/libs/libllama.so
CMD: /beegfs/hackathon/users/eoseret/MAQAO_ad4b42/bin/maqao lprof _caller=oneview btm=fp --xp="/beegfs/hackathon/users/eoseret/qaas_runs_test/175-924-9259/intel/llama.cpp/run/oneview_runs/defaults/gcc/oneview_results_1759250338/tools/lprof_npsu_run_0" --mpi-command="mpirun -n 1 " --collect-CPU-time-intervals -p=SSE_AVX_FLOP --collect-topology tpp=192 -ldi=libggml-base.so,libggml-blas.so,libggml-cpu.so,libggml.so,libllama.so -- /beegfs/hackathon/users/eoseret/qaas_runs_test/175-924-9259/intel/llama.cpp/run/oneview_runs/defaults/gcc/oneview_results_1759250338/binaries/exec -m meta-llama-3.1-8b-instruct-Q8_0.gguf -no-cnv -t 192 -n 512 -p \"what is a LLM?\" --seed 0
In run gcc_0, 39 loops were discarded from static analysis because their coverage
is below the object_coverage_threshold value (0.1%).
They represent 0.040999953373103% of the execution time. To include them, change the value
in the experiment directory's configuration file, then rerun the command with the additional parameter
--force-static-analysis
86 functions were discarded from static analysis because their coverage
is below the object_coverage_threshold value (0.1%).
They represent 0.16768302201672% of the execution time. To include them, change the value
in the experiment directory's configuration file, then rerun the command with the additional parameter
--force-static-analysis
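
The two notices above can be acted on as follows. This is a hedged sketch only: `<experiment_dir>` is a placeholder for the oneview_results directory created by the run, and the exact spelling of the threshold key inside config.json is an assumption based on the `object-coverage-threshold=0.1` option visible in the command line at the top of this log.

```
# Hedged sketch -- <experiment_dir> is a placeholder; the config key name
# mirrors the CLI option and is an assumption, not a verified key.
# 1. In <experiment_dir>/config.json, lower (or zero out) the threshold, e.g.:
#      object-coverage-threshold = 0.0
# 2. Rerun OneView on the same experiment, forcing static analysis of the
#    previously discarded loops and functions:
maqao oneview -R1 -c=<experiment_dir>/config.json --force-static-analysis
```

With the threshold at 0.0, the 39 loops and 86 functions reported above (together under 0.21% of execution time) would no longer be excluded from static analysis.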