********************************************************************************
MAQAO 2025.1.1 - 3f8f629befdd61fbefd06c681d308ad28e2e93f8::20250630-093426 || 2025/06/30
/scratch/users/amazouz/Tools/x86_64/maqao/maqao.x86_64.2025.1.1/bin/maqao oneview -R1 -c=/scratch/users/amazouz/QAAS/service/Llama.cpp/sdp772511/175-793-6543/llama.cpp/run/oneview_runs/defaults/aocc/oneview_run_1757937332/config.json --with-FLOPS --replace xp=/scratch/users/amazouz/QAAS/service/Llama.cpp/sdp772511/175-793-6543/llama.cpp/run/oneview_runs/defaults/aocc/oneview_results_1757937332 -of=html
CPY: [true] /scratch/users/amazouz/QAAS/service/Llama.cpp/sdp772511/175-793-6543/llama.cpp/run/base_runs/defaults/aocc/exec --> /scratch/users/amazouz/QAAS/service/Llama.cpp/sdp772511/175-793-6543/llama.cpp/run/oneview_runs/defaults/aocc/oneview_results_1757937332/binaries/exec
CPY: [true] /scratch/users/amazouz/QAAS/service/Llama.cpp/sdp772511/175-793-6543/llama.cpp/build/llama.cpp/../aocc/bin/libggml-base.so --> /scratch/users/amazouz/QAAS/service/Llama.cpp/sdp772511/175-793-6543/llama.cpp/run/oneview_runs/defaults/aocc/oneview_results_1757937332/libs/libggml-base.so
CPY: [true] /scratch/users/amazouz/QAAS/service/Llama.cpp/sdp772511/175-793-6543/llama.cpp/build/llama.cpp/../aocc/bin/libggml-blas.so --> /scratch/users/amazouz/QAAS/service/Llama.cpp/sdp772511/175-793-6543/llama.cpp/run/oneview_runs/defaults/aocc/oneview_results_1757937332/libs/libggml-blas.so
CPY: [true] /scratch/users/amazouz/QAAS/service/Llama.cpp/sdp772511/175-793-6543/llama.cpp/build/llama.cpp/../aocc/bin/libggml-cpu.so --> /scratch/users/amazouz/QAAS/service/Llama.cpp/sdp772511/175-793-6543/llama.cpp/run/oneview_runs/defaults/aocc/oneview_results_1757937332/libs/libggml-cpu.so
CPY: [true] /scratch/users/amazouz/QAAS/service/Llama.cpp/sdp772511/175-793-6543/llama.cpp/build/llama.cpp/../aocc/bin/libggml.so --> /scratch/users/amazouz/QAAS/service/Llama.cpp/sdp772511/175-793-6543/llama.cpp/run/oneview_runs/defaults/aocc/oneview_results_1757937332/libs/libggml.so
CPY: [true] /scratch/users/amazouz/QAAS/service/Llama.cpp/sdp772511/175-793-6543/llama.cpp/build/llama.cpp/../aocc/bin/libllama.so --> /scratch/users/amazouz/QAAS/service/Llama.cpp/sdp772511/175-793-6543/llama.cpp/run/oneview_runs/defaults/aocc/oneview_results_1757937332/libs/libllama.so
CMD: /scratch/users/amazouz/Tools/x86_64/maqao/maqao.x86_64.2025.1.1/bin/maqao lprof _caller=oneview --xp="/scratch/users/amazouz/QAAS/service/Llama.cpp/sdp772511/175-793-6543/llama.cpp/run/oneview_runs/defaults/aocc/oneview_results_1757937332/tools/lprof_npsu_run_0" --mpi-command="mpirun -n 6 " --collect-CPU-time-intervals -p=SSE_AVX_FLOP --collect-topology tpp=32 -ldi=libggml-base.so,libggml-blas.so,libggml-cpu.so,libggml.so,libllama.so -- /scratch/users/amazouz/QAAS/service/Llama.cpp/sdp772511/175-793-6543/llama.cpp/run/oneview_runs/defaults/aocc/oneview_results_1757937332/binaries/exec -m meta-llama-3.1-8b-instruct-Q8_0.gguf -no-cnv -t 32 -n 512 -p \"what is a LLM?\" --seed 0
In run aocc_0, 73 loops were discarded from static analysis because their coverage
is lower than the object_coverage_threshold value (0.01%).
That represents 1.5562673186214% of the execution time. To include them, change the value
in the experiment directory configuration file, then rerun the command with the additional parameter
--force-static-analysis
145 functions were discarded from static analysis because their coverage
is lower than the object_coverage_threshold value (0.01%).
That represents 1.860599978354% of the execution time. To include them, change the value
in the experiment directory configuration file, then rerun the command with the additional parameter
--force-static-analysis
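The messages above describe the rerun procedure only in prose. A minimal sketch of what that could look like, assuming the threshold key is named object_coverage_threshold in the experiment's config.json (as the messages suggest) and reusing the oneview options already visible in the CMD line above; the path placeholder is hypothetical and the exact config.json layout is not verified against MAQAO documentation:

```
# 1. In the experiment directory's config.json, lower (or zero out) the
#    object_coverage_threshold value so low-coverage loops/functions are kept,
#    e.g. "object_coverage_threshold": 0   (key placement is an assumption)
# 2. Rerun oneview on the same experiment, adding --force-static-analysis:
maqao oneview -R1 \
    -c=/path/to/experiment/config.json \
    --with-FLOPS \
    --force-static-analysis \
    -of=html
```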