********************************************************************************
MAQAO 2025.1.1 - f3e40b5f1dbd62488bc0cc5f885d40677c87bfe8::20250630-094248 || 2025/06/30
/scratch/users/amazouz/Tools/aarch64/maqao/maqao.aarch64.2025.1.1/bin/maqao oneview -R1 -c=/scratch/users/amazouz/QAAS/service/Llama.cpp/ortce-gh/175-931-3387/llama.cpp/run/oneview_runs/defaults/orig/oneview_run_1759313640/config.json --replace xp=/scratch/users/amazouz/QAAS/service/Llama.cpp/ortce-gh/175-931-3387/llama.cpp/run/oneview_runs/defaults/orig/oneview_results_1759313640 -of=html 
CPY:  [true] /scratch/users/amazouz/QAAS/service/Llama.cpp/ortce-gh/175-931-3387/llama.cpp/run/oneview_runs/defaults/orig/exec --> /scratch/users/amazouz/QAAS/service/Llama.cpp/ortce-gh/175-931-3387/llama.cpp/run/oneview_runs/defaults/orig/oneview_results_1759313640/binaries/exec
CPY:  [true] /scratch/users/amazouz/QAAS/service/Llama.cpp/ortce-gh/175-931-3387/llama.cpp/build/llama.cpp/../build/bin/libggml-base.so --> /scratch/users/amazouz/QAAS/service/Llama.cpp/ortce-gh/175-931-3387/llama.cpp/run/oneview_runs/defaults/orig/oneview_results_1759313640/libs/libggml-base.so
CPY:  [true] /scratch/users/amazouz/QAAS/service/Llama.cpp/ortce-gh/175-931-3387/llama.cpp/build/llama.cpp/../build/bin/libggml-blas.so --> /scratch/users/amazouz/QAAS/service/Llama.cpp/ortce-gh/175-931-3387/llama.cpp/run/oneview_runs/defaults/orig/oneview_results_1759313640/libs/libggml-blas.so
CPY:  [true] /scratch/users/amazouz/QAAS/service/Llama.cpp/ortce-gh/175-931-3387/llama.cpp/build/llama.cpp/../build/bin/libggml-cpu.so --> /scratch/users/amazouz/QAAS/service/Llama.cpp/ortce-gh/175-931-3387/llama.cpp/run/oneview_runs/defaults/orig/oneview_results_1759313640/libs/libggml-cpu.so
CPY:  [true] /scratch/users/amazouz/QAAS/service/Llama.cpp/ortce-gh/175-931-3387/llama.cpp/build/llama.cpp/../build/bin/libggml.so --> /scratch/users/amazouz/QAAS/service/Llama.cpp/ortce-gh/175-931-3387/llama.cpp/run/oneview_runs/defaults/orig/oneview_results_1759313640/libs/libggml.so
CPY:  [true] /scratch/users/amazouz/QAAS/service/Llama.cpp/ortce-gh/175-931-3387/llama.cpp/build/llama.cpp/../build/bin/libllama.so --> /scratch/users/amazouz/QAAS/service/Llama.cpp/ortce-gh/175-931-3387/llama.cpp/run/oneview_runs/defaults/orig/oneview_results_1759313640/libs/libllama.so
CMD:   /scratch/users/amazouz/Tools/aarch64/maqao/maqao.aarch64.2025.1.1/bin/maqao lprof _caller=oneview  --xp="/scratch/users/amazouz/QAAS/service/Llama.cpp/ortce-gh/175-931-3387/llama.cpp/run/oneview_runs/defaults/orig/oneview_results_1759313640/tools/lprof_npsu_run_0" --mpi-command="mpirun -n 1 --bind-to none --report-bindings " --collect-CPU-time-intervals --collect-topology tpp=72  -ldi=libggml-base.so,libggml-blas.so,libggml-cpu.so,libggml.so,libllama.so  -- /scratch/users/amazouz/QAAS/service/Llama.cpp/ortce-gh/175-931-3387/llama.cpp/run/oneview_runs/defaults/orig/oneview_results_1759313640/binaries/exec -m meta-llama-3.1-8b-instruct-Q8_0.gguf -no-cnv -t 72 -n 512 -p \"what is a LLM?\" --seed 0
In run orig_0, 25 loops were discarded from static analysis because their coverage
is lower than the object_coverage_threshold value (0.01%).
That represents 0.025191399967298% of the execution time. To include them, change the value
in the experiment directory configuration file, then rerun the command with the additional parameter
--force-static-analysis
65 functions were discarded from static analysis because their coverage
is lower than the object_coverage_threshold value (0.01%).
That represents 0.95289209828479% of the execution time. To include them, change the value
in the experiment directory configuration file, then rerun the command with the additional parameter
--force-static-analysis