Executable Output


* [MAQAO] Info: Detected 1 Lprof instances in ip-172-31-47-249.ec2.internal. 
If this is incorrect, rerun with number-processes-per-node=X
what is a LLM? and why should i care?
A Large Language Model (LLM) is a type of artificial intelligence (AI) that can process and generate human-like text based on the input it receives. LLMs are trained on vast amounts of text data, which allows them to learn patterns, relationships, and context in language. This enables them to generate coherent and often informative responses to user queries.

Here are some reasons why you should care about LLMs:

1.  **Improved search and content generation:** LLMs can help improve search results by providing more accurate and relevant information. They can also generate content such as articles, blog posts, and even entire books.
2.  **Personalized experiences:** LLMs can be used to create personalized experiences for users. For example, they can generate customized news feeds, product recommendations, or even entire stories based on a user's interests and preferences.
3.  **Customer support:** LLMs can be used to provide 24/7 customer support by answering frequently asked questions, helping with simple transactions, and even handling complex issues.
4.  **Language learning:** LLMs can help language learners by providing personalized feedback, practicing conversations, and even generating language learning materials.
5.  **Content creation:** LLMs can be used to create content such as dialogue, scripts, and even entire stories. This can help writers, filmmakers, and other creators to generate ideas and develop their projects.

Some popular examples of LLMs include:

1.  **Chatbots:** Many companies use LLMs to power their chatbots, which can help customers with simple transactions, answer frequently asked questions, and even provide customer support.
2.  **Virtual assistants:** LLMs are used in virtual assistants like Siri, Google Assistant, and Alexa to provide information, set reminders, and even control smart home devices.
3.  **Language translation:** LLMs are used in language translation tools like Google Translate to provide accurate and context-specific translations.

Overall, LLMs have the potential to revolutionize the way we interact with technology, from simple tasks like search and customer support to more complex tasks like content creation and language learning. As LLMs continue to evolve, we can expect to see even more innovative applications and uses for these powerful tools. [end of text]




Your experiment path is /home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_0

To display your profiling results:
##############################################################################################################################################################################################################################################
#    LEVEL    |     REPORT     |                                                                                                   COMMAND                                                                                                   #
##############################################################################################################################################################################################################################################
#  Functions  |  Cluster-wide  |  maqao lprof -df xp=/home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_0      #
#  Functions  |  Per-node      |  maqao lprof -df -dn xp=/home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_0  #
#  Functions  |  Per-process   |  maqao lprof -df -dp xp=/home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_0  #
#  Functions  |  Per-thread    |  maqao lprof -df -dt xp=/home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_0  #
#  Loops      |  Cluster-wide  |  maqao lprof -dl xp=/home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_0      #
#  Loops      |  Per-node      |  maqao lprof -dl -dn xp=/home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_0  #
#  Loops      |  Per-process   |  maqao lprof -dl -dp xp=/home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_0  #
#  Loops      |  Per-thread    |  maqao lprof -dl -dt xp=/home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_0  #
##############################################################################################################################################################################################################################################
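The eight commands in the table differ only in their display flags (`-df` for functions vs `-dl` for loops; `-dn`/`-dp`/`-dt` for the per-node/per-process/per-thread views), so the long experiment path can be factored out into a variable. A minimal shell sketch, assuming `maqao` is on `PATH` (the commands themselves are copied from the table above):

```shell
# Experiment path for run 0, copied verbatim from the table above
XP=/home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_0

# Only invoke the reports if the maqao binary is actually available
if command -v maqao >/dev/null 2>&1; then
    maqao lprof -df xp="$XP"        # functions, cluster-wide
    maqao lprof -df -dt xp="$XP"    # functions, per-thread
    maqao lprof -dl xp="$XP"        # loops, cluster-wide
    maqao lprof -dl -dt xp="$XP"    # loops, per-thread
fi
```

The same pattern applies to the later runs; only the trailing `lprof_npsu_run_N` component of the path changes.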


* [MAQAO] Info: Detected 1 Lprof instances in ip-172-31-47-249.ec2.internal. 
If this is incorrect, rerun with number-processes-per-node=X
OMP: pid 8867 tid 8867 thread 0 bound to OS proc set {0}
OMP: pid 8867 tid 8966 thread 1 bound to OS proc set {48}
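The `OMP: ... bound to OS proc set {N}` lines are affinity reports from the OpenMP runtime (the LLVM/Intel runtimes print this format when `KMP_AFFINITY=verbose` is set; `OMP_DISPLAY_AFFINITY=true` is the portable OpenMP 5.0 equivalent). Here the two threads land on logical CPUs 0 and 48, a stride-48 spread over the node's 96 logical CPUs; the later runs show the same spread pattern at strides 24, 12, 6 and 4 as the thread count grows. A sketch of environment settings that would request this kind of binding and reporting (`./llama-cli` is a hypothetical binary name, not taken from this log):

```shell
export OMP_NUM_THREADS=2           # matches this 2-thread run
export OMP_PLACES=cores            # one place per physical core
export OMP_PROC_BIND=spread        # spread threads: CPUs {0}, {48} on a 96-CPU node
export OMP_DISPLAY_AFFINITY=true   # portable OpenMP 5.0 affinity report
# The "OMP: pid ... bound to OS proc set" wording comes from the LLVM/Intel
# runtime's own verbose mode:
export KMP_AFFINITY=verbose
# ./llama-cli -m model.gguf -p "what is a LLM? and why should i care?"  # hypothetical
```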




Your experiment path is /home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_1

To display your profiling results:
##############################################################################################################################################################################################################################################
#    LEVEL    |     REPORT     |                                                                                                   COMMAND                                                                                                   #
##############################################################################################################################################################################################################################################
#  Functions  |  Cluster-wide  |  maqao lprof -df xp=/home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_1      #
#  Functions  |  Per-node      |  maqao lprof -df -dn xp=/home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_1  #
#  Functions  |  Per-process   |  maqao lprof -df -dp xp=/home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_1  #
#  Functions  |  Per-thread    |  maqao lprof -df -dt xp=/home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_1  #
#  Loops      |  Cluster-wide  |  maqao lprof -dl xp=/home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_1      #
#  Loops      |  Per-node      |  maqao lprof -dl -dn xp=/home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_1  #
#  Loops      |  Per-process   |  maqao lprof -dl -dp xp=/home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_1  #
#  Loops      |  Per-thread    |  maqao lprof -dl -dt xp=/home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_1  #
##############################################################################################################################################################################################################################################


* [MAQAO] Info: Detected 1 Lprof instances in ip-172-31-47-249.ec2.internal. 
If this is incorrect, rerun with number-processes-per-node=X
OMP: pid 9048 tid 9048 thread 0 bound to OS proc set {0}
OMP: pid 9048 tid 9147 thread 1 bound to OS proc set {24}
OMP: pid 9048 tid 9148 thread 2 bound to OS proc set {48}
OMP: pid 9048 tid 9149 thread 3 bound to OS proc set {72}




Your experiment path is /home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_2

To display your profiling results:
##############################################################################################################################################################################################################################################
#    LEVEL    |     REPORT     |                                                                                                   COMMAND                                                                                                   #
##############################################################################################################################################################################################################################################
#  Functions  |  Cluster-wide  |  maqao lprof -df xp=/home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_2      #
#  Functions  |  Per-node      |  maqao lprof -df -dn xp=/home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_2  #
#  Functions  |  Per-process   |  maqao lprof -df -dp xp=/home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_2  #
#  Functions  |  Per-thread    |  maqao lprof -df -dt xp=/home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_2  #
#  Loops      |  Cluster-wide  |  maqao lprof -dl xp=/home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_2      #
#  Loops      |  Per-node      |  maqao lprof -dl -dn xp=/home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_2  #
#  Loops      |  Per-process   |  maqao lprof -dl -dp xp=/home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_2  #
#  Loops      |  Per-thread    |  maqao lprof -dl -dt xp=/home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_2  #
##############################################################################################################################################################################################################################################


* [MAQAO] Info: Detected 1 Lprof instances in ip-172-31-47-249.ec2.internal. 
If this is incorrect, rerun with number-processes-per-node=X
OMP: pid 9236 tid 9236 thread 0 bound to OS proc set {0}
OMP: pid 9236 tid 9336 thread 2 bound to OS proc set {24}
OMP: pid 9236 tid 9335 thread 1 bound to OS proc set {12}
OMP: pid 9236 tid 9337 thread 3 bound to OS proc set {36}
OMP: pid 9236 tid 9338 thread 4 bound to OS proc set {48}
OMP: pid 9236 tid 9339 thread 5 bound to OS proc set {60}
OMP: pid 9236 tid 9340 thread 6 bound to OS proc set {72}
OMP: pid 9236 tid 9341 thread 7 bound to OS proc set {84}




Your experiment path is /home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_3

To display your profiling results:
##############################################################################################################################################################################################################################################
#    LEVEL    |     REPORT     |                                                                                                   COMMAND                                                                                                   #
##############################################################################################################################################################################################################################################
#  Functions  |  Cluster-wide  |  maqao lprof -df xp=/home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_3      #
#  Functions  |  Per-node      |  maqao lprof -df -dn xp=/home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_3  #
#  Functions  |  Per-process   |  maqao lprof -df -dp xp=/home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_3  #
#  Functions  |  Per-thread    |  maqao lprof -df -dt xp=/home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_3  #
#  Loops      |  Cluster-wide  |  maqao lprof -dl xp=/home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_3      #
#  Loops      |  Per-node      |  maqao lprof -dl -dn xp=/home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_3  #
#  Loops      |  Per-process   |  maqao lprof -dl -dp xp=/home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_3  #
#  Loops      |  Per-thread    |  maqao lprof -dl -dt xp=/home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_3  #
##############################################################################################################################################################################################################################################


* [MAQAO] Info: Detected 1 Lprof instances in ip-172-31-47-249.ec2.internal. 
If this is incorrect, rerun with number-processes-per-node=X
OMP: pid 9362 tid 9362 thread 0 bound to OS proc set {0}
OMP: pid 9362 tid 9461 thread 1 bound to OS proc set {6}
OMP: pid 9362 tid 9463 thread 3 bound to OS proc set {18}
OMP: pid 9362 tid 9462 thread 2 bound to OS proc set {12}
OMP: pid 9362 tid 9465 thread 5 bound to OS proc set {30}
OMP: pid 9362 tid 9464 thread 4 bound to OS proc set {24}
OMP: pid 9362 tid 9469 thread 9 bound to OS proc set {54}
OMP: pid 9362 tid 9468 thread 8 bound to OS proc set {48}
OMP: pid 9362 tid 9472 thread 12 bound to OS proc set {72}
OMP: pid 9362 tid 9474 thread 14 bound to OS proc set {84}
OMP: pid 9362 tid 9466 thread 6 bound to OS proc set {36}
OMP: pid 9362 tid 9470 thread 10 bound to OS proc set {60}
OMP: pid 9362 tid 9473 thread 13 bound to OS proc set {78}
OMP: pid 9362 tid 9471 thread 11 bound to OS proc set {66}
OMP: pid 9362 tid 9475 thread 15 bound to OS proc set {90}
OMP: pid 9362 tid 9467 thread 7 bound to OS proc set {42}




Your experiment path is /home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_4

To display your profiling results:
##############################################################################################################################################################################################################################################
#    LEVEL    |     REPORT     |                                                                                                   COMMAND                                                                                                   #
##############################################################################################################################################################################################################################################
#  Functions  |  Cluster-wide  |  maqao lprof -df xp=/home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_4      #
#  Functions  |  Per-node      |  maqao lprof -df -dn xp=/home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_4  #
#  Functions  |  Per-process   |  maqao lprof -df -dp xp=/home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_4  #
#  Functions  |  Per-thread    |  maqao lprof -df -dt xp=/home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_4  #
#  Loops      |  Cluster-wide  |  maqao lprof -dl xp=/home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_4      #
#  Loops      |  Per-node      |  maqao lprof -dl -dn xp=/home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_4  #
#  Loops      |  Per-process   |  maqao lprof -dl -dp xp=/home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_4  #
#  Loops      |  Per-thread    |  maqao lprof -dl -dt xp=/home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_4  #
##############################################################################################################################################################################################################################################


* [MAQAO] Info: Detected 1 Lprof instances in ip-172-31-47-249.ec2.internal. 
If this is incorrect, rerun with number-processes-per-node=X
OMP: pid 9556 tid 9556 thread 0 bound to OS proc set {0}
OMP: pid 9556 tid 9656 thread 2 bound to OS proc set {8}
OMP: pid 9556 tid 9655 thread 1 bound to OS proc set {4}
OMP: pid 9556 tid 9657 thread 3 bound to OS proc set {12}
OMP: pid 9556 tid 9659 thread 5 bound to OS proc set {20}
OMP: pid 9556 tid 9660 thread 6 bound to OS proc set {24}
OMP: pid 9556 tid 9663 thread 9 bound to OS proc set {36}
OMP: pid 9556 tid 9658 thread 4 bound to OS proc set {16}
OMP: pid 9556 tid 9662 thread 8 bound to OS proc set {32}
OMP: pid 9556 tid 9666 thread 12 bound to OS proc set {48}
OMP: pid 9556 tid 9671 thread 17 bound to OS proc set {68}
OMP: pid 9556 tid 9664 thread 10 bound to OS proc set {40}
OMP: pid 9556 tid 9667 thread 13 bound to OS proc set {52}
OMP: pid 9556 tid 9668 thread 14 bound to OS proc set {56}
OMP: pid 9556 tid 9670 thread 16 bound to OS proc set {64}
OMP: pid 9556 tid 9665 thread 11 bound to OS proc set {44}
OMP: pid 9556 tid 9672 thread 18 bound to OS proc set {72}
OMP: pid 9556 tid 9661 thread 7 bound to OS proc set {28}
OMP: pid 9556 tid 9669 thread 15 bound to OS proc set {60}
OMP: pid 9556 tid 9673 thread 19 bound to OS proc set {76}
OMP: pid 9556 tid 9675 thread 21 bound to OS proc set {84}
OMP: pid 9556 tid 9674 thread 20 bound to OS proc set {80}
OMP: pid 9556 tid 9677 thread 23 bound to OS proc set {92}
OMP: pid 9556 tid 9676 thread 22 bound to OS proc set {88}




Your experiment path is /home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_5

To display your profiling results:
##############################################################################################################################################################################################################################################
#    LEVEL    |     REPORT     |                                                                                                   COMMAND                                                                                                   #
##############################################################################################################################################################################################################################################
#  Functions  |  Cluster-wide  |  maqao lprof -df xp=/home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_5      #
#  Functions  |  Per-node      |  maqao lprof -df -dn xp=/home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_5  #
#  Functions  |  Per-process   |  maqao lprof -df -dp xp=/home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_5  #
#  Functions  |  Per-thread    |  maqao lprof -df -dt xp=/home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_5  #
#  Loops      |  Cluster-wide  |  maqao lprof -dl xp=/home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_5      #
#  Loops      |  Per-node      |  maqao lprof -dl -dn xp=/home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_5  #
#  Loops      |  Per-process   |  maqao lprof -dl -dp xp=/home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_5  #
#  Loops      |  Per-thread    |  maqao lprof -dl -dt xp=/home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_5  #
##############################################################################################################################################################################################################################################


* [MAQAO] Info: Detected 1 Lprof instances in ip-172-31-47-249.ec2.internal. 
If this is incorrect, rerun with number-processes-per-node=X
OMP: pid 9698 tid 9698 thread 0 bound to OS proc set {0}
OMP: pid 9698 tid 9798 thread 2 bound to OS proc set {6}
OMP: pid 9698 tid 9797 thread 1 bound to OS proc set {3}
OMP: pid 9698 tid 9799 thread 3 bound to OS proc set {9}
OMP: pid 9698 tid 9801 thread 5 bound to OS proc set {15}
OMP: pid 9698 tid 9805 thread 9 bound to OS proc set {27}
OMP: pid 9698 tid 9804 thread 8 bound to OS proc set {24}
OMP: pid 9698 tid 9800 thread 4 bound to OS proc set {12}
OMP: pid 9698 tid 9802 thread 6 bound to OS proc set {18}
OMP: pid 9698 tid 9809 thread 13 bound to OS proc set {39}
OMP: pid 9698 tid 9808 thread 12 bound to OS proc set {36}
OMP: pid 9698 tid 9810 thread 14 bound to OS proc set {42}
OMP: pid 9698 tid 9807 thread 11 bound to OS proc set {33}
OMP: pid 9698 tid 9813 thread 17 bound to OS proc set {51}
OMP: pid 9698 tid 9814 thread 18 bound to OS proc set {54}
OMP: pid 9698 tid 9815 thread 19 bound to OS proc set {57}
OMP: pid 9698 tid 9812 thread 16 bound to OS proc set {48}
OMP: pid 9698 tid 9803 thread 7 bound to OS proc set {21}
OMP: pid 9698 tid 9811 thread 15 bound to OS proc set {45}
OMP: pid 9698 tid 9817 thread 21 bound to OS proc set {63}
OMP: pid 9698 tid 9816 thread 20 bound to OS proc set {60}
OMP: pid 9698 tid 9818 thread 22 bound to OS proc set {66}
OMP: pid 9698 tid 9824 thread 28 bound to OS proc set {84}
OMP: pid 9698 tid 9820 thread 24 bound to OS proc set {72}
OMP: pid 9698 tid 9821 thread 25 bound to OS proc set {75}
OMP: pid 9698 tid 9819 thread 23 bound to OS proc set {69}
OMP: pid 9698 tid 9822 thread 26 bound to OS proc set {78}
OMP: pid 9698 tid 9823 thread 27 bound to OS proc set {81}
OMP: pid 9698 tid 9826 thread 30 bound to OS proc set {90}
OMP: pid 9698 tid 9827 thread 31 bound to OS proc set {93}
OMP: pid 9698 tid 9806 thread 10 bound to OS proc set {30}
OMP: pid 9698 tid 9825 thread 29 bound to OS proc set {87}




Your experiment path is /home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_6

To display your profiling results:
##############################################################################################################################################################################################################################################
#    LEVEL    |     REPORT     |                                                                                                   COMMAND                                                                                                   #
##############################################################################################################################################################################################################################################
#  Functions  |  Cluster-wide  |  maqao lprof -df xp=/home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_6      #
#  Functions  |  Per-node      |  maqao lprof -df -dn xp=/home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_6  #
#  Functions  |  Per-process   |  maqao lprof -df -dp xp=/home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_6  #
#  Functions  |  Per-thread    |  maqao lprof -df -dt xp=/home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_6  #
#  Loops      |  Cluster-wide  |  maqao lprof -dl xp=/home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_6      #
#  Loops      |  Per-node      |  maqao lprof -dl -dn xp=/home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_6  #
#  Loops      |  Per-process   |  maqao lprof -dl -dp xp=/home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_6  #
#  Loops      |  Per-thread    |  maqao lprof -dl -dt xp=/home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_6  #
##############################################################################################################################################################################################################################################


* [MAQAO] Info: Detected 1 Lprof instances in ip-172-31-47-249.ec2.internal. 
If this is incorrect, rerun with number-processes-per-node=X
OMP: pid 9848 tid 9848 thread 0 bound to OS proc set {0}
OMP: pid 9848 tid 9949 thread 3 bound to OS proc set {7}
OMP: pid 9848 tid 9947 thread 1 bound to OS proc set {2}
OMP: pid 9848 tid 9948 thread 2 bound to OS proc set {4}
OMP: pid 9848 tid 9956 thread 10 bound to OS proc set {24}
OMP: pid 9848 tid 9951 thread 5 bound to OS proc set {12}
OMP: pid 9848 tid 9952 thread 6 bound to OS proc set {14}
OMP: pid 9848 tid 9959 thread 13 bound to OS proc set {31}
OMP: pid 9848 tid 9950 thread 4 bound to OS proc set {9}
OMP: pid 9848 tid 9957 thread 11 bound to OS proc set {26}
OMP: pid 9848 tid 9954 thread 8 bound to OS proc set {19}
OMP: pid 9848 tid 9963 thread 17 bound to OS proc set {41}
OMP: pid 9848 tid 9960 thread 14 bound to OS proc set {33}
OMP: pid 9848 tid 9953 thread 7 bound to OS proc set {16}
OMP: pid 9848 tid 9979 thread 33 bound to OS proc set {80}
OMP: pid 9848 tid 9965 thread 19 bound to OS proc set {46}
OMP: pid 9848 tid 9961 thread 15 bound to OS proc set {36}
OMP: pid 9848 tid 9964 thread 18 bound to OS proc set {43}
OMP: pid 9848 tid 9981 thread 35 bound to OS proc set {84}
OMP: pid 9848 tid 9955 thread 9 bound to OS proc set {21}
OMP: pid 9848 tid 9978 thread 32 bound to OS proc set {77}
OMP: pid 9848 tid 9980 thread 34 bound to OS proc set {82}
OMP: pid 9848 tid 9958 thread 12 bound to OS proc set {29}
OMP: pid 9848 tid 9962 thread 16 bound to OS proc set {38}
OMP: pid 9848 tid 9966 thread 20 bound to OS proc set {48}
OMP: pid 9848 tid 9967 thread 21 bound to OS proc set {50}
OMP: pid 9848 tid 9968 thread 22 bound to OS proc set {53}
OMP: pid 9848 tid 9975 thread 29 bound to OS proc set {70}
OMP: pid 9848 tid 9970 thread 24 bound to OS proc set {58}
OMP: pid 9848 tid 9971 thread 25 bound to OS proc set {60}
OMP: pid 9848 tid 9974 thread 28 bound to OS proc set {67}
OMP: pid 9848 tid 9976 thread 30 bound to OS proc set {72}
OMP: pid 9848 tid 9972 thread 26 bound to OS proc set {63}
OMP: pid 9848 tid 9969 thread 23 bound to OS proc set {55}
OMP: pid 9848 tid 9984 thread 38 bound to OS proc set {92}
OMP: pid 9848 tid 9983 thread 37 bound to OS proc set {89}
OMP: pid 9848 tid 9982 thread 36 bound to OS proc set {87}
OMP: pid 9848 tid 9973 thread 27 bound to OS proc set {65}
OMP: pid 9848 tid 9977 thread 31 bound to OS proc set {75}
OMP: pid 9848 tid 9985 thread 39 bound to OS proc set {94}




Your experiment path is /home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_7

To display your profiling results:
##############################################################################################################################################################################################################################################
#    LEVEL    |     REPORT     |                                                                                                   COMMAND                                                                                                   #
##############################################################################################################################################################################################################################################
#  Functions  |  Cluster-wide  |  maqao lprof -df xp=/home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_7      #
#  Functions  |  Per-node      |  maqao lprof -df -dn xp=/home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_7  #
#  Functions  |  Per-process   |  maqao lprof -df -dp xp=/home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_7  #
#  Functions  |  Per-thread    |  maqao lprof -df -dt xp=/home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_7  #
#  Loops      |  Cluster-wide  |  maqao lprof -dl xp=/home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_7      #
#  Loops      |  Per-node      |  maqao lprof -dl -dn xp=/home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_7  #
#  Loops      |  Per-process   |  maqao lprof -dl -dp xp=/home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_7  #
#  Loops      |  Per-thread    |  maqao lprof -dl -dt xp=/home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_7  #
##############################################################################################################################################################################################################################################


* [MAQAO] Info: Detected 1 Lprof instances in ip-172-31-47-249.ec2.internal. 
If this is incorrect, rerun with number-processes-per-node=X
OMP: pid 10066 tid 10066 thread 0 bound to OS proc set {0}
OMP: pid 10066 tid 10165 thread 1 bound to OS proc set {2}
OMP: pid 10066 tid 10166 thread 2 bound to OS proc set {4}
OMP: pid 10066 tid 10167 thread 3 bound to OS proc set {6}
OMP: pid 10066 tid 10177 thread 13 bound to OS proc set {26}
OMP: pid 10066 tid 10170 thread 6 bound to OS proc set {12}
OMP: pid 10066 tid 10173 thread 9 bound to OS proc set {18}
OMP: pid 10066 tid 10176 thread 12 bound to OS proc set {24}
OMP: pid 10066 tid 10171 thread 7 bound to OS proc set {14}
OMP: pid 10066 tid 10181 thread 17 bound to OS proc set {34}
OMP: pid 10066 tid 10178 thread 14 bound to OS proc set {28}
OMP: pid 10066 tid 10168 thread 4 bound to OS proc set {8}
OMP: pid 10066 tid 10169 thread 5 bound to OS proc set {10}
OMP: pid 10066 tid 10174 thread 10 bound to OS proc set {20}
OMP: pid 10066 tid 10179 thread 15 bound to OS proc set {30}
OMP: pid 10066 tid 10197 thread 33 bound to OS proc set {66}
OMP: pid 10066 tid 10182 thread 18 bound to OS proc set {36}
OMP: pid 10066 tid 10172 thread 8 bound to OS proc set {16}
OMP: pid 10066 tid 10198 thread 34 bound to OS proc set {68}
OMP: pid 10066 tid 10175 thread 11 bound to OS proc set {22}
OMP: pid 10066 tid 10199 thread 35 bound to OS proc set {70}
OMP: pid 10066 tid 10188 thread 24 bound to OS proc set {48}
OMP: pid 10066 tid 10183 thread 19 bound to OS proc set {38}
OMP: pid 10066 tid 10196 thread 32 bound to OS proc set {64}
OMP: pid 10066 tid 10185 thread 21 bound to OS proc set {42}
OMP: pid 10066 tid 10180 thread 16 bound to OS proc set {32}
OMP: pid 10066 tid 10200 thread 36 bound to OS proc set {72}
OMP: pid 10066 tid 10184 thread 20 bound to OS proc set {40}
OMP: pid 10066 tid 10192 thread 28 bound to OS proc set {56}
OMP: pid 10066 tid 10201 thread 37 bound to OS proc set {74}
OMP: pid 10066 tid 10189 thread 25 bound to OS proc set {50}
OMP: pid 10066 tid 10193 thread 29 bound to OS proc set {58}
OMP: pid 10066 tid 10187 thread 23 bound to OS proc set {46}
OMP: pid 10066 tid 10194 thread 30 bound to OS proc set {60}
OMP: pid 10066 tid 10204 thread 40 bound to OS proc set {80}
OMP: pid 10066 tid 10206 thread 42 bound to OS proc set {84}
OMP: pid 10066 tid 10186 thread 22 bound to OS proc set {44}
OMP: pid 10066 tid 10205 thread 41 bound to OS proc set {82}
OMP: pid 10066 tid 10202 thread 38 bound to OS proc set {76}
OMP: pid 10066 tid 10203 thread 39 bound to OS proc set {78}
OMP: pid 10066 tid 10195 thread 31 bound to OS proc set {62}
OMP: pid 10066 tid 10191 thread 27 bound to OS proc set {54}
OMP: pid 10066 tid 10190 thread 26 bound to OS proc set {52}
OMP: pid 10066 tid 10209 thread 45 bound to OS proc set {90}
OMP: pid 10066 tid 10207 thread 43 bound to OS proc set {86}
OMP: pid 10066 tid 10208 thread 44 bound to OS proc set {88}
OMP: pid 10066 tid 10210 thread 46 bound to OS proc set {92}
OMP: pid 10066 tid 10211 thread 47 bound to OS proc set {94}




Your experiment path is /home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_8

To display your profiling results:
##############################################################################################################################################################################################################################################
#    LEVEL    |     REPORT     |                                                                                                   COMMAND                                                                                                   #
##############################################################################################################################################################################################################################################
#  Functions  |  Cluster-wide  |  maqao lprof -df xp=/home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_8      #
#  Functions  |  Per-node      |  maqao lprof -df -dn xp=/home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_8  #
#  Functions  |  Per-process   |  maqao lprof -df -dp xp=/home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_8  #
#  Functions  |  Per-thread    |  maqao lprof -df -dt xp=/home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_8  #
#  Loops      |  Cluster-wide  |  maqao lprof -dl xp=/home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_8      #
#  Loops      |  Per-node      |  maqao lprof -dl -dn xp=/home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_8  #
#  Loops      |  Per-process   |  maqao lprof -dl -dp xp=/home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_8  #
#  Loops      |  Per-thread    |  maqao lprof -dl -dt xp=/home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_8  #
##############################################################################################################################################################################################################################################


* [MAQAO] Info: Detected 1 Lprof instances in ip-172-31-47-249.ec2.internal. 
If this is incorrect, rerun with number-processes-per-node=X
OMP: pid 10232 tid 10331 thread 1 bound to OS proc set {1}
OMP: pid 10232 tid 10232 thread 0 bound to OS proc set {0}
OMP: pid 10232 tid 10332 thread 2 bound to OS proc set {3}
OMP: pid 10232 tid 10342 thread 12 bound to OS proc set {20}
OMP: pid 10232 tid 10333 thread 3 bound to OS proc set {5}
OMP: pid 10232 tid 10334 thread 4 bound to OS proc set {6}
OMP: pid 10232 tid 10337 thread 7 bound to OS proc set {12}
OMP: pid 10232 tid 10339 thread 9 bound to OS proc set {15}
OMP: pid 10232 tid 10335 thread 5 bound to OS proc set {8}
OMP: pid 10232 tid 10343 thread 13 bound to OS proc set {22}
OMP: pid 10232 tid 10347 thread 17 bound to OS proc set {29}
OMP: pid 10232 tid 10338 thread 8 bound to OS proc set {13}
OMP: pid 10232 tid 10336 thread 6 bound to OS proc set {10}
OMP: pid 10232 tid 10345 thread 15 bound to OS proc set {25}
OMP: pid 10232 tid 10348 thread 18 bound to OS proc set {31}
OMP: pid 10232 tid 10340 thread 10 bound to OS proc set {17}
OMP: pid 10232 tid 10341 thread 11 bound to OS proc set {19}
OMP: pid 10232 tid 10363 thread 33 bound to OS proc set {57}
OMP: pid 10232 tid 10344 thread 14 bound to OS proc set {24}
OMP: pid 10232 tid 10346 thread 16 bound to OS proc set {27}
OMP: pid 10232 tid 10378 thread 48 bound to OS proc set {83}
OMP: pid 10232 tid 10379 thread 49 bound to OS proc set {84}
OMP: pid 10232 tid 10362 thread 32 bound to OS proc set {55}
OMP: pid 10232 tid 10364 thread 34 bound to OS proc set {58}
OMP: pid 10232 tid 10349 thread 19 bound to OS proc set {32}
OMP: pid 10232 tid 10355 thread 25 bound to OS proc set {43}
OMP: pid 10232 tid 10350 thread 20 bound to OS proc set {34}
OMP: pid 10232 tid 10354 thread 24 bound to OS proc set {41}
OMP: pid 10232 tid 10356 thread 26 bound to OS proc set {45}
OMP: pid 10232 tid 10380 thread 50 bound to OS proc set {86}
OMP: pid 10232 tid 10381 thread 51 bound to OS proc set {88}
OMP: pid 10232 tid 10351 thread 21 bound to OS proc set {36}
OMP: pid 10232 tid 10353 thread 23 bound to OS proc set {39}
OMP: pid 10232 tid 10358 thread 28 bound to OS proc set {48}
OMP: pid 10232 tid 10367 thread 37 bound to OS proc set {64}
OMP: pid 10232 tid 10365 thread 35 bound to OS proc set {60}
OMP: pid 10232 tid 10359 thread 29 bound to OS proc set {50}
OMP: pid 10232 tid 10361 thread 31 bound to OS proc set {53}
OMP: pid 10232 tid 10366 thread 36 bound to OS proc set {62}
OMP: pid 10232 tid 10371 thread 41 bound to OS proc set {71}
OMP: pid 10232 tid 10368 thread 38 bound to OS proc set {65}
OMP: pid 10232 tid 10360 thread 30 bound to OS proc set {51}
OMP: pid 10232 tid 10369 thread 39 bound to OS proc set {67}
OMP: pid 10232 tid 10370 thread 40 bound to OS proc set {69}
OMP: pid 10232 tid 10372 thread 42 bound to OS proc set {72}
OMP: pid 10232 tid 10375 thread 45 bound to OS proc set {77}
OMP: pid 10232 tid 10357 thread 27 bound to OS proc set {46}
OMP: pid 10232 tid 10352 thread 22 bound to OS proc set {38}
OMP: pid 10232 tid 10384 thread 54 bound to OS proc set {93}
OMP: pid 10232 tid 10382 thread 52 bound to OS proc set {90}
OMP: pid 10232 tid 10383 thread 53 bound to OS proc set {91}
OMP: pid 10232 tid 10373 thread 43 bound to OS proc set {74}
OMP: pid 10232 tid 10377 thread 47 bound to OS proc set {81}
OMP: pid 10232 tid 10385 thread 55 bound to OS proc set {95}
OMP: pid 10232 tid 10374 thread 44 bound to OS proc set {76}
OMP: pid 10232 tid 10376 thread 46 bound to OS proc set {79}
what is a LLM? and why should i care?
A Large Language Model (LLM) is a type of artificial intelligence (AI) that can process and generate human-like text based on the input it receives. LLMs are trained on vast amounts of text data, which allows them to learn patterns, relationships, and context in language. This enables them to generate coherent and often informative responses to user queries.

Here are some reasons why you should care about LLMs:

1.  **Improved search and content generation:** LLMs can help improve search results by providing more accurate and relevant information. They can also generate content such as articles, blog posts, and even entire books.
2.  **Personalized experiences:** LLMs can be used to create personalized experiences for users. For example, they can generate customized news feeds, product recommendations, or even entire stories based on a user's interests and preferences.
3.  **Customer support:** LLMs can be used to provide 24/7 customer support by answering frequently asked questions, helping with simple transactions, and even handling complex issues.
4.  **Language learning:** LLMs can help language learners by providing personalized feedback, practicing conversations, and even generating language learning materials.
5.  **Content creation:** LLMs can be used to create content such as dialogue, scripts, and even entire stories. This can help writers, filmmakers, and other creators to generate ideas and develop their projects.

Some popular examples of LLMs include:

1.  **Chatbots:** Many companies use LLMs to power their chatbots, which can help customers with simple transactions, answer frequently asked questions, and even provide customer support.
2.  **Virtual assistants:** LLMs are used in virtual assistants like Siri, Google Assistant, and Alexa to provide information, set reminders, and even control smart home devices.
3.  **Language translation:** LLMs are used in language translation tools like Google Translate to provide accurate and context-specific translations.

Overall, LLMs have the potential to revolutionize the way we interact with technology, from simple tasks like search and customer support to more complex tasks like content creation and language learning. As LLMs continue to evolve, we can expect to see even more innovative applications and uses for these powerful tools. [end of text]




Your experiment path is /home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_9

To display your profiling results:
##############################################################################################################################################################################################################################################
#    LEVEL    |     REPORT     |                                                                                                   COMMAND                                                                                                   #
##############################################################################################################################################################################################################################################
#  Functions  |  Cluster-wide  |  maqao lprof -df xp=/home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_9      #
#  Functions  |  Per-node      |  maqao lprof -df -dn xp=/home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_9  #
#  Functions  |  Per-process   |  maqao lprof -df -dp xp=/home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_9  #
#  Functions  |  Per-thread    |  maqao lprof -df -dt xp=/home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_9  #
#  Loops      |  Cluster-wide  |  maqao lprof -dl xp=/home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_9      #
#  Loops      |  Per-node      |  maqao lprof -dl -dn xp=/home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_9  #
#  Loops      |  Per-process   |  maqao lprof -dl -dp xp=/home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_9  #
#  Loops      |  Per-thread    |  maqao lprof -dl -dt xp=/home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_9  #
##############################################################################################################################################################################################################################################


* [MAQAO] Info: Detected 1 Lprof instances in ip-172-31-47-249.ec2.internal. 
If this is incorrect, rerun with number-processes-per-node=X
OMP: pid 10406 tid 10505 thread 1 bound to OS proc set {1}
OMP: pid 10406 tid 10406 thread 0 bound to OS proc set {0}
OMP: pid 10406 tid 10506 thread 2 bound to OS proc set {3}
OMP: pid 10406 tid 10507 thread 3 bound to OS proc set {4}
OMP: pid 10406 tid 10508 thread 4 bound to OS proc set {6}
OMP: pid 10406 tid 10511 thread 7 bound to OS proc set {10}
OMP: pid 10406 tid 10510 thread 6 bound to OS proc set {9}
OMP: pid 10406 tid 10513 thread 9 bound to OS proc set {13}
OMP: pid 10406 tid 10509 thread 5 bound to OS proc set {7}
OMP: pid 10406 tid 10512 thread 8 bound to OS proc set {12}
OMP: pid 10406 tid 10516 thread 12 bound to OS proc set {18}
OMP: pid 10406 tid 10518 thread 14 bound to OS proc set {21}
OMP: pid 10406 tid 10521 thread 17 bound to OS proc set {25}
OMP: pid 10406 tid 10514 thread 10 bound to OS proc set {15}
OMP: pid 10406 tid 10519 thread 15 bound to OS proc set {22}
OMP: pid 10406 tid 10538 thread 34 bound to OS proc set {51}
OMP: pid 10406 tid 10515 thread 11 bound to OS proc set {16}
OMP: pid 10406 tid 10537 thread 33 bound to OS proc set {50}
OMP: pid 10406 tid 10517 thread 13 bound to OS proc set {19}
OMP: pid 10406 tid 10522 thread 18 bound to OS proc set {27}
OMP: pid 10406 tid 10520 thread 16 bound to OS proc set {24}
OMP: pid 10406 tid 10553 thread 49 bound to OS proc set {74}
OMP: pid 10406 tid 10536 thread 32 bound to OS proc set {48}
OMP: pid 10406 tid 10555 thread 51 bound to OS proc set {77}
OMP: pid 10406 tid 10554 thread 50 bound to OS proc set {75}
OMP: pid 10406 tid 10526 thread 22 bound to OS proc set {33}
OMP: pid 10406 tid 10524 thread 20 bound to OS proc set {30}
OMP: pid 10406 tid 10532 thread 28 bound to OS proc set {42}
OMP: pid 10406 tid 10552 thread 48 bound to OS proc set {72}
OMP: pid 10406 tid 10539 thread 35 bound to OS proc set {53}
OMP: pid 10406 tid 10533 thread 29 bound to OS proc set {43}
OMP: pid 10406 tid 10535 thread 31 bound to OS proc set {46}
OMP: pid 10406 tid 10540 thread 36 bound to OS proc set {54}
OMP: pid 10406 tid 10557 thread 53 bound to OS proc set {80}
OMP: pid 10406 tid 10534 thread 30 bound to OS proc set {45}
OMP: pid 10406 tid 10541 thread 37 bound to OS proc set {56}
OMP: pid 10406 tid 10549 thread 45 bound to OS proc set {68}
OMP: pid 10406 tid 10525 thread 21 bound to OS proc set {31}
OMP: pid 10406 tid 10558 thread 54 bound to OS proc set {81}
OMP: pid 10406 tid 10523 thread 19 bound to OS proc set {28}
OMP: pid 10406 tid 10556 thread 52 bound to OS proc set {78}
OMP: pid 10406 tid 10559 thread 55 bound to OS proc set {83}
OMP: pid 10406 tid 10542 thread 38 bound to OS proc set {57}
OMP: pid 10406 tid 10529 thread 25 bound to OS proc set {37}
OMP: pid 10406 tid 10528 thread 24 bound to OS proc set {36}
OMP: pid 10406 tid 10548 thread 44 bound to OS proc set {66}
OMP: pid 10406 tid 10547 thread 43 bound to OS proc set {65}
OMP: pid 10406 tid 10545 thread 41 bound to OS proc set {62}
OMP: pid 10406 tid 10550 thread 46 bound to OS proc set {69}
OMP: pid 10406 tid 10530 thread 26 bound to OS proc set {39}
OMP: pid 10406 tid 10551 thread 47 bound to OS proc set {71}
OMP: pid 10406 tid 10544 thread 40 bound to OS proc set {60}
OMP: pid 10406 tid 10561 thread 57 bound to OS proc set {86}
OMP: pid 10406 tid 10546 thread 42 bound to OS proc set {63}
OMP: pid 10406 tid 10560 thread 56 bound to OS proc set {84}
OMP: pid 10406 tid 10543 thread 39 bound to OS proc set {59}
OMP: pid 10406 tid 10563 thread 59 bound to OS proc set {89}
OMP: pid 10406 tid 10527 thread 23 bound to OS proc set {34}
OMP: pid 10406 tid 10562 thread 58 bound to OS proc set {87}
OMP: pid 10406 tid 10564 thread 60 bound to OS proc set {90}
OMP: pid 10406 tid 10565 thread 61 bound to OS proc set {92}
OMP: pid 10406 tid 10531 thread 27 bound to OS proc set {40}
OMP: pid 10406 tid 10566 thread 62 bound to OS proc set {93}
OMP: pid 10406 tid 10567 thread 63 bound to OS proc set {95}
what is a LLM? and why should i care?
A Large Language Model (LLM) is a type of artificial intelligence (AI) that can process and generate human-like text based on the input it receives. LLMs are trained on vast amounts of text data, which allows them to learn patterns, relationships, and context in language. This enables them to generate coherent and often informative responses to user queries.

Here are some reasons why you should care about LLMs:

1.  **Improved search and content generation:** LLMs can help improve search results by providing more accurate and relevant information. They can also generate content such as articles, blog posts, and even entire books.
2.  **Personalized experiences:** LLMs can be used to create personalized experiences for users. For example, they can generate customized news feeds, product recommendations, or even entire stories based on a user's interests and preferences.
3.  **Customer support:** LLMs can be used to provide 24/7 customer support by answering frequently asked questions, helping with simple transactions, and even handling complex issues.
4.  **Language learning:** LLMs can help language learners by providing personalized feedback, practicing conversations, and even generating language learning materials.
5.  **Content creation:** LLMs can be used to create content such as dialogue, scripts, and even entire stories. This can help writers, filmmakers, and other creators to generate ideas and develop their projects.

Some popular examples of LLMs include:

1.  **Chatbots:** Many companies use LLMs to power their chatbots, which can help customers with simple transactions, answer frequently asked questions, and even provide customer support.
2.  **Virtual assistants:** LLMs are used in virtual assistants like Siri, Google Assistant, and Alexa to provide information, set reminders, and even control smart home devices.
3.  **Language translation:** LLMs are used in language translation tools like Google Translate to provide accurate and context-specific translations.

Overall, LLMs have the potential to revolutionize the way we interact with technology, from simple tasks like search and customer support to more complex tasks like content creation and language learning. As LLMs continue to evolve, we can expect to see even more innovative applications and uses for these powerful tools. [end of text]




Your experiment path is /home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_10

To display your profiling results:
###############################################################################################################################################################################################################################################
#    LEVEL    |     REPORT     |                                                                                                   COMMAND                                                                                                    #
###############################################################################################################################################################################################################################################
#  Functions  |  Cluster-wide  |  maqao lprof -df xp=/home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_10      #
#  Functions  |  Per-node      |  maqao lprof -df -dn xp=/home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_10  #
#  Functions  |  Per-process   |  maqao lprof -df -dp xp=/home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_10  #
#  Functions  |  Per-thread    |  maqao lprof -df -dt xp=/home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_10  #
#  Loops      |  Cluster-wide  |  maqao lprof -dl xp=/home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_10      #
#  Loops      |  Per-node      |  maqao lprof -dl -dn xp=/home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_10  #
#  Loops      |  Per-process   |  maqao lprof -dl -dp xp=/home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_10  #
#  Loops      |  Per-thread    |  maqao lprof -dl -dt xp=/home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_10  #
###############################################################################################################################################################################################################################################


* [MAQAO] Info: Detected 1 Lprof instances in ip-172-31-47-249.ec2.internal. 
If this is incorrect, rerun with number-processes-per-node=X
OMP: pid 10647 tid 10746 thread 1 bound to OS proc set {1}
OMP: pid 10647 tid 10748 thread 3 bound to OS proc set {4}
OMP: pid 10647 tid 10750 thread 5 bound to OS proc set {6}
OMP: pid 10647 tid 10647 thread 0 bound to OS proc set {0}
OMP: pid 10647 tid 10749 thread 4 bound to OS proc set {5}
OMP: pid 10647 tid 10747 thread 2 bound to OS proc set {2}
OMP: pid 10647 tid 10753 thread 8 bound to OS proc set {10}
OMP: pid 10647 tid 10754 thread 9 bound to OS proc set {12}
OMP: pid 10647 tid 10752 thread 7 bound to OS proc set {9}
OMP: pid 10647 tid 10756 thread 11 bound to OS proc set {14}
OMP: pid 10647 tid 10751 thread 6 bound to OS proc set {8}
OMP: pid 10647 tid 10759 thread 14 bound to OS proc set {18}
OMP: pid 10647 tid 10755 thread 10 bound to OS proc set {13}
OMP: pid 10647 tid 10778 thread 33 bound to OS proc set {44}
OMP: pid 10647 tid 10757 thread 12 bound to OS proc set {16}
OMP: pid 10647 tid 10763 thread 18 bound to OS proc set {24}
OMP: pid 10647 tid 10762 thread 17 bound to OS proc set {22}
OMP: pid 10647 tid 10760 thread 15 bound to OS proc set {20}
OMP: pid 10647 tid 10764 thread 19 bound to OS proc set {25}
OMP: pid 10647 tid 10779 thread 34 bound to OS proc set {45}
OMP: pid 10647 tid 10780 thread 35 bound to OS proc set {47}
OMP: pid 10647 tid 10758 thread 13 bound to OS proc set {17}
OMP: pid 10647 tid 10795 thread 50 bound to OS proc set {67}
OMP: pid 10647 tid 10793 thread 48 bound to OS proc set {64}
OMP: pid 10647 tid 10761 thread 16 bound to OS proc set {21}
OMP: pid 10647 tid 10774 thread 29 bound to OS proc set {39}
OMP: pid 10647 tid 10766 thread 21 bound to OS proc set {28}
OMP: pid 10647 tid 10771 thread 26 bound to OS proc set {35}
OMP: pid 10647 tid 10765 thread 20 bound to OS proc set {26}
OMP: pid 10647 tid 10777 thread 32 bound to OS proc set {43}
OMP: pid 10647 tid 10796 thread 51 bound to OS proc set {68}
OMP: pid 10647 tid 10810 thread 65 bound to OS proc set {87}
OMP: pid 10647 tid 10773 thread 28 bound to OS proc set {37}
OMP: pid 10647 tid 10798 thread 53 bound to OS proc set {71}
OMP: pid 10647 tid 10768 thread 23 bound to OS proc set {30}
OMP: pid 10647 tid 10794 thread 49 bound to OS proc set {66}
OMP: pid 10647 tid 10786 thread 41 bound to OS proc set {55}
OMP: pid 10647 tid 10769 thread 24 bound to OS proc set {32}
OMP: pid 10647 tid 10784 thread 39 bound to OS proc set {52}
OMP: pid 10647 tid 10785 thread 40 bound to OS proc set {53}
OMP: pid 10647 tid 10811 thread 66 bound to OS proc set {88}
OMP: pid 10647 tid 10781 thread 36 bound to OS proc set {48}
OMP: pid 10647 tid 10775 thread 30 bound to OS proc set {40}
OMP: pid 10647 tid 10767 thread 22 bound to OS proc set {29}
OMP: pid 10647 tid 10783 thread 38 bound to OS proc set {51}
OMP: pid 10647 tid 10809 thread 64 bound to OS proc set {86}
OMP: pid 10647 tid 10770 thread 25 bound to OS proc set {33}
OMP: pid 10647 tid 10790 thread 45 bound to OS proc set {60}
OMP: pid 10647 tid 10772 thread 27 bound to OS proc set {36}
OMP: pid 10647 tid 10802 thread 57 bound to OS proc set {76}
OMP: pid 10647 tid 10787 thread 42 bound to OS proc set {56}
OMP: pid 10647 tid 10782 thread 37 bound to OS proc set {49}
OMP: pid 10647 tid 10791 thread 46 bound to OS proc set {61}
OMP: pid 10647 tid 10806 thread 61 bound to OS proc set {82}
OMP: pid 10647 tid 10812 thread 67 bound to OS proc set {90}
OMP: pid 10647 tid 10789 thread 44 bound to OS proc set {59}
OMP: pid 10647 tid 10805 thread 60 bound to OS proc set {80}
OMP: pid 10647 tid 10800 thread 55 bound to OS proc set {74}
OMP: pid 10647 tid 10804 thread 59 bound to OS proc set {79}
OMP: pid 10647 tid 10803 thread 58 bound to OS proc set {78}
OMP: pid 10647 tid 10808 thread 63 bound to OS proc set {84}
OMP: pid 10647 tid 10801 thread 56 bound to OS proc set {75}
OMP: pid 10647 tid 10797 thread 52 bound to OS proc set {70}
OMP: pid 10647 tid 10807 thread 62 bound to OS proc set {83}
OMP: pid 10647 tid 10792 thread 47 bound to OS proc set {63}
OMP: pid 10647 tid 10776 thread 31 bound to OS proc set {41}
OMP: pid 10647 tid 10799 thread 54 bound to OS proc set {72}
OMP: pid 10647 tid 10813 thread 68 bound to OS proc set {91}
OMP: pid 10647 tid 10814 thread 69 bound to OS proc set {92}
OMP: pid 10647 tid 10816 thread 71 bound to OS proc set {95}
OMP: pid 10647 tid 10788 thread 43 bound to OS proc set {57}
OMP: pid 10647 tid 10815 thread 70 bound to OS proc set {94}
what is a LLM? and why should i care?
A Large Language Model (LLM) is a type of artificial intelligence (AI) that can process and generate human-like text based on the input it receives. LLMs are trained on vast amounts of text data, which allows them to learn patterns, relationships, and context in language. This enables them to generate coherent and often informative responses to user queries.

Here are some reasons why you should care about LLMs:

1.  **Improved search and content generation:** LLMs can help improve search results by providing more accurate and relevant information. They can also generate content such as articles, blog posts, and even entire books.
2.  **Personalized experiences:** LLMs can be used to create personalized experiences for users. For example, they can generate customized news feeds, product recommendations, or even entire stories based on a user's interests and preferences.
3.  **Customer support:** LLMs can be used to provide 24/7 customer support by answering frequently asked questions, helping with simple transactions, and even handling complex issues.
4.  **Language learning:** LLMs can help language learners by providing personalized feedback, practicing conversations, and even generating language learning materials.
5.  **Content creation:** LLMs can be used to create content such as dialogue, scripts, and even entire stories. This can help writers, filmmakers, and other creators to generate ideas and develop their projects.

Some popular examples of LLMs include:

1.  **Chatbots:** Many companies use LLMs to power their chatbots, which can help customers with simple transactions, answer frequently asked questions, and even provide customer support.
2.  **Virtual assistants:** LLMs are used in virtual assistants like Siri, Google Assistant, and Alexa to provide information, set reminders, and even control smart home devices.
3.  **Language translation:** LLMs are used in language translation tools like Google Translate to provide accurate and context-specific translations.

Overall, LLMs have the potential to revolutionize the way we interact with technology, from simple tasks like search and customer support to more complex tasks like content creation and language learning. As LLMs continue to evolve, we can expect to see even more innovative applications and uses for these powerful tools. [end of text]




Your experiment path is /home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_11

To display your profiling results:
###############################################################################################################################################################################################################################################
#    LEVEL    |     REPORT     |                                                                                                   COMMAND                                                                                                    #
###############################################################################################################################################################################################################################################
#  Functions  |  Cluster-wide  |  maqao lprof -df xp=/home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_11      #
#  Functions  |  Per-node      |  maqao lprof -df -dn xp=/home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_11  #
#  Functions  |  Per-process   |  maqao lprof -df -dp xp=/home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_11  #
#  Functions  |  Per-thread    |  maqao lprof -df -dt xp=/home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_11  #
#  Loops      |  Cluster-wide  |  maqao lprof -dl xp=/home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_11      #
#  Loops      |  Per-node      |  maqao lprof -dl -dn xp=/home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_11  #
#  Loops      |  Per-process   |  maqao lprof -dl -dp xp=/home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_11  #
#  Loops      |  Per-thread    |  maqao lprof -dl -dt xp=/home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_11  #
###############################################################################################################################################################################################################################################


* [MAQAO] Info: Detected 1 Lprof instances in ip-172-31-47-249.ec2.internal. 
If this is incorrect, rerun with number-processes-per-node=X
OMP: pid 10838 tid 10937 thread 1 bound to OS proc set {1}
OMP: pid 10838 tid 10939 thread 3 bound to OS proc set {3}
OMP: pid 10838 tid 10938 thread 2 bound to OS proc set {2}
OMP: pid 10838 tid 10945 thread 9 bound to OS proc set {10}
OMP: pid 10838 tid 10941 thread 5 bound to OS proc set {6}
OMP: pid 10838 tid 10838 thread 0 bound to OS proc set {0}
OMP: pid 10838 tid 10942 thread 6 bound to OS proc set {7}
OMP: pid 10838 tid 10943 thread 7 bound to OS proc set {8}
OMP: pid 10838 tid 10944 thread 8 bound to OS proc set {9}
OMP: pid 10838 tid 10940 thread 4 bound to OS proc set {4}
OMP: pid 10838 tid 10948 thread 12 bound to OS proc set {14}
OMP: pid 10838 tid 10951 thread 15 bound to OS proc set {18}
OMP: pid 10838 tid 10950 thread 14 bound to OS proc set {16}
OMP: pid 10838 tid 10949 thread 13 bound to OS proc set {15}
OMP: pid 10838 tid 10946 thread 10 bound to OS proc set {12}
OMP: pid 10838 tid 10969 thread 33 bound to OS proc set {40}
OMP: pid 10838 tid 10953 thread 17 bound to OS proc set {20}
OMP: pid 10838 tid 10970 thread 34 bound to OS proc set {41}
OMP: pid 10838 tid 10985 thread 49 bound to OS proc set {59}
OMP: pid 10838 tid 10947 thread 11 bound to OS proc set {13}
OMP: pid 10838 tid 10986 thread 50 bound to OS proc set {60}
OMP: pid 10838 tid 11001 thread 65 bound to OS proc set {78}
OMP: pid 10838 tid 10952 thread 16 bound to OS proc set {19}
OMP: pid 10838 tid 10987 thread 51 bound to OS proc set {61}
OMP: pid 10838 tid 11002 thread 66 bound to OS proc set {80}
OMP: pid 10838 tid 10955 thread 19 bound to OS proc set {23}
OMP: pid 10838 tid 10954 thread 18 bound to OS proc set {21}
OMP: pid 10838 tid 10958 thread 22 bound to OS proc set {26}
OMP: pid 10838 tid 10965 thread 29 bound to OS proc set {35}
OMP: pid 10838 tid 10956 thread 20 bound to OS proc set {24}
OMP: pid 10838 tid 10957 thread 21 bound to OS proc set {25}
OMP: pid 10838 tid 10962 thread 26 bound to OS proc set {31}
OMP: pid 10838 tid 11000 thread 64 bound to OS proc set {77}
OMP: pid 10838 tid 10966 thread 30 bound to OS proc set {36}
OMP: pid 10838 tid 10961 thread 25 bound to OS proc set {30}
OMP: pid 10838 tid 10972 thread 36 bound to OS proc set {43}
OMP: pid 10838 tid 10968 thread 32 bound to OS proc set {38}
OMP: pid 10838 tid 10960 thread 24 bound to OS proc set {29}
OMP: pid 10838 tid 10959 thread 23 bound to OS proc set {27}
OMP: pid 10838 tid 10963 thread 27 bound to OS proc set {32}
OMP: pid 10838 tid 10967 thread 31 bound to OS proc set {37}
OMP: pid 10838 tid 10964 thread 28 bound to OS proc set {33}
OMP: pid 10838 tid 10973 thread 37 bound to OS proc set {44}
OMP: pid 10838 tid 10971 thread 35 bound to OS proc set {42}
OMP: pid 10838 tid 11003 thread 67 bound to OS proc set {81}
OMP: pid 10838 tid 10981 thread 45 bound to OS proc set {54}
OMP: pid 10838 tid 10974 thread 38 bound to OS proc set {46}
OMP: pid 10838 tid 10991 thread 55 bound to OS proc set {66}
OMP: pid 10838 tid 10980 thread 44 bound to OS proc set {53}
OMP: pid 10838 tid 10976 thread 40 bound to OS proc set {48}
OMP: pid 10838 tid 10975 thread 39 bound to OS proc set {47}
OMP: pid 10838 tid 10996 thread 60 bound to OS proc set {72}
OMP: pid 10838 tid 10984 thread 48 bound to OS proc set {58}
OMP: pid 10838 tid 10982 thread 46 bound to OS proc set {55}
OMP: pid 10838 tid 10988 thread 52 bound to OS proc set {63}
OMP: pid 10838 tid 10977 thread 41 bound to OS proc set {49}
OMP: pid 10838 tid 10979 thread 43 bound to OS proc set {52}
OMP: pid 10838 tid 11005 thread 69 bound to OS proc set {83}
OMP: pid 10838 tid 10989 thread 53 bound to OS proc set {64}
OMP: pid 10838 tid 10992 thread 56 bound to OS proc set {67}
OMP: pid 10838 tid 10997 thread 61 bound to OS proc set {73}
OMP: pid 10838 tid 11009 thread 73 bound to OS proc set {88}
OMP: pid 10838 tid 10983 thread 47 bound to OS proc set {56}
OMP: pid 10838 tid 10990 thread 54 bound to OS proc set {65}
OMP: pid 10838 tid 11004 thread 68 bound to OS proc set {82}
OMP: pid 10838 tid 11006 thread 70 bound to OS proc set {84}
OMP: pid 10838 tid 10978 thread 42 bound to OS proc set {50}
OMP: pid 10838 tid 10994 thread 58 bound to OS proc set {70}
OMP: pid 10838 tid 11008 thread 72 bound to OS proc set {87}
OMP: pid 10838 tid 10993 thread 57 bound to OS proc set {69}
OMP: pid 10838 tid 10999 thread 63 bound to OS proc set {76}
OMP: pid 10838 tid 11007 thread 71 bound to OS proc set {86}
OMP: pid 10838 tid 10998 thread 62 bound to OS proc set {75}
OMP: pid 10838 tid 11010 thread 74 bound to OS proc set {89}
OMP: pid 10838 tid 10995 thread 59 bound to OS proc set {71}
OMP: pid 10838 tid 11012 thread 76 bound to OS proc set {92}
OMP: pid 10838 tid 11013 thread 77 bound to OS proc set {93}
OMP: pid 10838 tid 11011 thread 75 bound to OS proc set {90}
OMP: pid 10838 tid 11014 thread 78 bound to OS proc set {94}
OMP: pid 10838 tid 11015 thread 79 bound to OS proc set {95}
what is a LLM? and why should i care?
[model response identical to the earlier run; omitted] [end of text]




Your experiment path is /home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_12

To display your profiling results:
###############################################################################################################################################################################################################################################
#    LEVEL    |     REPORT     |                                                                                                   COMMAND                                                                                                    #
###############################################################################################################################################################################################################################################
#  Functions  |  Cluster-wide  |  maqao lprof -df xp=/home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_12      #
#  Functions  |  Per-node      |  maqao lprof -df -dn xp=/home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_12  #
#  Functions  |  Per-process   |  maqao lprof -df -dp xp=/home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_12  #
#  Functions  |  Per-thread    |  maqao lprof -df -dt xp=/home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_12  #
#  Loops      |  Cluster-wide  |  maqao lprof -dl xp=/home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_12      #
#  Loops      |  Per-node      |  maqao lprof -dl -dn xp=/home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_12  #
#  Loops      |  Per-process   |  maqao lprof -dl -dp xp=/home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_12  #
#  Loops      |  Per-thread    |  maqao lprof -dl -dt xp=/home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_12  #
###############################################################################################################################################################################################################################################


* [MAQAO] Info: Detected 1 Lprof instances in ip-172-31-47-249.ec2.internal. 
If this is incorrect, rerun with number-processes-per-node=X
[88 OMP affinity lines for pid 11040 condensed: threads 0-87 each bound to a single, distinct OS proc set in the range {0}-{95}; same format as the run above]
what is a LLM? and why should i care?
[model response identical to the earlier run; omitted] [end of text]




Your experiment path is /home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_13

To display your profiling results:
###############################################################################################################################################################################################################################################
#    LEVEL    |     REPORT     |                                                                                                   COMMAND                                                                                                    #
###############################################################################################################################################################################################################################################
#  Functions  |  Cluster-wide  |  maqao lprof -df xp=/home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_13      #
#  Functions  |  Per-node      |  maqao lprof -df -dn xp=/home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_13  #
#  Functions  |  Per-process   |  maqao lprof -df -dp xp=/home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_13  #
#  Functions  |  Per-thread    |  maqao lprof -df -dt xp=/home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_13  #
#  Loops      |  Cluster-wide  |  maqao lprof -dl xp=/home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_13      #
#  Loops      |  Per-node      |  maqao lprof -dl -dn xp=/home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_13  #
#  Loops      |  Per-process   |  maqao lprof -dl -dp xp=/home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_13  #
#  Loops      |  Per-thread    |  maqao lprof -dl -dt xp=/home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_13  #
###############################################################################################################################################################################################################################################


* [MAQAO] Info: Detected 1 Lprof instances in ip-172-31-47-249.ec2.internal. 
If this is incorrect, rerun with number-processes-per-node=X
[96 OMP affinity lines for pid 11307 condensed: thread N bound to OS proc set {N} for N = 0-95; same format as the runs above]
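Binding reports like the ones above can be checked mechanically, e.g. to confirm that no two threads share an OS processor. A minimal sketch; the log excerpt is inlined for illustration:

```python
import re

# Three sample lines from the pid 11307 run above.
log = """\
OMP: pid 11307 tid 11409 thread 3 bound to OS proc set {3}
OMP: pid 11307 tid 11411 thread 5 bound to OS proc set {5}
OMP: pid 11307 tid 11307 thread 0 bound to OS proc set {0}
"""

# Map OMP thread id -> OS proc set id.
pattern = re.compile(r"thread (\d+) bound to OS proc set \{(\d+)\}")
binding = {int(t): int(p) for t, p in pattern.findall(log)}

# Each OMP thread should own a distinct OS processor (no oversubscription).
assert len(set(binding.values())) == len(binding)
print(binding)
```

Feeding the full run through the same pattern would verify the identity mapping (thread N on proc set {N}) reported for this pid.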
what is a LLM? and why should i care?
[model response identical to the earlier runs; omitted] [end of text]




Your experiment path is /home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_14

To display your profiling results:
###############################################################################################################################################################################################################################################
#    LEVEL    |     REPORT     |                                                                                                   COMMAND                                                                                                    #
###############################################################################################################################################################################################################################################
#  Functions  |  Cluster-wide  |  maqao lprof -df xp=/home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_14      #
#  Functions  |  Per-node      |  maqao lprof -df -dn xp=/home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_14  #
#  Functions  |  Per-process   |  maqao lprof -df -dp xp=/home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_14  #
#  Functions  |  Per-thread    |  maqao lprof -df -dt xp=/home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_14  #
#  Loops      |  Cluster-wide  |  maqao lprof -dl xp=/home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_14      #
#  Loops      |  Per-node      |  maqao lprof -dl -dn xp=/home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_14  #
#  Loops      |  Per-process   |  maqao lprof -dl -dp xp=/home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_14  #
#  Loops      |  Per-thread    |  maqao lprof -dl -dt xp=/home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools/lprof_npsu_run_14  #
###############################################################################################################################################################################################################################################
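The three experiment directories in this log differ only in the trailing run number (lprof_npsu_run_12 through 14), so the per-run reports can be generated in a loop. A sketch that prints the cluster-wide functions command for each run (drop the leading `echo` to execute for real, assuming `maqao` is on PATH):

```shell
# Prefix taken verbatim from the tables above.
BASE=/home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/multicore/armclang_3/oneview_results_1758031731_v2/tools
for RUN in 12 13 14; do
  echo maqao lprof -df "xp=$BASE/lprof_npsu_run_$RUN"
done
```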
