Part Number: TMS320C6678
Tool/software: TI-RTOS

Hi all,

I am working on a benchmark for the MessageQ library. Core 0 is the master and sends messages to all other cores to do some work; the slaves answer the master with an acknowledgement once their work is done. The MessageQ approach was suggested to me in this post: https://e2e.ti.com/support/processors/f/791/p/813105/3010666#3010666

I started with the example provided in ti/ipc_3_50_02_02/examples/C6678_bios_elf/ex11_ping and modified it slightly to fit my purpose. For prototyping I currently build the same image for all cores and switch between master and slave behavior based on the core ID. The only field I added to the message structure is an enum defining a message type.

I went through the suggestions listed in http://processors.wiki.ti.com/index.php/IPC_Users_Guide/Optimizing_IPC_Applications, e.g. I disabled all run-time asserts, chose BIOS.LibType_Custom, and selected ti.sdo.ipc.family.c647x.NotifyCircSetup. I am compiling with -O3 and all debug information disabled. I am using Ipc.ProcSync_PAIR and attaching all slaves to the master and the master to all slaves.

I then measured the number of cycles it takes for the following (all messages are allocated beforehand and freed afterwards; a stripped-down sketch of the timed section is at the end of this post):
- the master sending one message to every slave, and
- the master receiving the acknowledgement from every slave.

This exercise takes ~63,000 cycles (second run), while the work each slave does is expected to take only 4,000 to 20,000 cycles. I expected the MessageQ approach to be much faster, since I am barely sending any data around.

My questions:
- Is this time expected?
- How can I improve the performance?
- Is there a benchmark available for this master-slave approach?

It would be great if you could give me a few additional hints or point me to further suggestions. I have also pasted all my playground code into a single file and attached it together with the .cfg file.

Thank you very much for your help.
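
For reference, here is roughly what the timed section on the master side looks like. This is only a minimal sketch with placeholder names (BenchMsg, benchOneRound, HEAP_ID, the queue handles), not the exact code in the attachment:

#include <xdc/std.h>
#include <xdc/runtime/Timestamp.h>
#include <ti/ipc/MessageQ.h>

#define HEAP_ID     0u     /* heap previously registered via MessageQ_registerHeap() */
#define MAX_SLAVES  7u     /* C6678: 8 cores, 1 master + up to 7 slaves */

typedef enum { MSG_WORK_REQUEST, MSG_WORK_ACK } BenchMsgType;

typedef struct {
    MessageQ_MsgHeader header;   /* mandatory first field of a MessageQ message */
    BenchMsgType       type;     /* the single extra field mentioned above */
} BenchMsg;

/* Master side: measure one "send to all slaves, wait for all acks" round.
 * Allocation happens before and freeing after the timed window. */
UInt32 benchOneRound(MessageQ_Handle masterQ, MessageQ_QueueId *slaveQ, UInt16 numSlaves)
{
    BenchMsg     *req[MAX_SLAVES];
    MessageQ_Msg  ack[MAX_SLAVES];
    UInt16        i;
    UInt32        t0, t1;

    for (i = 0; i < numSlaves; i++) {
        req[i] = (BenchMsg *)MessageQ_alloc(HEAP_ID, sizeof(BenchMsg));
        req[i]->type = MSG_WORK_REQUEST;
    }

    t0 = Timestamp_get32();
    for (i = 0; i < numSlaves; i++) {
        MessageQ_put(slaveQ[i], (MessageQ_Msg)req[i]);    /* ownership passes to the slave */
    }
    for (i = 0; i < numSlaves; i++) {
        MessageQ_get(masterQ, &ack[i], MessageQ_FOREVER); /* block until each ack arrives */
    }
    t1 = Timestamp_get32();

    for (i = 0; i < numSlaves; i++) {
        MessageQ_free(ack[i]);
    }
    return t1 - t0;   /* elapsed Timestamp ticks (CPU cycles if the timestamp source is the TSC) */
}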