Saturday, March 24, 2018

[Discuss-gnuradio] Dropping samples "D" when CPU at 60%

Hi all,


I have written a block in Python to detect and decode some packet transmissions. I've noticed that when I have 3 RX streams running at 6.25 Msps each (total network traffic is 75 MB/s), the host starts dropping samples after a while. CPU utilization does not exceed 60%, and the Ethernet link isn't saturated. I've already increased the kernel socket buffer sizes (net.core.rmem_max and net.core.wmem_max) to the largest values possible.
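For reference, the buffer changes I made were along these lines (the exact values here are just illustrative):

    sudo sysctl -w net.core.rmem_max=50000000
    sudo sysctl -w net.core.wmem_max=50000000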


A 'D' in the console indicates dropped samples, i.e. packets are being lost because the host isn't consuming samples fast enough.


Is it just that the block I've written is not efficient enough? I would expect higher CPU utilization in that case (though maybe a single-threaded Python block could be pegging one core while the overall average still reads 60%?).


I think the most expensive thing I'm doing is examining each sample in a stream of floats to find where the value rises above a threshold. This part is in my block, and it triggers the decoding logic, which lives in the same block.
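For what it's worth, I'm wondering whether vectorizing that scan with numpy would help, something along these lines (a simplified sketch, not my actual block; the class and hook names are made up):

    import numpy as np
    from gnuradio import gr

    class threshold_detector(gr.sync_block):
        """Finds upward threshold crossings with one vectorized numpy pass."""
        def __init__(self, threshold=0.5):
            gr.sync_block.__init__(self,
                                   name="threshold_detector",
                                   in_sig=[np.float32],
                                   out_sig=None)
            self.threshold = threshold

        def work(self, input_items, output_items):
            samples = input_items[0]
            # One vectorized comparison instead of a per-sample Python loop.
            above = samples > self.threshold
            # Indices where the signal goes from below to above the threshold.
            # (A crossing exactly at a buffer boundary would need state
            # carried between work() calls; omitted in this sketch.)
            crossings = np.flatnonzero(above[1:] & ~above[:-1]) + 1
            for idx in crossings:
                self.handle_trigger(idx)  # hypothetical hook into the decoder
            return len(samples)

        def handle_trigger(self, idx):
            pass  # decoding logic would start here

A per-sample loop in interpreted Python is typically orders of magnitude slower than a single numpy comparison over the whole buffer, so this part alone could plausibly be what's holding up the stream.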


I don't know how to do this more efficiently. I'm thinking of rewriting it in C++, but I don't know how much of an improvement that would bring.
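Before porting, I figure I could measure what rate the block actually sustains. Here's the kind of wrapper I have in mind (a sketch with made-up names, applied to the block's work() method):

    import time

    def rate_meter(work_fn, report_every=10000000):
        """Wrap a sync_block work() to print its sustained processing rate."""
        state = {"busy": 0.0, "count": 0}
        def wrapped(self, input_items, output_items):
            t0 = time.time()
            ret = work_fn(self, input_items, output_items)
            state["busy"] += time.time() - t0
            state["count"] += len(input_items[0])
            if state["count"] >= report_every:
                msps = state["count"] / state["busy"] / 1e6
                print("work() sustains %.2f Msps" % msps)
                state["busy"], state["count"] = 0.0, 0
            return ret
        return wrapped

Each stream needs the block to keep up with 6.25 Msps, so if the printed rate comes in below that, the block itself is the bottleneck regardless of what the overall CPU figure says.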


Thank you,


AB
