On 08/14/2019 01:14 PM, Moses Browne Mwakyanjala wrote:
> Hello everyone,
>
> I'm experiencing packet dropping when I operate the USRP X310 (1 GbE, 1472-byte buffer) at high sample rates (around 20 MSamples/second). This severely limits the nominal bit rate I was hoping for. Over a year ago, I stumbled upon a presentation [1] where the presenter was able to work around the problem by creating a block he called "buffer". Basically, this block converts high-rate complex IQ samples into 2-byte chars and stores them somewhere in RAM. The data is then released at a low-enough rate that subsequent blocks cause no overflows. Sadly, the code is not public, and I was thinking of writing a similar block myself. I'm looking for ideas on how to efficiently reproduce his results. All suggestions are highly welcome.
>
> Regards,
> Moses.

The problem with any buffering approach is that if your host system cannot, on average, "keep up", your buffers will slowly fill up, since they aren't being drained as fast as they are being filled.
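For what it's worth, the kind of decoupling block described in the quoted message can be sketched in a few lines. This is a hypothetical illustration, not the presenter's code: the class name, the use of a deque, and the int8 scale factor are all my assumptions. It quantizes complex samples to interleaved 8-bit I/Q (the "2-byte char" per sample) and parks the chunks in RAM until a slower consumer asks for them.

```python
from collections import deque

import numpy as np

class RamBuffer:
    """Hypothetical decoupling buffer: fast producer, slow consumer."""

    def __init__(self):
        self.queue = deque()  # chunks of interleaved int8 I/Q, held in RAM

    def push(self, samples, scale=127.0):
        # Quantize complex samples to one signed byte each for I and Q,
        # i.e. 2 bytes per complex sample, as described in the thread.
        iq = np.empty(2 * len(samples), dtype=np.int8)
        iq[0::2] = np.clip(samples.real * scale, -128, 127).astype(np.int8)
        iq[1::2] = np.clip(samples.imag * scale, -128, 127).astype(np.int8)
        self.queue.append(iq)

    def pop(self):
        # Release one chunk at whatever pace downstream blocks can sustain.
        return self.queue.popleft() if self.queue else None

buf = RamBuffer()
buf.push(np.array([1 + 1j, -1 - 1j], dtype=np.complex64))
chunk = buf.pop()
print(list(chunk))  # interleaved I/Q bytes: [127, 127, -127, -127]
```

Note this only trades sample precision (8-bit instead of 32-bit float) for a 4x reduction in memory and downstream bandwidth; the buffering itself is still subject to the caveat below.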
You need a faster computer, or a more-efficient processing flow. Buffering helps only if you're doing short captures; for continuous capture/streaming, no amount of buffering will help you.
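The arithmetic behind that claim is worth spelling out. With hypothetical numbers (the 20 MS/s fill rate is from the question; the 18 MS/s drain rate and 1 GiB buffer size are assumptions for illustration), any finite buffer overflows in bounded time:

```python
# If the host drains slower than the radio fills, backlog grows linearly.
fill_rate = 20e6   # samples/s arriving from the X310 (from the question)
drain_rate = 18e6  # samples/s the host can process on average (assumed)

backlog = 0.0
for second in range(10):
    backlog += fill_rate - drain_rate  # net growth per second
print(f"backlog after 10 s: {backlog:.0f} samples")  # 20000000 samples

# Time until a 1 GiB buffer of 8-byte complex samples overflows:
capacity = (1 << 30) / 8                      # samples that fit in 1 GiB
seconds_to_overflow = capacity / (fill_rate - drain_rate)
print(f"1 GiB buffer lasts ~{seconds_to_overflow:.0f} s")  # ~67 s
```

So a big RAM buffer buys you about a minute under these assumed rates; it postpones the overflow, it does not prevent it. Only raising the average drain rate (faster machine, leaner flowgraph, or a lower sample rate) fixes continuous streaming.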
