I am experiencing some extremely annoying behaviour with my FPGA / RT code.
I have a bitfile which works. I can use it in a minimal test case where all DMA Writes and Reads work as expected. But when I incorporate it into a larger application, sometimes (for reasons unknown to me) one of the Read channels stops working. It does not block on the read (with Timeout -1); it returns immediately, the array output of the DMA Read node is empty, and there is NO error. This should essentially never happen: telling a DMA channel to give me 88 elements with a Timeout of -1 should either produce an error or wait until 88 elements are available.
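To make the expected contract explicit, here is a minimal sketch of the same read expressed via the FPGA Interface C API (the LabVIEW DMA Read node should behave equivalently). The bitfile name, signature, resource string and FIFO index below are hypothetical placeholders, not taken from my project:

/* Sketch only: expected blocking semantics of a DMA FIFO read.
 * Bitfile, signature, resource and FIFO constants are hypothetical. */
#include <stdio.h>
#include <stdint.h>
#include <stddef.h>
#include "NiFpga.h"

#define REQUESTED 88u  /* elements requested per read */

int main(void)
{
    NiFpga_Status status = NiFpga_Initialize();
    if (NiFpga_IsError(status)) return (int)status;

    NiFpga_Session session;
    status = NiFpga_Open("MyBitfile.lvbitx", "SIGNATURE", "RIO0",
                         NiFpga_OpenAttribute_NoRun, &session);
    if (NiFpga_IsError(status)) { NiFpga_Finalize(); return (int)status; }

    status = NiFpga_Run(session, 0);

    uint16_t data[REQUESTED];
    size_t remaining = 0;

    /* With an infinite timeout this call must either block until 88
     * elements are available or return an error status; returning
     * success with no data is not a documented outcome. */
    status = NiFpga_ReadFifoU16(session,
                                0 /* hypothetical generated FIFO constant */,
                                data, REQUESTED,
                                NiFpga_InfiniteTimeout,
                                &remaining);

    printf("status = %d, elements still in FIFO = %zu\n",
           (int)status, remaining);

    NiFpga_Close(session, 0);
    NiFpga_Finalize();
    return 0;
}

What I see in the failing case corresponds to the call above returning success with an empty output and no elements consumed, which is exactly the combination that should not occur.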
It's almost as if the DMA Read node is being constant folded with an empty array (or the RIO driver is messed up)... but only sometimes. Of course the code depends on a rather large RT project, and the bitfile won't even run without our specific hardware, so submitting the code to reproduce the issue is very difficult.
We had a problem in the past where DMA Reads on a PXIe-8840 showed similar behaviour (only on a single-core kernel). That was apparently fixed in the RIO driver. Now I am seeing similar problems on a PXIe-8115. What's strange is that my minimal test case (using the SAME VIs) works, but the full software doesn't. Again, it's only a Read DMA that doesn't work; the Write DMA works fine (and is called in the same Timed Loop as the Read).
Does this sound familiar to anyone? It's driving me nuts. I'm using LV 2015 SP1 (15.0.1f3), and the NI RIO driver version is 15.5.