OK, guys, one more question from the DAQ fire hose I'm drinking from. I'm trying to run my little NI 9205 AI module in a cDAQ-9174 chassis just as fast as I can - I'm after the absolute minimum *latency* on the analog voltage input measurement. When I crank the sample rate up to the 9205's maximum of 250 kS/s, I soon get a -200279 error.

As you can see, I'm doing very little that could slow down the PC. I can upsize the DAQ buffer, but of course that just prolongs the agony - it just takes longer to croak. I *think* the default transfer mechanism is DMA (it seems to be supported for my hardware, although I'm not positive on that). Since I'm not really *doing anything* with the data (other than occasionally displaying it), I wouldn't have thought I'd really need to set up a buffer in the software at all. What's wrong with my thought process?

Thanks for any insight,
paul
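For what it's worth, here's the back-of-the-envelope picture of why a bigger buffer only postpones the -200279 error rather than fixing it. This is a toy model, not the NI-DAQmx API: the rates and buffer sizes are made-up illustrative numbers, and `seconds_until_overflow` is a hypothetical helper. The idea is that the hardware/DMA side fills the host buffer at a fixed rate no matter what, so if the reading loop drains it any slower, overflow is only a matter of time, proportional to the buffer size.

```python
# Toy model (NOT the NI-DAQmx API): the device streams samples into the
# host buffer at a constant rate; the application loop drains it. If the
# drain rate is lower, the backlog grows linearly until the buffer fills.
def seconds_until_overflow(sample_rate, read_rate, buffer_size):
    """Seconds until a buffer of `buffer_size` samples overflows,
    given samples/s flowing in (sample_rate) vs. out (read_rate)."""
    net_fill_rate = sample_rate - read_rate  # samples accumulating per second
    if net_fill_rate <= 0:
        return float("inf")  # the reader keeps up; no overflow, ever
    return buffer_size / net_fill_rate

# Hardware pushes 250 kS/s; suppose the loop only consumes 200 kS/s.
print(seconds_until_overflow(250_000, 200_000, 100_000))    # 2.0 s to croak
print(seconds_until_overflow(250_000, 200_000, 1_000_000))  # 20.0 s - 10x buffer, 10x later
print(seconds_until_overflow(250_000, 250_000, 100_000))    # inf - only cure: read fast enough
```

So skipping the software buffer isn't really an option for a continuously sampled task - the driver needs somewhere to land the DMA transfers - and the only lasting fix is to make sure the read loop consumes samples at least as fast as the hardware produces them.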