I read that LV2019 included a feature that allows you to highlight the execution of a region of code instead of the whole block diagram, but I cannot find any documentation on how to do it. Can someone clue me in?
Thanks,
-Jamie
Hi,
I have an old LabVIEW VI that prints characters to a number of different label printers. However, since NI obsoleted the "Standard Format" in their reports, we can no longer print to an Intermec 3440 printer.
Writing a driver to use IPL is not an option because we are also going to be using this VI to send commands to other printers in remote locations.
I have tried unsuccessfully to print via Word, Excel, and HTML.
Any suggestions?
I have an NI-9209 for thermocouple readings, an NI-9214 for voltage readings, and two cDAQ-9171 chassis for interfacing with my computer via two USB cables. I can take readings from the 9209 or the 9214 individually using DAQ Assistant. But when I set up the DAQ Assistant with channels from both devices, I get error -201483. I am new to all this, but the error makes it sound like I am not using the right equipment to take readings from two different devices in one VI. Is there a way for me to get temperature and voltage readings from both devices at the same time in the same VI?
Perhaps I could use DAQmx to make the VI read one device at a time? But I don't know enough about DAQmx to write that code.
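A possible starting point, if the error comes from putting modules that live in two separate chassis into one task (each cDAQ-9171 is its own single-slot chassis, and a single DAQmx task generally cannot span chassis): use two independent tasks, one per device. A minimal sketch using NI's Python nidaqmx API — the module names and channel assignments are assumptions, so check NI MAX for yours; the same two-task structure can be built with DAQmx VIs in LabVIEW.

```python
# Hedged sketch: read temperature and voltage from modules in two
# separate single-slot chassis using two independent DAQmx tasks.
# Device names "cDAQ1Mod1"/"cDAQ2Mod1" are assumptions -- check NI MAX.
import nidaqmx
from nidaqmx.constants import AcquisitionType, ThermocoupleType

with nidaqmx.Task() as temp_task, nidaqmx.Task() as volt_task:
    temp_task.ai_channels.add_ai_thrmcpl_chan(
        "cDAQ1Mod1/ai0", thermocouple_type=ThermocoupleType.K)
    volt_task.ai_channels.add_ai_voltage_chan("cDAQ2Mod1/ai0")

    # Each chassis gets its own sample clock; the two are not synchronized.
    temp_task.timing.cfg_samp_clk_timing(10, sample_mode=AcquisitionType.CONTINUOUS)
    volt_task.timing.cfg_samp_clk_timing(1000, sample_mode=AcquisitionType.CONTINUOUS)

    temp_task.start()
    volt_task.start()
    for _ in range(10):
        temps = temp_task.read(number_of_samples_per_channel=10)
        volts = volt_task.read(number_of_samples_per_channel=100)
        print(temps[-1], volts[-1])
```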
Attachments: Daq with multiple modules 2.PNG, Daq with multiple modules 1.PNG
Hello all,
I have a complicated program that was seemingly working fine several days ago; however, it now crashes the entire LabVIEW environment quite frequently. I have attached an error dump from LabVIEW 2015 (32-bit).
Can anyone provide any insight as to what the problem might be?
Hi all,
In LabVIEW I successfully hooked up two pulse-monitoring systems. The problem I am having now is that the signals seem to interfere with one another: when you toggle one of the photoresistors, the other photoresistor gives the same graphical readout, and vice versa. Any help?
Can someone explain to me how the DAQmx edge counter handles the data being read? I don't understand the data-capture process other than that it is counting on the falling edge of a pulse. I have it set to take continuous data, 100 samples at a rate of 1 kHz. What exactly does this mean, and what will the data be? I am trying to get this counter to give me a value and then reset every time, and to add up the elapsed times per read, so that I have an accurate counts-per-second figure.
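For a sample-clocked count-edges task, each returned sample is the running total of edges seen up to that clock tick, so 100 samples at 1 kHz are 100 snapshots of the cumulative count taken 1 ms apart. Counts per second then falls out of differencing successive samples — no counter reset needed. A minimal sketch in NI's Python nidaqmx API; the device name and the PFI0 sample-clock source are assumptions (most counters have no internal sample clock, so a source must be supplied).

```python
# Hedged sketch: cumulative falling-edge counts sampled at 1 kHz,
# converted to counts/second by differencing. "Dev1/ctr0" and the
# "/Dev1/PFI0" sample-clock terminal are assumptions for illustration.
import numpy as np
import nidaqmx
from nidaqmx.constants import AcquisitionType, Edge

with nidaqmx.Task() as task:
    task.ci_channels.add_ci_count_edges_chan("Dev1/ctr0", edge=Edge.FALLING)
    task.timing.cfg_samp_clk_timing(
        1000, source="/Dev1/PFI0", sample_mode=AcquisitionType.CONTINUOUS)
    task.start()
    counts = np.array(task.read(number_of_samples_per_channel=100))

# Each sample is the running total at a clock tick, so successive
# differences give edges per 1 ms interval; scale by the rate for edges/s.
edges_per_interval = np.diff(counts)
counts_per_second = edges_per_interval * 1000
print(counts_per_second.mean())
```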
I need to compute the FFT of 300,000 elements of fixed-point type (FFT length of 150,000). On my current FPGA (Kintex-7 325T), the FFT Express function has a grayed-out limit of 65,536, and the largest FFT length actually available to me is 8,192. I know there are ways to work around this by breaking my data up into smaller DFTs and then recombining them, or by programming my own implementation (which would surely be less efficient than LabVIEW's FFT function?). But I assume the issue is lack of resources, and I believe either approach would run into the same sort of trouble. Is there some way to overcome this on my current FPGA, and if not, what sort of specs should I be looking for when purchasing a new one?
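For reference, the "smaller DFTs recombined" route is the standard Cooley-Tukey (four-step) decomposition: with N = N1·N2, an N-point FFT becomes N1 FFTs of length N2, a twiddle-factor multiply, and N2 FFTs of length N1. A minimal NumPy sketch of the math — a host-side illustration only; streaming these stages through an 8,192-point FPGA core is a separate exercise, and the 150,000 = 150 × 1000 split shown is just one choice.

```python
# Minimal sketch of the "four-step" Cooley-Tukey decomposition:
# an N-point FFT (N = n1*n2) built from smaller FFTs plus a
# twiddle-factor multiply. Verified here against np.fft.fft.
import numpy as np

def fft_four_step(x, n1, n2):
    n = n1 * n2
    assert x.size == n
    a = x.reshape(n2, n1).T                    # a[j1, j2] = x[n1*j2 + j1]
    b = np.fft.fft(a, axis=1)                  # n1 FFTs of length n2
    j1 = np.arange(n1)[:, None]
    k2 = np.arange(n2)[None, :]
    b = b * np.exp(-2j * np.pi * j1 * k2 / n)  # twiddle factors W_N^(j1*k2)
    c = np.fft.fft(b, axis=0)                  # n2 FFTs of length n1
    return c.reshape(n)                        # X[n2*k1 + k2] = c[k1, k2]

x = np.random.randn(150_000) + 1j * np.random.randn(150_000)
print(np.allclose(fft_four_step(x, 150, 1000), np.fft.fft(x)))  # True
```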
I am using 32-bit LabVIEW 2018 on 64-bit Windows 10.
I have a 9627 sbRIO target in the project and have configured it to have a 9684 RMC board. None of the RMC channels/resources appears to be working. When I drop an FPGA I/O Node on the diagram of an FPGA VI, the RMC channels are not available in the selection lists. When I drag an RMC channel from the project onto the VI diagram, that RMC channel is there, but it is "bad" (see the screenshot). For an AO channel, for example, the node looks like it returns a value instead of accepting one, and the wire out of the return value is dotted. Some sort of error. Any clue how to enable these channels?
TIA,
DaveT
I have two PCIe FPGA cards (7851R Multifunction RIO and 1473R Camera Link) installed in a Dell Precision 5820 running Windows 10, with LabVIEW 2018, all updated to the latest releases. The Camera Link card is transferring data to Windows through a DMA FIFO at about 400 MB/s, and the RIO card at about 5k/s through a second DMA FIFO. Every so often (typically a few hours), while running a program that uses both cards, the computer will completely freeze, with a frozen screen, and no response to keyboard or mouse. Programs that use either of the two cards on their own (using the same compiled FPGA code) never freeze the computer, even when run for several days.
By chance, on one freeze, Windows started to display an error dialog - there has never otherwise been any description of the fault, or any error logged in LabVIEW or Windows.
The error message doesn't give much information: "An error was detected in the communication between the host computer and the FPGA target." I am not using any external clocks, only the 100 MHz image clock on the 1473R and the normal 40 MHz clock on the 7851R card.
I've checked that it's not a memory fault (running an overnight memory check), and I've tried swapping card slots on the PCIe bus. Neither was successful. I was wondering whether there are issues with conflicts when using multiple DMAs, but haven't found any warnings about that being the case. I've changed the code so that DMA writes do not occur from both cards at the same time, but this has not solved the problem either.
Does anyone know of a possible cause of these lockups, or how to troubleshoot in this situation? Or has used an equivalent hardware configuration? It's very puzzling that this fault would take down Windows completely. Any and all suggestions are very welcome!
I am looking for drivers for a Eurotherm 2216 dual action controller. Older community posts I have read suggest that ni.com has these drivers, but I can no longer find them when searching on the website. Could someone direct me to where I might find them, or if such drivers are no longer supported, would drivers for a similar device like the Eurotherm 2416 work? Thank you for your help.
I am new to BLE. To communicate with a DUT that runs BLE 4.2, I have procured the nRF52840 USB dongle. I would like to know if anyone has experience with this hardware and has tried using it with LabVIEW.
Specific questions I have, besides LabVIEW drivers, are:
1. How many DUT connections can I make, with the dongle as the central device?
2. I see that nRF Connect is to be used as the utility software. Has anyone used it?
And finally, is there any device that NI supports for BLE functional testing (not the protocol-analyser kind)? I should mention that I have ordered a BLED112 as a backup, based on information I found on the NI forum.
Dear LabVIEW community,
please let me ask you about the following.
I'd like to have a C# interface for my application (because LabVIEW's possibilities are not suitable for my task), but I also need to access VIs programmatically and read/write properties of UI objects via property nodes.
Is it possible to do it in such a way that this combination works? Or will there be problems, like different application contexts (because the VIs will be in edit mode while the C# application runs as an executable), or some other issues?
Does anyone have experience with a similar task?
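One avenue worth noting: LabVIEW exposes its VI Server over ActiveX (the "LabVIEW.Application" COM class), so an external program — including a C# executable, via COM interop against the LabVIEW type library — can open VI references, read/write front-panel controls, and run VIs in the development environment. A minimal sketch, shown in Python/pywin32 purely for brevity; the VI path and the control name "Numeric" are assumptions for illustration. Note these calls land in the development environment's application context, which is exactly the concern raised above.

```python
# Hedged sketch: driving LabVIEW's VI Server through its ActiveX
# interface ("LabVIEW.Application"). The same calls are reachable from
# C# by adding a COM reference to the LabVIEW type library.
# The VI path and control name are hypothetical.
import win32com.client

lv = win32com.client.Dispatch("LabVIEW.Application")
vi = lv.GetVIReference(r"C:\work\MyFrontPanel.vi")

vi.SetControlValue("Numeric", 42.0)   # write a front-panel control
vi.Run(True)                          # True = run asynchronously
print(vi.GetControlValue("Numeric"))  # read it back
```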
Thank you very much,
Sincerely, kosist90.
I used NI Web Services to build a web service program and published it to the local machine.
The system is Windows Server 2012 R2 with MySQL 8.0, and the LabVIEW version is 2015 SP1, 32-bit.
The main function of the web service program is to insert the POSTed data into the MySQL database and respond accordingly; it has no other functions.
Other programs communicate with the web service via POST, about 1 s to 20 s apart.
I have now found that each communication causes NIWebServiceContainer.exe to increase its memory consumption by tens to several hundred kB.
After running for 4 hours, NIWebServiceContainer.exe grew from 20 MB at startup to more than 110 MB.
After running for about 20 hours, the memory reached 418 MB, and inserts into the database started to fail.
What causes this? Thanks.
Hello, everyone,
I have a question about continuous acquisition and real-time playback of sound signals using NI-Scope. I tried to solve the problem with queues, but there are still interruptions. While troubleshooting, I found that the min record length parameter in niScope Configure Horizontal Timing had to be set; otherwise the sound signal cannot be collected at all. Even so, the collection is interrupted and the playback doesn't sound very continuous.
How to achieve continuous acquisition without interrupting the playback of sound?
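For reference, the usual cure for gapped playback is a producer/consumer split where the fetch loop never waits on the audio output, so the hardware buffer is drained continuously. A generic sketch of the pattern in Python — the acquisition and playback calls are stubbed placeholders, since the niScope fetch specifics depend on the configuration; the point is the bounded queue between the two loops.

```python
# Hedged sketch of the producer/consumer pattern that decouples
# acquisition from playback. acquire_chunk() and play() are stand-ins
# for the niScope fetch and the sound-output write.
import queue
import threading
import numpy as np

buf = queue.Queue(maxsize=32)        # bounded queue smooths timing jitter

def acquire_chunk():
    # Placeholder for fetching the next chunk from the digitizer.
    return np.random.randn(4096).astype(np.float32)

def play(chunk):
    pass                             # placeholder for the audio write

def producer():
    while True:
        buf.put(acquire_chunk())     # blocks only if playback falls far behind

def consumer():
    while True:
        play(buf.get())

threading.Thread(target=producer, daemon=True).start()
consumer()
```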
Thank you for your valuable assistance and I hope we will find a solution. Thank you for sharing your experience with me.
Lee
From some NI boards' documentation, I see that there are FIFOs for controlling the AO channels.
I checked some demo VIs for DAQ AO control, but did not find any further usage of them.
Can I use the FIFOs for time-accurate control this way, just to ensure the timing precision of the AO voltage channels, so as to accomplish part of the performance of an RTOS?
My idea is this: I first set the FIFO buffer size to 3k samples, set the channel clock to the board's 1 kHz clock, and then write a 1D array of 2k samples into the FIFO buffer.
The clock will then "eat" the FIFO at a rate of 1k samples/s.
I then check the FIFO's fill level repeatedly; once it drops below 1k, I top it up with another 1k samples of waveform data. So it seems I could use the board's hardware to continuously output a smooth voltage signal with proper timing precision.
But from the demo VIs in the NI folder, it appears that if I want to use the FIFOs to output a voltage signal (for example, a sine signal), I need to:
1. first produce the array data for that sine signal,
2. then use LabVIEW to put the array into the board's FIFO,
3. then have the board output the sine signal.
Only after all these steps finish does LabVIEW again put the array into the FIFO, and the DAQ board again outputs the voltage signal.
It seems there is no chance for me to check the remaining FIFO space, and no chance to top it up continuously.
So if I use this on Windows, there will probably be a timing error at the joint between two FIFO buffers' worth of signal on the AO channel pin. The output pin's voltage waveform could easily be distinguished and divided into sections, each section corresponding to one FIFO write from LabVIEW.
Am I right? Is there some misunderstanding I need to correct? I want to output time-accurate signals (such as voltage signals) with a simple NI board, as most RTOSes can do.
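For what it's worth, the refill scheme described above is essentially what DAQmx's buffered, hardware-timed AO with regeneration disabled provides: a write call simply blocks until buffer space frees up, and every sample at the pin is paced by the hardware sample clock, so Windows loop jitter does not section the waveform as long as the buffer never underflows. A minimal sketch in NI's Python nidaqmx API (device/channel names assumed; the same structure exists as DAQmx VIs in LabVIEW):

```python
# Hedged sketch: continuous, hardware-timed AO with regeneration
# disabled, matching the "prime 2k, top up 1k at a time" scheme above.
# "Dev1/ao0" is an assumption. With regeneration disallowed, each
# write blocks until buffer space is free -- no explicit polling of
# the FIFO fill level is needed, and the hardware clock, not the
# Windows loop, paces every sample at the pin.
import numpy as np
import nidaqmx
from nidaqmx.constants import AcquisitionType, RegenerationMode

rate = 1000
chunk = np.sin(2 * np.pi * 5 * np.arange(1000) / rate)  # 1k-sample sine piece

with nidaqmx.Task() as task:
    task.ao_channels.add_ao_voltage_chan("Dev1/ao0")
    task.timing.cfg_samp_clk_timing(rate, sample_mode=AcquisitionType.CONTINUOUS)
    task.out_stream.regen_mode = RegenerationMode.DONT_ALLOW_REGENERATION

    task.write(np.tile(chunk, 2))   # prime the buffer with 2k samples
    task.start()
    for _ in range(60):             # keep topping up; each call blocks
        task.write(chunk)           # until 1k samples of space are free
```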
Hello. I need your ideas. I have a system with some tools, like pumps and sensors. The tools produce data: when the system starts up, the data starts to flow and is sent to a hub. The hub is connected to a computer, which runs the LabVIEW VI interface. The incoming data is interpreted and plotted in LabVIEW and then transmitted back to the sensors and pump according to the new values entered. First, my goal is to be able to observe the information coming from the tools, and then to send data to them. Where should I start? I need to design an interface that can control the system. You can see an example in the picture; it is an executable that was not created by me, but I want to design the same interface. How do I do that?
Hi!
Thanks for the help last time!
I am currently connecting a BK Precision 8500 to LabVIEW, but I have to say it is not working. I have already gone through the datasheet and manual for the 8500 and downloaded the driver for the adapter. When I try to run the instrument's example code, it runs fine with no errors, but there is no communication between LabVIEW and the instrument (no instruments detected). Does anyone know why that is?
Here I attached the code I was using.
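When an example runs clean but reports no instruments, one useful first check is to ask VISA directly what it enumerates, independent of the instrument driver. A minimal PyVISA sketch — the ASRL resource name and the 4800 baud value are guesses for illustration; a serial adapter typically shows up as an ASRL/COM resource, and the serial settings must match the 8500 manual. If nothing is listed here, the problem is below LabVIEW (driver or cabling), not in the example VI.

```python
# Hedged sketch: list what VISA can see, then try opening the port.
# The resource name and baud rate below are assumptions -- adjust
# to your machine and to the settings in the 8500 manual.
import pyvisa

rm = pyvisa.ResourceManager()
print(rm.list_resources())        # expect an ASRL (COM) entry for a serial adapter

inst = rm.open_resource("ASRL3::INSTR")   # assumed port
inst.baud_rate = 4800                     # assumed value -- check the manual
```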
Thanks in advance!
Hello,
I have a small program for grabbing .avi videos with IMAQdx (see attachment). I use a 3rd-party mini USB camera with HD quality, 1 MPixel. My problem is that the video plays back too fast, about 2 times faster than normal. I have tried to adjust the frame rate down, but at some point there are too few samples per second and the video stutters. I have also tried putting "Wait" and "Wait Until Next..." functions in the loop and adjusting the time, but they don't affect the speed of the video at all. What is wrong? What can I do to tune the video to the correct speed?
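One common cause of exactly-2× playback: the frame rate stamped into the AVI header doesn't match the rate at which frames were actually grabbed. Players pace playback from the header value, not from the grab-loop timing, which would explain why the Wait functions have no effect. A hedged sketch using OpenCV as a stand-in for the IMAQ AVI VIs: measure the camera's delivered rate first, then write the file with that value.

```python
# Hedged sketch: an .avi plays at the fps written into its header,
# not at the speed of the grab loop. If a camera delivers ~15 fps
# but the file says 30, players run it at twice real speed.
import cv2
import time

cap = cv2.VideoCapture(0)                 # assumed camera index

# Measure the true delivered frame rate over a short warm-up.
n, t0 = 30, time.time()
for _ in range(n):
    cap.read()
true_fps = n / (time.time() - t0)

fourcc = cv2.VideoWriter_fourcc(*"MJPG")
out = cv2.VideoWriter("clip.avi", fourcc, true_fps, (1280, 720))

for _ in range(int(true_fps * 10)):       # record roughly 10 seconds
    ok, frame = cap.read()
    if ok:
        out.write(cv2.resize(frame, (1280, 720)))

out.release()
cap.release()
```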
br,
paalbrok