Please find the attachment.
The software is not listed/displayed.
Please help.
I am using an NI 9234 DAQ.
I have 4 channels configured using standard DAQmx VIs, not the Express DAQ Assistant VI, as it will not allow me to add a control for the sensitivity.
The issue I am having is that the VI runs slowly: when you look at the acceleration display you see segments that match the loop iterations, but the FFT graph looks fine.
Attached is my VI; please have a look and see if I am missing something.
Thank you
Hi folks,
I made a program that enters information into a web browser programmatically through ActiveX and then submits it. When selecting a value in a dropdown, I am able to enter the value and display it on the webpage, but when the page is submitted, it behaves as if no value was entered, even though the dropdown displays the value I entered. I'm attaching a snapshot of the webpage and of my code. I would love feedback to understand what I might be doing wrong.
thanks,
Santosh
Hello!
In the attached image you can see that in my VI I use a single Bundle to build a cluster with 3 elements. Everything was fine before I inserted the 3rd element into the cluster, but now the Unbundle by Name does not show the 3rd element, while a plain Unbundle does. Why?
It doesn't make sense to me. It seems like a bug.
I'm using LabVIEW 2016 32-bit on Windows 7.
Thanks in advance!
My current setup involves acquiring signals from over 300 sensors. I was able to configure them properly in DAQmx, inputting them into the Sensor Mapping VI with an imported model. The issue lies in placing the sensors themselves. From my understanding, the only way to place a sensor is to drag it onto the model or right-click on the model. This becomes difficult when we have over 300 sensors with variable locations. Is there a way to import coordinates from an Excel file and place the sensors accordingly?
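On the file side, reading the coordinates is the easy half if the sheet is exported to CSV; here is a minimal sketch. The "name,x,y,z" column layout is my assumption, and whether Sensor Mapping accepts programmatic placement at all is exactly the open question above.

```python
import csv
import io

# Hypothetical CSV export of the sensor sheet (Excel -> Save As -> CSV).
# The "name,x,y,z" column layout is an assumption, not from the post.
csv_text = """name,x,y,z
S001,0.10,0.25,1.00
S002,0.35,0.25,1.00
S003,0.60,0.40,1.20
"""

def load_sensor_coordinates(stream):
    """Map each sensor name to its (x, y, z) position in model units."""
    return {row["name"]: (float(row["x"]), float(row["y"]), float(row["z"]))
            for row in csv.DictReader(stream)}

sensors = load_sensor_coordinates(io.StringIO(csv_text))
```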
I am currently writing a GUI that updates its components at ~ 20 Hz. More specifically, it consists of 3 image displays, 4 waveforms and a bunch of buttons and numeric indicators. The more elements I added, the more laggy the GUI got, which is especially visible in the camera image which is updating at ~ 2 Hz instead of 20 Hz.
So I stumbled upon the "Defer Panel Updates" property and hoped that it would solve my problem. The GUI is built as a queued message handler, and I included a message which sets "Defer Panel Updates" to true before the "Update GUI" message and another one afterwards that sets "Defer Panel Updates" to false (see attachment).
To my surprise, this doesn't change anything at all. But, for example, resizing the window such that only one image display is visible speeds that one up to normal speed (I guess LabVIEW doesn't redraw hidden elements).
The "Defer Panel Updates" property seems to work since setting it only to True gives me a "frozen" VI as expected.
Am I missing something here or misunderstanding how this property is supposed to work? Are there other ways to speed up the GUI updates?
Thanks!
PS: I also have another VI in a SubPanel if that makes a difference. But disabling this didn't change anything.
Hey there LabVIEW forums,
I am currently working on a project that involves working with three different DAQs, acquiring at three different speeds:
The above three are connected to an NI-9184 chassis. I display the actual data in real time using property nodes, so high-speed acquisition for the latter two is very much preferable. Each of those has a SubVI responsible for a specific set of measurements.
Now, for the purpose of logging the acquired data, I will be using a template-based Excel logging system, which works just fine for logging the data it acquires. However, since the data from the pressure and pulse DAQs is acquired so much faster than from the temperature DAQ, for each single temperature measurement there are many more pressure and frequency measurements, which throws off my whole log in Excel. My questions are:
If you'd like to see code, it can of course be included, but I felt my questions were more theoretical; at this stage the code is a bit half-baked and mostly uncommented, so I didn't include it for now.
Thank you for your help.
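One way to make the multi-rate logging question above concrete: repeat the most recent temperature reading for every fast row (sample-and-hold), so each Excel row is complete. A rough sketch; all timestamps and values are invented for illustration.

```python
# Sample-and-hold alignment: for every fast-channel timestamp, log the
# most recent slow-channel (temperature) reading at or before it.
def align_slow_to_fast(fast_times, slow_times, slow_values):
    """Return one slow value per fast timestamp (latest at or before it)."""
    out = []
    idx = -1
    for t in fast_times:
        while idx + 1 < len(slow_times) and slow_times[idx + 1] <= t:
            idx += 1
        out.append(slow_values[idx] if idx >= 0 else None)
    return out

# Fast channel every 0.1 s; temperature readings arrive at 0.0 s and 0.25 s.
rows = align_slow_to_fast([0.0, 0.1, 0.2, 0.3], [0.0, 0.25], [20.1, 20.4])
```

The alternative design, logging each rate to its own sheet or file with timestamps, avoids duplicating the slow data at the cost of a join later.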
Hi everyone!
I'm trying to control a stepper motor using myRIO. I need to generate a Boolean pulse train with a regulated frequency. I tried creating a sine function and checking whether it is > 0, but that doesn't seem to work. Which functions should I use?
EDIT: I added a screenshot of my program.
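For reference, the pulse-train logic can be sketched in plain code without going through a sine wave: accumulate the waveform phase and threshold it directly. The 50 Hz train and 1 kHz sample clock below are illustrative values, not from the post.

```python
# Boolean pulse train: derive the high/low state from the waveform phase
# instead of thresholding a sine.
def pulse(t_seconds, freq_hz, duty=0.5):
    """True during the high portion of each period."""
    phase = (t_seconds * freq_hz) % 1.0
    return phase < duty

# Two 50 Hz periods sampled at 1 kHz: alternating runs of 10 True / 10 False.
train = [pulse(i / 1000.0, 50.0) for i in range(40)]
```

The `duty` parameter also gives control over the pulse width, which a sign test on a sine cannot.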
Hello.
Currently I'm trying to familiarize myself with the LabVIEW Database Toolkit, version 2014.
In our company we use a database on a remote server, so our IT guy installed Oracle on my computer and set up a DSN. Now, in LabVIEW, I'm able to create a connection and list all the tables in that database, but when I try to list column information I get this error:
"Error -2147217865 occurred at NI_Database_API.lvlib:Conn Execute.vi->NI_Database_API.lvlib:DB Tools List Columns.vi->Untitled 1"
Among the possible reasons it says that the table doesn't exist in the database, but that is nonsense. Is there something I'm missing? Something in the DSN settings, or...?
Below is a screenshot of my code and the error message.
Thanks
I've recently started using the C Series module NI 9860, which is a neat little hardware-configurable XNET device. It is a great alternative to the other C Series XNET hardware, which has only one port, either dual-wire CAN or LIN. This module can be two CAN ports, two LIN ports, or one of each. (And maybe single-wire CAN in the future? Hint, hint, NI.)
Now, the reason for this post is that I'm using the LIN transceiver, and the card itself needs to be powered, but when I use the LIN interface it also needs to be powered. Is there a reason for this? Can't NI just wire the power from Vsup internally to the DB9 interface? And furthermore, why is Vsup even needed?
I'm using an RT cDAQ chassis, and I need to provide power to it (9-30 VDC). I then need to wire this same power to the Vsup terminal (9-30 VDC) on the input of the card, AND I need to wire this same power to pins 9 and 3 of the DB9 of the LIN interface (8-18 VDC). What's the point of all this? Can't the card be powered by the chassis? Can't the transceiver be powered by the card? If anyone else uses LIN on the 9860, just know that you need to power both the card and the transceiver (and the chassis if applicable), and they can share the same supply.
Hi, I have a LabVIEW GUI which receives serial data via VISA and basically chops the strings and carries out matches until I have the values I seek. I then convert from an ASCII string to decimal and feed the new values to a waveform. However, I have discovered that some of the output values are incorrect. I checked this by pulling the data in through RealTerm.
Is there any way I can see the modified data after each stage of the GUI so that I can find where the problem lies? I have created indicators after each stage, but they are not displaying the data; I assume it's because the data is coming in too quickly.
I am very new to LabVIEW and have struggled through every bit of this GUI, as any of you regular posters will know. If you suggest a solution, please know that I will probably ask the most basic questions in order to implement it.
I have attached my GUI along with a screenshot of the same data coming through RealTerm. Basically, anywhere you see TR xxxx TR or W xxxxW, the data between the markers (i.e., xxxx) is the data of interest. Everything else is discarded.
I suspect the problem lies in the conversion from string to decimal, where 3-digit values grow to 4 digits: any values under 850 come through correctly, but that is the maximum value reached, while my values should be reaching 2500. Any help would be greatly appreciated.
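To make the marker format concrete, here is a quick sketch of pulling the xxxx payloads out of a captured string. The exact whitespace around the markers is an assumption from the description above; adjust the `\s*` parts to match the real stream.

```python
import re

# Sample capture in the described format: values of interest sit between
# "TR ... TR" or "W ... W" markers; everything else is discarded.
raw = "TR 849 TR noise W 2500W TR 1023 TR"

def extract_values(text):
    """Return every numeric payload found between the markers, in order."""
    pairs = re.findall(r"TR\s*(\d+)\s*TR|W\s*(\d+)\s*W", text)
    # findall yields (tr_digits, w_digits) tuples; exactly one side matched.
    return [int(tr or w) for tr, w in pairs]

values = extract_values(raw)
```

Because `\d+` is greedy, a 4-digit payload like 2500 is captured whole, which is where a fixed-width string-chopping approach would silently truncate.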
I have the datasheet for an NTC (negative temperature coefficient) thermistor and I want to find the equation that best approximates its behaviour. The table has temperature and resistance values, and the resistance of the NTC varies depending on the temperature it senses. I've already tried the following:
- Excel: polynomial fitting
- Maple: polynomial fitting
- LabVIEW: Polynomial Fit.vi
I've been getting equations with poorly approximated parameters. I suspect that polynomial fitting is not the best method for this type of data, since the underlying equation is nonlinear.
So, has someone used the Nonlinear Fit.vi to approximate the behaviour of a dataset? If so, can you explain to me how this VI works? I would really appreciate it.
Thanks in advance; I'll wait for your answer.
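For what it's worth, an NTC usually follows the Beta model R(T) = R0 * exp(B * (1/T - 1/T0)), an exponential rather than a polynomial in T, which is why polynomial fits struggle. Taking logs makes it a straight line in 1/T, so even an ordinary linear fit recovers the parameters. A sketch with hypothetical datasheet values; substitute the numbers from your own table.

```python
import math

# Hypothetical datasheet points (temperature in kelvin, resistance in ohms)
# for a generic 10 kOhm NTC -- replace with the values from your table.
T = [273.15, 298.15, 323.15, 348.15, 373.15]
R = [32650.0, 10000.0, 3602.0, 1480.0, 678.0]

# Beta model: R(T) = R0 * exp(B * (1/T - 1/T0)). Taking logs gives
# ln R = B * (1/T) + c, a straight line in x = 1/T, so an ordinary
# least-squares line fit recovers B and, via the intercept, R0.
x = [1.0 / t for t in T]
y = [math.log(r) for r in R]
n = len(x)
mx, my = sum(x) / n, sum(y) / n
B = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
    sum((xi - mx) ** 2 for xi in x)
c = my - B * mx

T0 = 298.15                  # 25 degC reference point
R0 = math.exp(c + B / T0)    # resistance at T0 implied by the fit

def r_of_t(temp_k):
    """Fitted resistance (ohms) at the given temperature (kelvin)."""
    return R0 * math.exp(B * (1.0 / temp_k - 1.0 / T0))
```

A nonlinear fitter can refine B and R0 directly on the untransformed data, but the linearized fit above is usually a good starting guess for it.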
I am using an NI USB-6001 to measure some voltages at the 5 V level.
I am trying to use one input to measure the noise level on top of a 5 V signal. The ripple is roughly 200 mV peak-to-peak (when looking at a scope in AC-coupling mode). I can't find a suitable setting in the DAQmx VIs to measure over a narrower range (4.9-5.1 V) with a higher resolution so that I can see this ripple.
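As a quick sanity check on whether a narrower range is strictly necessary, here is the arithmetic. The 14-bit converter and single ±10 V input range are quoted from memory of the USB-6001 spec and should be verified against the datasheet.

```python
# Back-of-envelope resolution check for the ripple measurement above.
full_scale_v = 20.0                     # -10 V .. +10 V span (assumed)
bits = 14                               # ADC resolution (assumed)
lsb_v = full_scale_v / (2 ** bits)      # ~1.22 mV per code
ripple_v = 0.200                        # 200 mV p-p ripple from the post
codes_across_ripple = ripple_v / lsb_v  # ~164 codes span the ripple
```

If those figures hold, the 200 mV ripple spans on the order of 160 codes even at full range, so averaging or AC-coupling externally may matter more than the input range setting.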
Working with a PXI-8513/2. LabVIEW 2016.
I have developed a CAN driver and would like to test it.
I have never implemented CAN communication before, so I am kind of a noob with it!
I am using Port 1 for reading and Port 2 for writing, in Frame In/Out Single Point mode.
My questions are:
1) Why do we need to connect an XNET database to initiate CAN communication?
2) Why are all the frames listed in the XNET database sent at the end of my CAN initialization? (I do a CAN read with a timeout.)
3) When writing a new frame with CAN communication open, I receive an error message telling me the XNET session name is empty. Do I need to physically wire Port 1 to Port 2?
I have already read the CAN whitepapers and the Wikipedia article on CAN!
BR,
Vincent
I need to do some image processing (to extract some information), but after I process the image I can't print the original image.
Hello, I want to display real-time temperature recordings in LabVIEW through a YOKOGAWA MX100.
I installed the DAQmx driver, but it still didn't work.
It then seems that I should download the driver package for the MX100, but I don't know how to deal with the file...
Does anyone know about this?
Does anyone know if there is a function to take the inverse Laplace transform and display the equation as shown in the picture below?
I would like to be able to display it in both forms.
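For reference, the kind of symbolic result meant here is a standard transform pair, e.g. (with u(t) the unit step):

```latex
\mathcal{L}^{-1}\!\left\{\frac{1}{s+a}\right\} = e^{-at}\,u(t),
\qquad
\mathcal{L}^{-1}\!\left\{\frac{\omega}{s^{2}+\omega^{2}}\right\} = \sin(\omega t)\,u(t)
```

So "both forms" would mean showing the rational function of s alongside its time-domain equivalent.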
I have a PXI-6281 M series card and am trying to use a pause trigger to gate acquisition and synchronize two 6281 cards.
If the gate activates after the task is started, the t0 of the waveform output is based on the task start time, and any pauses in acquisition produce no gaps in the waveform output.
I don't plan to have a pause in the middle of acquisition, but the starting t0 needs to be based on the leading edge of the pause trigger, not on the task-start VI call.
Is there a way to do this?
Thanks,
XL600
I'm working with MGI Save Settings (sample attached below). The problem is that I have a simple multi-slide slide control, but I can't restore both slide values; only one slide is restored!
When I browse the saved config file, I find that the two slides are saved under the same name. How can I save the slide values under different names so that I can restore them both?
The attached snippet illustrates the problem.
I am using the DAQmx TDMS logging functionality (LabVIEW 2015) on a PXIe-8135 RT controller. When I compare the first sample's actual t0 to the t0 in the resulting TDMS file, they do not match: the TDMS file appears to round the wf_start_time timestamp down to the nearest whole second. Is this a known behavior?
Thanks,
XL600