Channel: LabVIEW topics

C Code Generator for LabVIEW 2014


Does anyone know if there is a 2014 version of the C Code Generator? We have LabVIEW 2014 at work, and I have LabVIEW Home Edition 2014 (32-bit) at home. I can find and download a 2015 version:

http://sine.ni.com/nips/cds/view/p/lang/en/nid/209015

 

and also a 2013 version:

http://www.ni.com/download/-2013/4437/en/

 

but for some reason no 2014 version. I can download these toolkits, but I am unable to use them because I get the message "20xx LabVIEW version must be installed".

 

Do I have any options for using the C generator, other than forking out hundreds of dollars?


NI 9237: True strain or Engineering strain?


Hi guys,

Thanks for reading my query.

I am a bit confused about the measurement from my beloved NI 9237.

I am using 120 Ω strain gauges with the NI 9237; the gauges have a gauge factor of 2.1.

I want to know whether the strain I am measuring is true strain or engineering strain.
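
For context, my (possibly wrong) understanding is that the gauge-factor relation dR/R = GF * strain is written in terms of engineering strain, so that is what the bridge measurement returns; true strain would then just be a logarithmic conversion, roughly like this sketch:

import math

def true_strain(engineering_strain):
    # true (logarithmic) strain from engineering strain: eps_true = ln(1 + eps_eng)
    return math.log(1.0 + engineering_strain)

# e.g. 2000 microstrain engineering -> true strain (nearly identical at small strains)
print(true_strain(2000e-6))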

Kindly let me know, and once again thanks for reading my question.

Cheers

Syed

NI PCI-6251 compatibility with Windows 10


Hello, I was wondering if anyone knew whether the M Series NI PCI-6251 is compatible with Windows 10. The data sheet only mentions Windows XP, Windows Vista, and Windows 7. I appreciate any help you can offer. Thank you.

How can I split a cluster into its elements when transferred through a queue as an array?


Hey guys,

 

I have two clusters being sent over a queue and I am having difficulty extracting the correct data. 

 

The data being sent is split into four parts: position, velocity, a Boolean, and an enum. The sizes of the elements are 48, 48, 1, and 8 bits respectively. This means the data has to be split before being put on the queue (since 64 bits is the largest amount that can be sent at any one time).

 

The data is first broken into a Boolean array, then built and split into two 64-bit numbers that get put on the queue and sent down.

 

 

I have (somewhat by fluke) managed to extract the position and velocity data, but I can't seem to get the Boolean. I have yet to try extracting the enum. I am also not certain why I had to bitwise AND the velocity with 2^32 and then divide by 256; I was looking at the output and this just ended up working.

 

I think I may have confused myself. I have two 64-bit numbers that I want to split into their 48-, 48-, 1-, and 8-bit components. Any help would be appreciated. Thank you!
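
In case it helps to show what I mean, here is a rough sketch (plain Python, not LabVIEW) of one possible packing scheme. The field order and MSB-first layout are my assumptions; the real bit order depends on how the sender builds the Boolean array.

FIELDS = [("position", 48), ("velocity", 48), ("boolean", 1), ("enum", 8)]  # 105 bits total

def pack(values):
    packed = 0
    for name, width in FIELDS:
        packed = (packed << width) | (values[name] & ((1 << width) - 1))
    packed <<= 128 - 105                      # left-align the 105 bits in 128 bits
    return (packed >> 64) & 0xFFFFFFFFFFFFFFFF, packed & 0xFFFFFFFFFFFFFFFF

def unpack(word0, word1):
    packed = ((word0 << 64) | word1) >> (128 - 105)
    values = {}
    for name, width in reversed(FIELDS):      # peel fields off the LSB end
        values[name] = packed & ((1 << width) - 1)
        packed >>= width
    return values

words = pack({"position": 0x123456789ABC, "velocity": 0x0000DEADBEEF, "boolean": 1, "enum": 7})
print([hex(w) for w in words])
print(unpack(*words))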

CRC32


Hi, I am trying to compute a CRC32 on my data but am somehow getting the wrong value. I am using the CRC32 code from the following link:

https://forums.ni.com/t5/Example-Programs/Calculating-the-CRC32-of-a-File-with-LabVIEW/ta-p/3496230

but instead of inputting a file, I am inputting the hex value "000102030405060708090A0B" to test the CRC. When I compare my result with an online calculator (http://crccalc.com/) using hex input, it matches the CRC-32/JAMCRC row, 0x6D8F369A, but I should be getting the CRC-32/MPEG-2 value, 0xAD88945B.

I am not sure what the online calculator is referring to as "RefIn" and "RefOut", as that is the only thing that differs between the two results.


Is there any setting I need to change in the CRC32 LabVIEW code?
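
For what it's worth, my understanding of the CRC parameter catalogues is that RefIn/RefOut mean "reflect (bit-reverse) each input byte" and "reflect the final CRC register": CRC-32/JAMCRC is the reflected variant and CRC-32/MPEG-2 the unreflected one, both using polynomial 0x04C11DB7, initial value 0xFFFFFFFF, and no final XOR. A rough reference implementation (plain Python, not the LabVIEW VI) where only those two flags differ:

def reflect(value, width):
    # bit-reverse the lowest `width` bits of value
    out = 0
    for i in range(width):
        if value & (1 << i):
            out |= 1 << (width - 1 - i)
    return out

def crc32(data, poly=0x04C11DB7, init=0xFFFFFFFF,
          refin=False, refout=False, xorout=0x00000000):
    crc = init
    for byte in data:
        if refin:
            byte = reflect(byte, 8)
        crc ^= byte << 24
        for _ in range(8):
            crc = ((crc << 1) ^ poly) if (crc & 0x80000000) else (crc << 1)
            crc &= 0xFFFFFFFF
    if refout:
        crc = reflect(crc, 32)
    return crc ^ xorout

data = bytes.fromhex("000102030405060708090A0B")
print(hex(crc32(data)))                            # CRC-32/MPEG-2 parameter set
print(hex(crc32(data, refin=True, refout=True)))   # CRC-32/JAMCRC parameter set

If the LabVIEW code reflects the bytes (as the common zip/Ethernet CRC-32 does), that alone could explain matching the JAMCRC row instead of MPEG-2.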

Any help would be appreciated.


Thanks

TDMS file read and write operations taking a very long time to finish on a particular PC


Hi All

I'm facing an issue (very slow execution) when reading from and writing to a TDMS file. It takes around 23-30 minutes for a 108 MB file with a 6 MB index file.

PC details:

It's an NI PXIe controller in a chassis, running 32-bit Windows 7 with 4 GB RAM. The OS was installed nearly 5 years ago (it has not been reinstalled or the machine serviced since). The PC has NI TestStand 2010 SP1 and LabVIEW 2011 SP1.

 

Observation 1:

The same operation runs noticeably faster on other laptops with the same input file: around 1.5 to 3.3 minutes (tested on both a 32-bit Windows 7 laptop with 4 GB RAM and a 64-bit Windows 7 laptop with 8 GB RAM).

 

Observation 2:

I've checked the slow PC's C drive (Check Disk via right-click on the C drive >> Properties >> Tools >> Error checking). It says no errors were found.

Also, around 100 GB of space is available on the C drive.

Observation 3:

However, many other similar TDMS files are stored on this PC with their respective index files (around 8 GB in total). This is not the case on the other laptops, where it runs faster. I don't know if this is related, but I also tried the operation after deleting the other files, and it's the same.

 

Observation 4:

When I monitored Task Manager during the VI run, it showed red lines in the charts under the CPU performance tab. I've also captured a screenshot where the CPU shows red without any operation running. The CPU is not 100% occupied and memory looks available, but the red lines are still there. It definitely appears that something is going wrong with the CPU.

On the other laptops, there are no red lines in the CPU performance charts.

Observation 5:

The TDMS file operation inside the VI uses a FOR loop with iteration parallelism and CPU affinity (number of logical processors = 8) on this PC. If the CPU is having issues, could that be causing a problem with the parallelism?

Could it be bad sectors on the drive, or the CPU's logical processors not really behaving as the affinity settings assume?

 

I've provided screenshots and the VI for reference. Please give me your suggestions or ideas for checking this CPU issue.

THANK YOU.

-Krishna

 

 

 

Keysight 34465A: measuring resistance and voltage


Hi,

I am trying to measure resistance and DC voltage. When I try to measure one of them I get correct values, but when I try to measure both of them I get an error: (VISA Write in Agilent 3446X Series.lvlib:Configure Measurement Trigger Parameters.vi -> resistance mesaurement.vi)

I could not fix this problem. Thanks in advance for your help.
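
For reference, as far as I know a DMM can only be configured for one function at a time, so I believe the two readings have to be taken one after the other (reconfiguring in between) rather than configuring both at once. A rough sketch of that sequence in plain SCPI over PyVISA (not the LabVIEW driver; the resource address is a placeholder):

import pyvisa

rm = pyvisa.ResourceManager()
dmm = rm.open_resource("TCPIP0::192.168.1.100::inst0::INSTR")  # placeholder address

resistance = float(dmm.query("MEAS:RES?"))      # configure + trigger + read resistance
voltage = float(dmm.query("MEAS:VOLT:DC?"))     # then reconfigure for DC volts and read
print(resistance, voltage)
dmm.close()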

Wi-Fi communication between computer and myRio via TCP


Hi everybody,

I am using LabVIEW 2015 on Windows 10 and I am trying to communicate with a myRIO-1900 over a TCP connection via Wi-Fi. I have already set the Wi-Fi up; I can ping the myRIO (IP: 172.16.0.1) and it works.

I would like to send and receive data to and from the board through TCP (other methods are not acceptable for us), using the myRIO as the server and the PC as the client.

The main problem at this point is that the "TCP_Listen.vi" block on the Server hangs and it does not even start to listen on the port (perhaps we selected the wrong port number?).

Secondly, we have an Error 63 ("Serial port receive buffer overflow") on the Client.

 

You can find our code in the attachment.
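
As far as I understand it, TCP Listen blocking until a client actually connects is normal, and error 63 from the TCP VIs usually means the connection was refused (the "serial buffer" text is a shared error string), so it may simply be that the two sides are not using the same port. A bare-bones sketch of the handshake in plain Python sockets (run the server half on the myRIO side and the client half on the PC; port 5000 is an arbitrary choice):

# server half (the role the myRIO plays with TCP Listen)
import socket

srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
srv.bind(("", 5000))            # any free port above 1023; both sides must agree on it
srv.listen(1)
conn, addr = srv.accept()       # blocks here until the PC connects; this is expected
print("PC connected from", addr)
conn.sendall(b"hello from the myRIO side\n")
conn.close()
srv.close()

# client half (the role the PC plays with TCP Open Connection)
cli = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
cli.connect(("172.16.0.1", 5000))   # the myRIO's Wi-Fi address and the same port
print(cli.recv(1024))
cli.close()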

Thank you very much in advance,

 

Lorenzo


DSB/Full Carrier AM demodulate using OAT Envelope Detection VI always gives 180deg out of phase


I've tried to use the OAT Envelope Detection VI to demodulate a DSB/full-carrier conventional AM signal; however, the output is always 180° out of phase with the input message. Any idea why?
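
For reference, a generic envelope detector (magnitude of the analytic signal) gives a non-negative envelope, so an inverted message usually points to how the carrier/DC term is removed, or to a sign convention, rather than the detection itself. A quick cross-check along those lines (plain Python/SciPy, not the OAT VI, with made-up signal parameters):

import numpy as np
from scipy.signal import hilbert

fs = 100_000                                   # sample rate, Hz
t = np.arange(0, 0.01, 1 / fs)
message = np.sin(2 * np.pi * 500 * t)          # 500 Hz test message
am = (1 + 0.5 * message) * np.cos(2 * np.pi * 10_000 * t)   # DSB full-carrier AM, m = 0.5

envelope = np.abs(hilbert(am))                 # approximately 1 + 0.5 * message
recovered = envelope - np.mean(envelope)       # strip the carrier (DC) term
print(np.corrcoef(recovered, message)[0, 1])   # close to +1, i.e. not inverted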

 

How to save image of entire front panel to folder?


I've attached my VI; based on error checking, it seems to work fine up until the Write JPEG to File code runs. It gives error 1 there. I feel like I've almost got it figured out, but no dice yet.

 

To test my code, just use whatever VI of your own you have on hand.

Call by reference and EXE


Hello everybody,

I did a code where a main VI call and open some subVIs.

In order to compact the code, I replaced the standard case structure (where each subVI sits in its own case) with a Call By Reference.

Call by reference vs. Case (image attached)

The problem is in the executable distribution: I cannot leave the subVI source code in the distribution, for protection reasons.

 

The executable built with the call-by-reference structure doesn't work when I delete the subVIs... obviously.

 

Do you have any advice on how to solve this?

Thanks in advance and have a great summer.

Lucio

Producer consumer architecture for image acquisition


Hello!

I would like to take as many images as possible in 0.25 s using my AVT Manta G146 camera. Is a producer/consumer architecture the fastest way to do this? I have attached my code for reference. Is there any better or faster way to grab and save images?
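
To make sure I have the pattern right, this is my rough understanding of producer/consumer in plain Python (the grab and save functions are placeholders for the real camera acquisition and image write): the grab loop only enqueues frames and never waits on disk I/O, and the consumer does the saving.

import queue
import threading
import time

def grab_frame():
    time.sleep(0.002)            # placeholder for the actual camera grab
    return b"fake image data"

def save_to_disk(index, frame):
    time.sleep(0.010)            # placeholder for the image write (the slow part)

frame_queue = queue.Queue()

def producer(n_frames):
    for i in range(n_frames):
        frame_queue.put((i, grab_frame()))   # acquisition loop: enqueue and move on
    frame_queue.put(None)                    # sentinel: tell the consumer we're done

def consumer():
    while True:
        item = frame_queue.get()
        if item is None:
            break
        index, frame = item
        save_to_disk(index, frame)           # disk I/O happens here, off the grab loop

threading.Thread(target=producer, args=(50,)).start()
consumer()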

Thanks in advance.   

USB HID: sending a 20-byte string


Hi All

 

Our external electronics contractor and I have developed a custom switch matrix. The switch matrix has approximately 151 electromechanical relays.

 

There is a PIC on board and a USB HID interface.

Our electronics contractor is currently on holiday for 3 weeks. He made a quick C code interface that allows the user to switch the relays.

 

To switch the relays, an 8-bit hex value needs to be sent to the switch matrix for each of the first 19 bytes. Each bit corresponds to a relay to turn on or off. The last bit of the 19th byte is not used because there are 151 relays (19 x 8 = 152).

 

When I plug the PCB into the laptop via USB (the only USB interface on the PCB), it is not seen by NI MAX. I have the VID and PID numbers for the USB HID interface.

USB_CONFIG_PID  0x0022

USB_CONFIG_VID  0x0461

 

I can use the NI-VISA Driver Wizard to create an INF file for the USB HID interface. I can find the instrument and verify this by checking that the VID and PID numbers are correct. I then save the INF file, and the NI-VISA Driver Wizard automatically closes.

 

When I go into NI MAX, I can now see the PCB under Devices and Interfaces.

The following ID is returned:

 

USB0::0x0461::0x0022::NI-VISA-10002::RAW

 

I have the following questions:

 

In LabVIEW, using a VISA Open, the device above is not seen in the VISA resource name dropdown.

I need to send 20 bytes of information to the device, for example:

 

BYTE 0 - 00

BYTE 1 - 00

BYTE 2 - 00

.....

BYTE 18 - 2C

BYTE 19 - 00

 

How can I do this?

 

In the C application this works; it's very manual, but it works. It also returns the information you have sent.
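
If the VISA RAW route keeps fighting back (HID devices talk over interrupt/control endpoints, which, as far as I know, don't always play nicely with VISA USB RAW writes), one sanity check outside LabVIEW would be the hidapi Python bindings. This sketch assumes the device uses unnumbered reports (report ID 0), which is something only the PIC firmware/contractor can confirm:

import hid                             # pip install hidapi (cython-hidapi bindings)

VID, PID = 0x0461, 0x0022
payload = [0x00] * 20
payload[18] = 0x2C                     # BYTE 18 from the example above

dev = hid.device()
dev.open(VID, PID)
dev.write([0x00] + payload)            # first byte is the report ID (0x00 = unnumbered)
print(dev.read(20, timeout_ms=1000))   # the C tool echoes the data back, so read it here
dev.close()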

 

Any help would be gratefully appreciated.

 

 

Graphing multiple analog channels on an XY graph


Good morning all. I need help from anyone who may have the answer to my question. I need to graph multiple analog channels on an XY graph. I am measuring voltage from a device, taking 1000 samples per read at a rate of 10,000 Hz. I would like to graph each sample against the time at which it was taken, on an XY graph. How do I get the time values so that I can plot them on the XY graph? The code sample I provided shows a timer, but when I run the program it does not work properly. Looking forward to hearing from any of you!
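
To show what I mean by "corresponding time": my understanding is that with a fixed sample clock, sample i of read k simply happens at t = (k*N + i)/fs (in LabVIEW this is what the waveform's t0 and dt carry). A small sketch with the numbers above:

import numpy as np

fs = 10_000          # sample clock, Hz
N = 1_000            # samples per read

def block_times(block_index, n_samples=N, sample_rate=fs, t0=0.0):
    # X values for the XY graph: the time of each sample in read number `block_index`
    start = block_index * n_samples
    return t0 + np.arange(start, start + n_samples) / sample_rate

t = block_times(0)
print(t[0], t[-1])   # 0.0 ... 0.0999 s for the first read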

Thanks,

 

Isidoro Vazquez

 

ECU MC function error


Hi, 

 

I am trying to read from an ECU with the ECU MC measurement read VI, using the XCP on CAN protocol and interface CAN1. I am using an NI-XNET USB-8502 CAN interface to communicate with the hardware, and an A2L file for reference. I get error code -301048 when I run the program. Could anyone tell me why I am facing this error? I hope the information provided is sufficient for analysis. Below I am attaching a picture of my code.

 

 


Merge two TDMS files


Good day

I have been trying to find a way to merge two TDMS files created using the Sound and Vibration Assistant; one contains the acceleration data and the second the frequency data.

I would like to be able to export them to Excel to present the data.
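
One route I was considering (outside LabVIEW, so treat it as a sketch): read both files with the npTDMS Python package and write them to one Excel workbook with pandas. The file names below are placeholders. (I believe NI's free TDM Excel Add-In can also import TDMS files directly, if that is enough.)

import pandas as pd
from nptdms import TdmsFile        # pip install npTDMS

accel = TdmsFile.read("acceleration.tdms").as_dataframe()   # placeholder file names
freq = TdmsFile.read("frequency.tdms").as_dataframe()

with pd.ExcelWriter("merged.xlsx") as writer:
    accel.to_excel(writer, sheet_name="acceleration", index=False)
    freq.to_excel(writer, sheet_name="frequency", index=False)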

Thank you

Jay

Keithley 2400 simulated through IVI


I've gotten the drivers to simulate a Keithley 2400, but my setup is buggy. Even though it appears in the palettes, when I use its VIs I get the error message "KE2400 Initialize.vi <ERR> Driver Status: (Hex 0xBFFF00A5) The interface type is valid, but the specified interface number is not configured." How can I fix this?

Is there anyone out there with experience with this kind of thing who can help, or direct me to examples or resources? (Screenshot attached: image.png)

Triggering a Picoscope 3406B with a sync out from a laser


Gurus of LabVIEW,

So, since my early days of posting on the NI forums, I have now become a full-fledged Assistant Professor at a university. (Woo!) That said, there are still things in LabVIEW that I don't fully understand, one of the big ones being parallel computing (I never have enough time to do fun things, lol!), and I've run into an issue with a program one of my students wrote that I think requires some parallel work. First off, I'll describe what we're trying to achieve.

We have three pieces of equipment that are linked together (communication wise):
1. Upgrade COMPILER from Passat Ltd. (Picosecond laser)
2. A Picoscope 3406B (Oscilloscope/data collection)
3. An Arduino Uno and SparkFun easydriver (Drive a stepper motor to rotate a prism and send a 5 V high pulse to the trigger on the laser to fire a laser pulse)

The idea is that we're trying to create a LabVIEW program to collect photoacoustic data (ultrasound) from light that is impinging on the interface between the rotating prism and a sample of dye. The ultrasonic sensor is plugged into the Picoscope (channel A) and I have the program set up to trigger the Picoscope from the external port of the picoscope by hooking it up to the sync out of the laser. Ideally, the program would rotate the prism, pulse the laser/gather data, rotate the prism, pulse the laser/gather data, etc. The issue we're running into is that I'm not sure how to code up the LabVIEW such that the laser pulse (induced by the Arduino sending a 5 volt high to the trigger port on the laser) happens around the same time that the LabVIEW code is running the Picoscope block SubVI so that the Picoscope will trigger/obtain data. So far what happens is that the LabVIEW stalls once it gets to the Picoscope block SubVI because the electrical trigger pulse from the laser (sync out) happens before the Picoscope 3000 block is looking for it (I think). If I manually pulse the laser it gets beyond the Picoscope 3000 block as this then triggers it.

So, how do I code this so that the Picoscope 3000 block is looking for the pulse from the sync out while, at the same time, LabVIEW is sending the trigger pulse via the Arduino? (I have attached an image and the VI.) Also, I realize the error handling and other parts of the program might be odd; I've tried several things to get this to work (unsuccessfully). (Screenshot attached: TIRPAS.PNG)
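
To sketch the ordering I have in mind (placeholder functions only; the real calls are the PicoScope 3000 block SubVI and whatever serial/Arduino command fires the laser): arm the scope first in a parallel branch, and only then send the trigger, so the sync-out pulse arrives while the scope is already waiting for it. In LabVIEW terms this would be the scope branch and the Arduino write running in parallel, with the Arduino write delayed until the scope has armed.

import threading
import time

def run_block_capture():
    time.sleep(0.5)                 # stand-in for "arm, wait for trigger, capture"
    return [0.0] * 1024

def send_arduino_trigger():
    print("laser fired")            # stand-in for the 5 V pulse command to the laser

result = {}

def arm_and_collect():
    result["data"] = run_block_capture()    # blocks until the external trigger arrives

scope_thread = threading.Thread(target=arm_and_collect)
scope_thread.start()                # 1) arm the scope in a parallel branch
time.sleep(0.1)                     # 2) give it a moment to finish arming...
send_arduino_trigger()              # 3) ...then fire the laser
scope_thread.join()                 # 4) wait for the capture to complete
print(len(result["data"]), "samples collected")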

Simulated device in MAX


I simulated two DAQ devices (NI USB-6259) in NI MAX. I can now create a DAQ Assistant Express VI and select either of the two as input or output, but how do I read, at one device, a signal sent from the other (since there are no physical cables linking anything)? Do I configure virtual channels and link those? How? Any resource or example that you can point me to would be really appreciated.

I:

    -created/simulated NI devices A  and B in Ni-MAX

    -want to output a signal on A virtually

    -want to read said signal on B.

 

How do I link A and B?

How to use TCP when a Client uses DHCP


Hello,

 

I'm using a slightly modified example based on the simple TCP VIs (attached) and am running into a few problems. I keep getting error 56, which states: "LabVIEW: The network operation exceeded the user-specified or system time limit."

 

 

I think the issue is that I do not know the IP address of my client (an mbed microcontroller), and the library to statically assign an IP address to this client is currently unavailable/broken, according to their forums.

 

Is there a way for the server to listen on the port I programmed for my mbed, even if the mbed uses DHCP? I also wonder whether this is happening because I haven't configured the mbed in NI MAX; again, I do not know the IP address of the device, so I'm stuck in a bit of a hole.
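
If I understand TCP correctly, the listening side never needs to know the client's IP address at all: the mbed (whatever address DHCP hands it) just connects to the PC's IP and the agreed port, and the listener learns the client's address when it accepts, so NI MAX configuration shouldn't be needed for plain TCP either. A minimal sketch of that direction in Python sockets (port 6340 is an arbitrary placeholder):

import socket

srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
srv.bind(("", 6340))          # listen on all local interfaces, fixed port
srv.listen(1)
srv.settimeout(30)            # roughly what the LabVIEW timeout does before error 56
conn, (client_ip, client_port) = srv.accept()
print("mbed connected from", client_ip, client_port)   # DHCP-assigned address learned here
conn.sendall(b"hello mbed\n")
print(conn.recv(1024))
conn.close()
srv.close()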

 

Any help would be appreciated.
