Channel: LabVIEW topics

Unknown Network Published Shared Variables Resolution


Hello,

 

I recently ran into an issue where I had to pull data from a Network Published Shared Variable (NPSV) that is an intricate cluster of clusters, types, arrays, etc. We have numerous versions of this code deployed on hundreds of machines, and the type definition deployed in the field changed slightly with each version. The issue with NPSVs is that if the client-side type definition doesn't exactly match the one published by the server, the read throws:

 

Error -1950678965 occurred at Read Variable in #####.vi

Possible reason(s):

LabVIEW:  The value read is of a data type that cannot be converted to the data type of this Shared Variable node.

 

I needed code that would work regardless of the server's version. It is possible to keep all revisions of the typedef and iterate through them until one does not error out, but there is a simpler solution. Fortunately, the top-level type is a cluster, and I only needed the 7th element of that cluster.

 

When pulling an NPSV, LabVIEW will let you read it as a variant without error, whatever the published type! Then I just needed to decode the variant. I found this article very helpful:

 

https://forums.ni.com/t5/LabVIEW/Get-a-cluster-element-value-from-a-variant/m-p/3814182/highlight/true#M1077354

 

So here's the code I ended up using:

 

Unknown Network Shared Varible Decoding.png

 

 

This code works for all versions of the deployed code as long as the position of the cluster element does not change (it doesn't).

 

I thought this would be helpful for others, future me, and NI tech support, to show how I solved the problem. Using the VIs in the "Data Type Parsing" palette, the same palette as "Get Cluster Information.vi", it would be possible to dig through more complex data types if needed.
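For what it's worth, here is the idea in a rough Python analogy (purely an analogy; LabVIEW variants and the Data Type Parsing VIs have no direct Python equivalent): treat the published value as an untyped container and extract the element by position, which is the only thing all deployed versions have in common.

```python
def read_element_seven(payload):
    """payload: the decoded 'variant', a sequence whose exact layout
    varies by server version; element 7 (index 6) is stable."""
    if len(payload) < 7:
        raise ValueError("unexpected server version: payload too short")
    return payload[6]

# Works for any version as long as the element's position is unchanged:
v1 = ("id", "name", 1.0, 2.0, [3], {"k": 4}, 42.0)
v2 = ("id", "name", 1.0, 2.0, [3], {"k": 4}, 42.0, "new field")
assert read_element_seven(v1) == read_element_seven(v2) == 42.0
```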

 

Enjoy!

 

 


Stop a While Loop Before My Wait Time Elapses in LabVIEW


Hi guys,

I have a While Loop with a long Wait (ms) function. My Stop button takes too long to stop the loop. How can I program the Stop button to end the While Loop immediately? I appreciate your efforts in answering my question.

 

Kindly check my project in the attached snippet.vi.png.
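One standard remedy, sketched here in Python since the snippet isn't reproduced in this digest: replace the single long wait with many short waits, checking the Stop button on every slice (in LabVIEW, a small wait inside an inner loop, or an Event Structure with a timeout). All names and durations below are placeholders.

```python
import time

TOTAL_WAIT_MS = 10_000   # the original long wait
SLICE_MS = 50            # how often to check the Stop button

def stop_pressed() -> bool:
    """Placeholder for reading the Stop button's value."""
    return False

# The loop now reacts to Stop within ~50 ms instead of ~10 s.
waited = 0
while waited < TOTAL_WAIT_MS and not stop_pressed():
    time.sleep(SLICE_MS / 1000)
    waited += SLICE_MS
```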

App Builder Strange Behavior


This isn't really a problem, but it bothers me. I have a project that contains a library (both attached). In the build spec, two of the files (Aux Data.ctl and Step Data.ctl) used to go to a special destination (a separate app needed to load them dynamically). I changed things so that I no longer need those two files deployed in a special location; in the current build spec, they go into the .exe.

Here's what bugs me: several times during the build process, it hunts for Aux Data.ctl and Step Data.ctl (but not for any of the other .ctl files). Here is one of the message boxes that is displayed:

hunt.png

It's looking in the .exe. Eventually it finds the files and the build succeeds. But why is this happening? I tried recreating the build spec from scratch, but the same thing happens. What could be causing this?

Problem with the Real-Time Module in LabVIEW 2016


Hi everyone, I have LabVIEW 2016 and the myRIO toolkit installed on my Windows 10 computer. When I try to install the Real-Time Module, it prompts me about the NI Device Drivers. What is that driver, and where can I get it? I downloaded the myRIO toolkit and the Real-Time Module from the LabVIEW myRIO Software Bundle. Please help me, thanks!

How to publish to the web with NXG

How to force a case structure to start from the first case after pressing Stop and running again


Hi guys,

 

I made a small program that displays numerics in every case of a case structure. I just want the program to display the numerics from the first case when I click Stop and then run the program again.

v.png
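A guess, since only v.png shows the actual code: if the case index is kept in something that survives between runs (for instance a shift register fed by the previous run's value, or a control that is never reset), initialize it explicitly when the program starts so every run begins at the first case. A tiny Python analogue of that fix:

```python
def run_program(values):
    index = 0                    # reset on every run, so each run
    while index < len(values):   # starts again from the first case
        print(values[index])     # "display the numeric" for this case
        index += 1

run_program([1.0, 2.5, 3.7])     # first run
run_program([1.0, 2.5, 3.7])     # running again starts from the first case
```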

 

I appreciate all your help.

 

Thanks

 

FPGA Development - Why is this comparator not working as intended?


Hello,

 

I am a student whose project has taken me to the beautiful land of LabVIEW FPGA development. I am very new to this, so any advice regarding my design is appreciated. Admittedly, LabVIEW is much nicer than Vivado, but I am having trouble with some functions that should really be basic. I am reading two voltage inputs, comparing them, and comparing their difference to a user-set value. Below is a picture of part of the front panel and block diagram for this functionality (ignore the fact that I called my voltage readings I2 and I3):

https://imgur.com/a/M2KRuen

 

As you can see from the front panel, the difference between the two is calculated correctly (around 1.6 in this picture). However, the "Diff Warning" LED, which SHOULD turn on as soon as this difference exceeds "Differential Setting", stays off. As a consequence, "Fault Location" stays at 0 and is never set to 1. I've bashed my head against this for a while, and honestly, every compilation takes 10 minutes and I can't take it anymore.

 

Any idea why this might be? I will leave the code file attached as well if anyone is interested.

Thank you for your help everyone!

 

PS: Here is my system info

-NI cRIO 9024

-FPGA cRIO 9114

-Input module is NI 9223 (seems to work fine)

-Output module is NI 9401 (not an issue at this time)

Interactive pull-down inputs


Hi,

 

I am trying to automate a Wi-Fi board. Some of its inputs (pull-down selections, e.g. Params Value) depend on other inputs (Combo iw, e.g. iwlist/iwconfig). How can I make some input values depend on the others before executing the VI interactively? In other words, each iwlist or iwconfig selection should produce a different pull-down menu (case). See the sketch below.
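One common pattern, sketched in Python because the VI isn't shown here: keep the dependency as a map from the first input's value to the second input's allowed values, and rebuild the second menu whenever the first changes (in LabVIEW, for example, by writing the Strings[] property of a ring control inside the first control's Value Change event). The menu entries below just mirror typical iwlist/iwconfig options and are otherwise assumptions.

```python
# Map each "Combo iw" value to the "Params Value" choices it allows.
PARAM_MENUS = {
    "iwlist":   ["scan", "frequency", "rate", "txpower"],
    "iwconfig": ["essid", "mode", "channel", "txpower"],
}

def menu_for(combo_iw: str) -> list[str]:
    """Return the pull-down entries for the current Combo iw selection."""
    return PARAM_MENUS.get(combo_iw, [])

print(menu_for("iwlist"))    # menu shown after the user picks iwlist
print(menu_for("iwconfig"))  # a different menu for iwconfig
```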

 

Thanks,

Rostam

 


Best way to implement an IIR VI


First, no signal processing toolbox!!

 

I have an IIR filter to implement in a VI. My only trouble is that the filter has stored data states that must be maintained between calls. This VI will only handle one signal source, so it can store the filter state in the VI itself. It should have one input and one output. So, for the data kept between calls to this VI, should I use a global variable, or something else?
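The usual LabVIEW idiom for per-VI state is an uninitialized shift register in the VI (the functional-global pattern) rather than a global variable, since the state then lives with the VI and cannot be written from elsewhere. As a rough Python analogue of "state kept between calls", here is a one-pole IIR; the coefficient is a placeholder:

```python
class OnePoleIIR:
    """y[n] = a*x[n] + (1 - a)*y[n-1]; the stored y[n-1] plays the role
    of the uninitialized shift register between calls."""
    def __init__(self, a: float = 0.25):
        self.a = a
        self.y_prev = 0.0          # filter state kept between calls

    def __call__(self, x: float) -> float:
        self.y_prev = self.a * x + (1.0 - self.a) * self.y_prev
        return self.y_prev

filt = OnePoleIIR()
print([round(filt(x), 3) for x in [1.0, 1.0, 1.0, 0.0]])
# state persists across calls: 0.25, 0.438, 0.578, 0.434
```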

 

 

Application Directory VI is missing


Hi all,

 

I can't seem to find the Application Directory VI in LabVIEW 2017. It's not there under the "File Constants" palette; all the other VIs/functions are. Any explanation or help is greatly appreciated.

 

Any Ethernet/IP examples or tutorials?


Hi!

 

So, for the first time, I have to work with the EtherNet/IP protocol.

 

I would like to know if there are any examples or tutorials on the protocol and how to program the read/write communication.

 

I've checked on the internet and read the documentation, and I know a bit about how to do it with the address and everything, but nothing more.

 

One thing I don't understand is the "number of elements" input. Is it simply the number of elements in my array, for example? So if my array contains 4 values and I wire 3 to the "Tag Read" input, will I only read the first 3 values?
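That is the usual meaning in EtherNet/IP clients: the count of consecutive array elements to read, starting from the first. I can't speak for the NI toolkit's exact behavior, but as an illustration using the third-party Python library pycomm3 (not the NI API; the address and tag name are placeholders):

```python
from pycomm3 import LogixDriver  # third-party EtherNet/IP client

# Reading 3 elements of a 4-element array tag returns the first 3 values.
with LogixDriver('192.168.1.10') as plc:     # placeholder PLC address
    result = plc.read('MyArray{3}')          # {n} = number of elements
    print(result.value)                      # e.g. [1.0, 2.0, 3.0]
```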

 

Another thing: I don't have access to a PLC for now. Is there a way to simulate one, so I can check that my path is good, verify that I send/receive the data I want, and test more things before I have the PLC with me?

 

Thank you!

    - Dave

Hotkeys (e.g. ctrl + e to toggle block diagram and front panel) not responsive.


As the subject says, I am having difficulty with the keyboard shortcuts for LabVIEW 2018 on Win10.

Ctrl-anything is problematic. I have to leave the keyboard untouched for about a second, then very deliberately press Ctrl and then E to switch to the block diagram.

This happens across several machines, and I have never experienced it before in more than 15 years of LabVIEW development.

Any idea what could be causing the issue?

The keyboard, including shortcuts, works fine in all other Windows apps. Drop-down menus are normally responsive, and typing in LabVIEW shows a normally responsive keyboard. Only the keyboard shortcuts have the issue.

Problem plotting data read from VISA


Hi Guys,

 

I am working on LabVIEW code to read data from an ATmega32 microcontroller over USB, but I am facing a problem: whenever the data crosses from positive voltage to negative, or vice versa, the data point is lost and a positive or negative spike appears instead. I have tested my circuit on the CRO and no such peak exists, so I think the problem lies in the way I am reading the data. The (digital) hex data read is converted to analog per the "result representation" in the ATmega manual. Please find my VI, the read waveform, and the result representation attached. Please let me know if anyone finds my mistake.
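A common cause of spikes exactly at zero crossings is interpreting the raw reading as unsigned instead of sign-extending the two's-complement value. That is only a guess without seeing the VI, but the conversion (shown in Python, with a 16-bit width assumed) looks like this:

```python
def to_signed(raw: int, bits: int = 16) -> int:
    """Sign-extend a raw two's-complement reading."""
    if raw >= 1 << (bits - 1):   # top bit set means a negative value
        raw -= 1 << bits
    return raw

# 0xFFFF read as unsigned is 65535 (a huge positive spike);
# interpreted as 16-bit two's complement it is -1.
assert to_signed(0xFFFF) == -1
assert to_signed(0x7FFF) == 32767
```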

Thanks a lot!!

 

Have a great weekend!

Hardware-in-the-loop qualification


I've built a tester for automotive solenoids that uses NI DAQs and a computer. We simulate the powertrain signal, apply it to the solenoid, measure the pressure response, then save and analyze the data.

 

Today, I happened across a few NI articles on Hardware-in-the-Loop systems. As I'm trying to understand the HIL concept, I was wondering: does our test system qualify as HIL?

 

Thank you for your help,

Ron

LabVIEW graph data delayed


Hello all,

 

I'm currently working on a LabVIEW interface for a rocket thrust measurement test stand. We're using an Arduino and an XBee connection, although for our current tests we've been using a hardwired connection. The problem is that the serial data from our test weights is currently being written to the graphs at an incredibly delayed rate. As in, I'll take the weight off the sensor, and the graph/response will show that fact dozens of seconds later.

We're using VISA to receive the serial data (the Arduino is coded and calibrated to give accurate weight) in the form of pounds vs. seconds.

 

How should I counter this? I tried increasing my loop delay, but that didn't seem to have an effect. Ideally, we want the graph updated in real time.
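A frequent cause of this symptom is reading a fixed small amount per loop iteration while the device sends faster than the loop runs, so unread data piles up in the serial buffer and the graph lags further behind every second. The usual fix is to drain everything available on each iteration (in LabVIEW, read the VISA "Bytes at Port" property and read that many bytes). A rough Python/pyserial sketch of the same idea; the port, baud rate, and processing are placeholders:

```python
import time
import serial  # pyserial

ser = serial.Serial('COM3', 9600, timeout=0)  # placeholder port and baud

def process(line: bytes) -> None:
    print(float(line.decode()))               # stand-in for graphing

while True:
    n = ser.in_waiting                         # bytes already buffered
    if n:
        for line in ser.read(n).splitlines():  # drain the whole backlog
            if line:                           # (a production version should
                process(line)                  # also buffer partial lines)
    time.sleep(0.01)                           # keep the loop light
```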

 

I attached the file, although it's currently unfinished (the formatting in particular, and it also has three versions of the graph I'm using). For now, though, I only need help with the graph-writing delay.

 

Thank you for your help.


How to open a .MAX extension file


Hello friends. I have a binary file with a .MAX extension, created by an ECG machine. I can't open it in a normal way. I tried to read the file with LabVIEW, but the results obtained are not satisfactory. The file is attached here. Please, someone, help me read that file and extract the data from it.
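Without the vendor's format specification, the usual first step is to inspect the raw bytes for a readable header and then guess the sample encoding. A minimal Python sketch of that process; the file name is a placeholder and the 16-bit little-endian guess is purely an assumption to be varied until the trace looks like an ECG:

```python
import struct

with open('recording.MAX', 'rb') as f:   # placeholder file name
    header = f.read(64)
    print(header.hex(' '))               # look for magic bytes / ASCII text

    # Guess: the rest is little-endian 16-bit samples. If the plot looks
    # wrong, try '>h' (big-endian), '<f' (float32), different offsets, etc.
    payload = f.read()
    n = len(payload) // 2
    samples = struct.unpack(f'<{n}h', payload[:n * 2])
    print(samples[:10])
```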

thank you 

USB demodulation


Hello.
I'm developing a complex program that works with IQ data.
I'm really new to demodulation.
The app reads an IQ file and draws a waterfall; in the next step, the user selects a fragment on the waterfall, and the app must play that signal.
Here is the waterfall. You can see two signals; I need to play the right one (USB):

wf.PNG
My company has an old version of the demodulation code, and it almost works, but I hear a second signal (like a peep).
Now I'm trying to use the Modulation Toolkit. The AM demodulation VI has a single-sideband mode, but I can't tune it correctly.
Here is my test VI:

demodulation.png
I've also attached a LabVIEW 2015 archive with all subVIs.
Can anybody help me with USB demodulation?

 

Center frequency in the file = 10.052400 MHz
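Not the Modulation Toolkit itself, but the underlying math is short. A rough numpy/scipy sketch of USB demodulation by the filter method on complex baseband: shift the wanted sideband to 0 Hz, low-pass so only that sideband survives (this is what should remove the second "peep" signal), shift back, and take the real part. The sample rate, carrier offset, bandwidth, and file name are placeholders to be taken from the actual recording:

```python
import numpy as np
from scipy.signal import butter, lfilter

fs = 96_000         # IQ sample rate (placeholder)
f_carrier = 12_000  # USB suppressed carrier, relative to file center (placeholder)
bw = 3_000          # audio bandwidth of the USB signal (placeholder)

iq = np.fromfile('capture.iq', dtype=np.complex64)  # placeholder file
t = np.arange(len(iq)) / fs

# 1. Shift so the middle of the USB band (carrier + bw/2) lands at 0 Hz.
shifted = iq * np.exp(-2j * np.pi * (f_carrier + bw / 2) * t)

# 2. Low-pass at bw/2: only the wanted sideband survives; the other
#    signal on the waterfall (and the LSB region) is rejected.
b, a = butter(5, (bw / 2) / (fs / 2))
sideband = lfilter(b, a, shifted)

# 3. Shift back up by bw/2 and take the real part to get the audio.
audio = (sideband * np.exp(2j * np.pi * (bw / 2) * t)).real
```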

myRIO I2C slave


Hi everyone,

I'm having some trouble getting my PIC (18F248) to communicate with my myRIO-1900 board over I2C.
In my case, the PIC must be the master and the myRIO the slave.
Does anyone know how to make the myRIO act as a slave on the I2C bus (with the FPGA, for example)?

Thank you!

Error 8 when creating a DLL library in System32 folder (Windows 10 and 8)


Hello,

 

I am trying to build a DLL inside the System32 folder from a series of .vi files. The build fails with an Error 8 message. However, it succeeds if I select a different destination folder (e.g. C:\).

 

Please note that this used to work on Windows XP and Windows 7, but it fails on Windows 8, 8.1, and 10.

 

I would appreciate any help.

 

I am quite sure that the administrative privileges are sorted out, and I have also tried the suggestions on the NI website, to no effect.

MyDAQ single-point treatment ... possible?


Hi,

I'm a French electronics teacher, and I use LabVIEW and the myDAQ board with my students.

 

So this is my question:

 

I know that the myDAQ does not support the Hardware-Timed Single Point (HWTSP) sampling mode. So, if I have understood correctly, if I make a continuous-samples acquisition, the samples are automatically stored in a buffer before they reach the VI's processing? And the same for the Write function?

 

The processing sequence should be this one:

1: read the sample

2: store it in the read buffer

3: process it in the VI

4: store it in the write buffer

5: write the sample

 

If the sample frequency is set to 1 kHz (see the attached VI): at t = 0 ms, the board reads one sample and puts it in the read buffer; the VI then processes the data (this takes approximately a few microseconds); the sample then goes to the write buffer, and after that it is written to the board's analog output.

At t = 1 ms we do the same, and so on every millisecond...

 

Conclusion: are we doing single-point processing (N = 1 in the For Loop)? And the difference from HWTSP is that we must pass through the buffer?
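That matches my understanding: with a continuous task, the DAQ hardware clocks samples into a driver buffer, and the loop pops them out one at a time under software timing, so you process single points but without HWTSP's guarantee of one sample per hardware tick. A rough sketch of that pattern using the nidaqmx Python API (device and channel names are placeholders; myDAQ naming may differ):

```python
import nidaqmx
from nidaqmx.constants import AcquisitionType

# Continuous, hardware-clocked acquisition: the driver fills a buffer,
# and each read below pops one sample from it (software-timed loop).
with nidaqmx.Task() as task:
    task.ai_channels.add_ai_voltage_chan("myDAQ1/ai0")  # placeholder name
    task.timing.cfg_samp_clk_timing(
        rate=1000.0, sample_mode=AcquisitionType.CONTINUOUS)
    task.start()
    for _ in range(1000):
        sample = task.read(number_of_samples_per_channel=1)
        # the "VI processing" step happens here, one point at a time
        print(sample)
```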

 

Thanks for your help!

Regards,

David
