Channel: LabVIEW topics

Simple Serial inside a while loop


The VI below is a simple one that works well under ideal conditions, but, as mentioned there, in real life there are issues to address. One method is to implement logic that does the following:

1. Sense the error status.

2. On Error, close the Serial session and then restart the session. 

 

But I just want a basic clarification: I think the Serial Start / Write / Read functions are blocking, meaning that until the intended function completes or a timeout happens, the loop is frozen.

I would like to know if there is any way of making these non-blocking, i.e. I initiate a serial read and then go do something else; once the function completes it notifies me and I handle the data just received. For those from the embedded world this is the normal way of working with processor interrupts, but I am not sure about the Windows world!! Any tips on this?
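
In text-based languages the usual workaround is a background reader that hands finished data to the main loop, which is essentially what a producer/consumer pair of loops does in LabVIEW. Purely as an illustration of the idea (a rough sketch assuming the pyserial package and a hypothetical COM3 port), not a LabVIEW answer:

```python
# Rough sketch of a non-blocking read pattern, assuming the pyserial package
# and a hypothetical "COM3" port. A background "producer" does the blocking
# read and hands finished data to the main loop through a queue, which is
# essentially what a LabVIEW producer/consumer loop pair does with VISA Read.
import queue
import threading
import time

import serial  # pyserial

def reader(port, out_queue):
    """Producer: block on the serial read here, not in the main loop."""
    while True:
        data = port.read(64)           # returns when 64 bytes arrive or the timeout expires
        if data:
            out_queue.put(data)

rx_queue = queue.Queue()
port = serial.Serial("COM3", 9600, timeout=1)   # hypothetical settings
threading.Thread(target=reader, args=(port, rx_queue), daemon=True).start()

while True:
    # Consumer/main loop: do other work, then poll the queue without blocking.
    try:
        data = rx_queue.get_nowait()
        print("received:", data)
    except queue.Empty:
        pass                            # nothing yet; carry on
    time.sleep(0.01)                    # stand-in for the rest of the loop's work
```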

SimpleSerialInWhileLoop.png


Jitter Analysis Toolkit - RJ DJ Separation example


Hi guys,

 

I am new to jitter analysis, so I'm not very familiar with the terminology.

 

I was looking at the RJ DJ Separation example and played around with it a little. 

The terms I wanted to clear up are "Original Spectrum" and "Periodic region".

The two terms are shown in the graph at the bottom of the front panel. 

JAT original spectrum.png

Can someone help me and give me a brief explanation of what each term means, and how they interact with each other?

I think this will clear up a lot for me!

 

Thank you in advance! Thanks will be in kudos!

 

Kind Regards,

Tim

Save string of data in array and control from array


Hi,

I have a question about arrays.

 

As shown in "Save array.PNG",

1) I need to save the item name and value into the 1st row when the "Save ASCII" button is pressed.

2) Save the item name and value into the 2nd row when the "Save Hexadecimal" button is pressed (the hex input will be converted to ASCII format before being saved in the array).

It will keep saving into the array as new input is entered.

 

Then I need to make this array into a control so that I can choose from it, like a combo box, as shown in "Combo box.PNG".

 

The VI is attached below.

 

Can anyone please help me to solve this? Thanks.
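
To illustrate the intended data flow in text form (a rough sketch only; the item names and hex value are made up, and in LabVIEW this would be an array of clusters feeding the string list of a combo box or ring):

```python
# Hypothetical sketch of the data flow described above: every save appends a
# (name, value) row, hex input is converted to its ASCII text first, and the
# accumulated names become the strings offered for selection (the combo box).
rows = []                                   # grows by one row per save

def save_ascii(name, value):
    rows.append((name, value))

def save_hex(name, hex_value):
    ascii_value = bytes.fromhex(hex_value).decode("ascii")   # e.g. "48656c6c6f" -> "Hello"
    rows.append((name, ascii_value))

save_ascii("Item 1", "Hello")
save_hex("Item 2", "576f726c64")            # "World"

combo_items = [name for name, _ in rows]    # strings to show in the selector
selected = combo_items[1]                   # the user picks one
print(selected, "->", dict(rows)[selected]) # look up the stored value
```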

NINE hours to download and install LV 2018 + std Drivers?!


Guys.  What is up here?

 

I'm on an older i7 with 8GB of RAM and a pretty decent SSD.

 

2 hours to download 18 GB of install files (just LV, std drivers, Ultiboard), and the install portion has been running for over 6 straight hours - it's only 52% done with DAQmx!!

 

Yesterday a similar install (only installed DAQmx + VISA) took over 5 hours.

 

The time to install LV2018 is long, but the DRIVERS are obscene.  What is going on?!

Create MQTT Server in labview?!


Hi, 
I can't find a free server module for sending my data over MQTT from LabVIEW to my client. Does one exist?
Thank you for your help.
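
For comparison, here is roughly what the pieces look like outside LabVIEW: a separate broker (Mosquitto is a common free one) relays the messages, and the application itself only needs a publisher client. A minimal sketch assuming the paho-mqtt Python package (1.x-style constructor) and a broker on localhost:

```python
# Sketch of the MQTT pieces outside LabVIEW, assuming the paho-mqtt package
# (1.x-style constructor) and a free broker such as Mosquitto running on
# localhost:1883. The broker relays messages; the application only publishes.
import paho.mqtt.client as mqtt

client = mqtt.Client()
client.connect("localhost", 1883)                       # hypothetical broker address
client.loop_start()                                     # network handling in the background

client.publish("lab/data/temperature", "23.5", qos=1)   # topic and payload

client.loop_stop()
client.disconnect()
```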

Array last element display if scrollbar is on


Hello,

Is there a way to have an array of N elements with a scrollbar but prevent the (N+1)-th element from showing?

This is what happens when you add a scrollbar:

2018-05-31 11_11_30-Test Selector.lvclass_Test Me.vi Front Panel on UniMove.lvproj_My Computer _.png

... and this is what I would like (MS Paint work):

Array.png

 

Thank you for any suggestions!

Labview 2018, Python, x64


Hi,

LabVIEW 2018 supports Python: http://zone.ni.com/reference/en-XX/help/371361R-01/glang/python_pal/

In http://www.ni.com/product-documentation/54295/en it is written that python36.dll is used.

I have Miniconda 3.6 x64 installed.

If LabVIEW calls python36.dll directly (not via the network), do I need LabVIEW 2018 x64?
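
As a quick sanity check (a generic snippet, not specific to the LabVIEW Python node), you can ask a Python installation for its own bitness; a DLL loaded in-process always has to match the bitness of the calling process:

```python
# Generic check of a Python installation's bitness; run it with the Python
# that LabVIEW is configured to use. 64 means a 64-bit python36.dll.
import platform
import struct

print(struct.calcsize("P") * 8, "bit")   # pointer size in bits: 32 or 64
print(platform.architecture())           # e.g. ('64bit', 'WindowsPE')
```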

 

Up to now I have used LabVIEW 32-bit, because not all toolkits are available for x64: http://www.ni.com/product-documentation/52818/en/

But this document is not up to date; the LabVIEW Advanced Signal Processing Toolkit now supports 64-bit: http://www.ni.com/pdf/manuals/375335d.html#64bit

 

Peter

Spectrum of DAQ received signal


So I have transmitted a chirp signal to a low-pass filter using the DAQ Assistant and wanted to analyze the spectrum of the received signal. Although the received signal looks fine in the time domain, why am I not able to get its spectrum? Is there a different approach for getting the spectrum of a signal when it is received through the DAQ?
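
In case it helps to see the bare math outside LabVIEW, here is a rough sketch (NumPy, a hypothetical 10 kS/s rate and a stand-in sine instead of the real chirp) of turning acquired samples plus their dt into an amplitude spectrum; an FFT-based spectrum always needs those same two ingredients, the samples and the correct dt:

```python
# Minimal sketch of turning acquired samples into an amplitude spectrum,
# assuming NumPy, a hypothetical 10 kS/s rate and a stand-in sine wave in
# place of the real chirp. The frequency axis comes from dt = 1/fs.
import numpy as np

fs = 10_000.0                           # sample rate of the DAQ task (assumed)
t = np.arange(0, 1.0, 1.0 / fs)
samples = np.sin(2 * np.pi * 500 * t)   # stand-in for the acquired data

spectrum = np.abs(np.fft.rfft(samples)) / len(samples)
freqs = np.fft.rfftfreq(len(samples), d=1.0 / fs)

print(f"dominant component near {freqs[np.argmax(spectrum)]:.1f} Hz")
```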


Please delete

falcon connection with myrio


I need some help with how to connect the Falcon haptic device to a myRIO.

The error shown was: "put the code in the real time target".

 

Filtering an array of waveforms


Greetings

I have a 1D array of waveforms which were acquired from strain gauges using a cDAQ-9174 and a pair of NI 9237's.

The waveforms are displayed on a graph, configured to update as a strip chart.

The PC on which I created the application displays the strains as smooth curves, which is what I want. However, when the built application is installed and run on a different PC (of a much lower specification), the strain plots are messy, as if there's noise superimposed on the signals.

I'm not concerned about getting a perfect strain representation shown on the destination PC because it's just for a demonstration.

So, I've tried to add a low-pass filter to my VI to remove the noise which, for some reason, appears on the destination PC. The attached screen capture shows my attempt at this. However, when I run the VI I get an error at the subtraction operation (in the centre of the screen) which says that the two waveforms have different dt values.

Also attached is a screen capture of my front panel which, on the left, has indicators for the waveforms showing their dt values. One of them is 0.000620 and the other is 1.000.

Am I going about implementing the filter correctly?

Any other ideas how I may fix the apparent noise on the graph on the destination PC?
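
For what it's worth, here is the same idea written out in text form (a rough SciPy sketch with made-up numbers, not your VI): the filtered trace is produced directly from the original samples, so it automatically shares the original dt, and only then is the subtraction meaningful. One common cause of a dt of 1.000 is a waveform built from a plain array without the original dt wired in, which might explain the mismatch.

```python
# Rough SciPy sketch with made-up numbers: the filtered trace is computed
# from the original samples, so it automatically shares the original dt,
# and only then does subtracting the two make sense.
import numpy as np
from scipy.signal import butter, filtfilt

fs = 1613.0                      # roughly 1/0.000620 s, the dt on the front panel (assumed)
t = np.arange(0, 2.0, 1.0 / fs)
strain = np.sin(2 * np.pi * 2 * t) + 0.05 * np.random.randn(t.size)   # stand-in data

b, a = butter(N=4, Wn=10.0, btype="low", fs=fs)   # 4th-order 10 Hz low-pass
filtered = filtfilt(b, a, strain)                 # same length and dt as the input

noise = strain - filtered        # valid only because both share the same dt
print("residual RMS:", np.sqrt(np.mean(noise ** 2)))
```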

Many thanks

 

Occupied resource error with DAQmx channel generation.


Hi guys,

Long story short: I'm trying to modify my LabVIEW code so that I can run a stepper motor in both continuous and single-stepping mode. I have created a finite state machine to deal with both cases, only I get this really irritating error when transitioning from continuous mode to single-stepping mode. Here is the continuous-mode state.

Screenshot (86).png

Sorry, the image is a bit large to fit the whole thing. To the far left, before the Start Task, a continuous pulse train is applied to a counter pin on the NI ELVIS II, which sends the signal to the step pin of the stepper motor driver IC. The number of samples per channel is then set before the task is started; the loop then executes and the stepper motor steps continuously at the prescribed frequency. When I press the "Halt continuous stepping" button the loop stops, so one would expect it to stop the task and clear the channel before the next state, "single", gets sent to the shift register.

 

The image below is the single state of the FSM:

Screenshot (87).png

And so when I go to hit the "Step" button I immediately get the message shown below:

Screenshot (88).png

That is the reason why I put that flat sequence before sending the "single" state to the shift register; I thought it might ensure that the channel was cleared before executing the next stage and creating the new channel. I have no idea how to fix this. When I ran it in highlighted execution mode it executed properly when I pressed Step, so why does it say that the resource is busy when run at normal speed? Also, I did give the task a name, so I really don't know why it said the task was unnamed.
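
Not a fix for the LabVIEW diagram itself, but here is the same task lifecycle written out with NI's nidaqmx Python package (counter name and frequencies are hypothetical), just to spell out the ordering the hardware needs: the continuous task has to be stopped and cleared before the single-step state creates a new task on the same counter.

```python
# The task lifecycle spelled out with NI's nidaqmx Python package
# (counter name and frequencies are hypothetical). The counter is only free
# for the "single" state once the continuous task has been stopped AND
# cleared, which the with-block guarantees on exit.
import nidaqmx
from nidaqmx.constants import AcquisitionType

def continuous_stepping(freq_hz):
    with nidaqmx.Task("ContinuousSteps") as task:     # task is cleared when the block exits
        task.co_channels.add_co_pulse_chan_freq("Dev1/ctr0", freq=freq_hz, duty_cycle=0.5)
        task.timing.cfg_implicit_timing(sample_mode=AcquisitionType.CONTINUOUS)
        task.start()
        input("press Enter to halt continuous stepping...")
        task.stop()
    # leaving the with-block clears the task and releases Dev1/ctr0

def single_step(freq_hz):
    with nidaqmx.Task("SingleStep") as task:          # safe to reuse the counter now
        task.co_channels.add_co_pulse_chan_freq("Dev1/ctr0", freq=freq_hz, duty_cycle=0.5)
        task.timing.cfg_implicit_timing(sample_mode=AcquisitionType.FINITE, samps_per_chan=1)
        task.start()
        task.wait_until_done()

continuous_stepping(200.0)
single_step(200.0)
```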

 

If anyone could offer me an explanation for this error in order to help me debug it I'd greatly appreciate it. 

Polar pict background for x-y graph


Hi,

I want to use a polar plot as the background for an X-Y graph. I will use the X-Y cursor to identify coordinates and convert them to polar (r, theta). I can't seem to find a way to co-locate the origin of the plot picture with the origin of the X-Y graph. Picture controls are kind of confusing, it seems.
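
The cursor-to-polar part at least is straightforward; a tiny sketch (the origin offsets are placeholders for wherever the picture's centre ends up relative to the graph's origin):

```python
# Tiny sketch of the cursor-to-polar conversion; the origin offsets are
# placeholders for wherever the picture's centre sits relative to the
# graph's origin.
import math

def to_polar(x, y, x0=0.0, y0=0.0):
    """Convert an X-Y cursor position to (r, theta) about origin (x0, y0)."""
    dx, dy = x - x0, y - y0
    r = math.hypot(dx, dy)
    theta = math.degrees(math.atan2(dy, dx))   # degrees, measured from the +x axis
    return r, theta

print(to_polar(3.0, 4.0))   # (5.0, 53.13...)
```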

 

Please see my attempt so far.

 

Thanks,

Chris

GUI sizing problem after windows (10) display timeout

$
0
0

I have a user interface developed in LabVIEW 2017 (32-bit) that is bundled into an exe that runs on Windows 10. It was previously in LabVIEW 2009 and running on Windows 7. I mass-compiled it and grew the GUI to take advantage of a new/larger monitor. Now, each time Windows goes to sleep (not sure if that's the right term), i.e. the display goes to the screen saver and I re-enter my login information, the GUI is all distorted. It's larger than the monitor, and I have to scroll up/down/sideways to get to the various controls.

The code used to have some sizing attributes enabled, where the edges of certain controls were dependent on the edges of some bordering controls. I inherited this code, and I think that was in there to allow resizing, but it had some unpredictable behavior. I 'think' I commented all of that behavior out, as it wasn't allowing me to resize/grow the GUI. Any thoughts on what may be causing this distortion?

Labview 2015 Installation > Runtime ERROR


Hi All,

 

I'm trying my hardest to get LabVIEW 2015 installed onto a machine, but I'm getting the same issue across 3 separate PCs now, so there must be something fundamental I'm missing.

 

I get all the way to the point of gathering the files from the web-based installer before it falls over, stating that the application has requested the runtime to terminate in an unusual way (see attached).

 

Anyone familiar with what I'm experiencing?

Our LabVIEW guru is on vacation, which has resulted in me losing a bit of hair, and gradually my marbles, this week.

 

Thanks In Advance

 

 


Labview 2015 executable "Control Could Not Be Loaded" in windows 10 but works in windows 7


Any suggestion is welcome.

 

I use LabVIEW 2015 on Windows 10 and compiled an exe file for a non-LabVIEW Windows 7 computer. It had been working fine for years until I recently upgraded that computer to Windows 10. The same exe file can be launched and run without showing any error message, but the areas where we used to type in numeric data suddenly cannot be loaded. The run-time engine we use is LVRTE2015_f3Patchstd.

 

I tried a few things:

1. Uninstall & reinstall the same run-time engine.

2. Repair it.

3. Uninstall and install the 64-bit version of the engine.

4. Install the Windows 10 SDK.

5. Enable the ActiveX server when building the exe.

6. Build an installer instead of an exe file.

7. Check NI USI in additional installers.

None of them has worked so far.

 

Any ideas? Highly appreciated!

Intermittent 88709 "task aborted or device removed" error


Hello All,

 

I'm befuddled. The code I've attached (ugly as it is) worked fine for months. Suddenly I'm getting the error 88709 "task aborted or device removed" occasionally at the DAQmx "stop" command. My first question is what are some common reasons why tasks might get aborted? Nothing is getting physically removed, so that's my first suspicion. Right now I don't have error messages for the DAQmx sub VIs wired, though I'm not sure if that would help since the whole VI is getting stopped. The only pattern I've noticed is that I can sit and watch for hours and not see the error. When I leave the VI running by itself (which is pretty much the whole point) I come back to the error screen after a seemingly random amount of time. 

 

For a little background on what I'm trying to do: the code is supposed to (and has in the past) run an experimental apparatus through a number of different operating conditions, gathering data for each set until the temperatures stop changing (as determined by a subVI), then moving on to the next set. A for loop steps through an array of clusters, each entry holding the values for an individual experiment, and starts the equipment. Then a nested while loop gathers data and evaluates whether the temperatures have stopped changing. Right now I initialize/reinitialize the DAQmx tasks inside the for loop, then start and stop the task inside the while loop (see attached). I do this so I have access to the measured values to evaluate my termination criterion, and so I can take a pressure measurement every so often as well (the other entry in the case structure). Right now I self-test the cDAQ chassis once at the beginning of the whole set of experiments, since I've had issues in the past with it not working if I don't. Perhaps I should be self-testing on each for-loop iteration?
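
For reference, here is roughly that structure written with NI's nidaqmx Python package (channel names, rate and threshold are all made up). One variation worth noting: this sketch starts the task once per experiment and only reads inside the inner loop, rather than starting and stopping the task on every iteration.

```python
# Roughly the structure described above, written with NI's nidaqmx Python
# package (channel names, rate and threshold are made up). This variant
# starts the task once per experiment and only reads inside the inner loop,
# instead of starting/stopping the task on every iteration.
import nidaqmx
from nidaqmx.constants import AcquisitionType

experiments = [{"setpoint": 40.0}, {"setpoint": 60.0}]    # stand-in for the array of clusters

for exp in experiments:                                   # outer for loop: one entry per experiment
    with nidaqmx.Task() as task:                          # task is cleared automatically at the end
        task.ai_channels.add_ai_voltage_chan("cDAQ1Mod1/ai0:3")
        task.timing.cfg_samp_clk_timing(10.0, sample_mode=AcquisitionType.CONTINUOUS)
        task.start()

        previous = None
        while True:                                       # inner while loop: wait for steady state
            samples = task.read(number_of_samples_per_channel=10)
            current = [sum(ch) / len(ch) for ch in samples]
            if previous and all(abs(a - b) < 0.01 for a, b in zip(current, previous)):
                break                                     # readings have stopped changing
            previous = current

        task.stop()                                       # explicit stop before the task is cleared
```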

 

Any suggestions are extremely appreciated.

 

NI-Max Crashing MacOS High Sierra 10.13.4


I am running on MacOS High Sierra 10.13.4, LabVIEW 2017 and NI-VISA 2018 but I'm unable to use VISA. Whenever I try and open NI-MAX, it pops up initially before crashing. Similarly, if I try and open up example code for instrument drivers it crashes LabVIEW. I have reinstalled all applications (as recommended on another forum post) but can't resolve this issue. Any ideas? 

Record measurements from serial device and DAQ with common clock


What would be the best way to go about recording two signals, one of which is analog input through a National Instruments DAQ, and the other is through serial VISA access to a USB device, with a common clock?

 

Here is a more detailed description -- I'm still very new to LabVIEW so I apologize if I'm omitting necessary details, please let me know and I'll include them in a reply:  I have a 4-channel NI DAQ that my group has previously used to sample and record two analog signals. One input comes from a CT scanner and indicates whether the beam is on or off.  The other is the output of a pressure to voltage transducer.  We need both of these signals to have the same sample clock so that we can determine what the pressure measurements were when the beam was on.  This was very straightforward when both signals were coming through the DAQ, however we've recently replaced the transducer with a newer model that connects by USB (OMEGA PX409-USBH).  The transducer has no provisions for timing.  It can be accessed through VISA or as a .NET object.

 

I am able to access the new transducer through LabVIEW using VISA read/writes, but I'm unsure about how to go about writing a VI that will record the analog input through the DAQ with shared time information.  I need to either force the VISA read to occur as close as possible to the read from the DAQ, or measure both signals independently but record timestamps with a common clock.  Is one of these approaches more straightforward/advantageous than the other?

 

From other threads I've read, this should be possible using software timing.  Synchronization accuracy to within 10 ms would be ideal, but really anything under 100 ms or so would be acceptable.  In the VI, we are just writing the two signal measurements to a file that can be post-processed later.

 

Would the producer/consumer design pattern (http://www.ni.com/white-paper/3023/en/) with 2 loops to measure the signals and a 3rd to synchronize/time stamp and then write to file be the way to go here, or am I completely confused?
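
For illustration, here is a rough software-timed sketch of that three-loop idea in Python (the device reads are mocked and all names are hypothetical): two producers stamp their readings against the same monotonic clock and push them onto a queue, and one consumer writes them to a file for post-processing.

```python
# Rough software-timed sketch of the three-loop idea (device access is mocked
# and all names are hypothetical): two producers stamp their readings against
# the same monotonic clock and push them onto a queue; one consumer writes
# them to a file for post-processing.
import queue
import random
import threading
import time

samples = queue.Queue()
t0 = time.monotonic()                        # common software clock reference

def producer(source_name, read_fn, period_s):
    while True:
        value = read_fn()                    # stand-in for a DAQmx read or VISA query
        samples.put((time.monotonic() - t0, source_name, value))
        time.sleep(period_s)

def consumer(path):
    with open(path, "w") as f:
        while True:
            t, source, value = samples.get()         # blocks until a sample arrives
            f.write(f"{t:.3f},{source},{value}\n")
            f.flush()

threading.Thread(target=producer, args=("beam", lambda: random.choice([0, 1]), 0.05), daemon=True).start()
threading.Thread(target=producer, args=("pressure", lambda: 14.7 + random.random(), 0.05), daemon=True).start()
threading.Thread(target=consumer, args=("log.csv",), daemon=True).start()

time.sleep(2)    # let it run briefly for the sake of the example
```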

 

Thanks in advance.

Replace an array row with in place element structure


I have a 2-D array that is three elements wide and many elements long. I would like to use an In Place Element Structure to replace one three-element row of this array. It's not cooperating: it insists on having an index for every dimension, which seems to make this impossible. Is there a way?

Replace row problem.PNG

What I wanted to do
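
For comparison, the array-language version of the intent (a NumPy sketch, not the In Place Element Structure answer) is just an assignment into one row:

```python
# NumPy version of the intent: assign into one row of a 2-D array in place,
# without copying the rest of the array.
import numpy as np

data = np.zeros((100, 3))          # many rows, three elements wide
data[42, :] = [1.0, 2.0, 3.0]      # replaces row 42 in place
print(data[41:44])
```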
