Channel: LabVIEW topics

Dynamic control of AI input range for 8 AI channels.


Hi everyone!

I am trying to sample voltage on 8 AI channels using an NI-9205. All the channels are connected differentially and all are sampling "at the same time".

The challenge is to be able to change the AI input range in some channels to either i) gain resolution or ii) avoid oversaturating the ADC at a certain point in time.

 

I have been able to achieve this using the attached VI, but I think it slows down the data sampling rate. Is there a more efficient way to dynamically set the AI range on multiple channels? At a 30 kHz sampling rate with 100 samples per channel to collect, each loop should take approximately (8 channels × 100 samples) / 30,000 Hz ≈ 0.027 s, i.e. about 27 ms. Unfortunately, with the way my VI is arranged, it takes about 100 ms.

Is there a way to speed it up?

 

Channel AI range.jpg
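
For anyone who would rather not open the attachment, here is roughly what each loop iteration does, sketched in Python with the nidaqmx package (the device name, ranges, and the per-iteration task rebuild are placeholders standing in for the VI's logic):

```python
import nidaqmx
from nidaqmx.constants import AcquisitionType

def read_block(ranges):
    """Rebuild the task with per-channel ranges, then read one block.

    Tearing the task down and re-creating it on every range change is
    what I suspect pushes each loop from ~27 ms to ~100 ms.
    """
    with nidaqmx.Task() as task:
        for ch, (lo, hi) in enumerate(ranges):  # 8 differential channels
            task.ai_channels.add_ai_voltage_chan(
                f"cDAQ1Mod1/ai{ch}", min_val=lo, max_val=hi)
        task.timing.cfg_samp_clk_timing(
            30_000, sample_mode=AcquisitionType.FINITE, samps_per_chan=100)
        return task.read(number_of_samples_per_channel=100)

data = read_block([(-10.0, 10.0)] * 8)  # e.g. all channels at +/-10 V
```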

Cheers!


DAQmx Task Questions


These are probably Sir Jeff B questions, but anyone who is a DAQmx guru feel free to chime in.

 

An aside for any newbies reading the post, in case you did not know the knights'/soon-to-be-knights' specialties, here are some:

  1. Jeff B - Knight of DAQmx and fearless predictions
  2. Altenbach - Knight of optimization
  3. GerdW - Knight of quick replies and quick wit
  4. Bob Schor - Knight of channel wires, report toolkit, and include your %&#(* code
  5. Hooovahh - Knight of TDMS, xnodes, and cheerful responses
  6. rolfk - Knight of LabVIEW, Windows, Linux, and internals. Answers questions like poetry: beautiful to read, but hard to understand.

Anyway, I need to stop digressing. Here is a summary of my program; I cannot include the code itself due to company policy.

 

I have built a general-purpose DAQ program for analog input voltage tasks. The user hooks up a DAQ to the computer, usually through USB, and starts my program. The program dynamically queries the DAQ and finds the allowed voltage ranges, sampling rates, etc. The user then chooses how to run the DAQ by changing the voltage range, sampling rate, number of channels, etc. Note we are in an R&D environment; the task is not static and may change from day to day or even hour to hour. (Note I am running the DAQmx device in continuous acquisition mode; basically my program is like a tape recorder.) The compiled exe is set to allow multiple instances, so multiple programs and devices may be running at the same time.

 

So previously, whenever the user changed a DAQ parameter, I would STOP, then CLEAR the DAQmx task, and start a new DAQmx task. However, after updating to DAQmx 17.6.1 and running the program on 32-bit systems, I would sometimes get an "unable to allocate resource" error, or a "not enough resources" error. Something like that.

 

So I decided to change my program somewhat, based on advice in the forums here. Please correct the following where I am mistaken, as this is what I understood from the forums:

  1. Reusing a DAQmx task is better than starting a new one. Reusing a task is possible under the following conditions:
    1. Adding channels to the task
    2. Changing the sampling rate
  2. Once a DAQmx task property is set, it is impossible to unset it.
    1. For example, I can reuse the task if the user decides to save the data in Log and Read mode. However, if the user saves the data in Log mode only, and then tries to switch back to Log and Read mode, any properties that were set in the task for Log mode only are not erased, and I get an error.
  3. In order to reuse the DAQmx task, I use the following sequence (Snap1.png):

     The DAQmx Stop has to come right after the Abort VI, otherwise any modifications to the task do not take effect. After the Abort VI the task is in an unstable state, and according to the help file only Start or Stop will put it back in a stable state (see the sketch after this list).

  4. If I need to start a new task, I use the above followed by a Clear Task VI.
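
Since I can't post the actual code, here is the reuse sequence from item 3 sketched with the Python nidaqmx API (the rate change is just one example of a reconfiguration; names are placeholders):

```python
from nidaqmx.constants import AcquisitionType, TaskMode

def retune_task(task, new_rate):
    """Reuse an existing task instead of clearing and re-creating it."""
    task.control(TaskMode.TASK_ABORT)  # abort the in-progress acquisition
    task.stop()                        # Stop right after Abort -> stable state
    task.timing.cfg_samp_clk_timing(   # now the modification takes effect
        new_rate, sample_mode=AcquisitionType.CONTINUOUS)
    task.start()
```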

 

So I guess my questions are:

  1. Is there anything wrong in my logic here?
  2. Are there any other situations where a task can be reused?
  3. I cannot entirely get around closing and starting tasks; is that always going to be a problem?

Thanks to anybody who has read this long post. Kudos to you.

 

Cheers,

mcduff

National Instruments Windows Event Viewer "MsiInstaller"


I have a Windows 10 64-bit computer. Is it normal that my Windows event log shows a daily product reconfiguration of all National Instruments components? If it is, then it must be a new feature, because I don't see it on my Windows 7 laptop. I have attached a screenshot.

It happens every day around the same time, for hundreds of NI components, and lasts about 10-20 seconds.

Thank you.

 

TDMS writing 2 tasks to SD card on compactdaq: failures


I am trying to stream continuous data from 2 tasks to an SD card on a cDAQ-9137.  I thought this was easy, until it wasn't.  I'll list the hardware below, and then what I'm doing, and hopefully someone has insight.

 

Problem:

Using 2 TDMS streams for long-term continuous logging to an SD card fails over time on a cDAQ-9137 using the SD card bay.

 

Hardware:

9137 (running WES7, LV 2016, various 32 GB SanDisk SD cards, 2 GB RAM)

9223 (2 channels at 100 kHz, Task A)

9223 (4 channels at 10 kHz, Task B)

9223 (4 channels at 10 kHz, Task B)

9223 (4 channels at 10 kHz, Task B)

 

What I did:

All tasks were set up using the DAQ Assistant, and then I used the 'Generate DAQmx code' option to get the tasks defined.  That said, I have the same problems with long-term writing even when just using the DAQ Assistant-generated helpers in a while loop.  I've attached the task setup for Task A (fastDAQ.vi) and Task B (slowDAQ.vi).

 

I define two tasks, one for 100 kHz × 2 channels (Task A) and one for 10 kHz × 12 channels (Task B).  I set both for continuous logging to a TDMS file, Log Only mode, with a 100 MB buffer (started smaller but ramped that up hoping it would help).  Initially everything worked fine (short, hour-long tests); then, when trying to do longer runs (10 hours), I went from completing an entire run to shorter and shorter acquisition times before getting the dreaded 200279 buffer error.

 

I then started breaking up the files using the 'samplesperfile' TDMS property to break files at roughly 1 hour.  That did not make any difference: random 200279 crashes still happened, ranging from 30 minutes to several hours of run time.  I was then hopeful that using 'filepreallocationsize' combined with 'samplesperfile' would help.  Here I set filepreallocationsize = samplesperfile = 360009728 for Task A and 36003840 for Task B (these values equal about 1 hour).  I think it does help to do this, but the next problem I run into is that the pre-allocation only happens for the first file (up to the samplesperfile value).  The next TDMS file created (continuous acquisition) does not appear to be pre-allocated.  It hasn't crashed yet, but I know it will.  I can see on the disk that when the initial file was created it was allocated at the full size; then, when TDMS incremented to the next file, the size started at zero and increases as data is written to the card.
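
For concreteness, here is the logging configuration described above, translated into a Python nidaqmx sketch (assuming task_a is the already-created 100 kHz task; the file path is a placeholder, and the sample count is the Task A value from above):

```python
from nidaqmx.constants import LoggingMode, LoggingOperation

SAMPS_PER_FILE_A = 360_009_728  # ~1 hour at 100 kHz x 2 channels

task_a.in_stream.configure_logging(
    "D:\\taskA.tdms",                # placeholder path on the SD card
    logging_mode=LoggingMode.LOG,    # Log Only: DAQmx streams straight to disk
    operation=LoggingOperation.CREATE_OR_REPLACE)
task_a.in_stream.logging_samps_per_file = SAMPS_PER_FILE_A
task_a.in_stream.logging_file_preallocation_size = SAMPS_PER_FILE_A
```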

 

To give an idea of the file sizes involved: for Task A, 1 hour = 1.4 GB, and for Task B, 0.84 GB.

 

I definitely have flash problems on the SD card and have tried different types of brand-new cards.  Writing two streams on this system without pre-allocation fails over time, and without pre-allocation I can see how writing two streams to flash could be a big problem.  I have also tried FAT32, exFAT, and NTFS formatting without any noticeable difference.  Using Performance Monitor, I can see where the problems come in: streaming works fine up to a point, then the drive queues get high, then 200279 kicks in.  I chalk all this up to me not combining LabVIEW and flash the right way.

 

Labview questions:

1) Is there any way to use both samplesperfile and filepreallocationsize such that every file in a continuous acquisition is pre-allocated (not just the first one)?

2) Is there some more optimal way to write to SD cards in this 2-task scenario?

 

If anyone has a recommendation for different storage or known SD cards that are good to use I would appreciate the advice.  I'm starting to wonder if I should plug a hard drive into the USB port and give that a try.  Thanks for any advice.

cDAQ voltage four channel LED control - LED stays on in final repeat


I have a four-channel LED system where I'd like to control LED intensity and on-time for each LED separately, with a specific number of iterations of this stimulus (see attached diagram).

 

We're using an AO module on a cDAQ system to control each individual LED, and zero-frequency sine waves to set up the LED voltages (snippets attached); each LED also has a specific 0 V pre- and post-stimulus time. The LEDs need to share the same timing so that each waveform executes at the same time, and the stimulus needs to be repeated several times (controlled from the front panel and a loop). I've set up a subVI that ensures all the waveforms are of equal length (MakeWavelengthEqual(SubVI)), as well as one subVI to generate a digital output for the duration of the LED on-time (TrigOutGen(SubVI)) to trigger our data acquisition equipment.
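
In rough Python/NumPy terms (purely illustrative; the level, timings, and sample rate are placeholders), each channel's waveform looks like this:

```python
import numpy as np

def led_waveform(level_v, on_s, pre_s, post_s, fs=1000):
    """A 'zero-frequency sine' is just a constant level, padded with 0 V."""
    return np.concatenate([
        np.zeros(int(pre_s * fs)),         # 0 V pre-stimulus
        np.full(int(on_s * fs), level_v),  # LED held at level_v while on
        np.zeros(int(post_s * fs)),        # 0 V post-stimulus
    ])

wave = led_waveform(level_v=2.5, on_s=1.0, pre_s=0.2, post_s=0.2)
```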

 

The VI creates these waveforms and outputs them to each individual channel, while also repeating them. However, I keep having an issue where the LED outputs don't quite correlate with the waveforms I've set up, and the final LED output repetition creates a situation where the LED stays on even after the VI completes its run.

 

I'm new to LabVIEW, so troubleshooting these issues is very difficult for me. I get no errors, the waveforms look correct when plotted, and the task is set up to be cleared outside the loop, so I'm not sure where the problem with the final output occurs. I'd really appreciate any help anyone can provide.

how to initialize many controls


Hi

I have a state machine, and in the init state I am putting all the controls in my app (many, more than 50) into a big cluster that I pass into a shift register and through the states. This init state is ugly with all the controls there and takes up a lot of real estate. Is there a different way to do this?

Thanks

open cv installation


Hi,

I want to install the OpenCV toolkit for LabVIEW 2014, but I don't know how to install it.

Please help me.

Thanks

Measuring twist of a drive shaft using two counters


I would like to measure the drive shaft twist when the engine is started by a dyno. I have two encoders mounted on the two ends of the shaft, and would like to use two counter/timers of an NI 6363 to do the job.

 

The drive shaft is soft, with its two ends isolated by rubber elements.

 

At the beginning, the engine and dyno are stationary, so neither end of the shaft is moving. During the start, one end (dyno) moves first, and the other end (engine) follows.

 

I start the two counter acquisitions at the same time using an AI trigger.

 

The problem I am facing is that I do not know how to measure the time difference between the first pulses of the two counters.

 

If I could get the absolute time values of the edges of the encoder pulses, everything else would be straightforward, but I cannot find this option.
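
To illustrate the kind of first-edge time difference in question: DAQmx has a two-edge separation measurement that times an edge on one signal against an edge on another. A hedged sketch in Python nidaqmx terms (counter and terminal names are placeholders, and whether this fits the triggered setup here is an open question):

```python
import nidaqmx
from nidaqmx.constants import Edge, TimeUnits

with nidaqmx.Task() as task:
    ch = task.ci_channels.add_ci_two_edge_sep_chan(
        "Dev1/ctr0", min_val=1e-6, max_val=10.0, units=TimeUnits.SECONDS,
        first_edge=Edge.RISING, second_edge=Edge.RISING)
    ch.ci_two_edge_sep_first_term = "/Dev1/PFI0"   # dyno-end encoder pulse
    ch.ci_two_edge_sep_second_term = "/Dev1/PFI1"  # engine-end encoder pulse
    print(task.read())  # seconds between the two first rising edges
```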

 

Can someone advise if this is possible?

 

Thanks.

 

Ian 

 

 


Multiple independent error "streams"? Good or bad idea?


I'm working on a project that involves running through a file with a list of commands/procedures that are to be performed on a "widget". The easiest way to imagine this is to imagine a non-linear manufacturing facility where raw blocks are the units to be operated on and they can go to various machines for various steps along the way to becoming the finished product.

 

So there's a main coordinator that handles when machines are available, sends the commands to move items to a machine, tells the machines to start their jobs, and waits for the machine drivers to report done/error/whatever.

 

In my mind there are two, maybe three, levels of errors. Traditionally, I think of the error functions in LabVIEW as relating to LabVIEW errors (problems with the software). In this project, it's quite possible to receive errors from a machine (say you tell the machine to run a program, like a CNC program, but the file isn't found). I would want that handled, and likely logged, differently than a software error. For instance, I would not want lots of sections of code bypassed in this case the way code is bypassed with error/no-error case structures.

 

So should I have two separate error "streams"? Should I use something other than the typical LabVIEW error cluster, even though the cluster is adequate for the level of information that would be needed?

Am I thinking of this all wrong? Is there a better way to handle this on the same wires as software related errors?
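
To make the two-stream idea concrete, a minimal sketch (hypothetical types, not LabVIEW): keep the two kinds of errors as distinct types, so a machine fault can't silently short-circuit code the way a wired-through software error does:

```python
from dataclasses import dataclass

@dataclass
class SoftwareError:
    """LabVIEW-style error: downstream code should be bypassed."""
    code: int
    source: str

@dataclass
class MachineError:
    """Process-level fault: log it and re-route the widget, but keep
    the coordinator and the rest of the code path running."""
    machine: str
    job: str
    message: str
```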

How to make NI myDAQ student version log data from its DMM to an array or graph?



 

So the issue is that if I use the analog input (AI) port, it has a limit of 10 volts, while the DMM has 60 volts. So this NI myDAQ can definitely register higher voltages, but only through its DMM port, and I need to use the DMM interface. I see no way to record that value in LabVIEW to get a chart or to save it in an array.

I appreciate any help. 

How to implement a set of commands/properties, with various I/O data types, for an instrument driver?


I have this machine that is run by the manufacturer's software. The software is basically designed for a user to create various programs (kind of like a scripted procedure list) and then be able to run them. However, they also provide a COM/ActiveX interface for their customers who want to integrate the machine into a larger, more automated/robotically controlled platform. So instead of a user standing by the machine and running it as a standalone device, something more like a SCADA/job scheduler program could have a robot bring something to the machine and tell it to perform some task.

 

The automation interface includes a fairly wide range of properties and methods: a mix of read-only/write-only/read-write properties, and methods that take anywhere from zero to three input parameters. These inputs and outputs also span a number of different data types.

 

My first crack at making a driver for this was to make a VI for every property and method I could foresee wanting, and then make a polymorphic VI of all of them, so that I could just plop it down wherever I needed to interact with the instrument, select the property/method I wanted, and have the connector pane adjust to the appropriate number of inputs/outputs and data types.

 

However, as I've progressed further, I've come to realize I'm calling these from lots of different places/VIs, and I'm thinking I really need these commands to be serialized, to ensure the order in which they are sent and to avoid trying to send more than one request simultaneously. My assumption is that a polymorphic VI isn't really a single VI that can ensure no more than one of its member subVIs is called at a time. Basically, when you make your selection on the poly, it's the equivalent of having placed that particular subVI on the block diagram, right? So if the poly is called by VI A and VI B at the same time, with different poly selections, two different commands could be sent simultaneously to the ActiveX interface, right?

 

Is there just a better way to go about this? I'm hung up on how to deal with the different data types and numbers of inputs/outputs. While a variant could solve the data-type problem, I can't wrap my head around how to make it so the driver knows how to wrap/unwrap and route variants, once they're "inside", to the appropriate ActiveX method/property, or how to deal with outputs for properties that have them. I also want to address the issue of ensuring these commands are called sequentially no matter which VI calls them.
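
The kind of serialization in question is a single worker that owns the instrument reference and executes queued requests one at a time; a rough Python sketch (connect_to_machine and the method names are hypothetical stand-ins for the ActiveX session):

```python
import queue
import threading

class MachineDriver:
    """Serializes all property/method calls through one worker thread."""

    def __init__(self):
        self._requests = queue.Queue()
        threading.Thread(target=self._worker, daemon=True).start()

    def call(self, name, *args):
        done = queue.Queue(maxsize=1)
        self._requests.put((name, args, done))
        result, err = done.get()        # block until the worker ran our request
        if err:
            raise err
        return result

    def _worker(self):
        com = connect_to_machine()      # hypothetical: open the COM/ActiveX session
        while True:
            name, args, done = self._requests.get()
            try:
                done.put((getattr(com, name)(*args), None))
            except Exception as e:      # hand the error back to the caller
                done.put((None, e))
```

Because every caller funnels through call(), requests go out strictly in arrival order, regardless of which VI issued them.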

 

Anyone got any pointers or strategies for this?

LabVIEW has encountered a problem and needs to close.


Hello, I recently installed LabVIEW 64-bit on my Windows 10 PC, and whenever I try to launch it I get the license wizard to activate it. I enter my email and password, and a few moments later a dialog appears that says "LabVIEW has encountered a problem and needs to close." I have already installed the DAQ (17.00) drivers, but I really don't know what could be wrong. Any ideas?

Thanks

Software Simulator to test machine code


When I am developing code for machine control, the usual challenge is to test the code for functionality before loading it onto the actual machine. This is much safer, and almost the full sequence can be safely tested out. For this I currently use a hardwired simulator with the following functionality:

1. 32 toggle switches to simulate digital inputs (DI)

2. 32 LEDs to check digital outputs (DO)

3. 16 potentiometers to simulate analog inputs (AI)

4. 4 DPMs to verify the analog outputs (AO)

But the problem is that this hardwired simulator is bulky and not portable.

 

What I want to do instead is develop a simple LV interface with all of the above in software and run it on another computer or tablet, then send just the variables from that interface to the PC running the actual code. In essence, the variables will replace the DAQmx I/O signals.

 

Doing this in software has many advantages: customized controls for each project, portability so I can test code on the move, etc., without having to contend with the large hardwired box.

 

Has anybody done such a thing? One of the challenges I have is how to pass the variables, e.g. via USB. (Software Simulator.jpg)
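
A sketch of the data exchange this needs (hypothetical format, host, and port; the post mentions USB, but JSON over TCP is one simple option between any two PCs):

```python
import json
import socket

# Snapshot of the simulated I/O, mirroring the hardwired box:
snapshot = {
    "di": [False] * 32,  # 32 toggle switches
    "do": [False] * 32,  # 32 LEDs (written back by the machine code)
    "ai": [0.0] * 16,    # 16 potentiometers
    "ao": [0.0] * 4,     # 4 DPM readings (written back)
}

# Send the snapshot to the PC running the actual machine code:
with socket.create_connection(("machine-pc", 6340)) as s:  # placeholder host/port
    s.sendall((json.dumps(snapshot) + "\n").encode())
```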

Problem when using a .NET assembly that dynamically loads another assembly


We have implemented a series of VIs that use a number of .NET assemblies. One of these .NET assemblies (call it assembly A) dynamically loads and uses another .NET assembly (assembly B). All assemblies are in the same folder. These VIs are distributed to our customers as components to use in their own VIs.

 

I have discovered that the behaviour of LabVIEW differs depending on whether the top-level VI is run from within a LabVIEW project or not. If the VI is run from a project, the code works fine. However, if the VI is loaded and run stand-alone, it gives a .NET exception indicating that the dynamically loaded assembly cannot be found. I have used Process Explorer to see what is happening and can see a difference in behaviour:

  • When run as part of a project, LabVIEW loads the assemblies directly from their installed location.
  • When run as a stand-alone VI, each assembly is loaded from a shadow directory (a different directory for each assembly) under the user's AppData folder. Presumably, as LabVIEW does not know about the dynamic dependency, it has not copied assembly B into the same folder as assembly A. If I manually do this, then the VI runs successfully.

Why does LabVIEW exhibit this difference in behaviour? Is there any way we can force LabVIEW to load the assemblies directly from their installed folder when running a stand-alone VI? Or, alternatively, is there any way to ensure that LabVIEW knows about the dynamic dependency?

 

This is using LabVIEW 2014.

 

Thanks.

Map Input Size


Hi, I am wondering how I can reduce the size of the Map input shown below. I can successfully increase its size, but I cannot find a way to reduce it (the number of rows and columns). Thanks in advance.


cRIO shows an error while deploying the program to it.


Hi everyone,
My project's aim is to predict the class of an input signal. I have a cRIO and wrote RT code based on the FPGA interface. I acquire analog data and pass it to a machine learning VI (via the AML SVM class VI reference). This subVI is called in the main RT VI, but during the deployment process the error below is displayed. I need help with this problem. Thanks.


System devices: cRIO-9074, NI 9215 ADC module.

 

The error:

Analytics and Machine Learning.lvlib:AML SVM.lvclass:aml_Deploy Model.vi loaded with errors on the target and was closed.
LabVIEW: Failed to load shared library niamlsvm.*:nisvm_predictValues:C. Ensure that the library is present on the RT target. Use either MAX to install NI software or FTP to transfer custom libraries to the RT target.
LabVIEW: Failed to load shared library niamlsvm.*:nisvm_predictProbability:C. Ensure that the library is present on the RT target. Use either MAX to install NI software or FTP to transfer custom libraries to the RT target.

Labview memory leak


I have built a large test application using LabVIEW 2014 and TestStand 2014 for a customer, to calibrate the tools they make that contain accelerometers and magnetometers.  The hardware of the test system includes a motion simulator, thermal chamber, magnetometer, and power supply.  All of those, including the UUT, communicate with the software over a VISA connection of some sort.  The problem I am having is that when the software runs, memory handles continue to increase until the application's memory limit is reached and an out-of-memory error message is displayed.

Using tools from Sysinternals, I have found that semaphore handles grow throughout the test; in particular, a semaphore handle named \Sessions\1\BaseNamedObjects\SemBlocking continues to grow.  Investigating further, it seems that the handles grow when a VISA device is being accessed.  During portions of the test where a VISA connection is not being accessed, the handles don't seem to grow.  The VISA sessions are opened at the beginning of the test and are closed at the very end.  This is a 16-hour test, and it takes a while before the memory limit is reached.  I have also used the Desktop Execution Trace Toolkit, which showed everything working properly as far as memory handling is concerned.  So I am wondering if anybody else has seen anything like this before?  Are there any other tools I can use that could show me what is generating these semaphore handles?

 

Thanks

Control ESCON controller using Labview and NI DAQ 6212


Hello,

 

I am new to LabVIEW and Maxon, and I do not know where to begin. I have an NI DAQ 6212, a Maxon ESCON 70/10 motor controller, and an EC motor. I have to get/send data to/from the DAQ so that the ESCON controller can parse the data received or sent and do something to the motor.

Is there any add-on or library to install for LabVIEW? I installed the SDI plugin, but I can't figure out how to use it.

 

Thank you

Inverse Fourier Transform


Hi,

I have a basic question about using IFT.vi.

I'm analysing some data: I take the FFT of some signals (accelerometers), then pick a particular peak and perform the IFT to obtain a time-decay signal. I've tried both IFT Real and IFT Complex, and they do produce different results, but I'm not sure which one I should use. Could anyone clarify when to use one and when the other?
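
For reference, here is the operation in NumPy terms (the signal and band edges are arbitrary stand-ins). The key difference: a real inverse transform assumes the spectrum is conjugate-symmetric, i.e. it came from a real signal, and returns a real array, while a complex inverse transform returns a complex array:

```python
import numpy as np

fs = 1000.0
t = np.arange(0, 1, 1 / fs)
x = np.exp(-3 * t) * np.sin(2 * np.pi * 50 * t)  # stand-in accelerometer decay

X = np.fft.rfft(x)             # one-sided spectrum of a real signal
band = np.zeros_like(X)
k = int(np.argmax(np.abs(X)))  # bin of the peak of interest
band[max(k - 2, 0):k + 3] = X[max(k - 2, 0):k + 3]  # keep a narrow band

decay = np.fft.irfft(band, n=len(x))  # real-valued time-decay signal
```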

Thanks!

Does "VISA Find Resource.vi" need administrative rights?


I developed an application with LabVIEW 2017 SP1 where I use "VISA Find Resource.vi".

One of my customers has problems when the user who runs the application has no administrative rights on the PC.

 

The OS is Windows 7 x64 with SP1.

 

Does this VI require administrative rights to work?
