Channel: LabVIEW topics
Viewing all 67022 articles

for loop


Hi all,

 

I would be grateful if someone could help me solve this issue.

 

I have a FOR loop that gives me a sequence of numbers from zero to 1 million (1,000,000), as shown in the attached file. The problem is that this FOR loop makes my code run too slowly. Can anyone help me replace this FOR loop with something more efficient that gives me the same results? Thanks in advance.
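Without seeing the attached VI this is only a guess, but a classic cause of a slow 0-to-1,000,000 loop is growing the output array one element per iteration (Build Array inside the loop). The contrast, sketched in Python:

```python
# Hedged sketch: a common cause of a slow 0..1,000,000 loop is growing the
# output array one element at a time (like Build Array inside a LabVIEW loop).
# Preallocating — in LabVIEW, an auto-indexed output tunnel, which
# preallocates for you — avoids repeated reallocation and copying.

N = 1_000_000

# Slow pattern: repeated concatenation reallocates the array every iteration.
# seq = []
# for i in range(N):
#     seq = seq + [i]        # O(n) copy each time -> O(n^2) overall

# Fast pattern: build the whole sequence in one preallocated pass.
seq = list(range(N))         # O(n); analogous to an auto-indexed FOR loop
```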

 

Regards


Error 200428 When Creating Task in DAQmx Base


I've written a basic program that captures a voltage signal, plots it, and writes it to a file, but when I run it, I keep getting this error:

 

"Error -200428 occurred at Value passed to the Task/Channels In control is invalid."

 

I have attached a copy of my VI. Initially, I did not have the "DAQmx Base Create Task" VI in the block diagram and I was getting the above error. After searching through some documentation, it seemed that the issue was that I needed to put the "Create Task" VI in when using DAQmx Base. Unfortunately when I did that, I still kept getting the same error.


Is there something I am doing wrong here?

 

Here is my software/hardware information:

 

Operating Software: Mac OS X 10.11.14

Hardware: NI myDAQ

Software: LabVIEW 15.0 (64-bit)

Drivers: NI DAQmx Base 15, NI VISA 15.5, NI-488.2 15.5, NI-DAQmx for myDAQ on Mac OS X (all for OS X)

Error -2367 occurred at External Model in Control & Simulation Loop

Hi, I am doing a Multisim and LabVIEW co-simulation. Every time I run it, I get an error that the transient time point calculation did not converge, possibly due to a convergence issue in the Multisim model. I have tried a lot of things, but the error doesn't go away. I have rebuilt the Multisim model many times, but the error persists. However, the model works fine if run in Multisim alone. Please help me in resolving this issue. Regards,

How to configure build so that a dynamically loaded lvclass in a built EXE can find dependent .NET assemblies

I have an application that uses .NET assemblies. It works in the development environment, when the lvproj is open, but does not run when built into an EXE. I'd like to know if there's a way to ensure that an lvclass that gets loaded at runtime will find the .NET assemblies (DLLs) it requires. In my development environment, I've followed the rules in the LabVIEW Help for selecting the locations of the .NET assemblies (DLLs); i.e., the .NET assemblies are in a subdirectory where the lvproj resides, and all the VIs that use the .NET control are added to the lvproj:

http://zone.ni.com/reference/en-XX/help/371361L-01/lvconcepts/net_defaults/
http://zone.ni.com/reference/en-XX/help/371361L-01/lvconcepts/loading_assemblies/

Here is my folder structure:

MyProjects\
    BFProject.lvproj
    ThorDotNETAssemblies\   (the .NET assemblies (DLLs) reside here)
    MySource\
        MyDrivers\
            ThorMotorDrivers\   (VIs that use the .NET control reside here)
        MySubsystems\   (the lvclasses reside here)
            1AxisMotor-generic.lvclass
            1AxisMotor-Thor.lvclass
            1AxisMotor-MForce.lvclass

I've added all the VIs in MyDrivers\ThorMotorDrivers that use the .NET control into the lvproj (auto-populating folder). I've also added all the lvclasses in MySubsystems to the lvproj – these classes have methods that call the driver VIs in ThorMotorDrivers. Now, in the development environment with the lvproj open, I can successfully run the methods of 1AxisMotor-Thor.lvclass. However, when I close the lvproj and open the lvclass by itself, it can't find the .NET control and the VIs are broken. The .NET container on the front panel says "Control could not be loaded". [When the class opens I can see that LabVIEW has to go through the search paths to find the DLLs, but it does not complain that it can't find them.]

For the built EXE I use an XML configuration file that specifies which of the 1AxisMotor lvclasses to load at run time, based on the hardware configuration; i.e., I can use any one of the child classes of 1AxisMotor-generic.lvclass. However, when the application goes to load 1AxisMotor-Thor, it seems it can't find the .NET assemblies – the LabVIEW run-time engine throws error 1498, saying the library (1AxisMotor-Thor.lvclass) has errors and needs to be fixed. [I've also seen Error 7, that the 1AxisMotor-Thor.lvclass file cannot be found, during my trial-and-error attempts at fixing the problem.]

Is there a way to ensure that the dynamically loaded lvclass will be able to find the .NET assemblies? Is there a way to test that the lvclass will work without having to make the build, which currently takes 3.5 hours? I'm using 32-bit LabVIEW 2014 SP1 on Windows 7 64-bit.

In the build spec in the lvproj I've specified that the entire ThorDotNetAssemblies folder always be included in the build, and that the destination for all contained items is the data folder (support directory) – my thinking here is that the LabVIEW application builder may not be finding all the dependent DLLs, so I make sure to include them all. [I did use Dependency Walker to try to find out which DLLs specifically are needed, but then just decided to include them all.] In the build spec, all the ThorMotorDriver VIs and all the lvclasses are always included and are specified to be put into the .exe destination. I've also set the search paths in Tools > Options > Paths > VI Search Paths to include the folder where the .NET assemblies are.

The .NET assemblies come from ThorLabs – their Kinesis 32-bit software for 64-bit Windows:

https://www.thorlabs.com/software_pages/ViewSoftwarePage.cfm?Code=Motion_Control&viewtab=1
https://www.thorlabs.com/Software/Motion%20Control/KINESIS/Kinesis-Labview.pdf

The Kinesis software has been installed on the LabVIEW build computer and on the computer that runs the EXE. I've also tried adding the ThorMotorDrivers folder and the ThorDotNetAssemblies folder to the lvclass, but that didn't work; in fact, I ended up with a corrupted lvclass file – LabVIEW couldn't even open it.

Based on NI forums feedback (e.g., "Built exe not referencing correct .NET assembly locations": http://forums.ni.com/t5/LabVIEW/Built-exe-not-referencing-correct-NET-assembly-locations/m-p/2488890/highlight/true#M759880, and "Creating drivers using an unsigned .NET framework and distributing through VIPM": http://forums.ni.com/t5/LabVIEW/Creating-drivers-using-an-unsigned-NET-framework-and/m-p/3213904#M933057), I've also tried creating exe.config files to specify the version of the CLR to use – no luck. I did verify that the .NET framework versions are all installed – 2 through 4.5. [How does one set up an exe.config file when there are a number of other .NET assemblies corresponding to other hardware, each potentially targeted to run on a different version of the CLR?]

Worst-case scenario is that I abandon the .NET assemblies and re-do all the driver VIs using ThorLabs' ActiveX control; but I'd rather use the .NET assemblies, since I've already completed the code and tested it … in the development environment anyway. Thanks.
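One avenue, beyond the CLR-version exe.config attempts described above, is using the exe.config to extend the assembly probing path so the CLR also searches the support directory beside the EXE. This is standard .NET configuration, not LabVIEW-specific; the file name and the `data` folder below are assumptions matching the support-directory destination described above:

```xml
<?xml version="1.0" encoding="utf-8"?>
<!-- MyApp.exe.config, placed beside the built EXE (file name assumed). -->
<configuration>
  <runtime>
    <assemblyBinding xmlns="urn:schemas-microsoft-com:asm.v1">
      <!-- Tell the CLR to also probe this subfolder for assemblies. -->
      <probing privatePath="data" />
    </assemblyBinding>
  </runtime>
</configuration>
```

Note that `privatePath` only covers subdirectories of the EXE's own folder, so the assemblies would need to land under the EXE for this to apply.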

How can I find the equation used by a .vi?


I have a LabVIEW project with a few different VIs, and each of them has its own equation. These equations allow the software to report flow and pressure based on voltages.

 

When I open one of these VIs, which are named things like "Volt to pressure" or "Amp to pressure", I see two boxes. One is labeled "Volt" and the other "Pressure"; pretty straightforward. I enter a voltage in the VI and I can get it to spit a pressure back to me. I need to know how to access and modify the equation being used.

 

I am very new and inexperienced with LabVIEW, so any help I can get would be great. Thank you.

serial communication with VISA


I am trying to communicate with a power supply via a serial connection. The communication works great in HyperTerminal and PuTTY, but I cannot seem to get it to work in LabVIEW. It times out when I try to read the data.

 

The HyperTerminal settings are all basic and it works fine:

9600 baud rate

8 data bits

no parity

1 stop bit

no flow control

 

Sending "VOLT?" with Ctrl+J for the linefeed returns the voltage at the output of the UUT.

 

I am wondering if I am sending the command incorrectly in LabVIEW. Any help would be great. I have attached the VI; this is my first try at serial comms, so it is pretty basic and I am sure I am missing something.
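The usual culprit when HyperTerminal works but LabVIEW times out is the termination character: in HyperTerminal, Ctrl+J supplies the linefeed by hand, so in LabVIEW the "VOLT?" string needs "\n" appended before the VISA Write (and the read side needs the termination character enabled, which VISA Configure Serial Port does by default for 0xA). A hedged sketch using the third-party pyvisa library, which mirrors what the LabVIEW VISA nodes do — the resource name is an assumption:

```python
# The key detail from the HyperTerminal session: the supply expects a
# linefeed (Ctrl+J, '\n') terminator. Without it, the supply never sees a
# complete command, never answers, and the read times out.
TERMINATOR = "\n"
command = "VOLT?" + TERMINATOR   # what must actually go out on the wire

# Hedged pyvisa sketch (third-party library, port name assumed):
# import pyvisa
# rm = pyvisa.ResourceManager()
# ps = rm.open_resource("ASRL1::INSTR")     # hypothetical COM port
# ps.baud_rate = 9600
# ps.data_bits = 8
# ps.write_termination = "\n"               # appended to every write
# ps.read_termination = "\n"                # read stops at the linefeed
# print(ps.query("VOLT?"))                  # write + read in one call
```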

Software development of dual axis galvo system scanner for laser marking


Hello everybody

 

I'd like to develop a program/GUI for the synchronization between a laser and a dual-axis galvo scanner system. A dual-axis galvo system is composed of two mirrors, X and Y, which reflect the laser along the X and Y axes respectively. My initial idea is to choose the 'image' (square, logo...) that we want to mark on a surface with the laser and, from this image (.bmp), extract an array of {x, y, N} parameters for each pixel:

 

- x and y are the coordinates of each pixel, which will later correspond to the laser position

- N is the number of laser pulses, which is linked to the value of the pixel at (x, y). Therefore the number of laser pulses for each pixel/laser position is between 0 and 255.

 

I know the mathematical formulas for the mechanical angles of the X and Y mirrors of the galvo scanner, which are linked to the x and y coordinates of the image field respectively.
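The pixel-to-{x, y, N} extraction step described above can be sketched briefly; here in Python with a nested list standing in for the bitmap (reading a real .bmp would need an imaging library, and the sample values are made up):

```python
# Minimal sketch of the {x, y, N} extraction. The grayscale pixel value
# 0..255 maps directly to the pulse count N; zero-valued pixels are skipped
# since they need no laser pulses.

image = [
    [0, 255, 0],
    [128, 0, 64],
]  # hypothetical 3x2 grayscale image (rows are y, columns are x)

points = [
    (x, y, n)
    for y, row in enumerate(image)
    for x, n in enumerate(row)
    if n > 0                      # skip pixels that need no pulses
]
# points -> [(1, 0, 255), (0, 1, 128), (2, 1, 64)]
```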

 

Unfortunately I'm a total novice in LabVIEW, and I have only basic knowledge of C++. Can you give me some clues for my project?

Thanks so much for your help!

 

Database id key from logs


Hi there,

 

I've been tasked with taking logs from different testers and inserting those log files into a central database.

 

I have been able to split out the data that I require, but I am having real issues with creating an ID key, now that I have split the log files into 2 separate tables.

 

The 1st table contains the main results (serial number, test type, unit type, duration, start date/time, and status).

The 2nd table contains the details of the tests performed, merged (concatenated) with the pass/fail result.

The two tables will be linked by this ID.
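A common pattern for the ID key is an auto-increment primary key on the main results table, whose generated value is then reused as a foreign key in the details table. A sketch using Python's built-in sqlite3 (the table and column names are assumptions; in MySQL the equivalents are AUTO_INCREMENT and LAST_INSERT_ID()):

```python
import sqlite3

# Illustrative schema/flow only — the real target is MySQL (HeidiSQL), where
# the same pattern uses AUTO_INCREMENT and LAST_INSERT_ID(); table and
# column names here are assumptions.
con = sqlite3.connect(":memory:")
cur = con.cursor()
cur.execute("""CREATE TABLE results (
    id INTEGER PRIMARY KEY AUTOINCREMENT,   -- the shared ID key
    serial_number TEXT, test_type TEXT, status TEXT)""")
cur.execute("""CREATE TABLE details (
    result_id INTEGER REFERENCES results(id),  -- links back to table 1
    step TEXT)""")

# Insert the main record first, then reuse its generated id for the details.
cur.execute("INSERT INTO results (serial_number, test_type, status) "
            "VALUES (?, ?, ?)", ("SN001", "burn-in", "PASS"))
result_id = cur.lastrowid                   # MySQL: SELECT LAST_INSERT_ID()
cur.execute("INSERT INTO details (result_id, step) VALUES (?, ?)",
            (result_id, "voltage check: PASS"))
con.commit()
```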

 

I am trying to achieve this on HeidiSQL (MySQL)

I have setup the ODBC for mysql and its linked to the database. 

 

Below is the sample code I'm working from, but I can't get it to work either.

Attempt 1.JPG

 

However, whenever I run this I just get syntax errors.

database1.JPG

 

Have I done something daft without realising it?

 

 

Kind Regards


How to reduce precision of TDMS file with express vi?


I notice that the express VI that saves binary data saves it with 8-byte precision.

That ends up being about the same size as the text version. (My data usually fits in less than 2 bytes.)

Is there an easy way to get the express VI to reduce its precision from 8 bytes to 2?
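As far as I can tell the express VI works in DBL, so one common workaround (an assumption about your setup, since the VI isn't shown) is to scale and convert the data to I16 and write it with the lower-level TDMS Write function, which stores whatever numeric type you wire in. The size arithmetic, sketched in Python with the standard struct module:

```python
import struct

# TDMS stores the wire representation of the type you hand it: a DBL sample
# costs 8 bytes, an I16 sample costs 2. Converting before writing (in
# LabVIEW: a To Word Integer conversion ahead of the TDMS Write, typically
# after scaling to preserve resolution) shrinks the file roughly 4x.
# The x1000 fixed-point scale factor below is an assumption.
sample = 1.2345                                   # measured value as a double
as_dbl = struct.pack("<d", sample)                # 8 bytes on disk
as_i16 = struct.pack("<h", round(sample * 1000))  # 2 bytes, fixed-point
```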

 

Calculating Historical Rate of Change From Continuous Real-Time Input


Hi everyone,

 

I'm trying to create a subVI that will calculate the change, over the last 10 minutes, in a value collected in real time.

 

The way my main VI is running now, a new value is calculated once every second from data streamed via DAQ at 1 kS/s. Since everything is in a giant while loop, the data is constantly being overwritten by the next wave of data being read. I was wondering what a good approach is to "remember" what the value was 10 minutes prior, so that I can compare it to the current value once every minute. I know shift registers can be used to pass information between iterations of a while loop, but we're talking about hundreds of thousands of iterations here. I don't know if there's some way to conditionally read/write to something outside the while loop as the program is running.

 

Basically, what I need is a program that monitors a value calculated in real time. This program also needs to know, once every minute, how much the value has changed over the last 10 minutes (the rate of change over the last 10 mins). The way I've done it is to rewrite an array at the top of every minute with the new value until it reaches 10 values, and then it starts back at index 0, writing over the old values. This way I always have a snapshot of the values, once per minute, for the last 10 mins. However, to "retain" this array for the other 59 seconds of every minute when I am not writing a new value to it, I just have it constantly rewriting the same array to itself, and I believe this may be causing my program to develop a memory leak and throw a buffer overflow error after a few minutes.
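The scheme described above maps onto a fixed-size circular buffer that is only touched once per minute; the other 59 seconds it can simply be left alone in a shift register, with no rewriting. A hedged Python sketch of the idea (the values are made up; in LabVIEW the analogue is a fixed 10-element array held in a shift register and updated with Replace Array Subset once per minute):

```python
from collections import deque

# Fixed-size history: append once per minute; when full, the oldest value
# falls off automatically, so memory stays constant forever.
history = deque(maxlen=10)         # last 10 one-minute snapshots

# Hypothetical once-per-minute values; in the real VI these come from the
# DAQ-derived value sampled at the top of each minute.
for value in [100, 102, 105, 103, 110, 111, 115, 117, 116, 120, 125]:
    history.append(value)
    if len(history) == history.maxlen:
        rate = history[-1] - history[0]   # change over the buffer's span
```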

 

Is there some more effective way of approaching this problem? Also do you think that this is the reason behind my buffer overflow error? My main VI was running slowly, but without errors, before I added this subVI so that's why I'm suspecting this is the problem.

 

I have attached my subVI. I know it's kind of a giant mess and difficult to decipher, so please let me know if you need clarity on anything.

 

Thanks!

Call Library Function errors and ag8614x driver


I have inherited a test program written in LabVIEW 7.1 running on a PC running Windows XP.

 

We don't have the CDs for LabVIEW 7.1 but do have a license for LabVIEW 8.5.1

 

I am developing on a PC running Windows 7 Pro 32-bit.

 

The test program makes use of VXI Plug and Play driver from Agilent.

I am able to locate that driver on the Keysight website and install the driver.

 

Unfortunately, this newer version is not the one used by the test program.

The old version VIs uses the Call Library Function which calls ag8614xb.dll

The new version VIs uses the Call Library Function which calls ag8614x_32.dll

 

I thought it would be a simple matter of changing all the calls to the new DLL.

 

But errors occur.

 

For example  ag8614x VXIPnp Error Converter.vi 

I initially get Call Library Function Node: library not found or failed to load.

 

So, I configured the Call Library Function Node to find the DLL at C:\VXIPNP\WinNT\bin\ag8614x_32.dll and checked the "Specify path on diagram" checkbox.

 

I then get the error One or more required inputs to this function are not wired or are wired incorrectly.

The inputs are two integers and a string. I cannot figure out what is causing the error.

The function prototype is   int32_t ag8614x_error_message(int32_t arg1, int32_t arg2, CStr arg3);   

 

If I don't check the "Specify path on diagram" checkbox, I get the error message: Error loading "C:\VXIPNP\WinNt\bin\ag8614x_32.dll". A dynamic link library (DLL) initialization routine failed.
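For what it's worth, the CStr in that prototype is a caller-allocated output buffer, which may explain the wiring error: the string terminal needs an initialized string of sufficient size (or a minimum-size setting in the parameter configuration), not an empty string. A hedged Python ctypes sketch of the same calling convention, with the actual DLL call commented out since it needs the Agilent driver installed:

```python
import ctypes

# The VXIplug&play prototype
#     int32_t ag8614x_error_message(int32_t, int32_t, CStr)
# expects the *caller* to supply a writable char buffer as the third
# argument — the DLL fills it with the error text. Handing it an empty
# string gives the DLL a zero-length buffer to write into.
msg_buf = ctypes.create_string_buffer(256)   # preallocate 256 bytes

# Hypothetical call, assuming the driver DLL is present:
# dll = ctypes.CDLL(r"C:\VXIPNP\WinNT\bin\ag8614x_32.dll")
# status = dll.ag8614x_error_message(session, error_code, msg_buf)
# print(msg_buf.value.decode())
```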

 

 

 

LabVIEW: Memory is full


Hi everyone!

 

I implemented a full image processing algorithm in LabVIEW, using quite a lot of Matlab code imported through MathScript nodes. When I finished implementing the algorithm, I found that it works fine for a small number of pictures, but runs out of memory if I give it more than 25-30. My intention is to run it through a database of approximately 1200 pics, so the very small amount it can handle looks to be a pretty drastic problem for me. How can I make the algorithm scale better and handle hundreds or even thousands of pictures?

 

Here is my top-level VI, which simply runs through a folder containing the images and compares every one of them to a reference image. The expected result is an array of similarity scores between each image in the folder and the reference one.

TopLevelVI.JPG

I have two subVIs that I figured may cause the problem. The Retrieve Properties VI does most of the work: it takes the image as an input, calls lots of Matlab code from .m files, and outputs a binary template and mask for the image. The Hamming Distance VI then takes an image from the folder and the template and mask from the reference picture, runs the Retrieve Properties VI on the folder image, does some logical operations (XOR, AND) on the templates and masks, and outputs a double-precision score. This subVI is called for every image in the folder, and this is how it looks:

Hamming Distance.JPG

As I mentioned previously, the algorithm works fine for a small number of pictures, but when I give it more than 25-30 I get the error "Memory is full". I've read some posts on this issue, and most of them said to reduce the number of front panel elements, arrays, copies of arrays, and every large data structure that uses contiguous allocation, so I tried to do so. My only element on the front panel is the array with the final scores; however, I cannot manipulate the Matlab code too much, as it is not my code – I just have authorization to use it.

 

As a final piece of information that could be relevant, I should mention the size of the pictures: they are 150x200 grayscale pictures of around 22-25 kB each.

 

Does anyone have an idea how to make this algorithm work on a higher scale of pictures?

 

Searching for configuration files


I have a set of CSV files which I need to read on startup of my main VI, or of the executable built from my main VI and all its dependencies (either under Windows 7 or LabVIEW RT). Effectively, I need to search for these CSV files along a programmatic search path such as:

 

.\*.csv

..\config\*.csv

<project path>\config\

 

What is the recommended strategy for this and are there any VIs available for searching an arbitrary set of absolute and relative paths?
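One simple pattern is an ordered list of candidate directories, searched first-hit-wins; a hedged Python sketch (the function name and file contents are made up; in LabVIEW the same idea is a FOR loop over a path array using List Folder with a conditional stop):

```python
import tempfile
from pathlib import Path

def find_config_files(search_dirs, pattern="*.csv"):
    """Return matches from the first search directory that has any hits."""
    for d in search_dirs:
        hits = sorted(Path(d).glob(pattern))
        if hits:
            return hits
    return []

# Self-contained demo: a temporary tree stands in for .\ , ..\config\ , and
# <project path>\config\ — the search order is simply the list order.
root = Path(tempfile.mkdtemp())
(root / "config").mkdir()
(root / "config" / "limits.csv").write_text("vmax,12.0\n")  # hypothetical file

found = find_config_files([root, root / "config"])
```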

 

Also, how would this be handled in an AB (Application Builder) EXE build (e.g., can resource files be embedded in an EXE)?

 

For LabVIEW RT, I currently just have a deployed resource directory under c:\config. That works fine for RT, but it's a horrible way to do it on a Windows PC. I'm looking for a smarter approach.

 

Thanks,

 

XL600

 

 

How to prevent the numeric control button value changes when it is disabled and grayed out in labview


Hi,

In my LabVIEW code I'm using numeric controls and making them "Disabled and Grayed Out" at a particular point; after executing a few cases (Stacked Sequence Structure frames) I enable the controls again. During this period, if the user clicks the numeric control's buttons, I expect it not to accept the input event (i.e., it should not change its value), since I disabled and grayed out the control. But it does change the value if a user event occurs during this period.

 

I'm attaching a sample – Numeric Control Changes.vi – explaining this scenario.

 

And I have one more query: whenever I use the "Display Message to User" node or dialog nodes (e.g., the "One Button Dialog" or "Two Button Dialog" node), at run time a window pops up and the background UI freezes until the user clicks the "OK" or "Abort" button in the pop-up window. Is there any way to keep the UI from freezing in this situation, and is there any way to automatically close the pop-up window after a particular duration of time?

Can someone please answer these questions?

 

Thanks in Advance.

Siva.

U2702A master-slave


I have three U2702A scope modules in a U2781A chassis. I would like them all to trigger when a particular one triggers. It seems this is done via the master/slave settings in the chassis SSI configuration. I see where you can set this via the Agilent Measurement Manager tool, but does anyone know how to set it in LabVIEW? I do not see this capability in either their instrument driver or the IviScope driver.

 

On a side note, is there any way to send VISA writes/reads to an IVI-controlled device? I tried and it fails. It looks like the IVI session has it locked, so you cannot perform a VISA write to it. I think I know the command I need to send.


FPGA read write control node changing unrelated control when built and run as a startup rtexe


I'm targeting a cRIO-9066 running LabVIEW RT 2015 SP1. My application works well when run from the development environment, but when I build it and run it as a startup executable I notice some odd behavior. I was able to trace it down to one node:

request node.PNG

That's writing 1 into the previous control (I8)

previous control.PNG

That is, samples should be 9, but if I execute the request node (FYI, it's a latching boolean; I don't know if that's relevant), samples becomes 1.

 

This only happens when run as an RT executable.  Any ideas or suggestions to prevent random errors like this?

 

Other details which may or may not be relevant:

The Open FPGA Interface Reference opens the build spec (not a bitfile or VI).

The Open FPGA reference is in a functional global that is used in multiple places:

FPGA open func global.PNG

 

 

Remote Development Application Error


I am getting an NI LabVIEW Remote Development Application Error while running on a cRIO using LabVIEW FPGA software, see attachment.

 

What is the error?  Where is the Application Log?  I do not see anything in MyDocuments\LabVIEW Data\Remote Development directory.

 

Thanks,

 

Paul

 

NI-6009 USB - Stop Data Acquisition on one sensor but retain the last value it measured for calculations


Hi everyone, I'm a beginner, so I guess my question might be basic. I'm working on a very simple project that should automatically record the mass of an object with a micro load cell, and also record the water displaced, via a liquid level sensor, when the mass is placed into a circular container and submerged.

 

Both the load cell and the liquid level sensor are working and give a change in voltage when the load cell is pressed or when the liquid level sensor touches water. Then I just need to calibrate the load cell so I can scale the voltage into an actual weight or level measurement.

 

However, I also want to calculate the density of the object. Right now I have LabVIEW taking the current mass and level measurements, calculating the volume, and then dividing mass by volume. But the object's weight and the water displacement cannot be measured at the same time, which is required in order to calculate density.

 

The data acquisition is basically set to continuous measurement. I have one DAQ Assistant that takes in the signals from the level sensor and the load cell, and then I split the signal so that calculations can be done on each. Basically, what I would like is for LabVIEW to stop data acquisition from the load cell upon pressing some sort of stop button, but retain the last value it stopped at. This is necessary so it can still calculate the density after I take the object and put it into the water container to measure the liquid level. Is there some sort of LabVIEW block that allows this to happen?
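The latch-the-last-value part is typically a case structure feeding a shift register: while acquiring, the True case passes the fresh reading into the register; after the stop button, the False case just recirculates the stored value. The density arithmetic itself is plain geometry; a sketch with made-up numbers:

```python
import math

def density_from_displacement(mass_g, level_rise_cm, container_radius_cm):
    """Density (g/cm^3) from the latched mass and the water level rise in a
    circular container: displaced volume V = pi * r^2 * delta_h."""
    volume_cm3 = math.pi * container_radius_cm ** 2 * level_rise_cm
    return mass_g / volume_cm3

# Hypothetical numbers: a 100 g object raising the level by 1.0 cm in a
# container of radius 5 cm displaces pi * 25 ~ 78.54 cm^3 of water.
rho = density_from_displacement(100.0, 1.0, 5.0)
```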

 

Ignore the nested while loop in the top left, it doesn't do anything. 

 

Thank you in advance for any help.

 

Labview.PNG

Open VI Reference using an lvlib


Hello,

 

I am trying to use "Open VI Reference" to call a VI external to my executable. The problem is that in this VI I use some functions that are part of lvlibs that ship with LabVIEW itself, such as "Open Config Data.vi".

My external VI works fine as long as I use other functions, but when I try to use this kind of function, which belongs to such a library, the VI comes back broken when the executable runs, saying that it could not locate the function "Open Config Data.vi", for example.

 

Executavel.png

This is the VI that will be my executable; in this case it is just a function that calls my external VI and opens its front panel.

 

VIExterna.png

This is my external VI, using only LabVIEW functions; these functions belong to NI_LVConfig.lvlib.

 

ErroVIExterna.png

This is the error that appears when the front panel of my external VI opens.

 

Is there any way to include this library with the executable so that my external VI can locate it?

 

Thanks,

João.

USB Set Configuration is always set to 0


Hello,

 

I have a development board on which I am developing firmware for a USBTMC device. The device will contain 2 configuration descriptors. In LabVIEW, I issue a Control Out transfer to set the configuration to use, per the USB specification. Here is the code:

 

usbtmc_ctrl_out_setcnfg.PNG

 

The bmRequestType bit-map field corresponds to a Host-to-Device (0), Standard (00), Device (00000) request type. The request, 0x09, corresponds to SET_CONFIGURATION, and the value is the number of the configuration descriptor I wish to set.

 

When I run this code and observe it on my Ellisys USB analyzer, I see that the configuration value being set is 0 instead of 2. See below:

usbcap_setcfg_0.PNG

The highlighted area of the table shows what was actually sent over the bus to my device. This is problematic for two reasons. One, the code is not setting the desired configuration as expected. Two, instead of setting the default configuration (1), VISA is setting the configuration to 0, which translates as "not configured".
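For reference, the SETUP packet the code above intends can be written out byte for byte; comparing these 8 bytes with the analyzer capture shows exactly which field is being zeroed (if wValue arrives as 0, the Value input may not be reaching the VISA USB Control Out call as expected). A sketch using Python's struct module:

```python
import struct

# The 8-byte SETUP packet the host should emit for SET_CONFIGURATION(2):
# bmRequestType = 0x00 (Host-to-Device | Standard | Device),
# bRequest = 0x09 (SET_CONFIGURATION), wValue = 2 (desired configuration),
# wIndex = 0, wLength = 0; multi-byte fields are little-endian.
setup = struct.pack("<BBHHH", 0x00, 0x09, 2, 0, 0)
# setup -> b'\x00\x09\x02\x00\x00\x00\x00\x00'
```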

 

I'm using LabVIEW 2014 64-bit with NI VISA 14.0.

 

Can anyone give any guidance on how I might be able to properly set the configuration of my USBTMC device using LabVIEW and NI VISA?

 

Thanks!


