Channel: LabVIEW topics

7966R not detected in MAX


Hi,

 

I moved to a new computer running Windows 7 SP1 and reinstalled LabVIEW. Now my 7966R is not detected in MAX. Can somebody point me to the right driver? Please see the attached screenshots for the installed software and hardware.

 

Chassis: PXIe-1075

Interface card(s): 8370/71

LabVIEW versions: 2017,2018

 

This setup worked in my previous system.

 


ERR (-61017) in LabVIEW program


I have an error that causes my LabVIEW program to hang indefinitely once every few runs. I suspect it is related to FPGA startup, since I see error -61017 on the error output of the Open FPGA VI Reference node in my host program's block diagram. I was able to observe this error while single-stepping through the host program in debug mode. The only way to recover seems to be restarting the LabVIEW environment and running the program again. Since I need to run this host program smoothly from a custom C++ application, this error needs to be avoided. Any ideas on how to proceed would be much appreciated.

 

P.S.: The host program performs data acquisition through the FPGA from 6 channels (3 each from two NI-9239 analog input modules connected to the board). Please find the LabVIEW project, VI files, and screenshots of the observed error attached here.

Changing the Duty Cycle (PWM) with buttons


Hi everyone, this is Osman,

 

I am currently working on a project and the hardware side is now done. The only problem left is changing the duty cycle with buttons. Whatever I have tried and searched for, I have found nothing; I am not even sure whether my LabVIEW while loop is behaving correctly. Could you please help me with this?

This is the video of the software I want to make one: https://www.youtube.com/watch?v=zkiU2CZrSZA

I attached the .vi file to this post.

Inside the file, you will see some buttons; their tasks are as follows:

Top button: (Duty Cycle * 0) + 1 (which will make the duty cycle 255)

Up button: increase the duty cycle by one step

Down button: decrease the duty cycle by one step

Bottom button: (Duty Cycle * 0) (which will make the duty cycle 0)
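For reference, the intended button behaviour can be modelled in a few lines. This is an illustrative Python sketch of the logic, not LabVIEW code; the 0-255 range and step size of 1 are assumptions based on the description above:

```python
# Hypothetical model of the four-button duty-cycle logic (0-255 assumed).
MAX_DUTY = 255
STEP = 1  # assumed increment for the Up/Down buttons

def update_duty(duty, button):
    """Return the new duty cycle after one button press."""
    if button == "top":
        return MAX_DUTY                      # jump straight to maximum
    if button == "up":
        return min(duty + STEP, MAX_DUTY)    # clamp at the top
    if button == "down":
        return max(duty - STEP, 0)           # clamp at zero
    if button == "bottom":
        return 0                             # duty * 0
    return duty                              # no button pressed
```

In LabVIEW the same idea would live in a while loop with a shift register holding the current duty cycle, updated by a case structure driven by the buttons.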

 

Kind regards

NI USB 8451 communication with multiple devices of the same type


Is it correct that an NI USB-8451 can communicate with 8 slaves simultaneously using the I2C protocol? Currently I am using it to communicate with 2 slaves simultaneously: one is a TI battery-charger EVM and the other is a TI fuel gauge. For each of these I have to specify the slave address in LabVIEW using the I2C configuration property node. These slave addresses are provided by TI in the datasheets and are different for the two devices, so I am able to communicate with both slaves simultaneously.

 

Now, I want to communicate with four of each of these 2 devices, i.e. 4 battery chargers and 4 fuel gauges, for a total of 8 slaves. Since four of them share the same address, how do I distinguish between them during communication?
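For context, duplicate 7-bit addresses collide on a shared bus, so one common workaround is a hardware I2C multiplexer such as a TI TCA9548A between the master and the slaves. The sketch below is illustrative Python, not NI-845x API code, and the 0x70 mux address and channel assignments are assumptions:

```python
# Hypothetical sketch: distinguishing same-address I2C slaves with a
# TCA9548A-style multiplexer. Each charger/gauge pair sits on its own mux
# channel; writing a one-hot control byte routes the bus to that channel.

def mux_control_byte(channel):
    """One-hot control byte that enables exactly one mux channel (0-7)."""
    if not 0 <= channel <= 7:
        raise ValueError("TCA9548A has 8 channels")
    return 1 << channel

# On real hardware you would write this byte to the mux (assumed at 0x70)
# before each transaction, then address the slave as usual, e.g.:
#   select channel 3  -> write mux_control_byte(3) to 0x70
#   talk to gauge     -> normal read/write at the gauge's fixed address
```

With the NI-8451 the equivalent would be an extra I2C write to the mux before each configuration/read of a charger or gauge.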

 

P.S.: I'll attach a sample VI of communication with the fuel gauge.

IMAQ ImageToArray error 1074396080 while reading 16 bit RGB tiff


I want to get data from a large 16-bit RGB TIFF image from my camera (110 MB). The image appears in the image window, but IMAQ ImageToArray returns error 1074396080. If I select "Grayscale (U16)" at IMAQ Create, I get a black-and-white picture and data in the "Image Pixels (U16)" indicator without error. With IMAQ Create set to RGB (U64), the image is RGB and I can read 16-bit RGB pixel values under the image window.

I'm using LabVIEW 2012 SP1.

Wireless communication

Duplicate post. No need to post the same question twice.

LabVIEW 2012 DLL with Analog Video Generator


1.png 

I am building a DLL in C++ for use with the Analog Video Generator.

Support previously said LabVIEW 2018 is not supported, so we are working in LabVIEW 2012. We have confirmed that it works without problems when built as a LabVIEW 2012 application.

I built this into a DLL and tested it with a Visual Studio 2013 C++ console application, but I run into a problem during Initialize when getting the DeviceList. The VI used is mxVideoGenerator-pub Init.vi. It is located inside the Labview Test Project\AnalogVideoGenerator folder in the attached project folder. This folder is a copy of the folder created when installing NI's Analog Video Generator 3.1.

 

After building the DLL from LVTest.lvproj in the Labview Test Project folder, when the Initialize function is called from the attached ConsoleApplication1.vcxproj C++ project, a break is triggered inside the function (image 01).

01.Break.png

 

Execution stops inside a hash function. A heap-corruption message (image 02) then appears, and if I continue execution, it exits the function.

02.Heap 손상.png

 

If I check the value of the deviceList variable passed as an argument, I can see that the value has been filled in normally (image 03).

03.Watch.png

 

 

I also tried creating a DLL from LabVIEW 2012's One Button Dialog, and confirmed that it works normally.

Visual Studio 2010, 2013, and 2017 all show the same symptom, and I have reinstalled them.

 

 

NI 9423 - Frequency Read VI Help


Hello,

 

We are trying to read a 0-5 V speed sensor into an NI 9423 module in a cRIO-9178 chassis. We have debugged the hardware and verified with an oscilloscope that the sensor produces an output signal.

 

Before concluding that the module itself may be broken, we wanted to verify that our VI is correct.

 

Using reference designs, we put together a VI that should simply read what we believe is the correct channel on the module.

 

The problem is that our VI gives us nothing, and no LEDs on the module indicate that it is working.

 

I have attached a copy of the VI in the hopes that someone can tell us if we are doing something blatantly stupid.

 

Thanks,

MC


Select Function Text String Color


New to LabVIEW, so apologies in advance if this is a basic question. I have a Select function with a text string for the true/false cases. For the false case I would like the text string to be red instead of black (the true case). I attempted this by changing the color of the false string with the toolbar color palette, but it doesn't seem to work when I run the VI. What am I missing?

Memory leak??? or more likely not knowing proper cleanup


Hello

 

As the title suggests I think I have a memory problem.  I wrote a VI in LabVIEW 2017 to process data generated from a scientific instrument.  This VI does not interact with any hardware, so it's purely a software VI.  Unfortunately I cannot show the code because it is proprietary, but I can show some screenshots and I think describing the problem might be enough.

 

The VI lets a user load a data set into the program and interact with it: correct for noise, outliers, etc. It can also load multiple data sets together, which allows for better correction of noise and outliers. Each data set is roughly 2000 x 300 floating-point numbers, and when I load 300 or so data sets I have the following issues I'm hoping to address:

 

1) It takes a very long time to load.

Each data set is a text file that I load into an array with "Read Delimited Spreadsheet". I then display them on one graph and change the line color to black and the symbol to a hollow circle. 20 or so data sets take about 8 seconds to load, which is no problem; 300 data sets take forever. The loading part is probably as fast as it can be, but changing the line color takes almost as long as the loading, because I have to do it for every curve. Is there a way to set the line color and symbol once, before loading multiple curves, so that it applies to every new curve loaded?

 

2) After I clear the data, lots of memory is still shown as in use.

I think I'm clearing the data, but when I do, Task Manager still shows most of my RAM in use (my laptop has 12 GB of RAM). Only when I stop the VI does my RAM usage go back to normal. Is there something I should be doing to clear a variable and clear a graph other than writing an empty array to them? For reference, 300 data sets loaded as doubles take about 5 GB of memory.

 

Thank you in advance for any help!

 

1.png shows one data set loaded

2.png shows multiple data sets loaded

3.png shows multiple data sets after some processing

4.png shows my clear data function.

Network streams to pass data into a subVI


Hello All,

 

I'm using network streams to transfer data to my PC from a cDAQ-9132 with LabVIEW 2017. The data is a 2D array of strings. The transfer runs continuously, and once every 60 minutes a subVI is triggered and opens its front panel. At that time I don't want to view the data in my main VI; I want to view it in the subVI. What would be the best approach for this? I have tried passing the network-streamed data to a global variable and reading it in the subVI, but the VI slows down and I'm not able to see the live data properly. Can anyone please suggest a better method?

 

 

Regards,

Rishi

9995071465

Save 1d array for use later


Hi all,

 

I'm using a spectrometer VI that requires both a dark and a light reference before processing. The spectrometer itself outputs a spectrum as a 1D array. On some rare occasions I cannot take new dark and light reference spectra due to sample issues, and I would like to reuse the last references taken (probably from the last time the VI was run).

 

Basically, I need to save the reference 1D array somewhere it can be retrieved when the instrument is run again; I'm guessing written to a file that can be read back on demand. What would be the best way to write a 1D array to a file so that it can be converted back to a 1D array as efficiently as possible?

 

Or is there a better way to do it that I'm missing?

 

Thanks

Error while sending an email via SMTP


Hello, I am trying to send an email via SMTP using LabVIEW, but when the event that triggers the true case happens, I get an error saying that I have provided an invalid email and password combination. The thing is, I am 100% certain that both the email and password are correct for that account, because the addressee account actually receives the email. Does anyone have an idea what may cause this error?

TDMS data unreadable due to windows language settings??


Hello,

 

I have a strange issue. We have an executable created from LabVIEW that runs tests and creates a PDF report of each test. This exe is installed on about 30 to 40 computers, and on one of them I have an issue. In short, the way we work:

  1. We create a tdms file and fill it with information.
  2. Now we start a test and enter the data into the same tdms. We close the tdms afterwards.
  3. We open the tdms to create a report, hereby the information entered in step 1 is used to select the correct test and to be displayed in the program.
  4. Now the program tries to read the data to create graphs from it, and it ends with an error (20312).

 

I've checked a lot of things, such as read/write permissions, but that is not the case. The only difference I have found so far is the date format in the Windows language settings, but I sure hope that will not be the issue!

Anybody ever had similar issues and/or a solution to this?

 

With kind regards,

Jeroen

Monitor if Digital Input Doesn't Change State w/in Time Period


This program cycles a solenoid on and off for a user-defined period. I want to capture the time it takes for a digital input to turn ON and turn OFF, after a digital output comes ON or OFF (respectively).  In practice, I have a solenoid (digital output) that will be used to actuate a valve; a proximity sensor (digital input) will be used to verify stem travel.  I want to know:

   1. How long after the output turns on/off, that the input turns on/off, respectively.

   2. If the input never changes states during the cycle. 

 

I am able to follow the digital output and report the time it takes for the digital input to come on or turn off (following the output). Two things I can't accommodate yet: reporting a value if the input never changes during the cycle, and resetting the timers at the end of the cycle.
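The timeout rule being described can be stated compactly. The sketch below is an illustrative Python model of the decision logic only (not DAQmx code), and the event representation is an assumption:

```python
# Hypothetical model: given input transitions as (timestamp, state) pairs,
# measure the delay from an output edge to the first input edge that
# follows it, or return NaN if the input never changes within the cycle.
import math

def response_time(events, t_output, timeout):
    """Seconds from t_output to the first input transition within
    [t_output, t_output + timeout]; NaN if none occurs."""
    for t, _state in events:
        if t_output <= t <= t_output + timeout:
            return t - t_output
    return math.nan  # sentinel: input never changed during the cycle
```

A NaN (or any out-of-band sentinel) gives the report a well-defined value for the "never changed" case, and the per-cycle timer can simply be re-zeroed at each output edge.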

 


RAD Fails to deploy


I am attempting to deploy an image file that a vendor sent us for our NI system. The image was created by a RAD retrieve on their duplicate unit. The log says it is incompatible, and a "compare" shows the Original Image (local) as having no name, version, or software installed, and indicates that as the problem. I also performed a retrieve to capture a baseline image of my device. That succeeded, but it also fails the compare in an identical way.

 

Is it required that the computer I use to run RAD have the full LabVIEW suite the vendor had when they created the image? My understanding was that RAD is a way for deployment computers to avoid needing 10k+ of software just to push an image.

 

 

Deploy log:

 

Model    Model Code    Serial Number    MAC Address    IP Address    Host Name    App Image Name    App Image Version    Time    Result
sbRIO-9651    775E    01D2A6CB    00:80:2F:18:D7:39    10.152.xx.xx    AGS-1-fets            0%    The selected image is incompatible.
Make sure the image controller and backplane match the target.

 

compare.png

 

 

Need help creating multiple instances of Thorlabs APT ActiveX


I am building a wrapper for the Thorlabs APT ActiveX driver. I would like to create multiple instances of it so I can control multiple KDC101 controllers at the same time. I put the ActiveX control in a class and tried to call multiple instances of the class, but the reference value still returns the same number. I tried Automation Open as well, but still no success. Please let me know how I should fix this.

 

Thanks in advance.

Which is the more efficient way to feed a constant into a loop: wire or feedback node?


I'm writing (or rather rewriting) a big program, and I need to feed a minimum, maximum, and initial value into a loop to ensure any set value stays within those parameters (these are values for current, voltage, and temperature, in case anyone is curious). If I understand correctly, wires are always the most efficient way to transfer data, but stretching a bunch of wires across my program is going to look ugly, and I'm aiming for maximum readability (so I don't get called back to fix it after I graduate).

Here's the question:

Is there a large drop in efficiency if I use feedback nodes rather than wires? It would save me stretching wires all over the place, but I've been eliminating feedback nodes and shift registers where they aren't necessary, to try to streamline this thing.

Attached is a picture of an example program and the actual program. Any thoughts would be greatly appreciated. Thanks in advance!

Addendum: in the picture I have simple values for min and max, but in reality there will be some calculations there. Nothing fancy, but I'm trying to avoid doing that math on every loop iteration, unless it doesn't matter and I'm overthinking this.

What is the most efficient way to replace array subset in this case on LabVIEW FPGA?


Hello, this is a kind of quiz: what is the most efficient way to do the following on FPGA?

 

The VI attached to this post shows the approach I applied to what I would like to accomplish.

Pad Window.png

 

What I would like to accomplish is the following.  

  1. There is a U32 array with 64 elements.
  2. Specify an array index.
  3. Replace all values before the index with the value at that index.

Let's say I have the array [5, 6, 7, 8, 9]. If I specify index 3, that points to 8. The pointed-to value then replaces all values before the index, so the resulting array is [8, 8, 8, 8, 9]. If I specify index 1, the index points to 6, and the resulting array will be [6, 6, 7, 8, 9].
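In ordinary software terms the operation is a simple element-wise select. The sketch below is illustrative Python, not FPGA code; on an FPGA target the same idea maps naturally to a fixed-size for loop with an index comparison, which sidesteps the variable-size array functions that fail at intermediate file generation:

```python
# Hypothetical software model of the padding operation described above.

def pad_before_index(arr, idx):
    """Replace every element before idx with the value at idx."""
    pad = arr[idx]
    # Per element: keep arr[i] if i >= idx, otherwise output the pad value.
    # On FPGA this select can be replicated 64 times in parallel, one per
    # element, with no dynamically sized arrays involved.
    return [pad if i < idx else v for i, v in enumerate(arr)]
```

Because each output element depends only on its own position, the index, and one shared value, the whole array can be produced in a single cycle of combinational selects.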

 

What is the most efficient way to accomplish this in LabVIEW FPGA? I tried various array functions such as Delete From Array, Replace Array Subset, and Array Subset, but an Undefined Array Size error pops up during intermediate file generation.

 

If any smart people could share any idea, it would be really appreciated.  

 

Thanks much in advance.

 

Having Trouble Inserting into a Bit Data Type MySQL Database Column


Hello,

I am trying to insert data into a simple 3 column MySQL database table.

2019-01-23 12_12_35-MySQL Workbench.png

No matter what data type I send into the PASS_FAIL column, I get a "Data too long" error. I have tried Booleans, converting the Boolean to a 0/1 8-bit integer, and using "0" or "1" strings. What do I need to convert the data to?

BOOLEAN.pngINTEGER.pngSTRING.png

Specific Error Code: -2147467259

Specific Error Message: NI_Database_API.lvlib:Cmd Execute.vi->NI_Database_API.lvlib:DB Tools Insert Data.vi->Insert step_result.vi<ERR>ADO Error: 0x80004005
Exception occured in Microsoft OLE DB Provider for ODBC Drivers: [MySQL][ODBC 5.3(a) Driver][mysqld-5.1.47-community]Data too long for column 'PASS_FAIL' at row 1 in NI_Database_API.lvlib:Rec Create - Command.vi->NI_Database_API.lvlib:Cmd Execute.vi->NI_Database_API.lvlib:DB Tools Insert Data.vi->Insert step_result.vi
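"Data too long" on a BIT(1) column typically means the value arrived as a character string: the character '1' is a full 8-bit byte, which does not fit in 1 bit, whereas an unquoted numeric 0/1 (or the bit literal b'1') does. The sketch below is an illustrative Python helper for building such a statement; the SERIAL_NUM column name is an assumption, and real code should use parameterized queries rather than string formatting:

```python
# Hypothetical sketch: build an INSERT for a MySQL BIT(1) PASS_FAIL column.
# The key point is the unquoted numeric 0/1 for the bit column; a quoted
# "0"/"1" string is a binary string and triggers "Data too long".

def insert_result_sql(serial, pass_fail):
    value = 1 if pass_fail else 0          # plain integer, not "0"/"1"
    return (
        "INSERT INTO step_result (SERIAL_NUM, PASS_FAIL) "
        f"VALUES ('{serial}', {value})"    # note: no quotes around {value}
    )
```

In the LabVIEW Database Connectivity Toolkit, the equivalent is to feed the insert an integer 0/1 (not a string) for the BIT column.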

 

Thanks for the help!


