Channel: LabVIEW topics

NI Max on Redhat 7.5 Linux Will Not Install


I am trying to install NI Max on my system, and it will not install with the NI-VISA 15.0 package.

 

I've installed NI-KAL and configured it to my system. I also installed NI-VISA 15.0 from here: http://www.ni.com/download/ni-visa-15.0/5410/en/

 

I am running Red Hat Linux 7.5 with the Red Hat 7.4 kernel available. I am running the 64-bit version of LabVIEW 2018.

 

I know that there is not a separate version of NI MAX that I can install, so how can I get it on my system, since I thought it was packaged with NI-VISA but it did not install?

 

Thanks.

 

 


Help with how to write a loop!


Hello, I tried looking through the forums but I wasn't sure how to word my problem.

 

I have attached a screenshot of a while loop. Basically, what I want is for that loop to write the "Temperature 1" double value into "String" and "Time 1" into "String 2", and then wait for the duration given by "Time 1" before the next iteration of the loop. In the next iteration, the same procedure should be performed for Temperature 2 and Time 2, and so on until all the iterations are finished. What is the most efficient way to design such a loop? Do I make an array of these time values and somehow feed that into a "wait" function? Any help is appreciated, thank you!
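
In LabVIEW terms, the usual pattern here would be to auto-index an array of temperatures and an array of times into a loop and wire each iteration's time element into a Wait function. Purely as a language-neutral illustration of that sequencing, here is a minimal Python sketch; the array contents and variable names are invented placeholders, not anything taken from the attached VI:

import time

temperatures = [21.5, 22.0, 23.1]   # Temperature 1, 2, 3, ...
wait_times_s = [5.0, 10.0, 2.5]     # Time 1, 2, 3, ... (in seconds)

log = []
for temp, wait in zip(temperatures, wait_times_s):
    log.append(str(temp))   # what would go into "String"
    log.append(str(wait))   # what would go into "String 2"
    time.sleep(wait)        # wait this iteration's time before the next iteration

In LabVIEW the same structure falls out of a For Loop with the two arrays wired through auto-indexing tunnels and the time element wired to a Wait (ms) function inside the loop.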

LabVIEW crash with Copy.vi from network path (windows 10 1803)


(Screenshot attached: labview bug win10_1803.jpg)

 

 

This was working perfectly on Windows 10 1709 with LabVIEW 2015.

 

Yesterday I updated to 1803.

My software crashes on startup. Investigation led to this Copy.vi, which makes both the LabVIEW development environment AND the runtime crash when the source path is a network path (for example: \\server\configs\product_xyz).

 

If Source Path is a local folder (C:\blabla), it works.

 

This was tested, and fails, on LabVIEW 2015, 2016, and 2018.

 

It crashes both in development and in built EXEs (Application Builder).

 

I noticed that some files are indeed copied, then the cursor turns busy, then it crashes.

No error output.

 

Please investigate.

SubVI running simultaneously errors


I'm parsing a JSON file several times using a subVI, and when I run it in highlight execution mode I see that the input string in the subVI is different from what is being passed in from the calling VI.

Here are the VIs (screenshots attached): SubVI parsing.png and VI Parsing.png.

Attached is the JSON file that's being parsed.

 

Edit: An example of the parsing going wrong is that McKenna_Desired_Phi outputs zero, as do several of the other parsed values.

Help needed with missing objects in block diagram


Hello community, thanks in advance for reading my message. I have an issue with my LabVIEW program. While building it I realized it had become too big, because I had problems visualizing several objects in the block diagram. I then followed several posts here, trying to clean up my program and moving code into several subVIs to make my block diagram smaller, and I managed that. Even so, there are a few objects that are still missing, like the while loop's stop condition. My diagram no longer responds to clean-up. I do not know what to do; can you give me some suggestions?

In the attached image, the elements that should be visible are circled in red.

I am using LabVIEW 2015.

Thanks

EtherCAT loop strangely slowing down


I'm using a while loop to run my EtherCAT input and output, and the EtherCAT cycle time is set at 10 ms. To my great surprise, I measured and found that the loop was taking 30 – 40 ms to complete. Not only that, but as the program runs for a minute or so, that time gradually rises to 70 – 80 ms. It doesn't matter whether it is busy or idle (steady in state 20). I looked in vain for any likely cause, such as an array that is steadily growing or resources being continually consumed, and could find none.

 

Questions: 1. Where are some likely places to look for such slowness? Apart from the reading and writing of the EtherCAT data itself, the main thing going on is displaying a handful of status items from that data as Boolean and numeric indicators. Is this time-consuming?

 

2. Where are some likely places to look for steadily increasing loop times?

 

(The main loop VI and its main subroutine, the state machine, are attached. It is not possible to attach all the dependencies, most of which come from the commercially available EtherCAT library from ackermann-automation.de.)

 

Thanks,
Ken

 

Please convert to LV2015


Please convert to LV2015, Thanks, John

Please convert to LV2015

There is a separate forum for version requests. Moved to the Version Conversion board.

Error -89134, starting task DAQmx 17


I am using a PCIe-6612 (counter and DIO) with DAQmx and LabVIEW. I am starting tasks on multiple lines as inputs and outputs, plus multiple counters. I get the following error. The code is attached. Please help.

 

Error -89134 occurred at DAQmx Start Task.vi:7220004

Possible reason(s):

Specified inversion cannot be satisfied, because it requires resources that are currently in use by another route.

Property: Pause.DigLvl.Src
Property: Pause.DigLvl.When
Source Device: Dev1
Source Terminal: Low

Required Resources in Use by
Task Name: _unnamedTask<0>
Source Device: Dev1
Source Terminal: Low
Destination Device: Dev1
Destination Terminal: Ctr4Gate

Task Name: _unnamedTask<4>

16 analog input signals NI 9209


The previous programmer had created one task and added 16 different scales to the task.

Is this the right way to be going about this?
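
For what it's worth, one task with a per-channel custom scale is a normal DAQmx pattern. Purely as an illustration of the call sequence, here is a minimal sketch using the nidaqmx Python bindings; the device name, channel count, and scale coefficients are assumptions, not anything taken from the original code:

import nidaqmx
from nidaqmx.constants import VoltageUnits
from nidaqmx.scale import Scale

with nidaqmx.Task() as task:
    for ch in range(16):
        scale_name = f"chan_{ch}_scale"
        # one linear scale per channel (slope/intercept are placeholders)
        Scale.create_lin_scale(scale_name, slope=2.0, y_intercept=0.0)
        task.ai_channels.add_ai_voltage_chan(
            f"cDAQ1Mod1/ai{ch}",
            units=VoltageUnits.FROM_CUSTOM_SCALE,
            custom_scale_name=scale_name,
        )
    data = task.read()   # one scaled sample per channel

The LabVIEW equivalent is one DAQmx task with 16 virtual channels, each configured to use its own custom scale.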

I am getting error 5003. I Googled this error, but I did not find anything.

 

DAQ isn't my strong suit.

Controlling an Instrument using an Instrument Driver


Dear All,

 

Attached is the image I have of my code so far. The code is really simple: all it does is connect the instrument I am using, the 34970A, to LabVIEW, and I then use it to read temperatures. The problem is that I do not want to display the temperature in LabVIEW; I want to display it on my instrument. For example, when I run the program, the temperature should immediately appear on the instrument's display. Does anyone know a solution, or is willing to give me a hint on how to tackle this problem?
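
To make the goal concrete: the idea would be to write the reading back to the instrument's own display over the same VISA session. A minimal sketch, assuming the 34970A accepts the SCPI DISPlay:TEXT command, with pyvisa used only as a textual stand-in for the LabVIEW VISA Write call (the resource name and reading are placeholders):

import pyvisa

rm = pyvisa.ResourceManager()
inst = rm.open_resource("GPIB0::9::INSTR")   # address is a placeholder
reading = 23.7                               # temperature from the existing driver read
inst.write(f'DISP:TEXT "{reading:.1f} C"')   # show the value on the instrument's front panel
inst.close()

In LabVIEW this would be a VISA Write of the same DISP:TEXT command string after the driver's read VI.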

Equivalent Command for CL HIL Write in Roboclaw controller


Hi

I'm using a Roboclaw 2X7A controller to control the DC motor for my inverted pendulum project.

 

I'm using VISA commands in LabVIEW to communicate with the motor controller via a USB cable.

 

I'm taking reference from Quanser's Inverted Pendulum project. In it, they use a command called CL HIL Write to write an analog voltage for the control input. Can anyone please help me by explaining what that command does exactly, and if possible guide me on what command I should be using to perform the same function on the Roboclaw using VISA commands?

I'm attaching the simple VI which is used to model the motor, from which I took the reference.

 

P.S. Ideally I want to achieve an acceleration that is proportional to the voltage that I'm sending in.

 

Thanks in Advance

Regards

Denesh

DAQmx Start Task VI slow


Hi there,

 

I am using a PXIe system with a PXIe-8821 controller running Windows 7 (64-bit) and a PXIe-4302 (±10 V, 32-channel, 24-bit, 5 kS/s/ch) voltage input module to collect 24 separate channels of analog data.

 

I have a fairly simple state machine that implements the setup and acquisition. Attached are images showing the creation and configuration of the task, as well as starting the task, acquiring the data, and stopping the task (we also clear it, but a screenshot of that wouldn't be very illuminating). Everything seems to be put together properly, but for some reason the Start Task VI is terribly slow; we timed it at 800 ms! For various reasons this is unacceptable. I've searched high and low and can't find a solution. What can I do to speed up Start Task.vi?
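
For context, one pattern that is often suggested when Start Task is slow is to explicitly commit the task once during setup, so that resource reservation and hardware programming are paid for before the time-critical start. Sketched below with the nidaqmx Python bindings purely to illustrate the call order; the channel string and rate are assumptions, and the same sequence maps onto DAQmx Control Task (commit) in LabVIEW:

import nidaqmx
from nidaqmx.constants import AcquisitionType, TaskMode

task = nidaqmx.Task()
task.ai_channels.add_ai_voltage_chan("PXI1Slot2/ai0:23")
task.timing.cfg_samp_clk_timing(5000, sample_mode=AcquisitionType.CONTINUOUS)
task.control(TaskMode.TASK_COMMIT)   # reserve and program the hardware in the setup state

task.start()                         # the start itself should now carry less overhead
data = task.read(number_of_samples_per_channel=500)
task.stop()
task.close()

Whether this actually removes the 800 ms in this particular setup would need to be measured; it only moves the commit cost out of the start call.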

General forum etiquette question


Hello, a quick question on NI forum etiquette: if I have a post that is squarely in the subject matter of a specialty board (one that does not get much or any traffic), is it acceptable to cross-post to the main LabVIEW board in hopes of finding a solution?

Check and plot data of an array of touchstone files


Hi,

I'm working on automating a manual board test process. It's an amplifier board that has 4 types of filters, each of which is associated with an attenuator, except that the band-pass filter at the higher cutoff frequency has 2 attenuators, so I treat it as another filter type in my program. Basically, I have a VI that sweeps the attenuator of each filter from 10 dB to 31 dB while the output is measured on a VNA. Every time the attenuator is incremented, the output is updated, and my VI acquires the data from the VNA and saves it in Touchstone format. I've figured that part out so far.

 

Now I need to check whether the data in the Touchstone files saved earlier are within the limits for each S-parameter type. By the way, my Touchstone files contain magnitude and phase for only S11, S21, and S22. So basically I have 110 Touchstone files that need to be checked and plotted. What I did in my program is first categorize the Touchstone files by filter type, and then sort each category by attenuator value (from 10 to 31, because for my application S11 and S22 should remain the same regardless of the attenuator value, while S21 decreases by 1 each time the attenuator is incremented, and therefore the limit for S21 decreases as well). After that I read the Touchstone files and create an array of 5 elements, each of which is a cluster of 3 elements: filter type, an array of frequencies, and a 2D array (22x3). That is a subVI, and the main VI sets the limits and plots the data.

 

I figured out a way to do it, but I feel like my program is not really efficient in terms of data manipulation, because I keep bundling and unbundling the data into and out of a cluster and I use so many For Loops. My mentor frequently tells me my code is not robust and is more complex than it needs to be, so I'm trying to think of another way to write my program, but I haven't had any luck so far. Also, I wasn't able to find a way to identify the filter type and the attenuator value when one of the S-parameter traces fails.
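
To make the last point concrete: one way to avoid repeated bundling and unbundling is to key every measurement by (filter type, attenuation), so a failing trace immediately identifies both. A minimal Python sketch of that idea, with made-up data, limits, and tolerances standing in for the Touchstone contents, just to illustrate the structure (a LabVIEW map, or an array of clusters keyed the same way, would play the same role):

# results keyed by (filter_type, attenuation_dB); values hold the measured traces
results = {
    ("low_pass", 10): {"s21_dB": -10.2},
    ("low_pass", 11): {"s21_dB": -11.1},
    ("low_pass", 12): {"s21_dB": -13.4},
}

def expected_s21_db(attenuation_db):
    # placeholder rule: S21 is expected to drop 1 dB per attenuator step
    return -float(attenuation_db)

failures = []
for (filter_type, atten), traces in results.items():
    if abs(traces["s21_dB"] - expected_s21_db(atten)) > 0.5:   # 0.5 dB tolerance (placeholder)
        failures.append((filter_type, atten))   # the key names the failing case directly

print(failures)   # e.g. [('low_pass', 12)]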

 

Any help would be much appreciated!

Thank you!

MV


Reading numbers over serial


I have an Arduino UNO hooked up through USB, reading all 6 analog inputs and formatting them (I know it's not ideal for efficiency) as 14 bytes: [upper 8][lower 8] [upper 8][lower 8] ... [carriage return][line feed], such that every transmission ends with 13 10. The problem is that sometimes my sensors actually read 10, and then I'm stuck with the array not being right and the buffer starts filling up (the number of bytes at the port skyrockets and latency follows close behind); basically everything spirals out of control and I have to reset. I forced the VISA read to 14 bytes, but that still isn't helping. I no longer have a string stripper, just in case that was the issue, but it's not. The program already knows some of what I want it to, because it always cuts off the 13 and 10 at the end, since they aren't needed. Given that every other byte is either a 0, 1, 2, or 3 (from converting a 10-bit number into two 8-bit numbers), can I just add some kind of detection?
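
One way to make that detection concrete: read fixed 14-byte frames and validate the terminator by position rather than by value, resynchronizing one byte at a time if the last two bytes aren't 13 10. A minimal sketch in Python with pyserial standing in for the VISA read (port name and baud rate are assumptions, and timeouts are not handled):

import serial

FRAME_LEN = 14   # 12 data bytes (6 values as [upper][lower]) + CR + LF

def read_frame(port):
    frame = bytearray(port.read(FRAME_LEN))
    # A data byte may legitimately be 10, so don't search for 13 10 anywhere;
    # only accept a frame whose final two bytes are CR LF, and slide by one
    # byte until the frame is aligned again.
    while len(frame) < FRAME_LEN or frame[-2] != 13 or frame[-1] != 10:
        frame = frame[1:] + bytearray(port.read(1))
    return [(frame[i] << 8) | frame[i + 1] for i in range(0, 12, 2)]

with serial.Serial("COM3", 9600, timeout=1) as ser:
    print(read_frame(ser))   # six reconstructed 10-bit readings

This can still mis-align if a data pair happens to land as 13 10 at exactly the terminator position, so an unambiguous framing scheme (a start byte, or sending the values as ASCII text) avoids the problem entirely.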

 

It might be helpful to know that everything works perfectly fine in RealTerm, which keeps the 13 and 10 at the end.

 

(LabVIEW and Arduino code attached; you'll need to rename the Arduino file from .doc to .ino. You shouldn't need to hook anything up to the Arduino to get it to run, so if you have one around for testing it should work fine.)

NI myRIO error code


The error appears after we reinstalled all drivers: "A critical unexpected error occurred."

 

(Screenshot attached: Screen Shot 2018-07-14 at 10.25.51 AM.png)

How to get the precision of the MAX6675 (Arduino) into LabVIEW (VISA)


Hi, I use VISA for communication between an Arduino and LabVIEW, to measure temperatures with the MAX6675 and its type K thermocouple. The problem is that in the serial monitor of the Arduino IDE I can see 2 decimal places of precision, but in LabVIEW I only get integers. I changed the properties of the numeric indicator, the data type (double) and the display format (4 digits, floating point), but nothing changed. So the question is how I can get the precision I see in the Arduino serial monitor. I think the problem isn't in the indicators; maybe it's in VISA.
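
To help narrow it down: if the Arduino sends the reading as text (e.g. "23.75\r\n"), the decimals are present in the bytes that VISA receives, and they survive or vanish at the string-to-number conversion step rather than at the indicator. A minimal Python sketch of the distinction, with pyserial standing in for the VISA read (port and baud rate are assumptions); in LabVIEW the equivalent difference is between Fract/Exp String To Number and Decimal String To Number:

import serial

with serial.Serial("COM4", 9600, timeout=1) as ser:
    line = ser.readline().decode(errors="ignore").strip()   # e.g. "23.75"
    as_float = float(line)   # keeps the decimals: 23.75
    as_int = int(as_float)   # an integer-only conversion would give 23
    print(as_float, as_int)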

Coloring multicolumn listbox


Dear All,

I want to color the strings in a multicolumn listbox. I do it in a loop, iterating through every listbox row and using the "CellFontColor" property.

(Screenshot attached: image.png)

The problem is that the process is pretty slow. I have a few thousand rows in the list, and recoloring them this way takes tens of seconds. The row colors need to change, for example, when a user clicks on a row, and when I click on row #1000 I wait about 10 s until it has been recolored.

Is there a more proper (faster) way to dynamically recolor the multicolumn listbox?

Thank you.

 

Interfacing 1D array logic to an NI DAQ output port


Hi,

I have implemented logic that produces its output as a 1D array. I want to convert this one-dimensional array into a digital Boolean output and send it to an NI DAQ output port so it can act as a circular PRBS generator. Could you please let me know how to interface this 1D output to an NI DAQ output port and use it as a function generator?
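
As a sketch of the general idea (not a worked answer): convert the 1D array to Booleans, load it once into a hardware-timed digital output task, and let DAQmx regenerate it so the pattern repeats circularly. Illustrated below with the nidaqmx Python bindings; the device/line name and update rate are assumptions, and the same steps map onto the DAQmx Create Virtual Channel (DO), DAQmx Timing (Sample Clock, Continuous), DAQmx Write, and DAQmx Start Task VIs in LabVIEW:

import nidaqmx
from nidaqmx.constants import AcquisitionType

pattern = [1, 0, 1, 1, 0, 0, 1, 0]   # the 1D array from the PRBS logic (placeholder values)
bits = [bool(x) for x in pattern]    # convert to Boolean for the digital line

with nidaqmx.Task() as task:
    task.do_channels.add_do_chan("Dev1/port0/line0")
    task.timing.cfg_samp_clk_timing(
        1000.0,                                  # update rate in Hz (placeholder)
        sample_mode=AcquisitionType.CONTINUOUS,
        samps_per_chan=len(bits),
    )
    task.write(bits)    # load one full cycle; regeneration is on by default, so it loops
    task.start()
    input("PRBS pattern running; press Enter to stop")
    task.stop()

This assumes the DAQ device supports hardware-timed digital output on the chosen port; otherwise an external or on-board counter would be needed as the sample clock.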

 

Thanks

Ramesh B

 
