Channel: LabVIEW topics

Error 0xBFFF003A on Keysight ENA E5063A

Hello,

I'm trying to connect to the Keysight ENA E5063A using LabVIEW 2015 and a USB port. To check if I'm connected properly I'm using NI MAX. It shows that the connection is correct; however, when I use the VISA Test Panel to write the *IDN? query I get the following error: VISA (Hex 0xBFFF003A, see capture1.png). I get the same error when I try to initialize in LabVIEW (see capture2.png).
 
As recommended by Keysight, I've installed their VISA as the secondary VISA (newest version, IOLibSuite_18_1_23218). Also as recommended by Keysight, I have enabled the option "Enable Keysight GPIB Cards for 488 Programs" in their software (Keysight Connection Expert: picture3.png), and I have turned on "Passport for Tulip" (see capture4.png).
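For reference, the failing operation is just the standard identification query; here is a minimal sketch of the same query done outside LabVIEW with the pyvisa Python package (the resource string below is a placeholder, use the one NI MAX shows for the instrument):

```python
import pyvisa

rm = pyvisa.ResourceManager()          # uses the default (NI) VISA backend
# Placeholder resource string; copy the real one from the device entry in NI MAX
inst = rm.open_resource("USB0::0x2A8D::0x0001::MY12345678::0::INSTR")
inst.timeout = 5000                    # ms
print(inst.query("*IDN?"))             # same identification query as in the VISA Test Panel
```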
 

Short summary of my installation:

  • Windows 10 Pro
  • LabVIEW 2015 (32-bit and 64-bit)
  • NI MAX 18
  • NI-VISA 15 (32- and 64-bit support)
  • Keysight IO Libraries Suite (IOLibSuite_18_1_23218_2)
  • Keysight ENA E5063A under Windows 10, firmware Ver A-05-04

and settings:

  • Keysight IOLib - installed as the secondary VISA via a custom installation; I set the 488 option.
  • MAX - all passports are activated.

This is the installation recommended by Keysight, but I still get error 0xBFFF003A in NI MAX. The Keysight team has tried many things but has not been able to help me.

 

In addition, I have an older Keysight ENA E5063A (under Windows 7, firmware Ver A.03.00) which works fine without installing the Keysight IO Libraries.

 

Have you ever encountered this type of problem?

 

Best regards

Djamel


Suggestions on implementing a Target stop?


Just looking for any ideas on how to implement a target stop in this VI, possibly with the ability to continue counting from the last number it stopped at. Your help is much appreciated.

Array Max


Hi there,

 

I am trying to find the maximum point of the Fourier transform using the Array Max & Min function (picture below). However, there seems to be a problem with the connections at both the input and output of the Array Max & Min function. Any ideas how to resolve this problem?

Thanks in advance!

Screenshot 2019-02-01 at 14.31.48.png
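For reference, a minimal numeric sketch of the intended operation (Python/NumPy; the input signal here is a placeholder). Note that the FFT output is complex, so its magnitude is taken before searching for the maximum:

```python
import numpy as np

signal = np.random.randn(1024)            # placeholder for the acquired waveform
spectrum = np.abs(np.fft.fft(signal))     # FFT output is complex, so take the magnitude first
peak_index = int(np.argmax(spectrum))     # index of the maximum spectral point
peak_value = spectrum[peak_index]
print(peak_index, peak_value)
```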

SVN Issue and Separate Compiled Code Feature


I'm working on several LabVIEW Projects of modest size (a few hundred VIs and TypeDefs), each contained in a LabVIEW Project (.lvproj), each maintained in its own Directory Tree (named for the Project, with physical Folders matching the Project's Virtual Folders), and each under its own SVN Repository.  I typically start my work day by updating the Project, open it, often bring all of the VIs into memory (to inspect them, find where the sub-VI I'm modifying is called, etc.), and at the end of the day, I Commit, often updating 4-10 VIs (a relatively small number).

 

Then I go home, and do a "little more work", but on another computer.  Both computers are running the same OS (Windows 10 x64, I think Version 1803 or 1809), and have the same LabVIEW installation (including Toolkits, Modules, Drivers, and settings).  I again go through the Update, Work, Commit cycle.

 

What's peculiar is that the Update cycle typically involves 20 or so VIs, many of which I never "touched" (other than perhaps opening them to see where another VI was called).  Now, I may have modified a VI that was used by the "changed" VI in question --- oh, I think I just figured it out, it's the "Separate Compiled Code" setting, or something like that.

 

That's it!  Type "Separate Compiled Code" into Google and it brings up a LabVIEW Help Message explaining the whole thing!  Sometimes "slowing down" by writing Documentation, talking to an NI Support Engineer, or even posting a question on the LabVIEW Forum can make you suddenly "much smarter" -- I'm going to leave this message here, change the Title to make it possibly more helpful for other Forum members, and gain a few "Humility" points.

 

Bob Schor

Floating Point Errors when using "Coerce to nearest"


My students found a strange bug this semester that I had never seen before (currently using LabVIEW 2018).

A (double precision) knob is coerced to the nearest 0.01.

If the selected value is lower than -1 or larger than 1, an error light should come on.

With "care", the knob can be rotated by hand to a value of -1.000000000000000222044605, which is exactly one bit smaller than -1.000000. In this case,the nearest value is -1.00 (which is correctly shown on the knob's digital display). However, instead of actually coercing the value to -1.000, the output of the knob remains at -1.000000000000000222044605.

As a check, a comparison node says that the value truly is less than -1.

While the comparison node is obviously working fine, why isn't the value actually being coerced?

Even recognizing that final "coerced" values might not be perfect due to floating point round-off, this output value is not correct, because the correct value of -1.00 is exactly representable as a floating point number. I would expect a problem of this type if the cutoff was -1.01, but never for -1.00 itself.

Obviously, workarounds exist... I could multiply the value by 100, then convert to an integer, then do my range checking on the integer. But I think this is an actual bug that needs to be corrected, and I'm hoping for another pair of eyes on it.
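To illustrate the behaviour and the workaround described above, here is a minimal worked sketch (Python, but it is the same IEEE 754 double-precision arithmetic): the value one representable step below -1.0 is scaled by 100, rounded to an integer, and range-checked on that integer.

```python
import math

x = math.nextafter(-1.0, -2.0)      # one representable step below -1.0
print(x)                            # -1.0000000000000002
print(x < -1.0)                     # True: the raw value really is below -1

# Workaround: scale to "hundredths", round to an integer, range-check the integer
n = round(x * 100)                  # -100
in_range = -100 <= n <= 100
print(n, in_range)                  # -100 True

# Dividing back gives exactly -1.0, since -1.0 is exactly representable
print(n / 100 == -1.0)              # True
```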

Start/restart LabVIEW Web Service programmatically


Hi,

 

I found two options to start LabVIEW Web Services which are acceptable for our requirements:

 (a) Create an executable which contains the web service (will run in a separated private web server).

 (b) Publish web service to NI Application Web Server.

 

The main requirement would be to change the web service port number programmatically, which can be done in each case as follows:

 (a) We call the executable (which contains the web service) from another LabVIEW (main) application, so we can open the service configuration file (niembeddedws.conf), change the port, and start the executable via "System Exec.vi" (see the sketch after this list).

 (b) We must publish our web service manually to the Application Web Server. After that, if the main application is running, we have to change the port number of the published web service dynamically (i.e. we have to change the App Web Server port). In this case we have to stop the Application Web Server service, change the port number in the global configuration file, and restart the App Web Server service.
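For option (a), a minimal sketch of the idea (shown in Python for brevity; in LabVIEW the same steps map onto reading/writing the file and calling "System Exec.vi"). The paths and the exact directive name inside niembeddedws.conf are assumptions, check the file generated for your build:

```python
import re
import subprocess
from pathlib import Path

# Hypothetical locations; adjust to where the web-service executable is deployed
conf_path = Path(r"C:\MyWebService\niembeddedws.conf")
exe_path = Path(r"C:\MyWebService\MyWebService.exe")
new_port = 8002

# Assumes the port appears as an Apache-style "Listen <port>" directive;
# inspect your generated conf file and adjust the pattern if it differs.
text = conf_path.read_text()
text = re.sub(r"(?m)^Listen\s+\d+", f"Listen {new_port}", text)
conf_path.write_text(text)

# Start the service executable (the equivalent of calling System Exec.vi in the main app)
subprocess.Popen([str(exe_path)])
```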

 

The problem is:

 (a) We have to build a main application (exe) which will execute another executable. It doesn't seem very nice, but everything can be done programmatically.

 (b) In this case we have to deploy our web service, which will run inside the Application Web Server, and as far as I can see it can only be unpublished/started/restarted manually, although this can also be done via the web configuration page (Tools/Options/Web Server/Configure Web Application Server). To change the port number, we have to stop the App Web Server, which may be used by other applications independently of ours, and doing so could break those services.

 

What do you think? Could there be a more acceptable solution to run a web service from a main application and handle it dynamically (e.g. start/restart/change port number)?

 

Thanks,

Balint

OPCUA Server Timestamp issues


I noticed some strange anomalies in the OPC UA server when publishing tags at a 100 ms rate using the OPC UA examples in LV 2018.

As you can see from the screenshots, the OPC UA server is generating continuous waveforms and the client is acquiring tag historical data and graphing it. On occasion I see what appears to be a time-shifted trend, visible as the rapid oscillations in the trend data.
Is this caused by the fast tag write speed? I am trying to simulate real-world conditions where I will be writing 10 tags/s.
 
Server.png, client.png

Need VIPM 2013


Hello Everyone,

I have LabVIEW 2013 project code which uses NI GOOP modules. I need VIPM 2013 to get NI GOOP installed for LabVIEW 2013. VIPM 2018 has all the other toolkits except NI GOOP. On another system with LabVIEW 2018, I can download VIPM 2018 and install NI GOOP.

Where can I get VIPM 2013 with the NI GOOP package?

Thanks in advance for guidance


Using I2C and a conflict with digital I/O


I have a myRIO and hence am using LabVIEW Real-Time. I read an accelerometer and gyro from an external sensor using I2C. This works fine.

Then I add a PWM motor part. For the H-bridge I require two logic outputs. I find that if I connect my H-bridge logic outputs to A/DIO0, the I2C stops working, almost as if the DIO were in some way connected to the I2C, yet it is not.

In frustration I connected to port B and it works fine again. Does using I2C in some way restrict using the rest of port A? It seems a little odd. I had a problem in the past when I tried to use I2C and the FPGA at the same time, but that's another story; I never resolved that either.

 

I set the I2C to handle multiple opens (= True in the VI) in case it had something to do with that; it didn't make any difference.

 

DAQ and timing


Hi, 

I am using an NI USB-6356 DAQ to read data from 8 pressure transducers. We use a timing hub to control the rest of our experiment (cameras, solenoid valve, spark plug, etc.), and I was hoping to add the data collection from the pressure transducers to this as well. However, we use all the analog BNC connections for pressure cables and only have digital input connections available. Currently, with the VI I am using, I have to make sure I press go/start/run each time I want to collect pressure data. Is there a way for me to press Run and have the program not collect data until I also press run on the timing hub, either in the VI, on the DAQ, or both? I attached a copy of my current block diagram.
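One common approach is to arm the analog task with a digital start trigger driven by the timing hub, so pressing Run in the VI only arms the acquisition and no samples are collected until the hub's pulse arrives on a PFI line. A minimal sketch of that idea with the nidaqmx Python API (device name, channels, rate, and PFI terminal are placeholders; in LabVIEW the equivalent is the DAQmx Trigger VI with the Start Digital Edge instance):

```python
import nidaqmx
from nidaqmx.constants import AcquisitionType

with nidaqmx.Task() as task:
    # 8 pressure transducer channels (device/channel names are placeholders)
    task.ai_channels.add_ai_voltage_chan("Dev1/ai0:7")
    task.timing.cfg_samp_clk_timing(rate=10_000.0,
                                    sample_mode=AcquisitionType.FINITE,
                                    samps_per_chan=50_000)
    # Arm the task; sampling only starts when the timing hub's pulse
    # arrives on the chosen PFI terminal (placeholder)
    task.triggers.start_trigger.cfg_dig_edge_start_trig("/Dev1/PFI0")
    task.start()
    data = task.read(number_of_samples_per_channel=50_000, timeout=120.0)
```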

Arduino stops functioning after Flat Sequence Structure runs


Hi everyone, 

 

I have a problem with my VI; I have attached the program. I have a time sequence structure and it works well on its own without connecting anything. But after connecting it to an Arduino, the Arduino stops the connection.

 

How to solve this problem?

 

Best


PXI delay between modules


I have two PXIe-4137 modules (SMU) in a single chassis with an 8840 controller.  I would like to set a delay between the two (roughly 6.8 ms).  The first SMU is to be triggered via software for DC voltage (multiple measurements for 40 ms). The second one is to perform a Pulse Voltage measurement (50 volts, 1.5 amp).  I can configure the two SMU tasks and route a trigger such that they start at approximately the same time (measure trigger from SMU1 to SMU2), but I can't seem to delay the pulse event.  There does not appear to be a Pulse Voltage parameter for a delay between the "pulse start trigger" and the rise of the pulse.  Is there a way to delay the routed trigger?  Would I need a third module to handle the timing between the two?

 

I'm not sure I can achieve the necessary delay with two separate software triggers.

LabVIEW 2018 issues with CrowdStrike


CrowdStrike Falcon is an endpoint protection enterprise software package (i.e. virus scanner run from company servers on all client PCs).  It appears that LabVIEW 2018 executables may not be compatible with CrowdStrike.  I noticed some of my deployed applications at a customer site (I am a LV consultant) were being "quarantined".  The symptom is not pretty:  attempting to run the EXE results in the file being "deleted" from Windows Explorer and Windows showing a dialog implying that the file lacks the correct permissions or otherwise is missing.  In fact, the program is not really deleted (it is recoverable by the IT department running the CrowdStrike admin tools), but to the user it is effectively gone (and of course can't be run).

My point of posting here is:
1. To see if anyone else is running on systems with CrowdStrike deployed and has experienced similar issues (or perhaps is running with CrowdStrike and has not experienced issues?).
2. Assuming the answer to 1 is No (I could not find any existing forum threads mentioning CrowdStrike), to provide a starting point for future folks searching the forums for information about this issue.
3. Perhaps there is a way to build the EXE to avoid the problem, if so I'll update this thread with any information about the workaround.
4. If NI R&D gets involved, provide a place for updates as to any action NI has taken to resolve the issue.

Here is a little bit more detail about what we have discovered so far.
Only LV2018 built EXEs have demonstrated the issue.  I have built identical programs with both LV2016 and LV2017 and they do not get quarantined.  You might think the issue is "What exactly is the program doing?".  But, my test application literally does nothing.  It is an empty VI panel, with an empty diagram.  This single VI is built into an Application using the default options and run.  That's it.  If you do this in 2016/17, no problem (the blank panel launches and is done).  In 2018, it is quarantined.

CrowdStrike has been sent some test applications.  They ran and analyzed the programs and claim that they exhibit "malware-like behavior".  This includes disk access, network access, kernel access, etc. (I won't get into all the technical details now, but might add more in a later post).  CrowdStrike Falcon is "behavior based" protection.  It doesn't do traditional file scanning to find viruses, it actually observes the behavior of the program itself as it starts to run.  The location of the file does not matter.  You can copy the EXE to a different location and it exhibits the same behavior (despite the Windows dialog implying you don't have correct file permission- this is a standard dialog that results from the file "disappearing" from the system).

I have tried this using all the "flavors" of LV2018 (2018, 2018f1, 2018f2, 2018SP1), and all exhibit the same problem.

I have tried disabling VI Server in the INI file.

The LabVIEW 2018 IDE itself seems fine- the issue is only with a built EXE Application using the RTE.  I have also built similar programs into shared libraries (this is something I do quite often in my distributed systems), and these run fine.  It is only the stand alone EXEs that demonstrate the issue.

I am attaching my test project and the built EXEs in 2016/17/18.

 

Eric Behrs

Behrs Engineering Services

Certified LabVIEW Developer

Installing compiled programs takes a long time when started from network drive


This may be a normal thing, but I'd like to hear from others. We have a number of programs we use in-house that have installers located on a network share. These installers are typically 1.5 GB or so, and include the LV Runtime, DAQmx drivers with configuration support (i.e., MAX), and NI-VISA.

 

When I run these installers from the network, it can take something like an hour to get the install done, even on an SSD. While I haven't done actual timed tests, it seems to run much faster if the install directory is copied to the local drive before running it. Additionally, the installation time is basically zero if I build an installer that only installs the actual LabVIEW program with none of the backup material.

 

Is this a common experience? If so, is there a way to make an installer that caches itself (entirely) to a local drive before running?
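One workaround along those lines is a small wrapper that copies the installer directory to the local drive first and then launches setup.exe from the local copy. A minimal sketch (the paths are placeholders):

```python
import shutil
import subprocess
import tempfile
from pathlib import Path

# Placeholder paths; point src at the installer folder on the network share
src = Path(r"\\fileserver\installers\MyApp Installer")
dst = Path(tempfile.gettempdir()) / "MyApp Installer"

# Copy the whole installer tree to the local drive, then run it locally
if dst.exists():
    shutil.rmtree(dst)
shutil.copytree(src, dst)
subprocess.run([str(dst / "setup.exe")], check=True)
```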


Debugging in textual languages


My work has been requiring me to do more and more development in Visual Studio. Every time I switch back to LabVIEW I get a feeling of being able to really see what is happening to my data while developing, thanks to the front panel that is automatically built when defining the "variables" (inputs and outputs) of a subVI. This panel also allows me to quickly and easily add more controls to see the data in different ways.

 

The question is this: is that feeling shared by others, and if not, what techniques, tools, add-ons, or whatever do you use to make textual development as easy as LabVIEW's graphical dataflow paradigm?

 

My own opinion is that textual languages will never come even close to offering anything like LabVIEW. That being said, I have no choice but to use Visual Studio, so every time I go back to LabVIEW I feel like this and question myself, hoping that maybe I have overlooked some crucial technique or tool that would stop me feeling so blind in Visual Studio.

USBExpress driver for Silicon Labs USB MCUs


Hi

I am using the USBExpress driver for Silicon Labs USB MCUs with a C8051F381 MCU. The GetNumDevices function works, getting the part number works, and SI_Open works, but when I use the Write function it first gave me error code 6, which is SI_INVALID_PARAMETER.
> Then I changed the buffer data type to string; that error code no longer appears, but now SI_Status gives me code 12, which is SI_SYSTEM_ERROR_CODE.
> I have gone through the AN169 document, but it only says: call GetLastError (Win32 Base) to retrieve the Windows system error code; the error codes are defined on MSDN.
> Please help me figure out how to solve this...

Thanks

Seeking LabVIEW programmer to fix code

Fitting two xy sequences of data


Hi,

 

Can anyone help me figure out how to align two sets of x,y data points?

These are extracted from a contour in an image.

The template sequence is the contour of the "ideal" shape, and the sample sequence is from a contour containing flaws.

 

Todo:
1. Fit the sample sequence to the template sequence.
2. Translate all points in the sample set onto the template set (offset), so the curves are aligned.
3. Measure the distance between the points.

 

It can be done using image analysis in VDM, but it is too slow and I need a mathematical approach.
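For steps 1-3, here is a minimal mathematical sketch (Python/NumPy), assuming a translation-only fit by matching centroids; handling rotation or scaling would need something like a Procrustes or ICP fit:

```python
import numpy as np

def align_and_measure(template_xy, sample_xy):
    """template_xy, sample_xy: (N, 2) arrays of x,y contour points."""
    template = np.asarray(template_xy, dtype=float)
    sample = np.asarray(sample_xy, dtype=float)

    # Steps 1-2: fit/translate by matching centroids (offset only)
    offset = template.mean(axis=0) - sample.mean(axis=0)
    aligned = sample + offset

    # Step 3: distance from each aligned sample point to its nearest template point
    diffs = aligned[:, None, :] - template[None, :, :]
    distances = np.linalg.norm(diffs, axis=2).min(axis=1)
    return aligned, distances
```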

 

The VI attached contains an example of the data sets to align/match.

 

Thanks

 

Send and receive video over TCP/IP


 

If I have two PCs connected to each other through a server, how can I send video from PC1 and receive this video on PC2?

Thank you
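A common pattern is to grab frames on PC1, compress each one (e.g. as JPEG), and send it length-prefixed over a TCP connection that PC2 reads from; in LabVIEW the same framing maps onto TCP Write/TCP Read, with vision VIs handling the frames. A minimal sketch in Python with OpenCV (the IP address, port, and camera index are placeholders):

```python
import socket
import struct

import cv2          # OpenCV, used to grab, encode, and decode frames
import numpy as np

PORT = 5000

def run_sender(receiver_ip="192.168.0.2", camera_index=0):
    """PC1: grab frames, JPEG-encode, send each one as <4-byte length><payload>."""
    cap = cv2.VideoCapture(camera_index)
    sock = socket.create_connection((receiver_ip, PORT))
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        ok, jpeg = cv2.imencode(".jpg", frame)
        payload = jpeg.tobytes()
        sock.sendall(struct.pack(">I", len(payload)) + payload)

def run_receiver():
    """PC2: accept the connection, read length-prefixed JPEG frames, display them."""
    srv = socket.socket()
    srv.bind(("", PORT))
    srv.listen(1)
    conn, _ = srv.accept()

    def recv_exact(n):
        buf = b""
        while len(buf) < n:
            chunk = conn.recv(n - len(buf))
            if not chunk:
                raise ConnectionError("connection closed")
            buf += chunk
        return buf

    while True:
        (length,) = struct.unpack(">I", recv_exact(4))
        frame = cv2.imdecode(np.frombuffer(recv_exact(length), np.uint8),
                             cv2.IMREAD_COLOR)
        cv2.imshow("video from PC1", frame)
        cv2.waitKey(1)
```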
