Channel: LabVIEW topics

DTLS communications


I am currently working on a smart home project that uses DTLS for security. I am planning to program the project in LabVIEW, and I have some questions regarding LabVIEW and DTLS:

1) Does LabVIEW support DTLS?

2) Is there any toolkit (e.g. LVS Tools) that makes it easy to work with UDP packets in LabVIEW?
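
For context, LabVIEW's built-in UDP Open/Write/Read/Close functions handle the plain datagram side of this, and a DTLS layer would have to wrap exactly that traffic. Since G code can't be shown inline, here is a rough Python sketch of the unsecured exchange in question (the host address and port are hypothetical, not from the post):

import socket

PEER = ("192.168.1.50", 5005)  # hypothetical smart-home node

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)  # UDP, like LabVIEW's UDP Open
sock.bind(("", 5005))
sock.settimeout(2.0)

sock.sendto(b"status?", PEER)            # ~ UDP Write
try:
    data, addr = sock.recvfrom(1024)     # ~ UDP Read
    print(f"reply from {addr}: {data!r}")
except socket.timeout:
    print("no reply")
finally:
    sock.close()                         # ~ UDP Close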

Thank you! :)

 

Best Regards,

Raja

 


NI-DAQmx 19.0 installation issues - not appearing in palettes and cannot be found by applications


Hi Team,

 

Has anyone else had issues installing and using DAQmx 19?

 

I have LV15 (32-bit)*, LV18 (32- and 64-bit), and LV19 (64-bit), and then installed DAQmx 19.0 without error. However, none of the ADEs can now use the driver.

 

From MAX I can confirm I have:

  • NI-DAQmx ADE Support 19.0
  • NI-DAQmx Device Driver 19.0
  • NI-DAQmx MAX Configuration 19.0

And the drivers were installed after the ADEs.

 

I have already reinstalled the driver and have been through the following articles:

Can’t Find NI-DAQmx Functions after LabVIEW or NI-DAQmx Install

DAQmx Functions Doesn't Appear in LabVIEW Functions Palette

Upgrading or Downgrading LabVIEW with NI-DAQmx

 

*2015 isn't supported by 19.0, but the driver should still appear in the other LabVIEW versions.

 

Any thoughts? Known issue?

Storing output data


Hi everyone,

I'm new to the forum, and I apologize in advance if I don't manage to explain the problem fully right away.

 

I created a subVI that takes a string of characters (a command) as input and outputs two arrays. Depending on the input command, the subVI fills one of the two output arrays.

The "problem" is this: when I fill one of the two output arrays, I would like the other one not to be reinitialized to its default values, but to keep its state from the previous run.
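
In LabVIEW terms, this is usually solved with an uninitialized shift register or a feedback node, which keeps its value between calls of the subVI. A rough Python analogy of that behavior (the command names and fill logic are illustrative only):

# A closure plays the role of the uninitialized shift registers: the
# array that is NOT written on this call keeps its previous contents.
def make_subvi():
    state = {"array_a": [], "array_b": []}

    def subvi(command, values):
        if command == "fill A":
            state["array_a"] = list(values)   # overwrite A, leave B untouched
        elif command == "fill B":
            state["array_b"] = list(values)   # overwrite B, leave A untouched
        return state["array_a"], state["array_b"]

    return subvi

subvi = make_subvi()
print(subvi("fill A", [1, 2, 3]))  # ([1, 2, 3], [])
print(subvi("fill B", [9, 9]))     # ([1, 2, 3], [9, 9]) -- A survives the run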

Thanks.

How to disable LabVIEW's automatic syntax/edit-error checking


Dear All

Because my program is huge (the source code is more than 60 MB), editing it is quite troublesome: LabVIEW spends a lot of time checking the syntax/code for errors even when I just move or replace an element or make a small modification. Sometimes it takes 10 minutes, and I can't do anything during that time. So: could I disable LabVIEW's automatic syntax/edit-error checking until I compile the source code, or trigger the check manually? Thanks.

LabVIEW script


Dear All

I am Srikanth Vuppala, working on gas separations using membranes. In my work I am using LabVIEW 2018, and I need help regarding a VI, if one is available.

 

I have two pressure transducers (Precision Fluid Controls), 0 to 2.5 bar (downstream) and 0 to 40 bar (upstream), and I am using Pixsys ATR 142 units to record the data. For data acquisition in LabVIEW 2018, I need the VI, if one is available.
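
Even without a ready-made VI, the scaling itself is a simple linear map from the transducer's analog output to pressure. A minimal Python sketch, assuming a hypothetical 0-10 V output span (check the Precision Fluid Controls datasheet for the real one); only the two full-scale pressures come from the post:

def volts_to_bar(v, v_min=0.0, v_max=10.0, p_min=0.0, p_max=40.0):
    """Linearly map a voltage reading to pressure in bar."""
    return p_min + (v - v_min) * (p_max - p_min) / (v_max - v_min)

print(volts_to_bar(5.0))             # upstream, 0-40 bar   -> 20.0 bar
print(volts_to_bar(5.0, p_max=2.5))  # downstream, 0-2.5 bar -> 1.25 bar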

I would be grateful if someone has these VIs.

 

Thanks and Regards 

Looking for Database Toolkits


Hello everybody,

 

I'm looking for a database toolkit for no particular DB - probably MySQL over ODBC - but it definitely has to be 64-bit. I'm very surprised that I can't find much. Here are my results so far:

 

  1. ADO Toolkit
  2. LabView Database Library
  3. LabVIEW SQL Toolkit by Halvorsen
  4. NI Database Connectivity Toolkit
  5. Database Connectivity Toolkit for fast database transactions by Ovak Technologies
  6. GDataBase for MySQL
  7. LabSQL

1. Seems to have all the basic functionality, but there is no support and it's no longer under development.

2. I can't get it to work with 64 bit.

3. Similar to 1.

4. Would be our first option - but it doesn't support 64-bit, so it's out. What is NI thinking, not supporting 64-bit?

5. Nice - you can convert the variant after a SELECT into a typedef cluster.

6. Seems to have the most extra functions, like transactions and conversions (UTF-8 <-> string, timestamp <-> date string). Unfortunately, the evaluation version won't connect to my test MySQL server, so I can't test it more deeply.

7. I've read this name often in forums - but all the links are broken, so it seems to be dead.

 

Are there other toolkits out there? Which toolkit should I choose? What are your experiences? Thanks for any ideas!

Cheers, Alex

 

This is a crosspost.

Execution order of an event triggered by another event


I'm planning to buy a data acquisition card to capture signals shorter than 10 ms. Since I'm worried that LabVIEW's roughly 2 ms system response time will affect the acquisition timing, may I ask: do the 6601/6602/6612 data acquisition cards support buffered event counting?
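
For reference, "buffered event counting" means latching a free-running edge count into a buffer on every tick of a sample clock, so the counting itself is hardware-timed and LabVIEW's software response time does not matter. A hedged sketch of that configuration using the nidaqmx Python API as a text stand-in for the DAQmx VIs (device and terminal names are placeholders):

import nidaqmx
from nidaqmx.constants import AcquisitionType

with nidaqmx.Task() as task:
    task.ci_channels.add_ci_count_edges_chan("Dev1/ctr0")
    # Latch the running edge count on every tick of an external
    # sample clock (PFI0 here, a placeholder terminal).
    task.timing.cfg_samp_clk_timing(
        rate=1000.0, source="/Dev1/PFI0",
        sample_mode=AcquisitionType.CONTINUOUS)
    task.start()
    counts = task.read(number_of_samples_per_channel=100)
    print(counts[:10])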

How to solve error -61499?

When compiling an FPGA VI, I encountered the following error:

Error -61499 occurred in: niFpgaCompileWorker_AnalyzeTimingViolations.vi<-niFpgaCompileWorker_AnalyzeTimingPaths.vi<-niFpgaCompileWorker_CheckForErrors.vi<-niFpgaCompileWorker_JobComplete.vi<-niFpgaCompile_Worker.vi:1

Additional information: Traceback (most recent call last):
  File "./objects/lvfpga_dist/win32U/i386/msvc90/chn/root\resource\RVI\TimingViolation\scripts\niFpgaTimingViolationMain.py", line 280, in
NameError: name 'constraintDict' is not defined

Why does this error come up, and how do I solve it?

Looking for opinion on my VI


Hi everyone,

I'm new to LabVIEW and I would like your opinion on my VI, to improve myself and better understand how LabVIEW works.

 

I use a distance sensor (an LED + a photodiode) that I first need to calibrate at two distances (0 mm and 0.5 mm). I use LabVIEW NXG to acquire the voltage across a shunt resistor. This calibration is then used to compute the distance between the photodiode and my object.

 

I used an event loop to retrieve the high and low values and "store" them with the help of shift registers. Once the calibration is done, a third event structure continuously acquires new data and processes it to display the distance on a chart.
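
For what the calibration math could look like: with the two stored readings, the distance follows by linear interpolation between the calibration points. A minimal Python sketch, assuming the sensor response is roughly linear over 0-0.5 mm (it may well not be; the voltages below are invented):

def calibrate(v0, v1, d0=0.0, d1=0.5):
    """Build a volts -> mm converter from two calibration points."""
    slope = (d1 - d0) / (v1 - v0)
    return lambda v: d0 + (v - v0) * slope

to_mm = calibrate(v0=1.20, v1=0.85)   # hypothetical shunt voltages at 0 and 0.5 mm
print(round(to_mm(1.00), 3), "mm")    # 0.286 mm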

 

What can I improve? Did I make any mistakes?

 

Thanks for your help.

Remote front panel for myRIO not working


Hi,

I'm working on an academic project in which I need to access VIs targeted to a myRIO from my computer's browser.

To start with, I first tried a VI targeted to my computer, and it works perfectly. But it does not work for a VI targeted to the myRIO.

I followed all the steps as described in below links

https://knowledge.ni.com/KnowledgeArticleDetails?id=kA00Z000000P7YuSAK&l=de-DE

http://zone.ni.com/reference/en-XX/help/370622R-01/lvrthowto/remotepanels_rt_target/

 

Please find the attachments for reference, and let me know if more details are required.

I would really appreciate your help! Thank you in advance!

 

Regards,

Manoj

Producer/Consumer architecture


In a producer/consumer architecture, which loop will run faster, and why?

Passing messages between loops (Consumer - Producer)


All,

I need some assistance - or, really, confirmation that what I am doing is right, or whether there is a better way of communicating between two loops - specifically a producer/consumer pattern with an event structure.

 

I am using a queue to pass messages (message ID + data in variant form, via a cluster) from the UI (top) loop to the data-processing (bottom) loop. If I need to send a message back up to the UI loop, I use user events (with the same message ID/data cluster). For now this is not time-sensitive data: the loop processes it while the state machine is in its Wait state, and all other messages are handled.

 

Is this the proper way, or is there a better, more efficient way? I was thinking of using notifiers, but after investigating that option, it didn't seem like a good fit.
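
For reference, here is a rough Python analogy of the pattern described above: one queue carries (message ID, data) pairs down to the worker, and a second queue stands in for the user event going back up. The message names are made up:

import queue
import threading

to_worker = queue.Queue()   # ~ LabVIEW queue, UI -> data processing
to_ui = queue.Queue()       # ~ user event, data processing -> UI

def worker():
    while True:
        msg_id, data = to_worker.get()       # ~ Dequeue Element (waits here)
        if msg_id == "exit":
            break
        if msg_id == "acquire":
            to_ui.put(("result", data * 2))  # reply back to the UI loop

t = threading.Thread(target=worker)
t.start()
to_worker.put(("acquire", 21))
print(to_ui.get())            # ('result', 42)
to_worker.put(("exit", None))
t.join()

The asymmetry is deliberate in LabVIEW too: a queue delivers each message to exactly one consumer, while a user event can broadcast to any number of registered listeners, which is why it suits worker-to-UI notifications.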

 

Your thoughts are much appreciated.

Interface to LabVIEW via CED Power1401


Dear all

I am using a CED Power1401 as the interface for EMG data acquisition. Usually I use the Signal software to control the Power1401, but the current experiment needs LabVIEW to run a complicated program, and we don't have an NI DAQ device in our lab.

There is an interface program on the CED homepage (http://ced.co.uk/downloads/contributed#1401labVIEW). However, this file is too old to run with our current LabVIEW version.

Does anyone have a newer version of the program? Or could you please tell me another way?

Timestamp to number - precision


Hi,

Could you explain to me the precision of a timestamp and the related conversion to a number?

Here is my test VI:

 

[Image: timing bd.png]

LabVIEW Help states "Use the To Double Precision Float function to convert the timestamp value to a lower precision, floating-point number."

https://zone.ni.com/reference/en-XX/help/371361P-01/glang/get_date_time_in_seconds/

 

Here is a tutorial explaining that a timestamp is actually a "128-bit fixed-point number with a 64-bit radix".

http://www.ni.com/tutorial/7900/en/

 

Looking at the converted number, the double appears to give nanosecond precision.

The U64 shows the time in seconds (as expected).

[Image: timing fp.png]

 

I'm on a regular PC, so I was expecting a precision down to maybe 1 ms, not 1 ns. Where are those nanoseconds coming from? How do I determine the precision of a timestamp, and of the converted double, on a given system?
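
A back-of-the-envelope check may help here: the 64-bit fraction of the 128-bit timestamp can represent steps of 2^-64 s, far below anything the OS clock actually measures, and a double holding "seconds since 1904" resolves about half a microsecond near today's values. A small Python sketch of both numbers (math.ulp needs Python 3.9+; 3.7e9 s is roughly the current offset from the 1904 epoch):

import math

t = 3.7e9           # ~ seconds since the LabVIEW epoch today
print(math.ulp(t))  # ~4.8e-07 s: a double resolves ~0.5 us at this magnitude
print(2.0 ** -64)   # ~5.4e-20 s: step size of the timestamp's 64-bit fraction

So the extra digits reflect the representation, not the measurement; the usable resolution is limited by the OS clock, not by the timestamp format.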

 

Regards

Christoph

 

 

Announcing LabVIEW 2019 and the latest version of LabVIEW NXG


 

LabVIEW 2019 and the latest version of LabVIEW NXG are now available for download. 

- LabVIEW 2019 Download

- LabVIEW NXG Download

 

Below are some of the many features included in the new releases.

 

LabVIEW 2019 

The latest LabVIEW 2019 releases increase developer productivity through improved visibility in the integrated development environment, powerful debugging enhancements, and new G-language data types. Enhancements include: 

 

  • Maps and Sets: Two new data types help organize and manipulate data collections.
  • Package Installers: Create and publish packages to SystemLink™ software and NI Package Manager feeds for streamlined distribution.
  • Highlight Execution: Focus on a section of code instead of the whole VI block diagram.
  • History Probes: Monitor historical data flow while troubleshooting to make more informed decisions.
  • Handle Errors in Case Structures: Configure case structures to execute subdiagrams for specific errors or lists of errors.

To see all of the features introduced in LabVIEW 2019, please see the LabVIEW 2019 Upgrade Notes.
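
For readers new to these collection types, here is a rough text-language analogy of what a Map and a Set provide (Python dict/set stand-ins; the channel names are invented, and the comments name the corresponding LabVIEW operations loosely):

readings = {}                 # ~ Map: key -> value lookup
readings["TC-01"] = 23.5      # ~ insert a (key, value) pair
readings["TC-02"] = 24.1
print(readings.get("TC-01"))  # ~ look up a value by key

seen_channels = set()         # ~ Set: unique elements, fast membership tests
seen_channels.add("TC-01")
seen_channels.add("TC-01")    # second add is a no-op, no duplicates
print(len(seen_channels))     # 1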

 

 

LabVIEW NXG

The newest version of LabVIEW NXG simplifies the most time-consuming tasks in automated test and automated measurement applications. You can now deploy and distribute code faster, providing better application interaction and control. Enhancements include:

 

  • NI Hardware Configuration Export: Capture NI hardware dependencies in an NI Hardware Configuration file (.nihwcfg) and deploy it to another system.
  • Command Line Interface: Automate the build process for applications and libraries.
  • Panel Container: Display the panel of a running VI in the panel of the current VI.
  • Third-Party Software Integration: Import and export The MathWorks, Inc. MATLAB® software data (.mat) for improved interoperability.
  • New WebVI Capabilities: Design responsive user interfaces and stream data to and from a server.

 

For all of the new LabVIEW NXG features, please see the LabVIEW NXG 3.1 New Features. Along with these releases come new releases of LabVIEW modules and toolkits. To find a specific module or toolkit readme, search for it at www.ni.com/manuals/.

 

 

As we continue to build on our 30+ year investment in software, this latest update is just one in a series of fast-paced releases aimed at expanding engineering capabilities from design to test.

 

 


NI 6587 ADC example - acquisition clock


I have some questions about this example.

It's an example of LVDS communication with an ADC. There is something I don't understand in "NI 6587 Communicating with ADC (FPGA).vi", in the "LVDS Input Reading and Configuration Loop". See the picture below: [Image]

I think this clock should match the frame clock (FR) in rate and phase, so that a complete frame is acquired in one clock cycle. But the example says the clock for this SCTL should be configured from the DCO signal:

[Image]

My questions:

1. The clock rate can be changed to match FR, but how can we guarantee that the clock phase is aligned with the data?

2. My ADC's sample rate is 100 MHz, and FR is also 100 MHz. Is it a problem for the SCTL to run at such a clock rate?

Ramp graph over time using DAQ with a PXI-6221


Hello Friends;

 

My task is to use LabVIEW with DAQ to control an analog output voltage on a PXI-6221, and to monitor the resulting ramp graph on an oscilloscope. I have to program the task so that I can watch the output there. The following details describe how the ramp graph works:

  • from 0 s to 10 s: 0 V constant
  • from 10 s to 20 s: the voltage ramps up from 0 V to 5 V
  • from 20 s to 80 s: the voltage is constant at 5 V
  • from 80 s to 100 s: the voltage ramps down from 5 V to 0 V
  • from 100 s to 110 s: 0 V constant

The attachment shows the curve that is to be monitored on the oscilloscope via the LabVIEW program using DAQ communication.
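
For concreteness, the sample array for that profile could be built as in the following sketch (Python/NumPy as a stand-in for the G code; the 1 kHz AO update rate is an assumption - use whatever rate the PXI-6221 AO task is configured for):

import numpy as np

fs = 1000  # samples per second (assumed)
ramp_profile = np.concatenate([
    np.zeros(10 * fs),               #   0-10 s:  0 V
    np.linspace(0.0, 5.0, 10 * fs),  #  10-20 s:  ramp up to 5 V
    np.full(60 * fs, 5.0),           #  20-80 s:  hold 5 V
    np.linspace(5.0, 0.0, 20 * fs),  #  80-100 s: ramp down to 0 V
    np.zeros(10 * fs),               # 100-110 s: 0 V
])
print(len(ramp_profile) / fs, "s of data")  # 110.0 s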

Could anyone help guide me properly?

generate signal and acquire at the same time (Bender elements measurement)


I am working on bender element measurements, in which I send a wave through one bender element and receive it with the other. The problem I have with my LabVIEW VI is that I cannot synchronize the two DAQ Assistants for generating and acquiring the sine signals. I think the problem is with the sampling rate, and also with triggering the signal acquisition at the start time of the sine wave generation (please see the two attached VIs, a simple one and a slightly more complex one).

I would really appreciate it if anyone could help me solve the problem. I want the acquired signal to start exactly when the generated sine wave starts. Please let me know if anything in my explanation is unclear.
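
One common fix, sketched here in the nidaqmx Python API as a text stand-in for the DAQmx VIs: replace the two DAQ Assistants with explicit AO and AI tasks at the same sample rate, and arm the AI task on the AO task's start trigger so both begin on the same edge. Device and channel names are placeholders:

import numpy as np
import nidaqmx
from nidaqmx.constants import AcquisitionType

fs, n = 100_000, 10_000
wave = np.sin(2 * np.pi * 5_000 * np.arange(n) / fs)  # hypothetical 5 kHz burst

with nidaqmx.Task() as ao, nidaqmx.Task() as ai:
    ao.ao_channels.add_ao_voltage_chan("Dev1/ao0")
    ao.timing.cfg_samp_clk_timing(fs, sample_mode=AcquisitionType.FINITE,
                                  samps_per_chan=n)
    ai.ai_channels.add_ai_voltage_chan("Dev1/ai0")
    ai.timing.cfg_samp_clk_timing(fs, sample_mode=AcquisitionType.FINITE,
                                  samps_per_chan=n)
    # AI waits for AO's start trigger, so acquisition begins with the wave.
    ai.triggers.start_trigger.cfg_dig_edge_start_trig("/Dev1/ao/StartTrigger")

    ai.start()                       # armed, waiting on the trigger
    ao.write(wave, auto_start=True)  # starting AO fires the shared trigger
    data = ai.read(number_of_samples_per_channel=n, timeout=10.0)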

 

Thanks in advance for your kind help.

LabVIEW menus magnify and display far to the right of where they should


Hi,

 

While I was looking for an option in the LabVIEW 2018 toolbar menus, some sudden error occurred: the menus became oddly bigger and were shifted about 10 inches to the right on the screen. Choosing options from the dislocated menu still required me to click where the menu would normally have been, at its typical size.

 

I'm wondering if this issue can be fixed.

Running a LabVIEW executable (developed on Windows machine) on an Ubuntu machine


I'm currently experimenting with a C++ program on a machine running Ubuntu 16.04. What I'd like is for the C++ program to somehow integrate/incorporate/play nicely with a LabVIEW 2014 executable developed on a machine running Windows 10.

 

Here's my question (and I realize I'm likely asking the wrong question): is it possible for a LabVIEW 2014 executable developed on Windows to work on Ubuntu, provided all the necessary drivers are installed?
