Channel: LabVIEW topics

Strings font


Hello.

 

If I choose to write NA and WB in different fonts, as soon as I put the strings into an array I lose my customized fonts!

 

 

[Attached images: police of string in array.png, NAWB.PNG]

 

Does anyone know how I can do it?

 

Thank you in advance.


Error code text file or Error ring, pros and cons of each


I'm looking for some community input on the pros and cons of using either the Error Code Text file or the Error Ring for creating custom errors in an application.  My question comes about as a result of this post.

 

I've used Error Rings exclusively in the past to avoid the extra maintenance steps of bringing the text file along with the LV project and source control.  I assumed the error information was simply converted into the ring constant.  But with my linked post above, I'm wondering how deep my mistakes may cut me.

 

Please weigh in with pros/cons and any past experiences that may be helpful.

Interrupt USB acquisition rate inside a timed loop


I'm using a USB RAW instrument (a laser mouse with VISA drivers installed) to acquire interrupt data—which, from what I understand, represents the displacement during each polling interval of the mouse, in units of counts per inch (CPI). I want to take samples of total displacement over a given interval, which I set to 100 ms (10 Hz) in a timed loop. However, the mouse's polling rate is much faster than that, and I don't want to lose the data the mouse returns between loop iterations; in other words, I want to sum every individual sample the mouse outputs inside the timed loop and have the timed loop output one of these "summed displacements" every 100 ms (to be plotted on a waveform graph).

 

I probed the data buffer of VISA Get USB Interrupt Data and noticed it was only changing every 100 ms—only once per loop iteration—which makes me think it's losing all of the other data points collected in each 100 ms window and just outputting the "instantaneous" interrupt data once per iteration. That would be fine if the polling rate of the mouse were 10 Hz, but it clearly isn't.

 

I need to collect each data point the mouse returns in 100 ms, convert them to signed bytes (to represent positive and negative displacement), sum the points, and then index each of these sums at the end tunnel of the timed loop.

 

[Attached image: forum post 6.19.15.png]

 

What I've tried with the concatenated While Loop is to sum the first 20 interrupt data points of each timed loop iteration—since I'm polling the mouse at 250 Hz, I only take 20 rather than 25 so that the timed loop isn't late. I take these 20 data points, let them pile up at the While Loop's indexing output tunnel, and then sum the resulting 1-D array to get an integer representing the net displacement over the first 20 interrupt data points of each timed loop iteration. But since my probe indicates that VISA Get USB Interrupt Data only collects data every 100 ms, the While Loop just ends up summing the same single interrupt data point 20 times. Any suggestions would be appreciated!
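For the per-window math itself, here is a minimal, LabVIEW-agnostic sketch in Python of the intended result, assuming (purely for illustration) that the mouse reports one displacement byte per axis per poll at 250 Hz:

POLL_RATE = 250                                   # assumed mouse polling rate, Hz
WINDOW_S = 0.1                                    # timed-loop period: 100 ms
SAMPLES_PER_WINDOW = int(POLL_RATE * WINDOW_S)    # 25 reports per window

def to_signed(byte_value):
    # Interpret an unsigned byte (0-255) as a signed 8-bit displacement (-128..127)
    return byte_value - 256 if byte_value > 127 else byte_value

def summed_displacement(raw_bytes):
    # Sum every per-poll displacement collected inside one 100 ms window
    return sum(to_signed(b) for b in raw_bytes)

# Example: one window's worth of raw interrupt bytes (255 means -1 count)
window = [1, 2, 255, 0, 3] * 5
print(summed_displacement(window))                # net counts moved in this window

The point is that every interrupt report inside the window has to reach the summing code; if the VISA read only returns one report per 100 ms, no amount of summing downstream can recover the rest.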

VI to control VISA device polling rate


I'm using a laser gaming mouse as a VISA USB device to collect interrupt data. The mouse has software made by the manufacturer that lets you manipulate polling rate, CPI, acceleration, and a ton of other stuff. However, to make it a VISA-recognized device, I need to wipe the other drivers from the OS and install VISA drivers for the device, erasing any customizability that was possible before. Is there any custom VI that allows me to manipulate the polling rate of VISA instruments? Even better would be one that also lets me change the CPI for mice, specifically.

 

Thanks!

Conversion 2014 to 2013


Hello. Please help me by converting this program from LabVIEW 2014 to LabVIEW 2013. This file is very important to me and you would help me a lot. Thanks.

Config file "byte order mark"


I wanted to put this out in this forum as a warning for future searching. I was ripping my hair out for a while on this one.

 

Attached are two files that I finally boiled my debugging down to. What's the difference? The broken one has the BOM (byte order mark) included at the start of the UTF-8 file. This is essentially a header on the text file. Every text editor just skips over these bytes, so I had to find this with a hex reader. What it did was render the first section "unfindable" to key reading.

 

The conversion somehow happened while using Notepad++ to edit my INI files (somehow... still not sure). Fixing it is as simple as selecting "Encoding", "Encode in UTF-8 without BOM". Funny enough, another way to fix it is to just add a blank or header line to the top of the file. 
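If you want to guard against this programmatically, here is a small sketch (Python rather than LabVIEW, and the file name is just an example) that detects a UTF-8 BOM and strips it before the INI file is read:

import codecs

path = "settings.ini"                              # example file name

with open(path, "rb") as f:
    raw = f.read()

if raw.startswith(codecs.BOM_UTF8):
    # Drop the 3-byte EF BB BF header so the first section stays findable
    with open(path, "wb") as f:
        f.write(raw[len(codecs.BOM_UTF8):])
    print("BOM found and removed")
else:
    print("no BOM present")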

 

Now that I'm past the warning, I wanted to get a general sense of whether this is well known. (I'm debating adding an Idea Exchange suggestion to either throw an error when the BOM doesn't match the native encoding, or skip over it when it does.)

 

 

PXIe-5170R decimation/filtering using design libraries?


Hi,

 

NOTE: This question mentions FPGA but users of the 5170R who haven't touched the FPGA may be able to help.

 

I'm looking to develop an application using the 5170R oscilloscope design libraries, and although I'm hoping to achieve my aims without modifying the FPGA side code, I did take a peek out of curiosity.

 

To stream the number of channels I want, I will probably end up setting a decimation factor greater than one (set on the host VI and sent to the FPGA to control decimation between the sample loop and the output stream). However, the decimator in the FPGA explicitly states that it provides no anti-aliasing filter. So my question is: is this design library not fit for out-of-the-box use at non-unity decimation factors, or is there something I'm not seeing, such as a setting for an internal hardware filter on the acquisition side of the unit?
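As a generic illustration of why the missing filter matters (nothing 5170R-specific, just a sine test signal in Python/SciPy with arbitrary rates), dropping samples folds out-of-band content back into the passband, while a filtered decimation attenuates it:

import numpy as np
from scipy import signal

FS = 100_000.0                          # original sample rate for the demo
Q = 4                                   # decimation factor; new Nyquist = 12.5 kHz
t = np.arange(0, 0.01, 1 / FS)

# A 20 kHz tone sits above the decimated Nyquist and should not survive decimation.
x = np.sin(2 * np.pi * 20_000 * t)

naive = x[::Q]                          # sample dropping: the tone aliases to 5 kHz
filtered = signal.decimate(x, Q)        # low-pass filter first, then downsample

print("naive peak magnitude:   ", np.abs(np.fft.rfft(naive)).max())
print("filtered peak magnitude:", np.abs(np.fft.rfft(filtered)).max())

With the naive version the aliased tone shows up at essentially full amplitude at 5 kHz; with the filtered version it is strongly attenuated, which is the job the 5170R decimator apparently leaves to the user.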

 

For users without FPGA experience: if you have used a decimation factor > 1, was there aliasing on your output signal?

 

Thanks

Lee

 

 

Conversion LV 2014 to LV 2013


Hello! Can you help me by converting a 2014 LabVIEW file into a 2013 LabVIEW file? It's very important for me.

Thx.

 

EDIT: wrong section. sorry


Controlling permissions of a network cDAQ with network-published shared variables


Hi all,

 

I'm currently trying to set access permissions of a network cDAQ using network-published shared variables.  The problem I'm running into is that when I change the variable, I don't see it update. I am just using a simple write and read of the variable to test it out.  Sometimes the first time I change the variable it updates, but then I can't seem to get it to update again.  Do I need the Real-Time module to do this?

 

Thank you

function generator virtual to physical


Hello, 

 

I know that LabVIEW can work with both the real (physical) world and the virtual world. LabVIEW can communicate from physical to virtual, meaning physical signals can be measured in LabVIEW. However, can the opposite be done?

 

So, instead of LabVIEW measuring signals from the real world, can LabVIEW send signals to the real world? In particular, I want to make a function generator in LabVIEW (virtual) and send its output to a circuit using analog output devices. Is this possible? If so, is there any sample program where I can see how people have done it?
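For what it's worth, here is a minimal sketch of the idea using the nidaqmx Python API instead of LabVIEW (the LabVIEW equivalent is a DAQmx Analog Output task with continuous sample-clock timing); the channel name "Dev1/ao0" and the waveform parameters are assumptions:

import numpy as np
import nidaqmx
from nidaqmx.constants import AcquisitionType

RATE = 10_000          # output sample rate, S/s (assumed)
FREQ = 100.0           # sine frequency, Hz (assumed)
SAMPLES = 1_000        # one buffer of samples, regenerated continuously

t = np.arange(SAMPLES) / RATE
waveform = 1.0 * np.sin(2 * np.pi * FREQ * t)    # 1 V amplitude sine

with nidaqmx.Task() as task:
    task.ao_channels.add_ao_voltage_chan("Dev1/ao0")     # assumed device/channel
    task.timing.cfg_samp_clk_timing(RATE,
                                    sample_mode=AcquisitionType.CONTINUOUS,
                                    samps_per_chan=SAMPLES)
    task.write(waveform)        # load the buffer
    task.start()                # hardware regenerates it until the task stops
    input("Generating sine wave... press Enter to stop.")

Any NI device with a hardware-timed analog output can play the role of the "physical" side here.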

build freezes


I have a build where my startup VI and a bunch of classes are in an lvlib. The build "completes" (i.e., it re-enables all the buttons like Explore, Done, etc.), but it says it's still saving the lvlib. It's totally hung. Has anyone seen this behavior? Sorry, I can't post the code. If I look on disk, the exe isn't finished; it's still represented by folders.

 

subvi terminals in loops


I'm trying to create a subVI from a VI in which most of the controls and indicators (which will become the terminals of the subVI) reside in loops. I then wire the resulting subVI into a wrapper VI.

 

One Boolean control within a While Loop in the subVI is assigned to a terminal. If I connect a button in the wrapper VI to this terminal, the subVI doesn't seem to notice the input from the button at all. If I push the button on the subVI front panel, however, it works fine.

 

For indicators that reside in loops, the corresponding output terminals do not seem to reflect the subVI outputs either.

 

What am I doing wrong?

 

TCP/IP connection between PC and UR5 Universal Robot


I have a UR5 robot connected to a PC via Ethernet. I am currently trying to have the robot and PC send strings back and forth via the TCP/IP functions in LabVIEW. Using the code below I have been able to receive strings sent from my robot to my PC, but have been unable to send strings from my PC to my robot. I am able to send and receive strings in both directions using the program SocketTest. Does anyone have a lot of TCP/IP knowledge and know how to send strings from my PC to my robot? Alternatively, does anyone know how I could connect SocketTest to my LabVIEW code?

 

[Attached image: TCP_IP_UR5.PNG]
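For reference, this is roughly what a raw TCP client such as SocketTest does, sketched in Python; the IP address, the port, and the command string are all assumptions to replace with your own (the LabVIEW equivalents are TCP Open Connection, TCP Write, TCP Read, and TCP Close):

import socket

ROBOT_IP = "192.168.1.10"     # assumed address of the UR5 controller
ROBOT_PORT = 30002            # assumed port; use whichever port the robot listens on

with socket.create_connection((ROBOT_IP, ROBOT_PORT), timeout=5) as sock:
    # Commands sent to the controller are plain text terminated by a newline
    sock.sendall(b"set_digital_out(0, True)\n")
    reply = sock.recv(4096)   # the controller streams status data back
    print(reply[:64])

If this works but LabVIEW's TCP Write does not, one common cause is a missing line-feed terminator on the string being sent.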

sending arrays from Excel into LabVIEW


I want to send an array from Excel VBA into a LabVIEW VI.

The Excel example shows how to send individual numbers and return an array into Excel, but not how to send an array.

 

Everything I have tried sends blank arrays into LabVIEW, which can then be modified in LabVIEW and returned to Excel.

A VBA code snippet that does not work is:

 

Sub LoadData()
'
' LoadData Macro
'
' Keyboard Shortcut: Ctrl+l
'
' This is an example to demonstrate LabVIEW's Active-X server capabilities.
' Executing this macro loads a LabVIEW supplied example VI "Frequency Response.vi",
' runs it and plots the result on an Excel Chart.

Dim lvapp As LabVIEW.Application
Dim vi As LabVIEW.VirtualInstrument
Dim paramNames(0)
Dim paramVals As Variant
Dim viPath As String


Set lvapp = CreateObject("LabVIEW.Application")
viPath = lvapp.ApplicationDirectory + "\examples\apps\freqresp.llb\DAK Frequency Response.vi"

Set vi = lvapp.GetVIReference(viPath)   'Load the vi into memory
vi.FPWinOpen = True                     'Open front panel

paramNames(0) = "Foo"
paramVals = Sheet1.Range("j1:j1000").Value


Call vi.Call(paramNames, paramVals)

 

This code generates an error: expecting a 1D array of variants.

However, if I supply the correct array, then LabVIEW thinks it is empty.

'foo' is a cluster.

'Array' is a double array.

Neither works to receive the data.

Using LabVIEW, what is the best way to trigger a high speed camera (with a trigger input) to take photos that are synchronized with data acquisition at 8000 S/s?


Current code:

I am this far: I currently have one DAQmx task for the data acquisition and one for the camera trigger propagated through my entire VI, and they are synchronized. That being said, I simply "close task" for the camera trigger because I don't know what to use to output a trigger signal to my camera's "trigger" input. I tried the DAQmx Export Signal VI and selected "Start Trigger", but this gives me very little control over what type of trigger signal I am exporting (if it is even the correct VI to use). My camera requires the trigger signal to be held low for at least 10 times the length of the rising-edge trigger pulse. I can think of some complicated things I might be able to do with "Simulate Signal", but I know there must be a simple, elegant way.
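One common approach is a hardware counter-output pulse train at the acquisition rate. The sketch below uses the nidaqmx Python API purely as an illustration (the LabVIEW equivalents are DAQmx Create Channel (CO Pulse Freq) and DAQmx Timing (Implicit)); "Dev1/ctr0" and the 9% duty cycle are assumptions:

import nidaqmx
from nidaqmx.constants import AcquisitionType

RATE = 8000.0     # pulses per second, matching the 8000 S/s acquisition
DUTY = 0.09       # high for ~9% of each period, so the low time is >10x the high time

with nidaqmx.Task() as trig:
    # Assumed counter; route its output terminal to the camera's trigger input.
    trig.co_channels.add_co_pulse_chan_freq("Dev1/ctr0", freq=RATE, duty_cycle=DUTY)
    trig.timing.cfg_implicit_timing(sample_mode=AcquisitionType.CONTINUOUS)
    trig.start()
    input("Triggering camera... press Enter to stop.")

If the counter lives on the same device as the analog-input task, the pulse train can also be armed off the AI start trigger so the first camera frame lines up with the first acquired sample.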


Is the Texas Instruments ADS5292EVM compatible with LabVIEW?


Hello,

 

Any kind of information is appreciated, even if it does not answer my question directly.

 

Thanks! :)

Can I run or duplicate a photoshop process (filter->distort->displace) in labview?


I am trying to project an image onto an uneven surface. I have a depth map of the surface and want to distort the projected image to match the uneven surface. I believe I can achieve this using Photoshop's displacement map filter. Is there a way I can run that process in LabVIEW?

 

This link shows the Displace filter effect:

http://photoshopcafe.com/tutorials/dispmap/dispmap.htm
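Under the hood this is a per-pixel remap driven by the displacement map. Here is a rough sketch of the same idea using OpenCV in Python (not LabVIEW); the file names, the assumption that the depth map and image are the same size, and the simple "shift proportional to depth" rule are all illustrative:

import cv2
import numpy as np

image = cv2.imread("projected_image.png")                       # assumed file name
depth = cv2.imread("depth_map.png", cv2.IMREAD_GRAYSCALE).astype(np.float32)

scale = 0.2                       # assumed displacement strength, pixels per depth step
h, w = depth.shape
grid_x, grid_y = np.meshgrid(np.arange(w, dtype=np.float32),
                             np.arange(h, dtype=np.float32))

# Shift each source pixel in proportion to how far the surface deviates from mid-depth.
map_x = grid_x + scale * (depth - 128.0)
map_y = grid_y + scale * (depth - 128.0)

warped = cv2.remap(image, map_x, map_y, interpolation=cv2.INTER_LINEAR,
                   borderMode=cv2.BORDER_REPLICATE)
cv2.imwrite("warped_image.png", warped)

In LabVIEW, a comparable route might be the geometric warping tools in the Vision Development Module, or calling a script like this through System Exec or a Python node.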

 

Thanks.

How to create ground + power supply in Ultiboard 13


hi

Can anyone help me? I am new to using this program :( I need to put a ground and a power supply in my circuit in Ultiboard version 13. How?

And how can I export the file to a Gerber file?

Thanks, all :)

Please help me!

 


I want to run my .exe file at the same time that LabVIEW starts to run


