Channel: LabVIEW topics
Viewing all 66934 articles

COM port not recognized in Measurement and Automation


Hi, I'm using LabVIEW 2014 and I'd like to initialize serial communication. I built a VI using VISA, but I can't choose my COM port.

I don't have a COM port on my PC; I'm using a USB-to-COM converter, and it works great with Arduino and HyperTerminal. I checked Measurement & Automation Explorer and found that my COM port is not recognized there.

What should I do?

 

Thank you in advance.


System Manager Directory? What is it?


LabVIEW 2011 SP1: I am constantly getting this error when loading a project, where it tries to load some resource VIs from this directory. If I let it, I cannot save my project without errors.

I deleted the directory and my project loaded fine, after searching for a while for the correctly pathed subVIs. I did a mass compile and a full save, but if I restore the deleted directory, the next time I load the project it goes right back to this location and errors again! Do I need this directory, or can I simply delete it and never need it?

 

Mean PtbyPt in a loop with parallelism


Hello,

 

Can anyone explain to me why this does not work as expected? (10.5 and 0.5 values)

I get 5.5 and 5.5.

 

I checked Mean PtByPt.vi: everything is clonable, and the VI itself is set to preallocate clones. I also tried setting it to Inline, which does not help. Tried C of 1 and 2, no luck. If I open Mean PtByPt while it's running, I see that the shift register holding the data array is interleaved with 10.5 and 0.5 values, as if both clones use the same memory space, but on the input I always see only ~0.5 coming in. What gives? I thought parallelism would create two clones, no?
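The symptom described above (both outputs drifting to 5.5) is exactly what happens when two parallel streams feed one shared point-by-point mean instead of one clone each. This is a minimal plain-Python analogy, not NI's implementation; the class name and structure are illustrative assumptions:

```python
# Sketch (not NI code): a point-by-point mean keeps internal state, so two
# parallel streams sharing ONE instance see each other's samples.

class MeanPtByPt:
    """Running mean over all samples seen so far (stateful, like a PtByPt VI)."""
    def __init__(self):
        self.history = []

    def add(self, x):
        self.history.append(x)
        return sum(self.history) / len(self.history)

# Correct behavior: one instance (clone) per parallel stream.
a, b = MeanPtByPt(), MeanPtByPt()
print(a.add(10.5), b.add(0.5))   # 10.5 0.5, each stream keeps its own mean

# The reported symptom: both streams feed the SAME instance, so the internal
# array interleaves 10.5 and 0.5 and both means converge toward 5.5.
shared = MeanPtByPt()
for _ in range(100):
    shared.add(10.5)
    shared.add(0.5)
print(shared.add(10.5))          # close to 5.5
```

If both loop iterations really were using separate preallocated clones, the first pattern would apply; seeing the second pattern suggests the clones are sharing state.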

DSC - OPC Connection - Can't read or write to tags


I am working on controlling some motors over an OPC server.

 

The PLC is an M258, and the OPC server is CoDeSyS OPC DA.

 

When I use MatrikonExplorer, I can read the value of tags, and change their values as needed.

 

Using an evaluation version of NI OPC client, I could see the tags, but their value was marked as "unknown" with bad quality. In their properties, the value of the tags is given as "BAD VARTYPE".

 

I used DSC to add the CoDeSys server to a library and imported all the relevant variables. When I try to read them in a VI, they give me error code -1950679034, meaning they have no value. And when I try to change a value, I can see that the value never changes.

 

I would greatly appreciate any help you can provide.

software timed commands


Hi

 

  • The program I have developed works correctly in all areas but one.
  • I cannot obtain a consistent time period for the activation of the first command. The problem is best seen after running the program a number of times, pressing the data trigger button a number of times, and viewing the responses on the command data chart. You will see that command position 5 has a variable time step compared to the other commands, which are equally spaced in time.
  • If you view the For Loop indicator, you will see that sometimes the counter displays a 1 instead of a 0 when the data trigger is first pressed, so command position 5 is nearly bypassed by command position 12.
  • I am after a solution to this problem.

 

  • A Timed Loop is used to provide 100 ms to 1 minute time steps between each command.
  • You will see that the program automatically performs a ramp of commands 5 to 60, and when the ramp has finished, the last value, 60, remains.
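One common cause of an inconsistent first interval is computing each deadline relative to "now" rather than relative to one timestamp captured at the trigger. This is a hedged Python sketch of absolute-deadline scheduling (function names and parameters are illustrative, not from the poster's VI):

```python
import time

def run_ramp(send, start_cmd=5, stop_cmd=60, step_s=0.1):
    """Send commands start_cmd..stop_cmd at fixed intervals.

    Deadlines are computed from one absolute start time captured at the
    trigger, so the first step gets the same period as all the others
    instead of inheriting the jitter of when the trigger was pressed.
    """
    t0 = time.monotonic()                  # capture the start time exactly once
    for i, cmd in enumerate(range(start_cmd, stop_cmd + 1)):
        deadline = t0 + i * step_s         # absolute deadline, not relative sleep
        delay = deadline - time.monotonic()
        if delay > 0:
            time.sleep(delay)
        send(cmd)

sent = []
run_ramp(sent.append, step_s=0.01)
print(sent[0], sent[-1])   # 5 60, and the ramp ends holding the last value
```

In a Timed Loop the analogous fix is making sure the loop's start and the trigger event are synchronized, so iteration 0 is never consumed before the first command is due.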

 

Thank you

 

wetland

 

A program in LabVIEW that follows a path in an image


Hello,

I want to make a program where you give an image to LabVIEW, for example one black circle inside a white area, and LabVIEW takes it and starts following the circle's path. I want to connect this program to an XY stage driven by two stepper motors.
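Once image processing has located the circle (its center and radius), the stage-control half reduces to generating ordered waypoints along the boundary and converting them to relative stepper moves. This is a hypothetical sketch of that second half; the coordinate units and `steps_per_unit` value are assumptions:

```python
import math

def circle_waypoints(cx, cy, r, n=360):
    """Ordered (x, y) points along a circle boundary, one per degree by default."""
    return [(cx + r * math.cos(2 * math.pi * k / n),
             cy + r * math.sin(2 * math.pi * k / n)) for k in range(n)]

def to_steps(path, steps_per_unit=10):
    """Convert absolute waypoints into relative (dx, dy) stepper moves."""
    moves = []
    px, py = path[0]
    for x, y in path[1:]:
        moves.append((round((x - px) * steps_per_unit),
                      round((y - py) * steps_per_unit)))
        px, py = x, y
    return moves

path = circle_waypoints(100, 100, 50)
print(len(path), path[0])   # 360 (150.0, 100.0)
```

Each (dx, dy) pair would then be issued to the two stepper motor axes in turn; finding (cx, cy, r) from the image is the part NI Vision's circle-detection tools would handle.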

receive data to array using serial communication


Hi, I did a search in the community and I can't find what I'm looking for.

I'm using an Arduino to sample 10 s of a signal, with 8-bit resolution and a 5.25 kHz sampling rate, and I stored the samples in a byte array of size 52500. I need to send this array over serial communication, store it as an array, and plot it on a graph. The plot happens after the array is received in LabVIEW; I don't need real-time plotting, only the array and a plot of it. Any ideas? VI examples?
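The receive side is essentially "read raw bytes until the expected count arrives, then treat the buffer as the sample array." Here is a hedged Python sketch of that pattern; the `collect` helper is the portable core, while `acquire` shows hypothetical pyserial wiring (the port name and baud rate are assumptions):

```python
EXPECTED = 52500          # 10 s at 5.25 kHz, one byte per 8-bit sample

def collect(read_chunk, expected=EXPECTED):
    """Accumulate raw bytes from read_chunk() until `expected` bytes have
    arrived, then return them as a list of 0-255 sample values."""
    buf = bytearray()
    while len(buf) < expected:
        chunk = read_chunk()
        if not chunk:
            raise TimeoutError("serial read returned no data")
        buf.extend(chunk)
    return list(buf[:expected])

def acquire(port="COM3"):
    """Hypothetical hardware wiring via pyserial; port/baud are assumptions."""
    import serial                                   # pip install pyserial
    with serial.Serial(port, 115200, timeout=2) as ser:
        return collect(lambda: ser.read(4096))

# Simulated source instead of real hardware:
fake = iter([bytes(range(256))] * 206)
samples = collect(lambda: next(fake))
print(len(samples), samples[:3])   # 52500 [0, 1, 2]
```

In LabVIEW the equivalent is a VISA Read loop that appends to a byte array until 52500 bytes are received, then converts the string/byte array to U8 and wires it to a graph.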

 

Andor External Trigger Buffer Problem?


Hi there,

 

I am using an Andor Zyla 5.5, but I think the problem I encountered is more general to high-speed camera models. I used the following settings: Rolling Shutter, 0.005 s exposure time, External Trigger, Overlap ON, Full Frame (5 Mpixels). The problem also exists in the example in Andor's LabVIEW SDK (circular buffer, live mode); fps is ~100.

 

When I use the Software Trigger, everything works fine. But when I use Internal or External Trigger, the camera "freezes" after displaying continuously for a while. In fact the acquisition itself freezes, since the same thing happens even if I remove everything other than the download from WaitForBuffer.vi. Looking into the acquisition loop, it seems the camera runs out of input buffer queue, so it has to wait until some buffer space is queued back from the output buffer queue. The strange thing is that in either circular buffer mode or live mode, it always queues a buffer back once it is no longer in use. Maybe there is some other reason the loop is too slow, so it cannot catch up? I assigned sufficient buffers with CreateBuffer.vi (attached).

 

Investigating further, I measured the time spent in the acquisition loop: in the "Fixed" cycle mode it's about ~10 ms, which is reasonable for 100 fps, but in "Continuous" cycle mode it is about ~100 ms. Every other setting is exactly the same. However, even in "Fixed" mode the acquisition stops after a while (it runs out of the input buffer queue again). It's just so confusing to me: which step actually delays the loop? I tried using a queue instead of local variables, but still no luck.

 

The other thing I don't understand is QueueBuffer.vi, CreateBuffer.vi, and WaitForBuffer.vi (attached). I understand their outputs, which are just buffers (addresses of memory locations), but why is the buffer input a number or char? Is it because, since the output is the actual buffer, the function passes in the address of the buffer as input?

 

I have attached the respective files. I would appreciate any help with camera buffer management, or from experienced users of Andor cameras. Thank you in advance!

 

Kind regards,

E


How to acquire an externally received signal and bring it into LabVIEW


How do I acquire an externally received signal and bring it into LabVIEW?

 

Subtract previous value from value in a large 1d array


I have data in a 1D array (grabbed from a CSV file), and I need to generate a new 1D array that represents the difference between the previous value and the current value at each time point (i.e., A[x] minus A[x-1]). I cannot find a LabVIEW function to do this, so any assistance would be greatly appreciated. Cheers!
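The operation wanted here is a first difference: subtract the array from a copy of itself offset by one element, producing an output one element shorter than the input. As a language-neutral reference, here is the same operation sketched in Python:

```python
def first_difference(a):
    """Return [a[i] - a[i-1] for each i >= 1]: one element shorter than the input."""
    return [curr - prev for prev, curr in zip(a, a[1:])]

print(first_difference([1.0, 4.0, 9.0, 16.0]))   # [3.0, 5.0, 7.0]
```

In LabVIEW the same result is commonly built by taking an Array Subset of elements 1..N-1 and elements 0..N-2 and wiring both into a Subtract node, which operates element-wise on arrays.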

How to install unsupported (obsolete) toolkits/modules in newer LV versions?


Hello,

 

Installers of LV toolkits/modules do not permit installation into newer LV environments. I don't know if that restriction is necessary, but I am quite positive most of them would actually work 100%. If this "feature" cannot be revised, is it possible to obtain a workaround procedure (file/folder specification) for manual installation?

 

I am specifically interested in installing the LV 2012 ARM Module in LV 2014. Even if I find the time and a machine to install LV 2012 and then the ARM Module, I am not sure I would be able to completely move/copy the ARM installation into the corresponding folder structure of LV 2014.

 

So, is there an explicit manual installation procedure?

 

Thanks in advance,

Accelerometer not running


Hi, I need help. I'm not sure why my accelerometer LabVIEW VI is not running. Is it because I need to write a program to run it? accelerometer.png

Tick count


Hi. Can I ask how you make the tick count reset when you start the program?

Thanks in advance!
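The free-running tick counter itself generally cannot be reset; the usual pattern is to read it once when the program starts and subtract that baseline from every later reading, so elapsed time starts at zero. A small Python analogy of the pattern (the class here is illustrative, not a LabVIEW API):

```python
import time

class Stopwatch:
    """Elapsed-ms counter that starts at zero: store the tick value once at
    program start and subtract it from every subsequent reading."""
    def __init__(self):
        self.t0 = time.monotonic()        # analogous to reading Tick Count once

    def elapsed_ms(self):
        return int((time.monotonic() - self.t0) * 1000)

sw = Stopwatch()
print(sw.elapsed_ms())   # 0, or very close, right after start
```

In a VI, the equivalent is wiring one Tick Count (ms) read outside the loop (or in a First Call? case) and subtracting it from the Tick Count value read inside the loop.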

Trouble programming a program with multiple different channels!


Dear LabVIEW users,

 

I've been struggling with a program I need to write for the stress-strain curve and the E-modulus of a material tested with strain gauges and LVDTs.

 

 

The thing is, I need to acquire data with the NI DAQ system from strain gauges (virtual channel: Strain), LVDTs (virtual channel: Voltage) AND a pressure cell (virtual channel: Voltage), and I need to log this data and do some operations with it. The problem is, this program needs to work with the combination of 1 strain gauge, 1 LVDT, and 1 pressure cell, but also with, for example, 4 strain gauges, 2 LVDTs, and 1 pressure cell. For the strain gauges I need to be able to configure the gauge properties separately for each one (this is possible with the bridge configuration cluster provided by LabVIEW); I also need to configure the scale and input voltage for each LVDT or pressure cell, again separately. And I need to be able to select the channel for each gauge, LVDT, and force cell. I also need to be able to set these elements to zero before the testing begins.

Afterwards, all this data needs to be logged into an Excel file and also shown on a clear graph that displays the strain from the gauges, the displacement of the LVDTs, and the force of the pressure cell.

Eventually, I need to obtain a stress-strain curve and the E-modulus of the material we're testing, presented in LabVIEW.
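One way to handle a variable mix of sensors is to describe each channel as an entry in a configuration list (kind, physical channel, scaling) and implement zeroing as a tare step that records current readings as offsets. This is a hypothetical sketch of the data shape only; every name and channel string below is an assumption:

```python
# Hypothetical per-channel configuration; the program would build its DAQ
# task by iterating over this list, however many entries it contains.
channels = [
    {"name": "SG1",   "kind": "strain",  "channel": "cDAQ1Mod1/ai0"},
    {"name": "LVDT1", "kind": "voltage", "channel": "cDAQ1Mod2/ai0", "scale": 2.5},
    {"name": "Force", "kind": "voltage", "channel": "cDAQ1Mod2/ai1", "scale": 100.0},
]

def tare(readings):
    """Capture the current readings as zero offsets, one per channel name."""
    return dict(readings)

def apply_tare(readings, offsets):
    """Subtract each channel's stored offset from its current reading."""
    return {name: v - offsets.get(name, 0.0) for name, v in readings.items()}

offsets = tare({"SG1": 0.002, "LVDT1": 0.15, "Force": 0.01})
print(apply_tare({"SG1": 0.012, "LVDT1": 0.65, "Force": 0.51}, offsets))
```

In LabVIEW the analogous structure is an array of clusters (a type def) holding per-channel settings, looped over to create DAQmx virtual channels, with the tare offsets kept in a shift register.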

 

Does someone have an idea of how I should work this out step by step? Any help would be very useful.

 

With kind regards,

 

Peter

 

 

LV Modbus library 1.2.1 - Error code 6101 (timeout)


This seems to be a common problem, but I have not found a clear solution. A timeout (error code 6101) appears randomly during register reads, in both Windows and Real-Time environments and with various computers, no matter which RS232-RS485 converter or short/long cable is used. Does anybody know the main reason for this error?
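While the root cause is investigated (often RS-485 turnaround timing or a too-tight timeout setting), a common mitigation for sporadic timeouts is to retry the read a few times with a short backoff before surfacing the error. A generic Python sketch of that wrapper; `read_fn` stands in for whatever performs one Modbus register read and is purely hypothetical:

```python
import time

def read_with_retry(read_fn, attempts=3, backoff_s=0.05):
    """Retry a register read a few times before surfacing the timeout.
    read_fn is whatever performs one Modbus read (hypothetical here)."""
    last_err = None
    for i in range(attempts):
        try:
            return read_fn()
        except TimeoutError as e:
            last_err = e
            time.sleep(backoff_s * (2 ** i))   # simple exponential backoff
    raise last_err

# Simulated flaky device: fails twice, then answers.
calls = {"n": 0}
def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise TimeoutError("6101")
    return [42]

print(read_with_retry(flaky))   # [42] after two retried timeouts
```

The LabVIEW equivalent is a small retry loop around the Modbus read VI that clears error 6101 and loops, passing any other error straight through.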


How to generate and use a clock signal for internal triggering.


Hello all, 

 

Please excuse my ignorance, but I have been tasked with programming a data collection/operation system for the experiment I am working on, and have no idea where to start. My goal is to generate a 0-5 V clock signal with a user-controlled frequency on the front panel and use that signal to trigger events internally. As for the output, I need to control a digital and an analog port. Both channels should output a pulse of user-specified width at a user-specified delay from every rising edge of the clock signal. The only difference between the two is that I need to specify the output voltage on the analog port.

 

If anyone could offer some guidance as to the best way to do this, it would be great! Thanks in advance!
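The timing relationship described (a pulse of width W starting delay D after every rising edge of a clock of frequency f) can be written down as absolute edge times, which is useful whether the pulses end up on a counter output or a software-timed port. A small illustrative Python sketch; the function and parameter names are assumptions:

```python
def pulse_schedule(freq_hz, delay_s, width_s, n_periods=3):
    """For each rising edge of a clock at freq_hz, return the (on, off)
    times of a pulse starting delay_s after the edge and lasting width_s."""
    period = 1.0 / freq_hz
    return [(k * period + delay_s, k * period + delay_s + width_s)
            for k in range(n_periods)]

# Example: 2 Hz clock, pulses 20 ms after each edge, 5 ms wide.
print(pulse_schedule(2.0, 0.02, 0.005))
```

In hardware terms, this maps naturally onto a counter output task configured with initial delay, high time, and low time, retriggered from the clock edge; generating the clock and the delayed pulses on hardware counters avoids software timing jitter entirely.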

No Image - 1473R With Vision Assistant


Hopefully this is a simple fix and I'm missing something very obvious, so here is what's up. I'm originally a text programmer, but for this project I'm stuck using LabVIEW, which is completely unfamiliar; I've been working on this for days with no progress, so I thought I'd see if anyone had some pointers. The goal of the project is to use the PCIe-1473R FPGA to do live gain control, overlay, and maybe some recognition.

 

I started with the "Image Processing with Vision Assistant on FPGA" example and made a few simple changes to just attempt to get a video feed through. The camera we are using is a Pulnix TM 1325 CL, which outputs a 1 tap/10 bit Camera Link signal. Since this example VI is originally configured for 1 tap/8 bit, I changed the incoming pixel to be read as 1 tap/10 bit, then compiled and tested. When I try to start video acquisition I get no errors, but no frames are grabbed: the acquired-frame count does not increase and nothing is displayed. If I switch to line scan I get a scanning static image, but this is not a line-scan camera, and my other NI frame grabber card shows an image from the camera fine.

 

I wasn't all that surprised by this result, as the input is 10 bit and the acquisition FIFO and DMA FIFO are both originally 8 bit. So I changed them to U16, and also changed the IMAQ FPGA FIFO to Pixel Bus and IMAQ FPGA Pixel Bus to FIFO blocks on either side of the Vision Assistant to U16. With this configuration I again get no image at all; same results. I suspect this is because the incoming image is signed, so the types should be I16 instead. However, there is no setting for I16 in the Pixel Bus conversion methods. Am I not understanding the types happening here, or is there an alternative method for using Vision Assistant with signed images? I'd think it'd be odd not to have support for signed input.
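If the camera really does deliver 10-bit two's-complement data packed into a 16-bit container, the values need sign extension before they can be treated as I16; whether this particular camera's output is actually signed is the open question above. A minimal sketch of 10-bit sign extension, assuming two's-complement packing:

```python
def sign_extend_10bit(raw):
    """Interpret a 10-bit value stored in a U16 as signed (two's complement):
    if bit 9 is set, the value is negative, so subtract 2**10."""
    return raw - 0x400 if raw & 0x200 else raw

print(sign_extend_10bit(0b0111111111))   # 511  (maximum positive 10-bit value)
print(sign_extend_10bit(0b1000000000))   # -512 (most negative 10-bit value)
print(sign_extend_10bit(0x3FF))          # -1
```

On the FPGA this is a cheap operation (test one bit, conditionally subtract a constant), so it could be done on the U16 stream before the Pixel Bus conversion even without a native I16 setting.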

 

Anyway, I've tried all the different combos of settings I can think of. Does anyone have any input? I feel like it must be either a buffer size problem or a signing problem, but I don't really know. Any and all input is welcome!

 

Thanks for helping out a new guy,

 

Kidron

controls appear and disappear


Hello,

 

I want to make controls on the front panel appear. Say the front panel has two visible controls, A and B. If the user makes A true, then another control, X, appears, and the user can make that true or false. If the user makes B true, then Y appears, and the user can make Y true or false. How would I go about doing something like this?

 

Thank you

Simulated device in MAX, self tests without error and has working Test Panels, but doesn't show up in DAQ assistant.


I'm trying to create a development machine where we can test new code without using our physical hardware. I've followed this guide in setting up a simulated device. I can get to step 3.2b, but the device does not show up in the DAQ assistant. In MAX, the device self tests and self calibrates successfully, and when I open the test panels, I see some sort of signal. I assume this is a default simulated input since I haven't told the device to look for anything? Note that the two devices I'm trying to create show up in the Devices and Interfaces section, but that even after running Self-Calibrate, the Self-Calibration date is still unspecified.

 

max_panels.png

 

When I try to test the device and create a voltage input according to the guide, I am unable to see either device in the DAQ task creator.

 

daq_assist.png

 

Steps 1 and 2 of this guide are obviously satisfied. Step 3 is not, but this is unsurprising, since a simulated device wouldn't be found in the Device Manager anyway. Also, I am not running RT, so step 4 is satisfied.

 

Does anyone have any ideas?

 

 

 

type def and string


I created a type def out of a cluster. One of the elements is a string control. I hard-coded in some string values, i.e. Upper Limit, Nominal, and Lower Limit. I applied changes and saved the .ctl file. When I go to my VI and place the type def control on my front panel, the string is empty. If I reopen the .ctl file, the string values are still there; they just don't appear in my VI. Can you NOT hard-code string values in a type def? Is the string control just a placeholder?

I tried "Make Current Values Default" and that didn't work.

Thanks..
