Channel: LabVIEW topics

serial communication


Please see the VI attached. This is the sender's-side LabVIEW program.

My task is to communicate the various cases of an event structure to a subsystem (like a laser) over serial communication; when the subsystem receives the data, it sends an acknowledgement, in the form of another hexadecimal code, back to the first PC.

So for laser standby I need a code that is different from the ones for emission on or emission off.

My problem is how to write the second subsystem's LabVIEW program.

Since I have to control everything from the first PC, I cannot include an event structure in the second PC's program, because then someone would have to manually change values on its front panel.

If I use a state machine, will I be able to communicate between the two LabVIEW programs?
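
To make the question concrete, here is roughly the command/acknowledge loop I imagine running on the second PC, sketched in Python with pyserial only because I cannot paste block diagrams as text. The port name and the hexadecimal codes are placeholders, not my real values:

import serial

# Placeholder command -> acknowledgement codes (hexadecimal bytes).
ACK_FOR_COMMAND = {
    0x01: 0x10,  # laser standby
    0x02: 0x20,  # emission on
    0x03: 0x30,  # emission off
}

with serial.Serial("COM3", baudrate=9600, timeout=1) as port:
    while True:
        data = port.read(1)          # wait for one command byte from PC 1
        if not data:
            continue                 # read timed out, keep polling
        ack = ACK_FOR_COMMAND.get(data[0])
        if ack is not None:
            # ...drive the laser subsystem here, then acknowledge...
            port.write(bytes([ack]))

Is a state machine built around this kind of read/act/acknowledge loop the right way to structure the second program?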

 

 

Please help. Thanks


Vision Assistant on FPGA - Error -5400 Reading Images from DMA FIFO


I should preface this question by saying I'm a computer science summer intern who has been tasked with this project and I'm completely new to LabVIEW, so that is surely part of the problem. I've written text-based code for years, so switching to this has been a bit of a nightmare. For the last several days I've been trying, to no avail, to get a video stream up from our FPGA. We had it working before one of my mentors left, and now, working alone with a slightly different example and the original, I can't get anything up.

 

The goal of the project is to stream 12-bit Camera Link video through a PCIe-1473R-LX110 card, do some image processing (auto-gain and image overlay), and then view the video live on the host. The Vision Assistant seems like the best way to make this happen, so I started with the "Image Processing with Vision Assistant on FPGA" example. Since our camera is 12-bit 1-tap and the example is for 8-bit 1-tap, I changed that setting and modified everything to use 16-bit integers: I changed the DMA FIFO to 16 bit, changed the IMAQ setting to I16 (our images are signed), and expanded everything else I could find that needed it. Whenever I try to acquire images now, I always get the same error: "Error -5499 occurred at Invoke Method: FIFO.Acquire Read Region in NI_VDM_FPGA_Basics.lvlib:IMAQ FPGA Image Transfer from Target U16.vi -> 1473r Acquisition Template (Host).vi." So, as best I can tell, it times out every time the Image Transfer from Target method is called.

 

I've noticed several things that I think may be indicators of the problem, but since I'm fairly new, I don't know exactly how to interpret them. First of all, the front panel reads out image height/width. Interestingly, height shows a value while width shows zero; looking into it, width is pulled from Acq Status.ClocksPerLine. If this is zero, could it mean something deeper is going on that is messing up the image transfer, maybe some timing being wrong wherever the images are read or written? I have also played around with every parameter I can think of in case something is overflowing a buffer: I enlarged the acquisition FIFO on the FPGA, gave a deeper depth to the host FIFO storing the incoming images, and so on. The only thing that produced any change was adjusting the timeout on the Image Transfer from Target method: it would take longer before it errored, and the front panel readout would claim that a larger number of frames had been acquired. This led me to believe it is not a buffer/FIFO size issue.

 

One other thing I tried was removing the Vision Assistant block and sending the pixel data straight to the DMA FIFO, which made no change at all. Anyway, hopefully something in this makes sense to someone. I'm completely stumped at this point. I was on the phone extensively yesterday with engineers from the Vision department and they were not able to get me anywhere. I've attached a zip file of the project as it sits now. Anything to point me in the correct direction would be greatly appreciated.

 

Thanks,

 

Kidron

cRIO exe with 'Front panel communication' and no LabVIEW


I have a project with a cRIO VI and an FPGA VI. The cRIO VI is run with 'front panel communication', i.e. the front panel is open on the Windows host. Is it possible to create a standalone executable that I could place on a laptop that does not have LabVIEW installed (just the free run-time engine) and that can use front panel communication with the cRIO VI?

 

If not, would it be possible using a web interface or some other method instead?

 

If that is not possible, what is the preferred solution: create a host VI that communicates with the cRIO VI via shared variables, build a cRIO executable and deploy it as the startup application on the cRIO, and then build a host executable?

Self-paced Training, Core 3 Exercise incorrect


I am following the Self-paced Training, Core 3. The ZIP file for the exercises has a problem. Starting from section 5, where we start dealing with the TLC project, everything is already done. Basically, I see the solution rather than the exercise file that I am supposed to work on. This is a big error. Where do I get the correct exercise ZIP file? HELP!!!

Only one column when saving 2D array to CSV file


Hi everyone,

The problem is that when I save the 2D array data to a CSV file, there is only one column, and this column contains all the single values of every signal. See the picture.

But after searching, I've found that other people write their programs the same way I do. What is the problem?

PS: the values are in German format, with a comma instead of a point, and I'm sure that disturbs nothing.
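
For comparison, the layout I am trying to get would look like this little Python sketch (not my actual LabVIEW code, just to show the intended result; because my values use a decimal comma, the sketch uses a semicolon as the column delimiter so the two cannot collide):

import csv

# A small 2D array: each row is one sample, each column one signal.
data = [
    [1.1, 2.2, 3.3],
    [1.2, 2.3, 3.4],
]

with open("signals.csv", "w", newline="") as f:
    writer = csv.writer(f, delimiter=";")
    for row in data:
        # Format each number with a decimal comma, one row per line.
        writer.writerow(str(value).replace(".", ",") for value in row)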

 

I would appreciate your help.

best regards

Melo

 

CSV_failed.JPG

Is it possible to programmatically check if a cDAQ chassis is in use?


Hi all,

 

I'm writing a program that will reserve a cDAQ chassis, but I want to know whether the DAQ is already in use before taking control of it. Can this be done in code?

 

Thank you

Capacitive coupling to AC couple a signal with an NI PXIe-6124 DAQ device


Hi everybody.

I need my input channel to be AC coupled, but my device does not support AC coupling and gives me an error when I try to do that using the coupling property for an analog input.

Does anybody know how I can AC couple my signal by inserting a capacitor in series with it? How do I decide on the capacitor size?
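
In case it helps frame the question: my understanding is that a series capacitor together with the analog input's resistance to ground forms a first-order high-pass filter, so the cutoff frequency follows the usual RC relation (R_in is the effective input/termination resistance, which I would still need to confirm for this device):

f_c = 1 / (2 * pi * R_in * C)    =>    C = 1 / (2 * pi * R_in * f_c)

For example, if R_in were 1 MOhm and I wanted a cutoff around 1 Hz, C would come out to roughly 0.16 uF. Is this the right way to size the capacitor?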

 

thank you.

Show dialog from .NET assembly?

I'm just diving into the LabVIEW demo to see what it is capable of. In particular, I am wondering if LabVIEW is capable of allowing me to write simple test applications that exercise methods in my .NET interface, which is used to control various kinds of hardware. After floundering about on the net, I've been able to figure out how to piece together a VI that seems to be syntactically correct. I have a factory class with a method called GetInstance that returns an interface for the hardware that I want to control. Via this interface, I am able to call Initialize. Then I call my Diagnostics method, which should display a modal dialog on the screen. The problem is, even though it looks like the method is called as I single-step through my VI, nothing appears. Can anyone tell me if this is even possible in LabVIEW? Thanks!

Why would LabVIEW allocate more memory on every run?


Dear all, 

 

I have been working with big images in LabVIEW for some time and have started to have some memory usage problems.

The problems got worse when I packed my code segments into smaller subVIs. I suppose the compiler creates additional copies of the image arrays for all the interfaces within the subVIs.

That seems to make sense. What I still do not get is why the compiler would allocate more memory every time I run my code (same image, same parameters). As far as I understand, it should take the same amount of memory every time (allocated on the first execution). However, every time the code finishes executing, I see that memory usage is slightly higher (about 10-20 MB more).

Do you know the typical reasons for such behaviour?

After my code executes, I deallocate the memory taken by the images and the opened IMAQ sessions. I also tried Request Deallocation, with the same results.

Slightly better results were observed when calling the subVIs by reference (dynamic calling), but the increase in memory usage can still be observed.

I would appreciate your help.

 

Thanks,

Esteban 

Multiple tasks in DAQmx


Hello,

 

I have a project where I use 2 AO and 4 DO channels to generate some predefined signals and 2 AI channels to acquire the response. When testing only one sensor this works fine. In the future I will have to test an additional sensor concurrently with the first one. The user can start the first and second slots whenever he wants.

I can't create 2 AO tasks at the same time using only one PCIe card, and I can't combine all the AO channels from slot 1 and slot 2 into one task, because the test on slot 1 has a different frequency (defined by the user while the other slot is running) and amplitude, and is started by the user at a different time.

Do I need to buy an additional PCIe card to run the test on slot 2?

 

IMAQ image overlay with alpha blending (partial transparency) ?


I am using LabVIEW 2014 on Win7. I have found examples of color image overlay with binary transparency (A + B = OUTPUT, with each output pixel either 100% image A or 100% image B), but have not found any examples of alpha blending (partial transparency). For example, some output pixels would be 20% A + 80% B. Can I do that in LabVIEW?
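
Just to be explicit about the operation I mean: per pixel and per color plane it is out = alpha*A + (1 - alpha)*B, with alpha between 0 and 1. Sketched with NumPy arrays rather than IMAQ images (the sizes and bit depth are just examples):

import numpy as np

# Two same-size 16-bit planes, e.g. the red plane of image A and of image B.
A = np.random.randint(0, 4096, size=(480, 640), dtype=np.uint16)
B = np.random.randint(0, 4096, size=(480, 640), dtype=np.uint16)

alpha = 0.2                                   # 20% of A, 80% of B
out = (alpha * A + (1.0 - alpha) * B).astype(np.uint16)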

 

The IMAQ ArrayToColorImage VI takes U64 image pixels as clusters of 4 unsigned 16-bit values, interpreted as Red, Green, Blue, and Alpha planes. Is there any VI that makes use of the 4th plane (alpha) to do alpha blending when combining two bitmaps?

 

IMAQ Overlay Bitmap VI comes with the note "This VI does not support alpha blending." Is there any other VI which does support alpha blending?

 

How to print to a USB thermal receipt printer


I recently bought a 58 mm USB thermal receipt printer and I am having difficulty sending commands to the device. I was able to connect the device using the VISA device manager, and I know it is an NI-VISA raw USB device. Using the USB RAW bulk example I can send text to the device; however, I can't send commands to the printer: it will only print out the commands instead of executing them. For example, if I want to print the logo, the command is GS \ m, where m is used to alter the size. When I send that data, the printer just prints the command text on the receipt. My question is: how do I send a command to the printer and have the printer recognize it as a command?
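
My guess so far (please correct me if I'm wrong) is that the command characters have to be sent as their raw control-code byte values, not as the printed letters "G" and "S". In Python terms, what I think needs to be built and written to the raw USB session is something like this; the parameter byte at the end is a placeholder for whatever the printer's manual specifies:

# Control characters as raw byte values, not as printable text.
ESC = b"\x1b"   # escape (0x1B)
GS  = b"\x1d"   # group separator (0x1D)

# Sending the text "GS" writes the letters 0x47 0x53, which the printer just
# prints. The command form needs the single 0x1D control byte instead:
command = GS + b"\\" + bytes([0])   # placeholder size parameter m = 0

print(command.hex(" "))   # the byte sequence that would go out over VISA

Is that the right idea, and if so, how do I write those raw bytes from LabVIEW through VISA Write?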

Is the cRIO-9067 compatible with LabVIEW 2013?


I just purchased two cRIO-9067s and I'm trying to configure one of them for use. I got the IP address set and booted it into safe mode. The status light blinks twice, then pauses; the documentation says this means I need to download software. NI MAX only suggests using version 14.?, which I'm assuming is LabVIEW 2014, but that option is grayed out. And if I try to do a custom install, everything in the list produces a conflict and won't download. Is the 9067 only compatible with LV 2014, or am I missing a step? The version of LV that I am using is 2013. I'm installing 2014 to test this theory, but I'm not hopeful. Thanks for the help, people.

Error Ring changing code number?


I wrote an application a few years ago and used an Error Ring constant in the code. It was to indicate that a dynamic dispatch VI was not opened by a child class as intended. I then found a description in the included list, and it described the condition very well. That development was done in LV 2011, or perhaps even earlier.

 

Now, after a mass compile in LV 2013 SP1, my error code (400046 specifically) is undefined. I guess I expected error codes to have a continuous, unchanging life once created. Any ideas why it is no longer recognized?

How to control a solenoid directional valve with LabVIEW


Hi everyone,

 

I am very new to LabVIEW, so please be patient with my level of knowledge.

I need to control a solenoid-operated directional valve in a hydraulic system with a LabJack U12. I am using the EDigitalOut VI to control two Grayhill I/O modules, 70M-ODC5 (DC output), for the set mode and the retract mode.

My goal is to switch between the two modes automatically for at least 1000 cycles. The sequence is: activate the set mode for 20 seconds, then deactivate the set line; activate the retract mode for 10 seconds, then deactivate the retract line; then activate the set mode again, and so on.

LabVIEW version: 2011

Hardware: LabJack U12, RB 16 relay board

I have a draft of my VI attached. As of now, I can only get it running for one cycle. I hope someone can take a look and help me get it running continuously; the cycle logic I am after is sketched below.
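
Written out as Python-style pseudocode (set_line() is just a stand-in for whatever actually drives the LabJack digital output, e.g. EDigitalOut, and the channel numbers are placeholders), the loop I am trying to build is:

import time

SET_CHANNEL = 0       # placeholder digital output channel for the set line
RETRACT_CHANNEL = 1   # placeholder digital output channel for the retract line

def set_line(channel, state):
    # Stand-in for the real digital output call (EDigitalOut on the U12).
    print(f"channel {channel} -> {'on' if state else 'off'}")

for cycle in range(1000):
    set_line(SET_CHANNEL, True)        # activate set mode
    time.sleep(20)                     # hold for 20 s
    set_line(SET_CHANNEL, False)       # deactivate set line
    set_line(RETRACT_CHANNEL, True)    # activate retract mode
    time.sleep(10)                     # hold for 10 s
    set_line(RETRACT_CHANNEL, False)   # deactivate retract line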

 

Thank you!

 

missing data in queue


I have 4 asynchronously running VIs that share data with each other via queues. The producer VI puts the data on 2 different queues: one for a consumer VI that writes data to a file, and one for 2 consumer VIs that display data. The first display VI takes all the data (in waveform format) and converts it to a string for display in a table. The second display VI takes a subset of the data and converts it to DBL for display on individual indicators.

 

I am not experiencing any lost data being written to the data file.  I am experiencing missing data for the displays.  I put a test sine wave on one of the data channels to figure out which queue output is having the problem.  There are about 90 channels of data being collected.

 

The table VI only displays the latest data point for each channel in the table. I didn't notice any missing data until I wired the test channel into a chart. I thought it could be an update issue. However, ...

The subset VI displays all the data from the test channel in a chart, and there is missing data. Out of all the channels, only one is being displayed.

 

The producer VI puts the data on the queue using Lossy Enqueue Element function.  The consumers get the data from the queue using the Preview Queue Element function.
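
For reference, the shape of the thing, sketched with Python threads and queues instead of LabVIEW queues (the data is faked, and the "preview" peek is only an approximation of the Preview Queue Element behaviour):

import queue
import threading
import time

file_q = queue.Queue()               # file logger path: every element is kept
display_q = queue.Queue(maxsize=5)   # display path: lossy, oldest dropped when full

def producer():
    for i in range(50):              # fake data standing in for the ~90 channels
        file_q.put(i)
        if display_q.full():         # emulate Lossy Enqueue Element
            display_q.get_nowait()
        display_q.put(i)
        time.sleep(0.005)

def display_consumer():
    seen = set()
    for _ in range(50):
        if display_q.queue:          # "preview": peek without dequeuing
            seen.add(display_q.queue[0])
        time.sleep(0.01)
    print("display saw", len(seen), "of 50 elements")   # gaps show up here

p = threading.Thread(target=producer)
d = threading.Thread(target=display_consumer)
p.start()
d.start()
p.join()
d.join()
print("file queue kept", file_q.qsize(), "of 50 elements")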

 

Any ideas on where to begin diagnosing the problem? I would post code, but I haven't successfully posted any files to this forum. Is there another location where I could post them?

LabVIEW freezes on VISA Write operation


I am using a VI from the community (http://forums.ni.com/t5/LabVIEW/Driver-for-Omega-UTC-USB/td-p/2360528) to read from and write to an Omega USB thermocouple reader. It uses the VISA Write and VISA Read functions. I have it set up to read a thermocouple once per second and then adjust the output power of a heater based on the reading. It runs for a set period of time as part of a PID loop, then stops, and my experiment carries on. It also logs the temperature and output power at each step.

 

I have had a problem occur several times now that is extremely problematic: the VI above will just freeze at the VISA Write function, causing the parent VI to freeze as well. I have to kill LabVIEW in order to fix the problem. I then lose the heating record as well, since it was being written by the parent VI. I could change it so that the file is written continuously, but it is really just there for error checking. The larger problem is that the PID loop stops running, so I lose control of the temperature and it does not stop at the end of the allotted time. There is no error and the parents are all frozen, so there is no way to detect that this has happened and stop the heating. If I call the VI by reference, the caller simply waits, since there is no error message and for some reason VISA never times out.

 

I can open the Omega software for reading these devices and see that the device is working fine, during or after the freeze.

 

The only similar problems I have found in the archives seem to have been solved by a software update years ago (this computer has only ever had LV 2012 and 2013) or by replacing the serial hardware. In my case, the only hardware is the USB thermocouple reader, so I would have to throw out all of this equipment to change anyway. Is my only option to switch to NI thermocouple readers? We use mostly Omega for temperature control, so I would rather not have to do that.

 

Is there any way that I can at least force VISA to time out and give me an error so that I don't have runaway heating? I'll still lose the sample, but at least it will be less dangerous for the hardware.

 

Hardware/software involved: Dell XPS 8700, LabVIEW 2013, Windows 8.1, Omega UTC-USB

Timestamping DAQ data from multiple channels for logging to an Excel file


I am writing a LabVIEW application to be installed on a PC as an executable file with the LabVIEW Run-Time Engine. I want to acquire multiple channels from my DAQmx device at different times during my program. I need to timestamp these data acquisitions and log them to Excel, to be built into a timeline chart there. I will acquire channel 1 during one phase of the acquisition, and then channel 1 again in the next phase of testing, with the hope of reconstructing these acquisition logs in my Excel file on a single chart showing all acquisitions.

 

How do I create a timestamp system that can log this data with precision?  The DAQmx VIs only bring data back in a double array or a waveform.
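
What I have in mind is essentially this, sketched in Python (the start time and sample rate stand in for whatever the waveform's t0 and dt turn out to be in my task):

import csv
from datetime import datetime, timedelta

t0 = datetime.now()        # acquisition start time (the waveform's t0)
dt = 1.0 / 1000.0          # sample interval in seconds (1 kHz here, a placeholder)
samples = [0.1, 0.2, 0.3]  # placeholder data from one read of one channel

# The timestamp of sample i is simply t0 + i*dt.
with open("log.csv", "a", newline="") as f:
    writer = csv.writer(f)
    for i, value in enumerate(samples):
        stamp = t0 + timedelta(seconds=i * dt)
        writer.writerow([stamp.isoformat(), value])

Is keeping this t0/dt bookkeeping myself the right approach, or is there a cleaner way to get precise per-sample timestamps out of DAQmx?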

 

 

 

Setting up DAQmx tasks in the project instead of NI MAX


I am writing a LabVIEW application to be installed on a PC as an executable file with the LabVIEW Run-Time Engine. I can put my DAQmx tasks in my project or in NI MAX. If I put them in my project and build my application, do I lose the ability to change the I/O settings in NI MAX independently of my application?

 

What is the best way to mechanize this while keeping the features of things like DAQmx tasks, with the tasks created in my project rather than in NI MAX?

 

Event structure behavior when a control's property is changed


Hi,

 

Is the attached VI normal operation? Why would changing a random control's property affect the behavior of an event structure?

 

Thx

 

Untitled.png
