Hi,
How can I generate a PWM signal on an Arduino Mega 2560 board from LabVIEW? (I have already installed LIFA and LINX.)
Thanks.
Hey all,
TL;DR: My 6602's channel 2 counter is dropping my signal sporadically. Any thoughts as to why?
Background:
I have an embedded machine running LV 7.1 with DAQmx 7.3 and MAX 4.0. I have been running into a problem with one of the counters on the 6602 card in the chassis. The basic problem is that I'm reading a PWM signal from a motor. The signal goes from the motor, to a back-panel connection, to a line driver, to a TB 2715 connector, and then to a PXI 6602 card. The PWM signal goes to the gate of my counters and I'm using the high-time read function (assuming my signal to be a perfect 1 kHz signal). I have 8 separate stations that follow this same setup. I was running into some noise issues early on, so I set the channel property to enable filtering on the 6602 counter and I'm ignoring any signal shorter than 10 ms; the lowest signal coming back to me will have a high time of 50 ms.
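(For anyone who wants to see the configuration in text form, here is a rough equivalent of what my initialization does, sketched in the Python DAQmx API (nidaqmx) rather than my LabVIEW code; the counter name and min/max limits are placeholders, and the filter property names are my assumption that they mirror the DAQmx C attributes.)

```python
# Rough sketch, not the original LabVIEW code: a pulse-width ("high time")
# measurement with the channel's digital filter enabled so pulses shorter
# than 10 ms are rejected. "Dev1/ctr2" and the limits are placeholders.
import nidaqmx
from nidaqmx.constants import Edge, TimeUnits

with nidaqmx.Task() as task:
    chan = task.ci_channels.add_ci_pulse_width_chan(
        "Dev1/ctr2",
        min_val=0.001,               # shortest high time expected, seconds
        max_val=1.0,                 # longest high time expected, seconds
        units=TimeUnits.SECONDS,
        starting_edge=Edge.RISING,
    )

    # Input filter on this counter channel: ignore anything under 10 ms
    # (property names assumed to mirror CI.PulseWidth.DigFltr.Enable /
    # CI.PulseWidth.DigFltr.MinPulseWidth).
    chan.ci_pulse_width_dig_fltr_enable = True
    chan.ci_pulse_width_dig_fltr_min_pulse_width = 0.010

    task.start()
    for _ in range(10):
        high_time = task.read()      # one high-time reading, in seconds
        print(f"high time: {high_time * 1000:.1f} ms")
```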
Problem:
One of my counters (counter channel 2) is having an issue where it is "dropping the signal". It will read correctly for 5-50 reads, then drop to 0 for a read, then pick back up for a few reads, then drop back to 0. All my other channels work as I expect and run my code fine.
What I've tried:
At first I thought my line drivers were bad, so I made sure everything was wired securely. I then made sure that the connections at the back panel and the TB 2715 were secure and that I had continuity where I expected it to be (i.e., back panel to line driver, line driver to TB 2715). I swapped the line drivers for channel 1 (works) and channel 2 (doesn't work) to determine if it was a driver issue. Channel 1 still worked fine, but channel 2 was still dropping the signal when I was expecting channel 1 to drop due to a bad line driver. I then tied both channel 1's and channel 2's counters together and connected them to channel 1's line driver - the thought being that they should both be bad or both be good. Channel 1 was fine, channel 2 was not. I then tried a different TB 2715 terminal block to make sure the terminal block in my original configuration was working as expected. Channel 1 works, channel 2 does not. I then swapped in a 6602 card from a different tower, with both the original and the new terminal block. Channel 1 works, channel 2 does not. I then went into my code and verified that I set the filter to ignore signals shorter than 10 ms, by setting the filter and then reading back the physical channel, filter enable state, and filter time (a node for writing, followed by a node for reading). Channel 1 works, channel 2 does not. I then hooked up a function generator to my channel 2 input. Finally, I see a clean signal. Replace it with the original PWM connection, and it goes back to dropping the signal. I have monitored both sides of my line driver with an oscilloscope to verify that they have the same frequency and high time at channel 2, and I have looked at the noise on my signal (there is some noise at 0 V and 5 V, but the rise and fall times look fine and I don't see the signal dropping back to 0 during a high time).
What I'm looking for:
Basically, what to do next. At this point I'm not sure what is wrong with my system. I have verified that the connection between my back panel and my 6602 card exists. I have ruled out the possibility of a bad line driver (swapped line drivers with a good signal, channel 2 still drops; tied my channel 2 counter to a "good" counter pin to see if they both fail or not; and put channel 2's signal through channel 1's line driver and read the high time fine). I have ruled out the TB 2715 terminal block by using a different one. I have ruled out my 6602 card by using a different one (granted, it COULD be possible that both cards somehow suffer from the same bad channel). The only thing I can see is signal fidelity being an issue. That being said, when I tied channel 2 to channel 1, channel 2 was dropping while channel 1 was reading fine. I have verified that my filter settings for the channel are set correctly by writing then reading. The only thing that has worked has been the function generator. At this point, I'm thinking it's a bad backplane in the chassis and I'm not going to be able to run channel 2. I guess it could be that the filter for channel 2 is getting overwritten somehow, but I don't use any channel property nodes in my code outside of the initialization VIs, so I don't see how it could be overwritten.
Anyone got any tips or advice for troubleshooting, or what to do next? I'm open to anything sans sacrificing a baby to get it to work. I can attach code, but I'm not sure what part you guys would want and I feel strange just dumping the entire project. The basic setup is a "make the tasks" VI, then a "start the tasks" VI which includes setting the filter, then the test VI (several state machines), then, once the test is done, close the tasks.
Matt
Hello,
I have an array of "n" waveforms acquired from a cDAQ device, each with its own channel name.
I am trying to create an array of checkboxes with n elements, with each Boolean text showing the corresponding channel name. The checkbox array should resize to n elements, since sometimes there are 3, 4, or 5 waveforms.
Should I use a cluster? How do I change the Boolean text and the number of checkbox elements programmatically?
Dear users
For a project with LabVIEW I have to generate and compare the characteristics of a photovoltaic panel (note: school related). First of all, I have to turn on power outlets with DAQmx, which in turn switch on the lamps. These will produce a voltage output from the photovoltaic panel, and then I'll run the Yokogawa GS610 from within LabVIEW to do a linear sweep and import the results.
I have created a basic state machine in which the user can select how many lamps (power outlets) should be enabled to acquire the results. Each state gives feedback to an LED on the front panel.
I'm currently stuck on turning on the relays from DAQmx. I used the DAQ Assistant and selected port1/line0, which turns on power outlet number 1. The problem is that I have connected a Boolean from the LEDs to the data input of the VI, so if I switch to another case with my enum on the front panel, the power outlet stays high.
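To illustrate the behaviour in a text API: in a rough Python nidaqmx sketch (the channel name is a placeholder, and this is not my DAQ Assistant code), the line keeps whatever was last written to it, which is why the outlet stays high until some state explicitly writes it low again.

```python
# Rough sketch: the DO line holds its last written value, so the state that
# leaves "outlet 1 on" must explicitly write False. "Dev1/port1/line0" is a
# placeholder channel name.
import time
import nidaqmx

with nidaqmx.Task() as do_task:
    do_task.do_channels.add_do_chan("Dev1/port1/line0")

    do_task.write(True)     # state "outlet 1 on": energize the relay
    time.sleep(5.0)         # ... run the GS610 sweep here ...
    do_task.write(False)    # when leaving the state, drive the line low again
```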
Any solutions, or suggestions for improving the structure of the program?
I'll attach my project; there are some trial VIs in there as well, but the main file should be "untitled project".
Thanks in advance.
Hi,
I am using an HC-SR04 sensor and the LINX functions. I read the distance via the LINX ultrasonic function, but it is always 0. How can I read the distance with this sensor?
Thank you.
I have an idea for guiding blind people by giving them instructions about obstacles within a certain distance of 3 feet, using image processing techniques in LabVIEW. Since I am interested in doing this in LabVIEW, could anyone please guide us with basic ideas on using digital image processing (DIP)?
Greetings,
I would like some help, please, in creating data files with time-stamped file names from inside a consumer loop. The loop is receiving queued data from three sources, at different rates, in string format, which then has a time stamp appended. It is then sent to a .txt file in JSON format if the record button on the front panel has been pressed. Currently, my code creates a new file every second, because the file name changes with the PC clock. If my file-name sub-VI is outside the loop, it will only ever read the time at startup, of course. How can I create a filename once, at the time the record button is pressed?
I am sure this can be done, maybe with shift registers and a bit of Boolean logic, but I can't seem to crack it.
If there are any glaring errors with my loop architecture, I would be happy for you to point them out too.
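To show the pattern I'm after in text form, here is a plain-Python sketch (the loop, button, and data are stand-ins for my consumer loop, front-panel button, and shift register): the path is computed only on the rising edge of the record button and then reused.

```python
# Sketch of the "latch the file name once" pattern.
import time
from datetime import datetime

current_path = None        # plays the role of the shift register holding the path
was_recording = False      # previous value of the record button

for i in range(6):                      # stand-in for the consumer loop
    record = i >= 2                     # pretend the button is pressed on iteration 2
    data = f"sample {i}"                # stand-in for the dequeued, time-stamped string

    if record and not was_recording:
        # Rising edge: build the time-stamped name exactly once.
        current_path = datetime.now().strftime("log_%Y%m%d_%H%M%S.txt")
    was_recording = record

    if record and current_path is not None:
        with open(current_path, "a") as f:
            f.write(data + "\n")

    time.sleep(0.1)
```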
I'm using the NI-845X I2C API to read from a register on an LTC4151. For a successful read I need to do the following:
1) Write the device address (in this case 0xD8) and then write the register address that I want to read from (in this case 0x02); for this step the R/W bit needs to be 0.
2) Then perform the read by again writing the device address (0xD8), but now with the R/W bit set to 1 (0xD8 | 0x01).
This is what I have so far but I'm having issues with it.
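To spell out the transaction in the two steps above, here is a rough Python/smbus2 sketch (Linux, not the NI-845X API) of the same byte sequence. Note that smbus2 wants the 7-bit address: 0xD8 is the 8-bit write form, so the 7-bit address is 0xD8 >> 1 = 0x6C, and the driver adds the R/W bit itself; if I remember right, the NI-845X configuration also expects the 7-bit address, which is worth double-checking.

```python
# Write the register pointer, then read with a repeated start, as one
# combined transaction.
from smbus2 import SMBus, i2c_msg

DEV_ADDR = 0x6C   # LTC4151, 7-bit address (0xD8 is the 8-bit write address)
REG_ADDR = 0x02   # register to read from

with SMBus(1) as bus:                              # bus number is platform-specific
    write = i2c_msg.write(DEV_ADDR, [REG_ADDR])    # address+W, then register pointer
    read = i2c_msg.read(DEV_ADDR, 2)               # repeated start, address+R, 2 bytes
    bus.i2c_rdwr(write, read)                      # single combined transaction
    msb, lsb = list(read)
    print(f"register 0x{REG_ADDR:02X} = 0x{msb:02X}{lsb:02X}")
```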
Please see this simple code - to me it looks like a bug.
A usability study published by Jeffrey Stylos and Steven Clarke titled Usability Implications of Requiring Parameters in Objects' Constructors discusses a comparison of two methods of Object construction and their relative merits - it concludes that what they term the "Create-Set-Call" method is preferable for almost all users of an API.
In LabVIEW terms, this is something like dropping a default object constant to the block diagram, then calling a series of "Write Property A", "Write Value B", "Set Init Properties" or similar methods.
The "Required Constructor" syntax would presumably be something more like placing the class inside a library as a private member, then providing a public VI which output an initialized version of that class' object having already called those "Write ..." VIs on the class member (taking as arguments the required values, probably as "required" inputs).
The participants in the study are noted as giving feedback as follows (in favour of C-S-C):
Less restrictive: In general, APIs should let their consumers decide how to do things, and not force one way over another.
My question becomes - is this true? Should an API allow programming mistakes that can be avoided by forcing specific operations via (in LabVIEW) access control?
A similar example (not discussed in the paper) would seem to be the Template Method (wikipedia, NI implementation), in which some parts of the API are restricted from the user (in my case, often still me) in order to enforce some specific usage. Why is this inherently different? Just because it looks like a normal method from the outside?
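And a correspondingly rough Python sketch of the Template Method idea (again with hypothetical names): the base class fixes the overall sequence, and a subclass can only supply the hook step, so callers cannot skip or reorder the enforced parts.

```python
from abc import ABC, abstractmethod

class Measurement(ABC):
    def run(self):
        # Template method: the enforced "usage" of the API.
        self._configure()
        data = self._acquire()
        self._cleanup()
        return data

    @abstractmethod
    def _acquire(self):
        ...

    def _configure(self):
        print("default configure")

    def _cleanup(self):
        print("default cleanup")

class VoltageMeasurement(Measurement):
    def _acquire(self):
        return [1.23, 1.24]      # only the variable step is written by the user

print(VoltageMeasurement().run())
```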
Dear All:
I need to run this function a huge number of times, so improving the performance of this function is important to me. Can anyone tell me whether there is something I can do to improve the performance of such a function?
Thanks!
The Exaprom PDF creation VIs work OK for me, but I have a bit of an issue with the concatenation of pages: the gaps between pages are too large. I would like to get them down to zero, with just the normal page break left. I have tried using the page setup, but it only seems to work for the first page.
None of the topics I found relating to error -89137 really addresses what seems to me to be an unusual error description.
I have simultaneous AO and AI processes, each with their own task and their own loop on the same device and triggered off the same digital input channel. The AO task is a finite sample waveform and the timing is set by the input waveform. The AI task is continuous sample with implicit clock.
I cannot attach my actual application, but I have attached a minimum working example. However, while I believe the attached code captures all the relevant processes, it must be missing something because it works and my actual application does not. Which leads me to seek help deciphering the error code thrown by my actual application:
AO Configure Hardware.vi
Property: Start.DigEdge.Src
Property: Start.DigEdge.Edge
Source Device: Dev11
Source Terminal: PFI0
Required Resources in Use by
Device: Dev11
Reserved Terminal: AO/PFI0
Task Name: _unnamedTask<59>
Dev11 is a simulated USB-6002. I've had resource reservation errors before, but what confuses me is the reserved terminal "AO/PFI0". I don't know where that name comes from. Why isn't it "Dev11/PFI0"? But as the attached code demonstrates, Dev11/PFI0 should not be the issue (multiple tasks can trigger off the same digital channel).
Can anyone help me decipher this?
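For context, the triggering arrangement described above would look roughly like this in the Python DAQmx API (nidaqmx). This is only a sketch of the configuration, not my actual code: device, channels, and rates are placeholders, and my real AI task uses implicit timing.

```python
# A finite AO task and a continuous AI task, both started by a digital edge
# on the same PFI terminal.
import numpy as np
import nidaqmx
from nidaqmx.constants import AcquisitionType

waveform = np.sin(np.linspace(0, 2 * np.pi, 1000))

ao = nidaqmx.Task()
ao.ao_channels.add_ao_voltage_chan("Dev11/ao0")
ao.timing.cfg_samp_clk_timing(1000.0, sample_mode=AcquisitionType.FINITE,
                              samps_per_chan=len(waveform))
ao.triggers.start_trigger.cfg_dig_edge_start_trig("/Dev11/PFI0")

ai = nidaqmx.Task()
ai.ai_channels.add_ai_voltage_chan("Dev11/ai0")
ai.timing.cfg_samp_clk_timing(1000.0, sample_mode=AcquisitionType.CONTINUOUS)
ai.triggers.start_trigger.cfg_dig_edge_start_trig("/Dev11/PFI0")

ao.write(waveform)   # load the finite waveform, do not start yet
ai.start()           # both tasks are now armed, waiting for the edge on PFI0
ao.start()

# ... read from ai / wait for ao to finish, then:
ao.close()
ai.close()
```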
I am trying to read the status of the software in my software, but the text is not received. Any ideas about this?
Hi,
I would like to close the default browser using LabVIEW. I know how to close a VI, for example, but I do not know how to close the browser, because it is not the same process.
Could you help me?
Thanks in advance.
Good morning,
I currently have two VIs in my project: one hosted on the FPGA and the other on the host side of the card. To call the FPGA VI from the host I use the functions provided for that purpose, the ones found in the FPGA Interface palette: first I place the Open block, then the Read/Write block, and finally the Close FPGA block.
What I want to know is how, when I run the program on the host, the FPGA VI can automatically be put into run mode, because every time I want to run the host VI I first have to go into the FPGA VI, run it, and then go back to the host VI.
Thanks. Regards.
Hi,
I'm trying to build a simple runtime application for creating/writing the following key (value) into the Windows 10 registry:
[HKEY_LOCAL_MACHINE\SOFTWARE\Policies\Microsoft\Windows\EdgeUI]
"AllowEdgeSwipe"=dword:00000000
Here is what I've got so far, but it does not work.
(Not sure if I should use DWORD or just STRING.)
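For comparison, here is a minimal sketch of the same write in Python's winreg (not the LabVIEW registry VIs), which suggests the value should indeed be a DWORD set to 0. It needs to run elevated, and a 32-bit process may need the 64-bit registry view to land under the expected HKLM\SOFTWARE path.

```python
# Create the key if needed and write AllowEdgeSwipe as a REG_DWORD = 0.
import winreg

key = winreg.CreateKeyEx(
    winreg.HKEY_LOCAL_MACHINE,
    r"SOFTWARE\Policies\Microsoft\Windows\EdgeUI",
    0,
    winreg.KEY_SET_VALUE | winreg.KEY_WOW64_64KEY,  # 64-bit view from a 32-bit process
)
winreg.SetValueEx(key, "AllowEdgeSwipe", 0, winreg.REG_DWORD, 0)
winreg.CloseKey(key)
```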
Hello, could someone please convert my LabVIEW 5.0 VI to the 2017 version? Thank you.
Hi, I have formulated a transfer function for a certain feedback control system. The transfer function is 4th order. I can't seem to make the simulation loop work; my VI doesn't compile. Can anyone help me?
Many thanks!
Hi,
I have a controller connected via serial and I want to get some information about the controller.
I can get this information using VISA serial, but I have a problem when I try to read it.
In the attached VI, I read back the command that I am sending, which is not what I want; I would like to read the response that I receive from the controller.
For example, if I send PV, I want to receive the version of the controller, but instead I receive the same thing back: PV.
Please, could you help me?
I attach the VI.
Thanks in advance.
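For reference, this is the exchange I'm trying to get, sketched with pyserial rather than VISA (the port, baud rate, and termination characters are guesses that would need to match the controller's manual). If the controller echoes the command, the first line read back is the echo and the real answer comes next.

```python
# Write the command with its terminator, then read the reply, skipping an
# echoed command line if one comes back first.
import serial

with serial.Serial("COM3", 9600, timeout=1) as ser:
    ser.reset_input_buffer()            # drop anything left over from before
    ser.write(b"PV\r\n")                # command plus termination characters
    reply = ser.readline().strip()
    if reply == b"PV":                  # looks like an echo, not the answer
        reply = ser.readline().strip()
    print(reply.decode(errors="replace"))
```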