I am trying to create JSON-formatted output that dynamically changes the name of the JSON object based on the loop iteration.
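In text-based terms (Python is used for the sketches throughout this thread), building an object whose key name changes on each iteration might look like the following; the key pattern `channel_<i>` and the values are illustrative assumptions, not the poster's actual data:

```python
import json

result = {}
for i in range(3):
    # The key name changes with the loop iteration.
    result[f"channel_{i}"] = {"value": i * 10}

# Serialize the whole structure once the loop is done.
output = json.dumps(result, indent=2)
```

The same pattern applies in LabVIEW with a cluster-to-JSON conversion inside the loop, formatting the object name from the iteration terminal.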
My VISA session is receiving this information, "20;220;801.2", but the read buffer is not showing the data and the substring extraction is not working either. Does anyone have an idea what the problem might be?
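Assuming the reply really is three numeric fields separated by semicolons, the parsing step can be sketched in Python; the field names `current`, `voltage`, and `power` are guesses for illustration, not taken from an instrument manual:

```python
# Example reply string as described in the post.
raw = "20;220;801.2"

# Strip any trailing termination characters, then split on ";".
parts = raw.strip().split(";")

# Convert each field to a float (names are illustrative guesses).
current, voltage, power = (float(p) for p in parts)
```

If the VISA read buffer itself is empty, the split never gets valid input; checking the termination character and the byte count returned by the read is usually the first step.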
Need help. I must open a spreadsheet file, convert it to an array, locate a named column in that array, and then search for a value in a specified row. This is my work so far. I have been working on this for hours and hours and have tried everything I possibly can, but I cannot figure it out. Thank you in advance.
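As a sketch of the intended logic in Python (the column names and values below are made up for illustration; the real file would be read from disk rather than a string):

```python
import csv
import io

# Tiny in-memory stand-in for the spreadsheet file.
data = """Name,Voltage,Current
PSU1,12.0,1.5
PSU2,24.0,0.8
"""

# Convert the file into an array of rows keyed by column name.
rows = list(csv.DictReader(io.StringIO(data)))

# Search for the row where "Name" matches, then read column "Voltage".
match = next(r for r in rows if r["Name"] == "PSU2")
value = float(match["Voltage"])
```

In LabVIEW the equivalent is Read Delimited Spreadsheet producing a 2D string array, Index Array to pull the header row, Search 1D Array to find the column index, and a second search down that column for the wanted row.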
I have an existing program that can display TDMS file data on a waveform graph. The file I used has only one group. When I used the same VI to read a multi-group TDMS file, the waveform graph was not generated. Instead I got Error 2525: "LabVIEW: TDMS file data could not be converted into the specified data type."
FYI, I was able to read a particular channel from a particular group, but when I read the timestamp column I got the same error. Later I started getting a storage VI internal error as well.
The lab I work for would like to downsize our current setup for laser-induced fluorescence measurements, and we are developing a LabVIEW application to replace various functionalities formerly handled by clunky pieces of hardware. I see old forum posts discussing a Lock-In Amplifier toolkit for LabVIEW from years and years ago -- is there a more modern version available? And if not, does anyone have that toolkit so I can try to use it? I know other people have more recently developed the software on their own, but I thought I would try this first. Thanks!
I have searched the forums and can't seem to find an answer to this. I have a simple project that initializes communication to a power supply and then quickly pulses the output voltage from high to low repeatedly. My problem is that I cannot get it to work unless I am highlighting execution, thereby slowing it down a ton. This may sound similar to other posts, but the difference is that I am not reading anything back from the power supply; I am simply writing to it. Another detail: once I upset the power supply by running my application at full speed, highlight mode no longer works either. This is fixed by closing LabVIEW and starting over. I must be tying up VISA somehow...
I have attached my code and a screenshot. The problem must be occurring within the innermost For Loop, where the VISA writes occur. It seems to me that the 500 ms delays should be plenty long between writes. Does anyone have any ideas how I can improve this?
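The pulse loop itself can be sketched in Python with the instrument write abstracted away. The SCPI-style command strings and dwell time below are assumptions, not the poster's actual values; in a real VISA session it is also worth confirming the write termination character, since an unterminated command can leave the instrument waiting and appear to "tie up" the session:

```python
import time

def pulse_output(write, high_cmd="VOLT 5.0", low_cmd="VOLT 0.5",
                 cycles=3, dwell_s=0.5):
    """Alternate the supply between two setpoints with a fixed dwell.

    `write` is any callable that sends one command string, e.g. a
    wrapped VISA write.  Command strings here are illustrative only.
    """
    for _ in range(cycles):
        write(high_cmd)
        time.sleep(dwell_s)
        write(low_cmd)
        time.sleep(dwell_s)

# Exercise the loop with a list standing in for the instrument.
sent = []
pulse_output(sent.append, cycles=2, dwell_s=0.01)
```

The point of the abstraction is that pacing belongs between complete, terminated commands; if the instrument buffers commands faster than it can act on them, a status query between writes (where the instrument supports one) gives real handshaking instead of a fixed delay.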
In my application I'm logging data samples to a database (the DB is SQL Server; I insert a set of data every second). If the database connection is lost, I retry reconnecting every minute until I succeed. My application is pretty large; data acquisition, processing, and display have all been implemented in separate, independent loops.
When the DB connection is lost, a separate loop is enabled to try reconnecting, with an iteration period of about one minute. The issue I'm facing is that when I call "DB Tools Open Connection.vi" with a connection timeout of 5 seconds, the entire application (all of the independent loops) slows down for those 5 seconds. An interesting fact is that processor utilization goes down during that 5-second window.
Processor utilization during normal execution is ~45%.
Has anyone experienced a similar issue, or does anyone have any thoughts on this?
Thank you
Adarsh
Development environment: LabVIEW 2016 32-bit on Windows 7 64-bit, i5 processor, 8 GB RAM
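One common mitigation, sketched here in Python rather than G, is to isolate the blocking connect attempt so nothing else ever waits on it. (In LabVIEW terms: keep the toolkit's blocking call out of any shared execution context; the Database Connectivity Toolkit goes through ADO/ActiveX, and ActiveX calls may be serialized through the UI thread, which could explain why independent loops stall. That explanation is a hypothesis, not a confirmed diagnosis. The timings below are shortened so the sketch runs quickly.)

```python
import threading
import time

def connect_with_timeout():
    """Stand-in for 'DB Tools Open Connection' with its blocking timeout
    (shortened from 5 s for this sketch)."""
    time.sleep(0.2)   # simulated blocking wait
    return False      # database still unreachable

def reconnect_loop(stop):
    # Retry until connected or asked to stop (1 min interval in the real app).
    while not stop.is_set():
        if connect_with_timeout():
            break
        stop.wait(0.1)

stop = threading.Event()
worker = threading.Thread(target=reconnect_loop, args=(stop,), daemon=True)
worker.start()

# The "other loops" keep iterating while the worker blocks.
ticks = 0
deadline = time.monotonic() + 0.5
while time.monotonic() < deadline:
    ticks += 1
    time.sleep(0.01)

stop.set()
worker.join(timeout=2.0)
```

The observable symptom matches the sketch: if the blocking call shared a thread with the other loops, `ticks` would freeze during each connect attempt; isolated, it keeps counting.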
I am creating a tool for managing user error codes. What I realized is that the "xxx-errors.txt" files shipped with LabVIEW are written without escaping special characters (' " < > &) in the text nodes (the error descriptions). This makes it almost impossible to load these files using LabVIEW's built-in XML tools. I created a custom converter for special characters (figure below), but it fails on the "<" character, since it returns the wrong line in the "Parse Error" output. However, my point is not that I don't know how to work around this; I know I can still parse the text with the proper use of regex. What I would like to know is: what is the purpose of formatting the data as XML, but in a way that is incompatible with the tools for it? It's totally brainless, IMO...
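For comparison, a properly escaped description round-trips cleanly. Python's standard library shows the five substitutions the shipped files would need (the example description string is made up):

```python
from xml.sax.saxutils import escape, unescape

# An example error description containing the problematic characters.
desc = 'Value must be < 10 & "quoted"'

# escape() handles & < > by default; quotes need the extra-entities map.
escaped = escape(desc, {'"': "&quot;", "'": "&apos;"})

# Reversing the mapping recovers the original text exactly.
roundtrip = unescape(escaped, {"&quot;": '"', "&apos;": "'"})
```

Note that `&` must always be escaped first (which `escape` does internally); a hand-rolled converter that substitutes `&amp;` after the other entities will corrupt them, which may be related to the "<" failure described above.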
The program is running almost as intended, but for some reason the outputs "New Picture 3" and "New Picture" appear in an "aqua scale" when the intended result was grayscale. I really don't know where my code went wrong; I am still learning and would be very grateful if someone could offer some assistance. I attached the VI and an example image to be used as input. I attached the image as a JPEG, but it should really be a BMP (the website does not allow BMP files). Thanks in advance.
So far I am using the default FPGA personality, but recently I was given a bitfile with the extension ".lvbitx". I need some additional functionality that the new bitfile provides, but I'm not sure I'll stick with it, and there's a chance I will need to roll back to the default FPGA personality.
Installing seems pretty straightforward with NI MAX: I just browse to Remote Systems > myRIO > Device Interfaces > myRIO FPGA, click "Update Firmware", and it lets me browse to the location where I stored the given bitfile. My concern is: if I update the FPGA firmware to this bitfile, will I be able to roll back to the old default FPGA firmware/bitfile if I want to? I have no idea where the current bitfile is stored. Do I need to restore it manually, or will it come back once I click "Erase Firmware"? There is a high chance that I will need to go back to the original default personality.
I created an "A.dll" in Visual Studio 2015; another DLL, "B.dll", is loaded explicitly by "A.dll". I also made a test program in Visual Studio 2015 that calls "A.dll" implicitly (statically), and "B.dll" is loaded by "A.dll" normally. Everything looks great. After that, I integrated "A.dll" into LabVIEW and put "B.dll" in the same folder.
But it always fails in LabVIEW when "A.dll" loads "B.dll". I added some debug output and found that the error code is 126 ("The specified module could not be found").
I have tried adding the path of "B.dll" to the VI Search Path in LabVIEW, and I also put "B.dll" in the /Windows/System32 folder, but the problem is still not resolved.
Could you please tell me the reason and how to resolve it?
Thought I would reach out to the LV community to see where people stand on this topic... Do you face pushback on the use of LabVIEW in the workplace from programmers such as those who primarily work in JS, C#, Python, and other text-based environments? Let's call them "professional coders" (not meant as an insult to the pros in LV; please read on). If so, how do you manage this? Is LV seen as an inferior language in these circles?
I have been programming in a LabVIEW environment for about 5 years now and really enjoy it. I've found that the graphical programming methodology has allowed me to "see" inefficiencies in my code structure quickly and to make higher-level architecture decisions that would have taken me much longer to realize in a text-based language. But I do not consider myself a "programmer". I can make some great, efficient, easy-to-read programs, but at the end of the day I'm a jack of all trades (master of none). I didn't go to school for programming; I use my programming skills as one piece of the puzzle to solve an issue and get the job done. I could be programming a robotic arm one day, setting up a PLC another, wiring a panel, or sketching up an idea for a mechanical fixture. I'm sure a formally educated programmer is able to set up an efficient architecture in a text-based environment, but I find that for the non-programmer, LabVIEW is a faster way to realize good coding practices.
Recently I have been told about how great Python is, and as text-based languages go, it looks pretty simple. But from my initial learning, making nice GUIs seems overly complicated, which is even more apparent in juxtaposition with LabVIEW, given how heavily the GUI frameworks lean on the object-oriented programming model. Possibly Python is better for making headless programs, those that have no need for a GUI?
Maybe the answer is combining both worlds: calling .py code where it is a more efficient way to get the job done, or when it's the easier route to get along with colleagues who opt for the keyboard over the mouse and don't want to go anywhere near LabVIEW.
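For anyone exploring that combination: LabVIEW can call into a plain Python module, so the Python side can stay as simple as a function taking scalar or array parameters. The module name, function name, and signature below are illustrative assumptions, not an established interface:

```python
# beam_stats.py: a function a LabVIEW Python integration could call.

def summarize(samples):
    """Return (mean, minimum, maximum) for a list of float samples."""
    n = len(samples)
    return sum(samples) / n, min(samples), max(samples)

# Quick self-check when run standalone.
stats = summarize([1.0, 2.0, 3.0])
```

Keeping the Python side stateless like this (data in, data out, no globals) tends to make the boundary between the two environments much easier to debug.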
What do you think? What have been your experiences with this topic?
So, in my software (2014_Conex working version) a small delay stage is controlled by an external Conex-CC controller box (both from Newport). At the moment, the delay stage moves through a specific range and stops at a specified number of positions. I'd like to add a Newport power meter (PM, 841-PE-USB) in such a way that it measures the power at a specific position and then plots both, with the delay-stage position on the x-axis and the power on the y-axis. I was told to save the PM code as a subVI, since it's fairly small, and just insert it into the main, larger code. I'm not sure where and how to connect the subVI, and I was hoping you could help me? I'd really appreciate your help, since I'm new to LabVIEW.
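The overall pattern (move the stage, call the measurement subVI, collect x/y pairs for the graph) can be sketched in Python; the `read_power` stand-in and the position values are purely illustrative, not the real instrument interface:

```python
# Hypothetical stand-in for the power-meter subVI read.
def read_power(position_mm):
    return 0.5 * position_mm  # fake reading, for illustration only

# Positions the delay stage steps through (example values).
positions = [0.0, 2.5, 5.0, 7.5]

# Inside the positioning loop: move, then measure, then collect.
powers = [read_power(p) for p in positions]
xy_pairs = list(zip(positions, powers))  # data for an XY graph
```

In LabVIEW terms, the PM subVI call sits inside the same loop that commands each stage position; the position and power values exit the loop through auto-indexing tunnels and feed an XY Graph after the loop completes.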