Hello, I'm currently following the NI tutorial to set up a Remote Panel on an RT target. I managed to set everything up just as the tutorial indicates, but when I try to access the VI I can't find it.
Here is my configuration:
The IP address of the cRIO is: 192.168.0.250
Remote Panel Port: 8000
System Web Server Port: 80
The VI configured as startup is: HydraLog500_3.vi
I created an HTML file using the Web Publishing Tool. The file is: maletines.html
I created a destination folder called "www", located in: /c/ni-rt/system/www
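For reference, my understanding of the tutorial is that the page is served by the System Web Server (port 80, so no port is needed in the address), while port 8000 carries the remote panel connection itself, so I expect the panel to be reachable from a browser at a URL of this form:

http://192.168.0.250/maletines.html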
Several months ago I created a LabVIEW program that takes data from a string potentiometer and a load cell, filters it, shows a live graph and readout, and then creates a graph to save the data. Once I got it where I liked it, I created an executable file, tested it, and it worked great. Now, months later, I have a new computer (that does NOT yet have LabVIEW, but DOES have the NI Device Monitor), and the executable file will not show the graph and collect data. When I open the configuration for the device from the NI Device Monitor and run the "self test", it appears to be working, so my guess is that the executable file is no longer able to connect to my DAQ unit. Is there any way I can fix this without having to reinstall LabVIEW and edit the program to fit this computer?
I have managed to send values and get results by compiling my VI to a DLL and calling it from C# code. Does anyone know if this same functionality is available or replicable using LabVIEW for Linux, and which language (e.g. Python) could make this possible on an Ubuntu platform, for example? I am using LabVIEW 2011.
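For context, this is roughly what I hope would work on Linux against a LabVIEW-built shared object, sketched in Python with ctypes (the library file name and the exported function Add and its signature are hypothetical):

import ctypes

# Load the shared object built from the VI (hypothetical file name)
lib = ctypes.CDLL("./libMyVI.so")

# Declare the exported function's signature
# (hypothetical export: double Add(double a, double b))
lib.Add.argtypes = [ctypes.c_double, ctypes.c_double]
lib.Add.restype = ctypes.c_double

print(lib.Add(2.0, 3.0))   # -> 5.0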
I am a fairly new LabVIEW user and I am trying to send an analog output pulse from an NI 9269 to an NI 9219 on the same NI cDAQ-9184 chassis. I am using the custom_pulse_generator.vi I found on the NI website. I was able to get the hardware to communicate on the AO side, and I am using a jumper wire from a single AO channel to a single AI channel; however, I cannot get the AI waveform graph to show the pulse data on the front panel. I have tried with and without while loops around the DAQmx Read VI, with no success. The AI is simply to verify what AO pulse signal is actually being sent out.
This is the only hardware I have available. I have attached the VI for reference.
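For context, the behavior I am trying to reproduce in the VI looks like this when sketched with NI's Python nidaqmx API (the cDAQ module/channel names are placeholders, and the rate is kept low because the 9219 is a slow, high-resolution module):

import nidaqmx
from nidaqmx.constants import AcquisitionType

RATE = 100   # S/s; the NI 9219 samples slowly
N = 100      # samples per channel (1 s of data)

# A single 5 V rectangular pulse, 0.2 s wide, starting at t = 0.2 s
pulse = [0.0] * N
for i in range(20, 40):
    pulse[i] = 5.0

with nidaqmx.Task() as ao, nidaqmx.Task() as ai:
    ao.ao_channels.add_ao_voltage_chan("cDAQ1Mod1/ao0")   # 9269, placeholder
    ao.timing.cfg_samp_clk_timing(RATE, sample_mode=AcquisitionType.FINITE,
                                  samps_per_chan=N)

    ai.ai_channels.add_ai_voltage_chan("cDAQ1Mod2/ai0")   # 9219, placeholder
    ai.timing.cfg_samp_clk_timing(RATE, sample_mode=AcquisitionType.FINITE,
                                  samps_per_chan=N)

    ai.start()                        # arm the read before the write starts
    ao.write(pulse, auto_start=True)  # generate the pulse
    data = ai.read(number_of_samples_per_channel=N)  # wire this to the graph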
I have a true/false case structure. In the True case I subtract a number; the False case passes the value straight through. I want to connect one output tunnel to both the True and False cases, but I get a "wire connected to an undirected tunnel" error. I know how to fix this on a numeric case structure by right-clicking and selecting "Use Default If Unwired", but there is no default for a true/false case structure. Thanks
I am upgrading a test system from Windows XP and LabVIEW 8.5 to a new PC with Windows 10 and LabVIEW 2016. When I run this NI Switch Executive VI on the new system, I get an error; it runs fine on the old one. I also tried doing a Validate in NI MAX for the virtual channel, and it gives the same error. I imported all the IVI drivers and virtual devices from the old system, so they look identical in NI MAX. I am not sure why it's not working.
Is it possible in LabVIEW Web Services to transmit a binary file to the client (to be downloaded) in response to an HTTP request? For example, mapping a URL such as /download/filename so that the file is downloaded to the local computer?
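In plain HTTP terms, what I am after is an endpoint that streams the file bytes with a Content-Disposition: attachment header so the browser offers to save them; a minimal Python sketch of those mechanics (not the LabVIEW Web Services API itself) looks like this:

from http.server import BaseHTTPRequestHandler, HTTPServer

class DownloadHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Map /download/<filename> to a file on disk (no path sanitizing here)
        if self.path.startswith("/download/"):
            name = self.path[len("/download/"):]
            with open(name, "rb") as f:
                data = f.read()
            self.send_response(200)
            self.send_header("Content-Type", "application/octet-stream")
            self.send_header("Content-Disposition",
                             'attachment; filename="%s"' % name)
            self.send_header("Content-Length", str(len(data)))
            self.end_headers()
            self.wfile.write(data)      # the raw file bytes
        else:
            self.send_error(404)

HTTPServer(("", 8080), DownloadHandler).serve_forever()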
Hi, as with the Actor Framework Message Maker in the Actor Framework, is it possible to build messages for a singleton class?
[image of the code]
What I want is to build a subVI with the inputs and outputs, such that the part of the code you see in the image is generated automatically. Is this possible?
That way, it could be added as one of the methods of a polymorphic VI.
When creating a singleton class, is it possible to create an inheritor of the singleton class that adapts dynamically at run time? That is, could some method change the class at run time so that the singleton starts to behave like a child class?
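To illustrate what I mean outside LabVIEW, here is the pattern sketched in Python: a singleton whose stored instance can be replaced at run time by a child-class instance, so every later caller gets the child's behavior (the class and method names are made up for the example):

class Singleton:
    _instance = None

    @classmethod
    def instance(cls):
        if Singleton._instance is None:
            Singleton._instance = cls()
        return Singleton._instance

    @classmethod
    def replace(cls, new_instance):
        # Swap in a (possibly child-class) instance at run time
        Singleton._instance = new_instance

    def greet(self):
        return "parent behavior"

class Child(Singleton):
    def greet(self):
        return "child behavior"

print(Singleton.instance().greet())   # "parent behavior"
Singleton.replace(Child())            # change the class at run time
print(Singleton.instance().greet())   # "child behavior"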
Hi everyone, I want to buy a lightweight laptop with long battery life, and I have found a lot of very good laptops with Core M processors. But I have doubts about the performance of these CPUs. I am not a professional software guy; I am mainly focused on electronics and optics. Most of the time I use LabVIEW as a general programming tool to communicate with hardware through serial port/SPI/Ethernet. Sometimes I do offline data processing like fitting and array processing. Has anyone tried LabVIEW on a laptop with a Core M CPU, such as the ASUS UX305? Were there any problems? Thanks a lot!
Hey, I am controlling two Thorlabs stages (MTS50-Z8 & LTS300) using the Kinesis software provided by Thorlabs (my specs are Kinesis 32-bit, LabVIEW 15 32-bit, Windows 7 64-bit, .NET 4.x). I want to be able to see the current position of the stages even while they are moving. I believe this is a multithreading problem, so here's my question: does the .NET container allow multithreading? More precisely, can I query the stages while they are running? If so, how? Has anyone got any Kinesis experience? Cheers in advance.
I'm trying to retrieve text from a text file and then have my computer type that text into another program's text box. I got the text from the file into LabVIEW, but I can't figure out how to make the computer type out that text. I looked into the keystroke event functions, but I am stuck.
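For context, the effect I am after is what the pyautogui package does in Python (the file name is a placeholder, and putting the cursor into the target program's text box is left to the user):

import time
import pyautogui

# Read the text from the file
with open("message.txt") as f:
    text = f.read()

# A few seconds to click into the other program's text box
time.sleep(3)

# Simulate typing the text, one keystroke at a time
pyautogui.write(text, interval=0.02)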
I would like to connect an ESP8266EX module to an sbRIO-9623, but I am not sure about the connections. Do I need an additional module to interface them?
I am thinking about using TCP to communicate between the sbRIO and a PC, with the sbRIO as the server side and the PC as the client side, to indicate an emergency signal from the sbRIO on the PC.
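The shape of what I have in mind, sketched in Python (the port number, the placeholder IP address, and the one-byte message format are my own assumptions; the two roles run on different machines and are shown together only for illustration). On the sbRIO, the server role would be built from the LabVIEW TCP Listen/Write VIs:

import socket

PORT = 6340   # arbitrary choice

# --- Server role (the sbRIO side) ---
srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
srv.bind(("", PORT))
srv.listen(1)
conn, addr = srv.accept()
conn.sendall(b"\x01")          # one byte: emergency flag raised
conn.close()

# --- Client role (the PC side) ---
cli = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
cli.connect(("192.168.1.100", PORT))   # sbRIO address, placeholder
if cli.recv(1) == b"\x01":
    print("Emergency signal received from the sbRIO")
cli.close()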
Hi everybody, I am having some problems with my LabVIEW code.
I am running a Triax320 from Horiba, and I want the program to save a file with the data every time it takes data. I want it to put all the data files in the same folder, with the date and time as the file name.
I have managed to add the saving part to my code, but I get a prompt every time asking for the file name and the folder.
Obviously I am doing something wrong, but I don't know how to solve this problem.
Can you help me please?
I am attaching a JPG image of the code and the VI.
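My understanding is that the prompt appears because the file path input is left unwired or empty, so LabVIEW falls back to a dialog; what I am trying to achieve is the equivalent of this Python sketch, where the full path (fixed folder plus date/time name) is built in code and passed to the write call (the folder path is a placeholder):

import datetime
import os

folder = r"C:\TriaxData"            # fixed destination folder, placeholder
os.makedirs(folder, exist_ok=True)

# e.g. 2016-03-07_14-25-31.txt (colons are not legal in Windows file names)
name = datetime.datetime.now().strftime("%Y-%m-%d_%H-%M-%S") + ".txt"
path = os.path.join(folder, name)

with open(path, "w") as f:
    f.write("Triax320 data goes here\n")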
This is a project for my undergraduate thesis through a co-op program. I am not super familiar with LabVIEW, but it is what I have to work with, so I have a number of questions I am looking for help on.
The task at hand is to create a program that will do the following:
Create a frequency sweep from 20 Hz to 2000 Hz that will be output through the sound card from the aux jack.
Simultaneously collect data from two microphones that are on either end of a tube, with baffling in the middle, to show the difference the baffling makes.
Graph the data/output the data to Excel to make a graph of the ratio of sound pressure output over sound pressure input, to show effectiveness relative to other configurations.
I have not been able to get Play Waveform.vi to work through my sound card, so I have been working with a much cruder method. I also cannot seem to get the data I am looking for to output after collecting the sound.
Attached are two of my best attempts thus far. The v5 file is a modification of a .vi I was able to find.
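For reference, the sweep itself is the part I can describe precisely; this is the signal I am trying to produce through the sound card, sketched in Python with numpy/scipy (using the sounddevice package for playback is an assumption on my part):

import numpy as np
from scipy.signal import chirp
import sounddevice as sd

FS = 44100           # sound card sample rate, Hz
T = 10.0             # sweep duration, seconds

t = np.linspace(0, T, int(FS * T), endpoint=False)
# Linear frequency sweep from 20 Hz to 2000 Hz over T seconds
sweep = chirp(t, f0=20, f1=2000, t1=T, method="linear")

sd.play(sweep, FS)   # plays through the default output (the aux jack)
sd.wait()            # block until playback finishes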
I use a Network Stream to communicate between a VI and an EXE; here the VI is the writer and the EXE is the reader. While creating the endpoint I give a minimum buffer size, say 100, but when I actually write to the stream I input an array of DBL of size 1000. As per the documentation this situation is handled by LabVIEW, and I am indeed reading all 1000 of my values. But the memory keeps piling up for the EXE, while the LabVIEW memory is fairly constant, or at least not increasing at a noticeable rate. The same happens even if I set the buffer size to the expected array size.
So even though I use the scalar data type as recommended (array of numeric) and the array size is also fixed for all transfers, why does the memory keep increasing?
So I'm trying to make a LabVIEW program that can recognize an image that may spontaneously pop up on the screen. When that image pops up, I need the program to click on it. I know how to do the clicking part, but I don't know how to have LabVIEW recognize the image. Any ideas?
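For what it's worth, the behavior I am after is what the pyautogui package does in Python: poll the screen for a saved template image and click its location when it appears (the template file name is a placeholder); I am looking for the LabVIEW equivalent:

import time
import pyautogui

while True:
    # Returns the center Point of the match, or None if it is not on screen
    # (recent pyautogui versions raise ImageNotFoundException instead)
    loc = pyautogui.locateCenterOnScreen("popup_button.png")  # placeholder
    if loc is not None:
        pyautogui.click(loc.x, loc.y)
        break
    time.sleep(0.5)   # poll twice per second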
I'm having a problem programmatically creating Custom Scales for Analog Input with DAQmx. Here's the background. We're trying to develop a small triaxial sensor for monitoring muscle twitches (it's a Student Project, I'm an "advisor"). We have a Triaxial Accelerometer that puts out voltages proportional to the X, Y, and Z components of acceleration. The Accelerometer runs off a 5 V supply, with each channel having a bias (offset) of 1.5 V and a gain of 0.3 V/g, with about a 10% variability in these parameters (i.e. the gain ranges from 0.27 to 0.33 V/g).
We can easily read the three channels of acceleration with an NI USB-6009. In the original "Proof of Concept", we were easily able to measure muscle twitches, but noted that the channels were definitely uncalibrated.
I devised a Calibration Procedure that is quick, robust, and reproducible: it requires measuring the X, Y, and Z voltages while the accelerometer is held motionless in six orientations, and the entire process takes 15 seconds. But then I decided to "get clever" (always a bad sign).
I've never used Custom Scales in DAQmx, but I thought "Why not use the Calibration values for the X, Y, and Z axes to programmatically set Custom Scales for the three axes, so everything is in units of g and centered around 0 g?" I had defined a "Triax" Task in my Project; if I manually entered the Scale Factors deduced from my Calibration, the axes all read between +1 g and -1 g (depending on orientation) when the sensor was motionless.
But I cannot figure out how to take the Calibration data and programmatically assign each channel its own Custom Scale.
Here's some code. This first routine finds the Triax Task stored in the Project and makes sure that the correct USB-6009 is connected. Within the Project Task, I've renamed the three Physical Channels ai0, ai1, and ai2 to X, Y, and Z, and set the voltage range from 0 to 5 V; this code basically resets the scaling for these three channels. It seems to work fine.
[code image]
The reason that I do this is that I want to record these three channels using the standard Volts scaling, as I'll use these voltages to run the Calibration procedure I developed to get the three Bias and Gain settings needed for the Custom Scale.
Here is what I tried to do to create a Custom Scale for the X, Y, and Z channels (the Scales are named X_Scale, Y_Scale, and Z_Scale). I tried a number of things; this is just one of them, but I've not had success getting anything to work (to say nothing of avoiding DAQmx errors). Incidentally, if I enter the Scale Factors "manually", my Accelerometer readings are, indeed, properly scaled between -1 g and +1 g.
[code image]
Some of this code may look overly complicated; trust me, I tried to make it simpler, but there are all kinds of hidden "gotchas" in some of these functions that "made me do it" this way. But, of course, it doesn't work.
Insights and suggestions are most welcome! I'll continue to work on it, of course (but, fortunately, I didn't bring the device home with me, so I get a little break ...).
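In case it clarifies the math I am after: each channel needs a linear map from volts to g, scaled = (V - bias) / gain, i.e. slope = 1/gain and y-intercept = -bias/gain. Expressed in NI's Python nidaqmx API, the per-channel scale creation would look roughly like this (I am doing the same thing with the DAQmx scale and channel VIs; the device name and the example calibration numbers are placeholders):

import nidaqmx
from nidaqmx.constants import UnitsPreScaled, VoltageUnits
from nidaqmx.scale import Scale

# Per-channel calibration results: (bias in V, gain in V/g), example numbers
cal = {"X": (1.49, 0.31), "Y": (1.52, 0.29), "Z": (1.50, 0.30)}

task = nidaqmx.Task()
for i, axis in enumerate(["X", "Y", "Z"]):
    bias, gain = cal[axis]
    # scaled = (V - bias) / gain  ->  slope = 1/gain, intercept = -bias/gain
    Scale.create_lin_scale("%s_Scale" % axis,
                           slope=1.0 / gain,
                           y_intercept=-bias / gain,
                           pre_scaled_units=UnitsPreScaled.VOLTS,
                           scaled_units="g")
    task.ai_channels.add_ai_voltage_chan(
        "Dev1/ai%d" % i,                      # placeholder device name
        min_val=-2.0, max_val=2.0,            # limits are in scaled units (g)
        units=VoltageUnits.FROM_CUSTOM_SCALE,
        custom_scale_name="%s_Scale" % axis)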
Hey, I am playing around with the PT-104 example VI provided by Pico. I am trying to save some temperature data with timestamps; see the attached VI. For some reason, the CSV file this code produces consists of only two lines, the first "Time, Temperature" and the second "0, 17". How can I append data points to the file? I want my CSV file to look like this:
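For comparison, the behavior I want is what append mode gives in Python: write the header once, then add one "timestamp, temperature" row per reading instead of rewriting the file (the file name and the stand-in reading are placeholders):

import csv
import datetime
import os
import random

PATH = "pt104_log.csv"   # placeholder file name

# Write the header only when the file does not exist yet
if not os.path.exists(PATH):
    with open(PATH, "w", newline="") as f:
        csv.writer(f).writerow(["Time", "Temperature"])

for _ in range(10):
    temperature = 17.0 + random.random()    # stand-in for the PT-104 reading
    with open(PATH, "a", newline="") as f:  # "a" = append, do not overwrite
        csv.writer(f).writerow(
            [datetime.datetime.now().isoformat(), temperature])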