Hi,
I'm wondering whether there are any LabVIEW implementations of an efficient optimization algorithm for "real-world" signals (i.e. a control problem rather than a pure math problem). The goal is to automatically adjust a set of physical parameters (e.g. voltages, motor positions or the like) so as to maximize a measured "objective" signal. An intuitive example would be aligning a laser beam through a pinhole with a motorized mirror, where the feedback is provided by a photodiode or power meter behind the pinhole.
I thought about using one of the optimization VIs, but I suspect they would have an inherent problem with an objective function derived from a "real-world" measurement because of noise and/or drift. Maybe it's possible to tune the step size and tolerance to get a reasonable result, but I couldn't figure out a robust procedure so far.
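To illustrate the kind of thing I had in mind (written here as Python pseudocode only because it's easier to post than a block diagram; the real implementation would be a LabVIEW VI, and set_mirror / read_photodiode are just placeholders for the actual hardware calls): average several readings inside the objective function to suppress noise, then hand it to a derivative-free optimizer with deliberately loose tolerances.

```python
import numpy as np
from scipy.optimize import minimize

def set_mirror(x):
    """Placeholder: move the motorized mirror to position x = (tip, tilt)."""
    pass

def read_photodiode():
    """Placeholder: return one (noisy) power reading from the detector."""
    return 0.0

def averaged_objective(x, n_avg=20):
    """Average several readings to suppress noise; return the negative
    value so that a minimizer effectively maximizes the measured power."""
    set_mirror(x)
    return -np.mean([read_photodiode() for _ in range(n_avg)])

# Derivative-free simplex search with deliberately loose tolerances,
# so that residual noise does not trigger premature convergence.
result = minimize(
    averaged_objective,
    x0=np.zeros(2),
    method="Nelder-Mead",
    options={"xatol": 1e-3, "fatol": 1e-4, "maxiter": 200},
)
```

Even with the averaging, though, drift during the search seems to throw this kind of one-shot optimization off, which is why I'm leaning toward a continuous control approach instead.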
I'd rather treat it as a control problem, with an approach similar to a PID controller, but instead of locking the "process variable" to a setpoint on a monotonic curve, it should lock it to an extremum (which can drift over time). I'm not very familiar with control algorithms, so I don't even know whether this is possible at all. Does anyone know of a suitable solution?
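To make the behaviour I'm imagining a bit more concrete, here is a crude hill-climbing ("perturb and observe") loop, again as Python pseudocode with placeholder set_position / read_signal calls and made-up numbers:

```python
import time

def set_position(x):
    """Placeholder: command the actuator (e.g. one mirror axis) to x."""
    pass

def read_signal():
    """Placeholder: return one reading of the objective (e.g. photodiode)."""
    return 0.0

def track_maximum(x0, step=0.01, dwell=0.05, n_avg=10):
    """Keep stepping in whichever direction increases the signal and
    reverse when it decreases, so the loop can follow a drifting maximum."""
    x, direction = x0, +1
    set_position(x)
    time.sleep(dwell)
    last = sum(read_signal() for _ in range(n_avg)) / n_avg
    while True:
        x += direction * step
        set_position(x)
        time.sleep(dwell)                      # let the hardware settle
        value = sum(read_signal() for _ in range(n_avg)) / n_avg
        if value < last:
            direction = -direction             # went downhill, turn around
        last = value
```

Something along these lines, but more robust against noise and tunable like a PID loop, is what I'm after.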