
Memory function in Control and Simulation loop - ODE solver problem


Hello,

 

I am currently using the Control & Simulation Loop to simulate the behaviour of what is essentially a spring-damper-mass system. In the process, the change in time (dt) is used to integrate an arbitrary value. I am using a built-in Memory function to store the time from the previous iteration, so that the time change (dt) can be calculated.
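Since the block diagram cannot be pasted as text, the sketch below is a rough Python illustration of the pattern only (the names and call times are made up and are not the actual LabVIEW nodes): a Memory-style one-sample store holds the time of the previous step, dt is the difference to the current simulation time, and dt is then used to accumulate the arbitrary value.

```python
# Rough, illustrative sketch of the dt calculation done with the Memory function.
# Names and numbers are made up; in LabVIEW this is a Memory block, a
# subtraction and a running sum inside the Control & Simulation Loop.

prev_time = 0.0    # what the Memory block holds (time of the previous step)
accumulated = 0.0  # the arbitrary value that is integrated using dt

def on_solver_step(t, rate):
    """Called once per step with the current simulation time t."""
    global prev_time, accumulated
    dt = t - prev_time          # change in time since the previous step
    prev_time = t               # Memory block stores the current time
    accumulated += rate * dt    # integrate the arbitrary value with dt
    return dt

# Illustrative call times; with Adams-Moulton the dt computed this way
# comes out as zero on every step, which is the problem described below.
for t in [0.00, 0.01, 0.02, 0.03]:
    print(f"t = {t:.2f}, dt = {on_solver_step(t, rate=1.0):.2f}")
```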

 

The simulation is rather complex and, because of the accuracy required, not all of the ODE solvers can handle it. Currently I am using the Adams-Moulton method, which works fine for the simulation itself. However, with it the change in time is not detected: dt is constantly zero. The problem goes away when I switch to another ODE solver, but then the simulation results are rather messed up (even after tuning the step sizes and tolerances). So I am quite confident that Adams-Moulton is one of the best-suited ODE solvers for the problem at hand.

 

Is there another way to store the previous time and use it to calculate the time difference, other than using the Memory function? Has anyone experienced such problems before?

 

I have done a lot of debugging with probes, and I am quite sure the problem lies in the combination of the ODE solver and the Memory function. See the picture below, which shows in basic terms how the change in time is being calculated.

 

I am rather new to LabVIEW, so if there is something else I may have missed I will be glad to hear it.

[Image: Basic usage of the Memory function to calculate the change in time]

 

PS: I have tuned the minimum step size and the relative and absolute tolerances for Adams-Moulton so that it simulates the behaviour of the system correctly.

