|
Post by hooovahh on Nov 3, 2015 17:34:44 GMT
So I was trying to find a way to combine several sequences into a single one, and I was successful. But I wasn't sure when it was appropriate to use the Dispose Sequence.vi, so I ran some tests looking at LabVIEW.exe memory usage. I figured I would put the dispose in every place I could think of and confirm memory usage was stable, then I would start removing them and see which disposes were actually required for a stable system.
What I found is that even if you have a while loop where you perform a New Sequence followed by a Dispose Sequence, LabVIEW memory usage will continue to increase until LabVIEW throws out-of-memory errors. Is there something I'm doing wrong? Is the PTP sequencer stable in terms of memory usage? I assumed it was, since it was recommended for RT applications. What can be done to ensure that memory usage doesn't continue to increase after every new and dispose operation? Thanks.
|
|
|
Post by Thoric on Nov 4, 2015 9:53:40 GMT
Hi hooovahh,
How quickly is the RAM increasing and at what Sequence creation rate have you tested this? There is a service that runs in the background to manage Sequence references, plus a library of unique identifiers is stored to help differentiate between elements. If you rapidly create and dispose of Sequences this library of unique IDs will grow pretty quickly, although I wouldn't expect it to throw memory errors.
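To illustrate why a library of unique identifiers can grow even when every create is paired with a dispose, here is a small Python sketch. It is a hypothetical model of the behavior Thoric describes, not the actual PTP Sequencer implementation: the registry records an ID on every create, but dispose only flags the entry as dead rather than removing it.

```python
import itertools

class SequenceRegistry:
    """Hypothetical model of the ID manager: every create appends a
    unique ID to the library, but dispose only marks the sequence as
    dead; the entry itself is never removed."""

    def __init__(self):
        self._counter = itertools.count()
        self._ids = {}  # unique ID -> alive flag

    def new_sequence(self):
        seq_id = next(self._counter)
        self._ids[seq_id] = True   # ID recorded permanently
        return seq_id

    def dispose_sequence(self, seq_id):
        self._ids[seq_id] = False  # flagged dead, entry kept

registry = SequenceRegistry()
for _ in range(10_000):
    sid = registry.new_sequence()
    registry.dispose_sequence(sid)

# Every create/dispose pair still leaves an entry behind:
print(len(registry._ids))  # 10000
```

Under this model the library grows linearly with the number of sequences ever created, regardless of how many are disposed, which matches the steady memory climb described above.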
Another possibility: LabVIEW lacks the ability to unload classes when the number of instances reaches zero, a limitation imposed to avoid the need for mutexing slowdowns in data instantiation functions. PTP Sequencer makes strong use of a class-based architecture, which raises the reasonable question of whether the creation and disposal of new sequences is somehow related.
Unfortunately I'm not in a position to investigate this issue myself, but I'll see what can be done.
|
|
|
Post by hooovahh on Nov 4, 2015 14:01:19 GMT
Thanks Thoric. Due to some oddness in IT at my work this morning I don't have Task Manager, but when I did I saw something like 500KB of increased memory on LabVIEW.exe every second or so. Here is a snippet which demonstrates what I saw. Again, the actual application won't be doing this type of thing, but I assumed that if I performed a dispose for every New Sequence, memory would be stable. The actual implementation may be opening several sequences, getting the sequence elements, then inserting them into a single sequence to concatenate several sequences into one. So while it won't be performing a New Sequence thousands of times a second like my demo, it will be opening several of them, and I just wanted to make sure it would be stable. You mentioned a unique ID is generated for every new sequence; if these unique IDs were destroyed or closed out in the dispose process, would that prevent the continual increase of memory?
|
|
|
Post by Thoric on Nov 4, 2015 14:34:43 GMT
Hi hooovahh,
An improvement to the ID manager would be to remove unused IDs from the library on dispose, but unfortunately in V1.0 it doesn't do that. I guess in that respect, if you left a system operating for many many months that manipulated/edited/created sequences regularly, you could eventually have yourself a memory issue.
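The improvement Thoric describes can be sketched in a few lines of Python. This is a hypothetical model, not the PTP Sequencer source: if dispose actually removes the ID from the library, paired create/dispose calls keep its size bounded no matter how long the system runs.

```python
class SequenceRegistry:
    """Hypothetical fixed ID manager: dispose removes the entry,
    so paired create/dispose keeps the library size bounded."""

    def __init__(self):
        self._next_id = 0
        self._ids = set()

    def new_sequence(self):
        self._next_id += 1
        self._ids.add(self._next_id)
        return self._next_id

    def dispose_sequence(self, seq_id):
        self._ids.discard(seq_id)  # entry actually freed

registry = SequenceRegistry()
for _ in range(10_000):
    registry.dispose_sequence(registry.new_sequence())

print(len(registry._ids))  # 0
```

With this change, only the sequences that are currently alive occupy space in the library, so long-running systems that regularly create and dispose sequences would no longer accumulate memory in the ID manager.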
One option might be to componentise the code that handles the sequencing and dynamically load it. If you detect a memory issue you could unload that code by causing it to drop to idle and closing the reference; that should remove it and its memory allocations. You could then reload it afresh and carry on from where you were. It's a workaround, and not one I can guarantee will work, but possibly a way through this for you?
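The unload/reload workaround is language-neutral, so here is a rough Python analogue. Everything in it is hypothetical (the leaking component, the memory check, the threshold): the point is simply that replacing the component instance discards its accumulated allocations, the way closing a dynamically loaded VI's reference would in LabVIEW.

```python
class SequencerComponent:
    """Stand-in for the dynamically loaded sequencing code; for the
    demo it deliberately leaks by accumulating state as it works."""

    def __init__(self):
        self.leaked = []

    def do_work(self):
        self.leaked.append(object())

def memory_high(comp, limit=100):
    # Hypothetical memory check; a real one might watch process RAM.
    return len(comp.leaked) >= limit

comp = SequencerComponent()
for _ in range(1_000):
    comp.do_work()
    if memory_high(comp):
        # "Unload" the component and reload it afresh; the old
        # instance and its allocations become garbage.
        comp = SequencerComponent()

print(len(comp.leaked))  # 0 here, since 1000 is a multiple of the limit
```

The trade-off, as noted above, is that any state the component holds is lost on reload, so you'd need to carry forward whatever is required to resume where you left off.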
I'd have liked an opportunity to look into the source to diagnose the true origins of the problem, but as I said before, I'm not in a position to investigate I'm afraid.
|
|