Memory Leak?

Aug 25, 2010 at 3:21 AM

Hi - I am new to ExcelDNA, but am finding that it can be incredibly useful after only a few short days of experimenting.  I have discovered that when I return a 2D array (type double[,] in F#) to Excel and show the result as an array formula, each successive recalculation of that array - done by going to any cell of the array, hitting F2, then Ctrl + Shift + Enter - results in what looks to me like a memory leak.

For example, write a function bigArray() that creates a 10,000 by 100 array populated with the number 1.0:

let bigArray() = Array2D.create 10000 100 1.0
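
In full, the function sits in the Excel-Dna project roughly like this (the module name and the ExcelFunction description are just how I've set it up - Excel-Dna also picks up plain public functions without the attribute):

module BigArrayFunctions

open ExcelDna.Integration

[<ExcelFunction(Description = "Returns a 10,000 x 100 array filled with 1.0")>]
let bigArray () : float[,] =
    // Array2D.create rows cols value builds and fills the array in one call
    Array2D.create 10000 100 1.0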


Load Excel and the .xll file, and call this function - it works as it should.  Now select 10,000 rows and a number of columns, enter =bigArray(), and hit Ctrl + Shift + Enter - everything is populated with the number 1, as it should be.  Now open Task Manager and look at Excel's memory usage.  Watch it grow with each successive Ctrl + Shift + Enter recalculation.

I understand that this function is useless as-is, but I would like to return large datasets to spreadsheets.  Am I implementing something wrong here - is there a way to make ExcelDNA release this memory before it loads the next recalculation of the array, or is it lost until Excel is reset?

Coordinator
Aug 25, 2010 at 7:41 AM

Hi jm,

This is certainly interesting, but I don't think you are seeing an Excel-Dna memory leak.

I can recreate the behaviour you see, but note that the memory usage does not grow if I recalculate (even a full recalc with Ctrl+Alt+F9). You need to re-enter the formula to get the growth in memory usage. And the memory is released when you close the workbook. This all indicates that Excel is holding on to the extra memory, not the marshaling layer in Excel-Dna, which is one place a memory leak could occur.

Excel exhibits the same behaviour with a VBA function - try this in a module in VBA and repeat your test (noting how much slower VBA is):

Function TestBigArray() As Double()
    ' 10,000 x 100 array, matching the loop bounds below
    Dim Result(0 To 9999, 0 To 99) As Double
    Dim i As Long, j As Long
    
    For i = 0 To 9999
        For j = 0 To 99
            Result(i, j) = i + j
        Next j
    Next i
    TestBigArray = Result
End Function
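
(For comparison, the Excel-Dna side of the same fill is a one-liner in F# - an untested sketch, and the function name is arbitrary:)

let testBigArray () : float[,] =
    // Array2D.init calls the lambda once for each (row, column) pair
    Array2D.init 10000 100 (fun i j -> float (i + j))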

Perhaps we are seeing the undo stack? Note that every time you edit the formula, the change gets added to the undo list. Excel 2007 has a 100-level undo stack, so you might like to re-enter the formula 100 times and see if the memory usage then stabilizes. I don't know what Excel does when the memory gets near exhaustion, but I wouldn't worry about it.


I'd love to see what you come up with using F# and Excel-Dna . . .  :-)


Cheers,

Govert

Coordinator
Aug 25, 2010 at 7:56 AM
Edited Aug 25, 2010 at 11:04 AM

Ah!

Yes, it is definitely related to the undo stack. You can actually set the undo stack depth:

http://unofficialexcelstuff.blogspot.com/2007/08/undo-excel.html

If you add a DWORD value to "HKEY_CURRENT_USER\Software\Microsoft\Office\12.0\Excel\Options" called "UndoHistory" and set the value to 5, and restart Excel, you will find the memory usage stabilizes for both the VBA function and the Excel-Dna function after six or seven re-enters.
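
If you'd rather not edit the registry by hand, something like this F# snippet does the same thing (the 12.0 key is Excel 2007 - adjust it for your version, and restart Excel afterwards):

open Microsoft.Win32

// Creates or overwrites the UndoHistory DWORD value under the Excel 2007 options key
Registry.SetValue(
    @"HKEY_CURRENT_USER\Software\Microsoft\Office\12.0\Excel\Options",
    "UndoHistory", 5, RegistryValueKind.DWord)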

Running a macro that modifies the worksheet that you're busy with will also clear the undo stack and reclaim the memory.
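
For example, an Excel-Dna macro along these lines should do it (an untested sketch - writing A1's own value back to it is just an easy way to "modify" the sheet):

module UndoHelpers

open ExcelDna.Integration

[<ExcelCommand(MenuText = "Clear Undo Stack")>]
let clearUndoStack () =
    // Any sheet change made by an XLL macro clears Excel's undo list,
    // which releases the memory the undo entries were holding.
    let a1 = ExcelReference(0, 0)   // row 0, column 0 = A1 on the active sheet
    a1.SetValue(a1.GetValue()) |> ignore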

Cheers,

Govert

Aug 25, 2010 at 11:33 AM

Govert - thank you so much for your help and quick reply - that clearly solves it, and with no fault to ExcelDNA!  I look forward to using this more and more, and the best part is that the .NET libraries are portable, rather than everything having to be written in VBA.

One note on VBA: if you do have to program in VBA, the arrays are column-major, so you can get a big speed-up by going down columns rather than across rows (i.e. varying the row index in the inner loop) when populating an array.

Thanks again,

jm