c# - How to get unused memory back from the large object heap (LOH) from multiple managed apps?


In talking with a colleague about a particular group of apps that use up about 1.5 GB of memory at startup, he pointed me to the following explanation of .NET memory behaviour.

The part that has me puzzled is this:

"For example, if you allocate 1 MB of memory in a single block, the large object heap expands to 1 MB in size. When you free this object, the large object heap does not decommit the virtual memory, so the heap stays at 1 MB in size. If you later allocate another 500 KB block, the new block is allocated within that 1 MB of memory belonging to the large object heap. During the process lifetime, the large object heap always grows to hold all the large allocations currently referenced, but never shrinks when objects are released, even when a garbage collection occurs. Figure 2.4 on the next page shows an example of a large object heap."
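To watch this for myself, here is a minimal sketch of my own (not from the text above; exact numbers, and whether the freed 1 MB stays committed, depend on the runtime version and GC mode) that allocates 1 MB, drops it, forces a collection, and then allocates 500 KB while printing the process memory:

    using System;
    using System.Diagnostics;

    class LohGrowthDemo
    {
        static void PrintMemory(string label)
        {
            // Print the process private bytes alongside the GC's view of live bytes.
            var p = Process.GetCurrentProcess();
            p.Refresh();
            Console.WriteLine("{0,-28} private: {1:N0} bytes, GC heap: {2:N0} bytes",
                label, p.PrivateMemorySize64, GC.GetTotalMemory(false));
        }

        static void Main()
        {
            PrintMemory("start");

            byte[] oneMeg = new byte[1024 * 1024];   // > 85,000 bytes, so it lands on the LOH
            PrintMemory("after 1 MB allocation");

            oneMeg = null;                           // drop the only reference
            GC.Collect();                            // full, blocking collection
            GC.WaitForPendingFinalizers();
            PrintMemory("after release + GC");       // LOH space is typically kept reserved

            byte[] halfMeg = new byte[512 * 1024];   // may be placed in the freed LOH space
            PrintMemory("after 500 KB allocation");

            GC.KeepAlive(halfMeg);
        }
    }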

Now say we have a hypothetical app that creates a flood of large objects (> 85 KB), so its large object heap grows to, say, 200 MB. And say we have 10 such instances running, so roughly 2000 MB is allocated in total. Is this memory really never given back to the OS until the processes shut down? (That is what I have understood.)

Are there any gaps in my understanding? How do we get the unused memory in the various large object heaps back, so that we don't create a perfect storm of OutOfMemoryExceptions?
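One concrete option I have since come across, as a hedged aside: on .NET Framework 4.5.1 and later (and on .NET Core), GCSettings.LargeObjectHeapCompactionMode lets a process request a one-off LOH compaction on the next full collection. A minimal sketch, assuming that framework version is available to you:

    using System;
    using System.Runtime;

    class LohCompactionSketch
    {
        static void Main()
        {
            // Allocate and drop a batch of large (> 85,000-byte) use-and-throw buffers.
            for (int i = 0; i < 200; i++)
            {
                byte[] buffer = new byte[1024 * 1024];
                GC.KeepAlive(buffer);
            }

            // Ask that the next blocking gen-2 collection also compact the LOH.
            // The setting resets to Default automatically once that collection runs.
            GCSettings.LargeObjectHeapCompactionMode = GCLargeObjectHeapCompactionMode.CompactOnce;
            GC.Collect();

            Console.WriteLine("LOH compaction requested and full GC performed.");
        }
    }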

Update: Following Mark's response, I want to clarify that the LOH objects are not referenced - the large objects are use-and-throw - yet the heap does not shrink even though it is relatively empty after the initial growth.

Update #2: I'm including a code snippet below (exaggerated, but I think it gets the point across). I see an OutOfMemoryException when virtual memory reaches the 1.5 GB mark on my machine (1.7 GB on another). Given the quote 'process memory can be visualized as a massive file on disk', this result is unexpected. The machines in this case had GBs of free space on their hard drives. Does the PageFile.sys OS file (or related settings) impose any restrictions?

    using System;
    using System.Collections.Generic;
    using System.Threading;

    class Program
    {
        static float _megaBytes;
        static readonly int BYTES_IN_MB = 1024 * 1024;

        static void BigBite()
        {
            try
            {
                var list = new List<byte[]>();
                int i = 1;

                for (int x = 0; x < 1500; x++)
                {
                    var memory = new byte[BYTES_IN_MB + i];
                    _megaBytes += memory.Length / BYTES_IN_MB;
                    list.Add(memory);
                    Console.WriteLine("Allocation #{0}: {1} MB now", i++, _megaBytes);
                }
            }
            catch (Exception e)
            {
                Console.WriteLine("Boom! {0}", e); // I put a breakpoint here to check the console
                throw;
            }
        }

        static void Main(string[] args)
        {
            BigBite();
            Console.WriteLine("Check VM now!");
            Console.ReadLine();
            _megaBytes = 0;

            ThreadPool.QueueUserWorkItem(s => BigBite());
            ThreadPool.QueueUserWorkItem(s => BigBite());
            Console.ReadLine(); // will blow up before it reaches here
        }
    }
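To correlate the exception with the process address-space ceiling rather than the page file, a small helper of my own (the name VmProbe and its placement are my assumptions, not part of the snippet above) could be called between allocations to print the process virtual and private memory:

    using System;
    using System.Diagnostics;

    static class VmProbe
    {
        // Prints how much virtual address space the current process has reserved.
        // For a 32-bit process this climbs toward the ~2 GB ceiling (more with the
        // large-address-aware switch) well before physical RAM or the page file runs out.
        public static void Report(string label)
        {
            var p = Process.GetCurrentProcess();
            p.Refresh();
            Console.WriteLine("{0}: virtual = {1:N0} MB, private = {2:N0} MB",
                label,
                p.VirtualMemorySize64 / (1024 * 1024),
                p.PrivateMemorySize64 / (1024 * 1024));
        }
    }

For example, calling VmProbe.Report inside the allocation loop shows the virtual size approaching the limit right before the exception is thrown.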

There is a clarification I would like to make first. Assuming you are running the app as a 32-bit process, the VA (virtual address) space available to your process is only 2 GB (3 GB if you enable the large-address-aware switch), no matter how large your page file is. It is only when you run as 64-bit that this stops being the issue, because there you have a huge address space.
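A quick way to confirm which case applies to your process (a trivial sketch, assuming .NET 4.0 or later for the Environment properties):

    using System;

    class BitnessCheck
    {
        static void Main()
        {
            // IntPtr.Size is 4 in a 32-bit process and 8 in a 64-bit process.
            Console.WriteLine("64-bit process: {0}", Environment.Is64BitProcess);
            Console.WriteLine("64-bit OS:      {0}", Environment.Is64BitOperatingSystem);
            Console.WriteLine("Pointer size:   {0} bytes", IntPtr.Size);
        }
    }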

  • Objects of size >= 85,000 bytes are allocated on the LOH. Note that it is 85,000 bytes, not 85 KB, and this is an implementation detail that could change. Now, back to your question: the GC will de-commit the LOH segments that are not in use in 2 situations: 1) when memory pressure on the machine is high (~95-98%), and 2) when it fails to satisfy a new allocation request, in which case it will decommit the unused pages in the LOH.

So you will get the memory back in one of those two situations. The fact that you are hitting an OOM before you reach the 2 GB limit could mean that you have VA fragmentation. VA fragmentation happens when you do not have contiguous VA address space to satisfy a new allocation; for example, you ask for an 8 KB segment and there are no 2 consecutive free pages in your VA (assuming a 4 KB page size).
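To relate this back to the 85,000-byte threshold above, here is a small illustration of my own (behaviour as reported by the Microsoft runtimes, where the LOH is treated as part of generation 2):

    using System;

    class LohThresholdDemo
    {
        static void Main()
        {
            // Comfortably under the threshold even with array overhead: stays on the small object heap.
            byte[] small = new byte[84000];

            // At or above 85,000 bytes: allocated on the large object heap.
            byte[] large = new byte[85000];

            // The LOH is collected with (and reported as) generation 2,
            // whereas a fresh small object starts in generation 0.
            Console.WriteLine("84,000-byte array generation: {0}", GC.GetGeneration(small));
            Console.WriteLine("85,000-byte array generation: {0}", GC.GetGeneration(large));
        }
    }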

You can validate this in WinDbg (Debugging Tools for Windows) using its VA map debugger extensions.

Hope this helps. Thanks.

