An article teaching you how to troubleshoot .NET memory leaks

  • 2021-12-04 09:56:33
  • OfStack

Contents: Preface, Check managed memory usage, Generate a dump file, Analyze the core dump, Summarize

Preface

A memory leak usually means that objects which have finished their life cycle are still accidentally referenced by other objects, so subsequent GC passes cannot reclaim them. Over time this degrades program performance and can eventually cause an OutOfMemoryException.
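As a minimal sketch of what such an unintended reference can look like (a hypothetical example, not code from the sample used below), imagine a static cache that objects are added to but never removed from; everything placed in it stays reachable from a GC root and can never be collected:

using System.Collections.Generic;

// Hypothetical example of a leak: a static list acts as a GC root,
// so every buffer added here survives all future collections.
public class LeakyCache
{
    private static readonly List<byte[]> _buffers = new List<byte[]>();

    public void HandleRequest()
    {
        // The 1 MB buffer should die with the request, but the static
        // list keeps referencing it, so the GC heap only ever grows.
        _buffers.Add(new byte[1024 * 1024]);
    }
}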

In this article, we use the .NET diagnostic CLI tools to analyze a memory leak in a .NET Core program. If the program runs on Windows, we can use Visual Studio directly for diagnosis.

Check managed memory usage

Before you start analyzing a memory leak, you first need evidence that one actually exists. This can be verified by watching the application's runtime counters with dotnet-counters.
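If dotnet-counters is not installed yet, it can be added as a .NET global tool:

dotnet tool install --global dotnet-counters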

Run the program


dotnet run

Find the pid of the program.


dotnet-counters ps

4807 DiagnosticScena /home/user/git/samples/core/diagnostics/DiagnosticScenarios/bin/Debug/netcoreapp3.0/DiagnosticScenarios

Start monitoring with the monitor command, where --refresh-interval specifies the refresh interval in seconds.


dotnet-counters monitor --refresh-interval 1 -p 4807
Press p to pause, r to resume, q to quit.
    Status: Running

[System.Runtime]
    # of Assemblies Loaded                           118
    % Time in GC (since last GC)                       0
    Allocation Rate (Bytes / sec)                 37,896
    CPU Usage (%)                                      0
    Exceptions / sec                                   0
    GC Heap Size (MB)                                  4
    Gen 0 GC / sec                                     0
    Gen 0 Size (B)                                     0
    Gen 1 GC / sec                                     0
    Gen 1 Size (B)                                     0
    Gen 2 GC / sec                                     0
    Gen 2 Size (B)                                     0
    LOH Size (B)                                       0
    Monitor Lock Contention Count / sec                0
    Number of Active Timers                            1
    ThreadPool Completed Work Items / sec             10
    ThreadPool Queue Length                            0
    ThreadPool Threads Count                           1
    Working Set (MB)                                  83

Focus on the GC Heap Size (MB) counter. You can see that the GC heap is about 4 MB right after the program starts. Now open the link https://localhost:5001/api/diagscenario/memleak/20000 and look at the GC heap size again; it has grown to about 30 MB.

GC Heap Size (MB) 30

By comparing memory usage before and after the request, we can say with reasonable confidence that memory is leaking.

Generate a dump file

To analyze a memory leak, you first need access to the GC heap so that you can inspect the objects on it and the references between them, and from there reason about why memory has not been released. To generate a dump file for a .NET Core program, you can use the dotnet-dump tool.
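Like dotnet-counters, dotnet-dump can be installed as a global tool if it is not already available:

dotnet tool install --global dotnet-dump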


dotnet-dump collect -p 4807
Writing minidump with heap to ./core_20190430_185145
Complete

Analyze the core dump

You can then analyze the generated dump file using dotnet-dump analyze.


dotnet-dump analyze core_20190430_185145

Here core_20190430_185145 is the name of the dump you want to analyze. It is worth mentioning that if you hit a "libdl.so cannot be found" error, it is recommended to install the libc6-dev package.
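On a Debian or Ubuntu based system, for example, that package can be installed with apt:

sudo apt-get install libc6-dev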

First, use the SOS dumpheap -stat command to get a statistical summary of all objects on the managed heap.


> dumpheap -stat

Statistics:
              MT    Count    TotalSize Class Name
...
00007f6c1eeefba8      576        59904 System.Reflection.RuntimeMethodInfo
00007f6c1dc021c8     1749        95696 System.SByte[]
00000000008c9db0     3847       116080      Free
00007f6c1e784a18      175       128640 System.Char[]
00007f6c1dbf5510      217       133504 System.Object[]
00007f6c1dc014c0      467       416464 System.Byte[]
00007f6c21625038        6      4063376 testwebapi.Controllers.Customer[]
00007f6c20a67498   200000      4800000 testwebapi.Controllers.Customer
00007f6c1dc00f90   206770     19494060 System.String
Total 428516 objects

From the output, most objects are String and Customer instances. Next, you can list all instances belonging to a given method table by passing its address to the -mt parameter.


> dumpheap -mt 00007faddaa50f90

         Address               MT     Size
...
00007f6ad09421f8 00007faddaa50f90       94
...
00007f6ad0965b20 00007f6c1dc00f90       80
00007f6ad0965c10 00007f6c1dc00f90       80
00007f6ad0965d00 00007f6c1dc00f90       80
00007f6ad0965df0 00007f6c1dc00f90       80
00007f6ad0965ee0 00007f6c1dc00f90       80

Statistics:
              MT    Count    TotalSize Class Name
00007f6c1dc00f90   206770     19494060 System.String
Total 206770 objects

Next, you can use the gcroot command to see who is holding a reference to a particular string.


> gcroot -all 00007f6ad09421f8

Thread 3f68:
    00007F6795BB58A0 00007F6C1D7D0745 System.Diagnostics.Tracing.CounterGroup.PollForValues() [/_/src/System.Private.CoreLib/shared/System/Diagnostics/Tracing/CounterGroup.cs @ 260]
        rbx:  (interior)
            ->  00007F6BDFFFF038 System.Object[]
            ->  00007F69D0033570 testwebapi.Controllers.Processor
            ->  00007F69D0033588 testwebapi.Controllers.CustomerCache
            ->  00007F69D00335A0 System.Collections.Generic.List`1[[testwebapi.Controllers.Customer, DiagnosticScenarios]]
            ->  00007F6C000148A0 testwebapi.Controllers.Customer[]
            ->  00007F6AD0942258 testwebapi.Controllers.Customer
            ->  00007F6AD09421F8 System.String

HandleTable:
    00007F6C98BB15F8 (pinned handle)
    -> 00007F6BDFFFF038 System.Object[]
    -> 00007F69D0033570 testwebapi.Controllers.Processor
    -> 00007F69D0033588 testwebapi.Controllers.CustomerCache
    -> 00007F69D00335A0 System.Collections.Generic.List`1[[testwebapi.Controllers.Customer, DiagnosticScenarios]]
    -> 00007F6C000148A0 testwebapi.Controllers.Customer[]
    -> 00007F6AD0942258 testwebapi.Controllers.Customer
    -> 00007F6AD09421F8 System.String

Found 2 roots.

From the string's reference chain, we can see that it is ultimately held alive by CustomerCache, so the next step is to look for the problem in that code.
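Based on the reference chain above, the leaking code probably looks roughly like the following sketch. This is a hypothetical reconstruction (member names other than Processor, CustomerCache and Customer are guesses), but it matches the chain Processor -> CustomerCache -> List<Customer> -> Customer[] -> Customer -> String:

using System.Collections.Generic;

public class Customer
{
    // The leaked System.String instances seen in dumpheap.
    public string Id { get; set; }
}

public class CustomerCache
{
    // Shows up in gcroot as List`1[[Customer]] -> Customer[].
    private readonly List<Customer> _cache = new List<Customer>();

    // Customers are added but never evicted, so they stay reachable.
    public void AddCustomer(Customer customer) => _cache.Add(customer);
}

public class Processor
{
    // A long-lived Processor keeps the cache (and its contents) rooted.
    private readonly CustomerCache _cache = new CustomerCache();

    public void ProcessTransaction(Customer customer) => _cache.AddCustomer(customer);
}

In code shaped like this, the usual fix is to bound the cache or evict entries once they are no longer needed, so the list stops growing without limit.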

Original link: https://docs.microsoft.com/en-us/dotnet/core/diagnostics/debug-memory-leak

Summarize

In this article, we confirmed a suspected memory leak by watching GC Heap Size with dotnet-counters, captured a heap snapshot with dotnet-dump collect, and then used dumpheap and gcroot inside dotnet-dump analyze to find out which objects were accumulating and who was keeping them alive.

