
Aspnet Core 1.1.7 Memory Leak #6167


Closed
guneysus opened this issue Dec 28, 2018 · 6 comments

@guneysus

guneysus commented Dec 28, 2018

Describe the bug

We have a memory leak that occurs at least once a day, sometimes within half an hour, sometimes after a couple of hours. We do not know the root cause, so we need help narrowing down the issue.

To Reproduce

Steps to reproduce the behavior:

  1. We are using ASP.NET Core with 'Microsoft.AspNetCore.Mvc 1.1.7'.
  2. The app is deployed on Windows Server 2012 R2 with IIS (version 6.2 Build 9200).

Expected behavior

The memory usage of our API project occasionally spikes suddenly and eats all available memory in a couple of minutes. We do not know the root cause, but we have observed that when it happens, memory starts from a stable ~140 MB and grows at roughly 50-100 MB/s.

We took a memory dump, analysed it with DebugDiag, and generated a report. Since it is a long report, I will share some selected parts of it. If necessary, I can provide more.

Additional context

PS > dotnet --version
Microsoft .NET Core Shared Framework Host

  Version  : 1.1.0
  Build    : 928f77c4bc3f49d892459992fb6e1d5542cb5e86

DebugDiag Report (some selected parts)

Warning: The following threads in BA.ECommerce.Api.DMP are waiting in a WaitMultiple ( 11 13 15 18 20 22 23 24 25 28 30 ); 26.19% of threads blocked. Typically, threads waiting in WaitMultiple are monitoring threads in a process and this may be ignored; however, too many threads waiting in WaitOne/WaitMultiple may be a problem. Review the call stacks of the waiting threads to see what they are waiting on.

Warning: The following threads in BA.ECommerce.Api.DMP are waiting in a WaitOne ( 0 7 8 9 10 34 ); 14.29% of threads blocked. Typically, threads waiting in WaitMultiple are monitoring threads in a process and this may be ignored; however, too many threads waiting in WaitOne/WaitMultiple may be a problem. Review the call stacks of the waiting threads to see what they are waiting on.

Warning: The following threads in BA.ECommerce.Api.DMP are waiting for .NET garbage collection to finish. The current set of scripts was not able to determine which thread induced the GC ( 35 36 ); 4.76% of threads blocked. When a GC is running, the .NET objects are not in a valid state and the reported analysis may be inaccurate. Also, the thread that triggered the garbage collection may or may not be a problematic thread. Too many garbage collections in a process are bad for application performance and may indicate memory pressure or symptoms of fragmentation. Review the blog "ASP.NET Case Study: High CPU in GC - Large objects and high allocation rates" for more details.

Warning: Detected possible blocking or leaked critical section at 0x000000a3`c4337db0 owned by thread 37 in BA.ECommerce.Api.DMP. Impact of this lock: 2.38% of threads blocked (thread 40). The following functions are trying to enter this critical section: 0x00000000. The following vendors were identified for follow up based on root cause analysis: Please f

Possible Blocking or Leaked Critical Section Details

Critical Section 0x000000a3`c4337db0
Lock State Locked
Lock Count 1
Recursion Count 1
Entry Count 0
Contention Count 19
Spin Count 33556429
Owner Thread 37
Owner Thread System ID 3592
Entry point ntdll!RtlUserThreadStart+34
Create time 12/24/2018 12:06:09 AM
Time spent in user mode 0 Days 00:00:00.00
Time spent in kernel mode 0 Days 00:00:00.00
Function Source
ntdll!NtWaitForSingleObject+a  
ntdll!RtlpWaitOnCriticalSection+e1  
ntdll!RtlpEnterCriticalSectionContended+a4  
coreclr!CrstBase::Enter+7a  
coreclr!ThreadSuspend::LockThreadStore+b8  
coreclr!ThreadStore::AddThread+29  
coreclr!SetupUnstartedThread+50  
coreclr!ThreadpoolMgr::CreateUnimpersonatedThread+99  
coreclr!ThreadpoolMgr::MaybeAddWorkingWorker+19e  
coreclr!ThreadpoolMgr::GateThreadStart+6e0  
kernel32!BaseThreadInitThunk+22  
ntdll!RtlUserThreadStart+34
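
For reference, a minimal sketch of the SOS commands typically run against such a dump in WinDbg to see what is holding the memory (illustrative only; SOS is loaded from the directory of the loaded coreclr.dll, and the exact output depends on the runtime version):

  .loadby sos coreclr       (load the SOS debugging extension shipped next to coreclr.dll)
  !threadpool               (thread pool state; relevant given the ThreadpoolMgr frames above)
  !dumpheap -stat           (histogram of managed objects by type and total size)
  !eeheap -gc               (per-generation and large object heap sizes)
  !syncblk                  (managed locks currently held, to correlate with the blocked threads)
  !gcroot <object address>  (roots keeping a suspicious object alive)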
@Eilon
Member

Eilon commented Jan 3, 2019

@sebastienros - can you provide info on how to diagnose this?

BTW support for ASP.NET Core 1.1 is ending in the near future, so I recommend moving up to a newer version. Also, lots of great new features, performance improvements, and bug fixes!
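
As a rough sketch of the package bump from the CLI (the project file name and target version below are placeholders; a 1.x to 2.x move also requires updating the TargetFramework and the other package references):

  # hypothetical project/version; adjust to the version you are targeting
  dotnet add MyApi.csproj package Microsoft.AspNetCore.Mvc --version 2.1.0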

@sebastienros
Member

@guneysus

which eats all available memory in a couple of minutes

Does it reach the point where you get an OutOfMemoryException and the application crashes? Or is it just taking a lot of memory?

If so, would you be able to share the memory dump? I will send you a link to a OneDrive folder to put the file in. Ideally capture it when the process is already taking a lot of memory, so I can see what is using it.
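
For example, a minimal sketch of capturing such a dump with Sysinternals ProcDump once memory is already high (the PID, the 2048 MB threshold, and the output path are placeholders; note that an ASP.NET Core 1.1 app behind IIS usually runs in a dotnet.exe child process rather than in w3wp.exe):

  # write a full user-mode dump of process 1234 once its commit charge exceeds 2048 MB
  procdump -accepteula -ma -m 2048 1234 C:\dumps\memleak.dmp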

@guneysus
Author

guneysus commented Jan 4, 2019

Hi @sebastienros.

I do not remember seeing an OutOfMemoryException in production, but in the test environment, yes, there was an OutOfMemoryException.

It does not stop until it has eaten all available RAM on the server, and then page faults start to happen.

Sure. Where can I upload the memory dump file? It is about 50-60 MB zipped but ~16 GB when extracted.

@sebastienros
Member

Please send me an email at sebastien.ros at microsoft.com and I will share a OneDrive folder with you.

@scottsauber
Contributor

FWIW - I hit this too, but in my case it's caused by a SQL exception that trips EF Core into looping over all of our DbSets and doing a SELECT * on each of them, which then causes us to run out of memory. But we have a rule in IIS that recycles the app pool once we hit a certain memory limit (a sketch of such a rule is at the end of this comment). It happens about once a week.

Logged here: dotnet/efcore#13310

I need to get a minimal repro set up...
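
For reference, one way such a recycle rule can be configured is with an app pool private-memory threshold via appcmd (the app pool name and the 2 GB limit, given in KB, are placeholders; with out-of-process ASP.NET Core hosting this limit applies to the w3wp.exe worker process, not to the dotnet.exe child):

  # recycle the app pool when its private memory exceeds ~2 GB (value is in KB)
  C:\Windows\System32\inetsrv\appcmd.exe set apppool "MyApiAppPool" /recycling.periodicRestart.privateMemory:2097152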

@analogrelay
Contributor

Closing this as we haven't heard back from you, and we generally close issues that get no response. Please feel free to comment if you're able to get the information we're looking for, and we can reopen the issue to investigate further!

Also, as @Eilon mentioned, .NET Core 1.1 is now out of support. If you still see this issue in a supported version of .NET Core, let us know!

@ghost locked as resolved and limited conversation to collaborators Dec 3, 2019