
Memory leak on Node 19/20 with includeLocalVariables enabled #14162


Closed
3 tasks done
seeARMS opened this issue Nov 2, 2024 · 8 comments
Assignees
Labels
Package: node (Issues related to the Sentry Node SDK), Stale

Comments

@seeARMS

seeARMS commented Nov 2, 2024

Is there an existing issue for this?

How do you use Sentry?

Sentry Saas (sentry.io)

Which SDK are you using?

@sentry/node

SDK Version

8.36.0

Framework Version

Node 19.9.0, also tested on Node 20.18.0

Link to Sentry event

N/A

Reproduction Example/SDK Setup

import * as Sentry from "@sentry/node"

import api from "@opentelemetry/api"



function trimData(data) {
  const size = JSON.stringify(data || "").length

  // Arbitrarily cut off massive payloads. Sentry silently drops events that are too large,
  // so trim them.
  // See https://github.com/getsentry/sentry-javascript/issues/3036#issuecomment-1066491190
  // Note: .slice assumes `data` is a string (or array); a plain object would throw here.
  if (size > 200000) {
    return data?.slice(0, 200000)
  }

  return data
}

function checkIfSentryIgnore(url) {
  if (!url) return false

  // SENTRY_IGNORED_ROUTES (an array of strings and RegExps) is defined elsewhere in the app.
  for (const route of SENTRY_IGNORED_ROUTES) {
    if (typeof route === "string" && url.endsWith(route)) {
      return true
    } else if (route instanceof RegExp && route.test(url)) {
      return true
    }
  }

  return false
}

/**
 * Need to ignore long-running websockets.
 *
 * See: https://github.com/getsentry/sentry-javascript/issues/13412#issuecomment-2340434248
 */
const IGNORE_WEBSOCKETS = ["infura", "ws", "alchemy", "ankr", "zora"]

Sentry.init({
  dsn: "snip",

  // Disable entirely on non-production environments.
  enabled: true,
  debug: !!process.env.SENTRY_DEBUGGING,
  registerEsmLoaderHooks: { exclude: [/openai/] },
  includeLocalVariables: true,
  integrations: [
    Sentry.onUncaughtExceptionIntegration({
      exitEvenIfOtherHandlersAreRegistered: false,
    }),
    Sentry.expressIntegration(),
    Sentry.httpIntegration({
      ignoreOutgoingRequests(url, request) {
        console.log("SENTRY: should we ignore outgoing request?", url)
        for (const route of IGNORE_WEBSOCKETS) {
          if (url.includes(route)) {
            console.log("SENTRY: Ignoring outgoing request", url)
            return true
          }
        }

        console.log("SENTRY: Not ignoring outgoing request", url)
        return false
      },
    }),
  ],

  release: process.env.COMMIT,
  environment: process.env.FLY_APP_NAME || "development",

  beforeSendTransaction(event) {
    console.log("SENTRY: beforeSendTransaction", event)

    if (event.transaction === "redis-connect") {
      // Don't send the event to Sentry
      return null
    }

    console.log("SENTRY: Sending transaction", event)

    return event
  },
  beforeSend: (event) => {
    console.log("SENTRY: beforeSend", event)
    try {
      if (event.request?.data) {
        event.request.data = trimData(event.request.data)
      }
      if (event.message) {
        event.message = trimData(event.message)
      }

      return event
    } catch (e) {
      console.error("Error in Sentry beforeSend", e, event)
      return event
    }
  },
})

Steps to Reproduce

Run Node 19 or Node 20 with Sentry enabled in my specific setup. The memory gradually increases until an OOM occurs.

This graph shows when Node was updated from 18 to 19.9.0. No other code changes occurred.
Image

Expected Result

Memory does not gradually increase until an OOM occurs.

This graph shows when I disabled includeLocalVariables in the above Sentry.init: memory stops increasing.
Image

Actual Result

Memory increases gradually until it OOMs, if includeLocalVariables is enabled.

If Sentry is disabled entirely, or if the includeLocalVariables flag is disabled, then memory looks as expected.
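For anyone else hitting this, the workaround described above amounts to flipping the flag off. A minimal sketch (the `__DSN__` placeholder and the trimmed-down options are illustrative, not from this report):

```javascript
// Minimal sketch of the workaround: identical init, with local variable
// capture turned off. "__DSN__" is a placeholder, not a real DSN.
import * as Sentry from "@sentry/node"

Sentry.init({
  dsn: "__DSN__",
  // Turning this off stopped the gradual memory growth in the setups above.
  includeLocalVariables: false,
})
```

If losing local variables entirely is too costly, the SDK's `localVariablesIntegration` also documents rate-limiting options such as `maxExceptionsPerSecond` (option name assumed from the Sentry v8 SDK docs), which may be a middle ground worth testing.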

I originally suspected this was the issue, since we also use websockets and long-running requests. But after ignoring those requests and confirming via the console.logs that no spans were created for any long-running requests (with the leak still persisting), I tried disabling local variables, and that fixed it.

These are (truncated) logs with trace GC enabled. They don't show much beyond memory growth. This shows the behaviour with local variables enabled, and this is with it disabled.

@getsantry getsantry bot moved this to Waiting for: Product Owner in GitHub Issues Nov 2, 2024
@github-actions github-actions bot added the "Package: node" (Issues related to the Sentry Node SDK) label Nov 2, 2024
@timfish timfish self-assigned this Nov 4, 2024
@timfish
Collaborator

timfish commented Nov 4, 2024

Thanks for reporting this. I tested the recent changes around local variable capture for memory leaks, since this has been an issue before. I did see an increase in memory usage with includeLocalVariables enabled, but after running for a couple of hours I concluded that it wasn't leaking. I'll take another look!

@timfish
Collaborator

timfish commented Nov 5, 2024

I've been unable to reproduce a memory leak with express and includeLocalVariables enabled.

If you have a minimal reproduction that I could debug that would help.

@vitor-ramalho-deel

vitor-ramalho-deel commented Nov 27, 2024

Faced the same issue on my project using Node 20.17.0, NestJS 10.4.3, and @sentry/nestjs 8.35.0.

Image

@getsantry getsantry bot moved this to Waiting for: Product Owner in GitHub Issues Nov 27, 2024
@AbhiPrasad
Member

@vitor-ramalho-deel are you able to provide a heap snapshot or reproduction to help us dig in further? It's hard to tell what might be causing this (might be another dependency playing bad with Sentry for example).

@getsantry

getsantry bot commented Jan 1, 2025

This issue has gone three weeks without activity. In another week, I will close it.

But! If you comment or otherwise update it, I will reset the clock, and if you remove the label Waiting for: Community, I will leave it alone ... forever!


"A weed is but an unloved flower." ― Ella Wheeler Wilcox 🥀

@getsantry getsantry bot added the Stale label Jan 1, 2025
@getsantry getsantry bot closed this as not planned Jan 9, 2025
@tellisnz

tellisnz commented Apr 12, 2025

Can confirm this is still present on at least SDK 9.1.0 with Node 22.

Image

I got a heap dump and nothing looked out of the ordinary, so perhaps it's a native memory issue?
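One way to support (or rule out) the native-memory theory without another heap dump is to watch `rss` diverge from the V8 heap numbers over time. A generic sketch using only Node built-ins (the interval length is arbitrary):

```javascript
// Periodically log process memory. If rss keeps climbing while heapUsed
// stays flat, the growth is likely in native/external memory rather than
// in JS objects (which would match a clean-looking heap dump).
function logMemory() {
  const { rss, heapTotal, heapUsed, external } = process.memoryUsage()
  const mb = (n) => (n / 1024 / 1024).toFixed(1)
  console.log(
    `rss=${mb(rss)}MB heapTotal=${mb(heapTotal)}MB ` +
      `heapUsed=${mb(heapUsed)}MB external=${mb(external)}MB`
  )
}

// unref() so the timer doesn't keep the process alive on its own.
setInterval(logMemory, 60_000).unref()
```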

ChatGPT thoughts:
Image

Apologies, but I am unable to provide a reproduction.

@AbhiPrasad
Member

@tellisnz could you open a new GH issue and fill out the issue template? We'll take a look!

@tellisnz

@AbhiPrasad - created #16041, but to be honest we're not going to chase it; we simply disabled it in prod and can work around it in staging/dev.

I can't create a good reproduction case with our setup, and company policy precludes providing dependency set, sorry.

Development

No branches or pull requests

6 participants