Sentry HTTP 429 errors causing AWS Lambda InvokeError (status 502 response) result #4582

Closed
tmilar opened this issue Feb 15, 2022 · 15 comments
Labels
Package: serverless Issues related to the Sentry Serverless SDK

Comments

@tmilar
Contributor

tmilar commented Feb 15, 2022

Package + Version

  • @sentry/browser
  • @sentry/node 6.17.6
  • raven-js
  • raven-node (raven for node)
  • other:
    • @sentry/serverless 6.17.6
    • @sentry/tracing 6.17.6

Version:

6.17.6

Description

Due to heavy Sentry API usage, we sometimes get HTTP 429 error responses from the Sentry SDK in our Node.js AWS Lambda functions.

The problem is that this SDK error sometimes surfaces as a seemingly unhandleable InvokeError (and therefore an AWS Lambda HTTP 502 response), which is a real issue for us because our actual business logic is working just fine.

2022-02-15T18:50:47.224Z	67dc62db-a670-4e84-ad4a-679145ecd1e1	ERROR	Invoke Error 	
{
    "errorType": "SentryError",
    "errorMessage": "HTTP Error (429)",
    "name": "SentryError",
    "stack": [
        "SentryError: HTTP Error (429)",
        "    at new SentryError (/var/task/node_modules/@sentry/utils/dist/error.js:9:28)",
        "    at ClientRequest.<anonymous> (/var/task/node_modules/@sentry/node/dist/transports/base/index.js:212:44)",
        "    at Object.onceWrapper (events.js:520:26)",
        "    at ClientRequest.emit (events.js:400:28)",
        "    at ClientRequest.emit (domain.js:475:12)",
        "    at HTTPParser.parserOnIncomingClient (_http_client.js:647:27)",
        "    at HTTPParser.parserOnHeadersComplete (_http_common.js:127:17)",
        "    at TLSSocket.socketOnData (_http_client.js:515:22)",
        "    at TLSSocket.emit (events.js:400:28)",
        "    at TLSSocket.emit (domain.js:475:12)"
    ]
}

Our expectation is that internal Sentry errors should not cause an outage in our own APIs.

We couldn't find any way to handle/catch this error, because it appears to fail outside of our own Sentry calls such as Sentry.init(), Sentry.wrapHandler(), etc.

We are fairly certain the 429 responses are caused by exceeding our Transactions quota.

When we decreased the tracesSampleRate config in Sentry.init() from 0.2 to 0, the error stopped.

We saw the issue occur in seemingly random cases, at a rate roughly matching the value we had for tracesSampleRate. To confirm, we also set it to 1 and observed a 100% error rate.

So the workaround we are currently using is to disable the feature entirely by setting it to 0, after which no more unhandled Sentry 429 HTTP errors were thrown. Still, having to disable an entire feature to keep an external dependency from breaking our app doesn't seem like the correct solution.

For completeness, this is our current config:

    Sentry.init({
      debug: SLS_STAGE === 'dev',
      dsn: sentryKey,
      tracesSampleRate: 0,

      environment: SLS_STAGE,
      release: `${SLS_SERVICE_NAME}:${SLS_APIG_DEPLOYMENT_ID}`,
    })
tmilar changed the title from "Sentry HTTP 429 errors causing AWS Lambda InvokeError result" to "Sentry HTTP 429 errors causing AWS Lambda InvokeError (status 502 response) result" on Feb 16, 2022
@AbhiPrasad
Member

This is the logic that we are hitting.

let rejectionMessage = `HTTP Error (${statusCode})`;
if (res.headers && res.headers['x-sentry-error']) {
  rejectionMessage += `: ${res.headers['x-sentry-error']}`;
}
reject(new SentryError(rejectionMessage));

Sentry doesn't capture internal SentryErrors, though. Is AWS Lambda monitoring errors that bubble up to certain handlers and setting a response based on that?

Perhaps there is a way we can edit https://github.com/getsentry/sentry-javascript/blob/master/packages/serverless/src/awslambda.ts to address that. Maybe we could monkey patch whatever AWS Lambda is listening for to filter out SentryError.

We could also try not throwing errors for 429s, since they are pretty common.
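
Roughly what that could look like in the transport (illustration only, not the SDK's actual code; the resolve value is simplified):

// Illustrative sketch: treat 429 as non-fatal in the transport instead of
// rejecting the send promise with a SentryError.
if (statusCode === 429) {
  console.warn('Sentry responded with 429 (rate limited); dropping event instead of throwing.');
  resolve(res); // resolve value simplified for this sketch
  return;
}
let rejectionMessage = `HTTP Error (${statusCode})`;
if (res.headers && res.headers['x-sentry-error']) {
  rejectionMessage += `: ${res.headers['x-sentry-error']}`;
}
reject(new SentryError(rejectionMessage));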

What do you think?

@collierrgbsitisfise

collierrgbsitisfise commented Feb 21, 2022

@AbhiPrasad we were facing the same issue. Would it be a legitimate approach to add an additional parameter on Sentry.init or on Sentry.captureError that explicitly defines whether internal (Sentry) errors should be ignored or not?

@tmilar
Contributor Author

tmilar commented Feb 22, 2022

@AbhiPrasad
Yes, it looks like the default AWS Lambda behavior is to respond with an InvokeError on any unhandled error.

I think errors that can't be handled should never come from an external library.

With the current Sentry implementation I'm not seeing a way the client could catch this error, since it happens asynchronously. A correct approach in this case would be to at least absorb these errors and just log them; maybe log them always, or only when debug: true or some other logging flag is set.
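
Something along these lines inside the Lambda wrapper is roughly what I have in mind (just a sketch to illustrate the absorb-and-log idea, not actual SDK code):

// Sketch only: inside the SDK's Lambda wrapper, a failed flush would be
// caught and logged instead of rejecting the handler's promise.
try {
  await flush(options.flushTimeout);
} catch (e) {
  if (e instanceof SentryError) {
    // Absorb the SDK's own error (optionally only log when debug is enabled),
    // so the business response is still returned.
    console.warn('Failed to flush events to Sentry:', e.message);
  } else {
    throw e;
  }
}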

@ichina
Contributor

ichina commented Feb 23, 2022

@tmilar hopefully this will be fixed by #4620

@tmilar
Contributor Author

tmilar commented Feb 23, 2022

Yes @ichina, that would probably work fine. Thanks!

@AbhiPrasad
Member

Option released with https://github.com/getsentry/sentry-javascript/releases/tag/6.18.0

exports.handler = Sentry.AWSLambda.wrapHandler(yourHandler, {
  // Ignore any errors raised by the Sentry SDK on attempts to send events to Sentry
  ignoreSentryErrors: true,
});
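
For reference, here is roughly how it fits together with the init config from the original report (variable names are taken from that config; the tracesSampleRate value is just an example):

const Sentry = require('@sentry/serverless');

Sentry.AWSLambda.init({
  dsn: sentryKey,
  // Tracing can be re-enabled: with ignoreSentryErrors, send failures such as
  // the 429 above should no longer reject the handler's promise.
  tracesSampleRate: 0.2,
  environment: SLS_STAGE,
  release: `${SLS_SERVICE_NAME}:${SLS_APIG_DEPLOYMENT_ID}`,
});

exports.handler = Sentry.AWSLambda.wrapHandler(yourHandler, {
  ignoreSentryErrors: true,
});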

@mariomnts

Hi! I've seen this also happening when using the Lambda Layer (@sentry/serverless/dist/awslambda-auto), but I can't find a way to configure the ignoreSentryErrors option in that scenario.

@AbhiPrasad
Member

You'll need to be on version 6.18.0 and above - which map to these versions (and above) for the lambda layers: https://raw.githubusercontent.com/getsentry/sentry-release-registry/master/aws-lambda-layers/node/6.18.0.json

@mariomnts

You'll need to be on version 6.18.0 and above - which map to these versions (and above) for the lambda layers: https://raw.githubusercontent.com/getsentry/sentry-release-registry/master/aws-lambda-layers/node/6.18.0.json

@AbhiPrasad and if I'm on that version or above, how can I tell the Sentry layer NOT to fail the Lambda on a Sentry failure or a Sentry 429? Thanks

@AbhiPrasad
Member

You have to pass the option to the handler wrapper, as stated above: https://docs.sentry.io/platforms/node/guides/aws-lambda/#ignore-sentry-errors.

@mariomnts

https://docs.sentry.io/platforms/node/guides/aws-lambda/#ignore-sentry-errors

@AbhiPrasad I think I may be missing something, because when I use the Lambda Layer integration I don't have a handler where I can configure the options, like I do when I use the library.

@lewnelson

We've just had a fairly catastrophic failure on our Lambdas because we had Sentry integrated and hadn't set the ignoreSentryErrors option to true. It feels really odd that it isn't enabled by default. Presumably that's to maintain backwards compatibility, but having it off by default feels like a bug rather than a feature.

It's not easy to catch in testing either, since before going to production and hitting higher traffic loads you're unlikely to run into a rate limit.

@giankotarola

Is ignoreSentryErrors valid for other platforms, for example Electron? I'm seeing unhandledRejection SentryError: HTTP Error (429) in an Electron app, but I'm not sure what the best way is to handle it or avoid it as an unhandledRejection 🤔
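
The best I've found so far is a blunt process-level filter in the Electron main process (just a sketch; it intercepts all unhandled rejections, so use with care):

// Workaround sketch for the Electron main process: swallow unhandled
// rejections that come from the Sentry SDK itself, rethrow everything else.
process.on('unhandledRejection', (reason) => {
  if (reason && reason.name === 'SentryError') {
    console.warn('Ignoring Sentry SDK rejection:', reason.message);
    return;
  }
  throw reason; // keep crash-on-unhandled-rejection behavior for real bugs
});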

@AlastairTaft

AlastairTaft commented May 29, 2022

I have to agree: having an error-reporting package that can bring down your application is incredibly scary. I added a Lambda wrapper feeling confident it would sit there silently and not interrupt the actual logic.

The decision not to make ignoreSentryErrors the default behaviour seems questionable. Many more real-world applications are going to have outages under load until they stumble across this ticket. At the very least, a too-many-requests error specifically shouldn't cause downtime.

@MrRhodes

MrRhodes commented Jun 10, 2022

I would love to know whether a meeting has taken place to decide if ignoreSentryErrors should be on or off by default. I wonder how many people have hit this issue after hitting Sentry transaction limits, only to quickly pay Sentry more money to upgrade their plan...

sincerely

a disappointed customer
