
Magento 2.4 critical load of RAM triggering OOM killer due to CSP module 100.4.0 #29964


Closed
onepack opened this issue Sep 9, 2020 · 84 comments
Labels
Reported on 2.4.0 Indicates original Magento version for the Issue report.

Comments

@onepack

onepack commented Sep 9, 2020

Preconditions (*)

  1. Magento 2.4.0
  2. PHP 7.3 and 7.4
  3. Redis enabled for FPC and backend cache
  4. Elasticsearch 7.8.1
  5. Database: MariaDB 10.4
  6. 8 GB to 10 GB of VPS RAM, with Magento having 2 GB assigned.

Steps to reproduce (*)

  1. Set the shop to production mode
  2. Have around 2000+ products in multiple categories
  3. Clear all caches and also redis-cli flushall
  4. Open a dropdown with categories and click them open in a new tab one by one.
  5. Search a couple of times and wait for the result.

Expected result (*)

  1. The expected result is that all 15 categories would open one by one and look ok
  2. The search result would look ok.

Actual result (*)

  1. The actual result is that maybe 12 of the categories would open fine, but around 2 or 3 would look completely broken, with broken styling or even a 500 server error.
  2. The search results would look broken.
  3. Sometimes the OOM killer was triggered, which would sacrifice Elasticsearch etc.

Please provide Severity assessment for the Issue as Reporter. This information will help during Confirmation and Issue triage processes.

  • [ ] Severity: S0 - Affects critical data or functionality and leaves users without workaround.
  • [x] Severity: S1 - Affects critical data or functionality and forces users to employ a workaround.
  • [ ] Severity: S2 - Affects non-critical data or functionality and forces users to employ a workaround.
  • [ ] Severity: S3 - Affects non-critical data or functionality and does not force users to employ a workaround.
  • [ ] Severity: S4 - Affects aesthetics, professional look and feel, “quality” or “usability”.

We found out via top over SSH that RAM consumption was peaking during category page generation on the frontend (from uncached to cached).
At first we thought it was a Redis issue, because nine out of ten times the page would look broken the way we had seen in the M1 days with Redis issues.
But "the broken page layout" was actually the result of PHP running out of memory.
The broken result would then get cached and stay in the cache until the Redis cache was cleared.

After a lot of digging I found the source of the issue in:
vendor/magento/module-csp/Model/BlockCache.php
in the function load($identifier).
This function uses an enormous amount of RAM.
A quick workaround is disabling the module to keep the shop's RAM usage healthy.
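For anyone wanting to try that workaround, a minimal command sequence would look like this (a sketch assuming a standard install; run from the Magento root as the web user):

```shell
# Disable the CSP module so BlockCache and the CSP collectors no longer run.
bin/magento module:disable Magento_Csp

# Flush the application caches so no oversized CSP entries linger.
bin/magento cache:flush

# If Redis backs the cache, clear it as well (as done in the repro steps).
redis-cli flushall
```

In production mode a recompile (bin/magento setup:di:compile) and redeploy may also be needed after toggling a module.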

The log errors show that this function is causing issues.

I copied the log errors from another user running into the same issue:

[Mon Aug 17 19:40:50.052935 2020] [php7:error] [pid 16932:tid 140669993522944] [client 127.0.0.1:40654] PHP Fatal error: Allowed memory size of 792723456 bytes exhausted (tried to allocate 48238592 bytes) in /var/www/html/vendor/magento/framework/Serialize/Serializer/Json.php on line 24
[Mon Aug 17 19:40:51.592030 2020] [php7:error] [pid 16933:tid 140670169769728] [client 127.0.0.1:40666] PHP Fatal error: Allowed memory size of 792723456 bytes exhausted (tried to allocate 20480 bytes) in /var/www/html/vendor/magento/module-csp/Model/BlockCache.php on line 108
[Mon Aug 17 19:41:12.706529 2020] [php7:error] [pid 17146:tid 140670127806208] [client 127.0.0.1:40702] PHP Fatal error: Allowed memory size of 792723456 bytes exhausted (tried to allocate 20480 bytes) in /var/www/html/vendor/magento/module-csp/Model/BlockCache.php on line 108
[Mon Aug 17 19:41:12.740424 2020] [php7:error] [pid 17146:tid 140670127806208] [client 127.0.0.1:40702] PHP Fatal error: Allowed memory size of 792723456 bytes exhausted (tried to allocate 122880 bytes) in Unknown on line 0
[Mon Aug 17 19:41:24.887257 2020] [php7:error] [pid 16939:tid 140670111020800] [client 127.0.0.1:40798] PHP Fatal error: Allowed memory size of 792723456 bytes exhausted (tried to allocate 4096 bytes) in /var/www/html/vendor/magento/module-csp/Model/BlockCache.php on line 78
[Mon Aug 17 19:41:24.913106 2020] [php7:error] [pid 16939:tid 140670111020800] [client 127.0.0.1:40798] PHP Fatal error: Allowed memory size of 792723456 bytes exhausted (tried to allocate 65536 bytes) in Unknown on line 0
[Mon Aug 17 19:41:39.997127 2020] [php7:error] [pid 16933:tid 140670152984320] [client 127.0.0.1:40904] PHP Fatal error: Allowed memory size of 792723456 bytes exhausted (tried to allocate 20480 bytes) in /var/www/html/vendor/magento/module-csp/Model/BlockCache.php on line 108, referer: http://ip-of-my-ec2-instance:80/
[Mon Aug 17 19:41:40.030659 2020] [php7:error] [pid 16933:tid 140670152984320] [client 127.0.0.1:40904] PHP Fatal error: Allowed memory size of 792723456 bytes exhausted (tried to allocate 40960 bytes) in Unknown on line 0, referer: http://ip-of-my-ec2-instance:80/
[Mon Aug 17 19:42:08.849538 2020] [php7:error] [pid 17146:tid 140670111020800] [client 127.0.0.1:41016] PHP Fatal error: Allowed memory size of 792723456 bytes exhausted (tried to allocate 20480 bytes) in /var/www/html/vendor/magento/framework/Serialize/Serializer/Json.php on line 42
[Mon Aug 17 19:42:08.865389 2020] [php7:error] [pid 17146:tid 140670111020800] [client 127.0.0.1:41016] PHP Fatal error: Allowed memory size of 792723456 bytes exhausted (tried to allocate 32768 bytes) in Unknown on line 0
[Mon Aug 17 19:44:23.245570 2020] [php7:error] [pid 17146:tid 140670027093760] [client 127.0.0.1:41798] PHP Fatal error: Allowed memory size of 792723456 bytes exhausted (tried to allocate 20480 bytes) in /var/www/html/vendor/magento/framework/Serialize/Serializer/Json.php on line 42, referer: https://www.google.com/
[Mon Aug 17 19:44:23.312177 2020] [php7:error] [pid 17146:tid 140670027093760] [client 127.0.0.1:41798] PHP Fatal error: Allowed memory size of 792723456 bytes exhausted (tried to allocate 45056 bytes) in Unknown on line 0, referer: https://www.google.com/
[Mon Aug 17 19:44:29.224653 2020] [php7:error] [pid 16939:tid 140670052271872] [client 127.0.0.1:41742] PHP Fatal error: Allowed memory size of 792723456 bytes exhausted (tried to allocate 20480 bytes) in /var/www/html/vendor/magento/module-csp/Model/BlockCache.php on line 108
[Mon Aug 17 19:44:29.264725 2020] [php7:error] [pid 16939:tid 140670052271872] [client 127.0.0.1:41742] PHP Fatal error: Allowed memory size of 792723456 bytes exhausted (tried to allocate 90112 bytes) in Unknown on line 0
[Mon Aug 17 19:44:34.254009 2020] [php7:error] [pid 17146:tid 140670043879168] [client 127.0.0.1:41688] PHP Fatal error: Allowed memory size of 792723456 bytes exhausted (tried to allocate 20480 bytes) in /var/www/html/vendor/magento/module-csp/Model/BlockCache.php on line 108
[Mon Aug 17 19:44:34.294439 2020] [php7:error] [pid 17146:tid 140670043879168] [client 127.0.0.1:41688] PHP Fatal error: Allowed memory size of 792723456 bytes exhausted (tried to allocate 36864 bytes) in Unknown on line 0
[Mon Aug 17 19:44:40.650547 2020] [php7:error] [pid 17146:tid 140670010308352] [client 127.0.0.1:41882] PHP Fatal error: Allowed memory size of 792723456 bytes exhausted (tried to allocate 20480 bytes) in /var/www/html/vendor/magento/framework/Serialize/Serializer/Json.php on line 42
[Mon Aug 17 19:44:40.690603 2020] [php7:error] [pid 17146:tid 140670010308352] [client 127.0.0.1:41882] PHP Fatal error: Allowed memory size of 792723456 bytes exhausted (tried to allocate 49152 bytes) in Unknown on line 0
[Mon Aug 17 19:44:49.990225 2020] [php7:error] [pid 16939:tid 140670144591616] [client 127.0.0.1:42006] PHP Fatal error: Allowed memory size of 792723456 bytes exhausted (tried to allocate 33554440 bytes) in /var/www/html/vendor/magento/module-csp/Model/Collector/DynamicCollector.php on line 31

@m2-assistant

m2-assistant bot commented Sep 9, 2020

Hi @onepack. Thank you for your report.
To help us process this issue please make sure that you provided the following information:

  • Summary of the issue
  • Information on your environment
  • Steps to reproduce
  • Expected and actual results

Please make sure that the issue is reproducible on the vanilla Magento instance following Steps to reproduce. To deploy vanilla Magento instance on our environment, please, add a comment to the issue:

@magento give me 2.4-develop instance - upcoming 2.4.x release

For more details, please, review the Magento Contributor Assistant documentation.

Please, add a comment to assign the issue: @magento I am working on this


⚠️ According to the Magento Contribution requirements, all issues must go through the Community Contributions Triage process. Community Contributions Triage is a public meeting.

🕙 You can find the schedule on the Magento Community Calendar page.

📞 The triage of issues happens in the queue order. If you want to speed up the delivery of your contribution, please join the Community Contributions Triage session to discuss the appropriate ticket.

🎥 You can find the recording of the previous Community Contributions Triage on the Magento Youtube Channel

✏️ Feel free to post questions/proposals/feedback related to the Community Contributions Triage process to the corresponding Slack Channel

@magento-engcom-team magento-engcom-team added the Issue: Format is valid Gate 1 Passed. Automatic verification of issue format passed label Sep 9, 2020
@gwharton
Contributor

gwharton commented Sep 18, 2020

Allowed memory size of 1073741824 bytes exhausted (tried to allocate 20480 bytes) in /home/ubuntu/www/vendor/magento/module-csp/Model/BlockCache.php on line 108
Allowed memory size of 1073741824 bytes exhausted (tried to allocate 67108872 bytes) in /home/ubuntu/www/vendor/magento/module-csp/Model/Collector/DynamicCollector.php on line 31
Allowed memory size of 1073741824 bytes exhausted (tried to allocate 20480 bytes) in /home/ubuntu/www/vendor/magento/framework/Serialize/Serializer/Json.php on line 37
Allowed memory size of 792723456 bytes exhausted (tried to allocate 4096 bytes) in /home/ubuntu/www/vendor/magento/module-csp/Model/BlockCache.php on line 78

I see it trying to allocate a 67MB CSP policy!!!

I'm going to run with CSP module disabled for a while and see if that helps

@onepack
Author

onepack commented Sep 18, 2020

@gwharton a quick workaround is to disable the CSP module.
This weekend I'm going to dig into what the source of this issue with the CSP load is.

@gwharton
Contributor

Seemed to be something to do with the BlockCache, from the limited debugging I was able to do before throwing my hands in the air and disabling CSP.

@m2-assistant

m2-assistant bot commented Sep 21, 2020

Hi @AlexMaxHorkun. Thank you for working on this issue.
In order to make sure that the issue has enough information and is ready for development, please read and check the following instructions: 👇

    1. Verify that the issue has all the required information (Preconditions, Steps to reproduce, Expected result, Actual result).
       Details: If the issue has a valid description, the label Issue: Format is valid will be added to the issue automatically. Please edit the issue description if needed, until the label Issue: Format is valid appears.
    2. Verify that the issue has a meaningful description and provides enough information to reproduce it. If the report is valid, add the Issue: Clear Description label to the issue yourself.
    3. Add Component: XXXXX label(s) to the ticket, indicating the components it may be related to.
    4. Verify that the issue is reproducible on the 2.4-develop branch.
       Details:
       - Add the comment @magento give me 2.4-develop instance to deploy a test instance on Magento infrastructure.
       - If the issue is reproducible on the 2.4-develop branch, please add the label Reproduced on 2.4.x.
       - If the issue is not reproducible, add a comment that the issue is not reproducible, close the issue, and stop the verification process here!

@AlexMaxHorkun
Contributor

I was not able to reproduce this locally with 1200+ products and dozens of categories. Perhaps this issue can only occur on a heavily resource-limited setup. However, I can see that there is room for improvement performance-wise in the CSP implementation, and it will be addressed in the next 2.4.x release.

@gwharton
Contributor

eeeeekkkkk :(

67MB of CSP headers being allocated is not something that should be happening on any machine, let alone a resource starved one.

There is something wrong in the block cache/CSP area it seems.

I know when mine was failing, none of the hashes from the secure inline scripts were making it into the CSP headers either.

I am using varnish, I don't know if that matters.

@gwharton
Contributor

Also, it only seems to occur at random times. I can let my site run for days with no problem, then all of a sudden something triggers it and I am getting out-of-memory errors and Apache is reporting random 503s to Varnish. If I then clear the cache, it runs fine again for a few days, then bang! I have had no issues whatsoever since disabling CSP. I, for one, don't think module-csp is ready for production use.

@dvershinin

dvershinin commented Sep 29, 2020

This just happened to me on a second store. Redis usage grows to 14 GB in less than 24 hours on a store with moderate traffic.
These amazingly huge CSP hash blobs are the data that is constantly being inserted:

1601365923.254337 [0 unix:/var/run/redis/redis.sock] "HMSET" "zc:k:421_BLOCK_D0D5D79F37A04A787F6AD70C34ADE8FB6C40636F_7886_FINAL_PRICE_LIST_CATEGORY_PAGE_EUR_20200929_8_0_" "d" "{\"policies\":[{\"id\":\"script-src\",\"hosts\":[],\"hashes\":{\"BStHi22teqUu50swU5OZrPUANmifudttOq1xnvuHFOI=\":\"sha256\"}},{\"id\":\"script-src\",\"hosts\":[],\"hashes\":{\"48sb4Je7XoTlJimO7pm\\/+fwXo5BBI6oU4Vci+QqK2\\/I=\":\"sha256\"}},{\"id\":\"script-src\",\"hosts\":[],\"hashes\":{\"tb6kG2xiP3wJ8b8k3K5Y66s8DN2QrZZrxDpFtEhn4Ss=\":\"sha256\"}},{\"id\":\"script-src\",\"hosts\":[],\"hashes\":{\"9N89WMndeXJQQmez3zcXupuWhb0jRtPuHYgRtBa1Cjo=\":\"sha256\"}},{\"id\":\"script-src\",\"hosts\":[],\"hashes\":{\"nHk4LqtZ0zapwB57kWmduJUqeJ54I7FBPnysQqhg0Lw=\":\"sha256\"}},{\"id\":\"style-src\",\"hosts\":[],\"hashes\":{\"cHMJ2AAJCHWGIW4nyTqYh9J2NP0DzQ3KiiYGGPKJLUo=\":\"sha256\"}},{\"id\":\"script-src\",\"hosts\":[],\"hashes\":{\"1z3VYjQl8LtcQRR81EXqqznKWOrYJyJVqu93BkXzkkk=\":\"sha256\"}},{\"id\":\"style-src\",\"hosts\":[],\"hashes\":{\"21LoTH2EQ0uy3yQzlyZSgENO9IzGST3xuv2szY+LU\\/M=\":\"sha256\"}},{\"id\":\"script-src\",\"hosts\":[],\"hashes\":{\"8iWqAQpE8l0bw+D23oEEajCTYJEGUTYn96PKWRkjb4Y=\":\"sha256\"}},{\"id\":\"style-src\",\"hosts\":[],\"hashes\":{\"4rTRTwKUgPAVOi9Zp5e\\/754G27kLnd9Rff\\/NKCEXNNI=\":\"sha256\"}},{\"id\":\"script-src\",\"hosts\":[],\"hashes\":{\"4eIGKpGuhjU8Bsof\\/Mm\\/AHwTzCZUFvevER0FsWS20nw=\":\"sha256\"}},{\"id\":\"style-src\",\"hosts\":[],\"hashes\":{\"UuGKBjKpVoC\\/8zfiy\\/m2jdeGhNqz8QQKQ1K7x0R+A6M=\":\"sha256\"}},{\"id\":\"script-src\",\"hosts\":[],\"hashes\":{\"Xj4FDkARMdOwyPTU2EZ0oj6QTAMX1Ymb\\/op4G0\\/\\/q04=\":\"sha256\"}},{\"id\":\"style-src\",\"hosts\":[],\"hashes\":{\"DYh0vPP8uKimhqykQwOxc4w+rulqsQJ79neQPYRi5Ns=\":\"sha256\"}},{\"id\":\"script-src\",\"hosts\":[],\"hashes\":{\"Wbek+bBB9O15rdozQeUDPmvlHeVul8ZfHTtyXyoRi+U=\":\"sha256\"}},{\"id\":\"style-src\",\"hosts\":[],\"hashes\":{\"QZ98FJG2XexdAsIsPDnEsBCYIbUPWudEt6suwuYiUxo=\":\"sha256\"}},{\"id\":\"script-src\",\"host
s\":[],\"hashes\":{\"LFhxcVU6KRNm8juJ49EGPihbH6dDlRb6s1pQFtrlYD4=\":\"sha256\"}},{\"id\":\"style-src\",\"hosts\":[],\"hashes\":{\"PBEQfFi\\/gVIe7Z8Lwqi\\/WgweTpYkJhki3YFJgZoCzg4=\":\"sha256\"}},{\"id\":\"script-src\",\"hosts\":[],\"hashes\":{\"Q7BepgmxKUsRtpFX4fb3xA3ldLYiBqk0Rk6SVfS5MXQ=\":\"sha256\"}},{\"id\":\"style-src\",\"hosts\":[],\"hashes\":{\"HbpHVw9fO+wZnZFAa5gbVfSwOP\\/oIm7fpZUuTxBYQh8=\":\"sha256\"}},{\"id\":\"script-src\",\"hosts\":[],\"hashes\":{\"ntqeMFyu\\/2v5BpzJHkvA3ZZVlClLaQ3PeLC4IFze8zA=\":\"sha256\"}},{\"id\":\"style-src\",\"hosts\":[],\"hashes\":{\"Jj98CdytrDSwFu+0KylWE\\/9kVZSPQaFIiUWC0kL5mes=\":\"sha256\"}},{\"id\":\"script-src\",\"hosts\":[],\"hashes\":{\"P87jUc1c\\/3wSpzp+87yecrXCwp7hCrve3cLlqumTxYo=\":\"sha256\"}},{\"id\":\"style-src\",\"hosts\":[],\"hashes\":{\"RaxtWHpccZ2t8wqLOChw++JbLmKchcOgMG\\/QTwsex4o=\":\"sha256\"}},{\"id\":\"script-src\",\"hosts\":[],\"hashes\":{\"tSAIDiDO+W1Rcytp2oUj2tF1hlUao38\\/perMphXecKY=\":\"sha256\"}},{\"id\":\"style-src\",\"hosts\":[],\"hashes\":{\"j4lWFssGViA52hejnmgAonncxa2o\\/6Lr4aLD+GuNVFU=\":\"sha256\"}},{\"id\":\"script-src\",\"hosts\":[],\"hashes\":{\"Y1SHvnQlHBvpdz4wFwxiW\\/q6ZOha+TsJxapUF6t+Ru4=\":\"sha256\"}},{\"id\":\"style-src\",\"hosts\":[],\"hashes\":{\"Totv2RYsjc0hUZs2KX8S6eI5YD2c29xu3Xe4PndToT0=\":\"sha256\"}},{\"id\":\"script-src\",\"hosts\":[],\"hashes\":{\"xFweBEfAIzSq85VerbpcJ0CivZDjPrdP9LHyUl5AiB8=\":\"sha256\"}},{\"id\":\"style-src\",\"hosts\":[],\"hashes\":{\"+IUpoW8rS3Fpep3SBtuijdqoGf7VUQTXsC84m4uteR0=\":\"sha256\"}},{\"id\":\"script-src\",\"hosts\":[],\"hashes\":{\"LWixx43imgqMI4Mx49DbtnSoQyQlWCssHJ71KLFWdmQ=\":\"sha256\"}},{\"id\":\"style-src\",\"hosts\":[],\"hashes\":{\"M2F6PKej2Neu4sKPvbFUdC2vPE6uXs9LW4aSZaJUzxY=\":\"sha256\"}},{\"id\":\"script-src\",\"hosts\":[],\"hashes\":{\"ec+4oeXNkrIxRdpBaTzWgj1KpclQHCtgkoVtYn8Xoqo=\":\"sha256\"}},{\"id\":\"style-src\",\"hosts\":[],\"hashes\":{\"dsY+ngYpEyGwSwoEHhiSHthgmVwfisWUDuX4z4lC2ZE=\":\"sha256\"}},{\"id\":\"script-src\",\"hosts\":[],\"
hashes\":{\"S753tbiluj\\/bwkw2dRd\\/LupU8waF6iKffWjlvTFSLoo=\":\"sha256\"}},{\"id\":\"style-src\",\"hosts\":[],\"hashes\":{\"2Jsxye4hbfekcMJ\\/fq\\/HNd6Q51wMgNV8j5UQO1+XkX4=\":\"sha256\"}},{\"id\":\"script-src\",\"hosts\":[],\"hashes\":{\"CkzJPKXZMkxd9P2i9VCb1GTiH30lqzZ\\/P1e3H80ePWk=\":\"sha256\"}},{\"id\":\"style-src\",\"hosts\":[],\"hashes\":{\"OV3DTDxoYMgkjIQZn3RK0sSgJE58mn4274mhvvgafmQ=\":\"sha256\"}},{\"id\":\"script-src\",\"hosts\":[],\"hashes\":{\"My9XEIRO+UJtEuIdIHqaryoloiSZ7K6QsnCV\\/OmI5cM=\":\"sha256\"}},{\"id\":\"style-src\",\"hosts\":[],\"hashes\":{\"jaZS\\/zfcxb9B9kZM+bYmauKa2GWqbrgmJn30+6uSDo8 ...

And it's always huge.

Prior to M2.4.0, the stores could run just fine without maxmemory configured in Redis.

The real problem is not the size; the real problem is the continuous insertion of varying hash data(?), because only that would explain the
continuous growth to these sizes. So if one were to resort to maxmemory-policy allkeys-lru for cache efficiency, they would get the opposite result: a cache holding mostly CSP hash data. It would be highly inefficient, with a cache hit ratio close to ZERO.
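To confirm that these CSP block-cache entries are what dominates Redis memory, redis-cli's built-in samplers can help (a sketch; the zc:k: prefix below matches the HMSET lines above, so adjust it if your cache uses a custom key prefix):

```shell
# Sample the keyspace and report the biggest key per type (hashes, here):
redis-cli --bigkeys

# Watch overall memory usage grow:
redis-cli info memory | grep -E 'used_memory_human|maxmemory_human'

# Count block-cache entries without blocking the server (SCAN-based):
redis-cli --scan --pattern 'zc:k:*BLOCK*' | wc -l
```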

@hostep
Contributor

hostep commented Sep 29, 2020

Maybe the following patch can help: 1767ab8 (which is being mentioned in magento/magento-cloud-patches#44)

We had some Redis OOM issues on at least two Magento 2.3.5 shops, even with the CSP module disabled. This patch seems to fix the OOM issue (although we've only had it running for a couple of days, so we're not 100% sure yet).

It's probably a workaround for the issue described here; there seems to be a deeper underlying issue, if I understand the comments correctly.

@dvershinin

Technically speaking, everyone should set a memory limit in their Redis configuration to avoid OOM altogether.
It is still possible to fill the cache with or without CSP, but with CSP it leads to OOM quite fast, so the CSP module is very problematic in its current state.
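A minimal redis.conf fragment for such a limit might look like this (the 2gb value is purely illustrative; size it to your instance, and note the caveat above that LRU eviction helps little while CSP keeps flooding the cache):

```conf
# Cap Redis memory so the kernel OOM killer never has to step in.
maxmemory 2gb

# Evict least-recently-used keys once the cap is reached.
maxmemory-policy allkeys-lru
```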

@will-wright1

I'm seeing this issue too. Disabling CSP caused me issues, so I've had to set up a cron job to clear the cache every 6 hours!!

@Quazz

Quazz commented Oct 8, 2020

An incredibly high number of keys is being inserted into the normal cache on Magento 2.4.

I have a Redis instance where the cache and full page cache are shared, and the endlessly growing normal cache eventually makes the full page cache useless, since those keys get evicted once the instance runs out of its maxmemory limit.

It also appears that cache:clean does not clean those keys up. Only flushing the Redis DB helps...

@Morgy93
Member

Morgy93 commented Oct 16, 2020

I can confirm this issue without Redis on Magento 2.4.1.
We actually disabled Redis because it consumed more than 60 GB of RAM.

Now with the file cache the hard drive gets eaten instead, and we're left with:

PHP message: PHP Fatal error:  Allowed memory size of 8589934592 bytes exhausted (tried to allocate 20480 bytes) in magento/module-csp/Model/BlockCache.php on line 108
PHP message: PHP Fatal error:  Allowed memory size of 8589934592 bytes exhausted (tried to allocate 77824 bytes) in Unknown on line 0

Yes, even 8 GB of RAM for PHP is not enough to process the page.

I assume that either the foreach loop there is just huge for whatever reason, or whatever calls the save() method is doing so in a big loop.

https://github.com/magento/magento2/blob/2.4.1/app/code/Magento/Csp/Model/BlockCache.php#L108

@AlexMaxHorkun Our server is not at all resource limited, with 16 dedicated vCPUs, 64 GB of RAM and a 160 GB SSD, by the way.

@Morgy93
Member

Morgy93 commented Oct 16, 2020

@nathanjosiah 2.4.2? Did you mean 2.4.1, which has been "generally available" since yesterday?
If yes, I am on the 2.4.1 from yesterday, with the issue described above.

Else just enlighten me 😁

@nathanjosiah
Contributor

@Morgy93 Apologies, I deleted my comment. I meant to say 2.4.1, but then I realized the fixes will not be available until 2.4.2.

@nathanjosiah
Contributor

However, somebody could potentially test using 2.4-develop, since the performance fixes have already been merged there.

@Morgy93
Member

Morgy93 commented Oct 16, 2020

All I can see right now is this one: 047629a#diff-f82a4c282711708e05d2f1db5f2f2369e2533e5bfb184b3f71db21a5756eafaf from 21 days ago.
I could try some manual patching if we find all the necessary files.

The current workaround is to run bin/magento cache:clean every 15 minutes via cron job.
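For anyone copying that stop-gap, the crontab entry could look like this (a sketch; the PHP binary and Magento root paths are assumptions for a typical setup, and it should run as the web server user):

```shell
# Clean Magento caches every 15 minutes (crontab fields: minute hour dom month dow).
*/15 * * * * /usr/bin/php /var/www/html/bin/magento cache:clean
```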

@stevenculshaw

Hi,

Firstly thanks for all of this, it's a problem that's been driving me nuts.

Is there an easy workaround or an easy way to clear the issue on 2.4.0?

I've been monitoring the PHP error log and then reindexing/flushing, which seems to make it go away until next time; that could be an hour away, and it's getting more frequent.

Any help much appreciated.

@gwharton
Contributor

gwharton commented Nov 5, 2020

@stevenculshaw Have you tried applying the patch listed in comment #29964 (comment)?

I have been running with that patch on 2.4.1 with no problems. I don't know if it will apply cleanly on 2.4.0, but there's only one way to find out!
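A sketch of one way to turn that upstream commit into a vendor patch on a Composer-based install (the csp-fix.patch filename is arbitrary; the path rewrite is needed because the upstream repo keeps the module under app/code/Magento/Csp while Composer installs it as vendor/magento/module-csp):

```shell
# Download the upstream commit as a unified diff (GitHub serves <sha>.patch):
curl -sL https://github.com/magento/magento2/commit/047629a.patch -o csp-fix.patch

# Rewrite module paths for a Composer install:
sed -i 's#app/code/Magento/Csp#vendor/magento/module-csp#g' csp-fix.patch

# Dry-run from the Magento root first, then apply for real:
patch -p1 --dry-run < csp-fix.patch && patch -p1 < csp-fix.patch
```

If the commit also touches files outside the Csp module, those paths need the same kind of rewrite; and since vendor edits are lost on the next composer install, a patch-management tool is worth considering for anything long-lived.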

@stevenculshaw

Yeah, tried applying it to 2.4.0 and just got a mash of errors, so it must depend on something else in 2.4.1.

I guess the route is to update to 2.4.1 and then try patching the code in from 2.4.2 again.

I just wondered if anyone had found anything to help with the symptoms while I work out how to do that on the busiest shopping weekend of the year.

@stevenculshaw

Someone mentioned disabling module-csp; is that a really bad idea?

@gwharton
Contributor

gwharton commented Nov 5, 2020

Not at all. Unless you have configured it otherwise, CSP only runs in report-only mode, so it doesn't provide ANY extra security unless you set it up to. So if you haven't configured any additional policies for your site, or changed it to enforcing, the only thing it is giving you is the crashes. Go ahead and disable it. You won't see any negative effects from disabling it.

I did check that patch against 2.4.0 and it should apply cleanly. But like you say, for a busy shopping period... get it disabled if it's causing you issues :)

@stevenculshaw

Thanks for that clarification Graham, much appreciated, will give it a go

@nathanjosiah
Contributor

For what it's worth, @gwharton, your patch contains much more than the changes I mentioned above in #29964 (comment), which is probably why it isn't applying cleanly for @stevenculshaw.

@Morgy93
Member

Morgy93 commented Nov 5, 2020

Yes, the patch contains much more than @nathanjosiah mentioned, and I was a little confused about it, but I'd like to quickly confirm that the patch works wonderfully on 2.4.1.
The production server has currently stayed below 6 GB of RAM for hours (using Redis) and I can browse around as much as I like (not to mention all the other bots and users doing stuff 😁).

Finally fixed, thanks all! 👏

@gwharton
Contributor

gwharton commented Nov 5, 2020

@nathanjosiah As far as I can see, everything in the patch I listed (which was the whole commit by the core team) is needed to support the few lines you posted in your comment. It adds support for the merger that is in your comment.

I also deployed 2.4.0, manually made the same changes as in the patch I posted, then diffed against a vanilla 2.4.0 install and compared the diffs to the original patch; they are identical, so the patch SHOULD apply cleanly on 2.4.0.

@stevenculshaw

Hey all, temporarily disabling the CSP module has provided valuable breathing space; not a single error logged since. I've made a scratch copy of the site, and later I will try the patch again on that and see what happens. Thanks all for your help!

@iphigenie

Just to add a bit of context to this:

I just had this happen: both the cache growing to huge sizes and the out-of-memory errors on frontend pages.

On my production site, which uses an external server-level full page cache (livemage, in my case), the issue happens.
On my test server, which uses the "internal" full page cache, the issue does not seem to happen. Switching the test server to the external cache, it happens.

So either disabling CSP or switching to the (slower, granted) internal cache are options to mitigate this until 2.4.2 is released.

@magento-engcom-team magento-engcom-team added the Reported on 2.4.0 Indicates original Magento version for the Issue report. label Nov 13, 2020
@NudeWeb

NudeWeb commented Jan 19, 2021

I have the same problem but am a bit confused about how to patch. I have a 2.4.1 site.
This is my file vendor/magento/module-csp/Model/Collector/DynamicCollector.php:

Could someone post how this should be changed, and whether any other changes need to be made, please.

```php
<?php
/**
 * Copyright © Magento, Inc. All rights reserved.
 * See COPYING.txt for license details.
 */
declare(strict_types=1);

namespace Magento\Csp\Model\Collector;

use Magento\Csp\Api\Data\PolicyInterface;
use Magento\Csp\Api\PolicyCollectorInterface;

/**
 * CSPs dynamically added during the rendering of current page (from .phtml templates for instance).
 */
class DynamicCollector implements PolicyCollectorInterface
{
    /**
     * @var PolicyInterface[]
     */
    private $added = [];

    /**
     * Add a policy for current page.
     *
     * @param PolicyInterface $policy
     * @return void
     */
    public function add(PolicyInterface $policy): void
    {
        $this->added[] = $policy;
    }

    /**
     * @inheritdoc
     */
    public function collect(array $defaultPolicies = []): array
    {
        return array_merge($defaultPolicies, $this->added);
    }
}
```

@caricell

caricell commented Feb 2, 2021

Can anyone please verify that this issue is fixed in Magento 2.4.2?
We have 2.4.1 with this issue and have disabled the CSP module, which seems to work so far.
However, I do not see in the 2.4.2 release notes that this issue is permanently fixed.

Can anyone advise, please?
Thanks!

@hostep
Contributor

hostep commented Feb 2, 2021

@caricell: I have access to the pre-release of 2.4.2 and can confirm that the commit @gwharton linked to before is included there, but indeed it (MC-37799) doesn't appear in the release notes of 2.4.2, which is strange.
I've alerted the person in charge of the release notes; hopefully she'll be able to figure out why it's not included yet.

@caricell

caricell commented Feb 2, 2021 via email

@caricell

caricell commented Feb 2, 2021

Thank you so much!
I look forward to hearing your findings!

@lagunacellar

We've been running 2.4.2. Can confirm this issue is resolved.

@caricell

caricell commented Feb 2, 2021

Thank you as well, laguna. I guess they just missed it in the release notes.

@hostep
Contributor

hostep commented Feb 2, 2021

I got confirmation that it will be included in the release notes soon 🙂 . Thanks for the heads-up!

@caricell

caricell commented Feb 2, 2021 via email

@pmonosolo

Can anyone please verify that this issue is fixed in M 2.42 ?
We have 2.41 with this issue and have disabled CSP module, which seems to work so far.
However, I do not see in 2.42 release notes that this issue is permanently fixed.

Can anyone advise please?
Thanks!

Nope. I just upgraded to 2.4.2-p1 and still have the same issue on multiple environments and cache types (Redis and file storage cache).

@gwharton
Contributor

gwharton commented Jul 8, 2021

Really, it was fixed in 2.4.2. It was missed from the release notes, but the fix was there. I've been running since 2.4.2 with CSP enforcing with no issues.

@pmonosolo

pmonosolo commented Jul 8, 2021

Really, it was fixed in 2.4.2. It was missed from the release notes, but the fix was there. I've been running since 2.4.2 with CSP enforcing with no issues.

Yup, I created two threads about this on Stack Overflow. I'm not sure how yours is working, but mine is an upgrade from 2.3.3 without any other modifications.

https://magento.stackexchange.com/questions/340934/magento-2-4-2-p1-redis-cache-full
https://stackoverflow.com/questions/68278489/docker-compose-mnt-sdb-is-always-full

@gwharton
Contributor

gwharton commented Jul 9, 2021

I wonder if there is a separate issue. The commit 047629a#diff-f82a4c282711708e05d2f1db5f2f2369e2533e5bfb184b3f71db21a5756eafaf definitely fixed my issue and has been present since 2.4.2.

@pmonosolo

I wonder if there is a separate issue. The commit 047629a#diff-f82a4c282711708e05d2f1db5f2f2369e2533e5bfb184b3f71db21a5756eafaf definitely fixed my issue and has been present since 2.4.2.

I definitely have this in my vendor folder, and both files are exact copies from the commit.

As soon as I disabled the CSP plugin and flushed the cache, it started to work right away. I ran search indexing multiple times that day, and the only time it stopped filling the Redis container's RAM was after disabling the CSP plugin.
