Huge memory usage by canvas.loadFromJSON on NodeJS #1997
Comments
I'm experiencing something similar too.
Can we see a hi-res image also?
Just an example image... In the test case I'm using, I never have more than 6 images.
So 1600x1200 is more or less 8 MB (how many of them?). A 2000x2500 canvas is 20 MB, plus the upperCanvas is another 20 MB. Fonts, I honestly have no idea how much they take. Are you using filters too? That could easily cause bigger memory usage. Is there any tool on Node.js to find out on which operation the memory grows? Can you do something like
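One way to see which operation the memory grows on is Node's built-in `process.memoryUsage()`. A minimal sketch, assuming you wrap it around each rendering step; the labels and the commented-out `loadFromJSON` call are illustrative, not code from this thread:

```javascript
// Log heap usage around each step to see where memory grows.
// process.memoryUsage() is part of Node core, so no extra packages are needed.
function logMemory(label) {
  var mem = process.memoryUsage();
  var toMB = function (bytes) { return Math.round(bytes / 1048576); };
  console.log(label + ': rss=' + toMB(mem.rss) + 'MB heapUsed=' + toMB(mem.heapUsed) + 'MB');
  return mem;
}

logMemory('before loadFromJSON');
// canvas.loadFromJSON(json, function () {
//   logMemory('after loadFromJSON');
// });
```

Comparing the numbers before and after `loadFromJSON` and after the final render narrows down which step holds the memory.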
Possibly related node-canvas issues: Automattic/node-canvas#140
Hi,
Under further investigation, it seems to me like the problem is on line 23639
Just finished trying with node 0.12.0, nothing changed.
Launching node with the node --expose-gc --always-compact options and calling global.gc() at the end of every rendering fixes the issue... I don't like this kind of solution, but I have no other alternative... Also, declaring one canvas and changing its width and height at every iteration helped.
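The workaround above can be sketched as follows. This is a hedged illustration, not code from the thread: the fabric calls are left as comments so the reuse-plus-gc pattern stands on its own, and `global.gc` only exists when node is started with `--expose-gc`:

```javascript
// Run with: node --expose-gc --always-compact render.js
// One canvas is created up front and resized per job, instead of creating a
// fresh canvas (and its backing image buffers) for every rendering.
function renderQueue(jobs, renderOne) {
  // var canvas = fabric.createCanvasForNode(1, 1); // single reused canvas
  jobs.forEach(function (job) {
    // canvas.setWidth(job.width);
    // canvas.setHeight(job.height);
    renderOne(job);
    if (global.gc) global.gc(); // defined only under --expose-gc
  });
}
```

Forcing a full GC after every rendering trades throughput for a bounded heap, which matches the performance cost reported later in this thread.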
@DanieleSassoli I did the same thing in our app (a couple of years ago, and it's still the same). I'm not sure if there's anything we can do in Fabric about this. Perhaps it should just be taken care of at the application level.
I know this is closed, but I just wanted to give my two cents because I still had problems after reading the suggestions in this thread. I had a class with a method that was creating a new canvas object every time it was called. For some reason they weren't being garbage collected, so the memory was never being released. What fixed it for me is the last thing @DanieleSassoli suggested. I made the canvas a property of a class (so there was only ever one instance of the canvas created), and within the method I cleared and resized it, then did what I needed to do. I did not have to use --expose-gc --always-compact or call global.gc() to prevent the leak. Hope this helps someone.
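A minimal sketch of that single-instance pattern. `Renderer` and `createCanvas` are hypothetical names; in this thread the factory would be something like `fabric.createCanvasForNode`, kept out of the sketch so the pattern is framework-agnostic:

```javascript
// Keep one canvas for the lifetime of the object; clear/resize it per call
// instead of constructing a new canvas each time.
function Renderer(createCanvas) {
  this.canvas = createCanvas(); // created exactly once
}
Renderer.prototype.render = function (width, height) {
  this.canvas.width = width;   // reuse the same instance, just resized
  this.canvas.height = height;
  // ...clear, draw, and export here...
  return this.canvas;
};
```

Because every call touches the same object, nothing per-render is left for the garbage collector to miss.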
@DPflasterer can you send me a gist of what you did? I'm currently having this same problem in production, and Fabric.js seems like the best API out there to generate these images. @DanieleSassoli does the solution you provided keep working for you?
@vjames19 yes, still working for me.
@DanieleSassoli It works, but it definitely brings down performance.
@vjames19 I haven't run accurate tests on this; at the moment I haven't got the time to do so.
We also ran into this problem when doing any kind of concurrent or repetitive manipulation of multiple canvases:

var canvas = fabric.createCanvasForNode();
// ...canvas manipulation...
canvas.clear();
canvas.dispose();
// garbage collection should be good now
Just a side-note -
FYI guys, we run on embedded PCs which are really tight in terms of memory. We need to run under 80 MB of RAM consistently. We did some memory analysis... our issues were memory leaks within fabricjs in the
what are you doing with your app? |
Hi,
I'm rendering some canvases on node 0.10.25. The JSON arrives from a client; I change all the sources of the images just to load the hi-res images, then I load the canvas, render it, and finally write it to a file. The problem is that creating the canvas occupies 200 MB on average, and loadFromJSON in some cases reaches 1 GB PER CANVAS, and this memory never gets released, or at least not all of it.
I have to render many canvases, and this huge memory usage often leads to an ENOMEM error, and if I try to load them in parallel (as I would like to) it always throws an ENOMEM. This is my code:
I tried calling
canvas.destroy();
and then
canvas = null;
on the out.on('finish') event, but nothing changes. Am I getting something wrong, or is this an issue?
Thanks.
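The code that followed "This is my code:" did not survive the page extraction, so purely as an illustration, here is a sketch of the hi-res source-swapping step the report describes. The `-thumb` suffix and the `useHiResSources` name are assumptions standing in for whatever low-res naming convention the client actually uses:

```javascript
// Before calling canvas.loadFromJSON, rewrite every image object's src so the
// server loads the hi-res version. The "-thumb" suffix is a hypothetical
// convention; adapt the rewrite to your own URL scheme.
function useHiResSources(state) {
  (state.objects || []).forEach(function (obj) {
    if (obj.type === 'image' && typeof obj.src === 'string') {
      obj.src = obj.src.replace(/-thumb(\.\w+)$/, '$1'); // photo-thumb.png -> photo.png
    }
  });
  return state;
}
```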