I have read and understood the contribution guidelines
I'm trying to generate a PDF from a React component. In my scenario it is a table, but it could be anything. The problem I'm facing is not with generating the PDF, but with the produced PDF itself, which is too heavy and unusable.
I pass in an HTMLElement, which is the DOM content of my React component, and it generates the PDF correctly. However, this PDF is very big and extremely slow to load. Mind that I have an i9 with 32 GB of RAM, and it easily takes 15 seconds to render one page of the PDF in the browser.
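For context, the conversion goes through something like the following (a minimal sketch assuming jsPDF's html() method driven by html2canvas; the function name and option values are illustrative, not my exact code):

```js
import { jsPDF } from "jspdf";

// Illustrative helper: renders an already-mounted DOM element
// (e.g. obtained from a React ref) into a PDF and downloads it.
function exportElementToPdf(element, fileName = "table.pdf") {
  const doc = new jsPDF({
    orientation: "portrait",
    unit: "pt",
    format: "a4",
    compress: true, // shrinks the file from ~150 MB to ~600 KB, but viewing stays slow
  });

  doc.html(element, {
    html2canvas: { scale: 0.75 }, // html2canvas options; tweaking these didn't help
    callback: (pdf) => pdf.save(fileName),
  });
}
```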
Initially it was generating a 150 MB PDF, but after I set compress to true it's now around 600 KB. That changes the file size, but it didn't improve the performance at all. I've tried multiple computers and browsers, and I've tinkered with the options for html2canvas, and nothing seems to fix this.
I did notice, however, that PDF readers like Adobe Acrobat seem to deal with it fine.
Here is the PDF (DocDroid seems to be able to read it fine, but the browser's PDF viewer can't handle it)
After some analysis of the produced document, I noticed a couple of things:
Too much high-precision content: once decompressed to plain text, the content becomes a hog at 1,175,023 bytes. One duplicated box entry looks like this:

10. 1653.7799999999999727 m 569. 1653.7799999999999727 l 569. 588.6534374999998818 l 10. 588.6534374999998818 l 10. 1653.7799999999999727 l 569. 1653.7799999999999727 l
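For comparison, rounding those coordinates to two decimals (already far more precision than needed at the default user-space resolution of 1/72 inch) would describe the same box as just:

10 1653.78 m 569 1653.78 l 569 588.65 l 10 588.65 l 10 1653.78 l 569 1653.78 l

and that saving would be repeated for every path in the stream.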
Every letter is drawn with masses of detail. Here is the first letter, "I", on my first page:

BT /F1 9.1 Tf 10.4649999999999981 TL 0. 0. 0. rg 20.3999999999941792 794.1899999999999409 Td (I) Tj ET 1. w 0. 0. 0. rg 1. G 0. w 0 j 0. 0. 0. rg 10. 824.0900000000000318 m 565.0999999999985448 824.0900000000000318 l 565.0999999999985448 -225.4365625000001501 l 10. -225.4365625000001501 l 10. 824.0900000000000318 l 565.0999999999985448 824.0900000000000318 l W n 0. w

If this were being generated correctly, the text should be described as plain text runs (ASCII or UTF), but it's not.
There is something seriously wrong with the PDF generation engine if it is not combining letters into words. But even if lines were emitted as whole runs of text, that .999999999 coordinate structure is massive bloat that forces extra parsing and recalculation time.
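As a rough sketch of what a sane stream could look like, a whole line of text would be a single run with rounded coordinates, something along these lines (the string content here is just illustrative):

BT /F1 9.1 Tf 0 0 0 rg 20.4 794.19 Td (Full text of the first line as one run) Tj ET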
If someone has any insight on this, it would be extremely helpful.