Fetch large data for reports slow #6543
This should not be slow. The most obvious reasons why it may be slow:
Considering your question, I suspect the problem is item 1. Try properly indexing your collection.
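The indexing suggestion above can be sketched in the mongo shell; the collection name Product and the field SOMEKEY are assumptions taken from the query shown later in this thread:

```javascript
// Run in the mongo shell against the Parse database (names are assumptions):
// index the field used in the equalTo constraint so the find doesn't collection-scan.
db.Product.createIndex({ SOMEKEY: 1 });

// Verify the query now uses the index (look for IXSCAN instead of COLLSCAN):
db.Product.find({ SOMEKEY: "SOMEVALUE" }).explain("executionStats");
```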
@omkar-tenkale You can look at the logs of your database to find out how long the query takes on the DB side.
With mongo profiling I can see 58 queries are run for 58,000 rows. The last 2 blocks of the log:
Parse is still taking up to 10 seconds for 58,000 items without any query constraints.
The query is simple; this is the cloud code: async function indexTest(req, res) { var Product = Parse.Object.extend("Product"); …
This is the log (pm2 logs 0): 0|app1 | timelog: 10.097s
In your mongodb logs I don't see any excessive execution times. Can you turn on profiling to only log queries that take longer than 100ms?
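The 100 ms profiling threshold asked for above can be set in the mongo shell like this (a sketch; profiling level 1 records only operations slower than the threshold):

```javascript
// In the mongo shell: profile only operations slower than 100 ms.
db.setProfilingLevel(1, { slowms: 100 });

// Slow operations then show up in the system.profile collection:
db.system.profile.find().sort({ ts: -1 }).limit(5).pretty();
```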
Actually, the …
How can I include execStats?
Here's what I could find:
It looks like the getMores are occurring in just a few millis. There is a bottleneck of some sort.
In the mongo shell it's quite fast, 1-2 seconds.
Closely similar to …
Parse Server specs: Server version 3.9.0 (from Parse Dashboard app list), db.version(), System Ubuntu 16.04
Tried accessing the DB directly (included in index.js). Processed in under 3 seconds.
So the problem lies not in the DB but in the server.
Are MongoDB and the server both running locally?
On the same Ubuntu 16.04 VPS server.
CPU: there's a visible spike but CPU usage doesn't reach 100%. There may be a memory leak somewhere, but that doesn't account for our problem, as even a freshly restarted server with 86 MB usage gives slow responses. Network connections: no idea on this; it doesn't seem to be the issue, as responses are slow even when no other client is interacting with the server.
There are other pm2 Parse instances, but even stopping them doesn't solve the issue.
Working on other checks. Does this info give any clues?
Is it possible to upgrade the database server? That is definitely an unsupported db version. Can you test with >= Mongo 3.6?
After updating: Parse Server version 4.2.0 (from Parse Dashboard app list), db.version(). Updating has not solved the issue.
Did you try to profile the Node.js app?
Seems to be related to mongoObjectToParseObject. Generated using node --prof index.js … This is the full log:
Maybe related to #4658 (comment):
Closing via #7214
@omkar-tenkale It would be great if you could try the new master branch and confirm whether #7214 solves the performance loss.
This issue made us move away from Parse :(
@dplewis should we really close this already? We did quite some investigation into this one, I remember, and the performance issue was evidently there. I think someone could re-run these perf tests.
May I know what the solution is? I also have the same issue.
Re-opened for further investigation; however, the issues reported by different users here may or may not be related to each other. The underlying issue may be the conversion from MongoDB document to Parse Object, which in its current implementation seems to be quite resource-demanding; maybe there is something to be improved. In any case, we would need someone to do the testing and gather performance data to compare; otherwise we cannot investigate the issue and we may as well close it again.
Old one, but while inspecting our servers for perf diagnostics, I came across this issue. In our case, this code block is taking 1% of our total CPU time 😅
@SebC99 I use query.find({ json: true }) to return the raw data, because the JS encoding to Parse.Object is slow.
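A minimal cloud-code sketch of the { json: true } approach mentioned above; it assumes a Parse Server version where find accepts this option, and the function name and query fields are hypothetical:

```javascript
// Hypothetical cloud function: asks the SDK for plain JSON instead of
// Parse.Object instances, skipping the slow decode step discussed above.
Parse.Cloud.define("productReport", async (request) => {
  const query = new Parse.Query("Product");
  query.equalTo("SOMEKEY", "SOMEVALUE");
  query.limit(1000);
  // `json: true` returns raw JSON (per the comment above). `useMasterKey`
  // bypasses ACLs, so only do this for trusted report generation.
  return query.find({ json: true, useMasterKey: true });
});
```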
@dplewis I didn't even know we could do that... all the cloud code could use this, and return JSON to the clients (we just need to add the …
@SebC99 I saw similar performance issues with the parseDate function.
True
And if I remember correctly, one issue of course is the parsing, which is slow, but also the sheer number of calls to this parsing, as every hook (beforeFind, afterFind, etc.) creates the string objects to log before checking whether the log level is met.
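The point about hooks building log strings before the level check can be sketched as a lazy-logging guard. This is a minimal illustration only; the logger shape and level names are assumptions, not Parse Server's actual logger API:

```javascript
// Hedged sketch: avoid building log strings in hot hooks unless they will be
// emitted. `logger` is any object with a `level` string and a `log` method
// (a stand-in for Parse Server's winston-based logger, whose exact shape is
// an assumption here).
const VERBOSE_LEVELS = new Set(['verbose', 'debug', 'silly']);

function logVerbose(logger, makeMessage) {
  // Check the level first, so expensive work (e.g. JSON.stringify of a large
  // query result) is skipped entirely when the message would be dropped.
  if (!VERBOSE_LEVELS.has(logger.level)) return;
  logger.log(makeMessage()); // message built lazily, only when it will be emitted
}
```

Passing a `makeMessage` callback instead of a pre-built string is the key design choice: the serialization cost is only paid when the log level actually allows the message.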
Oh. Nice find.
I've just checked, and the regex is better outside for the function of course, but the impact isn't huge. Replacing the regex by |
@SebC99 any news on this optimization?
+1
I'm trying to fetch more than 100,000 rows for report generation.
I tried these approaches; both calls are slow.
For the cloud code query:
query = // new query
query.equalTo("SOMEKEY", "SOMEVALUE")
For a total of 1,500,000 rows in cloud code: with the recursive-call approach, there is approx. 1 second of waiting time per 1000 rows; with the query.each() approach, the query takes approx. 60 seconds of HTTP waiting time to process.
Is there a better approach for processing such huge data?
One alternative I can think of is Parse aggregate, but this approach doesn't respect Parse authentication, security, ACLs, etc., and doesn't work with existing query logic.
Would this show any performance improvement, and is it worth a try?
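For reference, the recursive/paged fetch described above can be sketched as a skip/limit loop. makeQuery is a hypothetical factory standing in for new Parse.Query("Product") with its constraints applied; it is used here so the loop's logic is self-contained and the Parse SDK is not required:

```javascript
// Hedged sketch of the paged-fetch approach from the issue: fetch pages of
// `pageSize` rows until a short page signals the end of the result set.
// With the real SDK, `makeQuery()` would return a fresh Parse.Query and each
// page would be `await query.limit(pageSize).skip(offset).find()`.
async function fetchAll(makeQuery, pageSize = 1000) {
  const rows = [];
  let offset = 0;
  for (;;) {
    const page = await makeQuery().limit(pageSize).skip(offset).find();
    rows.push(...page);
    if (page.length < pageSize) break; // short page: no more rows
    offset += pageSize;
  }
  return rows;
}
```

Note that skip-based pagination degrades on very large offsets; ordering by a unique key (e.g. createdAt plus objectId) and using a greaterThan constraint as the cursor is the usual alternative for datasets of this size.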