
Parse Server performance loss #7036


Closed · 4 tasks done
uzaysan opened this issue Dec 2, 2020 · 46 comments

@uzaysan

uzaysan commented Dec 2, 2020

New Issue Checklist

Issue Description

When I do performance tests with Parse Server, performance decreases. And when I say decreases, I don't mean it decreases a little and then stops; it keeps decreasing and never stops. But this only happens when we use a Parse query in Cloud Code and the query returns Parse object(s). Queries that don't return Parse objects (e.g. count queries), or queries that should return an array of Parse objects but return an empty array (for example when the collection is empty), are fine; their performance doesn't decrease. The problem only happens when the query returns Parse objects. You can find more information in the forum thread that I created.

Steps to reproduce

Create a Cloud Code function with a simple query in it (the query must return Parse object(s)). Then run a continuous benchmark test against this Cloud Code function.
I used the wrk tool for the benchmark tests. This is the command I used:
wrk -t1 -c400 -d30s -H "Content-Type: application/json" -H "X-Parse-Application-Id: YOUR_APP_ID" -s post.lua http://parse_server_ip:1337/parse/functions/codeWithQuery

Here is an example cloud code:

Parse.Cloud.define("codeWithQuery", async (request) => {
  const Follow = Parse.Object.extend("Follow");
  // The Follow class has 35 objects, so this query returns 35 Parse objects.
  const getFollow = new Parse.Query(Follow);
  return await getFollow.find({useMasterKey:true});
});

Actual Outcome

Requests per second decrease continuously. You can see my benchmark history in this message, and more information in that forum thread.

Expected Outcome

Performance shouldn't decrease. Even if it does, it should stabilize after some time.

Environment

Server

  • Parse Server version: 4.3.0
  • Operating system: Ubuntu 18.04
  • Local or remote host (AWS, Azure, Google Cloud, Heroku, Digital Ocean, etc): Remote host

Database

  • System (MongoDB or Postgres): MongoDB
  • Database version: 4.4.1
  • Local or remote host (MongoDB Atlas, mLab, AWS, Azure, Google Cloud, etc): Remote

Client

  • SDK (iOS, Android, JavaScript, PHP, Unity, etc): Cloud Code
  • SDK version: Cloud Code

Logs

@mtrezza
Member

mtrezza commented Dec 2, 2020

Maybe related to #6543.

@mtrezza
Member

mtrezza commented Dec 2, 2020

Thanks for reporting.

To eliminate the possibility that this is caused by any custom configuration, can you please redo the tests with:

  • a fresh, clean, basic installation of Parse Server
  • some simple example data in the database; I suggest creating a class with just a number field (a seeding sketch follows below)

Then it would be interesting if you could share your Parse Server configuration and a full description of the Parse Objects that you queried. You could also try a different or external benchmarking tool to get a second opinion.
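
For reference, creating that kind of minimal test data could look roughly like this (a sketch only; the class name "BenchmarkTest" and the field name "number" are placeholders, not names used in this thread):

Parse.Cloud.define("seedBenchmarkData", async () => {
  // Create one object in a class that has just a single number field.
  const obj = new Parse.Object("BenchmarkTest");
  obj.set("number", 1);
  return await obj.save(null, { useMasterKey: true });
});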

@uzaysan
Author

uzaysan commented Dec 2, 2020

Hey @mtrezza, thank you for taking the time on this. I can do what you suggested, but I want to mention something first. While testing, I changed my code.

Previously my cloud code was like this:

Parse.Cloud.define("code", async (request) => {

  const Follow = Parse.Object.extend("Follow");
  
  const getFollow = new Parse.Query(Follow);
  return await getFollow.find({useMasterKey:true});

});

And I changed it to this:

Parse.Cloud.define("code", async (request) => {
  return await Parse.Cloud.httpRequest({
    url: `http://127.0.0.1:1337/parse/classes/Follow`,
    method: 'GET',
    headers: {
        'X-Parse-Application-Id': "myAppId"
    }
  });
});

Both Cloud Code functions return the same thing (well, the second one also returns the status code, HTTP headers, etc. as extras, but it returns the query result too, so it's basically the same).

But the second function doesn't decrease performance. It was stable. In fact, I get about a 30% performance increase with it.
Requests per second:
First function: 70 - 60 - 45 ...
Second function: 90 - 103 - 108 - 98 - 103 ...

So this way performance doesn't decrease and stays stable. What do you think about this?

@mtrezza
Member

mtrezza commented Dec 2, 2020

Very interesting indeed. Can you share your Parse Server configuration?

@uzaysan
Author

uzaysan commented Dec 2, 2020

@mtrezza Of course

var api = new ParseServer({
  allowClientClassCreation:false,
  databaseURI: databaseUri || 'mongodb://localhost:27017/dev',
  cloud: process.env.CLOUD_CODE_MAIN || __dirname + '/cloud/main.js',
  appId: process.env.APP_ID || 'appId',
  masterKey: process.env.MASTER_KEY || 'masterKey', //Add your master key here. Keep it secret!
  maxUploadSize: "10mb",
  databaseOptions: { poolSize: 100 },
  directAccess: true,
  cacheTTL: 0,
  enableAnonymousUsers: false,
  enableSingleSchemaCache: true,
  serverURL: process.env.SERVER_URL || 'http://my_ip:1337/parse',
  appName: 'App Name',
  filesAdapter: s3Adapter,
  emailAdapter: emailAdapter,
  publicServerURL: 'https://example.com/parse'
});

@davimacedo
Member

directAccess should help with that, but since it is still an experimental feature, could you repeat the test with the original code and directAccess set to false? It should basically do the same thing you did in the second test.
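
For clarity, the suggested test is just flipping that flag in the ParseServer options shared above (a sketch, not a full configuration):

var api = new ParseServer({
  // ...same options as in the configuration shared earlier...
  directAccess: false, // previously true
});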

@uzaysan
Author

uzaysan commented Dec 3, 2020

@davimacedo Hey. Unfortunately, setting directAccess to false didn't work. Here is my test history with my original code (directAccess is false):

1 threads and 30 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency   479.86ms   75.73ms 659.70ms   61.51%
    Req/Sec    66.45     37.16   180.00     68.03%
  1860 requests in 30.05s, 18.34MB read
Requests/sec:     61.90
Transfer/sec:    625.00KB
------------------------------------------------------------------------
1 threads and 30 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency   742.65ms  149.09ms   1.99s    95.48%
    Req/Sec    43.78     19.46    80.00     67.05%
  1187 requests in 30.05s, 11.70MB read
  Socket errors: connect 0, read 0, write 0, timeout 8
Requests/sec:     39.50
Transfer/sec:    398.91KB
------------------------------------------------------------------------
1 threads and 30 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency   812.93ms   87.04ms   1.10s    76.50%
    Req/Sec    38.15     15.09    60.00     68.12%
  1085 requests in 30.04s, 10.70MB read
Requests/sec:     36.12
Transfer/sec:    364.68KB
------------------------------------------------------------------------
1 threads and 30 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency   935.15ms   86.71ms   1.16s    83.60%
    Req/Sec    33.27     12.77    50.00     61.01%
  945 requests in 30.05s, 9.32MB read
Requests/sec:     31.45
Transfer/sec:    317.53KB
------------------------------------------------------------------------
1 threads and 30 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency     1.05s   141.00ms   1.61s    88.21%
    Req/Sec    29.80     10.89    50.00     82.85%
  840 requests in 30.05s, 8.28MB read
Requests/sec:     27.96
Transfer/sec:    282.30KB
-------------------------------------------------------------------------
1 threads and 30 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency     1.14s   140.36ms   1.47s    80.83%
    Req/Sec    27.26      9.25    40.00     68.25%
  772 requests in 30.04s, 7.61MB read
Requests/sec:     25.69
Transfer/sec:    259.46KB
---------------------------------------------------------------------------
1 threads and 30 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency     1.18s   119.30ms   1.47s    85.92%
    Req/Sec    26.47      9.03    40.00     73.72%
  746 requests in 30.06s, 7.36MB read
Requests/sec:     24.81
Transfer/sec:    250.56KB
----------------------------------------------------------------------------
1 threads and 30 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency     1.34s   156.01ms   1.68s    86.45%
    Req/Sec    23.17      8.08    40.00     81.88%
  657 requests in 30.04s, 6.48MB read
Requests/sec:     21.87
Transfer/sec:    220.81KB
--------------------------------------------------------------------------
1 threads and 30 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency     1.43s   155.88ms   1.95s    77.83%
    Req/Sec    22.37      8.49    40.00     78.85%
  612 requests in 30.06s, 6.03MB read
  Socket errors: connect 0, read 0, write 0, timeout 3
Requests/sec:     20.36
Transfer/sec:    205.61KB
------------------------------------------------------------------------
1 threads and 30 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency     1.55s   165.92ms   1.99s    88.63%
    Req/Sec    19.65      7.77    30.00     49.23%
  538 requests in 30.05s, 5.31MB read
  Socket errors: connect 0, read 0, write 0, timeout 72
Requests/sec:     17.90
Transfer/sec:    180.76KB
---------------------------------------------------------------------
1 threads and 30 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency     1.57s   201.17ms   1.99s    87.76%
    Req/Sec    19.29      7.49    30.00     53.16%
  537 requests in 30.05s, 5.30MB read
  Socket errors: connect 0, read 0, write 0, timeout 47
Requests/sec:     17.87
Transfer/sec:    180.43KB
------------------------------------------------------------------
1 threads and 30 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency     1.66s   194.30ms   1.99s    86.24%
    Req/Sec    18.35      6.67    30.00     59.12%
  521 requests in 30.05s, 5.14MB read
  Socket errors: connect 0, read 0, write 0, timeout 5
Requests/sec:     17.34
Transfer/sec:    175.07KB
------------------------------------------------------------------
1 threads and 30 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency     1.58s   177.14ms   1.99s    87.66%
    Req/Sec    18.53      7.19    30.00     56.15%
  505 requests in 30.05s, 4.98MB read
  Socket errors: connect 0, read 0, write 0, timeout 108
Requests/sec:     16.81
Transfer/sec:    169.70KB

And these are the test results with my second code (directAccess is false):

1 threads and 30 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency   317.31ms   81.34ms 597.19ms   68.16%
    Req/Sec    95.52     49.58   232.00     64.71%
  2817 requests in 30.04s, 147.77MB read
Requests/sec:     93.77
Transfer/sec:      4.92MB
------------------------------------------------------------------------
1 threads and 30 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency   303.74ms   76.44ms 559.74ms   65.90%
    Req/Sec   100.49     52.35   217.00     62.98%
  2944 requests in 30.05s, 154.29MB read
Requests/sec:     97.98
Transfer/sec:      5.13MB
-------------------------------------------------------------------------
1 threads and 30 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency   300.25ms   80.35ms 669.52ms   65.65%
    Req/Sec   101.23     53.94   240.00     65.52%
  2978 requests in 30.04s, 156.29MB read
Requests/sec:     99.12
Transfer/sec:      5.20MB
------------------------------------------------------------------------------
1 threads and 30 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency   286.56ms   68.18ms 547.28ms   72.88%
    Req/Sec   106.67     60.80   260.00     63.67%
  3116 requests in 30.03s, 163.25MB read
Requests/sec:    103.77
Transfer/sec:      5.44MB
-----------------------------------------------------------------------------
1 threads and 30 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency   295.32ms   74.57ms 567.86ms   70.37%
    Req/Sec   104.47     55.34   232.00     60.78%
  3031 requests in 30.04s, 158.80MB read
Requests/sec:    100.88
Transfer/sec:      5.29MB
--------------------------------------------------------------------------------
1 threads and 30 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency   293.58ms   70.19ms 541.37ms   68.97%
    Req/Sec   104.48     58.37   252.00     64.83%
  3045 requests in 30.05s, 159.93MB read
Requests/sec:    101.34
Transfer/sec:      5.32MB
---------------------------------------------------------------------------------
1 threads and 30 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency   295.09ms   76.47ms 571.64ms   66.86%
    Req/Sec   104.72     57.28   260.00     63.83%
  3030 requests in 30.05s, 158.75MB read
Requests/sec:    100.83
Transfer/sec:      5.28MB
--------------------------------------------------------------------------------
1 threads and 30 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency   289.72ms   70.19ms 540.56ms   72.74%
    Req/Sec   105.64     51.80   250.00     63.54%
  3092 requests in 30.04s, 162.00MB read
Requests/sec:    102.94
Transfer/sec:      5.39MB
-----------------------------------------------------------------------------------
1 threads and 30 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency   315.76ms  104.00ms 962.10ms   84.36%
    Req/Sec    97.85     57.07   250.00     59.51%
  2837 requests in 30.05s, 149.35MB read
Requests/sec:     94.41
Transfer/sec:      4.97MB
----------------------------------------------------------------------------------
1 threads and 30 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency   297.30ms   74.37ms 571.66ms   66.87%
    Req/Sec   103.20     51.96   232.00     61.75%
  3009 requests in 30.05s, 157.74MB read
Requests/sec:    100.14
Transfer/sec:      5.25MB
----------------------------------------------------------------------------------
1 threads and 30 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency   319.34ms   93.17ms 762.05ms   73.88%
    Req/Sec    95.70     54.94   260.00     59.93%
  2804 requests in 30.05s, 146.91MB read
Requests/sec:     93.31
Transfer/sec:      4.89MB
-----------------------------------------------------------------------------------
1 threads and 30 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency   297.52ms   70.50ms 566.23ms   67.61%
    Req/Sec   102.30     55.37   290.00     63.01%
  3013 requests in 30.04s, 157.86MB read
Requests/sec:    100.29
Transfer/sec:      5.25MB
------------------------------------------------------------------------------------
1 threads and 30 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency   302.28ms   77.71ms 575.40ms   68.36%
    Req/Sec   101.50     53.86   232.00     64.11%
  2952 requests in 30.06s, 155.41MB read
Requests/sec:     98.22
Transfer/sec:      5.17MB

@davimacedo
Member

I have one more question. I see that in your second code you are not passing the master key. Is this query really working and returning the same results as the first code (since in the first one you are using the master key)?

@uzaysan
Author

uzaysan commented Dec 3, 2020

Using the master key is a habit of mine. I disable all class-level permissions for every class, so I put the master key in every query. But for testing purposes, I enabled public permissions for that class.
Yes, the second code returns the query result.

Here is what second code returns:

{
    "result": {
        "status": 200,
        "headers": {
            "x-powered-by": "Express",
            "access-control-allow-origin": "*",
            "access-control-allow-methods": "GET,PUT,POST,DELETE,OPTIONS",
            "access-control-allow-headers": "X-Parse-Master-Key, X-Parse-REST-API-Key, X-Parse-Javascript-Key, X-Parse-Application-Id, X-Parse-Client-Version, X-Parse-Session-Token, X-Requested-With, X-Parse-Revocable-Session, X-Parse-Request-Id, Content-Type, Pragma, Cache-Control",
            "access-control-expose-headers": "X-Parse-Job-Status-Id, X-Parse-Push-Status-Id",
            "content-type": "application/json; charset=utf-8",
            "content-length": "9708",
            "etag": "W/\"25ec-LhWfCVTz+HZrI3HA0XyJHrP3olU\"",
            "date": "Thu, 03 Dec 2020 07:26:12 GMT",
            "connection": "close"
        },
        "buffer": {
            "type": "Buffer",
            "data": [
                123,
                34,
                ...// and 9 thousand more integer
                93,
                125
            ]
        },
        "text": "{\"results\":[{\"objectId\":\"lhgGyETaOX\",\"owner\" .... //Stringified json I guess? ... }",
       "data": {
            "results": [
                {
                    "objectId": "lhgGyETaOX",
                    "owner": {
                        "__type": "Pointer",
                        "className": "_User",
                        "objectId": "lyIMZ2wRk5"
                    },
                    "who": {
                        "__type": "Pointer",
                        "className": "_User",
                        "objectId": "KQNjPL3nek"
                    },
                    "cid": "lyIMZ2wRk5KQNjPL3nek",
                    "createdAt": "2020-11-23T10:03:13.759Z",
                    "updatedAt": "2020-11-23T10:03:13.759Z"
                },
                {
                    "objectId": "kVcHyHvwBl",
                    "owner": {
                        "__type": "Pointer",
                        "className": "_User",
                        "objectId": "lyIMZ2wRk5"
                    },
                    "who": {
                        "__type": "Pointer",
                        "className": "_User",
                        "objectId": "gKd3ETSy5O"
                    },
                    "cid": "lyIMZ2wRk5gKd3ETSy5O",
                    "createdAt": "2020-11-23T12:59:41.688Z",
                    "updatedAt": "2020-11-23T12:59:41.688Z"
                },
                {
                    "objectId": "bKcoiiCKAC",
                    "owner": {
                        "__type": "Pointer",
                        "className": "_User",
                        "objectId": "gKd3ETSy5O"
                    },
                    "who": {
                        "__type": "Pointer",
                        "className": "_User",
                        "objectId": "lyIMZ2wRk5"
                    },
                    "cid": "gKd3ETSy5OlyIMZ2wRk5",
                    "createdAt": "2020-11-23T13:13:11.139Z",
                    "updatedAt": "2020-11-23T13:13:11.139Z"
                },
                {
                    "objectId": "cofJONYpYR",
                    "owner": {
                        "__type": "Pointer",
                        "className": "_User",
                        "objectId": "lyIMZ2wRk5"
                    },
                    "who": {
                        "__type": "Pointer",
                        "className": "_User",
                        "objectId": "Ddnzb1Z3s9"
                    },
                    "cid": "lyIMZ2wRk5Ddnzb1Z3s9",
                    "createdAt": "2020-11-24T16:37:09.600Z",
                    "updatedAt": "2020-11-24T16:37:09.600Z"
                },
                {
                    "objectId": "lhhxh7hsLf",
                    "owner": {
                        "__type": "Pointer",
                        "className": "_User",
                        "objectId": "lyIMZ2wRk5"
                    },
                    "who": {
                        "__type": "Pointer",
                        "className": "_User",
                        "objectId": "eQgbVtFMeT"
                    },
                    "cid": "lyIMZ2wRk5eQgbVtFMeT",
                    "createdAt": "2020-11-24T16:37:11.286Z",
                    "updatedAt": "2020-11-24T16:37:11.286Z"
                },
                //Other parse objects ....
               
            ]
        }
    }
}

@davimacedo

@mtrezza
Member

mtrezza commented Dec 3, 2020

var api = new ParseServer({
  allowClientClassCreation:false,
  databaseURI: databaseUri || 'mongodb://localhost:27017/dev',
  cloud: process.env.CLOUD_CODE_MAIN || __dirname + '/cloud/main.js',
  appId: process.env.APP_ID || 'appId',
  masterKey: process.env.MASTER_KEY || 'masterKey', //Add your master key here. Keep it secret!
  maxUploadSize: "10mb",
  databaseOptions: { poolSize: 100 },
  directAccess: true,
  cacheTTL: 0,
  enableAnonymousUsers: false,
  enableSingleSchemaCache: true,
  serverURL: process.env.SERVER_URL || 'http://my_ip:1337/parse',
  appName: 'App Name',
  filesAdapter: s3Adapter,
  emailAdapter: emailAdapter,
  publicServerURL: 'https://example.com/parse'
});

I see that there are still some options in your configuration that are not part of the default Parse Server configuration.

As I mentioned earlier:

To eliminate the possibility that this is caused by any custom configuration, can you please redo the tests with:

  • a fresh, clean, basic installation of Parse Server
  • some simple example data in the database; I suggest creating a class with just a number field

To clarify, by that I mean:

  • remove all options like databaseOptions.poolSize and enableSingleSchemaCache: true (not just setting them to false, but removing the keys); even better, make a fresh clone of the Parse Server repo (see the stripped-down config sketch after this list)
  • use a fresh DB with just one class that contains just 1 object with a number field, gradually add more objects and see if that makes any difference in performance
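
A stripped-down configuration for such a test could look roughly like this (a sketch; only the required options are kept and everything else is left at its defaults):

var api = new ParseServer({
  databaseURI: 'mongodb://localhost:27017/dev',
  cloud: __dirname + '/cloud/main.js',
  appId: 'appId',
  masterKey: 'masterKey',
  serverURL: 'http://localhost:1337/parse',
});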

@uzaysan
Author

uzaysan commented Dec 3, 2020

I see that there are still some options in your configuration, that are not the default Parse Server configuration.

Yes, that's my current setup.

I tried with a fresh install of Parse Server (cloned the parse-server-example repo), but I used my existing database, and the problem still exists. Even on a fresh Parse Server install, performance decreases.

About a fresh MongoDB: I don't see why that could be the problem. When I restart Parse Server it gives full performance again, and when we query the /classes/className endpoint directly, it works fine. I don't think it's related to the database, but I will try it. I will install MongoDB and see what happens.

@uzaysan
Author

uzaysan commented Dec 3, 2020

@mtrezza OK, you were right. I made a fresh MongoDB installation and created a class named "Test". I added one integer field called "int" (it has the other default fields like objectId, createdAt, etc.) and set its value to 12. Now my database has one class and one object. I run a query without any parameters, like this:

Parse.Cloud.define("code", async (request) => {

  const getTest = new Parse.Query("Test");
  return await getTest.find({useMasterKey:true});

});

Performance doesn't decrease. The only difference between this object and my other objects is the pointer fields and one string field. I will add a string field to the object first, and if performance stays stable, I will create another class and add a pointer field to my Test class.

@mtrezza
Member

mtrezza commented Dec 3, 2020

Great investigative process, we are getting somewhere.

I see 4 dimensions you can experiment with:

  • number of objects in class
  • field types of objects
  • Parse Server configuration
  • DB setup (e.g. indices)

@uzaysan
Author

uzaysan commented Dec 3, 2020

@mtrezza OK. I replicated all the fields, creating pointer and string fields, and performance didn't decrease. Then I connected my previous Parse Server to the fresh MongoDB database, and performance decreased. So I think the problem is related to the Parse Server configuration.

@mtrezza
Member

mtrezza commented Dec 4, 2020

It would be interesting now to find out which configuration option specifically is related to the performance decrease. Let us know when you find the culprit.

@uzaysan
Author

uzaysan commented Dec 4, 2020

@mtrezza I will stop investigating for now. I have finals in mid-January (the 18th), so I have to study for my exams. I will also refactor my current Cloud Code and change the Parse queries to HTTP requests. When my exams are finished, I can resume investigating in February.

@mtrezza
Member

mtrezza commented Dec 4, 2020

If you could just do one last test and take the current deployment that works fine but set the full Parse Server config that you had issues with, then we could get some important insight.

I'm afraid that in February things will not look the same and we'll lose track of this, so if you could, that would be great; if not, we'll pick this up in February.

@uzaysan
Author

uzaysan commented Dec 4, 2020

@mtrezza I'm afraid this could take longer than we thought. I said it's related to the Parse config, but I had made another test yesterday (my first test with a fresh Parse Server and the existing database) and performance still decreased. See:

I tried with a fresh install of Parse Server (cloned the parse-server-example repo), but I used my existing database, and the problem still exists. Even on a fresh Parse Server install, performance decreases.

I don't know why I overlooked this; it was midnight and I was sleepy.

So this might or might not be related to the Parse config. Yes, I can do a few more tests, but this looks like it will take longer than we think.

@mtrezza
Member

mtrezza commented Dec 4, 2020

As I understand it now:

- 1. old parse server + old database = performance decrease
- 2. new parse server + old database = performance decrease
+ 3. new parse server + new database = no performance decrease
# 4. old parse server + new database = ?
# 5. new parse server with old configuration + new database = ?

I don't want to ask for too much, but test case 5 is what would be interesting to see.

@uzaysan
Author

uzaysan commented Dec 4, 2020

@mtrezza I will format my server, reinstall both Parse Server and MongoDB, and test all 5 scenarios.

@mtrezza
Member

mtrezza commented Dec 4, 2020

Let's just make sure we don't lose the old DB and old Parse Server, so that we can reproduce the issue.

@uzaysan
Author

uzaysan commented Dec 5, 2020

@mtrezza OK, I ran some tests and found the problem. It's not related to the Parse config; it's related to what I pass into the query constructor.

For example, this decreases performance:

var Test = Parse.Object.extend("Test");
var query = new Parse.Query(Test);

But this doesn't:

var query = new Parse.Query("Test");

I tested this with my old server and a new Parse Server. Even with a fresh Parse Server and a fresh MongoDB, if I pass a ParseObject to the query, performance decreases. If I pass the class name directly, it's fine.

@mtrezza
Member

mtrezza commented Dec 5, 2020

Ah right, well done. I think it's not supported to pass a Parse Object. Interesting that that worked at all. Did you see this anywhere in the docs as an example?

@uzaysan
Author

uzaysan commented Dec 5, 2020

@mtrezza
Member

mtrezza commented Dec 5, 2020

Very interesting and well researched from your side - now that we have circled in on the issue, the next step would be to look into the Parse Server / JS SDK code. It's just a simple property assignment, so it's not obvious at this point why this would cause a performance impact:

this.className = objectClass.className;

I also don't see the object being retained beyond the method call.

It could be an issue with ParseObject or its extend method. Can you run a test again where you just create the extended object but don't feed it into the query? To replicate the same object usage, it may be necessary to write the className to the log.

Parse.Cloud.define("code", async (request) => {

  var testObj = Parse.Object.extend("Test");
  console.log(testObj.className);

  var query = new Parse.Query("Test");
  return await query.find({ useMasterKey:true });
});

If the code above also decreases performance, then the issue may not be related to ParseQuery but to ParseObject and possibly a memory leak.
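
One way to check the memory-leak hypothesis independently of wrk would be to call extend repeatedly and watch the heap (a sketch, not from this thread; heapUsed readings are noisy because of garbage collection, so treat the numbers only as a rough indication):

Parse.Cloud.define("extendOnly", async () => {
  const before = process.memoryUsage().heapUsed;
  for (let i = 0; i < 100000; i++) {
    Parse.Object.extend("Test");
  }
  const after = process.memoryUsage().heapUsed;
  // Approximate heap growth in bytes caused by the extend calls.
  return { heapGrowthBytes: after - before };
});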

@uzaysan
Author

uzaysan commented Dec 5, 2020

@mtrezza

extend method.

I'm suspecting that too.

Edit: Ok. I will do that

@uzaysan
Author

uzaysan commented Dec 5, 2020

@mtrezza
I tested it. It still decreases performance; it doesn't matter whether I pass it to the Parse query.

@mtrezza
Member

mtrezza commented Dec 5, 2020

Glad to hear that, we are getting closer to solving this.

Can you try again without extending the Parse Object, but just creating a new one?

Parse.Cloud.define("code", async (request) => {

  var testObj = new Parse.Object("Test");
  console.log(testObj.className);

  var query = new Parse.Query("Test");
  return await query.find({ useMasterKey:true });
});

Can you maybe also remove the query and check whether you still measure a performance decrease?

Parse.Cloud.define("code", async (request) => {

  var testObj = Parse.Object.extend("Test");
  console.log(testObj.className);
});

Also, you may want to try removing the console.log; I suspect it is not necessary to recreate the issue.

@mtrezza
Member

mtrezza commented Dec 5, 2020

I used wrk library to do benchmark tests. This is the code I used when I do benchmarking:
wrk -t1 -c400 -d30s -H "Content-Type: application/json" -H "X-Parse-Application-Id: YOUR_APP_ID" -s post.lua http://parse_server_ip:1337/parse/functions/codeWithQuery

Can you share your post.lua script?

@uzaysan
Author

uzaysan commented Dec 5, 2020

@mtrezza

Parse.Cloud.define("code", async (request) => {

  var testObj = Parse.Object.extend("Test");
  console.log(testObj.className);
});

This doesn't decrease performance.

Parse.Cloud.define("code", async (request) => {

  var testObj = new Parse.Object("Test");
  console.log(testObj.className);

  var query = new Parse.Query("Test");
  return await query.find({ useMasterKey:true });
});

This also doesn't decrease performance.

Here is my lua script:
wrk.method = "POST"

@mtrezza
Member

mtrezza commented Dec 6, 2020

I have been trying to recreate the issue of decreasing performance using wrk.

The results are:

Cloud Code functions
// Performance decreasing
Parse.Cloud.define("test", async (request) => {
    var testObj = Parse.Object.extend("Temp");
    var query = new Parse.Query(testObj);
    return await query.find({ useMasterKey:true });
});

// Performance stable
Parse.Cloud.define("test1", async (request) => {
    var query = new Parse.Query("Temp");
    return await query.find({ useMasterKey:true });
});

// Performance inconclusive
Parse.Cloud.define("test2", async (request) => {
    var testObj = Parse.Object.extend("Temp");
    var query = new Parse.Query("Temp");
    return await query.find({ useMasterKey:true });
});

// Performance stable
Parse.Cloud.define("test3", async (request) => {
    return Parse.Object.extend("Temp").className;
});

// Performance stable
Parse.Cloud.define("test4", async (request) => {
    return new Parse.Object("Temp").className;
});

// Performance stable
Parse.Cloud.define("test5", async (request) => {
    return "example string";
});

The Cloud Code function test indeed shows decreasing performance when running on a fresh instance:

wrk log
-----------------------------------------------------
  1 threads and 10 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency   164.05ms   60.53ms 713.18ms   92.50%
    Req/Sec    63.53     18.24   101.00     77.62%
  1843 requests in 30.09s, 3.54MB read
Requests/sec:     61.25
Transfer/sec:    120.53KB
-----------------------------------------------------
  1 threads and 10 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency   191.24ms  152.52ms   1.82s    96.86%
    Req/Sec    56.15     22.58   100.00     64.64%
  1521 requests in 30.02s, 2.92MB read
  Socket errors: connect 0, read 0, write 0, timeout 7
Requests/sec:     50.66
Transfer/sec:     99.69KB
-----------------------------------------------------
  1 threads and 10 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency   179.16ms   56.93ms 880.74ms   91.60%
    Req/Sec    56.54     16.22    99.00     69.47%
  1639 requests in 30.04s, 3.15MB read
Requests/sec:     54.56
Transfer/sec:    107.37KB
-----------------------------------------------------
  1 threads and 10 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency   220.54ms  146.75ms   1.78s    94.95%
    Req/Sec    48.26     18.17    90.00     64.89%
  1393 requests in 30.07s, 2.68MB read
  Socket errors: connect 0, read 0, write 0, timeout 4
Requests/sec:     46.33
Transfer/sec:     91.16KB
-----------------------------------------------------
  1 threads and 10 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency   264.34ms  209.52ms   1.97s    93.56%
    Req/Sec    43.93     15.84    89.00     67.29%
  1227 requests in 30.05s, 2.36MB read
  Socket errors: connect 0, read 0, write 0, timeout 2
Requests/sec:     40.83
Transfer/sec:     80.34KB
-----------------------------------------------------
  1 threads and 10 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency   248.77ms   52.89ms 473.62ms   72.47%
    Req/Sec    40.08     11.98    70.00     79.79%
  1188 requests in 30.05s, 2.28MB read
Requests/sec:     39.53
Transfer/sec:     77.78KB
-----------------------------------------------------
  1 threads and 10 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency   310.53ms  170.92ms   1.99s    94.99%
    Req/Sec    32.92     12.34    68.00     61.15%
  950 requests in 30.05s, 1.83MB read
  Socket errors: connect 0, read 0, write 0, timeout 5
Requests/sec:     31.62
Transfer/sec:     62.21KB
-----------------------------------------------------
  1 threads and 10 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency   315.16ms   70.92ms 846.97ms   77.49%
    Req/Sec    31.61     11.44    70.00     62.89%
  938 requests in 30.04s, 1.80MB read
Requests/sec:     31.22
Transfer/sec:     61.44KB
-----------------------------------------------------
  1 threads and 10 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency   353.37ms  141.22ms   1.80s    88.39%
    Req/Sec    28.22     11.09    59.00     71.64%
  810 requests in 30.05s, 1.56MB read
  Socket errors: connect 0, read 0, write 0, timeout 7
Requests/sec:     26.96
Transfer/sec:     53.05KB
-----------------------------------------------------
  1 threads and 10 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency   389.79ms  129.77ms   1.65s    88.36%
    Req/Sec    25.68      9.72    60.00     70.22%
  747 requests in 30.08s, 1.44MB read
  Socket errors: connect 0, read 0, write 0, timeout 2
Requests/sec:     24.83
Transfer/sec:     48.86KB
-----------------------------------------------------

However, after a while the performance seems to stabilize in the lower range:

wrk log
-----------------------------------------------------
  1 threads and 10 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency     1.50s   233.25ms   1.98s    66.67%
    Req/Sec     7.10      4.55    29.00     92.41%
  191 requests in 30.11s, 375.84KB read
  Socket errors: connect 0, read 0, write 0, timeout 5
Requests/sec:      6.34
Transfer/sec:     12.48KB
-----------------------------------------------------
  1 threads and 10 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency     1.48s   238.73ms   1.95s    72.82%
    Req/Sec     6.88      3.82    20.00     89.02%
  195 requests in 30.01s, 383.72KB read
Requests/sec:      6.50
Transfer/sec:     12.79KB
-----------------------------------------------------
  1 threads and 10 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency     1.42s   177.94ms   1.85s    85.85%
    Req/Sec     7.26      3.88    20.00     91.53%
  205 requests in 30.02s, 403.39KB read
Requests/sec:      6.83
Transfer/sec:     13.44KB
-----------------------------------------------------
  1 threads and 10 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency     1.51s   255.28ms   1.97s    69.63%
    Req/Sec     6.62      3.76    20.00     94.84%
  191 requests in 30.02s, 375.84KB read
Requests/sec:      6.36
Transfer/sec:     12.52KB
-----------------------------------------------------
  1 threads and 10 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency     1.58s   244.29ms   1.99s    70.06%
    Req/Sec     6.23      3.40    20.00     90.00%
  181 requests in 30.09s, 356.17KB read
  Socket errors: connect 0, read 0, write 0, timeout 4
Requests/sec:      6.02
Transfer/sec:     11.84KB
-----------------------------------------------------
  1 threads and 10 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency     1.44s   191.72ms   1.95s    81.09%
    Req/Sec     7.30      4.42    20.00     91.28%
  201 requests in 30.05s, 395.52KB read
Requests/sec:      6.69
Transfer/sec:     13.16KB
-----------------------------------------------------
  1 threads and 10 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency     1.61s   244.69ms   1.92s    68.18%
    Req/Sec     5.85      3.22    20.00     91.56%
  173 requests in 30.04s, 340.42KB read
  Socket errors: connect 0, read 0, write 0, timeout 19
Requests/sec:      5.76
Transfer/sec:     11.33KB
-----------------------------------------------------
  1 threads and 10 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency     1.41s   195.47ms   1.88s    76.70%
    Req/Sec     7.33      4.07    20.00     90.61%
  206 requests in 30.08s, 405.36KB read
Requests/sec:      6.85
Transfer/sec:     13.48KB
-----------------------------------------------------
  1 threads and 10 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency     1.48s   216.69ms   1.92s    73.77%
    Req/Sec     6.92      4.32    20.00     91.93%
  191 requests in 30.08s, 375.84KB read
  Socket errors: connect 0, read 0, write 0, timeout 8
Requests/sec:      6.35
Transfer/sec:     12.49KB
-----------------------------------------------------
  1 threads and 10 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency     1.39s   241.53ms   1.90s    78.26%
    Req/Sec     7.51      4.27    20.00     87.57%
  207 requests in 30.08s, 407.33KB read
Requests/sec:      6.88
Transfer/sec:     13.54KB
-----------------------------------------------------

@uzaysan
Author

uzaysan commented Dec 6, 2020

@mtrezza Did you look at the extend method? Anything abnormal in that code?

And yeah, I can confirm it stabilizes at 5-6 req/sec. I ran a test a month ago that was set to one hour, and it gave me 5 req/sec on a 4-core machine.

And today I refactored my Cloud Code and changed the Parse objects to className strings. Performance was stable, so changing a ParseObject to a class-name string definitely solves the issue.

Edit: I think the easiest workaround for now would be to disallow passing a Parse object to the query constructor: if the user passes a ParseObject to the query constructor, throw an error. This requires developers to refactor their code, but I did it in 10 minutes across 8k lines of code, thanks to Ctrl+F.
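
A guard like the one described above could be sketched in application code as a small subclass (this is not an SDK feature, just an illustration):

class StrictQuery extends Parse.Query {
  constructor(objectClass) {
    if (typeof objectClass !== 'string') {
      // Fail fast so the slow code path discussed in this thread is never hit.
      throw new Error('Pass the class name string to the query, not a Parse.Object subclass.');
    }
    super(objectClass);
  }
}

// Usage: const query = new StrictQuery("Follow");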

@mtrezza
Member

mtrezza commented Dec 6, 2020

I did not reach a conclusive result on whether test2 is stable or decreasing in performance. I would have to look into this more.

The decreasing performance and eventual stabilization could have to do with the NodeJS heap allocation or garbage collection algorithm. Maybe a NodeJS expert could give more insight here.
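
If someone wants to follow up on the garbage-collection theory, a simple first step (a sketch, not part of the tests above) is to log Node's heap statistics from inside the function under benchmark and watch whether heapUsed keeps climbing between wrk runs:

Parse.Cloud.define("testWithHeapStats", async () => {
  const testObj = Parse.Object.extend("Temp");
  const query = new Parse.Query(testObj);
  const results = await query.find({ useMasterKey: true });
  const { heapUsed, heapTotal } = process.memoryUsage();
  // Log heap usage in MB so growth between benchmark runs is easy to spot.
  console.log(`heapUsed=${(heapUsed / 1048576).toFixed(1)}MB heapTotal=${(heapTotal / 1048576).toFixed(1)}MB`);
  return results;
});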

@Moumouls
Member

Moumouls commented Dec 9, 2020

Here is the JS SDK code:

  /**
   * @param {(String|Parse.Object)} objectClass An instance of a subclass of Parse.Object, or a Parse className string.
   */
  constructor(objectClass: string | ParseObject) {
    if (typeof objectClass === 'string') {
      if (objectClass === 'User' && CoreManager.get('PERFORM_USER_REWRITE')) {
        this.className = '_User';
      } else {
       // No reference 
        this.className = objectClass;
      }
    } else if (objectClass instanceof ParseObject) {
     // May lead to a JS reference issue
      this.className = objectClass.className;
    } else if (typeof objectClass === 'function') {
      if (typeof objectClass.className === 'string') {
        this.className = objectClass.className;
      } else {
        var obj = new objectClass();
        this.className = obj.className;
      }
    } else {
      throw new TypeError(
        'A ParseQuery must be constructed with a ParseObject or class name.'
      );
    }

It would be interesting to try breaking the JS object reference here.
Maybe this.className = `${objectClass.className}` could improve things.

@mtrezza
Member

mtrezza commented Dec 11, 2020

Maybe this.className = `${objectClass.className}` could improve some things

I would be surprised if string interpolation would make any difference because className is a primitive string.

However, I think we need some more analysis to be sure that there is actually a problem that needs to be fixed and how we would measure any improvements we attempt to make. I'll do some more tests.
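
To illustrate the point about primitives: copying a string property does not keep the source object alive, so interpolation should be equivalent (a toy example, not SDK code):

let source = { className: 'Test' };
const className = source.className; // copies the primitive string value
source = null;                      // the source object can now be collected
console.log(className);             // still "Test", with no reference back to source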

@Moumouls
Member

@mtrezza Me too, haha. I think it could be interesting to run Node with --inspect-brk and check the JS Profiler.
Here is a guide: https://medium.com/@paul_irish/debugging-node-js-nightlies-with-chrome-devtools-7c4a1b95ae27
Maybe the Profiler can show us what's wrong 😄

@mtrezza
Member

mtrezza commented Dec 11, 2020

I did the tests above while profiling but couldn't identify any causes; I just saw execution times rising, and that across all JS ops, hence I assumed it may just be a natural V8 characteristic. But feel free to give it a go.

@uzaysan
Author

uzaysan commented Dec 11, 2020

Hey guys, I need to mention that performance only decreases when the query returns Parse objects. When you test with a count query, performance is stable. Maybe you should try the count method; so this is not only related to the extend method (I guess).

See this: https://community.parseplatform.org/t/parse-server-performance-loss-after-time/1070/10?u=uzaysan

Can you please test this code:

Parse.Cloud.define("test", async (request) => {
    var testObj = Parse.Object.extend("Temp");
    var query = new Parse.Query(testObj);
    return await query.count({ useMasterKey:true });
});

This code was stable for me

@mtrezza
Member

mtrezza commented Dec 12, 2020

Maybe related: #7065 describes a negative performance impact on queries with pointers.

@mtrezza
Member

mtrezza commented Dec 17, 2020

Maybe related: #7080 describes slow ops on the WiredTiger engine.

@mtrezza
Member

mtrezza commented Feb 12, 2021

Maybe related: #6636 (comment) describes a possible memory leak with directAccess: true.

@mtrezza
Member

mtrezza commented Feb 12, 2021

Maybe related: #6405 describes slow queries with certain Parse Server / Node versions.

@dplewis
Member

dplewis commented Mar 16, 2021

Closing via #7214

@dplewis dplewis closed this as completed Mar 16, 2021
@dplewis
Member

dplewis commented Mar 16, 2021

@uzaysan It would be great if you could try the new master branch and confirm whether #7214 solves the performance loss.

@uzaysan
Author

uzaysan commented Mar 21, 2021

@dplewis I will try it when I have free time.

@xeoshow

xeoshow commented Oct 7, 2021

I'd just like to know whether this issue has been resolved. Thanks.
