FR: Inbuilt testing utility for Dashboard / CLI #6996
Comments
@dblythy Unfortunately, I think it's not the purpose of Parse to be a test framework; we can redirect developers to existing tools instead. BUT running Parse Server in integration testing (a live server with an isolated database) requires some advanced knowledge of both Parse Server and the test framework. Maybe a special function could help:

```js
// Jest example
beforeAll(async () => {
  // Start a Parse Server and an in-memory MongoDB, and return an
  // initialized Parse (node) instance
  const { Parse, serverUrl, masterKey, mongoClient } = await ParseTestServer.start(someOptions);
});
```

@davimacedo @mtrezza @dblythy What do you think about this?
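A rough sketch of what such a `ParseTestServer.start` helper could wrap, assuming the third-party packages `express`, `parse-server`, `parse`, and `mongodb-memory-server` (option names follow parse-server v5 style; every name here is illustrative — treat this as an outline under those assumptions, not a tested implementation):

```js
const express = require('express');
const { MongoMemoryServer } = require('mongodb-memory-server');
const { ParseServer } = require('parse-server');
const Parse = require('parse/node');

// Hypothetical helper: spin up an in-memory MongoDB plus a throwaway
// Parse Server, and hand back an initialized Parse SDK instance.
async function start({ appId = 'test', masterKey = 'test', port = 1337, cloud } = {}) {
  const mongod = await MongoMemoryServer.create();
  const serverUrl = `http://localhost:${port}/parse`;

  const app = express();
  app.use('/parse', new ParseServer({
    databaseURI: mongod.getUri(),
    appId,
    masterKey,
    serverURL: serverUrl,
    cloud, // path to the cloud code under test
  }));
  const httpServer = app.listen(port);

  Parse.initialize(appId, null, masterKey);
  Parse.serverURL = serverUrl;

  return {
    Parse,
    serverUrl,
    masterKey,
    // Tear everything down again in afterAll().
    stop: async () => {
      httpServer.close();
      await mongod.stop();
    },
  };
}

module.exports = { start };
```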
Thanks for suggesting. I am excited about the idea of making Cloud Code testing more accessible and promoting it as a best practice to developers who may not be experienced enough to set up unit testing on their own. It's not clear to me yet what the scope of such testing could be or how it would look conceptually, so I think we'd have to dig deeper to understand the scope, complexity, and viability of such a PR. Let's break down the idea:

Correct me if I misread this, but it almost sounds like "monitoring" a production environment. Is the essence of the FR continuous monitoring of a production environment, or build-triggered testing of a development environment before it is deployed to production?

I think @Moumouls has a point; that is why I am even more "all ears" for this FR and a solution that could shift some of that complexity away from developers.
I think we can work on this. PS: Adding a testing UI via the dashboard is not a good deal, since some developers do not use and do not need the dashboard. Also, showing test feedback on the dashboard is technically too late, since by then your Parse Server may already contain breaking changes. The traditional CI/CD process runs tests before deployment, not after.
Yes, @mtrezza, you've hit the nail on the head here as to what I was thinking about.

That would essentially be the idea. The suggestion would allow for inbuilt Parse.Cloud testing, with an in-memory MongoDB and automatic cleanup. As @Moumouls has stated, this can already be done with other frameworks; however, as the FR states, the intention is to integrate testing simply into Parse.Cloud for an easier experience for devs learning testing. Again, if this is "reinventing the wheel", maybe it's an issue for the docs repo. We could even start by having an example test file in Parse Server Example and add some commentary around testing there.
Maybe we can start with your example and go from there. Let's say we have a cloud code function that increments the number of stars a user has. Then we would add a test for it. As a basis for discussion.
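The function and test sketched above might look something like this. This is a hypothetical, stub-based sketch so the increment logic can run without a Parse Server; `incrementStars` and `makeStubUser` are illustrative names, not existing APIs, and in real cloud code the logic would live inside `Parse.Cloud.define` and persist via `user.save()`:

```javascript
// Hypothetical sketch of the increment logic behind such a cloud function.
// A minimal stub stands in for Parse.User so the core logic can be
// exercised without a server or database.
function incrementStars(user, starsToAdd) {
  const current = user.get('stars') || 0;
  user.set('stars', current + starsToAdd);
  return { newStars: user.get('stars') };
}

// Minimal stand-in for a Parse.Object-like get/set interface.
function makeStubUser(initial = {}) {
  const data = { ...initial };
  return {
    get: (key) => data[key],
    set: (key, value) => { data[key] = value; },
  };
}

const user = makeStubUser({ stars: 1 });
console.log(incrementStars(user, 1).newStars); // → 2
```

A real test would then replace the stub with a user created against a test database (or a mock like parse-mockdb) and assert on the returned `newStars`.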
Also related: #2488
I'd rather work on a "create-parse-app" CLI (inspired by create-react-app, create-next-app, etc.) that would create all the boilerplate needed for a Parse app, including a setup for the cloud code tests.
I definitely feel that adding this in-built testing into Parse is heading in the wrong direction, i.e. undifferentiated heavy lifting. There are definitely excellent frameworks out there that can do all of this work for us, and I'd argue that it's better to introduce devs to those tools first anyway, as they are used everywhere. However, we could definitely improve the documentation. I spent quite some time trying to figure out the right thing to do in my own project because there was no clear guidance here. I ended up going down the route of a refactor + Jest + hustle/parse-mockdb. Apart from a few small issues, it was quite a pleasant experience, and all of my tests run super quickly. (Note that it was mentioned a couple of times in #2488 way back in 2016.)

For example (untested code!)...

`cloud/main.js`

```js
const { _addStars } = require('./stars_helper');

// Extracted method to simplify testing.
const addStars = async (req) => {
  const user = req.user;
  const starsToAdd = req.params.starsToAdd;
  const stars = await _addStars(user, starsToAdd);
  return { newStars: stars };
};

Parse.Cloud.define('addStars', addStars);

// Export so the test file can require the handler directly.
module.exports = { addStars };
```

`cloud/main.test.js` (will be picked up by the Jest test runner)

```js
/* global Parse */
global.Parse = require('parse/node');
global.ParseMockDB = require('parse-mockdb');

// Set up a mocked in-memory Parse server before each test().
beforeEach(() => ParseMockDB.mockDB(Parse));
afterEach(() => ParseMockDB.cleanUp());

const { addStars } = require('./main');

describe('adding a star', () => {
  test('increments star count', async () => {
    // Arrange
    const user = await Parse.User.signUp(/* ... */);
    const params = { starsToAdd: 1 };
    const req = { user, params };
    // Act
    const res = await addStars(req);
    // Assert
    expect(res.newStars).toBe(2);
  });
});
```

Excuse my use of old-school style. I am aware that there are concerns that all mocks end up growing into a (broken) reimplementation of the thing they're mocking, and that's definitely valid. But you also have to balance that downside against the fast feedback and confidence they give. I'm definitely going to add a layer of integration tests to my testing pyramid that use Docker Compose and a real Parse Server instance to check that the main workflows are OK.

I'd be happy to suggest some improvements to the server example code via PR, even if only highlighting one possible approach. I had some other things I was tempted to add in there anyway (although I now run the official Docker image instead of rolling my own server nowadays, TBH). Could I add a cheeky suggestion that we push to get
I think this could perhaps be closed by #384 in the Parse Server example repo, which adds jasmine and attaches mockDB to the Parse Server instance. This removes the hassle of installing and setting up jasmine, mongodb-runner, etc. Testing cloud function responses can be done as:

```js
it("register", async () => {
  const user = new Parse.User();
  user.set("username", "user");
  user.set("businessName", "Test Business");
  user.set("password", "password");
  user.set("businessPhone", "phone");
  await user.signUp();
  const current = user;
  expect(current).toBeDefined();
  expect(current.get("username")).toBe("user");
  expect(current.get("password")).toBeUndefined();

  const business = current.get("business");
  expect(business).toBeDefined();
  expect(business.get("name")).toBe("Test Business");
  expect(business.get("profile")).toBe("testbusiness");
  expect(business.get("phone")).toBe("phone");
  expect(business.get("admin").id).toBe(current.id);

  // check security
  const userAcl = current.getACL().toJSON();
  expect(userAcl["*"]).toBeUndefined();
  const businessAcl = current.get("business").getACL().toJSON();
  expect(businessAcl["*"]).toBeDefined();
  expect(businessAcl["*"]["read"]).toBe(true);
  expect(businessAcl["*"]["write"]).not.toBe(true);
});
```

I think this satisfies having an "inbuilt" or "boilerplate" utility that developers can easily use with Parse Server to test that their cloud functions are working as expected. What are your thoughts?
Is your feature request related to a problem? Please describe.
This could potentially be due to my inexperience with JS testing (I've only been exposed to JS testing since contributing here), but I think it could be cool if we had an inbuilt cloud code testing utility that could potentially add a testing suite to the dashboard (e.g. /tests), with a bunch of tests that have been written.
Describe the solution you'd like
Not sure the best way to do this, but here's what I've considered:
Specifying a test function just like a cloud function, and then creating new cloud functions, such as:
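A hypothetical sketch of such a test definition. `Parse.Cloud.defineTest` is an imagined API illustrating this FR, mirroring `Parse.Cloud.define`; it does not exist in Parse Server, and the expected star count assumes a fresh user:

```js
// Hypothetical API: register a test alongside the cloud functions it covers.
Parse.Cloud.defineTest('addStars increments star count', async () => {
  // Run the cloud function under test against an isolated test database.
  const result = await Parse.Cloud.run('addStars', { starsToAdd: 1 });
  if (result.newStars !== 1) {
    throw new Error(`expected 1 star, got ${result.newStars}`);
  }
});
```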
These "test" triggers would then get loaded into the dashboard and run on request, or be specified as an option when starting the server (either a CLI --test flag or runTests: true). The results of these tests could potentially be shown in the dashboard (e.g. as green, or red with issues).

Describe alternatives you've considered
We can already use #4310 to unit test.
Additional context
I would personally hope to build some tests that run on server creation and every week, so I can monitor whether anything breaks from an update or external change. This might be reinventing the wheel; I'm just thinking about how to make it more accessible and easier for our community.