```
Alternatively, you may update `config/local.js` and replace `dockerhost` with your docker IP address.<br>
You may try using the command `docker-machine ip` to get your docker IP, but it does not work on all systems.
Also, be sure to update `busApiUrl` if you are running `tc-bus-api` locally. (See below)
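If `docker-machine ip` does not work for you, the right value depends on how Docker is installed; the commands below are generic Docker tooling rather than steps from this guide, so treat them as a hint only.

```bash
# On native Linux or Docker Desktop, containers that publish ports are usually
# reachable from the host on 127.0.0.1, so plain "localhost" often works.
# To print the gateway address of Docker's default bridge network instead:
docker network inspect bridge --format '{{ (index .IPAM.Config 0).Gateway }}'
```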
Explanation of configs:
- `config/mock.local.js` - use local `mock-services` from docker to mock the Identity and Member services instead of using the ones deployed in the Topcoder dev environment.
The project service will be served on `http://localhost:8001`.
### Import sample metadata & projects
```bash
CONNECT_USER_TOKEN=<connect user token> npm run demo-data
```
This command will create sample metadata entries in the DB (duplicating what is currently in the development environment).
To retrieve data from the DEV environment we have to provide a valid user token (`CONNECT_USER_TOKEN`). You may log in to http://connect.topcoder-dev.com and find the Bearer token in the request headers using your browser's dev tools.
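For example (the token value below is a placeholder; passing the raw JWT without the `Bearer ` prefix is an assumption about how the script reads it):

```bash
# Copy the value of the "Authorization: Bearer <token>" request header from
# the browser dev tools and export only the JWT part.
export CONNECT_USER_TOKEN="eyJhbGciOiJIUzI1NiJ9...."   # placeholder, not a real token
npm run demo-data
```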
This import command uses the API to create demo data, which has a few peculiarities:
- data in the DB will always be created
- data in the ElasticSearch index (ES) will only be created if the [project-processor-es](https://github.com/topcoder-platform/project-processor-es) and [tc-bus-api](https://github.com/topcoder-platform/tc-bus-api) services are also started locally. If you don't start them, the imported data won't be indexed in ES and will only be added to the DB. You may start them locally one by one, or better, use `local/full/docker-compose.yml` as described in the [next section](#local-deployment-with-other-topcoder-services), which will start them automatically.
**NOTE:** During data import a lot of records have to be indexed in ES, so you have to wait about 5-10 minutes after `npm run demo-data` finishes until the imported data is indexed in ES. You may watch the logs of `project-processor-es` to see whether it is done.
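One rough way to sanity-check the indexing progress (the ES port `9200` and the index name `projects` are assumptions about the local setup, not values taken from this guide):

```bash
# Count the documents currently present in the (assumed) "projects" index
curl http://localhost:9200/projects/_count
```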
### Local Deployment with other Topcoder Services
* There exists an alternate `docker-compose.yml` file that can be used to spawn containers for the following services:
* To have Kafka create a list of desired topics on startup, there is a file at `local/full/kafka-client/topics.txt`. Each line of the file will be added as a topic (see the example below).
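* As an illustration only (the topic name shown in the comment is hypothetical; the authoritative list is the file itself):

```bash
# Each non-empty line of the file becomes a Kafka topic on startup, e.g. a line
# such as "project.action.create" (hypothetical example name).
cat local/full/kafka-client/topics.txt
```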
* To run these services, simply run the following commands:
```bash
export AUTH0_CLIENT_ID=<insert required value here>
export AUTH0_CLIENT_SECRET=<insert required value here>
export AUTH0_URL=<insert required value here>
export AUTH0_AUDIENCE=<insert required value here>
export AUTH0_PROXY_SERVER_URL=<insert required value here>
cd local/full
docker-compose up -d
```
* The environment variables specified in the commands above will be passed on to the containers that have been configured to read them.
* The above command will start all containers in the background.
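* To verify that the containers came up, you can use standard Docker Compose tooling (`docker-compose ps` is a common check rather than an explicit step in this guide):

```bash
cd local/full
docker-compose ps
```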
* To view the logs of any of the services, use the following command, replacing "SERVICE_NAME" with the corresponding value under the "Name" column in the above table:
```bash
cd local/full
docker-compose logs -f SERVICE_NAME
```
* The containers have been configured such that all Topcoder services will wait until all the topics listed in `local/full/kafka-client/topics.txt` have been created. To monitor the progress of topic creation, you can view the logs of the `kafka-client` service, which will exit when all topics have been created.
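* For example, to follow the topic-creation progress using the `kafka-client` service name mentioned above:

```bash
cd local/full
# kafka-client exits once every topic listed in topics.txt has been created
docker-compose logs -f kafka-client
```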
**WARNING**<br>
After all the containers have started, make sure that the `project-processor-es` service started successfully; sometimes it fails because Kafka was not yet fully up at that moment. Run `docker-compose logs -f project-processor-es` to check its logs: you should see 3 lines containing the text `Subscribed to project.action.`.
If you don't see such lines, restart ONLY the `project-processor-es` service by running `docker-compose restart project-processor-es`.
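For example, combining the commands from the warning above (the `grep` filter is just a convenience and is not part of the original instructions):

```bash
cd local/full
# look for the three "Subscribed to project.action." lines
docker-compose logs project-processor-es | grep "Subscribed to project.action."
# if they are missing, restart only this service
docker-compose restart project-processor-es
```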
### Run Connect App with Project Service locally
To be able to run [Connect App](https://github.com/appirio-tech/connect-app) with the locally deployed Project Service:

1. Set the following in the Connect App config:
```js
PROJECTS_API_URL:'http://localhost:8001'
TC_NOTIFICATION_URL:'http://localhost:4000/v5/notifications' // if tc-notification-api has been deployed locally
```
2. Bypass token validation in Project Service.
In `tc-project-service/node_modules/tc-core-library-js/lib/auth/verifier.js` add this to line 23:
#### Deploying without docker
If you don't want to use docker to deploy to localhost, you can simply run `npm run start:dev` from the root of the project. This should start the server on the default port `8001`.
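For example (running `npm install` first is an assumption about a fresh checkout rather than an explicit step here):

```bash
# from the repository root
npm install
npm run start:dev
# the service should now be listening on http://localhost:8001
```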
### Kafka Commands
If you've used `docker-compose` with the file `local/full/docker-compose.yml` to spawn Kafka & ZooKeeper, you can use the following commands to manipulate Kafka topics and messages:
(Replace TOPIC_NAME with the name of the desired topic)
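The concrete commands are not reproduced here; as a rough sketch using the standard Kafka CLI tools bundled in most Kafka images (the `kafka` service name, the `zookeeper:2181` address, and the `/opt/kafka/bin` path are assumptions about `local/full/docker-compose.yml`, so adjust them to your setup):

```bash
cd local/full

# list existing topics
docker-compose exec kafka /opt/kafka/bin/kafka-topics.sh --zookeeper zookeeper:2181 --list

# create a topic
docker-compose exec kafka /opt/kafka/bin/kafka-topics.sh --zookeeper zookeeper:2181 \
  --create --topic TOPIC_NAME --partitions 1 --replication-factor 1

# produce messages to a topic (type a message and press Enter; Ctrl+C to exit)
docker-compose exec kafka /opt/kafka/bin/kafka-console-producer.sh \
  --broker-list localhost:9092 --topic TOPIC_NAME

# consume messages from a topic, starting from the beginning
docker-compose exec kafka /opt/kafka/bin/kafka-console-consumer.sh \
  --bootstrap-server localhost:9092 --topic TOPIC_NAME --from-beginning
```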