- Update some outdated documentation
- Add README to sphinx documentation
- Update readthedocs config to generate documentation from README in pre_build

The CORGI Job Dashboard, or "CORGI dashboard", consists of a frontend microservice and a backend microservice. The CORGI dashboard acts mainly as a "queue" of jobs to be processed by the Enki Concourse pipeline.
1. Backend - written using Python and the [FastAPI ASGI Framework](https://fastapi.tiangolo.com/). The backend API is used by the front-end and Enki pipelines to create, retrieve, or update job information (a rough example request is sketched after this list).
2. Frontend - written using [Svelte](https://svelte.dev/) and acts as the main dashboard interface of the CORGI system. You can see the list of jobs, create jobs, or abort a job that's in progress, and view information pertaining to errors and status.
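
As a rough illustration of how a client such as the frontend or an Enki pipeline might talk to the backend API, here is a hedged sketch using Python's `requests`. The base URL, endpoint paths, and payload fields are hypothetical placeholders, not taken from the actual API.

```python
# Hypothetical sketch of creating and then polling a job through the backend API.
# The base URL, endpoint paths, and payload/field names are placeholders.
import requests

BASE_URL = "http://localhost/api"  # placeholder; point this at your backend

with requests.Session() as session:
    # Queue a new job (fields are illustrative).
    created = session.post(
        f"{BASE_URL}/jobs",
        json={"repository": "osbooks-example", "job_type": "pdf"},
    )
    created.raise_for_status()
    job_id = created.json()["id"]  # assumes the API returns the new job's id

    # Retrieve the job to check its status.
    job = session.get(f"{BASE_URL}/jobs/{job_id}")
    job.raise_for_status()
    print(job.json().get("status"))
```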
### [Enki](https://github.com/openstax/enki)
The full explanation of Enki is out of scope for this documentation; to learn more, see the [Enki repository](https://github.com/openstax/enki).

## Local development
**General Workflow**
1. Start stack with or without GitHub OAuth (see below)
### Functions Patched in Leashed Mode
Leashed mode attempts to patch any `app.github.api` functions starting with `get_`. It looks for the associated mock functions in `backend/app/tests/unit/init_test_data.py`.
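
As a rough mental model (not the actual implementation), the patching could look something like the sketch below. Only the `app.github.api` and `init_test_data` module paths come from the description above; everything else is illustrative.

```python
# Illustrative sketch of leashed-mode patching: replace every get_* function in
# app.github.api with a same-named mock from init_test_data. Not the real code.
import inspect
from unittest.mock import patch

from app.github import api as github_api  # module named above
from app.tests.unit import init_test_data  # module named above


def patch_github_api():
    patchers = []
    for name, _func in inspect.getmembers(github_api, inspect.isfunction):
        if not name.startswith("get_"):
            continue
        mock_func = getattr(init_test_data, name, None)  # mock looked up by name
        if mock_func is None:
            continue  # no mock available; leave the real function in place
        patcher = patch.object(github_api, name, mock_func)
        patcher.start()
        patchers.append(patcher)
    return patchers  # call .stop() on each patcher to undo the patching
```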
### Hot Reloading
Using the development stack, the Svelte frontend is rebuilt inside the container as you make changes: no restarts required. The page should reload automatically as well. The same is true for the backend: as you make modifications, the backend server should reload with your latest changes.
### Run backend unit tests
To run unit tests:
```
cd backend/app
poetry run pytest tests/unit --init-test-data --github-token "<token>"
```
**NOTE**: `host.docker.internal` refers to the host machine's `localhost` instead of the container's (see [documentation](https://docs.docker.com/desktop/networking/#use-cases-and-workarounds-for-all-platforms) for more info). The UI tests will still fail if CORGI is running in leashed mode because of how the GitHub API calls are patched. If you want the UI tests to work locally, you will need to configure the OAuth credentials accordingly and run CORGI in dev mode.
### How to develop UI tests
Playwright UI tests are stored in `backend/app/tests/ui`.
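
For orientation, a minimal test might look like the sketch below. It relies on the standard `pytest-playwright` `page` fixture; the URL and the assertion are assumptions, not excerpts from the real suite.

```python
# Hypothetical test in the spirit of backend/app/tests/ui; the URL and the
# title check are placeholders, not taken from the actual tests.
from playwright.sync_api import Page


def test_dashboard_loads(page: Page):
    page.goto("http://localhost:3000")  # assumed local frontend address
    assert "CORGI" in page.title()  # placeholder check that the dashboard rendered
```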
### Clear the database
Do the migration:

```
docker-compose exec alembic upgrade head
```
### Load testing for the backend (Under Construction)
*UNDER CONSTRUCTION*

Load testing with Locust.io is in the directory `./backend/app/tests/performance/`.

Please look at the [README](./backend/app/tests/performance/README.md) in this directory on how to run load tests locally and for production systems.
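
As a hedged illustration of what a Locust user class looks like, here is a minimal sketch; the endpoint path is a placeholder, and the real locustfiles live in the performance directory mentioned above.

```python
# Illustrative locustfile; the endpoint path is a placeholder, not the real API.
from locust import HttpUser, between, task


class CorgiUser(HttpUser):
    wait_time = between(1, 5)  # seconds of simulated think time between tasks

    @task
    def list_jobs(self):
        # Hypothetical read-only endpoint to exercise under load.
        self.client.get("/api/jobs")
```

Such a file would be run with something like `locust -f <locustfile> --host <corgi-url>`; defer to the performance README for the actual invocation.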
## Sphinx Documentation
### View the Docs

For our documentation we use [Sphinx-docs](https://www.sphinx-doc.org/en/master/), which lives in the [./docs](./docs) directory.

If you are currently running the entire stack, you should be able to see the documentation by visiting [http://localhost:8000](http://localhost:8000).

The documentation is configured to watch for changes and rebuild automatically. This allows developers to preview their documentation changes as they make them.

If you would like to run the documentation without the entire stack running, you can do so by running:

```
docker-compose up docs
```
### Editing The Docs
Edits are done in reStructuredText (rst).

While editing, you can check the logs by running

```
$ ./corgi docs logs
```

or

```
$ ./corgi docs logs -f
```

If edits have been made to the Navigation and are not reflected, re-build the Docker image:

```
$ ./corgi build <stack-name>
$ ./corgi start [stack-name]
```
### Auto-generated docs
Additional documentation is automatically pulled from this README and added to the sphinx docs when the docs service is running and when corgi.readthedocs.io is deployed. Each level 2 header (##) becomes a separate page in the sphinx docs. Links, like [README.md](./README.md), are resolved to github links relative to the markdown file's parent directory.

This documentation is stored in [docs/auto](./docs/auto) and this directory should not be modified directly. It is saved as markdown and converted with the m2r2 sphinx extension (see [conf.py](./docs/conf.py) for more information).

When the files are generated, they are numbered so that the glob result will match the order in which the headers occur in the README.

The `docs` docker image uses [watchdog](https://pypi.org/project/watchdog/) to automatically update the auto-generated documentation as you update [README.md](./README.md).

There is a pre_build step in [.readthedocs.yaml](./.readthedocs.yaml) that generates the documentation during deployment.
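
Conceptually, the generation step does something like the sketch below; the real script and its naming scheme may differ.

```python
# Illustrative sketch only: split README.md on level-2 headers and write
# numbered markdown files so a glob picks them up in README order.
# The real generation script in this repository may work differently.
import re
from pathlib import Path

readme = Path("README.md").read_text(encoding="utf-8")
sections = re.split(r"(?m)^## ", readme)[1:]  # drop everything before the first "## "

out_dir = Path("docs/auto")
out_dir.mkdir(parents=True, exist_ok=True)

for index, section in enumerate(sections):
    title, _, body = section.partition("\n")
    slug = re.sub(r"[^a-z0-9]+", "-", title.lower()).strip("-")
    # The zero-padded prefix keeps glob order identical to header order in the README.
    (out_dir / f"{index:02d}-{slug}.md").write_text(f"## {title}\n{body}", encoding="utf-8")
```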
## Releasing
The documentation is located in the [Releasing CORGI article](https://openstax.atlassian.net/wiki/spaces/CE/pages/1256521739/Releasing+CORGI) in our Confluence documentation.
## Deploying Web Hosting Pipeline
## Testing Unmerged CORGI & Enki Changes in Concourse
[CORGI Hotdog](https://corgi-hotdog.ce.openstax.org/) is a testing environment for experimenting with changes before they go to staging.
- On Concourse (corgi-hotdog), wait for hotdog-head to pick up the newly checked-out version; this triggers a Concourse build pipeline that is set up in build-deploy-Enki
- See the API call `corgi-hotdog.openstax.org/hotdog/head` if you would like more details (a small example request is sketched after this list)
- Since Concourse depends on production Enki tags on Docker Hub, there isn't a way to combine Concourse with a dev Enki tag except through hotdog
- Hotdog tag on Docker Hub: corgi-hotdog is rebuilt from whatever dev ref you give it, then it rebuilds the Docker image for Enki.
- A rebuild is triggered by submitting a new ref.
- Wait for the hotdog tag to build and be pushed to Docker Hub. At this point you can create a job, but you still have to wait for Concourse (steps: corgi-git-pdf & corgi-resource) to fetch the tag; then the job will eventually run.
- Some additional details:
  - If you check out a branch, you will need to check it out again to pull the latest changes from that branch
  - Jobs run on Concourse. See logging & progress details on Concourse/CORGI.
  - In the hotdog CORGI UI, worker-version tells you which Enki ref is running and a timestamp
  - Only one corgi-hotdog tag exists at a time
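
For the `hotdog/head` call mentioned above, a quick check from Python could look like this (the scheme and response format are assumptions; inspect the real response yourself):

```python
# Quick, illustrative check of what ref the hotdog environment has checked out.
import requests

response = requests.get("https://corgi-hotdog.openstax.org/hotdog/head", timeout=30)  # HTTPS assumed
response.raise_for_status()
print(response.text)  # response shape is not documented here
```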
### Hotdog TODO
1. Automatically pull changes from branches
1. Automatically redeploy the stack if python dependencies change (maybe add a field to corgi refs that, when set, will trigger deploy on checkout?)
1. Consider creating a hybrid of PR pipeline so each PR can have a hotdog stack