Merge pull request #82 from chanzuckerberg/dev_docs
Add developer guide to working with platformics
ninabernick authored Jul 23, 2024
2 parents 0ed3d43 + a2586e7 commit ec383ea
Showing 6 changed files with 85 additions and 14 deletions.
13 changes: 1 addition & 12 deletions README.md
@@ -48,19 +48,8 @@ The libraries and tools that make Platformics work:
2. Run `make codegen` to re-run code gen and restart the API service
3. If your changes require DB schema changes, run `make alembic-autogenerate` and `make alembic-upgrade-head` to generate DB migrations and run them.

## Debugging
1. Install the `Dev Containers` extension for vscode
2. Open a new VSCode window in your api directory. It will read the `.devcontainer/devcontainer.json` configuration and prompt you to reopen the directory in a container (lower right side of the screen). Click "Reopen in container"
3. Click the "Run and Debug" icon in the icon bar on the right side of the VSCode window (or ctrl+shift+d). Then click the "start debugging" icon at the top of the run and debug panel (or press F5). This will launch a secondary instance of the API service that listens on port 9008.
4. Set all the breakpoints you want. Browse to the api at http://localhost:9008 to trigger them. Remember that the application restarts when files change, so you'll have to start and stop the debugger to pick up any changes you make!

## Debugging and developing platformics itself
1. Run `make dev` in the root of this directory. This launches a compose service called `dev-app` that has the `platformics` directory in this repo mounted inside the `test_app` application as a sub-module, so it can be edited directly and be debugged via the VSCode debugger.
2. Open a new VSCode window in the root of this repo. It will read the `.devcontainer/devcontainer.json` configuration and prompt you to reopen the directory in a container (lower right side of the screen). Click "Reopen in container"
3. Click the "Run and Debug" icon in the icon bar on the right side of the VSCode window (or ctrl+shift+d). Then click the "start debugging" icon at the top of the run and debug panel (or press F5). This will launch a secondary instance of the API service that listens on port 9008.
4. Set all the breakpoints you want. Browse to the api at http://localhost:9008 to trigger them. Remember that the application restarts when files change, so you'll have to start and stop the debugger to pick up any changes you make!

## HOWTO
- [Work with platformics](docs/HOWTO-working-with-platformics.md)
- [Extend the generated API](docs/HOWTO-extend-generated-api.md)
- [Customize Codegen templates](docs/HOWTO-customize-templates.md)

80 changes: 80 additions & 0 deletions docs/HOWTO-working-with-platformics.md
@@ -0,0 +1,80 @@
# How To: Working with Platformics

## Structure

### Platformics
Notable files and subdirectories:
* `api/` - base code and utilities for setting up the API
* `core/`
* `deps.py` - dependencies injected into FastAPI endpoints
* `query_builder.py` - functions for querying the DB given GraphQL queries
* `gql_loaders.py` - dataloaders for relationships to avoid GQL N+1 queries
* `strawberry_extensions.py` - extensions to apply dependencies to resolvers
* `types/`
* `entities.py` - base entity code
* `files.py` - GQL types, mutations, queries for files
* `codegen/`
* `lib/linkml_wrappers.py` - convenience functions for converting LinkML to generated code
* `templates/` - all Jinja templates for codegen. Entity-related templates can be overridden with [custom templates](https://github.com/chanzuckerberg/platformics/tree/main/platformics/docs/HOWTO-customize-templates.md).
* `generator.py` - script that applies the Jinja templates to a LinkML schema to generate code
* `database/`
* `models/`
* `base.py` - SQLAlchemy model for base entity
* `file.py` - SQLAlchemy model and methods for file
* `connect.py` - functions for connecting to database
* `support/` - miscellaneous support enums and functions for files
* `test_infra/` - contains base entity and file factories
* `settings.py` - config variables using [Pydantic Settings](https://docs.pydantic.dev/latest/concepts/pydantic_settings/)
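
As a quick illustration of the `settings.py` pattern, here is a minimal sketch of how config is read with Pydantic Settings. Only `DB_ECHO` is taken from the real `Settings` class; the other field and its default value are hypothetical placeholders.

```python
# Minimal sketch of the Pydantic Settings pattern used by platformics/settings.py.
# Only DB_ECHO mirrors the real Settings class; DB_URI and its default are hypothetical.
from pydantic_settings import BaseSettings


class ExampleSettings(BaseSettings):
    DB_URI: str = "postgresql+asyncpg://user:pass@localhost:5432/platformics"  # placeholder
    DB_ECHO: bool = False  # when true, SQLAlchemy echoes every SQL statement


if __name__ == "__main__":
    # Each field is populated from the matching environment variable
    # (e.g. DB_ECHO=true in docker-compose.yml) and falls back to the default above.
    settings = ExampleSettings()
    print(settings.DB_URI, settings.DB_ECHO)
```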


### Test app
Notable files and subdirectories:
* `api/` - entrypoint for GQL API service
* `helpers/` - generated GQL types and helper functions for GROUPBY queries
* `types/` - generated GQL types
* `validators/` - generated Pydantic validators for create and update inputs
* `mutations.py` - generated mutations (create, update, delete) for each entity type
* `queries.py` - generated queries (list and aggregate) for each entity type
* `schema.graphql` - GQL format schema
* `schema.json` - JSON format schema
* `cerbos/` - generated access policies governing user actions on each entity type
* `database/` - code related to establishing DB connections / sessions
* `migrations/` - alembic migrations
* `models/` - generated SQLAlchemy models
* `schema/`
* `schema.yaml` - LinkML schema used to codegen entity-related files
* `test_infra/`
* `factories/` - FactoryBoy factories generated for each entity type
* `tests/` - your custom tests (not code-generated)
* `etc/` - some basic setup configuration
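
To make the `test_infra/factories/` entry concrete, here is a self-contained sketch of the FactoryBoy pattern the generated factories follow. The `Sample` entity, its fields, and the in-memory SQLite session are hypothetical stand-ins; the real factories are generated from `schema.yaml` and target the generated Postgres-backed SQLAlchemy models.

```python
# Self-contained sketch of the FactoryBoy pattern used in test_infra/factories/.
# The Sample model, its fields, and the SQLite session are hypothetical stand-ins.
from factory import Faker
from factory.alchemy import SQLAlchemyModelFactory
from sqlalchemy import Column, Integer, String, create_engine
from sqlalchemy.orm import declarative_base, scoped_session, sessionmaker

Base = declarative_base()
engine = create_engine("sqlite:///:memory:")
session = scoped_session(sessionmaker(bind=engine))


class Sample(Base):  # hypothetical entity
    __tablename__ = "sample"
    id = Column(Integer, primary_key=True)
    name = Column(String)


class SampleFactory(SQLAlchemyModelFactory):
    class Meta:
        model = Sample
        sqlalchemy_session = session
        sqlalchemy_session_persistence = "commit"

    name = Faker("word")


if __name__ == "__main__":
    Base.metadata.create_all(engine)
    sample = SampleFactory()  # builds and commits one Sample row with fake data
    print(sample.id, sample.name)
```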

## Containers
There are two main ways of running the test app, depending on what kind of development you're doing: making changes to the test app only, or making changes to the core platformics library.

To develop in the test app, run `make init`. It builds the `platformics` image from the latest base image and starts the test app listening on port 9009. Changes within the `platformics` repo will not be picked up unless the image is rebuilt.

Containers (`test_app/docker-compose.yml`):
* `motoserver`: mock S3 service so the test app can run entirely locally during development
* `cerbos`: resource authorization
* `platformics-db`: Postgres database
* `graphql-api`: API

When developing on `platformics` itself, run `make dev`. It starts all of the above containers, then stops the `graphql-api` container and starts a `dev-app` compose service in its place; the two share a port, so `graphql-api` must be stopped first.
The `dev-app` service has the `platformics` directory from this repo mounted inside the `test_app` application as a sub-module, so it can be edited directly and debugged via the VSCode debugger.


In either flow, the main app listens on port 9009 and debugging sessions listen on port 9008.
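
A quick way to confirm the API is up is to send a trivial GraphQL query to it. A sketch, assuming the endpoint is served at `/graphql` (as in the debugging section below) and that this particular query needs no auth headers:

```python
# Smoke test against the running test app. Assumes the GraphQL endpoint lives at
# /graphql and that this introspection-style query does not require auth headers.
import requests

GRAPHQL_URL = "http://localhost:9009/graphql"  # use port 9008 for the debugger instance

response = requests.post(GRAPHQL_URL, json={"query": "{ __typename }"})
response.raise_for_status()
print(response.json())  # expected: {"data": {"__typename": "Query"}}
```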


## Debugging

### Using VSCode debugger
1. Install the `Dev Containers` extension for VSCode
2. Open a new VSCode window in your api directory. It will read the `.devcontainer/devcontainer.json` configuration and prompt you to reopen the directory in a container (lower right side of the screen). Click "Reopen in container"
3. Click the "Run and Debug" icon in the icon bar on the right side of the VSCode window (or ctrl+shift+d). Then click the "start debugging" icon at the top of the run and debug panel (or press F5). This will launch a secondary instance of the API service that listens on port 9008.
4. Set all the breakpoints you want. Browse to the API at http://localhost:9008/graphql to trigger them. Remember that the application restarts when files change, so you'll have to stop and restart the debugger to pick up any changes you make!


### Queries
To view SQL logs for queries, set `DB_ECHO=true` in `docker-compose.yml`. Run `make start` or `docker compose up -d` to apply the change.
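
This works because `DB_ECHO` is threaded from settings into `init_async_db`, which hands it to SQLAlchemy's `create_async_engine` (see the accompanying changes to `platformics/api/core/deps.py` and `platformics/database/connect.py`). A minimal sketch of the same mechanism in isolation, with a placeholder connection URI standing in for `settings.DB_URI`:

```python
# Sketch of what DB_ECHO controls: echo=True makes the async engine log every SQL
# statement it issues. The URI is a hypothetical placeholder for settings.DB_URI.
import asyncio

from sqlalchemy import text
from sqlalchemy.ext.asyncio import create_async_engine


async def main() -> None:
    engine = create_async_engine(
        "postgresql+asyncpg://postgres:example@localhost:5432/platformics",  # placeholder
        echo=True,  # mirrors DB_ECHO=true
    )
    async with engine.connect() as conn:
        await conn.execute(text("SELECT 1"))  # the statement is echoed to the logs
    await engine.dispose()


if __name__ == "__main__":
    asyncio.run(main())
```
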
2 changes: 1 addition & 1 deletion platformics/api/core/deps.py
@@ -30,7 +30,7 @@ async def get_engine(
settings: APISettings = Depends(get_settings),
) -> typing.AsyncGenerator[AsyncDB, None]:
"""Wrap resolvers in a DB engine"""
engine = init_async_db(settings.DB_URI)
engine = init_async_db(settings.DB_URI, echo=settings.DB_ECHO)
try:
yield engine
finally:
2 changes: 1 addition & 1 deletion platformics/database/connect.py
@@ -40,7 +40,7 @@ def session(self) -> sessionmaker[Session]:


def init_async_db(db_uri: str, **kwargs: dict[str, Any]) -> AsyncDB:
engine = create_async_engine(db_uri, echo=False, pool_size=5, max_overflow=5, future=True, **kwargs)
engine = create_async_engine(db_uri, pool_size=5, max_overflow=5, future=True, **kwargs)
return AsyncDB(engine)


1 change: 1 addition & 0 deletions platformics/settings.py
@@ -39,6 +39,7 @@ class Settings(BaseSettings):
DEFAULT_UPLOAD_PROTOCOL: str
BOTO_ENDPOINT_URL: typing.Optional[str] = None
AWS_REGION: str
DB_ECHO: bool = False

############################################################################
# Computed properties
1 change: 1 addition & 0 deletions test_app/docker-compose.yml
@@ -67,6 +67,7 @@ services:
- AWS_REGION=us-west-2
- AWS_ACCESS_KEY_ID=test
- AWS_SECRET_ACCESS_KEY=test
- DB_ECHO=true
# TODO - these are keypairs for testing only! Do not use in prod!!
- JWK_PUBLIC_KEY_FILE=/app/etc/public_key.pem
- JWK_PRIVATE_KEY_FILE=/app/etc/private_key.pem
