Merge pull request #1460 from boostercloud/v2.x.x
v2.x.x - Next major version preparation branch
Javier Toledo authored Nov 1, 2023
2 parents 7fab52d + 9d2fd48 commit 9ec3bd2
Showing 117 changed files with 1,073 additions and 1,190 deletions.
3 changes: 1 addition & 2 deletions .github/README_CICD.md
@@ -11,8 +11,7 @@ account that:
the user is not confused.
- It handles the creation and wiring of many cloud components, which are lots of moving pieces, so everything is double-checked
to prevent errors in deployed environments.
-- It is a multi-cloud framework, and behavior is double-checked both on AWS and Azure, as they are the officially supported cloud
-  providers. Ensuring everything runs smoothly, regardless of the choice of the user.
+- It is a multi-cloud framework, and behavior is double-checked both on AWS and Azure, ensuring everything runs smoothly regardless of the user's choice.

We keep improving our CI/CD processes, but we always make sure that we have the above covered.

2 changes: 1 addition & 1 deletion .github/actions/build/action.yml
@@ -5,7 +5,7 @@ runs:
steps:
- uses: actions/setup-node@v3
with:
-node-version: 16.14
+node-version: 18.18

# First we cache the rush project, to ensure we don't build multiple times or download more dependencies than needed
- name: Cache Rush project
6 changes: 4 additions & 2 deletions .github/workflows/codeql-analysis.yml
@@ -13,10 +13,12 @@ name: 'CodeQL'

on:
push:
-branches: [main]
+branches:
+  - main
pull_request:
# The branches below must be a subset of the branches above
-branches: [main]
+branches:
+  - main
schedule:
- cron: '17 3 * * 4'

2 changes: 1 addition & 1 deletion .github/workflows/re_test-integration-prepare.yml
@@ -26,7 +26,7 @@ jobs:

- uses: actions/setup-node@v4
with:
-node-version: 16.14
+node-version: 18.18

# If this was triggered by a /integration command, check out merge commit
- name: Fork based /integration checkout
1 change: 0 additions & 1 deletion .github/workflows/wf_check-changes.yml
@@ -3,7 +3,6 @@ on:
push:
branches:
- main
-  - '1.0.0'
paths-ignore:
- '**.md'
- 'website/**'
1 change: 0 additions & 1 deletion .github/workflows/wf_check-lint.yml
@@ -3,7 +3,6 @@ on:
pull_request:
branches:
- main
-  - '1.0.0'
paths-ignore:
- '**.md'
- 'website/**'
2 changes: 1 addition & 1 deletion .github/workflows/wf_publish-docs.yml
@@ -18,7 +18,7 @@ jobs:
- uses: actions/checkout@v4
- uses: actions/setup-node@v4
with:
-node-version: 18
+node-version: 18.18
cache: npm
cache-dependency-path: website/package-lock.json

2 changes: 1 addition & 1 deletion .github/workflows/wf_publish-npm.yml
@@ -46,7 +46,7 @@ jobs:

- uses: actions/setup-node@v4
with:
-node-version: 16.x
+node-version: 18.18
registry-url: https://registry.npmjs.org/

- name: Rush Update
1 change: 0 additions & 1 deletion .github/workflows/wf_test-unit.yml
@@ -3,7 +3,6 @@ on:
push:
branches:
- main
-  - '1.0.0'
paths-ignore:
- '**.md'
- 'website/**'
2 changes: 1 addition & 1 deletion .nvmrc
@@ -1 +1 @@
-lts/gallium
+lts/hydrogen
47 changes: 34 additions & 13 deletions CONTRIBUTING.md
@@ -53,7 +53,7 @@ Booster is divided in many different packages. The criteria to split the code in

- They must be run separately; for instance, the CLI is run locally, while the support code for the project runs on the cloud.
- They contain code that is used by at least two of the other packages.
-- They're a vendor-specific specialization of some abstract part of the framework (for instance, all the code that is required by AWS is in separate packages).
+- They're a vendor-specific specialization of some abstract part of the framework (for instance, all the code that is required to support Azure is in separate packages).

The packages are managed using [rush](https://rushjs.io/) and [npm](https://npmjs.com); if you run `rush build`, it will build all the packages.
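
For day-to-day work, a typical Rush workflow looks roughly like the sketch below (assuming the standard Rush commands; `rushx` runs a script defined in the current package's `package.json`):

```sh
# Install and link dependencies for every package in the monorepo
rush update

# Build all packages (incrementally)
rush build

# Force a clean rebuild of everything
rush rebuild

# Run a package-level script, e.g. the unit tests of framework-core
# (assuming that package defines a "test" script)
cd packages/framework-core
rushx test
```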

@@ -62,8 +62,8 @@ The packages are published to `npmjs` under the prefix `@boostercloud/`, their p
- `cli` - You guessed it! This package is the `boost` command-line tool; it interacts only with the core package in order to load the project configuration. The specific provider packages to interact with the cloud providers are loaded dynamically from the project config.
- `framework-core` - This one contains all the framework runtime vendor-independent logic. Stuff like the generation of the config or the commands and events handling happens here. The specific provider packages to interact with the cloud providers are loaded dynamically from the project config.
- `framework-integration-tests` - Implements integration tests for all supported vendors. Tests are run on real infrastructure using the same mechanisms as a production application. This package's `src` folder includes a synthetic Booster application that can be deployed to a real provider for testing purposes.
-- `framework-provider-aws` - Implements all the required adapters to make the booster core run on top of AWS technologies like Lambda and DynamoDB using the AWS SDK under the hoods.
-- `framework-provider-aws-infrastructure` - Implements all the required adapters to allow Booster applications to be deployed to AWS using the AWS CDK under the hoods.
+- `framework-provider-aws` (Deprecated) - Implements all the required adapters to make the Booster core run on top of AWS technologies like Lambda and DynamoDB using the AWS SDK under the hood.
+- `framework-provider-aws-infrastructure` (Deprecated) - Implements all the required adapters to allow Booster applications to be deployed to AWS using the AWS CDK under the hood.
- `framework-provider-local` - Implements all the required adapters to run the Booster application on a local express server to be able to debug your code before deploying it to a real cloud provider.
- `framework-provider-local-infrastructure` - Implements all the required code to run the local development server.
- `framework-types` - This package defines types that the rest of the project will use. This is useful for avoiding cyclic dependencies. Note that this package should not contain anything other than types or very simple methods directly related to them (e.g. a getter or setter). This package defines the main Booster concepts like:
@@ -252,7 +252,7 @@ The Booster Framework project is organized following the ["rush monorepo"](https

- The "package.json" files that are on each package root should contain the dependencies used by that specific package. Be sure to correctly differentiate which dependency is only for development and which one is for production.

-Finally, **always use exact numbers for dependency versions**. This means that if you want to add the dependency "aws-sdk" in version 1.2.3, you should add `"aws-sdk": "1.2.3"` to the corresponding "package.json" file, and never `"aws-sdk": "^1.2.3"` or `"aws-sdk": "~1.2.3"`. This restriction comes from hard problems we've had in the past.
+Finally, **always use exact numbers for dependency versions**. This means that if you want to add the dependency "graphql" in version 1.2.3, you should add `"graphql": "1.2.3"` to the corresponding "package.json" file, and never `"graphql": "^1.2.3"` or `"graphql": "~1.2.3"`. This restriction comes from hard problems we've had in the past.
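
For illustration, a `dependencies` section following this rule could look like the snippet below (package names and versions are only examples):

```json
{
  "dependencies": {
    "graphql": "1.2.3",
    "some-library": "4.5.6"
  },
  "devDependencies": {
    "some-dev-tool": "7.8.9"
  }
}
```

Note that there are no `^` or `~` range prefixes anywhere.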

### Running unit tests

@@ -275,15 +275,36 @@ Still, it is recommendable to run them locally before submitting a PR for review

These are the available scripts to run integration tests:

-- `rushx integration -v`: Run all the integration test suites in the right order.
-- `rushx integration/aws-deploy -v`: This test just checks that the sample project in `packages/framework-integration-tests/src` can be successfully deployed to AWS. The deployment process takes several minutes and this project is used by all the other AWS integration tests, so it's a requirement to run this test before.
-- `rushx integration/aws-func -v`: AWS functional integration tests. They stress the deployed app write API and checks that the results are the expected ones both in the databases and the read APIs.
-- `rushx integration/end-to-end -v`: Runs complete and realistic use cases on several cloud providers. This tests are intended to verify that a single project can be deployed to different cloud providers. Currently, only AWS is implemented though.
-- `rushx integration/aws-nuke -v`: This test checks that the application deployed to AWS can be properly nuked. This test should be the last one after other test suites related to AWS have finished.
-- `rushx integration/local -v`: Checks that the test application can be launched locally and that the APIs and the databases behave as expected.
-- `rushx integration/cli -v`: Checks cli commands and check that they produce the expected results.
-
-AWS integration tests are run in real AWS resources, so you'll need to have your AWS credentials properly set in your development machine. By default, the sample project will be deployed to your default account. Basically, if you can deploy a Booster project to AWS, you should be good to go ([See more details about setting up an AWS account in the docs](https://github.com/boostercloud/booster/tree/main/docs#set-up-an-aws-account)). Notice that while all resources used by Booster are included in the AWS free tier, running these tests in your own AWS account could incur in some expenses.
+1. **General Integration Tests:**
+   - `rushx integration -v`: Run all integration test scripts.
+
+2. **CLI Integration Tests:**
+   - `rushx integration/cli -v`: Tests CLI commands and verifies that they produce the expected results.
+
+3. **Local Integration Tests:**
+   - `rushx integration/local -v`: Runs all integration scripts in the local development server.
+   - `rushx integration/local-ongoing -v`: Runs the start and stop integration tests.
+   - `rushx integration/local-start -v`: Checks the start functionality of the local environment.
+   - `rushx integration/local-func -v`: Functional tests for the local environment.
+   - `rushx integration/local-end-to-end -v`: Runs end-to-end tests in the local environment.
+   - `rushx integration/local-stop -v`: Checks the stop functionality of the local environment.
+
+4. **AWS Integration Tests:**
+   - `rushx integration/aws -v`: Runs all integration test scripts for provider AWS.
+   - `rushx integration/aws-deploy -v`: Tests the deployment of a sample project to AWS.
+   - `rushx integration/aws-func -v`: Runs functional tests on AWS, stressing the deployed app's write API and verifying the results in databases and read APIs.
+   - `rushx integration/aws-end-to-end -v`: Performs end-to-end tests on AWS.
+   - `rushx integration/aws-load -v`: (Currently skipped) Intended for load tests on AWS.
+   - `rushx integration/aws-nuke -v`: Verifies that the deployed application on AWS can be properly nuked.
+
+5. **Azure Integration Tests:**
+   - `rushx integration/azure -v`: Runs all integration test scripts for provider Azure.
+   - `rushx integration/azure-deploy -v`: Tests the deployment of a project to Azure.
+   - `rushx integration/azure-func -v`: Runs functional tests on Azure.
+   - `rushx integration/azure-end-to-end -v`: Performs end-to-end tests on Azure.
+   - `rushx integration/azure-nuke -v`: Verifies that the deployed application on Azure can be properly nuked.
+
+Azure and AWS integration tests run in real environments, so you'll need to have your credentials properly set in your development machine in order to run them. They will deploy a sample project to your default account, run the tests and nuke the application when the process finishes. Notice that running integration tests in your cloud account could incur some expenses.
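
As a reference, a typical local test run could look like the sketch below (assuming these scripts are defined in the `packages/framework-integration-tests` package; check that package's `package.json` for the exact script names):

```sh
# Build everything first so the integration package uses fresh artifacts
rush update
rush build

# Run the local-provider integration suite
cd packages/framework-integration-tests
rushx integration/local -v

# Cloud suites require provider credentials to be configured beforehand
# (e.g. AWS credentials or an Azure service principal in your environment)
rushx integration/aws -v
rushx integration/azure -v
```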

### Github flow

2 changes: 1 addition & 1 deletion README.md
@@ -14,7 +14,7 @@

[Booster Framework](https://boosterframework.com) is a software development framework designed to create event-driven backend microservices that focus on extreme development productivity. It provides a highly opinionated implementation of the CQRS and Event Sourcing patterns in Typescript, using [DDD (Domain-Driven Design)](https://en.wikipedia.org/wiki/Domain-driven_design) semantics that make business logic fit naturally within the code. Thanks to Booster, business, product, and technical teams can collaborate, sharing a much closer language.

-Booster uses advanced static analysis techniques and takes advantage of the Typescript type system to understand the structure and semantics of your code and minimize the amount of glue code. It's capable not just of building an entirely functioning GraphQL API for you, but also to build an optimal, production-ready and scalable cloud infrastructure for your application in your preferred cloud provider (Azure or AWS).
+Booster uses advanced static analysis techniques and takes advantage of the Typescript type system to understand the structure and semantics of your code and minimize the amount of glue code. It's capable not just of building an entirely functioning GraphQL API for you, but also of building an optimal, production-ready and scalable cloud infrastructure for your application in Azure or AWS.

Combining these features, Booster provides an unprecedented developer experience. On the one hand, it helps you write simpler code, defining your application in terms of commands, events, entities, and read models. On the other hand, you don't have to worry about the tremendous amount of low-level configuration details of conventional tools. You write highly semantic code, and if it compiles, you can run it on the cloud at scale.
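
As a small taste of that style, a command in a Booster application typically looks like the sketch below (an illustrative example only; the event class and field names are made up, so check the Booster documentation for the exact APIs):

```typescript
import { Command } from '@boostercloud/framework-core'
import { Register } from '@boostercloud/framework-types'
// Hypothetical event defined elsewhere in the application
import { CartItemAdded } from '../events/cart-item-added'

@Command({ authorize: 'all' })
export class AddCartItem {
  public constructor(readonly cartId: string, readonly sku: string, readonly quantity: number) {}

  public static async handle(command: AddCartItem, register: Register): Promise<void> {
    // Commands don't change state directly: they register events, and
    // entities and read models are projected from those events.
    register.events(new CartItemAdded(command.cartId, command.sku, command.quantity))
  }
}
```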

@@ -0,0 +1,10 @@
{
  "changes": [
    {
      "packageName": "@boostercloud/framework-core",
      "comment": "Upgraded for Node18 support",
      "type": "minor"
    }
  ],
  "packageName": "@boostercloud/framework-core"
}
@@ -0,0 +1,10 @@
{
  "changes": [
    {
      "packageName": "@boostercloud/framework-core",
      "comment": "Replaced the deprecated dependency `ttypescript` with `ts-patch`",
      "type": "minor"
    }
  ],
  "packageName": "@boostercloud/framework-core"
}
@@ -0,0 +1,10 @@
{
  "changes": [
    {
      "packageName": "@boostercloud/framework-core",
      "comment": "Bump version to 2.0.0",
      "type": "major"
    }
  ],
  "packageName": "@boostercloud/framework-core"
}
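
Change files like the ones above are normally generated with Rush rather than written by hand; a quick sketch of the usual flow (assuming the standard Rush changelog workflow):

```sh
# After committing your changes, from the repository root:
rush change

# Rush prompts for a change type (major/minor/patch/none) and a short comment
# for each modified package, and writes JSON change files under common/changes/.
```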
