[COST-5551] prep for AWS Glue Data Catalogue migration #5449

Merged (17 commits) on Jan 22, 2025
21 changes: 13 additions & 8 deletions .env.example
@@ -4,8 +4,8 @@ DATABASE_NAME=postgres
PGADMIN_EMAIL=[email protected]
PGADMIN_PASSWORD=postgres
PGADMIN_PORT=8432
POSTGRES_SQL_SERVICE_HOST=localhost
POSTGRES_SQL_SERVICE_PORT=15432
POSTGRES_SQL_SERVICE_HOST=db
POSTGRES_SQL_SERVICE_PORT=5432
DATABASE_USER=postgres
DATABASE_ADMIN=postgres
DATABASE_PASSWORD=postgres
@@ -32,18 +32,23 @@ TAG_ENABLED_LIMIT=200 # Set the max amount of tags per account
DELAYED_TASK_TIME=30 # Set the seconds before a delayed summary task should expire
DELAYED_TASK_POLLING_MINUTES=5 # Set the time before the delayed task kick off.

ENABLE_S3_ARCHIVING=True
S3_BUCKET_NAME=koku-bucket
S3_BUCKET_PATH=data
S3_ENDPOINT=http://koku-minio:9000
S3_ACCESS_KEY=kokuminioaccess
S3_SECRET=kokuminiosecret
SKIP_MINIO_DATA_DELETION=False

# AWS
AWS_SHARED_CREDENTIALS_FILE=/etc/credentials/aws
AWS_RESOURCE_NAME=YOUR_COST_MANAGEMENT_AWS_ARN

# Glue
SCHEMA_SUFFIX="" # if DEVELOPMENT=True, this can be left empty and will default to $USER; otherwise, set this value to something unique

AWS_CATALOG_ID=589173575009
S3_ENDPOINT=https://s3.us-east-1.amazonaws.com

S3_BUCKET_NAME=hccm-local-s3
S3_ACCESS_KEY=CHANGEME
S3_SECRET=CHANGEME
S3_REGION=us-east-1

# GCP
GOOGLE_APPLICATION_CREDENTIALS=/etc/credentials/gcp
GCP_DATASET='dataset_example'
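
For orientation, the sketch below shows one way the new Glue-related settings from this `.env.example` might be consumed with boto3. This is not the actual koku wiring: the variable names come from the file above, but the helper functions (`schema_suffix`, `get_glue_client`, `list_catalog_databases`, `get_s3_client`) are hypothetical illustrations, and the choice to reuse `S3_REGION` for the Glue client is an assumption.

```python
import os

import boto3


def schema_suffix() -> str:
    """Mirror the SCHEMA_SUFFIX comment: fall back to $USER when DEVELOPMENT=True."""
    suffix = os.getenv("SCHEMA_SUFFIX", "")
    if not suffix and os.getenv("DEVELOPMENT", "False").lower() == "true":
        suffix = os.getenv("USER", "")
    return suffix


def get_glue_client():
    """Build a Glue client; assumes the catalog lives in S3_REGION."""
    return boto3.client("glue", region_name=os.getenv("S3_REGION", "us-east-1"))


def list_catalog_databases():
    """List databases in the shared Glue Data Catalog identified by AWS_CATALOG_ID."""
    glue = get_glue_client()
    return glue.get_databases(CatalogId=os.environ["AWS_CATALOG_ID"])


def get_s3_client():
    """Build an S3 client from the S3_* settings (endpoint, keys, region)."""
    return boto3.client(
        "s3",
        endpoint_url=os.getenv("S3_ENDPOINT"),
        aws_access_key_id=os.getenv("S3_ACCESS_KEY"),
        aws_secret_access_key=os.getenv("S3_SECRET"),
        region_name=os.getenv("S3_REGION", "us-east-1"),
    )
```

Something like `list_catalog_databases()` could serve as a quick smoke test that local credentials and `AWS_CATALOG_ID` line up before pointing summary work at the shared catalog.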