Techniques for Populating CC Database with Large Amounts of Test Data
This wiki page is still a work in progress.
In both Postgres and MySQL you can load large amounts of data relatively quickly by providing it via a CSV file.
The general approach is to run `describe <table-name>` in MySQL or `\d <table-name>` in Postgres to get a feel for which columns you'll need to account for. If the table you're modifying has any foreign keys, make sure their values reference real records so that you do not violate any foreign key constraints. For example, the CSV generators below use hard-coded space guids/app guids as necessary. In general, it's easiest to start with a known record from the database and tweak the columns that need to be unique or otherwise modified to suit your needs.
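If you need more than one valid foreign-key value, one approach is to fetch a handful of real guids up front and cycle through them while generating rows. A minimal sketch (the `space_guids` values below are placeholders, not real records):

```ruby
require 'csv'
require 'securerandom'

# Placeholder guids -- in practice, copy these from real rows in the
# referenced table so the foreign key constraint is satisfied.
space_guids = [
  "00000000-0000-0000-0000-000000000001",
  "00000000-0000-0000-0000-000000000002"
]

csv_string = CSV.generate do |csv|
  10.times do |i|
    # Cycle through the known-valid guids so every generated row
    # references an existing record.
    csv << [SecureRandom.uuid, space_guids[i % space_guids.size]]
  end
end
```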
Examples:
```ruby
require 'csv'
require 'securerandom'

# Generate 2,000,000 rows for the apps table. The space guid is a
# hard-coded, known-valid value to satisfy the foreign key constraint.
CSV.open("/tmp/apps-postgres.csv", "wb") do |csv|
  2_000_000.times do |i|
    data = [
      SecureRandom.uuid,                       # guid
      Time.now.utc,                            # created_at
      Time.now.utc,                            # updated_at
      "044d2153-9e7a-45ae-a3cc-41fed42f32a7",  # space_guid
      SecureRandom.uuid,                       # name
      nil,                                     # droplet_guid
      "STOPPED",                               # desired_state
      "qmryCYGHkP72BoUK+7GtcQ==",              # encrypted_environment_variables
      "7d675587",                              # salt
      1,                                       # max_task_sequence_id
      nil,                                     # buildpack_cache_sha256_checksum
      0                                        # enable_ssh
    ]
    csv << data
  end
end
```
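One detail worth noting for the Postgres import: Ruby's CSV writer serializes `nil` as an empty, unquoted field, and Postgres's CSV input format reads an unquoted empty field as NULL (a quoted empty string, by contrast, loads as an empty string). A quick check of the serialization:

```ruby
require 'csv'

# nil serializes to an empty field; Postgres's CSV format treats the
# unquoted empty field as NULL on \copy.
line = CSV.generate_line(["guid-1", nil, "STOPPED", 0])
# line == "guid-1,,STOPPED,0\n"
```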
```ruby
require 'csv'
require 'securerandom'

# Generate 500,000 rows for the processes table.
CSV.open("/tmp/processes-postgres.csv", "wb") do |csv|
  500_000.times do |i|
    data = [
      SecureRandom.uuid,
      Time.now.utc,
      Time.now.utc,
      1,
      32,
      1,
      16384,
      20,
      "STOPPED",
      SecureRandom.uuid,
      '',
      nil,
      1,
      nil,
      1,
      Time.now.utc,
      "90094713-ba5b-4e04-bf9c-c1de7b3538eb",
      SecureRandom.uuid,
      "port",
      '$HOME/boot.sh',
      0,
      nil,
      nil,
      nil,
      nil
    ]
    csv << data
  end
end
```
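If you're generating the CSV for MySQL instead (e.g. the `/tmp/apps-mysql.csv` used below), two differences are worth checking against your schema: writing a raw `Time` object produces a trailing zone name (`... UTC`) that MySQL's DATETIME parsing may warn about, and `LOAD DATA` reads an empty field as an empty string, using the sequence `\N` to mean NULL. A hedged sketch of the row formatting:

```ruby
require 'csv'
require 'securerandom'

# Format timestamps the way MySQL DATETIME expects (no zone suffix),
# and emit \N -- LOAD DATA's NULL marker -- instead of an empty field.
now = Time.now.utc.strftime("%Y-%m-%d %H:%M:%S")
line = CSV.generate_line([SecureRandom.uuid, now, now, "\\N", "STOPPED"])
```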
For MySQL, first connect to the database using the `--local-infile` option, e.g. `mysql -u username -p database_name --local-infile`. Then load the file with a statement of this shape:

```
mysql> LOAD DATA LOCAL INFILE <local-file-path> INTO TABLE <table-name> FIELDS TERMINATED BY ",";
```
Example:
```
mysql> LOAD DATA LOCAL INFILE "/tmp/apps-mysql.csv" INTO TABLE apps FIELDS TERMINATED BY ",";
```
If the operation was successful you'll see that rows were affected. If not, it will usually report "0 rows affected" along with a count of warnings, which you can view by executing `show warnings;`.
In Postgres you'll use the `\copy` command:

```
cloud_controller=> \copy <table-name>(optional, column, names) from <local-file-path> (format csv);
```
Example:
```
cloud_controller=> \copy apps(guid, created_at, updated_at, space_guid, name, droplet_guid, desired_state, encrypted_environment_variables, salt, max_task_sequence_id, buildpack_cache_sha256_checksum, enable_ssh) from /tmp/apps-postgres.csv (format csv);
```