This repository has been archived by the owner on Nov 15, 2022. It is now read-only.

Merge branch 'release/milestone1'
mariopino committed Mar 20, 2020
2 parents c162237 + acf31d3 commit 60fe28f
Showing 14 changed files with 440 additions and 347 deletions.
133 changes: 132 additions & 1 deletion README.md
@@ -41,7 +41,7 @@ npm run docker:<container-name>
- postgres
- graphql-engine
- crawler
- phragmen
- phragmen (temporarily disabled)

## Updating containers

@@ -60,3 +60,134 @@ The crawler is able to detect and fill the gaps in postgres database by harvesti
## Phragmen

This container includes an offline-phragmen binary. It is a forked modification of the [Kianenigma](https://github.com/kianenigma/offline-phragmen) repository.

## Hasura demo

The crawler needs to wait for your substrate-node container to sync before it starts collecting data. For instant testing you can use an already synced external RPC by changing the WS_PROVIDER_URL environment variable in the `docker-compose.yml` file:

```yaml
crawler:
image: polkastats-backend:latest
build:
context: ../../
dockerfile: ./docker/polkastats-backend/backend/Dockerfile
depends_on:
- "postgres"
- "substrate-node"
restart: on-failure
environment:
- NODE_ENV=production
- WS_PROVIDER_URL=wss://kusama-rpc.polkadot.io # Change this line
```
Just uncomment the first one, comment out the second, and rebuild the containers:
```bash
npm run docker:clean
npm run docker
```

Then browse to http://localhost:8082

Click on "Data" in the top menu

![](images/hasura-data.png)

Then add all tables to the tracking process

![](images/hasura-track.png)

From now on, Hasura will collect and track all changes in the database.

To check it out and see its power, you can start a new subscription or run an example query such as these:

### Query examples (static)

- Block query example:
```graphql
query {
block {
block_hash
block_author
block_number
block_author_name
current_era
current_index
new_accounts
session_length
session_per_era
session_progress
}
}
```

- Rewards query example:
```graphql
query {
rewards {
era_index
era_rewards
stash_id
timestamp
}
}
```

- Validators by number of nominators example:
```graphql
query {
validator_num_nominators {
block_number
nominators
timestamp
}
}
```

- Account query example:
```graphql
query {
account {
account_id
balances
identity
}
}
```

### Subscription examples (dynamic)

- Block subscription example:
```graphql
subscription {
block {
block_number
block_hash
current_era
current_index
}
}
```

- Validator active subscription example:
```graphql
subscription MySubscription {
validator_active {
account_id
active
block_number
session_index
timestamp
}
}
```

- Account subscription example:
```graphql
subscription MySubscription {
account {
account_id
balances
}
}
```
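The same queries can be sent from application code through Hasura's HTTP endpoint. Below is a minimal Node.js sketch; the `/v1/graphql` path on port 8082 is an assumption based on the console URL above, so adjust it to your deployment:

```javascript
// Sketch: posting a GraphQL query to the Hasura endpoint from Node.js.
// The endpoint URL is an assumption based on the console address above;
// recent Hasura releases serve GraphQL at /v1/graphql.
const HASURA_URL = 'http://localhost:8082/v1/graphql';

// Hasura expects a JSON body of the form { query, variables }.
function buildPayload(query, variables = {}) {
  return JSON.stringify({ query, variables });
}

const blockQuery = `
  query {
    block {
      block_number
      block_hash
    }
  }
`;

// Usage (Node 18+ ships a global fetch); left commented so the sketch
// runs without a live Hasura instance:
// fetch(HASURA_URL, {
//   method: 'POST',
//   headers: { 'Content-Type': 'application/json' },
//   body: buildPayload(blockQuery),
// })
//   .then((res) => res.json())
//   .then(({ data }) => console.log(data.block));
```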
5 changes: 2 additions & 3 deletions backend.config.js
@@ -1,4 +1,3 @@
// Also wss://kusama-rpc.polkadot.io
const DEFAULT_WS_PROVIDER_URL = 'ws://substrate-node:9944';

module.exports = {
@@ -35,7 +34,7 @@ module.exports = {
enabled: true,
module: require('./lib/crawlers/activeAccounts.js'),
config: {
pollingTime: 1 * 60 * 1000,
pollingTime: 10 * 60 * 1000,
},
},

@@ -45,7 +44,7 @@ module.exports = {
},

{
enabled: true,
enabled: false,
module: require('./lib/crawlers/phragmen.js'),
config: {
wsProviderUrl: process.env.WS_PROVIDER_URL || DEFAULT_WS_PROVIDER_URL,
25 changes: 14 additions & 11 deletions docker/polkastats-backend/backend/Dockerfile
@@ -1,21 +1,24 @@
FROM rust AS builder

RUN mkdir -p /app; \
wget https://github.com/bigomby/offline-phragmen/archive/master.zip; \
unzip master.zip -d /app/

WORKDIR /app/offline-phragmen-master

RUN cargo build --release
#FROM rust AS builder
#
#RUN mkdir -p /app; \
# wget https://github.com/bigomby/offline-phragmen/archive/master.zip; \
# unzip master.zip -d /app/
#
#WORKDIR /app/offline-phragmen-master
#
#RUN cargo build --release

FROM node

WORKDIR /usr/app/polkastats-backend-v3

COPY --from=builder /app/offline-phragmen-master/target/release/offline-phragmen /usr/app/polkastats-backend-v3
#COPY --from=builder /app/offline-phragmen-master/target/release/offline-phragmen /usr/app/polkastats-backend-v3

RUN wget https://github.com/Bigomby/offline-phragmen/releases/download/0.1.0/offline-phragmen; \
chmod +x offline-phragmen

COPY . /usr/app/polkastats-backend-v3

RUN npm install

CMD ["npm", "start"]
CMD ["npm", "start"]
1 change: 1 addition & 0 deletions docker/polkastats-backend/docker-compose.yml
@@ -63,6 +63,7 @@ services:
restart: on-failure
environment:
- NODE_ENV=production
- WS_PROVIDER_URL=ws://substrate-node:9944
#
# Persistent volumes
#
2 changes: 0 additions & 2 deletions docker/polkastats-backend/sql/polkastats.sql
@@ -119,8 +119,6 @@ CREATE TABLE IF NOT EXISTS validator_active (

CREATE TABLE IF NOT EXISTS account (
account_id VARCHAR(100) NOT NULL,
account_index VARCHAR(100) NOT NULL,
nickname VARCHAR(100) NOT NULL,
identity TEXT NOT NULL,
balances TEXT NOT NULL,
timestamp BIGINT NOT NULL,
2 changes: 1 addition & 1 deletion docker/polkastats-backend/substrate-client/Dockerfile
@@ -2,7 +2,7 @@ FROM phusion/baseimage:0.11
LABEL maintainer "@ColmenaLabs_svq"
LABEL description="Small image with the Substrate binary."

ARG VERSION=v0.7.22
ARG VERSION=v0.7.27

RUN apt-get update && apt-get install wget curl jq -y

Binary file added images/hasura-data.png
Binary file added images/hasura-track.png
81 changes: 35 additions & 46 deletions lib/crawlers/activeAccounts.js
@@ -4,56 +4,45 @@ module.exports = {
console.log(`[PolkaStats backend v3] - \x1b[32mStarting active accounts crawler...\x1b[0m`);

// Fetch active accounts
const accounts = await api.derive.accounts.indexes();
const accountKeys = await api.query.system.account.keys()
const accounts = accountKeys.map(key => key.args[0].toHuman());

let accountsInfo = [];
console.log(`[PolkaStats backend v3] - Active Accounts - \x1b[32mProcessing ${accounts.length} active accounts\x1b[0m`);

for (var key in accounts ) {
let accountId = key;
let accountIndex = accounts[key]
let accountInfo = await api.derive.accounts.info(accountId);
let identity = accountInfo.identity.display ? JSON.stringify(accountInfo.identity) : '';
let nickname = accountInfo.nickname ? accountInfo.nickname : '';
let balances = await api.derive.balances.all(accountId);
accountsInfo[accountId] = {
accountId,
accountIndex,
identity,
nickname,
balances
}
console.log(`[PolkaStats backend v3] - Active Accounts - \x1b[32mProcessing account ${accountId}\x1b[0m`);
}
await accounts.forEach(async accountId => {

// console.log(`[PolkaStats backend v3] - Active Accounts - \x1b[32mProcessing account ${accountId}\x1b[0m`);
const accountInfo = await api.derive.accounts.info(accountId);
const identity = accountInfo.identity.display ? JSON.stringify(accountInfo.identity) : ``;
const balances = await api.derive.balances.all(accountId);
const block = await api.rpc.chain.getBlock();
const blockNumber = block.block.header.number.toNumber();

let sql = `SELECT account_id FROM account WHERE account_id = '${accountId}'`;
let res = await pool.query(sql);

// Main loop
for (var key in accountsInfo ) {
if (accountsInfo.hasOwnProperty(key)) {
// console.log(key + " -> " + accounts[key]);
let sql = `SELECT account_id FROM account WHERE account_id = '${key}'`;
let res = await pool.query(sql);
const sqlBlockHeight = `SELECT block_number FROM block ORDER BY timestamp desc LIMIT 1`;
const resBlockHeight = await pool.query(sqlBlockHeight);
if (res.rows.length > 0) {
const timestamp = new Date().getTime();
sql = `UPDATE account SET account_index = '${accountsInfo[key].accountIndex}', nickname = '${accountsInfo[key].nickname}', identity = '${accountsInfo[key].identity}', balances = '${JSON.stringify(accountsInfo[key].balances)}', timestamp = '${timestamp}', block_height = '${resBlockHeight.rows[0].block_number}' WHERE account_id = '${key}'`;
try {
console.log(`[PolkaStats backend v3] - Active Accounts - \x1b[32mUpdating account ${accountsInfo[key].accountIndex} [${key}]\x1b[0m`);
await pool.query(sql);
} catch (error) {
console.log(`[PolkaStats backend v3] - Active Accounts - \x1b[31mError updating account ${key}\x1b[0m`);
}
} else {
const timestamp = new Date().getTime();
sql = `INSERT INTO account (account_id, account_index, nickname, identity, balances, timestamp, block_height) VALUES ('${key}', '${accountsInfo[key].accountIndex}', '${accountsInfo[key].nickname}', '${accountsInfo[key].idenity}', '${JSON.stringify(accountsInfo[key].balances)}', '${timestamp}', '${resBlockHeight.rows[0].block_number}');`;
try {
console.log(`[PolkaStats backend v3] - Active Accounts - \x1b[32mAdding account ${accountsInfo[key].accountIndex} [${key}]\x1b[0m`);
await pool.query(sql);
} catch (error) {
console.log(`[PolkaStats backend v3] - Active Accounts - \x1b[31mError adding new account ${key}\x1b[0m`);
}
}
if (res.rows.length > 0) {
const timestamp = new Date().getTime();
sql = `UPDATE account SET identity = '${identity}', balances = '${JSON.stringify(balances)}', timestamp = '${timestamp}', block_height = '${blockNumber}' WHERE account_id = '${accountId}'`;
try {
// console.log(`[PolkaStats backend v3] - Active Accounts - \x1b[32mUpdating account ${accountId}\x1b[0m`);
await pool.query(sql);
} catch (error) {
console.log(`[PolkaStats backend v3] - Active Accounts - \x1b[31mError updating account ${accountId}\x1b[0m`);
console.log(`[PolkaStats backend v3] - Active Accounts - \x1b[31mError: ${error}\x1b[0m`);
}
} else {
const timestamp = new Date().getTime();
sql = `INSERT INTO account (account_id, identity, balances, timestamp, block_height) VALUES ('${accountId}', '${identity}', '${JSON.stringify(balances)}', '${timestamp}', '${blockNumber}');`;
try {
// console.log(`[PolkaStats backend v3] - Active Accounts - \x1b[32mAdding account ${accountId}\x1b[0m`);
await pool.query(sql);
} catch (error) {
console.log(`[PolkaStats backend v3] - Active Accounts - \x1b[31mError adding new account ${accountId}\x1b[0m`);
console.log(`[PolkaStats backend v3] - Active Accounts - \x1b[31mError: ${error}\x1b[0m`);
}
}
}
});

setTimeout(
() => module.exports.start(api, pool, config),
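One caveat in the rewritten crawler above: `accounts.forEach(async accountId => { … })` returns immediately, because `forEach` ignores the promises produced by an async callback, so the `await` in front of it has no effect and the `setTimeout` reschedule can fire while accounts are still being processed. A `for…of` loop awaits each item in turn. A minimal sketch of the pattern, where `processAccount` is a hypothetical stand-in for the per-account work:

```javascript
// Sketch: Array.prototype.forEach does not await async callbacks, so the
// iterations race; a for…of loop processes each item sequentially.
async function processAll(accounts, processAccount) {
  const results = [];
  for (const accountId of accounts) {
    // The next account starts only after this one has finished.
    results.push(await processAccount(accountId));
  }
  return results;
}

// Example with a stand-in async worker:
processAll(['alice', 'bob'], async (id) => id.toUpperCase())
  .then((res) => console.log(res)); // → [ 'ALICE', 'BOB' ]
```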
