Merge pull request #376 from near/main
Prod Release 09/11/23
morgsmccauley authored Nov 9, 2023
2 parents 7f7ba0c + 91f7999 commit 46a91ba
Showing 15 changed files with 306 additions and 485 deletions.
3 changes: 1 addition & 2 deletions README.md
@@ -8,7 +8,7 @@ With QueryApi you can
## 🧩 Components
1. [QueryApi Coordinator](./indexer)
An Indexer that tracks changes to the QueryApi registry contract. It triggers the execution of those IndexerFunctions
-when they match new blocks by placing messages on an SQS queue. Spawns historical processing threads when needed.
+when they match new blocks by placing messages on a Redis Stream. Spawns historical processing threads when needed.
1.a. Subfolders provide crates for the different components of the Indexer: indexer_rule_type (shared with registry contract),
indexer_rules_engine, storage.
2. [Indexer Runner](.indexer-js-queue-handler)
@@ -70,7 +70,6 @@ docker compose up

### Local Configuration
- Coordinator watches the dev registry contract by default (`dev-queryapi.dataplatform.near`). To use a different contract, you can update the `REGISTRY_CONTRACT_ID` environment variable.
-- Coodinator will log SQS messages rather than sending them. To use an actual Queue, you can update the `QUEUE_URL` and `START_FROM_BLOCK_QUEUE_URL` environment variables.

### Known Issues

29 changes: 1 addition & 28 deletions indexer/Cargo.lock

Some generated files are not rendered by default.

5 changes: 0 additions & 5 deletions indexer/README.md
@@ -30,8 +30,6 @@ This project is using `workspace` feature of Cargo.
Some tests require blocks with matching data. To download the test block, run
`./download_test_blocks.sh 93085141`. Some other useful blocks are 80854399 92476362 93085141 93659695.

-To log a message instead of sending SQS messages set your `QUEUE_URL` to `MOCK` in your `.env` file.

## Design concept

Identified major types of the events on the network:
@@ -46,9 +44,6 @@ Identified major types of the events on the network:
DATABASE_URL=postgres://user:pass@host/database
LAKE_AWS_ACCESS_KEY=AKI_LAKE_ACCESS...
LAKE_AWS_SECRET_ACCESS_KEY=LAKE_SECRET...
-QUEUE_AWS_ACCESS_KEY=AKI_SQS_ACCESS...
-QUEUE_AWS_SECRET_ACCESS_KEY=SQS_ACCESS_SECRET
-QUEUE_URL=https://sqs.eu-central-1.amazonaws.com/754641474505/alertexer-queue
```
## Running locally
4 changes: 2 additions & 2 deletions indexer/indexer_rules_engine/src/matcher.rs
@@ -115,8 +115,8 @@ fn match_account(
outcome_with_receipt: &IndexerExecutionOutcomeWithReceipt,
) -> bool {
match account_id {
-x if x.contains(",") => x
-    .split(",")
+x if x.contains(',') => x
+    .split(',')
.any(|sub_account_id| match_account(sub_account_id.trim(), outcome_with_receipt)),
_ => {
wildmatch::WildMatch::new(account_id).matches(&outcome_with_receipt.receipt.receiver_id)
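The matcher.rs hunk above switches `contains` and `split` from one-character `&str` patterns (`","`) to `char` patterns (`','`), the form clippy's `single_char_pattern` lint recommends, without changing behavior. A minimal sketch of the surrounding comma-split recursion, with the wildmatch-based outcome check simplified to a plain equality against a hypothetical `receiver_id` string:

```rust
// Simplified sketch of match_account from matcher.rs. The real function
// takes an IndexerExecutionOutcomeWithReceipt and matches via
// wildmatch::WildMatch; here plain string equality stands in for that.
fn match_account(account_id: &str, receiver_id: &str) -> bool {
    match account_id {
        // A char pattern (',') is the idiomatic form; a comma-separated
        // list matches if any trimmed entry matches recursively.
        x if x.contains(',') => x
            .split(',')
            .any(|sub_account_id| match_account(sub_account_id.trim(), receiver_id)),
        _ => account_id == receiver_id,
    }
}

fn main() {
    assert!(match_account("alice.near, bob.near", "bob.near"));
    assert!(!match_account("alice.near", "carol.near"));
    println!("ok");
}
```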
2 changes: 1 addition & 1 deletion indexer/queryapi_coordinator/Cargo.toml
@@ -17,6 +17,7 @@ prometheus = "0.13.0"
serde = { version = "1", features = ["derive"] }
serde_json = "1.0.55"
tokio = { version = "1.1", features = ["sync", "time", "macros", "rt-multi-thread"] }
+tokio-util = "0.6.7"
tokio-stream = { version = "0.1" }
tracing = "0.1.34"

@@ -41,4 +42,3 @@ unescape = "0.1.0"
aws-types = "0.53.0"
aws-credential-types = "0.53.0"
aws-sdk-s3 = "0.23.0"
-aws-sdk-sqs = "0.23.0"
1 change: 0 additions & 1 deletion indexer/queryapi_coordinator/README.md
@@ -17,4 +17,3 @@ see terraform scripts https://github.com/near/near-ops/tree/master/provisioning/
This app requires:
* a connection to a database containing "alert" rules to match blocks against;
* a redis server where identifiers of processed blocks are stored;
-* a SQS queue to write to.