
feat: Add scraping of Sealevel transactions into E2E tests #4850

Open · wants to merge 4 commits into main

Conversation

@ameten (Contributor) commented Nov 12, 2024

Description

Add scraping of Sealevel transactions into E2E tests

Drive-by changes

  • Add Stride as known Hyperlane domain
  • Calculate the log index for Sealevel chains from the transaction itself

Related issues

Backward compatibility

Yes

Testing

Manual run of the Scraper
E2E tests for Ethereum and Sealevel

changeset-bot commented Nov 12, 2024

⚠️ No Changeset found

Latest commit: ff41bad

Merging this PR will not cause a version bump for any packages. If these changes should not result in a new version, you're good to go. If these changes should result in a version bump, you need to add a changeset.

This PR includes no changesets

When changesets are added to this PR, you'll see the packages that this PR includes changesets for and the associated semver types


codecov bot commented Nov 12, 2024

Codecov Report

All modified and coverable lines are covered by tests ✅

Project coverage is 74.60%. Comparing base (1159e0f) to head (ff41bad).

Additional details and impacted files
```
@@           Coverage Diff           @@
##             main    #4850   +/-   ##
=======================================
  Coverage   74.60%   74.60%
=======================================
  Files         103      103
  Lines        1516     1516
  Branches      195      195
=======================================
  Hits         1131     1131
  Misses        364      364
  Partials       21       21
```
| Components  | Coverage Δ     |
|-------------|----------------|
| core        | 84.61% <ø> (ø) |
| hooks       | 77.77% <ø> (ø) |
| isms        | 79.02% <ø> (ø) |
| token       | 89.07% <ø> (ø) |
| middlewares | 77.58% <ø> (ø) |

@ameten changed the title from "feat: Add scraping of Sealevel transaction into E2E tests" to "feat: Add scraping of Sealevel transactions into E2E tests" on Nov 12, 2024

```rust
"block which should contain {} transaction does not contain any after filtering",
self.transaction_description,
)))?;

// Construct log index which will be increasing relative to block
```
Collaborator

fwiw I view us as being able to define this however we want per VM, and I don't think we necessarily need to replicate the EVM behavior here; but happy as it is.

```rust
"block which should contain {} transaction does not contain any after filtering",
self.transaction_description,
)))?;

// Construct log index which will be increasing relative to block
let log_index = U256::from((transaction_index << 8) + (program_index as usize));
```
Collaborator

Noting that because the log index is used to determine the "uniqueness" of an IGP payment, changing this may have some consequences: if we re-index a payment, we'll consider it two separate payments because the log index is different.

What I'm not sure about is whether we'd expect to actually double-process any gas payments. On the relayer/validator side I'm pretty sure this won't happen, but I'm unsure about the scraper.
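For intuition, the packed value above increases monotonically within a block: the transaction's position in the block occupies the high bits and the instruction's position within the transaction occupies the low 8 bits. Here is a minimal sketch of the packing and its inverse, assuming primitive_types::U256 stands in for the U256 used in the diff; the helper names are illustrative, not from the repo:

```rust
use primitive_types::U256;

/// Mirror of `(transaction_index << 8) + program_index` from the diff.
/// Reserving 8 bits for the inner index assumes at most 256 program
/// instructions per transaction.
fn pack_log_index(transaction_index: usize, program_index: u8) -> U256 {
    U256::from(((transaction_index << 8) + program_index as usize) as u64)
}

/// Recover both components; handy when reasoning about uniqueness questions
/// like the one raised above. (Illustrative helper, not part of the PR.)
fn unpack_log_index(log_index: U256) -> (u64, u8) {
    let raw = log_index.as_u64(); // sketch only: panics if the value exceeds u64
    (raw >> 8, (raw & 0xff) as u8)
}

fn main() {
    // Third transaction in the block, second instruction inside it.
    let idx = pack_log_index(2, 1);
    assert_eq!(unpack_log_index(idx), (2, 1));
}
```

In particular, any change to this packing changes every derived log index, which is why re-indexing an already-scraped payment under a different formula would look like a brand-new payment.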

@@ -448,19 +448,19 @@ fn main() -> ExitCode {

Collaborator

Should we add some invariants to the e2e test to make sure that these Sealevel events / txs are being indexed, similar to what we do for the EVM ones?
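A hedged sketch of what such an invariant could look like; the function, its parameters, and the idea of comparing dispatched counts against scraped counts are illustrative, not the repo's actual e2e API:

```rust
use eyre::{ensure, Result};

/// Hypothetical e2e invariant: after the run, the scraper's database should
/// hold exactly as many Sealevel messages and gas payments as the test
/// dispatched. In practice the counts would come from querying the scraper's
/// SQL db and from the test harness; all names here are made up.
fn check_sealevel_indexing_invariants(
    scraped_messages: u32,
    dispatched_messages: u32,
    scraped_gas_payments: u32,
    expected_gas_payments: u32,
) -> Result<()> {
    ensure!(
        scraped_messages == dispatched_messages,
        "scraper indexed {scraped_messages} Sealevel messages, expected {dispatched_messages}"
    );
    ensure!(
        scraped_gas_payments == expected_gas_payments,
        "scraper indexed {scraped_gas_payments} Sealevel gas payments, expected {expected_gas_payments}"
    );
    Ok(())
}
```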

@@ -462,6 +462,22 @@ const DOMAINS: &[RawDomain] = &[
```rust
is_test_net: true,
is_deprecated: false,
},
```
Collaborator

Something I'm wondering: is the scraper indexing the SVM IGP payments and deliveries in a sequence-aware fashion, or using the rate-limited / watermark approach?

In the relayer, SVM IGP payments are indexed in a sequence-aware manner, but the other event types use the rate-limited / watermark approach. We'll want to do the same in the scraper, otherwise I'd expect bugs to occur.

For reference, this is how we do it in the relayer. We create ContractSyncs here:

```rust
let interchain_gas_payment_syncs = settings
```

which calls:
```rust
pub async fn contract_syncs<T, D>(
    &self,
    domains: impl Iterator<Item = &HyperlaneDomain>,
    metrics: &CoreMetrics,
    sync_metrics: &ContractSyncMetrics,
    dbs: HashMap<HyperlaneDomain, Arc<D>>,
) -> Result<HashMap<HyperlaneDomain, Arc<dyn ContractSyncer<T>>>>
where
    T: Indexable + Debug + Send + Sync + Clone + Eq + Hash + 'static,
    SequenceIndexer<T>: TryFromWithMetrics<ChainConf>,
    D: HyperlaneLogStore<T>
        + HyperlaneSequenceAwareIndexerStoreReader<T>
        + HyperlaneWatermarkedLogStore<T>
        + 'static,
{
    // TODO: parallelize these calls again
    let mut syncs = vec![];
    for domain in domains {
        let sync = match T::indexing_cursor(domain.domain_protocol()) {
            CursorType::SequenceAware => self
                .sequenced_contract_sync(
                    domain,
                    metrics,
                    sync_metrics,
                    dbs.get(domain).unwrap().clone(),
                )
                .await
                .map(|r| r as Arc<dyn ContractSyncer<T>>)?,
            CursorType::RateLimited => self
                .watermark_contract_sync(
                    domain,
                    metrics,
                    sync_metrics,
                    dbs.get(domain).unwrap().clone(),
                )
                .await
                .map(|r| r as Arc<dyn ContractSyncer<T>>)?,
        };
        syncs.push(sync);
    }
    syncs
        .into_iter()
        .map(|i| Ok((i.domain().clone(), i)))
        .collect()
}
```
This creates a sequenced_contract_sync if the CursorType is SequenceAware (the case for SVM IGP payments), and a watermark_contract_sync otherwise.
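The per-protocol choice comes from T::indexing_cursor. A hedged sketch of what that dispatch looks like for IGP payments, written to mirror the behavior described in this comment rather than to quote hyperlane-core's exact impl (the import paths are assumptions, and the impl is shown as if it lived in the defining crate):

```rust
// Assumed import paths; the real definitions live in hyperlane-core.
use hyperlane_core::{CursorType, HyperlaneDomainProtocol, Indexable, InterchainGasPayment};

impl Indexable for InterchainGasPayment {
    /// Pick the cursor per protocol: Sealevel IGP payments are indexed in a
    /// sequence-aware manner, everything else via the watermark / rate-limited
    /// cursor, matching the relayer behavior described above.
    fn indexing_cursor(protocol: HyperlaneDomainProtocol) -> CursorType {
        match protocol {
            HyperlaneDomainProtocol::Sealevel => CursorType::SequenceAware,
            _ => CursorType::RateLimited,
        }
    }
}
```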

The db used by the sequence-aware cursors also requires it to impl HyperlaneSequenceAwareIndexerStoreReader:

```rust
db: Arc<dyn HyperlaneSequenceAwareIndexerStoreReader<T>>,
```

which, iiuc, is only implemented for the Scraper's SQL db for messages atm:

```rust
impl HyperlaneSequenceAwareIndexerStoreReader<HyperlaneMessage> for HyperlaneSqlDb {
```
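Following that thread, sequence-aware scraping of SVM IGP payments would presumably need an analogous impl for gas payments on the scraper's db. A rough sketch, with the loud caveat that the method names and signatures below are my approximation of the trait rather than the real hyperlane-base definitions, and the queries are elided:

```rust
use async_trait::async_trait;
use eyre::Result;

// Approximated trait shape; see the caveat above.
#[async_trait]
impl HyperlaneSequenceAwareIndexerStoreReader<InterchainGasPayment> for HyperlaneSqlDb {
    /// Look up a stored gas payment by its sequence number.
    async fn retrieve_by_sequence(&self, sequence: u32) -> Result<Option<InterchainGasPayment>> {
        todo!("query the scraper's gas payment table by sequence")
    }

    /// Block number at which the payment with this sequence was logged.
    async fn retrieve_log_block_number_by_sequence(&self, sequence: u32) -> Result<Option<u64>> {
        todo!("join the payment row against its block")
    }
}
```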

Labels: None yet
Projects: Status: In Review
2 participants