feat: Add scraping of Sealevel transactions into E2E tests #4850

Open · wants to merge 4 commits into base: main
@@ -462,6 +462,22 @@ const DOMAINS: &[RawDomain] = &[
is_test_net: true,
is_deprecated: false,
},
Collaborator comment:
Something I'm wondering: is the scraper indexing the SVM IGP payments and deliveries in a sequence-aware fashion, or using the rate-limited / watermark approach?

The relayer indexes SVM IGP payments in a sequence-aware manner, but other event types with the rate-limited / watermark approach. We'll want to do the same in the scraper; otherwise I'd expect bugs to occur.

For reference, this is how we do it in the relayer. We create the ContractSyncs here:

let interchain_gas_payment_syncs = settings

which calls:
pub async fn contract_syncs<T, D>(
    &self,
    domains: impl Iterator<Item = &HyperlaneDomain>,
    metrics: &CoreMetrics,
    sync_metrics: &ContractSyncMetrics,
    dbs: HashMap<HyperlaneDomain, Arc<D>>,
) -> Result<HashMap<HyperlaneDomain, Arc<dyn ContractSyncer<T>>>>
where
    T: Indexable + Debug + Send + Sync + Clone + Eq + Hash + 'static,
    SequenceIndexer<T>: TryFromWithMetrics<ChainConf>,
    D: HyperlaneLogStore<T>
        + HyperlaneSequenceAwareIndexerStoreReader<T>
        + HyperlaneWatermarkedLogStore<T>
        + 'static,
{
    // TODO: parallelize these calls again
    let mut syncs = vec![];
    for domain in domains {
        let sync = match T::indexing_cursor(domain.domain_protocol()) {
            CursorType::SequenceAware => self
                .sequenced_contract_sync(
                    domain,
                    metrics,
                    sync_metrics,
                    dbs.get(domain).unwrap().clone(),
                )
                .await
                .map(|r| r as Arc<dyn ContractSyncer<T>>)?,
            CursorType::RateLimited => self
                .watermark_contract_sync(
                    domain,
                    metrics,
                    sync_metrics,
                    dbs.get(domain).unwrap().clone(),
                )
                .await
                .map(|r| r as Arc<dyn ContractSyncer<T>>)?,
        };
        syncs.push(sync);
    }
    syncs
        .into_iter()
        .map(|i| Ok((i.domain().clone(), i)))
        .collect()
}
This creates a sequenced_contract_sync if the CursorType is SequenceAware (the case for SVM IGP payments), and a watermark_contract_sync otherwise.
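As a concrete sketch of that per-protocol selection (not copied from hyperlane-core; the exact impl and variant spellings are assumptions), the Indexable impl for IGP payments would look something like:

// Minimal sketch, assuming Indexable exposes indexing_cursor keyed on the protocol.
impl Indexable for InterchainGasPayment {
    fn indexing_cursor(protocol: HyperlaneDomainProtocol) -> CursorType {
        match protocol {
            // SVM IGP payments get a sequence-aware cursor...
            HyperlaneDomainProtocol::Sealevel => CursorType::SequenceAware,
            // ...while other protocols fall back to the rate-limited / watermark cursor.
            _ => CursorType::RateLimited,
        }
    }
}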

The db used by the sequence-aware cursors is also required to impl HyperlaneSequenceAwareIndexerStoreReader:

db: Arc<dyn HyperlaneSequenceAwareIndexerStoreReader<T>>,
which, iiuc, is currently only implemented for the scraper's SQL db for messages:
impl HyperlaneSequenceAwareIndexerStoreReader<HyperlaneMessage> for HyperlaneSqlDb {
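So for the scraper to index SVM IGP payments sequence-aware, something along these lines would be needed on its SQL db. This is a rough sketch only: the trait methods are written from memory, and payment_by_sequence / payment_block_number_by_sequence are hypothetical helpers, not existing queries.

#[async_trait]
impl HyperlaneSequenceAwareIndexerStoreReader<InterchainGasPayment> for HyperlaneSqlDb {
    /// Fetch a payment by its IGP sequence number (hypothetical SQL helper).
    async fn retrieve_by_sequence(&self, sequence: u32) -> Result<Option<InterchainGasPayment>> {
        self.payment_by_sequence(sequence).await
    }

    /// Block number at which the payment with this sequence was indexed (hypothetical SQL helper).
    async fn retrieve_log_block_number_by_sequence(&self, sequence: u32) -> Result<Option<u64>> {
        self.payment_block_number_by_sequence(sequence).await
    }
}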

RawDomain {
name: "sealeveltest1",
token: "SOL",
domain: 13375,
chain_id: 0,
is_test_net: true,
is_deprecated: false,
},
RawDomain {
name: "sealeveltest2",
token: "SOL",
domain: 13376,
chain_id: 0,
is_test_net: true,
is_deprecated: false,
},
];

#[derive(DeriveMigrationName)]
22 changes: 13 additions & 9 deletions rust/main/chains/hyperlane-sealevel/src/interchain_gas.rs
@@ -12,7 +12,7 @@ use tracing::{info, instrument};
use hyperlane_core::{
config::StrOrIntParseError, ChainCommunicationError, ChainResult, ContractLocator,
HyperlaneChain, HyperlaneContract, HyperlaneDomain, HyperlaneProvider, Indexed, Indexer,
InterchainGasPaymaster, InterchainGasPayment, LogMeta, SequenceAwareIndexer, H256, U256,
InterchainGasPaymaster, InterchainGasPayment, LogMeta, SequenceAwareIndexer, H256,
};

use crate::account::{search_accounts_by_discriminator, search_and_validate_account};
@@ -150,10 +150,19 @@ impl SealevelInterchainGasPaymasterIndexer {
self.interchain_payment_account(account)
})?;

self.sealevel_gas_payment(sequence_number, &valid_payment_pda_pubkey)
.await
}

async fn sealevel_gas_payment(
&self,
sequence_number: u64,
payment_pda_pubkey: &Pubkey,
) -> ChainResult<SealevelGasPayment> {
// Now that we have the valid gas payment PDA pubkey, we can get the full account data.
let account = self
.rpc_client
.get_account_with_finalized_commitment(&valid_payment_pda_pubkey)
.get_account_with_finalized_commitment(payment_pda_pubkey)
.await?;
let gas_payment_account = GasPaymentAccount::fetch(&mut account.data.as_ref())
.map_err(ChainCommunicationError::from_other)?
@@ -169,11 +178,7 @@ };
};

let log_meta = self
.interchain_payment_log_meta(
U256::from(sequence_number),
&valid_payment_pda_pubkey,
&gas_payment_account.slot,
)
.interchain_payment_log_meta(payment_pda_pubkey, &gas_payment_account.slot)
.await?;

Ok(SealevelGasPayment::new(
@@ -189,14 +194,13 @@

async fn interchain_payment_log_meta(
&self,
log_index: U256,
payment_pda_pubkey: &Pubkey,
payment_pda_slot: &Slot,
) -> ChainResult<LogMeta> {
let block = self.rpc_client.get_block(*payment_pda_slot).await?;

self.log_meta_composer
.log_meta(block, log_index, payment_pda_pubkey, payment_pda_slot)
.log_meta(block, payment_pda_pubkey, payment_pda_slot)
.map_err(Into::<ChainCommunicationError>::into)
}

39 changes: 22 additions & 17 deletions rust/main/chains/hyperlane-sealevel/src/log_meta_composer.rs
@@ -36,7 +36,6 @@ impl LogMetaComposer {
pub fn log_meta(
&self,
block: UiConfirmedBlock,
log_index: U256,
pda_pubkey: &Pubkey,
pda_slot: &Slot,
) -> Result<LogMeta, HyperlaneSealevelError> {
@@ -64,15 +63,17 @@
)))?
}

let (transaction_index, transaction_hash) =
transaction_hashes
.into_iter()
.next()
.ok_or(HyperlaneSealevelError::NoTransactions(format!(
let (transaction_index, transaction_hash, program_index) = transaction_hashes
.into_iter()
.next()
.ok_or(HyperlaneSealevelError::NoTransactions(format!(
"block which should contain {} transaction does not contain any after filtering",
self.transaction_description,
)))?;

// Construct log index which will be increasing relative to block
Collaborator comment:
Fwiw, I view us as being able to define this however we want per VM, and I don't think we necessarily need to replicate the EVM behavior here; happy with it as is.

let log_index = U256::from((transaction_index << 8) + (program_index as usize));
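A quick worked check of that packing with made-up values (transaction index 3 in the block, program index 2): the transaction's position fills the bits above the low byte and the program index fills the low byte, so the value increases across a block's transactions.

fn main() {
    let transaction_index: usize = 3;
    let program_index: u8 = 2;
    let log_index = (transaction_index << 8) + (program_index as usize);
    assert_eq!(log_index, 770); // 3 * 256 + 2, i.e. 0x0302
    println!("log_index = {log_index}");
}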
Collaborator comment:
Noting that because the log index is used to determine the "uniqueness" of an IGP payment, changing this may have some consequences. If we re-index a payment, we'll consider it two separate payments because the log index is different.

What I'm not sure about is whether we'd actually expect to double-process any gas payments. On the relayer/validator side I'm pretty sure this won't happen, but I'm unsure about the scraper.
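To make the concern concrete with a self-contained toy (not the scraper's actual dedup logic, which I haven't checked): if stored payments are deduplicated by (transaction hash, log index), re-indexing the same payment under a different log index yields a second, distinct entry.

use std::collections::HashSet;

fn main() {
    // Toy dedup key: (tx hash, log index); stand-ins for the real H512 / U256 types.
    let mut seen: HashSet<(&str, u64)> = HashSet::new();

    // First indexing pass under the old log index scheme.
    seen.insert(("0xabc", 7));

    // Re-indexing the same payment under the new (transaction_index << 8) + program_index scheme.
    let inserted_again = seen.insert(("0xabc", (3 << 8) + 2));

    // The same on-chain payment is now counted twice.
    assert!(inserted_again);
    assert_eq!(seen.len(), 2);
}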


let log_meta = LogMeta {
address: self.program_id.to_bytes().into(),
block_number: *pda_slot,
@@ -120,7 +121,8 @@ pub fn is_interchain_payment_instruction(instruction_data: &[u8]) -> bool {
}

/// This function searches for relevant transactions in the vector of provided transactions and
/// returns the relative index and hashes of such transactions.
/// returns the relative index and hashes of such transactions, together with the index of the
/// relevant instruction.
///
/// This function takes a program identifier and the identifier for PDA and searches transactions
/// which act upon this program and the PDA.
@@ -144,16 +146,19 @@ fn search_transactions(
program_id: &Pubkey,
pda_pubkey: &Pubkey,
is_specified_instruction: fn(&[u8]) -> bool,
) -> Vec<(usize, H512)> {
) -> Vec<(usize, H512, u8)> {
transactions
.into_iter()
.enumerate()
.filter_map(|(index, tx)| filter_by_encoding(tx).map(|(tx, meta)| (index, tx, meta)))
.filter_map(|(index, tx, meta)| {
filter_by_validity(tx, meta)
.map(|(hash, account_keys, instructions)| (index, hash, account_keys, instructions))
.filter_map(|(txn_index, tx)| {
filter_by_encoding(tx).map(|(tx, meta)| (txn_index, tx, meta))
})
.filter_map(|(txn_index, tx, meta)| {
filter_by_validity(tx, meta).map(|(hash, account_keys, instructions)| {
(txn_index, hash, account_keys, instructions)
})
})
.filter_map(|(index, hash, account_keys, instructions)| {
.filter_map(|(txn_index, hash, account_keys, instructions)| {
filter_by_relevancy(
program_id,
pda_pubkey,
Expand All @@ -162,9 +167,9 @@ fn search_transactions(
instructions,
is_specified_instruction,
)
.map(|hash| (index, hash))
.map(|(hash, program_index)| (txn_index, hash, program_index))
})
.collect::<Vec<(usize, H512)>>()
.collect::<Vec<(usize, H512, u8)>>()
}

fn filter_by_relevancy(
Expand All @@ -174,7 +179,7 @@ fn filter_by_relevancy(
account_keys: Vec<String>,
instructions: Vec<UiCompiledInstruction>,
is_specified_instruction: fn(&[u8]) -> bool,
) -> Option<H512> {
) -> Option<(H512, u8)> {
let account_index_map = account_index_map(account_keys);

let program_id_str = program_id.to_string();
Expand Down Expand Up @@ -213,7 +218,7 @@ fn filter_by_relevancy(
return None;
}

Some(hash)
Some((hash, program.program_id_index))
}

fn filter_by_validity(
18 changes: 2 additions & 16 deletions rust/main/chains/hyperlane-sealevel/src/mailbox.rs
@@ -718,7 +718,6 @@ impl SealevelMailboxIndexer {

let log_meta = self
.dispatch_message_log_meta(
U256::from(nonce),
&valid_message_storage_pda_pubkey,
&dispatched_message_account.slot,
)
@@ -743,7 +742,6 @@

async fn dispatch_message_log_meta(
&self,
log_index: U256,
message_storage_pda_pubkey: &Pubkey,
message_account_slot: &Slot,
) -> ChainResult<LogMeta> {
@@ -755,12 +753,7 @@
.await?;

self.dispatch_message_log_meta_composer
.log_meta(
block,
log_index,
message_storage_pda_pubkey,
message_account_slot,
)
.log_meta(block, message_storage_pda_pubkey, message_account_slot)
.map_err(Into::<ChainCommunicationError>::into)
}

@@ -800,7 +793,6 @@ impl SealevelMailboxIndexer {

let log_meta = self
.delivered_message_log_meta(
U256::from(nonce),
&valid_message_storage_pda_pubkey,
&delivered_message_account.slot,
)
@@ -823,7 +815,6 @@

async fn delivered_message_log_meta(
&self,
log_index: U256,
message_storage_pda_pubkey: &Pubkey,
message_account_slot: &Slot,
) -> ChainResult<LogMeta> {
@@ -835,12 +826,7 @@
.await?;

self.delivery_message_log_meta_composer
.log_meta(
block,
log_index,
message_storage_pda_pubkey,
message_account_slot,
)
.log_meta(block, message_storage_pda_pubkey, message_account_slot)
.map_err(Into::<ChainCommunicationError>::into)
}
}
9 changes: 5 additions & 4 deletions rust/main/hyperlane-core/src/chain.rs
@@ -169,6 +169,7 @@ pub enum KnownHyperlaneDomain {
Sanko = 1996,
Sei = 1329,
SolanaMainnet = 1399811149,
Stride = 745,
Taiko = 167000,
Tangle = 5845,
Viction = 88,
@@ -319,8 +320,8 @@ impl KnownHyperlaneDomain {
DegenChain, EclipseMainnet, Endurance, Ethereum, Fraxtal, FuseMainnet, Gnosis,
InEvm, Injective, Kroma, Linea, Lisk, Lukso, MantaPacific, Mantle, Merlin,
Metis, Mint, Mode, Moonbeam, Neutron, Optimism, Osmosis, Polygon, ProofOfPlay,
ReAl, Redstone, Sanko, Sei, SolanaMainnet, Taiko, Tangle, Viction, Worldchain, Xai,
Xlayer, Zetachain, Zircuit, ZoraMainnet,
ReAl, Redstone, Sanko, Sei, SolanaMainnet, Stride, Taiko, Tangle, Viction,
Worldchain, Xai, Xlayer, Zetachain, Zircuit, ZoraMainnet,
],
Testnet: [
Alfajores, BinanceSmartChainTestnet, Chiado, ConnextSepolia, Fuji, Holesky, MoonbaseAlpha,
@@ -355,7 +356,7 @@ impl KnownHyperlaneDomain {
HyperlaneDomainProtocol::Fuel: [FuelTest1],
HyperlaneDomainProtocol::Sealevel: [EclipseMainnet, SolanaMainnet, SealevelTest1, SealevelTest2],
HyperlaneDomainProtocol::Cosmos: [
Injective, Neutron, Osmosis,
Injective, Neutron, Stride, Osmosis,

// Local chains
CosmosTest99990, CosmosTest99991,
@@ -387,7 +388,7 @@ impl KnownHyperlaneDomain {
HyperlaneDomainTechnicalStack::Other: [
Avalanche, BinanceSmartChain, Celo, EclipseMainnet, Endurance, Ethereum,
FuseMainnet, Gnosis, Injective, Linea, Lukso, Neutron, Osmosis, Polygon,
Sei, SolanaMainnet, Taiko, Viction, Zetachain,
Sei, SolanaMainnet, Stride, Taiko, Viction, Zetachain,

// Local chains
CosmosTest99990, CosmosTest99991, FuelTest1, SealevelTest1, SealevelTest2, Test1,
8 changes: 4 additions & 4 deletions rust/main/utils/run-locally/src/main.rs
@@ -448,19 +448,19 @@ fn main() -> ExitCode {

Collaborator comment:
Should we add some invariants to the E2E test to make sure that these Sealevel events / txs are being indexed, similar to what we do for the EVM equivalents?
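One rough shape such an invariant could take, as a sketch only: fetch_scraper_stored_events is an assumed helper (standing in for however the harness reads the scraper's stored-events metric), the label strings are illustrative, and SOL_MESSAGES_EXPECTED is the existing constant used later in this file.

// Hypothetical invariant: after the agents settle, the scraper should have stored
// the expected number of Sealevel message dispatches and gas payments.
fn sealevel_scraper_invariants_met() -> eyre::Result<bool> {
    let dispatched = fetch_scraper_stored_events("message_dispatch", "sealeveltest1")?;
    let gas_payments = fetch_scraper_stored_events("gas_payment", "sealeveltest1")?;
    Ok(dispatched >= SOL_MESSAGES_EXPECTED as u32 && gas_payments >= SOL_MESSAGES_EXPECTED as u32)
}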

state.push_agent(relayer_env.spawn("RLY", Some(&AGENT_LOGGING_DIR)));

log!("Setup complete! Agents running in background...");
log!("Ctrl+C to end execution...");

if let Some((solana_config_path, (_, solana_path))) =
solana_config_path.clone().zip(solana_paths.clone())
{
// Send some sealevel messages before spinning up the agents, to test the backward indexing cursor
// Send some sealevel messages after spinning up the agents, to test the backward indexing cursor
for _i in 0..(SOL_MESSAGES_EXPECTED / 2) {
initiate_solana_hyperlane_transfer(solana_path.clone(), solana_config_path.clone())
.join();
}
}

log!("Setup complete! Agents running in background...");
log!("Ctrl+C to end execution...");

// Send half the kathy messages after the relayer comes up
kathy_env_double_insertion.clone().run().join();
kathy_env_zero_insertion.clone().run().join();