
Commit dfafeec

committed
testing processor
1 parent 260f684 commit dfafeec

File tree

3 files changed: +288 -0 lines changed

apps/nextra/next.config.mjs (+7)

```diff
@@ -467,6 +467,13 @@ export default withBundleAnalyzer(
         "/en/build/indexer/indexer-sdk/documentation/advanced-tutorials/txn-script",
       permanent: true,
     },
+    {
+      source:
+        "/indexer/indexer-sdk/documentation/advanced-tutorials/processor-test",
+      destination:
+        "/en/build/indexer/indexer-sdk/documentation/advanced-tutorials/processor-test",
+      permanent: true,
+    },
     {
       source: "/indexer/txn-stream/labs-hosted",
       destination: "/en/build/indexer/api/labs-hosted",
```

apps/nextra/pages/en/build/indexer/indexer-sdk/documentation/advanced-tutorials/_meta.tsx (+3)

```diff
@@ -5,4 +5,7 @@ export default {
   "txn-script": {
     title: "Generating Transactions with Move Scripts",
   },
+  "processor-test": {
+    title: "Testing Your Processor",
+  },
 };
```
apps/nextra/pages/en/build/indexer/indexer-sdk/documentation/advanced-tutorials/processor-test.mdx (new file, +278 lines)

---
title: "Testing Processor"
---

import { Callout } from "nextra/components"

# Overview

### What Is a Processor?

A processor is a core component of the Aptos Indexer that handles blockchain transaction processing. It validates, transforms, and stores transactions into a database, enabling downstream applications like analytics, indexing, and querying. Testing the processor ensures that all transactions are correctly handled, maintaining data accuracy and consistency.

### What Are We Testing?

- **Transaction correctness**: Ensure that each transaction is processed and stored accurately.
- **Schema consistency**: Verify that the database schema is correctly set up and maintained throughout the tests.

### General Flow of How Processor Testing Works

1. Prepare testing transactions (see the preceding tutorials in this section).
2. Update dependencies as needed.
3. Import new transactions.
4. Write test cases.
5. Generate the expected database output and validate against it.
6. Merge.

## Prerequisites

<Callout>
Key Considerations:
- Each test runs in an isolated environment using a PostgreSQL container to prevent interference.
- Proper handling of versions ensures transactions are processed and validated in the correct order.
- Validation logic must detect changes or issues by comparing processor output with the expected baseline.
</Callout>

1. Ensure Docker is running for PostgreSQL container support.
   - Set up the Docker engine/daemon on your machine.
   - Start Docker if it is not already running.
2. Identify the transactions to test.
   - Use imported transactions or write your own custom Move scripts to generate test transactions. Refer to the Importing Transactions guide and the Generating Transactions with Move Scripts guide for detailed instructions.
3. Import the necessary modules, for example:

```rust
use aptos_indexer_testing_framework::{
    database::{PostgresTestDatabase, TestDatabase},
    sdk_test_context::SdkTestContext,
};
```

## Steps to Write a Test

### 1. Set Up the Test Environment

Before setting up the test environment, it's important to understand the configurations used in this step.

What are these configurations?

- `generate_file_flag`: Determines whether the test runs in "diff mode". In diff mode, the system compares the actual output from the processor with the expected output and highlights any differences. Use this mode when validating changes or debugging discrepancies.
- `custom_output_path`: An optional configuration that specifies a custom path where the expected database output is stored. If not provided, the test uses the default path defined by `DEFAULT_OUTPUT_FOLDER`.
- `DEFAULT_OUTPUT_FOLDER`: A constant defining the default folder where the system stores test output files, for example `"sdk_expected_db_output_files"`. Modify this value in your configuration if you prefer a different default directory.

```rust
let (generate_file_flag, custom_output_path) = get_test_config();
let output_path = custom_output_path
    .unwrap_or_else(|| format!("{}/imported_mainnet_txns", DEFAULT_OUTPUT_FOLDER));

// Set up the test database; replace as needed.
let mut db = PostgresTestDatabase::new();
db.setup().await.unwrap();

// Replace CONST_VARIABLE_OF_YOUR_TEST_TRANSACTION with your test transaction constant.
let mut test_context = SdkTestContext::new(&[CONST_VARIABLE_OF_YOUR_TEST_TRANSACTION]);
if test_context.init_mock_grpc().await.is_err() {
    panic!("Failed to initialize mock grpc");
};
```

Explanation of each component:

- `get_test_config()`: Fetches the test configuration (`generate_file_flag` and `custom_output_path`). Modify or extend this function if you want to support additional custom flags or configurations.
- `output_path`: Combines `DEFAULT_OUTPUT_FOLDER` with the subfolder `imported_mainnet_txns` if no `custom_output_path` is specified. This ensures all output files are stored in a predictable location.
- `PostgresTestDatabase::new()`: Creates a new PostgreSQL database instance for testing. This database is isolated, ensuring no interference with production or other test environments.
- `SdkTestContext::new()`: Initializes the test context with the transaction(s) you want to test. Replace `CONST_VARIABLE_OF_YOUR_TEST_TRANSACTION` with the appropriate variable or constant representing the transaction(s) to be tested.
- `init_mock_grpc()`: Initializes a mock gRPC service for the test. This allows the processor to consume simulated transactions without interacting with live blockchain data.

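For reference, a minimal sketch of what a `get_test_config()` helper might look like is shown below. This is an illustrative assumption, not the framework's actual implementation: it reads the flag and an optional output path from the arguments passed to the test binary.

```rust
use std::env;

// Hypothetical helper: parse test-binary arguments for the `generate-output` flag
// and an optional `output_path=<dir>` argument. Adjust to match your own setup.
pub fn get_test_config() -> (bool, Option<String>) {
    let args: Vec<String> = env::args().collect();
    let generate_file_flag = args.iter().any(|arg| arg == "generate-output");
    let custom_output_path = args
        .iter()
        .find_map(|arg| arg.strip_prefix("output_path=").map(String::from));
    (generate_file_flag, custom_output_path)
}
```
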
### 2. Configure the Processor

```rust
let db_url = db.get_db_url();
let transaction_stream_config = test_context.create_transaction_stream_config();
let postgres_config = PostgresConfig {
    connection_string: db_url.to_string(),
    db_pool_size: 100,
};

let db_config = DbConfig::PostgresConfig(postgres_config);
let default_processor_config = DefaultProcessorConfig {
    per_table_chunk_sizes: AHashMap::new(),
    channel_size: 100,
    deprecated_tables: HashSet::new(),
};

let processor_config = ProcessorConfig::DefaultProcessor(default_processor_config);
let processor_name = processor_config.name();
```

### 3. Create the Processor

```rust
let processor = DefaultProcessor::new(indexer_processor_config)
    .await
    .expect("Failed to create processor");
```

Note: Replace `DefaultProcessor` with the processor you are testing.

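The `indexer_processor_config` passed to `DefaultProcessor::new` is built from the values created in step 2. As a rough, hypothetical sketch only (the exact fields of `IndexerProcessorConfig` may differ between SDK versions, so check your dependency), it could be assembled like this:

```rust
// Hypothetical sketch: field names are assumptions and may not match your SDK version.
let indexer_processor_config = IndexerProcessorConfig {
    processor_config,           // ProcessorConfig::DefaultProcessor(...) from step 2
    transaction_stream_config,  // from test_context.create_transaction_stream_config()
    db_config,                  // DbConfig::PostgresConfig(...) from step 2
    backfill_config: None,      // assumed optional field; omit if not present
};
```
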
### 4. Set Up a Query

Set up a query to load data from the local database and compare it with the expected results; see this [example loading function](https://github.com/aptos-labs/aptos-indexer-processors/blob/a8f9c5915f4e3f1f596ed3412b8eb01feca1aa7b/rust/integration-tests/src/diff_test_helper/default_processor.rs#L45).

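As a rough illustration of the shape of such a function, here is a minimal sketch using Diesel. The `events` schema module, `Event` model, and column names are placeholder assumptions for your own processor's tables; the linked helper above loads several tables and returns them keyed by table name.

```rust
use std::collections::HashMap;

use anyhow::Result;
use diesel::prelude::*;
use serde_json::Value;

// Placeholder schema and model; replace with the tables your processor writes to.
use crate::models::events_models::Event;
use crate::schema::events;

/// Load the rows written for the given transaction versions, keyed by table name and
/// serialized to JSON so they can be diffed against the expected output files.
pub fn load_data(conn: &mut PgConnection, txn_versions: Vec<i64>) -> Result<HashMap<String, Value>> {
    let mut result = HashMap::new();

    let events = events::table
        .filter(events::transaction_version.eq_any(&txn_versions))
        .order_by(events::transaction_version.asc())
        .load::<Event>(conn)?;
    result.insert("events".to_string(), serde_json::to_value(&events)?);

    Ok(result)
}
```
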
### 5. Set Up a Test Context `run` Function

Use the `test_context.run()` function to execute the processor, validate its output using your query, and optionally generate database output files:

```rust
let txn_versions: Vec<i64> = test_context
    .get_test_transaction_versions()
    .into_iter()
    .map(|v| v as i64)
    .collect();

let db_values = test_context
    .run(
        &processor,
        generate_file_flag,
        output_path.clone(),
        custom_file_name,
        move || {
            let mut conn = PgConnection::establish(&db_url).unwrap_or_else(|e| {
                eprintln!("[ERROR] Failed to establish DB connection: {:?}", e);
                panic!("Failed to establish DB connection: {:?}", e);
            });

            let db_values = match load_data(&mut conn, txn_versions.clone()) {
                Ok(db_data) => db_data,
                Err(e) => {
                    eprintln!("[ERROR] Failed to load data {}", e);
                    return Err(e);
                },
            };

            if db_values.is_empty() {
                eprintln!("[WARNING] No data found for versions: {:?}", txn_versions);
            }

            Ok(db_values)
        },
    )
    .await; // run() is async, so the call must be awaited
```

### 6. Run the Processor Test

Once your test is ready, run the following command to generate the expected output used for validation:

```bash
cargo test sdk_tests -- generate-output
```

Arguments:

- `generate-output`: A custom flag indicating that expected outputs should be generated.
- `output_path`: An optional argument specifying where the database output files are written.

The expected database output is saved to the specified `output_path`, or to `sdk_expected_db_output_files` by default.

---

## FAQ

### What Types of Tests Does It Support?

- Database schema output diff.

### What Is `TestContext`?

`TestContext` is a struct that manages:

- `transaction_batches`: A collection of transaction batches.
- `postgres_container`: A PostgreSQL container for test isolation.

It initializes and manages the database and transaction context for the tests.

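A simplified sketch of the struct, based only on the two fields described above (the actual definition in the testing framework, where it appears as `SdkTestContext`, may hold additional state):

```rust
// Simplified sketch for illustration; field types are assumptions.
pub struct SdkTestContext {
    /// Batches of transactions the processor consumes during the test.
    transaction_batches: Vec<Transaction>,
    /// Dockerized PostgreSQL instance, giving each test an isolated database.
    postgres_container: PostgresTestDatabase,
}
```
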
#### What Does `TestContext.run` Do?

This function executes the processor, applies validation logic, and optionally generates output files.

#### Key Features:

- Flexible validation: Accepts a user-provided verification function.
- Multi-table support: Handles data across multiple tables.
- Retries: Uses exponential backoff with a timeout.
- Optional file generation: Controlled by a flag.

#### Example Usage:

```rust
pub async fn run<F>(
    &mut self,
    processor: &impl ProcessorTrait,
    txn_version: u64,
    generate_files: bool,             // Flag to control file generation
    output_path: String,              // Output path
    custom_file_name: Option<String>, // Custom file name
    verification_f: F,                // Verification function
) -> anyhow::Result<HashMap<String, Value>>
where
```

### How to Generate Expected DB Output?

Run the following command:

```bash
cargo test sdk_tests -- --nocapture generate-output
```

Supported test args:

1. `generate-output`
2. `output_path`

---

## Troubleshooting and Tips

1. **Isolate Tests**: Use Docker containers for database isolation.
2. **Handle Non-Deterministic Fields**: Use helpers like `remove_inserted_at` to clean up timestamps before validation (see the sketch after this list).
3. **Enable Debugging**: Use `eprintln!` for detailed error logging.

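A minimal sketch of such a helper, assuming the loaded rows are held as `serde_json::Value` (as in the loading-function example above); the helper actually used by your tests may differ:

```rust
use serde_json::Value;

/// Recursively remove the non-deterministic `inserted_at` column so that runs
/// executed at different times still produce identical expected output.
pub fn remove_inserted_at(value: &mut Value) {
    match value {
        Value::Object(map) => {
            map.remove("inserted_at");
            for v in map.values_mut() {
                remove_inserted_at(v);
            }
        },
        Value::Array(items) => {
            for item in items.iter_mut() {
                remove_inserted_at(item);
            }
        },
        _ => {},
    }
}
```
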
#### How to Debug Test Failures?

Run the following command to get detailed logs:

```bash
cargo test sdk_tests -- --nocapture
```

## Additional Notes

- **Adapting to Other Databases**:
  - Replace the PostgreSQL-specific code with code for the database you intend to use (e.g., MySQL).
  - Update the schema initialization and query methods accordingly.
- **Docker Installation**: Follow the [Docker setup guide](https://docs.docker.com/get-docker/).
- **Referencing Existing Tests**:
  - Example: [Event Processor Tests](https://github.com/aptos-labs/aptos-indexer-processors/blob/main/rust/integration-tests/src/sdk_tests/events_processor_tests.rs#L139).