
add ml processor for offline batch inference #3417
Triggered via pull request March 6, 2025 23:10
Status: Success
Total duration: 4m 57s
Artifacts: 8
Matrix: integration-tests
Publish Unit Tests Results (9s)

Annotations

1 warning
Publish Unit Tests Results:
This action is running on a pull_request event for a fork repository. It cannot do anything useful like creating check runs or pull request comments. To run the action on fork repository pull requests, see https://github.com/EnricoMi/publish-unit-test-result-action/blob/v1.20/README.md#support-fork-repositories-and-dependabot-branches

Artifacts

Produced during runtime
Name                                                                  Size
data-prepper-opensearch-integration-tests-opendistro-0.10.0-java-11  6.59 KB
data-prepper-opensearch-integration-tests-opendistro-1.11.0-java-11  7.76 KB
data-prepper-opensearch-integration-tests-opendistro-1.12.0-java-11  7.78 KB
data-prepper-opensearch-integration-tests-opendistro-1.13.3-java-11  7.79 KB
data-prepper-opensearch-integration-tests-opendistro-1.3.0-java-11   7.62 KB
data-prepper-opensearch-integration-tests-opendistro-1.6.0-java-11   7.61 KB
data-prepper-opensearch-integration-tests-opendistro-1.8.0-java-11   7.64 KB
data-prepper-opensearch-integration-tests-opendistro-1.9.0-java-11   7.77 KB