- Airflow: support SQLExecuteQueryOperator #1379 @JDarDagran
- Spark: Databricks improvements to send better events #1330 @pawel-big-lebowski
  Filters unwanted events and provides a meaningful job name.
- Python: validate eventTime field in Python client #1355 @JDarDagran
  Validates the eventTime of a RunEvent within the client library (see the sketch below).
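For context, a minimal sketch of emitting a RunEvent with a valid eventTime through the Python client; the backend URL, namespace, and job name are illustrative placeholders, not values from this release.

```python
from datetime import datetime, timezone
from uuid import uuid4

from openlineage.client import OpenLineageClient
from openlineage.client.run import Job, Run, RunEvent, RunState

# Illustrative endpoint; any OpenLineage-compatible backend works here.
client = OpenLineageClient(url="http://localhost:5000")

event = RunEvent(
    eventType=RunState.START,
    # eventTime must be a valid ISO-8601 timestamp; the client now validates this field.
    eventTime=datetime.now(timezone.utc).isoformat(),
    run=Run(runId=str(uuid4())),
    job=Job(namespace="example-namespace", name="example-job"),
    producer="https://github.com/OpenLineage/OpenLineage/tree/main/client/python",
)
client.emit(event)
```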
0.17.0 - 2022-11-16
- Spark: support latest Spark 3.3.1 #1183 @pawel-big-lebowski
  Adds support for the latest Spark 3.3.1 version.
- Spark: add Kinesis Transport and support config Kinesis in Spark integration #1200 @yogayang
  Adds support for sending events to Kinesis from the Spark integration.
- Spark: disable specified facets #1271 @pawel-big-lebowski
  Adds the ability to disable specified facets from generated OpenLineage events (see the sketch at the end of this section).
- Python: add facets implementation to Python client #1233 @pawel-big-lebowski
  Adds missing facets to the Python client.
- SQL: add Rust parser interface #1172 @StarostaGit @mobuchowski
  Implements a Java interface in the Rust SQL parser, including a build script, a native library loading mechanism, CI support, and build fixes.
- Proxy: add helm chart for the proxy backend #1068 @wslulciuc
  Adds a Helm chart for deploying the proxy backend on Kubernetes.
- Spec: include possible facets usage in spec #1249 @pawel-big-lebowski
  Extends the `facets` definition with a list of available facets.
- Website: publish YML version of spec to website #1300 @rossturk
  Adds the configuration necessary to make the OpenLineage website auto-generate OpenAPI docs when the spec is published there.
- Docs: update language on nominating new committers #1270 @rossturk
  Updates the governance language to reflect the new policy on nominating committers.
- Website: publish spec into new website repo location #1295 @rossturk
  Creates a new deploy key, adds it to CircleCI & GitHub, and makes the necessary changes to the `release.sh` script.
- Airflow: change how pip installs packages in tox environments #1302 @JDarDagran
  Uses the deprecated resolver and constraints files provided by Airflow to avoid potential issues caused by pip's new resolver.
- Airflow: fix README for running integration test #1238 @sekikn
  Updates the README for consistency with supported Airflow versions.
- Airflow: add task_instance argument to get_openlineage_facets_on_complete #1269 @JDarDagran
  Adds the `task_instance` argument to `DefaultExtractor`.
- Java client: fix up all artifactory paths #1290 @harels
  Not all artifactory paths were changed in the build CI script in a previous PR.
- Python client: fix Mypy errors and adjust to PEP 484 #1264 @JDarDagran
  Adds a `--no-namespace-packages` argument to the Mypy command and adjusts the code to PEP 484.
- Website: release all specs since last_spec_commit_id, not just HEAD~1 #1298 @rossturk
  The script now ships all specs that have changed since `.last_spec_commit_id`.
- Deprecate HttpTransport.Builder in favor of HttpConfig #1287 @collado-mike
  Deprecates the Builder in favor of HttpConfig only and replaces the existing Builder implementation by delegating to HttpConfig.
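To make the facet-disabling option above concrete, here is a rough PySpark sketch; the `spark.openlineage.facets.disabled` property name, the facet names, the package coordinates, and the endpoint URL are assumptions for illustration, so check the Spark integration docs for the exact values.

```python
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("openlineage-facets-example")
    # Placeholder artifact version and endpoint; adjust to your deployment.
    .config("spark.jars.packages", "io.openlineage:openlineage-spark:0.17.0")
    .config("spark.extraListeners", "io.openlineage.spark.agent.OpenLineageSparkListener")
    .config("spark.openlineage.url", "http://localhost:5000")
    # Facets listed here are omitted from the generated OpenLineage events.
    .config("spark.openlineage.facets.disabled", "[spark_unknown;spark.logicalPlan]")
    .getOrCreate()
)
```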
0.16.1 - 2022-11-03
- Airflow: add dag_run information to Airflow version run facet #1133 @fm100
  Adds the Airflow DAG run ID to the `taskInfo` facet, making this additional information available to the integration.
- Airflow: add LoggingMixin to extractors #1149 @JDarDagran
  Adds a `LoggingMixin` class to the custom extractor to make the output consistent with general Airflow and OpenLineage logging settings.
- Airflow: add default extractor #1162 @mobuchowski
  Adds a `DefaultExtractor` to support the default implementation of OpenLineage for external operators without the need for custom extractors.
- Airflow: add on_complete argument in DefaultExtractor #1188 @JDarDagran
  Adds support for running another method on `extract_on_complete` (see the sketch at the end of this section).
- SQL: reorganize the library into multiple packages #1167 @StarostaGit @mobuchowski
  Splits the SQL library into a Rust implementation and foreign language bindings, easing the process of adding language interfaces. Also contains a CI fix.
- Airflow: move get_connection_uri as extractor's classmethod #1169 @JDarDagran
  The `get_connection_uri` method allowed too many params, resulting in unnecessarily long URIs. This changes the logic to whitelisting per extractor.
- Airflow: change get_openlineage_facets_on_start/complete behavior #1201 @JDarDagran
  Splits up the method for greater legibility and easier maintenance.
- Airflow: always send SQL in SqlJobFacet as a string #1143 @mobuchowski
  Changes the data type of `query` from array to string to fix an error in the `RedshiftSQLOperator`.
- Airflow: include __extra__ case when filtering URI query params #1144 @JDarDagran
  Includes the `conn.EXTRA_KEY` in the `get_connection_uri` method to avoid exposing secrets in URIs via the `__extra__` key.
- Airflow: enforce column casing in SQLCheckExtractors #1159 @denimalpaca
  Uses the parent extractor's `_is_uppercase_names` property to determine if the column should be upper-cased in the `SQLColumnCheckExtractor`'s `_get_input_facets()` method.
- Spark: prevent exception when no schema provided #1180 @pawel-big-lebowski
  Prevents evaluation of column lineage when the `schemaFacet` is `null`.
- Great Expectations: add V3 API compatibility #1194 @denimalpaca
  Fixes the Pandas datasource to make it V3 API-compatible.
- Airflow: remove support for Airflow 1.10 #1128 @mobuchowski
  Removes the code structures and tests enabling support for Airflow 1.10.
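To show how the default extractor and the new on_complete hook above fit together, here is a rough sketch of a custom operator that exposes lineage directly, so no dedicated extractor is needed; the operator, namespaces, and table names are invented, and the `OperatorLineage` import path is an assumption about the openlineage-airflow package layout.

```python
from airflow.models import BaseOperator
from openlineage.airflow.extractors.base import OperatorLineage
from openlineage.client.run import Dataset


class MyOperator(BaseOperator):
    """Illustrative operator that DefaultExtractor can handle without a custom extractor."""

    def execute(self, context):
        ...  # the operator's actual work goes here

    def get_openlineage_facets_on_start(self) -> OperatorLineage:
        return OperatorLineage(
            inputs=[Dataset(namespace="example://warehouse", name="schema.source_table")],
            outputs=[Dataset(namespace="example://warehouse", name="schema.target_table")],
        )

    def get_openlineage_facets_on_complete(self, task_instance) -> OperatorLineage:
        # Called after execution; task_instance exposes runtime state if needed.
        return self.get_openlineage_facets_on_start()
```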
0.15.1 - 2022-10-05
- Airflow: improve development experience #1101 @JDarDagran
  Adds an interactive development environment to the Airflow integration and improves integration testing.
- Spark: add description for URL parameters in readme, change overwriteName to appName #1130 @tnazarew
  Adds more information about passing arguments with `spark.openlineage.url` and changes `overwriteName` to `appName` for clarity.
- Documentation: update issue templates for proposal & add new integration template #1116 @rossturk
  Adds a YAML issue template for new integrations and fixes a bug in the proposal template.
- Airflow: lazy load BigQuery client #1119 @mobuchowski
  Moves the import of the BigQuery client from top level to local level to decrease DAG import time.
- Airflow: fix UUID generation conflict for Airflow DAGs with same name #1056 @collado-mike
  Adds a namespace to the UUID calculation to avoid conflicts caused by DAGs having the same name in different namespaces in Airflow deployments.
- Spark/BigQuery: fix issue with spark-bigquery-connector >=0.25.0 #1111 @pawel-big-lebowski
  Makes the Spark integration compatible with the latest connector.
- Spark: fix column lineage #1069 @pawel-big-lebowski
  Fixes a null pointer exception error and an error when `openlineage.timeout` is not provided.
- Spark: set log level of Init OpenLineageContext to DEBUG #1064 @varuntestaz
  Prevents sensitive information from being logged unless debug mode is used.
- Java client: update version of SnakeYAML #1090 @TheSpeedding
  Bumps the SnakeYAML library version to include a key bug fix.
- dbt: remove requirement for OPENLINEAGE_URL to be set #1107 @mobuchowski
  Removes an erroneous check for `OPENLINEAGE_URL` in the dbt integration.
- Python client: remove potentially cyclic import #1126 @mobuchowski
  Hides imports to remove a potentially cyclic import.
- CI: build macos release package on medium resource class #1131 @mobuchowski
  Fixes a failing build due to the resource class being too large.
0.14.1 - 2022-09-07
- Fix Spark integration issues including error when no openlineage.timeout is set #1069 @pawel-big-lebowski
  `OpenlineageSparkListener` was failing when no `openlineage.timeout` was provided.
0.14.0 - 2022-09-06
- Support ABFSS and Hadoop Logical Relation in Column-level lineage #1008 @wjohnson
  Introduces an `extractDatasetIdentifier` that uses similar logic to `InsertIntoHadoopFsRelationVisitor` to pull out the path on the HDFS-compliant file system; tested on ABFSS and DBFS (Databricks FileSystem) to prove that lineage could be extracted using non-SQL commands.
- Add Kusto relation visitor #939 @hmoazam
  Implements a `KustoRelationVisitor` to support lineage for Azure Kusto's Spark connector.
- Add ColumnLevelLineage facet doc #1020 @julienledem
  Adds documentation for the column-level lineage facet.
- Include symlinks dataset facet #935 @pawel-big-lebowski
  Includes the recently introduced `SymlinkDatasetFacet` in generated OpenLineage events.
- Add support for dbt 1.3 beta's metadata changes #1051 @mobuchowski
  Makes projects that are composed of only SQL models work on 1.3 beta (dbt 1.3 renamed the `compiled_sql` field to `compiled_code` to support Python models). Does not provide support for dbt's Python models.
- Support Flink 1.15 #1009 @mzareba382
  Adds support for Flink 1.15.
- Add Redshift dialect to the SQL integration #1066 @mobuchowski
  Adds support for Redshift's SQL dialect in OpenLineage's SQL parser, including quirks such as the use of square brackets in JSON paths (see the sketch at the end of this section). (Note: this does not add support for all of Redshift's custom syntax.)
- Make the timeout configurable in the Spark integration #1050 @tnazarew
  Makes the timeout configurable by the user. (In some cases, the time needed to send events was longer than 5 seconds, which exceeded the timeout value.)
- Add a dialect parameter to Great Expectations SQL parser calls #1049 @collado-mike
  Specifies the dialect name from the SQL engine.
- Fix Delta 2.1.0 with Spark 3.3.0 #1065 @pawel-big-lebowski
  Allows Delta support for Spark 3.3 and fixes potential issues. (The OpenLineage integration for Spark 3.3 was turned on without Delta support, as Delta did not support Spark 3.3 at that time.)
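As a rough illustration of the new dialect support, assuming the openlineage-sql Python bindings expose a `parse()` function that accepts a list of statements and a dialect name (the function signature and the table names here are illustrative, not taken from this release):

```python
from openlineage_sql import parse

# Square brackets in JSON paths are a Redshift-specific quirk the parser now handles.
meta = parse(
    ["INSERT INTO analytics.events_clean SELECT id, payload[0] FROM analytics.events_raw"],
    dialect="redshift",
)
print(meta.in_tables)   # tables read by the statement
print(meta.out_tables)  # tables written by the statement
```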
0.13.1 - 2022-08-25
- Rename all parentRun occurrences to parent in Airflow integration #1037 @fm100
  Changes the `parentRun` property name to `parent` in the Airflow integration to match the spec.
- Do not change task instance during on_running event #1028 @JDarDagran
  Fixes an issue in the Airflow integration with the `on_running` hook, which was changing the `TaskInstance` object along with the `task` attribute.
0.13.0 - 2022-08-22
- Add BigQuery check support #960 @denimalpaca
  Adds logic and support for proper dynamic class inheritance for BigQuery-style operators. (BigQuery's extractor needed additional logic to support the forthcoming `BigQueryColumnCheckOperator` and `BigQueryTableCheckOperator`.)
- Add RUNNING EventType in spec and Python client #972 @mzareba382
  Introduces a `RUNNING` event state in the OpenLineage spec to indicate a running task and adds a `RUNNING` event type in the Python API.
- Use databases & schemas in SQL Extractors #974 @JDarDagran
  Allows the Airflow integration to differentiate between databases and schemas. (There was no notion of databases and schemas when querying and parsing results from `information_schema` tables.)
- Implement Event forwarding feature via HTTP protocol #995 @howardyoo
  Adds `HttpLineageStream` to forward a given OpenLineage event to any HTTP endpoint.
- Introduce SymlinksDatasetFacet to spec #936 @pawel-big-lebowski
  Creates a new facet, the `SymlinksDatasetFacet`, to support the storing of alternative dataset names.
- Add Azure Cosmos Handler to Spark integration #983 @hmoazam
  Defines a new interface, the `RelationHandler`, to support Spark data sources that do not have `TableCatalog`, `Identifier`, or `TableProperties` set, as is the case with the Azure Cosmos DB Spark connector.
- Support OL Datasets in manual lineage inputs/outputs #1015 @conorbev
  Allows Airflow users to create OpenLineage Dataset classes directly in DAGs with no conversion necessary (see the sketch at the end of this section). (Manual lineage definition previously required users to create an `airflow.lineage.entities.Table`, which was then converted to an OpenLineage Dataset.)
- Create ownership facets #996 @julienledem
  Adds an ownership facet to both Dataset and Job in the OpenLineage spec to capture ownership of jobs and datasets.
- Use RUNNING EventType in Flink integration for currently running jobs #985 @mzareba382
  Makes use of the new `RUNNING` event type in the Flink integration, changing events sent by Flink jobs from `OTHER` to this new type.
- Convert task objects to JSON-encodable objects when creating custom Airflow version facets #1018 @fm100
  Implements a `to_json_encodable` function in the Airflow integration to make task objects JSON-encodable.
- Add support for custom SQL queries in v3 Great Expectations API #1025 @collado-mike
  Fixes support for custom SQL statements in the Great Expectations provider. (The Great Expectations custom SQL datasource was not applied to the support for the V3 checkpoints API.)
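To illustrate the manual lineage change above, a rough sketch of declaring OpenLineage Datasets directly on a task's inlets and outlets; the DAG, namespace, and table names are invented for the example.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator
from openlineage.client.run import Dataset

with DAG(dag_id="manual_lineage_example", start_date=datetime(2022, 8, 1), schedule_interval=None) as dag:
    copy_data = BashOperator(
        task_id="copy_data",
        bash_command="echo 'copying data'",
        # OpenLineage Dataset objects can now be passed directly, with no
        # conversion from airflow.lineage.entities.Table required.
        inlets=[Dataset(namespace="example://warehouse", name="schema.source_table")],
        outlets=[Dataset(namespace="example://warehouse", name="schema.target_table")],
    )
```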
0.12.0 - 2022-08-01
- Add Spark 3.3.0 support #950 @pawel-big-lebowski
- Add Apache Flink integration #951 @mobuchowski
- Add ability to extend column level lineage mechanism #922 @pawel-big-lebowski
- Add ErrorMessageRunFacet #897 @mobuchowski
- Add SQLCheckExtractors #717 @denimalpaca
- Add RedshiftSQLExtractor & RedshiftDataExtractor #930 @JDarDagran
- Add dataset builder for AlterTableCommand #927 @tnazarew
- Limit Delta events #905 @pawel-big-lebowski
- Airflow integration: allow lineage metadata to flow through inlets and outlets #914 @fenil25
- Limit size of serialized plan #917 @pawel-big-lebowski
- Fix noclassdef error #942 @pawel-big-lebowski
0.11.0 - 2022-07-07
- HTTP option to override timeout and properly close connections in `openlineage-java` lib #909 @mobuchowski
- Dynamic mapped tasks support to Airflow integration #906 @JDarDagran
- `SqlExtractor` to Airflow integration #907 @JDarDagran
- PMD to Java and Spark builds in CI #898 @merobi-hub
- When testing extractors in the Airflow integration, set the extractor length assertion dynamic #882 @denimalpaca
- Render templates as start of integration tests for `TaskListener` in the Airflow integration #870 @mobuchowski
- Dependencies bundled with `openlineage-java` lib #855 @collado-mike
- PMD reported issues #891 @pawel-big-lebowski
- Spark casting error and session catalog support for `iceberg` in Spark integration #856 @wslulciuc
0.10.0 - 2022-06-24
- Add static code analysis tool mypy to run in CI against all Python modules (#802) @howardyoo
- Extend `SaveIntoDataSourceCommandVisitor` to extract schema from `LocalRelation` and `LogicalRdd` in Spark integration (#794) @pawel-big-lebowski
- Add `InMemoryRelationInputDatasetBuilder` for `InMemory` datasets to Spark integration (#818) @pawel-big-lebowski
- Add copyright to source files #755 @merobi-hub
- Add `SnowflakeOperatorAsync` extractor support to Airflow integration #869 @merobi-hub
- Add PMD analysis to proxy project (#889) @howardyoo
- Skip `FunctionRegistry.class` serialization in Spark integration (#828) @mobuchowski
- Install new `rust`-based SQL parser by default in Airflow integration (#835) @mobuchowski
- Improve overall `pytest` and integration tests for Airflow integration (#851, #858) @denimalpaca
- Reduce OL event payload size by excluding local data and including output node in start events (#881) @collado-mike
- Split Spark integration into submodules (#834, #890) @tnazarew @mobuchowski
- Conditionally import `sqlalchemy` lib for Great Expectations integration (#826) @pawel-big-lebowski
- Add check for missing class `org.apache.spark.sql.catalyst.plans.logical.CreateV2Table` in Spark integration (#866) @pawel-big-lebowski
- Fix static code analysis issues (#867, #874) @pawel-big-lebowski
- Spark: Column-level lineage introduced for Spark integration (#698, #645) @pawel-big-lebowski
- Java: Spark to use Java client directly (#774) @mobuchowski
- Clients: Add OPENLINEAGE_DISABLED environment variable which overrides config to NoopTransport (#780) @mobuchowski
- Set log to debug on unknown facet entry (#766) @wslulciuc
- Dagster: pin protobuf version to 3.20 as suggested by tests (#787) @mobuchowski
- Add SafeStrDict to skip failing attributes (#798) @JDarDagran
- `openlineage-airflow` now supports getting credentials from Airflow's secrets backend (#723) @mobuchowski
- `openlineage-spark` now supports Azure Databricks Credential Passthrough (#595) @wjohnson
- `openlineage-spark` detects datasets wrapped by `ExternalRDD`s (#746) @collado-mike
- `PostgresOperator` fails to retrieve host and conn during extraction (#705) @sekikn
- SQL parser accepts lists of SQL statements (#734) @mobuchowski
- Missing schema when writing to Delta tables in Databricks (#748) @collado-mike
- Airflow integration uses new TaskInstance listener API for Airflow 2.3+ (#508) @mobuchowski
- Support for HiveTableRelation as input source in Spark integration (#683) @collado-mike
- Add HTTP and Kafka Client to `openlineage-java` lib (#480) @wslulciuc, @mobuchowski
- New SQL parser, used by Postgres, Snowflake, Great Expectations integrations (#644) @mobuchowski
- GreatExpectations: Fixed bug when invoking GreatExpectations using v3 API (#683) @collado-mike
- Python implements Transport interface - HTTP and Kafka transports are available (#530) @mobuchowski
- Add UnknownOperatorAttributeRunFacet and support in lineage backend (#547) @collado-mike
- Support Spark 3.2.1 (#607) @pawel-big-lebowski
- Add StorageDatasetFacet to spec (#620) @pawel-big-lebowski
- Airflow: custom extractors lookup uses only get_operator_classnames method (#656) @mobuchowski
- README.md created at OpenLineage/integrations for compatibility matrix (#663) @howardyoo
- Dagster: handle updated PipelineRun in OpenLineage sensor unit test (#624) @dominiquetipton
- Delta improvements (#626) @collado-mike
- Fix SqlDwDatabricksVisitor for Spark2 (#630) @wjohnson
- Airflow: remove redundant logging from GE import (#657) @mobuchowski
- Fix Shebang issue in Spark's wait-for-it.sh (#658) @mobuchowski
- Update parent_run_id to be a uuid from the dag name and run_id (#664) @collado-mike
- Spark: fix time zone inconsistency in testSerializeRunEvent (#681) @sekikn
- CI: add integration tests for Airflow's SnowflakeOperator and dbt-snowflake @mobuchowski
- Introduce DatasetVersion facet in spec @pawel-big-lebowski
- Airflow: add external query id facet @mobuchowski
- Complete Fix of Snowflake Extractor get_hook() Bug @denimalpaca
- Update artwork @rossturk
- Airflow tasks in a DAG now report a common ParentRunFacet @collado-mike
- Catch possible failures when emitting events and log them @mobuchowski
- dbt: jinja2 code using do extensions does not crash @mobuchowski
- Extract source code of PythonOperator code similar to SQL facet @mobuchowski
- Add DatasetLifecycleStateDatasetFacet to spec @pawel-big-lebowski
- Airflow: extract source code from BashOperator @mobuchowski
- Add generic facet to collect environmental properties (EnvironmentFacet) @harishsune
- OpenLineage sensor for OpenLineage-Dagster integration @dalinkim
- Java-client: make generator generate enums as well @pawel-big-lebowski
- Added `UnknownOperatorAttributeRunFacet` to Airflow integration to record operators that don't produce lineage @collado-mike
- Airflow: increase import timeout in tests, fix exit from integration @mobuchowski
- Reduce logging level for import errors to info @rossturk
- Remove AWS secret keys and extraneous Snowflake parameters from connection uri @collado-mike
- Convert to LifecycleStateChangeDatasetFacet @pawel-big-lebowski
- Proxy backend example using Kafka @wslulciuc
- Support Databricks Delta Catalog naming convention with DatabricksDeltaHandler @wjohnson
- Add javadoc as part of build task @mobuchowski
- Include TableStateChangeFacet in non V2 commands for Spark @mr-yusupov
- Support for SqlDWRelation on Databricks' Azure Synapse/SQL DW Connector @wjohnson
- Implement input visitors for v2 commands @pawel-big-lebowski
- Enabled SparkListenerJobStart events to trigger OpenLineage events @collado-mike
- dbt: job namespaces for given dbt run match each other @mobuchowski
- Fix Breaking SnowflakeOperator Changes from OSS Airflow @denimalpaca
- Made corrections to account for DeltaDataSource handling @collado-mike
- Support for dbt-spark adapter @mobuchowski
- New backend to proxy OpenLineage events to one or more event streams 🎉 @mandy-chessell @wslulciuc
- Add Spark extensibility API with support for custom Dataset and custom facet builders @collado-mike
- airflow: fix import failures when dependencies for bigquery, dbt, great_expectations extractors are missing @lukaszlaszko
- Fixed openlineage-spark jar to correctly rename bundled dependencies @collado-mike
0.4.0 - 2021-12-13
- Spark output metrics @OleksandrDvornik
- Separated tests between Spark 2 & 3 @pawel-big-lebowski
- Databricks install README and init scripts @wjohnson
- Iceberg integration with unit tests @pawel-big-lebowski
- Kafka read and write support @OleksandrDvornik / @collado-mike
- Arbitrary parameters supported in HTTP URL construction @wjohnson
- Increased visitor coverage for Spark commands @mobuchowski / @pawel-big-lebowski
- dbt: column descriptions are properly filled from metadata.json @mobuchowski
- dbt: allow parsing artifacts with version higher than officially supported @mobuchowski
- dbt: dbt build command is supported @mobuchowski
- dbt: fix crash when build command is used with seeds in dbt 1.0.0rc3 @mobuchowski
- spark: increase logical plan visitor coverage @mobuchowski
- spark: fix logical serialization recursion issue @OleksandrDvornik
- Use URL#getFile to fix build on Windows @mobuchowski
0.3.1 - 2021-10-21
- fix import in spark3 visitor @mobuchowski
0.3.0 - 2021-10-21
- Spark3 support @OleksandrDvornik / @collado-mike
- LineageBackend for Airflow 2 @mobuchowski
- Adding custom spark version facet to spark integration @OleksandrDvornik
- Adding dbt version facet @mobuchowski
- Added support for Redshift profile @AlessandroLollo
- Sanitize JDBC URLs @OleksandrDvornik
- strip openlineage url in python client @OleksandrDvornik
- deploy spec if spec file changes @mobuchowski
0.2.3 - 2021-10-07
- Add dbt `v3` manifest support @mobuchowski
0.2.2 - 2021-09-08
- Implement OpenLineageValidationAction for Great Expectations @collado-mike
- facet: add expectations assertions facet @mobuchowski
- airflow: pendulum formatting fix, add tests @mobuchowski
- dbt: do not emit events if run_result file was not updated @mobuchowski
0.2.1 - 2021-08-27
- Default `--project-dir` argument to current directory in `dbt-ol` script @mobuchowski
0.2.0 - 2021-08-23
- Parse dbt command line arguments when invoking `dbt-ol` @mobuchowski. For example: `$ dbt-ol run --project-dir path/to/dir`
- Set `UnknownFacet` for Spark (captures metadata about unvisited nodes from the Spark plan that are not yet supported) @OleksandrDvornik
- Remove `model` from dbt job name @mobuchowski
- Default dbt job namespace to output dataset namespace @mobuchowski
- Rename `openlineage.spark.*` to `io.openlineage.spark.*` @OleksandrDvornik
- Remove instance references to extractors from DAG and avoid copying log property for serializability @collado-mike
0.1.0 - 2021-08-12
OpenLineage is an Open Standard for lineage metadata collection designed to record metadata for a job in execution. The initial public release includes:
- An initial specification. The initial version 1-0-0 of the OpenLineage specification defines the core model and facets.
- Integrations that collect lineage metadata as OpenLineage events: Apache Airflow (with support for BigQuery, Great Expectations, Postgres, Redshift, Snowflake), Apache Spark, and dbt.
- Clients that send OpenLineage events to an HTTP backend. Both `java` and `python` are initially supported.