Data Quality Enforcement Using Great Expectations and Flyte
We’re excited to announce a new integration between Great Expectations and Flyte. We believe this is a deeper and more intuitive integration than ever before!
Great Expectations is a Python-based open-source library for validating, documenting, and profiling your data. It helps to maintain data quality and improve communication about data between teams.
If you haven’t worked with Great Expectations before, jump right into the getting started tutorial.
Great Expectations’ data validation can be integrated with Flyte to validate the data moving in and out of the pipeline entities you may have defined in Flyte. This helps establish stricter boundaries around your data, ensures that everything works as expected, and keeps bad data from unexpectedly crashing your pipelines!
The Idea
We have entered a data-first world. Data is often a product and is critical for running a business. For software services, we have built multiple systems to manage quality and ensure rigor and correctness at every step. The same is not yet true for data systems. Data can be large, spread out, and sometimes duplicated, so we have to enforce quality checks on the data.
It is hard and expensive to enforce quality after data is generated, and often, the damage is unrecoverable.
At Flyte, we believe that data quality is not an afterthought but should be an integral part of the data definition and generation process. We may have multiple tasks and workflows through which countless pieces of data pass. Keeping an eye on the data all the time isn’t feasible. What we need is an automated mechanism that validates our data thoroughly.
This is why Flyte has a native type system that enforces the correctness of data. However, a type system alone doesn’t suffice; we also need a comprehensive data validation tool.
This is where Great Expectations comes into the picture. It can help take Flyte’s data handling system to the next level.
With the Great Expectations and Flyte integration, we can now:
- Make the Flyte pipelines more robust and resilient
- Enforce validation rules on the data
- Eliminate bad data
- Not worry about unexpected data-related crashes in the Flyte pipelines
- Prevent data quality issues
- Validate the data pre-ingestion (data going into the Flyte pipeline) and post-ingestion (data coming out of the Flyte pipeline)
- Ensure that new data isn’t out of line with the existing data
How to Define the Integration
Great Expectations supports native execution of expectations against various Datasources, such as Pandas dataframes, Spark dataframes, and SQL databases via SQLAlchemy.
Along with the primitive string type (for plain file paths), we’re supporting two Flyte types that suit Great Expectations’ Datasources:
- flytekit.types.file.FlyteFile: FlyteFile represents an automatic persistence object in Flyte. It can represent files in remote storage, and Flyte will transparently materialize them in every task execution.
- flytekit.types.schema.FlyteSchema: FlyteSchema supports tabular data, which the plugin converts into a Parquet file and validates using Great Expectations.
Flyte types have been added because Great Expectations accepts a non-string dataset (a Pandas/Spark DataFrame) when using a RuntimeDataConnector, but not when using an InferredAssetFilesystemDataConnector or a ConfiguredAssetFilesystemDataConnector. For the latter case, with the integration of Flyte types, we can pass a Pandas/Spark DataFrame or a remote URI as the dataset.
Datasources can be integrated with the plugin using the following two modes:
- Flyte Task: A Flyte task defines the task prototype that one could use within a task or a workflow to validate data using Great Expectations.
- Flyte Type: A Flyte type helps attach the GreatExpectationsType to any dataset. Under the hood, GreatExpectationsType can be thought of as a combination of Great Expectations and Flyte types, where every bit of data is validated against the Expectations, much like the OpenAPI Spec or the gRPC validator.
Note: Expectations are unit tests that specify the validation you would like to enforce on your data, like expect_table_row_count_to_be_between or expect_column_values_to_be_of_type.
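To make this concrete, here is a minimal sketch of authoring a couple of Expectations with the classic Pandas-flavored Great Expectations API; the CSV path and column name are hypothetical:

```python
import great_expectations as ge

# Load a CSV into a Great Expectations PandasDataset (the path is hypothetical).
df = ge.read_csv("data/yellow_tripdata_sample_2019-01.csv")

# Each expect_* call is a unit test on the data.
df.expect_table_row_count_to_be_between(min_value=1, max_value=100000)
df.expect_column_values_to_be_of_type("passenger_count", "int64")

# Persist the Expectations so the plugin can later reference the suite by name.
suite = df.get_expectation_suite(discard_failed_expectations=False)
```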
Plugin Parameters
- datasource_name: Datasource, in general, is the “name” we use in the Great Expectations config file for accessing data. A Datasource brings together a way of interacting with data (like a database or Spark cluster) and some specific data (like a CSV file or a database table). Datasources enable us to build batches out of data (for validation).
- expectation_suite_name: Name of the Expectation Suite that defines the validations to run on the data.
- data_connector_name: Name of the Data Connector, which tells how the data batches are to be identified.
Optional Parameters
- context_root_dir: Sets the path of the Great Expectations config directory.
- checkpoint_params: Optional SimpleCheckpoint class parameters.
- batchrequest_config: Additional batch request configuration parameters.
• data_connector_query: Query to request a data batch
• runtime_parameters: Parameters to be sent at run-time
• batch_identifiers: Batch identifiers
• batch_spec_passthrough: Reader method if your file doesn’t have an extension
- data_asset_name: Name of the data asset (to be used for RuntimeBatchRequest)
- local_file_path: User-given path where the dataset has to be downloaded
Note: It’s a good idea to always set the context_root_dir parameter; providing an explicit path does no harm. Moreover, local_file_path is essential when using FlyteFile and FlyteSchema.
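To show how these parameters fit together, here is a minimal sketch of a task-mode validator. The datasource, suite, and connector names are placeholders that must match your Great Expectations config, and the exact keyword names may vary slightly across plugin versions:

```python
from flytekit import kwtypes
from flytekitplugins.great_expectations import GreatExpectationsTask

# A sketch of a task-mode validator wired up with the parameters described above.
# "data", "test.demo", and "data_example_data_connector" are placeholder names that
# must match the entries in your Great Expectations config.
simple_validation_task = GreatExpectationsTask(
    name="great_expectations_simple_task",
    datasource_name="data",
    inputs=kwtypes(dataset=str),                # dataset handed over as a file name (string)
    expectation_suite_name="test.demo",
    data_connector_name="data_example_data_connector",
    context_root_dir="great_expectations",      # path to the Great Expectations project directory
)
```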
Examples
Firstly, install the plugin using the following command (the package name below is assumed; check PyPI for the current name):
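```bash
# Package name is an assumption; check PyPI or the Flyte docs for the current Great Expectations plugin.
pip install flytekitplugins-great_expectations
```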
Note: You should have a config file with the required Great Expectations configuration. For example, refer to this config file.
Task Example
A Great Expectations validation can be written as a Flyte task. To do so,
- Initialize Great Expectations configuration
- Validate data using the configuration
Here’s an example using FlyteFile:
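The sketch below shows what this could look like, assuming a Great Expectations project in a great_expectations directory; the datasource, suite, and connector names are placeholders, and the exact keyword names may differ slightly between plugin versions:

```python
from flytekit import kwtypes, workflow
from flytekit.types.file import FlyteFile
from flytekitplugins.great_expectations import GreatExpectationsTask

# Validate a (possibly remote) file with Great Expectations before using it downstream.
# The datasource, suite, and connector names are placeholders for your own config.
file_validation_task = GreatExpectationsTask(
    name="great_expectations_flytefile_task",
    datasource_name="data",
    inputs=kwtypes(dataset=FlyteFile),
    expectation_suite_name="test.demo",
    data_connector_name="data_flytetype_data_connector",
    local_file_path="/tmp",                    # where the remote file is downloaded for validation
    context_root_dir="great_expectations",
)


@workflow
def validate_file_wf(dataset: FlyteFile) -> None:
    # The task raises a validation error if the dataset violates the Expectation Suite.
    file_validation_task(dataset=dataset)
```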
Additional Batch Request parameters can be given using BatchRequestConfig.
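As a sketch (the query value here is only illustrative):

```python
from flytekitplugins.great_expectations import BatchRequestConfig

# Illustrative batch request configuration: limit how many batches the data
# connector returns for validation (the value is only an example).
batch_config = BatchRequestConfig(
    data_connector_query={"limit": 10},
)
```

The resulting object is then handed to the task through the batchrequest_config parameter described above (the exact keyword may differ with your plugin version).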
Data Validation Failure
If the data validation fails, the plugin raises a Great Expectations ValidationError describing which Expectations were not met.
Note: In the future, we plan to integrate Great Expectations Data Docs with the Flyte UI. This will enhance the visualization of errors and capture the key characteristics of the dataset.
Type Example
Great Expectations validation can be encapsulated in Flyte’s type system. Here’s an example using FlyteSchema:
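A sketch of such a task is shown below; it assumes a Great Expectations project in a great_expectations directory, the datasource/suite/connector names are placeholders, and the exact signatures may vary with the plugin version:

```python
from flytekit import task
from flytekit.types.schema import FlyteSchema
from flytekitplugins.great_expectations import (
    GreatExpectationsFlyteConfig,
    GreatExpectationsType,
)


# The incoming schema is validated against the Expectation Suite before the task body runs.
# Datasource, suite, and connector names are placeholders for your own config.
@task
def process_schema(
    dataframe: GreatExpectationsType[
        FlyteSchema,
        GreatExpectationsFlyteConfig(
            datasource_name="data",
            expectation_suite_name="test.demo",
            data_connector_name="data_flytetype_data_connector",
            local_file_path="/tmp/validation.parquet",  # where the schema is materialized for validation
            context_root_dir="great_expectations",
        ),
    ]
) -> int:
    # By this point the data has already passed validation; use it as a Pandas DataFrame.
    return dataframe.open().all().shape[0]
```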
Refer to the fully worked task and type examples to understand how well Great Expectations integrates with Flyte.
Note: Great Expectations’ RuntimeBatchRequest can be used just like a simple BatchRequest in Flyte. Make sure to set up the data connector correctly. The plugin then automatically checks the type of batch request and instantiates it. Check this example.
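As a rough sketch, a runtime-style task configuration might look like the following; every name is a placeholder, and the keyword for passing the batch request configuration is assumed from the parameter list above:

```python
from flytekit import kwtypes
from flytekit.types.schema import FlyteSchema
from flytekitplugins.great_expectations import BatchRequestConfig, GreatExpectationsTask

# A sketch of a task whose data connector is a RuntimeDataConnector, so the plugin
# builds a RuntimeBatchRequest under the hood. All names below are placeholders.
runtime_validation_task = GreatExpectationsTask(
    name="great_expectations_runtime_task",
    datasource_name="my_pandas_datasource",
    inputs=kwtypes(dataframe=FlyteSchema),
    expectation_suite_name="test.demo",
    data_connector_name="my_runtime_data_connector",
    data_asset_name="validation_dataset",        # required when a RuntimeBatchRequest is built
    context_root_dir="great_expectations",
    batchrequest_config=BatchRequestConfig(      # keyword per the parameter list above; may differ by plugin version
        batch_identifiers={"pipeline_stage": "validation"},
    ),
)
```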
Let us know if you find this plugin useful! Join the Flyte community or the Great Expectations community if you have any suggestions or feedback—we’d love to hear from you!