[BUG]: SQL Expression Parsing Error, when expression includes quotes #162
Labels: bug
mwojtyczka
added a commit
that referenced
this issue
Feb 12, 2025
* Provided option to customize reporting column names ([#127](#127)). In this release, the DQEngine library has been enhanced to allow for customizable reporting column names. A new constructor has been added to DQEngine, which accepts an optional ExtraParams object for extra configurations. A new Enum class, DefaultColumnNames, has been added to represent the columns used for error and warning reporting. New tests have been added to verify the application of checks with custom column naming. These changes aim to improve the customizability, flexibility, and user experience of DQEngine by providing more control over the reporting columns and resolving issue [#46](#46).
* Fixed parsing error when loading checks from a file ([#165](#165)). In this release, we have addressed a parsing error that occurred when loading checks (data quality rules) from a file, fixing issue [#162](#162). The specific issue being resolved is a SQL expression parsing error. The changes include refactoring tests to eliminate code duplication and improve maintainability, as well as updating method and variable names to use `filepath` instead of `path`. Additionally, new unit and integration tests have been added and manually tested to ensure the correct functionality of the updated code.
* Removed usage of the `try_cast` Spark function from the checks to make sure DQX can be run on more runtimes ([#163](#163)). In this release, we have refactored the code to remove the usage of the `try_cast` Spark function and replace it with `cast` and `isNull` checks to improve code compatibility, particularly for runtimes where `try_cast` is not available. The affected functionality includes null and empty column checks, checking if a column value is in a list, and checking if a column value is a valid date or timestamp. We have added unit and integration tests to ensure functionality is working as intended.
* Added filter to rules so that you can make conditional checks ([#141](#141)). The filter serves as a condition that data must meet to be evaluated by the check function. Filters restrict the evaluation of checks to rows that meet the specified conditions. This feature enhances the flexibility and customizability of data quality checks in the DQEngine.
Merged
mwojtyczka added a commit that referenced this issue on Feb 12, 2025:
Is there an existing issue for this?
Current Behavior
When defining a SQL expression rule in a YAML file, if the expression contains any quotation marks, the parser is unable to validate the checks loaded via the dq_engine.load_checks_from_workspace_file(file) function.
For example, when applying the following YAML rule to dq_engine.load_checks_from_workspace_file():
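The original YAML rule from the report was not captured in this page; the snippet below is an assumed, representative reconstruction of a sql_expression rule containing quotes (the column name and expression are illustrative):

```yaml
- criticality: error
  check:
    function: sql_expression
    arguments:
      expression: col1 like 'Team %'
```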
Loading it fails with a SQL expression parsing error (the original error output was not captured in the report).
The same problem occurs with any other SQL expression that contains quotation marks.
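For illustration, rules of roughly this shape (assumed examples, not the reporter's originals) trigger the same failure because of the quoted literals:

```yaml
- criticality: warn
  check:
    function: sql_expression
    arguments:
      expression: col2 in ('a', 'b', 'c')
```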
Expected Behavior
The function should accept quotation marks as part of the SQL expression argument and preserve them during JSON parsing.
Since it does not, you need a workaround so that the 'dq_engine.load_checks_from_workspace_file' function can handle the quotation marks when parsing to JSON (the workaround is in the Steps To Reproduce section).
Steps To Reproduce
As a temporary workaround, do the following:
1. In the YAML file, replace every straight quote inside the SQL expression with a curly apostrophe, e.g. col1 like ’Team %’
2. After loading the checks, swap the curly apostrophes back to straight quotes with a replace_quotes function.
Notes: The curly apostrophe can be replaced with any other combination of symbols you'd like. You just need to ensure you change it within your replace_quotes function too.
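A minimal sketch of such a replace_quotes helper, assuming the checks are loaded as a list of dicts in the usual DQX shape (check → arguments → expression); the function name and structure follow the workaround described above, not an official DQX API:

```python
def replace_quotes(checks: list[dict]) -> list[dict]:
    """Swap curly apostrophes (’) back to straight quotes in SQL expressions.

    Assumes each check dict looks like:
    {"check": {"function": ..., "arguments": {"expression": ...}}}
    """
    for check in checks:
        args = check.get("check", {}).get("arguments", {})
        if isinstance(args.get("expression"), str):
            args["expression"] = args["expression"].replace("\u2019", "'")
    return checks
```

Apply it to the result of dq_engine.load_checks_from_workspace_file() before passing the checks on for validation.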
Cloud
Azure
Operating System
Windows
Relevant log output