
support create table like in flink catalog and watermark in windows #12116

Open · wants to merge 3 commits into base: main
Conversation

@swapna267 commented Jan 27, 2025

This PR addresses:

  1. Creation of a dynamic Iceberg table in the Flink catalog from the underlying physical Iceberg table, using the LIKE clause.
  2. Iceberg Source support for source watermarks, so they can be used in Flink WINDOW functions. https://github.com/apache/flink/blob/release-1.18/flink-table/flink-table-common/src/main/java/org/apache/flink/table/connector/source/abilities/SupportsSourceWatermark.java enables Flink to rely on the watermark strategy provided by the ScanTableSource itself.
CREATE TABLE table_wm (
      eventTS AS CAST(t1 AS TIMESTAMP(3)),
      WATERMARK FOR eventTS AS SOURCE_WATERMARK()
) WITH (
  'watermark-column'='t1'
) LIKE iceberg_catalog.db.table;

Reference:
#10219
#9346

@github-actions github-actions bot added the flink label Jan 27, 2025
@pvary (Contributor) commented Jan 28, 2025

@swapna267: Started the test runs so we can see the status of the PR.
Could you please remove the Flink 1.18/1.19 version parts of the changes?
It makes the review easier if we concentrate on only a single Flink version, and do the backports later once the changes to the main version are merged.

@swapna267 (Author) commented:

@pvary reverted 1.18/1.19 changes.

@swapna267 (Author) commented Jan 29, 2025

Background:

  1. Creation of a dynamic Iceberg table in the Flink catalog from the underlying physical Iceberg table, using the LIKE clause.

Currently (without the changes in this PR), creating a table in the Flink catalog works by configuring the Flink connector as described in
flink-connector

But that requires the user to provide the schema for the table. A way around that is to do CREATE TABLE ... LIKE using the DDL below.

CREATE TABLE table_wm (
      eventTS AS CAST(t1 AS TIMESTAMP(3)),
      WATERMARK FOR eventTS AS SOURCE_WATERMARK()
) WITH (
  'connector'='iceberg',
  'catalog-name'='iceberg_catalog',
  'catalog-database'='testdb',
  'catalog-table'='t'
) LIKE iceberg_catalog.testdb.t;

Options like connector, catalog-name, catalog-database, and catalog-table need to be duplicated because the Iceberg FlinkCatalog doesn't return any catalog-related properties from getTable. This PR addresses the issue by including these properties when getTable is called; Flink then uses them when creating the table in the Flink catalog.
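The idea above can be sketched as a plain-Java snippet. This is not the actual Iceberg FlinkCatalog code; the helper name mergeCatalogProps and the exact property keys are assumptions based on the DDL shown in this thread, used only to illustrate which options getTable would start returning so the user no longer has to duplicate them in the WITH clause.

```java
import java.util.HashMap;
import java.util.Map;

// Hedged sketch: merge the catalog-identifying options into the properties that
// getTable() returns, so CREATE TABLE ... LIKE picks them up automatically.
// mergeCatalogProps is a hypothetical helper, not an Iceberg API.
public class CatalogPropsSketch {
  static final String FACTORY_IDENTIFIER = "iceberg";

  static Map<String, String> mergeCatalogProps(
      Map<String, String> tableProps, String catalogName, String db, String table) {
    Map<String, String> merged = new HashMap<>(tableProps);
    // Properties the user previously had to duplicate in the WITH clause:
    merged.put("connector", FACTORY_IDENTIFIER);
    merged.put("catalog-name", catalogName);
    merged.put("catalog-database", db);
    merged.put("catalog-table", table);
    return merged;
  }

  public static void main(String[] args) {
    Map<String, String> props =
        mergeCatalogProps(new HashMap<>(), "iceberg_catalog", "testdb", "t");
    System.out.println(props.get("connector") + " " + props.get("catalog-table"));
  }
}
```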

  2. Iceberg Source to support source watermarks
    As raised in "The 'Emitting watermarks' feature can't be used in Flink SQL?" #10219, the source watermark implemented as part of https://iceberg.apache.org/docs/nightly/flink-queries/#emitting-watermarks cannot be used in Flink window functions.

Flink lets the user push the watermark down to the source via the SupportsSourceWatermark.java interface.

Here we fall back to the read options implemented in #9346 to configure the watermark column on the Iceberg Source.

Comment on lines -387 to -393
if (Objects.equals(
    table.getOptions().get("connector"), FlinkDynamicTableFactory.FACTORY_IDENTIFIER)) {
  throw new IllegalArgumentException(
      "Cannot create the table with 'connector'='iceberg' table property in "
          + "an iceberg catalog, Please create table with 'connector'='iceberg' property in a non-iceberg catalog or "
          + "create table without 'connector'='iceberg' related properties in an iceberg table.");
}
Contributor:

Why do we remove this check?

@swapna267 (Author) commented Jan 29, 2025:

Tables can be created using LIKE in:

  1. The Flink catalog - not supported currently.
  2. Another table in the Iceberg catalog itself, as detailed in the doc.

This check fails if we try to create a table using LIKE in the Iceberg catalog (case 2) when connector=iceberg is in the options. For example, a DDL like the one below:

CREATE TABLE  `hive_catalog`.`default`.`sample_like` 
LIKE `hive_catalog`.`default`.`sample`
WITH ('connector'='iceberg')

In order to support case 1 without the user setting any extra options via the WITH clause, we need to add connector in getTable:

catalogAndTableProps.put("connector", FlinkDynamicTableFactory.FACTORY_IDENTIFIER);

This check was added in a very old PR,
#2666
#2666 (comment), when Flink SQL didn't support CREATE TABLE A LIKE B where A and B are in different catalogs.

So in this case, by removing the check, we ignore the connector option being passed, and the following DDL can create table table_like in the Flink catalog, backed by iceberg_catalog.db.table. As we know the source table is an Iceberg table, adding connector=iceberg would be redundant.

CREATE TABLE table_like (
      eventTS AS CAST(t1 AS TIMESTAMP(3))
) LIKE iceberg_catalog.db.table;

Contributor:

What happens when the source table is not an Iceberg table?
I'm trying to understand where we get the missing information in that case, and whether we have a way to check that we actually get it. If we can create such a check, then we can still throw an exception when we don't get this information from any source.

@@ -53,7 +54,8 @@ public class IcebergTableSource
     implements ScanTableSource,
         SupportsProjectionPushDown,
         SupportsFilterPushDown,
-        SupportsLimitPushDown {
+        SupportsLimitPushDown,
+        SupportsSourceWatermark {
Contributor:

I think we have 2 features in a single PR:

  • CREATE TABLE LIKE
  • Watermark support

Could we separate these features into different PRs?
Could we write tests for both features?

@swapna267 (Author) commented Jan 29, 2025:

These features were driven mainly by a use case where an Iceberg table needs to be used in Flink window functions. This requires the incoming table to have a MILLISECOND-precision timestamp column, and also a watermark defined on the source table.

As Iceberg only supports MICROSECOND timestamp columns, we need a table with computed columns, and we can create those only in the Flink catalog. The Iceberg catalog doesn't support creating tables with computed columns.

I am happy to split them into 2 separate PRs.
I have tests for CREATE TABLE LIKE.

As watermark support just makes the Source implement the interface and falls back to #9346 for the core logic, I didn't have a test case. I can add a validation for whether watermark-column is configured, so it can fail fast, and a test case around that.
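The fail-fast validation proposed above could look roughly like the following plain-Java sketch. This is not the actual IcebergTableSource code; requireWatermarkColumn is a hypothetical helper, and the only assumption taken from the thread is that the "watermark-column" read option from #9346 must be set for SOURCE_WATERMARK() to work.

```java
import java.util.HashMap;
import java.util.Map;

// Hedged sketch of a fail-fast check (hypothetical helper name): when the source
// watermark is requested, require the "watermark-column" read option up front
// instead of failing later at runtime.
public class WatermarkOptionCheck {
  static String requireWatermarkColumn(Map<String, String> readOptions) {
    String column = readOptions.get("watermark-column");
    if (column == null || column.isEmpty()) {
      throw new IllegalArgumentException(
          "watermark-column must be configured when relying on SOURCE_WATERMARK()");
    }
    return column;
  }

  public static void main(String[] args) {
    Map<String, String> options = new HashMap<>();
    options.put("watermark-column", "t1");
    System.out.println(requireWatermarkColumn(options)); // prints "t1"
  }
}
```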

Contributor:

Please separate the features into 2 PRs.

Comment on lines -259 to -262
@TestTemplate
public void testConnectorTableInIcebergCatalog() {
// Create the catalog properties
Map<String, String> catalogProps = Maps.newHashMap();
Contributor:

Why is this test removed?

@swapna267 (Author) commented:

This is testing the check mentioned in #12116 (comment):

failing to create a table in the Iceberg catalog if connector=iceberg is specified in the options. As that check has been deleted, I removed this test case.

Contributor:

I think this is still a valid check in most cases. It is only invalid when the table is created with CREATE TABLE .. LIKE, and only if the source table is an Iceberg table.
Do I miss something?

@swapna267 swapna267 requested a review from pvary January 30, 2025 17:23