Upgrading Flink Version based on Amoro Mixed-format table #1937
Replies: 5 comments 3 replies
-
+1 I think it should be like Iceberg, which only supports the latest three Flink or Spark versions. As far as I know, Flink 1.18's features have been frozen and it will be released in October. I recommend starting from version 1.16 as the minimum.
-
This matter needs to be considered together with two related efforts: the refactoring of the Mixed-Iceberg format being driven by the community, and the UnifiedCatalog Connector.

Refactoring of the Mixed-Iceberg format: the current Mixed-Iceberg is implemented on top of HadoopTables, which limits its ability to be registered in other metastores. In the future, an Iceberg Catalog will be used to create the Base/Change stores, which may require the engine to access the base/change store directly through the Iceberg engine connector.

UnifiedCatalog Connector: the UnifiedCatalog abstraction only covers the table load process; the engine-side implementation still needs to access each format's implementation directly in the engine.

Therefore, I believe the Mixed-Iceberg versions supported on the engine side need to stay consistent with the Iceberg version that Amoro depends on.
-
In view of the previous discussion, I suggest that we now open an issue for supporting Flink 1.17. I am willing to take on this part of the work. I will advance it in parallel with the Flink unit testing improvement (#1597).
-
@czy006 Thank you very much for volunteering to upgrade the Flink version and enhance the Flink unit testing. It would be beneficial to discuss our next steps. Rather than improving the Flink unit tests first, it may be wise to extract the functionally identical parts of the 1.14 and 1.15 modules into a Flink-common module. That way most of the unit tests can be maintained in the Flink-common module, which reduces the cost of maintaining separate tests per version. Therefore, I suggest that we focus on enhancing the Flink unit testing after the extraction of the Flink-common module is complete. WDYT? Future Flink modules:
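To make the idea concrete, a purely illustrative layout after such an extraction might look like the sketch below. The module names are assumptions for illustration, not the project's actual structure:

```
amoro-flink/
├── flink-common/   # version-independent connector code + shared unit tests
├── flink-1.15/     # thin shim over 1.15-specific APIs, depends on flink-common
├── flink-1.16/     # thin shim for 1.16
└── flink-1.17/     # thin shim for 1.17
```

Under this kind of layout, adding a new Flink version means adding one thin shim module rather than copying the connector and its tests wholesale.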
-
Dear team,
I would like to bring to your attention the discussion about upgrading the Flink version in Amoro. As you may know, currently Amoro supports Flink versions 1.12, 1.14, and 1.15. However, some Amoro users have suggested upgrading to Flink version 1.16 or even 1.17, citing concerns that Flink 1.12 is too old.
It is important to note that the Flink community generally maintains the last three versions, and the Iceberg Community also maintains only three versions of the Flink connector. Therefore, it would be wise for Amoro to maintain the last three Flink versions (1.15, 1.16, and 1.17) to stay up-to-date with the latest developments.
Another point to consider is that the Flink community has been improving the stability of the connector interfaces, which makes it easier to abstract a common Flink connector across versions. This has been done successfully in other open-source communities, such as Paimon, where common implementations were abstracted once and compile successfully against multiple Flink versions, such as 1.14, 1.15, 1.16, and 1.17.
With these factors in mind, I suggest that we seriously consider upgrading Amoro to Flink version 1.16 or 1.17. This will not only keep us current with the latest developments but also ensure that we are providing our users with the most up-to-date and reliable service possible.
Please feel free to share any concerns, suggestions, or ideas you may have regarding upgrading the Flink version in Amoro. Let me know if there is anything I can do to assist you during this discussion.