New Features for Data Lake Insight (DLI)

A new minor version of Data Lake Insight (DLI) introduces new features and changes to existing ones.

Resolved issues:

  • [Flink OpenSource SQL][SMN] There was an abnormal connection issue with the SMN sink when creating a Flink OpenSource SQL job.
  • There was an issue with the SDK link page on the console.
  • When navigating from the DLI console to the API Reference page, a 404 error was displayed.
  • The SMN notification email for job failures included incorrect information in the Cloud provider field.
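For context, a Flink OpenSource SQL job delivers notifications to SMN through a sink table. A minimal sketch follows; the connector name, option keys, and topic URN are assumptions for illustration, not taken from this page (check the DLI connector reference for the exact syntax):

```sql
-- Hypothetical Flink OpenSource SQL sink writing alerts to SMN.
CREATE TABLE alert_sink (
  message_subject STRING,
  message_body    STRING
) WITH (
  'connector' = 'smn',                            -- assumed connector id
  'topic-urn' = 'urn:smn:region:account:my-topic' -- placeholder URN
);

-- Route error events from a source table into the SMN sink.
INSERT INTO alert_sink
SELECT 'job-alert', error_text
FROM source_events
WHERE severity = 'ERROR';
```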

New Features:

  • The DLI engine supports real-time metadata collection based on MetaStoreEventListener. Users can manage unified metadata based on Data Map.
  • The CPU consumption of DLI SQL jobs is displayed. SQL job metrics are added to the console, allowing users to view SQL job information more intuitively.
  • DLI now supports Flink 1.15 (this feature is only available to users on the whitelist). Whitelisted users can now use Flink 1.15 Jar jobs.
  • DLI Hetu tenant plane APIs now support native Presto JDBC + BI integration.
  • Flink jobs can now connect to DLI metadata and read/write data:
    • Flink image is now compatible with DLI catalogs.
    • Flink extension now supports new configuration items for DLI catalog integration.
  • Hetu performance test and optimization:
    • Hetu now supports containerized memory self-adaptation, optimizing memory allocation and improving engine efficiency.
    • Hetu can now access DLI metadata.
    • Hetu image building is now supported.
    • Hetu is now compatible with JDK17, Hive, Hadoop, and other technologies, improving performance.
    • Hetu now supports Hive UDF compatibility features.
  • Users can now scale nodes in real time when scaling an elastic resource pool queue in or out.
    • The upper and lower limits of the elastic resource pool queue have been modified to support real-time scaling.
  • The number of results that can be previewed is increased from 1,000 to 10,000 for users on the whitelist.
  • The reporting cycle for DLI management plane metrics has been optimized, providing more frequent monitoring metrics for queues.
  • [Console] Global variables and sensitive variables are now restricted through a whitelist; users who are not whitelisted can no longer use them.
  • Spark SQL jobs now support real-time metrics during job execution, which are summarized upon completion.
    • You can now obtain the current concurrent CUs and scanned data volume for a specified running SQL job.
    • After a SQL job is completed, you can obtain the total CUs consumed, total scanned data volume, and total output data volume for that job.
    • You can also obtain the list of partitions read by a SQL job and the data volume of the partitions written to.
  • DLI now supports Oracle as a cross-source (datasource) connection.
  • DLI SQL now supports table lifecycle management.
    • SQL statements are allowed to operate on table lifecycles.
    • The DLI table lifecycle feature has been adapted for use on the default queue.
  • Non-elastic resource pool container queues now support automatic scaling based on job priority and resource load.
  • [DLI][Console] The console now supports queue-level version selection for Flink Jar jobs.
  • [Permission] DLI now supports data tag-based authorization.
    • DLI tables support tagging.
    • Database table authentication now supports tag-based authentication.
    • DLI tables can now be integrated with TMS. 
    • The console has been adapted for data tag-based authorization.
  • [Overload intervention] You can now throttle tenants causing overload through the O&M interface.
  • [Data-AI convergence] DLI now supports the integration of EI-Workspace's elastic resource pool, which can be used to provision notebook instances.
  • The maximum number of SQL statements and the maximum number of concurrently running statements in a queue can now be dynamically configured through O&M.
  • DLI general queues now support running notebooks.
  • Users can now access lakehouse features (this feature is only available to users on the whitelist).
    • Users can specify the creation/query interface using lakehouse paths.
    • Metadata can be accessed through a general queue.
    • Lakehouse paths can now be used on the backend when creating managed tables.
    • Lakehouse database tables can be asynchronously deleted through user agencies.
    • Lakehouse bucket types are now limited to OBS parallel file systems.
    • Spark lakehouse images can now be created and published.
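The table lifecycle feature above is driven from SQL statements. A minimal sketch, assuming lifecycle is set through a `dli.lifecycle.days` table property (the property key is an assumption; verify the exact name in the DLI SQL reference):

```sql
-- Create a DLI table whose data expires after 30 days
-- ('dli.lifecycle.days' is the assumed lifecycle property key).
CREATE TABLE sales_staging (
  order_id   BIGINT,
  order_date DATE,
  amount     DECIMAL(10, 2)
)
TBLPROPERTIES ('dli.lifecycle.days' = '30');

-- Adjust the lifecycle later via ALTER TABLE.
ALTER TABLE sales_staging
  SET TBLPROPERTIES ('dli.lifecycle.days' = '90');
```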
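The new Oracle cross-source support would typically be used through a datasource table that maps to the remote database. A sketch under the assumption that it follows DLI's existing cross-source `USING ... OPTIONS` syntax; the option keys, host, and credential names below are placeholders:

```sql
-- Hypothetical cross-source table mapped to a remote Oracle database.
CREATE TABLE oracle_orders
USING JDBC
OPTIONS (
  'url'        = 'jdbc:oracle:thin:@//oracle-host:1521/ORCL',
  'dbtable'    = 'SALES.ORDERS',
  'user'       = 'dli_reader',         -- placeholder credentials
  'passwdauth' = 'my-datasource-auth'  -- assumed datasource auth reference
);

-- Query the remote Oracle table as if it were a local DLI table.
SELECT order_id, amount
FROM oracle_orders
WHERE order_date = CURRENT_DATE;
```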

Changed Features:

  • Reconstructed the lifecycle management of SQL queue Spark instances:
    • SQL queues now support multiple driver instances (multiple SparkContexts).
    • Optimized pre-started Spark driver instances, with support for custom Spark parameters.
    • Improved the output of Cluster Agent metrics logs.
  • [API] Optimized the processing of the getJobResult API for SQL jobs by limiting the size of the result set.
  • Optimized the default parameters for selecting the master node in ClusterAgent for Kubernetes.
  • [Usability] Improved the viewing of DLI Spark UI logs.
  • Added table lifecycle to DLI and updated SQL job queries to display default job types: DDL, DCL, QUERY, INSERT, UPDATE, and DELETE. 
  • When creating package groups, hyphens (-) are now allowed in the group name.
  • Strengthened permissions for secrets in the Kubernetes cluster.
  • Users can now directly create and activate elastic resource pools by default, without the need for whitelist control.

Further information can be found in the Data Lake Insight (DLI) Help Center.
