[Spark] Add set_cluster_configs and reset_cluster_configs procedures for Spark connector #3203

@XuQianJin-Stars

Description

@XuQianJin-Stars

Search before asking

  • I searched in the issues and found nothing similar.

Motivation

Currently, the Spark connector only supports the get_cluster_configs procedure for cluster configuration management. However, the Flink connector already provides a complete set of cluster configuration management procedures, including get_cluster_configs, set_cluster_configs, and reset_cluster_configs.

This inconsistency means that Spark users cannot dynamically modify or reset cluster configurations through SQL, which limits the usability of the Spark connector. Users who need to change dynamic cluster configurations (e.g., kv.rocksdb.shared-rate-limiter.bytes-per-sec) have to switch to the Flink connector or use other tools, which is inconvenient.

We should align the Spark connector's procedure capabilities with the Flink connector by adding set_cluster_configs and reset_cluster_configs procedures.

Solution

Add two new procedures to the Spark connector:

  1. set_cluster_configs - Dynamically set cluster configuration values. Accepts an array of key-value pairs and applies them to the cluster.
  2. reset_cluster_configs - Reset cluster configurations to their default values. Accepts an array of configuration keys to reset.
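For illustration, the new procedures could be invoked from Spark SQL along these lines. This is a sketch mirroring the Flink connector's calling convention; the catalog name `fluss` and the exact parameter names/shapes are assumptions, not a final API:

```sql
-- Dynamically set cluster configs (array of key=value pairs; parameter name illustrative)
CALL fluss.sys.set_cluster_configs(
  configs => array('kv.rocksdb.shared-rate-limiter.bytes-per-sec=10485760')
);

-- Reset the same configs back to their defaults (array of config keys)
CALL fluss.sys.reset_cluster_configs(
  configs => array('kv.rocksdb.shared-rate-limiter.bytes-per-sec')
);

-- Verify the result with the existing procedure
CALL fluss.sys.get_cluster_configs();
```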

Anything else?

No response

Willingness to contribute

  • I'm willing to submit a PR!
