Search before asking
Motivation
Currently, the Spark connector only supports the get_cluster_configs procedure for cluster configuration management. However, the Flink connector already provides a complete set of cluster configuration management procedures, including get_cluster_configs, set_cluster_configs, and reset_cluster_configs.
This inconsistency means that Spark users cannot dynamically modify or reset cluster configurations through SQL, which limits the usability of the Spark connector. Users who need to change dynamic cluster configurations (e.g., kv.rocksdb.shared-rate-limiter.bytes-per-sec) have to switch to the Flink connector or use other tools, which is inconvenient.
We should align the Spark connector's procedure capabilities with the Flink connector by adding set_cluster_configs and reset_cluster_configs procedures.
Solution
Add two new procedures to the Spark connector:
set_cluster_configs - Dynamically set cluster configuration values. Accepts an array of key-value pairs and applies them to the cluster.
reset_cluster_configs - Reset cluster configurations to their default values. Accepts an array of configuration keys to reset.
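A possible invocation sketch in Spark SQL, modeled on the Flink connector's procedure style (the catalog name, `sys` namespace, and array-of-strings parameter shape are assumptions, not a confirmed signature):

```sql
-- Hypothetical syntax; namespace and parameter shapes are assumptions
-- mirroring the existing Flink connector procedures.

-- Dynamically set a cluster configuration (array of key=value pairs):
CALL fluss_catalog.sys.set_cluster_configs(
  array('kv.rocksdb.shared-rate-limiter.bytes-per-sec=10485760')
);

-- Reset the same configuration back to its default (array of keys):
CALL fluss_catalog.sys.reset_cluster_configs(
  array('kv.rocksdb.shared-rate-limiter.bytes-per-sec')
);

-- Verify the effective values with the already-supported procedure:
CALL fluss_catalog.sys.get_cluster_configs();
```

This keeps the Spark-side call surface symmetric with Flink, so users can move between the two connectors without relearning the procedure API.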
Anything else?
No response
Willingness to contribute