Commit 085a297

dnskr authored and yaooqinn committed
[KYUUBI #6969] [DOC] Fix "Title underline too short" issues
### Why are the changes needed?

The PR resolves multiple `"Title underline too short"` warnings to reduce noise during documentation building, for instance:

```shell
./kyuubi/docs/client/jdbc/mysql_jdbc.rst:18: WARNING: Title underline too short.
`MySQL Connectors`_
================ [docutils]
./kyuubi/docs/connector/hive/paimon.rst:17: WARNING: Title underline too short.
`Apache Paimon (Incubating)`_
========== [docutils]
./kyuubi/docs/connector/hive/paimon.rst:31: WARNING: Title underline too short.
Apache Paimon (Incubating) Integration
------------------- [docutils]
```

### How was this patch tested?

Checked that there are no `"Title underline too short"` warnings during the documentation build process.

```shell
make html
```

### Was this patch authored or co-authored using generative AI tooling?

No

Closes #6969 from dnskr/doc-fix-title-underline-too-short.

Closes #6969

2007a24 [dnskr] [DOC] Fix "Title underline too short" issues

Authored-by: dnskr <[email protected]>
Signed-off-by: Kent Yao <[email protected]>
1 parent 22ce315 · commit 085a297

9 files changed, +28 -27 lines changed

9 files changed

+28
-27
lines changed
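Every hunk below applies the same reStructuredText rule: a section title's underline (and overline, if any) must be at least as long as the title text, otherwise docutils reports "Title underline too short". A minimal sketch of the before/after pattern, illustrative only and not copied verbatim from any single file:

```rst
.. Too short: the underline is shorter than the title text,
.. so docutils emits "Title underline too short":

Apache Paimon (Incubating)
==========

.. Fixed: underline length is at least the title length, no warning:

Apache Paimon (Incubating)
==========================
```

Either lengthening the underline or shortening the title (as done for the `MySQL Connectors`_ heading, where the link markup moves into the body text) silences the warning.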

docs/client/jdbc/mysql_jdbc.rst (+2 -2)

@@ -14,12 +14,12 @@
 limitations under the License.


-`MySQL Connectors`_
+MySQL Connectors
 ================

 .. versionadded:: 1.4.0

-Kyuubi provides an frontend service that enables the connectivity and accessibility from MySQL connectors.
+Kyuubi provides an frontend service that enables the connectivity and accessibility from `MySQL Connectors`_.

 .. warning:: The document you are visiting now is incomplete, please help kyuubi community to fix it if appropriate for you.

docs/connector/flink/paimon.rst (+3 -3)

@@ -13,10 +13,10 @@
 See the License for the specific language governing permissions and
 limitations under the License.

-`Apache Paimon (Incubating)`_
-=============================
+Apache Paimon (Incubating)
+==========================

-Apache Paimon (Incubating) is a streaming data lake platform that supports high-speed data ingestion, change data tracking, and efficient real-time analytics.
+`Apache Paimon (Incubating)`_ is a streaming data lake platform that supports high-speed data ingestion, change data tracking, and efficient real-time analytics.

 .. tip::
 This article assumes that you have mastered the basic knowledge and operation of `Apache Paimon (Incubating)`_.

docs/connector/hive/paimon.rst (+6 -6)

@@ -13,10 +13,10 @@
 See the License for the specific language governing permissions and
 limitations under the License.

-`Apache Paimon (Incubating)`_
-==========
+Apache Paimon (Incubating)
+==========================

-Apache Paimon(incubating) is a streaming data lake platform that supports high-speed data ingestion, change data tracking and efficient real-time analytics.
+`Apache Paimon (Incubating)`_ is a streaming data lake platform that supports high-speed data ingestion, change data tracking and efficient real-time analytics.

 .. tip::
 This article assumes that you have mastered the basic knowledge and operation of `Apache Paimon (Incubating)`_.

@@ -28,7 +28,7 @@ convenient, easy to understand, and easy to expand than directly using
 Hive to manipulate Apache Paimon (Incubating).

 Apache Paimon (Incubating) Integration
--------------------
+--------------------------------------

 To enable the integration of kyuubi hive sql engine and Apache Paimon (Incubating), you need to:

@@ -69,8 +69,8 @@ Configurations

 If you are using HDFS, make sure that the environment variable HADOOP_HOME or HADOOP_CONF_DIR is set.

-Apache Paimon (Incubating) Operations
-------------------
+Apache Paimon (Incubating) Operations
+-------------------------------------

 Apache Paimon (Incubating) only supports only reading table store tables through Hive.
 A common scenario is to write data with Spark or Flink and read data with Hive.

docs/connector/spark/hive.rst (+1 -1)

@@ -64,7 +64,7 @@ To activate functionality of Kyuubi Spark Hive connector, we can set the followi
 spark.sql.catalog.hive_catalog.<other.hadoop.conf> <value>

 Hive Connector Operations
-------------------
+-------------------------

 Taking ``CREATE NAMESPACE`` as a example,

docs/connector/spark/paimon.rst (+5 -5)

@@ -13,10 +13,10 @@
 See the License for the specific language governing permissions and
 limitations under the License.

-`Apache Paimon (Incubating)`_
-==========
+Apache Paimon (Incubating)
+==========================

-Apache Paimon(incubating) is a streaming data lake platform that supports high-speed data ingestion, change data tracking and efficient real-time analytics.
+`Apache Paimon (Incubating)`_ is a streaming data lake platform that supports high-speed data ingestion, change data tracking and efficient real-time analytics.

 .. tip::
 This article assumes that you have mastered the basic knowledge and operation of `Apache Paimon (Incubating)`_.

@@ -28,7 +28,7 @@ convenient, easy to understand, and easy to expand than directly using
 spark to manipulate Apache Paimon (Incubating).

 Apache Paimon (Incubating) Integration
--------------------
+--------------------------------------

 To enable the integration of Kyuubi Spark SQL engine and Apache Paimon (Incubating) through
 Spark DataSource V2 API, you need to:

@@ -68,7 +68,7 @@ To activate functionality of Apache Paimon (Incubating), we can set the followin
 spark.sql.catalog.paimon.warehouse=file:/tmp/paimon

 Apache Paimon (Incubating) Operations
-------------------
+-------------------------------------


 Taking ``CREATE NAMESPACE`` as a example,

docs/connector/spark/tpcds.rst (+2 -2)

@@ -14,7 +14,7 @@
 limitations under the License.

 TPC-DS
-=====
+======

 The TPC-DS is a decision support benchmark. It consists of a suite of business oriented ad-hoc queries and concurrent
 data modifications. The queries and the data populating the database have been chosen to have broad industry-wide

@@ -82,7 +82,7 @@ To add TPC-DS tables as a catalog, we can set the following configurations in ``
 spark.sql.catalog.tpcds.read.maxPartitionBytes=128m

 TPC-DS Operations
-----------------
+-----------------

 Listing databases under `tpcds` catalog.

docs/connector/trino/paimon.rst (+5 -5)

@@ -13,10 +13,10 @@
 See the License for the specific language governing permissions and
 limitations under the License.

-`Apache Paimon (Incubating)`_
-==========
+Apache Paimon (Incubating)
+==========================

-Apache Paimon(incubating) is a streaming data lake platform that supports high-speed data ingestion, change data tracking and efficient real-time analytics.
+`Apache Paimon (Incubating)`_ is a streaming data lake platform that supports high-speed data ingestion, change data tracking and efficient real-time analytics.

 .. tip::
 This article assumes that you have mastered the basic knowledge and operation of `Apache Paimon (Incubating)`_.

@@ -28,7 +28,7 @@ convenient, easy to understand, and easy to expand than directly using
 trino to manipulate Apache Paimon (Incubating).

 Apache Paimon (Incubating) Integration
--------------------
+--------------------------------------

 To enable the integration of kyuubi trino sql engine and Apache Paimon (Incubating), you need to:

@@ -71,7 +71,7 @@ For example, create $TRINO_SERVER_HOME/etc/catalog/tablestore.properties with th
 warehouse=file:///tmp/warehouse

 Apache Paimon (Incubating) Operations
-------------------
+-------------------------------------

 Apache Paimon (Incubating) supports reading table store tables through Trino.
 A common scenario is to write data with Spark or Flink and read data with Trino.

docs/extensions/server/events.rst (+1 -1)

@@ -14,7 +14,7 @@
 limitations under the License.

 Configure Kyuubi to use Custom EventHandler
-=======================================
+===========================================

 Kyuubi provide event processing mechanism, it can help us to record some events. Beside the builtin ``JsonLoggingEventHandler``,
 Kyuubi supports custom event handler. It is usually used to write Kyuubi events to some external systems.

docs/tools/kyuubi-ctl.rst (+3 -2)

@@ -109,7 +109,8 @@ List all the service nodes for a particular domain.
 .. _create_servers:

 Create server
-***********
+*************
+
 Expose Kyuubi server instance to another domain.

 First read ``kyuubi.ha.namespace`` in ``conf/kyuubi-defaults.conf``, if there are server instances under this namespace, register them in the new namespace specified by the ``--namespace`` parameter.

@@ -132,7 +133,7 @@ Get Kyuubi server info of domain.
 .. _delete_servers:

 Delete server
-***********
+*************

 Delete the specified service node for a domain.
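Not part of this commit, but a possible follow-up to keep such warnings from reappearing is to make the documentation build fail on warnings. A sketch, assuming the stock Sphinx Makefile under docs/ that forwards SPHINXOPTS to sphinx-build:

```shell
# Treat every Sphinx warning (including "Title underline too short") as an error,
# but keep going so all remaining warnings are reported in one run.
make html SPHINXOPTS="-W --keep-going"
```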
