[KYUUBI #2560] Upgrade kyuubi-hive-jdbc hive version to 3.1.3
[KYUUBI #2602] Bump testcontainers-scala 0.40.7
[KYUUBI #2565] Variable substitution should work in plan only mode
[KYUUBI #2493][FOLLOWUP] Fix the exception that occurred when beeline rendered spark progress
[KYUUBI #2378] Implement BatchesResource GET /batches/${batchId}/log
[KYUUBI #2599] Bump scala-maven-plugin 4.6.1
[KYUUBI #2493] Implement the progress of statement for spark sql engine
[KYUUBI #2375][FOLLOWUP] Implement BatchesResource GET /batches
[KYUUBI #2588] Reformat kyuubi-hive-sql-engine/pom.xml
[KYUUBI #2558] Fix warn message
[KYUUBI #2427][FOLLOWUP] Flaky test: deregister when meeting specified exception
[KYUUBI #2582] Minimize Travis build and test
[KYUUBI #2500][FOLLOWUP] Resolve flink conf at engine side
[KYUUBI #2571] Minimize YARN tests overhead
[KYUUBI #2573][KPIP-4][SUB-TASK] Add a seekable buffered reader for random access operation log
[KYUUBI #2375][SUB-TASK][KPIP-4] Implement BatchesResource GET /batches
[KYUUBI #2571] Release connection to prevent the engine leak
[KYUUBI #2522] Even if the process exit code is zero, also check the application state from the resource manager
[KYUUBI #2569] Change the acquisition method of flinkHome to keep it consistent with other engines
[KYUUBI #2550] Fix swagger not showing the request/response schema
[KYUUBI #2500] Command OptionParser for launching Flink Backend Engine
[KYUUBI #2379][SUB-TASK][KPIP-4] Implement BatchesResource DELETE /batches/${batchId}
Update .gitlab-ci.yml file
[KYUUBI #2513] Support NULL type in trino engine and add QueryTests
[KYUUBI #2403][Improvement] Move addTimeoutMonitor to AbstractOperation because it is used in multiple engines
[KYUUBI #2531][Subtask] Kyuubi Spark TPC-DS Connector - Initial implementation
[KYUUBI #2523] Flaky Test: KyuubiBatchYarnClusterSuite - open batch session
[KYUUBI #2376][SUB-TASK][KPIP-4] Implement BatchesResource GET /batches/${batchId}
[KYUUBI #2547] Support jdbc url prefix jdbc:kyuubi://
[KYUUBI #2549] Do not auth the request to load OpenApiConf
[KYUUBI #2548] Prevent dead loop if the batch job submission process is not alive
[KYUUBI #2533] Make Utils.parseURL public to remove unnecessary reflection
[KYUUBI #2524][DOCS] Update metrics.md
[KYUUBI #2532] Avoid NPE in KyuubiHiveDriver.acceptsURL
[KYUUBI #2478][FOLLOWUP] Invoke getOpts method instead of reflection
[KYUUBI #2490][FOLLOWUP] Fix and move set command test case
[KYUUBI #2517] Rename ZorderSqlAstBuilder to KyuubiSparkSQLAstBuilder
[KYUUBI #2025][HIVE] Add a Hive on Yarn doc
[KYUUBI #2032][Subtask] Hive Backend Engine - new APIs with hive-service-rpc 3.1.2 - SetClientInfo
[KYUUBI #2490] Fix NPE in getOperationStatus
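Two of the JDBC-driver commits above touch URL handling: [KYUUBI #2547] adds support for the `jdbc:kyuubi://` URL prefix, and [KYUUBI #2532] guards `KyuubiHiveDriver.acceptsURL` against an NPE. The sketch below illustrates the kind of null-safe prefix check involved; it assumes the driver recognizes the legacy `jdbc:hive2://` prefix alongside the new one, and the class and method names here are illustrative, not the actual Kyuubi implementation:

```java
// Illustrative sketch only; not the real Kyuubi driver source.
public class UrlCheck {
    // Assumed accepted prefixes: the new jdbc:kyuubi:// scheme (#2547)
    // plus the legacy Hive scheme. The real driver may accept more.
    private static final String[] PREFIXES = {"jdbc:hive2://", "jdbc:kyuubi://"};

    // Null-safe acceptsURL-style check: a null URL returns false
    // instead of throwing an NPE (the class of bug fixed in #2532).
    public static boolean acceptsUrl(String url) {
        if (url == null) {
            return false;
        }
        for (String prefix : PREFIXES) {
            if (url.startsWith(prefix)) {
                return true;
            }
        }
        return false;
    }

    public static void main(String[] args) {
        System.out.println(acceptsUrl("jdbc:kyuubi://host:10009/default"));
        System.out.println(acceptsUrl(null));
    }
}
```

Note the `java.sql.Driver.acceptsURL` contract expects a plain `false` (or `SQLException`) for unusable URLs, so returning early on `null` is the conventional fix for this kind of NPE.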