Overview


Updated at: May 28, 2020 GMT+08:00

DLI supports the native Spark Data Source APIs and extends them so that you can query and analyze data from other services through SQL syntax and Spark jobs. Currently, DLI datasource connections support the following services: CloudTable, DWS, RDS, CSS, MRS, DCS, and DDS. To use the datasource capability of DLI, create a datasource connection first. For details about operations on the management console, see the Data Lake Insight User Guide. When creating a DLI datasource connection with a Spark job, you can compile the statements in Scala, PySpark, or Java.
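As a hedged illustration of the Spark Data Source API mentioned above, the following Scala sketch reads a table from an RDS (MySQL) instance through the standard JDBC data source. The endpoint, database name, table name, and credentials are placeholders, not values from this document; the exact format string and options for other services (DWS, CSS, DCS, and so on) differ and are covered in their respective sections.

```scala
// Minimal sketch of a DLI Spark job reading from RDS (MySQL) via the
// native Spark JDBC data source. All connection values are placeholders.
import org.apache.spark.sql.SparkSession

object DatasourceReadSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("dli-datasource-sketch")
      .getOrCreate()

    // Read through the generic JDBC data source. Each DLI-supported
    // service exposes its own format string and option set.
    val df = spark.read
      .format("jdbc")
      .option("url", "jdbc:mysql://rds-endpoint:3306/exampledb") // placeholder endpoint
      .option("dbtable", "example_table")                        // placeholder table
      .option("user", "example_user")                            // placeholder credentials
      .option("password", "example_password")
      .load()

    df.show()   // print a sample of the result set
    spark.stop()
  }
}
```

The same read can be expressed in SQL syntax or in PySpark/Java; the Scala form is shown here only because it is listed first among the supported languages.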
