Submitting a Spark Job

Updated at: May 14, 2020 GMT+08:00

This section describes how to submit a Spark job using DLI. The general procedure is as follows:

Step 1: Logging in to the Public Cloud

Step 2: Uploading Data to OBS

Step 3: Entering the Spark Job Creation Page

Step 4: Creating a Queue

Step 5: Creating a Package

Step 6: Submitting a Spark Job

Step 1: Logging in to the Public Cloud

To use DLI, you need to log in to the public cloud.

  1. Open the cloud homepage.
  2. On the login page, enter the Username and Password, and click Login.

Step 2: Uploading Data to OBS

Before submitting Spark jobs, upload data files to OBS.

  1. From the menu at the top of the public cloud homepage, move your cursor over Products.
  2. In the displayed service list, click Object Storage Service under Storage.
  3. On the OBS product page, click Console. The OBS console page is displayed.
  4. Create a bucket. Bucket names must be globally unique. In this example, assume that the bucket name is obs1.
    1. Click Create Bucket.
    2. On the Create Bucket page that is displayed, specify Region and Bucket Name.

      When creating an OBS bucket, you must select the same region as the DLI management console.

    3. Click Create Now.
  5. Click obs1 to switch to the Summary page.
  6. From the left navigation tree, click Object. Click Upload Object. In the displayed dialog box, drag files or folders to the upload box, or click add file to select one, for example, spark-examples_09.jar. Then, click Upload.

    After the file is uploaded successfully, the file path to be analyzed is s3a://obs1/spark-examples_09.jar.
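
If you prefer to script this step instead of clicking through the console, the sketch below uses the OBS Python SDK (esdk-obs-python). It is an illustrative sketch, not the documented procedure: the endpoint, access keys (AK/SK), and local file path are placeholders you must replace, and the createBucket call can be skipped if obs1 already exists.

    # Sketch: upload the job package to OBS with the OBS Python SDK.
    # Placeholders: <your_ak>, <your_sk>, <region>, and the local jar path.
    from obs import ObsClient

    client = ObsClient(
        access_key_id="<your_ak>",
        secret_access_key="<your_sk>",
        server="https://obs.<region>.myhuaweicloud.com",  # same region as DLI
    )
    try:
        # Bucket names are globally unique; skip this call if obs1 exists.
        client.createBucket("obs1", location="<region>")
        # Upload the jar; the object key becomes the path DLI reads from.
        resp = client.putFile("obs1", "spark-examples_09.jar",
                              "/local/path/spark-examples_09.jar")
        if resp.status < 300:
            print("Uploaded to s3a://obs1/spark-examples_09.jar")
        else:
            print("Upload failed:", resp.errorMessage)
    finally:
        client.close()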

Step 3: Entering the Spark Job Creation Page

To submit Spark jobs, you need to enter the Spark job creation page first.

  1. From the menu at the top of the public cloud homepage, move your cursor over Products.
  2. In the displayed service list, click Data Lake Insight under Enterprise Intelligence.
  3. On the DLI product page, click Access Console. The DLI management console page is displayed. If you are logging in to the DLI management console for the first time, you need to grant DLI permission to access OBS.
  4. Click Spark Jobs on the Overview page, or click Create Job on the right, to go to the Spark job creation page.

Step 4: Creating a Queue

If it is your first time submitting a Spark job, create a queue first. For example, create a queue named testnew. For details about how to create a queue, see Creating a Queue.

Step 5: Creating a Package

Before submitting a Spark job, you need to create a package, for example, spark-examples_09.jar. For details about how to create a package, see Creating a Package.
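
The sample package here is a precompiled jar. If you are writing your own application to package instead, the sketch below shows the kind of program such a package contains: a minimal PySpark job that approximates Pi, in the spirit of the classic SparkPi example. It is illustrative only; the application name and sample count are arbitrary, and the actual spark-examples_09.jar ships its own compiled examples.

    # pi.py: a minimal Spark application sketch (PySpark).
    from operator import add
    from random import random

    from pyspark.sql import SparkSession

    if __name__ == "__main__":
        spark = SparkSession.builder.appName("PythonPi").getOrCreate()
        n = 200000  # number of random samples; illustrative only

        def inside(_):
            # Count a hit when a random point lands inside the unit circle.
            x, y = random(), random()
            return 1 if x * x + y * y <= 1 else 0

        count = spark.sparkContext.parallelize(range(n), 2).map(inside).reduce(add)
        print("Pi is roughly %f" % (4.0 * count / n))
        spark.stop()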

Step 6: Submitting a Spark Job

  1. On the Spark job editing page, set related parameters. For details, see GUI Description.
  2. Click Execute in the upper right corner of the Spark job editing window, read and agree to the privacy agreement, and click OK to submit the job. A message is displayed, indicating that the job is submitted successfully.
  3. (Optional) Switch to the Job Management > Spark Jobs page to view the status and logs of the submitted Spark job.

    When you click Execute on the DLI management console for the first time, you need to read the privacy agreement. Once you agree to it, you will not be prompted about the privacy agreement for subsequent operations.
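
If you want to automate submission rather than use the console, DLI jobs can also be submitted programmatically. The sketch below assumes DLI's v2.0 batch-job REST endpoint; the region, project ID, IAM token, and main class name are placeholders, so check the DLI API reference for the exact request format before relying on it.

    # Sketch: submitting the same Spark job via a REST call (assumed v2.0
    # batch-job endpoint). All angle-bracket values are placeholders.
    import requests

    region = "<region>"          # the region hosting your DLI queue
    project_id = "<project_id>"  # your project ID
    token = "<iam_token>"        # an IAM authentication token

    url = f"https://dli.{region}.myhuaweicloud.com/v2.0/{project_id}/batches"
    body = {
        "file": "s3a://obs1/spark-examples_09.jar",        # package from Step 5
        "className": "org.apache.spark.examples.SparkPi",  # assumed main class
        "queue": "testnew",                                # queue from Step 4
    }
    resp = requests.post(url, json=body, headers={"X-Auth-Token": token})
    print(resp.status_code, resp.json())  # the response carries the job ID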
