diff --git a/dataproc/python-api-walkthrough.md b/dataproc/python-api-walkthrough.md
index f64d7528601..1a8d436f720 100644
--- a/dataproc/python-api-walkthrough.md
+++ b/dataproc/python-api-walkthrough.md
@@ -1,4 +1,4 @@
-# Use the Python Client Library to call Cloud Dataproc APIs
+# Use the Python Client Library to call Dataproc APIs
 
 Estimated completion time:
 
@@ -7,13 +7,13 @@
 
-1. Enable the Cloud Dataproc, Compute Engine, and Cloud Storage APIs in your project.
-    *
+1. Click the link below to enable the Dataproc, Compute Engine, and Cloud Storage APIs
+   in a separate GCP console tab in your browser.
+
+   **Note:** After you select your project and enable the APIs, return to this tutorial by clicking
+   on the **Cloud Shell** tab in your browser.
+
+   * [Enable APIs](https://console.cloud.google.com/flows/enableapi?apiid=dataproc,compute_component,storage-component.googleapis.com&redirect=https://console.cloud.google.com)
 
 ## Prerequisites (2)
@@ -140,7 +145,8 @@ Job output in Cloud Shell shows cluster creation, job submission,
 
 ### Next Steps:
 
 * **View job details from the Console.** View job details by selecting the
-  PySpark job from the Cloud Dataproc
+  PySpark job from the Dataproc
+  [Jobs page](https://console.cloud.google.com/dataproc/jobs)
   in the Google Cloud Platform Console.
 
@@ -160,5 +166,5 @@ Job output in Cloud Shell shows cluster creation, job submission,
     gsutil rm -r gs://$BUCKET
     ```
 
-* **For more information.** See the [Cloud Dataproc documentation](https://cloud.google.com/dataproc/docs/)
+* **For more information.** See the [Dataproc documentation](https://cloud.google.com/dataproc/docs/)
   for API reference and product feature information.