Connect with Python Notebooks
Even though the datasets are hosted by us, you still need to connect to the BigQuery API so that Google knows you are querying from your project.
The BigQuery service exposes a set of REST APIs that you use to query and retrieve data from BigQuery datasets.
You can access the BigQuery API through client libraries for languages such as Python, Java, and C++. If your application cannot use a client library, you can call the REST API directly over HTTP.
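As a quick illustration (not part of the original guide), here is a minimal Python sketch using the google-cloud-bigquery client library to list the datasets in a project. It assumes the library is installed and authentication is already configured, both of which are covered in the steps below; the project ID is a placeholder.

from google.cloud import bigquery

# Assumes google-cloud-bigquery is installed and credentials are configured
# (see the service-account steps below). The project ID is a placeholder.
client = bigquery.Client(project="your-project-id")
for dataset in client.list_datasets():
    print(dataset.dataset_id)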
You can enable the BigQuery API in two ways: from the Cloud Console or from Cloud Shell.
Method 1: Enabling the BigQuery API from the Cloud Console
Visit the Google Cloud Console.
In the navigation menu, click on “APIs & Services”.
In the search bar, type “BigQuery API” and press Enter.
You are redirected to a page listing the various BigQuery APIs. Click on the “BigQuery API” displayed at the top.
You are then taken to the API’s page, where you are prompted to enable it. Click on the “ENABLE” button.
Once the BigQuery API is successfully enabled, an “API Enabled” tag is displayed on the page.
You can also see the Activation status shown as “Enabled” in the API’s overview tab in the Google Cloud Console.
Method 2: Enabling the BigQuery API from Cloud Shell
Open your Cloud Shell and execute the following command to enable the BigQuery API.
gcloud services enable bigquery.googleapis.com
You can check whether the BigQuery API is successfully enabled by executing the command given below.
gcloud services list
To access data from BigQuery using the BigQuery API, you have to create a service account and download an authentication file for Jupyter Notebook.
To create a new service account, go to the “Create Service Account” page in the Google Cloud Console.
Now, you are prompted to enter the Service account name, ID, and description. After filling in the details, click on the “Create and Continue” button.
The newly created service account appears on the “Credentials” page of the Google Cloud Console, confirming that it was created successfully.
Now, you have to create an authentication key for the service account. On the “Credentials” page, click the pencil icon beside the newly created service account. Then, click on the “Add Key” button and select the “Create New Key” option.
In the next step, select the key type as JSON and click on the “Create” button.
Now, the JSON file with the authentication information is downloaded to your local machine.
At this stage, you are all set to connect BigQuery to your Jupyter Notebook. Open a command prompt and run the following command to install the necessary packages for connecting BigQuery with Jupyter Notebook.
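The exact package list can vary; a typical minimal install (an assumption, not this guide’s verbatim command) is the BigQuery Python client library with its pandas extras:

pip install --upgrade "google-cloud-bigquery[pandas]"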
You can also install the packages from within Jupyter Notebook with a one-line command. Open your Jupyter Notebook and run the following in a code cell.
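Assuming the same packages as above, the notebook equivalent would be:

%pip install --upgrade "google-cloud-bigquery[pandas]"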
Use the following notebook as a template to start running your first queries against Numia and doing some magic with Python.
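Below is a minimal sketch of what such a notebook can look like. It is not Numia’s official template: the key path, project ID, and table reference are placeholders you need to replace with your own values and an actual Numia dataset and table.

from google.cloud import bigquery
from google.oauth2 import service_account

# Path to the service-account JSON key downloaded in the previous step (placeholder).
key_path = "path/to/your-service-account-key.json"

credentials = service_account.Credentials.from_service_account_file(
    key_path,
    scopes=["https://www.googleapis.com/auth/cloud-platform"],
)
client = bigquery.Client(credentials=credentials, project=credentials.project_id)

# Placeholder query -- replace the table reference with an actual Numia dataset and table.
query = """
SELECT *
FROM `your-project.your_dataset.your_table`
LIMIT 10
"""

df = client.query(query).to_dataframe()  # returns a pandas DataFrame
df.head()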