
BigQuery: Permission denied while getting Drive credentials - Unable to resolve the error

Asked by Andrew Henderson

I was hoping to get some help with this error code I have been coming across.

Context:

  • The company I work for uses the G Suite product.
  • My team has its own Cloud Project set up.
  • The Google Drive isn't a "personal" drive.
  • We utilise Airflow to refresh our BigQuery tables on a daily/weekly/monthly basis.

I have followed these solutions:

Access Denied: Permission denied while getting Drive credentials

"Encountered an error while globbing file pattern" error when using BigQuery API w/ Google Sheets

I have also referenced:

Problem

Cloud Composer: v1.12.0

I have recently set up an external BigQuery table that reads a tab within a Google Sheet. My Airflow DAG has been failing to complete due to the access restriction on Drive. I have added the following to the Airflow connection scopes:

(screenshot: Airflow connection scopes)

I also added the service account's e-mail address to the Google Sheet the table references (via Share), and updated the service account's IAM role to BigQuery Admin. After following these steps, I still receive the error BigQuery: Permission denied while getting Drive credentials.
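The scopes screenshot did not survive extraction, so as a hedged sketch: when reading a Sheets-backed external table, the credentials typically need a Drive scope in addition to the BigQuery scope. The scope URLs below are the standard Google OAuth scope strings; everything else (the function name, the lazy import) is illustrative, not the poster's actual configuration.

```python
# Drive access is the key addition: BigQuery needs it to read an
# external table backed by a Google Sheet.
REQUIRED_SCOPES = [
    "https://www.googleapis.com/auth/bigquery",
    "https://www.googleapis.com/auth/drive",
]

def drive_scoped_credentials():
    # Imported lazily so the sketch can be loaded without the
    # google-auth package installed.
    import google.auth

    # Application Default Credentials, re-issued with both scopes.
    credentials, project = google.auth.default(scopes=REQUIRED_SCOPES)
    return credentials, project
```

The same two scope URLs are what would go into the Airflow connection's scopes field.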


Problem 2

Following the above, I found it easier to troubleshoot locally, so I created a venv on my machine, because that's where I'm most comfortable troubleshooting. The goal is simply to query a BigQuery table that reads a Google Sheet. However, after following the same steps as above, I am still unable to get this to work.

My local code:

import dotenv
import pandas as pd
from google.cloud import bigquery
import google.auth

def run_BigQuery_table(sql):
    dotenv.load_dotenv()
    credentials, project = google.auth.default(
        scopes=[
            "",
            "",
            "",
        ]
    )
    bigquery.Client(project, credentials)
    output = pd.read_gbq(sql, project_id=project, dialect='standard')
    return output

script_variable = "SELECT * FROM `X` LIMIT 10"
bq_output = run_BigQuery_table(script_variable)
print(bq_output)

My error:

raise self._exception
google.api_core.exceptions.Forbidden: 403 Access Denied: BigQuery BigQuery: Permission denied while getting Drive credentials.

raise GenericGBQException("Reason: {0}".format(ex))
pandas_gbq.gbq.GenericGBQException: Reason: 403 Access Denied: BigQuery BigQuery: Permission denied while getting Drive credentials.

Is anyone able to help?

Cheers


2 Answers

A colleague suggested that I explore the pandas_gbq credentials, as it might be falling back to default credentials to access the data.

Turns out, it worked.

You can manually set the pandas-gbq credentials by following this:

I simply added the following to my code

pdgbq.context.credentials = credentials

The final output:

import dotenv
import pandas as pd
from google.cloud import bigquery
import google.auth
import pandas_gbq as pdgbq

def run_BigQuery_table(sql):
    dotenv.load_dotenv()
    credentials, project = google.auth.default(
        scopes=[
            "",
            "",
            "",
        ]
    )
    # Hand pandas-gbq the explicitly scoped credentials instead of
    # letting it resolve its own defaults.
    pdgbq.context.credentials = credentials
    bigquery.Client(project, credentials)
    output = pd.read_gbq(sql, project_id=project, dialect='standard')
    return output

script_variable = "SELECT * FROM `X` LIMIT 10"
bq_output = run_BigQuery_table(script_variable)
print(bq_output)

I often get these errors, and the vast majority were solved by creating and sharing service accounts. However, I recently had a case where our G Suite administrator updated the security settings so that only our employees could access G Suite resources (spreadsheets, storage, etc.). It was an attempt to close a security gap, but in doing so, any e-mail address or service account that did not end in @ourcompany.com was blocked from using BigQuery.

I recommend you explore your company's G Suite settings and see if external access is blocked. I cannot say this is the fix for your case, but it was for mine, so it could be worth trying.
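One way to test for this situation, as a hedged sketch: ask the Drive API directly whether the service account can see the Sheet at all. If a domain-wide sharing restriction blocks the account, this fails with a 403 (or 404) regardless of any BigQuery settings. `SHEET_FILE_ID` is a placeholder, and the sketch assumes the google-api-python-client and google-auth packages are installed.

```python
SHEET_FILE_ID = "your-google-sheet-file-id"  # placeholder, not a real ID

def can_service_account_see_sheet(file_id=SHEET_FILE_ID):
    # Lazy imports keep the sketch loadable without the GCP libraries.
    import google.auth
    from googleapiclient.discovery import build
    from googleapiclient.errors import HttpError

    credentials, _ = google.auth.default(
        scopes=["https://www.googleapis.com/auth/drive.readonly"]
    )
    drive = build("drive", "v3", credentials=credentials)
    try:
        # Fetch only the file's metadata; no content access needed.
        meta = drive.files().get(fileId=file_id, fields="id,name").execute()
        print("Service account can read:", meta["name"])
        return True
    except HttpError as err:
        # A domain-restricted account typically sees 403, or 404 when it
        # cannot even discover that the file exists.
        print("Drive access failed:", err)
        return False
```

Run this with GOOGLE_APPLICATION_CREDENTIALS pointing at the service account key; if it fails here, the problem is Drive-side sharing, not BigQuery.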

