Control access to resources with IAM

This document describes how to view the current access policy of a resource, how to grant access to a resource, and how to revoke access to a resource.

This document assumes familiarity with the Identity and Access Management (IAM) system in Google Cloud.

Required roles

To get the permissions that you need to modify IAM policies for resources, ask your administrator to grant you the BigQuery Data Owner (roles/bigquery.dataOwner) IAM role on the project. For more information about granting roles, see Manage access to projects, folders, and organizations.
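
For example, your administrator might grant this role at the project level with the gcloud CLI. In the following sketch, the project ID and user email are placeholders:

gcloud projects add-iam-policy-binding PROJECT_ID \
    --member="user:[email protected]" \
    --role="roles/bigquery.dataOwner"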

This predefined role contains the permissions required to modify IAM policies for resources. The exact permissions that are required are listed in the following section:

Required permissions

The following permissions are required to modify IAM policies for resources:

  • To get a dataset's access policy: bigquery.datasets.get
  • To set a dataset's access policy: bigquery.datasets.update
  • To get a dataset's access policy (Google Cloud console only): bigquery.datasets.getIamPolicy
  • To set a dataset's access policy (Google Cloud console only): bigquery.datasets.setIamPolicy
  • To get a table or view's policy: bigquery.tables.getIamPolicy
  • To set a table or view's policy: bigquery.tables.setIamPolicy
  • To run BigQuery jobs with the bq command-line tool or SQL (optional): bigquery.jobs.create

You might also be able to get these permissions with custom roles or other predefined roles.

View the access policy of a resource

The following sections describe how to view the access policies of different resources.

View the access policy of a dataset

Select one of the following options:

Console

  1. Go to the BigQuery page.

    Go to BigQuery

  2. In the Explorer pane, expand your project and select a dataset.

  3. Click Sharing > Permissions.

    The dataset access policies appear in the Dataset Permissions pane.

bq

  1. In the Google Cloud console, activate Cloud Shell.

    Activate Cloud Shell

    At the bottom of the Google Cloud console, a Cloud Shell session starts and displays a command-line prompt. Cloud Shell is a shell environment with the Google Cloud CLI already installed and with values already set for your current project. It can take a few seconds for the session to initialize.

  2. To get an existing policy and output it to a local file in JSON, use the bq show command in Cloud Shell:

    bq show \
       --format=prettyjson \
       PROJECT_ID:DATASET > PATH_TO_FILE

    Replace the following:

    • PROJECT_ID: your project ID
    • DATASET: the name of your dataset
    • PATH_TO_FILE: the path to the JSON file on your local machine
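
    For example, the following command writes the access policy of a dataset named mydataset (a hypothetical name) in your default project to a file named dataset_policy.json:

    bq show \
       --format=prettyjson \
       mydataset > dataset_policy.json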

API

To view the access policy of a dataset, call the datasets.get method with a defined dataset resource.

The policy is available in the access property of the returned dataset resource.
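
The following minimal sketch reads the same information with the google-cloud-bigquery Python client, which calls datasets.get under the hood; the dataset ID is a placeholder:

from google.cloud import bigquery

client = bigquery.Client()

# get_dataset() calls the datasets.get API method.
dataset = client.get_dataset("your-project.your_dataset")

# Each access entry corresponds to one element of the dataset's access property.
for entry in dataset.access_entries:
    print(entry.role, entry.entity_type, entry.entity_id)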

View the access policy of a table or view

Select one of the following options:

Console

  1. Go to the BigQuery page.

    Go to BigQuery

  2. In the Explorer pane, expand your project and select a table or view.

  3. Click Share.

    The table or view access policies appear in the Share pane.

bq

  1. In the Google Cloud console, activate Cloud Shell.

    Activate Cloud Shell

    At the bottom of the Google Cloud console, a Cloud Shell session starts and displays a command-line prompt. Cloud Shell is a shell environment with the Google Cloud CLI already installed and with values already set for your current project. It can take a few seconds for the session to initialize.

  2. To get an existing access policy and output it to a local file in JSON, use the bq get-iam-policy command in Cloud Shell:

    bq get-iam-policy \
       --table=true \
       PROJECT_ID:DATASET.RESOURCE > PATH_TO_FILE

    Replace the following:

    • PROJECT_ID: your project ID
    • DATASET: the name of your dataset
    • RESOURCE: the name of the table or view whose policy you want to view
    • PATH_TO_FILE: the path to the JSON file on your local machine
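
    For example, the following command writes the access policy of a table named mytable (with hypothetical project and dataset names) to a file named table_policy.json:

    bq get-iam-policy \
       --table=true \
       myproject:mydataset.mytable > table_policy.json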

API

To retrieve the current policy, call the tables.getIamPolicy method.
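
The following minimal sketch does the same with the google-cloud-bigquery Python client; the table ID is a placeholder:

from google.cloud import bigquery

client = bigquery.Client()

# get_iam_policy() calls the tables.getIamPolicy API method.
policy = client.get_iam_policy("your-project.your_dataset.your_table")

for binding in policy.bindings:
    print(binding["role"], binding["members"])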

Grant access to a resource

The following sections describe how to grant access to different resources.

Grant access to a dataset

Select one of the following options:

Console

  1. Go to the BigQuery page.

    Go to BigQuery

  2. In the Explorer pane, expand your project and select a dataset to share.

  3. Click Sharing > Permissions.

  4. Click Add principal.

  5. In the New principals field, enter a principal.

  6. In the Select a role list, select a predefined role or a custom role.

  7. Click Save.

  8. To return to the dataset details, click Close.

SQL

To grant principals access to datasets, use the GRANT DCL statement:

  1. In the Google Cloud console, go to the BigQuery page.

    Go to BigQuery

  2. In the query editor, enter the following statement:

    GRANT `ROLE_LIST`
    ON SCHEMA RESOURCE_NAME
    TO "USER_LIST"

    Replace the following:

    • ROLE_LIST: a role or list of comma-separated roles that you want to grant
    • RESOURCE_NAME: the name of the resource that you want to grant the permission on
    • USER_LIST: a comma-separated list of users that the role is granted to

      For a list of valid formats, see user_list.

  3. Click Run.

For more information about how to run queries, see Run an interactive query.

The following example grants the Data Viewer role on the dataset myDataset:

GRANT `roles/bigquery.dataViewer`
ON SCHEMA `myProject`.myDataset
TO "user:[email protected]", "user:[email protected]"

bq

  1. In the Google Cloud console, activate Cloud Shell.

    Activate Cloud Shell

    At the bottom of the Google Cloud console, a Cloud Shell session starts and displays a command-line prompt. Cloud Shell is a shell environment with the Google Cloud CLI already installed and with values already set for your current project. It can take a few seconds for the session to initialize.

  2. To write the existing dataset information (including access controls) to a JSON file, use the bq show command:

    bq show \
       --format=prettyjson \
       PROJECT_ID:DATASET > PATH_TO_FILE

    Replace the following:

    • PROJECT_ID: your project ID
    • DATASET: the name of your dataset
    • PATH_TO_FILE: the path to the JSON file on your local machine
  3. Make changes to the access section of the JSON file. You can add to any of the specialGroup entries: projectOwners, projectWriters, projectReaders, and allAuthenticatedUsers. You can also add any of the following: userByEmail, groupByEmail, and domain.

    For example, the access section of a dataset's JSON file would look like the following:

    {
     "access": [
      {
       "role": "READER",
       "specialGroup": "projectReaders"
      },
      {
       "role": "WRITER",
       "specialGroup": "projectWriters"
      },
      {
       "role": "OWNER",
       "specialGroup": "projectOwners"
      },
      {
       "role": "READER",
       "specialGroup": "allAuthenticatedUsers"
      },
      {
       "role": "READER",
       "domain": "domain_name"
      },
      {
       "role": "WRITER",
       "userByEmail": "user_email"
      },
      {
       "role": "READER",
       "groupByEmail": "group_email"
      }
     ],
     ...
    }

  4. When your edits are complete, use the bq update command and include the JSON file using the --source flag. If the dataset is in a project other than your default project, add the project ID to the dataset name in the following format: PROJECT_ID:DATASET.

    bq update \
    --source PATH_TO_FILE \
    PROJECT_ID:DATASET
  5. To verify your access control changes, use the bq show command again without writing the information to a file:

    bq show --format=prettyjson PROJECT_ID:DATASET

Terraform

Use the google_bigquery_dataset_iam resources to update access to a dataset.

Set the access policy for a dataset

The following example shows how to use the google_bigquery_dataset_iam_policy resource to set the IAM policy for the mydataset dataset. This replaces any existing policy already attached to the dataset:

# This file sets the IAM policy for the dataset created by
# https://github.com/terraform-google-modules/terraform-docs-samples/blob/main/bigquery/bigquery_create_dataset/main.tf.
# You must place it in the same local directory as that main.tf file,
# and you must have already applied that main.tf file to create
# the "default" dataset resource with a dataset_id of "mydataset".

data "google_iam_policy" "iam_policy" {
  binding {
    role = "roles/bigquery.admin"
    members = [
      "user:[email protected]",
    ]
  }
  binding {
    role = "roles/bigquery.dataOwner"
    members = [
      "group:[email protected]",
    ]
  }
  binding {
    role = "roles/bigquery.dataEditor"
    members = [
      "serviceAccount:bqcx-1234567891011-12a3@gcp-sa-bigquery-condel.iam.gserviceaccount.com",
    ]
  }
}

resource "google_bigquery_dataset_iam_policy" "dataset_iam_policy" {
  dataset_id  = google_bigquery_dataset.default.dataset_id
  policy_data = data.google_iam_policy.iam_policy.policy_data
}

Set role membership for a dataset

The following example shows how to use the google_bigquery_dataset_iam_binding resource to set membership in a given role for the mydataset dataset. This replaces any existing membership in that role. Other roles within the IAM policy for the dataset are preserved:

# This file sets membership in an IAM role for the dataset created by
# https://github.com/terraform-google-modules/terraform-docs-samples/blob/main/bigquery/bigquery_create_dataset/main.tf.
# You must place it in the same local directory as that main.tf file,
# and you must have already applied that main.tf file to create
# the "default" dataset resource with a dataset_id of "mydataset".

resource "google_bigquery_dataset_iam_binding" "dataset_iam_binding" {
  dataset_id = google_bigquery_dataset.default.dataset_id
  role       = "roles/bigquery.jobUser"

  members = [
    "user:[email protected]",
    "group:[email protected]"
  ]
}

Set role membership for a single principal

The following example shows how to use the google_bigquery_dataset_iam_member resource to update the IAM policy for the mydataset dataset to grant a role to one principal. Updating this IAM policy does not affect access for any other principals that have been granted that role for the dataset.

# This file adds a member to an IAM role for the dataset created by
# https://github.com/terraform-google-modules/terraform-docs-samples/blob/main/bigquery/bigquery_create_dataset/main.tf.
# You must place it in the same local directory as that main.tf file,
# and you must have already applied that main.tf file to create
# the "default" dataset resource with a dataset_id of "mydataset".

resource "google_bigquery_dataset_iam_member" "dataset_iam_member" {
  dataset_id = google_bigquery_dataset.default.dataset_id
  role       = "roles/bigquery.user"
  member     = "user:[email protected]"
}

To apply your Terraform configuration in a Google Cloud project, complete the steps in the following sections.

Prepare Cloud Shell

  1. Launch Cloud Shell.
  2. Set the default Google Cloud project where you want to apply your Terraform configurations.

    You only need to run this command once per project, and you can run it in any directory.

    export GOOGLE_CLOUD_PROJECT=PROJECT_ID

    Environment variables are overridden if you set explicit values in the Terraform configuration file.

Prepare the directory

Each Terraform configuration file must have its own directory (also called a root module).

  1. In Cloud Shell, create a directory and a new file within that directory. The file name must have the .tf extension, for example main.tf. In this tutorial, the file is referred to as main.tf.
    mkdir DIRECTORY && cd DIRECTORY && touch main.tf
  2. If you are following a tutorial, you can copy the sample code in each section or step.

    Copy the sample code into the newly created main.tf.

    Optionally, copy the code from GitHub. This is recommended when the Terraform snippet is part of an end-to-end solution.

  3. Review and modify the sample parameters to apply to your environment.
  4. Save your changes.
  5. Initialize Terraform. You only need to do this once per directory.
    terraform init

    Optionally, to use the latest Google provider version, include the -upgrade option:

    terraform init -upgrade

Apply the changes

  1. Review the configuration and verify that the resources that Terraform is going to create or update match your expectations:
    terraform plan

    Make corrections to the configuration as necessary.

  2. Apply the Terraform configuration by running the following command and entering yes at the prompt:
    terraform apply

    Wait until Terraform displays the "Apply complete!" message.

  3. Open your Google Cloud project to view the results. In the Google Cloud console, navigate to your resources in the UI to make sure that Terraform has created or updated them.

API

To apply access controls when the dataset is created, call the datasets.insert method with a defined dataset resource. To update your access controls, call the datasets.patch method and use the access property in the Dataset resource.

Because the datasets.update method replaces the entire dataset resource, datasets.patch is the preferred method for updating access controls.

Go

Before trying this sample, follow the Go setup instructions in the BigQuery quickstart using client libraries. For more information, see the BigQuery Go API reference documentation.

To authenticate to BigQuery, set up Application Default Credentials. For more information, see Set up authentication for client libraries.

import (
	"context"
	"fmt"

	"cloud.google.com/go/bigquery"
)

// updateDatasetAccessControl demonstrates how the access control policy of a dataset
// can be amended by adding an additional entry corresponding to a specific user identity.
func updateDatasetAccessControl(projectID, datasetID string) error {
	// projectID := "my-project-id"
	// datasetID := "mydataset"
	ctx := context.Background()
	client, err := bigquery.NewClient(ctx, projectID)
	if err != nil {
		return fmt.Errorf("bigquery.NewClient: %v", err)
	}
	defer client.Close()

	ds := client.Dataset(datasetID)
	meta, err := ds.Metadata(ctx)
	if err != nil {
		return err
	}
	// Append a new access control entry to the existing access list.
	update := bigquery.DatasetMetadataToUpdate{
		Access: append(meta.Access, &bigquery.AccessEntry{
			Role:       bigquery.ReaderRole,
			EntityType: bigquery.UserEmailEntity,
			Entity:     "[email protected]"},
		),
	}

	// Leverage the ETag for the update to assert there's been no modifications to the
	// dataset since the metadata was originally read.
	if _, err := ds.Update(ctx, update, meta.ETag); err != nil {
		return err
	}
	return nil
}

Java

Before trying this sample, follow the Java setup instructions in the BigQuery quickstart using client libraries. For more information, see the BigQuery Java API reference documentation.

To authenticate to BigQuery, set up Application Default Credentials. For more information, see Set up authentication for client libraries.

import com.google.cloud.bigquery.Acl;
import com.google.cloud.bigquery.Acl.Role;
import com.google.cloud.bigquery.Acl.User;
import com.google.cloud.bigquery.BigQuery;
import com.google.cloud.bigquery.BigQueryException;
import com.google.cloud.bigquery.BigQueryOptions;
import com.google.cloud.bigquery.Dataset;
import java.util.ArrayList;

public class UpdateDatasetAccess {

  public static void main(String[] args) {
    // TODO(developer): Replace these variables before running the sample.
    String datasetName = "MY_DATASET_NAME";
    // Create a new ACL granting the READER role to "[email protected]"
    // For more information on the types of access entries available, see:
    // https://cloud.google.com/bigquery/docs/reference/rest/v2/datasets#Dataset.FIELDS.access
    Acl newEntry = Acl.of(new User("[email protected]"), Role.READER);

    updateDatasetAccess(datasetName, newEntry);
  }

  public static void updateDatasetAccess(String datasetName, Acl newEntry) {
    try {
      // Initialize client that will be used to send requests. This client only needs to be created
      // once, and can be reused for multiple requests.
      BigQuery bigquery = BigQueryOptions.getDefaultInstance().getService();

      Dataset dataset = bigquery.getDataset(datasetName);

      // Get a copy of the ACLs list from the dataset and append the new entry
      ArrayList<Acl> acls = new ArrayList<>(dataset.getAcl());
      acls.add(newEntry);

      bigquery.update(dataset.toBuilder().setAcl(acls).build());
      System.out.println("Dataset Access Control updated successfully");
    } catch (BigQueryException e) {
      System.out.println("Dataset Access control was not updated \n" + e.toString());
    }
  }
}

Python

Before trying this sample, follow the Python setup instructions in the BigQuery quickstart using client libraries. For more information, see the BigQuery Python API reference documentation.

To authenticate to BigQuery, set up Application Default Credentials. For more information, see Set up authentication for client libraries.

Set the dataset.access_entries property with the access controls for a dataset. Then call the client.update_dataset() function to update the property.

# TODO(developer): Set dataset_id to the ID of the dataset to fetch.
dataset_id = "your-project.your_dataset"

# TODO(developer): Set entity_id to the email address of the user or group to
# which you are granting access. Alternatively, set it to the JSON REST API
# representation of the entity, such as a view's table reference.
entity_id = "[email protected]"

from google.cloud.bigquery.enums import EntityTypes

# TODO(developer): Set entity_type to the type of entity you are granting access to.
# Common types include:
#
# * "userByEmail" -- A single user or service account. For example "[email protected]"
# * "groupByEmail" -- A group of users. For example "[email protected]"
# * "view" -- An authorized view. For example
#       {"projectId": "p", "datasetId": "d", "tableId": "v"}
#
# For a complete reference, see the REST API reference documentation:
# https://cloud.google.com/bigquery/docs/reference/rest/v2/datasets#Dataset.FIELDS.access
entity_type = EntityTypes.GROUP_BY_EMAIL

# TODO(developer): Set role to a one of the "Basic roles for datasets"
# described here:
# https://cloud.google.com/bigquery/docs/access-control-basic-roles#dataset-basic-roles
role = "READER"

from google.cloud import bigquery

# Construct a BigQuery client object.
client = bigquery.Client()

dataset = client.get_dataset(dataset_id)  # Make an API request.

entries = list(dataset.access_entries)
entries.append(
    bigquery.AccessEntry(
        role=role,
        entity_type=entity_type,
        entity_id=entity_id,
    )
)
dataset.access_entries = entries

dataset = client.update_dataset(dataset, ["access_entries"])  # Make an API request.

full_dataset_id = "{}.{}".format(dataset.project, dataset.dataset_id)
print(
    "Updated dataset '{}' with modified user permissions.".format(full_dataset_id)
)

Grant access to a table or view

Select one of the following options:

Console

  1. Go to the BigQuery page.

    Go to BigQuery

  2. In the Explorer pane, expand your project and select a table or view to share.

  3. Click Share.

  4. Click Add principal.

  5. In the New principals field, enter a principal.

  6. In the Select a role list, select a predefined role or a custom role.

  7. Click Save.

  8. To return to the table or view details, click Close.

SQL

To grant principals access to tables or views, use the GRANT DCL statement:

  1. In the Google Cloud console, go to the BigQuery page.

    Go to BigQuery

  2. In the query editor, enter the following statement:

    GRANT `ROLE_LIST`
    ON RESOURCE_TYPE RESOURCE_NAME
    TO "USER_LIST"

    Replace the following:

    • ROLE_LIST: a role or list of comma-separated roles that you want to grant
    • RESOURCE_TYPE: the type of resource that the role is applied to

      Supported values include TABLE, VIEW, MATERIALIZED VIEW, and EXTERNAL TABLE.

    • RESOURCE_NAME: the name of the resource that you want to grant the permission on
    • USER_LIST: a comma-separated list of users that the role is granted to

      For a list of valid formats, see user_list.

  3. Click Run.

For more information about how to run queries, see Run an interactive query.

The following example grants the Data Viewer role on the table myTable:

GRANT `roles/bigquery.dataViewer`
ON TABLE `myProject`.myDataset.myTable
TO "user:[email protected]", "user:[email protected]"

bq

  1. In the Google Cloud console, activate Cloud Shell.

    Activate Cloud Shell

    At the bottom of the Google Cloud console, a Cloud Shell session starts and displays a command-line prompt. Cloud Shell is a shell environment with the Google Cloud CLI already installed and with values already set for your current project. It can take a few seconds for the session to initialize.

  2. To grant access to a table or view, use the bq add-iam-policy-binding command:

    bq add-iam-policy-binding \
       --member=MEMBER_TYPE:MEMBER \
       --role=ROLE \
       --table=true \
       RESOURCE

    Replace the following:

    • MEMBER_TYPE: the type of member, such as user, group, serviceAccount, or domain.
    • MEMBER: the member's email address or domain name.
    • ROLE: the role that you want to grant to the member.
    • RESOURCE: the name of the table or view whose policy you want to update.
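
    For example, the following command grants the BigQuery Data Viewer role on a table; the project, dataset, table, and user email are placeholders:

    bq add-iam-policy-binding \
       --member=user:[email protected] \
       --role=roles/bigquery.dataViewer \
       --table=true \
       myproject:mydataset.mytable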

Terraform

Use the google_bigquery_table_iam resources to update access to a table.

Set the access policy for a table

The following example shows how to use the google_bigquery_table_iam_policy resource to set the IAM policy for the mytable table. This replaces any existing policy already attached to the table:

# This file sets the IAM policy for the table created by
# https://github.com/terraform-google-modules/terraform-docs-samples/blob/main/bigquery/bigquery_create_table/main.tf.
# You must place it in the same local directory as that main.tf file,
# and you must have already applied that main.tf file to create
# the "default" table resource with a table_id of "mytable".

data "google_iam_policy" "iam_policy" {
  binding {
    role = "roles/bigquery.dataOwner"
    members = [
      "user:[email protected]",
    ]
  }
}

resource "google_bigquery_table_iam_policy" "table_iam_policy" {
  dataset_id  = google_bigquery_table.default.dataset_id
  table_id    = google_bigquery_table.default.table_id
  policy_data = data.google_iam_policy.iam_policy.policy_data
}

Set role membership for a table

The following example shows how to use the google_bigquery_table_iam_binding resource to set membership in a given role for the mytable table. This replaces any existing membership in that role. Other roles within the IAM policy for the table are preserved.

# This file sets membership in an IAM role for the table created by
# https://github.com/terraform-google-modules/terraform-docs-samples/blob/main/bigquery/bigquery_create_table/main.tf.
# You must place it in the same local directory as that main.tf file,
# and you must have already applied that main.tf file to create
# the "default" table resource with a table_id of "mytable".

resource "google_bigquery_table_iam_binding" "table_iam_binding" {
  dataset_id = google_bigquery_table.default.dataset_id
  table_id   = google_bigquery_table.default.table_id
  role       = "roles/bigquery.dataOwner"

  members = [
    "group:[email protected]",
  ]
}

Set role membership for a single principal

The following example shows how to use the google_bigquery_table_iam_member resource to update the IAM policy for the mytable table to grant a role to one principal. Updating this IAM policy does not affect access for any other principals that have been granted that role for the table.

# This file adds a member to an IAM role for the table created by
# https://github.com/terraform-google-modules/terraform-docs-samples/blob/main/bigquery/bigquery_create_table/main.tf.
# You must place it in the same local directory as that main.tf file,
# and you must have already applied that main.tf file to create
# the "default" table resource with a table_id of "mytable".

resource "google_bigquery_table_iam_member" "table_iam_member" {
  dataset_id = google_bigquery_table.default.dataset_id
  table_id   = google_bigquery_table.default.table_id
  role       = "roles/bigquery.dataEditor"
  member     = "serviceAccount:bqcx-1234567891011-12a3@gcp-sa-bigquery-condel.iam.gserviceaccount.com"
}

To apply your Terraform configuration in a Google Cloud project, complete the steps in the following sections.

Prepare Cloud Shell

  1. Launch Cloud Shell.
  2. Set the default Google Cloud project where you want to apply your Terraform configurations.

    You only need to run this command once per project, and you can run it in any directory.

    export GOOGLE_CLOUD_PROJECT=PROJECT_ID

    Environment variables are overridden if you set explicit values in the Terraform configuration file.

Prepare the directory

Each Terraform configuration file must have its own directory (also called a root module).

  1. In Cloud Shell, create a directory and a new file within that directory. The file name must have the .tf extension, for example main.tf. In this tutorial, the file is referred to as main.tf.
    mkdir DIRECTORY && cd DIRECTORY && touch main.tf
  2. If you are following a tutorial, you can copy the sample code in each section or step.

    Copy the sample code into the newly created main.tf.

    Optionally, copy the code from GitHub. This is recommended when the Terraform snippet is part of an end-to-end solution.

  3. Review and modify the sample parameters to apply to your environment.
  4. Save your changes.
  5. Initialize Terraform. You only need to do this once per directory.
    terraform init

    Optionally, to use the latest Google provider version, include the -upgrade option:

    terraform init -upgrade

Apply the changes

  1. Review the configuration and verify that the resources that Terraform is going to create or update match your expectations:
    terraform plan

    Make corrections to the configuration as necessary.

  2. Apply the Terraform configuration by running the following command and entering yes at the prompt:
    terraform apply

    Wait until Terraform displays the "Apply complete!" message.

  3. Open your Google Cloud project to view the results. In the Google Cloud console, navigate to your resources in the UI to make sure that Terraform has created or updated them.

API

  1. To retrieve the current policy, call the tables.getIamPolicy method.
  2. Edit the policy to add members or bindings, or both. For the format required for the policy, see the Policy reference topic.

  3. Call tables.setIamPolicy to write the updated policy. Note: Empty bindings with no members are not allowed and result in an error.

Java

Before trying this sample, follow the Java setup instructions in the BigQuery quickstart using client libraries. For more information, see the BigQuery Java API reference documentation.

To authenticate to BigQuery, set up Application Default Credentials. For more information, see Set up authentication for client libraries.

import com.google.cloud.Identity;
import com.google.cloud.Policy;
import com.google.cloud.Role;
import com.google.cloud.bigquery.BigQuery;
import com.google.cloud.bigquery.BigQueryException;
import com.google.cloud.bigquery.BigQueryOptions;
import com.google.cloud.bigquery.TableId;

// Sample to create iam policy for table
public class CreateIamPolicy {

  public static void main(String[] args) {
    // TODO(developer): Replace these variables before running the sample.
    String datasetName = "MY_DATASET_NAME";
    String tableName = "MY_TABLE_NAME";
    createIamPolicy(datasetName, tableName);
  }

  public static void createIamPolicy(String datasetName, String tableName) {
    try {
      // Initialize client that will be used to send requests. This client only needs to be created
      // once, and can be reused for multiple requests.
      BigQuery bigquery = BigQueryOptions.getDefaultInstance().getService();

      TableId tableId = TableId.of(datasetName, tableName);

      Policy policy = bigquery.getIamPolicy(tableId);
      // Reassign the rebuilt policy so that the new identity isn't discarded.
      policy =
          policy
              .toBuilder()
              .addIdentity(Role.of("roles/bigquery.dataViewer"), Identity.allUsers())
              .build();
      bigquery.setIamPolicy(tableId, policy);
      System.out.println("Iam policy created successfully");
    } catch (BigQueryException e) {
      System.out.println("Iam policy was not created. \n" + e.toString());
    }
  }
}

Python

Before trying this sample, follow the Python setup instructions in the BigQuery quickstart using client libraries. For more information, see the BigQuery Python API reference documentation.

To authenticate to BigQuery, set up Application Default Credentials. For more information, see Set up authentication for client libraries.

from google.cloud import bigquery

bqclient = bigquery.Client()

policy = bqclient.get_iam_policy(
    your_table_id,  # e.g. "project.dataset.table"
)

analyst_email = "[email protected]"
binding = {
    "role": "roles/bigquery.dataViewer",
    "members": {f"group:{analyst_email}"},
}
policy.bindings.append(binding)

updated_policy = bqclient.set_iam_policy(
    your_table_id,  # e.g. "project.dataset.table"
    policy,
)

for binding in updated_policy.bindings:
    print(repr(binding))

Revoke access to a resource

The following sections describe how to revoke access to different resources.

Revoke access to a dataset

Select one of the following options:

Console

  1. Go to the BigQuery page.

    Go to BigQuery

  2. In the Explorer panel, expand your project and select a dataset.

  3. In the details panel, click Sharing > Permissions.

  4. In the Dataset Permissions dialog, expand the principal whose access you want to revoke.

  5. Click Remove principal.

  6. In the Remove role from principal? dialog, click Remove.

  7. To return to dataset details, click Close.

SQL

To remove access to datasets from principals, use the REVOKE DCL statement:

  1. In the Google Cloud console, go to the BigQuery page.

    Go to BigQuery

  2. In the query editor, enter the following statement:

    REVOKE `ROLE_LIST`
    ON SCHEMA RESOURCE_NAME
    FROM "USER_LIST"

    Replace the following:

    • ROLE_LIST: a role or list of comma-separated roles that you want to revoke
    • RESOURCE_NAME: the name of the resource that you want to revoke permission on
    • USER_LIST: a comma-separated list of users who will have their roles revoked

      For a list of valid formats, see user_list.

  3. Click Run.

For more information about how to run queries, see Run an interactive query.

The following example revokes the Admin role on the dataset myDataset:

REVOKE `roles/bigquery.admin`
ON SCHEMA `myProject`.myDataset
FROM "group:[email protected]", "serviceAccount:[email protected]"

bq

  1. In the Google Cloud console, activate Cloud Shell.

    Activate Cloud Shell

    At the bottom of the Google Cloud console, a Cloud Shell session starts and displays a command-line prompt. Cloud Shell is a shell environment with the Google Cloud CLI already installed and with values already set for your current project. It can take a few seconds for the session to initialize.

  2. To write the existing dataset information (including access controls) to a JSON file, use the bq show command:

    bq show \
      --format=prettyjson \
      PROJECT_ID:DATASET > PATH_TO_FILE

    Replace the following:

    • PROJECT_ID: your project ID
    • DATASET: the name of your dataset
    • PATH_TO_FILE: the path to the JSON file on your local machine
  3. Make changes to the access section of the JSON file. You can remove any of the specialGroup entries: projectOwners, projectWriters, projectReaders, and allAuthenticatedUsers. You can also remove any of the following: userByEmail, groupByEmail, and domain.

    For example, the access section of a dataset's JSON file would look like the following:

    {
     "access": [
      {
       "role": "READER",
       "specialGroup": "projectReaders"
      },
      {
       "role": "WRITER",
       "specialGroup": "projectWriters"
      },
      {
       "role": "OWNER",
       "specialGroup": "projectOwners"
      },
      {
       "role": "READER",
       "specialGroup": "allAuthenticatedUsers"
      },
      {
       "role": "READER",
       "domain": "domain_name"
      },
      {
       "role": "WRITER",
       "userByEmail": "user_email"
      },
      {
       "role": "READER",
       "groupByEmail": "group_email"
      }
     ],
     ...
    }

  4. When your edits are complete, use the bq update command and include the JSON file using the --source flag. If the dataset is in a project other than your default project, add the project ID to the dataset name in the following format: PROJECT_ID:DATASET.

    bq update \
        --source PATH_TO_FILE \
        PROJECT_ID:DATASET
  5. To verify your access control changes, use the bq show command again without writing the information to a file:

    bq show --format=prettyjson PROJECT_ID:DATASET

API

Call datasets.patch and use the access property in the Dataset resource to update your access controls.

Because the datasets.update method replaces the entire dataset resource, datasets.patch is the preferred method for updating access controls.

Go

Before trying this sample, follow the Go setup instructions in the BigQuery quickstart using client libraries. For more information, see the BigQuery Go API reference documentation.

To authenticate to BigQuery, set up Application Default Credentials. For more information, see Set up authentication for client libraries.

import (
	"context"
	"fmt"

	"cloud.google.com/go/bigquery"
)

// revokeDatasetAccess updates the access control on a dataset to remove all
// access entries that reference a specific entity.
func revokeDatasetAccess(projectID, datasetID, entity string) error {
	// projectID := "my-project-id"
	// datasetID := "mydataset"
	// entity := "[email protected]"
	ctx := context.Background()
	client, err := bigquery.NewClient(ctx, projectID)
	if err != nil {
		return fmt.Errorf("bigquery.NewClient: %v", err)
	}
	defer client.Close()

	ds := client.Dataset(datasetID)
	meta, err := ds.Metadata(ctx)
	if err != nil {
		return err
	}

	var newAccessList []*bigquery.AccessEntry
	for _, entry := range meta.Access {
		if entry.Entity != entity {
			newAccessList = append(newAccessList, entry)
		}
	}

	// Only proceed with update if something in the access list was removed.
	// Additionally, we use the ETag from the initial metadata to ensure no
	// other changes were made to the access list in the interim.
	if len(newAccessList) < len(meta.Access) {

		update := bigquery.DatasetMetadataToUpdate{
			Access: newAccessList,
		}
		if _, err := ds.Update(ctx, update, meta.ETag); err != nil {
			return err
		}
	}
	return nil
}

Python

Before trying this sample, follow the Python setup instructions in the BigQuery quickstart using client libraries. For more information, see the BigQuery Python API reference documentation.

To authenticate to BigQuery, set up Application Default Credentials. For more information, see Set up authentication for client libraries.

Set the dataset.access_entries property with the access controls for a dataset. Then call the client.update_dataset() function to update the property.

# TODO(developer): Set dataset_id to the ID of the dataset to fetch.
dataset_id = "your-project.your_dataset"

# TODO(developer): Set entity_id to the ID of the email or group from whom you are revoking access.
entity_id = "[email protected]"

from google.cloud import bigquery

# Construct a BigQuery client object.
client = bigquery.Client()

dataset = client.get_dataset(dataset_id)  # Make an API request.

entries = list(dataset.access_entries)
dataset.access_entries = [
    entry for entry in entries if entry.entity_id != entity_id
]

dataset = client.update_dataset(
    dataset,
    # Update just the `access_entries` property of the dataset.
    ["access_entries"],
)  # Make an API request.

full_dataset_id = f"{dataset.project}.{dataset.dataset_id}"
print(f"Revoked dataset access for '{entity_id}' to ' dataset '{full_dataset_id}.'")

Revoke access to a table or view

Select one of the following options:

Console

  1. Go to the BigQuery page.

    Go to BigQuery

  2. In the Explorer panel, expand your project and select a table or view.

  3. In the details panel, click Share.

  4. In the Share dialog, expand the principal whose access you want to revoke.

  5. Click Delete.

  6. In the Remove role from principal? dialog, click Remove.

  7. To return to the table or view details, click Close.

SQL

To remove access to tables or views from principals, use the REVOKE DCL statement:

  1. In the Google Cloud console, go to the BigQuery page.

    Go to BigQuery

  2. In the query editor, enter the following statement:

    REVOKE `ROLE_LIST`
    ON RESOURCE_TYPE RESOURCE_NAME
    FROM "USER_LIST"

    Replace the following:

    • ROLE_LIST: a role or list of comma-separated roles that you want to revoke
    • RESOURCE_TYPE: the type of resource that the role is revoked from

      Supported values include TABLE, VIEW, MATERIALIZED VIEW, and EXTERNAL TABLE.

    • RESOURCE_NAME: the name of the resource that you want to revoke permission on
    • USER_LIST: a comma-separated list of users who will have their roles revoked

      For a list of valid formats, see user_list.

  3. Click Run.

For more information about how to run queries, see Run an interactive query.

The following example revokes the Admin role on the table myTable:

REVOKE `roles/bigquery.admin`
ON TABLE `myProject`.myDataset.myTable
FROM "group:[email protected]", "serviceAccount:[email protected]"

bq

  1. In the Google Cloud console, activate Cloud Shell.

    Activate Cloud Shell

    At the bottom of the Google Cloud console, a Cloud Shell session starts and displays a command-line prompt. Cloud Shell is a shell environment with the Google Cloud CLI already installed and with values already set for your current project. It can take a few seconds for the session to initialize.

  2. To revoke access to a table or view, use the bq remove-iam-policy-binding command:

    bq remove-iam-policy-binding \
       --member=MEMBER_TYPE:MEMBER \
       --role=ROLE \
       --table=true \
       RESOURCE

    Replace the following:

    • MEMBER_TYPE: the type of member, such as user, group, serviceAccount, or domain.
    • MEMBER: the member's email address or domain name.
    • ROLE: the role that you want to revoke from the member.
    • RESOURCE: the name of the table or view whose policy you want to update.
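
    For example, the following command revokes the BigQuery Data Viewer role on a table; the project, dataset, table, and user email are placeholders:

    bq remove-iam-policy-binding \
       --member=user:[email protected] \
       --role=roles/bigquery.dataViewer \
       --table=true \
       myproject:mydataset.mytable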

API

  1. To retrieve the current policy, call the tables.getIamPolicy method.
  2. Edit the policy to remove members or bindings, or both. For the format required for the policy, see the Policy reference topic.

  3. Call tables.setIamPolicy to write the updated policy. Note: Empty bindings with no members are not allowed and result in an error.
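
The following minimal Python sketch combines these three steps to remove a single principal from every role binding on a table; the table ID and group email are placeholders:

from google.cloud import bigquery

client = bigquery.Client()
table_id = "your-project.your_dataset.your_table"  # Placeholder table ID.

# Step 1: retrieve the current policy (tables.getIamPolicy).
policy = client.get_iam_policy(table_id)

# Step 2: remove the member from each binding. Because empty bindings are not
# allowed, drop any binding that no longer has members.
member = "group:[email protected]"  # Placeholder principal.
for binding in list(policy.bindings):
    members = set(binding["members"]) - {member}
    if members:
        binding["members"] = members
    else:
        policy.bindings.remove(binding)

# Step 3: write the updated policy (tables.setIamPolicy).
client.set_iam_policy(table_id, policy)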

Java

Before trying this sample, follow the Java setup instructions in the BigQuery quickstart using client libraries. For more information, see the BigQuery Java API reference documentation.

To authenticate to BigQuery, set up Application Default Credentials. For more information, see Set up authentication for client libraries.

import com.google.cloud.Identity;
import com.google.cloud.Policy;
import com.google.cloud.Role;
import com.google.cloud.bigquery.BigQuery;
import com.google.cloud.bigquery.BigQueryException;
import com.google.cloud.bigquery.BigQueryOptions;
import com.google.cloud.bigquery.TableId;
import java.util.HashMap;
import java.util.Map;
import java.util.Set;

// Sample to update iam policy in table
public class UpdateIamPolicy {

  public static void main(String[] args) {
    // TODO(developer): Replace these variables before running the sample.
    String datasetName = "MY_DATASET_NAME";
    String tableName = "MY_TABLE_NAME";
    updateIamPolicy(datasetName, tableName);
  }

  public static void updateIamPolicy(String datasetName, String tableName) {
    try {
      // Initialize client that will be used to send requests. This client only needs to be created
      // once, and can be reused for multiple requests.
      BigQuery bigquery = BigQueryOptions.getDefaultInstance().getService();

      TableId tableId = TableId.of(datasetName, tableName);

      Policy policy = bigquery.getIamPolicy(tableId);
      Map<Role, Set<Identity>> binding = new HashMap<>(policy.getBindings());
      binding.remove(Role.of("roles/bigquery.dataViewer"));

      // Reassign the rebuilt policy so that the removed binding isn't retained.
      policy = policy.toBuilder().setBindings(binding).build();
      bigquery.setIamPolicy(tableId, policy);

      System.out.println("Iam policy updated successfully");
    } catch (BigQueryException e) {
      System.out.println("Iam policy was not updated. \n" + e.toString());
    }
  }
}

Deny access to a resource

IAM deny policies let you set guardrails on access to BigQuery resources. You can define deny rules that prevent selected principals from using certain permissions, regardless of the roles they're granted.

For information about how to create, update, and delete deny policies, see Deny access to resources.

Special cases

Consider the following scenarios when you create IAM deny policies on a few BigQuery permissions:

  • Access through authorized resources (views, routines, datasets, or stored procedures) lets a principal create, drop, or manipulate a table, and read or modify table data, even without direct permission to perform those operations. An authorized resource can also get model data or metadata and invoke other stored procedures on the underlying table. This capability implies that authorized resources have the following permissions:

    • bigquery.tables.get
    • bigquery.tables.list
    • bigquery.tables.getData
    • bigquery.tables.updateData
    • bigquery.tables.create
    • bigquery.tables.delete
    • bigquery.routines.get
    • bigquery.routines.list
    • bigquery.datasets.get
    • bigquery.models.getData
    • bigquery.models.getMetadata

    To deny access to these authorized resources, add one of the following values to the deniedPrincipal field when you create the deny policy:

    • principalSet://goog/public:all: blocks all principals, including authorized resources.
    • principalSet://bigquery.googleapis.com/projects/PROJECT_NUMBER/*: blocks all BigQuery authorized resources in the specified project. PROJECT_NUMBER is an automatically generated unique identifier for your project, of type INT64.
  • BigQuery caches a job owner's query results for 24 hours, and the job owner can access those results without needing the bigquery.tables.getData permission on the table that contains the data. Therefore, adding an IAM deny policy on the bigquery.tables.getData permission doesn't block the job owner's access to cached results until the cache expires. To block the job owner's access to cached results, create a separate deny policy on the bigquery.jobs.create permission.

  • To prevent unintended data access when using deny policies to block data read operations, we recommend that you also review and revoke any existing subscriptions on the dataset.

  • To create an IAM deny policy for viewing dataset access controls, deny the following permissions:

    • bigquery.datasets.get
    • bigquery.datasets.getIamPolicy
  • To create an IAM deny policy for updating dataset access controls, deny the following permissions, as illustrated in the sketch after this list:

    • bigquery.datasets.update
    • bigquery.datasets.setIamPolicy
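
As an illustration only, the rules portion of a deny policy file that blocks both viewing and updating dataset access controls might look like the following sketch; the group principal is a placeholder, and the full policy file format is described in Deny access to resources:

{
  "rules": [
    {
      "denyRule": {
        "deniedPrincipals": [
          "principalSet://goog/group/[email protected]"
        ],
        "deniedPermissions": [
          "bigquery.googleapis.com/datasets.get",
          "bigquery.googleapis.com/datasets.getIamPolicy",
          "bigquery.googleapis.com/datasets.update",
          "bigquery.googleapis.com/datasets.setIamPolicy"
        ]
      }
    }
  ]
}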