
google bigquery - Does the bq command line support writeDisposition : WRITE_TRUNCATE?

I am wondering whether the bq command line utility supports the writeDisposition: WRITE_TRUNCATE option. I have searched the docs thoroughly, as well as the help within the bq command itself. Is it possible to specify configuration.load.writeDisposition with the bq utility? The command line utility is great, and hopefully this is supported. In the API it is of course available: https://cloud.google.com/bigquery/docs/reference/v2/jobs#configuration.load Thanks...
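
For what it's worth, the bq CLI's `bq load --replace` flag is generally understood to correspond to WRITE_TRUNCATE. Below is a minimal sketch of the same load with an explicit write disposition via the google-cloud-bigquery Python client; the gs:// URI and table name are placeholders, not values from the question:

    from google.cloud import bigquery

    client = bigquery.Client()

    # Hypothetical source URI and destination table, for illustration only.
    uri = "gs://my-bucket/data.csv"
    table_id = "my_project.my_dataset.my_table"

    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.CSV,
        # Equivalent of configuration.load.writeDisposition in the REST API.
        write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
    )

    load_job = client.load_table_from_uri(uri, table_id, job_config=job_config)
    load_job.result()  # Block until the load completes.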

bigquery backup all view definitions

I am working with BigQuery, and a few hundred views have been created. Most of these are not used and should be deleted. However, there is a chance that some are used, so I cannot just blindly delete them all. Therefore, I need to back up all view definitions somehow before deleting them. Does anyone know of a good way? I am not trying to save the data, just the view definition queries and their names. Thanks for reading!...
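
One possible approach, sketched with the google-cloud-bigquery Python client (the output path is hypothetical): iterate over every dataset and table in the project, and dump the SQL of anything whose table type is VIEW:

    import json
    from google.cloud import bigquery

    client = bigquery.Client()
    views = {}

    # Walk every dataset and table in the project, keeping only views.
    for dataset in client.list_datasets():
        for table_item in client.list_tables(dataset):
            table = client.get_table(table_item.reference)
            if table.table_type == "VIEW":
                views[f"{table.dataset_id}.{table.table_id}"] = table.view_query

    # Write name -> SQL pairs to a local backup file (hypothetical path).
    with open("view_definitions.json", "w") as f:
        json.dump(views, f, indent=2)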

How to load Avro files into BigQuery table in Google Cloud Datalab from Cloud Storage?

Exporting a BigQuery table to a Cloud Storage Avro file succeeds:

    table = bq.Table(bq_table_ref)
    job_id = table.extract(destination=gs_object, format='avro')

while importing the Cloud Storage Avro file into a BigQuery table fails:

    table = bq.Table(bq_table_ref)
    job_id = table.load(source=gs_object, source_format='avro')

and the alternative fails as well:

    %%bq load -m create -f avro -p gs_object -t bq_table_ref

Is loading Avro files not supported within google.datalab.bigquery?...
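
If google.datalab.bigquery turns out not to support Avro loads, a hedged workaround is to drop down to the google-cloud-bigquery client, which does expose Avro as a source format; the URI and table name below are placeholders for the question's gs_object and bq_table_ref:

    from google.cloud import bigquery

    client = bigquery.Client()

    # Placeholder URI and table; substitute your own gs_object and bq_table_ref.
    uri = "gs://my-bucket/export-*.avro"
    table_id = "my_project.my_dataset.my_table"

    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.AVRO,
    )

    load_job = client.load_table_from_uri(uri, table_id, job_config=job_config)
    load_job.result()  # Raises if the load job fails.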

Moving data from Google Big Query to Azure Data Lake Store using Azure Data Factory

I have a scenario where I need to connect Azure Data Factory (v2) to Google BigQuery in order to move data to my Azure Data Lake, but it doesn't work. When I create a Linked Service, I choose BigQuery as the source and enter all the BigQuery information, such as project name, type of connection, etc., but when I click the Validate button a message is shown to me, for example: ... UserError: ERROR [28000] [Microsoft][BigQuery] (80) Authentication failed: invalid_grant ERROR [28000] [Microsoft][BigQuery] (80) Authentication failed: invalid_grant'Type=,Message...

n-gram - Neither BigQuery nor the public data sets seem to have all the bigrams

Summary: All I'm trying to do is find out where to download the data I can see in the n-gram viewer, since neither the raw data nor BigQuery seems to have as many results as the viewer. So in my attempt to download all the bigrams without opening each file manually (from the available raw data), I turned to BigQuery in an attempt to convert the trigram data down to bigrams, but realized that, because of how the trigrams were constructed, there were plenty of bigrams that weren't included. So then I went the old-fashioned way and, as a test, downloade...
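
For reference, the trigram-to-bigram reduction the question describes would look roughly like the sketch below via the Python client. The table path and column names (first, second, third, match_count) are assumptions standing in for whatever schema the actual trigram table uses:

    from google.cloud import bigquery

    client = bigquery.Client()

    # Hypothetical schema: the real public trigram tables may differ, so
    # treat the table path and column names here as placeholders.
    query = """
        SELECT first AS w1, second AS w2, SUM(match_count) AS freq
        FROM `my_project.ngrams.trigrams`
        GROUP BY w1, w2
    """
    for row in client.query(query).result():
        print(row.w1, row.w2, row.freq)

Note that this only recovers bigrams that occur as the first two words of some trigram, which matches the question's observation that plenty of bigrams end up missing.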

Google Cloud/BigQuery/Genomics data location

Some of our company's work requires that data in the cloud be stored in the US. For Google Cloud Storage, I can specify US bucket locations: https://cloud.google.com/storage/docs/bucket-locations But for BigQuery and Google Genomics, there's no such option in the API. Does anyone know the countries where the data for these services is stored?...
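
For BigQuery at least, newer versions of the Python client expose a dataset-level location property that pins where the data lives; a minimal sketch, with a hypothetical dataset name:

    from google.cloud import bigquery

    client = bigquery.Client()

    # Hypothetical dataset ID; location pins where BigQuery stores the data.
    dataset = bigquery.Dataset("my_project.us_only_dataset")
    dataset.location = "US"
    dataset = client.create_dataset(dataset)
    print(dataset.location)  # "US"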

Import data from BigQuery to Cloud Storage in different project

I have two projects under the same account: projectA with BigQuery and projectB with Cloud Storage. projectA has a BigQuery dataset and table, testDataset.testTable; projectB has a Cloud Storage bucket, testBucket. I use Python and the Google Cloud REST API, with account key credentials for each project that carry different permissions: the projectA key has permissions only for BigQuery, and the projectB key has permissions only for Cloud Storage. What I need: import data from projectA's testDataset.testTable into projectB's testBucket. The problem: of course, I'm running into a Permission denied error while I'm ...
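
The usual fix is a single identity that is authorized in both projects, rather than two keys with disjoint permissions. A sketch, assuming one service account that has been granted BigQuery access in projectA and write access to testBucket in projectB:

    from google.cloud import bigquery

    # One service account key, assumed to hold BigQuery read/job permissions
    # in projectA and object-create permission on testBucket in projectB.
    client = bigquery.Client.from_service_account_json("key.json", project="projectA")

    table_ref = "projectA.testDataset.testTable"
    destination_uri = "gs://testBucket/testTable-*.csv"

    extract_job = client.extract_table(table_ref, destination_uri)
    extract_job.result()  # Waits; raises on permission errors.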

Transfer large file from Google BigQuery to Google Cloud Storage

I need to transfer a large table in BigQuery, about 2B records, to Cloud Storage in CSV format. I am doing the transfer using the console. I need to specify a URI including a * to shard the export due to the size of the file, and I end up with 400 CSV files in Cloud Storage, each with a header row. This makes combining the files time consuming, since I need to download the CSV files to another machine, strip out the header rows, combine the files, and then re-upload. FYI, the size of the combined CSV file is about 48GB. Is there a better approach for thi...
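
One likely improvement is to suppress the header row at export time and concatenate the shards in place, avoiding the download round-trip entirely. A sketch with the google-cloud-bigquery Python client; the table and bucket paths are placeholders:

    from google.cloud import bigquery

    client = bigquery.Client()

    # Suppress the per-shard header row so shards can be concatenated as-is.
    job_config = bigquery.ExtractJobConfig(print_header=False)

    extract_job = client.extract_table(
        "my_project.my_dataset.big_table",      # placeholder table
        "gs://my-bucket/export/shard-*.csv",    # * shards the export
        job_config=job_config,
    )
    extract_job.result()

The headerless shards can then be merged server-side with gsutil compose (which accepts up to 32 source objects per call, so a very large export needs a tree of compose steps), and a single header line can be prepended afterwards if needed.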

Google BigQuery, How to load data from google cloud storage to BigQuery

I am switching to BigQuery due to its high performance, but I have no idea how to upload data from Google Cloud Storage to a BigQuery database. Some more questions: Can I directly access my database from Google Cloud Storage while using BigQuery? Will I have to convert it first to some format? How will I keep the BigQuery database updated from my Google Cloud Storage database? Thanks in advance...
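
A minimal load sketch with the google-cloud-bigquery Python client; the bucket path, table name, and CSV format are assumptions (BigQuery also accepts JSON, Avro, and other source formats):

    from google.cloud import bigquery

    client = bigquery.Client()

    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.CSV,
        skip_leading_rows=1,   # skip a header row, if present
        autodetect=True,       # infer the schema from the data
    )

    load_job = client.load_table_from_uri(
        "gs://my-bucket/data.csv",           # placeholder source
        "my_project.my_dataset.my_table",    # placeholder destination
        job_config=job_config,
    )
    load_job.result()

On the "directly access" question: BigQuery can also query files sitting in Cloud Storage in place via external (federated) tables, though loading the data into native tables usually performs better.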

Google Cloud Storage Bucket in BigQuery

I'm trying to access a bucket on our Google Cloud Storage. With my login details, I don't have permissions to open the bucket. How do I add another user to use on the gcloud console using a JSON token? I know the bucket was actually set up by Google; it's a data dump from DoubleClick. I want to pull the data down with BigQuery, but I can't see any of the datasets or run any queries against it with my current level of access. Would this be something I can set myself using the token provided by Google, or must I get hold of them again?...
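
Assuming the JSON token is a service account key file, one way to use that identity from Python instead of your personal login is sketched below; the key path, project, and bucket name are placeholders:

    from google.cloud import bigquery, storage
    from google.oauth2 import service_account

    # Load the service account key Google provided (placeholder path).
    credentials = service_account.Credentials.from_service_account_file(
        "doubleclick-key.json"
    )

    # Use that identity, not your personal login, for both services.
    storage_client = storage.Client(credentials=credentials, project="my-project")
    bq_client = bigquery.Client(credentials=credentials, project="my-project")

    # List the bucket's contents to confirm access (placeholder bucket name).
    for blob in storage_client.list_blobs("the-doubleclick-bucket"):
        print(blob.name)

On the command line, the same key can be activated with gcloud auth activate-service-account --key-file=doubleclick-key.json, after which gsutil and bq run as that identity.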