
Downloading Table Data As CSV To Local Machine In BigQuery

from oauth2client.client import GoogleCredentials
from googleapiclient import discovery

def downloadGbqToCsv(self, table_name, csv):
    credentials = GoogleCredentials.get_application_default()
    bigquery = discovery.build('bigquery', 'v2', credentials=credentials)

Solution 1:

Run an extract job to export your table to Google Cloud Storage in your desired format; then you can use any GCS tool to download the files to your local file system.

Here's an example extract job configuration: https://cloud.google.com/bigquery/exporting-data-from-bigquery#extractconfig
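For instance, here is a minimal sketch of submitting such an extract job through the BigQuery v2 API client; the project, dataset, table, and bucket names below are placeholders:

from oauth2client.client import GoogleCredentials
from googleapiclient import discovery

credentials = GoogleCredentials.get_application_default()
bigquery = discovery.build('bigquery', 'v2', credentials=credentials)

# Submit an extract job: table -> CSV files in a GCS bucket.
job_body = {
    'configuration': {
        'extract': {
            'sourceTable': {
                'projectId': 'my-project',
                'datasetId': 'my_dataset',
                'tableId': 'my_table',
            },
            'destinationUris': ['gs://my-bucket/my_table-*.csv'],
            'destinationFormat': 'CSV',
        }
    }
}
bigquery.jobs().insert(projectId='my-project', body=job_body).execute()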

I use gsutil to copy files from GCS to my local drive.
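For example (bucket and local path are placeholders):

gsutil cp gs://my-bucket/my_table-*.csv /local/download/dir/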

If you want to copy the table directly to your local drive, you will have to use tabledata.list to read the table row by row and convert it to your output format locally. The bq command head works this way, for example:

bq --format=csv head -n 100 project:dataset.table

But that only downloads the first 100 rows. The bq client doesn't support listing an entire table that way, as it's too slow. If your data size is small, however, it would work for you.
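If you do want to do that programmatically for a small table, a rough sketch with the same API client (all names are placeholders) could look like this:

import csv
from oauth2client.client import GoogleCredentials
from googleapiclient import discovery

credentials = GoogleCredentials.get_application_default()
bigquery = discovery.build('bigquery', 'v2', credentials=credentials)

# Page through the table with tabledata.list and write each row to a local CSV.
request = bigquery.tabledata().list(
    projectId='my-project', datasetId='my_dataset', tableId='my_table', maxResults=1000)
with open('my_table.csv', 'w', newline='') as f:
    writer = csv.writer(f)
    while request is not None:
        response = request.execute()
        for row in response.get('rows', []):
            writer.writerow([cell.get('v') for cell in row['f']])
        request = bigquery.tabledata().list_next(request, response)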

Solution 2:

Executing a query and storing the result on your local machine is quite indirect in BigQuery, so I've packaged the code and hope it helps.

Package versions: google-cloud (0.27.0), google-cloud-bigquery (0.26.0), google-cloud-storage (1.3.1)

https://github.com/IcarusSO/bigQueryExporter
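The general flow such an exporter wraps is: run the query into a destination table, extract that table to GCS, then download the exported files with the storage client. A rough sketch with current google-cloud-bigquery and google-cloud-storage clients (newer than the versions listed above; this is not bigQueryExporter's API, and all names are placeholders):

from google.cloud import bigquery, storage

bq_client = bigquery.Client()

# 1. Run the query into a destination table.
dest = bigquery.TableReference.from_string('my-project.my_dataset.query_result')
job_config = bigquery.QueryJobConfig()
job_config.destination = dest
bq_client.query('SELECT * FROM `my-project.my_dataset.my_table`', job_config=job_config).result()

# 2. Extract the destination table to GCS (CSV is the default format).
bq_client.extract_table(dest, 'gs://my-bucket/query_result-*.csv').result()

# 3. Download the exported shards to the local machine.
storage_client = storage.Client()
for blob in storage_client.list_blobs('my-bucket', prefix='query_result-'):
    blob.download_to_filename(blob.name)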
