pandas.io.gbq Version 2 by jacobschaer · Pull Request #6937 · pandas-dev/pandas

@jacobschaer If you have a chance, can you break the big paragraph into, say, bullet points:

Finally, you can append data to a BigQuery table from a pandas DataFrame using the to_gbq()
function. This function uses the Google streaming API, which requires that your destination table
already exists in BigQuery. Given that the table exists, your DataFrame should match the destination
table in column order, structure, and data types. DataFrame indexes are not supported. By default,
rows are streamed to BigQuery in chunks of 10,000 rows, but you can pass other chunk sizes via the
chunksize argument. You can also monitor the progress of your post via the verbose flag, which
defaults to True. The HTTP response code from Google BigQuery can indicate success (200) even if the
append failed. For this reason, if there is a failure to append to the table, the complete error
response from BigQuery is returned, which can be quite long given that it provides a status for each
row. You may want to start with smaller chunks to verify that the size and types of your DataFrame
match your destination table, which makes debugging simpler.
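A minimal sketch of the append described above. The dataset, table, and project names are placeholders, and the chunk-splitting loop only illustrates how the chunksize argument partitions rows before streaming; the to_gbq() call itself is commented out because it needs Google credentials and an existing destination table:

```python
import pandas as pd

# Hypothetical DataFrame; its columns must match the destination
# table's column order, structure, and data types.
df = pd.DataFrame({
    "name": ["alice", "bob", "carol"],
    "score": [1.0, 2.5, 3.7],
})

# Illustration of how chunksize partitions rows before streaming
# (to_gbq defaults to chunks of 10,000 rows).
chunksize = 2
chunks = [df.iloc[i:i + chunksize] for i in range(0, len(df), chunksize)]

# The actual append; the table and project names are placeholders:
# from pandas.io import gbq
# gbq.to_gbq(df, "my_dataset.my_table", project_id="my-project",
#            chunksize=10000, verbose=True)
```

Starting with a small DataFrame like this one makes it easy to confirm the schema lines up before streaming larger chunks.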

http://pandas-docs.github.io/pandas-docs-travis/io.html#io-bigquery