Insert All BigQuery Table Data
Streams records into BigQuery without needing to run a load job. Requires the WRITER dataset role.
External Documentation
To learn more, visit the GCP documentation.
Basic Parameters
Parameter | Description |
---|---|
Dataset ID | Dataset ID of the destination table. |
Project ID | Project ID of the destination table. |
Rows | The rows to insert. |
Table ID | Table ID of the destination table. |
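To show how the basic parameters fit together, here is a minimal sketch that assembles the underlying `tabledata.insertAll` REST request. The endpoint path and field names follow the BigQuery v2 API; the project, dataset, and table values are placeholders, and the helper name is ours, not part of the API.

```python
def build_insert_all_request(project_id, dataset_id, table_id, rows):
    """Return the URL and JSON body for a tabledata.insertAll call."""
    url = (
        "https://bigquery.googleapis.com/bigquery/v2/"
        f"projects/{project_id}/datasets/{dataset_id}"
        f"/tables/{table_id}/insertAll"
    )
    body = {
        "kind": "bigquery#tableDataInsertAllRequest",
        # Each row's column values are wrapped in a "json" field.
        "rows": [{"json": row} for row in rows],
    }
    return url, body

url, body = build_insert_all_request(
    "my-project", "my_dataset", "my_table",
    [{"name": "alice", "count": 1}],
)
```

The Rows parameter maps to the `rows` array in the body; Project ID, Dataset ID, and Table ID identify the destination table in the URL path.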
Advanced Parameters
Parameter | Description |
---|---|
Ignore Unknown Values | [Optional] Accept rows that contain values that do not match the schema. The unknown values are ignored. Default is false, which treats unknown values as errors. |
Kind | The resource type of the response. |
Skip Invalid Rows | [Optional] Insert all valid rows of a request, even if invalid rows exist. The default value is false, which causes the entire request to fail if any invalid rows exist. |
Template Suffix | [Optional] If specified, treats the destination table as a base template and inserts the rows into an instance table named "{destination}{templateSuffix}". BigQuery manages creation of the instance table using the schema of the base template table. See https://cloud.google.com/bigquery/streaming-data-into-bigquery#template-tables for considerations when working with template tables. |
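The advanced parameters are optional fields in the same request body. A short sketch of how they might be merged in, using the field names from the `tabledata.insertAll` request schema (`skipInvalidRows`, `ignoreUnknownValues`, `templateSuffix`); the helper itself is illustrative, not part of the API:

```python
def with_advanced_options(body, skip_invalid_rows=False,
                          ignore_unknown_values=False,
                          template_suffix=None):
    """Return a copy of an insertAll body with optional flags applied."""
    body = dict(body)
    if skip_invalid_rows:
        # Insert the valid rows even if some rows are invalid.
        body["skipInvalidRows"] = True
    if ignore_unknown_values:
        # Drop row values that do not match the table schema.
        body["ignoreUnknownValues"] = True
    if template_suffix is not None:
        # Rows go to an instance table named "{destination}{templateSuffix}".
        body["templateSuffix"] = template_suffix
    return body
```

Left unset, both flags default to false, so any invalid row or unknown value fails the request, matching the defaults described above.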
Example Output
{
  "insertErrors": [
    {
      "errors": [
        {
          "debugInfo": "Debugging information. This property is internal to Google and should not be used.",
          "location": "Specifies where the error occurred, if present.",
          "message": "A human-readable description of the error.",
          "reason": "A short error code that summarizes the error."
        }
      ],
      "index": 0
    }
  ],
  "kind": "bigquery#tableDataInsertAllResponse"
}
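When checking the output, an empty or missing `insertErrors` list means every row was inserted; otherwise each entry carries the zero-based index of a failed row and one or more error descriptions. A small sketch of reading that structure (the helper name is ours):

```python
def failed_rows(response):
    """Map each failed row index to its list of error reason codes."""
    return {
        err["index"]: [e["reason"] for e in err.get("errors", [])]
        for err in response.get("insertErrors", [])
    }

response = {
    "kind": "bigquery#tableDataInsertAllResponse",
    "insertErrors": [
        {"index": 0, "errors": [{"reason": "invalid",
                                 "message": "Missing required field."}]}
    ],
}
```

Here `failed_rows(response)` would report that row 0 failed with reason `invalid`, while an error-free response yields an empty mapping.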
Workflow Library Example
Insert All BigQuery Table Data with GCP and Send Results via Email