Latest changelog: version 1.1.0

Google BigQuery component

Integration component to interact with Google BigQuery.


General Information

Description

Integration component to interact with Google BigQuery.

API version / SDK version

The component uses the @google-cloud/bigquery client library, version 5.2.0.

Credentials

Service Account - A set of credentials (project ID, private key, etc.) provided by Google.

You can find more information on how to generate credentials here.

After you have performed the authentication steps described above, copy and paste the content of the authentication JSON file as-is into the Service Account credentials field.

It should look like this:

{
  "type": "service_account",
  "project_id": "projectname",
  "private_key_id": "ds67f57s6df5sd76f57s6df57sdf67sdf76df",
  "private_key": "PRIVATE_KEY",
  "client_email": "email",
  "client_id": "2348238472834782348723",
  "auth_uri": "https://accounts.google.com/o/oauth2/auth",
  "token_uri": "https://oauth2.googleapis.com/token",
  "auth_provider_x509_cert_url": "https://www.googleapis.com/oauth2/v1/certs",
  "client_x509_cert_url": "cert_url"
}
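The component validates these credentials internally, but a quick sanity check of the pasted JSON can save a failed flow run. Below is a minimal sketch in Node.js (the platform the component runs on); the `parseServiceAccount` helper is hypothetical and not part of the component, and the required field names follow the example above:

```javascript
// Hypothetical helper (not part of the component): sanity-check the pasted
// Service Account JSON before configuring the credential.
function parseServiceAccount(raw) {
  const creds = JSON.parse(raw);
  const required = ['type', 'project_id', 'private_key', 'client_email'];
  for (const field of required) {
    if (!creds[field]) {
      throw new Error(`Missing required field: ${field}`);
    }
  }
  if (creds.type !== 'service_account') {
    throw new Error(`Unexpected credentials type: ${creds.type}`);
  }
  return creds;
}

// Using the (redacted) example values from above:
const creds = parseServiceAccount(JSON.stringify({
  type: 'service_account',
  project_id: 'projectname',
  private_key: 'PRIVATE_KEY',
  client_email: 'email',
}));
console.log(creds.project_id); // 'projectname'
```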

Triggers

This component has no trigger functions. This means it cannot be selected as the first component during integration flow design.

Actions

Query

Performs a query provided by the user.

Expected input metadata

Input metadata includes query options as described in the Google BigQuery documentation. The query option is required.

{
  "type": "object",
  "required": true,
  "properties": {
    "query": {
      "type": "string",
      "required": true
    }
  }
}

Expected output metadata

{
  "type": "object",
  "required": true,
  "properties": {
    "result": {
      "required": true,
      "type": "array"
    }
  }
}

Limitations

Query options are an experimental feature, and correct behavior is not guaranteed. Only the query, location, dryRun, useQueryCache, useLegacySql, parameterMode, and maximumBytesBilled options have been tested.
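For reference, a Query action input exercising only the tested options might look like the sketch below. The table name and values are illustrative placeholders, not from a real project:

```javascript
// Illustrative Query action input, restricted to the tested options above.
const queryOptions = {
  query: 'SELECT comment FROM `projectname.dataset.table` LIMIT 10',
  location: 'US',                   // where the query job runs
  dryRun: false,                    // set true to validate without executing
  useQueryCache: true,              // reuse cached results when possible
  useLegacySql: false,              // use standard SQL syntax
  maximumBytesBilled: '1000000000', // fail the query beyond this billing limit
};
console.log(typeof queryOptions.query); // 'string'
```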

Insert Rows as Stream

Inserts an array of rows into the table as a stream.

Configuration Fields

  • Dataset - (required, string) The dataset to insert rows into.
  • Table - (required, string) The table to insert rows into.
  • Throw error if insert fails - (required, checkbox) If selected, a default PartialFailureError will be thrown if the insert fails. Otherwise, an object containing the error details will be emitted.

Input metadata

Input metadata includes an array of JSON objects that must match the table schema. The array may contain one or more objects.

  • Rows - (array, required) An array of JSON objects, each representing a row.

Example:

Integrator Mode:

[{"comment": "Lorem ipsum"}, {"comment": "dolor"}]

Developer Mode:

{
  "rows": [
    {
      "comment": "Lorem ipsum"
    },
    {
      "comment": "dolor"
    }
  ]
}
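Both modes above carry the same rows, either as a bare array (Integrator Mode) or wrapped in an object (Developer Mode). A sketch of reducing either shape to a single array; the `normalizeRows` helper is hypothetical, not the component's actual code:

```javascript
// Hypothetical normalizer: accept either input shape shown above and
// return the plain array of row objects.
function normalizeRows(body) {
  if (Array.isArray(body)) return body;                    // Integrator Mode
  if (body && Array.isArray(body.rows)) return body.rows;  // Developer Mode
  throw new Error('Expected an array of rows or an object with a "rows" array');
}

const fromIntegrator = normalizeRows([{ comment: 'Lorem ipsum' }, { comment: 'dolor' }]);
const fromDeveloper = normalizeRows({ rows: [{ comment: 'Lorem ipsum' }, { comment: 'dolor' }] });
console.log(fromIntegrator.length, fromDeveloper.length); // 2 2
```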

Output metadata

In case of a successful insert, an object with an empty errors array inside will be emitted:

{
  "errors": []
}

Otherwise, the errors array will contain the errors for all the rows that were sent, e.g.:

{
  "errors": [
    {
      "errors": [
        {
          "message": "no such field: commddent.",
          "reason": "invalid"
        }
      ],
      "row": {
        "commddent": "Lorem ipsum"
      }
    },
    {
      "errors": [
        {
          "message": "",
          "reason": "stopped"
        }
      ],
      "row": {
        "comment": "dolor"
      }
    }
  ]
}
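A downstream step may only need one entry per failed row. The sketch below flattens the emitted errors object into that shape; `summarizeInsertErrors` is a hypothetical helper, and the input follows the example above:

```javascript
// Hypothetical helper: flatten the emitted errors object into one
// summary entry (row plus failure reasons) per failed row.
function summarizeInsertErrors(output) {
  return output.errors.map((entry) => ({
    row: entry.row,
    reasons: entry.errors.map((e) => e.reason),
  }));
}

const summary = summarizeInsertErrors({
  errors: [
    {
      errors: [{ message: 'no such field: commddent.', reason: 'invalid' }],
      row: { commddent: 'Lorem ipsum' },
    },
    {
      errors: [{ message: '', reason: 'stopped' }],
      row: { comment: 'dolor' },
    },
  ],
});
console.log(summary.map((s) => s.reasons[0]).join(', ')); // 'invalid, stopped'
```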
