Data transfer failed from BigQuery to Snowflake #174

@eyepax-ChandanaM

Description

Hi there,
I was able to configure and run a data transfer from Google BigQuery to Snowflake successfully with the sling CLI,
but when I set the gc_bucket option in the BigQuery connection config, the data transfer job failed (see error.log):

100039 (22003): Numeric value '170712627001774500' is out of range
  File '"PUBLIC"."ADFORM_ENV_TMP"/2024-02-16T150145.423/part.01.0001.csv.zst', line 3, character 39
  Row 2, column "ADFORM_ENV_TMP"["TRANSACTION_ID":3]
  If you would like to continue loading when an error is encountered, use other values such as 'SKIP_FILE' or 'CONTINUE' for the ON_ERROR option. For more information on loading options, please run 'info loading_data' in a SQL client.

Please note that the data transfer completes successfully once the gc_bucket option is removed from the BigQuery connection config.

Note:
I'm using --mode full-refresh here.
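
For reference, the setup looks roughly like this (project, dataset, bucket, and object names are placeholders, the Snowflake connection is omitted, and connection keys other than gc_bucket are quoted from memory, so they may not match my exact config):

    # ~/.sling/env.yaml (sketch; values are placeholders)
    connections:
      BIGQUERY:
        type: bigquery
        project: my-gcp-project
        dataset: PUBLIC
        key_file: /path/to/service-account.json
        gc_bucket: my-staging-bucket   # removing this line makes the transfer succeed

    # transfer command (stream/object names are placeholders)
    sling run --src-conn BIGQUERY --src-stream PUBLIC.ADFORM_ENV \
      --tgt-conn SNOWFLAKE --tgt-object PUBLIC.ADFORM_ENV \
      --mode full-refresh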
