Conversation


@lolo115 lolo115 commented Dec 11, 2025

Changes

Creates the Oracle extractor.
Unit tests pass for:

  • the dialog that configures the extractor
  • the run scripts that gather data
  • population of the local database with data extracted from the Oracle database
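The local-database population case in the last bullet can be sketched as follows. This is a minimal, hypothetical illustration: it uses stdlib `sqlite3` as a stand-in for the DuckDB file the profiler actually writes, and the `table_stats` schema and mock rows are invented for the example, not taken from this PR.

```python
import sqlite3

# Hypothetical rows an Oracle extractor run might return.
mock_rows = [
    ("HR", "EMPLOYEES", 107),
    ("HR", "DEPARTMENTS", 27),
]

# Populate a local database with the mocked extract (sqlite3 stands in
# for the DuckDB file the profiler writes).
con = sqlite3.connect(":memory:")
con.execute(
    "CREATE TABLE table_stats (schema_name TEXT, table_name TEXT, row_count INTEGER)"
)
con.executemany("INSERT INTO table_stats VALUES (?, ?, ?)", mock_rows)

# A unit test would then assert on what landed in the local database.
populated = con.execute("SELECT COUNT(*) FROM table_stats").fetchone()[0]
```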

What does this PR do?

Oracle profiler data extraction

Relevant implementation details

The data analysis is still WIP.

Caveats/things to watch out for when reviewing:

Linked issues

Resolves #..

Functionality

  • added relevant user documentation
  • added new CLI command
  • modified existing command: databricks labs lakebridge ...
  • ... +add your own

Tests

  • manually tested
  • added unit tests
  • added integration tests

@github-actions

❌ 43/54 passed, 11 failed, 4m35s total

❌ test_validate_table_not_found: duckdb.duckdb.IOException: IO Error: Could not set lock on file "/tmp/data/synapse_assessment/mock_profiler_extract.db": Conflicting lock is held in /opt/hostedtoolcache/Python/3.10.19/x64/bin/python3.10 (PID 3509). See also https://duckdb.org/docs/connect/concurrency (14ms)
duckdb.duckdb.IOException: IO Error: Could not set lock on file "/tmp/data/synapse_assessment/mock_profiler_extract.db": Conflicting lock is held in /opt/hostedtoolcache/Python/3.10.19/x64/bin/python3.10 (PID 3509). See also https://duckdb.org/docs/connect/concurrency
[gw3] linux -- Python 3.10.19 /home/runner/work/lakebridge/lakebridge/.venv/bin/python
❌ test_validate_non_empty_tables: duckdb.duckdb.IOException: IO Error: Could not set lock on file "/tmp/data/synapse_assessment/mock_profiler_extract.db": Conflicting lock is held in /opt/hostedtoolcache/Python/3.10.19/x64/bin/python3.10 (PID 3509). See also https://duckdb.org/docs/connect/concurrency (169ms)
duckdb.duckdb.IOException: IO Error: Could not set lock on file "/tmp/data/synapse_assessment/mock_profiler_extract.db": Conflicting lock is held in /opt/hostedtoolcache/Python/3.10.19/x64/bin/python3.10 (PID 3509). See also https://duckdb.org/docs/connect/concurrency
[gw9] linux -- Python 3.10.19 /home/runner/work/lakebridge/lakebridge/.venv/bin/python
❌ test_validate_successful_schema_check: duckdb.duckdb.IOException: IO Error: Could not set lock on file "/tmp/data/synapse_assessment/mock_profiler_extract.db": Conflicting lock is held in /opt/hostedtoolcache/Python/3.10.19/x64/bin/python3.10 (PID 3509). See also https://duckdb.org/docs/connect/concurrency (1ms)
duckdb.duckdb.IOException: IO Error: Could not set lock on file "/tmp/data/synapse_assessment/mock_profiler_extract.db": Conflicting lock is held in /opt/hostedtoolcache/Python/3.10.19/x64/bin/python3.10 (PID 3509). See also https://duckdb.org/docs/connect/concurrency
[gw3] linux -- Python 3.10.19 /home/runner/work/lakebridge/lakebridge/.venv/bin/python
❌ test_validate_mixed_checks: duckdb.duckdb.IOException: IO Error: Could not set lock on file "/tmp/data/synapse_assessment/mock_profiler_extract.db": Conflicting lock is held in /opt/hostedtoolcache/Python/3.10.19/x64/bin/python3.10 (PID 3509). See also https://duckdb.org/docs/connect/concurrency (1ms)
duckdb.duckdb.IOException: IO Error: Could not set lock on file "/tmp/data/synapse_assessment/mock_profiler_extract.db": Conflicting lock is held in /opt/hostedtoolcache/Python/3.10.19/x64/bin/python3.10 (PID 3509). See also https://duckdb.org/docs/connect/concurrency
[gw9] linux -- Python 3.10.19 /home/runner/work/lakebridge/lakebridge/.venv/bin/python
❌ test_Oracle_profile_execution: FileNotFoundError: Configuration file not found for source oracle: Credentials file not found at /home/runner/.databricks/labs/lakebridge/.credentials.yml (43ms)
FileNotFoundError: Configuration file not found for source oracle: Credentials file not found at /home/runner/.databricks/labs/lakebridge/.credentials.yml
[gw4] linux -- Python 3.10.19 /home/runner/work/lakebridge/lakebridge/.venv/bin/python
14:55 ERROR [databricks.labs.lakebridge.assessments.profiler] Configuration file not found for source oracle: Credentials file not found at /home/runner/.databricks/labs/lakebridge/.credentials.yml
14:55 ERROR [databricks.labs.lakebridge.assessments.profiler] Configuration file not found for source oracle: Credentials file not found at /home/runner/.databricks/labs/lakebridge/.credentials.yml
[gw4] linux -- Python 3.10.19 /home/runner/work/lakebridge/lakebridge/.venv/bin/python
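The credentials failure above suggests `test_Oracle_profile_execution` assumes `~/.databricks/labs/lakebridge/.credentials.yml` exists on the CI runner. One way to make the test skip rather than fail in that case, sketched with a hypothetical helper name (the path is taken from the error message):

```python
from pathlib import Path

# Path copied from the FileNotFoundError above; the helper name is hypothetical.
CREDENTIALS = Path.home() / ".databricks" / "labs" / "lakebridge" / ".credentials.yml"

def should_skip_oracle_profile_test() -> bool:
    # Skip (rather than fail) when the runner has no Oracle credentials file.
    return not CREDENTIALS.exists()
```

With pytest this would typically be expressed as `@pytest.mark.skipif(not CREDENTIALS.exists(), reason="no Oracle credentials on this runner")` on the test itself.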
❌ test_validate_invalid_schema_check: duckdb.duckdb.IOException: IO Error: Could not set lock on file "/tmp/data/synapse_assessment/mock_profiler_extract.db": Conflicting lock is held in /opt/hostedtoolcache/Python/3.10.19/x64/bin/python3.10 (PID 3509). See also https://duckdb.org/docs/connect/concurrency (5ms)
duckdb.duckdb.IOException: IO Error: Could not set lock on file "/tmp/data/synapse_assessment/mock_profiler_extract.db": Conflicting lock is held in /opt/hostedtoolcache/Python/3.10.19/x64/bin/python3.10 (PID 3509). See also https://duckdb.org/docs/connect/concurrency
[gw4] linux -- Python 3.10.19 /home/runner/work/lakebridge/lakebridge/.venv/bin/python
❌ test_transpiles_informatica_to_sparksql_non_interactive[True]: AssertionError: assert [{'total_files_processed': 1, 'total_queries_processed': 1, 'analysis_error_count': 0, 'parsing_error_count': 1, 'validation_error_count': 0, 'generation_error_count': 0, 'error_log_file': '/tmp/pytest-of-runner/pytest-0/popen-gw4/test_transpiles_informatica_to0/errors.log'}] == [{'total_files_processed': 1, 'total_queries_processed': 1, 'analysis_error_count': 0, 'parsing_error_count': 0, 'validation_error_count': 0, 'generation_error_count': 0, 'error_log_file': None}] (23.863s)
AssertionError: assert [{'total_files_processed': 1, 'total_queries_processed': 1, 'analysis_error_count': 0, 'parsing_error_count': 1, 'validation_error_count': 0, 'generation_error_count': 0, 'error_log_file': '/tmp/pytest-of-runner/pytest-0/popen-gw4/test_transpiles_informatica_to0/errors.log'}] == [{'total_files_processed': 1, 'total_queries_processed': 1, 'analysis_error_count': 0, 'parsing_error_count': 0, 'validation_error_count': 0, 'generation_error_count': 0, 'error_log_file': None}]
  
  At index 0 diff: {'total_files_processed': 1, 'total_queries_processed': 1, 'analysis_error_count': 0, 'parsing_error_count': 1, 'validation_error_count': 0, 'generation_error_count': 0, 'error_log_file': '/tmp/pytest-of-runner/pytest-0/popen-gw4/test_transpiles_informatica_to0/errors.log'} != {'total_files_processed': 1, 'total_queries_processed': 1, 'analysis_error_count': 0, 'parsing_error_count': 0, 'validation_error_count': 0, 'generation_error_count': 0, 'error_log_file': None}
  
  Full diff:
    [
        {
            'analysis_error_count': 0,
  -         'error_log_file': None,
  +         'error_log_file': '/tmp/pytest-of-runner/pytest-0/popen-gw4/test_transpiles_informatica_to0/errors.log',
            'generation_error_count': 0,
  -         'parsing_error_count': 0,
  ?                                ^
  +         'parsing_error_count': 1,
  ?                                ^
            'total_files_processed': 1,
            'total_queries_processed': 1,
            'validation_error_count': 0,
        },
    ]
14:55 INFO [databricks.labs.lakebridge.transpiler.installers] Installing Databricks bladebridge transpiler (v0.1.22)
14:55 INFO [databricks.labs.lakebridge.transpiler.installers] Successfully installed bladebridge transpiler (v0.1.22)
[gw4] linux -- Python 3.10.19 /home/runner/work/lakebridge/lakebridge/.venv/bin/python
14:55 INFO [databricks.labs.lakebridge.transpiler.installers] Installing Databricks bladebridge transpiler (v0.1.22)
14:55 INFO [databricks.labs.lakebridge.transpiler.installers] Successfully installed bladebridge transpiler (v0.1.22)
14:55 INFO [databricks.labs.lakebridge.helpers.db_sql] Using SQL backend with warehouse_id: DATABRICKS_WAREHOUSE_ID
14:55 INFO [databricks.labs.lakebridge.transpiler.execute] SQL Backend used for query validation: StatementExecutionBackend
14:55 INFO [databricks.labs.lakebridge.transpiler.execute] Processed file: /home/runner/work/lakebridge/lakebridge/tests/resources/functional/informatica/wf_m_employees_load.XML (errors: 1)
14:55 INFO [databricks.labs.lakebridge.transpiler.execute] SQL validation errors: 0
14:55 INFO [databricks.labs.lakebridge.transpiler.execute] Done transpiling.
14:55 ERROR [databricks.labs.lakebridge] /home/runner/work/lakebridge/lakebridge/tests/resources/functional/informatica/wf_m_employees_load.XML: 1  found
14:55 DEBUG [tests.integration.transpile.test_bladebridge] Captured 0 log file(s): []
14:55 INFO [databricks.labs.lakebridge.transpiler.installers] Installing Databricks bladebridge transpiler (v0.1.22)
14:55 INFO [databricks.labs.lakebridge.transpiler.installers] Successfully installed bladebridge transpiler (v0.1.22)
14:55 INFO [databricks.labs.lakebridge.helpers.db_sql] Using SQL backend with warehouse_id: DATABRICKS_WAREHOUSE_ID
14:55 INFO [databricks.labs.lakebridge.transpiler.execute] SQL Backend used for query validation: StatementExecutionBackend
14:55 INFO [databricks.labs.lakebridge.transpiler.execute] Processed file: /home/runner/work/lakebridge/lakebridge/tests/resources/functional/informatica/wf_m_employees_load.XML (errors: 1)
14:55 INFO [databricks.labs.lakebridge.transpiler.execute] SQL validation errors: 0
14:55 INFO [databricks.labs.lakebridge.transpiler.execute] Done transpiling.
14:55 ERROR [databricks.labs.lakebridge] /home/runner/work/lakebridge/lakebridge/tests/resources/functional/informatica/wf_m_employees_load.XML: 1  found
14:55 DEBUG [tests.integration.transpile.test_bladebridge] Captured 0 log file(s): []
14:55 ERROR [tests.integration.transpile.test_bladebridge] Error logged: TranspileError(code=FAILURE, kind=PARSING, severity=ERROR, path='/home/runner/work/lakebridge/lakebridge/tests/resources/functional/informatica/wf_m_employees_load.XML', message='Command '['/tmp/pytest-of-runner/pytest-0/popen-gw4/labs0/remorph-transpilers/bladebridge/lib/.venv/lib/python3.10/site-packages/databricks/labs/bladebridge/Converter/bin/Linux/dbxconv', 'INFA', '-u', '/tmp/pytest-of-runner/pytest-0/popen-gw4/test_transpiles_informatica_to0/overrides.json', '-n', 'transpiled', '-i', 'originals/wf_m_employees_load.XML', '-v', '-H', '7e6ce4e794e4427662a6075962a210c4693adb5f']' returned non-zero exit status 255.')
[gw4] linux -- Python 3.10.19 /home/runner/work/lakebridge/lakebridge/.venv/bin/python
❌ test_transpiles_informatica_to_sparksql: AssertionError: assert [{'total_files_processed': 1, 'total_queries_processed': 1, 'analysis_error_count': 0, 'parsing_error_count': 1, 'validation_error_count': 0, 'generation_error_count': 0, 'error_log_file': '/tmp/pytest-of-runner/pytest-0/popen-gw9/test_transpiles_informatica_to0/errors.log'}] == [{'total_files_processed': 1, 'total_queries_processed': 1, 'analysis_error_count': 0, 'parsing_error_count': 0, 'validation_error_count': 0, 'generation_error_count': 0, 'error_log_file': None}] (23.285s)
AssertionError: assert [{'total_files_processed': 1, 'total_queries_processed': 1, 'analysis_error_count': 0, 'parsing_error_count': 1, 'validation_error_count': 0, 'generation_error_count': 0, 'error_log_file': '/tmp/pytest-of-runner/pytest-0/popen-gw9/test_transpiles_informatica_to0/errors.log'}] == [{'total_files_processed': 1, 'total_queries_processed': 1, 'analysis_error_count': 0, 'parsing_error_count': 0, 'validation_error_count': 0, 'generation_error_count': 0, 'error_log_file': None}]
  
  At index 0 diff: {'total_files_processed': 1, 'total_queries_processed': 1, 'analysis_error_count': 0, 'parsing_error_count': 1, 'validation_error_count': 0, 'generation_error_count': 0, 'error_log_file': '/tmp/pytest-of-runner/pytest-0/popen-gw9/test_transpiles_informatica_to0/errors.log'} != {'total_files_processed': 1, 'total_queries_processed': 1, 'analysis_error_count': 0, 'parsing_error_count': 0, 'validation_error_count': 0, 'generation_error_count': 0, 'error_log_file': None}
  
  Full diff:
    [
        {
            'analysis_error_count': 0,
  -         'error_log_file': None,
  +         'error_log_file': '/tmp/pytest-of-runner/pytest-0/popen-gw9/test_transpiles_informatica_to0/errors.log',
            'generation_error_count': 0,
  -         'parsing_error_count': 0,
  ?                                ^
  +         'parsing_error_count': 1,
  ?                                ^
            'total_files_processed': 1,
            'total_queries_processed': 1,
            'validation_error_count': 0,
        },
    ]
14:55 INFO [databricks.labs.lakebridge.transpiler.installers] Installing Databricks bladebridge transpiler (v0.1.22)
14:55 INFO [databricks.labs.lakebridge.transpiler.installers] Successfully installed bladebridge transpiler (v0.1.22)
[gw9] linux -- Python 3.10.19 /home/runner/work/lakebridge/lakebridge/.venv/bin/python
14:55 INFO [databricks.labs.lakebridge.transpiler.installers] Installing Databricks bladebridge transpiler (v0.1.22)
14:55 INFO [databricks.labs.lakebridge.transpiler.installers] Successfully installed bladebridge transpiler (v0.1.22)
14:55 INFO [databricks.labs.lakebridge.transpiler.execute] Processed file: /home/runner/work/lakebridge/lakebridge/tests/resources/functional/informatica/wf_m_employees_load.XML (errors: 1)
14:55 INFO [databricks.labs.lakebridge.transpiler.execute] Done transpiling.
14:55 ERROR [databricks.labs.lakebridge] /home/runner/work/lakebridge/lakebridge/tests/resources/functional/informatica/wf_m_employees_load.XML: 1  found
14:55 DEBUG [tests.integration.transpile.test_bladebridge] Captured 0 log file(s): []
14:55 INFO [databricks.labs.lakebridge.transpiler.installers] Installing Databricks bladebridge transpiler (v0.1.22)
14:55 INFO [databricks.labs.lakebridge.transpiler.installers] Successfully installed bladebridge transpiler (v0.1.22)
14:55 INFO [databricks.labs.lakebridge.transpiler.execute] Processed file: /home/runner/work/lakebridge/lakebridge/tests/resources/functional/informatica/wf_m_employees_load.XML (errors: 1)
14:55 INFO [databricks.labs.lakebridge.transpiler.execute] Done transpiling.
14:55 ERROR [databricks.labs.lakebridge] /home/runner/work/lakebridge/lakebridge/tests/resources/functional/informatica/wf_m_employees_load.XML: 1  found
14:55 DEBUG [tests.integration.transpile.test_bladebridge] Captured 0 log file(s): []
14:55 ERROR [tests.integration.transpile.test_bladebridge] Error logged: TranspileError(code=FAILURE, kind=PARSING, severity=ERROR, path='/home/runner/work/lakebridge/lakebridge/tests/resources/functional/informatica/wf_m_employees_load.XML', message='Command '['/tmp/pytest-of-runner/pytest-0/popen-gw9/labs0/remorph-transpilers/bladebridge/lib/.venv/lib/python3.10/site-packages/databricks/labs/bladebridge/Converter/bin/Linux/dbxconv', 'INFA', '-u', 'base_infapc2databricks_sparksql.json', '-n', 'transpiled', '-i', 'originals/wf_m_employees_load.XML', '-v', '-H', 'a526b3617f9afb689dfbf839654b5a41693adb60']' returned non-zero exit status 255.')
[gw9] linux -- Python 3.10.19 /home/runner/work/lakebridge/lakebridge/.venv/bin/python
❌ test_transpile_teradata_sql: AssertionError: assert [{'total_files_processed': 2, 'total_queries_processed': 2, 'analysis_error_count': 0, 'parsing_error_count': 2, 'validation_error_count': 2, 'generation_error_count': 0, 'error_log_file': '/tmp/pytest-of-runner/pytest-0/popen-gw3/test_transpile_teradata_sql0/errors.log'}] == [{'total_files_processed': 2, 'total_queries_processed': 2, 'analysis_error_count': 0, 'parsing_error_count': 0, 'validation_error_count': 1, 'generation_error_count': 0, 'error_log_file': '/tmp/pytest-of-runner/pytest-0/popen-gw3/test_transpile_teradata_sql0/errors.log'}] (26.386s)
AssertionError: assert [{'total_files_processed': 2, 'total_queries_processed': 2, 'analysis_error_count': 0, 'parsing_error_count': 2, 'validation_error_count': 2, 'generation_error_count': 0, 'error_log_file': '/tmp/pytest-of-runner/pytest-0/popen-gw3/test_transpile_teradata_sql0/errors.log'}] == [{'total_files_processed': 2, 'total_queries_processed': 2, 'analysis_error_count': 0, 'parsing_error_count': 0, 'validation_error_count': 1, 'generation_error_count': 0, 'error_log_file': '/tmp/pytest-of-runner/pytest-0/popen-gw3/test_transpile_teradata_sql0/errors.log'}]
  
  At index 0 diff: {'total_files_processed': 2, 'total_queries_processed': 2, 'analysis_error_count': 0, 'parsing_error_count': 2, 'validation_error_count': 2, 'generation_error_count': 0, 'error_log_file': '/tmp/pytest-of-runner/pytest-0/popen-gw3/test_transpile_teradata_sql0/errors.log'} != {'total_files_processed': 2, 'total_queries_processed': 2, 'analysis_error_count': 0, 'parsing_error_count': 0, 'validation_error_count': 1, 'generation_error_count': 0, 'error_log_file': '/tmp/pytest-of-runner/pytest-0/popen-gw3/test_transpile_teradata_sql0/errors.log'}
  
  Full diff:
    [
        {
            'analysis_error_count': 0,
            'error_log_file': '/tmp/pytest-of-runner/pytest-0/popen-gw3/test_transpile_teradata_sql0/errors.log',
            'generation_error_count': 0,
  -         'parsing_error_count': 0,
  ?                                ^
  +         'parsing_error_count': 2,
  ?                                ^
            'total_files_processed': 2,
            'total_queries_processed': 2,
  -         'validation_error_count': 1,
  ?                                   ^
  +         'validation_error_count': 2,
  ?                                   ^
        },
    ]
14:55 INFO [databricks.labs.lakebridge.transpiler.installers] Installing Databricks bladebridge transpiler (v0.1.22)
14:55 INFO [databricks.labs.lakebridge.transpiler.installers] Successfully installed bladebridge transpiler (v0.1.22)
[gw3] linux -- Python 3.10.19 /home/runner/work/lakebridge/lakebridge/.venv/bin/python
14:55 INFO [databricks.labs.lakebridge.transpiler.installers] Installing Databricks bladebridge transpiler (v0.1.22)
14:55 INFO [databricks.labs.lakebridge.transpiler.installers] Successfully installed bladebridge transpiler (v0.1.22)
14:55 INFO [databricks.labs.lakebridge.helpers.db_sql] Using SQL backend with warehouse_id: DATABRICKS_WAREHOUSE_ID
14:55 INFO [databricks.labs.lakebridge.transpiler.execute] SQL Backend used for query validation: StatementExecutionBackend
14:55 INFO [databricks.labs.lakebridge.transpiler.execute] Processed file: /home/runner/work/lakebridge/lakebridge/tests/resources/functional/teradata/integration/dummy_function.sql (errors: 2)
14:55 INFO [databricks.labs.lakebridge.transpiler.execute] Processed file: /home/runner/work/lakebridge/lakebridge/tests/resources/functional/teradata/integration/create_ddl.sql (errors: 2)
14:55 INFO [databricks.labs.lakebridge.transpiler.execute] SQL validation errors: 2
14:55 INFO [databricks.labs.lakebridge.transpiler.execute] Done transpiling.
14:55 ERROR [databricks.labs.lakebridge] /home/runner/work/lakebridge/lakebridge/tests/resources/functional/teradata/integration/dummy_function.sql: 1 , 1  found
14:55 ERROR [databricks.labs.lakebridge] /home/runner/work/lakebridge/lakebridge/tests/resources/functional/teradata/integration/create_ddl.sql: 1 , 1  found
14:55 DEBUG [tests.integration.transpile.test_bladebridge] Captured 0 log file(s): []
14:55 INFO [databricks.labs.lakebridge.transpiler.installers] Installing Databricks bladebridge transpiler (v0.1.22)
14:55 INFO [databricks.labs.lakebridge.transpiler.installers] Successfully installed bladebridge transpiler (v0.1.22)
14:55 INFO [databricks.labs.lakebridge.helpers.db_sql] Using SQL backend with warehouse_id: DATABRICKS_WAREHOUSE_ID
14:55 INFO [databricks.labs.lakebridge.transpiler.execute] SQL Backend used for query validation: StatementExecutionBackend
14:55 INFO [databricks.labs.lakebridge.transpiler.execute] Processed file: /home/runner/work/lakebridge/lakebridge/tests/resources/functional/teradata/integration/dummy_function.sql (errors: 2)
14:55 INFO [databricks.labs.lakebridge.transpiler.execute] Processed file: /home/runner/work/lakebridge/lakebridge/tests/resources/functional/teradata/integration/create_ddl.sql (errors: 2)
14:55 INFO [databricks.labs.lakebridge.transpiler.execute] SQL validation errors: 2
14:55 INFO [databricks.labs.lakebridge.transpiler.execute] Done transpiling.
14:55 ERROR [databricks.labs.lakebridge] /home/runner/work/lakebridge/lakebridge/tests/resources/functional/teradata/integration/dummy_function.sql: 1 , 1  found
14:55 ERROR [databricks.labs.lakebridge] /home/runner/work/lakebridge/lakebridge/tests/resources/functional/teradata/integration/create_ddl.sql: 1 , 1  found
14:55 DEBUG [tests.integration.transpile.test_bladebridge] Captured 0 log file(s): []
14:55 ERROR [tests.integration.transpile.test_bladebridge] Error logged: TranspileError(code=FAILURE, kind=PARSING, severity=ERROR, path='/home/runner/work/lakebridge/lakebridge/tests/resources/functional/teradata/integration/dummy_function.sql', message='Command '['/tmp/pytest-of-runner/pytest-0/popen-gw3/labs0/remorph-transpilers/bladebridge/lib/.venv/lib/python3.10/site-packages/databricks/labs/bladebridge/Converter/bin/Linux/dbxconv', 'SQL', '-u', 'base_teradata2databricks_sql.json', '-n', 'transpiled', '-i', 'originals/dummy_function.sql', '-s', 'TERADATA', '-v', '-H', '78175d2671829a254de178f8817126ba693adb60']' returned non-zero exit status 255.')
14:55 ERROR [tests.integration.transpile.test_bladebridge] Error logged: TranspileError(code=VALIDATION_ERROR, kind=VALIDATION, severity=WARNING, path='/home/runner/work/lakebridge/lakebridge/tests/resources/functional/teradata/integration/dummy_function.sql', message='[UNRESOLVED_ROUTINE] Cannot resolve routine `cole` on search path [`system`.`builtin`, `system`.`session`, `catalog`.`schema`].')
14:55 ERROR [tests.integration.transpile.test_bladebridge] Error logged: TranspileError(code=FAILURE, kind=PARSING, severity=ERROR, path='/home/runner/work/lakebridge/lakebridge/tests/resources/functional/teradata/integration/create_ddl.sql', message='Command '['/tmp/pytest-of-runner/pytest-0/popen-gw3/labs0/remorph-transpilers/bladebridge/lib/.venv/lib/python3.10/site-packages/databricks/labs/bladebridge/Converter/bin/Linux/dbxconv', 'SQL', '-u', 'base_teradata2databricks_sql.json', '-n', 'transpiled', '-i', 'originals/create_ddl.sql', '-s', 'TERADATA', '-v', '-H', 'd64f5197c4caf96a0c6cfeda2dc051a3693adb64']' returned non-zero exit status 255.')
14:55 ERROR [tests.integration.transpile.test_bladebridge] Error logged: TranspileError(code=VALIDATION_ERROR, kind=VALIDATION, severity=WARNING, path='/home/runner/work/lakebridge/lakebridge/tests/resources/functional/teradata/integration/create_ddl.sql', message='
14:55 ERROR [tests.integration.transpile.test_bladebridge] Error logged: [PARSE_SYNTAX_ERROR] Syntax error at or near ','. SQLSTATE: 42601 (line 2, pos 4)
14:55 ERROR [tests.integration.transpile.test_bladebridge] Error logged: 
14:55 ERROR [tests.integration.transpile.test_bladebridge] Error logged: == SQL ==
14:55 ERROR [tests.integration.transpile.test_bladebridge] Error logged: EXPLAIN CREATE TABLE REF_TABLE
14:55 ERROR [tests.integration.transpile.test_bladebridge] Error logged: ,NO FALLBACK
14:55 ERROR [tests.integration.transpile.test_bladebridge] Error logged: ----^^^
14:55 ERROR [tests.integration.transpile.test_bladebridge] Error logged: (
14:55 ERROR [tests.integration.transpile.test_bladebridge] Error logged: col1    BYTEINT NOT NULL,
14:55 ERROR [tests.integration.transpile.test_bladebridge] Error logged: col2    SMALLINT NOT NULL,
14:55 ERROR [tests.integration.transpile.test_bladebridge] Error logged: col3    INTEGER NOT NULL,
14:55 ERROR [tests.integration.transpile.test_bladebridge] Error logged: col4    BIGINT NOT NULL,
14:55 ERROR [tests.integration.transpile.test_bladebridge] Error logged: col5    DECIMAL(10,2) NOT NULL,
14:55 ERROR [tests.integration.transpile.test_bladebridge] Error logged: col6    DECIMAL(18,4) NOT NULL,
14:55 ERROR [tests.integration.transpile.test_bladebridge] Error logged: col7    TIMESTAMP(1) NOT NULL,
14:55 ERROR [tests.integration.transpile.test_bladebridge] Error logged: col8    TIME,
14:55 ERROR [tests.integration.transpile.test_bladebridge] Error logged: col9    TIMESTAMP(5) WITH TIME ZONE NOT NULL,
14:55 ERROR [tests.integration.transpile.test_bladebridge] Error logged: col10   CHAR(01) NOT NULL,
14:55 ERROR [tests.integration.transpile.test_bladebridge] Error logged: col11   CHAR(04) NOT NULL,
14:55 ERROR [tests.integration.transpile.test_bladebridge] Error logged: col12   CHAR(4),
14:55 ERROR [tests.integration.transpile.test_bladebridge] Error logged: col13   DECIMAL(10,0) NOT NULL,
14:55 ERROR [tests.integration.transpile.test_bladebridge] Error logged: col14   DECIMAL(18,6) NOT NULL,
14:55 ERROR [tests.integration.transpile.test_bladebridge] Error logged: col15   DECIMAL(18,1) NOT NULL DEFAULT 0.0,
14:55 ERROR [tests.integration.transpile.test_bladebridge] Error logged: col16   DATE FORMAT 'YY/MM/DD',
14:55 ERROR [tests.integration.transpile.test_bladebridge] Error logged: col17   VARCHAR(30) NOT CASESPECIFIC,
14:55 ERROR [tests.integration.transpile.test_bladebridge] Error logged: col18   FLOAT NOT NULL
14:55 ERROR [tests.integration.transpile.test_bladebridge] Error logged: )
14:55 ERROR [tests.integration.transpile.test_bladebridge] Error logged: UNIQUE PRIMARY INDEX (col1, col3);
14:55 ERROR [tests.integration.transpile.test_bladebridge] Error logged: ')
[gw3] linux -- Python 3.10.19 /home/runner/work/lakebridge/lakebridge/.venv/bin/python
❌ test_transpile_teradata_sql_non_interactive[True]: AssertionError: assert [{'total_files_processed': 2, 'total_queries_processed': 2, 'analysis_error_count': 0, 'parsing_error_count': 1, 'validation_error_count': 1, 'generation_error_count': 0, 'error_log_file': '/tmp/pytest-of-runner/pytest-0/popen-gw6/test_transpile_teradata_sql_no0/errors.log'}] == [{'total_files_processed': 2, 'total_queries_processed': 2, 'analysis_error_count': 0, 'parsing_error_count': 0, 'validation_error_count': 1, 'generation_error_count': 0, 'error_log_file': '/tmp/pytest-of-runner/pytest-0/popen-gw6/test_transpile_teradata_sql_no0/errors.log'}] (30.751s)
AssertionError: assert [{'total_files_processed': 2, 'total_queries_processed': 2, 'analysis_error_count': 0, 'parsing_error_count': 1, 'validation_error_count': 1, 'generation_error_count': 0, 'error_log_file': '/tmp/pytest-of-runner/pytest-0/popen-gw6/test_transpile_teradata_sql_no0/errors.log'}] == [{'total_files_processed': 2, 'total_queries_processed': 2, 'analysis_error_count': 0, 'parsing_error_count': 0, 'validation_error_count': 1, 'generation_error_count': 0, 'error_log_file': '/tmp/pytest-of-runner/pytest-0/popen-gw6/test_transpile_teradata_sql_no0/errors.log'}]
  
  At index 0 diff: {'total_files_processed': 2, 'total_queries_processed': 2, 'analysis_error_count': 0, 'parsing_error_count': 1, 'validation_error_count': 1, 'generation_error_count': 0, 'error_log_file': '/tmp/pytest-of-runner/pytest-0/popen-gw6/test_transpile_teradata_sql_no0/errors.log'} != {'total_files_processed': 2, 'total_queries_processed': 2, 'analysis_error_count': 0, 'parsing_error_count': 0, 'validation_error_count': 1, 'generation_error_count': 0, 'error_log_file': '/tmp/pytest-of-runner/pytest-0/popen-gw6/test_transpile_teradata_sql_no0/errors.log'}
  
  Full diff:
    [
        {
            'analysis_error_count': 0,
            'error_log_file': '/tmp/pytest-of-runner/pytest-0/popen-gw6/test_transpile_teradata_sql_no0/errors.log',
            'generation_error_count': 0,
  -         'parsing_error_count': 0,
  ?                                ^
  +         'parsing_error_count': 1,
  ?                                ^
            'total_files_processed': 2,
            'total_queries_processed': 2,
            'validation_error_count': 1,
        },
    ]
14:55 INFO [databricks.labs.lakebridge.transpiler.installers] Installing Databricks bladebridge transpiler (v0.1.22)
14:55 INFO [databricks.labs.lakebridge.transpiler.installers] Successfully installed bladebridge transpiler (v0.1.22)
[gw6] linux -- Python 3.10.19 /home/runner/work/lakebridge/lakebridge/.venv/bin/python
14:55 INFO [databricks.labs.lakebridge.transpiler.installers] Installing Databricks bladebridge transpiler (v0.1.22)
14:55 INFO [databricks.labs.lakebridge.transpiler.installers] Successfully installed bladebridge transpiler (v0.1.22)
14:55 INFO [databricks.labs.lakebridge.helpers.db_sql] Using SQL backend with warehouse_id: DATABRICKS_WAREHOUSE_ID
14:55 INFO [databricks.labs.lakebridge.transpiler.execute] SQL Backend used for query validation: StatementExecutionBackend
14:55 INFO [databricks.labs.lakebridge.transpiler.execute] Processed file: /home/runner/work/lakebridge/lakebridge/tests/resources/functional/teradata/integration/dummy_function.sql (errors: 2)
14:55 INFO [databricks.labs.lakebridge.transpiler.execute] SQL validation errors: 1
14:55 INFO [databricks.labs.lakebridge.transpiler.execute] Done transpiling.
14:55 ERROR [databricks.labs.lakebridge] /home/runner/work/lakebridge/lakebridge/tests/resources/functional/teradata/integration/dummy_function.sql: 1 , 1  found
14:55 DEBUG [tests.integration.transpile.test_bladebridge] Captured 0 log file(s): []
14:55 INFO [databricks.labs.lakebridge.transpiler.installers] Installing Databricks bladebridge transpiler (v0.1.22)
14:55 INFO [databricks.labs.lakebridge.transpiler.installers] Successfully installed bladebridge transpiler (v0.1.22)
14:55 INFO [databricks.labs.lakebridge.helpers.db_sql] Using SQL backend with warehouse_id: DATABRICKS_WAREHOUSE_ID
14:55 INFO [databricks.labs.lakebridge.transpiler.execute] SQL Backend used for query validation: StatementExecutionBackend
14:55 INFO [databricks.labs.lakebridge.transpiler.execute] Processed file: /home/runner/work/lakebridge/lakebridge/tests/resources/functional/teradata/integration/dummy_function.sql (errors: 2)
14:55 INFO [databricks.labs.lakebridge.transpiler.execute] SQL validation errors: 1
14:55 INFO [databricks.labs.lakebridge.transpiler.execute] Done transpiling.
14:55 ERROR [databricks.labs.lakebridge] /home/runner/work/lakebridge/lakebridge/tests/resources/functional/teradata/integration/dummy_function.sql: 1 , 1  found
14:55 DEBUG [tests.integration.transpile.test_bladebridge] Captured 0 log file(s): []
14:55 ERROR [tests.integration.transpile.test_bladebridge] Error logged: TranspileError(code=FAILURE, kind=PARSING, severity=ERROR, path='/home/runner/work/lakebridge/lakebridge/tests/resources/functional/teradata/integration/dummy_function.sql', message='Command '['/tmp/pytest-of-runner/pytest-0/popen-gw6/labs0/remorph-transpilers/bladebridge/lib/.venv/lib/python3.10/site-packages/databricks/labs/bladebridge/Converter/bin/Linux/dbxconv', 'SQL', '-u', '/tmp/pytest-of-runner/pytest-0/popen-gw6/test_transpile_teradata_sql_no0/overrides.json', '-n', 'transpiled', '-i', 'originals/dummy_function.sql', '-s', 'TERADATA', '-v', '-H', '70a8125744178606b242d3138213470a693adb5f']' returned non-zero exit status 255.')
14:55 ERROR [tests.integration.transpile.test_bladebridge] Error logged: TranspileError(code=VALIDATION_ERROR, kind=VALIDATION, severity=WARNING, path='/home/runner/work/lakebridge/lakebridge/tests/resources/functional/teradata/integration/dummy_function.sql', message='[UNRESOLVED_ROUTINE] Cannot resolve routine `cole` on search path [`system`.`builtin`, `system`.`session`, `catalog`.`schema`].')
[gw6] linux -- Python 3.10.19 /home/runner/work/lakebridge/lakebridge/.venv/bin/python
❌ test_transpile_teradata_sql_non_interactive[False]: AssertionError: assert [{'total_files_processed': 2, 'total_queries_processed': 2, 'analysis_error_count': 0, 'parsing_error_count': 2, 'validation_error_count': 2, 'generation_error_count': 0, 'error_log_file': '/tmp/pytest-of-runner/pytest-0/popen-gw6/test_transpile_teradata_sql_no1/errors.log'}] == [{'total_files_processed': 2, 'total_queries_processed': 2, 'analysis_error_count': 0, 'parsing_error_count': 0, 'validation_error_count': 1, 'generation_error_count': 0, 'error_log_file': '/tmp/pytest-of-runner/pytest-0/popen-gw6/test_transpile_teradata_sql_no1/errors.log'}] (5.864s)
AssertionError: assert [{'total_files_processed': 2, 'total_queries_processed': 2, 'analysis_error_count': 0, 'parsing_error_count': 2, 'validation_error_count': 2, 'generation_error_count': 0, 'error_log_file': '/tmp/pytest-of-runner/pytest-0/popen-gw6/test_transpile_teradata_sql_no1/errors.log'}] == [{'total_files_processed': 2, 'total_queries_processed': 2, 'analysis_error_count': 0, 'parsing_error_count': 0, 'validation_error_count': 1, 'generation_error_count': 0, 'error_log_file': '/tmp/pytest-of-runner/pytest-0/popen-gw6/test_transpile_teradata_sql_no1/errors.log'}]
  
  At index 0 diff: {'total_files_processed': 2, 'total_queries_processed': 2, 'analysis_error_count': 0, 'parsing_error_count': 2, 'validation_error_count': 2, 'generation_error_count': 0, 'error_log_file': '/tmp/pytest-of-runner/pytest-0/popen-gw6/test_transpile_teradata_sql_no1/errors.log'} != {'total_files_processed': 2, 'total_queries_processed': 2, 'analysis_error_count': 0, 'parsing_error_count': 0, 'validation_error_count': 1, 'generation_error_count': 0, 'error_log_file': '/tmp/pytest-of-runner/pytest-0/popen-gw6/test_transpile_teradata_sql_no1/errors.log'}
  
  Full diff:
    [
        {
            'analysis_error_count': 0,
            'error_log_file': '/tmp/pytest-of-runner/pytest-0/popen-gw6/test_transpile_teradata_sql_no1/errors.log',
            'generation_error_count': 0,
  -         'parsing_error_count': 0,
  ?                                ^
  +         'parsing_error_count': 2,
  ?                                ^
            'total_files_processed': 2,
            'total_queries_processed': 2,
  -         'validation_error_count': 1,
  ?                                   ^
  +         'validation_error_count': 2,
  ?                                   ^
        },
    ]
[gw6] linux -- Python 3.10.19 /home/runner/work/lakebridge/lakebridge/.venv/bin/python
14:55 INFO [databricks.labs.lakebridge.helpers.db_sql] Using SQL backend with warehouse_id: DATABRICKS_WAREHOUSE_ID
14:55 INFO [databricks.labs.lakebridge.transpiler.execute] SQL Backend used for query validation: StatementExecutionBackend
14:55 INFO [databricks.labs.lakebridge.transpiler.execute] Processed file: /home/runner/work/lakebridge/lakebridge/tests/resources/functional/teradata/integration/dummy_function.sql (errors: 2)
14:55 INFO [databricks.labs.lakebridge.transpiler.execute] Processed file: /home/runner/work/lakebridge/lakebridge/tests/resources/functional/teradata/integration/create_ddl.sql (errors: 2)
14:55 INFO [databricks.labs.lakebridge.transpiler.execute] SQL validation errors: 2
14:55 INFO [databricks.labs.lakebridge.transpiler.execute] Done transpiling.
14:55 ERROR [databricks.labs.lakebridge] /home/runner/work/lakebridge/lakebridge/tests/resources/functional/teradata/integration/dummy_function.sql: 1 , 1  found
14:55 ERROR [databricks.labs.lakebridge] /home/runner/work/lakebridge/lakebridge/tests/resources/functional/teradata/integration/create_ddl.sql: 1 , 1  found
14:55 DEBUG [tests.integration.transpile.test_bladebridge] Captured 0 log file(s): []
14:55 INFO [databricks.labs.lakebridge.helpers.db_sql] Using SQL backend with warehouse_id: DATABRICKS_WAREHOUSE_ID
14:55 INFO [databricks.labs.lakebridge.transpiler.execute] SQL Backend used for query validation: StatementExecutionBackend
14:55 INFO [databricks.labs.lakebridge.transpiler.execute] Processed file: /home/runner/work/lakebridge/lakebridge/tests/resources/functional/teradata/integration/dummy_function.sql (errors: 2)
14:55 INFO [databricks.labs.lakebridge.transpiler.execute] Processed file: /home/runner/work/lakebridge/lakebridge/tests/resources/functional/teradata/integration/create_ddl.sql (errors: 2)
14:55 INFO [databricks.labs.lakebridge.transpiler.execute] SQL validation errors: 2
14:55 INFO [databricks.labs.lakebridge.transpiler.execute] Done transpiling.
14:55 ERROR [databricks.labs.lakebridge] /home/runner/work/lakebridge/lakebridge/tests/resources/functional/teradata/integration/dummy_function.sql: 1 , 1  found
14:55 ERROR [databricks.labs.lakebridge] /home/runner/work/lakebridge/lakebridge/tests/resources/functional/teradata/integration/create_ddl.sql: 1 , 1  found
14:55 DEBUG [tests.integration.transpile.test_bladebridge] Captured 0 log file(s): []
14:55 ERROR [tests.integration.transpile.test_bladebridge] Error logged: TranspileError(code=FAILURE, kind=PARSING, severity=ERROR, path='/home/runner/work/lakebridge/lakebridge/tests/resources/functional/teradata/integration/dummy_function.sql', message='Command '['/tmp/pytest-of-runner/pytest-0/popen-gw6/labs0/remorph-transpilers/bladebridge/lib/.venv/lib/python3.10/site-packages/databricks/labs/bladebridge/Converter/bin/Linux/dbxconv', 'SQL', '-u', 'base_teradata2databricks_sql.json', '-n', 'transpiled', '-i', 'originals/dummy_function.sql', '-s', 'TERADATA', '-v', '-H', '8f2c0f84624ee86950f63c2c5705b4f7693adb6b']' returned non-zero exit status 255.')
14:55 ERROR [tests.integration.transpile.test_bladebridge] Error logged: TranspileError(code=VALIDATION_ERROR, kind=VALIDATION, severity=WARNING, path='/home/runner/work/lakebridge/lakebridge/tests/resources/functional/teradata/integration/dummy_function.sql', message='[UNRESOLVED_ROUTINE] Cannot resolve routine `cole` on search path [`system`.`builtin`, `system`.`session`, `catalog`.`schema`].')
14:55 ERROR [tests.integration.transpile.test_bladebridge] Error logged: TranspileError(code=FAILURE, kind=PARSING, severity=ERROR, path='/home/runner/work/lakebridge/lakebridge/tests/resources/functional/teradata/integration/create_ddl.sql', message='Command '['/tmp/pytest-of-runner/pytest-0/popen-gw6/labs0/remorph-transpilers/bladebridge/lib/.venv/lib/python3.10/site-packages/databricks/labs/bladebridge/Converter/bin/Linux/dbxconv', 'SQL', '-u', 'base_teradata2databricks_sql.json', '-n', 'transpiled', '-i', 'originals/create_ddl.sql', '-s', 'TERADATA', '-v', '-H', '039e7317d6e0a1cf2c2a2c05f86cb97d693adb6c']' returned non-zero exit status 255.')
14:55 ERROR [tests.integration.transpile.test_bladebridge] Error logged: TranspileError(code=VALIDATION_ERROR, kind=VALIDATION, severity=WARNING, path='/home/runner/work/lakebridge/lakebridge/tests/resources/functional/teradata/integration/create_ddl.sql', message='
14:55 ERROR [tests.integration.transpile.test_bladebridge] Error logged: [PARSE_SYNTAX_ERROR] Syntax error at or near ','. SQLSTATE: 42601 (line 2, pos 4)
14:55 ERROR [tests.integration.transpile.test_bladebridge] Error logged: 
14:55 ERROR [tests.integration.transpile.test_bladebridge] Error logged: == SQL ==
14:55 ERROR [tests.integration.transpile.test_bladebridge] Error logged: EXPLAIN CREATE TABLE REF_TABLE
14:55 ERROR [tests.integration.transpile.test_bladebridge] Error logged: ,NO FALLBACK
14:55 ERROR [tests.integration.transpile.test_bladebridge] Error logged: ----^^^
14:55 ERROR [tests.integration.transpile.test_bladebridge] Error logged: (
14:55 ERROR [tests.integration.transpile.test_bladebridge] Error logged: col1    BYTEINT NOT NULL,
14:55 ERROR [tests.integration.transpile.test_bladebridge] Error logged: col2    SMALLINT NOT NULL,
14:55 ERROR [tests.integration.transpile.test_bladebridge] Error logged: col3    INTEGER NOT NULL,
14:55 ERROR [tests.integration.transpile.test_bladebridge] Error logged: col4    BIGINT NOT NULL,
14:55 ERROR [tests.integration.transpile.test_bladebridge] Error logged: col5    DECIMAL(10,2) NOT NULL,
14:55 ERROR [tests.integration.transpile.test_bladebridge] Error logged: col6    DECIMAL(18,4) NOT NULL,
14:55 ERROR [tests.integration.transpile.test_bladebridge] Error logged: col7    TIMESTAMP(1) NOT NULL,
14:55 ERROR [tests.integration.transpile.test_bladebridge] Error logged: col8    TIME,
14:55 ERROR [tests.integration.transpile.test_bladebridge] Error logged: col9    TIMESTAMP(5) WITH TIME ZONE NOT NULL,
14:55 ERROR [tests.integration.transpile.test_bladebridge] Error logged: col10   CHAR(01) NOT NULL,
14:55 ERROR [tests.integration.transpile.test_bladebridge] Error logged: col11   CHAR(04) NOT NULL,
14:55 ERROR [tests.integration.transpile.test_bladebridge] Error logged: col12   CHAR(4),
14:55 ERROR [tests.integration.transpile.test_bladebridge] Error logged: col13   DECIMAL(10,0) NOT NULL,
14:55 ERROR [tests.integration.transpile.test_bladebridge] Error logged: col14   DECIMAL(18,6) NOT NULL,
14:55 ERROR [tests.integration.transpile.test_bladebridge] Error logged: col15   DECIMAL(18,1) NOT NULL DEFAULT 0.0,
14:55 ERROR [tests.integration.transpile.test_bladebridge] Error logged: col16   DATE FORMAT 'YY/MM/DD',
14:55 ERROR [tests.integration.transpile.test_bladebridge] Error logged: col17   VARCHAR(30) NOT CASESPECIFIC,
14:55 ERROR [tests.integration.transpile.test_bladebridge] Error logged: col18   FLOAT NOT NULL
14:55 ERROR [tests.integration.transpile.test_bladebridge] Error logged: )
14:55 ERROR [tests.integration.transpile.test_bladebridge] Error logged: UNIQUE PRIMARY INDEX (col1, col3);
14:55 ERROR [tests.integration.transpile.test_bladebridge] Error logged: ')
[gw6] linux -- Python 3.10.19 /home/runner/work/lakebridge/lakebridge/.venv/bin/python

Running from acceptance #3159

@codecov

codecov bot commented Dec 11, 2025

Codecov Report

❌ Patch coverage is 31.81818% with 15 lines in your changes missing coverage. Please review.
✅ Project coverage is 63.47%. Comparing base (5a297d2) to head (ea0f1ff).

Files with missing lines Patch % Lines
...abs/lakebridge/assessments/configure_assessment.py 16.66% 10 Missing ⚠️
...ks/labs/lakebridge/connections/database_manager.py 44.44% 5 Missing ⚠️
Additional details and impacted files
@@            Coverage Diff             @@
##             main    #2187      +/-   ##
==========================================
- Coverage   63.56%   63.47%   -0.10%     
==========================================
  Files         100      100              
  Lines        8503     8523      +20     
  Branches      885      886       +1     
==========================================
+ Hits         5405     5410       +5     
- Misses       2931     2946      +15     
  Partials      167      167              

☔ View full report in Codecov by Sentry.
