Full Code of klarna/HiveRunner for AI

Repository: klarna/HiveRunner
Branch: main
Commit: 0be34c87c421
Files: 184
Total size: 486.2 KB

Directory structure:
HiveRunner/

├── .github/
│   ├── CODEOWNERS
│   ├── ISSUE_TEMPLATE/
│   │   ├── bug_report.md
│   │   └── feature_request.md
│   └── workflows/
│       ├── deploy.yml
│       ├── main.yml
│       └── release.yml
├── .gitignore
├── CHANGELOG.md
├── CODE-OF-CONDUCT.md
├── CONTRIBUTING.md
├── LICENSE.txt
├── README.md
├── RELEASING.md
├── pom.xml
└── src/
    ├── main/
    │   ├── java/
    │   │   └── com/
    │   │       └── klarna/
    │   │           ├── hiverunner/
    │   │           │   ├── HiveRunnerCore.java
    │   │           │   ├── HiveRunnerExtension.java
    │   │           │   ├── HiveRunnerRule.java
    │   │           │   ├── HiveServerContainer.java
    │   │           │   ├── HiveServerContext.java
    │   │           │   ├── HiveShell.java
    │   │           │   ├── HiveShellContainer.java
    │   │           │   ├── StandaloneHiveRunner.java
    │   │           │   ├── StandaloneHiveServerContext.java
    │   │           │   ├── ThrowOnTimeout.java
    │   │           │   ├── TimeoutException.java
    │   │           │   ├── annotations/
    │   │           │   │   ├── HiveProperties.java
    │   │           │   │   ├── HiveResource.java
    │   │           │   │   ├── HiveRunnerSetup.java
    │   │           │   │   ├── HiveSQL.java
    │   │           │   │   └── HiveSetupScript.java
    │   │           │   ├── builder/
    │   │           │   │   ├── HiveResource.java
    │   │           │   │   ├── HiveRunnerScript.java
    │   │           │   │   ├── HiveShellBase.java
    │   │           │   │   ├── HiveShellBuilder.java
    │   │           │   │   ├── HiveShellTearable.java
    │   │           │   │   ├── Script.java
    │   │           │   │   └── Statement.java
    │   │           │   ├── config/
    │   │           │   │   └── HiveRunnerConfig.java
    │   │           │   ├── data/
    │   │           │   │   ├── Converters.java
    │   │           │   │   ├── FileParser.java
    │   │           │   │   ├── InsertIntoTable.java
    │   │           │   │   ├── TableDataBuilder.java
    │   │           │   │   ├── TableDataInserter.java
    │   │           │   │   └── TsvFileParser.java
    │   │           │   ├── io/
    │   │           │   │   └── IgnoreClosePrintStream.java
    │   │           │   └── sql/
    │   │           │       ├── HiveRunnerStatement.java
    │   │           │       ├── StatementLexer.java
    │   │           │       ├── cli/
    │   │           │       │   ├── AbstractImportPostProcessor.java
    │   │           │       │   ├── CommandShellEmulator.java
    │   │           │       │   ├── CommandShellEmulatorFactory.java
    │   │           │       │   ├── CommentUtil.java
    │   │           │       │   ├── DefaultPreProcessor.java
    │   │           │       │   ├── PostProcessor.java
    │   │           │       │   ├── PreProcessor.java
    │   │           │       │   ├── beeline/
    │   │           │       │   │   ├── BeelineEmulator.java
    │   │           │       │   │   ├── RunCommandPostProcessor.java
    │   │           │       │   │   └── SqlLineCommandRule.java
    │   │           │       │   └── hive/
    │   │           │       │       ├── HiveCliEmulator.java
    │   │           │       │       ├── PreV200HiveCliEmulator.java
    │   │           │       │       ├── PreV200HiveCliPreProcessor.java
    │   │           │       │       └── SourceCommandPostProcessor.java
    │   │           │       └── split/
    │   │           │           ├── BaseContext.java
    │   │           │           ├── CloseStatementRule.java
    │   │           │           ├── Consumer.java
    │   │           │           ├── Context.java
    │   │           │           ├── DefaultTokenRule.java
    │   │           │           ├── NewLineUtil.java
    │   │           │           ├── PreserveCommentsRule.java
    │   │           │           ├── PreserveQuotesRule.java
    │   │           │           ├── StatementSplitter.java
    │   │           │           └── TokenRule.java
    │   │           └── reflection/
    │   │               └── ReflectionUtils.java
    │   └── license/
    │       └── APACHE-2.txt
    └── test/
        ├── java/
        │   └── com/
        │       └── klarna/
        │           └── hiverunner/
        │               ├── AggregateViewTest.java
        │               ├── AnnotatedBaseTestClass.java
        │               ├── AnnotatedFieldsInSuperClassTest.java
        │               ├── BeelineRunTest.java
        │               ├── BigResultSetTest.java
        │               ├── CommentTest.java
        │               ├── CtasTest.java
        │               ├── DisabledTimeoutTest.java
        │               ├── ExecuteFileBasedScriptIntegrationTest.java
        │               ├── ExecuteScriptIntegrationTest.java
        │               ├── HiveCliSourceTest.java
        │               ├── HiveRunnerAnnotationsTest.java
        │               ├── HiveRunnerExtensionTest.java
        │               ├── HiveServerContainerTest.java
        │               ├── HiveShellBeeLineEmulationTest.java
        │               ├── HiveShellHiveCliEmulationTest.java
        │               ├── HiveVariablesTest.java
        │               ├── InsertIntoTableIntegrationTest.java
        │               ├── IntegerPartitionFormatTest.java
        │               ├── InteractiveHiveShellTest.java
        │               ├── LeftOuterJoinTest.java
        │               ├── MSCKRepairNpeTest.java
        │               ├── MacroTest.java
        │               ├── MethodLevelResourceTest.java
        │               ├── MultipleExecutionEnginesTest.java
        │               ├── NeverEndingUdf.java
        │               ├── NoTimeoutTest.java
        │               ├── OrcSnappyTest.java
        │               ├── ParquetInsertionTest.java
        │               ├── PartitionSupportTest.java
        │               ├── ReservedKeywordTest.java
        │               ├── ResourceOutputStreamTest.java
        │               ├── SchemaResetBetweenTestMethodsTest.java
        │               ├── SerdeTest.java
        │               ├── SetHiveExecutionEngineTest.java
        │               ├── SetPropertyTest.java
        │               ├── SetTest.java
        │               ├── SlowlyFailingUdf.java
        │               ├── TestMethodIntegrityTest.java
        │               ├── TimeoutAndRetryTest.java
        │               ├── ToUpperCaseSerDe.java
        │               ├── UnresolvedResourcePathTest.java
        │               ├── UserDefinedFunctionTest.java
        │               ├── builder/
        │               │   └── HiveShellBaseTest.java
        │               ├── config/
        │               │   └── HiveRunnerConfigTest.java
        │               ├── data/
        │               │   ├── ConvertersTest.java
        │               │   ├── InsertIntoTableTest.java
        │               │   ├── TableDataBuilderTest.java
        │               │   ├── TableDataInserterTest.java
        │               │   └── TsvFileParserTest.java
        │               ├── examples/
        │               │   ├── HelloAnnotatedHiveRunnerTest.java
        │               │   ├── HelloHiveRunnerParamaterizedTest.java
        │               │   ├── HelloHiveRunnerTest.java
        │               │   ├── InsertTestDataTest.java
        │               │   ├── SetHiveConfValuesTest.java
        │               │   └── junit4/
        │               │       ├── HelloAnnotatedHiveRunnerTest.java
        │               │       ├── HelloHiveRunnerTest.java
        │               │       ├── InsertTestDataTest.java
        │               │       └── SetHiveConfValuesTest.java
        │               ├── io/
        │               │   └── IgnoreClosePrintStreamTest.java
        │               └── sql/
        │                   ├── cli/
        │                   │   ├── AbstractImportPostProcessorTest.java
        │                   │   ├── CommandShellEmulatorFactoryTest.java
        │                   │   ├── CommentUtilTest.java
        │                   │   ├── beeline/
        │                   │   │   ├── BeelineEmulatorTest.java
        │                   │   │   ├── BeelineStatementSplitterTest.java
        │                   │   │   ├── RunCommandPostProcessorTest.java
        │                   │   │   └── SqlLineCommandRuleTest.java
        │                   │   └── hive/
        │                   │       ├── HiveCliEmulatorTest.java
        │                   │       ├── HiveCliStatementSplitterTest.java
        │                   │       ├── PreV200HiveCliEmulatorTest.java
        │                   │       └── SourceCommandPostProcessorTest.java
        │                   └── split/
        │                       ├── BaseContextTest.java
        │                       ├── CloseStatementRuleTest.java
        │                       ├── ConsumerEolTest.java
        │                       ├── DefaultTokenRuleTest.java
        │                       ├── NewLineUtilTest.java
        │                       ├── PreserveCommentsRuleTest.java
        │                       ├── PreserveQuotesRuleTest.java
        │                       └── StatementSplitterTest.java
        └── resources/
            ├── AggregateViewTest/
            │   └── create_table.sql
            ├── CommentTest/
            │   └── comment.sql
            ├── CtasTest/
            │   └── ctas.sql
            ├── HelloHiveRunnerTest/
            │   ├── calculate_max.sql
            │   ├── create_ctas.sql
            │   ├── create_max.sql
            │   ├── create_table.sql
            │   └── hello_hive_runner.csv
            ├── HiveRunnerAnnotationsTest/
            │   ├── hql1.sql
            │   ├── setupFile.csv
            │   ├── setupPath.csv
            │   ├── testData.csv
            │   └── testData2.csv
            ├── HiveRunnerExtensionTest/
            │   └── test_query.sql
            ├── InsertIntoTableIntegrationTest/
            │   ├── data.tsv
            │   └── dataWithCustomNullValue.csv
            ├── InsertTestDataTest/
            │   ├── data1.tsv
            │   ├── data2.tsv
            │   ├── dataWithHeader1.tsv
            │   └── dataWithHeader2.tsv
            ├── MethodLevelResourceTest/
            │   └── MethodLevelResourceTest.txt
            ├── OrcSnappyTest/
            │   └── ctas.sql
            ├── PartitionSupportTest/
            │   └── hql_example.sql
            ├── SerdeTest/
            │   ├── create_table.sql
            │   └── hql_custom_serde.sql
            ├── SetTest/
            │   └── test_with_set.hql
            ├── TsvFileParserTest/
            │   ├── data.csv
            │   ├── data.tsv
            │   ├── dataWithCustomNullValue.csv
            │   ├── dataWithHeader.csv
            │   └── dataWithHeader.tsv
            └── log4j2.xml

================================================
FILE CONTENTS
================================================

================================================
FILE: .github/CODEOWNERS
================================================
* @HiveRunner/hiverunner-committers


================================================
FILE: .github/ISSUE_TEMPLATE/bug_report.md
================================================
---
name: Bug report
about: Create a report to help us improve

---
<!-- 
 Before raising a bug report please consider the following:
   1. If you want to ask a question don't raise a bug report - rather use the mailing list at https://groups.google.com/forum/#!forum/hive-runner-user
   2. Please ensure that the bug you are reporting is actually in HiveRunner and not in Hive itself. Because HiveRunner tests Hive queries, if there
      are issues with your queries or Hive setup, HiveRunner will simply surface the errors that Hive itself throws, and users sometimes mistakenly report these
      as HiveRunner issues. The easiest way to check is to perform your query against Hive directly. If the issue still persists then it is not
      related to HiveRunner, so please don't report it here.
-->
**Describe the bug**
A clear and concise description of what the bug is.

**To Reproduce**
Steps to reproduce the behaviour, ideally including the configuration files you are using (feel free to rename any sensitive information like server and table names, etc.).
Even better would be the source code of your unit test or a pull request against the HiveRunner unit tests containing a test that demonstrates the issue.

**Expected behavior**
A clear and concise description of what you expected to happen.

**Logs**
Please add the log output from HiveRunner when the error occurs, full stack traces are especially useful.

**Versions (please complete the following information):**
 - HiveRunner Version: 
 - Hive Version: the version of Hive you are using for your tests

**Additional context**
Add any other context about the problem here.


================================================
FILE: .github/ISSUE_TEMPLATE/feature_request.md
================================================
---
name: Feature request
about: Suggest an idea for this project

---

**Is your feature request related to a problem? Please describe.**
A clear and concise description of what the problem is. For example - I'm always frustrated when [...]

**Describe the solution you'd like**
A clear and concise description of what you want to happen.

**Describe alternatives you've considered**
A clear and concise description of any alternative solutions or features you've considered.

**Additional context**
Add any other context or screenshots about the feature request here.


================================================
FILE: .github/workflows/deploy.yml
================================================
name: Deploy SNAPSHOT
on:
  workflow_dispatch:
    inputs:
      branch:
        description: "The branch to use to deploy a SNAPSHOT from."
        required: true
        default: "main"
jobs:
  deploy:
    name: Deploy SNAPSHOT to Sonatype
    runs-on: ubuntu-20.04
    steps:
    - uses: actions/checkout@v2
      with:
        fetch-depth: 0
        ref: ${{ github.event.inputs.branch }}
    - name: Set up JDK
      uses: actions/setup-java@v2
      with:
        distribution: 'adopt'
        java-version: '8'
        # this creates a settings.xml with the following server
        settings-path: ${{ github.workspace }}
        server-id: ossrh # Value of the distributionManagement/repository/id field of the pom.xml
        server-username: SONATYPE_USERNAME # env variable for username in deploy
        server-password: SONATYPE_PASSWORD # env variable for token in deploy        
        # only signed artifacts will be released to maven central. this sets up things for the maven-gpg-plugin
        gpg-private-key: ${{ secrets.GPG_PRIVATE_KEY }} # Value of the GPG private key to import
        gpg-passphrase: GPG_PASSPHRASE # env variable for GPG private key passphrase
       
    - name: Run Maven Targets
      run: mvn deploy --settings $GITHUB_WORKSPACE/settings.xml --batch-mode --show-version --no-transfer-progress --activate-profiles oss-release
      env:
        SONATYPE_PASSWORD: ${{ secrets.SONATYPE_PASSWORD }}
        SONATYPE_USERNAME: ${{ secrets.SONATYPE_USERNAME }}
        GPG_PASSPHRASE: ${{ secrets.GPG_PRIVATE_KEY_PASSPHRASE }}


================================================
FILE: .github/workflows/main.yml
================================================
name: build

on: 
  pull_request:
  push:
    branches: 
      - main

jobs:
  test:
    name: Package and run all tests
    runs-on: ubuntu-20.04
    steps:
    - uses: actions/checkout@v2
      with:
        fetch-depth: 0
    - name: Set up JDK
      uses: actions/setup-java@v2
      with:
        distribution: 'adopt'
        java-version: '8'
    - name: Run Maven Targets
      run: mvn package --batch-mode --show-version --no-transfer-progress


================================================
FILE: .github/workflows/release.yml
================================================
name: Release to Maven Central
on:
  workflow_dispatch:
    inputs:
      branch:
        description: "The branch to use to release from."
        required: true
        default: "main"
jobs:
  release:
    name: Release to Maven Central
    runs-on: ubuntu-20.04

    steps:
    - name: Checkout source code
      uses: actions/checkout@v2
      with:
        fetch-depth: 0
        ref: ${{ github.event.inputs.branch }}

    - name: Set up JDK
      uses: actions/setup-java@v2
      with:
        distribution: 'adopt'
        java-version: '8'
        # this creates a settings.xml with the following server
        settings-path: ${{ github.workspace }}
        server-id: ossrh # Value of the distributionManagement/repository/id field of the pom.xml
        server-username: SONATYPE_USERNAME # env variable for username in deploy
        server-password: SONATYPE_PASSWORD # env variable for token in deploy        
        # only signed artifacts will be released to maven central. this sets up things for the maven-gpg-plugin
        gpg-private-key: ${{ secrets.GPG_PRIVATE_KEY }} # Value of the GPG private key to import
        gpg-passphrase: GPG_PASSPHRASE # env variable for GPG private key passphrase

    - name: Configure Git User
      run: |
        git config user.email "actions@github.com"
        git config user.name "GitHub Actions"

    - name: Run Maven Targets
      run: mvn release:prepare release:perform --settings $GITHUB_WORKSPACE/settings.xml --activate-profiles oss-release --batch-mode --show-version --no-transfer-progress 
      env:
        SONATYPE_PASSWORD: ${{ secrets.SONATYPE_PASSWORD }}
        SONATYPE_USERNAME: ${{ secrets.SONATYPE_USERNAME }}
        GPG_PASSPHRASE: ${{secrets.GPG_PRIVATE_KEY_PASSPHRASE}}
        GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}


================================================
FILE: .gitignore
================================================
# Project files #
#################
*.iml
*.ipr
*.iws
nbactions.xml
/.idea/

# Compiled source #
###################
*.com
*.class
*.dll
*.exe
*.o
*.so

# Maven target #
################
/target/**

# H2 db files #
###############
mem.h2.db
mem.lock.db

# HSQLDB files #
################
testdb.log
testdb.properties
testdb.script

# Packages #
############
# it's better to unpack these files and commit the raw source
# git has its own built in compression methods
*.7z
*.dmg
*.gz
*.iso
*.jar
*.rar
*.tar
*.zip

# Logs and databases #
######################
*.log
*.sqlite
metastore_db

# OS generated files #
######################
.DS_Store
.DS_Store?
._*
.Spotlight-V100
.Trashes
Icon?
ehthumbs.db
Thumbs.db

# Temporary files #
###################
*~

# Eclipse #
###########
.classpath
.project
.settings/

# IntelliJ #
############
.gradle/
*.eml

# jEnv #
########
/.java-version

# Misc #
########
/pubring.kbx


================================================
FILE: CHANGELOG.md
================================================
# Changelog
All notable changes to this project will be documented in this file.

The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/) and this project adheres to [Semantic Versioning](http://semver.org/spec/v2.0.0.html).
## [7.0.0] - 2024-11-22
### Added
- Added version `6.0.9` of `datanucleus-core`.
- Added version `6.0.4` of `datanucleus-api-jdo`.
- Added version `6.0.9` of `datanucleus-rdbms`.
- Added version `1.3` of `javax.transaction-api`.
- Added version `6.1.14` of `spring-jdbc`.
- Added version `10.15.2.0` of `derby`.
- Added version `10.15.2.0` of `derbytools`.
- Added version `5.6.2` of `kryo`.
- Added version `4.9.3` of `antlr4-runtime`.
- Added version `4.0.1` of `kafka-handler`.
- Added missing Hive & Datanucleus properties in `StandaloneHiveServerContext` so the framework now works with the new Hive dependency versions.

### Changed
- Updated `hadoop-mapreduce-client-common` from `3.1.0` to `3.4.1`.
- Updated `hadoop-mapreduce-client-core` from `3.1.0` to `3.4.1`.
- Updated `hadoop-client-runtime` from `3.1.0` to `3.4.1`.
- Updated `hive-exec` from `3.1.2` to `4.0.1`.
- Updated `hive-serde` from `3.1.0` to `3.4.1`.
- Updated `hive-jdbc` from `3.1.0` to `3.4.1`.
- Updated `hive-contrib` from `3.1.0` to `3.4.1`.
- Updated `hive-webhcat-java-client` from `3.1.0` to `3.4.1`.
- Updated `jackson-annotations` from `2.9.5` to `2.18.1`.
- Updated `reflections` from `0.9.8` to `0.10.2`.
- Updated `mockito-core` from `3.8.0` to `5.14.2`.
- Updated `mockito-junit-jupiter` from `3.8.0` to `5.14.2`.
- Updated `tez-common` from `0.9.1` to `0.10.4`.
- Updated `tez-mapreduce` from `0.9.1` to `0.10.4`.
- Updated `junit-jupiter` from `5.7.1` to `5.11.3`.
- Updated `junit-vintage-engine` from `5.7.1` to `5.11.2`.
- Updated `maven-surefire-plugin` from `2.22.2` to `3.5.1`.
- Updated `maven-compiler-plugin` from `3.7.0` to `3.13.0`.
- Updated `maven-jar-plugin` from `3.2.0` to `3.4.2`.
- Updated `maven-release-plugin` from `3.0.0-M1` to `3.1.1`.
- Updated `nexus-staging-maven-plugin` from `1.6.8` to `1.7.0`.
- Updated `maven-source-plugin` from `3.2.0` to `3.3.1`.
- Updated `maven-javadoc-plugin` from `3.2.0` to `3.10.1`.
- Updated `maven-gpg-plugin` from `1.6` to `3.2.7`.
- Updated `HiveConf` property names in `StandaloneHiveServerContext`.
- Set `METASTORE_VALIDATE_CONSTRAINTS`, `METASTORE_VALIDATE_COLUMNS`, and `METASTORE_VALIDATE_TABLES` properties to false in `StandaloneHiveServerContext`.

### Removed
- Removed `com.google.common.base.Predicates` in `HiveRunnerExtension`/`StandaloneHiveRunner` as it is no longer needed with the new version of the `org.reflections:reflections` library.

### Fixed
- Fixed logging: the warning "org.apache.hadoop.hive.metastore.MetastoreDirectSqlUtils - Failed to execute [select "FUNCS"."FUNC_ID" from "FUNCS" LEFT JOIN "DBS" ON "FUNCS"."DB_ID" = "DBS"."DB_ID" where "DBS"."CTLG_NAME" = ? ]..." is no longer logged.
- Fixed `IgnoreClosePrintStream`, which threw an NPE after upgrading to Java 11 or later.
- Fixed error "org.apache.hadoop.hive.ql.exec.tez.DagUtils - Failed to add credential supplier java.lang.ClassNotFoundException: org.apache.hadoop.hive.kafka.KafkaDagCredentialSupplier".

## [6.1.0] - 2021-04-28
### Changed
- Maven Group Id changed from `com.klarna` to `io.github.hiverunner`.
- Set `HIVE_IN_TEST` to true in `StandaloneHiveServerContext` instead of `StandaloneHiveRunner` so checks for non-existent tables are skipped by both the JUnit4 runner and the JUnit5 extension (this removes a lot of log noise from tests using the latter).
- Made `HiveRunnerScript` constructor public.
- Made `scriptsUnderTest` variable in `HiveRunnerExtension` protected so it can be used in [MutantSwarm](https://github.com/HotelsDotCom/mutant-swarm).
- Fixed bug that appears in [Mutant Swarm](https://github.com/HotelsDotCom/mutant-swarm) when updating HiveRunner to version 5.2.1.
- Renamed `HelloAnnotatedHiveRunner` in `com.klarna.hiverunner.examples` to `HelloAnnotatedHiveRunnerTest`.
- Renamed `HelloHiveRunner` in `com.klarna.hiverunner.examples` to `HelloHiveRunnerTest`.
- Renamed `InsertTestData` in `com.klarna.hiverunner.examples` to `InsertTestDataTest`.
- Renamed `SetHiveConfValues` in `com.klarna.hiverunner.examples` to `SetHiveConfValuesTest`.
- Renamed `HelloAnnotatedHiveRunner` in `com.klarna.hiverunner.examples.junit4` to `HelloAnnotatedHiveRunnerTest`.
- Renamed `HelloHiveRunner` in `com.klarna.hiverunner.examples.junit4` to `HelloHiveRunnerTest`.
- Renamed `InsertTestData` in `com.klarna.hiverunner.examples.junit4` to `InsertTestDataTest`.
- Renamed `SetHiveConfValues` in `com.klarna.hiverunner.examples.junit4` to `SetHiveConfValuesTest`.
- Updated `surefire-version-plugin` from `2.21.0` to `2.22.0`.
- Updated `junit.jupiter.version` (JUnit5) from `5.6.0` to `5.7.1`.
- Updated `junit` (JUnit4) from `4.13.1` to `4.13.2`.
- Updated `mockito-core` from `2.18.3` to `3.8.0`.

### Added
- Added `getScriptPaths` method in `HiveRunnerCore`.
- Added `getScriptPaths` method in `HiveRunnerExtension` to be able to access the other method in `HiveRunnerCore` so that it can be used downstream in [MutantSwarm](https://github.com/HotelsDotCom/mutant-swarm).
- Added `fromScriptPaths` method in `HiveShellBuilder`.
- Added version `5.7.1` of `junit-vintage-engine`.
- Added version `3.8.0` of `mockito-junit-jupiter`.

### Fixed
- Fixed bug where the files specified in `@HiveSQL` weren't being run when using `HiveRunnerExtension`.
- Successful tests using "SET" no longer marked as "terminated" when run in IntelliJ. See [#94](https://github.com/klarna/HiveRunner/issues/94).

## [6.0.1] - 2020-09-07
### Removed
- Removed shaded jar that was being produced as a side-effect.

## [6.0.0] - 2020-09-03
### Changed
- Upgraded Hive version to 3.1.2 (was 2.3.7).

## [5.x]
### NOTE
- Releases from the 5.x (Hive 2) line are not tracked in this CHANGELOG; it only tracks 6.0.0 and above. For changes in 5.x please refer to https://github.com/klarna/HiveRunner/blob/hive-2.x/CHANGELOG.md.

## [5.0.0] - 2019-09-30
### Added
- JUnit5 [Extension](https://junit.org/junit5/docs/current/user-guide/#extensions) support with `HiveRunnerExtension`. See [#106](https://github.com/klarna/HiveRunner/issues/106).

### Changed
- Default supported Hive version is now 2.3.4 (was 2.3.3) as version 2.3.3 has a [vulnerability](https://nvd.nist.gov/vuln/detail/CVE-2018-1314).
- `TemporaryFolder` ([JUnit 4](https://junit.org/junit4/javadoc/4.12/org/junit/rules/TemporaryFolder.html)) has been changed to `Path` ([Java NIO](https://docs.oracle.com/javase/8/docs/api/java/nio/file/Path.html)) throughout the project for the JUnit5 update. 
- NOTE: The `HiveServerContext` class now uses `Path` instead of `TemporaryFolder` in the constructor.

## [4.1.0] - 2019-02-27
### Changed
- Internal refactoring to support upcoming "Mutant Swarm" project which provides unit test coverage for Hive SQL scripts. See [#65](https://github.com/klarna/HiveRunner/issues/65).

## [4.0.0] - 2018-07-17
### Added
- Support shell-specific `source` (`hive`) and `!run` (`beeline`) commands. These commands allow one to import and execute the contents of external files in statements or scripts.
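
  As a minimal sketch of the two commands (the script path below is hypothetical), a Hive CLI emulation script pulls in an external file with `source`, while a Beeline emulation script uses `!run`:

  ```sql
  -- Hive CLI emulation (hypothetical path):
  source /tmp/setup_tables.hql;

  -- Beeline emulation, equivalent command (hypothetical path):
  !run /tmp/setup_tables.hql
  ```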

### Changed
- Default supported Hive version is now 2.3.3 (was 1.2.1).
- Default supported Tez version is now 0.9.1 (was 0.7.0).
- Supported Java version is 8 (was 7).
- In-memory DB used by HiveRunner is now Derby (was HSQLDB).
- Log4J configuration file removed from jar artifact.
- System property to configure command shell emulation mode renamed to `commandShellEmulator` (was `commandShellEmulation`).

## [3.2.1] - 2018-05-31
### Changed
- Fixed issue where a column name whose case in a data file differed from its case in the table definition was treated as a different column [#73](https://github.com/klarna/HiveRunner/issues/73).
- The way of setting writable permissions on JUnit temporary folder changed to make it compatible with Windows [#63](https://github.com/klarna/HiveRunner/issues/63).

## [3.2.0] - 2017-02-09
### Added
- Added support for headers in the TSV parser. This allows TSV files that declare a subset of columns to be added dynamically using `insertInto`.

## [3.1.1] - 2017-01-27
### Added
- Added debug logging of result sets. Enable by setting `log4j.logger.com.klarna.hiverunner.HiveServerContainer=DEBUG` in log4j.properties.

## [3.1.0] - 2016-10-17
### Added
- Added methods to the shell that allow statements contained in files to be executed and their results gathered. These are particularly useful for HQL scripts that generate no table based data and instead write results to STDOUT. In practice we've seen these scripts used in data processing job orchestration scripts (e.g. `bash`) to check for new data, calculate processing boundaries, etc. These values are then used to appropriately configure and launch some downstream job.
- Support abstract base class [#48](https://github.com/klarna/HiveRunner/issues/48).

## [3.0.0] - 2016-02-05
### Changed
- Upgraded to Hive 1.2.1 (Note: new major release with backwards incompatibility issues). As of Hive 1.2 there are a number of new reserved keywords, see [DDL manual](https://cwiki.apache.org/confluence/display/Hive/LanguageManual+DDL#LanguageManualDDL-Keywords,Non-reservedKeywordsandReservedKeywords) for more information. If you happen to have one of these as an identifier, you can either backtick-quote it (e.g. \`date\`, \`timestamp\` or \`update\`) or set `hive.support.sql11.reserved.keywords=false`.
- Users of Hive version 0.14 or older are recommended to use HiveRunner version 2.6.0.
- Removed the custom HiveConf property `hive.vs`. Use `hadoop.tmp.dir` instead.
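
  For illustration (a minimal sketch; the table and column names are hypothetical), a newly reserved identifier such as `date` can be backtick-quoted, or the keyword check can be relaxed for the session:

  ```sql
  -- Backtick-quote identifiers that became reserved keywords in Hive 1.2:
  CREATE TABLE events (`date` STRING, `update` STRING);
  SELECT `date` FROM events;

  -- Or relax reserved-keyword handling for the session:
  SET hive.support.sql11.reserved.keywords=false;
  ```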

## [2.6.0] - 2015-12-01
### Added
- Introduced command shell emulations to replicate different handling of full line comments in `hive` and `beeline` shells. Now strips full line comments for executed scripts to match the behaviour of the `hive -f` file option. 
- Option to use files as input for com.klarna.hiverunner.HiveShell.execute(...).

## [2.5.1] - 2015-11-12
### Changed
- Fixed a deadlock in `ThrowOnTimeout.java` that occurred when running a long-running test case with timeouts disabled.

## [2.5.0]
### Added
- Added support, via `HiveShell.insertInto`, for fluently generating test data in a storage-format-agnostic manner.

## [2.4.0]
### Changed
- Enabled any hiveconf variables to be set as system properties by using the naming convention `hiveconf_[HiveConf property name]`, e.g. `hiveconf_hive.execution.engine`.
- Fixed bug: result sets bigger than 100 rows only returned the first 100 rows.
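
  The prefix convention above amounts to stripping `hiveconf_` from a system property name to obtain the `HiveConf` key. A minimal, self-contained sketch of that mapping (the class and method names here are illustrative, not part of HiveRunner's API):

  ```java
  // Illustrative sketch of the hiveconf_* system-property naming convention.
  // A property such as "hiveconf_hive.execution.engine" maps to the HiveConf
  // key "hive.execution.engine".
  public class HiveconfPropertyDemo {

      static final String PREFIX = "hiveconf_";

      // Strips the "hiveconf_" prefix to recover the HiveConf key.
      static String toHiveConfKey(String systemProperty) {
          if (!systemProperty.startsWith(PREFIX)) {
              throw new IllegalArgumentException("Not a hiveconf property: " + systemProperty);
          }
          return systemProperty.substring(PREFIX.length());
      }

      public static void main(String[] args) {
          System.out.println(toHiveConfKey("hiveconf_hive.execution.engine"));
      }
  }
  ```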

## [2.3.0]
### Changed
- Merged the Tez and MR contexts into the same context again. Now the same test suite may alternate between execution engines by doing e.g.:

     hive> set hive.execution.engine=tez;
     hive> [some query]
     hive> set hive.execution.engine=mr;
     hive> [some query]

## [2.2.0]
### Added
- Added support for setting hivevars via HiveShell.


================================================
FILE: CODE-OF-CONDUCT.md
================================================
# Contributor Covenant Code of Conduct

## Our Pledge

In the interest of fostering an open and welcoming environment, we as
contributors and maintainers pledge to making participation in our project and
our community a harassment-free experience for everyone, regardless of age, body
size, disability, ethnicity, gender identity and expression, level of experience,
nationality, personal appearance, race, religion, or sexual identity and
orientation.

## Our Standards

Examples of behaviour that contributes to creating a positive environment
include:

* Using welcoming and inclusive language
* Being respectful of differing viewpoints and experiences
* Gracefully accepting constructive criticism
* Focusing on what is best for the community
* Showing empathy towards other community members

Examples of unacceptable behaviour by participants include:

* The use of sexualised language or imagery and unwelcome sexual attention or
  advances
* Trolling, insulting/derogatory comments, and personal or political attacks
* Public or private harassment
* Publishing others' private information, such as a physical or electronic
  address, without explicit permission
* Other conduct which could reasonably be considered inappropriate in a
  professional setting

## Our Responsibilities

Project maintainers are responsible for clarifying the standards of acceptable
behaviour and are expected to take appropriate and fair corrective action in
response to any instances of unacceptable behaviour.

Project maintainers have the right and responsibility to remove, edit, or
reject comments, commits, code, wiki edits, issues, and other contributions
that are not aligned to this Code of Conduct, or to ban temporarily or
permanently any contributor for other behaviours that they deem inappropriate,
threatening, offensive, or harmful.

## Scope

This Code of Conduct applies both within project spaces and in public spaces
when an individual is representing the project or its community. Examples of
representing a project or community include using an official project e-mail
address, posting via an official social media account, or acting as an appointed
representative at an online or offline event. Representation of a project may be
further defined and clarified by project maintainers.

## Enforcement

Instances of abusive, harassing, or otherwise unacceptable behaviour may be
reported by contacting [a member of the project team](https://github.com/orgs/HiveRunner/teams/hiverunner-committers). All
complaints will be reviewed and investigated and will result in a response that
is deemed necessary and appropriate to the circumstances. The project team is
obligated to maintain confidentiality with regard to the reporter of an incident.
Further details of specific enforcement policies may be posted separately.

Project maintainers who do not follow or enforce the Code of Conduct in good
faith may face temporary or permanent repercussions as determined by other
members of the project's leadership.

## Attribution

This Code of Conduct is adapted from the [Contributor Covenant][homepage], version 1.4,
available at https://www.contributor-covenant.org/version/1/4/code-of-conduct.html

[homepage]: https://www.contributor-covenant.org



================================================
FILE: CONTRIBUTING.md
================================================
# How To Contribute

We'd love to accept your patches and contributions to this project. There are just a few guidelines you need to follow which are described in detail below.

## 1. Fork this repo

You should create a fork of this project in your account and work from there. You can create a fork by clicking the fork button in GitHub.

## 2. One feature, one branch

Work for each new feature/issue should occur in its own branch. To create a new branch from the command line:
```shell
git checkout -b my-new-feature
```
where "my-new-feature" describes what you're working on.

## 3. Add unit tests
If your contribution modifies existing or adds new code please add corresponding unit tests for this.

## 4. Ensure that the build passes

Run
```shell
mvn package
```
and check that there are no errors.

## 5. Add documentation for new or updated functionality

Please review all of the .md files in this project to see if they are impacted by your change and update them accordingly.

## 6. Add to CHANGELOG.md

Any notable changes should be recorded in the CHANGELOG.md following the [Keep a Changelog](https://keepachangelog.com/en/1.0.0/) conventions.

## 7. Submit a pull request and describe the change

Push your changes to your branch and open a pull request against the parent repo on GitHub. The project administrators will review your pull request and respond with feedback.

# How your contribution gets merged

Upon pull request submission, your code will be reviewed by the maintainers. They will confirm at least the following:

- Tests run successfully (unit, coverage, integration, style).
- Contribution policy has been followed.

Two (human) reviewers will need to sign off on your pull request before it can be merged.


================================================
FILE: LICENSE.txt
================================================

                                 Apache License
                           Version 2.0, January 2004
                        http://www.apache.org/licenses/

   TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION

   1. Definitions.

      "License" shall mean the terms and conditions for use, reproduction,
      and distribution as defined by Sections 1 through 9 of this document.

      "Licensor" shall mean the copyright owner or entity authorized by
      the copyright owner that is granting the License.

      "Legal Entity" shall mean the union of the acting entity and all
      other entities that control, are controlled by, or are under common
      control with that entity. For the purposes of this definition,
      "control" means (i) the power, direct or indirect, to cause the
      direction or management of such entity, whether by contract or
      otherwise, or (ii) ownership of fifty percent (50%) or more of the
      outstanding shares, or (iii) beneficial ownership of such entity.

      "You" (or "Your") shall mean an individual or Legal Entity
      exercising permissions granted by this License.

      "Source" form shall mean the preferred form for making modifications,
      including but not limited to software source code, documentation
      source, and configuration files.

      "Object" form shall mean any form resulting from mechanical
      transformation or translation of a Source form, including but
      not limited to compiled object code, generated documentation,
      and conversions to other media types.

      "Work" shall mean the work of authorship, whether in Source or
      Object form, made available under the License, as indicated by a
      copyright notice that is included in or attached to the work
      (an example is provided in the Appendix below).

      "Derivative Works" shall mean any work, whether in Source or Object
      form, that is based on (or derived from) the Work and for which the
      editorial revisions, annotations, elaborations, or other modifications
      represent, as a whole, an original work of authorship. For the purposes
      of this License, Derivative Works shall not include works that remain
      separable from, or merely link (or bind by name) to the interfaces of,
      the Work and Derivative Works thereof.

      "Contribution" shall mean any work of authorship, including
      the original version of the Work and any modifications or additions
      to that Work or Derivative Works thereof, that is intentionally
      submitted to Licensor for inclusion in the Work by the copyright owner
      or by an individual or Legal Entity authorized to submit on behalf of
      the copyright owner. For the purposes of this definition, "submitted"
      means any form of electronic, verbal, or written communication sent
      to the Licensor or its representatives, including but not limited to
      communication on electronic mailing lists, source code control systems,
      and issue tracking systems that are managed by, or on behalf of, the
      Licensor for the purpose of discussing and improving the Work, but
      excluding communication that is conspicuously marked or otherwise
      designated in writing by the copyright owner as "Not a Contribution."

      "Contributor" shall mean Licensor and any individual or Legal Entity
      on behalf of whom a Contribution has been received by Licensor and
      subsequently incorporated within the Work.

   2. Grant of Copyright License. Subject to the terms and conditions of
      this License, each Contributor hereby grants to You a perpetual,
      worldwide, non-exclusive, no-charge, royalty-free, irrevocable
      copyright license to reproduce, prepare Derivative Works of,
      publicly display, publicly perform, sublicense, and distribute the
      Work and such Derivative Works in Source or Object form.

   3. Grant of Patent License. Subject to the terms and conditions of
      this License, each Contributor hereby grants to You a perpetual,
      worldwide, non-exclusive, no-charge, royalty-free, irrevocable
      (except as stated in this section) patent license to make, have made,
      use, offer to sell, sell, import, and otherwise transfer the Work,
      where such license applies only to those patent claims licensable
      by such Contributor that are necessarily infringed by their
      Contribution(s) alone or by combination of their Contribution(s)
      with the Work to which such Contribution(s) was submitted. If You
      institute patent litigation against any entity (including a
      cross-claim or counterclaim in a lawsuit) alleging that the Work
      or a Contribution incorporated within the Work constitutes direct
      or contributory patent infringement, then any patent licenses
      granted to You under this License for that Work shall terminate
      as of the date such litigation is filed.

   4. Redistribution. You may reproduce and distribute copies of the
      Work or Derivative Works thereof in any medium, with or without
      modifications, and in Source or Object form, provided that You
      meet the following conditions:

      (a) You must give any other recipients of the Work or
          Derivative Works a copy of this License; and

      (b) You must cause any modified files to carry prominent notices
          stating that You changed the files; and

      (c) You must retain, in the Source form of any Derivative Works
          that You distribute, all copyright, patent, trademark, and
          attribution notices from the Source form of the Work,
          excluding those notices that do not pertain to any part of
          the Derivative Works; and

      (d) If the Work includes a "NOTICE" text file as part of its
          distribution, then any Derivative Works that You distribute must
          include a readable copy of the attribution notices contained
          within such NOTICE file, excluding those notices that do not
          pertain to any part of the Derivative Works, in at least one
          of the following places: within a NOTICE text file distributed
          as part of the Derivative Works; within the Source form or
          documentation, if provided along with the Derivative Works; or,
          within a display generated by the Derivative Works, if and
          wherever such third-party notices normally appear. The contents
          of the NOTICE file are for informational purposes only and
          do not modify the License. You may add Your own attribution
          notices within Derivative Works that You distribute, alongside
          or as an addendum to the NOTICE text from the Work, provided
          that such additional attribution notices cannot be construed
          as modifying the License.

      You may add Your own copyright statement to Your modifications and
      may provide additional or different license terms and conditions
      for use, reproduction, or distribution of Your modifications, or
      for any such Derivative Works as a whole, provided Your use,
      reproduction, and distribution of the Work otherwise complies with
      the conditions stated in this License.

   5. Submission of Contributions. Unless You explicitly state otherwise,
      any Contribution intentionally submitted for inclusion in the Work
      by You to the Licensor shall be under the terms and conditions of
      this License, without any additional terms or conditions.
      Notwithstanding the above, nothing herein shall supersede or modify
      the terms of any separate license agreement you may have executed
      with Licensor regarding such Contributions.

   6. Trademarks. This License does not grant permission to use the trade
      names, trademarks, service marks, or product names of the Licensor,
      except as required for reasonable and customary use in describing the
      origin of the Work and reproducing the content of the NOTICE file.

   7. Disclaimer of Warranty. Unless required by applicable law or
      agreed to in writing, Licensor provides the Work (and each
      Contributor provides its Contributions) on an "AS IS" BASIS,
      WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
      implied, including, without limitation, any warranties or conditions
      of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
      PARTICULAR PURPOSE. You are solely responsible for determining the
      appropriateness of using or redistributing the Work and assume any
      risks associated with Your exercise of permissions under this License.

   8. Limitation of Liability. In no event and under no legal theory,
      whether in tort (including negligence), contract, or otherwise,
      unless required by applicable law (such as deliberate and grossly
      negligent acts) or agreed to in writing, shall any Contributor be
      liable to You for damages, including any direct, indirect, special,
      incidental, or consequential damages of any character arising as a
      result of this License or out of the use or inability to use the
      Work (including but not limited to damages for loss of goodwill,
      work stoppage, computer failure or malfunction, or any and all
      other commercial damages or losses), even if such Contributor
      has been advised of the possibility of such damages.

   9. Accepting Warranty or Additional Liability. While redistributing
      the Work or Derivative Works thereof, You may choose to offer,
      and charge a fee for, acceptance of support, warranty, indemnity,
      or other liability obligations and/or rights consistent with this
      License. However, in accepting such obligations, You may act only
      on Your own behalf and on Your sole responsibility, not on behalf
      of any other Contributor, and only if You agree to indemnify,
      defend, and hold each Contributor harmless for any liability
      incurred by, or claims asserted against, such Contributor by reason
      of your accepting any such warranty or additional liability.

   END OF TERMS AND CONDITIONS

   APPENDIX: How to apply the Apache License to your work.

      To apply the Apache License to your work, attach the following
      boilerplate notice, with the fields enclosed by brackets "[]"
      replaced with your own identifying information. (Don't include
      the brackets!)  The text should be enclosed in the appropriate
      comment syntax for the file format. We also recommend that a
      file or class name and description of purpose be included on the
      same "printed page" as the copyright notice for easier
      identification within third-party archives.

   Copyright [yyyy] [name of copyright owner]

   Licensed under the Apache License, Version 2.0 (the "License");
   you may not use this file except in compliance with the License.
   You may obtain a copy of the License at

       http://www.apache.org/licenses/LICENSE-2.0

   Unless required by applicable law or agreed to in writing, software
   distributed under the License is distributed on an "AS IS" BASIS,
   WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
   See the License for the specific language governing permissions and
   limitations under the License.


================================================
FILE: README.md
================================================

[![Maven Central](https://maven-badges.herokuapp.com/maven-central/io.github.hiverunner/hiverunner/badge.svg?subject=io.github.hiverunner:hiverunner)](https://maven-badges.herokuapp.com/maven-central/io.github.hiverunner/hiverunner) 
[![Build](https://github.com/HiveRunner/hiverunner/workflows/build/badge.svg)](https://github.com/HiveRunner/HiveRunner/actions?query=workflow:"build")
[![License](https://img.shields.io/badge/License-Apache%202.0-blue.svg)](https://opensource.org/licenses/Apache-2.0)

![ScreenShot](/images/HiveRunnerSplash.png)

# HiveRunner

Welcome to HiveRunner - Zero installation open source unit testing of [Hive](https://hive.apache.org/) applications.

[Watch the HiveRunner teaser on youtube!](http://youtu.be/B7yEAHwgi2w)

HiveRunner is a unit test framework based on JUnit (4 & 5) that enables test-driven development of Hive SQL without the need for any installed dependencies. All you need to do is add HiveRunner to your 
`pom.xml` like any other library and you're good to go.

HiveRunner is under constant development. It is used extensively by many companies. Please feel free to suggest 
improvements both as pull requests and as written requests.


## Overview

HiveRunner enables you to write Hive SQL as releasable tested artifacts. It will require you to parametrize and 
modularize Hive SQL in order to make it testable. The bits and pieces of code should then be wired together with some 
orchestration/workflow/build tool of your choice, to be runnable in your environment (e.g. Oozie, Pentaho, Talend, 
Maven, etc.) 

So, even though your current Hive SQL probably won't run off the shelf within HiveRunner, we believe the enforced 
testability and enabling of a TDD workflow will do as much good to the scripting world of SQL as it has for the Java 
community.

## Versions

Different versions of HiveRunner target different versions of Hive as follows:

| HiveRunner Version | Hive Version | Status                     | Source Code Branch                                     |
|--------------------|--------------|----------------------------|--------------------------------------------------------|
| 7.x                | 4.x          | New, active development    | https://github.com/HiveRunner/HiveRunner/tree/hive-4.x |
| 6.x                | 3.x          | Stable, active development | https://github.com/HiveRunner/HiveRunner (i.e. `main`) |
| 5.x                | 2.x          | Stable, bug fixes only     | https://github.com/HiveRunner/HiveRunner/tree/hive-2.x |


# Cook Book

## 1. Include HiveRunner

HiveRunner is published to [Maven Central](https://search.maven.org/search?q=hiverunner). To start using it, add a HiveRunner dependency to your pom file:

    <dependency>
        <groupId>io.github.hiverunner</groupId>
        <artifactId>hiverunner</artifactId>
        <version>[HIVERUNNER VERSION]</version>
        <scope>test</scope>
    </dependency>

Alternatively, if you want to build from source, clone this repo and build with:

     mvn install

Then add the dependency as mentioned above.

Also explicitly add the Surefire plugin and configure `forkMode=always` to avoid OutOfMemoryError when building big test suites.

    <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-surefire-plugin</artifactId>
        <version>2.21.0</version>
        <configuration>
            <forkMode>always</forkMode>
        </configuration>
    </plugin>

Alternatively, if this does not solve the OOM issues, try increasing the `-Xmx` and `-XX:MaxPermSize` settings. For example:

    <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-surefire-plugin</artifactId>
        <version>2.21.0</version>
        <configuration>
            <forkCount>1</forkCount>
            <reuseForks>false</reuseForks>
            <argLine>-Xmx2048m -XX:MaxPermSize=512m</argLine>
        </configuration>
    </plugin>

(please note that the forkMode option is deprecated and you should use forkCount and reuseForks instead)

With `forkCount` and `reuseForks` it is possible to reduce the test execution time drastically, depending on your hardware. A plugin configuration that uses one fork per CPU core and reuses forks would look like:

    <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-surefire-plugin</artifactId>
        <version>2.21.0</version>
        <configuration>
            <forkCount>1C</forkCount>
            <reuseForks>true</reuseForks>
            <argLine>-Xmx2048m -XX:MaxPermSize=512m</argLine>
        </configuration>
    </plugin>

By default, HiveRunner uses mapreduce (mr) as the execution engine for Hive. If you wish to run using Tez, set the 
System property `hiveconf_hive.execution.engine` to 'tez'.

(Any Hive conf property may be overridden by prefixing it with 'hiveconf_')
        
        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-surefire-plugin</artifactId>
            <version>2.21.0</version>
            <configuration>
                <systemProperties>
                    <hiveconf_hive.execution.engine>tez</hiveconf_hive.execution.engine>
                    <hiveconf_hive.exec.counters.pull.interval>1000</hiveconf_hive.exec.counters.pull.interval>
                </systemProperties>
            </configuration>
        </plugin>

### Timeout
It's possible to configure HiveRunner to make tests time out after some time and to retry timed-out tests a couple of times. This is only available when using `StandaloneHiveRunner`; the feature is not available in the `HiveRunnerExtension` (HiveRunner 5.x and up). It exists to work around the bug
https://issues.apache.org/jira/browse/TEZ-2475, which at times causes test cases to never terminate due to a lost DAG reference.
The timeout feature can be configured via the `enableTimeout`, `timeoutSeconds` and `timeoutRetries` properties.
A configuration which enables timeouts after 30 seconds and allows 2 retries would look like:

    <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-surefire-plugin</artifactId>
        <version>2.21.0</version>
        <configuration>
            <systemProperties>
                <enableTimeout>true</enableTimeout>
                <timeoutSeconds>30</timeoutSeconds>
                <timeoutRetries>2</timeoutRetries>
            </systemProperties>
        </configuration>
    </plugin>


### Logging

HiveRunner uses [SLF4J](https://www.slf4j.org/) so you should configure logging in your tests using any compatible logging framework.
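
For example, assuming you just want log output on the console during your tests, one option (a sketch, not a HiveRunner requirement; the binding and version shown are illustrative) is to add an SLF4J binding such as `slf4j-simple` in test scope:

    <dependency>
        <groupId>org.slf4j</groupId>
        <artifactId>slf4j-simple</artifactId>
        <version>1.7.36</version>
        <scope>test</scope>
    </dependency>

Any other SLF4J-compatible backend (e.g. Logback or Log4j) works equally well; just make sure only one binding is on the test classpath.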

## 2. Look at the examples

Look at the [com.klarna.hiverunner.examples.HelloHiveRunnerTest](/src/test/java/com/klarna/hiverunner/examples/HelloHiveRunnerTest.java) reference test case to get a feel for what a typical JUnit 5 test case looks like. For JUnit 4 versions of the examples, look at [com.klarna.hiverunner.examples.junit4.HelloHiveRunnerTest](/src/test/java/com/klarna/hiverunner/examples/junit4/HelloHiveRunnerTest.java).

If you're put off by the verbosity of the annotations, you can always use HiveShell in a more interactive mode. The [com.klarna.hiverunner.SerdeTest](/src/test/java/com/klarna/hiverunner/SerdeTest.java) adds a resource (test data) interactively with HiveShell instead of using annotations.

Annotations and interactive mode can be mixed and matched, however you'll always need to include the [com.klarna.hiverunner.annotations.HiveSQL](/src/main/java/com/klarna/hiverunner/annotations/HiveSQL.java) annotation e.g:

         @HiveSQL(files = {"serdeTest/create_table.sql", "serdeTest/hql_custom_serde.sql"}, autoStart = false)
         public HiveShell hiveShell;

Note that *autoStart = false* is needed for the interactive mode. It can be left out when running with annotations only.
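
Interactive mode can be sketched as follows (a minimal sketch assuming JUnit 5 with `HiveRunnerExtension`; the database, table and values are hypothetical):

```java
import java.util.Arrays;
import java.util.List;

import org.junit.jupiter.api.Assertions;
import org.junit.jupiter.api.Test;
import org.junit.jupiter.api.extension.ExtendWith;

import com.klarna.hiverunner.HiveRunnerExtension;
import com.klarna.hiverunner.HiveShell;
import com.klarna.hiverunner.annotations.HiveSQL;

@ExtendWith(HiveRunnerExtension.class)
public class InteractiveModeTest {

    // autoStart = false: the shell is injected but must be started manually.
    @HiveSQL(files = {}, autoStart = false)
    private HiveShell hiveShell;

    @Test
    public void readsInteractivelyCreatedData() {
        // Configure properties or add resources here, before starting the shell.
        hiveShell.start();

        hiveShell.execute("create database test_db");
        hiveShell.execute("create table test_db.t (a string)");
        hiveShell.execute("insert into test_db.t values ('v1')");

        List<String> result = hiveShell.executeQuery("select a from test_db.t");
        Assertions.assertEquals(Arrays.asList("v1"), result);
    }
}
```

Everything between the field injection and `hiveShell.start()` is your chance to tweak the shell programmatically, which is exactly what the annotations do for you in the non-interactive mode.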

### Sequence files
If you work with __sequence files__ (or anything other than regular text files) make sure to take a look at [ResourceOutputStreamTest](/src/test/java/com/klarna/hiverunner/ResourceOutputStreamTest.java) 
for an example of how to use the method [HiveShell](src/main/java/com/klarna/hiverunner/HiveShell.java)\#getResourceOutputStream to manage test input data. 

### Programmatically create test input data

Test data can be programmatically inserted into any Hive table using `HiveShell.insertInto(...)`. This seamlessly handles different storage formats and partitioning types allowing you to focus on the data required by your test scenarios:

    hiveShell.execute("create database test_db");
    hiveShell.execute("create table test_db.test_table ("
        + "c1 string,"
        + "c2 string,"
        + "c3 string"
        + ")"
        + "partitioned by (p1 string)"
        + "stored as orc");

    hiveShell.insertInto("test_db", "test_table")
        .withColumns("c1", "p1").addRow("v1", "p1")       // add { "v1", null, null, "p1" }
        .withAllColumns().addRow("v1", "v2", "v3", "p1")  // add { "v1", "v2", "v3", "p1" }
        .copyRow().set("c1", "v4")                        // add { "v4", "v2", "v3", "p1" }
        .addRowsFromTsv(file)                             // parses TSV data out of a file resource
        .addRowsFrom(file, fileParser)                    // parses custom data out of a file resource
        .commit();

See [com.klarna.hiverunner.examples.InsertTestDataTest](/src/test/java/com/klarna/hiverunner/examples/InsertTestDataTest.java) for working examples.

## 3. Understand the order of execution

HiveRunner will by default set up and start the HiveShell before the test method is invoked. If autoStart is set to false, the [HiveShell](/src/main/java/com/klarna/hiverunner/HiveShell.java) must be started manually from within the test method. Either way, HiveRunner performs the following steps when start is invoked:

1. Merge any [@HiveProperties](/src/main/java/com/klarna/hiverunner/annotations/HiveProperties.java) from the test case with the Hive conf
2. Start the HiveServer with the merged conf
3. Copy all [@HiveResource](/src/main/java/com/klarna/hiverunner/annotations/HiveResource.java) data into the temp file area for the test
4. Execute all fields annotated with [@HiveSetupScript](/src/main/java/com/klarna/hiverunner/annotations/HiveSetupScript.java)
5. Execute the script files given in the [@HiveSQL](/src/main/java/com/klarna/hiverunner/annotations/HiveSQL.java) annotation

The [HiveShell](/src/main/java/com/klarna/hiverunner/HiveShell.java) field annotated with [@HiveSQL](/src/main/java/com/klarna/hiverunner/annotations/HiveSQL.java) will always be injected before the test method is invoked.
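
Put together, a test class exercising these steps might look like the following sketch (the setup statement, script file name and database are hypothetical):

```java
import java.util.List;

import org.junit.jupiter.api.Assertions;
import org.junit.jupiter.api.Test;
import org.junit.jupiter.api.extension.ExtendWith;

import com.klarna.hiverunner.HiveRunnerExtension;
import com.klarna.hiverunner.HiveShell;
import com.klarna.hiverunner.annotations.HiveSQL;
import com.klarna.hiverunner.annotations.HiveSetupScript;

@ExtendWith(HiveRunnerExtension.class)
public class OrderOfExecutionTest {

    // Step 4: executed after the HiveServer has started.
    @HiveSetupScript
    private String createDatabase = "create database if not exists test_db";

    // Step 5: script files run last; the shell is injected before the test method.
    @HiveSQL(files = {"orderOfExecution/create_table.sql"})
    private HiveShell hiveShell;

    @Test
    public void databaseWasCreated() {
        List<String> databases = hiveShell.executeQuery("show databases");
        Assertions.assertTrue(databases.contains("test_db"));
    }
}
```

By the time `databaseWasCreated` runs, the setup script and the `@HiveSQL` script files have already been executed against the embedded HiveServer.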


# Hive version compatibility

- This version of HiveRunner is built for Hive 3.1.2.
- For Hive 2.x support please use HiveRunner 5.x.
- Command shell emulations are provided to closely match the behaviour of both the Hive CLI and Beeline interactive shells. The desired emulation can be specified in your `pom.xml` file like so: 

        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-surefire-plugin</artifactId>
            <version>2.21.0</version>
            <configuration>
                <systemProperties>
                    <!-- Defaults to HIVE_CLI, other options include BEELINE and HIVE_CLI_PRE_V200 -->
                    <commandShellEmulator>BEELINE</commandShellEmulator>
                </systemProperties>
            </configuration>
        </plugin>

  Or provided on the command line using a system property:

      mvn -DcommandShellEmulator=BEELINE test

# Future work and Limitations

* HiveRunner does not allow the `add jar` statement. It is considered bad practice to keep environment specific code together with the business logic that targets HiveRunner. Keep environment specific stuff in separate files and use your build/orchestration/workflow tool to run the right files in the right order in the right environment. When running HiveRunner, all SerDes available on the classpath of the IDE/maven will be available.

* HiveRunner runs Hive and Hive runs on top of Hadoop, and Hadoop has limited support for Windows machines. Installing [Cygwin](http://www.cygwin.com/ "Cygwin") might help out.

* Currently the HiveServer spins up and tears down for every test method. As a performance option it should be possible to clean the HiveServer and metastore between each test method invocation. The choice should probably be exposed to the test writer. By switching between different strategies, side effects/leakage can be ruled out during test case debugging. See [#69](https://github.com/HiveRunner/HiveRunner/issues/69).

# Known Issues

### UnknownHostException
I've had issues with `UnknownHostException` on OS X after upgrading my system or running Docker. 
Usually a restart of my machine solved it, but after some corporate software was installed 
the restarts stopped working and I kept getting UnknownHostExceptions. 
Following this simple guide solved my problem:
http://crunchify.com/getting-java-net-unknownhostexception-nodename-nor-servname-provided-or-not-known-error-on-mac-os-x-update-your-privateetchosts-file/

### Tez queries do not terminate
Tez will at times forget the process id of a random DAG, which causes the query to never terminate. To get around this, HiveRunner implements timeout and retry functionality:
 
    <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-surefire-plugin</artifactId>
        <version>2.21.0</version>
        <configuration>
            <systemProperties>
                <enableTimeout>true</enableTimeout>
                <timeoutSeconds>30</timeoutSeconds>
                <timeoutRetries>2</timeoutRetries>
            </systemProperties>
        </configuration>
    </plugin>
         
Make sure to set the timeoutSeconds to that of your slowest test in the test suite and then add some padding.

# Contact

## Mailing List
If you would like to ask any questions about or discuss HiveRunner please join our mailing list at

  [https://groups.google.com/forum/#!forum/hive-runner-user](https://groups.google.com/forum/#!forum/hive-runner-user)

# History
This project was initially developed and maintained by [Klarna](https://klarna.github.io/) and then by [Expedia Group](https://expediagroup.github.io/) before moving to its own top-level organisation on GitHub.

# Legal
This project is available under the [Apache 2.0 License](http://www.apache.org/licenses/LICENSE-2.0.html).

Copyright 2021-2024 The HiveRunner Contributors.

Copyright 2013-2021 Klarna AB.


================================================
FILE: RELEASING.md
================================================
# Releasing HiveRunner to Maven Central

HiveRunner has been set up to build continuously and also to deploy SNAPSHOTS to Sonatype and releases to Maven Central via GitHub Actions.

## Deploying a SNAPSHOT to Sonatype

* Select the https://github.com/HiveRunner/HiveRunner/actions/workflows/deploy.yml workflow
* Click "Run workflow"
* Select the branch to use to deploy a SNAPSHOT from:
  * Use `main` as the branch for a Hive 3.x release
  * Use `hive-4.x` as the branch for a Hive 4.x release
  * Use `hive-2.x` as the branch for a Hive 2.x release
* Run the workflow
* SNAPSHOT artifacts will be available at https://s01.oss.sonatype.org/content/repositories/snapshots/io/github/hiverunner/hiverunner/

## Deploying a release to Maven Central

* Ensure the `pom.xml` has the SNAPSHOT version set to the value you would like to make a release from
* Update `CHANGELOG.md` with the date corresponding to when you're performing the release
* Select the https://github.com/HiveRunner/HiveRunner/actions/workflows/release.yml workflow
* Click "Run workflow"
* Select the branch to use to deploy a release from:
  * Use `main` as the branch for a Hive 3.x release
  * Use `hive-4.x` as the branch for a Hive 4.x release 
  * Use `hive-2.x` as the branch for a Hive 2.x release
* Run the workflow
* Release artifacts will be available at https://repo1.maven.org/maven2/io/github/hiverunner/hiverunner/
* It can take a few hours before the artifacts show up in searches performed at https://search.maven.org/search?q=hiverunner


================================================
FILE: pom.xml
================================================
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
  <modelVersion>4.0.0</modelVersion>

  <groupId>io.github.hiverunner</groupId>
  <artifactId>hiverunner</artifactId>
  <version>6.1.1-SNAPSHOT</version>
  <name>HiveRunner</name>
  <description>HiveRunner is a unit test framework based on JUnit (4 or 5) that enables TDD development of Hive SQL without the need for any installed dependencies.</description>
  <url>https://github.com/HiveRunner/HiveRunner</url>
  <!-- below isn't the actual inception year but is the year the copyright changed -->
  <inceptionYear>2021</inceptionYear>

  <licenses>
    <license>
      <name>The Apache Software License, Version 2.0</name>
      <url>http://www.apache.org/licenses/LICENSE-2.0.txt</url>
      <distribution>repo</distribution>
    </license>
  </licenses>
  
  <developers>
    <developer>
      <name>HiveRunner Committers</name>
      <organization>HiveRunner</organization>
      <organizationUrl>https://github.com/HiveRunner</organizationUrl>
    </developer>
  </developers>  

  <scm>
    <connection>scm:git:https://github.com/HiveRunner/HiveRunner.git</connection>
    <developerConnection>scm:git:https://github.com/HiveRunner/HiveRunner.git</developerConnection>
    <url>git@github.com:HiveRunner/HiveRunner.git</url>
    <tag>HEAD</tag>
  </scm>

  <properties>
    <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>

    <hadoop.version>3.1.0</hadoop.version>
    <hive.execution.engine>mr</hive.execution.engine>
    <hive.version>3.1.2</hive.version>
    <junit.jupiter.version>5.7.1</junit.jupiter.version>
    <license.maven.plugin.version>3.0</license.maven.plugin.version>
    <tez.version>0.9.1</tez.version>
    <mockito.version>3.8.0</mockito.version>
  </properties>
  
  <dependencies>
    <dependency>
      <groupId>org.apache.hive</groupId>
      <artifactId>hive-serde</artifactId>
      <version>${hive.version}</version>
    </dependency>

    <dependency>
      <groupId>com.fasterxml.jackson.core</groupId>
      <artifactId>jackson-annotations</artifactId>
      <version>2.9.5</version>
    </dependency>

    <dependency>
      <groupId>org.apache.hive</groupId>
      <artifactId>hive-jdbc</artifactId>
      <version>${hive.version}</version>
    </dependency>

    <dependency>
      <groupId>org.apache.hive.hcatalog</groupId>
      <artifactId>hive-webhcat-java-client</artifactId>
      <version>${hive.version}</version>
    </dependency>

    <dependency>
      <groupId>org.apache.hive</groupId>
      <artifactId>hive-service</artifactId>
      <version>${hive.version}</version>
    </dependency>

    <dependency>
      <artifactId>tez-dag</artifactId>
      <groupId>org.apache.tez</groupId>
      <version>${tez.version}</version>
    </dependency>

    <dependency>
      <artifactId>tez-common</artifactId>
      <groupId>org.apache.tez</groupId>
      <version>${tez.version}</version>
    </dependency>

    <dependency>
      <artifactId>tez-mapreduce</artifactId>
      <groupId>org.apache.tez</groupId>
      <version>${tez.version}</version>
    </dependency>

    <dependency>
      <groupId>org.apache.hive</groupId>
      <artifactId>hive-contrib</artifactId>
      <version>${hive.version}</version>
      <scope>test</scope>
    </dependency>

    <dependency>
      <groupId>org.reflections</groupId>
      <artifactId>reflections</artifactId>
      <version>0.9.8</version>
    </dependency>

    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-mapreduce-client-common</artifactId>
      <version>${hadoop.version}</version>
    </dependency>

    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-mapreduce-client-core</artifactId>
      <version>${hadoop.version}</version>
    </dependency>

    <dependency>
      <groupId>com.github.stefanbirkner</groupId>
      <artifactId>system-rules</artifactId>
      <version>1.19.0</version>
      <scope>test</scope>
    </dependency>

    <!-- Always put this before JUnit or the class loader might load the
      wrong Matcher -->
    <dependency>
      <groupId>org.hamcrest</groupId>
      <artifactId>hamcrest-all</artifactId>
      <version>1.3</version>
      <scope>test</scope>
    </dependency>

    <dependency>
      <groupId>org.mockito</groupId>
      <artifactId>mockito-core</artifactId>
      <version>${mockito.version}</version>
      <scope>test</scope>
    </dependency>

    <dependency>
      <groupId>org.mockito</groupId>
      <artifactId>mockito-junit-jupiter</artifactId>
      <version>${mockito.version}</version>
      <scope>test</scope>
    </dependency>

    <dependency>
      <groupId>junit</groupId>
      <artifactId>junit</artifactId>
      <version>4.13.2</version>
      <scope>provided</scope>
    </dependency>

    <dependency>
      <groupId>org.junit.jupiter</groupId>
      <artifactId>junit-jupiter</artifactId>
      <version>${junit.jupiter.version}</version>
    </dependency>

    <dependency>
      <groupId>org.junit.vintage</groupId>
      <artifactId>junit-vintage-engine</artifactId>
      <version>${junit.jupiter.version}</version>
    </dependency>

  </dependencies>

  <build>
    <plugins>
      <!-- reuseForks:false (the successor to forkMode:always) resolves an OOM error when running unit tests -->
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-surefire-plugin</artifactId>
        <version>2.22.2</version>
        <configuration>
          <!-- HiveRunner needs this for some queries (the -XX:MaxPermSize
            option was removed in Java 8) -->
          <argLine>-Xmx2024m</argLine>
          <!-- below needed due to https://github.com/HiveRunner/HiveRunner/commit/1f9a9b353c3b072f7898a6b4fa277474674d4b54 -->
          <reuseForks>false</reuseForks>
          <systemProperties>
            <!--
              Any hive conf property may be overridden here by prefixing
              it with 'hiveconf_'
            -->
            <hiveconf_hive.execution.engine>${hive.execution.engine}</hiveconf_hive.execution.engine>
            <hiveconf_hive.exec.counters.pull.interval>1000</hiveconf_hive.exec.counters.pull.interval>
            <enableTimeout>false</enableTimeout>
            <timeoutSeconds>30</timeoutSeconds>
            <timeoutRetries>2</timeoutRetries>
          </systemProperties>
        </configuration>
      </plugin>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-compiler-plugin</artifactId>
        <version>3.7.0</version>
        <configuration>
          <source>1.8</source>
          <target>1.8</target>
        </configuration>
      </plugin>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-jar-plugin</artifactId>
        <version>3.2.0</version>
        <configuration>
          <archive>
            <manifestEntries>
              <Build-Version>${project.version}</Build-Version>
              <Build-DateTime>${maven.build.timestamp}</Build-DateTime>
              <Maven-GroupId>${project.groupId}</Maven-GroupId>
              <Maven-ArtifactId>${project.artifactId}</Maven-ArtifactId>
            </manifestEntries>
          </archive>
        </configuration>
      </plugin>

      <plugin>
        <groupId>com.mycila</groupId>
        <artifactId>license-maven-plugin</artifactId>
        <version>${license.maven.plugin.version}</version>
        <dependencies>
          <dependency>
            <groupId>com.mycila</groupId>
            <artifactId>license-maven-plugin-git</artifactId>
            <version>${license.maven.plugin.version}</version>
          </dependency>
        </dependencies>
        <configuration>
          <header>src/main/license/APACHE-2.txt</header>
          <properties>
            <owner>The HiveRunner Contributors</owner>
          </properties>
          <includes>
            <include>src/main/java/**</include>
            <include>src/test/java/**</include>
          </includes>
        </configuration>
        <executions>
          <execution>
            <phase>validate</phase>
            <goals>
              <goal>format</goal>
            </goals>
          </execution>
        </executions>
      </plugin>
      
     <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-release-plugin</artifactId>
        <version>3.0.0-M1</version>
        <configuration>
          <tagNameFormat>v@{project.version}</tagNameFormat>
          <autoVersionSubmodules>true</autoVersionSubmodules>
          <useReleaseProfile>false</useReleaseProfile>
          <releaseProfiles>oss-release</releaseProfiles>
          <goals>deploy</goals>
        </configuration>
      </plugin>      

      <plugin>
        <groupId>org.sonatype.plugins</groupId>
        <artifactId>nexus-staging-maven-plugin</artifactId>
        <version>1.6.8</version>
        <extensions>true</extensions>
        <configuration>
          <serverId>ossrh</serverId>
          <nexusUrl>https://s01.oss.sonatype.org/</nexusUrl>
          <autoReleaseAfterClose>true</autoReleaseAfterClose>
        </configuration>
      </plugin>

    </plugins>
  </build>

  <profiles>
    <profile>
      <id>oss-release</id>
      <distributionManagement>
        <snapshotRepository>
          <id>ossrh</id>
          <url>https://s01.oss.sonatype.org/content/repositories/snapshots</url>
        </snapshotRepository>
        <repository>
          <id>ossrh</id>
          <url>https://s01.oss.sonatype.org/service/local/staging/deploy/maven2/</url>
        </repository>
      </distributionManagement>
      <build>
        <plugins>
          <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-source-plugin</artifactId>
            <version>3.2.0</version>
            <executions>
              <execution>
                <id>attach-sources</id>
                <goals>
                  <goal>jar-no-fork</goal>
                </goals>
              </execution>
            </executions>
          </plugin>
          <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-javadoc-plugin</artifactId>
            <version>3.2.0</version>
            <configuration>
               <encoding>${project.build.sourceEncoding}</encoding>
               <source>8</source>
               <detectJavaApiLink>false</detectJavaApiLink>
               <doclint>none</doclint>
            </configuration>            
            <executions>
              <execution>
                <id>attach-javadocs</id>
                <goals>
                  <goal>jar</goal>
                </goals>
              </execution>
            </executions>
          </plugin>
          <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-gpg-plugin</artifactId>
            <version>1.6</version>
            <executions>
              <execution>
                <id>sign-artifacts</id>
                <phase>verify</phase>
                <goals>
                  <goal>sign</goal>
                </goals>
                <configuration>
                    <gpgArguments>
                        <arg>--pinentry-mode</arg>
                        <arg>loopback</arg>
                    </gpgArguments>                
                </configuration>
              </execution>
            </executions>
          </plugin>
        </plugins>
      </build>
    </profile>
    <profile>
      <id>tez</id>
      <properties>
        <hive.execution.engine>tez</hive.execution.engine>
      </properties>
    </profile>
  </profiles>
</project>


================================================
FILE: src/main/java/com/klarna/hiverunner/HiveRunnerCore.java
================================================
/**
 * Copyright (C) 2013-2021 Klarna AB
 * Copyright (C) 2021 The HiveRunner Contributors
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 * http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */
package com.klarna.hiverunner;

import static org.reflections.ReflectionUtils.withAnnotation;

import java.io.File;
import java.io.IOException;
import java.lang.reflect.Field;
import java.net.URISyntaxException;
import java.nio.charset.Charset;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.ArrayList;
import java.util.List;
import java.util.Map;
import java.util.Set;

import com.google.common.base.Preconditions;
import com.google.common.io.Resources;
import com.klarna.hiverunner.annotations.HiveProperties;
import com.klarna.hiverunner.annotations.HiveResource;
import com.klarna.hiverunner.annotations.HiveSQL;
import com.klarna.hiverunner.annotations.HiveSetupScript;
import com.klarna.hiverunner.builder.HiveShellBuilder;
import com.klarna.hiverunner.builder.Script;
import com.klarna.hiverunner.config.HiveRunnerConfig;
import com.klarna.reflection.ReflectionUtils;

class HiveRunnerCore {

    /**
     * Traverses the test case annotations. Injects into the test case a HiveShell that envelops the HiveServer.
     */
    HiveShellContainer createHiveServerContainer(List<? extends Script> scripts, Object testCase,
                                                 Path baseDir, HiveRunnerConfig config)
            throws IOException {

        HiveServerContext context = new StandaloneHiveServerContext(baseDir, config);

        return buildShell(scripts, testCase, config, context);
    }

    private HiveShellContainer buildShell(List<? extends Script> scripts, Object testCase, HiveRunnerConfig config,
                                          HiveServerContext context) throws IOException {
        HiveServerContainer hiveTestHarness = new HiveServerContainer(context);

        HiveShellBuilder hiveShellBuilder = new HiveShellBuilder();
        hiveShellBuilder.setCommandShellEmulation(config.getCommandShellEmulator());

        HiveShellField shellSetter = loadScriptUnderTest(testCase, hiveShellBuilder);
        if (!scripts.isEmpty()) {
            hiveShellBuilder.overrideScriptsUnderTest(scripts);
        }

        hiveShellBuilder.setHiveServerContainer(hiveTestHarness);

        loadAnnotatedResources(testCase, hiveShellBuilder);

        loadAnnotatedProperties(testCase, hiveShellBuilder);

        loadAnnotatedSetupScripts(testCase, hiveShellBuilder);

        // Build shell
        HiveShellContainer shell = hiveShellBuilder.buildShell();

        // Set shell
        shellSetter.setShell(shell);

        if (shellSetter.isAutoStart()) {
            shell.start();
        }
        return shell;
    }

    private HiveShellField loadScriptUnderTest(Object testCaseInstance, HiveShellBuilder hiveShellBuilder) {
        try {
            Set<Field> fields = ReflectionUtils.getAllFields(testCaseInstance.getClass(), withAnnotation(HiveSQL.class));

            Preconditions.checkState(fields.size() == 1, "Exactly one field should be annotated with @HiveSQL");

            Field field = fields.iterator().next();
            HiveSQL annotation = field.getAnnotation(HiveSQL.class);
            List<Path> scriptPaths = getScriptPaths(annotation);
            Charset charset = annotation.encoding().equals("") ?
                    Charset.defaultCharset() : Charset.forName(annotation.encoding());

            boolean isAutoStart = annotation.autoStart();

            hiveShellBuilder.setScriptsUnderTest(scriptPaths, charset);

            return new HiveShellField() {
                @Override
                public void setShell(HiveShell shell) {
                    ReflectionUtils.setField(testCaseInstance, field.getName(), shell);
                }

                @Override
                public boolean isAutoStart() {
                    return isAutoStart;
                }
            };
        } catch (Throwable t) {
            throw new IllegalArgumentException("Failed to init field annotated with @HiveSQL: " + t.getMessage(), t);
        }
    }

    protected List<Path> getScriptPaths(HiveSQL annotation) throws URISyntaxException {
        List<Path> scriptPaths = new ArrayList<>();
        for (String scriptFilePath : annotation.files()) {
            Path file = Paths.get(Resources.getResource(scriptFilePath).toURI());
            assertFileExists(file);
            scriptPaths.add(file);
        }
        return scriptPaths;
    }

    private void assertFileExists(Path file) {
        Preconditions.checkState(Files.exists(file), "File " + file + " does not exist");
    }

    private void loadAnnotatedSetupScripts(Object testCase, HiveShellBuilder workFlowBuilder) {
        Set<Field> setupScriptFields = ReflectionUtils.getAllFields(testCase.getClass(),
                withAnnotation(HiveSetupScript.class));

        for (Field setupScriptField : setupScriptFields) {
            if (ReflectionUtils.isOfType(setupScriptField, String.class)) {
                String script = ReflectionUtils.getFieldValue(testCase, setupScriptField.getName(), String.class);
                workFlowBuilder.addSetupScript(script);
            } else if (ReflectionUtils.isOfType(setupScriptField, File.class) ||
                    ReflectionUtils.isOfType(setupScriptField, Path.class)) {
                Path path = getMandatoryPathFromField(testCase, setupScriptField);
                workFlowBuilder.addSetupScript(readAll(path));
            } else {
                throw new IllegalArgumentException(
                        "Fields annotated with @HiveSetupScript currently only support the types String, File and Path");
            }
        }
    }

    private static String readAll(Path path) {
        try {
            return new String(Files.readAllBytes(path), StandardCharsets.UTF_8);
        } catch (IOException e) {
            throw new IllegalStateException("Unable to read " + path + ": " + e.getMessage(), e);
        }
    }

    private void loadAnnotatedResources(Object testCase, HiveShellBuilder workFlowBuilder) throws IOException {
        Set<Field> fields = ReflectionUtils.getAllFields(testCase.getClass(), withAnnotation(HiveResource.class));

        for (Field resourceField : fields) {

            HiveResource annotation = resourceField.getAnnotation(HiveResource.class);
            String targetFile = annotation.targetFile();

            if (ReflectionUtils.isOfType(resourceField, String.class)) {
                String data = ReflectionUtils.getFieldValue(testCase, resourceField.getName(), String.class);
                workFlowBuilder.addResource(targetFile, data);
            } else if (ReflectionUtils.isOfType(resourceField, File.class) ||
                    ReflectionUtils.isOfType(resourceField, Path.class)) {
                Path dataFile = getMandatoryPathFromField(testCase, resourceField);
                workFlowBuilder.addResource(targetFile, dataFile);
            } else {
                throw new IllegalArgumentException(
                        "Fields annotated with @HiveResource currently only support the field types String, File or Path");
            }
        }
    }

    private Path getMandatoryPathFromField(Object testCase, Field resourceField) {
        Path path;
        if (ReflectionUtils.isOfType(resourceField, File.class)) {
            File dataFile = ReflectionUtils.getFieldValue(testCase, resourceField.getName(), File.class);
            path = Paths.get(dataFile.toURI());
        } else if (ReflectionUtils.isOfType(resourceField, Path.class)) {
            path = ReflectionUtils.getFieldValue(testCase, resourceField.getName(), Path.class);
        } else {
            throw new IllegalArgumentException(
                    "Only Path or File type is allowed on annotated field " + resourceField);
        }

        Preconditions.checkArgument(Files.exists(path), "File %s does not exist", path);
        return path;
    }

    private void loadAnnotatedProperties(Object testCase, HiveShellBuilder workFlowBuilder) {
        for (Field hivePropertyField : ReflectionUtils.getAllFields(testCase.getClass(),
                withAnnotation(HiveProperties.class))) {
            Preconditions.checkState(ReflectionUtils.isOfType(hivePropertyField, Map.class),
                    "Field annotated with @HiveProperties should be of type Map<String, String>");
            workFlowBuilder.putAllProperties(
                    ReflectionUtils.getFieldValue(testCase, hivePropertyField.getName(), Map.class));
        }
    }

    /**
     * Used as a handle for the HiveShell field in the test case so that we may set it once the
     * HiveShell has been instantiated.
     */
    interface HiveShellField {

        void setShell(HiveShell shell);

        boolean isAutoStart();
    }
}


================================================
FILE: src/main/java/com/klarna/hiverunner/HiveRunnerExtension.java
================================================
/**
 * Copyright (C) 2013-2021 Klarna AB
 * Copyright (C) 2021 The HiveRunner Contributors
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 * http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */
package com.klarna.hiverunner;

import static org.reflections.ReflectionUtils.withAnnotation;
import static org.reflections.ReflectionUtils.withType;

import java.io.IOException;
import java.io.UncheckedIOException;
import java.lang.reflect.Field;
import java.net.URISyntaxException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.ArrayList;
import java.util.List;
import java.util.Set;

import org.apache.commons.io.FileUtils;
import org.junit.jupiter.api.extension.AfterEachCallback;
import org.junit.jupiter.api.extension.ExtensionContext;
import org.junit.jupiter.api.extension.TestInstancePostProcessor;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

import com.google.common.base.Preconditions;
import com.google.common.base.Predicates;
import com.klarna.hiverunner.annotations.HiveRunnerSetup;
import com.klarna.hiverunner.annotations.HiveSQL;
import com.klarna.hiverunner.builder.Script;
import com.klarna.hiverunner.config.HiveRunnerConfig;
import com.klarna.reflection.ReflectionUtils;

public class HiveRunnerExtension implements AfterEachCallback, TestInstancePostProcessor {

    private static final Logger LOGGER = LoggerFactory.getLogger(HiveRunnerExtension.class);

    private final HiveRunnerCore core;
    private final HiveRunnerConfig config = new HiveRunnerConfig();
    private Path basedir;
    private HiveShellContainer container;
    protected List<Script> scriptsUnderTest = new ArrayList<Script>();

    public HiveRunnerExtension() {
        core = new HiveRunnerCore();
    }

    protected List<Path> getScriptPaths(HiveSQL annotation) throws URISyntaxException {
        return core.getScriptPaths(annotation);
    }

    @Override
    public void postProcessTestInstance(Object target, ExtensionContext extensionContext) {
        setupConfig(target);
        try {
            basedir = Files.createTempDirectory("hiverunner_test");
            container = createHiveServerContainer(scriptsUnderTest, target, basedir);
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
        scriptsUnderTest = container.getScriptsUnderTest();
    }

    private void setupConfig(Object target) {
        Set<Field> fields = ReflectionUtils.getAllFields(target.getClass(),
                Predicates.and(
                        withAnnotation(HiveRunnerSetup.class),
                        withType(HiveRunnerConfig.class)));

        Preconditions.checkState(fields.size() <= 1,
                "Only one field of type HiveRunnerConfig should be annotated with @HiveRunnerSetup");

        if (!fields.isEmpty()) {
            config.override(ReflectionUtils
                    .getFieldValue(target, fields.iterator().next().getName(), HiveRunnerConfig.class));
        }
    }

    private void tearDown(Object target) {
        if (container != null) {
            LOGGER.info("Tearing down {}", target.getClass());
            container.tearDown();
        }
        deleteTempFolder(basedir);
    }

    private void deleteTempFolder(Path directory) {
        try {
            FileUtils.deleteDirectory(directory.toFile());
        } catch (IOException e) {
            LOGGER.debug("Temporary folder was not deleted successfully: {}", directory);
        }
    }

    private HiveShellContainer createHiveServerContainer(List<? extends Script> scripts, Object testCase, Path basedir)
            throws IOException {
        return core.createHiveServerContainer(scripts, testCase, basedir, config);
    }

    @Override
    public void afterEach(ExtensionContext extensionContext) {
        tearDown(extensionContext.getRequiredTestInstance());
    }
}


================================================
FILE: src/main/java/com/klarna/hiverunner/HiveRunnerRule.java
================================================
/**
 * Copyright (C) 2013-2021 Klarna AB
 * Copyright (C) 2021 The HiveRunner Contributors
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 * http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */
package com.klarna.hiverunner;

import java.util.ArrayList;
import java.util.List;

import java.nio.file.Path;

import org.junit.rules.TestRule;
import org.junit.runner.Description;
import org.junit.runners.model.Statement;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

import com.klarna.hiverunner.builder.Script;

/**
 * A rule that executes the scripts under test
 */
public class HiveRunnerRule implements TestRule {

    private static final Logger LOGGER = LoggerFactory.getLogger(HiveRunnerRule.class);
    private final StandaloneHiveRunner runner;
    private final Object target;
    private final Path testBaseDir;
    private List<? extends Script> scriptsUnderTest = new ArrayList<>();

    HiveRunnerRule(StandaloneHiveRunner runner, Object target, Path testBaseDir) {
        this.runner = runner;
        this.target = target;
        this.testBaseDir = testBaseDir;
    }

    public List<? extends Script> getScriptsUnderTest() {
        return scriptsUnderTest;
    }

    public void setScriptsUnderTest(List<? extends Script> scriptsUnderTest) {
        LOGGER.debug("Setting up hive runner scripts under test");
        this.scriptsUnderTest = scriptsUnderTest;
    }

    @Override
    public Statement apply(Statement base, Description description) {
        LOGGER.debug("Running hive runner rule apply");
        return new HiveRunnerRuleStatement(runner, target, base, testBaseDir);
    }

    class HiveRunnerRuleStatement extends Statement {

        private Object target;
        private Statement base;
        private Path testBaseDir;
        private StandaloneHiveRunner runner;

        private HiveRunnerRuleStatement(
                StandaloneHiveRunner runner,
                Object target,
                Statement base,
                Path testBaseDir) {
            this.runner = runner;
            this.target = target;
            this.base = base;
            this.testBaseDir = testBaseDir;
        }

        @Override
        public void evaluate() throws Throwable {
            LOGGER.debug("Hive runner rule evaluate method");
            HiveShellContainer container = runner.evaluateStatement(scriptsUnderTest, target, testBaseDir, base);

            /*
             * Script list will initially be null. 'evaluateStatement' sets up the script list.
             * Need to set the value here to allow for mutation inside the mutantSwarmRule.
             */
            scriptsUnderTest = container.getScriptsUnderTest();
        }
    }
}


================================================
FILE: src/main/java/com/klarna/hiverunner/HiveServerContainer.java
================================================
/**
 * Copyright (C) 2013-2021 Klarna AB
 * Copyright (C) 2021 The HiveRunner Contributors
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 * http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */
package com.klarna.hiverunner;

import com.google.common.base.Function;
import com.google.common.base.Joiner;
import com.google.common.base.Preconditions;
import com.google.common.collect.Iterables;
import com.klarna.hiverunner.builder.Statement;
import com.klarna.hiverunner.io.IgnoreClosePrintStream;
import org.apache.hadoop.hive.conf.HiveConf;
import org.apache.hadoop.hive.conf.HiveVariableSource;
import org.apache.hadoop.hive.conf.VariableSubstitution;
import org.apache.hadoop.hive.ql.exec.tez.TezJobExecHelper;
import org.apache.hadoop.hive.ql.session.SessionState;
import org.apache.hive.service.Service;
import org.apache.hive.service.cli.CLIService;
import org.apache.hive.service.cli.HiveSQLException;
import org.apache.hive.service.cli.OperationHandle;
import org.apache.hive.service.cli.RowSet;
import org.apache.hive.service.cli.SessionHandle;
import org.apache.hive.service.server.HiveServer2;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

import java.io.PrintStream;
import java.nio.file.Path;
import javax.annotation.Nullable;
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

/**
 * HiveServer wrapper
 */
public class HiveServerContainer {

    private static final Logger LOGGER = LoggerFactory.getLogger(HiveServerContainer.class);

    private CLIService client;
    private final HiveServerContext context;
    private SessionHandle sessionHandle;
    private HiveServer2 hiveServer2;
    private SessionState currentSessionState;

    public HiveServerContainer(HiveServerContext context) {
        this.context = context;
    }

    public CLIService getClient() {
        return client;
    }

    /**
     * Will start the HiveServer.
     *
     * @param testConfig Specific test case properties. Will be merged with the HiveConf of the context
     * @param hiveVars   HiveVars to pass on to the HiveServer for this session
     */
    public void init(Map<String, String> testConfig, Map<String, String> hiveVars) {

        context.init();

        HiveConf hiveConf = context.getHiveConf();

        // merge test case properties with hive conf before HiveServer is started.
        for (Map.Entry<String, String> property : testConfig.entrySet()) {
            hiveConf.set(property.getKey(), property.getValue());
        }

        try {
            hiveServer2 = new HiveServer2();
            hiveServer2.init(hiveConf);

            // Locate the CLIService in the HiveServer2
            for (Service service : hiveServer2.getServices()) {
                if (service instanceof CLIService) {
                    client = (CLIService) service;
                }
            }

            Preconditions.checkNotNull(client, "CLIService was not initialized by HiveServer2");

            sessionHandle = client.openSession("noUser", "noPassword", null);

            SessionState sessionState = client.getSessionManager().getSession(sessionHandle).getSessionState();
            currentSessionState = sessionState;
            currentSessionState.setHiveVariables(hiveVars);
        } catch (Exception e) {
            throw new IllegalStateException("Failed to create HiveServer: " + e.getMessage(), e);
        }

        // Ping the hive server before we do anything more with it! If validation
        // is switched on, this will fail if the metastore is not set up properly.
        pingHiveServer();
    }

    public Path getBaseDir() {
        return context.getBaseDir();
    }

    public List<Object[]> executeStatement(Statement hiveql) {
        return executeStatement(hiveql.getSql());
    }

    public List<Object[]> executeStatement(String hiveql) {
        // This PrintStream hack can be removed if/when IntelliJ fixes https://youtrack.jetbrains.com/issue/IDEA-120628
        // See https://github.com/klarna/HiveRunner/issues/94 for more info.
        PrintStream initialPrintStream = System.out;
        try {
            System.setOut(new IgnoreClosePrintStream(System.out));
            OperationHandle handle = client.executeStatement(sessionHandle, hiveql, new HashMap<>());
            List<Object[]> resultSet = new ArrayList<>();
            if (handle.hasResultSet()) {
                /*
                 * fetchResults will by default return 100 rows per fetch (hive 14). For big result sets we need to repeatedly
                 * fetch until all rows have been retrieved.
                 */
                RowSet rowSet;
                while ((rowSet = client.fetchResults(handle)) != null && rowSet.numRows() > 0) {
                    for (Object[] row : rowSet) {
                        resultSet.add(row.clone());
                    }
                }
            }

            LOGGER.debug("ResultSet:\n"
                    + Joiner.on("\n").join(Iterables.transform(resultSet, new Function<Object[], String>() {
                @Nullable
                @Override
                public String apply(@Nullable Object[] objects) {
                    return Joiner.on(", ").useForNull("null").join(objects);
                }
            })));

            return resultSet;
        } catch (HiveSQLException e) {
            throw new IllegalArgumentException("Failed to execute Hive query " + hiveql + ": " + e.getMessage(),
                    e);
        } finally {
            System.setOut(initialPrintStream);
        }
    }

    /**
     * Release all resources.
     * <p>
     * This call will never throw an exception as it makes no sense doing that in the tear down phase.
     * </p>
     */
    public void tearDown() {

        try {
            TezJobExecHelper.killRunningJobs();
        } catch (Throwable e) {
            LOGGER.warn("Failed to kill tez session: " + e.getMessage() + ". Turn on log level debug for stacktrace");
            LOGGER.debug(e.getMessage(), e);
        }

        try {
            // Reset to default schema
            executeStatement("USE default");
        } catch (Throwable e) {
            LOGGER.warn("Failed to reset to default schema: " + e.getMessage()
                    + ". Turn on log level debug for stacktrace");
            LOGGER.debug(e.getMessage(), e);
        }

        try {
            client.closeSession(sessionHandle);
        } catch (Throwable e) {
            LOGGER.warn(
                    "Failed to close client session: " + e.getMessage() + ". Turn on log level debug for stacktrace");
            LOGGER.debug(e.getMessage(), e);
        }

        try {
            hiveServer2.stop();
        } catch (Throwable e) {
            LOGGER.warn("Failed to stop HiveServer2: " + e.getMessage() + ". Turn on log level debug for stacktrace");
            LOGGER.debug(e.getMessage(), e);
        }

        hiveServer2 = null;
        client = null;
        sessionHandle = null;

        LOGGER.info("Tore down HiveServer instance");
    }

    public String expandVariableSubstitutes(String expression) {
        return getVariableSubstitution().substitute(getHiveConf(), expression);
    }

    private void pingHiveServer() {
        executeStatement("SHOW TABLES");
    }

    public HiveConf getHiveConf() {
        return hiveServer2.getHiveConf();
    }

    public VariableSubstitution getVariableSubstitution() {
        // Make sure to set the session state for this thread before returning the VariableSubstitution. If not set,
        // hivevars will not be evaluated.
        SessionState.setCurrentSessionState(currentSessionState);

        SessionState ss = currentSessionState;
        return new VariableSubstitution(new HiveVariableSource() {
            @Override
            public Map<String, String> getHiveVariable() {
                return ss.getHiveVariables();
            }
        });
    }
}
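The fetch loop in `executeStatement` above accumulates batches until `fetchResults` returns an empty row set. A self-contained sketch of that pattern, with a hypothetical `FakeResultPager` standing in for `CLIService.fetchResults`:

```java
import java.util.ArrayList;
import java.util.List;

// Minimal sketch of the paged-fetch pattern used in executeStatement:
// keep fetching batches until an empty batch signals the end of the result set.
// FakeResultPager is a stand-in for CLIService.fetchResults and is not part of Hive.
public class PagedFetchSketch {

    static class FakeResultPager {
        private final int totalRows;
        private final int batchSize;
        private int served = 0;

        FakeResultPager(int totalRows, int batchSize) {
            this.totalRows = totalRows;
            this.batchSize = batchSize;
        }

        // Returns the next batch of rows, or an empty list when exhausted.
        List<Object[]> fetch() {
            List<Object[]> batch = new ArrayList<>();
            while (batch.size() < batchSize && served < totalRows) {
                batch.add(new Object[] { served++ });
            }
            return batch;
        }
    }

    static List<Object[]> fetchAll(FakeResultPager pager) {
        List<Object[]> resultSet = new ArrayList<>();
        List<Object[]> batch;
        // Same shape as the while loop in executeStatement: stop on an empty batch.
        while (!(batch = pager.fetch()).isEmpty()) {
            for (Object[] row : batch) {
                resultSet.add(row.clone());
            }
        }
        return resultSet;
    }

    public static void main(String[] args) {
        // 250 rows with a batch size of 100 requires three fetches (100 + 100 + 50).
        List<Object[]> rows = fetchAll(new FakeResultPager(250, 100));
        System.out.println(rows.size()); // prints 250
    }
}
```

The rows are cloned per batch, as in `executeStatement`, because the underlying `RowSet` may reuse row buffers between fetches.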

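`expandVariableSubstitutes` and `getVariableSubstitution` above let test code resolve `${hivevar:...}` references against the session's hive variables. A toy illustration of that resolution (Hive's real `VariableSubstitution` also handles `${hiveconf:...}`, `${system:...}` and `${env:...}`; `HivevarSubstitutionSketch` is hypothetical and handles only `hivevar`):

```java
import java.util.HashMap;
import java.util.Map;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Naive sketch of what getVariableSubstitution enables: resolving ${hivevar:name}
// references against the session's hive variables.
public class HivevarSubstitutionSketch {

    private static final Pattern HIVEVAR = Pattern.compile("\\$\\{hivevar:([^}]+)\\}");

    static String substitute(Map<String, String> hiveVars, String expression) {
        Matcher m = HIVEVAR.matcher(expression);
        StringBuffer sb = new StringBuffer();
        while (m.find()) {
            String value = hiveVars.get(m.group(1));
            if (value == null) {
                // Mirrors the HiveShell contract: unresolved substitutes are an error.
                throw new IllegalArgumentException("Unresolved hivevar: " + m.group(1));
            }
            m.appendReplacement(sb, Matcher.quoteReplacement(value));
        }
        m.appendTail(sb);
        return sb.toString();
    }

    public static void main(String[] args) {
        Map<String, String> vars = new HashMap<>();
        vars.put("tableName", "foo");
        System.out.println(substitute(vars, "SELECT * FROM ${hivevar:tableName}"));
        // prints: SELECT * FROM foo
    }
}
```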

================================================
FILE: src/main/java/com/klarna/hiverunner/HiveServerContext.java
================================================
/**
 * Copyright (C) 2013-2021 Klarna AB
 * Copyright (C) 2021 The HiveRunner Contributors
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 * http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */
package com.klarna.hiverunner;

import java.nio.file.Path;
import java.util.Map;

import org.apache.hadoop.hive.conf.HiveConf;

/**
 * Configuration for the HiveServer.
 *
 * Implementations of this interface should set the context of the HiveServer that is spawned by HiveRunner. {@link
 * com.klarna.hiverunner.StandaloneHiveRunner} uses the {@link StandaloneHiveServerContext} to create a context with
 * zero external dependencies.
 *
 * By implementing other contexts you may, for example, point the HiveServer to a different metastore, a
 * pre-installed external Hadoop instance, etc.
 */
public interface HiveServerContext {

    /**
     * Create all test resources and set all hive configurations.
     *
     * Note that before this method is called, some injected dependencies may not yet have been initialized.
     * After this method is called, all configurations and resources should have been set.
     *
     * Called by {@link HiveServerContainer#init(Map, Map)}
     */
    void init();

    /**
     * Get the hiveconf. This will not be available until init() has been called.
     */
    HiveConf getHiveConf();

    /**
     * Get file folder that acts as the base dir for the test data. This is the sand box for the
     * file system that the HiveRunner uses as replacement for HDFS.
     * <p>
     * Each test method will have a new base dir spawned by the HiveRunner engine.
     * </p>
     */
    Path getBaseDir();
}


================================================
FILE: src/main/java/com/klarna/hiverunner/HiveShell.java
================================================
/**
 * Copyright (C) 2013-2021 Klarna AB
 * Copyright (C) 2021 The HiveRunner Contributors
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 * http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */
package com.klarna.hiverunner;

import com.klarna.hiverunner.data.InsertIntoTable;
import org.apache.hadoop.hive.conf.HiveConf;

import java.io.File;
import java.io.OutputStream;
import java.nio.charset.Charset;
import java.nio.file.Path;
import java.util.List;

/**
 * Test handle to the hive server.
 *
 * Please refer to test class {@code com.klarna.hiverunner.examples.HelloHiveRunnerTest} for usage examples.
 */
public interface HiveShell {

    /**
     * Executes a single query.
     * <p>
     * May only be called post #start()
     * </p>
     */
    List<String> executeQuery(String hiveSql);

    /**
     * Executes a single query.
     * <p>
     * May only be called post #start()
     * </p>
     */
    List<String> executeQuery(String hiveSql, String rowValuesDelimitedBy, String replaceNullWith);

    /**
     * Executes a single query from a script file, returning any results.
     * <p>
     * May only be called post #start()
     * </p>
     */
    List<String> executeQuery(File script);

    /**
     * Executes a single query from a script file, returning any results.
     * <p>
     * May only be called post #start()
     * </p>
     */
    List<String> executeQuery(Path script);

    /**
     * Executes a single query from a script file, returning any results.
     * <p>
     * May only be called post #start()
     * </p>
     */
    List<String> executeQuery(Charset charset, File script);

    /**
     * Executes a single query from a script file, returning any results.
     * <p>
     * May only be called post #start()
     * </p>
     */
    List<String> executeQuery(Charset charset, Path script);

    /**
     * Executes a single query from a script file, returning any results.
     * <p>
     * May only be called post #start()
     * </p>
     */
    List<String> executeQuery(File script, String rowValuesDelimitedBy, String replaceNullWith);

    /**
     * Executes a single query from a script file, returning any results.
     * <p>
     * May only be called post #start()
     * </p>
     */
    List<String> executeQuery(Path script, String rowValuesDelimitedBy, String replaceNullWith);

    /**
     * Executes a single query from a script file, returning any results.
     * <p>
     * May only be called post #start()
     * </p>
     */
    List<String> executeQuery(Charset charset, File script, String rowValuesDelimitedBy, String replaceNullWith);

    /**
     * Executes a single query from a script file, returning any results.
     * <p>
     * May only be called post #start()
     * </p>
     */
    List<String> executeQuery(Charset charset, Path script, String rowValuesDelimitedBy, String replaceNullWith);

    /**
     * Execute a single hive query
     * <p>
     * May only be called post #start()
     * </p>
     */
    List<Object[]> executeStatement(String hiveSql);

    /**
     * Executes a hive script. The script may contain multiple statements delimited by ';'
     * <p>
     * May only be called post #start()
     * </p>
     */
    void execute(String script);

    /**
     * Executes a hive script. The script may contain multiple statements delimited by ';'.
     * Default charset will be used to read the given files.
     * <p>
     * May only be called post #start()
     * </p>
     */
    void execute(File file);

    /**
     * Executes a hive script. The script may contain multiple statements delimited by ';'.
     * Default charset will be used to read the given files.
     * <p>
     * May only be called post #start()
     * </p>
     */
    void execute(Path path);

    /**
     * Executes a hive script. The script may contain multiple statements delimited by ';'
     * <p>
     * May only be called post #start()
     * </p>
     */
    void execute(Charset charset, File file);

    /**
     * Executes a hive script. The script may contain multiple statements delimited by ';'
     * <p>
     * May only be called post #start()
     * </p>
     */
    void execute(Charset charset, Path path);

    /**
     * Start the shell. May only be called once. The test engine will call this method by default;
     * set {@link com.klarna.hiverunner.annotations.HiveSQL#autoStart()} to false to explicitly control
     * when to start from the test case.
     * <p>
     * This might be useful for test methods that need additional setup not catered for by the provided annotations.
     * </p>
     */
    void start();

    /**
     * Set a HiveConf property.
     * <p>
     * May only be called pre #start()
     * </p>
     * @deprecated Use {@link HiveShell#setHiveConfValue(String, String)} instead
     */
    @Deprecated
    void setProperty(String key, String value);

    /**
     * Set HiveConf property.
     * <p>
     * May only be called pre #start()
     * </p>
     */
    void setHiveConfValue(String key, String value);

    /**
     * Set Hive variable.
     * <p>
     * May only be called pre #start()
     * </p>
     */
    void setHiveVarValue(String var, String value);

    /**
     * Get the current HiveConf from hive
     */
    HiveConf getHiveConf();

    void setCwd(Path cwd);

    Path getCwd();

    /**
     * Copy test data into HDFS.
     * May only be called pre #start()
     * <p>
     * {@link com.klarna.hiverunner.MethodLevelResourceTest#resourceLoadingAsFileTest()}
     * and {@link com.klarna.hiverunner.MethodLevelResourceTest#resourceLoadingAsStringTest()}
     * </p>
     */
    void addResource(String targetFile, File sourceFile);

    /**
     * Copy test data into HDFS.
     * May only be called pre #start()
     * <p>
     * {@link com.klarna.hiverunner.MethodLevelResourceTest#resourceLoadingAsFileTest()}
     * and {@link com.klarna.hiverunner.MethodLevelResourceTest#resourceLoadingAsStringTest()}
     * </p>
     */
    void addResource(String targetFile, Path sourceFile);


    /**
     * Copy test data into HDFS.
     * May only be called pre #start()
     * <p>
     * {@link com.klarna.hiverunner.MethodLevelResourceTest#resourceLoadingAsFileTest()}
     * and {@link com.klarna.hiverunner.MethodLevelResourceTest#resourceLoadingAsStringTest()}
     * </p>
     */
    void addResource(String targetFile, String data);

    /**
     * Add a hive script that will be executed when the hive shell is started
     * Scripts will be executed in the order they are added.
     *
     * Note that execution order is not guaranteed with
     * fields annotated with {@link com.klarna.hiverunner.annotations.HiveSetupScript}
     */
    void addSetupScript(String script);

    /**
     * Add hive scripts that will be executed when the hive shell is started. Scripts will be executed in given order.
     *
     * Note that execution order is not guaranteed with
     * fields annotated with {@link com.klarna.hiverunner.annotations.HiveSetupScript}
     */
    void addSetupScripts(Charset charset, File... scripts);

    /**
     * Add hive scripts that will be executed when the hive shell is started. Scripts will be executed in given order.
     *
     * Note that execution order is not guaranteed with
     * fields annotated with {@link com.klarna.hiverunner.annotations.HiveSetupScript}
     */
    void addSetupScripts(Charset charset, Path... scripts);


    /**
     * Add hive scripts that will be executed when the hive shell is started. Scripts will be executed in given order.
     *
     * Default charset will be used to read the given files
     *
     * Note that execution order is not guaranteed with
     * fields annotated with {@link com.klarna.hiverunner.annotations.HiveSetupScript}
     */
    void addSetupScripts(File... scripts);

    /**
     * Add hive scripts that will be executed when the hive shell is started. Scripts will be executed in given order.
     *
     * Default charset will be used to read the given files
     *
     * Note that execution order is not guaranteed with
     * fields annotated with {@link com.klarna.hiverunner.annotations.HiveSetupScript}
     */
    void addSetupScripts(Path... scripts);


    /**
     * Get the test case sand box base dir
     */
    Path getBaseDir();

    /**
     * Resolve all substituted variables with the hive conf.
     * @throws IllegalArgumentException if not all substitutes could be resolved
     * @throws IllegalStateException    if the HiveShell was not started yet.
     */
    String expandVariableSubstitutes(String expression);

    /**
     * Open up a stream to write test data into HDFS.
     *
     * May only be called pre #start().
     * No writes to the stream will be allowed post #start().
     *
     * @param targetFile The path to the target file relative to the hive work space.
     *
     * See test class {@code com.klarna.hiverunner.ResourceOutputStreamTest#sequenceFile()} for an example of how this
     * works with sequence files.
     */
    OutputStream getResourceOutputStream(String targetFile);

    /**
     * Returns an {@link InsertIntoTable} that allows programmatically inserting data into a table in a fluent manner.
     * <p>
     * May only be called post #start()
     * </p>
     *
     * @param databaseName The database name
     * @param tableName The table name
     */
    InsertIntoTable insertInto(String databaseName, String tableName);
}
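The `execute` methods above accept scripts containing multiple statements delimited by ';'. A naive sketch of that splitting (HiveRunner's actual parser is more robust, e.g. around comments; `ScriptSplitSketch` is hypothetical and only guards single-quoted strings):

```java
import java.util.ArrayList;
import java.util.List;

// Naive sketch of the statement splitting implied by HiveShell#execute(String):
// a script is broken into statements on ';', ignoring ';' inside single-quoted strings.
public class ScriptSplitSketch {

    static List<String> split(String script) {
        List<String> statements = new ArrayList<>();
        StringBuilder current = new StringBuilder();
        boolean inSingleQuote = false;
        for (char c : script.toCharArray()) {
            if (c == '\'') {
                inSingleQuote = !inSingleQuote;
            }
            if (c == ';' && !inSingleQuote) {
                String stmt = current.toString().trim();
                if (!stmt.isEmpty()) {
                    statements.add(stmt);
                }
                current.setLength(0);
            } else {
                current.append(c);
            }
        }
        // A trailing statement without a closing ';' is still executed.
        String tail = current.toString().trim();
        if (!tail.isEmpty()) {
            statements.add(tail);
        }
        return statements;
    }

    public static void main(String[] args) {
        List<String> stmts = split("CREATE TABLE t (s STRING); INSERT INTO t VALUES ('a;b')");
        System.out.println(stmts.size()); // prints 2
    }
}
```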


================================================
FILE: src/main/java/com/klarna/hiverunner/HiveShellContainer.java
================================================
/**
 * Copyright (C) 2013-2021 Klarna AB
 * Copyright (C) 2021 The HiveRunner Contributors
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 * http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */
package com.klarna.hiverunner;

import java.util.List;

import com.klarna.hiverunner.builder.Script;

/**
 * Wrapper for the HiveShell that allows the framework to augment the HiveShell with functionality that will not be
 * exposed to the test case creator.
 */
public interface HiveShellContainer extends HiveShell {

    /**
     * Should be called after execution of each test method and should tear down the test fixture leaving
     * no residue for coming test cases.
     */
    void tearDown();

    /**
     * Returns a List of the scripts being tested. 
     */
    List<Script> getScriptsUnderTest();
}


================================================
FILE: src/main/java/com/klarna/hiverunner/StandaloneHiveRunner.java
================================================
/**
 * Copyright (C) 2013-2021 Klarna AB
 * Copyright (C) 2021 The HiveRunner Contributors
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 * http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */
package com.klarna.hiverunner;

import com.google.common.base.Preconditions;
import com.google.common.base.Predicates;
import com.klarna.hiverunner.annotations.*;
import com.klarna.hiverunner.builder.Script;
import com.klarna.hiverunner.config.HiveRunnerConfig;
import com.klarna.reflection.ReflectionUtils;

import org.apache.commons.io.FileUtils;
import org.apache.hadoop.fs.FileUtil;
import org.apache.hadoop.fs.permission.FsPermission;
import org.junit.Ignore;
import org.junit.internal.AssumptionViolatedException;
import org.junit.internal.runners.model.EachTestNotifier;
import org.junit.rules.TestRule;
import org.junit.runner.Description;
import org.junit.runner.notification.RunNotifier;
import org.junit.runners.BlockJUnit4ClassRunner;
import org.junit.runners.model.FrameworkMethod;
import org.junit.runners.model.InitializationError;
import org.junit.runners.model.Statement;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.slf4j.MDC;

import java.io.File;
import java.io.IOException;
import java.io.UncheckedIOException;
import java.lang.reflect.Field;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.ArrayList;
import java.util.List;
import java.util.Set;

import static org.apache.hadoop.hive.metastore.conf.MetastoreConf.ConfVars.HIVE_IN_TEST;
import static org.reflections.ReflectionUtils.withAnnotation;
import static org.reflections.ReflectionUtils.withType;

/**
 * JUnit 4 runner that runs hive sql on a HiveServer residing in this JVM. No external dependencies needed.
 */
public class StandaloneHiveRunner extends BlockJUnit4ClassRunner {

    private static final Logger LOGGER = LoggerFactory.getLogger(StandaloneHiveRunner.class);

    private HiveShellContainer container;

    /**
     * We need to init the config here because it will be passed
     * around before it is fully loaded from the test case.
     */
    private final HiveRunnerConfig config = new HiveRunnerConfig();

    public StandaloneHiveRunner(Class<?> clazz) throws InitializationError {
        super(clazz);
    }

    protected HiveRunnerConfig getHiveRunnerConfig() {
        return config;
    }

    @Override
    protected List<TestRule> getTestRules(Object target) {
        Path testBaseDir = null;
        try {
            testBaseDir = Files.createTempDirectory("hiverunner_tests");
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }

        HiveRunnerRule hiveRunnerRule = new HiveRunnerRule(this, target, testBaseDir);

        /*
         * Note that rules will be executed in reverse order to how they're added.
         */

        List<TestRule> rules = new ArrayList<>();
        rules.addAll(super.getTestRules(target));
        rules.add(hiveRunnerRule);
        rules.add(ThrowOnTimeout.create(config, getName()));

        /*
         Make sure the hive runner config rule is the first rule on the list to be executed so that any subsequent
         statements have access to the final config.
          */
        rules.add(getHiveRunnerConfigRule(target));
        return rules;
    }

    @Override
    protected void runChild(FrameworkMethod method, RunNotifier notifier) {
        Description description = describeChild(method);
        if (method.getAnnotation(Ignore.class) != null) {
            notifier.fireTestIgnored(description);
        } else {
            setLogContext(method);
            EachTestNotifier eachNotifier = new EachTestNotifier(notifier, description);
            eachNotifier.fireTestStarted();
            try {
                runTestMethod(method, eachNotifier, config.getTimeoutRetries());
            } finally {
                eachNotifier.fireTestFinished();
                clearLogContext();
            }
        }
    }

    /**
     * Runs a {@link Statement} that represents a leaf (aka atomic) test.
     */
    protected final void runTestMethod(FrameworkMethod method,
                                       EachTestNotifier notifier, int retriesLeft) {

        Statement statement = methodBlock(method);

        try {
            statement.evaluate();
        } catch (AssumptionViolatedException e) {
            notifier.addFailedAssumption(e);
        } catch (TimeoutException e) {
            /*
             TimeoutException thrown by ThrowOnTimeout statement. Handling is kept in this class since this is where the
             retry needs to be triggered in order to get the right tear down and test setup between retries.
              */
            if (--retriesLeft >= 0) {
                LOGGER.warn(
                        "Test case timed out. Will attempt retry {} more times. Turn on log level DEBUG for stacktrace",
                        retriesLeft);
                LOGGER.debug(e.getMessage(), e);
                tearDown();
                runTestMethod(method, notifier, retriesLeft);
            } else {
                notifier.addFailure(e);
            }
        } catch (Throwable e) {
            notifier.addFailure(e);
        }
    }

    /**
     * Drives the unit test.
     */
    public HiveShellContainer evaluateStatement(List<? extends Script> scripts, Object target,
                                                Path temporaryFolder, Statement base) throws Throwable {
        container = null;
        File temporaryFile = temporaryFolder.toFile();
        if (!temporaryFile.exists()) {
            temporaryFile.mkdirs();
        }
        FileUtil.setPermission(temporaryFile, FsPermission.getDirDefault());
        try {
            LOGGER.info("Setting up {} in {}", getName(), temporaryFolder);
            container = createHiveServerContainer(scripts, target, temporaryFolder);
            base.evaluate();
            return container;
        } finally {
            tearDown();
        }
    }

    private void tearDown() {
        tearDownContainer();
        if (container != null) {
            deleteTempFolder(container.getBaseDir());
        }
    }

    private void tearDownContainer() {
        if (container != null) {
            LOGGER.info("Tearing down {}", getName());
            try {
                container.tearDown();
            } catch (Throwable e) {
                LOGGER.warn("Tear down failed: " + e.getMessage(), e);
            }
        }
    }

    private void deleteTempFolder(Path directory) {
        try {
            FileUtils.deleteDirectory(directory.toFile());
        } catch (IOException e) {
            LOGGER.debug("Temporary folder was not deleted successfully: " + directory);
        }
    }

    /**
     * Traverses the test case annotations. Will inject a HiveShell in the test case that envelopes the HiveServer.
     */
    private HiveShellContainer createHiveServerContainer(List<? extends Script> scripts, Object testCase,
                                                         Path baseDir)
            throws IOException {
        HiveRunnerCore core = new HiveRunnerCore();
        return core.createHiveServerContainer(scripts, testCase, baseDir, config);
    }

    private TestRule getHiveRunnerConfigRule(Object target) {
        return new TestRule() {
            @Override
            public Statement apply(Statement base, Description description) {
                Set<Field> fields = ReflectionUtils.getAllFields(target.getClass(),
                        Predicates.and(
                                withAnnotation(HiveRunnerSetup.class),
                                withType(HiveRunnerConfig.class)));

                Preconditions.checkState(fields.size() <= 1,
                        "At most one field of type HiveRunnerConfig may be annotated with @HiveRunnerSetup");

                /*
                 Override the config with the test case config. Take care not to replace the config instance, since it
                  has been passed around and referenced by some of the other test rules.
                  */
                if (!fields.isEmpty()) {
                    config.override(ReflectionUtils
                            .getFieldValue(target, fields.iterator().next().getName(), HiveRunnerConfig.class));
                }
                return base;
            }
        };
    }

    private void clearLogContext() {
        MDC.clear();
    }

    private void setLogContext(FrameworkMethod method) {
        MDC.put("testClassShort", getTestClass().getJavaClass().getSimpleName());
        MDC.put("testClass", getTestClass().getJavaClass().getName());
        MDC.put("testMethod", method.getName());
    }
}
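The retry handling in `runTestMethod` above tears down and re-runs a timed-out test until the retry budget from `HiveRunnerConfig#getTimeoutRetries()` is spent, and only then reports the failure. A self-contained sketch of that control flow, with a hypothetical unchecked `SimulatedTimeout` standing in for HiveRunner's `TimeoutException`:

```java
// Minimal sketch of the retry pattern in runTestMethod: a timed-out run triggers
// tear-down and a fresh attempt until the retry budget is exhausted, at which point
// the failure propagates. SimulatedTimeout is hypothetical and plays the role of
// com.klarna.hiverunner.TimeoutException.
public class TimeoutRetrySketch {

    static class SimulatedTimeout extends RuntimeException {
        SimulatedTimeout(String message) { super(message); }
    }

    // Returns the number of attempts made; rethrows once the retry budget is spent.
    static int runWithRetries(Runnable task, int retriesLeft) {
        int attempts = 1;
        while (true) {
            try {
                task.run();
                return attempts;
            } catch (SimulatedTimeout e) {
                if (--retriesLeft < 0) {
                    throw e; // out of retries: report the failure to the notifier
                }
                attempts++; // in HiveRunner, tearDown() runs here before the fresh attempt
            }
        }
    }

    public static void main(String[] args) {
        int[] calls = {0};
        // Times out twice, then succeeds: three attempts fit within a retry budget of 2.
        Runnable flaky = () -> {
            if (calls[0]++ < 2) {
                throw new SimulatedTimeout("simulated timeout");
            }
        };
        System.out.println(runWithRetries(flaky, 2)); // prints 3
    }
}
```

Note the recursion in the real `runTestMethod` is expressed here as a loop; the observable behavior (retry count and final failure) is the same.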


================================================
FILE: src/main/java/com/klarna/hiverunner/StandaloneHiveServerContext.java
================================================
/**
 * Copyright (C) 2013-2021 Klarna AB
 * Copyright (C) 2021 The HiveRunner Contributors
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 * http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */
package com.klarna.hiverunner;

import static org.apache.hadoop.hive.conf.HiveConf.ConfVars.HADOOPBIN;
import static org.apache.hadoop.hive.conf.HiveConf.ConfVars.HIVECONVERTJOIN;
import static org.apache.hadoop.hive.conf.HiveConf.ConfVars.HIVEHISTORYFILELOC;
import static org.apache.hadoop.hive.conf.HiveConf.ConfVars.HIVEMETADATAONLYQUERIES;
import static org.apache.hadoop.hive.conf.HiveConf.ConfVars.HIVEOPTINDEXFILTER;
import static org.apache.hadoop.hive.conf.HiveConf.ConfVars.HIVESKEWJOIN;
import static org.apache.hadoop.hive.conf.HiveConf.ConfVars.HIVESTATSAUTOGATHER;
import static org.apache.hadoop.hive.conf.HiveConf.ConfVars.HIVE_CBO_ENABLED;
import static org.apache.hadoop.hive.conf.HiveConf.ConfVars.HIVE_INFER_BUCKET_SORT;
import static org.apache.hadoop.hive.conf.HiveConf.ConfVars.HIVE_SERVER2_LOGGING_OPERATION_ENABLED;
import static org.apache.hadoop.hive.conf.HiveConf.ConfVars.HIVE_SUPPORT_CONCURRENCY;
import static org.apache.hadoop.hive.conf.HiveConf.ConfVars.LOCALSCRATCHDIR;
import static org.apache.hadoop.hive.conf.HiveConf.ConfVars.METASTORECONNECTURLKEY;
import static org.apache.hadoop.hive.conf.HiveConf.ConfVars.METASTOREWAREHOUSE;
import static org.apache.hadoop.hive.conf.HiveConf.ConfVars.METASTORE_VALIDATE_COLUMNS;
import static org.apache.hadoop.hive.conf.HiveConf.ConfVars.METASTORE_VALIDATE_CONSTRAINTS;
import static org.apache.hadoop.hive.conf.HiveConf.ConfVars.METASTORE_VALIDATE_TABLES;
import static org.apache.hadoop.hive.conf.HiveConf.ConfVars.SCRATCHDIR;
import static org.apache.hadoop.hive.metastore.conf.MetastoreConf.ConfVars.HIVE_IN_TEST;

import java.io.File;
import java.io.IOException;
import java.io.UncheckedIOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.Map;
import java.util.UUID;

import org.apache.hadoop.fs.FileUtil;
import org.apache.hadoop.fs.permission.FsPermission;
import org.apache.hadoop.hive.conf.HiveConf;
import org.apache.tez.dag.api.TezConfiguration;
import org.apache.tez.runtime.library.api.TezRuntimeConfiguration;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

import com.klarna.hiverunner.config.HiveRunnerConfig;

/**
 * Responsible for common configuration for running the HiveServer within this JVM with zero external dependencies.
 * <p>
 * This class contains a bunch of methods meant to be overridden in order to create slightly different contexts.
 * </p>
 * <p>
 * This context configures HiveServer for both mr and tez. There is nothing contradictory between those
 * configurations, so they may coexist, allowing test cases to switch execution engines within the same test,
 * e.g. with 'set hive.execution.engine=tez;'.
 * </p>
 */
public class StandaloneHiveServerContext implements HiveServerContext {

    private static final Logger LOGGER = LoggerFactory.getLogger(StandaloneHiveServerContext.class);

    private String metaStorageUrl;

    protected HiveConf hiveConf = new HiveConf();

    private final Path basedir;
    private final HiveRunnerConfig hiveRunnerConfig;

    public StandaloneHiveServerContext(Path basedir, HiveRunnerConfig hiveRunnerConfig) {
        this.basedir = basedir;
        this.hiveRunnerConfig = hiveRunnerConfig;
    }

    @Override
    public final void init() {

        configureMiscHiveSettings(hiveConf);

        configureMetaStore(hiveConf);

        configureMrExecutionEngine(hiveConf);

        configureTezExecutionEngine(hiveConf);

        configureJavaSecurityRealm(hiveConf);

        configureSupportConcurrency(hiveConf);

        try {
            configureFileSystem(basedir, hiveConf);
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }

        configureAssertionStatus(hiveConf);

        overrideHiveConf(hiveConf);
    }

    protected void configureMiscHiveSettings(HiveConf hiveConf) {
        hiveConf.setBoolVar(HIVESTATSAUTOGATHER, false);

        // Turn off the dependency on the Calcite library
        hiveConf.setBoolVar(HIVE_CBO_ENABLED, false);

        // Disable to get rid of clean up exception when stopping the Session.
        hiveConf.setBoolVar(HIVE_SERVER2_LOGGING_OPERATION_ENABLED, false);

        hiveConf.setVar(HADOOPBIN, "NO_BIN!");
    }

    protected void overrideHiveConf(HiveConf hiveConf) {
        for (Map.Entry<String, String> hiveConfEntry : hiveRunnerConfig.getHiveConfSystemOverride().entrySet()) {
            hiveConf.set(hiveConfEntry.getKey(), hiveConfEntry.getValue());
        }
    }

    protected void configureMrExecutionEngine(HiveConf conf) {
        /*
         * Switch off all optimizers; otherwise we could not contain the MapReduce work within this JVM.
         */
        conf.setBoolVar(HIVE_INFER_BUCKET_SORT, false);
        conf.setBoolVar(HIVEMETADATAONLYQUERIES, false);
        conf.setBoolVar(HIVEOPTINDEXFILTER, false);
        conf.setBoolVar(HIVECONVERTJOIN, false);
        conf.setBoolVar(HIVESKEWJOIN, false);

        // Defaults to a 1000 millis sleep in org.apache.hadoop.hive.ql.exec.mr.HadoopJobExecHelper.
        // We can speed up the tests a bit by setting this to 1 millis instead.
        conf.setLongVar(HiveConf.ConfVars.HIVECOUNTERSPULLINTERVAL, 1L);

        conf.setBoolVar(HiveConf.ConfVars.HIVE_RPC_QUERY_PLAN, true);
    }

    protected void configureTezExecutionEngine(HiveConf conf) {
        /*
         * Tez local mode settings
         */
        conf.setBoolean(TezConfiguration.TEZ_LOCAL_MODE, true);
        conf.set("fs.defaultFS", "file:///");
        conf.setBoolean(TezRuntimeConfiguration.TEZ_RUNTIME_OPTIMIZE_LOCAL_FETCH, true);

        /*
         * Set to be able to run tests offline
         */
        conf.set(TezConfiguration.TEZ_AM_DISABLE_CLIENT_VERSION_CHECK, "true");

        /*
         * General attempts to strip off unnecessary functionality to speed up test execution and increase stability
         */
        conf.set(TezConfiguration.TEZ_AM_USE_CONCURRENT_DISPATCHER, "false");
        conf.set(TezConfiguration.TEZ_AM_CONTAINER_REUSE_ENABLED, "false");
        conf.set(TezConfiguration.DAG_RECOVERY_ENABLED, "false");
        conf.set(TezConfiguration.TEZ_TASK_GET_TASK_SLEEP_INTERVAL_MS_MAX, "1");
        conf.set(TezConfiguration.TEZ_AM_WEBSERVICE_ENABLE, "false");
        conf.set(TezConfiguration.TEZ_AM_NODE_BLACKLISTING_ENABLED, "false");
    }

    protected void configureJavaSecurityRealm(HiveConf hiveConf) {
        // These three properties get rid of: 'Unable to load realm info from SCDynamicStore'
        // which seems to have a timeout of about 5 secs.
        System.setProperty("java.security.krb5.realm", "");
        System.setProperty("java.security.krb5.kdc", "");
        System.setProperty("java.security.krb5.conf", "/dev/null");
    }

    protected void configureAssertionStatus(HiveConf conf) {
        ClassLoader
                .getSystemClassLoader()
                .setPackageAssertionStatus("org.apache.hadoop.hive.serde2.objectinspector", false);
    }

    protected void configureSupportConcurrency(HiveConf conf) {
        conf.setBoolVar(HIVE_SUPPORT_CONCURRENCY, false);
    }

    protected void configureMetaStore(HiveConf conf) {
        configureDerbyLog();

        String jdbcDriver = org.apache.derby.jdbc.EmbeddedDriver.class.getName();
        try {
            Class.forName(jdbcDriver);
        } catch (ClassNotFoundException e) {
            throw new RuntimeException(e);
        }

        // Use a fresh in-memory Derby database as the metastore backend for each test run
        metaStorageUrl = "jdbc:derby:memory:" + UUID.randomUUID().toString();
        setMetastoreProperty("datanucleus.schema.autoCreateAll", "true");
        setMetastoreProperty("datanucleus.schema.autoCreateTables", "true");
        setMetastoreProperty("hive.metastore.schema.verification", "false");
        setMetastoreProperty("metastore.filter.hook", "org.apache.hadoop.hive.metastore.DefaultMetaStoreFilterHookImpl");

        setMetastoreProperty("datanucleus.connectiondrivername", jdbcDriver);
        setMetastoreProperty("javax.jdo.option.ConnectionDriverName", jdbcDriver);

        // No pooling needed. This will save us a lot of threads
        setMetastoreProperty("datanucleus.connectionPoolingType", "None");

        /*
         * If hive.in.test=false (default), Hive 3 will assume that the metastore RDBMS has already been initialized
         * with some basic tables and will try to run initial test queries against them.
         * This results in multiple warning stacktraces if the RDBMS has not actually been initialized.
         */
        setMetastoreProperty(HIVE_IN_TEST.getVarname(), "true");

        setMetastoreProperty(METASTORE_VALIDATE_CONSTRAINTS.varname, "true");
        setMetastoreProperty(METASTORE_VALIDATE_COLUMNS.varname, "true");
        setMetastoreProperty(METASTORE_VALIDATE_TABLES.varname, "true");
    }

    private void configureDerbyLog() {
        // Override the default Derby log path so the log does not end up in the project root
        File derbyLogFile;
        try {
            derbyLogFile = File.createTempFile("derby", ".log");
            LOGGER.debug("Derby set to log to {}", derbyLogFile.getAbsolutePath());
        } catch (IOException e) {
            throw new UncheckedIOException("Error creating temporary derby log file", e);
        }
        System.setProperty("derby.stream.error.file", derbyLogFile.getAbsolutePath());
    }

    protected void configureFileSystem(Path basedir, HiveConf conf) throws IOException {
        setMetastoreProperty(METASTORECONNECTURLKEY.varname, metaStorageUrl + ";create=true");

        createAndSetFolderProperty(METASTOREWAREHOUSE, "warehouse", conf, basedir);
        createAndSetFolderProperty(SCRATCHDIR, "scratchdir", conf, basedir);
        createAndSetFolderProperty(LOCALSCRATCHDIR, "localscratchdir", conf, basedir);
        createAndSetFolderProperty(HIVEHISTORYFILELOC, "tmp", conf, basedir);

        createAndSetFolderProperty("hadoop.tmp.dir", "hadooptmp", conf, basedir);
        createAndSetFolderProperty("test.log.dir", "logs", conf, basedir);

        /*
         * Tez specific configurations below
         */
        /*
         * Tez will upload a hive-exec.jar to this location. It looks like it will do this only once per test suite so it
         * makes sense to keep this in a central location rather than in the tmp dir of each test.
         */
        File installationDir = newFolder(basedir, "tez_installation_dir").toFile();

        conf.setVar(HiveConf.ConfVars.HIVE_JAR_DIRECTORY, installationDir.getAbsolutePath());
        conf.setVar(HiveConf.ConfVars.HIVE_USER_INSTALL_DIR, installationDir.getAbsolutePath());
    }

    Path newFolder(Path basedir, String folder) throws IOException {
        Path newFolder = Files.createTempDirectory(basedir, folder);
        FileUtil.setPermission(newFolder.toFile(), FsPermission.getDirDefault());
        return newFolder;
    }

    @Override
    public HiveConf getHiveConf() {
        return hiveConf;
    }

    @Override
    public Path getBaseDir() {
        return basedir;
    }

    protected final void createAndSetFolderProperty(HiveConf.ConfVars var, String folder, HiveConf conf, Path basedir)
            throws IOException {
        setMetastoreProperty(var.varname, newFolder(basedir, folder).toAbsolutePath().toString());
    }

    protected final void createAndSetFolderProperty(String key, String folder, HiveConf conf, Path basedir)
            throws IOException {
        setMetastoreProperty(key, newFolder(basedir, folder).toAbsolutePath().toString());
    }

    protected final void setMetastoreProperty(String key, String value) {
        hiveConf.set(key, value);
        System.setProperty(key, value);
    }
}
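Two recurring ideas in the context above are (a) a throwaway in-memory Derby database per test run and (b) mirroring every metastore setting into JVM system properties (see `setMetastoreProperty`), since parts of the metastore read configuration from system properties rather than from `HiveConf`. A minimal standalone sketch of both, using a plain `Map` in place of `HiveConf` (class and method names here are illustrative, not part of HiveRunner):

```java
import java.util.HashMap;
import java.util.Map;
import java.util.UUID;

public class MetastoreConfigSketch {

    // Mimics StandaloneHiveServerContext: a unique in-memory Derby URL per run,
    // so each test suite starts with a completely empty metastore.
    static String newMetastoreUrl() {
        return "jdbc:derby:memory:" + UUID.randomUUID();
    }

    // Mimics setMetastoreProperty: the value goes both into the Hive config
    // map and into the JVM system properties.
    static void setMetastoreProperty(Map<String, String> hiveConf, String key, String value) {
        hiveConf.put(key, value);
        System.setProperty(key, value);
    }

    public static void main(String[] args) {
        Map<String, String> hiveConf = new HashMap<>();
        String url = newMetastoreUrl() + ";create=true";
        setMetastoreProperty(hiveConf, "javax.jdo.option.ConnectionURL", url);

        // Both views of the configuration agree.
        System.out.println(hiveConf.get("javax.jdo.option.ConnectionURL"));
        System.out.println(System.getProperty("javax.jdo.option.ConnectionURL"));
    }
}
```

The `;create=true` suffix tells Derby to create the database on first connect, matching `configureFileSystem` above.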


================================================
FILE: src/main/java/com/klarna/hiverunner/ThrowOnTimeout.java
================================================
/**
 * Copyright (C) 2013-2021 Klarna AB
 * Copyright (C) 2021 The HiveRunner Contributors
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 * http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */
package com.klarna.hiverunner;

import com.klarna.hiverunner.config.HiveRunnerConfig;
import org.apache.commons.lang.time.StopWatch;
import org.junit.rules.TestRule;
import org.junit.runner.Description;
import org.junit.runners.model.Statement;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class ThrowOnTimeout extends Statement {
    private static final Logger LOGGER = LoggerFactory.getLogger(ThrowOnTimeout.class);

    private final Statement originalStatement;

    private final HiveRunnerConfig config;
    private final Object target;

    private Throwable statementException;
    private boolean finished = false;

    public ThrowOnTimeout(Statement originalStatement, HiveRunnerConfig config, Object target) {
        this.originalStatement = originalStatement;
        this.config = config;
        this.target = target;
    }

    @Override
    public void evaluate() throws Throwable {
        /*
         * Reset the statementException before the test is run to prevent false errors during repeated execution.
         */
        statementException = null;
        final StopWatch stopWatch = new StopWatch();

        if (config.isTimeoutEnabled()) {
            LOGGER.info("Starting timeout monitoring ({}s) of test case {}.", config.getTimeoutSeconds(), target);
        }

        Thread statementThread = new Thread(new Runnable() {
            @Override
            public void run() {
                try {
                    stopWatch.start();
                    originalStatement.evaluate();
                    finished = true;
                } catch (InterruptedException e) {
                    // Ignore the InterruptedException
                    LOGGER.debug(e.getMessage(), e);
                } catch (Throwable e) {
                    synchronized (target) {
                        statementException = e;
                    }
                }
            }
        });

        statementThread.start();
        statementThread.join(config.getTimeoutSeconds() * 1000);

        synchronized (target) {
            if (statementException != null) {
                throw statementException;
            } else if (!finished) {
                if (config.isTimeoutEnabled()) {
                    statementThread.interrupt();
                    throw new TimeoutException(
                            String.format("test timed out after %d seconds", config.getTimeoutSeconds()));
                } else {
                    LOGGER.warn("Test ran for {} seconds with timeout disabled. See class {} for configuration options.",
                            stopWatch.getTime() / 1000, HiveRunnerConfig.class.getName());
                }
            }
        }

        statementThread.join();

        if (statementException != null) {
            throw statementException;
        }
    }

    public static TestRule create(final HiveRunnerConfig config, final Object target) {
        return new TestRule() {
            @Override
            public Statement apply(Statement base, Description description) {
                return new ThrowOnTimeout(base, config, target);
            }
        };
    }
}
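The core mechanism in ThrowOnTimeout is a watchdog: run the statement on a worker thread, join with a timeout, and interrupt the worker if it has not finished. A self-contained sketch of that pattern with plain threads (names are illustrative, not HiveRunner API):

```java
public class TimeoutWatchdogSketch {

    // Runs the task on a worker thread and reports whether the watchdog fired.
    // Mirrors ThrowOnTimeout#evaluate: join with a timeout, then interrupt the
    // worker if it is still alive.
    static boolean timesOut(Runnable task, long timeoutMillis) {
        Thread worker = new Thread(task);
        worker.start();
        try {
            worker.join(timeoutMillis);
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
        if (worker.isAlive()) {
            worker.interrupt();
            return true;
        }
        return false;
    }

    public static void main(String[] args) {
        // A no-op finishes well within the timeout.
        System.out.println("fast timed out: " + timesOut(() -> { }, 1000));

        // A long sleep trips the watchdog; the interrupt wakes the worker up.
        Runnable slow = () -> {
            try {
                Thread.sleep(60_000);
            } catch (InterruptedException ignored) {
                // Interrupted by the watchdog; just exit.
            }
        };
        System.out.println("slow timed out: " + timesOut(slow, 50));
    }
}
```

The real class additionally synchronizes on the test target to publish any Throwable from the worker thread back to the JUnit thread.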

================================================
FILE: src/main/java/com/klarna/hiverunner/TimeoutException.java
================================================
/**
 * Copyright (C) 2013-2021 Klarna AB
 * Copyright (C) 2021 The HiveRunner Contributors
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 * http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */
package com.klarna.hiverunner;

public class TimeoutException extends RuntimeException {

    private static final long serialVersionUID = 1L;

    public TimeoutException() {
        super();
    }

    public TimeoutException(String message) {
        super(message);
    }

    public TimeoutException(String message, Throwable cause) {
        super(message, cause);
    }

    public TimeoutException(Throwable cause) {
        super(cause);
    }

    protected TimeoutException(String message, Throwable cause, boolean enableSuppression, boolean writableStackTrace) {
        super(message, cause, enableSuppression, writableStackTrace);
    }
}


================================================
FILE: src/main/java/com/klarna/hiverunner/annotations/HiveProperties.java
================================================
/**
 * Copyright (C) 2013-2021 Klarna AB
 * Copyright (C) 2021 The HiveRunner Contributors
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 * http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */
package com.klarna.hiverunner.annotations;

import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;

/**
 * Marks a field to contain properties that will be appended to the HiveConf.
 * <p>
 * The field should be of type <pre>Map&lt;String, String&gt;</pre>.
 * </p><p>
 * Please refer to test class {@code com.klarna.hiverunner.examples.HelloHiveRunnerTest} for usage examples.
 * </p>
 */
@Retention(RetentionPolicy.RUNTIME)
public @interface HiveProperties {
}


================================================
FILE: src/main/java/com/klarna/hiverunner/annotations/HiveResource.java
================================================
/**
 * Copyright (C) 2013-2021 Klarna AB
 * Copyright (C) 2021 The HiveRunner Contributors
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 * http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */
package com.klarna.hiverunner.annotations;

import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;

/**
 * Marks a field to contain test data input. The field may be of type String, File or Path.
 * The data will be copied into the specified target file by the HiveRunner engine.
 * <p>
 * Please refer to test class {@code com.klarna.hiverunner.examples.HelloHiveRunnerTest} for usage examples.
 * </p>
 */
@Retention(RetentionPolicy.RUNTIME)
public @interface HiveResource {

    /**
     * Specifies where the data should be made available in HDFS.
     * <p>
     * Please refer to test class {@code com.klarna.hiverunner.examples.HelloHiveRunnerTest} for usage examples.
     * </p>
     */
    String targetFile();
}


================================================
FILE: src/main/java/com/klarna/hiverunner/annotations/HiveRunnerSetup.java
================================================
/**
 * Copyright (C) 2013-2021 Klarna AB
 * Copyright (C) 2021 The HiveRunner Contributors
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 * http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */
package com.klarna.hiverunner.annotations;

import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;

/**
 * Annotates a field that configures the HiveRunner runtime.
 * So far, only fields of type {@link com.klarna.hiverunner.config.HiveRunnerConfig} are supported.
 */
@Retention(RetentionPolicy.RUNTIME)
public @interface HiveRunnerSetup {
}


================================================
FILE: src/main/java/com/klarna/hiverunner/annotations/HiveSQL.java
================================================
/**
 * Copyright (C) 2013-2021 Klarna AB
 * Copyright (C) 2021 The HiveRunner Contributors
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 * http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */
package com.klarna.hiverunner.annotations;

import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;

/**
 * Marks a field (of type HiveShell) in a unit test. This annotated field is mandatory.
 * The HiveRunner will set the HiveShell instance before each test method is called.
 * <p>
 * Please refer to test class {@code com.klarna.hiverunner.examples.HelloHiveRunnerTest} for usage examples.
 * </p>
 */
@Retention(RetentionPolicy.RUNTIME)
public @interface HiveSQL {

    /**
     * The Hive SQL files under test. Files will be executed in order.
     */
    String[] files();

    /**
     * Whether the shell should be started automatically before the JUnit test method is called.
     * <p>
     * If set to false, the tester may do additional setup in @BeforeEach (JUnit 5), @Before (JUnit 4) or within the
     * actual test method. However, HiveShell.start() must then be called explicitly once setup is done.
     * </p>
     */
    boolean autoStart() default true;

    /**
     * The encoding of the given files. Will default to java.nio.charset.Charset#defaultCharset
     */
    String encoding() default "";

}
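Runtime-retained annotations like the one above are discovered reflectively by the runner at test setup time. A self-contained sketch of that discovery step, using a locally defined stand-in annotation (FilesUnderTest is a hypothetical stand-in, not the real @HiveSQL):

```java
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.reflect.Field;
import java.util.Arrays;

public class AnnotationScanSketch {

    // Stand-in for @HiveSQL: RUNTIME retention so reflection can see it.
    @Retention(RetentionPolicy.RUNTIME)
    @interface FilesUnderTest {
        String[] files();
    }

    // A fake test class carrying the annotated field.
    static class FakeTest {
        @FilesUnderTest(files = {"a.sql", "b.sql"})
        Object shell;
    }

    // Finds the first field carrying the annotation, as a runner would.
    static String[] scriptsFor(Class<?> testClass) {
        for (Field field : testClass.getDeclaredFields()) {
            FilesUnderTest ann = field.getAnnotation(FilesUnderTest.class);
            if (ann != null) {
                return ann.files();
            }
        }
        throw new IllegalStateException("No @FilesUnderTest field found");
    }

    public static void main(String[] args) {
        System.out.println(Arrays.toString(scriptsFor(FakeTest.class)));
        // → [a.sql, b.sql]
    }
}
```

Without RetentionPolicy.RUNTIME the annotation would be dropped by the compiler or class loader and `getAnnotation` would return null, which is why all HiveRunner annotations declare it.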


================================================
FILE: src/main/java/com/klarna/hiverunner/annotations/HiveSetupScript.java
================================================
/**
 * Copyright (C) 2013-2021 Klarna AB
 * Copyright (C) 2021 The HiveRunner Contributors
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 * http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */
package com.klarna.hiverunner.annotations;

import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;

/**
 * Marks a field to refer to a setup script. The field should be of type String, File or Path.
 * If it is a String, the value of the field should be the actual script, not a path.
 * <p>
 * Please refer to test class {@code com.klarna.hiverunner.examples.HelloHiveRunnerTest} for usage examples.
 * </p>
 */
@Retention(RetentionPolicy.RUNTIME)
public @interface HiveSetupScript {

}


================================================
FILE: src/main/java/com/klarna/hiverunner/builder/HiveResource.java
================================================
/**
 * Copyright (C) 2013-2021 Klarna AB
 * Copyright (C) 2021 The HiveRunner Contributors
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 * http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */
package com.klarna.hiverunner.builder;

import org.apache.commons.lang.builder.ToStringBuilder;

import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;

/**
 * Representation of a resource configuration
 */
class HiveResource {
    private final String targetFile;
    private final ByteArrayOutputStream byteArrayOutputStream;

    HiveResource(String targetFile) throws IOException {
        this(targetFile, new ByteArrayOutputStream());
    }

    HiveResource(String targetFile, Path dataFile) throws IOException {
        this(targetFile, createOutputStream(Files.readAllBytes(dataFile)));
    }

    HiveResource(String targetFile, String data) throws IOException {
        this(targetFile, createOutputStream(data.getBytes(StandardCharsets.UTF_8)));
    }

    private HiveResource(String targetFile, ByteArrayOutputStream byteArrayOutputStream) {
        this.targetFile = targetFile;
        this.byteArrayOutputStream = byteArrayOutputStream;
    }

    private static ByteArrayOutputStream createOutputStream(byte[] data) throws IOException {
        ByteArrayOutputStream baos = new ByteArrayOutputStream();
        baos.write(data);
        baos.close();
        return baos;
    }

    String getTargetFile() {
        return targetFile;
    }

    @Override
    public String toString() {
        return ToStringBuilder.reflectionToString(this);
    }

    public ByteArrayOutputStream getOutputStream() {
        return byteArrayOutputStream;
    }

}
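The resource builder above buffers all test data in memory before it is copied into the shell's file system. A minimal pure-JDK sketch of the same buffering (class and method names are illustrative):

```java
import java.io.ByteArrayOutputStream;
import java.nio.charset.StandardCharsets;

public class ResourceBufferSketch {

    // Mirrors HiveResource#createOutputStream: copy the data into an
    // in-memory stream. The three-argument write never throws, so no
    // checked exception leaks out; closing a ByteArrayOutputStream is
    // a no-op anyway.
    static ByteArrayOutputStream buffer(byte[] data) {
        ByteArrayOutputStream baos = new ByteArrayOutputStream();
        baos.write(data, 0, data.length);
        return baos;
    }

    public static void main(String[] args) {
        // Tab-separated row data, as a HiveRunner test resource might hold it.
        ByteArrayOutputStream baos = buffer("k1\tv1\n".getBytes(StandardCharsets.UTF_8));
        System.out.println(baos.size());
        // → 6
    }
}
```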


================================================
FILE: src/main/java/com/klarna/hiverunner/builder/HiveRunnerScript.java
================================================
/**
 * Copyright (C) 2013-2021 Klarna AB
 * Copyright (C) 2021 The HiveRunner Contributors
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 * http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */
package com.klarna.hiverunner.builder;

import java.nio.file.Path;

public class HiveRunnerScript implements Script {

    private Path path;
    private String sqlText;
    private int index;

    public HiveRunnerScript(int index, Path path, String sqlText) {
        this.index = index;
        this.path = path;
        this.sqlText = sqlText;
    }

    @Override
    public int getIndex() {
        return index;
    }

    /* (non-Javadoc)
     * @see com.klarna.hiverunner.builder.Script#getPath()
     */
    @Override
    public Path getPath() {
        return path;
    }

    /* (non-Javadoc)
     * @see com.klarna.hiverunner.builder.Script#getSql()
     */
    @Override
    public String getSql() {
        return sqlText;
    }

    @Override
    public int hashCode() {
        final int prime = 31;
        int result = 1;
        result = prime * result + index;
        result = prime * result + ((path == null) ? 0 : path.hashCode());
        result = prime * result + ((sqlText == null) ? 0 : sqlText.hashCode());
        return result;
    }

    @Override
    public boolean equals(Object obj) {
        if (this == obj)
            return true;
        if (obj == null)
            return false;
        if (getClass() != obj.getClass())
            return false;
        HiveRunnerScript other = (HiveRunnerScript) obj;
        if (index != other.index)
            return false;
        if (path == null) {
            if (other.path != null)
                return false;
        } else if (!path.equals(other.path))
            return false;
        if (sqlText == null) {
            if (other.sqlText != null)
                return false;
        } else if (!sqlText.equals(other.sqlText))
            return false;
        return true;
    }

    @Override
    public String toString() {
        return "HiveRunnerScript [path=" + path + ", sqlText=" + sqlText + ", index=" + index + "]";
    }

}
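HiveRunnerScript is a value object: two scripts with the same index, path and SQL text compare equal, which matters wherever scripts are compared or deduplicated. The hand-rolled null checks above can be collapsed with `java.util.Objects`; a trimmed-down standalone sketch (ScriptKey is illustrative, not HiveRunner API):

```java
import java.util.Objects;

public class ValueObjectSketch {

    // A simplified HiveRunnerScript equivalent using java.util.Objects,
    // which replaces the explicit null handling with two library calls.
    static final class ScriptKey {
        final int index;
        final String sqlText;

        ScriptKey(int index, String sqlText) {
            this.index = index;
            this.sqlText = sqlText;
        }

        @Override
        public boolean equals(Object obj) {
            if (this == obj) {
                return true;
            }
            if (!(obj instanceof ScriptKey)) {
                return false;
            }
            ScriptKey other = (ScriptKey) obj;
            return index == other.index && Objects.equals(sqlText, other.sqlText);
        }

        @Override
        public int hashCode() {
            return Objects.hash(index, sqlText);
        }
    }

    public static void main(String[] args) {
        ScriptKey a = new ScriptKey(0, "SELECT 1");
        ScriptKey b = new ScriptKey(0, "SELECT 1");
        // Equal fields imply equal objects and equal hash codes.
        System.out.println(a.equals(b) && a.hashCode() == b.hashCode());
        // → true
    }
}
```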


================================================
FILE: src/main/java/com/klarna/hiverunner/builder/HiveShellBase.java
================================================
/**
 * Copyright (C) 2013-2021 Klarna AB
 * Copyright (C) 2021 The HiveRunner Contributors
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 * http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */
package com.klarna.hiverunner.builder;

import java.io.ByteArrayOutputStream;
import java.io.File;
import java.io.IOException;
import java.io.OutputStream;
import java.nio.charset.Charset;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.nio.file.StandardOpenOption;
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

import org.apache.hadoop.hive.conf.HiveConf;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

import com.google.common.base.Joiner;
import com.google.common.base.Preconditions;
import com.klarna.hiverunner.HiveServerContainer;
import com.klarna.hiverunner.HiveShell;
import com.klarna.hiverunner.data.InsertIntoTable;
import com.klarna.hiverunner.sql.StatementLexer;
import com.klarna.hiverunner.sql.cli.CommandShellEmulator;
import com.klarna.hiverunner.sql.split.StatementSplitter;

/**
 * HiveShell implementation delegating to HiveServerContainer
 */
class HiveShellBase implements HiveShell {

    private static final Logger LOGGER = LoggerFactory.getLogger(HiveShellBase.class);
    private static final String DEFAULT_NULL_REPRESENTATION = "NULL";
    private static final String DEFAULT_ROW_VALUE_DELIMTER = "\t";

    protected boolean started = false;

    protected final HiveServerContainer hiveServerContainer;

    protected final Map<String, String> hiveConf;
    protected final Map<String, String> hiveVars;
    protected final List<String> setupScripts;
    protected final List<HiveResource> resources;
    protected final List<Script> scriptsUnderTest;
    protected final CommandShellEmulator commandShellEmulator;
    protected StatementLexer lexer;
    protected Path cwd;

    HiveShellBase(HiveServerContainer hiveServerContainer, Map<String, String> hiveConf, List<String> setupScripts,
                  List<HiveResource> resources, List<Script> scriptsUnderTest, CommandShellEmulator commandShellEmulator) {
        this.hiveServerContainer = hiveServerContainer;
        this.hiveConf = hiveConf;
        this.commandShellEmulator = commandShellEmulator;
        this.setupScripts = new ArrayList<>(setupScripts);
        this.resources = new ArrayList<>(resources);
        this.scriptsUnderTest = new ArrayList<>(scriptsUnderTest);
        hiveVars = new HashMap<>();
        cwd = Paths.get(System.getProperty("user.dir"));
    }

    @Override
    public List<String> executeQuery(String hiveSql) {
        return executeQuery(hiveSql, DEFAULT_ROW_VALUE_DELIMTER, DEFAULT_NULL_REPRESENTATION);
    }

    @Override
    public List<String> executeQuery(String hiveSql, String rowValuesDelimitedBy, String replaceNullWith) {
        assertStarted();

        List<Object[]> resultSet = executeStatement(hiveSql);
        List<String> result = new ArrayList<>();
        for (Object[] objects : resultSet) {
            result.add(Joiner.on(rowValuesDelimitedBy).useForNull(replaceNullWith).join(objects));
        }
        return result;
    }

    @Override
    public List<Object[]> executeStatement(String hiveSql) {
        assertStarted();
        return executeStatementWithCommandShellEmulation(hiveSql);
    }

    private void executeScriptWithCommandShellEmulation(String script) {
        List<String> statements = lexer.applyToScript(script);
        executeStatementsWithCommandShellEmulation(statements);
    }

    private List<Object[]> executeStatementWithCommandShellEmulation(String statement) {
        List<String> statements = lexer.applyToStatement(statement);
        return executeStatementsWithCommandShellEmulation(statements);
    }

    private List<Object[]> executeStatementsWithCommandShellEmulation(List<String> hiveSqlStatements) {
        List<Object[]> results = new ArrayList<>();
        for (String hiveSqlStatement : hiveSqlStatements) {
            results.addAll(hiveServerContainer.executeStatement(hiveSqlStatement));
        }
        return results;
    }

    @Override
    public void execute(String hiveSql) {
        assertStarted();
        executeScriptWithCommandShellEmulation(hiveSql);
    }

    @Override
    public void execute(File file) {
        assertStarted();
        execute(Charset.defaultCharset(), file);
    }

    @Override
    public void execute(Path path) {
        assertStarted();
        execute(Charset.defaultCharset(), path);
    }

    @Override
    public void execute(Charset charset, File file) {
        assertStarted();
        execute(charset, Paths.get(file.toURI()));
    }

    @Override
    public void execute(Charset charset, Path path) {
        assertStarted();
        assertFileExists(path);
        List<String> hiveSqlStatements = lexer.applyToPath(path);
        executeStatementsWithCommandShellEmulation(hiveSqlStatements);
    }

    @Override
    public void start() {
        assertNotStarted();
        started = true;

        lexer = new StatementLexer(cwd, Charset.defaultCharset(), commandShellEmulator);

        hiveServerContainer.init(hiveConf, hiveVars);

        executeSetupScripts();

        prepareResources();

        executeScriptsUnderTest();
    }

    @Override
    public void addSetupScript(String script) {
        assertNotStarted();
        setupScripts.add(script);
    }

    @Override
    public void addSetupScripts(Charset charset, Path... scripts) {
        assertNotStarted();
        for (Path script : scripts) {
            assertFileExists(script);
            try {
                String setupScript = new String(Files.readAllBytes(script), charset);
                setupScripts.add(setupScript);
            } catch (IOException e) {
                throw new IllegalArgumentException(
                        "Unable to read setup script file '" + script + "': " + e.getMessage(), e);
            }
        }
    }

    @Override
    public void addSetupScripts(Charset charset, File... scripts) {
        Path[] paths = new Path[scripts.length];
        for (int i = 0; i < paths.length; i++) {
            paths[i] = Paths.get(scripts[i].toURI());
        }
        addSetupScripts(charset, paths);
    }

    @Override
    public void addSetupScripts(File... scripts) {
        addSetupScripts(Charset.defaultCharset(), scripts);
    }

    @Override
    public void addSetupScripts(Path... scripts) {
        addSetupScripts(Charset.defaultCharset(), scripts);
    }

    @Override
    public Path getBaseDir() {
        return hiveServerContainer.getBaseDir();
    }

    @Override
    public String expandVariableSubstitutes(String expression) {
        assertStarted();
        HiveConf hiveConf = getHiveConf();
        Preconditions.checkNotNull(hiveConf);
        return hiveServerContainer.getVariableSubstitution().substitute(hiveConf, expression);
    }

    @Override
    public void setProperty(String key, String value) {
        setHiveConfValue(key, value);
    }

    @Override
    public void setHiveConfValue(String key, String value) {
        assertNotStarted();
        hiveConf.put(key, value);
    }

    @Override
    public HiveConf getHiveConf() {
        assertStarted();
        return hiveServerContainer.getHiveConf();
    }

    @Override
    public OutputStream getResourceOutputStream(String targetFile) {
        try {
            assertNotStarted();
            HiveResource resource = new HiveResource(targetFile);
            resources.add(resource);
            return createPreStartOutputStream(resource.getOutputStream());
        } catch (IOException e) {
            throw new IllegalStateException(e.getMessage(), e);
        }
    }

    @Override
    public void setHiveVarValue(String var, String value) {
        assertNotStarted();
        hiveVars.put(var, value);
    }

    @Override
    public void addResource(String targetFile, String data) {
        try {
            assertNotStarted();
            resources.add(new HiveResource(targetFile, data));
        } catch (IOException e) {
            throw new IllegalStateException(e.getMessage(), e);
        }
    }

    @Override
    public void addResource(String targetFile, Path sourceFile) {
        try {
            assertNotStarted();
            assertFileExists(sourceFile);
            resources.add(new HiveResource(targetFile, sourceFile));
        } catch (IOException e) {
            throw new IllegalStateException(e.getMessage(), e);
        }
    }

    @Override
    public void addResource(String targetFile, File sourceFile) {
        addResource(targetFile, Paths.get(sourceFile.toURI()));
    }

    @Override
    public InsertIntoTable insertInto(String databaseName, String tableName) {
        assertStarted();
        return InsertIntoTable.newInstance(databaseName, tableName, getHiveConf());
    }

    private void executeSetupScripts() {
        for (String setupScript : setupScripts) {
            LOGGER.debug("Executing script: " + setupScript);
            executeScriptWithCommandShellEmulation(setupScript);
        }
    }

    private void prepareResources() {
        for (HiveResource resource : resources) {
            String expandedPath = hiveServerContainer.expandVariableSubstitutes(resource.getTargetFile());

            assertResourcePreconditions(resource, expandedPath);

            Path targetFile = Paths.get(expandedPath);

            // Create target file in the tmp dir and write test data to it.
            try {
                Files.createDirectories(targetFile.getParent());
                try (OutputStream targetFileOutputStream =
                        Files.newOutputStream(targetFile, StandardOpenOption.CREATE_NEW)) {
                    targetFileOutputStream.write(resource.getOutputStream().toByteArray());
                }
            } catch (IOException e) {
                throw new IllegalStateException("Failed to create resource target file: " + targetFile + " ("
                        + resource.getTargetFile() + "): " + e.getMessage(), e);
            }

            LOGGER.debug("Created hive resource " + targetFile);

        }
    }

    private void executeScriptsUnderTest() {
        for (Script script : scriptsUnderTest) {
            try {
                executeScriptWithCommandShellEmulation(script.getSql());
            } catch (Exception e) {
                throw new IllegalStateException("Failed to executeScript '" + script + "': " + e.getMessage(), e);
            }
        }
    }

    protected final void assertResourcePreconditions(HiveResource resource, String expandedPath) {
        String unexpandedPropertyPattern = ".*\\$\\{.*\\}.*";
        boolean isFullyExpanded = !expandedPath.matches(unexpandedPropertyPattern);

        Preconditions.checkArgument(isFullyExpanded,
                "File path %s contains unresolved references. Original arg was: %s", expandedPath,
                resource.getTargetFile());

        boolean isTargetFileWithinTestDir = expandedPath
                .startsWith(hiveServerContainer.getBaseDir().toString());

        Preconditions.checkArgument(isTargetFileWithinTestDir,
                "All resource target files should be created in a subdirectory of the test case basedir %s : %s",
                hiveServerContainer.getBaseDir(), resource.getTargetFile());
    }

    protected final void assertFileExists(Path file) {
        Preconditions.checkNotNull(file, "File argument is null");
        Preconditions.checkArgument(Files.exists(file), "File %s does not exist", file);
        Preconditions.checkArgument(Files.isRegularFile(file), "%s is not a file", file);
    }

    protected final void assertNotStarted() {
        Preconditions.checkState(!started, "HiveShell was already started");
    }

    protected final void assertStarted() {
        Preconditions.checkState(started, "HiveShell was not started");
    }

    private OutputStream createPreStartOutputStream(ByteArrayOutputStream resourceOutputStream) {
        return new OutputStream() {
            @Override
            public void write(int b) throws IOException {
                // It should not be possible to write to the stream after the
                // shell has been started.
                assertNotStarted();
                resourceOutputStream.write(b);
            }
        };
    }

    @Override
    public List<String> executeQuery(File script) {
        return executeQuery(Charset.defaultCharset(), script);
    }

    @Override
    public List<String> executeQuery(Path script) {
        return executeQuery(Charset.defaultCharset(), script);
    }

    @Override
    public List<String> executeQuery(Charset charset, File script) {
        return executeQuery(charset, script, DEFAULT_ROW_VALUE_DELIMTER, DEFAULT_NULL_REPRESENTATION);
    }

    @Override
    public List<String> executeQuery(Charset charset, Path script) {
        return executeQuery(charset, script, DEFAULT_ROW_VALUE_DELIMTER, DEFAULT_NULL_REPRESENTATION);
    }

    @Override
    public List<String> executeQuery(File script, String rowValuesDelimitedBy, String replaceNullWith) {
        return executeQuery(Charset.defaultCharset(), script, rowValuesDelimitedBy, replaceNullWith);
    }

    @Override
    public List<String> executeQuery(Path script, String rowValuesDelimitedBy, String replaceNullWith) {
        return executeQuery(Charset.defaultCharset(), script, rowValuesDelimitedBy, replaceNullWith);
    }

    @Override
    public List<String> executeQuery(Charset charset, File script, String rowValuesDelimitedBy,
                                     String replaceNullWith) {
        return executeQuery(charset, Paths.get(script.toURI()), rowValuesDelimitedBy, replaceNullWith);
    }

    public List<Script> getScriptsUnderTest() {
        return scriptsUnderTest;
    }

    @Override
    public List<String> executeQuery(Charset charset, Path script, String rowValuesDelimitedBy,
                                     String replaceNullWith) {
        assertStarted();
        assertFileExists(script);
        try {
            String statements = new String(Files.readAllBytes(script), charset);
            List<Statement> splitStatements = new StatementSplitter(commandShellEmulator).split(statements);
            if (splitStatements.size() != 1) {
                throw new IllegalArgumentException("Script '" + script + "' must contain a single valid statement.");
            }
            Statement statement = splitStatements.get(0);
            return executeQuery(statement.getSql(), rowValuesDelimitedBy, replaceNullWith);
        } catch (IOException e) {
            throw new IllegalArgumentException("Unable to read script file '" + script + "': " + e.getMessage(),
                    e);
        }
    }

    @Override
    public void setCwd(Path cwd) {
        assertNotStarted();
        this.cwd = cwd;
    }

    @Override
    public Path getCwd() {
        return cwd;
    }
}
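The `executeQuery` overloads in `HiveShellBase` flatten each `Object[]` result row into a single delimited string via Guava's `Joiner.on(delimiter).useForNull(nullRepresentation)`. A minimal stdlib-only sketch of that flattening step (the `RowJoinSketch` class and `joinRow` method are hypothetical names, not part of HiveRunner):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.StringJoiner;

public class RowJoinSketch {

    // Stdlib stand-in for Joiner.on(delimiter).useForNull(nullRepresentation).join(row)
    static String joinRow(Object[] row, String delimiter, String nullRepresentation) {
        StringJoiner joiner = new StringJoiner(delimiter);
        for (Object cell : row) {
            joiner.add(cell == null ? nullRepresentation : String.valueOf(cell));
        }
        return joiner.toString();
    }

    public static void main(String[] args) {
        // Mimic a two-row result set with a NULL column value, as executeQuery would see it
        List<Object[]> resultSet = List.of(
                new Object[] { "a", 1, null },
                new Object[] { "b", 2, 3.5 });
        List<String> result = new ArrayList<>();
        for (Object[] row : resultSet) {
            result.add(joinRow(row, "\t", "NULL"));
        }
        System.out.println(result);
    }
}
```

This mirrors why `executeQuery(hiveSql)` delegates to the three-argument overload with default delimiter and null-representation values.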


================================================
FILE: src/main/java/com/klarna/hiverunner/builder/HiveShellBuilder.java
================================================
/**
 * Copyright (C) 2013-2021 Klarna AB
 * Copyright (C) 2021 The HiveRunner Contributors
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 * http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */
package com.klarna.hiverunner.builder;

import com.google.common.base.Preconditions;
import com.klarna.hiverunner.HiveServerContainer;
import com.klarna.hiverunner.HiveShellContainer;
import com.klarna.hiverunner.sql.cli.CommandShellEmulator;
import com.klarna.hiverunner.sql.cli.hive.HiveCliEmulator;

import java.io.IOException;
import java.nio.charset.Charset;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

/**
 * Builds a HiveShell.
 */
public class HiveShellBuilder {
    private List<Script> scriptsUnderTest = new ArrayList<>();
    private final Map<String, String> props = new HashMap<>();
    private HiveServerContainer hiveServerContainer;
    private final List<HiveResource> resources = new ArrayList<>();
    private final List<String> setupScripts = new ArrayList<>();
    private CommandShellEmulator commandShellEmulator = HiveCliEmulator.INSTANCE;

    public void setHiveServerContainer(HiveServerContainer hiveServerContainer) {
        this.hiveServerContainer = hiveServerContainer;
    }

    public void putAllProperties(Map<String, String> props) {
        this.props.putAll(props);
    }

    public void addSetupScript(String script) {
        this.setupScripts.add(script);
    }

    public void addResource(String targetFile, Path dataFile) throws IOException {
        resources.add(new HiveResource(targetFile, dataFile));
    }

    public void addResource(String targetFile, String data) throws IOException {
        resources.add(new HiveResource(targetFile, data));
    }

    public void setScriptsUnderTest(List<Path> scriptPaths, Charset charset) {
        scriptsUnderTest.addAll(fromScriptPaths(scriptPaths, charset));
    }

    public List<Script> fromScriptPaths(List<Path> scriptPaths, Charset charset) {
        List<Script> scripts = new ArrayList<>();
        int index = 0;
        for (Path path : scriptPaths) {
            Preconditions.checkState(Files.exists(path), "File %s does not exist", path);
            try {
                String sqlText = new String(Files.readAllBytes(path), charset);
                scripts.add(new HiveRunnerScript(index++, path, sqlText));
            } catch (IOException e) {
                throw new IllegalArgumentException("Failed to load script file '" + path + "': " + e.getMessage(), e);
            }
        }
        return scripts;
    }

    public void setCommandShellEmulation(CommandShellEmulator commandShellEmulator) {
        this.commandShellEmulator = commandShellEmulator;
    }

    public HiveShellContainer buildShell() {
        return new HiveShellTearable(hiveServerContainer, props, setupScripts, resources, scriptsUnderTest, commandShellEmulator);
    }

    public void overrideScriptsUnderTest(List<? extends Script> scripts) {
        scriptsUnderTest = new ArrayList<>(scripts);
    }
}
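`HiveShellBuilder.fromScriptPaths` assigns each script file a position-based index and fails fast on missing files. A self-contained JDK-only sketch of that loading loop (the `ScriptLoadSketch` class and `LoadedScript` record are hypothetical stand-ins for `HiveRunnerScript`):

```java
import java.io.IOException;
import java.nio.charset.Charset;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.ArrayList;
import java.util.List;

public class ScriptLoadSketch {

    // Hypothetical stand-in for HiveRunnerScript: index, origin path and SQL text
    record LoadedScript(int index, Path path, String sql) {}

    static List<LoadedScript> load(List<Path> scriptPaths, Charset charset) throws IOException {
        List<LoadedScript> scripts = new ArrayList<>();
        int index = 0;
        for (Path path : scriptPaths) {
            if (!Files.exists(path)) {
                throw new IllegalArgumentException("File " + path + " does not exist");
            }
            // Read the whole file and tag it with its position among the scripts under test
            String sqlText = new String(Files.readAllBytes(path), charset);
            scripts.add(new LoadedScript(index++, path, sqlText));
        }
        return scripts;
    }

    public static void main(String[] args) throws IOException {
        Path tmp = Files.createTempFile("script", ".sql");
        Files.writeString(tmp, "SELECT 1;");
        List<LoadedScript> scripts = load(List.of(tmp), StandardCharsets.UTF_8);
        System.out.println(scripts.get(0).index() + ": " + scripts.get(0).sql());
        Files.delete(tmp);
    }
}
```

The index preserves execution order, which matters because `HiveShellBase.executeScriptsUnderTest` runs the scripts sequentially.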



================================================
FILE: src/main/java/com/klarna/hiverunner/builder/HiveShellTearable.java
================================================
/**
 * Copyright (C) 2013-2021 Klarna AB
 * Copyright (C) 2021 The HiveRunner Contributors
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 * http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */
package com.klarna.hiverunner.builder;

import com.klarna.hiverunner.HiveServerContainer;
import com.klarna.hiverunner.HiveShellContainer;
import com.klarna.hiverunner.sql.cli.CommandShellEmulator;

import java.util.List;
import java.util.Map;

/**
 * HiveShellContainer implementation that does a full tear-down of the Hive server after the test method has executed.
 */
class HiveShellTearable extends HiveShellBase implements HiveShellContainer {

    HiveShellTearable(HiveServerContainer hiveServerContainer, Map<String, String> hiveConf,
                      List<String> setupScripts, List<HiveResource> resources,
                      List<Script> scriptsUnderTest, CommandShellEmulator commandShellEmulator) {
        super(hiveServerContainer, hiveConf, setupScripts, resources, scriptsUnderTest, commandShellEmulator);
    }

    @Override
    public void tearDown() {
        hiveServerContainer.tearDown();
    }
}


================================================
FILE: src/main/java/com/klarna/hiverunner/builder/Script.java
================================================
/**
 * Copyright (C) 2013-2021 Klarna AB
 * Copyright (C) 2021 The HiveRunner Contributors
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 * http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */
package com.klarna.hiverunner.builder;

import java.nio.file.Path;

public interface Script {

    /**
     * Index of this script within all scripts in the source.
     */
    int getIndex();

    Path getPath();

    String getSql();

}


================================================
FILE: src/main/java/com/klarna/hiverunner/builder/Statement.java
================================================
/**
 * Copyright (C) 2013-2021 Klarna AB
 * Copyright (C) 2021 The HiveRunner Contributors
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 * http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */
package com.klarna.hiverunner.builder;

public interface Statement {

    /**
     * Index of this statement within all statements of the script.
     */
    int getIndex();

    /**
     * Original SQL of the statement.
     */
    String getSql();

}


================================================
FILE: src/main/java/com/klarna/hiverunner/config/HiveRunnerConfig.java
================================================
/**
 * Copyright (C) 2013-2021 Klarna AB
 * Copyright (C) 2021 The HiveRunner Contributors
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 * http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */
package com.klarna.hiverunner.config;


import java.util.HashMap;
import java.util.Map;
import java.util.Properties;

import org.apache.hadoop.hive.conf.HiveConf;

import com.klarna.hiverunner.sql.cli.CommandShellEmulator;
import com.klarna.hiverunner.sql.cli.CommandShellEmulatorFactory;
import com.klarna.hiverunner.sql.cli.hive.HiveCliEmulator;


/**
 * HiveRunner runtime configuration.
 *
 * Configure with System properties via mvn like
 * <pre>
 * &lt;plugin&gt;
 *      &lt;groupId&gt;org.apache.maven.plugins&lt;/groupId&gt;
 *      &lt;artifactId&gt;maven-surefire-plugin&lt;/artifactId&gt;
 *      &lt;version&gt;2.17&lt;/version&gt;
 *      &lt;configuration>
 *          ...
 *          &lt;systemProperties&gt;
 *              &lt;hiveconf_any.hive.conf&gt;1000&lt;/hiveconf_any.hive.conf&gt;
 *              &lt;enableTimeout&gt;false&lt;/enableTimeout&gt;
 *              &lt;timeoutSeconds&gt;30&lt;/timeoutSeconds&gt;
 *              &lt;timeoutRetries&gt;2&lt;/timeoutRetries&gt;
 *              &lt;commandShellEmulation&gt;BEELINE&lt;/commandShellEmulation&gt;
 *          &lt;/systemProperties&gt;
 *      &lt;/configuration&gt;
 * &lt;/plugin&gt;
 * </pre>
 *
 * Properties may be overridden per test class by annotating a <b>static</b> HiveRunnerConfig field like:
 * <pre>
 *      &#064;HiveRunnerSetup
 *      public final static HiveRunnerConfig config = new HiveRunnerConfig(){{
 *          setTimeoutEnabled(true);
 *          setTimeoutSeconds(15);
 *          setTimeoutRetries(2);
 *          setCommandShellEmulation(CommandShellEmulation.BEELINE);
 *      }};
 * </pre>
 *
 * See the test class {@code com.klarna.hiverunner.DisabledTimeoutTest} for more information.
 */
public class HiveRunnerConfig {

    /**
     * Enable timeout. Some versions of Tez have proven not to always terminate. By enabling timeout,
     * HiveRunner will kill the current query and attempt to retry the test case a configurable number of times.
     *
     * Defaults to disabled
     */
    public static final String ENABLE_TIMEOUT_PROPERTY_NAME = "enableTimeout";
    public static final boolean ENABLE_TIMEOUT_DEFAULT = false;

    /**
     * Seconds to wait for a query to terminate before triggering the timeout.
     *
     * Defaults to 30 seconds
     */
    public static final String TIMEOUT_SECONDS_PROPERTY_NAME = "timeoutSeconds";
    public static final int TIMEOUT_SECONDS_DEFAULT = 30;

    /**
     * Number of retries for a test case that keeps timing out.
     *
     * Defaults to 2 retries
     */
    public static final String TIMEOUT_RETRIES_PROPERTY_NAME = "timeoutRetries";
    public static final int TIMEOUT_RETRIES_DEFAULT = 2;

    /**
     * Prefix used to flag a system property as a hiveconf setting.
     */
    public static final String HIVECONF_SYSTEM_OVERRIDE_PREFIX = "hiveconf_";

    /**
     * The shell's {@link CommandShellEmulator}.
     *
     * Defaults to {@code HIVE_CLI}
     */
    public static final String COMMAND_SHELL_EMULATOR_PROPERTY_NAME = "commandShellEmulator";
    public static final String COMMAND_SHELL_EMULATOR_DEFAULT = HiveCliEmulator.INSTANCE.getName();

    private Map<String, Object> config = new HashMap<>();

    private Map<String, String> hiveConfSystemOverride = new HashMap<>();

    /**
     * Construct a HiveRunnerConfig that will override hiveConf with
     * System properties of the format 'hiveconf_[hiveconf property name]'.
     */
    public HiveRunnerConfig() {
        this(System.getProperties());
    }

    /**
     * Construct a HiveRunnerConfig that will override hiveConf with
     * the given properties of the format 'hiveconf_[hiveconf property name]'.
     */
    public HiveRunnerConfig(Properties systemProperties) {
        config.put(ENABLE_TIMEOUT_PROPERTY_NAME, load(ENABLE_TIMEOUT_PROPERTY_NAME, ENABLE_TIMEOUT_DEFAULT, systemProperties));
        config.put(TIMEOUT_RETRIES_PROPERTY_NAME, load(TIMEOUT_RETRIES_PROPERTY_NAME, TIMEOUT_RETRIES_DEFAULT, systemProperties));
        config.put(TIMEOUT_SECONDS_PROPERTY_NAME, load(TIMEOUT_SECONDS_PROPERTY_NAME, TIMEOUT_SECONDS_DEFAULT, systemProperties));
        config.put(COMMAND_SHELL_EMULATOR_PROPERTY_NAME, load(COMMAND_SHELL_EMULATOR_PROPERTY_NAME, COMMAND_SHELL_EMULATOR_DEFAULT, systemProperties));

        hiveConfSystemOverride = loadHiveConfSystemOverrides(systemProperties);
    }

    public boolean isTimeoutEnabled() {
        return getBoolean(ENABLE_TIMEOUT_PROPERTY_NAME);
    }

    public int getTimeoutRetries() {
        return getInteger(TIMEOUT_RETRIES_PROPERTY_NAME);
    }

    public int getTimeoutSeconds() {
        return getInteger(TIMEOUT_SECONDS_PROPERTY_NAME);
    }

    /**
     * Get the configured hive.execution.engine. If not set, it defaults to the HiveConf default value.
     */
    public String getHiveExecutionEngine() {
        String executionEngine = hiveConfSystemOverride.get(HiveConf.ConfVars.HIVE_EXECUTION_ENGINE.varname);
        return executionEngine == null ? HiveConf.ConfVars.HIVE_EXECUTION_ENGINE.getDefaultValue() : executionEngine;
    }

    public Map<String, String> getHiveConfSystemOverride() {
        return hiveConfSystemOverride;
    }

    /**
     * Determines the statement parsing behaviour of the interactive shell. Provided to emulate slight differences
     * between different clients.
     */
    public CommandShellEmulator getCommandShellEmulator() {
        return CommandShellEmulatorFactory.valueOf(getString(COMMAND_SHELL_EMULATOR_PROPERTY_NAME).toUpperCase());
    }

    public void setTimeoutEnabled(boolean isEnabled) {
        config.put(ENABLE_TIMEOUT_PROPERTY_NAME, isEnabled);
    }

    public void setTimeoutRetries(int retries) {
        config.put(TIMEOUT_RETRIES_PROPERTY_NAME, retries);
    }

    public void setTimeoutSeconds(int timeout) {
        config.put(TIMEOUT_SECONDS_PROPERTY_NAME, timeout);
    }

    public void setHiveExecutionEngine(String executionEngine) {
        hiveConfSystemOverride.put(HiveConf.ConfVars.HIVE_EXECUTION_ENGINE.varname, executionEngine);
    }

    public void setCommandShellEmulator(CommandShellEmulator commandShellEmulator) {
        config.put(COMMAND_SHELL_EMULATOR_PROPERTY_NAME, commandShellEmulator.getName());
    }

    /**
     * Copy values from the given config into this config. Note that if a property has not been explicitly set,
     * the default will apply.
     */
    public void override(HiveRunnerConfig hiveRunnerConfig) {
        config.putAll(hiveRunnerConfig.config);
        hiveConfSystemOverride.putAll(hiveRunnerConfig.hiveConfSystemOverride);
    }

    private static boolean load(String property, boolean defaultValue, Properties sysProperties) {
        String value = sysProperties.getProperty(property);
        return value == null ? defaultValue : Boolean.parseBoolean(value);
    }

    private static String load(String property, String defaultValue, Properties sysProperties) {
        String value = sysProperties.getProperty(property);
        return value == null ? defaultValue : value;
    }

    private static int load(String property, int defaultValue, Properties sysProperties) {
        String value = sysProperties.getProperty(property);
        return value == null ? defaultValue : Integer.parseInt(value);
    }


    private boolean getBoolean(String key) {
        return (boolean) config.get(key);
    }


    private int getInteger(String key) {
        return (int) config.get(key);
    }

    private String getString(String key) {
        return (String) config.get(key);
    }

    private static Map<String, String> loadHiveConfSystemOverrides(Properties systemProperties) {
        Map<String, String> hiveConfSystemOverride = new HashMap<>();

        for (String sysKey : systemProperties.stringPropertyNames()) {
            if (sysKey.startsWith(HIVECONF_SYSTEM_OVERRIDE_PREFIX)) {
                String hiveConfKey = sysKey.substring(HIVECONF_SYSTEM_OVERRIDE_PREFIX.length());
                hiveConfSystemOverride.put(hiveConfKey, systemProperties.getProperty(sysKey));
            }
        }

        return hiveConfSystemOverride;
    }

}
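`HiveRunnerConfig.loadHiveConfSystemOverrides` collects every system property carrying the `hiveconf_` prefix and strips that prefix to recover the HiveConf key. The same filtering step in isolation, runnable against a plain `Properties` object (the `HiveConfOverrideSketch` class and `overrides` method names are hypothetical):

```java
import java.util.HashMap;
import java.util.Map;
import java.util.Properties;

public class HiveConfOverrideSketch {

    static final String PREFIX = "hiveconf_";

    // Mirrors HiveRunnerConfig#loadHiveConfSystemOverrides: keep only prefixed
    // properties and strip the prefix to obtain the hiveconf key
    static Map<String, String> overrides(Properties props) {
        Map<String, String> out = new HashMap<>();
        for (String key : props.stringPropertyNames()) {
            if (key.startsWith(PREFIX)) {
                out.put(key.substring(PREFIX.length()), props.getProperty(key));
            }
        }
        return out;
    }

    public static void main(String[] args) {
        Properties props = new Properties();
        props.setProperty("hiveconf_hive.execution.engine", "mr");
        props.setProperty("timeoutSeconds", "30"); // not a hiveconf override, ignored
        System.out.println(overrides(props));
    }
}
```

This is why a surefire `systemProperties` entry such as `hiveconf_any.hive.conf` lands in `getHiveConfSystemOverride()` while plain keys like `timeoutSeconds` are handled by the typed `load` methods instead.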


================================================
FILE: src/main/java/com/klarna/hiverunner/data/Converters.java
================================================
/**
 * Copyright (C) 2013-2021 Klarna AB
 * Copyright (C) 2021 The HiveRunner Contributors
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 * http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */
package com.klarna.hiverunner.data;

import static org.apache.hadoop.hive.serde2.typeinfo.TypeInfoFactory.binaryTypeInfo;
import static org.apache.hadoop.hive.serde2.typeinfo.TypeInfoFactory.booleanTypeInfo;
import static org.apache.hadoop.hive.serde2.typeinfo.TypeInfoFactory.byteTypeInfo;
import static org.apache.hadoop.hive.serde2.typeinfo.TypeInfoFactory.dateTypeInfo;
import static org.apache.hadoop.hive.serde2.typeinfo.TypeInfoFactory.doubleTypeInfo;
import static org.apache.hadoop.hive.serde2.typeinfo.TypeInfoFactory.floatTypeInfo;
import static org.apache.hadoop.hive.serde2.typeinfo.TypeInfoFactory.intTypeInfo;
import static org.apache.hadoop.hive.serde2.typeinfo.TypeInfoFactory.longTypeInfo;
import static org.apache.hadoop.hive.serde2.typeinfo.TypeInfoFactory.shortTypeInfo;
import static org.apache.hadoop.hive.serde2.typeinfo.TypeInfoFactory.stringTypeInfo;
import static org.apache.hadoop.hive.serde2.typeinfo.TypeInfoFactory.timestampTypeInfo;

import java.math.BigDecimal;
import java.sql.Date;
import java.sql.Timestamp;
import java.util.Map;

import org.apache.commons.beanutils.ConversionException;
import org.apache.commons.beanutils.ConvertUtilsBean;
import org.apache.commons.beanutils.Converter;
import org.apache.commons.beanutils.converters.BooleanConverter;
import org.apache.commons.beanutils.converters.ByteArrayConverter;
import org.apache.commons.beanutils.converters.ByteConverter;
import org.apache.commons.beanutils.converters.DoubleConverter;
import org.apache.commons.beanutils.converters.FloatConverter;
import org.apache.commons.beanutils.converters.IntegerConverter;
import org.apache.commons.beanutils.converters.LongConverter;
import org.apache.commons.beanutils.converters.ShortConverter;
import org.apache.commons.beanutils.converters.StringConverter;
import org.apache.hadoop.hive.common.type.HiveChar;
import org.apache.hadoop.hive.common.type.HiveDecimal;
import org.apache.hadoop.hive.common.type.HiveVarchar;
import org.apache.hadoop.hive.serde2.typeinfo.CharTypeInfo;
import org.apache.hadoop.hive.serde2.typeinfo.DecimalTypeInfo;
import org.apache.hadoop.hive.serde2.typeinfo.PrimitiveTypeInfo;
import org.apache.hadoop.hive.serde2.typeinfo.VarcharTypeInfo;

import com.google.common.collect.ImmutableMap;

/**
 * A utility class for converting from {@link String Strings} into the target Hive table's column type.
 */
public final class Converters {

    static final Map<PrimitiveTypeInfo, Class<?>> TYPES = ImmutableMap
            .<PrimitiveTypeInfo, Class<?>>builder()
            .put(stringTypeInfo, String.class)
            .put(booleanTypeInfo, Boolean.class)
            .put(byteTypeInfo, Byte.class)
            .put(shortTypeInfo, Short.class)
            .put(intTypeInfo, Integer.class)
            .put(longTypeInfo, Long.class)
            .put(floatTypeInfo, Float.class)
            .put(doubleTypeInfo, Double.class)
            .put(dateTypeInfo, Date.class)
            .put(timestampTypeInfo, Timestamp.class)
            .put(binaryTypeInfo, Byte[].class)
            .build();

    private static final ConvertUtilsBean CONVERTER;

    static {
        CONVERTER = new ConvertUtilsBean();
        CONVERTER.register(new StringConverter(), String.class);
        CONVERTER.register(new BooleanConverter(), Boolean.class);
        CONVERTER.register(new ByteConverter(), Byte.class);
        CONVERTER.register(new ShortConverter(), Short.class);
        CONVERTER.register(new IntegerConverter(), Integer.class);
        CONVERTER.register(new LongConverter(), Long.class);
        CONVERTER.register(new FloatConverter(), Float.class);
        CONVERTER.register(new DoubleConverter(), Double.class);
        CONVERTER.register(new HiveDateConverter(), Date.class);
        CONVERTER.register(new HiveTimestampConverter(), Timestamp.class);
        CONVERTER.register(new ByteArrayConverter(), Byte[].class);
        CONVERTER.register(new HiveDecimalConverter(), HiveDecimal.class);
        CONVERTER.register(new HiveVarcharConverter(), HiveVarchar.class);
        CONVERTER.register(new HiveCharConverter(), HiveChar.class);
    }

    private Converters() {
    }

    static Class<?> type(PrimitiveTypeInfo typeInfo) {
        Class<?> type = TYPES.get(typeInfo);
        if (type == null) {
            if (typeInfo instanceof DecimalTypeInfo) {
                type = HiveDecimal.class;
            } else if (typeInfo instanceof VarcharTypeInfo) {
                type = HiveVarchar.class;
            } else if (typeInfo instanceof CharTypeInfo) {
                type = HiveChar.class;
            } else {
                type = String.class;
            }
        }
        return type;
    }

    /**
     * Attempts to convert the input value into the target type. If the input value is {@code null} then {@code null} is
     * returned. If the input value is a String then an attempt is made to convert it into the target type. If the input
     * value is not a {@link String} then it is assumed the user has explicitly chosen the required type and no attempt is
     * made to perform a conversion. This may result in Hive throwing an error if the incorrect type was chosen.
     *
     * @param value The input value.
     * @param typeInfo The target table's column type.
     * @return The converted value, the input value unchanged, or {@code null}.
     */
    public static Object convert(Object value, PrimitiveTypeInfo typeInfo) {
        if (value == null) {
            return null;
        }
        if (value instanceof String) {
            return CONVERTER.convert((String) value, type(typeInfo));
        }
        return value;
    }

    private static class HiveDecimalConverter implements Converter {
        @Override
        public Object convert(@SuppressWarnings("rawtypes") Class type, Object value) {
            try {
                return HiveDecimal.create(new BigDecimal(value.toString()));
            } catch (NumberFormatException e) {
                throw new ConversionException(e);
            }
        }
    }

    private static class HiveDateConverter implements Converter {
        @Override
        public Object convert(@SuppressWarnings("rawtypes") Class type, Object value) {
            try {
                return org.apache.hadoop.hive.common.type.Date.valueOf(value.toString());
            } catch (IllegalArgumentException e) {
                throw new ConversionException(e);
            }
        }
    }

    private static class HiveTimestampConverter implements Converter {
        @Override
        public Object convert(@SuppressWarnings("rawtypes") Class type, Object value) {
            try {
                return org.apache.hadoop.hive.common.type.Timestamp.valueOf(value.toString());
            } catch (IllegalArgumentException e) {
                throw new ConversionException(e);
            }
        }
    }

    private static class HiveVarcharConverter implements Converter {
        @Override
        public Object convert(@SuppressWarnings("rawtypes") Class type, Object value) {
            return new HiveVarchar(value.toString(), -1);
        }
    }

    private static class HiveCharConverter implements Converter {
        @Override
        public Object convert(@SuppressWarnings("rawtypes") Class type, Object value) {
            return new HiveChar(value.toString(), -1);
        }
    }

}
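The null/String/other branching in Converters.convert above can be illustrated with a JDK-only sketch. The class name, the parser map, and the reduced type set here are illustrative stand-ins, not HiveRunner code:

```java
import java.math.BigDecimal;
import java.util.HashMap;
import java.util.Map;
import java.util.function.Function;

// Minimal sketch of the dispatch pattern in Converters.convert: String inputs
// are parsed into the column's Java type, nulls pass through, and non-String
// inputs are returned unchanged on the assumption the caller chose the type.
public final class TypeDispatchSketch {

    private static final Map<Class<?>, Function<String, Object>> PARSERS = new HashMap<>();
    static {
        PARSERS.put(String.class, s -> s);
        PARSERS.put(Boolean.class, Boolean::valueOf);
        PARSERS.put(Integer.class, Integer::valueOf);
        PARSERS.put(Long.class, Long::valueOf);
        PARSERS.put(Double.class, Double::valueOf);
        PARSERS.put(BigDecimal.class, BigDecimal::new);
    }

    public static Object convert(Object value, Class<?> targetType) {
        if (value == null) {
            return null;               // null stays null
        }
        if (value instanceof String) { // only Strings are converted
            return PARSERS.getOrDefault(targetType, s -> s).apply((String) value);
        }
        return value;                  // non-String: assume the caller chose correctly
    }
}
```

Unknown target types fall back to returning the String itself, mirroring how the real `type(...)` method defaults to `String.class`.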


================================================
FILE: src/main/java/com/klarna/hiverunner/data/FileParser.java
================================================
/**
 * Copyright (C) 2013-2021 Klarna AB
 * Copyright (C) 2021 The HiveRunner Contributors
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 * http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */
package com.klarna.hiverunner.data;

import java.io.File;
import java.util.List;

import org.apache.hive.hcatalog.data.schema.HCatSchema;

/**
 * A {@link File} parsing class to be used with {@link InsertIntoTable} for inserting data into a Hive table from a
 * {@link File}.
 */
public interface FileParser {

    /**
     * Parses the given file and returns the rows with the requested columns.
     *
     * @param file The file to be parsed.
     * @param schema The full schema of the Hive table.
     * @param names The requested field names.
     * @return A {@link List} of rows, each represented by an {@link Object} array.
     */
    List<Object[]> parse(File file, HCatSchema schema, List<String> names);

    /**
     * Parses the given file and returns the column names that are available in the file.
     *
     * @param file The file to be parsed
     * @return A {@link List} of column names as Strings
     */
    List<String> getColumnNames(File file);

    /**
     * Checks whether the parser can provide column names (e.g. from a header row).
     *
     * @return {@code true} if column names are available, {@code false} otherwise.
     */
    boolean hasColumnNames();
}
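A simplified implementation of the FileParser contract can be sketched with the JDK alone. The real interface works on a java.io.File and an HCatSchema; this hypothetical version takes the lines, header, and delimiter directly so it stays self-contained, and maps a sentinel string to null the way addRowsFromDelimited describes:

```java
import java.util.ArrayList;
import java.util.List;

// Illustrative, simplified delimited-file parser: split each line on a
// delimiter, translate the null sentinel to null, and project only the
// requested columns using the header positions. Not HiveRunner code.
public final class DelimitedParserSketch {

    public static List<Object[]> parse(List<String> lines, String delimiter,
                                       String nullValue, List<String> header,
                                       List<String> wanted) {
        List<Object[]> rows = new ArrayList<>();
        for (String line : lines) {
            // -1 keeps trailing empty fields; delimiter is a regex here
            String[] fields = line.split(delimiter, -1);
            Object[] row = new Object[wanted.size()];
            for (int i = 0; i < wanted.size(); i++) {
                String raw = fields[header.indexOf(wanted.get(i))];
                row[i] = raw.equals(nullValue) ? null : raw; // sentinel -> null
            }
            rows.add(row);
        }
        return rows;
    }
}
```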


================================================
FILE: src/main/java/com/klarna/hiverunner/data/InsertIntoTable.java
================================================
/**
 * Copyright (C) 2013-2021 Klarna AB
 * Copyright (C) 2021 The HiveRunner Contributors
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 * http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */
package com.klarna.hiverunner.data;

import java.io.File;

import org.apache.hadoop.hive.conf.HiveConf;
import org.apache.hive.hcatalog.api.HCatClient;
import org.apache.hive.hcatalog.api.HCatTable;
import org.apache.hive.hcatalog.common.HCatException;

import com.klarna.hiverunner.HiveShell;

/**
 * A class for fluently creating a list of rows and inserting them into a table.
 */
public final class InsertIntoTable {

    private final TableDataBuilder builder;
    private final TableDataInserter inserter;

    /**
     * Factory method for creating an {@link InsertIntoTable}.
     * <p>
     * This method is intended to be called via {@link HiveShell#insertInto(String, String)}.
     * </p>
     *
     * @param databaseName The database name.
     * @param tableName The table name.
     * @param conf The {@link HiveConf}.
     * @return A new {@link InsertIntoTable} for the given table.
     */
    public static InsertIntoTable newInstance(String databaseName, String tableName, HiveConf conf) {
        TableDataBuilder builder = new TableDataBuilder(getHCatTable(databaseName, tableName, conf));
        TableDataInserter inserter = new TableDataInserter(databaseName, tableName, conf);
        return new InsertIntoTable(builder, inserter);
    }

    private static HCatTable getHCatTable(String databaseName, String tableName, HiveConf conf) {
        HCatClient client = null;
        try {
            client = HCatClient.create(conf);
            return client.getTable(databaseName, tableName);
        } catch (HCatException e) {
            throw new RuntimeException("Unable to get table from the metastore.", e);
        } finally {
            if (client != null) {
                try {
                    client.close();
                } catch (HCatException e) {
                    throw new RuntimeException("Unable to close client.", e);
                }
            }
        }
    }

    InsertIntoTable(TableDataBuilder builder, TableDataInserter inserter) {
        this.builder = builder;
        this.inserter = inserter;
    }

    /**
     * Defines a subset of columns (a column name mask) so that only pertinent columns can be set.
     * <p>
     * e.g.
     *
     * <pre>
     * {@code
     * tableDataBuilder
     *     .withColumns("col1", "col3")
     *     .addRow("value1", "value3")
     * }
     * </pre>
     * </p>
     *
     * @param names The column names.
     * @return {@code this}
     * @throws IllegalArgumentException if a column name does not exist in the table.
     */
    public InsertIntoTable withColumns(String... names) {
        builder.withColumns(names);
        return this;
    }

    /**
     * Resets the column name mask to all the columns in the table.
     *
     * @return {@code this}
     */
    public InsertIntoTable withAllColumns() {
        builder.withAllColumns();
        return this;
    }

    /**
     * Flushes the current row and creates a new row with {@code null} values for all columns.
     *
     * @return {@code this}
     */
    public InsertIntoTable newRow() {
        builder.newRow();
        return this;
    }

    /**
     * Flushes the current row and creates a new row with the values specified.
     *
     * @param values The values to set.
     * @return {@code this}
     */
    public InsertIntoTable addRow(Object... values) {
        builder.addRow(values);
        return this;
    }

    /**
     * Sets the current row with the values specified.
     *
     * @param values The values to set.
     * @return {@code this}
     */
    public InsertIntoTable setRow(Object... values) {
        builder.setRow(values);
        return this;
    }

    /**
     * Adds all rows from the TSV file specified. The default delimiter is tab and the default null value is an empty
     * string.
     *
     * @param file The file to read the data from.
     * @return {@code this}
     */
    public InsertIntoTable addRowsFromTsv(File file) {
        builder.addRowsFromTsv(file);
        return this;
    }

    /**
     * Adds all rows from the TSV file specified, using the provided delimiter and null value.
     *
     * @param file The file to read the data from.
     * @param delimiter A column delimiter.
     * @param nullValue Value to be treated as null in the source data.
     * @return {@code this}
     */
    public InsertIntoTable addRowsFromDelimited(File file, String delimiter, Object nullValue) {
        builder.addRowsFromDelimited(file, delimiter, nullValue);
        return this;
    }

    /**
     * Adds all rows from the file specified, using the provided parser.
     *
     * @param file File to read the data from.
     * @param fileParser Parser to be used to parse the file.
     * @return {@code this}
     */
    public InsertIntoTable addRowsFrom(File file, FileParser fileParser) {
        builder.addRowsFrom(file, fileParser);
        return this;
    }

    /**
     * Flushes the current row and creates a new row with the same values.
     *
     * @return {@code this}
     */
    public InsertIntoTable copyRow() {
        builder.copyRow();
        return this;
    }

    /**
     * Set the given column name to the given value.
     *
     * @param name The column name to set.
     * @param value The value to set.
     * @return {@code this}
     * @throws IllegalArgumentException if a column name does not exist in the table.
     */
    public InsertIntoTable set(String name, Object value) {
        builder.set(name, value);
        return this;
    }

    /**
     * Inserts the data into the table. This does not replace any existing data, but appends new part files to the
     * table/partition location(s).
     */
    public void commit() {
        inserter.insert(builder.build());
    }

}
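The column-mask semantics of withColumns/addRow above can be demonstrated with a hypothetical mini-version of the fluent API. Method names mirror the real API, but this is a self-contained illustration built on a plain Map, not HiveRunner code:

```java
import java.util.Arrays;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Sketch of the column-mask behaviour: withColumns() restricts which columns
// positional addRow() values map onto; columns outside the mask stay null.
public final class ColumnMaskSketch {

    private final List<String> schema;
    private List<String> mask;
    private final Map<String, Object> row = new HashMap<>();

    public ColumnMaskSketch(String... schemaColumns) {
        this.schema = Arrays.asList(schemaColumns);
        this.mask = this.schema;                      // default: all columns
    }

    public ColumnMaskSketch withColumns(String... names) {
        for (String name : names) {
            if (!schema.contains(name)) {
                throw new IllegalArgumentException("Column " + name + " does not exist");
            }
        }
        this.mask = Arrays.asList(names);
        return this;
    }

    public ColumnMaskSketch addRow(Object... values) {
        for (String column : schema) {
            row.put(column, null);                    // start with an all-null row
        }
        for (int i = 0; i < values.length; i++) {
            row.put(mask.get(i), values[i]);          // fill masked columns only
        }
        return this;
    }

    public Object get(String name) {
        return row.get(name);
    }
}
```

With a schema of three columns and a mask of two, the unmasked column remains null, matching the example in the withColumns javadoc.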


================================================
FILE: src/main/java/com/klarna/hiverunner/data/TableDataBuilder.java
================================================
/**
 * Copyright (C) 2013-2021 Klarna AB
 * Copyright (C) 2021 The HiveRunner Contributors
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 * http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */
package com.klarna.hiverunner.data;

import static com.google.common.base.Preconditions.checkArgument;
import static com.google.common.base.Preconditions.checkNotNull;
import static com.google.common.base.Preconditions.checkState;

import java.io.File;
import java.util.ArrayList;
import java.util.List;
import java.util.Map;

import org.apache.commons.beanutils.ConversionException;
import org.apache.hadoop.hive.serde2.typeinfo.PrimitiveTypeInfo;
import org.apache.hive.hcatalog.api.HCatTable;
import org.apache.hive.hcatalog.common.HCatException;
import org.apache.hive.hcatalog.data.DefaultHCatRecord;
import org.apache.hive.hcatalog.data.HCatRecord;
import org.apache.hive.hcatalog.data.schema.HCatFieldSchema;
import org.apache.hive.hcatalog.data.schema.HCatSchema;

import com.google.common.base.Function;
import com.google.common.collect.FluentIterable;
import com.google.common.collect.ImmutableList;
import com.google.common.collect.ImmutableMap;
import com.google.common.collect.ImmutableMultimap;
import com.google.common.collect.ImmutableMultimap.Builder;
import com.google.common.collect.Multimap;

class TableDataBuilder {

    private final Builder<Map<String, String>, HCatRecord> rowsBuilder = ImmutableMultimap.builder();
    private final HCatSchema schema;
    private final List<HCatFieldSchema> partitionColumns;

    private HCatRecord row;
    private List<String> names;

    TableDataBuilder(HCatTable table) {
        schema = new HCatSchema(ImmutableList
                .<HCatFieldSchema>builder()
                .addAll(table.getCols())
                .addAll(table.getPartCols())
                .build());
        partitionColumns = table.getPartCols();
        withAllColumns();
    }

    TableDataBuilder withColumns(String... names) {
        checkArgument(checkNotNull(names).length > 0, "Column names must be provided.");
        this.names = new ArrayList<>(names.length);
        for (String name : names) {
            checkColumn(name);
            this.names.add(name);
        }
        return this;
    }

    TableDataBuilder withAllColumns() {
        names = schema.getFieldNames();
        return this;
    }

    TableDataBuilder newRow() {
        flushRow();
        row = new DefaultHCatRecord(schema.size());
        return this;
    }

    TableDataBuilder addRow(Object... values) {
        return newRow().setRow(values);
    }

    TableDataBuilder setRow(Object... values) {
        checkArgument(values.length == names.size(), "Expected %s values, got %s", names.size(), values.length);
        for (int i = 0; i < values.length; i++) {
            set(names.get(i), values[i]);
        }
        return this;
    }

    TableDataBuilder addRowsFromTsv(File file) {
        return addRowsFrom(file, new TsvFileParser());
    }

    TableDataBuilder addRowsFromDelimited(File file, String delimiter, Object nullValue) {
        return addRowsFrom(file, new TsvFileParser().withDelimiter(delimiter).withNullValue(nullValue));
    }

    TableDataBuilder addRowsFrom(File file, FileParser fileParser) {
        if (fileParser.hasColumnNames()) {
            checkArgument(names.equals(schema.getFieldNames()),
                    "Manual column spec and header column spec are mutually exclusive");
            String[] columns = FluentIterable
                    .from(fileParser.getColumnNames(file))
                    .transform(toLowerCase())
                    .toArray(String.class);
            withColumns(columns);
        }
        return addRows(fileParser.parse(file, schema, names));
    }

    private Function<String, String> toLowerCase() {
        return new Function<String, String>() {
            @Override
            public String apply(String t) {
                return t.toLowerCase();
            }
        };
    }

    private TableDataBuilder addRows(List<Object[]> rows) {
        for (Object[] row : rows) {
            addRow(row);
        }
        return this;
    }

    TableDataBuilder copyRow() {
        checkState(row != null, "No previous row to copy.");
        HCatRecord copy = new DefaultHCatRecord(new ArrayList<>(row.getAll()));
        flushRow();
        row = copy;
        return this;
    }

    TableDataBuilder set(String name, Object value) {
        checkColumn(name);
        PrimitiveTypeInfo typeInfo;
        try {
            typeInfo = schema.get(name).getTypeInfo();
        } catch (HCatException e) {
            throw new IllegalArgumentException("Error getting type info for " + name, e);
        }
        Object converted;
        try {
            converted = Converters.convert(value, typeInfo);
        } catch (ConversionException e) {
            throw new IllegalArgumentException("Invalid value for " + name + ". Got '" + value + "' ("
                    + value.getClass().getSimpleName() + "). Expected " + typeInfo.getTypeName() + ".", e);
        }
        try {
            row.set(name, schema, converted);
        } catch (HCatException e) {
            throw new RuntimeException("Error setting value for " + name, e);
        }
        return this;
    }

    private Object get(String name) {
        checkColumn(name);
        try {
            return row.get(name, schema);
        } catch (HCatException e) {
            throw new RuntimeException("Error getting value for " + name, e);
        }
    }

    private void flushRow() {
        if (row != null) {
            rowsBuilder.put(createPartitionSpec(), row);
        }
    }

    private Map<String, String> createPartitionSpec() {
        ImmutableMap.Builder<String, String> builder = ImmutableMap.builder();
        for (HCatFieldSchema partitionColumn : partitionColumns) {
            String name = partitionColumn.getName();
            Object value = get(name);
            checkState(value != null, "Value for partition column %s must not be null.", name);
            builder.put(name, value.toString());
        }
        return builder.build();
    }

    Multimap<Map<String, String>, HCatRecord> build() {
        flushRow();
        return rowsBuilder.build();
    }

    private void checkColumn(String name) {
        checkArgument(schema.getFieldNames().contains(name.toLowerCase()), "Column %s does not exist", name);
    }

}
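The grouping performed by flushRow/createPartitionSpec above can be illustrated with a JDK-only sketch: each row's partition-column values form a key, and rows sharing a key land in the same bucket. The real code builds a Guava ImmutableMultimap; this stand-in uses a plain Map of lists and is not HiveRunner code:

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

// Sketch of partition-spec grouping: rows keyed by their partition-column
// values, with a null check that mirrors the checkState in createPartitionSpec.
public final class PartitionGroupingSketch {

    public static Map<Map<String, Object>, List<Map<String, Object>>> group(
            List<Map<String, Object>> rows, List<String> partitionColumns) {
        Map<Map<String, Object>, List<Map<String, Object>>> buckets = new LinkedHashMap<>();
        for (Map<String, Object> row : rows) {
            Map<String, Object> spec = new LinkedHashMap<>();
            for (String column : partitionColumns) {
                Object value = row.get(column);
                if (value == null) {
                    throw new IllegalStateException(
                            "Value for partition column " + column + " must not be null.");
                }
                spec.put(column, value);
            }
            // rows with equal partition specs share one bucket
            buckets.computeIfAbsent(spec, k -> new ArrayList<>()).add(row);
        }
        return buckets;
    }
}
```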


================================================
FILE: src/main/java/com/klarna/hiverunner/data/TableDataInserter.java
================================================
gitextract_l_9r6xof/

├── .github/
│   ├── CODEOWNERS
│   ├── ISSUE_TEMPLATE/
│   │   ├── bug_report.md
│   │   └── feature_request.md
│   └── workflows/
│       ├── deploy.yml
│       ├── main.yml
│       └── release.yml
├── .gitignore
├── CHANGELOG.md
├── CODE-OF-CONDUCT.md
├── CONTRIBUTING.md
├── LICENSE.txt
├── README.md
├── RELEASING.md
├── pom.xml
└── src/
    ├── main/
    │   ├── java/
    │   │   └── com/
    │   │       └── klarna/
    │   │           ├── hiverunner/
    │   │           │   ├── HiveRunnerCore.java
    │   │           │   ├── HiveRunnerExtension.java
    │   │           │   ├── HiveRunnerRule.java
    │   │           │   ├── HiveServerContainer.java
    │   │           │   ├── HiveServerContext.java
    │   │           │   ├── HiveShell.java
    │   │           │   ├── HiveShellContainer.java
    │   │           │   ├── StandaloneHiveRunner.java
    │   │           │   ├── StandaloneHiveServerContext.java
    │   │           │   ├── ThrowOnTimeout.java
    │   │           │   ├── TimeoutException.java
    │   │           │   ├── annotations/
    │   │           │   │   ├── HiveProperties.java
    │   │           │   │   ├── HiveResource.java
    │   │           │   │   ├── HiveRunnerSetup.java
    │   │           │   │   ├── HiveSQL.java
    │   │           │   │   └── HiveSetupScript.java
    │   │           │   ├── builder/
    │   │           │   │   ├── HiveResource.java
    │   │           │   │   ├── HiveRunnerScript.java
    │   │           │   │   ├── HiveShellBase.java
    │   │           │   │   ├── HiveShellBuilder.java
    │   │           │   │   ├── HiveShellTearable.java
    │   │           │   │   ├── Script.java
    │   │           │   │   └── Statement.java
    │   │           │   ├── config/
    │   │           │   │   └── HiveRunnerConfig.java
    │   │           │   ├── data/
    │   │           │   │   ├── Converters.java
    │   │           │   │   ├── FileParser.java
    │   │           │   │   ├── InsertIntoTable.java
    │   │           │   │   ├── TableDataBuilder.java
    │   │           │   │   ├── TableDataInserter.java
    │   │           │   │   └── TsvFileParser.java
    │   │           │   ├── io/
    │   │           │   │   └── IgnoreClosePrintStream.java
    │   │           │   └── sql/
    │   │           │       ├── HiveRunnerStatement.java
    │   │           │       ├── StatementLexer.java
    │   │           │       ├── cli/
    │   │           │       │   ├── AbstractImportPostProcessor.java
    │   │           │       │   ├── CommandShellEmulator.java
    │   │           │       │   ├── CommandShellEmulatorFactory.java
    │   │           │       │   ├── CommentUtil.java
    │   │           │       │   ├── DefaultPreProcessor.java
    │   │           │       │   ├── PostProcessor.java
    │   │           │       │   ├── PreProcessor.java
    │   │           │       │   ├── beeline/
    │   │           │       │   │   ├── BeelineEmulator.java
    │   │           │       │   │   ├── RunCommandPostProcessor.java
    │   │           │       │   │   └── SqlLineCommandRule.java
    │   │           │       │   └── hive/
    │   │           │       │       ├── HiveCliEmulator.java
    │   │           │       │       ├── PreV200HiveCliEmulator.java
    │   │           │       │       ├── PreV200HiveCliPreProcessor.java
    │   │           │       │       └── SourceCommandPostProcessor.java
    │   │           │       └── split/
    │   │           │           ├── BaseContext.java
    │   │           │           ├── CloseStatementRule.java
    │   │           │           ├── Consumer.java
    │   │           │           ├── Context.java
    │   │           │           ├── DefaultTokenRule.java
    │   │           │           ├── NewLineUtil.java
    │   │           │           ├── PreserveCommentsRule.java
    │   │           │           ├── PreserveQuotesRule.java
    │   │           │           ├── StatementSplitter.java
    │   │           │           └── TokenRule.java
    │   │           └── reflection/
    │   │               └── ReflectionUtils.java
    │   └── license/
    │       └── APACHE-2.txt
    └── test/
        ├── java/
        │   └── com/
        │       └── klarna/
        │           └── hiverunner/
        │               ├── AggregateViewTest.java
        │               ├── AnnotatedBaseTestClass.java
        │               ├── AnnotatedFieldsInSuperClassTest.java
        │               ├── BeelineRunTest.java
        │               ├── BigResultSetTest.java
        │               ├── CommentTest.java
        │               ├── CtasTest.java
        │               ├── DisabledTimeoutTest.java
        │               ├── ExecuteFileBasedScriptIntegrationTest.java
        │               ├── ExecuteScriptIntegrationTest.java
        │               ├── HiveCliSourceTest.java
        │               ├── HiveRunnerAnnotationsTest.java
        │               ├── HiveRunnerExtensionTest.java
        │               ├── HiveServerContainerTest.java
        │               ├── HiveShellBeeLineEmulationTest.java
        │               ├── HiveShellHiveCliEmulationTest.java
        │               ├── HiveVariablesTest.java
        │               ├── InsertIntoTableIntegrationTest.java
        │               ├── IntegerPartitionFormatTest.java
        │               ├── InteractiveHiveShellTest.java
        │               ├── LeftOuterJoinTest.java
        │               ├── MSCKRepairNpeTest.java
        │               ├── MacroTest.java
        │               ├── MethodLevelResourceTest.java
        │               ├── MultipleExecutionEnginesTest.java
        │               ├── NeverEndingUdf.java
        │               ├── NoTimeoutTest.java
        │               ├── OrcSnappyTest.java
        │               ├── ParquetInsertionTest.java
        │               ├── PartitionSupportTest.java
        │               ├── ReservedKeywordTest.java
        │               ├── ResourceOutputStreamTest.java
        │               ├── SchemaResetBetweenTestMethodsTest.java
        │               ├── SerdeTest.java
        │               ├── SetHiveExecutionEngineTest.java
        │               ├── SetPropertyTest.java
        │               ├── SetTest.java
        │               ├── SlowlyFailingUdf.java
        │               ├── TestMethodIntegrityTest.java
        │               ├── TimeoutAndRetryTest.java
        │               ├── ToUpperCaseSerDe.java
        │               ├── UnresolvedResourcePathTest.java
        │               ├── UserDefinedFunctionTest.java
        │               ├── builder/
        │               │   └── HiveShellBaseTest.java
        │               ├── config/
        │               │   └── HiveRunnerConfigTest.java
        │               ├── data/
        │               │   ├── ConvertersTest.java
        │               │   ├── InsertIntoTableTest.java
        │               │   ├── TableDataBuilderTest.java
        │               │   ├── TableDataInserterTest.java
        │               │   └── TsvFileParserTest.java
        │               ├── examples/
        │               │   ├── HelloAnnotatedHiveRunnerTest.java
        │               │   ├── HelloHiveRunnerParamaterizedTest.java
        │               │   ├── HelloHiveRunnerTest.java
        │               │   ├── InsertTestDataTest.java
        │               │   ├── SetHiveConfValuesTest.java
        │               │   └── junit4/
        │               │       ├── HelloAnnotatedHiveRunnerTest.java
        │               │       ├── HelloHiveRunnerTest.java
        │               │       ├── InsertTestDataTest.java
        │               │       └── SetHiveConfValuesTest.java
        │               ├── io/
        │               │   └── IgnoreClosePrintStreamTest.java
        │               └── sql/
        │                   ├── cli/
        │                   │   ├── AbstractImportPostProcessorTest.java
        │                   │   ├── CommandShellEmulatorFactoryTest.java
        │                   │   ├── CommentUtilTest.java
        │                   │   ├── beeline/
        │                   │   │   ├── BeelineEmulatorTest.java
        │                   │   │   ├── BeelineStatementSplitterTest.java
        │                   │   │   ├── RunCommandPostProcessorTest.java
        │                   │   │   └── SqlLineCommandRuleTest.java
        │                   │   └── hive/
        │                   │       ├── HiveCliEmulatorTest.java
        │                   │       ├── HiveCliStatementSplitterTest.java
        │                   │       ├── PreV200HiveCliEmulatorTest.java
        │                   │       └── SourceCommandPostProcessorTest.java
        │                   └── split/
        │                       ├── BaseContextTest.java
        │                       ├── CloseStatementRuleTest.java
        │                       ├── ConsumerEolTest.java
        │                       ├── DefaultTokenRuleTest.java
        │                       ├── NewLineUtilTest.java
        │                       ├── PreserveCommentsRuleTest.java
        │                       ├── PreserveQuotesRuleTest.java
        │                       └── StatementSplitterTest.java
        └── resources/
            ├── AggregateViewTest/
            │   └── create_table.sql
            ├── CommentTest/
            │   └── comment.sql
            ├── CtasTest/
            │   └── ctas.sql
            ├── HelloHiveRunnerTest/
            │   ├── calculate_max.sql
            │   ├── create_ctas.sql
            │   ├── create_max.sql
            │   ├── create_table.sql
            │   └── hello_hive_runner.csv
            ├── HiveRunnerAnnotationsTest/
            │   ├── hql1.sql
            │   ├── setupFile.csv
            │   ├── setupPath.csv
            │   ├── testData.csv
            │   └── testData2.csv
            ├── HiveRunnerExtensionTest/
            │   └── test_query.sql
            ├── InsertIntoTableIntegrationTest/
            │   ├── data.tsv
            │   └── dataWithCustomNullValue.csv
            ├── InsertTestDataTest/
            │   ├── data1.tsv
            │   ├── data2.tsv
            │   ├── dataWithHeader1.tsv
            │   └── dataWithHeader2.tsv
            ├── MethodLevelResourceTest/
            │   └── MethodLevelResourceTest.txt
            ├── OrcSnappyTest/
            │   └── ctas.sql
            ├── PartitionSupportTest/
            │   └── hql_example.sql
            ├── SerdeTest/
            │   ├── create_table.sql
            │   └── hql_custom_serde.sql
            ├── SetTest/
            │   └── test_with_set.hql
            ├── TsvFileParserTest/
            │   ├── data.csv
            │   ├── data.tsv
            │   ├── dataWithCustomNullValue.csv
            │   ├── dataWithHeader.csv
            │   └── dataWithHeader.tsv
            └── log4j2.xml
SYMBOL INDEX (874 symbols across 143 files)

FILE: src/main/java/com/klarna/hiverunner/HiveRunnerCore.java
  class HiveRunnerCore (line 46) | class HiveRunnerCore {
    method createHiveServerContainer (line 51) | HiveShellContainer createHiveServerContainer(List<? extends Script> sc...
    method buildShell (line 60) | private HiveShellContainer buildShell(List<? extends Script> scripts, ...
    method loadScriptUnderTest (line 92) | private HiveShellField loadScriptUnderTest(Object testCaseInstance, Hi...
    method getScriptPaths (line 124) | protected List<Path> getScriptPaths(HiveSQL annotation) throws URISynt...
    method assertFileExists (line 134) | private void assertFileExists(Path file) {
    method loadAnnotatedSetupScripts (line 138) | private void loadAnnotatedSetupScripts(Object testCase, HiveShellBuild...
    method readAll (line 157) | private static String readAll(Path path) {
    method loadAnnotatedResources (line 165) | private void loadAnnotatedResources(Object testCase, HiveShellBuilder ...
    method getMandatoryPathFromField (line 187) | private Path getMandatoryPathFromField(Object testCase, Field resource...
    method loadAnnotatedProperties (line 203) | private void loadAnnotatedProperties(Object testCase, HiveShellBuilder...
    type HiveShellField (line 217) | interface HiveShellField {
      method setShell (line 219) | void setShell(HiveShell shell);
      method isAutoStart (line 221) | boolean isAutoStart();

FILE: src/main/java/com/klarna/hiverunner/HiveRunnerExtension.java
  class HiveRunnerExtension (line 47) | public class HiveRunnerExtension implements AfterEachCallback, TestInsta...
    method HiveRunnerExtension (line 57) | public HiveRunnerExtension() {
    method getScriptPaths (line 61) | protected List<Path> getScriptPaths(HiveSQL annotation) throws URISynt...
    method postProcessTestInstance (line 65) | @Override
    method setupConfig (line 77) | private void setupConfig(Object target) {
    method tearDown (line 92) | private void tearDown(Object target) {
    method deleteTempFolder (line 100) | private void deleteTempFolder(Path directory) {
    method createHiveServerContainer (line 108) | private HiveShellContainer createHiveServerContainer(List<? extends Sc...
    method afterEach (line 113) | @Override

FILE: src/main/java/com/klarna/hiverunner/HiveRunnerRule.java
  class HiveRunnerRule (line 35) | public class HiveRunnerRule implements TestRule {
    method HiveRunnerRule (line 43) | HiveRunnerRule(StandaloneHiveRunner runner, Object target, Path testBa...
    method getScriptsUnderTest (line 49) | public List<? extends Script> getScriptsUnderTest() {
    method setScriptsUnderTest (line 53) | public void setScriptsUnderTest(List<? extends Script> scriptsUnderTes...
    method apply (line 58) | @Override
    class HiveRunnerRuleStatement (line 64) | class HiveRunnerRuleStatement extends Statement {
      method HiveRunnerRuleStatement (line 71) | private HiveRunnerRuleStatement(
      method evaluate (line 82) | @Override

FILE: src/main/java/com/klarna/hiverunner/HiveServerContainer.java
  class HiveServerContainer (line 51) | public class HiveServerContainer {
    method HiveServerContainer (line 61) | public HiveServerContainer(HiveServerContext context) {
    method getClient (line 65) | public CLIService getClient() {
    method init (line 75) | public void init(Map<String, String> testConfig, Map<String, String> h...
    method getBaseDir (line 113) | public Path getBaseDir() {
    method executeStatement (line 117) | public List<Object[]> executeStatement(Statement hiveql) {
    method executeStatement (line 121) | public List<Object[]> executeStatement(String hiveql) {
    method tearDown (line 166) | public void tearDown() {
    method expandVariableSubstitutes (line 206) | public String expandVariableSubstitutes(String expression) {
    method pingHiveServer (line 210) | private void pingHiveServer() {
    method getHiveConf (line 214) | public HiveConf getHiveConf() {
    method getVariableSubstitution (line 218) | public VariableSubstitution getVariableSubstitution() {

FILE: src/main/java/com/klarna/hiverunner/HiveServerContext.java
  type HiveServerContext (line 34) | public interface HiveServerContext {
    method init (line 44) | void init();
    method getHiveConf (line 49) | HiveConf getHiveConf();
    method getBaseDir (line 58) | Path getBaseDir();

FILE: src/main/java/com/klarna/hiverunner/HiveShell.java
  type HiveShell (line 33) | public interface HiveShell {
    method executeQuery (line 41) | List<String> executeQuery(String hiveSql);
    method executeQuery (line 49) | List<String> executeQuery(String hiveSql, String rowValuesDelimitedBy,...
    method executeQuery (line 57) | List<String> executeQuery(File script);
    method executeQuery (line 65) | List<String> executeQuery(Path script);
    method executeQuery (line 73) | List<String> executeQuery(Charset charset, File script);
    method executeQuery (line 81) | List<String> executeQuery(Charset charset, Path script);
    method executeQuery (line 89) | List<String> executeQuery(File script, String rowValuesDelimitedBy, St...
    method executeQuery (line 97) | List<String> executeQuery(Path script, String rowValuesDelimitedBy, St...
    method executeQuery (line 105) | List<String> executeQuery(Charset charset, File script, String rowValu...
    method executeQuery (line 113) | List<String> executeQuery(Charset charset, Path script, String rowValu...
    method executeStatement (line 121) | List<Object[]> executeStatement(String hiveSql);
    method execute (line 129) | void execute(String script);
    method execute (line 138) | void execute(File file);
    method execute (line 147) | void execute(Path path);
    method execute (line 155) | void execute(Charset charset, File file);
    method execute (line 163) | void execute(Charset charset, Path path);
    method start (line 173) | void start();
    method setProperty (line 182) | @Deprecated
    method setHiveConfValue (line 191) | void setHiveConfValue(String key, String value);
    method setHiveVarValue (line 199) | void setHiveVarValue(String var, String value);
    method getHiveConf (line 204) | HiveConf getHiveConf();
    method setCwd (line 206) | void setCwd(Path cwd);
    method getCwd (line 208) | Path getCwd();
    method addResource (line 218) | void addResource(String targetFile, File sourceFile);
    method addResource (line 228) | void addResource(String targetFile, Path sourceFile);
    method addResource (line 239) | void addResource(String targetFile, String data);
    method addSetupScript (line 248) | void addSetupScript(String script);
    method addSetupScripts (line 256) | void addSetupScripts(Charset charset, File... scripts);
    method addSetupScripts (line 264) | void addSetupScripts(Charset charset, Path... scripts);
    method addSetupScripts (line 275) | void addSetupScripts(File... scripts);
    method addSetupScripts (line 285) | void addSetupScripts(Path... scripts);
    method getBaseDir (line 291) | Path getBaseDir();
    method expandVariableSubstitutes (line 298) | String expandVariableSubstitutes(String expression);
    method getResourceOutputStream (line 311) | OutputStream getResourceOutputStream(String targetFile);
    method insertInto (line 322) | InsertIntoTable insertInto(String databaseName, String tableName);

FILE: src/main/java/com/klarna/hiverunner/HiveShellContainer.java
  type HiveShellContainer (line 27) | public interface HiveShellContainer extends HiveShell {
    method tearDown (line 33) | void tearDown();
    method getScriptsUnderTest (line 38) | List<Script> getScriptsUnderTest();

FILE: src/main/java/com/klarna/hiverunner/StandaloneHiveRunner.java
  class StandaloneHiveRunner (line 60) | public class StandaloneHiveRunner extends BlockJUnit4ClassRunner {
    method StandaloneHiveRunner (line 72) | public StandaloneHiveRunner(Class<?> clazz) throws InitializationError {
    method getHiveRunnerConfig (line 76) | protected HiveRunnerConfig getHiveRunnerConfig() {
    method getTestRules (line 80) | @Override
    method runChild (line 108) | @Override
    method runTestMethod (line 129) | protected final void runTestMethod(FrameworkMethod method,
    method evaluateStatement (line 161) | public HiveShellContainer evaluateStatement(List<? extends Script> scr...
    method tearDown (line 179) | private void tearDown() {
    method tearDownContainer (line 186) | private void tearDownContainer() {
    method deleteTempFolder (line 197) | private void deleteTempFolder(Path directory) {
    method createHiveServerContainer (line 208) | private HiveShellContainer createHiveServerContainer(List<? extends Sc...
    method getHiveRunnerConfigRule (line 215) | private TestRule getHiveRunnerConfigRule(Object target) {
    method clearLogContext (line 240) | private void clearLogContext() {
    method setLogContext (line 244) | private void setLogContext(FrameworkMethod method) {

FILE: src/main/java/com/klarna/hiverunner/StandaloneHiveServerContext.java
  class StandaloneHiveServerContext (line 68) | public class StandaloneHiveServerContext implements HiveServerContext {
    method StandaloneHiveServerContext (line 79) | public StandaloneHiveServerContext(Path basedir, HiveRunnerConfig hive...
    method init (line 84) | @Override
    method configureMiscHiveSettings (line 110) | protected void configureMiscHiveSettings(HiveConf hiveConf) {
    method overrideHiveConf (line 122) | protected void overrideHiveConf(HiveConf hiveConf) {
    method configureMrExecutionEngine (line 128) | protected void configureMrExecutionEngine(HiveConf conf) {
    method configureTezExecutionEngine (line 145) | protected void configureTezExecutionEngine(HiveConf conf) {
    method configureJavaSecurityRealm (line 170) | protected void configureJavaSecurityRealm(HiveConf hiveConf) {
    method configureAssertionStatus (line 178) | protected void configureAssertionStatus(HiveConf conf) {
    method configureSupportConcurrency (line 184) | protected void configureSupportConcurrency(HiveConf conf) {
    method configureMetaStore (line 188) | protected void configureMetaStore(HiveConf conf) {
    method configureDerbyLog (line 223) | private void configureDerbyLog() {
    method configureFileSystem (line 235) | protected void configureFileSystem(Path basedir, HiveConf conf) throws...
    method newFolder (line 259) | Path newFolder(Path basedir, String folder) throws IOException {
    method getHiveConf (line 265) | @Override
    method getBaseDir (line 270) | @Override
    method createAndSetFolderProperty (line 275) | protected final void createAndSetFolderProperty(HiveConf.ConfVars var,...
    method createAndSetFolderProperty (line 280) | protected final void createAndSetFolderProperty(String key, String fol...
    method setMetastoreProperty (line 285) | protected final void setMetastoreProperty(String key, String value) {

FILE: src/main/java/com/klarna/hiverunner/ThrowOnTimeout.java
  class ThrowOnTimeout (line 27) | public class ThrowOnTimeout extends Statement {
    method ThrowOnTimeout (line 38) | public ThrowOnTimeout(Statement originalStatement, HiveRunnerConfig co...
    method evaluate (line 44) | @Override
    method create (line 99) | public static TestRule create(final HiveRunnerConfig config, final Obj...

FILE: src/main/java/com/klarna/hiverunner/TimeoutException.java
  class TimeoutException (line 19) | public class TimeoutException extends RuntimeException {
    method TimeoutException (line 23) | public TimeoutException() {
    method TimeoutException (line 27) | public TimeoutException(String message) {
    method TimeoutException (line 31) | public TimeoutException(String message, Throwable cause) {
    method TimeoutException (line 35) | public TimeoutException(Throwable cause) {
    method TimeoutException (line 39) | protected TimeoutException(String message, Throwable cause, boolean en...

FILE: src/main/java/com/klarna/hiverunner/builder/HiveResource.java
  class HiveResource (line 30) | class HiveResource {
    method HiveResource (line 34) | HiveResource(String targetFile) throws IOException {
    method HiveResource (line 38) | HiveResource(String targetFile, Path dataFile) throws IOException {
    method HiveResource (line 42) | HiveResource(String targetFile, String data) throws IOException {
    method HiveResource (line 46) | private HiveResource(String targetFile, ByteArrayOutputStream byteArra...
    method createOutputStream (line 51) | private static ByteArrayOutputStream createOutputStream(byte[] data) t...
    method getTargetFile (line 58) | String getTargetFile() {
    method toString (line 62) | @Override
    method getOutputStream (line 67) | public ByteArrayOutputStream getOutputStream() {

FILE: src/main/java/com/klarna/hiverunner/builder/HiveRunnerScript.java
  class HiveRunnerScript (line 21) | public class HiveRunnerScript implements Script {
    method HiveRunnerScript (line 27) | public HiveRunnerScript(int index, Path path, String sqlText) {
    method getIndex (line 33) | @Override
    method getPath (line 41) | @Override
    method getSql (line 49) | @Override
    method hashCode (line 54) | @Override
    method equals (line 64) | @Override
    method toString (line 88) | @Override

FILE: src/main/java/com/klarna/hiverunner/builder/HiveShellBase.java
  class HiveShellBase (line 49) | class HiveShellBase implements HiveShell {
    method HiveShellBase (line 68) | HiveShellBase(HiveServerContainer hiveServerContainer, Map<String, Str...
    method executeQuery (line 80) | @Override
    method executeQuery (line 85) | @Override
    method executeStatement (line 97) | @Override
    method executeScriptWithCommandShellEmulation (line 103) | private void executeScriptWithCommandShellEmulation(String script) {
    method executeStatementWithCommandShellEmulation (line 108) | private List<Object[]> executeStatementWithCommandShellEmulation(Strin...
    method executeStatementsWithCommandShellEmulation (line 113) | private List<Object[]> executeStatementsWithCommandShellEmulation(List...
    method execute (line 121) | @Override
    method execute (line 127) | @Override
    method execute (line 133) | @Override
    method execute (line 139) | @Override
    method execute (line 145) | @Override
    method start (line 153) | @Override
    method addSetupScript (line 169) | @Override
    method addSetupScripts (line 175) | @Override
    method addSetupScripts (line 190) | @Override
    method addSetupScripts (line 199) | @Override
    method addSetupScripts (line 204) | @Override
    method getBaseDir (line 209) | @Override
    method expandVariableSubstitutes (line 214) | @Override
    method setProperty (line 222) | @Override
    method setHiveConfValue (line 227) | @Override
    method getHiveConf (line 233) | @Override
    method getResourceOutputStream (line 239) | @Override
    method setHiveVarValue (line 252) | @Override
    method addResource (line 258) | @Override
    method addResource (line 268) | @Override
    method addResource (line 279) | @Override
    method insertInto (line 284) | @Override
    method executeSetupScripts (line 290) | private void executeSetupScripts() {
    method prepareResources (line 297) | private void prepareResources() {
    method executeScriptsUnderTest (line 322) | private void executeScriptsUnderTest() {
    method assertResourcePreconditions (line 332) | protected final void assertResourcePreconditions(HiveResource resource...
    method assertFileExists (line 348) | protected final void assertFileExists(Path file) {
    method assertNotStarted (line 354) | protected final void assertNotStarted() {
    method assertStarted (line 358) | protected final void assertStarted() {
    method createPreStartOutputStream (line 362) | private OutputStream createPreStartOutputStream(ByteArrayOutputStream ...
    method executeQuery (line 374) | @Override
    method executeQuery (line 379) | @Override
    method executeQuery (line 384) | @Override
    method executeQuery (line 389) | @Override
    method executeQuery (line 394) | @Override
    method executeQuery (line 399) | @Override
    method executeQuery (line 404) | @Override
    method getScriptsUnderTest (line 410) | public List<Script> getScriptsUnderTest() {
    method executeQuery (line 414) | @Override
    method setCwd (line 433) | @Override
    method getCwd (line 439) | @Override

FILE: src/main/java/com/klarna/hiverunner/builder/HiveShellBuilder.java
  class HiveShellBuilder (line 37) | public class HiveShellBuilder {
    method setHiveServerContainer (line 45) | public void setHiveServerContainer(HiveServerContainer hiveServerConta...
    method putAllProperties (line 49) | public void putAllProperties(Map<String, String> props) {
    method addSetupScript (line 53) | public void addSetupScript(String script) {
    method addResource (line 57) | public void addResource(String targetFile, Path dataFile) throws IOExc...
    method addResource (line 61) | public void addResource(String targetFile, String data) throws IOExcep...
    method setScriptsUnderTest (line 65) | public void setScriptsUnderTest(List<Path> scriptPaths, Charset charse...
    method fromScriptPaths (line 69) | public List<Script> fromScriptPaths(List<Path> scriptPaths, Charset ch...
    method setCommandShellEmulation (line 84) | public void setCommandShellEmulation(CommandShellEmulator commandShell...
    method buildShell (line 88) | public HiveShellContainer buildShell() {
    method overrideScriptsUnderTest (line 92) | public void overrideScriptsUnderTest(List<? extends Script> scripts) {

FILE: src/main/java/com/klarna/hiverunner/builder/HiveShellTearable.java
  class HiveShellTearable (line 29) | class HiveShellTearable extends HiveShellBase implements HiveShellContai...
    method HiveShellTearable (line 31) | HiveShellTearable(HiveServerContainer hiveServerContainer, Map<String,...
    method tearDown (line 37) | @Override

FILE: src/main/java/com/klarna/hiverunner/builder/Script.java
  type Script (line 21) | public interface Script {
    method getIndex (line 26) | int getIndex();
    method getPath (line 28) | Path getPath();
    method getSql (line 30) | String getSql();

FILE: src/main/java/com/klarna/hiverunner/builder/Statement.java
  type Statement (line 19) | public interface Statement {
    method getIndex (line 24) | int getIndex();
    method getSql (line 29) | String getSql();

FILE: src/main/java/com/klarna/hiverunner/config/HiveRunnerConfig.java
  class HiveRunnerConfig (line 66) | public class HiveRunnerConfig {
    method HiveRunnerConfig (line 114) | public HiveRunnerConfig() {
    method HiveRunnerConfig (line 122) | public HiveRunnerConfig(Properties systemProperties) {
    method isTimeoutEnabled (line 131) | public boolean isTimeoutEnabled() {
    method getTimeoutRetries (line 135) | public int getTimeoutRetries() {
    method getTimeoutSeconds (line 139) | public int getTimeoutSeconds() {
    method getHiveExecutionEngine (line 146) | public String getHiveExecutionEngine() {
    method getHiveConfSystemOverride (line 151) | public Map<String, String> getHiveConfSystemOverride() {
    method getCommandShellEmulator (line 159) | public CommandShellEmulator getCommandShellEmulator() {
    method setTimeoutEnabled (line 163) | public void setTimeoutEnabled(boolean isEnabled) {
    method setTimeoutRetries (line 167) | public void setTimeoutRetries(int retries) {
    method setTimeoutSeconds (line 171) | public void setTimeoutSeconds(int timeout) {
    method setHiveExecutionEngine (line 175) | public void setHiveExecutionEngine(String executionEngine) {
    method setCommandShellEmulator (line 179) | public void setCommandShellEmulator(CommandShellEmulator commandShellE...
    method override (line 187) | public void override(HiveRunnerConfig hiveRunnerConfig) {
    method load (line 192) | private static boolean load(String property, boolean defaultValue, Pro...
    method load (line 197) | private static String load(String property, String defaultValue, Prope...
    method load (line 202) | private static int load(String property, int defaultValue, Properties ...
    method getBoolean (line 208) | private boolean getBoolean(String key) {
    method getInteger (line 213) | private int getInteger(String key) {
    method getString (line 217) | private String getString(String key) {
    method loadHiveConfSystemOverrides (line 221) | private static Map<String, String> loadHiveConfSystemOverrides(Propert...

FILE: src/main/java/com/klarna/hiverunner/data/Converters.java
  class Converters (line 61) | public final class Converters {
    method Converters (line 98) | private Converters() {
    method type (line 101) | static Class<?> type(PrimitiveTypeInfo typeInfo) {
    method convert (line 126) | public static Object convert(Object value, PrimitiveTypeInfo typeInfo) {
    class HiveDecimalConverter (line 136) | private static class HiveDecimalConverter implements Converter {
      method convert (line 137) | @Override
    class HiveDateConverter (line 147) | private static class HiveDateConverter implements Converter {
      method convert (line 148) | @Override
    class HiveTimestampConverter (line 158) | private static class HiveTimestampConverter implements Converter {
      method convert (line 159) | @Override
    class HiveVarcharConverter (line 169) | private static class HiveVarcharConverter implements Converter {
      method convert (line 170) | @Override
    class HiveCharConverter (line 176) | private static class HiveCharConverter implements Converter {
      method convert (line 177) | @Override

FILE: src/main/java/com/klarna/hiverunner/data/FileParser.java
  type FileParser (line 28) | public interface FileParser {
    method parse (line 38) | List<Object[]> parse(File file, HCatSchema schema, List<String> names);
    method getColumnNames (line 46) | List<String> getColumnNames(File file);
    method hasColumnNames (line 52) | boolean hasColumnNames();

FILE: src/main/java/com/klarna/hiverunner/data/InsertIntoTable.java
  class InsertIntoTable (line 31) | public final class InsertIntoTable {
    method newInstance (line 47) | public static InsertIntoTable newInstance(String databaseName, String ...
    method getHCatTable (line 53) | private static HCatTable getHCatTable(String databaseName, String tabl...
    method InsertIntoTable (line 71) | InsertIntoTable(TableDataBuilder builder, TableDataInserter inserter) {
    method withColumns (line 94) | public InsertIntoTable withColumns(String... names) {
    method withAllColumns (line 104) | public InsertIntoTable withAllColumns() {
    method newRow (line 114) | public InsertIntoTable newRow() {
    method addRow (line 125) | public InsertIntoTable addRow(Object... values) {
    method setRow (line 136) | public InsertIntoTable setRow(Object... values) {
    method addRowsFromTsv (line 148) | public InsertIntoTable addRowsFromTsv(File file) {
    method addRowsFromDelimited (line 161) | public InsertIntoTable addRowsFromDelimited(File file, String delimite...
    method addRowsFrom (line 173) | public InsertIntoTable addRowsFrom(File file, FileParser fileParser) {
    method copyRow (line 183) | public InsertIntoTable copyRow() {
    method set (line 196) | public InsertIntoTable set(String name, Object value) {
    method commit (line 205) | public void commit() {

FILE: src/main/java/com/klarna/hiverunner/data/TableDataBuilder.java
  class TableDataBuilder (line 45) | class TableDataBuilder {
    method TableDataBuilder (line 54) | TableDataBuilder(HCatTable table) {
    method withColumns (line 64) | TableDataBuilder withColumns(String... names) {
    method withAllColumns (line 74) | TableDataBuilder withAllColumns() {
    method newRow (line 79) | TableDataBuilder newRow() {
    method addRow (line 85) | TableDataBuilder addRow(Object... values) {
    method setRow (line 89) | TableDataBuilder setRow(Object... values) {
    method addRowsFromTsv (line 97) | TableDataBuilder addRowsFromTsv(File file) {
    method addRowsFromDelimited (line 101) | TableDataBuilder addRowsFromDelimited(File file, String delimiter, Obj...
    method addRowsFrom (line 105) | TableDataBuilder addRowsFrom(File file, FileParser fileParser) {
    method toLowerCase (line 118) | private Function<String, String> toLowerCase() {
    method addRows (line 127) | private TableDataBuilder addRows(List<Object[]> rows) {
    method copyRow (line 134) | TableDataBuilder copyRow() {
    method set (line 142) | TableDataBuilder set(String name, Object value) {
    method get (line 165) | private Object get(String name) {
    method flushRow (line 174) | private void flushRow() {
    method createPartitionSpec (line 180) | private Map<String, String> createPartitionSpec() {
    method build (line 191) | Multimap<Map<String, String>, HCatRecord> build() {
    method checkColumn (line 196) | private void checkColumn(String name) {

FILE: src/main/java/com/klarna/hiverunner/data/TableDataInserter.java
  class TableDataInserter (line 33) | class TableDataInserter {
    method TableDataInserter (line 39) | TableDataInserter(String databaseName, String tableName, HiveConf conf) {
    method insert (line 45) | void insert(Multimap<Map<String, String>, HCatRecord> data) {
    method insert (line 53) | private void insert(Map<String, String> partitionSpec, Iterable<HCatRe...

FILE: src/main/java/com/klarna/hiverunner/data/TsvFileParser.java
  class TsvFileParser (line 36) | public class TsvFileParser implements FileParser {
    method TsvFileParser (line 46) | public TsvFileParser() {
    method withDelimiter (line 56) | public TsvFileParser withDelimiter(String delimiter) {
    method withNullValue (line 65) | public TsvFileParser withNullValue(Object nullValue) {
    method withCharset (line 73) | public TsvFileParser withCharset(Charset charset) {
    method withHeader (line 81) | public TsvFileParser withHeader() {
    method withoutHeader (line 89) | public TsvFileParser withoutHeader() {
    method parse (line 95) | @Override
    method hasColumnNames (line 114) | @Override
    method getColumnNames (line 119) | @Override
    method parseRow (line 136) | private Object[] parseRow(String line, int size) {

FILE: src/main/java/com/klarna/hiverunner/io/IgnoreClosePrintStream.java
  class IgnoreClosePrintStream (line 22) | public class IgnoreClosePrintStream extends PrintStream {
    method IgnoreClosePrintStream (line 24) | public IgnoreClosePrintStream(OutputStream out) {
    method close (line 28) | @Override
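`IgnoreClosePrintStream` overrides only `close()`, which points at a common pattern: wrap a shared stream (such as `System.out`) so that code closing its output cannot close the underlying stream. A minimal sketch under that assumption:

```java
import java.io.ByteArrayOutputStream;
import java.io.OutputStream;
import java.io.PrintStream;

// Sketch of a PrintStream wrapper whose close() is a no-op (beyond a flush),
// so a component that dutifully closes its output stream cannot close a
// shared one out from under everybody else.
public class IgnoreCloseSketch extends PrintStream {
    public IgnoreCloseSketch(OutputStream out) {
        super(out);
    }

    @Override
    public void close() {
        flush(); // push pending output, but leave the underlying stream open
    }

    public static void main(String[] args) {
        ByteArrayOutputStream buffer = new ByteArrayOutputStream();
        PrintStream ps = new IgnoreCloseSketch(buffer);
        ps.print("hello");
        ps.close();          // ignored: the buffer stays usable
        buffer.write('!');
        System.out.println(buffer.toString()); // hello!
    }
}
```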

FILE: src/main/java/com/klarna/hiverunner/sql/HiveRunnerStatement.java
  class HiveRunnerStatement (line 21) | public class HiveRunnerStatement implements Statement {
    method HiveRunnerStatement (line 26) | public HiveRunnerStatement(int index, String sql) {
    method getIndex (line 31) | @Override
    method getSql (line 36) | @Override
    method hashCode (line 41) | @Override
    method equals (line 50) | @Override
    method toString (line 69) | @Override

FILE: src/main/java/com/klarna/hiverunner/sql/StatementLexer.java
  class StatementLexer (line 30) | public class StatementLexer {
    method StatementLexer (line 36) | public StatementLexer(Path cwd, Charset charset, CommandShellEmulator ...
    method internalApplyToStatement (line 42) | private List<String> internalApplyToStatement(String statement) {
    method applyToScript (line 47) | public List<String> applyToScript(String script) {
    method applyToStatement (line 57) | public List<String> applyToStatement(String statement) {
    method applyToPath (line 61) | public List<String> applyToPath(Path path) {

FILE: src/main/java/com/klarna/hiverunner/sql/cli/AbstractImportPostProcessor.java
  class AbstractImportPostProcessor (line 31) | public abstract class AbstractImportPostProcessor implements PostProcess...
    method AbstractImportPostProcessor (line 35) | public AbstractImportPostProcessor(StatementLexer lexer) {
    method statement (line 39) | @Override
    method getImportPath (line 49) | public abstract String getImportPath(String statement);
    method isImport (line 51) | public abstract boolean isImport(String statement);

FILE: src/main/java/com/klarna/hiverunner/sql/cli/CommandShellEmulator.java
  type CommandShellEmulator (line 28) | public interface CommandShellEmulator {
    method preProcessor (line 29) | PreProcessor preProcessor();
    method postProcessor (line 31) | PostProcessor postProcessor(StatementLexer lexer);
    method specialCharacters (line 33) | String specialCharacters();
    method splitterRules (line 35) | List<TokenRule> splitterRules();
    method getName (line 37) | String getName();

FILE: src/main/java/com/klarna/hiverunner/sql/cli/CommandShellEmulatorFactory.java
  class CommandShellEmulatorFactory (line 23) | public class CommandShellEmulatorFactory {
    method CommandShellEmulatorFactory (line 25) | private CommandShellEmulatorFactory() {
    method valueOf (line 28) | public static CommandShellEmulator valueOf(String name) {

FILE: src/main/java/com/klarna/hiverunner/sql/cli/CommentUtil.java
  class CommentUtil (line 20) | public final class CommentUtil {
    method CommentUtil (line 22) | private CommentUtil() {
    method stripFullLineComments (line 25) | public static String stripFullLineComments(String statement) {

FILE: src/main/java/com/klarna/hiverunner/sql/cli/DefaultPreProcessor.java
  type DefaultPreProcessor (line 22) | public enum DefaultPreProcessor implements PreProcessor {
    method script (line 25) | @Override
    method statement (line 30) | @Override

FILE: src/main/java/com/klarna/hiverunner/sql/cli/PostProcessor.java
  type PostProcessor (line 24) | public interface PostProcessor {
    method statement (line 25) | public List<String> statement(String statement);

FILE: src/main/java/com/klarna/hiverunner/sql/cli/PreProcessor.java
  type PreProcessor (line 20) | public interface PreProcessor {
    method script (line 21) | public String script(String script);
    method statement (line 23) | public String statement(String statement);

FILE: src/main/java/com/klarna/hiverunner/sql/cli/beeline/BeelineEmulator.java
  type BeelineEmulator (line 38) | public enum BeelineEmulator implements CommandShellEmulator {
    method preProcessor (line 43) | @Override
    method postProcessor (line 48) | @Override
    method getName (line 53) | @Override
    method specialCharacters (line 58) | @Override
    method splitterRules (line 63) | @Override

FILE: src/main/java/com/klarna/hiverunner/sql/cli/beeline/RunCommandPostProcessor.java
  class RunCommandPostProcessor (line 27) | class RunCommandPostProcessor extends AbstractImportPostProcessor {
    method RunCommandPostProcessor (line 31) | RunCommandPostProcessor(StatementLexer lexer) {
    method getImportPath (line 35) | @Override
    method isImport (line 45) | @Override

FILE: src/main/java/com/klarna/hiverunner/sql/cli/beeline/SqlLineCommandRule.java
  type SqlLineCommandRule (line 30) | public enum SqlLineCommandRule implements TokenRule {
    method triggers (line 33) | @Override
    method handle (line 38) | @Override

FILE: src/main/java/com/klarna/hiverunner/sql/cli/hive/HiveCliEmulator.java
  type HiveCliEmulator (line 38) | public enum HiveCliEmulator implements CommandShellEmulator {
    method preProcessor (line 41) | @Override
    method postProcessor (line 46) | @Override
    method getName (line 51) | @Override
    method specialCharacters (line 56) | @Override
    method splitterRules (line 61) | @Override

FILE: src/main/java/com/klarna/hiverunner/sql/cli/hive/PreV200HiveCliEmulator.java
  type PreV200HiveCliEmulator (line 32) | public enum PreV200HiveCliEmulator implements CommandShellEmulator {
    method preProcessor (line 35) | @Override
    method postProcessor (line 40) | @Override
    method getName (line 45) | @Override
    method specialCharacters (line 50) | @Override
    method splitterRules (line 55) | @Override

FILE: src/main/java/com/klarna/hiverunner/sql/cli/hive/PreV200HiveCliPreProcessor.java
  type PreV200HiveCliPreProcessor (line 32) | enum PreV200HiveCliPreProcessor implements PreProcessor {
    method script (line 35) | @Override
    method statement (line 40) | @Override

FILE: src/main/java/com/klarna/hiverunner/sql/cli/hive/SourceCommandPostProcessor.java
  class SourceCommandPostProcessor (line 27) | class SourceCommandPostProcessor extends AbstractImportPostProcessor {
    method SourceCommandPostProcessor (line 31) | SourceCommandPostProcessor(StatementLexer lexer) {
    method getImportPath (line 35) | @Override
    method isImport (line 41) | @Override

FILE: src/main/java/com/klarna/hiverunner/sql/split/BaseContext.java
  class BaseContext (line 26) | class BaseContext implements Context {
    method BaseContext (line 32) | BaseContext(StringTokenizer tokenizer) {
    method flush (line 36) | @Override
    method statement (line 44) | @Override
    method tokenizer (line 49) | @Override
    method append (line 54) | @Override
    method appendWith (line 59) | @Override
    method getStatements (line 64) | public List<String> getStatements() {

FILE: src/main/java/com/klarna/hiverunner/sql/split/CloseStatementRule.java
  type CloseStatementRule (line 23) | public enum CloseStatementRule implements TokenRule {
    method triggers (line 26) | @Override
    method handle (line 31) | @Override

FILE: src/main/java/com/klarna/hiverunner/sql/split/Consumer.java
  type Consumer (line 24) | public interface Consumer {
    method consume (line 26) | String consume(Context context);
    method consume (line 31) | @Override

FILE: src/main/java/com/klarna/hiverunner/sql/split/Context.java
  type Context (line 25) | public interface Context {
    method tokenizer (line 26) | StringTokenizer tokenizer();
    method statement (line 28) | String statement();
    method append (line 30) | void append(String chars);
    method appendWith (line 32) | void appendWith(Consumer consumer);
    method flush (line 34) | void flush();

FILE: src/main/java/com/klarna/hiverunner/sql/split/DefaultTokenRule.java
  type DefaultTokenRule (line 23) | public enum DefaultTokenRule implements TokenRule {
    method triggers (line 26) | @Override
    method handle (line 31) | @Override

FILE: src/main/java/com/klarna/hiverunner/sql/split/NewLineUtil.java
  type NewLineUtil (line 29) | enum NewLineUtil {
    method removeLeadingTrailingNewLines (line 35) | static String removeLeadingTrailingNewLines(String in) {

FILE: src/main/java/com/klarna/hiverunner/sql/split/PreserveCommentsRule.java
  type PreserveCommentsRule (line 24) | public enum PreserveCommentsRule implements TokenRule {
    method triggers (line 29) | @Override
    method handle (line 34) | @Override

FILE: src/main/java/com/klarna/hiverunner/sql/split/PreserveQuotesRule.java
  type PreserveQuotesRule (line 25) | public enum PreserveQuotesRule implements TokenRule {
    method triggers (line 30) | @Override
    method handle (line 35) | @Override
    class QuotedStringConsumer (line 40) | static class QuotedStringConsumer implements Consumer {
      method QuotedStringConsumer (line 44) | QuotedStringConsumer(String token) {
      method consume (line 48) | @Override

FILE: src/main/java/com/klarna/hiverunner/sql/split/StatementSplitter.java
  class StatementSplitter (line 31) | public class StatementSplitter {
    method StatementSplitter (line 38) | public StatementSplitter(CommandShellEmulator emulator) {
    method StatementSplitter (line 45) | public StatementSplitter(List<TokenRule> rules, String specialChars) {
    method split (line 50) | public List<Statement> split(String expression) {
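The split machinery above (`StatementSplitter` driving `TokenRule` implementations such as `PreserveQuotesRule` and `CloseStatementRule`) breaks a script into statements while respecting quoting. As a rough, self-contained sketch of the idea only — this is a simplification, not HiveRunner's actual implementation, and it ignores escaped quotes and comments — a quote-aware semicolon splitter might look like:

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch: split a script on semicolons, but keep semicolons that
// appear inside single-quoted strings, and discard empty statements.
public class StatementSplitterSketch {

    public static List<String> split(String script) {
        List<String> statements = new ArrayList<>();
        StringBuilder current = new StringBuilder();
        boolean inQuote = false;
        for (int i = 0; i < script.length(); i++) {
            char c = script.charAt(i);
            if (c == '\'') {
                inQuote = !inQuote;          // toggle quoted mode on single quote
                current.append(c);
            } else if (c == ';' && !inQuote) {
                String stmt = current.toString().trim();
                if (!stmt.isEmpty()) {
                    statements.add(stmt);    // close the statement, drop empties
                }
                current.setLength(0);
            } else {
                current.append(c);
            }
        }
        String tail = current.toString().trim();
        if (!tail.isEmpty()) {
            statements.add(tail);            // flush any trailing statement
        }
        return statements;
    }

    public static void main(String[] args) {
        System.out.println(split("SELECT 'a;b' FROM t; DROP TABLE t;"));
    }
}
```

The real splitter generalizes this by dispatching each token to a `TokenRule` keyed on trigger strings, which is what lets emulator-specific rules (e.g. `SqlLineCommandRule` for Beeline) plug into the same loop.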

FILE: src/main/java/com/klarna/hiverunner/sql/split/TokenRule.java
  type TokenRule (line 22) | public interface TokenRule {
    method triggers (line 23) | Set<String> triggers();
    method handle (line 25) | void handle(String token, Context context);

FILE: src/main/java/com/klarna/reflection/ReflectionUtils.java
  class ReflectionUtils (line 34) | public final class ReflectionUtils {
    method ReflectionUtils (line 39) | private ReflectionUtils() {
    method setStaticField (line 42) | public static void setStaticField(Class clazz, String fieldName, Objec...
    method setField (line 46) | public static void setField(Object instance, String fieldName, Object ...
    method setField (line 50) | private static void setField(Class clazz, Object instance, String fiel...
    method getField (line 73) | public static Optional<Field> getField(Class<?> type, final String fie...
    method getAllFields (line 83) | public static Set<Field> getAllFields(Class aClass, Predicate<? super ...
    method getFieldValue (line 87) | public static <T> T getFieldValue(Object testCase, String name, Class<...
    method getStaticFieldValue (line 91) | public static <T> T getStaticFieldValue(Class testCaseClass, String na...
    method getFieldValue (line 95) | private static <T> T getFieldValue(Object testCase, Class testCaseClas...
    method isOfType (line 116) | public static boolean isOfType(Field setupScriptField, Class type) {
    method havingFieldName (line 121) | private static Predicate<Field> havingFieldName(final String fieldName) {
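`ReflectionUtils` above injects values into (possibly private) fields of test classes, e.g. the `HiveShell` field targeted by `@HiveSQL`. As an illustrative sketch of the core mechanic only — the real utility also handles static fields and predicate-based lookups — a hierarchy-walking `setField` might look like:

```java
import java.lang.reflect.Field;

// Hypothetical sketch: walk the class hierarchy, make the named field
// accessible, and inject a value. Simplified relative to ReflectionUtils.
public class ReflectionSketch {

    public static void setField(Object instance, String fieldName, Object value) {
        Class<?> clazz = instance.getClass();
        while (clazz != null) {
            try {
                Field field = clazz.getDeclaredField(fieldName);
                field.setAccessible(true);       // bypass private access
                field.set(instance, value);
                return;
            } catch (NoSuchFieldException e) {
                clazz = clazz.getSuperclass();   // field may live in a parent class
            } catch (IllegalAccessException e) {
                throw new IllegalStateException(e);
            }
        }
        throw new IllegalArgumentException("No field '" + fieldName + "' found");
    }

    // Tiny demo target with a private field, standing in for a test class.
    public static class Target {
        private String message = "before";

        public String getMessage() {
            return message;
        }
    }

    public static void main(String[] args) {
        Target target = new Target();
        setField(target, "message", "after");
        System.out.println(target.getMessage());
    }
}
```

Walking `getSuperclass()` is what allows fields declared in an abstract base test class (as exercised by `AnnotatedFieldsInSuperClassTest`) to be populated.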

FILE: src/test/java/com/klarna/hiverunner/AggregateViewTest.java
  class AggregateViewTest (line 31) | @ExtendWith(HiveRunnerExtension.class)
    method aggregateView (line 41) | @Test

FILE: src/test/java/com/klarna/hiverunner/AnnotatedBaseTestClass.java
  class AnnotatedBaseTestClass (line 23) | @ExtendWith(HiveRunnerExtension.class)
    method setup (line 28) | @BeforeEach

FILE: src/test/java/com/klarna/hiverunner/AnnotatedFieldsInSuperClassTest.java
  class AnnotatedFieldsInSuperClassTest (line 21) | public class AnnotatedFieldsInSuperClassTest extends AnnotatedBaseTestCl...
    method testShellInitializedInAbstractTestClass (line 22) | @Test

FILE: src/test/java/com/klarna/hiverunner/BeelineRunTest.java
  class BeelineRunTest (line 36) | @RunWith(StandaloneHiveRunner.class)
    method testNestedImport (line 54) | @Test

FILE: src/test/java/com/klarna/hiverunner/BigResultSetTest.java
  class BigResultSetTest (line 30) | @ExtendWith(HiveRunnerExtension.class)
    method bigResultSetTest (line 41) | @Test

FILE: src/test/java/com/klarna/hiverunner/CommentTest.java
  class CommentTest (line 29) | @ExtendWith(HiveRunnerExtension.class)
    method testPreceedingFullLineComment (line 34) | @Test
    method testFullLineCommentInsideDeclaration (line 40) | @Test

FILE: src/test/java/com/klarna/hiverunner/CtasTest.java
  class CtasTest (line 28) | @ExtendWith(HiveRunnerExtension.class)
    method tablesShouldBeCreated (line 37) | @Test
    method verifyThatDataIsAvailableInCtas (line 44) | @Test
    method testCountCtas (line 51) | @Test

FILE: src/test/java/com/klarna/hiverunner/DisabledTimeoutTest.java
  class DisabledTimeoutTest (line 25) | @ExtendWith(HiveRunnerExtension.class)
    method finishAfterTimeoutTest (line 37) | @Test

FILE: src/test/java/com/klarna/hiverunner/ExecuteFileBasedScriptIntegrationTest.java
  class ExecuteFileBasedScriptIntegrationTest (line 37) | @RunWith(StandaloneHiveRunner.class)
    method testExecuteFileBasedScript (line 46) | @Test

FILE: src/test/java/com/klarna/hiverunner/ExecuteScriptIntegrationTest.java
  class ExecuteScriptIntegrationTest (line 34) | @RunWith(StandaloneHiveRunner.class)
    method testInsertRowWithExecuteScript (line 43) | @Test

FILE: src/test/java/com/klarna/hiverunner/HiveCliSourceTest.java
  class HiveCliSourceTest (line 36) | @RunWith(StandaloneHiveRunner.class)
    method testNestedImport (line 54) | @Test

FILE: src/test/java/com/klarna/hiverunner/HiveRunnerAnnotationsTest.java
  class HiveRunnerAnnotationsTest (line 39) | @ExtendWith(HiveRunnerExtension.class)
    method setup (line 70) | @BeforeEach
    method testHiveSQLLoaded (line 75) | @Test
    method testSetupScript (line 82) | @Test
    method testSetupScriptFromFile (line 89) | @Test
    method testSetupScriptFromPath (line 96) | @Test
    method testPropertiesLoaded (line 104) | @Test
    method testLoadStringResources (line 110) | @Test
    method testLoadFileResources (line 119) | @Test
    method testLoadPathResources (line 126) | @Test

FILE: src/test/java/com/klarna/hiverunner/HiveRunnerExtensionTest.java
  class HiveRunnerExtensionTest (line 30) | @ExtendWith(HiveRunnerExtension.class)
    method shellFindFiles (line 36) | @Test

FILE: src/test/java/com/klarna/hiverunner/HiveServerContainerTest.java
  class HiveServerContainerTest (line 35) | public class HiveServerContainerTest {
    method setup (line 40) | @BeforeEach
    method tearDown (line 49) | @AfterEach
    method testGetBasedir (line 54) | @Test
    method testExecuteStatementMR (line 59) | @Test
    method testExecuteStatementTez (line 66) | @Test
    method testExecuteStatementOutputStreamReset (line 73) | @Test
    method testExecuteStatementOutputStreamResetIfException (line 80) | @Test
    method testTearDownShouldNotThrowException (line 91) | @Test
    method testInvalidQuery (line 98) | @Test

FILE: src/test/java/com/klarna/hiverunner/HiveShellBeeLineEmulationTest.java
  class HiveShellBeeLineEmulationTest (line 34) | @ExtendWith(HiveRunnerExtension.class)
    method testQueryStripFullLineCommentFirstLine (line 46) | @Test
    method testQueryStripFullLineCommentNested (line 54) | @Test
    method testQueryStripFullLineComment (line 61) | @Test
    method testScriptStripFullLineCommentFirstLine (line 66) | @Test
    method testScriptStripFullLineCommentLastLine (line 73) | @Test
    method testScriptStripFullLineComment (line 80) | @Test
    method testScriptStripFullLineCommentNested (line 85) | @Test

FILE: src/test/java/com/klarna/hiverunner/HiveShellHiveCliEmulationTest.java
  class HiveShellHiveCliEmulationTest (line 34) | @ExtendWith(HiveRunnerExtension.class)
    method testQueryStripFullLineCommentFirstLine (line 46) | @Test
    method testQueryStripFullLineCommentNested (line 52) | @Test
    method testQueryStripFullLineComment (line 59) | @Test
    method testScriptStripFullLineCommentFirstLine (line 64) | @Test
    method testScriptStripFullLineCommentLastLine (line 71) | @Test
    method testScriptStripFullLineComment (line 78) | @Test
    method testScriptStripFullLineCommentNested (line 83) | @Test

FILE: src/test/java/com/klarna/hiverunner/HiveVariablesTest.java
  class HiveVariablesTest (line 26) | @RunWith(StandaloneHiveRunner.class)
    method substitutedVariablesShouldBeExpanded (line 35) | @Test
    method nestedSubstitutesShouldBeExpanded (line 43) | @Test
    method nestedSubstitutesShouldBeExpandedUsingDeprecatedSetProperty (line 56) | @Test
    method unexpandableSubstitutesShouldNotBeExpanded (line 70) | @Test
    method testHiveVarCli (line 78) | @Test
    method testHiveVar (line 85) | @Test
    method testSystemVar (line 92) | @Test
    method testEnvironmentVar (line 101) | @Test

FILE: src/test/java/com/klarna/hiverunner/InsertIntoTableIntegrationTest.java
  class InsertIntoTableIntegrationTest (line 33) | @ExtendWith(HiveRunnerExtension.class)
    method before (line 39) | @BeforeEach
    method insertDataIntoOrcPartitionedTable (line 44) | @Test
    method insertDataIntoTextPartitionedTable (line 49) | @Test
    method insertDataIntoSequenceFilePartitionedTable (line 54) | @Test
    method testInsertDataIntoPartitionedTable (line 59) | private void testInsertDataIntoPartitionedTable(String storedAs) {
    method insertDataIntoTablePrimitiveParsedStrings (line 87) | @Test
    method insertsDataFromTsvFileIntoOrcTable (line 150) | @Test
    method insertsDataFromTsvFileWithCustomDelimiterAndNullValue (line 172) | @Test
    method insertsDataFromFileWithCustomStrategy (line 193) | @Test

FILE: src/test/java/com/klarna/hiverunner/IntegerPartitionFormatTest.java
  class IntegerPartitionFormatTest (line 29) | @ExtendWith(HiveRunnerExtension.class)
    method repair (line 47) | @BeforeEach
    method testInteger (line 55) | @Test
    method testPrefixedInteger (line 60) | @Test
    method testPrefixedPartitionInteger (line 66) | @Test
    method testNonPrefixedPartitionInteger (line 72) | @Test

FILE: src/test/java/com/klarna/hiverunner/InteractiveHiveShellTest.java
  class InteractiveHiveShellTest (line 35) | @RunWith(StandaloneHiveRunner.class)
    method setupScriptShouldBeExecuted (line 44) | @Test
    method setupScriptsShouldBeExecuted (line 53) | @Test
    method setupScriptsShouldBeExecutedInOrder (line 69) | @Test
    method createFileBasedScript (line 81) | private File createFileBasedScript(String script) throws IOException {

FILE: src/test/java/com/klarna/hiverunner/LeftOuterJoinTest.java
  class LeftOuterJoinTest (line 28) | @ExtendWith(HiveRunnerExtension.class)
    method leftOuterJoin (line 58) | @Test

FILE: src/test/java/com/klarna/hiverunner/MSCKRepairNpeTest.java
  class MSCKRepairNpeTest (line 23) | @ExtendWith(HiveRunnerExtension.class)
    method testMsckRepair (line 29) | @Test

FILE: src/test/java/com/klarna/hiverunner/MacroTest.java
  class MacroTest (line 29) | @ExtendWith(HiveRunnerExtension.class)
    method testMacro (line 51) | @Test

FILE: src/test/java/com/klarna/hiverunner/MethodLevelResourceTest.java
  class MethodLevelResourceTest (line 30) | @ExtendWith(HiveRunnerExtension.class)
    method resourceLoadingAsStringTest (line 42) | @Test()
    method resourceLoadingAsFileTest (line 51) | @Test()

FILE: src/test/java/com/klarna/hiverunner/MultipleExecutionEnginesTest.java
  class MultipleExecutionEnginesTest (line 27) | @ExtendWith(HiveRunnerExtension.class)
    method test (line 34) | @Test

FILE: src/test/java/com/klarna/hiverunner/NeverEndingUdf.java
  class NeverEndingUdf (line 27) | public class NeverEndingUdf extends UDF {
    method evaluate (line 31) | public Text evaluate(Text value) {

FILE: src/test/java/com/klarna/hiverunner/NoTimeoutTest.java
  class NoTimeoutTest (line 28) | @ExtendWith(HiveRunnerExtension.class)
    method prepare (line 41) | @BeforeEach
    method test (line 60) | @Test

FILE: src/test/java/com/klarna/hiverunner/OrcSnappyTest.java
  class OrcSnappyTest (line 29) | @ExtendWith(HiveRunnerExtension.class)
    method tablesShouldBeCreated (line 38) | @Test
    method verifyThatDataIsAvailableInOrcNocomp (line 45) | @Test
    method verifyThatDataIsAvailableInOrcSnappy (line 52) | @Test
    method testCountOrcNocomp (line 59) | @Disabled
    method testCountOrcSnappy (line 68) | @Disabled

FILE: src/test/java/com/klarna/hiverunner/ParquetInsertionTest.java
  class ParquetInsertionTest (line 35) | @ExtendWith(HiveRunnerExtension.class)
    method testCanInsertToParquetTable (line 46) | @Test

FILE: src/test/java/com/klarna/hiverunner/PartitionSupportTest.java
  class PartitionSupportTest (line 34) | @ExtendWith(HiveRunnerExtension.class)
    method repairPartitions (line 56) | @BeforeEach
    method testSelectMax (line 66) | @Test
    method testShowTables (line 77) | @Test

FILE: src/test/java/com/klarna/hiverunner/ReservedKeywordTest.java
  class ReservedKeywordTest (line 26) | @ExtendWith(HiveRunnerExtension.class)
    method reservedKeywordsShouldBeAllowedWhenHiveConfIsSet (line 38) | @Test
    method reservedKeywordsShouldBeAllowedWhenIdentifierHasBacktickQuote (line 53) | @Test

FILE: src/test/java/com/klarna/hiverunner/ResourceOutputStreamTest.java
  class ResourceOutputStreamTest (line 33) | @ExtendWith(HiveRunnerExtension.class)
    method writeShouldOnlyBeAllowedBeforeStartHasBeenCalled (line 39) | @Test
    method itShouldBePossibleToAddAResourceByOutputStream (line 51) | @Test
    method sequenceFile (line 68) | @Test
    method createSequenceFileWriter (line 93) | private SequenceFile.Writer createSequenceFileWriter(OutputStream reso...

FILE: src/test/java/com/klarna/hiverunner/SchemaResetBetweenTestMethodsTest.java
  class SchemaResetBetweenTestMethodsTest (line 35) | @ExtendWith(HiveRunnerExtension.class)
    method createDatabaseBar (line 42) | @Test
    method createDatabaseFoo (line 64) | @Test

FILE: src/test/java/com/klarna/hiverunner/SerdeTest.java
  class SerdeTest (line 31) | @ExtendWith(HiveRunnerExtension.class)
    method testWithProvidedRegexSerde (line 49) | @Test
    method testWithCustomSerde (line 56) | @Test

FILE: src/test/java/com/klarna/hiverunner/SetHiveExecutionEngineTest.java
  class SetHiveExecutionEngineTest (line 27) | @ExtendWith(HiveRunnerExtension.class)
    method test (line 38) | @Test

FILE: src/test/java/com/klarna/hiverunner/SetPropertyTest.java
  class SetPropertyTest (line 24) | @ExtendWith(HiveRunnerExtension.class)
    method propertyShouldNotBeSetIfShellIsAlreadyStarted (line 30) | @Test
    method propertyShouldBeSetInHiveConfiguration (line 36) | @Test

FILE: src/test/java/com/klarna/hiverunner/SetTest.java
  class SetTest (line 26) | @ExtendWith(HiveRunnerExtension.class)
    method testWithSet (line 38) | @Test

FILE: src/test/java/com/klarna/hiverunner/SlowlyFailingUdf.java
  class SlowlyFailingUdf (line 25) | public class SlowlyFailingUdf extends UDF {
    method evaluate (line 30) | public Text evaluate(Text value) throws InterruptedException {

FILE: src/test/java/com/klarna/hiverunner/TestMethodIntegrityTest.java
  class TestMethodIntegrityTest (line 28) | @ExtendWith(HiveRunnerExtension.class)
    method collisionCourseTestMethodOne (line 34) | @Test
    method collisionCourseTestMethodTwo (line 51) | @Test

FILE: src/test/java/com/klarna/hiverunner/TimeoutAndRetryTest.java
  class TimeoutAndRetryTest (line 34) | @RunWith(StandaloneHiveRunner.class)
    method prepare (line 54) | @Before
    method neverEnd (line 78) | @Ignore
    method expectTest (line 84) | @Test(expected = IllegalArgumentException.class)
    method expectTimoutTest (line 89) | @Test(expected = TimeoutException.class)
    method throwOnSecondRun (line 96) | @Test(expected = ArrayIndexOutOfBoundsException.class)
    method throwOnSecondRun2 (line 118) | @Test(expected = TimeoutException.class)
    method endOnSecondRun (line 139) | @Test

FILE: src/test/java/com/klarna/hiverunner/ToUpperCaseSerDe.java
  class ToUpperCaseSerDe (line 35) | public class ToUpperCaseSerDe extends AbstractSerDe {
    method initialize (line 39) | @Override
    method getSerializedClass (line 44) | @Override
    method serialize (line 49) | @Override
    method deserialize (line 54) | @Override
    method getObjectInspector (line 60) | @Override
    method getSerDeStats (line 76) | @Override

FILE: src/test/java/com/klarna/hiverunner/UnresolvedResourcePathTest.java
  class UnresolvedResourcePathTest (line 26) | @ExtendWith(HiveRunnerExtension.class)
    method resourceFileShouldNotBeCreatedIfReferencesAreUnresolved (line 34) | @Test
    method resourceFileShouldBeCreatedInsideTempDir (line 40) | @Test
    method resourceFilePathShouldAlwaysBeInsideTempDir (line 47) | @Test

FILE: src/test/java/com/klarna/hiverunner/UserDefinedFunctionTest.java
  class UserDefinedFunctionTest (line 28) | @ExtendWith(HiveRunnerExtension.class)
    method udfMax (line 48) | @Test
    method udfMin (line 58) | @Test
    method regexp_extract (line 68) | @Test

FILE: src/test/java/com/klarna/hiverunner/builder/HiveShellBaseTest.java
  class HiveShellBaseTest (line 53) | @RunWith(MockitoJUnitRunner.class)
    method variableSubstitutionShouldBlowUpIfShellIsNotStarted (line 63) | @Test(expected = IllegalStateException.class)
    method setupScriptMayBeAddedBeforeStart (line 69) | @Test
    method setupScriptsShouldBeExecutedAtStart (line 76) | @Test
    method setupScriptMayNotBeAddedAfterShellIsStarted (line 85) | @Test(expected = IllegalStateException.class)
    method invalidFilePathShouldThrowException (line 92) | @Test(expected = IllegalArgumentException.class)
    method setupScriptsMayNotBeAddedAfterShellIsStarted (line 99) | @Test(expected = IllegalStateException.class)
    method executeScriptFile (line 106) | @Test
    method executeScriptCharsetFile (line 120) | @Test
    method executeScriptPath (line 134) | @Test
    method executeScriptCharsetPath (line 148) | @Test
    method executeScriptFileNotExists (line 162) | @Test(expected = IllegalArgumentException.class)
    method executeScriptNotStarted (line 171) | @Test(expected = IllegalStateException.class)
    method executeQueryFromFile (line 179) | @Test
    method executeQueryFromFileMoreThanOneStatement (line 196) | @Test(expected = IllegalArgumentException.class)
    method executeQueryFromFileZeroStatements (line 209) | @Test(expected = IllegalArgumentException.class)
    method scriptFilesAreImportedInQueries (line 222) | @Test
    method scriptFilesAreImportedInOtherScriptsHiveCli (line 239) | @Test
    method scriptFilesAreImportedInOtherScriptsBeeline (line 258) | @Test
    method createHiveCliShell (line 277) | private HiveShell createHiveCliShell(String... keyValues) {
    method createBeelineShell (line 281) | private HiveShell createBeelineShell(String... keyValues) {
    method createHiveShell (line 285) | private HiveShell createHiveShell(CommandShellEmulator emulation, Stri...
    method createHiveconf (line 301) | private HiveConf createHiveconf(Map<String, String> conf) {

FILE: src/test/java/com/klarna/hiverunner/config/HiveRunnerConfigTest.java
  class HiveRunnerConfigTest (line 31) | public class HiveRunnerConfigTest {
    method testSetHiveconfFromSystemProperty (line 33) | @Test
    method testSetHiveExecutionEngine (line 49) | @Test
    method testEnableTimeout (line 57) | @Test
    method testTimeoutSeconds (line 66) | @Test
    method testTimeoutRetries (line 74) | @Test
    method testCommandShellEmulator (line 82) | @Test
    method testSetCommandShellEmulator (line 98) | @Test
    method testEnableTimeoutDefault (line 107) | @Test
    method testTimeoutSecondsDefault (line 113) | @Test
    method testTimeoutRetriesDefault (line 119) | @Test
    method testCommandShellEmulatorDefault (line 125) | @Test

FILE: src/test/java/com/klarna/hiverunner/data/ConvertersTest.java
  class ConvertersTest (line 51) | public class ConvertersTest {
    method inputNull (line 53) | @Test
    method inputNotString (line 60) | @Test
    method stringTypeInfo (line 67) | @Test
    method booleanTypeInfo (line 72) | @Test
    method byteTypeInfo (line 79) | @Test
    method shortTypeInfo (line 88) | @Test
    method intTypeInfo (line 97) | @Test
    method longTypeInfo (line 106) | @Test
    method floatTypeInfo (line 115) | @Test
    method doubleTypeInfo (line 121) | @Test
    method dateTypeInfo (line 127) | @Test
    method timestampTypeInfo (line 134) | @Test
    method binaryTypeInfo (line 141) | @Test
    method otherTypeInfo (line 147) | @Test
    method assertConversionException (line 156) | private void assertConversionException(Object value, PrimitiveTypeInfo...

FILE: src/test/java/com/klarna/hiverunner/data/InsertIntoTableTest.java
  class InsertIntoTableTest (line 35) | @ExtendWith(MockitoExtension.class)
    method before (line 45) | @BeforeEach
    method withColumns (line 50) | @Test
    method withAllColumns (line 58) | @Test
    method newRow (line 65) | @Test
    method addRow (line 72) | @Test
    method setRow (line 80) | @Test
    method addRows (line 88) | @Test
    method addRowsWithFileParser (line 96) | @Test
    method copyRow (line 105) | @Test
    method set (line 112) | @Test
    method commit (line 119) | @Test
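The `InsertIntoTableTest` methods above outline a fluent API: choose columns, chain `newRow`/`addRow`/`set` calls, then `commit`. As a minimal in-memory sketch of that builder pattern — not HiveRunner's `insertInto`, which actually writes to a Hive table — the chaining shape is:

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

// Hypothetical sketch of a fluent row builder in the spirit of
// insertInto(...).withColumns(...).addRow(...).commit(). It only collects
// rows in memory; the real builder validates against the table schema.
public class RowBuilderSketch {
    private final List<String> columns;
    private final List<List<Object>> rows = new ArrayList<>();

    private RowBuilderSketch(List<String> columns) {
        this.columns = columns;
    }

    public static RowBuilderSketch withColumns(String... names) {
        return new RowBuilderSketch(Arrays.asList(names));
    }

    public RowBuilderSketch addRow(Object... values) {
        if (values.length != columns.size()) {
            throw new IllegalArgumentException(
                "Expected " + columns.size() + " values, got " + values.length);
        }
        rows.add(Arrays.asList(values));
        return this;                      // return this to enable chaining
    }

    public List<List<Object>> commit() {
        return rows;                      // a real builder would insert into the table
    }
}
```

Returning `this` from every mutator is what makes `withColumns("id", "name").addRow(1, "a").addRow(2, "b").commit()` read as a single expression.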

FILE: src/test/java/com/klarna/hiverunner/data/TableDataBuilderTest.java
  class TableDataBuilderTest (line 48) | @ExtendWith(MockitoExtension.class)
    method testUnknownColumnNameWithColumnMask (line 58) | @Test
    method testUnknownColumnNameOnSet (line 66) | @Test
    method testAddRowsFromWithMixedCaseColumnNames (line 77) | @Test
    method testAddRowWithNoArguments (line 90) | @Test
    method testAddRowWithIncorrectNumberOfArguments (line 98) | @Test
    method testCopyRowWhenNoRowToCopy (line 106) | @Test
    method testCopyRow (line 114) | @Test
    method testUnpartitionedEmptyRow (line 128) | @Test
    method testUnpartitionedWithColumnMask (line 140) | @Test
    method testPartitionedNullPartitionColumnValue (line 155) | @Test
    method testPartitionedSimple (line 163) | @Test
    method testPartitionedMultiplePartitionsAndRows (line 182) | @Test
    method table (line 218) | private static HCatTable table() {
    method column (line 222) | private static HCatFieldSchema column(String name) {
    method columns (line 230) | private static List<HCatFieldSchema> columns(String... names) {

FILE: src/test/java/com/klarna/hiverunner/data/TableDataInserterTest.java
  class TableDataInserterTest (line 44) | @ExtendWith(HiveRunnerExtension.class)
    method setUp (line 53) | @BeforeEach
    method insertsRowsIntoExistingTable (line 61) | @Test

FILE: src/test/java/com/klarna/hiverunner/data/TsvFileParserTest.java
  class TsvFileParserTest (line 30) | public class TsvFileParserTest {
    method parsesTsv (line 32) | @Test
    method parsesTsvNotEnoughFieldsInFile (line 42) | @Test
    method parsesTsvSubSelectFields (line 51) | @Test
    method parsesCsvWithEmptyFields (line 61) | @Test
    method csvWithCustomNullValue (line 71) | @Test
    method tsvWithHeader (line 81) | @Test
    method csvWithHeader (line 95) | @Test
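The `TsvFileParserTest` cases above cover delimiters, empty fields, custom null markers, and headers. As a stripped-down sketch of the parsing idea — a simplification, not the real `TsvFileParser`, and it treats the delimiter as a regex — field splitting with a null marker might look like:

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch: parse delimited text into rows, mapping a configurable
// null marker (e.g. "\N") to Java null. Headers and field sub-selection,
// which the real parser supports, are omitted here.
public class TsvSketch {

    public static List<List<Object>> parse(String content, String delimiter, String nullValue) {
        List<List<Object>> rows = new ArrayList<>();
        for (String line : content.split("\n")) {
            List<Object> row = new ArrayList<>();
            // limit -1 keeps trailing empty fields instead of dropping them
            for (String field : line.split(delimiter, -1)) {
                row.add(field.equals(nullValue) ? null : field);
            }
            rows.add(row);
        }
        return rows;
    }
}
```

Passing `-1` as the `split` limit matters for the empty-field cases (`parsesCsvWithEmptyFields`): with the default limit, Java drops trailing empty strings and rows would come out short.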

FILE: src/test/java/com/klarna/hiverunner/examples/HelloAnnotatedHiveRunnerTest.java
  class HelloAnnotatedHiveRunnerTest (line 46) | @ExtendWith(HiveRunnerExtension.class)
    method testTablesCreated (line 107) | @Test
    method testSelectFromFooWithCustomDelimiter (line 115) | @Test
    method testSelectFromFooWithTypeCheck (line 122) | @Test
    method testSelectFromCtas (line 133) | @Test

FILE: src/test/java/com/klarna/hiverunner/examples/HelloHiveRunnerParamaterizedTest.java
  class HelloHiveRunnerParamaterizedTest (line 36) | @ExtendWith(HiveRunnerExtension.class)
    method setupSourceDatabase (line 42) | @BeforeEach
    method testFileFormats (line 47) | @ParameterizedTest

FILE: src/test/java/com/klarna/hiverunner/examples/HelloHiveRunnerTest.java
  class HelloHiveRunnerTest (line 41) | @ExtendWith(HiveRunnerExtension.class)
    method setupSourceDatabase (line 47) | @BeforeEach
    method testMaxValueByYear (line 59) | @Test

FILE: src/test/java/com/klarna/hiverunner/examples/InsertTestDataTest.java
  class InsertTestDataTest (line 39) | @ExtendWith(HiveRunnerExtension.class)
    method setupDatabase (line 45) | @BeforeEach
    method insertRowsFromCode (line 55) | @Test
    method insertRowsFromCodeWithSelectedColumns (line 66) | @Test
    method insertRowsFromTsvFile (line 77) | @Test
    method insertRowsFromTsvFileWithHeader (line 88) | @Test
    method insertRowsFromTsvFileWithSubsetHeader (line 99) | @Test
    method insertRowsIntoPartitionedTableStoredAsSequencefileWithCustomDelimiterAndNullValue (line 110) | @Test
    method printResult (line 129) | private void printResult(List<Object[]> result, String methodName) {

FILE: src/test/java/com/klarna/hiverunner/examples/SetHiveConfValuesTest.java
  class SetHiveConfValuesTest (line 38) | @ExtendWith(HiveRunnerExtension.class)
    method setupDatabases (line 44) | @BeforeEach
    method useHiveConfValues (line 64) | @Test

FILE: src/test/java/com/klarna/hiverunner/examples/junit4/HelloAnnotatedHiveRunnerTest.java
  class HelloAnnotatedHiveRunnerTest (line 44) | @RunWith(StandaloneHiveRunner.class)
    method testTablesCreated (line 106) | @Test
    method testSelectFromFooWithCustomDelimiter (line 114) | @Test
    method testSelectFromFooWithTypeCheck (line 121) | @Test
    method testSelectFromCtas (line 132) | @Test

FILE: src/test/java/com/klarna/hiverunner/examples/junit4/HelloHiveRunnerTest.java
  class HelloHiveRunnerTest (line 41) | @RunWith(StandaloneHiveRunner.class)
    method setupSourceDatabase (line 46) | @Before
    method setupTargetDatabase (line 56) | @Before
    method testMaxValueByYear (line 61) | @Test

FILE: src/test/java/com/klarna/hiverunner/examples/junit4/InsertTestDataTest.java
  class InsertTestDataTest (line 40) | @RunWith(StandaloneHiveRunner.class)
    method setupDatabase (line 48) | @Before
    method insertRowsFromCode (line 59) | @Test
    method insertRowsFromCodeWithSelectedColumns (line 71) | @Test
    method insertRowsFromTsvFile (line 83) | @Test
    method insertRowsFromTsvFileWithHeader (line 95) | @Test
    method insertRowsFromTsvFileWithSubsetHeader (line 106) | @Test
    method insertRowsIntoPartitionedTableStoredAsSequencefileWithCustomDelimiterAndNullValue (line 118) | @Test
    method printResult (line 138) | public void printResult(List<Object[]> result) {

FILE: src/test/java/com/klarna/hiverunner/examples/junit4/SetHiveConfValuesTest.java
  class SetHiveConfValuesTest (line 36) | @RunWith(StandaloneHiveRunner.class)
    method setupDatabases (line 42) | @Before
    method useHiveConfValues (line 62) | @Test

FILE: src/test/java/com/klarna/hiverunner/io/IgnoreClosePrintStreamTest.java
  class IgnoreClosePrintStreamTest (line 29) | @ExtendWith(MockitoExtension.class)
    method closeIgnored (line 35) | @Test

FILE: src/test/java/com/klarna/hiverunner/sql/cli/AbstractImportPostProcessorTest.java
  class AbstractImportPostProcessorTest (line 37) | @ExtendWith(MockitoExtension.class)
    method setup (line 49) | @BeforeEach
    method scriptImport (line 54) | @Test
    method nonScriptImport (line 62) | @Test
    class TestAbstractImportPostProcessor (line 69) | private static class TestAbstractImportPostProcessor extends AbstractI...
      method TestAbstractImportPostProcessor (line 74) | public TestAbstractImportPostProcessor(boolean isImport, String path...
      method getImportPath (line 80) | @Override
      method isImport (line 85) | @Override

FILE: src/test/java/com/klarna/hiverunner/sql/cli/CommandShellEmulatorFactoryTest.java
  class CommandShellEmulatorFactoryTest (line 32) | public class CommandShellEmulatorFactoryTest {
    method beeline (line 34) | @Test
    method hiveCli (line 41) | @Test
    method hiveCliPreV130 (line 48) | @Test
    method unknown (line 55) | @Test

FILE: src/test/java/com/klarna/hiverunner/sql/cli/CommentUtilTest.java
  class CommentUtilTest (line 27) | public class CommentUtilTest {
    method nothingToStrip (line 29) | @Test
    method commentToStrip (line 34) | @Test

FILE: src/test/java/com/klarna/hiverunner/sql/cli/beeline/BeelineEmulatorTest.java
  class BeelineEmulatorTest (line 24) | public class BeelineEmulatorTest {
    method testFullLineCommentAndSetStatementBeeLine (line 26) | @Test
    method testFullLineCommentStatementBeeLine (line 32) | @Test
    method testFullLineCommentAndSetScriptBeeLine (line 38) | @Test
    method testFullLineCommentScriptBeeLine (line 44) | @Test

FILE: src/test/java/com/klarna/hiverunner/sql/cli/beeline/BeelineStatementSplitterTest.java
  class BeelineStatementSplitterTest (line 31) | public class BeelineStatementSplitterTest {
    method asStatementList (line 35) | private List<Statement> asStatementList(String... strings) {
    method testSplitBasic (line 44) | @Test
    method testRemoveTrailingSemiColon (line 51) | @Test
    method testDiscardRedundantSemiColons (line 58) | @Test
    method testDiscardTrailingSpace (line 65) | @Test
    method testDiscardEmptyStatements (line 72) | @Test
    method testCommentPreserved (line 79) | @Test
    method testCommentWithSingleQuote (line 86) | @Test
    method testCommentWithDoubleQuote (line 93) | @Test
    method testCommentWithSemiColon (line 100) | @Test
    method testMultilineStatementWithComment (line 107) | @Test
    method testRealLifeExample (line 114) | @Test
    method realLifeWithComments (line 127) | @Test
    method testPreserveQuoted (line 142) | @Test
    method beelineSqlLineCommandsAreSupported (line 150) | @Test
    method testReadUntilEndOfLine (line 162) | @Test
    method testReadQuoted (line 167) | @Test

FILE: src/test/java/com/klarna/hiverunner/sql/cli/beeline/RunCommandPostProcessorTest.java
  class RunCommandPostProcessorTest (line 38) | @ExtendWith(MockitoExtension.class)
    method setup (line 46) | @BeforeEach
    method isImport (line 51) | @Test
    method isImportSpaces (line 56) | @Test
    method isNotImport (line 61) | @Test
    method importPathValid (line 66) | @Test
    method importPathInvalid (line 71) | @Test
    method importStatement (line 76) | @Test
    method importStatementSpaces (line 84) | @Test
    method generalStatement (line 92) | @Test

FILE: src/test/java/com/klarna/hiverunner/sql/cli/beeline/SqlLineCommandRuleTest.java
  class SqlLineCommandRuleTest (line 32) | @ExtendWith(MockitoExtension.class)
    method handleStart (line 38) | @Test
    method handleOther (line 47) | @Test

FILE: src/test/java/com/klarna/hiverunner/sql/cli/hive/HiveCliEmulatorTest.java
  class HiveCliEmulatorTest (line 24) | public class HiveCliEmulatorTest {
    method testFullLineCommentAndSetStatementHiveCli (line 25) | @Test
    method testFullLineCommentStatementHiveCli (line 31) | @Test
    method testFullLineCommentAndSetScriptHiveCli (line 37) | @Test
    method testFullLineCommentScriptHiveCli (line 43) | @Test

FILE: src/test/java/com/klarna/hiverunner/sql/cli/hive/HiveCliStatementSplitterTest.java
  class HiveCliStatementSplitterTest (line 31) | public class HiveCliStatementSplitterTest {
    method asStatementList (line 35) | private List<Statement> asStatementList(String... strings) {
    method testSplitBasic (line 44) | @Test
    method testRemoveTrailingSemiColon (line 51) | @Test
    method testDiscardRedundantSemiColons (line 58) | @Test
    method testDiscardTrailingSpace (line 65) | @Test
    method testDiscardEmptyStatements (line 72) | @Test
    method testCommentPreserved (line 79) | @Test
    method testCommentWithSingleQuote (line 86) | @Test
    method testCommentWithDoubleQuote (line 93) | @Test
    method testCommentWithSemiColon (line 100) | @Test
    method testMultilineStatementWithComment (line 107) | @Test
    method testRealLifeExample (line 114) | @Test
    method realLifeWithComments (line 127) | @Test
    method testPreserveQuoted (line 142) | @Test
    method hiveCliSourceCommandsAreSupported (line 150) | @Test
    method testReadQuoted (line 162) | @Test

FILE: src/test/java/com/klarna/hiverunner/sql/cli/hive/PreV200HiveCliEmulatorTest.java
  class PreV200HiveCliEmulatorTest (line 24) | public class PreV200HiveCliEmulatorTest {
    method testFullLineCommentAndSetStatementHiveCli (line 25) | @Test
    method testFullLineCommentStatementHiveCli (line 31) | @Test
    method testFullLineCommentAndSetScriptHiveCli (line 37) | @Test
    method testFullLineCommentScriptHiveCli (line 43) | @Test

FILE: src/test/java/com/klarna/hiverunner/sql/cli/hive/SourceCommandPostProcessorTest.java
  class SourceCommandPostProcessorTest (line 37) | @ExtendWith(MockitoExtension.class)
    method setup (line 45) | @BeforeEach
    method isImport (line 50) | @Test
    method isImportSpaces (line 55) | @Test
    method isImportCaseInsensitive (line 60) | @Test
    method isNotImport (line 65) | @Test
    method importPathValid (line 70) | @Test
    method importStatement (line 75) | @Test
    method importStatementSpaces (line 83) | @Test
    method generalStatement (line 91) | @Test
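The post-processor tests listed above (isImport, importPathValid, importStatement, …) describe recognizing a Hive CLI `SOURCE` command and extracting its script path. As a rough, illustrative-only sketch of that behaviour — not HiveRunner's actual `SourceCommandPostProcessor`, and with made-up paths — the checks might look like:

```java
/**
 * Illustrative sketch of SOURCE-command recognition: case-insensitive,
 * tolerant of surrounding whitespace, with the remainder of the statement
 * taken as the script path. Not HiveRunner's implementation.
 */
public class SourceCommandSketch {

    // True when the statement begins with the SOURCE keyword.
    public static boolean isImport(String statement) {
        return statement.trim().toLowerCase().startsWith("source ");
    }

    // Everything after the SOURCE keyword is treated as the script path.
    public static String importPath(String statement) {
        return statement.trim().substring("source".length()).trim();
    }

    public static void main(String[] args) {
        System.out.println(isImport("  SOURCE /tmp/setup.hql"));   // true
        System.out.println(isImport("SELECT 1"));                  // false
        System.out.println(importPath("source   /tmp/setup.hql")); // /tmp/setup.hql
    }
}
```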

FILE: src/test/java/com/klarna/hiverunner/sql/split/BaseContextTest.java
  class BaseContextTest (line 33) | @ExtendWith(MockitoExtension.class)
    method appendAndFlush (line 41) | @Test
    method statementAndFlush (line 50) | @Test
    method appendWith (line 58) | @Test

FILE: src/test/java/com/klarna/hiverunner/sql/split/CloseStatementRuleTest.java
  class CloseStatementRuleTest (line 26) | @ExtendWith(MockitoExtension.class)
    method handle (line 34) | @Test

FILE: src/test/java/com/klarna/hiverunner/sql/split/ConsumerEolTest.java
  class ConsumerEolTest (line 33) | @ExtendWith(MockitoExtension.class)
    method setup (line 41) | @BeforeEach
    method consumeLine (line 46) | @Test
    method consumeNoCR (line 53) | @Test
    method consumeMultiLine (line 60) | @Test

FILE: src/test/java/com/klarna/hiverunner/sql/split/DefaultTokenRuleTest.java
  class DefaultTokenRuleTest (line 26) | @ExtendWith(MockitoExtension.class)
    method handle (line 34) | @Test

FILE: src/test/java/com/klarna/hiverunner/sql/split/NewLineUtilTest.java
  class NewLineUtilTest (line 26) | public class NewLineUtilTest {
    method typical (line 28) | @Test

FILE: src/test/java/com/klarna/hiverunner/sql/split/PreserveCommentsRuleTest.java
  class PreserveCommentsRuleTest (line 26) | public class PreserveCommentsRuleTest {
    method withInlineComment (line 30) | @Test
    method noComment (line 41) | @Test

FILE: src/test/java/com/klarna/hiverunner/sql/split/PreserveQuotesRuleTest.java
  class PreserveQuotesRuleTest (line 26) | public class PreserveQuotesRuleTest {
    method singleQuotes (line 30) | @Test
    method singleQuotesCrossLine (line 38) | @Test
    method singleEscapedQuotes (line 46) | @Test
    method doubleQuotes (line 54) | @Test
    method doubleQuotesCrossLine (line 62) | @Test
    method doubleEscapedQuotes (line 70) | @Test
    method doubleSingleQuotes (line 78) | @Test
    method singleDoubleQuotes (line 86) | @Test

FILE: src/test/java/com/klarna/hiverunner/sql/split/StatementSplitterTest.java
  class StatementSplitterTest (line 40) | @ExtendWith(MockitoExtension.class)
    method asStatementList (line 48) | private List<Statement> asStatementList(String... strings) {
    method setupEmulator (line 57) | @BeforeEach
    method defaultRule (line 66) | @Test
    method multipleRules (line 71) | @Test
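The splitter tests above (testPreserveQuoted, testCommentWithSemiColon, testDiscardEmptyStatements, …) exercise splitting a script on semicolons while keeping quoted strings and `--` comments intact and dropping empty statements. As a minimal, illustrative-only sketch of that technique — HiveRunner's actual `StatementSplitter` is rule-based and pluggable, which this is not — a naive splitter could be:

```java
import java.util.ArrayList;
import java.util.List;

/**
 * Naive semicolon splitter that ignores semicolons inside quotes and
 * inside "--" comments, and discards empty statements. Illustrative only.
 */
public class NaiveStatementSplitter {

    public static List<String> split(String script) {
        List<String> statements = new ArrayList<>();
        StringBuilder current = new StringBuilder();
        char quote = 0;            // active quote char, or 0 when outside quotes
        boolean inComment = false; // inside a "--" comment until end of line
        for (int i = 0; i < script.length(); i++) {
            char c = script.charAt(i);
            if (inComment) {
                current.append(c);
                if (c == '\n') inComment = false;
            } else if (quote != 0) {
                current.append(c);
                if (c == quote) quote = 0;
            } else if (c == '\'' || c == '"') {
                quote = c;
                current.append(c);
            } else if (c == '-' && i + 1 < script.length() && script.charAt(i + 1) == '-') {
                inComment = true;
                current.append(c);
            } else if (c == ';') {
                String stmt = current.toString().trim();
                if (!stmt.isEmpty()) statements.add(stmt); // discard empties
                current.setLength(0);
            } else {
                current.append(c);
            }
        }
        String tail = current.toString().trim();
        if (!tail.isEmpty()) statements.add(tail);
        return statements;
    }

    public static void main(String[] args) {
        // The quoted ';' and the ';' inside the comment do not split.
        System.out.println(split("select ';' from t; -- note; here\nselect 1;;"));
    }
}
```

A real splitter (as the class names above suggest) would instead dispatch each token to rules such as PreserveQuotesRule and PreserveCommentsRule, which keeps the state machine extensible per CLI emulator.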

FILE: src/test/resources/AggregateViewTest/create_table.sql
  type `db.mvtdescriptionchangeinfo` (line 3) | CREATE EXTERNAL TABLE `db.mvtdescriptionchangeinfo`(
  type db (line 13) | CREATE VIEW db.latestnodemvtchanges AS
  type db (line 19) | CREATE VIEW db.latesttestchangepairs AS

FILE: src/test/resources/CtasTest/ctas.sql
  type foo (line 1) | CREATE EXTERNAL TABLE foo (s1 string, s2 string)
  type foo_prim (line 7) | CREATE TABLE foo_prim as select * from foo

FILE: src/test/resources/HelloHiveRunnerTest/create_ctas.sql
  type foo_prim (line 3) | CREATE TABLE foo_prim as select i, s from foo

FILE: src/test/resources/HelloHiveRunnerTest/create_max.sql
  type my_schema (line 3) | CREATE EXTERNAL TABLE my_schema.result (year STRING, value INT)

FILE: src/test/resources/HelloHiveRunnerTest/create_table.sql
  type foo (line 3) | CREATE EXTERNAL TABLE foo (i int, s string)

FILE: src/test/resources/HiveRunnerAnnotationsTest/hql1.sql
  type foo (line 1) | CREATE EXTERNAL TABLE foo (s1 int, s2 string)

FILE: src/test/resources/HiveRunnerExtensionTest/test_query.sql
  type testdb (line 3) | CREATE EXTERNAL TABLE testdb.test_table

FILE: src/test/resources/OrcSnappyTest/ctas.sql
  type foo (line 1) | CREATE EXTERNAL TABLE foo (s1 string, s2 string)
  type foo_orc_nocomp (line 9) | CREATE TABLE foo_orc_nocomp as select * from foo
  type foo_orc_snappy (line 24) | CREATE TABLE foo_orc_snappy as select * from foo

FILE: src/test/resources/PartitionSupportTest/hql_example.sql
  type hiveconf (line 1) | CREATE EXTERNAL TABLE ${hiveconf:table.name} (s1 string, s2 string, s3 s...

FILE: src/test/resources/SerdeTest/create_table.sql
  type serde_test (line 1) | CREATE TABLE serde_test (

FILE: src/test/resources/SerdeTest/hql_custom_serde.sql
  type customSerdeTable (line 1) | CREATE EXTERNAL TABLE customSerdeTable (s1 string, s2 string, s3 string)
Condensed preview — 184 files, each showing path, character count, and a content snippet.
[
  {
    "path": ".github/CODEOWNERS",
    "chars": 36,
    "preview": "* @HiveRunner/hiverunner-committers\n"
  },
  {
    "path": ".github/ISSUE_TEMPLATE/bug_report.md",
    "chars": 1650,
    "preview": "---\nname: Bug report\nabout: Create a report to help us improve\n\n---\n<!-- \n Before raising a bug report please consider t"
  },
  {
    "path": ".github/ISSUE_TEMPLATE/feature_request.md",
    "chars": 570,
    "preview": "---\nname: Feature request\nabout: Suggest an idea for this project\n\n---\n\n**Is your feature request related to a problem? "
  },
  {
    "path": ".github/workflows/deploy.yml",
    "chars": 1570,
    "preview": "name: Deploy SNAPSHOT\non:\n  workflow_dispatch:\n    inputs:\n      branch:\n        description: \"The branch to use to depl"
  },
  {
    "path": ".github/workflows/main.yml",
    "chars": 454,
    "preview": "name: build\n\non: \n  pull_request:\n  push:\n    branches: \n      - main\n\njobs:\n  test:\n    name: Package and run all tests"
  },
  {
    "path": ".github/workflows/release.yml",
    "chars": 1811,
    "preview": "name: Release to Maven Central\non:\n  workflow_dispatch:\n    inputs:\n      branch:\n        description: \"The branch to us"
  },
  {
    "path": ".gitignore",
    "chars": 921,
    "preview": "# Project files #\n#################\n*.iml\n*.ipr\n*.iws\nnbactions.xml\n/.idea/\n\n# Compiled source #\n###################\n*.c"
  },
  {
    "path": "CHANGELOG.md",
    "chars": 10925,
    "preview": "# Changelog\nAll notable changes to this project will be documented in this file.\n\nThe format is based on [Keep a Changel"
  },
  {
    "path": "CODE-OF-CONDUCT.md",
    "chars": 3255,
    "preview": "# Contributor Covenant Code of Conduct\n\n## Our Pledge\n\nIn the interest of fostering an open and welcoming environment, w"
  },
  {
    "path": "CONTRIBUTING.md",
    "chars": 1744,
    "preview": "# How To Contribute\n\nWe'd love to accept your patches and contributions to this project. There are just a few guidelines"
  },
  {
    "path": "LICENSE.txt",
    "chars": 11358,
    "preview": "\n                                 Apache License\n                           Version 2.0, January 2004\n                  "
  },
  {
    "path": "README.md",
    "chars": 14716,
    "preview": "\n[![Maven Central](https://maven-badges.herokuapp.com/maven-central/io.github.hiverunner/hiverunner/badge.svg?subject=io"
  },
  {
    "path": "RELEASING.md",
    "chars": 1524,
    "preview": "# Releasing HiveRunner to Maven Central\n\nHiveRunner has been set up to build continuously and also to deploy SNAPSHOTS t"
  },
  {
    "path": "pom.xml",
    "chars": 11834,
    "preview": "<project xmlns=\"http://maven.apache.org/POM/4.0.0\" xmlns:xsi=\"http://www.w3.org/2001/XMLSchema-instance\" xsi:schemaLocat"
  },
  {
    "path": "src/main/java/com/klarna/hiverunner/HiveRunnerCore.java",
    "chars": 9416,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/main/java/com/klarna/hiverunner/HiveRunnerExtension.java",
    "chars": 4311,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/main/java/com/klarna/hiverunner/HiveRunnerRule.java",
    "chars": 3161,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/main/java/com/klarna/hiverunner/HiveServerContainer.java",
    "chars": 8419,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/main/java/com/klarna/hiverunner/HiveServerContext.java",
    "chars": 2078,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/main/java/com/klarna/hiverunner/HiveShell.java",
    "chars": 9917,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/main/java/com/klarna/hiverunner/HiveShellContainer.java",
    "chars": 1256,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/main/java/com/klarna/hiverunner/StandaloneHiveRunner.java",
    "chars": 9206,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/main/java/com/klarna/hiverunner/StandaloneHiveServerContext.java",
    "chars": 12416,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/main/java/com/klarna/hiverunner/ThrowOnTimeout.java",
    "chars": 3817,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/main/java/com/klarna/hiverunner/TimeoutException.java",
    "chars": 1302,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/main/java/com/klarna/hiverunner/annotations/HiveProperties.java",
    "chars": 1141,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/main/java/com/klarna/hiverunner/annotations/HiveResource.java",
    "chars": 1410,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/main/java/com/klarna/hiverunner/annotations/HiveRunnerSetup.java",
    "chars": 1016,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/main/java/com/klarna/hiverunner/annotations/HiveSQL.java",
    "chars": 1824,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/main/java/com/klarna/hiverunner/annotations/HiveSetupScript.java",
    "chars": 1163,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/main/java/com/klarna/hiverunner/builder/HiveResource.java",
    "chars": 2226,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/main/java/com/klarna/hiverunner/builder/HiveRunnerScript.java",
    "chars": 2584,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/main/java/com/klarna/hiverunner/builder/HiveShellBase.java",
    "chars": 15669,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/main/java/com/klarna/hiverunner/builder/HiveShellBuilder.java",
    "chars": 3544,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/main/java/com/klarna/hiverunner/builder/HiveShellTearable.java",
    "chars": 1577,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/main/java/com/klarna/hiverunner/builder/Script.java",
    "chars": 878,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/main/java/com/klarna/hiverunner/builder/Statement.java",
    "chars": 891,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/main/java/com/klarna/hiverunner/config/HiveRunnerConfig.java",
    "chars": 8787,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/main/java/com/klarna/hiverunner/data/Converters.java",
    "chars": 7948,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/main/java/com/klarna/hiverunner/data/FileParser.java",
    "chars": 1754,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/main/java/com/klarna/hiverunner/data/InsertIntoTable.java",
    "chars": 6386,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/main/java/com/klarna/hiverunner/data/TableDataBuilder.java",
    "chars": 6907,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/main/java/com/klarna/hiverunner/data/TableDataInserter.java",
    "chars": 2694,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/main/java/com/klarna/hiverunner/data/TsvFileParser.java",
    "chars": 4652,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/main/java/com/klarna/hiverunner/io/IgnoreClosePrintStream.java",
    "chars": 953,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/main/java/com/klarna/hiverunner/sql/HiveRunnerStatement.java",
    "chars": 2013,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/main/java/com/klarna/hiverunner/sql/StatementLexer.java",
    "chars": 2666,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/main/java/com/klarna/hiverunner/sql/cli/AbstractImportPostProcessor.java",
    "chars": 1691,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/main/java/com/klarna/hiverunner/sql/cli/CommandShellEmulator.java",
    "chars": 1138,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/main/java/com/klarna/hiverunner/sql/cli/CommandShellEmulatorFactory.java",
    "chars": 1456,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/main/java/com/klarna/hiverunner/sql/cli/CommentUtil.java",
    "chars": 1358,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/main/java/com/klarna/hiverunner/sql/cli/DefaultPreProcessor.java",
    "chars": 1099,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/main/java/com/klarna/hiverunner/sql/cli/PostProcessor.java",
    "chars": 896,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/main/java/com/klarna/hiverunner/sql/cli/PreProcessor.java",
    "chars": 874,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/main/java/com/klarna/hiverunner/sql/cli/beeline/BeelineEmulator.java",
    "chars": 2509,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/main/java/com/klarna/hiverunner/sql/cli/beeline/RunCommandPostProcessor.java",
    "chars": 1737,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/main/java/com/klarna/hiverunner/sql/cli/beeline/SqlLineCommandRule.java",
    "chars": 1663,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/main/java/com/klarna/hiverunner/sql/cli/hive/HiveCliEmulator.java",
    "chars": 2413,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/main/java/com/klarna/hiverunner/sql/cli/hive/PreV200HiveCliEmulator.java",
    "chars": 1824,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/main/java/com/klarna/hiverunner/sql/cli/hive/PreV200HiveCliPreProcessor.java",
    "chars": 1542,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/main/java/com/klarna/hiverunner/sql/cli/hive/SourceCommandPostProcessor.java",
    "chars": 1562,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/main/java/com/klarna/hiverunner/sql/split/BaseContext.java",
    "chars": 1818,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/main/java/com/klarna/hiverunner/sql/split/CloseStatementRule.java",
    "chars": 1139,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/main/java/com/klarna/hiverunner/sql/split/Consumer.java",
    "chars": 1507,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/main/java/com/klarna/hiverunner/sql/split/Context.java",
    "chars": 1010,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/main/java/com/klarna/hiverunner/sql/split/DefaultTokenRule.java",
    "chars": 1077,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/main/java/com/klarna/hiverunner/sql/split/NewLineUtil.java",
    "chars": 2156,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/main/java/com/klarna/hiverunner/sql/split/PreserveCommentsRule.java",
    "chars": 1345,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/main/java/com/klarna/hiverunner/sql/split/PreserveQuotesRule.java",
    "chars": 2098,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/main/java/com/klarna/hiverunner/sql/split/StatementSplitter.java",
    "chars": 2505,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/main/java/com/klarna/hiverunner/sql/split/TokenRule.java",
    "chars": 900,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/main/java/com/klarna/reflection/ReflectionUtils.java",
    "chars": 4992,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/main/license/APACHE-2.txt",
    "chars": 608,
    "preview": "Copyright (C) 2013-2021 Klarna AB\nCopyright (C) ${license.git.copyrightYears} ${owner}\n\nLicensed under the Apache Licens"
  },
  {
    "path": "src/test/java/com/klarna/hiverunner/AggregateViewTest.java",
    "chars": 1893,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/test/java/com/klarna/hiverunner/AnnotatedBaseTestClass.java",
    "chars": 1375,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/test/java/com/klarna/hiverunner/AnnotatedFieldsInSuperClassTest.java",
    "chars": 938,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/test/java/com/klarna/hiverunner/BeelineRunTest.java",
    "chars": 3766,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/test/java/com/klarna/hiverunner/BigResultSetTest.java",
    "chars": 2019,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/test/java/com/klarna/hiverunner/CommentTest.java",
    "chars": 1463,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/test/java/com/klarna/hiverunner/CtasTest.java",
    "chars": 1982,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/test/java/com/klarna/hiverunner/DisabledTimeoutTest.java",
    "chars": 1355,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/test/java/com/klarna/hiverunner/ExecuteFileBasedScriptIntegrationTest.java",
    "chars": 2062,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/test/java/com/klarna/hiverunner/ExecuteScriptIntegrationTest.java",
    "chars": 2002,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/test/java/com/klarna/hiverunner/HiveCliSourceTest.java",
    "chars": 3778,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/test/java/com/klarna/hiverunner/HiveRunnerAnnotationsTest.java",
    "chars": 4778,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/test/java/com/klarna/hiverunner/HiveRunnerExtensionTest.java",
    "chars": 1446,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/test/java/com/klarna/hiverunner/HiveServerContainerTest.java",
    "chars": 3583,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/test/java/com/klarna/hiverunner/HiveShellBeeLineEmulationTest.java",
    "chars": 3158,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/test/java/com/klarna/hiverunner/HiveShellHiveCliEmulationTest.java",
    "chars": 3039,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/test/java/com/klarna/hiverunner/HiveVariablesTest.java",
    "chars": 4067,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/test/java/com/klarna/hiverunner/InsertIntoTableIntegrationTest.java",
    "chars": 8709,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/test/java/com/klarna/hiverunner/IntegerPartitionFormatTest.java",
    "chars": 2709,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/test/java/com/klarna/hiverunner/InteractiveHiveShellTest.java",
    "chars": 3176,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/test/java/com/klarna/hiverunner/LeftOuterJoinTest.java",
    "chars": 2629,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/test/java/com/klarna/hiverunner/MSCKRepairNpeTest.java",
    "chars": 1563,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/test/java/com/klarna/hiverunner/MacroTest.java",
    "chars": 1878,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/test/java/com/klarna/hiverunner/MethodLevelResourceTest.java",
    "chars": 2139,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/test/java/com/klarna/hiverunner/MultipleExecutionEnginesTest.java",
    "chars": 2361,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/test/java/com/klarna/hiverunner/NeverEndingUdf.java",
    "chars": 1260,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/test/java/com/klarna/hiverunner/NoTimeoutTest.java",
    "chars": 2426,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/test/java/com/klarna/hiverunner/OrcSnappyTest.java",
    "chars": 2865,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/test/java/com/klarna/hiverunner/ParquetInsertionTest.java",
    "chars": 2020,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/test/java/com/klarna/hiverunner/PartitionSupportTest.java",
    "chars": 2960,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/test/java/com/klarna/hiverunner/ReservedKeywordTest.java",
    "chars": 2240,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/test/java/com/klarna/hiverunner/ResourceOutputStreamTest.java",
    "chars": 3658,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/test/java/com/klarna/hiverunner/SchemaResetBetweenTestMethodsTest.java",
    "chars": 2907,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/test/java/com/klarna/hiverunner/SerdeTest.java",
    "chars": 2421,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/test/java/com/klarna/hiverunner/SetHiveExecutionEngineTest.java",
    "chars": 1491,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/test/java/com/klarna/hiverunner/SetPropertyTest.java",
    "chars": 1436,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/test/java/com/klarna/hiverunner/SetTest.java",
    "chars": 1367,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/test/java/com/klarna/hiverunner/SlowlyFailingUdf.java",
    "chars": 1245,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/test/java/com/klarna/hiverunner/TestMethodIntegrityTest.java",
    "chars": 2688,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/test/java/com/klarna/hiverunner/TimeoutAndRetryTest.java",
    "chars": 5243,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/test/java/com/klarna/hiverunner/ToUpperCaseSerDe.java",
    "chars": 2875,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/test/java/com/klarna/hiverunner/UnresolvedResourcePathTest.java",
    "chars": 1794,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/test/java/com/klarna/hiverunner/UserDefinedFunctionTest.java",
    "chars": 2616,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/test/java/com/klarna/hiverunner/builder/HiveShellBaseTest.java",
    "chars": 10356,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/test/java/com/klarna/hiverunner/config/HiveRunnerConfigTest.java",
    "chars": 5462,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/test/java/com/klarna/hiverunner/data/ConvertersTest.java",
    "chars": 7164,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/test/java/com/klarna/hiverunner/data/InsertIntoTableTest.java",
    "chars": 3067,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/test/java/com/klarna/hiverunner/data/TableDataBuilderTest.java",
    "chars": 8798,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/test/java/com/klarna/hiverunner/data/TableDataInserterTest.java",
    "chars": 3892,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/test/java/com/klarna/hiverunner/data/TsvFileParserTest.java",
    "chars": 4990,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/test/java/com/klarna/hiverunner/examples/HelloAnnotatedHiveRunnerTest.java",
    "chars": 5380,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/test/java/com/klarna/hiverunner/examples/HelloHiveRunnerParamaterizedTest.java",
    "chars": 2516,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/test/java/com/klarna/hiverunner/examples/HelloHiveRunnerTest.java",
    "chars": 2918,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/test/java/com/klarna/hiverunner/examples/InsertTestDataTest.java",
    "chars": 5222,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/test/java/com/klarna/hiverunner/examples/SetHiveConfValuesTest.java",
    "chars": 2693,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/test/java/com/klarna/hiverunner/examples/junit4/HelloAnnotatedHiveRunnerTest.java",
    "chars": 5244,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/test/java/com/klarna/hiverunner/examples/junit4/HelloHiveRunnerTest.java",
    "chars": 2899,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/test/java/com/klarna/hiverunner/examples/junit4/InsertTestDataTest.java",
    "chars": 5162,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/test/java/com/klarna/hiverunner/examples/junit4/SetHiveConfValuesTest.java",
    "chars": 2630,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/test/java/com/klarna/hiverunner/io/IgnoreClosePrintStreamTest.java",
    "chars": 1322,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/test/java/com/klarna/hiverunner/sql/cli/AbstractImportPostProcessorTest.java",
    "chars": 2905,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/test/java/com/klarna/hiverunner/sql/cli/CommandShellEmulatorFactoryTest.java",
    "chars": 2516,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/test/java/com/klarna/hiverunner/sql/cli/CommentUtilTest.java",
    "chars": 1257,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/test/java/com/klarna/hiverunner/sql/cli/beeline/BeelineEmulatorTest.java",
    "chars": 1697,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/test/java/com/klarna/hiverunner/sql/cli/beeline/BeelineStatementSplitterTest.java",
    "chars": 6725,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/test/java/com/klarna/hiverunner/sql/cli/beeline/RunCommandPostProcessorTest.java",
    "chars": 2885,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/test/java/com/klarna/hiverunner/sql/cli/beeline/SqlLineCommandRuleTest.java",
    "chars": 1850,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/test/java/com/klarna/hiverunner/sql/cli/hive/HiveCliEmulatorTest.java",
    "chars": 1693,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/test/java/com/klarna/hiverunner/sql/cli/hive/HiveCliStatementSplitterTest.java",
    "chars": 6566,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/test/java/com/klarna/hiverunner/sql/cli/hive/PreV200HiveCliEmulatorTest.java",
    "chars": 1730,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/test/java/com/klarna/hiverunner/sql/cli/hive/SourceCommandPostProcessorTest.java",
    "chars": 2826,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/test/java/com/klarna/hiverunner/sql/split/BaseContextTest.java",
    "chars": 2115,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/test/java/com/klarna/hiverunner/sql/split/CloseStatementRuleTest.java",
    "chars": 1192,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/test/java/com/klarna/hiverunner/sql/split/ConsumerEolTest.java",
    "chars": 2200,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/test/java/com/klarna/hiverunner/sql/split/DefaultTokenRuleTest.java",
    "chars": 1192,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/test/java/com/klarna/hiverunner/sql/split/NewLineUtilTest.java",
    "chars": 5200,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/test/java/com/klarna/hiverunner/sql/split/PreserveCommentsRuleTest.java",
    "chars": 1990,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/test/java/com/klarna/hiverunner/sql/split/PreserveQuotesRuleTest.java",
    "chars": 3455,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/test/java/com/klarna/hiverunner/sql/split/StatementSplitterTest.java",
    "chars": 2586,
    "preview": "/**\n * Copyright (C) 2013-2021 Klarna AB\n * Copyright (C) 2021 The HiveRunner Contributors\n *\n * Licensed under the Apac"
  },
  {
    "path": "src/test/resources/AggregateViewTest/create_table.sql",
    "chars": 717,
    "preview": "CREATE DATABASE db;\n\nCREATE EXTERNAL TABLE `db.mvtdescriptionchangeinfo`(\n  `timestamp` bigint COMMENT '',\n  `testid` st"
  },
  {
    "path": "src/test/resources/CommentTest/comment.sql",
    "chars": 41,
    "preview": "-- hello\nset x=1;\n\nset y=\"\n-- goodbye\n\";\n"
  },
  {
    "path": "src/test/resources/CtasTest/ctas.sql",
    "chars": 212,
    "preview": "CREATE EXTERNAL TABLE foo (s1 string, s2 string)\n  ROW FORMAT DELIMITED FIELDS TERMINATED BY ','\n  STORED AS TEXTFILE\n  "
  },
  {
    "path": "src/test/resources/HelloHiveRunnerTest/calculate_max.sql",
    "chars": 95,
    "preview": "insert into my_schema.result\n  select year, max(value) from source_db.test_table group by year;"
  },
  {
    "path": "src/test/resources/HelloHiveRunnerTest/create_ctas.sql",
    "chars": 78,
    "preview": "USE ${hiveconf:my.schema};\n\nCREATE TABLE foo_prim as select i, s from foo;\n\n\n\n"
  },
  {
    "path": "src/test/resources/HelloHiveRunnerTest/create_max.sql",
    "chars": 118,
    "preview": "create database my_schema;\n\nCREATE EXTERNAL TABLE my_schema.result (year STRING, value INT)\n  stored as sequencefile\n;"
  },
  {
    "path": "src/test/resources/HelloHiveRunnerTest/create_table.sql",
    "chars": 184,
    "preview": "USE ${hiveconf:my.schema};\n\nCREATE EXTERNAL TABLE foo (i int, s string)\n  ROW FORMAT DELIMITED FIELDS TERMINATED BY ','\n"
  },
  {
    "path": "src/test/resources/HelloHiveRunnerTest/hello_hive_runner.csv",
    "chars": 12,
    "preview": "1,Hello\n,bar"
  },
  {
    "path": "src/test/resources/HiveRunnerAnnotationsTest/hql1.sql",
    "chars": 160,
    "preview": "CREATE EXTERNAL TABLE foo (s1 int, s2 string)\n  ROW FORMAT DELIMITED FIELDS TERMINATED BY ','\n  STORED AS TEXTFILE\n  LOC"
  },
  {
    "path": "src/test/resources/HiveRunnerAnnotationsTest/setupFile.csv",
    "chars": 28,
    "preview": "create table fox(id string);"
  },
  {
    "path": "src/test/resources/HiveRunnerAnnotationsTest/setupPath.csv",
    "chars": 29,
    "preview": "create table love(id string);"
  },
  {
    "path": "src/test/resources/HiveRunnerAnnotationsTest/testData.csv",
    "chars": 7,
    "preview": "5,F\n7,W"
  },
  {
    "path": "src/test/resources/HiveRunnerAnnotationsTest/testData2.csv",
    "chars": 8,
    "preview": "8,T\n10,Q"
  },
  {
    "path": "src/test/resources/HiveRunnerExtensionTest/test_query.sql",
    "chars": 102,
    "preview": "CREATE DATABASE testdb;\n\nCREATE EXTERNAL TABLE testdb.test_table\n(\n  field1 string,\n  field2 string\n)\n"
  },
  {
    "path": "src/test/resources/InsertIntoTableIntegrationTest/data.tsv",
    "chars": 30,
    "preview": "a1\tb1\tc1\td1\te1\na2\tb2\tc2\td2\te2\n"
  },
  {
    "path": "src/test/resources/InsertIntoTableIntegrationTest/dataWithCustomNullValue.csv",
    "chars": 33,
    "preview": "a1,b1,c1,d1,NULL\na2,b2,NULL,d2,e2"
  },
  {
    "path": "src/test/resources/InsertTestDataTest/data1.tsv",
    "chars": 41,
    "preview": "textA\t42\ttrue\ntextB\t3\ttrue\ntextC\t99\tfalse"
  },
  {
    "path": "src/test/resources/InsertTestDataTest/data2.tsv",
    "chars": 34,
    "preview": "textA:42:A\n__NULL__:3:A\ntextC:99:B"
  },
  {
    "path": "src/test/resources/InsertTestDataTest/dataWithHeader1.tsv",
    "chars": 60,
    "preview": "col_b\tcol_a\tcol_c\n42\ttextA\ttrue\n3\ttextB\ttrue\n99\ttextC\tfalse\n"
  },
  {
    "path": "src/test/resources/InsertTestDataTest/dataWithHeader2.tsv",
    "chars": 45,
    "preview": "col_a\tcol_c\ntextA\ttrue\ntextB\ttrue\ntextC\tfalse"
  },
  {
    "path": "src/test/resources/MethodLevelResourceTest/MethodLevelResourceTest.txt",
    "chars": 5,
    "preview": "1,2,3"
  },
  {
    "path": "src/test/resources/OrcSnappyTest/ctas.sql",
    "chars": 848,
    "preview": "CREATE EXTERNAL TABLE foo (s1 string, s2 string)\n  ROW FORMAT DELIMITED FIELDS TERMINATED BY ','\n  STORED AS TEXTFILE\n  "
  },
  {
    "path": "src/test/resources/PartitionSupportTest/hql_example.sql",
    "chars": 368,
    "preview": "CREATE EXTERNAL TABLE ${hiveconf:table.name} (s1 string, s2 string, s3 string)\n    PARTITIONED BY (\n        year int,\n  "
  },
  {
    "path": "src/test/resources/SerdeTest/create_table.sql",
    "chars": 252,
    "preview": "CREATE TABLE serde_test (\n  key STRING,\n  value STRING\n)\nROW FORMAT SERDE 'org.apache.hadoop.hive.contrib.serde2.RegexSe"
  },
  {
    "path": "src/test/resources/SerdeTest/hql_custom_serde.sql",
    "chars": 312,
    "preview": "CREATE EXTERNAL TABLE customSerdeTable (s1 string, s2 string, s3 string)\n    ROW FORMAT SERDE 'com.klarna.hiverunner.ToU"
  },
  {
    "path": "src/test/resources/SetTest/test_with_set.hql",
    "chars": 200,
    "preview": "CREATE DATABASE testdb;\n\nSET hive.exec.dynamic.partition.mode=nonstrict;\nSET hive.exec.dynamic.partition=true;\n\nCREATE E"
  },
  {
    "path": "src/test/resources/TsvFileParserTest/data.csv",
    "chars": 25,
    "preview": "a1,b1,c1,d1,\na2,b2,,d2,e2"
  },
  {
    "path": "src/test/resources/TsvFileParserTest/data.tsv",
    "chars": 29,
    "preview": "a1\tb1\tc1\td1\te1\na2\tb2\tc2\td2\te2"
  },
  {
    "path": "src/test/resources/TsvFileParserTest/dataWithCustomNullValue.csv",
    "chars": 34,
    "preview": "a1,b1,c1,d1,NULL\na2,b2,NULL,d2,e2\n"
  },
  {
    "path": "src/test/resources/TsvFileParserTest/dataWithHeader.csv",
    "chars": 35,
    "preview": "a,b,c,d,e\na1,b1,c1,d1,\na2,b2,,d2,e2"
  },
  {
    "path": "src/test/resources/TsvFileParserTest/dataWithHeader.tsv",
    "chars": 39,
    "preview": "a\tb\tc\td\te\na1\tb1\tc1\td1\te1\na2\tb2\tc2\td2\te2"
  },
  {
    "path": "src/test/resources/log4j2.xml",
    "chars": 781,
    "preview": "<?xml version=\"1.0\" encoding=\"UTF-8\"?>\n<Configuration status=\"WARN\">\n  <Appenders>\n    <Console name=\"Console\" target=\"S"
  }
]

About this extraction

This page contains the full source code of the klarna/HiveRunner GitHub repository, extracted and formatted as plain text for AI agents and large language models (LLMs). The extraction covers 184 files (486.2 KB, approximately 119.5k tokens) and includes a symbol index of 874 extracted functions, classes, methods, constants, and types.

Extracted by GitExtract, a free GitHub-repo-to-text converter for AI, built by Nikandr Surkov.