Full Code of orlikoski/CDQR for AI

Repository: orlikoski/CDQR
Branch: main
Commit: 465b48d1ab17
Files: 16
Total size: 173.2 KB

Directory structure:
CDQR/

├── .travis.yml
├── Docker/
│   ├── Dockerfile
│   ├── README.md
│   ├── cdqr
│   ├── cdqr.d
│   ├── cdqr.d.ps1
│   └── cdqr.ps1
├── Icons/
│   └── Martin-Berube-Character-Knight.icns
├── LICENSE
├── README.md
├── ThankYou
├── docs/
│   ├── parser_datt.csv
│   ├── parser_lin.csv
│   ├── parser_mac.csv
│   └── parser_win.csv
└── src/
    └── cdqr.py

================================================
FILE CONTENTS
================================================

================================================
FILE: .travis.yml
================================================
os: windows
language: sh
python: "3.8"
before_install:
  - choco install python3
  - export PATH="/c/Python38:/c/Python38/Scripts:$PATH"
  - python -m pip install https://github.com/pyinstaller/pyinstaller/archive/develop.zip
script:
  - /c/Python38/Scripts/pyinstaller.exe -F --clean -c "src/cdqr.py" -n cdqr.exe -i "Icons/Martin-Berube-Character-Knight.ico"
deploy:
  provider: releases
  api_key:
    secure: BcuY8Or8PCre26nKczH9426UPoRkUddK4eZelit309w8SXfrHE3aRZILZ9cbU+8bmi5noGoeLWLiscl70COX018cnf8yzkPnAj9tigVALKojlF2Cv8Vt0p9b2l5gHZb73HMWbARkkaGlycrqofSv9LFqUyf6BSQu0UtWUaV1Y7ofZAhU3whVw+TB/0y6RrnIIq2C1qLrxMqPIrzcUcR6Bf+2RtVwqqEmjxFLawZA/IuY7HzTJd/YcWv1G6VjgELXAab0YI2BV0ZxpAmuofnV7IxBst355djlSxFy+/cHBn1YbWiIA9STWBgNUvDtp9wX12CcAUQiqeNsljb/nAUMx9gp9jkBKRtbXSW0i8wZSAH7xBVeTLYfE7oDRMAHa+jU8OTtmpgXX1zHHIcTj5stAZi+wk3GRvwNKkC+els/R+Xl8R3GLHkRS28qhYNHPIm7bEZfawM5pSOhNyxRc62mIs0zmLhjQu7eLDheJKBioM35xgCl00aV/531xjvzazRMJ/75E49LtODeE61H/I2jE2dZ8wchHHxTTvV77/kwWilYdXYfoUAFhXc1WTPyvmkMoYL/Dq0hGk05ID2/A9YYafyZRhyzfInV1NLTSezkBnjFnXIaK3REzKRVJK+WkaHxLLFJPSFVVzgZ7UlPp+CBVKbgfksp0syjyOgUpQfpJlU=
  file: dist/cdqr.exe
  name: Draft Release
  draft: true
  skip_cleanup: true
  on:
    tags: true


================================================
FILE: Docker/Dockerfile
================================================
# Use the official Docker Hub Ubuntu 18.04 base image
FROM ubuntu:18.04
LABEL maintainer="@aorlikoski"

ENV DEBIAN_FRONTEND noninteractive

# Setup install environment, Plaso, and Timesketch dependencies
RUN apt-get -qq -y update && \
    apt-get -qq -y --no-install-recommends install \
      software-properties-common \
      apt-transport-https && \
    add-apt-repository -u -y ppa:gift/stable && \
    apt-get -qq -y update && \
    apt-get -qq -y --no-install-recommends install \
      python-setuptools \
      build-essential \
      curl \
      git \
      gpg-agent \
      libffi-dev \
      lsb-release \
      locales \
      python3-dev \
      python3-setuptools \
      python3 \
      python3-pip \
      python3-psycopg2 \
      python3-wheel && \
    curl -sS https://deb.nodesource.com/gpgkey/nodesource.gpg.key | apt-key add - && \
    VERSION=node_8.x && \
    DISTRO="$(lsb_release -s -c)" && \
    echo "deb https://deb.nodesource.com/$VERSION $DISTRO main" > /etc/apt/sources.list.d/nodesource.list && \
    curl -sS https://dl.yarnpkg.com/debian/pubkey.gpg | apt-key add - && \
    echo "deb https://dl.yarnpkg.com/debian/ stable main" > /etc/apt/sources.list.d/yarn.list && \
    apt-get -qq -y update && \
    apt-get -qq -y --no-install-recommends install \
      nodejs \
      yarn && \
    apt-get -y dist-upgrade && \
    apt-get -qq -y clean && \
    apt-get -qq -y autoclean && \
    apt-get -qq -y autoremove && \
    rm -rf /var/cache/apt/ /var/lib/apt/lists/

# Download and install Plaso from GitHub Release
RUN curl -sL -o /tmp/plaso-20190916.tar.gz https://github.com/log2timeline/plaso/archive/20190916.tar.gz && \
    cd /tmp/ && \
    tar zxf plaso-20190916.tar.gz && \
    cd plaso-20190916 && \
    pip3 install -r requirements.txt && \
    pip3 install mock && \
    python3 setup.py build && \
    python3 setup.py install && \
    rm -rf /tmp/*

# Build and Install Timesketch from GitHub Master with Pip
RUN git clone https://github.com/google/timesketch.git /tmp/timesketch && \
    cd /tmp/timesketch && \
    git checkout aded1b19acca44b99854083088ef920390f75457 && \
    yarn install && \
    yarn run build  && \
    sed -i -e '/pyyaml/d' /tmp/timesketch/requirements.txt && \
    pip3 install /tmp/timesketch/ && \
    rm -rf /tmp/*

# Set terminal to UTF-8 by default
RUN locale-gen en_US.UTF-8 && \
    update-locale LANG=en_US.UTF-8 LC_ALL=en_US.UTF-8

ENV LANG en_US.UTF-8
ENV LC_ALL en_US.UTF-8

# Download and install CDQR
RUN curl -s -o /usr/local/bin/cdqr.py \
    https://raw.githubusercontent.com/orlikoski/CDQR/master/src/cdqr.py && \
    chmod 755 /usr/local/bin/cdqr.py

# Load the entrypoint script to be run later
ENTRYPOINT ["/usr/local/bin/cdqr.py"]


================================================
FILE: Docker/README.md
================================================
# CDQR Docker
The CDQR Docker image packages CDQR together with all of its dependencies.

The image is hosted on Docker Hub at https://hub.docker.com/r/aorlikoski/cdqr and can be run with `docker run aorlikoski/cdqr`.

# Skadi Compatibility
Because running the image directly requires a fairly involved docker command, a helper bash script, `cdqr`, was created. It works with the Docker, OVA, Vagrant, and Signed Installer versions of Skadi, and it can easily be modified to work in other environments.

# Command Line Changes
Using the `cdqr` bash script is not required to make `aorlikoski/cdqr` work, but it makes the transition much easier. That said, there is one critical difference between the commands used with the bash script `cdqr` and the original Python `cdqr.py`: the path to the data being processed (input) and the path to the output folder (output) are parsed differently by the bash script.

_TL;DR_ use `in:` and `out:` to specify the input and output paths. The `-y` flag, which accepts the default answers to all CDQR questions, is added automatically by the script at run time. _This is important since the process will fail if any user input is required._  
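As an illustration (not code from the helper itself, and with a hypothetical file name), the prefix handling amounts to stripping `in:`/`out:` and resolving the remainder to an absolute path before it is volume-mapped into the container:

```shell
#!/bin/sh
# Sketch of the helper's prefix handling: strip "in:" / "out:" and
# resolve the remaining path to an absolute one (names illustrative).
arg="in:artifacts.zip"
path="${arg#in:}"                 # strip the prefix -> artifacts.zip
case "$path" in
  /*) abs="$path" ;;              # already absolute: keep as-is
  *)  abs="$(pwd)/$path" ;;       # relative: prepend the current directory
esac
echo "$abs"                       # this path ends up as "-v $abs:$abs"
```

The same resolved path is used on both sides of the `-v` mapping so CDQR sees identical paths inside and outside the container.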


# Windows, macOS, and Linux Support
## Bash
`cdqr` is a translation script that does the heavy lifting of volume mapping and networking for docker.  
`cdqr.d` is a daemon version that doesn't output to the screen, thereby enabling processing in the background  
Example: `bash cdqr in:artifacts.zip`

## PowerShell
`cdqr.ps1` is a translation script that does the heavy lifting of volume mapping and networking for docker.  
`cdqr.d.ps1` is a daemon version that doesn't output to the screen, thereby enabling processing in the background  
Example: `powershell -ExecutionPolicy Bypass cdqr.ps1 in:artifacts.zip`

### How it Works
Helper Script Command  
`cdqr in:winevt.zip out:Results -z --max_cpu`  

Same Command Manually  
```docker run   -v /etc/hosts:/etc/hosts:ro   --network host -v /home/skadi/winevt.zip:/home/skadi/winevt.zip -v /home/skadi/Results:/home/skadi/Results aorlikoski/cdqr:4.4.0 -y /home/skadi/winevt.zip /home/skadi/Results -z --max_cpu```  
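To make the expansion concrete, the volume flags in the manual command above can be reconstructed in plain shell (the paths and image tag are taken from the example, not computed by the helper):

```shell
#!/bin/sh
# Rebuild the volume flags from the "How it Works" example: each resolved
# path is mounted at the same location inside the container so that CDQR
# sees identical host and container paths.
input="/home/skadi/winevt.zip"
output="/home/skadi/Results"
flags="-v $input:$input -v $output:$output"
echo "docker run --network host $flags aorlikoski/cdqr:4.4.0 -y $input $output -z --max_cpu"
```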

## Process ZIP file (default windows parser list)
This uses the default Windows parser list and saves output to the Results folder on the host  
*cdqr in:winevt.zip out:Results -z --max_cpu*  
![](/objects/images/zip_demo.gif?)

## Use the same .plaso file but output into Kibana
This uses an existing .plaso file and doesn't save the output on the host (it is ephemeral and deleted when the CDQR docker run completes)  
*cdqr in:Results/winevt.plaso --plaso_db --es_kb winevt*  
![](/objects/images/plaso_kibana.gif?)

## Use the same .plaso file but output into TimeSketch
This uses an existing .plaso file and doesn't save the output on the host. It reads `/etc/timesketch.conf` on the host to pass the values it needs to insert into TimeSketch.  
*cdqr in:Results/winevt.plaso --plaso_db --es_ts winevt*  
![](/objects/images/plaso_ts.gif?)


================================================
FILE: Docker/cdqr
================================================
#!/bin/bash
cdqr_version="5.1.0.1"
cur_dir="$(pwd)"
docker_network=${DOCKER_NETWORK}
timesketch_conf=${TIMESKETCH_CONF:-"/opt/Skadi/Docker/timesketch/timesketch_default.conf"}
timesketch_conf_legacy="/etc/timesketch.conf"
timesketch_server_ipaddress=${TIMESKETCH_SERVER_IPADDRESS:-""}
docker_args="docker run "
args=()

fix_path () {
  file_path=$1
  file_path="$(echo $file_path | sed 's/ /\\ /g')"
  eval file_path=$file_path
  if [ "${file_path:0:1}" == "/" ]; then
    #this is a root level path, do not modify
    final_path=$file_path
  elif [ "${file_path:0:2}" == "./" ]; then
    #this is a current dir path, modify to add absolute path
    final_path=("$cur_dir/${file_path:2:${#file_path}}")
  elif [ "${file_path:0:1}" == "~" ]; then
    #this is a home dir path, modify to add absolute path
    final_path=("$(echo $HOME)/${file_path:2:${#file_path}}")
  else
    final_path=("$cur_dir/$file_path")
  fi
  echo "$final_path"
}

# Set the docker network (if any) to use
if [ $docker_network ]; then
  echo "Validating the Docker network exists: $docker_network"
  if docker network ls --format '{{.Name}}' | grep -qx "$docker_network"; then
    echo "Connecting CDQR to the Docker network: $docker_network"
    docker_args="$docker_args --network $docker_network "
  else
    echo "Docker network $docker_network does not exist, quitting"
    exit
  fi
else
  echo "Assigning CDQR to the host network"
  echo "The Docker network can be changed by modifying the \"DOCKER_NETWORK\" environment variable"
  echo "Example (default Skadi mode): export DOCKER_NETWORK=host"
  echo "Example (use other Docker network): export DOCKER_NETWORK=skadi-backend"
  docker_args="$docker_args --network host "
fi

for i in "$@"; do
  # If it's timesketch add the timesketch mapping
  if [ "$i" == "--es_ts" ]; then
    if [ ! -f "$timesketch_conf" ]; then
      timesketch_conf=""
      if [ ! -f "$timesketch_conf_legacy" ]; then
        while [ "$timesketch_conf" == "" ]; do
          echo "TimeSketch default configuration file must be set. This can be done with an Environment variable."
          echo "The default configuration is the absolute path to Skadi/Docker/timesketch/timesketch_default.conf."
          echo "Example (with Skadi git repo in \"/opt/Skadi\"): export TIMESKETCH_CONF=\"/opt/Skadi/Docker/timesketch/timesketch_default.conf\""
          echo ""
          read -e -p "Enter the location of the timesketch.conf file to use in this operation: " timesketch_conf
          timesketch_conf="$(fix_path $timesketch_conf)"
          if [ ! -f "$timesketch_conf" ]; then
            echo "Invalid file path, re-enter."
            timesketch_conf=""
          fi
        done
      else
        timesketch_conf=$timesketch_conf_legacy
      fi
    fi
    if [ "$timesketch_server_ipaddress" == "" ]; then
        timesketch_server_ipaddress='127.0.0.1'
    fi
    docker_args="$docker_args --add-host=elasticsearch:$timesketch_server_ipaddress --add-host=postgres:$timesketch_server_ipaddress -v ${timesketch_conf}:/etc/timesketch.conf"
  fi

  # If it's an input file/dir (denoted by "in:"), resolve the absolute path
  if [ "${i:0:3}" == "in:" ]; then
    input_map="${i:3:${#i}}"
    final_input_path="$(fix_path "$input_map")"
    args+=("$final_input_path")
    docker_args="$docker_args -v $final_input_path:$final_input_path"
  # If it's an output file/dir (denoted by "out:"), resolve the absolute path
  elif [ "${i:0:4}" == "out:" ]; then
    output_map="${i:4:${#i}}"
    final_output_path="$(fix_path "$output_map")"
    args+=("$final_output_path")
    docker_args="$docker_args -v $final_output_path:$final_output_path"
  # Everything else is copied over as-is
  else
    args+=("$i")
  fi
done

final_command="$docker_args aorlikoski/cdqr:$cdqr_version -y ${args[@]}"
echo "$final_command"
$final_command


================================================
FILE: Docker/cdqr.d
================================================
#!/bin/bash
cdqr_version="5.1.0.1"
cur_dir="$(pwd)"
docker_network=${DOCKER_NETWORK}
timesketch_conf=${TIMESKETCH_CONF:-"/opt/Skadi/Docker/timesketch/timesketch_default.conf"}
timesketch_conf_legacy="/etc/timesketch.conf"
timesketch_server_ipaddress=${TIMESKETCH_SERVER_IPADDRESS:-""}
docker_args="docker run -d "
args=()

fix_path () {
  file_path=$1
  file_path="$(echo $file_path | sed 's/ /\\ /g')"
  eval file_path=$file_path
  if [ "${file_path:0:1}" == "/" ]; then
    #this is a root level path, do not modify
    final_path=$file_path
  elif [ "${file_path:0:2}" == "./" ]; then
    #this is a current dir path, modify to add absolute path
    final_path=("$cur_dir/${file_path:2:${#file_path}}")
  elif [ "${file_path:0:1}" == "~" ]; then
    #this is a home dir path, modify to add absolute path
    final_path=("$(echo $HOME)/${file_path:2:${#file_path}}")
  else
    final_path=("$cur_dir/$file_path")
  fi
  echo "$final_path"
}

# Set the docker network (if any) to use
if [ $docker_network ]; then
  echo "Validating the Docker network exists: $docker_network"
  if docker network ls --format '{{.Name}}' | grep -qx "$docker_network"; then
    echo "Connecting CDQR to the Docker network: $docker_network"
    docker_args="$docker_args --network $docker_network "
  else
    echo "Docker network $docker_network does not exist, quitting"
    exit
  fi
else
  echo "Assigning CDQR to the host network"
  echo "The Docker network can be changed by modifying the \"DOCKER_NETWORK\" environment variable"
  echo "Example (default Skadi mode): export DOCKER_NETWORK=host"
  echo "Example (use other Docker network): export DOCKER_NETWORK=skadi-backend"
  docker_args="$docker_args --network host "
fi

for i in "$@"; do
  # If it's timesketch add the timesketch mapping
  if [ "$i" == "--es_ts" ]; then
    if [ ! -f "$timesketch_conf" ]; then
      if [ -f "$timesketch_conf_legacy" ]; then
        timesketch_conf=$timesketch_conf_legacy
      else
        echo "TimeSketch default configuration file must be set with Environment variable in daemon mode."
        echo "The default configuration is the absolute path to Skadi/Docker/timesketch/timesketch_default.conf."
        echo "Example (with Skadi git repo in \"/opt/Skadi\"): export TIMESKETCH_CONF=\"/opt/Skadi/Docker/timesketch/timesketch_default.conf\""
        echo "Exiting"
        exit
      fi
    fi
    if [ "$timesketch_server_ipaddress" == "" ]; then
        timesketch_server_ipaddress='127.0.0.1'
    fi
    docker_args="$docker_args --add-host=elasticsearch:$timesketch_server_ipaddress --add-host=postgres:$timesketch_server_ipaddress -v ${timesketch_conf}:/etc/timesketch.conf"
  fi

  # If it's an input file/dir (denoted by "in:"), resolve the absolute path
  if [ "${i:0:3}" == "in:" ]; then
    input_map="${i:3:${#i}}"
    final_input_path="$(fix_path "$input_map")"
    args+=("$final_input_path")
    docker_args="$docker_args -v $final_input_path:$final_input_path"
  # If it's an output file/dir (denoted by "out:"), resolve the absolute path
  elif [ "${i:0:4}" == "out:" ]; then
    output_map="${i:4:${#i}}"
    final_output_path="$(fix_path "$output_map")"
    args+=("$final_output_path")
    docker_args="$docker_args -v $final_output_path:$final_output_path"
  # Everything else is copied over as-is
  else
    args+=("$i")
  fi
done

final_command="$docker_args aorlikoski/cdqr:$cdqr_version -y ${args[@]}"
echo "$final_command"
$final_command


================================================
FILE: Docker/cdqr.d.ps1
================================================
#! /usr/bin/pwsh
$ErrorActionPreference = "Stop"

$cdqr_version="5.1.0.1"
$cur_dir=Get-Location
$docker_network=$env:DOCKER_NETWORK
$timesketch_conf=$env:TIMESKETCH_CONF
$timesketch_server_ipaddress=$env:TIMESKETCH_SERVER_IPADDRESS
$docker_args="docker run -d"
$custom_args=@()

# Set the docker network (if any) to use
if ( $docker_network ) {
  echo "Validating the Docker network exists: $docker_network"
  $test = docker network ls --format '{{.Name}}' | Where-Object { $_ -eq $docker_network }
  if ( $test ) {
    echo "Connecting CDQR to the Docker network: $docker_network"
    $docker_args="$docker_args --network $docker_network "
  }
  else {
    echo "Docker network $docker_network does not exist, quitting"
    echo "Exiting"
    exit
  }
}
else {
  echo "Assigning CDQR to the host network"
  echo "The Docker network can be changed by modifying the `"DOCKER_NETWORK`" environment variable"
  echo "Example (default Skadi mode): `$env:DOCKER_NETWORK = `"host`""
  echo "Example (use other Docker network): `$env:DOCKER_NETWORK = `"skadi-backend`""
  $docker_args="$docker_args --network host "
}

# Parse the arguments
foreach ($i in $args) {
    # If it's timesketch add the timesketch config file mapping
    if ( $i -eq "--es_ts" ) {
      if ($timesketch_conf -ne $null){
        if (-not(test-path $timesketch_conf)){
          Write-host "Invalid file path, exiting."
          exit
        }
        elseif ((get-item $timesketch_conf).psiscontainer){
          Write-host "Source must be a file, exiting."
          exit
        }
      }
      else {
        echo "TimeSketch default configuration file must be set with Environment variable in daemon mode."
        echo "The default configuration is the absolute path to Skadi\Docker\timesketch\timesketch_default.conf."
        echo "Example (with Skadi git repo in `"C:\GitHub\Skadi`"): `$env:TIMESKETCH_CONF = `"C:\GitHub\Skadi\Docker\timesketch\timesketch_default.conf`""
        echo "Exiting"
        exit
      }
      if ( $timesketch_server_ipaddress -eq $null) {
          $timesketch_server_ipaddress = '127.0.0.1'
      }
      $docker_args="$docker_args --add-host=elasticsearch:$timesketch_server_ipaddress --add-host=postgres:$timesketch_server_ipaddress -v '${timesketch_conf}:/etc/timesketch.conf'"
    }
    # If it's an input file/dir (denoted by "in:"), resolve the absolute path
    if ( $i.StartsWith("in:") ) {
      $input_path=$i.SubString(3,$i.length - 3)
      $input_path_full=Resolve-Path -Path $input_path
      $docker_input_path="$input_path_full".SubString(2,"$input_path_full".length - 2).Replace("\","/")
      $docker_args+=" -v '${input_path_full}:/data$docker_input_path'"
      $custom_args+="'/data$docker_input_path'"
    }
    # If it's an output file/dir (denoted by "out:"), resolve the absolute path
    elseif ( $i.StartsWith("out:") ) {
      $output_path=$i.SubString(4,$i.length - 4)
      If(!(test-path $output_path))
      {
            New-Item -ItemType Directory -Force -Path $output_path | Out-Null
      }
      $output_path_full=Resolve-Path -Path $output_path
      $docker_output_path="$output_path_full".SubString(2,"$output_path_full".length - 2).Replace("\","/")
      $docker_args+=" -v '${output_path_full}:/output$docker_output_path'"
      $custom_args+="'/output$docker_output_path'"
    }
    else {
      $custom_args+=$i
    }
}
$final_command="$docker_args aorlikoski/cdqr:$cdqr_version -y $custom_args"
$final_command
iex $final_command


================================================
FILE: Docker/cdqr.ps1
================================================
#! /usr/bin/pwsh
$ErrorActionPreference = "Stop"

$cdqr_version="5.1.0.1"
$cur_dir=Get-Location
$docker_network=$env:DOCKER_NETWORK
$timesketch_conf=$env:TIMESKETCH_CONF
$timesketch_server_ipaddress=$env:TIMESKETCH_SERVER_IPADDRESS
$docker_args="docker run"
$custom_args=@()

# Set the docker network (if any) to use
if ( $docker_network ) {
  echo "Validating the Docker network exists: $docker_network"
  $test = docker network ls --format '{{.Name}}' | Where-Object { $_ -eq $docker_network }
  if ( $test ) {
    echo "Connecting CDQR to the Docker network: $docker_network"
    $docker_args="$docker_args --network $docker_network "
  }
  else {
    echo "Docker network $docker_network does not exist, quitting"
    echo "Exiting"
    exit
  }
}
else {
  echo "Assigning CDQR to the host network"
  echo "The Docker network can be changed by modifying the `"DOCKER_NETWORK`" environment variable"
  echo "Example (default Skadi mode): `$env:DOCKER_NETWORK = `"host`""
  echo "Example (use other Docker network): `$env:DOCKER_NETWORK = `"skadi-backend`""
  $docker_args="$docker_args --network host "
}

# Parse the arguments
foreach ($i in $args) {
    # If it's timesketch add the timesketch config file mapping
    if ( $i -eq "--es_ts" ) {
      while ($timesketch_conf -eq $null){
        echo "TimeSketch default configuration file must be set. This can be done with an Environment variable."
        echo "The default configuration is the absolute path to Skadi\Docker\timesketch\timesketch_default.conf."
        echo "Example (with Skadi git repo in `"C:\GitHub\Skadi`"): `$env:TIMESKETCH_CONF = `"C:\GitHub\Skadi\Docker\timesketch\timesketch_default.conf`""
        echo ""
        $timesketch_conf = read-host "Enter the location of the TimeSketch configuration file to use in this operation "
        if (-not(test-path $timesketch_conf)){
          Write-host "Invalid file path, re-enter."
          $timesketch_conf = $null
        }
        elseif ((get-item $timesketch_conf).psiscontainer){
          Write-host "Source must be a file, re-enter."
          $timesketch_conf = $null
        }
      }
      if ( $timesketch_server_ipaddress -eq $null) {
          $timesketch_server_ipaddress = '127.0.0.1'
      }
      $docker_args="$docker_args --add-host=elasticsearch:$timesketch_server_ipaddress --add-host=postgres:$timesketch_server_ipaddress -v '${timesketch_conf}:/etc/timesketch.conf'"
    }
    # If it's an input file/dir (denoted by "in:"), resolve the absolute path
    if ( $i.StartsWith("in:") ) {
      $input_path=$i.SubString(3,$i.length - 3)
      $input_path_full=Resolve-Path -Path $input_path
      $docker_input_path="$input_path_full".SubString(2,"$input_path_full".length - 2).Replace("\","/")
      $docker_args+=" -v '${input_path_full}:/data$docker_input_path'"
      $custom_args+="'/data$docker_input_path'"
    }
    # If it's an output file/dir (denoted by "out:"), resolve the absolute path
    elseif ( $i.StartsWith("out:") ) {
      $output_path=$i.SubString(4,$i.length - 4)
      If(!(test-path $output_path))
      {
            New-Item -ItemType Directory -Force -Path $output_path | Out-Null
      }
      $output_path_full=Resolve-Path -Path $output_path
      $docker_output_path="$output_path_full".SubString(2,"$output_path_full".length - 2).Replace("\","/")
      $docker_args+=" -v '${output_path_full}:/output$docker_output_path'"
      $custom_args+="'/output$docker_output_path'"
    }
    else {
      $custom_args+=$i
    }
}
$final_command="$docker_args aorlikoski/cdqr:$cdqr_version -y $custom_args"
$final_command
iex $final_command


================================================
FILE: LICENSE
================================================
           GNU GENERAL PUBLIC LICENSE
                       Version 3, 29 June 2007

 Copyright (C) 2007 Free Software Foundation, Inc. <http://fsf.org/>
 Everyone is permitted to copy and distribute verbatim copies
 of this license document, but changing it is not allowed.

                            Preamble

  The GNU General Public License is a free, copyleft license for
software and other kinds of works.

  The licenses for most software and other practical works are designed
to take away your freedom to share and change the works.  By contrast,
the GNU General Public License is intended to guarantee your freedom to
share and change all versions of a program--to make sure it remains free
software for all its users.  We, the Free Software Foundation, use the
GNU General Public License for most of our software; it applies also to
any other work released this way by its authors.  You can apply it to
your programs, too.

  When we speak of free software, we are referring to freedom, not
price.  Our General Public Licenses are designed to make sure that you
have the freedom to distribute copies of free software (and charge for
them if you wish), that you receive source code or can get it if you
want it, that you can change the software or use pieces of it in new
free programs, and that you know you can do these things.

  To protect your rights, we need to prevent others from denying you
these rights or asking you to surrender the rights.  Therefore, you have
certain responsibilities if you distribute copies of the software, or if
you modify it: responsibilities to respect the freedom of others.

  For example, if you distribute copies of such a program, whether
gratis or for a fee, you must pass on to the recipients the same
freedoms that you received.  You must make sure that they, too, receive
or can get the source code.  And you must show them these terms so they
know their rights.

  Developers that use the GNU GPL protect your rights with two steps:
(1) assert copyright on the software, and (2) offer you this License
giving you legal permission to copy, distribute and/or modify it.

  For the developers' and authors' protection, the GPL clearly explains
that there is no warranty for this free software.  For both users' and
authors' sake, the GPL requires that modified versions be marked as
changed, so that their problems will not be attributed erroneously to
authors of previous versions.

  Some devices are designed to deny users access to install or run
modified versions of the software inside them, although the manufacturer
can do so.  This is fundamentally incompatible with the aim of
protecting users' freedom to change the software.  The systematic
pattern of such abuse occurs in the area of products for individuals to
use, which is precisely where it is most unacceptable.  Therefore, we
have designed this version of the GPL to prohibit the practice for those
products.  If such problems arise substantially in other domains, we
stand ready to extend this provision to those domains in future versions
of the GPL, as needed to protect the freedom of users.

  Finally, every program is threatened constantly by software patents.
States should not allow patents to restrict development and use of
software on general-purpose computers, but in those that do, we wish to
avoid the special danger that patents applied to a free program could
make it effectively proprietary.  To prevent this, the GPL assures that
patents cannot be used to render the program non-free.

  The precise terms and conditions for copying, distribution and
modification follow.

                       TERMS AND CONDITIONS

  0. Definitions.

  "This License" refers to version 3 of the GNU General Public License.

  "Copyright" also means copyright-like laws that apply to other kinds of
works, such as semiconductor masks.

  "The Program" refers to any copyrightable work licensed under this
License.  Each licensee is addressed as "you".  "Licensees" and
"recipients" may be individuals or organizations.

  To "modify" a work means to copy from or adapt all or part of the work
in a fashion requiring copyright permission, other than the making of an
exact copy.  The resulting work is called a "modified version" of the
earlier work or a work "based on" the earlier work.

  A "covered work" means either the unmodified Program or a work based
on the Program.

  To "propagate" a work means to do anything with it that, without
permission, would make you directly or secondarily liable for
infringement under applicable copyright law, except executing it on a
computer or modifying a private copy.  Propagation includes copying,
distribution (with or without modification), making available to the
public, and in some countries other activities as well.

  To "convey" a work means any kind of propagation that enables other
parties to make or receive copies.  Mere interaction with a user through
a computer network, with no transfer of a copy, is not conveying.

  An interactive user interface displays "Appropriate Legal Notices"
to the extent that it includes a convenient and prominently visible
feature that (1) displays an appropriate copyright notice, and (2)
tells the user that there is no warranty for the work (except to the
extent that warranties are provided), that licensees may convey the
work under this License, and how to view a copy of this License.  If
the interface presents a list of user commands or options, such as a
menu, a prominent item in the list meets this criterion.

  1. Source Code.

  The "source code" for a work means the preferred form of the work
for making modifications to it.  "Object code" means any non-source
form of a work.

  A "Standard Interface" means an interface that either is an official
standard defined by a recognized standards body, or, in the case of
interfaces specified for a particular programming language, one that
is widely used among developers working in that language.

  The "System Libraries" of an executable work include anything, other
than the work as a whole, that (a) is included in the normal form of
packaging a Major Component, but which is not part of that Major
Component, and (b) serves only to enable use of the work with that
Major Component, or to implement a Standard Interface for which an
implementation is available to the public in source code form.  A
"Major Component", in this context, means a major essential component
(kernel, window system, and so on) of the specific operating system
(if any) on which the executable work runs, or a compiler used to
produce the work, or an object code interpreter used to run it.

  The "Corresponding Source" for a work in object code form means all
the source code needed to generate, install, and (for an executable
work) run the object code and to modify the work, including scripts to
control those activities.  However, it does not include the work's
System Libraries, or general-purpose tools or generally available free
programs which are used unmodified in performing those activities but
which are not part of the work.  For example, Corresponding Source
includes interface definition files associated with source files for
the work, and the source code for shared libraries and dynamically
linked subprograms that the work is specifically designed to require,
such as by intimate data communication or control flow between those
subprograms and other parts of the work.

  The Corresponding Source need not include anything that users
can regenerate automatically from other parts of the Corresponding
Source.

  The Corresponding Source for a work in source code form is that
same work.

  2. Basic Permissions.

  All rights granted under this License are granted for the term of
copyright on the Program, and are irrevocable provided the stated
conditions are met.  This License explicitly affirms your unlimited
permission to run the unmodified Program.  The output from running a
covered work is covered by this License only if the output, given its
content, constitutes a covered work.  This License acknowledges your
rights of fair use or other equivalent, as provided by copyright law.

  You may make, run and propagate covered works that you do not
convey, without conditions so long as your license otherwise remains
in force.  You may convey covered works to others for the sole purpose
of having them make modifications exclusively for you, or provide you
with facilities for running those works, provided that you comply with
the terms of this License in conveying all material for which you do
not control copyright.  Those thus making or running the covered works
for you must do so exclusively on your behalf, under your direction
and control, on terms that prohibit them from making any copies of
your copyrighted material outside their relationship with you.

  Conveying under any other circumstances is permitted solely under
the conditions stated below.  Sublicensing is not allowed; section 10
makes it unnecessary.

  3. Protecting Users' Legal Rights From Anti-Circumvention Law.

  No covered work shall be deemed part of an effective technological
measure under any applicable law fulfilling obligations under article
11 of the WIPO copyright treaty adopted on 20 December 1996, or
similar laws prohibiting or restricting circumvention of such
measures.

  When you convey a covered work, you waive any legal power to forbid
circumvention of technological measures to the extent such circumvention
is effected by exercising rights under this License with respect to
the covered work, and you disclaim any intention to limit operation or
modification of the work as a means of enforcing, against the work's
users, your or third parties' legal rights to forbid circumvention of
technological measures.

  4. Conveying Verbatim Copies.

  You may convey verbatim copies of the Program's source code as you
receive it, in any medium, provided that you conspicuously and
appropriately publish on each copy an appropriate copyright notice;
keep intact all notices stating that this License and any
non-permissive terms added in accord with section 7 apply to the code;
keep intact all notices of the absence of any warranty; and give all
recipients a copy of this License along with the Program.

  You may charge any price or no price for each copy that you convey,
and you may offer support or warranty protection for a fee.

  5. Conveying Modified Source Versions.

  You may convey a work based on the Program, or the modifications to
produce it from the Program, in the form of source code under the
terms of section 4, provided that you also meet all of these conditions:

    a) The work must carry prominent notices stating that you modified
    it, and giving a relevant date.

    b) The work must carry prominent notices stating that it is
    released under this License and any conditions added under section
    7.  This requirement modifies the requirement in section 4 to
    "keep intact all notices".

    c) You must license the entire work, as a whole, under this
    License to anyone who comes into possession of a copy.  This
    License will therefore apply, along with any applicable section 7
    additional terms, to the whole of the work, and all its parts,
    regardless of how they are packaged.  This License gives no
    permission to license the work in any other way, but it does not
    invalidate such permission if you have separately received it.

    d) If the work has interactive user interfaces, each must display
    Appropriate Legal Notices; however, if the Program has interactive
    interfaces that do not display Appropriate Legal Notices, your
    work need not make them do so.

  A compilation of a covered work with other separate and independent
works, which are not by their nature extensions of the covered work,
and which are not combined with it such as to form a larger program,
in or on a volume of a storage or distribution medium, is called an
"aggregate" if the compilation and its resulting copyright are not
used to limit the access or legal rights of the compilation's users
beyond what the individual works permit.  Inclusion of a covered work
in an aggregate does not cause this License to apply to the other
parts of the aggregate.

  6. Conveying Non-Source Forms.

  You may convey a covered work in object code form under the terms
of sections 4 and 5, provided that you also convey the
machine-readable Corresponding Source under the terms of this License,
in one of these ways:

    a) Convey the object code in, or embodied in, a physical product
    (including a physical distribution medium), accompanied by the
    Corresponding Source fixed on a durable physical medium
    customarily used for software interchange.

    b) Convey the object code in, or embodied in, a physical product
    (including a physical distribution medium), accompanied by a
    written offer, valid for at least three years and valid for as
    long as you offer spare parts or customer support for that product
    model, to give anyone who possesses the object code either (1) a
    copy of the Corresponding Source for all the software in the
    product that is covered by this License, on a durable physical
    medium customarily used for software interchange, for a price no
    more than your reasonable cost of physically performing this
    conveying of source, or (2) access to copy the
    Corresponding Source from a network server at no charge.

    c) Convey individual copies of the object code with a copy of the
    written offer to provide the Corresponding Source.  This
    alternative is allowed only occasionally and noncommercially, and
    only if you received the object code with such an offer, in accord
    with subsection 6b.

    d) Convey the object code by offering access from a designated
    place (gratis or for a charge), and offer equivalent access to the
    Corresponding Source in the same way through the same place at no
    further charge.  You need not require recipients to copy the
    Corresponding Source along with the object code.  If the place to
    copy the object code is a network server, the Corresponding Source
    may be on a different server (operated by you or a third party)
    that supports equivalent copying facilities, provided you maintain
    clear directions next to the object code saying where to find the
    Corresponding Source.  Regardless of what server hosts the
    Corresponding Source, you remain obligated to ensure that it is
    available for as long as needed to satisfy these requirements.

    e) Convey the object code using peer-to-peer transmission, provided
    you inform other peers where the object code and Corresponding
    Source of the work are being offered to the general public at no
    charge under subsection 6d.

  A separable portion of the object code, whose source code is excluded
from the Corresponding Source as a System Library, need not be
included in conveying the object code work.

  A "User Product" is either (1) a "consumer product", which means any
tangible personal property which is normally used for personal, family,
or household purposes, or (2) anything designed or sold for incorporation
into a dwelling.  In determining whether a product is a consumer product,
doubtful cases shall be resolved in favor of coverage.  For a particular
product received by a particular user, "normally used" refers to a
typical or common use of that class of product, regardless of the status
of the particular user or of the way in which the particular user
actually uses, or expects or is expected to use, the product.  A product
is a consumer product regardless of whether the product has substantial
commercial, industrial or non-consumer uses, unless such uses represent
the only significant mode of use of the product.

  "Installation Information" for a User Product means any methods,
procedures, authorization keys, or other information required to install
and execute modified versions of a covered work in that User Product from
a modified version of its Corresponding Source.  The information must
suffice to ensure that the continued functioning of the modified object
code is in no case prevented or interfered with solely because
modification has been made.

  If you convey an object code work under this section in, or with, or
specifically for use in, a User Product, and the conveying occurs as
part of a transaction in which the right of possession and use of the
User Product is transferred to the recipient in perpetuity or for a
fixed term (regardless of how the transaction is characterized), the
Corresponding Source conveyed under this section must be accompanied
by the Installation Information.  But this requirement does not apply
if neither you nor any third party retains the ability to install
modified object code on the User Product (for example, the work has
been installed in ROM).

  The requirement to provide Installation Information does not include a
requirement to continue to provide support service, warranty, or updates
for a work that has been modified or installed by the recipient, or for
the User Product in which it has been modified or installed.  Access to a
network may be denied when the modification itself materially and
adversely affects the operation of the network or violates the rules and
protocols for communication across the network.

  Corresponding Source conveyed, and Installation Information provided,
in accord with this section must be in a format that is publicly
documented (and with an implementation available to the public in
source code form), and must require no special password or key for
unpacking, reading or copying.

  7. Additional Terms.

  "Additional permissions" are terms that supplement the terms of this
License by making exceptions from one or more of its conditions.
Additional permissions that are applicable to the entire Program shall
be treated as though they were included in this License, to the extent
that they are valid under applicable law.  If additional permissions
apply only to part of the Program, that part may be used separately
under those permissions, but the entire Program remains governed by
this License without regard to the additional permissions.

  When you convey a copy of a covered work, you may at your option
remove any additional permissions from that copy, or from any part of
it.  (Additional permissions may be written to require their own
removal in certain cases when you modify the work.)  You may place
additional permissions on material, added by you to a covered work,
for which you have or can give appropriate copyright permission.

  Notwithstanding any other provision of this License, for material you
add to a covered work, you may (if authorized by the copyright holders of
that material) supplement the terms of this License with terms:

    a) Disclaiming warranty or limiting liability differently from the
    terms of sections 15 and 16 of this License; or

    b) Requiring preservation of specified reasonable legal notices or
    author attributions in that material or in the Appropriate Legal
    Notices displayed by works containing it; or

    c) Prohibiting misrepresentation of the origin of that material, or
    requiring that modified versions of such material be marked in
    reasonable ways as different from the original version; or

    d) Limiting the use for publicity purposes of names of licensors or
    authors of the material; or

    e) Declining to grant rights under trademark law for use of some
    trade names, trademarks, or service marks; or

    f) Requiring indemnification of licensors and authors of that
    material by anyone who conveys the material (or modified versions of
    it) with contractual assumptions of liability to the recipient, for
    any liability that these contractual assumptions directly impose on
    those licensors and authors.

  All other non-permissive additional terms are considered "further
restrictions" within the meaning of section 10.  If the Program as you
received it, or any part of it, contains a notice stating that it is
governed by this License along with a term that is a further
restriction, you may remove that term.  If a license document contains
a further restriction but permits relicensing or conveying under this
License, you may add to a covered work material governed by the terms
of that license document, provided that the further restriction does
not survive such relicensing or conveying.

  If you add terms to a covered work in accord with this section, you
must place, in the relevant source files, a statement of the
additional terms that apply to those files, or a notice indicating
where to find the applicable terms.

  Additional terms, permissive or non-permissive, may be stated in the
form of a separately written license, or stated as exceptions;
the above requirements apply either way.

  8. Termination.

  You may not propagate or modify a covered work except as expressly
provided under this License.  Any attempt otherwise to propagate or
modify it is void, and will automatically terminate your rights under
this License (including any patent licenses granted under the third
paragraph of section 11).

  However, if you cease all violation of this License, then your
license from a particular copyright holder is reinstated (a)
provisionally, unless and until the copyright holder explicitly and
finally terminates your license, and (b) permanently, if the copyright
holder fails to notify you of the violation by some reasonable means
prior to 60 days after the cessation.

  Moreover, your license from a particular copyright holder is
reinstated permanently if the copyright holder notifies you of the
violation by some reasonable means, this is the first time you have
received notice of violation of this License (for any work) from that
copyright holder, and you cure the violation prior to 30 days after
your receipt of the notice.

  Termination of your rights under this section does not terminate the
licenses of parties who have received copies or rights from you under
this License.  If your rights have been terminated and not permanently
reinstated, you do not qualify to receive new licenses for the same
material under section 10.

  9. Acceptance Not Required for Having Copies.

  You are not required to accept this License in order to receive or
run a copy of the Program.  Ancillary propagation of a covered work
occurring solely as a consequence of using peer-to-peer transmission
to receive a copy likewise does not require acceptance.  However,
nothing other than this License grants you permission to propagate or
modify any covered work.  These actions infringe copyright if you do
not accept this License.  Therefore, by modifying or propagating a
covered work, you indicate your acceptance of this License to do so.

  10. Automatic Licensing of Downstream Recipients.

  Each time you convey a covered work, the recipient automatically
receives a license from the original licensors, to run, modify and
propagate that work, subject to this License.  You are not responsible
for enforcing compliance by third parties with this License.

  An "entity transaction" is a transaction transferring control of an
organization, or substantially all assets of one, or subdividing an
organization, or merging organizations.  If propagation of a covered
work results from an entity transaction, each party to that
transaction who receives a copy of the work also receives whatever
licenses to the work the party's predecessor in interest had or could
give under the previous paragraph, plus a right to possession of the
Corresponding Source of the work from the predecessor in interest, if
the predecessor has it or can get it with reasonable efforts.

  You may not impose any further restrictions on the exercise of the
rights granted or affirmed under this License.  For example, you may
not impose a license fee, royalty, or other charge for exercise of
rights granted under this License, and you may not initiate litigation
(including a cross-claim or counterclaim in a lawsuit) alleging that
any patent claim is infringed by making, using, selling, offering for
sale, or importing the Program or any portion of it.

  11. Patents.

  A "contributor" is a copyright holder who authorizes use under this
License of the Program or a work on which the Program is based.  The
work thus licensed is called the contributor's "contributor version".

  A contributor's "essential patent claims" are all patent claims
owned or controlled by the contributor, whether already acquired or
hereafter acquired, that would be infringed by some manner, permitted
by this License, of making, using, or selling its contributor version,
but do not include claims that would be infringed only as a
consequence of further modification of the contributor version.  For
purposes of this definition, "control" includes the right to grant
patent sublicenses in a manner consistent with the requirements of
this License.

  Each contributor grants you a non-exclusive, worldwide, royalty-free
patent license under the contributor's essential patent claims, to
make, use, sell, offer for sale, import and otherwise run, modify and
propagate the contents of its contributor version.

  In the following three paragraphs, a "patent license" is any express
agreement or commitment, however denominated, not to enforce a patent
(such as an express permission to practice a patent or covenant not to
sue for patent infringement).  To "grant" such a patent license to a
party means to make such an agreement or commitment not to enforce a
patent against the party.

  If you convey a covered work, knowingly relying on a patent license,
and the Corresponding Source of the work is not available for anyone
to copy, free of charge and under the terms of this License, through a
publicly available network server or other readily accessible means,
then you must either (1) cause the Corresponding Source to be so
available, or (2) arrange to deprive yourself of the benefit of the
patent license for this particular work, or (3) arrange, in a manner
consistent with the requirements of this License, to extend the patent
license to downstream recipients.  "Knowingly relying" means you have
actual knowledge that, but for the patent license, your conveying the
covered work in a country, or your recipient's use of the covered work
in a country, would infringe one or more identifiable patents in that
country that you have reason to believe are valid.

  If, pursuant to or in connection with a single transaction or
arrangement, you convey, or propagate by procuring conveyance of, a
covered work, and grant a patent license to some of the parties
receiving the covered work authorizing them to use, propagate, modify
or convey a specific copy of the covered work, then the patent license
you grant is automatically extended to all recipients of the covered
work and works based on it.

  A patent license is "discriminatory" if it does not include within
the scope of its coverage, prohibits the exercise of, or is
conditioned on the non-exercise of one or more of the rights that are
specifically granted under this License.  You may not convey a covered
work if you are a party to an arrangement with a third party that is
in the business of distributing software, under which you make payment
to the third party based on the extent of your activity of conveying
the work, and under which the third party grants, to any of the
parties who would receive the covered work from you, a discriminatory
patent license (a) in connection with copies of the covered work
conveyed by you (or copies made from those copies), or (b) primarily
for and in connection with specific products or compilations that
contain the covered work, unless you entered into that arrangement,
or that patent license was granted, prior to 28 March 2007.

  Nothing in this License shall be construed as excluding or limiting
any implied license or other defenses to infringement that may
otherwise be available to you under applicable patent law.

  12. No Surrender of Others' Freedom.

  If conditions are imposed on you (whether by court order, agreement or
otherwise) that contradict the conditions of this License, they do not
excuse you from the conditions of this License.  If you cannot convey a
covered work so as to satisfy simultaneously your obligations under this
License and any other pertinent obligations, then as a consequence you may
not convey it at all.  For example, if you agree to terms that obligate you
to collect a royalty for further conveying from those to whom you convey
the Program, the only way you could satisfy both those terms and this
License would be to refrain entirely from conveying the Program.

  13. Use with the GNU Affero General Public License.

  Notwithstanding any other provision of this License, you have
permission to link or combine any covered work with a work licensed
under version 3 of the GNU Affero General Public License into a single
combined work, and to convey the resulting work.  The terms of this
License will continue to apply to the part which is the covered work,
but the special requirements of the GNU Affero General Public License,
section 13, concerning interaction through a network will apply to the
combination as such.

  14. Revised Versions of this License.

  The Free Software Foundation may publish revised and/or new versions of
the GNU General Public License from time to time.  Such new versions will
be similar in spirit to the present version, but may differ in detail to
address new problems or concerns.

  Each version is given a distinguishing version number.  If the
Program specifies that a certain numbered version of the GNU General
Public License "or any later version" applies to it, you have the
option of following the terms and conditions either of that numbered
version or of any later version published by the Free Software
Foundation.  If the Program does not specify a version number of the
GNU General Public License, you may choose any version ever published
by the Free Software Foundation.

  If the Program specifies that a proxy can decide which future
versions of the GNU General Public License can be used, that proxy's
public statement of acceptance of a version permanently authorizes you
to choose that version for the Program.

  Later license versions may give you additional or different
permissions.  However, no additional obligations are imposed on any
author or copyright holder as a result of your choosing to follow a
later version.

  15. Disclaimer of Warranty.

  THERE IS NO WARRANTY FOR THE PROGRAM, TO THE EXTENT PERMITTED BY
APPLICABLE LAW.  EXCEPT WHEN OTHERWISE STATED IN WRITING THE COPYRIGHT
HOLDERS AND/OR OTHER PARTIES PROVIDE THE PROGRAM "AS IS" WITHOUT WARRANTY
OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING, BUT NOT LIMITED TO,
THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
PURPOSE.  THE ENTIRE RISK AS TO THE QUALITY AND PERFORMANCE OF THE PROGRAM
IS WITH YOU.  SHOULD THE PROGRAM PROVE DEFECTIVE, YOU ASSUME THE COST OF
ALL NECESSARY SERVICING, REPAIR OR CORRECTION.

  16. Limitation of Liability.

  IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN WRITING
WILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MODIFIES AND/OR CONVEYS
THE PROGRAM AS PERMITTED ABOVE, BE LIABLE TO YOU FOR DAMAGES, INCLUDING ANY
GENERAL, SPECIAL, INCIDENTAL OR CONSEQUENTIAL DAMAGES ARISING OUT OF THE
USE OR INABILITY TO USE THE PROGRAM (INCLUDING BUT NOT LIMITED TO LOSS OF
DATA OR DATA BEING RENDERED INACCURATE OR LOSSES SUSTAINED BY YOU OR THIRD
PARTIES OR A FAILURE OF THE PROGRAM TO OPERATE WITH ANY OTHER PROGRAMS),
EVEN IF SUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE POSSIBILITY OF
SUCH DAMAGES.

  17. Interpretation of Sections 15 and 16.

  If the disclaimer of warranty and limitation of liability provided
above cannot be given local legal effect according to their terms,
reviewing courts shall apply local law that most closely approximates
an absolute waiver of all civil liability in connection with the
Program, unless a warranty or assumption of liability accompanies a
copy of the Program in return for a fee.

                     END OF TERMS AND CONDITIONS

            How to Apply These Terms to Your New Programs

  If you develop a new program, and you want it to be of the greatest
possible use to the public, the best way to achieve this is to make it
free software which everyone can redistribute and change under these terms.

  To do so, attach the following notices to the program.  It is safest
to attach them to the start of each source file to most effectively
state the exclusion of warranty; and each file should have at least
the "copyright" line and a pointer to where the full notice is found.

    CDQR — Cold Disk Quick Response tool.
    Copyright (C) 2017  Alan Orlikoski

    This program is free software: you can redistribute it and/or modify
    it under the terms of the GNU General Public License as published by
    the Free Software Foundation, either version 3 of the License, or
    (at your option) any later version.

    This program is distributed in the hope that it will be useful,
    but WITHOUT ANY WARRANTY; without even the implied warranty of
    MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
    GNU General Public License for more details.

    You should have received a copy of the GNU General Public License
    along with this program.  If not, see <http://www.gnu.org/licenses/>.

Also add information on how to contact you by electronic and paper mail.

  If the program does terminal interaction, make it output a short
notice like this when it starts in an interactive mode:

    CDQR  Copyright (C) 2017  Alan Orlikoski
    This program comes with ABSOLUTELY NO WARRANTY; for details type `show w'.
    This is free software, and you are welcome to redistribute it
    under certain conditions; type `show c' for details.

The hypothetical commands `show w' and `show c' should show the appropriate
parts of the General Public License.  Of course, your program's commands
might be different; for a GUI interface, you would use an "about box".

  You should also get your employer (if you work as a programmer) or school,
if any, to sign a "copyright disclaimer" for the program, if necessary.
For more information on this, and how to apply and follow the GNU GPL, see
<http://www.gnu.org/licenses/>.

  The GNU General Public License does not permit incorporating your program
into proprietary programs.  If your program is a subroutine library, you
may consider it more useful to permit linking proprietary applications with
the library.  If this is what you want to do, use the GNU Lesser General
Public License instead of this License.  But first, please read
<http://www.gnu.org/philosophy/why-not-lgpl.html>.

================================================
FILE: README.md
================================================
## NAME

CDQR — Cold Disk Quick Response tool by Alan Orlikoski

For latest release click [here](https://github.com/orlikoski/CDQR/releases/latest)

## Please Read
[Open Letter to the users of Skadi, CyLR, and CDQR](https://docs.google.com/document/d/1L6CBvFd7d1Qf4IxSJSdkKMTdbBuWzSzUM3u_h5ZCegY/edit?usp=sharing)

## Videos and Media
*  [OSDFCON 2017](http://www.osdfcon.org/presentations/2017/Asif-Matadar_Rapid-Incident-Response.pdf) Slides: a walk-through of the different techniques required to produce forensic results for Windows and *nix environments (including CyLR and CDQR)

## What is CDQR?
CDQR uses Plaso with a curated set of parsers to parse forensic artifacts and/or disk images and to create custom, easy-to-analyze reports. The parsers were chosen based on triaging best practices, and the custom reports group like items together to make analysis easier. The design follows the Live Response model of investigating the most important artifacts first. CDQR is meant as a starting point for an investigation, not the complete investigation.

In addition to processing entire forensic images, it also parses extracted forensic artifacts, whether a single file or a collection of files inside a folder structure (or inside a .zip file).

It creates up to 18 Reports (.csv files) based on triaging best practices and the parsing option selected:
*  18 Reports for DATT:  
      ```
      Appcompat, Amcache, Bash, Event Logs, File System, MFT, UsnJrnl, Internet History, Prefetch, Registry, Scheduled Tasks, Persistence, System Information, AntiVirus, Firewall, Mac, Linux, and Android
      ```
*  14 Reports for Win:  
      ```
      Appcompat, Amcache, Bash, Event Logs, File System, MFT, UsnJrnl, Internet History, Prefetch, Registry, Scheduled Tasks, Persistence, System Information, AntiVirus, Firewall
      ```
*   8 Reports for Mac and Lin:  
      ```
      File System, Internet History, System Information, AntiVirus, Firewall, Mac, and Linux
      ```
*   7 Reports for Android:  
      ```
      File System, Internet History, Persistence, System Information, AntiVirus, Firewall, and Android
      ```
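As an illustrative sketch only (the image path, archive name, and output folder below are placeholders, not files from this repository), typical runs using the options documented in the ARGUMENTS & OPTIONS section might look like:

```shell
# Hypothetical example: parse a Windows E01 image with the default 'win'
# parser set, use all CPU cores, and write the CSV reports to ./Results
python cdqr.py -p win --max_cpu Y:/Case/Tag009/sample.E01 Results

# Hypothetical example: parse extracted artifacts collected into a zip file,
# skipping file hashing to speed up processing
python cdqr.py --nohash artifacts.zip Results
```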


## Important Notes
*  Make sure the account running CDQR has permission to create files and directories (when in doubt, run as administrator)
*  Ensure line endings are correct for the OS it is running on

## DESCRIPTION

This program uses [Plaso](https://github.com/log2timeline/plaso/wiki) and a streamlined list of its parsers to quickly analyze a forensic image file (dd, E01, .vmdk, etc.) or a group of forensic artifacts.  The results are output to Elasticsearch, as line-delimited JSON, or as the following report files in CSV format:
*  18 Reports for DATT:  
      ```
      Appcompat, Amcache, Bash, Event Logs, File System, MFT, UsnJrnl, Internet History, Prefetch, Registry, Scheduled Tasks, Persistence, System Information, AntiVirus, Firewall, Mac, Linux, and Android
      ```
*  14 Reports for Win:  
      ```
      Appcompat, Amcache, Bash, Event Logs, File System, MFT, UsnJrnl, Internet History, Prefetch, Registry, Scheduled Tasks, Persistence, System Information, AntiVirus, Firewall
      ```
*   8 Reports for Mac and Lin:  
      ```
      File System, Internet History, System Information, AntiVirus, Firewall, Mac, and Linux
      ```
*   7 Reports for Android:  
      ```
      File System, Internet History, Persistence, System Information, AntiVirus, Firewall, and Android
      ```

## ARGUMENTS & OPTIONS
```
positional arguments:
  src_location          Source File location: Y:/Case/Tag009/sample.E01
  dst_location          Destination Folder location. If nothing is supplied
                        then the default is 'Results'

optional arguments:
  -h, --help            show this help message and exit
  -p PARSER, --parser PARSER
                        Choose parser to use. If nothing chosen then 'win' is
                        used. The parsing options are: win, mft_usnjrnl, lin,
                        mac, datt
  --nohash              Do not hash all the files as part of the processing of
                        the image
  --mft                 Process the MFT file (disabled by default except for
                        DATT)
  --usnjrnl             Process the USNJRNL file (disabled by default except
                        for DATT)
  --max_cpu             Use the maximum number of cpu cores to process the
                        image
  --export              Creates zipped, line delimited json export file
  --artifact_filters ARTIFACT_FILTERS
                        Plaso passthrough: Names of forensic artifact
                        definitions, provided on the command line (comma
                        separated). Forensic artifacts are stored in
                        .yaml files that are directly pulled from the artifact
                        definitions project. You can also specify a custom
                        artifacts yaml file (see
                        --custom_artifact_definitions). Artifact definitions
                        can be used to describe and quickly collect data of
                        interest, such as specific files or Windows Registry
                        keys.
  --artifact_filters_file ARTIFACT_FILTERS_FILE
                        Plaso passthrough: Names of forensic artifact
                        definitions, provided in a file with one artifact name
                        per line. Forensic artifacts are stored in .yaml files
                        that are directly pulled from the artifact definitions
                        project. You can also specify a custom artifacts yaml
                        file (see --custom_artifact_definitions). Artifact
                        definitions can be used to describe and quickly
                        collect data of interest, such as specific files or
                        Windows Registry keys.
  --artifact_definitions ARTIFACT_DEFINITIONS
                        Plaso passthrough: Path to a directory containing
                        artifact definitions, which are .yaml files. Artifact
                        definitions can be used to describe and quickly
                        collect data of interest, such as specific files or
                        Windows Registry keys.
  --custom_artifact_definitions CUSTOM_ARTIFACT_DEFINITIONS
                        Plaso passthrough: Path to a file containing custom
                        artifact definitions, which are .yaml files. Artifact
                        definitions can be used to describe and quickly
                        collect data of interest, such as specific files or
                        Windows Registry keys.
  --file_filter FILE_FILTER, -f FILE_FILTER
                        Plaso passthrough: List of files to include for
                        targeted collection of files to parse, one line per
                        file path, setup is /path|file - where each element
                        can contain either a variable set in the preprocessing
                        stage or a regular expression.
  --es_kb ES_KB         Outputs Kibana format to elasticsearch database.
                        Requires index name. Example: '--es_kb my_index'
  --es_kb_server ES_KB_SERVER
                        Kibana Format Only: Exports to remote (default is
                        127.0.0.1) elasticsearch database. Requires server
                        name or IP address. Example: '--es_kb_server
                        myserver.elk.go' or '--es_kb_server 192.168.1.10'
  --es_kb_port ES_KB_PORT
                        Kibana Format Only: Port (default is 9200) for remote
                        elasticsearch database. Requires port number.
                        Example: '--es_kb_port 9200'
  --es_kb_user ES_KB_USER
                        Kibana Format Only: Username (default is none) for
                        remote elasticsearch database. Requires username.
                        Example: '--es_kb_user skadi'
  --es_ts ES_TS         Outputs TimeSketch format to elasticsearch database.
                        Requires index/timesketch name. Example: '--es_ts
                        my_name'
  --plaso_db            Process an existing Plaso DB file. Example:
                        artifacts.plaso
  -z                    Indicates the input file is a zip file and needs to be
                        decompressed
  --no_dependencies_check
                        Re-enables the log2timeline dependencies check, which
                        is skipped by default
  --process_archives    Extract and inspect contents of archives found inside
                        of artifacts or disk images
  -v, --version         show program's version number and exit
  -y                    Accepts all defaults on prompted questions in the
                        program.
```
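For targeted collection, the `--file_filter` option described above takes a file with one `/path|file` entry per line, where path elements may be preprocessing variables or regular expressions. A hypothetical filter file might look like the following (the paths here are illustrative examples, not shipped with CDQR):

```text
/Windows/System32/config/SAM
/Windows/System32/config/SYSTEM
/Users/.+/NTUSER.DAT
```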

## DEPENDENCIES

1. 64-bit Windows, Linux, or Mac Operating System (OS)
2. The appropriate version of [Plaso](https://github.com/log2timeline/plaso/releases) for the OS
3. [Python v3.x](https://www.python.org/downloads/) (if using cdqr.py source code)

## EXAMPLES

```
cdqr.py c:\mydiskimage.vmdk myresults
```
```
cdqr.exe -p win c:\images\badlaptop.e01
```
```
cdqr.exe -p datt --max_cpu C:\artifacts\tag009
```
```
cdqr.exe -p datt --max_cpu C:\artifacts\tag009\$MFT --export
```
```
cdqr.exe -z --max_cpu C:\artifacts\tag009\artifacts.zip
```
```
cdqr.exe -z --max_cpu C:\artifacts\tag009\artifacts.zip --es_kb myindexname
```


## AUTHOR

Alan Orlikoski
* [GitHub](https://github.com/orlikoski)
* [Twitter](https://twitter.com/AlanOrlikoski)


================================================
FILE: ThankYou
================================================
Thanks to the Plaso team, whose product is great (https://github.com/log2timeline/plaso/wiki)
Thanks to Andrew Moore for teaching me the ways of making .md files
Thank you to my friends and coworkers at Mandiant
Thank you to everyone who helped me along the way

================================================
FILE: docs/parser_datt.csv
================================================
amcache,
android_app_usage,
apache_access,
asl_log,
bash_history,
bash,
bencode,
binary_cookies,
bsm_log,
chrome_cache,
chrome_preferences,
cups_ipp,
custom_destinations,
czip,
dockerjson,
dpkg,
esedb,
filestat,
firefox_cache,
firefox_cache2,
fseventsd,
gdrive_synclog,
hachoir,
java_idx,
lnk,
mac_appfirewall_log,
mac_keychain,
mac_securityd,
mactime,
macwifi,
mcafee_protection,
mft,
msiecf,
olecf,
opera_global,
opera_typed_history,
pe,
plist,
pls_recall,
popularity_contest,
prefetch,
recycle_bin_info2,
recycle_bin,
rplog,
santa,
sccm,
selinux,
skydrive_log_old,
skydrive_log,
sophos_av,
sqlite,
symantec_scanlog,
syslog,
systemd_journal,
trendmicro_url,
trendmicro_vd,
usnjrnl,
utmp,
utmpx,
winevt,
winevtx,
winfirewall,
winiis,
winjob,
winreg,
xchatlog,
xchatscrollback,
zsh_extended_history


================================================
FILE: docs/parser_lin.csv
================================================
bash,
bash_history,
bencode,
czip,
dockerjson,
dpkg,
filestat,
mcafee_protection,
olecf,
pls_recall,
popularity_contest,
selinux,
sophos_av,
sqlite,
symantec_scanlog,
syslog,
systemd_journal,
utmp,
webhist,
xchatlog,
xchatscrollback,
zsh_extended_history


================================================
FILE: docs/parser_mac.csv
================================================
asl_log,
bash_history,
bash,
bencode,
bsm_log,
ccleaner,
cups_ipp,
czip,
plist,
filestat,
fseventsd,
mac_appfirewall_log,
mac_keychain,
mac_securityd,
macwifi,
mcafee_protection,
olecf,
sophos_av,
sqlite,
symantec_scanlog,
syslog,
utmpx,
webhist,
zsh_extended_history


================================================
FILE: docs/parser_win.csv
================================================
bencode,
czip,
ccleaner,
esedb,
filestat,
lnk,
mft,
mcafee_protection,
olecf,
pe,
prefetch,
recycle_bin,
recycle_bin_info2,
sccm,
sophos_av,
sqlite,
symantec_scanlog,
usnjrnl,
winevt,
winevtx,
webhist,
winfirewall,
winjob,
windows_typed_urls,
winreg


================================================
FILE: src/cdqr.py
================================================
#!/usr/bin/python3
"""
This program is free software: you can redistribute it and/or modify it under
the terms of the GNU General Public License as published by the Free Software
Foundation, either version 3 of the License, or (at your option) any later
version.
This program is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details.
You should have received a copy of the GNU General Public License along with
this program. If not, see <http://www.gnu.org/licenses/>.
"""
import argparse
import csv
import datetime
import io
import multiprocessing
import os
import queue
import re
import shutil
import subprocess
import sys
import threading
import time
import zipfile
try:
    import zlib
    compression = zipfile.ZIP_DEFLATED
except ImportError:
    # zlib is unavailable; fall back to storing files uncompressed
    compression = zipfile.ZIP_STORED

if sys.version_info[0] < 3:
    print(
        'CDQR requires Python 3 but Python 2 was detected. Please run this '
        'script with a compatible Python interpreter.'
    )
    sys.exit(1)

modes = {
    zipfile.ZIP_DEFLATED: 'deflated',
    zipfile.ZIP_STORED: 'stored',
}
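The `compression`/`modes` fallback above can be exercised as in the following minimal sketch (the function, archive name, and file name are illustrative, not part of CDQR):

```python
# Minimal sketch: write a file into a zip archive using the compression
# mode selected by the zlib fallback above (ZIP_DEFLATED when zlib is
# importable, ZIP_STORED otherwise). Names are examples only.
import zipfile


def add_to_zip(zip_path, file_path, compression_mode):
    # mode="a" appends to an existing archive or creates a new one
    with zipfile.ZipFile(zip_path, "a", compression=compression_mode) as zf:
        zf.write(file_path)
```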
###############################################################################
# Created by: Alan Orlikoski
cdqr_version = "CDQR Version: 20191226"
#
###############################################################################
# Global Variables
parser_opt = ""
src_loc = ""
dst_loc = ""
start_dt = datetime.datetime.now()
end_dt = start_dt
duration = end_dt - start_dt
duration01 = end_dt - start_dt
duration02 = end_dt - start_dt
duration03 = end_dt - start_dt
create_db = True

# Dictionary of parsing options from command line to log2timeline
parse_optionslatest = {
    'win':
    "bash_history,bencode,czip,esedb,filestat,lnk,mcafee_protection,olecf,pe,prefetch,recycle_bin,recycle_bin_info2,sccm,sophos_av,sqlite,symantec_scanlog,winevt,winevtx,webhist,winfirewall,winjob,winreg,zsh_extended_history",
    'mft_usnjrnl':
    "mft,usnjrnl",
    'lin':
    "bash_history,bencode,binary_cookies,chrome_cache,chrome_preferences,czip/oxml,dockerjson,dpkg,esedb/msie_webcache,filestat,firefox_cache,gdrive_synclog,java_idx,msiecf,olecf,opera_global,opera_typed_history,plist/safari_history,pls_recall,popularity_contest,selinux,sqlite/chrome_27_history,sqlite/chrome_8_history,sqlite/chrome_autofill,sqlite/chrome_cookies,sqlite/chrome_extension_activity,sqlite/firefox_cookies,sqlite/firefox_downloads,sqlite/firefox_history,sqlite/google_drive,sqlite/skype,sqlite/zeitgeist,syslog,systemd_journal,utmp,xchatlog,xchatscrollback,zsh_extended_history",
    'mac':
    "asl_log,bash_history,bencode,binary_cookies,bsm_log,chrome_cache,chrome_preferences,cups_ipp,czip/oxml,esedb/msie_webcache,filestat,firefox_cache,fseventsd,gdrive_synclog,java_idx,mac_appfirewall_log,mac_keychain,mac_securityd,macwifi,msiecf,olecf,opera_global,opera_typed_history,plist,plist/safari_history,sqlite/appusage,sqlite/chrome_27_history,sqlite/chrome_8_history,sqlite/chrome_autofill,sqlite/chrome_cookies,sqlite/chrome_extension_activity,sqlite/firefox_cookies,sqlite/firefox_downloads,sqlite/firefox_history,sqlite/google_drive,sqlite/imessage,sqlite/ls_quarantine,sqlite/mac_document_versions,sqlite/mac_knowledgec,sqlite/mac_notes,sqlite/mackeeper_cache,sqlite/skype,syslog,utmpx,zsh_extended_history",
    'android':
    "android_app_usage,chrome_cache,filestat,sqlite/android_calls,sqlite/android_sms,sqlite/android_webview,sqlite/android_webviewcache,sqlite/chrome_27_history,sqlite/chrome_8_history,sqlite/chrome_cookies,sqlite/skype",
    'datt':
    "amcache,android_app_usage,apache_access,asl_log,bash_history,bencode,binary_cookies,bsm_log,chrome_cache,chrome_preferences,cups_ipp,custom_destinations,czip,dockerjson,dpkg,esedb,filestat,firefox_cache,firefox_cache2,fseventsd,gdrive_synclog,java_idx,lnk,mac_appfirewall_log,mac_keychain,mac_securityd,mactime,macwifi,mcafee_protection,mft,msiecf,olecf,opera_global,opera_typed_history,pe,plist,pls_recall,popularity_contest,prefetch,recycle_bin,recycle_bin_info2,rplog,santa,sccm,selinux,skydrive_log,skydrive_log_old,sophos_av,sqlite,symantec_scanlog,syslog,systemd_journal,trendmicro_url,trendmicro_vd,usnjrnl,utmp,utmpx,winevt,winevtx,winfirewall,winiis,winjob,winreg,xchatlog,xchatscrollback,zsh_extended_history",
}
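As a sketch of how a mapping like parse_optionslatest might be consumed: the helper below expands a parser-set name into a log2timeline argv. Only the "log2timeline.py --parsers <list> <storage> <source>" shape reflects Plaso's CLI; the function name, argument layout, and sample map are assumptions, not CDQR's actual implementation.

```python
# Illustrative only: restrict log2timeline to one named parser set.
# log2timeline.py takes the storage file before the source.
def build_l2t_command(parser_map, parser_opt, plaso_storage, source):
    """Return an argv list selecting only the parsers for parser_opt."""
    return ["log2timeline.py", "--parsers", parser_map[parser_opt],
            plaso_storage, source]


# Example with a trimmed-down map (hypothetical values):
sample_map = {"win": "prefetch,winevtx,winreg"}
cmd = build_l2t_command(sample_map, "win", "out.plaso", "/cases/image.e01")
```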

# All credit for these definitions below to: https://www.ultimatewindowssecurity.com/securitylog/encyclopedia/default.aspx
eventlog_dict = {
    '512':
    'Windows NT is starting up',
    '513':
    'Windows is shutting down',
    '514':
    'An authentication package has been loaded by the Local Security Authority',
    '515':
    'A trusted logon process has registered with the Local Security Authority',
    '516':
    'Internal resources allocated for the queuing of audit messages have been exhausted, leading to the loss of some audits',
    '517':
    'The audit log was cleared',
    '518':
    'A notification package has been loaded by the Security Account Manager',
    '519':
    'A process is using an invalid local procedure call (LPC) port',
    '520':
    'The system time was changed',
    '521':
    'Unable to log events to security log',
    '528':
    'Successful Logon',
    '529':
    'Logon Failure - Unknown user name or bad password',
    '530':
    'Logon Failure - Account logon time restriction violation',
    '531':
    'Logon Failure - Account currently disabled',
    '532':
    'Logon Failure - The specified user account has expired',
    '533':
    'Logon Failure - User not allowed to logon at this computer',
    '534':
    'Logon Failure - The user has not been granted the requested logon type at this machine',
    '535':
    'Logon Failure - The specified accounts password has expired',
    '536':
    'Logon Failure - The NetLogon component is not active',
    '537':
    'Logon failure - The logon attempt failed for other reasons.',
    '538':
    'User Logoff',
    '539':
    'Logon Failure - Account locked out',
    '540':
    'Successful Network Logon',
    '551':
    'User initiated logoff',
    '552':
    'Logon attempt using explicit credentials',
    '560':
    'Object Open',
    '561':
    'Handle Allocated',
    '562':
    'Handle Closed',
    '563':
    'Object Open for Delete',
    '564':
    'Object Deleted',
    '565':
    'Object Open (Active Directory)',
    '566':
    'Object Operation (W3 Active Directory)',
    '567':
    'Object Access Attempt',
    '576':
    'Special privileges assigned to new logon',
    '577':
    'Privileged Service Called',
    '578':
    'Privileged object operation',
    '592':
    'A new process has been created',
    '593':
    'A process has exited',
    '594':
    'A handle to an object has been duplicated',
    '595':
    'Indirect access to an object has been obtained',
    '596':
    'Backup of data protection master key',
    '600':
    'A process was assigned a primary token',
    '601':
    'Attempt to install service',
    '602':
    'Scheduled Task created',
    '608':
    'User Right Assigned',
    '609':
    'User Right Removed',
    '610':
    'New Trusted Domain',
    '611':
    'Removing Trusted Domain',
    '612':
    'Audit Policy Change',
    '613':
    'IPSec policy agent started',
    '614':
    'IPSec policy agent disabled',
    '615':
    'IPSEC PolicyAgent Service',
    '616':
    'IPSec policy agent encountered a potentially serious failure.',
    '617':
    'Kerberos Policy Changed',
    '618':
    'Encrypted Data Recovery Policy Changed',
    '619':
    'Quality of Service Policy Changed',
    '620':
    'Trusted Domain Information Modified',
    '621':
    'System Security Access Granted',
    '622':
    'System Security Access Removed',
    '623':
    'Per User Audit Policy was refreshed',
    '624':
    'User Account Created',
    '625':
    'User Account Type Changed',
    '626':
    'User Account Enabled',
    '627':
    'Change Password Attempt',
    '628':
    'User Account password set',
    '629':
    'User Account Disabled',
    '630':
    'User Account Deleted',
    '631':
    'Security Enabled Global Group Created',
    '632':
    'Security Enabled Global Group Member Added',
    '633':
    'Security Enabled Global Group Member Removed',
    '634':
    'Security Enabled Global Group Deleted',
    '635':
    'Security Enabled Local Group Created',
    '636':
    'Security Enabled Local Group Member Added',
    '637':
    'Security Enabled Local Group Member Removed',
    '638':
    'Security Enabled Local Group Deleted',
    '639':
    'Security Enabled Local Group Changed',
    '640':
    'General Account Database Change',
    '641':
    'Security Enabled Global Group Changed',
    '642':
    'User Account Changed',
    '643':
    'Domain Policy Changed',
    '644':
    'User Account Locked Out',
    '645':
    'Computer Account Created',
    '646':
    'Computer Account Changed',
    '647':
    'Computer Account Deleted',
    '648':
    'Security Disabled Local Group Created',
    '649':
    'Security Disabled Local Group Changed',
    '650':
    'Security Disabled Local Group Member Added',
    '651':
    'Security Disabled Local Group Member Removed',
    '652':
    'Security Disabled Local Group Deleted',
    '653':
    'Security Disabled Global Group Created',
    '654':
    'Security Disabled Global Group Changed',
    '655':
    'Security Disabled Global Group Member Added',
    '656':
    'Security Disabled Global Group Member Removed',
    '657':
    'Security Disabled Global Group Deleted',
    '658':
    'Security Enabled Universal Group Created',
    '659':
    'Security Enabled Universal Group Changed',
    '660':
    'Security Enabled Universal Group Member Added',
    '661':
    'Security Enabled Universal Group Member Removed',
    '662':
    'Security Enabled Universal Group Deleted',
    '663':
    'Security Disabled Universal Group Created',
    '664':
    'Security Disabled Universal Group Changed',
    '665':
    'Security Disabled Universal Group Member Added',
    '666':
    'Security Disabled Universal Group Member Removed',
    '667':
    'Security Disabled Universal Group Deleted',
    '668':
    'Group Type Changed',
    '669':
    'Add SID History',
    '670':
    'Add SID History',
    '671':
    'User Account Unlocked',
    '672':
    'Authentication Ticket Granted',
    '673':
    'Service Ticket Granted',
    '674':
    'Ticket Granted Renewed',
    '675':
    'Pre-authentication failed',
    '676':
    'Authentication Ticket Request Failed',
    '677':
    'Service Ticket Request Failed',
    '678':
    'Account Mapped for Logon by',
    '679':
    'The name: %2 could not be mapped for logon by: %1',
    '680':
    'Account Used for Logon by',
    '681':
    'The logon to account: %2 by: %1 from workstation: %3 failed.',
    '682':
    'Session reconnected to winstation',
    '683':
    'Session disconnected from winstation',
    '684':
    'Set ACLs of members in administrators groups',
    '685':
    'Account Name Changed',
    '686':
    'Password of the following user accessed',
    '687':
    'Basic Application Group Created',
    '688':
    'Basic Application Group Changed',
    '689':
    'Basic Application Group Member Added',
    '690':
    'Basic Application Group Member Removed',
    '691':
    'Basic Application Group Non-Member Added',
    '692':
    'Basic Application Group Non-Member Removed',
    '693':
    'Basic Application Group Deleted',
    '694':
    'LDAP Query Group Created',
    '695':
    'LDAP Query Group Changed',
    '696':
    'LDAP Query Group Deleted',
    '697':
    'Password Policy Checking API is called',
    '806':
    'Per User Audit Policy was refreshed',
    '807':
    'Per user auditing policy set for user',
    '808':
    'A security event source has attempted to register',
    '809':
    'A security event source has attempted to unregister',
    '848':
    'The following policy was active when the Windows Firewall started',
    '849':
    'An application was listed as an exception when the Windows Firewall started',
    '850':
    'A port was listed as an exception when the Windows Firewall started',
    '851':
    'A change has been made to the Windows Firewall application exception list',
    '852':
    'A change has been made to the Windows Firewall port exception list',
    '853':
    'The Windows Firewall operational mode has changed',
    '854':
    'The Windows Firewall logging settings have changed',
    '855':
    'A Windows Firewall ICMP setting has changed',
    '856':
    'The Windows Firewall setting to allow unicast responses to multicast/broadcast traffic has changed',
    '857':
    'The Windows Firewall setting to allow remote administration, allowing port TCP 135 and DCOM/RPC, has changed',
    '858':
    'Windows Firewall group policy settings have been applied',
    '859':
    'The Windows Firewall group policy settings have been removed',
    '860':
    'The Windows Firewall has switched the active policy profile',
    '861':
    'The Windows Firewall has detected an application listening for incoming traffic',
    '1100':
    'The event logging service has shut down',
    '1101':
    'Audit events have been dropped by the transport.',
    '1102':
    'The audit log was cleared',
    '1104':
    'The security Log is now full',
    '1105':
    'Event log automatic backup',
    '1108':
    'The event logging service encountered an error',
    '4608':
    'Windows is starting up',
    '4609':
    'Windows is shutting down',
    '4610':
    'An authentication package has been loaded by the Local Security Authority',
    '4611':
    'A trusted logon process has been registered with the Local Security Authority',
    '4612':
    'Internal resources allocated for the queuing of audit messages have been exhausted, leading to the loss of some audits.',
    '4614':
    'A notification package has been loaded by the Security Account Manager.',
    '4615':
    'Invalid use of LPC port',
    '4616':
    'The system time was changed.',
    '4618':
    'A monitored security event pattern has occurred',
    '4621':
    'Administrator recovered system from CrashOnAuditFail',
    '4622':
    'A security package has been loaded by the Local Security Authority.',
    '4624':
    'An account was successfully logged on',
    '4625':
    'An account failed to log on',
    '4626':
    'User/Device claims information',
    '4627':
    'Group membership information.',
    '4634':
    'An account was logged off',
    '4646':
    'IKE DoS-prevention mode started',
    '4647':
    'User initiated logoff',
    '4648':
    'A logon was attempted using explicit credentials',
    '4649':
    'A replay attack was detected',
    '4650':
    'An IPsec Main Mode security association was established',
    '4651':
    'An IPsec Main Mode security association was established',
    '4652':
    'An IPsec Main Mode negotiation failed',
    '4653':
    'An IPsec Main Mode negotiation failed',
    '4654':
    'An IPsec Quick Mode negotiation failed',
    '4655':
    'An IPsec Main Mode security association ended',
    '4656':
    'A handle to an object was requested',
    '4657':
    'A registry value was modified',
    '4658':
    'The handle to an object was closed',
    '4659':
    'A handle to an object was requested with intent to delete',
    '4660':
    'An object was deleted',
    '4661':
    'A handle to an object was requested',
    '4662':
    'An operation was performed on an object',
    '4663':
    'An attempt was made to access an object',
    '4664':
    'An attempt was made to create a hard link',
    '4665':
    'An attempt was made to create an application client context.',
    '4666':
    'An application attempted an operation',
    '4667':
    'An application client context was deleted',
    '4668':
    'An application was initialized',
    '4670':
    'Permissions on an object were changed',
    '4671':
    'An application attempted to access a blocked ordinal through the TBS',
    '4672':
    'Special privileges assigned to new logon',
    '4673':
    'A privileged service was called',
    '4674':
    'An operation was attempted on a privileged object',
    '4675':
    'SIDs were filtered',
    '4688':
    'A new process has been created',
    '4689':
    'A process has exited',
    '4690':
    'An attempt was made to duplicate a handle to an object',
    '4691':
    'Indirect access to an object was requested',
    '4692':
    'Backup of data protection master key was attempted',
    '4693':
    'Recovery of data protection master key was attempted',
    '4694':
    'Protection of auditable protected data was attempted',
    '4695':
    'Unprotection of auditable protected data was attempted',
    '4696':
    'A primary token was assigned to process',
    '4697':
    'A service was installed in the system',
    '4698':
    'A scheduled task was created',
    '4699':
    'A scheduled task was deleted',
    '4700':
    'A scheduled task was enabled',
    '4701':
    'A scheduled task was disabled',
    '4702':
    'A scheduled task was updated',
    '4703':
    'A token right was adjusted',
    '4704':
    'A user right was assigned',
    '4705':
    'A user right was removed',
    '4706':
    'A new trust was created to a domain',
    '4707':
    'A trust to a domain was removed',
    '4709':
    'IPsec Services was started',
    '4710':
    'IPsec Services was disabled',
    '4711':
    'PAStore Engine (1%)',
    '4712':
    'IPsec Services encountered a potentially serious failure',
    '4713':
    'Kerberos policy was changed',
    '4714':
    'Encrypted data recovery policy was changed',
    '4715':
    'The audit policy (SACL) on an object was changed',
    '4716':
    'Trusted domain information was modified',
    '4717':
    'System security access was granted to an account',
    '4718':
    'System security access was removed from an account',
    '4719':
    'System audit policy was changed',
    '4720':
    'A user account was created',
    '4722':
    'A user account was enabled',
    '4723':
    'An attempt was made to change an accounts password',
    '4724':
    'An attempt was made to reset an accounts password',
    '4725':
    'A user account was disabled',
    '4726':
    'A user account was deleted',
    '4727':
    'A security-enabled global group was created',
    '4728':
    'A member was added to a security-enabled global group',
    '4729':
    'A member was removed from a security-enabled global group',
    '4730':
    'A security-enabled global group was deleted',
    '4731':
    'A security-enabled local group was created',
    '4732':
    'A member was added to a security-enabled local group',
    '4733':
    'A member was removed from a security-enabled local group',
    '4734':
    'A security-enabled local group was deleted',
    '4735':
    'A security-enabled local group was changed',
    '4737':
    'A security-enabled global group was changed',
    '4738':
    'A user account was changed',
    '4739':
    'Domain Policy was changed',
    '4740':
    'A user account was locked out',
    '4741':
    'A computer account was created',
    '4742':
    'A computer account was changed',
    '4743':
    'A computer account was deleted',
    '4744':
    'A security-disabled local group was created',
    '4745':
    'A security-disabled local group was changed',
    '4746':
    'A member was added to a security-disabled local group',
    '4747':
    'A member was removed from a security-disabled local group',
    '4748':
    'A security-disabled local group was deleted',
    '4749':
    'A security-disabled global group was created',
    '4750':
    'A security-disabled global group was changed',
    '4751':
    'A member was added to a security-disabled global group',
    '4752':
    'A member was removed from a security-disabled global group',
    '4753':
    'A security-disabled global group was deleted',
    '4754':
    'A security-enabled universal group was created',
    '4755':
    'A security-enabled universal group was changed',
    '4756':
    'A member was added to a security-enabled universal group',
    '4757':
    'A member was removed from a security-enabled universal group',
    '4758':
    'A security-enabled universal group was deleted',
    '4759':
    'A security-disabled universal group was created',
    '4760':
    'A security-disabled universal group was changed',
    '4761':
    'A member was added to a security-disabled universal group',
    '4762':
    'A member was removed from a security-disabled universal group',
    '4763':
    'A security-disabled universal group was deleted',
    '4764':
    'A groups type was changed',
    '4765':
    'SID History was added to an account',
    '4766':
    'An attempt to add SID History to an account failed',
    '4767':
    'A user account was unlocked',
    '4768':
    'A Kerberos authentication ticket (TGT) was requested',
    '4769':
    'A Kerberos service ticket was requested',
    '4770':
    'A Kerberos service ticket was renewed',
    '4771':
    'Kerberos pre-authentication failed',
    '4772':
    'A Kerberos authentication ticket request failed',
    '4773':
    'A Kerberos service ticket request failed',
    '4774':
    'An account was mapped for logon',
    '4775':
    'An account could not be mapped for logon',
    '4776':
    'The domain controller attempted to validate the credentials for an account',
    '4777':
    'The domain controller failed to validate the credentials for an account',
    '4778':
    'A session was reconnected to a Window Station',
    '4779':
    'A session was disconnected from a Window Station',
    '4780':
    'The ACL was set on accounts which are members of administrators groups',
    '4781':
    'The name of an account was changed',
    '4782':
    'The password hash of an account was accessed',
    '4783':
    'A basic application group was created',
    '4784':
    'A basic application group was changed',
    '4785':
    'A member was added to a basic application group',
    '4786':
    'A member was removed from a basic application group',
    '4787':
    'A non-member was added to a basic application group',
    '4788':
    'A non-member was removed from a basic application group',
    '4789':
    'A basic application group was deleted',
    '4790':
    'An LDAP query group was created',
    '4791':
    'A basic application group was changed',
    '4792':
    'An LDAP query group was deleted',
    '4793':
    'The Password Policy Checking API was called',
    '4794':
    'An attempt was made to set the Directory Services Restore Mode administrator password',
    '4797':
    'An attempt was made to query the existence of a blank password for an account',
    '4798':
    'A users local group membership was enumerated.',
    '4799':
    'A security-enabled local group membership was enumerated',
    '4800':
    'The workstation was locked',
    '4801':
    'The workstation was unlocked',
    '4802':
    'The screen saver was invoked',
    '4803':
    'The screen saver was dismissed',
    '4816':
    'RPC detected an integrity violation while decrypting an incoming message',
    '4817':
    'Auditing settings on object were changed.',
    '4818':
    'Proposed Central Access Policy does not grant the same access permissions as the current Central Access Policy',
    '4819':
    'Central Access Policies on the machine have been changed',
    '4820':
    'A Kerberos Ticket-granting-ticket (TGT) was denied because the device does not meet the access control restrictions',
    '4821':
    'A Kerberos service ticket was denied because the user, device, or both does not meet the access control restrictions',
    '4822':
    'NTLM authentication failed because the account was a member of the Protected User group',
    '4823':
    'NTLM authentication failed because access control restrictions are required',
    '4824':
    'Kerberos preauthentication by using DES or RC4 failed because the account was a member of the Protected User group',
    '4825':
    'A user was denied the access to Remote Desktop. By default, users are allowed to connect only if they are members of the Remote Desktop Users group or Administrators group',
    '4826':
    'Boot Configuration Data loaded',
    '4830':
    'SID History was removed from an account',
    '4864':
    'A namespace collision was detected',
    '4865':
    'A trusted forest information entry was added',
    '4866':
    'A trusted forest information entry was removed',
    '4867':
    'A trusted forest information entry was modified',
    '4868':
    'The certificate manager denied a pending certificate request',
    '4869':
    'Certificate Services received a resubmitted certificate request',
    '4870':
    'Certificate Services revoked a certificate',
    '4871':
    'Certificate Services received a request to publish the certificate revocation list (CRL)',
    '4872':
    'Certificate Services published the certificate revocation list (CRL)',
    '4873':
    'A certificate request extension changed',
    '4874':
    'One or more certificate request attributes changed.',
    '4875':
    'Certificate Services received a request to shut down',
    '4876':
    'Certificate Services backup started',
    '4877':
    'Certificate Services backup completed',
    '4878':
    'Certificate Services restore started',
    '4879':
    'Certificate Services restore completed',
    '4880':
    'Certificate Services started',
    '4881':
    'Certificate Services stopped',
    '4882':
    'The security permissions for Certificate Services changed',
    '4883':
    'Certificate Services retrieved an archived key',
    '4884':
    'Certificate Services imported a certificate into its database',
    '4885':
    'The audit filter for Certificate Services changed',
    '4886':
    'Certificate Services received a certificate request',
    '4887':
    'Certificate Services approved a certificate request and issued a certificate',
    '4888':
    'Certificate Services denied a certificate request',
    '4889':
    'Certificate Services set the status of a certificate request to pending',
    '4890':
    'The certificate manager settings for Certificate Services changed.',
    '4891':
    'A configuration entry changed in Certificate Services',
    '4892':
    'A property of Certificate Services changed',
    '4893':
    'Certificate Services archived a key',
    '4894':
    'Certificate Services imported and archived a key',
    '4895':
    'Certificate Services published the CA certificate to Active Directory Domain Services',
    '4896':
    'One or more rows have been deleted from the certificate database',
    '4897':
    'Role separation enabled',
    '4898':
    'Certificate Services loaded a template',
    '4899':
    'A Certificate Services template was updated',
    '4900':
    'Certificate Services template security was updated',
    '4902':
    'The Per-user audit policy table was created',
    '4904':
    'An attempt was made to register a security event source',
    '4905':
    'An attempt was made to unregister a security event source',
    '4906':
    'The CrashOnAuditFail value has changed',
    '4907':
    'Auditing settings on object were changed',
    '4908':
    'Special Groups Logon table modified',
    '4909':
    'The local policy settings for the TBS were changed',
    '4910':
    'The group policy settings for the TBS were changed',
    '4911':
    'Resource attributes of the object were changed',
    '4912':
    'Per User Audit Policy was changed',
    '4913':
    'Central Access Policy on the object was changed',
    '4928':
    'An Active Directory replica source naming context was established',
    '4929':
    'An Active Directory replica source naming context was removed',
    '4930':
    'An Active Directory replica source naming context was modified',
    '4931':
    'An Active Directory replica destination naming context was modified',
    '4932':
    'Synchronization of a replica of an Active Directory naming context has begun',
    '4933':
    'Synchronization of a replica of an Active Directory naming context has ended',
    '4934':
    'Attributes of an Active Directory object were replicated',
    '4935':
    'Replication failure begins',
    '4936':
    'Replication failure ends',
    '4937':
    'A lingering object was removed from a replica',
    '4944':
    'The following policy was active when the Windows Firewall started',
    '4945':
    'A rule was listed when the Windows Firewall started',
    '4946':
    'A change has been made to Windows Firewall exception list. A rule was added',
    '4947':
    'A change has been made to Windows Firewall exception list. A rule was modified',
    '4948':
    'A change has been made to Windows Firewall exception list. A rule was deleted',
    '4949':
    'Windows Firewall settings were restored to the default values',
    '4950':
    'A Windows Firewall setting has changed',
    '4951':
    'A rule has been ignored because its major version number was not recognized by Windows Firewall',
    '4952':
    'Parts of a rule have been ignored because its minor version number was not recognized by Windows Firewall',
    '4953':
    'A rule has been ignored by Windows Firewall because it could not parse the rule',
    '4954':
    'Windows Firewall Group Policy settings has changed. The new settings have been applied',
    '4956':
    'Windows Firewall has changed the active profile',
    '4957':
    'Windows Firewall did not apply the following rule',
    '4958':
    'Windows Firewall did not apply the following rule because the rule referred to items not configured on this computer',
    '4960':
    'IPsec dropped an inbound packet that failed an integrity check',
    '4961':
    'IPsec dropped an inbound packet that failed a replay check',
    '4962':
    'IPsec dropped an inbound packet that failed a replay check',
    '4963':
    'IPsec dropped an inbound clear text packet that should have been secured',
    '4964':
    'Special groups have been assigned to a new logon',
    '4965':
    'IPsec received a packet from a remote computer with an incorrect Security Parameter Index (SPI).',
    '4976':
    'During Main Mode negotiation, IPsec received an invalid negotiation packet.',
    '4977':
    'During Quick Mode negotiation, IPsec received an invalid negotiation packet.',
    '4978':
    'During Extended Mode negotiation, IPsec received an invalid negotiation packet.',
    '4979':
    'IPsec Main Mode and Extended Mode security associations were established.',
    '4980':
    'IPsec Main Mode and Extended Mode security associations were established',
    '4981':
    'IPsec Main Mode and Extended Mode security associations were established',
    '4982':
    'IPsec Main Mode and Extended Mode security associations were established',
    '4983':
    'An IPsec Extended Mode negotiation failed',
    '4984':
    'An IPsec Extended Mode negotiation failed',
    '4985':
    'The state of a transaction has changed',
    '5024':
    'The Windows Firewall Service has started successfully',
    '5025':
    'The Windows Firewall Service has been stopped',
    '5027':
    'The Windows Firewall Service was unable to retrieve the security policy from the local storage',
    '5028':
    'The Windows Firewall Service was unable to parse the new security policy.',
    '5029':
    'The Windows Firewall Service failed to initialize the driver',
    '5030':
    'The Windows Firewall Service failed to start',
    '5031':
    'The Windows Firewall Service blocked an application from accepting incoming connections on the network.',
    '5032':
    'Windows Firewall was unable to notify the user that it blocked an application from accepting incoming connections on the network',
    '5033':
    'The Windows Firewall Driver has started successfully',
    '5034':
    'The Windows Firewall Driver has been stopped',
    '5035':
    'The Windows Firewall Driver failed to start',
    '5037':
    'The Windows Firewall Driver detected critical runtime error. Terminating',
    '5038':
    'Code integrity determined that the image hash of a file is not valid',
    '5039':
    'A registry key was virtualized.',
    '5040':
    'A change has been made to IPsec settings. An Authentication Set was added.',
    '5041':
    'A change has been made to IPsec settings. An Authentication Set was modified',
    '5042':
    'A change has been made to IPsec settings. An Authentication Set was deleted',
    '5043':
    'A change has been made to IPsec settings. A Connection Security Rule was added',
    '5044':
    'A change has been made to IPsec settings. A Connection Security Rule was modified',
    '5045':
    'A change has been made to IPsec settings. A Connection Security Rule was deleted',
    '5046':
    'A change has been made to IPsec settings. A Crypto Set was added',
    '5047':
    'A change has been made to IPsec settings. A Crypto Set was modified',
    '5048':
    'A change has been made to IPsec settings. A Crypto Set was deleted',
    '5049':
    'An IPsec Security Association was deleted',
    '5050':
    'An attempt to programmatically disable the Windows Firewall using a call to INetFwProfile.FirewallEnabled(FALSE)',
    '5051':
    'A file was virtualized',
    '5056':
    'A cryptographic self test was performed',
    '5057':
    'A cryptographic primitive operation failed',
    '5058':
    'Key file operation',
    '5059':
    'Key migration operation',
    '5060':
    'Verification operation failed',
    '5061':
    'Cryptographic operation',
    '5062':
    'A kernel-mode cryptographic self test was performed',
    '5063':
    'A cryptographic provider operation was attempted',
    '5064':
    'A cryptographic context operation was attempted',
    '5065':
    'A cryptographic context modification was attempted',
    '5066':
    'A cryptographic function operation was attempted',
    '5067':
    'A cryptographic function modification was attempted',
    '5068':
    'A cryptographic function provider operation was attempted',
    '5069':
    'A cryptographic function property operation was attempted',
    '5070':
    'A cryptographic function property operation was attempted',
    '5071':
    'Key access denied by Microsoft key distribution service',
    '5120':
    'OCSP Responder Service Started',
    '5121':
    'OCSP Responder Service Stopped',
    '5122':
    'A Configuration entry changed in the OCSP Responder Service',
    '5123':
    'A configuration entry changed in the OCSP Responder Service',
    '5124':
    'A security setting was updated on OCSP Responder Service',
    '5125':
    'A request was submitted to OCSP Responder Service',
    '5126':
    'Signing Certificate was automatically updated by the OCSP Responder Service',
    '5127':
    'The OCSP Revocation Provider successfully updated the revocation information',
    '5136':
    'A directory service object was modified',
    '5137':
    'A directory service object was created',
    '5138':
    'A directory service object was undeleted',
    '5139':
    'A directory service object was moved',
    '5140':
    'A network share object was accessed',
    '5141':
    'A directory service object was deleted',
    '5142':
    'A network share object was added.',
    '5143':
    'A network share object was modified',
    '5144':
    'A network share object was deleted.',
    '5145':
    'A network share object was checked to see whether client can be granted desired access',
    '5146':
    'The Windows Filtering Platform has blocked a packet',
    '5147':
    'A more restrictive Windows Filtering Platform filter has blocked a packet',
    '5148':
    'The Windows Filtering Platform has detected a DoS attack and entered a defensive mode; packets associated with this attack will be discarded.',
    '5149':
    'The DoS attack has subsided and normal processing is being resumed.',
    '5150':
    'The Windows Filtering Platform has blocked a packet.',
    '5151':
    'A more restrictive Windows Filtering Platform filter has blocked a packet.',
    '5152':
    'The Windows Filtering Platform blocked a packet',
    '5153':
    'A more restrictive Windows Filtering Platform filter has blocked a packet',
    '5154':
    'The Windows Filtering Platform has permitted an application or service to listen on a port for incoming connections',
    '5155':
    'The Windows Filtering Platform has blocked an application or service from listening on a port for incoming connections',
    '5156':
    'The Windows Filtering Platform has allowed a connection',
    '5157':
    'The Windows Filtering Platform has blocked a connection',
    '5158':
    'The Windows Filtering Platform has permitted a bind to a local port',
    '5159':
    'The Windows Filtering Platform has blocked a bind to a local port',
    '5168':
    'SPN check for SMB/SMB2 fails.',
    '5169':
    'A directory service object was modified',
    '5170':
    'A directory service object was modified during a background cleanup task',
    '5376':
    'Credential Manager credentials were backed up',
    '5377':
    'Credential Manager credentials were restored from a backup',
    '5378':
    'The requested credentials delegation was disallowed by policy',
    '5440':
    'The following callout was present when the Windows Filtering Platform Base Filtering Engine started',
    '5441':
    'The following filter was present when the Windows Filtering Platform Base Filtering Engine started',
    '5442':
    'The following provider was present when the Windows Filtering Platform Base Filtering Engine started',
    '5443':
    'The following provider context was present when the Windows Filtering Platform Base Filtering Engine started',
    '5444':
    'The following sub-layer was present when the Windows Filtering Platform Base Filtering Engine started',
    '5446':
    'A Windows Filtering Platform callout has been changed',
    '5447':
    'A Windows Filtering Platform filter has been changed',
    '5448':
    'A Windows Filtering Platform provider has been changed',
    '5449':
    'A Windows Filtering Platform provider context has been changed',
    '5450':
    'A Windows Filtering Platform sub-layer has been changed',
    '5451':
    'An IPsec Quick Mode security association was established',
    '5452':
    'An IPsec Quick Mode security association ended',
    '5453':
    'An IPsec negotiation with a remote computer failed because the IKE and AuthIP IPsec Keying Modules (IKEEXT) service is not started',
    '5456':
    'PAStore Engine applied Active Directory storage IPsec policy on the computer',
    '5457':
    'PAStore Engine failed to apply Active Directory storage IPsec policy on the computer',
    '5458':
    'PAStore Engine applied locally cached copy of Active Directory storage IPsec policy on the computer',
    '5459':
    'PAStore Engine failed to apply locally cached copy of Active Directory storage IPsec policy on the computer',
    '5460':
    'PAStore Engine applied local registry storage IPsec policy on the computer',
    '5461':
    'PAStore Engine failed to apply local registry storage IPsec policy on the computer',
    '5462':
    'PAStore Engine failed to apply some rules of the active IPsec policy on the computer',
    '5463':
    'PAStore Engine polled for changes to the active IPsec policy and detected no changes',
    '5464':
    'PAStore Engine polled for changes to the active IPsec policy, detected changes, and applied them to IPsec Services',
    '5465':
    'PAStore Engine received a control for forced reloading of IPsec policy and processed the control successfully',
    '5466':
    'PAStore Engine polled for changes to the Active Directory IPsec policy, determined that Active Directory cannot be reached, and will use the cached copy of the Active Directory IPsec policy instead',
    '5467':
    'PAStore Engine polled for changes to the Active Directory IPsec policy, determined that Active Directory can be reached, and found no changes to the policy',
    '5468':
    'PAStore Engine polled for changes to the Active Directory IPsec policy, determined that Active Directory can be reached, found changes to the policy, and applied those changes',
    '5471':
    'PAStore Engine loaded local storage IPsec policy on the computer',
    '5472':
    'PAStore Engine failed to load local storage IPsec policy on the computer',
    '5473':
    'PAStore Engine loaded directory storage IPsec policy on the computer',
    '5474':
    'PAStore Engine failed to load directory storage IPsec policy on the computer',
    '5477':
    'PAStore Engine failed to add quick mode filter',
    '5478':
    'IPsec Services has started successfully',
    '5479':
    'IPsec Services has been shut down successfully',
    '5480':
    'IPsec Services failed to get the complete list of network interfaces on the computer',
    '5483':
    'IPsec Services failed to initialize RPC server. IPsec Services could not be started',
    '5484':
    'IPsec Services has experienced a critical failure and has been shut down',
    '5485':
    'IPsec Services failed to process some IPsec filters on a plug-and-play event for network interfaces',
    '5632':
    'A request was made to authenticate to a wireless network',
    '5633':
    'A request was made to authenticate to a wired network',
    '5712':
    'A Remote Procedure Call (RPC) was attempted',
    '5888':
    'An object in the COM+ Catalog was modified',
    '5889':
    'An object was deleted from the COM+ Catalog',
    '5890':
    'An object was added to the COM+ Catalog',
    '6144':
    'Security policy in the group policy objects has been applied successfully',
    '6145':
    'One or more errors occurred while processing security policy in the group policy objects',
    '6272':
    'Network Policy Server granted access to a user',
    '6273':
    'Network Policy Server denied access to a user',
    '6274':
    'Network Policy Server discarded the request for a user',
    '6275':
    'Network Policy Server discarded the accounting request for a user',
    '6276':
    'Network Policy Server quarantined a user',
    '6277':
    'Network Policy Server granted access to a user but put it on probation because the host did not meet the defined health policy',
    '6278':
    'Network Policy Server granted full access to a user because the host met the defined health policy',
    '6279':
    'Network Policy Server locked the user account due to repeated failed authentication attempts',
    '6280':
    'Network Policy Server unlocked the user account',
    '6281':
    'Code Integrity determined that the page hashes of an image file are not valid...',
    '6400':
    'BranchCache: Received an incorrectly formatted response while discovering availability of content.',
    '6401':
    'BranchCache: Received invalid data from a peer. Data discarded.',
    '6402':
    'BranchCache: The message to the hosted cache offering it data is incorrectly formatted.',
    '6403':
    'BranchCache: The hosted cache sent an incorrectly formatted response to the clients message to offer it data.',
    '6404':
    'BranchCache: Hosted cache could not be authenticated using the provisioned SSL certificate.',
    '6405':
    'BranchCache: %2 instance(s) of event id %1 occurred.',
    '6406':
    '%1 registered to Windows Firewall to control filtering for the following:',
    '6407':
    '%1',
    '6408':
    'Registered product %1 failed and Windows Firewall is now controlling the filtering for %2.',
    '6409':
    'BranchCache: A service connection point object could not be parsed',
    '6410':
    'Code integrity determined that a file does not meet the security requirements to load into a process. This could be due to the use of shared sections or other issues',
    '6416':
    'A new external device was recognized by the system.',
    '6417':
    'The FIPS mode crypto selftests succeeded',
    '6418':
    'The FIPS mode crypto selftests failed',
    '6419':
    'A request was made to disable a device',
    '6420':
    'A device was disabled',
    '6421':
    'A request was made to enable a device',
    '6422':
    'A device was enabled',
    '6423':
    'The installation of this device is forbidden by system policy',
    '6424':
    'The installation of this device was allowed, after having previously been forbidden by policy',
}

####################### BEGIN FUNCTIONS ############################


def verify_file(file_location_tmp):
    # Normalize path separators to forward slashes and drop any trailing slash
    file_loc = file_location_tmp.replace("\\\\", "/").replace(
        "\\", "/").rstrip("/")

    if not os.path.exists(file_loc):
        print(
            "ERROR: \"" + file_loc +
            "\" cannot be found by the system. Please verify the filename and path are correct.")
        print("Exiting...")
        sys.exit(1)
    else:
        return file_loc
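The path cleanup performed by `verify_file` can be isolated as a pure helper, which makes the behavior easy to verify without touching the filesystem. This is an illustrative sketch (`normalize_path` is not part of cdqr.py):

```python
def normalize_path(raw_path):
    # Collapse escaped and single backslashes to forward slashes, then
    # drop any trailing slash, mirroring verify_file's cleanup step
    return raw_path.replace("\\\\", "/").replace("\\", "/").rstrip("/")
```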

def query_plaso_location():
    # Prompt the user for a Plaso directory and confirm the required
    # executables exist before returning their validated locations
    while True:
        sys.stdout.writelines(
            "Please enter valid location for Plaso directory: ")
        p_path = input()
        # Verify files exist
        l2t_loc = p_path.rstrip("/").rstrip().strip("\"") + "/log2timeline.exe"
        p_loc = p_path.rstrip("/").rstrip().strip("\"") + "/psort.exe"
        if not os.path.isfile(l2t_loc):
            print("ERROR: " + l2t_loc + " does not exist")
        else:
            if not os.path.isfile(p_loc):
                print("ERROR: " + p_loc + " does not exist")
            else:
                return l2t_loc, p_loc


# Ask a yes/no question via input() and return their answer.
def query_yes_no(args, question, default="yes"):
    if args.confirmAll:
        if default == "yes":
            return True
        else:
            return False
    if default == "yes":
        prompt = " [Y/n]"
        yes = set(['yes', 'y', 'ye', ''])
        no = set(['no', 'n'])
    else:
        prompt = " [y/N]"
        yes = set(['yes', 'ye', 'y'])
        no = set(['no', 'n', ''])

    while True:
        sys.stdout.writelines(question + prompt + ": ")
        choice = input().lower()
        if choice in yes:
            return True
        elif choice in no:
            return False
        else:
            sys.stdout.write("Please respond with 'yes' or 'no'\n")

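The answer-normalization logic inside `query_yes_no` can be sketched as a pure function, separating interpretation from the input loop. This is an illustrative helper (`interpret_answer` is not part of cdqr.py):

```python
def interpret_answer(choice, default="yes"):
    # Map a raw response to True/False, treating an empty reply as the
    # default; None signals that the reply was not understood
    yes = {'yes', 'ye', 'y'}
    no = {'no', 'n'}
    choice = choice.strip().lower()
    if choice == '':
        return default == "yes"
    if choice in yes:
        return True
    if choice in no:
        return False
    return None
```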

def status_marker(mylogfile, myproc):
    counter = 1
    while myproc.poll() is None:
        if counter % 2 == 0:
            sys.stdout.writelines("| Still working...\r")
        else:
            sys.stdout.writelines("- Still working...\r")
        sys.stdout.flush()
        counter += 1
        time.sleep(1)

    if myproc.poll() != 0:
        print("ERROR: There was a problem. See the log for details.")
        mylogfile.writelines(
            "ERROR: There was a problem. See the log for details.\n")
        print("\nExiting.......")
        sys.exit(1)


def multi_thread_reports(mqueue, infile, terms):
    # terms is [compiled_pattern, report_file_handle, report_file_name].
    # Note: re.Pattern.search() takes a start position, not flags, as its
    # second argument, so passing re.I here would silently skip the first
    # two characters of each line; case-insensitivity belongs in re.compile().
    for line in infile:
        if terms[0].search(line):
            mqueue.put(terms[1].writelines(
                line.replace("\n", " ").replace("\r", " ") + "\n"))
    print("Report Created:", terms[2])
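The core routing step above — keep a timeline line when a compiled parser pattern matches, flattening embedded newlines — can be expressed as a pure function for clarity. This is an illustrative sketch (`filter_report_lines` is a hypothetical helper, not part of cdqr.py):

```python
import re


def filter_report_lines(pattern, lines):
    # Keep lines matching the compiled parser pattern, replacing any
    # embedded CR/LF with spaces the same way multi_thread_reports does
    return [line.replace("\n", " ").replace("\r", " ") + "\n"
            for line in lines if pattern.search(line)]
```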


def create_reports(args, mylogfile, dst_loc, csv_file, parser_opt):
    start_dt = datetime.datetime.now()
    print("Reporting started at: " + str(start_dt))
    mylogfile.writelines("Reporting started at: " + str(start_dt) + "\n")
    # Create individual reports
    print(
        "\nCreating the individual reports (This will take a long time for large files)"
    )
    mylogfile.writelines(
        "\nCreating the individual reports (This will take a long time for large files)\n"
    )
    # Create report directory and file names
    rpt_dir_name = dst_loc + "/Reports"
    rpt_evt_name = rpt_dir_name + "/Event Log Report.csv"
    rpt_fsfs_name = rpt_dir_name + "/File System Report.csv"
    rpt_fsmft_name = rpt_dir_name + "/MFT Report.csv"
    rpt_fsusnjrnl_name = rpt_dir_name + "/UsnJrnl Report.csv"
    rpt_ih_name = rpt_dir_name + "/Internet History Report.csv"
    rpt_pf_name = rpt_dir_name + "/Prefetch Report.csv"
    rpt_appc_name = rpt_dir_name + "/Appcompat Report.csv"
    rpt_reg_name = rpt_dir_name + "/Registry Report.csv"
    rpt_st_name = rpt_dir_name + "/Scheduled Tasks Report.csv"
    rpt_per_name = rpt_dir_name + "/Persistence Report.csv"
    rpt_si_name = rpt_dir_name + "/System Information Report.csv"
    rpt_av_name = rpt_dir_name + "/AntiVirus Report.csv"
    rpt_fw_name = rpt_dir_name + "/Firewall Report.csv"
    rpt_mac_name = rpt_dir_name + "/Mac Report.csv"
    rpt_lin_name = rpt_dir_name + "/Linux Report.csv"
    rpt_android_name = rpt_dir_name + "/Android Report.csv"
    rpt_amcache_name = rpt_dir_name + "/Amcache Report.csv"
    rpt_bash_name = rpt_dir_name + "/Bash Report.csv"

    # RC1 search strings for each report
    # Compile case-insensitively so matching does not depend on how parser
    # names are capitalized in the psort output (Pattern.search() cannot
    # take flags, so re.IGNORECASE must be applied here)
    rpt_evt_search = re.compile(r'winevt', re.IGNORECASE)
    rpt_fsfs_search = re.compile(r'filestat|recycle_bin|fseventsd', re.IGNORECASE)
    rpt_fsmft_search = re.compile(r',mft,', re.IGNORECASE)
    rpt_fsusnjrnl_search = re.compile(r',usnjrnl,', re.IGNORECASE)
    rpt_ih_search = re.compile(r'chrome_cache|chrome_preferences|firefox_cache|gdrive_synclog|opera_global|opera_typed_history|sqlite/chrome_27_history|sqlite/chrome_8_history|sqlite/chrome_autofill|sqlite/chrome_cookies|sqlite/chrome_extension_activity|sqlite/firefox_cookies|sqlite/firefox_downloads|sqlite/firefox_history|sqlite/google_drive|sqlite/skype|binary_cookies|esedb/msie_webcache|plist/safari_history|xchatlog|xchatscrollback', re.IGNORECASE)
    rpt_pf_search = re.compile(r'prefetch', re.IGNORECASE)
    rpt_appc_search = re.compile(r'appcompatcache', re.IGNORECASE)
    rpt_reg_search = re.compile(r'winreg', re.IGNORECASE)
    rpt_st_search = re.compile(r'winjob|windows_task_cache|cron', re.IGNORECASE)
    rpt_per_search = re.compile(
        r'bagmru|bencode|mrulist|msie_zone|mstsc_rdp|userassist|windows_boot|windows_run|windows_sam_users|windows_services|winrar_mru',
        re.IGNORECASE
    )
    rpt_si_search = re.compile(
        r'explorer_|mac_keychain|mac_securityd|mackeeper_cache|macosx_bluetooth|macosx_install_history|mactime|macuser|macwifi|network_drives|rplog|windows_shutdown|windows_timezone|windows_usb_devices|windows_usbstor_devices|windows_version',
        re.IGNORECASE
    )
    rpt_av_search = re.compile(r'mcafee_protection|symantec_scanlog|sophos_av', re.IGNORECASE)
    rpt_fw_search = re.compile(r'winfirewall|mac_appfirewall_log', re.IGNORECASE)
    rpt_mac_search = re.compile(
        r'bencode|czip/oxml|dockerjson|java_idx|msiecf|olecf|pls_recall|popularity_contest|selinux|syslog|systemd_journal|utmpx|xchatlog|xchatscrollback|asl_log|bsm_log|cups_ipp|mac_keychain|mac_securityd|macwifi|plist|sqlite/appusage|sqlite/imessage|sqlite/ls_quarantine|sqlite/mac_document_versions|sqlite/mac_knowledgec|sqlite/mac_notes|sqlite/mackeeper_cache',
        re.IGNORECASE
    )
    rpt_lin_search = re.compile(
        r'bencode|czip/oxml|dockerjson|dpkg|java_idx|msiecf|olecf|sqlite/zeitgeist|syslog|systemd_journal|utmp|xchatlog|xchatscrollback',
        re.IGNORECASE
    )
    rpt_android_search = re.compile(
        r'android_app_usage|sqlite/android_calls|sqlite/android_sms|sqlite/android_webview',
        re.IGNORECASE
    )
    rpt_bash_search = re.compile(r'bash|zsh_extended_history', re.IGNORECASE)
    rpt_amcache_search = re.compile(r'amcache', re.IGNORECASE)

    # Create a list of the report names
    if parser_opt == "datt":
        lor = [
                rpt_appc_name,
                rpt_evt_name,
                rpt_fsfs_name,
                rpt_fsmft_name,
                rpt_fsusnjrnl_name,
                rpt_ih_name,
                rpt_pf_name,
                rpt_reg_name,
                rpt_st_name,
                rpt_per_name,
                rpt_si_name,
                rpt_av_name,
                rpt_fw_name,
                rpt_mac_name,
                rpt_lin_name,
                rpt_android_name,
                rpt_amcache_name,
                rpt_bash_name]
    elif parser_opt == "win":
        lor = [
                rpt_appc_name,
                rpt_evt_name,
                rpt_fsfs_name,
                rpt_fsmft_name,
                rpt_fsusnjrnl_name,
                rpt_ih_name,
                rpt_pf_name,
                rpt_reg_name,
                rpt_st_name,
                rpt_per_name,
                rpt_si_name,
                rpt_av_name,
                rpt_fw_name,
                rpt_amcache_name,
                rpt_bash_name]
    elif parser_opt == "mac":
        lor = [
                rpt_fsfs_name,
                rpt_ih_name,
                rpt_per_name,
                rpt_si_name,
                rpt_av_name,
                rpt_fw_name,
                rpt_mac_name,
                rpt_bash_name]
    elif parser_opt == "android":
        lor = [
                rpt_fsfs_name,
                rpt_ih_name,
                rpt_per_name,
                rpt_si_name,
                rpt_av_name,
                rpt_fw_name,
                rpt_android_name]
    else:
        lor = [
                rpt_fsfs_name,
                rpt_ih_name,
                rpt_per_name,
                rpt_si_name,
                rpt_av_name,
                rpt_fw_name,
                rpt_lin_name,
                rpt_bash_name]

    # Create Report directory
    if not os.path.isdir(rpt_dir_name):
        os.makedirs(rpt_dir_name)

    # Check if files exist
    create_rep = True
    all_reports_exit = True
    existing_report_list = []
    for rpt_name in lor:
        if not os.path.isfile(rpt_name):
            all_reports_exit = False
        else:
            existing_report_list.append(rpt_name)

    if all_reports_exit:
        if query_yes_no(
                args,
                "\nAll sub-reports already exist.  Would you like to delete these files?",
                "no"):
            for rpt_name in lor:
                os.remove(rpt_name)
        else:
            return

    # Create list of file handles + search terms based on the parser option selected
    if parser_opt == "datt":
        # Open all report files for writing
        rpt_appc = open(rpt_appc_name, 'a+', encoding='utf-8')
        rpt_evt = open(rpt_evt_name, 'a+', encoding='utf-8')
        rpt_fsfs = open(rpt_fsfs_name, 'a+', encoding='utf-8')
        rpt_fsmft = open(rpt_fsmft_name, 'a+', encoding='utf-8')
        rpt_fsusnjrnl = open(rpt_fsusnjrnl_name, 'a+', encoding='utf-8')
        rpt_ih = open(rpt_ih_name, 'a+', encoding='utf-8')
        rpt_pf = open(rpt_pf_name, 'a+', encoding='utf-8')
        rpt_reg = open(rpt_reg_name, 'a+', encoding='utf-8')
        rpt_st = open(rpt_st_name, 'a+', encoding='utf-8')
        rpt_per = open(rpt_per_name, 'a+', encoding='utf-8')
        rpt_si = open(rpt_si_name, 'a+', encoding='utf-8')
        rpt_av = open(rpt_av_name, 'a+', encoding='utf-8')
        rpt_fw = open(rpt_fw_name, 'a+', encoding='utf-8')
        rpt_mac = open(rpt_mac_name, 'a+', encoding='utf-8')
        rpt_lin = open(rpt_lin_name, 'a+', encoding='utf-8')
        rpt_android = open(rpt_android_name, 'a+', encoding='utf-8')
        rpt_amcache = open(rpt_amcache_name, 'a+', encoding='utf-8')
        rpt_bash = open(rpt_bash_name, 'a+', encoding='utf-8')
        lofh = [
                [rpt_appc_search, rpt_appc, rpt_appc_name],
                [rpt_evt_search, rpt_evt, rpt_evt_name],
                [rpt_fsfs_search, rpt_fsfs, rpt_fsfs_name],
                [rpt_fsmft_search, rpt_fsmft, rpt_fsmft_name],
                [rpt_fsusnjrnl_search, rpt_fsusnjrnl, rpt_fsusnjrnl_name],
                [rpt_ih_search, rpt_ih, rpt_ih_name],
                [rpt_pf_search, rpt_pf, rpt_pf_name],
                [rpt_reg_search, rpt_reg, rpt_reg_name],
                [rpt_st_search, rpt_st, rpt_st_name],
                [rpt_per_search, rpt_per, rpt_per_name],
                [rpt_si_search, rpt_si, rpt_si_name],
                [rpt_av_search, rpt_av, rpt_av_name],
                [rpt_fw_search, rpt_fw, rpt_fw_name],
                [rpt_mac_search, rpt_mac, rpt_mac_name],
                [rpt_lin_search, rpt_lin, rpt_lin_name],
                [rpt_android_search, rpt_android, rpt_android_name],
                [rpt_amcache_search, rpt_amcache, rpt_amcache_name],
                [rpt_bash_search, rpt_bash, rpt_bash_name]]
    elif parser_opt == "android":
        # Open all report files for writing
        # Open Linux report files for writing
        rpt_fsfs = open(rpt_fsfs_name, 'a+', encoding='utf-8')
        rpt_ih = open(rpt_ih_name, 'a+', encoding='utf-8')
        rpt_per = open(rpt_per_name, 'a+', encoding='utf-8')
        rpt_si = open(rpt_si_name, 'a+', encoding='utf-8')
        rpt_av = open(rpt_av_name, 'a+', encoding='utf-8')
        rpt_fw = open(rpt_fw_name, 'a+', encoding='utf-8')
        rpt_android = open(rpt_android_name, 'a+', encoding='utf-8')
        lofh = [
                [rpt_fsfs_search, rpt_fsfs, rpt_fsfs_name],
                [rpt_ih_search, rpt_ih, rpt_ih_name],
                [rpt_per_search, rpt_per, rpt_per_name],
                [rpt_si_search, rpt_si, rpt_si_name],
                [rpt_av_search, rpt_av, rpt_av_name],
                [rpt_fw_search, rpt_fw, rpt_fw_name],
                [rpt_android_search, rpt_android, rpt_android_name]]
    elif parser_opt == "win":
        # Open windows report files for writing
        rpt_appc = open(rpt_appc_name, 'a+', encoding='utf-8')
        rpt_evt = open(rpt_evt_name, 'a+', encoding='utf-8')
        rpt_fsfs = open(rpt_fsfs_name, 'a+', encoding='utf-8')
        rpt_fsmft = open(rpt_fsmft_name, 'a+', encoding='utf-8')
        rpt_fsusnjrnl = open(rpt_fsusnjrnl_name, 'a+', encoding='utf-8')
        rpt_ih = open(rpt_ih_name, 'a+', encoding='utf-8')
        rpt_pf = open(rpt_pf_name, 'a+', encoding='utf-8')
        rpt_reg = open(rpt_reg_name, 'a+', encoding='utf-8')
        rpt_st = open(rpt_st_name, 'a+', encoding='utf-8')
        rpt_per = open(rpt_per_name, 'a+', encoding='utf-8')
        rpt_si = open(rpt_si_name, 'a+', encoding='utf-8')
        rpt_av = open(rpt_av_name, 'a+', encoding='utf-8')
        rpt_fw = open(rpt_fw_name, 'a+', encoding='utf-8')
        rpt_amcache = open(rpt_amcache_name, 'a+', encoding='utf-8')
        rpt_bash = open(rpt_bash_name, 'a+', encoding='utf-8')
        lofh = [
                [rpt_appc_search, rpt_appc, rpt_appc_name],
                [rpt_evt_search, rpt_evt, rpt_evt_name],
                [rpt_fsfs_search, rpt_fsfs, rpt_fsfs_name],
                [rpt_fsmft_search, rpt_fsmft, rpt_fsmft_name],
                [rpt_fsusnjrnl_search, rpt_fsusnjrnl, rpt_fsusnjrnl_name],
                [rpt_ih_search, rpt_ih, rpt_ih_name],
                [rpt_pf_search, rpt_pf, rpt_pf_name],
                [rpt_reg_search, rpt_reg, rpt_reg_name],
                [rpt_st_search, rpt_st, rpt_st_name],
                [rpt_per_search, rpt_per, rpt_per_name],
                [rpt_si_search, rpt_si, rpt_si_name],
                [rpt_av_search, rpt_av, rpt_av_name],
                [rpt_fw_search, rpt_fw, rpt_fw_name],
                [rpt_amcache_search, rpt_amcache, rpt_amcache_name],
                [rpt_bash_search, rpt_bash, rpt_bash_name]]
    elif parser_opt == "mac":
        # Open Mac report files for writing
        rpt_fsfs = open(rpt_fsfs_name, 'a+', encoding='utf-8')
        rpt_ih = open(rpt_ih_name, 'a+', encoding='utf-8')
        rpt_per = open(rpt_per_name, 'a+', encoding='utf-8')
        rpt_si = open(rpt_si_name, 'a+', encoding='utf-8')
        rpt_av = open(rpt_av_name, 'a+', encoding='utf-8')
        rpt_fw = open(rpt_fw_name, 'a+', encoding='utf-8')
        rpt_mac = open(rpt_mac_name, 'a+', encoding='utf-8')
        rpt_bash = open(rpt_bash_name, 'a+', encoding='utf-8')
        lofh = [
                [rpt_fsfs_search, rpt_fsfs, rpt_fsfs_name],
                [rpt_ih_search, rpt_ih, rpt_ih_name],
                [rpt_per_search, rpt_per, rpt_per_name],
                [rpt_si_search, rpt_si, rpt_si_name],
                [rpt_av_search, rpt_av, rpt_av_name],
                [rpt_fw_search, rpt_fw, rpt_fw_name],
                [rpt_mac_search, rpt_mac, rpt_mac_name],
                [rpt_bash_search, rpt_bash, rpt_bash_name]]
    else:
        # Open Linux report files for writing
        rpt_fsfs = open(rpt_fsfs_name, 'a+', encoding='utf-8')
        rpt_ih = open(rpt_ih_name, 'a+', encoding='utf-8')
        rpt_per = open(rpt_per_name, 'a+', encoding='utf-8')
        rpt_si = open(rpt_si_name, 'a+', encoding='utf-8')
        rpt_av = open(rpt_av_name, 'a+', encoding='utf-8')
        rpt_fw = open(rpt_fw_name, 'a+', encoding='utf-8')
        rpt_lin = open(rpt_lin_name, 'a+', encoding='utf-8')
        rpt_bash = open(rpt_bash_name, 'a+', encoding='utf-8')
        lofh = [
                [rpt_fsfs_search, rpt_fsfs, rpt_fsfs_name],
                [rpt_ih_search, rpt_ih, rpt_ih_name],
                [rpt_per_search, rpt_per, rpt_per_name],
                [rpt_si_search, rpt_si, rpt_si_name],
                [rpt_av_search, rpt_av, rpt_av_name],
                [rpt_fw_search, rpt_fw, rpt_fw_name],
                [rpt_lin_search, rpt_lin, rpt_lin_name],
                [rpt_bash_search, rpt_bash, rpt_bash_name]]

    # Write the header line in each new report file
    for item in lofh:
        if os.stat(item[2]).st_size == 0:
            item[1].writelines(
                "date,time,timezone,MACB,source,sourcetype,type,user,host,short,desc,version,filename,inode,notes,format,extra\n"
            )

    if not os.path.isfile(csv_file):
        print("File not found", csv_file)
        mylogfile.writelines("File not found " + csv_file + "\n")
        sys.exit(1)

    # Run each search for each report (in parallel) and write the results to the report CSV files
    counter = 1
    counter2 = True
    mqueue = queue.Queue()
    # Open the SuperTimeline file and read it into memory
    with io.open(csv_file, 'r', encoding='utf-8') as st_file:
        SuperTimeline_file = st_file.readlines()
    # Create all threads to start
    threads = []
    for terms in lofh:
        threads.append(
            threading.Thread(
                target=multi_thread_reports,
                args=(mqueue, SuperTimeline_file, terms)))

    [t.start() for t in threads]
    [t.join() for t in threads]

    # Close all report files
    for item in lofh:
        item[1].close()

    # Removing files with no output
    final_lor = []
    final_lor_nodata = []
    for i_filename in lor:
        if os.stat(i_filename).st_size == 111:  # size of a header-only report
            os.remove(i_filename)
            final_lor_nodata.append(i_filename)
        else:
            final_lor.append(i_filename)

    # Print report not created messages
    print("\nDid not keep " + str(len(final_lor_nodata)) +
          " Reports due to no matching data from SuperTimeline")
    mylogfile.writelines(
        "\nDid not keep " + str(len(final_lor_nodata)) +
        " Reports due to no matching data from SuperTimeline\n")
    for item in final_lor_nodata:
        print("Report not kept: " + item)
        mylogfile.writelines("Report not kept: " + item + "\n")

    # Print report created messages
    print("\nCreated " + str(len(final_lor)) + " Reports.  Now improving them")
    mylogfile.writelines("\nCreated " + str(len(final_lor)) + " Reports.")

    # Function to improve reports (in parallel)
    print(
        "Improving Reports if possible (This will take a long time for large files)"
    )
    mylogfile.writelines(
        "Improving Reports if possible (This will take a long time for large files)"
        + "\n")
    report_improvements(lor, mylogfile)

    print("\nAll reporting complete")
    mylogfile.writelines("\nAll reporting complete\n")
    end_dt = datetime.datetime.now()
    duration02 = end_dt - start_dt
    print("Reporting ended at: " + str(end_dt))
    print("Reporting duration was: " + str(duration02))
    mylogfile.writelines("Reporting ended at: " + str(end_dt) + "\n")
    mylogfile.writelines("Reporting duration was: " + str(duration02) + "\n")
    return


def plaso_version(log2timeline_location):
    # Query log2timeline for its version and return the "major.minor" string
    newout = subprocess.check_output(
        [log2timeline_location, "--version"],
        stderr=subprocess.STDOUT).decode("utf-8")
    pver_out = ".".join(str(newout).split(" ")[-1].split(".")[0:2]).rstrip(
        "\\n\'").rstrip("\\r").strip()
    return pver_out
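A minimal, self-contained sketch of the string handling in plaso_version(): the last whitespace-separated token of the "--version" output is reduced to its first two dot-separated components. The sample version line below is hypothetical, not captured from a real plaso build.

```python
# Illustrative only; mirrors the token handling in plaso_version() above.
def extract_major_minor(version_output):
    # Take the last whitespace-separated token and keep "major.minor" only
    return ".".join(version_output.split(" ")[-1].split(".")[0:2]).strip()

result = extract_major_minor("plaso - log2timeline version 1.5.1")
print(result)  # 1.5
```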


def output_elasticsearch(mylogfile, srcfilename, casename, psort_location,
                         server, port, user, logname):
    # Run psort against the plaso db file to export to the specified ElasticSearch server
    print("Exporting results in Kibana format to the ElasticSearch server")
    mylogfile.writelines(
        "Exporting results in Kibana format to the ElasticSearch server\n")

    # Create psort command to run
    command = [
        psort_location, "-o", "elastic", "--status_view", "none",
        "--index_name", "case_cdqr-" + casename.lower(), "--logfile", logname, "--server", server,
        "--port", port, srcfilename
    ]
    if user != "":
        command.append("--elastic_user")
        command.append(user)

    print("\"" + "\" \"".join(command) + "\"")
    mylogfile.writelines("\"" + "\" \"".join(command) + "\"" + "\n")

    # Execute Command
    status_marker(mylogfile,
                  subprocess.Popen(
                      command, stdout=mylogfile, stderr=mylogfile))

    print("All entries have been inserted into database with case: " +
          "case_cdqr-" + casename.lower())
    mylogfile.writelines(
        "All entries have been inserted into database with case: " +
        "case_cdqr-" + casename.lower() + "\n")


def output_elasticsearch_ts(mylogfile, srcfilename, casename, psort_location, logname):
    # Run psort against the plaso db file to export to ElasticSearch in TimeSketch format
    print("Exporting results in TimeSketch format to the ElasticSearch server")
    mylogfile.writelines(
        "Exporting results in TimeSketch format to the ElasticSearch server\n")

    # Create command to run
    command = [
        psort_location, "-o", "timesketch", "--status_view", "none",
        "--logfile", logname, "--name", casename.lower(),
        "--index", casename.lower(), srcfilename
    ]

    print("\"" + "\" \"".join(command) + "\"")
    mylogfile.writelines("\"" + "\" \"".join(command) + "\"" + "\n")

    # Execute Command
    status_marker(mylogfile,
                  subprocess.Popen(
                      command, stdout=mylogfile, stderr=mylogfile))

    print("All entries have been inserted into TimeSketch database with case: "
          + casename.lower())
    mylogfile.writelines(
        "All entries have been inserted into TimeSketch database with case: " +
        casename.lower() + "\n")


def zip_source(inputfile, outputzip):
    try:
        with zipfile.ZipFile(outputzip, "w") as zip_ref:
            zip_ref.write(inputfile, compress_type=compression)
        return
    except Exception as e:
        print("Unable to compress file: " + inputfile)
        print(e)
        sys.exit(1)


def unzip_source(src_loc_tmp, outputzipfolder):
    try:
        with zipfile.ZipFile(src_loc_tmp, "r") as zip_ref:
            if sys.platform[0:3] == "win":
                # Use the \\?\ extended-length path prefix to avoid MAX_PATH limits
                zip_ref.extractall(u'\\\\?\\' +
                                   os.path.abspath(outputzipfolder))
            else:
                zip_ref.extractall(os.path.abspath(outputzipfolder))
        return outputzipfolder
    except Exception as e:
        print("Unable to extract file: " + src_loc_tmp)
        print(e)
        sys.exit(1)
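A stdlib-only round-trip sketch of zip_source()/unzip_source() above. File names here are temporary stand-ins, and ZIP_DEFLATED stands in for cdqr's module-level `compression` setting.

```python
import os
import tempfile
import zipfile

# Create a throwaway source file to compress (illustrative data only)
workdir = tempfile.mkdtemp()
src = os.path.join(workdir, "evidence.txt")
with open(src, "w", encoding="utf-8") as f:
    f.write("sample data")

# Compress, as zip_source() does (compress_type is a stand-in here)
archive = os.path.join(workdir, "evidence.zip")
with zipfile.ZipFile(archive, "w") as zip_ref:
    zip_ref.write(src, arcname="evidence.txt",
                  compress_type=zipfile.ZIP_DEFLATED)

# Extract, as unzip_source() does on non-Windows platforms
outdir = os.path.join(workdir, "extracted")
with zipfile.ZipFile(archive, "r") as zip_ref:
    zip_ref.extractall(outdir)

with open(os.path.join(outdir, "evidence.txt"), encoding="utf-8") as f:
    restored = f.read()
print(restored)  # sample data
```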


def create_export(dst_loc, srcfilename, mylogfile, db_file, psort_location, logname):
    # Create Output filenames
    dstrawfilename = dst_loc + "/" + srcfilename.split("/")[-1] + ".json"
    dstfilename = dst_loc + "/" + srcfilename.split("/")[-1] + ".json.zip"
    if os.path.exists(dstfilename):
        if query_yes_no(
                args, "\n" + dstfilename +
                " already exists.  Would you like to delete that file?", "yes"):
            os.remove(dstfilename)

    # Run psort against plaso db file to output a file in line delimited json format
    print("Creating json line delimited file")
    mylogfile.writelines("Creating json line delimited file\n")

    # Create command to run
    command = [
        psort_location, "-o", "json_line", "--status_view", "none", db_file,
        "--logfile", logname, "-w", dstrawfilename
    ]

    print("\"" + "\" \"".join(command) + "\"")
    mylogfile.writelines("\"" + "\" \"".join(command) + "\"" + "\n")

    # Execute Command
    status_marker(mylogfile,
                  subprocess.Popen(
                      command, stdout=mylogfile, stderr=mylogfile))

    print("Json line delimited file created")
    mylogfile.writelines("Json line delimited file created" + "\n")

    return dstfilename

def get_parser_list(parser_opt, plaso_ver, args):
    parserlist = parse_optionslatest[parser_opt]
    if args.parser:
        parser_opt = args.parser[0]
    if parser_opt == "win":
        if args.mft:
            parserlist = parserlist + ",mft"
        if args.usnjrnl:
            parserlist = parserlist + ",usnjrnl"

    return parserlist
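A sketch of how get_parser_list() extends the base parser string for Windows targets. The `parse_optionslatest` mapping and the args object below are stand-ins with made-up values, not the real module-level data.

```python
# Stand-in for cdqr's module-level parser mapping (values are illustrative)
parse_optionslatest = {"win": "win7,webhist", "lin": "linux,webhist"}

class FakeArgs:  # hypothetical argparse namespace
    parser = None
    mft = True
    usnjrnl = False

# Same append logic as get_parser_list() for parser_opt == "win"
parserlist = parse_optionslatest["win"]
if FakeArgs.mft:
    parserlist = parserlist + ",mft"
if FakeArgs.usnjrnl:
    parserlist = parserlist + ",usnjrnl"
print(parserlist)  # win7,webhist,mft
```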

###################### REPORT FIXING SECTION ###############################

def prefetch_report_fix(row):
    header_desc_rows = report_header_dict['Prefetch Report.csv'][0][0]
    header_extra_rows = report_header_dict['Prefetch Report.csv'][1][0]

    if row[5] == "WinPrefetch":
        search_desc = re.compile(
            r'Prefetch \[(.{1,200})\](.{1,20}) - run count (\d{1,10})( (path): (.{1,200})|) (hash): (.{1,15}) (volume): (\d{1,10}) \[(serial number): (.{1,20})  (device path): (.+)\]'
        )
        search_extra = re.compile(
            r'(md5_hash): (.{1,100})  (number_of_volumes): (\d{1,10})  (version): (\d{1,10})  (volume_device_paths): \[u.(.{1,100}).\]  (volume_serial_numbers): \[(.+)\]'
        )
    else:
        search_desc = re.compile(
            r'(.{1,200}) (Serial number): (.{1,15}) (Origin): (.+)')
        search_extra = re.compile(r'(md5_hash): (.+) ')

    search_results_desc = re.search(search_desc, row[header_desc_rows])

    if row[5] == "WinPrefetch":
        if search_results_desc:
            if search_results_desc.group(4) == '':
                row[header_desc_rows] = search_results_desc.group(
                    1) + "," + search_results_desc.group(
                        3) + ",," + search_results_desc.group(
                            8) + "," + search_results_desc.group(
                                10) + "," + search_results_desc.group(
                                    12) + "," + search_results_desc.group(
                                        14) + ","
            else:
                row[header_desc_rows] = search_results_desc.group(
                    1) + "," + search_results_desc.group(
                        3) + "," + search_results_desc.group(
                            6) + "," + search_results_desc.group(
                                8) + "," + search_results_desc.group(
                                    10) + "," + search_results_desc.group(
                                        12) + "," + search_results_desc.group(
                                            14) + ","

        search_results_extra = re.search(
            search_extra, row[header_extra_rows]
        )  # 'md5_hash','number_of_volumes','version','volume_device_paths','volume_serial_numbers'
        if search_results_extra:
            row[header_extra_rows] = search_results_extra.group(
                2) + "," + search_results_extra.group(
                    4) + "," + search_results_extra.group(
                        6) + "," + search_results_extra.group(
                            8) + "," + search_results_extra.group(10)
    else:
        if search_results_desc:
            row[header_desc_rows] = ",,,," + search_results_desc.group(
                1) + "," + search_results_desc.group(
                    3) + ",," + search_results_desc.group(5)

        search_results_extra = re.search(search_extra, row[header_extra_rows])
        if search_results_extra:
            row[header_extra_rows] = search_results_extra.group(2) + ",,,,"

    row[12] = row[12].replace('OS:', '')
    return row


def appcompat_report_fix(row):
    header_desc_rows = report_header_dict['Appcompat Report.csv'][0][0]
    search_desc = re.compile(
        r'\[(.{1,100})\] (Cached entry): (\d+) (Path): (.+)')

    header_extra_rows = report_header_dict['Appcompat Report.csv'][1][0]
    search_extra = re.compile(r'(md5_hash): (.{1,50})')
    search_results_desc = re.search(search_desc, row[header_desc_rows])
    if search_results_desc:
        row[header_desc_rows] = search_results_desc.group(
            1) + "," + search_results_desc.group(
                3) + "," + search_results_desc.group(
                    5) + "," + search_results_desc.group(5).split('\\')[-1]

    search_results_extra = re.search(search_extra, row[header_extra_rows])
    if search_results_extra:
        row[header_extra_rows] = search_results_extra.group(2).strip()

    row[12] = row[12].replace('OS:', '')
    return row


def event_log_report_fix(row):
    # 'Event Log Report.csv': [[10, ['event_id', 'EID_desc', 'record_number',
    # 'event_level', 'source_name', 'computer_name', 'message']], ...]
    header_desc_rows = report_header_dict['Event Log Report.csv'][0][0]
    header_extra_rows = report_header_dict['Event Log Report.csv'][1][0]
    if row[4] == "EVT":
        search_desc = re.compile(
            r'\[(.{1,8}) /.{1,100} (Record Number): (.{1,10}) (Event Level): (.{1,10}) (Source Name): (.{1,300}) (Computer Name): (.{1,100}) (Strings|Message string): (\[(.+)\]|.+)'
        )
        search_extra = re.compile(
            r'(md5_hash): (.{1,50}) (message_identifier): (.{1,20}) (recovered): (True|False)  (strings_parsed): ({}  (user_sid): (.{1,75}) (xml_string): (.+)|.+)'
        )

        search_results_desc = re.search(search_desc, row[header_desc_rows])
        if search_results_desc:
            try:
                eventlog_string = eventlog_dict[search_results_desc.group(1)]
            except KeyError:
                eventlog_string = ""
            row[header_desc_rows] = search_results_desc.group(
                1) + "," + eventlog_string + "," + search_results_desc.group(
                    3) + "," + search_results_desc.group(
                        5) + "," + search_results_desc.group(
                            7) + "," + search_results_desc.group(9) + "," + (
                                (str(search_results_desc.group(12))).replace(
                                    "\r", " ")).replace("\n", " ")
        search_results_extra = re.search(search_extra, row[header_extra_rows])
        if search_results_extra:
            row[header_extra_rows] = search_results_extra.group(
                2) + "," + search_results_extra.group(
                    4) + "," + search_results_extra.group(
                        6) + "," + search_results_extra.group(8) + "," + str(
                            search_results_extra.group(10)) + "," + (
                                (str(search_results_extra.group(12))).replace(
                                    "\r", " ")).replace("\n", " ")
    else:
        if row[header_desc_rows] != "desc":
            row[header_desc_rows] = ",,,,,,"
            row[header_extra_rows] = ",,,,,"
    row[12] = row[12].replace('OS:', '')
    return row


def scheduled_tasks_report_fix(row):
    header_desc_rows = report_header_dict['Scheduled Tasks Report.csv'][0][0]
    search_desc = re.compile(
        r'(\[(.{1,200})\] (Task): (.{1,200}): \[(ID): \{(.{1,100})\}\]|(Task): (.{1,200}) \[(Identifier): \{(.{1,100})\}\])'
    )

    header_extra_rows = report_header_dict['Scheduled Tasks Report.csv'][1][0]
    search_extra = re.compile(r'(md5_hash): (.+) ')

    search_results_desc = re.search(search_desc, row[header_desc_rows])
    if search_results_desc:
        if search_results_desc.group(1)[0:4] == "Task":
            row[header_desc_rows] = "," + search_results_desc.group(
                8) + "," + search_results_desc.group(10)
        else:
            row[header_desc_rows] = search_results_desc.group(
                2) + "," + search_results_desc.group(
                    4) + "," + search_results_desc.group(6)

    search_results_extra = re.search(search_extra, row[header_extra_rows])
    if search_results_extra:
        row[header_extra_rows] = search_results_extra.group(2)

    return row


def file_system_report_fix(row):
    if row[0] != "" and row[0] != "--":
        header_desc_rows = report_header_dict['File System Report.csv'][0][0]
        FS_search_desc = re.compile(r'(..):(.{1,500})(Type):(.{1,100})')

        header_extra_rows = report_header_dict['File System Report.csv'][1][0]
        FS_search_extra = re.compile(
            r'(file_size): \((.{1,50}) \)  (file_system_type): (.{1,20})  (is_allocated): (True|False)(  (md5_hash): (.+) |)'
        )

        search_results_desc = re.search(FS_search_desc, row[header_desc_rows])
        if search_results_desc:
            row[header_desc_rows] = search_results_desc.group(
                2) + "," + search_results_desc.group(4)
        search_results_extra = re.search(FS_search_extra,
                                         row[header_extra_rows])

        if search_results_extra:
            if search_results_extra.group(7) != '':
                row[header_extra_rows] = search_results_extra.group(
                    2) + "," + search_results_extra.group(
                        4) + "," + search_results_extra.group(
                            6) + "," + search_results_extra.group(9)
            else:
                row[header_extra_rows] = search_results_extra.group(
                    2) + "," + search_results_extra.group(
                        4) + "," + search_results_extra.group(6) + ","
        return row
    else:
        return [
            "", "", "", "", "", "", "", "", "", "", "", "", "", "", "", "", "",
            "", "", "", ""
        ]


def mft_report_fix(row):
    header_desc_rows = report_header_dict['MFT Report.csv'][0][0]
    header_extra_rows = report_header_dict['MFT Report.csv'][1][0]

    if row[4] == "FILE":
        search_desc = re.compile(
            r'(.{1,100}) (File reference): (.{1,100}) (Attribute name): (\$STANDARD_INFORMATION|\$FILE_NAME)( |)((Name): (.{1,200}) (Parent file reference): (.+)|(\((unallocated)|))'
        )
        search_extra = re.compile(
            r'(attribute_type): (.{1,20}) (file_attribute_flags): (.{1,20}) (file_system_type): (.{1,20}) (is_allocated): (True|False)  (md5_hash): (.+) '
        )
    else:
        search_desc = re.compile(
            r'((.{1,100}) (MAC address): (.{1,20}) (Origin): (.+))')
        search_extra = re.compile(r'(md5_hash): (.+) ')

    search_results_desc = re.search(search_desc, row[header_desc_rows])

    if row[4] == "FILE":
        if search_results_desc:
            if search_results_desc.group(5) == "$FILE_NAME":
                row[header_desc_rows] = search_results_desc.group(
                    3) + "," + search_results_desc.group(
                        5) + "," + search_results_desc.group(
                            9) + "," + search_results_desc.group(11).rstrip(
                                r" (unallocated)") + ","
            else:
                row[header_desc_rows] = search_results_desc.group(
                    3) + "," + search_results_desc.group(5) + ",,,"

        search_results_extra = re.search(search_extra, row[header_extra_rows])
        if search_results_extra:
            row[header_extra_rows] = search_results_extra.group(
                2) + "," + search_results_extra.group(
                    4) + "," + search_results_extra.group(
                        6) + "," + search_results_extra.group(
                            8) + "," + search_results_extra.group(10)
    else:
        if search_results_desc:
            row[header_desc_rows] = ",,,," + search_results_desc.group(1)

        search_results_extra = re.search(search_extra, row[header_extra_rows])
        if search_results_extra:
            row[header_extra_rows] = search_results_extra.group(2)

    row[12] = row[12].replace('OS:', '')

    return row


def fix_line(row, report_name):
    # Drop the raw columns that the report fixers merged into desc/extra.
    # Each del re-indexes the list, so the deletion order below matters.
    if report_name == 'File System Report.csv':
        del row[9]
        del row[10]
        del row[11]
        del row[11]
        del row[10]
    elif report_name == 'Scheduled Tasks Report.csv':
        del row[9]
        del row[12]
        del row[12]
        del row[11]
    elif report_name == 'Event Log Report.csv':
        del row[9]
        del row[12]
        del row[12]
        del row[10]
    elif report_name == 'Appcompat Report.csv':
        del row[3]
        del row[3]
        del row[3]
        del row[4]
        del row[4]
        del row[4]
        del row[5]
        del row[5]
        del row[5]
        del row[5]
    elif report_name == 'MFT Report.csv':
        del row[9]
        del row[12]
        del row[12]
        del row[10]
    elif report_name == 'Prefetch Report.csv':
        del row[9]
        del row[12]
        del row[12]
        del row[10]
    return row
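A small sketch of why the deletion order in fix_line() matters: every `del` re-indexes the list, so repeated deletions at the same index remove originally adjacent columns. This mirrors the 'MFT Report.csv' branch; the letter list is a stand-in for a 17-column l2tcsv row.

```python
# 17 stand-in columns, like the l2tcsv header row
row = list("abcdefghijklmnopq")
del row[9]    # removes 'j'; everything after shifts left by one
del row[12]   # removes what was originally index 13 ('n')
del row[12]   # removes what was originally index 14 ('o')
del row[10]   # removes what was originally index 11 ('l')
print(''.join(row))  # abcdefghikmpq
```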


# Report Dictionary (by OS)

report_header_dict = {
    'Appcompat Report.csv':
    [[10, ['source', 'cached_entry_order', 'full_path', 'filename']],
     [16, ['md5_hash']], appcompat_report_fix],
    'Event Log Report.csv': [[
        10, [
            'event_id', 'EID_desc', 'record_number', 'event_level',
            'source_name', 'computer_name', 'message'
        ]
    ], [
        16, [
            'md5_hash', 'message_id', 'recovered', 'strings_parsed',
            'user_sid', 'xml_string'
        ]
    ], event_log_report_fix],
    'File System Report.csv': [[10, ['filename', 'Type']], [
        16, ['file_size', 'file_system_type', 'is_allocated', 'md5_hash']
    ], file_system_report_fix],
    'MFT Report.csv': [[
        10, [
            'File_reference', 'Attribute_name', 'Name',
            'Parent_file_reference', 'Log_info'
        ]
    ], [
        16, [
            'attribute_type', 'file_attribute_flags', 'file_system_type',
            'is_allocated', 'md5_hash'
        ]
    ], mft_report_fix],
    #    'UsnJrnl Report.csv':[],
    #    'Internet History Report.csv':[],
    'Prefetch Report.csv': [[
        10, [
            'File_name', 'Run_count', 'path', 'hash', 'volume',
            'Serial number', 'Device_path', 'Origin'
        ]
    ], [
        16, [
            'md5_hash', 'number_of_volumes', 'version', 'volume_device_paths',
            'volume_serial_numbers'
        ]
    ], prefetch_report_fix],
    #    'Registry Report.csv':[],
    'Scheduled Tasks Report.csv': [[10, ['key', 'task', 'identification']],
                                   [16,
                                    ['md5_hash']], scheduled_tasks_report_fix],
    #    'Persistence Report.csv':[],
    #    'System Information Report.csv':[],
    #    'AntiVirus Report.csv':[],
    #    'Firewall Report.csv':[],
    #    'Login Report.csv':[]
}
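A sketch of how report_header_dict entries are consumed by the improvement code above: element [0] gives the desc column index and its replacement header names, element [1] the extra column, and element [2] the per-row fixer function. The dictionary below is an abbreviated stand-in, not the full mapping.

```python
# Abbreviated stand-in with the same shape as report_header_dict
report_header_dict_demo = {
    'Appcompat Report.csv':
    [[10, ['source', 'cached_entry_order', 'full_path', 'filename']],
     [16, ['md5_hash']], None],  # None stands in for appcompat_report_fix
}

entry = report_header_dict_demo['Appcompat Report.csv']
desc_index, desc_columns = entry[0]    # where desc lives, and its new headers
extra_index, extra_columns = entry[1]  # where extra lives, and its new headers
print(desc_index, extra_index)  # 10 16
```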


# Report Improvement Multi-threading
def multi_thread_report_improve(mqueue, mylogfile, report, report_name,
                                tmp_report_name):
    output_list = []
    with io.open(report, 'r', encoding='utf-8') as csvfile:
        for trow in csvfile:
            row = trow.split(',')
            output_list.append(report_header_dict[report_name][2](row))
        # Print the improved report to a temporary file
        with open(tmp_report_name, 'w', encoding='utf-8') as newreport:
            for line in output_list:
                if line[10] == 'desc':
                    # Replace the raw header fields with the parsed column names
                    for thing in report_header_dict[report_name]:
                        if isinstance(thing, list):
                            line[thing[0]] = ','.join(thing[1])
                newreport.writelines(
                    ','.join(fix_line(line, report_name)).replace(
                        "\n", " ").replace("\r", " ") + "\n")

        if os.stat(tmp_report_name).st_size != 0:
            shutil.copyfile(tmp_report_name, report)
            os.remove(tmp_report_name)
        print(str(report_name) + ":    Complete")
        mylogfile.writelines(str(report_name) + ":    Complete" + "\n")
    return


# Report Improvements Function
def report_improvements(lor, mylogfile):
    mqueue = queue.Queue()
    threads = []
    for report in lor:
        report_name = report.split('/')[-1]
        tmp_report_name = os.path.dirname(
            report) + "/tmp_" + report_name + ".csv"
        # If the report path has no directory component, os.path.dirname()
        # returns "", which leaves a spurious leading slash to drop
        if tmp_report_name[0] == '/':
            tmp_report_name = tmp_report_name[1:]
        if report_name in report_header_dict and os.path.exists(report):
            threads.append(
                threading.Thread(
                    target=multi_thread_report_improve,
                    args=(mqueue, mylogfile, report, report_name,
                          tmp_report_name)))

    [t.start() for t in threads]
    [t.join() for t in threads]
    return
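The thread-per-report fan-out used by report_improvements() can be reduced to the pattern below. The worker and report names are illustrative; a lock stands in for the per-file isolation the real code gets from each thread writing its own report file.

```python
import threading

results = []
lock = threading.Lock()

def improve_report(name):
    # Stand-in worker; the real code calls multi_thread_report_improve()
    with lock:
        results.append(name + ":    Complete")

# One thread per report, then start and join them all, as cdqr does
threads = [threading.Thread(target=improve_report, args=(n,))
           for n in ["MFT Report.csv", "Prefetch Report.csv"]]
[t.start() for t in threads]
[t.join() for t in threads]
print(sorted(results))
```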


# This processes the image using the selected parser option and creates the .plaso file
def parse_the_things(args, mylogfile, command1, db_file, unzipped_file,
                     unzipped_file_loc, csv_file):
    # Check if the database and SuperTimeline files already exist and ask whether to keep or delete them
    if os.path.isfile(db_file):
        if query_yes_no(
                args, "\n" + db_file +
                " already exists.  Would you like to delete this file?", "no"):
            print("Removing the existing file: " + db_file)
            mylogfile.writelines("Removing the existing file: " + db_file +
                                 "\n")
            os.remove(db_file)
            if os.path.isfile(csv_file):
                print("Removing the existing file: " + csv_file)
                mylogfile.writelines("Removing the existing file: " + csv_file
                                     + "\n")
                os.remove(csv_file)
                rpt_dir_name = dst_loc + "/Reports"
                if os.path.isdir(rpt_dir_name):
                    print("Removing the existing report directory: " +
                          rpt_dir_name)
                    mylogfile.writelines(
                        "Removing the existing report directory: " +
                        rpt_dir_name + "\n")
                    if sys.platform[0:3] == "win":
                        shutil.rmtree(u'\\\\?\\' +
                                      os.path.abspath(rpt_dir_name))
                    else:
                        shutil.rmtree(rpt_dir_name)
        else:
            print("Keeping the existing file: " + db_file)
            mylogfile.writelines("Keeping the existing file: " + db_file +
                                 "\n")
#            return

    # Process image with log2timeline
    start_dt = datetime.datetime.now()
    print("Processing started at: " + str(start_dt))
    mylogfile.writelines("Processing started at: " + str(start_dt) + "\n")
    print("Parsing image")
    mylogfile.writelines("Parsing image" + "\n")
    print("\"" + "\" \"".join(command1) + "\"")
    mylogfile.writelines("\"" + "\" \"".join(command1) + "\"" + "\n")
    ######################  Log2timeline Command Execute  ##########################
    status_marker(mylogfile,
                  subprocess.Popen(
                      command1, stdout=mylogfile, stderr=mylogfile))

    end_dt = datetime.datetime.now()
    duration01 = end_dt - start_dt
    print("Parsing ended at: " + str(end_dt))
    mylogfile.writelines("Parsing ended at: " + str(end_dt) + "\n")
    print("Parsing duration was: " + str(duration01))
    mylogfile.writelines("Parsing duration was: " + str(duration01) + "\n")
    # Removing uncompressed file(s)
    if unzipped_file:
        print("\nRemoving uncompressed files in directory: " +
              unzipped_file_loc)
        mylogfile.writelines("\nRemoving uncompressed files in directory: " +
                             unzipped_file_loc + "\n")
        if sys.platform[0:3] == "win":
            shutil.rmtree(u'\\\\?\\' + os.path.abspath(unzipped_file_loc))
        else:
            shutil.rmtree(unzipped_file_loc)

    return


def create_supertimeline(args, mylogfile, csv_file, psort_location, db_file, logname):
    # This processes the .plaso file and creates the SuperTimeline
    if os.path.isfile(csv_file):
        if query_yes_no(
                args, "\n" + csv_file +
                " already exists.  Would you like to delete this file?", "no"):
            print("Removing the existing file: " + csv_file)
            mylogfile.writelines("Removing the existing file: " + csv_file +
                                 "\n")
            os.remove(csv_file)
            rpt_dir_name = dst_loc + "/Reports"
            if os.path.isdir(rpt_dir_name):
                print("Removing the existing report directory: " +
                      rpt_dir_name)
                mylogfile.writelines(
                    "Removing the existing report directory: " +
                    rpt_dir_name + "\n")
                if sys.platform[0:3] == "win":
                    shutil.rmtree(u'\\\\?\\' + os.path.abspath(rpt_dir_name))
                else:
                    shutil.rmtree(rpt_dir_name)
        else:
            print("Keeping the existing file: " + csv_file)
            mylogfile.writelines("Keeping the existing file: " + csv_file +
                                 "\n")
            return
    command2 = [
        psort_location, "-o", "l2tcsv", "--status_view", "none", db_file,
        "--logfile", logname, "-w", csv_file
    ]
    # Create SuperTimeline
    print("\nCreating the SuperTimeline CSV file")
    mylogfile.writelines("\nCreating the SuperTimeline CSV file" + "\n")
    print("\"" + "\" \"".join(command2) + "\"")
    mylogfile.writelines("\"" + "\" \"".join(command2) + "\"" + "\n")
    ######################  Psort Command Execute  ##########################
    status_marker(mylogfile,
                  subprocess.Popen(
                      command2, stdout=mylogfile, stderr=mylogfile))
    print("SuperTimeline CSV file is created")
    mylogfile.writelines("SuperTimeline CSV file is created\n")
    return


def get_es_info(args):
    casename = "default"
    user = ""
    server = "127.0.0.1"
    port = "9200"

    if args.es_kb:
        casename = args.es_kb[0]
    if args.es_kb_user:
        user = args.es_kb_user[0]
    if args.es_kb_server:
        server = args.es_kb_server[0]
    if args.es_kb_port:
        port = args.es_kb_port[0]

    return casename, server, port, user


def get_ts_es_info(args):
    casename = "default"

    if args.es_ts:
        casename = args.es_ts[0]
    return casename


def export_to_elasticsearch(mylogfile, args, db_file, psort_location, logname):
    start_dt = datetime.datetime.now()
    print("\nProcess to export to ElasticSearch started")
    mylogfile.writelines("\nProcess to export to ElasticSearch started" + "\n")
    if args.es_kb:
        casename, server, port, user = get_es_info(args)
        output_elasticsearch(mylogfile, db_file, casename, psort_location,
                             server, port, user, logname)
    else:
        casename = get_ts_es_info(args)
        output_elasticsearch_ts(mylogfile, db_file, casename, psort_location, logname)
    end_dt = datetime.datetime.now()
    duration03 = end_dt - start_dt
    print("\nProcess to export to ElasticSearch completed")
    mylogfile.writelines("\nProcess to export to ElasticSearch completed" +
                         "\n")
    print("ElasticSearch export process duration was: " + str(duration03))
    mylogfile.writelines("ElasticSearch export process duration was: " +
                         str(duration03) + "\n")
    return


def export_to_json(dst_loc, srcfilename, mylogfile, db_file, psort_location, logname):
    # Export Data (if selected)
    start_dt = datetime.datetime.now()
    print("\nProcess to create export document started")
    mylogfile.writelines("\nProcess to create export document started" + "\n")
    # Create the file for export
    exportfname = create_export(dst_loc, srcfilename, mylogfile, db_file,
                                psort_location, logname)
    print("Process to create export document complete")
    mylogfile.writelines("Process to create export document complete" + "\n")

    end_dt = datetime.datetime.now()
    duration03 = end_dt - start_dt
    print("Creating export document process duration was: " + str(duration03))
    mylogfile.writelines("Creating export document process duration was: " +
                         str(duration03) + "\n")
    return


def unzip_files(dst_loc, src_loc):
    unzipped_file_loc = dst_loc + "/artifacts/" + src_loc.split("/")[-1][:-4]
    print("Attempting to extract source file: " + src_loc)
    src_loc = unzip_source(src_loc, unzipped_file_loc)
    print("All files extracted to folder: " + src_loc)
    return src_loc
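A minimal illustration (not part of CDQR; the paths are hypothetical) of how unzip_files derives its extraction folder: the last path component of the source has its 4-character ".zip" suffix sliced off and is placed under "<dst>/artifacts/":

```python
# Hypothetical paths, for illustration only: mirrors the slicing in
# unzip_files() above, which strips a 4-character ".zip" suffix from
# the final path component of the source file.
src_loc_example = "Y:/Case/Tag009/evidence.zip"
dst_loc_example = "Results"
unzipped_example = (dst_loc_example + "/artifacts/" +
                    src_loc_example.split("/")[-1][:-4])
# unzipped_example is "Results/artifacts/evidence"
```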


####################### END FUNCTIONS ############################


##################  EXECUTION SECTION ############################
def main():
    # Default Parser option
    default_parser = "win"
    unzipped_file = False
    unzipped_file_loc = ""

    # Plaso Program Locations (default)
    if sys.platform[0:3] == "win":
        log2timeline_location = r"plaso\log2timeline.exe"
        psort_location = r"plaso\psort.exe"
    else:
        log2timeline_location = r"log2timeline.py"
        psort_location = r"psort.py"

    # Parsing begins
    parser_list = ["win", "mft_usnjrnl", "lin", "mac", "android", "datt"]

    parser = argparse.ArgumentParser(
        description='Cold Disk Quick Response Tool (CDQR)')
    parser.add_argument(
        'src_location',
        nargs=1,
        help='Source File location: Y:/Case/Tag009/sample.E01')
    parser.add_argument(
        'dst_location',
        nargs='?',
        default='Results',
        help=
        'Destination Folder location. If nothing is supplied then the default is \'Results\''
    )
    parser.add_argument(
        '-p',
        '--parser',
        nargs=1,
        help=
        'Choose parser to use.  If nothing chosen then \'win\' is used.  The parsing options are: '
        + ', '.join(parser_list))
    parser.add_argument(
        '--nohash',
        action='store_true',
        default=False,
        help=
        'Do not hash all the files as part of the processing of the image')
    parser.add_argument(
        '--mft',
        action='store_true',
        default=False,
        help=
        'Process the MFT file (disabled by default except for DATT)')
    parser.add_argument(
        '--usnjrnl',
        action='store_true',
        default=False,
        help=
        'Process the USNJRNL file (disabled by default except for DATT)')
    parser.add_argument(
        '--max_cpu',
        action='store_true',
        default=False,
        help='Use the maximum number of cpu cores to process the image')
    parser.add_argument(
        '--export',
        action='store_true',
        help='Creates line delimited json export file')
    parser.add_argument(
        '--artifact_filters',
        nargs=1,
        help='Plaso passthrough: Names of forensic artifact definitions, \
            provided on the command line (comma separated). Forensic \
            artifacts are stored in .yaml files that are directly \
            pulled from the artifact definitions project. You can \
            also specify a custom artifacts yaml file (see \
            --custom_artifact_definitions). Artifact definitions \
            can be used to describe and quickly collect data of \
            interest, such as specific files or Windows Registry \
            keys.')
    parser.add_argument(
        '--artifact_filters_file',
        nargs=1,
        help='Plaso passthrough: Names of forensic artifact definitions, \
            provided in a file with one artifact name per line. Forensic \
            artifacts are stored in .yaml files that are directly \
            pulled from the artifact definitions project. You can \
            also specify a custom artifacts yaml file (see \
            --custom_artifact_definitions). Artifact definitions \
            can be used to describe and quickly collect data of \
            interest, such as specific files or Windows Registry \
            keys.')
    parser.add_argument(
        '--artifact_definitions',
        nargs=1,
        help='Plaso passthrough: Path to a directory containing artifact \
            definitions, which are .yaml files. Artifact definitions can \
            be used to describe and quickly collect data of interest, \
            such as specific files or Windows Registry keys.')
    parser.add_argument(
        '--custom_artifact_definitions',
        nargs=1,
        help='Plaso passthrough: Path to a file containing custom artifact \
        definitions, which are .yaml files. Artifact definitions can be \
        used to describe and quickly collect data of interest, \
        such as specific files or Windows Registry keys.')
    parser.add_argument(
        '--file_filter',
        '-f',
        nargs=1,
        help='Plaso passthrough: List of files to include for targeted \
         collection of files to parse, one line per file path, setup is \
        /path|file - where each element can contain either a \
        variable set in the preprocessing stage or a regular \
        expression.')
    parser.add_argument(
        '--es_kb',
        nargs=1,
        help=
        'Outputs Kibana format to elasticsearch database. Requires index name. Example: \'--es_kb my_index\''
    )
    parser.add_argument(
        '--es_kb_server',
        nargs=1,
        help=
        'Kibana Format Only: Exports to remote (default is 127.0.0.1) elasticsearch database. Requires Server name or IP address Example: \'--es_kb_server myserver.elk.go\' or \'--es_kb_server 192.168.1.10\''
    )
    parser.add_argument(
        '--es_kb_port',
        nargs=1,
        help=
        'Kibana Format Only: Port (default is 9200) for remote elasticsearch database. Requires port number Example: \'--es_kb_port 9200 \''
    )
    parser.add_argument(
        '--es_kb_user',
        nargs=1,
        help=
        'Kibana Format Only: Username (default is none) for remote elasticsearch database. Requires username Example: \'--es_kb_user skadi\''
    )
    parser.add_argument(
        '--es_ts',
        nargs=1,
        help=
        'Outputs TimeSketch format to elasticsearch database. Requires index/timesketch name. Example: \'--es_ts my_name\''
    )
    parser.add_argument(
        '--plaso_db',
        action='store_true',
        default=False,
        help='Process an existing Plaso DB file. Example: artifacts.plaso')
    parser.add_argument(
        '-z',
        action='store_true',
        default=False,
        help=
        'Indicates the input file is a zip file and needs to be decompressed')
    parser.add_argument(
        '--no_dependencies_check',
        action='store_false',
        default=True,
        help=
        'Re-enables the log2timeline dependencies check. It is skipped by default'
    )
    parser.add_argument(
        '--process_archives',
        action='store_true',
        default=False,
        help=
        'Extract and inspect contents of archives found inside of artifacts or disk images'
    )
    parser.add_argument(
        '-v', '--version', action='version', version=cdqr_version)
    parser.add_argument(
        '-y',
        action="store_true",
        default=False,
        dest='confirmAll',
        help='Accepts all defaults on prompted questions in the program.')
    args = parser.parse_args()

    # List to help with logging
    log_list = [cdqr_version + "\n"]
    print(cdqr_version)

    # Parsing the input from the command line and building log2timeline command
    if args:
        # Validate log2timeline.exe and psort.exe locations
        if sys.platform[0:3] == "win":
            if not os.path.isfile(log2timeline_location):
                log2timeline_location, psort_location = query_plaso_location()
            # Default log2timeline command
        command1 = [
            log2timeline_location, "--partition", "all", "--vss_stores", "all",
            "--status_view", "linear"
        ]

        # Do not process archives unless enabled
        if args.process_archives:
            command1.append("--process_archives")

    # Set log2timeline parsing option(s)
        if args.parser:
            if args.parser[0] not in parser_list:
                print("ERROR: \"" + args.parser[0] +
                      "\" is not a valid parser selection.")
                print("ERROR: Valid parser options are: " +
                      ', '.join(parser_list))
                print("ERROR: Please verify your command and try again.")
                print("Exiting...")
                sys.exit(1)
            parser_opt = args.parser[0]
            if parser_opt == "lin" or parser_opt == "mac":
                command1 = [
                    log2timeline_location, "--partition", "all",
                    "--status_view", "none"
                ]
        else:
            # Set Default parser
            parser_opt = default_parser

    # Determine if Plaso version is compatible
        plaso_ver = plaso_version(log2timeline_location)
        print("Plaso Version: " + plaso_ver)
        log_list.append("Plaso Version: " + plaso_ver + "\n")

    # Determine if Export is being used and option is valid
        if args.export:
            print("Export data option selected")
            log_list.append("Export data option selected\n")
        # add parsing options to the command
        command1.append("--parsers")
        command1.append(get_parser_list(parser_opt, plaso_ver, args))
        print("Using parser: " + parser_opt)
        log_list.append("Using parser: " + parser_opt + "\n")

        # Set Hashing variable
        if args.nohash:
            command1.append("--hashers")
            command1.append("none")
        else:
            command1.append("--hashers")
            command1.append("md5")

    # Set Number of CPU cores to use
        if args.max_cpu:
            num_cpus = multiprocessing.cpu_count()
        else:
            num_cpus = multiprocessing.cpu_count() - 3
            if num_cpus <= 0:
                num_cpus = 1
        command1.append("--workers")
        command1.append(str(num_cpus))
        print("Number of cpu cores to use: " + str(num_cpus))
        log_list.append("Number of cpu cores to use: " + str(num_cpus) + "\n")

        # Set filter file location
        if args.file_filter:
            filter_file_loc = verify_file(args.file_filter[0])
            command1.append("--file_filter")
            command1.append(filter_file_loc)
            print("Filter file used: " + filter_file_loc)
            log_list.append("Filter file used: " + filter_file_loc + "\n")

        # Set custom artifact definitions file location
        if args.custom_artifact_definitions:
            custom_artifact_definitions_file = verify_file(args.custom_artifact_definitions[0])
            command1.append("--custom_artifact_definitions")
            command1.append(custom_artifact_definitions_file)
            print("Custom Artifact Definition file used: " + custom_artifact_definitions_file)
            log_list.append("Custom Artifact Definition file used: " + custom_artifact_definitions_file + "\n")

        # Set artifact definitions file location
        if args.artifact_definitions:
            artifact_definitions_file = verify_file(args.artifact_definitions[0])
            command1.append("--artifact_definitions")
            command1.append(artifact_definitions_file)
            print("Artifact Definition file used: " + artifact_definitions_file)
            log_list.append("Artifact Definition file used: " + artifact_definitions_file + "\n")

        # Set artifact filters file location
        if args.artifact_filters_file:
            artifact_filters_file = verify_file(args.artifact_filters_file[0])
            command1.append("--artifact_filters_file")
            command1.append(artifact_filters_file)
            print("Artifact Filters file used: " + artifact_filters_file)
            log_list.append("Artifact Filters file used: " + artifact_filters_file + "\n")

        # Set artifact filters
        if args.artifact_filters:
            artifact_filters = args.artifact_filters[0]
            command1.append("--artifact_filters")
            command1.append(artifact_filters)
            print("Artifact filters used: " + artifact_filters)
            log_list.append("Artifact filters used: " + artifact_filters + "\n")

    # Set source location/file
        src_loc = verify_file(args.src_location[0])

    # Set destination location/file
        dst_loc = args.dst_location.replace("\\\\",
                                            "/").replace("\\", "/").rstrip("/")
        if os.path.exists(dst_loc):
            if not query_yes_no(
                    args, "\n" + dst_loc +
                    " already exists.  Would you like to use that directory anyway?",
                    "yes"):
                dst_loc = dst_loc + "_" + datetime.datetime.now().strftime(
                    "%d-%b-%y_%H-%M-%S")
                os.makedirs(dst_loc)
        else:
            os.makedirs(dst_loc)

        print("Destination Folder: " + dst_loc)
        log_list.append("Destination Folder: " + dst_loc + "\n")

        if args.z:
            unzipped_file = True
            src_loc = unzip_files(dst_loc, src_loc)
            unzipped_file_loc = dst_loc + "/artifacts/"
        elif src_loc[-4:].lower() == ".zip":
            if query_yes_no(
                    args, "\n" + src_loc +
                    " appears to be a zip file.  Would you like CDQR to unzip it and process the contents?",
                    "yes"):
                unzipped_file = True
                src_loc = unzip_files(dst_loc, src_loc)
                unzipped_file_loc = dst_loc + "/artifacts/"

        print("Source data: " + src_loc)
        log_list.append("Source data: " + src_loc + "\n")

    if args.plaso_db:
        db_file = dst_loc + "/" + src_loc
    else:
        db_file = dst_loc + "/" + src_loc.split("/")[-1] + ".plaso"

    # Create DB, CSV and Log Filenames
    csv_file = dst_loc + "/" + src_loc.split("/")[-1] + ".SuperTimeline.csv"
    logname = dst_loc + "/" + src_loc.split("/")[-1] + "_log2timeline.gz"
    logfilename = dst_loc + "/" + src_loc.split("/")[-1] + ".log"

    # Check to see if it's a mounted drive and update filename if so
    if db_file == dst_loc + "/.plaso" or db_file[-7:] == ":.plaso":
        db_file = dst_loc + "/" + "mounted_image.plaso"
        csv_file = dst_loc + "/" + "mounted_image.SuperTimeline.csv"
        logname = dst_loc + "/" + "mounted_image.gz"
        logfilename = dst_loc + "/" + "mounted_image.log"

    command1.append("--logfile")
    command1.append(logname)

    print("Log File: " + logfilename)
    print("Database File: " + db_file)

    log_list.append("Log File: " + logfilename + "\n")
    log_list.append("Database File: " + db_file + "\n")

    # Todo only print this if not using ES or TS
    if args.es_kb is None and args.es_ts is None:
        print("SuperTimeline CSV File: " + csv_file)
        log_list.append("SuperTimeline CSV File: " + csv_file + "\n")

    command1.append(db_file)
    command1.append(src_loc)

    if args.no_dependencies_check:
        command1.append("--no_dependencies_check")

    if os.path.isfile(logfilename):
        os.remove(logfilename)

    if os.path.isfile(logname):
        os.remove(logname)

    mylogfile = open(logfilename, 'w')
    mylogfile.writelines("".join(log_list))

    start_dt = datetime.datetime.now()
    print("\nStart time was: " + str(start_dt))
    mylogfile.writelines("\nStart time was: " + str(start_dt) + "\n")

    # If this is plaso database file, skip parsing
    if args.plaso_db:
        print(
            "WARNING: File must be a Plaso database file or it will not work.  Example: artifact.plaso (from CDQR)"
        )
        mylogfile.writelines(
            "\nWARNING: File must be a Plaso database file or it will not work.  Example: artifact.plaso (from CDQR)"
            + "\n")
        db_file = src_loc
    else:
        parse_the_things(args, mylogfile, command1, db_file, unzipped_file,
                         unzipped_file_loc, csv_file)

    logname = dst_loc + "/" + src_loc.split("/")[-1] + "_psort.gz"
    if args.export:
        export_to_json(dst_loc, src_loc, mylogfile, db_file, psort_location, logname)
    elif args.es_kb or args.es_ts:
        export_to_elasticsearch(mylogfile, args, db_file, psort_location, logname)
    else:
        create_supertimeline(args, mylogfile, csv_file, psort_location,
                             db_file, logname)
        create_reports(args, mylogfile, dst_loc, csv_file, parser_opt)

    end_dt = datetime.datetime.now()
    duration_full = end_dt - start_dt
    print("\nTotal duration was: " + str(duration_full))
    mylogfile.writelines("\nTotal duration was: " + str(duration_full) + "\n")
    mylogfile.close()


if __name__ == "__main__":
    main()
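The worker-count logic in main() can be sketched as a standalone function (a sketch only; `pick_worker_count` is a hypothetical name, not part of CDQR): with --max_cpu every core is used, otherwise three cores are held back, clamped to a minimum of one worker.

```python
import multiprocessing

def pick_worker_count(max_cpu):
    # Mirrors the logic in main(): use every core with --max_cpu;
    # otherwise leave three cores free, but never drop below one worker.
    if max_cpu:
        return multiprocessing.cpu_count()
    return max(multiprocessing.cpu_count() - 3, 1)
```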