[
  {
    "path": ".gitignore",
    "content": "# Logs\nlogs\n*.log\nnpm-debug.log*\n\n# Runtime data\npids\n*.pid\n*.seed\n\n# Directory for instrumented libs generated by jscoverage/JSCover\nlib-cov\n\n# Coverage used by tools like istanbul\ncoverage\ncoverage.*\n\n# nyc test coverage\n.nyc_output\n\n# Grunt intermediate storage (http://gruntjs.com/creating-plugins#storing-task-files)\n.grunt\n\n# node-waf configuration\n.lock-wscript\n\n# Compiled binary addons (http://nodejs.org/api/addons.html)\nbuild/Release\n\n# Dependency directories\nnode_modules\njspm_packages\n\n# Optional npm cache directory\n.npm\n\n# Optional REPL history\n.node_repl_history\n\n# OS-specific files\n.DS_Store\n.Trash*\n.fseventsd\n.Spotlight*\n"
  },
  {
    "path": ".travis.yml",
    "content": "language: node_js\nmatrix:\n  include:\n#    - node_js: '0.10'\n#      env: _CXXAUTO=1\n#    - node_js: '0.12'\n#      env: _CXXAUTO=1\n#    - node_js: '4'\n#      env: CXX=g++-4.8\n    - node_js: '6'\n#      env: CXX=g++-4.8\n    - node_js: '8'\n#      env: CXX=g++-4.8\n    - node_js: '10'\ndist: trusty\nsudo: required\naddons:\n  apt:\n    sources:\n      - ubuntu-toolchain-r-test\n    packages:\n#      - gcc-4.8\n#      - g++-4.8\nbefore_script:\n  - sudo apt-key adv --keyserver keyserver.ubuntu.com --recv E0C56BD4\n  - echo \"deb http://repo.yandex.ru/clickhouse/deb/stable/ main/\" | sudo tee -a /etc/apt/sources.list\n  - sudo apt-get update -qq\n  - sudo apt-get install clickhouse-server-common clickhouse-client -y\n  - sudo sed -i -- 's/listen_host>::/listen_host>0.0.0.0/g' /etc/clickhouse-server/config.xml\n  - sudo service clickhouse-server start\n  - sudo netstat -ltn\n  - (tail -n 100 /var/log/clickhouse-server/clickhouse-server.err.log || exit 0)\n  - (tail -n 100 /var/log/clickhouse-server/clickhouse-server.log || exit 0)\n  - (id metrika || exit 0)\n  - (ls -la /var/lib/clickhouse/data/default/ || exit 0)\n  - (ls -la /var/lib/clickhouse/metadata/default/ || exit 0)\n  - curl -v \"http://127.0.0.1:8123/\"\n  - npm run legacy-install\nafter_script:\n  - npm run report\n"
  },
  {
    "path": "LICENSE",
    "content": "The MIT License (MIT)\n\nCopyright (c) 2016 ivan baktsheev\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
  },
  {
    "path": "NOTES.md",
    "content": "# Notes\n\n## Testing on node 10.x\n\nSeems like memwatch-next is not supported anymore,\nso replace it with @airbnb/node-memwatch. Airbnb's version\nnot supports node < 8 (https://github.com/andywer/leakage/pull/27)\n"
  },
  {
    "path": "README.md",
    "content": "Simple and powerful interface for [ClickHouse](https://clickhouse.yandex/) [![travis](https://travis-ci.org/apla/node-clickhouse.svg)](https://travis-ci.org/apla/node-clickhouse) [![codecov](https://codecov.io/gh/apla/node-clickhouse/branch/master/graph/badge.svg)](https://codecov.io/gh/apla/node-clickhouse)\n===\n```sh\nnpm install @apla/clickhouse\n```\n\nSynopsis\n---\n```javascript\nconst ClickHouse = require('@apla/clickhouse')\nconst ch = new ClickHouse({ host, port, user, password })\n\nconst stream = ch.query(\"SELECT 1\", (err, data) => {})\nstream.pipe(process.stdout)\n\n// promise interface, not recommended for selects\n// (requires 'util.promisify' for node < 8, Promise shim for node < 4)\nawait ch.querying(\"CREATE DATABASE test\")\n```\nExamples:\n- [Selecting large dataset](README.md#selecting-large-dataset)\n- [Inserting large dataset](README.md#inserting-large-dataset)\n- [Inserting single row](README.md#insert-single-row-of-data)\n\n\nAPI\n---\n\n### `new ClickHouse(options: Options)`\n\n#### `Options`\n\n|                  | required | default       | description\n| :--------------- | :------: | :------------ | :----------\n| `host`           | ✓        |               | Host to connect.\n| `user`           |          |               | Authentication user.\n| `password`       |          |               | Authentication password.\n| `path` (`pathname`) |       | `/`           | Pathname of ClickHouse server.\n| `port`           |          | `8123`        | Server port number.\n| `protocol`       |          | `'http:'`     | `'https:'` or `'http:'`.\n| `dataObjects`    |          | `false`       | By default (`false`), you'll receive array of values for each row. <br /> If you set `dataObjects: true`, every row will become an object with format: `{ fieldName: fieldValue, … }`. <br /> Alias to `format: 'JSON'`.\n| `format`         |          | `JSONCompact` | Adds the `FORMAT` statement for query if it did not have one. 
<br /> Specifies the format of [selected](https://clickhouse.yandex/docs/en/query_language/select/#format-clause) or [inserted](https://clickhouse.yandex/docs/en/query_language/insert_into/#insert) data. <br /> See [\"Formats for input and output data\"](https://clickhouse.yandex/docs/en/interfaces/formats/#formats) to find out possible values.\n| `queryOptions`   |          |               | Object, can contain any ClickHouse option from [Settings](https://clickhouse.yandex/docs/en/operations/settings/index.html), [Restrictions](https://clickhouse.yandex/docs/en/operations/settings/query_complexity/) and [Permissions](https://clickhouse.yandex/docs/en/operations/settings/permissions_for_queries/). <br /> See [example](README.md#settings-for-connection).\n| `readonly`       |          | `false`       | Tells the driver to send the query with the HTTP GET method. Same as [`readonly=1` setting](https://clickhouse.yandex/docs/en/operations/settings/permissions_for_queries/#settings_readonly). [More details](https://clickhouse.yandex/docs/en/interfaces/http/).\n| `timeout`, <br /> `headers`, <br /> `agent`, <br /> `localAddress`, <br /> `servername`, <br /> etc… |   |   |  Any [http.request](https://nodejs.org/api/http.html#http_http_request_options_callback) or [https.request](https://nodejs.org/api/https.html#https_https_request_options_callback) options are also available.\n\n<!--\nThese are dangerous for end users\n\n| `syncParser`     |          | `false`       | **Not recommended for large amounts of data!** <br /> Collects all data, then parses the entire response. <br /> May be faster, but for large datasets all your dataset goes into memory (actually, entire response + entire dataset).\n# Might be completely replaced with promise interface.\n\n| `omitFormat`     |          | `false`       | By default, a `FORMAT JSONCompact` statement is added to the query if it does not have one. 
<br /> You can disable this behaviour by providing this option.\n# Looks like internal option\n-->\n\n##### Options example:\n```javascript\nconst ch = new ClickHouse({\n  host: \"clickhouse.msk\",\n  dataObjects: true,\n  readonly: true,\n  queryOptions: {\n    profile: \"web\",\n    database: \"test\",\n  },\n})\n```\n\n\n### `clickHouse.query(query, [options], [callback])`\nSends a query statement to a server.\n\n##### `query: string`\nSQL query statement.\n\n##### `options: Options`\nThe same [`Options`](README.md#options), excluding connection options.\n\n##### `callback: (error, result) => void`\nWill always be called upon completion.\n\n##### Returns: [`DuplexStream`](https://nodejs.org/api/stream.html#stream_duplex_and_transform_streams)\nIt supports [`.pipe`](https://nodejs.org/api/stream.html#stream_readable_pipe_destination_options) to process records. <br/>\nYou should have at least one error handler listening, either via the query callback or via the stream `error` event.\n\n| Stream event | Description\n| ------------ | -----------\n| `'error'`      | Query execution finished with an error. <br /> If you have both a query `callback` and a stream `error` listener, you'll get the error notification in both listeners.\n| `'metadata'`   | When column information is parsed.\n| `'data'`       | When a row is available.\n| `'end'`        | When entire response is processed. <blockquote>Regardless of whether there is an `'end'` listener, the query `callback` is always called.</blockquote> <blockquote>You should always listen to the `'data'` event together with the `'end'` event. 
<br/>[\"The 'end' event will not be emitted unless the data is completely consumed.\"](https://nodejs.org/api/stream.html#stream_event_end) <br/> If you don't need to handle the `'data'` event, prefer using only the `callback` or the [Promise interface](#promise-interface).</blockquote>\n\n##### `stream.supplemental`\nAfter the response is processed, you can read supplemental response data from it, such as the row count.\n\n\nExamples:\n- [Selecting with stream](README.md#selecting-with-stream)\n- [Inserting with stream](README.md#inserting-with-stream)\n\n### `clickHouse.ping(callback)`\nSends an empty query.\nDoesn't require authorization.\n\n##### `callback: (error, result) => void`\nWill be called upon completion.\n\n<br />\n\n## Promise interface\n\nPromise interface **is not recommended** for `INSERT` and `SELECT` queries.\n* `INSERT` can't bulk-load data with the promise interface.\n* `SELECT` will collect the entire query result in memory. See the [Memory size](README.md#memory-size) section.\n\nWith the promise interface, query results are parsed synchronously.\nThis means that a large query result with the promise interface:\n* Will synchronously block the JS thread/event loop.\n* May lead to memory leaks in your app due to peak GC loads.\n\nUse it only for queries where the resulting data size is known and extremely small.<br/>\nGood cases to use it are `DESCRIBE TABLE` or `EXISTS TABLE`.\n\n### `clickHouse.querying(query, [options])`\nSimilar to `ch.query(query)` but collects the entire response in memory and resolves with the complete query result. 
<br />\nSee the [Memory size](README.md#memory-size) section.\n##### `options: Options`\nThe same [`Options`](README.md#options), excluding connection options.\n\n##### Returns: `Promise`\nWill be resolved with the entire query result.\n\nExample of [promise interface](README.md#promise-interface).\n\n### `clickHouse.pinging()`\nPromise interface for [`.ping`](README.md#clickhousepingcallback).\n\n##### Returns: `Promise`\n\n<br />\n\nHow it works\n-----\n\n### Bulk data loading with `INSERT` statements\n\n`INSERT` can be used for bulk data loading. There are two formats easily implementable\nwith JavaScript: CSV and TabSeparated/TSV.\n\nCSV is useful for loading from a file: you can read the file contents and `.pipe` them\ninto ClickHouse. <br />\nTo activate CSV parsing you should set the `format` driver option or the query `FORMAT` statement to `CSV`:\n\n```javascript\n\nvar csvStream = fs.createReadStream('data.csv')\nvar clickhouseStream = ch.query(statement, { format: 'CSV' })\n\ncsvStream.pipe(clickhouseStream)\n\n```\n\nTSV is useful for loading from a file and bulk loading from external sources, such as other databases.\nOnly `\\\\`, `\\t` and `\\n` need to be escaped in strings; numbers, nulls,\nbools and date objects need some minor processing. 
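For illustration, here is a minimal sketch of those escaping rules as a hypothetical `escapeTsvValue` helper (not part of the driver's API; the driver performs equivalent processing for you when you write arrays):

```javascript
// Hypothetical helper sketching the TSV escaping rules described above.
// The driver does equivalent processing internally when you write arrays.
function escapeTsvValue (value) {
  if (value === null) return '\\N'                 // TSV null literal
  if (value instanceof Date)                       // dates as unix timestamp in seconds
    return String(Math.floor(value.valueOf() / 1000))
  if (typeof value === 'boolean') return value ? '1' : '0'
  if (typeof value === 'number') return String(value)
  return String(value)
    .replace(/\\/g, '\\\\')                        // escape backslash first
    .replace(/\t/g, '\\t')
    .replace(/\n/g, '\\n')
}

// a row is the tab-joined escaped values plus a newline
var row = [1, 2.22, 'tab\there', null].map(escapeTsvValue).join('\t') + '\n'
```
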
You can send prepared TSV data strings\n(line ending will be appended automatically), buffers (always passed as is) or Arrays with fields.\n\nInternally, every field will be converted to a format which ClickHouse can accept,\nthen escaped and joined with the delimiter for the particular format.\nIf you ever need to store rows (in arrays) and send preformatted data, you can do it.\n\nClickHouse also supports [JSONEachRow](https://clickhouse.yandex/docs/en/formats/jsoneachrow.html) format\nwhich can be useful for inserting JavaScript objects if you have such a recordset.\n\n```js\nconst stream = ch.query(statement, { format: 'JSONEachRow' })\n\nstream.write(object) // Write as many times as you need\nstream.end() // And don't forget to finish the insert query\n```\n\n### Memory size\n\nYou can read all the records into memory in a single call like this:\n\n```javascript\n\nvar ch = new ClickHouse({ host: host, port: port })\nch.querying(\"SELECT number FROM system.numbers LIMIT 10\").then((result) => {\n  // result will contain all the data you need\n})\n\n```\n\nIn this case the whole JSON response from the server will be read into memory,\nthen parsed at once, hogging your CPU. The default parser will parse the server response\nline by line and emit events. 
This is slower, but much more memory and CPU efficient\nfor larger datasets.\n\n<br />\n\n## Examples\n#### Selecting with stream:\n```javascript\nconst readableStream = ch.query(\n  'SELECT * FROM system.contributors FORMAT JSONEachRow',\n  (err, result) => {},\n)\nconst writableStream = fs.createWriteStream('./contributors.json')\nreadableStream.pipe(writableStream)\n```\n\n#### Inserting with stream:\n```javascript\nconst readableStream = fs.createReadStream('./x.csv')\nconst writableStream = ch.query('INSERT INTO table FORMAT CSV', (err, result) => {})\nreadableStream.pipe(writableStream)\n```\n\n#### Insert single row of data:\n```javascript\nconst ch = new ClickHouse(options)\nconst writableStream = ch.query(`INSERT INTO table FORMAT TSV`, (err) => {\n  if (err) {\n    console.error(err)\n  }\n  console.log('Insert complete!')\n})\n\n// data will be formatted for you\nwritableStream.write([1, 2.22, \"erbgwerg\", new Date()])\n\n// prepare data yourself\nwritableStream.write(\"1\\t2.22\\terbgwerg\\t2017-07-17 17:17:17\")\n\nwritableStream.end()\n\n```\n\n#### Selecting large dataset:\n\n```javascript\nconst ch = new ClickHouse(options)\n// it is better to use the stream interface to fetch select results\nconst stream = ch.query(\"SELECT * FROM system.numbers LIMIT 10000000\")\n\nstream.on('metadata', (columns) => { /* do something with column list */ })\n\nlet rows = []\nstream.on('data', (row) => rows.push(row))\n\nstream.on('error', (err) => { /* handle the error */ })\n\nstream.on('end', () => {\n  console.log(\n    rows.length,\n    stream.supplemental.rows,\n    stream.supplemental.rows_before_limit_at_least, // at least this many rows would be in the result without LIMIT\n  )\n})\n```\n\n#### Inserting large dataset:\n```javascript\nconst ch = new ClickHouse(options)\n// insert from file\nconst tsvStream = fs.createReadStream('data.tsv')\nconst clickhouseStream = ch.query('INSERT INTO table FORMAT TSV')\n\ntsvStream.pipe(clickhouseStream)\n```\n\n#### Settings for 
connection:\n```javascript\nconst ch = new ClickHouse({\n  host: 'clickhouse.msk',\n  queryOptions: {\n    database: \"test\",\n    profile: \"web\",\n    readonly: 2,\n    force_index_by_date: 1,\n    max_rows_to_read: 10 * 1e6,\n  },\n})\n```\n\n#### Settings for query:\n```javascript\nconst ch = new ClickHouse({ host: 'clickhouse.msk' })\nconst stream = ch.query('INSERT INTO table FORMAT TSV', {\n  queryOptions: {\n    database: \"test\",\n    insert_quorum: 2,\n  },\n})\n```\n\n#### Promise interface:\n```js\nconst ch = new ClickHouse(options)\n// Check connection to the server. Doesn't require authorization.\nawait ch.pinging()\n```\n```js\nconst { data } = await ch.querying(\"SELECT 1\")\n// [ [ 1 ] ]\nconst { data: columns } = await ch.querying(\"DESCRIBE TABLE system.numbers\", { dataObjects: true })\n// [ { name: 'number', type: 'UInt64', default_type: '', default_expression: '' } ]\n```\n"
  },
  {
    "path": "package.json",
    "content": "{\n  \"name\": \"@apla/clickhouse\",\n  \"version\": \"1.6.4\",\n  \"description\": \"ClickHouse database interface\",\n  \"main\": \"src/clickhouse.js\",\n  \"scripts\": {\n    \"legacy-install\": \"node ./src/legacy-support.js\",\n    \"launch-docker-image\": \"docker run --rm -d -p 8123:8123 --name clickhouse-server clickhouse/clickhouse-server\",\n    \"stop-docker-image\": \"docker stop clickhouse-server\",\n    \"test\": \"nyc mocha --recursive ./test -R spec\",\n    \"report\": \"nyc report --reporter=lcov > coverage.lcov && codecov\",\n    \"simpletest\": \"mocha --recursive ./test -R spec\",\n    \"torturetest\": \"TORTURE=1 mocha -gc --recursive ./test -R spec\"\n  },\n  \"repository\": {\n    \"type\": \"git\",\n    \"url\": \"git+https://github.com/apla/node-clickhouse.git\"\n  },\n  \"keywords\": [\n    \"clickhouse\",\n    \"database\",\n    \"db\"\n  ],\n  \"author\": \"Ivan Baktsheev <dot.and.thing@gmail.com>\",\n  \"license\": \"MIT\",\n  \"bugs\": {\n    \"url\": \"https://github.com/apla/node-clickhouse/issues\"\n  },\n  \"homepage\": \"https://github.com/apla/node-clickhouse#readme\",\n  \"dependencies\": {},\n  \"devDependencies\": {\n    \"bluebird\": \"^3.5.0\",\n    \"codecov\": \"^2.2.0\",\n    \"mocha\": \"^2.5.3\",\n    \"nyc\": \"^10.2.0\"\n  },\n  \"engines\": {\n    \"node\": \">=0.10\"\n  }\n}\n"
  },
  {
    "path": "src/clickhouse.js",
    "content": "var http = require ('http');\nvar https = require('https');\nvar url  = require ('url');\nvar qs   = require ('querystring');\nvar util = require ('util');\n\n// var debug = require ('debug')('clickhouse');\n\nrequire ('./legacy-support');\n\nvar RecordStream = require ('./streams').RecordStream;\nvar JSONStream   = require ('./streams').JSONStream;\n\nvar parseError = require ('./parse-error');\n\nvar LIBRARY_SPECIFIC_OPTIONS = require ('./consts').LIBRARY_SPECIFIC_OPTIONS;\n\nfunction httpResponseHandler (stream, reqParams, reqData, cb, response) {\n\tvar str;\n\tvar error;\n\n\tif (response.statusCode === 200) {\n\t\tstr = Buffer.alloc ? Buffer.alloc (0) : new Buffer (0);\n\t} else {\n\t\terror = Buffer.alloc ? Buffer.alloc (0) : new Buffer (0);\n\t}\n\n\tfunction errorHandler (e) {\n\t\tvar err = parseError (e);\n\n\t\t// user should define callback or add event listener for the error event\n\t\tif (!cb || (cb && stream.listeners ('error').length))\n\t\t\tstream.emit ('error', err);\n\t\treturn cb && cb (err);\n\t}\n\n\t// In case of error, we're just throw away data\n\tresponse.on ('error', errorHandler);\n\n\t// TODO: use streaming interface\n\t// from https://github.com/jimhigson/oboe.js\n\t// or https://www.npmjs.com/package/stream-json or\n\t// or https://github.com/creationix/jsonparse\n\n\t// or implement it youself\n\tvar jsonParser = new JSONStream (stream);\n\n\tvar symbolsTransferred = 0;\n\n\t//another chunk of data has been received, so append it to `str`\n\tresponse.on ('data', function (chunk) {\n\n\t\tsymbolsTransferred += chunk.length;\n\n\t\t// JSON response\n\t\tif (\n\t\t\tresponse.headers['content-type']\n\t\t\t&& response.headers['content-type'].indexOf ('application/json') === 0\n\t\t\t&& !reqData.syncParser\n\t\t\t&& chunk.lastIndexOf (\"\\n\") !== -1\n\t\t\t&& str\n\t\t) {\n\n\t\t\t// store in buffer anything after\n\t\t\tvar newLinePos = chunk.lastIndexOf (\"\\n\");\n\n\t\t\tvar remains = chunk.slice (newLinePos + 
1);\n\n\t\t\tBuffer.concat([str, chunk.slice (0, newLinePos)])\n\t\t\t\t.toString ('utf8')\n\t\t\t\t.split (\"\\n\")\n\t\t\t\t.forEach (jsonParser);\n\n\t\t\tjsonParser.rows.forEach (function (row) {\n\t\t\t\t// write to readable stream\n\t\t\t\tstream.push (row);\n\t\t\t});\n\n\t\t\tjsonParser.rows = [];\n\n\t\t\tstr = remains;\n\n\t\t\t// plaintext response\n\t\t} else if (str) {\n\t\t\tstr   = Buffer.concat ([str, chunk]);\n\t\t} else {\n\t\t\terror = Buffer.concat ([error, chunk]);\n\t\t}\n\t});\n\n\t// the whole response has been received; finalize parsing and end the stream\n\tresponse.on('end', function () {\n\n\t\t// debug (response.headers);\n\n\t\tif (error) {\n\t\t\treturn errorHandler (error);\n\t\t}\n\n\t\tvar data;\n\n\t\tvar contentType = response.headers['content-type'];\n\n\t\t// Early return and stream end when the content-type implies an empty body\n\t\tif (response.statusCode === 200 && (\n\t\t\t!contentType\n\t\t\t|| contentType.indexOf ('text/plain') === 0\n\t\t\t|| contentType.indexOf ('text/html') === 0 // WTF: xenial - no content-type, precise - text/html\n\t\t)) {\n\t\t\t// probably this is a ping response or any other successful response with *empty* body\n\t\t\tstream.push (null);\n\t\t\tcb && cb (null, str.toString ('utf8'));\n\t\t\treturn;\n\t\t}\n\n\t\tvar supplemental = {};\n\n\t\t// we already pushed all the data\n\t\tif (jsonParser.columns.length) {\n\t\t\ttry {\n\t\t\t\tsupplemental = JSON.parse (jsonParser.supplementalString + str.toString ('utf8'));\n\t\t\t} catch (e) {\n\t\t\t\t// TODO\n\t\t\t}\n\t\t\tstream.supplemental = supplemental;\n\n\t\t\t// end stream\n\t\t\tstream.push (null);\n\n\t\t\tcb && cb (null, Object.assign ({}, supplemental, {\n\t\t\t\tmeta: jsonParser.columns,\n\t\t\t\ttransferred: symbolsTransferred\n\t\t\t}));\n\n\t\t\treturn;\n\t\t}\n\n\t\t// one shot data parsing, should be much faster for smaller datasets\n\t\ttry {\n\t\t\tdata = JSON.parse (str.toString ('utf8'));\n\n\t\t\tdata.transferred = 
symbolsTransferred;\n\n\t\t\tif (data.meta) {\n\t\t\t\tstream.emit ('metadata', data.meta);\n\t\t\t}\n\n\t\t\tif (data.data) {\n\t\t\t\t// no highWatermark support\n\t\t\t\tdata.data.forEach (function (row) {\n\t\t\t\t\tstream.push (row);\n\t\t\t\t});\n\n\t\t\t}\n\t\t} catch (e) {\n\t\t\tif (!reqData.format || !reqData.format.match (/^(JSON|JSONCompact)$/)) {\n\t\t\t\tdata = str.toString ('utf8');\n\t\t\t} else {\n\t\t\t\treturn errorHandler (e);\n\t\t\t}\n\n\t\t} finally {\n\t\t\tif (!stream.readableEnded) {\n\t\t\t\tstream.push (null);\n\t\t\t\tcb && cb (null, data);\n\t\t\t}\n\t\t}\n\t});\n\n}\n\nfunction httpRequest (reqParams, reqData, cb) {\n\n\tif (reqParams.query) {\n\t\treqParams.path = (reqParams.pathname || reqParams.path) + '?' + qs.stringify (reqParams.query);\n\t}\n\n\tvar stream = new RecordStream ({\n\t\tformat: reqData.format\n\t});\n\tvar requestInstance = reqParams.protocol === 'https:' ? https : http;\n\tvar req = requestInstance.request (reqParams, httpResponseHandler.bind (\n\t\tthis, stream, reqParams, reqData, cb\n\t));\n\n\treq.on ('error', function (e) {\n\t\t// user should define callback or add event listener for the error event\n\t\tif (!cb || (cb && stream.listeners ('error').length))\n\t\t\tstream.emit ('error', e);\n\t\treturn cb && cb (e);\n\t});\n\n\treq.on ('timeout', function () {\n\t\treq.abort ();\n\t});\n\n\tstream.req = req;\n\n\tif (reqData.query)\n\t\treq.write (reqData.query);\n\n\tif (reqData.finalized) {\n\t\treq.end();\n\t}\n\n\treturn stream;\n}\n\nfunction ClickHouse (options) {\n\tif (!options) {\n\t\t// `return null` from a constructor is a no-op, so fail loudly instead\n\t\tthrow new Error ('You must provide at least a host name to query ClickHouse');\n\t}\n\n\tif (options.constructor === String) {\n\t\toptions = {host: options};\n\t}\n\n\tthis.options = options;\n}\n\nClickHouse.prototype.getReqParams = function () {\n\tvar urlObject = {};\n\n\t// avoid setting defaults - node http module is not happy otherwise\n\tfor (var name of Object.keys(this.options)) {\n\t\tif 
(!LIBRARY_SPECIFIC_OPTIONS.has(name)) {\n\t\t\turlObject[name] = this.options[name];\n\t\t}\n\t}\n\n\tif (this.options.hasOwnProperty('user') || this.options.hasOwnProperty('password')) {\n\t\turlObject.auth = (this.options.user || 'default')\n\t\t\t+ ':'\n\t\t\t+ (this.options.password || '')\n\t}\n\n\turlObject.method = 'POST';\n\n\turlObject.path = urlObject.path || '/';\n\n\turlObject.port = urlObject.port || 8123;\n\n\treturn urlObject;\n}\n\nClickHouse.prototype.query = function (chQuery, options, cb) {\n\n\tchQuery = chQuery.trim ();\n\n\tif (cb === undefined && options && options.constructor === Function) {\n\t\tcb = options;\n\t\toptions = undefined;\n\t}\n\n\tif (!options)\n\t\toptions = {\n\t\t\tqueryOptions: {}\n\t\t};\n\n\toptions.omitFormat  = options.omitFormat  || this.options.omitFormat  || false;\n\toptions.dataObjects = options.dataObjects || this.options.dataObjects || false;\n\toptions.format      = options.format      || this.options.format      || null;\n\toptions.readonly    = options.readonly    || this.options.readonly    || this.options.useQueryString || false;\n\n\t// we're adding `queryOptions` passed for constructor if any\n\tvar queryObject = Object.assign ({}, this.options.queryOptions, options.queryOptions);\n\n\tvar formatRegexp = /FORMAT\\s+(BlockTabSeparated|CSV|CSVWithNames|JSON|JSONCompact|JSONEachRow|Native|Null|Pretty|PrettyCompact|PrettyCompactMonoBlock|PrettyNoEscapes|PrettyCompactNoEscapes|PrettySpaceNoEscapes|PrettySpace|RowBinary|TabSeparated|TabSeparatedRaw|TabSeparatedWithNames|TabSeparatedWithNamesAndTypes|TSKV|Values|Vertical|XML)/i;\n\tvar formatMatch = chQuery.match (formatRegexp);\n\n\tif (!options.omitFormat && formatMatch) {\n\t\toptions.format = formatMatch[1];\n\t\toptions.omitFormat = true;\n\t}\n\n\tvar reqData = {\n\t\tsyncParser: options.syncParser || this.options.syncParser || false,\n\t\tfinalized: true, // allows to write records into connection stream\n\t};\n\n\tvar reqParams = this.getReqParams 
();\n\n\tvar formatEnding = '';\n\n\t// format should be added for data queries\n\tif (chQuery.match (/^(?:SELECT|WITH|SHOW|DESC|DESCRIBE|EXISTS\\s+TABLE)/i)) {\n\t\tif (!options.format)\n\t\t\toptions.format = options.dataObjects ? 'JSON' : 'JSONCompact';\n\t} else if (chQuery.match (/^INSERT/i)) {\n\n\t\t// There are several variants according to the documentation:\n\t\t// 1. Values already available in the query: INSERT INTO t VALUES (1),(2),(3)\n\t\t// 2. Values must be provided with POST data: INSERT INTO t VALUES\n\t\t// 3. Same as previous but without VALUES keyword: INSERT INTO t FORMAT Values\n\t\t// 4. Insert from SELECT: INSERT INTO t SELECT…\n\n\t\t// we need to handle 2 and 3, and the http stream must stay open in those cases\n\t\tif (chQuery.match (/\\s+VALUES\\b/i)) {\n\t\t\tif (chQuery.match (/\\s+VALUES\\s*$/i))\n\t\t\t\treqData.finalized = false;\n\n\t\t\toptions.format = 'Values';\n\t\t\toptions.omitFormat = true;\n\n\t\t} else if (chQuery.match (/INSERT\\s+INTO\\s+\\S+\\s+(?:\\([^\\)]+\\)\\s+)?SELECT/mi)) {\n\t\t\treqData.finalized  = true;\n\t\t\toptions.omitFormat = true;\n\t\t} else {\n\n\t\t\treqData.finalized = false;\n\n\t\t\t// Newline is recommended https://clickhouse.yandex/docs/en/query_language/insert_into/#insert\n\t\t\tformatEnding = '\\n';\n\t\t\tif (!chQuery.match (/FORMAT/i)) {\n\t\t\t\t// simplest format to use, only need to escape \\t, \\\\ and \\n\n\t\t\t\toptions.format = options.format || 'TabSeparated';\n\t\t\t} else {\n\t\t\t\toptions.omitFormat = true;\n\t\t\t}\n\t\t}\n\t} else {\n\t\toptions.omitFormat = true;\n\t}\n\n\treqData.format = options.format;\n\n\t// use query string to submit ClickHouse query — useful to mock CH server\n\tif (options.readonly) {\n\t\tqueryObject.query = chQuery + ((options.omitFormat) ? 
'' : ' FORMAT ' + options.format + formatEnding);\n\t\treqParams.method = 'GET';\n\t} else {\n\t\t// Trimmed query still may require `formatEnding` when FORMAT clause specified in query\n\t\treqData.query = chQuery + (options.omitFormat ? '' : ' FORMAT ' + options.format) + formatEnding;\n\t\treqParams.method = 'POST';\n\t}\n\n\treqParams.query = queryObject;\n\n\tvar stream = httpRequest (reqParams, reqData, cb);\n\n\treturn stream;\n}\n\nClickHouse.prototype.querying = function (chQuery, options) {\n\n\treturn new Promise (function (resolve, reject) {\n\t\t// Force override `syncParser` option when using promise api\n\t\tconst queryOptions = Object.assign ({}, options, {syncParser: true})\n\t\tvar stream = this.query (chQuery, queryOptions, function (err, data) {\n\t\t\tif (err)\n\t\t\t\treturn reject (err);\n\t\t\tresolve (data);\n\t\t});\n\t}.bind (this));\n}\n\nClickHouse.prototype.ping = function (cb) {\n\n\tvar reqParams = this.getReqParams ();\n\n\treqParams.method = 'GET';\n\n\tvar stream = httpRequest (reqParams, {finalized: true}, cb);\n\n\treturn stream;\n}\n\nClickHouse.prototype.pinging = function () {\n\n\treturn new Promise (function (resolve, reject) {\n\t\tvar reqParams = this.getReqParams ();\n\n\t\treqParams.method = 'GET';\n\n\t\thttpRequest (reqParams, {finalized: true}, function (err, data) {\n\t\t\tif (err)\n\t\t\t\treturn reject (err);\n\t\t\tresolve (data);\n\t\t});\n\t}.bind (this));\n}\n\nmodule.exports = ClickHouse;\n"
  },
  {
    "path": "src/consts.js",
    "content": "exports.LIBRARY_SPECIFIC_OPTIONS = new Set([\n  // \"Auth\" shorthand\n  'user',\n  'password',\n\n  // Database settings go to query string\n  'queryOptions',\n\n  // Driver options\n  'dataObjects',\n  'format',\n  'syncParser',\n  'omitFormat',\n  'readonly',\n  'useQueryString'\n]);\n"
  },
  {
    "path": "src/legacy-support.js",
    "content": "var nodeVer = process.version.substr (1).split ('.');\n\nif (nodeVer[0] >= 6)\n\treturn;\n\nvar legacyModulesInstallCmd = 'npm install object-assign buffer-indexof-polyfill';\n\nif (process.mainModule === module) {\n\tvar spawn = require ('child_process').spawn;\n\n\tvar child = spawn (legacyModulesInstallCmd);\n\tchild.stdout.pipe (process.stdout);\n\tchild.stderr.pipe (process.stderr);\n\n\tchild.on ('error', function (err) {\n\t\tprocess.exit (1);\n\t});\n\n\tchild.on ('exit', function (code) {\n\t\tprocess.exit (0);\n\t});\n\n    return;\n}\n\n\ntry {\n\nif (nodeVer[0] < 4) {\n\tglobal.Promise = global.Promise || require ('bluebird');\n\tObject.assign  = Object.assign  || require ('object-assign');\n\tArray.isArray = function(arg) {\n\t\treturn Object.prototype.toString.call(arg) === '[object Array]';\n\t};\n}\n\nif (nodeVer[0] < 6) {\n\trequire ('buffer-indexof-polyfill');\n}\n\n} catch (err) {\n\tconsole.warn (\"You're using outdated nodejs version.\");\n\tconsole.warn (\"This module supports nodejs down to the version 0.10, but some legwork required.\");\n\tconsole.warn (\"Either install version >= 6, or add dependencies to your own project with `\" + legacyModulesInstallCmd + \"`\");\n\n}\n"
  },
  {
    "path": "src/parse-error.js",
    "content": "function parseError (e) {\n\tvar fields = new Error (e.toString ('utf8'));\n\te.toString ('utf8')\n\t\t.split (/\\,\\s+(?=e\\.)/gm)\n\t\t.map (function (f) {\n\t\tf = f.trim ().split (/\\n/gm).join ('');\n\t\tvar m;\n\t\tif (m = f.match (/^(?:Error: )?Code: (\\d+)$/)) {\n\t\t\tfields.code = parseInt (m[1]);\n\t\t} else if (m = f.match (/^e\\.displayText\\(\\) = ([A-Za-z0-9\\:]+:) ([^]+)/m)) {\n\t\t\t// e.displayText() = DB::Exception: Syntax error: failed at position 0: SEL\n\t\t\tfields.scope = m[1];\n\t\t\tfields.message = m[2];\n\t\t\tif (m = fields.message.match (/Syntax error: (?:failed at position (\\d+)(?:\\s*\\(line\\s*(\\d+)\\,\\s+col\\s*(\\d+)\\))?)/)) {\n\t\t\t\t// console.log ('!!! syntax error: pos %s line %s col %s', m[1], m[2], m[3]);\n\t\t\t\tfields.lineno = parseInt (m[2] || 1, 10);\n\t\t\t\tfields.colno  = parseInt (m[3] || m[1], 10);\n\t\t\t}\n\t\t} else if (m = f.match (/^e\\.what\\(\\) = (.*)/)) {\n\t\t\tfields.type = m[1];\n\t\t} else {\n\t\t\tconsole.warn ('Unknown error field:', f)\n\t\t}\n\n\t});\n\n\treturn fields;\n}\n\nmodule.exports = parseError;\n"
  },
  {
    "path": "src/process-db-value.js",
    "content": "/*\n\nFormats:\n\nCSV: https://clickhouse.yandex/docs/en/formats/csv.html\n\nDuring parsing, values could be enclosed or not enclosed in quotes.\nSupported both single and double quotes. In particular,\nStrings could be represented without quotes - in that case,\nthey are parsed up to comma or newline (CR or LF).\nContrary to RFC, in case of parsing strings without quotes,\nleading and trailing spaces and tabs are ignored. As line delimiter,\nboth Unix (LF), Windows (CR LF) or Mac OS Classic (LF CR) variants are supported.\n\nTSV/TabSeparated: https://clickhouse.yandex/docs/en/formats/tabseparated.html\n\nIn TabSeparated format, data is written by row. Each row contains values separated by tabs.\nEach value is follow by a tab, except the last value in the row,\nwhich is followed by a line break. Strictly Unix line breaks are assumed everywhere.\nThe last row also must contain a line break at the end. Values are written in text format,\nwithout enclosing quotation marks, and with special characters escaped.\n\nMinimum set of symbols that you must escape in TabSeparated format is tab, newline (LF) and backslash.\n\nArrays are formatted as a list of comma-separated values in square brackets.\nNumber items in the array are formatted as normally, but dates, dates with times,\nand strings are formatted in single quotes with the same escaping rules as above.\n\nAs an exception, parsing DateTime is also supported in Unix timestamp format,\nif it consists of exactly 10 decimal digits. The result is not time zone-dependent.\nThe formats YYYY-MM-DD hh:mm:ss and NNNNNNNNNN are differentiated automatically.\n\nValues: https://clickhouse.yandex/docs/en/formats/values.html\n\nPrints every row in parentheses. Rows are separated by commas.\nThere is no comma after the last row. The values inside the parentheses are also comma-separated.\nNumbers are output in decimal format without quotes. 
Arrays are output in square brackets.\nStrings, dates, and dates with times are output in quotes.\nEscaping rules and parsing are same as in the TabSeparated format.\nDuring formatting, extra spaces aren’t inserted, but during parsing,\nthey are allowed and skipped (except for spaces inside array values, which are not allowed).\n\nMinimum set of symbols that you must escape in Values format is single quote and backslash.\n\nNulls: \\N?\n\nhttps://github.com/yandex/ClickHouse/issues/252\nhttps://github.com/yandex/ClickHouse/issues/700\nhttps://github.com/Infinidat/infi.clickhouse_orm/pull/42\n\n*/\n\nvar SEPARATORS = {\n\tTSV: \"\\t\",\n\tCSV: \",\",\n\tValues: \",\"\n}\n\nvar ALIASES = {\n\tTabSeparated: \"TSV\"\n}\n\nvar ESCAPE_STRING = {\n\tTSV: function (v, quote) {return v.replace (/\\\\/g, '\\\\\\\\').replace(/\\t/g, '\\\\t').replace(/\\n/g, '\\\\n')},\n\tCSV: function (v, quote) {return v.replace (/\\\"/g, '\"\"')},\n}\n\nvar ESCAPE_NULL = {\n\tTSV: \"\\\\N\",\n\tCSV: \"\\\\N\",\n\tValues: \"\\\\N\",\n\t// JSONEachRow: \"\\\\N\",\n}\n\nfunction encodeValue (quote, v, format) {\n\n\tformat = ALIASES[format] || format;\n\n\tswitch (typeof v) {\n\t\tcase 'string':\n\t\t\treturn ESCAPE_STRING[format] ? ESCAPE_STRING[format] (v, quote) : v;\n\t\tcase 'number':\n\t\t\tif (isNaN (v))\n\t\t\t\treturn 'nan';\n\t\t\tif (v === +Infinity)\n\t\t\t\treturn '+inf';\n\t\t\tif (v === -Infinity)\n\t\t\t\treturn '-inf';\n\t\t\tif (v === Infinity)\n\t\t\t\treturn 'inf';\n\t\t\treturn v;\n\t\tcase 'object':\n\t\t\t// clickhouse allows to use unix timestamp in seconds\n\t\t\tif (v instanceof Date)\n\t\t\t\treturn (\"\" + v.valueOf ()).substr (0, 10);\n\t\t\t// you can add array items\n\t\t\tif (v instanceof Array)\n\t\t\t\treturn v.map (function(item) {\n\t\t\t\t\treturn encodeValue(true, item, format);\n\t\t\t\t})\n\t\t\t// TODO: tuples support\n\t\t\tif (!format) console.trace ();\n\t\t\tif (v === null)\n\t\t\t\treturn format in ESCAPE_NULL ? 
ESCAPE_NULL[format] : v;\n\n\t\t\treturn format in ESCAPE_NULL ? ESCAPE_NULL[format] : v;\n\n\t\t\tconsole.warn ('Cannot stringify [Object]:', v);\n\t\tcase 'boolean':\n\t\t\treturn v === true ? 1 : 0;\n\t}\n}\n\nfunction encodeRow (row, format) {\n\n\tformat = ALIASES[format] || format;\n\n\tvar encodedRow;\n\n\tif (Array.isArray (row)) {\n\t\tencodedRow = row.map (function (field) {\n\t\t\treturn encodeValue (false, field, format);\n\t\t}.bind (this)).join (SEPARATORS[format]) + \"\\n\";\n\t} else if (row.toString () === \"[object Object]\" && format === \"JSONEachRow\") {\n\t\tencodedRow = JSON.stringify (Object.keys (row).reduce (function (encodedRowObject, k) {\n\t\t\tencodedRowObject[k] = encodeValue (false, row[k], format);\n\t\t\treturn encodedRowObject;\n\t\t}.bind (this), {})) + \"\\n\";\n\t}\n\n\treturn encodedRow;\n}\n\nmodule.exports = {\n\tencodeValue: encodeValue,\n\tencodeRow:   encodeRow\n}\n"
  },
  {
    "path": "src/streams.js",
    "content": "var util   = require ('util');\nvar Duplex = require ('stream').Duplex;\n\nvar encodeRow = require ('./process-db-value').encodeRow;\n\n/**\n * Simplified JSON stream parser\n * @param   {object}   emitter parser will emit metadata event when metadata is available\n * @returns {function} string consumer\n */\nfunction JSONStream (emitter) {\n\t// states are:\n\t// start: { encountered, look for keys\n\t// meta: meta encountered with `[`, parse everything until `]`\n\t// data: data encountered with `[`, just parse every string until `]`\n\t// other keys: concatenate until the end, then prepend `{` and JSON.parse\n\tvar state = null;\n\n\tvar objBuffer;\n\n\tfunction processLine (l) {\n\t\t// console.log (\"LINE>\", l);\n\t\tl = l.trim ();\n\t\tif (!l.length)\n\t\t\treturn;\n\n\t\tif (state === null) {\n\t\t\t// first string should contains `{`\n\t\t\tif (l === '{') {\n\t\t\t\tstate = 'topKeys';\n\t\t\t}\n\t\t} else if (state === 'topKeys') {\n\t\t\t// console.log ('TOP>', l);\n\t\t\tif (l === '\"meta\":') {\n\t\t\t\tstate = 'meta';\n\t\t\t} else if (l === '\"data\":') {\n\t\t\t\tstate = 'data';\n\t\t\t} else if (l === '\"meta\": [') {\n\t\t\t\tstate = 'meta-array';\n\t\t\t} else if (l === '\"data\": [') {\n\t\t\t\tstate = 'data-array';\n\t\t\t} else {\n\t\t\t\tprocessLine.supplementalString += l;\n\t\t\t}\n\t\t} else if (state === 'meta') {\n\t\t\tif (l === '[') {\n\t\t\t\tstate = 'meta-array';\n\t\t\t}\n\t\t} else if (state === 'data') {\n\t\t\tif (l === '[') {\n\t\t\t\tstate = 'data-array';\n\t\t\t}\n\t\t} else if (state === 'meta-array') {\n\t\t\tif (l.match (/^},?$/)) {\n\t\t\t\tprocessLine.columns.push (JSON.parse (objBuffer + '}'));\n\t\t\t\tobjBuffer = undefined;\n\t\t\t} else if (l === '{') {\n\t\t\t\tobjBuffer = l;\n\t\t\t} else if (l.match (/^],?$/)) {\n\n\t\t\t\temitter.emit ('metadata', processLine.columns);\n\n\t\t\t\tstate = 'topKeys';\n\t\t\t} else {\n\t\t\t\tobjBuffer += l;\n\t\t\t}\n\t\t} else if (state === 'data-array') {\n\t\t\tif 
(l.match (/^[\\]\\}],?$/) && objBuffer) {\n\t\t\t\tprocessLine.rows.push (JSON.parse (objBuffer + l[0]));\n\t\t\t\tobjBuffer = undefined;\n\t\t\t} else if (l === '{' || l === '[') {\n\t\t\t\tobjBuffer = l;\n\t\t\t} else if (l.match (/^],?$/)) {\n\t\t\t\tstate = 'topKeys';\n\t\t\t} else if (objBuffer === undefined) {\n\t\t\t\tprocessLine.rows.push (JSON.parse (l[l.length - 1] !== ',' ? l : l.substr (0, l.length - 1)));\n\t\t\t} else {\n\t\t\t\tobjBuffer += l;\n\t\t\t}\n\t\t}\n\n\t}\n\n\tprocessLine.columns            = [];\n\tprocessLine.rows               = [];\n\tprocessLine.supplementalString = '{';\n\n\n\treturn processLine;\n}\n\n/**\n * Duplex stream to work with database\n * @param {object} [options] options\n * @param {object} [options.format] how to format filelds/rows internally\n */\nfunction RecordStream (options) {\n\t// if (! (this instanceof RecordStream)) return new RecordStream(options);\n\toptions = options || {};\n\toptions.objectMode = true;\n\tDuplex.call (this, options);\n\n\tthis.format = options.format;\n\n\tthis._writeBuffer = [];\n\tthis._canWrite = false;\n\n\tObject.defineProperty (this, 'req', {\n\t\tget: function () {return this._req},\n\t\tset: function (req) {this._req = req; this._canWrite = true;}\n\t})\n}\n\nutil.inherits(RecordStream, Duplex);\n\nRecordStream.prototype._read = function read () {\n\t// nothing to do there, when data comes, push will be called\n};\n\n// http://ey3ball.github.io/posts/2014/07/17/node-streams-back-pressure/\n// https://nodejs.org/en/docs/guides/backpressuring-in-streams/\n// https://nodejs.org/docs/latest/api/stream.html#stream_implementing_a_writable_stream\n\n// TODO: implement _writev\n\nRecordStream.prototype._write = function _write (chunk, enc, cb) {\n\n\tif (!Buffer.isBuffer (chunk) && typeof chunk !== 'string')\n\t\tchunk = encodeRow (chunk, this.format);\n\n\t// there is no way to determine line ending efficiently for Buffer\n\tif (typeof chunk === 'string') {\n\t\tif (chunk.substr 
(chunk.length - 1) !== \"\\n\") {\n\t\t\tchunk = chunk + \"\\n\";\n\t\t}\n\t\tchunk = Buffer.from ? Buffer.from (chunk, enc) : new Buffer (chunk, enc);\n\t}\n\n\tif (!(chunk instanceof Buffer)) {\n\t\treturn this.emit ('error', new Error ('Incompatible format'));\n\t}\n\n\t// node stores further write requests into `_writableState.bufferedRequest` chain\n\t// until cb is called.\n\tthis._canWrite = this.req.write (chunk, cb);\n\n};\n\nRecordStream.prototype._destroy = function _destroy (err, cb) {\n\n\tprocess.nextTick (function () {\n\t\tRecordStream.super_.prototype._destroy.call (this, err, function (destroyErr) {\n\t\t\tthis.req.destroy (err);\n\t\t\tcb (destroyErr || err);\n\t\t}.bind (this));\n\t}.bind (this));\n}\n\nRecordStream.prototype.end = function end (chunk, enc, cb) {\n\n\tRecordStream.super_.prototype.end.call (this, chunk, enc, function () {\n\t\tthis.req.end (cb);\n\t}.bind (this));\n};\n\n\nmodule.exports = {\n\tJSONStream: JSONStream,\n\tRecordStream: RecordStream\n};\n"
  },
  {
    "path": "test/01-simulated.js",
    "content": "var ClickHouse = require (\"../src/clickhouse\");\n\nvar http = require ('http');\nvar url  = require ('url');\nvar qs   = require ('querystring');\n\nvar assert = require (\"assert\");\n\nvar responses = {\n\t\"SELECT 1 FORMAT JSONCompact\": {\n\t\t\"meta\": [\n\t\t\t{\"name\": \"1\", \"type\": \"UInt8\"}\n\t\t],\n\t\t\"data\": [\n\t\t\t[1]\n\t\t],\n\t\t\"rows\": 1\n\t},\n\t\"SHOW DATABASES FORMAT JSONCompact\": {\n\t\t\"meta\": [\n\t\t\t{\"name\": \"name\", \"type\": \"String\"}\n\t\t],\n\t\t\"data\": [\n\t\t\t[\"default\"],\n\t\t\t[\"system\"]\n\t\t],\n\t\t\"rows\": 2\n\t},\n\t\"SELECT number FROM system.numbers LIMIT 10 FORMAT JSONCompact\": {\n\t\t\"meta\": [\n\t\t\t{\"name\":\"number\",\"type\":\"UInt64\"}\n\t\t],\n\t\t\"data\": [\n\t\t\t[\"0\"],[\"1\"],[\"2\"],[\"3\"],[\"4\"],[\"5\"],[\"6\"],[\"7\"],[\"8\"],[\"9\"]\n\t\t],\n\t\t\"rows\": 10,\n\t\t\"rows_before_limit_at_least\": 10\n\t},\n\n};\n\ndescribe (\"simulated queries\", function () {\n\n\tvar server,\n\t\thost,\n\t\tport;\n\n\tbefore (function (done) {\n\n\t\tserver = http.createServer(function (req, res) {\n\n\t\t\tvar queryString = url.parse (req.url).query;\n\n\t\t\t// test only supports db queries using queryString\n\t\t\tif (!queryString) {\n\t\t\t\tres.writeHead (200, {});\n\t\t\t\tres.end (\"Ok.\\n\");\n\t\t\t\treturn;\n\t\t\t}\n\n\t\t\tvar queryObject = qs.parse (queryString);\n\n\t\t\t// console.log (queryObject);\n\n\t\t\tif (queryObject.query in responses) {\n\t\t\t\tres.writeHead (200, {\"Content-Type\": \"application/json; charset=UTF-8\"});\n\t\t\t\t// console.log (JSON.stringify (responses[queryObject.query], null, \"\\t\"));\n\t\t\t\tres.end (JSON.stringify (responses[queryObject.query], null, \"\\t\"));\n\t\t\t\treturn;\n\t\t\t}\n\n\t\t\tres.writeHead (500, {\"Content-Type\": \"text/plain; charset=UTF-8\"});\n\t\t\tres.end (\"Simulated error\");\n\t\t});\n\n\t\tserver.on('clientError', function (err, socket) {\n\t\t\tsocket.end('HTTP/1.1 400 Bad 
Request\\r\\n\\r\\n');\n\t\t});\n\n\t\tserver.listen (0, function (evt) {\n\t\t\thost = server.address().address;\n\t\t\tport = server.address().port;\n\n\t\t\thost = host === '0.0.0.0' ? '127.0.0.1' : host;\n\t\t\thost = host === '::' ? '127.0.0.1' : host;\n\n\t\t\tdone();\n\t\t});\n\n\t})\n\n\tafter (function (done) {\n\t\tserver.close(function () {\n\t\t\tdone();\n\t\t});\n\t});\n\n\t// this.timeout (5000);\n\n\tit (\"pings\", function (done) {\n\t\tvar ch = new ClickHouse ({host: host, port: port});\n\t\tch.ping (function (err, ok) {\n\t\t\tassert (!err);\n\t\t\tassert (ok === \"Ok.\\n\", \"ping response should be 'Ok.\\\\n'\");\n\t\t\tdone ();\n\t\t});\n\t});\n\n\tit (\"selects using callback\", function (done) {\n\t\tvar ch = new ClickHouse ({host: host, port: port, readonly: true});\n\t\tch.query (\"SELECT 1\", {syncParser: true}, function (err, result) {\n\t\t\tassert (!err);\n\t\t\tassert (result.meta, \"result should be Object with `data` key to represent rows\");\n\t\t\tassert (result.data, \"result should be Object with `meta` key to represent column info\");\n\t\t\tassert (result.meta.constructor === Array, \"metadata is an array with column descriptions\");\n\t\t\tdone ();\n\t\t});\n\t});\n\n\tit (\"selects numbers using callback\", function (done) {\n\t\tvar ch = new ClickHouse ({host: host, port: port, readonly: true});\n\t\tch.query (\"SELECT number FROM system.numbers LIMIT 10\", {syncParser: true}, function (err, result) {\n\t\t\tassert (!err);\n\t\t\tassert (result.data, \"result should be Object with `data` key to represent rows\");\n\t\t\tassert (result.meta, \"result should be Object with `meta` key to represent column info\");\n\t\t\tassert (result.meta.constructor === Array, \"metadata is an array with column descriptions\");\n\t\t\tassert (result.meta[0].name === \"number\");\n\t\t\tassert (result.data.constructor === Array, \"data is a row set\");\n\t\t\tassert (result.data[0].constructor === Array, \"each row contains list of values 
(using FORMAT JSONCompact)\");\n\t\t\tassert (result.data[9][0] === \"9\"); // this should be corrected at database side\n\t\t\tassert (result.rows === 10);\n\t\t\tassert (result.rows_before_limit_at_least === 10);\n\t\t\tdone ();\n\t\t});\n\t});\n\n\tit (\"selects numbers using stream\", function (done) {\n\t\tvar ch = new ClickHouse ({host: host, port: port, readonly: true});\n\t\tvar rows = [];\n\t\tvar stream = ch.query (\"SELECT number FROM system.numbers LIMIT 10\", function (err, result) {\n\t\t\tassert (!err);\n\t\t\tassert (result.meta, \"result should be Object with `meta` key to represent column info\");\n\t\t\tassert (result.meta.constructor === Array, \"metadata is an array with column descriptions\");\n\t\t\tassert (result.meta[0].name === \"number\");\n\t\t\tassert (rows[0].constructor === Array, \"each row contains list of values (using FORMAT JSONCompact)\");\n\t\t\tassert (rows[9][0] === \"9\"); // this should be corrected at database side\n\t\t\tassert (result.rows === 10);\n\t\t\tassert (result.rows_before_limit_at_least === 10);\n\t\t\tdone ();\n\t\t});\n\n\t\tstream.on ('data', function (row) {\n\t\t\trows.push (row);\n\t\t})\n\t});\n\n\n});\n"
  },
  {
    "path": "test/02-real-server.js",
    "content": "var ClickHouse = require (\"../src/clickhouse\");\n\nvar http = require ('http');\nvar url  = require ('url');\nvar qs   = require ('querystring');\n\nvar assert = require (\"assert\");\n\ndescribe (\"real server\", function () {\n\n\tvar server,\n\t\thost = process.env.CLICKHOUSE_HOST || '127.0.0.1',\n\t\tport = process.env.CLICKHOUSE_PORT || 8123,\n\t\tdbCreated = false;\n\n\tit (\"pings\", function (done) {\n\t\tvar ch = new ClickHouse ({host: host, port: port});\n\t\tch.ping (function (err, ok) {\n\t\t\tassert.ifError (err);\n\t\t\tassert.equal (ok, \"Ok.\\n\", \"ping response should be 'Ok.\\\\n'\");\n\t\t\tdone ();\n\t\t});\n\t});\n\n\tit (\"pinging using promise interface\", function () {\n\t\tvar ch = new ClickHouse ({host: host, port: port});\n\t\treturn ch.pinging ();\n\t});\n\n\tit (\"pinging using promise interface with bad connection option\", function () {\n\t\tvar ch = new ClickHouse ();\n\t\treturn ch.pinging ().then (function () {\n\t\t\treturn Promise.reject (new Error (\"Driver should throw without host name\"))\n\t\t}, function (e) {\n\t\t\treturn Promise.resolve ();\n\t\t});\n\t});\n\n\tit (\"pings with options as host\", function (done) {\n\t\tvar ch = new ClickHouse (host);\n\t\tch.ping (function (err, ok) {\n\t\t\tassert.ifError (err);\n\t\t\tassert.equal (ok, \"Ok.\\n\", \"ping response should be 'Ok.\\\\n'\");\n\t\t\tdone ();\n\t\t});\n\t});\n\n\tit (\"nothing to ping\", function () {\n\t\tvar ch = new ClickHouse ();\n\t\tassert (ch);\n\t});\n\n\tit (\"returns error\", function (done) {\n\t\tvar ch = new ClickHouse ({host: host, port: port, readonly: true});\n\t\tvar stream = ch.query (\"ABCDEFGHIJKLMN\", {syncParser: true}, function (err, result) {\n\t\t\t// assert (err);\n\t\t\t// done ();\n\t\t});\n\n\t\tstream.on ('error', function (err) {\n\t\t\tassert (err);\n\t\t\t// console.log (err);\n\t\t\tdone();\n\t\t});\n\t});\n\n\tit (\"authorises by credentials\", function () {\n\t\tvar ch = new ClickHouse ({host: host, port: 
port, user: 'default', password: ''});\n\t\treturn ch.querying (\"SELECT 1\");\n\t});\n\n\tit (\"selects from system columns\", function (done) {\n\t\tvar ch = new ClickHouse ({host: host, port: port});\n\t\tch.query (\"SELECT * FROM system.columns\", function (err, result) {\n\t\t\tassert (!err);\n\n\t\t\tdone ();\n\t\t});\n\t});\n\n\tit (\"selects from system columns no more than 10 rows throws exception\", function (done) {\n\t\tvar ch = new ClickHouse ({host: host, port: port, queryOptions: {max_rows_to_read: 10}});\n\t\tch.query (\"SELECT * FROM system.columns\", function (err, result) {\n\t\t\tassert (err);\n\n\t\t\tdone ();\n\t\t});\n\t});\n\n\tit (\"creates a database\", function (done) {\n\t\tvar ch = new ClickHouse ({host: host, port: port});\n\t\tch.query (\"CREATE DATABASE node_clickhouse_test\", function (err, result) {\n\t\t\tassert (!err, err);\n\n\t\t\tdbCreated = true;\n\t\t\t// console.log (result);\n\n\t\t\tdone ();\n\t\t});\n\t});\n\n\tit (\"creates a table\", function (done) {\n\t\tvar ch = new ClickHouse ({host: host, port: port});\n\t\tch.query (\"CREATE TABLE node_clickhouse_test.t (a UInt8) ENGINE = Memory\", function (err, result) {\n\t\t\tassert (!err, err);\n\n\t\t\tdone ();\n\t\t});\n\t});\n\n\tit (\"drops a table\", function (done) {\n\t\tvar ch = new ClickHouse ({host: host, port: port, queryOptions: {database: 'node_clickhouse_test'}});\n\t\tch.query (\"DROP TABLE t\", function (err, result) {\n\t\t\tassert (!err);\n\n\t\t\tdone ();\n\t\t});\n\t});\n\n\tit (\"creates a table\", function (done) {\n\t\tvar ch = new ClickHouse ({host: host, port: port, queryOptions: {database: 'node_clickhouse_test'}});\n\t\tch.query (\"CREATE TABLE t (a UInt8) ENGINE = Memory\", function (err, result) {\n\t\t\tassert (!err);\n\n\t\t\tdone ();\n\t\t});\n\t});\n\n\tit (\"inserts some data\", function (done) {\n\t\tvar ch = new ClickHouse ({host: host, port: port});\n\t\tch.query (\"INSERT INTO t VALUES (1),(2),(3)\", {queryOptions: {database: 
'node_clickhouse_test'}}, function (err, result) {\n\t\t\tassert (!err, err);\n\n\t\t\t// let's wait a few seconds\n\t\t\tsetTimeout (function () {done ()}, 500);\n\t\t});\n\t});\n\n\tit (\"gets back data\", function (done) {\n\t\tvar ch = new ClickHouse ({host: host, port: port});\n\t\tvar rows = [];\n\t\tvar stream = ch.query (\"select a FROM t\", {queryOptions: {database: 'node_clickhouse_test'}});\n\n\t\tstream.on ('data', function (row) {\n\t\t\trows.push (row);\n\t\t});\n\n\t\tstream.on ('end', function () {\n\n\t\t\tdone();\n\t\t});\n\t});\n\n\n\tafter (function (done) {\n\n\t\tif (!dbCreated)\n\t\t\treturn done ();\n\n\t\tvar ch = new ClickHouse ({host: host, port: port});\n\t\tch.query (\"DROP DATABASE node_clickhouse_test\", function (err, result) {\n\t\t\tassert (!err);\n\n\t\t\t// console.log (result);\n\n\t\t\tdone ();\n\t\t});\n\t});\n});\n"
  },
  {
    "path": "test/03-errors.js",
    "content": "var ClickHouse = require (\"../src/clickhouse\");\n\nvar http = require ('http');\nvar url  = require ('url');\nvar qs   = require ('querystring');\n\nvar assert = require (\"assert\");\n\ndescribe (\"error parsing\", function () {\n\n\tvar server,\n\t\thost = process.env.CLICKHOUSE_HOST || '127.0.0.1',\n\t\tport = process.env.CLICKHOUSE_PORT || 8123,\n\t\tdbCreated = false;\n\n\tit (\"will not throw on http error\", function (done) {\n\t\tvar ch = new ClickHouse ({host: host, port: 59999, readonly: true});\n\t\tvar stream = ch.query (\"ABCDEFGHIJKLMN\", {syncParser: true}, function (err, result) {\n\t\t\t// assert (err);\n\t\t\t// done ();\n\t\t});\n\n\t\tstream.on ('error', function (err) {\n\t\t\tdone();\n\t\t});\n\t});\n\n\tit (\"returns error for unknown sql\", function (done) {\n\t\tvar ch = new ClickHouse ({host: host, port: port, readonly: true});\n\t\tvar stream = ch.query (\"ABCDEFGHIJKLMN\", {syncParser: true}, function (err, result) {\n\t\t\t// assert (err);\n\t\t\t// done ();\n\t\t});\n\n\t\tstream.on ('error', function (err) {\n\n\t\t\tassert (err);\n\t\t\tassert (err.message.match (/Syntax error/));\n\t\t\tassert.equal (err.lineno, 1, \"line number should eq 1\");\n\t\t\tassert.equal (err.colno, 1, \"col  number should eq 1\");\n\t\t\t// console.log (err);\n\t\t\tdone();\n\t\t});\n\t});\n\n\tit (\"returns error with line/col for sql with garbage\", function (done) {\n\t\tvar ch = new ClickHouse ({host: host, port: port, readonly: true});\n\t\tvar stream = ch.query (\"CREATE\\n\\t\\tABCDEFGHIJKLMN\", {syncParser: true}, function (err, result) {\n\t\t\t// assert (err);\n\t\t\t// done ();\n\t\t});\n\n\t\tstream.on ('error', function (err) {\n\n\t\t\tassert (err);\n\t\t\tassert (err.message.match (/Syntax error/));\n\t\t\tassert.equal (err.lineno, 2, \"line number should eq 2\");\n\t\t\tassert.equal (err.colno, 3, \"col  number should eq 3\");\n\t\t\t// console.log (err);\n\t\t\tdone();\n\t\t});\n\t});\n\n\tit (\"returns error for empty 
sql\", function (done) {\n\t\tvar ch = new ClickHouse ({host: host, port: port, readonly: true});\n\n\t\tfunction countCallbacks (err) {\n\t\t\tcountCallbacks.count = (countCallbacks.count || 0) + 1;\n\n\t\t\tif (countCallbacks.count === 2) {\n\t\t\t\t//\n\t\t\t\tdone ();\n\t\t\t}\n\t\t}\n\n\t\tvar stream = ch.query (\"-- nothing here\", {syncParser: true}, function (err, result) {\n\t\t\tcountCallbacks (err);\n\t\t\t// assert (err);\n\t\t\t// done ();\n\t\t});\n\n\t\tstream.on ('error', function (err) {\n\n\t\t\t// console.log (err); // failed at end of query\n\n\t\t\tassert (err);\n\t\t\tassert (err.message.match (/Empty query/) || err.message.match (/Syntax error/), err);\n\t\t\t// clickhouse doesn't return lineno and colno for some queries\n\t\t\t// assert.ifError ('lineno' in err);\n\t\t\t// assert.ifError ('colno'  in err);\n\n\t\t\tcountCallbacks (err);\n\t\t});\n\t});\n\n\tit (\"returns error for unknown table\", function (done) {\n\t\tvar ch = new ClickHouse ({host: host, port: port, readonly: true});\n\t\tvar stream = ch.query (\"SELECT * FROM xxx\", {syncParser: true}, function (err, result) {\n\t\t\tassert (err);\n\t\t});\n\n\t\tstream.on ('error', function (err) {\n\n\t\t\tassert (err);\n\t\t\tassert.ifError (err.message.match (/Syntax error/));\n\t\t\t// clickhouse doesn't return lineno and colno for some queries\n\t\t\t// assert.ifError ('lineno' in err);\n\t\t\t// assert.ifError ('colno'  in err);\n\n\t\t\tdone();\n\t\t});\n\t});\n\n\t// TODO turn it on when fixed https://github.com/ClickHouse/ClickHouse/issues/9914\n\t// it (\"returns error for writing in readonly mode\", function (done) {\n\t// \tvar ch = new ClickHouse ({host: host, port: port});\n\t// \tch.querying (\"CREATE TABLE xxx (a UInt8) ENGINE = Memory()\", {readonly: true})\n\t// \t\t.then(() => done('Should fail in readonly mode'))\n\t// \t\t.catch(() => done())\n\t// });\n\n\n\n});\n"
  },
  {
    "path": "test/04-select.js",
    "content": "var ClickHouse = require (\"../src/clickhouse\");\n\nvar assert = require (\"assert\");\n\ndescribe (\"select data from database\", function () {\n\n\tvar server,\n\t\thost = process.env.CLICKHOUSE_HOST || '127.0.0.1',\n\t\tport = process.env.CLICKHOUSE_PORT || 8123,\n\t\tdbCreated = false;\n\n\tit (\"selects using callback\", function (done) {\n\t\tvar ch = new ClickHouse ({host: host, port: port, readonly: true});\n\t\tch.query (\"SELECT 1\", {syncParser: true}, function (err, result) {\n\t\t\tassert (!err);\n\t\t\tassert (result.meta, \"result should be Object with `data` key to represent rows\");\n\t\t\tassert (result.data, \"result should be Object with `meta` key to represent column info\");\n\t\t\tdone ();\n\t\t});\n\t});\n\n\tit (\"selects using callback and query submitted in the POST body\", function (done) {\n\t\tvar ch = new ClickHouse ({host: host, port: port});\n\t\tch.query (\"SELECT 1\", {syncParser: true}, function (err, result) {\n\t\t\tassert (!err);\n\t\t\tassert (result.meta, \"result should be Object with `data` key to represent rows\");\n\t\t\tassert (result.data, \"result should be Object with `meta` key to represent column info\");\n\t\t\tdone ();\n\t\t});\n\t});\n\n\tit (\"selects numbers using callback\", function (done) {\n\t\tvar ch = new ClickHouse ({host: host, port: port, readonly: true});\n\t\tch.query (\"SELECT number FROM system.numbers LIMIT 10\", {syncParser: true}, function (err, result) {\n\t\t\tassert (!err);\n\t\t\tassert (result.meta, \"result should be Object with `data` key to represent rows\");\n\t\t\tassert (result.data, \"result should be Object with `meta` key to represent column info\");\n\t\t\tassert (result.meta.constructor === Array, \"metadata is an array with column descriptions\");\n\t\t\tassert (result.meta[0].name === \"number\");\n\t\t\tassert (result.data.constructor === Array, \"data is a row set\");\n\t\t\tassert (result.data[0].constructor === Array, \"each row contains list of values 
(using FORMAT JSONCompact)\");\n\t\t\tassert (result.data[9][0] === \"9\"); // this should be corrected at database side\n\t\t\tassert (result.rows === 10);\n\t\t\tassert (result.rows_before_limit_at_least === 10);\n\t\t\tdone ();\n\t\t});\n\t});\n\n\tit (\"selects numbers using promise should already have parsed data\", function () {\n\t\tvar ch = new ClickHouse ({host: host, port: port, readonly: true});\n\t\treturn ch.querying (\"SELECT number FROM system.numbers LIMIT 10\").then (function (result) {\n\t\t\tassert (result.meta, \"result should be Object with `data` key to represent rows\");\n\t\t\tassert (result.data, \"result should be Object with `meta` key to represent column info\");\n\t\t\tassert (result.meta.constructor === Array, \"metadata is an array with column descriptions\");\n\t\t\tassert (result.meta[0].name === \"number\");\n\t\t\tassert (result.data.constructor === Array, \"data is a row set\");\n\t\t\tassert (result.data[0].constructor === Array, \"each row contains list of values (using FORMAT JSONCompact)\");\n\t\t\tassert (result.data[9][0] === \"9\"); // this should be corrected at database side\n\t\t\tassert (result.rows === 10);\n\t\t\tassert (result.rows_before_limit_at_least === 10);\n\t\t\treturn Promise.resolve ();\n\t\t});\n\t});\n\n\tit (\"selects numbers as dataObjects using promise\", function () {\n\t\tvar ch = new ClickHouse ({host: host, port: port, readonly: true, dataObjects: true});\n\t\treturn ch.querying (\"SELECT number FROM system.numbers LIMIT 10\").then (function (result) {\n\t\t\tassert (result.meta, \"result should be Object with `data` key to represent rows\");\n\t\t\tassert (result.data, \"result should be Object with `meta` key to represent column info\");\n\t\t\tassert (result.meta.constructor === Array, \"metadata is an array with column descriptions\");\n\t\t\tassert (result.meta[0].name === \"number\");\n\t\t\tassert (result.data.constructor === Array, \"data is a row set\");\n\t\t\tassert 
(result.data[0].constructor === Object, \"each row contains key-valued rows (using FORMAT JSON)\");\n\t\t\tassert (result.data[9].number === \"9\");\n\t\t\tassert (result.data.length === 10);\n\t\t\tassert (result.rows === 10);\n\t\t\tassert (result.rows_before_limit_at_least === 10);\n\t\t\treturn Promise.resolve ();\n\t\t});\n\t});\n\n\tit (\"selects numbers using callback and query submitted in the POST body\", function (done) {\n\t\tvar ch = new ClickHouse ({host: host, port: port});\n\t\tch.query (\"SELECT number FROM system.numbers LIMIT 10\", {syncParser: true}, function (err, result) {\n\t\t\tassert (!err);\n\t\t\tassert (result.meta, \"result should be Object with `meta` key to represent rows\");\n\t\t\tassert (result.data, \"result should be Object with `data` key to represent column info\");\n\t\t\tassert (result.meta.constructor === Array, \"metadata is an array with column descriptions\");\n\t\t\tassert (result.meta[0].name === \"number\");\n\t\t\tassert (result.data.constructor === Array, \"data is a row set\");\n\t\t\tassert (result.data[0].constructor === Array, \"each row contains list of values (using FORMAT JSONCompact)\");\n\t\t\tassert (result.data[9][0] === \"9\"); // this should be corrected at database side\n\t\t\tassert (result.rows === 10);\n\t\t\tassert (result.rows_before_limit_at_least === 10);\n\n\t\t\tdone ();\n\t\t});\n\t});\n\n\tit (\"selects numbers asynchronously using events and query submitted in the POST body\", function (done) {\n\t\tvar ch = new ClickHouse ({host: host, port: port});\n\t\tvar rows = [];\n\t\tvar stream = ch.query (\"SELECT number FROM system.numbers LIMIT 10\", function (err, result) {\n\t\t\tassert (!err);\n\t\t\tassert (result.meta, \"result should be Object with `meta` key to represent rows\");\n\t\t\tassert (rows, \"result should be Object with `data` key to represent column info\");\n\t\t\tassert (result.meta.constructor === Array, \"metadata is an array with column descriptions\");\n\t\t\tassert 
(result.meta[0].name === \"number\");\n\t\t\tassert (rows.length === 10, \"total 10 rows\");\n\t\t\tassert (rows[0].constructor === Array, \"each row contains list of values (using FORMAT JSONCompact)\");\n\t\t\tassert (rows[9][0] === \"9\"); // this should be corrected at database side\n\t\t\tassert (result.rows === 10);\n\t\t\tassert (result.rows_before_limit_at_least === 10);\n\n\t\t\tdone ();\n\t\t});\n\t\tstream.on ('data', function (row) {\n\t\t\trows.push (row);\n\t\t})\n\t});\n\n\tit (\"selects numbers asynchronously using stream and query submitted in the POST body\", function (done) {\n\t\tvar ch = new ClickHouse ({host: host, port: port});\n\t\tvar metadata;\n\t\tvar rows = [];\n\t\tvar stream = ch.query (\"SELECT number FROM system.numbers LIMIT 10\");\n\n\t\tstream.on ('metadata', function (_meta) {\n\t\t\tmetadata = _meta;\n\t\t});\n\t\tstream.on ('data', function (row) {\n\t\t\trows.push (row);\n\t\t});\n\t\tstream.on ('error', function (err) {\n\t\t\tassert (err);\n\t\t});\n\t\tstream.on ('end', function () {\n\t\t\tassert (metadata, \"result should be Object with `meta` key to represent rows\");\n\t\t\tassert (rows, \"result should be Object with `data` key to represent column info\");\n\t\t\tassert (metadata.constructor === Array, \"metadata is an array with column descriptions\");\n\t\t\tassert (metadata[0].name === \"number\");\n\t\t\tassert (rows.length === 10, \"total 10 rows\");\n\t\t\tassert (rows[0].constructor === Array, \"each row contains list of values (using FORMAT JSONCompact)\");\n\t\t\tassert (rows[9][0] === \"9\"); // this should be corrected at database side\n\t\t\tassert (stream.supplemental.rows === 10);\n\t\t\tassert (stream.supplemental.rows_before_limit_at_least === 10);\n\n\t\t\tdone ();\n\t\t});\n\t});\n\n\tit (\"selects number objects asynchronously using stream and query submitted in the POST body\", function (done) {\n\t\tvar ch = new ClickHouse ({host: host, port: port});\n\t\tvar metadata;\n\t\tvar rows = [];\n\t\tvar 
stream = ch.query (\"SELECT number FROM system.numbers LIMIT 10\", {dataObjects: true});\n\n\t\tstream.on ('metadata', function (_meta) {\n\t\t\tmetadata = _meta;\n\t\t});\n\t\tstream.on ('data', function (row) {\n\t\t\trows.push (row);\n\t\t});\n\t\tstream.on ('error', function (err) {\n\t\t\tassert (err);\n\t\t});\n\t\tstream.on ('end', function () {\n\t\t\tassert (metadata, \"result should be Object with `meta` key to represent rows\");\n\t\t\tassert (rows, \"result should be Object with `data` key to represent column info\");\n\t\t\tassert (metadata.constructor === Array, \"metadata is an array with column descriptions\");\n\t\t\tassert (metadata[0].name === \"number\");\n\t\t\tassert (rows.length === 10, \"total 10 rows\");\n\t\t\tassert ('number' in rows[0], \"each row contains fields (using FORMAT JSON)\");\n\t\t\tassert (rows[9].number === \"9\"); // this should be corrected at database side\n\t\t\tassert (stream.supplemental.rows === 10);\n\t\t\tassert (stream.supplemental.rows_before_limit_at_least === 10);\n\n\t\t\tdone ();\n\t\t});\n\t});\n\n\tit (\"select data in unsupported format\", function (done) {\n\n\t\tvar ch = new ClickHouse ({host: host, port: port});\n\n\t\tch.query (\"SELECT number FROM system.numbers LIMIT 10\", {format: \"CSV\"}, function (err, result) {\n\n\t\t\tassert (!err, err);\n\n\t\t\tassert (result.match (/1\\n2\\n3\\n4\\n5\\n6\\n7\\n8\\n9/));\n\n\t\t\tdone ();\n\n\t\t});\n\t});\n\n\tit(\"can cancel an ongoing select by calling destroy\", function (done) {\n\t\tvar RecordStream = require(\"../src/streams\").RecordStream;\n\t\tif (typeof RecordStream.prototype.destroy !== \"function\") {\n\t\t\t// If current Node.js version lacks Stream#destroy impl, skip the test\n\t\t\tthis.skip ();\n\t\t\treturn;\n\t\t}\n\n\t\tvar ch = new ClickHouse ({host: host, port: port});\n\t\tvar rows = [];\n\t\tvar limit = 10000;\n\t\tvar stream = ch.query (\"SELECT number FROM system.numbers LIMIT \" + limit, function () {\n\t\t\tassert 
(stream.destroyed);\n\t\t\tassert (rows.length < limit);\n\n\t\t\tdone ();\n\t\t});\n\n\t\tstream.on ('data', function (row) {\n\t\t\trows.push (row);\n\t\t});\n\n\t\tstream.once ('data', function () {\n\t\t\tstream.destroy ();\n\t\t});\n\t});\n\n\tit(\"select query with WITH clause will produces result formatted \", function (done) {\n\t\tvar ch = new ClickHouse({ host: host, port: port, format: \"JSON\" });\n\t\tch.query(\"WITH 10 as pagesize SELECT 1\", { syncParser: true }, function (err, result) {\n\t\t\tassert(!err);\n\t\t\tassert(result.meta, \"result should be Object with `data` key to represent rows\");\n\t\t\tassert(result.data, \"result should be Object with `meta` key to represent column info\");\n\t\t\tassert(Object.prototype.toString.call(result.data[0]) === '[object Object]', \"data should be formatted JSON\" )\n\t\t\tdone();\n\t\t});\n\t})\n});\n"
  },
  {
    "path": "test/05-insert.js",
    "content": "var ClickHouse = require (\"../src/clickhouse\");\n\nvar assert = require (\"assert\");\nvar fs     = require (\"fs\");\nvar crypto = require (\"crypto\");\n\nvar encodeValue = require ('../src/process-db-value').encodeValue;\nvar encodeRow   = require ('../src/process-db-value').encodeRow;\n\nfunction randomDate(start, end) {\n\treturn new Date(start.getTime() + Math.random() * (end.getTime() - start.getTime()));\n}\n\nfunction generateData (format, fileName, cb) {\n\tvar rs = fs.createWriteStream (fileName);\n\tfor (var i = 0; i < 10; i++) {\n\n\t\trs.write (\n\t\t\tencodeRow ([\n\t\t\t\tMath.ceil (Math.random () * 1000),\n\t\t\t\tMath.random () * 1000,\n\t\t\t\tcrypto.randomBytes(20).toString('hex'),\n\t\t\t\trandomDate(new Date(2012, 0, 1), new Date())\n\t\t\t], format)\n\t\t);\n\t}\n\n\trs.end (function () {\n\t\tcb ();\n\t});\n}\n\nvar testDate = new Date ();\nvar testDateISO = testDate.toISOString ().replace (/\\..*/, '').replace ('T', ' ');\n\ndescribe (\"insert data\", function () {\n\n\tvar server,\n\t\thost = process.env.CLICKHOUSE_HOST || '127.0.0.1',\n\t\tport = process.env.CLICKHOUSE_PORT || 8123,\n\t\tdbCreated = false,\n\t\tdbName = 'node_clickhouse_test_insert';\n\n\tbefore (function () {\n\t\tvar ch = new ClickHouse ({host: host, port: port});\n\t\tvar okFn = function () {return Promise.resolve()};\n\t\treturn ch.querying (\"DROP DATABASE \" + dbName).then (\n\t\t\tokFn, okFn\n\t\t).then (function () {\n\t\t\treturn ch.querying (\"CREATE DATABASE \" + dbName);\n\t\t}).then (function (result) {\n\t\t\tdbCreated = true;\n\t\t\t// console.log (result);\n\t\t\treturn Promise.resolve ();\n\t\t});\n\t});\n\n\tit (\"creates a table\", function (done) {\n\t\tvar ch = new ClickHouse ({host: host, port: port, queryOptions: {database: dbName}});\n\t\tch.query (\"CREATE TABLE t (a UInt8) ENGINE = Memory\", function (err, result) {\n\t\t\tassert (!err);\n\n\t\t\tdone ();\n\t\t});\n\t});\n\n\tit (\"inserts some prepared data using stream\", 
function (done) {\n\t\tvar ch = new ClickHouse ({host: host, port: port});\n\t\tvar stream = ch.query (\"INSERT INTO t\", {queryOptions: {database: dbName}}, function (err, result) {\n\t\t\tassert (!err, err);\n\n\t\t\tch.query (\"SELECT * FROM t\", {syncParser: true, queryOptions: {database: dbName}}, function (err, result) {\n\n\t\t\t\tassert.equal (result.data[0][0], 8);\n\t\t\t\tassert.equal (result.data[1][0], 73);\n\t\t\t\tassert.equal (result.data[2][0], 42);\n\n\t\t\t\tdone ();\n\n\t\t\t});\n\t\t});\n\n\t\tstream.write ('8');\n\t\tstream.write ('73');\n\t\tstream.write (Buffer.from ? Buffer.from ('42') : new Buffer ('42'));\n\t\tstream.end ();\n\t});\n\n\tit (\"inserts some data\", function (done) {\n\t\tvar ch = new ClickHouse ({host: host, port: port});\n\t\tch.query (\"INSERT INTO t VALUES (1),(2),(3)\", {queryOptions: {database: dbName}}, function (err, result) {\n\t\t\tassert (!err, err);\n\n\t\t\tdone ();\n\t\t});\n\t});\n\n\tit (\"inserts some data using promise\", function () {\n\t\tvar ch = new ClickHouse ({host: host, port: port});\n\t\treturn ch.querying (\"INSERT INTO t VALUES (1),(2),(3)\", {queryOptions: {database: dbName}});\n\t});\n\n\tit (\"creates a table 2\", function (done) {\n\t\tvar ch = new ClickHouse ({host: host, port: port, queryOptions: {database: dbName}});\n\t\tch.query (\"CREATE TABLE t2 (a UInt8, b Float32, x Nullable(String), z DateTime) ENGINE = Memory\", function (err, result) {\n\t\t\tassert (!err);\n\n\t\t\tdone ();\n\t\t});\n\t});\n\n\tit (\"inserts data from array using stream\", function (done) {\n\t\tvar ch = new ClickHouse ({host: host, port: port});\n\n\t\tvar stream = ch.query (\"INSERT INTO t2\", {queryOptions: {database: dbName}}, function (err, result) {\n\t\t\tassert (!err, err);\n\n\t\t\tch.query (\"SELECT * FROM t2\", {syncParser: true, queryOptions: {database: dbName}}, function (err, result) {\n\n\t\t\t\tassert.equal (result.data[0][0], 1);\n\t\t\t\tassert.equal (result.data[0][1], 
2.22);\n\t\t\t\tassert.equal (result.data[0][2], null);\n\t\t\t\tassert.equal (result.data[0][3], testDateISO);\n\n\t\t\t\tassert.equal (result.data[1][0], 20);\n\t\t\t\tassert.equal (result.data[1][1], 1.11);\n\t\t\t\tassert.equal (result.data[1][2], \"wrqwefqwef\");\n\t\t\t\tassert.equal (result.data[1][3], \"2017-07-07 12:12:12\");\n\n\t\t\t\tdone ();\n\n\t\t\t});\n\t\t});\n\n\t\tstream.write ([1, 2.22, null, testDate]);\n\t\tstream.write (\"20\\t1.11\\twrqwefqwef\\t2017-07-07 12:12:12\");\n\n\t\t// stream.write ([0, Infinity, null, new Date ()]);\n\t\t// stream.write ([23, NaN, \"yyy\", new Date ()]);\n\n\t\tstream.end ();\n\t});\n\n\tit (\"creates a table 3\", function (done) {\n\t\tvar ch = new ClickHouse ({host: host, port: port, queryOptions: {database: dbName}});\n\t\tch.query (\"CREATE TABLE t3 (a UInt8, b Float32, x Nullable(String), z DateTime) ENGINE = Memory\", function (err, result) {\n\t\t\tassert (!err);\n\n\t\t\tdone ();\n\t\t});\n\t});\n\n\tit (\"inserts data from array of objects using stream\", function (done) {\n\t\tvar ch = new ClickHouse ({host: host, port: port});\n\n\t\tvar stream = ch.query (\"INSERT INTO t3\", {format: \"JSONEachRow\", queryOptions: {database: dbName}}, function (err, result) {\n\t\t\tassert (!err, err);\n\n\t\t\tch.query (\"SELECT * FROM t3\", {syncParser: true, queryOptions: {database: dbName}}, function (err, result) {\n\n\t\t\t\tassert.equal (result.data[0][0], 1);\n\t\t\t\tassert.equal (result.data[0][1], 2.22);\n\t\t\t\tassert.equal (result.data[0][2], null);\n\t\t\t\tassert.equal (result.data[0][3], testDateISO);\n\n\t\t\t\tassert.equal (result.data[1][0], 20);\n\t\t\t\tassert.equal (result.data[1][1], 1.11);\n\t\t\t\tassert.equal (result.data[1][2], \"wrqwefqwef\");\n\t\t\t\tassert.equal (result.data[1][3], \"2017-07-07 12:12:12\");\n\n\t\t\t\tassert.equal (result.data.length, 2);\n\n\t\t\t\tdone ();\n\n\t\t\t});\n\t\t});\n\n\t\tstream.write ({a: 1, b: 2.22, x: null, z: testDate});\n\t\tstream.write ({a: 20, b: 
1.11, x: \"wrqwefqwef\", z: \"2017-07-07 12:12:12\"});\n\n\t\t// stream.write ([0, Infinity, null, new Date ()]);\n\t\t// stream.write ([23, NaN, \"yyy\", new Date ()]);\n\n\t\tstream.end ();\n\t});\n\n\tit (\"inserts from select\", function (done) {\n\t\tvar ch = new ClickHouse ({host: host, port: port, queryOptions: {database: dbName}});\n\n\t\tvar stream = ch.query (\"INSERT INTO t3 SELECT * FROM t2\", {}, function (err, result) {\n\t\t\tassert (!err, err);\n\n\t\t\tch.query (\"SELECT * FROM t3\", {syncParser: true, queryOptions: {database: dbName}}, function (err, result) {\n\n\t\t\t\tassert.equal (result.data[2][0], 1);\n\t\t\t\tassert.equal (result.data[2][1], 2.22);\n\t\t\t\tassert.equal (result.data[2][2], null);\n\t\t\t\tassert.equal (result.data[2][3], testDateISO);\n\n\t\t\t\tassert.equal (result.data[3][0], 20);\n\t\t\t\tassert.equal (result.data[3][1], 1.11);\n\t\t\t\tassert.equal (result.data[3][2], \"wrqwefqwef\");\n\t\t\t\tassert.equal (result.data[3][3], \"2017-07-07 12:12:12\");\n\n\t\t\t\tassert.equal (result.data.length, 4);\n\n\t\t\t\tdone ();\n\n\t\t\t});\n\t\t});\n\n\t\t// stream.end ();\n\t});\n\n\tit (\"piping data from csv file\", function (done) {\n\n\t\tthis.timeout (5000);\n\n\t\tvar ch = new ClickHouse ({host: host, port: port});\n\n\t\tvar csvFileName = __filename.replace ('.js', '.csv');\n\n\t\tfunction processFileStream (fileStream) {\n\t\t\tvar stream = ch.query (\"INSERT INTO t3\", {format: \"CSV\", queryOptions: {database: dbName}}, function (err, result) {\n\n\t\t\t\tassert (!err, err);\n\n\t\t\t\tch.query (\"SELECT * FROM t3\", {syncParser: true, queryOptions: {database: dbName}}, function (err, result) {\n\n\t\t\t\t\tassert (!err, err);\n\n\t\t\t\t\tfs.unlink (csvFileName, function () {\n\t\t\t\t\t\tdone ();\n\t\t\t\t\t});\n\t\t\t\t});\n\t\t\t});\n\n\t\t\tfileStream.pipe (stream);\n\n\t\t\tstream.on ('error', function (err) {\n\t\t\t\t// console.log (err);\n\t\t\t});\n\t\t}\n\n\t\tfs.stat (csvFileName, function (err, stat) 
{\n\t\t\t// (re)generate the data file on every run\n\t\t\tgenerateData ('CSV', csvFileName, function () {\n\t\t\t\tprocessFileStream (fs.createReadStream (csvFileName));\n\t\t\t});\n\t\t});\n\n\t});\n\n\tit (\"piping data from tsv file\", function (done) {\n\n\t\tthis.timeout (5000);\n\n\t\tvar ch = new ClickHouse ({host: host, port: port});\n\n\t\tvar tsvFileName = __filename.replace ('.js', '.tsv');\n\n\t\tfunction processFileStream (fileStream) {\n\t\t\tvar stream = ch.query (\"INSERT INTO t3\", {format: \"TabSeparated\", queryOptions: {database: dbName}}, function (err, result) {\n\n\t\t\t\tassert (!err, err);\n\n\t\t\t\tch.query (\"SELECT * FROM t3\", {syncParser: true, queryOptions: {database: dbName}}, function (err, result) {\n\n\t\t\t\t\tassert (!err, err);\n\n\t\t\t\t\tfs.unlink (tsvFileName, function () {\n\t\t\t\t\t\tdone ();\n\t\t\t\t\t});\n\t\t\t\t});\n\t\t\t});\n\n\t\t\tfileStream.pipe (stream);\n\n\t\t\tstream.on ('error', function (err) {\n\t\t\t\t// console.log (err);\n\t\t\t});\n\t\t}\n\n\t\tfs.stat (tsvFileName, function (err, stat) {\n\t\t\t// (re)generate the data file on every run\n\t\t\tgenerateData ('TSV', tsvFileName, function () {\n\t\t\t\tprocessFileStream (fs.createReadStream (tsvFileName));\n\t\t\t});\n\t\t});\n\n\t});\n\n    it (\"creates a table 4\", function (done) {\n        var ch = new ClickHouse ({host: host, port: port, queryOptions: {database: dbName}});\n        ch.query (\"CREATE TABLE t4 (arrayString Array(String), arrayInt Array(UInt32)) ENGINE = Memory\", function (err, result) {\n            assert (!err);\n\n            done ();\n        });\n    });\n\n    it (\"inserts array with format JSON using stream\", function (done) {\n        var ch = new ClickHouse ({host: host, port: port});\n\n        var stream = ch.query (\"INSERT INTO t4\", {queryOptions: {database: dbName}, format: \"JSONEachRow\"}, function 
(err, result) {\n            assert (!err, err);\n\n            ch.query (\"SELECT * FROM t4\", {syncParser: true, queryOptions: {database: dbName}, dataObjects: \"JSON\"}, function (err, result) {\n                assert.deepEqual (result.data[0].arrayString, ['first', 'second']);\n                assert.deepEqual (result.data[0].arrayInt, [1, 0, 100]);\n\n                done ();\n            });\n        });\n\n        stream.write ({\n            arrayString: ['first', 'second'],\n            arrayInt: [1, 0, 100]\n        });\n\n        stream.end ();\n    });\n\n\t\tit (\"creates a table 5\", function () {\n\t\t\tvar ch = new ClickHouse ({host: host, port: port, queryOptions: {database: dbName}});\n\t\t\treturn ch.querying (\"CREATE TABLE t5 (a UInt8, b Float32, x Nullable(String), z DateTime) ENGINE = Memory\");\n\t\t});\n\n\t\tit (\"inserts csv with FORMAT clause\", function (done) {\n\t\t\tvar ch = new ClickHouse ({host: host, port: port});\n\t\t\tvar stream = ch.query (\"INSERT INTO t5 FORMAT CSV\", {queryOptions: {database: dbName}}, function (err, result) {\n\t\t\t\tassert (!err, err);\n\n\t\t\t\tch.query (\"SELECT * FROM t5\", {syncParser: true, queryOptions: {database: dbName}}, function (err, result) {\n\n\t\t\t\t\tassert.equal (result.data[0][0], 0);\n\t\t\t\t\tassert.equal (result.data[0][1], 0);\n\t\t\t\t\tassert.equal (result.data[0][2], null);\n\t\t\t\t\tassert.equal (result.data[0][3], '1970-01-02 00:00:00');\n\t\t\t\t\tassert.equal (result.data[1][0], 1);\n\t\t\t\t\tassert.equal (result.data[1][1], 1.5);\n\t\t\t\t\tassert.equal (result.data[1][2], '1');\n\t\t\t\t\tassert.equal (result.data[1][3], '2050-01-01 00:00:00');\n\n\t\t\t\t\tdone ();\n\n\t\t\t\t});\n\t\t\t});\n\t\t\tstream.write('0,0,\\\\N,\"1970-01-02 00:00:00\"\\n1,1.5,\"1\",\"2050-01-01 00:00:00\"')\n\t\t\tstream.end ();\n\t\t});\n\n\t\tit (\"select data with FORMAT clause\", function () {\n\t\t\tvar ch = new ClickHouse ({host: host, port: port});\n\t\t\treturn ch.querying(\"SELECT 
* FROM t5 FORMAT Values\", {queryOptions: {database: dbName}})\n\t\t\t\t.then((data) => {\n\t\t\t\t\tassert.equal (data, `(0,0,NULL,'1970-01-02 00:00:00'),(1,1.5,'1','2050-01-01 00:00:00')`)\n\t\t\t\t})\n\t\t});\n\n\t\tit (\"select data with GET method and FORMAT clause\", function () {\n\t\t\tvar ch = new ClickHouse ({host: host, port: port, useQueryString: true});\n\t\t\treturn ch.querying(\"SELECT * FROM t5 FORMAT Values\", {queryOptions: {database: dbName}})\n\t\t\t\t.then((data) => {\n\t\t\t\t\tassert.equal (data, `(0,0,NULL,'1970-01-02 00:00:00'),(1,1.5,'1','2050-01-01 00:00:00')`)\n\t\t\t\t})\n\t\t});\n\n\t\tit (\"select data with GET method and format option\", function () {\n\t\t\tvar ch = new ClickHouse ({host: host, port: port, useQueryString: true});\n\t\t\treturn ch.querying(\"SELECT * FROM t5\", {queryOptions: {database: dbName}, format: 'Values'})\n\t\t\t\t.then((data) => {\n\t\t\t\t\tassert.equal (data, `(0,0,NULL,'1970-01-02 00:00:00'),(1,1.5,'1','2050-01-01 00:00:00')`)\n\t\t\t\t})\n\t\t});\n\n\tafter (function (done) {\n\n\t\tif (!dbCreated)\n\t\t\treturn done ();\n\n\t\tvar ch = new ClickHouse ({host: host, port: port});\n\t\tch.query (\"DROP DATABASE \" + dbName, function (err, result) {\n\t\t\tassert (!err);\n\n\t\t\t// console.log (result);\n\n\t\t\tdone ();\n\t\t});\n\t});\n\n});\n"
  },
  {
    "path": "test/06-encode-value.js",
    "content": "var assert = require ('assert');\nvar encodeValue = require ('../src/process-db-value').encodeValue;\n\ndescribe('encode value', function() {\n\tit('escaping string backslashes for TSV', function() {\n\t\tvar value = '{\"key1\":\\t\"{\\\"key2\\\":\\\"<some \\\\\"attr\\\\\">With tabulated \\\\t and line breaking \\\\n</some>\\\"}\"\\n}';\n\t\tvar expected = '{\"key1\":\\\\t\"{\"key2\":\"<some \\\\\\\\\"attr\\\\\\\\\">With tabulated \\\\\\\\t and line breaking \\\\\\\\n</some>\\\"}\"\\\\n}';\n\t\tvar actual = encodeValue(undefined, value, 'TSV');\n\n\t\tassert.equal(actual, expected);\n\t});\n});\n"
  },
  {
    "path": "test/07-compat.js",
    "content": "const ClickHouse = require(\"../src/clickhouse\");\nconst assert = require(\"assert\");\n\nconst DB_NAME = 'node_clickhouse_test_compat';\nconst noop = () => {}\n\ndescribe(\"api compatibility\", () => {\n  const host = process.env.CLICKHOUSE_HOST || '127.0.0.1';\n  const port = process.env.CLICKHOUSE_PORT || 8123;\n  const ch = new ClickHouse({host: host, port: port, queryOptions: {database: DB_NAME}});\n\n  before(() => (\n    new ClickHouse({host: host, port: port})\n      .querying(`CREATE DATABASE IF NOT EXISTS \"${DB_NAME}\"`)\n  ))\n\n  after(() => (\n    ch.querying(`DROP DATABASE IF EXISTS \"${DB_NAME}\"`)\n  ))\n\n  describe('should emit \"end\" event', () => {\n\n    before(function() {\n      return ch.querying('CREATE TABLE x (date Date) Engine=Memory')\n    })\n\n    it('on insert done', (done) => {\n      const chStream = ch.query('INSERT INTO x', { format: 'JSONEachRow' })\n      chStream.on('data', noop)\n      chStream.on('error', done)\n      chStream.on('end', () => done())\n\n      chStream.write({date: '2000-01-01'})\n      chStream.end()\n    });\n\n    describe('on SELECT done. With default format', () => {\n      it('HTTP GET', (done) => {\n        const stream = ch.query('SELECT * FROM system.numbers LIMIT 0', { readonly: true });\n        stream.on ('data', noop)\n        stream.on ('end', () => done());\n        stream.end()\n      });\n\n      it('HTTP POST', (done) => {\n        const stream = ch.query('SELECT * FROM system.numbers LIMIT 0', { readonly: false });\n        stream.on ('data', noop)\n        stream.on ('end', () => done());\n        stream.end()\n      });\n    })\n\n  })\n\n});\n"
  },
  {
    "path": "test/90-torture.js",
    "content": "var ClickHouse = require (\"../src/clickhouse\");\n\nvar assert = require (\"assert\");\n\nvar memwatch;\ntry {\n\tvar nodeMajorVersion = parseInt(process.version.substr(1));\n\tif (nodeMajorVersion >= 10) {\n\t\tmemwatch = require('memwatch');\n\t} else {\n\t\tmemwatch = require('memwatch-next');\n\t}\n} catch (err) {\n\tif (process.env.TORTURE) {\n\t\tconsole.error (\"For the torture test you should install memwatch (node >= 10) or memwatch-next (node 8)\");\n\t\tconsole.error (err);\n\t\tprocess.exit (1);\n\t}\n}\n\n// set the TORTURE environment variable to run this test suite\nvar method  = process.env.TORTURE ? it : it.skip;\nvar timeout = 60000;\n\nfunction checkUsageAndGC (used, baseline) {\n\tfunction inMB (v) {\n\t\treturn (v/Math.pow (2, 20)).toFixed (1) + 'MB';\n\t}\n\tvar usage = 'rss heapTotal heapUsed external'.split (' ').map (function (k) {\n\t\tvar delta = used[k] - baseline[k];\n\t\treturn k + ' ' + inMB (used[k]) + ' / +' + inMB (delta);\n\t});\n\n\tconsole.log (usage.join (' '));\n\n\tglobal.gc && gc ();\n}\n\ndescribe (\"torturing\", function () {\n\n\tglobal.gc && gc();\n\n\tvar server,\n\t\thost = process.env.CLICKHOUSE_HOST || '127.0.0.1',\n\t\tport = process.env.CLICKHOUSE_PORT || 8123,\n\t\tlastSuite,\n\t\tbaselineMemoryUsage = process.memoryUsage();\n\n\tbefore (function () {\n\t\tmemwatch && memwatch.on ('leak', function (info) {\n\t\t\tconsole.log (info);\n\t\t});\n\n\t\t// console.log (process.memoryUsage());\n\t})\n\n\tmethod (\"selects 1 million using async parser\", function (done) {\n\n\t\tthis.timeout (timeout);\n\n\t\tvar ch = new ClickHouse ({host: host, port: port, readonly: true});\n\t\tvar queryCount = 0;\n\t\tvar symbolsTransferred = 0;\n\n\t\tfunction runQuery () {\n\t\t\tvar stream = ch.query (\"SELECT number FROM system.numbers LIMIT 1000000\", function (err, result) {\n\t\t\t\tsymbolsTransferred += 
result.transferred;\n\t\t\t\tqueryCount ++;\n\t\t\t\tif (queryCount < 10)\n\t\t\t\t\treturn runQuery ();\n\t\t\t\tconsole.log ('symbols transferred:', symbolsTransferred);\n\t\t\t\tdone();\n\t\t\t});\n\n\t\t\tstream.on ('error', function (err) {\n\t\t\t\tdone (err);\n\t\t\t});\n\n\t\t\tstream.on ('data', function (row) {\n\t\t\t\t// just throw away that data\n\t\t\t});\n\t\t}\n\n\t\trunQuery ();\n\n\t\tlastSuite = 'async';\n\t});\n\n\t// enable this test separately\n\tit.skip (\"selects 1 million using sync parser\", function (done) {\n\n\t\tthis.timeout (timeout);\n\n\t\tvar ch = new ClickHouse ({host: host, port: port, readonly: true});\n\t\tvar queryCount = 0;\n\t\tvar symbolsTransferred = 0;\n\n\t\tfunction runQuery () {\n\t\t\tvar stream = ch.query (\"SELECT number FROM system.numbers LIMIT 1000000\", {syncParser: true}, function (err, result) {\n\t\t\t\tsymbolsTransferred += result.transferred;\n\t\t\t\tqueryCount ++;\n\t\t\t\tif (queryCount < 10)\n\t\t\t\t\treturn runQuery ();\n\t\t\t\tconsole.log ('symbols transferred:', symbolsTransferred);\n\t\t\t\tdone();\n\t\t\t});\n\n\t\t\tstream.on ('error', function (err) {\n\t\t\t\tdone (err);\n\t\t\t});\n\n\t\t\tstream.on ('data', function (row) {\n\t\t\t\t// just throw away that data\n\t\t\t});\n\t\t}\n\n\t\trunQuery ();\n\n\t\tlastSuite = 'sync';\n\t});\n\n\tmethod (\"selects system.columns using async parser #1\", function (done) {\n\n\t\tthis.timeout (timeout);\n\n\t\tvar ch = new ClickHouse ({host: host, port: port, readonly: true});\n\t\tvar queryCount = 0;\n\t\tvar symbolsTransferred = 0;\n\n\t\tfunction runQuery () {\n\t\t\tvar stream = ch.query (\"SELECT * FROM system.columns\", function (err, result) {\n\t\t\t\tsymbolsTransferred += result.transferred;\n\t\t\t\tqueryCount ++;\n\t\t\t\tif (queryCount < 10000)\n\t\t\t\t\treturn runQuery ();\n\t\t\t\tconsole.log ('symbols transferred:', symbolsTransferred);\n\t\t\t\tdone();\n\t\t\t});\n\n\t\t\tstream.on ('error', function (err) {\n\t\t\t\tdone 
(err);\n\t\t\t});\n\n\t\t\tstream.on ('data', function (row) {\n\t\t\t\t// just throw away that data\n\t\t\t});\n\t\t}\n\n\t\trunQuery ();\n\n\t\tlastSuite = 'async';\n\t});\n\n\tmethod (\"selects system.columns using sync parser #1\", function (done) {\n\n\t\tthis.timeout (timeout);\n\n\t\tvar ch = new ClickHouse ({host: host, port: port, readonly: true});\n\t\tvar queryCount = 0;\n\t\tvar symbolsTransferred = 0;\n\n\t\tfunction runQuery () {\n\t\t\tvar stream = ch.query (\"SELECT * FROM system.columns\", {syncParser: true}, function (err, result) {\n\t\t\t\tsymbolsTransferred += result.transferred;\n\t\t\t\tqueryCount ++;\n\t\t\t\tif (queryCount < 10000)\n\t\t\t\t\treturn runQuery ();\n\t\t\t\tconsole.log ('symbols transferred:', symbolsTransferred);\n\t\t\t\tdone();\n\t\t\t});\n\n\t\t\tstream.on ('error', function (err) {\n\t\t\t\tdone (err);\n\t\t\t});\n\n\t\t\tstream.on ('data', function (row) {\n\t\t\t\t// just throw away that data\n\t\t\t});\n\t\t}\n\n\t\trunQuery ();\n\n\t\tlastSuite = 'sync';\n\t});\n\n\tmethod (\"selects system.columns using async parser #2\", function (done) {\n\n\t\tthis.timeout (timeout);\n\n\t\tvar ch = new ClickHouse ({host: host, port: port, readonly: true});\n\t\tvar queryCount = 0;\n\t\tvar symbolsTransferred = 0;\n\n\t\tfunction runQuery () {\n\t\t\tvar stream = ch.query (\"SELECT * FROM system.columns\", function (err, result) {\n\t\t\t\tsymbolsTransferred += result.transferred;\n\t\t\t\tqueryCount ++;\n\t\t\t\tif (queryCount < 10000)\n\t\t\t\t\treturn runQuery ();\n\t\t\t\tconsole.log ('symbols transferred:', symbolsTransferred);\n\t\t\t\tdone();\n\t\t\t});\n\n\t\t\tstream.on ('error', function (err) {\n\t\t\t\tdone (err);\n\t\t\t});\n\n\t\t\tstream.on ('data', function (row) {\n\t\t\t\t// just throw away that data\n\t\t\t});\n\t\t}\n\n\t\trunQuery ();\n\n\t\tlastSuite = 'async';\n\t});\n\n\tmethod (\"selects system.columns using sync parser #2\", function (done) {\n\n\t\tthis.timeout (timeout);\n\n\t\tvar ch = new ClickHouse 
({host: host, port: port, readonly: true});\n\t\tvar queryCount = 0;\n\t\tvar symbolsTransferred = 0;\n\n\t\tfunction runQuery () {\n\t\t\tvar stream = ch.query (\"SELECT * FROM system.columns\", {syncParser: true}, function (err, result) {\n\t\t\t\tsymbolsTransferred += result.transferred;\n\t\t\t\tqueryCount ++;\n\t\t\t\tif (queryCount < 10000)\n\t\t\t\t\treturn runQuery ();\n\t\t\t\tconsole.log ('symbols transferred:', symbolsTransferred);\n\t\t\t\tdone();\n\t\t\t});\n\n\t\t\tstream.on ('error', function (err) {\n\t\t\t\tdone (err);\n\t\t\t});\n\n\t\t\tstream.on ('data', function (row) {\n\t\t\t\t// just throw away that data\n\t\t\t});\n\t\t}\n\n\t\trunQuery ();\n\n\t\tlastSuite = 'sync';\n\t});\n\n\tafterEach (function () {\n\n\t\tcheckUsageAndGC (process.memoryUsage(), baselineMemoryUsage)\n\n\t\t// console.log ('after', lastSuite,  process.memoryUsage());\n\n\t})\n\n});\n"
  }
]