[
  {
    "path": ".gitattributes",
    "content": "* text=auto eol=lf\n"
  },
  {
    "path": ".github/workflows/test.yml",
    "content": "name: Build Status\non:\n  push:\n    branches:\n      - master\n  pull_request:\n    branches:\n      - master\njobs:\n  build:\n    strategy:\n      matrix:\n        node-version: [lts/*]\n        os: [ubuntu-latest, macos-latest, windows-latest]\n    runs-on: ${{ matrix.os }}\n    steps:\n      - uses: actions/checkout@v2\n      - name: Use Node.js ${{ matrix.node-version }}\n        uses: actions/setup-node@v2\n        with:\n          node-version: ${{ matrix.node-version }}\n      - run: npm install\n      - run: npm test\n"
  },
  {
    "path": ".gitignore",
    "content": "node_modules\npackage-lock.json\ncoverage\nsandbox.js\nsandbox/\n*.cpy\n"
  },
  {
    "path": ".prettierrc",
    "content": "\"prettier-config-holepunch\"\n"
  },
  {
    "path": "LICENSE",
    "content": "The MIT License (MIT)\n\nCopyright (c) 2019 Mathias Buus\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in\nall copies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN\nTHE SOFTWARE.\n"
  },
  {
    "path": "README.md",
    "content": "# streamx\n\nAn iteration of the Node.js core streams with a series of improvements.\n\n```\nnpm install streamx\n```\n\n[![Build Status](https://github.com/streamxorg/streamx/workflows/Build%20Status/badge.svg)](https://github.com/streamxorg/streamx/actions?query=workflow%3A%22Build+Status%22)\n\n## Main improvements from Node.js core stream\n\n#### Proper lifecycle support.\n\nStreams have an `_open` function that is called before any read/write operation and a `_destroy`\nfunction that is always run as the last part of the stream.\n\nThis makes it easy to maintain state.\n\n#### Easy error handling\n\nFully integrates a `.destroy()` function. When called the stream will wait for any\npending operation to finish and call the stream destroy logic.\n\nClose is _always_ the last event emitted and `destroy` is always run.\n\n#### `pipe()` error handles\n\n`pipe` accepts a callback that is called when the pipeline is fully drained.\nIt also error handles the streams provided and destroys both streams if either\nof them fail.\n\n#### All streams are both binary and object mode streams\n\nA `map` function can be provided to map your input data into buffers\nor other formats. 
To indicate how much buffer space each data item takes\na `byteLength` function can be provided as well.\n\nThis removes the need for two modes of streams.\n\n#### Simplicity\n\nThis is a full rewrite, all contained in one file.\n\nLots of stream methods are simplified based on how I and devs I work with actually use streams in the wild.\n\n#### Backwards compat\n\nstreamx aims to be compatible with Node.js streams whenever it is reasonable to do so.\n\nThis means that streamx streams behave a lot like Node.js streams from the outside but still provide the\nimprovements above.\n\n#### Smaller browser footprint\n\nstreamx has a much smaller footprint when compiled for the browser:\n\n```\n$ for x in stream{,x}; do echo $x: $(browserify -r $x | wc -c) bytes; done\nstream: 173844 bytes\nstreamx: 46943 bytes\n```\n\nWith optimizations turned on, the difference is even more stark:\n\n```\n$ for x in stream{,x}; do echo $x: $(browserify -r $x -p tinyify | wc -c) bytes; done\nstream: 62649 bytes\nstreamx: 8460 bytes\n$ for x in stream{,x}; do echo $x: $(browserify -r $x -p tinyify | gzip | wc -c) \"bytes (gzipped)\"; done\nstream: 18053 bytes (gzipped)\nstreamx: 2806 bytes (gzipped)\n```\n\n#### AbortSignal support\n\nTo make it easier to integrate streams in an `async/await` flow, all streams support a `signal` option\nthat accepts an [`AbortSignal`](https://developer.mozilla.org/en-US/docs/Web/API/AbortSignal) as an\nalternative means to `.destroy` streams.\n\n## Usage\n\n```js\nconst { Readable } = require('streamx')\n\nconst rs = new Readable({\n  read(cb) {\n    this.push('Cool data')\n    cb(null)\n  }\n})\n\nrs.on('data', (data) => console.log('data:', data))\n```\n\n## API\n\nThis streamx package contains 4 streams similar to Node.js core.\n\n## Readable Stream\n\n#### `rs = new stream.Readable([options])`\n\nCreate a new readable stream.\n\nOptions include:\n\n```\n{\n  highWaterMark: 16384, // max buffer size in bytes\n  map: (data) => data, // optional 
function to map input data\n  byteLength: (data) => size, // optional function that calculates the byte size of input data\n  signal: abortController.signal, // optional AbortSignal that triggers `.destroy` on `abort`\n  eagerOpen: false // eagerly open the stream\n}\n```\n\nIn addition you can pass the `open`, `read`, and `destroy` functions as shorthands in\nthe constructor instead of overwriting the methods below.\n\nThe default byteLength function returns the byte length of buffers and `1024`\nfor any other object. This means the buffer will contain around 16 non-buffers\nor 16 KB worth of buffers when full if the defaults are used.\n\nIf you set highWaterMark to `0` then all read ahead buffering on the stream\nis disabled and it will only call `_read` when a user reads rather than ahead of time.\n\n#### `rs._read(cb)`\n\nThis function is called when the stream wants you to push new data.\nOverride this and add your own read logic.\nYou should call the callback when you are fully done with the read.\n\nCan also be set using `options.read` in the constructor.\n\nNote that this function differs from Node.js streams in that it takes\nthe \"read finished\" callback.\n\n#### `drained = rs.push(data)`\n\nPush new data to the stream. Returns `true` if the buffer is not full\nand you should push more data if you can.\n\nIf you call `rs.push(null)` you signal to the stream that no more\ndata will be pushed and that you want to end the stream.\n\n#### `data = rs.read()`\n\nRead a piece of data from the stream buffer. If the buffer is currently empty\n`null` will be returned and you should wait for `readable` to be emitted before\ntrying again. If the stream has been ended it will also return `null`.\n\nNote that this method differs from Node.js streams in that it does not accept\nan optional amount of bytes to consume.\n\n#### `rs.unshift(data)`\n\nAdd a piece of data to the front of the buffer. 
Use this if you read too much\ndata using the `rs.read()` function.\n\n#### `rs._open(cb)`\n\nThis function is called once before the first read is issued. Use this function\nto implement your own open logic.\n\nCan also be set using `options.open` in the constructor.\n\n#### `rs._destroy(cb)`\n\nThis function is called just before the stream is fully destroyed. You should\nuse this to implement whatever teardown logic you need. The final part of the\nstream life cycle is always to call destroy itself so this function will always\nbe called, whether the stream ends gracefully or forcefully.\n\nCan also be set using `options.destroy` in the constructor.\n\nNote that `_destroy` might be called without the open function having been called\nif no read was ever performed on the stream.\n\n#### `rs._predestroy()`\n\nA simple hook that is called as soon as the first `stream.destroy()` call is invoked.\n\nUse this in case you need to cancel pending reads (if possible) instead of waiting for them to finish.\n\nCan also be set using `options.predestroy` in the constructor.\n\n#### `rs.destroy([error])`\n\nForcefully destroy the stream. Will call `_destroy` as soon as all pending reads have finished.\nOnce the stream is fully destroyed `close` will be emitted.\n\nIf you pass an error, it will be emitted just before `close` is, signifying the reason\nthis stream was destroyed.\n\n#### `rs.pause()`\n\nPauses the stream. You will only need to call this if you want to pause a resumed stream.\n\nReturns this stream instance.\n\n#### `rs.resume()`\n\nWill start reading data from the stream as fast as possible.\n\nIf you do not call this, you need to use the `read()` method to read data, the `pipe()` method to\npipe the stream somewhere else, or a `data` handler.\n\nIf none of these options are used the stream will stay paused.\n\nReturns this stream instance.\n\n#### `rs.setEncoding(encoding)`\n\nSet an encoding to change how data is interpreted. E.g. 
`utf-8`.\n\n#### `bool = Readable.isPaused(rs)`\n\nReturns `true` if the stream is paused, else `false`.\n\n#### `writableStream = rs.pipe(writableStream, [callback])`\n\nEfficiently pipe the readable stream to a writable stream (can be a Node.js core stream or a stream from this package).\nIf you provide a callback, it is called when the pipeline has fully finished, with an optional error in case\nit failed.\n\nTo cancel the pipeline, destroy either of the streams.\n\n#### `rs.on('readable')`\n\nEmitted when data is pushed to the stream if the buffer was previously empty.\n\n#### `rs.on('data', data)`\n\nEmitted when data is being read from the stream. If you attach a data handler you are implicitly resuming the stream.\n\n#### `rs.on('end')`\n\nEmitted when the readable stream has ended and no data is left in its buffer.\n\n#### `rs.on('close')`\n\nEmitted when the readable stream has fully closed (i.e. its destroy function has completed)\n\n#### `rs.on('error', err)`\n\nEmitted if any of the stream operations fail with an error. `close` is always emitted right after this.\n\n#### `rs.on('piping', dest)`\n\nEmitted when the readable stream is piping to a destination.\n\n#### `rs.on('open')`\n\nEmitted after `rs._open(cb)` execution.\n\n#### `rs.destroying`\n\nBoolean property indicating whether or not this stream has started to be destroyed.\n\n#### `rs.destroyed`\n\nBoolean property indicating whether or not this stream has been destroyed.\n\n#### `rs.readable`\n\nReturns `true` if the stream is an active streamx readable stream. 
Returns `undefined` if not.\n\n#### `bool = Readable.isBackpressured(rs)`\n\nStatic method to check if a readable stream is currently under backpressure.\n\n#### `stream = Readable.from(arrayOrBufferOrStringOrAsyncIterator)`\n\nStatic method to turn an array, buffer, string, or AsyncIterator into a readable stream.\n\n## Writable Stream\n\n#### `ws = new stream.Writable([options])`\n\nCreate a new writable stream.\n\nOptions include:\n\n```\n{\n  highWaterMark: 16384, // max buffer size in bytes\n  map: (data) => data, // optional function to map input data\n  byteLength: (data) => size, // optional function that calculates the byte size of input data\n  signal: abortController.signal // optional AbortSignal that triggers `.destroy` on `abort`\n}\n```\n\nIn addition you can pass the `open`, `write`, `final`, and `destroy` functions as shorthands in\nthe constructor instead of overwriting the methods below.\n\nThe default byteLength function returns the byte length of buffers and `1024`\nfor any other object. This means the buffer will contain around 16 non-buffers\nor 16 KB worth of buffers when full if the defaults are used.\n\n#### `ws._open(cb)`\n\nThis function is called once before the first write is issued. Use this function\nto implement your own open logic.\n\nCan also be set using `options.open` in the constructor.\n\n#### `ws._destroy(cb)`\n\nThis function is called just before the stream is fully destroyed. You should\nuse this to implement whatever teardown logic you need. 
The final part of the\nstream life cycle is always to call destroy itself so this function will always\nbe called, whether the stream ends gracefully or forcefully.\n\nCan also be set using `options.destroy` in the constructor.\n\nNote that `_destroy` might be called without the open function having been called\nif no write was ever performed on the stream.\n\n#### `ws._predestroy()`\n\nA simple hook that is called as soon as the first `stream.destroy()` call is invoked.\n\nUse this in case you need to cancel pending writes (if possible) instead of waiting for them to finish.\n\nCan also be set using `options.predestroy` in the constructor.\n\n#### `ws.destroy([error])`\n\nForcefully destroy the stream. Will call `_destroy` as soon as all pending writes have finished.\nOnce the stream is fully destroyed `close` will be emitted.\n\nIf you pass an error, it will be emitted just before `close` is, signifying the reason\nthis stream was destroyed.\n\n#### `drained = ws.write(data)`\n\nWrite a piece of data to the stream. Returns `true` if the stream buffer is not full and you\nshould keep writing to it if you can. If `false` is returned the stream will emit `drain`\nonce its buffer is fully drained.\n\n#### `ws._write(data, callback)`\n\nThis function is called when the stream wants to write some data. Use this to implement your own\nwrite logic. When done, call the callback and the stream will call it again if more data exists in the buffer.\n\nCan also be set using `options.write` in the constructor.\n\n#### `ws._writev(batch, callback)`\n\nSimilar to `_write` but passes an array of all data in the current write buffer instead of the oldest one.\nUseful if the destination you are writing the data to supports batching.\n\nCan also be set using `options.writev` in the constructor.\n\n#### `ws.end()`\n\nGracefully end the writable stream. 
Call this when you no longer want to write to the stream.\n\nOnce all writes have been fully drained `finish` will be emitted.\n\nReturns this stream instance.\n\n#### `ws.cork()`\n\nBuffer written data in memory. Useful for accumulating small chunks of data into a batch to be consumed by `writev`.\n\n#### `ws.uncork()`\n\nDisable `ws.cork()`.\n\n#### `ws._final(callback)`\n\nThis function is called just before `finish` is emitted, i.e. when all writes have flushed and `ws.end()`\nhas been called. Use this to implement any logic that should happen after all writes but before finish.\n\nCan also be set using `options.final` in the constructor.\n\n#### `ws.on('finish')`\n\nEmitted when the stream has been ended and all writes have been drained.\n\n#### `ws.on('close')`\n\nEmitted when the writable stream has fully closed (i.e. its destroy function has completed)\n\n#### `ws.on('error', err)`\n\nEmitted if any of the stream operations fail with an error. `close` is always emitted right after this.\n\n#### `ws.on('pipe', src)`\n\nEmitted when a readable stream is being piped to the writable one.\n\n#### `ws.on('open')`\n\nEmitted after `ws._open(cb)` execution.\n\n#### `ws.on('drain')`\n\nEmitted after all data was drained if `ws.write(data)` returned `false`.\n\n#### `ws.destroying`\n\nBoolean property indicating whether or not this stream has started to be destroyed.\n\n#### `ws.destroyed`\n\nBoolean property indicating whether or not this stream has been destroyed.\n\n#### `ws.writable`\n\nReturns `true` if the stream is an active streamx writable stream. 
Returns `undefined` if not.\n\n#### `bool = Writable.isBackpressured(ws)`\n\nStatic method to check if a writable stream is currently under backpressure.\n\n#### `bool = await Writable.drained(ws)`\n\nStatic helper to wait for a stream to drain the currently queued writes.\nReturns `true` if they were drained, and `false` if the stream was destroyed first.\n\n## Duplex Stream\n\n#### `s = new stream.Duplex([options])`\n\nA duplex stream is a stream that is both readable and writable.\n\nSince JS does not support multiple inheritance it inherits directly from Readable\nbut implements the Writable API as well.\n\nIf you want to provide only a map function for the readable side use `mapReadable` instead.\nIf you want to provide only a byteLength function for the readable side use `byteLengthReadable` instead.\n\nSame goes for the writable side but using `mapWritable` and `byteLengthWritable` instead.\n\n## Transform Stream\n\nA Transform stream is a duplex stream with a `._transform` template method that allows you to\nasynchronously map the input to a different output.\n\nThe transform stream overrides the `_write` and `_read` operations of `Readable` and `Writable` but\nstill allows the setting of these options in the constructor. Usually it is unnecessary to pass\nin `read` or `write`/`writev` or to override the corresponding `._read`, `._write` or `._writev` operation.\n\n#### `ts = new stream.Transform([options])`\n\nA transform stream is a duplex stream that maps the data written to it and emits that as readable data.\n\nHas the same options as a duplex stream except you can provide a `transform` function as well.\n\n#### `ts._transform(data, callback)`\n\nTransform the incoming data. 
Call `callback(null, mappedData)` or use `ts.push(mappedData)` to\nreturn data to the readable side of the stream.\n\nBy default the transform function just re-emits the incoming data, making it act as a pass-through stream.\n\n## Pipeline\n\n`pipeline` allows you to stream from a readable through a set of duplex streams to a writable stream.\n\n```js\nconst { pipeline, Readable, Transform, Writable } = require('streamx')\nconst lastStream = pipeline(\n  Readable.from([1, 2, 3]),\n  new Transform({\n    transform (from, cb) {\n      this.push(from.toString())\n      cb()\n    }\n  }),\n  new Writable({\n    write (data, cb) {\n      console.log(data)\n      cb()\n    }\n  }),\n  error => {\n    // Callback once write has finished\n  }\n)\n```\n\n#### `lastStream = stream.pipeline(...streams, [done])`\n\nPipe all streams together and return the last stream piped to.\nWhen the last stream finishes, the pipeline has ended successfully.\n\nIf any of the streams error, whether they are Node.js core streams\nor streamx streams, all streams in the pipeline are shut down.\n\nOptionally you can pass a done callback to know when the pipeline is done.\n\n#### `promise = stream.pipelinePromise(...streams)`\n\nSame as normal pipeline except instead of returning the last stream it returns\na promise representing the done callback. 
Note that you should error handle this\npromise if you use this version.\n\n## Helpers\n\n#### `bool = isStream(stream)`\n\n#### `bool = isStreamx(stream)`\n\n#### `bool = isDisturbed(stream)`\n\nIndicates if the stream has been opened or started to be destroyed.\n\n#### `bool = isEnding(stream)`\n\nIndicates if a readable stream has started its ending process.\n\n#### `bool = isEnded(stream)`\n\nIndicates if a readable stream has ended and all its reads have finished.\n\n#### `bool = isFinishing(stream)`\n\nIndicates if a writable stream has started its ending process.\n\n#### `bool = isFinished(stream)`\n\nIndicates if a writable stream has ended and all its writes have finished.\n\n#### `err = getStreamError(stream, [options])`\n\nReturns the stream's error, or `null` if the stream has no errors.\n\n## Utilities\n\nStreamx aims to be minimal and stable. It therefore only contains a minimal set of utilities.\nTo help you discover other modules that help you build streamx apps, we link some useful utilities here:\n\n- [stream-composer](https://github.com/mafintosh/stream-composer) - Compose streams like Node's `stream.compose` and the `duplexify` and `pumpify` modules.\n- [teex](https://github.com/mafintosh/teex) - Clone a readable stream into multiple new readable instances.\n- [merge-readable](https://github.com/holepunchto/merge-readable) - Merge multiple readers into a new readable with optional mapping of data\n\n## Contributing\n\nIf you want to help contribute to streamx, a good way to start is to help write more test\ncases, compatibility tests, documentation, or performance benchmarks.\n\nIf in doubt open an issue :)\n\n## License\n\nMIT\n"
  },
  {
    "path": "SECURITY.md",
    "content": "## Security contact information\n\nTo report a security vulnerability, please use the\n[Tidelift security contact](https://tidelift.com/security).\nTidelift will coordinate the fix and disclosure.\n"
  },
  {
    "path": "examples/fs.js",
    "content": "const fs = require('fs')\nconst { Writable, Readable } = require('../')\n\nclass FileWriteStream extends Writable {\n  constructor(filename, mode) {\n    super()\n    this.filename = filename\n    this.mode = mode\n    this.fd = 0\n  }\n\n  _open(cb) {\n    fs.open(this.filename, this.mode, (err, fd) => {\n      if (err) return cb(err)\n      this.fd = fd\n      cb(null)\n    })\n  }\n\n  _write(data, cb) {\n    fs.write(this.fd, data, 0, data.length, null, (err, written) => {\n      if (err) return cb(err)\n      if (written !== data.length) return this._write(data.slice(written), cb)\n      cb(null)\n    })\n  }\n\n  _destroy(cb) {\n    if (!this.fd) return cb()\n    fs.close(this.fd, cb)\n  }\n}\n\nclass FileReadStream extends Readable {\n  constructor(filename) {\n    super()\n    this.filename = filename\n    this.fd = 0\n  }\n\n  _open(cb) {\n    fs.open(this.filename, 'r', (err, fd) => {\n      if (err) return cb(err)\n      this.fd = fd\n      cb(null)\n    })\n  }\n\n  _read(cb) {\n    let data = Buffer.alloc(16 * 1024)\n\n    fs.read(this.fd, data, 0, data.length, null, (err, read) => {\n      if (err) return cb(err)\n      if (read !== data.length) data = data.slice(0, read)\n      this.push(data.length ? data : null)\n      cb(null)\n    })\n  }\n\n  _destroy(cb) {\n    if (!this.fd) return cb()\n    fs.close(this.fd, cb)\n  }\n}\n\n// copy this file as an example\n\nconst rs = new FileReadStream(__filename)\nconst ws = new FileWriteStream(`${__filename}.cpy`, 'w')\n\nrs.pipe(ws, function (err) {\n  console.log('file copied', err)\n})\n"
  },
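  {
    "path": "examples/transform.js",
    "content": "// A minimal sketch of the `pipeline` + `Transform` flow described in the README.\n// NOTE: this file (examples/transform.js) is an illustrative addition, not part of\n// the original repo; it assumes only the documented public streamx API.\nconst { pipeline, Readable, Transform, Writable } = require('../')\n\npipeline(\n  Readable.from([1, 2, 3]),\n  new Transform({\n    transform(num, cb) {\n      // map each number to a labelled string on the readable side\n      cb(null, `item ${num}`)\n    }\n  }),\n  new Writable({\n    write(data, cb) {\n      console.log(data)\n      cb(null)\n    }\n  }),\n  (err) => {\n    // called once when the pipeline is fully done (or failed)\n    console.log('pipeline done', err)\n  }\n)\n"
  },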
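  {
    "path": "examples/backpressure.js",
    "content": "// A minimal sketch of the write/drain backpressure cycle described in the README:\n// write() returns false once the buffer hits highWaterMark, and 'drain' fires\n// when it empties again. NOTE: this file (examples/backpressure.js) is an\n// illustrative addition, not part of the original repo.\nconst { Writable } = require('../')\n\nconst ws = new Writable({\n  write(data, cb) {\n    // simulate a slow destination so the buffer can fill up\n    setTimeout(cb, 10)\n  }\n})\n\nlet i = 0\nfunction writeSome() {\n  while (i < 100) {\n    // write() returning false means the buffer is full; wait for 'drain'\n    if (ws.write(Buffer.alloc(4096, i++)) === false) {\n      ws.once('drain', writeSome)\n      return\n    }\n  }\n  ws.end()\n}\n\nwriteSome()\nws.on('finish', () => console.log('all writes drained'))\n"
  },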
  {
    "path": "index.js",
    "content": "const { EventEmitter } = require('events-universal')\nconst FIFO = require('fast-fifo')\nconst TextDecoder = require('text-decoder')\n\nconst StreamError = require('./lib/errors.js')\n\n// if we do a future major, expect queue microtask to be there always, for now a bit defensive\nconst qmt =\n  typeof queueMicrotask === 'undefined' ? (fn) => global.process.nextTick(fn) : queueMicrotask\n\n// 29 bits used total (4 from shared, 14 from read, and 11 from write)\nconst MAX = (1 << 29) - 1\n\n// Shared state\nconst OPENING = 0b0001\nconst PREDESTROYING = 0b0010\nconst DESTROYING = 0b0100\nconst DESTROYED = 0b1000\n\nconst NOT_OPENING = MAX ^ OPENING\nconst NOT_PREDESTROYING = MAX ^ PREDESTROYING\n\n// Read state (4 bit offset from shared state)\nconst READ_ACTIVE = 0b00000000000001 << 4\nconst READ_UPDATING = 0b00000000000010 << 4\nconst READ_PRIMARY = 0b00000000000100 << 4\nconst READ_QUEUED = 0b00000000001000 << 4\nconst READ_RESUMED = 0b00000000010000 << 4\nconst READ_PIPE_DRAINED = 0b00000000100000 << 4\nconst READ_ENDING = 0b00000001000000 << 4\nconst READ_EMIT_DATA = 0b00000010000000 << 4\nconst READ_EMIT_READABLE = 0b00000100000000 << 4\nconst READ_EMITTED_READABLE = 0b00001000000000 << 4\nconst READ_DONE = 0b00010000000000 << 4\nconst READ_NEXT_TICK = 0b00100000000000 << 4\nconst READ_NEEDS_PUSH = 0b01000000000000 << 4\nconst READ_READ_AHEAD = 0b10000000000000 << 4\n\n// Combined read state\nconst READ_FLOWING = READ_RESUMED | READ_PIPE_DRAINED\nconst READ_ACTIVE_AND_NEEDS_PUSH = READ_ACTIVE | READ_NEEDS_PUSH\nconst READ_PRIMARY_AND_ACTIVE = READ_PRIMARY | READ_ACTIVE\nconst READ_EMIT_READABLE_AND_QUEUED = READ_EMIT_READABLE | READ_QUEUED\nconst READ_RESUMED_READ_AHEAD = READ_RESUMED | READ_READ_AHEAD\n\nconst READ_NOT_ACTIVE = MAX ^ READ_ACTIVE\nconst READ_NON_PRIMARY = MAX ^ READ_PRIMARY\nconst READ_NON_PRIMARY_AND_PUSHED = MAX ^ (READ_PRIMARY | READ_NEEDS_PUSH)\nconst READ_PUSHED = MAX ^ READ_NEEDS_PUSH\nconst READ_PAUSED = MAX ^ 
READ_RESUMED\nconst READ_NOT_QUEUED = MAX ^ (READ_QUEUED | READ_EMITTED_READABLE)\nconst READ_NOT_ENDING = MAX ^ READ_ENDING\nconst READ_PIPE_NOT_DRAINED = MAX ^ READ_FLOWING\nconst READ_NOT_NEXT_TICK = MAX ^ READ_NEXT_TICK\nconst READ_NOT_UPDATING = MAX ^ READ_UPDATING\nconst READ_NO_READ_AHEAD = MAX ^ READ_READ_AHEAD\nconst READ_PAUSED_NO_READ_AHEAD = MAX ^ READ_RESUMED_READ_AHEAD\n\n// Write state (18 bit offset, 4 bit offset from shared state and 14 from read state)\nconst WRITE_ACTIVE = 0b00000000001 << 18\nconst WRITE_UPDATING = 0b00000000010 << 18\nconst WRITE_PRIMARY = 0b00000000100 << 18\nconst WRITE_QUEUED = 0b00000001000 << 18\nconst WRITE_UNDRAINED = 0b00000010000 << 18\nconst WRITE_DONE = 0b00000100000 << 18\nconst WRITE_EMIT_DRAIN = 0b00001000000 << 18\nconst WRITE_NEXT_TICK = 0b00010000000 << 18\nconst WRITE_WRITING = 0b00100000000 << 18\nconst WRITE_FINISHING = 0b01000000000 << 18\nconst WRITE_CORKED = 0b10000000000 << 18\n\nconst WRITE_NOT_ACTIVE = MAX ^ (WRITE_ACTIVE | WRITE_WRITING)\nconst WRITE_NON_PRIMARY = MAX ^ WRITE_PRIMARY\nconst WRITE_NOT_FINISHING = MAX ^ (WRITE_ACTIVE | WRITE_FINISHING)\nconst WRITE_DRAINED = MAX ^ WRITE_UNDRAINED\nconst WRITE_NOT_QUEUED = MAX ^ WRITE_QUEUED\nconst WRITE_NOT_NEXT_TICK = MAX ^ WRITE_NEXT_TICK\nconst WRITE_NOT_UPDATING = MAX ^ WRITE_UPDATING\nconst WRITE_NOT_CORKED = MAX ^ WRITE_CORKED\n\n// Combined shared state\nconst ACTIVE = READ_ACTIVE | WRITE_ACTIVE\nconst NOT_ACTIVE = MAX ^ ACTIVE\nconst DONE = READ_DONE | WRITE_DONE\nconst DESTROY_STATUS = DESTROYING | DESTROYED | PREDESTROYING\nconst OPEN_STATUS = DESTROY_STATUS | OPENING\nconst AUTO_DESTROY = DESTROY_STATUS | DONE\nconst NON_PRIMARY = WRITE_NON_PRIMARY & READ_NON_PRIMARY\nconst ACTIVE_OR_TICKING = WRITE_NEXT_TICK | READ_NEXT_TICK\nconst TICKING = ACTIVE_OR_TICKING & NOT_ACTIVE\nconst IS_OPENING = OPEN_STATUS | TICKING\n\n// Combined shared state and read state\nconst READ_PRIMARY_STATUS = OPEN_STATUS | READ_ENDING | READ_DONE\nconst READ_STATUS = 
OPEN_STATUS | READ_DONE | READ_QUEUED\nconst READ_ENDING_STATUS = OPEN_STATUS | READ_ENDING | READ_QUEUED\nconst READ_READABLE_STATUS = OPEN_STATUS | READ_EMIT_READABLE | READ_QUEUED | READ_EMITTED_READABLE\nconst SHOULD_NOT_READ =\n  OPEN_STATUS | READ_ACTIVE | READ_ENDING | READ_DONE | READ_NEEDS_PUSH | READ_READ_AHEAD\nconst READ_BACKPRESSURE_STATUS = DESTROY_STATUS | READ_ENDING | READ_DONE\nconst READ_UPDATE_SYNC_STATUS = READ_UPDATING | OPEN_STATUS | READ_NEXT_TICK | READ_PRIMARY\nconst READ_NEXT_TICK_OR_OPENING = READ_NEXT_TICK | OPENING\n\n// Combined write state\nconst WRITE_PRIMARY_STATUS = OPEN_STATUS | WRITE_FINISHING | WRITE_DONE\nconst WRITE_QUEUED_AND_UNDRAINED = WRITE_QUEUED | WRITE_UNDRAINED\nconst WRITE_QUEUED_AND_ACTIVE = WRITE_QUEUED | WRITE_ACTIVE\nconst WRITE_DRAIN_STATUS = WRITE_QUEUED | WRITE_UNDRAINED | OPEN_STATUS | WRITE_ACTIVE\nconst WRITE_STATUS = OPEN_STATUS | WRITE_ACTIVE | WRITE_QUEUED | WRITE_CORKED\nconst WRITE_PRIMARY_AND_ACTIVE = WRITE_PRIMARY | WRITE_ACTIVE\nconst WRITE_ACTIVE_AND_WRITING = WRITE_ACTIVE | WRITE_WRITING\nconst WRITE_FINISHING_STATUS = OPEN_STATUS | WRITE_FINISHING | WRITE_QUEUED_AND_ACTIVE | WRITE_DONE\nconst WRITE_BACKPRESSURE_STATUS = WRITE_UNDRAINED | DESTROY_STATUS | WRITE_FINISHING | WRITE_DONE\nconst WRITE_UPDATE_SYNC_STATUS = WRITE_UPDATING | OPEN_STATUS | WRITE_NEXT_TICK | WRITE_PRIMARY\nconst WRITE_DROP_DATA = WRITE_FINISHING | WRITE_DONE | DESTROY_STATUS\n\nconst asyncIterator = Symbol.asyncIterator || Symbol('asyncIterator')\n\nclass WritableState {\n  constructor(\n    stream,\n    { highWaterMark = 16384, map = null, mapWritable, byteLength, byteLengthWritable } = {}\n  ) {\n    this.stream = stream\n    this.queue = new FIFO()\n    this.highWaterMark = highWaterMark\n    this.buffered = 0\n    this.error = null\n    this.pipeline = null\n    this.drains = null // if we add more seldom-used helpers we might move them into a subobject so it's a single ptr\n    this.byteLength = byteLengthWritable || 
byteLength || defaultByteLength\n    this.map = mapWritable || map\n    this.afterWrite = afterWrite.bind(this)\n    this.afterUpdateNextTick = updateWriteNT.bind(this)\n  }\n\n  get ending() {\n    return (this.stream._duplexState & WRITE_FINISHING) !== 0\n  }\n\n  get ended() {\n    return (this.stream._duplexState & WRITE_DONE) !== 0\n  }\n\n  push(data) {\n    if ((this.stream._duplexState & WRITE_DROP_DATA) !== 0) return false\n    if (this.map !== null) data = this.map(data)\n\n    this.buffered += this.byteLength(data)\n    this.queue.push(data)\n\n    if (this.buffered < this.highWaterMark) {\n      this.stream._duplexState |= WRITE_QUEUED\n      return true\n    }\n\n    this.stream._duplexState |= WRITE_QUEUED_AND_UNDRAINED\n    return false\n  }\n\n  shift() {\n    const data = this.queue.shift()\n\n    this.buffered -= this.byteLength(data)\n    if (this.buffered === 0) this.stream._duplexState &= WRITE_NOT_QUEUED\n\n    return data\n  }\n\n  end(data) {\n    if (typeof data === 'function') {\n      this.stream.once('finish', data)\n    } else if (data !== undefined && data !== null) {\n      this.push(data)\n    }\n\n    this.stream._duplexState = (this.stream._duplexState | WRITE_FINISHING) & WRITE_NON_PRIMARY\n  }\n\n  autoBatch(data, cb) {\n    const buffer = []\n    const stream = this.stream\n\n    buffer.push(data)\n    while ((stream._duplexState & WRITE_STATUS) === WRITE_QUEUED_AND_ACTIVE) {\n      buffer.push(stream._writableState.shift())\n    }\n\n    if ((stream._duplexState & OPEN_STATUS) !== 0) return cb(null)\n    stream._writev(buffer, cb)\n  }\n\n  update() {\n    const stream = this.stream\n\n    stream._duplexState |= WRITE_UPDATING\n\n    do {\n      while ((stream._duplexState & WRITE_STATUS) === WRITE_QUEUED) {\n        const data = this.shift()\n        stream._duplexState |= WRITE_ACTIVE_AND_WRITING\n        stream._write(data, this.afterWrite)\n      }\n\n      if ((stream._duplexState & WRITE_PRIMARY_AND_ACTIVE) === 0) 
this.updateNonPrimary()\n    } while (this.continueUpdate() === true)\n\n    stream._duplexState &= WRITE_NOT_UPDATING\n  }\n\n  updateNonPrimary() {\n    const stream = this.stream\n\n    if ((stream._duplexState & WRITE_FINISHING_STATUS) === WRITE_FINISHING) {\n      stream._duplexState = stream._duplexState | WRITE_ACTIVE\n      stream._final(afterFinal.bind(this))\n      return\n    }\n\n    if ((stream._duplexState & DESTROY_STATUS) === DESTROYING) {\n      if ((stream._duplexState & ACTIVE_OR_TICKING) === 0) {\n        stream._duplexState |= ACTIVE\n        stream._destroy(afterDestroy.bind(this))\n      }\n      return\n    }\n\n    if ((stream._duplexState & IS_OPENING) === OPENING) {\n      stream._duplexState = (stream._duplexState | ACTIVE) & NOT_OPENING\n      stream._open(afterOpen.bind(this))\n    }\n  }\n\n  continueUpdate() {\n    if ((this.stream._duplexState & WRITE_NEXT_TICK) === 0) return false\n    this.stream._duplexState &= WRITE_NOT_NEXT_TICK\n    return true\n  }\n\n  updateCallback() {\n    if ((this.stream._duplexState & WRITE_UPDATE_SYNC_STATUS) === WRITE_PRIMARY) {\n      this.update()\n    } else {\n      this.updateNextTick()\n    }\n  }\n\n  updateNextTick() {\n    if ((this.stream._duplexState & WRITE_NEXT_TICK) !== 0) return\n    this.stream._duplexState |= WRITE_NEXT_TICK\n    if ((this.stream._duplexState & WRITE_UPDATING) === 0) qmt(this.afterUpdateNextTick)\n  }\n}\n\nclass ReadableState {\n  constructor(\n    stream,\n    { highWaterMark = 16384, map = null, mapReadable, byteLength, byteLengthReadable } = {}\n  ) {\n    this.stream = stream\n    this.queue = new FIFO()\n    this.highWaterMark = highWaterMark === 0 ? 
1 : highWaterMark\n    this.buffered = 0\n    this.readAhead = highWaterMark > 0\n    this.error = null\n    this.pipeline = null\n    this.byteLength = byteLengthReadable || byteLength || defaultByteLength\n    this.map = mapReadable || map\n    this.pipeTo = null\n    this.afterRead = afterRead.bind(this)\n    this.afterUpdateNextTick = updateReadNT.bind(this)\n  }\n\n  get ending() {\n    return (this.stream._duplexState & READ_ENDING) !== 0\n  }\n\n  get ended() {\n    return (this.stream._duplexState & READ_DONE) !== 0\n  }\n\n  pipe(pipeTo, cb) {\n    if (this.pipeTo !== null) throw new Error('Can only pipe to one destination')\n    if (typeof cb !== 'function') cb = null\n\n    this.stream._duplexState |= READ_PIPE_DRAINED\n    this.pipeTo = pipeTo\n    this.pipeline = new Pipeline(this.stream, pipeTo, cb)\n\n    if (cb) this.stream.on('error', noop) // We already error handle this so suppress crashes\n\n    if (isStreamx(pipeTo)) {\n      pipeTo._writableState.pipeline = this.pipeline\n      if (cb) pipeTo.on('error', noop) // We already error handle this so suppress crashes\n      pipeTo.on('finish', this.pipeline.finished.bind(this.pipeline)) // TODO: just call finished from pipeTo itself\n    } else {\n      const onerror = this.pipeline.done.bind(this.pipeline, pipeTo)\n      const onclose = this.pipeline.done.bind(this.pipeline, pipeTo, null) // onclose has a weird bool arg\n      pipeTo.on('error', onerror)\n      pipeTo.on('close', onclose)\n      pipeTo.on('finish', this.pipeline.finished.bind(this.pipeline))\n    }\n\n    pipeTo.on('drain', afterDrain.bind(this))\n    this.stream.emit('piping', pipeTo)\n    pipeTo.emit('pipe', this.stream)\n  }\n\n  push(data) {\n    const stream = this.stream\n\n    if (data === null) {\n      this.highWaterMark = 0\n      stream._duplexState = (stream._duplexState | READ_ENDING) & READ_NON_PRIMARY_AND_PUSHED\n      return false\n    }\n\n    if (this.map !== null) {\n      data = this.map(data)\n      if (data === 
null) {\n        stream._duplexState &= READ_PUSHED\n        return this.buffered < this.highWaterMark\n      }\n    }\n\n    this.buffered += this.byteLength(data)\n    this.queue.push(data)\n\n    stream._duplexState = (stream._duplexState | READ_QUEUED) & READ_PUSHED\n\n    return this.buffered < this.highWaterMark\n  }\n\n  shift() {\n    const data = this.queue.shift()\n\n    this.buffered -= this.byteLength(data)\n    if (this.buffered === 0) {\n      this.stream._duplexState &= READ_NOT_QUEUED\n    }\n\n    return data\n  }\n\n  unshift(data) {\n    const pending = [this.map !== null ? this.map(data) : data]\n    while (this.buffered > 0) pending.push(this.shift())\n\n    for (let i = 0; i < pending.length - 1; i++) {\n      const data = pending[i]\n      this.buffered += this.byteLength(data)\n      this.queue.push(data)\n    }\n\n    this.push(pending[pending.length - 1])\n  }\n\n  read() {\n    const stream = this.stream\n\n    if ((stream._duplexState & READ_STATUS) === READ_QUEUED) {\n      const data = this.shift()\n\n      if (this.pipeTo !== null && this.pipeTo.write(data) === false) {\n        stream._duplexState &= READ_PIPE_NOT_DRAINED\n      }\n\n      if ((stream._duplexState & READ_EMIT_DATA) !== 0) {\n        stream.emit('data', data)\n      }\n\n      return data\n    }\n\n    if (this.readAhead === false) {\n      stream._duplexState |= READ_READ_AHEAD\n      this.updateNextTick()\n    }\n\n    return null\n  }\n\n  drain() {\n    const stream = this.stream\n\n    while (\n      (stream._duplexState & READ_STATUS) === READ_QUEUED &&\n      (stream._duplexState & READ_FLOWING) !== 0\n    ) {\n      const data = this.shift()\n\n      if (this.pipeTo !== null && this.pipeTo.write(data) === false) {\n        stream._duplexState &= READ_PIPE_NOT_DRAINED\n      }\n\n      if ((stream._duplexState & READ_EMIT_DATA) !== 0) {\n        stream.emit('data', data)\n      }\n    }\n  }\n\n  update() {\n    const stream = this.stream\n\n    
stream._duplexState |= READ_UPDATING\n\n    do {\n      this.drain()\n\n      while (\n        this.buffered < this.highWaterMark &&\n        (stream._duplexState & SHOULD_NOT_READ) === READ_READ_AHEAD\n      ) {\n        stream._duplexState |= READ_ACTIVE_AND_NEEDS_PUSH\n        stream._read(this.afterRead)\n        this.drain()\n      }\n\n      if ((stream._duplexState & READ_READABLE_STATUS) === READ_EMIT_READABLE_AND_QUEUED) {\n        stream._duplexState |= READ_EMITTED_READABLE\n        stream.emit('readable')\n      }\n\n      if ((stream._duplexState & READ_PRIMARY_AND_ACTIVE) === 0) {\n        this.updateNonPrimary()\n      }\n    } while (this.continueUpdate() === true)\n\n    stream._duplexState &= READ_NOT_UPDATING\n  }\n\n  updateNonPrimary() {\n    const stream = this.stream\n\n    if ((stream._duplexState & READ_ENDING_STATUS) === READ_ENDING) {\n      stream._duplexState = (stream._duplexState | READ_DONE) & READ_NOT_ENDING\n      stream.emit('end')\n\n      if ((stream._duplexState & AUTO_DESTROY) === DONE) {\n        stream._duplexState |= DESTROYING\n      }\n\n      if (this.pipeTo !== null) {\n        this.pipeTo.end()\n      }\n    }\n\n    if ((stream._duplexState & DESTROY_STATUS) === DESTROYING) {\n      if ((stream._duplexState & ACTIVE_OR_TICKING) === 0) {\n        stream._duplexState |= ACTIVE\n        stream._destroy(afterDestroy.bind(this))\n      }\n      return\n    }\n\n    if ((stream._duplexState & IS_OPENING) === OPENING) {\n      stream._duplexState = (stream._duplexState | ACTIVE) & NOT_OPENING\n      stream._open(afterOpen.bind(this))\n    }\n  }\n\n  continueUpdate() {\n    if ((this.stream._duplexState & READ_NEXT_TICK) === 0) return false\n    this.stream._duplexState &= READ_NOT_NEXT_TICK\n    return true\n  }\n\n  updateCallback() {\n    if ((this.stream._duplexState & READ_UPDATE_SYNC_STATUS) === READ_PRIMARY) {\n      this.update()\n    } else {\n      this.updateNextTick()\n    }\n  }\n\n  updateNextTickIfOpen() {\n   
 if ((this.stream._duplexState & READ_NEXT_TICK_OR_OPENING) !== 0) return\n    this.stream._duplexState |= READ_NEXT_TICK\n    if ((this.stream._duplexState & READ_UPDATING) === 0) qmt(this.afterUpdateNextTick)\n  }\n\n  updateNextTick() {\n    if ((this.stream._duplexState & READ_NEXT_TICK) !== 0) return\n    this.stream._duplexState |= READ_NEXT_TICK\n    if ((this.stream._duplexState & READ_UPDATING) === 0) qmt(this.afterUpdateNextTick)\n  }\n}\n\nclass TransformState {\n  constructor(stream) {\n    this.data = null\n    this.afterTransform = afterTransform.bind(stream)\n    this.afterFinal = null\n  }\n}\n\nclass Pipeline {\n  constructor(src, dst, cb) {\n    this.from = src\n    this.to = dst\n    this.afterPipe = cb\n    this.error = null\n    this.pipeToFinished = false\n  }\n\n  finished() {\n    this.pipeToFinished = true\n  }\n\n  done(stream, err) {\n    if (err) this.error = err\n\n    if (stream === this.to) {\n      this.to = null\n\n      if (this.from !== null) {\n        if ((this.from._duplexState & READ_DONE) === 0 || !this.pipeToFinished) {\n          this.from.destroy(this.error || new Error('Writable stream closed prematurely'))\n        }\n        return\n      }\n    }\n\n    if (stream === this.from) {\n      this.from = null\n\n      if (this.to !== null) {\n        if ((stream._duplexState & READ_DONE) === 0) {\n          this.to.destroy(this.error || new Error('Readable stream closed before ending'))\n        }\n        return\n      }\n    }\n\n    if (this.afterPipe !== null) this.afterPipe(this.error)\n    this.to = this.from = this.afterPipe = null\n  }\n}\n\nfunction afterDrain() {\n  this.stream._duplexState |= READ_PIPE_DRAINED\n  this.updateCallback()\n}\n\nfunction afterFinal(err) {\n  const stream = this.stream\n  if (err) stream.destroy(err)\n\n  if ((stream._duplexState & DESTROY_STATUS) === 0) {\n    stream._duplexState |= WRITE_DONE\n    stream.emit('finish')\n  }\n\n  if ((stream._duplexState & AUTO_DESTROY) === DONE) {\n  
  stream._duplexState |= DESTROYING\n  }\n\n  stream._duplexState &= WRITE_NOT_FINISHING\n\n  // no need to wait for the extra tick here, so we short-circuit it\n  if ((stream._duplexState & WRITE_UPDATING) === 0) {\n    this.update()\n  } else {\n    this.updateNextTick()\n  }\n}\n\nfunction afterDestroy(err) {\n  const stream = this.stream\n\n  if (!err && !StreamError.isStreamDestroyed(this.error)) err = this.error\n  if (err) stream.emit('error', err)\n\n  stream._duplexState |= DESTROYED\n  stream.emit('close')\n\n  const rs = stream._readableState\n  const ws = stream._writableState\n\n  if (rs !== null && rs.pipeline !== null) {\n    rs.pipeline.done(stream, err)\n  }\n\n  if (ws !== null) {\n    while (ws.drains !== null && ws.drains.length > 0) {\n      ws.drains.shift().resolve(false)\n    }\n\n    if (ws.pipeline !== null) {\n      ws.pipeline.done(stream, err)\n    }\n  }\n}\n\nfunction afterWrite(err) {\n  const stream = this.stream\n\n  if (err) stream.destroy(err)\n  stream._duplexState &= WRITE_NOT_ACTIVE\n\n  if (this.drains !== null) tickDrains(this.drains)\n\n  if ((stream._duplexState & WRITE_DRAIN_STATUS) === WRITE_UNDRAINED) {\n    stream._duplexState &= WRITE_DRAINED\n\n    if ((stream._duplexState & WRITE_EMIT_DRAIN) === WRITE_EMIT_DRAIN) {\n      stream.emit('drain')\n    }\n  }\n\n  this.updateCallback()\n}\n\nfunction afterRead(err) {\n  if (err) this.stream.destroy(err)\n  this.stream._duplexState &= READ_NOT_ACTIVE\n\n  if (this.readAhead === false && (this.stream._duplexState & READ_RESUMED) === 0) {\n    this.stream._duplexState &= READ_NO_READ_AHEAD\n  }\n\n  this.updateCallback()\n}\n\nfunction updateReadNT() {\n  if ((this.stream._duplexState & READ_UPDATING) === 0) {\n    this.stream._duplexState &= READ_NOT_NEXT_TICK\n    this.update()\n  }\n}\n\nfunction updateWriteNT() {\n  if ((this.stream._duplexState & WRITE_UPDATING) === 0) {\n    this.stream._duplexState &= WRITE_NOT_NEXT_TICK\n    this.update()\n  }\n}\n\nfunction tickDrains(drains) {\n  for (let i = 0; i < drains.length; i++) {\n    // the writes counters are monotonic, so if one is 0 it's always the first one\n    if (--drains[i].writes === 0) {\n      drains.shift().resolve(true)\n      i--\n    }\n  }\n}\n\nfunction afterOpen(err) {\n  const stream = this.stream\n\n  if (err) stream.destroy(err)\n\n  if ((stream._duplexState & DESTROYING) === 0) {\n    if ((stream._duplexState & READ_PRIMARY_STATUS) === 0) {\n      stream._duplexState |= READ_PRIMARY\n    }\n\n    if ((stream._duplexState & WRITE_PRIMARY_STATUS) === 0) {\n      stream._duplexState |= WRITE_PRIMARY\n    }\n\n    stream.emit('open')\n  }\n\n  stream._duplexState &= NOT_ACTIVE\n\n  if (stream._writableState !== null) {\n    stream._writableState.updateCallback()\n  }\n\n  if (stream._readableState !== null) {\n    stream._readableState.updateCallback()\n  }\n}\n\nfunction afterTransform(err, data) {\n  if (data !== undefined && data !== null) this.push(data)\n  this._writableState.afterWrite(err)\n}\n\nfunction newListener(name) {\n  if (this._readableState !== null) {\n    if (name === 'data') {\n      this._duplexState |= READ_EMIT_DATA | READ_RESUMED_READ_AHEAD\n      this._readableState.updateNextTick()\n    }\n    if (name === 'readable') {\n      this._duplexState |= READ_EMIT_READABLE\n      this._readableState.updateNextTick()\n    }\n  }\n\n  if (this._writableState !== null) {\n    if (name === 'drain') {\n      this._duplexState |= WRITE_EMIT_DRAIN\n      this._writableState.updateNextTick()\n    }\n  }\n}\n\nclass Stream extends EventEmitter {\n  constructor(opts) {\n    super()\n\n    this._duplexState = 0\n    this._readableState = null\n    this._writableState = null\n\n    if (opts) {\n      if (opts.open) this._open = opts.open\n      if (opts.destroy) this._destroy = opts.destroy\n      if (opts.predestroy) this._predestroy = opts.predestroy\n      if (opts.signal) opts.signal.addEventListener('abort', abort.bind(this))\n    }\n\n    
this.on('newListener', newListener)\n  }\n\n  _open(cb) {\n    cb(null)\n  }\n\n  _destroy(cb) {\n    cb(null)\n  }\n\n  _predestroy() {\n    // does nothing\n  }\n\n  get readable() {\n    return this._readableState !== null ? true : undefined\n  }\n\n  get writable() {\n    return this._writableState !== null ? true : undefined\n  }\n\n  get destroyed() {\n    return (this._duplexState & DESTROYED) !== 0\n  }\n\n  get destroying() {\n    return (this._duplexState & DESTROY_STATUS) !== 0\n  }\n\n  destroy(err) {\n    if ((this._duplexState & DESTROY_STATUS) === 0) {\n      if (!err) err = StreamError.STREAM_DESTROYED()\n      this._duplexState = (this._duplexState | DESTROYING) & NON_PRIMARY\n\n      if (this._readableState !== null) {\n        this._readableState.highWaterMark = 0\n        this._readableState.error = err\n      }\n\n      if (this._writableState !== null) {\n        this._writableState.highWaterMark = 0\n        this._writableState.error = err\n      }\n\n      this._duplexState |= PREDESTROYING\n      this._predestroy()\n      this._duplexState &= NOT_PREDESTROYING\n\n      if (this._readableState !== null) {\n        this._readableState.updateNextTick()\n      }\n\n      if (this._writableState !== null) {\n        this._writableState.updateNextTick()\n      }\n    }\n  }\n}\n\nclass Readable extends Stream {\n  constructor(opts) {\n    super(opts)\n\n    this._duplexState |= OPENING | WRITE_DONE | READ_READ_AHEAD\n    this._readableState = new ReadableState(this, opts)\n\n    if (opts) {\n      if (this._readableState.readAhead === false) this._duplexState &= READ_NO_READ_AHEAD\n      if (opts.read) this._read = opts.read\n      if (opts.eagerOpen) this._readableState.updateNextTick()\n      if (opts.encoding) this.setEncoding(opts.encoding)\n    }\n  }\n\n  setEncoding(encoding) {\n    const dec = new TextDecoder(encoding)\n    const map = this._readableState.map || echo\n    this._readableState.map = mapOrSkip\n    return this\n\n    
function mapOrSkip(data) {\n      const next = dec.push(data)\n      return next === '' && (data.byteLength !== 0 || dec.remaining > 0) ? null : map(next)\n    }\n  }\n\n  _read(cb) {\n    cb(null)\n  }\n\n  pipe(dest, cb) {\n    this._readableState.updateNextTick()\n    this._readableState.pipe(dest, cb)\n    return dest\n  }\n\n  read() {\n    this._readableState.updateNextTick()\n    return this._readableState.read()\n  }\n\n  push(data) {\n    this._readableState.updateNextTickIfOpen()\n    return this._readableState.push(data)\n  }\n\n  unshift(data) {\n    this._readableState.updateNextTickIfOpen()\n    return this._readableState.unshift(data)\n  }\n\n  resume() {\n    this._duplexState |= READ_RESUMED_READ_AHEAD\n    this._readableState.updateNextTick()\n    return this\n  }\n\n  pause() {\n    this._duplexState &=\n      this._readableState.readAhead === false ? READ_PAUSED_NO_READ_AHEAD : READ_PAUSED\n    return this\n  }\n\n  static _fromAsyncIterator(ite, opts) {\n    let destroy\n\n    const rs = new Readable({\n      ...opts,\n      read(cb) {\n        ite.next().then(push).then(cb.bind(null, null)).catch(cb)\n      },\n      predestroy() {\n        destroy = ite.return()\n      },\n      destroy(cb) {\n        if (!destroy) return cb(null)\n        destroy.then(cb.bind(null, null)).catch(cb)\n      }\n    })\n\n    return rs\n\n    function push(data) {\n      if (data.done) rs.push(null)\n      else rs.push(data.value)\n    }\n  }\n\n  static from(data, opts) {\n    if (isReadStreamx(data)) return data\n    if (data[asyncIterator]) return this._fromAsyncIterator(data[asyncIterator](), opts)\n    if (!Array.isArray(data)) data = data === undefined ? [] : [data]\n\n    let i = 0\n    return new Readable({\n      ...opts,\n      read(cb) {\n        this.push(i === data.length ? 
null : data[i++])\n        cb(null)\n      }\n    })\n  }\n\n  static isBackpressured(rs) {\n    return (\n      (rs._duplexState & READ_BACKPRESSURE_STATUS) !== 0 ||\n      rs._readableState.buffered >= rs._readableState.highWaterMark\n    )\n  }\n\n  static isPaused(rs) {\n    return (rs._duplexState & READ_RESUMED) === 0\n  }\n\n  [asyncIterator]() {\n    const stream = this\n\n    let error = null\n    let promiseResolve = null\n    let promiseReject = null\n\n    this.on('error', (err) => {\n      error = err\n    })\n    this.on('readable', onreadable)\n    this.on('close', onclose)\n\n    return {\n      [asyncIterator]() {\n        return this\n      },\n      next() {\n        return new Promise(function (resolve, reject) {\n          promiseResolve = resolve\n          promiseReject = reject\n          const data = stream.read()\n          if (data !== null) ondata(data)\n          else if ((stream._duplexState & DESTROYED) !== 0) ondata(null)\n        })\n      },\n      return() {\n        return destroy(null)\n      },\n      throw(err) {\n        return destroy(err)\n      }\n    }\n\n    function onreadable() {\n      if (promiseResolve !== null) ondata(stream.read())\n    }\n\n    function onclose() {\n      if (promiseResolve !== null) ondata(null)\n    }\n\n    function ondata(data) {\n      if (promiseReject === null) return\n      if (error) {\n        promiseReject(error)\n      } else if (data === null && (stream._duplexState & READ_DONE) === 0) {\n        promiseReject(StreamError.STREAM_DESTROYED())\n      } else {\n        promiseResolve({ value: data, done: data === null })\n      }\n      promiseReject = promiseResolve = null\n    }\n\n    function destroy(err) {\n      stream.destroy(err)\n      return new Promise((resolve, reject) => {\n        if (stream._duplexState & DESTROYED) return resolve({ value: undefined, done: true })\n        stream.once('close', function () {\n          if (err) reject(err)\n          else resolve({ value: 
undefined, done: true })\n        })\n      })\n    }\n  }\n}\n\nclass Writable extends Stream {\n  constructor(opts) {\n    super(opts)\n\n    this._duplexState |= OPENING | READ_DONE\n    this._writableState = new WritableState(this, opts)\n\n    if (opts) {\n      if (opts.writev) this._writev = opts.writev\n      if (opts.write) this._write = opts.write\n      if (opts.final) this._final = opts.final\n      if (opts.eagerOpen) this._writableState.updateNextTick()\n    }\n  }\n\n  cork() {\n    this._duplexState |= WRITE_CORKED\n  }\n\n  uncork() {\n    this._duplexState &= WRITE_NOT_CORKED\n    this._writableState.updateNextTick()\n  }\n\n  _writev(batch, cb) {\n    cb(null)\n  }\n\n  _write(data, cb) {\n    this._writableState.autoBatch(data, cb)\n  }\n\n  _final(cb) {\n    cb(null)\n  }\n\n  static isBackpressured(ws) {\n    return (ws._duplexState & WRITE_BACKPRESSURE_STATUS) !== 0\n  }\n\n  static drained(ws) {\n    if (ws.destroyed) return Promise.resolve(false)\n\n    const state = ws._writableState\n    const pending = isWritev(ws) ? Math.min(1, state.queue.length) : state.queue.length\n    const writes = pending + (ws._duplexState & WRITE_WRITING ? 
1 : 0)\n    if (writes === 0) return Promise.resolve(true)\n\n    if (state.drains === null) state.drains = []\n    return new Promise((resolve) => {\n      state.drains.push({ writes, resolve })\n    })\n  }\n\n  write(data) {\n    this._writableState.updateNextTick()\n    return this._writableState.push(data)\n  }\n\n  end(data) {\n    this._writableState.updateNextTick()\n    this._writableState.end(data)\n    return this\n  }\n}\n\nclass Duplex extends Readable {\n  // and Writable\n  constructor(opts) {\n    super(opts)\n\n    this._duplexState = OPENING | (this._duplexState & READ_READ_AHEAD)\n    this._writableState = new WritableState(this, opts)\n\n    if (opts) {\n      if (opts.writev) this._writev = opts.writev\n      if (opts.write) this._write = opts.write\n      if (opts.final) this._final = opts.final\n    }\n  }\n\n  cork() {\n    this._duplexState |= WRITE_CORKED\n  }\n\n  uncork() {\n    this._duplexState &= WRITE_NOT_CORKED\n    this._writableState.updateNextTick()\n  }\n\n  _writev(batch, cb) {\n    cb(null)\n  }\n\n  _write(data, cb) {\n    this._writableState.autoBatch(data, cb)\n  }\n\n  _final(cb) {\n    cb(null)\n  }\n\n  write(data) {\n    this._writableState.updateNextTick()\n    return this._writableState.push(data)\n  }\n\n  end(data) {\n    this._writableState.updateNextTick()\n    this._writableState.end(data)\n    return this\n  }\n}\n\nclass Transform extends Duplex {\n  constructor(opts) {\n    super(opts)\n    this._transformState = new TransformState(this)\n\n    if (opts) {\n      if (opts.transform) this._transform = opts.transform\n      if (opts.flush) this._flush = opts.flush\n    }\n  }\n\n  _write(data, cb) {\n    if (this._readableState.buffered >= this._readableState.highWaterMark) {\n      this._transformState.data = data\n    } else {\n      this._transform(data, this._transformState.afterTransform)\n    }\n  }\n\n  _read(cb) {\n    if (this._transformState.data !== null) {\n      const data = 
this._transformState.data\n      this._transformState.data = null\n      cb(null)\n      this._transform(data, this._transformState.afterTransform)\n    } else {\n      cb(null)\n    }\n  }\n\n  destroy(err) {\n    super.destroy(err)\n    if (this._transformState.data !== null) {\n      this._transformState.data = null\n      this._transformState.afterTransform()\n    }\n  }\n\n  _transform(data, cb) {\n    cb(null, data)\n  }\n\n  _flush(cb) {\n    cb(null)\n  }\n\n  _final(cb) {\n    this._transformState.afterFinal = cb\n    this._flush(transformAfterFlush.bind(this))\n  }\n}\n\nclass PassThrough extends Transform {}\n\nfunction transformAfterFlush(err, data) {\n  const cb = this._transformState.afterFinal\n  if (err) return cb(err)\n\n  if (data !== null && data !== undefined) this.push(data)\n  this.push(null)\n  cb(null)\n}\n\nfunction pipelinePromise(...streams) {\n  return new Promise((resolve, reject) => {\n    return pipeline(...streams, (err) => {\n      if (err) return reject(err)\n      resolve()\n    })\n  })\n}\n\nfunction pipeline(stream, ...streams) {\n  const all = Array.isArray(stream) ? [...stream, ...streams] : [stream, ...streams]\n  const done = all.length && typeof all[all.length - 1] === 'function' ? 
all.pop() : null\n\n  if (all.length < 2) throw new Error('Pipeline requires at least 2 streams')\n\n  let src = all[0]\n  let dest = null\n  let error = null\n\n  for (let i = 1; i < all.length; i++) {\n    dest = all[i]\n\n    if (isStreamx(src)) {\n      src.pipe(dest, onerror)\n    } else {\n      errorHandle(src, true, i > 1, onerror)\n      src.pipe(dest)\n    }\n\n    src = dest\n  }\n\n  if (done) {\n    let fin = false\n\n    const autoDestroy =\n      isStreamx(dest) || !!(dest._writableState && dest._writableState.autoDestroy)\n\n    dest.on('error', (err) => {\n      if (error === null) error = err\n    })\n\n    dest.on('finish', () => {\n      fin = true\n      if (!autoDestroy) done(error)\n    })\n\n    if (autoDestroy) {\n      dest.on('close', () => done(error || (fin ? null : StreamError.PREMATURE_CLOSE())))\n    }\n  }\n\n  return dest\n\n  function errorHandle(s, rd, wr, onerror) {\n    s.on('error', onerror)\n    s.on('close', onclose)\n\n    function onclose() {\n      if (rd && s._readableState && !s._readableState.ended) {\n        return onerror(StreamError.PREMATURE_CLOSE())\n      }\n      if (wr && s._writableState && !s._writableState.ended) {\n        return onerror(StreamError.PREMATURE_CLOSE())\n      }\n    }\n  }\n\n  function onerror(err) {\n    if (!err || error) return\n    error = err\n\n    for (const s of all) {\n      s.destroy(err)\n    }\n  }\n}\n\nfunction echo(s) {\n  return s\n}\n\nfunction isStream(stream) {\n  return !!stream._readableState || !!stream._writableState\n}\n\nfunction isStreamx(stream) {\n  return typeof stream._duplexState === 'number' && isStream(stream)\n}\n\nfunction isEnding(stream) {\n  return !!stream._readableState && stream._readableState.ending\n}\n\nfunction isEnded(stream) {\n  return !!stream._readableState && stream._readableState.ended\n}\n\nfunction isFinishing(stream) {\n  return !!stream._writableState && stream._writableState.ending\n}\n\nfunction isFinished(stream) {\n  return 
!!stream._writableState && stream._writableState.ended\n}\n\nfunction getStreamError(stream, opts = {}) {\n  const err =\n    (stream._readableState && stream._readableState.error) ||\n    (stream._writableState && stream._writableState.error)\n\n  // avoid implicit errors by default\n  return !opts.all && StreamError.isStreamDestroyed(err) ? null : err\n}\n\nfunction isReadStreamx(stream) {\n  return isStreamx(stream) && stream.readable\n}\n\nfunction isDisturbed(stream) {\n  return (\n    (stream._duplexState & OPENING) !== OPENING ||\n    (stream._duplexState & DESTROYING) === DESTROYING ||\n    (stream._duplexState & ACTIVE_OR_TICKING) !== 0\n  )\n}\n\nfunction isTypedArray(data) {\n  return typeof data === 'object' && data !== null && typeof data.byteLength === 'number'\n}\n\nfunction defaultByteLength(data) {\n  return isTypedArray(data) ? data.byteLength : 1024\n}\n\nfunction noop() {}\n\nfunction abort() {\n  this.destroy(StreamError.ABORTED())\n}\n\nfunction isWritev(s) {\n  return s._writev !== Writable.prototype._writev && s._writev !== Duplex.prototype._writev\n}\n\nmodule.exports = {\n  pipeline,\n  pipelinePromise,\n  isStream,\n  isStreamx,\n  isEnding,\n  isEnded,\n  isFinishing,\n  isFinished,\n  isDisturbed,\n  getStreamError,\n  Stream,\n  Writable,\n  Readable,\n  Duplex,\n  Transform,\n  // Export PassThrough for compatibility with Node.js core's stream module\n  PassThrough\n}\n"
  },
  {
    "path": "lib/errors.js",
    "content": "module.exports = class StreamError extends Error {\n  constructor(msg, code, fn = StreamError) {\n    super(msg)\n\n    this.code = code\n\n    if (Error.captureStackTrace) {\n      Error.captureStackTrace(this, fn)\n    }\n  }\n\n  static isStreamDestroyed(err) {\n    return err && err.code === 'STREAM_DESTROYED'\n  }\n\n  static isPrematureClose(err) {\n    return err && err.code === 'PREMATURE_CLOSE'\n  }\n\n  static isAborted(err) {\n    return err && err.code === 'ABORTED'\n  }\n\n  get name() {\n    return 'StreamError'\n  }\n\n  static STREAM_DESTROYED() {\n    return new StreamError('Stream was destroyed', 'STREAM_DESTROYED', StreamError.STREAM_DESTROYED)\n  }\n\n  static PREMATURE_CLOSE() {\n    return new StreamError('Premature close', 'PREMATURE_CLOSE', StreamError.PREMATURE_CLOSE)\n  }\n\n  static ABORTED() {\n    return new StreamError('Stream aborted', 'ABORTED', StreamError.ABORTED)\n  }\n}\n"
  },
  {
    "path": "package.json",
    "content": "{\n  \"name\": \"streamx\",\n  \"version\": \"2.25.0\",\n  \"description\": \"An iteration of the Node.js core streams with a series of improvements\",\n  \"main\": \"index.js\",\n  \"exports\": {\n    \".\": \"./index.js\",\n    \"./package\": \"./package.json\",\n    \"./errors\": \"./lib/errors\"\n  },\n  \"dependencies\": {\n    \"events-universal\": \"^1.0.0\",\n    \"fast-fifo\": \"^1.3.2\",\n    \"text-decoder\": \"^1.1.0\"\n  },\n  \"devDependencies\": {\n    \"b4a\": \"^1.6.6\",\n    \"brittle\": \"^3.1.1\",\n    \"end-of-stream\": \"^1.4.4\",\n    \"lunte\": \"^1.8.0\",\n    \"prettier\": \"^3.6.2\",\n    \"prettier-config-holepunch\": \"^2.0.0\"\n  },\n  \"files\": [\n    \"index.js\",\n    \"lib/errors.js\"\n  ],\n  \"scripts\": {\n    \"format\": \"prettier --write .\",\n    \"test\": \"lunte && prettier --check . && node test/all.js\",\n    \"test:bare\": \"bare test/all.js\"\n  },\n  \"repository\": {\n    \"type\": \"git\",\n    \"url\": \"https://github.com/mafintosh/streamx.git\"\n  },\n  \"author\": \"Mathias Buus (@mafintosh)\",\n  \"license\": \"MIT\",\n  \"bugs\": {\n    \"url\": \"https://github.com/mafintosh/streamx/issues\"\n  },\n  \"homepage\": \"https://github.com/mafintosh/streamx\"\n}\n"
  },
  {
    "path": "test/all.js",
    "content": "// This runner is auto-generated by Brittle\n\nrunTests()\n\nasync function runTests() {\n  const test = (await import('brittle')).default\n\n  test.pause()\n\n  await import('./async-iterator.js')\n  await import('./backpressure.js')\n  await import('./byte-length.js')\n  await import('./compat.js')\n  await import('./destroy.js')\n  await import('./duplex.js')\n  await import('./errors.js')\n  await import('./get-stream-error.js')\n  await import('./passthrough.js')\n  await import('./pipe.js')\n  await import('./pipeline.js')\n  await import('./readable.js')\n  await import('./transform.js')\n  await import('./writable.js')\n\n  test.resume()\n}\n"
  },
  {
    "path": "test/async-iterator.js",
    "content": "const test = require('brittle')\nconst { Readable, getStreamError } = require('../')\nconst StreamError = require('../lib/errors')\n\ntest('streams are async iterators', async function (t) {\n  const data = ['a', 'b', 'c', null]\n  const expected = data.slice(0)\n\n  const r = new Readable({\n    read(cb) {\n      this.push(data.shift())\n      cb(null)\n    }\n  })\n\n  for await (const chunk of r) {\n    t.is(chunk, expected.shift())\n  }\n\n  t.is(expected.shift(), null)\n})\n\ntest('break out of iterator', async function (t) {\n  const r = new Readable({\n    read(cb) {\n      this.push('tick')\n      cb(null)\n    },\n    destroy(cb) {\n      t.pass('destroying')\n      cb(null)\n    }\n  })\n\n  let runs = 10\n\n  for await (const chunk of r) {\n    t.is(chunk, 'tick')\n    if (--runs === 0) break\n  }\n})\n\ntest('throw out of iterator', async function (t) {\n  const r = new Readable({\n    read(cb) {\n      this.push('tick')\n      cb(null)\n    },\n    destroy(cb) {\n      t.pass('destroying')\n      cb(null)\n    }\n  })\n\n  let runs = 10\n\n  await t.exception(async function () {\n    for await (const chunk of r) {\n      t.is(chunk, 'tick')\n      if (--runs === 0) throw new Error('stop')\n    }\n  })\n})\n\ntest('interesting timing', async function (t) {\n  const r = new Readable({\n    read(cb) {\n      setImmediate(() => {\n        this.push('b')\n        this.push('c')\n        this.push(null)\n        cb(null)\n      })\n    },\n    destroy(cb) {\n      t.pass('destroying')\n      cb(null)\n    }\n  })\n\n  r.push('a')\n\n  const iterated = []\n\n  for await (const chunk of r) {\n    iterated.push(chunk)\n    await new Promise((resolve) => setTimeout(resolve, 10))\n  }\n\n  t.alike(iterated, ['a', 'b', 'c'])\n})\n\ntest('interesting timing with close', async function (t) {\n  t.plan(3)\n\n  const r = new Readable({\n    read(cb) {\n      setImmediate(() => {\n        this.destroy(new Error('stop'))\n        cb(null)\n      })\n   
 },\n    destroy(cb) {\n      t.pass('destroying')\n      cb(null)\n    }\n  })\n\n  r.push('a')\n\n  const iterated = []\n\n  await t.exception(async function () {\n    for await (const chunk of r) {\n      iterated.push(chunk)\n      await new Promise((resolve) => setTimeout(resolve, 10))\n    }\n  })\n\n  t.alike(iterated, ['a'])\n})\n\ntest('cleaning up a closed iterator', async function (t) {\n  const r = new Readable()\n  r.push('a')\n  t.plan(1)\n\n  const fn = async () => {\n    for await (const chunk of r) {\n      // eslint-disable-line\n      r.destroy()\n      await new Promise((resolve) => r.once('close', resolve))\n      t.is(chunk, 'a')\n      return\n    }\n  }\n  await fn()\n})\n\ntest('using abort controller', { skip: !!global.Bare }, async function (t) {\n  function createInfinite(signal) {\n    let count = 0\n    const r = new Readable({ signal })\n    r.push(count)\n    const int = setInterval(() => r.push(count++), 5000)\n    r.once('close', () => clearInterval(int))\n    return r\n  }\n  const controller = new AbortController()\n  const inc = []\n  setImmediate(() => controller.abort())\n\n  let r\n  await t.exception(async function () {\n    r = createInfinite(controller.signal)\n    for await (const chunk of r) {\n      inc.push(chunk)\n    }\n  })\n\n  t.alike(inc, [0])\n  const err = getStreamError(r)\n  t.ok(StreamError.isAborted(err))\n})\n\ntest('from async iterator and to async iterator', async function (t) {\n  const expected = []\n\n  const stream = Readable.from(\n    (async function* () {\n      yield 'a'\n      yield 'b'\n    })()\n  )\n\n  for await (const data of stream) {\n    expected.push(data)\n  }\n\n  t.alike(expected, ['a', 'b'])\n})\n"
  },
  {
    "path": "test/backpressure.js",
    "content": "const test = require('brittle')\nconst { Writable, Readable } = require('../')\n\ntest('write backpressure', function (t) {\n  const ws = new Writable()\n\n  for (let i = 0; i < 15; i++) {\n    t.ok(ws.write('a'), 'not backpressured')\n    t.absent(Writable.isBackpressured(ws), 'static check')\n  }\n\n  t.absent(ws.write('a'), 'backpressured')\n  t.ok(Writable.isBackpressured(ws), 'static check')\n})\n\ntest('write backpressure with drain', function (t) {\n  t.plan(15 * 2 + 2 + 1)\n\n  const ws = new Writable()\n\n  for (let i = 0; i < 15; i++) {\n    t.ok(ws.write('a'), 'not backpressured')\n    t.absent(Writable.isBackpressured(ws), 'static check')\n  }\n\n  t.absent(ws.write('a'), 'backpressured')\n  t.ok(Writable.isBackpressured(ws), 'static check')\n\n  ws.on('drain', function () {\n    t.absent(Writable.isBackpressured(ws))\n  })\n})\n\ntest('write backpressure with destroy', function (t) {\n  const ws = new Writable()\n\n  ws.write('a')\n  ws.destroy()\n\n  t.ok(Writable.isBackpressured(ws))\n})\n\ntest('write backpressure with end', function (t) {\n  const ws = new Writable()\n\n  ws.write('a')\n  ws.end()\n\n  t.ok(Writable.isBackpressured(ws))\n})\n\ntest('read backpressure', function (t) {\n  const rs = new Readable()\n\n  for (let i = 0; i < 15; i++) {\n    t.ok(rs.push('a'), 'not backpressured')\n    t.absent(Readable.isBackpressured(rs), 'static check')\n  }\n\n  t.absent(rs.push('a'), 'backpressured')\n  t.ok(Readable.isBackpressured(rs), 'static check')\n})\n\ntest('read backpressure with later read', function (t) {\n  t.plan(15 * 2 + 2 + 1)\n\n  const rs = new Readable()\n\n  for (let i = 0; i < 15; i++) {\n    t.ok(rs.push('a'), 'not backpressured')\n    t.absent(Readable.isBackpressured(rs), 'static check')\n  }\n\n  t.absent(rs.push('a'), 'backpressured')\n  t.ok(Readable.isBackpressured(rs), 'static check')\n\n  rs.once('readable', function () {\n    rs.read()\n    t.absent(Readable.isBackpressured(rs))\n  })\n})\n\ntest('read 
backpressure with destroy', function (t) {\n  const rs = new Readable()\n\n  rs.push('a')\n  rs.destroy()\n\n  t.ok(Readable.isBackpressured(rs))\n})\n\ntest('read backpressure with push(null)', function (t) {\n  const rs = new Readable()\n\n  rs.push('a')\n  rs.push(null)\n\n  t.ok(Readable.isBackpressured(rs))\n})\n"
  },
  {
    "path": "test/byte-length.js",
    "content": "const test = require('brittle')\nconst { Readable, Writable } = require('../')\n\nconst defaultSizes = [\n  { name: 'buf512', item: Buffer.alloc(512), size: 512 },\n  { name: 'number', item: 1, size: 1024 },\n  { name: 'number-byteLength', item: 1, size: 512, byteLength: () => 512 },\n  {\n    name: 'number-byteLengthReadable',\n    item: 1,\n    size: 256,\n    byteLength: () => 512,\n    byteLengthExtended: () => 256\n  },\n  { name: 'uint8-512', item: new Uint8Array(512), size: 512 },\n  { name: 'uint32-64', item: new Uint32Array(64), size: 256 }\n]\n\nfor (const { name, item, size, byteLength, byteLengthExtended } of defaultSizes) {\n  test(`readable ${name}`, function (t) {\n    const r = new Readable({\n      byteLength,\n      byteLengthReadable: byteLengthExtended\n    })\n    r.push(item)\n    t.is(r._readableState.buffered, size)\n  })\n\n  test(`writable ${name}`, function (t) {\n    const w = new Writable({\n      byteLength,\n      byteLengthWritable: byteLengthExtended\n    })\n    w.write(item)\n    t.is(w._writableState.buffered, size)\n  })\n}\n\ntest('byteLength receives readable item', function (t) {\n  t.plan(1)\n\n  const obj = {}\n  const r = new Readable({\n    byteLength: (data) => {\n      t.alike(obj, data)\n    }\n  })\n  r.push(obj)\n})\n\ntest('byteLength receives writable item', function (t) {\n  t.plan(2)\n\n  const obj = {}\n  const r = new Writable({\n    byteLength: (data) => {\n      t.alike(obj, data)\n      return 1\n    }\n  })\n  r.write(obj)\n})\n"
  },
  {
    "path": "test/compat.js",
    "content": "const eos = global.Bare ? null : require('end-of-stream')\nconst test = require('brittle')\nconst stream = require('../')\nconst finished = global.Bare ? null : require('stream').finished\n\nrun(eos)\nrun(finished)\n\nfunction run(eos) {\n  if (!eos) return\n  const name = eos === finished ? 'nodeStream.finished' : 'eos'\n\n  test(name + ' readable', function (t) {\n    t.plan(2)\n\n    const r = new stream.Readable()\n    let ended = false\n\n    r.on('end', function () {\n      ended = true\n    })\n\n    eos(r, function (err) {\n      t.absent(err, 'no error')\n      t.ok(ended)\n    })\n\n    r.push('hello')\n    r.push(null)\n    r.resume()\n  })\n\n  test(name + ' readable destroy', function (t) {\n    t.plan(2)\n\n    const r = new stream.Readable()\n    let ended = false\n\n    r.on('end', function () {\n      ended = true\n    })\n\n    eos(r, function (err) {\n      t.ok(err, 'had error')\n      t.absent(ended)\n    })\n\n    r.push('hello')\n    r.push(null)\n    r.resume()\n    r.destroy()\n  })\n\n  test(name + ' writable', function (t) {\n    t.plan(2)\n\n    const w = new stream.Writable()\n    let finished = false\n\n    w.on('finish', function () {\n      finished = true\n    })\n\n    eos(w, function (err) {\n      t.absent(err, 'no error')\n      t.ok(finished)\n    })\n\n    w.write('hello')\n    w.end()\n  })\n\n  test(name + ' writable destroy', function (t) {\n    t.plan(3)\n\n    const w = new stream.Writable()\n    let finished = false\n\n    w.on('finish', function () {\n      finished = true\n    })\n\n    eos(w, function (err) {\n      t.ok(err, 'had error')\n      t.absent(finished)\n    })\n\n    w.write('hello')\n    t.is(w.end(), w)\n    w.destroy()\n  })\n\n  test(name + ' duplex', function (t) {\n    t.plan(4)\n\n    const s = new stream.Duplex()\n    let ended = false\n    let finished = false\n\n    s.on('end', () => {\n      ended = true\n    })\n    s.on('finish', () => {\n      finished = true\n    })\n\n    
eos(s, function (err) {\n      t.absent(err, 'no error')\n      t.ok(ended)\n      t.ok(finished)\n    })\n\n    s.push('hello')\n    s.push(null)\n    s.resume()\n    t.is(s.end(), s)\n  })\n\n  test(name + ' duplex + deferred s.end()', function (t) {\n    t.plan(3)\n\n    const s = new stream.Duplex()\n    let ended = false\n    let finished = false\n\n    s.on('end', function () {\n      ended = true\n      setImmediate(() => s.end())\n    })\n\n    s.on('finish', () => {\n      finished = true\n    })\n\n    eos(s, function (err) {\n      t.absent(err, 'no error')\n      t.ok(ended)\n      t.ok(finished)\n    })\n\n    s.push('hello')\n    s.push(null)\n    s.resume()\n  })\n\n  test(name + ' duplex + deferred s.push(null)', function (t) {\n    t.plan(3)\n\n    const s = new stream.Duplex()\n    let ended = false\n    let finished = false\n\n    s.on('finish', function () {\n      finished = true\n      setImmediate(() => s.push(null))\n    })\n\n    s.on('end', () => {\n      ended = true\n    })\n\n    eos(s, function (err) {\n      t.absent(err, 'no error')\n      t.ok(ended)\n      t.ok(finished)\n    })\n\n    s.push('hello')\n    s.end()\n    s.resume()\n  })\n\n  test(name + ' duplex destroy', function (t) {\n    t.plan(3)\n\n    const s = new stream.Duplex()\n    let ended = false\n    let finished = false\n\n    s.on('end', () => {\n      ended = true\n    })\n    s.on('finish', () => {\n      finished = true\n    })\n\n    eos(s, function (err) {\n      t.ok(err, 'had error')\n      t.absent(ended)\n      t.absent(finished)\n    })\n\n    s.push('hello')\n    s.push(null)\n    s.resume()\n    s.end()\n    s.destroy()\n  })\n}\n"
  },
  {
    "path": "test/destroy.js",
    "content": "const test = require('brittle')\nconst { Duplex } = require('../')\n\ntest('destroy is never sync', function (t) {\n  t.plan(1)\n\n  let openCb = null\n\n  const s = new Duplex({\n    open(cb) {\n      openCb = cb\n    },\n    predestroy() {\n      openCb(new Error('stop'))\n    }\n  })\n\n  s.resume()\n  setImmediate(() => {\n    s.destroy()\n    s.on('close', () => t.pass('destroy was not sync'))\n  })\n})\n"
  },
  {
    "path": "test/duplex.js",
    "content": "const test = require('brittle')\nconst { Duplex } = require('../')\n\ntest('if open does not end, it should stall', function (t) {\n  t.plan(1)\n\n  const d = new Duplex({\n    open() {\n      t.pass('open called')\n    },\n    read() {\n      t.fail('should not call read')\n    },\n    write() {\n      t.fail('should not call write')\n    }\n  })\n\n  d.resume()\n  d.write('hi')\n})\n\ntest('Using both mapReadable and mapWritable to map data', function (t) {\n  t.plan(2)\n\n  const d = new Duplex({\n    write(data, cb) {\n      d.push(data)\n      cb()\n    },\n    final(cb) {\n      d.push(null)\n      cb()\n    },\n    mapReadable: (num) => JSON.stringify({ num }),\n    mapWritable: (input) => parseInt(input, 10)\n  })\n  d.on('data', (data) => {\n    t.is(data, '{\"num\":32}')\n  })\n  d.on('close', () => {\n    t.pass('closed')\n  })\n  d.write('32')\n  d.end()\n})\n\ntest('wait for readable', function (t) {\n  t.plan(1)\n\n  const d = new Duplex({\n    read(cb) {\n      d.push('ok')\n      d.push(null)\n      cb()\n    }\n  })\n\n  d.on('readable', function () {\n    t.is(d.read(), 'ok')\n  })\n})\n\ntest('write during end', function (t) {\n  t.plan(3)\n\n  const expected = ['a', 'b']\n\n  const w = new Duplex({\n    write(data, cb) {\n      t.is(data, expected.shift())\n      cb(null)\n    },\n    final(cb) {\n      w.write('bad')\n      cb(null)\n    }\n  })\n\n  w.write('a')\n  w.write('b')\n  w.end()\n  w.on('finish', () => w.push(null))\n  w.on('close', () => t.pass('closed'))\n})\n"
  },
  {
    "path": "test/errors.js",
    "content": "const test = require('brittle')\nconst StreamError = require('../lib/errors')\n\ntest('can make errors', function (t) {\n  {\n    const err = StreamError.STREAM_DESTROYED()\n\n    t.is(err.code, 'STREAM_DESTROYED')\n    t.is(err.message, 'Stream was destroyed')\n    t.ok(StreamError.isStreamDestroyed(err))\n  }\n\n  {\n    const err = StreamError.PREMATURE_CLOSE()\n\n    t.is(err.code, 'PREMATURE_CLOSE')\n    t.is(err.message, 'Premature close')\n    t.ok(StreamError.isPrematureClose(err))\n  }\n\n  {\n    const err = StreamError.ABORTED()\n\n    t.is(err.code, 'ABORTED')\n    t.is(err.message, 'Stream aborted')\n    t.ok(StreamError.isAborted(err))\n  }\n})\n"
  },
  {
    "path": "test/get-stream-error.js",
    "content": "const test = require('brittle')\nconst { Readable, getStreamError } = require('../')\nconst StreamError = require('../lib/errors')\n\ntest('getStreamError, no errors', function (t) {\n  const stream = new Readable()\n\n  t.is(getStreamError(stream), null)\n})\n\ntest('getStreamError, basic', function (t) {\n  const stream = new Readable()\n  stream.on('error', () => {})\n\n  const err = new Error('stop')\n  stream.destroy(err)\n\n  t.is(getStreamError(stream), err)\n})\n\ntest('getStreamError, only explicit errors by default', function (t) {\n  const stream = new Readable()\n\n  stream.destroy()\n\n  t.absent(getStreamError(stream))\n})\n\ntest('getStreamError, get premature destroy', function (t) {\n  const stream = new Readable()\n\n  stream.destroy()\n\n  const err = getStreamError(stream, { all: true })\n  t.alike(err.message, 'Stream was destroyed')\n  t.ok(StreamError.isStreamDestroyed(err))\n})\n"
  },
  {
    "path": "test/passthrough.js",
    "content": "const test = require('brittle')\nconst { PassThrough, Writable, Readable } = require('../')\n\ntest('passthrough', (t) => {\n  t.plan(3)\n\n  let i = 0\n  const p = new PassThrough()\n  const w = new Writable({\n    write(data, cb) {\n      i++\n      if (i === 1) t.is(data, 'foo')\n      else if (i === 2) t.is(data, 'bar')\n      else t.fail('too many messages')\n      cb()\n    }\n  })\n  w.on('finish', () => t.pass('finished'))\n  const r = new Readable()\n  r.pipe(p).pipe(w)\n  r.push('foo')\n  r.push('bar')\n  r.push(null)\n})\n"
  },
  {
    "path": "test/pipe.js",
    "content": "const test = require('brittle')\nconst compat = global.Bare ? null : require('stream')\nconst { Readable, Writable } = require('../')\n\ntest('pipe to node stream', { skip: !compat }, function (t) {\n  t.plan(3)\n\n  const expected = ['hi', 'ho']\n\n  const r = new Readable()\n  const w = new compat.Writable({\n    objectMode: true,\n    write(data, enc, cb) {\n      t.is(data, expected.shift())\n      cb(null)\n    }\n  })\n\n  r.push('hi')\n  r.push('ho')\n  r.push(null)\n\n  r.pipe(w)\n\n  w.on('finish', function () {\n    t.is(expected.length, 0)\n  })\n})\n\ntest('pipe with callback - error case', function (t) {\n  t.plan(2)\n\n  const r = new Readable()\n  const w = new Writable({\n    write(data, cb) {\n      cb(new Error('blerg'))\n    }\n  })\n\n  r.pipe(w, function (err) {\n    t.pass('callback called')\n    t.alike(err, new Error('blerg'))\n  })\n\n  r.push('hello')\n  r.push('world')\n  r.push(null)\n})\n\ntest('pipe with callback - error case with destroy', function (t) {\n  t.plan(2)\n\n  const r = new Readable()\n  const w = new Writable({\n    write(data, cb) {\n      w.destroy(new Error('blerg'))\n      cb(null)\n    }\n  })\n\n  r.pipe(w, function (err) {\n    t.pass('callback called')\n    t.alike(err, new Error('blerg'))\n  })\n\n  r.push('hello')\n  r.push('world')\n})\n\ntest('pipe with callback - error case node stream', { skip: !compat }, function (t) {\n  t.plan(2)\n\n  const r = new Readable()\n  const w = new compat.Writable({\n    write(data, enc, cb) {\n      cb(new Error('blerg'))\n    }\n  })\n\n  r.pipe(w, function (err) {\n    t.pass('callback called')\n    t.alike(err, new Error('blerg'))\n  })\n\n  r.push('hello')\n  r.push('world')\n  r.push(null)\n})\n\ntest('simple pipe', function (t) {\n  t.plan(2)\n\n  const buffered = []\n\n  const r = new Readable()\n  const w = new Writable({\n    write(data, cb) {\n      buffered.push(data)\n      cb(null)\n    },\n\n    final() {\n      t.pass('final called')\n      
t.alike(buffered, ['hello', 'world'])\n    }\n  })\n\n  r.pipe(w)\n\n  r.push('hello')\n  r.push('world')\n  r.push(null)\n})\n\ntest('pipe with callback', function (t) {\n  t.plan(3)\n\n  const buffered = []\n\n  const r = new Readable()\n  const w = new Writable({\n    write(data, cb) {\n      buffered.push(data)\n      cb(null)\n    }\n  })\n\n  r.pipe(w, function (err) {\n    t.pass('callback called')\n    t.is(err, null)\n    t.alike(buffered, ['hello', 'world'])\n  })\n\n  r.push('hello')\n  r.push('world')\n  r.push(null)\n})\n\ntest('pipe continues if read is \"blocked\"', function (t) {\n  t.plan(1)\n\n  let written = 0\n  let read = 0\n\n  const r = new Readable({\n    read(cb) {\n      this.push('test')\n\n      if (++read === 20) {\n        setTimeout(done, 10)\n        return\n      }\n\n      cb(null)\n    }\n  })\n\n  const w = new Writable({\n    write(data, cb) {\n      written++\n      cb(null)\n    }\n  })\n\n  r.pipe(w)\n\n  function done() {\n    t.is(written, read)\n  }\n})\n"
  },
  {
    "path": "test/pipeline.js",
    "content": "const test = require('brittle')\nconst { pipeline, pipelinePromise, Transform, Readable, Writable } = require('../')\n\ntest('piping to a writable', function (t) {\n  t.plan(2)\n\n  const w = pipeline(\n    Readable.from('hello'),\n    new Writable({\n      write(data, cb) {\n        t.is(data, 'hello')\n        cb()\n      }\n    })\n  )\n  w.on('close', () => t.pass('closed'))\n})\n\ntest('piping with error', function (t) {\n  t.plan(1)\n\n  const r = new Readable()\n  const w = new Writable()\n  const err = new Error()\n  pipeline(r, w, (error) => {\n    t.alike(error, err)\n  })\n  r.destroy(err)\n})\n\ntest('piping with final callback', function (t) {\n  t.plan(2)\n\n  pipeline(\n    Readable.from('hello'),\n    new Writable({\n      write(data, cb) {\n        t.is(data, 'hello')\n        cb()\n      }\n    }),\n    () => t.pass('ended')\n  )\n})\n\ntest('piping with transform stream in between', function (t) {\n  t.plan(2)\n\n  pipeline(\n    [\n      Readable.from('hello'),\n      new Transform({\n        transform(input, cb) {\n          this.push(input.length)\n          cb()\n        }\n      }),\n      new Writable({\n        write(data, cb) {\n          t.is(data, 5)\n          cb()\n        }\n      })\n    ],\n    () => t.pass('ended')\n  )\n})\n\ntest('piping to a writable + promise', async function (t) {\n  t.plan(2)\n\n  const r = Readable.from('hello')\n  let closed = false\n  r.on('close', () => {\n    closed = true\n  })\n  await pipelinePromise(\n    r,\n    new Writable({\n      write(data, cb) {\n        t.is(data, 'hello')\n        cb()\n      }\n    })\n  )\n  t.ok(closed)\n})\n"
  },
  {
    "path": "test/readable.js",
    "content": "const test = require('brittle')\nconst b4a = require('b4a')\nconst { Readable, isDisturbed } = require('../')\n\ntest('ondata', function (t) {\n  t.plan(4)\n\n  const r = new Readable()\n  const buffered = []\n  let ended = 0\n\n  r.push('hello')\n  r.push('world')\n  r.push(null)\n\n  r.on('data', (data) => buffered.push(data))\n  r.on('end', () => ended++)\n  r.on('close', function () {\n    t.pass('closed')\n    t.alike(buffered, ['hello', 'world'])\n    t.is(ended, 1)\n    t.ok(r.destroyed)\n  })\n})\n\ntest('pause', async function (t) {\n  const r = new Readable()\n  const buffered = []\n  t.is(Readable.isPaused(r), true, 'starting off paused')\n  r.on('data', (data) => buffered.push(data))\n  r.on('close', () => t.end())\n  r.push('hello')\n  await nextImmediate()\n  t.is(r.pause(), r, '.pause() returns self')\n  t.is(Readable.isPaused(r), true, '.pause() marks stream as paused')\n  r.push('world')\n  await nextImmediate()\n  t.alike(buffered, ['hello'], '.pause() prevents data to be read')\n  t.is(r.resume(), r, '.resume() returns self')\n  t.is(Readable.isPaused(r), false, '.resume() marks stream as resumed')\n  await nextImmediate()\n  t.alike(buffered, ['hello', 'world'])\n  r.push(null)\n})\n\ntest('resume', function (t) {\n  t.plan(3)\n\n  const r = new Readable()\n  let ended = 0\n\n  r.push('hello')\n  r.push('world')\n  r.push(null)\n\n  r.resume()\n  r.on('end', () => ended++)\n  r.on('close', function () {\n    t.pass('closed')\n    t.is(ended, 1)\n    t.ok(r.destroyed)\n  })\n})\n\ntest('lazy open', async function (t) {\n  let opened = false\n  const r = new Readable({\n    open(cb) {\n      opened = true\n      cb(null)\n    }\n  })\n  await nextImmediate()\n  t.absent(opened)\n  r.read()\n  await nextImmediate()\n  t.ok(opened)\n})\n\ntest('eager open', async function (t) {\n  let opened = false\n  const r = new Readable({\n    open(cb) {\n      opened = true\n      cb(null)\n    },\n    eagerOpen: true\n  })\n  await 
nextImmediate()\n  t.ok(opened)\n  r.push(null)\n})\n\ntest('shorthands', function (t) {\n  t.plan(3)\n\n  const r = new Readable({\n    read(cb) {\n      this.push('hello')\n      cb(null)\n    },\n    destroy(cb) {\n      t.pass('destroyed')\n      cb(null)\n    }\n  })\n\n  r.once('readable', function () {\n    t.is(r.read(), 'hello')\n    r.destroy()\n    t.is(r.read(), null)\n  })\n})\n\ntest('both push and the cb need to be called for re-reads', function (t) {\n  t.plan(2)\n\n  let once = true\n\n  const r = new Readable({\n    read(cb) {\n      t.ok(once, 'read called once')\n      once = false\n      cb(null)\n    }\n  })\n\n  r.resume()\n\n  setTimeout(function () {\n    once = true\n    r.push('hi')\n  }, 100)\n})\n\ntest('from array', function (t) {\n  t.plan(1)\n\n  const inc = []\n  const r = Readable.from([1, 2, 3])\n  r.on('data', (data) => inc.push(data))\n  r.on('end', function () {\n    t.alike(inc, [1, 2, 3])\n  })\n})\n\ntest('from buffer', function (t) {\n  t.plan(1)\n\n  const inc = []\n  const r = Readable.from(Buffer.from('hello'))\n  r.on('data', (data) => inc.push(data))\n  r.on('end', function () {\n    t.alike(inc, [Buffer.from('hello')])\n  })\n})\n\ntest('from async iterator', function (t) {\n  t.plan(1)\n\n  async function* test() {\n    yield 1\n    yield 2\n    yield 3\n  }\n\n  const inc = []\n  const r = Readable.from(test())\n  r.on('data', (data) => inc.push(data))\n  r.on('end', function () {\n    t.alike(inc, [1, 2, 3])\n  })\n})\n\ntest('from array with highWaterMark', function (t) {\n  const r = Readable.from([1, 2, 3], { highWaterMark: 1 })\n  t.is(r._readableState.highWaterMark, 1)\n})\n\ntest('from async iterator with highWaterMark', function (t) {\n  async function* test() {\n    yield 1\n  }\n\n  const r = Readable.from(test(), { highWaterMark: 1 })\n  t.is(r._readableState.highWaterMark, 1)\n})\n\ntest('unshift', async function (t) {\n  const r = new Readable()\n  r.pause()\n  r.push(1)\n  r.push(2)\n  r.unshift(0)\n  
r.push(null)\n  const inc = []\n  for await (const entry of r) {\n    inc.push(entry)\n  }\n  t.alike(inc, [0, 1, 2])\n})\n\ntest('from readable should return the original readable', function (t) {\n  const r = new Readable()\n  t.is(Readable.from(r), r)\n})\n\ntest('map readable data', async function (t) {\n  const r = new Readable({\n    map: (input) => JSON.parse(input)\n  })\n  r.push('{ \"foo\": 1 }')\n  for await (const obj of r) {\n    // eslint-disable-line\n    t.alike(obj, { foo: 1 })\n    break\n  }\n})\n\ntest('use mapReadable to map data', async function (t) {\n  const r = new Readable({\n    map: () => t.fail('.mapReadable has priority'),\n    mapReadable: (input) => JSON.parse(input)\n  })\n  r.push('{ \"foo\": 1 }')\n  for await (const obj of r) {\n    // eslint-disable-line\n    t.alike(obj, { foo: 1 })\n    break\n  }\n})\n\ntest('live stream', function (t) {\n  t.plan(3)\n\n  const r = new Readable({\n    read(cb) {\n      this.push('data')\n      this.push('data')\n      this.push('data')\n      // assume cb is called way later\n    }\n  })\n\n  r.on('data', function (data) {\n    t.is(data, 'data')\n  })\n})\n\ntest('live stream with readable', function (t) {\n  t.plan(3)\n\n  const r = new Readable({\n    read(cb) {\n      this.push('data')\n      this.push('data')\n      this.push('data')\n      // assume cb is called way later\n    }\n  })\n\n  r.on('readable', function () {\n    let data\n    while ((data = r.read()) !== null) t.is(data, 'data')\n  })\n})\n\ntest('resume a stalled stream', function (t) {\n  t.plan(1)\n\n  const expected = []\n  let once = true\n\n  const r = new Readable({\n    read(cb) {\n      if (once) {\n        once = false\n        this.push('data')\n        expected.push('data')\n        return cb()\n      }\n\n      for (let i = 0; i < 20; i++) {\n        this.push('data')\n        expected.push('data')\n      }\n\n      // pretend it's stalled\n    }\n  })\n\n  const collected = []\n\n  r.once('data', function 
(data) {\n    r.pause()\n    collected.push(data)\n    setImmediate(() => {\n      r.on('data', function (data) {\n        collected.push(data)\n        if (collected.length === 21) {\n          t.alike(collected, expected)\n        }\n      })\n      r.resume()\n    })\n  })\n})\n\ntest('no read-ahead with pause/resume', function (t) {\n  t.plan(4)\n\n  let tick = 0\n\n  const r = new Readable({\n    highWaterMark: 0,\n    read(cb) {\n      this.push('tick: ' + ++tick)\n      cb()\n    }\n  })\n\n  r.once('data', function () {\n    t.is(tick, 1)\n    r.pause()\n    setImmediate(() => {\n      t.is(tick, 1)\n      r.resume()\n      r.once('data', function () {\n        t.is(tick, 2)\n        r.pause()\n        setImmediate(() => {\n          t.is(tick, 2)\n        })\n      })\n    })\n  })\n})\n\ntest('no read-ahead with async iterator', async function (t) {\n  let tick = 0\n\n  const r = new Readable({\n    highWaterMark: 0,\n    read(cb) {\n      this.push('tick: ' + ++tick)\n      if (tick === 10) this.push(null)\n      cb()\n    }\n  })\n\n  let expectedTick = 0\n  for await (const data of r) {\n    t.is(tick, ++expectedTick)\n    t.is(data, 'tick: ' + tick)\n    await nextImmediate()\n  }\n\n  t.is(expectedTick, 10)\n})\n\ntest('setEncoding', async function (t) {\n  const r = new Readable()\n\n  r.setEncoding('utf-8')\n  const buffer = b4a.from('hællå wørld!')\n  for (let i = 0; i < buffer.byteLength; i++) {\n    r.push(buffer.subarray(i, i + 1))\n  }\n  r.push(null)\n  const expected = b4a.toString(buffer).split('')\n  for await (const data of r) {\n    t.is(data, expected.shift())\n  }\n  t.is(expected.length, 0)\n})\n\ntest('setEncoding respects existing map', function (t) {\n  t.plan(1)\n\n  const r = new Readable({\n    encoding: 'utf-8',\n    map(data) {\n      return JSON.parse(data)\n    }\n  })\n\n  r.push('{\"hello\":\"world\"}')\n  r.once('data', function (data) {\n    t.alike(data, { hello: 'world' })\n  })\n})\n\ntest('setEncoding empty string', 
async function (t) {\n  t.plan(1)\n\n  const r = new Readable()\n\n  r.setEncoding('utf-8')\n  const buffer = b4a.from('')\n  r.push(buffer)\n  r.push(null)\n\n  for await (const data of r) {\n    t.is(data, '')\n  }\n})\n\ntest('is disturbed', function (t) {\n  const r = new Readable()\n  t.is(isDisturbed(r), false)\n\n  r.push('hello')\n  t.is(isDisturbed(r), false)\n\n  r.resume()\n  t.is(isDisturbed(r), true)\n\n  r.pause()\n  t.is(isDisturbed(r), true)\n})\n\ntest('is disturbed after immediate destroy', function (t) {\n  t.plan(3)\n\n  const r = new Readable()\n  t.is(isDisturbed(r), false)\n\n  r.destroy()\n  t.is(isDisturbed(r), true)\n\n  r.on('close', () => t.is(isDisturbed(r), true))\n})\n\nfunction nextImmediate() {\n  return new Promise((resolve) => setImmediate(resolve))\n}\n"
  },
  {
    "path": "test/transform.js",
    "content": "const test = require('brittle')\nconst { Transform } = require('../')\n\ntest('default transform teardown when saturated', async function (t) {\n  const stream = new Transform({\n    transform(data, cb) {\n      cb(null, data)\n    }\n  })\n\n  for (let i = 0; i < 20; i++) {\n    stream.write('hello')\n  }\n\n  await new Promise((resolve) => setImmediate(resolve))\n\n  stream.destroy()\n\n  await new Promise((resolve) => stream.on('close', resolve))\n\n  t.pass('close fired')\n})\n"
  },
  {
    "path": "test/writable.js",
    "content": "const test = require('brittle')\nconst { Writable, Duplex } = require('../')\n\ntest('opens before writes', function (t) {\n  t.plan(2)\n\n  const trace = []\n  const stream = new Writable({\n    open(cb) {\n      trace.push('open')\n      return cb(null)\n    },\n    write(data, cb) {\n      trace.push('write')\n      return cb(null)\n    }\n  })\n  stream.on('close', () => {\n    t.is(trace.length, 2)\n    t.is(trace[0], 'open')\n  })\n  stream.write('data')\n  stream.end()\n})\n\ntest('drain', function (t) {\n  t.plan(2)\n\n  const stream = new Writable({\n    highWaterMark: 1,\n    write(data, cb) {\n      cb(null)\n    }\n  })\n\n  t.absent(stream.write('a'))\n  stream.on('drain', function () {\n    t.pass('drained')\n  })\n})\n\ntest('drain multi write', function (t) {\n  t.plan(4)\n\n  const stream = new Writable({\n    highWaterMark: 1,\n    write(data, cb) {\n      cb(null)\n    }\n  })\n\n  t.absent(stream.write('a'))\n  t.absent(stream.write('a'))\n  t.absent(stream.write('a'))\n  stream.on('drain', function () {\n    t.pass('drained')\n  })\n})\n\ntest('drain async write', function (t) {\n  t.plan(3)\n\n  let flushed = false\n\n  const stream = new Writable({\n    highWaterMark: 1,\n    write(data, cb) {\n      setImmediate(function () {\n        flushed = true\n        cb(null)\n      })\n    }\n  })\n\n  t.absent(stream.write('a'))\n  t.absent(flushed)\n  stream.on('drain', function () {\n    t.ok(flushed)\n  })\n})\n\ntest('writev', function (t) {\n  t.plan(3)\n\n  const expected = [[], ['ho']]\n\n  const s = new Writable({\n    writev(batch, cb) {\n      t.alike(batch, expected.shift())\n      cb(null)\n    }\n  })\n\n  for (let i = 0; i < 100; i++) {\n    expected[0].push('hi-' + i)\n    s.write('hi-' + i)\n  }\n\n  s.on('drain', function () {\n    s.write('ho')\n    s.end()\n  })\n\n  s.on('finish', function () {\n    t.pass('finished')\n  })\n})\n\ntest('map written data', function (t) {\n  t.plan(2)\n\n  const r = new 
Writable({\n    write(data, cb) {\n      t.is(data, '{\"foo\":1}')\n      cb()\n    },\n    map: (input) => JSON.stringify(input)\n  })\n  r.on('finish', () => {\n    t.pass('finished')\n  })\n  r.write({ foo: 1 })\n  r.end()\n})\n\ntest('use mapWritable to map data', function (t) {\n  t.plan(2)\n\n  const r = new Writable({\n    write(data, cb) {\n      t.is(data, '{\"foo\":1}')\n      cb()\n    },\n    map: () => t.fail('.mapWritable has priority'),\n    mapWritable: (input) => JSON.stringify(input)\n  })\n  r.on('finish', () => {\n    t.pass('finished')\n  })\n  r.write({ foo: 1 })\n  r.end()\n})\n\ntest('many ends', function (t) {\n  t.plan(2)\n\n  let finals = 0\n  let finish = 0\n\n  const s = new Duplex({\n    final(cb) {\n      finals++\n      cb(null)\n    }\n  })\n\n  s.end()\n  queueMicrotask(() => {\n    s.end()\n    queueMicrotask(() => {\n      s.end()\n    })\n  })\n\n  s.on('finish', function () {\n    finish++\n    t.is(finals, 1)\n    t.is(finish, 1)\n  })\n})\n\ntest('drained helper', async function (t) {\n  const w = new Writable({\n    write(data, cb) {\n      setImmediate(cb)\n    }\n  })\n\n  for (let i = 0; i < 20; i++) w.write('hi')\n\n  await Writable.drained(w)\n\n  t.is(w._writableState.queue.length, 0)\n\n  for (let i = 0; i < 20; i++) w.write('hi')\n\n  const d1 = Writable.drained(w)\n\n  for (let i = 0; i < 20; i++) w.write('hi')\n\n  const d2 = Writable.drained(w)\n\n  d1.then(() => {\n    t.not(w._writableState.queue.length, 0, 'future writes are queued')\n  })\n\n  d2.then(() => {\n    t.is(w._writableState.queue.length, 0, 'all drained now')\n  })\n\n  await d1\n  await d2\n\n  await Writable.drained(w)\n\n  t.pass('works if no writes are pending')\n\n  for (let i = 0; i < 20; i++) w.write('hi')\n\n  const d3 = Writable.drained(w)\n  w.destroy()\n\n  t.absent(await d3)\n  t.absent(await Writable.drained(w), 'already destroyed')\n})\n\ntest('drained helper, duplex', async function (t) {\n  const w = new Duplex({\n    write(data, 
cb) {\n      setImmediate(cb)\n    }\n  })\n\n  for (let i = 0; i < 20; i++) w.write('hi')\n\n  await Writable.drained(w)\n\n  t.is(w._writableState.queue.length, 0)\n\n  for (let i = 0; i < 20; i++) w.write('hi')\n\n  const d1 = Writable.drained(w)\n\n  for (let i = 0; i < 20; i++) w.write('hi')\n\n  const d2 = Writable.drained(w)\n\n  d1.then(() => {\n    t.not(w._writableState.queue.length, 0, 'future writes are queued')\n  })\n\n  d2.then(() => {\n    t.is(w._writableState.queue.length, 0, 'all drained now')\n  })\n\n  await d1\n  await d2\n\n  await Writable.drained(w)\n\n  t.pass('works if no writes are pending')\n\n  for (let i = 0; i < 20; i++) w.write('hi')\n\n  const d3 = Writable.drained(w)\n  w.destroy()\n\n  t.absent(await d3)\n  t.absent(await Writable.drained(w), 'already destroyed')\n})\n\ntest('drained helper, inflight write', async function (t) {\n  let writing = false\n  const w = new Writable({\n    write(data, cb) {\n      writing = true\n      setImmediate(() => {\n        setImmediate(() => {\n          writing = false\n          cb()\n        })\n      })\n    }\n  })\n\n  w.write('hello')\n  w.end()\n\n  await new Promise((resolve) => setImmediate(resolve))\n  t.ok(writing, 'is writing')\n  await Writable.drained(w)\n  t.absent(writing, 'not writing')\n})\n\ntest('drained helper, writev', async function (t) {\n  let writing = 0\n  let continueWrite\n\n  const wrote = new Promise((resolve) => {\n    continueWrite = resolve\n  })\n\n  const w = new Writable({\n    writev(datas, cb) {\n      continueWrite()\n      setImmediate(() => {\n        writing -= datas.length\n        cb()\n      })\n    }\n  })\n\n  for (let i = 0; i < 10; i++) {\n    writing++\n    w.write('hello')\n  }\n\n  w.end()\n\n  await wrote\n  t.ok(writing > 0, 'is writing')\n  await Writable.drained(w)\n  t.ok(writing === 0, 'not writing')\n})\n\ntest('drained helper, writev, already flushed', async function (t) {\n  const w = new Writable({\n    writev(datas, cb) {\n      
cb()\n    }\n  })\n\n  await Writable.drained(w)\n  t.pass('drained resolved')\n})\n\ntest('can cork and uncork the stream', async function (t) {\n  const w = new Writable({\n    writev(batch, cb) {\n      t.alike(batch, [1, 2, 3])\n      cb(null)\n    }\n  })\n\n  w.cork()\n  w.write(1)\n  await eventFlush()\n  w.write(2)\n  await eventFlush()\n  w.write(3)\n  w.uncork()\n\n  await Writable.drained(w)\n})\n\nfunction eventFlush() {\n  return new Promise((resolve) => setImmediate(resolve))\n}\n"
  }
]