Full Code of mafintosh/streamx for AI

Repository: mafintosh/streamx
Branch: master
Commit: 77053f55b0f9
Files: 26
Total size: 89.5 KB

Directory structure:
gitextract_6870msq2/

├── .gitattributes
├── .github/
│   └── workflows/
│       └── test.yml
├── .gitignore
├── .prettierrc
├── LICENSE
├── README.md
├── SECURITY.md
├── examples/
│   └── fs.js
├── index.js
├── lib/
│   └── errors.js
├── package.json
└── test/
    ├── all.js
    ├── async-iterator.js
    ├── backpressure.js
    ├── byte-length.js
    ├── compat.js
    ├── destroy.js
    ├── duplex.js
    ├── errors.js
    ├── get-stream-error.js
    ├── passthrough.js
    ├── pipe.js
    ├── pipeline.js
    ├── readable.js
    ├── transform.js
    └── writable.js

================================================
FILE CONTENTS
================================================

================================================
FILE: .gitattributes
================================================
* text=auto eol=lf


================================================
FILE: .github/workflows/test.yml
================================================
name: Build Status
on:
  push:
    branches:
      - master
  pull_request:
    branches:
      - master
jobs:
  build:
    strategy:
      matrix:
        node-version: [lts/*]
        os: [ubuntu-latest, macos-latest, windows-latest]
    runs-on: ${{ matrix.os }}
    steps:
      - uses: actions/checkout@v2
      - name: Use Node.js ${{ matrix.node-version }}
        uses: actions/setup-node@v2
        with:
          node-version: ${{ matrix.node-version }}
      - run: npm install
      - run: npm test


================================================
FILE: .gitignore
================================================
node_modules
package-lock.json
coverage
sandbox.js
sandbox/
*.cpy


================================================
FILE: .prettierrc
================================================
"prettier-config-holepunch"


================================================
FILE: LICENSE
================================================
The MIT License (MIT)

Copyright (c) 2019 Mathias Buus

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in
all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
THE SOFTWARE.


================================================
FILE: README.md
================================================
# streamx

An iteration of the Node.js core streams with a series of improvements.

```
npm install streamx
```

[![Build Status](https://github.com/streamxorg/streamx/workflows/Build%20Status/badge.svg)](https://github.com/streamxorg/streamx/actions?query=workflow%3A%22Build+Status%22)

## Main improvements from Node.js core stream

#### Proper lifecycle support

Streams have an `_open` function that is called before any read/write operation and a `_destroy`
function that is always run as the last part of the stream.

This makes it easy to maintain state.

#### Easy error handling

Fully integrated `.destroy()` support. When called, the stream waits for any
pending operation to finish and then runs the stream's destroy logic.

Close is _always_ the last event emitted and `destroy` is always run.

#### `pipe()` error handling

`pipe` accepts a callback that is called when the pipeline is fully drained.
It also error handles the streams provided and destroys both streams if either
of them fails.

#### All streams are both binary and object mode streams

A `map` function can be provided to map your input data into buffers
or other formats. To indicate how much buffer space each data item takes,
a `byteLength` function can be provided as well.

This removes the need for two modes of streams.

#### Simplicity

This is a full rewrite, all contained in one file.

Lots of stream methods are simplified based on how I and devs I work with actually use streams in the wild.

#### Backwards compat

streamx aims to be compatible with Node.js streams whenever it is reasonable to do so.

This means that streamx streams behave a lot like Node.js streams from the outside but still provides the
improvements above.

#### Smaller browser footprint

streamx has a much smaller footprint when compiled for the browser:

```
$ for x in stream{,x}; do echo $x: $(browserify -r $x | wc -c) bytes; done
stream: 173844 bytes
streamx: 46943 bytes
```

With optimizations turned on, the difference is even more stark:

```
$ for x in stream{,x}; do echo $x: $(browserify -r $x -p tinyify | wc -c) bytes; done
stream: 62649 bytes
streamx: 8460 bytes
$ for x in stream{,x}; do echo $x: $(browserify -r $x -p tinyify | gzip | wc -c) "bytes (gzipped)"; done
stream: 18053 bytes (gzipped)
streamx: 2806 bytes (gzipped)
```

#### AbortSignal support

To make it easier to integrate streams into an `async/await` flow, all streams support a `signal` option
that accepts an [`AbortSignal`](https://developer.mozilla.org/en-US/docs/Web/API/AbortSignal) as an
alternative means to `.destroy` streams.

## Usage

```js
const { Readable } = require('streamx')

const rs = new Readable({
  read(cb) {
    this.push('Cool data')
    cb(null)
  }
})

rs.on('data', (data) => console.log('data:', data))
```

## API

This streamx package contains 4 streams similar to Node.js core.

## Readable Stream

#### `rs = new stream.Readable([options])`

Create a new readable stream.

Options include:

```
{
  highWaterMark: 16384, // max buffer size in bytes
  map: (data) => data, // optional function to map input data
  byteLength: (data) => size, // optional function that calculates the byte size of input data
  signal: abortController.signal, // optional AbortSignal that triggers `.destroy` on `abort`
  eagerOpen: false // eagerly open the stream
}
```

In addition you can pass the `open`, `read`, and `destroy` functions as shorthands in
the constructor instead of overwriting the methods below.

The default `byteLength` function returns the byte length of buffers and `1024`
for any other value. With the defaults, a full buffer therefore holds around 16
non-buffer items or 16 KB worth of buffers.

If you set highWaterMark to `0` then all read-ahead buffering on the stream
is disabled and `_read` is only called when a user reads, rather than ahead of time.

#### `rs._read(cb)`

This function is called when the stream wants you to push new data.
Overwrite this and add your own read logic.
You should call the callback when you are fully done with the read.

Can also be set using `options.read` in the constructor.

Note that this function differs from Node.js streams in that it takes
the "read finished" callback.

#### `drained = rs.push(data)`

Push new data to the stream. Returns true if the buffer is not full
and you should push more data if you can.

If you call `rs.push(null)` you signal to the stream that no more
data will be pushed and that you want to end the stream.

#### `data = rs.read()`

Read a piece of data from the stream buffer. If the buffer is currently empty
`null` will be returned and you should wait for `readable` to be emitted before
trying again. If the stream has been ended it will also return `null`.

Note that this method differs from Node.js streams in that it does not accept
an optional number of bytes to consume.

#### `rs.unshift(data)`

Add a piece of data to the front of the buffer. Use this if you read too much
data using the `rs.read()` function.

#### `rs._open(cb)`

This function is called once before the first read is issued. Use this function
to implement your own open logic.

Can also be set using `options.open` in the constructor.

#### `rs._destroy(cb)`

This function is called just before the stream is fully destroyed. You should
use this to implement whatever teardown logic you need. The final part of the
stream life cycle is always to call destroy itself so this function will always
be called whether or not the stream ends gracefully or forcefully.

Can also be set using `options.destroy` in the constructor.

Note that `_destroy` might be called without `_open` having been called,
in case no read was ever performed on the stream.

#### `rs._predestroy()`

A simple hook that is called as soon as the first `stream.destroy()` call is invoked.

Use this in case you need to cancel pending reads (if possible) instead of waiting for them to finish.

Can also be set using `options.predestroy` in the constructor.

#### `rs.destroy([error])`

Forcefully destroy the stream. Will call `_destroy` as soon as all pending reads have finished.
Once the stream is fully destroyed `close` will be emitted.

If you pass an error this error will be emitted just before `close` is, signifying a reason
as to why this stream was destroyed.

#### `rs.pause()`

Pauses the stream. You will only need to call this if you want to pause a resumed stream.

Returns this stream instance.

#### `rs.resume()`

Will start reading data from the stream as fast as possible.

If you do not call this, you need to consume data using the `read()` method,
the `pipe()` method, or a `data` handler.

If none of these options are used the stream will stay paused.

Returns this stream instance.

#### `rs.setEncoding(encoding)`

Set an encoding to change how data is interpreted. E.g. `utf-8`.

#### `bool = Readable.isPaused(rs)`

Returns `true` if the stream is paused, else `false`.

#### `writableStream = rs.pipe(writableStream, [callback])`

Efficiently pipe the readable stream to a writable stream (either a Node.js core stream or a stream from this package).
If you provide a callback, it is called when the pipeline has fully finished, with an optional error in case
it failed.

To cancel the pipeline destroy either of the streams.

#### `rs.on('readable')`

Emitted when data is pushed to the stream if the buffer was previously empty.

#### `rs.on('data', data)`

Emitted when data is being read from the stream. If you attach a data handler you are implicitly resuming the stream.

#### `rs.on('end')`

Emitted when the readable stream has ended and no data is left in its buffer.

#### `rs.on('close')`

Emitted when the readable stream has fully closed (i.e. its destroy function has completed)

#### `rs.on('error', err)`

Emitted if any of the stream operations fail with an error. `close` is always emitted right after this.

#### `rs.on('piping', dest)`

Emitted when the readable stream is piping to a destination.

#### `rs.on('open')`

Emitted after `rs._open(cb)` execution.

#### `rs.destroying`

Boolean property indicating whether or not this stream has started to be destroyed.

#### `rs.destroyed`

Boolean property indicating whether or not this stream has been destroyed.

#### `rs.readable`

Returns `true` if the stream is an active streamx readable stream. Returns `undefined` if not.

#### `bool = Readable.isBackpressured(rs)`

Static method to check if a readable stream is currently under backpressure.

#### `stream = Readable.from(arrayOrBufferOrStringOrAsyncIterator)`

Static method to turn an array or buffer or string or AsyncIterator into a readable stream.

## Writable Stream

#### `ws = new stream.Writable([options])`

Create a new writable stream.

Options include:

```
{
  highWaterMark: 16384, // max buffer size in bytes
  map: (data) => data, // optional function to map input data
  byteLength: (data) => size, // optional function that calculates the byte size of input data
  signal: abortController.signal // optional AbortSignal that triggers `.destroy` on `abort`
}
```

In addition you can pass the `open`, `write`, `final`, and `destroy` functions as shorthands in
the constructor instead of overwriting the methods below.

The default `byteLength` function returns the byte length of buffers and `1024`
for any other value. With the defaults, a full buffer therefore holds around 16
non-buffer items or 16 KB worth of buffers.

#### `ws._open(cb)`

This function is called once before the first write is issued. Use this function
to implement your own open logic.

Can also be set using `options.open` in the constructor.

#### `ws._destroy(cb)`

This function is called just before the stream is fully destroyed. You should
use this to implement whatever teardown logic you need. The final part of the
stream life cycle is always to call destroy itself so this function will always
be called whether or not the stream ends gracefully or forcefully.

Can also be set using `options.destroy` in the constructor.

Note that `_destroy` might be called without `_open` having been called,
in case no write was ever performed on the stream.

#### `ws._predestroy()`

A simple hook that is called as soon as the first `stream.destroy()` call is invoked.

Use this in case you need to cancel pending writes (if possible) instead of waiting for them to finish.

Can also be set using `options.predestroy` in the constructor.

#### `ws.destroy([error])`

Forcefully destroy the stream. Will call `_destroy` as soon as all pending writes have finished.
Once the stream is fully destroyed `close` will be emitted.

If you pass an error this error will be emitted just before `close` is, signifying a reason
as to why this stream was destroyed.

#### `drained = ws.write(data)`

Write a piece of data to the stream. Returns `true` if the stream buffer is not full and you
should keep writing to it if you can. If `false` is returned the stream will emit `drain`
once its buffer is fully drained.

#### `ws._write(data, callback)`

This function is called when the stream wants to write some data. Use this to implement your own
write logic. When done call the callback and the stream will call it again if more data exists in the buffer.

Can also be set using `options.write` in the constructor.

#### `ws._writev(batch, callback)`

Similar to `_write` but passes an array of all data in the current write buffer instead of the oldest one.
Useful if the destination you are writing the data to supports batching.

Can also be set using `options.writev` in the constructor.

#### `ws.end()`

Gracefully end the writable stream. Call this when you no longer want to write to the stream.

Once all writes have been fully drained `finish` will be emitted.

Returns this stream instance.

#### `ws.cork()`

Buffer written data in memory. Useful for accumulating small chunks of data into a batch to be consumed by `writev`.

#### `ws.uncork()`

Disable `ws.cork()`.

#### `ws._final(callback)`

This function is called just before `finish` is emitted, i.e. when all writes have flushed and `ws.end()`
has been called. Use this to implement any logic that should happen after all writes but before finish.

Can also be set using `options.final` in the constructor.

#### `ws.on('finish')`

Emitted when the stream has been ended and all writes have been drained.

#### `ws.on('close')`

Emitted when the writable stream has fully closed (i.e. its destroy function has completed)

#### `ws.on('error', err)`

Emitted if any of the stream operations fail with an error. `close` is always emitted right after this.

#### `ws.on('pipe', src)`

Emitted when a readable stream is being piped to the writable one.

#### `ws.on('open')`

Emitted after `ws._open(cb)` execution.

#### `ws.on('drain')`

Emitted after all data was drained if `ws.write(data)` returned `false`.

#### `ws.destroying`

Boolean property indicating whether or not this stream has started to be destroyed.

#### `ws.destroyed`

Boolean property indicating whether or not this stream has been destroyed.

#### `ws.writable`

Returns `true` if the stream is an active streamx writable stream. Returns `undefined` if not.

#### `bool = Writable.isBackpressured(ws)`

Static method to check if a writable stream is currently under backpressure.

#### `bool = await Writable.drained(ws)`

Static helper to wait for a stream to drain its currently queued writes.
Returns `true` if they were drained, or `false` if the stream was destroyed first.

## Duplex Stream

#### `s = new stream.Duplex([options])`

A duplex stream is a stream that is both readable and writable.

Since JS does not support multiple inheritance it inherits directly from Readable
but implements the Writable API as well.

If you want to provide only a map function for the readable side use `mapReadable` instead.
If you want to provide only a byteLength function for the readable side use `byteLengthReadable` instead.

Same goes for the writable side but using `mapWritable` and `byteLengthWritable` instead.

## Transform Stream

A Transform stream is a duplex stream with a `._transform` template method that lets you
asynchronously map the input to a different output.

The transform stream overrides the `_write` and `_read` operations of `Readable` and `Writable` but
still allows the setting of these options in the constructor. Usually it is unnecessary to pass
in `read` or `write`/`writev` or to override the corresponding `._read`, `._write` or `._writev` operation.

#### `ts = new stream.Transform([options])`

A transform stream is a duplex stream that maps the data written to it and emits that as readable data.

Has the same options as a duplex stream except you can provide a `transform` function also.

#### `ts._transform(data, callback)`

Transform the incoming data. Call `callback(null, mappedData)` or use `ts.push(mappedData)` to
return data to the readable side of the stream.

By default the transform function just re-emits the incoming data, making the stream act as a pass-through.

## Pipeline

`pipeline` lets you stream from a readable through a set of duplex streams into a final writable.

```js
const { pipeline, Readable, Transform, Writable } = require('streamx')
const lastStream = pipeline(
  Readable.from([1, 2, 3]),
  new Transform({
    transform (from, cb) {
      this.push(from.toString())
      cb()
    }
  }),
  new Writable({
    write (data, cb) {
      console.log(data)
      cb()
    }
  }),
  error => {
    // Callback once write has finished
  }
)
```

#### `lastStream = stream.pipeline(...streams, [done])`

Pipe all streams together and return the last stream piped to.
When the last stream finishes, the pipeline has ended successfully.

If any of the streams error, whether they are Node.js core streams
or streamx streams, all streams in the pipeline are shut down.

Optionally you can pass a done callback to know when the pipeline is done.

#### `promise = stream.pipelinePromise(...streams)`

Same as normal pipeline except instead of returning the last stream it returns
a promise representing the done callback. Note you should error handle this
promise if you use this version.

## Helpers

#### `bool = isStream(stream)`

#### `bool = isStreamx(stream)`

#### `bool = isDisturbed(stream)`

Indicates if the stream has been opened or started to be destroyed.

#### `bool = isEnding(stream)`

Indicates if a readable stream has started its ending process.

#### `bool = isEnded(stream)`

Indicates if a readable stream has ended, i.e. all of its reads have completed.

#### `bool = isFinishing(stream)`

Indicates if a writable stream has started its ending process.

#### `bool = isFinished(stream)`

Indicates if a writable stream has finished, i.e. all of its writes have completed.

#### `err = getStreamError(stream, [options])`

Returns the stream's error if it has errored, or `null` if the stream has no error.

## Utilities

Streamx aims to be minimal and stable. It therefore only contains a minimal set of utilities.
To help you discover other modules for building streamx apps, we link some useful ones here:
- [stream-composer](https://github.com/mafintosh/stream-composer) - Compose streams like Node's `stream.compose` and the `duplexify` and `pumpify` modules.
- [teex](https://github.com/mafintosh/teex) - Clone a readable stream into multiple new readable instances.
- [merge-readable](https://github.com/holepunchto/merge-readable) - Merge multiple readers into a new readable with optional mapping of data

## Contributing

If you want to help contribute to streamx, a good way to start is to help write more test
cases, compatibility tests, documentation, or performance benchmarks.

If in doubt open an issue :)

## License

MIT


================================================
FILE: SECURITY.md
================================================
## Security contact information

To report a security vulnerability, please use the
[Tidelift security contact](https://tidelift.com/security).
Tidelift will coordinate the fix and disclosure.


================================================
FILE: examples/fs.js
================================================
const fs = require('fs')
const { Writable, Readable } = require('../')

class FileWriteStream extends Writable {
  constructor(filename, mode) {
    super()
    this.filename = filename
    this.mode = mode
    this.fd = 0
  }

  _open(cb) {
    fs.open(this.filename, this.mode, (err, fd) => {
      if (err) return cb(err)
      this.fd = fd
      cb(null)
    })
  }

  _write(data, cb) {
    fs.write(this.fd, data, 0, data.length, null, (err, written) => {
      if (err) return cb(err)
      if (written !== data.length) return this._write(data.slice(written), cb)
      cb(null)
    })
  }

  _destroy(cb) {
    if (!this.fd) return cb()
    fs.close(this.fd, cb)
  }
}

class FileReadStream extends Readable {
  constructor(filename) {
    super()
    this.filename = filename
    this.fd = 0
  }

  _open(cb) {
    fs.open(this.filename, 'r', (err, fd) => {
      if (err) return cb(err)
      this.fd = fd
      cb(null)
    })
  }

  _read(cb) {
    let data = Buffer.alloc(16 * 1024)

    fs.read(this.fd, data, 0, data.length, null, (err, read) => {
      if (err) return cb(err)
      if (read !== data.length) data = data.slice(0, read)
      this.push(data.length ? data : null)
      cb(null)
    })
  }

  _destroy(cb) {
    if (!this.fd) return cb()
    fs.close(this.fd, cb)
  }
}

// copy this file as an example

const rs = new FileReadStream(__filename)
const ws = new FileWriteStream(`${__filename}.cpy`, 'w')

rs.pipe(ws, function (err) {
  console.log('file copied', err)
})


================================================
FILE: index.js
================================================
const { EventEmitter } = require('events-universal')
const FIFO = require('fast-fifo')
const TextDecoder = require('text-decoder')

const StreamError = require('./lib/errors.js')

// if we do a future major, expect queue microtask to be there always, for now a bit defensive
const qmt =
  typeof queueMicrotask === 'undefined' ? (fn) => global.process.nextTick(fn) : queueMicrotask

// 29 bits used total (4 from shared, 14 from read, and 11 from write)
const MAX = (1 << 29) - 1

// Shared state
const OPENING = 0b0001
const PREDESTROYING = 0b0010
const DESTROYING = 0b0100
const DESTROYED = 0b1000

const NOT_OPENING = MAX ^ OPENING
const NOT_PREDESTROYING = MAX ^ PREDESTROYING

// Read state (4 bit offset from shared state)
const READ_ACTIVE = 0b00000000000001 << 4
const READ_UPDATING = 0b00000000000010 << 4
const READ_PRIMARY = 0b00000000000100 << 4
const READ_QUEUED = 0b00000000001000 << 4
const READ_RESUMED = 0b00000000010000 << 4
const READ_PIPE_DRAINED = 0b00000000100000 << 4
const READ_ENDING = 0b00000001000000 << 4
const READ_EMIT_DATA = 0b00000010000000 << 4
const READ_EMIT_READABLE = 0b00000100000000 << 4
const READ_EMITTED_READABLE = 0b00001000000000 << 4
const READ_DONE = 0b00010000000000 << 4
const READ_NEXT_TICK = 0b00100000000000 << 4
const READ_NEEDS_PUSH = 0b01000000000000 << 4
const READ_READ_AHEAD = 0b10000000000000 << 4

// Combined read state
const READ_FLOWING = READ_RESUMED | READ_PIPE_DRAINED
const READ_ACTIVE_AND_NEEDS_PUSH = READ_ACTIVE | READ_NEEDS_PUSH
const READ_PRIMARY_AND_ACTIVE = READ_PRIMARY | READ_ACTIVE
const READ_EMIT_READABLE_AND_QUEUED = READ_EMIT_READABLE | READ_QUEUED
const READ_RESUMED_READ_AHEAD = READ_RESUMED | READ_READ_AHEAD

const READ_NOT_ACTIVE = MAX ^ READ_ACTIVE
const READ_NON_PRIMARY = MAX ^ READ_PRIMARY
const READ_NON_PRIMARY_AND_PUSHED = MAX ^ (READ_PRIMARY | READ_NEEDS_PUSH)
const READ_PUSHED = MAX ^ READ_NEEDS_PUSH
const READ_PAUSED = MAX ^ READ_RESUMED
const READ_NOT_QUEUED = MAX ^ (READ_QUEUED | READ_EMITTED_READABLE)
const READ_NOT_ENDING = MAX ^ READ_ENDING
const READ_PIPE_NOT_DRAINED = MAX ^ READ_FLOWING
const READ_NOT_NEXT_TICK = MAX ^ READ_NEXT_TICK
const READ_NOT_UPDATING = MAX ^ READ_UPDATING
const READ_NO_READ_AHEAD = MAX ^ READ_READ_AHEAD
const READ_PAUSED_NO_READ_AHEAD = MAX ^ READ_RESUMED_READ_AHEAD

// Write state (18 bit offset, 4 bit offset from shared state and 14 from read state)
const WRITE_ACTIVE = 0b00000000001 << 18
const WRITE_UPDATING = 0b00000000010 << 18
const WRITE_PRIMARY = 0b00000000100 << 18
const WRITE_QUEUED = 0b00000001000 << 18
const WRITE_UNDRAINED = 0b00000010000 << 18
const WRITE_DONE = 0b00000100000 << 18
const WRITE_EMIT_DRAIN = 0b00001000000 << 18
const WRITE_NEXT_TICK = 0b00010000000 << 18
const WRITE_WRITING = 0b00100000000 << 18
const WRITE_FINISHING = 0b01000000000 << 18
const WRITE_CORKED = 0b10000000000 << 18

const WRITE_NOT_ACTIVE = MAX ^ (WRITE_ACTIVE | WRITE_WRITING)
const WRITE_NON_PRIMARY = MAX ^ WRITE_PRIMARY
const WRITE_NOT_FINISHING = MAX ^ (WRITE_ACTIVE | WRITE_FINISHING)
const WRITE_DRAINED = MAX ^ WRITE_UNDRAINED
const WRITE_NOT_QUEUED = MAX ^ WRITE_QUEUED
const WRITE_NOT_NEXT_TICK = MAX ^ WRITE_NEXT_TICK
const WRITE_NOT_UPDATING = MAX ^ WRITE_UPDATING
const WRITE_NOT_CORKED = MAX ^ WRITE_CORKED

// Combined shared state
const ACTIVE = READ_ACTIVE | WRITE_ACTIVE
const NOT_ACTIVE = MAX ^ ACTIVE
const DONE = READ_DONE | WRITE_DONE
const DESTROY_STATUS = DESTROYING | DESTROYED | PREDESTROYING
const OPEN_STATUS = DESTROY_STATUS | OPENING
const AUTO_DESTROY = DESTROY_STATUS | DONE
const NON_PRIMARY = WRITE_NON_PRIMARY & READ_NON_PRIMARY
const ACTIVE_OR_TICKING = WRITE_NEXT_TICK | READ_NEXT_TICK
const TICKING = ACTIVE_OR_TICKING & NOT_ACTIVE
const IS_OPENING = OPEN_STATUS | TICKING

// Combined shared state and read state
const READ_PRIMARY_STATUS = OPEN_STATUS | READ_ENDING | READ_DONE
const READ_STATUS = OPEN_STATUS | READ_DONE | READ_QUEUED
const READ_ENDING_STATUS = OPEN_STATUS | READ_ENDING | READ_QUEUED
const READ_READABLE_STATUS = OPEN_STATUS | READ_EMIT_READABLE | READ_QUEUED | READ_EMITTED_READABLE
const SHOULD_NOT_READ =
  OPEN_STATUS | READ_ACTIVE | READ_ENDING | READ_DONE | READ_NEEDS_PUSH | READ_READ_AHEAD
const READ_BACKPRESSURE_STATUS = DESTROY_STATUS | READ_ENDING | READ_DONE
const READ_UPDATE_SYNC_STATUS = READ_UPDATING | OPEN_STATUS | READ_NEXT_TICK | READ_PRIMARY
const READ_NEXT_TICK_OR_OPENING = READ_NEXT_TICK | OPENING

// Combined write state
const WRITE_PRIMARY_STATUS = OPEN_STATUS | WRITE_FINISHING | WRITE_DONE
const WRITE_QUEUED_AND_UNDRAINED = WRITE_QUEUED | WRITE_UNDRAINED
const WRITE_QUEUED_AND_ACTIVE = WRITE_QUEUED | WRITE_ACTIVE
const WRITE_DRAIN_STATUS = WRITE_QUEUED | WRITE_UNDRAINED | OPEN_STATUS | WRITE_ACTIVE
const WRITE_STATUS = OPEN_STATUS | WRITE_ACTIVE | WRITE_QUEUED | WRITE_CORKED
const WRITE_PRIMARY_AND_ACTIVE = WRITE_PRIMARY | WRITE_ACTIVE
const WRITE_ACTIVE_AND_WRITING = WRITE_ACTIVE | WRITE_WRITING
const WRITE_FINISHING_STATUS = OPEN_STATUS | WRITE_FINISHING | WRITE_QUEUED_AND_ACTIVE | WRITE_DONE
const WRITE_BACKPRESSURE_STATUS = WRITE_UNDRAINED | DESTROY_STATUS | WRITE_FINISHING | WRITE_DONE
const WRITE_UPDATE_SYNC_STATUS = WRITE_UPDATING | OPEN_STATUS | WRITE_NEXT_TICK | WRITE_PRIMARY
const WRITE_DROP_DATA = WRITE_FINISHING | WRITE_DONE | DESTROY_STATUS

const asyncIterator = Symbol.asyncIterator || Symbol('asyncIterator')

class WritableState {
  constructor(
    stream,
    { highWaterMark = 16384, map = null, mapWritable, byteLength, byteLengthWritable } = {}
  ) {
    this.stream = stream
    this.queue = new FIFO()
    this.highWaterMark = highWaterMark
    this.buffered = 0
    this.error = null
    this.pipeline = null
    this.drains = null // if we add more seldomly used helpers we might move them into a subobject so it's a single ptr
    this.byteLength = byteLengthWritable || byteLength || defaultByteLength
    this.map = mapWritable || map
    this.afterWrite = afterWrite.bind(this)
    this.afterUpdateNextTick = updateWriteNT.bind(this)
  }

  get ending() {
    return (this.stream._duplexState & WRITE_FINISHING) !== 0
  }

  get ended() {
    return (this.stream._duplexState & WRITE_DONE) !== 0
  }

  push(data) {
    if ((this.stream._duplexState & WRITE_DROP_DATA) !== 0) return false
    if (this.map !== null) data = this.map(data)

    this.buffered += this.byteLength(data)
    this.queue.push(data)

    if (this.buffered < this.highWaterMark) {
      this.stream._duplexState |= WRITE_QUEUED
      return true
    }

    this.stream._duplexState |= WRITE_QUEUED_AND_UNDRAINED
    return false
  }

  shift() {
    const data = this.queue.shift()

    this.buffered -= this.byteLength(data)
    if (this.buffered === 0) this.stream._duplexState &= WRITE_NOT_QUEUED

    return data
  }

  end(data) {
    if (typeof data === 'function') {
      this.stream.once('finish', data)
    } else if (data !== undefined && data !== null) {
      this.push(data)
    }

    this.stream._duplexState = (this.stream._duplexState | WRITE_FINISHING) & WRITE_NON_PRIMARY
  }

  autoBatch(data, cb) {
    const buffer = []
    const stream = this.stream

    buffer.push(data)
    while ((stream._duplexState & WRITE_STATUS) === WRITE_QUEUED_AND_ACTIVE) {
      buffer.push(stream._writableState.shift())
    }

    if ((stream._duplexState & OPEN_STATUS) !== 0) return cb(null)
    stream._writev(buffer, cb)
  }

  update() {
    const stream = this.stream

    stream._duplexState |= WRITE_UPDATING

    do {
      while ((stream._duplexState & WRITE_STATUS) === WRITE_QUEUED) {
        const data = this.shift()
        stream._duplexState |= WRITE_ACTIVE_AND_WRITING
        stream._write(data, this.afterWrite)
      }

      if ((stream._duplexState & WRITE_PRIMARY_AND_ACTIVE) === 0) this.updateNonPrimary()
    } while (this.continueUpdate() === true)

    stream._duplexState &= WRITE_NOT_UPDATING
  }

  updateNonPrimary() {
    const stream = this.stream

    if ((stream._duplexState & WRITE_FINISHING_STATUS) === WRITE_FINISHING) {
      stream._duplexState = stream._duplexState | WRITE_ACTIVE
      stream._final(afterFinal.bind(this))
      return
    }

    if ((stream._duplexState & DESTROY_STATUS) === DESTROYING) {
      if ((stream._duplexState & ACTIVE_OR_TICKING) === 0) {
        stream._duplexState |= ACTIVE
        stream._destroy(afterDestroy.bind(this))
      }
      return
    }

    if ((stream._duplexState & IS_OPENING) === OPENING) {
      stream._duplexState = (stream._duplexState | ACTIVE) & NOT_OPENING
      stream._open(afterOpen.bind(this))
    }
  }

  continueUpdate() {
    if ((this.stream._duplexState & WRITE_NEXT_TICK) === 0) return false
    this.stream._duplexState &= WRITE_NOT_NEXT_TICK
    return true
  }

  updateCallback() {
    if ((this.stream._duplexState & WRITE_UPDATE_SYNC_STATUS) === WRITE_PRIMARY) {
      this.update()
    } else {
      this.updateNextTick()
    }
  }

  updateNextTick() {
    if ((this.stream._duplexState & WRITE_NEXT_TICK) !== 0) return
    this.stream._duplexState |= WRITE_NEXT_TICK
    if ((this.stream._duplexState & WRITE_UPDATING) === 0) qmt(this.afterUpdateNextTick)
  }
}

class ReadableState {
  constructor(
    stream,
    { highWaterMark = 16384, map = null, mapReadable, byteLength, byteLengthReadable } = {}
  ) {
    this.stream = stream
    this.queue = new FIFO()
    this.highWaterMark = highWaterMark === 0 ? 1 : highWaterMark
    this.buffered = 0
    this.readAhead = highWaterMark > 0
    this.error = null
    this.pipeline = null
    this.byteLength = byteLengthReadable || byteLength || defaultByteLength
    this.map = mapReadable || map
    this.pipeTo = null
    this.afterRead = afterRead.bind(this)
    this.afterUpdateNextTick = updateReadNT.bind(this)
  }

  get ending() {
    return (this.stream._duplexState & READ_ENDING) !== 0
  }

  get ended() {
    return (this.stream._duplexState & READ_DONE) !== 0
  }

  pipe(pipeTo, cb) {
    if (this.pipeTo !== null) throw new Error('Can only pipe to one destination')
    if (typeof cb !== 'function') cb = null

    this.stream._duplexState |= READ_PIPE_DRAINED
    this.pipeTo = pipeTo
    this.pipeline = new Pipeline(this.stream, pipeTo, cb)

    if (cb) this.stream.on('error', noop) // We already handle this error, so suppress crashes

    if (isStreamx(pipeTo)) {
      pipeTo._writableState.pipeline = this.pipeline
      if (cb) pipeTo.on('error', noop) // We already handle this error, so suppress crashes
      pipeTo.on('finish', this.pipeline.finished.bind(this.pipeline)) // TODO: just call finished from pipeTo itself
    } else {
      const onerror = this.pipeline.done.bind(this.pipeline, pipeTo)
      const onclose = this.pipeline.done.bind(this.pipeline, pipeTo, null) // onclose has a weird bool arg
      pipeTo.on('error', onerror)
      pipeTo.on('close', onclose)
      pipeTo.on('finish', this.pipeline.finished.bind(this.pipeline))
    }

    pipeTo.on('drain', afterDrain.bind(this))
    this.stream.emit('piping', pipeTo)
    pipeTo.emit('pipe', this.stream)
  }

  push(data) {
    const stream = this.stream

    if (data === null) {
      this.highWaterMark = 0
      stream._duplexState = (stream._duplexState | READ_ENDING) & READ_NON_PRIMARY_AND_PUSHED
      return false
    }

    if (this.map !== null) {
      data = this.map(data)
      if (data === null) {
        stream._duplexState &= READ_PUSHED
        return this.buffered < this.highWaterMark
      }
    }

    this.buffered += this.byteLength(data)
    this.queue.push(data)

    stream._duplexState = (stream._duplexState | READ_QUEUED) & READ_PUSHED

    return this.buffered < this.highWaterMark
  }

  shift() {
    const data = this.queue.shift()

    this.buffered -= this.byteLength(data)
    if (this.buffered === 0) {
      this.stream._duplexState &= READ_NOT_QUEUED
    }

    return data
  }

  unshift(data) {
    const pending = [this.map !== null ? this.map(data) : data]
    while (this.buffered > 0) pending.push(this.shift())

    for (let i = 0; i < pending.length - 1; i++) {
      const data = pending[i]
      this.buffered += this.byteLength(data)
      this.queue.push(data)
    }

    this.push(pending[pending.length - 1])
  }

  read() {
    const stream = this.stream

    if ((stream._duplexState & READ_STATUS) === READ_QUEUED) {
      const data = this.shift()

      if (this.pipeTo !== null && this.pipeTo.write(data) === false) {
        stream._duplexState &= READ_PIPE_NOT_DRAINED
      }

      if ((stream._duplexState & READ_EMIT_DATA) !== 0) {
        stream.emit('data', data)
      }

      return data
    }

    if (this.readAhead === false) {
      stream._duplexState |= READ_READ_AHEAD
      this.updateNextTick()
    }

    return null
  }

  drain() {
    const stream = this.stream

    while (
      (stream._duplexState & READ_STATUS) === READ_QUEUED &&
      (stream._duplexState & READ_FLOWING) !== 0
    ) {
      const data = this.shift()

      if (this.pipeTo !== null && this.pipeTo.write(data) === false) {
        stream._duplexState &= READ_PIPE_NOT_DRAINED
      }

      if ((stream._duplexState & READ_EMIT_DATA) !== 0) {
        stream.emit('data', data)
      }
    }
  }

  update() {
    const stream = this.stream

    stream._duplexState |= READ_UPDATING

    do {
      this.drain()

      while (
        this.buffered < this.highWaterMark &&
        (stream._duplexState & SHOULD_NOT_READ) === READ_READ_AHEAD
      ) {
        stream._duplexState |= READ_ACTIVE_AND_NEEDS_PUSH
        stream._read(this.afterRead)
        this.drain()
      }

      if ((stream._duplexState & READ_READABLE_STATUS) === READ_EMIT_READABLE_AND_QUEUED) {
        stream._duplexState |= READ_EMITTED_READABLE
        stream.emit('readable')
      }

      if ((stream._duplexState & READ_PRIMARY_AND_ACTIVE) === 0) {
        this.updateNonPrimary()
      }
    } while (this.continueUpdate() === true)

    stream._duplexState &= READ_NOT_UPDATING
  }

  updateNonPrimary() {
    const stream = this.stream

    if ((stream._duplexState & READ_ENDING_STATUS) === READ_ENDING) {
      stream._duplexState = (stream._duplexState | READ_DONE) & READ_NOT_ENDING
      stream.emit('end')

      if ((stream._duplexState & AUTO_DESTROY) === DONE) {
        stream._duplexState |= DESTROYING
      }

      if (this.pipeTo !== null) {
        this.pipeTo.end()
      }
    }

    if ((stream._duplexState & DESTROY_STATUS) === DESTROYING) {
      if ((stream._duplexState & ACTIVE_OR_TICKING) === 0) {
        stream._duplexState |= ACTIVE
        stream._destroy(afterDestroy.bind(this))
      }
      return
    }

    if ((stream._duplexState & IS_OPENING) === OPENING) {
      stream._duplexState = (stream._duplexState | ACTIVE) & NOT_OPENING
      stream._open(afterOpen.bind(this))
    }
  }

  continueUpdate() {
    if ((this.stream._duplexState & READ_NEXT_TICK) === 0) return false
    this.stream._duplexState &= READ_NOT_NEXT_TICK
    return true
  }

  updateCallback() {
    if ((this.stream._duplexState & READ_UPDATE_SYNC_STATUS) === READ_PRIMARY) {
      this.update()
    } else {
      this.updateNextTick()
    }
  }

  updateNextTickIfOpen() {
    if ((this.stream._duplexState & READ_NEXT_TICK_OR_OPENING) !== 0) return
    this.stream._duplexState |= READ_NEXT_TICK
    if ((this.stream._duplexState & READ_UPDATING) === 0) qmt(this.afterUpdateNextTick)
  }

  updateNextTick() {
    if ((this.stream._duplexState & READ_NEXT_TICK) !== 0) return
    this.stream._duplexState |= READ_NEXT_TICK
    if ((this.stream._duplexState & READ_UPDATING) === 0) qmt(this.afterUpdateNextTick)
  }
}

class TransformState {
  constructor(stream) {
    this.data = null
    this.afterTransform = afterTransform.bind(stream)
    this.afterFinal = null
  }
}

class Pipeline {
  constructor(src, dst, cb) {
    this.from = src
    this.to = dst
    this.afterPipe = cb
    this.error = null
    this.pipeToFinished = false
  }

  finished() {
    this.pipeToFinished = true
  }

  done(stream, err) {
    if (err) this.error = err

    if (stream === this.to) {
      this.to = null

      if (this.from !== null) {
        if ((this.from._duplexState & READ_DONE) === 0 || !this.pipeToFinished) {
          this.from.destroy(this.error || new Error('Writable stream closed prematurely'))
        }
        return
      }
    }

    if (stream === this.from) {
      this.from = null

      if (this.to !== null) {
        if ((stream._duplexState & READ_DONE) === 0) {
          this.to.destroy(this.error || new Error('Readable stream closed before ending'))
        }
        return
      }
    }

    if (this.afterPipe !== null) this.afterPipe(this.error)
    this.to = this.from = this.afterPipe = null
  }
}

function afterDrain() {
  this.stream._duplexState |= READ_PIPE_DRAINED
  this.updateCallback()
}

function afterFinal(err) {
  const stream = this.stream
  if (err) stream.destroy(err)

  if ((stream._duplexState & DESTROY_STATUS) === 0) {
    stream._duplexState |= WRITE_DONE
    stream.emit('finish')
  }

  if ((stream._duplexState & AUTO_DESTROY) === DONE) {
    stream._duplexState |= DESTROYING
  }

  stream._duplexState &= WRITE_NOT_FINISHING

  // no need to wait the extra tick here, so we short circuit that
  if ((stream._duplexState & WRITE_UPDATING) === 0) {
    this.update()
  } else {
    this.updateNextTick()
  }
}

function afterDestroy(err) {
  const stream = this.stream

  if (!err && !StreamError.isStreamDestroyed(this.error)) err = this.error
  if (err) stream.emit('error', err)

  stream._duplexState |= DESTROYED
  stream.emit('close')

  const rs = stream._readableState
  const ws = stream._writableState

  if (rs !== null && rs.pipeline !== null) {
    rs.pipeline.done(stream, err)
  }

  if (ws !== null) {
    while (ws.drains !== null && ws.drains.length > 0) {
      ws.drains.shift().resolve(false)
    }

    if (ws.pipeline !== null) {
      ws.pipeline.done(stream, err)
    }
  }
}

function afterWrite(err) {
  const stream = this.stream

  if (err) stream.destroy(err)
  stream._duplexState &= WRITE_NOT_ACTIVE

  if (this.drains !== null) tickDrains(this.drains)

  if ((stream._duplexState & WRITE_DRAIN_STATUS) === WRITE_UNDRAINED) {
    stream._duplexState &= WRITE_DRAINED

    if ((stream._duplexState & WRITE_EMIT_DRAIN) === WRITE_EMIT_DRAIN) {
      stream.emit('drain')
    }
  }

  this.updateCallback()
}

function afterRead(err) {
  if (err) this.stream.destroy(err)
  this.stream._duplexState &= READ_NOT_ACTIVE

  if (this.readAhead === false && (this.stream._duplexState & READ_RESUMED) === 0) {
    this.stream._duplexState &= READ_NO_READ_AHEAD
  }

  this.updateCallback()
}

function updateReadNT() {
  if ((this.stream._duplexState & READ_UPDATING) === 0) {
    this.stream._duplexState &= READ_NOT_NEXT_TICK
    this.update()
  }
}

function updateWriteNT() {
  if ((this.stream._duplexState & WRITE_UPDATING) === 0) {
    this.stream._duplexState &= WRITE_NOT_NEXT_TICK
    this.update()
  }
}

function tickDrains(drains) {
  for (let i = 0; i < drains.length; i++) {
    // drains.writes are monotonic, so if one is 0 it's always the first one
    if (--drains[i].writes === 0) {
      drains.shift().resolve(true)
      i--
    }
  }
}

function afterOpen(err) {
  const stream = this.stream

  if (err) stream.destroy(err)

  if ((stream._duplexState & DESTROYING) === 0) {
    if ((stream._duplexState & READ_PRIMARY_STATUS) === 0) {
      stream._duplexState |= READ_PRIMARY
    }

    if ((stream._duplexState & WRITE_PRIMARY_STATUS) === 0) {
      stream._duplexState |= WRITE_PRIMARY
    }

    stream.emit('open')
  }

  stream._duplexState &= NOT_ACTIVE

  if (stream._writableState !== null) {
    stream._writableState.updateCallback()
  }

  if (stream._readableState !== null) {
    stream._readableState.updateCallback()
  }
}

function afterTransform(err, data) {
  if (data !== undefined && data !== null) this.push(data)
  this._writableState.afterWrite(err)
}

function newListener(name) {
  if (this._readableState !== null) {
    if (name === 'data') {
      this._duplexState |= READ_EMIT_DATA | READ_RESUMED_READ_AHEAD
      this._readableState.updateNextTick()
    }
    if (name === 'readable') {
      this._duplexState |= READ_EMIT_READABLE
      this._readableState.updateNextTick()
    }
  }

  if (this._writableState !== null) {
    if (name === 'drain') {
      this._duplexState |= WRITE_EMIT_DRAIN
      this._writableState.updateNextTick()
    }
  }
}

class Stream extends EventEmitter {
  constructor(opts) {
    super()

    this._duplexState = 0
    this._readableState = null
    this._writableState = null

    if (opts) {
      if (opts.open) this._open = opts.open
      if (opts.destroy) this._destroy = opts.destroy
      if (opts.predestroy) this._predestroy = opts.predestroy
      if (opts.signal) opts.signal.addEventListener('abort', abort.bind(this))
    }

    this.on('newListener', newListener)
  }

  _open(cb) {
    cb(null)
  }

  _destroy(cb) {
    cb(null)
  }

  _predestroy() {
    // does nothing
  }

  get readable() {
    return this._readableState !== null ? true : undefined
  }

  get writable() {
    return this._writableState !== null ? true : undefined
  }

  get destroyed() {
    return (this._duplexState & DESTROYED) !== 0
  }

  get destroying() {
    return (this._duplexState & DESTROY_STATUS) !== 0
  }

  destroy(err) {
    if ((this._duplexState & DESTROY_STATUS) === 0) {
      if (!err) err = StreamError.STREAM_DESTROYED()
      this._duplexState = (this._duplexState | DESTROYING) & NON_PRIMARY

      if (this._readableState !== null) {
        this._readableState.highWaterMark = 0
        this._readableState.error = err
      }

      if (this._writableState !== null) {
        this._writableState.highWaterMark = 0
        this._writableState.error = err
      }

      this._duplexState |= PREDESTROYING
      this._predestroy()
      this._duplexState &= NOT_PREDESTROYING

      if (this._readableState !== null) {
        this._readableState.updateNextTick()
      }

      if (this._writableState !== null) {
        this._writableState.updateNextTick()
      }
    }
  }
}

class Readable extends Stream {
  constructor(opts) {
    super(opts)

    this._duplexState |= OPENING | WRITE_DONE | READ_READ_AHEAD
    this._readableState = new ReadableState(this, opts)

    if (opts) {
      if (this._readableState.readAhead === false) this._duplexState &= READ_NO_READ_AHEAD
      if (opts.read) this._read = opts.read
      if (opts.eagerOpen) this._readableState.updateNextTick()
      if (opts.encoding) this.setEncoding(opts.encoding)
    }
  }

  setEncoding(encoding) {
    const dec = new TextDecoder(encoding)
    const map = this._readableState.map || echo
    this._readableState.map = mapOrSkip
    return this

    function mapOrSkip(data) {
      const next = dec.push(data)
      return next === '' && (data.byteLength !== 0 || dec.remaining > 0) ? null : map(next)
    }
  }

  _read(cb) {
    cb(null)
  }

  pipe(dest, cb) {
    this._readableState.updateNextTick()
    this._readableState.pipe(dest, cb)
    return dest
  }

  read() {
    this._readableState.updateNextTick()
    return this._readableState.read()
  }

  push(data) {
    this._readableState.updateNextTickIfOpen()
    return this._readableState.push(data)
  }

  unshift(data) {
    this._readableState.updateNextTickIfOpen()
    return this._readableState.unshift(data)
  }

  resume() {
    this._duplexState |= READ_RESUMED_READ_AHEAD
    this._readableState.updateNextTick()
    return this
  }

  pause() {
    this._duplexState &=
      this._readableState.readAhead === false ? READ_PAUSED_NO_READ_AHEAD : READ_PAUSED
    return this
  }

  static _fromAsyncIterator(ite, opts) {
    let destroy

    const rs = new Readable({
      ...opts,
      read(cb) {
        ite.next().then(push).then(cb.bind(null, null)).catch(cb)
      },
      predestroy() {
        destroy = ite.return()
      },
      destroy(cb) {
        if (!destroy) return cb(null)
        destroy.then(cb.bind(null, null)).catch(cb)
      }
    })

    return rs

    function push(data) {
      if (data.done) rs.push(null)
      else rs.push(data.value)
    }
  }

  static from(data, opts) {
    if (isReadStreamx(data)) return data
    if (data[asyncIterator]) return this._fromAsyncIterator(data[asyncIterator](), opts)
    if (!Array.isArray(data)) data = data === undefined ? [] : [data]

    let i = 0
    return new Readable({
      ...opts,
      read(cb) {
        this.push(i === data.length ? null : data[i++])
        cb(null)
      }
    })
  }

  static isBackpressured(rs) {
    return (
      (rs._duplexState & READ_BACKPRESSURE_STATUS) !== 0 ||
      rs._readableState.buffered >= rs._readableState.highWaterMark
    )
  }

  static isPaused(rs) {
    return (rs._duplexState & READ_RESUMED) === 0
  }

  [asyncIterator]() {
    const stream = this

    let error = null
    let promiseResolve = null
    let promiseReject = null

    this.on('error', (err) => {
      error = err
    })
    this.on('readable', onreadable)
    this.on('close', onclose)

    return {
      [asyncIterator]() {
        return this
      },
      next() {
        return new Promise(function (resolve, reject) {
          promiseResolve = resolve
          promiseReject = reject
          const data = stream.read()
          if (data !== null) ondata(data)
          else if ((stream._duplexState & DESTROYED) !== 0) ondata(null)
        })
      },
      return() {
        return destroy(null)
      },
      throw(err) {
        return destroy(err)
      }
    }

    function onreadable() {
      if (promiseResolve !== null) ondata(stream.read())
    }

    function onclose() {
      if (promiseResolve !== null) ondata(null)
    }

    function ondata(data) {
      if (promiseReject === null) return
      if (error) {
        promiseReject(error)
      } else if (data === null && (stream._duplexState & READ_DONE) === 0) {
        promiseReject(StreamError.STREAM_DESTROYED())
      } else {
        promiseResolve({ value: data, done: data === null })
      }
      promiseReject = promiseResolve = null
    }

    function destroy(err) {
      stream.destroy(err)
      return new Promise((resolve, reject) => {
        if (stream._duplexState & DESTROYED) return resolve({ value: undefined, done: true })
        stream.once('close', function () {
          if (err) reject(err)
          else resolve({ value: undefined, done: true })
        })
      })
    }
  }
}

class Writable extends Stream {
  constructor(opts) {
    super(opts)

    this._duplexState |= OPENING | READ_DONE
    this._writableState = new WritableState(this, opts)

    if (opts) {
      if (opts.writev) this._writev = opts.writev
      if (opts.write) this._write = opts.write
      if (opts.final) this._final = opts.final
      if (opts.eagerOpen) this._writableState.updateNextTick()
    }
  }

  cork() {
    this._duplexState |= WRITE_CORKED
  }

  uncork() {
    this._duplexState &= WRITE_NOT_CORKED
    this._writableState.updateNextTick()
  }

  _writev(batch, cb) {
    cb(null)
  }

  _write(data, cb) {
    this._writableState.autoBatch(data, cb)
  }

  _final(cb) {
    cb(null)
  }

  static isBackpressured(ws) {
    return (ws._duplexState & WRITE_BACKPRESSURE_STATUS) !== 0
  }

  static drained(ws) {
    if (ws.destroyed) return Promise.resolve(false)

    const state = ws._writableState
    const pending = isWritev(ws) ? Math.min(1, state.queue.length) : state.queue.length
    const writes = pending + (ws._duplexState & WRITE_WRITING ? 1 : 0)
    if (writes === 0) return Promise.resolve(true)

    if (state.drains === null) state.drains = []
    return new Promise((resolve) => {
      state.drains.push({ writes, resolve })
    })
  }

  write(data) {
    this._writableState.updateNextTick()
    return this._writableState.push(data)
  }

  end(data) {
    this._writableState.updateNextTick()
    this._writableState.end(data)
    return this
  }
}

class Duplex extends Readable {
  // and Writable
  constructor(opts) {
    super(opts)

    this._duplexState = OPENING | (this._duplexState & READ_READ_AHEAD)
    this._writableState = new WritableState(this, opts)

    if (opts) {
      if (opts.writev) this._writev = opts.writev
      if (opts.write) this._write = opts.write
      if (opts.final) this._final = opts.final
    }
  }

  cork() {
    this._duplexState |= WRITE_CORKED
  }

  uncork() {
    this._duplexState &= WRITE_NOT_CORKED
    this._writableState.updateNextTick()
  }

  _writev(batch, cb) {
    cb(null)
  }

  _write(data, cb) {
    this._writableState.autoBatch(data, cb)
  }

  _final(cb) {
    cb(null)
  }

  write(data) {
    this._writableState.updateNextTick()
    return this._writableState.push(data)
  }

  end(data) {
    this._writableState.updateNextTick()
    this._writableState.end(data)
    return this
  }
}

class Transform extends Duplex {
  constructor(opts) {
    super(opts)
    this._transformState = new TransformState(this)

    if (opts) {
      if (opts.transform) this._transform = opts.transform
      if (opts.flush) this._flush = opts.flush
    }
  }

  _write(data, cb) {
    if (this._readableState.buffered >= this._readableState.highWaterMark) {
      this._transformState.data = data
    } else {
      this._transform(data, this._transformState.afterTransform)
    }
  }

  _read(cb) {
    if (this._transformState.data !== null) {
      const data = this._transformState.data
      this._transformState.data = null
      cb(null)
      this._transform(data, this._transformState.afterTransform)
    } else {
      cb(null)
    }
  }

  destroy(err) {
    super.destroy(err)
    if (this._transformState.data !== null) {
      this._transformState.data = null
      this._transformState.afterTransform()
    }
  }

  _transform(data, cb) {
    cb(null, data)
  }

  _flush(cb) {
    cb(null)
  }

  _final(cb) {
    this._transformState.afterFinal = cb
    this._flush(transformAfterFlush.bind(this))
  }
}

class PassThrough extends Transform {}

function transformAfterFlush(err, data) {
  const cb = this._transformState.afterFinal
  if (err) return cb(err)

  if (data !== null && data !== undefined) this.push(data)
  this.push(null)
  cb(null)
}

function pipelinePromise(...streams) {
  return new Promise((resolve, reject) => {
    return pipeline(...streams, (err) => {
      if (err) return reject(err)
      resolve()
    })
  })
}

function pipeline(stream, ...streams) {
  const all = Array.isArray(stream) ? [...stream, ...streams] : [stream, ...streams]
  const done = all.length && typeof all[all.length - 1] === 'function' ? all.pop() : null

  if (all.length < 2) throw new Error('Pipeline requires at least 2 streams')

  let src = all[0]
  let dest = null
  let error = null

  for (let i = 1; i < all.length; i++) {
    dest = all[i]

    if (isStreamx(src)) {
      src.pipe(dest, onerror)
    } else {
      errorHandle(src, true, i > 1, onerror)
      src.pipe(dest)
    }

    src = dest
  }

  if (done) {
    let fin = false

    const autoDestroy =
      isStreamx(dest) || !!(dest._writableState && dest._writableState.autoDestroy)

    dest.on('error', (err) => {
      if (error === null) error = err
    })

    dest.on('finish', () => {
      fin = true
      if (!autoDestroy) done(error)
    })

    if (autoDestroy) {
      dest.on('close', () => done(error || (fin ? null : StreamError.PREMATURE_CLOSE())))
    }
  }

  return dest

  function errorHandle(s, rd, wr, onerror) {
    s.on('error', onerror)
    s.on('close', onclose)

    function onclose() {
      if (rd && s._readableState && !s._readableState.ended) {
        return onerror(StreamError.PREMATURE_CLOSE())
      }
      if (wr && s._writableState && !s._writableState.ended) {
        return onerror(StreamError.PREMATURE_CLOSE())
      }
    }
  }

  function onerror(err) {
    if (!err || error) return
    error = err

    for (const s of all) {
      s.destroy(err)
    }
  }
}

function echo(s) {
  return s
}

function isStream(stream) {
  return !!stream._readableState || !!stream._writableState
}

function isStreamx(stream) {
  return typeof stream._duplexState === 'number' && isStream(stream)
}

function isEnding(stream) {
  return !!stream._readableState && stream._readableState.ending
}

function isEnded(stream) {
  return !!stream._readableState && stream._readableState.ended
}

function isFinishing(stream) {
  return !!stream._writableState && stream._writableState.ending
}

function isFinished(stream) {
  return !!stream._writableState && stream._writableState.ended
}

function getStreamError(stream, opts = {}) {
  const err =
    (stream._readableState && stream._readableState.error) ||
    (stream._writableState && stream._writableState.error)

  // avoid implicit errors by default
  return !opts.all && StreamError.isStreamDestroyed(err) ? null : err
}

function isReadStreamx(stream) {
  return isStreamx(stream) && stream.readable
}

function isDisturbed(stream) {
  return (
    (stream._duplexState & OPENING) !== OPENING ||
    (stream._duplexState & DESTROYING) === DESTROYING ||
    (stream._duplexState & ACTIVE_OR_TICKING) !== 0
  )
}

function isTypedArray(data) {
  return typeof data === 'object' && data !== null && typeof data.byteLength === 'number'
}

function defaultByteLength(data) {
  return isTypedArray(data) ? data.byteLength : 1024
}

function noop() {}

function abort() {
  this.destroy(StreamError.ABORTED())
}

function isWritev(s) {
  return s._writev !== Writable.prototype._writev && s._writev !== Duplex.prototype._writev
}

module.exports = {
  pipeline,
  pipelinePromise,
  isStream,
  isStreamx,
  isEnding,
  isEnded,
  isFinishing,
  isFinished,
  isDisturbed,
  getStreamError,
  Stream,
  Writable,
  Readable,
  Duplex,
  Transform,
  // Export PassThrough for compatibility with Node.js core's stream module
  PassThrough
}


================================================
FILE: lib/errors.js
================================================
module.exports = class StreamError extends Error {
  constructor(msg, code, fn = StreamError) {
    super(msg)

    this.code = code

    if (Error.captureStackTrace) {
      Error.captureStackTrace(this, fn)
    }
  }

  static isStreamDestroyed(err) {
    return err && err.code === 'STREAM_DESTROYED'
  }

  static isPrematureClose(err) {
    return err && err.code === 'PREMATURE_CLOSE'
  }

  static isAborted(err) {
    return err && err.code === 'ABORTED'
  }

  get name() {
    return 'StreamError'
  }

  static STREAM_DESTROYED() {
    return new StreamError('Stream was destroyed', 'STREAM_DESTROYED', StreamError.STREAM_DESTROYED)
  }

  static PREMATURE_CLOSE() {
    return new StreamError('Premature close', 'PREMATURE_CLOSE', StreamError.PREMATURE_CLOSE)
  }

  static ABORTED() {
    return new StreamError('Stream aborted', 'ABORTED', StreamError.ABORTED)
  }
}


================================================
FILE: package.json
================================================
{
  "name": "streamx",
  "version": "2.25.0",
  "description": "An iteration of the Node.js core streams with a series of improvements",
  "main": "index.js",
  "exports": {
    ".": "./index.js",
    "./package": "./package.json",
    "./errors": "./lib/errors"
  },
  "dependencies": {
    "events-universal": "^1.0.0",
    "fast-fifo": "^1.3.2",
    "text-decoder": "^1.1.0"
  },
  "devDependencies": {
    "b4a": "^1.6.6",
    "brittle": "^3.1.1",
    "end-of-stream": "^1.4.4",
    "lunte": "^1.8.0",
    "prettier": "^3.6.2",
    "prettier-config-holepunch": "^2.0.0"
  },
  "files": [
    "index.js",
    "lib/errors.js"
  ],
  "scripts": {
    "format": "prettier --write .",
    "test": "lunte && prettier --check . && node test/all.js",
    "test:bare": "bare test/all.js"
  },
  "repository": {
    "type": "git",
    "url": "https://github.com/mafintosh/streamx.git"
  },
  "author": "Mathias Buus (@mafintosh)",
  "license": "MIT",
  "bugs": {
    "url": "https://github.com/mafintosh/streamx/issues"
  },
  "homepage": "https://github.com/mafintosh/streamx"
}


================================================
FILE: test/all.js
================================================
// This runner is auto-generated by Brittle

runTests()

async function runTests() {
  const test = (await import('brittle')).default

  test.pause()

  await import('./async-iterator.js')
  await import('./backpressure.js')
  await import('./byte-length.js')
  await import('./compat.js')
  await import('./destroy.js')
  await import('./duplex.js')
  await import('./errors.js')
  await import('./get-stream-error.js')
  await import('./passthrough.js')
  await import('./pipe.js')
  await import('./pipeline.js')
  await import('./readable.js')
  await import('./transform.js')
  await import('./writable.js')

  test.resume()
}


================================================
FILE: test/async-iterator.js
================================================
const test = require('brittle')
const { Readable, getStreamError } = require('../')
const StreamError = require('../lib/errors')

test('streams are async iterators', async function (t) {
  const data = ['a', 'b', 'c', null]
  const expected = data.slice(0)

  const r = new Readable({
    read(cb) {
      this.push(data.shift())
      cb(null)
    }
  })

  for await (const chunk of r) {
    t.is(chunk, expected.shift())
  }

  t.is(expected.shift(), null)
})

test('break out of iterator', async function (t) {
  const r = new Readable({
    read(cb) {
      this.push('tick')
      cb(null)
    },
    destroy(cb) {
      t.pass('destroying')
      cb(null)
    }
  })

  let runs = 10

  for await (const chunk of r) {
    t.is(chunk, 'tick')
    if (--runs === 0) break
  }
})

test('throw out of iterator', async function (t) {
  const r = new Readable({
    read(cb) {
      this.push('tick')
      cb(null)
    },
    destroy(cb) {
      t.pass('destroying')
      cb(null)
    }
  })

  let runs = 10

  await t.exception(async function () {
    for await (const chunk of r) {
      t.is(chunk, 'tick')
      if (--runs === 0) throw new Error('stop')
    }
  })
})

test('interesting timing', async function (t) {
  const r = new Readable({
    read(cb) {
      setImmediate(() => {
        this.push('b')
        this.push('c')
        this.push(null)
        cb(null)
      })
    },
    destroy(cb) {
      t.pass('destroying')
      cb(null)
    }
  })

  r.push('a')

  const iterated = []

  for await (const chunk of r) {
    iterated.push(chunk)
    await new Promise((resolve) => setTimeout(resolve, 10))
  }

  t.alike(iterated, ['a', 'b', 'c'])
})

test('interesting timing with close', async function (t) {
  t.plan(3)

  const r = new Readable({
    read(cb) {
      setImmediate(() => {
        this.destroy(new Error('stop'))
        cb(null)
      })
    },
    destroy(cb) {
      t.pass('destroying')
      cb(null)
    }
  })

  r.push('a')

  const iterated = []

  await t.exception(async function () {
    for await (const chunk of r) {
      iterated.push(chunk)
      await new Promise((resolve) => setTimeout(resolve, 10))
    }
  })

  t.alike(iterated, ['a'])
})

test('cleaning up a closed iterator', async function (t) {
  const r = new Readable()
  r.push('a')
  t.plan(1)

  const fn = async () => {
    for await (const chunk of r) {
      // eslint-disable-line
      r.destroy()
      await new Promise((resolve) => r.once('close', resolve))
      t.is(chunk, 'a')
      return
    }
  }
  await fn()
})

test('using abort controller', { skip: !!global.Bare }, async function (t) {
  function createInfinite(signal) {
    let count = 0
    const r = new Readable({ signal })
    r.push(count)
    const int = setInterval(() => r.push(count++), 5000)
    r.once('close', () => clearInterval(int))
    return r
  }
  const controller = new AbortController()
  const inc = []
  setImmediate(() => controller.abort())

  let r
  await t.exception(async function () {
    r = createInfinite(controller.signal)
    for await (const chunk of r) {
      inc.push(chunk)
    }
  })

  t.alike(inc, [0])
  const err = getStreamError(r)
  t.ok(StreamError.isAborted(err))
})

test('from async iterator and to async iterator', async function (t) {
  const expected = []

  const stream = Readable.from(
    (async function* () {
      yield 'a'
      yield 'b'
    })()
  )

  for await (const data of stream) {
    expected.push(data)
  }

  t.alike(expected, ['a', 'b'])
})


================================================
FILE: test/backpressure.js
================================================
const test = require('brittle')
const { Writable, Readable } = require('../')

test('write backpressure', function (t) {
  const ws = new Writable()

  for (let i = 0; i < 15; i++) {
    t.ok(ws.write('a'), 'not backpressured')
    t.absent(Writable.isBackpressured(ws), 'static check')
  }

  t.absent(ws.write('a'), 'backpressured')
  t.ok(Writable.isBackpressured(ws), 'static check')
})

test('write backpressure with drain', function (t) {
  t.plan(15 * 2 + 2 + 1)

  const ws = new Writable()

  for (let i = 0; i < 15; i++) {
    t.ok(ws.write('a'), 'not backpressured')
    t.absent(Writable.isBackpressured(ws), 'static check')
  }

  t.absent(ws.write('a'), 'backpressured')
  t.ok(Writable.isBackpressured(ws), 'static check')

  ws.on('drain', function () {
    t.absent(Writable.isBackpressured(ws))
  })
})

test('write backpressure with destroy', function (t) {
  const ws = new Writable()

  ws.write('a')
  ws.destroy()

  t.ok(Writable.isBackpressured(ws))
})

test('write backpressure with end', function (t) {
  const ws = new Writable()

  ws.write('a')
  ws.end()

  t.ok(Writable.isBackpressured(ws))
})

test('read backpressure', function (t) {
  const rs = new Readable()

  for (let i = 0; i < 15; i++) {
    t.ok(rs.push('a'), 'not backpressured')
    t.absent(Readable.isBackpressured(rs), 'static check')
  }

  t.absent(rs.push('a'), 'backpressured')
  t.ok(Readable.isBackpressured(rs), 'static check')
})

test('read backpressure with later read', function (t) {
  t.plan(15 * 2 + 2 + 1)

  const rs = new Readable()

  for (let i = 0; i < 15; i++) {
    t.ok(rs.push('a'), 'not backpressured')
    t.absent(Readable.isBackpressured(rs), 'static check')
  }

  t.absent(rs.push('a'), 'backpressured')
  t.ok(Readable.isBackpressured(rs), 'static check')

  rs.once('readable', function () {
    rs.read()
    t.absent(Readable.isBackpressured(rs))
  })
})

test('read backpressure with destroy', function (t) {
  const rs = new Readable()

  rs.push('a')
  rs.destroy()

  t.ok(Readable.isBackpressured(rs))
})

test('read backpressure with push(null)', function (t) {
  const rs = new Readable()

  rs.push('a')
  rs.push(null)

  t.ok(Readable.isBackpressured(rs))
})
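
The backpressure tests above all flip at the same point: 15 string writes/pushes succeed and the 16th fails. That is consistent with each non-buffer chunk being charged a flat 1024 bytes against a 16384-byte high-water mark. A minimal sketch of that accounting (names here are illustrative, not streamx internals — the real implementation tracks this with bit flags and a FIFO in index.js):

```javascript
class TinyQueue {
  constructor(highWaterMark = 16384, byteLength = () => 1024) {
    this.highWaterMark = highWaterMark
    this.byteLength = byteLength
    this.buffered = 0
  }

  // Mirrors the write()/push() contract above: returns false once the
  // buffered total reaches the high-water mark.
  write(chunk) {
    this.buffered += this.byteLength(chunk)
    return this.buffered < this.highWaterMark
  }

  isBackpressured() {
    return this.buffered >= this.highWaterMark
  }
}

const q = new TinyQueue()
for (let i = 0; i < 15; i++) q.write('a') // each returns true (15 * 1024 < 16384)
const ok = q.write('a') // false: the 16th chunk hits the mark
```

This is why the tests loop exactly 15 times before asserting the backpressured state.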


================================================
FILE: test/byte-length.js
================================================
const test = require('brittle')
const { Readable, Writable } = require('../')

const defaultSizes = [
  { name: 'buf512', item: Buffer.alloc(512), size: 512 },
  { name: 'number', item: 1, size: 1024 },
  { name: 'number-byteLength', item: 1, size: 512, byteLength: () => 512 },
  {
    name: 'number-byteLengthReadable',
    item: 1,
    size: 256,
    byteLength: () => 512,
    byteLengthExtended: () => 256
  },
  { name: 'uint8-512', item: new Uint8Array(512), size: 512 },
  { name: 'uint32-64', item: new Uint32Array(64), size: 256 }
]

for (const { name, item, size, byteLength, byteLengthExtended } of defaultSizes) {
  test(`readable ${name}`, function (t) {
    const r = new Readable({
      byteLength,
      byteLengthReadable: byteLengthExtended
    })
    r.push(item)
    t.is(r._readableState.buffered, size)
  })

  test(`writable ${name}`, function (t) {
    const w = new Writable({
      byteLength,
      byteLengthWritable: byteLengthExtended
    })
    w.write(item)
    t.is(w._writableState.buffered, size)
  })
}

test('byteLength receives readable item', function (t) {
  t.plan(1)

  const obj = {}
  const r = new Readable({
    byteLength: (data) => {
      t.alike(obj, data)
    }
  })
  r.push(obj)
})

test('byteLength receives writable item', function (t) {
  t.plan(2)

  const obj = {}
  const r = new Writable({
    byteLength: (data) => {
      t.alike(obj, data)
      return 1
    }
  })
  r.write(obj)
})
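
The size table above implies the default accounting rule: chunks that expose a numeric `byteLength` (buffers and typed arrays) are charged that many bytes, and everything else costs a flat 1024. A hypothetical helper consistent with those cases (the real default lives in index.js):

```javascript
function defaultByteLength(data) {
  // Typed arrays and buffers expose a numeric byteLength; charge that.
  // Everything else (numbers, strings, plain objects) costs a flat 1024.
  return typeof data.byteLength === 'number' ? data.byteLength : 1024
}
```

This matches every row in `defaultSizes`: `Buffer.alloc(512)` → 512, `1` → 1024, `new Uint32Array(64)` → 256 (64 elements × 4 bytes).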


================================================
FILE: test/compat.js
================================================
const eos = global.Bare ? null : require('end-of-stream')
const test = require('brittle')
const stream = require('../')
const finished = global.Bare ? null : require('stream').finished

run(eos)
run(finished)

function run(eos) {
  if (!eos) return
  const name = eos === finished ? 'nodeStream.finished' : 'eos'

  test(name + ' readable', function (t) {
    t.plan(2)

    const r = new stream.Readable()
    let ended = false

    r.on('end', function () {
      ended = true
    })

    eos(r, function (err) {
      t.absent(err, 'no error')
      t.ok(ended)
    })

    r.push('hello')
    r.push(null)
    r.resume()
  })

  test(name + ' readable destroy', function (t) {
    t.plan(2)

    const r = new stream.Readable()
    let ended = false

    r.on('end', function () {
      ended = true
    })

    eos(r, function (err) {
      t.ok(err, 'had error')
      t.absent(ended)
    })

    r.push('hello')
    r.push(null)
    r.resume()
    r.destroy()
  })

  test(name + ' writable', function (t) {
    t.plan(2)

    const w = new stream.Writable()
    let finished = false

    w.on('finish', function () {
      finished = true
    })

    eos(w, function (err) {
      t.absent(err, 'no error')
      t.ok(finished)
    })

    w.write('hello')
    w.end()
  })

  test(name + ' writable destroy', function (t) {
    t.plan(3)

    const w = new stream.Writable()
    let finished = false

    w.on('finish', function () {
      finished = true
    })

    eos(w, function (err) {
      t.ok(err, 'had error')
      t.absent(finished)
    })

    w.write('hello')
    t.is(w.end(), w)
    w.destroy()
  })

  test(name + ' duplex', function (t) {
    t.plan(4)

    const s = new stream.Duplex()
    let ended = false
    let finished = false

    s.on('end', () => {
      ended = true
    })
    s.on('finish', () => {
      finished = true
    })

    eos(s, function (err) {
      t.absent(err, 'no error')
      t.ok(ended)
      t.ok(finished)
    })

    s.push('hello')
    s.push(null)
    s.resume()
    t.is(s.end(), s)
  })

  test(name + ' duplex + deferred s.end()', function (t) {
    t.plan(3)

    const s = new stream.Duplex()
    let ended = false
    let finished = false

    s.on('end', function () {
      ended = true
      setImmediate(() => s.end())
    })

    s.on('finish', () => {
      finished = true
    })

    eos(s, function (err) {
      t.absent(err, 'no error')
      t.ok(ended)
      t.ok(finished)
    })

    s.push('hello')
    s.push(null)
    s.resume()
  })

  test(name + ' duplex + deferred s.push(null)', function (t) {
    t.plan(3)

    const s = new stream.Duplex()
    let ended = false
    let finished = false

    s.on('finish', function () {
      finished = true
      setImmediate(() => s.push(null))
    })

    s.on('end', () => {
      ended = true
    })

    eos(s, function (err) {
      t.absent(err, 'no error')
      t.ok(ended)
      t.ok(finished)
    })

    s.push('hello')
    s.end()
    s.resume()
  })

  test(name + ' duplex destroy', function (t) {
    t.plan(3)

    const s = new stream.Duplex()
    let ended = false
    let finished = false

    s.on('end', () => {
      ended = true
    })
    s.on('finish', () => {
      finished = true
    })

    eos(s, function (err) {
      t.ok(err, 'had error')
      t.absent(ended)
      t.absent(finished)
    })

    s.push('hello')
    s.push(null)
    s.resume()
    s.end()
    s.destroy()
  })
}


================================================
FILE: test/destroy.js
================================================
const test = require('brittle')
const { Duplex } = require('../')

test('destroy is never sync', function (t) {
  t.plan(1)

  let openCb = null

  const s = new Duplex({
    open(cb) {
      openCb = cb
    },
    predestroy() {
      openCb(new Error('stop'))
    }
  })

  s.resume()
  setImmediate(() => {
    s.destroy()
    s.on('close', () => t.pass('destroy was not sync'))
  })
})


================================================
FILE: test/duplex.js
================================================
const test = require('brittle')
const { Duplex } = require('../')

test('if open does not end, it should stall', function (t) {
  t.plan(1)

  const d = new Duplex({
    open() {
      t.pass('open called')
    },
    read() {
      t.fail('should not call read')
    },
    write() {
      t.fail('should not call write')
    }
  })

  d.resume()
  d.write('hi')
})

test('Using both mapReadable and mapWritable to map data', function (t) {
  t.plan(2)

  const d = new Duplex({
    write(data, cb) {
      d.push(data)
      cb()
    },
    final(cb) {
      d.push(null)
      cb()
    },
    mapReadable: (num) => JSON.stringify({ num }),
    mapWritable: (input) => parseInt(input, 10)
  })
  d.on('data', (data) => {
    t.is(data, '{"num":32}')
  })
  d.on('close', () => {
    t.pass('closed')
  })
  d.write('32')
  d.end()
})

test('wait for readable', function (t) {
  t.plan(1)

  const d = new Duplex({
    read(cb) {
      d.push('ok')
      d.push(null)
      cb()
    }
  })

  d.on('readable', function () {
    t.is(d.read(), 'ok')
  })
})

test('write during end', function (t) {
  t.plan(3)

  const expected = ['a', 'b']

  const w = new Duplex({
    write(data, cb) {
      t.is(data, expected.shift())
      cb(null)
    },
    final(cb) {
      w.write('bad')
      cb(null)
    }
  })

  w.write('a')
  w.write('b')
  w.end()
  w.on('finish', () => w.push(null))
  w.on('close', () => t.pass('closed'))
})


================================================
FILE: test/errors.js
================================================
const test = require('brittle')
const StreamError = require('../lib/errors')

test('can make errors', function (t) {
  {
    const err = StreamError.STREAM_DESTROYED()

    t.is(err.code, 'STREAM_DESTROYED')
    t.is(err.message, 'Stream was destroyed')
    t.ok(StreamError.isStreamDestroyed(err))
  }

  {
    const err = StreamError.PREMATURE_CLOSE()

    t.is(err.code, 'PREMATURE_CLOSE')
    t.is(err.message, 'Premature close')
    t.ok(StreamError.isPrematureClose(err))
  }

  {
    const err = StreamError.ABORTED()

    t.is(err.code, 'ABORTED')
    t.is(err.message, 'Stream aborted')
    t.ok(StreamError.isAborted(err))
  }
})
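
The assertions above show the pattern lib/errors.js follows: each factory pairs a human-readable message with a stable `code` property, and ships a matching type guard so callers can branch on the code rather than the message. A minimal sketch of that pattern (illustrative only, not the actual module):

```javascript
// Build an Error carrying a machine-checkable code alongside its message.
function makeError(code, message) {
  const err = new Error(message)
  err.code = code
  return err
}

const STREAM_DESTROYED = () => makeError('STREAM_DESTROYED', 'Stream was destroyed')

// Guards check the code, so they keep working even if the message changes.
const isStreamDestroyed = (err) => !!err && err.code === 'STREAM_DESTROYED'
```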


================================================
FILE: test/get-stream-error.js
================================================
const test = require('brittle')
const { Readable, getStreamError } = require('../')
const StreamError = require('../lib/errors')

test('getStreamError, no errors', function (t) {
  const stream = new Readable()

  t.is(getStreamError(stream), null)
})

test('getStreamError, basic', function (t) {
  const stream = new Readable()
  stream.on('error', () => {})

  const err = new Error('stop')
  stream.destroy(err)

  t.is(getStreamError(stream), err)
})

test('getStreamError, only explicit errors by default', function (t) {
  const stream = new Readable()

  stream.destroy()

  t.absent(getStreamError(stream))
})

test('getStreamError, get premature destroy', function (t) {
  const stream = new Readable()

  stream.destroy()

  const err = getStreamError(stream, { all: true })
  t.alike(err.message, 'Stream was destroyed')
  t.ok(StreamError.isStreamDestroyed(err))
})


================================================
FILE: test/passthrough.js
================================================
const test = require('brittle')
const { PassThrough, Writable, Readable } = require('../')

test('passthrough', (t) => {
  t.plan(3)

  let i = 0
  const p = new PassThrough()
  const w = new Writable({
    write(data, cb) {
      i++
      if (i === 1) t.is(data, 'foo')
      else if (i === 2) t.is(data, 'bar')
      else t.fail('too many messages')
      cb()
    }
  })
  w.on('finish', () => t.pass('finished'))
  const r = new Readable()
  r.pipe(p).pipe(w)
  r.push('foo')
  r.push('bar')
  r.push(null)
})


================================================
FILE: test/pipe.js
================================================
const test = require('brittle')
const compat = global.Bare ? null : require('stream')
const { Readable, Writable } = require('../')

test('pipe to node stream', { skip: !compat }, function (t) {
  t.plan(3)

  const expected = ['hi', 'ho']

  const r = new Readable()
  const w = new compat.Writable({
    objectMode: true,
    write(data, enc, cb) {
      t.is(data, expected.shift())
      cb(null)
    }
  })

  r.push('hi')
  r.push('ho')
  r.push(null)

  r.pipe(w)

  w.on('finish', function () {
    t.is(expected.length, 0)
  })
})

test('pipe with callback - error case', function (t) {
  t.plan(2)

  const r = new Readable()
  const w = new Writable({
    write(data, cb) {
      cb(new Error('blerg'))
    }
  })

  r.pipe(w, function (err) {
    t.pass('callback called')
    t.alike(err, new Error('blerg'))
  })

  r.push('hello')
  r.push('world')
  r.push(null)
})

test('pipe with callback - error case with destroy', function (t) {
  t.plan(2)

  const r = new Readable()
  const w = new Writable({
    write(data, cb) {
      w.destroy(new Error('blerg'))
      cb(null)
    }
  })

  r.pipe(w, function (err) {
    t.pass('callback called')
    t.alike(err, new Error('blerg'))
  })

  r.push('hello')
  r.push('world')
})

test('pipe with callback - error case node stream', { skip: !compat }, function (t) {
  t.plan(2)

  const r = new Readable()
  const w = new compat.Writable({
    write(data, enc, cb) {
      cb(new Error('blerg'))
    }
  })

  r.pipe(w, function (err) {
    t.pass('callback called')
    t.alike(err, new Error('blerg'))
  })

  r.push('hello')
  r.push('world')
  r.push(null)
})

test('simple pipe', function (t) {
  t.plan(2)

  const buffered = []

  const r = new Readable()
  const w = new Writable({
    write(data, cb) {
      buffered.push(data)
      cb(null)
    },

    final() {
      t.pass('final called')
      t.alike(buffered, ['hello', 'world'])
    }
  })

  r.pipe(w)

  r.push('hello')
  r.push('world')
  r.push(null)
})

test('pipe with callback', function (t) {
  t.plan(3)

  const buffered = []

  const r = new Readable()
  const w = new Writable({
    write(data, cb) {
      buffered.push(data)
      cb(null)
    }
  })

  r.pipe(w, function (err) {
    t.pass('callback called')
    t.is(err, null)
    t.alike(buffered, ['hello', 'world'])
  })

  r.push('hello')
  r.push('world')
  r.push(null)
})

test('pipe continues if read is "blocked"', function (t) {
  t.plan(1)

  let written = 0
  let read = 0

  const r = new Readable({
    read(cb) {
      this.push('test')

      if (++read === 20) {
        setTimeout(done, 10)
        return
      }

      cb(null)
    }
  })

  const w = new Writable({
    write(data, cb) {
      written++
      cb(null)
    }
  })

  r.pipe(w)

  function done() {
    t.is(written, read)
  }
})


================================================
FILE: test/pipeline.js
================================================
const test = require('brittle')
const { pipeline, pipelinePromise, Transform, Readable, Writable } = require('../')

test('piping to a writable', function (t) {
  t.plan(2)

  const w = pipeline(
    Readable.from('hello'),
    new Writable({
      write(data, cb) {
        t.is(data, 'hello')
        cb()
      }
    })
  )
  w.on('close', () => t.pass('closed'))
})

test('piping with error', function (t) {
  t.plan(1)

  const r = new Readable()
  const w = new Writable()
  const err = new Error()
  pipeline(r, w, (error) => {
    t.alike(error, err)
  })
  r.destroy(err)
})

test('piping with final callback', function (t) {
  t.plan(2)

  pipeline(
    Readable.from('hello'),
    new Writable({
      write(data, cb) {
        t.is(data, 'hello')
        cb()
      }
    }),
    () => t.pass('ended')
  )
})

test('piping with transform stream inbetween', function (t) {
  t.plan(2)

  pipeline(
    [
      Readable.from('hello'),
      new Transform({
        transform(input, cb) {
          this.push(input.length)
          cb()
        }
      }),
      new Writable({
        write(data, cb) {
          t.is(data, 5)
          cb()
        }
      })
    ],
    () => t.pass('ended')
  )
})

test('piping to a writable', function (t) {
  t.plan(2)

  const w = pipeline(
    Readable.from('hello'),
    new Writable({
      write(data, cb) {
        t.is(data, 'hello')
        cb()
      }
    })
  )
  w.on('close', () => t.pass('closed'))
})

test('piping to a writable + promise', async function (t) {
  t.plan(2)

  const r = Readable.from('hello')
  let closed = false
  r.on('close', () => {
    closed = true
  })
  await pipelinePromise(
    r,
    new Writable({
      write(data, cb) {
        t.is(data, 'hello')
        cb()
      }
    })
  )
  t.ok(closed)
})
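
As the test above shows, `pipelinePromise` is the promise form of the callback-style `pipeline`. Wrapping any error-first pipeline function in a promise is a small generic transform; this sketch assumes a `pipeline(...streams, cb)` signature and is not streamx's actual implementation:

```javascript
// Turn an error-first pipeline(...streams, cb) into one returning a promise
// that resolves on success and rejects with the pipeline error.
function promisifyPipeline(pipeline) {
  return (...streams) =>
    new Promise((resolve, reject) => {
      pipeline(...streams, (err) => (err ? reject(err) : resolve()))
    })
}
```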


================================================
FILE: test/readable.js
================================================
const test = require('brittle')
const b4a = require('b4a')
const { Readable, isDisturbed } = require('../')

test('ondata', function (t) {
  t.plan(4)

  const r = new Readable()
  const buffered = []
  let ended = 0

  r.push('hello')
  r.push('world')
  r.push(null)

  r.on('data', (data) => buffered.push(data))
  r.on('end', () => ended++)
  r.on('close', function () {
    t.pass('closed')
    t.alike(buffered, ['hello', 'world'])
    t.is(ended, 1)
    t.ok(r.destroyed)
  })
})

test('pause', async function (t) {
  const r = new Readable()
  const buffered = []
  t.is(Readable.isPaused(r), true, 'starting off paused')
  r.on('data', (data) => buffered.push(data))
  r.on('close', () => t.end())
  r.push('hello')
  await nextImmediate()
  t.is(r.pause(), r, '.pause() returns self')
  t.is(Readable.isPaused(r), true, '.pause() marks stream as paused')
  r.push('world')
  await nextImmediate()
  t.alike(buffered, ['hello'], '.pause() prevents data to be read')
  t.is(r.resume(), r, '.resume() returns self')
  t.is(Readable.isPaused(r), false, '.resume() marks stream as resumed')
  await nextImmediate()
  t.alike(buffered, ['hello', 'world'])
  r.push(null)
})

test('resume', function (t) {
  t.plan(3)

  const r = new Readable()
  let ended = 0

  r.push('hello')
  r.push('world')
  r.push(null)

  r.resume()
  r.on('end', () => ended++)
  r.on('close', function () {
    t.pass('closed')
    t.is(ended, 1)
    t.ok(r.destroyed)
  })
})

test('lazy open', async function (t) {
  let opened = false
  const r = new Readable({
    open(cb) {
      opened = true
      cb(null)
    }
  })
  await nextImmediate()
  t.absent(opened)
  r.read()
  await nextImmediate()
  t.ok(opened)
})

test('eager open', async function (t) {
  let opened = false
  const r = new Readable({
    open(cb) {
      opened = true
      cb(null)
    },
    eagerOpen: true
  })
  await nextImmediate()
  t.ok(opened)
  r.push(null)
})

test('shorthands', function (t) {
  t.plan(3)

  const r = new Readable({
    read(cb) {
      this.push('hello')
      cb(null)
    },
    destroy(cb) {
      t.pass('destroyed')
      cb(null)
    }
  })

  r.once('readable', function () {
    t.is(r.read(), 'hello')
    r.destroy()
    t.is(r.read(), null)
  })
})

test('both push and the cb needs to be called for re-reads', function (t) {
  t.plan(2)

  let once = true

  const r = new Readable({
    read(cb) {
      t.ok(once, 'read called once')
      once = false
      cb(null)
    }
  })

  r.resume()

  setTimeout(function () {
    once = true
    r.push('hi')
  }, 100)
})

test('from array', function (t) {
  t.plan(1)

  const inc = []
  const r = Readable.from([1, 2, 3])
  r.on('data', (data) => inc.push(data))
  r.on('end', function () {
    t.alike(inc, [1, 2, 3])
  })
})

test('from buffer', function (t) {
  t.plan(1)

  const inc = []
  const r = Readable.from(Buffer.from('hello'))
  r.on('data', (data) => inc.push(data))
  r.on('end', function () {
    t.alike(inc, [Buffer.from('hello')])
  })
})

test('from async iterator', function (t) {
  t.plan(1)

  async function* test() {
    yield 1
    yield 2
    yield 3
  }

  const inc = []
  const r = Readable.from(test())
  r.on('data', (data) => inc.push(data))
  r.on('end', function () {
    t.alike(inc, [1, 2, 3])
  })
})

test('from array with highWaterMark', function (t) {
  const r = Readable.from([1, 2, 3], { highWaterMark: 1 })
  t.is(r._readableState.highWaterMark, 1)
})

test('from async iterator with highWaterMark', function (t) {
  async function* test() {
    yield 1
  }

  const r = Readable.from(test(), { highWaterMark: 1 })
  t.is(r._readableState.highWaterMark, 1)
})

test('unshift', async function (t) {
  const r = new Readable()
  r.pause()
  r.push(1)
  r.push(2)
  r.unshift(0)
  r.push(null)
  const inc = []
  for await (const entry of r) {
    inc.push(entry)
  }
  t.alike(inc, [0, 1, 2])
})

test('from readable should return the original readable', function (t) {
  const r = new Readable()
  t.is(Readable.from(r), r)
})

test('map readable data', async function (t) {
  const r = new Readable({
    map: (input) => JSON.parse(input)
  })
  r.push('{ "foo": 1 }')
  for await (const obj of r) { // eslint-disable-line
    t.alike(obj, { foo: 1 })
    break
  }
})

test('use mapReadable to map data', async function (t) {
  const r = new Readable({
    map: () => t.fail('.mapReadable has priority'),
    mapReadable: (input) => JSON.parse(input)
  })
  r.push('{ "foo": 1 }')
  for await (const obj of r) { // eslint-disable-line
    t.alike(obj, { foo: 1 })
    break
  }
})

test('live stream', function (t) {
  t.plan(3)

  const r = new Readable({
    read(cb) {
      this.push('data')
      this.push('data')
      this.push('data')
      // assume cb is called way later
    }
  })

  r.on('data', function (data) {
    t.is(data, 'data')
  })
})

test('live stream with readable', function (t) {
  t.plan(3)

  const r = new Readable({
    read(cb) {
      this.push('data')
      this.push('data')
      this.push('data')
      // assume cb is called way later
    }
  })

  r.on('readable', function () {
    let data
    while ((data = r.read()) !== null) t.is(data, 'data')
  })
})

test('resume a stalled stream', function (t) {
  t.plan(1)

  const expected = []
  let once = true

  const r = new Readable({
    read(cb) {
      if (once) {
        once = false
        this.push('data')
        expected.push('data')
        return cb()
      }

      for (let i = 0; i < 20; i++) {
        this.push('data')
        expected.push('data')
      }

      // pretend it's stalled
    }
  })

  const collected = []

  r.once('data', function (data) {
    r.pause()
    collected.push(data)
    setImmediate(() => {
      r.on('data', function (data) {
        collected.push(data)
        if (collected.length === 21) {
          t.alike(collected, expected)
        }
      })
      r.resume()
    })
  })
})

test('no read-ahead with pause/resume', function (t) {
  t.plan(4)

  let tick = 0

  const r = new Readable({
    highWaterMark: 0,
    read(cb) {
      this.push('tick: ' + ++tick)
      cb()
    }
  })

  r.once('data', function () {
    t.is(tick, 1)
    r.pause()
    setImmediate(() => {
      t.is(tick, 1)
      r.resume()
      r.once('data', function () {
        t.is(tick, 2)
        r.pause()
        setImmediate(() => {
          t.is(tick, 2)
        })
      })
    })
  })
})

test('no read-ahead with async iterator', async function (t) {
  let tick = 0

  const r = new Readable({
    highWaterMark: 0,
    read(cb) {
      this.push('tick: ' + ++tick)
      if (tick === 10) this.push(null)
      cb()
    }
  })

  let expectedTick = 0
  for await (const data of r) {
    t.is(tick, ++expectedTick)
    t.is(data, 'tick: ' + tick)
    await nextImmediate()
  }

  t.is(expectedTick, 10)
})

test('setEncoding', async function (t) {
  const r = new Readable()

  r.setEncoding('utf-8')
  const buffer = b4a.from('hællå wørld!')
  for (let i = 0; i < buffer.byteLength; i++) {
    r.push(buffer.subarray(i, i + 1))
  }
  r.push(null)
  const expected = b4a.toString(buffer).split('')
  for await (const data of r) {
    t.is(data, expected.shift())
  }
  t.is(expected.length, 0)
})

test('setEncoding respects existing map', function (t) {
  t.plan(1)

  const r = new Readable({
    encoding: 'utf-8',
    map(data) {
      return JSON.parse(data)
    }
  })

  r.push('{"hello":"world"}')
  r.once('data', function (data) {
    t.alike(data, { hello: 'world' })
  })
})

test('setEncoding empty string', async function (t) {
  t.plan(1)

  const r = new Readable()

  r.setEncoding('utf-8')
  const buffer = b4a.from('')
  r.push(buffer)
  r.push(null)

  for await (const data of r) {
    t.is(data, '')
  }
})

test('is disturbed', function (t) {
  const r = new Readable()
  t.is(isDisturbed(r), false)

  r.push('hello')
  t.is(isDisturbed(r), false)

  r.resume()
  t.is(isDisturbed(r), true)

  r.pause()
  t.is(isDisturbed(r), true)
})

test('is disturbed after immediate destroy', function (t) {
  t.plan(3)

  const r = new Readable()
  t.is(isDisturbed(r), false)

  r.destroy()
  t.is(isDisturbed(r), true)

  r.on('close', () => t.is(isDisturbed(r), true))
})

function nextImmediate() {
  return new Promise((resolve) => setImmediate(resolve))
}


================================================
FILE: test/transform.js
================================================
const test = require('brittle')
const { Transform } = require('../')

test('default transform teardown when saturated', async function (t) {
  const stream = new Transform({
    transform(data, cb) {
      cb(null, data)
    }
  })

  for (let i = 0; i < 20; i++) {
    stream.write('hello')
  }

  await new Promise((resolve) => setImmediate(resolve))

  stream.destroy()

  await new Promise((resolve) => stream.on('close', resolve))

  t.pass('close fired')
})


================================================
FILE: test/writable.js
================================================
const test = require('brittle')
const { Writable, Duplex } = require('../')

test('opens before writes', function (t) {
  t.plan(2)

  const trace = []
  const stream = new Writable({
    open(cb) {
      trace.push('open')
      return cb(null)
    },
    write(data, cb) {
      trace.push('write')
      return cb(null)
    }
  })
  stream.on('close', () => {
    t.is(trace.length, 2)
    t.is(trace[0], 'open')
  })
  stream.write('data')
  stream.end()
})

test('drain', function (t) {
  t.plan(2)

  const stream = new Writable({
    highWaterMark: 1,
    write(data, cb) {
      cb(null)
    }
  })

  t.absent(stream.write('a'))
  stream.on('drain', function () {
    t.pass('drained')
  })
})

test('drain multi write', function (t) {
  t.plan(4)

  const stream = new Writable({
    highWaterMark: 1,
    write(data, cb) {
      cb(null)
    }
  })

  t.absent(stream.write('a'))
  t.absent(stream.write('a'))
  t.absent(stream.write('a'))
  stream.on('drain', function () {
    t.pass('drained')
  })
})

test('drain async write', function (t) {
  t.plan(3)

  let flushed = false

  const stream = new Writable({
    highWaterMark: 1,
    write(data, cb) {
      setImmediate(function () {
        flushed = true
        cb(null)
      })
    }
  })

  t.absent(stream.write('a'))
  t.absent(flushed)
  stream.on('drain', function () {
    t.ok(flushed)
  })
})

test('writev', function (t) {
  t.plan(3)

  const expected = [[], ['ho']]

  const s = new Writable({
    writev(batch, cb) {
      t.alike(batch, expected.shift())
      cb(null)
    }
  })

  for (let i = 0; i < 100; i++) {
    expected[0].push('hi-' + i)
    s.write('hi-' + i)
  }

  s.on('drain', function () {
    s.write('ho')
    s.end()
  })

  s.on('finish', function () {
    t.pass('finished')
  })
})

test('map written data', function (t) {
  t.plan(2)

  const r = new Writable({
    write(data, cb) {
      t.is(data, '{"foo":1}')
      cb()
    },
    map: (input) => JSON.stringify(input)
  })
  r.on('finish', () => {
    t.pass('finished')
  })
  r.write({ foo: 1 })
  r.end()
})

test('use mapWritable to map data', function (t) {
  t.plan(2)

  const r = new Writable({
    write(data, cb) {
      t.is(data, '{"foo":1}')
      cb()
    },
    map: () => t.fail('.mapWritable has priority'),
    mapWritable: (input) => JSON.stringify(input)
  })
  r.on('finish', () => {
    t.pass('finished')
  })
  r.write({ foo: 1 })
  r.end()
})

test('many ends', function (t) {
  t.plan(2)

  let finals = 0
  let finish = 0

  const s = new Duplex({
    final(cb) {
      finals++
      cb(null)
    }
  })

  s.end()
  queueMicrotask(() => {
    s.end()
    queueMicrotask(() => {
      s.end()
    })
  })

  s.on('finish', function () {
    finish++
    t.is(finals, 1)
    t.is(finish, 1)
  })
})

test('drained helper', async function (t) {
  const w = new Writable({
    write(data, cb) {
      setImmediate(cb)
    }
  })

  for (let i = 0; i < 20; i++) w.write('hi')

  await Writable.drained(w)

  t.is(w._writableState.queue.length, 0)

  for (let i = 0; i < 20; i++) w.write('hi')

  const d1 = Writable.drained(w)

  for (let i = 0; i < 20; i++) w.write('hi')

  const d2 = Writable.drained(w)

  d1.then(() => {
    t.not(w._writableState.queue.length, 0, 'future writes are queued')
  })

  d2.then(() => {
    t.is(w._writableState.queue.length, 0, 'all drained now')
  })

  await d1
  await d2

  await Writable.drained(w)

  t.pass('works if no writes are pending')

  for (let i = 0; i < 20; i++) w.write('hi')

  const d3 = Writable.drained(w)
  w.destroy()

  t.absent(await d3)
  t.absent(await Writable.drained(w), 'already destroyed')
})
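
The `d1`/`d2` assertions above pin down the key semantics of `Writable.drained`: it resolves once every write issued *before* the call has flushed, even if later writes are still queued. A generic sketch of that snapshot-barrier idea (hypothetical names; unlike a real stream it starts tasks concurrently rather than through an internal FIFO):

```javascript
class DrainBarrier {
  constructor() {
    this.issued = 0
    this.completed = 0
    this.waiters = []
  }

  // Start a task immediately (a real stream would queue it) and settle any
  // waiter whose snapshot target has now been reached.
  write(task) {
    this.issued++
    Promise.resolve(task()).then(() => {
      this.completed++
      this.waiters = this.waiters.filter(({ target, resolve }) => {
        if (this.completed >= target) {
          resolve(true)
          return false
        }
        return true
      })
    })
  }

  // Snapshot how many writes have been issued so far; resolve once that many
  // have completed, regardless of writes that arrive in the meantime.
  drained() {
    if (this.completed >= this.issued) return Promise.resolve(true)
    const target = this.issued
    return new Promise((resolve) => this.waiters.push({ target, resolve }))
  }
}
```

With this model, a barrier taken after the first batch resolves while a second batch is still pending — exactly the `d1`/`d2` ordering the tests assert.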

test('drained helper, duplex', async function (t) {
  const w = new Duplex({
    write(data, cb) {
      setImmediate(cb)
    }
  })

  for (let i = 0; i < 20; i++) w.write('hi')

  await Writable.drained(w)

  t.is(w._writableState.queue.length, 0)

  for (let i = 0; i < 20; i++) w.write('hi')

  const d1 = Writable.drained(w)

  for (let i = 0; i < 20; i++) w.write('hi')

  const d2 = Writable.drained(w)

  d1.then(() => {
    t.not(w._writableState.queue.length, 0, 'future writes are queued')
  })

  d2.then(() => {
    t.is(w._writableState.queue.length, 0, 'all drained now')
  })

  await d1
  await d2

  await Writable.drained(w)

  t.pass('works if no writes are pending')

  for (let i = 0; i < 20; i++) w.write('hi')

  const d3 = Writable.drained(w)
  w.destroy()

  t.absent(await d3)
  t.absent(await Writable.drained(w), 'already destroyed')
})

test('drained helper, inflight write', async function (t) {
  let writing = false
  const w = new Writable({
    write(data, cb) {
      writing = true
      setImmediate(() => {
        setImmediate(() => {
          writing = false
          cb()
        })
      })
    }
  })

  w.write('hello')
  w.end()

  await new Promise((resolve) => setImmediate(resolve))
  t.ok(writing, 'is writing')
  await Writable.drained(w)
  t.absent(writing, 'not writing')
})

test('drained helper, writev', async function (t) {
  let writing = 0
  let continueWrite

  const wrote = new Promise((resolve) => {
    continueWrite = resolve
  })

  const w = new Writable({
    writev(datas, cb) {
      continueWrite()
      setImmediate(() => {
        writing -= datas.length
        cb()
      })
    }
  })

  for (let i = 0; i < 10; i++) {
    writing++
    w.write('hello')
  }

  w.end()

  await wrote
  t.ok(writing > 0, 'is writing')
  await Writable.drained(w)
  t.ok(writing === 0, 'not writing')
})

test('drained helper, writev, already flushed', async function (t) {
  const w = new Writable({
    writev(datas, cb) {
      cb()
    }
  })

  await Writable.drained(w)
  t.pass('drained resolved')
})

test('can cork and uncork the stream', async function (t) {
  const w = new Writable({
    writev(batch, cb) {
      t.alike(batch, [1, 2, 3])
      cb(null)
    }
  })

  w.cork()
  w.write(1)
  await eventFlush()
  w.write(2)
  await eventFlush()
  w.write(3)
  w.uncork()

  await Writable.drained(w)
})

function eventFlush() {
  return new Promise((resolve) => setImmediate(resolve))
}
SYMBOL INDEX (291 symbols across 14 files)

FILE: examples/fs.js
  class FileWriteStream (line 4) | class FileWriteStream extends Writable {
    method constructor (line 5) | constructor(filename, mode) {
    method _open (line 12) | _open(cb) {
    method _write (line 20) | _write(data, cb) {
    method _destroy (line 28) | _destroy(cb) {
  class FileReadStream (line 34) | class FileReadStream extends Readable {
    method constructor (line 35) | constructor(filename) {
    method _open (line 41) | _open(cb) {
    method _read (line 49) | _read(cb) {
    method _destroy (line 60) | _destroy(cb) {

FILE: index.js
  constant FIFO (line 2) | const FIFO = require('fast-fifo')
  constant MAX (line 12) | const MAX = (1 << 29) - 1
  constant OPENING (line 15) | const OPENING = 0b0001
  constant PREDESTROYING (line 16) | const PREDESTROYING = 0b0010
  constant DESTROYING (line 17) | const DESTROYING = 0b0100
  constant DESTROYED (line 18) | const DESTROYED = 0b1000
  constant NOT_OPENING (line 20) | const NOT_OPENING = MAX ^ OPENING
  constant NOT_PREDESTROYING (line 21) | const NOT_PREDESTROYING = MAX ^ PREDESTROYING
  constant READ_ACTIVE (line 24) | const READ_ACTIVE = 0b00000000000001 << 4
  constant READ_UPDATING (line 25) | const READ_UPDATING = 0b00000000000010 << 4
  constant READ_PRIMARY (line 26) | const READ_PRIMARY = 0b00000000000100 << 4
  constant READ_QUEUED (line 27) | const READ_QUEUED = 0b00000000001000 << 4
  constant READ_RESUMED (line 28) | const READ_RESUMED = 0b00000000010000 << 4
  constant READ_PIPE_DRAINED (line 29) | const READ_PIPE_DRAINED = 0b00000000100000 << 4
  constant READ_ENDING (line 30) | const READ_ENDING = 0b00000001000000 << 4
  constant READ_EMIT_DATA (line 31) | const READ_EMIT_DATA = 0b00000010000000 << 4
  constant READ_EMIT_READABLE (line 32) | const READ_EMIT_READABLE = 0b00000100000000 << 4
  constant READ_EMITTED_READABLE (line 33) | const READ_EMITTED_READABLE = 0b00001000000000 << 4
  constant READ_DONE (line 34) | const READ_DONE = 0b00010000000000 << 4
  constant READ_NEXT_TICK (line 35) | const READ_NEXT_TICK = 0b00100000000000 << 4
  constant READ_NEEDS_PUSH (line 36) | const READ_NEEDS_PUSH = 0b01000000000000 << 4
  constant READ_READ_AHEAD (line 37) | const READ_READ_AHEAD = 0b10000000000000 << 4
  constant READ_FLOWING (line 40) | const READ_FLOWING = READ_RESUMED | READ_PIPE_DRAINED
  constant READ_ACTIVE_AND_NEEDS_PUSH (line 41) | const READ_ACTIVE_AND_NEEDS_PUSH = READ_ACTIVE | READ_NEEDS_PUSH
  constant READ_PRIMARY_AND_ACTIVE (line 42) | const READ_PRIMARY_AND_ACTIVE = READ_PRIMARY | READ_ACTIVE
  constant READ_EMIT_READABLE_AND_QUEUED (line 43) | const READ_EMIT_READABLE_AND_QUEUED = READ_EMIT_READABLE | READ_QUEUED
  constant READ_RESUMED_READ_AHEAD (line 44) | const READ_RESUMED_READ_AHEAD = READ_RESUMED | READ_READ_AHEAD
  constant READ_NOT_ACTIVE (line 46) | const READ_NOT_ACTIVE = MAX ^ READ_ACTIVE
  constant READ_NON_PRIMARY (line 47) | const READ_NON_PRIMARY = MAX ^ READ_PRIMARY
  constant READ_NON_PRIMARY_AND_PUSHED (line 48) | const READ_NON_PRIMARY_AND_PUSHED = MAX ^ (READ_PRIMARY | READ_NEEDS_PUSH)
  constant READ_PUSHED (line 49) | const READ_PUSHED = MAX ^ READ_NEEDS_PUSH
  constant READ_PAUSED (line 50) | const READ_PAUSED = MAX ^ READ_RESUMED
  constant READ_NOT_QUEUED (line 51) | const READ_NOT_QUEUED = MAX ^ (READ_QUEUED | READ_EMITTED_READABLE)
  constant READ_NOT_ENDING (line 52) | const READ_NOT_ENDING = MAX ^ READ_ENDING
  constant READ_PIPE_NOT_DRAINED (line 53) | const READ_PIPE_NOT_DRAINED = MAX ^ READ_FLOWING
  constant READ_NOT_NEXT_TICK (line 54) | const READ_NOT_NEXT_TICK = MAX ^ READ_NEXT_TICK
  constant READ_NOT_UPDATING (line 55) | const READ_NOT_UPDATING = MAX ^ READ_UPDATING
  constant READ_NO_READ_AHEAD (line 56) | const READ_NO_READ_AHEAD = MAX ^ READ_READ_AHEAD
  constant READ_PAUSED_NO_READ_AHEAD (line 57) | const READ_PAUSED_NO_READ_AHEAD = MAX ^ READ_RESUMED_READ_AHEAD
  constant WRITE_ACTIVE (line 60) | const WRITE_ACTIVE = 0b00000000001 << 18
  constant WRITE_UPDATING (line 61) | const WRITE_UPDATING = 0b00000000010 << 18
  constant WRITE_PRIMARY (line 62) | const WRITE_PRIMARY = 0b00000000100 << 18
  constant WRITE_QUEUED (line 63) | const WRITE_QUEUED = 0b00000001000 << 18
  constant WRITE_UNDRAINED (line 64) | const WRITE_UNDRAINED = 0b00000010000 << 18
  constant WRITE_DONE (line 65) | const WRITE_DONE = 0b00000100000 << 18
  constant WRITE_EMIT_DRAIN (line 66) | const WRITE_EMIT_DRAIN = 0b00001000000 << 18
  constant WRITE_NEXT_TICK (line 67) | const WRITE_NEXT_TICK = 0b00010000000 << 18
  constant WRITE_WRITING (line 68) | const WRITE_WRITING = 0b00100000000 << 18
  constant WRITE_FINISHING (line 69) | const WRITE_FINISHING = 0b01000000000 << 18
  constant WRITE_CORKED (line 70) | const WRITE_CORKED = 0b10000000000 << 18
  constant WRITE_NOT_ACTIVE (line 72) | const WRITE_NOT_ACTIVE = MAX ^ (WRITE_ACTIVE | WRITE_WRITING)
  constant WRITE_NON_PRIMARY (line 73) | const WRITE_NON_PRIMARY = MAX ^ WRITE_PRIMARY
  constant WRITE_NOT_FINISHING (line 74) | const WRITE_NOT_FINISHING = MAX ^ (WRITE_ACTIVE | WRITE_FINISHING)
  constant WRITE_DRAINED (line 75) | const WRITE_DRAINED = MAX ^ WRITE_UNDRAINED
  constant WRITE_NOT_QUEUED (line 76) | const WRITE_NOT_QUEUED = MAX ^ WRITE_QUEUED
  constant WRITE_NOT_NEXT_TICK (line 77) | const WRITE_NOT_NEXT_TICK = MAX ^ WRITE_NEXT_TICK
  constant WRITE_NOT_UPDATING (line 78) | const WRITE_NOT_UPDATING = MAX ^ WRITE_UPDATING
  constant WRITE_NOT_CORKED (line 79) | const WRITE_NOT_CORKED = MAX ^ WRITE_CORKED
  constant ACTIVE (line 82) | const ACTIVE = READ_ACTIVE | WRITE_ACTIVE
  constant NOT_ACTIVE (line 83) | const NOT_ACTIVE = MAX ^ ACTIVE
  constant DONE (line 84) | const DONE = READ_DONE | WRITE_DONE
  constant DESTROY_STATUS (line 85) | const DESTROY_STATUS = DESTROYING | DESTROYED | PREDESTROYING
  constant OPEN_STATUS (line 86) | const OPEN_STATUS = DESTROY_STATUS | OPENING
  constant AUTO_DESTROY (line 87) | const AUTO_DESTROY = DESTROY_STATUS | DONE
  constant NON_PRIMARY (line 88) | const NON_PRIMARY = WRITE_NON_PRIMARY & READ_NON_PRIMARY
  constant ACTIVE_OR_TICKING (line 89) | const ACTIVE_OR_TICKING = WRITE_NEXT_TICK | READ_NEXT_TICK
  constant TICKING (line 90) | const TICKING = ACTIVE_OR_TICKING & NOT_ACTIVE
  constant IS_OPENING (line 91) | const IS_OPENING = OPEN_STATUS | TICKING
  constant READ_PRIMARY_STATUS (line 94) | const READ_PRIMARY_STATUS = OPEN_STATUS | READ_ENDING | READ_DONE
  constant READ_STATUS (line 95) | const READ_STATUS = OPEN_STATUS | READ_DONE | READ_QUEUED
  constant READ_ENDING_STATUS (line 96) | const READ_ENDING_STATUS = OPEN_STATUS | READ_ENDING | READ_QUEUED
  constant READ_READABLE_STATUS (line 97) | const READ_READABLE_STATUS = OPEN_STATUS | READ_EMIT_READABLE | READ_QUE...
  constant SHOULD_NOT_READ (line 98) | const SHOULD_NOT_READ =
  constant READ_BACKPRESSURE_STATUS (line 100) | const READ_BACKPRESSURE_STATUS = DESTROY_STATUS | READ_ENDING | READ_DONE
  constant READ_UPDATE_SYNC_STATUS (line 101) | const READ_UPDATE_SYNC_STATUS = READ_UPDATING | OPEN_STATUS | READ_NEXT_...
  constant READ_NEXT_TICK_OR_OPENING (line 102) | const READ_NEXT_TICK_OR_OPENING = READ_NEXT_TICK | OPENING
  constant WRITE_PRIMARY_STATUS (line 105) | const WRITE_PRIMARY_STATUS = OPEN_STATUS | WRITE_FINISHING | WRITE_DONE
  constant WRITE_QUEUED_AND_UNDRAINED (line 106) | const WRITE_QUEUED_AND_UNDRAINED = WRITE_QUEUED | WRITE_UNDRAINED
  constant WRITE_QUEUED_AND_ACTIVE (line 107) | const WRITE_QUEUED_AND_ACTIVE = WRITE_QUEUED | WRITE_ACTIVE
  constant WRITE_DRAIN_STATUS (line 108) | const WRITE_DRAIN_STATUS = WRITE_QUEUED | WRITE_UNDRAINED | OPEN_STATUS ...
  constant WRITE_STATUS (line 109) | const WRITE_STATUS = OPEN_STATUS | WRITE_ACTIVE | WRITE_QUEUED | WRITE_C...
  constant WRITE_PRIMARY_AND_ACTIVE (line 110) | const WRITE_PRIMARY_AND_ACTIVE = WRITE_PRIMARY | WRITE_ACTIVE
  constant WRITE_ACTIVE_AND_WRITING (line 111) | const WRITE_ACTIVE_AND_WRITING = WRITE_ACTIVE | WRITE_WRITING
  constant WRITE_FINISHING_STATUS (line 112) | const WRITE_FINISHING_STATUS = OPEN_STATUS | WRITE_FINISHING | WRITE_QUE...
  constant WRITE_BACKPRESSURE_STATUS (line 113) | const WRITE_BACKPRESSURE_STATUS = WRITE_UNDRAINED | DESTROY_STATUS | WRI...
  constant WRITE_UPDATE_SYNC_STATUS (line 114) | const WRITE_UPDATE_SYNC_STATUS = WRITE_UPDATING | OPEN_STATUS | WRITE_NE...
  constant WRITE_DROP_DATA (line 115) | const WRITE_DROP_DATA = WRITE_FINISHING | WRITE_DONE | DESTROY_STATUS
  class WritableState (line 119) | class WritableState {
    method constructor (line 120) | constructor(
    method ending (line 137) | get ending() {
    method ended (line 141) | get ended() {
    method push (line 145) | push(data) {
    method shift (line 161) | shift() {
    method end (line 170) | end(data) {
    method autoBatch (line 180) | autoBatch(data, cb) {
    method update (line 193) | update() {
    method updateNonPrimary (line 211) | updateNonPrimary() {
    method continueUpdate (line 234) | continueUpdate() {
    method updateCallback (line 240) | updateCallback() {
    method updateNextTick (line 248) | updateNextTick() {
  class ReadableState (line 255) | class ReadableState {
    method constructor (line 256) | constructor(
    method ending (line 274) | get ending() {
    method ended (line 278) | get ended() {
    method pipe (line 282) | pipe(pipeTo, cb) {
    method push (line 309) | push(data) {
    method shift (line 334) | shift() {
    method unshift (line 345) | unshift(data) {
    method read (line 358) | read() {
    method drain (line 383) | drain() {
    method update (line 402) | update() {
    method updateNonPrimary (line 432) | updateNonPrimary() {
    method continueUpdate (line 462) | continueUpdate() {
    method updateCallback (line 468) | updateCallback() {
    method updateNextTickIfOpen (line 476) | updateNextTickIfOpen() {
    method updateNextTick (line 482) | updateNextTick() {
  class TransformState (line 489) | class TransformState {
    method constructor (line 490) | constructor(stream) {
  class Pipeline (line 497) | class Pipeline {
    method constructor (line 498) | constructor(src, dst, cb) {
    method finished (line 506) | finished() {
    method done (line 510) | done(stream, err) {
  function afterDrain (line 540) | function afterDrain() {
  function afterFinal (line 545) | function afterFinal(err) {
  function afterDestroy (line 568) | function afterDestroy(err) {
  function afterWrite (line 595) | function afterWrite(err) {
  function afterRead (line 614) | function afterRead(err) {
  function updateReadNT (line 625) | function updateReadNT() {
  function updateWriteNT (line 632) | function updateWriteNT() {
  function tickDrains (line 639) | function tickDrains(drains) {
  function afterOpen (line 649) | function afterOpen(err) {
  function afterTransform (line 677) | function afterTransform(err, data) {
  function newListener (line 682) | function newListener(name) {
  class Stream (line 702) | class Stream extends EventEmitter {
    method constructor (line 703) | constructor(opts) {
    method _open (line 720) | _open(cb) {
    method _destroy (line 724) | _destroy(cb) {
    method _predestroy (line 728) | _predestroy() {
    method readable (line 732) | get readable() {
    method writable (line 736) | get writable() {
    method destroyed (line 740) | get destroyed() {
    method destroying (line 744) | get destroying() {
    method destroy (line 748) | destroy(err) {
  class Readable (line 778) | class Readable extends Stream {
    method constructor (line 779) | constructor(opts) {
    method setEncoding (line 793) | setEncoding(encoding) {
    method _read (line 805) | _read(cb) {
    method pipe (line 809) | pipe(dest, cb) {
    method read (line 815) | read() {
    method push (line 820) | push(data) {
    method unshift (line 825) | unshift(data) {
    method resume (line 830) | resume() {
    method pause (line 836) | pause() {
    method _fromAsyncIterator (line 842) | static _fromAsyncIterator(ite, opts) {
    method from (line 867) | static from(data, opts) {
    method isBackpressured (line 882) | static isBackpressured(rs) {
    method isPaused (line 889) | static isPaused(rs) {
    method [asyncIterator] (line 893) | [asyncIterator]() {
  class Writable (line 960) | class Writable extends Stream {
    method constructor (line 961) | constructor(opts) {
    method cork (line 975) | cork() {
    method uncork (line 979) | uncork() {
    method _writev (line 984) | _writev(batch, cb) {
    method _write (line 988) | _write(data, cb) {
    method _final (line 992) | _final(cb) {
    method isBackpressured (line 996) | static isBackpressured(ws) {
    method drained (line 1000) | static drained(ws) {
    method write (line 1014) | write(data) {
    method end (line 1019) | end(data) {
  class Duplex (line 1026) | class Duplex extends Readable {
    method constructor (line 1028) | constructor(opts) {
    method cork (line 1041) | cork() {
    method uncork (line 1045) | uncork() {
    method _writev (line 1050) | _writev(batch, cb) {
    method _write (line 1054) | _write(data, cb) {
    method _final (line 1058) | _final(cb) {
    method write (line 1062) | write(data) {
    method end (line 1067) | end(data) {
  class Transform (line 1074) | class Transform extends Duplex {
    method constructor (line 1075) | constructor(opts) {
    method _write (line 1085) | _write(data, cb) {
    method _read (line 1093) | _read(cb) {
    method destroy (line 1104) | destroy(err) {
    method _transform (line 1112) | _transform(data, cb) {
    method _flush (line 1116) | _flush(cb) {
    method _final (line 1120) | _final(cb) {
  class PassThrough (line 1126) | class PassThrough extends Transform {}
  function transformAfterFlush (line 1128) | function transformAfterFlush(err, data) {
  function pipelinePromise (line 1137) | function pipelinePromise(...streams) {
  function pipeline (line 1146) | function pipeline(stream, ...streams) {
  function echo (line 1215) | function echo(s) {
  function isStream (line 1219) | function isStream(stream) {
  function isStreamx (line 1223) | function isStreamx(stream) {
  function isEnding (line 1227) | function isEnding(stream) {
  function isEnded (line 1231) | function isEnded(stream) {
  function isFinishing (line 1235) | function isFinishing(stream) {
  function isFinished (line 1239) | function isFinished(stream) {
  function getStreamError (line 1243) | function getStreamError(stream, opts = {}) {
  function isReadStreamx (line 1252) | function isReadStreamx(stream) {
  function isDisturbed (line 1256) | function isDisturbed(stream) {
  function isTypedArray (line 1264) | function isTypedArray(data) {
  function defaultByteLength (line 1268) | function defaultByteLength(data) {
  function noop (line 1272) | function noop() {}
  function abort (line 1274) | function abort() {
  function isWritev (line 1278) | function isWritev(s) {
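The READ_*/WRITE_* constants indexed above pack the entire stream status into a single integer, and each NOT_* mask is MAX with the corresponding bit(s) cleared. A minimal sketch of the set/clear/test pattern these constants support (flag values copied from index.js; the `status` variable is illustrative):

```javascript
// Packed-bitfield status pattern behind streamx's status constants.
const MAX = (1 << 29) - 1          // every usable status bit set
const OPENING = 0b0001
const PREDESTROYING = 0b0010
const DESTROYING = 0b0100
const DESTROYED = 0b1000
const NOT_OPENING = MAX ^ OPENING  // mask that clears only the OPENING bit
const DESTROY_STATUS = DESTROYING | DESTROYED | PREDESTROYING

let status = 0

status |= OPENING                            // set a flag
console.log((status & OPENING) !== 0)        // flag is now set

status &= NOT_OPENING                        // clear it via the mask
console.log((status & OPENING) !== 0)        // flag is cleared

// Compound masks like DESTROY_STATUS test several flags in one AND:
console.log((status & DESTROY_STATUS) === 0) // none of the destroy bits set
```

Precomputing the NOT_* masks means every state transition in the hot path is a single `&=` or `|=` rather than a sequence of boolean field writes.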

FILE: lib/errors.js
  method constructor (line 2) | constructor(msg, code, fn = StreamError) {
  method isStreamDestroyed (line 12) | static isStreamDestroyed(err) {
  method isPrematureClose (line 16) | static isPrematureClose(err) {
  method isAborted (line 20) | static isAborted(err) {
  method name (line 24) | get name() {
  method STREAM_DESTROYED (line 28) | static STREAM_DESTROYED() {
  method PREMATURE_CLOSE (line 32) | static PREMATURE_CLOSE() {
  method ABORTED (line 36) | static ABORTED() {

FILE: test/all.js
  function runTests (line 5) | async function runTests() {

FILE: test/async-iterator.js
  method read (line 10) | read(cb) {
  method read (line 25) | read(cb) {
  method destroy (line 29) | destroy(cb) {
  method read (line 45) | read(cb) {
  method destroy (line 49) | destroy(cb) {
  method read (line 67) | read(cb) {
  method destroy (line 75) | destroy(cb) {
  method read (line 97) | read(cb) {
  method destroy (line 103) | destroy(cb) {
  function createInfinite (line 141) | function createInfinite(signal) {

FILE: test/compat.js
  function run (line 9) | function run(eos) {

FILE: test/destroy.js
  method open (line 10) | open(cb) {
  method predestroy (line 13) | predestroy() {

FILE: test/duplex.js
  method open (line 8) | open() {
  method read (line 11) | read() {
  method write (line 14) | write() {
  method write (line 27) | write(data, cb) {
  method final (line 31) | final(cb) {
  method read (line 52) | read(cb) {
  method write (line 70) | write(data, cb) {
  method final (line 74) | final(cb) {

FILE: test/passthrough.js
  method write (line 10) | write(data, cb) {

FILE: test/pipe.js
  method write (line 13) | write(data, enc, cb) {
  method write (line 35) | write(data, cb) {
  method write (line 55) | write(data, cb) {
  method write (line 75) | write(data, enc, cb) {
  method write (line 97) | write(data, cb) {
  method final (line 102) | final() {
  method write (line 122) | write(data, cb) {
  method read (line 146) | read(cb) {
  method write (line 159) | write(data, cb) {
  function done (line 167) | function done() {

FILE: test/pipeline.js
  method write (line 10) | write(data, cb) {
  method write (line 37) | write(data, cb) {
  method transform (line 53) | transform(input, cb) {
  method write (line 59) | write(data, cb) {
  method write (line 75) | write(data, cb) {
  method write (line 95) | write(data, cb) {

FILE: test/readable.js
  method open (line 68) | open(cb) {
  method open (line 83) | open(cb) {
  method read (line 98) | read(cb) {
  method destroy (line 102) | destroy(cb) {
  method read (line 121) | read(cb) {
  method read (line 237) | read(cb) {
  method read (line 254) | read(cb) {
  method read (line 275) | read(cb) {
  method read (line 316) | read(cb) {
  method read (line 344) | read(cb) {
  method map (line 382) | map(data) {
  function nextImmediate (line 434) | function nextImmediate() {

FILE: test/transform.js
  method transform (line 6) | transform(data, cb) {

FILE: test/writable.js
  method open (line 9) | open(cb) {
  method write (line 13) | write(data, cb) {
  method write (line 31) | write(data, cb) {
  method write (line 47) | write(data, cb) {
  method write (line 67) | write(data, cb) {
  method writev (line 88) | writev(batch, cb) {
  method write (line 113) | write(data, cb) {
  method write (line 130) | write(data, cb) {
  method final (line 151) | final(cb) {
  method write (line 174) | write(data, cb) {
  method write (line 219) | write(data, cb) {
  method write (line 265) | write(data, cb) {
  method writev (line 294) | writev(datas, cb) {
  method writev (line 318) | writev(datas, cb) {
  method writev (line 329) | writev(batch, cb) {
  function eventFlush (line 346) | function eventFlush() {

About this extraction

This page contains the full source code of the mafintosh/streamx GitHub repository, extracted and formatted as plain text for AI agents and large language models (LLMs). The extraction includes 26 files (89.5 KB), approximately 25.6k tokens, and a symbol index with 291 extracted functions, classes, methods, constants, and types. Use this with OpenClaw, Claude, ChatGPT, Cursor, Windsurf, or any other AI tool that accepts text input. You can copy the full output to your clipboard or download it as a .txt file.

Extracted by GitExtract — free GitHub repo to text converter for AI. Built by Nikandr Surkov.
