Repository: sighjs/sigh
Branch: master
Commit: 6d21fdd555a2
Files: 40
Total size: 77.8 KB
Directory structure:
gitextract_loa6yxo6/
├── .gitignore
├── .npmignore
├── changelog.md
├── circle.yml
├── docs/
│ └── writing-plugins.md
├── index.js
├── package.json
├── readme.md
├── sigh.js
├── src/
│ ├── PipelineCompiler.js
│ ├── api.js
│ ├── gulp-adapter.js
│ ├── plugin/
│ │ ├── concat.js
│ │ ├── debounce.js
│ │ ├── env.js
│ │ ├── filter.js
│ │ ├── glob.js
│ │ ├── merge.js
│ │ ├── pipeline.js
│ │ └── write.js
│ └── test/
│ ├── PipelineCompiler.spec.js
│ ├── api.spec.js
│ ├── bootstrap.spec.js
│ ├── gulp-adapter.spec.js
│ └── plugin/
│ ├── concat.spec.js
│ ├── debounce.spec.js
│ ├── env.spec.js
│ ├── filter.spec.js
│ ├── glob.spec.js
│ ├── helper.js
│ ├── merge.js
│ ├── pipeline.spec.js
│ └── write.spec.js
└── test/
├── fixtures/
│ ├── sigh-project/
│ │ ├── Sigh.js
│ │ ├── bootstrap.js
│ │ └── src/
│ │ ├── arrow.js
│ │ └── class.js
│ └── simple-project/
│ ├── file1.js
│ └── file2.js
└── tmp/
└── .keepdir
================================================
FILE CONTENTS
================================================
================================================
FILE: .gitignore
================================================
/npm-debug.log
/node_modules
/api.map
/lib
/test/tmp/*
!/test/tmp/.keepdir
================================================
FILE: .npmignore
================================================
================================================
FILE: changelog.md
================================================
# sigh changelog
## v0.12.29
* `merge` with `collectInitial` option was missing events.
## v0.12.28
* errors thrown in plugins issue an error and exit the sigh process.
## v0.12.27
* `pipeline({ activate: true }, ...)` works inside of `merge`.
## v0.12.26
* glob/write plugins support encoding.
## v0.12.25
* merge with `collectInitial` didn't forward subsequent events in watch mode.
## v0.12.24
* fix `merge` breakage from previous version.
## v0.12.23
* `merge` has `collectInitial` option.
## v0.12.22
* improvements to code.
## v0.12.21
* `filter` plugin is now present as `select` and `reject`.
## v0.12.20
* `select` and `reject` instead of `filter`.
## v0.12.20
* write plugin sets `basePath` to equal write directory.
* setters for Event.basePath and Event.projectPath.
## v0.12.19
* fix crash when using gulp-adapter for file type that doesn't support identity source map generation.
## v0.12.18
* fix writing files that do not support source maps.
## v0.12.17
* fix crash when filter plugin filters all events.
## v0.12.16
* add filter plugin.
## v0.12.15
* glob: forwards input events down stream with glob/watch events.
## v0.12.14
* debounce: avoid loss of events when a file is changed during debouncing of events in the init phase.
## v0.12.13
* debounce: only debounce events during the init phase (when the initial state of the filesystem is being read).
## v0.12.12
* concat/write: fix crash when writing/concatenating untransformed files.
## v0.12.11
* fix deletion of last non-comment character when stripping source map comment from js/css file.
================================================
FILE: circle.yml
================================================
dependencies:
override:
- 'nvm exec 0.10.33 npm install'
- 'nvm exec iojs-v1.3.0 npm install'
test:
override:
- 'nvm exec 0.10.33 npm test'
- 'nvm exec iojs-v1.3.0 npm test'
================================================
FILE: docs/writing-plugins.md
================================================
# Writing sigh plugins
sigh will generate your plugin scaffolding for you; all you need to do then is fill in a small amount of code, push to git, run `npm publish` and you're done.
First install sigh-cli:
```
sudo npm install -g sigh-cli
```
Then change to the directory where you want your plugin to live and type this:
```
sigh -p my-plugin-name
```
This will prompt you like this:
```
sigh plugin generator
? What is the name of this plugin? my plugin name
? Which github username/organisation should own this plugin? my-username
? What's your github author? Your Name <name@email.net>
? Which of the following apply to your plugin? (Press <space> to select)
❯◉ Maps input files to output files 1:1
◯ Spreads work over multiple CPUs
? Which features would you like to use? (Press <space> to select)
❯◉ CircleCI integration
◯ TravisCI integration
? Which dependencies do you need? (Press <space> to select)
❯◉ bluebird
◉ lodash
◉ sigh-core
```
This will generate code according to the responses you provide. Your github username, author and CI choices are remembered and used as the defaults next time you run `sigh -p`.
To begin developing, change to the directory containing your plugin and run sigh:
```
cd sigh-my-plugin-name
sigh -w
```
You can add the `-v` or `-vv` flags for more information. Every time you change a source file the plugin will be recompiled and tested.
This is the code generated if you select the options shown above:
```javascript
import _ from 'lodash'
import Promise from 'bluebird'
import { Bacon } from 'sigh-core'
import { mapEvents } from 'sigh-core/lib/stream'
export default function(op, opts = {}) {
return mapEvents(op.stream, event => {
if (event.type !== 'add' && event.type !== 'change')
return event
// if (event.fileType !== 'relevantType') return event
// TODO: alter event here or return a new event
// event.changeFileSuffix('newSuffix')
return event
})
}
```
Plugins are written using ECMAScript 6 as `sigh -w` makes it trivial to translate the code to ES5, and the generated tests run with source maps remapped to show stack traces in the original source code. When the plugin is pushed to the npm registry the generated code is pushed too, so users of the module will not have to wait for any compilation when installing the plugin. If you would prefer to use ES5, the source files can be removed after the first compilation and the generated ES5 files in `lib` can be edited from then on.
The first argument (called `op` here) to the exported function is used to pass information to the plugin; the subsequent arguments are passed from the arguments used within the `Sigh.js` file. The `op` argument has the following fields:
* `stream`: Bacon.js stream to adapt.
* `treeIndex`: depth-first index of the operator within the pipeline tree. This can be written to in order to set the `treeIndex` for the next pipeline operation, otherwise it is incremented by one.
* `watch`: true if and only if the `-w` flag was used.
* `environment`: environment being built (change with the `-e` or `--environment` flag).
* `procPool`: A [ProcessPool](https://github.com/ohjames/process-pool) instance configured to limit the number of active processes according to the `-j` argument passed to sigh. This can be used to schedule work over multiple CPUs, see the [sigh-babel plugin](https://github.com/sighjs/sigh-babel/blob/master/src/index.js) for an example.
* `compiler`: a pipeline compiler that can be used to compile any sub-trees, this is used in advanced plugins that take other pipelines as arguments.
The [sigh-core](https://github.com/sighjs/sigh-core) library also provides some functionality useful for writing plugins including access to the `Bacon` instance sigh uses.
To adapt the code above so that it appends a comment to each source file, the scaffolded code would be changed like this:
```javascript
import { mapEvents } from 'sigh-core/lib/stream'
export default function(op, suffix) {
return mapEvents(op.stream, event => {
if ((event.type !== 'add' && event.type !== 'change') || event.fileType !== 'js')
return event
event.data += `\n// data added by sigh-suffix plugin: ${suffix || "default"}`
return event
})
}
```
Assuming the plugin above is called `suffixer` it could be used in a Sighfile like:
```javascript
module.exports = function(pipelines) {
pipelines['js'] = [ glob('*.js'), suffixer('kittens'), write('build') ]
}
```
The stream payload is an array of event objects, each event object contains the following fields:
* `type`: `add`, `change`, or `remove`
* `path`: path to source file.
* `sourceMap`: source map as a javascript object; this is read-write, but in most cases `applySourceMap` should be used instead of writing to this variable.
* `data`: file content as a string (plugins can modify this; if modified then `applySourceMap` should be called with a source map describing the modifications).
* `sourceData`: original content before any transforms (read-only).
* `fileType`: filename extension (read-only).
* `basePath`: optional base directory containing resource.
* `projectPath`: path with basePath stripped off.
* `opTreeIndex`: depth-first index (within asset pipeline tree) of the source operator for this event.
The first array in the stream always contains an event of type `add` for every source file.
The following methods are available:
* `applySourceMap(nextSourceMap)`: apply a new source map on top of the resource's existing source map.
* `changeFileSuffix(targetSuffix)`: change the target suffix, e.g. from `scss` to `css`.
Plugins can also return a `Promise` to delay construction of the pipeline.
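As a sketch of this, consider a hypothetical plugin that reads a banner file before building its stream; returning a Promise means the pipeline is not constructed until the file has been read (`mapEvents` is the sigh-core helper used in the examples above, the banner logic is purely illustrative):
```javascript
import fs from 'fs'
import { mapEvents } from 'sigh-core/lib/stream'

export default function(op, bannerPath) {
  // returning a Promise delays construction of the pipeline until it resolves
  return new Promise((resolve, reject) => {
    fs.readFile(bannerPath, 'utf8', (err, banner) => {
      if (err) return reject(err)
      resolve(mapEvents(op.stream, event => {
        if (event.type !== 'add' && event.type !== 'change')
          return event
        event.data = banner + event.data
        return event
      }))
    })
  })
}
```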
## Using multiple CPUs
By answering the generator questions in this way:
```
sigh plugin generator
? What is the name of this plugin? my plugin name
? Which github username/organisation should own this plugin? my-username
? What's your github author? Your Name <name@email.net>
? Which of the following apply to your plugin? (Press <space> to select)
❯◉ Maps input files to output files 1:1
◉ Spreads work over multiple CPUs
? Which features would you like to use? (Press <space> to select)
❯◉ CircleCI integration
◯ TravisCI integration
? Which dependencies do you need? (Press <space> to select)
❯◯ bluebird
```
The following scaffolding is generated:
```javascript
import _ from 'lodash'
import { Bacon } from 'sigh-core'
import { mapEvents } from 'sigh-core/lib/stream'
function myPluginNameTask(opts) {
// this function is called once for each subprocess in order to cache state,
// it is not a closure and does not have access to the surrounding state, use
// `require` to include any modules you need, for further info see
// https://github.com/ohjames/process-pool
var log = require('sigh-core').log
// this task runs inside the subprocess to transform each event
return event => {
var data, sourceMap
// TODO: data = compile(event.data) etc.
return { data, sourceMap }
}
}
function adaptEvent(compiler) {
// data sent to/received from the subprocess has to be serialised/deserialised
return event => {
if (event.type !== 'add' && event.type !== 'change')
return event
// if (event.fileType !== 'relevantType') return event
return compiler(_.pick(event, 'type', 'data', 'path', 'projectPath')).then(result => {
event.data = result.data
if (result.sourceMap)
event.applySourceMap(JSON.parse(result.sourceMap))
// event.changeFileSuffix('newSuffix')
return event
})
}
}
var pooledProc
export default function(op, opts = {}) {
if (! pooledProc)
pooledProc = op.procPool.prepare(myPluginNameTask, opts, { module })
return mapEvents(op.stream, adaptEvent(pooledProc))
}
```
The comments labelled `TODO` must then be filled in to complete the plugin.
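As a hypothetical illustration of filling in the first `TODO`, here the compile step is stood in for by upper-casing the file content; everything else about the task's contract (one factory call per subprocess, returning a serialisable `{ data, sourceMap }`) is as described above:
```javascript
function myPluginNameTask(opts) {
  // runs once per subprocess; no access to surrounding closure state,
  // so use require() here for any modules the task needs
  return function(event) {
    var data = event.data.toUpperCase() // stand-in for: data = compile(event.data)
    var sourceMap // a real compiler would also produce a serialised source map
    return { data: data, sourceMap: sourceMap }
  }
}
```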
## Incremental rebuilds and plugins
Due to the way Sigh's event stream works, processing never needs to be repeated: only work relating to the files that actually changed is performed. In most cases caching isn't necessary, and in the few cases where it is, Sigh handles it transparently. Library code available to plugin writers makes it simple to handle caching in the cases where it is necessary.
# Future updates to this document
* Document file coalescing, for now see the [concat plugin](https://github.com/sighjs/sigh/blob/master/src/plugin/concat.js) and [toFileSystemState](https://github.com/sighjs/sigh-core/blob/master/src/stream.js).
================================================
FILE: index.js
================================================
require('source-map-support').install()
module.exports = require('./lib/api')
================================================
FILE: package.json
================================================
{
"name": "sigh",
"version": "0.12.35",
"description": "the fastest and most expressive build system for the web and node.js/io.js",
"main": "index",
"directories": {
"lib": "./lib"
},
"scripts": {
"test": "sigh"
},
"repository": {
"type": "git",
"url": "git://github.com/sighjs/sigh.git"
},
"keywords": [
"build",
"assets",
"pipeline"
],
"author": "James Pike <github@chilon.net>",
"license": "MIT",
"bugs": {
"url": "https://github.com/sighjs/sigh/issues"
},
"engines": {
"iojs": ">=1.3.0 <4.0.0",
"node": ">=0.10.0 <8.0.0"
},
"dependencies": {
"bluebird": "^2.9.13",
"chokidar": "^1.0.0",
"esprima": "^2.1.0",
"fs-extra": "^0.30.0",
"glob": "^5.0.1",
"lodash": "^3.6.0",
"process-pool": "^0.3.5",
"rewire": "~2.3.1",
"sigh-core": "^0.11.8",
"source-map": "^0.4.0",
"source-map-support": "^0.2.10",
"vinyl": "^0.4.6"
},
"devDependencies": {
"babel-plugin-transform-es2015-modules-commonjs": "^6.22.0",
"babel-preset-es2015": "^6.22.0",
"babel-preset-stage-2": "^6.22.0",
"chai": "^2.2.0",
"gulp-uglify": "^1.2.0",
"sigh": "^0.12.35",
"sigh-babel": "^0.12.2",
"sigh-cli": "^0.2.1",
"sigh-mocha": "^0.1.4",
"temp": "^0.8.1"
}
}
================================================
FILE: readme.md
================================================
# sigh

[](https://circleci.com/gh/sighjs/sigh)
Sigh is a declarative functional reactive build system for the web and io.js/node.js.
Sigh combines the best features of existing asset pipelines with unique features of its own, including high speed (by delegating tasks to multiple processes) and perfect source maps even in production builds. With sigh, sub-second incremental production rebuilds are a reality, and source map support allows you to debug production issues happening in minified, transpiled code against the original source.
* Pipelines are written in JavaScript with a very neat tree-based syntax; no more grunt spaghetti or verbose gulp files: [plumber][plumber].
* Supports gulp plugins: [gulp][gulp].
* Uses Functional Reactive Programming via [bacon.js][bacon], your asset pipelines are bacon streams ([plumber][plumber] uses Microsoft's [rxjs][rxjs], [gulp][gulp] uses node's built-in stream API).
* Supports source maps at every stage of the pipeline: [plumber][plumber] and [gulp][gulp] ([see gulp issue](https://github.com/wearefractal/gulp-concat/issues/94)).
* Schedules work over multiple CPU cores to reduce build times and make better use of available processing resources.
* Caches all data in memory where possible rather than the filesystem: [gulp][gulp].
* Easy to write plugins in a small number of lines of code: [gobble][gobble].
* Includes a plugin generator (`sigh -p plugin-name`) that asks the user about their plugin and generates a scaffolded project ready for `npm publish`.
* Supports watching files and updating the pipeline as files change: [plumber][plumber] (and [gulp][gulp] when coupled with a couple of extra plugins). No special code or plugins are necessary for file watching, just use the `-w` flag.
* Supports incremental rebuilds (only perform the minimum work necessary on file changes): [broccoli][broccoli].
* Inputs are based on simple glob expressions. Recursive glob expressions can be used when you want to speak in terms of directory trees rather than files.
* Supports `n:n`, `n:1` and `1:n` operations: [broccoli][broccoli]. The stream payload is an array of objects each representing a file `add`, `change` or `remove` event; `1:1` plugins emit and consume single element arrays.
* Sigh has [automated tests](https://circleci.com/gh/sighjs/sigh) (using mocha/chai) that cover all functionality.
[plumber]: https://github.com/plumberjs/plumber
[gobble]: https://github.com/gobblejs/gobble
[gulp]: https://github.com/gulpjs/gulp
[rxjs]: https://github.com/Reactive-Extensions/RxJS
[bacon]: https://baconjs.github.io/
[broccoli]: https://github.com/broccolijs/broccoli
Check out [this presentation](http://sighjs.github.io) about JavaScript build systems and sigh.
## Using sigh
Install sigh-cli globally:
```bash
% sudo npm install -g sigh-cli
```
Install sigh and sigh/gulp plugins in your project:
```bash
% npm install --save-dev sigh sigh-babel sigh-mocha gulp-uglify
```
Write a file called `sigh.js` (or `Sigh.js`) and put it in the root of the project:
```javascript
// To use a plugin it must be declared as a global variable, some plugins are
// built-in and others are loaded by scanning package.json for entries
// beginning with "sigh-" or "gulp-".
var merge, glob, concat, write, env, pipeline, debounce
var uglify, mocha, babel
module.exports = function(pipelines) {
pipelines['build-source'] = [
merge(
[ glob('src/**/*.js'), babel() ],
glob('vendor/*.js', 'bootstrap.js')
),
debounce(500),
concat('combined.js'),
env(uglify(), ['production', 'staging']),
write('build/assets')
]
pipelines['build-tests'] = [
glob({ basePath: 'test' }, '*.js'),
babel(),
write('build/test')
]
pipelines.alias.build = ['build-source', 'build-tests']
pipelines['tests-run'] = [
pipeline('build-source', 'build-tests'),
debounce(500),
mocha({ files: 'lib/**/*.spec.js' })
]
}
```
The pipeline `build-source` globs files matching `src/**/*.js` (a recursive glob) and transpiles them with babel. This transpiled output is concatenated together with the files matching the glob pattern `vendor/*.js` followed by the file `bootstrap.js` (`concat` operators sort files by the depth-first index of the source stream that produced their untransformed content). The concatenated resource is uglified (using `gulp-uglify`), but only during builds for the `production` and `staging` environments. The resulting file is written to the directory `build/assets`.
The pipeline `build-tests` takes the files in `test`, compiles them with `babel` and writes each compiled file to the directory `build/test`. Each file's path relative to its `basePath` becomes its offset within the output directory; in this case only the filename is used.
The pipeline `tests-run` runs mocha when either the `build-tests` or `build-source` pipeline completes. Its run is delayed until 500ms have passed without either pipeline completing, to avoid wasting CPU time.
Running `sigh -w` would compile all the files then watch the directories and files matching the glob patterns for changes. Each plugin caches resources and only recompiles the files that have changed.
Sigh plugins are injected into the variables defined at the top of the file. Some plugins are built-in (for now) and others are found by scanning package.json for dependency and devDependency entries of the form `sigh-*`. Sigh also searches for plugins of the form `gulp-*` and adapts them to work with sigh.
### Running sigh
Running `sigh` with no arguments will run all pipelines.
```bash
% sigh
```
Compile all pipelines and then watch files for changes compiling those that have changed:
```bash
% sigh -w
```
Compile/watch only the specified pipelines (with the `sigh.js` shown above, the sources and tests would be compiled but the tests would never be run):
```bash
% sigh -w build-source build-tests
```
This is equivalent to using the alias defined in `sigh.js`:
```bash
% sigh -w build
```
It is also possible to create pipelines on the `pipeline.explicit` object that only run if specifically requested:
```javascript
pipelines.explicit['tests-run'] = mocha({ files: 'lib/**/*.spec.js' })
```
This pipeline would only run if `sigh tests-run` is used, not with plain `sigh`.
# Built-in plugins
## glob
The glob plugin takes a list of glob expressions as arguments starting with an optional object containing options.
```javascript
module.exports = function(pipelines) {
pipelines.js = [
glob('test/*.js', 'src/**/*.js', 'bootstrap.js'),
write('build')
]
}
```
The glob plugin also forwards input events down the stream:
```javascript
module.exports = function(pipelines) {
pipelines.js = [
glob('test/*.js'),
glob('src/**/*.js'), // forwards glob events from test directory
write('build')
]
}
```
### options
* basePath: Restricts the glob to operate within `basePath` and also attaches the property to all resources (affecting their `projectPath` field).
```javascript
glob({ basePath: 'src' }, '*.js') // similar to glob('src/*.js')
```
* debounce: Debounce file updates, defaults to 120 (milliseconds). Ideally it should not be set lower than 120; this interval is also used to iron out spurious events reported by the underlying file watching library Sigh uses.
```javascript
glob({ debounce: 500 }, '*.js')
```
* encoding: Set the `encoding` attribute of all generated events to this value; the attribute is used by the `write` plugin.
```javascript
glob({ encoding: 'binary' }, '*.png')
```
Sigh does not currently support events representing directories, so please avoid globbing directory paths for now.
## write
The `write` plugin is responsible for writing data to the filesystem. It adds files corresponding to `Event` objects with type `add`, updates files for events with type `change` and removes files corresponding to events with type `remove`. The output path of each file is determined by prefixing its `projectPath` with the argument to `write`. Operations that produce events (such as glob) take a `basePath` option so that the output path can be easily manipulated.
```javascript
module.exports = function(pipelines) {
pipelines.js = [
glob({ basePath: 'src' }, '**/*.js'),
write('build')
]
}
```
This pipeline takes all files with the extension `js` recursively reachable from `src` and writes each one to the `build` directory (without the `src` prefix, due to `basePath`).
The write plugin passes events representing the written files down the stream; this is useful in combination with the `pipeline` plugin.
The `clobber` option can be used to recursively remove the contents of the output directory when the plugin is initialised:
```javascript
module.exports = function(pipelines) {
pipelines.js = [
glob({ basePath: 'src' }, '**/*.js'),
write({ clobber: true }, 'build')
]
}
```
A glob pattern or list of glob patterns ([according to node-glob syntax](https://github.com/isaacs/node-glob)) can be supplied to `clobber` to restrict which files get removed.
```javascript
module.exports = function(pipelines) {
pipelines.js = [
glob({ basePath: 'src' }, '**/*.js'),
write({ clobber: '!(jspm_packages|config.js)' }, 'build')
]
}
```
## merge
The `merge` plugin combines many streams together.
```javascript
pipelines.js = [
merge(
[ glob({ basePath: 'src' }, '*.js'), babel() ],
[ glob('vendor/*.js'), concat('vendor.js') ],
glob('bootstrap.js')
),
write('build')
]
```
This would transpile files matching `src/*.js` using babel and copy them to the directory `build`. Files matching `vendor/*.js` will all be concatenated together into a single file at `build/vendor.js`. The file `bootstrap.js` will be copied to `build` without being modified beyond adding a source map comment.
The `merge` plugin forwards events as they come by default (so if the first of many streams emits an event it will be forwarded immediately, before other streams have emitted their own events). To have `merge` wait for every stream to emit one payload, merging those events together into a single first event, the `collectInitial` option can be used:
```javascript
pipelines.js = [
merge(
{ collectInitial: true },
pipeline('build-tests'),
pipeline('build-source')
),
pipeline('run-tests')
]
```
## concat
The `concat` plugin concatenates all resources together into one file. The order in which the files are concatenated corresponds to the depth-first index within the tree of the plugin that produced the original source content of that file.
```javascript
pipelines.js = [
merge(
[ glob('src/*.js'), babel() ],
glob('loader.js', 'bootstrap.js')
),
concat('output.js'),
write('build')
]
```
In this example the order of the files in `output.js` is determined by tree order:
1. The files in `src/*.js` compiled by babel.
2. The file `loader.js`.
3. The file `bootstrap.js`.
You can see here that `glob` uses multiple tree indexes and assigns them to events according to the index of the pattern that produced them.
## debounce
Combines events in the pipeline until the event stream settles for longer than the given period. The debounce is deactivated after sigh enters "file watch" mode (to ensure minimum build delays during development).
```javascript
pipelines.js = [
glob('loader.js', 'bootstrap.js'),
debounce(200),
concat('output.js'),
write('build')
]
```
In this pipeline, if `loader.js` and `bootstrap.js` change within 200 milliseconds of each other then the `concat` plugin will receive an array of two events (rather than being called twice, each time with an array containing one event). If no parameter is given then a default of 500 milliseconds is used.
## env
Runs the operation only when one of the selected environments is chosen (using sigh's `-e` or `--environment` flag), otherwise passes data through unchanged.
```javascript
pipelines.js = [
glob('src/*.js'),
env(concat('output.js'), ['production', 'staging']),
write('build')
]
```
This pipeline only concatenates the files together in `production` and `staging` builds; otherwise multiple files are written to the directory `build`. The environments may be passed as an array or as positional parameters.
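For instance, the positional form of the example above is equivalent to the array form:
```javascript
env(concat('output.js'), 'production', 'staging')
```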
## pipeline
The pipeline plugin allows named pipelines to be connected.
```javascript
pipelines['source-js'] = [
glob({ basePath: 'src' }, '*.js', 'plugin/*.js'), babel(), write('lib')
]
pipelines['test-js'] = [
glob({ basePath: 'test' }, '*.js', 'plugin/*.js'), babel(), write('lib')
]
pipelines['tests-run'] = [
pipeline('source-js', 'test-js'),
debounce(700),
mocha({ files: 'lib/test/*.spec.js' })
]
```
In this example the `pipeline` plugin in the `tests-run` pipeline forwards the output from the `source-js` and `test-js` pipelines down the stream. By default it will not force a pipeline to run unless the user specifies it, e.g. if the user runs `sigh test-js tests-run` then the `pipeline` plugin will issue stream events from the `test-js` pipeline only.
To force a pipeline operation to activate a named pipeline, the `activate` option can be used. The previous `tests-run` pipeline could be rewritten more flexibly, allowing the user to also run the mocha tests manually:
```javascript
pipelines['tests-run'] = [
pipeline('source-js', 'test-js'),
debounce(700),
pipeline({ activate: true }, 'mocha')
]
pipelines.explicit.mocha = mocha({ files: 'lib/test/*.spec.js' })
```
This also shows that `pipeline` operations forward pipeline events to the named pipelines in addition to receiving events from them.
To activate some pipelines and not others, one of the following equivalent formats can be used:
```javascript
pipeline({ activate: true }, 'mocha', { activate: false }, 'express')
```
```javascript
pipeline('express', { activate: true }, 'mocha')
```
```javascript
merge(
pipeline({ activate: true }, 'mocha'),
pipeline('express')
)
```
## select
Filters events from the pipeline leaving only those that match all expressions.
Select only events where the `projectPath` begins with the letter `b`.
```javascript
pipelines['source-js'] = [
glob({ basePath: 'src' }, '*.js'),
select({ projectPath: /^b/ })
]
```
Only pass events down the pipeline with type `change` and whose `projectPath` begins with the letter `c`.
```javascript
pipelines['source-js'] = [
glob({ basePath: 'src' }, '*.js'),
select({ type: 'change', projectPath: /^c/ })
]
```
## reject
Filters events from the pipeline leaving only those that don't match all expressions.
Filters out `add` events where `projectPath` begins with `b`.
```javascript
pipelines['source-js'] = [
glob({ basePath: 'src' }, '*.js'),
reject({ type: 'add', projectPath: /^b/ })
]
```
# Writing sigh plugins
Please see [plugin writing guide](https://github.com/sighjs/sigh/blob/master/docs/writing-plugins.md)
# Changelog
[Link to changelog](https://github.com/sighjs/sigh/blob/master/changelog.md).
# Future Work
* debounce: `duringWatch` option to also debounce after sigh enters watch mode.
* pipeline: `activate` should activate (but not create dependency) in first position of pipeline.
* `sigh -w` should watch `sigh.js` file for changes in addition to the source files.
================================================
FILE: sigh.js
================================================
var glob, pipeline, babel, debounce, write, mocha, merge
module.exports = function(pipelines) {
var babelOpts = {
presets: ['es2015', 'stage-2'],
plugins: ['transform-es2015-modules-commonjs'],
}
pipelines['build-sources'] = [
glob({ basePath: 'src' }, '*.js', 'plugin/*.js'),
babel(babelOpts),
write('lib')
]
pipelines['build-tests'] = [
glob({ basePath: 'src/test' }, '*.js', 'plugin/*.js'),
babel(babelOpts),
write('lib/test')
]
pipelines.alias.build = ['build-sources', 'build-tests']
pipelines['run-tests'] = [
merge(
{ collectInitial: true },
pipeline('build-sources'),
pipeline('build-tests')
),
pipeline({ activate: true }, 'mocha')
]
pipelines.explicit.mocha = [ mocha({ files: 'lib/test/**/*.spec.js' }) ]
}
================================================
FILE: src/PipelineCompiler.js
================================================
import _ from 'lodash'
import { Bacon } from 'sigh-core'
import Promise from 'bluebird'
import ProcessPool from 'process-pool'
const DEFAULT_JOBS = 4
export default class {
/**
* @param {Object} options Object containing the following fields:
* watch: {Boolean} Whether to pass "watch" to plugins (i.e. sigh -w was used).
* environment: {String} Environment being built (sigh -e env).
* treeIndex: {Number} First tree index, defaulting to 1.
*/
constructor(options) {
if (! options)
options = {}
this.treeIndex = options.treeIndex || 1
this.watch = options.watch
this.environment = options.environment
// dependency name against array of input streams
this.pipelineInputs = {}
// compiled stream by pipeline name
this.streams = {}
this.initStream = Bacon.constant([])
let processLimit = options.jobs || DEFAULT_JOBS
// include sigh process as one job so subtract one
// TODO: (processLimit > 0) when process-pools supports limit of 0
if (processLimit > 1)
--processLimit
this.procPool = new ProcessPool({ processLimit })
}
addPipelineInput(name, stream) {
const pipelineInputs = this.pipelineInputs[name]
if (pipelineInputs)
pipelineInputs.push(stream)
else
this.pipelineInputs[name] = [ stream ]
}
/**
* Clean up all allocated resources.
*/
destroy() {
this.procPool.destroy()
}
/**
* Turn a pipeline into a stream.
* @param {Array} pipeline Array of operations representing pipeline.
* @return {Bacon} stream that results from combining all operations in the pipeline.
*/
compile(pipeline, inputStream = null, name = null) {
if (name) {
const pipelineInputs = this.pipelineInputs[name]
if (pipelineInputs) {
inputStream = Bacon.mergeAll(
inputStream ? [ inputStream, ...pipelineInputs ] : pipelineInputs
)
}
}
if (! inputStream)
inputStream = this.initStream
const compileOperation = (operation, opData) => {
let stream
try {
stream = operation.plugin ?
operation.plugin.apply(this, [ opData ].concat(operation.args)) :
operation(opData)
}
catch (e) {
console.log('issue running pipeline', name)
console.log(e.stack ? e.stack : e)
process.exit(1)
}
return Promise.resolve(stream).then(stream => {
if (this.treeIndex === opData.treeIndex)
++this.treeIndex
else if (opData.treeIndex > this.treeIndex)
this.treeIndex = opData.treeIndex
if (opData.cleanup) {
// TODO: register pipeline cleanup function
}
return stream
})
}
if (! (pipeline instanceof Array))
pipeline = [ pipeline ]
const { watch, environment } = this
const streamPromise = Promise.reduce(pipeline, (stream, operation) => {
const { treeIndex, procPool } = this
return compileOperation(operation, {
stream,
watch,
treeIndex,
procPool,
compiler: this,
environment
})
}, inputStream)
if (! name)
return streamPromise
return streamPromise.then(stream => {
return this.streams[name] = stream
})
}
}
================================================
FILE: src/api.js
================================================
import fs from 'fs'
import _ from 'lodash'
import Promise from 'bluebird'
import rewire from 'rewire'
import path from 'path'
import activeCallLimiter from 'process-pool/lib/activeCallLimiter'
import { log, Bacon } from 'sigh-core'
import PipelineCompiler from './PipelineCompiler'
import gulpAdapter from './gulp-adapter'
import merge from './plugin/merge'
import concat from './plugin/concat'
import debounce from './plugin/debounce'
import env from './plugin/env'
import glob from './plugin/glob'
import pipeline from './plugin/pipeline'
import write from './plugin/write'
import filter from './plugin/filter'
const plugins = {
merge, concat, debounce, env, glob, pipeline, write,
select: filter.bind(null, true),
reject: filter.bind(null, false),
}
/**
* Run Sigh.js
* @return {Promise} Resolves to an object { pipelineName: baconStream }
*/
export function invoke(opts = {}) {
try {
let exitCode = 0
let streams
const compiler = new PipelineCompiler(opts)
const startTime = Date.now()
const relTime = (time = startTime) => ((Date.now() - time) / 1000).toFixed(3)
return compileSighfile(compiler, opts)
.then(_streams => {
streams = _streams
if (opts.verbose)
log('waiting for subprocesses to start')
return compiler.procPool.ready()
})
.then(() => {
if (opts.verbose)
log('subprocesses started in %s seconds', relTime())
const pipeStartTime = Date.now()
_.forEach(streams, (stream, pipelineName) => {
stream.onValue(events => {
const createTime = _.min(events, 'createTime').createTime
const timeDuration = relTime(createTime ? createTime.getTime() : pipeStartTime)
log('pipeline %s complete: %s seconds', pipelineName, timeDuration)
if (opts.verbose > 1) {
events.forEach(event => {
const { path, projectPath } = event
const suffix = path !== projectPath ? ` [${event.projectPath}]` : ''
log.nested(`${event.type} ${event.path}${suffix}`)
})
}
})
stream.onError(error => {
exitCode = 1
log.warn('\x07error: pipeline %s', pipelineName)
log.warn(error)
if (error.stack)
log.warn(error.stack)
})
})
Bacon.mergeAll(_.values(streams)).onEnd(() => {
if (opts.verbose)
log('pipeline(s) complete: %s seconds', relTime())
compiler.destroy()
process.exit(exitCode)
})
})
}
catch (e) {
if (e instanceof Error) {
log.warn(e)
if (e.stack)
log.warn(e.stack)
process.exit(1)
}
else {
throw e
}
}
}
/**
* Requires a plugin ensuring es6 modules are also supported
*/
function requirePlugin(path) {
const module = require(path)
return module.default || module
}
/**
* Compile the Sigh.js file in the current directory with the given options.
* @return {Promise} Resolves to an object { pipelineName: baconStream }
*/
export function compileSighfile(compiler, opts = {}) {
let packageJson
try {
packageJson = JSON.parse(fs.readFileSync('package.json'))
}
catch (e) {}
const notPlugin = { 'sigh-cli': true, 'sigh-core' : true }
if (packageJson) {
[ packageJson.devDependencies, packageJson.dependencies ].forEach(deps => {
if (! deps)
return
_.forEach(deps, function(version, pkg) {
if (notPlugin[pkg])
return
if (/^sigh-/.test(pkg)) {
plugins[pkg.substr(5)] = requirePlugin(path.join(process.cwd(), 'node_modules', pkg))
}
else if (/^gulp-/.test(pkg)) {
const name = pkg.substr(5)
if (! plugins[name])
plugins[name] = gulpAdapter(requirePlugin(path.join(process.cwd(), 'node_modules', pkg)))
}
})
})
}
let sighPath
try {
sighPath = require.resolve(path.join(process.cwd(), 'Sigh'))
}
catch (e) {
sighPath = require.resolve(path.join(process.cwd(), 'sigh'))
}
const sighModule = rewire(sighPath)
_.forEach(plugins, (plugin, key) => injectPlugin(sighModule, key))
const pipelines = { alias: {}, explicit: {} }
sighModule(pipelines)
const selectedPipelines = selectPipelines(opts.pipelines, pipelines)
const runPipelines = loadPipelineDependencies(selectedPipelines, pipelines)
if (opts.verbose) {
log(
'running pipelines [ %s ] with %s jobs',
Object.keys(runPipelines).join(', '),
opts.jobs
)
}
// Run the compile promises one after the other so that pipelines load in
// dependency order. Ideally they could be segmented according to their
// dependencies and loaded in several asynchronous batches.
const limiter = activeCallLimiter(func => func(), 1)
return Promise.props(
_.mapValues(runPipelines, (pipeline, name) => limiter(() => {
// This ensures that user selected pipeline's input streams are
// merged with the init stream.
const inputStream = selectedPipelines[name] ? compiler.initStream : null
return compiler.compile(pipeline, inputStream, name)
}))
)
}
/**
 * Select the pipelines to run, merging pipelines.explicit into the main map and deleting the explicit key.
* @param {Object} pipelines All pipelines by name and two extra keys, alias and explicit.
* After this function returns the explicit pipelines will be
* merged with the main pipelines and then the key will be deleted.
* @return {Object} Pipeline name against pipeline in the order the user selected them.
*/
function selectPipelines(selected, pipelines) {
if (! selected || selected.length === 0)
selected = Object.keys(_.omit(pipelines, 'alias', 'explicit'))
if (! _.isEmpty(pipelines.alias)) {
selected = _.flatten(
selected.map(pipelineName => {
return pipelines.alias[pipelineName] || pipelineName
})
)
}
_.defaults(pipelines, pipelines.explicit)
delete pipelines.explicit
const runPipelines = {}
selected.forEach(name => {
runPipelines[name] = pipelines[name]
})
return runPipelines
}
/**
* Reverse the order of keys in a hash.
* Works in any JS VM that maintains key insertion order.
*/
function reverseHash(hash) {
const ret = {}
Object.keys(hash).reverse().forEach(key => { ret[key] = hash[key] })
return ret
}
/**
* @arg {Object} runPipelines A map of pipelines the user has chosen to run by name.
* @arg {Object} pipelines A map of all pipelines by name.
* @return {Object} A map of pipelines that should be run with dependents after dependencies.
*/
function loadPipelineDependencies(runPipelines, pipelines) {
const ret = {}
const loading = {}
const loadDeps = srcPipelines => {
_.forEach(srcPipelines, (pipeline, name) => {
if (ret[name])
return
else if (loading[name])
throw Error(`circular dependency from pipeline ${name}`)
loading[name] = true
// TODO: also recursively scan args, e.g. if used in merge
const activations = []
// ignore pipelines in the first position as they only provide output, not
// input; their streams can be associated dynamically through a flatMap
// TODO: if this pipeline itself has input then the above comment no
// longer applies.
let skipNextLeaf = true
const scanPipelineForActivation = pipeline => {
pipeline.forEach(pluginMeta => {
if (pluginMeta.plugin === plugins.merge) {
scanPipelineForActivation(pluginMeta.args)
return
}
if (skipNextLeaf) {
skipNextLeaf = false
return
}
if (pluginMeta.plugin === plugins.pipeline) {
let activateState = false
pluginMeta.args.forEach(arg => {
if (arg.hasOwnProperty('activate'))
activateState = arg.activate
else if (activateState)
activations.push(arg)
})
}
})
}
scanPipelineForActivation(pipeline)
activations.forEach(activation => {
const activationPipeline = pipelines[activation]
if (! activationPipeline)
throw Error(`invalid pipeline ${activation}`)
ret[activation] = activationPipeline
})
// this pipeline must come after those it activates so that
// the dependency order is preserved after the list is reversed.
ret[name] = pipeline
delete loading[name]
})
}
loadDeps(reverseHash(runPipelines))
return reverseHash(ret)
}
function injectPlugin(module, pluginName) {
const plugin = plugins[pluginName]
if (! plugin)
throw new Error("Nonexistent plugin `" + pluginName + "'")
try {
const varName = _.camelCase(pluginName)
module.__set__(varName, (...args) => ({ plugin, args }))
}
catch (e) {
// plugin not used
}
}
================================================
FILE: src/gulp-adapter.js
================================================
import Vinyl from 'vinyl'
import { Bacon, Event } from 'sigh-core'
import { Transform } from 'stream'
export default gulpPlugin => adapter.bind(null, gulpPlugin)
function adapter(gulpPlugin, op, ...args) {
let sink
const gulpAdaptedStream = Bacon.fromBinder(_sink => { sink = _sink })
const onGulpValue = vinyl => {
const { __source: source } = vinyl
if (! source) {
sink(new Bacon.Error('gulp plugin lost source, may not be compatible with sigh'))
return
}
source.data = vinyl.contents.toString()
source.sourceMap = vinyl.sourceMap
sink([ source ])
}
const gulpInStream = new Transform({ objectMode: true })
const gulpOutStream = gulpInStream.pipe(gulpPlugin(...args))
gulpOutStream.on('data', onGulpValue)
let registeredForEnd = false
const passThroughStream = op.stream.flatMap(events => {
const passThroughEvents = []
events = events.filter(event => {
if (event.type === 'change' || event.type === 'add')
return true
passThroughEvents.push(event)
return false
})
if (events.length !== 0) {
events.forEach(event => {
const vinyl = new Vinyl({
contents: new Buffer(event.data),
path: event.path,
// the following messes with source maps...
// path: event.projectPath,
// base: event.basePath,
})
// the next cannot be attached via the constructor
vinyl.sourceMap = event.sourceMap
// something to help...
vinyl.__source = event
gulpInStream.push(vinyl)
})
}
if (! registeredForEnd) {
// delay until the first value to avoid starting stream during compilation stage
op.stream.onEnd(() => {
// without the nextTick then the last event can go missing on node 0.10
process.nextTick(() => {
gulpOutStream.removeListener('data', onGulpValue)
sink(new Bacon.End())
})
})
registeredForEnd = true
}
return passThroughEvents.length === 0 ? Bacon.never() : passThroughEvents
})
return Bacon.mergeAll(gulpAdaptedStream, passThroughStream)
}
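// Illustrative usage, based on src/test/gulp-adapter.spec.js; any gulp
// plugin operating on vinyl files with contents/sourceMap should work the
// same way:
//
//   const adapted = gulpAdapter(require('gulp-uglify'))
//   // inside a pipeline the adapted plugin receives the op object:
//   //   adapted(op, ...argsForGulpPlugin)
//   // add/change events pass through the gulp plugin as vinyl files;
//   // all other events bypass it untouched.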
================================================
FILE: src/plugin/concat.js
================================================
import _ from 'lodash'
import { toFileSystemState } from 'sigh-core/lib/stream'
import { concatenate as concatSourceMaps } from 'sigh-core/lib/sourceMap'
import { log, Event } from 'sigh-core'
export default function(op, outputPath) {
let fileExists = false
let maxCreateTime = new Date(-8640000000000000)
return toFileSystemState(op.stream)
.map(function(eventCache) {
let data = ''
const sourceMaps = []
const offsets = [0]
let cumOffset = 0
const events = _.sortBy(eventCache, 'opTreeIndex')
// set this to the earliest new createTime after maxCreateTime
let createTime = null
let nextMaxCreateTime = maxCreateTime
events.forEach((event, idx) => {
if (event.createTime > maxCreateTime) {
if (event.createTime < createTime || createTime === null)
createTime = event.createTime
if (event.createTime > nextMaxCreateTime)
nextMaxCreateTime = event.createTime
}
let offset = event.lineCount - 1
data += event.data
if (data[data.length - 1] !== '\n') {
data += '\n'
++offset
}
let sourceMap
try {
sourceMap = event.sourceMap
}
catch (e) {
log.warn('\x07could not construct identity source map for %s', event.projectPath)
if (e.message)
log.warn(e.message)
}
if (sourceMap) {
sourceMaps.push(sourceMap)
}
if (idx < events.length - 1)
offsets.push(cumOffset += offset)
})
// createTime remains null when no creation time exceeded the previous maximum
if (createTime === null)
createTime = maxCreateTime
maxCreateTime = nextMaxCreateTime
const sourceMap = concatSourceMaps(sourceMaps, offsets)
sourceMap.file = outputPath
const ret = [ new Event({
type: fileExists ? 'change' : 'add',
path: outputPath,
data,
sourceMap,
createTime,
initPhase: ! events.some(event => ! event.initPhase)
}) ]
fileExists = true
return ret
})
}
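// Hypothetical Sigh.js usage (plugin variables are injected by src/api.js):
//
//   pipelines['js'] = [
//     glob('src/*.js'),
//     concat('dist/combined.js'),
//     write('.')
//   ]
//
// Each batch of input events is replaced by a single event for outputPath:
// an 'add' the first time, a 'change' thereafter, with a source map
// concatenated from the inputs.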
================================================
FILE: src/plugin/debounce.js
================================================
import _ from 'lodash'
import { Bacon } from 'sigh-core'
import { bufferingDebounce } from 'sigh-core/lib/stream'
export default function(op, delay = 500) {
// return bufferingDebounce(op.stream, delay).map(_.flatten)
let initPhase = true
const buffer = []
return op.stream.flatMapLatest(events => {
// avoid buffering during file watch phase
if (! initPhase)
return events
if (events.some(event => ! event.initPhase)) {
// glob found end of init phase
initPhase = false
if (buffer.length) {
events = buffer.concat(events)
buffer.length = 0
}
return events
}
// TODO: coalesce events to reflect latest fs state
buffer.push(...events)
// if another event is published then flatMapLatest unsubscribes from
// the stream returned previously ensuring the splice doesn't happen.
return Bacon.later(delay, buffer).map(buffer => buffer.splice(0))
})
}
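// Hypothetical Sigh.js usage: buffer events for up to a second during the
// init phase before forwarding them downstream as a single batch:
//
//   pipelines['js'] = [ glob('src/*.js'), debounce(1000), concat('all.js') ]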
================================================
FILE: src/plugin/env.js
================================================
import _ from 'lodash'
export default function(op, pipeline, ...envs) {
envs = _.flatten(envs)
if (! _.includes(envs, op.environment))
return op.stream
return op.compiler.compile(pipeline, op.stream)
}
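// Hypothetical Sigh.js usage: apply a sub-pipeline only when sigh runs in
// one of the listed environments; otherwise the stream passes through
// untouched ("minify" here stands for any plugin, it is not part of this
// repository):
//
//   pipelines['js'] = [
//     glob('src/*.js'),
//     env(minify(), 'production'),
//     write('dist')
//   ]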
================================================
FILE: src/plugin/filter.js
================================================
import _ from 'lodash'
import { Bacon } from 'sigh-core'
export default function(select, op, ...filters) {
filters = _.flatten(filters)
return op.stream.flatMap(events => {
// ensure initialisation events are forwarded
if (events.length === 0)
return []
events = events.filter(event => {
return filters.some(filter => {
for (const key in filter) {
const keyFilter = filter[key]
const value = event[key]
if (keyFilter instanceof RegExp) {
if (! keyFilter.test(value))
return true
}
else if (keyFilter !== value)
return true
}
return false
}) ? (! select) : select
})
return events.length === 0 ? Bacon.never() : events
})
}
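// Hypothetical Sigh.js usage via the select/reject aliases bound in
// src/api.js: each filter object is matched field-by-field against every
// event, values being exact matches or regular expressions:
//
//   select({ path: /\.js$/ })   // keep only events whose path ends in .js
//   reject({ type: 'remove' })  // drop remove events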
================================================
FILE: src/plugin/glob.js
================================================
import chokidar from 'chokidar'
import _ from 'lodash'
import { Bacon, Event } from 'sigh-core'
import Promise from 'bluebird'
const glob = Promise.promisify(require('glob'))
import { bufferingDebounce, coalesceEvents } from 'sigh-core/lib/stream'
// necessary to detect chokidar's duplicate/invalid events, see later comment
const DEFAULT_DEBOUNCE = 120
export default function(op, ...patterns) {
// the first argument could be an option object rather than a pattern
const opts = typeof patterns[0] === 'object' ? patterns.shift() : {}
const { treeIndex = 1, debounce = DEFAULT_DEBOUNCE } = op
op.nextTreeIndex = treeIndex + patterns.length
const newEvent = (type, { path, treeIndex, initPhase = false }) => {
const props = { type, path, initPhase, opTreeIndex: treeIndex, encoding: opts.encoding }
if (opts.basePath)
props.basePath = opts.basePath
props.createTime = new Date
return new Event(props)
}
if (opts.basePath)
patterns = patterns.map(pattern => opts.basePath + '/' + pattern)
const makeGlobStream = events => {
const stream = Bacon.combineAsArray(
patterns.map(
(pattern, idx) => Bacon.fromPromise(
glob(pattern).then(
paths => paths.map(path => ({ path, treeIndex: treeIndex + idx }))
)
)
)
)
.map(_.flatten)
.map(files => {
return events.concat(files.map(file => {
file.initPhase = true
return newEvent('add', file)
}))
})
.take(1)
if (! op.watch)
return stream
const watchers = patterns.map(
pattern => chokidar.watch(pattern, { ignoreInitial: true })
)
const chokEvRemap = { unlink: 'remove' }
const updates = Bacon.mergeAll(
_.flatten(['add', 'change', 'unlink'].map(type => watchers.map(
(watcher, idx) => Bacon.fromEvent(watcher, type).map(
path => {
// TODO: remove
// console.log('watch', Date.now(), type, path)
return [ newEvent(chokEvRemap[type] || type, { path, treeIndex: treeIndex + idx }) ]
}
)
)))
)
// see https://github.com/paulmillr/chokidar/issues/262
// the debounce alone makes chokidar behave but eventually coalesceEvents will
// act as a second defense to this issue.
return stream.changes().concat(
coalesceEvents( bufferingDebounce(updates, debounce).map(_.flatten) )
)
}
let globStream
return op.stream.flatMap(events => {
if (! globStream) {
globStream = makeGlobStream(events)
return globStream
}
else {
return events
}
})
}
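// Hypothetical Sigh.js usage: an optional leading object supplies options
// (basePath, encoding), the remaining arguments are glob patterns:
//
//   glob({ basePath: 'src' }, '**/*.js', '**/*.json')
//
// In watch mode chokidar watchers follow the initial glob results with
// further add/change/remove events.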
================================================
FILE: src/plugin/merge.js
================================================
import Promise from 'bluebird'
import { Bacon } from 'sigh-core'
import _ from 'lodash'
import { bufferingDebounce } from 'sigh-core/lib/stream'
export default function(op, ...pipelines) {
let collectInitial = false
if (pipelines.length && ! pipelines[0].plugin) {
const opts = pipelines.shift()
collectInitial = opts.collectInitial
}
// Promise.map(..., { concurrency: 1 }) delivers the items to the iterator
// out of order which messes with opTreeIndex ordering.
const streamPromise = Promise.reduce(
pipelines,
(streams, pipeline) => {
return op.compiler.compile(pipeline, op.stream || null)
.then(stream => {
streams.push(stream)
return streams
})
},
[]
)
.then(streams => Bacon.mergeAll(streams.filter(stream => stream !== op.compiler.initStream)))
if (collectInitial) {
return streamPromise.then(stream => {
let initEvents = []
let { length: nStreamEventsLeft } = pipelines
return stream.flatMapLatest(events => {
if (nStreamEventsLeft) {
if (events.every(event => event.initPhase)) {
initEvents.push(...events)
return --nStreamEventsLeft ? Bacon.never() :
Bacon.mergeAll([ Bacon.constant(initEvents), stream ])
}
}
// forward any other events, including those that arrive before every
// pipeline has delivered its initial events
return events
})
})
}
return streamPromise
}
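// Hypothetical Sigh.js usage: run several sub-pipelines in parallel and
// merge their event streams; with collectInitial the initial events from
// all branches are delivered together as one batch:
//
//   merge(
//     { collectInitial: true },
//     [ glob('src/*.js') ],
//     [ glob('vendor/*.js') ]
//   )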
================================================
FILE: src/plugin/pipeline.js
================================================
import Promise from 'bluebird'
import { Bacon } from 'sigh-core'
import _ from 'lodash'
import { bufferingDebounce } from 'sigh-core/lib/stream'
const DEFAULT_DEBOUNCE = 200
export default function(op, ...pipelineNames) {
const { compiler } = op
pipelineNames = pipelineNames.filter(p => ! p.hasOwnProperty('activate'))
if (op.stream !== compiler.initStream) {
pipelineNames.forEach(name => {
// TODO: avoid forwarding []?
compiler.addPipelineInput(name, op.stream.skipErrors())
})
}
// during this call the streams may not be set up, wait until the first
// "stream initialisation" value before merging the pipeline streams.
return op.stream.take(1).flatMap(events => {
return Bacon.mergeAll(_.reduce(
pipelineNames,
(streams, name) => {
const stream = compiler.streams[name]
if (stream)
streams.push(stream)
return streams
},
[]
)).skipErrors()
})
}
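// Hypothetical Sigh.js usage: forward this pipeline's events into the
// named pipelines and merge their output streams back in; the activate
// flag (scanned by loadPipelineDependencies in src/api.js) forces the
// named pipeline to run even when the user did not select it:
//
//   pipelines['build-and-test'] = [
//     pipeline('build'),
//     pipeline({ activate: true }, 'test')
//   ]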
================================================
FILE: src/plugin/write.js
================================================
import path from 'path'
import Promise from 'bluebird'
import fs from 'fs'
import { log } from 'sigh-core'
import fse from 'fs-extra'
const glob = Promise.promisify(require('glob'))
const writeFile = Promise.promisify(fs.writeFile)
const unlink = Promise.promisify(fs.unlink)
const rm = Promise.promisify(fse.remove) // TODO: not used yet, see later comment
const ensureDir = Promise.promisify(fse.ensureDir)
import { mapEvents } from 'sigh-core/lib/stream'
export function writeEvent(basePath, event) {
const { fileType } = event
const projectFile = path.basename(event.path)
const { projectPath } = event
// the projectPath remains the same but the basePath is changed to point to
// the output directory
event.basePath = basePath
const outputPath = event.path
if (event.type === 'remove') {
return unlink(outputPath).then(() => {
return event.supportsSourceMap ? unlink(outputPath + '.map').then(() => event) : event
})
}
let { data } = event
const outputDir = path.dirname(outputPath)
let promise = ensureDir(outputDir).then(() => {
return writeFile(outputPath, data, {encoding: event.encoding})
})
if (event.supportsSourceMap) {
let sourceMap
try {
sourceMap = event.sourceMap
}
catch (e) {
log.warn('\x07could not construct identity source map for %s', projectPath)
if (e.message)
log.warn(e.message)
}
if (sourceMap) {
const mapPath = projectFile + '.map'
let suffix
if (fileType === 'js')
suffix = '//# sourceMappingURL=' + mapPath
else if (fileType === 'css')
suffix = '/*# sourceMappingURL=' + mapPath + ' */'
if (suffix)
data += '\n' + suffix
promise = promise.then(() => {
sourceMap.sources = sourceMap.sources.map(source => path.relative(outputDir, source))
return writeFile(path.join(outputDir, mapPath), JSON.stringify(sourceMap))
})
}
}
return promise.then(() => event)
}
// basePath = base directory in which to write output files
export default function(op, options, basePath) {
if (! basePath) {
basePath = options
options = {}
}
let clobberPromise
let { clobber } = options
if (clobber) {
// sanitize a path we are about to recursively remove... it must be below
// the current working directory (which contains sigh.js)
if (! basePath || basePath[0] === '/' || basePath.substr(0, 3) === '../')
throw Error(`refusing to clobber '${basePath}' outside of project`)
if (clobber === true) {
clobberPromise = rm(basePath)
}
else {
if (! (clobber instanceof Array))
clobber = [ clobber ]
clobberPromise = Promise.map(clobber, pattern => {
return glob(pattern, { cwd: basePath }).then(
matches => Promise.map(matches, match => rm(path.join(basePath, match)))
)
})
}
}
const streamPromise = mapEvents(op.stream, writeEvent.bind(this, basePath))
return clobberPromise ? clobberPromise.thenReturn(streamPromise) : streamPromise
}
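// Hypothetical Sigh.js usage: write events beneath basePath, optionally
// clobbering stale output first; clobber may be true (remove basePath) or
// one or more glob patterns relative to it:
//
//   write({ clobber: '**/*.js' }, 'dist')
//
// For js/css events that support source maps a .map file is written next
// to the output with a sourceMappingURL comment appended to the data.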
================================================
FILE: src/test/PipelineCompiler.spec.js
================================================
import { Bacon } from 'sigh-core'
import Promise from 'bluebird'
import PipelineCompiler from '../PipelineCompiler'
import { plugin } from './plugin/helper'
const should = require('chai').should()
describe('PipelineCompiler', () => {
it('should create appropriate stream from array', () => {
const compiler = new PipelineCompiler
return compiler.compile([
op => {
// TODO: test op.stream is Bacon.constant([])
return Bacon.constant(1)
},
op => op.stream.map(v => v + 1)
])
.then(stream => stream.toPromise(Promise).then(value => value.should.equal(2)))
})
it('should create stream from stream, passing watch option', () => {
const compiler = new PipelineCompiler({ watch: true })
return compiler.compile(op => {
op.watch.should.be.true
// TODO: test op.stream is Bacon.constant([])
return Bacon.constant(420)
})
.then(
stream => stream.toPromise(Promise).then(value => value.should.equal(420))
)
})
it('should pass arguments to plugin', () => {
const compiler = new PipelineCompiler
return compiler.compile(plugin((op, arg1, arg2) => {
should.not.exist(op.watch)
// TODO: test op.stream is Bacon.constant([])
return Bacon.constant(arg1 + arg2)
}, 7, 11))
.then(stream => stream.toPromise(Promise).then(value => value.should.equal(18)))
})
it('should pass treeIndex and observe nextTreeIndex', () => {
const compiler = new PipelineCompiler
return compiler.compile([
op => { op.treeIndex.should.equal(1) },
op => { op.treeIndex.should.equal(2); op.treeIndex = 4 },
op => { op.treeIndex.should.equal(4) }
])
})
})
================================================
FILE: src/test/api.spec.js
================================================
import Promise from 'bluebird'
const copy = Promise.promisify(require('fs-extra').copy)
const mkTmpDir = Promise.promisify(require('temp').mkdir)
import PipelineCompiler from '../PipelineCompiler'
import { compileSighfile } from '../api'
import rewire from 'rewire'
const FIXTURE_PATH = 'test/fixtures/sigh-project'
describe('loadPipelineDependencies', () => {
const pipelinePlugin = require('../plugin/pipeline').default
const mergePlugin = require('../plugin/merge').default
const fakePlugin = {}
const loadPipelineDependencies = rewire('../api').__get__('loadPipelineDependencies')
const makePluginDesc = (plugin, args = []) => ({ plugin, args })
const pipelines = {
dep1: [ makePluginDesc(fakePlugin, ['1']) ],
dep2: [ makePluginDesc(fakePlugin, ['2']) ],
}
it('should ignore pipeline activation in first leaf and detect subsequent activation', function() {
const runPipelines = {
main: [
{ plugin: pipelinePlugin, args: [{ activate: true }, 'dep2'] },
{ plugin: pipelinePlugin, args: [{ activate: true }, 'dep1'] }
]
}
const deps = loadPipelineDependencies(runPipelines, pipelines)
deps.should.have.property('dep1').and.equal(pipelines.dep1)
deps.should.not.have.property('dep2')
})
it('should detect pipeline activation in merge, ignoring activation in first leaf node', function() {
const runPipelines = {
main: [
makePluginDesc(mergePlugin, [
{ plugin: pipelinePlugin, args: [{ activate: true }, 'dep2'] },
{ plugin: pipelinePlugin, args: [{ activate: true }, 'dep1'] },
]),
]
}
const deps = loadPipelineDependencies(runPipelines, pipelines)
deps.should.have.property('dep1').and.equal(pipelines.dep1)
deps.should.not.have.property('dep2')
})
})
describe('compileSighFile', () => {
it('compile should build working bacon streams from pipelines in Sigh.js file', function() {
this.timeout(3000)
let pathBackup, compiler, tmpPath
return mkTmpDir({ dir: 'test/tmp', prefix: 'sigh-api-test-' })
.then(_tmpPath => {
tmpPath = _tmpPath
return copy(FIXTURE_PATH, tmpPath)
})
.then(() => {
pathBackup = process.cwd()
process.chdir(tmpPath)
const opts = { environment: 'production' }
compiler = new PipelineCompiler(opts)
return compileSighfile(compiler, opts)
})
.then(streams => streams.js.toPromise(Promise))
.then(events => {
events.length.should.equal(1)
const event = events[0]
event.path.should.equal('dist/combined.js')
})
.finally(() => {
compiler.destroy()
if (pathBackup)
process.chdir(pathBackup)
})
})
})
================================================
FILE: src/test/bootstrap.spec.js
================================================
require('source-map-support').install()
require('chai').should()
// Ensure temporary directories are removed after each run of tests.
import Promise from 'bluebird'
import temp from 'temp'
temp.track()
const cleanup = Promise.promisify(temp.cleanup)
after(() => cleanup())
================================================
FILE: src/test/gulp-adapter.spec.js
================================================
import Promise from 'bluebird'
import { Bacon, Event } from 'sigh-core'
import gulpUglify from 'gulp-uglify'
import { SourceMapConsumer } from 'source-map'
import { positionOf } from 'sigh-core/lib/sourceMap'
import gulpAdapter from '../gulp-adapter'
describe('gulp adapter', () => {
it('adapts the gulp-uglify plugin', () => {
const adapted = gulpAdapter(gulpUglify)
let data = ' function hey() {\n return 14 }\n\n var a = 1'
const stream = Bacon.constant([ new Event({ path: 'file1.js', type: 'add', data }) ])
const op = adapted({ stream })
let nCalls = 0
op.onValue(events => {
++nCalls
events.length.should.equal(1)
const event = events[0]
const sizeReduction = data.length - event.data.length
// verify data is smaller (minified)
sizeReduction.should.be.greaterThan(10)
// verify the source map
const consumer = new SourceMapConsumer(event.sourceMap)
const origPos = positionOf(data, 'var')
origPos.should.eql({ line: 4, column: 2 })
const transformedPos = positionOf(event.data, 'var')
transformedPos.should.eql({ line: 1, column: 25 })
const mappedPos = consumer.originalPositionFor(transformedPos)
origPos.line.should.not.equal(transformedPos.line)
origPos.line.should.equal(mappedPos.line)
origPos.column.should.equal(mappedPos.column)
})
return new Promise(resolve => {
op.onEnd(() => {
nCalls.should.equal(1)
resolve()
})
})
})
})
================================================
FILE: src/test/plugin/concat.spec.js
================================================
import _ from 'lodash'
import Promise from 'bluebird'
import { Bacon, Event } from 'sigh-core'
import concat from '../../plugin/concat'
import { SourceMapConsumer } from 'source-map'
import { makeEvent } from './helper'
describe('concat plugin', () => {
it('concatenates three javascript files', () => {
const stream = Bacon.constant([1, 2, 3].map(num => makeEvent(num)))
return concat({ stream }, 'output.js', 10).toPromise(Promise).then(events => {
events.length.should.equal(1)
const { data, sourceMap } = events[0]
data.should.equal('var a1 = 1\nvar a2 = 2\nvar a3 = 3\n')
const consumer = new SourceMapConsumer(sourceMap)
const varPos = [1, 2, 3].map(line => consumer.originalPositionFor({ line, column: 0 }))
varPos.forEach((pos, idx) => {
pos.line.should.equal(1)
pos.source.should.equal(`file${idx + 1}.js`)
})
})
})
it('preserves treeIndex order', () => {
const stream = Bacon.fromArray([
[2, 1].map(num => makeEvent(num)), // first file in event array has higher tree index
])
return concat({ stream }, 'output.js', 10).toPromise(Promise).then(events => {
events[0].data.should.equal('var a1 = 1\nvar a2 = 2\n')
})
})
it('should handle erroneous js file without sourcemap', () => {
const data = "console.log('test)"
const event = new Event({
basePath: 'root',
path: 'root/subdir/output.js',
type: 'add',
data
})
const stream = Bacon.constant([ event ])
return concat({ stream }).toPromise(Promise).then(events => {
events[0]._sourceMap.sources.length.should.equal(0);
})
})
xit('strips source map comments when concatenating two javascript files', () => {
})
})
================================================
FILE: src/test/plugin/debounce.spec.js
================================================
import Promise from 'bluebird'
import { Bacon, Event } from 'sigh-core'
import PipelineCompiler from '../../PipelineCompiler'
import debounce from '../../plugin/debounce'
import { plugin, makeEvent } from './helper'
describe('debounce plugin', () => {
it('debounces two streams', () => {
const streams = Bacon.fromArray([ 1, 2 ].map(idx => [ makeEvent(idx, true) ]))
const compiler = new PipelineCompiler
const opData = { compiler }
return compiler.compile([
op => streams,
plugin(debounce, 100)
])
.then(streams => streams.toPromise(Promise))
.then(events => {
events = events.sort()
events[0].path.should.equal('file1.js')
events.length.should.equal(2)
events[1].path.should.equal('file2.js')
})
})
})
================================================
FILE: src/test/plugin/env.spec.js
================================================
import { Bacon, Event } from 'sigh-core'
import Promise from 'bluebird'
import env from '../../plugin/env'
import PipelineCompiler from '../../PipelineCompiler'
import { plugin } from './helper'
describe('env plugin', () => {
it('modifies stream when selected environment is chosen', () => {
const compiler = new PipelineCompiler({ environment: 'friend' })
const streams = [
op => Bacon.constant(8),
plugin(env, op => op.stream.map(val => val * 2), 'friend')
]
return compiler.compile(streams).then(
stream => stream.toPromise(Promise).then(v => { v.should.equal(16) })
)
})
it('passes stream through when selected environments are not chosen', () => {
const compiler = new PipelineCompiler({ environment: 'e1' })
const streams = [
op => Bacon.constant(9),
plugin(env, op => op.stream.map(val => val * 2), 'e2', 'e3')
]
return compiler.compile(streams).then(
stream => stream.toPromise(Promise).then(v => { v.should.equal(9) })
)
})
})
================================================
FILE: src/test/plugin/filter.spec.js
================================================
import { Bacon, Event } from 'sigh-core'
import Promise from 'bluebird'
import filter from '../../plugin/filter'
import PipelineCompiler from '../../PipelineCompiler'
import { plugin } from './helper'
describe('filter plugin', () => {
it('filters events according to a projectPath regex filter', () => {
const events = [
new Event({ type: 'add', path: 'blah.js', data: 'var blah' }),
new Event({ type: 'add', path: 'plah.js', data: 'var plah' }),
]
const stream = Bacon.constant(events)
return filter(true, { stream }, { projectPath: /^b/ }).toPromise(Promise)
.then(events => {
events.length.should.equal(1)
events[0].projectPath.should.equal('blah.js')
})
})
it('filters events according to a type string filter', () => {
const events = [
new Event({ type: 'add', path: 'blah.js', data: 'var blah' }),
new Event({ type: 'update', path: 'plah.js', data: 'var plah' }),
]
const stream = Bacon.constant(events)
return filter(true, { stream }, { type: 'add' }).toPromise(Promise)
.then(events => {
events.length.should.equal(1)
events[0].projectPath.should.equal('blah.js')
})
})
})
================================================
FILE: src/test/plugin/glob.spec.js
================================================
import { Bacon, Event } from 'sigh-core'
import _ from 'lodash'
import Promise from 'bluebird'
import fs from 'fs'
const copy = Promise.promisify(require('fs-extra').copy)
const mkTmpDir = Promise.promisify(require('temp').mkdir)
import glob from '../../plugin/glob'
const FIXTURE_PATH = 'test/fixtures/simple-project'
const FIXTURE_FILES = [
FIXTURE_PATH + '/file1.js',
FIXTURE_PATH + '/file2.js'
]
describe('glob plugin', () => {
const stream = Bacon.constant([])
it('globs a wildcard', () => {
return glob({ stream }, FIXTURE_PATH + '/*.js').toPromise(Promise).then(updates => {
updates.length.should.equal(2)
_.pluck(updates, 'projectPath').sort().should.eql(FIXTURE_FILES)
updates.forEach(file => {
file.initPhase.should.be.true
file.type.should.equal('add')
file.opTreeIndex.should.equal(1)
})
})
})
it('globs a wildcard and forwards initial input events', () => {
const stream = Bacon.constant([new Event({
type: 'add',
path: 'blah.js',
data: 'var blah',
})])
return glob({ stream }, FIXTURE_PATH + '/*.js').toPromise(Promise).then(events => {
events.length.should.equal(3)
const updates = events.slice(1)
_.pluck(updates, 'projectPath').sort().should.eql(FIXTURE_FILES)
updates.forEach(file => {
file.initPhase.should.be.true
file.type.should.equal('add')
file.opTreeIndex.should.equal(1)
})
})
})
it('globs a wildcard using the basePath option', () => {
const opData = { stream, treeIndex: 4 }
return glob(opData, { basePath: FIXTURE_PATH }, '*.js')
.toPromise(Promise)
.then(updates => {
opData.nextTreeIndex.should.equal(5)
updates.length.should.equal(2)
updates[0].projectPath.should.equal('file1.js')
updates[1].projectPath.should.equal('file2.js')
})
})
it('globs two wildcards', () => {
const opData = { stream, treeIndex: 1 }
return glob(opData, FIXTURE_PATH + '/*1.js', FIXTURE_PATH + '/*2.js')
.toPromise(Promise)
.then(updates => {
opData.nextTreeIndex.should.equal(3)
updates.length.should.equal(2)
_.pluck(updates, 'path').sort().should.eql(FIXTURE_FILES)
updates[0].opTreeIndex.should.equal(1)
updates[1].opTreeIndex.should.equal(2)
updates.forEach(file => { file.type.should.equal('add') })
})
})
it('detects changes to two files matching globbed pattern', () => {
let tmpPath
return mkTmpDir({ dir: 'test/tmp', prefix: 'sigh-glob-test-' }).then(_tmpPath => {
tmpPath = _tmpPath
return copy(FIXTURE_PATH, tmpPath)
})
.then(() => {
return new Promise(function(resolve) {
let nUpdates = 0
const files = [ tmpPath + '/file1.js', tmpPath + '/file2.js' ]
glob({ stream, watch: true, treeIndex: 4 }, tmpPath + '/*.js')
.onValue(updates => {
if (++nUpdates === 1) {
updates.length.should.equal(2)
_.delay(fs.appendFile, 50, files[0], 'var file1line2 = 24;\n', () => {})
_.delay(fs.appendFile, 500, files[1], 'var file2line2 = 25;\n', () => {})
}
else {
updates.should.eql([
new Event({
type: 'change',
path: files[nUpdates - 2],
initPhase: false,
opTreeIndex: 4,
createTime: updates[0].createTime
}),
])
if (nUpdates === 3) {
resolve()
return Bacon.noMore
}
}
})
})
})
})
it('forwards subsequent input events along with file change events', () => {
const delayedInputEvent = new Event({
type: 'add',
path: 'blah.js',
data: 'var blah',
})
let sink
const twoStream = Bacon.mergeAll(
stream,
Bacon.fromBinder(_sink => { sink = _sink; return () => undefined })
)
let tmpPath
return mkTmpDir({ dir: 'test/tmp', prefix: 'sigh-glob-test-' }).then(_tmpPath => {
tmpPath = _tmpPath
return copy(FIXTURE_PATH, tmpPath)
})
.then(() => {
return new Promise(function(resolve) {
let nUpdates = 0
const updateFile = tmpPath + '/file1.js'
glob({ stream: twoStream, watch: true, treeIndex: 4 }, tmpPath + '/*.js')
.onValue(updates => {
if (++nUpdates === 1) {
updates.length.should.equal(2)
_.delay(fs.appendFile, 50, updateFile, 'var file1line2 = 24;\n', () => {})
}
else if (nUpdates === 2) {
updates.should.eql([
new Event({
type: 'change',
path: updateFile,
initPhase: false,
opTreeIndex: 4,
createTime: updates[0].createTime
}),
])
sink([delayedInputEvent])
}
else {
updates[0].should.equal(delayedInputEvent)
resolve()
return Bacon.noMore
}
})
})
})
})
it('detects new file', () => {
let tmpPath
return mkTmpDir({ dir: 'test/tmp', prefix: 'sigh-glob-test-2-' }).then(_tmpPath => {
tmpPath = _tmpPath
return copy(FIXTURE_PATH, tmpPath)
})
.then(() => {
const addedFile = tmpPath + '/added-file.js'
return new Promise(function(resolve) {
let nUpdates = 0
const files = [ tmpPath + '/file1.js', tmpPath + '/file2.js' ]
glob({ stream, watch: true, treeIndex: 4 }, tmpPath + '/*.js')
.onValue(updates => {
if (++nUpdates === 1) {
updates.length.should.equal(2)
_.delay(fs.writeFile, 300, addedFile, 'var file3line1 = 33;\n', () => {})
}
else {
updates.should.eql([
new Event({
type: 'add',
path: addedFile,
initPhase: false,
opTreeIndex: 4,
createTime: updates[0].createTime
}),
])
resolve()
return Bacon.noMore
}
})
})
})
})
xit('detects file unlink', () => {
})
xit('detects file rename', () => {
})
})
================================================
FILE: src/test/plugin/helper.js
================================================
import { Event } from 'sigh-core'
export function makeEvent(num, initPhase = false) {
return new Event({
path: `file${num}.js`,
type: 'add',
opTreeIndex: num,
data: `var a${num} = ${num}`,
initPhase
})
}
export function plugin(plugin, ...args) {
const ret = { plugin }
if (args.length)
ret.args = args
return ret
}
================================================
FILE: src/test/plugin/merge.js
================================================
import _ from 'lodash'
import Promise from 'bluebird'
import { Bacon } from 'sigh-core'
import PipelineCompiler from '../../PipelineCompiler'
import merge from '../../plugin/merge'
import { plugin, makeEvent } from './helper'
describe('merge plugin', () => {
it('combines three streams into one', () => {
const streams = [1, 2, 3].map(i => plugin(op => Bacon.constant([ makeEvent(i) ])))
const opData = { compiler: new PipelineCompiler }
return merge(opData, ...streams).then(stream => {
let nEvents = 0
return new Promise(function(resolve, reject) {
stream.onValue(events => {
++nEvents
events.length.should.equal(1)
events[0].path.should.equal(`file${nEvents}.js`)
if (nEvents === 3) {
resolve()
return Bacon.noMore
}
})
})
})
})
it('assigns treeIndex to sub-operations', () => {
const compiler = new PipelineCompiler
const streams = [
plugin(op => op.treeIndex.should.equal(1)),
plugin(op => op.treeIndex.should.equal(2))
]
return compiler.compile([ plugin(merge, ...streams) ])
})
it('increments treeIndex for subsequent operations', () => {
const compiler = new PipelineCompiler
const streams = [ plugin(op => 1), plugin(op => 2) ]
return compiler.compile([
plugin(merge, ...streams),
plugin(op => op.treeIndex.should.equal(3))
])
})
})
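Stripped of Bacon, the first test's merge semantics amount to fanning several sources into one sink; this is an illustrative model only — `mergeSources` and the callback-based stand-in "streams" are invented for the sketch, not sigh's implementation in src/plugin/merge.js:

```javascript
// Minimal model of merge: every source stream pushes its event batches into
// the same combined sink, so the output interleaves batches from all inputs.
function mergeSources(sources, sink) {
  sources.forEach(source => source(sink))
}

const seen = []
mergeSources(
  [1, 2, 3].map(i => sink => sink([{ path: `file${i}.js` }])),
  events => seen.push(events[0].path)
)
console.log(seen) // [ 'file1.js', 'file2.js', 'file3.js' ]
```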
================================================
FILE: src/test/plugin/pipeline.spec.js
================================================
import _ from 'lodash'
import Promise from 'bluebird'
import { Bacon } from 'sigh-core'
import PipelineCompiler from '../../PipelineCompiler'
import pipeline from '../../plugin/pipeline'
describe('pipeline plugin', () => {
let compiler, stream
beforeEach(() => {
compiler = new PipelineCompiler
stream = compiler.initStream
})
it('intercepts the end of two pipelines', () => {
return Promise.all([1, 2].map(
idx => compiler.compile(op => Bacon.constant(idx), null, `stream${idx}`)
))
.then(streams => {
return new Promise(function(resolve, reject) {
let nValues = 0
pipeline({ stream, compiler }, 'stream1', 'stream2').onValue(events => {
++nValues
events.should.eql(nValues)
if (nValues === 2) {
resolve()
return Bacon.noMore
}
})
})
})
})
it('can subscribe to the same stream as another pipeline', () => {
return Promise.all([1, 2].map(
idx => compiler.compile(op => Bacon.constant(idx), null, `stream${idx}`)
))
.then(streams => {
// delaying each stream stops the pipelines from emitting all their events to
// the first subscriber synchronously, emulating how a real pipeline behaves
// with asynchronous sources such as glob
compiler.streams = _.mapValues(compiler.streams, stream => stream.delay(0))
return Promise.all([
new Promise(function(resolve, reject) {
let nValues = 0
pipeline({ stream, compiler }, 'stream1', 'stream2').onValue(events => {
++nValues
events.should.equal(nValues)
if (nValues === 2) {
resolve()
return Bacon.noMore
}
})
}),
new Promise(function(resolve, reject) {
pipeline({ stream, compiler }, 'stream2').onValue(events => {
events.should.equal(2)
resolve()
return Bacon.noMore
})
}),
])
})
})
it('can subscribe to a pipeline before it has been compiled', () => {
const pipelineOp = pipeline({ stream, compiler }, 'stream')
return compiler.compile(op => Bacon.constant(1), null, 'stream')
.then(stream => {
compiler.streams.stream = compiler.streams.stream.delay(0)
return pipelineOp.toPromise(Promise)
})
.then(function(values) {
values.should.eql(1)
})
})
})
================================================
FILE: src/test/plugin/write.spec.js
================================================
import _ from 'lodash'
import { Bacon, Event } from 'sigh-core'
import Promise from 'bluebird'
import { readFileSync, existsSync } from 'fs'
import write from '../../plugin/write'
const TMP_PATH = 'test/tmp/write'
const PROJ_PATH = 'subdir/file1.js'
const PROJ_PATH_BINARY = 'subdir/file2.bin'
const TMP_FILE = TMP_PATH + '/' + PROJ_PATH
describe('write plugin', () => {
it('writes a single file without a source map to the output directory, generating an identity map', () => {
const data = 'var pump\n'
const stream = Bacon.constant([ new Event({ path: PROJ_PATH, type: 'add', data }) ])
return write({ stream }, TMP_PATH).toPromise(Promise).then(events => {
// console.log('write events %j', events)
readFileSync(TMP_FILE).toString()
.should.equal(data + '\n//# sourceMappingURL=file1.js.map')
readFileSync(TMP_FILE + '.map').toString()
.should.equal('{"version":3,"sources":["../../../../subdir/file1.js"],"names":[],"mappings":"AAAA,IAAI","file":"file1.js","sourcesContent":["var pump\\n"]}')
})
})
it('writes a single file whose event contains a basePath', () => {
const data = 'var pumpbaby\n'
const stream = Bacon.constant([
new Event({ basePath: 'subdir', path: PROJ_PATH, type: 'add', data })
])
return write({ stream }, TMP_PATH).toPromise(Promise).then(events => {
// subdir stripped from the output path due to basePath
const tmpFile = TMP_PATH + '/file1.js'
readFileSync(tmpFile).toString()
.should.equal(data + '\n//# sourceMappingURL=file1.js.map')
readFileSync(tmpFile + '.map').toString()
.should.equal('{"version":3,"sources":["../../../subdir/file1.js"],"names":[],"mappings":"AAAA,KAAK","file":"file1.js","sourcesContent":["var pumpbaby\\n"]}')
})
})
it('writes a binary file', () => {
const data = new Buffer([0, 1, 2, 3, -1, 5, 6, 7, 0])
const stream = Bacon.constant([
new Event({ basePath: 'subdir', path: PROJ_PATH_BINARY, type: 'add', data: data.toString('binary'), encoding: 'binary', supportsSourceMap: false })
])
return write({ stream }, TMP_PATH).toPromise(Promise).then(events => {
// subdir stripped from the output path due to basePath
const tmpFile = TMP_PATH + '/file2.bin'
readFileSync(tmpFile).should.eql(data)
})
})
it('writes a single file then removes it', () => {
const data = 'var mew\n'
const stream = Bacon.fromArray([
[ new Event({ path: PROJ_PATH, type: 'add', data }) ],
[ new Event({ path: PROJ_PATH, type: 'remove', data }) ]
])
return new Promise(function(resolve, reject) {
let nValues = 0
const writeStream = write({ stream }, TMP_PATH)
writeStream.onValue(events => {
// console.log('write events %j', events)
if (++nValues === 1) {
existsSync(TMP_FILE).should.be.ok
existsSync(TMP_FILE + '.map').should.be.ok
}
else {
existsSync(TMP_FILE).should.not.be.ok
existsSync(TMP_FILE + '.map').should.not.be.ok
resolve()
return Bacon.noMore
}
})
writeStream.onError(reject)
})
})
})
================================================
FILE: test/fixtures/sigh-project/Sigh.js
================================================
// sigh's node_modules and package.json are symlinked into the same directory
// as this Sigh file, which allows the babel plugin to be loaded.
var merge, concat, debounce, env, glob, write, babel
module.exports = function(pipelines) {
pipelines['js'] = [
merge(
[ glob({ basePath: 'src' }, '*.js'), babel({ modules: 'common' }) ],
glob('bootstrap.js')
),
env([ debounce(500), concat('combined.js') ], 'production'),
write('dist')
]
}
================================================
FILE: test/fixtures/sigh-project/bootstrap.js
================================================
console.log('the end')
================================================
FILE: test/fixtures/sigh-project/src/arrow.js
================================================
var times2 = a => a * 2
console.log('10 is %s', times2(5))
================================================
FILE: test/fixtures/sigh-project/src/class.js
================================================
class Hello {
constructor() { this.friend = 'totoro' }
}
var greeter = new Hello
console.log(greeter.friend)
================================================
FILE: test/fixtures/simple-project/file1.js
================================================
var file1 = 1;
================================================
FILE: test/fixtures/simple-project/file2.js
================================================
var file2 = 2;
================================================
FILE: test/tmp/.keepdir
================================================