Repository: mattdesl/workshop-web-audio
Branch: main
Commit: 6bcfd8719be8
Files: 37
Total size: 77.5 KB
Directory structure:
gitextract_o9v0g74s/
├── .gitignore
├── .npmignore
├── LICENSE.md
├── README.md
├── docs/
│ └── snippets.md
├── package.json
└── src/
├── 01-play-mp3-stream.html
├── 02-play-mp3-buffer.html
├── 03-gain.html
├── 04-waveform.html
├── 05-waveform-advanced.html
├── 06-meter.html
├── 07-meter-levels.html
├── 08-frequency.html
├── 09-frequency-advanced.html
├── 10-tone-demo.html
├── 11-tone-tap.html
├── 12-tone-patatap.html
├── 13-tone-effects.html
├── 14-tone-sequencer.html
├── 15-tone-mp3-effects.html
└── js/
├── 01-play-mp3-stream.js
├── 02-play-mp3-buffer.js
├── 03-gain.js
├── 04-waveform.js
├── 05-waveform-advanced.js
├── 06-meter.js
├── 07-meter-levels.js
├── 08-frequency.js
├── 09-frequency-advanced.js
├── 10-tone-demo.js
├── 11-tone-tap.js
├── 12-tone-patatap.js
├── 13-tone-effects.js
├── 14-tone-sequencer.js
├── 15-tone-mp3-effects.js
└── riso-colors.json
================================================
FILE CONTENTS
================================================
================================================
FILE: .gitignore
================================================
bower_components
node_modules
*.log
.DS_Store
bundle.js
================================================
FILE: .npmignore
================================================
bower_components
node_modules
*.log
.DS_Store
bundle.js
test
test.js
demo/
.npmignore
LICENSE.md
================================================
FILE: LICENSE.md
================================================
The MIT License (MIT)
Copyright (c) 2017 Matt DesLauriers
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT.
IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM,
DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR
OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE
OR OTHER DEALINGS IN THE SOFTWARE.
================================================
FILE: README.md
================================================
# Web Audio Synthesis & Visualization
This repository includes resources & course notes for those attending my _Web Audio Synthesis & Visualization_ workshop with Frontend Masters, demonstrating the raw Web Audio API, [p5.js](https://p5js.org) for rendering, and [Tone.js](https://tonejs.github.io) for synthesis.
# Contents
- ✨ [Course Demos](#course-demos)
- 🔧 [Tools](#tools)
- 🎓 [Just Starting Out](#just-starting-out)
- ✂️️ [Code Snippets](#code-snippets)
- 📖 [Setup](#setup)
- ✨ [Further Reading](#further-reading)
# Course Demos
- 📚 Collections
- 🔈 **[web-audio-demos.glitch.me](https://web-audio-demos.glitch.me/)** — playback and visualization examples with pure WebAudio
- 🔈 **[tone-demos.glitch.me](https://tone-demos.glitch.me)** — synthesis and other examples with Tone.js
- 🎨 **[p5-demos.glitch.me](https://p5-demos.glitch.me)** — examples with p5.js
# Tools
Here is a list of tools and libraries that will be used during the workshop.
| Tool | Documentation | Version | Description |
| ----------------------------------- | -------------------------------------------------------------------------- | --------------------------------------------------------------------- | ---------------------------------------------------------------------------------- |
| _A browser_ | | | A modern browser, [Chrome](https://www.google.com/chrome/) is recommended |
| [Glitch](https://glitch.com) | [Help](https://glitch.com/help/) | | An online platform for editing & sharing JavaScript projects |
| _Web Audio API_ | [API Docs](https://developer.mozilla.org/en-US/docs/Web/API/Web_Audio_API) | | This API is built into modern browsers and allows us to work with audio and sound. |
| [Tone.js](https://tonejs.github.io) | [API Docs](https://tonejs.github.io/docs/) | [13.8.25](https://unpkg.com/tone@13.8.25/build/Tone.js) | A JavaScript audio library for playing synths and sounds |
| [p5.js](https://p5js.org) | [API Docs](https://p5js.org/reference/) | [0.9.0](https://cdnjs.cloudflare.com/ajax/libs/p5.js/0.9.0/p5.min.js) | A JavaScript graphics library for creative coding |
# Just Starting Out
This workshop assumes you are comfortable with JavaScript and ES6 syntax, and instead focuses just on the audio side of things. If you're new to JavaScript, you might want to begin your journey below:
- [JavaScript For Cats](http://jsforcats.com/)
Also great is Daniel Shiffman's video series, which often uses p5.js:
- [Programming from A to Z](https://shiffman.net/a2z/)
A more comprehensive guide on the JavaScript language can be found here:
- [The Modern JavaScript Tutorial](https://javascript.info/)
And here's a useful cheat sheet to use as a reference:
- [Interactive JavaScript Cheat Sheet](https://htmlcheatsheet.com/js/)
# Code Snippets
I've also included a small "recipes" document that you can use as a reference if you forget any of the patterns discussed during the workshop.
- [Code Snippets](./docs/snippets.md)
# Setup
If you want to run the examples locally (without internet), clone this repo, then `cd` into the folder and use [Node.js](https://nodejs.org/) and npm to install dependencies:
```sh
npm install
```
Now, start the development server:
```sh
npm start
```
Edit files in `src/js` and see them reflected in the browser.
# Further Reading
Further links on web audio, creative coding, and related topics:
- Resources & Tutorials
- [awesome-web-audio](https://github.com/notthetup/awesome-webaudio) — A list that includes resources, books, and more on Web Audio
- [The Coding Train](https://thecodingtrain.com) with Daniel Shiffman
- [Creative Coding with Canvas & WebGL](https://frontendmasters.com/courses/canvas-webgl/) — My own course, if you want to continue exploring the world of creative coding and generative art
- [awesome-audio-visualization](https://github.com/willianjusten/awesome-audio-visualization) — A large list of interesting Web Audio visualizers
- [awesome-creative-coding](https://github.com/terkelg/awesome-creative-coding) — A large list of resources
- Math
- [Sine / Cosine Reference](https://www.mathsisfun.com/algebra/trig-interactive-unit-circle.html)
- [Sine and Cosine Calculator](https://www.desmos.com/calculator/hlqxvc6hho)
- [math-as-code](https://github.com/Jam3/math-as-code) — A cheat sheet for mathematical notation in code form
- Learning Audio
- [Learning Synths by Ableton](https://learningsynths.ableton.com/)
- [Learning Music by Ableton](https://learningmusic.ableton.com/index.html)
- [Music Theory](https://www.lightnote.co/)
- Fun Web Audio Sites
- [generative.fm](https://play.generative.fm/browse)
- [Patatap](https://patatap.com/)
- [Blob Opera](https://artsandculture.google.com/experiment/blob-opera/AAHWrq360NcGbw?hl=en)
- [Sounds of the Pub](https://soundsofthepub.com/)
- [Pink Trombone](https://dood.al/pinktrombone/)
- Tools
- [Spectrum Analyser](http://spectrum.surge.sh/) — see the frequency spectrum of an MP3 file
# License
MIT, see [LICENSE.md](./LICENSE.md) for details.
================================================
FILE: docs/snippets.md
================================================
#### :closed_book: [Web Audio Synthesis & Visualization](../README.md) → Snippets
---
# Snippets
Here you will find some 'recipes' and patterns that we'll be using during the workshop.
## Contents
- [Creating an Audio Tag](#creating-an-audio-tag)
- [Loading an Audio Buffer](#loading-an-audio-buffer)
- [Playing an Audio Buffer](#playing-an-audio-buffer)
- [Analysing Audio Waveform](#analysing-audio-waveform)
- [Analysing Audio Frequency](#analysing-audio-frequency)
- [Root Mean Squared Metering](#root-mean-squared-metering)
- [Indexing into the Frequency Array](#indexing-into-the-frequency-array)
- [Disabling Builtin Play/Pause Controls](#disabling-builtin-playpause-controls)
## Creating an Audio Tag
```js
// Create tag
const audio = document.createElement("audio");
// set URL to the MP3 within your Glitch.com assets
audio.src = "path/to/music.mp3";
// To play audio through Glitch.com CDN
audio.crossOrigin = "Anonymous";
// Optional: enable looping so the audio never stops
audio.loop = true;
// Play audio
audio.play();
// If routing through a WebAudio graph, also resume its audio context
audioContext.resume();
```
## Loading an Audio Buffer
```js
let audioContext;
let audioBuffer;
async function loadSound() {
// Re-use the same context if it exists
if (!audioContext) {
audioContext = new AudioContext();
}
// Re-use the audio buffer as a source
if (!audioBuffer) {
// Fetch MP3 from URL
const resp = await fetch("path/to/music.mp3");
// Turn into an array buffer of raw binary data
const buf = await resp.arrayBuffer();
// Decode the entire binary MP3 into an AudioBuffer
audioBuffer = await audioContext.decodeAudioData(buf);
}
}
```
## Playing an Audio Buffer
This relies on the `loadSound` function described above, since an audio buffer can only be played once it has been loaded and decoded asynchronously.
```js
async function playSound() {
// Ensure we are all loaded up
await loadSound();
// Ensure we are in a resumed state
await audioContext.resume();
// Now create a new "Buffer Source" node for playing AudioBuffers
const source = audioContext.createBufferSource();
// Connect the source directly to the destination (speakers/headphones)
source.connect(audioContext.destination);
// Assign the loaded buffer
source.buffer = audioBuffer;
// Start (zero = play immediately)
source.start(0);
}
```
## Disabling Builtin Play/Pause Controls
Browsers will, by default, play/pause `<audio>` elements in response to keyboard media controls, and sometimes when you connect or disconnect Bluetooth headphones. In many apps, you may want to override this behavior.
```js
// just ignore this event
navigator.mediaSession.setActionHandler("pause", () => {});
```
## Analysing Audio Waveform
```js
let signalData;
let analyserNode;
function setupAudio() {
/* ... create an audio 'source' node ... */
analyserNode = audioContext.createAnalyser();
signalData = new Float32Array(analyserNode.fftSize);
source.connect(analyserNode);
}
function draw() {
analyserNode.getFloatTimeDomainData(signalData);
/* now visualize ... */
}
```
## Analysing Audio Frequency
```js
let analyserNode;
let frequencyData;
function setupAudio() {
/* ... create an audio 'source' node ... */
analyserNode = audioContext.createAnalyser();
frequencyData = new Float32Array(analyserNode.frequencyBinCount);
source.connect(analyserNode);
}
function draw() {
analyserNode.getFloatFrequencyData(frequencyData);
/* now visualize ... */
}
```
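Note that `getFloatFrequencyData` fills the array with decibel values, typically falling between the analyser's `minDecibels` and `maxDecibels`. A common next step is normalizing a reading into a 0..1 range for visualization. Here is a minimal sketch; `normalizeDb` is a hypothetical helper (not part of the workshop code), and its default range matches the AnalyserNode defaults of -100 and -30 dB:

```javascript
// Hypothetical helper: map a dB reading into 0..1 using the
// analyser's range (defaults match AnalyserNode's defaults)
function normalizeDb(db, minDb = -100, maxDb = -30) {
  const t = (db - minDb) / (maxDb - minDb);
  // Clamp, since readings can fall outside the range
  return Math.min(1, Math.max(0, t));
}

console.log(normalizeDb(-100)); // quietest → 0
console.log(normalizeDb(-65)); // midway → 0.5
console.log(normalizeDb(-30)); // loudest → 1
```

You could call this on each element of `frequencyData` before mapping it to, say, a bar height on screen.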
## Root Mean Squared Metering
Start with [Analysing Audio Waveform](#analysing-audio-waveform) snippet and then pass the data into the following function to get a signal between 0 and 1.
```js
function rootMeanSquaredSignal(data) {
let rms = 0;
for (let i = 0; i < data.length; i++) {
rms += data[i] * data[i];
}
return Math.sqrt(rms / data.length);
}
```
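As a sanity check of the math (not part of the workshop code): the RMS of a full-scale sine wave is 1/√2 ≈ 0.707, so feeding the function a synthetic sine cycle, similar in shape to what the analyser returns, should land near that value:

```javascript
function rootMeanSquaredSignal(data) {
  let rms = 0;
  for (let i = 0; i < data.length; i++) {
    rms += data[i] * data[i];
  }
  return Math.sqrt(rms / data.length);
}

// Synthetic stand-in for analyser output: one cycle of a sine wave
const data = new Float32Array(1024);
for (let i = 0; i < data.length; i++) {
  data[i] = Math.sin((i / data.length) * 2 * Math.PI);
}

console.log(rootMeanSquaredSignal(data).toFixed(3)); // ≈ 0.707
```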
## Indexing into the Frequency Array
If you have an array that represents a list of frequency bins (i.e. where each index maps to a frequency band in Hz and each element holds that band's signal in dB), you can convert from Hz to an index and back like so:
```js
// Convert the frequency in Hz to an index in the array
function frequencyToIndex(frequencyHz, sampleRate, frequencyBinCount) {
const nyquist = sampleRate / 2;
const index = Math.round((frequencyHz / nyquist) * frequencyBinCount);
return Math.min(frequencyBinCount - 1, Math.max(0, index));
}
// Convert an index in the array to a frequency in Hz
function indexToFrequency(index, sampleRate, frequencyBinCount) {
return (index * sampleRate) / (frequencyBinCount * 2);
}
```
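As a worked example (the values here are illustrative, not from the workshop): with a 44100 Hz sample rate and an `fftSize` of 2048 (so `frequencyBinCount` is 1024), the bin nearest 440 Hz works out as follows:

```javascript
// Convert the frequency in Hz to an index in the array
function frequencyToIndex(frequencyHz, sampleRate, frequencyBinCount) {
  const nyquist = sampleRate / 2;
  const index = Math.round((frequencyHz / nyquist) * frequencyBinCount);
  // Clamp to a valid array index
  return Math.min(frequencyBinCount - 1, Math.max(0, index));
}

// Convert an index in the array to a frequency in Hz
function indexToFrequency(index, sampleRate, frequencyBinCount) {
  return (index * sampleRate) / (frequencyBinCount * 2);
}

const sampleRate = 44100;
const bins = 1024; // analyserNode.frequencyBinCount when fftSize is 2048

const index = frequencyToIndex(440, sampleRate, bins);
console.log(index); // 20

// The round trip lands on that bin's center frequency, not exactly 440 Hz
console.log(indexToFrequency(index, sampleRate, bins)); // ≈ 430.66 Hz
```

Each bin covers roughly 21.5 Hz here (22050 / 1024), which is why the round trip does not return exactly 440 Hz.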
#### [← Back to Documentation](../README.md)
================================================
FILE: package.json
================================================
{
"name": "workshop-web-audio",
"version": "1.0.0",
"description": "",
"main": "index.js",
"license": "MIT",
"author": {
"name": "Matt DesLauriers",
"email": "dave.des@gmail.com",
"url": "https://github.com/mattdesl"
},
"devDependencies": {
"live-server": "^1.2.1"
},
"scripts": {
"start": "live-server src"
},
"keywords": [],
"repository": {
"type": "git",
"url": "git://github.com/mattdesl/workshop-web-audio.git"
},
"homepage": "https://github.com/mattdesl/workshop-web-audio",
"bugs": {
"url": "https://github.com/mattdesl/workshop-web-audio/issues"
}
}
================================================
FILE: src/01-play-mp3-stream.html
================================================
sketch
================================================
FILE: src/02-play-mp3-buffer.html
================================================
sketch
================================================
FILE: src/03-gain.html
================================================
sketch
================================================
FILE: src/04-waveform.html
================================================
sketch
================================================
FILE: src/05-waveform-advanced.html
================================================
sketch
================================================
FILE: src/06-meter.html
================================================
sketch
================================================
FILE: src/07-meter-levels.html
================================================
sketch
================================================
FILE: src/08-frequency.html
================================================
sketch
================================================
FILE: src/09-frequency-advanced.html
================================================
sketch
================================================
FILE: src/10-tone-demo.html
================================================
sketch
================================================
FILE: src/11-tone-tap.html
================================================
sketch
================================================
FILE: src/12-tone-patatap.html
================================================
sketch
================================================
FILE: src/13-tone-effects.html
================================================
sketch
================================================
FILE: src/14-tone-sequencer.html
================================================
sketch
================================================
FILE: src/15-tone-mp3-effects.html
================================================
sketch
================================================
FILE: src/js/01-play-mp3-stream.js
================================================
let audioContext;
let audio;
function mousePressed() {
if (!audioContext) {
// setup our audio
audioContext = new AudioContext();
// create new tag
audio = document.createElement("audio");
// optional; enable audio looping
audio.loop = true;
// set the URL of the audio asset
audio.src = "audio/piano.mp3";
// trigger audio
audio.play();
const source = audioContext.createMediaElementSource(audio);
// wire the source to the 'speaker'
source.connect(audioContext.destination);
} else {
// stop the audio
audio.pause();
audioContext.close();
audioContext = audio = null;
}
}
function setup() {
createCanvas(windowWidth, windowHeight);
}
function windowResized() {
resizeCanvas(windowWidth, windowHeight);
}
function draw() {
// fill background
background("black");
fill("white");
noStroke();
// Draw play/pause button
const dim = min(width, height);
if (audioContext) {
polygon(width / 2, height / 2, dim * 0.1, 4, PI / 4);
} else {
polygon(width / 2, height / 2, dim * 0.1, 3);
}
}
// Draw a basic polygon, handles triangles, squares, pentagons, etc
function polygon(x, y, radius, sides = 3, angle = 0) {
beginShape();
for (let i = 0; i < sides; i++) {
const a = angle + TWO_PI * (i / sides);
let sx = x + cos(a) * radius;
let sy = y + sin(a) * radius;
vertex(sx, sy);
}
endShape(CLOSE);
}
================================================
FILE: src/js/02-play-mp3-buffer.js
================================================
/* eslint-disable */
let audioContext;
let audioBuffer;
function mousePressed() {
playSound();
}
async function loadSound() {
// Re-use the same context if it exists
if (!audioContext) {
audioContext = new AudioContext();
}
// Re-use the audio buffer as a source
if (!audioBuffer) {
// Fetch MP3 from URL
const resp = await fetch("audio/chime.mp3");
// Turn into an array buffer of raw binary data
const buf = await resp.arrayBuffer();
// Decode the entire binary MP3 into an AudioBuffer
audioBuffer = await audioContext.decodeAudioData(buf);
}
}
async function playSound() {
// Ensure we are all loaded up
await loadSound();
// Ensure we are in a resumed state
await audioContext.resume();
// Now create a new "Buffer Source" node for playing AudioBuffers
const source = audioContext.createBufferSource();
// Connect to destination
source.connect(audioContext.destination);
// Assign the loaded buffer
source.buffer = audioBuffer;
// Start (zero = play immediately)
source.start(0);
}
function setup() {
createCanvas(windowWidth, windowHeight);
}
function windowResized() {
resizeCanvas(windowWidth, windowHeight);
}
function draw() {
// fill background
background("black");
fill("white");
noStroke();
// Draw play/pause button
const dim = min(width, height);
if (mouseIsPressed) {
circle(width / 2, height / 2, dim * 0.1);
} else {
polygon(width / 2, height / 2, dim * 0.1, 3);
}
}
// Draw a basic polygon, handles triangles, squares, pentagons, etc
function polygon(x, y, radius, sides = 3, angle = 0) {
beginShape();
for (let i = 0; i < sides; i++) {
const a = angle + TWO_PI * (i / sides);
let sx = x + cos(a) * radius;
let sy = y + sin(a) * radius;
vertex(sx, sy);
}
endShape(CLOSE);
}
================================================
FILE: src/js/03-gain.js
================================================
let audioContext;
let audio;
let gainNode;
function mousePressed() {
if (!audioContext) {
// Create a new audio context
audioContext = new AudioContext();
// Create tag
audio = document.createElement("audio");
// set URL to the MP3 within your Glitch.com assets
audio.src = "audio/piano.mp3";
// To play audio through Glitch.com CDN
audio.crossOrigin = "Anonymous";
// Enable looping so the audio never stops
audio.loop = true;
// Play audio
audio.play();
// Create a "Media Element" source node
const source = audioContext.createMediaElementSource(audio);
// Create a gain for volume adjustment
gainNode = audioContext.createGain();
// wire source to gain
source.connect(gainNode);
// wire the gain -> speaker
gainNode.connect(audioContext.destination);
} else {
// Clean up our element and audio context
audio.pause();
audioContext.close();
audioContext = audio = null;
}
}
function setup() {
createCanvas(windowWidth, windowHeight);
}
function windowResized() {
resizeCanvas(windowWidth, windowHeight);
}
function draw() {
// fill background
background("black");
fill("white");
noStroke();
// Draw play/pause button
const dim = min(width, height);
if (audioContext) {
// Get a new volume based on mouse position
const volume = abs(mouseX - width / 2) / (width / 2);
// Schedule a gradual shift in value with a small time constant
gainNode.gain.setTargetAtTime(volume, audioContext.currentTime, 0.01);
// Draw a volume meter
rectMode(CENTER);
rect(width / 2, height / 2, dim * volume, dim * 0.05);
} else {
polygon(width / 2, height / 2, dim * 0.1, 3);
}
}
// Draw a basic polygon, handles triangles, squares, pentagons, etc
function polygon(x, y, radius, sides = 3, angle = 0) {
beginShape();
for (let i = 0; i < sides; i++) {
const a = angle + TWO_PI * (i / sides);
let sx = x + cos(a) * radius;
let sy = y + sin(a) * radius;
vertex(sx, sy);
}
endShape(CLOSE);
}
================================================
FILE: src/js/04-waveform.js
================================================
/* eslint-disable */
let audioContext;
let audioBuffer;
let analyserNode;
let analyserData;
let gainNode;
function mousePressed() {
playSound();
}
async function loadSound() {
// Re-use the same context if it exists
if (!audioContext) {
audioContext = new AudioContext();
}
// Re-use the audio buffer as a source
if (!audioBuffer) {
// Fetch MP3 from URL
const resp = await fetch("audio/chime.mp3");
// Turn into an array buffer of raw binary data
const buf = await resp.arrayBuffer();
// Decode the entire binary MP3 into an AudioBuffer
audioBuffer = await audioContext.decodeAudioData(buf);
}
// Setup a master gain node and AnalyserNode
if (!gainNode) {
// Create a gain and connect to destination
gainNode = audioContext.createGain();
// Create an Analyser Node
analyserNode = audioContext.createAnalyser();
// Create a Float32 array to hold the data
analyserData = new Float32Array(analyserNode.fftSize);
// Connect the GainNode to the analyser
gainNode.connect(analyserNode);
// Connect GainNode to destination as well
gainNode.connect(audioContext.destination);
}
}
async function playSound() {
// Ensure we are all loaded up
await loadSound();
// Ensure we are in a resumed state
await audioContext.resume();
// Now create a new "Buffer Source" node for playing AudioBuffers
const source = audioContext.createBufferSource();
// Connect to gain (which will be analyzed and also sent to destination)
source.connect(gainNode);
// Assign the loaded buffer
source.buffer = audioBuffer;
// Start (zero = play immediately)
source.start(0);
}
function setup() {
createCanvas(windowWidth, windowHeight);
}
function windowResized() {
resizeCanvas(windowWidth, windowHeight);
}
function draw() {
// fill background
background(0, 0, 0);
if (analyserNode) {
noFill();
stroke("white");
// get time domain data
analyserNode.getFloatTimeDomainData(analyserData);
beginShape();
for (let i = 0; i < analyserData.length; i++) {
// -1...1
const amplitude = analyserData[i];
const y = map(
amplitude,
-1,
1,
height / 2 - height / 4,
height / 2 + height / 4
);
const x = map(i, 0, analyserData.length - 1, 0, width);
vertex(x, y);
}
endShape();
} else {
fill("white");
noStroke();
// Draw a play button
const dim = min(width, height);
polygon(width / 2, height / 2, dim * 0.1, 3);
}
}
// Draw a basic polygon, handles triangles, squares, pentagons, etc
function polygon(x, y, radius, sides = 3, angle = 0) {
beginShape();
for (let i = 0; i < sides; i++) {
const a = angle + TWO_PI * (i / sides);
let sx = x + cos(a) * radius;
let sy = y + sin(a) * radius;
vertex(sx, sy);
}
endShape(CLOSE);
}
================================================
FILE: src/js/05-waveform-advanced.js
================================================
/* eslint-disable */
let audioContext;
let analyserNode;
let analyserData;
let gainNode;
let audio;
let isFloat = false;
let interval;
let analyserTarget;
function mousePressed() {
// Only initiate audio upon a user gesture
if (!audioContext) {
const AudioContext = window.AudioContext || window.webkitAudioContext;
audioContext = new AudioContext();
// Optional:
// If the user inserts/removes bluetooth headphones or pushes
// the play/pause media keys, we can use the following to ignore the action
// navigator.mediaSession.setActionHandler("pause", () => {});
// Make a stream source, i.e. MP3, microphone, etc
// In this case we choose an element
audio = document.createElement("audio");
// Upon loading the audio, let's play it
audio.addEventListener(
"canplay",
() => {
// First, ensure the context is in a resumed state
audioContext.resume();
// Now, play the audio
audio.play();
},
{ once: true }
);
// Loop audio
audio.loop = true;
// Set source
audio.crossOrigin = "Anonymous";
audio.src = "audio/piano.mp3";
// Connect source into the WebAudio context
const source = audioContext.createMediaElementSource(audio);
source.connect(audioContext.destination);
analyserNode = audioContext.createAnalyser();
// You can increase the detail to some power-of-two value
// This will give you more samples of data per second
const detail = 4;
analyserNode.fftSize = 2048 * detail;
isFloat = Boolean(analyserNode.getFloatTimeDomainData);
analyserData = new Float32Array(analyserNode.fftSize);
if (isFloat) {
// We can use float array for this, for higher detail
analyserTarget = new Float32Array(analyserData.length);
} else {
// We are stuck with byte array
analyserTarget = new Uint8Array(analyserData.length);
analyserTarget.fill(0xff / 2);
}
// connect source to analyser
source.connect(analyserNode);
// Only update the data every N fps
const fps = 12;
interval = setInterval(() => {
if (isFloat) {
analyserNode.getFloatTimeDomainData(analyserTarget);
} else {
analyserNode.getByteTimeDomainData(analyserTarget);
}
}, (1 / fps) * 1000);
} else {
// kill audio
audio.pause();
audioContext.close();
clearInterval(interval);
audioContext = analyserNode = null;
}
}
// Smooth linear interpolation that accounts for delta time
function damp(a, b, lambda, dt) {
return lerp(a, b, 1 - Math.exp(-lambda * dt));
}
function setup() {
createCanvas(windowWidth, windowHeight);
}
function windowResized() {
resizeCanvas(windowWidth, windowHeight);
}
function draw() {
// fill background
background(0, 0, 0);
fill("white");
noStroke();
if (analyserNode) {
// Interpolate the previous frame's data to the new frame
for (let i = 0; i < analyserData.length; i++) {
analyserData[i] = damp(
analyserData[i],
isFloat ? analyserTarget[i] : (analyserTarget[i] / 256) * 2 - 1,
0.01,
deltaTime
);
}
// draw your scene
noFill();
stroke("white");
// draw each sample within the data
beginShape();
const margin = 0.1;
for (let i = 0; i < analyserData.length; i++) {
// Map sample to screen X position
const x = map(
i,
0,
analyserData.length,
width * margin,
width * (1 - margin)
);
// Signal of this time-domain sample
const signal = analyserData[i];
// Boost the signal a little so it shows better
const amplitude = height * 4;
// Map signal to screen Y position
const y = map(
signal,
-1,
1,
height / 2 - amplitude / 2,
height / 2 + amplitude / 2
);
// Place vertex
vertex(x, y);
}
// Finish the line
endShape();
} else {
// Draw a play button
const dim = min(width, height);
polygon(width / 2, height / 2, dim * 0.1, 3);
}
}
// Draw a basic polygon, handles triangles, squares, pentagons, etc
function polygon(x, y, radius, sides = 3, angle = 0) {
beginShape();
for (let i = 0; i < sides; i++) {
const a = angle + TWO_PI * (i / sides);
let sx = x + cos(a) * radius;
let sy = y + sin(a) * radius;
vertex(sx, sy);
}
endShape(CLOSE);
}
================================================
FILE: src/js/06-meter.js
================================================
let audioContext;
let audio;
let signalData;
let analyserNode;
function mousePressed() {
if (!audioContext) {
// Create a new audio context
audioContext = new AudioContext();
// Create tag
audio = document.createElement("audio");
// set URL to the MP3 within your Glitch.com assets
audio.src = "audio/piano.mp3";
// To play audio through Glitch.com CDN
audio.crossOrigin = "Anonymous";
// Enable looping so the audio never stops
audio.loop = true;
// Play audio
audio.play();
// Create a "Media Element" source node
const source = audioContext.createMediaElementSource(audio);
// Create an analyser
analyserNode = audioContext.createAnalyser();
analyserNode.smoothingTimeConstant = 1;
// Create FFT data
signalData = new Float32Array(analyserNode.fftSize);
// Connect the source to the destination (speakers/headphones)
source.connect(audioContext.destination);
// Connect the source to the analyser node as well
source.connect(analyserNode);
} else {
// Toggle play/pause on subsequent presses
if (audio.paused) audio.play();
else audio.pause();
}
}
// Get the root mean squared of a set of signals
function rootMeanSquaredSignal(data) {
let rms = 0;
for (let i = 0; i < data.length; i++) {
rms += data[i] * data[i];
}
return Math.sqrt(rms / data.length);
}
function setup() {
createCanvas(windowWidth, windowHeight);
}
function windowResized() {
resizeCanvas(windowWidth, windowHeight);
}
function draw() {
// fill background
background("black");
// Draw play/pause button
const dim = min(width, height);
if (audioContext) {
// Get the *time domain* data (not the frequency)
analyserNode.getFloatTimeDomainData(signalData);
// Get the root mean square of the data
const signal = rootMeanSquaredSignal(signalData);
const scale = 10; // scale the data a bit so the circle is bigger
const size = dim * scale * signal;
stroke("white");
noFill();
strokeWeight(dim * 0.0075);
circle(width / 2, height / 2, size);
} else {
fill("white");
noStroke();
polygon(width / 2, height / 2, dim * 0.1, 3);
}
}
// Draw a basic polygon, handles triangles, squares, pentagons, etc
function polygon(x, y, radius, sides = 3, angle = 0) {
beginShape();
for (let i = 0; i < sides; i++) {
const a = angle + TWO_PI * (i / sides);
let sx = x + cos(a) * radius;
let sy = y + sin(a) * radius;
vertex(sx, sy);
}
endShape(CLOSE);
}
================================================
FILE: src/js/07-meter-levels.js
================================================
let audioContext;
let audio;
let signals;
// Isolate specific bands of frequency with their own colors
const frequencyBands = [
{ frequency: 55, color: "#D5B3E5" },
{ frequency: 110, color: "#7F3CAC" },
{ frequency: 220, color: "#22A722" },
{ frequency: 440, color: "#F1892A" },
{ frequency: 570, color: "#E84420" },
{ frequency: 960, color: "#F4CD00" },
{ frequency: 2000, color: "#3E58E2" },
{ frequency: 4000, color: "#F391C7" },
];
function mousePressed() {
if (!audioContext) {
// Create a new audio context
audioContext = new AudioContext();
// Create tag
audio = document.createElement("audio");
// set URL to the MP3 within your Glitch.com assets
audio.src = "audio/piano.mp3";
// To play audio through Glitch.com CDN
audio.crossOrigin = "Anonymous";
// Enable looping so the audio never stops
audio.loop = true;
// Play audio
audio.play();
// Create a "Media Element" source node
const source = audioContext.createMediaElementSource(audio);
// Connect the source to the destination (speakers/headphones)
source.connect(audioContext.destination);
// For each frequency we want to isolate, we will create
// its own analyser and filter nodes
signals = frequencyBands.map(({ frequency, color }) => {
// Create an analyser
const analyser = audioContext.createAnalyser();
analyser.smoothingTimeConstant = 1;
// Create FFT data
const data = new Float32Array(analyser.fftSize);
// Create a filter that will only allow a band of data
// through
const filter = audioContext.createBiquadFilter();
filter.frequency.value = frequency;
filter.Q.value = 1;
filter.type = "bandpass";
source.connect(filter);
filter.connect(analyser);
return {
analyser,
color,
data,
filter,
};
});
} else {
// Toggle play/pause on subsequent presses
if (audio.paused) audio.play();
else audio.pause();
}
}
// Get the root mean squared of a set of signals
function rootMeanSquaredSignal(data) {
let rms = 0;
for (let i = 0; i < data.length; i++) {
rms += data[i] * data[i];
}
return Math.sqrt(rms / data.length);
}
function setup() {
createCanvas(windowWidth, windowHeight);
}
function windowResized() {
resizeCanvas(windowWidth, windowHeight);
}
function draw() {
// fill background
background("black");
// Draw play/pause button
const dim = min(width, height);
if (audioContext) {
signals.forEach(({ analyser, data, color }, i) => {
// Get the waveform
analyser.getFloatTimeDomainData(data);
// Get the root mean square of the data
// Note this will already have been 'filtered'
// down to the band of frequency we want
const signal = rootMeanSquaredSignal(data);
const scale = 10; // scale the data a bit so the circle is bigger
const size = dim * scale * signal;
// Draw the rectangle
fill(color);
noStroke();
rectMode(CENTER);
const margin = 0.2 * dim;
const x =
signals.length <= 1
? width / 2
: map(i, 0, signals.length - 1, margin, width - margin);
const sliceWidth = ((width - margin * 2) / (signals.length - 1)) * 0.75;
rect(x, height / 2, sliceWidth, size);
});
} else {
fill("white");
noStroke();
polygon(width / 2, height / 2, dim * 0.1, 3);
}
}
// Draw a basic polygon, handles triangles, squares, pentagons, etc
function polygon(x, y, radius, sides = 3, angle = 0) {
beginShape();
for (let i = 0; i < sides; i++) {
const a = angle + TWO_PI * (i / sides);
let sx = x + cos(a) * radius;
let sy = y + sin(a) * radius;
vertex(sx, sy);
}
endShape(CLOSE);
}
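The vertex placement in `polygon()` above just walks a circle of the given radius. Outside of p5 the same math looks like this (a p5-free sketch using `Math.cos`/`Math.sin` in place of p5's globals):

```javascript
// Compute the vertices of a regular n-gon of the given radius,
// centered at the origin (standalone version of polygon() above)
function polygonVertices(radius, sides, angle = 0) {
  const verts = [];
  for (let i = 0; i < sides; i++) {
    const a = angle + Math.PI * 2 * (i / sides);
    verts.push([Math.cos(a) * radius, Math.sin(a) * radius]);
  }
  return verts;
}

// A triangle of radius 1 at angle 0 starts at (1, 0)
const tri = polygonVertices(1, 3);
```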
================================================
FILE: src/js/08-frequency.js
================================================
let audioContext;
let audio;
let analyserNode;
let frequencyData;
function mousePressed() {
if (!audioContext) {
// Create a new audio context
audioContext = new AudioContext();
// Create tag
audio = document.createElement("audio");
// set URL to the MP3 within your Glitch.com assets
audio.src = "audio/piano.mp3";
// To play audio through Glitch.com CDN
audio.crossOrigin = "Anonymous";
// Enable looping so the audio never stops
audio.loop = true;
// Play audio
audio.play();
// Create a "Media Element" source node
const source = audioContext.createMediaElementSource(audio);
analyserNode = audioContext.createAnalyser();
// Get some higher resolution toward the low end
analyserNode.fftSize = 2048 * 2;
// These are the defaults but different tracks might
// need different values
analyserNode.minDecibels = -100;
analyserNode.maxDecibels = -30;
frequencyData = new Float32Array(analyserNode.frequencyBinCount);
// Connect source to analyser node
source.connect(analyserNode);
// Connect the source to the destination (speakers/headphones)
source.connect(audioContext.destination);
} else {
// Clean up our element and audio context
audio.pause();
audioContext.close();
audioContext = audio = null;
}
}
// Convert the frequency in Hz to an index in the array
function frequencyToIndex(frequencyHz, sampleRate, frequencyBinCount) {
const nyquist = sampleRate / 2;
const index = Math.round((frequencyHz / nyquist) * frequencyBinCount);
return Math.min(frequencyBinCount, Math.max(0, index));
}
// Convert an index in an array to a frequency in Hz
function indexToFrequency(index, sampleRate, frequencyBinCount) {
return (index * sampleRate) / (frequencyBinCount * 2);
}
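The two conversion helpers above can be checked with a worked example. Assuming a typical 44100 Hz sample rate (the exact rate depends on the device) and this sketch's fftSize of 4096, giving 2048 bins, each bin spans roughly nyquist / binCount ≈ 10.77 Hz:

```javascript
// Worked example (assumes sampleRate = 44100, frequencyBinCount = 2048)
const sampleRate = 44100;
const binCount = 2048;
const nyquist = sampleRate / 2; // 22050 Hz

// 440 Hz (A4) falls in bin round(440 / 22050 * 2048) = 41
const index = Math.round((440 / nyquist) * binCount); // 41

// Mapping bin 41 back gives ~441.4 Hz, not exactly 440:
// the round trip quantizes to the bin width
const hz = (index * sampleRate) / (binCount * 2);
```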
// Get the normalized audio signal (0..1) between two frequencies
function audioSignal(analyser, frequencies, minHz, maxHz) {
if (!analyser) return 0;
const sampleRate = analyser.context.sampleRate;
const binCount = analyser.frequencyBinCount;
let start = frequencyToIndex(minHz, sampleRate, binCount);
const end = frequencyToIndex(maxHz, sampleRate, binCount);
const count = end - start;
let sum = 0;
for (; start < end; start++) {
sum += frequencies[start];
}
const minDb = analyser.minDecibels;
const maxDb = analyser.maxDecibels;
const valueDb = count === 0 || !isFinite(sum) ? minDb : sum / count;
return map(valueDb, minDb, maxDb, 0, 1, true);
}
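The final `map(valueDb, minDb, maxDb, 0, 1, true)` call above is a clamped linear remap from a decibel reading to 0..1. Without p5, the same normalization looks like this (the function name is illustrative):

```javascript
// Clamped linear remap from a dB reading to 0..1
function normalizeDb(valueDb, minDb, maxDb) {
  const t = (valueDb - minDb) / (maxDb - minDb);
  return Math.min(1, Math.max(0, t));
}

// Halfway between -100 dB and -30 dB normalizes to 0.5
const half = normalizeDb(-65, -100, -30);
// Readings outside the range clamp to the ends
const low = normalizeDb(-120, -100, -30); // 0
```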
function setup() {
createCanvas(windowWidth, windowHeight);
}
function windowResized() {
resizeCanvas(windowWidth, windowHeight);
}
function draw() {
// fill background
background("black");
fill("white");
noStroke();
// Draw play/pause button
const dim = min(width, height);
if (audioContext) {
analyserNode.getFloatFrequencyData(frequencyData);
const cx = width / 2;
const cy = height / 2;
const radius = dim * 0.75;
strokeWeight(dim * 0.0075);
noFill();
// draw the higher frequency band (150–2500 Hz)
stroke("#E84420");
const high = audioSignal(analyserNode, frequencyData, 150, 2500);
circle(cx, cy, radius * high);
// draw the low frequency band (50–150 Hz)
stroke("#F4CD00");
const low = audioSignal(analyserNode, frequencyData, 50, 150);
circle(cx, cy, radius * low);
} else {
polygon(width / 2, height / 2, dim * 0.1, 3);
}
}
// Draw a basic polygon, handles triangles, squares, pentagons, etc
function polygon(x, y, radius, sides = 3, angle = 0) {
beginShape();
for (let i = 0; i < sides; i++) {
const a = angle + TWO_PI * (i / sides);
let sx = x + cos(a) * radius;
let sy = y + sin(a) * radius;
vertex(sx, sy);
}
endShape(CLOSE);
}
================================================
FILE: src/js/09-frequency-advanced.js
================================================
/* eslint-disable */
let audioContext;
let frequencyData;
let analyserNode;
let currentHue = 0;
let maxFrequencyTarget = 0;
let audio;
function setup() {
createCanvas(windowWidth, windowHeight);
// Optional:
// If the user inserts/removes bluetooth headphones or pushes
// the play/pause media keys, we can use the following to ignore the action
if ("mediaSession" in navigator) {
navigator.mediaSession.setActionHandler("pause", () => {});
}
}
function mousePressed() {
// Only initiate audio upon a user gesture
if (!audioContext) {
audioContext = new AudioContext();
// Make a stream source, i.e. MP3, microphone, etc
// In this case we choose an element
audio = document.createElement("audio");
// Upon loading the audio, let's play it
audio.addEventListener(
"canplay",
() => {
// First, ensure the context is in a resumed state
audioContext.resume();
// Now, play the audio
audio.play();
},
{ once: true }
);
// Enable looping
audio.loop = true;
// Set source
audio.crossOrigin = "Anonymous";
audio.src = "audio/piano.mp3";
// Connect source into the WebAudio context
const source = audioContext.createMediaElementSource(audio);
source.connect(audioContext.destination);
analyserNode = audioContext.createAnalyser();
const detail = 4;
analyserNode.fftSize = 2048 * detail;
analyserNode.minDecibels = -100;
analyserNode.maxDecibels = -50;
frequencyData = new Float32Array(analyserNode.frequencyBinCount);
source.connect(analyserNode);
} else {
audio.pause();
audioContext.close();
audioContext = null;
}
}
// Convert the frequency in Hz to an index in the array
function frequencyToIndex(frequencyHz, sampleRate, frequencyBinCount) {
const nyquist = sampleRate / 2;
const index = Math.round((frequencyHz / nyquist) * frequencyBinCount);
return Math.min(frequencyBinCount, Math.max(0, index));
}
// Convert an index in an array to a frequency in Hz
function indexToFrequency(index, sampleRate, frequencyBinCount) {
return (index * sampleRate) / (frequencyBinCount * 2);
}
// Get the normalized audio signal (0..1) between two frequencies
function audioSignal(analyser, frequencies, minHz, maxHz) {
if (!analyser) return 0;
const sampleRate = analyser.context.sampleRate;
const binCount = analyser.frequencyBinCount;
let start = frequencyToIndex(minHz, sampleRate, binCount);
const end = frequencyToIndex(maxHz, sampleRate, binCount);
const count = end - start;
let sum = 0;
for (; start < end; start++) {
sum += frequencies[start];
}
const minDb = analyser.minDecibels;
const maxDb = analyser.maxDecibels;
const valueDb = count === 0 || !isFinite(sum) ? minDb : sum / count;
return map(valueDb, minDb, maxDb, 0, 1, true);
}
// Find the frequency band that has the most peak signal
function audioMaxFrequency(analyserNode, frequencies) {
let maxSignal = -Infinity;
let maxSignalIndex = 0;
for (let i = 0; i < frequencies.length; i++) {
const signal = frequencies[i];
if (signal > maxSignal) {
maxSignal = signal;
maxSignalIndex = i;
}
}
return indexToFrequency(
maxSignalIndex,
analyserNode.context.sampleRate,
analyserNode.frequencyBinCount
);
}
function damp(a, b, lambda, dt) {
return lerp(a, b, 1 - Math.exp(-lambda * dt));
}
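The `damp()` helper above gives frame-rate independent smoothing: a plain `lerp(a, b, 0.1)` each frame converges faster at higher frame rates, whereas the `1 - exp(-lambda * dt)` factor compensates for the elapsed time, so two small steps land exactly where one big step would. A standalone check (with `lerp` re-implemented since p5 isn't assumed here):

```javascript
const lerpNum = (a, b, t) => a + (b - a) * t;
const dampNum = (a, b, lambda, dt) =>
  lerpNum(a, b, 1 - Math.exp(-lambda * dt));

// One 16 ms step...
const oneStep = dampNum(0, 100, 0.1, 16);
// ...lands exactly where two 8 ms steps do
const twoSteps = dampNum(dampNum(0, 100, 0.1, 8), 100, 0.1, 8);
// Both ≈ 79.81; a fixed-factor per-frame lerp would not agree
```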
function windowResized() {
resizeCanvas(windowWidth, windowHeight);
}
function draw() {
// fill background
background(240);
noStroke();
rectMode(CENTER);
if (analyserNode) {
analyserNode.getFloatFrequencyData(frequencyData);
maxFrequencyTarget = map(
audioMaxFrequency(analyserNode, frequencyData),
0,
500,
0,
360,
true
);
}
const cx = width / 2;
const cy = height / 2;
const dim = min(width, height);
colorMode(HSL);
currentHue = damp(currentHue, maxFrequencyTarget, 0.001, deltaTime);
let hueA = currentHue;
let hueB = (hueA + 45) % 360;
const colorA = color(hueA, 50, 50);
const colorB = color(hueB, 50, 50);
const maxSize = dim * 0.75;
const minSize = dim * 0.15;
const count = 6;
background(currentHue, 50, 50);
for (let i = 0; i < count; i++) {
const t = map(i, 0, count - 1, 0, 1);
const c = color((currentHue + 90 * ((i + 1) / count)) % 360, 50, 50);
const minBaseHz = 200;
const maxBaseHz = 2000;
const minHz = map(count - i, 0, count, minBaseHz, maxBaseHz);
const maxHz = map(count - i + 1, 0, count, minBaseHz, maxBaseHz);
const signal = analyserNode
? audioSignal(analyserNode, frequencyData, minHz, maxHz)
: 0;
const baseSize = map(i, 0, count - 1, maxSize, minSize);
const size = baseSize + (maxSize / 4) * signal;
const edge = 0.5;
fill(c);
rect(cx, cy + ((maxSize - size) * edge) / 2, size, size);
}
if (!audioContext) {
// Draw a play button
const dim = min(width, height);
fill("white");
noStroke();
polygon(width / 2, height / 2, dim * 0.1, 3);
}
}
// Draw a basic polygon, handles triangles, squares, pentagons, etc
function polygon(x, y, radius, sides = 3, angle = 0) {
beginShape();
for (let i = 0; i < sides; i++) {
const a = angle + TWO_PI * (i / sides);
let sx = x + cos(a) * radius;
let sy = y + sin(a) * radius;
vertex(sx, sy);
}
endShape(CLOSE);
}
================================================
FILE: src/js/10-tone-demo.js
================================================
// Master volume in decibels
const volume = -15;
// The synth we'll use for audio
let synth;
// Create a new canvas to the browser size
function setup() {
createCanvas(windowWidth, windowHeight);
// Clear with black on setup
background(0);
// Make the volume quieter
Tone.Master.volume.value = volume;
// Setup a synth with ToneJS
synth = new Tone.Synth({
oscillator: {
type: "sine",
},
});
// Wire up our nodes:
// synth->master
synth.connect(Tone.Master);
}
// On window resize, update the canvas size
function windowResized() {
resizeCanvas(windowWidth, windowHeight);
}
// Render loop that draws shapes with p5
function draw() {
const dim = Math.min(width, height);
// Black background
background(0);
// Get a 0..1 value for the mouse
const u = max(0, min(1, mouseX / width));
// Choose a frequency that sounds good
const frequency = lerp(75, 2500, u);
synth.setNote(frequency);
if (mouseIsPressed) {
const time = millis() / 1000;
const verts = 1000;
noFill();
stroke(255);
strokeWeight(dim * 0.005);
beginShape();
for (let i = 0; i < verts; i++) {
const t = verts <= 1 ? 0.5 : i / (verts - 1);
const x = t * width;
let y = height / 2;
// This is not an accurate representation, but
// instead exaggerated for the sake of visualization
const frequencyMod = lerp(1, 1000, pow(u, 5));
const amplitude = sin(time + t * frequencyMod);
y += (amplitude * height) / 2;
vertex(x, y);
}
endShape();
}
// Draw a 'play' button
noStroke();
fill(255);
polygon(width / 2, height / 2, dim * 0.1, 3);
}
// Update the FX and trigger synth ON
function mousePressed() {
synth.triggerAttack();
}
// Trigger synth OFF
function mouseReleased() {
synth.triggerRelease();
}
// Draw a basic polygon, handles triangles, squares, pentagons, etc
function polygon(x, y, radius, sides = 3, angle = 0) {
beginShape();
for (let i = 0; i < sides; i++) {
const a = angle + TWO_PI * (i / sides);
let sx = x + cos(a) * radius;
let sy = y + sin(a) * radius;
vertex(sx, sy);
}
endShape(CLOSE);
}
================================================
FILE: src/js/11-tone-tap.js
================================================
// Master volume in decibels
const volume = -2;
// The synth we'll use for audio
let synth;
let mouse;
// Create a new canvas to the browser size
function setup() {
createCanvas(windowWidth, windowHeight);
// Clear with black on setup
background(0);
// Make the volume quieter
Tone.Master.volume.value = volume;
// Setup a synth with ToneJS
synth = new Tone.Synth({
oscillator: {
type: "sine",
},
});
// Wire up our nodes:
// synth->master
synth.connect(Tone.Master);
}
// On window resize, update the canvas size
function windowResized() {
resizeCanvas(windowWidth, windowHeight);
// Clear with black on resize
background(0);
}
// Render loop that draws shapes with p5
function draw() {
const dim = Math.min(width, height);
// Instead of drawing black, we draw with
// transparent black to give a 'ghosting' effect
const opacity = 0.085;
background(0, 0, 0, opacity * 255);
// If we have a mouse position, draw it
if (mouse) {
noFill();
stroke(255);
strokeWeight(dim * 0.01);
circle(mouse[0], mouse[1], dim * 0.2);
// Clear position so we stop drawing it,
// this will make it fade away
mouse = null;
}
// Draw a 'play' button
noStroke();
fill(255);
polygon(width / 2, height / 2, dim * 0.1, 3);
}
// Update mouse position and play a sound
function mousePressed() {
// Store mouse position when pressed
mouse = [mouseX, mouseY];
// Hirajoshi scale in C
// https://www.pianoscales.org/hirajoshi.html
const notes = ["C", "Db", "F", "Gb", "Bb"];
const octaves = [2, 3, 4];
const octave = random(octaves);
const note = random(notes);
synth.triggerAttackRelease(note + octave, "8n");
}
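For reference, Tone.js resolves note names like `"C4"` to equal-temperament pitches anchored at A4 = 440 Hz. The underlying conversion from a MIDI note number is shown below as a standalone sketch (Tone.js does this internally; this helper is only illustrative):

```javascript
// Equal temperament: each semitone is a factor of 2^(1/12),
// anchored at A4 (MIDI note 69) = 440 Hz
function midiToFrequency(midi) {
  return 440 * Math.pow(2, (midi - 69) / 12);
}

const a4 = midiToFrequency(69); // 440
const c4 = midiToFrequency(60); // ≈ 261.63 ("middle C")
```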
// Draw a basic polygon, handles triangles, squares, pentagons, etc
function polygon(x, y, radius, sides = 3, angle = 0) {
beginShape();
for (let i = 0; i < sides; i++) {
const a = angle + TWO_PI * (i / sides);
let sx = x + cos(a) * radius;
let sy = y + sin(a) * radius;
vertex(sx, sy);
}
endShape(CLOSE);
}
================================================
FILE: src/js/12-tone-patatap.js
================================================
// Master volume in decibels
const volume = -2;
// The synth we'll use for audio
let synth;
let risoColors;
let colorJSON;
let active = false;
function preload() {
// loads a JSON as an object
colorJSON = loadJSON("js/riso-colors.json");
}
// Create a new canvas to the browser size
function setup() {
createCanvas(windowWidth, windowHeight);
background("black");
// unpack the JSON object as an array
risoColors = Object.values(colorJSON);
// Make the volume quieter
Tone.Master.volume.value = volume;
// Setup a synth with ToneJS
synth = new Tone.Synth({
oscillator: {
type: "sine",
},
});
// Wire up our nodes:
// synth->master
var feedbackDelay = new Tone.FeedbackDelay("8n", 0.6);
synth.connect(feedbackDelay);
synth.connect(Tone.Master);
feedbackDelay.connect(Tone.Master);
frameRate(25);
}
// On window resize, update the canvas size
function windowResized() {
resizeCanvas(windowWidth, windowHeight);
}
// Render loop that draws shapes with p5
function draw() {
// We slowly clear each frame
const opacity = 0.05;
background(0, 0, 0, opacity * 255);
if (!active) {
const dim = min(width, height);
noStroke();
fill(255);
polygon(width / 2, height / 2, dim * 0.05, 3);
}
}
// Update mouse position and play a sound
function mousePressed() {
// First time we click...
if (!active) {
active = true;
// Clear background to white to create an initial flash
background(255);
}
// choose a note
const note = random(["A3", "C4", "D4", "E3", "G4"]);
synth.triggerAttackRelease(note, "8n");
const dim = min(width, height);
const x = mouseX;
const y = mouseY;
noStroke();
const curColorData = random(risoColors);
const curColor = color(curColorData.hex);
const size = max(10, abs(randomGaussian(dim / 8, dim / 8)));
const type = random(["circle", "line", "polygon"]);
curColor.setAlpha(255 * 0.25);
background(curColor);
curColor.setAlpha(255);
fill(curColor);
textAlign(CENTER, CENTER);
textFont("monospace");
text(curColorData.pantone, x, y + size / 2 + 20);
text(curColorData.name, x, y - size / 2 - 20);
if (type === "circle") {
ellipseMode(CENTER);
circle(x, y, size);
} else if (type === "line") {
strokeWeight(dim * 0.01);
stroke(curColor);
polygon(x, y, size * 0.5, 2, random(-1, 1) * PI * 2);
} else if (type === "polygon") {
polygon(x, y, size * 0.5, floor(random(3, 10)), random(-1, 1) * PI * 2);
}
}
// Draw a basic polygon, handles triangles, squares, pentagons, etc
function polygon(x, y, radius, sides = 3, angle = 0) {
beginShape();
for (let i = 0; i < sides; i++) {
const a = angle + TWO_PI * (i / sides);
let sx = x + cos(a) * radius;
let sy = y + sin(a) * radius;
vertex(sx, sy);
}
endShape(CLOSE);
}
================================================
FILE: src/js/13-tone-effects.js
================================================
// The setup() function is async
// So it might take a little while to load
let ready = false;
// The synth that plays notes
let synth;
// Can be 'sine', 'sawtooth', 'triangle', 'square'
// Can also add suffixes like sine8, square4
const type = "square";
// Global volume in decibels
const volume = -15;
// The filter and effect nodes which we will modulate
let filter, effect;
// Min and max frequency (Hz) cutoff range for the filter
const filterMin = 100;
const filterMax = 5000;
// 0..1 values for our FX
let fxU = 0.5;
let fxV = 0.5;
// The notes we will use
const notes = ["C5", "A3", "D4", "G4", "A4", "F4"];
// Create a new canvas to the browser size
async function setup() {
createCanvas(windowWidth, windowHeight);
// Clear with black on setup
background(0);
// Make the volume quieter
Tone.Master.volume.value = volume;
// Setup a reverb with ToneJS
const reverb = new Tone.Reverb({
decay: 5,
wet: 0.5,
preDelay: 0.2,
});
// Load the reverb
await reverb.generate();
// Create an effect node that creates a feedback delay
effect = new Tone.FeedbackDelay(0.4, 0.85);
// Create a new filter for the X slider
filter = new Tone.Filter();
filter.type = "lowpass";
// Setup a synth with ToneJS
synth = new Tone.Synth({
oscillator: {
// We prefix 'fat' so we can spread the oscillator over multiple frequencies
type: `fat${type}`,
count: 3,
spread: 30,
},
envelope: {
attack: 0.001,
decay: 0.1,
sustain: 0.5,
release: 0.1,
attackCurve: "exponential",
},
});
// Now lets wire up our stack like so:
// synth->effect->reverb->filter->master
synth.connect(effect);
effect.connect(reverb);
reverb.connect(filter);
filter.connect(Tone.Master);
// Now we're ready for drawing!
ready = true;
}
// On window resize, update the canvas size
function windowResized() {
resizeCanvas(windowWidth, windowHeight);
// Clear to black on resize
background(0);
}
// Render loop that draws shapes with p5
function draw() {
// Make sure async setup() is done before we draw
if (!ready) return;
filter.frequency.value = lerp(filterMin, filterMax, fxU);
effect.wet.value = fxV;
// For consistent sizing regardless of portrait/landscape
const dim = Math.min(width, height);
// Black background
background(0, 0, 0, 20);
// draw the two FX knobs
if (mouseIsPressed) {
noFill();
strokeWeight(dim * 0.0175);
stroke(255);
drawEffectKnob(dim * 0.4, fxU);
drawEffectKnob(dim * 0.6, fxV);
}
// Draw a 'play' button
noStroke();
fill(255);
polygon(width / 2, height / 2, dim * 0.1, 3);
}
// Draws an arc with the given amount of 'strength'
function drawEffectKnob(radius, t) {
if (t <= 0) return;
arc(width / 2, height / 2, radius, radius, 0, PI * 2 * t);
}
// Update FX values based on mouse position
function updateEffects() {
fxU = max(0, min(1, mouseX / width));
fxV = max(0, min(1, mouseY / height));
}
// Update the FX and trigger synth ON
function mousePressed() {
updateEffects();
if (synth) synth.triggerAttack(random(notes));
}
// Update the FX values
function mouseDragged() {
updateEffects();
}
// Trigger synth OFF
function mouseReleased() {
if (synth) synth.triggerRelease();
}
// Draw a basic polygon, handles triangles, squares, pentagons, etc
function polygon(x, y, radius, sides = 3, angle = 0) {
beginShape();
for (let i = 0; i < sides; i++) {
const a = angle + TWO_PI * (i / sides);
let sx = x + cos(a) * radius;
let sy = y + sin(a) * radius;
vertex(sx, sy);
}
endShape(CLOSE);
}
================================================
FILE: src/js/14-tone-sequencer.js
================================================
// The synth we'll use for audio
let synth;
// Whether the audio sequence is playing
let playing = false;
// The current Tone.Sequence
let sequence;
// The currently playing column
let currentColumn = 0;
// Here is the fixed scale we will use
const notes = ["A3", "C4", "D4", "E3", "G4"];
// Also can try other scales/notes
// const notes = ["F#4", "E4", "C#4", "A4"];
// const notes = ['A3', 'C4', 'D4', 'E4', 'G4', 'A4'];
// const notes = [ "A4", "D3", "E3", "G4", 'F#4' ];
// Number of rows is the number of different notes
const numRows = notes.length;
// Number of columns depends on how many notes to play in a measure
const numCols = 16;
const noteInterval = `${numCols}n`;
// Setup audio config
Tone.Transport.bpm.value = 120;
// Create a Row*Col data structure that has nested arrays
// [ [ 0, 0, 0 ], [ 0, 0, 0 ], ... ]
// The data can be 0 (off) or 1 (on)
const data = [];
for (let y = 0; y < numRows; y++) {
const row = [];
for (let x = 0; x < numCols; x++) {
row.push(0);
}
data.push(row);
}
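The nested loops above build a numRows × numCols grid of zeros. An equivalent one-liner, shown for reference only (note `Array.from` with a callback so each row is a distinct array; `fill([])` would make every row share one array):

```javascript
// Build a rows x cols grid of zeros; each row is its own array
function makeGrid(rows, cols) {
  return Array.from({ length: rows }, () => new Array(cols).fill(0));
}

const grid = makeGrid(3, 4);
// One row can be changed without affecting the others
grid[1][2] = 1;
```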
// Create a new canvas to the browser size
async function setup() {
createCanvas(innerWidth, innerHeight);
// Clear with black on setup
background(0);
// Setup a reverb with ToneJS
const reverb = new Tone.Reverb({
decay: 4,
wet: 0.2,
preDelay: 0.25,
});
// Load the reverb
await reverb.generate();
// Create an effect node that creates a feedback delay
const effect = new Tone.FeedbackDelay(`${Math.floor(numCols / 2)}n`, 1 / 3);
effect.wet.value = 0.2;
// Setup a synth with ToneJS
// We use a poly synth which can hold up to numRows voices
// Then we will play each note on a different voice
synth = new Tone.PolySynth(numRows, Tone.DuoSynth);
// Setup the synths a little bit
synth.set({
voice0: {
oscillator: {
type: "triangle4",
},
volume: -30,
envelope: {
attack: 0.005,
release: 0.05,
sustain: 1,
},
},
voice1: {
volume: -10,
envelope: {
attack: 0.005,
release: 0.05,
sustain: 1,
},
},
});
synth.volume.value = -10;
// Wire up our nodes:
synth.connect(effect);
synth.connect(Tone.Master);
effect.connect(reverb);
reverb.connect(Tone.Master);
// Every two measures, we randomize the notes
// We use Transport to schedule timer since it has
// to be exactly in sync with the audio
Tone.Transport.scheduleRepeat(() => {
randomizeSequencer();
}, "2m");
}
// On window resize, update the canvas size
function windowResized() {
resizeCanvas(innerWidth, innerHeight);
}
// Render loop that draws shapes with p5
function draw() {
// Our synth isn't loaded yet, don't draw anything
if (!synth) return;
const dim = min(width, height);
// Black background
background(0);
if (playing) {
// The audio is playing so we can show the sequencer
const margin = dim * 0.2;
const innerSize = dim - margin * 2;
const cellSize = innerSize / numCols;
push();
translate(innerWidth / 2 - dim / 2, innerHeight / 2 - dim / 2);
// Loop through the nested data structure, drawing each note
for (let y = 0; y < data.length; y++) {
const row = data[y];
for (let x = 0; x < row.length; x++) {
const u = x / (numCols - 1);
const v = y / (numRows - 1);
let px = lerp(margin, dim - margin, u);
let py = lerp(margin, dim - margin, v);
noStroke();
noFill();
// note on=fill, note off=stroke
if (row[x] === 1) fill(255);
else stroke(255);
// draw note
circle(px, py, cellSize / 2);
// draw a rectangle around the currently playing column
if (x === currentColumn) {
rectMode(CENTER);
rect(px, py, cellSize, cellSize);
}
}
}
pop();
} else {
// Draw a 'play' button
noStroke();
fill(255);
polygon(width / 2, height / 2, dim * 0.1, 3);
}
}
// Here we randomize the sequencer with some data
function randomizeSequencer() {
// Choose a % chance so that sometimes it is more busy, other times more sparse
const chance = random(0.5, 1.5);
for (let y = 0; y < data.length; y++) {
// Loop through and create some random on/off values
const row = data[y];
for (let x = 0; x < row.length; x++) {
row[x] = randomGaussian() > chance ? 1 : 0;
}
// Loop through again and make sure we don't have two
// consecutive on values (it sounds bad)
for (let x = 0; x < row.length - 1; x++) {
if (row[x] === 1 && row[x + 1] === 1) {
row[x + 1] = 0;
x++;
}
}
}
}
// When the mouse is pressed, turn on the sequencer
function mousePressed() {
// No synth loaded yet, just skip mouse click
if (!synth) {
return;
}
if (playing) {
// If we are currently playing, we stop the sequencer
playing = false;
sequence.stop();
Tone.Transport.stop();
} else {
// If we aren't currently playing, we can start the sequence
// We do this by creating an array of indices [ 0, 1, 2 ... 15 ]
const noteIndices = newArray(numCols);
// create the sequence, passing onSequenceStep function
sequence = new Tone.Sequence(onSequenceStep, noteIndices, noteInterval);
// Start the sequence and Transport loop
playing = true;
sequence.start();
Tone.Transport.start();
}
}
// Here is where we actually play the audio
function onSequenceStep(time, column) {
// We build up a list of notes, which will equal
// the numRows. This gets passed into our PolySynth
let notesToPlay = [];
// Go through each row
data.forEach((row, rowIndex) => {
// See if the note is "on"
const isOn = row[column] === 1;
// If its on, add it to the list of notes to play
if (isOn) {
const note = notes[rowIndex];
notesToPlay.push(note);
}
});
// Trigger a note
const velocity = random(0.5, 1);
synth.triggerAttackRelease(notesToPlay, noteInterval, time, velocity);
Tone.Draw.schedule(function () {
currentColumn = column;
}, time);
}
// Draw a basic polygon, handles triangles, squares, pentagons, etc
function polygon(x, y, radius, sides = 3, angle = 0) {
beginShape();
for (let i = 0; i < sides; i++) {
const a = angle + TWO_PI * (i / sides);
let sx = x + cos(a) * radius;
let sy = y + sin(a) * radius;
vertex(sx, sy);
}
endShape(CLOSE);
}
// A utility function to create a new array
// full of indices [ 0, 1, 2, ... (N - 1) ]
function newArray(n) {
const array = [];
for (let i = 0; i < n; i++) {
array.push(i);
}
return array;
}
================================================
FILE: src/js/15-tone-mp3-effects.js
================================================
// Master volume in decibels
const volume = -16;
const MP3 = "audio/piano.mp3";
// The synth we'll use for audio
let player;
let autoFilter;
// Create a new canvas to the browser size
async function setup() {
createCanvas(windowWidth, windowHeight);
// Make the volume quieter
Tone.Master.volume.value = volume;
// We can use 'player' to play MP3 files
player = new Tone.Player();
player.loop = true;
player.autostart = false;
player.loopStart = 1.0;
// Load and "await" the MP3 file
await player.load(MP3);
// Wire up connections
autoFilter = new Tone.AutoFilter("8n");
autoFilter.start();
player.connect(autoFilter);
autoFilter.connect(Tone.Master);
}
// On window resize, update the canvas size
function windowResized() {
resizeCanvas(windowWidth, windowHeight);
}
// Render loop that draws shapes with p5
function draw() {
if (!player || !player.loaded) {
// MP3 not loaded
return;
}
const dim = Math.min(width, height);
// Black background
background(0);
autoFilter.wet.value = mouseY / height;
autoFilter.frequency.value = map(mouseX, 0, width, 0.5, 1.5);
// Draw a 'play' or 'stop' button
if (player.state === "started") {
noStroke();
fill(255);
polygon(width / 2, height / 2, dim * 0.1, 4, PI / 4);
stroke("tomato");
noFill();
strokeWeight(dim * 0.0175);
circle(mouseX, mouseY, dim * 0.2);
} else {
noStroke();
fill(255);
polygon(width / 2, height / 2, dim * 0.1, 3);
}
}
// Update the FX and trigger synth ON
function mousePressed() {
if (player && player.loaded) {
if (player.state === "started") {
player.stop();
} else {
player.start();
}
}
}
// Draw a basic polygon, handles triangles, squares, pentagons, etc
function polygon(x, y, radius, sides = 3, angle = 0) {
beginShape();
for (let i = 0; i < sides; i++) {
const a = angle + TWO_PI * (i / sides);
let sx = x + cos(a) * radius;
let sy = y + sin(a) * radius;
vertex(sx, sy);
}
endShape(CLOSE);
}
================================================
FILE: src/js/riso-colors.json
================================================
[
{
"name": "Black",
"hex": "#000000",
"pantone": "BLACK U"
},
{
"name": "Burgundy",
"hex": "#914e72",
"pantone": "235 U",
"zType": "S-4225"
},
{
"name": "Blue",
"hex": "#0078bf",
"pantone": "3005 U",
"zType": "S-4257"
},
{
"name": "Green",
"hex": "#00a95c",
"pantone": "354 U",
"zType": "S-4259"
},
{
"name": "Medium Blue",
"hex": "#3255a4",
"pantone": "286 U",
"zType": "S-4261"
},
{
"name": "Bright Red",
"hex": "#f15060",
"pantone": "185 U",
"zType": "S-4263"
},
{
"name": "RisoFederal Blue",
"hex": "#3d5588",
"pantone": "288 U",
"zType": "S-4265"
},
{
"name": "Purple",
"hex": "#765ba7",
"pantone": "2685 U",
"zType": "S-4267"
},
{
"name": "Teal",
"hex": "#00838a",
"pantone": "321 U",
"zType": "S-4269"
},
{
"name": "Flat Gold",
"hex": "#bb8b41",
"pantone": "1245 U",
"zType": "S-4271"
},
{
"name": "Hunter Green",
"hex": "#407060",
"pantone": "342 U",
"zType": "S-4273"
},
{
"name": "Red",
"hex": "#ff665e",
"pantone": "WARM RED U",
"zType": "S-4275"
},
{
"name": "Brown",
"hex": "#925f52",
"pantone": "7526 U",
"zType": "S-4277"
},
{
"name": "Yellow",
"hex": "#ffe800",
"pantone": "YELLOW U",
"zType": "S-4279"
},
{
"name": "Marine Red",
"hex": "#d2515e",
"pantone": "186 U",
"zType": "S-4281"
},
{
"name": "Orange",
"hex": "#ff6c2f",
"pantone": "ORANGE 021 U",
"zType": "S-4283"
},
{
"name": "Fluorescent Pink",
"hex": "#ff48b0",
"pantone": "806 U",
"zType": "S-4287"
},
{
"name": "Light Gray",
"hex": "#88898a",
"pantone": "424 U",
"zType": "S-4291"
},
{
"name": "Metallic Gold",
"hex": "#ac936e",
"pantone": "872 U",
"zType": " S-2772"
},
{
"name": "Crimson",
"hex": "#e45d50",
"pantone": "485 U",
"zType": "S-4285"
},
{
"name": "Fluorescent Orange",
"hex": "#ff7477",
"pantone": "805 U",
"zType": "S-4289"
},
{
"name": "Cornflower",
"hex": "#62a8e5",
"pantone": "292 U",
"zType": "S-4617"
},
{
"name": "Sky Blue",
"hex": "#4982cf",
"pantone": "285U",
"zType": "S-4618"
},
{
"name": "Sea Blue",
"hex": "#0074a2",
"pantone": "307 U",
"zType": "S-4619"
},
{
"name": "Lake",
"hex": "#235ba8",
"pantone": "293 U",
"zType": "S-4620"
},
{
"name": "Indigo",
"hex": "#484d7a",
"pantone": "2758 U",
"zType": "S-4621"
},
{
"name": "Midnight",
"hex": "#435060",
"pantone": "296 U",
"zType": "S-4622"
},
{
"name": "Mist",
"hex": "#d5e4c0",
"pantone": "7485 U",
"zType": "S-4623"
},
{
"name": "Granite",
"hex": "#a5aaa8",
"pantone": "7538 U",
"zType": "S-4624"
},
{
"name": "Charcoal",
"hex": "#70747c",
"pantone": "7540 U",
"zType": "S-4625"
},
{
"name": "Smoky Teal",
"hex": "#5f8289",
"pantone": "5483 U",
"zType": "S-4626"
},
{
"name": "Steel",
"hex": "#375e77",
"pantone": "302 U",
"zType": "S-4627"
},
{
"name": "Slate",
"hex": "#5e695e",
"pantone": "5605 U",
"zType": "S-4628"
},
{
"name": "Turquoise",
"hex": "#00aa93",
"pantone": "3275 U",
"zType": "S-4629"
},
{
"name": "Emerald",
"hex": "#19975d",
"pantone": "355 U",
"zType": "S-4630"
},
{
"name": "Grass",
"hex": "#397e58",
"pantone": "356 U",
"zType": "S-4631"
},
{
"name": "Forest",
"hex": "#516e5a",
"pantone": "357 U",
"zType": "S-4632"
},
{
"name": "Spruce",
"hex": "#4a635d",
"pantone": "567 U",
"zType": "S-4633"
},
{
"name": "Moss",
"hex": "#68724d",
"pantone": "371 U",
"zType": "S-4634"
},
{
"name": "Sea Foam",
"hex": "#62c2b1",
"pantone": "570 U",
"zType": "S-4635"
},
{
"name": "Kelly Green",
"hex": "#67b346",
"pantone": " 368 U",
"zType": "S-4636"
},
{
"name": "Light Teal",
"hex": "#009da5",
"pantone": "320 U",
"zType": "S-4637"
},
{
"name": "Ivy",
"hex": "#169b62",
"pantone": "347 U",
"zType": "S-4638"
},
{
"name": "Pine",
"hex": "#237e74",
"pantone": "3295 U",
"zType": "S-4639"
},
{
"name": "Lagoon",
"hex": "#2f6165",
"pantone": "323 U",
"zType": "S-4640"
},
{
"name": "Violet",
"hex": "#9d7ad2",
"pantone": "265 U",
"zType": "S-4641"
},
{
"name": "Orchid",
"hex": "#aa60bf",
"pantone": "2592 U",
"zType": "S-4642"
},
{
"name": "Plum",
"hex": "#845991",
"pantone": "2603 U",
"zType": "S-4644"
},
{
"name": "Raisin",
"hex": "#775d7a",
"pantone": "519 U",
"zType": "S-4645"
},
{
"name": "Grape",
"hex": "#6c5d80",
"pantone": "2695 U",
"zType": "S-4646"
},
{
"name": "Scarlet",
"hex": "#f65058",
"pantone": "RED 032 U",
"zType": "S-4647"
},
{
"name": "Tomato",
"hex": "#d2515e",
"pantone": "186 U",
"zType": "S-4648"
},
{
"name": "Cranberry",
"hex": "#d1517a",
"pantone": "214 U",
"zType": "S-4649"
},
{
"name": "Maroon",
"hex": "#9e4c6e",
"pantone": "221 U",
"zType": "S-4650"
},
{
"name": "Raspberry Red",
"hex": "#d1517a",
"pantone": "214U",
"zType": "S-4651"
},
{
"name": "Brick",
"hex": "#a75154",
"pantone": "1807 U",
"zType": "S-4652"
},
{
"name": "Light Lime",
"hex": "#e3ed55",
"pantone": "387 U",
"zType": "S-4653"
},
{
"name": "Sunflower",
"hex": "#ffb511",
"pantone": "116 U",
"zType": "S-4654"
},
{
"name": "Melon",
"hex": "#ffae3b",
"pantone": "1235 U",
"zType": "S-4655"
},
{
"name": "Apricot",
"hex": "#f6a04d",
"pantone": "143 U",
"zType": "S-4656"
},
{
"name": "Paprika",
"hex": "#ee7f4b",
"pantone": "158 U",
"zType": "S-4657"
},
{
"name": "Pumpkin",
"hex": "#ff6f4c",
"pantone": "1655 U",
"zType": "S-4658"
},
{
"name": "Bright Olive Green",
"hex": "#b49f29",
"pantone": "103 U",
"zType": "S-4659"
},
{
"name": "Bright Gold",
"hex": "#ba8032",
"pantone": "131 U",
"zType": "S-4660"
},
{
"name": "Copper",
"hex": "#bd6439",
"pantone": "1525 U",
"zType": "S-4661"
},
{
"name": "Mahogany",
"hex": "#8e595a",
"pantone": "491 U",
"zType": "S-4662"
},
{
"name": "Bisque",
"hex": "#f2cdcf",
"pantone": "503 U",
"zType": "S-4663"
},
{
"name": "Bubble Gum",
"hex": "#f984ca",
"pantone": "231 U",
"zType": "S-4664"
},
{
"name": "Light Mauve",
"hex": "#e6b5c9",
"pantone": "7430 U",
"zType": "S-4665"
},
{
"name": "Dark Mauve",
"hex": "#bd8ca6",
"pantone": "687 U",
"zType": "S-4666"
},
{
"name": "Wine",
"hex": "#914e72",
"pantone": "235 U",
"zType": "S-4674"
},
{
"name": "Gray",
"hex": "#928d88",
"pantone": "403 U",
"zType": "S-4693"
},
{
"name": "White",
"hex": "#ffffff",
"zType": "S-4722 "
},
{
"name": "Aqua",
"hex": "#5ec8e5",
"pantone": "637 U",
"zType": "S-4917"
},
{
"name": "Mint",
"hex": "#82d8d5",
"pantone": "324 U",
"zType": "S-6316"
},
{
"name": "Fluorescent Yellow",
"hex": "#ffe900",
"pantone": "803 U",
"zType": "S-7761"
},
{
"name": "Fluorescent Red",
"hex": "#ff4c65",
"pantone": "812 U",
"zType": "S-7762"
},
{
"name": "Fluorescent Green",
"hex": "#44d62c",
"pantone": "802 U",
"zType": "S-7763"
}
]