beat and beat track and some tests - general improvements
sethbrasile committed Sep 23, 2024
1 parent 5bfe155 commit 4392bc4
Showing 15 changed files with 673 additions and 151 deletions.
26 changes: 20 additions & 6 deletions pnpm-lock.yaml

(Generated lockfile; diff not rendered.)

2 changes: 1 addition & 1 deletion src/app/pages/index.ts
@@ -4,7 +4,7 @@ import { setupPlayButton } from './play-button'
const codeExample = `
import { createSound } from 'ez-web-audio'
-function playSound() {
+async function playSound() {
// 1. Load a sound from a URL
const note = await createSound('Eb5.mp3')
// 2. Play the sound
6 changes: 3 additions & 3 deletions src/app/pages/play-button.ts
@@ -15,10 +15,10 @@ export function setupPlayButton(element: HTMLButtonElement): void {
// remove the setup listener
element.removeEventListener('click', setup)

+note.play()
+
// add a listener to play the note again when the button is clicked for the rest of the document's life
-element.addEventListener('click', async () => {
-note.play()
-})
+element.addEventListener('click', () => note.play())
}

element.addEventListener('click', setup)
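The change above keeps a one-time "setup" listener that removes itself on first click and installs a lightweight replay handler. That self-removing-listener pattern can be sketched in isolation (a hedged illustration using Node's built-in EventTarget as a stand-in for the real button; the names here are illustrative, not from the library):

```typescript
// Stand-in for the real HTMLButtonElement (EventTarget is built into Node 15+)
const button = new EventTarget()
const log: string[] = []

function setup(): void {
  // remove the setup listener so the expensive path runs only once
  button.removeEventListener('click', setup)
  log.push('setup + first play')

  // from now on, clicks just replay the already-loaded note
  button.addEventListener('click', () => log.push('replay'))
}

button.addEventListener('click', setup)

button.dispatchEvent(new Event('click')) // runs setup
button.dispatchEvent(new Event('click')) // runs the replay listener
console.log(log) // ['setup + first play', 'replay']
```

Note that per the DOM spec, a listener added during dispatch is not invoked for the event currently being dispatched, so the first click runs only `setup`.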
71 changes: 3 additions & 68 deletions src/app/pages/timing/drum-machine.ts
@@ -1,10 +1,8 @@
import { codeBlock } from '../../utils'
import nav from './nav'

const Content = {
setup() {
// const keys = document.querySelector<HTMLOListElement>('#keys')
// setupPiano(keys!)

},
html: `
@@ -13,75 +11,12 @@ ${nav}
<h1>Timing</h1>
<div class="docs">
<i><small>
Note: It is not strictly necessary to understand this concept, as EZ Web Audio provides methods that let you ignore it,
but I encourage you to understand it anyway. It's easy to grasp, and if you're building a rhythm- or timing-heavy
app, this knowledge will be very useful to you.
</small></i>
<p>
Timing with the Web Audio API can seem tricky at first; it's unlike any other timing system native to the
browser. Still, it's not very complex, and it's easy to wrap your brain around once you "get" it.
Below is an example of a drum machine that loads three samples for each lane and lets you program a drum beat.
The samples are alternated automatically, so you never hear the same sample back-to-back.
</p>
<p>
It's based on the concept of a currentTime that starts at 0 and counts its way up in seconds (as a high-precision
double). This clock starts the moment an AudioContext is created.
</p>
<p>If, for instance, you wanted a sound to play exactly 1 second after a user clicks a button, it could look like this:</p>
${codeBlock(`
// This is pseudo-code. The goal at this point is to get the concept across,
// not to potentially confuse you with framework-specific stuff.
// The moment that audioContext is created, audioContext.currentTime starts counting seconds
const audioContext = new AudioContext();
const sound = // Create or load a sound and hook up audio inputs and outputs.
// Not important right now...
// We'll say that the result is an audio "node" that is ready to play
function handleClick() {
// Get the current time from audioContext.
const now = audioContext.currentTime;
// Start the sound we created up there^, adding 1 second to "now"
// The Web Audio API deals in seconds, not milliseconds
sound.start(now + 1);
}
`)}
<p>Now what if we wanted to schedule the sound 5 times, each exactly 1 second apart?</p>
${codeBlock(`
// Again, I want to mention that this code will not work as-is. It's ignoring
// some other quirks of the Web Audio API. We're only focused on understanding
// timing at the moment.
const audioContext = new AudioContext();
const sound = // Create or load a sound and hook up audio inputs and outputs.
function handleClick() {
const now = audioContext.currentTime;
for (let i = 0; i < 5; i++) {
sound.start(now + i);
}
}
`)}
<p>
As you can see, as far as an AudioContext is concerned, the moment it is created is "the beginning of time,"
and scheduling events is achieved by specifying an exact moment on that clock. sound.start(100) plays the sound
exactly 100 seconds after the AudioContext was created, regardless of when sound.start(100) was called.
If sound.start(100) is called after 100 seconds have already passed since "the beginning of time," the sound
plays immediately.
</p>
<i><small>
Again, this is an important concept to understand, but in many cases (even more complex cases, such as
<a href="/ez-web-audio/timing/drum-machine">building a rhythmically-based instrument</a>) this is already handled
for you. Check out <a href="/ez-web-audio/timing/with-ez-web-audio">Beats</a>, or the very last example on this page.
</small></i>
</div>
`,
}
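The "beginning of time" rule described in the documentation above can be captured in a tiny helper. This is a hypothetical illustration only; the Web Audio API applies this clamping internally when you call `start()`:

```typescript
// A start() time that is already in the past is clamped to "now",
// which is why a late sound.start(100) plays immediately.
function effectiveStartTime(currentTime: number, requestedTime: number): number {
  return Math.max(currentTime, requestedTime)
}

console.log(effectiveStartTime(42, 100)) // 100 — still in the future, plays at t=100
console.log(effectiveStartTime(120, 100)) // 120 — already past, plays right away
```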
48 changes: 48 additions & 0 deletions src/beat-track.test.ts
@@ -0,0 +1,48 @@
import { expect, it } from 'vitest'
import { AudioContext as Mock } from 'standardized-audio-context-mock'
import type { Playable } from './interfaces/playable'
import type { Connectable } from './interfaces/connectable'
import { BeatTrack } from '@/beat-track'

function createBeatTrack() {
const context = new Mock() as unknown as AudioContext
const sounds: (Playable & Connectable)[] = []
return new BeatTrack(context, sounds)
}

it('exists', () => {
expect(BeatTrack).toBeTruthy()
})

it('can be created', () => {
const track = createBeatTrack()
expect(track).toBeTruthy()
})

it(`remembers beats' 'active' state when numBeats changes`, () => {
const beatTrack = createBeatTrack()
let [beat1, beat2, beat3] = beatTrack.beats

beat1.active = true
beat3.active = true

beatTrack.numBeats = 6

beat1 = beatTrack.beats[0]
beat2 = beatTrack.beats[1]
beat3 = beatTrack.beats[2]

expect(beat1.active).toBe(true)
expect(beat2.active).toBe(false)
expect(beat3.active).toBe(true)

beatTrack.numBeats = 4

beat1 = beatTrack.beats[0]
beat2 = beatTrack.beats[1]
beat3 = beatTrack.beats[2]

expect(beat1.active).toBe(true)
expect(beat2.active).toBe(false)
expect(beat3.active).toBe(true)
})
133 changes: 133 additions & 0 deletions src/beat-track.ts
@@ -0,0 +1,133 @@
import { Beat } from './beat'
import type { Connectable } from './interfaces/connectable'
import type { Playable } from './interfaces/playable'
import { Sampler } from './sampler'

const beatBank = new WeakMap()

/**
* An instance of this class has an array of "sounds" (each made up of one or more
* audio sources; if multiple are provided, they are played in round-robin fashion)
* and provides methods to play that sound repeatedly, mixed with "rests," in a
* rhythmic way. An instance of this class behaves very much like a "lane" on a drum machine.
*
* @class BeatTrack
* @extends Sampler
*
* @todo need a way to stop a BeatTrack once it's started. Maybe by creating
* the times in advance and not calling play until it's the next beat in the
* queue?
*/
export class BeatTrack extends Sampler {
constructor(private audioContext: AudioContext, sounds: (Playable & Connectable)[], opts?: { numBeats?: number, duration?: number }) {
super(sounds)
if (opts?.numBeats) {
this.numBeats = opts.numBeats
}
if (opts?.duration) {
this.duration = opts.duration
}
}

/**
* @property numBeats
*
* Determines the number of beats in a BeatTrack instance.
*/
public numBeats = 4

/**
* @property duration
*
* If specified, determines the length of time, in milliseconds, before isPlaying
* and currentTimeIsPlaying are automatically switched back to false after
* having been switched to true for each beat. 100ms is used by default.
*
* @default 100
*/
public duration = 100

/**
* @property beats
*
* Computed property. An array of Beat instances. The number of Beat instances
* in the array always matches the `numBeats` property. If `numBeats` or
* `duration` changes, this property is recomputed, but any beats that
* previously existed are reused so that they maintain their `active`
* state.
*/
public get beats(): Beat[] {
let beats: Beat[] = []
let numBeats = this.numBeats
let existingBeats: Beat[] | undefined

if (beatBank.has(this)) {
// Reuse (and, if numBeats shrank, trim) previously created beats so that
// they keep their `active` state and the array length matches numBeats
existingBeats = beatBank.get(this).slice(0, numBeats)
numBeats -= existingBeats!.length
}

for (let i = 0; i < numBeats; i++) {
const beat = new Beat(this.audioContext, {
duration: this.duration,
playIn: this.playIn.bind(this),
play: this.play.bind(this),
})

beats.push(beat)
}

if (existingBeats) {
beats = existingBeats.concat(beats)
}

beatBank.set(this, beats)

return beats
}

/**
* @method playBeats
*
* Calls play on all Beat instances in the beats array.
*
* @param {number} bpm The tempo at which the beats should be played.
* @param {number} noteType The (rhythmic) length of each beat. Fractions
* are suggested here so that it's easy to reason about. For example, for
* eighth notes, pass in `1/8`.
*/
public playBeats(bpm: number, noteType: number): void {
this.callPlayMethodOnBeats('playIn', bpm, noteType)
}

/**
* @method playActiveBeats
*
* Calls play on `active` Beat instances in the beats array. Any beat that
* is not marked active is effectively a "rest".
*
* @param {number} bpm The tempo at which the beats and rests should be played.
* @param {number} noteType The (rhythmic) length of each beat/rest. Fractions
* are suggested here so that it's easy to reason about. For example, for
* eighth notes, pass in `1/8`.
*/
public playActiveBeats(bpm: number, noteType: number): void {
this.callPlayMethodOnBeats('ifActivePlayIn', bpm, noteType)
}

/**
* @method callPlayMethodOnBeats
*
* The underlying method behind playBeats and playActiveBeats.
*
* @param {string} method The method that should be called on each beat.
* @param {number} bpm The tempo that should be used to calculate the length
* of a beat/rest.
* @param {number} noteType The (rhythmic) length of each beat/rest, used
* together with bpm to calculate its duration in seconds.
*/
private callPlayMethodOnBeats(method: 'ifActivePlayIn' | 'playIn', bpm: number, noteType: number = 1 / 4): void {
// http://bradthemad.org/guitar/tempo_explanation.php
const duration = (240 * noteType) / bpm
this.beats.forEach((beat, idx) => beat[method](idx * duration))
}
}
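The tempo formula in `callPlayMethodOnBeats` can be sanity-checked in isolation. Here `beatDuration` is a hypothetical standalone helper mirroring the same math, not an export of the library:

```typescript
// duration in seconds = (240 * noteType) / bpm
// e.g. quarter notes (1/4) at 120 bpm land 0.5 s apart
function beatDuration(bpm: number, noteType: number): number {
  return (240 * noteType) / bpm
}

console.log(beatDuration(120, 1 / 4)) // 0.5
console.log(beatDuration(60, 1 / 4)) // 1 — a quarter note at 60 bpm lasts one second
console.log(beatDuration(120, 1 / 8)) // 0.25
```

With `numBeats = 4` and quarter notes at 120 bpm, `playActiveBeats(120, 1/4)` would schedule the active beats at offsets 0, 0.5, 1.0, and 1.5 seconds from "now" (each beat's index times this duration).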
