Querying for Player Controls
What’s a custom player without custom controls? The next step is to query for all the custom controls we added in the HTML. We’ll use the Document.querySelector() method to grab each element and assign it to a variable.
// Player controls and attributes
const playButton = document.querySelector(".player-play-btn")
const playIcon = playButton.querySelector(".player-icon-play")
const pauseIcon = playButton.querySelector(".player-icon-pause")
const progress = document.querySelector(".player-progress")
const progressFilled = document.querySelector(".player-progress-filled")
const playerCurrentTime = document.querySelector(".player-time-current")
const playerDuration = document.querySelector(".player-time-duration")
const volumeControl = document.querySelector(".player-volume")
Here we have variables for each independent control and the progress bar shown in the user interface.
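If you’re wiring this up against your own markup, a quick check like the one below (my own addition, not part of the tutorial) can save some debugging time when a class name doesn’t match what querySelector expects:

// Optional: warn about any control that wasn't found in the DOM,
// which usually means a class name typo in the HTML
const controls = { playButton, progress, progressFilled, playerCurrentTime, playerDuration, volumeControl }
Object.entries(controls).forEach(([name, el]) => {
  if (!el) console.warn(`Player control "${name}" was not found`)
})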
Waiting Before JavaScript Fires
To properly load the audio, which can sometimes take longer than other items on the page, it probably makes sense to wait for the entire page to load before we run any JavaScript.
We’ll start with an event listener that waits for the page to load. We can wrap the entirety of our code in this block.
window.addEventListener("load", () => {
  // all code goes here besides variables
})
Next, we’ll listen for the playButton variable’s click event to instruct our player to play.
// Play button toggle
playButton.addEventListener("click", () => {
  // check if context is in suspended state (autoplay policy)
  // By default, browsers won't allow you to autoplay audio.
  // You can override by finding the AudioContext state and resuming it
  // after a user interaction like a "click" event.
  if (audioCtx.state === "suspended") {
    audioCtx.resume()
  }

  // Play or pause track depending on state
  if (playButton.dataset.playing === "false") {
    audioElement.play()
    playButton.dataset.playing = "true"
    playIcon.classList.add("hidden")
    pauseIcon.classList.remove("hidden")
  } else if (playButton.dataset.playing === "true") {
    audioElement.pause()
    playButton.dataset.playing = "false"
    pauseIcon.classList.add("hidden")
    playIcon.classList.remove("hidden")
  }
})
A few things happen at once when the playButton gets clicked.
- Browsers are smart enough to stop audio from auto-playing on the first load. The AudioContext object has a state property that returns “suspended”, “running”, or “closed”. In our case, we’re looking for “suspended”. If that’s the state returned, we can resume the audio with the resume() method.
- We use data attributes in the HTML to denote when the button is “playing” or “paused”.
- If the play or pause button is clicked, we can dynamically tell the audioElement to play or pause.
- For a better user experience, I added the ability to show and hide the play or pause icons depending on the player’s state.
Update Time Stamps and Progress
Each track you load into the audio element has its own characteristics and metadata you can display in the HTML. We start by making everything zero on the first page load and then call a function that dynamically updates and formats the time as the audio gets played or paused.
We’ll additionally show a progress bar that dynamically fills based on the amount of elapsed audio. This is handy for the end user, who might prefer to glance at a progress bar rather than read the remaining time.
// Update progress bar and time values as audio plays
audioElement.addEventListener("timeupdate", () => {
  progressUpdate()
  setTimes()
})
I created two functions that are defined further down in the JavaScript file. The main thing to note about the code above is the type of event we listen for. The timeupdate event is specific to media elements like audio and video in the Web API.
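If you’re curious how often it fires, a throwaway listener (purely for experimentation, not part of the player) makes it easy to see; browsers typically dispatch timeupdate a few times per second while the audio plays:

// Temporary experiment: log the playback position on every timeupdate
audioElement.addEventListener("timeupdate", () => {
  console.log(`currentTime: ${audioElement.currentTime.toFixed(2)}s`)
})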
Displaying and Formatting Time
We can use the playerCurrentTime and playerDuration variables to display and format time. We’ll set the textContent of those tags in the HTML to match a new timestamp relative to the audioElement’s current attributes. An audioElement has a currentTime property and a duration property.
Using the Date API in JavaScript, we can tap into a handy one-liner to convert the seconds returned from currentTime and duration into a format that matches HH:MM:SS (hours, minutes, seconds).
// Display currentTime and duration properties in real-time
function setTimes() {
  playerCurrentTime.textContent = new Date(audioElement.currentTime * 1000)
    .toISOString()
    .substr(11, 8)
  playerDuration.textContent = new Date(audioElement.duration * 1000)
    .toISOString()
    .substr(11, 8)
}
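To see what the one-liner produces, you can run the same conversion on a plain number of seconds in the browser console:

// 125 seconds -> "1970-01-01T00:02:05.000Z" -> "00:02:05"
new Date(125 * 1000).toISOString().substr(11, 8) // "00:02:05"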
Updating Player Progress
Updating the progress bar in our HTML is relatively simple and comes down to a percentage calculation. We get the percentage by dividing audioElement.currentTime by audioElement.duration and multiplying the result by 100.
Finally, we can set some CSS via JavaScript by using the progressFilled variable we created before and adjusting its flex-basis property so the bar grows or shrinks as the percentage changes.
// Update player timeline progress visually
function progressUpdate() {
  const percent = (audioElement.currentTime / audioElement.duration) * 100
  progressFilled.style.flexBasis = `${percent}%`
}
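To make the math concrete, here’s the same calculation with sample numbers:

// 30 seconds into a 120-second track:
const percent = (30 / 120) * 100 // 25
progressFilled.style.flexBasis = `${percent}%` // the bar is a quarter full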
Add Volume Controls
Adjusting volume taps back into the AudioContext object we used before. We’ll need to call a method named createGain() and map its gain value to the volume range input in the HTML.
// Bridge the gap between gainNode and AudioContext so we can manipulate volume (gain)
const gainNode = audioCtx.createGain()

volumeControl.addEventListener("change", () => {
  gainNode.gain.value = volumeControl.value
})

track.connect(gainNode).connect(audioCtx.destination)
We created a track variable early on in this tutorial and are finally putting it to use here. Using the connect() method, you can connect the track to the gainNode and then to the AudioContext’s destination. Without this line, the volume range input has no effect on the audio you hear.
We’ll listen for a change event to map the volume relative to the gain.
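As a small optional tweak (my own suggestion, not part of the original tutorial), you could listen for the input event instead of change so the volume updates continuously while the slider is being dragged rather than only when it’s released:

// Optional: update gain continuously as the slider moves
volumeControl.addEventListener("input", () => {
  gainNode.gain.value = volumeControl.value
})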
What Happens When the Audio Ends?
We can reset the player after the audio ends so it can be ready for another listen should the end user want to start it over.
// if the track ends, reset the player
audioElement.addEventListener("ended", () => {
  playButton.dataset.playing = "false"
  pauseIcon.classList.add("hidden")
  playIcon.classList.remove("hidden")
  progressFilled.style.flexBasis = "0%"
  audioElement.currentTime = 0
})
Here we toggle the play button icon from pause back to play, set the data-playing attribute to false, reset the progress bar, and set the audioElement’s currentTime property back to zero. (The duration property is read-only, so there’s nothing to reset there.)
Scrubbing the Progress Bar to Skip and Rewind
Our progress bar is functional visually, but it would be more helpful if you could click anywhere on the timeline and adjust the current audio playback. We can achieve this with a series of event listeners and a new function.
// Scrub player timeline to skip forward and back on click for easier UX
let mousedown = false

function scrub(event) {
  const scrubTime = (event.offsetX / progress.offsetWidth) * audioElement.duration
  audioElement.currentTime = scrubTime
}

progress.addEventListener("click", scrub)
progress.addEventListener("mousemove", (e) => mousedown && scrub(e))
progress.addEventListener("mousedown", () => (mousedown = true))
progress.addEventListener("mouseup", () => (mousedown = false))
The scrub() function takes the event we listen for as an argument. In particular, its offsetX property allows us to pinpoint where a user clicked and make calculations relative to the audioElement’s properties.
Finally, we can listen on the progress bar itself for a set of events (click, mousemove, mousedown, and mouseup) to adjust the audio element’s currentTime property.
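One defensive tweak worth considering (an assumption on my part, not something the original code handles): if someone clicks the timeline before the audio’s metadata has loaded, duration is still NaN and the seek math falls apart. A small guard avoids that:

// Defensive variant of scrub(): ignore clicks until the duration is known
function scrub(event) {
  if (!Number.isFinite(audioElement.duration)) return
  const scrubTime = (event.offsetX / progress.offsetWidth) * audioElement.duration
  audioElement.currentTime = scrubTime
}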
4. Putting it All Together
The final JavaScript code is below. One thing to note: on the first page load, I call the setTimes() function once up front so the time displays correctly before the user even starts manipulating the audio player.
// load sound via <audio> tag
const audioElement = document.querySelector("audio")
const audioCtx = new AudioContext()
const track = audioCtx.createMediaElementSource(audioElement)

// Player controls and attributes
const playButton = document.querySelector(".player-play-btn")
const playIcon = playButton.querySelector(".player-icon-play")
const pauseIcon = playButton.querySelector(".player-icon-pause")
const progress = document.querySelector(".player-progress")
const progressFilled = document.querySelector(".player-progress-filled")
const playerCurrentTime = document.querySelector(".player-time-current")
const playerDuration = document.querySelector(".player-time-duration")
const volumeControl = document.querySelector(".player-volume")

document.addEventListener("DOMContentLoaded", () => {
  // Set times after page load
  setTimes()

  // Update progress bar and time values as audio plays
  audioElement.addEventListener("timeupdate", () => {
    progressUpdate()
    setTimes()
  })

  // Play button toggle
  playButton.addEventListener("click", () => {
    // check if context is in suspended state (autoplay policy)
    // By default, browsers won't allow you to autoplay audio.
    // You can override by finding the AudioContext state and resuming it
    // after a user interaction like a "click" event.
    if (audioCtx.state === "suspended") {
      audioCtx.resume()
    }

    // Play or pause track depending on state
    if (playButton.dataset.playing === "false") {
      audioElement.play()
      playButton.dataset.playing = "true"
      playIcon.classList.add("hidden")
      pauseIcon.classList.remove("hidden")
    } else if (playButton.dataset.playing === "true") {
      audioElement.pause()
      playButton.dataset.playing = "false"
      pauseIcon.classList.add("hidden")
      playIcon.classList.remove("hidden")
    }
  })

  // if the track ends, reset the player
  audioElement.addEventListener("ended", () => {
    playButton.dataset.playing = "false"
    pauseIcon.classList.add("hidden")
    playIcon.classList.remove("hidden")
    progressFilled.style.flexBasis = "0%"
    audioElement.currentTime = 0
  })

  // Bridge the gap between gainNode and AudioContext so we can manipulate volume (gain)
  const gainNode = audioCtx.createGain()

  volumeControl.addEventListener("change", () => {
    gainNode.gain.value = volumeControl.value
  })

  track.connect(gainNode).connect(audioCtx.destination)

  // Display currentTime and duration properties in real-time
  function setTimes() {
    playerCurrentTime.textContent = new Date(audioElement.currentTime * 1000)
      .toISOString()
      .substr(11, 8)
    playerDuration.textContent = new Date(audioElement.duration * 1000)
      .toISOString()
      .substr(11, 8)
  }

  // Update player timeline progress visually
  function progressUpdate() {
    const percent = (audioElement.currentTime / audioElement.duration) * 100
    progressFilled.style.flexBasis = `${percent}%`
  }

  // Scrub player timeline to skip forward and back on click for easier UX
  let mousedown = false

  function scrub(event) {
    const scrubTime = (event.offsetX / progress.offsetWidth) * audioElement.duration
    audioElement.currentTime = scrubTime
  }

  progress.addEventListener("click", scrub)
  progress.addEventListener("mousemove", (e) => mousedown && scrub(e))
  progress.addEventListener("mousedown", () => (mousedown = true))
  progress.addEventListener("mouseup", () => (mousedown = false))

  // Track credit: Outfoxing the Fox by Kevin MacLeod under Creative Commons + MDN for the link.
})
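One caveat worth flagging (my own observation rather than something the tutorial covers): depending on how quickly the audio’s metadata loads, audioElement.duration may still be NaN when setTimes() first runs on DOMContentLoaded, and new Date(NaN) can’t be converted to an ISO string. If you run into that, registering the listener below inside the DOMContentLoaded callback (instead of, or in addition to, the direct setTimes() call) is a simple workaround:

// Optional safeguard: render the initial times only once the duration is known
audioElement.addEventListener("loadedmetadata", () => {
  setTimes()
})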
Conclusion
There you have it! With a bit of JavaScript and elbow grease, you can create your very own branded music player.
From here, you might experiment with adding more controls, like skip buttons or a panning control (see the sketch below for one way to approach panning). I’d also check out the AudioTrackList interface, which lets you work with a media element’s audio tracks and extend the design as necessary.
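For example, panning can be wired up much like the volume control by inserting a StereoPannerNode into the audio chain. This is just a minimal sketch under a few assumptions: it presumes a range input with a hypothetical .player-pan class (not part of this tutorial’s markup) going from -1 (left) to 1 (right), and it replaces the existing track.connect(gainNode).connect(audioCtx.destination) line with the routing shown at the bottom:

// Hypothetical pan control; assumes an <input type="range" class="player-pan"
// min="-1" max="1" step="0.01" value="0"> element exists in the HTML
const pannerNode = audioCtx.createStereoPanner()
const panControl = document.querySelector(".player-pan")

panControl.addEventListener("input", () => {
  pannerNode.pan.value = panControl.value
})

// Route the audio through the panner instead of connecting the gain node
// straight to the destination: track -> gain -> panner -> speakers
track.connect(gainNode).connect(pannerNode).connect(audioCtx.destination)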