
Building an Inter Dimensional Video Player for Lord Huron


In Support of “Your Other Life” and “Ton Autre Vie”

[Image: The “Life Fader” control]

Growing up Cajun in South Louisiana, I’d hear both English and French in conversation and music. Personally, I can’t speak or understand French, and my grandmother, Velma Marie, knew this. When she and the other adults wanted to hide what they were saying from us, they would simply “switch to French.” We would sit there, nodding our heads, hoping to catch any foul language or to notice when we were being called a “couyon.” I miss my maw maw.

At the end of last year, Republic Records contacted me about a pair of new songs Lord Huron had created to support the new Deluxe Edition of 2021’s Long Lost: “Your Other Life” and “Ton Autre Vie.” Lyrically and sonically, the songs are similar, but one is sung in English by Ben and the other is sung in French by Sarah Dugas. Accompanying this tale of uncovering a lover’s secret life is a pair of music videos shot by Adam Willis.

The original concept we set out to build would allow users to “toggle” between both music visuals, but after having conversations with Adam and Ben, we set our ambitions higher. What if, instead of building an “A/B Toggle” player, we gave fans a “Life Fader” to slide seamlessly from one dimension to another? Not only would this allow the user to hear both versions of the song, it would also reveal a hidden duet between our two protagonists. You can now see and hear all sides of the story. It is a music video as a multiverse.

Head over to www.yourotherlife.com or www.tonautrevie.com to experience the video for yourself and read on to find out how it came together.

A Multiverse of Solutions

The core technical function of our app involves keeping the two music visuals in sync both visually and sonically. To say it was a journey getting to the final solution would be an understatement, but I thoroughly enjoyed the problem. Here are some of the things we tried, where we ended up, and why.

Two Videos

The first solution seemed obvious to me: place each song’s video in its own <video> tag and build controls that would play, pause, and seek them simultaneously. Initially one video would be at full volume and opacity while the other would have its volume and opacity set to zero. As the user adjusted the fader, each video’s volume and opacity would be adjusted accordingly. This seemed to work okay until I tested it on my iPhone. Apparently, you can’t play two videos with audio at the same time. (Think about how your iPhone pauses one video before you start another.) Honestly, I wasn’t sure if we were even going to be able to pull off a solution for mobile, but I made the decision to try something different in an attempt to get it working.
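A minimal sketch of that first approach, assuming two hypothetical <video> elements named videoA and videoB, might have looked something like this:

// Play both videos together (this is what iPhones refuse to do with audio)
await Promise.all([videoA.play(), videoB.play()])

// Crossfade volume and opacity as the fader moves from 0.0 to 1.0
const crossfade = (fade) => {
  videoA.volume = 1 - fade
  videoA.style.opacity = 1 - fade
  videoB.volume = fade
  videoB.style.opacity = fade
}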

One Video, Two Audio Files

In my experimenting, I built an audio-only version of the fader using Howler.js. What I noticed is that Howler did a really good job keeping both audio files in sync using the Web Audio API and preloaded audio tracks. In fact, there was little to no lag at all. So, I made the decision to remove audio from the videos and instead serve it separately via Web Audio. Unlike with video, iPhone browsers will let you play multiple audio files at once. Since the videos no longer needed audio on them, I could now merge them into a single stacked visual. Now that’s what I call video syncing. The play button would now play the one video and two audio files simultaneously, and I began to check for any drift between the video’s position and the audio’s position.

let diff = Math.abs(videoTime - audioTime)

If this difference in time met a certain threshold, I would seek the video to the position of the audio.

video.currentTime = audioTime

I found that it was less jarring to have the visual jump a few milliseconds than to have the audio stop and make a similar jump. This kept the audio playing seamlessly and allowed the video to adjust itself as required. Having said that, I didn’t want the video jumping around too much, so I used a Lodash throttle function to prevent the sync method from being called more than once every 5 seconds. JavaScript promises are your friend. I used them religiously to better understand when media had successfully seeked before taking any other media actions.
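Stitched together, the drift correction might look something like this sketch. The Howl setup, file names, threshold value, and the video element reference are my assumptions, not necessarily the production values:

import { Howl } from 'howler'
import throttle from 'lodash/throttle'

// Preload a vocal track via Web Audio (WEBM first, MP3 fallback)
const englishAudio = new Howl({
  src: ['english.webm', 'english.mp3'],
  preload: true
})

// Seek the video to the audio's position, at most once every 5 seconds
const sync = throttle(() => {
  // Current audio position in seconds
  let audioTime = englishAudio.seek()
  let diff = Math.abs(video.currentTime - audioTime)
  // Only correct noticeable drift (the threshold here is an assumption)
  if (diff > 0.25) {
    video.currentTime = audioTime
  }
}, 5000)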

This solution was pretty good, but both audio files had the exact same instrumental backing. If they became even slightly out of sync, it was very noticeable, especially on drum hits. After confirming with Ben that the instrumentals were, in fact, identical, I decided to extract the instrumental into a third audio file and test whether it also stayed in sync.

One Video, Three Audio Files

While adding a third audio file increased the overall preload size, it ended up being a viable solution across all of the devices I was testing. Following Howler’s format recommendations, I created both MP3 and smaller WEBM audio files to serve. I also wrote an audio sync method, but I only call it when the video is paused or seeked while paused. This method seeks all audio files to the current position of the video.

return Promise.all(
  howls.map((howl) => {
    return new Promise((resolve, reject) => {
      // Resolve once the seek completes
      howl.once('seek', resolve)
      // Seek to the video's current position
      howl.seek(videoTime)
    })
  })
)

I now had all of my audio files and my single video file playing simultaneously and working to keep itself in sync. That’s when I faced a real ghost in the machine. For whatever reason, my audio would occasionally not play. It didn’t seem related to a particular browser or device. It just didn’t play... sometimes. These are the bugs you fear. After a long session of reverse engineering the app, I discovered that Howler “suspends the Web Audio AudioContext after 30 seconds of inactivity to decrease processing and energy usage.” What was happening in our app is that sometimes the video preload took longer than 30 seconds, so by the time I clicked play, the audio context was suspended. It was a religious experience when I finally realized this, and luckily Howler provides a one line solution to disable its auto suspend functionality.

Howler.autoSuspend = false

With that bug out of the way, I made some final decisions on how we would handle video formats and preloading. First, we created both a low and a high quality version of the video. The low quality version would be served to mobile devices and the high quality version saved for non-mobile devices. The same went for audio, though most browsers chose the smaller WEBM audio file. Mobile devices seemed much more susceptible to video buffering conflicting with the overall sync methodology, so I decided to preload the video entirely in that environment. Non-mobile devices, however, are able to start the experience as soon as the video has buffered enough to play. It might sound a little counterintuitive to preload on mobile and not on desktop, but it worked well for this use case.
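The quality selection itself can be as simple as this sketch, with a naive user agent test and hypothetical file names (the production app may detect devices differently):

// Naive mobile check
const isMobile = /iPhone|iPad|Android/i.test(navigator.userAgent)

// Serve the smaller encode to mobile devices and the larger one elsewhere
const videoUrl = isMobile ? 'video-low.mp4' : 'video-high.mp4'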

In order to truly preload a video in its entirety, you must fetch the file, get a blob of it, create an object URL from that blob, and then update the <video> source.

// Fetch entire video
let response = await fetch('video.mp4')
// Get video blob
let blob = await response.blob()
// Create video url from blob
let url = window.URL.createObjectURL(blob)
// Update video source
video.src = url

You can even visualize the progress of the fetch, which we decided to do.
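A sketch of how that could work: read the response stream manually and compare the bytes received against the Content-Length header (assuming the server reports one; updateProgress is a hypothetical UI callback):

// Fetch the video while tracking download progress
let response = await fetch('video.mp4')
let total = parseInt(response.headers.get('Content-Length'), 10)
let reader = response.body.getReader()
let chunks = []
let received = 0

// Read the stream chunk by chunk
while (true) {
  const { done, value } = await reader.read()
  if (done) break
  chunks.push(value)
  received += value.length
  // Report progress as a fraction between 0 and 1
  updateProgress(received / total)
}

// Rebuild the blob from the streamed chunks and update the video source
let blob = new Blob(chunks, { type: 'video/mp4' })
video.src = window.URL.createObjectURL(blob)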

The Life Fader

[Image: Yes, I made memes]

The UX of our experience feels very much like any other video player, with the exception of a unique control sitting in the middle of the footer: the Life Fader. The Life Fader allows the user to seamlessly transition from one dimension to another, or rather, from “Your Other Life” to “Ton Autre Vie.” In reality, this control adjusts the volume of the audio tracks and the additive blending of the videos. Let’s talk about each of these.

Audio

At all times, the instrumental track is playing at full volume, but the vocal tracks are being dynamically adjusted by the fader. When the fader sits in the middle position, both vocal tracks play at full volume. Positioning the fader to either side fades out the opposite vocal track while keeping the associated vocal track at full volume. Since the fader itself goes from 0.0 to 1.0, this just takes a little bit of JavaScript logic to handle.

// If fader is on the left (English) side
if (fade <= 0.5) {
  // Set english to full
  englishAudio.volume(1.0)
  // Set french to fraction
  frenchAudio.volume(fade / 0.5)
} else {
  // Set english to fraction
  englishAudio.volume(1.0 - ((fade - 0.5) / 0.5))
  // Set french to full
  frenchAudio.volume(1.0)
}

Similar logic is used to fade the song titles in the header.
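For illustration, assuming hypothetical englishTitle and frenchTitle elements, that might look like:

// Fade the header titles using the same math as the vocal tracks
if (fade <= 0.5) {
  englishTitle.style.opacity = 1.0
  frenchTitle.style.opacity = fade / 0.5
} else {
  englishTitle.style.opacity = 1.0 - ((fade - 0.5) / 0.5)
  frenchTitle.style.opacity = 1.0
}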

Video

Video is even simpler. Sorta. The “Ton Autre Vie” visual sits on top of the “Your Other Life” visual, and its opacity is adjusted by the fade value directly. This creates a simple additive blend between the two videos. Initially I thought I could handle this with a simple HTML canvas, but I noticed some environments having trouble with the globalAlpha property. So I instead looked towards a pair of Three.js VideoTextures and Planes fit to a dynamically sized canvas. Note: the video is being played from a hidden <video> tag.

First, we initialize a video texture and clone it for both videos, adjusting the repeat and offset accordingly to focus on a particular visual.

// Initialize video texture
const videoTexture = new THREE.VideoTexture(video)
// Set video texture wrapping
videoTexture.wrapS = THREE.RepeatWrapping
videoTexture.wrapT = THREE.RepeatWrapping
// Set video repeat to show half the stacked visual
videoTexture.repeat.set(1, 0.5)
// Clone into english video texture
const englishVideo = videoTexture.clone()
// Adjust english video texture to the top half
englishVideo.offset.set(0, 0.5)
// Clone into french video texture
const frenchVideo = videoTexture.clone()

Then we just need to use these video textures on a pair of Three.js Planes.

// Initialize plane geometry
const planeGeometry = new THREE.PlaneBufferGeometry(2, 2)
// Initialize english material
const englishMaterial = new THREE.MeshBasicMaterial({
  map: englishVideo,
  opacity: 1,
  transparent: true
})
// Initialize english plane
const englishPlane = new THREE.Mesh(planeGeometry, englishMaterial)
// Initialize french material
const frenchMaterial = new THREE.MeshBasicMaterial({
  map: frenchVideo,
  opacity: 0,
  transparent: true
})
// Initialize french plane
const frenchPlane = new THREE.Mesh(planeGeometry, frenchMaterial)
// Add french plane to scene
scene.add(frenchPlane)
// Add english plane to scene
scene.add(englishPlane)

Note, we’re using an Orthographic Camera with the following settings to make sure the planes stretch to fill the entire canvas.

// Initialize camera
this.camera = new THREE.OrthographicCamera(-1, 1, 1, -1, 0, 1)

Now, when we’re rendering the Three.js scene, we simply need to update the french material’s opacity to match the Life Fader’s fade value.

// Adjust french material opacity
frenchMaterial.opacity = this.fade
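In context, a minimal render loop might look like this sketch, assuming the renderer, scene, camera, and current fade value are in scope:

// Render the scene on every animation frame
const animate = () => {
  requestAnimationFrame(animate)
  // Keep the top plane's opacity in sync with the Life Fader
  frenchMaterial.opacity = fade
  renderer.render(scene, camera)
}
animate()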

The last (but very important) bit is to resize the Three.js canvas when the user resizes or changes the orientation of their device. We can easily listen for a resize on the window.

// Listen for window resizing
window.addEventListener('resize', resize)

And when this happens, we’ll check the height and width of the canvas’ parent element and update the Three.js renderer accordingly. That parent element just so happens to be a div element set to full width, which uses the fancy new CSS aspect-ratio property to make sure it is always the right ratio for our video (1920 / 798). Here’s the resize method.

// Get parent element size
let { width, height } = canvas.parentElement.getBoundingClientRect()
// Update renderer size
renderer.setSize(width, height)
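For reference, the parent element’s CSS boils down to something like this sketch (the class name is hypothetical):

.player {
  width: 100%;
  aspect-ratio: 1920 / 798;
}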

Now, I’d be lying if I said this didn’t come with other issues. For example, when an iPhone is tilted into landscape mode, it fires the resize event, but the reported width and height are incorrect. This is, apparently, a known bug. Because of this, I’ve had to check for resizing more frequently than the window’s event listener alone can provide. It’s a hack, but it works.
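That hack could be as simple as polling the parent element’s size on an interval alongside the resize listener. A sketch, not necessarily the exact approach used:

// Also poll for size changes the resize event misses on iOS
let lastWidth = 0
let lastHeight = 0

setInterval(() => {
  let { width, height } = canvas.parentElement.getBoundingClientRect()
  // Only resize the renderer when the size actually changed
  if (width !== lastWidth || height !== lastHeight) {
    lastWidth = width
    lastHeight = height
    renderer.setSize(width, height)
  }
}, 500)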

Design

One last thing on the Life Fader: its design and construction. The fader is actually just a standard HTML range input styled via CSS. The same is true of the player’s seek bar. I’ve found this CSS-Tricks article extremely useful for designing cross-browser compatible range inputs.
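As a tiny taste of what that styling involves, here’s a minimal WebKit-only sketch (Firefox needs matching ::-moz-range-thumb rules):

input[type='range'] {
  -webkit-appearance: none;
  width: 100%;
  background: transparent;
}

input[type='range']::-webkit-slider-thumb {
  -webkit-appearance: none;
  width: 24px;
  height: 24px;
  border-radius: 50%;
  background: white;
}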

Conclusion

[Image: The “Your Other Life” and inverted “Ton Autre Vie” artwork]

In the end, this experience consists of two range inputs, one <video> tag, three web audio objects, a couple of buttons, and an HTML <canvas>. This is why I love the web. All of these elements are part of the foundation of browser technology. We just worked very hard to put them in this magical arrangement to benefit these two incredible videos. I learned A LOT about HTML video events, cross-browser media capabilities, and other things that didn’t even make the final build. I hope this inspires more unconventional media players in the future.

I find these projects which involve a unique consumption of an artist’s music to be the most stressful and most fulfilling. This app joins the Dave Grohl Play player, R.E.M. Monster player, and Me Rex Megabear player as opportunities I will never forget.

Thanks

[Image: Lord Huron]

I have to once again thank Elliott Althoff, Alexander Coslov, and Tim Hrycyshyn from Republic Records for making me part of this. Shout out to Rich Cohen and all of LoyalT Management for their support. Special thanks to Adam for nailing these visuals and helping keep everything in sync. I think we know what we’re doing now. Be sure to check out Adam’s post on Instagram to see everyone who was involved in the shoot. Finally, thanks to Ben and Lord Huron for following up Anne’s and my album of the year with another incredible expression of art and music. We’ve now done three projects together and I look forward to many more. May we build until we die.

