gatsbyjs / gatsby


How to populate a React Context File with MDX data and asset files from a GraphQL query? #18404

Closed rchrdnsh closed 4 years ago

rchrdnsh commented 4 years ago

So, I have a working persistent music player on a project I am working on, thanks to this article:

https://upmostly.com/tutorials/how-to-use-the-usecontext-hook-in-react?unapproved=15591&moderation-hash=88f22760754aa9ff30643d45bc4c41eb#comment-15591

And it's just great. But the information for the music is contained in the context file, like so:

const MusicPlayerContext = React.createContext([{}, () => {}]);

const MusicPlayerProvider = (props) => {

  const [state, setState] = useState({
    audioPlayer: new Audio(),
    tracks: [
      {
        name: 'Baktun',
        artist: 'RYKR',
        file: Baktun,
        artwork: BaktunArtwork
      },
      {
        name: 'Bash',
        artist: 'RYKR',
        file: Bash,
        artwork: BashArtwork
      },
      {
        name: 'Frost',
        artist: 'RYKR',
        file: Frost,
        artwork: FrostArtwork
      },
      {
        name: 'Greyskull',
        artist: 'RYKR',
        file: Greyskull,
        artwork: GreyskullArtwork
      },
      {
        name: 'Spiral Up',
        artist: 'RYKR',
        file: SpiralUp,
        artwork: SpiralUpArtwork
      }  
    ],
    currentTrackIndex: null,
    isPlaying: false,
  })

  return (
    <MusicPlayerContext.Provider value={[state, setState]}>
      {props.children}
    </MusicPlayerContext.Provider>
  )

}

export { MusicPlayerContext, MusicPlayerProvider }

...but rather than manually inputting all this information into the context file via a JS object, I would rather store the music in folders, with an MDX file per song plus the artwork and audio file in there as well, and then inject that information into the context file, I assume via GraphQL.

I don't really know how to go about doing that, however, so any guidance or examples would be great, thank you :-)

I'm currently importing all of the audio and artwork files manually into the context file and manually inputting the track information as well, which is neither optimal nor scalable.
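Roughly, I'm imagining each piece living in its own folder, with an MDX file whose frontmatter points at the audio and artwork sitting next to it. A hypothetical sketch of one such MDX file (the field values here are made up, though the field names match what I'd want to query):

---
name: Frost
artist: RYKR
genre: Electronic
bpm: 120
artwork: ./frost.jpg
audio: ./frost.mp3
---

The article about the track goes here...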

rchrdnsh commented 4 years ago

ooooooooh...that clears quite a few things up in my head :-)

rchrdnsh commented 4 years ago

ok, so now I'm trying to figure out how to get the play button on the individual track pages to work properly...

So, currently, there is a grid of pieces of music:

[Screenshot: Screen Shot 2019-12-18 at 11 39 46 AM]

...and when the user clicks the play button, that piece will play. If a user clicks on the learn more button, they are taken to a programmatically generated MDX page with an article describing the piece and how it was created:

[Screenshot: Screen Shot 2019-12-18 at 11 41 01 AM]

...where there is also a play button in the title card, which I would like to link to the play button in the grid view of all the pieces. If the user clicks the play button in the grid, then clicks the learn more button and navigates to the article explaining the piece, the play button there should reflect the current playback state, meaning it would show the pause icon. But if the user did not click the play button and instead simply navigated to the individual article page, the button would show the play icon, and clicking it would play that track.

So, I'm starting to do this by literally copying and pasting the play button from the tracklist component over into the article template:

const Track = ({ data }) => {

  const {
    currentTrackName,
    playTrack,
    isPlaying,
  } = useMusicPlayerState()

  const { frontmatter, timeToRead, body } = data.mdx

  return (
    <MDXProvider components={{ ...components }}>
      <ArticleBlock>
        <TitleBlock>
          <TitleText>
            <Title>{frontmatter.name}</Title>
            <Subtitle>{frontmatter.artist}</Subtitle>
            <Subtitle>{frontmatter.genre}</Subtitle>
            <Subtitle>{frontmatter.bpm}</Subtitle>
            {/* <Subtitle>{frontmatter.description}</Subtitle> */}
          </TitleText>
          <PlayButton
            whileHover={{ scale: 1.1 }}
            whileTap={{ scale: 0.9 }}
            onClick={() => playTrack()}
          >
            {currentTrackName === frontmatter.name && isPlaying ? (
              <img src={PauseButtonIcon} alt="Pause Button" />
            ) : (
              <img src={PlayButtonIcon} alt="Play Button" />
            )}
          </PlayButton>
          <Artwork fluid={frontmatter.artwork.childImageSharp.fluid} alt="cool stuff." />
        </TitleBlock>
        <TitleScrim />
        <TitleImage fluid={frontmatter.artwork.childImageSharp.fluid} alt="cool stuff." />
        <Content>
          <MDXRenderer>{body}</MDXRenderer>
        </Content>
      </ArticleBlock>
    </MDXProvider>
  )
}

...but when I click the play button I get the following error message:

in Chrome:

[Screenshot: Screen Shot 2019-12-18 at 11 45 45 AM]

...and in Safari:

[Screenshot: Screen Shot 2019-12-18 at 11 46 23 AM]

...so I'm thinking I need to tell it to select that particular track to be connected to the play button, but I'm not sure how to go about doing that.

universse commented 4 years ago

The problem is you are not passing the index to playTrack. To do that, you need to associate each individual track page with the correct index, and pass that index to the playTrack click handler of the play button.

Maybe you can do something like

// HERE get slug from pageContext
const Track = ({ data, pageContext: { slug } }) => {

  const {
    currentTrackName,
    playTrack,
    isPlaying,
    trackList
  } = useMusicPlayerState()

  const { frontmatter, timeToRead, body } = data.mdx
  const index = trackList.findIndex(track => track.slug === slug) // HERE - find the correct track index

  return (
    <MDXProvider components={{ ...components }}>
      <ArticleBlock>
        <TitleBlock>
          <TitleText>
            <Title>{frontmatter.name}</Title>
            <Subtitle>{frontmatter.artist}</Subtitle>
            <Subtitle>{frontmatter.genre}</Subtitle>
            <Subtitle>{frontmatter.bpm}</Subtitle>
            {/* <Subtitle>{frontmatter.description}</Subtitle> */}
          </TitleText>
          <PlayButton
            whileHover={{ scale: 1.1 }}
            whileTap={{ scale: 0.9 }}
            onClick={() => playTrack(index)} // HERE
          >
            {currentTrackName === frontmatter.name && isPlaying ? (
              <img src={PauseButtonIcon} alt="Pause Button" />
            ) : (
              <img src={PlayButtonIcon} alt="Play Button" />
            )}
          </PlayButton>
          <Artwork fluid={frontmatter.artwork.childImageSharp.fluid} alt="cool stuff." />
        </TitleBlock>
        <TitleScrim />
        <TitleImage fluid={frontmatter.artwork.childImageSharp.fluid} alt="cool stuff." />
        <Content>
          <MDXRenderer>{body}</MDXRenderer>
        </Content>
      </ArticleBlock>
    </MDXProvider>
  )
}

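One thing the above assumes: pageContext.slug only exists if the slug is passed as context when the page is created in gatsby-node.js, something like this (a sketch; the template path is assumed):

// gatsby-node.js
exports.createPages = async ({ graphql, actions }) => {
  const { createPage } = actions
  const result = await graphql(`
    {
      allMdx(filter: { fileAbsolutePath: { regex: "/content/music/" } }) {
        edges {
          node {
            fields {
              slug
            }
          }
        }
      }
    }
  `)

  result.data.allMdx.edges.forEach(({ node }) => {
    createPage({
      path: node.fields.slug,
      component: require.resolve('./src/templates/track.js'), // assumed path
      context: { slug: node.fields.slug }, // becomes props.pageContext.slug in the template
    })
  })
}
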
rchrdnsh commented 4 years ago

wow, so that...just...worked! Super awesome!

Now I'm working on the inline audio samples component again and I'm running into a couple issues. So far I have this:

import React from 'react'
import styled from 'styled-components'
import { motion } from 'framer-motion'

import PlayButton from '../images/controls/play-button.svg'
import PauseButton from '../images/controls/pause-button.svg'

import { useMusicPlayerState } from '../player/MusicPlayerContext'

const Button = styled(motion.button)`
  // styles n' stuff...
`

const AudioBlock = styled.div`
  // styles n' stuff...
`

const AudioText = styled.p`
    // styles n' stuff...
`

function Audio ({ src, children }) {
  const {
    currentTrackName,
    playArticleTrack,
    isPlaying,
  } = useMusicPlayerState()

  const articleTrackName = src;

  return (
    <AudioBlock>
      <Button
        whileHover={{ scale: 1.13 }}
        whileTap={{ scale: 0.9 }}
        onClick={() => playArticleTrack(src) }
        src={src}
      >
        { currentTrackName === articleTrackName && isPlaying ? (
          <img src={PauseButton} alt="Pause Button" />
        ) : (
          <img src={PlayButton} alt="Play Button" />
        )}
      </Button>
      <AudioText>{children}</AudioText>
    </AudioBlock>
  )
}

export default Audio

...where I'm trying to pass in the src as an audio file, and have the play button function as it always has in other instances.

Issue 1:

Issue 2:

Issue 3:

Issue 4 (actually, a more global issue, possibly):

rchrdnsh commented 4 years ago

so, I'm trying to amend the playArticleTrack() function like this:

  function playArticleTrack(src) {
    if (src === state.currentTrackIndex) {
      togglePlay()
    } else {
      state.audioPlayer.pause()

      const audioPlayer = new Audio(src)

      audioPlayer.play()

      setState(state => ({
        ...state,
        currentTrackIndex: src,
        isPlaying: true,
        audioPlayer,
      }))
    }
  }

...so my Audio.js element can look like this, roughly:

const Audio = ({ src, children }) => {
  const {
    currentTrackName,
    playArticleTrack,
    isPlaying,
  } = useMusicPlayerState()

  const articleTrackName = src // needed by the comparison below; was missing from this rough version

  return (
    <Box>
      <AudioBlock>
        <Button
          whileHover={{ scale: 1.13 }}
          whileTap={{ scale: 0.9 }}
          onClick={() => playArticleTrack(src) }
          // onClick={togglePlay}
          src={src}
        >
          { currentTrackName === articleTrackName && isPlaying ? (
            <img src={PauseButton} alt="Pause Button" />
          ) : (
            <img src={PlayButton} alt="Play Button" />
          )}
        </Button>
        <AudioText>{children}</AudioText>
      </AudioBlock>
    </Box>
  )
}

...so it can be this in an MDX file:

<Audio src={Groove}>Example Groove That is Super Duper Awesome!</Audio>

...and this in the app itself:

[Screenshot: Screen Shot 2019-12-29 at 8 03 03 PM]

...but it's not working right yet. When I had it a little simpler I couldn't get the pause button to show up while a sample was playing, and now that I'm trying to mirror the playTrack() function more closely, I'm not sure how to add the new srcs to the track index, I guess...also, this is currently happening with the current version of the playArticleTrack() function:

[Screenshot: Screen Shot 2019-12-29 at 10 27 53 PM]

...but I'm not sure why that function is interested in that line of code.

universse commented 4 years ago

The problem is that src is a string ("someurl.com/name.mp3") while currentTrackIndex is a number. currentTrack, which is trackList[currentTrackIndex], will be undefined when you use src as the currentTrackIndex. That's why the line above cannot be evaluated: you are reading the audio property of undefined.

From what I can see, you need to refactor quite a big chunk of your music player functionality to handle this new use case.
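To spell the mismatch out (example values made up):

// currentTrackIndex is normally a number:
trackList[2] // a track object
// but playArticleTrack stored the src string in currentTrackIndex:
trackList['someurl.com/name.mp3'] // undefined
// so anything that reads currentTrack.audio is reading a property of undefined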

rchrdnsh commented 4 years ago

yeah, that's what I was thinking as well...I think for now I can just use the <audio/> element, as it basically works, and then come back around to this later.

I have another, more pressing issue, however. I cannot get the project to successfully build using gatsby build, as new Audio() is not supported during SSR builds, so I need to figure out how to get around that so that the app can actually work outside of gatsby develop mode.

Here is the related issue:

https://github.com/gatsbyjs/gatsby/issues/19522

I asked @DSchau on twitter the other day and he said this:

[Screenshot: Screen Shot 2020-01-04 at 11 26 43 AM]

...so I'm currently trying to understand what he means, but I have many holes in my hooks knowledge as well as my general knowledge.

What I don't understand is how to set state inside of another hook...like...am I using useState inside of useEffect?

Going by what he said, this is all I can come up with:

 useEffect(() =>{
    [],
    const [client, isClient] = useState();
  })

...obviously this is wrong and doesn't work and is full of errors, but I'm not sure what he means with regard to how to set it up, or how the syntax should work. If you have any thoughts, they, of course, would be much appreciated :-)

universse commented 4 years ago

You can try this

const [state, setState] = useState({
    audioPlayer: {}, // instead of new Audio()
    currentTrackIndex: null,
    isPlaying: false,
})
rchrdnsh commented 4 years ago

Hi @universse XD

So, @DSchau and @jlengstorf recommended using an isClient conditional approach to make this app actually build, and I am currently trying to make that approach work. They recommended doing something along these lines, in which I create a state hook to track whether the code is running on the client or not:

  // ---------------------------------------------------------
  // Establishing if the code is running on the client
  // ---------------------------------------------------------

  // state hook for setting a starting state for whether or not this code is running on the client
  const [isClient, setIsClient] = useState(false)

  // useEffect hook to set the client boolean to true
  useEffect(() => {
    setIsClient(true)
  }, [])

...then set state for the new Audio() HTMLAudioElement using the setState function, like so:

  // state hook for setting a starting state for the Audio() constructor which creates a new HTMLAudioElement
  const [state, setState] = useState({}) 

  // conditional statement to set up a few different things if the current code is being run on the client.
  if (isClient) {
    setState({
      // a key-value pair creating a new Audio() HTMLAudioElement using the constructor function
      audioPlayer: new Audio(),
      currentTrackIndex: null,
      isPlaying: false,
    })
  }

...but this only ended up making other things break (calling setState unconditionally during render just triggers another render, looping forever), so I am now trying to wrap as many things as needed in isClient conditionals, and I keep moving on to new errors each time I wrap something else, like setting the currentTime, for example:

  // ---------------------------------------------------------
  // Setting Current Time 
  // ---------------------------------------------------------

  // A useState hook that will read and write the current time of the HTMLAudioElement
  const [currentTime, setCurrentTime] = useState()
  // if the code is running on the client then set the current time
  if (isClient) {
    setCurrentTime(state.audioPlayer.currentTime)
  }

  // This const declaration uses the getTime() function to properly format the current time of the currently playing audio file. If no audio is playing then the current time will display 0:00
  const formattedTime = getTime(currentTime) || `0:00`

...but even though I did that I'm still getting error messages like this:

[Screenshot: Screen Shot 2020-01-08 at 6 33 43 PM]

...which is odd since that is what I just wrapped with an isClient conditional.

Anyway, the following is where my AudioContext file is at now...if you can see anything I need to fix using this approach, it would be much appreciated. @DSchau and @jlengstorf as well if you have the time or energy to take a look XD

// ---------------------------------------------------------
// Audio Context
// ---------------------------------------------------------

import React, { useState, useMemo, useContext, useEffect } from 'react'
import { useStaticQuery, graphql } from 'gatsby'

const AudioContext = React.createContext([{}, () => {}])

const AudioProvider = props => {

  // ---------------------------------------------------------
  // Data Querying
  // ---------------------------------------------------------

  // This is a static query to get all of the information, audio, and artwork for the pieces of music, which are stored as folders containing MDX files with frontmatter plus any associated files, like audio and image files. It queries both the track information and the associated assets at once, since only one static query is allowed per file.
  const { tracks, assets } = useStaticQuery(graphql`
    query Tracks {
      tracks: allMdx(filter: { fileAbsolutePath: { regex: "/content/music/" } }
      ) {
        edges {
          node {
            fields {
              slug
            }
            frontmatter {
              name
              artist
              genre
              bpm
              # ADD BASE HERE
              artwork {
                base
                childImageSharp {
                  fluid(maxWidth: 1000) {
                    ...GatsbyImageSharpFluid
                  }
                }
              }
              alt
              description
              release(formatString: "MMMM Do, YYYY")
              audio {
                absolutePath
                base
              }
            }
          }
        }
      }

      # query all mp3 and jpg files from /content/music/
      assets: allFile(
        filter: {
          extension: { in: ["mp3", "jpg"] }
          absolutePath: { regex: "/content/music/" }
        }
      ) {
        edges {
          node {
            publicURL
            relativePath
          }
        }
      }
    }
  `)

  // This uses the map function to create an array of objects, where each object is a track with all its information and assets. We use `useMemo` to avoid re-computation when the state changes, as this information is all static.
  const trackList = useMemo(
    () =>
      tracks.edges.map(track => {
        const slug = track.node.fields.slug
        const { frontmatter } = track.node
        const {
          name,
          artist,
          genre,
          bpm,
          artwork,
          alt,
          description,
          audio,
        } = frontmatter

        return { slug, name, artist, genre, bpm, artwork, alt, description, audio }
      }),
    [tracks]
  )

  // ---------------------------------------------------------
  // Utility Functions
  // ---------------------------------------------------------

  // This is a function to turn a file name, which is a string, into a new string without the file extension at the end. For example: frost.mp3 -> frost
  function basename(name) {
    return name.slice(0, name.lastIndexOf('.'))
  }

  // A function that transforms the audio.currentTime value into human readable minutes and seconds
  function getTime(time) {
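    // e.g. getTime(75) -> '1:15'; getTime(NaN) -> undefined, so callers can fall back to '0:00'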
    if (!isNaN(time)) {
      return Math.floor(time / 60) + ':' + ('0' + Math.floor(time % 60)).slice(-2)
    }
  }

  // ---------------------------------------------------------
  // Establishing if the code is running on the client
  // ---------------------------------------------------------

  // state hook for setting a starting state for whether or not this code is running on the client
  const [isClient, setIsClient] = useState(false)

  // useEffect hook to set the client boolean to true
  useEffect(() => {
    setIsClient(true)
  }, [])

  // ---------------------------------------------------------
  // Creating the Audio Context
  // ---------------------------------------------------------

  // -------- ORIGINAL VERSION -------- //

  // using a useState hook to set the state of the audio player, including the audioPlayer itself, the currentTrackIndex and the isPlaying boolean
  // const [state, setState] = useState({
  //   audioPlayer: new Audio(),
  //   // audioPlayer: {}, // instead of new Audio()
  //   // don't really need trackList in state
  //   // tracks: trackList,
  //   currentTrackIndex: null,
  //   isPlaying: false,
  // })

  // -------- END ORIGINAL VERSION -------- //

  // state hook for setting a starting state for the Audio() constructor which creates a new HTMLAudioElement
  const [state, setState] = useState({}) 

  // conditional statement to set up a few different things if the current code is being run on the client.
  if (isClient) {
    setState({
      // a key-value pair creating a new Audio() HTMLAudioElement using the constructor function
      audioPlayer: new Audio(),
      currentTrackIndex: null,
      isPlaying: false,
    })
  }

  // ---------------------------------------------------------
  // Setting Current Time 
  // ---------------------------------------------------------

  // -------- ORIGINAL VERSION -------- //

  // A useState hook that will read and write the current time of the HTMLAudioElement
  // const [currentTime, setCurrentTime] = useState(state.audioPlayer.currentTime)

  // This const declaration uses the getTime() function to properly format the current time of the currently playing audio file. If no audio is playing then the current time will display 0:00
  // const formattedTime = getTime(currentTime) || `0:00`

  // -------- END ORIGINAL VERSION -------- //

  // A useState hook that will read and write the current time of the HTMLAudioElement
  const [currentTime, setCurrentTime] = useState()
  // if the code is running on the client then set the current time
  if (isClient) {
    setCurrentTime(state.audioPlayer.currentTime)
  }

  // This const declaration uses the getTime() function to properly format the current time of the currently playing audio file. If no audio is playing then the current time will display 0:00
  const formattedTime = getTime(currentTime) || `0:00`

  // ---------------------------------------------------------
  // Setting Duration 
  // ---------------------------------------------------------

  // -------- ORIGINAL VERSION -------- //

  // This const declaration uses the getTime() function to properly format the duration of the currently playing audio file. If no audio is playing then the duration will display 0:00
  // const formattedDuration = getTime(state.audioPlayer.duration) || `0:00`

  // This const declaration uses the getTime() function to properly format the duration of the currently playing audio file. If no audio is playing then the duration will display 0:00
  // const formattedDuration = getTime(currentDuration) || `0:00`

  // -------- END ORIGINAL VERSION -------- //

  // A useState hook that will read and write the duration of the HTMLAudioElement
  const [currentDuration, setCurrentDuration] = useState()
  // if the code is running on the client then set the current duration
  if (isClient) {
    setCurrentDuration(state.audioPlayer.duration)
  }

  // This const declaration uses the getTime() function to properly format the duration of the currently playing audio file. If no audio is playing then the duration will display 0:00
  const formattedDuration = getTime(currentDuration) || `0:00`

  // ---------------------------------------------------------
  // Setting Time
  // ---------------------------------------------------------

  // useEffect hook that will reset currentTime to 0 when state.audioPlayer changes
  useEffect(() => {
    setCurrentTime(0)
  }, [state.audioPlayer])

  useEffect(() => {
    // if isPlaying is true, this useEffect hook starts the currentTime timer, ticking every second (1000 ms)
    if (state.isPlaying) {
      const intervalId = setInterval(() => {
        setCurrentTime(currentTime => currentTime + 1)
      }, 1000)

      return () => {
        // clear the interval when paused, i.e. when state.isPlaying becomes false
        clearInterval(intervalId)
      }
    }
  }, [state.isPlaying])

  // convert to obj for fast lookup
  const assetObj = useMemo(
    () =>
      assets.edges.reduce((obj, file) => {
        const { publicURL, relativePath } = file.node
        obj[relativePath] = publicURL
        return obj
      }, {}),
    [assets]
  )

  // This function will Toggle between play or pause
  function togglePlay() {
    if (state.isPlaying) {
      state.audioPlayer.pause()
    } else {
      state.audioPlayer.play()
    }
    // call setState with an updater function: spread in the current state, then flip the isPlaying boolean
    setState(state => ({ ...state, isPlaying: !state.isPlaying }))
  }

  // This function plays the current track
  function playTrack(index) {
    if (index === state.currentTrackIndex) {
      togglePlay()
    } else {
      state.audioPlayer.pause()

      const base = trackList[index].audio.base // frost.mp3
      const baseName = basename(base) // frost

      // new Audio() does not support relative path
      // hence the need for window.location.origin
      const audioPlayer = new Audio(
        `${window.location.origin}${assetObj[`${baseName}/${base}`]}`
      ) // new Audio('http://www.domain.com/static/frost-[hash].mp3')

      audioPlayer.play()
      setState(state => ({
        ...state,
        currentTrackIndex: index,
        isPlaying: true,
        audioPlayer,
      }))
    }
  }

  // Play the previous track in the tracks array
  function playPreviousTrack() {
    // subtract one, then add trackList.length before the final modulo so the index wraps around to the last track instead of going negative
    const newIndex =
      ((state.currentTrackIndex - 1) % trackList.length + trackList.length) %
      trackList.length
    playTrack(newIndex)
  }

  // Play the next track in the tracks array
  function playNextTrack() {
    const newIndex = (state.currentTrackIndex + 1) % trackList.length
    playTrack(newIndex)
  }

  // This function is for pausing the main music that is being played and play the audio clip that has been selected somewhere in the application
  function playArticleTrack(src) {
    state.audioPlayer.pause()
    const audioPlayer = new Audio(src)
    audioPlayer.play()
    setState(state => ({
      ...state,
      currentTrackIndex: null, // I assume article tracks are not parts of the original track list
      isPlaying: true,
      audioPlayer,
    }))
  }

  // function playArticleTrack(src) {
  //   if (src  === state.currentTrackIndex)
  //   {
  //     togglePlay()
  //   } else {

  //     state.audioPlayer.pause()

  //     const audioPlayer = new Audio(src)

  //     audioPlayer.play()

  //     setState(state => ({
  //       ...state,
  //       currentTrackIndex: src,
  //       isPlaying: true,
  //       audioPlayer,
  //     }))
  //   }
  // }

  // function playArticleTrack(src) {

  //   state.audioPlayer.pause()

  //   const audioPlayer = new Audio(src)

  //   if (src === state.currentTrackIndex) {
  //     togglePlay()
  //   } else {
  //     state.audioPlayer.pause()
  //   }

  //   setState(state => ({
  //     ...state,
  //     // currentTrackIndex: 0, // I change to 0 instead
  //     // currentTrackIndex: null, // Trying null and it seems to work, although the player keeps going after it's done.
  //     currentTrackIndex: src,
  //     isPlaying: true,

  //     audioPlayer,
  //   }))
  // }

  let currentTrackName
  let currentTrackArtist
  let currentTrackArtwork
  let currentTrackAudio

  // simplify things a bit
  if (isClient && state.currentTrackIndex !== null) {
  // if (state.currentTrackIndex !== null) {
    const { currentTrackIndex } = state
    const currentTrack = trackList[currentTrackIndex]

    const base = currentTrack.audio.base // frost.mp3
    const baseName = basename(base) // frost

    currentTrackName = currentTrack.name
    currentTrackArtist = currentTrack.artist
    currentTrackArtwork = assetObj[`${baseName}/${currentTrack.artwork.base}`] // assetObj['frost/frost.jpg']
    currentTrackAudio = assetObj[`${baseName}/${currentTrack.audio.base}`] // assetObj['frost/frost.mp3']
  }

  return (
    <AudioContext.Provider
      value={{
        playTrack,
        playArticleTrack,
        togglePlay,
        currentTrackName,
        currentTrackArtist,
        currentTrackArtwork,
        currentTrackAudio,
        currentTime,
        // setCurrentTime to be used by ProgressSlider
        setCurrentTime,
        formattedTime,
        currentDuration,
        setCurrentDuration,
        formattedDuration,
        // volume,
        audioPlayer: state.audioPlayer,
        trackList,
        isPlaying: state.isPlaying,
        playPreviousTrack,
        playNextTrack,
      }}
    >
      {props.children}
    </AudioContext.Provider>
  )
}

// access global state from MusicPlayerContext
function useAudioState() {
  return useContext(AudioContext)
}

export { useAudioState, AudioProvider }
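For completeness, the provider is wrapped around the whole app so the player state survives page navigations. In Gatsby that's done with wrapRootElement, along these lines (a sketch; the import path is an assumption about my file layout):

// gatsby-browser.js (mirrored in gatsby-ssr.js)
import React from 'react'
import { AudioProvider } from './src/player/AudioContext'

export const wrapRootElement = ({ element }) => (
  <AudioProvider>{element}</AudioProvider>
)
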
universse commented 4 years ago

The isClient approach is fine but in your case it's overcomplicating things a little bit. You don't need new Audio() as the initial state. An empty {} as suggested above is fine. You can try and see if it builds.

rchrdnsh commented 4 years ago

hmmm...I tried that before but it didn't work...trying it again, the play buttons don't work and then I get this error message:

[Screenshot: Screen Shot 2020-01-08 at 8 24 16 PM]

also, I don't quite understand how one can have audio without Audio()...

So, to clarify, it seems to build, but then it doesn't work.

jlengstorf commented 4 years ago

you can simplify a bit:

  const [state, setState] = useState({}) 

  useEffect(() => {
    setState({
      // a key-value pair creating a new Audio() HTMLAudioElement using the constructor function
      audioPlayer: new Audio(),
      currentTrackIndex: null,
      isPlaying: false,
    })
  }, [])

the error you're getting looks like it's not initializing once it loads in the browser

you can't play audio without a browser, and when Gatsby is building, there's no client — it's server-side rendering — which means Audio and other browser APIs don't exist

so what you're doing is saying, "build the site without the Audio player, then initialize the Audio player once the site loads in a browser"

does that make sense?
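the same idea can also be written as an explicit guard, if that's easier to reason about (just a sketch, the helper name is made up):

// only construct a player in a real browser; during the SSR build there is no window, so no Audio
function createAudioPlayer() {
  if (typeof window === 'undefined') return {}
  return new Audio()
}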

rchrdnsh commented 4 years ago

Hi @jlengstorf, thank you for taking some time to help me out :-)

So, I understand what you are saying conceptually very clearly, it's just how to go about making it work in the app, which is very challenging for me.

I copied and pasted your code snippet from the previous comment verbatim and got this error message:

[Screenshot: Screen Shot 2020-01-08 at 11 29 23 PM]

...so I'm thinking that I clearly need to do one or (many) more things to refactor the code to make it work inside of Gatsby and get the code past the SSR build step.

So now I move down the page and add a useEffect hook to the currentTime declaration, like so:

  // -------- `useEffect` VERSION -------- //

  // A useState hook that will read and write the current time of the HTMLAudioElement
  const [currentTime, setCurrentTime] = useState()
  // useEffect hook to set the currentTime when the code is running in the client
  useEffect(() => {
    setCurrentTime(state.audioPlayer.currentTime)
  }, [])

  // This const declaration uses the getTime() function to properly format the current time of the currently playing audio file. If no audio is playing then the current time will display 0:00
  const formattedTime = getTime(currentTime) || `0:00`

  // -------- End `useEffect` VERSION -------- //

...and now I get this error message:

[Screenshot: Screen Shot 2020-01-08 at 11 35 08 PM]

...so then I move down the page and add a useEffect hook to the duration declaration, like so:

  // -------- `useEffect` VERSION -------- //

  // A useState hook that will read and write the duration of the HTMLAudioElement
  const [currentDuration, setCurrentDuration] = useState()
  // useEffect hook to set the currentDuration when the code is running in the client
  useEffect(() => {
    setCurrentDuration(state.audioPlayer.duration)
  }, [])

  // This const declaration uses the getTime() function to properly format the duration of the currently playing audio file. If no audio is playing then the duration will display 0:00
  const formattedDuration = getTime(currentDuration) || `0:00`

  // -------- End `useEffect` VERSION -------- //

...and get this error message:

[Screenshot: Screen Shot 2020-01-08 at 11 58 51 PM]

...so now I move down the page and try adding a useEffect hook to the if statement that this line of code is contained within, like so:

// -------- `useEffect` VERSION -------- //

  useEffect(() => {
    if (state.currentTrackIndex !== null) {
      const { currentTrackIndex } = state
      const currentTrack = trackList[currentTrackIndex]

      const base = currentTrack.audio.base // frost.mp3
      const baseName = basename(base) // frost

      currentTrackName = currentTrack.name
      currentTrackArtist = currentTrack.artist
      currentTrackArtwork = assetObj[`${baseName}/${currentTrack.artwork.base}`] // assetObj['frost/frost.jpg']
      currentTrackAudio = assetObj[`${baseName}/${currentTrack.audio.base}`] // assetObj['frost/frost.mp3']
    }
  },[])

  // -------- END `useEffect` VERSION -------- //

...and now I get the same error message that I started with!

[Screenshot: Screen Shot 2020-01-08 at 11 59 25 PM]

...only this time I already have a useEffect wrapped around the currentTime declaration, so...............I mean.............................I............................just................................................

huh

uhhhhh

...maybe I need to wrap a useEffect hook inside another useEffect, possibly creating an infinite regression of useEffects...? (I think that is humor, although it's hard for me to tell at this point ;-P)

...so I'm not exactly sure what to do next to make it all the way to the finish line, or what error message is waiting for me there, although getting the same error message after having seemingly solved that issue is stumping me at this point.

universse commented 4 years ago

Can you try this

const [state, setState] = useState({
  audioPlayer: {},
  currentTrackIndex: null,
  isPlaying: false,
})

const [currentTime, setCurrentTime] = useState(state.audioPlayer.currentTime)

useEffect(() => {
  setState({
    audioPlayer: new Audio(),
    currentTrackIndex: null,
    isPlaying: false,
  })
}, [])

const formattedTime = getTime(currentTime) || `0:00`
const formattedDuration = getTime(state.audioPlayer.duration) || `0:00`

......

let currentTrackName
let currentTrackArtist
let currentTrackArtwork
let currentTrackAudio

if (state.currentTrackIndex !== null) {
  const { currentTrackIndex } = state
  const currentTrack = trackList[currentTrackIndex]

  const base = currentTrack.audio.base // frost.mp3
  const baseName = basename(base) // frost

  currentTrackName = currentTrack.name
  currentTrackArtist = currentTrack.artist
  currentTrackArtwork = assetObj[`${baseName}/${currentTrack.artwork.base}`]
  currentTrackAudio = assetObj[`${baseName}/${currentTrack.audio.base}`]
}

.......
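The reason this builds: with audioPlayer starting out as a plain {}, every browser-only read is just undefined during SSR, and getTime already tolerates that:

// during the build, state.audioPlayer is {}:
state.audioPlayer.duration     // undefined
getTime(undefined)             // isNaN(undefined) is true -> returns undefined
getTime(undefined) || `0:00`   // '0:00'
// new Audio() only ever runs inside useEffect, i.e. in the browser after hydration
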
rchrdnsh commented 4 years ago

Holy Shit!!!!!

https://rykr.netlify.com

Thank you @universse !!!!!

...and thank you @jlengstorf and @DSchau for taking time out to help me as well. Learning so much these days from giving people like yourselves :-)

DSchau commented 4 years ago

@rchrdnsh I hardly helped that was all @jlengstorf. Thanks for the kind words and thanks for using Gatsby 💜

rchrdnsh commented 4 years ago

Hi @DSchau XD,

With all due respect to @jlengstorf (and he certainly deserves maaaad respect) it was actually @universse's solution that ended up working for me...he is also generally a badass as well and very generous and giving of his time and expertise...dunno if he needs work and/or if Gatsby is hiring, but he is a legend nonetheless, at least to me :-)

But in the end, it is everybody together as a collective open source hive mind that really helps push open source forward, which is super awesome, and I couldn't do what I have done so far without everybody that has taken the time to help me out, including yourself and @jlengstorf and @universse and everybody else who gives of themselves to push web technology forward :-)

rchrdnsh commented 4 years ago

Actually, found a new issue @universse and @DSchau and @jlengstorf , but only with the live production version of the app running live on the web via netlify...

When I'm running the app in gatsby develop mode locally the tracks play instantly (of course)

When I gatsby build the app and then gatsby serve it, the tracks play instantly (makes sense)

When I upload the site to Netlify and run it in production at https://rykr.netlify.com, the tracks take roughly 8-10 seconds before they play, even though the progress bar starts moving right away. I thought at first that they were just taking some time to load, but it also happens when I replay a track that has been previously played, so I'm not sure where the issue even lies...is it a Gatsby issue? a Netlify issue? an HTML audio issue? all of the above?

Any thoughts would be much appreciated. Googling the issue has not helped much yet...

jlengstorf commented 4 years ago

sounds like a loading issue to me — I'd try uploading plain HTML/vanilla JS to Netlify to see if the same delay happens. if that loads fast, then you can add Gatsby back in and see if it gets reintroduced.

I'd bet that link rel="prefetch" and caching are going to be your friends here

rchrdnsh commented 4 years ago

hmmmm...they are also pretty big files...in the 10 to 15 MB range, even as mp3...trying to figure out how to compress them further without too much audio loss, but yeah...maybe I need to look into libraries for audio and video streaming on the web...

jlengstorf commented 4 years ago

@rchrdnsh Cloudinary is pretty good at this — they make it sound like they only support video, but I use it for e.g. Honkify and it's super easy + automatically resamples for smaller file size

rchrdnsh commented 4 years ago

hmmmm...so, I can make the audio files smaller myself, and Netlify is also a CDN...so is there any other feature of Cloudinary that makes it better than rolling my own? I'm not seeing any sort of audio or video streaming features listed, like breaking the audio up into pieces and loading the pieces into a buffer so that it can play instantly, but of course I could be missing it...

jlengstorf commented 4 years ago

I like that it doesn't require me to roll my own, but the trade-offs may be different for you

full details on what they can do with audio: https://cloudinary.com/documentation/audio_transformations

rchrdnsh commented 4 years ago

Awesome, had not found that page, thank you :-)

Sooooooo...getting back to the whole preload and cache idea.........ummmmmm............how?

Been googling for Gatsby solutions but getting a bit lost...

I ask because there are really no links, per se, as the audio files are being gathered via a GraphQL static query, so I don't know how to implement preloading and caching for assets that come from a GraphQL query, or whether it should be done via something in the template, or via a webpack plugin or something...

...yeah...🤷🏼‍♂️

jlengstorf commented 4 years ago

I'll let @DSchau take over for the caching question — I'm not sure what the current official Gatsby stance is on that

for preloading: https://developer.mozilla.org/en-US/docs/Web/HTML/Preloading_content
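e.g. something like this in the track template, using react-helmet (a sketch; the component name is made up and the url would be the publicURL you already query):

import React from 'react'
import { Helmet } from 'react-helmet'

// hint the browser to fetch the audio file early, before the play button is pressed
const PreloadAudio = ({ url }) => (
  <Helmet>
    <link rel="preload" as="audio" href={url} />
  </Helmet>
)

export default PreloadAudio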

rchrdnsh commented 4 years ago

Hi @universse!

The site is now working not too badly with gatsby-plugin-offline turned on, which is nice.

Running into an odd issue in Safari, however. The audioPlayer.duration is returning infinity:aN instead of the duration, but it works fine in Chrome and Firefox...

[Screenshot: Screen Shot 2020-01-16 at 8 28 23 PM]

googled around a bit and found a couple things, but I don't understand them at all...

https://stackoverflow.com/questions/29019672/javascript-returns-nan-for-audio-duration-depending-on-request-sequence

https://stackoverflow.com/questions/22213577/audio-track-duration-showing-infinitynannan-in-safari-only

if you have a chance, would you mind looking at those issues and seeing if they make any sense to you? I'm gonna keep trying to wrap my head around it as well.

PS - I should also mention that it works in gatsby develop but not after gatsby build while running gatsby serve, or live on the internet via netlify:

https://rykr.co

rchrdnsh commented 4 years ago

Hi @universse,

I think I figured out what's causing the infinity:aN issue...gatsby-plugin-offline

when I disable that plugin, the duration works properly in Safari, but when I re-enable it and push it back to the server, the infinity:aN comes back...no idea how to fix it yet, but at least I'm starting to narrow it down...also found this article, but have yet to understand it at all in any way...

https://philna.sh/blog/2018/10/23/service-workers-beware-safaris-range-request/
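...as far as I can tell, the gist is that Safari requests audio with a Range header, and a service worker that answers with a full 200 response breaks duration. The workaround that article describes boils down to something like this (a rough sketch, not specific to gatsby-plugin-offline; the content type is assumed and only the simple "bytes=N-" form is handled):

// in the service worker: answer range requests with a real 206 Partial Content response
self.addEventListener('fetch', event => {
  const rangeHeader = event.request.headers.get('range')
  if (!rangeHeader) return // let non-range requests be handled as usual

  event.respondWith(
    caches
      .match(event.request.url)
      .then(res => (res ? res.arrayBuffer() : fetch(event.request.url).then(r => r.arrayBuffer())))
      .then(buffer => {
        const start = Number(/^bytes=(\d+)-/.exec(rangeHeader)[1])
        return new Response(buffer.slice(start), {
          status: 206,
          statusText: 'Partial Content',
          headers: {
            'Content-Type': 'audio/mpeg', // assumed: mp3
            'Content-Range': `bytes ${start}-${buffer.byteLength - 1}/${buffer.byteLength}`,
          },
        })
      })
  )
})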

universse commented 4 years ago

Can you console.log(state.audioPlayer.duration)? I wonder what the un-formatted value is.

rchrdnsh commented 4 years ago

So, in develop mode or with gatsby-plugin-offline disabled, I'm getting the raw duration values, like so:

[Screenshot: Screen Shot 2020-01-18 at 9 47 54 AM]

...in which duration is NaN without audio selected or playing, but then changes to a numeric value while playing.

With gatsby-plugin-offline enabled, this happens:

[Screenshot: Screen Shot 2020-01-18 at 10 33 21 AM]

...where the value is NaN before audio plays and Infinity after a play button has been pressed, and every time the interval runs the Infinity value is re-calculated? I think.
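...which would line up with the infinity:aN display, tracing getTime(Infinity) from the context file:

getTime(Infinity)
// isNaN(Infinity) is false, so the formatting runs:
// Math.floor(Infinity / 60)                     -> Infinity
// ('0' + Math.floor(Infinity % 60)).slice(-2)   -> Infinity % 60 is NaN -> '0NaN'.slice(-2) -> 'aN'
// result: 'Infinity:aN'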

rchrdnsh commented 4 years ago

As a side note, I'm exploring the possibility of using a library like howler.js instead of continuing to roll my own, as they have accounted for many edge cases and have a nice suite of features built in, like streaming, chunking, spriting, loading states, spatialization, etc...

There's even a nice little library called:

https://github.com/E-Kuerschner/useAudioPlayer

...which is a React hooks wrapper around howler, and the developer is super chill as well.

Been learning so much doing this though, so maybe even digging into the howler code and seeing if I can port any of the techniques over might be a good idea as well.

universse commented 4 years ago

Seems to be an issue with the service worker serving audio. I'm not entirely sure how to fix that. I suggest you open a new issue for this.