fadelakin / slide

do you slide on all your nights like this?

Philips Hue Ideas #3

Open · zethussuen opened this issue 7 years ago

zethussuen commented 7 years ago

Hi @fadelakin! I'm a fan of the project and of music/visualization in general. I'd love to help out on some of the ideas you have for Hue integration (single/multi-bulb). Could you dump some ideas on how you want to use the colors?

Currently it seems that you're using imageio and just feeding it the extracted colors. We could probably pull beat information from the track via the Python package aubio and cycle the extracted hex colors based on the returned data?
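Something like this might work as a starting point for the beat-detection side. A rough, untested sketch using aubio's tempo detector; "track.wav" is just a placeholder for a local audio file:

```python
# Collect beat timestamps with aubio so we can cycle Hue colors on each beat.
from aubio import source, tempo

hop_s = 512
win_s = 1024

s = source("track.wav", 0, hop_s)           # 0 = keep the file's native samplerate
o = tempo("default", win_s, hop_s, s.samplerate)

beat_times = []
while True:
    samples, read = s()
    if o(samples):                          # non-zero when a beat is detected
        beat_times.append(o.get_last_s())   # beat position in seconds
    if read < hop_s:                        # end of file
        break

print(beat_times)
```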

fadelakin commented 7 years ago

So the Hue integration isn't added yet, but the idea is this: if I have one or two Hue light bulbs, I'd set their color to the dominant color of the album art, so the lights change with each track.

If I have four or five light bulbs, I can set each bulb to one color from the palette we generate through colorthief. I've done a lot of audio visualization in Processing before, and I'm looking to move that into Python because of Python's flexibility and how easy it is to build things.

There are essentially two routes that one could take with the Hue integration.

  1. Simply use the dominant color and set the light bulb to that color, kind of like a mood lighting setup (see the sketch after this list).
  2. What you're suggesting: pull beat information from the track and set the colors that way. This is more complicated because we'd need an audio source (i.e. the computer's microphone or an external audio input) to get the data we want to visualize.

I'm open to either but I was essentially planning on implementing the first route first and then the second a bit down the line.
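Roughly what I have in mind for the first route, as an untested sketch. It assumes the phue package for the Bridge and colorthief for the palette; the bridge IP, light ids, and image path are placeholders:

```python
import colorsys
from colorthief import ColorThief
from phue import Bridge

def rgb_to_hue_state(rgb):
    """Convert an (r, g, b) tuple to the Hue API's hue/sat/bri ranges."""
    h, s, v = colorsys.rgb_to_hsv(*(c / 255.0 for c in rgb))
    return {'on': True, 'hue': int(h * 65535), 'sat': int(s * 254), 'bri': int(v * 254)}

bridge = Bridge('192.168.1.10')   # press the bridge's link button before the first run
bridge.connect()

art = ColorThief('album_art.jpg')
dominant = art.get_color(quality=1)             # single-bulb mood lighting
bridge.set_light(1, rgb_to_hue_state(dominant))

# Multi-bulb variant: one palette color per light.
palette = art.get_palette(color_count=5)
for light_id, color in enumerate(palette, start=1):
    bridge.set_light(light_id, rgb_to_hue_state(color))
```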

cheung31 commented 7 years ago

> We could probably pull beat information from the track via the Python package aubio and cycle the extracted hex colors based on the returned data? (@zethussuen)

> 2. What you're suggesting: pull beat information from the track and set the colors that way. This is more complicated because we'd need an audio source (i.e. the computer's microphone or an external audio input) to get the data we want to visualize. (@fadelakin)

With respect to the tempo/BPM of the track, you could pass whatever track metadata you can obtain from your current AppleScript into the Search endpoint (https://developer.spotify.com/web-api/search-item/), and then grab the found track's audio features to obtain the tempo: https://developer.spotify.com/web-api/get-audio-features/

See:

> tempo (float): The overall estimated tempo of a track in beats per minute (BPM). In musical terminology, tempo is the speed or pace of a given piece and derives directly from the average beat duration.
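For example, with the spotipy client (untested sketch; uses client-credentials auth read from the SPOTIPY_* environment variables, and the title/artist values are placeholders for whatever the AppleScript returns):

```python
import spotipy
from spotipy.oauth2 import SpotifyClientCredentials

# Reads SPOTIPY_CLIENT_ID / SPOTIPY_CLIENT_SECRET from the environment.
sp = spotipy.Spotify(client_credentials_manager=SpotifyClientCredentials())

def get_tempo(title, artist):
    """Search for the track, then pull its tempo from the audio features."""
    results = sp.search(q='track:{} artist:{}'.format(title, artist), type='track', limit=1)
    items = results['tracks']['items']
    if not items:
        return None
    features = sp.audio_features([items[0]['id']])[0]
    return features['tempo']                 # BPM as a float

print(get_tempo('Slide', 'Calvin Harris'))  # placeholder track metadata
```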

This use of the Spotify API could greatly reduce the complexity of analyzing the track at runtime. I guess the tricky part with this approach would be synchronizing the timing of this tempo interval with the music playing.
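The naive version of that loop (untested, and reusing the bridge/palette/rgb_to_hue_state pieces from the earlier sketch) would just sleep for one beat interval at a time, which is exactly where the drift problem shows up:

```python
import itertools
import time

def cycle_on_beat(bridge, palette, tempo_bpm, light_id=1, beats=64):
    """Flip to the next palette color once per beat, for `beats` beats."""
    interval = 60.0 / tempo_bpm                  # seconds per beat
    for color in itertools.islice(itertools.cycle(palette), beats):
        bridge.set_light(light_id, rgb_to_hue_state(color))  # helper from the earlier sketch
        time.sleep(interval)                     # sleep error accumulates over the track
```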

zethussuen commented 6 years ago

Dug around Spotify's API a bit more. The Audio Features endpoint wouldn't be the best fit if you want a more connected feeling to the music, since the tempo can change throughout a song and you don't want an overall average.

The Audio Analysis endpoint seems to give a much richer breakdown of the track into segments (with timestamps), along with some analysis of BPM, key, etc. I think a rough MVP could just take the start value of each segment as a trigger to cycle colors?
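Untested sketch of that MVP, reusing the `sp` client and the bridge/palette/rgb_to_hue_state pieces from the earlier sketches. Segments tend to be dense (often several per second), so this only reacts to every Nth one:

```python
import itertools
import time

def cycle_on_segments(track_id, bridge, palette, light_id=1, every_nth=4):
    """Use segment start times from Audio Analysis as color-change triggers."""
    analysis = sp.audio_analysis(track_id)       # sp is the spotipy client above
    colors = itertools.cycle(palette)
    now = 0.0
    for segment in analysis['segments'][::every_nth]:
        time.sleep(max(0.0, segment['start'] - now))  # wait until this segment begins
        now = segment['start']
        bridge.set_light(light_id, rgb_to_hue_state(next(colors)))
```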

As for timing the playback, there's also an endpoint that returns the progress_ms of the currently playing track: https://developer.spotify.com/web-api/get-the-users-currently-playing-track/
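Something like this (untested; needs user OAuth with the user-read-currently-playing scope, with credentials again coming from the SPOTIPY_* environment variables) would let the segment loop start mid-track:

```python
import spotipy
from spotipy.oauth2 import SpotifyOAuth

sp_user = spotipy.Spotify(auth_manager=SpotifyOAuth(scope='user-read-currently-playing'))

playing = sp_user.current_user_playing_track()
if playing and playing['is_playing']:
    track_id = playing['item']['id']
    offset_s = playing['progress_ms'] / 1000.0   # how far into the track we already are
```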

Obviously this is all scope creep and just theoretical. Perhaps just connecting to a Bridge would be a good first step :P
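For that first step, the bare minimum with the phue package might look like this (the IP is a placeholder; press the link button on the Bridge before the first run):

```python
from phue import Bridge

bridge = Bridge('192.168.1.10')   # find your bridge's IP via the Hue app
bridge.connect()                  # registers with the bridge on first run

for light in bridge.lights:
    print(light.name, light.on)
```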