dmrschmidt / DSWaveformImage

Generate waveform images from audio files on iOS, macOS & visionOS in Swift. Native SwiftUI & UIKit views.

different colors #21

Closed MysteryRan closed 3 years ago

MysteryRan commented 3 years ago
[Screenshot 2021-02-20 15:02: a waveform with different sections rendered in different colors]

I want to implement a feature like this: rendering different parts of the waveform in different colors. How can I do that? Thanks.

dmrschmidt commented 3 years ago

There are many different ways this could be achieved. In this particular case, it looks like the "2nd color" is simply the same white, just with roughly 40% opacity. So what you could do is: stack two identical waveform images on top of each other, render the top one in the progress color, and mask the top image to the width corresponding to the playback progress.

I happen to have done this masking in one of my apps, so for illustration here's that code:

func updateProgressWaveform(_ progress: Double) {
    // `progress` is expected to be between 0 and 1.
    let fullRect = viewModel.playbackWaveformImageView.bounds
    let newWidth = Double(fullRect.size.width) * progress

    // Mask the top waveform image view so only the played portion
    // (from the left edge up to `newWidth`) remains visible.
    let maskLayer = CAShapeLayer()
    let maskRect = CGRect(x: 0.0, y: 0.0, width: newWidth, height: Double(fullRect.size.height))

    let path = CGPath(rect: maskRect, transform: nil)
    maskLayer.path = path

    viewModel.playbackWaveformImageView.layer.mask = maskLayer
}
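
For context, here's one hypothetical way to drive that method during playback; this isn't from the original thread, and `player` / `timeObserver` are assumed properties on your view controller. It uses AVPlayer's periodic time observer:

import AVFoundation

// Hypothetical wiring: update the mask roughly 20 times per second while playing.
let interval = CMTime(seconds: 0.05, preferredTimescale: 600)
timeObserver = player.addPeriodicTimeObserver(forInterval: interval, queue: .main) { [weak self] time in
    guard let self = self,
          let duration = self.player.currentItem?.duration.seconds,
          duration > 0 else { return }
    self.updateProgressWaveform(time.seconds / duration)
}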
MysteryRan commented 3 years ago
[Screenshot 2021-02-20 16:13: the resulting two-color waveform]

thx.

dmrschmidt commented 1 year ago

There’s no intrinsic content size being calculated. So the short answer is, there is none. Instead, you define the size of the view (and thus waveform) by either setting the view’s frame or via auto layout constraints. The audio file is then downsampled to fit into the width and height you define.

If you want a specific resolution instead, you’ll have to do some manual math based on the audio files total duration and then set the view’s dimension accordingly.
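
A hypothetical sketch of that math, sizing the view at a fixed horizontal resolution (`waveformView` and the 40 points-per-second constant are illustrative):

import AVFoundation

// Hypothetical: give the waveform 40 points of width per second of audio.
let asset = AVURLAsset(url: audioURL)
let durationInSeconds = CMTimeGetSeconds(asset.duration)
let pointsPerSecond: CGFloat = 40
waveformView.frame.size.width = CGFloat(durationInSeconds) * pointsPerSecond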

dmrschmidt commented 1 year ago

So by "unfilled", you mean "unplayed" then, assuming that image you posted represents playback progress?

In that case, you'd need to do something similar to what I had originally described in https://github.com/dmrschmidt/DSWaveformImage/issues/21#issuecomment-782581320

If you do still need the dimensions and position of the unplayed / unfilled area, you'll just need to calculate the "inverse" of that answer's maskLayer. So something along the lines of let unplayedWidth = fullRect.width - newWidth. The origin is essentially newWidth plus the x-position of the waveform view.

I'm just on my phone right now, so I can't write a full code sample, but I hope this gives you the direction.
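
For reference, a minimal sketch of that inverse mask, reusing the names from the earlier snippet (`unplayedWaveformImageView` is a hypothetical second image view holding the unplayed color):

func updateUnplayedWaveform(_ progress: Double) {
    let fullRect = viewModel.unplayedWaveformImageView.bounds
    let playedWidth = Double(fullRect.size.width) * progress

    // The unplayed area starts where the played area ends
    // and extends to the right edge of the view.
    let maskLayer = CAShapeLayer()
    let maskRect = CGRect(x: playedWidth,
                          y: 0.0,
                          width: Double(fullRect.size.width) - playedWidth,
                          height: Double(fullRect.size.height))
    maskLayer.path = CGPath(rect: maskRect, transform: nil)

    viewModel.unplayedWaveformImageView.layer.mask = maskLayer
}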

CacereLucas commented 1 year ago

Hey @dmrschmidt,

Any idea if it's possible to do this in SwiftUI? Thanks.

dmrschmidt commented 1 year ago

There are tons of different ways to achieve this; one of the simplest might be:

// @State var progress: CGFloat = 0 // must be between 0 and 1

ZStack(alignment: .leading) {
    // Full waveform in the base color.
    WaveformView(audioURL: audioURL, configuration: configuration)
    // Identical waveform in the progress color, masked to the played portion.
    WaveformView(audioURL: audioURL, configuration: configuration.with(style: .filled(.red)))
        .mask(alignment: .leading) {
            GeometryReader { geometry in
                Rectangle().frame(width: geometry.size.width * progress)
            }
        }
}

[Edit] I've added this to the README now so that it's easier to find in the future :)

tayyab13-git commented 1 year ago

Can you please help me with how I can make it draggable? I also currently have a design requirement to show striped waves. Thank you @dmrschmidt

dmrschmidt commented 1 year ago

Hey @tayyab13-git,

getting a striped waveform is easy via something similar to this:

WaveformView(
    audioURL: audioURL,
    configuration: configuration.with(
        style: .striped(StripeConfig(color: .red, width: 3, spacing: 5, lineCap: .round))
    )
)

Making the overlay from the above example draggable is also relatively straightforward. You'll just need to add a DragGesture on the outer ZStack and have it modify some @State variable in its onChanged(_:) modifier. That value would then be used in place of the simplistic progress to manipulate the width of the .mask.

Maybe also have a look at https://developer.apple.com/documentation/swiftui/adding-interactivity-with-gestures in case you haven't used gestures in SwiftUI yet.
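
A minimal sketch of that idea, building on the ZStack example above (the 0...1 clamping and the use of the touch's absolute location are illustrative choices, not the only way to do it):

// @State var progress: CGFloat = 0

GeometryReader { geometry in
    ZStack(alignment: .leading) {
        WaveformView(audioURL: audioURL, configuration: configuration)
        WaveformView(audioURL: audioURL, configuration: configuration.with(style: .filled(.red)))
            .mask(alignment: .leading) {
                Rectangle().frame(width: geometry.size.width * progress)
            }
    }
    .gesture(
        DragGesture(minimumDistance: 0)
            .onChanged { value in
                // Map the touch's x-position to a progress value in 0...1.
                progress = min(max(0, value.location.x / geometry.size.width), 1)
            }
    )
}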

tayyab13-git commented 1 year ago

I'm using UIKit, and I think it will be the same on UIKit too. Do I need to add the gesture in the updateProgressWaveform function? Am I right?

dmrschmidt commented 1 year ago

Ah, my bad. Well, with UIKit then, yeah, you could in principle just re-use updateProgressWaveform as-is. You'd then need to add a UIPanGestureRecognizer to your view hierarchy wherever it makes sense in your specific case, and then use its translation(in:) to derive a value within the interval (0...1) so you can call it without modifications.
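
A minimal UIKit sketch of that (`waveformContainerView` is a hypothetical view containing both waveform image views; this version maps the absolute touch position via location(in:) rather than accumulating translation(in:)):

// Somewhere in your setup code:
let panRecognizer = UIPanGestureRecognizer(target: self, action: #selector(handlePan(_:)))
waveformContainerView.addGestureRecognizer(panRecognizer)

@objc private func handlePan(_ recognizer: UIPanGestureRecognizer) {
    // Map the touch's x-position within the container to a value in 0...1.
    let x = recognizer.location(in: waveformContainerView).x
    let progress = min(max(0, x / waveformContainerView.bounds.width), 1)
    updateProgressWaveform(Double(progress))
}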

tayyab13-git commented 1 year ago

Thank you for answering. Please tell me: do I need to add another image over the static image as a mask, then add the pan gesture on that mask, and for progress use the value coming from the pan gesture and pass it to updateProgressWaveform? Thanks for the help 🙌🏻

dmrschmidt commented 1 year ago

So in the example above, you need two identical images of your waveform on top of each other, each with a different color. The one referenced in that example as playbackWaveformImageView is the top one, which indicates the playback progress / dragging position.

That is the one getting the mask applied, so you only need two images. (The lower one just isn't referenced in that code, because it's static.)

Where you add the pan gesture recognizer depends on your desired UX. One option is to add it on the view containing both images. And yes, you will then need to do some math to map the current dragging position to the desired progress of the waveform, maybe taking the initial position where the user touched into account, maybe not. Definitely with some calculation, because the pan recognizer gives you a CGPoint and updateProgressWaveform requires a Double between 0 and 1. It really depends a whole lot on how you want this to behave in the end.

tayyab13-git commented 1 year ago


Thank you so much. I'm working on a text-to-speech converter app, building a simple player that shows the progress and lets the user drag the seek bar.

dmrschmidt commented 1 year ago

You're welcome, and good luck with that @tayyab13-git!

ducduy20 commented 1 year ago

Can you give me detailed instructions on how to play an audio file with a waveform like this? Thank you @dmrschmidt

ducduy20 commented 1 year ago

I still don't understand: what is the viewModel here? And where is playbackWaveformImageView defined?

dmrschmidt commented 1 year ago

Are you using UIKit or SwiftUI?

ducduy20 commented 1 year ago

I use SwiftUI

dmrschmidt commented 1 year ago

Then the code mentioned in https://github.com/dmrschmidt/DSWaveformImage/issues/21#issuecomment-1261171324 does everything you need to show the playback progress.

The comments referencing playbackWaveformImageView are irrelevant for you, as that’s UIKit.
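
For completeness, a hypothetical end-to-end sketch combining that SwiftUI example with AVPlayer playback (the view name, configuration values, and observer wiring are illustrative assumptions, not library-prescribed):

import SwiftUI
import AVFoundation
import DSWaveformImage
import DSWaveformImageViews

struct PlayerWaveformView: View {
    let audioURL: URL
    let configuration = Waveform.Configuration(style: .filled(.gray))

    @State private var progress: CGFloat = 0
    @State private var player: AVPlayer?

    var body: some View {
        ZStack(alignment: .leading) {
            WaveformView(audioURL: audioURL, configuration: configuration)
            WaveformView(audioURL: audioURL, configuration: configuration.with(style: .filled(.red)))
                .mask(alignment: .leading) {
                    GeometryReader { geometry in
                        Rectangle().frame(width: geometry.size.width * progress)
                    }
                }
        }
        .onAppear {
            let player = AVPlayer(url: audioURL)
            // Poll playback progress ~20x per second and drive the mask with it.
            _ = player.addPeriodicTimeObserver(forInterval: CMTime(seconds: 0.05, preferredTimescale: 600),
                                               queue: .main) { time in
                guard let duration = player.currentItem?.duration.seconds, duration > 0 else { return }
                progress = CGFloat(time.seconds / duration)
            }
            player.play()
            self.player = player
        }
    }
}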