Wrong widthPerSecond in AssetTrackSampleLoader used to calculate samplesPerPixel #2

@mcertain

Description

I kept noticing that for short audio (3 seconds or less) the width seemed very off, but as the sample became longer it was fine. I finally noticed that generateTrackOutput(), called in the constructor of AssetTrackSampleLoader, uses the widthPerSecond stored on the AssetTrackSampleLoader class to calculate samplesPerPixel. At that point widthPerSecond is still its default value of 10, so it never changes; it should be the calculated actualWidthPerSecond. Currently the width is set on the loader only after generateTrackOutput() has already run, so setting it makes no difference. I therefore added a parameter to AssetTrackSampleLoader so that it can take the widthPerSecond up front.
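
To make the impact concrete, here is a rough back-of-the-envelope calculation. All numbers are hypothetical and the real samplesPerPixel math inside generateTrackOutput() may differ, but it shows why short clips suffer most: with the stale default of 10 points per second, a 2-second clip only gets 20 points of width, so thousands of samples collapse into each pixel.

        import CoreGraphics

        // Hypothetical 2-second, 44.1 kHz track (all values are illustrative).
        let duration: CGFloat = 2.0
        let sampleRate: CGFloat = 44_100
        let totalSamples = duration * sampleRate                // 88_200 samples

        let defaultWidthPerSecond: CGFloat = 10                 // stale default hit by the bug
        let actualWidthPerSecond: CGFloat = 300                 // e.g. minWidth 600 / 2 s

        // Samples averaged into each rendered pixel:
        let buggy = totalSamples / (duration * defaultWidthPerSecond)   // 4410
        let fixed = totalSamples / (duration * actualWidthPerSecond)    // 147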

Update the initializer for AssetTrackSampleLoader as follows...

public init(track: AVAssetTrack, actualWidthPerSecond: CGFloat) {
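    // Set widthPerSecond before generateTrackOutput() runs so samplesPerPixel
    // is computed from the real value rather than the default of 10.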
    self.widthPerSecond = actualWidthPerSecond
    self.track = track
    generateTrackOutput()
}

Then pass in the value when creating it in the loadSamples function...

        let loaders = asset.tracks(withMediaType: .audio).map { (track) -> AssetTrackSampleLoader in
            let loader = AssetTrackSampleLoader(track: track, actualWidthPerSecond: widthPerSecond)
            //loader.widthPerSecond = widthPerSecond   // no longer needed; passed via the init
            return loader
        }

Also, I'm not sure why, but dividing by 5000 instead of 20000 in updatePoints() allows the waveform to show much more clearly for me. I'm not sure where 20000 comes from, so maybe this divisor should be selectable too (see the sketch after the snippet below). I tried scaling the height of the waveform instead, but that didn't seem to help.

        func updatePoints(with audioSamples: [VIAudioSample]) {
            var points: [Float] = []
            if let audioSample = audioSamples.first {
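                // Dividing by 5000 instead of the original 20000 amplifies the rendered waveform.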
                points = audioSample.samples.map({ (sample) -> Float in
                    return Float(sample / 5000.0)
                })
            }
            strongSelf.viewModel.points = points
        }
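
If the divisor were made selectable, a sketch could look like the following. Note that sampleScale is a hypothetical property I'm proposing, not something that exists in the library, and this is meant as a drop-in variant of the closure above (strongSelf comes from that enclosing scope):

        // Hypothetical: expose the divisor instead of hard-coding 20000 or 5000.
        var sampleScale: Float = 5000.0

        func updatePoints(with audioSamples: [VIAudioSample]) {
            var points: [Float] = []
            if let audioSample = audioSamples.first {
                // Smaller sampleScale values amplify the waveform; larger ones flatten it.
                points = audioSample.samples.map { Float($0) / sampleScale }
            }
            strongSelf.viewModel.points = points
        }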

Lastly, I never want to truncate samples; I want to see the whole waveform. So I commented out the if check block below and always set actualWidthPerSecond based on the duration. I think the check was only there because of the issue above, where minWidthPerSecond was not getting set right.

        // If the duration isn't long enough to fill the timeline view, each point represents less time
        /*
        if CGFloat(duration) * strongSelf.minWidthPerSecond < strongSelf.minWidth {
            strongSelf.actualWidthPerSecond = strongSelf.minWidth / CGFloat(duration)
        } else {
            strongSelf.actualWidthPerSecond = strongSelf.minWidthPerSecond
        }
         */
        
        strongSelf.actualWidthPerSecond = strongSelf.minWidth / CGFloat(duration)
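
As a quick sanity check on the behavior change (minWidth here is illustrative, not the library's actual value): a short clip now gets a large width per second, while a long clip is compressed so the whole waveform still fits within minWidth instead of extending past it.

        import CoreGraphics

        let minWidth: CGFloat = 600             // illustrative value

        let shortWPS = minWidth / CGFloat(2)    // 2 s clip   -> 300 pt per second
        let longWPS  = minWidth / CGFloat(120)  // 120 s clip -> 5 pt per second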
