CMPTO_J2k Encoding / Decoding Latency #416

@slm-sean

Description

Hi,

As suggested, I wanted to look into whether there is a way to reduce the latency of the CMPTO J2K codec. We measured the latency by comparing the source signal on a reference monitor with burned-in timecode, side by side with a monitor of the same model displaying the output of an UltraGrid decoder.

We have observed a latency of approx. 4-6 frames when encoding an 8-10-bit 4:2:2/4:4:4 signal. The latency increases by several frames, to approx. 6-7 frames, when encoding 12-bit 4:4:4.

Below are screenshots of the reported video encoding times and the corresponding settings. As expected, reducing the quality reduces the per-frame encoding time. I have not yet verified whether this translates into a reduction in end-to-end latency, which I am hoping to test soon (see the rough estimate sketch after the screenshots). Is this the behaviour I should expect to see?

Screenshots (encoder settings shown in each):

- UHD 444 10-bit - Quality=1 - MCT Enabled - Tiles=1 - Pool=1
- UHD 444 10-bit - Quality=0.5 - MCT Enabled - Tiles=1 - Pool=1
- UHD 444 12-bit - Quality=1 - MCT Enabled - Tiles=1 - Pool=1
- UHD 444 12-bit - Quality=0.6 - MCT Enabled - Tiles=1 - Pool=1
- UHD 444 12-bit - Quality=0.5 - MCT Enabled - Tiles=1 - Pool=1
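For intuition, here is a minimal sketch (Python, not part of UltraGrid or the Comprimato SDK) of how a per-frame encode time maps onto whole frame periods of added latency at a given frame rate. The 50 fps rate and the example encode times are assumptions for illustration, not values taken from the screenshots.

```python
# Rough intuition only: how many frame periods does a given
# per-frame encode time consume?  All numbers below are assumed.

def latency_in_frames(encode_time_ms: float, fps: float) -> float:
    """Express an encode time in frame periods at the given frame rate."""
    frame_period_ms = 1000.0 / fps
    return encode_time_ms / frame_period_ms

if __name__ == "__main__":
    fps = 50.0  # assumed source frame rate
    for encode_ms in (20.0, 35.0, 50.0):  # hypothetical per-frame encode times
        frames = latency_in_frames(encode_ms, fps)
        print(f"{encode_ms:5.1f} ms encode at {fps:.0f} fps "
              f"~= {frames:.1f} frame(s) of latency")
```

This is only the encode contribution; decode, network, and display buffering would add their own frame periods on top of it.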

For our use case, we feel that this solution is right on the edge of being acceptable for the majority of our users, so even a reduction of 2-3 frames could greatly improve the experience and accuracy of the remote work being done.

Thanks in advance.
