Aaron Boxer opens the talk by calling JPEG2000 the “Codec of the Future”.
JPEG was originally standardized in 1992 and accounted for more than 25% of internet traffic in the 1990s.
JPEG has issues, though: compression artifacts at low bitrates, lossy compression only, a limit of 8 bits per sample and 3 (colour?) components, and image dimensions capped at 2^16-1 pixels.
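The 2^16-1 limit mentioned above follows from JPEG storing image height and width as unsigned 16-bit fields in its frame header. A minimal sketch of that arithmetic (illustrative only, not an actual JPEG encoder):

```python
import struct

# JPEG's SOF (Start Of Frame) segment holds height and width as
# unsigned 16-bit big-endian integers, so the largest dimension
# that can be represented is 2**16 - 1 = 65535 pixels.
MAX_DIM = 2**16 - 1

# A dimension at the limit still fits in the 16-bit field:
packed = struct.pack(">H", MAX_DIM)   # two bytes: 0xFF 0xFF

# One pixel more no longer fits:
try:
    struct.pack(">H", MAX_DIM + 1)
except struct.error:
    print("65536 does not fit in a 16-bit header field")
```

JPEG2000 widens these fields, which is why it has no comparable dimension limit in practice.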
JPEG2000 is supposed to solve this
JPEG2000 was standardised in 2000 as ITU-T T.800 and is available royalty-free.
JPEG2000 boasts superior compression performance (20% to 200% better than JPEG) and fewer compression artifacts. It supports both lossy and lossless compression, unlike JPEG, which is lossy only. It also offers visually lossless compression, which removes only data invisible to the eye; the comparison here is drawn to the MP3 format discarding inaudible audio.
The JPEG2000 format has some interesting features with respect to progression. The codec can already do something useful with incomplete data, and there are different schemes for this:
Progression by quality allows the codec to display a lower-quality image at the full resolution while loading continues; several quality layers are encoded in the image.
Progression by resolution allows the codec to display a lower resolution replica. It first shows a thumbnail and as more data becomes available a progressively larger image can be displayed.
Progression by spatial position allows the decoder to decode a small region at a time. This is interesting for memory-constrained applications.
Progression by component, e.g. luminance before chroma.
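Progression by resolution can be sketched with a little arithmetic: a JPEG2000 codestream with N wavelet decomposition levels contains N+1 nested resolutions, each roughly half the width and height of the next, so a decoder can stop early and show a smaller replica. A hedged illustration (helper function is mine, not part of any JPEG2000 library):

```python
def resolution_sizes(width, height, levels):
    """Return image sizes from the smallest replica up to full size.

    Each wavelet decomposition level halves both dimensions
    (rounded up), so `levels` decompositions give levels + 1
    decodable resolutions.
    """
    sizes = []
    for level in range(levels, -1, -1):
        w = -(-width // (1 << level))   # ceiling division
        h = -(-height // (1 << level))
        sizes.append((w, h))
    return sizes

# A 1920x1080 image with 5 decomposition levels: the decoder can show
# a ~60x34 thumbnail first and refine it as more data arrives.
print(resolution_sizes(1920, 1080, 5))
```

This is why the thumbnail-then-refine behaviour described above falls out of the format for free: the low-resolution data simply comes first in the codestream.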
Interestingly, the encoder can recompress a given image without introducing additional compression artifacts.
Given these features, JPEG2000 is already used in the following industries:
The reasons why JPEG2000 is not more popular are:
The standard has since been extended, and there is now High Throughput JPEG2000 (HTJ2K). Its new block coder increases performance roughly tenfold and is SIMD/SIMT friendly.
The design is now also GPU-optimised: originally, multiplication was considered the limiting factor, but the recent rise of parallel architectures has made branching the thing to optimise against.
Original JPEG2000 data can be transcoded to HTJ2K without loss.
Open Source toolkits:
During the Q&A, JPEGXS was mentioned as a competing standard; however, it is encumbered by patents. The presenter also mentions that JPEG2000 is reportedly faster than JPEGXS. Another competing standard mentioned in a question is JPEGXL, which is a Google project.