Is there a faster lossy compression than JPEG?

Is there a compression algorithm that is faster than JPEG yet well supported? I know about JPEG 2000, but from what I've heard it's not really that much faster.

Edit: for compressing.

Edit 2: It should run on 32-bit Linux and ideally it should be in C or C++.

Richard Knop asked Dec 29, 2010 at 16:50

for decompressing or compressing? Commented Dec 29, 2010 at 16:57

Just curious, why do the images need to be compressed? And by how much? Commented Dec 29, 2010 at 21:24

@Mark Ransom: Well, I need them compressed to send them from a small humanoid robot with a 500 MHz CPU and 256 MB RAM over UDP to a PC for processing. I need to get to at least 20 images per second, and the Wi-Fi stick is not fast enough to send that much data per second, so I am using JPEG to decrease the bandwidth. Commented Dec 29, 2010 at 21:32

A video codec would be more appropriate than managing individual full frames. Commented May 9, 2013 at 20:43

6 Answers

JPEG encoding and decoding should be extremely fast. You'll have a hard time finding a faster algorithm. If it's slow, your problem is probably not the format but a bad implementation of the encoder. Try the encoder from libavcodec in the ffmpeg project.
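If you go the libavcodec route, here is a rough sketch of a one-frame MJPEG encode in C. It uses the current send/receive API rather than the one that existed in 2010, error handling is mostly trimmed, and the pixel format, frame rate, and function name are illustrative assumptions; in a real 20 fps loop you would allocate and open the encoder context once and reuse it for every frame.

    /* Sketch: encode one AVFrame to JPEG with libavcodec's MJPEG encoder.
     * The caller provides an allocated AVPacket (av_packet_alloc()). */
    #include <libavcodec/avcodec.h>
    #include <libavutil/frame.h>

    int encode_frame_as_jpeg(const AVFrame *frame, AVPacket *out_pkt)
    {
        const AVCodec *codec = avcodec_find_encoder(AV_CODEC_ID_MJPEG);
        if (!codec)
            return -1;

        AVCodecContext *ctx = avcodec_alloc_context3(codec);
        ctx->width     = frame->width;          /* e.g. 640 */
        ctx->height    = frame->height;         /* e.g. 480 */
        ctx->pix_fmt   = AV_PIX_FMT_YUVJ422P;   /* full-range YUV 4:2:2 */
        ctx->time_base = (AVRational){1, 20};   /* 20 fps target */

        if (avcodec_open2(ctx, codec, NULL) < 0) {
            avcodec_free_context(&ctx);
            return -1;
        }

        int ret = avcodec_send_frame(ctx, frame);
        if (ret >= 0)
            ret = avcodec_receive_packet(ctx, out_pkt); /* JPEG bytes land in out_pkt->data */

        avcodec_free_context(&ctx);
        return ret;  /* 0 on success */
    }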

answered Dec 29, 2010 at 17:44 by R.. GitHub STOP HELPING ICE

The JPEG format is designed for fast decoding. That does not always mean encoding is fast as well (in fact, it is often much slower to encode).

Commented Dec 29, 2010 at 19:14

Both are extremely fast if you're not striving for optimal encoding. A low-end x86 from within the last few years should be able to encode JPEG at a rate of 30 megapixels per second or better (rough estimate off the top of my head).

Commented Dec 29, 2010 at 19:28

An encoder meant for video encoding is bound to be optimized for speed. I know that MJPEG has been plenty fast for years, although I always thought it took specialized hardware to achieve that.

Commented Dec 29, 2010 at 19:37

Well, I am using OpenCV to encode raw images to JPEG on a robot with a 500 MHz CPU and 256 MB RAM. It is currently taking 0.25 s to encode one 640*480 RGB image, which is not acceptable. I need 20+ images per second.

Commented Dec 29, 2010 at 21:19

Well, I'm hoping to get to 0.05 s for encoding a 640*480 image (YUV422), which would mean 20 fps. I hope that's realistic on a 500 MHz CPU.

Commented Dec 29, 2010 at 22:17

Do you have MMX/SSE2 instructions available on your target architecture? If so, you might try libjpeg-turbo. Alternatively, can you compress the images with something like zlib and then offload the actual reduction to another machine? Is it imperative that actual lossy compression of the images take place on the embedded device itself?
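If libjpeg-turbo fits your toolchain, its TurboJPEG convenience API keeps the whole compression down to a couple of calls. A minimal sketch, assuming a reasonably recent libjpeg-turbo and an in-memory RGB frame; the quality, subsampling, and function name are illustrative, not prescribed:

    /* Sketch: compress one raw RGB frame to JPEG with the TurboJPEG API.
     * On success the JPEG bytes are in *jpeg_buf (free with tjFree()). */
    #include <turbojpeg.h>

    int compress_rgb_to_jpeg(const unsigned char *rgb, int width, int height,
                             unsigned char **jpeg_buf, unsigned long *jpeg_size)
    {
        tjhandle tj = tjInitCompress();
        if (!tj)
            return -1;

        *jpeg_buf  = NULL;   /* let TurboJPEG allocate the output buffer */
        *jpeg_size = 0;

        int ret = tjCompress2(tj, rgb, width, 0 /* pitch = width * 3 */, height,
                              TJPF_RGB,          /* input pixel format */
                              jpeg_buf, jpeg_size,
                              TJSAMP_420,        /* 4:2:0 chroma subsampling */
                              75,                /* quality */
                              TJFLAG_FASTDCT);   /* favour speed over accuracy */
        tjDestroy(tj);
        return ret;          /* 0 on success */
    }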

answered Dec 29, 2010 at 17:13 by Wyatt Anderson

The license for libjpeg-turbo is LGPL, which isn't right for commercial or truly open-source projects.

– Matthieu N. Commented Dec 29, 2010 at 17:43

zlib compression is several times slower than JPEG compression. Commented Dec 29, 2010 at 17:48

PNG uses zlib compression. zlib is painful on embedded systems: the code is not really 32/64-bit clean, it cross-compiles poorly, and it requires a lot of RAM in its default configuration. It depends on how embedded you are.

Commented Dec 29, 2010 at 17:48

You could use the busybox implementation for embedded systems, but I'm not sure how well it performs.

Commented Dec 29, 2010 at 19:33

In what context? On a PC or a portable device?

From my experience you've got JPEG, JPEG 2000, PNG, and... uh, that's about it for "well-supported" image types in a broad context (lossy or not!).

(Hooray that GIF is on its way out.)

answered Dec 29, 2010 at 16:54 by evilspoons

I'd go so far as to say JPEG 2000 isn't universal, so the list is really down to just JPEG and PNG. Commented Dec 29, 2010 at 16:56

The patents on LZW have expired, at least in parts of Europe, so there's no real reason to avoid GIF except for its limited color space. And that can be circumvented (rather ugly, though). Commented Dec 29, 2010 at 17:00

It's for an embedded Linux robot. Commented Dec 29, 2010 at 17:01

TIFF is still around somehow; I seem to keep running into it with scanners. Also non-lossy. Commented Dec 29, 2010 at 17:48

DCT-compressed images can be put in a TIFF container, so technically TIFF can be either lossy or not. That doesn't change the baseline observation that DCT is just about the only game in town for lossy image compression, though.

Commented Dec 29, 2010 at 17:51

JPEG 2000 isn't faster at all. Is it encoding or decoding that's not fast enough with JPEG? You could probably be a lot faster by doing only a 4x4 FDCT and IDCT on JPEG.

It's hard to find any documentation on IJG libjpeg, but if you use it, try lowering the quality setting; it might make encoding faster. There also seems to be a fast FDCT option.

Someone mentioned libjpeg-turbo, which uses SIMD instructions and is compatible with the regular libjpeg. If that's an option for you, I think you should try it.
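For what it's worth, both of those knobs are ordinary settings in the libjpeg API (and libjpeg-turbo accepts the same ones). A minimal setup sketch; the destination and scanline loop are left out, and the quality value of 60 is chosen purely for illustration:

    /* Sketch: speed-oriented compressor setup for IJG libjpeg / libjpeg-turbo.
     * Destination setup and jpeg_write_scanlines() are left out. */
    #include <stdio.h>
    #include <jpeglib.h>

    void setup_fast_compressor(struct jpeg_compress_struct *cinfo,
                               struct jpeg_error_mgr *jerr,
                               int width, int height)
    {
        cinfo->err = jpeg_std_error(jerr);
        jpeg_create_compress(cinfo);

        cinfo->image_width      = width;
        cinfo->image_height     = height;
        cinfo->input_components = 3;
        cinfo->in_color_space   = JCS_RGB;

        jpeg_set_defaults(cinfo);
        jpeg_set_quality(cinfo, 60, TRUE);   /* lower quality: smaller output, possibly faster */
        cinfo->dct_method = JDCT_IFAST;      /* the fast integer FDCT */

        /* then: jpeg_stdio_dest() or jpeg_mem_dest(), jpeg_start_compress(),
         * jpeg_write_scanlines(), jpeg_finish_compress() */
    }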

answered Dec 29, 2010 at 16:57

It's encoding binary images to JPEG which is too slow on my embedded Linux robot. Commented Dec 29, 2010 at 17:00

@Richard Knop: Binary? As in black/white with no gray and no color? That changes things considerably.