Java memory usage

27 October 2018
In an old article on worthwhile programming languages I remarked on some speed-testing of Java and native code I did in the mid-to-late 2000s, although for some reason at the time I decided not to cite the figures I had obtained. From memory the Java run-time using HotSpot optimisation was twice as fast as the GCJ-compiled native code, but it used 60 times more memory (~7MB vs. ~450MB), which I did not consider an acceptable trade-off. For context, a desktop computer with 1GB of RAM at the time would have been considered generous. Since then I have nevertheless brought up the experiment as one of my reasons for disliking Java.

After digging through my old computer archives, I found what I suspected was the program I used for the benchmarking, and it was: an intentionally inefficient program that sorts an array of integers by searching for a permutation in which the values monotonically increase. Why I chose this program I have long forgotten, but I suspect I wanted an algorithm that did not make use of any fancy Java APIs. Having found the code I decided to re-run the tests, ten or so years after the original experimentation, so that I would have a full set of numbers to cite.
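The original source is only summarised above, but the described algorithm can be sketched. The class and method names below are hypothetical, as is the use of lexicographic next-permutation stepping; the actual program may well have generated permutations differently:

```java
import java.util.Arrays;
import java.util.Random;

// Hypothetical reconstruction of an intentionally inefficient sort:
// step through permutations of the array until one is monotonically increasing.
public class PermutationSort {

    // True when the values never decrease from left to right.
    static boolean isSorted(int[] a) {
        for (int i = 1; i < a.length; i++)
            if (a[i - 1] > a[i]) return false;
        return true;
    }

    // Advance the array to its next lexicographic permutation;
    // returns false when the array is already the last (descending) one.
    static boolean nextPermutation(int[] a) {
        int i = a.length - 2;
        while (i >= 0 && a[i] >= a[i + 1]) i--;
        if (i < 0) return false;
        int j = a.length - 1;
        while (a[j] <= a[i]) j--;
        int t = a[i]; a[i] = a[j]; a[j] = t;
        for (int l = i + 1, r = a.length - 1; l < r; l++, r--) {
            t = a[l]; a[l] = a[r]; a[r] = t;
        }
        return true;
    }

    static void reverse(int[] a) {
        for (int l = 0, r = a.length - 1; l < r; l++, r--) {
            int t = a[l]; a[l] = a[r]; a[r] = t;
        }
    }

    // Walk permutations until the sorted one turns up; if stepping runs off
    // the last permutation (fully descending), reversing restarts the walk
    // at the first (fully ascending) one, which is the sorted result.
    static void evilSort(int[] a) {
        while (!isSorted(a)) {
            if (!nextPermutation(a)) reverse(a);
        }
    }

    public static void main(String[] args) {
        int n = args.length > 0 ? Integer.parseInt(args[0]) : 8;
        int[] a = new int[n];
        Random rng = new Random(1); // fixed seed so runs are repeatable
        for (int i = 0; i < n; i++) a[i] = rng.nextInt(1000);
        evilSort(a);
        System.out.println(Arrays.toString(a));
    }
}
```

Even for eight-element arrays this visits at most 8! = 40,320 permutations, so nearly all the work is in the permutation stepping rather than in any Java library code.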

Building & using GCJ

A complication with re-running the tests is that GCJ, the Java front-end for GCC, has long been discontinued. GCJ was removed from GCC around October 2016, which based on release dates means the last version of GCC to include it is v6.2.0. Cue doing a build from the tarball sources, which is fairly straightforward, although the download done by ./contrib/download_ecj might have to be done manually:

tar -xzvf gcc-6.2.0.tar.gz
cd gcc-6.2.0
./contrib/download_ecj
cd ..
mkdir build
cd build
../gcc-6.2.0/configure --enable-languages=c,c++,java --prefix=/opt/gcc-6.2.0 --disable-multilib
make
make install

A rebuild will take an hour or two, even on a fairly high-end system. Using make -j might speed things up, but I have known some older builds to break when this option is used. Once finished, the EvilSort class file can be compiled to machine code using the following commands; I did not see any significant difference in the end result when compiling the Java source file instead:

export LD_LIBRARY_PATH=/opt/gcc-6.2.0/lib64
/opt/gcc-6.2.0/bin/gcj EvilSort.class -O3 --main=EvilSort -o EvilSort.exe

Running the tests

The three command lines for Java with HotSpot, Java without HotSpot, and the native executable are shown below, in order. Note that the full path /usr/bin/time needs to be used, as some shells have time as a built-in command that cannot show run-time statistics other than time taken; the statistic of interest is maximum resident set size, which is the amount of memory execution chews up.

/usr/bin/time -v java EvilSort 8
/usr/bin/time -v java -Djava.compiler=NONE EvilSort 8
/usr/bin/time -v ./EvilSort.exe 8
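Since each figure in the next section is an average of three runs, extracting the statistic can be scripted. This is a hypothetical helper, assuming the output format of GNU time's -v option, which reports the figure in kilobytes:

```shell
#!/bin/sh
# Hypothetical helper: pull the "Maximum resident set size" figure
# (kilobytes) out of a saved `/usr/bin/time -v` log.
max_rss_kb() {
    awk -F': ' '/Maximum resident set size/ { print $2 }' "$@"
}

# Example usage: capture one run's statistics, then extract the figure.
# /usr/bin/time -v java EvilSort 8 2> run.log
# max_rss_kb run.log
```

Redirecting stderr is needed because GNU time writes its statistics there, mixed in with anything the program itself prints on that stream.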

Test results

Execution time and peak memory consumption are shown in the table below for each of the three commands above, with each of the values being the average of three runs. These were done on a headless server that was otherwise unloaded at the time.

Environment Running time (seconds) Memory usage (megabytes)
JVM & HotSpot 82 4,892
JVM 702 370
Native GCJ 287 48

Biggest surprise was that the non-HotSpot JVM was substantially faster than I was expecting: back in the 2000s I remember it taking a dozen or so times longer than GCJ-compiled code, whereas here it took just under two-and-a-half times as long. I am unsure whether this is due to inefficiency on the part of GCJ, or advances in CPU technology that are squashing the overheads of interpretation; probably a combination of both. Rewriting the test program in C might give better results, but that simply exchanges one set of comparative-fairness issues for another.

The comparison of real interest is between HotSpot and GCJ-compiled code. HotSpot in this case was roughly three-and-a-half times as fast, but it used around 100 times the amount of memory. These days 48 megabytes is not much at all, much as 7 megabytes was not much in the late 2000s, but the close to five gigabytes that HotSpot chews its way through would be a big hit on all but the beefiest of modern systems. My 2012-built home desktop has a total of 8GB, which is still quite respectable even today for a non-gaming system.


Java as a language is itself quite good, and as shown by the Dalvik run-time on Android it can be efficient, but on the desktop I developed a hatred of Java practically from day one. Those early days were Sun Ultra-5 systems with 64MB of total memory, so running the Java program I was developing would cause the system to swap like mad, and ever since, the anti-social memory demand has made me avoid Java as much as possible. At a personal level it was also nice to be able to go back to experiments I did many years ago and get results that were not far off the ones I remember.