Source: http://blog.cfelde.com/2010/06/c-vs-java-performance/

C++ vs Java performance; It’s a tie!

So a while back I came across this Java vs C++ performance benchmark. It was somewhat dated, and since the Java VM keeps improving (as, I'd presume, does the C++ compiler), I thought it would be interesting to rerun the tests and see where we'd end up.

So this is what I did: I started up a new Rackspace Cloud Server of their biggest kind (the most CPU and 15.5 GB RAM) and booted it with Debian 5.0. I then compiled all the tests and ran each of them 25 times, using the time utility to measure the elapsed real (wall-clock) time of every run and logging it for later analysis. From the 25 runs of each test, the slowest and the fastest were discarded, leaving 23 sample points per test. Those 23 samples were then averaged, giving a single measurement per test.
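The run-trim-average procedure above can be sketched as a small bash harness (my reconstruction, not the actual run scripts from the ZIP files; the command under test is a placeholder):

```shell
#!/usr/bin/env bash
# Run a command 25 times, drop the fastest and slowest run,
# and average the remaining 23 elapsed real times.
RUNS=25
CMD=${1:-"sleep 0"}      # placeholder for a test binary, e.g. ./matrix
TIMEFORMAT=%R            # make bash's `time` print elapsed real seconds only
times=()
for ((i = 0; i < RUNS; i++)); do
    t=$( { time $CMD > /dev/null; } 2>&1 )   # `time` reports on stderr
    times+=("$t")
done
# Sort, strip the first (fastest) and last (slowest) sample, then average.
avg=$(printf '%s\n' "${times[@]}" | sort -n | sed '1d;$d' \
      | awk '{ sum += $1 } END { printf "%.3f", sum / NR }')
echo "trimmed average over $((RUNS - 2)) runs: ${avg}s"
```

Dropping the extremes like this is a crude but effective way to keep a single cold-cache or otherwise anomalous run from skewing the average.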

C++

The diagram below (click to enlarge) shows the run time for each of the C++ test cases, compiled with the -O, -O2, and -O3 optimization flags. All test cases were compiled with -march=x86-64, using g++ version 4.3.2.

Not too surprisingly, performance generally improves as the optimization level increases.

Java

Next up is Java, run with the -client and -server flags; the diagram below (click to enlarge) shows the run time for each Java test case. The Sun Java HotSpot VM, version 1.6.0_20, was used, and as we can see there's not much difference between the -client and -server flags on these test cases.

C++ vs Java

So how do they compare to each other? In the diagram below (once again, click to enlarge), I've taken the absolute best run time for each test from both C++ (whichever of -O, -O2, or -O3 was fastest) and Java (whichever of -client or -server was fastest) and set them up against each other.

And the conclusion? It’s a tie between C++ and Java, each “winning” 6 of the 12 test cases.

Additional information

I’ve made available two ZIP files. The first contains all the source code, the compile and run scripts, and the input files (which are quite large, hence the ZIP size); the second contains all of that plus the compiled binaries and the time logs for each test case.

There’s also a Google spreadsheet with all the numbers and diagrams.

Finally, there were two tests I excluded, hash.cpp and hash2.cpp (available in the ZIP files), as g++ complained about a deprecated or antiquated header and refused to compile them.

