[Neil Schemenauer]
> I seem to remember someone saying that GCC generated better code for:
>
>     if (exceptional) {
>         do exceptional things
>         break / return / goto
>     }
>     do normal things
>
> Is GCC in the dumb category?

Yes, any compiler that doesn't do branch prediction based on *semantic*
analysis is dirt dumb.  A simple example of semantic prediction is
"comparing a pointer to NULL is probably going to yield false".  Ditto
comparing a number for equality with 0.

I'd like to see a reference for the pattern above; it goes against the
very common "forward branches usually aren't taken" heuristic.  Note
that Vladimir applied that gimmick to an extreme in obmalloc.c's
malloc() function.

> Also, the Linux kernel is starting to use this set of macros more
> often:  ["likely" and "unlikely"]

They're late to the party.  Cray had a "percent true" directive 20
years ago, allowing for 48 bits of precision in specifying how
(un)likely <wink>.

> ...
> I don't have GCC >= 2.96, otherwise I would have tried sprinkling
> some of those macros in ceval and testing the effect.

Maybe more interesting:  One of the folks at Zope Corp reported getting
significant speedups by using a gcc option that feeds real-life branch
histories back into the compiler.  Less work, and less error-prone,
than guessing annotations.
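
For concreteness, the kernel's macros are thin wrappers around gcc's
__builtin_expect() (the gcc >= 2.96 feature in question); a minimal
sketch, where my_malloc() is just a made-up caller to show usage:

    #include <stdlib.h>

    /* Linux-style branch hints: compile away to plain tests on
     * compilers without __builtin_expect(). */
    #if defined(__GNUC__) && (__GNUC__ > 2 || \
            (__GNUC__ == 2 && __GNUC_MINOR__ >= 96))
    #define likely(x)   __builtin_expect(!!(x), 1)
    #define unlikely(x) __builtin_expect(!!(x), 0)
    #else
    #define likely(x)   (x)
    #define unlikely(x) (x)
    #endif

    void *my_malloc(size_t n)
    {
        void *p = malloc(n);
        if (unlikely(p == NULL))   /* hint: allocation rarely fails */
            abort();
        return p;
    }

The hint nudges the compiler to lay the failure path out of line, much
like the hand-rolled if-exceptional pattern quoted at the top.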
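
For anyone who wants to try the feedback trick: the gcc flags for it
are, if memory serves, -fprofile-arcs to build an instrumented binary
and -fbranch-probabilities to rebuild using the recorded counts.
run_typical_workload.py below is a stand-in for whatever workload you
consider representative:

    gcc -O2 -fprofile-arcs -o python ...          # instrumented build
    ./python run_typical_workload.py              # writes .da count files
    gcc -O2 -fbranch-probabilities -o python ...  # rebuild using counts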