Hi,

I have a SCEV like this: {16,+,8}. Say that it came from a loop like:

  int64_t *p;
  for (int64_t i = 0; i < ...; ++i)
    p[i+2]...

and assume that I'm on a 64-bit machine. What I would like to do is normalize the SCEV back to the index space, i.e. map it to {2,+,1}.

I tried getting the underlying element size of the pointer and then calling getUDivExpr(OriginalSCEV, ElementSize), but I don't get the desired SCEV back. Instead I get ({16,+,8} /u 8), i.e. the udiv is just wrapped around the original expression rather than folded into it. Why is this not simplified?

Thanks,
Stefanos Baziotis
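[For context, a minimal sketch of the attempt described in the message above. Names such as SE, OriginalSCEV, and ElemTy are placeholders assumed for illustration, not taken from the original post; the calls used (getSizeOfExpr, getUDivExpr) are existing ScalarEvolution methods.]

  // Assuming SE is a llvm::ScalarEvolution& and OriginalSCEV is the
  // {16,+,8} add-recurrence, with ElemTy the pointee type (int64_t here).

  // Build a SCEV for sizeof(ElemTy): a constant 8 on a 64-bit target.
  const llvm::SCEV *ElementSize =
      SE.getSizeOfExpr(OriginalSCEV->getType(), ElemTy);

  // The division is only folded in limited cases; as described above,
  // this yields ({16,+,8} /u 8) rather than the hoped-for {2,+,1}.
  const llvm::SCEV *Normalized = SE.getUDivExpr(OriginalSCEV, ElementSize);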