João Távora <joaotavora@gmail.com> writes:
On Thu, Nov 24, 2022 at 3:01 AM Dmitry Gutov <dgutov@yandex.ru> wrote:
> I'm imagining that traversing a directory tree with an arbitrary predicate is going to be slow. If the predicate is limited somehow (e.g. to a list of "markers" as base file names, or at least wildcards), 'git ls-files' can probably handle this, with certain but bounded cost.
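To make that concrete, here is a minimal sketch of pushing a marker-style predicate down to 'git ls-files' via a pathspec glob. It assumes the directory is inside a Git work tree, and the function name my/git-ls-marker-files is hypothetical, not project.el API:

(defun my/git-ls-marker-files (root marker)
  "Return files under ROOT whose base name is MARKER.
Delegates the tree walk to `git ls-files' with a pathspec glob,
so Git consults its index instead of Lisp visiting every
directory.  ROOT must be inside a Git work tree.
Hypothetical sketch, not part of project.el."
  (let ((default-directory root))
    ;; ":(glob)**/MARKER" matches MARKER in any directory,
    ;; including the top level.
    (process-lines "git" "ls-files" "--"
                   (concat ":(glob)**/" marker))))

;; Example: (my/git-ls-marker-files "~/src/emacs/" "Makefile")

The point is not the exact call, but that a marker or wildcard predicate maps onto something Git can evaluate natively, which is where the bounded cost would come from.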
I've seen references to the superior performance of 'git ls-files' a couple of times in this thread, which has me a little confused. Other threads have stressed the importance of not basing development on assumptions about which VCS a project uses; for example, I would expect project.el to be completely neutral with respect to the VCS used in a project.
I also wonder whether some of the performance concerns may be premature. I've seen references to poor performance in projects with 400k or even 100k files. What is the expected/acceptable performance for projects of that size? How common are such projects? When considering performance, are we not better off focusing on the common case rather than the extremes, leaving those for when we have a known problem to focus on?