Dmitry Gutov <dgutov@yandex.ru> writes:

>> I also wonder if some of the performance concerns may be premature. I've
>> seen references to poor performance in projects with 400k or even 100k
>> files. What is the expected/acceptable performance for projects of that
>> size? How common are projects of that size? When considering
>> performance, are we not better off focusing on the common case rather
>> than extreme cases, leaving the extremes for once we have a known
>> problem we can then focus in on?
>
> OT1H, large projects are relatively rare. OT2H, having a need for
> subprojects seems to be correlated with working on large projects.

There are medium-sized projects where sub-projects make a lot of sense
and where the performance of C-x p f (project-find-file) is fine both
in the super-project and in the sub-projects.

But it's true that the larger the project, the more likely the need for
sub-projects.  And it's also true, as you note, that C-x p f in a
sub-project might, surprisingly, be less performant than in the
super-project.

IMO, though, the slowness of C-x p f in large projects isn't solved by
git ls-files.  That can only do so much.  Once the project grows large
enough in files, git ls-files can be quick as lightning, but you just
won't be able to cons that ginormous list of files in the project-files
implementation.

This is why it's important to allow project-files to return a
generalized completion table.  To write those tables, you will need an
external tool (like "voidtool everything" on MS Windows, or GNU locate
on Linux) and the "backend completion" style that I'm implementing with
Stefan.

João
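
P.S. Just to illustrate the idea (this is not the actual patch Stefan
and I are working on), here is a rough sketch of a project-files method
that returns a programmed completion table backed by GNU locate instead
of consing the full file list.  The `my/locate-project' project type is
made up for the example, it assumes GNU locate is on PATH, and it
presumes callers of project-files can deal with a function-valued
table, which is precisely the change I'm arguing for:

(require 'project)
(require 'cl-lib)

;; A project instance here is just (my/locate-project . ROOT-DIR).
(cl-defmethod project-root ((project (head my/locate-project)))
  (cdr project))

(cl-defmethod project-files ((project (head my/locate-project))
                             &optional _dirs)
  "Return a completion table that queries GNU locate lazily."
  (let ((root (expand-file-name (cdr project))))
    (lambda (string pred action)
      (if (eq action 'metadata)
          '(metadata (category . file))
        (complete-with-action
         action
         ;; Ask locate only for names below ROOT that match what has
         ;; been typed so far, instead of listing everything up front.
         ;; locate exits non-zero on no match, hence ignore-errors.
         (ignore-errors
           (process-lines "locate" "-i" (concat root "*" string "*")))
         string pred)))))

With something along those lines, C-x p f never has to materialize the
whole file list; each completion request shells out for just the
matching subset.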