Proposal: add a --by-dir option #225
Comments
Sounds like a reasonable request. I could see it being pretty useful for those who don't want to use SQL via the new (yet-to-be-released) SQL option. If you are going to look into this, you might want to hold off till Go 1.16, because the changes to the file/path functions there look like something that should be integrated. It's certainly something I want to evaluate. I'll have to try out the new walking/directory methods and compare them to what's in scc currently to see whether that's the correct path forward, though. Odds are I would merge this, so long as it's done in such a way that I can maintain it.
I think an even more useful option would be
Or even
I may be wrong on this, but isn't it possible to do
I do this from time to time with fzf like so
when I want stats for a single file.
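The exact commands above were lost in formatting, but a one-liner of the kind described here might look like the following sketch (a hypothetical reconstruction, not the original command; it only assumes that fzf prints the selected path to stdout and that scc accepts a file argument):

```sh
# Pick a file interactively with fzf and have scc report stats for just that file.
scc "$(fzf)"
```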
@boyter I think if you do that, you will get the results by-file anyway (the tool collates all arguments). I think what we're asking for is for the tool to report each argument separately, so kind of equivalent to doing a shell for loop and running scc for each directory (but an order of magnitude faster, because it can cache results).

In fact, that is what I'm currently doing. I'm building a tool that uses scc to add a `.metadata.json` file in every directory, holding the number of lines of every nearest child (file/subdirectory). This can be super useful for tools like LSD, NeoTree, Ranger, or web interfaces (git forges) to show the SLOC of every file and directory in the file tree. As mentioned, I currently have to run scc on every single directory, which is quite slow (a largish codebase of 0.5M lines takes 3 minutes on my M1). Even worse, it grows superlinearly, because larger codebases have more nesting. (E.g., the Linux repo at 25M LoC is intractable.)

(Alternatively, if you think the idea/standard I'm trying to advance is a good one, scc could create the JSON files itself. Happy to work with you on that.)
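A rough sketch of the per-directory workaround described above, assuming it runs from the repository root and using scc's existing `--format json` output (the `.metadata.json` name comes from the comment; the loop itself is illustrative):

```sh
# Run scc once per top-level directory and store its JSON report inside that
# directory. This is the slow per-directory approach the comment wants to avoid.
for dir in */; do
    scc --format json "$dir" > "${dir}.metadata.json"
done
```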
I came by to ask for a similar feature. Specifically, what I want is sub-total reports per directory. That means that in a repository with directories
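(The directory names above were lost in formatting.) With hypothetical placeholder names, the request amounts to something like:

```sh
# Current behaviour (per the earlier comment): one combined report.
scc foo bar

# Requested behaviour, approximated today with separate invocations,
# each producing its own sub-total.
scc foo
scc bar
```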
Thanks for the comment. This is now back on my list of things to investigate.
OK, the first step towards this has been done: a max-depth option has been added to the file walker (boyter/gocodewalker@2458d89), allowing part two of this. Note I need to make a release of that to bring it into scc, but that will happen in the next few days.
At the moment I am inclined to go with the following format,
Which provides the most flexibility with the least work for me. It also has the benefit that you can dump your args into a file and then use that to rerun things more easily. Where things get harder is the display. I'm looking for ideas on how to actually display this; I have something like the below for the moment as something to consider. However, JSON output is also something to consider, as it would require either a new JSON/XML/CSV etc. output format or a non-backwards-compatible format change.
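One way to read the "dump your args into a file" idea (a sketch under assumptions: `dirs.txt` and the directory names are placeholders, and the eventual flag and report format are still undecided above):

```sh
# Keep the directories you want reported in a file, one per line, so the same
# run can be repeated later; xargs hands them back to scc as arguments.
printf '%s\n' cmd internal pkg > dirs.txt
xargs scc < dirs.txt
```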
I think that looks great!
I often find it useful to list code size by directory; use cases are:

- Getting a feel for a new codebase (e.g. "ah, most of the code is in the `foo` package").

(A `--test` flag, similar to `--gen`, might also make sense, especially for environments where tests aren't in separate directories, like Go, but that's a different issue.)

Right now there's `--by-file`, but that's too fine-grained. So I propose to add a new flag, `--by-dir`. The parameter works the same as `-d` for e.g. `du`; I could add a new `--depth` parameter, but I think just accepting a value here makes more sense.

I'll work on a patch, but I wanted to make an issue first to discuss it, to prevent working on something that may not get merged.
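For comparison, the `du` analogy in practice, next to what the proposed flag might look like (hypothetical: `--by-dir` does not exist yet, and its exact shape is what this issue is discussing):

```sh
# du's depth parameter: sub-totals one directory level deep.
du -d 1 .

# Proposed scc equivalent (hypothetical, per this issue): per-directory
# counts, limited to one level of depth.
scc --by-dir 1 .
```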
Thanks!