If I do `ls -1 target_dir | wc -l`, I get a count of files in a directory. I find this a bit cumbersome. Is there a more elegant or succinct way?
5 Answers
Assuming bash 4+ (which any supported version of Ubuntu has):
num_files() (
    shopt -s nullglob             # make * expand to nothing in an empty directory
    cd -P -- "${1-.}" || return   # default to the current directory
    set -- *                      # load the glob matches into $1, $2, ...
    echo "$#"                     # the count of positional parameters
)
Call it as `num_files [dir]`. `dir` is optional; otherwise it uses the current directory. Your original version does not count hidden files, so neither does this. If you want that, `shopt -s dotglob` before `set -- *`.
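For instance, the hidden-file-counting variant might look like this (a sketch; `num_files_all` is just an illustrative name):
num_files_all() (
    shopt -s nullglob dotglob   # dotglob makes * match hidden entries too
    cd -P -- "${1-.}" || return
    set -- *
    echo "$#"
)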
Your original example counts not only regular files, but also directories and other devices -- if you really only want regular files (including symlinks to regular files), you will need to check them:
num_files() (
    local count=0
    shopt -s nullglob
    cd -P -- "${1-.}" || return
    for file in *; do
        [[ -f $file ]] && let count++   # -f matches regular files and symlinks to them
    done
    echo "$count"
)
If you have GNU find, something like this is also an option (note that this includes hidden files, which your original command did not do):
num_files() {
    # print one 'x' per file and count bytes -- immune to newlines in names
    find "${1-.}" -maxdepth 1 -type f -printf x | wc -c
}
(change `-type` to `-xtype` if you also want to count symlinks to regular files).
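A sketch of that variant, under the same GNU find assumption:
num_files() {
    # -xtype f also matches symlinks whose targets are regular files
    find "${1-.}" -maxdepth 1 -xtype f -printf x | wc -c
}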
Won't `set` fail if there are very many files? I think you might have to use `xargs` and some summing code to make that work in the general case. – l0b0 May 01 '14 at 11:31
Also `shopt -s dotglob` if you want files starting with `.` to be counted – Digital Trauma May 01 '14 at 15:19
@l0b0 I don't think `set` will fail under these circumstances, since we're not actually doing an `exec`. To wit, on my system, `getconf ARG_MAX` yields 262144, but if I do `test_arg_max() { set -- {1..262145}; echo $#; }; test_arg_max`, it happily replies 262145. – kojiro May 01 '14 at 17:52
@l0b0 `set` is a shell builtin; it does not suffer from `ARG_MAX` restrictions. – Chris Down May 01 '14 at 21:28
Potentially more elegant. Definitely not more succinct. And not as obvious as other solutions. – Michael Martinez May 01 '14 at 22:32
@MichaelMartinez Writing obvious code is not a substitute for writing correct code. – Chris Down May 02 '14 at 05:51
@ChrisDown I don't want to interfere in the discussion on code correctness, but doesn't that solution have memory usage proportional to the number of files (if one considers an average name length)? – Emmanuel May 02 '14 at 17:11
@l0b0 I was trying to explain what Chris Down said more succinctly. The `ARG_MAX` restriction is an error given by the `exec` family of functions – it's not a feature of the shell per se, and any command that doesn't `exec` would not suffer from the limitation. Shell builtins are the most common example of commands that take arguments but don't `exec`. – kojiro May 05 '14 at 13:18
+1 for the `find` solution. It deals correctly with special filenames (like ones with line breaks in them). – LatinSuD Jun 19 '14 at 18:14
f=(target_dir/*); echo ${#f[*]}
This works correctly for files with spaces, newlines, etc. in the name.
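One caveat (my note, not the answerer's): with bash's default settings an unmatched glob stays literal, so an empty directory reports a count of 1. A sketch with `nullglob` enabled to avoid that (`count_entries` is an illustrative name):
count_entries() (
    shopt -s nullglob    # empty dir -> empty array -> count of 0
    f=("${1-.}"/*)
    echo "${#f[@]}"
)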
it could. you could also put it directly in the shell. that version assumed you wanted the current directory; i've edited it so it's closer to your question. basically it creates a shell array variable containing all the files in the directory, then prints the count of that array. should work in any shell with arrays -- bash, ksh, zsh, etc. -- but probably not plain sh/ash/dash. – Aaron Davies May 02 '14 at 15:40
`ls` output is multi-column only when it goes directly to a terminal, so you can remove the `-1` option. You can also remove `wc`'s `-l` option and only read the first value (a lazy solution; not to be used for legal evidence, criminal investigations, mission-critical systems, tactical ops...):
ls target | wc
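If you want that first value programmatically rather than by eye, here is one way to pick it off (my sketch, assuming bash; `n` is an illustrative variable name):
read -r n _ < <(ls target | wc)   # n receives the first field: the line count
echo "$n"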
@Emmanuel You'll need to parse the result of your `wc` to get the number of files in even the trivial case, so how is this even a solution? – l0b0 May 02 '14 at 07:22
@Emmanuel This can fail if `target` is a glob which, when expanded, includes some things that start with hyphens. For example, make a new directory, go into it and do `touch -- {1,2,3,-a}.txt && ls *|wc` (NB: use `rm -- *.txt` to delete those files.) – David Richerby May 02 '14 at 07:51
Did you mean `wc -l`? Otherwise you get newline, word, and byte counts of the `ls` output. That is what David Richerby said: you have to parse it again. – erik May 02 '14 at 20:55
@erik I meant `wc` with no argument; you don't need to parse it if your brain knows that the first field is the newline count. – Emmanuel May 02 '14 at 22:06
If it's succinctness you're after (rather than exact correctness when dealing with files with newlines in their names, etc.), I recommend just aliasing `wc -l` to `lc` ("line count"):
$ alias lc='wc -l'
$ ls target_dir|lc
As others have noted, you don't need the `-1` option to `ls`, since it's automatic when `ls` is writing to a pipe. (Unless you have `ls` aliased to always use column mode. I've seen that before, but not very often.)
An `lc` alias is quite handy in general, and for this question, if you look at the "count the current directory" case, `ls|lc` is about as succinct as you can get.
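To keep the alias around between sessions, you could add it to your shell startup file (assuming bash here):
echo "alias lc='wc -l'" >> ~/.bashrc   # new interactive shells will pick it up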
So far Aaron's is the only approach more succinct than your own. A more correct version of your approach might look like:
ls -aR1q | grep -Ecv '^\./|/$|^$'
That recursively lists all files - not directories - one per line, including .dotfiles, beneath the current directory, using shell globs as necessary to replace non-printable characters. grep filters out any parent directory listings or `..` or `*/` or blank lines - so there should only be one line per file - the total count of which `grep` returns to you. If you want child directories included as well, do:
ls -aR1q | grep -Ecv '^\.{1,2}/|^$'
Remove the `-R` in either case if you do not want recursive results.
I tend to prefer to do that kind of thing with `find`. If you only want a count, this should work: `find -mindepth 1 -maxdepth 1 -printf '\n' | wc -l` (remove the depth controls to get recursive results). – Aaron Davies Aug 17 '15 at 21:33
@AaronDavies - that doesn't actually work. Put a newline in any of those filenames and see for yourself. Also, to do the same thing portably you do: `find . \! -name . -prune | wc -l` - which still doesn't work, of course. – mikeserv Aug 19 '15 at 04:45
I don't follow -- the `printf` instruction prints a constant string (a newline) that doesn't include the filename at all, so the results are independent of any strange filenames. This trick can't be done at all with a `find` which doesn't support `printf`, of course. – Aaron Davies Aug 29 '15 at 23:37
@AaronDavies - oh, true. I assumed the filename was included. It can be done portably, though, of course: `find .//. \!. -name . -prune | grep -c '^\.//\.'` – mikeserv Sep 08 '15 at 07:16
brilliant! `/` being the only other character that can't appear in filenames, the `.//.` sequence is guaranteed to appear exactly once for every file, right? a couple questions though -- why `.//.`, and why `-prune`? when would this differ from `find . \! -name . | grep -c '^\.'`? (i assume the `.` in your `\!.` is a typo.) – Aaron Davies Sep 18 '15 at 03:45
@AaronDavies - the `-prune` is to keep from recursing. The `/` can appear in soft-links, but `find` doesn't list them that way. It's spec'd to report paths a certain way, and so you can rely on the spec in such a way that you can contort the output in simple, one-off ways that accomplish your goal. It's why the spec is so important - in order to reliably do a thing you need to know reliably and up front what it won't do first. – mikeserv Sep 20 '15 at 02:46
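For reference, here is the portable count from this comment thread with the `\!.` typo (noted above) corrected - a sketch rather than a verbatim quote:
# Each top-level entry is printed with a .//. prefix; since / cannot occur
# inside a filename, grep counts exactly one match per entry, even when a
# name contains embedded newlines.
find .//. \! -name . -prune | grep -c '^\.//\.'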
`ls` already gives the total count, so how about `ls -l | head -1`? Make it an alias if you want something shorter. – Daniel Wagner May 02 '14 at 04:19
`ls -l` indicates the total size of the files, not the number of files. – David Richerby May 02 '14 at 07:58
`ls | wc -l` will give you the wrong count if any file names contain newlines. – chepner May 02 '14 at 19:29
`stat -c %h .` gives the same information as `ls -ld . | cut -d" " -f 2` – ctrl-alt-delor Aug 02 '15 at 22:50