
Either this is a bug, or I'm not understanding something. I use less -p value filename.txt to search for value in filename.txt, and when I run it on a file that only contains a couple of lines of text, it finds the match promptly and highlights the result.
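
For reference, a rough sketch of what I'm doing on the small file (value and filename.txt are placeholders):

```
# Open the file and jump to the first line that contains "value"
less -p value filename.txt

# The equivalent interactive search once inside less:
#   /value    search forward for "value"
#   n         repeat the search
```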

However, I'm also using less to search in a 73GB file. I'm confident the search string is in that file, yet less stops searching at a seemingly random location and doesn't highlight any result. In fact, on closer inspection of that screen, the text string is nowhere to be found either.

Is this expected behavior of less or what is going on here?

I'm able to exclude running out of memory as a cause: I measured memory usage with ps aux and by checking the process's /proc status file. ps aux reported 0.0 percent memory use, and VmSize was 18432 kB.
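
Roughly how I checked, in case the method matters (<PID> stands for the less process id):

```
# Memory columns as reported by ps (the [l] keeps grep from matching itself)
ps aux | grep '[l]ess'

# Virtual memory size as seen by the kernel
grep VmSize /proc/<PID>/status
```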

  • I'm also using **less** to search in a 73GB file. Dear God, why? Use something like grep or something. But, to answer your question, check how much memory less is using -- I know that less doesn't pull the entire file into memory, but using it to search for a phrase might use more. – Barry Carter Aug 15 '22 at 13:46
  • My problem is that I have a huge JSON-formatted file with some syntactic errors, and I'd like to find and edit them out. I don't see how grep would help: the work needed to construct fitting regex patterns is costly when all that's necessary is to go to the offending line with a text editor and make the change. Hence I'm using less. Open to alternative approaches too. – Tim Daubenschütz Aug 15 '22 at 13:51
  • Does this file have newlines, or is it a single line (or very few lines)? Something like tidy -json or json_xs or json_pp could break it into shorter lines, which would help both less and grep (see the sketch after these comments). Also, if your search strings are fixed, you can use fgrep, which is faster. Also, when you do find the errors, how do you plan to edit the file? – Barry Carter Aug 15 '22 at 13:54
  • Just to report on memory quickly: I ran ps aux before I fed less the search string for the big file. ps aux reported 0.0 memory use, and VmSize in the process's /proc file was 18432 kB. – Tim Daubenschütz Aug 15 '22 at 13:58
  • Since you're the OP, put the results of commands in your main message. – Barry Carter Aug 15 '22 at 13:59
  • If you're really curious, try running it under strace to see where it crashes (see the sketch below these comments) -- I'm guessing it's doing a lot of lseeking and perhaps even mallocing (although your measurement of its memory size remaining stable as it runs sort of rules that out). – Barry Carter Aug 15 '22 at 14:04
  • I found the question you asked a few comments up interesting: namely, how I'd edit the file once I'd found the parse error. And this is the key question. I know where the error occurs, and I could theoretically construct a sed or awk script by hand, but I'll have to fix the file in many places, and escaping and matching JSON is a ton of work, so I believe a text editor that supports large-file editing would truly be the solution here. Does this exist? – Tim Daubenschütz Aug 15 '22 at 14:06
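
A minimal sketch of the reformat-then-search idea from the comments (file names are placeholders; json_pp and jq are interchangeable here, and either will stop at the first syntax error it hits, which may itself point at the problem):

```
# Break the JSON into many short lines so line-oriented tools can cope
json_pp < huge.json > pretty.json     # or: jq . huge.json > pretty.json

# Fixed-string search with line numbers (fgrep is grep -F)
fgrep -n 'some string' pretty.json
```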
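
And a sketch of the strace suggestion, assuming less is already running and <PID> is its process id:

```
# Attach to the running less and watch its file and memory activity;
# Ctrl-C detaches without killing less
strace -p <PID> -e trace=lseek,read,mmap
```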

0 Answers