A Dive Into Linux Command Line Bashing
Recently, a customer at my Day Job asked for data that could only be retrieved from our public API. But most of our customers aren’t the API-driven types. So rather than tell them “Use the API”, I Gave Them The Pickle, as is customary for us.
To generate the report, I needed to parse some JSON at the command line, do some basic search-and-replace, and convert it all to CSV. The data came from two API calls: one producing a key piece of data (an ID), and the other producing separate details related to those IDs. The API calls were easy enough with wget, and I was going to use jq to parse the resulting JSON.
Now I had TWO problems
I was told that jq was simple to use by somebody who’d used it in the past. But after fiddling with it for 20 minutes without much luck, I reached out to a coworker who’s a highly respected developer. My coworker admitted that he wasn’t super familiar with jq either, and that he typically resorted to tried-and-true Unix commands and pipes rather than learning something new that could only do one thing.
Those tried-and-true commands he was talking about were the very ones I had thought it would be “cheating” to use, because I’d be forgoing a newer tool.
Do One Thing Well
Those tried-and-true commands we were referring to? None other than the usual suspects: awk, sed, cut, and grep, and of course the Unix pipe (|) to glue them all together. Really, why use a JSON-parsing program that could only do one thing (parse JSON) when I could use a combination of tools that, when piped together, could do far more?
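As a toy example of the kind of pipeline I mean (the JSON shape here is invented, since the real payload isn’t shown), grep and sed alone can pull a field’s values out of a response:

```shell
# Hypothetical JSON payload -- stands in for a real API response.
json='{"items":[{"id":"A1","name":"foo"},{"id":"B2","name":"bar"}]}'

# grep -o prints only the matched part; sed then strips the key and quotes,
# leaving one ID per line.
echo "$json" | grep -o '"id":"[^"]*"' | sed 's/"id":"\(.*\)"/\1/'
```

It’s fragile against pretty-printed or reordered JSON, but for a one-off report against a known API response, it gets the job done.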
Would I really gain anything by using a specialized tool released in 2013 (jq is what Alton Brown would call a “unitasker”: a device created for one job and one job only) over using the common tools released for Unix between 1973 and 1985? And would it be worthwhile to learn something brand new instead of using tools that were already well known to me?
I decided it wasn’t. So I got to work doing things the way I always do them.
grep, sed, awk, and cut, and a loop.
For Things in Thing, Do Stuff. Thanks.
The bash script that I ended up writing does things quick and somewhat ugly. First, it saves the output of the API calls to a couple of temp files. Then it parses the JSON with the aforementioned Unix staples in a for loop. Finally, it cross-references another piece of info I need from the second API call and uses sed to replace it in my output. All done, with the needed output sent to my customer, in less time than it would have taken me to properly learn jq.
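Here’s a rough sketch of how such a script might hang together. The endpoint URLs, JSON field names, and ID format are all invented for illustration; the wget calls are commented out and the API responses stubbed with sample data so the sketch runs offline:

```shell
#!/bin/sh
# Sketch only: URLs, field names, and IDs are made up, not the real report.

# 1. Save each API response to a temp file. The real script would fetch:
# wget -q -O ids.json     'https://api.example.com/items'
# wget -q -O details.json 'https://api.example.com/details'
cat > ids.json <<'EOF'
{"items":[{"id":"X-100"},{"id":"X-200"}]}
EOF
cat > details.json <<'EOF'
{"details":[{"id":"X-100","status":"active"},{"id":"X-200","status":"expired"}]}
EOF

# 2. Parse the IDs out of the first response with grep and cut.
ids=$(grep -o '"id":"[^"]*"' ids.json | cut -d'"' -f4)

# 3. Loop over the IDs, cross-reference the second response, build CSV rows.
echo "id,status" > report.csv
for id in $ids; do
    status=$(grep -o "\"id\":\"$id\",\"status\":\"[^\"]*\"" details.json \
             | cut -d'"' -f8)
    echo "$id,$status" >> report.csv
done

# 4. A final sed search-and-replace pass over the output.
sed 's/X-/ITEM-/' report.csv
```

Quick and somewhat ugly, as advertised: it depends entirely on the JSON arriving in a predictable shape, but that’s a fair bet for a one-off pull from a single known API.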
An Added Benefit: Portability
Bash might not be sexy, and grep, sed, awk, and cut might not be cutting edge, but they are everywhere. MacOS? Yup. Linux? Obviously. And Windows? With WSL2, absolutely. In fact, that’s where I developed this tool, and I then ran it on a native Linux machine without having to install any additional packages. Had I used jq, I’d have had to make sure it was installed everywhere my script was intended to be run. That’s just the first step on the stairway down to dependency hell.
This Might Be Awkward, but…
All of these musings led me to a conclusion that works for me in my situation: if I can’t do it with grep, sed, awk, and cut and a while loop, then maybe I’m not the right guy to do it. Oh sure, I could learn Python. But I don’t need Python on a daily basis like I do bash. And sure, I could learn Perl, but this is 2023 and people know better these days. [Editor’s note: He’s kidding, sort of, but he’s also mostly right.]
Sometimes the best tool for the job isn’t the New Shiny, but rather it is the tool you know, love, and trust. What are your favorite old and crusty go-to’s these days?