Today I learned that bash has hashmaps (2024)
xeiaso.net
169 points by stefankuehnel 5 days ago
It’s amazing how much of a superpower merely reading the manual is nowadays.
While I agree with the larger sentiment I think you’re making (I also make it a habit to peruse the manual just to see what is available), how often do you reread the manual, especially for your shell? I knew about associative arrays in bash, but not by reading the manual as those were introduced in v4 and I’ve been using bash for longer than that.
You should probably read the release notes for new major versions. Also, anytime you think “I wish there was <feature> available in this program”, you should probably double-check if such a feature has appeared since you last learned the program.
That's how I discovered the '&' regex feature in less some years back.
Do you mean PCRE’s “(?&name)” feature? That is not portable; for instance, Python calls it “(?P=name)”. Or do you mean the common feature of “&” in the replacement string meaning “the whole matched string”? While this feature is common in languages which use regexps (including ed, sed, awk, etc.), that’s not actually a regexp feature, since the & character has that effect in the replacement string, not in the regular expression. It is also not always portable; for instance, Emacs requires escaping as \& for the character to have the special feature.
From less(1)'s man page:
&pattern
        Display only lines which match the pattern; lines which do not
        match the pattern are not displayed. If pattern is empty (if
        you type & immediately followed by ENTER), any filtering is
        turned off, and all lines are displayed. While filtering is in
        effect, an ampersand is displayed at the beginning of the
        prompt, as a reminder that some lines in the file may be hidden.
$ less --version
less 487 (GNU regular expressions)
Copyright (C) 1984-2016 Mark Nudelman
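A hypothetical session to illustrate (& and the pattern are typed at less's own prompt, not the shell's; the file name is made up):

$ less /var/log/syslog
&error        (show only lines matching "error")
&             (empty pattern: turn filtering off again)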
Oh, you meant the program “less”! Right, I understand what you meant now.
:-)
Yeah.
I was viewing some files and thinking "it'd be really cool if less(1) had some way of filtering to just lines matching a specific pattern".
RTFM'd and TIL'd.
Mind that at that point I'd used less for ... oh, at least 30 years.
Agreed. This is also how I found out that features in Python I took for granted, like breakpoint(), were much more recent than I thought (3.7). Nothing like having to use ancient versions of a language to discover limitations.
I once worked with a dev (frontend, of all things) who reread the bash manual annually.
I work in ops, and I don’t even have that dedication. My hat is off to them.
TBF, the Bash manpage now runs > 80 pages:
man -Tps bash | ps2pdf - bash.pdf
That can be daunting. Though illuminating.

Bash is low on the list of things to learn, especially as many greybeards suggest keeping to POSIX compatibility and/or using a "proper" language (Python; older: Perl) for anything longer than a few lines.
They suggest that based on their experience. Error handling in bash is awful, so I advise against using it for anything complex.
You can go a really really long way, with a script that will work everywhere, with just set -e and aborting at the first error.
Better yet, the 'unofficial bash strict mode':
set -euo pipefail
IFS=$'\n\t'
That has lots of issues. See https://mywiki.wooledge.org/BashPitfalls#set_-euo_pipefail for some.
It does, yes, but IME if you can get people who aren't familiar with shell to use it, it prevents more problems than it causes on the whole.
Then again, so does enforcing passing shellcheck in CI.
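To make the pitfalls concrete, here is a minimal sketch (app.log is a hypothetical file) of the classic interaction between set -e and grep, which exits 1 when it finds nothing:

#!/bin/bash
set -euo pipefail

# Pitfall: if app.log contains no ERROR lines, grep exits 1 and
# this assignment silently aborts the entire script.
matches=$(grep 'ERROR' app.log)

# Workaround: tolerate the "no match" exit status explicitly.
matches=$(grep 'ERROR' app.log || true)
echo "found: ${matches:-none}"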
Anything complex should be written in a competent language like Java. Script languages (like Bash and Python) are for short (a few lines long) scripts. Using the tool outside the scope of what it was designed for is not a good idea.
Tell me you have never seriously used Python without telling me you have never seriously used Python.
I mean, viewing Python strictly as a scripting language? I am honestly lost for words. There are many huge and major applications and web sites written in Python, without people regretting it after the fact. And yet here you are dismissing it out of hand without a single argument.
Meanwhile, most of the time topics like this come up and people hate on shell scripts, those of us who like them see those criticisms the same way you're looking at this comment about Python: so far out there it's almost not worth responding. I think that's why GGP and GGGP say "greybeards" don't think it's worthwhile based on experience - it's actually not worth arguing against misinformed comments, so newer people don't realize it's still heavily used, just quietly in the background for things it's actually good at.
Further down is a comment about that: https://news.ycombinator.com/item?id=42664939
> …not worth arguing against misinformed comments …
Yeah, I have these same response patterns. Shell works really well for some use cases. I generally don’t respond to the comments that list the various “footguns” of shell, or that complain about security holes, etc. My use cases are not sensitive to these concerns, and even besides this, I find the concerns overstated.
I see that kind of thing all the time. Usually it is about static types. People think that dynamic languages aren't "serious", or something. It is laughable that these people still make up a significant number of comments, here in 2024.
Don't be too rude, this is a common view among people who are technically adjacent but not engineers, like IT people. It's an incorrect superstition, of course, but in tech almost everybody has their superstitions. There's no reason to be rude -- ignorance is not a crime.
Using a tool beyond its design can be problematic.
But Python is not designed to only be a scripting language:
> What is Python?
> Python is an interpreted, interactive, object-oriented programming language. It incorporates modules, exceptions, dynamic typing, very high level dynamic data types, and classes. It supports multiple programming paradigms beyond object-oriented programming, such as procedural and functional programming. Python combines remarkable power with very clear syntax. It has interfaces to many system calls and libraries, as well as to various window systems, and is extensible in C or C++. It is also usable as an extension language for applications that need a programmable interface. Finally, Python is portable: it runs on many Unix variants including Linux and macOS, and on Windows.
When my scripts outgrow bash, they almost always wind up in Python.
That said, Sonnet 3.5 has gotten me much further in bash than was possible before - and it's all really maintainable too. I highly suggest consulting with Sonnet on your longer scripts, even just asking it "what would you suggest to improve".
Why shouldn't you use Python for larger projects, and why do so many startups succeed with large Python repos?
Give bash some credit - it's actually amazing for even very large, complex systems (of which there are many -- quietly doing their jobs for decades.)
With xargs or GNU parallel, you can have multi-processing, too. Combining with curl or ffmpeg, you can literally build a production-grade web scraper or video transcoding pipeline in a couple of minutes and a few dozen lines of code.
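A minimal sketch of that idea (urls.txt, one URL per line, is a hypothetical input file):

# Download a list of URLs, 8 transfers at a time, with xargs:
xargs -P 8 -n 1 curl -fsSO < urls.txt

# Roughly the same thing with GNU parallel:
parallel -j 8 curl -fsSO {} < urls.txt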
My one-shot memory has worsened with the years, but grasping concepts from examples got much better thanks to experience. So I find walls of text much less useful than a continuous "synopsis" of snippets and best practices. One usage example is worth a hundred words, and a few pages of them may be worth a whole manual, especially when kept "at hand" through the editor. For bash, idk if that exists on the internet, so I maintain my own in my one big text file. I wish every manual had a synopsis-only section (maybe I should just LLM these).
> One usage example is worth a hundred words ... For bash idk if that exists on the internet, ...
Totally agree with that - I maintain a big txt file too.
Maybe this bash compendium is a bit similar:
You might like these:
• <https://learnxinyminutes.com/>
And some code search engines:
• Debian: <https://codesearch.debian.net/>
• Python: <https://www.programcreek.com/python/>
• Linux: <https://livegrep.com/search/linux>
• GitHub: <https://grep.app/>
• HTML, JS and CSS: <https://publicwww.com/>
Oh dear. I've been trying to get people to not use this feature for a while.
One thing that has bitten me in the past is that, if you declare your associative arrays within a function, that associative array is ALWAYS global. Even if you declare it with `local -A` it will still be global. Despite that, you cannot pass an associative array to a function by value. I say "by value" because while you can't call `foo ${associative_array}` and pick it up in foo with `local arg=$1`, you can pass it "by reference" with `foo associative_array` and pick it up in foo with `local -n arg=$1`. But if you give the passed-in dereferenced variable a name that is already used in the global scope, it will blow up, e.g. `local -n associative_array=$1`.
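A minimal sketch of that "by reference" pattern (names are illustrative; requires bash 4.3+ for namerefs):

#!/bin/bash
declare -A config=([host]=example.org [port]=8080)

print_map() {
    # $1 is the *name* of an associative array. "ref" must not
    # collide with a variable name already in scope, or bash errors.
    local -n ref=$1
    local key
    for key in "${!ref[@]}"; do
        printf '%s=%s\n' "$key" "${ref[$key]}"
    done
}

print_map config    # prints host=example.org and port=8080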
As a general rule for myself when writing bash: if I think one of my colleagues who has a passable knowledge of bash will need to get the man pages out to figure out what my code is doing, the bash-fu is too strong and it needs to be dumbed down or rewritten in another language. Associative arrays almost always hit this bar.
I'm not seeing this local scope leak with bash 5.2.15. The below script works as I'd expect:
#!/bin/bash
declare -A map1=([x]=2)
echo "1. Global scope map1[x]: ${map1[x]}"

func1() {
    echo " * Enter func1"
    local -A map1
    map1[x]=3
    echo "   Local scope map1[x]: ${map1[x]}"
}

func1
echo "2. Global scope map1[x]: ${map1[x]}"

outputting:

1. Global scope map1[x]: 2
 * Enter func1
   Local scope map1[x]: 3
2. Global scope map1[x]: 2
The local scope leak seems to only happen when you drop down the call stack. See below how I can call func2 from the top level and it's fine, but if I call it from within func1, it will leak.
#!/bin/bash
declare -A map1=([x]=2)
echo "1. Global scope map1[x]: ${map1[x]}"
func1() {
    echo " * Enter func1"
    local -A map1
    map1[x]=3
    echo "   Local scope map1[x]: ${map1[x]}"
    func2
}

func2() {
    echo " * Enter func2"
    echo "   Local scope map1[x]: ${map1[x]}"
}
func1
func2
echo "2. Global scope map1[x]: ${map1[x]}"
outputting:
1. Global scope map1[x]: 2
* Enter func1
Local scope map1[x]: 3
* Enter func2
Local scope map1[x]: 3
* Enter func2
Local scope map1[x]: 2
2. Global scope map1[x]: 2

UPDATE: I did a bit of exploration, and it turns out ANY variable declared `local` is in scope for functions lower down in the call stack. But if you declare a variable as `local` in a called function that shadows the name of a variable in the caller, it will shadow the caller's name and reset the variable back to the caller's value when the function returns. I have been writing bash for years and did not realise this is the case. It is even described in the man page: "When local is used within a function, it causes the variable name to have a visible scope restricted to that function and its children."
Thank you. You have taught me two things today. One is a bash feature I did not know existed. The second is a new reason to avoid writing complex bash.
This is known as dynamic scope, as opposed to lexical scope: https://en.wikipedia.org/wiki/Scope_(computer_science)#Lexic...
Not only do they exist, but they have some fantastic foot guns! https://mywiki.wooledge.org/BashPitfalls#A.5B.5B_-v_hash.5B....
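One idiom that sidesteps the -v quoting trouble described there is the `+` parameter expansion, which yields a marker only when the key is set (a sketch with made-up data):

declare -A map=([a]=1)
key='weird]key'    # keys with odd characters are fine in associative arrays

if [[ ${map[$key]+_} ]]; then
    echo "key exists"
else
    echo "key missing"
fi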
>they have some fantastic foot guns!
Otherwise wouldn't be getting the full shell experience.
It's not like we have 50 years of good programming language design and still stick to things that were created 50 years ago.
Why hasn't shell scripting evolved? It's god-awful.
It's evolved quite a bit if you don't need a `sh`-compatible shell.
I'm a huge fan of `fish` for personal scripting and interactive use: https://fishshell.com/
Obviously you'll still have `bash` installed somewhere on your computer, but at least you don't have to look at it as much.
Evolve into what? Something that works and doesn't have foot guns? Then you might as well just use any other programming language that exists today. I will not suggest any alternative, as that just attracts flame wars.
Why hasn't using a rock as a tool evolved? Exactly, because it's god-awful and got superseded, use a real hammer or a cordless screwdriver instead.
Just like the post about improvements to C on the front page today, that list can be made infinite. Language designers have already learned from those mistakes and created Zig, Rust, Go, or even C++, which fix all of them and more. Fixing all the flaws would turn it into a different language anyhow. There is only so much you can polish a turd.
What would you like to see?
I’d guess there’s a solution to almost any set of priorities you have for shell scripting.
The domain space is extremely challenging: a shell, by default, executes arbitrary programs on behalf of the caller, takes arbitrary text as input, and is interpreted.
All modern shells allow calling of literally any binary using the #!/usr/bin/env randombinary incantation.
Upshot: bash has its place, and while some of that place is unearned inertia, much of it is earned and chosen often by experienced technologists.
In my case, if I’m doing a lot of gluing together of text outputs from binaries, bash is my go-to tool. It’s extremely fast and expressive, and roughly 1/4 the length of say python and 1/8 the length of say go.
If I’m doing a lot of logic on text: python. Deploying lots of places/different architectures: go
I use this as my main shell on Windows, and as a supplementary on Mac and Linux.
- bash is available pretty much everywhere, so if you learn it you can always use it, whereas if you learn a weird new shell you'll be occasionally forced to fall back on bash anyway, so people learn just bash for efficiency's sake (because learning one shell is painful enough as it is). And any proposed replacement will be non-portable.
- some of the things that make shell scripting terrible can't be fixed in the shell itself, and need changes to the entire ecosystem of console applications. e.g. it would be awesome if every Unix utility output structured data like JSON which could be parsed/filtered, instead of the soup of plaintext that has to be memorized and manipulated with awk, but that almost certainly won't happen. There's a bunch of backward-compatibility requirements like VT-100 and terminal escape sequences limiting the scope of potential improvements as well
- there's a great deal of overlap between "people who do a lot of shell scripting" and "people who are suspicious of New Stuff and reluctant to try it"
> it would be awesome if every Unix utility output structured data like JSON
I see this argument a lot, and I think it has a ton of overlap with the struggles from devs trying to grok RDBMS I see as a DBRE.
Most (?) people working with web apps have become accustomed to JSON, and fully embrace its nesting capabilities. It’s remarkably convenient to be able to deeply nest attributes. RDBMS, of course, are historically flat. SQL99 added fixed-size, single-depth arrays, and SQL2003 expanded that to include arbitrary nesting and size; SQL2017 added JSON. Still, the traditional (and IMO, correct) way to use RDBMS is to treat data as having relationships to other data, and to structure it accordingly. It’s challenging to do, especially when the DB providers have native JSON types available, but the reasons why you should are numerous (referential integrity, size efficiency, performance…).
Unix tooling is designed with plaintext output in mind because it’s simple, every other tool in the ecosystem understands it, and authors can rest assured that future tooling in the same ecosystem will also understand it. It’s a standard.
JSON is of course also a standard, but I would argue that on a pure CLI basis, the tooling supporting JSON as a first-class citizen (jq) is far more abstruse than, say, sed or awk. To be fair, a lot of that is probably due to the former’s functional programming paradigm, which is foreign to many.
Personally, I’m a fan of plaintext, flat output simply because it makes it extremely easy to parse with existing tooling. I don’t want to have to fire up Python to do some simple data manipulation, I want to pipe output.
If there were some kind of standard or widely-followed convention for Unix tools to print plaintext, I wouldn't mind it so much. It's the fact that you need to memorize a different flag and output format for each tool, followed by an awk command where the intention of the code is generally very obtuse, which bothers me. By contrast, for all its faults, the fact that PowerShell has unambiguous syntax for "select this field from the output" helps a lot with both reading and writing. e.g. to get your IP address, "Get-NetIPAddress | ? {$_.InterfaceAlias -like "Ethernet" -or $_.InterfaceAlias -like "Wi-Fi"} | select IPAddress" is a lot clearer in intent than the Unix equivalent regex soup, and it can be written without looking anything up by printing the raw output of "Get-NetIPAddress" to the shell and seeing what you need to filter/select on. You can even get tab completion to help.
A hypothetical Unix equivalent doesn't need to be JSON, I just brought that up as an example. But any structured data would be an improvement over the situation now, and as the PS example shows, with the appropriate ecosystem tooling you can do all the data manipulation over pipes.
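For what it's worth, fragments of that future already exist. A rough Unix analogue of the PowerShell pipeline above, assuming Linux's iproute2 (which has a -j flag for JSON output) plus jq, with "eth0" as a placeholder interface name:

ip -j addr show \
  | jq -r '.[] | select(.ifname == "eth0")
               | .addr_info[] | select(.family == "inet") | .local'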
"fantastic foot guns"
It wouldn't be a worthy bash feature if after learning about it you wouldn't spend a few days figuring out why the damn thing doesn't work the way it works in any other language.
Indeed! It's delightful, like all of bash. I've used them in a few places, but I try not to.
Too many. I still write /bin/sh syntax (I know it's a symlink to bash now, but I mean the old-school sh). Anything that requires bash-that-isn't-sh is usually better written in Perl or something else.
Only on some distros.

Debian variants tend to link dash, which self-consciously limits itself to POSIX compatibility.
I love shell. I think its killer feature is pipes, and whoever figured out how to design it so you can pipe in and out of control structures (Doug McIlroy?) is a goddamn genius. However, after writing one too many overly clever shell scripts, I have a very clearly delineated point at which the script has become too complex and it is time to rewrite in a language better suited to the task: when I need associative arrays.

A lot of the sins of shell are due to its primary focus as an interactive language. Many of the features that make it so nice interactively really hurt it as a scripting language.
Pipe-related concepts in various restricted forms were floating around for years. Doug McIlroy indeed proposed them in 1964 and was heading the Bell Labs team when they were implemented in the Third Research Edition of Unix (1973).
Sure, but in and out of control structures? I think that's the major point your parent was making.
Control structures? Do you mean something like '{ cmd1 || cmd2; } | cmd3', where the control structures are the braces?
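Loops and conditionals are pipeline citizens too; a small sketch:

seq 1 100 | while read -r n; do
    (( n % 7 == 0 )) && echo "$n"
done | tail -3    # prints 84, 91, 98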
I advocate the following rules for when to write and when not to write a shell script.
# Write a shell script:
* Heavy lifting done by powerful tool (sort, grep, curl, git, sed, find, …)
* Script will glue diverse tools
* Workflow resembles a pipeline
* Steps can be interactively developed as shell commands
* Portability
* Avoid dependency hell
* One-off job
# Avoid shell scripting:
* Difficult to see the preceding patterns
* Hot loops
* Complex arithmetic / data structures / parameters / error handling
* Mostly binary data
* Large code body (> 500 LoC)
* Need a user interface
A need for associative arrays (implemented in Bash via hashmaps) moves the task to the second category (avoid shell scripting).
I’ve been a sysadmin for nearly a decade, a frontend web designer for much longer (I still have a book on all the exciting changes in HTML4), and while I can very easily learn, compose, and use many markup and scripting languages, I have always struggled with full-on programming languages, and I’m not exactly sure why. Part of it, I think, is that most tutorials and online learning resources are focused on novices who don’t have any existing grasp on general programming concepts and syntax, but I’m already pretty deep into Bash. To the extent that I am sure I have crossed the thresholds you list, and used quite long and complex Bash scripts for tasks that would almost certainly be easier in Python. I’d love to find A Bash Scripter’s Guide To Python or something similar—an intermediate course that assumes that I already know about variables, operators, functions, Boolean expressions, et al. I have searched for this a few times, but it’s full of keywords that makes searching Google difficult.
So this has inspired me to Ask HN, I’m getting ready to post it with a reference to this discussion, but thought I’d start here: does anyone know of a good resource for learning Python (or Go, Perl…any good small project/scripting/solo hacking languages) that’s tailored for folks who already have a good foundation in shell scripting, markup, config/infra automation, build tools, etc.? I’m open to books, web-based tutorials, apps, or just good examples for studying.
I’m open to the notion that I simply haven’t put in the work, and powered through the doldrums of novice tutorials far enough to get to where the meaty stuff is. Worst case, I’m a big fan of taking courses at the local community college for my own edification, absent any specific degree program or professional certification. It would still necessitate a lot of remedial/redundant steps, but I can always help others through those chapters while filling in any of my own gaps. Day-to-day, I generally work alone, but I find such things are easier with others to help motivate and focus my efforts. Perhaps I have answered my own question, but even still, I appreciate any advice/recommendations folks might have.
One of the differences between "general programming" and the kinds of coding that you have done is the /approach/. I'm not sure if that's even the right word, but there's a different set of concerns that are important.
My guess is that you can easily learn the syntax and that you have the logical and analytical skills, but the part you have to learn is how to THINK about programming and how to DESIGN a program. Whether or not you've taken coding classes, I think reviewing topics like data structures, algorithms, encapsulation/object-oriented programming, functional programming, etc. is the way to learn how to think about general programming. I don't think the language matters, but there might be resources about these topics in the language you're interested in.
An example of what you DON'T want is something like Effective Go (not just because it's out-of-date now): https://go.dev/doc/effective_go. This page can give you a really good base of information about what Go is, but I think you'll get much more bang for your buck with a resource that is more about the WHYs of programming rather than the WHATs.
My search through existing Ask HN has shown that this, unsurprisingly, has been asked before by several people in other specific contexts (side note, Ask HN is such an incredibly useful resource, thanks to everyone who engages with these questions). I don’t want to add more noise, so I’m going to work through the existing answers before posting.
If anyone else is interested, this thread from 2020 is where I am starting, it seems to align with my own quest pretty well: https://news.ycombinator.com/item?id=22932794
> assumes that I already know about variables, operators, functions, Boolean expressions, et al.
Learning Go by Jon Bodner is a good choice. It seems to assume that Go is the reader's second (or tenth) language.
A couple things:
- Between using Bash's functions and breaking large scripts into multiple, separate scripts, one can keep things reasonably tidy. Also, functions and scripts can be combined in all the ways (e.g., piping and process substitution) that other programs can.
- If I run into a case where Bash is a poor fit for a job, I ask myself, "Self, what program would make this easy to do in a shell script?" If I can respond with a good answer, I write that, then continue with the shell scripting. If not, I write in something else (what the kids would call a "proper" language).
No one else seems to have mentioned this, but POSIX sh does not include this feature.
Until it does, and until major POSIX shells have shipped with it for a decade, the feature won't exist in a way that the average shell coder cares about. You're better off just shipping Rust/Go/etc. binaries and leaving sh for your simple glue tasks.

Even I've eventually switched to this, and I've written 10k+ line Bash scripts that followed best practices and passed shellcheck's more paranoid optional tests.
Use the right language for the right job.
> then the feature will actually exist in a way that the average shell coder cares about
I think it's worth picking at this a bit. At least IME, a fairly small fraction of the Shell I write needs to run literally (any|every)where.
I don't mean to suggest there aren't people/projects for whom the opposite is true--just that it's worth chunking those cases and thinking about them differently.
It obviously isn't a global solution, but in the Nix ecosystem we have decent idioms for asserting control over the shell and tools any given script runs with.
Reliable environment provisioning can spare hundreds of lines of code real-world Shell tends to accumulate to sanity-check the runtime environment and adapt to the presence/absence of various utilities. It also enables us to use newer shells/features (and shell libraries) with confidence.
It's enough leverage that I only give it up when I must.
Bash Associative Arrays [1] are handy! Some examples of how I've used them:
- my site builder (for evalapply.org): inject metadata into page templates. e.g. https://github.com/adityaathalye/shite/blob/b4163b566f0708fd...
- oxo game (tic-tac-toe): reverse index lookup table for board positions: https://github.com/adityaathalye/oxo/blob/7681e75edaeec5aa1f...
- personal machine setup: associate name of installed application to its apt source name, so we can check for the app, and then install package https://github.com/adityaathalye/bash-toolkit/blob/f856edd30...
[1] I'd say "hashmap" is acceptable, colloquially. However, I don't think Bash computes hashes of the keys.
(edit: fix formatting snafu)
Nice that these exist—but does anyone else absolutely abhor shell programming? The syntax is impossible to memorize, it’s incredibly easy to make mistakes, and debugging is a pain. I hate it more than C++ and AppleScript.
I enjoy it quite a lot :shrug:
Bash itself isn't a very big language, so I wouldn't call it "impossible to memorize".
I have around a billion short bash scripts. It's rare that I've used an array, but it's cool that bash has them, as long as the script doesn't go beyond a few lines.
Unfortunately, macOS ships an earlier version of bash that does not include associative arrays, so they're not as portable as you might like.
Picky and probably pointless question: are they actually hashmaps? If I understand correctly, a hashmap isn’t the only way to implement an associative array.
This seems pertinent: NEWS > bash-5.0 to bash-5.1 > mm
Seems to suggest it's indeed a hashmap in implementation, but I can't be bothered to look any closer.
That is certainly true. Associative arrays in C++ (std::map), for example, are typically implemented as red-black trees.
I think many devs don't know the difference and simply call any dictionary/associative array a hash map. It might be one of the concepts that "the internet" promotes: it sounds fancy, more technical, makes you seem more knowledgeable, so it gets repeated. Then newcomers think this is the way it's always been called, and that gives it enough visibility to become the preferred name.
A couple of weeks ago I learnt that Bash on Mac does not have associative arrays. We worked around the issue by changing the script to run under Zsh, but beware.
Many programs on macOS are stuck in the ancient past, due to Apple:
<https://web.archive.org/web/20240810094701/https://meta.ath0...>
Sounds like a bash 3 issue.
$ bash --version
GNU bash, version 5.2.32(1)-release (aarch64-apple-darwin23.4.0)
$ declare -A aaa; aaa[a]=a; aaa[b]=bb; for i in ${!aaa[@]}; do echo "$i --> ${aaa[$i]}"; done
b --> bb
a --> a
To elaborate on this, macOS's default bash is still stuck (presumably due to licensing) at v3.2.x (released in 2006). Bash got associative arrays in v4 (released in 2009).
You should be getting bash from homebrew anyways.
Or using an OS which doesn’t ship ancient software. Linux exists. It’s pretty awesome.
Just use the included zsh.
I like my scripts to run on both linux and macos. It's somewhat limiting, but it saves trouble.
Linuxes have zsh. If you install zsh on Linux or bash on macOS, the hassle is about the same. Installing homebrew is required, but I always do that myself anyway. Although bash is more often the available shell on cloud services.
bash associative arrays are fantastic, I've made heavy use of them, but be warned that there have been several memory leaks in the implementation. (Sorry, no version notes, once we realized this, we rewrote a key component in C.)
IIRC, the latest bash addresses all of these, but that doesn't help too much if you are stuck with an older, stable OS, e.g., RHEL 7 or 8; even 9 likely has a few remaining.
These leaks become an issue if you have a long running bash script with frequent adds and updates. The update leak can be mitigated somewhat by calling unset, e.g., unset a[b], before updating, but only partially (again, apologies, no notes, just the memory of the discovery and the need to drop bash).
I'd agree with the idea that bash wasn't the best choice for this purpose in the first place, but there was history and other technical debt involved.
Every few years I rediscover this fact and every few years I do my best to forget it
I discover stuff like this every day, and it’s delightful. Sure, reading the manual would’ve saved me from the surprise, but it’s incredibly difficult for me to read them unless I have a specific goal in hand.
I found out about hashmaps in Bash a year ago[1], and it came as a surprise. This morning, I came across dynamic shell variables and the indirection syntax[2]. Each time I wrote about them and learned more than I would have if I had just passively grokked the manual.
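For anyone who hasn't met it, indirection in a nutshell: ${!name} expands the variable whose name is stored in name (values here are illustrative):

greeting="hello"
ptr=greeting
echo "${!ptr}"    # prints: hello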
I'm guilty of this. I knew zsh had them but since I can never remember the exact things zsh has that bash doesn't, I just assume anything remotely useful isn't compatible.
This policy comes from a six hour debugging session involving (somewhere) a script that manipulated a binary file - bash can't store zero bytes in variables and zsh can, but it's not like it'll tell you that it's pointlessly truncating your data and I never thought to look. So now every step in that script gets converted back and forth through xxd, and I don't trust Bash anymore.
Got burned on this on an interview question before.
In a related question I said something about switching to a language like Python when a script started to get "complicated"
Then the interviewer explained how his favorite language was bash, how he uses it for everything...
I did not get the job. Ironically, at my next job I did a bunch of Perl-to-Bash conversion.
One thing the article doesn't mention is how you can use indirect expansion to get a list of all keys.
For example: `${!myvar[@]}` would list all of the keys.
I've written about associative arrays in Bash a bit here: https://nickjanetakis.com/blog/associative-arrays-in-bash-ak...
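A short sketch of iterating over those keys (sample data is made up):

declare -A capitals=([France]=Paris [Japan]=Tokyo)
for country in "${!capitals[@]}"; do
    echo "The capital of $country is ${capitals[$country]}"
done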
doesn't work in zsh?
It doesn't, but if you're putting this into a script, that's OK. You can set the script's shebang to bash, so even if your user's shell is zsh, the script will run with bash.
When you feel the need for a hash in shell, or even an array, is also probably when you should rewrite your shell in python.
(I feel obliged to add that when you feel the need to add type annotations a bit later is when to abandon python)
Note that other shells also have associative arrays, or at least zsh does. I've found hyperpolyglot [0] to be a nice Rosetta stone for translating syntax between e.g. bash and zsh.
[0]: https://hyperpolyglot.org/unix-shells#associative-arrays
Before I learned about SSH identities, I used to have a shell script sourced into my bashrc with a hashmap containing all my hosts and full connect string, so that I could `ssh hostname` like one does with SSH identities.
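A sketch of that kind of setup (host names and connect strings are made up), sourced from ~/.bashrc:

declare -A HOSTS=(
    [web]="admin@web1.example.com"
    [db]="-p 2222 root@10.0.0.5"
)
# Unquoted expansion on purpose, so "-p 2222" word-splits into ssh flags.
sshto() { ssh ${HOSTS[$1]:?unknown host}; }

# Usage: sshto web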
Don't loop over values with `*`; the key/value difference is whether there's a `!` at the start of the expression (`${!map[@]}` gives keys, `${map[@]}` gives values). The `*` and `@` rules are the same as for $@ and $*, and you almost never want `*`.
> Q: How do I declare a Hashmap?
> A: You use the command `declare -A HASHMAP_NAME`
This is why I think Bash is a horrible language
what do you mean?
Just that it is ugly and strange
I don't know about that.
std::unordered_map<std::string, int> myHashMap;
looks a lot uglier and stranger to me.

With that kind of syntax, wouldn't you just use Perl or Python instead?
They are also slow AF because a lookup takes linear time.
Which is because they're not hashmaps, they're associative arrays. Article treats them as the same thing, but they're not - that's why the "declare -A" is A and not H - it stands for "associative".
Author of the article here. As far as I care, if it quacks like a hashmap, it's better to describe it as a hashmap. The fact that it's an associative array under the hood is irrelevant.
It doesn't; it's just that you probably never heard the quack of a hashmap so you don't know the difference.
A hashmap is a very specific structure that uses a hashing algorithm to normalize the key sizes, and then usually constructs a tree-like structure for fast key lookups. Anything that doesn't use such an implementation under the hood should not be called a hashmap, whether you care or not.
You have this the wrong way around. Associative array is the abstraction you are talking about. Hash maps are a specific implementation of that abstraction. You could also define hash map as an abstraction which has the same interface as associative array but with more strict performance characteristics. But either way it's wrong because bash is neither implemented as a hash map nor has the performance characteristics of a hash map.
For reference the main implementations of associative array are alists/plists (linear time), balanced binary trees, b-trees (logarithmic time) and hash maps (amortised constant time).
Where are people learning that "hashmap" means associative array? It's obviously an easier word so I can see the natural language preferring that. Is this common? I can see it causing some communication problems when the details are important.
bashmaps?
LZ77 compression/decompression in pure Bash using hashmaps:
For some tasks, if as much as possible were coded in bash, it would work when called from any programming language, anywhere.
Now to add hashtables to that.
To me, this is a development in the wrong direction. Shell is great precisely because it's so minimal. The everything-is-a-string rule is one that calms all of your type-induced fears.

Having to implement hash tables, while still keeping up the appearance of everything being a string, is the anti-pattern known as "string programming" (i.e., when an application develops a convention for storing type/structure information inside unstructured data, often strings).
I would love for there to be a different system shell with a small number of built-in types that include some type for constructing complex types. There are many languages that have that kind of type system: Erlang, for example. But, extending Unix Shell in this way is not going to accomplish this.
> To me, this is a development in the wrong direction.
That ship has sailed. These were introduced in Bash 4, released in 2009, and Bash is already on version 5.