Language may rely less on complex grammar than previously thought: study
scitechdaily.com
20 points by mikhael a day ago
What did I previously think?
There's something funny about linguists seeing complex grammar behind language, instead of custom.
I would be interested in a study comparing breaches of English adjective order (e.g. "red big ball" instead of "big red ball") against noun-verb number disagreement (e.g. "the dogs barks")...
Paywall.
In any case, the short answer is "No!" There is a LOT written about language, and I find it difficult to believe that almost ANY idea presented here is really new.
For example, have these guys run their ideas past Schank's "conceptual dependency" theory?
The article takes the fact that we appear to treat non-constituents (e.g. “in the middle of the”) as “units” to mean that language is more like “snapping Legos together” than “building trees.”
But linguists have proposed the possibility that we store “fragments” to facilitate reuse—essentially trees with holes, or equivalently, functions that take in tree arguments and produce tree results. “In the middle of the” could take in a noun-shaped tree as an argument and produce a prepositional phrase-shaped tree as a result, for instance. Furthermore, this accounts for the way we store idioms that are not just contiguous “Lego block” sequences of words (like “a ____ and a half” or “the more ___, the more ____”). See e.g. work on “fragment grammars.”
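Here's a minimal sketch of that idea in Python (my own toy encoding for illustration; the Tree class, the category labels, and the two fragments are invented, not taken from the paper or the fragment-grammar literature): a stored fragment is literally a function from trees to a tree, and discontiguous idioms fall out naturally.

    from dataclasses import dataclass

    @dataclass
    class Tree:
        label: str      # syntactic category, e.g. "NP", "PP" (labels invented here)
        children: list  # sub-Trees, or words (str) at the leaves

        def words(self):
            # Flatten the tree back into its word sequence.
            out = []
            for c in self.children:
                out.extend(c.words() if isinstance(c, Tree) else [c])
            return out

    def in_the_middle_of_the(noun: Tree) -> Tree:
        # Stored fragment: a PP-shaped tree with one noun-shaped hole.
        return Tree("PP", ["in", Tree("NP", ["the", Tree("N", ["middle"]),
                           Tree("PP", ["of", Tree("NP", ["the", noun])])])])

    def a_blank_and_a_half(noun: Tree) -> Tree:
        # Discontiguous idiom "a ___ and a half": not a contiguous Lego block.
        return Tree("NP", ["a", noun, "and", "a", "half"])

    print(" ".join(in_the_middle_of_the(Tree("N", ["night"])).words()))
    # -> in the middle of the night
    print(" ".join(a_blank_and_a_half(Tree("N", ["mess"])).words()))
    # -> a mess and a half

The Lego picture only covers contiguous word strings; once fragments are functions over trees, "a ___ and a half" is just as storable as "in the middle of the".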
Can’t access the actual Nature Human Behaviour article, so perhaps it discusses the connections.
There's no reason to assume that a human word begins and ends with a space. Compound words exist. The existence of both "aforementioned" and "previously spoken of" isn't based on a deep neurological construct of compound words.
Sorry, I'm not following. What do spaces have to do with this? Grammar is dependent on concepts like lexemes (sort of like words), but there aren't any spaces between lexemes in spoken language.
Not sure why you bring up Schank's conceptual dependency theory. That was back in the late 60s, and I don't think anybody has worked on that theory in decades.
Unless you’re referring to the academic paper, I’m not getting a paywall.
I read the article (but not the paper), and it doesn’t sound like a no. But I also don’t find the claim that surprising, given that in other languages word order matters a lot less.
In languages where word order matters a lot less, the grammar is still there; it just relies more on things like case markers and agreement markers (i.e. morphology). In Latin, for instance, "homo canem mordet" and "canem homo mordet" both mean "the man bites the dog," because the case endings, not the word positions, mark who is doing what to whom.
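To make the order-independence concrete, here's a toy parser in Python over an invented mini-language (the -ka and -po suffixes are made up for illustration; they're not real morphology in any language): grammatical roles are read off the suffixes, so every word order yields the same analysis.

    def parse(sentence: str) -> dict:
        # Assign grammatical roles by suffix alone, ignoring position entirely.
        roles = {}
        for word in sentence.split():
            if word.endswith("ka"):    # invented subject (nominative) marker
                roles["subject"] = word[:-2]
            elif word.endswith("po"):  # invented object (accusative) marker
                roles["object"] = word[:-2]
            else:
                roles["verb"] = word
        return roles

    # Different orders, same analysis:
    assert parse("dogka bites manpo") == parse("manpo dogka bites") \
        == parse("bites manpo dogka") \
        == {"subject": "dog", "object": "man", "verb": "bites"}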
> In any case, the short answer is "No!".
If the question you're answering is the one posed by the Scitechdaily headline, "Have We Been Wrong About Language for 70 Years?", you might want to work a bit on resistance to clickbait headlines.
The strongest claim that the paper in question makes, at least in the abstract (since the Nature article is paywalled), is "This poses a challenge for accounts of linguistic representation, including generative and constructionist approaches." That's certainly plausible.
Conceptual dependency focuses more on semantics than grammar, so it isn't really a competing theory to this one. Both theories do challenge how language is represented, but in ways that don't really overlap much.
It's also not as if conceptual dependency is some sort of last word on the subject when it comes to natural language in humans - after all, it was developed for computational language representation, and LLMs have made it essentially obsolete for that purpose.
Meanwhile, the way LLMs do what they do isn't well understood, so we're back to needing work like the OP to try to understand it better, in both humans and machines.