Tuesday, March 29, 2022

Some Thoughts on AllNoun

AllNoun is a constructed grammar by Tom Breton. It does not actually claim to be a constructed language, because it has no vocabulary--one simply borrows vocabulary as convenient from whatever source language one likes (typically English) and slots it into the AllNoun grammar. AllNoun was originally inspired by Glosa, which has a single syntactically flexible lexical class whose syntactic functions are disambiguated by heterogeneous function words. The aim of AllNoun was to take that one step further--eliminate the function words, and produce an entirely monocategorial grammar. In the process, AllNoun became one of the earliest documented language projects to invent the argument-tagging approach to verblessness, as also seen in Machi, which I previously reviewed.

The central conceit of AllNoun is that any word, with any semantics, can serve either referentially, or as a role marker. This is explained quite vividly in the following excerpt from the AllNoun FAQ:

Question:
Aren't there really two classes of noun, the "parts" and the "roles"?

Answer:
No, they really are interchangeable. Words may tend to be more useful as
roles or parts, but any word really can fit in either category.

As a limiting example, consider that in his column (and later book)
Metamagical Themas, Douglas Hofstadter once asked, in complete
seriousness, "Who is the Dennis Thatcher of America?". By this he meant,
"Who or what in America plays the same role that prime minister Margaret
Thatcher's husband plays in England?"

It seems to me that if the proper noun "Dennis Thatcher" can be a role,
then anything can be.
This is a feature that I have not seen totally replicated in any other conlang yet, which is kind of a shame. However, despite this gem of a core concept, AllNoun is on the whole a failure. And I don't feel particularly bad about saying that, since Tom himself has been open about problems that he saw in AllNoun a few years after its initial publication. However, I think Tom is not entirely correct about what the real problems actually are. I'll just go through them one by one:
Problems? My treatment of adjectives is the big one. It treats subsective and intersective adjectives well enough, but its intersective mechanism is not sufficient for nonsective adjectives. For instance, a "former friend" is not any sort of friend, and not easily seen as a member of "the set of all former things". "Alleged thief" is not the intersection of all thieves and all "alleged things"
This is a legitimate issue, but not as big a one as Tom seems to think. If you tried to regularize the syntactic semantics of English adjectives, you would get the same problem--English just allows lexical specification of modifier semantics (as most natural languages do). Granted, doing that in AllNoun would disrupt the engineered simplicity and elegance of the syntactic system, but there is another solution, which I employed in WSL: just treat non-intersectives as relations between the target of modification and the final referent. Anything can be a relation (or role) in AllNoun, so why not non-intersectives?
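To make that concrete, here is one way to cash out the difference--my own gloss in ordinary set/predicate notation, not anything taken from the AllNoun docs or from WSL. An intersective adjective denotes a straightforward intersection:

  [\![\text{red ball}]\!] = \{x : \mathrm{red}(x)\} \cap \{x : \mathrm{ball}(x)\}

whereas a non-intersective like "former" can instead be read as a two-place relation between the final referent and the modification target:

  [\![\text{former friend}]\!] = \{x : \mathrm{former}(x, \mathrm{friend})\}

with former(x, P) read as "x formerly bore the property P"--that is, "former" becomes just one more role word, which is exactly the kind of thing AllNoun's machinery already permits.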
Another difficulty, which Paul Doudna pointed out to me, is that it is unclear how propositions are to be expressed. At the time, I believed that a top-level nominal should be interpreted as existential. Eg, to say "I ate the apple" you'd express "an eating of the apple by me in the past", and it would be understood as existential "(There is) an eating of the apple by me in the past". However, it isn't as expressive as I would like. Paul also suggested it was weak at expressing fictional contexts, which aren't really existential, but I'm not sure it's any worse than natural language.
This basically comes down to an aesthetic concern rather than a functional one. Which is perfectly legitimate--if Tom doesn't like how the predication structure turned out, that's entirely his prerogative. But it does not constitute a functional failure of AllNoun as a potential grammar for a language.
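For what it's worth, that existential reading formalizes quite naturally; in rough neo-Davidsonian terms (my gloss, not Tom's), "an eating of the apple by me in the past" comes out as:

  \exists e\,[\mathrm{eat}(e) \land \mathrm{agent}(e, \mathrm{me}) \land \mathrm{patient}(e, \mathrm{apple}) \land \mathrm{past}(e)]

which is arguably the logical form that role-tagged nominal structures map onto most directly, so I don't see a functional problem here either.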
Non-declarative moods (questions, imperatives) worked clumsily, by using the declarative mood.
And yet, there are natural languages which get by just fine with little or no formal marking of different moods. So why shouldn't AllNoun?
Expressing determinacy ("the", "an", "some") was always clumsy. Determiners more than any other natural words want to modify the nominal adjacent to them, which is totally contrary to AllNoun. So I ended up with a lot of examples that simply did not express determinacy.
And again, there are plenty of natural languages which get by just fine with no articles and no grammaticalized definiteness system. So why shouldn't AllNoun?
Relative clauses, while they worked, tended to not be linear. Eg, you could express "apple *that I ate*", but AllNoun did not neccessarily make the relation of the relative clause to the matrix clause immediately clear. Eg, one might say: (apple (eat agent:me time:past patient:^)). ...where it isn't clear until the end of the relative clause that it is about the apple. Of course it's not neccessarily so: (apple (eat patient:^ agent:me time:past))
And yet again, there are plenty of natural languages that keep you in suspense with important structural information saved till the end, and plenty of natural languages which don't even have embedded relative clauses at all, instead relying on parataxis. So if AllNoun doesn't handle relatives very well... who cares? That's no reason it can't still be perfectly functional for communication!
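To make the suspense problem concrete, here is a toy parser of my own for that bracketed notation--the tokenization and the head-plus-role:part structure are assumptions read off Tom's two examples, not anything from the AllNoun spec:

  import re

  def tokenize(text):
      # Parentheses are their own tokens; everything else splits on whitespace.
      return re.findall(r"[()]|[^\s()]+", text)

  def parse(tokens, parent_head=None):
      # Assumed structure: "(" head role:part ... ")" where a nested "(...)"
      # modifies the current head, and "^" points back at the head of the
      # enclosing group.
      assert tokens.pop(0) == "("
      head = tokens.pop(0)
      mods = []
      while tokens[0] != ")":
          if tokens[0] == "(":
              mods.append(parse(tokens, parent_head=head))
          else:
              role, _, part = tokens.pop(0).partition(":")
              if part == "^":
                  part = parent_head  # only resolvable once we reach this tag
              mods.append((role, part))
      tokens.pop(0)  # consume the closing ")"
      return {"head": head, "mods": mods}

  print(parse(tokenize("(apple (eat agent:me time:past patient:^))")))
  # {'head': 'apple', 'mods': [{'head': 'eat', 'mods':
  #   [('agent', 'me'), ('time', 'past'), ('patient', 'apple')]}]}

The point is visible in the code: with patient:^ tagged last, nothing tells you the embedded clause is about the apple until the final token, whereas Tom's second ordering resolves the back-reference immediately.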

Finally, Tom does make a very good point about general semantics:
>  The : also seems to inherit some
>  of the many different uses of the possesive and "of". 

Yes.  In a philosophical sense, I believe that relatedness is basic in
communication.  Any communicator in any language has to, at some basic
underlying level, recognize a relation just because it is named,
without additional underlying mechanism.
David Gil, with his theory of associative semantics in Isolating-Monocategorial-Associative (IMA) language, would certainly agree!

So, having sung the praises of AllNoun in contradiction to its author, why do I think it's a failure? Because, while AllNoun is not a bad basis for some sort of language, it completely fails to meet its own primary design objectives. From the Introduction to AllNoun:
AllNoun has only one part of speech, which is largely but not entirely
analogous to nouns in other languages. Thus the name AllNoun.

Words are never inflected in AllNoun. It is a 100% isolating language.

And yet, it fundamentally relies on semantically significant punctuation. Either the punctuation symbols are inflections, in which case AllNoun is not 100% isolating, or they are function words, in which case AllNoun does not, in fact, have only a single part of speech. Tom was not unaware of this objection, and addresses it in the FAQ:

Question:
Aren't the punctuation markers non-noun parts of speech? So it's not
really _all_ nouns, is it?

Answer:
I suppose in a very abstract philosophical way, one could consider
punctuation a part of speech. If that were the case, then we would say
that (say) English had not only nouns, verbs, etc. but also commas,
periods, and so forth, for maybe 13 or 14 parts of speech. But generally
we don't, in any language. Perhaps because there is no useful sense of a
vocabulary of punctuation.

In any case, I'm satisfied that AllNoun is nearly as homogeneous as
possible. IMO if punctuation markers are an anomaly they are a
neccessary one.
But this is a false equivalence: natural languages can be, and for many, many centuries were, written entirely without punctuation while still retaining their meaning. Punctuation serves to make reading easier, through a variety of means such as marking sentence types, more generally marking clause or other constituent boundaries, and giving hints to prosody--but it does not, by itself, carry semantic content. AllNoun punctuation, on the other hand, makes up a much larger proportion of the system than punctuation does in any natural language--comparable to a typical distribution of function words--and it is utterly indispensable to encoding meaning. Indeed, the entirety of AllNoun as a "constructed grammar" consists of the rules for how to use the punctuation! Furthermore, Tom acknowledged that, if AllNoun were to be spoken, the punctuation would have to be pronounced, and described his proposed pronunciations as "words":
Q: So how are you going to pronounce the punctuation? ( ) : ^

A: As I see it, the best way is to treat groups of one or more punctuators
_infixed between part and role_ as pronounceable words, and also include
single parentheses. That way multiple infix-pronounciation can efficiently
join into a single word, except for free parentheses which could stack.
11 short verbal symbols are required:

  :     ):    :(    ):(   ^:    )^:   ^:(   )^:(  ^     )^    )     (
  0     1     2     3     4     5     6     7     8     9     a     b

The symmetries above should be reflected in the sounds. Here is an
unofficial proposal for how it might be sounded:

  :     ):    :(    ):(   ^:    )^:   ^:(   )^:(  ^     )^    )     (
  awf   af    oof   if    aws   as    oos   is    awsh  ash   ath   ooth
  /Of/  /&f/  /Uf/  /If/  /Os/  /&s/  /Us/  /Is/  /OS/  /&S/  /&T/  /UT/

So a sentence like...

  beat boy^(chase dog^(catch cat^(eat mouse^(leave maid cheese^ table:on.

...might sound like...

  Beat boy awshooth chase dog awshooth catch cat awshooth eat mouse
  awshooth leave maid cheese awsh table off on.
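As a sanity check that the scheme hangs together, here is a little script of my own (not anything from Tom's materials) that applies that table with longest-match and lets adjacent punctuator sounds fuse into a single word, the way "^(" comes out as "awshooth":

  SOUNDS = {
      ":": "awf",  "):": "af",   ":(": "oof",  "):(": "if",
      "^:": "aws", ")^:": "as",  "^:(": "oos", ")^:(": "is",
      "^": "awsh", ")^": "ash",  ")": "ath",   "(": "ooth",
  }

  def speak(text):
      words, prev_punct, i = [], False, 0
      while i < len(text):
          ch = text[i]
          if ch in ":^()":
              # Longest punctuator cluster wins (table entries are 1-4 chars);
              # a single punctuation character always matches, so we advance.
              for size in (4, 3, 2, 1):
                  sound = SOUNDS.get(text[i:i + size])
                  if sound:
                      if prev_punct:
                          words[-1] += sound  # adjacent clusters fuse
                      else:
                          words.append(sound)
                      i += size
                      prev_punct = True
                      break
          elif ch.isspace() or ch == ".":
              prev_punct = False
              i += 1
          else:
              # Read an ordinary word up to the next punctuator or space.
              j = i
              while j < len(text) and text[j] not in ":^()." and not text[j].isspace():
                  j += 1
              words.append(text[i:j])
              prev_punct = False
              i = j
      return " ".join(words)

  print(speak("beat boy^(chase dog^(catch cat^(eat mouse^(leave maid cheese^ table:on."))

Run on Tom's example sentence, it produces "beat boy awshooth chase dog awshooth catch cat awshooth eat mouse awshooth leave maid cheese awsh table awf on"--the same as his quoted reading, apart from capitalization and his "off" where the table's spelling for ":" is "awf".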
So, I rest my case. AllNoun is not, in fact, made entirely of nouns, or of any single part of speech. Furthermore, AllNoun does not, in fact, represent the simplest that a grammar could possibly be. It is not "as homogeneous as possible", nor are punctuation markers a "necessary" anomaly--and this is easy to prove by construction, if we simply demonstrate the existence of actual monocategorial grammars. As far as I know to date, these come in two types:
  1. Concatenative / combinator grammars.
  2. IMA grammars.
Concatenative grammars are based on combinatory logic--or, more specifically, on the parenthesis-free SKA calculus, all of whose morphemes belong to a uniform class of combinators with differing arities. Once again, we must mention Fith--Fith does use more than one formal part of speech, which is a legacy of its origin as an artistic language for fictional aliens, but it is a concatenative language, and as such it didn't strictly need more than one part of speech from an engineering point of view. One could argue that differing arities make for different parts of speech--but then, for consistency, we would have to consider intransitive, transitive, and ditransitive English verbs to all be different parts of speech as well, and no one does.

Nevertheless, we still have another option: David Gil's IMA grammar, introduced in the paper How Much Grammar Does It Take To Sail A Boat? In such a language, the lexicon is strictly monocategorial, with all words belonging to the syntactic class S (or Sentence), and the grammar is strictly isolating and associational. In other words, the only syntactic rule is that two words or phrases that are next to each other can form a larger constituent, and the two parts of a constituent are interpreted as being associated with each other in some way. Unlike concatenative grammars, which are completely structurally unambiguous, an IMA grammar is almost maximally ambiguous--exactly how to group words into a parse tree, and exactly what kind of association each grouping encodes, is all left up to context and pragmatics.

It may seem that no language could possibly actually function that way, and indeed all human languages do have at least some function words; however, contrary to Tom's intuition, they often aren't strictly necessary. They exist because, in the words of William Annis, "people be extra", and as David Gil demonstrates, it is actually possible in some languages to find surprisingly extensive examples of people conversing in pure IMA form, without resorting to any function words. It works because language always exists in a larger context, and people are really good at pragmatic inference.
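To see just how underdetermined pure association is, here is a trivial illustration of my own (the toy utterance is made up; it is not one of Gil's examples): with "adjacent constituents may combine" as the only rule, an n-word utterance has Catalan(n-1) possible binary bracketings, before you even ask what each grouping means.

  def groupings(words):
      # Every binary bracketing of a sequence of words; with n words there
      # are Catalan(n-1) of them, and the kind of association inside each
      # pair is still left entirely to pragmatics.
      if len(words) == 1:
          return [words[0]]
      parses = []
      for split in range(1, len(words)):
          for left in groupings(words[:split]):
              for right in groupings(words[split:]):
                  parses.append((left, right))
      return parses

  for parse in groupings(("chicken", "eat", "finish")):
      print(parse)
  # ('chicken', ('eat', 'finish'))
  # (('chicken', 'eat'), 'finish')

And each of those bracketings is still compatible with many different associations (possession, predication, modification, and so on), so the interpretive space blows up fast--yet, with enough shared context, speakers cope.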

So, there you go. That's how simple a grammar can really be.

So, what of AllNoun? I have rather complicated feelings towards it. Not until sitting down to write this review did I realize that, actually, there's quite a bit that it does well. If we ignore what its creator wanted from it, and treat it as just another loglang, it's pretty neat. It could serve as a great basis on which to construct other languages with other goals, and it does at least one cool thing that I really think is worth playing with more extensively. But I really want to dislike it. From the first time I ever encountered it, it always struck me as a wrong thing--not the way to do what it was supposed to do, and in the unfortunate position of being one of the standard examples of "verbless language" for a generation of conlangers, and thus promulgating a very flawed view of what a minimalist language can really be. It was not until many years later that I formalized my idea of "spitelanging"--creating conlangs just to prove other people wrong about what a language could or could not be like--but early exposure to AllNoun may very well have been the initial spark that primed me to take up that cause!
