We have norms of grammar and usage for one purpose: to make language mutually intelligible among speakers of the same language. The norms are largely arbitrary, as can be seen by comparing the propriety of expressions across different languages. For example, many tonal languages do not mark plurals. How can this be? Seems unthinkable to an English speaker, right? Well, the billion-plus speakers of tonal languages in Asia seem to do just fine. Why? Because their grammar and usage do not require marking words as plural, thanks to the heavy reliance on context in their speech. The point being that a norm in English is not universal but is something that developed as the language itself developed and is not, as some might have you believe, sacrosanct (like the use of the second person plural just now and this run-on parenthetical fragment).
Norms are useful to the extent that speakers and writers internalize them and generally abide by them, which makes expression more readily intelligible to listeners and readers (though it should be noted that even our notions of speech and understanding bear correcting: it is well known among neuroscientists that producing and understanding speech involve a great deal of anticipation, so the completed expression of a speaker matters less than a common-sense picture of how language works would suggest). Likewise, expressions that deviate from the norm but cleave close enough to it to be readily intelligible to speaker and listener, or writer and reader, seem for all practical purposes to fulfill the function of linguistic expression regardless of existing norms. "Ain't" is such an expression. Every competent American English speaker who hears it knows what the speaker means, and it seems rather prudish and Victorian to publicly insist that one ought to say "isn't" instead when the two are functional equivalents. The history of the contraction does not matter, nor does any other highbrow criticism of it as lacking meaning, because by any reasonable definition of "meaning," "ain't" has it, and that meaning is clear.
Ludwig Wittgenstein summarized the problem with being priggish about grammar and usage succinctly: “For a large class of cases of the employment of the word ‘meaning’—though not for all—this word can be explained in this way: the meaning of a word is its use in the language” (Philosophical Investigations §43). Language is practical and is driven by how it is used, not by rules that are appended to it (and which, it must be said, are always appended after the fact; the edifice of language necessarily preexists the rules applied to it). The notion that language requires policing is narcissistic, to say the least. In the spirit of openness and honesty, I must admit that I am predisposed to this narcissism and struggle mightily not to be a grammar and usage zealot. In any case, the problem is this: if the goal is intelligibility and an expression is understood, then it is narcissistic and patronizing to insist that speakers conform their speech not to intelligibility but to the norm that the pedant, as unofficial arbiter of all things language, informs them is the only way to express themselves intelligibly.
It must be said that norms serve a purpose, and there are situations in which adhering to them is necessary. Adhering to them in such cases, however, is usually not a matter of intelligibility but of culturally transmitted norms of credibility. Hence, when a lawyer writes a brief to an appellate court, she will avoid writing "ain't": the accepted norms of legal writing deem the expression improper, and, given the profession's insistence that those norms be followed, using it will diminish her credibility in the eyes of the brief's readers even though "ain't" would have been perfectly intelligible to those same readers. I believe that ad hoc and arbitrary norms in any specialized field should be abandoned in favor of common usage, but I also recognize that until the norms are abandoned one must conform to them if one hopes to maintain his or her credibility. The same is true of speech among any specialized cohort, such as teenagers, chemists, or detectives.
Language is perpetually in tension, pulled between the particular and the general. This reflects the fact that language develops in response to local conditions yet is also common to diverse populations, so each influence pulls at the other and vice versa. Hence we have phenomena such as the Northern Cities Vowel Shift, a geographically particular change in the way many people in cities from Buffalo to Minneapolis pronounce certain vowel sounds. It would be patently ridiculous to say that speakers from this region are somehow speaking improperly, just as it would be ridiculous to admonish all speakers of English from the 15th century or so onward for adopting the pronunciations common after the Great Vowel Shift, as if they were speaking a debased English, an improper dialect if you will.
The same can be said of grammar and usage: the particular influences the general and vice versa, not because of some failure to abide by fixed and inviolate rules but because any human language is (and should be) subject to the vicissitudes of its speakers' experiences. Thus "hopefully" as commonly used may strike some as daft or moronic or unintelligible, but for the vast majority of English speakers and readers the term is immediately understood and perfectly functional. While the pedants decry or laugh at the "malapropism," the rest of us just use it. And this is how it should be.
So speak to be understood, and when you understand, don't pontificate or act pompous about how another said it. You are not Moses, and words are not written in stone.