Wednesday, July 26, 2017

Language and Christianity, Part 2

Reading Christianity...
     The scriptural authority of the Bible has a fascinating history, which I only wish to touch on here. It was not until the sixteenth century that the doctrine of sola scriptura (scripture alone) became widely recognized and protestant theologians began to dismiss apostolic and ecclesiastic authority over matters of doctrine and belief in favor of close readings of the books of the Bible, and in particular the books of the New Testament. Today, not all protestants, and not even all evangelicals, follow the strictest version of sola scriptura, preferring instead the doctrine of prima scriptura (scripture first), which allows a role for non-textual influences on doctrine1.
    Readings of the Bible then began to assume two primary characteristics: first, scripture was to be read as a vehicle for absolute, literal, objective truth, and second, as either directly inspired by God (weak version) or written directly by God through the hands of human scribes (strong version). This caused all sorts of difficulties as modern scholars and scientists took up their work in the eighteenth and nineteenth centuries. Scientific discoveries made it increasingly difficult to reconcile the literal story of a seven-day creation with such monkey wrenches as geology and fossils thrown in. Textual scholars, especially in Germany, began to throw serious doubt on some of the assumptions about authorship, from the single-author tradition of the Pentateuch (Moses) to questions about the assigned authorship of New Testament books (e.g., Ephesians and 1 and 2 Peter, among others). Then came Charles Darwin. Darwin's earlier work had drawn comparatively little fire from Christians, but publication of The Descent of Man in 1871 set pulpits ablaze. The Genesis man, created fully human ab initio, could not be reconciled with simian forebears. The Scopes “Monkey Trial,”2 an unfortunate comedy of judicial favoritism, lawyerly grandstanding, and considerable behind-the-scenes maneuvering by special interest groups, ended in conviction, followed by reversal due to judicial error in setting the amount of the fine.
     Thus the stage was set for the ongoing effort to keep 'creationism' in school classrooms. By the time I was a youngster in once-a-week school, creationism had been replaced in evangelical circles by a confused mix of “creation science,” in which scientific evidence was cited to support a sort of mongrelized creation idea, and a resigned effort to reconcile belief with evolution.
     My first Bible, given to me by “My Mother” as written in her unmistakable school-trained penmanship, is an Authorized Version (KJV), printed in Scotland in 1939. Although its covers are much bent by my nervous habit of trying to press the edges over the pages while sitting through several dozen church services, it has survived in good shape, needing only a touch of glue a few years ago to repair the binding. It is now accompanied on my bookshelves by one or more copies of: The New English Bible, the Jerusalem Bible (I bought it in the college bookstore when it first came out in 1966; it was the first non-KJV I owned), the New Jerusalem Bible, the New Revised Standard Version, the HarperCollins Study Bible (NRSV), the New International Version (2011), the Zondervan Study Bible (NIV-2011), the New American Standard Bible, the Holman Christian Standard Bible (now simply referred to as the CSB), the English Standard Version, the Common English Bible,3 the Literary Study Bible (ESV), and perhaps one or two I've forgotten about (I don't feel like digging through the stack right now). That is not to say I have all the current English language versions available—I can think of at least twenty more I don't have.4 And, of course, there are the Greek New Testaments (two), the interlinear NIV (Koiné Greek, Hebrew, Aramaic), and the concordances and guides, starting with the venerable Strong's and going on. Some are much-thumbed and underlined (my poor little Zondervan Study NIV's pseudo-leather covers are shedding like a mangy old dog), while others seem relatively untouched by the passage of time—perhaps a metaphor for the survival of the Word over the past two millennia.
     Why so many?
     Money. The Bible sells, and at prices running from as little as $4-5 for a humble 'pew' Bible to $80-700 for a genuine leather, thumb-indexed heirloom Bible, there's dollars in the Bible business. Consider, for example, the Southern Baptists, the nation's largest protestant denomination. When the New York Bible Society (now Biblica) wouldn't give them a price break on its New International Version (the best-selling Bible in the U.S.), they promptly found translation issues and determined to develop their own.5 6
     Translation. For the most part, translations don't differ a lot. Advances in scholarship since 1970 or so have produced experts who are surprisingly unanimous in their opinions. Nevertheless, a decision about a single word can betray a theological stance by the editors, and when support for doctrinal stands can turn on one or two words here and there, that decision can become an issue that reverberates across denominational books of discipline, or statements of faith and practice, everywhere.7
     More Money. The Bible is, every year, the best-selling book in the country. Publishers study the market, devise new ways to package Bibles, enlist religious celebrities to front their sales pitches, and pursue new markets assiduously—ebooks, audio books, colleges—stay tuned! Thomas Nelson, whose business is basically Bibles, recently sold for $473M.8 Biblica, the nonprofit that owns the NIV, took in just over $14M in FY2016.9 Zondervan, which is owned by industry giant HarperCollins and actually publishes the NIV, has put more than 450M copies of it in print.10
      The money issue alone brings Michel Foucault's postmodern identification of discourse (text) with appropriation and ownership to the fore.11 Texts, or discourses, are acts, performed by an individual or individuals, and as such subject to social sanctions, positive or negative. We've already seen how language only gains meaning through its social use, so Foucault's analysis seems fitting. Buying a book, or in the case of the SBC not buying a book, is an act with consequences, and those consequences cannot ultimately be separated entirely from the scene of the discourse.
     Foucault's “archaeological” method goes on to examine the history of authorship. In the Middle Ages, the age of the 'auctorité'12 (authority), texts carried significance because of the author associated with them—to cite Pliny or Hippocrates was to carry the day. As modernity began, however, readers began to look for the truth or meaning of the text without reference to the author. Authors could still win renown if readers accepted their arguments, but the validity of the text came first.
        What this meant for the Bible was that the early church sought to choose its sacred texts, especially in the crucial New Testament, on the basis of authorship. The nod went to those closest to Christ or his immediate followers in time or place. Actual disciples of Christ and prominent apostolic leaders thus took precedence over the many later writings, including the ever-popular 'gospel' genre. Those who wrote later—the church fathers—were not included in the sacred text, but were listened to on the basis of their social authority. Not a bishop? Don't bother.
     With the moderns, and that certainly is the case with Luther, Zwingli, Calvin, et al., readers sought truth in close, individual readings, without reference to what other authorities might have to say about the text. Such attention seemed to produce fresh, new interpretations, which shook the established church to its core. And therein lay a problem: inevitably, the new interpretations became not a single 'revelation,' but a conversation, a series of discourse acts set into a long series of such acts beginning, in a sense, with the Levite prescriptions of doctrine and continuing into Martin Marty and James McClendon, Jr. St. Augustine predicted the logical outcome: doctrine decided by a committee of expert interpreters guided by the Holy Spirit.13 Having joined many academic experts in committees, I cannot in good conscience recommend the insights developed in such activities—the result is usually a compromise that speaks more to the politics of the committee members and the institution they serve than to any purported 'truth.' So, meaning continues to be at once a matter of individual concepts (signs) and social environment.
     Another way to describe this activity is 'rhetorical.' If my own reading produces certain concepts about a text's meaning, and especially if those concepts affect my own behavior, intellectually or physically, then any social effect it has is a matter of persuasion. Typically, I vocalize my concept and attempt to win its valorization by others. Of course, to the extent that my own reading is affected by others' input, my meaning is rhetorical as well. Rhetorical discourse, by its very nature, is agonistic. Those conversations about doctrines and revelations are not simply text tossed out willy-nilly for the edification of the reading public. They are near-sacral arguments designed to win believers and affect actions. They exist in a milieu filled with constant battles of religious belief.
     Postmodern theorists recognize the agonistic nature of discourse. Sociolinguistic work from the 70s on has focused extensively on such issues as power, control, dominance and strategy. As good, objective researchers, sociolinguists describe the various interpersonal negotiations over power and dominance rather persuasively. I don't wish to recapitulate them here, though you might want to consider, for example, the means you use to capture a place for your own contributions to an ongoing conversation, or the strategy you might use to persuade your boss to grant some time off, or your spouse to take out the garbage (or not). But is agonistic discourse necessarily a bad thing?
     One might say that agonistic discourse simply is what it is. Whatcha gonna do? That still leaves us with the problem of deciding where we fit when the argument starts. Do we pick a side? Do we step aside and meditate while the world turns around us? Modern theologians have generally recognized discourse concerning religious matters as agonistic. At first, they defined the main conflict as that between the new (post-Reformation) and the old (pre-Reformation) readings of the Bible. As the protestants gained numbers and power and at the same time science seemed to dispose of several key readings of the word, the scene became one of conflict between belief and non-belief. Theologians became defenders of relevance, their work more intellectual than practical. With the Great Awakening, they faded into the background and the key religious discourse became the sales pitch.
     When I was a younker, I was allowed to go unescorted into the Northgate Theater (the Northgate Mall was one of the nation's first shopping malls—Seattle, rain, a no-brainer) to watch Disney's new movie Pollyanna. Little did I know that Pollyanna, a nice, enjoyable movie about a young girl whose positive outlook changes a small town, would leap to mind when I began to read up on the first and second Great Awakenings and the birth of the church denomination of my later boyhood. There played out, in color on the big screen (no small-box mall screen for Northgate, thank goodness), the story of Great Awakening-style evangelism: repent or burn!
     More recently, I encountered Roland Barthes' review of a 1955 Billy Graham rally in Paris.14 It is one of 53 short pieces Barthes wrote for Parisian publications in the mid-50s, collected along with a lengthy essay on the idea of contemporary myth. Barthes, a literary theorist and critic, was a prototypical post-war French rationalist who had little use for religion, hence his review in a book titled for its identification of everyday myths. Given the history of the French Reformation (decades of vicious civil wars), the deliberate founding of the French Republic on secular grounds, and France's humiliation in WW2, one can appreciate Barthes' cynicism, but he nevertheless accurately, if uncomfortably for American evangelicals, described the rally:
     Dr. Graham brings us a method of magical transformation: he substitutes suggestion for persuasion: the pressure of the delivery, the systematic eviction of any rational content from the proposition, the grandiloquent designation of the Bible held at arm's length like the universal can opener of a quack peddler, and above all the absence of warmth, the manifest contempt for others, all those operations belong to the classic material of the music hall hypnotist.15
     My college was associated with an evangelistic denomination, and we experienced daily chapel services. At one of these, a professor from the Dept. of Religion introduced a student who, he declared, was a marvelously talented young preacher, and would now deliver himself of a 15-minute sermon. For the first few minutes, I thought that this was a comic skit designed to relieve us of our pre-midterm nerves. But, sadly, the bombastic nonsense was not. There is a thin, thin line, I discovered that morning, between the satirical and the serious. I returned to my literary studies with a new appreciation for satirists like Swift, Pope, and Byron.16
     Billy Graham (I've been to a rally and watched several on TV) wisely recognized that while theological arguments for God abound, in the face of modern skepticism neither the Great Awakening nor the theological approaches would suffice. Instead, he faced the central issue: that belief is a spiritual matter, not a rational one. What a concept: religion is spiritual! Who'd've thunk it?
     I would suggest that reading the Bible is more a spiritual act than a rational one. Let me repeat that. Reading the Bible is a spiritual act. It is an action, because it requires both a physical commitment and an assent to wrestle with the concepts of the signs involved. It is spiritual, because committing to a rational imagining of its content leaves us with the shortcomings of meaning that led modernist readers to question its validity, and fundamentalists to cling desperately and angrily to uninformed myths about its linguistic aspects.
     Allowing the Bible to pass through our rational 'screen' and come to rest in our spiritual understanding is a transformative experience, and thus better understood in a postmodern, postchristian way. And that is something this blog will, if the creek don't rise, continue to stumble towards...

Notes
1This is particularly true for most Wesleyan, Anglican and Episcopal churches. Some conservative Wesleyan offshoots, however, embrace sola scriptura.
2In July 1925, substitute teacher John Scopes was charged by the state of Tennessee with teaching evolution in the town of Dayton in violation of a state law prohibiting such action. Two famous lawyers, William Jennings Bryan (prosecution) and Clarence Darrow (defense), eventually faced off in a trial followed around the country.
3Not “Contemporary,” as BibleGateway.com would have it.
4The others: Douay-Rheims, Amplified, Phillips, Darby, Disciples Literal, Geneva, Jubilee 2000, Young's, Wycliffe, New Living, World English, Orthodox Jewish, New Century, Tree of Life, Mounce, The Message, New American, Names of God, Lexham, Good News...I've probably left out yours, but you get the idea.
5http://www.av1611.org/vance/hcsb.html; http://www.christianpost.com/news/southern-baptists-pass-resolution-rejecting-2011-niv-at-annual-convention-51288/
6In the interest of full disclosure, I must point out that the SBC does not actually order Baptists to use a particular translation; Baptists are traditionally and theologically independent. Still, to say that LifeWay and Holman are not Baptist organizations is kind of like saying that the Pope isn't Catholic.
7See http://www.Bible-researcher.com/niv.html for a good example of such stuff.
8Radosh, Daniel, “The Good Book Business.” The New Yorker (December 18, 2006). http://www.newyorker.com/magazine/2006/12/18/the-good-book-business
9FY2016 Annual Report, https://downloads.biblica.com/docs/annual-reports/biblica_2016_annual_report.pdf
10http://christiantoday.com.au/news/niv-remains-the-bestselling-Bible-translation.html
11“What Is an Author?” Tr. Josué V. Harari, in Textual Strategies: Perspectives in Post-Structuralist Criticism, ed. Josué V. Harari (Ithaca, NY: Cornell UP, 1979); included in The Foucault Reader, ed. Paul Rabinow, pp. 101-120 (NY: Pantheon, 1984).
12Because of the Latin root and similar pronunciation of both words, Middle English often confused “author” with “actor,” thus innocently recognizing the 'act' of discourse.
13On Christian Doctrine, Book 4. Ed. Philip Schaff, tr. Marcus Dods and J.F. Shaw (Amazon, Kindle Books).
14“Billy Graham at the Vel' D'Hiv'” in Mythologies, tr. Richard Howard and Annette Lavers (NY, Hill and Wang, 1957), pp.109-112.
15Ibid., p. 111.

16Not to mention the biblical books of Jonah and Amos, and some marvelous passages in Isaiah (e.g., 5:16).

Language and Christianity, Part 1

In which something is said about meaning...
     (This, my second posting, started to get longer and longer as I worked my way along, so I broke it up into two parts.)
     What is “postchristian,” anyway? Is it possible to be a postchristian? Is there such a thing as postchristianity, or postchristianism? Is this simply a clever new way to say “liberal christian”? What should evangelicals, or main-streamers, or fundamentalists think about it? What about catholics, or anglo-catholics, or the orthodox, or copts? Or atheists or agnostics? Does it mean replacing traditional Christianity, of whatever stripe, with some new and possibly heretical notion? Is it biblical? Experiential? Traditional?
     In many ways, that's what this blog is all about. And many posts down the line, it may arrive at an answer, or two, or three. I hope. Since it's my blog, I'll approach the issues my way—slowly and carefully, with, I hope, respect for my readers as well as my ideation processes. The questions above are not straw men set up for knocking down like pop-up targets on a military shooting range. I'm not sure I have answers for all of them, but I'd like to find out.
     In a sense, I'm reverting to Montaigne, who began each of his open-ended Essais outside Bordeaux and ended up in Villefranche-sur-Mer or Ouistreham. Come to think of it, I wouldn't mind ending up putt-putting down the Quai de la cordiere on a Vespa Primavera, taking in the sailboats alongside. Or better yet, on the Carrer Moll de Llevant in Mahon (I've always wanted to see Admiral Nelson's villa on Menorca). But alas, only in fiction can I pick up a fountain pen—a Wing Sung 698 piston fill, if you're interested—and go to the Mediterranean or the Channel.
Speaking of fiction, it was Juliet (Shakespeare's pen at work) who asked
     What's in a name? that which we call a rose
     By any other name would smell as sweet;1
     Thus this troubled teenager opened up a can of worms that has bedeviled philosophers and scientists since at least Plato, and which forms (yet again) one of the central issues of something much like the “postchristian”: the “postmodern.” Would a rose smell sweet if we named it an artichoke? In Plato's day, one group would have answered, “we would never call it an artichoke, because it's a rose, and if we did, we would be losing the ability of language to carry true meaning,” and the other would have answered, “oh, pshaw, who cares—let her marry the guy,” or something to that effect. There's a whole discipline called “semiotics” that deals with that today, and which may or may not be a postmodern discipline, depending on which scholar you nudge out of her afternoon nap. Haaah, I'm getting ahead of myself. I wanted to go slowly. And carefully. So, back to the beginning. (By the way, Romeo had his own answer in the next speech, and I rather like it with respect to Christianity. You can either look it up or stick with me to the end of this post, where I'll quote it. I promise.)

In the beginning, was...?
     School. Two schools, actually; school school and Sunday school. One was five days a week, 9 months a year, sort of like work for kids—it got us out of our parents' hair while they went to their work. The other was once a week, all year long. In one, I was bored to death. I still remember sitting dutifully while other first graders struggled to read a couple of lines of “Run, Lois, Run” out loud. How in the heck could they not figure out how to read and say words like “Lois?” If they were Japanese, I could see it. That “l” isn't easy for them, I later learned. My Japanese secretary had to resort to calling me by my first name (“Mel-san”--she could get individual 'l's), since there was no way, despite her good English skills, she could pronounce my perfectly good Scots surname. But native English speakers, in a white, middle class, 50s neighborhood? So, for twelve years, I sat numbly in school school and thought about other stuff.
     The other school was easier to get through, because it was shorter. The hardest part was memorizing those Psalms in Jacobean poesy, but it was good practice when in college I had to memorize the General Prologue of The Canterbury Tales. “Whan that Aprill with his shoures soote / The droghte of March hath perced to the roote...” Sixty-four lines, still got 'em.
     School didn't get interesting until I showed up on the college campus. Actually, I'd grown up on college campuses—and lived in dorms—but this was the first time I got to go into an actual college classroom with actual professors and—remember, this was 1965--a damned maroon beanie with “69” embroidered on it. Nevertheless, the beanie was worth it, because for the first time most of the teachers, as opposed to a minority of them, actually knew what they were talking about. Many of them knew more than I did. (That's not a joke.)
     On the Sunday front, school also got interesting, as we finally lost those canned Bible lesson plans and began to focus on the relationship between our beliefs and our own culture and society. We even got to talk about S-E-X and stuff! Meanwhile, back across the street (literally) at college, I encountered something I would later learn was “structuralism.” Structuralism was everywhere. It was in Freshman English, Anthropology2, Sociology, Psychology, Art History, and Music Composition Theory, though not in Intermediate Tennis.
     Structuralism was something that had made its way into language study early in the 20th century, thanks to Swiss linguist Ferdinand de Saussure, who proposed that language consisted of two distinct parts: actual speech (parole in French, in which Saussure lectured), and a structure--language--that allowed people to create and understand speech (langue in French). One uses langue to create parole. Thus understanding language was the result of examining and dissecting actual speech to discover the overall structure of language. Saussure was soon overtaken by linguists who disposed of some of his ideas, but he had pointed the way.
     His way was soon adopted into the social sciences, philosophy and almost anywhere it could be made to fit. Basically, if one could discover the interrelationships between things, one could construct an overall structure that would explain them.
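     (A side note for readers who think in code: here is a toy sketch, entirely my own invention rather than anything Saussure wrote, of that structuralist program in miniature. It takes a handful of made-up utterances as parole, tallies which word follows which, and treats the resulting table as a stand-in for langue, a structure dissected out of actual speech and then turned around to judge new speech.)

```python
# A toy sketch of the structuralist move, not Saussure's own method:
# dissect recorded utterances (parole) into observed word-to-word
# transitions, then use that table as a stand-in for langue.
from collections import defaultdict

parole = [
    "the flag waves on the flagpole",
    "the flag waves in the wind",
    "the wind bends the flagpole",
]

# Record every observed transition from one word to the next.
langue = defaultdict(set)
for utterance in parole:
    words = utterance.split()
    for current, following in zip(words, words[1:]):
        langue[current].add(following)

def is_well_formed(utterance: str) -> bool:
    """Accept an utterance only if every transition in it was observed in parole."""
    words = utterance.split()
    return all(b in langue[a] for a, b in zip(words, words[1:]))

print(is_well_formed("the wind bends the flag"))      # True: built from observed transitions
print(is_well_formed("flagpole the bends wind the"))  # False: transitions never observed
```

     The point is not the code but the move: structure is read off of instances of actual speech, then used to produce and evaluate new instances.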

The story of the sign, and a tale or two...
     Central to Saussurean linguistics was the notion of the sign. Imagine, if you would, glancing out my window at the Harley dealer's new, gigantic flagpole. If we wish to discuss this flagpole, we must first be able to direct our mutual attention to it. To do this, one of us says “flagpole.” Thus our attention is directed not to a telephone pole, a motorcycle, a middle school student, or any other item within our view, but to the flagpole. (In linguistics, this is called a “pragmatic” function of language.) Now, the word “flagpole” is clearly not itself a pole, it simply substitutes for it in conversation. Thus it can be called a signifier, or something that signifies the existence of a particular object. The object itself can be called the signified. This is rather obvious to anyone who has language, whether it's a flagpole, a mât de drapeau, or a pòla brataich, as my fellow clan members' ghosts would say of a late, dark evening at the An t-Aodann Bàn Lodge3. What Saussure proposed was that there was a third thing involved besides the signifier and the signified. That thing he called the sign.
     The sign is a mental concept, rather than a particular physical object, such as a word, realized in sound waves or bodily movement or technological means such as writing, or a flagpole, realized by, well, just sitting there in front of us. These three things—the sign, the signifier, and the signified—form a triangle of meaning. Each is necessary. The signified is linguistically irrelevant without a signifier to isolate it from all other possibilities, and because the signified is different from ourselves, we must form an idea of it in order to talk about it. Thus language, by calling upon our ability to conceive ideas, now allows us to mitigate—to imagine and manipulate—our world.
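     (Again for the code-minded: a minimal sketch, mine rather than Saussure's notation, of the triangle as I have just described it. The class, field names, and example values are invented for illustration.)

```python
# A minimal sketch of the triangle of meaning as described above; the names
# and example values are my own illustration, not Saussure's notation.
from dataclasses import dataclass

@dataclass(frozen=True)
class Sign:
    signifier: str   # the word, realized in sound, writing, or gesture
    signified: str   # the particular thing the word picks out for us
    concept: str     # the idea of that thing, held in the mind

flagpole = Sign(
    signifier="flagpole",
    signified="the Harley dealer's new, gigantic pole",
    concept="a tall mast for flying a flag",
)

# Same signified, different signifier: the triangle still holds in French,
# which is part of why translation is possible at all.
mat = Sign(
    signifier="mât de drapeau",
    signified="the Harley dealer's new, gigantic pole",
    concept="a tall mast for flying a flag",
)

print(flagpole.signifier != mat.signifier and flagpole.signified == mat.signified)  # True
```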
     Saussure recognized one problem with the notion of the sign, that as a mental concept it was susceptible to change. Signifiers and signifieds can be affected by various physical alterations—erasures, destruction—but the concept of them remains unaffected. To change a concept, one must change one's mind, so to speak. In Saussure's notion of language, in which existing speech (parole) is the only evidence allowed, change in the sign is indicated by changes that show themselves over time. Thus Saussure determined that language could be studied either at a single point in time (synchronic) or over a length of time sufficient to produce change (diachronic).
    Wait a minute! Do you see what I see? (No fair quoting Noam Chomsky—he comes into this later.) In keeping with the best tradition of Enlightenment and modern thought, Saussure imagined language as an objective and discrete 'thing,' which can be isolated as a topic of science. There is, in the example above, not only a unique and separate signified and signifier—the flagpole and the word “flagpole”—but a similarly unique sign. This must presuppose a single, objective mind to conceive the thought. Of course, Saussure didn't entirely miss the problem. He posited the sign as created by a sort of social consensus. Everyone speaking a language must agree on the concept that is the sign. Except that language is objective only in the most high-flown of theoretical imaginings. Yes, language functions in part because most people agree to accept and use meanings that are equivalent. But signs share equivalencies, not equalities. What language's lexicon uses, to steal from the slogan of the French Revolution, is fraternité, not égalité. Nobody experiences exactly the same sign because everyone's experiences and learning are at least slightly different. Conceiving an identical sign c'est impossible.
     So, what exactly is the role of the social, or the subjective? Even the idea of a single, objective mind as a model for the structure of language obviously cannot exist without a social parameter. We learn, or acquire, language from our social surroundings—caregivers, family, friends, eventually school—and there is no dictionary at hand until the latter, and even there it's introduced rather late in the proceedings. Chomsky, working from observations of language acquisition and structural universals noted during the heyday of structuralist and ethnological research, argues that the human mind is biologically built for language acquisition.4 What differentiates one language from another is social input—semantic, syntactic and pragmatic options acting upon and within the potentials built into the brain. Although he has been challenged on the biological issue, in spite of pretty good evidence for it, his critics in no way challenge the necessity of the social environment for the acquisition of language.
     As for the subjective, let me tell a horse story. Horses are social creatures, raised within a band of other youngsters, their dam, other mares, a few young bachelor stallions and a breeding stallion. On a horse ranch, they also encounter a few humans, like me, who assist (if necessary) at conception and birth, medicate, feed and water them. The fillies and colts receive a good deal of training, first from their dam, then the lead mare and her 'assistants', who teach them the behaviors necessary to be good members of the horse world, and humans, who teach them the behaviors they need in order to remain safe and socially acceptable in the human world of stables, barns, pastures, gates, fences, lead ropes, farriers, tractors and pitchforks.
Horse number one in this story is a colt by an aged Spanish stud originally from the Canary Islands, who has a fine Western conformation but, unlike the usual Spanish Arabian, has a timid streak, which he passes on to some of his male offspring. This causes them, just like daddy, to be unpredictable and occasionally aggressive around people. Not the best personality for a horse bred to ride, work and show. We also had a very old, arthritic, Egyptian stallion who'd long since passed out of his breeding days. Normally one does not dare to put stallions together in the same paddock—their immediate instinct is to fight each other. But I experimented. I took the timid colt and his stablemate, another yearling colt, and turned them out every day in a paddock with the old stallion. A year later, the timid colt, following the example of his old tutor and his young buddy, had developed completely out of his timidity and unpredictability, and responded well to further training and use around people and the farm.
     Horse number two is a young filly who, after weaning, was put in a stall in the mare barn and turned out to pasture with other mares. She bruised her foot and developed a painful abscess inside her hoof wall, which required three or four weeks of treatment, including daily wrapping of her hoof as well as keeping her out of wet, muddy places, which in turn required her to be turned out in the smaller, dry upper pasture. When I pronounced her cured, I planned to turn her back out in her old, large pasture, which required her, each morning, to prance through a few feet of the sticky mud that often develops near busy pasture gates in rainy climates. On the first morning of her new turnout, the filly's limp reappeared and she refused to go through the gate. She walked along on three legs, wincing painfully every time her previously bad hoof neared the ground. Her owner gasped and wondered aloud what had happened to make the abscess recur. But this was not my first rodeo, as they say. I opened the gate to the upper, dry pasture, and released the lead rope. She happily ran through that gate, all trace of a limp miraculously gone.
     Neither the colt nor the filly learned to “talk” in the human sense, but horses learn to communicate from the first time their dam teaches them to stick close to her, beginning a few moments after birth. Horses communicate primarily through equine body language, which has a significant vocabulary and at times a rough syntax. As they live with people on the farm, they also learn to “read” human body language. For example, a commonplace among horse people is the horse who can tell when a human approaches their territory with the intention of catching them up, even if the rope or lead line is carefully hidden. Many people who live with horses or dogs learn to rely on their animals' “reading” of strangers, which is often superior to their own.
     Animal trainers such as Cesar Millan (dogs), and Monty Roberts5 and Dan M. “Buck” Brannaman (horses) have achieved fame by learning canine and equine “language” and using it to quickly and easily train them to function in the human environment. In my nearly ten years working with horses of all ages and genders, I found that a combination of Roberts' techniques and close observation of horses using my professional linguistic training (as Roberts advises) gave me what sometimes seemed to onlookers (and me, for that matter) a “magic touch” with them. (If only I had devised such a methodology with their owners!)
     At the very least, we learn that language is acquired and developed through social interaction, including observation and teaching, and through the learner's ability, like the young filly's, to apply acquired language to new situations. The case of the filly is an especially interesting one, in that it indicates that learners are not simply receptacles of an objective knowledge, but can invent new “words” or signs, and apply them in new ways to manipulate (mitigate) situations to their advantage. In one academic study I observed, children of early school age learned narrative skills by attempting to create new, fictional narratives of their school experiences designed to avoid criticism or negative feedback from their parents and family. At first, they were frustrated by parental skepticism or countervailing versions from older siblings. As they grew older, they learned how to avoid parental skepticism and deal with the siblings (up to a point, as anyone who has siblings can attest).
     We see, then, that language is anything but a purely objective, isolated thing. It is thoroughly social and subjective. Meaning is derived from the ability to apply and interpret signs to and within their context. So-called “dictionary” meanings that imagine signs as discrete things with precise and limited “meanings” are belied by the fact that the dictionaries themselves are constantly adding additional “meanings” to signs, and subtracting older “meanings” as outmoded. Consider the famous quote from Robert E. Lee: “Duty then is the sublimest word in the English language. You should do your duty in all things. You can never do more, you should never wish to do less."6 Merriam-Webster lists the meaning of 'sublime' as “to elevate or exalt especially in dignity or honor.”7 This seems to help us make sense of Lee's quote. But is this what Lee really “meant”? Lee's education in the first quarter of the nineteenth century was largely a product of the eighteenth, in which the concept of the sublime was a distinct philosophical notion made famous by philosopher Edmund Burke. Burke and others—especially poets and artists—often commented on the sense of awe, horror and dread created by certain sights, such as their experiences in the Alps. What Lee meant by “sublimest” in all probability was conditioned by his educational experiences with the 18th century concept of the sublime. When one considers many of his other well-known quotes about war and life, and his life experience (he was known as an excellent student as well as a military officer with considerable combat experience), it is entirely probable that Lee was thinking of the 18th century concept rather than Merriam-Webster's in the 21st.
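     (One last sketch for the code-minded, entirely my own toy and not Merriam-Webster's data model: think of a sign's “dictionary meaning” as an inventory of senses that shifts over time, so that reading Lee well means choosing the sense that fits his era rather than ours. The glosses below are rough paraphrases.)

```python
# A toy illustration: a sign maps to an inventory of era-tagged senses,
# and interpreting a quotation means choosing the sense that fits the
# writer's context, not the reader's. The glosses are rough paraphrases.
SENSES = {
    "sublime": {
        "18th-19th c.": "producing awe, even dread, before vast or terrible things",
        "21st c.": "elevated or exalted, especially in dignity or honor",
    }
}

def gloss(word: str, era: str) -> str:
    """Return the sense of a word appropriate to a given era of usage."""
    return SENSES[word][era]

lee = "Duty then is the sublimest word in the English language."
print(lee)
print("Read against Lee's schooling:  ", gloss("sublime", "18th-19th c."))
print("Read against today's dictionary:", gloss("sublime", "21st c."))
```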
     Thus endeth part 1 of this lengthy meditation on meaning. I said earlier that I'd give you Romeo's answer, so here it is:
     I take thee at thy word:
     Call me but love, and I'll be new baptized;
     That seems to be the gospel8 in a nutshell.

Notes
1Romeo and Juliet, II:ii
2I am being generic. My own college favored cultural anthropology à la Boas-like ethnology, rather than Lévi-Strauss style structuralism, quite possibly due to its historic missions/pastoral focus, and/or the close association between Marxist thought and some structuralists (this was the cold war, after all).
3For non-Gaelic speakers seeking fine dining, great scenery, single malt Talisker, and real ghosts, it's the Edinbane Lodge in Edinbane, Isle of Skye. Go in summer...unless you have a lot of heavy Scots woolen clothing.
4Chomsky, Noam. Aspects of the Theory of Syntax (Cambridge, MA: MIT Press, 1965).
5The Man Who Listens to Horses (London, Arrow. 1997)
6http://www.historynet.com/robert-e-lee-quotes
7Merriam-Webster Online, s.v. “sublime.”
8'gōd spel' ('good news' or 'good story'), from Old English. 'Spel' became the modern word 'spell', as in a magical spell. To the Anglo-Saxons, words were considered to have power over the physical and spiritual world.