Category Archives: History

You Guys and Y’all

 

English has many features that make it a complicated language to learn, from its irregular spelling to its multitude of variations in grammar, dialect, and pronunciation. One especially confusing area is its second person pronoun, you, which officially has only one form. You is both singular and plural, formal and informal, though the scale of formality is beginning to change with the introduction of ya as a second person singular informal pronoun. For more on that, watch my last video, which discusses you and ya.

Now the lack of differentiation between singular and plural you can be awfully confusing for most English speakers. The language requires the use of names or titles to make it clear who the speaker is referring to. I might say, “Could you tell me how to get to Gordon Square?” and if I’m with a number of friends or relatives it won’t necessarily be clear a) who I’m speaking to and b) whether I’m speaking to one person or to multiple people. For someone like me, who has often found using personal names to be a bit too direct, and thus borderline rude, this ambiguity in pronouns makes English all the more difficult.

Different English dialects have developed their own resolutions to this problem. In some parts of Britain, the plural you becomes you lot, while the older second person plural pronoun ye is still occasionally used in some places. Here in the United States, we are largely split into three camps: those who use you, those who use you guys, and those who use y’all. Now I’m from the Midwest, and I’ve always said you guys; it’s just how the pronoun has developed there. In this instance, guy is a lot like the word men, in that it has both a masculine and a neuter meaning.

You see, at one point in English the word men was actually two words: the plural of man and the name of the human species as a whole. Today that second sense of men and its variant mankind has mostly been overtaken by the more obviously gender-neutral human and humanity. This change reflects the rise of gender equality in English-speaking countries like the U.K., the U.S., Canada, Australia, Ireland, and New Zealand. Granted, there’s still a very long way to go to reach true, proper gender equality, but we’re on that road.

The word guy has a similar dual meaning. It’s both an informal way of saying man and, at least in the U.S., has developed a gender-neutral meaning as an informal way of saying people. Now the word guy comes from the name of Guy Fawkes, the English Catholic who tried to blow up Parliament in the Gunpowder Plot, an event that’s commemorated in the U.K. every 5th November. So, unlike most English pronouns, which are of Germanic origin, guy comes from a personal name, itself a variation on the Italian Guido. To me, you guys is a gender-neutral phrase.

Granted, that’s not how everyone sees it. In an April 2017 edition of the podcast Midwesternish, produced by KCUR, the NPR affiliate in Kansas City, the hosts argued that you guys is gendered and too masculine to be used as a generic second person plural pronoun. Instead, they proposed the Southern American y’all as an alternative. Now, grammatically I can see how this makes sense, but I wholeheartedly disagree. As much as y’all may not be confusable with any gendered language, to me it is nevertheless tainted by history. Y’all reminds me of the racism, discrimination, and fear that have permeated the American South since the colonial era. It’s a Southern phenomenon that I don’t want to adopt. And while to me you guys doesn’t sound inherently masculine, considering English’s complicated way of making gendered nouns be spelled and pronounced the same as non-gendered ones, I can understand how to some it might appear misogynistic to call a group of people that includes women guys.

So, what’s an alternative? Unless we decide to go back to the old plural ye, which has been thoroughly ridiculed through centuries of hear ye’s, it’s probably best for the moment to just use you and stick with that. Then again, there’s always the Norman French-inspired plural “s” in English, which the language didn’t originally have, so maybe we should just say youse…


“You and Ya”: the Return of Informal vs. Formal Pronouns in English

English is an awfully tricky language. Over the millennium and a half of its existence it has absorbed so many influences and developed so many dialects and variations, shaped not just by geography, society, economics, or religion, but by a big mix of all four and many more. My English can be officially designated as Inland Northern American English, which roughly corresponds to the Great Lakes region, going west to the Dakotas and somewhat south to cities like St. Louis and Kansas City, where it runs headlong into Midland American English and Southern American English.

Yet my English has also been heavily influenced by a sizeable list of other European languages, brought to my home region, the Midwest, by generations of immigrants, including my own ancestors. There is the slightly more Minnesota/Wisconsin/Ontario “o” sound that I use in “go” and “sorry,” which I’ve noticed sounds awfully similar to the Swedish “å”. There is also the added “y” sound in words starting with a hard consonant followed by an “a” or an “e”, such as “cart”, which I heard in Cardiff, Wales, though I’ve read that it may also come from the Dutch spoken in New Netherland (modern New York) in the seventeenth century. Then there are the more recent influences, from my loving adoption of the odd Yiddish curse word, especially “schmuck”, to how I’ve been told that I inadvertently use the word “utilise” more ever since I started learning French (“utiliser” is French for “use”).

One especially confusing thing about English is the second person pronoun, of which there’s only one: you. It’s not like French, which has “tu” for the singular and informal and “vous” for the plural and formal, or Irish, which has “tú” and “sibh” for the singular and plural respectively. We used to have more variety in second person pronouns in English. I study the early sixteenth century, and spend my working days reading manuscripts and books written in Early Modern English. Back then, English had “thou” (sometimes spelled “thow”) for the singular informal, “you” (sometimes spelled “yow”) for the singular formal, and “ye” for the plural. At that time English was written phonetically, so “thou” and “you” were both pronounced with the same ending vowel as we use in “you” today. Yet by the American Revolution, “thou” had largely disappeared from common use in English, surviving most recognisably in the Church of England and through older poets and authors like Shakespeare and Chaucer.

So, today, we have “you”, and only “you”. It’s singular and plural, formal and informal. In academic writing, especially as historians, we’re strongly advised to avoid using “you” as well as the first person pronouns “I” and “we”. Academic writing is about the most formal form of English today. By using only the third person “he,” “she,” and “it”, it remains impersonal and distant from the reader. This is one element of our attempt as historians to be objective.

Despite all the certainty that there is only one “you” in English, I’ve noticed this starting to change. “You” is gaining an informal variant again, only this time it is not the now antiquated “thou”. Rather, the frequently used “ya” seems to be gaining traction as an informal version of “you”, one that we can use with our families and friends, but one that I wouldn’t use with a boss, professor, or other official. I’ll often hear people greet their friends with “Hey, how are ya doing?” or even more informally “Hey, how’ya doin?” or when asking for input on something, “Hey, what do ya think about this?”

So, is this a permanent change in English? Will future generations learn in their English classes that they must use “you” with their teachers and “ya” with their friends? Or will “ya” fade away as “thou” and “ye” have before it? While this is primarily an American English phenomenon, with the advent of the Internet and with how much easier international travel is becoming, “ya” could spread across the English-speaking world, an American export like “jazz,” “baseball,” and “barbecue.”

But what do you think? Do you use “you” in all cases? Is there a different type of “you” that you use in your own dialect of English? And if you are learning English, or speak another language as your first language, how do English pronouns differ from those of your own language? And does your native language influence your English?

Nolan’s “Dunkirk” – An Abstract Tribute


Credit: Christopher Nolan [found at Cinemavine.com]

What I found especially gripping about Christopher Nolan’s latest film, a retelling of the Miracle at Dunkirk, was that none of the individual people in the story was the main character. That role was filled by the seemingly indomitable human spirit and its will to survive and struggle ever onwards. Dunkirk might well be one of the defining moments of the Twentieth Century for Britain, and quite possibly a crucial turning point for the whole world as well.

The film follows three main groups: the soldiers on the beaches, the sailors, both civilian and naval, crossing the Channel, and the RAF in the air, trying to keep the fighters and bombers of the Luftwaffe from wreaking further havoc on the men stuck at Dunkirk and the ships trying to ferry them to the safety of home, a mere 26 miles away. Though the plot is not in itself chronological, it nevertheless helps tie together each disparate group, connecting their experiences in a spiritual fashion as each comes ever closer to the film’s climax.

For British and Commonwealth viewers this film will certainly reinforce that Dunkirk Spirit, that steely determination that even in the darkest of hours Britain and her sister countries will never surrender. I became quite emotional when, after witnessing for a good hour the sense of doom the soldiers on the shore felt, the hodgepodge fleet of little ships arrived in the waters off Dunkirk. This moment, though one of the darkest hours in British history, is equally one of the most inspiring to have transpired in that island nation’s long story.

For American viewers this film should give us pause. In our present hour of immense internal divisions, of political unrest and civil discontent, we should consider what it would mean for us as one people to come together for a cause we all knew to be necessary for the continued survival of our country and the liberty its Constitution assures. In this hour of great uncertainty we should be looking not to what divides us but to what can unite us.

Hans Zimmer’s score is a welcome change from his usual set of loud brass, excessive strings, and choirs primarily singing “Ah” for far too many measures. While loud, this score adds to the energy of the film, and in a musical sense is largely understated. The music helps bring the viewer into the picture, onto the beach, aboard the small boats and naval ships, and into the cockpits of the Spitfires high above. I really appreciated the echoes of Elgar’s Nimrod that played over the final scene, as Britain and her forces came ashore to rest and prepare for the inevitable Battle of Britain to come.

Overall, I thoroughly enjoyed this film, and just a few minutes before sitting down to write this review I told my writing partner Noel that I would have to go back and see it again soon. Dunkirk is a film that triggers both the conscious and the subconscious, that calls upon one’s entire emotional and physical self. It is one of a number of films that are to me the new “talkies”: they address not only our visual and aural senses, but our emotional senses as well. I have a feeling there will be many more films like Dunkirk to come.


Designing Cities for People


St Paul’s at Sunset

In older standards of measurement, the imperial mile (1.609 km) was not the longest measurement of distance available; the league filled that role instead. As I understand it, one league is equal to the distance a person can walk in one hour. For me, that is around three miles, making one league a decent distance for a nice morning stroll. In 2016, when I was undertaking my first Master’s degree, in International Relations and Democratic Politics at the University of Westminster, I would occasionally decide to walk the league from the university on Regent Street back to my flat in the shadow of Fenchurch Street station, on the eastern edge of the old Roman city.

The walk was quite pleasant, a stroll first down from the university to Oxford Circus, then eastwards along Oxford Street to Holborn, and then down High Holborn across the Holborn Viaduct and past the Old Bailey at Newgate, past St Paul’s and onto Cheapside, crossing in front of the Royal Exchange and Bank of England before continuing down Lombard Street and onto Fenchurch Street. At Fenchurch Street station, I would descend a short flight of steps leading towards St Olave’s Hart Street and under the station viaduct itself past the city walls and into my building on Minories.

What made this a nice walk was that I was able to see so much of the capital, everything from the imperial Edwardian grandeur of Regent Street to the new skyscrapers being built across the Square Mile to the east. It was an opportunity to experience London as so many had done before, to get to know the metropolis by foot. In London this is something that can fairly easily be done: one can walk around the capital if one wants to. Sure, most of the suburbs are out of reach for the pedestrian, but with the well-established system of underground and suburban railways, as well as the very thorough bus network, London is a city that a person can easily live in without owning a car, let alone riding in one on a daily basis.

When I moved back to Kansas City at the end of August 2016, I thought I would try to keep up my walking, to walk the same ten miles each day. Yet that didn’t happen. Far from it: I found Kansas City to a) be built largely for cars, and b) have a climate far harsher than the one I had known in London. As a result, not only did I not walk nearly as much as I had wanted, I found myself hardly walking at all beyond going out of my parents’ house to get into the car and drive somewhere.

While my own lack of fortitude is certainly partly to blame for this sudden drop in my exercise, I also have to lay blame on the city planners here in Kansas City. This city, like so many others in the United States and Canada, was designed, or re-designed, for motorists. In fact, it is illegal for a human being to walk in the street in Kansas City, Missouri; if you’re human, you have to stay on the sidewalks (pavements). The rest of the street is reserved for cars, buses, bicycles, vans, and trucks. We have built this city, and so many others like it, without the human touch that has made cities so universally human in nature.

For thousands of years, our ancestors lived in cities not unlike Central London; they were just big enough that an able-bodied person could walk from one end to the other in about an hour. Cities were built with walking in mind, with the understanding that all of the basic necessities a city offers should be within walking distance of each citizen’s home. Smaller medieval cities like Besançon in France, Canterbury in England, or Galway in Ireland are prime examples of this sort of pedestrian-focused urban planning.

“In fact, it is illegal for a human being to walk in the street in Kansas City, Missouri”

Here in the United States, too, there are some attempts at returning to this older model of having residential and commercial establishments within the same general area. In Greater Kansas City there are some newer developments that aspire to this goal. Two in particular that I visited this last Friday stand out to me as examples of how to undertake this task, and how not to do so. The latest piece in the Town Center shopping complex, Park Place, is an excellent example of such a development.

Park Place is a set of winding, narrower streets lined by three- and four-story buildings; its street-level fronts are filled with shops, restaurants, and some offices, while the upper levels are largely residential. In this way, one can live in a compact community within which one does not necessarily need a car to get around. I first experienced Park Place two years ago while walking a 5K through the Town Center area. At that time Park Place was still under construction, yet even as a construction site it seemed vastly out of place compared to its neighbours in the most arch-suburban of American counties, Johnson County, Kansas. What particularly makes Park Place odd, and in the end stunted in its growth and feasibility, is that one has to have a car to access it. Sure, one could live within Park Place as a pedestrian, but going beyond its towering confines on foot can be a perilous exercise, with traffic on the surrounding avenues averaging around 45 mph (72 km/h).

What Park Place does well is its compactness, including both commercial and residential in the same area. Another, equally new development a few miles south of Park Place ignores this principle of traditional urban planning, setting the residential apart from the commercial. This particular development is the fascinatingly misplaced Prairiefire complex on 135th Street in Leawood, Kansas. Prairiefire is another physically enormous complex, and its crown jewel is the Museum at Prairiefire, billed as Kansas City’s natural history museum and an affiliate of the American Museum of Natural History in New York. While the museum’s architecture is aesthetically beautiful, its size, much like the rest of the Prairiefire development, reflects a lack of long-term thinking.

My biggest problem with Prairiefire is the way in which its residential development is divided from its commercial sector by a massive concrete parking garage. Prairiefire looks as though it was designed by a suburbanite intent on creating their image of a compact urban community, albeit without ever having set foot inside a traditional compact city. By splitting the residential from the commercial, the designers make it far less convenient for residents to take advantage of the shops, restaurants, and entertainment on the commercial side of the property. What’s more, the Museum at Prairiefire itself is deeply flawed in that it was not built with the ability to expand in mind. The current structure is small, built more like a community arts centre and less like a great temple dedicated to nature.

Our over-reliance on cars here in the United States is deeply flawed. Should there be a major energy crisis in the near future, the vast majority of our cities and states will find themselves paralysed, unable to function owing to the lack of oil to fuel our cars. Developments like Park Place and Prairiefire might last longer, owing to their relative compactness compared to more traditional suburban sprawl, yet their isolation amidst the sea of suburbia will soon leave them in the same situation as the traditional suburban developments around them.

Our cities must first and foremost be self-reliant; we must be able to grow our own food and use our own renewable energy sources to power all aspects of our lives. Alongside this, if we are going to build smart, self-sufficient cities, we must build them more compactly, with ourselves in mind. Just consider: if you are suddenly without your car, and don’t have the option of taking public transport, how will you get around? You could certainly walk around your city, but that prospect is only truly viable if said city is designed for walking.

Today, I generally prefer using metric to the more traditional imperial standards of measurement, yet that most old-fashioned of imperial measures, the league, is one that should be maintained. It keeps us humans at the centre, and reminds us of our own physical limitations and abilities. When we consistently push ourselves far beyond those abilities, we endanger the stability of our societies, making any potential crisis even more disastrous.

The Constancy of the Modern


If we can learn anything from history, it is that our story has always been acted out and subsequently recorded by people not unlike us. Each successive generation has done its part to immortalise its greatest tales through stories, both oral and written, in the collective memory of society. As time has passed, each generation of historians has endeavoured to tell the stories of their predecessors in a way their own generation can best understand. To the historians of the Renaissance, the millennium immediately preceding their own time quickly gained the pejorative name the Dark Ages, while its architecture was equally disparaged as Gothic.

To the Renaissance and subsequent Enlightenment historians, the time to hearken back to with all glory was that of Classical Antiquity, of Greece and Rome. The intervening millennium between the Fall of Rome in 476 CE and the rebirth of classical learning in the Italian city-states in the late fifteenth century was merely a setback in the onward march of human progress. It was a setback defined by religious fervour and superstition, when science was equated with wizardry and the light of literacy was confined to a select class of clerics and aristocrats.

Each generation of historians has striven to understand the past both in the light of their own times and in terms of how those in the past understood themselves. Yet for the analytical study of history to be properly propped up in our present scientifically centred age, contemporary historians must continue to classify and divide history into particular periods, places, and categories. Political history must remain distinct from cultural history and social history, while the aforementioned Renaissance must somehow be understood as different from the Medieval period that came before it.

What is most striking is the division of the discipline into broad spans of time, particularly concerning European history. One has a choice of diving deep into the past with Ancient History, a concentration primarily focused upon the Mediterranean world from the earliest communities to the fall of the Western Roman Empire in 476 CE. Or perhaps one would prefer to study Medieval History, focusing on Europe during the ten centuries between the aforementioned fall of the Western Roman Empire and the eventual fall of the Eastern Roman Empire in 1453.

Or, if that does not suit one’s fancy, one could try one’s hand at Early Modern History, covering the period from the turn of the sixteenth century to the French Revolution. While modern, this period still has enough of the medieval about it to make it feel more remote. Then there are the modernists, those whose focus is squarely on recent European history, the stuff that has happened since the fall of the Ancien Régime in 1792 and the rise of modern European liberal democracy through the nineteenth and twentieth centuries.

This model of understanding European history is founded on the old Renaissance conviction that Europe will always be dominated by the legacy of Rome; therefore all European history must be understood in relation to the glories of the Roman Empire. The medieval is a giant leap backwards amidst the ruins of the once great imperial edifice, while the rise of modernity marks the return of European society to its former Roman glory. The other thing this model centres on is us, modern humanity. Since it was first devised in the seventeenth century, this understanding of history has always held modernity as the pinnacle of human achievement, at least to that point.

The term modern itself comes from the Late Latin modernus, an adjectival modification of the Classical Latin adverb modo, meaning “just now.” Modernus in turn developed into the Middle French moderne by the fourteenth century, indicating that something similar to our understanding of the present time as modern was in use as early as what we would now call the Late Middle Ages. True, to my generation devices like the digital tablet, the electric car, or the ability to make video calls are decidedly modern, but our grandparents could have said the same fifty years ago: the television, the jet airplane, and the IBM 7080 computer were equally modern to their own time. Likewise, for our great-grandparents the very idea of a subway, car, or airplane was on its own incredibly modern.

The way I see it, the term modern is the hour hand on the clock of time; it is the pointer that marks where we are in the cycle that is human history. Just as Edward III was a modern monarch for his own time, so too is Elizabeth II for ours. Likewise, while Geoffrey Chaucer may well have been seen as a modern writer for his day and age, working into the late hours of the night in his rooms within the edifice of London’s Aldgate, so too is someone like me all too modern for my own time. Though I write so often about the past, and do my best to draw connections between what has been and what is present, I cannot help but understand the people, places, and things that have already come and gone through the lens of my own times, of my modernity.

Therefore, to define ourselves as modern is not to make us any more unique than our predecessors. Rather, in doing so we not only continue the legacies of their respective modernities but also write our own story, always utilising this most constant of chronological labels.