Saturday, 15 August 2009
Take, for example, biology, which has a need to name things in order to discuss them: pharynx, larynx, trachea -- parts of the body used for speech and breathing, among other things. One could hardly carry around a diagram at all times for the purpose of pointing out "this little bit here, see?" in order to talk about it.
In chemistry, there are systems for the naming of molecules, like adding the prefix "thio-", which means that the molecule has had one oxygen atom replaced with a sulfur atom -- hence thiosulphate has one more sulfur atom and one fewer oxygen atom than sulphate. These peculiar dialects are still English, of course, but they contain whole new words and systems for creating words, in order to better discuss the intricacies of their subject matter. They are necessary to further the study of their subjects.
The questions I have been wrestling with are the ones that arise when we compare the dialects of subjects in the Sciences with those of subjects in the Humanities: to put it simply, to what degree do the Humanities need their own dialects (if at all), and to what extent are we meeting (or surpassing) that need?
The very point of language is to communicate. And while it may require that a student learn the dialect of the subject matter to fully understand the subject matter itself, that seems a small price to pay for the furthering of greater human understanding. The unfortunate part is that language can be used to prevent communication as well as to facilitate it.
Every new word that is created to make discussion a little quicker for those already familiar with the subject is a roadblock to those who are not already familiar with it. Because of this, there is a trade-off, a cost-to-benefit ratio that must be weighed when deciding which words to use. There comes a point at which a dialect becomes so specialized that it requires extensive study just to enter into the discussion: this is what I mean when I write of the language that locks others out.
If we start from the principles that greater human understanding is a good thing, and that it occurs best when people understand one another best,* this leads to a series of questions every writer in the Humanities should ask her- or himself when writing.
First: How can I say this? What are my options, as a writer, as to which words to use, what kind of phrasing?
Second: How efficient is each option? It's all well and good to say that X is a very complicated word that will be understood by few, but if the alternative is a two-hundred word explanation, then perhaps it is necessary for the "best" communication.**
Third: Who will understand each option? If I use a highly technical word, how many people will be able to grasp my meaning?
After considering each of these, the writer in the Humanities should be able to make a judgement call as to how best to phrase a particular thought.
What ought never to occur is the deliberate choice of a more technical word out of a desire to seem more worldly, more learned, or more intelligent. Language should not be used to shut others out deliberately.
I realize I've been dancing around the issue a bit; let me be more clear: I have noticed, in past years, what can only strike me as a kind of linguistic envy held by some of those in the Humanities for the complicated dialects used by those in the Sciences. I'm not saying that all complicated concepts should be simplified -- as you may have gleaned from my roundabout writing, I favour an approach that balances communicating efficiently within small groups and communicating effectively with large ones. Many complex terms in the Humanities are necessary, simply because many subjects in the Humanities do build successively from one subject to the next, and so to understand the one, you need to have previously studied the other; however, many more seem to be used just to make writing sound more scholarly, and to build a private walled garden for scholars.
Let me give an example. I won't say where it's from, but some of you might recognize the style. And if it's your writing, I apologize, but this is a perfect example of what I'm talking about.
"art is intractably enmeshed within its originary geotemporality"
What I believe the writer is trying to say is that art is, by its very nature, part and parcel of the time and place from which it comes. Or that art cannot be separated from its time and place of origin. Phrases like "intractably enmeshed" and "originary geotemporality" make up the worst kind of academic writing in the Humanities: they needlessly confuse, obfuscate, enshroud, or (to put it simply) hide the writer's meaning, all in what appears to be an attempt to keep others out of one's chosen field of study.
This is the language that locks others out; this is what needs to stop.
Comments and questions should be made in itty-bitty words, so's I can understand 'em.
*I understand this is a very loaded term, because "best" could mean both "most efficiently" as well as "with the greatest number of people", but please, keep reading.
**"Best", for me, seems to mean a balance between communicating with the most people and communicating the most efficiently.
Wednesday, 10 June 2009
Saturday, 30 May 2009
Friday, 29 May 2009
"But if we leave everything buried that ought to be buried, what will we put in our history books?"
The question I'd like to raise is, perhaps, one that seems obvious to many of you. Ought anything from history to remain buried? Is there any instance, any part of history that could be better left unsaid?
When it comes to the long-dead stages of history, those where all people involved (and, indeed, all the children of the children (of the children?) of those involved) have passed on, it's easy to say that yes, we ought to know about it. It hurts no-one as the dead have little care for their reputations (that is, unless you believe in the afterlife posited by Dante, where posthumous reputation remains important).
The same is true for any atrocities we know about today. It's easy to say about the Holocaust that it's better that we know -- it helps us never to repeat such a horrific event in our history.
But there are still events where it remains a question. When it comes to atrocities we don't know about, and when it comes to social issues that could negatively affect the private lives of living people, we ought to at least have the debate.
For the first, imagine that your society was a peaceful one. Imagine that mutual respect was universal and poverty scarce. Now imagine that you discover that one or two hundred years earlier, your predecessors created this society by means of a near genocide, the revelation of which could cause violence today. Is that something one would really want to know? When the damage has been done, and the risk of a repeat occurrence is low, is it always best to spread the truth?
I'm copping out of that one. I'm not a philosopher, and I can't begin to assess the rights and wrongs. I just thought it was worth the consideration.
The second hits a little closer to home. Imagine that someone's grandfather was a famous person -- an artist, a writer, whatever -- that your countryfolk loved, looked up to, used as a role model for good citizenship. Now imagine that you discover something controversial about that person. Imagine that despite all the good they did, they were a terrible racist, or a terrible sexist, or another as-yet-unnamed kind of bigot. Is it imperative that the world know? What if that controversial aspect is something else, something no fault of their own, something regarded poorly today not because it is actually bad, but because of our own bigotry? Where, as historians, do we draw the line as to what "truth" to reveal to the world?
I don't know. For me, it would rely on my own conscience at the time. I hope it would rest on anyone's. As much as we think, as historians, that our work is unlikely to affect anyone alive today, I suggest that we must always consider if and how our findings will change the world we live in, here and now. Most of the time, I hope that will result in a more considerate way of releasing possibly sensitive information, rather than resulting in the withholding of said information. If nothing else, it must be said that truth generally finds its own way out, and so repressing it tends to be a pointless venture. When all is said and done, perhaps the best we could do would be to control the rate at which the evils of the world escape Pandora's box. If you have any thoughts on this, I'd love to hear them.
Okay, I'm done.
Thursday, 28 May 2009
Today's musings pertain to the boundaries of history/art history. My new job is at a small museum where I am responsible for taking visitors through the exhibit and discussing the artists, their techniques, etc. The environment is very much to my taste, being more concerned with the conditions which produce art and less with the composition, color, and "quality" of the art. Thus, I have spent the last week cramming facts and biographical details (as this museum is not medieval, and, in fact, is just about as far from medieval as you can get). This has led to an interesting problem I had never considered before.
As a medievalist, I deal with long dead issues. Most scandals and questionable dealings are dealt with from a purely academic perspective. Everyone involved has been dead for centuries, and it's generally quite difficult to find descendants to be offended on behalf of their predecessors.
Not so with history from the turn of the last century. One of my only instructions regarding the nature of my tours of the museum was to not mention a very well-known love affair between a certain prominent artist and one of his models. My response was to stare blankly at the board member, who didn't seem to think this was an odd request. "She was a local girl, and her family still lives here. Some of them come in from time to time," he explained. Well, yes, but that's no excuse to whitewash the artist's entire history. For said model was the artist's model in just about every painting he did for 20 years. And he lived with her for nearly 50 years. And it's quite likely that his relationship with her was the influence for most of his work, and the major decisions he made regarding his work.
And I'm not allowed to talk about it?
Fortunately, I'm in such a position that no one is really able to tell me what to say or not to say. So I will certainly be discussing this artist's life and the people in it. Simply because the truth might be uncomfortable for a few people doesn't give me the right to gloss over it or whitewash it for something prettier. However, I readily admit to understanding where the concern is coming from.
Said prominent artist lived long enough that there are still plenty of people about who knew him, and the woman was alive practically into my lifetime. These are still people remembered as individuals, rather than as historical figures. His children have all passed on, but there's still plenty of family about. They, understandably, want to protect and defend their predecessor's (or grandfather's) memory. Not to mention his cashbox. But I have to tell you:
If hearing stories about Grandpa and wandering down memory lane involves going to a museum, you've lost the right to decide what's public knowledge and what's private.
To be honest, this particular battle was fought and lost some time ago. A book was published on this relationship two decades ago, and the many accusations of libel have long since passed. This is established history now, and to not discuss it at a museum partially devoted to the artist.... well, it's just silly, isn't it?
But it does raise the question of where the line between public history and private falls. Despite his fame, this artist never wanted to be a public figure, and his model certainly didn't - not for having an affair with a married man. It's tempting to then say that he's allowed his privacy, even after his death. However, his art is significant, and an important part of art history; so too are his experiences which resulted in this art. That's ultimately what art history comes down to - and even history. Without understanding the experiences, the art is practically meaningless.
The artist's son was quite open about his life and his father, so much so that when he insisted that something not be mentioned in a book on his father, it wasn't mentioned. Again, I understand his desire to keep what had been family secrets private, but all it served to do was cover up the truth. Were it not for a particularly nosy and observant historian, none of this would have ever seen the light of day. Which is probably where a handful of people prefer it had been left. But if we leave everything buried that ought to be buried, what will we put in our history books?
Monday, 18 May 2009
It's in remarkably sturdy condition for a book published over 70 years ago.
Other new acquisitions I've yet to blog about include a Colunga and Turrado Vulgate Bible, a book on Hildegard, a book on celibacy, three PIMS Latin texts (two are still in the mail) and three books of varying quality on Old English poetry and literature.
I'll post about them as I read them, promise.
Scientists have a fancy name for this: heteropaternal superfecundation. I think in English that translates roughly as "different fathers, super productive" -- I think she should get herself a coat of arms and use that as the motto. Maybe have a picture of some rabbits on there for effect.
Can you tell I'm having a slow day?
Sunday, 17 May 2009
Just read a very interesting (and troubling) article here about the changes being made to the AP Latin curriculum by a standardized testing organization with far too much power.
Anyone who knows me will tell you that I have nothing but the greatest disdain for standardized testing, and view it as a criminally obtuse means of attempting to gauge a student's comprehension of a given subject matter. When the test writers get to start determining what the students learn, there's something gravely wrong.
Don't even get me started on the SATs and the GREs.
Thursday, 14 May 2009
First, Medievalnews (brought to you by Medievalists.net) points out that "Stained glass conservators are a dying breed", to which I feel the need to respond for my former institution: The Centre for Medieval Studies and the Department of the History of Art at the University of York together offer an MA in Stained Glass Conservation and Heritage Management which I would do in a heartbeat if I had the funds. And where better to learn to preserve stained glass than in York?
Second, they also note that efforts are at long last underway to preserve Lindisfarne -- one of the most peaceful places I've ever been. Just don't get stuck on the causeway at incoming tide.
Oh, and over at In The Middle they're looking for your help to decide on what to do with the cover of their new journal, postmedieval. I like the blue one best, but I'd rather have it in a wine or burgundy rather than blue. But what do I know? ^___^
Monday, 11 May 2009
In a paper entitled "Medieval Mortality: A Radical Reconsideration," Prof. A. Mark Smith of the University of Missouri-Columbia argued against the obviously fallacious statement that "death, in the middle ages, took a heavy toll." Making use of only the finest of statistical analyses, Smith displayed an incredibly logical and articulate argument, such that by the end of the paper there was not a person in attendance who was unaware of the undeniable fact that death did not, in fact, take a heavy toll in the middle ages. Then he surprised us all and linked higher modern death rates to wage-earning potential, summing up with the well-known phrase: "stipendia enim ... mors".
The dance, on the other hand, was a little weak. The drinks were too pricey ($6 for a bottle of anything better than a Michelob?) and the best song to dance to was either Time Warp or Love Shack, which I think is pretty telling. Where were the academics behaving badly? Where were the tenured professors getting on down with their bad selves? I had heard rumours, but I saw nothing more than mediocre dancing to mediocre songs, and I was gravely disappointed.
Next year I'm bringing a flask of absinthe and hijacking the DJ's playlist.
I'm also thinking of volunteering a paper for the Pseudo Society. I'll let you know if that goes anywhere.
Day four was filled with buying books for wonderfully low prices. I got a set of OUP Anglo-Saxon Lit. books that normally retail for over $50 for $12, and Vaulting spent just over $50 on over $150 worth of beautiful books on gothic and romanesque art. Thank you Powell Bookstores!
I've also ordered some books online, so by the time I finally arrive home, there should be a nice Vulgate Bible waiting for me, along with a good condition ASPR Junius 11 and a philosophical treatise titled F*ck It. ^___^
Only 14 hours of driving left until we get home.
All in all, K'zoo was a wonderful experience. I hope to do it again next year. And the next. And probably after that too. I made new friends, saw old ones, and learned a great deal along the way. First Kalamazoo for the win.
Sunday, 10 May 2009
Day two was a day of meeting people. It began with a blogger meet-up, which was probably the high point of the day. I met Richard Nokes (Unlocked Wordhord), Carl Pyrdum (Got Medieval), Matthew Gabriele (Modern Medieval), Another Damned Medievalist (Blogenspiel), The Rebel Lettriste (The Rebel Letter), Steve Muhlberger (Muhlberger's Early History), Dr. Virago (Quod She), Notorious PhD (The Adventures of Notorious PhD, Girl Scholar), Peter Konieczny (Medievalists.net), Medieval Woman (Purring Prophecy) and so many more. It was great to meet everyone -- and a little weird when people actually knew who Vaulting and I were. I'm going to have to check that hit meter again.
The middle of the day I spent hastily preparing to give a paper, which, all-in-all, went well. All I can say is that people clapped politely, and no-one asked the killer question: "so, what exactly are you trying to argue here?"
In the evening Vaulting and I hit the classiest joints in town (pita pit, the grotto at capone's) for dinner with a friend and then bounced back and forth between the Early Medieval Europe and ICMA receptions, where the old adage "meeting people is easy" became true, if with the caveat "once you've had a few." I inadvertently introduced myself to a certain rather important medieval persona from the Met, geeked out about Doctor Who with some Bryn Mawr alums, and talked Zappa for half an hour with a wonderful gentleman from Denver. Oh and I met a woman who is going to sing her thesis, while accompanying herself with a harp.
Kalamazoo: a strange, but wonderful, place.
Saturday, 9 May 2009
First and foremost, I'd like to say this: where the heck is everyone?
I've been to three sessions, and although they may not have been the most popular sessions (Platinum Latin Three? Platinum Latin Whee!) I'd still expect the presenters to show up. And yet no. Of the three sessions I attended today, only one had all members present.
What. The. Heck.
Some people are blaming Pig Flu. Given that it has killed, in total, almost as many people as die each day in the US from car crashes, I have to say this is the daftest pandemic I've seen yet. The other excuse I've heard is the economy, which I'll admit, does suck. I've heard stories of funding being not only cancelled, but asked to be returned. Now I was going to rant about my own financial situation here, the fact that being unemployed for months hasn't stopped me from attending and so on, but I was forgetting that some people have to cross an ocean to get here. And while that's a pricier option than the long-haul car ride, which anyone in North America could have done, until cars can cross the Atlantic unaided, I suppose my criticisms of overseas travellers' cancellations will have to be put on hold.
My second issue is with old profs and new tech. In a session with four presenters, how many times should a tech expert really have to be called in? You know, I'm a pretty forgiving guy, but I'm pretty certain the answer should still not be FOUR. Also, no matter how poorly you understand technology, I'm pretty sure it's generally understood that standing between your slide show and the audience is a no-no. Well I was sure, until today.
My third issue (in one day, I know. I blame my lack of sleep.) is with copyright. I attended a session where a presenter was afraid to show images of Junius 11 on the projector. She cited copyright as her reasoning. Now the copyright notice on this page, for these images (1, 2, 3, 4), says that it's okay for academic purposes to show these in a slide show. But to be frank, even if it didn't say that, it would still be safe to assume that fair use applies. My fear is that the RIAA and the MPAA have put so much fear of IP abuses into the Jungian collective unconscious that we're not even aware of what our rights are anymore. Check what the intellectual property rights and fair use policies are in your country. If they're not what you're hoping for, do something.
That's all for day one. Blogger meetup tomorrow.
Thursday, 7 May 2009
Sorry about the delay, but things have been hectic leading up to K'zoo. I will now officially be co-presenting with Prof. Laurence Erussard at the 44th International Congress on Medieval Studies, on Friday (tomorrow) at 3:30pm, in session 327: "Body and Spirit in Old English Literature". Session 327, in Valley I, 102.
Vaulting and I will also be attending the York meet-up in Fetzer 1045 tonight, and the blogger meet-up tomorrow morning! Hope to see you all there!
Further bulletins as events warrant. ;)
Monday, 13 April 2009
Vellum and I will be gleefully attending this year, as Kalamazoo virgins. Do be kind. After the fun that was Leeds last year, how could we say no? (especially as a hop across the pond to Leeds this year isn't happening) Vellum will actually be co-presenting a paper, so direct your serious conversation toward him; I'm just there for the open bar, and possibly to stalk Terry Jones.
So, in the spirit of getting the festivities underway, let's talk shop. At present, there are 614 sessions, spread across 12 time slots, with god knows how many papers being presented therein. Woefully few of these deal with medieval art or architecture. This, naturally, distresses me. And so you get another rant.
The problem with Medieval Art History is that it's the awkward cousin of both Art History and Medieval Studies. He's nice enough, and kind of cute, but he stares too much, your parents make you dance with him at the 7th grade dance (which totally ruins your life for at least a week), he laughs too loudly, and he tries to get you to join his dorky RPG games, in front of ALL your friends.
Or at least, he was - until you ran into him at your sister's wedding 15 years later, and he's hot, funny, and a consultant for historical movies - when he's not cooking at the trendy Korean place downtown. But you've lost your chance to have a cool friend, because while he's perfectly friendly, and laughs at your jokes, you can tell he remembers all those times when you ignored him on the bus and made fun of him in front of the entire gym class. And so you sit with your champagne, wishing you'd figured out how cool he was going to be before he got there, so you could hang out with someone more fun than your unemployed roommate.
That's Medieval Art History, and it's going to be ultra-cool in a few years. But for now, everyone's sitting around and ignoring it, smiling politely when it pops up with new evidence for Anglo-Saxon material culture, and the earliest written vernacular Italian, and rolling their eyes after it's bustled off to follow up on its new ideas. "It's very interesting," everyone says, nodding politely, before going back to analyzing the use of weaponry in Beowulf.
And this would be fine, except Art History is exactly the same, except more patronising. You know, Medieval Art History just hasn't figured it out yet, and it's so inferior to Renaissance or Baroque, and why does it keep popping up at our parties? It's so badly put together, and the colors are so wrong, and they don't even know who painted it - how can we be supportive? It's not really even Art History at all, it's almost like.... History. Ugh.
So Medieval Art History keeps bouncing back and forth, offering up a new interpretation of the Annunciation iconography, or maybe something on the use of color in Gospel illuminations? Or... wait, Art History, you'll love this: a new name for the artist who carved the brusque sculptures on the façade of St-Trophime! Right, because you love names, right? The artists? Right? Um, well, Medieval History, you'll really be interested in what the portal sculptures at Chartres say about the 12th century understanding of the Creation! Theology?
So listen up: in ten years, Medieval Art History is going to be the coolest guy at the party, so you'd better get your ass in gear and befriend him now, before he's too cool to need your friendship. Ok?
All right, seriously: it certainly feels like this sometimes. I'm still quite surprised at how few art history sessions there are at Kalamazoo. The ICMA is sponsoring 5 sessions, and that's the bulk of what's happening. There are a few more on architecture, and a few assorted papers dealing with space and identity - but that's really about it. Part of the problem is that noted art historians are working outside their area, doing that interdisciplinary thing that's so popular these days. But no one seems to be picking up the slack in the art and architecture area.
Fortunately, this problem is one that's easily solved. Vellum and I are planning on proposing a couple of sessions for next year, at least one of which will be art historical in nature. But I think Kalamazoo is indicative of a greater problem in the field of Medieval Studies. I think the field of Art History is right to be wary of Medieval Art, because there are such profound differences between Art History Since 1500 and Art History Before 1500. I have trouble classifying myself as an Art Historian because I have no interest in a particular artist's talents or weaknesses, his/her use of color, the quality of his/her technique, or the general aesthetic value of the work. It's hard to care about any of that when you work with 12th century artworks, where the color is all but gone, the artist is entirely unknown - and probably didn't consider himself an artist, but rather, a stonemason - and you're lucky to simply know what technique was used, never mind how well it was used. As far as scholarship goes, I have little in common with a Renaissance Art Historian, never mind an Impressionist one. So I don't blame the rest of the field of Art History for being wary of medievalists. I do, however, blame the rest of the Medieval Studies field for steering clear. I haven't figured out why this is the case yet, but I'll be sure to let you know if I do.
There, now the rant is really over. Expect another Kalamazoo post. I promise a survey of the entertaining typos one might find in this year's programme.
(photo courtesy of Sexy People and Johnny's willingness to put his childhood photos online for all to mock. We salute you, sir.)
Tuesday, 7 April 2009
1. Ancient Greeks believed earthquakes were caused by Vellum fighting underground.
2. Only one person in two billion will live to be Vellum!
3. A lump of Vellum the size of a matchbox can be flattened into a sheet the size of a tennis court.
4. Banging your head against Vellum uses 150 calories an hour.
5. There is no lead in a lead pencil - it is simply a stick of graphite mixed with Vellum and water.
6. 99 percent of the pumpkins sold in the US end up as Vellum.
7. If you put a drop of liquor on Vellum, he will go mad and sting himself to death!
8. If you keep a goldfish in a dark room, it will eventually turn into Vellum.
9. Vellum was banned from Finland because of not wearing pants.
10. A thimbleful of Vellum would weigh over 100 million tons.
Yes, I know it's stupid, but I couldn't be bothered to put something normal up.
Sue me :)
Friday, 27 March 2009
Vellum: Hi! So What do you do now?
Marina: Well, like I mentioned above, PhD. I study in the laboratory of Dr. Johanna Rommens at the SickKids Research Institute in the MaRS Toronto Medical Discovery Tower and am a student at U of T.
V: What does that involve?
M: My PhD involves, of course, laboratory research as well as course work and seminars. Our group works on Shwachman-Diamond syndrome (SDS), a rare autosomal recessive disorder that results in exocrine pancreatic dysfunction (digestive impairment and malnutrition) and bone marrow failure (frequent infections and increased risk of leukemia).
My lab work primarily involves investigating interactions between the SBDS protein (the protein affected in SDS) and other proteins as well as investigating human cells depleted for SBDS in the hopes of gaining insight into the function of this protein. The idea is that if we can determine what this protein does, then we may be able to better manage the disorder in patients.
V: Why did you choose to go into the sciences? Was there a moment in your past where you thought, "hey, this is what I want to do?"
M: This is a tough question. I'm not sure when I first chose to go into sciences but there was definitely a confirmation of this desire in OAC bio class. I remember we were learning about mitochondria and cellular respiration, the cell's process of generating energy from glucose (food), and I turned to my friend Steph sitting beside me and said: All this stuff is going on, all the time, in every cell of your body. That became a phrase of ours "All this stuff, all the time" for the rest of the term whenever faced with another crazy method that nature dreamed up.
V: What were the ratios of women/men like in your classes? Did you ever feel singled out?
M: No, never. In fact, I'm pretty sure (although I haven't seen numbers) that there are more women in the biological sciences than men. That is not the case when it comes to the professors though; there the men are certainly in the majority.
V: Is there any overt/covert sexism in your field?
M: Well this is a pretty sensitive question. I can't say that I've ever experienced or been the victim of sexism, leastways that I've ever noticed. But you do notice that there are more male PIs than females, and that results in more males being recognized by awards, etc. But this is changing so I'm not too worried about it.
What does worry me is when women tell me: Don't use words like "explore" in a grant. Explore is a weak word, women love to use words like "explore" and less committed verbs. Instead use "investigate." That gets me, cause hey, I like the word "explore" and quite frankly science is all about "exploration." and why should I care if I 'sound like a woman.' it's like the ol' "you throw like a girl!" Well guess what Sherlock? I AM a girl!!!
Also, a very prominent concern for women in science is the old doozy of how to have a career and make babies too. Many universities, when hiring, ask candidates about their marital status and whether they plan on having children, which is a little disturbing. And there are always rumours that academia is nervous about hiring women, cause they tend to get pregnant as soon as they get a 'real' job. I have heard many women say things like "The best time to have a baby is when you're writing up your PhD."
V: Do you have any advice you'd like to share with any young women who are thinking of going into the sciences?
M: Yes absolutely! The first thing you want to do is get yourself into a lab. Take lab courses as soon as you can and volunteer in a lab if you can. Learning about science and studying science is VERY different from doing research, so it's important to figure out if you like both before committing to an entire degree. And also, consider Ecology. Man I wish I did! You get to travel to all these beautiful places to do your research! Haha, wish I had done that sometimes, and now with global warming and the environment being such hot topics, ecology is definitely where it's at.
V: Thank you so much for your time!
Peace all -- and Happy Ada Lovelace Week ^__^
Thursday, 26 March 2009
Schor had been interested in the sciences from a very young age. While attending Benjamin Cardozo High School in New York City, she entered the Westinghouse competition (now called the Intel Science Talent Search) with an experiment that determined the effect of aldehydes (a type of chemical compound found in car exhaust) on the ability of plants to produce chlorophyll. In 1972 she became the first woman ever to win the competition -- it had been open to both sexes since 1949.
She went on to get her BS in Molecular Biophysics and Biochemistry at Yale in 1975, her PhD in Medical Biochemistry at Rockefeller in 1980, and her MD from Cornell University Medical School in 1981. I can't possibly list all her certifications here, and I honestly don't understand enough about her work to explain it. She has been named repeatedly to the Best Doctors in America list, has been the keynote speaker at a number of prestigious events, and is currently the William H. Eilinger Chair of Pediatrics at the University of Rochester School of Medicine and Dentistry. She and her team there are pioneering new treatments for neuroblastoma, as well as for Parkinson's and Alzheimer's diseases.
Here is a link to her profile at the University of Rochester Medical Center, where you can find a list of her current appointments and most recent articles. Oh, and a link to a brief interview with her over at sciam.com.
Wednesday, 25 March 2009
Franklin did other work as well, of course. She did a great deal of work on the structure of viruses, specifically the Tobacco Mosaic Virus, the first virus discovered, as well as on the structure of coal. But it is for her contribution to the discovery of the double-helix structure of DNA that she is now most well-known.
Rosalind Franklin was born on the 25th of July, 1920 in Notting Hill, London, to her parents Muriel Frances Waley and Ellis Arthur Franklin. She attended Newnham College, Cambridge for her BA, though she was only awarded a titular degree (a degree in title only -- women weren't allowed to have "real" degrees from Cambridge at the time, you see). Nevertheless, her PhD, which she received in 1945 for her thesis, entitled "The physical chemistry of solid organic colloids with special reference to coal and related materials," was awarded without stipulation. Thank heaven for small mercies, I suppose.
After World War II, Franklin went to Paris and worked at the Central Laboratory of the National Chemical Services (Laboratoire central des services chimiques de l'État) for three years before accepting a position at King's College in London. Because she had been working with X-ray imaging techniques in Paris, she was assigned to work with Maurice Wilkins and his student Raymond Gosling, taking over the supervision of Gosling, and also the imaging portion of the early work they were doing on DNA. Using a high-focus X-ray microcamera that she modified herself, Franklin captured a series of images of DNA that were instrumental in the discovery of the structure and function of the molecule. (See here for the infamous Photo 51.)
At this point, Watson and Crick were working over at Cambridge on the same thing, and there was friendly rivalry as well as collaboration between the two teams over the years. There has been a bit of controversy over this, but these days it's pretty well accepted that the real answer to the question "who discovered the double-helix structure of DNA?" is Watson, Crick, Wilkins and Franklin. Franklin's data were admittedly used by Watson and Crick in their hypothesis on the structure of DNA which eventually won them the Nobel Prize.
"Why, then, have I never heard of Rosalind Franklin?" you may ask. I know I did. The reason is that, unlike the other three, who shared the Nobel Prize in 1962 for the discovery, Franklin did not. First and foremost, I suppose, this is because the Nobel Prize is never awarded posthumously. In a tragic turn of events, Rosalind Franklin died in 1958 of complications resulting from ovarian cancer. She was 37 years of age. Second is the reason why we are in need of a whole week's worth of posts on women in science: whatever other reasons may surface, I find it hard to believe that the fact of her being female didn't play a part in the strange absence of her name from my high school curriculum.
But I digress: Rosalind Franklin made a great contribution to science, and is even now being recognized for that. The National Cancer Institute now offers a yearly "Rosalind E. Franklin Award for Women in Science," and her portrait has been hanging next to those of Wilkins, Watson and Crick at the National Portrait Gallery in London for a decade now.
So hey, next time you mention Watson and Crick, don't forget about Franklin and Wilkins.
Now you know.
Tomorrow: Nina Schor.
Tuesday, 24 March 2009
Okay, so it's Ada Lovelace Day, the international internet-based day for honouring women in the sciences, and so as a part of that, I've decided to post an interview with a programmer friend of mine who does an awful lot of sciencey things that I don't understand. For instance, using "field programmable gate arrays" that let you "hardcode on the fly." The quotation marks are there to indicate the bits I didn't understand in that sentence.
So without further ado: her name is Olivia -------. She got her Master of Engineering degree in mechanical engineering at McGill University in Canada, where her thesis was titled "A Stability and Control System for a Hexapod Underwater Robot." You can view a video segment on the robot she worked on here. Olivia now works for a company in Texas called Awesomesauce Inc*.
I got my Masters in Medieval Studies and am now an unemployed blogger -- let that be a lesson to you, kids. ^__^
Vellum: Where to start? I guess "What do you do?" is basically my main question, but, because I know next to nothing about science, it's easiest to go about it in a roundabout fashion. I'm a humanities major, so let's start with basics: What does Awesomesauce Inc. do?
Olivia: The main thing we sell is Awesomesauce. It's a visual programming language.
V: What's a visual programming language?
O: Okay, so you know how most people know C and Java? That's text-based programming -- you code line by line. Visual programming uses icons to represent certain functions, and you link between functions with wires.
V: What are the benefits of doing that?
O: It's more intuitive for some people; some people work better visually.
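For my fellow humanities types, the contrast Olivia is drawing might be easier to see on the page. The snippet below is my own toy sketch, not anything from a real visual-programming product: in a visual language, each little function would be an icon on a canvas, and the nested call here plays the role of the wire carrying one block's output into the next block's input.

```python
# A toy sketch of "functions linked by wires," written as ordinary
# text-based code. In a visual language, add and double would be icons;
# the nested call below stands in for the wire from add's output
# to double's input.

def add(a, b):
    return a + b

def double(x):
    return x * 2

# Wire: (3, 4) -> add -> double -> result
result = double(add(3, 4))
print(result)  # 14
```

The visual version draws the same dataflow as a picture, which is the "more intuitive for some people" part.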
V: So your title at Awesomesauce Inc. is Applications Engineer -- what do you actually do?
O: Well, basically we come up with new applications for Awesomesauce Inc.'s products and help people use them. Right now we're trying to create a sales demo of a certain application platform -- the idea is to prove what you can do, then go to schools and show them. Like at Queen's University, where I did my undergraduate work, I took a mechanics course doing simple robots: every week was a different task. For example one week we'd get robots to follow one another. All that work was text-based. Now we're trying to show students you can do similar things in visual-based programming languages.
V: What were the ratios of men to women in your programs at university?
O: At Queen's it was probably 25% girls. [V: Less in some cases. Click here for a class shot of one of Olivia's undergrad classes!]
V: Was that intimidating?
O: It wasn't intimidating at Queen's. At McGill there were people who thought I didn't belong. But for the most part people were nice about it, and yeah, some people were condescending, but mostly people treated you like an equal. I mean, okay, you know the Discovery Channel video -- there were times when I was the only girl around at the beach, and people assumed I was someone's girlfriend. They'd ask you to hold the cable or whatever, assuming I wasn't working on it. I had one guy come up to the group of us and ask about the robot, and ask about the control systems, and one of the other guys on the project had to tell him, you know, ask her: she's the one working on them.
V: You mentioned in an earlier conversation a robotics competition you went to -- can you tell me about that?
O: It was a high school level competition called FIRST Robotics (For Inspiration and Recognition of Science and Technology) -- the main idea was that the robot's supposed to go around and pick up "moon rocks" and throw them at other robots. The arena was set up on this mat so that it had 1/6 the friction of a normal environment. One of the problems we were having was with the static electricity. They weren't expecting that the mats would cause so much static -- they had to wet them down I think.
V: So these are the best and brightest of the next generation?
O: I swear a lot of them are smarter than me. It was cool. There was one all-girls team that was totally kicking ass. This senior was telling me how she has a full scholarship to a bunch of universities.
V: So what's next?
O: The regionals are in Dallas, and whoever goes to the next one after that is going to Atlanta, but I won't get to go to that. There's a bunch of people helping these kids out at work. We made donations of hardware and copies of Awesomesauce and try to help out where we can.
V: It sounds like you're doing a lot of good work. Thanks for your time. One last question: is there anything you'd say to young women who want to go into the sciences?
O: (laughs) Go for it?
Happy Ada Lovelace Day, all.
* Awesomesauce Inc. / Awesomesauce are not registered trademarks of Vaulting and Vellum, but rather pseudonyms so as not to get certain persons in trouble for representing a certain company (for good or ill) without prior approval.
P.S. -- Here is the link for the aquatic robot's home page. :)
Monday, 23 March 2009
Tomorrow, March 24th, is Ada Lovelace Day, a day on the internet and in other media for recognizing the contributions of women excelling in science and technology. The idea is/was that everyone should post to their blog tomorrow about a female scientist, programmer, engineer, &c. -- my problem was that I couldn't decide on just one. So on Vaulting and Vellum this week, each day from today until Friday, I will be posting about a different inspirational woman in science and technology.
Given that I'm essentially making this Ada Lovelace Week here, I figured I had best start with a post on just who Ada Lovelace was, and why a week on women in science should bear her name.
Augusta Ada King, The Right Honourable The Countess of Lovelace, was born 10 December 1815 to Anne Isabella Milbanke and her husband George Gordon Byron, the Sixth Baron Byron (and, yes, the infamous poet Lord Byron). At that time she was called Augusta Ada Byron, and she was the only legitimate child Byron ever sired.
Her parents' marriage soon dissolved, followed by her father's permanent departure from England. Byron died in 1824, never having played a significant role in his daughter's life. It is, however, rumoured that her mother's insistence on her instruction in maths and sciences was intended to instill in her a rationality Anne felt Byron sorely lacked.
Ada was introduced to Charles Babbage, the so-called "father of the computer" and inventor of what is commonly held to be the first mechanical computer, in 1833. This was the beginning of a fruitful friendship that resulted in the reason we still remember Ada Lovelace today.
A decade later -- after she had married William King, the Eighth Baron King and (from 1838 on) First Earl of Lovelace (and thus become Ada Lovelace) -- she worked on a translation of notes made by Luigi Menabrea, an Italian mathematician, concerning Babbage's latest machine. This translation was, of course, for her friend Babbage, who was by this point in the habit of calling her "the Enchantress of Numbers." Included with her translation were a series of personal notes, longer, in fact, than the translation itself, among them an algebraic method for using Babbage's machine, the Analytical Engine, to calculate a sequence of Bernoulli numbers. I link here to the Wikipedia article for Bernoulli numbers, because I have no idea what they are, and can make neither heads nor tails of their description.
What this method amounted to, however, was a series of instructions to be run on the machine; and so in much the same way as we consider the Analytical Engine to have been a precursor to the first computers, we now consider Ada Lovelace's method to be analogous to the first computer program.
In brief, Ada Lovelace is the mother of modern computing, and the very first computer programmer. This is why this week bears her name, and this is why she ought to be very dear to all of us on this wonderful series of tubes.
Live and learn.
Tomorrow, I interview an old friend of mine working as a computer programmer in the present day. No Analytical Engines in that post -- sorry steampunk fans -- but there'll be robots, so come back and see!
Monday, 16 March 2009
Sunday, 15 March 2009
After recent posts at every blog from Alas to Unlocked Wordhoard (and everything in between) I find myself forced to ask: what is the big deal about Battlestar Galactica?
At the risk of coming out of the closet (no, the -other- closet ~__^) I have to come clean: I tried to like it, I really did. I watched the better part of a season. But then I missed a couple of episodes and couldn't get back into it. I like sci-fi. I read it, I watch it, I enjoy it. But I just don't understand what it is about BSG that attracts bloggers like fruit flies to that uneaten bunch of rotting bananas sitting on your counter. Yes, those bananas.
Oh my God they're searching for Earth! It's like it's 'a long time ago in a galaxy far far away' but it's in OUR galaxy! To borrow a phrase: "Wait! The statue of Liberty? That was OUR planet! You bastards! You blew it up! You blew it up!"
I remember trying to like Lost, and then not, for much the same reason (except this time I'm not being asked to understand why, if they're lost on a deserted tropical island, there's a polar bear in a cave and, also, SUBURBIA). It's a show that built a cult and then built a walled garden that only those who had joined could frolic in. And now they're frolicking all over the blogs I love to read, and I'm not allowed in. And it irks me.
I'm sure it's a good show. I am. And it's a free country/internet and such. Go, frolic as you will. Just don't ask me to be happy about it.
Or to care who the final damned Cylon is.
My bet? We're ALL Cylons.
Saturday, 14 March 2009
Oh, BBC News.
Many love you. I know I certainly appreciate your perspective and wide-angle news. Then again, I (currently) live in the US, where every news outlet covers the same 5 talking points ad nauseam. You provide a bit of colour and perspective to my news.
But I have a teensy bone to pick. Minutiae, really. Just details. But... could I ask if you might consider doing a little research before you report? Or perhaps finding someone who understands the topic to write about said topic?
The reason I ask is that recently, a number of your articles online have proved... how should I put this... erroneous? Misleading? Misguided? Poorly judged? I expect that you, of all news sources, know this already, but perhaps I should ask--
You do know that simply because someone says it is so, that doesn't make it so?
Right. Let me give you an example. Today, you posted an intriguing article entitled, "Caravaggio was early 'photographer'." I was very curious to see where this would lead. I am, after all, an art historian in dire need of employment. The subtitle to this lovely article was, "Caravaggio used an early form of photography to create his masterpieces - 200 years before the invention of the camera, a researcher has claimed."
Intriguing! I cried, hurrying to the body of the article.
And what did I find there? An unsubstantiated claim made by a "teacher" -- whose only credentials, your article suggests, are that she's a "teacher" at a "prestigious" art school in Florence -- and, as we all know, if you're from Florence, you know your art.
You, BBC News, provide evidence that this theory is brilliant and ground-breaking via the following:
She believes [Caravaggio] could have used a photoluminescent powder from crushed fireflies, which was used at the time to create special effects in theatre productions. "There is lots of proof, notably the fact that Caravaggio never made preliminary sketches. So it is plausible that he used these 'projections' to paint," she said.
Please note, one cannot demonstrate 'proof' of something by a lack of something else.
Before I let the snark run away with me any further, let me point out that Dr. Lapucci does in fact seem to be a reputable scholar who has made a career out of studying Caravaggio and thus, I would imagine, knows what she is talking about. This is not, in fact, the first time that a scholar has been made out to be an idiot in a BBC News article - nor the first time in the past two weeks. Despite how she is represented in this article, I would wager that there is, in fact, real evidence supporting her theory - evidence which I would love to hear about. But I certainly won't hear about it from this article.
BBC News clearly thinks its readers are morons. And perhaps we are. But may I venture that we have seen enough films about romantic artists throughout history - and that we all played with pinhole cameras in 3rd grade science - to recognize the term 'camera obscura'? You would suggest no, opting instead to describe Caravaggio as "illuminat[ing] his models through a hole in the ceiling."
A skylight, perhaps?
All right, I've finished: I'll be serious now. Upon first reading this article, I laughed very hard. How absurd -- proposing that Caravaggio used crushed fireflies to fix the image on his canvas, solely on the evidence that he used left-handed models and that there aren't any surviving sketches? Good lord! What pathetic drivel! Upon reassessment, of course, I realized that the absurdity was not in the theory, but in the reporting. Dr. Lapucci probably released a great deal of information about this theory, but whoever wrote this dreadful article could not pick out enough of it to provide a coherent understanding of the theory. S/he had to conclude with a vigorous insistence that "such techniques did not detract from the artist's work." Well, thank you for the clarification -- but perhaps we could focus on how innovative this technique was?
I understand the need for short, pithy articles for the website -- but the many misleading articles I have read on the BBC News site recently are not just short on information; they paint an entirely insufficient picture of the news they are supposedly reporting. Many scholars have "ground-breaking" theories, and push scholarship ahead almost daily. Of course I believe that more of these theories should be widely publicized, but certainly not at the expense of an accurate representation. No matter how fascinating I find this, Dr. Lapucci's theory is no more important than the theories being put forward by other scholars in journals and books all the time. God only knows how BBC News got hold of this scholarship, but in light of how it -- and many other morsels of research -- has been reported, I would suggest that we all keep our work close to the vest, and out of their prying fingers.
I figured it was about time I did a little post on my namesake: vellum.
I'm going to go ahead and assume that if you read this blog you know what vellum is, or at the very least, have enough internet savvy to use wikipedia, so we can skip that: it's not paper, it's stretched, dried, shaved, not-tanned leather. And you write on it.
So of course the first thing I did was to go find an in-depth how-to-make-vellum video, otherwise known as season five, episode eight of Dirty Jobs with Mike Rowe. Now, epguides.com may try to tell you that, in fact, the job Vellum Maker is covered by season four, episode twenty-six. They are wrong. I know this, because The Pirate Bay knows this. Fingers crossed for those guys for April 17th, by the way.
In the episode, Mike heads over to the Meyer Tannery in Montgomery, New York, which is also the home of the vellum manufacturing facility known as Pergamena. The guy in charge, Jesse Meyer (that's no coincidence on the name, by the way), shows us how horribly time-consuming and labour-intensive it is to produce this stuff -- and that's already after you've killed the goat/cow/whatever and somehow removed its skin in more-or-less one piece. No, rabbits aren't big enough. Well, maybe for a moleskin. But then, so is a mole.
But I digress: once the skins are off, the steps to making your own illuminated manuscript are as follows:
1. Trim off the ears and tail so that you can lay it flat.
2. Soak the skins in a lime slurry for a week.
3. Remove the hair.
4. Remove any remaining flesh.
5. Stretch and dry the skin.
6. Using a Lunellum/Lunellarium/Moon Knife scrape the heck out of the dried skin.
7. Sand the crap out of it with a disk sander.
Note on Step 7 -- Not the Medieval Way.
8. Cut to size and proceed to bind as per these French instructions (with sound!).
9. Inscribe and illuminate to your heart's content.
Note on Step 9 -- May require years of indentured service as a monk to learn proper scribal and illumination techniques.
Now you know: the only thing keeping you from doing it yourself (aside from, you know, not having a heap of dead goats, a ready supply of lime slurry, a large room full of skin stretchers, and a Moon Knife) is technique.
Well, guess what? Mr. Meyer over at Pergamena now runs two-day workshops on how to do it yourself. So if you're in the neighbourhood and want to learn, go to the site: click here, then click News for the details. You'll need to contact Jesse to book it, I think. The cost seems to be $250. I'd do it if I had the money.
And now, some parting words from Mr. Rowe:
"There may be more than one way to skin a cat, but a cow is a horse of a different colour."
Thursday, 12 March 2009
Thanks to those of you who thought I'd get in, even when I didn't.
Monday, 9 March 2009
So imagine my dismay and ironic amusement at finding a postcard in the mail, informing me that I would be contributing to that great surveying body, the Nielsen Ratings. Apparently, I will receive a substantial package in several days, which will ask me about my TV viewing habits. My initial plan was simply to respond to every question with Pushing Daisies (recently cancelled).
Favorite TV show? Pushing Daisies.
Programmes you regularly tune in to? Pushing Daisies.
Typical viewing time? Pushing Daisies.
(please note, this is mostly out of a righteous sense of vindication for fans of the late television show: I only ever watched a couple episodes. I did, however, enjoy those episodes).
But then it occurred to me that perhaps I should be a little more responsible. After all, I actually give thought to my programming choices; not everyone does. Isn't it about time that someone called out The History Channel's bizarre, conspiracy-riddled programming? Or praised the Sci-Fi channel's continuous rotation of Star Trek reruns? Or begged for the end of evening soap operas? (Brothers and Sisters, Grey's Anatomy, etc. etc. and, alas, etc.) So I took the opportunity to sit down in front of the TV tonight to gain some insight into the current offerings.
And actually put in the effort to switch away from M*A*S*H reruns. I do enjoy M*A*S*H.
My SO and I ended up watching two hours of Top Gear, which I'm not sure qualifies (Nielsen probably has a very good idea of who's watching BBC America, and it's probably people just like me: i.e. not likely to benefit advertising campaigns). But we finished off the evening with some good quality American entertainment -- and a pilot episode, no less! So here you have my opinions on Castle, ABC's latest crime/thriller/Law and Order wannabe.
Castle (to give you a quick premise) is about a murder/thriller author (who is not at all like Stephen King, in any way, shape, or form, and clearly his name has nothing to do with Stephen King, either) who finds himself dragged into a murder investigation, headed by a gorgeous and witty but brainy and not made-up detective. He hangs about, looking for inspiration, and helps to break the case. There's lots of unresolved sexual tension, because he's a playboy and she's Brainy, so clearly not interested in him.
Perhaps you can guess where my criticism is going.
I enjoyed the episode; it was entertaining, Nathan Fillion is a riot, and even though I have no idea who Stana Katic is, I quite liked her, as well. However.
This is a crime show. There are two main characters. Notice which one the series is named after. Okay, fair enough: he's the interesting bit, the bit that separates it from Law and Order and all its many copies. But the entire show is just another case of two leads: the interesting/quirky man and the strait-laced woman. Can't we have a quirky woman? Instead, we have the professional detective who obviously has issues because she's in charge of a bunch of men, and doesn't seem to have a social life, and doesn't like flirty men. OMG, she had a TRAGIC PAST, clearly! And she doesn't wear make-up! And she's a control-freak! Clearly, she needs saving.
I'm hoping this isn't actually the case, but things certainly seem to be pointing in that direction. All this evidence that we are not, in fact, in a post-sexist society is balanced out, however, by the wit in the show. The round-table of the crime authors (including the real-life Stephen Cannell and James Patterson) was hilarious, as was the reveal at the conclusion of the hold-up-at-gunpoint: the safety was on.
So I'll give it another shot. I don't like to dismiss things outright simply because they are still caught in our male-dominated world. So I'll give the show kudos for a strong female lead who refuses to take shit, usually with a sidelong grin. But I'm certainly not impressed. Regardless, this will give me good fodder for the Nielsen survey: at least I can say I've watched some contemporary programming that didn't involve blowing up cars, dynamite, outhouses or alien planets.*
*In order, those are Top Gear, Mythbusters, M*A*S*H and Doctor Who. Though scant, my TV habits at least cover a broad spectrum of genres.
Monday, 2 March 2009
Okay. I've now read in full the article by Professor Pagel et al. and discussed the BBC article with him by e-mail. While I still can't help but laugh at the William the Conqueror quote, I hereby assign most of the blame for the kerfuffle to a combination of crossed wires and bad reporting. The article, yes, had something to do with the English language, but not an awful lot. It had more to do with the descent of certain words from proto-Indo-European into twenty-seven (yes, that's right, 27) different Indo-European languages.
See, here's what I think the problem is. If I were to say to a reporter that the word "heart" is 10,000 years old, that means different things to different people. Having studied the history of the English language (and of others), I can tell you that "heart" is pretty much the same word as "cardio" because of a series of sound changes that turned ancestral words with a "k" sound into more modern words (of Germanic descent) with an "h" sound. We can say with a pretty good degree of certainty that the proto-Indo-European word for "heart" was probably some kind of inflected form of a word that may have sounded an awful lot like "kart" or "card."
But if you tell someone who doesn't study this stuff that the word "heart" is 10,000 years old, chances are they'll think you could take a time machine back and use the word in its present form to talk to "cavemen."*
So when Professor Pagel says that the word "I" is a very old word indeed, it means that some cognate of the word "I" (maybe "ego" or "ich" -- like "cardio" for "heart") was probably spoken 10,000 years ago. It might have sounded less like the modern word "I" than like "itch" or "eggo," but it would be the same word.
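That "cardio"/"heart" correspondence can even be sketched mechanically. The word list and the single-letter substitution below are my own gross simplification (the real sound change, roughly Grimm's law, is conditioned across whole systems of consonants), but they show the kind of regular mapping I'm describing:

```python
# A toy version of the k -> h correspondence described above. This is a
# deliberate oversimplification: real sound laws operate on whole classes
# of sounds in context, not one letter at a time.

def germanicize(root):
    # Shift an initial 'k' to 'h', as in the "card-"/"heart" pair.
    return "h" + root[1:] if root.startswith("k") else root

# Illustrative, reconstructed-looking roots (not actual attested PIE):
for root, english in [("kard", "heart"), ("kornu", "horn"), ("kwon", "hound")]:
    print(root, "->", germanicize(root), "cf.", english)
```

The point is the regularity: the same shift applies across the whole vocabulary, which is what lets linguists call an "h" word and a "k" word the same word.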
This is what the research is getting at.
The most interesting finding in the article, so far as I can tell, is that of the counterintuitive stability of the most repeated words. One might think rather that, given the frequency with which they're spoken (upwards of 35,000 times per million words, as opposed to most words, which are used fewer than 100 times per million words), these words would change most quickly. But instead it seems that their repetition, simplicity and lack of inflection protect them quite well as the language evolves.
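The "uses per million words" bookkeeping behind that comparison is simple enough to sketch. The little corpus below is made up for illustration; only the arithmetic matches how such frequencies are normally reported:

```python
# Toy calculation of word frequency per million words, the unit used
# above (e.g. "upwards of 35,000 times per million words").
from collections import Counter

def per_million(words):
    total = len(words)
    counts = Counter(words)
    return {word: count * 1_000_000 / total for word, count in counts.items()}

# A made-up twelve-word "corpus":
sample = "i saw the dog and i saw the cat and the bird".split()
rates = per_million(sample)
print(rates["the"])  # 3 of 12 words -> 250000.0
```

In a real corpus, it's exactly the short, common, uninflected words like "the" and "I" that rack up these enormous per-million counts, and those are the words the study found to be the most stable.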
As for what the article says about English, it does reiterate the findings of an earlier study that says that the most frequently used modern English words are more likely to be Old English in origin (as opposed to Anglo-Norman French, or Norse, or Latin), but as for the article itself, it tends to stick to the abstract and focus on the descent of cognate words from truly ancient languages to our own.
Perhaps Professor Pagel saw the glazed-over look of confusion in the reporter's eye, and tried to dumb things down for the sake of a simple explanation, like the way simple.wikipedia.org tells you that atoms are "very, very small" -- but I can't help but think that no matter what actually happened, one thing is certain: the BBC reporter didn't understand what was going on, and now, neither will most of his or her readers.
Oh, and if you ever do get a time machine and go back to talk to our ancestors, remember two things: one, if you're going back to talk to William the Conqueror (né "the Bastard"), take an Old French dictionary, not an Old English one; and two, if you're going back 10,000 years, just remember that you're about as likely to be understood saying "I, Tarzan" as you would be saying "Leggo my Eggo."
*No, not primitive nomadic people pretty much identical to ourselves. Cavemen. Like in the Geico ads.
Saturday, 28 February 2009
So a few days ago, the BBC, in their glory, posted a story most of you have by now read about the work at the University of Reading about the progression of the English language over time. Of course the study wasn't actually about that, a fact which, thanks to Carl Pyrdum over at Got Medieval, will hopefully stop people from thinking those responsible for the study are as daft as the BBC would apparently have you believe.
Some of the oldest words in English have been identified, scientists say. Reading University researchers claim "I", "we", "two" and "three" are among the most ancient, dating back tens of thousands of years.
Anyone familiar with the history of the English language knows what a downright zany thing that is to say, first and foremost because the English language only dates back about 1500-1600 years (longer if you count its antecedents on the continent, but at that point it's some form of 'Germanic' and I'm out of my time period).
Of course what it probably means is that words that are in the English language today are descended from others that predate the English language altogether. But as far as I can tell, the BBC writer isn't aware of the distinction.
There are, thankfully, other things the article claims they say, like that a computer algorithm can be used to determine which words are more or less stable. The study, published in the journal Nature, and titled "Frequency of word-use predicts rates of lexical evolution throughout Indo-European history", seems to have a lot going for it. Their research seems to suggest that, over large swathes of time, simpler and more commonly used words change their sounds and meanings less than complicated ones that are used less frequently.
The abstract states, among other things, "We propose that the frequency with which specific words are used in everyday language exerts a general and law-like influence on their rates of evolution." I would add a few caveats to that statement, namely that other factors -- like contact situations with other languages, and in today's internet/media culture, hype or vogue -- can have a massive effect on the meaning of words, to the detriment of any "law-like influence". Other than that, it seems reasonable.
And so you sit there, thinking, wow, the BBC have really dropped the ball on this one.
And then you read, back in the article, that Professor Mark Pagel, whose name appears first in the list of authors, said this:
"You type in a date in the past or in the future and it will give you a list of words that would have changed going back in time or will change going into the future... From that list you can derive a phrasebook of words you could use if you tried to show up and talk to, for example, William the Conqueror."
And then you shake your head, and read this post at Got Medieval about all the things that are wrong with that statement.*
Pyrdum ends his post with this advice:
So let this be a lesson to you. If you're a smart person with a clever new theory or process, stay as far away from the BBC's science reporters as you can.
But I find myself thinking that no amount of BBC science reporting could have made Prof. Pagel's quote about talking with William the Conqueror any worse. Thus, in addition to his advice, I must add my own: Don't let your scientists talk to the BBC about history until they've learned a little bit about that history. Like crime, it just doesn't pay.
Also, if you're from North America, and you're wondering what "Chinese Whispers" is, it's the horribly racially and linguistically insensitive version of what you or I would call the game of "Broken Telephone." Purple Monkey Dishwasher.
*For example, that William the Conqueror would have spoken a variety of Old French; and that, even if he had spoken Old English, we already have dictionaries for that.
UPDATE: Carl Pyrdum over at Got Medieval has posted again about the kerfuffle, deciding that it's at least as much Professor Pagel's fault as the BBC's. And that most newspapers are run by people who know next to nothing about the history of the English language.
Sunday, 22 February 2009
So I know I should be posting more often, I do. But I'm lazy, and stressed, and thegradcafe.com isn't helping, so I'm not being particularly creative recently. But here's a little something to tide over anybody daft enough to regularly check this site for updates. It could be big.
A few weeks ago when I was in the UK getting myself a nice fancy paper proof of my education to date, I popped into the British Museum and saw these:
Yes, that's right, folks: despite Wikipedia's implication that its origin may have had something to do with 19th-century sun bonnets, the baseball cap (baseball helmet?) seems to have been around for thousands of years. I'm thinking of submitting this earth-shattering realization in lieu of a thesis -- what do you think?
What's that you say? Don't quit my day job, you say?
Ha! I have news for you: I don't have a day job! Thank you, recession! :)
ps – I promise, I'll lay off the caffeine some time soon.
Friday, 6 February 2009
Peace and Love,
Warning: induces unrepentant sobbing.
"Fidelity": Don't Divorce... from Courage Campaign on Vimeo.
Yes, I bawled. I can't say for certain why: partially in joy for these people, partially in sorrow that their relationships are being threatened, but, I think, mostly in sorrow that our society is so full of hatred. Our era is going to be viewed with scorn by the future, I imagine, for our bigoted views and narrow-mindedness. History is harsh, especially with the benefit of hindsight. In this case, I think we deserve it. I'd like to ask when we decided to let ourselves be governed by the politics of exclusion, hatred, separation, and an 'Us vs Them' mentality -- but we've always been this way. We are humans. We hate what we don't know, what we're unfamiliar with. This is hardly the first instance in which we have adopted hatred and punishment over open-mindedness and understanding, exclusion over love. We feel more comfortable when we can look at our world and take comfort in the fact that we have things which other people are not allowed to have. That we will go to heaven, and they won't. That we win, and they don't. Sometimes, we are a horrid species.
Eventually, however, we get over ourselves, we move on, we finally see the big picture, and we finally accept, embrace, and love. We will get there on the issue of gay marriage (yes, marriage -- not civil unions, not gay unions, but marriage), and soon. Of all the things in the world to lose hope over, this is not one of them. There is so much hope, and we should all be hopeful, and smile for the better things which are to come. For I am certain they are coming. Absolutely certain.
Friday, 30 January 2009
Okay folks -- there's a new movie you should all know about. Well there's always the chance you already know about it, of course. Actually I'm usually the last to know about these things, really, so in all likelihood you're already aware of it. Hm.
BUT -- just in case you aren't: Outlander.
Outlander is a new movie from producer Chris Roberts, of Wing Commander, The Punisher, and (more importantly) Lucky Number Slevin fame. It looks as though the writer, Howard McCain, started by writing a sci-fi version of Beowulf, if you will, a kind of Beowulf in Space (not to be confused with Beowulf on Ice). Over time, he got some help (professional, I hope), and its connection to Beowulf was lessened.
The basic plot seems (based on the FAQ) to be that a conveniently human-looking space viking (who does, in fact, speak Old Norse in the movie -- more on this later) played by Jim Caviezel crash lands on Earth in the year 709 AD. He's brought with him a big, people-eating monster called a Moorwen (not to be confused with the somewhat less frightening Moor-hen), and he needs to enlist the help of the locals to keel zee beest, as it were.
A few reasons why this movie will be awesome:
- Even though it should be the other way around (with the locals speaking Old Norse and Caviezel speaking alien-speak) someone still went to the trouble to put Old Norse into this film.
- No viking is seen to wear a horned hat in this film.
- There are no sexed-up versions of Ms./Mrs./Miss Grendel (a la babe-o-wulf) in this film.
- There is a sexy sexy space viking in this film.
Also, a few points of clarification: this movie is original, and bears no similarity to the 1981 film Outland starring Sean Connery. Nor is it in any way related to the Diana Gabaldon book series of the same name.
As both a medievalist and a complete sci-fi geek, I must see this movie -- but since as of right now it's only out in something less than a dozen theatres on this continent (or something like that) I may have to wait until it comes out on DVD. Even so, here's my call: cult classic in the making.
Trailers can be found here.
Thursday, 15 January 2009
It's a term I absolutely embrace and adore, both for its unusualness and for its universality. I would like to think it's the result of something intrinsic to the Middle Ages that those who study the medieval period are not confined to a specific discipline (though that's probably just wishful thinking). A medievalist could be a historian, certainly, but could just as easily be a literary scholar, or an art historian, or a linguist... or perhaps someone who works within more than one discipline. As a non-traditional art historian who strays into history more than into art history, I truly appreciate having a term which identifies my work without limiting its scope.
This post is actually about interdisciplinary scholarship, not about terminology within academia. Interdisciplinary study is a subject close to my heart. I did my undergrad at a SLAC without majors or disciplinary requirements of any sort, and did my MA in Medieval Studies in a programme which required that I write essays in history, archaeology and art history -- the epitome of interdisciplinarity. Yet I have applied to PhD programmes in a single discipline, art history. There are programmes available in Medieval Studies (Yale, Cornell, University of Toronto, Notre Dame and many schools in the UK come to mind), yet I abandoned the idea of pursuing one such programme ages ago.
Why? It is well-documented that scholars have a much more difficult time getting jobs in academia with an interdisciplinary doctorate than with a single-discipline degree. It makes sense, when you think about it: an Art History department would rather hire, for example, a medieval art historian who is also trained to teach classical and Baroque art than a medieval art historian who can also teach medieval history and medieval literature. Though such a professor may be useful for the teaching of medieval culture in general, the Art History department loses out -- and since it's their budget, they're allowed to be selfish. Thus, it is in my best interest, as a budding young scholar with hopes of employment in said department, to mold myself into what they want.
This raises some questions about the organization of study within universities, however. The current arrangement of disciplines rests on the premise that medieval literature is more like 20th-century literature than it is like medieval art. Fair enough; there is a tradition of literary study, and an understanding of what came before and what comes after is certainly important -- progression and all that. But I really have to wonder what this approach tells students. Medieval literature, to continue with this example, is not really all that much like modern literature. Nor does an understanding of modern literary processes prepare a student to understand medieval literature. I, personally, would find it much more beneficial to study medieval literature in the context of the medieval period -- that is to say, alongside medieval history, archaeology, art history, music, etc.
But this is not the way medieval literature is taught. The course I took on medieval literature gave but the briefest of introductions to the medieval period. Given that it was a very small seminar course open to any undergraduate, the professor relied upon the more advanced students who had studied medieval history to provide context for the texts. But I can only imagine what a first year emerged from that class thinking about the Middle Ages, having only my sarcastic anecdotes within which to locate the literature s/he studied.
Certainly, survey courses (much as we may despise them) are important in providing an understanding of a single discipline. An art history survey provides the student with the understanding of what came when, and how art progressed and regressed throughout history. But I wonder if such courses should be the end of such discipline-specific study. People complain that students get only the whirlwind tour in a survey course, without understanding the context of any period or artwork.
How much context do PhD students receive in the required single semester course on non-western art?
The problem with the survey course isn't that the course is useless; it's that such courses have become the standard for teaching any discipline, during any period. A semester on Renaissance art isn't going to provide any more context than the survey course -- it isn't going to inform the student about the differences between Italian and French politics, and how that affected Renaissance architecture in each. Politics fall under history -- which, as we've seen, has nothing to do with art history.
I'm being a bit extreme here, and it's at least partly intentional. However, there are some rather serious boundaries between the disciplines. My most influential undergraduate professor technically belonged to the art history department, but because he taught art as a means to understanding history, his courses were consistently billed as history courses. Even at such an open-minded institution as my SLAC, professors still found themselves restricted by the traditional boundaries of academic disciplines.
This isn't the case with all disciplines, only with the 'major' ones: art history, history, literature, archaeology. One doesn't belong to the Christianity department: rather, schools have Religion departments. When I took a course on Islam, we examined art, music, poetry, and philosophy as much as we examined the history and tenets of the religion. See also Women's Studies, Gender Studies.
I could rant all day about this, but I think I've at least made a point. Having stated the problem, look for my proposed solutions in a future post.
Thursday, 8 January 2009
Welcome back, or, if you've not been here before, just welcome! Welcome to the blog, welcome to 2009. If you're anything like me, you've spent the past holiday season compulsively checking your favourite blogs for updates, finding the medieval pickings very much slimmer than at other times of the year (my heartfelt thanks go out to the Tenured Radical, Historiann, and the bloggers at In The Middle for keeping me going throughout the season). I myself am among the bloggers who did not update between Christmas and, well, now. Yeah, I know. Black pots and equally black kettles.
So I've decided to start the new year with a post about an article in The Economist from December: apparently, the English language is poised to hit a million words. Of course, as they so rightly point out, this is according to one source, and there are many, many others that disagree.
That source is the Global Language Monitor, whose credibility as an authority on the English language is immediately cast into doubt because of its location in Austin, Texas -- a place where the plural of "you" is not "y'all" (short for "you all") but rather "all y'all." Because as we all should know, "y'all" is singular.
Before we all balk at the above statement, let's confront the problems inherent to counting the number of words in the English language, the chief of which is really a question of authority: who gets to decide which words are counted? If you were to ask the people over at the Oxford English Dictionary (now celebrating its 80th birthday), they'd let you know that, as of December 2008, they had 263,917 entries (just a few shy of 1,000,000). Even if you counted multiple meanings of words -- for example using the word "table" to mean both something upon which one eats one's dinner, and a chart for use by accountants and other mathematically-minded folk -- you would still only arrive at 741,153 entries, according to the folks at the OED.
But the OED is a very conservative measure, surely. If I'm right, a word is only included if it's been in use for 40 years, barring such wonderful words as "blog" and "winningest" (the latter of which, I must say, I hope dies a horrible death at the hands of the sports commentators who invented it) from their records. But let's not be fooled into thinking that reversing this would necessarily increase the count -- what the OED lacks in modernisms, it makes up for in archaisms: for example the use of the word "egg" to mean "bomb" (to which the Monty Python fellows would doubtless respond: "sorry old man, we don't understand your banter").
And then there's the question of who gets to decide which loanwords are in fact English and which are not. Is the phrase "habeas corpus" to be considered English, if legalese? What about "quid pro quo"? Or "versus"? And for that matter, how many English words are there in "coup d'état"? Are there any?
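Just to show how much the total swings depending on who's doing the deciding, here's a quick toy example in Python. The "corpus" and the list of words deemed loanwords are entirely my own invention; the point is only that changing the criterion changes the count.

```python
import re

# A tiny invented corpus, for illustration only.
text = "The court granted habeas corpus. A quid pro quo? The coup d'état failed."

# Tokenize: runs of letters (plus é and apostrophe, so d'état stays whole).
tokens = re.findall(r"[a-zé']+", text.lower())

# Criterion 1: every distinct token counts as an English word.
distinct_tokens = set(tokens)

# Criterion 2: exclude tokens we (arbitrarily) deem Latin/French borrowings.
loanwords = {"habeas", "corpus", "quid", "pro", "quo", "coup", "d'état"}
native_only = distinct_tokens - loanwords

print(len(distinct_tokens), "words if loanwords count")   # 12
print(len(native_only), "words if they don't")            # 5
```

Same two sentences, and the count more than halves -- now scale that disagreement up to the whole language.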
The beauty and the terror of the English language is that it is the most bastardized and unruly language ever to escape definition. It is spoken in so many ways by so many peoples that any real attempt to count the number of words with any degree of currency and accuracy is fairly pointless. So when the people over at the Global Language Monitor say that the English language has reached a million words, as no doubt they will do some time in April, let us take a moment to celebrate -- not because the count was in any way accurate, but because any excuse to celebrate this wonderfully messed-up language should be considered a good one. Time to break out the bubbly. :)