Friday, 27 March 2009

Friday of Ada Lovelace Week - it's been fun!

Welcome back! It's Ada Lovelace Friday, and the fifth and final post in this 5-day series. I honestly cannot imagine blogging 5 days a week every week. Not unless this was a real paying job. Which it's not. So A) kudos to anyone who has the dedication to post daily, and B) I hope you enjoy today's installment, an interview with a good friend of mine named Marina T-----. She received her Honours BSc in Biology and MSc (specializing in plant biology) from York University, and is currently working on her PhD at the University of Toronto in the Department of Molecular Genetics, studying exocrine pancreatic dysfunction in Shwachman-Diamond syndrome (!!).

Vellum: Hi! So What do you do now?
Marina: Well, like I mentioned above, PhD. I study in the laboratory of Dr. Johanna Rommens at the SickKids Research Institute in the MaRS Toronto Medical Discovery Tower and am a student at U of T.

V: What does that involve?
M: My PhD involves, of course, laboratory research as well as course work and seminars. Our group works on Shwachman-Diamond syndrome (SDS), a rare autosomal recessive disorder that results in exocrine pancreatic dysfunction (digestive impairment and malnutrition) and bone marrow failure (frequent infections and increased risk of leukemia).

My lab work primarily involves investigating interactions between the SBDS protein (the protein affected in SDS) and other proteins as well as investigating human cells depleted for SBDS in the hopes of gaining insight into the function of this protein. The idea is that if we can determine what this protein does, then we may be able to better manage the disorder in patients.

V: Why did you choose to go into the sciences? Was there a moment in your past where you thought, "hey, this is what I want to do?"
M: This is a tough question. I'm not sure when I first chose to go into sciences, but there was definitely a confirmation of this desire in OAC bio class. I remember we were learning about mitochondria and cellular respiration, the cell's process of generating energy from glucose (food), and I turned to my friend Steph sitting beside me and said: "All this stuff is going on, all the time, in every cell of your body." That became a phrase of ours -- "All this stuff, all the time" -- for the rest of the term, whenever we were faced with another crazy method that nature dreamed up.

V: What were the ratios of women/men like in your classes? Did you ever feel singled out?
M: No, never. In fact, I'm pretty sure (although I haven't seen numbers) that there are more women in the biological sciences than men. That is not the case when it comes to the professors, though; there the men are certainly in the majority.

V: Is there any overt/covert sexism in your field?
M: Well this is a pretty sensitive question. I can't say that I've ever experienced or been the victim of sexism, leastways that I've ever noticed. But you do notice that there are more male PIs than females, and that results in more males being recognized by awards, etc. But this is changing so I'm not too worried about it.
What does worry me is when women tell me: Don't use words like "explore" in a grant. Explore is a weak word; women love to use words like "explore" and less committed verbs. Instead use "investigate." That gets me, cause hey, I like the word "explore," and quite frankly science is all about "exploration." And why should I care if I 'sound like a woman'? It's like the ol' "you throw like a girl!" Well guess what, Sherlock? I AM a girl!!!

Also, a very prominent concern for women in science is the old doozy of how do you have a career and make babies too? Many universities, when hiring, ask candidates about their marital status and whether they plan on having children, which is a little disturbing. And there are always rumours that academia is nervous about hiring women, cause they tend to get pregnant as soon as they get a 'real' job. I have heard many women say things like "The best time to have a baby is when you're writing up your PhD."

V: Do you have any advice you'd like to share with any young women who are thinking of going into the sciences?
M: Yes, absolutely! The first thing you want to do is get yourself into a lab. Take lab courses as soon as you can and volunteer in a lab if you can. Learning about science and studying science is VERY different from doing research, so it's important to figure out if you like both before committing to an entire degree. And also, consider Ecology. Man, I wish I did! You get to travel to all these beautiful places to do your research! Haha, I wish I had done that sometimes, and now with global warming and the environment being such hot topics, ecology is definitely where it's at.

V: Thank you so much for your time!

Peace all -- and Happy Ada Lovelace Week ^__^


Thursday, 26 March 2009

Ada Lovelace Week Day 4: Nina Schor

Nina Schor, M.D., Ph.D., is a scientist at the University of Rochester Medical Center studying both neuroblastoma (a nerve-cell cancer that is the most common solid tumour of childhood outside the brain) and neurodegenerative disorders. Scientific American has called her "a reluctant poster-child" for women in the sciences, and as such, I feel like it's worth a post about her here.

Schor had been interested in the sciences from a very young age. While attending Benjamin Cardozo High School in New York City, she entered the Westinghouse competition (now called the Intel Science Talent Search) with an experiment that determined the effect of aldehydes (a type of chemical compound found in car exhaust) on the ability of plants to produce chlorophyll. In 1972 she became the first woman ever to win the competition -- it had been open to both sexes since 1949.

She went on to get her BS in Molecular Biophysics and Biochemistry at Yale in 1975, her PhD in Medical Biochemistry at Rockefeller in 1980, and her M.D. from Cornell University Medical School in 1981. I can't possibly list all her certifications here, and I honestly don't understand enough about her work to explain it. She has been named repeatedly to the Best Doctors in America list, has been the keynote speaker at a number of prestigious events, and is currently the William H. Eilinger Chair of Pediatrics at the University of Rochester School of Medicine and Dentistry. She and her team there are pioneering new treatments for neuroblastoma, as well as for Parkinson's and Alzheimer's diseases.

Here is a link to her profile at the University of Rochester Medical Center, where you can find a list of her current appointments and most recent articles. Oh, and a link to a brief interview with her over at



Wednesday, 25 March 2009

Wednesday of Ada Lovelace Week: Rosalind Franklin

Today's post is about another world-famous scientist -- one who should probably have won the Nobel Prize along with Watson and Crick for the discovery of the structure of DNA: Rosalind Franklin.

Franklin did other work as well, of course. She did a great deal of work on the structure of viruses, specifically the Tobacco Mosaic Virus, the first virus discovered, as well as on the structure of coal. But it is for her contribution to the discovery of the double-helix structure of DNA that she is now most well-known.

Rosalind Franklin was born on the 25th of July, 1920 in Notting Hill, London, to her parents Muriel Frances Waley and Ellis Arthur Franklin. She attended Newnham College, Cambridge for her BA, though she was only awarded a titular degree (a degree in title only -- women weren't allowed to have "real" degrees from Cambridge at the time, you see). Nevertheless, her PhD, which she received in 1945 for her thesis, entitled "The physical chemistry of solid organic colloids with special reference to coal and related materials," was awarded without stipulation. Thank heaven for small mercies, I suppose.

After World War II, Franklin went to Paris and worked at the Central Laboratory of the National Chemical Services (Laboratoire central des services chimiques de l'État) for three years before accepting a position at King's College in London. Because she had been working with X-ray imaging techniques in Paris, she was assigned to work with Maurice Wilkins and his student Raymond Gosling, taking over the supervision of Gosling, and also the imaging portion of the early work they were doing on DNA. Using a high-focus X-ray microcamera which she modified herself, Franklin captured a series of images of DNA that were instrumental in the discovery of the structure and function of the molecule. (See here for the infamous Photo 51).

At this point, Watson and Crick were working over at Cambridge on the same thing, and there was friendly rivalry as well as collaboration between the two teams over the years. There has been a bit of controversy over this, but these days it's pretty well accepted that the real answer to the question "who discovered the double-helix structure of DNA?" is Watson, Crick, Wilkins and Franklin. Franklin's data were in fact used by Watson and Crick in their hypothesis on the structure of DNA, which eventually won them the Nobel Prize.

"Why, then, have I never heard of Rosalind Franklin?" you may ask. I know I did. The reason is that, unlike the other three, who shared the Nobel Prize in 1962 for the discovery, Franklin did not. First and foremost, I suppose, this is because the Nobel Prize is never awarded posthumously. In a tragic turn of events, Rosalind Franklin died in 1958 of complications resulting from ovarian cancer. She was 37 years of age. Second is the reason why we are in need of a whole week's worth of posts on women in science: whatever other reasons may surface, I find it hard to believe that the fact of her being female didn't play a part in the strange absence of her name from my high school curriculum.

But I digress: Rosalind Franklin made a great contribution to science, and is even now being recognized for that. The National Cancer Institute now offers a yearly "Rosalind E. Franklin Award for Women in Science," and her portrait has been hanging next to those of Wilkins, Watson and Crick at the National Portrait Gallery in London for a decade now.

So hey, next time you mention Watson and Crick, don't forget about Franklin and Wilkins.

Now you know.

Tomorrow: Nina Schor.



Tuesday, 24 March 2009

Ada Lovelace Day!

Okay, so it's Ada Lovelace Day, the international internet-based day for honouring women in the sciences, and so as a part of that, I've decided to post an interview with a programmer friend of mine who does an awful lot of sciencey things that I don't understand. For instance using "field programmable gate arrays" that let you "hardcode on the fly." The quotation marks are there to indicate the bits I didn't understand in that sentence.

So without further ado: Her name is Olivia -------. She got her Master of Engineering degree in mechanical engineering at McGill University in Canada, where her thesis was titled "A Stability and Control System for a Hexapod Underwater Robot." You can view a video segment on the robot she worked on here. Olivia now works for a company in Texas called Awesomesauce Inc*.

I got my Masters in Medieval Studies and am now an unemployed blogger -- let that be a lesson to you, kids. ^__^

Vellum: Where to start? I guess "What do you do?" is basically my main question, but, because I know next to nothing about science, it's easiest to go about it in a roundabout fashion. I'm a humanities major, so let's start with basics: What does Awesomesauce Inc. do?
Olivia: The main thing we sell is Awesomesauce. It's a visual programming language.

V: What's a visual programming language?
O: Okay, so you know how most people know C and Java? That's text-based programming -- you code line by line. Visual programming uses icons to represent certain functions, and you link between functions with wires.

V: What are the benefits of doing that?
O: It's more intuitive for some people; some people work better visually.
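I won't pretend to know what Awesomesauce's wires look like under the hood, but the dataflow idea Olivia describes -- blocks as functions, wires carrying outputs to inputs -- can be sketched in a few lines of ordinary Python. Everything here (the `evaluate` function, the toy graph) is my own illustration, not anything from her company's product:

```python
# A toy dataflow model: each node is a function, and "wires" say
# which nodes' outputs feed into which node's inputs.
def evaluate(graph, node, cache=None):
    """Evaluate a node by first evaluating everything wired into it."""
    if cache is None:
        cache = {}
    if node in cache:               # each node runs once, like a real block
        return cache[node]
    func, input_nodes = graph[node]
    args = [evaluate(graph, src, cache) for src in input_nodes]
    cache[node] = func(*args)
    return cache[node]

# Wiring: two constant blocks feed an adder, whose output feeds a doubler.
graph = {
    "a":      (lambda: 2, []),
    "b":      (lambda: 3, []),
    "add":    (lambda x, y: x + y, ["a", "b"]),
    "double": (lambda x: 2 * x, ["add"]),
}
print(evaluate(graph, "double"))  # prints 10
```

In a visual language you'd draw those four blocks and three wires on a canvas instead of typing the dictionary, which is exactly the "some people work better visually" point.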

V: So your title at Awesomesauce Inc. is Applications Engineer -- what do you actually do?
O: Well, basically we come up with new applications for Awesomesauce Inc.'s products and help people use them. Right now we're trying to create a sales demo of a certain application platform -- the idea is to prove what you can do, then go to schools and show them. Like at Queen's University, where I did my undergraduate work, I took a mechanics course doing simple robots: every week was a different task. For example one week we'd get robots to follow one another. All that work was text-based. Now we're trying to show students you can do similar things in visual-based programming languages.

V: What were the ratios of men to women in your programs at university?
O: At Queen's it was probably 25% girls. [V: Less in some cases. Click here for a class shot of one of Olivia's undergrad classes!]

V: Was that intimidating?
O: It wasn't intimidating at Queen's. At McGill there were people who thought I didn't belong. But for the most part people were nice about it, and yeah, some people were condescending, but mostly people treated you like an equal. I mean, okay, you know the Discovery Channel video -- there were times when I was the only girl around at the beach, and people assumed I was someone's girlfriend. They'd ask you to hold the cable or whatever, assuming I wasn't working on it -- I had one guy come up to the group of us and ask about the robot and its control systems, and one of the other guys on the project had to tell him, you know, ask her: she's the one working on them.

V: You mentioned in an earlier conversation a robotics competition you went to -- can you tell me about that?
O: It was a high school level competition called FIRST Robotics (For Inspiration and Recognition of Science and Technology) -- the main idea was that the robot's supposed to go around and pick up "moon rocks" and throw them at other robots. The arena was set up on this mat so that it had 1/6 the friction of a normal environment. One of the problems we were having was with the static electricity. They weren't expecting that the mats would cause so much static -- they had to wet them down I think.

V: So these are the best and brightest of the next generation?
O: I swear a lot of them are smarter than me. It was cool. There was one all-girls team that was totally kicking ass. This senior was telling me how she had a full scholarship to a bunch of universities.

V: So what's next?
O: The regionals are in Dallas, and whoever goes to the next one after that is going to Atlanta, but I won't get to go to that. There's a bunch of people helping these kids out at work. We made donations of hardware and copies of Awesomesauce and try to help out where we can.

V: It sounds like you're doing a lot of good work. Thanks for your time. One last question: is there anything you'd say to young women who want to go into the sciences?
O: (laughs) Go for it?


Happy Ada Lovelace Day, all.



* Awesomesauce Inc. / Awesomesauce are not registered trademarks of Vaulting and Vellum, but rather pseudonyms so as not to get certain persons in trouble for representing a certain company (for good or ill) without prior approval.

P.S. -- Here is the link for the aquatic robot's home page. :)

Monday, 23 March 2009

Ada Lovelace "Week" Kicks Off

Tomorrow, March 24th, is Ada Lovelace Day, a day on the internet and in other media for recognizing the contributions of women excelling in science and technology. The idea is/was that everyone should post to their blog tomorrow about a female scientist, programmer, engineer, &c. -- my problem was that I couldn't decide on just one. So on Vaulting and Vellum this week, each day from today until Friday, I will be posting about a different inspirational woman in science and technology.

Given that I'm essentially making this Ada Lovelace Week here, I figured I had best start with a post on just who Ada Lovelace was, and why a week on women in science should bear her name.

Augusta Ada King, The Right Honourable The Countess of Lovelace, was born 10 December 1815 to Anne Isabella Milbanke and her husband George Gordon Byron, the Sixth Baron Byron (and, yes, the infamous poet Lord Byron). At that time she was called Augusta Ada Byron, and she was the only legitimate child Byron ever sired.

Her parents' marriage soon dissolved, followed by her father's permanent departure from England. Byron died in 1824, never having played a significant role in his daughter's life. It is, however, rumoured that her mother's insistence on her instruction in maths and sciences was intended to instill in her a rationality Anne felt Byron sorely lacked.

Ada was introduced to Charles Babbage, the so-called "father of the computer" and inventor of what is commonly held to be the first mechanical computer, in 1833. This was the beginning of a fruitful friendship that resulted in the reason we still remember Ada Lovelace today.

A decade later -- after she had married William King, the Eighth Baron King and (from 1838 on) First Earl of Lovelace (and thus become Ada Lovelace) -- she worked on a translation of a paper by Luigi Menabrea, an Italian mathematician, concerning Babbage's latest machine. This translation was, of course, for her friend Babbage, who was by this point in the habit of calling her "the Enchantress of Numbers." Appended to her translation was a series of personal notes, longer, in fact, than the translation itself, which included an algebraic method for using Babbage's machine, the Analytical Engine, to calculate a sequence of Bernoulli numbers. I link here to the Wikipedia article for Bernoulli numbers, because I have no idea what they are, and can make neither heads nor tails of their description.

What this method amounted to, however, was a series of instructions to be run on the machine; and so in much the same way as we consider the Analytical Engine to have been a precursor to the first computers, we now consider Ada Lovelace's method to be analogous to the first computer program.
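For anyone as curious about those mysterious Bernoulli numbers as I was: they obey a simple recurrence, and a few lines of modern Python can generate the sequence Lovelace's program was designed to produce. This is strictly my own sketch of the mathematics -- Lovelace, working on paper for a machine that was never completed, obviously did it rather differently:

```python
from fractions import Fraction
from math import comb

def bernoulli(n):
    """Return B_0 .. B_n as exact fractions, using the classic recurrence
    sum over k from 0 to m of C(m+1, k) * B_k = 0 (with the B_1 = -1/2 convention)."""
    B = [Fraction(1)]                       # B_0 = 1
    for m in range(1, n + 1):
        s = sum(Fraction(comb(m + 1, k)) * B[k] for k in range(m))
        B.append(-s / (m + 1))              # solve the recurrence for B_m
    return B

print(bernoulli(4))
# B_0 = 1, B_1 = -1/2, B_2 = 1/6, B_3 = 0, B_4 = -1/30
```

Each number is "a series of instructions" away from the ones before it, which is precisely why the sequence made such a good demonstration piece for a programmable machine.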

In brief, Ada Lovelace is the mother of modern computing, and the very first computer programmer. This is why this week here bears her name, and this is why she ought to be very dear to all of us on this wonderful series of tubes.

Live and learn.

Tomorrow, I interview an old friend of mine working as a computer programmer in the present day. No Analytical Engines in that post -- sorry steampunk fans -- but there'll be robots, so come back and see!



Monday, 16 March 2009


Followed the link from ahistoricality to The Progressive Quiz - 40 questions about your political leanings.

Vaulting scored 365 / 400.
Vellum scored 336 / 400.

Make of that what you will.

Sunday, 15 March 2009

BSG stands for what, again?


After recent posts at every blog from Alas to Unlocked Wordhoard (and everything in between) I find myself forced to ask: what is the big deal about Battlestar Galactica?

At the risk of coming out of the closet (no, the -other- closet ~__^) I have to come clean: I tried to like it, I really did. I watched the better part of a season. But then I missed a couple of episodes and couldn't get back into it. I like sci-fi. I read it, I watch it, I enjoy it. But I just don't understand what it is about BSG that attracts bloggers like fruitflies to that uneaten bunch of rotting bananas sitting on your counter. Yes, those bananas.

Oh my God they're searching for Earth! It's like it's 'a long time ago in a galaxy far far away' but it's in OUR galaxy! To borrow a phrase: "Wait! The statue of Liberty? That was OUR planet! You bastards! You blew it up! You blew it up!"

I remember trying to like Lost, and then not, for much the same reason (except this time I'm not being asked to understand why, if they're lost on a deserted tropical island, there's a polar bear in a cave and, also, SUBURBIA). It's a show that built a cult and then built a walled garden that only those who had joined could frolic in. And now they're frolicking all over the blogs I love to read, and I'm not allowed in. And it irks me.

I'm sure it's a good show. I am. And it's a free country/internet and such. Go, frolic as you will. Just don't ask me to be happy about it.

Or to care who the final damned Cylon is.

My bet? We're ALL Cylons.




Saturday, 14 March 2009

Media's reporting of scholarship

I was prepared to do another post on my television watching habits, but then I thought, you know, that's probably a bit excessive. So regardless of how offensive I found it that The History Channel *coughcough* could not find a single woman to appear in its hour-long show on the Second Punic War, you will not get a feminist rant from me tonight. Instead, you will get a rant on another form of media.


Oh, BBC News.

Many love you. I know I certainly appreciate your perspective and wide-angle news. Then again, I (currently) live in the US, where every news outlet covers the same 5 talking points ad nauseam. You provide a bit of colour and perspective to my news.

But I have a teensy bone to pick. Minutiae, really. Just details. But... could I ask if you might consider doing a little research before you report? Or perhaps finding someone who understands the topic to write about said topic?

The reason I ask is that recently, a number of your articles online have proved... how should I put this... erroneous? Misleading? Misguided? Poorly judged? I expect that you, of all news sources, know this already, but perhaps I should ask--

You do know that simply because someone says it is so, that doesn't make it so?

Right. Let me give you an example. Today, you posted an intriguing article entitled, "Caravaggio was early 'photographer'." I was very curious to see where this would lead. I am, after all, an art historian in dire need of employment. The subtitle to this lovely article was, "Caravaggio used an early form of photography to create his masterpieces - 200 years before the invention of the camera, a researcher has claimed."

Intriguing! I cried, hurrying to the body of the article.

And what did I find there? An unsubstantiated claim made by a "teacher" -- whose only credential, your article suggests, is that she's a "teacher" at a "prestigious" art school in Florence -- and, as we all know, if you're from Florence, you know your art.

You, BBC News, provide evidence that this theory is brilliant and ground-breaking via the following:

She believes [Caravaggio] could have used a photoluminescent powder from crushed fireflies, which was used at the time to create special effects in theatre productions. "There is lots of proof, notably the fact that Caravaggio never made preliminary sketches. So it is plausible that he used these 'projections' to paint," she said.


Please note, one cannot demonstrate 'proof' of something by a lack of something else.

Before I let the snark run away with me any further, let me point out that Dr. Lapucci does in fact seem to be a reputable scholar who has made a career out of studying Caravaggio and thus, I would imagine, knows what she is talking about. This is not, in fact, the first time that a scholar has been made out to be an idiot in a BBC News article - nor the first time in the past two weeks. Despite how she is represented in this article, I would wager that there is, in fact, real evidence supporting her theory - evidence which I would love to hear about. But I certainly won't hear about it from this article.

BBC News clearly thinks its readers are morons. And perhaps we are. But may I venture that we have seen enough films about romantic artists throughout history - and that we all played with pinhole cameras in 3rd grade science - to recognize the term 'camera obscura'? You would suggest no, opting instead to describe Caravaggio as "illuminat[ing] his models through a hole in the ceiling."

A skylight, perhaps?

All right, I've finished: I'll be serious now. Upon first reading this article, I laughed very hard. How absurd -- proposing that Caravaggio used crushed fireflies to fix the image on his canvas, solely on the evidence that he used left-handed models and that there aren't any surviving sketches? Good lord! What pathetic drivel! Upon reassessment, of course, I realized that the absurdity was not in the theory, but in the reporting. Dr. Lapucci probably released a great deal of information about this theory, but whoever wrote this dreadful article could not pick out enough information to provide a coherent understanding of the theory. S/he had to conclude with a vigorous insistence that "such techniques did not detract from the artist's work." Well, thank you for the clarification -- but perhaps we could focus on how innovative this technique was?

I understand the need for short, pithy articles for the website - but the many misleading articles I have read on the BBC News site recently are not just short on information, but painting an entirely insufficient picture of the news they are supposedly reporting. Many scholars have "ground-breaking" theories, and push scholarship ahead almost daily. Of course I believe that more of these theories should be widely publicized, but certainly not at the expense of an accurate representation. No matter how fascinating I find this, Dr. Lapucci's theory is no more important than the theories being put forward by other scholars in journals and books all the time. God only knows how BBC News got hold of this scholarship, but in light of how it - and many other morsels of research - has been reported, I would suggest that we all keep our work close to the vest, and out of their prying fingers.

Spitefully yours,

Vellum: The Post

I figured it was about time I did a little post on my namesake: vellum.

I'm going to go ahead and assume that if you read this blog you know what vellum is, or at the very least, have enough internet savvy to use wikipedia, so we can skip that: it's not paper, it's stretched, dried, shaved, not-tanned leather. And you write on it.

So of course the first thing I did was to go find an in-depth how-to-make-vellum video, otherwise known as season five, episode eight of Dirty Jobs with Mike Rowe. Now, some may try to tell you that, in fact, the job Vellum Maker is covered by season four, episode twenty-six. They are wrong. I know this, because The Pirate Bay knows this. Fingers crossed for those guys for April 17th, by the way.

In the episode, Mike heads over to the Meyer Tannery in Montgomery, New York, which is also the home of the vellum manufacturing facility known as Pergamena. The guy in charge, Jesse Meyer (that's no coincidence on the name, by the way), shows us how horribly time-consuming and labour-intensive it is to produce this stuff -- and that's already after you've killed the goat/cow/whatever and somehow removed its skin in more-or-less one piece. No, rabbits aren't big enough. Well, maybe for a moleskin. But then, so is a mole.

But I digress: once the skins are off, the steps to making your own illuminated manuscript are as follows:

1. Trim off the ears and tail so that you can lay it flat.

2. Soak the skins in a lime slurry for a week.

3. Remove the hair.

4. Remove any remaining flesh.

5. Stretch and dry the skin.

6. Using a Lunellum/Lunellarium/Moon Knife, scrape the heck out of the dried skin.

7. Sand the crap out of it with a disk sander.

Note on Step 7 -- Not the Medieval Way.

8. Cut to size and proceed to bind as per these French instructions (with sound!).

9. Inscribe and illuminate to your heart's content.

Note on Step 9 -- May require years of indentured service as a monk to learn proper scribal and illumination techniques.

Now you know: the only thing keeping you from doing it yourself (aside from, you know, not having a heap of dead goats, a ready supply of lime slurry, a large room full of skin stretchers, and a Moon Knife) is technique.

Well, guess what? Mr. Meyer over at Pergamena now runs two-day workshops on how to do it yourself. So if you're in the neighbourhood and want to learn, go to the site and check it out: click Here, then click News. You'll need to contact Jesse to book it, I think. The cost seems to be $250. I'd do it if I had the money.

And now, some parting words from Mr. Rowe:

"There may be more than one way to skin a cat, but a cow is a horse of a different colour."



Thursday, 12 March 2009

When Will There Be Good News? Now, It Seems.

As a rule, I don't normally post personal things to this blog, but, as rules are made to be broken: I received good news this week. In September, barring any major Latin-based catastrophes, I will be enrolled in a Medieval Studies PhD program at pretty much the only Canadian medieval institution worthy of note. We shall henceforth thus refer to it as the Canuck Medieval Studies Institute, hosted by Northern Colossal University (Not Their Real Names).

Thanks to those of you who thought I'd get in, even when I didn't.


Monday, 9 March 2009

Supporting the Nielsen surveys

I don't watch a lot of TV. Blame it on too many years without a means of viewing cable (sophomore year of undergrad: ecstatic to find BBC News on the rabbit ears), or too much relocation (junior year: TV is not as much fun when everyone is speaking Italian), or just plain old too-high expectations. Regardless, my TV watching consists of occasional Mythbusters, downloaded episodes of Doctor Who, and once in a while, a nostalgic episode of The X-Files.

So imagine my dismay and ironic amusement at finding a postcard in the mail, informing me that I would be contributing to that great surveying body, the Nielsen Ratings. Apparently, I will receive a substantial package in several days, which will ask me about my TV viewing habits. My initial plan was simply to respond to every question with Pushing Daisies (recently cancelled).

Favorite TV show? Pushing Daisies.
Programmes you regularly tune in to? Pushing Daisies.
Typical viewing time? Pushing Daisies.


(please note, this is mostly out of a righteous sense of vindication for fans of the late television show: I only ever watched a couple episodes. I did, however, enjoy those episodes).

But then it occurred to me that perhaps I should be a little more responsible. After all, I actually give thought to my programming choices; not everyone does. Isn't it about time that someone called out The History Channel's bizarre, conspiracy-riddled programming? Or praised the Sci-Fi channel's continuous rotation of Star Trek reruns? Or begged for the end of evening soap operas? (Brothers and Sisters, Grey's Anatomy, etc. etc. and, alas, etc.) So I took the opportunity to sit down in front of the TV tonight to gain some insight into the current offerings.

And actually put in the effort to switch away from M*A*S*H reruns. I do enjoy M*A*S*H.

My SO and I ended up watching two hours of Top Gear, which I'm not sure qualifies (Nielsen probably has a very good idea of who's watching BBC America, and it's probably people just like me: i.e. not likely to benefit advertising campaigns). But we finished off the evening with some good quality American entertainment -- and a pilot episode, no less! So here you have my opinions on Castle, ABC's latest crime/thriller/Law and Order wannabe.

Castle (to give you a quick premise) is about a murder/thriller author (who is not at all like Stephen King, in any way, shape, or form, and clearly his name has nothing to do with Stephen King, either) who finds himself dragged into a murder investigation, headed by a gorgeous and witty but brainy and not made-up detective. He hangs about, looking for inspiration, and helps to break the case. There's lots of unresolved sexual tension, because he's a playboy and she's Brainy, so clearly not interested in him.

Perhaps you can guess where my criticism is going.

I enjoyed the episode; it was entertaining, Nathan Fillion is a riot, and even though I have no idea who Stana Katic is, I quite liked her, as well. However.


This is a crime show. There are two main characters. Notice which one the series is named after. Ok, fair enough: he's the interesting bit, the bit that separates it from Law and Order and all its many copies. But the entire show is just another case of two leads, the interesting/quirky man and the strait-laced woman. Can't we have a quirky woman? Instead, we have the professional detective who obviously has issues because she's in charge of a bunch of men, and doesn't seem to have a social life, and doesn't like flirty men. OMG, she had a TRAGIC PAST, clearly! And she doesn't wear make-up! And she's a control-freak! Clearly, she needs saving.

I'm hoping this isn't actually the case, but things certainly seem to be pointing in that direction. All this evidence that we are not, in fact, in a post-sexist society is balanced out, however, by the wit in the show. The round-table of the crime authors (including the real-life Stephen Cannell and James Patterson) was hilarious, as was the reveal at the conclusion of the hold-up-at-gunpoint: the safety was on.

So I'll give it another shot. I don't like to dismiss things outright simply because they are still caught in our male-dominated world. So I'll give the show kudos for a strong female lead who refuses to take shit, usually with a sidelong grin. But I'm certainly not impressed. Regardless, this will give me good fodder for the Nielsen survey: at least I can say I've watched some contemporary programming that didn't involve blowing up cars, dynamite, outhouses or alien planets.*


*In order, those are Top Gear, Mythbusters, M*A*S*H and Doctor Who. Though scant, my TV habits at least cover a broad spectrum of genres.

Monday, 2 March 2009

The History of Words, a follow-up post

Okay. I've now read in full the article by Professor Pagel et al. and discussed the BBC article with him by e-mail. While I still can't help but laugh at the William the Conqueror quote, I hereby assign most of the blame for the kerfuffle to a combination of crossed wires and bad reporting. The article, yes, had something to do with the English language, but not an awful lot. It had more to do with the descent of certain words from proto-Indo-European into twenty-seven (yes, that's right, 27) different Indo-European languages.

See, here's what I think the problem is. If I were to say to a reporter that the word "heart" is 10,000 years old, that means different things to different people. Having studied the history of the English language (and of others), I can tell you that "heart" is pretty much the same word as "cardio" because of a series of sound changes that turned ancestral words with a "k" sound into more modern words (of Germanic descent) with an "h" sound. We can say with a pretty good degree of certainty that the proto-Indo-European word for "heart" was probably some kind of inflected form of a word that may have sounded an awful lot like "kart" or "card."

But if you tell someone who doesn't study this stuff that the word "heart" is 10,000 years old, chances are they'll think you could take a time machine back and use the word in its present form to talk to "cavemen."*

So when Professor Pagel says that the word "I" is a very old word indeed, it means that some cognate of the word "I" (maybe "ego" or "ich" -- like "cardio" for "heart") was probably spoken 10,000 years ago. It might have sounded less like the modern word "I" than like "itch" or "eggo," but it would be the same word.

This is what the research is getting at.

The most interesting finding in the article, so far as I can tell, is the counterintuitive stability of the most repeated words. One might rather think that, given the frequency with which they're spoken (upwards of 35,000 times per million words, as opposed to most words, which are used fewer than 100 times per million words), these words would change most quickly. But instead it seems that their repetition, simplicity and lack of inflection protect them quite well as the language evolves.

As for what the article says about English, it does reiterate the findings of an earlier study that says that the most frequently used modern English words are more likely to be Old English in origin (as opposed to Anglo-Norman French, or Norse, or Latin), but as for the article itself, it tends to stick to the abstract and focus on the descent of cognate words from truly ancient languages to our own.

Perhaps Professor Pagel saw the glazed-over look of confusion in the reporter's eye, and tried to dumb things down for the sake of a simple explanation, like the way a teacher tells you that atoms are "very, very small" -- but I can't help but think that no matter what actually happened, one thing is certain: the BBC reporter didn't understand what was going on, and now, neither will most of his or her readers.

Oh, and if you ever do get a time machine and go back to talk to our ancestors, remember two things: One, if you're going back to talk to William the Conqueror (né "the Bastard"), take an Old French dictionary, not an Old English one, and two, if you're going back 10,000 years, just remember that you're about as likely to be understood saying "Me, Tarzan" as you would be saying "Leggo my Eggo."



*No, not primitive nomadic people pretty much identical to ourselves. Cavemen. Like in the Geico ads.