What ho, mimsy borogroves!

by Jennifer Z. Gong

2013 Year in Review

I’ve never done one of these before, but it seems to be good practice. This was a hurry-up-and-wait sort of year: a lot of long-term things either done cooking or still ongoing. Events that stand out to me, 2 hrs until midnight:

January: Tibial stress fracture puts me on crutches for six months. Arm and shoulder muscles develop enough to speed me through NYC rush hour even in pouring rain.

February: Commuting to Midtown NYC on crutches is simply unsustainable. Begin working from home.

April: Moving-to-San-Francisco-So-Eat-Everything-in-Our-Kitchen party. “Drink with Me” from Les Misérables is sung.

Menu: 10 lbs slow roasted and crisped pork shoulder w/ Chinese steamed rolls and Thai chili sauce w/ cucumber slices, 3 lbs braised brisket with au poivre sauce, arugula, chickpea, and beet salad with lemon-olive oil dressing, thyme polenta with roast chicken drippings, green beans, Japanese/Korean-style shredded kelp and sesame salad, Korean seaweed and cod soup with shiitake and kelp broth, strawberry tart, strawberry and ladyfingers cake with mousse and rosewater cream, madeleine cake, fruit tart, truffle salt caramels, 4 bottles of wine, and some whiskey. Still have 4 more bottles of wine alas. And cognac.

April: Very agitated about Boston. No friends hurt, despite some running the marathon. “When I was a boy, and I would see scary things on the news, my mother would say to me, ‘Look for the helpers. You will always find people who are helping.’” -Fred Rogers

May: Celebrate our second year of marriage, almost tenth(!) year of relationship, with my ineffably remarkable husband.

May: Husband defends PhD thesis! (He passed.)

May: Fantastic cross-country road trip with my husband, stopping in Chicago, the Badlands, Yellowstone, and Elko (Nevada) overnight. Spent an extra day each in the Badlands and in Yellowstone. A+, 5 stars, would drive again.

June: Move into new apartment in San Francisco.

June: March in San Francisco Gay Pride parade as part of Dropbox streetcar.

July: Finally realize that when the blonde says to Neo, “let’s go, coppertop,” she is referring to him as a human battery, not as a secret ginger.

July: Start attending SF area meetups.

August: Housewarming party. Five desserts. Coin the word “bake-nado”, e.g. “I was baking a lot. I was in a bake-nado.” I accidentally made an extra cake and a Gâteau St. Honoré.

September: Feeling like an adult spending a weekend in Napa staying at a cottage with friends. Hit up Ad Hoc and Bouchon while I’m at it. Faintly underwhelmed.

September: Folsom Street Fair. Underwhelmed.

October: Celebrating ten years of a relationship with my husband.

November: Help organize and cook for Thanksgivukkah for about a dozen people. Also discover how to make dairy-free chocolate mousse and how to spatchcock a bird.

December: Start thinking about new year’s resolutions; realize I’ve forgotten all the ones I made for this year.

Today is National Poetry Day

I was recently advised I should indulge myself in writing, as foolish or bad as it could be, instead of limiting myself to editing. Today seemed like an okay day for that. Forgive, forgive.

Frosted Windows, or Going Out

I taught myself how to say good-bye
before I understood
Some things you don’t forget.

The humming chorus of light
through ice and glass
so warm in your reaching
and colder still
than the numb familiar will of
tile and wood and home,
pulling me back and fracturing
what I told myself
I’d only guard for others.

But I always kept a glowing shard,
to soften and sand,
safe in my Insides because
my heart couldn’t be my own.

But now I’m grown, and
the house-shadows only
warn me
of the sun that burns
and how time not-wasted on my guarded heart
can melt even glass and stone
and years of regret.

I pretend I’m not a friend of good-bye,
and turn my face to the burning light.

The root of bad writing?

Writers frequently present me with the following two problems that I suspect have a common cause: they worry too much about words.

1. “Why doesn’t this work?” Make sure the words and sentences have muscle. Use the limited amount of time and space to say (or not say) the most you can.

I like the mental exercise of imagining the most impatient critic who is juggling five chainsaws while riding a unicycle across a tightrope with hornets flying around his nose. You want to grab his attention. You can’t SCREAM at him because you’ll sound like the chainsaws, but you can’t mumble because he won’t bother straining to hear you. The scream is the gimmicky, hyperbolic, idiom-filled, conspiratorial declaration. The mumble is the wordy, self-referential, disclaimer-filled, hesitant suggestion.

It’s a crude analogy, but it works, because it forces you to ask of your sacrosanct writing: Why should your sentence matter more than anyone else’s? You do not have the luxury of calling him your friend/colleague/relative/hairdresser, who might have told you your work is great and interesting, honestly. Why should he care? Why should you survive his triage of things that matter? Simple, strong, and efficient writing states the stakes, and gets his attention.

2. Writer’s block? When a writer comes to me complaining that he can’t articulate the nuance of a brilliant multi-faceted concept, or that this one sentence just isn’t working, I often ask him to take his time and talk out loud to me what he wants to say. (Writers can be “she”s, but for the sake of simplicity and not using s/he and his/hers throughout, I’m sticking with “he”.) My reason for this relates to Michael Billig’s article calling for a re-assessment of the power of “ordinary words,” where he says:

By using nouns or verbs in the passive voice, authorities can present their own decisions as if they were objective realities, rather than as actions arbitrarily taken by powerful persons.

Calvin: "I realized that the purpose of writing is to inflate weak ideas, obscure poor reasoning, and inhibit clarity. With a little practice, writing can be an intimidating and impenetrable fog! Want to see my book report?" // Hobbes: "The Dynamics of Interbeing and Monological Imperatives in Dick and Jane: A Study in Psychic Transrelational Gender Modes" // Calvin: "Academia, here I come!"

My favorite Calvin & Hobbes comic about academic writing. I have it on my wall in my office. (by Bill Watterson)

Too often, when we write, we worry about pleasing our audience or sounding important. It’s a sad reality that people genuinely solidify their opinions based on first impressions, or surface appearances, or showy credentials that have nothing to do with the work before them. Practicality forces us to bend to this tendency, but it can freeze good writing. When I’m developing ideas and writing with someone, often they give me a 30-word sentence that would make their vocabulary instructor proud. I simply say, “write down what you just said to me.” Then see if the words have muscle and revise.

That’s if you’re lucky, if the words are merely jumbled and you can’t comb through them. Sometimes you’re looking at a blank screen or sheet, you have a deadline, and there is nothing you can do. The ink has dried up.




…in all seriousness, think of the trouble professional authors get into if they must force creativity on demand. There are tons of resources out there with tips on breaking writer’s block. My favorites for fiction writers are to imagine the most surprising thing that could happen in a scene, or what would happen if the scene did not exist. For academic writing, I ask my writer: what do you want the reader to leave this section thinking? What kind of spark or action or motivation or perspective do you want to create? Think not in terms of the information you want to deliver, but the short- and long-term impact you want to make. The answer is often over-ambitious, but it gets the gears turning, so that the writer can start jotting down related word clouds and figure out what information he actually does want to deliver.

Earlier in the post, I suggested that the root problem is that many writers worry too much about words. They worry about words so much, they forget to think about the ideas, even though many of them have brilliant ideas. To borrow Calvin’s words in the comic above, “inflat[ing] weak ideas” is what often happens when you neglect reasoning and clarity, and all you get is an empty shell that cannot hold up to posterity. I jokingly warn people that I’ll “Gertrude Stein” whole paragraphs, a reference to a (probably apocryphal) story that Gertrude Stein crossed out all the adjectives in Ernest Hemingway’s manuscript when she was giving him feedback. One good reason for a “fresh pair of eyes” is not because you can’t spot mistakes anymore, but because sometimes, your familiarity with the text can prevent you from cutting words that really need to go. Someone new with no investment and who is not best friends with that great phrase on page 8 realizes it has no business in your essay and will strike it. It’ll be a cleaner and better piece, and maybe you can use it somewhere else. Think of it as going to a better place. After all, you’re writing. You’re not building the kitchen sink.

Required reading: George Orwell explains good writing better than anyone has and possibly ever will in his essay, “Politics and the English Language”. Randall Munroe’s ambitious exercise explains a space project using only the 1000 most frequently-used words in Up Goer Five.

Postscript: Yes, I know the ending two phrases to this piece are self-indulgent. Ironically, I’m not a very good writer, but among the precious few things I am unabashedly proud of are my editing skills. Also, none of the above is to be construed as professional advice. Every writer and piece has different needs at different stages and levels of detail.

[antitrust class action] allegations of nearly a decade of instant noodle price fixing by 4 South Korean companies

The Korean Fair Trade Commission found several South Korean food conglomerates to be colluding to fix prices in Korea, and now the original plaintiff is claiming the conspiracy extended to the US market. So Nong Shim, Ottogi, Samyang, and Korea Yakult/Paldo America have been named in an antitrust class action lawsuit, alleging they were part of a conspiracy to fix and increase the prices of Korean instant noodle products by 54% between 2001 and 2008.

The noodle companies blamed the price increases on rising costs of wheat flour, palm oil, vinyl wrapping, potato starch, and even the weather, Plaza says in the lawsuit. However, “As determined by the KFTC, the truth is that Korean Noodle price increases substantially exceeded increased input costs,” the complaint states. Plaza Co. claims the conspiracy extended to the U.S. market, and charts year-by-year price increases from 2003 to 2008. Nong Shim’s U.S. sales in 2006 totaled $47.7 million. The companies “sold hundreds of millions worth of noodles in the United States,” the lawsuit says.

To be fair, as a college graduate and therefore a regular consumer of instant ramen noodle cups (the best being Nong Shim, by virtue of my Korean then-boyfriend now-husband’s preference for it), I can say they are pretty high-quality ramen noodles with large servings compared to the Maruchan instant noodles I see more often in America. (It was surprisingly difficult to find an actual photo of a Maruchan and a Nong Shim cup with a true size comparison on Google Images, but plenty of heated debates exist online about the major noodle brands.)

Sure, Nong Shim was about $2 more than the 99¢ Maruchan Instant Noodle cups, but you didn’t get as much of that heavy, greasy, sadness feeling that comes with stuffing your stomach with cheap oil, starch, and sodium. I tried to eat a bowl recently and couldn’t stomach it, either because I’m much older, or because of lingering psychosomatic trauma from relying on them for late-night working sessions.

The Korean Fair Trade Commission fined Nong Shim 107.7 billion won, and Samyang 11.6 billion won. Ottogi was ordered to pay a 9.7 billion won fine, while Korea Yakult was hit with a 6.3 billion won fine, according to the English-language edition of the March 23, 2012 Yonhap News.

Still, as Joe Patrice from Above the Law says… I didn’t know that stuff could be cheaper.

You can read the Courthouse News bulletin here.

The Parsons Code Eliminates Earworms

For two weeks, I had a relentless earworm that randomly materialized and would not vanish. Musical friends could not identify it, Shazam and Soundhound were no help, and I could not even place the composer (Mozart? Beethoven?). The same few notes drove round and round my head and worst of all, I couldn’t recall the rest of the piece!

When my frustration had me systematically listening to my entire classical music library, my husband discovered something called the Parsons Code and then the Parsons Code Database for Melodic Contour. Musipedia explains the Parsons Code:

Each pair of consecutive notes is coded as “U” (“up”) if the second note is higher than the first note, “R” (“repeat”) if the pitches are equal, and “D” (“down”) otherwise. Rhythm is completely ignored. Thus, the first theme from Beethoven’s 8th symphony that is shown above would be coded DUUDDDURDRUUUU…

In his “Directory of Classical Themes” (Spencer Brown, 1975), D. Parsons showed that this simple encoding of tunes, which ignores most of the information in the musical signal, can still provide enough information for distinguishing between a large number of tunes.
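The encoding Musipedia describes is simple enough to sketch in a few lines. Here is a minimal, hypothetical Python version, assuming the melody arrives as a list of pitch numbers (e.g. MIDI note numbers) and using the common convention of writing the first note as “*”:

```python
def parsons_code(notes):
    """Encode a melody's contour as Parsons code.

    notes: a sequence of pitch numbers (e.g. MIDI note numbers).
    Each consecutive pair becomes "U" (up), "D" (down), or "R" (repeat);
    rhythm is ignored entirely. The first note is written as "*".
    """
    code = "*"
    for prev, curr in zip(notes, notes[1:]):
        if curr > prev:
            code += "U"
        elif curr < prev:
            code += "D"
        else:
            code += "R"
    return code

# Opening of "Twinkle, Twinkle, Little Star": C C G G A A G
print(parsons_code([60, 60, 67, 67, 69, 69, 67]))  # prints "*RURURD"
```

The resulting string is exactly the kind of query you can paste into Musipedia’s contour search, which is what makes the scheme so handy for earworms: you only need to remember whether the tune goes up or down, not the actual notes.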

(I feel like more people need to know about this remarkable system, though if I am late in the game, then I’m glad that I’ve finally come across it now.)

So what was the culprit?

Harpsichord Concerto No. 7 in G Minor, BWV 1058, by Johann Sebastian Bach

I had the first six measures stuck in my head in a loop for two weeks. Once I found the sheet music and played the rest of it, the notes stopped looping in my head!

Thanks, Musipedia.org, for preserving my sanity.

Advice for Entering Freshmen

What advice would have been useful for me to take before I entered college?

My college roommate’s little brother is entering Harvard next year. As a responsible big sister, she’s asked our friends what advice we could offer him. My answer boiled down to a few points that I realized are true about life as well.

1. People matter. Talk to EVERYONE. Classmates, professors, janitors, presidents, lecturers, museum curators, the people in the kitchen (very important for quality of life), campus police, people sitting next to you. Introduce yourself and ask them how their day is. Remember their names or something they find interesting about themselves. Share your dreams (your actual dreams, not what sounds nice to people) and listen to theirs. It’ll go a long way in making you feel comfortable and at home on campus. There’s something interesting about everyone, and having a strong sense of community will lend you confidence on bad days when you’re trudging around the Yard.

2. “Requirements” don’t matter as much as you think. Go for what you want. Don’t listen to anyone telling you what the “rules” or “usual” path someone follows through a concentration or requirement fulfillment are. You can petition your profs, tutors, proctors, etc. for almost anything, so long as you show you are prepared, you’ve done your research, and they won’t have to babysit you. Go for what you want, not what you think you ought to do or might find “useful” in the future, because Harvard’s going to give you great peers and a great analytical skillset, so whatever you’re passionate about, you’re going to be awesome at it regardless of what “useful” classes you didn’t take.

3. Regardless of authority or tenure, people who know you more intimately can be more helpful. Professors make the class, but sometimes, for larger classes like government, TFs make the class. Many of the TFs in gov’t and law have awesome careers elsewhere; get to know them if you can, and they can hook you up with great gigs if they see you’re passionate. College gives you a wild menagerie of options–sometimes you’ll need direction more than you’ll need “contacts.”

4. Seek help. Find a mentor, any mentor. Doesn’t have to be in your concentration, year, hall, House, class, whatever. Just find someone you can talk to, who can help you grow or sift through your whirlwind of experiences.

5. Take the time to indulge in curiosity. Shop classes you’re interested in but aren’t taking or won’t take because you don’t have enough time. Grab the syllabi and check out the assigned list of readings, and if you’re intrigued, go and read the books to give yourself a taste.

Good luck to all the prefroshies, and to the rest of us.

Professional scientists, “lay people,” and the truth

I recently heard an unusual, horrified outburst: “You’d let lay people administer a scientific experiment?”

Scientists have the right to be proud, but not to be too proud to fail. It got me thinking about ivory towers and the supposedly-unassailable authority that media often assigns to science. PhDs and MDs are a smart group of people. They peer into incredibly complicated mechanisms, try to explain the nearly invisible, and hunt down vital defenses against devastating illnesses. Science is increasingly specialized now, and scientists certainly deserve to be respected for their intelligence, dedication, and insight. Scientists, however, are not infallible–a large and necessary part of scientific progress is failure. Otherwise, many important discoveries might not be made. A failure in science is just as valuable as a successful end product (though some frustrated researchers might disagree).

It doesn’t matter who makes the discovery if the method is sound and the results can be reproduced countless times under peer review. The same applies to the concept of professional scientists versus amateurs: science is agnostic, and a result is a result no matter who discovers it. It only matters how. Gregor Mendel was educated (and incredibly patient), but he lived in a time when there was no professional “accreditation” for scientists. He was just a curious friar who wanted to figure things out. The teenagers who win the annual Siemens Foundation Competition–including seventeen-year-old Angela Zhang, with her cancer stem cell-destroying nanoparticle, and Joshua Kubiak, with a molecular scaffold that could make mounting chemicals used in medicine more efficient–do not have PhDs, though they were mentored by PhDs. Certainly, it’s more likely for someone trained in precise lab techniques and unbiased research design to produce a remarkable discovery or result (or failure). But a bright high school student can still strive to do the same.

Science is always striving towards the truth, but we’ll never know if we’ve reached it. Yes, science assumes there is an absolute truth we strive towards with each experiment. Each rigorous, unbiased (or as unbiased as possible), empirical result gets us a little closer to that truth. If we’re wrong, we revise all our operating principles, and it’s perfectly fine to change our minds because the empirical evidence has shown otherwise. Science has authority not because there are cartoon characters in lab coats titrating green, bubbling fluid between Erlenmeyer flasks, and not merely because the discoverer has a PhD.

When the media overemphasizes the authority of science in a discovery, it creates assumptions that anti-science groups rely upon to discredit the importance of science (and scientific results) in education and debate. That’s what I find dangerous about media reports that stress that scientists were the ones who made the discovery, as if that alone makes the finding (often misinterpreted) super-true. When this authority is turned against the discipline itself–for example, when people argue that science shouldn’t be trusted because scientists get things wrong–it’s because the media has used “science” to mean truth and authority, when in fact science is often wrong, and sometimes needs to be wrong. (I’d rather not link to these kinds of sites and give them more traffic, but I am referring to, though not exclusively, the kinds of creationist arguments used against teaching evolution in schools.)

By privileging science as an amorphous, unassailable authority, the media creates a mystery around the discipline that discourages people from entering or trying it. Anyone can do science, and that’s what’s amazing about peer review: everyone learns together (ideally, instead of sabotaging a competing lab or making up your own data). That kind of mystery is incredibly intimidating for curious thinkers who lack scientific backgrounds but might otherwise pursue their passions. And sadly, when science is criticized for being “fallible” and less than absolute, these thinkers will be even more discouraged from asking the questions that science could have tried to answer.

Science isn’t perfect, and that’s because we recognize our own fallibility. Because we’re human, egos get in the way, stubbornness about a beloved hypothesis can lead to interdepartmental fights or tenure denial, and that’s why science strives to isolate human bias from experiments, and to compensate for our failings.

A result is a result. The degree of truth it contains can really only be measured by rigorous peer review using empirically obtained evidence.

Butternut Squash Soup (10 min, one pot)

A conversation with a friend involving immersion stick blenders led me to post this easy, 10-minute, one-pot recipe. It’s especially useful around the holidays because it’s fast and comes out fancy-looking. It is easily modified for any root vegetable, and for vegetarians (just use water or veggie broth instead of chicken broth).

Ingredients are:

  • Solids: Onions, butternut squash, potatoes
  • Liquids: Olive oil, chicken broth/bouillon (or water)
  • Equipment: Pot, stove, immersion blender

Instructions are:

  1. Sauté onions in the pot with olive oil; don’t brown them.
  2. Dump chopped butternut squash (1 average-size squash) and potatoes (3, usually) in there. The pot can be up to 4/5 full by now; I wouldn’t go more than that or you’ll have really thick soup.
  3. Dump 3 cups of chicken broth (from bouillon; veggie broth or plain water also works) in there.
  4. Bring to boil on high heat. 
  5. Cover pot, lower to medium heat.
  6. Cook 20 min.
  7. Open pot, immersion stick blend the sh*t out of that thing.
  8. Optional: Add sour cream and a dash of freshly cut cilantro. Fancy!
  9. Devour.

Separation of Science and Religion

Science is agnostic. That’s one of the things I love about it. Regardless of your creed or personal faith, science’s one true tenet is the truth, and the pursuit of that truth. No matter how painful that truth, you can be assured that, if you were rigorous, thorough, logical, and demanding, that truth is still the best and truest understanding you could hope to achieve with the best of your abilities. It’s been through the fire of variable elimination, of blinding against subjective bias, of attempting to prove the opposite of the intended result. It has no opinion, it has no sympathy, and it has no permanency. Truth is only truth for now, as we know it. If gravity as we knew it were ‘disproved’ tomorrow by science, we would accept the revision (and in a sense we did, with quantum mechanics). That’s scientific truth. It’s not always easy or sensible to accept, but it is the cold hard gleaming truth.

I talked about politics at a social mixer today. Nothing alarming happened, but the increasing politicization of knowledge, and recontextualizing science as a matter of faith, came up. How do we convince climate change deniers and intelligent designers of the truth? I argue that science and faith can’t meet in a meaningful way on their own grounds. They operate fundamentally on different principles, with different semantic meaning for the same things. For a scientist, truth is merely the best version of what we can tell based on thousands of reproducible experiments and tests, of hard self-questioning and denial of personal bias. In religion, truth is faith based, fundamentally. For many believers, truth is what their religious text tells them, or their religious authority, or what they feel to be true deeply in their heart of hearts. Trying to reconcile these two truths ignores that they can never meet: one is a religious truth and the other is a scientific truth. In the face of scientific evidence, religious truth is unassailable unless someone’s faith changes in some way. This is confirmation bias at its most resilient.

This is no new thing under the sun: Stephen Jay Gould’s non-overlapping magisteria is another way of putting it. In opposition, Richard Dawkins has been very vocal about John William Draper’s conflict thesis, which proposes that religion will always challenge new scientific ideas and produce social conflict. But wouldn’t it be better to let religion and science each go their own way, as fundamentally irreconcilable and isolated fields? If we can’t get along, we can at least recognize in each other a shared wonder and love of the universe.

A Mosaic of Human(?) Evolution: Australopithecus sediba‘s Challenges

The anthropology community has been filled with buzz recently about the discovery of a new species, Australopithecus sediba. Is it really an ancestor to modern-day humans? Does it have a human-like brain or an ape-like brain? What do its humanoid hands but ape-like feet mean for the evolution of walking? We may be arguing about these issues for a while, but the completeness of the skeleton and its distinctive blend of early and more modern humanoid features set it on par with Lucy (Australopithecus afarensis) in importance. In a field where many critical discoveries revolve around no more than a piece of jaw or a corner of a hip bone, this is a prize opportunity to learn more about how we developed the features that set humans apart from chimps and gorillas.

For the longest time, the world has known of Lucy, the star of the paleoanthropological world, as our ancestor from about 3 million years ago. Despite many interesting findings since her fateful discovery, whether because of incomplete skeletons in the fossil record or theoretical arguments about our family tree, we haven’t been able to draw a clear timeline of what led from Lucy to the first Homo habilis, the “handyman” that led to our own Homo sapiens. This all changed when a dig in Africa produced four fossil skeletons of stunning completeness. They are now known as Australopithecus sediba, dated to a little less than 2 million years ago. The star of the show so far has been MH1, a juvenile male with a skull so complete that scientists have constructed a virtual model of its brain.

How can we even know what kind of brain it had if all we get is bone? Scientists used a CT scan of the male skull to create a model of the interior of the cranium. This endocast is constructed from many X-ray scans, rotated 360-degrees around a central point, so that each scan is like a cross-sectional slice of the skull. By digitally modeling the combination of those slices, scientists can deduce what type of brain and therefore what mental capacity it had. Based on that, we can then try to predict whether it used tools, and even speculate on its social organization and its capacity for planning and self-awareness.
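As a toy illustration of the slice-stacking idea (not actual endocast software, and with entirely invented dimensions and shapes), each scan can be treated as a 2D mask marking the cranial cavity; stacking the masks gives a 3D voxel grid, and counting filled voxels approximates the cavity’s volume:

```python
import numpy as np

# Hypothetical sketch: build fake cross-sectional "scans" of a cavity,
# stack them into a voxel volume, and estimate volume by counting voxels.
voxel_mm = 1.0  # assumed edge length of one cubic voxel, in millimeters

slices = []
for z in range(40):  # 40 cross-sectional slices through the "skull"
    y, x = np.ogrid[-20:20, -20:20]
    # cavity radius varies smoothly from slice to slice (made-up shape)
    r = 15 * np.sin(np.pi * (z + 1) / 41)
    slices.append(x**2 + y**2 < r**2)  # boolean mask: inside the cavity?

volume = np.stack(slices)  # shape (40, 40, 40): the digital "endocast"
volume_cc = volume.sum() * voxel_mm**3 / 1000.0  # mm^3 -> cubic centimeters
print(f"estimated cavity volume: {volume_cc:.1f} cc")
```

The real pipeline works on actual CT data and far finer resolution, but the principle is the same: once the cavity is represented digitally, measurements like total volume (the numbers reported for A. sediba and Lucy) fall out of simple voxel arithmetic.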

A. sediba’s brain challenges what we thought we understood about the evolution of childbirth, bipedalism, and tool use. Some scientists are even claiming that the four fossils aren’t a separate species at all. Anthropologists are arguing not just about where to place A. sediba in our family tree, but about old and established theories about human evolution that have dominated the field for decades.

A Challenge to Childbirth

We originally thought that it was our big brains that caused our pelvises to evolve the way they did. After all, one major constraint on brain size, and therefore head size, is childbirth. How could our enormous brains fit through our tiny pelvises? We compensated for that with severely delayed development: compared to other mammals, human babies are basically born premature. When sheep are born, they can stand up within minutes. For a human baby, that process can sometimes take twelve months. We grow our brains, and the rest of ourselves, outside the womb, whereas other mammals emerge nearly ready-made. As a result, the easiest explanation would be that our pelvis has also rotated and reshaped to accommodate the wider birth canal.

A. sediba has thrown a wrench into this theory, because it has a small head, but its pelvis is rotated the way a Homo pelvis would be, and yet its narrowness still echoes Lucy’s. If we look closely at A. sediba’s pelvis and skull, we find that while its pelvis is a blend of the rotated hominin pelvis and the narrower australopithecine pelvis, its brain measures only 420 cubic centimeters (cc). To put that into context, a modern human brain averages around 1,350 cc. Lucy (A. afarensis) had a brain of around 400 cc, and chimpanzee brains clock in at less than 500 cc on average, so A. sediba’s brain is comparable to both. This means that even though A. sediba shares roughly the same brain volume as Lucy and chimpanzees, its pelvis (and the rest of it, as we shall later see) was already beginning to change. So if A. sediba didn’t have a big brain to reshape its pelvis, what did it have instead?

A Challenge to Bipedalism

The answer to that question lies outside of the cranium, and requires us to think about how bipedal A. sediba was compared to Lucy or to a modern human. One feature that sets us “above” our ancestors is our ability to rise up and move about on two legs alone. Humans are completely bipedal; for the most part, we do not suddenly decide to switch to all fours in the middle of a meeting, or swing from the pipes on a train platform because it’s easier than walking to the train. Chimpanzees primarily travel on all fours, and although they can occasionally walk around on their hind legs, knuckle walking is much easier for them than for us. Human and ape skeletal features have evolved to suit their respective locomotion lifestyles, but we see something different when we look at fossils from the transition between the ape-like australopithecines and modern humans.

Bipedalism requires changes in the shoulder blade, the pelvis, the legs, and the feet. Even the neck and spine are involved in upright mobility. Although chimpanzees and humans share a common ancestor rather than a direct line of descent, it’s still useful to compare the chimpanzee’s ape features with our own. Our arms are relatively short compared to our legs, but apes and australopithecines have long upper limbs, with large joints to handle the weight they share with the rear limbs. The thickness and strength of the arm, leg, and wrist bones adjust in humans and in chimps based on how much weight they habitually need to support. On a chimpanzee, the shoulder blade is fully rotated so it can swing between branches, and humans still retain some of that flexible shoulder joint. The thickness of the spinal vertebrae and the orientation of the pelvis both shift to accommodate the suddenly vertical load that humans endure in order for us to lift our heads above the crowd of ape-like relatives.

When we take all these comparisons and apply them to Australopithecus sediba, it’s as if we had tripped along the way and tossed all these features together. The sides of A. sediba’s pelvis are more vertical, as you would expect of Lucy’s, and the size is more like Lucy’s, but the shape and angle of the pelvis where it sits in the body is more like a human’s. It also has the strong, long arms of a chimpanzee or an australopithecine, and the large joints of someone used to supporting their weight on their arms. Parts of the hip, knee, and ankle look like they would be best for bipedalism, but the foot looks much more like an ape’s knuckle-walking foot. Overall, there is a mix of tree-swinging, knuckle-walking australopithecine and tall, bipedal hominin, with each individually distinct feature adding to a confusing bigger picture.

There is no change in brain size or head size that could explain the change in pelvis, but there are changes in the rest of the body that are related to a newfound reliance on bipedalism, rather than swinging from branch to branch or knuckle walking over the ground. These differences happen in an otherwise australopithecine body carrying an australopithecine-sized brain. The old theory of childbirth changing our pelvises may just be untrue, and A. sediba might be the perfect exception that disproves the rule. It might just be possible that bipedalism, and not babies with bigger brains, is the cause for the signature changes in our pelvis that mark the evolution from Australopithecus to Homo. Then again, as many scientists have pointed out, why can’t it be both? The jury’s still out and the papers are still being written.

A Challenge to Tool Use

If walking came before bigger brains, does that also mean it came before smarter brains? The precise origins of stone tools are murky, and even if we see evidence of tool use 3 million years ago, that still doesn’t tell us how we came up with the idea of creating knives or axes out of bits of boulders. Whether a stone broke into a chopping blade by accident, or a few australopithecines started pounding rocks together out of sheer boredom at night (a wonderful image from an old professor of mine), the invention of tools had profound effects on the physiology of our ancestors.

The stunning endocast created for A. sediba, combined with skeletal evidence from its hands, can tell us a great deal about the changes in brain capacity, diet, and maybe even social complexity as our lineage developed towards human society.

We associate the brain’s frontal lobe with planning, thinking, emotions, and other higher functions. Your frontal lobe stops you from saying something rude, helps you decide not to steal, and recognizes that surprise from a practical joke isn’t a signal for your body to go into survival mode. Many scientists think that planning ahead is a very human thing, and specifically, planning several steps ahead with many other humans. Chimpanzees are known to get a bunch of friends together for precise attacks against other chimpanzees. Baby baboons will fake an injury to get more food. Other primates can be just as devious as we are, but no chimpanzee has ever led a concerted and sustained effort to conduct siege warfare or to coordinate a commodities trading market in bananas. The simplified answer is that they do not have the same frontal lobe organization that we do.

Compared to other australopiths, A. sediba’s brain isn’t remarkable except for its frontal lobes. Like the blend of australopith and hominin features we see in its skeleton, its brain is overall australopithecine, but its frontal lobes show the shadows of future humans to come. Why the change? Its australopithecine cousins were also bipedal, but their brains don’t harbor these glimmerings of the future. They’re also known for tool use, as early as 3 million years ago, but their brains don’t show this kind of neural reorganization.

We might be lost at this point, if not for A. sediba’s hands and teeth. Its hands don’t look completely like ours, and they probably weren’t as good at precision grip as ours are. But remember that A. sediba’s hands were occasionally freed to do other things while it walked around bipedally. Its hands could grasp more than tree branches, at least, and we see that in its human-like thumb-to-finger proportions. It’s as if an almost-human hand was grafted onto an australopithecine arm.

Another hint comes to us in the form of the juvenile male’s molars. Inside its vertical, human-like face, the second molars had already developed. Their arrangement is australopithecine, but their size is closer to Homo. The simple supposition is that what A. sediba was eating had changed how large its teeth needed to be. If its diet changed, then the way it gathered or reached those foods had changed too. Bipedalism meant it could see farther in non-wooded areas, and the improved finger dexterity meant it might belong to the same tool-making tradition we share with the early makers of stone “shovels.”

Whatever its brain, teeth, and hands can tell us about its life, we know that evolutionarily speaking, A. sediba’s brain organization was moving towards Homo before its brain size made that shift.

A Mosaic of Evolution

The arguments surrounding A. sediba are enormous, complicated, and critical for our understanding of human evolution. Scientists are even arguing that it shouldn’t be classified as Australopithecus, or that it isn’t a new species at all. That would mean no new species and no changes in existing theory; just an expansion of the range of features we use to assign existing species. Even if it is a new species, Australopithecus sediba might not even be related to us; instead, it could be an example of how another organism experienced similar environmental pressures and evolved in a similar way. It’s hard to say; there aren’t enough skeletons to let us know for certain. As with any new discovery, there are bound to be hundreds of new theories, new ideas, and new papers written arguing new sides to be taken.

Fossil hominins are an elite and lonely crowd, and their rarity makes every new discovery the next potential Lucy. As exciting and puzzling as A. sediba’s skeleton is, each individual bone adds another piece to a hodgepodge of theories, ideas, and histories. However it ends up getting classified, the fact remains that paleoanthropologists carefully rescued four isolated skeletons from the darkness of history. In the future, there will hopefully be more like A. sediba, of any species, to transform, challenge, and energize our understanding of our origins and what it means to be Homo sapiens.


  • Berger et al., Australopithecus sediba: A new species of Homo-like australopith from South Africa. Science 328, 195 (2010).
  • Carlson et al., The endocast of MH1, Australopithecus sediba. Science 333, 1402 (2011).
  • Cartwright, J., Evolution and Human Behavior. Palgrave, Great Britain (2000).
  • Gibbons, A., Skeletons present an exquisite paleo-puzzle. Science 333, 1370 (2011).
  • Kivell et al., Australopithecus sediba hand demonstrates mosaic evolution of locomotor and manipulative abilities. Science 333, 1411 (2011).
  • Zipfel et al., The foot and ankle of Australopithecus sediba. Science 333, 1417 (2011).
