On email signatures

Should I include my pronouns?

“He, him, his.

“She, her, hers.

“They, them, theirs,” chanted the schoolchildren as they prepared for their pronouns quiz.

I’ve been thinking about email signatures. You know, that block of text automatically inserted into the bottom of all your emails with your name and contact information. Yeah. That.

You may or may not have a very long email signature. (You probably do if your employer has a branding guide, and the guide has several hundred words to say on how to format email signatures.)

I don’t want to pass judgement on how you end your emails, but I started to think there may be a problem with mine after I realized there were usually more words in my automated signature than in the actual body of my email messages.

Am I putting in too much information? Does it seem like I’m trying to show off how much work I can do? Am I complicit in contributing to an overburdened, overstretched campus work culture?

(I want to do none of these things.)

People Who Email Me Keep Assuming I’m A Man. How Do I Tell Them I’m Not?

James Smith (he/him/his)

Several people at Northwestern, including some of my friends, include pronouns as part of their signatures.

Usually, it’s buried in the middle of their list of expansive contact information, along with their own email address (because the From: field wasn’t enough), office phone number (actually useful), office address (I thought we were sending emails, not letters), and the words “Northwestern University” in branding colors (okay, branding guidelines, now you’re just wasting space).

This group doesn’t just include those at Campus Inclusion and Community, but also a small group of students and staff across the entire university; yet, it’s still a tiny enough number that I notice whenever somebody does do it. If normalizing the inclusion of pronouns in overpacked email signatures is the goal, then we’re still a long way from the practice becoming common.

Because that is the goal, isn’t it?

We’re trying to switch away from a world where pronouns are assumed based on visible cues of gender identity (name, body shape, hair style) to an environment where we respect each other’s gender identities by not making such assumptions. Part of that means we’re creating a space — and normalcy — for people to decide their own pronouns, as opposed to the rest of us deciding for them.

I think.

What about me?

I used to include my pronouns (“he, him, his”) in tiny letters underneath my name. Then I started using a new email client and created a new signature without gender pronouns. (I didn’t exclude them. I just forgot to include them.)

(Yes, to the presumable shock and horror of many working in IT support, I’m a so-called digital native who uses an email client instead of just typing ‘gmail.com’.)

I stuck with this signature for several months. But I’m now bouncing back and forth between including gender pronouns and leaving them out.

I recognise it’s a privilege for me to have other people look at me (or my name) and assume pronouns correctly, and I want to put in some effort to help people who don’t enjoy that privilege.

At the same time, I’m a cisgender man, and a part of me feels that, by doing so, I’m broadcasting my privileges; that casually including “(he, him, his)” at the end of my emails is like me saying: “this little action doesn’t hurt me at all and I’m such a good person because I’m an amazing ally to trans and gender nonconforming communities!!!”

Does including gender pronouns in my emails contribute to a culture where trans and gender nonconforming individuals feel pressured? Will I therefore be complicit in creating an environment where such individuals have to choose between remaining closeted about their identities, or else outing themselves before they’re ready?

I don’t like this. I feel like I’m acting rashly and disrespectfully if I do include pronouns, and like I’m not doing enough to show my support for these communities if I don’t. I feel like there’s no winning solution here.

But my email signature now does say “(he/him/his)” in 8-point font, so at least I’m doing something, right?

The giant pool of money

Why saying “That’s my money you’re spending” doesn’t make sense

I discuss money a lot. Sometimes it’s discussing my own money that I’ve earned/borrowed/received; sometimes it’s discussing money under my control as Treasurer of North by Northwestern; sometimes, it’s discussing money that is neither mine nor under my control.

At the current stage of my life, it’s usually about this university I attend, and its US$9.648 billion endowment and its total $2.03 billion expenditures last year.

Because whenever there’s a capital expenditure (read: construction of new buildings and facilities), the usual response of students is to say: “my tuition dollars paid for that.”


As any aspiring business student who has taken Intro to Accounting will probably tell you, the General Fund is an accounting term used to describe, well, the general fund.

(I’m simplifying to communicate a point. Bear with me, please, business students. You’re going to do great work.)

It’s an idea that’s usually only used in government and nonprofit accounting, because money given to these organizations can, and sometimes does, come with strings attached earmarking it for specific purchases and services.

(For-profit corporations, which most businesses are, don’t have to make this kind of accounting distinction. They can spend money however their directors see fit. As a result, they don’t usually have anything but a general fund, and this kind of fund accounting doesn’t apply.)

The general fund is kind of a lump pool of all money that isn’t set aside for a specific purpose. When you give money to one of these organizations—in the form of taxes, fees, tuition, fines, whatever — unless the transfer from you to the organization came with specific caveats, your money generally gets dumped into the general fund.

Conversely, if you make a donation to the university’s library and specify it should be used to purchase pre-colonial United States documents, then it’s likely going to end up in a fund specifically for library purchases of pre-colonial United States documents. (This might seem far-fetched, but you could apply this principle in the form of scholarship funding and financial assistance for college students. And a lot of people do.)

In practice, this cash — general-purpose or not — gets lumped together and deposited into a single bank account, because having more money in an account generally means you get a higher interest rate.

But in the organization’s accounting department, the cash earmarked for your pre-colonial library collection might as well be kept in your long-lost Great Uncle George’s secret underpants drawer, because there’s no way that it’s going to be spent unless some obscure 1700s parchment turns up for auction.

However, money in the general fund can be spent on anything, at any time, as long as it’s there. And it does get spent, a lot of the time.

This is why it’s so difficult to track where your money flows once it’s inside an organization, and where it’s being spent. Even with the most sophisticated of all accounting tools, it’s just not possible to say this dollar belonged to Jackie, this dollar belonged to Jose, and this dollar belonged to Joon.

It would be analogous to thousands of people filling a swimming pool with one cup of water each, then taking out a cup of water and saying “Whose water is this?”

You could divide up that volume of water proportionally to how much everyone put into the pool, in which case each person owns a tiny fraction of that cup. Or, you could assign the entire volume of water as one person’s contribution, and then exclude them from having ownership for any other cup of water you later extract from the pool.

Both are equally valid, because both are equally nonsensical.

The only way you could sensibly say “This cup of water is Jordan’s cup of water” is if you kept the water from Jordan’s cup in the side kiddie pool — and congratulations, you’ve just discovered the basis of fund accounting.
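For the programmatically inclined, the kiddie-pool idea can be sketched in a few lines of Python. This is a toy illustration with made-up numbers and names, not real accounting software; the point is just that one bank balance can sit behind several separate ledgers:

```python
# A toy sketch of fund accounting: one pool of cash, but separate
# ledgers tracking what each earmarked portion may be spent on.
class Organization:
    def __init__(self):
        # "general" is the big pool; every other fund is a kiddie pool.
        self.funds = {"general": 0}

    def receive(self, amount, restriction=None):
        """Unrestricted money goes to the general fund;
        restricted money gets its own earmarked fund."""
        fund = restriction or "general"
        self.funds[fund] = self.funds.get(fund, 0) + amount

    def spend(self, amount, purpose=None):
        """Spend from an earmarked fund if the purpose matches one;
        otherwise spend from the general fund."""
        fund = purpose if purpose in self.funds else "general"
        if self.funds.get(fund, 0) < amount:
            raise ValueError(f"insufficient balance in {fund} fund")
        self.funds[fund] -= amount

    @property
    def bank_balance(self):
        # In practice, all the cash sits together in one account.
        return sum(self.funds.values())

u = Organization()
u.receive(50_000)                                   # tuition: general fund
u.receive(10_000, restriction="library documents")  # earmarked donation
u.spend(5_000)                                      # construction: general fund
assert u.bank_balance == 55_000
assert u.funds["library documents"] == 10_000       # untouched, drawer-safe
```

Note that the bank balance went down, but the earmarked library fund didn’t move: that’s the whole trick.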

“Where are you from?”

This planet. And everywhere on it.

It’s a difficult question to answer.

“Where are you from?”

I don’t know. And it’s a problem that’s been written about many times before. And like those authors before me, I don’t fit neatly into a box of stereotypical identity: I am not your Chinese-American immigrant from Taiwan, nor am I the epitome of Asian-Americanness from California, and I’m certainly not your blonde girl from rural Wisconsin.

I’m an American. And I believe I am entitled to lay claim to that identity, because I believe in all the things that America is supposed to stand for: freedom, liberty, and justice for all. A thriving, inclusive, respectful democracy and a strong economy built upon the ideology of individual freedom and choice.

It gets a little tricky when you try to narrow it down a little further. While I live in Evanston because of my enrollment at Northwestern, I don’t claim to be an Evanstonian any more than the next student. Evanston politics are not my concern, and while I don’t have any negative feelings about Evanston, I am also distinctly conscious of my disconnect from much of this suburban city’s life and the likelihood of me moving out of Evanston in the near future.

Nor do I feel an attachment to the city of Chicago. I can appreciate and respect its culture and politics, but I don’t feel as if I understand the culture of the city: its intricacies, its beliefs, its concerns and its pride. To me, Chicago feels like a foreign land, and I’ve never found the pulse of this Midwest city.

I think I have a claim to identifying with the state of Illinois (cause, you know, place of birth and all that), but I don’t do so. For one thing, Illinois still doesn’t have a state budget after two years, and I can barely keep up with the frolicking going on in Springfield. For another, I can’t make a distinction between Illinois and other US states (although I’m pretty sure they do exist), and I’m left uncertain about how much of my feelings are associated with the state of Illinois or with the United States as a nation.

But I’m painfully reminded that I’m not considered ‘American’ because of my race, ethnicity, and personal history. And while I understand where it’s coming from, I’m frustrated that this isn’t a two-way street, because there’s no way I can politely and easily explain to someone how these “microaggressions” work.

Like voting. I was venting a frustration about photographers in voting booths during the presidential election, when I was questioned on my identity: “But wait, you can’t vote, can you?”

Or in breakout sessions: “I’d like us to go around the table, tell us your name, what you do, where you’re from, and a fun fact about yourself!” as if where I’m from (or a fun fact about myself, for that matter) has any impact on the subsequent conversation or what I’m able to do.

It’s hard to find statistics that can back up a personal narrative, but I’m pretty sure I’m among the first of a generation whose parents had the economic and political resources to be able to live a life where the idea of a “hometown” falls apart. For the first time in human history, it was possible in the 1980s for an average family to pick up their belongings and move halfway across the globe in search of better jobs and a better life.

And the children born during that time — in the 1980s and the 1990s — are just coming of age now, and it’s starting to shake the foundations of questions such as “where are you from?” and “what’s your hometown?”

I’m not even quite certain what information you’re looking for. Like, are you genuinely curious about my place of birth? Or are you, as I think you are, using this question as a supposedly polite way to try to understand my attitudes and values, in the same way we ask people what they do as an arms-length indicator of their education level, social class, and income group?

I’d imagine many people would give the same answer to both motivations of the question, but I am not one of them. I know what you’d assume based on my answers, and neither my place of birth nor the cultures of my last permanent residence are indicative or informative of my identity, beliefs, and values.

So, where are you from?

You are Northwestern.

Two roads diverged in a wood, and I — 
I took the one less traveled by,
And that has made all the difference.

“The Road Not Taken” by Robert Frost

Just over two years ago, I learned I was accepted into Northwestern. It remains one of the most pivotal moments in my life.

If I could have gone back and spoken to myself two years ago, and told myself that life was going to get this much better, I don’t think my seventeen-year-old self could have believed it. But it’s true: life does get better.

If my seventeen-year-old self could see me now, he probably wouldn’t recognize me.

Left: Feb 2015. Right: Feb 2017. Some things have changed. And some things, well, haven’t.

I swapped my glasses for contacts. I started wearing a lot more purple. I developed a quiff.

FB_101: How to embarrass yourself in the future

Made friends. Lost friends. Made new ones. Tried a relationship. Learned that Northwestern has a hookup culture and that was very much not what I was looking for.

Entered Medill. Studied the philosophy of modern journalism, which I didn’t understand. Started to challenge the notion that journalism could ever be independent and objective. (Still am.)

Survived reporting and writing with lecturer Michael Deas. Learned so much that I’m planning on taking news editing with Deas again to prepare for anything the world might want to throw at me.

Wrote for the Daily Northwestern. Didn’t like it. Joined North by Northwestern. They haven’t kicked me out yet.

Went to San Francisco. Never looked back.

Since I first stepped onto Northwestern’s main campus in Evanston, I’ve discovered more about myself than I had ever believed possible. I’ve learned what it means to be me: the eclectic, eccentric and electric personality that I am. I discovered the science of living out of packed boxes, of planning my life in 10-week blocks, and the art of looking more beautiful than most.

I’ve also found my place in the world, and discovered that everything I’ve been doing has a word: design.

As well as being a student journalist, I’m also going to claim the identity of “designer,” because I firmly believe that design is about making the world a better place, and everything I’ve ever done in my life has been an attempt to make the world better than it was before.

Design is about making the world a better place. That may or may not involve pretending to climb the Golden Gate Bridge (which is one of my favourite pieces of infrastructure).

But it might not have turned out like this.

March 2015.

I wonder: would I recognize myself as a 20-year-old Duke undergraduate?

Who would I have been? What would I have become?

I wonder.

Human centered design

I was having a perfectly delightful conversation with a Northwestern undergraduate manufacturing and design engineer about the faculty in the Segal Design Institute when he dropped a sentence: “So when I was taking Human-Centered Design with him last year — ”

“Excuse me – what?!”


“Human-centered design?” I said. “As opposed to what other kind of design?”

Since my only exposure to design as a profession had been through the Segal Design Institute, it baffled me that design could be about anything other than humans.

It turns out, according to this junior in MADE, that Segal is unusual in its heavy focus on human-centered design; there are other design philosophies, he explained, such as manufacturing-centered design – which focuses on designing for the manufacturing process as opposed to the end user’s experience.

“But what,” I ranted, “is the point of design if it doesn’t make people happy?”

This used to be an image of the McCormick School of Engineering’s new branding. Now it’s Segal’s own branding.

Design is a new concept for me, but the philosophies and foundations underlying design thinking are much more familiar. Some of today’s design innovation finds its roots in industrial engineering, which my father teaches at another university; other parts of design innovation, such as “understanding human behavior,” fall out of a combination of my journalism studies and my mother’s teachings on how to be a decent human being.

So I don’t know how to feel about the concept of designing for anything other than humans. With few exceptions, nothing we create as a species is intended for anyone – or anything – other than ourselves. Our failure to contain the environmental crisis known as climate change is a testament to our self-centered focus.

(One notable exception is the Voyager Golden Records, specifically intended for an audience other than humans; and we spent forever designing those records to inform another intelligent life form, without knowing what communication media were even available.)

And so the technologies and processes we use to make today’s products and services are supposed to serve people, not the other way around.

The processes we design and build with new technologies have always been about people. We may have shifted the predominant written media from parchment to paper to typewriters to computers to the Internet, but fundamentally we have been serving other people with our writing (by informing them) as opposed to the technologies that enable our communication.

I feel that designing for anything other than humans leads to a failure of design. When you don’t design for the human experience, then you may as well have not bothered to design at all.

The divide between news and editorial


For readers, however, that divide, no matter how firm, is often viewed as a distinction without a difference. “We have readers who’ve been reading the paper their whole lives and who understand those distinctions very well,” says Goldberg of the LA Times. “But, of course, we have plenty of readers who don’t understand the distinctions we’ve spent years and years trying to make…”

I like newspapers. I was the weird guy in high school who had a recurring subscription to the newspaper, and would be found reading the paper in the form room the first thing in the morning. One year, when I was absolutely terrible about it, I kept an entire year’s stash of newspapers in my locker, and didn’t clear it out until the end of the year. (It was a lot of ink and paper.)

I accept, though, that the new Internet age of journalism has meant things are changing. And I think one of the changes lies in newspaper editorials, and the divide — or lack thereof — between the news pages and opinion pages of a newspaper.

Because that’s exactly the point. In a newspaper, you can rely on the position of a piece within the paper — towards the front for news, or the back for opinion — to give you clues about what you’re reading. That distinction does not exist on the web.

We talk about “content” on the web because that’s all the web cares about. The technologies that power the World Wide Web today — HTTP, HTML, WebKit — don’t care about news and editorial, only content. At the end of the day, the web delivers content from server A to consumer’s device B, and no part of the Internet’s infrastructure cares about what the content is.
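To make that concrete, here’s a minimal sketch of what the protocol layer actually sees. The article snippets are invented, but the structure is real HTTP: both responses carry the same `Content-Type`, and nothing in the protocol marks one as news and the other as opinion:

```python
# The HTTP layer only sees headers and bytes. Whether the body is a
# news article or an opinion column is invisible to the protocol.
news_response = (
    "HTTP/1.1 200 OK\r\n"
    "Content-Type: text/html; charset=utf-8\r\n"
    "\r\n"
    "<article>Senate passes budget bill after marathon session</article>"
)
opinion_response = (
    "HTTP/1.1 200 OK\r\n"
    "Content-Type: text/html; charset=utf-8\r\n"
    "\r\n"
    "<article>Why the budget bill is a mistake</article>"
)

def content_type(raw_response: str) -> str:
    """Extract the Content-Type header: all the protocol knows about the body."""
    for line in raw_response.split("\r\n"):
        if line.lower().startswith("content-type:"):
            return line.split(":", 1)[1].strip()
    return ""

# Both kinds of journalism are indistinguishable at this layer.
assert content_type(news_response) == content_type(opinion_response)
```

Any distinction between news and editorial has to be built on top of this, by the publisher — which is exactly the argument that follows.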

Which is what?

In a modern day news site (like the Wall Street Journal), everything looks pretty much the same. Go on. Take a look. If you can tell which is the news article and which is the opinion piece just from glimpsing at the screenshots, then I’m guessing you work for the Wall Street Journal.

You can’t rely on a reader peering at the tiny text (“politics” and “opinion”) to determine the type of article they’re reading. And in an environment of sharable content, Facebook, Twitter, Google and the like abstract away what tiny differences remain, making it even harder for readers to tell news from editorial.

Go on. Which is which?

This lack of distinction is why some readers of newspapers don’t seem to get there’s a difference: because everything looks the same. And it’s what the Columbia Journalism Review was touching upon when it discussed today’s WSJ editorial:

The Inquirer’s Jackson, who has served on the editorial boards of three different newspapers, agrees. “Most readers do not [understand the distinction],” he says. “Most readers believe there is collusion between the editorial writers and the news writers and editors, that one reflects the other.”

It’s easy to blame the reader, but the fault lies with the newspaper’s publisher and editors, not the readers. The news publisher hasn’t made it clear that there’s a difference between the editorial webpages and the news webpages.

How do you do it? Use a different template. Different fonts. Different layout. Maybe put big photos of the columnists on the article itself, to make it clear it’s a personal opinion, and not the newsroom’s stance.

Or kill the distinction between news articles and editorial articles, and start delivering opinion pieces that are truly representative of the newsroom’s opinion. They’re all WSJ journalists. Why are news staff and editorial staff any different?

What I don’t understand about politicians and their tax talk

House Republicans have inserted a last-minute provision into the Republican healthcare bill, as part of an effort to woo more votes from upstate New York representatives.

The provision would deny matching federal aid to the state of New York if it did not shift the burden of Medicaid from New York counties to the state government.

Here’s the bit I don’t get:

But William Cherry, the Schoharie County treasurer, said upstate counties would be able to make a significant reduction to their property taxes if they did not have to shoulder part of the cost of the state’s Medicaid program.

“This would be a huge step and a great benefit to taxpayers,” said Mr. Cherry, a Republican who is president of the New York State Association of Counties.

It’s not for me to comment upon New York policies, because I don’t live in New York, don’t pay taxes to the state of New York, and have no ties with the state.

But my point: government is government. Yes, there may be different layers of government, from local, to county, to state, to federal, to international (hello, EU), but it’s ultimately all government.

And governments of any kind have basically two ways of raising revenue: taxes and borrowing. And as a nation, we’re supposed to hate government borrowing.

William Cherry, the county treasurer mentioned in The New York Times article, argues that upstate counties can make a “significant reduction to their property taxes,” by eliminating Medicaid costs from county budgets.

Notwithstanding that the majority of the county property tax burden goes to paying for schools, not Medicaid, there’s another point I’d like to make: shifting the burden from counties to the state doesn’t reduce those costs, it just moves them to another balance sheet.

At the end of the day, the state government is going to have to pay for all the costs previously borne by the county government, and the state is going to have to get the money somewhere. If the money doesn’t come from county taxes, then it’s very likely going to come from state taxes, unless the state does something else: like drastically cutting back on spending for Medicaid.
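A toy calculation (all the numbers below are invented for illustration) makes the point: moving a line item from the county ledger to the state ledger leaves the combined burden on taxpayers unchanged, unless the spending itself is cut.

```python
# Hypothetical budgets, in millions. Moving Medicaid costs between
# ledgers changes who sends the bill, not the size of the bill.
county_budget = {"schools": 700, "medicaid_share": 150, "other": 150}
state_budget = {"medicaid": 2000, "other": 3000}

total_before = sum(county_budget.values()) + sum(state_budget.values())

# The proposed shift: the state absorbs the county's Medicaid share.
state_budget["medicaid"] += county_budget.pop("medicaid_share")

total_after = sum(county_budget.values()) + sum(state_budget.values())

# County property taxes can fall, but state taxes must rise to match.
assert total_before == total_after
```

The only way `total_after` comes out smaller is if the Medicaid spending itself shrinks somewhere along the way.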

Shifting costs from the county level to the state level does nothing for taxpayers. It’s merely another example of how the system of government we’ve set up is designed against taxpayers’ interests.

If politicians really cared about taxpayers (let alone citizens), then they wouldn’t be fighting about who should pay for what, but instead trying to work out the best way to deliver the best possible public services at the lowest cost to the taxpayer.

Grammar, crash blossoms, and 140 characters

I was disturbed by this tweet from the New York Times National News account.

It felt off. There was something about the grammar that I didn’t like. I spent over a minute trying to decipher what it meant.

How do you break down this headline? Was it “Trump and House”? And what linguistic role does the word “Work” play — a verb, a noun, or an adjective? I thought I had forgotten how to read English.

I then realized that it wasn’t my fault: the tweet itself was problematic, because the order of the words was confusing my brain’s ability to put them together.

An illustration of my first interpretation of the word groups. These word groupings don’t make sense as a sentence.

We understand sentences as we read them. This sentence is problematic because it’s not immediately clear how the words are supposed to be grouped.

How the tweet was supposed to be grouped.

This type of unwieldy, unclear sentence is known as a crash blossom, after a badly written Japan Today headline confused readers.

I wasn’t just going to criticise The New York Times without knowing how to improve upon it, so I tweeted a suggestion.

But I’ve since come up with better text for tweeting this story: “Trump and G.O.P in House Agree to Require Medicaid Recipients to Work”.

I added the preposition “in” to clarify the relationship between the House and the G.O.P.

I shortened “make a requirement” to “require.” I put in “recipients” to clarify the requirement will apply to people, not the Medicaid program.

It’s short. It’s clear. And it’s only 69 characters, which means it can fit in a tweet.
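If you don’t trust my counting, two lines of Python confirm it (using the 140-character limit Twitter enforced at the time):

```python
# Verify the revised headline fits within a tweet of the era.
headline = "Trump and G.O.P in House Agree to Require Medicaid Recipients to Work"
assert len(headline) == 69
assert len(headline) <= 140  # plenty of room to spare
```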

Incidentally, the article’s headline says it will be the states, not the federal government, that will actually decide to require able-bodied Medicaid beneficiaries to work.

But the article itself doesn’t say who gets the final decision about this requirement. It could be a requirement written into the bill itself by Congress and the president, or the bill may merely permit states to decide for themselves.


Dear Professor Poliakoff

Sir Martyn Poliakoff CBE FRS is a British chemistry professor researching the applications of supercritical fluids in green chemistry. He is probably best known for his appearances in the Periodic Table of Videos by videojournalist Brady Haran at http://www.periodicvideos.com/. Photo from the Royal Society.

I write to thank you for your contributions to the promotion of science at Periodic Videos.

While I am now training to be a journalist at university, I was originally set on a career in the natural sciences. Despite my journalism background, I still find it difficult to articulate my sorrow at having given up a life of science for one of the creative arts; I have never found more accomplishment in anything than successfully grasping the intricacies of the natural sciences and coming a little bit closer to understanding the universe around me.

I believe that you have been responsible for inspiring countless young people, including myself, to gain an interest in the sciences. Your enthusiasm for and appreciation of chemistry is clearly evident whenever you appear on Periodic Videos.

Like many young people in developed societies, I believe, I grew up in an academic setting where the natural sciences were boring and repetitive — a constant drone of textbooks and theory — but my love of science was sustained because of individuals such as yourself, who demonstrate the excitement of learning and knowledge every single day.

Thank you for sharing the love of knowledge with me and everybody else. Thank you for making it cool to study science. I hope that, in addition to your scientific research, you will continue to educate and celebrate the wonders of chemistry.

Avoiding the word “normal”

Very few words are as telling about your worldview as the word “normal.”

I think a lot about the power of words. I started thinking about the interpretation of language in English Literature class, when I was taught to attempt to illustrate every possible understanding of a particular noun phrase to build a passable case for my interpretation of the author’s intentions.

Most of the time, I felt I was analysing why the curtains were blue.

But it gave me an appreciation for the power of words, and the improbable possibility of deriving the author’s “attitudes and values” from their choice of language. What you say and don’t say are strong indicators of who you are and what you believe.

So I believe we should stop using the word “normal” as an adjective, especially when we’re describing people and their traits.

“Normal” is a subjective term. It suggests there is some collective sense of normalcy, and connotes a value system where deviance from the standard is somehow unsavory or intolerable.

Sometimes it’s okay to use the term, particularly when such norms are fundamental: the belief in a free press to oversee and report on the government is a fundamental norm in American society.

Most of the time, though, it simply demonstrates the author’s narrow-mindedness and lack of exposure to the world.

I once called out a fellow student of journalism for describing a desire to hear a “normal accent” in a video featuring an obviously-fake British accent; while the voice was horrible, the idea of a “normal” accent only exists if you believe there is somehow a standard accent that’s desirable, and usually it’s your own.

Meanwhile, you imply you believe all other accents are less desirable, and that you — epitomized by your accent — are of higher status than people who speak differently from you.

I don’t believe this particular individual meant any harm. But using the phrase “a normal accent” conveys a particular system of values, like using phrases such as “a normal person” to describe an able-bodied individual, or “a normal color” to describe someone’s race and ethnicity.

It doesn’t mean it isn’t a common experience for you. But believing that the human attributes familiar to you are “normal” implies, at best, a naive understanding of the world around you; at worst, a narrow-minded, bigoted and regressive perspective on the rich fabric of humanity.

We have perfectly acceptable alternatives to communicate a better meaning than “normal”: common, usual, typical. It is not normal for someone in the United States to be white or Caucasian: it is common. It is not normal for black and African-American communities to be disproportionately impacted by U.S. governments, law enforcement, and the criminal justice system: it is typical.

My second lesson in journalism school was how to shorten phrases to convey meaning more compactly. This includes replacing long phrases such as “located at the intersection of Johnson and Madison Avenues” with “near Johnson and Madison.”

But my first lesson at Medill was that accuracy is the single, overriding value in journalism.

In an Internet age where word limits are no longer defined by column inches on a page, we can afford to be more accurate by using more words. The shortcut afforded by “normal” is no longer appropriate.