Monday, June 2, 2014

Pseudo-anonymity: Defense

Back to our FIFO queue! Today we have...

pop(Pika):
The other day I made a mistake and left a comment on someone's blog under my own first name instead of my pseudonym. I deleted it as soon as I noticed, but then I got a bit paranoid that someone could figure out who I am from that one single comment. So I googled my first name.

And got the shock of my life.

I am there: my work page pops up immediately, right on the first page of results...
How googleable are you? 
I meant to post about this topic months ago, but found myself struggling with how to appropriately discuss it. The problem with writing a post like this is that I could give hints on how to 'out' someone who is blogging/internetting pseudo-anonymously, and I don't want to do that, for obvious reasons. The good news is that most of the techniques to de-anonymize bloggers remain firmly in the realm of researchware, but I wouldn't bank on that being the case for too much longer.

Instead, I'd like to suggest a few defensive things pseudo-anonymous netizens can do to help maintain their anonymity. Some of these suggestions are social, some are technical, but nearly all are grounded in the privacy literature.

1) Don't tell anyone you know in your open (non-anon) life about your pseudo-anonymous identity/blog. Someone will tell someone, and the next thing you know someone posts something somewhere revealing your real name. People are awful at keeping secrets, and if you ever become a famous (or controversial) blogger you run the risk of someone accidentally (or purposely) outing you.

2) Don't write things that would be devastatingly embarrassing for you if you were outed. As I said, right now it's easy to be a little bit anonymous online, but I would not at all bet on that trend continuing. I saw a paper presented at a conference recently that scared the crap out of me, so do take heed.

3) If you blog, turn on the comment approval settings. If you use Facebook or other social networks, even if it's under your pseudonym, turn on the settings to approve your wall posts / picture sharing / etc. Seriously, lock that puppy down. Better to introduce a delay than to suffer the consequences of someone commenting, "Great post, Imelda D! See you at lunch tomorrow."

4) Never forget: once it's out there, it's out there. There are no takebacks in the era of RSS feeds and Google. There is no ephemerality. Be extra careful when you post something not to sign your real name, discuss something specific about your location, etc. You have absolutely no idea who is subscribed to a blog's comments, and once their RSS reader grabs your comment, there's nothing you can do.

5) There is a lot of literature on how people can infer your identity based on your interests, social network friends, etc. (See references in this post). Some people who work in the security/privacy fields make their name on this kind of thing, no pun intended. Again, this supports my first suggestion to keep your pseudo-anonymous life and your non-anonymous life as separate as possible. If you need to share something personal, change some details here and there. You know, say you love dogs instead of cats.

6) Use Tor or another anonymizing web-browsing service when visiting other people's blogs/websites, and definitely anonymize your IP when commenting elsewhere under your pseudonym. While Google Analytics provides a slight layer of anonymity and lets your individuality get lost in the noise, not all trackers are so gracious. Remember, every time you hit a webserver, your IP address is logged, and it is often easy to deduce who you are from your IP alone. So you are completely relying on the good graces of the website/blog owner not to out you. By using an anonymizer, you can at least protect yourself a bit better.
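To make suggestion 6 concrete, here is a minimal Python sketch. It assumes a local Tor client is listening on its default SOCKS port (9050) and that the requests library is installed with SOCKS support (pip install requests[socks]); the URL is just an illustration:

    import requests

    # Route traffic through the local Tor SOCKS proxy.
    # 'socks5h' (note the 'h') resolves DNS inside Tor too,
    # so even your DNS lookups don't leak.
    TOR_PROXIES = {
        "http": "socks5h://127.0.0.1:9050",
        "https": "socks5h://127.0.0.1:9050",
    }

    def fetch_via_tor(url):
        """Fetch a page; the server logs a Tor exit node's IP, not yours."""
        response = requests.get(url, proxies=TOR_PROXIES, timeout=30)
        return response.text

    # check.torproject.org reports whether you are actually on Tor.
    print(fetch_via_tor("https://check.torproject.org/")[:300])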

I think that's it for now. Happy pseudo-anonymous blogging!


My student is so good you can't have them

It's been positively fascinating reading recommendation letters for prospective graduate students. The majority are fairly normal, but a few are kind of clingy.
"Ms. Hopper is awesome and will do great at your university, except I really don't want her to go there, I want her to stay with meeeeeeeee."
Sometimes, the clingy professors will trash the student too, sort of like in the way I tell people, "this chocolate cake is TERRIBLE, you definitely don't want any."

While it's true that good students are hard to find, and showing some level of adoration and commitment toward them is a good idea, clinging too tightly is bad practice, professionally and managerially. In fact, I've heard some stories of clingy advisors that strike me as borderline abusive.

Maybe in other fields this sort of thing is tolerable, but a graduate student in CS, even a bad one, can get a job anywhere and make five times as much as they would as a PhD student. So it doesn't strike me as particularly clever to treat them poorly.


Agent Smith's Registry

I have just discovered the joy that is comixed.com. This one is probably my favorite so far:

Image Description*: Panel 1: Neo says, "So you just keep duplicating
your program over and over? Aren't you afraid of registry errors?"
Panel 2: Agent Smith says, "Mr. Anderson...Do you honestly think that I would allow
there to be any errors in my system's regist.."
Panel 3: Hugo Weaving in drag with an outlandish orange and yellow costume.
Panel 4: Agent Smith (I think?) with white light coming out of his eyes.

While we're talking about The Matrix, I just stumbled across this video of a recreation of a scene from the film in Lego. I somehow missed it the first time around when it came out in 2009, so in case you did too, here it is:



You can also watch the side-by-side with the original film. It's amazing. 




* From now on I'm going to try to make my captions more accessible to readers who are blind and/or visually impaired. Please call me on it if I forget!


Be your own princess charming

I recently came across this lovely series of photographs from photographer Jaime Moore in Austin. She wanted to take photos of her 5-year-old daughter, Emma, on her birthday. While looking for inspiration on the net, all she could find was how to dress one's daughter like a Disney Princess. So she took matters into her own hands -
It started me thinking about all the REAL women for my daughter to know about and look up too, REAL women who without ever meeting Emma have changed her life for the better. My daughter wasn’t born into royalty, but she was born into a country where she can now vote, become a doctor, a pilot, an astronaut, or even President if she wants and that’s what REALLY matters. I wanted her to know the value of these amazing women who had gone against everything so she can now have everything. We chose 5 women (five amazing and strong women), as it was her 5th birthday but there are thousands of unbelievable women (and girls) who have beat the odds and fought (and still fight) for their equal rights all over the world……..so let’s set aside the Barbie Dolls and the Disney Princesses for just a moment, and let’s show our girls the REAL women they can be.
Here are a few of these remarkable photos:




The rest of the photos are here: http://www.jaimemoorephotography.com/2013/05/09/not-just-a-girl/

I would love to see the marketing industry, toy industry, and Hollywood follow Jaime Moore's lead. K-12 STEM outreach is great and all, but if we want to make a significant impact we need to get images of greatness (beyond beauty) into the minds of young women on a daily basis.


New adventures in publishing metrics

In case you haven't heard, Google Scholar Citations recently opened its doors, allowing academics to set up Google Scholar profiles, track their citations, h-index and i10-index, and see pretty graphs.
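For the unfamiliar: your h-index is the largest h such that h of your papers have at least h citations each, and your i10-index is simply the number of your papers with at least 10 citations. A minimal Python sketch, with made-up citation counts:

    def h_index(citations):
        # Largest h such that h papers have >= h citations each.
        ranked = sorted(citations, reverse=True)
        return sum(1 for rank, c in enumerate(ranked, start=1) if c >= rank)

    def i10_index(citations):
        # Number of papers with at least 10 citations.
        return sum(1 for c in citations if c >= 10)

    cites = [42, 18, 11, 7, 3, 1]  # hypothetical citation counts per paper
    print(h_index(cites), i10_index(cites))  # -> 4 3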

At first I thought: Yay! Especially since, for Computer Science, this came right on the heels of Cite Scholar's beta release, which highlights the fact that in CS it's all about the top-tier conferences, and journals don't matter much for us.

Then I thought: Boo! Now it's easier for the bean counters to count beans. Also, I sense there's this "who's searched for me" button coming, which creeps me out. This is actually why I don't ever click on academia.edu pages.

After a few weeks of reflection I am still on the fence. While I can't speak for other fields, in CS the number of citations doesn't necessarily say anything about the quality or impact of the work. I can think of several lackluster papers that have hundreds of citations, whereas others are incredible and have barely any. Also, sometimes an insane number of citations simply means you forced (er, encouraged) people to cite you by releasing some software or data.

On the other hand, I find these new graphs seem to ignite my "MUST WRITE MORE" instinct, just as the darling tune my new washing machine plays encourages me to do more laundry.


Not your grandma's Tufte

I really appreciate good data visualization because I think it's one of the best ways to communicate ideas. It's hard to know how to do it well, though, if you're not trained in design. For some reason I always stress over little details - fonts, colors, margins. But that happens when I cook, too, so maybe that's more a reflection of my personality than anything else.
Image from Understanding Graphics

I once met Ben Fry, who, if you don't know, is the co-creator of the Processing programming language and also made a name for himself doing some gorgeous visualizations for SEED magazine. I had just seen a fantastic design of his, and asked him how he went about choosing its colors and fonts. He gave me some advice I still follow today: start with black Helvetica and go from there.

As well as this advice has served me, I feel ready to move into serif fonts and at least dual-colored bar charts. So lately I've enjoyed reading the blog posts from Shawn Allen's Data Visualization Course. He starts with a history of data visualization (remember Florence Nightingale?) and builds up from there. It's excellent and I highly recommend checking it out.


Women in CS: It's not nature, it's culture

The goal of this post: to convince people to stop throwing up their hands and saying women are just not interested in Computer Science, and instead to do something about it.

Whenever someone says this, the implication is, "It's not our fault. We're not doing anything wrong." Instead of asking, "How can we retain/attract women?" they simply assume that we:

  • Are not interested

  • Are biologically deficient at math/logical thinking

  • Don't have what it takes to be a "code cowboy"1

These assumptions are not correct. In some parts of the world we see equal numbers of women and men studying computer science and engineering and being employed in the tech sector. For example, in Malaysia women hold 50-60% of tech-sector jobs. This is entirely due to culture: it turns out men who work indoors are seen as less masculine than those who work outdoors, and women who work outdoors are seen as lower class [1]. Funny enough, that is sexism working in reverse, leading to an increase in women working in technology.

The data ain't pretty, though 1982 sure looked nice.

Source: NYTimes
We see similar trends in other parts of Asia. I don't have recent data, but around 2003 women were earning 59% of science and engineering degrees in China, 46% in South Korea, and 66% in Japan (compared with 33% in the US) [2].

As for the biology argument, if you're still not convinced, read Terri Oda's excellent piece, "How does biology explain the low numbers of women in computer science? Hint: it doesn't."

So for the non-Asian world that is still struggling with underrepresentation [2], I think we need to change our culture. We need to eliminate the geek mythology (that to be successful you must eat/sleep/breathe code and nothing else) [3], we need to ensure we illustrate the purpose and value of computing [4], we need to be proactive in recruiting women [5], we need to provide plenty of other women as peers and role models [6], and we probably should de-masculinize the workplace [7]. These approaches are all shown to make a big difference in attracting and retaining women in CS.

If you do these things at your institution, I guarantee you that in addition to helping attract women, you will also attract men. And probably also people of different races, cultures, ethnicities, and socio-economic statuses. Which, given the yo-yo enrollment trends of the last decade, is not a bad thing at all. We really don't want computer science departments going the way of the dinosaur2.

So, please - stop mansplaining and start doing.

Notes
1 It's worth mentioning, since it's a common misunderstanding: coding != computer science; rather, coding ⊂ computer science. It's like how doing an assay is not molecular biology research. Assays are an aspect of the field of molecular biology; coding is an aspect of the field of computer science.
2 No disrespect to anyone who works in Paleontology (I am truly saddened to hear of your suffering), but I just couldn't resist the pun.


References

[1] Mellstrom, U. Masculinity, Power and Technology: A Malaysian Ethnography. Ashgate, 2003.
[2] Simard, C. "The State of Women and Technology Fields Around The World." Anita Borg Institute, 2007.
[3] Margolis, J. and Fisher, A. "Geek Mythology and Attracting Undergraduate Women to Computer Science." Impacting Change Through Collaboration: Proceedings of the Joint National Conference of the Women in Engineering Program Advocates Network and the National Association of Minority Engineering Program Administrators, March 1997.
[4] Margolis, J., Fisher, A., and Miller, F. "Caring About Connections: Gender and Computing." IEEE Technology and Society, December 1999.
[5] Cohoon, J. M. "Recruiting and Retaining Women in Undergraduate Computing Majors." SIGCSE Bulletin, Vol. 34, No. 2 (June 2002), pp. 48-52.
[6] Lagesen, V. "The Strength of Numbers: Strategies to Include Women into Computer Science." Social Studies of Science, Vol. 37, No. 1 (February 2007), pp. 67-92.
[7] Cheryan, S., Plaut, V.C., Davies, P.G., and Steele, C.M. "Ambient Belonging: How Stereotypical Cues Impact Gender Participation in Computer Science." Journal of Personality and Social Psychology, Vol. 97, No. 6 (December 2009), pp. 1045-1060.


The Social Network

Due to circumstances beyond my control (a long plane ride), I watched "The Social Network". I didn't really want to watch it, but also sort of did, kind of like a train wreck. I also wanted to see sudo*-Matt Welsh's cameo teaching Operating Systems.

I was pleased that Hollywood got some of the technobabble correct (Apache with a SQL backend), and I loved that the closeup of Mark's laptop showed it running *nix. I also thought it was cute that they re-branded the iBook laptop as "Book".

However, I was greatly displeased with how the film portrayed women. By my count, there was only one female character who was not: a flake, a flirt, a drunk, a girlfriend, or crazy - and she was a lawyer with hardly any personality depth. Why were there no female engineers, or CS majors? Or, heck, I'd even take an Art History major. Just somebody with some brains to accompany the legs.

I was also displeased with how Mark Zuckerberg was portrayed. I don't know the real Mark, but the director seemed really dedicated to employing the geek-with-zero-social-interaction-skills trope. Couldn't the actor have smiled occasionally? Been somewhat friendly now and then?

So, Hollywood, your scorecard is: B+ for suspending my geek disbelief, but an F for perpetuating stereotypes.


(*) Pun intended! 


Finally, some useful internet activism

Great article in this Monday's Technology Review on different websites/apps set up to help people in Japan, such as Ushahidi, SparkRelief, and Hurricane Party.

I was happy to read about these efforts, and encourage you to participate with them and/or donate on your own.

For monetary donations, InterAction has a list of verified charitable organisations who are accepting donations, which also describes how they will use the funds. Definitely check InterAction or with the Better Business Bureau before donating - there are a lot of scams out there.

For non-monetary donations, you can donate frequent flyer miles or socks, or send hopeful letters. (The sock guy mentioned letter writing as a thoughtful gift the Japanese will appreciate, which I think sounds like a great idea.)


A boy called Sue

Kim O'Grady writes, "I understood gender discrimination once I added 'Mr.' to my resume and landed a job".

The tl;dr version: Kim was an experienced engineering/business person who was applying for jobs. He sent out dozens of resumes to top places and did not get a single interview. He sent his resume to a bunch of lower-tier places; still no interview. Finally, he realized employers were taking "Kim" to mean he was a woman. So he added the prefix "Mr." to his resume, sent it out again, and immediately landed interviews.
My first name is Kim. Technically, it’s gender neutral, but my experience showed that most people’s default setting in the absence of any other clues is to assume Kim is a woman’s name. And nothing else on my CV identified me as male. At first I thought I was being a little paranoid, but engineering, sales and management were all male-dominated industries. So I pictured all the managers I had over the years and, forming an amalgam of them in my mind, I read through the document as I imagined they would have. It was like being hit on the head with a big sheet of unbreakable glass ceiling.
This is so sad. It reminds me of neurobiologist Ben Barres' experience: after he gave a seminar as Ben, following his transition from Barbara, someone in the audience remarked, "Ben Barres's work is much better than his sister's."

The one I hear a lot in my field is, "X is a superstar" or "X is gifted", and always "X" is a man. I've never heard a woman referred to as a superstar or being gifted in her field. I've also never heard of a young woman referred to as a child prodigy.


How to get your paper accepted: Orshee

In today's installment of how to get your paper accepted, we shall discuss gender-inclusive language.

Back in my days of blissful ignorance, I didn't notice gender use in language very much. "John Doe" and "He" were pretty much par for the course.

At some point, I was reading an article that was positively littered with "him or her", "he or she", "his or hers", and I wanted to pull my hair (short or long) out. While I appreciated the sentiment, it was completely distracting from the prose.

I once was given a Parenting 101 book, and it alternated between male and female examples per section (i.e., every few pages). I liked this approach a lot better, because it made for much easier reading while still being gender inclusive.

Gender-exclusive language has no place in scientific writing, unless the author is describing a single case study (e.g., "When Patient M. first came to the hospital, he..."), a gender-exclusive event (e.g., the Society of Women Engineers summer camp for fourth-grade girls), or is somehow writing in the third person from the perspective of one of the authors.

It's very easy to use anonymous, gender-neutral subjects in sentences to give examples of people. For example, "the student", "the user", "the agent", "the engineer", "the scientist", etc.

It takes practice to write in active voice while remaining gender neutral; the writing can get a bit bogged down when you start, and writing "they" or "them" can feel awkward. But like any sort of writing, practice makes perfect. After a while it becomes second nature.

Unlike in those days of blissful ignorance, as a reviewer I am now very distracted and occasionally annoyed both by gender-exclusive language (of either gender) and by too many Orshees. In some particularly egregious cases of the former I have politely reminded the authors to be more sensitive in their use of language. I know it is often a result of English being a second language.

Google, however, really should know better. Check out this error message I just got in Chrome (emphasis mine):
In this case, the certificate has not been verified by a third party that your computer trusts. Anyone can create a certificate claiming to be whatever website they choose, which is why it must be verified by a trusted third party. Without that verification, the identity information in the certificate is meaningless. It is therefore not possible to verify that you are communicating with  XXX.YYY.ZZZ, instead of an attacker who generated his own certificate claiming to be XXX.YYY.ZZZ. You should not proceed past this point.
If I were a man I might be offended. I'm sure there are plenty of female hackers out there. (Heck, even that attack is poorly named: "man in the middle". I guess it's catchier than "person in the middle", but still.)


Oh noes, it's women CEOs!

Today at Scientopia I discuss the latest debate raging across the pond - hiring quotas to ensure there are more women CEOs of companies.


Reviewer armchair psychology

Did I mention July is the month for reviews this summer? I must have reviewed 25 this month (one for every hot, humid day!).

After I review papers, if I have time I enjoy doing armchair psychology on my fellow reviewers. Some conferences / journals let you see the reviews others have submitted, and some even allow you to change your score based on what you read. I'm not sure if this is a good thing or a bad thing, but it's interesting.

When there are 3-4 reviewers for a paper, the scores tend to regress to the mean, so on a 1-5 scale the average score will be around 3. There are also often repeats: if I give a paper a 4, it's likely someone else will give it a 4 too. Really bad papers tend to have scores that cluster around 2, and really good papers cluster around 4.

So I'm always intrigued when I see the following:
Reviewer 1:  4
Reviewer 2:  5
Reviewer 3:  3
Reviewer 4:  1
As a nascent author, when you get a set of reviews back like that, you tend to think, "Reviewer 4 is a jerk who Didn't Get It."

As a more seasoned author, you tend to think, "Oh no, what is my Fatal Flaw? (Reviewer 4 is a jerk who Didn't Get It.)"

And as a seasoned reviewer, you tend to think, "Who is Reviewer 4 and what is their beef?"

Occasionally Reviewer 4 has a valid point, and the other three reviewers really did miss something major. But more often than not Reviewer 4 is angry at the authors for taking too many liberties in their paper. Or for not citing Their Brilliant Work. Or it's the "Someone is WRONG on the internet" phenomenon.

In any case, when I'm an editor or paper chair I can ignore the outlier and life goes on. But when I'm a fellow reviewer I feel more invested in the outcome, particularly when I'm Reviewer 2. I hate to see the possibility of good science getting squished because some reviewer was being thick, especially when it's someone else's science.

So, if a conference or journal offers a discussion period for reviewers, I occasionally have to confront Reviewer 4 head on, lest they somehow manage to convince Reviewers 1 and 3 to change their scores.

Anyway, this is some of what goes on behind the scenes behind your favorite publication venue. As an author, try not to let the outliers get under your skin. If your other reviews are good, be persistent and try again somewhere else. There's an awful lot of randomness in this process.


NRC Computer Science Rankings Reprise

There's an article in CACM this month by computer scientists Andrew Bernat and Eric Grimson, "Doctoral Program Rankings for U.S. Computing Programs: The National Research Council Strikes Out." It talks about the ways in which the NRC rankings are broken for CS (we have heard this before), but it also details ways in which they could be fixed, which we hadn't heard before and which I like.

Two suggestions I thought were good:
  • "Explore making the rankings subdiscipline-dependent. It is clear that different departments have different strengths. Thus, enabling a finer-grained assessment would allow a department with strength in a sub-field, but perhaps not the same across-the-board strength, to gain appropriate visibility. This may be particularly valuable for students deciding where to apply."
  • "Use data mining to generate scholarly productivity data to replace commercially collected citation data that is incomplete and expensive."
The first is a nice idea; for example, you might be interested in a top-ranked department, but it turns out 19/20 of the faculty focus on Theory and you actually want to do Systems. Or there might be some school with three top faculty exactly in your subspecialty, but you don't see them because they're 93rd in the rankings.

The second is nice as well; I think with Google Scholar Citations data available this turns out to be a trivially easy problem to solve. 

Maybe CRA can do their own rankings; they collect a lot of their own data anyway, and it avoids needing to rely on the NRC. 


"But you don't look like a computer scientist!"

Joe McCarthy has an interesting post about the Boopsie Effect, "wherein women in upper-level positions in historically male-dominated professions find that 'attractiveness suggests less competence and intellectual ability'". He discusses some female computer science researchers he knows who have felt compelled to conceal their attractiveness in order to be taken seriously by their colleagues.

This is a picture of Hedy Lamarr, silver screen actress and wireless security pioneer.

Photo by BooBooGBs
I thought this was an interesting comment, because I've fortunately never encountered this sort of problem from my male colleagues. However, I have most certainly encountered this from the lay public.

For example, I recently bought an iPad while traveling. Because I had suitcases to carry, I accepted the Apple store employee's suggestion to open the box and register the SIM card so I could leave all the packaging at the store. After he finished I began to gather my bags and he said, "Do you need any help setting up your email?" to which I replied, "No thanks, I'm a computer scientist." He had a look of shock on his face and said, "Oh! I underestimated you!"

I'm still not entirely sure what to make of that comment. What are we computer scientists supposed to look like exactly?

Of course this comment isn't nearly as bad as one I received a long time ago. I was out with some friends, and a man came over and started talking to me. He asked me what my profession was. I told him, and he said, "But you don't look like a computer scientist!" I had to leave the room for a second, and when I came back he was gone. Perhaps he was hoping I looked more like Hedy?


My how things change

Before you are a professor:
"OMG, I cannot believe the professor's slides have typos in them. And a boring PPT template. And looks like it was written in 1983. Oh, and, like, OMG, don't they know we don't use gopher anymore??"
After you are a professor:
Hmm... will they notice the mustard stain? 
Yeah. It's that bad, folks. I have typos! Spelling errors! Ugly slides that are from 1983! And despite my best efforts, large parts of my lectures are BORING!

My dear professors of yesteryear, I am sorry for judging. I get it now.


[#CSEdWeek] My favorite software

Happy Computer Science Education Week. I did my part! I debugged a memory leak with a pre-literate child sitting next to me, wanting to punch the meta key in emacs. I can't remember if he realized we needed an extra * or I did, but all I can say is that if pointers are so simple a five-year-old can explain 'em, no grumbling allowed, undergrads.

Anyhow, to kick off CS Ed Week, I'd like to talk about some of my favorite programs. These are small utilities most of you have probably never heard of, but they fill me with great joy.

1. DTerm (OS X)

This is, hands down, the piece of software I have been waiting for my whole life. It's basically a "command line anywhere" sort of program. See, for some bizarre reason, OS X doesn't allow you to, say, create a text file in the Finder right where you are, like Windows or even some versions of Linux do. I have no idea why, but this was a gross oversight.

DTerm saves me tons of effort. Old way: launch a terminal window, cd tab-tab-tab-tab-tab (or drag from the Finder), touch foo.txt. New way: DTerm shortcut, touch foo.txt. Done!

2. Quicksilver (OS X)

Along those lines, Quicksilver is also super useful and has accelerated my workflow. Instead of trying to find things (which I'm terrible at), I just type command-period, type the first two letters of the application name, email address, text file, whatever, and boom - there it is. I am so used to this now I have to install it on new machines, or else I can't use them. (Sad but true).

3. F.lux (OS X, Android, iOS, Windows, Linux)

This program is very clever - it dims your monitor/screen to help simulate getting ready for sleep. For someone who ends up foolishly doing work at 11pm or 5am, and who travels through far more time zones than is healthy, it's nice to give the ol' hypothalamus a break.

4. Instapaper (All OSes, web-based)

This is the best piece of software ever written. It lets you save a webpage, from anywhere, for all time. (Removing all ads and annoying stuff). It's shareware, but if you give the developer $3, you can also search through your clippings. It's beautiful, well designed, and wonderful for reading lots of news/journal articles on long airplane rides.

I think that's it for now. I've been trying to be like Beki and stop using my inbox as a TODO list, but I'm still experimenting with applications for that. It's my New Year's Resolution. I'll leave you with a CSEdVideo from Obama (h/t CCC blog). Enjoy!




Scary Stories to Tell in the Dark: Conference Edition

I recently gave a talk, and afterward someone doing very similar work, "Sue," came up and we started chatting. We had very complementary research interests, so we went to dinner together to keep talking.

Sue and I are in very different disciplines, but we both attend conferences regularly, so we ended up swapping stories.

Some of these conference stories were funny (e.g., the general chair getting trashed and loudly singing German drinking songs at the banquet), and some were embarrassing (e.g., the young scientist asks the senior scientist, "What are your thoughts on Embedded Rubber Ducks?" and the senior scientist says, "Young man, I INVENTED Rubber Ducks!"), but overall they were greatly entertaining. Like campfire storytelling for academics.

Though, as at any good campfire, we eventually reached the inevitable point in such a story-swapping conversation: horror stories. These are the kind of stories that, at the time, make you want to jump off a cliff, but years later you can (sort of) laugh about with colleagues. Sue told me a few that I wish I could write about here, but I was sworn to secrecy.

Anyway, I thought I'd ask the peanut gallery out there - what are your conference stories? Any funny ones? Scary ones? Would love to hear.


How to get your paper accepted: Short paragraphs

July seems to be the month for reviews, so I thought I'd organize some of my observations on scientific writing into bite-sized advice posts.

1) If you want to get your paper accepted, please, for the love of all things, use short paragraphs.

I was reviewing a two-column ACM format paper recently, and a few paragraphs took up the entire left-side column and half of the right-side column. My eyes went blurry by the end, and frankly it negatively biased me against the authors.

If authors are concerned about space, they should either use fewer words or make their diagrams smaller. I'd much rather see smaller diagrams and more readable text than huge diagrams and squished prose.

Also - putting hundreds of lines of code into a paper is rarely necessary. (And XML is never necessary*). Use small chunks. Just the important idea behind the awesome algorithm. If the code paragraphs are taking up more than half a page, please consider an alternate presentation style. (See Justin Zobel for nice presentation ideas).


-------------------
(*) I'm sure there's a good xkcd comic out there for this sentiment, though my Google-fu is weak today.


Top Secret Rosies

A special "rose" for you for Valentine's Day - I've just posted at Scientopia about the incredible new documentary "Top Secret Rosies: The Female Computers of WWII."





Kudos, ACM!

Kudos to ACM for featuring two prominent Female Computer Scientists in this month's Communications of the ACM (CACM): Jeannette Wing and Barbara Liskov (via Valerie Barr). I especially enjoyed reading Valerie's article about Barbara's keynote at Grace Hopper. Barbara is the second woman to win the Turing Award, which is basically the Nobel Prize of Computer Science. I liked this:
"Liskov talked about her technical work that ultimately led to the Turing Award. Much of her work was motivated by an interest in program methodology and the questions of how programs should be designed and how programs should be structured. So, after receiving the Turing Award, she went back and reread the old literature, discovering anew that there is great material in old papers and that her students were unaware of it. So, she is now pointing people to these papers and encouraging people to read them. 
For example, three key papers she cited are:
  • Edsger Dijkstra, "Go To Considered Harmful," Communications of the ACM, Vol. 11, No. 3, March 1968, pp. 147–148.
  • Niklaus Wirth, "Program Development by Stepwise Refinement," Communications of the ACM, Vol. 14, No. 4, April 1971, pp. 221–227.
  • David Parnas, "Information Distribution Aspects of Design Methodology," IFIP Congress, 1971."
I recently had a similar "everything new is old again" epiphany. I was looking up a paper that everyone cited and realized it was far too recent. So I went down the citation rabbit hole and found the original paper, written over 30 years ago. And, wow, great ideas - but they completely got lost in the whisper-citation-down-the-lane effect.

Anyway, good stuff, check it out if you have the chance.


Taking a risk for someone else

I read this article last night, and it brought tears to my eyes. It's about James Zwerg, a white college student who was part of a group of non-violent civil rights activists (the Freedom Riders) who rode an integrated bus through the Deep South in 1961. They did this to prove a point: travel facilities on the interstates in the South were just as segregated and racist as they had ever been, despite the Supreme Court's rulings, and it was time for the government to act.

They endured incredible violence and emotional tumult, the latter not only from aggressive people they encountered during their travels, but also from their families.

Stanley Nelson made a documentary about the Freedom Riders, which you can watch on your local PBS station or on DVD. (It first aired May 16th.) Many clips from the film are online as well.

I find myself struck by two thoughts. First, what an incredibly brave thing these students did. They risked their lives and endured a tremendous amount of grief to get the government to actually do something.

Second, I wonder why now, 50 years later, so many people are so risk-averse when it comes to standing up to people who make racist/sexist/ableist remarks. Compared to what The Freedom Riders did, confronting someone on this stuff is nothing.

I hope this film, and other associated events of the 50th anniversary of the Freedom Rides, gives people the courage to act.


Cryptocontributions

John Regehr of Embedded in Academia has a great post about Cryptocontributions in writing:
Even when interesting and unexpected results make it into a paper (as opposed to being dismissed outright either by the PI or by a student doing the work) the discussion of them is often buried deep in some subsection of the paper. When this happens — and the interesting development is not even mentioned in the abstract or conclusion — I call it a “cryptocontribution.” Sometimes these hidden gems are the most interesting parts of what are otherwise pretty predictable pieces of work. When authors are too focused on getting the thing submitted, it’s really easy to shove interesting findings under the rug. Certainly I’ve done it, though I try hard not to.
I like that his post contains a little cryptocontribution of its own, and it is this: by being so conference-deadline-driven, Computer Science is, as a science, still a bit immature. If I have time I'll write more about this topic next week, because it's an idea I've been pondering for a while.



PS - A note to John and other bloggers who run WordPress type-things - I seem to be unable to leave IP-anonymous comments on your blogs via Tor. I try, and try, and try, and am thwarted. So I've given up! But do know I'd love to comment if I could. Maybe this summer if I have some free time I'll write a Tor browser plugin that works with WordPress.


Things I don't have to think about today

John Scalzi posted an absolutely breathtaking poem in his blog yesterday - check it out.

(Normally I'd post excerpts to tease you, but I'm loath to perturb poetry. Just trust me, it's worth the click!)


Mobile Scholar: Part I

Image by Mike Licht
I am trying to turn my iPad into a laptop in order to lighten my load while traveling (and save my poor neck). It will never replace a proper computer from a software development perspective, but from a scholarly reading and writing perspective I am almost there.

The Chronicle had a nice post in ProfHacker regarding PDF annotation and organization, and Christopher Long has also written in greater detail about how one goes about "Closing the Digital Research Circle". For PDF annotation, syncing, and citing, I strongly suspect Mendeley is going to win the race. As much as I love the idea of Zotero, I just don't use Firefox on any of my machines or mobile devices. (I did enjoy using the open source Aigaion, but once my entire bibliography got trashed while upgrading I decided to stick with the pros). Mendeley can be buggy, but as one person said, "When it works, it works really well," and they're right.

Anyway, that's still just consumption and management of existing content, which is only half the problem. The other is creating and editing manuscripts.

In my field, everyone writes papers in LaTeX. Some journals and conferences occasionally permit the submission of Word documents, but personally I have a hard time understanding how anyone can do that without pulling their hair out. The last time I wrote an article in Word I spent several days dealing with misplaced references, unusual figure formats, caption problems, and incompatibility issues. When I write in LaTeX I can just focus on the writing and ignore everything else. (Kind of like writing a program in Java vs. C++)
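To make the contrast concrete, here is a minimal, hypothetical LaTeX skeleton; the labels, citation key, and file names are invented, but the point is that figure numbers, cross-references, and citations all resolve themselves at typeset time, so there are no misplaced references to chase by hand:

    \documentclass{article}
    \usepackage{graphicx}

    \begin{document}

    \section{Results}\label{sec:results}

    As Figure~\ref{fig:ducks} shows, embedded rubber ducks
    outperform the baseline~\cite{doe2010ducks}.

    \begin{figure}[t]
      \centering
      \includegraphics[width=0.5\linewidth]{ducks.pdf}
      \caption{Embedded rubber duck results.}\label{fig:ducks}
    \end{figure}

    \bibliographystyle{plain}
    \bibliography{refs} % refs.bib would hold the doe2010ducks entry

    \end{document}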

But how to write LaTeX on the go? Due to a lack of multitasking in the present OS, as well as Apple forbidding any applications that compile code (e.g., no easy way to typeset your documents), what's a body to do?

I recently found LaTeX Lab, which lets you edit and typeset LaTeX in Google Docs. Hooray! Almost there!

...sadly, Google Documents are not yet natively editable on the iPad.

I can, of course, remotely log in to my machines back at the office using virtualization software and edit LaTeX files there, but that just feels so inelegant. So we're not quite there just yet. I'm going to try a few things over the next few weeks and will report back.


How to get your paper accepted: Our results are very awesome!

In today's installment of how to get your paper accepted, I'd very, very much like to discuss intensifiers. And exclamations! So I will.

Scientific writing is first and foremost about clear, careful communication. You can have the most amazing results in the world, but if you can't clearly walk your reader through your science, you're going to run into problems. Furthermore, I said "careful" because in scientific writing it is also important to be humble, and not take your conclusions too far.*

By using intensifiers, which are adverbs that elevate the word following them, you not only run the risk of over-generalizing, but you also risk angering your reviewers/readers. Very few authors can credibly make claims like "Our work makes a very important contribution", "We present really groundbreaking work on embedded rubber ducks", or "This work is extremely revolutionary".

Exclamations, too, rarely have a place in scientific prose. Sometimes, if you are trying to write something that catches the reader's attention, an exclamation may be appropriate. For example, if you were writing a technical article on cellular phone use in rural India and wished to point out that people there are more likely to have phones than shoes, say, that might make sense.

An editorial or book review is a fine place to use intensifiers, and possibly also exclamations. These are publication venues that expect authors to state opinions and generalizations, as well as to catch a reader's attention.

But for your standard journal or conference article, keep the intensifiers (and exclamations!) at home. 

------
* This is important for many reasons, not least of which is making a generalization or prediction about the future that is entirely wrong and/or taken out of context, and having to relive it for decades. See also, "There is no reason for any individual to have a computer in their home."


Birthers, Racism, and The Media

Here I was thinking, how can I best write about what I think of the birthers? And then Tony Auth drew a fabulous editorial cartoon. Well done, Tony.

Image Description: Four panels. Upper left: "He wasn't, you know, born in America."
Upper right: "He's not, you know, A Christian." Bottom left: "He's, you know, a Muslim
or a Kenyan." Bottom right: "He's, well, you know..." [silhouette of Obama;
the visual implication is: "...black"]
Also on the topic of birthers and latent racism, The New Black Woman posted a critique of how the media just played into Trump's hands, letting him and other birthers spew all sorts of racist and xenophobic garbage unchecked, etc.:
"...many of the traditional news outlets and journalists refused to examine the racial factor behind the birther issue. Big Media refused to dig deeper into the underlying racist feelings that when a black man or woman obtains higher power or authority, there's something astray about that person's ascent to power. It failed to ponder why so many people feel that whenever a black man or woman achieves great success, their rise to fame or fortune must be the result of either a law or being broken or skewed in their favor at the expense of a white person.

But, I can't blame Big Media for failing to delve into any analytical reporting or investigating. Reporting on the racial, xenophobia aspect of the birther issue would require the media to confront the system of white supremacy and privilege set up to benefit many of the reporters working for Big Media. It would require them to dig deeper than the shallow reporting they are so accustomed to (due to advertising demands, a short attention span and hollow reasoning by their audience) and examine the subconscious racism laying dormant in a majority of our society. It would require making their audience and their bosses uncomfortable reading and editing stories about race as they would see quotes or segments reminiscent of their underlying racist feelings."
I sincerely hope the media outlets take this as a challenge to make themselves uncomfortable and truly confront latent racism. The day they finally realize that their power is about more than just selling toothpaste and Viagra, they can have a major hand in changing people's negative attitudes toward other races, cultures, and abilities. It doesn't have to be after-school specials; even the topics they choose to discuss, and the way in which they discuss them, make a difference. They could do so much better with not all that much effort.

Of course, then there's always the inevitable "Oh noes! Our art will suffer by having to care about how we use language!" argument. You think I'm joking. If you watch the actual noose gaffe video, a fumbling Mitt Romney quips, "You have to be careful what you say these days!" Aw. Not like the good old days when you could make noose jokes without a problem. Poor guy, he already has a lot on his plate; I shouldn't pick on him.


Pop Quiz: How we discuss women in STEM

As scientists, engineers, and thinkers, I know several of you are interested in the subtle ways in which women in STEM are diminished by sexist language and behavior. Sticks and stones, perhaps, but even this stuff is critical to address if we truly want to make progress and enable a cultural shift. (See also: death by a thousand paper cuts.)

In fact, the more I think about it, the more I realize progress relies almost entirely on the shoulders of mass media. Yesterday NPR had a story about Hollywood Health and Society, which consults with writers about how to write correct and useful story lines on healthcare and climate change*. It turns out the majority of Americans learn about science and healthcare from fictional TV - surprise!

So, writers, you have an important job to do. You need to portray scientists as they actually are. No putdowns, no pedestals, and definitely no tropes.

*Ahem*.

Ok, ready for the pop quiz?

Part 1: Read these quotes, and list all the tropes. 

1) "For Janet Yellen, Obama’s Federal Reserve nominee, quiet patience paid off"

2) "Though he says she hasn't been a superstar economist like her husband, George Akerlof, who shared the 2001 Nobel prize, and her achievements have been overshadowed by Bernanke and former Fed chair Alan Greenspan, she is a great role model for women, because throughout she has proved her intelligence, technical expertise, creativity, and her ability to cooperate with others and work hard."

Part 2: Consider the following two Wikipedia summaries**. What's different? (Hint: check the things in red). 



Pencils down!


-------
*We need this for Computer Science. Nearly every computer whiz portrayed on television is a socially inept Caucasian man and/or a psychopathic underachiever woman. And speaking of which, while I'm happy Elementary attempted to discuss P =? NP last week, there were some problems, as Lance points out. More importantly, why was the woman a professor at some podunk university I'd never heard of, while the man was a professor at Columbia? And all she did was teach. And, PS, sexy librarian trope.

**This is my next project. It is positively absurd how women are described on Wikipedia in comparison to men. Not just scientists - musicians, actors, artists, writers, athletes - pretty much every profession. Women quietly cooperate and have babies! Men invent things and lead.


How to make your journal editor happy

And in today's Hints from Heloise...

If you want to make your journal editor and reviewers happy when submitting a revision for review, use colorful highlighting annotations in your PDF document to show what's new. This makes skimming a 48-page manuscript so much more pleasant, and as an editor I am far more likely to click "Hoo-rah, accept!" than I otherwise would.

Recently I read one manuscript where the authors put their new text in yellow and their revised text in blue. Just this simple gesture made it so easy for me to check if they'd made the required changes.

You'd like to think your reviewers are not this easily manipulated, but I can tell you at least one of them is. :)


Today was a ____ day to be a professor

At the end of every day, I make a statement like, "Today was a good day to be a professor", or, "Today was a bad day to be a professor". (And some days are partly cloudy.)

It's interesting to reflect upon which activities bring me the most joy, and which are the most frustrating. So, let's see:

Favorite thing: Meeting with my RAs. They are just good kids. They are sweet, fun, and brilliant. I love sitting around and bouncing ideas around with them and solving problems together. They impress the heck out of me with all they've accomplished thus far.

Least favorite thing: Drama and politics. Every sphere of this job involves some of each. For drama, I process it on a case by case basis, and try to be as fair and understanding as I can.

For politics, I am usually completely clueless. Sometimes I'll talk to someone, and hours later realize there were hidden subtexts beyond my ability to comprehend and quickly respond to in the moment. I'm not sure if I'm poorly socialized, aloof, or both, but frankly a lot of the politics surrounding this job positively baffle me.

Unfortunately being successful as a professor seems to require political savviness, in a way very different than in industry. I felt like in industry the rules were clearer; perhaps because everyone was working toward the same goal (e.g., please the customer). Academia is more like a collection of small empires. We all have shared goals of Furthering Education and Advancing Knowledge, but go about them in very different ways. We have frequent encounters with other Dukes, where we must broadcast our land's contribution to the Kingdom at every turn.

Beyond the PhD, anyone with motivation and drive can learn to prep and teach a class, acquire external funding, effectively manage a research group, and publish lots of papers in good places. Political savviness, however, is another beast entirely.


RSA hack - Trouble with a capital T

It seems RSA was hacked today. This means that if you use one of those nice little SecurID fobs to connect to your corporate server or bank, it may have been compromised.

This is a big deal. Using two-factor authentication is an industry gold standard, and RSA is one of the most prolific manufacturers of such fobs.

Securious has a nice write-up of the fact vs. fiction surrounding the attack, including a note that this was an APT attack, not some random script kiddie in Germany.

I'm not trying to stir up panic here, but if you work with sensitive data, this might be a good time to add another layer of encryption to it*. There are lots of free solutions, like TrueCrypt, or if you're on a Mac the easiest thing to do is create a password-protected disk image. Remember not to use the same password for your encrypted disk partition that you use for anything else (logging in, email, etc.). But also don't lose this password - if you do, then your data is "irrevocably lost". Whee!

* Obviously all the "check with your (IT) doctor" disclaimers apply here.
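For the scripting-inclined, here is a minimal Python sketch of that extra layer using the third-party cryptography library's Fernet recipe (pip install cryptography). The file names are hypothetical, and this is an illustration, not a vetted security tool:

    from cryptography.fernet import Fernet

    # Generate a key once and store it somewhere safe. Lose it and,
    # just like the disk image password, your data is irrevocably lost.
    key = Fernet.generate_key()
    fernet = Fernet(key)

    # Encrypt the sensitive file.
    with open("sensitive_data.csv", "rb") as f:
        ciphertext = fernet.encrypt(f.read())

    with open("sensitive_data.csv.enc", "wb") as f:
        f.write(ciphertext)

    # Later, Fernet(key).decrypt(ciphertext) recovers the plaintext.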


Fashion Tips, Part I

I have recently been asked by several people to provide fashion suggestions for how to dress in professional settings. This is going to be a multipart essay - there is much to cover, and I'll make some more specific suggestions in future posts.

When in professional settings, it is good to dress professionally. Professional settings are defined as one's workplace, a conference, a job interview, giving a talk, etc.

However, defining "professional dress" can be tricky, as can selecting the right attire for a particular organization. I have worked for some organizations where professional attire meant jeans and T-shirts. But usually professional dress falls somewhere between "business casual" (button-down shirts, nice-looking pants, non-boots/non-sneakers*) and "formal" (suit, dress shoes).

The most important aspect of picking the appropriate level of professional attire is this: If you are inside the organization (i.e., employee), dress exactly as everyone else dresses, but if you are outside the organization (i.e., job candidate), dress one level up from what everyone else is wearing.

For example, if you work at a company where all the other employees wear a suit to work every day, you should wear a suit to work every day too. If they wear jeans, you wear jeans. It's all about blending in. You don't want to be noticed for your clothes - you want your clothes to be background noise to your brains.

Now, there is one exception here: if you want to get promoted, or to be seen as able to fill a role "higher" than where you currently are, dress a level up. So if you want to be promoted to project leader, dress like the project leaders do. If you want to be hired as a professor, don't dress like a graduate student at conferences. You want to be seen as a peer.

If you are outside an organization, for example, as a job candidate, you want to dress slightly better than what everyone in the organization wears. If they're all wearing jeans and sneakers, go one level up to "business casual". You probably don't want to wear a suit - especially if you're interviewing in Cupertino! If the employees wear a mix of business casual and jeans, then it's reasonable to wear a suit. Once you are employed you can figure out what to wear, but if you're an outsider trying to get in, dress slightly better than everyone.

If you don't know in advance what the standard attire is for the organization, err on the side of formal dress. People (including you!) take you more seriously when you are dressed up - there are peer-reviewed articles on this. :) I know some computer scientists who fiercely debate this and argue that the scruffy person in flip flops and torn jeans is always the smartest person in the room, but take my word: don't be scruffy as an outsider.

(*) Dear CS men: I beg of you, from the bottom of my heart, please do not wear those sinfully awful black sneakers (cf. this). I don't know which uber-geek started this trend, but he was wrong to do it - they are a fashion abomination. Go buy yourself a nice pair of Rockports, or something from the Walking Company. If you absolutely must wear sneakers, get a pair of Converse or some trendy Adidas or something.


How to be black

I really liked today's "conversation" in the Washington Post, "How to be black," with Baratunde Thurston. He's a comedian and writer, and wrote an autobiographical book, some of which is excerpted in a slideshow on the WP website.

Worth a look. I really loved #11 "How to speak for all black people", and #13 "How to be the black employee", because I think they are also applicable to being a woman or other underrepresented person in technology. (I think I had a post on this?.... ah, yes, I did.)


Letting papers go

A while back, a colleague and I wrote a paper and submitted it to a journal. The first round of reviews came back, and one reviewer told us our work was fatally flawed.

We went through a few rounds of back-and-forth with the editor, all the while repeating that Reviewer #7* was mistaken because of such-and-such reason.

Recently my colleague and I were examining our resubmission. My colleague drew a picture to clarify something, and I stopped dead in my tracks. "Holy crap, Colleague. Reviewer #7 is right! Our entire paper is irrevocably flawed."

We went through the data, checked a few things, and sure enough: fatal flaw. I'm not sure how I missed it the first time; I guess because I was not the first author and was busy doing other things when we first submitted it.

So - the 30-page paper goes in the trash. Clunk!

Now you might say, "But wait! Why can't you just fix that broken part? Write a big disclaimer within a limitations section?"

I can't fix it because it's wrong. The entire concept of the paper is flawed. Even with a disclaimer it would be disingenuous to publish this at all.

So I let it go. I'm not too sad, though. We actually re-designed how we'd do things to avoid this flaw in the future, and I am sure our next paper will be super fantastic when we write it. And in any case, there are always more great ideas out there.

----
(*) Not the Reviewer's actual number. 


I'm male, yet again!

I just got back yet another revision of a paper I reviewed, and once again, I am male! Check this out:
We'll fix XYZ... (also as pointed out by Reviewer 1 and addressing his comment as well).
And the first author is a woman, no less! For shame.

I wish I could write back that I am not a man. But that would surely out me, as, really, there are only N women in my subfield and you can count them on two hands.

I accept that in this day and age "guy" and perhaps even "man" are gender-neutral - I've given up on those battles. But "his" and "he" are most definitely masculine in English.

Interestingly, this is from the same journal whose editor called me "Ms." and my male co-author "Dr.", even though we are both still PhD students. And the re-invitation from the editor again called me "Ms.", but at least he didn't call me "Mr."

Anyway, this is all quite entertaining. I've decided I'm going to keep a scorecard of times I'm referred to as a male after giving anonymous reviews. New category and all.

This month we are batting .250. Watch out for that Mendoza Line, authors!*

(*) Yes, I just made two sports analogies. Maybe I am male!


And the Truthy Shall Set You Free

I've just read about a great project at Indiana called Truthy. (Here's a linky). The idea is simple: during the upcoming election, their system will detect smear campaigns on social networking sites in real time, and post visualizations of how each meme spreads over time. The goal is to prevent "astroturfing" - well-organized political campaigns masquerading as grassroots efforts.
"The team will then generate diffusion network images that visitors to Truthy.indiana.edu can view as groups of nodes and edges that identify retweets, mentions, and the extent of the epidemic...
Menczer got the idea for the Truthy website after hearing researchers from Wellesley College speak earlier this year on their research analyzing a well-known Twitter bomb campaign conducted by the conservative group American Future Fund (AFF) against Martha Coakley, a democrat who lost the Massachusetts senatorial seat formerly held by the late Edward Kennedy. Republican challenger Scott Brown won the seat after AFF set up nine Twitter accounts in early morning hours prior to the election and then sent out 929 tweets in two hours before Twitter realized the information was spam. By then the messages had reached 60,000 people.

Menczer explained that because search engines now include Twitter trends in search results, an astroturfing campaign -- where the concerted efforts of special interests are disguised as a spontaneous grassroots movement -- that includes Twitter bombs can jack up how high a result shows up on Google even if the information is false...

'One of the concerns about social media is that people are being manipulated without realizing it because a meme can be given instant global popularity by a high search engine ranking, in turn perpetuating the falsehood,' Menczer said."
Definitely a clever approach to the problem, and if you're a Twitter user, get involved!
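
For flavor, here's a back-of-the-envelope sketch of the core diffusion-network idea. This is emphatically not Truthy's actual code - the toy data and the choice of the networkx library are mine:

    # A rough sketch (not Truthy's code) of building a diffusion network
    # from retweets: an edge means "this meme spread from A to B".
    import networkx as nx

    # Hypothetical tweet records: (retweeter, original_author, meme_tag)
    retweets = [
        ("alice", "newsbot1", "#somesmear"),
        ("bob",   "newsbot1", "#somesmear"),
        ("carol", "alice",    "#somesmear"),
    ]

    G = nx.DiGraph()
    for retweeter, source, meme in retweets:
        G.add_edge(source, retweeter, meme=meme)

    # A burst fanning out from a handful of brand-new accounts looks very
    # different from organic spread - the degree pattern gives it away.
    print("Accounts seeding the meme:",
          [n for n in G if G.in_degree(n) == 0])
    print("Total users reached:", G.number_of_nodes())

Nine accounts, 929 tweets, two hours - a fan-out like that lights up a graph like this immediately.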


Eight little words inculcate imposter syndrome

The great Maria Klawe, ACM Fellow, AAAS Fellow, president of Harvey Mudd, wrote a surprisingly humbling and honest article in Slate on imposter syndrome.

In some ways, this type of article is good for young women in the field, because they figure if superstars like her can feel it, they can feel it too; i.e., "It's normal to feel this way."

Except, it's not normal to feel this way.

The reason we feel like we don't belong / aren't good enough is because we've been enculturated to believe this since Day 1. The message from the media is passive pink, and rarely are young women cast as the lead scientist in film and television. The whiz computer genius in a show usually looks like this:

[image: the stereotypical TV computer whiz]

"That doesn't look like me. Also, he seems really unhappy. I don't belong in computer science."

Readers protest, "But it's just TV! It doesn't matter!"

But it does. This is how kids choose careers. As much as we'd like to think that our annual science outreach visit to our children's classrooms hugely influences students' future career leanings, we're talking marbles vs. the Large Hadron Collider. Hollywood is it.

So the lucky few who manage to beat the cultural odds and enter our field anyway face one more major hurdle.

It's not the intellectual requirements of the job.
It's not work-life balance.
And it's certainly not babies!

Nope. It is eight little words that skewer you with a knife. Eight little words that knock you down in one fell swoop.

Eight little words that men never hear.

"You only got here because you're a woman".

Have you ever said this to someone? Have you ever thought this and not said it?

This is an awful, awful thing to say. Why? Because underlying it is the assumption that only men can do computer science. Why on earth would you think that?

I first heard these words as an undergraduate, from someone I thought was a close friend. I felt sick to my stomach. I had never felt imposter syndrome before that point. I loved technology, I was good at understanding how it worked, and how to make it do the things I wanted it to do. Up until that point, I assumed my strong technical abilities and grades were why I had been admitted into the program. Surely not my gender!

After I felt sick, I felt mad. Really mad! Who was this joker to tell me I didn't belong here? I'll show him.

Now, I'm fortunate, because I face adversity with stubbornness. It's just my nature. But most people are not like this. They get beaten down with a stick enough times, and they head for the hills. I can completely understand that, I've had my moments.

Here's the thing. Every time you say or even think these eight words, you're beating someone with a stick. You might think it's an innocuous statement, but really what you're saying is, "Go home, dumb little girl."


Don't be a boorish bear.


Being Brilliant vs. Writing Well

In the vast world of academic Computer Science publishing, I am about to tell you the greatest secret of all:

You can make up for lack of genius by being a good writer.

Being a good writer will never guarantee a paper acceptance. But I'll tell you, if your paper is like butter for a reviewer to read, it makes it all the more difficult for them to tear it apart. If your writing is crisp and clear and sharp and snappy, it makes reviewers feel joy in their hearts. Especially compared to the other poor abuses of the English language they had to slog through before your paper walked through the door.

It's actually quite easy to learn to write well. Here are some tips:

1) Practice, practice, practice. A blog can really help, actually. Twitter probably not so much. You want to aim for cogent prose.

2) Read a lot. Read well-edited publications - newspapers, magazines, journals. Journalists are excellent at grabbing your attention and keeping it. This skill is invaluable in scientific writing.

3) Less is more. You are not getting paid by the word here. (In fact just the opposite - many conferences have page charges if you go over the limit!). It is not necessary to give every gory detail. It is highly unlikely you need to paste code into your paper. Just convey the information that is most important - what is it you want people to take away after reading your paper?

4) Once you're confident, take some risks. I know your 3rd grade teacher told you all of these things about structure and topic sentences and a conclusion section and an outline section and all that jazz. But really you need to figure out your own style that best helps you convey clear ideas.

5) Proofread your paper very carefully before submitting it. I am shocked when I read papers with grammar errors, spelling errors, and typos, particularly from senior academics who are fluent English speakers. Take the time to proofread, or outsource. (Occasional errors are understandable, but a paper should not be littered with them).

6) Practice!


Achoo!

I feel strange not blogging much, like I didn't brush my teeth or something.

Life has been incredibly busy, though all for good reasons which I'll discuss here soon.

I think we need a panel not on work-life balance, but on blog-life balance. After housework, the blog is the first thing to go.

Anyway, it seems my first blog-a-versary is on Sunday, which is neat. I've enjoyed meeting so many of you online and interacting with you, reading your blogs, and learning from you. It's fun when people in meatspace mention some of your blogs at work, and I have to silently giggle under my thin veil of pseudo-anonymity.

Thanks for reading!


Dear Software Designers Near And Far

Dear Software Designers Near And Far:

[images: rows of cryptic, unlabeled toolbar icons] (*)


Because frankly I can't figure out a damn thing on any of your fancy new textless toolbars. Yes, I know the magnifying glass icon means zoom. I know the printer-looking icon means print (if I can see it). I know "X" means close. But that's it. I should not have to go through 18 menus to say "turn text labels on". I should not have to hover over every single picture to figure out what it means. Just tell me, with words.

Many people cannot read, and I respect that you want to make these interfaces accessible to them. But please make them accessible to me too.

Love,
FCS

----
(*) Since I actually do care about any readers who use screen reading software, these labels are meant to say: "Please, please, please put labels on buttons."
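
And to show I'm not asking for the moon, here is a minimal sketch of the fix - in Python/Tkinter purely because it's what I had handy, with placeholder commands standing in for real ones:

    # A toy toolbar whose buttons actually say what they do.
    import tkinter as tk

    root = tk.Tk()
    root.title("Labeled toolbar demo")

    toolbar = tk.Frame(root, bd=1, relief=tk.RAISED)

    # Each button gets a word. With a real icon you would also pass
    # image=some_icon, compound=tk.LEFT to show the icon *and* the text.
    for label, command in [("Print", root.bell),
                           ("Zoom", root.bell),
                           ("Close", root.destroy)]:
        tk.Button(toolbar, text=label, command=command).pack(
            side=tk.LEFT, padx=2, pady=2)

    toolbar.pack(side=tk.TOP, fill=tk.X)
    tk.Text(root).pack(fill=tk.BOTH, expand=True)
    root.mainloop()

Was that so hard?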


Equal Pay Day

Today on Scientopia I write about Equal Pay Day.


Why men still get more promotions than women

Very interesting article in The Harvard Business Review on male vs. female mentoring, and the difference it can make in business contexts.
All mentoring is not created equal, we discovered. There is a special kind of relationship—called sponsorship—in which the mentor goes beyond giving feedback and advice and uses his or her influence with senior executives to advocate for the mentee. Our interviews and surveys alike suggest that high-potential women are overmentored and undersponsored relative to their male peers—and that they are not advancing in their organizations. Furthermore, without sponsorship, women not only are less likely than men to be appointed to top roles but may also be more reluctant to go for them.
The article is a bit anecdotal in parts, but has some interesting underlying ideas that are grounded in research. I'm not sure how applicable it is to academic careers, but having a mentor who, in addition to giving you advice, can help sell you and your ideas to others (institutional peers, editors, etc.) is almost always, in my experience, a helpful thing.


Female Computer Scientists FTW

In other security news this week, Google is claiming China orchestrated some major attacks against Gmail users. No shock, but what I found interesting is that they were discovered by blogger and fellow female computer scientist Mila Parkour.

Kudos, Mila!

And this just in - apparently Sony! Soni! Soné! has been hacked again. With script kiddie SQL injection attacks. The PC World article says, "Sony seems to ignore compliance requirements and basic security best practices".

For shame, Soné, for shame. You should totally hire Mila to fix you up. After China I suspect a gaming network will be child's play.
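
Since we're on the subject, the "basic security best practice" Sony skipped is roughly a one-line fix: parameterized queries. A minimal sketch using Python's built-in sqlite3 module (table and data made up, obviously):

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE users (name TEXT, password TEXT)")
    conn.execute("INSERT INTO users VALUES ('mila', 'hunter2')")

    user_input = "nobody' OR '1'='1"  # the classic script-kiddie payload

    # WRONG: string formatting lets the payload rewrite the query,
    # which then happily returns every row in the table.
    leaky = conn.execute(
        "SELECT * FROM users WHERE name = '%s'" % user_input)
    print(leaky.fetchall())  # [('mila', 'hunter2')] -- oops

    # RIGHT: with a placeholder, the driver treats the input as data,
    # never as SQL, so the payload matches nothing.
    safe = conn.execute(
        "SELECT * FROM users WHERE name = ?", (user_input,))
    print(safe.fetchall())  # []

If a company that size can't manage the second version, "for shame" is putting it mildly.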


Reason #452 why women leave academia: macho students

Recently I was teaching a class of students, let's say on the topic of rubber ducks. I give the class some exercises related to rubber duckery and let them get to work. After a short while, I ask if there are any questions. Student A, one of my "hat boys" as I like to call them, replies.

Student A: "Well, it would be a lot easier for me to study rubber duckery if the bath was better implemented to support multiprocessor floatation devices and had a better internal physics model of how fluids move." (I'm just making things up here, but you get the idea).

I realize he's just trying to show off. But I know enough about the gobbledygook he's spouting to hold my own, so what I intended to say was something like, "Yes, blahblahblah is true, but this exercise is about rubber ducks, so don't worry so much about this other stuff." But before I can get a word in edgewise, Student B interrupts.

Student B: "Uh, no. Multiprocessor flotation devices were, like, so last year. Now the fluid dynamics blahblahblahblah."

Student A: "Uh, no! Blahblahblahblah."

Me: "Look, I - "

Student B: "Blahblahblahblah"

Me: "But if we just - "

Student A: "Well, actually, blahblahblah."

I keep trying to interrupt to tell them to quit chit chatting about this silly tangent and get back to work, but the two students keep ignoring me. The other students start snickering at the interchange. Finally, I put my hand on Student B's shoulder, because he's so engrossed in arguing with Student A he's not even making eye contact with me. And I say, "Let's talk about all of this later, and get back to rubber ducks."

Then Student B has the audacity to say, "But this is far more interesting."

Sigh. Clearly I need to go sign up for those assertiveness teaching classes. Or else start teaching undergrads who have less of a chip on their shoulder. Because I have to say, in moments like these, I honestly wonder why I'm interested in traveling down this path toward being a professor. Having to deal with hundreds of smarty-pants kids all at once just does not appeal to me right now. In general I actually enjoy teaching, but not moments like these.
