2006 (old posts, page 1)

Yojimbo, hit and miss

A few people have commented about Yojimbo, including Brent, who gives it a place in his dock. The comments thread on his post has some good points. I've already paid for VoodooPad and put a lot of my brain in there, so I'm not moving my notes anywhere soon. This is worth noting if you're thinking of wading into the note-taking app market - if I can't move my data into your app, it's unlikely that I'm going to bother using it.

It's nice to see a real app shipped using Core Data - although the data model appears to be pretty simple. I wonder if they hit any snags developing it? I'd be interested to hear what kind of effort they put into it.

I have a couple of quick nits to pick, in case anyone cares: if you have a three-pane interface, the detail pane has to scroll when I hit the space bar, especially if it displays web content. Let me say that more clearly: if you display web content, you use key shortcuts that differ from Safari's at your own risk.

There's a useful info inspector for the items, but Cmd-I doesn't bring it up - it still tries to italicize something, even when no text is selected. That's a fit-and-finish issue I was surprised to see in a Bare Bones app.

Finally, it's simple and elegant - it gets the important things right - but I don't know if it's really solving a problem that many people have: casual users can store passwords and bookmarks already, and serious researchers have more powerful tools. Maybe the biggest missed opportunity is that having a single place for all of this data would be a great start towards making it available to other programs, which I think is where the next big leap in computing experience is - think a combination of the iLife media browsers and bookmark/note-taking apps, along with a bit of Onlife. That's what I think the future tastes like...

PLDI Papers I'm interested in, part one

I mentioned that I'd post about some of the papers I found interesting from this year's PLDI conference. Disclaimer: for the most part this is based on reading the abstracts only, so this shouldn't be considered a thorough review.

Session one is Transactions. I will probably look through these, especially the first paper, "The Atomos Transactional Programming Language" [1] from Stanford, because transactional memory and processing seem to be a consensus pick for the next big thing, and Burton Smith once told me that languages using transactional memory and invariants with respect to state are his bet for what can solve the parallel programming problem. (What problem? It's too hard to write good parallel code.) So, I want to see what a transactional language looks like.
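For flavor, here's a minimal sketch of the atomic-block style - not Atomos itself, just the same idea expressed with GCC's much later transactional memory extension (build with -fgnu-tm), so take the syntax as illustrative rather than anything from the paper:

```c
/* Illustrative only: the atomic-block style via GCC's -fgnu-tm extension,
 * not the Atomos language described in the paper. */
#include <stdio.h>

static long balance_a = 100;
static long balance_b = 0;

/* The whole block commits (or retries) as a unit, so no other thread
 * ever observes a half-finished transfer - no explicit locks needed. */
static void transfer(long amount)
{
    __transaction_atomic {
        balance_a -= amount;
        balance_b += amount;
    }
}

int main(void)
{
    transfer(25);
    printf("a = %ld, b = %ld\n", balance_a, balance_b);
    return 0;
}
```

The appeal is that the programmer states what must appear atomic and the system worries about how, which is exactly the property people are hoping will make parallel code easier to get right.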

There's a paper in the Compilers session that looks like a cool idea for improving analysis - "A Framework for Unrestricted Whole-Program Optimization" [2]. The abstract says they have a way for intra-procedural passes to work on arbitrary subgraphs of the program, so they're not just limited by procedural boundaries, and don't have to rely on inlining to optimize across calls. I'm curious what languages it supports, and how the scheme would work with dynamic languages.
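To make the inlining point concrete, here's a tiny made-up example of the kind of cross-call opportunity they seem to be after - today you typically only get this folding if the compiler happens to inline the callee:

```c
/* A made-up example of why optimizing across call boundaries matters.
 * Looking at scale() alone, the compiler has to keep the branch; once the
 * body is considered together with its caller, factor is known to be 2
 * and caller() reduces to x * 2. Traditional intra-procedural passes only
 * see this after inlining; the paper's framework (as I read the abstract)
 * aims to let passes work on arbitrary subgraphs instead. */
static int scale(int x, int factor)
{
    if (factor == 0)        /* dead once factor is known to be non-zero */
        return 0;
    return x * factor;
}

int caller(int x)
{
    return scale(x, 2);     /* the constant argument enables the folding */
}
```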

A paper about dynamic software updating, "Practical Dynamic Software Updating for C" [3] (project link), is also interesting, because it seems like a step towards the way things should work. Essentially, they compile a program so that it can be easily updated without stopping it, and they do it in a way that doesn't violate type safety and sounds reasonably efficient. It reminds me of Apple's ZeroLink and Fix & Continue (note that those aren't the first examples of such technology), and I'm curious how similar it is. Certainly I don't think Fix & Continue tries to guarantee type safety.
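The paper's approach is clearly more sophisticated than this (as noted above, it preserves type safety), but for a rough feel of what swapping code into a running process involves at the lowest level, here's a minimal dlopen sketch; the library name step_v2.so and the function step are made up:

```c
/* A minimal sketch of loading updated code into a running process.
 * This is NOT the paper's mechanism - just the simplest illustration of
 * calling through an indirection that can be re-bound without a restart.
 * Compile with -ldl; "step_v2.so" and "step" are hypothetical names. */
#include <dlfcn.h>
#include <stdio.h>

typedef int (*step_fn)(int);

int main(void)
{
    /* Load the hypothetical updated library. */
    void *handle = dlopen("./step_v2.so", RTLD_NOW);
    if (!handle) {
        fprintf(stderr, "dlopen failed: %s\n", dlerror());
        return 1;
    }

    /* Look up the new version of the function the program has been calling. */
    step_fn step = (step_fn) dlsym(handle, "step");
    if (!step) {
        fprintf(stderr, "dlsym failed: %s\n", dlerror());
        dlclose(handle);
        return 1;
    }

    /* From here on, calls go to the updated code without stopping the program. */
    printf("step(41) = %d\n", step(41));
    dlclose(handle);
    return 0;
}
```

Real dynamic updating also has to deal with in-flight state, changed type signatures, and picking a safe moment to switch over, which is presumably where the interesting parts of the paper are.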

The parallelism session should be interesting, and I'm most curious to see an abstract for "Shared Memory Programming for Large Scale Machines" [4] - I can't tell from the title whether they're introducing a new language or measuring an existing technique. I have a note to myself somewhere to look for a full copy of that paper.

Power has been a big deal in HPC and mobile devices for a while, and now it's everyone's problem, so "Reducing NoC Energy Consumption Through Compiler-Directed Channel Voltage Scaling" [5] caught my eye. I'm always interested to learn about power usage effects of different kinds of code, since I have found it to be satisfyingly unintuitive at times. (Maybe I should've taken more EE classes!) Also, this is a paper from Penn State, and I'm curious what research they've got going on back at my alma mater.

I'll probably read everything in the Runtime Optimization and Profiling session, but "Online Performance Auditing: Using Hot Optimizations Without Getting Burned" [6] is particularly interesting, since I know Brad Calder and his students do really good work, and I honestly didn't know what Jeremy was up to. I should probably be more social around the department. (These guys are at UCSD.)

OK, I'm not out of interesting papers, but I'm going to stop here for now. Check out the program and let me know what you think is cool - am I missing something really great?

References

[1] "The Atomos Transactional Programming Language" Brian D. Carlstrom, JaeWoong Chung, Austen McDonald, Hassan Chafi, Christos Kozyrakis and Kunle Olukotun.

[2] "A Framework for Unrestricted Whole-Program Optimization" Spyridon Triantafyllis, Matthew J. Bridges, Easwaran Raman, Guilherme Ottoni, and David I. August

[3] "Practical Dynamic Software Updating for C" Iulian Neamtiu, Michael Hicks, Gareth Stoyle and Manuel Oriol

[4] "Shared Memory Programming for Large Scale Machines" Christopher Barton, Calin Cascaval, Siddhartha Chatterjee, George Almasi, Yili Zheng, Montse Farreras, Jose Amaral

[5] "Reducing NoC Energy Consumption Through Compiler-Directed Channel Voltage Scaling" Guangyu Chen, Feihui Li, Mahmut Kandemir, Mary Irwin

[6] "Online Performance Auditing: Using Hot Optimizations Without Getting Burned" Jeremy Lau, Matthew Arnold, Michael Hind, Brad Calder

PLDI 2006 Papers

The technical program for PLDI 2006 is out now - there are certainly a lot of interesting papers in there. I'm looking through them now and will probably comment on a few of the ones I think are cool in another post.

PLDI is traditionally a very competitive conference with an emphasis on experimental results, and this year they received 169 submissions and accepted 36. PLDI stands for "Programming Language Design and Implementation", and covers compilers, languages and runtime systems.

There are some interesting workshops co-located with PLDI this year: a Workshop on Transactional Memory Workloads (WTW), a Workshop on Programming Languages and Analysis for Security (PLAS), and the first ACM SIGPLAN Workshop on Languages, Compilers, and Hardware Support for Transactional Computing (TRANSACT). Does it sound like transactional computing is hot these days? Yes it does...

Update: for historical reference, this year's 21% acceptance rate puts it right at the average, according to the ACM's data from 1995-2003.

On Reviewing

Like most students, I've been asked to review papers in my area (and a few that were pretty far outside it), and I always try to do a good job - this is definitely a golden-rule situation. If I don't take it seriously, I am absolutely convinced that karma will get me in the end, denying me a crucial publication that could have pushed me over the line to tenure.

I've also been lucky enough to have the fascinating experience of helping out with the Program Committee of a major conference, something students don't usually get to do. It's the kind of experience that really gives you perspective, and it's been harder to get upset about disappointing results since. Important decisions often come down to the quality of the reviewers and to practical constraints - for instance, you may have space for 12 papers in the area, and you might be looking at a paper that has two 'strong accept' reviews and one 'weak accept', but is pretty good. It seems like a borderline paper that might get in, right? But there are probably 20 others that got three 'strong accept' reviews - this paper has no practical chance unless someone champions it and the 'weak accept' reviewer wasn't very convincing.

The point of that little anecdote was that every review counts, even student reviews, and good reviews make the program committee's job a lot easier.

Off the top of my head, here are a few rules of thumb to reviewing:

  • Read the whole thing. It's only fair.
  • No matter what the questions on the review form say, always include a summary, in your own words, of the paper's main point and contribution. This really helps the author put your other comments in perspective.
  • Don't be a wimp. If you mean 'reject', say so. If you use 'weak accept', explain why.
  • If there's space to add comments to the program committee, use it. Especially if you could be convinced to change your opinion of the paper. That can be useful if another reviewer had a very different opinion and the committee needs to reconcile them.
  • You're reviewing for a specific venue - if the paper is good, but you can think of a better place for it, say that and name that place - maybe the authors won't have thought of it, and at least it'll soften the blow a bit if it doesn't make it.
  • Take the time to scan the references - if they cite their own or similar work, check it out. This could be the only way you can answer the novelty question - how else will you know if they wrote the same paper six months ago and only added one result just to get to go on a nice trip?
  • Be honest about your expertise - it can help with the decisions, and it's hard to tell whether you were dismissive because the paper was crap or because you didn't understand its importance.

Does anyone else have a good tip for reviewing? Let me know in the comments.

Focus

Maybe it's a little dramatic to think of it this way, but it has seemed like I have two computing personalities - the one that writes here about Macs, user-app programming and interfaces, goes to WWDC and hangs out with indie developers, and then the other one that actually gets paid - a Ph.D. candidate in Computer Science who works on compilers, performance tools and high-performance computing at UCSD and SDSC. OK, I don't get paid much, but that's who I am in real life.

In order to get real work done, I've had to cut back on the first guy, shelving a few projects I would love to release, and dropping out of sight for months at a time on the BibDesk project, which doesn't seem to miss me, really.

What this means is that I've been pretty silent here lately, which I think is a shame, because I'm hugely vain and love attention. And yet I don't post personal details. This is evidence that I am complex and fascinating. Nevertheless, you really have got to hear what I've got to say.

In order to help you with that, I'm going to start posting about research topics, both my own work and good papers I read or talks I go to, and hopefully some of it will be interesting. I have no idea how dedicated I will be to this, and it could get touchy - don't expect anything too controversial, since I do want to get hired and, like a fool, I used my full name as the domain for my blog.

Coming up next - some thoughts on peer review.

Universal I-Search

There are a couple of minor improvements in the pipeline, but I wanted to get a universal binary version of the I-Search plugin out before you noticed that it wasn't universal already. It's the same as the last version, just twice as fat. Get it here.

Next up for this project is to move it to the stalled leverage project.

Sending large files?

This problem comes up occasionally - I want to send some large files (hundreds of megabytes) to someone else. They're too big for email, I don't want to share them with the world, can't just burn a DVD, and neither of us has a computer that is constantly connected to the internet.

I usually resort to putting them up on my web space temporarily, under an obscure directory name, but that's not ideal, since I need to remember to take them down, and I have to make sure I tell search engines that they're not to be spidered.
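For the spidering part, a couple of lines of robots.txt at the top of the site are enough to ask well-behaved crawlers to stay out; the /outbox/ name here is just a placeholder for whatever obscure directory you actually pick:

```
# Hypothetical example: keep crawlers out of the temporary file drop.
# "/outbox/" stands in for whatever obscure directory name you actually use.
User-agent: *
Disallow: /outbox/
```

Of course, robots.txt is only a polite request - it doesn't keep the files private, which is part of why this workaround never feels quite right.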

Doesn't this seem like a problem that should have been solved already? It should be very simple to just send a file once to someone, asynchronously, and not have to worry about anyone else getting it, or taking it down, or going over your bandwidth limit or email attachment limit.

If you have a clever way of sending large files around, let me know in the comments. (Update, hours later: comments are enabled now; I don't know what happened there.)

Lapwarmer

New PowerBooks, whatever they call them. I'm not sure I'll be able to say "MacBook Pro" without a self-conscious wince.

4x faster is a nice number - the memory bandwidth numbers (6.3x faster) are certainly impressive. I'll have to avoid using one so I don't start hating my 1.25GHz G4.

A little disappointed that there's no massive design change - it's a bit thinner, but I'm still hoping someone will solve the core ergonomic problem of laptop screens - why can't I slide it up about six inches when I'm using it?

I love the claim that it'll be cooler and use less battery power, but I'll wait to hear what people say.

I think the killer 3rd party accessory for this will be a magnetic adapter for old power supplies - it's a nice new design, but lots of people have more than one power supply, and they're expensive to replace.

Takk… etc.

I bought three new albums with my iTunes gift card this Christmas, and I thought I'd share how they've turned out so far:

  1. Takk… by Sigur Rós. I bought this because I liked their last album, the parentheses one, and because John Allison's hilarious year-end album review seemed to imply it was more of the same: "Everyone is fooled!" I find that I do like it just as much, but unlike all those Untitled songs on the last one, I don't mind listening to these songs when the sun is out. A fun game for casual fans is to play songs from the two albums together at random and try to guess which album they're from. I call this the "Coldplay game".

  2. Late Registration by Kanye West. Everyone says it's great, and I don't listen to radio or watch MTV anyway, so I can't be annoyed by overplayed singles. I really like this album. Great for playing really loud while driving someone else's car across the desert at perfectly legal speeds. Luckily I live in Southern California, so nobody looks at me funny when my pale white fingers crank the volume on the way to my hockey game. I think my favorite track here is "Addiction".

  3. Final Straw by Snow Patrol. I've listened to it a few times already; it seems like there are a few great songs in there, a few I won't ever remember, and I have a nagging feeling that this will become another Josh Joplin Group, who I loved for three weeks and almost always skip past now. Which would be sad. If there was one song to get from this album, I think it'd be "Chocolate", not "Run", which was the big single. So there it is.