price.mit.edu/blog

Archive for January, 2010

Read-Write Software

My favorite moments with free software are when I get annoyed with some manual task that a tool leaves me to do for myself, and then invent a feature that the tool should have to handle the task for me.

With any software, free or proprietary, if I’m lucky the tool might have a configuration system powerful enough to let me effectively add the feature from the outside. But with free software, I don’t need the authors to have anticipated my needs—I can reach into the guts of the software itself and change it to work the way I want. If it’s a friendly codebase or if I’ve hacked on it before, I may be able to add my change in a few minutes. And hey presto: software that does exactly what I wanted. It’s a lot more fun than praying to the vendor and waiting a few years, and it’s faster and more reliable too.

So it went with Git one night last October. I was repeatedly revising a branch with git rebase -i. A couple of points along the branch were marked as branches of their own, so every time I changed something I would have to either

  • rebase the full branch, then do a dance with checkout and reset to update the sub-branches, carefully typing the correct new commit IDs;
  • rebase the full branch, then muck with update-ref with the same care about getting commit IDs right (sketched below); or
  • rebase the first sub-branch, then use rebase --onto to move the next sub-branch on top of it, then rebase --onto again for the main branch.
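
To see why this was painful, here’s the second of those dances sketched in ordinary commands. This is my sketch, not the post’s: the branch names and commit IDs are hypothetical, with topic as the full branch and topic-a and topic-b marking commits partway along it.

  git rebase -i master topic        # rewrite the whole branch; topic moves, but
                                    #   topic-a and topic-b still point at the
                                    #   old, pre-rebase commits
  git log --oneline topic           # read off the new commit IDs by eye
  git update-ref refs/heads/topic-a <new-id-of-a>
  git update-ref refs/heads/topic-b <new-id-of-b>

One mistyped ID and a sub-branch silently points into the discarded history, which is exactly the care the first two options demand.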

What I really wanted to do was just

  • rebase the full branch, and tell the sub-branches to come along for the ride.

Fortunately I’d worked on the code for Git’s interactive rebase before—at Ksplice we push Git to its limits in six different directions, and rebase -i we push beyond the limits of stock Git—so I knew where to find the moving parts that could do what I needed. Four minutes after having the idea, I was happily using the new feature.

If you want the feature too, it’s up in my Git repository. Or you can wait until I get it upstream. Why haven’t I done that already? That’s another old story about software. My 4-minute, 4-line patch turned into 29 lines with documentation and proper error handling, then 147 lines to make the feature easy to invoke, and then 231 lines with test cases. So I just finished all that work today. Maybe you’ll see the feature in Git 1.7.1 this spring.

Written by Greg Price

January 25th, 2010 at 2:30 am

Posted in Uncategorized

The Soul of a New Machine

At Ksplice, we put a lot of effort and discussion into how we manage projects, in part because we know we aren’t as good at it yet as we’d like to be. Books by managers and books for managers lie scattered around the office and employees’ rooms. So imagine my surprise and delight the day after Christmas when I opened The Soul of a New Machine, Tracy Kidder’s 1981 classic about the machines that made the computer age and the geeks who built them, and discovered it was about project management.

Keith sometimes remarks that Ksplice needs a documentarian. The Soul of a New Machine is the result of a project that had a documentarian, one who produced prose. The Eagle project at Data General set out to build a new computer, a 32-bit version of the existing 16-bit Eclipse line, just as DEC raced into the lead in the minicomputer market with the 32-bit VAX. Tracy Kidder, a writer for The Atlantic looking to write a story about technology, connected with his editor’s old college roommate, the leader of the Eagle project, Tom West.

The story that Kidder tells is full of implicit lessons that are as current today as they were in the computing projects of thirty years ago.

Schedules. Everyone knows that computing projects run slow. When the Eagle project started in July of 1978, it set an insanely fast timeline to have the whole computer architected, designed, built, and debugged by April 1979. They didn’t make it, of course, and many team members expected that from the start. But at every stage West, the engineer in charge, insisted on treating the schedule seriously—”come on, this schedule’s real”—setting intermediate deadlines as if the April date would be met. And though April slipped, the project was done in October, just fifteen months after it started and still a rapid turnaround by anyone’s count.

Delegation. On a big project, the person in charge can’t do everything themselves, or even keep a close eye on all of the work. They have to rely on others to do it right. Good managers know this, but it’s nerve-wracking to actually put it into practice. West itched to get into the lab and start debugging the prototype himself, telling Kidder, “Rocking back here in my chair and talking about doing it is one thing, but it makes me worry. It gives me a nauseous feeling, because I’m not doing it.” Eventually, as one lieutenant puts it, he “gripped the arms of his chair and decided to trust” the engineer leading the hardware team.

Perfect vs. done. Before Eagle, some of the same engineers had worked on projects to build a 32-bit machine from scratch. None of these projects were completed. Eagle would be tied by backward compatibility to the 16-bit Eclipse, and at the outset some engineers saw the idea as “a kludge on a kludge on a kludge”, or “a paper bag on the side of the Eclipse”, and wanted nothing to do with it. Yet West convinced them all to sign on to the project anyway, and in the end it was the compatible computer that was completed, sold well, and rescued Data General.

It’s not only the lessons that seem not to have changed. Some of the engineers on the Eagle project recall how as undergraduates they would “stay up all night and experience … the game of programming”, and we can all think of people thirty years later who, like a few of them, “started sleeping days and missed all their classes, thereby ruining their grades.” One comforting thought: Carl Alsing, the engineer in charge of the microcode team, was one of those who actually flunked out of school.

Finally, a word about the writing. The technical exposition is incredible. On the one hand, the reviewer for the New York Times heaped praise on the prose that enabled him to “follow every step” despite knowing nothing about computers (and a reviewer writing in 1981 could mean that in a much stronger sense than any reviewer typing into their laptop today). From my very different perspective, I was fascinated to learn details about the faraway architectures and design constraints of a different era. And in 291 pages delving frequently into technical aspects of computer architecture, digital logic, and software, I never felt condescended to, and I found not one mistake.

Maybe Ksplice should get a documentarian after all.

Written by Greg Price

January 11th, 2010 at 1:27 am

Posted in Uncategorized

Math is hard

This week I got myself eaten by a math problem. Fortunately the problem is solved now, and I’ve escaped. Let me explain.

Two years ago, I was in Erik Demaine’s class about computational geometry, and heard about a problem that I wanted to solve. Suppose you have a convex polyhedron in ordinary 3-space. Then there is an intrinsic metric on the surface points of the polyhedron: the distance between two points is the length of the shortest path between them that runs along the surface. You can describe this metric to me without mentioning the embedding in 3-space, for instance by giving the shapes of a bunch of triangles and telling me which sides connect to which. The metric will certainly have these three properties:

  1. it’s homeomorphic to a 2-sphere: I can put the triangles all on a sphere preserving their interconnections, if I’m willing to stretch and bend them
  2. it’s a polyhedral metric: every point has a neighborhood with the same metric as a Euclidean disk (for most points) or cone (for the vertices)
  3. it’s locally convex: every circle of radius r has circumference at most 2πr (see the worked example below)
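
A quick worked example of properties 2 and 3, a standard one of my own rather than anything from the problem itself: take the surface of a cube. A point in the middle of a face is flat, and so is a point on an edge (unfold the two faces that meet there and its neighborhood is an ordinary Euclidean disk). Each corner, though, is a cone point where three right angles meet:

  cone angle at a corner: 3 · π/2 = 3π/2
  defect there: 2π − 3π/2 = π/2
  a circle of radius r around the corner: circumference (3π/2)r < 2πr, satisfying property 3
  total defect: 8 corners · π/2 = 4π, the total that discrete Gauss–Bonnet requires of a sphere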

Now suppose you hand me a metric in the form of a bunch of triangles and notes on which edges to connect to which, and I confirm that they satisfy properties 1, 2, and 3. The great thing is that this is actually enough to guarantee that the triangles can be put together in 3-space to form some convex polyhedron, and this polyhedron is unique. (Here it’s crucial that I be allowed to bend the triangles, but without stretching — like paper.) Soviet mathematician A. D. Alexandrov proved this theorem in the 1940s. But Alexandrov’s proof doesn’t say how I can actually go and construct this embedding, so it opens up an algorithmic problem: given a metric satisfying properties 1, 2, and 3, how can we construct the unique convex polyhedron with this metric that Alexandrov’s theorem guarantees us?

Just the previous year, mathematicians Alexander Bobenko and Ivan Izmestiev had shown how to build the polyhedron by solving a rather involved differential equation. I figured it should be possible to get an explicit algorithm by discretizing their differential equation, and put a time bound on it by showing that the discrete steps don’t have to be too small. I cornered math problem-solver extraordinaire Daniel Kane, we sat down for a couple of hours with the Bobenko-Izmestiev paper, and he agreed—this shouldn’t be too hard.

Well, we were half right. Our approach worked, and we obtained an algorithm with an explicit time bound, a polynomial in the size of the input and its geometric ratios. Voilà:

Theorem. Given a convex polyhedral metric M homeomorphic to a sphere with n vertices, ratio S between the largest and smallest distance between vertices, and defect (discrete Gaussian curvature) between ε₁ and 2π−ε₈ at each vertex, an ε₆-accurate ε₉-convex embedding of M can be found in time

Õ(n^(915/2) S^832 / (ε^121 ε₁^445 ε₈^617))

where ε = min(ε₆/nS, ε₉ ε₁^2/nS^6).

Now, that bound with its enormous exponents was great for getting people’s attention in talks. You don’t often see a time bound like that, and the reason is that to get that bound we had to pursue the problem and hammer it out piece by piece, assembling a diverse array of geometric lemmas, long past the point where sane people might have stopped. Daniel and I probably spent close to 100 hours meeting and working on the problem, some of it together with Erik, and I spent something like two or three times as long nailing all the details down in writing. The work became my master’s thesis, I presented it at the WADS conference this summer, and we submitted the paper to a journal. We breathed easy for a while.

Then we heard back from the reviewers. Turns out peer review is a good idea. One reviewer grabbed onto a particular subproblem that he apparently thought was more interesting than the whole rest of the paper. At each iteration we need to find an appropriate retriangulation of the surface. What we need is described by a well-understood concept, a “weighted Delaunay triangulation”. In the familiar Euclidean setting a Delaunay triangulation is equivalent to a “Voronoi diagram”, so we originally thought that some existing papers that described how to compute a Voronoi diagram on a polyhedral surface were enough to give us the weighted Delaunay triangulation we needed.

This reviewer demanded we give more attention to the issue, and he was right—the problem was harder than we realized. On a polyhedral surface things are more complicated than in the plane; for one thing, there can be infinitely many straight lines between two points, and to make a Delaunay triangulation you need to use the right ones. But in two marathon sessions this week and 2000 words of writing, Daniel and I worked out how to get an unweighted Delaunay triangulation from a Voronoi diagram, and then how to adjust the triangulation to handle weights. So the reviewer was right to think the problem more interesting than we had treated it, and now we’ve solved it. Since our new algorithm for Delaunay triangulation is straightforward and is known to run in a practical amount of time, it may prove to be the most reusable part of the paper.
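
If you haven’t met these objects: in the plane, a triangulation is Delaunay when no vertex lies inside any triangle’s circumcircle, and that condition is a determinant sign test (standard computational geometry, not anything special from our paper). With a, b, c in counterclockwise order, a fourth point d lies inside their circumcircle exactly when

  | ax−dx   ay−dy   (ax−dx)² + (ay−dy)² |
  | bx−dx   by−dy   (bx−dx)² + (by−dy)² |   >   0
  | cx−dx   cy−dy   (cx−dx)² + (cy−dy)² |

The weighted version works the same way after lifting each point p with weight w_p to the paraboloid point (p, |p|² − w_p): the weighted Delaunay triangulation is the projection of the lower convex hull of the lifted points, its dual is the power (Laguerre) diagram rather than the Voronoi diagram, and setting every weight to zero recovers the unweighted case. On a polyhedral surface, where even “straight line” is subtle, these correspondences are exactly what stop being automatic.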

Anyone interested can read more in the paper (pdf). The new Delaunay triangulation algorithm is in the last section.

So that was my vacation. How was yours?

Written by Greg Price

January 4th, 2010 at 1:32 am

Posted in Uncategorized
