Lithography became an essential tool for materials research during the post–World War II computing revolution. Increasing computing power required shrinking circuits and packing transistors more tightly together. Lithography made it possible to write small, precise circuits on a semiconducting surface, setting the stage for modern computing and fueling Moore's Law, the observation that transistor density on chips has tended to double roughly every two years. But lithography was by no means a postwar development. It dates to the late eighteenth century and is notable as a technique borrowed for materials research from the storied and ostensibly distant craft practices of ink-based printing. What ties these disparate applications together — as...