Richard Southall

Talk given at the meeting of the Association of European Printing Museums, Musée de l’imprimerie, Lyon, 13 October 2012.


The conservation of dematerialised type


The discussion that follows is based on a particular view of type conservation. On this view, a font of type can be said to be strongly conserved if all the elements needed for its production and use are themselves conserved, or can be accurately emulated.

In abstract terms, these elements are the following:

  • shape specifications for the characters in the font, and means of producing them;
  • realisations of the shape specifications, and means of producing these;
  • means of producing the components of a printing surface, and the components themselves;
  • means of assembling a printing surface from its components.

If a font is strongly conserved its types can be made, composed and printed from.

In these terms, many of the fonts of type which have been made by hand using traditional technology are strongly conserved. The character shape specifications are punches, cut by traditional methods. The shapes specified by the punches are realised as justified matrices, struck by hand or in a press and finished with justifier’s tools. The printing-surface components – the types themselves – are produced by hand casting, and the printing surface by hand composition.

With the arrival of hot-metal composing systems, type manufacture became industrialised; and drawings, rather than a craftsman’s understanding, became the starting point. The change from craft to industry adds a significant factor to the conservation picture. The hardware of the manufacturing process – drawings, patterns, punchcutting machines and matrix-striking presses – still exists and can be conserved, at least in principle. The additional challenge is now to conserve the knowledge needed to operate the machines correctly.

Photomatrices

Photocomposition redefines the conservation problem once again, as it redefined everything about type composition after its introduction in the 1950s. With the new technology, type, in the sense of three-dimensional assemblies of handleable objects, disappears. The shape specifications which are the starting point for photomatrix making are still drawings, but now in solid colour rather than the outlines used for making punchcutting patterns. While the drawings are relatively easy to conserve, the cameras used in the actual manufacture of photomatrices are not.

Early photomatrix cameras, rather than being objects like a punchcutting machine or a matrix-striking press, were structures: often as large as a room, with steel-framed panels to hold the character masters and steel or concrete bases for the mechanism that moved and exposed the matrix. Later ones, which often used two-stage reduction processes, were still too large and heavy to be at all easy to conserve.

Similarly, the composing machines used in the early days of the technology, apart from the traditional Monophoto, are effectively impossible to preserve in working order. The control units of the first direct photography machines used relay logic; later ones used discrete-component transistorised modules and the first integrated circuits. The skills needed to maintain them are dying out with the generation of engineers that worked on them. Once again, the challenge in conserving such machines is to preserve the documentation that explains how they worked and how they were maintained.

A photocomposing system produces arrays of character images on photographic film or paper, which are specific to particular jobs. Unlike metal type, they cannot be recycled. Unlike type again, they are almost never directly usable as a printing surface. Film output from a composing machine could be contact-printed to plate; output on paper had to be rephotographed in a process camera. Because time was usually precious, the output material of choice for the composing machine was very often resin-coated paper, developed with a quick two-bath process in which the image was stabilised but not washed. At best, the lifetime of this kind of output has to be counted in years rather than decades. Film may be preserved in publishers’ or printers’ archives; paper, from the early years of photocomposition at least, is effectively gone.

Photomatrices, on the other hand, are relatively common. Once again, it is important to conserve the information that goes with them as well as the matrices themselves. Hardly any of the matrices for direct-photography photocomposing machines carry any information about the widths of their characters. This is because in these machines the selection of the character to be photographed, the sizing of its image and its positioning in the line of text are all independent functions, performed by different parts of the machine. The matrix does not need to know about character width, because (unlike a hot-metal matrix) it does not know how big the eventual character image is going to be. All the calculations about the positioning of the image, which take its actual width on the output material into account, are done by the part of the machine that drives the escapement; and all that part needs to know is which character is coming along next, and how wide its image is.
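This division of labour can be modelled in a few lines of Python. The sketch below is an illustration only, with invented widths and units rather than the values of any actual machine: given the sequence of characters and a width table, the escapement drive computes where along the line each image is to be exposed, and the matrix contributes nothing to the calculation.

    # Hypothetical width table, in relative units of a 1000-unit design em.
    WIDTHS = {"H": 722, "a": 444, "m": 722, "b": 500, "u": 500, "r": 333, "g": 500}

    def image_positions(text, point_size, units_per_em=1000):
        """Return, for each character, the horizontal position (in points)
        at which its image is exposed along the line."""
        positions = []
        x = 0.0
        for ch in text:
            positions.append((ch, x))
            # advance by the character's set width, scaled to the chosen size
            x += WIDTHS[ch] * point_size / units_per_em
        return positions

    for ch, x in image_positions("Hamburg", point_size=10):
        print(ch, round(x, 2), "pt")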

In scanned-matrix photocomposition, on the other hand, the matrix does carry width information. In this technology character images on glass matrices are scanned by a combination of a cathode-ray tube and a photoelectric cell. The output images are written as successive vertical stripes of light, projected on to the output material from another cathode-ray tube by a continuously-moving optical system. The matrix has to carry width information, because the escapement as a separate piece of mechanism has disappeared.

Digital photocomposition

The bodies of printing types dematerialised in the 1950s with the arrival of photocomposition. Sizing lenses and the stepwise escapement followed them in 1967 with scanned-matrix composing machines. The matrix itself had taken its first steps along the same path two years earlier, when digital photocomposition arrived with Dr Rudolf Hell’s announcement of his Digiset machine in 1965. The first production Digiset was installed in Copenhagen in 1967. The fonts it used were written on magnetic tape. This carried information about the configuration of character images, which was stored as compressed sequences of binary digits in the composing machine’s controlling computer. In the machine itself, images were written on a high-resolution cathode-ray tube and projected on to the output material by a single fixed lens. Fast computer memory was very expensive at the time, so that the machine’s character repertory was limited, but this was not a great hindrance in the telephone-directory composition for which it was first used.

From the point of view of conservation, digital fonts are quite different from analogue ones: the photomatrices of direct-photography and scanned-matrix photocomposition. With photographic matrices it is not particularly difficult to get at character shape information, but without a working composing machine it is hard to use the recovered characters to compose text. With digital photocomposition the problem of strong conservation still has two parts, one more difficult than the other; but the easy part and the hard part are the other way round.

With digital fonts, recovering the font information is hard. It still seems to be possible to get data transferred from magnetic tape to more current storage media; but knowing how the data is encoded is crucially important for reconstructing the information in the font.
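By way of illustration: if one assumes that each scan line of a character image was stored as a sequence of run lengths (a common scheme in machines of the period, though the sketch below is hypothetical rather than the Digiset's actual format), decoding is trivial once the convention is known and hopeless otherwise. Everything depends on documentation: run order, line length and byte layout are exactly the kind of information that must be conserved alongside the data.

    def decode_run_lengths(runs, line_length):
        """Decode one scan line stored as alternating run lengths
        (background first), returning a list of 0/1 pixels.

        The encoding is an assumption made for illustration; without
        knowing the convention, the recovered bits are meaningless."""
        pixels, value = [], 0
        for run in runs:
            pixels.extend([value] * run)
            value ^= 1  # alternate between background (0) and image (1)
        if len(pixels) != line_length:
            raise ValueError("run lengths do not match the expected line length")
        return pixels

    # Example: a 12-pixel line with an 8-pixel stroke in the middle.
    print(decode_run_lengths([2, 8, 2], line_length=12))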

Once the font information is decoded, though, writing it out to reconstruct the character configurations, and making fonts to compose text, is straightforward. The PostScript programming language, and high-resolution laser imagesetters, offer everything that is needed. What is much harder, for the early cathode-ray tube machines at least, is to fulfil the proposed criterion for strong conservation completely, and accurately simulate the appearance of their output. It is easy to forget just how blurry and soft-edged the character images produced by some early digital composing machines were.
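As a rough indication of what such a simulation involves (an approximation, not a reconstruction of any particular machine's optics), the writing spot of a CRT can be imitated by convolving the reconstructed bitmap with a small kernel standing in for the spot profile. The real profile would have to be recovered from the machine's documentation or from surviving output.

    def blur(bitmap, spot=((1, 2, 1), (2, 4, 2), (1, 2, 1))):
        """Crude approximation of a CRT writing spot: convolve a 0/1 bitmap
        with a small kernel and return grey values between 0 and 1.
        The kernel is a stand-in for the real, unknown spot profile."""
        h, w = len(bitmap), len(bitmap[0])
        total = sum(sum(row) for row in spot)
        out = [[0.0] * w for _ in range(h)]
        for y in range(h):
            for x in range(w):
                acc = 0
                for dy in (-1, 0, 1):
                    for dx in (-1, 0, 1):
                        yy, xx = y + dy, x + dx
                        if 0 <= yy < h and 0 <= xx < w:
                            acc += bitmap[yy][xx] * spot[dy + 1][dx + 1]
                out[y][x] = acc / total
        return out

    # A hard-edged vertical stroke, softened as an early CRT might have drawn it.
    glyph = [[0, 1, 1, 0]] * 4
    for row in blur(glyph):
        print(" ".join(f"{v:.2f}" for v in row))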

Numerical photocomposition

The same two-part conservation problem occurs for the next generation of machines: cathode-ray tube typesetters using numerical fonts. The information in the font is now numerical descriptions of character outlines. The coordinates of points on the outline are stored in the font, and the machine uses one or another mathematical function to calculate the shapes of the curves that join them. The problems of recovering the font information are the same as for digital fonts, though now the storage medium is as likely to be 8-inch or 5¼-inch floppy disks as tape. Reconstructing the outlines is straightforward, as long as one knows the functions used to specify them. Rasterising them is straightforward as well. For machines like the Linotron 202, where the film or paper is in direct contact with the faceplate of a fibre-optic tube, the output is much closer to imagesetter output than that of machines which used a fixed relay lens to project images from a high-resolution tube with a flat glass faceplate.
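For example, if the stored coordinates are taken to be the control points of cubic Bézier segments (the curve family later adopted by PostScript; the early machines may equally have used arcs or other splines, which is exactly why the documentation matters), reconstruction is a matter of evaluating a simple polynomial:

    def cubic_bezier(p0, p1, p2, p3, steps=16):
        """Evaluate a cubic Bezier segment at steps + 1 points.
        One possible reconstruction only: other machines used other
        joining functions, and those must be known to recover the shape."""
        points = []
        for i in range(steps + 1):
            t = i / steps
            u = 1 - t
            x = u**3 * p0[0] + 3 * u**2 * t * p1[0] + 3 * u * t**2 * p2[0] + t**3 * p3[0]
            y = u**3 * p0[1] + 3 * u**2 * t * p1[1] + 3 * u * t**2 * p2[1] + t**3 * p3[1]
            points.append((round(x, 1), round(y, 1)))
        return points

    # A bowl-like segment reconstructed from four stored control coordinates.
    print(cubic_bezier((0, 0), (0, 55), (45, 100), (100, 100), steps=4))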

One way to look at outline fonts, and the open font formats that followed them, is as the final stage in a process of abstraction that starts with photocomposition. In metal-type technology the matrix specified the shape of the character image on the face of a type, and this in turn specified the image produced on paper by the printing press. In direct-photography photocomposition the type has disappeared, and the image on the matrix specifies the image on the output material directly. In scanned-matrix photocomposition the matrix image is dissected by the scanning electronics, and the image on the output material is built up immediately afterwards by the writing CRT. In digital photocomposition the dissection process is carried out, offline and ahead of time, by the type manufacturer, and the composing machine’s task is to reconstruct the character image, dot by dot or line by line, according to the specification in the font.

With outline fonts, the character images the composing machine produces are not specified directly by the information in the font. Instead, this provides material for rasterising hardware or a scan-conversion algorithm to work on, and it is the digital information that results from that process that specifies the image on the output material. Thus, in a sense rather less general than Donald Knuth’s use of the term in his system for mathematical typesetting, an outline font is a ‘meta-font’: a single source from which digital fonts that specify character images in a range of sizes can be derived. For Knuth, a metafont would yield ranges of weights and even styles as well as a range of sizes, according to the settings of its parameters. Adobe’s multiple master fonts, such as Twombly and Slimbach’s Myriad, approached this objective to some extent.
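A minimal sketch of such a scan-conversion step, assuming the outline has already been flattened from curves into straight segments, is the classic even-odd scanline fill. Production rasterisers add winding rules, hinting and dropout control, but the principle, from outline to a grid of dots, is the same:

    def rasterise(polygon, width, height):
        """Even-odd scanline fill of a closed polygonal outline
        (already flattened from curves to straight segments)."""
        bitmap = [[0] * width for _ in range(height)]
        n = len(polygon)
        for y in range(height):
            yc = y + 0.5                      # sample at pixel centres
            crossings = []
            for i in range(n):
                (x0, y0), (x1, y1) = polygon[i], polygon[(i + 1) % n]
                if (y0 <= yc < y1) or (y1 <= yc < y0):
                    # x coordinate where this edge crosses the scan line
                    crossings.append(x0 + (yc - y0) * (x1 - x0) / (y1 - y0))
            crossings.sort()
            for left, right in zip(crossings[0::2], crossings[1::2]):
                for x in range(int(left + 0.5), int(right + 0.5)):
                    if 0 <= x < width:
                        bitmap[y][x] = 1
        return bitmap

    # A 10-by-10-pixel square, as a stand-in for a flattened character outline.
    for row in rasterise([(1, 1), (9, 1), (9, 9), (1, 9)], 10, 10):
        print("".join(".#"[v] for v in row))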

It is clear that with digital and numerical composition, even more than with earlier technologies, conserving documentation is crucially important for conserving fonts. It will be difficult enough to recover information from a font storage medium, or to get a computer-controlled machine to work, even with the manuals: without them, the task is impossible.

The PostScript revolution

In the thirty years from 1955 to 1985, photocomposing machines went through several generations (exactly how many depends on how they are counted). On one reckoning there were six: ‘first-generation’ machines like the Monophoto; direct-photography machines like the Lumitype, with xenon flash tubes and sizing lenses; scanned-matrix machines with CRT output; digital CRT machines; numerical CRT machines; and the first laser imagesetters.

In the 27 years since 1985, on the other hand, there has really only been a single way of producing typeset output: raster imaging, using either scanning laser beams or travelling arrays of light-emitting diodes. Similarly, there has really only been one tool for specifying the layout and content of pages: the PostScript programming language. The fact that page specifications written in PostScript produce, in principle at least, the same results on any raster-scan output device opened the way to the desktop publishing industry, which has largely done away with compositor’s work as a trade with distinctive skills.

Open font formats: the end of conservation?

Like the CRT photocomposing machines they succeeded, raster-scan imagesetters use numerical fonts. There are two principal languages for specifying the character shapes the fonts contain: PostScript and its erstwhile competitor TrueType, released by Apple in 1991. Nearly all fonts supplied at the present day are in OpenType, which is a wrapper format into which TrueType or PostScript character shape specifications are packed. Interpreters for all three formats are built into every new computer operating system. Thus it can be argued that for typefaces designed after 1985 the conservation problem has disappeared – at least for the moment. It will resurface when raster-scan imagesetting becomes obsolete; but since imagesetters are universal machines which within the limits of their resolution will draw anything they can be programmed to draw, that seems unlikely to happen in the near future.
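To illustrate the point about the wrapper (the sketch assumes the open-source fontTools library and an invented file name; any tool that reads the sfnt table directory would serve), it takes only a few lines to see which kind of outline an OpenType font contains:

    # Inspecting an OpenType wrapper with the open-source fontTools library.
    from fontTools.ttLib import TTFont

    font = TTFont("SomeFace.otf")   # hypothetical file name
    tables = set(font.keys())

    if "CFF " in tables:
        print("PostScript (CFF) outlines packed in the OpenType wrapper")
    if "glyf" in tables:
        print("TrueType outlines packed in the OpenType wrapper")
    print("Tables present:", sorted(tables))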

Conclusions

The first conclusion that can be drawn from this discussion is that strong conservation of fonts for direct-photography and scanned-matrix photocomposition – conservation, that is, in which the ability to make new fonts and compose text with them using appropriate technology is preserved – is hard. This is because the conservation of matrix-making cameras is effectively impossible, and that of photocomposing machines of the period so difficult as to be effectively impossible as well.

The second conclusion relates to the digital and numerical fonts produced after the dematerialisation of the photographic matrix. These are assemblies of machine-readable data on storage media of some kind. The problem in conserving them is to read the stored data. Once it is read, reconstructing the information in the font and replicating it in fonts that can be used with present-day output devices is a problem in computer programming – though not necessarily a simple one.

The third conclusion follows from the second one: it is that conserving the documentation that describes how dematerialised fonts are formatted and stored is crucial to their preservation. Without this information the font data on a storage medium is useless, even if it can be read.

The final conclusion is a paradox: the easiest typemaking technologies to conserve are the oldest and the newest. This is because in both cases their underlying technologies – fine metalworking in the one case, and computer programming and laser imagesetting in the other – are well documented and well understood. It is also because they are both currently practised. Hot-metal matrices are still made in a few places; nobody makes photomatrices any more.

A final question

Tablet devices are undoubtedly replacing printed material in some applications. Do they have a place in the museum of printing? If so, where? Which ones? If not, why not?