Categories
Intellectual history

What the Digital Dark Age Can Teach Us About Ancient Technologies of Writing

Editors’ Note: due to the disruption of academic networks and institutions caused by the ongoing Coronavirus pandemic, JHIBlog will shift to once-a-week publication for the time being, supplemented by a selection of older posts from our archives. We are grateful for our readers’ understanding, and hope to resume normal scheduling as soon as possible.

By guest contributors Sara Mohr and Edward C. Williams

In contemporary science fiction it is hard to avoid the trope of an archaeologist or explorer unearthing a piece of ancient advanced technology and finding that it still functions. This theme may have its roots in the way we often encounter artifacts from the ancient world—decayed but functional or legible, as material culture and/or as carriers of written language. However, the prototypical “ancient technology” in fiction often resembles the electronic information technology of our modern age. Keeping our modern technology active and functional requires orders of magnitude more energy than the neglect implied by ancient ruins—delivery of spare parts fueled by cheap energy, complex schematics and repair manuals, and even remote connections to far-off servers. The idea that our technology would work hundreds of years in our future without significant intervention is unbelievable. In a certain sense, a Mesopotamian clay tablet is far more similar to the ancient advanced technology found in media—if it’s in good enough condition, you can pick it up and use it thousands of years later.

Will the archaeologists of the future see the information storage of the digital age not as sources of knowledge about our time, but as undecipherable black boxes? The general problem of data preservation is twofold: the first is preservation itself and the second is the comprehensibility of the data. The BBC Domesday Project recorded a survey of British life in the 20th century on adapted LaserDiscs—a format that, ironically, requires considerable emulation (the process of enabling a computer to use material intended for another kind of computer) efforts to reproduce on a modern machine only 35 years later. This kind of information loss is often referred to as the coming of the Digital Dark Age (Bollacker, “Avoiding a Digital Dark Age”). Faced with the imposing pressure of a potential Digital Dark Age and the problematic history of modern data storage technology, perhaps it is time to rethink our understanding of ancient technology and the cultures of the past who were able to make their data last long into the future.

Scholars of the Ancient Near East are intimately familiar with the loss and recovery of written information. Our sources, written in the wedge-shaped script called cuneiform, are numerous and frequently legible despite being thousands of years old. Once the scribal practice that transmitted the script was interrupted, considerable scholarly work was required to reconstruct it, but the fundamental media of data storage—the clay tablets—were robust. Even then, many valuable tablets have been lost due to mishandling or improper storage. Despite the durability of the medium, once the system of replicating, handling, understanding, and deliberately preserving these tablets was lost, much information was lost as well.

But cuneiform writing is more than just the act of impressing words onto clay with a reed stylus; it is deeply rooted in the actions and culture of specific groups of people. This notion is certainly true of technology as a whole. An interrelationship exists between all elements of a society, and each constituent element cannot be considered or evaluated without the context of the whole (Bertalanffy, General Systems Theory: Foundations, Development, Applications, 4). Rather than focus on the clay and the reeds, it is necessary to take into account the entire “socio-technical system” that governs the interaction of people and technology (Schäfer, Bastard Culture! How User Participation Transforms Cultural Production, 18). Comparative studies of cuneiform writing and digital technology as socio-technical systems can inspire further insight into understanding ancient technology and illuminating why it is that humble cuneiform writing on clay tablets was such a successful method of projecting information into the future, as well as informing us about the possible future of our contemporary data storage.

Only recently have those who work regularly with cuneiform tablets studied the technology of cuneiform. Cuneiform styli could be made of various materials: reed, bone, bronze, or even precious metals (Cammarosano, “The Cuneiform Stylus,” 62). Reed styli were the most common because their glossy, waterproof outer skin prevented them from absorbing humidity and sticking to the clay. Another key part of scribal training was learning the art of forming tablets by levigating and kneading raw clay (Taylor and Cartwright, “The Making and Re-Making of Clay Tablets,” 298), then joining lumps of clay together in a grid pattern or by wrapping an outer sheet of clay around a core of a thin piece of folded clay (Taylor, “Tablets as Artefacts, Scribes as Artisans,” 11).

But cuneiform technology goes beyond the stylus and tablet and must include the transmission of cuneiform literacy itself. Hundreds of thousands of legible cuneiform tablets have been found and documented to date, with many more in the process of being excavated. With such perishable materials as clay tablets and organic styli, how is it possible that these texts have survived for thousands of years? Surprisingly, the answer may lie in how we think about modern technology, data preservation, and our fears of losing records to the Digital Dark Age.

The problem is growing worse, with more recent media demonstrating shorter lifespans compared to older media. We see a variety of different projects that look back to older forms of information storage as a stop-gap between now and the possibility of a Digital Dark Age. For example, The Rosetta Project, from the Long Now Foundation, has been collecting examples of various languages to store on a 3-inch diameter nickel disc through microscopic etching. With minimal care (and the survival of microscopy), it could last and be legible for thousands of years.

We tend to think that a return to older forms of information storage will solve the problem of the Digital Dark Age—after all, the ancient technology of stylus + clay preserved Mesopotamian data neatly into the modern era. However, such thinking results from an incomplete understanding of the function of technology as it applies to the past. Technology is more than just machinery; it is a human activity involving technological aspects as well as cultural aspects interwoven and shaping one another (Stahl, “Venerating the Black Box: Magic in Media Discourse on Technology,” 236). Regardless of the medium or time period, the data life cycle largely stays the same. First, people generate data, which is then collected and processed. Following processing comes storage and possibly analysis.

But in the end, we always have humans (Wing, “The Data Life Cycle”).

Humans are the key to why information written in cuneiform on clay survived as long as it did.  In ancient Mesopotamia, scribal culture meant copying the contents of clay tablets repeatedly for both practice and preservation. There are texts we know from only one copy, yet in order for that copy to survive several other copies had to have existed. The clay and the stylus did not ensure the preservation of cuneiform information—it was the people and their scribal practice.

It is somewhat surprising that the discussion of ancient technology has not yet embraced the social aspect that accompanies the machinery, especially when we so readily acknowledge its impact on modern technology. To avoid losing electronic data, users are exhorted to intercede and make regular backup copies. We also find that the history of obsolete technology is largely one of innovative technologies that died as a result of socioeconomic pressures (Parikka, “Inventing Pasts and Futures: Speculative Design and Media Archaeology,” 227). The technology was perfectly sound, but it was never a good match for the social and economic times of its release.

This need for protection from loss largely comes from the idea that “electronic writing does not have the permanence of a clay tablet” (Gnanadesikan, The Writing Revolution: Cuneiform to the Internet, 268). However, those of us who work with clay tablets are more than aware of the frightening impermanence of the medium. We are well acquainted with the experience of opening a box meant to contain a cuneiform object only to find that it has been reduced to a bag of dust. It has been said that more redundancy usually means less efficiency, but that does not hold for all circumstances. Mesopotamian scholars generated redundancy through productive training, and we now look to redundancy to save our digital future. However, redundancy was not a part of the physical technology, but rather of the surrounding cultures that used it.

At its core, writing is an information technology. It is a system of communication developed for use by particular groups of people. In the case of cuneiform, the scribe who wrote the latest known datable cuneiform tablet composed an astronomical text in 75 AD (Geller, “The Last Wedge,” 45). Despite being able to date its final use, the last wedge, we are still able to read and understand Akkadian cuneiform today. However, it was not the process of incising the wedge itself that made this continuity possible. Rather, it was the scribal culture of ancient Mesopotamia that committed to copying and re-copying over the course of millennia.

The possibility of a Digital Dark Age has the world thinking of ways in which we can adjust our cultural practice around technology. Examples from Mesopotamia highlight the importance of the connection between human activity and machinery in technology. We would do well as historians to take notice of this trend and use it as an inspiration for expanding how we study ancient technology like cuneiform writing to incorporate more on human attitudes alongside the clay and the reeds.

Sara Mohr is a PhD student in Assyriology at Brown University. Her research spans from digital methods of studying the ancient world to the social function of secrecy and hidden writing. 

Edward Williams (Brown ‘17.5) is a software engineer at Qulab, Inc, working on machine learning and computational chemistry software for drug discovery. He acts as a technical consultant for the DeepScribe project at the OI, developing machine learning pipelines for automated cuneiform transcription.


An Intellectual History of Their Own?

by guest contributor John Pollack

‘Tis the season. Not that season—but rather, the curious period in the United States between the holidays of “Columbus Day” and “Thanksgiving” when, at least on occasion, the issues confronting America’s Native peoples receive a measure of public attention. Among this year’s brutal political battles has been the standoff at Standing Rock Reservation, where indigenous and non-indigenous peoples from the entire continent have gathered to support the Standing Rock Sioux’s opposition to the Dakota Access Pipeline, the construction of which would threaten sacred lands. Although this conflict will not be a subject of discussion at every Thanksgiving table, at the very least the resistance at Standing Rock serves as a reminder of the very real environmental and political battles that continue to play out in “Indian Country.”

Standing Rock Protestors. Image courtesy of The Lakota People’s Law Project.

On October 13, 2016, I attended a lecture given by Winona LaDuke to open the conference “Translating Across Time and Space,” organized by the American Philosophical Society and co-sponsored by the Penn Humanities Forum. I was in an auditorium at the University Museum at the University of Pennsylvania, but Ms. LaDuke did not attend the conference in person. She spoke instead from an office at Standing Rock, where she is leading resistance to the pipeline. Ms. LaDuke’s remarks at a conference focused upon the study and revival of endangered Native languages were a reminder to me and other audience members that being a “Native American Intellectual” means being a political figure, a public voice speaking and writing in contexts of imperial expansion and ongoing legal, military, and economic conflicts over territory. We may date the creation of the term “intellectual” to the late 1890s, with Emile Zola’s public attack upon the French military for covering up the innocence of Alfred Dreyfus—but it is arguably the case that Native American public leaders, whatever labels we assign them, have been speaking truth to power since 1492.

Over the past year, a team at Amherst College, in conjunction with the Association of Tribal Archives, Libraries, and Museums; the Mukurtu project; and the Digital Public Library of America, has been planning a framework for a “Digital Atlas of Native American Intellectual Traditions.” This exciting initiative promises to develop a new set of lenses through which we may observe and connect the intellectual histories of America’s indigenous peoples, across time and across territories. All students of the “history of ideas” should welcome this extension of the boundaries of the field in new directions.  

From Collection(s) to Project

Collectors of books and documents can play surprising roles in shifting scholarly attention in new directions, and this project is a case in point. In 2013, Amherst College Library’s Archives and Special Collections acquired the Pablo Eisenberg Native American Literature Collection. Known now as The Younghee Kim-Wait (AC 1982) Pablo Eisenberg Native American Literature Collection, after its collector and the donor whose gift enabled the purchase, the collection, Amherst suggests, is “one of the most comprehensive collections of books by Native American authors ever assembled by a private collector.” (I would add that this is really a collection of mainly Native North American authors.) Few of the titles in the Eisenberg Collection are unknown or unique exemplars—but their assembly by one collector into one collection motivated Mike Kelly, Kelcy Shepherd, and their Amherst colleagues to investigate how such a collection might help reshape discourses about Native Americans and their intellectual histories.

 


 

Working outward from this impressive body of material, their project will create a framework drawing together “Native-authored” materials held in widely scattered repositories. They seek a digital solution to one of the problems researchers working in digital environments regularly confront: the difficulty of connecting related items across institutions. The authors note:

Search and retrieval of individual items allows for only limited connections between related materials, erasing relevant context. Tools for visualizing and representing these networks can ultimately provide even greater access and understanding, challenging dominant interpretations that misrepresent Native American history and obscure or de-emphasize Native American intellectual traditions.

Digital projects, I would add, can often exacerbate rather than reduce this effect of disaggregation and de-contextualization. Working online, we can easily fail to comprehend a collection of documents or printed materials as a collection, in which the meaning of individual items may be shaped by the collection as a larger whole. Some online projects select out particular items, extracting and featuring them—much as an old-style museum might present an artifact in a display with a rudimentary label, disconnected from its cultural origins. Others provide digital results in an undifferentiated mass. The immediate benefit of finding new materials online can feel impressive, but the tools for interpreting what we access can feel strangely limited.

The Digital Atlas, the authors argue, will fill a void, the current “absence of a national digital network for Native-authored library and archival collections.” Here they invoke that recurring librarians’ dream—the search for the perfect search tool. This can take the form of “union” catalogs that gather information from many places into one data source and make them easily searchable; or of “federated” searching, the creation of tools that straddle multiple data platforms and present results for researchers in a single, coherent view; or of the “portal,” an organized launching point that gathers disparate research materials together. Still to be negotiated, I imagine, is how this “national digital platform” will connect with other such “national” platforms, including the Digital Public Library of America.

Searching protocols represent only one of the challenges; the work of classification itself must be subjected to scrutiny. One of the project’s partners is Mukurtu, an open source Content Management System (CMS) that has been designed to encourage the cooperative description of indigenous cultural materials using categories designed by Native peoples themselves. Mukurtu, which describes itself as “an open source community archive platform,” provides tools allowing repositories to rethink the ways in which materials by or about Native peoples are categorized, cataloged, and accessed.

This new methodology will make “Native knowledge” more visible in collections held by libraries, archives, and museums:

The project will develop methods for incorporating Native knowledge, greatly enriching public understanding of Native culture and history. It will identify approaches for enhancing metadata standards and vocabularies that currently exclude or marginalize Native names and concepts. We will share this work with the digital library community and with Native librarians, archivists, and museum curators.

The project will “include both tribal and non-Native collecting institutions, building relationships between the two.” This promise to create new partnerships between academic and institutional collections and Native communities is a welcome vision of sharing and exchange. A number of institutions are redefining what the “stewardship” of Native documents or artifacts means and reconsidering the thorny question of who “owns” the cultural productions of Native peoples. At the American Philosophical Society in Philadelphia, for example, the Center for Native American and Indigenous Research has embraced a community-based methodology that actively shares indigenous linguistic collections with Native peoples and invites Native researchers to take intellectual if not physical ownership of these collections, wherever they reside.

This proposal’s creators have, for now, chosen to avoid a discussion of what is, and what is not, “Native-authored.” Authorship and authority are always contested domains, and Native authorship has been a subject of debate since the eighteenth century. Like African American writers, Natives have had to work with or against non-Native editors, printers, publishers, and of course readers. I hope that the Digital Atlas will give us new tools for studying these tensions and new ways to chart the impacts of Native author-intellectuals over time, in printed books, in periodicals and newspapers, at public events, and in letters.

Mapping an “atlas”

Another argument behind the Digital Atlas is that Native writing must be understood in its relationship to place: to location, to land, to social memory, and to the environment. At the same time, the authors insist that we cannot adopt a static spatial view but instead must focus on mobility—that is, on the connections between authors, texts, and routes.

The proposal poses this question: “What tools, methodologies, and data would be required to visualize and represent the networks through which Native people and authors traveled and maintained/produced Native space?” Data “visualization,” the use of mapping software to show nodes of activity and networked connections, has become a standard tool in the field of digital humanities and a frequent complement to scholarship in fields including book history, medieval and renaissance studies, and American literary studies. Indeed, Martin Brückner has recently argued that literary studies is in the midst of a widespread “cartographic turn,” noting the pervasive language of cartography—the map as tool and the map as metaphor—throughout the field.  

Given the project’s focus upon geography, visualization, and mobility, though, I confess that I find the Atlas’s emphasis that it will be a “national” product disappointing, if understandable—with its suggestion of a continuing focus upon the old familiar geography of the nation-state. I suspect that the project’s authors are well aware of this tension. Scholars like Lisa Brooks (an advisor to the Digital Atlas) and others have pushed us to think about the many routes along which Natives and their words have circulated: through territories shaped by geographic features and personal connections; along riverine networks; and over trading and migration paths that long antedate and overlap the national, state, or territorial borderlines drawn by European surveyors and colonial agents. Will the Atlas help us follow the movements of ideas along non-national paths and across networks other than those circumscribed by nations? I hope so.

Intellectual traditions, Intellectual histories

With its focus on assembling and mapping intellectual traditions, the Atlas proposal also makes the implicit argument that it is time to move beyond the old debate about the influence of the “oral tradition” and the impact of “written culture” upon Native peoples.

As Brooks and others have persuasively argued, anthropologists in the nineteenth and early to mid twentieth centuries often ignored the ways in which Native peoples used various forms of writing, including European ones, for their own purposes (cultural, literary, and legal), preferring instead to search for presumably older oral traditions that were somehow isolated from and uncontaminated by writing. Historians of Native America now question the dichotomy between oral and written. We must be particularly cautious about identifying the former as essentially Native and the latter as essentially Western or European.

In the European context too, the dichotomy has been questioned. Scholars including Roger Chartier and Fernando Bouza have pointed out the permeability of oral and written discourses within the European context and shown that these categories were both unstable and contested in the early modern period. Texts and images circulated through the social orders in complex ways, and oral, written, and visual forms maintained overlapping kinds of authority.

To be sure, European colonists, missionaries, and political leaders sought to create colonial regimes in which the written and the printed word would be dominant, even as orality continued to occupy an important place within their own cultures. Yet Native peoples in many regions, from Peru, to Mexico, to Northeastern North America often successfully retained their own highly developed cultures of oratory. And rather than classifying indigenous populations as peoples “without writing,” we have come to understand that the definitions of communication must be broadened to include the range of semiotic systems Native peoples used to share and exchange goods and information, and to preserve narratives and historical memory. Native peoples also adopted, adapted to, appropriated, or resisted European writing and print culture in a wide variety of ways.

But why, I wonder, will this be an atlas of intellectual traditions and not of intellectual histories? With this title, the project softens its potential impact upon the field known as intellectual history or the history of ideas. It seems to locate the project in an anthropological and not a historical mode. Native peoples, like peasants, workers, lower class women and other so-called “peoples without history” (to borrow Eric Wolf’s ironically charged phrase), are still too often relegated to the realm of tradition, and locked into a static past.

In 2003, Robert Warrior pointed out that the field of American Studies had only just begun to include the voices of Native American Studies scholars. We might now extend his point to encompass the field of the “history of ideas” or intellectual history. A search across the content of the Journal of the History of Ideas turns up not a single reference to Warrior or his work, and I am hard pressed to find a discussion in its pages of the “history of ideas” in Indian Country. Rather than assuming that the field’s concepts are too Euro-centric and have no bearing upon an equally complex but distinctly different realm of Native ideas and philosophies, I would prefer to work toward more common ground. We can expand the history of ideas to encompass Native American intellectual histories—while respecting Warrior’s call to maintain the “intellectual sovereignty” of Native America (Secrets 124).

I eagerly await the results of the Digital Atlas of Native American Intellectual Traditions. I look forward to studying its reimagined maps of American intellectual history, and to hearing more voices of the public intellectuals of Native America, past and present.

John H. Pollack is Library Specialist for Public Services at the Kislak Center for Special Collections, Rare Books and Manuscripts at the University of Pennsylvania. He holds a Ph.D. in English from Penn; he has published on colonial writings from New France and edited a volume of essays on Benjamin Franklin and colonial education. He is currently working on a monograph about the circulation of Native words in early European texts on the Americas.

Categories
Think Piece

Of Nuance and Algorithms: What Conceptual History Can Learn from Topic Modeling

by contributing editor Daniel London

Intellectual historians may be familiar with two general approaches toward the study of conceptual meaning and transformation. The first, developed by J.G.A. Pocock and elaborated upon by Reinhart Koselleck, infers the meaning of a concept from the larger connotative framework in which it is embedded. This method entails analyzing the functional near-equivalents, competitors, and antonyms of a given term. This “internalist” approach contrasts with Quentin Skinner’s “contextualist” method, which lodges the meaning of a term in the broader intentions of that text’s author and audience. Both of these methods tend to entail close, “slow” reading of a few key texts: in a representative prelude to his conceptual history of English and American progressives, Marc Stears writes, “It is necessary… to read the texts these thinkers produced closely, carefully, and logically, to examine the complex ways in which their arguments unfolded, to see how their conceptual definitions related to one another: to employ, in short, the strategies of analytical political theory.”

But what about the seemingly antithetical approach of topic modeling? Topic modeling is, in the words of David Mimno, “a probabilistic, statistical technique that uncovers themes and topics within a text, and which can reveal patterns in otherwise unwieldy amounts of material.” In this framework, a “topic” is a probability distribution of words: a group of words that often co-occur with each other in the same set of documents. Generally, these groups of words are semantically related and interpretable; in other words, a theme, issue, or genre can often be identified simply by examining the most common words pertaining to a topic. Here is an example of a sample topic drawn from Cameron Blevins’ study of Martha Ballard’s diary, a massive corpus of 10,000 entries written between 1785 and 1812:

gardin sett worked clear beens corn warm planted matters cucumbers gatherd potatoes plants ou sowd door squash wed seeds

At first glance, this list of words might appear random and nonsensical—but here is where a contextual and humanistic reading comes into play. Statistically, these words did co-occur with one another: what could the hidden relation between them be? Blevins labeled this set “gardening.” His next step was to chart this topic’s occurrence in Ballard’s diary over time:

[Chart: frequency of the “gardening” topic in Martha Ballard’s diary over time]

Clearly, this topic’s frequency tends to align with harvesting seasons. This is somewhat unsurprising, but note the significance: through mere statistical inference, a pattern of words was uncovered in a corpus far too large to be easily close-read, and the words’ relation to one another seems to bear out both logically and in relation to real-time events.
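The inference step is easier to see in miniature. The sketch below, a toy illustration in plain Python, takes a handful of words from Blevins’s “gardening” topic as given (discovering such a word list is the job of a topic-modeling algorithm such as LDA, which is not shown here) and reproduces only the charting step: the share of each month’s words that belongs to the topic. The diary entries are invented stand-ins, not Ballard’s text.

```python
from collections import Counter

# Toy stand-ins for dated diary entries: (month, text) pairs.
# The texts are invented for illustration; they are not Ballard's words.
entries = [
    (5, "planted beens and corn in the gardin"),
    (6, "sowd squash seeds warm and clear"),
    (7, "worked in the gardin gatherd cucumbers"),
    (12, "cold day no work out of door"),
]

# A subset of Blevins's "gardening" topic, taken as given here;
# a real topic model would have to discover this list statistically.
gardening = {"gardin", "planted", "beens", "corn", "sowd", "squash",
             "seeds", "gatherd", "cucumbers", "worked"}

# Count, month by month, how many words fall inside the topic.
topic_hits = Counter()
word_totals = Counter()
for month, text in entries:
    words = text.split()
    word_totals[month] += len(words)
    topic_hits[month] += sum(1 for w in words if w in gardening)

# Print the topic's share of each month's vocabulary.
for month in sorted(word_totals):
    print(month, round(topic_hits[month] / word_totals[month], 2))
```

Run on these made-up entries, the gardening share is high in the summer months and falls to zero in December, the same seasonal pattern Blevins found at scale.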

Another topic produced by Blevins’ algorithm, provisionally labelled “emotion,” looked like this:

feel husband unwel warm feeble felt god great fatagud fatagued thro life time year dear rose famely bu good

This might appear even more of a stretch, but Blevins quickly discovered that occurrences of this topic matched particularly “emotional” periods in Ballard’s life, such as the imprisonment of her husband and the indictment of her son.

These two examples encapsulate the three major features of topic-modeling techniques. First, they enable us to “distantly read” a massive body of texts. Second, they reveal statistically significant distributions of words, forcing us to attend humanistically to the historical relations between them. Finally, and most importantly, these topics emerge not from our a priori assumptions and preoccupations, but from “bottom-up” algorithms. While not necessarily accurate or reflective of the actual “contents” of a given corpus—these algorithms, after all, are endlessly flexible—they are valuable, potentially counterintuitive humanistic objects of inquiry that can prompt greater understanding and generate new questions. Practitioners of topic-modeling techniques have studied coverage of runaway slaves, traced convergences and divergences in how climate change is discussed by major nonprofits, and tracked the changing contents of academic journals. They have scanned the content of entire newspapers, and charted changes in how major public issues are framed within them.

While these applications only hint at the possibilities for topic-modeling for historians in a variety of fields, a growing number of practitioners are considering the implications of this technique for historians of ideas—with results that are already surprising. Ted Underwood examined the literary journal PMLA for insights into transformations in critical theory over the twentieth century, finding that articles associated with the “structuralist” turn were appearing earlier, and were associated with different sets of concepts (“symmetry” rather than “myth” or “archetype”), than has been assumed. Michael Gavin has brilliantly compared “rights” discourse in 18,000 documents published between 1640 and 1699, detailing the frequency with which different concepts (“freedom,” “authority”) and institutions (“church,” “state”) occur within this discourse. Topic-modeling enables him to distinguish what made 1640s “rights talk” different from 1680s talk, as well as the overlap of discourses of “power” with those of “rights”:

[Chart: overlap of “power” and “rights” discourses in Gavin’s 1640–1699 corpus]

Topic-modeling does not find the “best” way to analyze text. The algorithms are malleable. It does not take word-order or emphasis into account. It does not care about motive, audience, interest, or any of those pesky “external” contexts that Skinnerians see as essential to understanding conceptual meaning. On the other hand, “internalists” will nod appreciatively at the concerns that structured Gavin’s study of “rights” discourses. Which terms co-occur when a particular keyword is invoked? Which points of connection are made between keywords? Which words and concepts appear to be central, and which are more peripheral? Which words tend to be shared across keywords, and which remain site specific? They can also agree with a more general premise behind Gavin’s study: that concepts are defined by the “distribution of the vocabulary of their contexts.” The next step is to agree that these distributions can be compared mathematically. Once you agree there, we’re in business.
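The premise that vocabulary distributions can be compared mathematically is simple to make concrete. The sketch below uses cosine similarity, a standard measure of the angle between two word-count vectors; the two “rights” context vectors here are invented for illustration and are not Gavin’s data.

```python
import math
from collections import Counter

# Invented toy counts of words co-occurring with "rights" in two decades.
# The vocabularies and numbers are illustrative, not Gavin's corpus.
rights_1640s = Counter({"parliament": 9, "king": 7, "law": 5, "liberty": 3})
rights_1680s = Counter({"king": 8, "church": 6, "law": 4, "property": 5})

def cosine(a, b):
    """Cosine similarity between two sparse word-count vectors:
    1.0 for identical proportions, 0.0 for no shared vocabulary."""
    dot = sum(a[w] * b[w] for w in set(a) & set(b))
    norm_a = math.sqrt(sum(v * v for v in a.values()))
    norm_b = math.sqrt(sum(v * v for v in b.values()))
    return dot / (norm_a * norm_b)

print(round(cosine(rights_1640s, rights_1680s), 3))
```

On these made-up counts the two decades share only “king” and “law,” so their similarity lands near 0.5; identical distributions would score 1.0, fully disjoint ones 0.0. That drift between decades is the kind of difference Gavin measures at scale.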

Topic-modeling is, like the field of digital humanities more generally, in the phase of development Kuhn would have called “normal science”: developing and testing methodologies that derive from established disciplinary questions and paradigms, shoring up the tool’s reliability for more adventurous work to come. For this reason, much of topic-modelers’ current work could fall into the “so-what” category. Yes, we know people gardened more in the summer, and that a king would appear frequently in the same texts as “rights” and “power.” Conceptual historians, however, should not be so quick to dismiss topic-modeling as a gimmick. If we prize shedding conceptual blinkers and generating new theories and findings, we should be willing to shed a few methodological blinkers of our own.


What Does Early Modern Bibliography Have To Do with a Blog?

by Madeline McMahon

Conrad Gesner’s 1545 Bibliotheca universalis was a powerful tool for managing information. Like a Wikipedia dedicated solely to authors who had written in Latin, Greek, and Hebrew, the catalogue was intended as a companion for anyone trying to wade through what Gesner called the “harmful and confusing abundance of books” then available. Of course, Gesner’s own enormous book contributed to this overabundance—it even required the thematic indices of a separate, unfinished work, the Pandectae, to navigate the thousands of authors listed in it. Gesner, as Ann Blair has vividly shown, lived in a kind of information age before our own (Blair, Too Much To Know, especially 1-2, 13, 56, 162-3). Yet Gesner’s distrust of the multitude of poorly written material, and his consequent impulse to manage data, had a flipside: his attempt to be exhaustive reveals an equal impulse toward preservation, even at a time when there were more books than ever before.

It is this first information age’s panicked scramble to preserve that I want to explore here, since it seems more alien to our own experience of information overload. Gesner’s older contemporary and friend, the Englishman John Bale, also wrote reference books that were intended to preserve as well as digest knowledge. Bale collaborated with Gesner during his exiles under both King Henry VIII and Queen Mary. Bale worked for the talented Basel printer Oporinus, who in turn published Bale’s magnum opus, the Scriptorum illustrium Maioris Brytannie … catalogus (1557-59), a bibliographical compendium of British writers and their works. Bale’s Catalogus was organized chronologically and geographically, with indices serving as the reader’s “search” function and with additional lists—of writers who had written on the Book of Revelation (for Bale, the key text for understanding human history), of writers especially useful for composing a continuous narrative history of England, and of the priors general of the Carmelite order (Catalogus II:59).

Given Bale’s rabid anti-Catholicism (not for nothing did a historian of the next century dub him “Bilious Bale”), this latter list may be a surprise, even though Bale had been a Carmelite monk before precipitously and wholeheartedly converting to Protestantism in 1536 (ODNB). But Bale’s research into British libraries’ medieval books began during his time as a Carmelite. His wide-ranging knowledge of the location of medieval books, apparent in his notebook in the Bodleian library, benefitted from his itinerant career cataloging Carmelite monasteries’ libraries—perhaps all the more once those monasteries were dissolved (Andrew Jotischky, The Carmelites and Antiquity, chapter 7). For despite his confessional allegiance, Bale had one reason to mourn the dissolution of the monasteries: the accompanying dispersal of monastic libraries. Bale lamented the loss of these books to Matthew Parker, Elizabeth’s first archbishop of Canterbury and Bale’s employer: since the dissolution, he had found books “in stacyoners and boke bynders store howses, some in grosers, sopesellars, taylers, and other occupyers shoppes, some in shyppes ready to be carryed over the sea into Flaunders to be solde” (in Timothy Graham and Andrew Watson, The Recovery of the Past in Elizabethan England, 17). Bale’s own efforts to recover the books under Edward had been counteracted under Mary, and Matthew Parker assumed the aged Bale’s lifework, eventually bequeathing his own and many of Bale’s medieval manuscripts as the Parker Library to Corpus Christi College, Cambridge. Bale’s catalogue and other writings functioned like a shopping list for Parker’s team of scholars, helping them to identify books from the masses of material they found in private collections and cathedral libraries as well as to trace missing manuscripts’ provenance.

This desire to preserve the past crossed confessional boundaries—John Dee had pushed (in vain) for a national library, not so different from Parker’s, to be created under Catholic Mary with “great and speedy diligence” (John Dee’s Library Catalogue, 194). To compile books—whether in a book or a library—demonstrated early modern scholars’ faith in Pliny’s maxim, “no book so bad,” whatever its age, aesthetics, or ideology.

Among the many things this blog is not is an early modern reference book. And yet I hope that we can curate information in such a way as to preserve voices from the past and discover new ones in the field of intellectual history as we, too, sift through the sources of the past, and trace the meaning of old debates into the present.