The Metagraph

This week our reading discusses “the call for methodological transparency in historical writing” as presented by Gibbs and Owens in their essay within Writing History in the Digital Age (Gibbs and Owens). Their primary argument is exactly this: a call for transparency. They call for discussions of data queries, workflows, the production and interpretation of visualizations, explanations of productive failures, and a whole host of other things. Historians have been presented with a new hurdle to pass, as the sheer volume and ease of collection of data in this digital age lead to projects that come to a strong conclusion while the methods used to get there remain obscured within hundreds or thousands of data points. Methodological transparency allows for responses and inquiries into how the authors came to their conclusions, as well as into the project itself.

Gibbs and Owens also discuss the uses of data and harken back to the cliometricians, making clear that this is not what they are calling for. While they wish for data and methods to be transparent when writing up findings, they also wish for the data to show how they reached different stages in their project. They call for historians to treat data as text, not as a reason or evidence to support a claim. You should draw your conclusions from your data, allowing it to speak for itself, instead of finding data that corroborates the hypothesis you have already created. In their conclusion, Gibbs and Owens request that “just as historians learn to find, collect, organize, and make sense of the traditional sources, they also need to learn to acquire, manipulate, analyze, and represent data,” and this is a fitting final statement.

As a student, I welcome their call to arms for transparency. They mention Google Ngram and use it to show how vast the data at your fingertips on a personal computer can be. Transparency among scholars would aid me in the construction of my own projects, not just digital but traditional, by illustrating how professional scholars go about choosing their topics and arguments and finding and presenting their results. The most difficult part of my writing process is finding topics; transparency regarding how one might move from interesting data to a historical question and argument would be invaluable.

From here, our reading focuses on Cameron Blevins’ Mining and Mapping the Production of Space. Blevins provides us with a Methodology section that is exactly what the previous authors called for. Blevins addresses the problem of abundance, with the newspapers he examines containing nearly 2,500 issues and 130 million words. Blevins employed “distant reading” to track trends over long periods of time and vast numbers of issues, and luckily the University of North Texas Library had digitized these newspapers. This digitization allowed Blevins to use Optical Character Recognition (OCR) to mine the text of all of the newspapers. Blevins also addresses the problems with his particular methodology, in this case smudges, tears, and changing typefaces, all of which reduce the accuracy of the OCR. Advertisements and stories that continue later in the issue also present a problem to the computer, as it cannot distinguish between their location and the adjacent text, sometimes combining words. Blevins then removed high-frequency words such as “a” and “the” and began to count the occurrences of place names in the newspapers. This data set was used to create a map that visualizes the city’s changing position in the space of the Midwest, as well as the cities that Houston frequently addressed and spoke about.
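The general shape of this workflow can be sketched in a few lines of Python. This is only an illustration of the technique, not Blevins’ actual code, and the stopword list and gazetteer of place names here are hypothetical stand-ins for the much larger lists a real project would use:

```python
import re
from collections import Counter

# Hypothetical stand-ins: a real project would use far larger lists.
STOPWORDS = {"a", "an", "and", "the", "of", "to", "in", "on", "for", "at"}
PLACENAMES = {"houston", "galveston", "dallas", "chicago", "new orleans"}

def count_placenames(text):
    """Tokenize OCR'd text, drop high-frequency stopwords,
    then count occurrences of known place names."""
    tokens = [t for t in re.findall(r"[a-z']+", text.lower())
              if t not in STOPWORDS]
    counts = Counter(t for t in tokens if t in PLACENAMES)
    # Also scan adjacent token pairs so two-word names can match.
    for first, second in zip(tokens, tokens[1:]):
        pair = f"{first} {second}"
        if pair in PLACENAMES:
            counts[pair] += 1
    return counts
```

Counting place-name frequencies per issue like this is what makes it possible to map which cities a newspaper “speaks about” over time.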

All in all, I believe Blevins answered Gibbs and Owens’ call to arms regarding transparency and provided valuable insight into the mind of a scholar and the production and completion of a massive project. His use of OCR can be transferred to my research on how Emancipation Day, Memorial Day, and Olustee are discussed in Floridian newspapers.

The Historian’s Macroscope

Exploring Big Historical Data: The Historian’s Macroscope is an interesting book project created through the collaboration of S. Graham, I. Milligan, and S. Weingart. The interesting feature of this book is how it was produced, which was markedly different from other books. The book began as a draft online, allowing the public to see the drafting and writing process one page at a time. This meant that the authors, as well as readers, had continual input in the creation of the text. Comments on the webpage where the draft was held, tweets from Twitter, and a variety of other means allowed the authors and readers to actively engage in a conversation regarding the unfinished book. In my eyes, this process gave the book a sort of peer review prior to completion. This collaborative version of the text is the one available for free online; the authors state that the reason is for those who cannot afford a print version.

The print version of the book is nearly $40, and while I do prefer reading a tangible book, that price seems a bit steep to me. The print version seems to be a simple tenure checkmark: the authors put an incredible amount of time into this work, but without it in print many institutions will not count it toward tenure. I also believe that with the print version, the reader is missing some of the most interesting aspects and uses of the book: the application of digital tools to history. This book addresses the fluidity of digital scholarship in an interesting way by allowing the public to comment on and suggest alterations to the text, and while this might create a few controversies, it also forms a peer review board for the book. I think this is extremely important, and it is something that most people say digital scholarship lacks.

The authors are all self-proclaimed computer geeks. This work is a blend of their two passions, technology and history. While the origins of their love for each are different, their primary goal is to create a fusion of the two. Ian Milligan did this prior to the book at a THATCamp session, while Shawn Graham worked independently on his blog “New Tools for Teaching and Research in Archaeology.”

The Historian’s Macroscope is clearly aimed at students. The trend I’ve seen in authors’ writings on digital history is a focus less and less on theory; instead, the focus has become getting the new generation of historians on the bandwagon for digital history. The increase in digital publication only legitimizes the work that, in most cases, senior scholars have done, while also broadening the field. The authors also intend to introduce people to the “macroscope.” Their intention is simply to put the knowledge out there so that researchers and students who could benefit from the use of the macroscope may do so, where they may not have found it before. It is also thoughtful to provide a free version for broke college students!

The authors suggest working with multiple tools throughout the book, and I chose to create a word cloud of the transcript of the South Carolinian “Declaration of the Immediate Causes Which Induce and Justify the Secession of South Carolina from the Federal Union.” This is something I had read for a previous class in an attempt to draw out what we thought the causes of the U.S. Civil War were. The avoidance of the word “slave” in the document is interesting, while the emphasis on states’ rights easily shows the significance of the Lost Cause myth in this document.


Prior to creating this cloud, I had some trouble figuring out how to remove common English words and phrases such as “to,” “the,” and “for.” Basically, these simple words muddied up the data for the cloud, allowing no conclusions to be drawn from it. Luckily, Voyant includes a setting for exactly this kind of stopword removal. While this is a simple version of text mining, I think it would be interesting to mine newspapers from a period and try to reconstruct the world as its readers knew it. You would be able to see what businesses and other towns a community was speaking about during the era, which I think would be fascinating.
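The stopword filtering described above can be sketched in a few lines. This is a minimal illustration of what a tool like Voyant does internally before sizing the words in a cloud; the stopword list and sample sentence are hypothetical stand-ins, not the actual declaration text:

```python
import re
from collections import Counter

# Hypothetical stopword list; real tools ship much longer ones.
STOPWORDS = {"to", "the", "for", "a", "an", "and", "of", "in", "that", "is"}

def word_frequencies(text, stopwords=STOPWORDS):
    """Return term frequencies with common function words removed:
    the counts a word cloud would size its words by."""
    words = re.findall(r"[a-z']+", text.lower())
    return Counter(w for w in words if w not in stopwords)

freqs = word_frequencies("The rights of the states and the rights of the people")
```

With the function words stripped, substantive terms like “rights” rise to the top of the counts instead of being drowned out by “the” and “of.”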

The History Manifesto

This week we are studying the incredible debate surrounding The History Manifesto, by Jo Guldi of Brown University and David Armitage of Harvard University. The primary argument of their manifesto is that our society needs to shift its focus away from the “short-termism” we are accustomed to and return to the long-term thinking that was present in society a century or two ago. This short-termism is present not just in history but in politics and economics as well. History presents a particular problem for the authors, as it is their field of study, one they wish to revolutionize. In wishing to recapture the history of the longue durée, they argue that historians should never have moved past this stage.

I personally fall right in the middle of the time-scale argument when I am studying memory. The event that informs a memory most definitely falls into the brand of micro-history, but the memory itself spans many decades. Still, a lifespan of study is considered too narrow by the authors of this book, something I find ridiculous. Micro- and macro-histories are equally significant, but the authors seem to equate great with long. The reviewers of The History Manifesto, Cohen and Mandler, most definitely agree on that. In their words, “Guldi and Armitage persistently equate long with significant” (AHR Exchange 536). Both brands of history have their pros and cons; I think it is just a matter of how you utilize the history and the methodologies within them.

One argument I had discussed with some classmates was the idea that living in the technological age has created our crisis of short-termism by constantly overloading the reader with information. While I believe it can be particularly challenging to find the correct materials to study when there is so much available, I also believe that this age of technology is what has allowed the micro-histories to be so great! The availability of information has allowed unprecedented work to be completed; just look at Philip Curtin’s Census. This project has allowed a deeper understanding of the Atlantic Slave Trade that many had never thought possible, and the collected and easily accessed information has brought a number of new analyses to a subject that many had only superficially addressed beforehand.

The availability of information seems to be the exact reason why Guldi and Armitage are calling for the return to the longue durée. The new trend of digital history has recently allowed historians to process an unbelievable amount of data through text mining and other methods. The work Dr. French did with Dr. Staley is a prime example of this, as is the work of Visualizing Emancipation. Both of these works analyze an incredible amount of information, only possible through digital means, but each falls into what the authors would call short-termism. The scope of these projects is not hundreds of years; instead they focus on a little less than a century. In my opinion that scope is extremely impressive, and these works offer the same benefits as a longue durée approach.

The arguments presented by Guldi and Armitage do not particularly move me as a historian; instead, they reveal to me that the authors have a flawed understanding of the interaction of long- and short-term histories. I believe that the long-term histories of the 1970s have invaluably informed the short-term ones of today. With new methodologies and theories on how history should be written, the long term will always see new publications, but these publications may not be as vast in scope as the authors would wish. The longue durée has shortened itself in the age of information: instead of covering centuries, it now covers decades. That is my opinion of the shifts in the field, and it is something I believe the authors are discounting.