2014 NDSR Symposium Recap
The NDSR 2014 Symposium, “Emerging Trends in Digital Stewardship,” ended on the afternoon of April 8th with positive reviews from both attendees and panelists.
Many thanks go to the National Library of Medicine for opening their campus and conference facilities; to all of the speakers and panelists for their thoughtful remarks and enthusiastic participation; to the CLIR/DLF staff for hosting an excellent reception in the evening; to the Library of Congress for their funding; to NDSR resident Heidi Dowding for creating the website; and to all of the other NDSR residents for their work putting this event together.
Finally, special thanks are due to NDSR residents Maureen McCormick Harlow and Lauren Work, who took leadership roles in planning and coordinating the event.
Below is a summary of the opening and closing remarks and the panel discussions. A recap will also appear on The Signal. NLM staff recorded the conference, and the audio will be made available soon.
Demonstration: Christopher (Cal) Lee and BitCurator
The Symposium opened with remarks and a demonstration of BitCurator from Cal Lee, a professor at the UNC-Chapel Hill School of Information and Library Science. He opened with a quote from Edmond Locard, “every contact leaves a trace,” a canonical statement in digital forensic science. He then expanded on how that principle works in the digital world, where every one of our actions leaves a digital fingerprint: the creation of documents, ‘likes’ on Facebook and ‘favorites’ on Twitter, IP addresses and network packets, cached copies of content, system logs, hidden metadata (EXIF data in photos, for example), Windows Registry entries, and so on, down to the traces of files we have deleted.
Digital forensics tools seek to capture those traces, preserving the context and content of our activities from the smallest magnetic flux transition to the largest aggregation of complex objects. BitCurator, the tool Cal Lee demonstrated, is an open-source digital forensics desktop that aggregates and unifies several digital forensics tools and lets the user work with them through either the command line or a graphical user interface. It is designed to fit into existing digital preservation environments (see the wiki) and runs in its own Linux environment, either installed directly or as a virtual machine.
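To make the idea of “traces” concrete, here is a minimal sketch (not BitCurator’s own code, just an illustration in Python’s standard library) of the kind of evidence a forensics workflow records for a single file: its size, its filesystem timestamps, and a fixity checksum.

```python
import hashlib
import os
import tempfile
from datetime import datetime, timezone

def file_traces(path):
    """Collect a few of the 'traces' a forensics tool records for a file:
    size, timestamps, and a fixity checksum."""
    st = os.stat(path)
    with open(path, "rb") as f:
        digest = hashlib.sha256(f.read()).hexdigest()
    return {
        "size_bytes": st.st_size,
        "modified": datetime.fromtimestamp(st.st_mtime, tz=timezone.utc).isoformat(),
        "accessed": datetime.fromtimestamp(st.st_atime, tz=timezone.utc).isoformat(),
        "sha256": digest,
    }

# Demo: even creating a throwaway file leaves recordable traces.
with tempfile.NamedTemporaryFile("w", suffix=".txt", delete=False) as tmp:
    tmp.write("every contact leaves a trace")
    tmp_path = tmp.name

traces = file_traces(tmp_path)
print(traces["size_bytes"], traces["sha256"][:8])
os.remove(tmp_path)
```

A real forensics environment captures far more (deleted-file remnants, embedded metadata, full disk images), but the principle is the same: record the evidence before it changes.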
The Signal has posted about BitCurator before and, for more information, I would highly encourage a glance at Trevor Owens’ interview of Cal Lee, as well as some time on BitCurator’s webpage, wiki, and the many webinars and videos posted about it. More and more, librarians and archivists are developing widely available, user-friendly tools for accomplishing some of the more important or difficult tasks in digital preservation.
Panel Discussion: Social Media, Archiving, and Preserving Collaborative Projects
The first panel began with an opportunity for panelists to introduce their projects. Laura Wrubel, Electronic Resources Content Manager at George Washington University/Gelman Library, spoke about her team’s work developing the Social Media Feed Manager application, an open-source platform which anyone can use to help them create personal archives of targeted Twitter streams for preservation or research. Current users include professors, students, and GWU itself – as Laura pointed out, a large amount of student and university activity now takes place on social media instead of on static web pages, and so GWU has begun to collect content streams related to student groups, sports, university events, and the university hashtags (#GWU) for its own archive.
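As a rough sketch of what a harvesting tool like Social Media Feed Manager does at its core (this is not the application’s actual code, and the field names are simplified assumptions), capturing a social media stream amounts to writing each post out as a self-contained record, one JSON line at a time:

```python
import io
import json

def archive_posts(posts, out):
    """Append each captured post as one JSON line, keeping the fields a
    researcher might later need. A real harvester would also retain the
    complete raw API response for provenance."""
    count = 0
    for post in posts:
        record = {
            "id": post["id"],
            "user": post["user"],
            "text": post["text"],
            "created_at": post["created_at"],
        }
        out.write(json.dumps(record, sort_keys=True) + "\n")
        count += 1
    return count

# Demo with made-up posts; a real harvester would pull these from the
# Twitter API on a schedule.
sample = [
    {"id": "1", "user": "gwtweets", "text": "Move-in day! #GWU",
     "created_at": "2014-04-08T09:00:00Z"},
    {"id": "2", "user": "gwtweets", "text": "Symposium underway #NDSR",
     "created_at": "2014-04-08T10:30:00Z"},
]
buf = io.StringIO()
n = archive_posts(sample, buf)
print(n)
```

The line-per-record format makes the resulting archive easy to append to over time and easy for researchers to process with ordinary text tools.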
Leslie Johnston, Chief of Repository Development at the Library of Congress, spoke about the Library’s efforts, begun 14 years ago, to archive the web, and about some of the challenges created by the ever-changing technology behind social media websites such as Facebook, Myspace, and Twitter. The Library, she noted, is now called upon to develop entirely new technologies in order to carry out its traditional mandate: to collect and preserve the record of human activity for future scholars.
Janel Kinlaw, Librarian at National Public Radio, spoke about the NPRchives project, a collaboration between the NPR Digital team and the NPR Library to highlight historic radio stories from the NPR archives starting in 1984. Working on this project and pushing it out to their audience through social media channels, she explained, has often been a lever for enriching existing metadata, rediscovering valuable legacy content, and transforming how her organization works together.
Several key themes emerged from the panel discussion: the need to develop metrics of success for social media and collaborative projects (qualitative or quantitative, but most importantly, shared); the need to change how we train our users, since new kinds of data, new amounts of data, and new ways of interacting with data may require new skills; the question of how such projects can be leveraged to break down internal work silos and teach us again how to collaborate across our organizations; the need to imagine what ephemeral and changing platforms will emerge next for us to preserve and study; and the need to carefully consider how we define “perfection” versus “function” for the tools we create and use.
Panel Discussion: Open Government and Open Data
After lunch, the second panel of the day commenced, featuring panelists with a deep knowledge of and passion for open data and open government issues. Daniel Schuman, from Citizens for Responsibility and Ethics in Washington (CREW), spoke about his organization’s work advocating for better legislation around government open data, and as he put it, “shamelessly pandered to the audience” by opening with the phrase, “librarians and archivists are the unsung heroes of the open data movement.” That is, they know where all the information is buried and keep it safe for future discovery.
Nick Shockey, from the Scholarly Publishing and Academic Resources Coalition (SPARC), also demonstrated his knowledge of the audience in describing his organization’s advocacy for more open systems of scholarly publication. The success of PubMed (created and managed by the conference host, the National Library of Medicine), he noted, has been a critical success story for his group to bring into congressional offices when pushing for more and similar platforms for public access to scholarly data.
Jennifer Serventi, from the National Endowment for the Humanities, opened by saying “I am the Government, and I am The Man,” but quickly pointed out how critical it was for change to come also from within the government. She also spoke about her organization’s important work in keeping the voice of the humanities present and relevant in discussion of open data and access.
Themes that emerged in this panel include the importance of “cross-pollination” between the technologists and developers, who know how to manipulate and utilize the information, and the librarians and archivists, who know where the information is; the importance of bringing the crowd back in, that is, showing them how the data helps their work and how they can contribute to the conversation; supporting data literacy while we advocate for more open data policies, for there would be little reason to provide open data if our communities did not have the skills in statistics, quantitative analysis, and data mining, and the knowledge of legal frameworks and ethical considerations, necessary to use it; and finally, the need to focus on getting better legislation passed, such as legislation that might expand re-use rights and avoid creating negative incentives for publishers.
The panel closed with a few remarks from each speaker on how librarians can help advocate for open data policies and initiatives in their own institutions. Librarians, it was said, could help advocate by understanding the objections to open data, understanding the history and context of their institutions and the “reasons behind the reasons” for any objections, and by finding the allies and critical supporters for open data policies in their communities in addition to those from the libraries.
Panel Discussion: Digital Strategies for Public and Non-Profit Institutions
The third and final panel of the day addressed digital strategies for cultural heritage institutions. Eric Johnson from the Folger Shakespeare Library began with a discussion of the new five-year strategic plan for his institution, a plan that entailed the reorganization out of which his position was created, and which has the first goal of “Building the Research Library of the 21st Century.” Technological initiatives, such as expanding the Folger’s online infrastructure to reach an ever-widening audience, were a large part of that goal, but so were goals such as continuing to enrich the Library’s physical collections.
Matthew Kirschenbaum from the Maryland Institute for Technology in the Humanities (MITH) then spoke about MITH’s work developing standards and testing new methods for working with personal archives that include born-digital materials, mentioning specifically MITH’s work on the Deena Larsen and Bill Bly collections. Accepting and working with born-digital personal archives was an upcoming challenge mentioned by panelists throughout the day.
Carl Fleischhauer and Kate Murray from the Library of Congress followed by speaking about the Library’s strategic direction, Carl giving everyone a rich look at the history of the Library’s digital asset programs (reaching back to 1994 and the American Memory Project) and Kate discussing some of the more recent initiatives, including the Federal Agencies Digitization Guidelines Initiative (FADGI) and the National Digital Stewardship Alliance (NDSA). They discussed the benefits of having the resources of such a large institution behind their efforts, and about the sometimes complicated process of getting an initiative that was created in one office effectively pushed out across all Library offices, and then to the public as well.
Once again, we heard interesting points made about what libraries and archives should consider as they plan for the next five to ten years. One of the points was that, with the rapid pace of technological change, developing a clear vision of what is coming in five or ten years may no longer be possible. Instead, library and cultural institutions will have to become comfortable with a different level of uncertainty and risk, and may need to reshape how they accomplish their work so that this level of uncertainty works in their favor, rather than against it. A related point was that this uncertainty can in fact be a catapult, rather than an anchor. When libraries and archives have the right people in place, they can capitalize on the unexpected successes and failures of projects to shape dynamic new initiatives that continue to push their institutions forward.
The panelists also commented on whether the project-based model for digital library initiatives is still effective, or whether libraries should consider a more widespread re-organization to make digital projects part of the quotidian workflow. Panelists offered thoughtful arguments for each perspective. On the one hand, project models can be nimble, isolate risk, and enable experimentation that leads to valuable results and sometimes even more valuable side effects, such as the professional development of staff involved in new initiatives. On the other hand, projects can be difficult to sustain both financially and technologically, and can leave an organization in which both projects and “normal” operations proceed at a snail’s pace because staff can contribute time to each only intermittently. A global reorganization to incorporate projects into regular operations can provide valuable, dedicated support to ongoing digital efforts.
In the end, two themes rose to the surface of the day. The first was that one should always recognize the primacy of the physical collections in library and cultural institutions, and build a deep commitment to them alongside a commitment to digital initiatives, whether those are managed as projects or as part of regular operations. The second was that an organization will always need a long-term strategy, no matter how muddy the waters of the future. And while this strategy must have room to incorporate uncertainty and recalibrate as it encounters change, it is vital to have one in place in order to baseline an organization’s performance, know whether it is succeeding or failing, and both learn from failures and build upon successes.
Jeffrey S. Reznick, Chief of the History of Medicine Division at the National Library of Medicine, ended the conference with closing remarks that included the importance of keeping one foot in the past but always looking to the future, and having a vision so that any challenge that comes our way can be reframed as an opportunity. These opportunities may be a new way of collecting, new things to collect, new methods of connecting with an audience, new technologies and services to provide, new ways of doing business within our organizations, or new ways of training the next generation of leaders. The trends we see now will be different in five years. But we can always look for them and see in them a way to inform and enrich our traditional missions – to collect, preserve, and provide access to the record of human achievement.
The audio of the program will be posted soon along with transcripts, courtesy of the generous staff at the National Library of Medicine.
Furthermore, the Twitter feed was very active and contains not only quotes, thoughts, and reactions, but also valuable links to some of the websites and white papers the panelists mentioned in their discussion. NDSR resident Emily Reynolds also set up a TAGS explorer and archive, and Kirsten Mentser set up a Storify.
There will be another look at the symposium in an upcoming post on The Signal; Emily Reynolds has posted her take on her blog, and others may add their thoughts as well. A list of the current residents can be found at this link.
-Julia Blase, NDSR resident at The National Security Archive. Editing provided by Maureen Harlow, NDSR resident at the National Library of Medicine.