Another avenue for acquiring ETD author-supplied metadata was to repurpose data supplied by ProQuest. Averkamp and Lee documented how the University of Iowa Libraries transformed ProQuest XML files using XSLT to create metadata that could be loaded into their online repository and used to create MARC records for their online catalog.
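The spirit of such an XML-to-metadata transformation can be sketched as follows. This is a minimal illustration, not Averkamp and Lee's actual XSLT: the element names and the crosswalk rules are assumptions standing in for a ProQuest-style schema.

```python
# Hypothetical sketch of mapping a ProQuest-style ETD XML record to flat
# Dublin Core fields. Element names (DISS_title, etc.) are illustrative
# assumptions, not a guaranteed rendering of ProQuest's schema.
import xml.etree.ElementTree as ET

def proquest_to_dc(xml_text: str) -> dict:
    """Map a ProQuest-like ETD record to a flat Dublin Core dict."""
    root = ET.fromstring(xml_text)
    # Each (source element, DC field) pair stands in for one crosswalk rule.
    mapping = {
        "DISS_title": "dc:title",
        "DISS_author": "dc:creator",
        "DISS_abstract": "dc:description",
    }
    record = {}
    for src, dc_field in mapping.items():
        node = root.find(f".//{src}")
        if node is not None and node.text:
            # Collapse runs of whitespace, a common cleanup step.
            record[dc_field] = " ".join(node.text.split())
    return record

sample = """<DISS_submission>
  <DISS_title>Soil Chemistry of  Central Pennsylvania</DISS_title>
  <DISS_author>Doe, Jane</DISS_author>
</DISS_submission>"""

print(proquest_to_dc(sample))
```

In a production workflow the same mapping would typically live in an XSLT stylesheet, so that the crosswalk can be edited without touching application code.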
Reeves described a process that Library and Archives Canada (LAC) used to harvest metadata with OAI-PMH queries, retrieving ETD Metadata Standard (ETD-MS) records for ETDs submitted from various Canadian universities through the Theses Canada Portal.
Using this method, LAC saved $95,000 in the 2006–7 fiscal year and expected progressively larger savings as more Canadian universities implemented ETD submission programs. McCutcheon and colleagues described an elaborate process at Kent State University in which a Perl script called ETDcat ran when it received an automatically generated notification from the OhioLINK ETD Center that an ETD had been submitted.
Library of Congress Subject Headings (LCSH) were assigned until 1964, though the headings were generally broad in scope.
From 1965 until 1974, LCSH were added only when a personal name, corporate name, or title of a work was present in the TD title.
Examples of MARC records for ETDs before and after the new procedure was implemented are provided, and time savings are quantified on the basis of studies conducted over a twelve-month period (three semesters).
Beginning in 1975, full subject analysis was performed, and LCSH were assigned, only for TDs containing the term Pennsylvania or a local Pennsylvania name (such as a town or county) in the title. With this workflow, the average thesis required ten to fifteen minutes to catalog, plus an additional five to ten minutes if referred for subject analysis. This relatively minimalist approach was designed primarily to balance providing sufficient access to TDs against the time spent on complicated subject analysis for what are generally very narrow and specialized subject areas.

The team also works closely with the Library Technologies Department to repurpose MARC records in The CAT for mass digitization partnerships such as HathiTrust and the Internet Archive. In October 2013 the team began looking at repurposing metadata from other platforms for use in The CAT. One promising source of metadata was Penn State's electronic theses and dissertations (ETDs) server. MarcEdit provides many default crosswalks for mapping between metadata schemes, and a specialized XSLT crosswalk, derived from the default DC-to-MARC crosswalk included in the MarcEdit installation, was used to convert the records into MARCXML. The paper also describes in detail the mappings created to harvest the metadata, the customizations made to the XSLT crosswalk, and the steps taken to ensure that the metadata batchloaded into The CAT is of sufficiently high quality.

Boock and Kunda also described the OSU experience but focused more on workflow changes and cost savings. Although the literature addressed multiple ways to acquire ETD author-supplied metadata, the variable and often substandard quality of this metadata emerged as a common theme. McCutcheon gave a good summary of the issues, noting that "the descriptive record created by automatic harvesting is only as good as the quality of the author-supplied metadata, which varies from author to author." Metadata quality issues included representation of scientific symbols and diacritics, separation of titles from subtitles, nonfiling characters in the title proper, capitalization, management of whitespace, spelling, and other data entry errors.