Scholarly Communications in an Electronic Age
ASIST 2003 Panel Session
• Bradley Hemminger
• School of Information and Library Science
• University of North Carolina at Chapel Hill
• bmh@ils.unc.edu
Public Storehouse of Knowledge
• Multiple open digital archives, holding all the world’s knowledge. A single logical universal archive, created by dynamic federation of all public archives.
• Contains everything: archive holds grey literature (publicly deposited) and gold literature (refereed articles).
• No barriers to access. Knowledge is freely available to anyone, any time, anywhere.
• Access to information and knowledge correlates with society’s quality of life.
[Diagram: public archives (Virginia Tech ETD, arXiv, University of California Electronic Repository, psycprints, UNC Chapel Hill) feeding a harvester (NeoRef).]
Archive Model (NeoRef)
• All material and metadata are author contributed to a public OAI archive (author retains ownership).
• OAI archives have an automated or manual moderator to filter out “junk”.
• Everything (articles, reviews, comments, indexings, etc.) is stored as digital content items on the archive using the same mechanism. Reviews contain a quantitative score, a qualitative grade, and qualitative comments.
• All materials are universally available via search engines that harvest metadata from OAI archives.
• Retrieval is through a Google-like, one-stop-shopping search interface, with dynamic filtering based on metadata and reviews to limit hits to a manageable number to review.
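A harvester in this model pulls Dublin Core records from each archive over OAI-PMH. A minimal sketch of the parsing step, assuming the standard oai_dc response format (a real harvester would also fetch the XML from each archive's endpoint and page through resumption tokens; repeated DC elements are collapsed here for brevity):

```python
# Parse Dublin Core metadata out of an OAI-PMH ListRecords response.
import xml.etree.ElementTree as ET

OAI_NS = "{http://www.openarchives.org/OAI/2.0/}"
DC_NS = "{http://purl.org/dc/elements/1.1/}"

def parse_records(xml_text):
    """Return one {dc_element: value} dict per harvested record."""
    root = ET.fromstring(xml_text)
    records = []
    for rec in root.iter(f"{OAI_NS}record"):
        # Keep only Dublin Core elements, stripping the namespace prefix.
        meta = {el.tag[len(DC_NS):]: el.text
                for el in rec.iter() if el.tag.startswith(DC_NS)}
        records.append(meta)
    return records
```

The harvested dictionaries are what the search layer later indexes and filters on.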
Challenges are in Retrieval
• All material is archived (good and bad).
• Metrics (some new) are used to differentiate type, content, and quality.
• Dynamic searching allows quickly finding materials of most interest. Search on:
– Type: article=Review AND date > 1950
– Content: schizophrenia AND GeneX
– Quality: peer-reviewed {journals}, citation rate > XYZ
How Peer Review might work…
• Author submits article to her institution’s open archive (DOI uncch:sils/0007548.pdf).
• Author “submits” to journal EMEDICINE by providing DOI of article.
• Journal Editor schedules two reviewers. Reviewers review article, and submit their reviews (cornell:0191.pdf, ucb:0084.pdf).
• Author revises, places the revised article (DOI uncch:sils/0007957.pdf) on the archive, and submits this final version to EMEDICINE.
• Journal submits its review (EMED:0023424.pdf), which is the final statement from the journal (editor) and indicates acceptance of uncch:sils/0007957.pdf as an EMEDICINE article.
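The walkthrough above can be sketched as plain data: each article, review, and editorial decision is its own content item on some archive, tied together only by DOI references. The DOIs are taken from the slide; the class and field names are illustrative, not a real NeoRef schema.

```python
from dataclasses import dataclass, field

@dataclass
class ContentItem:
    doi: str                    # unique identifier on the home archive
    kind: str                   # "article", "review", or "decision"
    refers_to: list = field(default_factory=list)  # DOIs this item is about

# The author's draft and the revision that replaces it.
draft = ContentItem("uncch:sils/0007548.pdf", "article")
revision = ContentItem("uncch:sils/0007957.pdf", "article",
                       refers_to=[draft.doi])

# Referee reports and the journal's final decision are content items too,
# pointing back at the versions they judge.
reviews = [ContentItem("cornell:0191.pdf", "review", [draft.doi]),
           ContentItem("ucb:0084.pdf", "review", [draft.doi])]
decision = ContentItem("EMED:0023424.pdf", "decision", [revision.doi])
```

Because every item carries the DOIs it refers to, a harvester can reassemble the whole editorial history without the journal hosting any of the files.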
Scholarly Communications Process: Today’s Example
Idea → V1
Present to colleagues → V2
Present at conference → V3
Submit to journal → V4
Referees’ revision for journal → V5
Journal final revision → V6
Revision to update analysis → V7
Revision to include additional new results → V8
Scholarly Communications Process: What’s Captured Today
Journal final revision → V6
Only one version is captured, and the same community then pays to buy back access to the article.
Scholarly Communications Process: What I’d Like to See Saved!
Idea → (formulate) → V1
→ (discussion) → Present to colleagues → (discussion, revision) → V2
→ Present at conference → (comments) → V3
→ Submit to journal → V4 → (two peer reviews, author revision) → Referees’ revision for journal → V5
→ (copy proofing) → Journal final revision → V6
→ (criticisms, new thoughts, revision) → Revision to correct analysis → V7
→ (new results, revision) → Revision to include additional new results → V8
Comments attach at every stage.
Change the Process!
• Think of scholarly communication as a continuous process instead of a single product (journal publication).
• Capture significant changes/versions of a work.
• Include all criticisms and comments about the work (at all stages).
• Support normal scholarly discourse, including authors’ responses as well as others’ comments.
• Add reviewers’ quantitative ratings of material to allow better filtering based on an absolute quality metric during retrieval.
• Add machine (automated) reviews.
Can we save the Gold and Grey?
[Same process diagram as in the previous slide: versions V1-V8 with formulation, discussions, two peer reviews, author revision, copy proofing, criticisms, new thoughts, new results, and comments at every stage.]
NeoRef Storage Model
[Diagram: every artifact is a ContentItem, stored the same way. Material expressing content includes the local PowerPoint presentation (v2), conference paper (v3), journal submission (V4), journal final revision (V6), and the revision with additional results and analyses (V8), together with comments on V3 and V6, two peer reviews, a machine review, and indexings (automated, author, auto-indexing). Contributors range from the author to recognized experts; tiers run from open (anyone) through filtered (moderated) grey literature to top tier (keep forever).]
Selected Technical Challenges
• Self Contribution
– Author indexed.
– Author-supplied metadata (Dublin Core) as part of the authoring process (i.e., not separate after the fact).
– Automatic extraction of metadata from the document.
– Archive file(s) must be in a standard open format. NeoRef: PDF/A with DC elements in tags for automatic extraction of metadata. Expect migration to XML as we continue to divorce content from presentation.
Challenges
• Searching
– DC metadata to allow coarse discovery.
– Specialized searching within a domain after locating material (based on a metadata field indicating the appropriate search interface).
– Interactive searching to allow refinement to the most desirable set within a few seconds. Use reviews to help filter searches (Faculty of 1000).
– Google searching on full text (covers all materials, but generates a large number of hits; lower specificity).
– Automated agents to bring material of interest to your attention (California Digital Library).
• Example: article score > 7.0, refereed, citation count above 10, type=research article, search terms = schizophrenia, GeneX
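The example query above is straightforward to express once metadata and review scores have been harvested into simple records. A sketch of the dynamic-filtering step, with field names assumed for illustration rather than drawn from any real OAI schema:

```python
def dynamic_filter(items, terms, min_score=7.0, min_citations=10,
                   kind="research article"):
    """Keep refereed items whose review score, citation count, type,
    and search terms all match the query."""
    def has_terms(item):
        text = (item["title"] + " " + item["abstract"]).lower()
        return all(t.lower() in text for t in terms)
    return [it for it in items
            if it["refereed"]
            and it["score"] > min_score
            and it["citations"] > min_citations
            and it["type"] == kind
            and has_terms(it)]
```

Each added constraint (score, citations, type) narrows the hit list toward a manageable number, which is the point of filtering on review metadata rather than full text alone.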
Challenges
• Knowledge Representation
– Extend DC to include “concepts” and “claims” (ScholOnto) to allow higher-level searching compared to simple indexing.
– Make OAI and DC representations more robust by always supporting DOIs to uniquely identify materials.
– Support unique identification of authors as well.
– Make all submitted content items permanent.
– Use DC fields to link related items, e.g., a new version of a paper to the old version.
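Linking versions with plain DC fields might look like the fragment below: the new version’s record carries its DOI in dc:identifier and points at its predecessor via dc:relation. The title and the use of an unqualified dc:relation are illustrative; DCMI defines refinements such as isVersionOf for exactly this purpose.

```xml
<oai_dc:dc xmlns:oai_dc="http://www.openarchives.org/OAI/2.0/oai_dc/"
           xmlns:dc="http://purl.org/dc/elements/1.1/">
  <dc:title>Example article (revised)</dc:title>
  <dc:identifier>uncch:sils/0007957.pdf</dc:identifier>
  <!-- link this version back to the one it supersedes -->
  <dc:relation>uncch:sils/0007548.pdf</dc:relation>
</oai_dc:dc>
```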
Challenges
• Rights Administration
– Support mechanisms that allow authors to set permissions as they desire, and enforce them.
– NeoRef supports Creative Commons through the DC rights element.
– OAI recently added support for rights administration using Creative Commons (and is looking at how to handle collections, etc., where the DC rights element may be insufficient).
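In the same spirit, the DC rights element can carry the Creative Commons license URI directly in the harvested record, so the author’s chosen permissions travel with the metadata. The particular license here is just an example:

```xml
<oai_dc:dc xmlns:oai_dc="http://www.openarchives.org/OAI/2.0/oai_dc/"
           xmlns:dc="http://purl.org/dc/elements/1.1/">
  <dc:identifier>uncch:sils/0007548.pdf</dc:identifier>
  <!-- machine-readable statement of the author's chosen license -->
  <dc:rights>http://creativecommons.org/licenses/by/1.0/</dc:rights>
</oai_dc:dc>
```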
What do users want?
The ALPSP survey was intended to discover the views of academics, both as authors and as readers. Some 14,000 scholars were contacted, across all disciplines and all parts of the world, with almost 9% responding.
Alma Swan and Sheridan Brown. Authors and Electronic Publishing: The ALPSP Research Study on Authors' and Readers’ Views of Electronic Research Communication. (West Sussex, UK: The Association of Learned
and Professional Society Publishers, 2002). http://www.alpsp.org/pub5.htm
Importance of journal features
[Chart: importance ratings (scale 0-90) for citation links, additional data, additional/colour images, manipulable content, and video/sound.]
Importance of the peer review process
[Chart: importance ratings (scale 0-100) for peer review, referees’ comments published, referees identified, public commentary on eprints, post-publication public commentary, and the ability to submit comments.]
Importance of publishers’ roles

Factor                                                      As authors   As readers
Peer review                                                     81           80
Gathering articles together to enable browsing of content       64           49
Selection of relevant and quality-controlled content            71           54
Content editing and improvement of articles                     60           39
Language or copy editing                                        50           34
Checking of citations/adding links                              46           28
Marketing (maximising visibility of journal)                    44           20
Survey (Project Romeo)
• Authors want quick and convenient dissemination of their work.
– Free access to others’ papers.
– Not overly concerned with (or aware of) copyright issues unless they prevent them from freely distributing their own work or accessing others’.
Survey (Zhang 1999)
• Important to authors are:
– Permanence and quality of electronically archived resources.
– Better (faster, more accurate) searching capabilities, i.e., using metadata instead of just search engines.
Survey (Rowland)
• 16% said that the referees would no longer be anonymous
• 27% said that traditional peer review would be supplemented by post-publication commentary
• 45% expected to see some changes in the peer-review system within the next five years
Fytton Rowland, “The Peer-Review Process,” Learned Publishing 15 no. 4 (October 2002): 247-258.
Report version: http://www.jisc.ac.uk/uploaded_documents/rowland.pdf
Additional Challenges
• Archive Hosting
– Off-the-shelf computer system with lots of disk space and a public domain archiving application (DSpace, Eprints).
– Who maintains the material? {Library (MIT DSpace), Grad School, University (California), Publisher (PLoS, BioMedCentral), Society (arXiv)}
– Where are comments and reviews held (after-the-fact content items that reference the original)?
Challenges
• Make content universally available
– Export OAI items so they can be harvested.
– Have public domain quality harvesters that support quick and simple searching (i.e., Google for metadata).
Challenges
• Peer Review
– Make it more public. Make comments on articles available.
– Add quantitative scoring as well as qualitative.
Overview of Peer Review
[Diagram: a submitted article passes through a filter and then review (peer, open, or machine). The review produces a qualitative outcome (accept, reject, or revise with respect to XYZ standards, with comments to the author) and a quantitative grade (score 1-10). Accepted articles are published; rejected ones are sent elsewhere.]
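The review outputs described above fit in one small record: a quantitative score plus the qualitative outcome and comments, all pointing at the reviewed article. Field names and values are illustrative.

```python
from dataclasses import dataclass

@dataclass
class Review:
    reviewer: str      # identified observer (person or machine)
    target_doi: str    # the article being judged
    score: float       # quantitative grade, on a 1-10 scale
    outcome: str       # qualitative: "accept", "reject", or "revise"
    comments: str      # qualitative comments to the author

# One referee's review, stored as its own content item on an archive.
review = Review("cornell:reviewer-a", "uncch:sils/0007548.pdf",
                7.5, "revise", "Clarify the gene-linkage analysis.")
```

Because the score is numeric, retrieval interfaces can filter on it directly (e.g., article score > 7.0), which plain prose reviews do not allow.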
General Review Model Parallels
• In general, you have a sample (material) that is judged/scored quantitatively and qualitatively by an identified observer with respect to some standard.
NeoRef for Movies, Products,…
• The same process used by NeoRef to support scholarly communication could be used for almost any communication-of-information purpose. All that is required is storage of digital content items, and linking of reviews, comments, etc. to them.
• DocSouth: self cataloged and indexed items are Grey; librarian/archivist cataloged and indexed items are Gold.
• Movies: Grey is everyone’s reviews; Gold is Siskel and Ebert reviews.
• Consumer Products: product reviews by Consumer Reports (gold), user reviews (grey).
Current Peer Review Options

Option                                 Quantitative   Qualitative
Human Judgement
– Expert peer review (status quo)           √         √ (relative)
– Certified expert peer review              √         √ (relative)
– Open peer review (BMJ, BioMed)            √         √ (absolute)
– Open comment review (psycprints)          √         √ (absolute)
Computer Judgement
– Computer peer review                      √         √
Human Usage
– Citation-based (CiteSeer)                 √         √
– Usage counts (CiteSeer example)           √         √
– Quantity of discussion                    √         √
Coarse Categorization
– Two tier (grey/gold)
– Moderator (current arXiv)
– No review (old arXiv)
Importance of future dissemination channels
Dissemination method                              Very important + important   Ranking
Traditional print + electronic journal                        91                 1
Discipline-based electronic reprint archive                   78                 2
Traditional print journal                                     77                 3
Traditional electronic-only journal                           66                 4
Institution-based electronic reprint archive                  60                 5
New forms of electronic-only journal                          49                 6
Discipline-based electronic preprint archive                  44                 7
Institution-based electronic preprint archive                 33                 8
Provider Service Change
• What is worth paying for?
– Quality review (Faculty of 1000)
– Proofing, citation linking, professional presentation (CiteSeer, Citebase)
– Marketing
– Archival (JStor)
• Who hosts the material?
– Society (arXiv)
– Commercial publishers (Elsevier, BioMedCentral)
– University library (MIT DSpace)