A framework for evaluating automatic indexing or classification in the context of retrieval
Linnaeus University, Faculty of Arts and Humanities, Department of Cultural Sciences (Library and Information Science). ORCID iD: 0000-0003-4169-4777
University of Buffalo, USA.
City University, UK.
University of South Wales, UK.
2016 (English). In: Journal of the Association for Information Science and Technology, ISSN 2330-1635, E-ISSN 2330-1643, Vol. 67, no. 1, pp. 3-16. Article in journal (Refereed). Published.
Abstract [en]

Tools for automatic subject assignment help deal with scale and sustainability in creating and enriching metadata, establishing more connections across and between resources, and enhancing consistency. While some software vendors and experimental researchers claim that the tools can replace manual subject indexing, hard scientific evidence of their performance in operating information environments is scarce. A major reason for this is that research is usually conducted under laboratory conditions, excluding the complexities of real-life systems and situations. The paper reviews and discusses issues with existing evaluation approaches, such as problems of aboutness and relevance assessments, implying the need to use more than a single "gold standard" method when evaluating indexing and retrieval, and proposes a comprehensive evaluation framework. The framework is informed by a systematic review of the literature on indexing, classification, and evaluation approaches: evaluating indexing quality directly through assessment by an evaluator or through comparison with a gold standard; evaluating the quality of computer-assisted indexing directly in the context of an indexing workflow; and evaluating indexing quality indirectly through analyzing retrieval performance.

Place, publisher, year, edition, pages
2016. Vol. 67, no. 1, pp. 3-16.
National Category
Information Studies
Research subject
Humanities, Library and Information Science
Identifiers
URN: urn:nbn:se:lnu:diva-45521
DOI: 10.1002/asi.23600
ISI: 000368340100001
Scopus ID: 2-s2.0-84975028899
OAI: oai:DiVA.org:lnu-45521
DiVA: diva2:842453
Available from: 2015-07-20. Created: 2015-07-20. Last updated: 2017-12-04. Bibliographically approved.

Open Access in DiVA

fulltext (919 kB), 245 downloads
File information
File name: FULLTEXT02.pdf
File size: 919 kB
Checksum SHA-512:
78b1551bcf8ee355868b30215f7c096b0328371a2af5357c9ddc17c623acac3041fbfcc7cd6f24d1d128d314eb56d4ae78f904241f53027574f18ba1cdfb17cb
Type: fulltext
Mimetype: application/pdf

Other links

Publisher's full text
Scopus

Search in DiVA

By author/editor
Golub, Koraljka
By organisation
Department of Cultural Sciences
In the same journal
Journal of the Association for Information Science and Technology
Information Studies

Total: 245 downloads
The number of downloads is the sum of all downloads of full texts. It may include, e.g., previous versions that are no longer available.

Total: 1002 hits