Aslib-Cranfield studies on the evaluation of indexing systems

  • 23 Pages
  • English
by S.M. Lawani
Published: Institute of Librarianship, University of Ibadan, Ibadan
Subjects: Aslib Cranfield Research Project; Indexing
Series: Occasional paper -- 5
LC Classification: Z695.9 .L38
Pagination: 23 p.
Open Library ID: OL17008018M

Aslib-Cranfield studies on the evaluation of indexing systems. Ibadan, Nigeria: Institute of Librarianship, University of Ibadan (OCoLC). Material type: government publication, national government publication. Document type: book. Author/contributor: S. M. Lawani.

The Cranfield experiments were a series of experimental studies in information retrieval conducted by Cyril W. Cleverdon at the College of Aeronautics, Cranfield, in the 1960s to evaluate the efficiency of indexing systems. The experiments were divided into two main phases, neither of which was computerized.

The entire collection of abstracts, resulting indexes and results were later made available to other researchers.

Aslib Cranfield Research Project: Report on the First Stage of an Investigation into the Comparative Efficiency of Indexing Systems, by C. Cleverdon. An investigation supported by a grant to Aslib by the National Science Foundation. The College of Aeronautics, Cranfield, England, September.

Aslib Cranfield Research Project: Factors Determining the Performance of Indexing Systems, Volume 1 (Parts 1 and 2) and Volume 2 (three volumes), by Cyril Cleverdon, Jack Mills and Michael Keen.


S. M. Lawani has written 'The Aslib-Cranfield studies on the evaluation of indexing systems'; subjects: Aslib, Aslib Cranfield Research Project, Indexing.

Aslib Cranfield Research Project: Factors Determining the Performance of Indexing Systems, Volume 2, by Cyril Cleverdon and Michael Keen. An investigation supported by a grant to Aslib.

The Text REtrieval Conference (TREC), coordinated by the US National Institute of Standards and Technology (NIST), is the largest information retrieval (IR) experimentation effort to date (Dagobert Soergel).

Cleverdon and M. Keen, Factors Determining the Performance of Indexing Systems, ASLIB Cranfield Research Project Vol. 2.

Era three saw a number of case studies of limited generalisability, and a general recognition that the best search performance can be achieved by the parallel use of the two types of indexing languages. The emphasis in Era four has been on the development of end-user-based systems, including online public access catalogues and databases.

His work on evaluation -- of libraries, of information retrieval systems and information systems more generally, of indexing and abstracting structures, of indexes and journals, of online searching, of users and their needs, of training programs, of library collections and information services, of evaluation as a management tool -- is truly monumental.

The second Cranfield experiment examined variations in indexing and vocabulary control, and the test collection was later used by other experimenters.

The Aslib-Cranfield studies inspired the many subsequent TREC and similar retrieval evaluation conferences, which for 50 years have consistently continued to reach the same conclusions (Michael K. Buckland).

Ammenwerth E, de Keizer N. An inventory of evaluation studies of information technology in health care: trends in evaluation research. Methods Inf Med 44(1).

The evaluation of information retrieval systems has gained considerable momentum in the last few years. Several evaluation initiatives are concerned with diverse applications and usage scenarios.

Testing indexes and index language devices: Aslib Cranfield Project. American Documentation 15(1).

Evaluation of indexing systems. Annual Review of Information Science and Technology.

Systems evaluation by comparison testing. College & Research Libraries 27(1).

Lancaster FW. Evaluation of.

As director of studies of the educational program at the Institut National des Techniques de la Documentation in Paris, she has a claim to being the founding mother of the i-school movement.

Otlet's interest in the transformative effects of new media was echoed in France in the middle of the first decade of the twentieth century in extensive studies (Michael K. Buckland).

The salient components of this system include texture feature extraction, image segmentation and grouping, a learned similarity measure, and a texture thesaurus model for fast search and indexing.
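The pipeline above can be illustrated with a minimal sketch. The feature set (mean, variance, gradient energies) and the brute-force search here are illustrative stand-ins, not the actual system's descriptors; a real texture thesaurus would cluster features into codewords for sub-linear lookup.

```python
import math

def texture_features(patch):
    """Toy texture descriptor for a 2-D grayscale patch (list of rows):
    mean intensity, variance, and mean absolute horizontal/vertical gradient."""
    pixels = [p for row in patch for p in row]
    n = len(pixels)
    mean = sum(pixels) / n
    var = sum((p - mean) ** 2 for p in pixels) / n
    dx = [abs(row[j + 1] - row[j]) for row in patch for j in range(len(row) - 1)]
    dy = [abs(patch[i + 1][j] - patch[i][j])
          for i in range(len(patch) - 1) for j in range(len(patch[0]))]
    return (mean, var, sum(dx) / len(dx), sum(dy) / len(dy))

def nearest(query, index):
    """Brute-force nearest neighbour in feature space; stands in for the
    'texture thesaurus' fast-search component."""
    return min(index, key=lambda item: math.dist(item[1], query))

# Index two patches, one flat and one stripy, then query with a near-flat patch.
flat = [[10, 10, 10], [10, 10, 10], [10, 10, 10]]
stripes = [[0, 100, 0], [0, 100, 0], [0, 100, 0]]
index = [("flat", texture_features(flat)), ("stripes", texture_features(stripes))]
query = texture_features([[12, 11, 12], [11, 12, 11], [12, 11, 12]])
print(nearest(query, index)[0])  # the near-flat query matches "flat"
```

Segmentation and grouping are omitted here; the sketch only shows why a compact feature vector makes similarity search tractable.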

INDEXING THEORY AND EVALUATION. Evaluation of Library Techniques for the Control of Research Materials, I. A. Warheit, 7(4).

There are some sixty of his publications focused on this area.


As Wilf said in his comments as part of the Pioneers of Information Science Scrapbook, "Following work on the Aslib Cranfield Project, and various evaluation studies for Herner & Co., I had recently completed a." At the time of his recommendation to NLM, Lancaster was head of the Systems Evaluation Group at Herner & Company in Washington, DC, working on a project for the Technical Library at the U.S. Navy Bureau of Ships (Lancaster, ) and utilizing procedures similar to those used in the Cranfield studies and later used in the MEDLARS evaluation.

King and Bryant have written a book on the evaluation of information services and products, emphasising the commercial aspects.

Goffman and Newill describe Mills, J. and Keen, M., Factors Determining the Performance of Indexing Systems, Volume I: Design; Volume II: Test Results, ASLIB Cranfield Project, Cranfield.

This report contains recommendations on a sample course curriculum in the general area of information organization and information system design in a Ph.D. computer science program.

Information Storage and Retrieval (IS&R) encompasses a broad scope of topics, ranging from basic techniques for accessing data to sophisticated approaches for the analysis of natural language text and the deduction of (Jack Minker).

An evaluation of retrieval effectiveness for a full-text document retrieval system. Communications of the ACM, 28.

Cleverdon, C. Report on the testing and analysis of an investigation into the comparative efficiency of indexing systems. Cranfield, England: College of Aeronautics, Aslib Cranfield Research Project.

Claire K. Schultz, Wallace L. Schultz and Richard H. Orr, "Comparative indexing: terms supplied by biomedical authors and document titles," American Documentation 16.

Claire K. Schultz, Peter B. Henderson and Richard H. Orr, Evaluation of indexing by group consensus. Final report, Contract OEC Bureau.

Cleverdon, Cyril W. "Report on the Testing and Analysis of an Investigation into the Comparative Efficiency of Indexing Systems." ASLIB Cranfield Research Project. Cranfield: College of Aeronautics.

Daconta, Michael C., Leo Joseph Obrst and Kevin T. Smith.

"The design and testing of a fully automatic indexing-search system for documents consisting of expository text," in: Information Retrieval: A Critical Review (edited by G. Schecter), Thompson Book Co., Washington D.C.

Quality Control in the Publishing Process and Theoretical Foundations for Information Retrieval. Manfred Kochen, University of Michigan, Ann Arbor, Michigan.

I. Introduction. Information retrieval, as usually understood, is performed by retrospective document search systems, like MEDLARS [1], or by current awareness systems, like SDI [2].

A performance study of a few popular search engines is conducted including AlltheWeb, Google, Microsoft Search Engine, and Yahoo!. The quality of the search results is examined manually by marking relevant pages returned from the search engines in response to a set of randomly chosen queries and computing the RankPower.
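The RankPower computation mentioned above can be sketched as follows. The formula assumed here (average rank of the relevant results divided by their count, so lower is better) is one published formulation; the judged result list is invented for illustration.

```python
def rank_power(relevance):
    """relevance[i] is True when the result at rank i+1 was judged relevant.
    RankPower = (average rank of relevant results) / (number of relevant results);
    lower is better, rewarding engines that place many relevant pages early."""
    ranks = [i + 1 for i, rel in enumerate(relevance) if rel]
    if not ranks:
        return float("inf")  # no relevant results in the examined window
    avg_rank = sum(ranks) / len(ranks)
    return avg_rank / len(ranks)

# Hypothetical judgments: relevant pages at ranks 1, 2 and 5 of the top ten.
judged = [True, True, False, False, True, False, False, False, False, False]
print(rank_power(judged))  # (8/3)/3 = 8/9 ≈ 0.889
```

Averaging this score over the set of randomly chosen queries gives a single per-engine figure, which is how such manual studies typically compare engines.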

In manual indexing systems this choice is easily made by the human indexer. However, for automatic indexing it is necessary to define what punctuation should be used as word separators, to define what "words" to index, and then which of these words to index. Normally word .

Factors Determining the Performance of Indexing Systems.

Cranfield, UK: Aslib Cranfield Research Project, College of Aeronautics (Volume 1: Design; Volume 2: Results).

Connaway, Lynn Silipigni and Marie L. Radford. Research Methods in Library and Information Science, 6th Edition.
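The word-separator problem described above (deciding which punctuation delimits "words" for automatic indexing) can be sketched in a few lines. The separator set and stop-list here are illustrative choices, not those of any particular Cranfield-era system.

```python
import re

# Hypothetical policy: anything other than letters, digits and internal
# hyphens/apostrophes acts as a word separator; a tiny stop-list is dropped.
SEPARATOR = re.compile(r"[^A-Za-z0-9'-]+")
STOPWORDS = {"the", "of", "and", "a", "in"}

def index_words(text):
    """Split text on the separator set, lowercase, and keep indexable words."""
    words = (w.strip("'-").lower() for w in SEPARATOR.split(text))
    return [w for w in words if w and w not in STOPWORDS]

print(index_words("The comparative efficiency of indexing systems, 1962."))
# → ['comparative', 'efficiency', 'indexing', 'systems', '1962']
```

Keeping internal hyphens, for example, preserves compounds like "word-frequency" as single index terms; moving "-" into the separator set would split them, which is exactly the kind of choice a human indexer otherwise makes implicitly.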

Libraries Unlimited. Cornelius, Ian.

An inadequate model invariably leads to useless results. Dr. Taube's criticism of the "relevance-recall mathematics" introduced by the Aslib-Cranfield studies and taken over by the ADL writers is misplaced, because it is based on a confusion on his part between two different uses of the term "relevance."
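The "relevance-recall mathematics" at issue is the recall/precision pair popularized by the Cranfield tests: recall measures how much of the relevant material a search retrieved, precision how much of the retrieved material is relevant. A minimal sketch over sets of document identifiers (the identifiers are invented for illustration):

```python
def recall_precision(retrieved, relevant):
    """recall = fraction of the relevant documents that were retrieved;
    precision = fraction of the retrieved documents that are relevant."""
    retrieved, relevant = set(retrieved), set(relevant)
    hits = len(retrieved & relevant)
    recall = hits / len(relevant) if relevant else 0.0
    precision = hits / len(retrieved) if retrieved else 0.0
    return recall, precision

# Four documents retrieved, five judged relevant, three in common:
r, p = recall_precision({"d1", "d2", "d3", "d9"}, {"d1", "d2", "d3", "d4", "d5"})
print(r, p)  # 0.6 0.75
```

Both ratios depend on a prior judgment of which documents are "relevant", which is precisely where the two senses of the term that Taube conflated come apart.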