
Faculty, Staff and Student Publications
Publication Date
1-1-2005
Journal
American Medical Informatics Association Annual Symposium Proceedings
Abstract
Information overload is a significant problem for modern medicine. Searching MEDLINE for common topics often retrieves more relevant documents than users can review. We must therefore identify documents that are not only relevant but also important. Our system ranks articles using citation counts and the PageRank algorithm, incorporating data from the Science Citation Index. However, citation data is usually incomplete. We therefore explore the relationship between the quantity of citation information available to the system and the quality of the resulting ranking. Specifically, we test the ability of citation count and PageRank to identify "important articles," as defined by experts, within large result sets as citation information decreases. We found that PageRank performs better than simple citation counts, but both algorithms are surprisingly robust to information loss. We conclude that even an incomplete citation database is likely to be effective for importance ranking.
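
The abstract contrasts two importance signals computed over a citation graph: raw citation counts and PageRank. The following is a minimal illustrative sketch, not the authors' code; it assumes a simple adjacency-list representation of the citation graph and the conventional damping factor of 0.85, neither of which is specified in this record.

from collections import Counter

def citation_counts(citations):
    """Rank by raw citation count: how often each article is cited.
    citations maps each article ID to the list of articles it cites."""
    return Counter(target for cited in citations.values() for target in cited)

def pagerank(citations, d=0.85, iterations=50):
    """Power-iteration PageRank over the same citation graph."""
    nodes = set(citations)
    for cited in citations.values():
        nodes.update(cited)
    graph = {node: citations.get(node, []) for node in nodes}
    n = len(nodes)
    rank = {node: 1.0 / n for node in nodes}
    for _ in range(iterations):
        new_rank = {node: (1.0 - d) / n for node in nodes}
        for source, cited in graph.items():
            if cited:
                # Distribute a damped share of this article's rank
                # across everything it cites.
                share = d * rank[source] / len(cited)
                for target in cited:
                    new_rank[target] += share
            else:
                # Dangling article (cites nothing in the database):
                # spread its rank uniformly across all nodes.
                for node in nodes:
                    new_rank[node] += d * rank[source] / n
        rank = new_rank
    return rank

# Toy example: C is cited twice, so both measures rank it first.
example = {"A": ["C"], "B": ["C"], "C": []}
print(citation_counts(example).most_common())
print(sorted(pagerank(example).items(), key=lambda kv: -kv[1]))

The robustness experiment described in the abstract would correspond, in this sketch, to randomly deleting entries from the citation lists before ranking and measuring how the ordering of the expert-defined important articles degrades.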
Keywords
Algorithms, Bibliometrics, Information Storage and Retrieval, MEDLINE, PubMed
PMID
16779053
PMCID
PMC1560575
PubMed Central® Posted Date
January 2005
PubMed Central® Full Text Version
Post-Print