Improving the accuracy of the information retrieval evaluation process by considering unjudged document lists from the relevant judgment sets

Bibliographic Details
Main Authors: Minnu Helen Joseph, SriDevi Ravana
Format: Article
Language: English
Published: University of Borås 2024-09-01
Series: Information Research: An International Electronic Journal
Subjects: information retrieval; information systems; pooling; document similarity; information system evaluation
Online Access: https://informationr.net/infres/article/view/603
_version_ 1832544540208136192
author Minnu Helen Joseph
SriDevi Ravana
author_facet Minnu Helen Joseph
SriDevi Ravana
author_sort Minnu Helen Joseph
collection DOAJ
description Introduction. To improve user satisfaction with and loyalty to search engines, retrieval systems must perform better in terms of the number of relevant documents they retrieve, and this performance can be assessed through the information retrieval evaluation process. This study presents two methodologies that help to recover and better rank relevant information resources for a query while suppressing irrelevant ones. Method. A combination of techniques was used. Relevant documents that were not retrieved by the systems were identified in the document corpus, assigned new scores using Manifold fusion techniques, and then moved into the relevant judgment sets. The proposed methodologies consider documents drawn from the judgment sets and from well-contributing systems. Analysis. The Kendall tau correlation coefficient, Mean Average Precision (MAP), Normalized Discounted Cumulative Gain (NDCG), and Rank-Biased Precision (RBP) were used to evaluate the performance of the methodologies. Results. The proposed methodologies outperformed the baseline works and enhanced the quality of the judgment sets, achieving better results even at a smaller pool depth. Conclusion. This research proposes two methodologies that increase the quality of the relevant documents in the judgment sets through document similarity techniques and thus raise the accuracy of the evaluation process and the reliability of the systems.
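As a minimal, hedged illustration of two of the measures named in the abstract, average precision (the per-query basis of MAP) and the Kendall tau correlation between system rankings, the sketch below uses invented toy data; the run names, document identifiers, and judgment sets are hypothetical, and the code is not taken from the article.

    from itertools import combinations

    def average_precision(ranked_docs, relevant):
        """Mean of the precision values at each rank where a relevant document appears."""
        hits, precisions = 0, []
        for rank, doc in enumerate(ranked_docs, start=1):
            if doc in relevant:
                hits += 1
                precisions.append(hits / rank)
        return sum(precisions) / len(relevant) if relevant else 0.0

    def kendall_tau(order_a, order_b):
        """Kendall rank correlation between two orderings of the same items (no ties)."""
        pos_b = {item: i for i, item in enumerate(order_b)}
        concordant = discordant = 0
        for (i, x), (j, y) in combinations(enumerate(order_a), 2):
            # The pair (x, y) is concordant if both orderings place x and y
            # in the same relative order.
            if (pos_b[x] - pos_b[y]) * (i - j) > 0:
                concordant += 1
            else:
                discordant += 1
        pairs = concordant + discordant
        return (concordant - discordant) / pairs if pairs else 0.0

    # Hypothetical toy data: one query, a shallow judgment set, and an augmented set
    # in which one additional relevant document (d7) has been recovered.
    qrels_shallow = {"d2", "d5"}
    qrels_augmented = {"d2", "d5", "d7"}
    runs = {
        "sysA": ["d2", "d1", "d7", "d5", "d3"],
        "sysB": ["d1", "d3", "d2", "d7", "d5"],
    }

    # Rank the systems by average precision under each judgment set, then compare the rankings.
    rank_shallow = sorted(runs, key=lambda s: -average_precision(runs[s], qrels_shallow))
    rank_augmented = sorted(runs, key=lambda s: -average_precision(runs[s], qrels_augmented))
    print("Kendall tau between the two system rankings:", kendall_tau(rank_shallow, rank_augmented))

A tau close to 1 means the augmented judgments leave the relative ordering of the systems essentially unchanged, while lower values signal that the extra relevant documents reorder the systems.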
format Article
id doaj-art-414a471900d14c58a3b851ddb3140160
institution Kabale University
issn 1368-1613
language English
publishDate 2024-09-01
publisher University of Borås
record_format Article
series Information Research: An International Electronic Journal
spelling doaj-art-414a471900d14c58a3b851ddb3140160 2025-02-03T10:10:34Z. Minnu Helen Joseph (University Malaya); SriDevi Ravana (University Malaya). Improving the accuracy of the information retrieval evaluation process by considering unjudged document lists from the relevant judgment sets. Information Research: An International Electronic Journal, ISSN 1368-1613, vol. 29, no. 3 (2024-09-01), pp. 109-131. University of Borås. doi: 10.47989/ir293603. https://informationr.net/infres/article/view/603. Topics: information retrieval; information systems; pooling; document similarity; information system evaluation.
spellingShingle Minnu Helen Joseph
SriDevi Ravana
Improving the accuracy of the information retrieval evaluation process by considering unjudged document lists from the relevant judgment sets
Information Research: An International Electronic Journal
information retrieval
information systems
pooling
document similarity
information system evaluation
title Improving the accuracy of the information retrieval evaluation process by considering unjudged document lists from the relevant judgment sets
title_full Improving the accuracy of the information retrieval evaluation process by considering unjudged document lists from the relevant judgment sets
title_fullStr Improving the accuracy of the information retrieval evaluation process by considering unjudged document lists from the relevant judgment sets
title_full_unstemmed Improving the accuracy of the information retrieval evaluation process by considering unjudged document lists from the relevant judgment sets
title_short Improving the accuracy of the information retrieval evaluation process by considering unjudged document lists from the relevant judgment sets
title_sort improving the accuracy of the information retrieval evaluation process by considering unjudged document lists from the relevant judgment sets
topic information retrieval
information systems
pooling
document similarity
information system evaluation
url https://informationr.net/infres/article/view/603
work_keys_str_mv AT minnuhelenjoseph improvingtheaccuracyoftheinformationretrievalevaluationprocessbyconsideringunjudgeddocumentlistsfromtherelevantjudgmentsets
AT srideviravana improvingtheaccuracyoftheinformationretrievalevaluationprocessbyconsideringunjudgeddocumentlistsfromtherelevantjudgmentsets