Bibliometrics and citation analysis

Collection Development

With the advent of computing technology, libraries have developed to “help people meet their needs, whether practical, theoretical, religious, or aesthetic” (Rubin, 2010). This goal has inspired librarians to find ways to build collections that best serve their users, whose needs are constantly changing and expanding to include new types of information resources. Bibliometrics is one tool that allows librarians to determine which journals and materials their collections need. In academic libraries, citation analysis can reveal the popularity of particular authors, articles, and publications; that information can inform collection development for specific college departments and provide valuable clues for future research.
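
As a minimal illustration of how citation analysis might feed collection decisions, the Python sketch below tallies how often each journal appears in a set of citation records. The records and field names are invented for illustration and are not drawn from any particular database export.

```python
from collections import Counter

# Hypothetical citation records gathered from local researchers' papers;
# field names are illustrative, not from any particular vendor's export.
citations = [
    {"author": "Smith, J.", "journal": "Journal of Information Science"},
    {"author": "Lee, K.",   "journal": "Journal of Informetrics"},
    {"author": "Smith, J.", "journal": "Journal of Information Science"},
    {"author": "Diaz, M.",  "journal": "College & Research Libraries"},
]

# Count how often each journal is cited; heavily cited titles are
# natural candidates for subscription or renewal.
journal_counts = Counter(record["journal"] for record in citations)

for journal, count in journal_counts.most_common():
    print(f"{count:3d}  {journal}")
```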

As technology has changed and become affordable to most, libraries have incorporated databases and journals into their collections, expanding the resources available to the community to meet its information needs. Assessing the community and anticipating its information needs keeps the library a hub of information flow that would otherwise grow stagnant.

Tracking indicators of scholarly activity

With more databases connected to the internet, collecting and tracking academic trends on a global scale becomes easier. An online resource such as Google Scholar provides citation features that allow scientific relationships at the institutional and national level to be mapped globally (Ortega, 2013). Google Scholar indexes millions of citations, surfacing relationships and common themes across scholarly research, while Web of Science databases allow generational mapping of citations, showing how publications interact with and influence later researchers. The ability to track scholarly activity helps researchers locate current topics and explore studies related to their own. Librarians use this information to spot trends in academic exploration and to help users meet their information needs for scholarly research. Indicators of scholarly activity can also shape a library’s collection development policy, determining which databases and journals are made available to users.
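
Institutional mapping of the kind Ortega describes can be approximated in miniature: the sketch below, using invented paper records, counts how often pairs of institutions co-occur on the same publication, which yields the weighted ties of a simple collaboration network.

```python
from collections import Counter
from itertools import combinations

# Hypothetical records: each paper lists its authors' institutions.
papers = [
    {"title": "Paper A", "institutions": ["CSIC", "Univ. of Glasgow"]},
    {"title": "Paper B", "institutions": ["CSIC", "Indiana Univ."]},
    {"title": "Paper C", "institutions": ["CSIC", "Univ. of Glasgow", "Indiana Univ."]},
]

# Each unordered pair of institutions on the same paper counts as one
# collaboration tie; the totals form the edge weights of a network map.
ties = Counter()
for paper in papers:
    for pair in combinations(sorted(set(paper["institutions"])), 2):
        ties[pair] += 1

for (a, b), weight in ties.most_common():
    print(f"{a} -- {b}: {weight}")
```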

Automation of citation analysis has given librarians both speed and the opportunity to reflect on results. The ability to quickly cross-index a resource against others is a valuable tool for locating related material, and database updates deliver recent additions to librarians almost instantly, shrinking the delay between publication and public availability.
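
One long-automated relatedness measure is bibliographic coupling, in which two papers are considered related when their reference lists overlap. The sketch below uses invented paper and reference identifiers to show the idea.

```python
# Hypothetical reference lists keyed by paper ID; two papers are
# "bibliographically coupled" when their reference lists overlap.
references = {
    "paper_1": {"ref_a", "ref_b", "ref_c"},
    "paper_2": {"ref_b", "ref_c", "ref_d"},
    "paper_3": {"ref_e"},
}

def coupling_strength(p: str, q: str) -> int:
    """Number of cited references the two papers share."""
    return len(references[p] & references[q])

# Rank the other papers by how many references they share with paper_1.
related = sorted(
    (other for other in references if other != "paper_1"),
    key=lambda other: coupling_strength("paper_1", other),
    reverse=True,
)
print(related)                                   # ['paper_2', 'paper_3']
print(coupling_strength("paper_1", "paper_2"))   # 2
```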

Multimedia Access

Developing technology has created new media formats through which information is collected and exchanged. As the number of digital objects to store grows, bibliometrics will continue to evolve to capture vital information that was previously not recorded. Digitizing images and creating metadata for digital media challenges many libraries today. Determining whether a resource is part of a greater open public publishing environment, such as journals, host services, repositories, discussion forums, websites, or electronic archives, will need to be mapped and measured with authority and integrity (Cronin, 2001). Electronic media is still often considered unreliable as a resource, but as methods are devised to ascertain information validity, librarians will need to find ways to provide access to these media types for library users.
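
To make the metadata challenge concrete, here is a minimal Dublin Core-style record for a hypothetical digitized photograph. Every field value is invented for illustration; a real repository would follow its own schema and controlled vocabularies.

```python
import json

# A minimal Dublin Core-style record for a digitized photograph;
# all values here are invented for illustration.
record = {
    "dc:title":      "Main Street, 1923 (digitized photograph)",
    "dc:creator":    "Unknown photographer",
    "dc:date":       "1923",
    "dc:format":     "image/tiff",
    "dc:identifier": "local-archive-000431",
    "dc:source":     "City Historical Society photograph collection",
    "dc:rights":     "Public domain",
}

# Serializing to a standard, text-based format keeps the record
# exchangeable between repositories and harvestable by aggregators.
print(json.dumps(record, indent=2))
```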

Challenges facing electronic information retrieval systems

Relevant search results

When users search a database for a specific resource, the results returned must not be overwhelming in number. Too many results can discourage the user from finding applicable resources among them. According to the principle of least effort, too many results encourage users to settle for resources that merely suffice, instead of examining all the results for the best fit (Rubin, 2010). I have used Web of Science extensively as a graduate student, and its ability to track resource citations and related records has proven invaluable for locating relevant material. Refining search parameters by selecting categories, research areas, and publication years has narrowed my results well beyond what Boolean input alone could do. Users who do not know how to refine search parameters are left to roam through information retrieval systems without aim, and typically without success.
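
The sketch below imitates this kind of refinement on a toy scale: a Boolean AND keyword match followed by facet filters for category and year. The records and field names are hypothetical and only loosely mirror the facets Web of Science exposes.

```python
# Hypothetical local index of records; the fields mirror the kinds of
# facets (category, year) that a database might expose for refinement.
records = [
    {"title": "Citation mapping in LIS", "category": "Information Science", "year": 2012},
    {"title": "Protein folding review",  "category": "Biochemistry",        "year": 2012},
    {"title": "Altmetrics overview",     "category": "Information Science", "year": 2009},
]

def search(records, must_contain, category=None, year_from=None):
    """Boolean AND keyword match, then facet filters to shrink the result set."""
    hits = []
    for r in records:
        text = r["title"].lower()
        if not all(term.lower() in text for term in must_contain):
            continue
        if category is not None and r["category"] != category:
            continue
        if year_from is not None and r["year"] < year_from:
            continue
        hits.append(r)
    return hits

print(search(records, ["citation"], category="Information Science", year_from=2010))
```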

Information retrieval systems evaluation

Measuring information retrieval systems to determine whether they are meeting their purpose is typically done through collection, services, and user satisfaction studies (Rubin, 2010). Studies have been conducted to explore alternate methods of evaluation. One such study compared crowdsourcing with laboratory-based user studies and found both to be equally effective in evaluating information retrieval systems (Zuccon, Leelanupab, Whiting, Yilmaz, Jose, & Azzopardi, 2013). Even more importantly, crowdsourcing allowed researchers to collect larger amounts of data, with the potential to test systems with a much larger group of users (Zuccon et al., 2013).
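
A simple way to compare two sources of relevance judgments, such as crowd workers and lab participants, is to measure how often they agree. The sketch below computes raw percent agreement and Cohen’s kappa on invented judgments; it illustrates the comparison idea only and is not the methodology of the Zuccon et al. study.

```python
# Hypothetical relevance judgments (1 = relevant, 0 = not) for the same
# ten documents: one set from lab participants, one from crowd workers.
lab   = [1, 0, 1, 1, 0, 1, 0, 0, 1, 1]
crowd = [1, 0, 1, 0, 0, 1, 0, 1, 1, 1]

# Raw percent agreement: how often the two judgment sets coincide.
agreement = sum(a == b for a, b in zip(lab, crowd)) / len(lab)

# Cohen's kappa corrects for the agreement expected by chance alone.
p_lab, p_crowd = sum(lab) / len(lab), sum(crowd) / len(crowd)
expected = p_lab * p_crowd + (1 - p_lab) * (1 - p_crowd)
kappa = (agreement - expected) / (1 - expected)

print(f"agreement = {agreement:.2f}, kappa = {kappa:.2f}")
```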

Evaluating information retrieval systems can be subjective, depending on how the evaluation is conducted. Issues of usability, functionality, and accessibility can influence the judgment of whether a system is effectively meeting its purpose. Because the choice of assessment method directly influences the results, librarians and researchers must carefully devise methods that assess information retrieval systems accurately.

References

Cronin, B. (2001). Bibliometrics and beyond: Some thoughts on web-based citation analysis. Journal of Information Science, 27(1), 1-7. doi:10.1177/016555150102700101

Ortega, J. (2013). Institutional and country collaboration in an online service of scientific profiles: Google Scholar Citations. Journal of Informetrics, 7(2), 394-403. doi:10.1016/j.joi.2012.12.007

Rubin, R. (2010). Foundations of library and information science. New York: Neal-Schuman Publishers.

Zuccon, G., Leelanupab, T., Whiting, S., Yilmaz, E., Jose, J. M., & Azzopardi, L. (2013). Crowdsourcing interactions: Using crowdsourcing for evaluating interactive information retrieval systems. Information Retrieval, 16(2), 267-305. doi:10.1007/s10791-012-9206-z
