The DC Chapter of SLA and ResearchConnect (part of Elsevier) presented a one-day event on Wednesday, October 27, 2010, titled:
Impact and Productivity Measurements in a Changing Research Environment
The eight speakers came from around the globe, and by the end of the day I felt like I was (temporarily) caught up with the fast-moving world of productivity measurements. Here are a few comments about the speakers I enjoyed most.
Dr. Henk Moed, Senior Scientific Advisor at Elsevier
The Use of Bibliometric Indicators in Research Evaluation: A Critical Overview
Henk provided a wonderful overview and emphasized that numbers can indicate productivity, but humans have to do the actual evaluation of faculty, departments, or institutions. He also pointed out that the metrics can be (and have been) manipulated.
Dr. Jevin West, Eigenfactor.org and Dept of Biology, Univ of Washington
The Eigenfactor Metrics: A Network Approach to Assessing Scholarly Journals
His interest in network biology got him interested in the citation network. He believes that understanding the citation network will help us with evaluating and navigating scholarship. Eigenfactor has expanded its data recently to include JSTOR. They are working on hierarchical maps now.
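At its core, the Eigenfactor metric is a PageRank-style eigenvector calculation on the journal-to-journal citation network. Here is a minimal sketch of that general technique with made-up citation counts; it is not Eigenfactor's exact algorithm, which adds refinements such as excluding journal self-citations.

```python
# Toy journal-to-journal citation counts (hypothetical, for illustration only):
# cites[j][i] = number of citations from journal j to journal i.
cites = [
    [0, 3, 1],   # journal A cites B three times, C once
    [4, 0, 1],   # journal B cites A four times, C once
    [1, 2, 0],   # journal C cites A once, B twice
]

n = len(cites)
# Column-stochastic matrix H[i][j]: the share of journal j's outgoing
# citations that go to journal i.
out = [sum(cites[j]) for j in range(n)]
H = [[cites[j][i] / out[j] for j in range(n)] for i in range(n)]

# Damped power iteration (PageRank-style): repeatedly redistribute
# influence along citation links until the scores stabilize.
alpha = 0.85
v = [1.0 / n] * n
for _ in range(100):
    v = [alpha * sum(H[i][j] * v[j] for j in range(n)) + (1 - alpha) / n
         for i in range(n)]

print(v)  # stationary influence scores; higher = more central in the network
```

The intuition matches what Dr. West described: a citation from a highly cited journal counts for more than one from an obscure journal, because influence flows through the network rather than being a flat tally.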
Dr. Frank Krell, Entomology, Dept. of Zoology, Denver Museum of Nature and Science
Should Editors Influence Journal Impact Factors?
This was a VERY interesting and slightly depressing talk. His description of how editors can manipulate their journals' impact factors was revealing. But the grey areas were the most thought-provoking part. Isn't an editor hired to promote the journal? Shouldn't the editor take actions that make the journal more visible in the field? And his discussion of what a citation actually means just fed my fear that this is all a numbers game. Sometimes authors cite a work because someone else cited it, not because they read it. If an author needs to cite a work but has seven candidates, how does she choose? Does a citation really indicate quality, or just popularity?
Dr. Sidney Redner, Physics Dept., Boston University
A Physicist's Perspective on Citation Analysis
Statistical physicists seem to like citation analysis. Dr. Redner echoed some of what Dr. Moed said earlier: you can play with the numbers, but you still need to understand someone's research in order to evaluate it. He walked through math that he said shows the h-index is essentially determined by a researcher's total citation count, so it adds nothing new to the discussion of citation analysis. I've added a line to the Scholarly Metrics journal to point out that not everyone thinks the h-index is the bee's knees.