Report on the 2:AM conference and Altmetrics workshop. The entire 2:AM conference was live streamed and can now be viewed on YouTube, session by session. Altmetrics are the so-called alternative metrics for science: see the short manifesto. Both events discussed these alternative metrics. The 2:AM conference was a mix of practitioners (i.e. publishers) and researchers. The workshop was an elevator-pitch-style event geared at researchers, with breakout groups that led to a set of draft papers on different topics: you brainstorm with a group of researchers on a topic that emerged during the conference and draft a paper together. Numerous potentially publishable papers emerged from these sessions.
Dominating this year’s conference was measuring impact through social media (e.g. the spread of information through media, policy papers, and Twitter; measuring “value”/citations through downloads, likes, and retweets versus formal citations). These talks asked how we define impact for science, whether it differs for researchers and institutions, and what funding bodies or other science organizations should consider when evaluating impact. Some notable talks on this topic in particular:
- The conference session on impact, with abstracts; see the video.
- The session on altmetrics in research evaluation; see the video.
- From the workshop: Kim Holmberg’s group (full disclosure: he is also a co-author of mine) on measuring impact in science (scroll down to the newly funded project on measuring impact in science and see its description).
- Cameron Neylon’s talk on developing theory about how social media is used in science and by scientists as research-evaluation metrics; via Twitter, he provided slides from a different but relevant talk.
- Lots of cool stuff coming out of Altmetric, as expected, scattered throughout the event; check out the schedule.
PUNCHLINE: Even though there are few standards on how altmetrics are or will be used in research evaluation, or on their impact and effects, as an “altmetrician” I would encourage researchers to bring their “non-traditional” outputs to the table. Mention them in conversations with department heads and group leaders, and in formal evaluations. At this point an increasing number of people, institutions, and the like are online, so be open about how you are disseminating your work to the public. In addition, these social media tools have proved to be great networking tools: for maintaining a network and for identifying interesting researchers, papers, etc.
Explicit self-promotion: a tweeted picture of me presenting our work on the responses of higher education institutions in the UK, where we found exploratory evidence that less research-intensive universities have more diverse organizational responses (departments, policy papers, events, library training, etc.) than more research-intensive ones. We postulate that these universities may have more to gain by positioning themselves to prove impact. More to come as we take the next steps in this project to explore these responses.