best practices for metrics gathering #11
I'm happy to help organize, but if someone would like to step up, I'd be glad to defer to them due to a conflicting meeting for the Open Modeling Foundation that week. Still, I should be able to sneak away for a morning or afternoon. I'll update this availability slot ASAP when the agenda for that meeting is finalized. Unfortunately, I only have availability between 1000 and 1600 MST (UTC-7).
My unsolicited 2 cents: this is closely related to the effort to standardize the collection and reporting of data citation and usage metrics that has been shepherded via Make Data Count and Scholix. Metrics are challenging. Adoption of comparable methods for collecting metrics is even more challenging, as evidenced by the issues with uptake of the COUNTER Code of Practice for Research Data. You might be interested to know that several of the folks from Make Data Count (e.g., @chodacki, @mfenner, @dlowenberg) are currently involved in the effort to create v2 of the COUNTER Code for Research Data, and some of the main issues being considered are how to handle repositories that host mixes of data, publications, and software, and how to handle mixed research objects that contain data, software, text, and other scholarly products. This seems like an opportunity for collaboration between the data and software worlds, and for some degree of consistency between those worlds for metrics.
Thanks @mbjones! We've been loosely following Make Data Count and the other initiatives you mentioned (comses/comses.net#374), and this is an excellent idea to harmonize our efforts. Hopefully some of the interested parties will be able to participate in this session!
Hi, I'm interested in this topic! I'm in the Software Citation #Hackathon.
Great! Apologies for the late reply, but we'll be starting at 1000 MST (1700 UTC). Zoom link: https://asu.zoom.us/j/88391150994?pwd=T0FZTXZOc0JZZzc3L3VIK1BqRWJ0Zz09
Key takeaways / action items
It's important to ensure that if someone takes the time to properly archive their work, they receive credit when downstream products cite that work. Getting these scholarly metrics right is critical to establishing incentives for people to Do the Right Thing. However, some of this appears to be out of our hands and more in the purview of the aggregators / DataCite / Crossref / Scopus / PubMed Central / ISI Web of Science / Google Scholar / etc. A core part of the issues below is to identify concrete things scientific software registries and repositories can do to make their jobs easier and create the citation / knowledge graph for scientific publications of the future.
Hi again Allen, I'm not seeing a way to officially join SciCodes. Any advice would be great! -susan |
Identify and develop strategies for gathering and using metrics for repository / registry managers.
These include, but are not limited to, software citation metrics (who's using the software, at what career stage, in what industry, where, etc.) while being respectful of privacy concerns, GDPR, and so on; best practices for adopting analytics packages (e.g., Matomo vs. Google Analytics); and UI/UX best practices for gathering analytics in the least intrusive way possible.
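As one concrete illustration of privacy-respectful collection, a minimal sketch of server-side IP masking before an analytics event is logged (the function name and logging flow here are assumptions for illustration; the last-octet zeroing itself mirrors what Matomo's IP anonymization and Google Analytics' anonymizeIp setting do):

```python
def anonymize_ip(ip: str) -> str:
    """Zero the final IPv4 octet before storing an analytics event,
    so individual visitors cannot be re-identified from logged data.
    Mirrors the last-octet masking used by Matomo and GA's anonymizeIp."""
    octets = ip.split(".")
    octets[-1] = "0"
    return ".".join(octets)

# Store only the masked address alongside the usage event.
print(anonymize_ip("203.0.113.57"))  # -> 203.0.113.0
```

Masking at ingestion (rather than at reporting time) means raw addresses never touch disk, which simplifies GDPR compliance while still supporting coarse geographic usage metrics.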