
Practical implementation: similar to TOP guidelines #19

@npscience


As a manifesto, I think it's worthwhile to aim for a future you'd like to see, and I can't really comment on those ideals (they look great to me, but I'm not experienced in this area). I do wonder, though, whether some of the elements may appear overwhelming to someone in today's research world, and whether it would be worth taking a pragmatic view of implementation.

The Center for Open Science's TOP guidelines are an interesting model to consider: accept that we're not at perfection today, measure where we are (honestly and transparently), and identify the next small step towards implementing the ideals.

Could something similar be drawn up here? In particular, I think @khinsen's review in #7 contains some useful indicators.

An example, for publishers:

  • Is software cited? In what way? E.g. is it cited both in the text and in the reference list, and does the citation include the version number, commit hash, etc., as recommended in the software citation principles? (A sketch of such a citation follows this list.)
  • Are code files shared? In what way? E.g. ranging from source files attached as supplements, through to a workflow that encourages authors to deposit code in active repositories and platforms like GitHub.
  • Is code reviewed? In what way? E.g. ranging from a reviewer proactively requesting code from the author (via the journal) and reviewing it, through to structural processes that make review the default. And what about review outside journal infrastructure, e.g. institutional or lab-based review?
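
For concreteness, here is a minimal sketch of what such a citation might look like using biblatex's `@software` entry type; the software name, authors, URL, DOI, version, and commit hash are all hypothetical placeholders, not a recommendation of any particular record:

```bibtex
% Hypothetical example only: the entry values are illustrative,
% following the spirit of the software citation principles
% (cite the software itself, with a version and a persistent identifier).
@software{doe_analysis_toolkit_2018,
  author  = {Doe, Jane and Roe, Richard},
  title   = {Analysis Toolkit},
  year    = {2018},
  version = {1.2.3},
  note    = {commit a1b2c3d},
  url     = {https://github.com/example/analysis-toolkit},
  doi     = {10.5281/zenodo.0000000}
}
```

A publisher-level indicator could then be whether citations like this are merely tolerated, actively encouraged, or required by the journal's author guidelines.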

These are just starting ideas, but I think it would be really useful to assess where each stakeholder (publisher, PI, developer/researcher) stands at present and what is needed to advance the manifesto's ideals in each situation.
