These are the slides from Scholastica's 2017 ISMTE conference presentation on how to perform an operational audit at your journal. We cover how to set effective journal performance goals, key metrics to track, and the tools and strategies your editorial team will need to start tracking journal performance on a regular basis. Conducting regular operational audits is a great way for journal teams to refine their workflows and internal documentation, which can prove especially useful when onboarding new editors.
1. Polish Your Peer Review Process
Brian Cody, Scholastica Co-Founder and CEO
Danielle Padula, Community Development at Scholastica
8/10/17
2. Learning Objectives
1. Understand what an operational journal audit is and why it matters
2. Identify 3 steps for a successful operational audit
3. Utilize tools to successfully start auditing your journal’s peer review process
3. Why do you care about your
journal peer review performance?
4. Core areas peer review performance affects
Journal reputation
Production schedule
Team efficiency
12. “Don’t make a policy change until you have carefully studied current data...anecdote is the enemy of effective office management.”
Jason Roberts, Senior Partner at
Origin Editorial
13. Focus on metrics that are:
1. Specific
2. Easy to interpret
3. Reproducible
Like...
14. Key Metric: Submission volume
Has volume gone up or down?
Remember to get specific by article type - total volume can be misleading.
15. “It’s the old devil’s in the details. Once you get in and really start looking at all the pieces, that’s where you see if there’s a problem that hasn’t been explored.”
Kathy Alexander, freelance consultant in
professional and scholarly publishing
16. Ways to track submissions by type
Spreadsheet: Include a column specifying the submission type (e.g. article or book review) - or ask for that on the submission form (Google Forms?)
Peer review software: Use software with the ability to tag submissions and search and sort by article type.
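To illustrate the spreadsheet approach, here is a minimal Python sketch that tallies submission volume by article type from a CSV export. The column name `submission_type` is an assumption about how your spreadsheet is laid out, not a field from any particular system.

```python
# Minimal sketch: count submissions per article type from a CSV export.
# Assumes the export has a "submission_type" column (adjust to your sheet).
import csv
from collections import Counter

def volume_by_type(csv_path):
    """Return a Counter mapping each submission type to its count."""
    with open(csv_path, newline="") as f:
        reader = csv.DictReader(f)
        return Counter(row["submission_type"] for row in reader)
```

Run this against each period's export (e.g. per quarter) to see whether volume for a given article type is rising or falling, rather than relying on the total alone.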
17. Submission tagging best practices
▰ Establish naming conventions
▰ Keep tags short
▰ Avoid too many new acronyms
▰ Clean data as needed
18. Key Metrics: Reviewer performance stats
▰ Push along pending reviews with reminders
▰ Identify chronically late reviewers
▰ Know who NOT to invite
▰ Set clear expectations when inviting
19. “In short, it’s getting harder to convince reviewers to provide a review. We all know that but not many folks seem to bother measuring it.”
Jason Roberts, Senior Partner at
Origin Editorial
20. Don’t have peer review software to automate
reviewer reminders?
Try using Zapier to schedule
recurring emails and/or task
reminders
21. Don’t just send emails...track their performance
Track email open rates and
click rates
(e.g. are reviewers clicking
the links in my reminder
emails?)
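If you go the UTM route, a small helper keeps the tags on every link in a reminder email consistent. A minimal Python sketch - the parameter values here are made-up examples, not required names:

```python
# Minimal sketch: append UTM parameters to a link so clicks from reminder
# emails can be attributed in your analytics. Parameter values are examples.
from urllib.parse import parse_qsl, urlencode, urlparse, urlunparse

def add_utm(url, source="reviewer-reminder", medium="email",
            campaign="peer-review"):
    """Return the url with utm_source/utm_medium/utm_campaign appended."""
    parts = urlparse(url)
    query = dict(parse_qsl(parts.query))  # keep any existing parameters
    query.update({"utm_source": source,
                  "utm_medium": medium,
                  "utm_campaign": campaign})
    return urlunparse(parts._replace(query=urlencode(query)))
```

For example, tagging a link to your reviewer guide lets you check whether people who receive reminders actually open the guide before submitting reviews.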
22. Key Metrics: Editor performance
▰ Assignment speed
▰ Editor decision ratios (variance across team)
▰ Acceptance and rejection rates
▰ Time to decision
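Time to decision is simple to compute once submission and decision dates are recorded. Following the best practice of excluding work in progress, a minimal Python sketch - the record format is an assumption for illustration:

```python
# Minimal sketch: average days from submission to decision, counting only
# completed decisions (submissions still in review are excluded).
# Records are assumed to be (submitted, decided) date pairs; decided may
# be None for work still in progress.
from datetime import date

def avg_time_to_decision(records):
    """Return mean days to decision, or None if nothing is decided yet."""
    done = [(decided - submitted).days
            for submitted, decided in records
            if decided is not None]
    return sum(done) / len(done) if done else None
```

Computing this per quarter makes a goal like “cut average time to decision from 90 to 45 days” directly measurable.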
23. Use shared todo lists and organization tools
Use peer review software todos or free team apps like HITASK (e.g. find 3rd reviewer, finalize decision)
Transparency is important for accountability
24. Set time blocks for journal work and share them
Google Calendar makes it easy to share your calendars
25. Key Metrics: Author experience
Consider surveying authors on
their submission experience
30. Make actionable goals with clear measures of
success...
Examples:
1. Reduce average time to decision by 50% (from 90 days to 45 days) in the next 6 months
2. All editor assignments will be made within 3 days of a submission
31. Keys to successful goal setting
Each execution step needs an
owner and a deadline
Metrics: best practice is to
exclude WIP and only use
completed work
32. “Often nobody ever sits down and says ‘here’s the journal’s process, here’s the duties of everyone involved, and this is why.’”
Kathy Alexander, freelance consultant in
professional and scholarly publishing
35. Go to the source with measurable results - don’t guess what works
36. Tracking progress to goals means having
▰ Metric(s) for every goal
▰ Incremental steps to achieve goals (track outcomes for each!)
▰ Reproducible methods for tracking metrics
▰ Access to historical data
37. “It is best to work your way up through more-manageable projects. The idea is to shorten the learning cycle by tackling a smaller project, so that you can get early feedback.”
Art Markman, Harvard Business Review
https://hbr.org/2017/02/to-achieve-a-major-goal-first-tackle-a-few-small-ones
40. What’s in your toolbox?
Keep in mind - the same tools will likely bring the same results
41. Questions for your editorial board to consider
▰ Is there a central place to track your journal’s key performance metrics?
▰ Can you automate any of your action steps...which ones?
▰ Can anyone access your performance data whenever they need it?
42. Complexity is a killer for editorial teams
Make sure your focus is on your peer review process and not on learning complex systems
43. Learning Objectives
1. Understand what an operational journal audit is and why it matters
2. Identify 3 steps for a successful operational audit
3. Utilize tools to successfully start auditing your journal’s peer review process
44. Resources for further reading
▰ Maintaining Reproducible Journal
Metrics: Interview with Jason Roberts
▰ How to Perform an Operational Audit
of the Scholarly Journals You Publish
▰ Journal Management Best Practices:
Tales from the Trenches
The truth is it can happen. Ok, not literally...but metaphorically. Editorial teams are often scattered and working on different things at different times. If your team doesn’t have a clear peer review process and a plan for iterating on it, you could find yourselves in a messy situation. And you may not even realize it!
So how do we get from here to here?
Not quite - all journals need their own systems. You can find the best one and refine it via an audit. Start with knowing exactly where you are, then figure out where you need to go to be more successful. That’s the point of an operational audit.
Note that they can also use this to automate reminder emails to their team
You can use bitly or UTM links to track clicks on specific links (e.g. once you made a reviewer guide did more people look at it and submit reviews?)
Examples: How does the length of your journal’s peer review process compare to other publications, how would authors rate reviewer feedback, etc.
See if you have an issue (e.g. not enough submissions) or if you see seasonal spikes. Take advantage of this knowledge to make peer review workflows, marketing, etc. more efficient
Reminder to make sure in goal setting you are keeping your publisher accountable - if you have one
Track not only progress to your goal but metrics for the steps you take to reach it. E.g. did you get more reviews submitted after reminder emails or because you invited more reliable people?
Make sure you know exactly how you got metrics, particularly if your in-software peer review reporting isn’t intuitive