REF2021 Development Plans

Hi all!


As you may have seen, I recently joined the Elements team (yay!), and I've been asked to lead on the development of REF2021 functionality (I managed the University of Edinburgh's RAE2008 and REF2014 submissions, and am joining Digital Science from the Pure team at Elsevier, where I product-managed reporting and assessment functionality).


We are currently working on our detailed REF2021 development plans through 2020, and we will be in touch about these in the coming weeks.


In the meantime though, we wanted to advise you of our most immediate plans for 5.15 (May release):


Following on from the 5.14.1 release, which includes functionality that begins to look at managing outputs through a group view rather than on a per-individual level, we will continue to focus our efforts in the short term on delivering functionality that supports this concept, along with building up the REF2 data model.  We currently aim for this to include:

  • Expanding functionality available via the new Manage Selections screen to assist in the UoA selection/attribution decision-making process.
  • Introducing the ability for REF managers to capture selection statuses that will be used as a basis for UoA-level aggregation.
  • Supporting restricted visibility of REF-specific fields to REF managers only.
  • Adding REF-specific fields such as Double-weighting request, Interdisciplinary flag, Output allocation, and Additional information.


Many thanks!


Manya



Hi Manya, thanks for this update! Will this include Symplectic working with the REF submission system?

Hi there!


We don't expect Elements to interact directly with the REF Submission System.  Instead, our plan is to offer functionality that will enable you to download your submission XML from Elements and import it into the REF Submission System.
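If it helps with planning your local process, here is a minimal sketch of how you might validate an exported XML file against the published submission schema before uploading it; the file names and schema location below are hypothetical placeholders, not anything shipped with Elements.

```python
# Minimal sketch: validate the XML exported from Elements against the
# submission schema before uploading it to the REF Submission System.
# File names are hypothetical placeholders.
from lxml import etree

schema = etree.XMLSchema(etree.parse("ref2021-submission.xsd"))   # schema published with the REF guidance
export = etree.parse("elements-ref-export.xml")                   # submission XML downloaded from Elements

if schema.validate(export):
    print("Export validates against the schema")
else:
    for error in schema.error_log:
        print(f"line {error.line}: {error.message}")
```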


Thanks :)

Hi Manya


The REF submission guidelines have now been published (attached below), which I must admit is a little earlier than I expected. Can you outline your development timelines in this respect, and confirm whether you will be testing directly with the portal itself?  Any insight would be appreciated so we can develop plans on our side.


Thanks Nicky 


Hi Nicky


I know right?  A LOT earlier than expected - which is actually really great (so long as they don't revise it too much over the coming months).


We would hope to be able to test directly with the REF Submission System, but this will depend on whether the REF Team will allow direct vendor access - for REF2014, they only allowed vendors direct access to the test (and possibly pilot, if I remember correctly) system, but not production.  We have, however, made representations to the REF Team that such access would be very valuable.


Within the next couple of weeks, I will be posting a high-level roadmap here, prior to the REF webinar on 24 April, where we will also discuss plans and timescales.


Thanks!


Manya



Hi all


In preparation for next week's REF webinar, please find attached some documentation outlining our plans for REF2021 development, including mockups of screens relating to the management of REF2 Research Outputs.


Just a reminder that you can register for the webinar here:


https://zoom.us/meeting/register/bcafa4b5e0c9850a7c24e00bf0acd2b8


If you're unable to make it, the session will be recorded and I'll post the link here.


Have a lovely Easter weekend :)


Manya


Hi everyone!


Thanks to everyone who was able to join the REF webinar this morning.  The recording is now available for those who were unable to make it, or who would like to share the link with colleagues:


https://zoom.us/recording/share/GvsXiVGI1L76TlAgY_5a3r4l7AX2uFSZSwPfbkMPxLSwIumekTziMw?startTime=1556093976000


Also, attached are the slides :)


Thanks!


Manya


Thanks for sharing the recorded version - very useful! I was interested in Keith Fildes' question about turning off the author selection workflow and doing everything in the review stage. This might be useful for us too, so could you let us know your answer to that one as well, please?

Hi Mark


I'll post responses here in this discussion to all the queries I was unable to answer during the session :) Just working with colleagues now to get those answers!


Thanks!


Manya

Hi Manya


Can you confirm whether the 5.15 release is still on target for 2 May, and advise when the release notes will be issued, please?


Regards

Nicky

Hi Nicky,


Yes, dare I say, touch wood, the 5.15 release is on target for Thursday, 2 May.

The release notes will be published as part of making that release, and will be available on the support site at that point, in the Release Notes folder ( https://support.symplectic.co.uk/support/solutions/folders/6000177979 ).

Best wishes,
Jonathan M


Hi Nicky,

Just a brief note to confirm that the 5.15 release is available on our support site, and the Release Notes are here: https://support.symplectic.co.uk/support/solutions/articles/6000222048-symplectic-elements-v5-15-release-notes


Best wishes,
Jonathan M

Symplectic Elements REF Webinar 24 April 2019 - Questions from the call.

Thanks to you all for raising your questions during the REF webinar. Please find our responses to each of the eight questions below. Note that Manya is on leave until 20 May, so please accept our apologies that one or two of these are interim answers (with private support tickets involved) while we gather additional details.



1. Even if you aren’t directly supporting the impact case study submission, are you committing to developing the impact module further to help us gather/store/prepare the impact evidence?

 Adam Riches, MMU

Kate Byrne (VP, Product Management) is currently reviewing all Impact Module related feature requests to identify the next phase of work on the Impact Module ahead of our mid-year roadmap update. If you have any feedback about the Impact Module specifically relating to REF that isn’t already captured as a feature request, please contact us via the Support site.

 


2. Would it be possible to have a custom scoreset at the Accepted -> Attributed decision stage, so that we can rank the pool of possibles into Definite/Likely/Reserve/etc - we will have maybe 3000 to whittle down to 1000 in just one UoA, so some sort of interim scoring would be useful.

 Andy Reid, LSHTM

Thanks for the question, Andy. We have a support query ongoing with you, including further clarifying questions regarding your intended use pattern here. We’ll consider how scoresets can best be used for this purpose, and also consider alternatives. As a possibility, could it help to view / set labels on some of the REF2 screens?



3. How do we calculate (ideally) or flag that the publication was within the contract period for a 1B?  What if multiple periods of Cat A eligible on a single 1B?

 Owen Roberson, Cambridge

Thanks for this, Owen. Manya will coordinate our response to this as we take our plans for the REF1 aspects of functionality further forward, and consider how we might take content from the REF1 form for Category B staff and use it to validate against any attributed REF2s.



4. Would it be worth having a ‘Fewer than 15 authors’ flag as well?

 Andy Reid, LSHTM

We don’t propose to have such a flag, but rather to employ logic to determine whether an ‘author contribution statement’ is required, based on:

  • UoA
  • Number of co-authors (derived from the associated publication, or the ‘Number of additional co-authors override’ field on the REF2)
  • Whether the staff member is the corresponding / lead author

Our current working notes on this point are covered by this annotation on the mockups (screenshot below). We welcome input from you and other interested parties in validating that the logic involved is fit for purpose, and that the prioritisation of this part of the feature is appropriate given other competing priorities.
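For discussion, here is a minimal sketch of how that logic might look. The 15-co-author threshold, the set of UoAs to which the requirement applies, and the corresponding/lead-author exemption are all illustrative assumptions, not confirmed Elements behaviour or REF policy; the authoritative rules should come from the REF guidance and the annotation above.

```python
# Illustrative only: the threshold, the UoA set, and the corresponding-author
# exemption are assumptions, not confirmed Elements behaviour or REF policy.

CO_AUTHOR_THRESHOLD = 15                        # assumed cut-off, echoing the "fewer than 15 authors" question
UOAS_REQUIRING_STATEMENT = {"UoA 1", "UoA 4"}   # hypothetical set of UoAs where the requirement applies


def contribution_statement_required(
    uoa: str,
    co_author_count: int,            # from the linked publication, or the REF2 override field
    is_corresponding_or_lead: bool,
) -> bool:
    """Return True if an author contribution statement would be required."""
    if uoa not in UOAS_REQUIRING_STATEMENT:
        return False
    if co_author_count < CO_AUTHOR_THRESHOLD:
        return False
    # Assumption: a corresponding/lead author would not need a separate statement.
    return not is_corresponding_or_lead
```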




5. In terms of output selection, it would be great if, for each UoA, we could see the list of eligible outputs ranked by review score

 Karl Smith, LSBU

 We’ve added this to our development plans.



6. For attribution support, maybe you could develop a report listing all accepted outputs and all REF1 authors in the UoA on a grid showing what could be attributed to who?

 Tim Brooks, Anglia

 

We will put this request on our list of features to consider for future development.  We are unsure, though, how well this would scale with, say, 300 REF1 authors and 1,500 REF2 outputs.
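To make the scale question concrete, here is a rough sketch (not planned functionality) of the kind of grid described, built from a hypothetical extract of candidate output/author pairs.

```python
# Rough sketch, not planned functionality: cross-tabulate accepted REF2 outputs
# against REF1 authors in a UoA to show candidate attributions.
import pandas as pd

# Hypothetical extract of candidate (output, author) attribution pairs
candidates = pd.DataFrame([
    {"output": "Paper A", "author": "Smith"},
    {"output": "Paper A", "author": "Jones"},
    {"output": "Paper B", "author": "Smith"},
])

grid = pd.crosstab(candidates["output"], candidates["author"])
print(grid)  # 1 where an output could be attributed to an author, 0 otherwise
```

Computing such a grid is cheap even at 300 authors by 1,500 outputs (around 450,000 cells); the harder problem is reviewing it on a single screen, which is the scaling concern noted above.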



7. Will it be possible to turn off the author selection workflow stage, so we could manage the full corpus in the review stage?

 Keith Fildes, Sheffield Hallam

(Question repeated by Mark H at UAL on https://support.symplectic.co.uk/support/discussions/topics/6000057523 )

By ‘full corpus’, do you mean all REF-eligible publications in the census period that are linked to eligible REF1 staff, rather than only those which have been selected by one or more academics? It’s a question of style/approach as to what extent REF managers choose to over-select publications and manage the whittling down later, after they have been reviewed, using the new Acceptance and Attribution stages in the forthcoming REF2-specific functionality.



8. Are there any plans to amend the open access verification screens?  Or indeed a perceived need to amend these screens?

 Nicola Bates

 We need a little more information to be able to answer this fully, so have contacted Nicola to discuss this directly.


Thanks and best wishes, Jonathan M

Hi Manya


Do you have a release date yet for the following, as our users are asking:


Expanding functionality available via the new Manage Selections screen to assist in the UoA selection/attribution decision-making process


Kind regards



Karen

Hi Karen


The 5.15 release included functionality that enables the acceptance (or not) of outputs, and we expect to expand on this in the 5.16 release (27 June) with functionality to support the attribution of REF2 outputs to individual REF1 researchers.


Thanks!


Manya

