ERA EI Review Consultation Paper 2020
Front cover image credits:
Biologic cell (colourful), iStock.com/ © dreaming2004
Blue ink, iStock.com/ © Pathathai Chungyam
Top view of inside a green plant, iStock.com/ © Zaharov
Deep blue silky smoke background, iStock.com/ © Storman
ISBN 978-0-6484847-3-8
© Commonwealth of Australia 2020
All material presented in this publication is provided under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International (CC BY-NC-ND 4.0) licence, with the exception of the Commonwealth Coat of Arms, the Australian Research Council (ARC) logo, images, signatures and where otherwise stated. The details of the relevant licence conditions are available on the Creative Commons website, as is the full legal code for the CC BY-NC-ND 4.0 licence. Requests and enquiries regarding this licence should be addressed to ARC Legal Services on +61 2 6287 6600.
Version 1.0
Contents
1 Purpose of the Document
1.1 Submitting feedback
2 Review Aims, Context and Guiding Principles
2.1 Aims
2.2 Terms of reference
2.3 Context
2.4 Guiding principles
3 Excellence in Research for Australia (ERA)
3.1 ERA overview
3.2 ERA policy
3.3 ERA methodology
3.4 ERA process
4 Engagement and Impact Assessment (EI)
4.1 EI overview
4.2 EI definitions
4.3 EI methodology
5 Overarching Issues Common to both ERA and EI
5.1 Frequency of ERA and EI
5.2 Streamlining and simplifying ERA and EI
5.3 Utilising technological advancements and existing data sources
Appendix A—Guiding Principles for ERA and EI
Appendix B—ERA Contextual Indicators
Appendix C—ERA and EI Rating Scales
Appendix D—Summary of Questions
Appendix E—Acronyms
1 Purpose of the Document

The Australian Research Council (ARC) is undertaking a review of Excellence in Research for Australia (ERA) and the Engagement and Impact Assessment (EI) (the Review). This paper forms the basis of public consultation for the Review. It sets out the key issues for consideration and discussion and has been informed by public reviews and stakeholder feedback.
1.1 Submitting feedback
The ARC invites responses to the consultation paper.
Feedback is particularly welcomed from stakeholders within the higher education research sector and discipline peak bodies, as well as from industry and other end-users1 of university research, and more broadly. We understand the impact that the COVID-19 pandemic has had on the Australian higher education sector and that this may affect the capacity of some universities to provide feedback. Please contact the ARC at ERAEIReview@arc.gov.au should you have any questions or concerns.
We thank you for your continued commitment to review and improve both ERA and EI.
Questions for consideration are provided throughout this paper. You are not limited to the questions posed in this document and additional feedback may be provided in the survey form. Written responses can be made through Survey Monkey or by responding to the survey questions using the template in Appendix D of this document. Submissions will be published at the conclusion of the review. If you do not wish for your submission to be published, please indicate this in your submission.
Submissions close 12 October 2020.
1 A research end-user is an individual, community or organisation external to academia that directly uses or directly benefits from the output, outcome or result of the research. Examples of research end-users include governments, businesses, non-governmental organisations, communities and community organisations.
2 Review Aims, Context and Guiding Principles
2.1 Aims
The aims of the Review are to enable the ARC to:
• respond to the ongoing needs of the university sector, government and the public for a robust evaluation of Australian university research quality, impact and engagement
• simplify and streamline ERA and EI
• take advantage of recent developments in technology and big data
• ensure that ERA and EI continue to reflect world’s best practice.
2.2 Terms of reference
The Review will consider:
• the purpose and value of research evaluation, including how it can further contribute to the Government’s science, research and innovation agendas
• the extent to which ERA and EI are meeting their objectives to improve research quality and encourage university research engagement and impact outside of academia
• the effects of both ERA and EI on the Australian university research sector, whether positive or negative, intended or unintended
• opportunities to streamline the ERA and EI processes to reduce the reporting burden on the research sector (as recommended by the House of Representatives report, Australian Government Funding Arrangements for non-NHMRC Research)2 noting the guiding principles of ERA and EI are:
  - robust and reliable methodologies
  - applicability of the methodologies across disciplines
• opportunities for coordination of research data reporting and analysis across government, thereby improving whole-of-government reporting capability and reducing the reporting burden on universities
• publicly available data sources and new developments in technology and products to capture research evaluation data
• the frequency of ERA and EI
• the appropriateness and robustness of the ERA and EI methodologies.
2.3 Context

ERA evaluates the quality of university research. EI assesses the engagement and impact of university research. Both ERA and EI are based on the principle that transparent assessment and reporting of university performance provides incentives to universities to improve research quality, engagement and impact. The comprehensive and fine-grained information from ERA and EI assessments provides a valuable resource for universities to use in their strategic planning and research management, and for Government to use to inform research policy. Both programs demonstrate the value of investment in research to the Australian community. While the first three rounds of ERA were tied to a modest proportion of Research Block Grant funding to universities, ERA and EI have been primarily reputational, not financial, drivers of university behaviour (see Sections 3 and 4 for overviews of ERA and EI).3

Feedback is being sought about whether the current objectives and methodologies of ERA and EI will meet the future needs of stakeholders. Stakeholder views are also requested on how ERA and EI may need to be modified in light of the following current and recent reviews:
• The Research Sustainability Working Group (2020), a working group of university Vice-Chancellors established to provide advice to the Minister for Education about sustainable approaches to research funding for universities during COVID-19 and beyond. While the linking of ERA and EI to funding is beyond the scope of the Review, the Review aims to continually improve the robustness and suitability of ERA and EI as measures of the quality of Australia’s research and its impact beyond academia.
• The House of Representatives review of Australian Government Funding Arrangements for non-NHMRC Research (2018), which recommended that the frequency of ERA and EI be altered and their processes streamlined to reduce burden on universities.4
• The Coaldrake Review of Higher Education Provider Category Standards (2018–2019), which recommended changes to the benchmarking of research quality in the Higher Education Provider Category Standards. The Tertiary Education Quality and Standards Agency (TEQSA) is responsible for processes and policies related to university provider category standards. The Coaldrake Review recommendations and the Government’s response have not specified a methodology for determining the benchmarking of research quality for TEQSA purposes, nor have they indicated that ERA will be used. The Review will consider the implications for universities of any changes to the ERA methodology.
• The Australian and New Zealand Standard Research Classification (ANZSRC) Review (2020), which updated the Fields of Research codes that are used to define disciplines in ERA and EI. The Review will consider the implications for universities and research disciplines of the changes.

2 House of Representatives Standing Committee on Employment, Education and Training, Australian Government Funding Arrangements for non-NHMRC Research, (Canberra: Parliament of the Commonwealth of Australia, 2018).
3 In 2016, ERA outcomes were tied to approximately 4.8% ($10.1 million) of Sustainable Research Excellence (SRE) funding, which equated to 0.6% ($10.1 million) of the total Research Block Grant allocation.
4 House of Representatives Standing Committee on Employment, Education and Training, Australian Government Funding Arrangements for non-NHMRC Research, (Canberra: Parliament of the Commonwealth of Australia, 2018).
2.4 Guiding principles
The ERA evaluation and EI assessment were developed within specific guiding principles (Appendix A). Any recommendations or outcomes of the Review must maintain these key principles to ensure that evaluation of university research is:
• robust
• reliable
• flexible (i.e. able to be applied across a broad range of disciplines).
In the context of COVID-19, the Review is also guided by considering the ongoing needs of the sector, and therefore value for effort or investment is also a key issue. Streamlining and simplifying the processes, effectively harnessing big data and technology to reduce reporting burden, and improving the transparency and robustness of both programs will help to ensure their value to stakeholders into the future.
Further information on ERA and EI can be accessed on the ARC website.
3 Excellence in Research for Australia (ERA)

This section provides an overview of ERA and issues raised in previous feedback from stakeholders. It includes questions relating to policy, methodology and process.
For further information about ERA, please visit the ERA homepage on the ARC website.
3.1 ERA overview
ERA is a national evaluation framework that evaluates the quality of Australian university research against international benchmarks. In doing so, ERA aims to identify and promote excellence across the full spectrum of research activity, including both discovery and applied research, within Australian universities.
The specific objectives of ERA are to:
1. continue to develop and maintain an evaluation framework that gives government, industry, business and the wider community assurance of the excellence of research conducted in Australian higher education institutions5
2. provide a national stocktake of discipline level areas of research strength and areas where there is opportunity for development in Australian higher education institutions
3. identify excellence across the full spectrum of research performance
4. identify emerging research areas and opportunities for further development
5. allow for comparisons of research in Australia, nationally and internationally, for all discipline areas.

ERA is a comprehensive collection of university data that includes all eligible researchers and their research outputs. It evaluates the quality of research at each university at the broad and specific discipline level.6 This enables recognition of excellence regardless of the size or specialisation of a university. At the conclusion of each ERA round, the ARC publishes a national report. The State of Australian University Research 2018–19: ERA National Report presents the outcomes of the most recent round, ERA 2018, and is available via the ARC Data Portal.

With four rounds now complete, ERA provides a wealth of fine-grained, sector-wide and discipline-specific data and analyses of Australian university research not available from other sources. This includes performance ratings since ERA 2010, extensive research staffing data (including gender), all Australian university research outputs from 2003 to 2016, and research income and research application data from 2006 to 2016. Information from ERA is used by Government, universities and other stakeholders for a variety of purposes. While some of this information is available publicly or through commercial providers, it is generally not available by discipline, or does not sufficiently cover all disciplines.
5 In this document, institutions are generally referred to as universities, except where ‘institution’ is used in a pre-existing definition. When the terms ‘institution’ or ‘university’ are used, the term refers to Australian higher education providers as defined by the Higher Education Support Act 2003 (Tables A and B).
6 In ERA, the broad discipline refers to the ANZSRC two-digit Field of Research or Division. The specific discipline refers to the ANZSRC four-digit Field of Research or Group.
For example, ERA outcomes and data:
• focus attention on research quality and thereby provide incentives for improvements in research performance
• inform a range of policy advice and initiatives across various Government portfolios
• assist universities with their strategic planning, decision-making and their research promotional activities in Australia and internationally (for example, to attract prospective researchers and students).
3.2 ERA policy
3.2.1 Value of ERA

As noted in the section above, a key objective of ERA is to identify research excellence across the full spectrum of research activity. The results of ERA have shown that, over time, university research has improved in quality (see the ERA outcomes on the ARC Data Portal). Other indicators of research quality have also shown similar trends in the performance of Australian universities and researchers.7

ERA provides a rich source of information that can inform decisions and shape policies related to Australia’s university research sector. For example, an independent report8 on ERA commissioned by the ARC found that:
• domestically and internationally, ERA was credited with assisting Australian universities’ improvements in international research rankings
• ERA had caused researchers to focus more on quality of publications rather than quantity
• ERA results were used widely by universities for strategic planning.
These conclusions are supported by more recent internal ARC analyses.9
The Review is investigating the extent to which ERA is meeting its objectives. In addition, stakeholder feedback is sought on the impacts of ERA on the Australian university research sector.
7 For example, over the same period that ERA has assessed research outputs (2003–2016), Australia's relative citation impact and share of the world’s top 1 per cent of highly-cited publications have risen, as noted in the Australian Innovation System Report 2017, p. 19.
8 ACIL Allen Consulting, Benefits Realisation Review of Excellence in Research for Australia, (2013).
9 ARC, Australian Research Council Annual Report 2017–18, (2018).
Issues to be explored

Q3.1 To what extent is ERA meeting its objectives to:
a. Continue to develop and maintain an evaluation framework that gives government, industry, business and the wider community assurance of the excellence of research conducted in Australian higher education institutions. A very large amount; A large amount; A moderate amount; A small amount; Not at all.
b. Provide a national stocktake of discipline level areas of research strength and areas where there is opportunity for development in Australian higher education institutions. A very large amount; A large amount; A moderate amount; A small amount; Not at all. Please explain your answer.
c. Identify excellence across the full spectrum of research performance. A very large amount; A large amount; A moderate amount; A small amount; Not at all. Please explain your answer.
d. Identify emerging research areas and opportunities for further development. A very large amount; A large amount; A moderate amount; A small amount; Not at all. Please explain your answer.
e. Allow for comparisons of research in Australia, nationally and internationally, for all discipline areas. A very large amount; A large amount; A moderate amount; A small amount; Not at all. Please explain your answer.

Q3.2 The ERA objectives are appropriate for meeting the future needs of its stakeholders. Strongly agree; Agree; Neither agree nor disagree; Disagree; Strongly disagree. Please explain your answer.
a. If you disagreed with the previous statement, what should the primary purpose of ERA be going forward? Please explain your answer.

Q3.3 What impacts has ERA had on:
a. the Australian university research sector as a whole
b. individual universities
c. researchers
d. other?
Please explain your answers.

Q3.4 How do you use ERA outcomes? Please describe.

Q3.5 ERA outcomes are beneficial to you/your organisation. Strongly agree; Agree; Neither agree nor disagree; Disagree; Strongly disagree. Please explain your answer.

Q3.6 Do you have any suggestions for enhancing ERA’s value to you/your organisation? Please explain your answer.
3.3 ERA methodology
ERA was announced in 2008 as a new national evaluation of university research quality. Since that time, rounds have been run in 2010, 2012, 2015 and 2018. While the ERA methodology has matured over each round, the principles underpinning the ERA indicators, agreed upon in 2008, have not changed. The ERA Indicator Principles are at Appendix A. The key quality indicators continue to be peer review or citation analysis, depending on the discipline.

3.3.1 Unit of evaluation

In ERA, the unit of evaluation is the broad or specific discipline, as defined by the ANZSRC two-digit and four-digit Field of Research codes respectively, for an eligible university.10 An example of the ANZSRC 2020 hierarchical classification structure is shown below:
Division: 39 Education
    Group: 3903 Education Systems
        Field: 390304 Primary Education
In general, for the purpose of this consultation paper, two-digit Field of Research codes are referred to as ‘broad disciplines’. Four-digit Field of Research codes are referred to as ‘specific disciplines’. ‘Disciplines’ refers to the broad and specific disciplines, collectively. For the purpose of ERA, when referring to a discipline at a particular university, ‘unit of evaluation’ is used. Universities assign each item submitted for an ERA round (i.e. research outputs, researchers, research income and applied measures) to one or more specific disciplines.
3.3.2 ERA methodology at a glance
An ERA round process
An ERA round opens with submission of data by universities for evaluation. Evaluations are conducted by Research Evaluation Committees through a series of individual and committee evaluation processes. These are outlined in the ERA 2018 Evaluation Handbook.
Indicators
The ERA indicator suite has been developed to align with the research behaviours of each discipline. For this reason, there are differences in the selection of indicators applicable to each discipline. The key quality indicators for ERA are either citation analysis, or peer review of a 30 per cent representative sample of research outputs. Citation analysis is used more commonly for disciplines in the natural sciences.11 Peer review is used more commonly in the humanities and social sciences.
10 Eligibility of Australian universities is determined by whether a university is listed in Table A or Table B of the Higher Education Support Act 2003.
11 Exceptions include 0101 Pure Mathematics, which is assessed as a peer review discipline; 08 Information and Computing Sciences, 1005 Communications Technologies and 1006 Computer Hardware have also been assessed as peer review disciplines since ERA 2012.
Citation analysis is used for disciplines in which research findings are predominantly disseminated through academic journals and there are sufficient outputs in indexed peer-reviewed journals to allow robust citation analysis.

For a range of disciplines, such as the humanities, social sciences, information sciences and disciplines at the applied end of the spectrum, citation analysis may not be appropriate—either because these disciplines do not predominantly disseminate their research findings through academic journals, or because citation information for the journals in these disciplines is not available. Many of these disciplines disseminate their research findings through other types of outlets, such as books, conferences, reports, creative works, exhibitions and performances. Therefore, in these disciplines, peer review of a 30 per cent sample of outputs across all output types is the indicator used. In ERA, the sample of research outputs is evaluated by committees of internationally recognised experts, and additional peer reviewers. For ERA, the ARC identified disciplines suitable for citation analysis through consultation with discipline peak bodies.

There are also four additional categories of contextual indicators which assist evaluators to understand each unit of evaluation:
• volume and activity
• publishing profile
• research income
• applied measures.
For more information on the application of specific indicators to individual disciplines, refer to the ERA 2018 Discipline Matrix. Further details regarding the citation and peer review methodologies are provided in the following sections.
Issues to be explored

Q3.7 The current methodology meets the objectives of ERA. Strongly agree; Agree; Neither agree nor disagree; Disagree; Strongly disagree. Please explain your answer.

Q3.8 What are the strengths of the overall methodology? Please describe.

Q3.9 What are the weaknesses of the overall methodology? Please describe.

Q3.10 Does the discipline-specific approach for evaluating research quality (citation analysis or peer review for specific disciplines) continue to enable robust and comparable evaluation across all disciplines?
3.3.3 Citation analysis methodology

The most basic and common measure of research activity is the number of peer-reviewed journal publications. Tracking the number of citations to these publications can reveal trends in the impact and influence of the research. While analysis of citation metrics is a key indicator for some disciplines in ERA, expert review of the indicators by the Research Evaluation Committees is fundamental to the methodology. The analysis of citation metrics is considered by the Research Evaluation Committees, and it is the committees that decide the ratings.
Citation analysis cannot be used for evaluating research performance across all disciplines; rather, it is used for disciplines whose primary research outputs are academic journal articles. Generally, these are the science, engineering, medical and health disciplines. For ERA, the ARC identified disciplines suitable for citation analysis through consultation with researchers in each discipline. For the most recent round of ERA, the disciplines that use citation analysis are shown in the ERA 2018 Discipline Matrix.

ERA uses two broad types of citation analysis—relative citation impact (RCI) and the distribution of publications against year- and field-specific benchmarks. A detailed explanation of the citation methodology is located in Section 5.5 and Appendix I of the ERA 2018 Evaluation Handbook.
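For illustration only: relative citation impact is broadly the ratio of an output's citation count to a world benchmark for outputs of the same field and publication year. The sketch below is a minimal, hypothetical calculation in that spirit; the benchmark figures, FoR code, data and function names are invented for the example and do not reproduce the ARC's published methodology (see the ERA 2018 Evaluation Handbook for the actual formulas).

```python
# Illustrative sketch of a relative citation impact (RCI) style calculation.
# The benchmark values below are hypothetical; ERA's actual benchmarks are
# derived from world citation data for each field and publication year.

# World average citations per output, keyed by (four-digit FoR code, year)
WORLD_BENCHMARK = {
    ("0601", 2014): 18.2,   # hypothetical value
    ("0601", 2015): 14.7,   # hypothetical value
}

def rci(citations: int, field: str, year: int) -> float:
    """Return citations relative to the world benchmark for the field and year."""
    benchmark = WORLD_BENCHMARK[(field, year)]
    return citations / benchmark

# An output cited 29 times, in a field/year with a benchmark of 14.7,
# has an RCI of about 1.97, i.e. roughly twice the world average.
print(round(rci(29, "0601", 2015), 2))  # 1.97
```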
Issues to be explored

Q3.11 The citation analysis methodology for evaluating the quality of research is appropriate. Strongly agree; Agree; Neither agree nor disagree; Disagree; Strongly disagree. Please explain your answer.

Q3.12 What are the strengths of the citation analysis methodology? Please describe.

Q3.13 What are the weaknesses of the citation analysis methodology? Please describe.

Q3.14 Can the citation analysis methodology be modified to improve the evaluation process while still adhering to the ERA Indicator Principles? Yes/No.
a. If you answered ‘Yes’, please describe how the methodology could be improved.
3.3.4 Peer review methodology

For a range of disciplines, such as the humanities, social sciences, and disciplines at the applied end of the spectrum, citation analysis may not be appropriate—either because these disciplines do not predominantly disseminate their research findings through academic journals, or because the citation data for the journals in these disciplines is not available. If the research output of a discipline is not predominantly made up of journal articles, citation analysis would give only a partial view of the research activity and would not support an accurate evaluation of research quality.

The research outputs available for peer review through ERA evaluations include the traditional range of academic outputs such as journal articles, books, book chapters and conference publications. ERA evaluations also include a range of non-traditional research outputs for some disciplines, such as original creative works, live performance of creative works, recorded/rendered creative works, curated or produced substantial public exhibitions and events, and research reports for an external body.

In ERA, a peer review sample of 30 per cent of research outputs is evaluated by committees of internationally recognised experts, and additional peer reviewers. The sample is nominated by the university. As with disciplines that use the citation analysis methodology, there must be a sufficient volume of research outputs within a unit of evaluation to ensure that the evaluation is robust. A detailed explanation of the peer review methodology is located in Section 5.6 of the ERA 2018 Evaluation Handbook.
Issues to be explored

Q3.15 The peer review methodology for evaluating the quality of research is appropriate. Strongly agree; Agree; Neither agree nor disagree; Disagree; Strongly disagree. Please explain your answer.

Q3.16 What are the strengths of the peer review methodology? Please describe.

Q3.17 What are the weaknesses of the peer review methodology? Please describe.

Q3.18 Can the peer review methodology be modified to improve the evaluation process while still adhering to the ERA Indicator Principles? Yes/No.
a. If you answered ‘Yes’, please describe how the peer review methodology could be improved.
3.3.5 Contextual indicators
Apart from the key quality indicators, ERA also includes a suite of contextual, or supporting, indicators. These are:
• volume and activity
• publishing profile
• research income
• applied measures.

For the most part, the contextual indicators are designed to provide expert evaluators with a deeper level of understanding about the unit of evaluation they are assessing, and their presence or absence has virtually no effect on the rating given to a unit of evaluation. The one exception to this is the research income indicator. At the final meeting of the Research Evaluation Committee, the committee may decide to increase the rating of a unit of evaluation where it is considered to sit on the boundary between two ratings and the income is exceptional.
Further information is in Appendix B.
Issues to be explored

Q3.19 The volume and activity indicators are still relevant to ERA. Strongly agree; Agree; Neither agree nor disagree; Disagree; Strongly disagree. Please explain your answer.

Q3.20 The publishing profile indicator is still relevant to ERA. Strongly agree; Agree; Neither agree nor disagree; Disagree; Strongly disagree. Please explain your answer.

Q3.21 The research income indicators are still relevant to ERA. Strongly agree; Agree; Neither agree nor disagree; Disagree; Strongly disagree. Please explain your answer.

Q3.22 The applied measures are still relevant to ERA:
a. Patents. Strongly agree; Agree; Neither agree nor disagree; Disagree; Strongly disagree. Please explain your answer.
b. Research commercialisation income. Strongly agree; Agree; Neither agree nor disagree; Disagree; Strongly disagree. Please explain your answer.
c. Registered designs. Strongly agree; Agree; Neither agree nor disagree; Disagree; Strongly disagree. Please explain your answer.
d. Plant breeder’s rights. Strongly agree; Agree; Neither agree nor disagree; Disagree; Strongly disagree. Please explain your answer.
e. NHMRC endorsed guidelines. Strongly agree; Agree; Neither agree nor disagree; Disagree; Strongly disagree. Please explain your answer.
3.3.6 ERA rating scale

ERA uses expert review of research quality indicators to provide ratings for individual units of evaluation. The ERA ratings are scaled 1 to 5, with 1 being well below world standard and 5 being well above world standard. ‘World standard’ refers to a quality standard. It does not refer to the nature or geographical scope of particular subjects, the focus of the research, or its place of dissemination. The ratings are bandings, meaning that a range of performance can be recognised within a single rating. Descriptors for each rating band in ERA are at Appendix C.

Over the four rounds of ERA there has been an improvement in the ratings of units of evaluation at both the broad discipline and the specific discipline level. The rating improvements over ERA rounds for units of evaluation at the specific discipline level are shown in Figure 1.
Figure 1: Comparison of percentage distribution of specific discipline unit of evaluation ratings across ERA rounds
Issues to be explored

One of the objectives of ERA is to facilitate improved research quality. There has been an increasing number of ‘4’ and ‘5’ ratings, and a drop in the proportion of ‘1’ and ‘2’ ratings, over rounds, as the example in Figure 1 shows. While this improvement reflects strategic decisions made by universities regarding their investment in research, some feedback has raised questions about whether the current rating scale can continue to sufficiently differentiate performance at the upper end of the scale.

Q3.23 The five-band ERA rating scale is suitable for assessing research excellence. Strongly agree; Agree; Neither agree nor disagree; Disagree; Strongly disagree. Please explain your answer.

Q3.24 Noting that 90% of units of evaluation assessed in ERA 2018 are now at or above world standard, does the rating scale need to be modified to identify excellence? Yes/No.
a. If you answered ‘Yes’, please explain how the rating scale can be modified to identify excellence.
3.3.7 ERA low-volume threshold
A university is only evaluated in ERA in a broad discipline, or specific discipline, if the number of research outputs submitted reaches the low-volume threshold. The low-volume threshold also ensures that most Australian universities are evaluated in at least one field of research, regardless of their size. With a higher low-volume threshold, it is possible that smaller universities will no longer be evaluated in some disciplines in which they were assessed previously. With a lower low-volume threshold, it is possible that there will be insufficient data to accurately rate some units of evaluation. For further information on the low-volume threshold and how it applies, see the ERA 2018 Evaluation Handbook, Section 1.5.1.

The ARC has received feedback from some universities that the low-volume threshold is not appropriate, and is interested in further information from stakeholders as part of this consultation.

Note—due to the recent publication of the ANZSRC 2020, the ARC is unable to provide detailed modelling of the effects of different low-volume thresholds. Stakeholders are invited to provide comments on the low-volume threshold; however, the ARC will need to model likely effects prior to making a decision on any changes.
Issues to be explored

Q3.25 The ERA low-volume threshold is appropriate. Strongly agree; Agree; Neither agree nor disagree; Disagree; Strongly disagree. Please explain your answer.

Q3.26 Are there ways in which the low-volume threshold could be modified to improve the evaluation process? Please describe.
3.3.8 ERA staff census date

For ERA, the eligibility of research outputs claimed by a university is based on a researcher's place of employment on the ERA census date, not where they were employed at the time of publication. Using a census date means that all of a researcher's publications from the reference period are attributed to the current employing university, regardless of where the original research was conducted. In doing so, the census date provides a snapshot of the current research capacity of the university. The census date approach applies to all research staff who have a formal association with the university. For employed staff, all their eligible research outputs must be submitted. For casual staff, or those with another type of association (for example, adjunct staff and visiting fellows), only those of their outputs with a by-line to the submitting university may be included.

Another option for determining which university can claim a research output is to use researcher by-lines. With a by-line approach, a university would only be able to claim a research output if the university is named in the output's by-line. Such an approach would reduce incentives to engage staff merely for the purpose of claiming all their research outputs within the reference period; however, it would not provide a snapshot of the current research capacity of a university.
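To make the contrast between the two approaches concrete, the following sketch encodes the attribution rules described above as simple functions. The data fields, function names and simplified logic are hypothetical illustrations and are not part of any ERA submission specification.

```python
# Hypothetical sketch contrasting the census-date and by-line attribution rules
# described above. Field names and the simplified logic are illustrative only.

def claimable_census_date(researcher, output, university):
    """Census-date rule: eligibility follows the researcher's association
    with the submitting university on the census date."""
    if researcher["employer_on_census_date"] != university:
        return False
    if researcher["association"] == "employed":
        return True                      # all eligible outputs are claimed
    # casual/adjunct/visiting: only outputs carrying the university's by-line
    return university in output["byline_affiliations"]

def claimable_byline(output, university):
    """By-line rule: claimable only if the university is named in the
    output's by-line, regardless of current employment."""
    return university in output["byline_affiliations"]

# Example: an adjunct at Uni A with an output by-lined to Uni A and Uni B
researcher = {"employer_on_census_date": "Uni A", "association": "adjunct"}
output = {"byline_affiliations": ["Uni A", "Uni B"]}
print(claimable_census_date(researcher, output, "Uni A"))  # True
print(claimable_byline(output, "Uni C"))                   # False
```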
Issues to be explored

Q3.27 What is the more appropriate method for universities to claim research outputs—staff census date or by-line? Please explain your answer.

Q3.28 What are the limitations of a census date approach? Please describe.

Q3.29 Would a by-line approach address these limitations? Yes/No. Please explain your answer.

Q3.30 What are the limitations of a by-line approach? Please describe.
3.3.9 ERA interdisciplinary research and new topics

ERA is a discipline-based research evaluation exercise which uses the ANZSRC Fields of Research (FoRs) to define disciplines. Interdisciplinary and multidisciplinary research is disaggregated and evaluated in its individual discipline components. Each eligible researcher and research output can be assigned to up to three specific disciplines, with a percentage apportioned to each, as illustrated in the sketch below.

For each unit evaluated, Research Evaluation Committees can see an interdisciplinary profile which shows how the research outputs have also been assigned to other specific disciplines. This provides contextual discipline information for committee members to consider when undertaking their evaluation. Where multidisciplinary or interdisciplinary work is being considered, the Chair of a committee can also call on members of other committees to provide expert advice.
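As a purely illustrative example of apportionment (the FoR codes, field labels and percentages below are invented for the example), a single output spanning education and computing disciplines might be assigned as follows, with each discipline credited the corresponding fraction of the output:

```python
# Hypothetical apportionment of one research output across up to three
# specific disciplines (four-digit FoR codes). Percentages must sum to 100.
output_apportionment = {
    "3903": 50,   # e.g. an education systems discipline (illustrative)
    "4602": 30,   # e.g. a computing discipline (illustrative)
    "4609": 20,   # e.g. an information systems discipline (illustrative)
}

assert sum(output_apportionment.values()) == 100

# Fractional contribution of this single output to each discipline's profile
for for_code, percent in output_apportionment.items():
    print(f"FoR {for_code}: {percent / 100:.2f} of one output")
```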
Issues to be explored

Some concerns have been raised by the sector that, in evaluating and reporting research quality by discipline, ERA is discouraging interdisciplinary research.

Q3.31 ERA adequately captures and evaluates interdisciplinary research. Strongly agree; Agree; Neither agree nor disagree; Disagree; Strongly disagree. Please explain your answer.
a. If you disagreed with the previous statement, how could interdisciplinary research best be accommodated? Please describe.
3.3.10 ERA and Indigenous research

ERA has not evaluated Indigenous or Aboriginal and Torres Strait Islander research separately from other disciplines. This is because Indigenous research was classified in the ANZSRC 2008 at the most granular level (the six-digit Field of Research—see Section 3.3.1) and so was not evaluated separately in ERA.12 For example, the ANZSRC 2008 Field of Research 130301 Aboriginal and Torres Strait Islander Education was evaluated within the specific discipline 1303 Specialist Studies in Education and, in turn, within the broad discipline 13 Education. The same applied to other areas of Aboriginal and Torres Strait Islander research, including health, environment, language and culture.
Issues to be explored
The ANZSRC 2020 includes a new broad discipline for Indigenous Studies, with separate specific disciplines for Aboriginal and Torres Strait Islander, Māori, Pacific Peoples and other Indigenous research. According to the current ERA methodology, these disciplines would be evaluated at a university where the low-volume threshold is met. The ARC is investigating the best way to evaluate the new Indigenous Studies broad and specific disciplines in ERA, including whether universities will be able to meet the low-volume thresholds, and whether citation analysis or peer review is the best method for a particular discipline or set of disciplines. If there is insufficient volume in certain disciplines, it may be more feasible to combine them into one or two units of evaluation.
In ANZSRC 2020, Indigenous Studies is defined as research that significantly:
• relates to Aboriginal and Torres Strait Islander, Māori, Pacific, and other Indigenous peoples, nations, communities, languages, places, cultures or knowledges, and/or
• incorporates or utilises Indigenous methodologies/ways of knowing, theories, practice and/or is undertaken with or by these peoples, nations or communities.

Note—as Indigenous Studies is a new classification in ANZSRC 2020, the ARC is unable to provide detailed modelling at this time regarding volume. We note that universities may also be unable to undertake their own modelling at this time. Stakeholders are invited to provide general comments regarding the evaluation of Indigenous Studies; however, the ARC will need to undertake further data analysis and consultation prior to making a decision on any changes.
12 With the exception of 1802 Māori Law.
Q3.32 My institution would meet the ERA low-volume threshold in Indigenous Studies at the:
a. two-digit level? Yes/No. If you answered ‘Yes’, please list which ones.
b. four-digit level? Yes/No. If you answered ‘Yes’, please list which ones.

Q3.33 In ERA, the best approach for evaluating Indigenous Studies is (choose one):
a. using the established ERA methodology, i.e. the low-volume threshold would apply to the Indigenous Studies discipline and all its specific disciplines
b. for Aboriginal and Torres Strait Islander studies, combining low-volume disciplines into a single unit of evaluation
c. for Aboriginal and Torres Strait Islander studies, combining low-volume disciplines into two units of evaluation (one unit comprising Humanities, Arts and Social Sciences disciplines and one unit comprising Science, Technology, Engineering and Mathematics disciplines)
d. other. Please describe.

Q3.34 What would be the advantages and/or disadvantages of your preferred approach for evaluating Indigenous Studies in ERA? Please describe.
3.4 ERA process
3.4.1 Collection of ERA data

Currently, ERA collects data for evaluation every three years during the ERA submission and evaluation year, the most recent being 2018. In the response to the House of Representatives report on Australian Government Funding Arrangements for non-NHMRC Research, some submissions recommended that ERA collect publication data annually, suggesting that this would streamline or reduce the reporting burden associated with a major triennial data collection.13 The ARC is interested in the views of stakeholders regarding a move to annual collection of data from universities for ERA.
Issues to be explored

Q3.35 ERA should move to an annual collection of data from universities. Strongly agree; Agree; Neither agree nor disagree; Disagree; Strongly disagree. Please explain your answer.

Q3.36 What would be the advantages and/or disadvantages of an annual data collection? Please describe.
13 House of Representatives Standing Committee on Employment, Education and Training, Australian Government Funding Arrangements for non-NHMRC Research, (Canberra: Parliament of the Commonwealth of Australia, 2018).
3.4.2 Publication of ERA data

The ARC publishes a range of information for each ERA round in the ERA National Report. This report includes the ratings for units of evaluation as well as data on research outputs, staff and research income, aggregated at the specific or broad discipline level. Some universities have suggested that volume data, that is, the volume of outputs submitted in each unit of evaluation, should also be published. In ERA 2018, additional data was released through the ARC Data Portal, including the metadata for each output submitted.14

To improve transparency and accountability, the ARC intends to publish the discipline assignment information for each research output in future ERA rounds. Where more than one university has included the same output in its submission, the discipline assignment for each university would be shown.
Issues to be explored

Q3.37 In future ERA rounds, should the volume of outputs submitted for each unit of evaluation be published?
a. Yes. Please explain your answer.
b. No. Please explain your answer.

Q3.38 In future ERA rounds, research outputs should be published with their assignment to specific disciplines following completion of the round. Strongly agree; Agree; Neither agree nor disagree; Disagree; Strongly disagree. Please explain your answer.
a. What would be the advantages? Please explain your answer.
b. What would be the disadvantages? Please explain your answer.

Q3.39 What other data do you think the ARC should publish following an ERA round? Please describe.
14 Metadata included: research output title, research output type, reference year, outlet, publisher, ISBN, ERA round and institution.
4 Engagement and Impact Assessment (EI)

This section provides an overview of the EI assessment, its history and development and analysis of issues raised previously in feedback from stakeholders. The section includes questions relating to EI policy, methodology and process.
For further information about EI, please visit the EI homepage on the ARC website.
4.1 EI overview
EI is a national assessment framework that assesses how researchers engage with the users of their research, and how they translate their research into impacts beyond academia. In doing so, EI aims to encourage greater collaboration between universities and research end-users, such as industry.
The specific objectives of the EI assessment are to:
• provide clarity to the Government and the Australian public about how their investments in university research translate into tangible benefits beyond academia
• identify institutional processes and infrastructure that enable research engagement
• promote greater support for the translation of research impact within institutions for the benefit of Australia beyond academia
• identify the ways in which institutions currently translate research into impact.
The Australian Government first announced the development of an engagement and impact assessment in December 2015, as part of its National Innovation and Science Agenda (NISA). EI was developed through consultations with universities, stakeholders and experts, and through a Pilot conducted in 2017.15 The first full round followed in 2018.

EI uses expert review of quantitative and qualitative measures of research engagement, and qualitative measures of research impact and approach to impact, at the broad discipline level. Further details of the EI methodology are outlined in Section 4.3. Results and key findings from the EI 2018 assessment were released in March 2019 in the EI 2018 National Report. Over 200 impact studies and 170 engagement narratives that received a high rating were also published as examples of best practice.16
15 Further information about the EI Pilot and its findings is available on the ARC website.
16 ARC, ARC Data Portal.
Issues to be explored

Q4.1 Considering that EI is a new assessment, to what extent is it meeting its objectives to:
a. encourage greater collaboration between universities and research end-users, such as industry, by assessing engagement and impact? A very large amount; A large amount; A moderate amount; A small amount; Not at all. Please explain your answer.
b. provide clarity to the Government and the Australian public about how their investments in university research translate into tangible benefits beyond academia? A very large amount; A large amount; A moderate amount; A small amount; Not at all. Please explain your answer.
c. identify institutional processes and infrastructure that enable research engagement? A very large amount; A large amount; A moderate amount; A small amount; Not at all. Please explain your answer.
d. promote greater support for the translation of research impact within institutions for the benefit of Australia beyond academia? A very large amount; A large amount; A moderate amount; A small amount; Not at all. Please explain your answer.
e. identify the ways in which institutions currently translate research into impact? A very large amount; A large amount; A moderate amount; A small amount; Not at all. Please explain your answer.

Q4.2 The EI objectives are appropriate for the future needs of its stakeholders. Strongly agree; Agree; Neither agree nor disagree; Disagree; Strongly disagree. Please explain your answer.

Q4.3 What impact has EI had on:
a. the Australian university sector as a whole? Please describe.
b. individual universities? Please describe.
c. researchers? Please describe.
d. other sectors outside of academia? Please describe.

Q4.4 How do you, or your organisation, use EI outcomes? Please describe.

Q4.5 The EI outcomes are valuable to you or your organisation. Strongly agree; Agree; Neither agree nor disagree; Disagree; Strongly disagree. Please explain your answer.

Q4.6 How else could EI outcomes be used? Please describe.
4.2 EI definitions
For the purposes of the EI 2018 submission and assessment, the following definitions were used:
Research
Research is the creation of new knowledge and/or the use of existing knowledge in a new and creative way to generate new concepts, methodologies, inventions and understandings. This could include the synthesis and analysis of previous research to the extent that it is new and creative.
This is the same definition used for ERA. It is consistent with a broad notion of research and experimental development comprising "creative and systematic work undertaken in order to increase the stock of knowledge—including knowledge of humankind, culture and society—and to devise new applications of available knowledge" as defined in the ARC funding rules.
Aboriginal and Torres Strait Islander research
Aboriginal and Torres Strait Islander research means that the research (as defined in the preceding definition) significantly:
• relates to Aboriginal and Torres Strait Islander peoples, nations, communities, language, place, culture or knowledges, and/or
• is undertaken with Aboriginal and Torres Strait Islander peoples, nations, or communities.
Engagement
Research engagement is the interaction between researchers and research end-users outside of academia, for the mutually beneficial transfer of knowledge, technologies, methods or resources.
Impact
Research impact is the contribution that research makes to the economy, society, environment or culture, beyond the contribution to academic research.
Research end-user
A research end-user is an individual, community or organisation external to academia that will directly use or directly benefit from the output, outcome or result of the research. Examples of research end-users include governments, businesses, non-governmental organisations, communities and community organisations.
Specific exclusions of research end-users are:
• publicly funded research organisations (CSIRO, AIMS, ANSTO, NMI, DST etc.)
• other higher education providers (including international universities)
• organisations that are affiliates, controlled entities or subsidiaries (such as medical research institutes) of a higher education provider
• equivalents (international or domestic) of the above exclusions.
In EI 2018, certain types of organisations were excluded from the definition of end-user for the reason that engagement and impact were required to be beyond academia. There has been some feedback that the research end-user definition is unclear or excludes organisations which are legitimate end-users of research. There is an additional concern that university research which has an impact within the university sector is ineligible for assessment under EI's current research end-user definition. An example of this is research on higher education which leads to impact within the higher education sector.