Research information management
Developing tools to inform the management of research and translating existing good practice

Contents
1. Introduction
2. Aims and objectives
3. Methodology and implementation
  3.1 Approach to the research study
  3.2 Selecting the sample
  3.3 Project interviews
  3.4 Project workshops
4. Suppliers and the marketplace
  4.1 Institutional research information management needs
  4.2 How suppliers and institutions might meet these needs
  4.3 Who are the suppliers, and what do they supply?
  4.4 Challenges articulated by research offices
5. Performance management
  5.1 Data collection
  5.2 Key performance indicators for research
6. IT relationships and strategies
7. Research system satisfaction
8. Vision for research tools
  8.1 Operational research systems
  8.2 Research information tools
9. Good practice in implementation
  9.1 Project management methodologies
  9.2 Project structure
  9.3 Communication and engagement
  9.4 Lessons learned from systems implementations
10. Conclusions
  10.1 Research information landscape
  10.2 Information tools
  10.3 Best practice in implementation
11. Recommendations
12. References

Appendices
A. Institutions visited
B. Interview questions
C. Project workshop notes
D. A perspective on suppliers active in the marketplace
E. Imperial College London case study
1. Introduction

Universities' core business is research. It is key to a university's reputation and is central to its mission. Nationally, governments recognise that research is critical for expanding the university knowledge base, driving improvements in teaching, and in advancing social and economic gains. As universities have sought to increase and diversify revenue streams and to reduce their dependency on block government funding, externally sponsored research, in particular, has achieved greater prominence.

Developing and managing a research portfolio is not easy. The landscape in which research grants and contracts are bid for and won is competitive and globalised, with competition only likely to intensify as a result of the current financial situation. Recent years have seen a trend toward research becoming more international and more interdisciplinary, making the management of research funding an increasingly complex task. On a broader level universities are heavily regulated and scrutinised by governments who seek transparency and value for money. Mechanisms such as the Research Assessment Exercise (RAE) and the Research Excellence Framework (REF) have placed significant demands on universities to ensure they demonstrate quality and value-added outcomes in their research. With a bleak financial outlook for universities these demands are only likely to increase. There is an even more pressing need to manage resources efficiently and to be effective in identifying opportunities.

Research management has evolved to fit this dynamic research environment. People, processes and systems are key factors in delivering research excellence, both operationally and strategically. The functions of university research offices and the demands on staff working in research management have become more varied, growing to embrace a wide range of activities and responsibilities. Yet, as was demonstrated in the Professionalising Research Management report (2009), this is a young profession, characterised by a lack of coordination, few shared structures, and with no regulated qualification framework.[1]

Something similar is true of research management systems, which have developed without a coordinated approach to cope with increasing demands. Competitive academic environments require efficient and responsive systems; increasing breadth and complexity in the research portfolio requires systems to be flexible and able to handle a range of different scenarios; and increasing regulation requires active management and measurement of both academic and administrative staff. The information that is obtained from these systems is required for a variety of reasons. Strategically, it informs an institution of its performance and competitiveness and allows it to take decisions based on that information. Operationally, systems are required to support day-to-day administration of research and fulfil the needs of external stakeholders.

Within the higher education sector there is a growing recognition of the need for research intelligence and well-established performance management frameworks. These can help focus institutional strategies on research quality, raise the profile of an institution's research nationally and internationally, manage talent, and build a high-quality research environment. Yet there also appears to be considerable dissatisfaction with the systems on offer, and a lack of coordination between institutions as each implements their own solution to problems that are shared across the sector.
This study aims to understand the current research management systems landscape. It has focussed on how information from data can be used to inform strategic decision-making at a variety of levels, and on how research management can be improved across the sector. It is not a system-specific study and seeks to develop an understanding of system needs, especially in relation to information intelligence independent of specific products. Ultimately, however, effective implementation of a software system is as critical as the product itself. There is a need to review the university sector's success in implementing research management systems with the aim of translating good practice and providing a resource for the sector. This study seeks to evaluate the ways in which institutions across the sector create and implement tools for managing research-related data from systems, and to compare the variety of tools available. By doing this it has aimed toward a fuller understanding of how system tools can be best implemented.

[1] John Green and David Langley, Professionalising Research Management (2009). Available at www.researchdatatools.com

2. Aims and objectives

The aims of this study are to:
• Develop an overview of the systems used by the institutions involved in the study
• Evaluate the ways in which institutions across the sector create tools for managing research-related data from systems
• Compare the variety of tools available in the marketplace
• Share possible ways of integrating tools
• Develop an understanding of research management metrics
• Review the sector's success in implementing research management systems
• Build upon and share Imperial College London's experiences developing and implementing a range of research management systems
• Translate good practice and provide a resource for the sector

The objective is to provide an interpretation of the landscape on which further work could be based to implement the findings of this study.

3. Methodology and implementation

3.1 Approach to the research study

This study uses an inductive approach to research. No specific theory or hypothesis has been tested; rather, information has been collected in an attempt to arrive at key conclusions that can be related back to existing theories or to develop new concepts.

3.2 Selecting the sample

The study is confined to English institutions. The Evidence UK Higher Education Research Yearbook 2009 lists 110 higher education institutions that are considered research active (i.e. they receive funding to carry out research-related activities).[2]
This reconciles broadly to current Higher Education Funding Council for England (HEFCE) data that lists 130 higher education institutions to whom it distributes funding for research-related activities.[3] The slight difference can be explained by the broader definition of higher education institution used by HEFCE.

Having established the total size of the higher education sector within England and identified those institutions that receive research funding, the next step was to select a robust and representative sample. A sample size of roughly 20% was identified as large enough to give confidence in the statistical significance of any cross-sector data and trends identified, balanced with the three month timescale within which the study was to be completed.

Twenty-four institutions were approached to take part in the study, with a bias toward research-intensive universities. This was because the study focuses on research information systems and was not concerned with teaching activities and their systems.

Institutions were selected against the following criteria:
• Total turnover
• Amount of externally sponsored research income
• Geographic location

Letters were sent to directors of research offices, copied where possible to pro vice-chancellors for research, at each of the twenty-four target institutions. These explained the background and objectives of the study and invited them to participate. Of the initial selection, three declined. These were not replaced with comparable institutions as it was felt that twenty-one institutions represented a sufficient sample size. The institutions interviewed are listed in appendix A.

In 2008 approximately £3.7bn of external research funding was given to higher education organisations in England. The sample selected for this study represented £2bn of that funding, accounting for almost two thirds of this total value.[4] The number of institutions selected for the sample accounts for a disproportional amount of value: essentially the average value of research income for the sample selection is higher than that of England as a whole. Institutions with varied research portfolios and one specialist institution were included, but it was felt that a more complete picture of the sector could be achieved by skewing the sample toward those with higher levels of research income, and that it was in institutions handling larger research volumes that the need for research management systems and information analysis would be the most pressing.

[2] Evidence, UK Higher Education Research Yearbook 2009 (Thomson Reuters, 2009).
[3] HEFCE data is taken from http://www.hefce.ac.uk/pubs/hefce/2010/10_08/
[4] Evidence, UK Higher Education Research Yearbook 2009 (Thomson Reuters, 2009).

3.3 Project interviews

Letters were sent to the institutions who had agreed to take part in the study inviting them to nominate staff involved in the management of research and systems to be interviewed. As a result a range of staff involved in research were interviewed, including senior academic staff, such as pro vice chancellors for research (PVCRs), directors of research offices, systems and IT staff, and research office staff. This broad selection meant that at the majority of institutions it was possible to capture the views of both the academic and administrative communities.

Before each interview secondary data was compiled. In keeping with most public sector organisations, a large amount of data was available on each institution's website. Copies were taken of annual accounts, annual reports, and strategic or corporate plans. Analysis from the Evidence UK Higher Education Research Yearbook 2009 was also included.
Interviews were conducted on a semi-structured basis and covered a broad range of topics including research strategy, organisation, IT relationships, current IT systems used for the management of research, the implementation of systems, and performance measurement. To ensure consistency, a standard question list was prepared to be used at each interview (appendix B). This addressed a range of questions related to research management systems, but was focussed on the two project deliverables: tools to support research and best practice in implementing systems. This list was used at each interview, but it was felt that presenting interviewees with a long list of questions was not the best method to collect information and so the prepared questions were used as broad topic areas for discussion and not all were asked specifically at each interview.

The interviews were conducted in single group sessions and lasted between one-and-a-half and two-and-a-half hours. Imperial College London staff led the interviews. At each there were at least three interviewers, at least one of whom was a project leader. Both of the project partners, Imperial College London and Elsevier, were represented at all of the interviews. Independent notes were taken by all interviewers. These were compared and collated following the interview and were checked by another member of the project team for bias. Detailed notes that summarised and represented the outcomes of the interview were then agreed. In some cases the interviewees provided further supplementary data post-interview, such as systems data or more recent strategic plans, which were used to validate the interview write-ups as necessary.

Interviewees were assured of their anonymity in advance of all meetings and some comments have been edited to preserve this. Where necessary, repetitions and non-standard English have been removed.

The advantages and disadvantages of interviews as a method of data collection have been much debated. In this study interviews were appropriate to the exploratory nature of the research. Interviews are inherently adaptable and a skilful interviewer can follow up particular ideas, probe responses and investigate motives and feelings. Other indicators such as body language, hesitancy or the use of metaphors can be picked up through face-to-face interviews and would not be possible through other mechanisms such as surveys or online exercises. Interviews offer each interviewee the opportunity to think aloud and uncover issues not previously thought about, which can contribute to a rich set of data, something that was apparent during the course of this study.

Despite the advantages, a number of pitfalls need to be negotiated when adopting an interview-based study. It is important to avoid leading questions and bias during interviews, though it must be acknowledged that interviewees can inadvertently contribute to bias by concealing answers when pushed on sensitive topics. During this study the interviewers were keen to avoid bias, and vetted the questions with colleagues before settling upon a final list. During interviews, effort was made to ensure questioning was consistent and did not steer answers in a certain direction through tone or body language. In addition, an external facilitator was used to review and analyse the interview notes to provide external moderation.

3.4 Project workshops

After interviews had been conducted at the sample institutions, participants were invited to take part in one of three regional project workshops. These workshops were used to validate the findings of the project interviews and to develop a consensus. To avoid bias and provide a fresh perspective on the results, the workshops were organised and facilitated by an external provider. Members of the project team sat in on the workshops and made notes but did not actively contribute.

The workshops were held in locations chosen to attract as many attendees from as many institutions in the sample as possible: Leeds, Bath, and London. All but five of the institutions interviewed were represented at the project workshops, though not all interviewees were able to attend and some of the attendees had not participated in the project interviews.
Each workshop lasted two hours though in each instance the approach was tailored to fit the audience attending. The primary objective was to establish a framework that would encourage open discussion and debate and to find consensus where possible. To facilitate this, key findings and representative quotations were drawn from the write-ups of the project interviews. These were divided into eight sections:
• Suppliers and the marketplace
• Performance management
• Performance management targets
• IT relationships and strategies
• Research system satisfaction
• Vision for future systems
• Research system implementation
• Lessons learned from implementation

Each section was written on flipcharts, with the top ten findings based on the interview transcripts listed on each page. Slides of relevant quotations from interviewees in each section were also used. After considering each section in turn, attendees were asked to identify themes with which they strongly agreed or disagreed. A brief discussion followed to pull together findings in each area, and to establish where a particular finding might have been misinterpreted or where there was general consensus about a finding.

After each workshop notes were written up and collated between the project team (appendix C). The findings were combined to ensure that:
• Pertinent themes from the interviews were correctly identified
• Stakeholders' vision of future research information systems had been discussed and captured
• Elements of good practice that could be shared across the sector to improve implementation had been examined

Finally, the findings were assimilated with the validated findings from the project interviews to form the basis of the final project report. Time was taken to identify conflicting opinions and to draw out consistent findings.

Considerable enthusiasm was generated during the workshops and without exception all participants were keen to work together in future and to take the findings of this study forward in some form.

4. Suppliers and the marketplace

4.1 Institutional research information management needs

Institutions need to understand their strengths and weaknesses, and to match their strengths to the landscape in which they operate. They may opt to do this through clear targets and an equitable performance management framework aimed at growing the research base and thereby improving their reputation and stature. While this is a broad ranging endeavour, research data is a critical component of the information required in order to understand how an institution can develop and deliver its research strategy. From a research information management perspective the following needs were identified.
What institutions want from research information
1. Help academics identify funding opportunities to perform research
2. Calculate costs to perform research in order to complete grant applications in compliance with full economic costing (FEC) requirements
3. Monitor academics' funding applications and monitor success rates
4. Manage funds once awarded, including invoicing and cash collection at appropriate milestones
5. Aggregate and benchmark research outputs and outcomes, including publications, patents, and licences
6. Showcase strengths of individual and institutional research activity, for example through online academic profiles and esteem measures
7. Help researchers collaborate by facilitating and tracking opportunities, especially in interdisciplinary areas, within institutions, across departments, and with researchers from other institutions
8. Help institutions collaborate by facilitating and tracking opportunities with corporations, national and local government bodies, and with other institutions
9. Facilitate business development activities by capturing and analysing a meaningful record of previous activities undertaken with specific funding bodies or potential partners
10. Identify talent externally for potential academic recruitment
11. Facilitate scenario planning at individual and aggregate levels, e.g. income sensitivity to key staff movements or major projects

Figure 1: What institutions want from research information

Day-to-day operational needs relate to identifying, applying for, and managing research grants. First, institutions aim to help researchers identify funding opportunities, either through the research office or more usually by providing tools to the academics themselves. Once funding has been identified, researchers need to calculate costs associated with a project. Costing activities are more usually driven by the research office and need to adhere to full economic costing (FEC) policy. Pricing is in accordance with funders' terms and conditions. As grant applications reach decision-points, research offices will typically seek to monitor grant success, generally by measuring the number, frequency, financial value and outcome of applications. These are often broken down by absolute and percentage growth measures. Following successful grant applications a post-award team in the research finance office typically manages funds to complete financial execution of the grant.

Research offices need to support the management and development of academic and institutional performance. Although institutions differ in their use of outcomes to influence their personal and institutional development plans, all those interviewed required the research office to monitor research outcomes – usually publication metrics or patent, license and esteem measures. In some cases these are benchmarked by discipline, geography, institution or research cluster. The need to showcase the strengths of an institution tends to be addressed by using the same outcomes – again, at both an individual and aggregate level.

Another grouping of needs centres on the increasingly interdisciplinary and international nature of research. Research offices seek to help researchers collaborate by enabling them to identify expertise, affiliations and relationships that can support their research goals. These needs may be served by using grant and outcome data to identify complementary and additive relationships with outside parties. Coordinating this activity, and maintaining a record on an institutional level of what has and has not worked with potential partners, is a recognised need in research offices and business development teams.

Finally, at a strategic and institutional level, management systems are expected to provide information valuable to long-term planning. This is particularly relevant in terms of faculty planning and financial planning. In the former, research information tools help to identify talent for recruitment. Closely related is the need to scenario plan financial and organisational change. The funding revenue that underpins the institutional finances is often sensitive to key departments or faculty. Information can provide both the level of sensitivity (e.g. by projected grant income or contribution percentage from a single member of faculty) and highlight potential answers to mitigating risk (e.g. through identifying external talent and monitoring the distribution of principal investigator income and awards).
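Need 11 is the most computational of the list: given grant records that carry a principal investigator, a department, and a projected value, income sensitivity reduces to aggregating income per PI and comparing it with the departmental total. The sketch below is illustrative only; the records, names, and the 20% threshold are invented for the example rather than drawn from the study.

```python
# Sketch of the income-sensitivity analysis described in need 11.
# Records, names, and the 20% threshold are invented, not study data.
from collections import defaultdict

# (principal investigator, department, projected annual grant income in £)
grants = [
    ("PI-A", "Chemistry", 1_200_000),
    ("PI-B", "Chemistry", 300_000),
    ("PI-C", "Physics", 900_000),
    ("PI-D", "Physics", 850_000),
]

def income_concentration(grants, threshold=0.20):
    """Flag PIs whose departure would remove more than `threshold`
    of their department's projected grant income."""
    dept_totals = defaultdict(float)
    pi_totals = defaultdict(float)
    for pi, dept, income in grants:
        dept_totals[dept] += income
        pi_totals[(pi, dept)] += income
    return {
        (pi, dept): share
        for (pi, dept), total in pi_totals.items()
        if (share := total / dept_totals[dept]) > threshold
    }

for (pi, dept), share in income_concentration(grants).items():
    print(f"{dept}: {pi} accounts for {share:.0%} of projected income")
```

Run over a real awards dataset, the same roll-up would show where income is most sensitive to the movement of a single member of staff.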
4.2 How suppliers and institutions might meet these needs

Institutions articulated consistent responses when asked how they would like information to be delivered by their management systems and analytical tools. They frequently spoke of the need for:
• Common, consistent, seamlessly integrated, underlying datasets
• User-friendly, intuitive interfaces
• Dashboard-driven, customisable, drill-down reporting capabilities
• Regular and automatic updates, including harvesting of grants and outputs data
• Flexible systems (often requiring integration with specialist finance/human resources/intellectual property/publications packages)
• Common key performance indicators and benchmarks
• Compliance with full economic costing guidelines and the requirements of both the REF and research councils' joint electronic submission system (JeS)

"We want a system that is user friendly and captures all the data we need to inform the senior management, but which also enables individual academics to present their profile."

"We want a cradle to grave process as seamless as possible, and only have interventions when there is an exception or to check."

"We want everything at the push of a button."

4.3 Who are the suppliers, and what do they supply?

Many institutions and suppliers split their organisations or products into pre- and post-award categories (i.e. business process support prior to and after the decision to award a grant). However, there are more periodic and strategic needs. In the table below suppliers are mapped to the eleven identified information needs.

Figure 2: Suppliers mapped to information needs. The matrix plots the eleven needs of figure 1 (1 Identify funding; 2 Calculate costs; 3 Monitor grant success; 4 Manage funds; 5 Monitor research outputs: publication, intellectual property, esteem; 6 Showcase strengths: individual, institutional; 7 Researcher collaboration; 8 Institutional collaboration; 9 Business development; 10 Identify talent; 11 Scenario plan) against the suppliers Agresso, Alta, Atira (Pure), BluQube, DSpace, Elsevier (Scopus, SciVal), ePrints, Imprints, InfoEd, Inteum MyIP, Oracle Grants, pFACT, ResearchResearch, SAP Grants Mgmt, Symplectic, TechnologyOne, Thomson (InCites, WoS), and Wellspring. The individual supplier-to-need markings are not recoverable in this copy.
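Because the individual markings in figure 2 cannot be reproduced here, the sketch below shows only the shape of the analysis, with wholly hypothetical suppliers and coverage: each product is mapped to the subset of the eleven needs it addresses, and the gaps fall out immediately.

```python
# Hypothetical supplier-to-needs mapping in the shape of figure 2.
# Supplier names and coverage are invented, not the study's findings.
ALL_NEEDS = set(range(1, 12))  # the eleven needs of figure 1

coverage = {
    "FinanceSuiteX": {2, 3, 4},     # pre- and post-award financials only
    "PublicationsY": {5, 6},        # outputs and showcasing only
    "ResearchCRM-Z": {1, 7, 8, 9},  # opportunities and collaboration only
}

for supplier, needs in coverage.items():
    missing = sorted(ALL_NEEDS - needs)
    print(f"{supplier}: meets {len(needs)}/11 needs; missing {missing}")
```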
Figure 2 illustrates that few suppliers attempt to meet all eleven needs, and that most engage with only a limited section of the holistic set of requirements. This raises the question of why suppliers are fragmented in their provision to serve such a disparate set of requirements.

The leading suppliers in the research management space have been identified. Those referred to most frequently are listed in appendix D.

Interviews showed that institutions were all engaged in looking for new developments or replacements to current systems. As "nobody in the country is happy with systems that they have", many interviewees identified new suppliers aiming to provide systems that address multiple needs in the information landscape.

4.4 Challenges articulated by research offices

Suppliers face four key challenges in designing systems to meet market needs:
• Diversity of stakeholders within institutions, and particularly those who have the power, influence or ability to make decisions in this area
• Perceived diversity of institution types
• Variety of activities and processes by which research is managed within institutions
• Lack of shared standards in data and data definitions across the sector that make it difficult to define systems requirements consistently across institutions

These challenges make it difficult for suppliers and institutions to understand each other's constraints and needs.

In terms of stakeholder diversity, research management draws on multiple functions within an institution (finance, human resources) and multiple stakeholders (academics, management, finance, human resources, IT, and the research office). As research cuts across most, if not all, of these stakeholder groups the requirements become increasingly complex and confused. This means a diverse customer base influences the requirements for any system: academics, managers, operational staff, and strategists.

"Academics are repelled by the bureaucratisation and centralisation of systems."

Institutions themselves are not homogeneous. The number of researchers ranges from tens to thousands, and research funding income from thousands to millions. Inevitably, therefore, there is a diversity in the extent and complexity of information and systems required, which varies across the sector. For example, institutions with large medical faculties or which focus on arts and humanities have significantly different information needs.

Research offices are variable in structure, role and activities, and research management is relatively young as a profession. Though research information managers are becoming more significant and are expected to develop systems and information, in the majority of institutions their roles and remits are not clearly defined. More importantly, these people are expected to translate business processes that are highly variable within and between institutions into systems. The difficulty for the suppliers is how to develop cost effective systems with core functionalities that can meet the expectations of such a diverse customer base.

"There is little thought leadership and knowledge development around best practice."

The lack of standards both in data and data definitions compounds the challenge for suppliers. The difficulty facing suppliers is how to develop information systems that meet the vocabulary of not only the diverse range of institutions but also of their stakeholders, such as funders, government agencies, and industrial partners. Institutions and other stakeholders within the sector are insufficiently joined-up to establish common research data terminologies and structures; a sketch of what such a shared definition might look like follows.
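One way to picture what shared definitions would buy the sector is a single record type that every system could import and export. The sketch below is hypothetical; the field names and types are assumptions for illustration, not an existing sector standard.

```python
# A hypothetical shared grant-record definition; the fields are
# illustrative assumptions, not a standard that existed in the sector.
from dataclasses import dataclass
from datetime import date

@dataclass
class GrantRecord:
    institution_id: str  # e.g. an agreed national institution identifier
    funder: str          # research council, charity, industry, ...
    pi_id: str           # institutional staff identifier
    title: str
    awarded: date
    value_gbp: float     # full economic cost (FEC) value
    status: str          # "application", "awarded", or "closed"
```

With an agreed definition like this, the vocabulary problem described above disappears at the system boundary, whatever each institution does internally.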
As such, institutions are loath to commit to investment in research systems. Instead, short-term shifts in research policy drive short-term institutional needs, which they fulfil with ad hoc and piecemeal systems implementation.

This perceived diversity also spurs a lack of collaboration or collective purchasing among institutions, although there are occasional institutional clusters that collaborate to move their information management agendas forward. The supplier landscape reflects the situation, in that it is diverse, with most suppliers having entered the industry through an ability to deliver one aspect of institutional needs. While suppliers understand key areas like human resources and finance and have robust and well-established relationships with institutions in these areas, the relationships in research management are less established, resulting in a lack of shared knowledge and understanding between supplier and institution.

"It would be great if the top five could collaborate – especially as they all have the same finance system."

Suppliers are not felt to be delivering against institutional needs or to understand fully research management. As one interviewee commented, "Suppliers do not know what research offices do on a daily basis." However, there was also sympathy for suppliers, as institutions often have difficulty articulating their needs, partly because research is seen as a moving target and partly because of the technical and linguistic challenges of communicating complex technical requirements.

"How educated are we at asking suppliers the right questions?"

A common relationship-failing stems from suppliers' lack of knowledge about other systems and related business processes. This is often compounded when information is shared across systems and processes. This can result in significant scope creep as system requirements begin to overlap. Again, some sympathy was extended to suppliers, as many of the interviewees acknowledged difficulties in briefing suppliers on variable business processes and the lack of standard requirements. Research offices are, with occasional exceptions, not prepared to accept out-of-the-box functionality, unlike more uniform functions such as human resources or finance.

"I want everything I need and nothing else. I must be the customer from hell."

Some interviewees were frustrated by suppliers taking a resolutely single-product or one-size-fits-all approach. Research offices were also frustrated by some of the larger suppliers who seemed to view research management as an entry point into other major system areas, usually finance, student management or human resources.

5. Performance management

5.1 Data collection

The opportunity to report information on performance within an institution has increased as a response to external drivers such as the RAE. As a consequence some institutions have appreciated the usefulness of research data and have begun to develop demand from various elements within the institution (vice chancellors, PVCRs, heads of departments). While this acknowledgement of the need for data and information has grown, many institutions have failed to update tools that were implemented in response to external drivers. So, for example, ad hoc systems implemented in response to the RAE 2008 have already fallen into disuse.

While there was an acknowledgement among the institutions of the need for data to manage performance, there was confusion and contention about the implications of collecting and disseminating such data. Most of the sensitivity surrounded the belief that academic culture negates the need for accountability for performance, and many were concerned that collection of data, even at an aggregated level, was inevitably built up from an individual level. This was associated with a fear that individual data would be used to judge performance of academics.

"Academia is based on stochastic processes and a dashboard at an individual level would be a disaster."
While the majority of institutions accepted the need for data to performance manage their institution at all levels, there were a few who were strongly opposed. They argued that it was inappropriate to allow the centre of the institution to micro-manage and that it should be left to departments and individuals to manage themselves. Justifications for this viewpoint ranged from the need for academic freedom to one example where unions had not allowed data to be used to judge performance at an individual level (although the use of such data had been accepted by unions at several other institutions). One PVCR expressed a moral dilemma about using performance management data, arguing that it would be unfair for staff in one area to face cutbacks if it is apparent that they are performing better individually than those in areas that are maintained.

Though some strong views were held, all institutions recognised data as an essential building block to inform and to measure at all levels, and to support decision making.

"Unless you have it you cannot make informed decisions; you would be acting based on opinions and hearsay."

Despite the concerns of some institutions opposed to management of academic performance, the importance of data at an individual academic level was generally recognised. The reasons for this were:
• Increased competition and research complexity (e.g. interdisciplinarity)
• Increased importance of research strategy and the need for data to inform and evaluate it
• Statutory reporting and submission of data to the REF, HESA and funding organisations

At one successful institution, the PVCR and the director of the research office regularly discussed success rates and volumes of applications and awards. The information was provided at institution, faculty, and departmental levels, with the ability to drill down to individual level when needed. A framework existed for the PVCR to discuss information with deans against targets, and to use information with research facilitators who worked with academics at local level, matching strengths to funding opportunities. At this institution there were clear institutional and discipline-specific benchmarks: for example, that grant income be in the top ten of the appropriate RAE unit of assessment.
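Mechanically, the institution/faculty/department/individual roll-up described above is one aggregation run at different levels of granularity. A minimal sketch, using invented application records:

```python
# Sketch of success-rate reporting with drill-down; records are invented.
from collections import defaultdict

# (faculty, department, academic, outcome) for one review period
applications = [
    ("Engineering", "Civil", "Dr A", "awarded"),
    ("Engineering", "Civil", "Dr A", "rejected"),
    ("Engineering", "Mech", "Dr B", "awarded"),
    ("Medicine", "Oncology", "Dr C", "rejected"),
]

def success_rates(apps, level):
    """level = number of leading fields in the grouping key:
    0 = institution, 1 = faculty, 2 = department, 3 = individual."""
    won, total = defaultdict(int), defaultdict(int)
    for *keys, outcome in apps:
        group = tuple(keys[:level]) or ("institution",)
        total[group] += 1
        won[group] += outcome == "awarded"
    return {group: won[group] / total[group] for group in total}

print(success_rates(applications, level=0))  # whole institution
print(success_rates(applications, level=2))  # drill down to department
```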
In some instances there were formal performance management frameworks that were clearly designed to incentivise academic performance. For example, at one institution there were mechanisms by which junior academic staff could progress to senior lecturer level if they achieved specific targets, such as securing a major grant within a three-year timeframe, two major grants within a five-year timeframe, and four quality publications that could be submitted to the REF. These targets were set in consultation with the academics and unions were involved in the process. They were generic, explicit, and transparent, and readily accepted by the academic community. This ensured that young academics understood how to gauge their performance and had clear goals and ownership of their career progression. It also provided a strong incentive for them as system users to ensure their data were accurate and current, which in turn provided better data for the institutional picture overall.

5.2 Key performance indicators for research

Many senior staff, both academic and administrative, recognised the need for management information, but found it hard to decide what information was most important. The result was often that an institution had a plethora of data but no rigorous way in which to use it informatively. Others found it difficult to identify performance measures for research at all.

"How do you measure research when it is about people and ideas?"

The measures in figure 3 were seen as relevant, some of which are being used in some institutions, some of which are an aspiration for others. Most institutions identified with the need for performance measures, provided the data were available, timely, and reliable. Currently, the majority of data is retrospective, resulting in a challenge for institutions in their ability to predict their future funding flows and output measures.

Institutions consistently mentioned the difficulty in securing meaningful, up-to-date information related to comparator institutions, including that from funding bodies. Even where benchmarking data was available it was not structured consistently, published regularly, or available in a suitable format for institutions to use it meaningfully. As a result of this, some institutions had resorted to sharing their own data with one another in an effort to generate meaningful benchmarks. However, such moves were relatively informal or ad hoc and often the shared data was too old to be useful for setting a strategy.

It was recognised that institutions needed to act on the evidence provided by key performance indicators. This required a framework to enable senior staff to make independent assessments and take decisions. In institutions where performance measures existed without a clear decision making framework and information flow, little change actually appeared to happen.

Most institutions found it difficult to get academics to update data within systems, with resulting problems for data quality. Many were unsure how to motivate academics to validate or input data. Some had recognised that performance management at an individual level could be used as a potential incentive, whilst others had exploited systems which eased the burden for academics and encouraged engagement.

"We set income and publication targets for our academics. They rarely need the stick as the carrot works well."

Several institutions which used clear frameworks for performance management had markedly improved their research income and their reputation, as measured by performance and league tables.
The consistent factors that have contributed to this success are outlined in figure 4.

Identified benchmarking measures (measure: granularity)

Inputs
• Research income: growth; per year; by academic, FTE, department, funder type
• Research application success rates: by funder; per year
• Volumes of research applications: by funder; quarterly, monthly
• Volumes of research awards: by funder; size banding; quarterly, monthly
• Research overhead, FEC income: by funder; quarterly, monthly
• Post-graduate student numbers: against targets; per year

Outputs
• Publications in quality journals: numbers, citations, by academic
• Esteem measures: impact measures; innovation activities; patents; licenses
• Numbers of external collaborations: national, international

Peer benchmarks
• Institutional peer group: national, international
• Departments or academic discipline: national, international
• Staff numbers: departmental, national
• Research office resourcing: FTEs in UK institutions

Figure 3: Identified benchmarking measures
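Most of the input measures in figure 3 are simple to compute once the underlying data is reliable; as institutions reported, the difficulty is the data rather than the arithmetic. As an illustration, year-on-year research income growth by funder type, with invented figures:

```python
# Sketch of one figure 3 input measure: year-on-year research income
# growth by funder type. The income figures are invented.
income = {  # funder type -> {year: income in £m}
    "research councils": {2007: 40.0, 2008: 46.0},
    "charities": {2007: 15.0, 2008: 13.5},
}

for funder, by_year in income.items():
    growth = by_year[2008] / by_year[2007] - 1
    print(f"{funder}: {growth:+.1%} growth, 2007 to 2008")
# research councils: +15.0% growth, 2007 to 2008
# charities: -10.0% growth, 2007 to 2008
```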
RAE or REF; but they shouldn’t and indicating how research systems are often be – just as a research strategy should not judged as a cost rather than an investment. be developed to respond to the RAE but to Many commented that the research office had 15 respond to our strengths and the external only limited formal input into the definition of IT environment, our systems should be defined strategies or priorities. Possibly as a result of to run our business with the inevitable conse- this, most commented that research was given quence that they will deliver what is needed for a low priority for investment; in every institu- the RAE/REF.” tion interviewed, systems to support finance, human resources and students had, almost The failure of institutions to view information sys- without examination, question or discussion, tems as tools to manage their research business been prioritised for capital investment over re- and instead the propensity to create them in search. Many suggested that this was because reaction to other pressures has lead to wasteful research is complex and that research system ad hoc implementations. In the project work- requirements are therefore harder to articulate shops attendees began to discuss the need for or agree upon; that external reporting is work at a national and strategic level to improve constantly changing; and, significantly, that the the situation. Many expressed the need for better champions for research management systems future planning; the information that government are more diffuse (resting with PVCRs, direc- and its bodies will require in the future needs tors of research offices, deans, the academic to be better defined so that it can feed into the community in general, or a combination of specification for a university information and them all) than in other areas of the administra- research management system now. It was felt tion, where there are always clear champions that this would be an improvement on the current for new systems to support administration. situation in which each institution responds to externally-prescribed metrics and implements “The director of research is well positioned systems in specific response, an approach that to have a strong influence on IT strategy via must be hugely inefficient to the sector overall. structural routes within the university … though whether he can really influence the of research systems was not aligned with desired result is another matter.” organisational priorities (and could be skewed by externally imposed requirements). Many “There is a misalignment between institutional recognised the need for the business to drive imperatives and the direction of IT investment.” and be involved in the development of the IT strategy but found it hard to achieve this; one “Research needs get lost in the noise of commented that the IT department preferred to demands for other, more easily understood, implement systems that they knew were achiev- systems.” able rather than those which added value to the business. There was clearly a shared difficulty One institution developed its IT strategy in establishing exactly what a research system through its corporate information services or research systems strategy should look like. division but the PVCR said that “having grand visions for systems is not a good idea …. 
I “Research systems are hard to deliver.” have no vision for information systems.” “Research systems are a nebulous totality Possibly as a result, most institutions reported which mean different things to different people historically low levels of investment in research and which therefore makes it difficult to gain management systems even though there is an engagement when talking about the amor- increasing awareness of the need to invest as phous totality.” a result of RAE 2008. In most cases, initiating 16 new research system projects was depend- Despite the lack of formal involvement of those ent on individuals or research offices bidding managing research in the development of IT to either the IT department or to a variety and investment strategies, in almost all of the of committees. As it was not always clear institutions participating in this study relations where the decision would be made it was felt between the IT department and the research that any proposal should be discussed in as office were said to be good at an operational many forums as possible to generate general level, often “as a result of goodwill”. Yet these buy-in and momentum. In several institutions relationships existed almost always at a per- research systems were perceived to be a cost sonal level and not through formal structures. and not a benefit, and few conducted cost- They had developed through work on projects benefit analysis, let alone considered return on in which there had been common goals. investment, when defining priorities. One interviewee commented that since no structures existed it was a question of “having “There is a lack of recognition of the invest- to get on”; another suggested that relation- ment required for an aspirational research ships with other silos within the organisation management system compared with that were similarly weak (finance and library were required for a finance system, for example.” mentioned). Whilst research office staff were felt to work collaboratively with IT staff, there “I do not know where decisions are finally made was a feeling that academics and departmen- to prioritise capital spend on IT systems.” tal administration were even more removed. It is perhaps not surprising that most interview- “Why aren’t we all working together?” ees said that research strategy and IT strategy (where and if it existed) were developed in isolation and that the planning and resourcing 7. Research system satisfaction It was universally agreed that the current activities to fill gaps in the business process systems offering was unacceptable and that and to supplement ineffective systems. The academics and administrative staff had low consequent duplication of data entry through satisfaction levels with the systems they used. lack of integrated systems is a major source Indeed, none indicated that they were satisfied of frustration, particularly for academics, and a with their research system provision; academ- key reason for the more general dissatisfaction ics in particular were unhappy. Most institu- with existing systems. tions interviewed were reviewing the systems used to manage one or more elements of the “Generating meaningful data is labour inten- research cycle (pre-award including costing, sive and sometimes you have to fight to get pricing and negotiation; post award including access to it.” invoicing, reporting, intellectual property and publications management). A major challenge As a result, data quality was a major issue identified was usability. 
A major challenge identified was usability. Many systems had poor user interfaces that were reluctantly accepted by administrative staff, but those who used them less frequently, such as academics, found them harder to use. This had consequences for the extent to which academics would champion the development of new systems or engage with existing systems. There was a feeling that usability was key to getting academics involved in systems, and that it was important that academics be consulted early in the process when specifying and designing new systems.

"There is a culture of not involving academics in the specification of systems which serve both academic and administrative needs; the compromise between the two often suits neither well."

"Too much emphasis is placed on the system and on functionality rather than user experience."

Many institutions reported that they used composite systems in which information was transferred or re-keyed into a combination of systems. Locally-held spreadsheets and basic databases were often used despite it being recognised that such a variety of local systems could lead to huge difficulties in data cleanliness, management, and reporting. Several institutions admitted that much of their research management process relied on paper-based activities to fill gaps in the business process and to supplement ineffective systems. The consequent duplication of data entry through lack of integrated systems is a major source of frustration, particularly for academics, and a key reason for the more general dissatisfaction with existing systems.

"Generating meaningful data is labour intensive and sometimes you have to fight to get access to it."

As a result, data quality was a major issue that led, in turn, to entire systems being viewed as untrustworthy; a situation in which, as one person suggested, "data is used when it supports an argument but dismissed when it doesn't." All of the institutions interviewed reported that a considerable amount of staff time was employed in data cleansing activities. There was a general recognition that, in the context of the current provision of systems, the ability for an institution to generate meaningful data is time-consuming and labour intensive. External agencies require reports to be in their own specific formats, meaning institutions have to report the same data in multiple ways, at significant cost.

"Universities should work together more to make their collective voice heard by external agencies."
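The burden described above is essentially the same record serialised again and again. A sketch with one internal award record and two invented agency formats:

```python
# Sketch of the multi-format reporting burden: one internal award
# record rendered for two hypothetical external agencies.
award = {"pi": "Dr A", "funder": "MRC", "value_gbp": 250_000, "year": 2008}

def to_agency_csv(rec):
    # Hypothetical agency 1 wants a flat CSV row, value in pounds.
    return f'{rec["year"]},{rec["funder"]},{rec["pi"]},{rec["value_gbp"]}'

def to_agency_xml(rec):
    # Hypothetical agency 2 wants XML, value in thousands of pounds.
    return (f'<award year="{rec["year"]}" funder="{rec["funder"]}">'
            f'<pi>{rec["pi"]}</pi><value-k>{rec["value_gbp"] // 1000}</value-k>'
            f'</award>')

print(to_agency_csv(award))
print(to_agency_xml(award))
```

Each additional agency format is another function to write, test, and keep aligned with the source systems, which is where the cost accumulates.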
The result is an environment in which it is hard to persuade staff of the advantages of developing holistic, integrated systems; many commented that painful implementations in the past dampened the appetite for new projects. The sector is well aware of infamous implementation failures (even at prestigious research-intensive institutions). Few were able to articulate examples of successful implementations of research management systems, despite good project management structures, relationships and, in some cases, academic support. The memory of the failure of the MAC initiative in the 1980s/90s (which attempted to get the sector to develop systems collaboratively) was often mentioned and several commented on the sector's inability to work together to address the underlying issues.

"Future efforts are tainted by past failures."

"If we got 10 universities in the room to define a specification for an underlying system then we would be there for years, by which time demands would have changed – but how can that be when we are all undertaking the same core business?"

Two institutions are currently developing their own institutional in-house project "to create a full research management system"; another two are working collaboratively to develop a costing tool;[6] and two had built in-house systems to manage RAE 2008, both of which were completely dependent on an individual: "if he left then we would be in a complete mess."

[6] See Janette Hillicks, 'Development Partnerships between HE and Vendors: Marriage made in Heaven or Recipe for Disaster?', JISC InfoNet (May 2002), 4-6. Available at http://www.jiscinfonet.ac.uk/Resources/external-resources/development-partnerships

Figures 5, 6 and 7 illustrate research system development and usage in the institutions that participated in the study.

Figure 5: Off-the-shelf vs. in-house/bespoke systems by institution (chart not reproduced in this copy)

Figure 6: Off-the-shelf vs. in-house/bespoke systems by system type (chart not reproduced in this copy)

Figure 7: Most used off-the-shelf systems (chart not reproduced in this copy)