
About 

Did our impact assessment process work, and how can we improve it? What tools are available for heritage organisations to structure how they reflect and learn from their impact assessment reports and processes? How often do we learn from our successes and mistakes? Could we be more actively learning from what we do and increasing our impact? What are we learning about the value of (digital) cultural heritage? This webinar explored these questions. It took place on 24 February 2021 from 15.00 - 16.30 CET.

Speakers

  • Pirjo Hamari, Deputy Director of Department at Museovirasto (Finnish Heritage Agency) and Project Lead of the Museums of Impact Project. Pirjo introduces the Finnish evaluation framework for museums that is now being explored at a transnational level in the Museums of Impact project.
  • Aleksandra Janus (Director, Fundacja Centrum Cyfrowe) introduces principles for understanding the value of digital cultural heritage and evaluating your work - choosing the right angle, verifying and making improvements - which emerge from a recent report by the InDICEs project.
  • Harry Verwayen, Executive Director of Europeana Foundation, shares his thoughts on how far we’ve come and what we’ve learned since the publication of Phase one of the Playbook back in 2017.

Resources

Phase 4 of the Impact Playbook is still in development. To kick off the webinar, we asked participants what we need to think about when we evaluate our impact. We clustered their responses as follows - and many of these themes can be found in both the presentations and the discussion afterwards:

  1. The organisation or project’s mission and vision, goals and objectives, and the desired impact or changes they want to have

  2. The audiences, users and stakeholders of an activity or an organisation - the target beneficiaries and how they’re involved

  3. The process of evaluating objectives, the methods used, the metrics that guide data collection

  4. The capacity to evaluate

  5. The benefits for participants and what has been achieved in terms of change, thinking at different levels (e.g. global) and in different contexts (e.g. social inequalities)

  6. Looking forward to future goals and new activity

(15:24) Harry Verwayen, Executive Director of Europeana Foundation, introduced the case study of Workers Underground, one of the most in-depth impact assessments Europeana has undertaken to date. As well as considering the outcomes, this was a moment for Harry to reflect on - to evaluate - what was learned from the impact assessment and the areas that could have been improved for participants in the project. Harry closed by setting out how far Europeana has come and what we’ve learned since the publication of Phase one of the Playbook back in 2017. (Did you know that the first conversations about impact started in 2012?) He framed this within some high-level perspectives that emerged from evaluating our processes and the findings of a number of recently published impact assessment reports.

(33:10) Pirjo Hamari (Deputy Director of Department at Museovirasto / Finnish Heritage Agency) introduced the Finnish evaluation framework for museums that is now being explored at a transnational level with 11 different partners in the Museums of Impact project. While many evaluation approaches are available, Pirjo explained the rationale behind the Finnish Heritage Agency’s adoption of developmental evaluation as a way to support the development of the heritage sector. She set out the distinction between the evaluation of impact (an approach like that of the Impact Playbook) and evaluation for impact. The latter relates to the idea of evaluating in order to improve processes and set the conditions for generating more impact. A strong theme of her presentation was that even talking about impact in a heritage organisation is valuable in itself, and the modular evaluation system they have created helps heritage institutions to do this.

(48:15) Aleksandra Janus (Director, Fundacja Centrum Cyfrowe) then took to the stage to introduce some principles for understanding the value of digital cultural heritage and evaluating your work - choosing the right angle, verifying and making improvements - which emerge from a recent report by the InDICEs project. The project aims to understand the social and economic impact of the digitisation of cultural heritage and to use this to innovate the reuse of digital heritage assets. She summarised the approach and findings of the part of the project’s work devoted to better understanding the value chains between heritage organisations and creative industries. All of this work feeds into another core part of the project, a forthcoming digital transformation readiness self-assessment tool, which builds on the many years of experience of the Enumerate project and is designed to help organisations make the most of their digital objects. To help set the groundwork for this, Aleksandra has shaped a digital cultural heritage value creation cycle that can help organisations think about, evaluate and plan the work they are doing with digital cultural heritage and their digital projects.

We had a lively conversation after the presentations, which, in order to encourage open sharing of people’s experiences, was not recorded. Some of the points are summarised below: 

  • The role of the digital cultural heritage organisation is changing, and institutions need to adapt to this: heritage organisations need to develop new skills, such as storytelling, to stay relevant for their audiences. Evaluation can help institutions understand whether they need to develop new skills (e.g. the competencies set out in the Finnish evaluation framework).

  • How do impact assessment and evaluation processes become embedded? How can they be embraced as part of the workload, particularly in the context of the past year, when there has been a lot of reacting and not much time for reflecting or evaluating? Evaluation should be framed in terms of why it’s important and how it helps the institution plan for the future. A modular approach (like the Finnish structure) is an example of how implementation can be made easier. Incentives are also important, so institutions can see an immediate advantage. Pirjo has received direct positive feedback from those who have used the tool, and this is important to share. Evaluation models can be formalised, but they shouldn’t be bureaucratic. Evaluation can also be part of all stages of the Europeana Impact Playbook - not just kept for the end - and this is where the connection with developmental evaluation becomes clear.

  • What are the conditions that allow organisations to change how they think about change? How can we become more agile and iterative? This has implications for both organisations and funders. 

  • Evaluation at a European scale is difficult. How would such a model take all contexts into account? If the goal is the same - to become more relevant - then the rest can become easier.

Some institutions do not feel confident in creating change, or in acknowledging that they are agents of change, while others are more confident in this. Some institutions think about activities but not about the impact they want to create for their stakeholders.
