Evaluation skills for robust policy design and program development
2 Day Workshop
Our program combines a comprehensive research foundation with practical exercises to embed evaluation practice in the work of policy practitioners and program developers. Evaluation is essential to evidence-informed policy, critical to program planning and development, and important to anyone who wants to design good policy.
Delivered by experienced policy professional Nonie Malone and highly experienced evaluation expert Marion Norton, this workshop builds your competence to plan, commission and execute robust evaluations – essential for robust policy. It shows how to develop meaningful performance measures to check that policy goals are being realised in implementation – often the missing ingredient of success.
- Understand the role of evaluation in delivering efficient & effective outcomes for government
- Know how to plan for and deliver quality evaluation that is fit for purpose and audience
- Know how to influence the authorising environment to ensure quality evaluations can occur
- Understand and know how to select and use evaluation methodologies
- Develop capability in commissioning and managing evaluations
- Gain and refine program logic skills
- Know how to develop the evidence base from policy idea to post-program implementation
- Become familiar with the best information available to support your evaluation responsibilities and practice.
Participants from a previous workshop said:
“Program logic will be very useful. Framework for reporting will be a great guide and checklist for strategy reevaluation activities.”
“Thank you very much for delivering a great workshop on evaluation.”
“I feel I am much more capable of influencing evaluation to occur in my organisation as well as shaping it to be ‘fit for purpose’ :)”
Free copy of The Australian Policy Handbook
An invaluable guide for practitioners, academics and students to the craft of policy analysis, development and evaluation. It is an important resource for those with a commitment to sound evidence-based public policy.
Professor Ken Smith, ANZSOG Dean and CEO
An enduring and important contribution to the field. Althaus, Bridgman and Davis’ pioneering policy cycle approach continues to offer vital insights into the policy-making process in Australia and internationally.
Lisa Paul AO PSM, Former Secretary of the Department of Education
Download the PDF Brochure for full course details.
Day One: Fundamental Evaluation Skills
1. Role of evaluation in policy decision-making
- What is policy – where does it come from?
- From policy to program – governments and non-government organisations
- What determines failure vs success in policy and programs?
- What is evaluation?
- What do we evaluate and why?
- Role of the policy officer in evaluation
2. Evaluation skills for policy and programs
- Applying program theory to uncover what works
- Building outcome chains
- Using program logic as a formative policy analytical tool
3. Basic evaluation tools
- Creating an evaluation framework to specify what, how and when to evaluate
- Identifying objectives, high-level evaluation questions, what to measure and sources of evidence
4. Scoping evaluation questions and data instruments
- Understanding the purpose of the evaluation
- Identifying perspectives of internal and external stakeholders to inform design and scope
- Involving internal and external stakeholders in evaluation processes
- Identifying and designing data instruments (incl. surveys, interviews, audits)
- Revealing unintended consequences
Day Two: Applied Evaluation Skills for Policy & Programs
5. Evaluation design – choosing methodologies
- Designing at the beginning of the policy cycle
- Choosing time and place for different methods
- Applying techniques for formative and summative evaluation
- Demystifying evaluation terms
- Designing at the end of the policy cycle
6. Evaluation planning, commissioning and governance
- Integrating phases of evaluation with policy development and implementation
- Managing meaningful evaluation with resource constraints
- Incorporating ethical and culturally appropriate practices
- Securing authority and commitment for the conduct and use of evaluation that fulfils its purpose
7. Building credibility through robust analysis and reporting
- Making sense of evaluation data
- Using sound research practices
- Determining cause and effect, and challenging assumptions
- Answering evaluation questions
- Reporting findings to various audiences
8. Open session
- Setting and meeting evaluation requirements for service providers
- Developing competencies and skill building
- Discussing the Queensland Program Evaluation Guidelines and other practical guidance