May 12, 2010

Can we transfer BC methods between disciplines?

Show me the Change
Complexity and the Art of Evaluation – Reporting Sheet

Topic: Can we transfer BC methods between disciplines?

Leader: Millicent Burke (millicentburke@gmail.com)

Participants: Helga Svendsen, Kirsty Fenton, Michael Baranovic, Cheryl Samarasuphe, Danielle Kennedy, Marcia Hewitt, Ian Grant, Bruce Paton, Jonathan Russell, Rebecca Petit

Key Points:

•    There are many different BC models used today.
•    Models are inherently limited (they are linear, they cannot capture the complexity of people – their motives, their behaviour).
•    A common language could both assist and limit us (eg, it can aid discussion but also encourage assumptions – we should define “evaluation”).
•    Bearing in mind the above limitations of models and language, it could be useful to have a compilation of models and of situations where they have and have not worked.
•    From this a matrix of factors to consider may assist us in finding the ‘best’ model currently available.
•    This matrix would need to include – timescales, resources, connectedness of community, values focussed on, etc.
•    BC Matrix Model factors:
–    Self interest eg, AIDS
–    Timescale (affect now, yes)
–    Geographic
–    Complexity of message (eg, climate change is highly complex)
–    Completeness of science (eg, climate change – scientists don’t agree)
–    Prior Learning
–    Limitations of models – constraints
–    *Complexity of people difficult to capture in a model*
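The session did not specify how the proposed matrix would be used, but one plausible reading is a simple weighted scoring table: rate each candidate BC model against the factors above, weight the factors for the situation at hand, and compare totals. The sketch below is purely illustrative – the model names, scores and weights are invented, not from the session:

```python
# Hypothetical sketch of the proposed "matrix of factors" for choosing
# a behaviour change (BC) model. All model names, scores and weights
# here are invented for illustration.

FACTORS = ["self_interest", "timescale", "geographic",
           "message_complexity", "science_completeness", "prior_learning"]

# How well each candidate model handles each factor, 0 (poorly) to 5 (well).
models = {
    "Model A": {"self_interest": 4, "timescale": 2, "geographic": 3,
                "message_complexity": 1, "science_completeness": 2,
                "prior_learning": 4},
    "Model B": {"self_interest": 2, "timescale": 4, "geographic": 2,
                "message_complexity": 4, "science_completeness": 3,
                "prior_learning": 3},
}

# Weights reflect how much each factor matters in a given situation
# (eg, a highly complex message like climate change weights that factor up).
weights = {"self_interest": 2, "timescale": 1, "geographic": 1,
           "message_complexity": 3, "science_completeness": 2,
           "prior_learning": 1}

def score(model):
    """Weighted sum of factor scores for one candidate model."""
    return sum(model[f] * weights[f] for f in FACTORS)

best = max(models, key=lambda name: score(models[name]))
for name in models:
    print(f"{name}: {score(models[name])}")
print("Best fit for this situation:", best)
```

As the session itself cautioned, such a matrix can only surface the ‘best’ model currently available – it cannot capture the complexity of people.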

May 12, 2010

Pretending


Topic: Pretending

Leader: Johnnie Moore (johnnie@johnniemoore.com)

Participants: Andrew Rixon, Keren Winterford, David Williamson, Julie Harris, Candyce Presland, Natalie Moxham, Felicity Thomas, Jonathan Russell, Adrienne Fanning, Vera Lubczenico, Murray Irwin

Key Points:

•    We agreed that a lot of pretending goes on around evaluations. It ranges from quietly allowing people to misinterpret evidence when it suits, through being economical with the truth, to pretending project criteria are useful when they aren’t.
•    Evaluation can create pressure to put on a performance, and concealing unintended consequences or failures sometimes feels necessary – but this results in considerable waste and lost opportunities to learn.
•    Creating more informed management relationships supports greater ??, as does negotiating longer-term funding. We found no easy solutions – but it may be good to start by at least acknowledging that a serious problem exists.

May 12, 2010

Our connection with nature – nature as teacher in behaviour change and evaluation


Topic: Our connection with nature – nature as teacher in behaviour change and evaluation

Leader: 2 person conversation under the oak tree

Participants: Tanya Loos – City of Ballarat (nature@wideband.net.au)
Ellen Regos – Royal Botanic Gardens, Melbourne

Key Points:

•    Connection with nature – isn’t that our motivating factor for so much of our work in environmental and social sustainability?
•    Do we value our experiences in nature? Are nature walks, meditations and sensory activities valid tools for sustainability/BC conferences?
•    This conference has been largely about the head – sensory experiences, heart-based activities, our whole selves – these have been missing.
•    However, deep ecology and other “fringe” education and BC methodologies have shown that creating beautiful, fun, sensory experiences in nature is a highly successful way of generating behaviour change.
•    Our conference question – just go outside, sit in the garden and see what answers come.
•    The mainstream does not value nature as a tool or teacher; it is seen as not valid or rational, not part of the built environment.
•    Certainly, we need to frame nature connection experiences appropriately so as to not alienate people. Big fear: “we are not going to hug trees are we?”
•    Can we bring nature connection methodologies into the more conservative/traditional spaces of workshops, eg, “field visits”?

May 12, 2010

Who actually cares about evaluation? Who is it for? Why do they care?


Topic: Who actually cares about evaluation? Who is it for? Why do they care?

Leader: Damien Sweeney

Participants: Jonathon Russel, Greg Bruce, Mark Butz, Julian Denler, Sarah Bartlett, Candyce Presland, Mike Rowell, Rob Catchlove, and more…

Key Points:

•    Different people care about different aspects of evaluation. This can be about accountability and/or learnings. It is important for funders to justify expenditure (in case someone asks about the project).
•    It is important to include mistakes/lessons/failures, to have a balanced report. Lessons need to be carefully phrased so as not to “surprise” or “shock” or “bring fear”.
•    An independent evaluation can bring objectivity, and help to find unexpected consequences.
•    Evaluation reports (especially executive summaries) are useful when reviewing the literature to design projects, but often they lack a summary that clearly outlines what worked and what did not.
•    Safe-matching communication (collective social learning) offers a methodology for evaluation.
•    Remember:
1.    Evaluation is audience specific. Eg, you may only need a 5 minute video/snapshot for a councillor
2.    Do not separate evaluation from project management! Use it as a tool to keep learning along the journey
•    What is needed:
•    A forum/website solely for evaluation reports that showcase failure
•    Easily accessible evaluation reports for collective learning (not hidden on funder’s shelf)
•    Risk – the media can latch on to bad news and stifle creativity (and lead to unbalanced reporting)
•    Remaining Question – “has accountability gone too far?”

May 12, 2010

Probe – sense – respond in practice AKA “The Risk of Success”


Topic: Probe – sense – respond in practice AKA “The Risk of Success”

Leader: Jon Kendall

Participants: Anna Straton, Beth Hyland, Anna Lohse, Jonathan Davy, Fran Westmore, Megan Hughes, Mark Butz

Key Points:

•    We focussed on the challenges of the two worlds of change:
•    ME     – satisfying the needs of “funders”
– doing things to create sustainable change
•    Tempting to think that we have to pretend in various ways in the direction of funders, and tempting to think something huge has to happen to move the situation on. BUT we concluded that small things are where the power and opportunity live – what small things can we do to probe/test what seem like fixed boundaries? The small steps often create big ripples!
•    And the apparent paradox of funded change! In order to make a difference we decided to maintain the status quo.

May 12, 2010

Our “Mindset” and Worldview as we Evaluate Behaviour Change


Topic: Our “Mindset” and Worldview as we Evaluate Behaviour Change

Leader: Geoff Brown

Participants: Murray Miller, Ross Egleton, Chris Corrigan, Jan Smith, Jon Keudau, Maria Eliadis, Fran Westmore, Maude Lecourt, Brian Hardy, Diane Nichols, Kathryn McCallum, Sarah Bartlett, Sue Xavis, Andrew Rixon, Debbie Coffey, Stephen Kelly, Felicity Thomas, Lisa Keedle, Jonathan Dacy, Tanya Loos, Martin Hausenblas

Key Points:

•    Willingness to be a learner as opposed to an expert – a key mindset needed across the field
•    Sam Ham’s work – as project leaders we do things in the way “we think” it should work
– A Healthy Assumption we should take is this: “we can learn from our participants”
•    Mindset – we still assume a causal link between providing people with information/experience and the behaviour change we want to see
•    Should we even be focussing on the “set of behaviours” we want to change? Instead should we be focussing on other things like political processes, infrastructural change, social networks/relationships?
•    The notion that change “hurts” – how can we support people through this?
–    Re: Biodiversity – help people to notice the things around them (respect the things around them)
–    Maybe our role is to help people simply notice the things around them rather than trying hard to change them?
•    As an evaluator it can be a trap to also be the Architect of the Program because the design is therefore “my baby” – too much attachment and bias
•    Relationships are critical between people in our projects – we often assume that the relationship b/w us (ie, our project) and the participant is the most important one to enable change; maybe we should do more to support the relationships and conversations between participants (“we do what we do under the influence of others”)
•    Is evaluation part of the project? Look for opportunities for participants to get involved in evaluation activities – participatory evaluation techniques like Most Significant Change (MSC)
•    Having different “Roles” and different expertise in the design and evaluation of a project is critical
•    A Mindset of “Flexibility” is critical – that is our targets and goals may have to change as a project unfolds – this conversation with the funder can build trust and relationships
•    We need to be able to link up what happens at a distance (the strategic level) with what happens at close range (the actions)
•    Giving people (participants and partners) a sense of ownership is critical also
•    Story of Change: Narrative Posters (Andrew Rixon – see www.babelfishgroup.com.au)
1.    “You have all the resources you require” – 1st Challenge and the Mindset needed
2.    “We are doing the best we can”
3.    “Being flexible and hanging loose”
Dilemma raised in the discussion: “WE ARE UP AGAINST IT AND IT’S URGENT!! We don’t have time for all of this!”
–    Refer to the book “The Time Paradox” by Philip Zimbardo – our perception of time can bring about much anxiety
•    Maybe the anxiety around “time pressures” stops us being effective and actually slows us down

Healthiest Time Perspective – refer to TED video (http://ca.ted.com/talks/lang/eng/philip_zimbardo_prescribes_a_healthy_take_on_time.html)
1. Past Positive: appreciation of the past is critical
•    Mindset that our participants have the resources
•    Inquiry Mindset = Listening & Respect the Community – spending time learning the “trigger points”
•    The “Reality” of our participants is important to understand
•    Identify your “behaviour change” journey? – Understand it in ourselves first … walk the talk
•    We need to be living the change we are bringing about to others
•    Being comfortable in saying “I don’t know” is difficult for experts – we need to break this Tyranny!
•    Mindset – Can I even expect to change anybody in my work?

References

1. Dave Snowden’s (Cognitive Edge) work on Complexity helps us to gain a healthy mindset when working with behaviour change projects – refer to this video by Shawn Callahan at Anecdote (http://www.youtube.com/watch?v=5mqNcs8mp74)

2. The ‘failures’ of the Castlemaine 500 Project inspired this topic, and there is a chapter on the ‘mindset’ needed at the back of the report – http://www.cvga.net.au/main/index.php?option=com_docman&task=cat_view&gid=15&&Itemid=72

May 12, 2010

How to reconcile competing values with a shared future?


Topic: How to reconcile competing values with a shared future?

Leader: Michelle Howard

Key Points:
•    We discussed the universality of many values and how the same value can manifest in different behaviours. There is a risk that we judge others (“Us & Them”) when we are all the change. So how can we support more sustainable behaviours with good in G??, real choices and examples of alternative ways, so that we can all take the risk to let go of the old trapeze and grab hold of a new one!
•    Values ?? true behaviours different.
•    We are all the change. Un??
•    With nature of ch?? other peoples behaviour.
•    We need ?? and options to be part of change towards a more sustainable future.
•    Organic/voluntary change or “enforced” change – “It’s a bit of a wicked problem”
•    Opportunity to work with positive examples and to sow on fertile ground.

May 10, 2010

Ripple effect & nonlinear snowball + How to tip the tipping point faster


Leader:    Kathryn McCallum & Ross Egleton

Participants: Keren Winterford, Rob Catchlove, Deanne Wijey,
Jacqui Boreham, Alison Wallace, Candia Bruce
Hayley Giachin, Greg Bruce, Sarah Bartlett,
Gillian Paxton, Julian Donlen, Mike Dodd,
Sue Arndt, Christina Ting, Adrianne Fanning
Ross Egleton

Key Points:

Surfcoast Energy Group; the Port Phillip Bay “Human Sign”, where parents were transformed to become more sustainable as their children influenced them to join the sign; WorkSafe Health checks that encouraged storytelling within organisations to talk about health and de-stigmatise health issues; GetUp projects.

Key Themes

Design
Include a baseline, decide on the outcome and work backwards, decide whether to evaluate the ripple effect or simply the activity, embed evaluation at different levels and cater for unexpected outcomes.  Unexpected outcomes can become recommendations and can also influence expected outcomes.  If evaluating ripple effects, a mechanism is needed to collect the information.

Networking
Determine functional groups where people can report back, to ensure the community view is holistic – eg the school community goes beyond students and parents.  We need cross-sector community groups and to use the strength in networks; it’s a fallacy that we can control.  Allow for waves of change, as actions may be immediate or dormant.

Tools
Storytelling was crucial: provide a forum for it, and for people, as much as possible, to relate effort (good and bad) back to the functional group.  Learning forums such as reunions or half-day sessions let people share their stories, which can be treated as closure or a celebration of “where to from here?” – collective social learning with action plans.  Education has limitations (the “nag factor”).  We discussed incremental change vs massive change; importantly, policy is very powerful, and research informs policy.

Digital networks – count internet views or hits relating to the original email or social sharing; the data in the database are non-linear – see social network analysis, or June Holley’s “Network Weaving”.
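As an illustration only of the social network analysis pointer above: one simple measure is each person’s degree – how many direct connections (eg email forwards) they have – which highlights the well-connected “weavers” in a ripple-effect network. The names and forwards below are invented:

```python
# Hypothetical sketch: treat email forwards as edges in a network and
# count each person's degree (forwards sent plus forwards received).
# The people and forwards here are invented for illustration.
from collections import Counter

forwards = [("ann", "bob"), ("ann", "cat"), ("bob", "dee"),
            ("cat", "dee"), ("dee", "eve")]

degree = Counter()
for sender, receiver in forwards:
    degree[sender] += 1      # one connection for the sender
    degree[receiver] += 1    # and one for the receiver

# People with the highest degree are candidate "network weavers".
for person, d in degree.most_common():
    print(person, d)
```

Dedicated tools go much further (paths, clusters, brokers between groups), but even this count can show where a message is spreading and where it stalls.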

Copyright approval – see Creative Commons (creativecommons.org).

Resource emergent properties / groups / successes, and relate reporting back to the activity / group.

Evaluation can be clear cut, especially in research, if well designed though harder with community action.

May 10, 2010

How to change behaviour in our “over it” topic?


Leader:    Bridget Wetherall

Participants:

Key Points:

•    Making it relevant

•    When they are ready

•    Want to change

•    Imposed change & moved along by peer pressure

•    Reward & recognition – certificate etc.

•    Minimising assumptions

•    Tapping into existing groups – “green parenting” – churches

•    Non-English speaking backgrounds

•    Shift to “their” timelines

•    Research to understand various stages of a project to accommodate “early adopters” and “strugglers”

•    Use interim evaluation data to prove a point

•    Evaluating regularly

•    Network to know what is going on in your area – local Gov’t network exists

•    ICLEI – run its course – local Gov’t taken over role

•    Greenhouse Alliance – SEW to join

•    Roadmap of what is going on around you

•    Buy-in of program for middle management

•    Scaleable data – different data provided to different levels – mngt, stakeholders

•    Peer mentoring to get middle mngt to buy-in

•    Choose not to work with certain groups in “this” time and context

•    I 3 model – diagram relating involvement and intervention (sketch not reproduced)

•    Media involvement

•    School pick-up and drop-off times to influence change, ie mothers with children

•    Understand what else is on in the area and when so no conflicting “messages”

May 10, 2010

Assessing the independent contribution of population based programs/interpretation


Leader:    Felix Aeber

Participants: Sarah Gorman, Max, Millicent Burke, Sarah Partlett
Stefan Koufman, Bruce Paton, John Harvey,
Bridget Wetherall, Jim Curtis, Catherine Doran
Ian Blain

Key Points:

•    Evaluating the independent contribution of a project is inherently difficult.  A first step is to identify and catalogue co-variants.

•    Asking members of the target population about attribution might be a viable option, but it is suggestive and misses social diffusion effects.  As evaluation is not the same as measurement, the data collected require an appropriate interpretative context.
