Archive for the ‘Planning, Evaluation and Measurement’ Category


Plans! We Don’t Need No Stinking Plans!

April 5, 2016

One of the points I was making at the ISA Convention a couple of weeks ago was that in the real world public diplomacy organizations find it difficult to be strategic in the sense of creating a strong connection between their objectives and their means.  In part this is because public diplomacy organizations are always on: the routine logistical requirements of running a programme, both day to day and over the longer term, overwhelm their capacity to be strategic.  There's no point worrying about SMART goals if you are mostly worried about keeping the show on the road at all.

Anyway, another exhibit to buttress my case emerged yesterday: a US State Department Inspector General's report on how the public diplomacy work of the embassy in Baghdad was contributing to the counter-messaging part of the overall strategy against ISIL.

The first item from the summary:

“Embassy Baghdad’s public diplomacy activities operate without formal strategic planning and goals.”

Public diplomacy is not discussed within the embassy's Integrated Country Strategy and there is no Public Diplomacy Implementation Plan.

The report obviously thinks that there should be plans, but that's not my point: lots of public diplomacy is reactive and improvised rather than strategic.  From an analytical perspective it's often better to look at public diplomacies through an organizational lens rather than an intentional one.


The Secret of Public Diplomacy

February 22, 2016

One of the most stimulating books that I've read in the last couple of years was Ray Pawson's The Science of Evaluation: A Realist Manifesto, which is a book about…evaluating policy interventions.

There’s a lot in there but a core idea that recurs is this:

The outcome of a policy intervention is a function of what you do and how you do it in what context.

or to put it another way

Outcome = intervention + implementation + context

There are a lot of implications of this, but here are four:

  1. An outcome is not necessarily one you expected or wanted
  2. A great idea badly implemented will produce different outcomes from the one you expected even if the context is supportive.
  3. An intervention that produced a great outcome in one context may not produce the same outcome in another situation.
  4. In the right context a poorly implemented, badly conceived intervention might still produce a desirable outcome. The problem is that the lessons drawn will be that the intervention worked.

In terms of the analysis of public diplomacies, a disciplined application of these four categories is very useful.  Much discussion of public diplomacy tends to focus on the design of the intervention (i.e. communications strategy, message, narrative) without much attention to implementation or context.  The analysis of real cases tends to show a different pattern, where interventions are often a function of the context (we must do something!) and the mode of implementation (organizational repertoire) rather than of any careful design process. Much more of this at ISA!
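
To make the four implications concrete, here is a toy sketch in Python. It is my own illustration rather than Pawson's model: the function, the weights and the scores are invented, and the only point is that the same intervention scores differently once implementation and context vary, and that a weak intervention can ride a favourable context.

    # Toy model (illustrative only): outcome as a joint function of what you do,
    # how well you do it, and how favourable the context is. All inputs in [0, 1].
    def outcome(intervention: float, implementation: float, context: float) -> float:
        # The weights are made up; the key assumption is that context is allowed
        # to carry a large share of the result.
        return 0.25 * intervention + 0.25 * implementation + 0.5 * context

    baseline = outcome(0.9, 0.9, 0.9)     # good design, good delivery, friendly context -> 0.90
    badly_run = outcome(0.9, 0.1, 0.9)    # implication 2: same idea, poor delivery -> 0.70
    wrong_place = outcome(0.9, 0.9, 0.1)  # implication 3: same project, hostile context -> 0.50
    lucky = outcome(0.2, 0.2, 1.0)        # implication 4: weak project, perfect context -> 0.60

    for name, score in [("baseline", baseline), ("badly run", badly_run),
                        ("wrong context", wrong_place), ("lucky", lucky)]:
        print(f"{name}: {score:.2f}")

Note that in this sketch the 'lucky' case outscores the well-designed project in the wrong context, which is exactly the situation in which the lesson drawn will be that the intervention worked.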


Why Do Government Agencies Have Strategic Reviews?

August 24, 2015

There's an interesting new paper in the Journal of Public Policy by Jordan Tama on why government agencies conduct major strategic reviews.  Tama uses the case of the US Quadrennial Defense Review as his starting point. Given the high degree of scepticism about the value of this document in shaping the development of US defence strategy, why has the practice spread across other government departments (including, of course, State with its two Quadrennial Diplomacy and Development Reviews)? The answer is that the reviews are politically useful – either to Congress or the White House in influencing an agency, or to the leadership of the agency in staving off external threats. Tama also argues that you can trace the diffusion of these reviews via networks of people who were originally associated with the Department of Defense.

The moral of the story: the next time you print out a pdf of an organization's strategic review, keep in mind that the strategic threat it is supposed to address may not be 'out there' but closer at hand in the legislature or the treasury.

Tama J (2015) The politics of strategy: why government agencies conduct major strategic reviews, Journal of Public Policy, FirstView: 1–28.


Public Diplomacy and ‘The Good Project’

July 28, 2014

Cardinal Richelieu saw diplomacy as a process of 'continuous negotiation'. States have an ongoing relationship that is subject to continuing adjustment, and managing it is the job of the diplomat. The same can be said of the 'classical' modes of public diplomacy or cultural relations: the operation of an information service or a cultural institute is seen as an ongoing activity. Yet over the past 30 years an increasing volume of PD/CR work (as well as aid/development activity) has been organized as projects. This comes both from the attempt to ensure the effectiveness of government activity and from the movement of resources from geographical to functional bureaux within MFAs. I've been wondering what the implications of this 'projectization' of diplomacy are. How much difference does it make to think of diplomacy as a set of discrete projects rather than as the maintenance of a relationship?

As a result I was intrigued to come across a new book that explores the impact of project working on humanitarian relief NGOs.  In The Good Project: Humanitarian Relief NGOs and the Fragmentation of Reason, Monika Krause of Goldsmiths College, London argues that instead of analysing humanitarianism in terms of lofty goals or hidden interests we need to pay attention to how the organizational dimension shapes what actually gets done. Based on research on NGO desk officers, she concludes that they are concerned with developing a portfolio of projects that can demonstrate that they have achieved their specified objectives.  NGOs will avoid projects that are too difficult, but also those where effectiveness cannot be demonstrated, because their reputation for effectiveness is important in getting funding from donors.  The donors are frequently government aid agencies that need to demonstrate to politicians and taxpayers that they are getting value for money. The logic of the 'good project' drives attention away from the ultimate ends of policy towards the good execution of discrete activities. In some foreign ministries (the FCO is one) much of the discretionary programming spend is allocated as project funding, either to embassies, mittlerorganizations or other NGOs. Would a similar investigation into how that funding is allocated find that the organizational requirements of the 'good project' (and the skills needed to write a good application) were the overriding factor in determining the allocation of resources? My suspicion would be yes.


Recent Report on the French Cultural Network

May 21, 2014

I've just come across a September 2013 report by the French Cour des Comptes* on Le réseau culturel de la France à l'étranger (France's Foreign Cultural Network).  I haven't been through it in detail yet, but if you read French this looks like a really useful picture of the state of things in France.

The main recommendations of the report are:

1. The network needs more professionalization.

2. There should be an agreed strategy between the ministries involved.

3. Evaluation of projects.

4. More power to the French Institute and Campus France within their respective networks.

5. Follow-up on alumni of the network.

6. Better coordination between the public network (i.e. French Institutes and Cultural Centres) and the Alliance Française.

7. Better financial management.

8. Better measures of the impact of the network.

9. Evaluation of the economic impact of the network.

I think these are the same recommendations that all reports on the network have been making since at least the Rapport Rigaud in 1979; come to think of it, most of them apply to most reports on cultural relations or public diplomacy in most countries.

*Court of Auditors.  I guess that this would be equivalent to a National Audit Office report in the UK or a Government Accountability Office publication in the US.

 


Reviewing the FCO Communication Capability Review

January 30, 2014

Over the last couple of years British government departments have been subject to Communication Capability Reviews conducted by the Government Communication Network.  These involve a group composed of other government communicators plus outsiders wandering around the ministry, interviewing people and looking at your paperwork – if you've worked in the UK public sector or related organizations you will doubtless have experienced something similar.  Anyway, I've just spotted the review of the FCO conducted in June 2013.  It's eight pages, so it's a useful snapshot if you want to get a sense of where communications at the FCO stands.

The external reviewers are PR people from a hotel chain, a corporate PR consultancy and the BBC, and the result is what you would expect if you asked corporate PRs to look at an MFA.

They start off by commenting that the FCO is different from other government departments because, among other reasons, communications is a core business, the audience is primarily overseas and the communications capability is distributed across 270 missions.

The FCO communications review of 2011 basically tried to do more with less by reallocating resources away from the centre to directorates; as far as our reviewers are concerned this was bad, because it means that the 'FCO lacks strategic communications resource'.  Here are some extracts from the 'areas of challenge':

Status of communications – Communications as a discipline is not widely understood within the FCO. It has not been invested in. While Press Department is widely respected – and used as the main route into the Engagement and Communications Directorate (ECD) – other parts of the communications function are significantly less visible. There is little understanding of the services offered by Engagement and Communications Directorate. There is a lack of clarity on the roles and responsibilities of the various staff who deliver communications activity (Engagement and Communications Directorate, embedded communicators, Senior Regional Communicators and communicators in post, for example). Recruitment is problematic.

Strategic planning – There is no strategic planning process or capability, no overarching communications strategy and no clear narrative targeted on overseas audiences. As a result, policy and communications are not fully integrated: communications is not an integral part of the business planning process. The majority of communications activity therefore is tactical and focused on short-term issues. Posts are unclear whether to amplify messages developed in London or tailor them, taking local issues and concerns into account.

Capacity – The reviewers do not believe that FCO communications resources are used as efficiently as they should be. In particular, the current structure provides insufficient ‘surge capacity’ to support priority policy areas, Foreign Secretary-led initiatives and in-year crises. The large number of locally-engaged staff with little knowledge of UK priorities exacerbates this. The current research resource is under marketed and underutilised. Some internal communications activity is duplicated by embedded staff. Overall, however, the reviewers do not believe that the FCO should increase the amount of resource dedicated to communication.

Capability – FCO staff are intelligent, articulate and committed. However, the current mix of diplomatic staff and communications specialists is sub optimal. Many important issues are dealt with by generalists with insufficient experience of communications and insufficient knowledge of where to go within the FCO for professional communications guidance and support. There is difficulty in ensuring the right level of skills development for diplomatic staff working in communications roles.

Delivery – The lack of strategic planning and lack of clarity over communications roles and responsibilities has led to inconsistent performance in areas including digital, campaign management and delivery, and evaluation.

So what do they want?

A clear vision for communications, an integrated communications plan, a centralized planning and delivery resource, a framework to clarify roles, etc.

There may be something to this, but I have a distinct feeling that after starting off by acknowledging how the FCO is different the review then goes on to ignore the fact.  The more I study the history of public diplomacy, the more I see that the whole area is marked by a number of recurring tensions.  This report manages to hit on several of these tensions but, rather than recognizing them, simply asserts answers.  Five tensions stand out:

  1. The big question.  What does communication mean in an MFA?  To what extent should communication be a separate function at all? (Remember that in 1953 it was Eisenhower's psychological warfare advisers like CD Jackson who opposed creating the USIA on the grounds that everything we do has a psychological effect.)  The extent to which comms should be a separate function really depends on your diplomatic concept, and diplomacy and PD are becoming more closely linked.
  2. Global strategy vs local adaptation.  The absence of a global strategy is not necessarily a bad thing if it allows more effective local communications.  I'm up to my eyeballs in the early Cold War at the moment, so for an example look at the very rapid disillusionment with Truman's Campaign of Truth: what looked good in Washington didn't work in the field.
  3. Centralization vs decentralization.  Same as above, but where do you put the resource and the control? There are arguments for both.
  4. Specialist communicators vs diplomats.  The FCO has generally leaned towards giving generalists communications experience (see the Drogheda Report of 1953).
  5. Locally engaged staff vs home personnel.  Of course the former have local knowledge, language and so on, but less understanding of the national priorities.

I guess that if you get senior corporate PRs as reviewers, they just recommend the things that they think give them status.  Diplomatic communication isn't PR, so the next time the Government Communication Network wants to do one of these reviews maybe it should get at least one reviewer from another MFA.


The State of Evaluation

January 9, 2014

Given the amount of time that people in the PD community spend worrying about evaluation, you might be interested in a recent report from the UK National Audit Office on Evaluation in Government. Put it this way: given the size of PD budgets, there are a lot of people with much bigger problems.

The main findings:

  1. Despite policies that require evaluation of the impact of interventions, British government actually evaluates in a pretty random way: departments don't have a clear view of what they evaluate or why they do it.  A graphic casually points to £51 billion of defence expenditure that isn't being evaluated at all (i.e. roughly 25 times the entire FCO budget).
  2. Most evaluation fails basic standards of methodological adequacy.
  3. Departments don't use evaluation evidence in developing policy.
  4. Only a small fraction of requests for funding from the Treasury are supported by evidence from evaluations.
  5. Evaluation reports that are weaker in supporting the causal impact of interventions make bigger claims for policy effectiveness.

So the next time someone asks you to justify the impact of public diplomacy expenditure, you will be perfectly at liberty to ask them for the evidence that any other government activity actually does anything.  The point is not that government activities don't do anything (even though this might be the case) but that government isn't very good at producing good evidence that they do.