Friday, October 22, 2010

Concept for IDEAS Research Initiative

Introduction

Human Performance Technology (HPT), a systematic approach to improving productivity and competence, uses a set of methods and procedures -- and a strategy for solving problems -- to realize opportunities related to the performance of people. More specifically, it is a process of selection, analysis, design, development, implementation, and evaluation of programs to most cost-effectively influence human behavior and accomplishment. It is a systematic combination of three fundamental processes -- performance analysis, cause analysis, and intervention selection -- and can be applied to individuals, small groups, and large organizations.

How Does HPT Work?

At its core, human performance technology combines three fundamental processes -- performance analysis, cause analysis, and intervention selection -- and applies them to individuals, small groups, and large organizations alike.

HPT uses a wide range of interventions drawn from many other disciplines, including behavioral psychology, instructional systems design, organizational development, and human resources management. As such, it stresses a rigorous analysis of present and desired levels of performance, identifies the causes of the performance gap, offers a wide range of interventions with which to improve performance, guides the change management process, and evaluates the results. Taken together, these steps describe a complete performance improvement strategy.
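The analysis step described above -- comparing present and desired levels of performance to size the gap -- can be sketched in a few lines of code. The measure names and numbers below are purely hypothetical illustrations, not part of any HPT standard.

```python
# Minimal sketch of HPT-style gap analysis: compare desired vs. actual
# performance per measure to size the gap before selecting interventions.
# All measure names and values are hypothetical.

def performance_gap(desired, actual):
    """Return the shortfall per measure (positive = performance falls short)."""
    return {measure: round(desired[measure] - actual.get(measure, 0), 2)
            for measure in desired}

desired = {"reports_on_time_pct": 95, "client_satisfaction": 4.5}
actual = {"reports_on_time_pct": 70, "client_satisfaction": 3.2}

gaps = performance_gap(desired, actual)
print(gaps)  # {'reports_on_time_pct': 25, 'client_satisfaction': 1.3}
```

Only once the gap is quantified per measure does cause analysis and intervention selection follow, which is the sequencing the paragraph above emphasizes.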

Why HPT and International Development

A donor/aid agency’s goals are usually multiple and general, reflecting both self-interest and altruism. ISPI’s Human Performance Technology (HPT) practitioners can improve the level, amount, and duration of improvements and progress. This is because its approaches are scalable in 3 ways:

  • Direct project assessment and technical assistance (specific clients)
  • Catalyst/stimulus (when coupled with technical assistance) to embed change
  • General systems / organizational tool to facilitate improved program results


IDEAS Research Parameters

Goal: To demonstrate the value of HPT to the Donor Development Community through its application in development programs/projects and institutions.

Just as with all development-focused activities, the goal of the IDEAS committee research is complex on many levels. This complexity, while making the planning for research challenging, also affords the committee an opportunity to illustrate how HPT is an ideal framework for understanding and addressing the complex challenges of all development work. HPT is, after all, a systematic, holistic, and interdisciplinary approach to improving results, offering a unique perspective on how to improve performance within complex contexts.

Nevertheless, there are a number of parameters (i.e., complexities) that must be considered when demonstrating, through research, the value of HPT to development efforts. A number of these are included below, along with recommendations for how they are to be addressed within the research project.

Recommendations for Direction of Research

1. Dimensions of the Development Arena 

Include development activities from all technical sectors, including work that is internal (improvement within the development institution itself) as well as external (improvement within client/beneficiary organizations). It may also be fruitful to compile activities led by the private sector in developing countries as this can further illustrate the impact of the application of HPT.

2. Phases of Research

a. Research activities can be rolled out in phases. Initially, the Committee can focus efforts on easily accessible activities - such as compiling existing research from recognized sources (for instance, research journals, professional publications, and institutional reports).

b. As the committee’s linkages grow to include universities offering HPT-focused courses, the research can incorporate student-led original research on the application of HPT in a development setting.

c. This can be complemented with an effort to work with donors and implementing organizations to document and share their evaluative research on the application of HPT in their project activities.

d. As a final step, the committee can work with donor agencies to integrate evaluation and research activities into their project designs. In Year One the focus will be on successful applications of HPT (and HPT related interventions), with the focus expanding in Year Two to include what we can learn from projects that did not meet their objectives.

3. Geographic Focus

At the outset, the team should focus its efforts on gaining access to existing research and initiating research that capitalizes on existing opportunities (e.g., donor-funded projects, student-led research). This will facilitate research opportunities in all regions. At the same time, if due to other issues (such as language) this limits the initial research review to just a few regions of the world, that is an acceptable starting place.

4. Sources of Information

Research sources should include donors (USAID, World Bank, etc.) as well as implementing organizations (World Learning, AED, etc.). As the donor community continues examining mechanisms to improve the impact of its activities, the Committee should gain buy-in in two key ways:

a. securing support for research through their funded program activities;

b. readily sharing evidence of impact with donors at multiple levels (program managers, portfolio managers, bureau managers, administrators).

Concerning the implementing organizations, the Committee should focus efforts on encouraging these organizations to document the impact of their HPT-based activities. This can be accomplished initially through organizations that participate in the Committee, through organizations affiliated with ISPI, or those identified through preliminary research outlined below.

5. Theoretic Discipline Scope

HPT draws upon many disciplines as a basis for its methods. These cannot all be addressed simultaneously. At the start, our review of research should be limited to just 3 or 4 primary disciplines that feed HPT, in addition to the research on HPT itself. In later years, these can be expanded upon.

6. Languages

Research on HPT and its related disciplines within the donor environment may be done in multiple languages. Given the make-up of the committee and the depth of donor information available in English, English will be the initial focus. As we move forward, we can look to expand the committee’s research scope with professionals (or volunteers) who can provide access to publications in other languages.

7. Glossary and Definitions

Compile and vet a list of commonly used definitions and explanations for the critical terms, concepts and ideas underlying both the development donor context and HPT. Use collaborative concept-map tools (such as the free CMAP tools) to better understand the relationships among terms.

8. Research Resources

The IDEAS committee should take the primary role to direct and disseminate results of research. The committee should work closely with the ISPI Research Committee and other ISPI members affiliated with universities (including GWU, Boise State, Capella) to identify faculty, student or internship projects that may be used to complete aspects of this work. Likewise, the committee should seek to aggregate independent research done from within either the donor or HPT community in support of committee goals.

An important element of this will be identifying a central portal for access to research results, while operating within the constraints of copyright restrictions.

9. Dissemination of Findings

Research findings should be disseminated periodically through a combination of written publications in professional journals, community newsletters and media, websites, and blogs. Findings may also be presented in seminars and workshops, or as topics at specialized conferences.

-- concept developed by the research task team: Ryan Watkins, Matthew Bond, Samantha Spilka, and Steven Kelly

Wednesday, October 20, 2010

HPT Value Proposition: Getting Sustainable Results

A donor/aid agency’s goals are usually multiple and general, reflecting both self-interest and altruism.
ISPI’s Human Performance Technology (HPT) practitioners can improve the level, amount, and duration of improvements and progress. This is because its approaches are scalable in 3 ways:
  • Technical Assistance
  • Catalyst/stimulus when coupled with sectoral assistance
  • Systems/organizational tool to facilitate improved results

Technical Assistance
    1. Assessment and clarification of a program/project for an organization or sector
    2. Project design, including performance measures
    3. Performance management, in organization or sector
    4. Analysis, advisory, selection of interventions to improve institutional sustainability and operational results
    5. Work process analysis and performance improvement
    6. Development of policies, procedures, regulations to support (incentives) management, process, and capacity development improvements; and remove blockages (disincentives)
Catalytic Coupling with Sectoral TA (working with a sectoral technical assistance program/project as a process/communications/organizational policy tool)

Gap analysis, and the rationale for why and how the old system or technique worked (WIIFM -- "what's in it for me")
  1. System changes required for support/incentive of new approach
  2. Embedding new approach into work processes (linkages to prior, post, and staff functions)
  3. Assist with management and technical change issues, communications
  4. Cooperate with technical team to verify impact on other organization/system interventions (avoid de-development and contradictory projects)
Systems/Organization Tool
  1. Development of performance management measures, policy issues to support them, and incentives/disincentives to achieve them
  2. Strategy development; coupled with system dynamic checks on consequences, feedback and harmonization across sectors; elaboration of plan into operations; understanding intervention points and levers
  3. Program and project development; coupled with system dynamics checks on consequences, feedback, and harmonizing effects of cross-sectoral impact
  4. Reporting mechanisms; communicating results; disaggregation; focus on outcomes vs outputs
  5. Project design, including timelines and budgets, reducing timelags and allowing for context changes during timelags
  6. Review and evaluation
  7. Set up, train, and audit monitoring (skills, approaches, policies, procedures), and critically self-monitoring performance feedback
  8. Program and project management (moving away from increasingly time-consuming and ineffective modern project management toward effective performance management)
  9. Process analysis, improving efficiencies and focusing on alignment and results

Thursday, October 14, 2010

Measuring Millennium Development Goals

In this presentation sponsored by TED (Ideas Worth Spreading), Hans Rosling focuses on the 8 Millennium Development Goals and takes one sample area.

This time his topic is child mortality, with a brilliant overview of how we collect statistics, how to get child mortality rates down (educating girls accounts for fully half of the improvements), and a lot of good news on development. It is a really excellent demonstration of the use of statistics and the dangers of averages.
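The "dangers of averages" point can be made concrete with a toy calculation: a single national figure can sit far from the reality of any subgroup. All figures below are made up for illustration.

```python
# Toy illustration of how a national average can hide subgroup
# disparities -- the "dangers of averages" Rosling warns about.
# Rates and birth shares below are invented for illustration only.

# Hypothetical child mortality (deaths per 1,000 live births) by region,
# with each region's share of national births.
regions = {
    "urban": {"rate": 20, "birth_share": 0.6},
    "rural": {"rate": 80, "birth_share": 0.4},
}

# Birth-weighted national average.
national = sum(r["rate"] * r["birth_share"] for r in regions.values())
print(national)  # 44.0 -- close to neither region's actual rate
```

Disaggregating by region (or income, or education) is what exposes where the problem actually is, which is exactly the case the talk makes.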



Friday, October 8, 2010

Effectiveness of Aid - Country Comparison Study Released


A new study compares donor countries and agencies with regard to the effectiveness of assistance provided to transitioning societies.

Donors, academics, and development advocates have long recognized that not all aid is created equal. Often, the impacts of aid are blunted because it’s spent in the wrong places or isn’t coordinated with recipient government programs. How can we know which donors give aid well, and which donors need to improve?

Nancy Birdsall, president of the Center for Global Development, and Homi Kharas, deputy director of the Brookings Institution’s Global Economy and Development program are the co-creators of the Quality of Official Development Assistance (QuODA) assessment, a new tool that tracks and compares donor programs against four dimensions of aid quality.

With aid budgets under pressure and the 2015 deadline for achieving the Millennium Development Goals quickly approaching, donors and recipients are striving to make the most of available aid. However, until now, there has been no consistent measure of how well donor countries and agencies provide aid.

The new tool provides a comprehensive quality-of-aid assessment that evaluates each donor country or agency on four dimensions of aid quality. The first annual Quality of Official Development Assistance (QuODA) assessment compares the aid quality of 31 donor countries and multilateral agencies, as well as 152 individual development agencies.

The four dimensions of aid quality measure:

- Maximizing efficiency, or how smartly aid funds are distributed.

- Fostering institutions, or the degree to which aid is building the capacity of recipient governments.

- Reducing the burden on recipient countries, or how much effort aid recipients need to put in to secure funds.

- Transparency and learning, or how much information about the flow of aid money is available.


"Our ratings bring fresh rigor to the international dialogue about aid effectiveness," says CGD president Nancy Birdsall, who devised the system with Homi Kharas, deputy director and senior fellow of the Global Economy and Development program at Brookings.

Kharas adds that "There is broad international consensus about what constitutes high-quality aid, but without measurement, a lot of that is just hot air. There's lots of room for improvement. We hope that the QuODA rankings will spur donor countries and aid agencies to do a better job."




As summarized in a recent Devex article by Ivy Mungcal, the four categories attempt to capture donors’ adherence to international standards outlined in the Accra Agenda for Action and the Paris Declaration on Aid Effectiveness, as well as their commitment to transparency, the report explained.

The study ranked donors on each of the four indices of aid quality. It did not provide an overall ranking that aggregates all four categories.
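That per-index design -- one ranking per dimension, no combined league table -- can be sketched as follows. The donor names and scores below are illustrative placeholders, not QuODA's actual data or methodology.

```python
# Sketch of QuODA-style reporting: donors are ranked separately on each
# of the four aid-quality indices, and no aggregate score is computed.
# Scores below are illustrative placeholders, not actual QuODA data.

INDICES = ["maximizing_efficiency", "fostering_institutions",
           "reducing_burden", "transparency_learning"]

# Hypothetical standardized scores (higher = better) per donor.
scores = {
    "Donor A": {"maximizing_efficiency": 0.8, "fostering_institutions": 0.6,
                "reducing_burden": 0.7, "transparency_learning": 0.9},
    "Donor B": {"maximizing_efficiency": 0.5, "fostering_institutions": 0.9,
                "reducing_burden": 0.4, "transparency_learning": 0.6},
    "Donor C": {"maximizing_efficiency": 0.3, "fostering_institutions": 0.2,
                "reducing_burden": 0.9, "transparency_learning": 0.1},
}

def rank_by_index(scores, index):
    """Return donors ordered best-to-worst on a single index."""
    return sorted(scores, key=lambda donor: scores[donor][index], reverse=True)

# One ranking per index -- deliberately no overall league table.
rankings = {idx: rank_by_index(scores, idx) for idx in INDICES}
for idx, order in rankings.items():
    print(idx, "->", ", ".join(order))
```

Keeping the four rankings separate lets a donor be strong on one dimension and weak on another, which is exactly the pattern the study reports (e.g., multilaterals strong on efficiency but weak on transparency).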

Ireland and the World Bank’s International Development Association were on the top 10 in all four dimensions of aid quality. They were the only two donors to receive high marks on all four indices.

The Netherlands, Denmark, Inter-American Development Bank’s Special Fund and the Asian Development Fund were in the top 10 in three out of the four categories, while 18 of the donors assessed made the top 10 in at least one of the aid quality indices.

Some countries, including the U.S., Switzerland and Greece, did poorly across all categories. The three countries were consistently in the bottom 10 rankings in the four indices.

The U.S. was rated poorly on nine of the 30 indicators, including share of untied aid, use of recipient countries’ financial systems, minimization of fragmentation across its multiple aid agencies, contribution to multilateral agencies, and reporting of the aid delivery channels it uses.

Multilateral agencies had higher average rankings on three of the four aid quality categories. The agencies took the top five rankings on maximizing efficiency, four of the top eight slots on fostering institutions and the top three on reducing burden. Multilaterals, however, with the exception of IDA and the European Commission, performed poorly in terms of transparency and learning.

The study also assessed individual aid agencies of the countries in its sample. It found that “agency performance can vary widely for a single bilateral donor.”

This variety was specifically evident among the U.S. aid agencies. The study indicated that the Millennium Challenge Corp. did well on the majority of the categories, while the U.S. Agency for International Development and the U.S. Defense Department performed poorly on most.

CGD and the Brookings Institution excluded the consideration of humanitarian aid in both country- and agency-level analyses. The authors, including CGD President Nancy Birdsall, explained that humanitarian aid serves a different purpose and its effectiveness is already measured by a separate index.

Monday, October 4, 2010

Who's In (in International Development)


As the global economic balance shifts, the United States and the other post-World War II powers say they’re ready to make room. The Group of 8 rich countries ceded to the Group of 20, where the interests of Germany and France must vie with those of India. Despite that, the maneuvering before this week’s meetings of the International Monetary Fund suggests that the big players may not be as ready as they claim.

The I.M.F. is not as significant as it was 50 years ago, when it policed a system of fixed exchange rates. It still has a major role to play, with hundreds of billions to prop up countries that run into financial trouble. Developing countries — which have long resented the fund’s demands that they open up their markets — are eager to have more of a say in its deliberations.

Last year, leaders of the G-20 agreed to shift at least 5 percent of the fund’s quota share (akin to shares in the fund’s capital) from overrepresented countries, like Canada or Belgium, to underrepresented ones, like Turkey or Brazil. Brazil, Russia, India and China are pushing for a 7 percent shift. European Union countries — they account for a fifth of the world economy yet have more than a third of the fund’s quota share, and name 9 of its 24 directors — have been especially reluctant to cede. The United States has been willing to give up some of its roughly 17 percent stake, but has balked at proposals that would cut it to 15 percent and deprive it of its veto (decisions need 85 percent of the vote to pass).
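The veto arithmetic here is easy to check: with an 85 percent supermajority required, any member holding more than 15 percent of the vote can block a decision on its own. A minimal sketch of that logic (an illustration, not an actual IMF rule engine):

```python
# Supermajority veto check: a decision needs at least 85% of the vote,
# so a member can block it whenever everyone else combined holds < 85%.
# Shares are in whole percentage points to avoid floating-point issues.

SUPERMAJORITY_PCT = 85

def has_veto(quota_pct, threshold_pct=SUPERMAJORITY_PCT):
    """True if the other members combined cannot reach the threshold."""
    return (100 - quota_pct) < threshold_pct

print(has_veto(17))  # True: the rest hold 83% < 85%, so a ~17% stake can block
print(has_veto(15))  # False: the rest hold exactly 85%, and the veto is gone
```

This is why the U.S. balks at exactly the 15 percent line: the difference between a 17 percent and a 15 percent stake is the difference between having a unilateral veto and not having one.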

This process has also set off fierce haggling — all around — about how clout should be defined. Should G.D.P. be measured at market exchange rates, which make rich countries like Japan look bigger, or using “purchasing power parity,” which favors countries like China where things are cheaper? Should it include population, as India demanded?

Since the end of the cold war, even the old Western allies have lost some of their shared sense of purpose. That, in part, explains why Belgium isn’t eager to let France represent its interests at the table. There is more than self-interest at play. The new powers clamoring for a place don’t necessarily share the institution’s values — especially the I.M.F.’s passion for free capital markets.

That is not an argument to keep the newcomers out. Bringing them into the room is the best chance of persuading them to assume the full responsibilities of their new power, and of building a new global consensus.


A version of this editorial appeared in print on October 4, 2010, on page A26 of the New York edition.