The Failed States Index 2011 Launch Event

Published June 29, 2011
By J. J. Messner

Transcript of a speech presented at the Failed States Index 2011 Launch Event by J. J. Messner, Senior Associate at The Fund for Peace, at the National Press Club in Washington, D.C., on June 29, 2011.

* * *

Welcome again to the launch of the Failed States Index 2011.

I would like to thank once again Business Executives for National Security and Foreign Policy magazine for their continued support of the Failed States Index. If you are not already a member of BENS, you should become one. If you are not already a reader of Foreign Policy, pick up a copy.

It is my pleasure to speak to you on behalf of the staff at the Fund for Peace who have worked tirelessly to put together the Index: Nate Haken, who coordinated much of the research behind the Failed States Index; supported by Joelle Burbank, Krista Hendry, Patricia Taft, Heidi Ann Davis and Ken Brill; and of course our team of Research Assistants. I also wish to acknowledge Dr. Pauline Baker, president emeritus of The Fund for Peace, who pioneered much of the underlying framework of the Index and led the project in its early years.

* * *

This year, sadly, saw very little change atop the Failed States Index. Somalia was ranked first for the fourth year in a row, having leapt ahead of Sudan, the last country to hold that position, in 2007. Though Somalia’s raw score actually improved slightly this year, a significant gap still remains between it and second-placed Chad.

So what has conspired to keep Somalia in the worst position? Very simply, the situation in Somalia has remained relatively constant. The country is still beset by widespread lawlessness, ineffective government, terrorism, insurgency, crime, abysmal development and the well-publicized problem of piracy off its shores. The good news, however, is that Somalia also seems not to be getting any worse.

* * *

But before speaking further about the results of the 2011 Index, let us turn to the underlying principles of the Failed States Index, in order to better understand what it means.

First, though, why does this all matter?

Put simply, weak and failing states pose a challenge to the international community. With increased globalization and a highly integrated economic system, the pressures on one state can have serious effects not only at home, but abroad as well. Identifying the sources of pressure on states is critical to being able to properly address and hopefully mitigate those pressures. More so if that analysis is in the form of early warning, where growing pressures can be addressed before they get out of hand.

State failure represents the collapse of the state’s ability to provide the most basic services and protection to its people. Ranking atop the Failed States Index does not mean that a state is necessarily “failed” – though in the case of Somalia, a very strong case can be made that it is probably the closest representation to what we may consider to be a failed state.

* * *

So how do we measure state failure? The ability to quantify the pressures experienced by states is a relatively new phenomenon. During the Cold War, there was no shortage of analysis on inter-state conflict. Where there was a clear gap, however, was internal conflict, within states. There was not only a dearth of research and analysis on internal conflict, but also an absence of any sort of framework with which to assess it. Nowhere was this more apparent than during the Biafran Civil War in Nigeria, where Dr. Pauline Baker, president emeritus of The Fund for Peace, was living at the time.

Some years later, The Fund for Peace, under the leadership of Dr. Baker, set about creating an assessment framework that sought to better understand and assess internal conflict. Thus, we created the CAST (or Conflict Assessment System Tool) framework. The framework assesses internal pressures upon a state – or even a region or a province – based on twelve primary and equally-weighted indicators: 4 social, 2 economic and 6 political and military.

More recently, The Fund for Peace sought to further quantify these assessments. By using the CAST framework as a basis, we created our own conflict assessment software.

The Fund for Peace server collects millions of documents every year and, through the application of highly specialized search parameters, analyzes this content for every country based on the twelve primary social, economic and political indicators, which themselves each have a dozen or more sub-indicators. Using various algorithms, this analysis is then converted into raw scores that represent the salience – or significance – of each of the various pressures upon a given country. This analysis, which is particularly good at capturing trends in event-driven factors, is then integrated with quantitative data from pre-existing data sets such as the World Bank’s Development Indicators and others. Finally, those provisional scores are triangulated with human qualitative analysis to ensure that the system is correctly interpreting the data.
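
As a purely illustrative aside for the technically minded, the snippet below sketches, in Python, how a triangulation step of this kind might work in principle. The function, weights and reconciliation rule are assumptions made for the sake of illustration; they are not the actual CAST software.

```python
# Purely illustrative sketch: a toy version of the blending and triangulation
# steps described above. The weights, names and reconciliation rule are
# assumptions for illustration only, not The Fund for Peace's actual CAST
# software.
from statistics import mean


def blend_indicator_score(content_score: float,
                          quantitative_score: float,
                          qualitative_score: float,
                          tolerance: float = 1.0) -> float:
    """Combine three provisional scores for one indicator (0-10 scale).

    If the automated scores agree with the human review to within
    `tolerance`, average everything; otherwise lean more heavily on the
    qualitative check, mimicking the triangulation step described above.
    """
    automated = mean([content_score, quantitative_score])
    if abs(automated - qualitative_score) <= tolerance:
        return round(mean([automated, qualitative_score]), 1)
    # Large disagreement: weight the human review more strongly.
    return round(0.25 * automated + 0.75 * qualitative_score, 1)


# Example: one indicator for a hypothetical country.
print(blend_indicator_score(content_score=7.8,
                            quantitative_score=7.2,
                            qualitative_score=7.5))  # prints 7.5
```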

Countries are scored out of ten points for each indicator; given that there are twelve indicators, the total score is therefore assessed out of 120 points. The points system is also somewhat like golf, in that the lower you score, the better you are doing.
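
For readers who prefer to see the arithmetic spelled out, the short sketch below illustrates the scoring structure just described: twelve indicators, each out of ten, summed to a total out of 120, with lower totals being better. The example values are invented.

```python
# Minimal sketch of the scoring arithmetic described above: twelve indicators,
# each scored out of ten, summed into a total out of 120, where lower is
# better. The category counts come from the speech (4 social, 2 economic,
# 6 political/military); the example values below are invented.
INDICATOR_CATEGORIES = {"social": 4, "economic": 2, "political_military": 6}


def total_score(indicator_scores: list[float]) -> float:
    """Sum twelve indicator scores (each 0-10) into a 0-120 total."""
    assert len(indicator_scores) == sum(INDICATOR_CATEGORIES.values()) == 12
    assert all(0.0 <= s <= 10.0 for s in indicator_scores)
    return round(sum(indicator_scores), 1)


# A hypothetical country under severe pressure on most indicators.
example = [9.5, 9.0, 8.8, 9.2, 9.4, 9.1, 9.6, 9.3, 8.9, 9.0, 9.5, 9.2]
print(total_score(example))  # prints 110.5 (out of 120; lower would be better)
```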

* * *

We first conducted a global assessment of multiple countries in 2005, when the inaugural Failed States Index was published in Foreign Policy magazine. At that point, it was much smaller than the Index we know today, cataloguing only 76 ‘at-risk’ countries from Africa, Asia, the Caribbean and Latin America. In 2006, the Index doubled in size, before expanding to its current scope of 177 countries in 2007.

An obvious question is, why 177 countries? To qualify for the Failed States Index, a country must first of all be a UN member state. Hence, you will not find Western Sahara, Taiwan or Kosovo on the list. Secondly, there must be enough data available for us to conduct meaningful analysis. As much as we would like to include nations such as Nauru, Kiribati or Vanuatu, there simply is not enough reliable data for us to analyze using the data processing techniques we have developed to this point.

* * *

A poor ranking in the Failed States Index means that a country is experiencing significant pressures in multiple aspects of its social, economic and/or political make-up. Take China and India, for example. The two countries’ economies continue to grow and both nations continue to advance. Yet, they are ranked 72nd and 76th respectively, worse than Libya or Belarus. It is important to understand that although China and India may have significant state capacity, they nevertheless face serious pressures socially, economically and politically, pressures far greater than those experienced by countries such as Botswana or Gabon, both of which rank better.

It is also important to understand that pressures can vary regionally within a state, and so a country’s score can under-represent the pressures experienced in one part of the country while over-representing the pressures experienced in another. For example, Panama City enjoys a very safe and stable environment. Meanwhile, rural areas such as Colon or the Darien Gap experience significant poverty, crime and demographic pressures. Sitting in a glistening office tower in Panama City, it can seem as if Panama is the picture of health. And in many ways, it is. But for anyone in a rural community in Panama, the environment in Panama City may seem a world apart.

The same can be said for neighboring Colombia, which has ranked as high as 27th in the Failed States Index and currently sits at 44th. For anyone in the city of Bogota or on the beaches of Cartagena, the violence of the insurgency in other parts of the country can seem almost unthinkable. The Failed States Index therefore, by its nature, represents what can be described as an average picture for a country as a whole, and it should be understood that some parts of a country may experience either significantly greater or significantly lesser pressure than the overall country score may suggest.

It is also important not to conclude that developed countries are somehow immune from pressure. Indeed, the Failed States Index demonstrates that every country experiences its own unique set of pressures. Even a stable, advanced Western European nation can experience social pressures, such as the integration of refugees into society; economic pressures, as we have seen in Portugal, Ireland, Greece and Spain; or political pressures, as in Belgium.

* * *

We have no doubt that this, the seventh Failed States Index, will attract criticism just as the six before it did. Of course, some of this criticism will be valid. No index should ever pretend that it is infallible. We recognize that the way the twelve indicators interact, and how they affect the risk of state failure, may differ from one country to another. It is critical that the scores of any country be understood in the context of that country and the experience on the ground.

Some will criticize the Index as having a western bias. Though, as you can no doubt hear from my accent, we have at least managed to overcome an American bias. I would emphasize that although the Index is assembled in Washington, D.C., our methodology is specifically designed to minimize western bias, in that the content we analyze comes from all corners of the globe. Importantly, the data is normalized to control for uneven media coverage by country and indicator in the global news-scape.
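
As an illustrative aside, the snippet below sketches one simple way such a normalization might be expressed: indicator mentions are measured per 1,000 articles about a country, so that heavily and lightly covered countries can be compared on the same footing. This is an assumed illustration of the general idea, not the actual CAST normalization.

```python
# Illustrative sketch of one way such a normalization could work: measure how
# often an indicator is flagged relative to a country's total coverage, rather
# than as a raw count. This is a general illustration of the idea, not the
# actual CAST normalization.
def normalized_salience(indicator_hits: int, total_country_articles: int) -> float:
    """Return indicator hits per 1,000 articles about the country."""
    if total_country_articles == 0:
        raise ValueError("No coverage available for this country.")
    return 1000.0 * indicator_hits / total_country_articles


# A heavily covered country and a lightly covered one can show the same
# underlying salience once raw counts are put on a per-1,000-article basis.
print(normalized_salience(4_000, 2_000_000))  # prints 2.0
print(normalized_salience(20, 10_000))        # prints 2.0
```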

Like any index, the temptation is to see who is doing better than whom. During this presentation you have heard, and will continue to hear, references to where countries rank on the Index. An index is by its very nature competitive. After all, no one wants to be ranked poorly.

The Index is informative in quantifying the pressures upon states relative to each other, and for nations that score poorly, it can be crucial in bringing attention to countries that do not otherwise enjoy much attention. Almost all of us probably knew before today that Somalia is in bad shape, irrespective of having seen its ranking on the Failed States Index. Similarly, the immense resources of the United States currently deployed in Iraq and Afghanistan mean that those countries and the pressures therein are constantly capturing the public’s attention, especially here in Washington. But can the same be said for a country like the Central African Republic, ranked 8th? Or Guinea, at 11th? Or Niger, at 15th? We believe that the ranking of the Failed States Index is able to draw attention to countries that face significant challenges.

We believe that the Failed States Index represents a dispassionate assessment of the pressures upon countries. The assessment, as conducted through the CAST system, with its search terms and algorithms, removes much of the human subjective element. We live in a world of 24-hour news channels and ever-shortening news cycles. It is easy to assume that the most significant pressures experienced around the world are those discussed at the top of the hour. However, assessments like the Failed States Index demonstrate this is not always the case.

Certainly, overall rankings are interesting to analyze and debate. But the real value of the Index comes in the horizontal analysis of the twelve indicators. And, to properly understand a country’s trend over time, it is critical to view its performance on each individual indicator. Every country has areas of weakness. The Index provides a quick overview of where those areas of weakness are, so that all stakeholders can work together to address them.

* * *

Now, back to the highlights of the Failed States Index 2011.

Right off the bat, I do want to address what in many ways is the 800-pound gorilla in the room. I am sure many of you have looked at the Failed States Index book, or the rankings on our web site, and have seen the rankings for Tunisia, Egypt, Libya, Syria and Bahrain. I am also sure that many of you, after having seen these rankings, have thought, “how is it that a country like Libya, now in a state of civil war, can rank as well as 111th?”

The answer, in part, has to do with timing. The Failed States Index uses a strict time period for its data sample, from January 1st to December 31st. As all of you will remember, the Arab Spring really began when Mohamed Bouazizi set himself alight in Tunisia on December 17th. The protests in Tahrir Square in Cairo did not begin until late January. Upheavals in Benghazi did not start until mid-February. Therefore, nearly the entirety of the Arab Spring fell outside the analysis of the 2011 Failed States Index. Perhaps we could have extended the sample period by a few months to include the Arab Spring, but it was felt that doing so would risk fundamentally undermining our ability to measure year-on-year trends accurately in future years.

The Failed States Index is thus a snapshot in time. But in viewing multiple snapshots over time, one can begin to observe trends. Although the Arab Spring did not impact the current rankings (with the exception of Tunisia), the individual indicator scores for demographic pressures, human rights, state legitimacy and group grievance were beginning to slide in a number of the affected countries. Of course, the Failed States Index did not predict the Arab Spring, nor is it intended to predict such upheavals. But by digging deeper into the specific indicator analysis, it was possible to observe the growing tensions in those countries. Though prediction of this sort is not the intention of the Failed States Index, it is important to emphasize the early warning potential of close examination of specific indicators and sub-indicators of conflict.

* * *

It is unfortunate to observe that Somalia continues to sit atop the Failed States Index. It is equally unfortunate to observe that Chad, Sudan, Zimbabwe and D.R. Congo continue to rank in the worst six positions, as they did in 2010. The good news, however, is that although the rankings of these countries changed little, their actual scores improved ever so slightly. This does not mean that the political risk in these countries is much diminished. But it does mean that the situation does not appear to have deteriorated over the previous year.

Indeed, the same can be said for many countries this year. Of 177 countries, 115 of them saw their overall scores improve by 0.3 or more. This means that 65% of countries experienced some level of overall improvement. At the other end, only 36 countries, or 20%, experienced a worsening of similar magnitude.

Most significant of those worsening countries this year was Haiti, which shot up the rankings largely as a result of the devastating earthquake in January 2010. The Caribbean nation, previously ranked 11th in the 2010 Failed States Index, now finds itself in 5th position, after having experienced the immense pressures of widespread death and destruction, population displacement and collapsed infrastructure, compounded by a lack of capacity within the government to respond adequately and extremely high levels of foreign intervention.

But for anyone who may look at the scores of a country facing significant pressures, and may fret that the scores are not improving “fast enough”, it is worth exploring an important concept that is very clearly demonstrated by the Failed States Index. It is, as my colleague Nate Haken describes it, the Humpty Dumpty principle. When countries deteriorate, they can often do so very, very quickly. Witness Cote d’Ivoire, for example. Once considered one of the most stable nations in West Africa, if not the most stable, its fortunes turned rapidly. Indeed, in the inaugural Failed States Index in 2005, Cote d’Ivoire ranked first, where Somalia ranks today.

When Humpty Dumpty falls off his wall, it is very difficult to put him back together again – even despite the efforts of all the king’s horses and all the king’s men, as the nursery rhyme would tell us. Countries can recover. We have seen this with countries like Liberia and Cambodia. But the process of putting everything back together again is often long and arduous.

Of course, Haiti’s worsening was by far the most significant of 2011. But it is worth noting the effect of other natural disasters on other – even advanced – nations. Last year was the second worst year for damage and fatalities from natural disasters since 1980. Earthquakes in Chile and New Zealand saw both countries’ scores slide. Similarly, deadly floods in Benin and droughts leading to widespread starvation in Niger have also led to worsening scores for those nations. Natural disasters also demonstrate how a significant pressure can affect a country across many different aspects of life and can thus be detected across multiple indicators. An earthquake, for example, represents a demographic pressure. But it can also lead to population displacement, it can cause serious disruption to the economy, and it can destroy or otherwise severely test public services. A massive disaster can also lead to high levels of external, foreign, intervention. Second-order effects may include competition for newly scarce resources, leading to group grievance issues; resultant law and order problems can lead to human rights and security issues. This underscores two things: firstly, the seriousness of natural disasters and their effect in severely increasing pressures upon the state; and secondly, the interrelatedness of the indicators within the framework upon which the Failed States Index is based.

At the other end of the Index, Norway has been displaced from the best position for the first time in the history of the Failed States Index, to be replaced by its Nordic neighbor, Finland. But before mass depression sets in on the streets of Oslo, it is important to note that this really represents little more than a rearrangement of the desks in a gifted and talented classroom. Ever since the expansion of the Failed States Index to include the entire globe, Norway, Finland and Sweden have together held a monopoly on the best three positions on the Index.

The best-performing end of the Failed States Index is occupied almost exclusively by Western European nations, the exceptions being Australia, New Zealand, the United States, Canada and Japan, which round out the best 20. However, some of the worst slides this year were recorded in Western Europe as the economic crisis began to impact countries such as Ireland and Greece. And this is where examining trends is just as important as looking at the raw scores and rankings. Though Ireland has consistently scored in the best 10, it is worth noting that its score has worsened every year in the Failed States Index. It has not improved once. Spain and Greece, both perennially around the best 30, have similarly seen their scores worsen every year but one.

The steady decline of some nations that have seemingly continuously occupied the best rankings is occurring at the same time that other nations continue to improve. Notably, the Latin American nations Uruguay and Chile are making significant headway into the best 30, along with several former Soviet bloc countries, such as Estonia, Hungary, Poland and the Czech Republic, which rank either in the best 30 or just outside it.

The most improved nation this year was Georgia, which has recovered somewhat from a significant decline a few years ago, caused in no small part by the conflict with Russia over Abkhazia and South Ossetia. Renewed stability, along with reforms to the security forces and a crackdown on corruption, has aided Georgia’s recovery. Though the nation still ranks relatively poorly at 47th, it has already improved significantly from its ranking of 37th last year.

* * *

Every nation experiences pressures to some degree, and that is what the Failed States Index measures. But as we can see in many instances, the success of a state emanates from its ability to adequately handle those pressures.

We trust that the Failed States Index continues to draw attention to the significant pressures faced by many countries around the world, and will provide yet another tool for policy-makers, international organizations and business leaders to make the best possible decisions in addressing the pressures that challenge weak and failing states.