By Christopher H. Wheeler
Within any city or metropolitan area in the United States, there are vast differences in the economic well-being of individuals residing in different neighborhoods. Some areas tend to be populated by individuals with high incomes and large stocks of wealth; others, by those with substantially lower incomes and fewer assets.
Differences in neighborhood-level economic outcomes can also be seen in the incidence of unemployment, which can vary substantially from one residential area to another. Among the block groups (i.e., neighborhoods consisting of approximately 500 households and 0.33 square miles of land) located within the St. Louis metropolitan area, for instance, the unemployment rate in the year 2000 ranged from 0.3 percent to more than 98 percent.
While it is hardly surprising that unemployment rates differ across neighborhoods within a metropolitan area, the variation in neighborhood-level unemployment rose strikingly across the country between 1980 and 2000.
During this period, rates of joblessness among block groups with the lowest levels of unemployment dropped even further, whereas rates of unemployment among neighborhoods with the highest levels of joblessness grew even larger. In other words, the unemployed within the nation's metropolitan areas became increasingly concentrated within relatively few residential areas between 1980 and 2000.
Why did this occur? Three possible explanations are: urban decentralization (i.e., the movement of individuals from dense city cores into less dense suburban fringes), industrial and institutional changes in the labor market, and increases in the sorting of individuals across neighborhoods by income and education.
Based on data from the decennial U.S. Census covering more than 165,000 block groups across 361 metropolitan areas, it is apparent that, between 1980 and 2000, unemployment became less evenly distributed across the nation's residential areas. For example, in 1980, the median unemployed worker lived in a block group with an unemployment rate of 7.5 percent. That is, the unemployment rate within a worker's own block group of residence was 7.5 percent or greater for at least 50 percent of all unemployed workers. Two decades later, this worker lived in a block group with an unemployment rate of 7.9 percent. This trend is particularly striking because the average metropolitan area unemployment rate declined from 6.9 percent to 5.9 percent during this period.
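The statistic above, the unemployment rate faced by the median unemployed worker, weights each block group by its number of unemployed residents rather than its total population. A minimal sketch of the calculation, using hypothetical block-group figures, is:

```python
# Hypothetical block groups: (unemployment rate, labor force size).
block_groups = [
    (0.02, 800), (0.05, 700), (0.08, 600),
    (0.12, 500), (0.30, 400),
]

def median_unemployed_workers_rate(groups):
    """Return the unemployment rate of the block group housing the
    median unemployed worker: sort groups by rate, then accumulate
    counts of unemployed residents until half the total is reached."""
    weighted = sorted((rate, rate * labor_force) for rate, labor_force in groups)
    total_unemployed = sum(n for _, n in weighted)
    cumulative = 0.0
    for rate, n in weighted:
        cumulative += n
        if cumulative >= total_unemployed / 2:
            return rate
    return weighted[-1][0]

print(median_unemployed_workers_rate(block_groups))  # → 0.12
```

Because high-unemployment block groups contain disproportionately many unemployed workers, this person-weighted median typically exceeds the unweighted median block-group rate.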
Neighborhood-level percentile differences reveal a qualitatively similar pattern. In 1980, the average difference between the neighborhood at the 90th percentile of the unemployment distribution (i.e., the unemployment rate that is larger than 90 percent of the block-group level unemployment rates within a metropolitan area) and the neighborhood at the 10th percentile was 7.3 percentage points. Two decades later, the difference was 11.2 percentage points. As noted previously, the rise in this gap is the result of a simultaneous increase in unemployment among block groups with already high levels of unemployment and a decrease in unemployment among block groups with already low levels. The average 90th percentile increased from 11 percent in 1980 to 12.5 percent in 2000. The average 10th percentile decreased from 3.7 percent in 1980 to 1.3 percent in 2000.
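The 90-10 gap described here can be computed for any metro area from its list of block-group unemployment rates. A short sketch, using hypothetical rates and a simple linear-interpolation percentile:

```python
def percentile(values, p):
    """Percentile of `values` at p (0-100), with linear interpolation
    between adjacent order statistics."""
    vals = sorted(values)
    k = (len(vals) - 1) * p / 100
    lo, hi = int(k), min(int(k) + 1, len(vals) - 1)
    return vals[lo] + (vals[hi] - vals[lo]) * (k - lo)

# Hypothetical block-group unemployment rates (percent) for one metro area.
rates = [1.0, 2.0, 3.5, 4.0, 5.5, 6.0, 7.5, 9.0, 12.0, 15.0]

# The 90-10 gap: how much worse the high-unemployment neighborhoods
# fare relative to the low-unemployment ones.
gap = percentile(rates, 90) - percentile(rates, 10)
print(round(gap, 2))  # → 10.4
```

A widening gap can come from either tail: as the article notes, both tails moved between 1980 and 2000, with the 90th percentile rising and the 10th percentile falling.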
By and large, these national trends were also observed within the metropolitan areas of the Eighth Federal Reserve District. Consider, for example, the experiences of Little Rock, Louisville, Memphis and St. Louis. In 1980, the gap between the 90th and 10th percentiles of the block group unemployment distribution in Little Rock stood at 7.5 percentage points. By 2000, this figure had widened to 11.9 percentage points. Memphis and St. Louis saw even larger increases in unemployment differences between neighborhoods. Between 1980 and 2000, the 90-10 gap rose from 11.4 to 15.1 percentage points in St. Louis, and expanded from 13 to 17.1 percentage points in Memphis. Although the increase was modest by comparison, Louisville also experienced a rise in its unemployment concentration. Its 90-10 difference rose from 10.3 percentage points in 1980 to 10.5 percentage points two decades later.
Urban Decentralization
One of the most prominent theories in urban economics over the past half century suggests that the movement of population and employment away from city centers toward suburban locales has created an underclass of unemployed workers in central cities. This idea, known widely as the spatial mismatch hypothesis, was first studied by the economist John Kain.
The basic rationale behind this theory is straightforward. As city populations and employers move away from traditional central business districts, it becomes more difficult for workers who choose to remain in those central cities to find and secure jobs. Increased spatial isolation from employment opportunities presumably increases commuting costs and makes the job search process more difficult. In addition, increased distance may limit access to information about available jobs or create negative attitudes about central city workers among employers. Thus, as employers move farther away, it becomes less likely that the residents of historical city centers will be able to locate and maintain a job.
Urban populations in the United States, of course, began moving from central cities to suburban locales more than a century ago and have continued to do so in recent decades. For example, population density, which measures the extent to which residents within a city are concentrated or spread out, fell across the metropolitan areas in this sample from 3,080 residents per square mile in 1980 to 3,004 in 2000.
Industrial and Institutional Change in the Labor Market
The last several decades have been characterized by decreasing employment in certain sectors, but increasing employment in others. Most notably, manufacturing employment has decreased while service employment has increased. In addition, rates of unionization have fallen substantially.
Between 1980 and 2000, the average share of manufacturing in total employment declined from 22 percent to 14 percent across the 361 metropolitan areas in this study's sample, whereas the fraction of workers employed in education and health services rose from 17 percent to 20 percent. Rates of unionization decreased from an average of 24 percent in 1980 to 14 percent in 2000.
How might these changes influence the geographic distribution of unemployment within a metropolitan area? If workers in certain neighborhoods tend to be employed in similar types of industries, or if unionization is relatively concentrated among the residents of certain neighborhoods, these changes may have produced differential rates of unemployment across different areas within a city. In other words, rather than any change in how residents of a metropolitan area sort themselves across neighborhoods (e.g., into areas populated primarily by either high-skill or low-skill workers), it may simply be that changes in the labor market have affected workers in different neighborhoods differently.
Segregation by Income, Education
The rise in the concentration of unemployment may, on the other hand, be the product of greater segregation of individuals by income and education. If the manner by which individuals sort themselves into residential areas has created neighborhoods with concentrations of either high- or low-skill individuals, we should see increasing disparity between the unemployment rates of different neighborhoods. Low-skill individuals, after all, tend to experience higher rates of unemployment than high-skill individuals.
On the surface, this explanation seems related to the urban decentralization hypothesis sketched above. In fact, previous work has suggested that as city populations spread out, households become increasingly sorted into high- and low-income neighborhoods. Recent research, however, has found little association between the extent to which urban populations spread out and the income differentials they exhibit across block groups.
In general, there was a rise in income variation across block groups in the urban areas of the country between 1980 and 2000. On average, the variance of block-group level household income nearly doubled during this period. Additionally, college graduates became increasingly segregated from individuals with less schooling, suggesting that, in recent decades, the highly educated have sought neighborhoods populated primarily by other highly educated individuals.
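The segregation measure referenced here, the variance of block-group household income within a metro area, can be sketched as follows. The income figures are hypothetical (in thousands of dollars) and are chosen only to illustrate how sharper sorting shows up as a larger variance:

```python
from statistics import pvariance

# Hypothetical block-group mean household incomes ($000s) in one metro area.
incomes_1980 = [30, 35, 40, 45, 50]   # incomes fairly similar across neighborhoods
incomes_2000 = [20, 30, 40, 55, 75]   # sharper sorting into rich and poor areas

# A rise in this variance means block groups became more unequal in income,
# i.e., households sorted more strongly by income across neighborhoods.
print(pvariance(incomes_1980))
print(pvariance(incomes_2000))
```

Note that the overall income distribution of the metro area could be unchanged; the variance across block groups rises simply because similar households cluster together.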
Results from the statistical analysis of these patterns indicate that, of these three possible explanations, rising segregation of individuals by income and education is the most likely culprit. After controlling for a number of characteristics that may influence the residential distribution of unemployment, including the basic demographic makeup of each metropolitan area, the findings indicate that there is essentially no correlation between rising unemployment concentration and any of the following three quantities: population density (a measure of urban decentralization), industrial composition of a metropolitan area and extent of unionization among the local workforce. In contrast, there is a significantly positive association between unemployment concentration and the extent to which neighborhoods are segregated by income and educational attainment.
Why should the rise in the concentration of unemployment within relatively few residential areas concern us? The answer, quite simply, relates to the idea that we are all influenced by our immediate surroundings. For decades, economists and sociologists have argued that the characteristics of an individual's residential area greatly influence his or her economic outcomes. The evidence largely supports this notion.
Economists Anne Case and Lawrence Katz, for instance, have found evidence of strong peer effects characterizing a variety of behaviors, including criminal activity, drug and alcohol use, schooling, and employment status within a sample of residential areas in Boston. Giorgio Topa, an economist at the Federal Reserve Bank of New York, has found evidence of local spillovers in unemployment across neighborhoods in Chicago. High levels of unemployment within a residential area tend to have a negative influence on the employment prospects of individuals residing within or near that neighborhood.
According to William Julius Wilson, an influential sociologist and scholar of urban poverty, neighborhood effects of this sort formed the basis of the rise in inner-city poverty in the United States in recent decades. As successful workers have gradually left inner cities, those who remain are surrounded by rising levels of poverty and joblessness, making it increasingly unlikely that the residents of these areas will find work.
The rise in the concentration of unemployment, therefore, may be creating poverty traps from which people will find it increasingly difficult to escape.