Thursday, March 12, 2015

LA has zoned itself out of the ability to house its residents (h/t Matthew Glesne)

Once upon a time, the zoning in Los Angeles would have allowed for 10 million residents to live within its municipal boundaries.  Greg Morrow, in his UCLA dissertation, "Homeowner Revolution: Democracy, Land Use and the Los Angeles Slow Growth Movement 1965-1992," documents how this was eroded over time:


So LA really did create a moat around itself and pulled up the drawbridge.  For those of us who think the blessings of cities should be shared widely, this is a shame.


Thursday, February 26, 2015

It is hard to feel urban form sometimes.

I have spent a fair amount of time in Sao Paulo over the past 3-4 years, and always thought it sprawled more than LA, because it takes forever to get from one side of the place to the other.  So I was surprised when I went to Google Earth and looked at both of them from the same elevation.

Here is LA:


Now here is SP:



Sao Paulo is far more compact.  Metro LA has about 18 million people; SP has about 20 million.  But it takes about 2 hours to get from Santa Clarita in the west to San Bernardino in the east--the distance between the two is 85 miles--while it can take four hours to go just 30 kilometers in SP.  Sao Paulo feels much larger to me.



Wednesday, February 18, 2015

Should Finance Departments Pay Pigou Taxes?

 
The purpose of this paper is to examine why financial sector growth harms real growth. We begin by constructing a model in which financial and real growth interact, and then turn to empirical evidence. In our model, we first show how an exogenous increase in financial sector growth can reduce total factor productivity growth. This is a consequence of the fact that financial sector growth disproportionately benefits high collateral/low productivity projects. This mechanism reflects the fact that periods of high financial sector growth often coincide with strong development in sectors like construction, where returns on projects are relatively easy to pledge as collateral but productivity (growth) is relatively low.
 Next, we introduce skilled workers who can be hired either by financiers to improve their ability to lend, increasing financial sector growth, or by entrepreneurs to improve their returns (albeit at the cost of lower pledgeability). We then show that when skilled workers work in one sector it generates a negative externality on the other sector. The externality works as follows: financiers who hire skilled workers can lend more to entrepreneurs than those who do not. With more abundant and cheaper funding, entrepreneurs have an incentive to invest in projects with higher pledgeability but lower productivity, reducing their demand for skilled labour. Conversely, entrepreneurs who hire skilled workers invest in high return/low pledgeability projects. As a result, financiers have no incentive to hire skilled workers because the benefit in terms of increased ability to lend is limited since entrepreneurs’ projects feature low pledgeability. This negative externality can lead to multiple equilibria. In the equilibrium where financiers employ the skilled workers, so that the financial sector grows more rapidly, total factor productivity growth is lower than it would be had agents coordinated on the equilibrium where entrepreneurs attract the skilled labour. Looking at welfare, we are able to show that, relative to the social optimum, financial booms in which skilled labour work for the financial sector, are sub-optimal when the bargaining power of financiers is sufficiently large. 
Maybe the lesson is that finance departments should subsidize physics/chemistry/engineering departments.


Sunday, February 15, 2015

One reason to worry about US inequality...it is really bad for our babies.

My colleague Alice Chen, along with Emily Oster and Heidi Williams, has a new paper that explains differences in infant mortality rates between the United States and other OECD countries.  Despite its affluence, the US ranks 51st in the world in infant mortality, which puts it at the same level as Croatia.

One reason the US performs poorly on the infant mortality measure is a difference in measurement between it and other countries--babies born very prematurely in the United States are recorded as live births, but in other countries might be reported as miscarriages.  Because extremely premature babies have higher mortality rates, their inclusion in US birth and mortality statistics makes the US look relatively worse.

Nevertheless, when Chen, Oster and Williams control for reporting differences, and focus on microdata from the US, Austria and Finland, they find that the US continues to lag the others in terms of first-year survival.  What is particularly interesting is that the gap between the US and the other countries widens over the course of the first year of life--as neonatal threats recede, the position of the US worsens relative to Austria and Finland.

Here is where inequality comes in--when Chen and co-authors look at children born to advantaged individuals (meaning married, college-educated and white) in the US, those children survive at the same rates as their counterparts in Austria and Finland.  But the trio find that children of disadvantaged parents in the US have much lower survival rates than children of disadvantaged parents in the other countries.  This may well be because Europe's safety nets make the disadvantaged less disadvantaged.

(Dylan Matthews blogs on this paper also).

Thursday, January 29, 2015

No comment necessary

The Violence Policy Center put out a press release this morning relating gun ownership rates to gun death rates.  I wrote to them asking for the complete data, and plotted it.  Here is the plot.



In case you're interested, the bivariate regression's R-squared is .6.
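
For anyone who wants to reproduce the exercise, here is a minimal sketch in Python; the numbers in it are made-up placeholders, not the Violence Policy Center's figures.

    # Minimal sketch: scatter state gun ownership rates against gun death rates
    # and fit a bivariate OLS line.  The arrays below are illustrative
    # placeholders; swap in the actual state-level data to reproduce the plot.
    import numpy as np
    import statsmodels.api as sm
    import matplotlib.pyplot as plt

    ownership = np.array([10.0, 20.0, 30.0, 40.0, 50.0, 60.0])  # percent of households (made up)
    deaths = np.array([4.0, 7.0, 9.0, 13.0, 15.0, 19.0])        # gun deaths per 100,000 (made up)

    fit = sm.OLS(deaths, sm.add_constant(ownership)).fit()
    print(fit.rsquared)  # with the real data, this came out to roughly 0.6

    plt.scatter(ownership, deaths)
    plt.plot(ownership, fit.fittedvalues)
    plt.xlabel("Household gun ownership rate (%)")
    plt.ylabel("Gun deaths per 100,000")
    plt.show()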

Monday, January 26, 2015

Cities and the Environment--A first order effect?

I was reading a story about peak driving over the weekend.  In the course of reading the story, I discerned that we here in California drive far less than the average American.  In fact, California ranks 41st among the states in per capita driving:


Data are from the Insurance Institute for Highway Safety.

Given the stereotype about California (as a place where everyone drives, always), this was a surprise to me.  But then it dawned on me--when one excludes the District of Columbia (which is kind of like a state, just without representation), California is the most urbanized state in the country.  And so I drew a scatter plot of VMT per capita against urbanization by state:


The negative correlation is quite apparent.  For anyone who might be interested, here is the bivariate regression:

       mpc |      Coef.   Std. Err.      t    P>|t|     [95% Conf. Interval]
-------------+----------------------------------------------------------------
       var4 |  -81.73815    14.1223    -5.79   0.000     -110.118   -53.35832
       cons |   15994.33   1066.959    14.99   0.000      13850.2    18138.47
------------------------------------------------------------------------------

So a one percentage point increase in urbanization is associated with an 81 mile per year reduction in driving.  I think the direction of causality is not too big a problem here (it is hard to tell a story that more driving causes a reduction in urbanization).  So Matt Kahn, Ed Glaeser and Richard Florida are all right--cities are environmentally friendly!
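
For anyone who wants to replicate the regression, a minimal sketch follows; the inputs are made-up placeholders standing in for the IIHS driving data and Census urbanization shares.

    # Minimal sketch of the bivariate regression of vehicle miles traveled per
    # capita on the share of each state's population living in urban areas.
    # The arrays below are illustrative placeholders, not the actual data.
    import numpy as np
    import statsmodels.api as sm

    pct_urban = np.array([60.0, 70.0, 75.0, 85.0, 95.0])          # percent urban (made up)
    vmt_per_capita = np.array([11500, 10800, 10000, 9200, 8200])  # miles per person per year (made up)

    fit = sm.OLS(vmt_per_capita, sm.add_constant(pct_urban)).fit()
    print(fit.params)  # with the actual data, the slope was about -82 miles per percentage point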

[BTW, a little Googling led me to a paper that relates to all this].

Monday, January 19, 2015

How choosing the right discount rate matters to Max Scherzer

My student Hyojung Lee sent me to a cute article about how Max Scherzer's $210 million contract is not really a $210 million contract.  Because Scherzer is getting $15 million per year over 14 years, the present value of the contract is substantially less than $210 million; it is also worth less than a contract that pays $30 million per year over the seven years he is expected to pitch.

But Dave Cameron (the author of the piece) assigns a 7 percent discount rate to the contract.  The present value of $15 million per year over 14 years at a 7 percent discount rate is about $131 million. He chose 7 percent as the discount rate because that is the expected long run return of investing in the stock market.

A contract is not, however, like a stock.  It is more like a bond--contracts have seniority over equity, and guarantee a particular cash flow.  I would guess the Nats (unlike the Expos) are something like a BBB company.  The yield on BBB bonds is currently about 3.6 percent.  Discounting the Scherzer contract at 3.6 percent produces a present value of $163 million.  Not that $131 million is nothing, but $163 million is a lot more.
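
The annuity arithmetic is easy to check.  Here is a minimal sketch (the function name is mine, purely illustrative):

    # Present value of a level annuity: pv = pmt * (1 - (1 + r)**-n) / r,
    # with pmt in millions of dollars per year.
    def pv_annuity(pmt, r, n):
        return pmt * (1 - (1 + r) ** -n) / r

    print(pv_annuity(15, 0.070, 14))  # ~131: Cameron's 7 percent stock-market rate
    print(pv_annuity(15, 0.036, 14))  # ~163: a 3.6 percent BBB bond yield
    print(pv_annuity(15, 0.028, 14))  # ~172: a 2.8 percent AAA-ish yield (see the update below)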

[Update: Adam Levitin says that MLB teams are more like AAA (in bond rating, not playing quality, except, perhaps for the Diamondbacks last year), because all of baseball backs team contracts (when the Rangers went bankrupt, all players got paid).  That would drive the discount rate to 2.8 percent, and raise the value of the contract to $172 million.]

Tuesday, December 30, 2014

Is Houston really vulnerable to recession? {Updated answer: maybe}

So after reading Paul Krugman's prediction that Texas was vulnerable, I did two things I should have done before.

First, I looked to see whether the share of jobs in the mineral industry in Houston is any lower now than it was in 1986 (the first year for which I could easily download data).  The answer is that, if anything, Houston is slightly more reliant on the industry now.

Second, I plotted the unemployment rates for Houston and the US against real oil prices.  This is what I found:


Two things stand out: Houston's unemployment moves with the business cycle (so the stronger US economy should help it), and Houston's unemployment rate relative to the nation's fell as the real price of oil rose between 2000 and 2007.

We can summarize this in a regression:

HOUE - USUE = -1.5 - 0.83 ln(real oil price),

where HOUE is Houston's unemployment rate and USUE is the US unemployment rate.

The t-stat on the coefficient on the real oil price is 11.  What this approximately means is that a 50 percent drop in the real oil price will produce a 0.41 percentage point increase in Houston's unemployment rate relative to the nation's.  Now this is all descriptive, and is not a serious model of the region, but it nevertheless provides evidence that Houston's relative employment performance has been affected by oil prices.  Given that Houston is as reliant on energy for jobs as ever, it probably will continue to be affected as well.
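
To see where the 0.41 comes from, here is a minimal sketch of the arithmetic, using only the coefficient reported above:

    # Back-of-the-envelope using the regression in the post:
    #   (Houston UE rate - US UE rate) = -1.5 - 0.83 * ln(real oil price)
    # A 50 percent drop in the real oil price changes ln(price) by ln(0.5).
    import math

    beta = -0.83
    print(beta * -0.50)          # ~0.41: the linear approximation used in the post
    print(beta * math.log(0.5))  # ~0.58: the exact log change, for comparison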

Monday, December 29, 2014

The limits of knowledge in economics, Part V

How much does it cost you to live in your house?  If you are a renter, the answer to that question is fairly straightforward (although if your rent includes a gym membership and heat, it is not clear-cut what the simple cost of occupying your place is).

But suppose you are an owner.  What is it costing you every month or every year to live in your house?  The truth is, you don't know with a great deal of precision, and neither does anyone else.

There are two ways to look at the issue.  One is to look at something called owner's equivalent rent.  In principle, one could determine an owner's equivalent rent by offering her house for rent and seeing what it would fetch in the rental market.  Needless to say, owners don't do this very often.  Another way to calculate owner's equivalent rent is to find a perfect comparable for an owned house in the rental market, and impute the rent for the owner.  In the next episode of this series, we will discuss the problems with doing that.  It should be fairly obvious that they are large.

The alternative to owner's equivalent rent is user cost, which seeks to compute the cost of owning to those who live in their own houses.  [Pat Hendershott is the guru of user cost.  See a typical paper of his here]. The formula for the user cost of housing is

uc_t = V_t[((1 - m)r_t + m i_t)(1 - t_y) + τ_p(1 - t_y) + d - π]

where uc_t is user cost at time t, V_t is the value of the house at time t, m is the loan-to-value ratio, r_t is the opportunity cost of equity, i_t is the mortgage interest rate, t_y is the marginal income tax rate, τ_p is the property tax rate, d is depreciation and π is expected house price appreciation.  This formula calculates the after-tax cash-flow cost of owning (including the opportunity cost of equity), adds depreciation and subtracts expected appreciation.

Of all these elements (and this formula doesn't incorporate everything, but it is close enough), the only thing we know with near certainty is the marginal income tax rate: once one calculates reported taxable income, one can know the marginal tax rate, which is set by statute, with certainty.

Everything else?  Up in the air.  On a year-to-year basis, we don't know the value of our houses with certainty.  We don't know with certainty the opportunity cost of equity.  While we know the coupon rate of a mortgage, we don't always know its total cost until we extinguish it, because the fees and points (and sometimes even prepayment penalties) attached to a mortgage are amortized over time, and the life of a mortgage is generally considerably shorter than its term, as households refinance their mortgages or sell their houses.  We don't know property taxes until an assessor determines assessed value, which is usually at least a little different from market value.  Depreciation is difficult to measure.  Finally, we are pretty lousy at forecasting the values of our houses.
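
To make the formula concrete, here is a minimal sketch with entirely made-up inputs; none of these numbers describe an actual house.

    # User cost of owner-occupied housing, following the formula above.
    # Every input below is an illustrative assumption, not data.
    def user_cost(V, m, r, i, t_y, tau_p, d, pi):
        """V: house value; m: loan-to-value ratio; r: opportunity cost of equity;
        i: mortgage rate; t_y: marginal income tax rate; tau_p: property tax rate;
        d: depreciation rate; pi: expected house price appreciation."""
        return V * (((1 - m) * r + m * i) * (1 - t_y) + tau_p * (1 - t_y) + d - pi)

    # A hypothetical $500,000 house: 80 percent LTV, 5 percent opportunity cost of
    # equity, 4 percent mortgage, 25 percent marginal tax rate, 1 percent property
    # tax, 2 percent depreciation, 3 percent expected appreciation.
    print(user_cost(500_000, 0.80, 0.05, 0.04, 0.25, 0.01, 0.02, 0.03))  # about $14,500 per year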

But let's say we are good at forecasting house prices, and you think the value of your house is going to increase by $5,000 over the next year.  Is this the same as being handed a check for $5,000?  No, because you still need to live somewhere.  If your house goes up by $5,000 in value, so too does your neighbor's.  The only way to cash in on your $5,000 is to downsize.  Pocketing the $5,000 and downsizing may leave you better off, but not as well off as just having $5,000.  So the user cost formula does not exactly get user cost right.


Saturday, December 27, 2014

Is Houston really vulnerable to recession?

My inbox is filling up with dire warnings about the near-term future of Houston's economy.  After all, the price of oil has dropped by 50 percent, and we know how reliant Houston is on energy.  Except, perhaps, it is not.  Let's look at a Bureau of Labor Statistics chart that gives the composition of employment in Harris County, Texas at the end of 2013:



What the chart doesn't show is that the location quotient for natural resources and mining in Harris County is 2.78, meaning that it is almost three times more reliant on the sector than the country as a whole.  Despite this, however, only about five percent of jobs in Harris County are in that sector.  The county in which Houston sits is actually very well diversified, with 75 percent of its jobs in the service sector.  Put another way, over the past several years, Harris County has been creating more total jobs every two years than there are jobs in the entire natural resources and mining sector.

Natural resources and mining jobs do pay well, which means that any reduction in these jobs would have a multiplier effect (though we are generally terrible at estimating regional multipliers).  But clearly something is happening in Houston that makes it attractive to employers that have nothing to do with the energy sector.  My suspicion is that inexpensive housing is one of those things.

I could be completely wrong about this, but it seems to me that the decline in oil prices will more likely bring slower growth--as opposed to recession--to Harris County. 

Friday, December 26, 2014

The limits of knowledge in economics: Part IV

The workhorse model of international trade is the Heckscher-Ohlin model.  It is essentially the model that formalizes the Ricardian story of trade that students learn in principles of economics.  It demonstrates that countries export goods in which they have a comparative advantage; it moreover shows that it is comparative advantage, rather than absolute advantage, that determines patterns of trade.

In the context of the HO model, comparative advantage is defined by the relative abundance of a production factor.  Let's say country A has 10 units of labor and 10 units of capital, while country B has 8 units of labor and 4 units of capital.  Country A has an absolute advantage in both labor and capital, but B has a comparative advantage in labor, because its labor-to-capital ratio (i.e., 2) is higher than country A's (1).

The HO model thus predicts that country A will export to B a good that needs relatively more capital for production, and that country B will export to A a good that needs relatively more labor for production.
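
A tiny sketch of the factor-abundance logic, using the endowments above (the function is mine, purely illustrative):

    # In the 2x2x2 Heckscher-Ohlin setup, the country with the higher
    # labor-to-capital ratio is relatively labor abundant and is predicted
    # to export the labor-intensive good.
    def predicted_exports(labor_a, capital_a, labor_b, capital_b):
        if labor_a / capital_a > labor_b / capital_b:
            return "A exports the labor-intensive good; B exports the capital-intensive good"
        return "A exports the capital-intensive good; B exports the labor-intensive good"

    # Country A: 10 labor, 10 capital (ratio 1).  Country B: 8 labor, 4 capital (ratio 2).
    print(predicted_exports(10, 10, 8, 4))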

Everything works beautifully in a world with two countries, two goods and two factors.  But the world is nothing like that--it has many more countries, goods and factors than 2.  Is this a big deal?

The whole point of economic modeling is to isolate the impact of a particular phenomenon, ceteris paribus.  Ceteris paribus is a favorite phrase in the economist's lexicon, and means "all other things being equal or held constant."  Sometimes asserting ceteris paribus is innocuous; often it is not.

A small change in the HO model creates serious problems.  As Alan Deardorff showed, if the number of goods is greater than the number of factors of production, patterns of trade become indeterminate.  The mechanics of the problem are simple: when the number of factors equals the number of goods, solving the pattern-of-trade problem involves equal numbers of equations and unknowns.  When there are more goods than factors, this equality disappears, and we become unable to determine what is produced where.  Allowing more goods than factors is not a trivial change to the model--it has an enormous impact on the analytical outcome.  It is also a change that better reflects the reality of the world.

This is the fundamental problem with HO that is easiest to explain.  Deardorff takes us through more sophisticated arguments that show other problems with the predictions of the HO model.  He expresses concerns that the model:


(1) implies fractions of goods produced or trade routes utilized that are (unrealistically?) low;
(2) has a solution that is hypersensitive to [trade costs].

So the fact is we really don't have a theoretical model that predicts patterns of trade well.  Yet for years we made lots of policy decisions based on a model that has lots of limitations.

[Update in response to comment: Krugman and Helpman are great at reconciling how intra-industry trade happens and why countries with similar factors trade with each other.  But the problems outlined by Deardorff about developing a robust general equilibrium model that predicts patterns of trade remain.]




    


Wednesday, December 24, 2014

The limits of knowledge in economics, Part III.

Economists think a lot about preferences, and are more interested in what people do than in what people say they will do.  When people make work-leisure decisions, consumption decisions, and investment decisions, they are revealing their preferences, and we economists attempt to gather information about the broader economy based on these revelations.

Drawing inferences from revealed preferences can work if individual preferences meet four seemingly simple assumptions (Hal Varian's Microeconomic Analysis provides the clearest exposition of micro theory that I know of.  There is also a nice discussion here).

(1) Preferences are complete.  This simply says that if I am faced with a choice of two consumption bundles, I can always say that one is at least as good as the other.

(2) Preferences are reflexive.  This simply says that any bundle is always as good as itself.

(3) Preferences are transitive.  This simply says that if bundle A is better than bundle B, and bundle B is better than bundle C, then bundle A is better than bundle C.

(4) Preferences are strongly monotonic.  This simply says I never prefer less to more.

These may seem like mild assumptions, and they would be, except that people change their minds.

Perhaps there is a restaurant you go to on a regular basis.  Its menu stays the same, and the prices stay the same over a reasonable length of time, so your choice set remains constant.  Yet this week you might have the beef burger, next week the veggie-burger, and the following week the Cajun chicken sandwich.  In doing so, you have violated assumptions (1) and (3).

Except that, in a sense, you haven't.  Suppose an element of the choice set is the time at which you eat your sandwich.  Under these circumstances, the menu does change because it is offered at three different times.  So we can preserve our theoretical assumptions.

But now if we want to estimate preferences with precision, we have a difficult problem, because we have to estimate in far more dimensions than the data can support.  So anything we infer about preferences will necessarily be an approximation.

This is not to say that the generalized axiom of revealed preference isn't a powerful tool for learning things about the economy: Varian's lecture on the subject makes a pretty compelling case that it is.  But it will always be a powerful, imprecise tool.
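
To make the idea of revealed preference concrete, here is a minimal sketch of a consistency check on two price/quantity observations; it implements the pairwise weak axiom rather than the full generalized axiom, and all the numbers are made up.

    # Weak-axiom check: if bundle x1 was chosen when x2 was affordable, then x2
    # should not have been chosen when x1 was affordable (unless the bundles are
    # identical).  Prices and quantities below are purely illustrative.
    import numpy as np

    def warp_violation(p1, x1, p2, x2):
        p1, x1, p2, x2 = map(np.asarray, (p1, x1, p2, x2))
        x2_affordable_at_1 = p1 @ x2 <= p1 @ x1  # x2 was affordable when x1 was chosen
        x1_affordable_at_2 = p2 @ x1 <= p2 @ x2  # x1 was affordable when x2 was chosen
        return bool(x2_affordable_at_1 and x1_affordable_at_2 and not np.array_equal(x1, x2))

    # Two observations on two goods: each bundle was affordable when the other
    # was chosen, so the choices are inconsistent and the check returns True.
    print(warp_violation(p1=[1, 2], x1=[2, 2], p2=[2, 1], x2=[4, 1]))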


Tuesday, December 23, 2014

The limits of knowledge in economics, Part II

Everyone--and I mean everyone--who does empirical analysis should read Charles Manski's Identification Problems in the Social Sciences (I am happy to say I took advanced econometrics from Manski when I was a Ph.D. student at Wisconsin).  The book reminds us that we are constantly relying on unstated assumptions when we do statistical analysis, and we need to do a better job of stating them.

The focus of this post involves a simple issue: extrapolation.  Let me show two graphs from Manski's chapter:


Suppose one wanted to infer y based on x.  Obviously, as the sample size gets larger, the confidence interval gets smaller for the values of x that we are able to observe.  Note, however, that in this instance no x between 4 and 6 is sampled.

Most empirical analysis would simply assume that E(y|x=5) is some smooth function that gives weights to E(y|x=4) and E(y|x=6).  Put in English, one would just draw some sort of line between the x,y relation at x=4 and the x,y relation at x=6, and read off an x,y relation for x=5.

But doing this involves an important assumption: that y doesn't go flying off in one direction or another at x=5.  We actually cannot know this, because we have no observations at x=5; indeed, maybe the reason we never observe x=5 is that y is highly unstable at that point.  Just as problematic (perhaps more so) is predicting y when x > 9.
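
Here is a minimal sketch of the point with simulated data; everything in it is made up, and the smoothness of the predictions is imposed by the fitted line, not learned from the missing region.

    # Fit a line on data with no observations between x = 4 and x = 6, then
    # predict at x = 5 (inside the gap) and x = 12 (beyond the observed support).
    # Both predictions rest on the assumption that E(y|x) stays smooth where we
    # have no data.  All data here are simulated, purely for illustration.
    import numpy as np

    rng = np.random.default_rng(0)
    x = np.concatenate([np.linspace(0, 4, 40), np.linspace(6, 9, 30)])  # gap between 4 and 6
    y = 2.0 * x + rng.normal(0, 1, size=x.size)

    slope, intercept = np.polyfit(x, y, 1)
    print(intercept + slope * 5)    # interpolation into the unobserved gap
    print(intercept + slope * 12)   # extrapolation beyond the observed support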

I am pretty sure that it is hard to go a day without reading something that involves someone extrapolating outside the support of observed data.  Sometimes it is necessary to do this, but when we do, we should always say so.


Monday, December 22, 2014

The limits of knowledge in economics, Part I.

I rather like this paragraph from Ed Leamer's Journal of Economic Perspectives piece on the utility and limits of econometrics:

Finally, I think that Angrist and Pischke are way too optimistic about the
prospects for an experimental approach to macroeconomics.  Our understanding
of causal effects in macroeconomics is virtually nil, and will remain so.  Don’t we
know that?  Though many members of our profession have jumped up to support
the $787 billion stimulus program in 2009 as if they knew that was an appropriate
response to the Panic of 2008, the intellectual basis for that opinion is very thin,
especially if you take a close look at how that stimulus bill was written.  
The economists who coined the DSGE acronym combined in three terms the
things economists least understand:  “dynamic,” standing for forward-looking decision
making; “stochastic,” standing for decisions under uncertainty and ambiguity; and “general equilibrium,” standing for the social process that coordinates and
influences the actions of all the players...that’s what we
do.  We seek patterns and tell stories [italics added].

The point is correct: among other things, we have nothing like the degrees of freedom (not exactly the same as observations, but to a lay person, close enough) necessary to identify causal relationships in the macroeconomy with anything like certainty.  We are now in our 11th business cycle since World War II, and as much as we like using high frequency data (I, for one, am guilty as charged), in an important sense that means we have only 11 observations about the post-War macroeconomy.

All that said, I did support the stimulus, because I have a Bayesian prior that Keynes was basically right.  In particular, the ideas that high unemployment can result from inadequate aggregate demand, that such unemployment is in turn an unrecoverable waste of resources, and, finally, that deficit spending can spur aggregate demand all make sense to me.

At the same time, George Akerlof lists example after example of how new classical macro-theory is rejected by the empirical evidence, while also showing that the evidence, such as it is, is consistent with Keynesian predictions.  But this is still different from confirming that Keynes was right, something we are unable to do with data.

So back to Leamer.  Elsewhere in the paper, he talks about three-valued logic: the ability to use evidence to come to a yes, a no, or an I don't know.  The honest thing to say about macroeconomics is "I don't know."  Alas, while the world is uncertain, policymakers still need to make policy decisions.


Thursday, December 04, 2014

If you think we're post-racial, read the lead article from the November 2014 American Economic Review

Alesina and La Ferrara conclude:

This paper proposes a test for racial bias in capital sentencing in the US over the period 1973-1995. We use the share of judicial errors in first degree sentencing as an indicator of racial bias of such courts. Using an originally collected dataset, we uncover a bias against minority defendants killing white victims. The bias is present, according to our test, only in Southern States. More precisely, according to our interpretation first degree courts tend to place less weight on the possibility of condemning an innocent in cases of minority defendants with one or more white victims relative to minority defendants who did not kill whites. The same does not hold for white defendants. This result is not explained by differences in observable characteristics of the crime or of the trial, nor by the ideological orientation of appeal courts.
The paper is really well done. 

Tuesday, November 04, 2014

Jung Hyun Choi and I write about Income Inequality across Cities.

We will be presenting at APPAM:

This paper investigates why the level of income inequality differs across U.S. cities. We also
explore why some cities experienced faster increases in the level of inequality than others.
Using the Decennial Census and the American Community Survey (ACS) from 1980 to 2011,
we explore whether the disparities in the level and the changes in the level of inequality can
be explained by MSA characteristics, including labor market conditions, skill distribution,
residential mobility, racial concentration, industrial composition and unionization. We also
examine how state level policies such as unemployment insurance benefits and minimum
wage levels are associated with income inequality.

Our findings show that negative labor market conditions, concentration of skilled workers
and racial segregation are positively associated with the level of income inequality. The level
of inequality in these cities also tends to grow at a faster pace. While differences in the minimum
wage level do not seem to have any association with income inequality across cities, we find some evidence that differences in unemployment insurance benefits and greater unionization lowered increases in income inequality.

Thursday, October 30, 2014

How people who can't do math get shafted.

I have a car (an Accord, if you must know) that is 17 months old.  When I bought the car, the dealer offered me a car loan at 0 percent interest for 36 months, so I took it.  Even in the world of very low discount rates, accepting the loan allowed me to get a further small effective discount on the car.

The dealer called me today, saying I could trade the car in for a new car and not increase my payment; the payment would simply reset for 36 months.  I told him I needed to do a little math before calling him back.  The math I did was as follows:

Value of Old Car from Kelley Blue Book + PV of 36 months of payments = Cost of New Car.

Cost of New Car - Edmunds Value of New Car = $6000.
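
In code, the comparison looks something like this; every number below is a made-up placeholder, and only the structure of the calculation follows the post:

    # The dealer's offer in present-value terms.  All figures are hypothetical.
    monthly_payment = 450.0      # assumed remaining payment on the 0 percent loan
    months = 36
    monthly_rate = 0.036 / 12    # an assumed 3.6 percent annual discount rate

    pv_payments = monthly_payment * (1 - (1 + monthly_rate) ** -months) / monthly_rate
    trade_in_value = 18_000.0    # assumed Kelley Blue Book value of the old car
    cost_of_new_car = trade_in_value + pv_payments

    market_value_new_car = 24_000.0  # assumed Edmunds value of the new car
    # With the author's actual figures, this difference came to about $6,000.
    print(cost_of_new_car - market_value_new_car)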

Yes, the dealer was trying to fool me into paying $6000 for...nothing.   In my particular case, he could not profit from informational asymmetry.  But for a person with average math skills in the US?  That might be a different story.  We know this selling tactic must work sometimes, because otherwise I would not have gotten the call.

The first lesson from all this is we really need to do a better job teaching math.

The second lesson is that, in the meantime, we need to protect consumers from these sorts of practices.


Sunday, September 28, 2014

How the price of a Martini reveals the property value of a city

A Hendrick's Gibson is basically a commodity (although the bartender does need to know what she is doing).  But a good Gibson is $8 at the Starlight Lounge in LaCrosse, Wisconsin; $16 at the Roof Garden at the Peninsula Hotel in Beverly Hills; and $22 at the King Cole Bar of the St. Regis Hotel in New York.
Let's say the cost of the cocktail, including labor, but exclusive of real estate, is $7. Then the implicit rent you are paying for sitting in a bar in LaCrosse is $1; in Beverly Hills on a rooftop is $9; and in NYC is $15. If one consults Zillow, one will find that this ratio of 1:9:15 for real estate in LaCrosse, BH and Manhattan is pretty close to the truth.
One key thing--all these drinks are served in competitive markets; there is true thickness in the market for bars in these cities.  And the people at the Roof Garden and the King Cole will let you sit and nurse your drink without hassling you about it.  So while having a drink in these lovely spots is very expensive, it is not a rip-off--one just has to pay the rent.
A drink at, say, Disney World or FedEx Field doesn't count, because if you want to stay at the park or the game and have a drink, you have to pay a monopoly price (in the case of the stadium of the Washington Football Team, you might even pay for a drink that is past its expiration date).
Needless to say, further research is necessary.

Saturday, September 27, 2014

How Piketty's care with language can improve economics

When I was a freshman in college, I read Fogel and Engerman's Time on the Cross.  I loathed the book, because it implicitly endorsed the idea that it is OK, "in the interest of science," to dehumanize those African-Americans who were placed in bondage by viewing them as capital (I loathed it for other reasons as well, but that is for another time).  It also contributed to the broad view, current within much of mainstream economics, that it is (1) acceptable to treat human beings as objects, and (2) embarrassing to embrace humanity.  I was embarrassed that the book helped Fogel ultimately win the Nobel Prize in economics.

I thought of Fogel and Engerman again when a recent review in the Economist of Edward Baptist's The Half Has Never Been Told complained that a book based on the perspective of slaves could not be objective.  (To the Economist's credit, it repudiated the review and apologized for allowing it to be printed, but also kept a link to it so that readers could see how misguided it was.)  Again, an allegiance to "scientific detachment" led to a bizarre view of an evil institution.  A "detached" view of slavery helps legitimize its practice, and thus is not in any way objective.

And so we come to Thomas Piketty's Capital in the 21st Century.  I have some issues with the book, but I love the first third of it.  I particularly like his treatment of "human capital:"
There are many reasons to exclude human capital from our definition of capital.  The most obvious is that human capital cannot be owned by another person or traded on a market.
The language of economics often treats people as commodities: the phrases "representative agent" and "human capital" are examples of this.   Sometimes these phrases are useful abstractions, but they also contribute to the sometimes pernicious indifference of mainstream economics to issues of justice.  Piketty's take on human capital might make us a little less indifferent.









Monday, December 23, 2013

Hannah Harris Green in The Guardian on Race, Crime and Television

She writes:

The First 48 is an A&E true crime reality show that documents real police investigations for the first 48 hours after a homicide report, including what happens inside interrogation rooms. If this sounds dangerous and ethically questionable, that's because it is. Police accidentally killed a child as A&E's cameras rolled, and a legally innocent man came to be known as a murderer after his appearance on the show. Catastrophes like these have led to lawsuits, and now many cities refuse to work with The First 48....


...This portrayal is not representative of American crime statistics. Although homicide arrests are disproportionately high among African Americans, about the same total number of white people are arrested in homicide cases as black people. The First 48's overemphasis on black crime is symptomatic of a larger disrespect for African American communities, which many Americans deem inherently suspicious.....


...Even release from jail isn't necessarily enough to erase the stigma that comes from appearing on The First 48. Tyson Mimms of Louisville, Kentucky sued A&E in 2011 for invasion of privacy and defamation. For over a year, the episode aired repeatedly with an onscreen message saying that Mimms was "awaiting trial", even though his charges were dismissed due to lack of evidence before the episode first aired.