Law schools are innovating.  But to measure progress, we need data.


We are moving headlong into a period in which law schools, considered as a whole, are working hard on innovation.  This innovation pertains to the full gamut of law school operations and activities, including:

  • The educational structure, including the curriculum in the first and in the advanced years. What should be kept? What should be lost? What should be modified?  What used to be called, opaquely, “curricular reform” now embodies a more comprehensive, and ultimately more creative, focus on how to embed modern lawyering and new perspectives on law and legal institutions into the overall structure of the law school curriculum.  Although law schools seldom start from scratch, there have been some fairly bold efforts to consider how best to align a law school’s mission of giving students a broad and deep education in law for a dynamic world with the coursework of a three-year program;
  • The economic model, including the cost of legal education to students, the mechanisms of financial aid and variegated support, the diversification of the law school’s revenue picture, and the moving pieces of fixed and variable costs – what are generally lumped (thank you, ABA/US News!) into the category of “expenditures”;
  • The configuration and functions of the faculty. What are the responsibilities of the “research” and “teaching” faculty?  What are their work expectations?  How should they be hired?
  • Student services and support. There is a world of difference between how law schools work to support students now and when I was in law school a quarter century ago, or even ten years ago. Continuing energy is going into how best to support students, from career services to wellness initiatives to inclusion strategies;
  • External engagement, running through the relationship between the law school and its alumni – endeavors which I have called “law school for life” – but also including the connections between the law school and the wider legal and business world.

And there are other aspects of law school operations that are the subject of close attention and innovative strategies.

The direction of change in the modern American law school is a positive one, and there are real reasons to be encouraged – indeed enthusiastic – about the momentum that close observers and stakeholders see in our long-conservative academy.

But yes, there is a “however,” and it is this:  Many of these strategic efforts at real innovation are taking shape without adequate data.  To a remarkable and troubling extent, law school innovation is a data desert.  We develop natural experiments, we try out initiatives that we hope will move the needle, and yet we scarcely build upon real data – not never, but not enough.

This post has a blunt polemical point and it is this:  Legal educators must develop effective data; we must overcome whatever collective action problems stand in the way of developing these data; we must take scrupulous care to analyze data in creating and implementing reform strategies; and we must, in the end, make change on the basis of evidence wherever possible, not conjecture.

Facts and data have long remained elusive in the law school ecosystem.  What particular law schools do by way of data is of course case-specific, and it would be hard to generalize.  I can say from my own experience, which includes full-time service at four diverse law schools and service as dean at two law schools for a total of nearly fourteen years, that we are fundamentally inadequate, and at times border on the functionally illiterate, when it comes to collecting, synthesizing, and analyzing data.

Consider, for example, the question of how our curriculum accomplishes the aim of preparing students for the practice of law.  We can measure performance on the bar – although here, too, we know nearly nothing about whether our particular curriculum increases bar performance in any statistically significant way – and we can draw some rough conclusions about student employability given a pattern of student coursework (so as to say, clumsily, “take lots of IP courses if you want to get a job as an IP lawyer and your overall grade-point average is mediocre”).

But what is the connection between, say, a robust clinical program and the ability of law students to perform at a high level in their first years of practice? We want to say that exposure to high-quality experiential education makes a meaningful difference, but we do not really, truly know that on the basis of serious attention to data (more than anecdote; more than the partners’ war stories).

We have available many opportunities for natural experiments.  We can even configure randomized controlled trials – the gold standard of experiments – to tell us something about the impacts of curricular shocks and reforms of various sorts.  However, if this data collection and analysis is going on to any great degree, the secret is closely held by the law schools that are doing it.  And that is one of the key problems with innovation in law schools:  When a law school collects data, it does not typically share it.
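
To make the point concrete, here is a minimal sketch – purely illustrative, with invented numbers and hypothetical variable names, not a report of any actual study – of what the simplest analysis of such a randomized experiment might look like: comparing first-time bar passage between students randomly assigned to a new experiential course and a control group.

```python
# Illustrative sketch only: a hypothetical randomized experiment in which
# students are randomly assigned to a new experiential course ("treatment")
# or the existing curriculum ("control"), with first-time bar passage as the
# outcome.  All figures below are invented placeholders, not real data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(seed=42)

# Hypothetical cohort: 200 students randomly split into two equal groups.
n_treatment, n_control = 100, 100

# Simulated outcomes (1 = passed the bar on the first attempt, 0 = did not).
# The assumed pass rates of 88% and 80% are placeholders for illustration.
treatment = rng.binomial(1, 0.88, size=n_treatment)
control = rng.binomial(1, 0.80, size=n_control)

# Difference in pass rates between the two groups.
diff = treatment.mean() - control.mean()

# Two-sample t-test on the binary outcomes (a simple first cut; a
# two-proportion z-test or logistic regression would also be reasonable).
t_stat, p_value = stats.ttest_ind(treatment, control)

print(f"Treatment pass rate: {treatment.mean():.2%}")
print(f"Control pass rate:   {control.mean():.2%}")
print(f"Difference:          {diff:+.2%}")
print(f"p-value:             {p_value:.3f}")
```

A real study would of course require far more – covariates, pre-registration, attention to attrition and sample size – but even this bare-bones comparison is beyond what most law schools can produce today, because the underlying data are not systematically collected or shared.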

The collective problem is a severe one, indeed nearly fatal when it comes to serious innovation and reform.  How can we come to know what changes to our educational, economic, and engagement structures mean for the improvement in our graduates’ performance as new lawyers or business professionals or entrepreneurs or . . . ?  Moreover, how can we come to know what innovations in legal education mean for the progress of the legal system?  So little data is available at a comprehensive, collective level to illuminate the key performance indicators of law schools generally.  The fundamental problem is the absence of data, and law schools’ failure to collect and share it.

There are a few legal educators who see the iceberg, or at least the top of it, and who are deeply committed to collecting and evaluating data.  Derek Muller at Pepperdine, Michael Simkovic at USC, Jerry Organ at St. Thomas, and Dan Linna at Northwestern are, among their other professional roles, in the data collection and analysis business (as, of course, is the editor of Legal Evolution, the indefatigable Bill Henderson).  Posts appear from time to time on law-school-focused blogs, and these contributions have greatly advanced our understanding of how law schools function and how certain changes might have meaningful impacts.  I will not name many other contributors for fear of leaving folks out, but please know that the good work being done in the blogosphere on legal innovation has moved the ball forward in many positive respects.

What will continue to vex scholars toiling away on these questions, however, is the limited amount of data made available by law schools, whether via the ABA questionnaire or through other sources.  At bottom, it is the law schools, largely under the control of their deans, that own the most important data.  Any serious effort to broaden the availability of diverse, meaningful data will require the acquiescence, if not the enthusiastic support, of a critical mass of law school deans.

So let’s aim toward a systematic effort, driven by a wide swath of stakeholders (certainly not limited to the ABA or other important legal education organizations, although efforts by these groups are not unimportant), to push law schools to collect and share data.  These data should be open access to the extent possible, and where relevant privacy laws and norms intervene to limit particular pieces of data (involving students, for example), careful protocols should be developed to ensure that privacy is rigorously protected while the overall objective of broad access is advanced.

In a subsequent post, I will discuss in more depth the kinds of data which can be and ought to be collected in order to advance the mission of a thoroughly evidence-based approach to law school innovation and reform.  There is much work to do, and data is essential to this work.