Causal Analysis in Theory and Practice

August 11, 2015

Mid-Summer Greeting from the UCLA Causality Blog

Filed under: Announcement,Causal Effect,Counterfactual,General — moderator @ 6:09 pm

Friends in causality research,

This mid-summer greeting of the UCLA Causality Blog contains:
A. News items concerning causality research
B. Discussions and scientific results

1. The next issue of the Journal of Causal Inference is scheduled to appear this month, and the table of contents can be viewed here.

2. A new digital journal, “Observational Studies,” is out this month (link), and its first issue is dedicated to the legacy of William Cochran (1909-1980).

My contribution to this issue can be viewed here:

See also comment 1 below.

3. A video recording of my Cassel Lecture at the SER conference, June 2015, Denver, CO, can be viewed here:

4. A video of a conversation with Robert Gould concerning the teaching of causality can be viewed on Wiley’s Statistics Views, link (2 parts, scroll down).

5. We are informed of the upcoming publication of a new book, Rex Kline’s “Principles and Practice of Structural Equation Modeling,” Fourth Edition (link). Judging by the chapters I read, this book promises to be unique; it treats structural equation models for what they are: carriers of causal assumptions and tools for causal inference. Kudos, Rex.

6. We are informed of another book on causal inference: Imbens, Guido W.; Rubin, Donald B., “Causal Inference in Statistics, Social, and Biomedical Sciences: An Introduction,” Cambridge University Press (2015). Readers will quickly realize that the ideas, methods, and tools discussed on this blog were kept out of this book. Omissions include: control of confounding, testable implications of causal assumptions, visualization of causal assumptions, generalized instrumental variables, mediation analysis, moderation, interaction, attribution, external validity, explanation, representation of scientific knowledge and, most importantly, the unification of potential outcomes and structural models.

Given that the book is advertised as describing “the leading analysis methods” of causal inference, unsuspecting readers will get the impression that the field as a whole is facing fundamental obstacles, and that we are still lacking the tools to cope with basic causal tasks such as confounding control and model testing. I do not believe mainstream methods of causal inference are in such a state of helplessness.

The authors’ motivation and rationale for this exclusion were discussed at length on this blog. See
“Are economists smarter than epidemiologists?”

and “On the First Law of Causal Inference”

As most of you know, I have spent many hours trying to explain to leaders of the potential outcome school what insights and tools their students would be missing if not given exposure to a broader intellectual environment, one that embraces model-based inferences side by side with potential outcomes.

This book confirms my concerns, and its insularity-based impediments are likely to evoke interesting public discussions on the subject. For example, educators will undoubtedly wish to ask:

(1) Is there any guidance we can give students on how to select covariates for matching or adjustment?

(2) Are there any tools available to help students judge the plausibility of ignorability-type assumptions?

(3) Aren’t there any methods for deciding whether identifying assumptions have testable implications?

I believe that if such questions are asked often enough, they will eventually evoke non-ignorable answers.
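Question (1) does have a graphical answer: given a causal diagram, the backdoor criterion mechanically separates admissible adjustment sets from inadmissible ones. The sketch below is illustrative only (the toy model, variable names, and code are my own, not from the post); it implements d-separation on a small hypothetical DAG in which Z confounds X and Y while M mediates X's effect on Y, then tests which covariate sets satisfy the backdoor criterion:

```python
# Hypothetical toy model (illustrative only): Z confounds X and Y,
# and M mediates the effect of X on Y.
#   Z -> X, Z -> Y, X -> M, M -> Y
edges = [("Z", "X"), ("Z", "Y"), ("X", "M"), ("M", "Y")]
nodes = {v for e in edges for v in e}
parents = {v: {a for a, b in edges if b == v} for v in nodes}
children = {v: {b for a, b in edges if a == v} for v in nodes}

def descendants(v):
    seen, stack = set(), [v]
    while stack:
        for c in children[stack.pop()] - seen:
            seen.add(c)
            stack.append(c)
    return seen

def paths(src, dst, path=None):
    # all simple paths in the skeleton (ignoring edge direction)
    path = path or [src]
    if path[-1] == dst:
        yield list(path)
        return
    for nxt in (parents[path[-1]] | children[path[-1]]) - set(path):
        yield from paths(src, dst, path + [nxt])

def blocked(path, S):
    # the d-separation rule applied to a single path
    for prev, mid, nxt in zip(path, path[1:], path[2:]):
        collider = prev in parents[mid] and nxt in parents[mid]
        if collider:
            if mid not in S and not (descendants(mid) & S):
                return True   # unconditioned collider blocks the path
        elif mid in S:
            return True       # conditioned chain/fork node blocks the path
    return False

def backdoor_admissible(S, x="X", y="Y"):
    # S may not contain X, Y, or any descendant of X, and it must
    # block every path that enters X through a backdoor arrow.
    if {x, y} & S or descendants(x) & S:
        return False
    return all(blocked(p, S) for p in paths(x, y) if p[1] in parents[x])

for S in [set(), {"Z"}, {"M"}, {"Z", "M"}]:
    print(sorted(S), backdoor_admissible(S))
# {Z} is admissible; the empty set leaves X <- Z -> Y open; and any set
# containing the mediator M is ruled out because M is a descendant of X.
```

In this toy example the criterion does exactly what ignorability-style reasoning struggles to do verbally: it tells the student to adjust for the confounder Z and to leave the mediator M alone.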

7. Yesterday the ASA issued a press release recognizing Tyler VanderWeele’s new book, “Explanation in Causal Inference,” winner of the 2015 Causality in Statistics Education Award.

Congratulations, Tyler.

Information on nominations for the 2016 Award will soon be announced.

8. Since our last greeting (Spring 2015), we have had a few lively discussions posted on this blog. I summarize them below:

8.1. Indirect Confounding and Causal Calculus
(How getting too anxious to criticize do-calculus may cause you to miss an easy solution to a problem you thought was hard).
July 23, 2015

8.2. Does Obesity Shorten Life? Or is it the Soda?
(Discusses whether it was the earth that caused the apple to fall, or the gravitational field created by the earth.)
May 27, 2015

8.3. Causation without Manipulation
(Asks whether anyone takes this mantra seriously nowadays, and whether we need manipulations to store scientific knowledge.)
May 14, 2015

8.4. David Freedman, Statistics, and Structural Equation Models
(On why Freedman invented “response schedules.”)
May 6, 2015

8.5. We also had a few breakthroughs posted on our technical report page.

My favorites this summer are these two, because they deal with a tough and long-standing problem:
“How generalizable are empirical studies?”

Enjoy the rest of the summer!

1 Comment »

  1. Consent and Dissent in Interpreting Cochran’s Inheritance.
    In reading the discussions of many scholars on Cochran’s paper of 1975,
    I was struck by the apparent similarities between Andrew Gelman’s comment and mine
    on what we should inherit from Cochran’s paper. I would like to point out the differences.

    Gelman’s comments can be viewed here
    while mine can be viewed here

    Gelman says:
    In reading Cochran’s chapter, I was struck by his apparent lack of interest in causal identification. Modern textbooks (for example, the econometrics book of Angrist and Pischke) discuss the search for natural experiments, along with the assumptions under which an observational study can yield valid causal inference, and various specific methods such as instrumental variables and regression discontinuity that can identify causal effects if defined carefully enough under specified conditions. In contrast, Cochran discusses generic before-and-after designs and restricts himself to analysis strategies that do basic controlling for pre-treatment covariates by matching and regression. He is not so clear on what variables should be controlled for (which perhaps can be expected given that he was writing before Rubin codified the concept of ignorability), and this has the practical consequence that he devotes little space to any discussion of the data-generating process. Sure, an experiment is, all else equal, better than an observational study, but we don’t get much guidance on how an observational study can be closer or further from the experimental ideal.

    In contrast, Pearl says:
    It is likewise not surprising that in the present article, Cochran does not offer readers any advice on which covariates are likely to reduce bias and which would amplify bias. Any such advice, as we know today, requires a picture of reality, which Cochran understood to be both needed and lacking at his time. On the positive side, though, he did have the vision to anticipate the emergence of a new type of research paradigm within statistics, a paradigm centered on mechanisms:

    Do Gelman and Pearl converge?
    Only up to a certain point. After agreeing on the importance of mechanisms and the data-generating process,
    we find profound divergence on the role of mechanisms in practical causal inference.

    For Gelman, Rubin’s codifying the concept of ignorability
    and Angrist and Pischke’s quest for instrumental variables
    are shining examples of reasoning about mechanisms.
    For Pearl, on the other hand, reasoning about mechanism means representing
    mechanisms mathematically and transparently, and learning to
    draw causal conclusions from such representations.
    For him, “ignorability” is a cognitively formidable construct,
    and “instrumental variables,” just a tiny tip of
    the iceberg of modern causal inference.

    I bring this distinction here to emphasize that speaking about the importance
    of “mechanisms” does not mean agreeing on what “mechanisms” are, or what
    we should do with “mechanisms” even if we agree on what they mean.

    Comment by Judea Pearl — August 15, 2015 @ 3:28 am
