
Chances Are Page 20


  The variables most often used to define quality of care are an uneasy mixture of the practical and the political. People want to be treated soon, so “time to diagnosis” and “time to surgery” are variables to be minimized. People want to know they are going to a good hospital, so it is important to publish mortality rates—but the rates must be risk-adjusted; otherwise, advanced critical-care facilities would rank around the level of a Levantine pest-house. Governments want to pay as little as they can for routine procedures; more complex treatments require more money. It’s like a continuous clinical trial, with funding as the drug: where it does most good, dosage is increased; where the Null Hypothesis prevails, society can save money. The problem is that this experiment is neither randomized nor double-blind. Doctors and hospital administrators are entirely aware of the criteria and of their implications for future funding. It is as if, in a cross-over experiment, the patients in the control group were told, “You’re getting placebo now, but if you show the right symptoms we’ll switch you into treatment.” The very sources of data are given both the opportunity and the incentive to manipulate it.

  Given this remarkable arrangement, it’s surprising how few institutions have been caught fiddling, but the examples that have come to light are worrying enough. In the UK, some administrators reduced apparent waiting times for operations by finding out when patients were going on vacation and then offering appointments they knew wouldn’t be taken up. Departments hired extra staff and canceled routine procedures for the one week they knew waiting-time figures were being collected. In the United States, the “diagnosis-related group” reimbursement system for Medicare/Medicaid has produced what is called “DRG creep,” where conditions are “upcoded” to more complex and remunerative classes. Hospitals anxious to achieve good risk-adjusted mortality figures can do so by sending home the hopelessly moribund and classifying incoming patients as higher-risk: in one New York hospital the proportion of preoperative patients listed as having “chronic obstructive pulmonary disease” rose from 2 percent to 50 percent in two years. In heart surgery, adding a little valve repair to a bypass operation for a high-risk patient could take the whole procedure off the mortality list, improving the bypass survival figures at the expense of a few extra deaths in the “other” column.

  Of course, many more frightening things happened in the swaggering days when health care was left to regulate itself. The point is that as long as funding depends on statistics, the temptation to doctor the numbers as well as the patients will be strong. Moreover, ranking asks us to do something all people find difficult: to accept that half of any ranking will be below the median. How will you feel when you learn that your surgeon ranks in the 47th percentile? Does that help cement the relationship of trust?

  Jeremy Bentham described the role of society as providing the greatest good for the greatest number—a difficult ratio to maximize. The “good” of medical science, based on experiment and statistics, consists of matching potential cures to existing illnesses. This model worked well when the bully diseases were still in charge: the constant threats that filled up the middle of our normal curve of deaths. Now, smallpox is gone, polio almost gone, TB generally under control, measles manageable. We are increasingly faced with diseases that conceal huge variety under a single name, like cancer—or mass illnesses caused, on average, by our own choices, like obesity, diabetes, or heart disease. The problem with these isn’t finding a cure—if ever there were a magic bullet, vigorous exercise would be it—it’s being willing to take it.

  A more easily swallowed remedy for the diseases of affluence is the Polypill, proposed in 2003 by Nicholas Wald and Malcolm Law of the Wolfson Institute of Preventive Medicine. Combining aspirin and folic acid with generic drugs that lower cholesterol and blood pressure, this would be given to everyone over 55, and (assuming the benefits are multiplicative) should cut the risk of heart attacks by 88 percent and strokes by 80 percent. Average life span could be increased by 11 years at very little cost.
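The multiplicative assumption behind that headline figure is simple arithmetic: if each component of the pill leaves some fraction of the baseline risk intact, and the components act independently, the combined residual risk is the product of those fractions. A minimal sketch, with illustrative relative risks (not Wald and Law’s published component estimates):

```python
# Combined effect of several treatments, assuming benefits multiply.
# Each entry is the fraction of baseline heart-attack risk remaining
# after one component of the pill -- illustrative values only.
relative_risks = [0.4, 0.5, 0.75, 0.8]  # e.g. statin, BP drugs, folate, aspirin

residual = 1.0
for rr in relative_risks:
    residual *= rr  # independent risks combine multiplicatively

reduction = 1.0 - residual
print(f"Residual risk: {residual:.2f}, overall reduction: {reduction:.0%}")
# -> Residual risk: 0.12, overall reduction: 88%
```

With these particular fractions the residual risk is 0.12, reproducing an 88 percent reduction of the kind the Polypill paper projected.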

  But some of those drugs have side effects; some people could have problems. So, again, the question is whether you think of all of your patients or each of them. A National Health Service, dealing with a whole population, would probably favor the Polypill, but American researchers say that variable response among different groups requires tailoring the dose—there would be at least 100 different Polypills. It seems that we will still need to jog up the steep path of self-denial.

  “Patients very rarely fit the picture in the textbook. How do you treat an individual?” Dr. Timothy Gilligan of the Harvard Medical School is both a scientist and a practicing surgeon; his life is lived on the interface between the general and the particular. “In something like chemotherapy or radiotherapy, the published results don’t tell us anywhere near enough. We are trying to take into account genetic variations in metabolizing certain drugs, or the effects of different social environments on whether someone can get through a difficult therapy successfully. Individual differences can make all the difference to the outcome.”

  His hope is that increasing knowledge of the human genome will return medicine to the idea of a unique solution for every patient, custom-built around genetic predispositions. “Cancers are genetic diseases, and ultimately we should be able to define cancers by a series of specific mutations in genes. Right now, we have a hundred people coming in with lung cancer—which we know is a variety of diseases—and they all get put in the same basket. Some will respond to chemotherapy and some won’t, and one of the reasons is probably the specific character of the cancer, its genetic component. If we understood that, we could tailor treatment to it.” For the moment, though, the complexity of the way the human body expresses its genome makes this still a distant dream.

  Improved statistical evaluation may sharpen prognosis, however. Instead of being told that half the people with your disease die within a year, leaving you to wonder which half you are in, more sophisticated computer algorithms take account of several variables about your disease and give a more specific estimate. Dr. Gilligan elaborates: “You’re an individual with this type of lung cancer and this was the size of your tumor and this is where the metastases are located, and this is how fit you are right now, and if we plug all these numbers into our computer we can say not just what everyone with lung cancer does, but what people like you do. Again, though, you end up with a percentage: it may be you have a 75 percent chance of living a year—but we still can’t tell you whether you are in that 75 percent or the 25 percent who don’t. We’re a long way from the 100 percent or 0 percent that tells you you’re going to be cured or you’re not going to be cured—but if we have nasty chemotherapy and we are able to say this group of people has a 90 percent chance of benefiting and this group has only a 10 percent chance, then it would be easier to decide whom to treat.”
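The kind of stratified estimate Gilligan describes can be pictured as a scoring model: each patient variable is weighted, the weights are summed, and the total is converted into a probability for “people like you.” A toy logistic sketch, with entirely hypothetical variables and coefficients (not any clinical instrument):

```python
import math

# Toy prognostic model -- hypothetical weights, not a clinical tool.
# Each weight says how strongly a variable shifts one-year survival odds.
WEIGHTS = {"tumor_cm": -0.30, "metastases": -0.80, "fitness": 0.50}
INTERCEPT = 1.2

def one_year_survival(tumor_cm, metastases, fitness):
    """Estimated probability of surviving one year for 'people like you'."""
    score = (INTERCEPT
             + WEIGHTS["tumor_cm"] * tumor_cm
             + WEIGHTS["metastases"] * metastases
             + WEIGHTS["fitness"] * fitness)
    return 1.0 / (1.0 + math.exp(-score))  # logistic function

# Two patients with the "same" cancer get very different estimates:
print(one_year_survival(tumor_cm=2.0, metastases=0, fitness=2))  # small tumor, fit
print(one_year_survival(tumor_cm=6.0, metastases=3, fitness=0))  # large, spread
```

Even so, the output is exactly what the passage warns about: a percentage for a group, never a yes-or-no answer for the individual standing in front of the doctor.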

  As the human genome reveals its secrets, many of our assumptions about it begin to unravel. The first to go seems to be the idea of a universal genome from which any mutation represents a potential illness. As methods of studying DNA improve both in their resolution and their signal-to-noise ratio, they reveal more and more variation between “normal” people—not just in the regions of apparent nonsense between known functional stretches, but in the placement and number of copies of functional genes themselves. How this variation affects the potential for disease, developmental differences, or response to drugs becomes a deepening mystery. So not only is there no Death, only I, who am dying—there may be no Malady, only I, who am sick, and no Treatment, only what might work for me.

  Where does this leave us? Well short of immortality, but longer-lived and better cared-for than we were not so long ago, when a child on crutches meant polio, not a soccer injury. Our knowledge is flawed, but we can know the nature of its flaws. We take things awry, but we are learning something about the constants of our errors. If we remain aware that the conclusions we draw from data are inherently probabilistic and as interdependent as the balanced weights in a Calder mobile, we can continue the advance as better doctors—and as better patients. The inevitability of uncertainty is no more a reason for despair than the inevitability of death; medical research continues the mission set long ago by Fisher: “We have the duty of formulating, of summarizing, and of communicating our conclusions, in intelligible form, in recognition of the right of other free minds to utilize them in making their own decisions.”

  8

  Judging

  Law, says the judge as he looks down his nose,

  Speaking clearly and most severely,

  Law is as I’ve told you before,

  Law is as you know I suppose,

  Law is but let me explain it once more,

  Law is The Law.

  Yet law-abiding scholars write:

  Law is neither wrong nor right,

  Law is only crimes

  Punished by places and by times,

  Law is the clothes men wear

  Anytime, anywhere,

  Law is Good morning and Good night.

  —W. H. Auden, “Law Like Love”

  Around 1760 B.C.—a century or so after the departure of Abraham, his flocks, and family—Hammurabi established his supremacy in Mesopotamia, modern-day Iraq. He realized that leaving each mud-walled city under the protection of its own minor god and vassal kinglet was a sure prescription for treachery and disorder—so, as a sign of his supremacy, he gave to all one code of law: a seven-foot black pillar of basalt, densely covered with cuneiform writing, was set up in every marketplace.

  Here is the source for “an eye for an eye, a tooth for a tooth” (articles 197 and 200), but here also are set the seller’s liability for faulty slaves and the maximum damages for surgical malpractice. Much of the code seems astonishingly modern: it requires contracts, deeds, and witnesses for all transactions of sale or inheritance; merchants’ agents must issue receipts for goods entrusted them; wives may tell husbands “you are not congenial to me” and, if otherwise blameless, go their ways in full possession of their dowries. On the other hand, there is also the full complement of loppings, burnings, drownings, and impalements typical of a society without effective police, where deterrence from serious crime rests on visceral fear of pain rather than certainty of detection.

  “Law,” etymologically, means “that which lies beneath”: the unchanging standard against which mere happenstance is measured. Human events careen into the past, set going by motives we may not even know at the time. Yet when experience is brought to court, it must line up against the shining rule, and the bulgy mass of grievance and retort must be packed into the box of judgment. Achieving this task means answering two probabilistic questions: “Are these facts likely to be true?” and “If they are true, is this hypothesis the most likely one to explain them?”

  These are not answered by a basalt obelisk. They are human inquiries, pursued through the intrinsically human capability of speech—witnessing and argument. The first law case on record dates from the Sixth Dynasty of Egypt’s Old Kingdom. It concerns (for some things never change) a disputed will; and the judgment required Sebek-hotep, the defendant, to bring three reputable witnesses to swear by the gods that his account of the matter was accurate and his document not a forgery. We see here, at its very beginning, the elements that have remained vital to legal process: testimony, numbers, reputation, and the oath.

  These matters absorbed the attention of lawmakers in every tradition: Deuteronomy requires that “One witness shall not rise up against a man for any iniquity . . . at the mouth of two witnesses, or at the mouth of three witnesses, shall the matter be established.” The Romans preferred a rich witness over a poor one (as less likely to be bribed) and excluded anyone guilty of writing libelous poems. Jewish law forbade testimony from dice players or pigeon fanciers—but also ruled out all confessions, since they were unlikely to be obtained legitimately. People tend to affirm what they have heard others say; rules against hearsay evidence and conjecture appear very early. So does the idea that it is the accuser who must prove the charge, and that this proof, in criminal cases, should be beyond a reasonable doubt. “You found him holding the sword, blood dripping, the murdered man writhing,” says the Talmud; “if that is all, you saw nothing.”

  How was proof established? By argument. If you wonder why court proceedings are verbal—with all that repetition, hesitation, digression, and objection—it is because law keeps direct contact with the standards and habits of the ancient world. Life in classical Athens was one long conversation. Good talk flowed in spate throughout the city: smooth and easy in the after-dinner symposia, hard and forceful in the courts, intricate and demanding in the academies. All had one source, essential curiosity; and one goal, to try ideas by argument. The methods were the same when Socrates and his pupils were analyzing the Good in the shadow of a colonnade as when the old man himself was standing on trial for his life.

  Legal rhetoric has a bad name, but it is simply a degraded form of that Rhetoric the Greeks defined as the science of public reasoning. Aristotle, who never did things out of order, published his Rhetoric before he even began his logical masterwork, the Prior Analytics—seen in his terms, logical argument is just the chamber form of rhetoric. “All men are mortal” is not just a statement; it is a form of provisional law, like “all professional burglars re-offend.” To link, as in deductive logic, the actors of here and now to some provisional law is the essence of judicial reasoning: “Socrates is mortal”; “West Side Teddy is a professional burglar.” The syllogisms we discussed in Chapter 1 are really skeletons of legal cases, waiting to be fleshed out with facts and names.

  For a work on persuasion, the Rhetoric is hard going—but those who plow through it are rewarded by the first explicit treatment of the probable, which Aristotle considered both as the plausible and the likely, defined as “what usually happens.” You can use it to draw conclusions based on previous experience, to make assumptions about the future based on the past, or to expect the more usual based on the unusual. Thus: “He is flushed and his pulse is racing; it is likely he has a fever”; “She has always criticized my clothes; it is likely she will tomorrow”; and “He committed murder without a second thought; it is likely he would also be willing to double-park.”

  Arguments from likelihood appear weaker than deductive logic, but sometimes prove stronger, since they are not necessarily disproved by one counterexample. Hence, Aristotle says, judges “ought to decide by considering not merely what must be true but also what is likely to be true”: this is, indeed, the meaning of “giving a verdict in accordance with one’s honest opinion.”

  Obviously, this likelihood is open to abuse: a puny man may say, “It is unlikely that I committed this murder,” but then the strong man can say, “It is unlikely that I, who would be so easily suspected, committed this murder”—or, as the poet Agathon pointed out, “We might call it a probability that many improbable things happen to men.” So legal probability could do anything that logical argument does, except stand on its own unaided. Syllogisms are self-evident, but the use of likelihood means that the conclusion resides not in the speaker’s words, but in the listener’s understanding: that is what makes legal probability a branch of rhetoric.

  “What usually happens” is a concept that slides all too easily into “what people like us usually do.” Christ’s parables employed likelihood in the first sense: good fathers usually forgive their sons, good stewards improve their opportunities. Christ’s accusers employed it in the second sense: we are not accustomed to eat with sinners or heal on the Sabbath. When Pontius Pilate asked, “What is Truth?” he was in a courtroom, and—to give this Greek-educated official his due—may well have been pointing out how difficult it is to reconcile conflicting views of probability based on different premises.

  Among the Romans, the shared sense of “what usually happens” gained added support from their universal grounding in basic law and well-informed love of rhetorical theater. The law tables were terse, allusive, and rhythmic: the foundation song of the Republic. Schoolchildren were expected to memorize them. Every politician seeking glory had not only to win a military victory against the barbarians, but had also to conduct a successful defense and prosecution. These performances took place in the open forum, before an audience as passionately expert in legal spectacle as its descendants are in opera or soccer: once, when Cicero finished an oration with the quick flick-flack of a double trochee, the whole court erupted in frenzied cheering.

  All this—the brilliance, the freedom of judgment, the shared expectations—depended heavily on those two prerequisites for civilization: leisure and self-confidence. By the time Justinian became emperor in 527, neither quality was very evident. Rome was in the hands of the Ostrogoths; plague and riot raged through the remains of Empire. It was not a time to rely on “what usually happens.”

  Justinian, like Napoleon, was a determined and ruthless centralizer, and to concentrate power he collected and codified all law. His Decisions, Institutes, Digest, and Code—issued in the course of four years—superseded all previous legislation, nullified all other opinion, and founded the entire Western legal tradition. It is to Justinian that we owe our system’s uneasy yoking of statute and common law, eternal principles and local exceptions, sword and scales.

  As the darkness deepened, though, even the simplest laws soon lost their relevance. In the centuries of chaos, where dispute so often led directly to blood feud, there was no point in establishing evidence or pursuing judicial reasoning. Only God could provide proof—instantly and miraculously—in trial by battle, trial by water, and trial by ordeal, where you walked unharmed over red-hot plowshares or swallowed the blessed wafer that catches in the throat of liars. Each side could bring oath-helpers, whose numbers and standing amplified the power of the word. Oh—and in England, they would first put the matter to a jury of your peers. These were not the disinterested jurors of today, but quite the opposite—people who knew all about the accused and the background of the case—and their purpose had less to do with democratic integrity than with getting a decision that would satisfy the village. The results were often unjust, but then injustice was what one expected in this world: when Simon de Montfort led the crusade against the Cathars, he killed everyone he came on, heretic and Catholic alike, confident that God would sort them out to Heaven or Hell as they deserved.