Category Archive for: Science

Saturday, September 05, 2015

'Range of Reactions to Realism about the Social World'

Daniel Little:

Range of reactions to realism about the social world: My recent post on realism in the social realm generated quite a bit of commentary, which I'd like to address here.

Brad DeLong offered an incredulous response -- he seems to think that any form of scientific realism is ridiculous (link). He refers to the predictive success of Ptolemy's epicycles, and then says, "But just because your theory is good does not mean that the entities in your theory are "really there", whatever that might mean...." I responded on Twitter: "DeLong doesn't like scientific realism -- really? Electrons, photons, curvature of space - all convenient fictions?" The position of instrumentalism -- the idea that scientific theories are just convenient computational devices for summarizing a range of observations -- is intellectually untenable, in my opinion. It is hard to see why we would have confidence in any complex technology depending on electricity, light, gravity, or the properties of metals and semiconductors if we didn't think that our scientific theories of these things were approximately true of real things in the world. So general rejection of scientific realism seems irrational to me. But the whole point of the post was that this reasoning doesn't extend to the social sciences very easily; if we are to be realists about social entities, it needs to be on a different basis than the overall success of theories like Keynesianism, Marxism, or Parsonian sociology. They just aren't that successful!
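
The epicycle example is easy to make concrete. Here is a toy sketch (my own illustration; the radii and angular speeds are made-up parameters, not anything from Ptolemy): a planet riding a small epicycle whose center moves on a larger deferent circle around the Earth reproduces retrograde motion, so the model "saves the phenomena" whether or not anyone thinks its circles are really there.

```python
import math

def geocentric_position(t, R=1.0, r=0.4, w_def=1.0, w_epi=8.0):
    """Planet's (x, y) at time t: deferent of radius R about the Earth
    at the origin, plus an epicycle of radius r around that center."""
    cx, cy = R * math.cos(w_def * t), R * math.sin(w_def * t)
    return cx + r * math.cos(w_epi * t), cy + r * math.sin(w_epi * t)

# Sample the orbit and check the direction of apparent rotation about the
# Earth via the cross product of consecutive position vectors: a negative
# value means the planet's apparent motion has reversed (retrograde).
points = [geocentric_position(k * 0.01) for k in range(700)]
crosses = [p[0] * q[1] - p[1] * q[0] for p, q in zip(points, points[1:])]
retrograde = any(c < 0 for c in crosses)  # True: the model predicts the loops
```

The point of the toy model is exactly DeLong's: it predicts the observable (retrograde loops) perfectly well, yet nothing in the sky corresponds to its deferent and epicycle.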

There were quite a few comments (71) when Mark Thoma reposted this piece at Economist's View. A number of the commenters were particularly interested in the question of the realism of economic knowledge. Daniel Hausman addresses the question of realism in economics in his article on the philosophy of economics in the Stanford Encyclopedia of Philosophy (link):

Economic methodologists have paid little attention to debates within philosophy of science between realists and anti-realists (van Fraassen 1980, Boyd 1984), because economic theories rarely postulate the existence of unobservable entities or properties, apart from variants of “everyday unobservables,” such as beliefs and desires. Methodologists have, on the other hand, vigorously debated the goals of economics, but those who argue that the ultimate goals are predictive (such as Milton Friedman) do so because of their interest in policy, not because they seek to avoid or resolve epistemological and semantic puzzles concerning references to unobservables.

Examples of economic concepts that commenters thought could be interpreted realistically include "economic disparity." But this isn't a particularly arcane or unobservable theoretical concept. There is a lot of back-and-forth on the meaning of investment in Keynes's theory -- is it a well-defined concept? Is it a concept that can be understood realistically? The question of whether economics consists of a body of theory that might be interpreted realistically is a complicated one. Many technical economic concepts seem not to be referential; instead, they seem to be abstract concepts summarizing the results of large numbers of interactions by economic agents.

The most famous discussion of realism in economics is that offered by Milton Friedman in relation to the idea of economic rationality (Essays in Positive Economics); he doubts that economists need to assume that real economic actors actually choose on the basis of economic rationality. Rather, according to Friedman, this is just a simplifying assumption that allows us to summarize a vast range of behavior. This is a hard position to accept, though: if agents are not making calculating choices about costs and benefits, why should we expect a market to work in the ways our theories say it should? (Here is a good critique by Bruce Caldwell of Friedman's instrumentalism; link.)

And what about the concept of a market itself? Can we understand this concept realistically? Do markets really exist? Maybe the most we can say is something like this: there are many social settings where stuff is produced and exchanged. When exchange is solely or primarily governed by the individual self-interest of the buyers and sellers, we can say that a market exists. But we must also be careful to add that there are many different institutional and social settings where this condition is satisfied, so there is great variation across the particular "market settings" of different societies and communities. As a result, we need to be careful not to reify the concept of a market across all settings.

Michiel van Ingen made a different sort of point about my observations on social realism in a comment offered on Facebook. He thinks I am too easy on the natural sciences.

This piece strikes me as problematic. First, because physics is by no means as successful at prediction as it seems to suggest. A lot of physics is explanatorily quite powerful, but - like any other scientific discipline - can only predict in systemically closed systems. Contrasting physics with sociology and political science because the latter 'do not consist of unified deductive systems whose empirical success depends upon a derivation of distant observational consequences' is therefore unnecessarily dualistic. In addition, I'm not sure why the 'inference to the best explanation' element should be tied to predictive success as closely as it is in this piece. Inference to the best explanation is, by its very definition, perfectly applicable to EXPLANATION. And this applies across the sciences, whether 'natural' or 'social', though of course there is a significant difference between those sciences in which experimentation is plausible and helpful, and those in which it is not. This is not, by the way, the same as saying that natural sciences are experimental and social ones aren't. There are plenty of natural sciences which are largely non-experimental as well. And lest we forget, the hypothetico-deductive form of explanation DOES NOT WORK IN THE NATURAL SCIENCES EITHER!

This critique comes from the general idea that the natural sciences need a bit of debunking, in that various areas of natural science fail to live up to the positivist ideal of a precise predictive system of laws. That is fair enough; there are areas of imprecision and uncertainty in the natural sciences. But, as I responded to Delong above, the fact remains that we have a very good understanding of much of the physical realities and mechanisms that generate the phenomena we live with. Here is the response I offered Michiel:

Thank you, Michiel, for responding so thoughtfully. Your comments and qualifications about the natural sciences are correct, of course, in a number of ways. But really, I think we post-positivists need to recognize that the core areas of fundamental and classical physics, electromagnetic theory, gravitation theory, and chemistry including molecular biology, are remarkably successful in unifying, predicting, and explaining the phenomena within these domains. They are successful because extensive and mathematicized theories have been developed and extended, empirically tested, refined, and deployed to help account for new phenomena. And these theories, as big chunks, make assertions about the way nature works. This is where realism comes in: the chunks of theories about the nature of the atom, electromagnetic forces, gravitation, etc., can be understood to be approximately true of nature because otherwise we would have no way to account for the remarkable ability of these theories to handle new phenomena.

So I haven't been persuaded to change my mind about social realism as a result of these various comments. The grounds for realism about social processes, structures, and powers are different for many social sciences than for many natural sciences. We can probe quite a bit of the social world through mid-level and piecemeal research methods -- which means that we can learn a great deal about the nature of the social world through those methods. Here is the key finding:

So it seems that we can justify being realists about class, field, habitus, market, coalition, ideology, organization, value system, ethnic identity, institution, and charisma, without relying at all on the hypothetico-deductive model of scientific knowledge upon which the "inference to the best explanation" argument depends. We can look at sociology and political science as loose ensembles of empirically informed theories and models of meso-level social processes and mechanisms, each of which is to a large degree independently verifiable. And this implies that social realism should be focused on mid-level social mechanisms and processes that can be identified in the domains of social phenomena that we have studied rather than sweeping concepts of social structures and entities.

(Sometimes social media debates give the impression of a nineteenth-century parliamentary shouting match -- which is why the Daumier drawing came to mind!)

Friday, August 23, 2013

'The Age of Denial and the Marketplace of Ideas'

Mike the Mad Biologist:

The Age of Denial and the Marketplace of Ideas: I probably should have written “The Age of Denial Results from the Marketplace of Ideas.” Physicist Adam Frank at the NY Times tackles the topic of denialism and science ...

Mark Thoma lists a bunch of reasons why this might be the case (boldface mine):

•The cranks have always been there, but today digital technology makes it easier to gain a platform.
•The stakes are higher, so winning is the only thing.
•Scientists have pushed too far and offered evidence as though it were fact, only to have to reverse themselves later (e.g. types of food that are harmful/helpful) eroding trust.
•Science education is so bad that the typical reporter has no idea how to tell fact from “manufactured doubt,” and the resulting he said, she said journalism leaves the impression that both sides have a valid point.
•Scientists became too arrogant and self-important to interact with the lowly public, and it has cost them.
•The political sphere has become ever more polarized and insular, making it much easier for false ideas intended to promote political or economic gain to reverberate within the groups.
•Nothing has really changed, old people always think their age was the golden one.

I would add to the list:

•The opposition to certain fields and findings of science is central to self-identity and part of a larger world view and way of life (e.g., fundamentalist doctrines). It transcends data-driven assessment of single issues. Typically, people will resist changing their minds and only do so after a trauma or betrayal (personal or group) forces them to confront their inconsistencies.

I would argue the widespread acceptance of racism–I mean the flat-out, stone cold kind, not subtle prejudice–for much of the twentieth century has to be one of the dumbest displays of denialism. And it was certainly tied into notions of self-identity (“If you ain’t better than a…, then who are you better than?”). So the oldsters, like Tolstoy’s unhappy families, were stupid in their own ways.

But I want to return to the notion of a marketplace of ideas. I dislike that metaphor because it implies ideas are judged not on their validity, but on how well they are marketed. The implication of this is that once some rich wacko decides to fund a ‘faith-tank’, that entity essentially becomes a Self-Perpetuating Bullshit Machine, and is unstoppable. It’s relatively cheap to ‘put ideas out there.’ More importantly, there’s no way to stop them from doing so, nor do the individual actors pushing these ideas have any incentive to stop.

One more way we have commodified the previously uncommodifiable*.

*Or at least to a recently unprecedented extent.

Thursday, August 22, 2013

'Welcome to the Age of Denial'

Adam Frank wonders how it became "politically effective, and socially acceptable, to deny scientific fact" (this one is also in today's links):

Welcome to the Age of Denial, Commentary, NY Times: ...The triumph of Western science led most of my professors to believe that progress was inevitable. While the bargain between science and political culture was at times challenged — the nuclear power debate of the 1970s, for example — the battles were fought using scientific evidence. Manufacturing doubt remained firmly off-limits.

Today, however, it is politically effective, and socially acceptable, to deny scientific fact. ...

My professors’ generation could respond to silliness like creationism with head-scratching bemusement. My students cannot afford that luxury. Instead they must become fierce champions of science in the marketplace of ideas.
During my undergraduate studies I was shocked at the low opinion some of my professors had of the astronomer Carl Sagan. For me his efforts to popularize science were an inspiration, but for them such “outreach” was a diversion. That view makes no sense today.
The enthusiasm and generous spirit that Mr. Sagan used to advocate for science now must inspire all of us. There are science Twitter feeds and blogs to run, citywide science festivals and high school science fairs that need input. For the civic-minded nonscientists there are school board curriculum meetings and long-term climate response plans that cry out for the participation of informed citizens. ...
Behind the giant particle accelerators and space observatories, science is ..., simply put, a tradition. And as we know from history’s darkest moments, even the most enlightened traditions can be broken and lost. Perhaps that is the most important lesson all lifelong students of science must learn now.

I've been trying to think of some reasons why it might have changed:

  • The cranks have always been there, but today digital technology makes it easier to gain a platform.
  • The stakes are higher, so winning is the only thing.
  • Scientists have pushed too far and offered evidence as though it were fact, only to have to reverse themselves later (e.g. types of food that are harmful/helpful) eroding trust.
  • Science education is so bad that the typical reporter has no idea how to tell fact from "manufactured doubt," and the resulting he said, she said journalism leaves the impression that both sides have a valid point.
  • Scientists became too arrogant and self-important to interact with the lowly public, and it has cost them.
  • The political sphere has become ever more polarized and insular, making it much easier for false ideas intended to promote political or economic gain to reverberate within the groups.
  • Nothing has really changed, old people always think their age was the golden one.

What else might have caused this? What's the most important factor?

Friday, May 31, 2013

Does Infinity Really Exist?

Sunday, December 23, 2012

'Paradigms, after Fifty Years'

I get in trouble if I blog too much when family is around for the holidays, so a quick one from David Warsh:

Paradigms, after Fifty Years, Economics Principals: For a book built on a narrative of, among other things, the history of our understanding of electricity, The Structure of Scientific Revolutions, by Thomas Kuhn, has had a remarkable run. It appeared in 1962, and people have been arguing about it ever since. ... For Structure is the book that made the word paradigm, meaning a way of seeing, part of the everyday discourse of nearly everyone who deals with ideas for a living. ...

Before Kuhn, the philosophy of science was boring and the history of science a backwater... There was a lot of boilerplate instruction about the steps of the scientific method and the logic of scientific discovery (if you’re not wrong, you might be right) to be found in the first chapters of textbooks, but, as Kuhn wrote at the beginning of Structure, this was no better than an image of national culture drawn from a tourist brochure.
After Kuhn, the focus shifted to the social organization of science: to the textbooks themselves, graduate education, the communities (“invisible colleges”) in which science was done, and the various nexuses in which results were put to work, from scientific journals and legal briefs to corporate laboratories and entrepreneurial start-ups. ...
How does a science get started? According to Kuhn, the story goes something like this: in the beginning someone contributes a powerful example of how to think about a set of scientific problems: Aristotle’s Physica, Ptolemy’s Almagest, Newton’s Principia, Franklin’s Electricity, Lavoisier’s Chemistry, Lyell’s Geology. The achievements appear, not out of the blue, but they are transformative. A community forms around them because they offer not a finished theory but rather a thinking cap, a pre-analytic way of seeing things and asking questions about them.
This way-of-seeing aspect that each possessed Kuhn designated a paradigm. The word itself is ancient Greek; he borrowed it from language studies, where it described the all-but-unconscious pattern by which one learns to conjugate a verb or decline a noun when learning to speak a language. A successful paradigm is enabling. It both poses plenty of unanswered questions and suggests means by which they might be conclusively answered. ...
This is the route to what Kuhn called “normal science.” By that he meant successful science, rather like filling in the outlines of a hastily drawn map once a new continent has been discovered. In this metaphor, normal scientists come in all sorts of guises: trailblazers, pioneers, settlers, sodbusters, ranchers, developers. Kuhn, unfortunately, chose two other metaphors to describe the conduct of this phase, and those labels have sometimes caused proud scientists to rebel at his description. Successful normal scientists were “puzzle-solvers,” he said, working away at adducing facts, producing theories and making sure the one dovetailed with the other. Or they were, in essence, engaged in “mopping up” after a big paradigmatic invasion. ...
Kuhn was a great student of the Copernican revolution, which meant he thoroughly understood the Ptolemaic system that it overthrew – crystalline spheres arrayed around an earth at the center of the universe. Ptolemy, and the astronomers who worked in his tradition for nearly fifteen hundred years, were excellent normal scientists. They had built a system that cohered; when observation of the heavens produced a troubling fact (anomalies, Kuhn dubbed such facts), they added a sphere or two.
But the troubling facts multiplied. Eventually a scientific crisis was at hand – anomalies with which existing normal science simply could not cope under any circumstances. At that point, a “revolutionary,” usually a young scientist, capable, but with little commitment to the old tradition – in this case, Copernicus – would produce a new paradigm, radically reordering the old facts, ignoring some and adducing new ones. The new paradigm would be resisted for a time, science being an inherently conservative enterprise, but gradually would gain adherents among the young. In time, the new order would be widely accepted. ...
In Structure, Kuhn went on to make the point that scientific revolutions didn’t have to be huge events with sweeping cultural ramification, such as the Copernican, Newtonian or chemical revolutions. The professional groups affected by them could be far smaller. ...

An especially fascinating aspect of the story has to do with the reception of Structure. A tendency to mildly disparage it has emerged. Hacking, in his introduction, assures us that science has moved on. The Cold War is over; physics is no longer "where the action is." Today, he says, "biotechnology rules." Thus Structure, he writes, "may be – I do not say it is – more relevant to a past epoch in the history of science than it is to the sciences as they are practiced today."
David Kaiser, a physicist who is a professor in the history of science at the Massachusetts Institute of Technology (where Kuhn spent his last seventeen years), put the case clearly on the eve of a fiftieth-anniversary symposium: “Kuhn had an ambition with the book, which was common at the time: he really thought there was a structure, a hidden key that makes science tick. I think many of my colleagues today in the history and sociology of science would find that ambition wrong-headed. There is not a single magical key that will unlock the way science gets done.”
There is another possibility, of course – that, for one reason or another, it is the historians and philosophers of science, taken as a group, who have got it wrong. They are, after all, "normal" scientists. For as Daryn Lehoux, of Queen's University, and Jay Foster, of Memorial University of Newfoundland, said in their Science magazine review of the fiftieth-anniversary edition, Structure was a revolution of its own, and revolutions are complicated things. They can spark backlash as well as assimilation. It is possible, even likely, that The Structure of Scientific Revolutions is one of those books, like The Origin of Species, that take more than a generation, even two or three, to find their level – a real anomaly in the age of blink. I eagerly look forward to the seventy-fifth anniversary edition.

Monday, December 17, 2012

'Master Computer Controls Universe?'

I can't resist this one (via Boing Boing):
Master computer controls universe?, The Times of India: Scientists are conducting experiments to discover whether the universe exists within a Matrix-style computer simulation created by super computers of the future.
The experiments being conducted at the University of Washington could prove that we are merely pawns in some kind of larger computer game. However, it is unclear who created these super computers that may hypothetically power our existence.
"Imagine the situation where we get a big enough computer to simulate our universe, and we start such a simulation on our computers," said professor Martin Savage, a physicist working on the project. "If that simulation runs long enough, and have same laws as our universe, then something like our universe will emerge within that simulations, and the situation will repeat itself within each simulation," he said. ...
Explaining how the experiment works, physicists claim that finite computer resources mean that space time is not continuous but set on a grid with a finite volume, designed to create maximum energy subatomic particles. The direction these particles flow in will depend on how they are ordered on the grid. They will be looking at the distribution of the highest energy cosmic rays in order to detect patterns that could suggest that the universe is the creation of some futuristic computer technology. And if it does turn out that we are mere players in some sort of computer program, they suggested that there may be a way to mess with the program, and play with the minds of our creators. "One could imagine trying to figure out how to manipulate the code, communicate with the code and questions that appear weird to consider today," he said.
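
The cutoff logic behind the cosmic-ray test can be made concrete with a back-of-the-envelope sketch (the numbers and the identification with the GZK energy scale are my own illustration, not claims from the researchers): a lattice with spacing a cannot represent momenta above roughly the Brillouin-zone edge, p_max = π·ħ/a, so if the highest-energy cosmic rays we observe sat near that edge, a lattice spacing would be implied.

```python
import math

# Hypothetical numbers for illustration: hbar*c = 197.327 MeV*fm expressed
# in eV*m, and the GZK-scale cosmic-ray energy of roughly 5e19 eV taken as
# the putative momentum cutoff p_max = pi * hbar / a.
HBAR_C_EV_M = 197.327e6 * 1e-15   # hbar*c in eV*m
E_MAX_EV = 5e19                   # GZK-scale cosmic-ray energy in eV

# Invert the cutoff relation to get the implied lattice spacing in meters.
a_m = math.pi * HBAR_C_EV_M / E_MAX_EV
# a_m comes out on the order of 1e-26 m -- far below any scale probed directly.
```

The interesting observable, as the article notes, would not be the cutoff itself but anisotropies: a cubic grid picks out preferred directions, so the arrival directions of the very highest-energy rays could in principle betray the lattice axes.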

If you like this stuff, you might enjoy The Hidden Reality: Parallel Universes and the Deep Laws of the Cosmos by Brian Greene. Each chapter explores a different way a multiverse might arise, and one of them is on this topic. A big problem in this literature is finding ways to test these various theories empirically, so this is interesting from that perspective as well.

Friday, November 23, 2012

Paul Krugman: Grand Old Planet

The Republican anti-rational mind-set:

Grand Old Planet, by Paul Krugman, Commentary, NY Times: ...Senator Marco Rubio, whom many consider a contender for the 2016 Republican presidential nomination,... was asked how old the earth is. After declaring “I’m not a scientist, man,” the senator went into desperate evasive action, ending with the declaration that “it’s one of the great mysteries.”
It’s funny stuff, and conservatives ... say ... he was just pandering to likely voters in the 2016 Republican primaries — a claim that for some reason is supposed to comfort us.
But we shouldn’t let go that easily..., his inability to acknowledge scientific evidence speaks of the anti-rational mind-set that has taken over his political party. ... In one interview, he compared the teaching of evolution to Communist indoctrination tactics...
What was Mr. Rubio’s complaint about science teaching? That it might undermine children’s faith in what their parents told them to believe. And right there you have the modern G.O.P.’s attitude, not just toward biology, but toward everything: If evidence seems to contradict faith, suppress the evidence.
The most obvious example other than evolution is man-made climate change. As the evidence for a warming planet becomes ever stronger — and ever scarier — the G.O.P. has buried deeper into denial ... accompanied by frantic efforts to silence and punish anyone reporting the inconvenient facts.
But the same phenomenon is visible in many other fields. The most recent demonstration came in the matter of election polls..., the demonizing of The Times’s Nate Silver, in particular, was remarkable to behold. ...
We are, after all, living in an era when science plays a crucial economic role. How are we going to search effectively for natural resources if schools trying to teach modern geology must give equal time to claims that the world is only 6,000 years old? How are we going to stay competitive in biotechnology if biology classes avoid any material that might offend creationists?
And then there’s the matter of ... the recent study from the Congressional Research Service finding no empirical support for the dogma that cutting taxes on the wealthy leads to higher economic growth. How did Republicans respond? By suppressing the report. On economics, as in hard science, modern conservatives don’t want to hear anything challenging their preconceptions — and they don’t want anyone else to hear about it, either.
So don’t shrug off Mr. Rubio’s awkward moment. His inability to deal with geological evidence was symptomatic of a much broader problem — one that may, in the end, set America on a path of inexorable decline.

Wednesday, October 31, 2012

Climate Change and Hurricane Sandy

Is there a link between climate change and Hurricane Sandy?:

Did Climate Change Cause Hurricane Sandy?, by Mark Fischetti, Scientific American: If you’ve followed the U.S. news and weather in the past 24 hours you have no doubt run across a journalist or blogger explaining why it’s difficult to say that climate change could be causing big storms like Sandy. Well, no doubt here: it is.
The hedge expressed by journalists is that many variables go into creating a big storm, so the size of Hurricane Sandy, or any specific storm, cannot be attributed to climate change. That’s true, and it’s based on good science. However, that statement does not mean that we cannot say that climate change is making storms bigger. It is doing just that—a statement also based on good science, and one that the insurance industry is embracing, by the way. (Huh? More on that in a moment.)
Scientists have long taken a similarly cautious stance, but more are starting to drop the caveat and link climate change directly to intense storms and other extreme weather events, such as the warm 2012 winter in the eastern U.S. and the frigid one in Europe at the same time. They are emboldened because researchers have gotten very good in the past decade at determining what affects the variables that create big storms. Hurricane Sandy got large because it wandered north along the U.S. coast, where ocean water is still warm this time of year, pumping energy into the swirling system. But it got even larger when a cold Jet Stream made a sharp dip southward from Canada down into the eastern U.S. The cold air, positioned against warm Atlantic air, added energy to the atmosphere and therefore to Sandy, just as it moved into that region, expanding the storm even further.
Here’s where climate change comes in. ... [more] ...

Thursday, August 30, 2012

The Base of Mount Sharp

Wednesday, July 04, 2012

'The Infinity Puzzle: Quantum Field Theory and the Hunt for an Orderly Universe'

[Following up on this post.] I don't usually do the "here's what I've been reading" thing, but if you want to know the history of the Higgs boson, and of developments in quantum physics more generally, see:

Close, Frank (2011-11-29). The Infinity Puzzle: Quantum Field Theory and the Hunt for an Orderly Universe. Perseus Books Group. Kindle Edition.

This isn't the only book I read prior to coming to Lindau, but of the several I read it was the best on the history of the people involved -- who got credit for discoveries, who got left out, and so on -- and it is also good at explaining the underlying physics.

Here's a (very) small part of Chapter 9, "The Boson That Has Been Named after Me," a.k.a. the Higgs Boson: how Peter Higgs—and many others—discovered the "Higgs Mechanism" for creating mass; why the Higgs Boson is now so important for particle physicists, why it is named after him, and how to become famous in three weeks. The excerpts focus on the people part:

...To be fair to Peter Higgs, it was not he who yoked his name to the particle. He modestly refers to it as “the Boson that has been named after me.”12 Why, how, and when it came to be so named are some of the questions that I shall discuss. The British media, eager for a Nobel laureate, have headlined his name, and “Higgs Boson” has also been a convenient sound bite for those promoting the LHC. A counterpoint to this adulation has come from Philip Anderson, for whom Higgs was “a rather minor player.” Furthermore, he has written that the so-called Higgs phenomenon “was, in fact, discovered in [BCS] theory by me and applied to particle physics in 1963, a year before Higgs’ great inspiration.”13 As far as “The Mechanism” for generating a mass for gauge bosons such as W goes, this is indeed true. Anderson is the “A” of what Higgs himself has referred to as the “ABEGHHK’tH” mechanism,14 the full acronym referring to Anderson, Brout, Englert, Guralnik, Hagen, Higgs, Kibble, and ’t Hooft. When we discussed this together, between events at the Edinburgh Festival in 2010, Higgs added, “However, I do accept responsibility for the Higgs Boson; I believe I was the first to draw attention to its existence in spontaneously broken gauge theories.”15 The properties of “The Boson” in particle physics are what the LHC is investigating. While debates about priority for “The Mechanism” may continue, “The Boson” is another issue. So, first, let’s meet the saga of “The Mechanism.” ...
“The portion of my life for which I am known is rather small—three weeks in the summer of 1964. It would have been only two if Physics Letters [the European journal to which Higgs had sent the first version of his manuscript] had accepted the paper. But initially they rejected it.” Higgs’s interpretation of the editor’s letter was that “they didn’t understand it as it was written in a dead language—the dead language of field theory.”39 In the early 1960s, “field theory is a dead end” was a widespread belief. However, this rejection of Peter Higgs’s first draft of the 1964 paper that would eventually make him famous had profound consequences for the course of physics.
Higgs decided that to improve the paper’s credibility, he should “add some practical consequences of the theory. That took a week, hence the third week of 1964, and included the [Higgs] boson.” So it was that the paper’s initial rejection led Higgs to add the feature that helped set him apart from the pack. ...
Within the community of particle physicists it is Higgs’s name that is freely associated with the “Boson that has been named after [him].” That is how it is likely to remain. A historian of science might argue, as some have, that misnomers pollute this particular part of the record. The massless boson attributed to Goldstone is perhaps more justly credited to Nambu, and indeed is often referred to as the Nambu-Goldstone Boson. The massive boson, which in particle physics is named for Higgs, may be traced to Goldstone’s original paper. Tom Kibble recalled a suggestion that the Higgs Boson “should be called the Goldstone boson, while the Goldstone boson should be called the Nambu boson—though that would be very confusing!” The words on the tomb of President Kennedy will always be attributed to him, though it was Ted Sorensen who wrote them. Their impact and resonance through the years come from the writer and the orator both. So perhaps will be the legacy with this boson. It will be attributed to Higgs, if only because its discovery will be in a particle-physics experiment and that is the name by which that community knows it. ...
Intermission: Mid-1960s: We’ve reached the middle of the 1960s. A theory uniting the electromagnetic and weak forces has been achieved, and the earlier worries about the apparent need for massless force carriers assuaged. This has emerged out of ideas on symmetry being hidden, which had been known in other areas of science, and then applied to relativistic quantum field theory–particle physics. Originally, a theorem due to Jeffrey Goldstone had been thought to show that this could not happen. The loophole in his theorem, which led to the possibility that mass can emerge spontaneously in theories where, initially, there was no mass, has been established independently by six people, who published their work within a few weeks of one another in the summer of 1964. One of the sextet is Peter Higgs, whose name today has become associated with this development, and is best known for its—as yet unproved—consequence: the existence and properties of the “Higgs Boson.” While this is a central focus of particle physics investigation today, later chapters will show that in 1964 the concepts were widely regarded as an interesting mathematical discovery, awaiting some realistic application.

Summary of next chapter (10):

Kibble turns the Higgs Mechanism into a useful tool and teaches Salam the idea, who then incorporates this into the Salam-Ward model of the weak and electromagnetic forces. Weinberg also uses The Mechanism, and publishes a paper, which leads to his Nobel Prize. Salam sees Weinberg’s paper and realizes he’s been scooped. Meanwhile, almost everyone else ignores these ideas.

How does the Higgs Boson create mass? When asked that question today during the press conference, one of the scientists at CERN gave the classic example (except he used journalists in the room gathering around Nobel Prize winners):

Imagine a cocktail party of political party workers who are uniformly distributed across the floor, all talking to their nearest neighbours. The ex-prime minister enters and crosses the room. All of the workers in her neighbourhood are strongly attracted to her and cluster round her. As she moves she attracts the people she comes close to, while the ones she has left return to their even spacing. Because of the knot of people always clustered around her she acquires a greater mass than normal, that is, she has more momentum for the same speed of movement across the room. Once moving she is harder to stop, and once stopped she is harder to get moving again because the clustering process has to be restarted. In three dimensions, and with the complications of relativity, this is the Higgs mechanism.

Hope that helps. The political party workers represent the Higgs field -- though as the Close book emphasizes, the Higgs mechanism described above differs from the Higgs boson. Continuing, here's the Higgs Boson:

Now consider a rumour passing through our room full of uniformly spread political workers. Those near the door hear of it first and cluster together to get the details, then they turn and move closer to their next neighbours who want to know about it too. A wave of clustering passes through the room. It may spread out to all the corners, or it may form a compact bunch which carries the news along a line of workers from the door to some dignitary at the other side of the room. Since the information is carried by clusters of people, and since it was clustering which gave extra mass to the ex-Prime Minister, then the rumour-carrying clusters also have mass. The Higgs boson is predicted to be just such a clustering in the Higgs field.

Sunday, July 01, 2012

The Strange Case of 'Global Warming'???

Oh my -- am I reading this correctly? This is an abstract from one of the talks tomorrow (and I hope the science is more accurate than the date given for the talk -- it should be July 2, not July 3):

The Strange Case of "Global Warming", by Ivar Giaever: Lecture: Monday, 3 July, 12.00 hrs
In 2008 I participated on a panel at the Lindau meeting discussing "Global Warming" and to prepare, I looked into the subject using the internet. I found that the general belief is that the average surface temperature over the whole earth for a whole year has increased from ~288 °K to 288.8 °K in roughly 150 years, i.e. 0.3% and that it is due to increased CO2. If this is true, it means to me that the temperature has been amazingly stable.
In the same time period the number of people has increased in the world from 1.5 billions to over 7 billions. Is it possible that all the paved roads and cut down forests have had an effect on the climate?
The American Physical Society think differently, however, as its public position is:
Emissions of greenhouse gases from human activities are changing the atmosphere in ways that affect the Earth’s climate. Greenhouse gases include carbon dioxide as well as methane, nitrous oxide and other gases. They are emitted from fossil fuel combustion and a range of industrial and agricultural processes.
The evidence is incontrovertible: Global warming is occurring. If no mitigating actions are taken, significant disruptions in the Earth’s physical and ecological systems, social systems, security and human health are likely to occur. We must reduce emissions of greenhouse gases beginning now.
I believe that nothing in science is "incontrovertible" thus, in my view, APS has become a political (or religious?) society. Consequently, I resigned from APS in the fall of 2011.
In this talk I will explain why I became concerned about the climate, and terrified by the one-sided propaganda in the media. In particular I am worried about all the money wasted on alternate energies, when so many children in the world go hungry to bed.
If you still believe that global warming is occurring and that the main cause is CO2 when I have finished this talk, I urge you to argue for two things to save the world:
1. Introduction of nuclear power
2. Limit the population increase by allowing only one child/woman
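For what it's worth, the abstract's percentage arithmetic does check out, but only because the 0.8 K rise is divided by the absolute temperature in kelvin -- a quick sketch (my check, not part of the talk):

```python
# Quick check of the figures quoted in the abstract above:
# a rise from ~288 K to 288.8 K over roughly 150 years.
t_start = 288.0   # K, approximate global mean surface temperature ~150 years ago
t_end = 288.8     # K, approximate global mean surface temperature today

rise = t_end - t_start
pct_of_absolute = rise / t_start * 100  # tiny only relative to absolute zero
print(f"Rise: {rise:.1f} K, or {pct_of_absolute:.2f}% of the absolute temperature")
# → Rise: 0.8 K, or 0.28% of the absolute temperature
```

The same 0.8 K rise looks rather different when measured against the two-to-three-degree danger threshold in the Molina abstract below, rather than against absolute zero.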

Surprise! Fox News highlighted this.

This lecture is just before the one above:

The Science and Policy of Climate Change, by Mario J. Molina: Lecture: Monday, 3 July, 11.30 hrs

Climate change is the most serious environmental challenge facing society in the 21st century. The basic science is clear: the Intergovernmental Panel on Climate Change concluded that there is more than 90% probability that human activities are causing the observed changes in the Earth’s climate in recent decades. The average temperature of the Earth’s surface has increased so far by about 0.8 degrees Celsius since the Industrial Revolution, and the frequency of extreme weather events such as droughts, floods and intense hurricanes is also increasing, most likely as a consequence of this temperature change. There are scientific uncertainties that remain to be worked out, connected with issues such as the feedback effects of clouds and aerosols. Nevertheless, the consensus among experts is that the risk of causing dangerous changes to the climate system increases rapidly if the average temperature rises more than two or three degrees Celsius. Society faces an enormous challenge to effectively reduce greenhouse gas emissions to avoid such dangerous interference with the climate system. This goal can only be achieved by taking simultaneously measures such as significantly increasing energy efficiency in the transportation, building, industrial and other sectors, using renewable energy sources such as solar, wind, geothermal and biomass, and possibly developing and using safer nuclear energy power plants.

These are Nobel Prize winners in physics. I thought physics and its adherence to the scientific method was supposed to be free of the kinds of controversy over models, politics, etc. that plague economics.

Thursday, May 03, 2012

Things That Will Change the World

Overcoming spinal cord injuries (I learned a lot about the spinal cord from the first segment, e.g. the systems that control walking are at the base of the spinal column, the brain has little to do with it), remote brain-controlled mechanical hands, self-directed robots, and so on:

Things That Will Change the World - and Blow Your Mind, Wednesday, May 2, 2012, 9:30 AM - 10:45 AM

Speakers:

  • Joel Burdick, Professor of Mechanical Engineering and Professor of Bioengineering, California Institute of Technology
  • Nathan Michael, Research Assistant Professor, Department of Mechanical Engineering and Applied Mechanics, University of Pennsylvania
  • Jay Schnitzer, Director, Defense Sciences Office, Defense Advanced Research Projects Agency

Moderator:

  • Richard Sandler, Executive Vice President, Milken Family Foundation; Partner, Law Offices of Maron & Sandler

Friday, January 20, 2012

"Fracking Would Emit Large Quantities of Greenhouse Gases"

Another reason to be suspicious of fracking:

Fracking Would Emit Large Quantities of Greenhouse Gases, by Mark Fischetti, Scientific American: Add methane emissions to the growing list of environmental risks posed by fracking.
Opposition to the hydraulic fracturing of deep shales to release natural gas rose sharply last year over worries that the large volumes of chemical-laden water used in the operations could contaminate drinking water. Then, in early January, earthquakes in Ohio were blamed on the disposal of that water in deep underground structures. Yesterday, two Cornell University professors said at a press conference that fracking releases large amounts of natural gas, which consists mostly of methane, directly into the atmosphere—much more than previously thought. ...
Molecule for molecule, methane traps 20 to 25 times more heat in the atmosphere than does carbon dioxide. The effect dissipates faster, however: airborne methane remains in the atmosphere for about 12 years before being scrubbed out by ongoing chemical reactions, whereas CO2 lasts 30 to 95 years. Nevertheless, recent data from the two Cornell scientists and others indicate that within the next 20 years, methane will contribute 44 percent of the greenhouse gas load produced by the U.S. Of that portion, 17 percent will come from all natural gas operations. ...
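As a reading aid (my arithmetic, not the article's): the "17 percent of that portion" compounds with the 44 percent figure, so natural gas operations would account for roughly 7.5 percent of the total projected U.S. greenhouse load:

```python
# Illustrative arithmetic on the shares quoted in the article above.
methane_share_of_total = 0.44    # methane's projected share of the U.S. greenhouse gas load
gas_ops_share_of_methane = 0.17  # natural gas operations' share of that methane portion

gas_ops_share_of_total = methane_share_of_total * gas_ops_share_of_methane
print(f"Natural gas operations: ~{gas_ops_share_of_total:.1%} of the total load")
# → Natural gas operations: ~7.5% of the total load
```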

Monday, December 12, 2011

Comparing Infinities

This has been bugging me all day:

Comparisons involving infinitely large numbers are notoriously tricky. ... To grasp the mathematical challenge, imagine that you’re a contestant on Let’s Make a Deal and you’ve won an unusual prize: an infinite collection of envelopes, the first containing $1, the second $2, the third $3, and so on. As the crowd cheers, Monty chimes in to make you an offer. Either keep your prize as is, or elect to have him double the contents of each envelope. At first it seems obvious that you should take the deal. “Each envelope will contain more money than it previously did,” you think, “so this has to be the right move.” And if you had only a finite number of envelopes, it would be the right move. To exchange five envelopes containing $1, $2, $3, $4, and $5 for envelopes with $2, $4, $6, $8, and $10 makes unassailable sense. But after another moment’s thought, you start to waver, because you realize that the infinite case is less clear-cut. “If I take the deal,” you think, “I’ll wind up with envelopes containing $2, $4, $6, and so on, running through all the even numbers. But as things currently stand, my envelopes run through all whole numbers, the evens as well as the odds. So it seems that by taking the deal I’ll be removing the odd dollar amounts from my total tally. That doesn’t sound like a smart thing to do.” Your head starts to spin. Compared envelope by envelope, the deal looks good. Compared collection to collection, the deal looks bad.
Your dilemma illustrates the kind of mathematical pitfall that makes it so hard to compare infinite collections. The crowd is growing antsy, you have to make a decision, but your assessment of the deal depends on the way you compare the two outcomes.
A similar ambiguity afflicts comparisons of a yet more basic characteristic of such collections: the number of members each contains. ... Which are more plentiful, whole numbers or even numbers? Most people would say whole numbers, since only half of the whole numbers are even. But your experience with Monty gives you sharper insight. Imagine that you take Monty’s deal and wind up with all even dollar amounts. In doing so, you wouldn’t return any envelopes nor would you require any new ones... You conclude, therefore, that the number of envelopes required to accommodate all whole numbers is the same as the number of envelopes required to accommodate all even numbers—which suggests that the populations of each category are equal (Table 7.1). And that’s weird. By one method of comparison—considering the even numbers as a subset of the whole numbers—you conclude that there are more whole numbers. By a different method of comparison—considering how many envelopes are needed to contain the members of each group—you conclude that the set of whole numbers and the set of even numbers have equal populations.

Table 7.1: Every whole number is paired with an even number, and vice versa, suggesting that the quantity of each is the same.

You can even convince yourself that there are more even numbers than there are whole numbers. Imagine that Monty offered to quadruple the money in each of the envelopes you initially had, so there would be $4 in the first, $8 in the second, $12 in the third, and so on. Since, again, the number of envelopes involved in the deal stays the same, this suggests that the quantity of whole numbers, where the deal began, is equal to that of numbers divisible by four (Table 7.2), where the deal wound up. But such a pairing, marrying off each whole number to a number that’s divisible by 4, leaves an infinite set of even bachelors—the numbers 2, 6, 10, and so on—and thus seems to imply that the evens are more plentiful than the wholes.

Table 7.2: Every whole number is paired with every other even number, leaving an infinite set of even bachelors, suggesting that there are more evens than wholes.

From one perspective, the population of even numbers is less than that of whole numbers. From another, the populations are equal. From another still, the population of even numbers is greater than that of the whole numbers. And it’s not that one conclusion is right and the others wrong. There simply is no absolute answer to the question of which of these kinds of infinite collections are larger. The result you find depends on the manner in which you do the comparison. ...
Physicists call this the measure problem, a mathematical term whose meaning is well suggested by its name. ... Solving the measure problem is imperative.
[From Brian Greene, The Hidden Reality: Parallel Universes and the Deep Laws of the Cosmos (Random House, 2011), Kindle locations 3609-3624.]
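Greene's envelope deals are ordinary one-to-one pairings, and a small sketch (my illustration, not from the book) shows how the "double" and "quadruple" deals behave on a finite initial segment:

```python
# Compare initial segments of the whole numbers and the even numbers
# under the two pairings Greene describes.
N = 10
wholes = list(range(1, N + 1))  # 1, 2, ..., N

# Pairing 1: n <-> 2n ("double every envelope") matches every whole
# number to a distinct even number -- no envelope left over.
pairing_double = {n: 2 * n for n in wholes}
assert len(set(pairing_double.values())) == len(wholes)

# Pairing 2: n <-> 4n ("quadruple every envelope") also matches every
# whole number to a distinct even number, but now the evens 2, 6, 10, ...
# are left unmatched -- Greene's "even bachelors."
pairing_quadruple = {n: 4 * n for n in wholes}
evens_up_to_4N = set(range(2, 4 * N + 1, 2))
bachelors = sorted(evens_up_to_4N - set(pairing_quadruple.values()))
print(bachelors[:5])  # → [2, 6, 10, 14, 18]
```

Cantor's resolution, which Greene is building toward, is that two infinite sets have the same size whenever any one-to-one pairing between them exists; leftover bachelors under some other pairing don't count against it.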

Monday, August 29, 2011

Paul Krugman: Republicans Against Science

The GOP's willful ignorance and anti-intellectualism is getting worse:

Republicans Against Science, by Paul Krugman, Commentary, NY Times: Jon Huntsman Jr., a former Utah governor and ambassador to China, isn’t a serious contender for the Republican presidential nomination. And that’s too bad, because Mr. Huntsman has been willing to say the unsayable about the G.O.P. — namely, that it is becoming the “anti-science party.” This is an enormously important development. And it should terrify us.
To see what Mr. Huntsman means, consider recent statements by the two men who actually are serious contenders for the G.O.P. nomination: Rick Perry and Mitt Romney.
Mr. Perry ... recently made headlines by dismissing evolution as “just a theory,” one that has “got some gaps in it” — an observation that will come as news to the vast majority of biologists. But what really got people’s attention was what he said about climate change: “I think there are a substantial number of scientists who have manipulated data so that they will have dollars rolling into their projects. And I think we are seeing almost weekly, or even daily, scientists are coming forward and questioning the original idea that man-made global warming is what is causing the climate to change.”
That’s a remarkable statement — or maybe the right adjective is “vile.” ... In fact, if you follow climate science at all you know that the main development over the past few years has been growing concern that projections of future climate are underestimating the likely amount of warming. ...
So how has Mr. Romney ... responded to Mr. Perry’s challenge? In trademark fashion: By running away. In the past, Mr. Romney ... has strongly endorsed the notion that man-made climate change is a real concern. But, last week, he softened that to a statement that he thinks the world is getting hotter, but “I don’t know that” and “I don’t know if it’s mostly caused by humans.” Moral courage!
Of course, we know what’s motivating Mr. Romney’s sudden lack of conviction. According to Public Policy Polling, only 21 percent of Republican voters in Iowa believe in global warming (and only 35 percent believe in evolution). Within the G.O.P., willful ignorance has become a litmus test for candidates, one that Mr. Romney is determined to pass at all costs. ... And the deepening anti-intellectualism of the political right, both within and beyond the G.O.P., extends far beyond the issue of climate change. ...
Now, we don’t know who will win next year’s presidential election. But the odds are that one of these years the world’s greatest nation will find itself ruled by a party that is aggressively anti-science, indeed anti-knowledge. And, in a time of severe challenges — environmental, economic, and more — that’s a terrifying prospect.

Friday, May 06, 2011

The Attention Deficit Society: What Technology Is Doing to Our Brains

I haven't had a chance to watch it yet -- I've been distracted with other things -- but several people told me they enjoyed this session at the Global Conference:

The Attention Deficit Society: What Technology Is Doing to Our Brains

Speakers:

  • Nicholas Carr, Author, "The Shallows: What the Internet Is Doing to Our Brains"
  • Cathy Davidson, Ruth F. DeVarney Professor of English and John Hope Franklin Humanities Institute Professor of Interdisciplinary Studies, Duke University
  • Clifford Nass, Thomas M. Storke Professor, Stanford University
  • Sherry Turkle, Abby Rockefeller Mauzé Professor of the Social Studies of Science and Technology, MIT

Moderator:

Monday, March 28, 2011

Not Only the Fittest Survive

I wonder if this applies to markets as well (not sure if it's the same thing):

Research shows not only the fittest survive, EurekAlert: Darwin's notion that only the fittest survive has been called into question by new research published in Nature.
A collaboration between the Universities of Exeter and Bath in the UK, with a group from San Diego State University in the US, challenges our current understanding of evolution by showing that biodiversity may evolve where previously thought impossible.
The work represents a new approach to studying evolution that may eventually lead to a better understanding of the diversity of bacteria that cause human diseases.
Conventional wisdom has it that for any given niche there should be a best species, the fittest, that will eventually dominate to exclude all others.
This is the principle of survival of the fittest. Ecologists often call this idea the 'competitive exclusion principle' and it predicts that complex environments are needed to support complex, diverse populations.
Professor Robert Beardmore, from the University of Exeter, said: "Microbiologists have tested this principle by constructing very simple environments in the lab to see what happens after hundreds of generations of bacterial evolution, about 3,000 years in human terms. It had been believed that the genome of only the fittest bacteria would be left, but that wasn't their finding. The experiments generated lots of unexpected genetic diversity."
This test tube biodiversity proved controversial when first observed and had been explained away with claims that insufficient time had been allowed to pass for a clear winner to emerge.
The new research shows the experiments were not anomalies.
Professor Laurence Hurst, of the University of Bath, said: "Key to the new understanding is the realization that the amount of energy organisms squeeze out of their food depends on how much food they have. Give them abundant food and they use it inefficiently. When we combine this with the notion that organisms with different food-utilizing strategies are also affected in different ways by genetic mutations, then we discover a new principle, one in which both the fit and the unfit coexist indefinitely."
Dr Ivana Gudelj, also from the University of Exeter, said: "The fit use food well but they aren't resilient to mutations, whereas the less efficient, unfit consumers are maintained by their resilience to mutation. If there's a low mutation rate, survival of the fittest rules, but if not, lots of diversity can be maintained.
"Rather nicely, the numbers needed for the principle to work accord with those enigmatic experiments on bacteria. Their mutation rate seems to be high enough for both fit and unfit to be maintained."
Dr. David Lipson of San Diego State University, concluded: "Earlier work showed that opposing food utilization strategies could coexist in complex environments, but this is the first explanation of how trade-offs, like the one we studied between growth rate and efficiency, can lead to stable diversity in the simplest possible of environments."

Tuesday, March 22, 2011

"How Free Is Your Will?"

Something a bit different (though the part at the end relates to economic choices). Does this actually say anything about free will?

How Free Is Your Will?, by Daniela Schiller and David Carmel, Scientific American: ...Scientists from UCLA and Harvard -- Itzhak Fried, Roy Mukamel and Gabriel Kreiman -- have taken an audacious step ... challenging conventional notions of free will. ...
Fried and his colleagues implanted electrodes in twelve patients, recording from a total of 1019 neurons. They adopted an experimental procedure that Benjamin Libet, a pioneer of research on free will at the University of California, San Francisco, developed almost thirty years ago: They had their patients look at a hand sweeping around a clock-face, asked them to press a button whenever they wanted to, and then had them indicate where the hand had been pointing when they decided to press the button. This provides a precise time for an action (the push) as well as the decision to act. With these data the experimenters can then look for neurons whose activity correlated with the will to act. ...
[A]bout a quarter of these neurons began to change their activity before the time patients declared as the moment they felt the urge to press the button. The change began as long as a second and a half before the decision..., this activity was robust enough that the researchers could predict with over 80 percent accuracy not only whether a movement had occurred, but when the decision to make it happened. ...
Even with the above caveats,... these findings are mind-boggling. They indicate that some activity in our brains may significantly precede our awareness of wanting to move. Libet suggested that free will works by vetoing: volition (the will to act) arises in neurons before conscious experience does, but conscious will can override it and prevent unwanted movements.
Other interpretations might require that we reconstruct our idea of free will. Rather than a linear process in which decision leads to action, our behavior may be the bottom-line result of many simultaneous processes: We are constantly faced with a multitude of options for what to do right now – switch the channel? Take a sip from our drink? Get up and go to the bathroom? But our set of options is not unlimited (i.e., the set of options we just mentioned is unlikely to include “launch a ballistic missile”). Deciding what to do and when to do it may be the result of a process in which all the currently-available options are assessed and weighted. Rather than free will being the ability to do anything at all, it might be an act of selection from the present range of options. And the decision might be made before you are even aware of it. ...

Tuesday, March 08, 2011

Sleep Loss and "Optimism Bias"

If you are short on sleep, don't gamble:

Short on sleep, brain optimistically favors long odds, by Katherine Harmon, Scientific American: ...[A] new study shows how just one night of missed sleep can make people more likely to chase big gains while risking even larger losses...
A team of Duke University researchers examined the brains of 29 healthy volunteers using functional MRI ... while the subjects performed a variety of gambling tasks. After a full night of sleep, participants behaved like most people tend to in the real world: guarding against financial losses and cautiously pursuing gains.
But when deprived of a night's sleep (kept awake in the lab from 6 p.m. until 6 a.m.), the volunteers "moved from defending against losses to seeking increased gains," the researchers reported..., a condition the team describes as "an optimism bias." ...
Upon examining the fMRIs, the researchers noticed that when making financial decisions in the gambling games, sleep-deprived individuals had greater activation in the ventromedial prefrontal cortex, an area of the brain associated with fear, risk and decision-making... The sleep-deprived group also showed a drop in activity in the anterior insula, a region implicated in emotion and addiction...
The findings suggest that an all-night stint at the blackjack table or logged into an online poker game can be an extra gamble. ... These effects could also extend to areas where the stakes are even higher, such as the trading floor or the hospital, where workers often perform their duties when they are less than well rested. ...
And because these effects seem to run deeper than just apparent torpor, a shot of espresso—or even stronger stimulants—might not short-circuit the sleep-deprived brain's tendency toward unwarranted optimism...

Since we're on the topic, how do scientists know how much sleep people need?

Tuesday, January 18, 2011

"Loss of Reflectivity in the Arctic Doubles Estimate of Climate Models"

Global warming is a bigger problem than we thought -- and it was already bigger than we seem to be able to handle:

Loss of reflectivity in the Arctic doubles estimate of climate models, EurekAlert: A new analysis of the Northern Hemisphere's "albedo feedback" over a 30-year period concludes that the region's loss of reflectivity due to snow and sea ice decline is more than double what state-of-the-art climate models estimate.
The findings are important, researchers say, because they suggest that Arctic warming amplified by the loss of reflectivity could be even more significant than previously thought. ...

Thursday, January 13, 2011

Brain Images Predict Video Game Performance

A between classes quickie:

Researchers can predict your video game aptitude by imaging your brain, EurekAlert: Researchers report that they can predict "with unprecedented accuracy" how well you will do on a complex task such as a strategic video game simply by analyzing activity in a specific region of your brain.
The findings, published in the online journal PLoS ONE, offer detailed insights into the brain structures that facilitate learning, and may lead to the development of training strategies tailored to individual strengths and weaknesses.
The new approach used established brain imaging techniques in a new way. Instead of measuring how brain activity differs before and after subjects learn a complex task, the researchers analyzed background activity in the basal ganglia, a group of brain structures known to be important for procedural learning, coordinated movement and feelings of reward.
Using magnetic resonance imaging and a method known as multivoxel pattern analysis, the researchers found significant differences in patterns of a particular type of MRI signal, called T2*, in the basal ganglia of study subjects. These differences enabled researchers to predict between 55 and 68 percent of the variance (differences in performance) among the 34 people who later learned to play the game.
"There are many, many studies, hundreds perhaps, in which psychometricians, people who do the quantitative analysis of learning, try to predict from SATs, GREs, MCATS or other tests how well you're going to succeed at something," said University of Illinois psychology professor and Beckman Institute director Art Kramer, who led the research. These methods, along with studies that look at the relative size of specific-brain structures, have had some success predicting learning, Kramer said, "but never to this degree in a task that is so complex." ...
After having their brains imaged, participants spent 20 hours learning to play Space Fortress, a video game developed at the University of Illinois in which players try to destroy a fortress without losing their own ship to one of several potential hazards. None of the subjects had much experience with video games prior to the study.
The game, which was designed to test participants' real-world cognitive skills, is quite challenging, Kramer said. ... The findings should not be interpreted to mean that some people are destined to succeed or fail at a given task or learning challenge, however, Kramer said. "We know that many of these components of brain structure and function are changeable," he said.
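"Predicting 55 to 68 percent of the variance" refers to the standard R-squared measure; here is a minimal sketch of how it's computed, with made-up numbers (the study's actual data are not reproduced here):

```python
# R^2: the fraction of variance in performance explained by a prediction.
# The scores below are invented for illustration; the study reported
# R^2 between 0.55 and 0.68 from basal-ganglia MRI patterns.
actual    = [10.0, 14.0, 12.0, 20.0, 18.0, 16.0]  # hypothetical game scores
predicted = [11.0, 13.0, 13.0, 19.0, 17.0, 15.0]  # hypothetical predictions

mean = sum(actual) / len(actual)
ss_total = sum((a - mean) ** 2 for a in actual)                  # total variance (sum of squares)
ss_resid = sum((a - p) ** 2 for a, p in zip(actual, predicted))  # unexplained residual
r_squared = 1 - ss_resid / ss_total
print(f"R^2 = {r_squared:.2f}")
# → R^2 = 0.91
```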

[Not sure this has much to do with economics, but it's all I have right now -- feel free to talk about whatever in comments.]

Saturday, September 04, 2010

Stephen Hawking's Big Bang Gaps

I probably should have posted this discussion of naked CDS from Yeon-Koo Che and Rajiv Sethi, but for some reason I felt like something different from the usual fare:

Stephen Hawking's big bang gaps, by Paul Davies, CIF: Cosmologists are agreed that the universe began with a big bang 13.7 billion years ago. People naturally want to know what caused it. A simple answer is nothing: not because there was a mysterious state of nothing before the big bang, but because time itself began then – that is, there was no time "before" the big bang. The idea is by no means new. In the fifth century, St Augustine of Hippo wrote that "the universe was created with time and not in time".
Religious people often feel tricked by this logic. They envisage a miracle-working God dwelling within the stream of time for all eternity and then, for some inscrutable reason, making a universe (perhaps in a spectacular explosion) at a specific moment in history.
That was not Augustine's God, who transcended both space and time. Nor is it the God favored by many contemporary theologians. In fact, they long ago coined a term for it – "god-of-the-gaps" – to deride the idea that when science leaves something out of account, then God should be invoked to plug the gap. The origin of life and the origin of consciousness are favorite loci for a god-of-the-gaps, but the origin of the universe is the perennial big gap.
In his new book, Stephen Hawking reiterates that there is no big gap in the scientific account of the big bang. The laws of physics can explain, he says, how a universe of space, time and matter could emerge spontaneously, without the need for God. And most cosmologists agree: we don't need a god-of-the-gaps to make the big bang go bang. It can happen as part of a natural process. A much tougher problem now looms, however. What is the source of those ingenious laws that enable a universe to pop into being from nothing?
Traditionally, scientists have supposed that the laws of physics were simply imprinted on the universe at its birth, like a maker's mark. As to their origin, well, that was left unexplained.
In recent years, cosmologists have shifted position somewhat. If the origin of the universe was a law rather than a supernatural event, then the same laws could presumably operate to bring other universes into being. The favored view now, and the one that Hawking shares, is that there were in fact many bangs, scattered through space and time, and many universes emerging therefrom, all perfectly naturally. The entire assemblage goes by the name of the multiverse.
Our universe is just one infinitesimal component amid this vast – probably infinite – multiverse, that itself had no origin in time. So according to this new cosmological theory, there was something before the big bang after all – a region of the multiverse pregnant with universe-sprouting potential.
A refinement of the multiverse scenario is that each new universe comes complete with its very own laws – or bylaws, to use the apt description of the cosmologist Martin Rees. Go to another universe, and you would find different bylaws applying. An appealing feature of variegated bylaws is that they explain why our particular universe is uncannily bio-friendly; change our bylaws just a little bit and life would probably be impossible. The fact that we observe a universe "fine-tuned" for life is then no surprise: the more numerous bio-hostile universes are sterile and so go unseen.
So is that the end of the story? Can the multiverse provide a complete and closed account of all physical existence? Not quite. The multiverse comes with a lot of baggage, such as an overarching space and time to host all those bangs, a universe-generating mechanism to trigger them, physical fields to populate the universes with material stuff, and a selection of forces to make things happen. Cosmologists embrace these features by envisaging sweeping "meta-laws" that pervade the multiverse and spawn specific bylaws on a universe-by-universe basis. The meta-laws themselves remain unexplained – eternal, immutable transcendent entities that just happen to exist and must simply be accepted as given. In that respect the meta-laws have a similar status to an unexplained transcendent god.
According to folklore, the French physicist Pierre-Simon Laplace, when asked by Napoleon where God fitted into his mathematical account of the universe, replied: "I had no need of that hypothesis." Although cosmology has advanced enormously since the time of Laplace, the situation remains the same: there is no compelling need for a supernatural being or prime mover to start the universe off. But when it comes to the laws that explain the big bang, we are in murkier waters.

Monday, August 23, 2010

"All-Out Geoengineering Still Would Not Stop Sea Level Rise"

Are you counting on geoengineering to solve our greenhouse gas problem?:

All-out geoengineering still would not stop sea level rise, by David Biello, Scientific American: Mimicking volcanoes by throwing particles high into the sky. Maintaining a floating armada of mirrors in space. Burning plant and other organic waste to make charcoal and burying it—or burning it as fuel and burying the CO2 emissions. Even replanting trees. All have been mooted as potential methods of "geoengineering"—"deliberate large-scale manipulation of the planetary environment," as the U.K.'s Royal Society puts it.
The goal, of course, is to cool the planet by removing heat-trapping gases from the atmosphere or reflecting sunlight away. But rising temperatures are just one impact of our seemingly limitless emission of greenhouse gases, largely carbon dioxide, into the atmosphere. Arguably a more devastating consequence would be the rise of the seas as warmer waters expand and melting icecaps fill ocean basins higher, potentially swamping nations and the estimated 150 million people living within one meter of high tide. Can geoengineering hold back that tide?
That's what scientists attempted to assess with computer models in a paper published online August 23 in Proceedings of the National Academy of Sciences. In their words, "sea level rise by 2100 will likely be 30 centimeters higher than 2000 levels despite all but the most aggressive geoengineering." In large part, that's because the ocean has a lot of thermal inertia: it only slowly warms as a result of increasing greenhouse gas levels—and it will only slowly cool down again. ...
Perhaps the only way to reduce warming enough to minimize the rise of the oceans is an all-out effort that also includes burning biomass as fuel (either to replace coal or gasoline or both) and pairing it with CO2 capture and storage. Together, they could suck down greenhouse gas levels by 180 ppm—more than enough to bring us below pre-industrial levels. As a result, sea level rise is held to just 10 centimeters by 2100, according to the authors' modeling.
Such extensive geoengineering seems impractical given its economic (and environmental) cost. But interfering with the planet's carbon cycle—something we're already doing by adding so much CO2 to the atmosphere—appears to be the better bet, even if only by curbing current CO2 emissions. Otherwise, we're leaving our descendants one heck of a mess or, as the authors put it, "substituting geoengineering for greenhouse gas emission abatement or removal constitutes a conscious risk transfer to future generations."

Tuesday, August 10, 2010

Is This "A Common Feature of Biological Decision-Making"?

Humans are very sophisticated, calculating, rational decision-makers:
Brainless slime mould makes decisions like humans, Discover: A couple arrive at a fancy restaurant and they’re offered the wine list. This establishment only has two bottles on offer, one costing £5 and the other costing £25. The second bottle seems too expensive and the diners select the cheaper one. The next week, they return. Now, there’s a third bottle on the list but it’s a vintage, priced at a staggering £1,000. Suddenly, the £25 bottle doesn’t seem all that expensive, and this time, the diners choose it instead.
Businesses use this tactic all the time – an extremely expensive option is used to make mid-range ones suddenly seem like attractive buys. The strategy only works because we humans like to compare our options, rather than paying attention to their absolute values. In the wine example, the existence of the third bottle shouldn’t matter – the £25 option costs the same amount either way, but in one scenario it looks like a rip-off and in another, it looks like a steal. The simple fact is that to us, a thing’s value depends on the things around it. Economists often refer to this as “irrational”.
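The relative-comparison logic can be captured in a toy model. (This is entirely my own illustration, not anything from the research being described: the decision rule and numbers are invented stand-ins.) Suppose a diner assumes quality tracks price, but rules out any bottle that feels expensive relative to the rest of the menu:

```python
def pick_wine(prices):
    """Toy decoy-effect chooser: assume quality tracks price, but reject any
    bottle priced above the menu's average (it 'feels expensive' compared
    with the alternatives). Among the remaining bottles, take the priciest."""
    mean = sum(prices) / len(prices)
    affordable = [p for p in prices if p <= mean]
    return max(affordable)

pick_wine([5, 25])          # mean is 15, so the £25 bottle feels expensive → 5
pick_wine([5, 25, 1000])    # mean is ~343, so £25 now feels cheap → 25
```

The £25 bottle's absolute price never changes; only the menu around it does — which is exactly the sense in which economists call the choice "irrational."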
But if that’s the case, we’re not alone in our folly. Other animals, from birds to bees, make choices in the same way. Now, Tanya Latty and Madeleine Beekman from the University of Sydney, have found the same style of decision-making in a creature that’s completely unlike any of these animals – the slime mould, Physarum polycephalum. It’s a single-celled, amoeba-like creature that doesn’t have a brain. ... [...continue reading...]

Monday, July 19, 2010

Will Geoengineering Make Things Better or Worse?

Is geoengineering the answer to our global warming problems?:

Is the cure (geoengineering) worse than the disease (global warming)?, by David Biello: If there's one thing more potentially contentious than the international politics of global warming..., it's the politics of the most radical suggestion to solve it: geoengineering. ...
Geophysicist Kate Ricke of Carnegie Mellon University and her colleagues show that one of the more feasible geoengineering methods—injecting reflective particles into the atmosphere to mimic the world-cooling effects of a volcanic eruption—will have effects that vary from place to place. So, for example, India might be rendered too cold (and wet) by a level of particle injection that's just right for its neighbor China while setting the levels to India's liking would toast the Middle Kingdom.
What's worse, the computer models that show that such injections might work in the short term also show that they will change global weather patterns by making part of the atmosphere more stable—and therefore less likely to promote storms. That means less rainfall to go around—and these side effects become worse with time. ...

Engineers used to show up in comments and tell me that, unlike economists, they know how to build systems in ways that prevent the chance of catastrophic collapse like we had in the financial system. Then the oil spill in the gulf happened and they've become much more scarce. Even if the models said this will work without any worrisome side effects or geographic differences, the stakes are too high to use that result as an excuse to delay action on the global warming problem. We need to start solving this problem now and if we are lucky, we won't have to depend upon the geoengineers to prevent catastrophic effects from global warming.

Thursday, May 13, 2010

Intelligently Directed Evolution

Chemical and biological engineers are hoping to be able to do intelligent design within the next few decades, but for now, for the most part, they'll have to let evolution do the work:

Directed Evolution, by Anne Trafton, MIT News Office: In nature, evolution takes place over eons... But evolution can also happen on a small and fast scale in the laboratory.
The approach is called “directed evolution,” and scientists are using it to generate proteins that do not occur in nature — for example, cancer drugs, new microbial enzymes for converting agricultural waste to fuel, or imaging agents for magnetic resonance imaging (MRI).
Most protein structures are so complex that it’s nearly impossible to predict how altering their structure will affect their function. So the trial-and-error approach of directed evolution is usually the fastest way to come up with a new protein with desirable traits, says Dane Wittrup, an MIT professor of chemical and biological engineering...
Such experiments often yield proteins that researchers never would have come up with on their own. ...For example, let’s say you want to create an antibody that will bind to a certain protein found on tumor cells. You start with a test tube full of hundreds of millions of yeast cells, engineered to express a variety of mammalian antibodies on their surfaces. Then you add probes containing the molecule you want your new protein to target, allowing you to pick out the proteins that bind to it.
Next, you take the proteins that bind the best and mutate them, in hopes of generating something even better. This can be done by irradiating the cells, or by forcing them to replicate their DNA in a way that is prone to mistakes. Those new proteins are screened the same way, and each time, the best candidates are used to create more proteins. “At the end, you have proteins that bind very tightly and specifically,” says Wittrup. “In the lab, it’s the same rules as natural evolution, but we get to set the criteria for who survives.”
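The screen-mutate-repeat cycle described above can be sketched as a toy simulation. (This is my own illustration under loud assumptions: the target string, fitness function, and parameters are invented stand-ins for binding affinity and error-prone replication, not real protein chemistry.)

```python
import random

random.seed(0)

TARGET = "BINDING-SITE"   # hypothetical "ideal binder" the screen selects for
ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ-"

def fitness(seq):
    """Stand-in for binding affinity: count positions matching the target."""
    return sum(a == b for a, b in zip(seq, TARGET))

def mutate(seq, rate=0.1):
    """Error-prone replication: each position occasionally changes at random."""
    return "".join(random.choice(ALPHABET) if random.random() < rate else c
                   for c in seq)

# Start from a random library, like the initial pool of surface-displayed antibodies.
population = ["".join(random.choice(ALPHABET) for _ in TARGET)
              for _ in range(200)]

for generation in range(60):
    # Screen: keep the 20 best binders.
    survivors = sorted(population, key=fitness, reverse=True)[:20]
    # Diversify: retain the survivors and add error-prone copies of each.
    population = survivors + [mutate(s) for s in survivors for _ in range(9)]

best = max(population, key=fitness)   # after many rounds, at or near a perfect match
```

As in the lab, the rules are those of natural evolution — variation plus selection — except that the experimenter's screen decides who survives each round.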
Wittrup and his students recently created a new antibody that binds tightly to tumor cells and to a radioactive compound used for chemotherapy, potentially allowing for very precise targeting of the cancer treatment.
First developed about 15 years ago, directed evolution has become ... easy enough that a first-year graduate student can produce a suitable protein in a couple of months, Wittrup says.
He and others at MIT ... have also tried to design proteins ... using computer models to predict how changes in a protein’s sequence will affect its structure and function. In 2007, their simulation successfully produced a new version of the cancer drug cetuximab that binds to its target with 10 times greater affinity than the original. However, this approach is very expensive and only works when researchers start with a great deal of information about the protein interactions being modeled.
“In a limited way, we could do rational design,” says Wittrup. “Fifty years from now, maybe everyone will be doing that.”

Sunday, May 09, 2010

Neanderthal Science

Are you part Neanderthal? The answer isn't as clear as you may have been led to believe, and this illustrates a problem I see in economics as well. There is often a large difference in the way academic results are reported in the news and what the underlying academic research actually says (this happens frequently, especially with working papers that have not yet been through the refereeing process). The academic work is often much more qualified and tentative than the way it is presented in the media. The problem is that even when subsequent research calls into question or overturns the original work, many of these "facts" live on:

All in the (human) family?, by Rosemary Joyce: Big news in anthropology this past week:... we are all 1% to 4% Neanderthal – or rather, humans of non-African ancestry are. Or maybe not.
As Serge Bloch of the New York Times framed the story, there are “cavemen among us” because “the species most likely had a dalliance or two in the Middle East 60,000 to 100,000 years ago”.
Nicholas Wade’s science story for the Times played it somewhat straighter, but still went for the sex angle with the headline “Signs of Neanderthals Mating with Humans”.
The distance from the more clinical “mating” to Bloch’s cartoon of a Neanderthal man holding a club offering flowers to a woman in a dainty skirt may seem like the span from science to popular imagination. But as UCSC Professor of Anthropology Diane Gifford-Gonzalez, Berkeley Anthropology Professor Margaret Conkey, and University of Southampton Professor Stephanie Moser have all shown in different ways, the science of human origins is drenched in the same images as the popular press.
So I have to ask: why is the man in Serge Bloch’s cross-species couple the Neanderthal? Shades of Clan of the Cave Bear! Apparently, in the popular imagination it takes a more evolved woman to make a husband out of a man…
The researchers sequencing Neanderthal DNA from three fragments of bone recovered from a Croatian cave have reportedly completed 60% of the Neanderthal genome. And while other researchers applaud the technical work done at the Max Planck Institute for Evolutionary Anthropology in Germany, they are cautious about the interpretation of the data.
The problems start with the proposed time and place for Neanderthal-human romance: not in the Europe of 40,000 to 30,000 years ago imagined in Clan of the Cave Bear, but in the Middle East, and at least 20,000 years earlier, maybe as much as 60,000 years before the period when we know Neanderthal and early modern humans co-existed in Europe. To quote the Times again, “There is much less archaeological evidence for an overlap between modern humans and Neanderthals at this time and place.”
The Times draws the line of disagreement along disciplinary lines:
Geneticists have been making increasingly valuable contributions to human prehistory, but their work depends heavily on complex mathematical statistics that make their arguments hard to follow. And the statistical insights, however informative, do not have the solidity of an archaeological fact.
Archaeologists do have well-developed models for recognizing human and Neanderthal populations in Europe during their period of overlap through different stone tools and other cultural features. It is the lack of such well-defined models for the Middle East of 100,000 to 60,000 years ago that gives archaeologists pause.
As an archaeologist, I found the idea that archaeological “facts” have solidity interesting for other reasons entirely. Like the image of the club-toting Neanderthal with stubble on his chin, it is a commonplace of everyday understanding of my discipline. And it is not quite true, or not true in the way that writers think it is.
The image of “solid” archaeological facts stems from the idea that our discipline studies hard, visible things that everyone can agree about. And there are lots of things involved in every archaeological analysis. But we quarrel all the time about what exactly they mean; how best to measure them and quantify them; and how the solid things in archaeology relate to the not-so-solid theories we develop.
And increasingly, our studies are not limited to, or even dominated by, the “solid facts” of popular imagination of archaeology. Instead, archaeologists today may study microscopic grains of starch invisible to the naked eye, or the traces of past human actions like sweeping a dirt floor visible under a microscope, or simply the chemical traces left behind when people sit in one place or do some everyday task in a particular location.
So, are we “part caveman”? The question is meaningless. If the findings of the Neanderthal genome sequencing hold up, they will tell us that the history of humans and our closest relatives was even more intimate than many had thought.
But the popular image of the crude Neanderthal should long ago have been set aside, replaced by our understanding of this human species as a cold-adapted contemporary of early modern humans. The visible differences in Neanderthal stature and facial shape would not necessarily have given a contemporary human pause. Those humans occupied the caves of Europe that gave us the scene for our cave man image as much as Neanderthals did.
As Stephanie Moser has shown us, we have populated those caves in our professional and popular imagination... Less reflections of what the “solid facts” of archaeology tell us than mirrors reflecting our own vision of our past, the cavemen are us.

Saturday, May 08, 2010

"The Universe on a String"

Friday, April 09, 2010

Is Our Universe in a Wormhole?

Here's the question:

Could our universe be located within the interior of a wormhole which itself is part of a black hole that lies within a much larger universe?

And an attempt at the answer:

Our universe at home within a larger universe?, EurekAlert: ...Such a scenario in which the universe is born from inside a wormhole (also called an Einstein-Rosen Bridge) is suggested in a paper from Indiana University theoretical physicist Nikodem Poplawski in Physics Letters B.
Poplawski takes advantage of the Euclidean-based coordinate system called isotropic coordinates to describe the gravitational field of a black hole and to model the radial geodesic motion of a massive particle into a black hole.
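For readers wondering what "isotropic coordinates" buys you here, a gloss (mine, not the paper's): in units with G = c = 1 and mass M, the Schwarzschild exterior can be written in isotropic coordinates as

```latex
ds^2 = -\left(\frac{1 - M/2r}{1 + M/2r}\right)^{2} dt^2
       + \left(1 + \frac{M}{2r}\right)^{4} \left(dr^2 + r^2\, d\Omega^2\right)
```

The spatial part is a flat Euclidean metric multiplied by a single conformal factor, which is why these coordinates are described as "Euclidean-based" and why they are convenient for modeling a particle's radial motion.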
In studying the radial motion through the event horizon (a black hole's boundary) of two different types of black holes -- Schwarzschild and Einstein-Rosen, both of which are mathematically legitimate solutions of general relativity -- Poplawski ... notes that since observers can only see the outside of the black hole, the interior cannot be observed unless an observer enters or resides within.
"This condition would be satisfied if our universe were the interior of a black hole existing in a bigger universe," he said. "Because Einstein's general theory of relativity does not choose a time orientation, if a black hole can form from the gravitational collapse of matter through an event horizon in the future then the reverse process is also possible. Such a process would describe an exploding white hole: matter emerging from an event horizon in the past, like the expanding universe."
A white hole is connected to a black hole by an Einstein-Rosen bridge (wormhole) and is hypothetically the time reversal of a black hole. Poplawski's paper suggests that all astrophysical black holes, not just Schwarzschild and Einstein-Rosen black holes, may have Einstein-Rosen bridges, each with a new universe inside that formed simultaneously with the black hole.
"From that it follows that our universe could have itself formed from inside a black hole existing inside another universe," he said.

By continuing to study the gravitational collapse of a sphere of dust in isotropic coordinates, and by applying the current research to other types of black holes, researchers could explore whether the view that the universe is born from the interior of an Einstein-Rosen black hole avoids problems scientists have identified with the Big Bang theory, as well as the black hole information loss problem, which holds that all information about matter is lost as it crosses the event horizon (in defiance of the laws of quantum physics).

This model in isotropic coordinates of the universe as a black hole could explain the origin of cosmic inflation, Poplawski theorizes.

[I'm traveling - though not through a wormhole - this should post automatically.]

Wednesday, February 24, 2010

"Physiological Evidence of Brain's Response to Inequality"

A quick post between classes -- the desire for equality appears to be hardwired:

Caltech scientists find first physiological evidence of brain's response to inequality, EurekAlert: The human brain is a big believer in equality—and a team of scientists from the California Institute of Technology (Caltech) and Trinity College in Dublin, Ireland, has become the first to gather the images to prove it.

Specifically, the team found that the reward centers in the human brain respond more strongly when a poor person receives a financial reward than when a rich person does. The surprising thing? This activity pattern holds true even if the brain being looked at is in the rich person's head, rather than the poor person's.

These conclusions, and the functional magnetic resonance imaging (fMRI) studies that led to them, are described in the February 25 issue of the journal Nature. ...

It's long been known that we humans don't like inequality, especially when it comes to money. Tell two people working the same job that their salaries are different, and there's going to be trouble, notes John O'Doherty, professor of psychology at Caltech...

But what was unknown was just how hardwired that dislike really is. "In this study, we're starting to get an idea of where this inequality aversion comes from," he says. "It's not just the application of a social rule or convention; there's really something about the basic processing of rewards in the brain that reflects these considerations."

Continue reading ""Physiological Evidence of Brain's Response to Inequality"" »

Monday, February 22, 2010

"Life Beyond Our Universe"

Calling Karl Marx Sagan. Do the laws of economics hold in alternative universes? This won't answer or even ask the question of whether the laws of economics are universal in that sense, but it does wonder if life can exist in alternative universes where the laws of physics differ substantially from our own:

Life beyond our universe, by Anne Trafton, MIT News Office: Whether life exists elsewhere in our universe is a longstanding mystery. But for some scientists, there’s another interesting question: could there be life in a universe significantly different from our own?

A definitive answer is impossible, since we have no way of directly studying other universes. But cosmologists speculate that a multitude of other universes exist, each with its own laws of physics. Recently physicists at MIT have shown that in theory, alternate universes could be quite congenial to life, even if their physical laws are very different from our own. ...

Continue reading ""Life Beyond Our Universe"" »

Friday, February 19, 2010

"The Phony Attack on Climate Science"

Jeff Sachs attacks the attacks on climate science backed by Exxon Mobil, the WSJ editorial pages, and others determined to stop climate change legislation:

The Phony Attack on Climate Science, by Jeffrey D. Sachs, Commentary, Project Syndicate: In the weeks before and after the Copenhagen climate change conference last December, the science of climate change came under harsh attack by critics who contend that climate scientists have deliberately suppressed evidence – and that the science itself is severely flawed. ... The global public is disconcerted by these attacks. If experts cannot agree that there is a climate crisis, why should governments spend billions of dollars to address it?
The fact is that the critics – who are few in number but aggressive in their attacks – are deploying tactics that they have honed for more than 25 years ... to stop action on climate change, with special interests like Exxon Mobil footing the bill. ... The ... same group of mischief-makers, given a platform by the free-market ideologues of The Wall Street Journal’s editorial page, has consistently tried to confuse the public and discredit the scientists whose insights are helping to save the world from unintended environmental harm.
Today’s campaigners against action on climate change are in many cases backed by the same lobbies, individuals, and organizations that sided with the tobacco industry to discredit the science linking smoking and lung cancer. Later, they fought the scientific evidence that sulfur oxides from coal-fired power plants were causing “acid rain.” Then, when it was discovered that certain chemicals called chlorofluorocarbons (CFCs) were causing the depletion of ozone in the atmosphere, the same groups launched a nasty campaign to discredit that science, too.
Later still, the group defended the tobacco giants against charges that second-hand smoke causes cancer and other diseases. And then, starting mainly in the 1980’s, this same group took on the battle against climate change.
What is amazing is that, although these attacks on science have been wrong for 30 years, they still sow doubts about established facts. ... The latest round of attacks involves two episodes. The first was the hacking of a climate-change research center in England. The e-mails that were stolen suggested a lack of forthrightness in the presentation of some climate data. Whatever the details of this specific case, the studies in question represent a tiny fraction of the overwhelming scientific evidence that points to the reality and urgency of man-made climate change.
The second issue was a blatant error concerning glaciers that appeared in a major IPCC report. Here it should be understood that the IPCC issues thousands of pages of text. There are, no doubt, errors in those pages. But errors ... point to the inevitability of human shortcomings, not to any fundamental flaws in climate science.
When the e-mails and the IPCC error were brought to light, editorial writers at The Wall Street Journal launched a vicious campaign... They claimed that scientists were fabricating evidence in order to obtain government research grants – a ludicrous accusation, I thought at the time, given that the scientists under attack ... have certainly not become rich relative to their peers in finance and business.
But then I recalled that this line of attack – charging a scientific conspiracy to drum up “business” for science – was almost identical to that used by The Wall Street Journal and others in the past, when they fought controls on tobacco, acid rain, ozone depletion, second-hand smoke, and other dangerous pollutants. In other words, their arguments were systematic and contrived, not at all original to the circumstances. ... Their arguments have been repeatedly disproved for 30 years – time after time – but their aggressive methods of public propaganda succeed in causing delay and confusion.
Climate change science is a wondrous intellectual activity. ... And the message is clear: large-scale use of oil, coal, and gas is threatening the biology and chemistry of the planet. We are fueling dangerous changes in Earth’s climate and ocean chemistry... We need urgently to transform our energy, transport, food, industrial, and construction systems to reduce the dangerous human impact on the climate. ...

Here's more from Jeff Sachs on this topic from Scientific American:

Breaking the Climate Debate Logjam, by Jeff Sachs, Scientific American: There is a growing possibility that the U.S. will pass no climate change legislation in this session of Congress... Several Democratic senators have already asked him to stop pushing for a bill in 2010, given the proximity to the midterm elections. ...
Perhaps the legislation can still narrowly pass, which at this point would be the best option. If it stalls this spring, however, the climate and the rest of the world can’t wait. A different approach is needed. Here are some components.
First, the Environmental Protection Agency has the mandate to move under the Clean Air Act. It could impose a timetable of emissions standards for electric utilities and for vehicles, which together account for around three fourths of carbon emissions. ...
Second, if cap-and-trade stalls, the administration and Congress should rethink their opposition to the much simpler option of a carbon tax. A predictable carbon tax ... might win broader assent as part of a package of deficit reduction.
Third, the public needs to hear a plan. The administration has embraced a goal of 17 percent reduction of greenhouse gas emissions by 2020, but it hasn’t told us how that would be achieved. The public is scared that even this modest goal would slam jobs and living standards. It’s time to spell out the changes in power generation, automobile technology and energy efficiency that can take us to our goals at modest cost and huge social benefit.
Fourth, it’s time to step up the response to the climate skeptics, who have misled the public. The Wall Street Journal leads the campaign against climate science, writing editorials charging that scientists are engaged in a massive conspiracy. ...
Let’s hear more from the president’s science adviser, John P. Holdren, Nobel laureate energy secretary Steven Chu, the National Academy of Sciences and other authorities. The public will learn to appreciate that the scientific community is working urgently, rigorously and ingeniously to better understand the complex climate system, for our shared safety and well-being.

It's hard to be optimistic that anything useful will happen.

Wednesday, February 17, 2010

Friedman: Scientists Should Fight Back

Thomas Friedman calls for scientists to go on the offensive against climate change deniers and skeptics:

Global Weirding Is Here, by Thomas Friedman, Commentary, NYTimes: Of the festivals of nonsense that periodically overtake American politics, surely the silliest is the argument that because Washington is having a particularly snowy winter it proves that climate change is a hoax and, therefore, we need not bother with all this girly-man stuff like renewable energy, solar panels and carbon taxes. Just drill, baby, drill.
When you see lawmakers like Senator Jim DeMint of South Carolina tweeting that “it is going to keep snowing until Al Gore cries ‘uncle,’ ” or news that the grandchildren of Senator James Inhofe of Oklahoma are building an igloo next to the Capitol with a big sign that says “Al Gore’s New Home,” you really wonder if we can have a serious discussion about the climate-energy issue anymore.
The climate-science community is not blameless. It knew it was up against formidable forces... Therefore, climate experts can’t leave themselves vulnerable by citing non-peer-reviewed research or failing to respond to legitimate questions, some of which happened with both the Climatic Research Unit at the University of East Anglia and the United Nations Intergovernmental Panel on Climate Change.
Although there remains a mountain of research from multiple institutions about the reality of climate change, the public has grown uneasy. What’s real? In my view, the climate-science community should convene its top experts — from places like NASA, America’s national laboratories, the Massachusetts Institute of Technology, Stanford, the California Institute of Technology and the U.K. Met Office Hadley Centre — and produce a simple 50-page report. They could call it “What We Know,” summarizing everything we already know about climate change in language that a sixth grader could understand, with unimpeachable peer-reviewed footnotes.

At the same time, they should add a summary of all the errors and wild exaggerations made by the climate skeptics — and where they get their funding. It is time the climate scientists stopped just playing defense. The physicist Joseph Romm, a leading climate writer, is posting on his Web site, climateprogress.org, his own listing of the best scientific papers on every aspect of climate change for anyone who wants a quick summary now. ...

I don't think a panel of experts will help much, but it's a start. What's really needed is for a few key leaders on the right to act responsibly, acknowledge the problem, and commit to working toward a solution. But the chances of that happening are pretty slim.

Friday, January 08, 2010

Turning Back the Hands of Entropy

If you have time for it, here's something a bit different from usual:

What Keeps Time Moving Forward? Blame It on the Big Bang, by John Matson, Scientific American: ...In his new book, From Eternity to Here..., theoretical physicist Sean Carroll of the California Institute of Technology sets out to explain why time marches along unfailingly in one direction. Expanding on the concepts in his June 2008 feature for Scientific American, "The Cosmic Origins of Time's Arrow," Carroll argues for the necessity of marrying three seemingly disparate concepts: time, entropy and cosmology.

Entropy, which in rough terms is the measure of a system's disorder, creeps up over time, as dictated by the second law of thermodynamics. To illustrate entropy's inexorable growth, Carroll takes us to the breakfast table—you can't unscramble an egg, he points out, and you can't unstir the milk out of your coffee. These systems invariably proceed to disordered, or high-entropy, arrangements. Each of these examples shows how the continual growth of entropy fills the world with irreversible processes that divide the past from the future: The making of an omelet and the mixing of milk into a cup of coffee are events that work in only one temporal direction.

But why should entropy always increase? This is where Carroll turns to cosmology, which must explain why the universe began in a uniquely low-entropy state. We spoke to the physicist...

Continue reading "Turning Back the Hands of Entropy" »
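The statistical reading of entropy behind Carroll's breakfast-table examples can be made concrete with a toy calculation. Below is a hypothetical sketch (my own, not from the book) that counts arrangements of 100 coins using Boltzmann's formula S = ln W (with the constant k set to 1): the "mixed" macrostate has so many more microstates than the "ordered" one that a system wandering at random essentially never goes back.

```python
# Toy illustration of the second law: disordered macrostates correspond to
# vastly more microstates, so entropy (S = ln W, with Boltzmann's k = 1)
# almost never decreases. Hypothetical example, not taken from Carroll's book.
import math

def entropy(microstates):
    """Boltzmann entropy S = ln W in units where k = 1."""
    return math.log(microstates)

n = 100  # 100 coins standing in for molecules of milk and coffee

# "All heads" -- an ordered state, like unmixed milk and coffee.
ordered_W = math.comb(n, 0)      # exactly 1 arrangement

# "Half heads" -- a disordered state, like fully mixed coffee.
mixed_W = math.comb(n, n // 2)   # about 1.01e29 arrangements

print(ordered_W)
print(f"{mixed_W:.2e}")
print(f"{entropy(mixed_W) - entropy(ordered_W):.1f}")  # entropy gap, ~66.8
```

With roughly 10^29 disordered arrangements for every ordered one, "unstirring the milk" is not impossible, just absurdly improbable -- which is all the second law asserts.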

Tuesday, January 05, 2010

"Grandmasters and Global Growth"

Ken Rogoff never misses an opportunity to tell us about his prowess in chess, even if it means essentially rerunning an old column.  Compare today's column to this one from 2006 arguing that the next big driver of global growth will be artificial intelligence.

Saturday, January 02, 2010

''Scientists Need to Speak Up''

The academic community needs to do a better job at responding to attacks on its credibility:

On issues like global warming and evolution, scientists need to speak up, by Chris Mooney, Commentary, Washington Post: The battle over the science of global warming has long been a street fight between mainstream researchers and skeptics. But never have the scientists received such a deep wound as when, in late November, a large trove of e-mails and documents stolen from the Climatic Research Unit at Britain's University of East Anglia were released onto the Web.
In the ensuing "Climategate" scandal, scientists were accused of withholding information, suppressing dissent, manipulating data and more. But while the controversy has receded, it may have done lasting damage to science's reputation... Meanwhile, public belief in the science of global warming is in decline.
The central lesson of Climategate is not that climate science is corrupt. The leaked e-mails do nothing to disprove the scientific consensus on global warming. Instead, the controversy highlights that in a world of blogs, cable news and talk radio, scientists are poorly equipped to communicate their knowledge and, especially, to respond when science comes under attack.
A few scientists answered the Climategate charges almost instantly. ... But they were largely alone. ... This isn't a new problem. As far back as the late 1990s, before the news cycle hit such a frenetic pace, some science officials were lamenting that scientists had never been trained in how to talk to the public and were therefore hesitant to face the media.
"For 45 years or so, we didn't suggest that it was very important," Neal Lane, a former Clinton administration science adviser and Rice University physicist, told the authors of a landmark 1997 report on the gap between scientists and journalists. ". . . In fact, we said quite the other thing." ...
Scientific training continues to turn out researchers who speak in careful nuances and with many caveats, in a language aimed at their peers, not at the media or the public. Many scientists can scarcely contemplate framing a simple media message for maximum impact; the very idea sounds unbecoming. And many of them don't trust the public or the press. ... Rather than spurring greater efforts at communication, such mistrust and resignation have further motivated some scientists to avoid talking to reporters and going on television.
They no longer have that luxury. After all, global-warming skeptics suffer no such compunctions. ... If scientists don't take a central communications role, nobody else with the same expertise and credibility will do it for them.
Meanwhile, the task of translating science for the public is ever more difficult: Information sources are multiplying, partisan news outlets are replacing more objective media, and the news cycle is spinning ever faster.
Consider another failure to communicate from the global-warming arena: the scientific fallout after a devastating trio of hurricanes -- Katrina, Rita and Wilma -- in the fall of 2005. Just as these storms struck, a pair of scientific studies appeared in top journals suggesting, for the first time, that global warming was making hurricanes more intense and deadly. Other scientists vociferously disagreed, and the two camps fell into combat.
So while public interest in hurricanes was at a high after Katrina, much of the science reporting at the time portrayed researchers bickering with one another ("Hurricane Debate Shatters Civility of Weather Science," announced a Wall Street Journal cover story). Judith Curry, a climate scientist at the Georgia Institute of Technology and a co-author of one of the contested studies, told me recently that the experience made her realize that "this was really the wrong way to do things, trying to fight these little wars and knock the other side down."
With the media distracted by the food fight, scientists weren't leading the public discussion, and other important findings that ought to have received attention in Katrina's wake ... were drowned out.
If the global-warming battle has any rival in its intensity, its nastiness and its risk to scientists if they do not talk to the public, it is the long-standing conflict over the teaching of evolution. Science's opponents in this fight are highly organized, and they constantly nitpick evolutionary science to cast the field into disrepute.
The scientific response to creationists has long been to cite the extensive evidence for evolution. In book after book, scientists have explained how DNA, fossil, anatomical and other evidence indisputably shows the interrelatedness of all species. Further, they have refuted creationist claims that evolution cannot explain the complexity of the eye or the intricacy of the bacterial flagellum. Yet ... polls repeatedly show that a large portion of Americans have doubts about evolution.
For all these efforts, why haven't scientists made any inroads? It's because at its core, the objection to evolution isn't about science at all, but about perceived threats to faith and moral values. The only way to defuse the conflict is to assuage these fundamental fears. Yet this drags many scientists out of their comfort zone: They're not priests or theologians and don't know how to sound like them. Many refuse to try...
Ironically, to increase support for the teaching of evolution, scientists must join forces with -- and show more understanding of -- religion. Scientists who are believers also need to be more vocal about how they reconcile science and faith. ...
In other words, what's needed is less "pure science" on its own -- although of course scientists must continue to speak in scientifically accurate terms -- and more engagement with the concerns of nonscientific audiences. In response to that argument, many researchers will say: "Why target us? We're the good guys. And if we become more media savvy, we'll risk our credibility."
There is only one answer to this objection: "Look all around you -- at Climategate, at the unending evolution wars -- and ask, are your efforts working?" The answer, surely, is no.
The precise ways in which scientists should change their communication strategies vary from issue to issue, but there are some common themes. Reticence is never a good thing, especially on a politically fraught topic such as global warming -- it just cedes the debate to the other side....
On other topics, including evolution, scientists must recognize that more than scientific matters are at stake, and either address the moral and ethical issues themselves, or pair with those who can...
All this will require universities to do a better job of training young scientists in media and communication. ... Scientists need not wait for former vice presidents to make hit movies to teach the public about their fields -- they must act themselves. ...

I agree that the academic community should speak up more often, and engage in the public discourse in areas where it has particular expertise. But I don't think that alone will be enough.

I'd describe the problem a little bit differently. Over the last few decades -- and perhaps longer -- there has been a concerted effort from the right to discredit and diminish academic voices. We hear academics are nothing but a bunch of liberals out of touch with the real world, etc., etc. -- you've heard it all -- and I think the right has been at least moderately successful at attacking the messengers when they don't like the message (and there's not much coming out of academia they agree with).

So, yes, we have to get out there and explain what research says about a topic in terms the public (and more importantly newscasters) can understand. But the political battle has to be engaged as well. If academics continue to ignore or place themselves above such political battles, then they will continue to be undermined by those who want to silence research that stands in opposition to their interests (and who do not care at all about the integrity of the scientific process; their goal is to win the battle).

Academics have lost considerable respect in the eyes of the public in the last few decades, and while we often don't do ourselves any favors when we do speak up (so yes, let's become more media savvy), that outcome is no accident. It's the result of a concerted effort from those on the right. If we don't do more to counter the political attacks on our credibility, nobody else is going to do it for us. If the voices that are aligned against the academic community are not taken on directly, then the influence of the academic community over important public policy questions will continue to diminish.

Saturday, December 05, 2009

"Can Science Fight Media Disinformation?"

Is better science education the answer to our "media disinformation" problem?:

War Is Peace: Can Science Fight Media Disinformation?, by Lawrence M. Krauss, Commentary, Scientific American: ...The rise of a ubiquitous Internet, along with 24-hour news channels has, in some sense, had the opposite effect from what many might have hoped such free and open access to information would have had. It has instead provided free and open access, without the traditional media filters, to a barrage of disinformation. Nonsense claims had more difficulty gaining traction in the days when print journalism held sway and newspaper editors had the final word on what made its way into homes and when television news consisted of a half-hour summary of what a trained producer thought were the most essential stories of the day.
Now fabrications about “death panels” and oxymoronic claims that “government needs to keep its hands off of Medicare” flow freely on the Internet, driving thousands of zombielike protesters to Washington to argue that access to health care will undermine their fundamental freedom to have their insurance canceled if they get sick. And 24-hour news channels, desperate to provide “breaking” coverage at all hours, end up serving as public relations vehicles for any celebrity who happens to make an outrageous claim or, worse, decide that the competition for ratings requires them to be anything but “fair and balanced” in their reporting.
“Fair and balanced,” however, doesn’t mean putting all viewpoints, regardless of their underlying logic or validity, on an equal footing. Discerning the merits of competing claims is where the empirical basis of science should play a role. I cannot stress often enough that what science is all about is not proving things to be true but proving them to be false. What fails the test of empirical reality, as determined by observation and experiment, gets thrown out like yesterday’s newspaper. One doesn’t need to debate about whether the earth is flat or 6,000 years old. These claims can safely be discarded, and have been, by the scientific method.
What makes people so susceptible to nonsense in public discourse? Is it because we do such a miserable job in schools teaching what science is all about—that it is not a collection of facts or stories but a process for weeding out nonsense to get closer to the underlying beautiful reality of nature? Perhaps not. But I worry for the future of our democracy if a combination of a free press and democratically elected leaders cannot together somehow more effectively defend empirical reality against the onslaught of ideology and fanaticism. [full version]

There was plenty of nonsense long before the internet and 24-hour news, but it's probably true that these developments helped to amplify and speed the spread of nonsensical claims, though I'd assert that 24-hour news (plus radio to some extent) is more responsible than the internet.

As for solving the nonsense problem through better science education, I do agree that better critical thinking skills would be helpful -- that's true by definition, I suppose -- but that's not enough. Nobody can be an expert on health care, global warming, and all the other important issues they face. The underlying scientific, economic, political, sociological, etc. issues are too difficult (in some cases even for the experts). To overcome that, we have to rely upon people we can trust -- often experts who can help to guide us to the correct decisions, but sometimes a trusted intermediary. Critical thinking skills can help us determine who to listen to, but it still comes down to trusting that you are getting the best possible analysis of the problem.

For good or bad -- I'm still making up my mind about that -- I think that a trust that was once there is gone, at least to some degree. People believed Walter Cronkite, they trusted scientists, Dr. Spock had all the answers about how to raise your kids, but trust in the media, scientists, politicians, doctors, and so on has eroded (yes, economists too). I'd cite 24-hour news and its ilk as part of the reason, but I'm not sure that's been the fundamental driving force behind the change.

Maybe people are right to be more skeptical of the information they receive -- maybe they trusted too much in the past (and there could be an overreaction during the adjustment, causing trust to fall even further). If so, then the increase in uncertainty brought about by declining trust in experts and other sources of information would be consistent with the appearance of more nonsense in the public discourse attempting to fill the void.

Thursday, September 10, 2009

Solving the Free Rider Problem using fMRI Measurements

If we hook up a randomly chosen set of people to functional magnetic resonance imaging machines to see if they are truthfully revealing their valuation of public goods, we can improve our ability to provide these services, but the intrusiveness of the solution seems problematic, at least to me. Does this bother you, or does it seem like a good idea to move in this direction? [Update: Cheap Talk has good comments on the research]:

Caltech scientists develop novel use of neurotechnology to solve classic social problem, EurekAlert: Economists and neuroscientists from the California Institute of Technology (Caltech) have shown that they can use information obtained through functional magnetic resonance imaging (fMRI) measurements of whole-brain activity to create feasible, efficient, and fair solutions to one of the stickiest dilemmas in economics, the public goods free-rider problem—long thought to be unsolvable.
This is one of the first-ever applications of neurotechnology to real-life economic problems, the researchers note. "We have shown that by applying tools from neuroscience to the public-goods problem, we can get solutions that are significantly better than those that can be obtained without brain data," says Antonio Rangel, associate professor of economics at Caltech and the paper's principal investigator.
The paper describing their work was published today in the online edition of the journal Science, called Science Express.
Examples of public goods range from healthcare, education, and national defense to the weight room or heated pool that your condominium board decides to purchase. But how does the government or your condo board decide which public goods to spend its limited resources on? And how do these powers decide the best way to share the costs?

Continue reading "Solving the Free Rider Problem using fMRI Measurements" »
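The incentive structure the Caltech team is attacking can be seen in the standard linear public-goods game (a textbook setup, not the paper's actual mechanism): contributions to a common pot are multiplied and shared equally, so the group does best when everyone chips in, but each individual does best by free riding.

```python
# Minimal sketch of the free-rider problem (standard linear public-goods
# game; illustrative only, not the mechanism in the Science paper).
# Contributions to the pot are multiplied and split equally among players.

def payoff(my_contribution, others, endowment=10.0, multiplier=1.6):
    """My payoff: what I keep plus my equal share of the multiplied pot."""
    pot = my_contribution + sum(others)
    n_players = 1 + len(others)
    return endowment - my_contribution + multiplier * pot / n_players

cooperative_partners = [10.0, 10.0, 10.0]  # three fully cooperative partners
print(payoff(10.0, cooperative_partners))  # everyone contributes: 16.0 each
print(payoff(0.0, cooperative_partners))   # free riding: 22.0 for the cheater
```

Since 22 beats 16, free riding is individually rational even though universal contribution (16 each) beats universal free riding (10 each). Getting people to reveal their true valuations -- here, via the fMRI signal -- is what breaks the dilemma.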

Thursday, August 27, 2009

"Revisiting Popper"

Is it true that "History and society are not law-governed systems for which we might eventually hope to find exact and comprehensive theories"?:

Revisiting Popper, by Daniel Little: Karl Popper's most commonly cited contribution to philosophy and the philosophy of science is his theory of falsifiability (The Logic of Scientific Discovery, Conjectures and Refutations: The Growth of Scientific Knowledge). (Stephen Thornton has a very nice essay on Popper's philosophy in the Stanford Encyclopedia of Philosophy.) In its essence, this theory is an alternative to "confirmation theory." Contrary to positivist philosophy of science, Popper doesn't think that scientific theories can be confirmed by more and more positive empirical evidence. Instead, he argues that the logic of scientific research is a critical method in which scientists do their best to "falsify" their hypotheses and theories. And we are rationally justified in accepting theories that have been severely tested through an effort to show they are false -- rather than accepting theories for which we have accumulated a body of corroborative evidence. Basically, he argues that scientists are in the business of asking this question: what is the most unlikely consequence of this hypothesis? How can I find evidence in nature that would demonstrate that the hypothesis is false? Popper criticizes theorists like Marx and Freud who attempt to accumulate evidence that corroborates their theories (historical materialism, ego transference) and praises theorists like Einstein who honestly confront the unlikely consequences their theories appear to have (perihelion of Mercury).

At bottom, I think many philosophers of science have drawn their own conclusions about both falsifiability and confirmation theory: there is no recipe for measuring the empirical credibility of a given scientific theory, and there is no codifiable "inductive logic" that might replace the forms of empirical reasoning that we find throughout the history of science. Instead, we need to look in greater detail at the epistemic practices of real research communities in order to see the nuanced forms of empirical reasoning that are brought forward for the evaluation of scientific theories. Popper's student, Imre Lakatos, makes one effort at this (Methodology of Scientific Research Programmes; Criticism and the Growth of Knowledge); so does William Newton-Smith (The Rationality of Science), and much of the philosophy of science that has proceeded under the rubrics of philosophy of physics, biology, or economics is equally attentive to the specific epistemic practices of real working scientific traditions. So "falsifiability" doesn't seem to have a lot to add to a theory of scientific rationality at this point in the philosophy of science. In particular, Popper's grand critique of Marx's social science on the grounds that it is "unfalsifiable" just seems to miss the point; surely Marx, Durkheim, Weber, Simmel, or Tocqueville have important social science insights that can't be refuted by deriding them as "unfalsifiable". And Popper's impatience with Marxism makes one doubt his objectivity as a sympathetic reader of Marx's work.

Continue reading ""Revisiting Popper"" »

Sunday, August 23, 2009

"Why Sleep?"

Is sleep our power saving mode?:

Why sleep?, EurekAlert: ...Humans ... spend roughly one-third of their lives asleep, but sleep researchers still don't know why. ... Theories range from brain "maintenance" — including memory consolidation and pruning — to reversing damage from oxidative stress suffered while awake, to promoting longevity. ...

Now, a new analysis by Jerome Siegel, UCLA professor of psychiatry ... has concluded that sleep's primary function is to increase animals' efficiency and minimize their risk by regulating the duration and timing of their behavior. The research appears in the current online edition of the journal Nature Reviews Neuroscience.

"Sleep has normally been viewed as something negative for survival because sleeping animals may be vulnerable to predation and they can't perform the behaviors that ensure survival," Siegel said. These behaviors include eating, procreating, caring for family members, monitoring the environment for danger and scouting for prey.

"So it's been thought that sleep must serve some as-yet unidentified physiological or neural function that can't be accomplished when animals are awake," he said.

Siegel's lab conducted a new survey of the sleep times of a broad range of animals, examining everything from the platypus and the walrus to the echidna, a small, burrowing, egg-laying mammal covered in spines. The researchers concluded that sleep itself is highly adaptive, much like the inactive states seen in a wide range of species, starting with plants and simple microorganisms; these species have dormant states — as opposed to sleep — even though in many cases they do not have nervous systems. That challenges the idea that sleep is for the brain, said Siegel.

Continue reading ""Why Sleep?"" »

Sunday, April 12, 2009

Predicting Turning Points is Hard

This sounds familiar:

Confusing Patterns With Coincidences, by Susan Hough, Commentary, NY Times: In the aftermath of the earthquake at L’Aquila, Italy, on Monday that killed nearly 300 people, splashy headlines suggested that these victims didn’t have to die.

An Italian researcher, Giampaolo Giuliani, began to sound alarm bells a month earlier, warning that an earthquake would strike near L’Aquila on March 29. ... Mr. Giuliani was denounced for inciting panic..., and he was forced to take his warning off the Web after March 29 came and went without significant activity.

Should Italian officials have listened? Should the public have heeded the warnings? With 20-20 hindsight the answer certainly appears to be yes. The real answer is no.

Scientists have been chasing earthquake prediction — the holy grail of earthquake science — for decades. ... Yet we have little to no real progress to show for our efforts. ... We’re pretty good at forecasting the long-term rates of earthquakes in different areas. But prediction per se, which involves specifying usefully narrow windows in time, location and magnitude, has eluded us.

Continue reading "Predicting Turning Points is Hard" »

Friday, April 10, 2009

"Neural Mechanisms of Social Influence in Consumer Decisions"

Today's seminar:

Neural Mechanisms of Social Influence in Consumer Decisions, by Gregory Berns, C. Monica Capra, Sara Moore, and Charles Noussair: Abstract: It is well-known that social influences affect consumption decisions.  Although a number of different mechanisms have been hypothesized, a consumer's tendency to purchase a product is influenced by the choices made by his associative reference group.  Here, we use functional magnetic resonance imaging (fMRI) to elucidate the neural mechanisms associated with social influence on a common consumer good: music.  We restricted our study population to adolescents between the ages of 12 and 17 because music is a common purchase in this age group, and it is widely believed that adolescent behavior is particularly influenced by perceptions of popularity in their reference group.  Using 15-second clips of songs downloaded from MySpace, we obtained behavioral measures of preferences and neurobiological responses to the songs. The data were gathered with, and without, the popularity of the song revealed. The popularity had a significant effect on the participants' ratings of how much they liked the songs.  The fMRI results showed a strong correlation between the participants' rating and activity in the caudate nucleus, a region previously implicated in reward-driven actions.  The tendency to change one's evaluation of a song was correlated with activation only in the anterior insula, a region associated with physiological arousal, particularly to negative affective states.  Our results suggest that a principal mechanism whereby popularity ratings affect consumer choice is through the anxiety generated by the mismatch between one's own preferences and others'.  This mismatch anxiety motivates people to switch their choices in the direction of the consensus, suggesting that this is a major force behind conformity observed in music tastes in teenagers.

This may also explain why economists generally adopt the consensus forecast, and how this tendency to conform to respected opinion within the field due to "mismatch anxiety" can lead to herd-like behavior that causes us to miss things like a housing bubble. Why take the time to think hard about the problem yourself if, in the end, you are going to adopt the view of the most respected and powerful voices in the field anyway?

Wednesday, March 18, 2009

"Selfish Punishment"

How does altruism survive?:

Thriving on Selfishness, by Marina Krakovsky, Scientific American: It’s the altruism paradox: If everyone in a group helps fellow members, everyone is better off—yet as more work selflessly for the common good, cheating becomes tempting, because individuals can enjoy more personal gain if they do not chip in. But as freeloaders exploit the do-gooders, everybody’s payoff from altruism shrinks.

All kinds of social creatures, from humans down to insects and germs, must cope with this problem; if they do not, cheaters take over and leech the group to death. So how does altruism flourish? Two answers have predominated...: kin selection, which explains altruism toward genetic relatives—and reciprocity— the tendency to help those who have helped us. Adding to these solutions, evolutionary biologist Omar Tonsi Eldakar came up with a clever new one: cheaters help to sustain altruism by punishing other cheaters, a strategy called selfish punishment.

“All the theories addressed how altruists keep the selfish guys out,” explains Eldakar... Because selfishness undermines altruism, altruists certainly have an incentive to punish cheaters—a widespread behavior pattern known as altruistic punishment. But cheaters, Eldakar realized, also have reason to punish cheaters...: a group with too many cheaters does not have enough altruists to exploit. ... That is why, he points out, some of the harshest critics of sports doping, for example, turn out to be guilty of steroid use themselves: cheating gives athletes an edge only if their competitors aren’t doing it, too. ...

In a colony of tree wasps..., a special caste of wasps sting other worker wasps that try to lay eggs, even as the vigilante wasps get away with laying eggs themselves. In a strange but mutually beneficial bargain, punishing other cheaters earns punishers the right to cheat. ...

[T]he idea of a division of labor between cooperators and policing defectors appeals to Pete Richerson, who studies the evolution of cooperation at the University of California, Davis. “It’s nothing as complicated as a salary, but allowing the punishers to defect in effect does compensate them for their services in punishing other defectors...,” he says. After all, policing often takes effort and personal risk, and not all altruists are willing to bear those costs.

Corrupt policing may evoke images of the mafia, and indeed Eldakar notes that when the mob monopolizes crime in a neighborhood, the community is essentially paying for protection from rival gangs—a deal that, done right, lowers crime and increases prosperity. But mob dynamics are not always so benign... “What starts out as a bunch of goons with guns willing to punish people [for breaching contracts] becomes a protection racket,” Richerson says. The next question, therefore, is, What keeps the selfish punishers themselves from overexploiting the group?

Evolutionary biologist David Sloan Wilson, Eldakar's co-author, readily acknowledges this limitation of the selfish punishment model..., “there’s nothing telling us that that mix is an optimal mix,” he explains. The answer to that problem, he says, is competition not between individuals in a group but between groups. That is because whereas selfishness beats altruism within groups, altruistic groups are more likely to survive...

Friday, January 16, 2009

"Can Economists Be Trusted?" "Are There Ever Any Wrong Answers in Economics?"

Uwe Reinhardt:

Can Economists Be Trusted?, by Uwe E. Reinhardt, Economix: ...[W]ittingly or unwittingly, economists infuse their analysis with their own (or a political client’s) preferred ideology.

Consider, for example, President Bill Clinton’s 1993-94 health-reform plan. In this plan, President Clinton proposed a mandate on employers to provide their employees with health insurance.

Politically conservative economists predicted that the mandate ... would lead to vast unemployment. Economists supporting the Clinton health plan predicted that the ... mandate ... might even ... increase employment.

It can be shown with a simple mathematical model that an economist’s prediction in this regard is powerfully driven by two assumptions about the behavioral responses to mandated employer-paid health insurance. ... Unfortunately, the empirical literature on this responsiveness offers economists a wide range of estimates from which they can choose...

This example starkly illustrates how easy it is for economists to infuse their own ideology – or that of their clients – into what may appear to outsiders as objective, scientific analysis.

We are now seeing a replay of this tendency in the debate on the relative merits of added government spending versus added tax cuts as measures to stimulate the economy.

Writing in The New York Times, for example, the Harvard professor N. Gregory Mankiw, former chief of President George W. Bush’s Council of Economic Advisers, makes a case for stimulating the economy through tax cuts rather than added government spending. ...

To buttress his case..., he then cites an empirical study by Valerie A. Ramey, according to which $1 of added government spending will ultimately increase gross domestic product (G.D.P.) by only $1.40, while according to another recent study by Christina and David Romer, $1 of tax cuts over time increases G.D.P. by $3.

Non-economists may ask, of course, exactly how a $1 cut in taxes would translate itself into a $3 increase in G.D.P. at a time when traumatized households, whose wealth has been eroded, might use any new tax savings merely to pay down debt or rebuild their wealth through added savings, rather than spend it, and when business firms unable to sell their output even from existing capacity might hesitate to invest such tax savings in more capacity.

But never mind this fine point.

More interesting is the fact that Christina Romer is to be the head of President-elect Barack Obama’s Council of Economic Advisers. In that capacity, last Saturday she released an analysis of fiscal stimulus alternatives, with a co-author, Jared Bernstein. Curiously — or perhaps not — for that analysis, the two authors assume a much larger four-year multiplier effect for added government spending (1.55) than for tax cuts (0.98), although they do confess to a high degree of uncertainty on the actual sizes of these multipliers.

So there you have the flexibility, shall we say, that economists enjoy when they apply their professional skills to affairs of state in what may seem, to outsiders, like purely scientific analyses.

In the first lecture of my freshman economics course at Princeton entitled “The Art of Siffing Among Seasoned Adults,” I demonstrate how seasoned adults routinely structure information felicitously (i.e., “sif”) to further their own agenda, and I point out that economists can be among the most skillful practitioners of this art. ... When economists advise on public policy, the operative mantra is Caveat Emptor!” ...

The answer to this, of course, is that economists should acknowledge the range of estimates and, if they are committed to one set of estimates over another -- if they want to get past the "on the one hand, on the other hand" construction -- explain why they think one set is better or worse than another (let me admit to being less than perfect at this myself).
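
To see how much hinges on which estimate an economist commits to, consider a back-of-the-envelope calculation (a sketch only: the $100 billion package size is hypothetical, while the multipliers are the ones quoted above):

```python
def gdp_impact(package_billions, multiplier):
    """Back-of-the-envelope GDP effect of a fiscal package."""
    return package_billions * multiplier

package = 100.0  # a hypothetical $100 billion package

# The estimates Mankiw cites favor tax cuts...
print(gdp_impact(package, 1.40))  # added spending (Ramey)
print(gdp_impact(package, 3.00))  # tax cuts (Romer and Romer)

# ...while the Romer-Bernstein assumptions favor spending.
print(gdp_impact(package, 1.55))  # added spending
print(gdp_impact(package, 0.98))  # tax cuts
```

Same package, opposite policy conclusions, depending entirely on which multipliers one chooses.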

Brad DeLong:

Fama's Fallacy V: Are There Ever Any Wrong Answers in Economics?: Montagu Norman here, back from my grave once again. This time it is Greg Mankiw whose words have summoned me...

One thing that used to give me nightmares--and that provoked several of my nervous breakdowns--was how you could never get any economist (except for John Maynard Keynes) to take a definite position. They were always "on the one hand--on the other hand." This was what led Harry Truman in later days to wish for a one-handed economist, a wish that has never been fulfilled...

The "on the one hand--on the other hand" nature of discourse raises the question of whether in economics--a "science" where there is enormous intellectual and ideological and political disagreement about how the world works--there can ever be any wrong answers. I believe that there can be wrong answers in economics, because examinations in economics tend to take a particular form: instead of asking (i) "do expansionary fiscal policies increase output and employment?" we ask (ii) "in models where there are idle resources and high unemployment, do expansionary fiscal policies increase output and employment?" (ii) is a question about a particular class of models of the economy, and so has a definite right answer--"yes, in that class of models they do"--and a definite wrong answer--"no, in that class of models they don't."

Eugene Fama claimed that "when there are idle resources--unemployment" expansionary fiscal policies had no effect in models in which the NIPA savings-investment identity:

investment = (private savings) - (government deficit)

held.

Now the NIPA savings-investment identity holds in all models--it is, after all, an identity, true by definition and construction. And every single model that has been built in which there is a possibility of high unemployment and idle resources is a model in which fiscal policy works because increases in government spending lead to unexpected declines in inventories, and unexpected declines in inventories lead firms to expand production, which leads to increases in income and saving.

I would, therefore, say that Fama's claim is "wrong". Not only does it not hold in all models in the class, it does not hold in any models in the class.
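
DeLong's two points -- that the identity holds by construction, and that fiscal expansion nevertheless raises output when resources are idle -- can be checked in a minimal Keynesian-cross model (a sketch with illustrative parameters, not a model from either author):

```python
def equilibrium(G, T, I, c0=20.0, c1=0.6):
    """Solve Y = c0 + c1*(Y - T) + I + G for equilibrium income Y."""
    Y = (c0 + I + G - c1 * T) / (1.0 - c1)
    C = c0 + c1 * (Y - T)
    private_saving = Y - T - C
    deficit = G - T
    # NIPA identity: investment = private saving - government deficit.
    # It holds at every equilibrium, by construction.
    assert abs((private_saving - deficit) - I) < 1e-6
    return Y

Y0 = equilibrium(G=100.0, T=100.0, I=50.0)   # balanced budget
Y1 = equilibrium(G=110.0, T=100.0, I=50.0)   # deficit-financed spending
print(Y1 - Y0)  # ~25: dG = 10 times the multiplier 1/(1 - c1) = 2.5
```

The identity never binds the level of output: higher G raises Y, and private saving rises along with the deficit, leaving investment unchanged.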

Greg Mankiw disagrees:

Greg Mankiw's Blog: Fama's arguments make sense in the context of the classical model... presented in Chapter 3 of my intermediate macro textbook.... I would go on to the Keynesian model.... But whether one leaves the classical model behind to embrace the Keynesian model is a judgment call...

Mankiw thinks that Fama is not wrong but is, rather, making a "judgment call."

But Mankiw writes in his chapter 3 that the classical model "assume[s] that the labor force is fully employed." And so Greg gets himself into Cretan Liars' Paradox territory here: Fama says that there is high unemployment and idle resources, while Mankiw says that Fama is not wrong because he makes sense as long as the labor force is fully employed and there are no idle resources.

Is Mankiw's answer here a "wrong" answer, or is he too making a "judgment call"? I seek an empirical test. I seek a Harvard undergraduate to take Greg Mankiw's course this spring, to write the following in an appropriate place:

the classical model of chapter 3 shows us that expansionary fiscal policies have no effect on output even where there are idle resources--unemployment.

and to report back on the reaction of the course instructors.

Let's ask another question. Does Greg Mankiw believe in the classical model he is using to defend Fama (in the classical model, the LM curve is vertical, and a vertical LM curve leads to a vertical supply curve, and to the result that demand side policies such as a change in government spending or taxes cannot change real output)?:

I disagree ... that the LM curve is vertical... Introspection is not a particularly reliable way to measure elasticities. There is a substantial empirical literature on money demand that demonstrates that it is interest-elastic. ... According to Ball, the interest semi-elasticity of money demand is -0.05: This means that an increase in the interest rate of one percentage point, or 100 basis points, reduces the quantity of money demanded by 5 percent.

How far off is the vertical LM case as a practical matter? One way to answer this question is to look at the fiscal-policy multiplier. In chapter 11 of my intermediate macro text, I give the government-purchases multiplier from one mainstream econometric model. If the nominal interest rate is held constant, the multiplier is 1.93. If the money supply is held constant, the multiplier is 0.60. If the LM curve were completely vertical, the second number would be zero. ...

Greg has been pretty good at saying there is a lot of uncertainty about the fiscal policy multipliers, and about explaining why estimates differ across studies, and why he favors one set of estimates over another, so I don't want to come down too hard on his disagreement with the 1.93 figure in his "favorite textbook", but it does seem like he is defending Fama with a model that he does not believe in.
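
Mankiw's 1.93-versus-0.60 contrast comes from an econometric model, but the qualitative point -- the multiplier shrinks as the LM curve steepens and vanishes when it is vertical -- can be sketched in a stylized linear IS-LM model (all parameter values below are illustrative assumptions, not Mankiw's estimates):

```python
# IS:  Y = c0 + c1*Y + I0 - d*r + G
# LM:  M = k*Y - h*r   (h -> 0 gives the vertical-LM, classical case)

def multiplier(c1=0.6, d=1.0, k=0.5, h=2.0, rate_fixed=False):
    """Government-purchases multiplier dY/dG."""
    if rate_fixed:
        # Central bank holds the interest rate constant: no crowding out.
        return 1.0 / (1.0 - c1)
    # Money supply held constant: a higher r partially crowds out spending.
    return 1.0 / (1.0 - c1 + d * k / h)

print(multiplier(rate_fixed=True))  # 2.5
print(multiplier())                 # about 1.54: smaller, but not zero
print(multiplier(h=1e-9))           # about 0: vertical LM kills fiscal policy
```

With an interest-elastic money demand (h > 0), the fixed-money multiplier is smaller than the fixed-rate multiplier but well above zero, which is exactly the pattern in the numbers Mankiw quotes.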

Wednesday, January 07, 2009

Black Holes Grow Galaxies?

Black holes came first:

Which came first--galaxies or black holes?: ...The holes came first, Christopher Carilli of the National Radio Astronomy Observatory and his colleagues announced today at the American Astronomical Society meeting. ... “Black holes came first and somehow—we don’t know how—grew the galaxy around them,” Carilli says. ...

Wednesday, December 03, 2008

"Undoing the Damage"

Let's hope the war is coming to an end:

Back to Reality, by Olivia Judson: President-elect Obama already has a long to-do list. But here’s another item for it: to restore science in government.

The most notable characteristic of the Bush administration’s science policy has been the repeated distortion and suppression of scientific evidence in order to fit ideological preferences about how the world should be, rather than how it is. ...

The distortion and suppression of science is dangerous ... because it is an assault on ... a method of thought and inquiry on which our modern civilization is based and which has been hugely successful... In many respects science has been the dominant force — for good and ill — that has transformed human lives over the past two centuries.

In schools, science is often taught as a body of knowledge — a set of facts and equations. But all that is just a consequence of scientific activity.

Science itself is something else... It is an attitude, a stance towards measuring, evaluating and describing the world that is based on skepticism, investigation and evidence. The hallmark is curiosity; the aim, to see the world as it is. ... And it is not something taught so much as acquired during a training in research or by keeping company with scientists.

Now, I don’t want to idealize this. To claim that scientists are free of bias, ambition or desires would be ridiculous. Everyone has pet ideas that they hope are right; and scientists are not famous for humility. ...

Moreover, to downplay evidence that doesn’t fit your ideas, and to place more weight on evidence that does — this is something that human brains just seem to do. ...

However, the beauty of the scientific approach is that even when individuals do succumb to bias or partiality, others can correct them using a framework of evidence that everyone broadly agrees on. (Admittedly, this can sometimes be a slow process.) But arguing over data is different from suppressing it. Or changing it. Or ignoring it. For these activities debase the whole enterprise and threaten its credibility. When data can’t be accessed or trusted, when “facts” are actually illusions — well, this threatens the nature of knowledge itself. And a society without knowledge is steering blind.

The rubbishing of science is far more serious than any particular decision over whether to fund research into stem cells, the sexual behavior of fruit flies or the quarks and quirks of particle physics. Undoing the damage of the past eight years may take another eight. But it must be done. We are probably one of the last generations that will be able to use our knowledge and methods to guide human civilization to a sustainable future. This is our time. ...

Friday, November 21, 2008

"Uncertainty, Climate Change, and the Global Economy"

This research concludes that "global warming will be a major problem even under very optimistic circumstances":

Uncertainty, climate change, and the global economy, by Torsten Persson and David von Below, VoxEU.org: What will the climate be like in a hundred years’ time? The answer to this question is highly uncertain, and will depend on a number of socio-economic as well as natural processes, which describe the links between human activity, emissions of greenhouse gases, and warming of the atmosphere. The existing policy discussion in important forums, such as the IPCC and Stern reports (see this Vox column), is largely based on the uncertainty about the biogeophysical and biogeochemical systems, as are analyses such as that of Wigley and Raper (2001). In a recent paper, we include such uncertainty – but highlight uncertainty about the drivers of climate change in the socioeconomic system. [...continue reading...]

Wednesday, November 12, 2008

Bang-Bang You're Complex

In many economic problems, feedback loops can be used to optimally control a system. For example, the inputs to an economic system might be government spending, taxes, and the federal funds rate, and, given values for these variables, the system will produce outcomes for other variables, such as output and prices. The goal in these problems is to find a feedback rule for changing the policy variables that best achieves policy objectives.

Take a simpler problem where the system has just one input, the federal funds rate, and one outcome, the level of output. Given a goal such as minimizing the variability of output around a target value, the objective is to find a rule for setting the federal funds rate that minimizes variance in output. That is, if the policy rule is linear and of the form ff = a + b(y-y*), i.e. a Taylor rule, then the solution finds the values of a and b that minimize the loss function (which in this case is the variance of y). The mathematics is very similar to what an engineer might use to control a system with a feedback rule, except that in economics you have to consider how the economic actors respond to actual or expected changes in the feedback policy rule, and that turns out to be much more than a minor complication.
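
The setup above can be sketched in a toy simulation (everything here -- the linear dynamics, the coefficients, the shock process -- is an illustrative assumption, not an estimated model):

```python
import random

def simulate(b, y_star=100.0, a=4.0, T=2000):
    """Variance of output deviations under the rule ff = a + b*(y - y_star)."""
    random.seed(1)  # fixed seed so different rules face the same shocks
    y = y_star
    devs = []
    for _ in range(T):
        ff = a + b * (y - y_star)        # the Taylor-type feedback rule
        shock = random.gauss(0.0, 1.0)
        # Toy economy: output deviations persist, are dampened by a higher
        # funds rate, and are hit by random shocks.
        y = y_star + 0.8 * (y - y_star) - 0.5 * (ff - a) + shock
        devs.append(y - y_star)
    return sum(d * d for d in devs) / len(devs)

print(simulate(b=0.0))  # no feedback: higher output variance
print(simulate(b=0.6))  # leaning against deviations: lower variance
```

Searching over b for the value that minimizes this variance is the control problem described above; the expectations complication is exactly what this toy model leaves out.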

In some classes of these problems, the solution is known as "bang-bang". With bang-bang solutions, the control variable takes just two values, e.g. all the way on or all the way off. The heater in your home probably controls the room temperature with a solution of this form (though it may not be optimal - it depends). The heater is either all the way on or all the way off depending upon the room temperature, and this moderates the variation in the room temperature. The outcome - the temperature - is monitored, and there is a feedback rule that turns the heater on or off when the deviation from the targeted temperature exceeds some set amount. Such solutions are quite common (see here for technical details).
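
The home-heater example can be written out as a minimal bang-bang controller (all physical constants below are made up for illustration):

```python
def thermostat(target=20.0, band=0.5, t0=15.0, steps=200):
    """Simulate room temperature under a bang-bang heating rule."""
    temp, heater_on = t0, False
    history = []
    for _ in range(steps):
        if temp < target - band:
            heater_on = True      # all the way on
        elif temp > target + band:
            heater_on = False     # all the way off
        # Otherwise keep the current state (the dead band avoids chattering).
        heat_in = 1.5 if heater_on else 0.0
        temp += heat_in - 0.1 * (temp - 10.0)  # heating minus leakage outside
        history.append(temp)
    return history

temps = thermostat()
# After the warm-up, temperature oscillates in a narrow range near the target.
print(min(temps[50:]), max(temps[50:]))
```

The control variable takes only two values, yet the outcome stays close to the target -- the defining feature of a bang-bang solution.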

So common, in fact, that according to this research, evolution has also discovered bang-bang solutions, and it uses them to regulate evolutionary change, i.e. to "correct any imbalance imposed on them through artificial mutations..." This allows the system to self-correct when random mutations throw it off course, and helps to explain how organisms can "be so exquisitely complex, if evolution is completely random":

Evolution's new wrinkle, EurekAlert: A team of Princeton University scientists has discovered that chains of proteins found in most living organisms act like adaptive machines, possessing the ability to control their own evolution.

The research, which appears to offer evidence of a hidden mechanism guiding the way biological organisms respond to the forces of natural selection, provides a new perspective on evolution, the scientists said.

The researchers ... made the discovery while carrying out experiments on proteins... A mathematical analysis of the experiments showed that the proteins themselves acted to correct any imbalance imposed on them through artificial mutations and restored the chain to working order.

"The discovery answers an age-old question that has puzzled biologists since the time of Darwin: How can organisms be so exquisitely complex, if evolution is completely random, operating like a 'blind watchmaker'?" said Chakrabarti... "Our new theory extends Darwin's model, demonstrating how organisms can subtly direct aspects of their own evolution to create order out of randomness."

The work also confirms an idea first floated in an 1858 essay by Alfred Wallace, who along with Charles Darwin co-discovered the theory of evolution. Wallace had suspected that certain systems undergoing natural selection can adjust their evolutionary course in a manner "exactly like that of the centrifugal governor of the steam engine, which checks and corrects any irregularities almost before they become evident." In Wallace's time, the steam engine operating with a centrifugal governor was one of the only examples of what is now referred to as feedback control. Examples abound, however, in modern technology, including cruise control in autos and thermostats in homes and offices.

Continue reading "Bang-Bang You're Complex" »

Saturday, November 08, 2008

"Fibonacci, Fermat, and Finance"

I went to a seminar yesterday in the Physics Department to see the manager of a hedge fund, John Seo, talk about "Fibonacci, Fermat, and Finance: How a Biophysicist Built a Multi-Billion Dollar Catastrophe Bond Fund after Re-Reading the Foundations of Modern Finance" (NY Times magazine story). One thing I learned at the seminar, and from asking questions at dinner afterward, was the origin of present value analysis. It goes back to the mathematician Leonardo Fibonacci and chapter 12 of his book "Liber Abaci", written in 1202. Here's a working paper on the topic:

Fibonacci and the Financial Revolution, by William N. Goetzmann, NBER Working Paper No. 10352: ...Traveling Merchant Problems: The second type of financial problem is a set of “traveling merchant” examples, akin to accounting calculations for profits obtained in a series of trips to trading cities.

The first example is:

A certain man proceeded to Lucca on business to make a profit doubled his money, and he spent there 12 denari. He then left and went through Florence; he there doubled his money, and spent 12 denari. Then he returned to Pisa, doubled his money and it is proposed that he had nothing left. It is sought how much he had at the beginning.

[Update: the problem isn't clear about this, but 12 denari are spent at each of the three stops, including Pisa] Leonardo proposes an ingenious solution method. Since capital doubles at each stop, the discount factor for the third cash flow (in Pisa) is (1/2)(1/2)(1/2) = 1/8. He multiplies the periodic cash flow of 12 denari by a discount factor that is the sum of the individual discount factors for each trip, i.e. (1/2) + (1/4) + (1/8). The solution is 10½ denari. The discount factor effectively reduces the individual cash flows back to the point before the man reached Lucca.

Notice that this approach can be generalized to allow for different cash flows at different stages of the trip, a longer sequence of trips, different rates of return at each stop, or a terminal cash flow. In the twenty examples that follow the Lucca-Florence-Pisa problem, Leonardo presents and solves increasingly complex versions with various unknown elements. For example, one version of the problem specifies the beginning value and requires that the number of trips be found – e.g. “A certain man had 13 bezants, and with it made trips, I know not how many, and in each trip he made double and he spent 14 bezants. It is sought how many were his trips.” This and other problems demonstrate the versatility of his discounting method. They also provide a framework for the explicit introduction of the dimension of time, and the foundation for what we now consider finance.

In case you don't see this, though he didn't explicitly use this formula, he realized that the "present value" of the cash flow is:

PV = R_1/(1+i)^1 + R_2/(1+i)^2 + R_3/(1+i)^3

where R_j is the cash spent at each point j on the trip, and i is the profit rate. In this case, R_j = 12 for all j, and i = 100%, or 1.0. Thus:

PV = 12/(1+1)^1 + 12/(1+1)^2 + 12/(1+1)^3

      = 12/2 + 12/4 + 12/8

      = 12[(1/2) + (1/4) + (1/8)] (as above)

      = 10.5
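
The same computation in code (a small sketch of the discounting arithmetic; the function and names are mine, not Goetzmann's):

```python
def present_value(cash_flows, rate):
    """PV = sum of R_j / (1 + rate)**j for j = 1..n."""
    return sum(r / (1.0 + rate) ** j
               for j, r in enumerate(cash_flows, start=1))

# 12 denari spent at each of Lucca, Florence, and Pisa; capital doubles at
# each stop, i.e. a 100% profit rate per leg.
pv = present_value([12, 12, 12], rate=1.0)
print(pv)  # 10.5 denari, Fibonacci's answer
```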

But does Fibonacci realize this applies not just to traveling merchants, but also to discounting financial cash flows over time? Yes. All you have to do is convert the problem back to the traveling merchant example. Going back to the NBER paper:

Immediately following the trip problems, Fibonacci poses and solves a series of banking problems. Each of these follows the pattern established by the trips example – the capital increases by some percentage at each stage, and some amount is deducted. For example:

A man placed 100 pounds at a certain [banking] house for 4 denari per pound per month interest and he took back each year a payment of 30 pounds. One must compute in each year the 30 pounds reduction of capital and the profit on the said 30 pounds. It is sought how many years, months, days and hours he will hold money in the house....

Fibonacci explains that the solution is found by using the same techniques developed in the trips section. Intervals of time replace the sequence of towns visited and thus a time-series of returns and cash draw-downs can be evaluated. Once the method of trips has been mastered, then it is straightforward to construct a multiperiod discount factor and apply it to the periodic payment of 30 pounds – although in this problem the trick is to determine the number of time periods used to construct the factor. Now we might use logarithms to address the problem of the nth root for an unknown n, but Fibonacci lived long before the invention of logarithms. Instead, he solves it by brute force over the space of three pages, working forward from one period to two periods etc. until he finds the answer of 6 years, 8 days and [5 and 1/2] hours. The level of sophistication represented by this problem alone is unmatched in the history of financial analysis. Although the mathematics of interest rates had a 3,000 year history before Fibonacci, his remarkable exposition and development of multi-period discounting is a quantum leap above his predecessors.
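
Fibonacci's brute-force approach can be replayed in a few lines, under stated assumptions: a pound of 240 denari, so 4 denari per pound per month is 48/240 = 20% per year (treated as simple interest within the year), with the 30-pound payment taken at year-end. This is one reading of the medieval conventions; the finer day-count that yields the 8 days and 5.5 hours is not reproduced here.

```python
def full_years_of_withdrawals(principal=100.0, annual_rate=0.20, payment=30.0):
    """Count how many complete 30-pound withdrawals the account supports."""
    balance, years = principal, 0
    while balance * (1.0 + annual_rate) >= payment:
        balance = balance * (1.0 + annual_rate) - payment
        years += 1
    return years, balance  # residual funds the final fraction of a year

years, residual = full_years_of_withdrawals()
print(years)     # 6 full years, matching Fibonacci's answer
print(residual)  # ~0.70 pounds left to carry the account into year 7
```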

And, one final example:

Present Value Analysis The most sophisticated of Fibonacci’s interest rate problems is “On a Soldier Receiving Three Hundred Bezants for his Fief.” In it, a soldier is granted an annuity by the king of 300 bezants per year, paid in quarterly installments of 75 bezants. The king alters the payment schedule to an annual year-end payment of 300. The soldier is able to earn 2 bezants on one hundred per month (over each quarter) on his investment. How much is his effective compensation after the terms of the annuity have changed? ...

As before, Fibonacci explains how to construct a multi-period discount factor from the product of the reciprocals of the periodic growth rate of an investment, using the model developed from mercantile trips in which a percentage profit is realized at each city. In this problem, he explicitly quantifies the difference in the value of two contracts due to the timing of the cash flows alone. As such, this particular example marks the discovery of one of the most important tools in the mathematics of Finance – an analysis explicitly ranking different cash flow streams based upon their present value.
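
The soldier's comparison can be sketched numerically, with the caveat that the compounding convention here -- 2 bezants per 100 per month, compounded as (1.02)^3 per quarter -- is an assumption; Fibonacci's own treatment of the quarter differs in detail. The qualitative result is robust: earlier payments can be reinvested, so four quarterly 75s are worth more than a single year-end 300.

```python
QUARTER_GROWTH = 1.02 ** 3  # one quarter's growth at 2% per month (assumed)

def pv_quarterly(payment=75.0, quarters=4):
    """Present value of quarterly payments, discounted at the quarterly rate."""
    return sum(payment / QUARTER_GROWTH ** q for q in range(1, quarters + 1))

pv_installments = pv_quarterly()               # four payments of 75
pv_year_end = 300.0 / QUARTER_GROWTH ** 4      # one year-end payment of 300

print(pv_installments > pv_year_end)   # True: the new schedule costs the soldier
print(pv_installments - pv_year_end)   # the loss in present-value terms
```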

Update: More at Catastrophe bonds and the investor's choice problem.

Friday, November 07, 2008

"The Sequencing of the Mathematical Genome"

"We may be close to seeing how computers, rather than humans, would do mathematics":

Proof by computer: Harnessing the power of computers to verify mathematical proofs, EurekAlert: New computer tools have the potential to revolutionize the practice of mathematics by providing far more reliable proofs of mathematical results than have ever been possible in the history of humankind. These computer tools, based on the notion of "formal proof", have in recent years been used to provide nearly infallible proofs of many important results in mathematics. A ground-breaking collection of four articles by leading experts, published today in the Notices of the American Mathematical Society, explores new developments in the use of formal proof in mathematics.

When mathematicians prove theorems in the traditional way, they present the argument in narrative form. They assume previous results, they gloss over details they think other experts will understand, they take shortcuts to make the presentation less tedious, they appeal to intuition, etc. The correctness of the arguments is determined by the scrutiny of other mathematicians, in informal discussions, in lectures, or in journals. It is sobering to realize that the means by which mathematical results are verified is essentially a social process and is thus fallible. When it comes to central, well known results, the proofs are especially well checked and errors are eventually found. Nevertheless the history of mathematics has many stories about false results that went undetected for a long time. In addition, in some recent cases, important theorems have required such long and complicated proofs that very few people have the time, energy, and necessary background to check through them. And some proofs contain extensive computer code to, for example, check a lot of cases that would be infeasible to check by hand. How can mathematicians be sure that such proofs are reliable?

To get around these problems, computer scientists and mathematicians began to develop the field of formal proof. A formal proof is one in which every logical inference has been checked all the way back to the fundamental axioms of mathematics. Mathematicians do not usually write formal proofs because such proofs are so long and cumbersome that it would be impossible to have them checked by human mathematicians. But now one can get "computer proof assistants" to do the checking. In recent years, computer proof assistants have become powerful enough to handle difficult proofs.
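
As a flavor of what this looks like, here is a toy example in the Lean proof assistant (Lean 4 syntax; Lean is just one of several such assistants, chosen here for illustration -- the Notices articles survey the field):

```lean
-- Each proof term below is checked mechanically, all the way back to the
-- definitions of Nat and equality.
theorem sum_comm (a b : Nat) : a + b = b + a :=
  Nat.add_comm a b

-- Tactics let the mathematician sketch the steps while the assistant
-- verifies every inference:
example (a b c : Nat) (h1 : a = b) (h2 : b = c) : a = c := by
  rw [h1, h2]
```

Real formalizations run to thousands of such steps, which is why the checking must be mechanical.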

Only in simple cases can one feed a statement to a computer proof assistant and expect it to hand over a proof. Rather, the mathematician has to know how to prove the statement; the proof then is greatly expanded into the special syntax of formal proof, with every step spelled out, and it is this formal proof that the computer checks. It is also possible to let computers loose to explore mathematics on their own, and in some cases they have come up with interesting conjectures that went unnoticed by mathematicians. We may be close to seeing how computers, rather than humans, would do mathematics.

The four Notices articles explore the current state of the art of formal proof and provide practical guidance for using computer proof assistants. If the use of these assistants becomes widespread, they could deeply change mathematics as it is currently practiced. One long-term dream is to have formal proofs of all of the central theorems in mathematics. Thomas Hales, one of the authors writing in the Notices, says that such a collection of proofs would be akin to "the sequencing of the mathematical genome".

The articles appear today in the December 2008 issue of the Notices and are freely available at http://www.ams.org/notices.