Saturday, December 26, 2009

Eastern APA Session: New Waves in Philosophy of Mathematics

On the first night of the APA in New York City I will be participating in a session designed to unveil the book New Waves in the Philosophy of Mathematics. We are scheduled for Sunday, December 27 from 6:30 to 9:30 pm (GI-8: Society for Realist/Antirealist Discussion). Perhaps not the ideal time, but we will have papers by Otavio Bueno, Oystein Linnebo, Roy Cook, Agustin Rayo and me. Come by to hear about this great volume!

Note: This marks the 100th post on this blog. Thanks to everyone who checks in or links to Honest Toil!

Thursday, December 17, 2009

Dark Matter Rumors (cont.)

The results that prompted the rumors noted in an earlier post have now been unveiled. They concern the possible detection of Weakly Interacting Massive Particles (WIMPs), which are predicted by some theories of dark matter. The group has provided a helpful two-page summary, with the key paragraph:
In this new data set there are indeed 2 events seen with characteristics consistent with those expected from WIMPs. However, there is also a chance that both events could be due to background particles. Scientists have a strict set of criteria for determining whether a new discovery has been made, in essence that the ratio of signal to background events must be large enough that there is no reasonable doubt. Typically there must be less than one chance in a thousand of the signal being due to background. In this case, a signal of about 5 events would have met those criteria. We estimate that there is about a one in four chance to have seen two background events, so we can make no claim to have discovered WIMPs. Instead we say that the rate of WIMP interactions with nuclei must be less than a particular value that depends on the mass of the WIMP. The numerical values obtained for these interaction rates from this data set are more stringent than those obtained from previous data for most WIMP masses predicted by theories. Such upper limits are still quite valuable in eliminating a number of theories that might explain dark matter. (emphasis added)
So, Bryan's prediction was correct! Now if only the scientists would tell us what "reasonable doubt" amounts to ...
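For readers curious about the arithmetic behind such criteria, here is a minimal sketch. It assumes the background count follows a Poisson distribution, and the expected background of 0.9 events is purely an illustrative figure of my own, not a number taken from the CDMS summary:

```python
from math import exp, factorial

def poisson_tail(k, lam):
    """Probability of seeing k or more events when lam are expected."""
    return 1 - sum(exp(-lam) * lam**n / factorial(n) for n in range(k))

lam = 0.9  # hypothetical expected background count, for illustration only
print(poisson_tail(2, lam))  # ~0.23, roughly the "one in four" quoted
print(poisson_tail(5, lam))  # ~0.002, near the one-in-a-thousand criterion
```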

A Universal Pattern for Insurgents?

From this week's Nature:
The researchers collected data on the timing of attacks and number of casualties from more than 54,000 events across nine insurgent wars, including those fought in Iraq between 2003 and 2008 and in Sierra Leone between 1994 and 2003. By plotting the distribution of the frequency and size of events, the team found that insurgent wars follow an approximate power law, in which the frequency of attacks decreases with increasing attack size to the power of 2.5. That means that for any insurgent war, an attack with 10 casualties is 316 times more likely to occur than one with 100 casualties (316 is 10 to the power of 2.5).

[...]

To explain what was driving this common pattern, the researchers created a mathematical model that assumes that insurgent groups form and fragment when they sense danger, and strike in well-timed bursts to maximize their media exposure. The model gave results that resembled the power-law distribution of actual attacks.
This all seems a bit too easy, although I must admit I have not delved into the details of the actual model. I'm also a bit wary of claims about the model's predictive power, as in "He is now working to predict how the insurgency in Afghanistan might respond to the influx of foreign troops recently announced by US President Barack Obama". But at least this is one more case of a purported mathematical explanation in science.
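The quoted ratio is at least easy to check. Here is a one-line computation, assuming only that the frequency of attacks with x casualties scales as x to the power -2.5:

```python
# If the frequency of attacks with x casualties scales as x**(-2.5),
# the relative frequency of two attack sizes depends only on their ratio.
alpha = 2.5
ratio = (100 / 10) ** alpha  # 10-casualty vs 100-casualty attacks
print(ratio)  # 316.22..., i.e. 10 to the power of 2.5
```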

Sunday, December 13, 2009

Dark Matter Rumors Persist

Philosophers interested in tracking how scientists argue for the existence of novel entities might want to stay tuned this week. Rumors of a big announcement continue, centered largely on the blog Resonances and this post.

Thursday, December 10, 2009

The Popularity of Nominalism

One of the surprises in the recently unveiled results of the PhilPapers Survey of philosophers is the popularity of nominalism:

Target responses

Abstract objects: Platonism or nominalism?

Accept or lean toward: Platonism 366 / 931 (39.3%)
Accept or lean toward: nominalism 351 / 931 (37.7%)
Other 214 / 931 (22.9%)

All responses

Abstract objects: Platonism or nominalism?

Accept or lean toward: nominalism 1454 / 3226 (45%)
Accept or lean toward: Platonism 1016 / 3226 (31.4%)
Other 756 / 3226 (23.4%)

That is, 45% of all respondents tend towards nominalism. But when we look at people who list philosophy of mathematics as an AOS, the numbers switch:

AOS: Philosophy of mathematics

Abstract objects: Platonism or nominalism?

Accept or lean toward: Platonism 50 / 102 (49%)
Accept or lean toward: nominalism 30 / 102 (29.4%)
Other 22 / 102 (21.5%)

AOS: Logic or Philosophy of Logic

Abstract objects: Platonism or nominalism?

Accept or lean toward: Platonism 112 / 264 (42.4%)
Accept or lean toward: nominalism 96 / 264 (36.3%)
Other 56 / 264 (21.2%)

Does this mean that people outside of philosophy of mathematics don't care what we think, or is it just that our arguments for platonism aren't very good?

Friday, November 13, 2009

New Book: New Waves in the Philosophy of Mathematics

If Macmillan's web page is to be believed, the anthology New Waves in Philosophy of Mathematics, edited by Bueno and Linnebo, is now out:
In this book, thirteen promising young researchers write on what they take to be the right philosophical account of mathematics and discuss where the philosophy of mathematics ought to be going. New trends are revealed, such as an increasing attention to mathematical practice, a reassessment of the canon, and inspiration from philosophical logic.
(Look inside at Amazon.) Full disclosure: I have a chapter in there about how applied mathematics can be thought of as its own area of mathematics with some implications for how we think about mathematics as a whole. The other contributions that I have read are excellent and this volume would be an ideal starting point for appreciating the new ways in which philosophy of mathematics is developing. Congratulations to the editors for a timely and well-executed job!

Wednesday, November 4, 2009

New Book: Weyl, Mind and Nature: Selected Writings

Princeton University Press has recently reissued Weyl's classic Philosophy of Mathematics and Natural Science along with a collection of essays by Weyl. Brandon Fogel reviews the latter in NDPR, noting
Peter Pesic's collection is the latest, and to date most significant, salvo in the effort to bring long overdue attention to Weyl's philosophical ideas, particularly those regarding science and its integration with mathematics.
While Weyl's philosophical views come across as a bit strange, his work surely marks one of the most serious attempts to integrate a phenomenological starting point with a genuine understanding of our scientific knowledge.

Monday, October 26, 2009

Mathematics and Scientific Representation: Summary and Chapter 1

More than a year ago I posted a fairly vague description of a book project on the ways in which mathematics contributes to the success of science. I have made some progress on bringing together this material and thought it would be useful to post a summary of the chapters of the book along with an introductory chapter where I give an overview of the main conclusions of the book. Hopefully this is useful to other people working on similar projects. Critical suggestions about what is missing or who else is doing similar work are of course welcome!

Update (Feb. 17, 2011): The link to chapter 1 has been replaced with the final version. The link to the summary has been removed.

Thursday, October 15, 2009

The Polymath Project

Gowers and Nielsen offer in the current issue of Nature a report on the online collaboration in mathematics known as the Polymath Project. It is hard to know what to make of it all without delving into the details and trying to understand if there is anything special about this problem which lends itself to collaboration. But two passages jump out for the philosopher:
This theorem was already known to be true, but for mathematicians, proofs are more than guarantees of truth: they are valued for their explanatory power, and a new proof of a theorem can provide crucial insights.
The working record of the Polymath Project is a remarkable resource for students of mathematics and for historians and philosophers of science. For the first time one can see on full display a complete account of how a serious mathematical result was discovered. It shows vividly how ideas grow, change, improve and are discarded, and how advances in understanding may come not in a single giant leap, but through the aggregation and refinement of many smaller insights. It shows the persistence required to solve a difficult problem, often in the face of considerable uncertainty, and how even the best mathematicians can make basic mistakes and pursue many failed ideas. There are ups, downs and real tension as the participants close in on a solution. Who would have guessed that the working record of a mathematical project would read like a thriller?
At over 150,000 words, these records should keep some philosopher busy for a while!

Friday, October 9, 2009

Nobel Prize for Efficient Markets Hypothesis?

One of the core ideas driving the derivation of the Black-Scholes model is the efficient markets hypothesis. Exactly what this comes to is hopefully something I'll post on next week. But for now I'll pass on this from NPR's Marketplace:
Kai Ryssdal's final note.

Not so much news as a commentary on the state of the economic profession. The Nobel Prize in economics comes out Monday morning. I obviously have no idea who's going to win, but the markets think they do. The betting line at Ladbrokes, in London, has Eugene Fama of the University of Chicago as a 2-to-1 favorite.

That's all well and good except for this: Fama's best known for something called the Efficient Markets Theory. That the markets are, in essence, always right. I dunno, I'd say that's a tough sell after the year and a half we've just had. More to come on Monday.

Tuesday, October 6, 2009

Mathematics, Financial Economics and Failure

In a recent post I noted Krugman's point about economics being seduced by attractive mathematics. Since then there have been many debates out there in the blogosphere about the failures of financial economics, but little discussion of the details of any particular case. I want to start that here with a summary of how the most famous model in financial economics is derived. This is the Black-Scholes model, given as (*) below. It expresses the correct price V for an option as a function of the current price of the underlying stock S and the time t.

My derivation follows Almgren, R. (2002). "Financial derivatives and partial differential equations." American Mathematical Monthly 109: 1-12.

In my next post I aim to discuss the idealizations deployed here and how reasonable they make it to apply (*) in actual trading strategies.

A Derivation of the Black-Scholes Model

A (call) option gives the owner the right to buy some underlying asset like a stock at a fixed price K at some time T. Clearly some of the factors relevant to the fair price of the option now are the difference between the current price of the stock S and K, as well as the length of time between now and time T when the option can be exercised. Suppose, for instance, that a stock is trading at $100 and the option gives its owner the right to buy the stock at $90. If the option can be exercised at that moment, then it is worth $10. But if it is six months or a year until the option can be exercised, what is a fair price to pay for the $90 option? This seems like a completely intractable problem that could depend on any number of factors, including features specific to that asset as well as an investor's tolerance for risk. The genius of the Black-Scholes approach is to show how certain idealizing assumptions allow the option to be priced at V given only the current stock price S, a measure of the volatility of the stock price σ, the prevailing interest rate r, and the length of time between now and time T when the option can be exercised. The only unknown parameter here is the volatility σ, but even this can be estimated by looking at the past behavior of the stock or of similar stocks. Using the value V computed from this equation, a trader can execute what appears to be a completely risk-free hedge. This involves either buying the option and selling the stock or selling the option and buying the stock. The position is apparently risk-free because the direction of the stock price is not part of the model, so the trader need not take a stand on whether the stock price will go up or down.

The basic assumption underlying the derivation of (*) is that markets are efficient, so that "successive price changes may be considered as uncorrelated random variables" (Almgren, p. 1). The time interval between now and the time T when the option can be exercised is first divided into N time-steps. We can then deploy a lognormal model of the change in price δS_j at time-step j:

δS_j = a S δt + σ S ξ_j √δt

The ξ_j are random variables whose mean is zero and whose variance is 1 (Almgren, p. 5). Our model reflects the assumption that the percentage size of the random changes in S remains the same as S fluctuates over time (Almgren, p. 8). The parameter a indicates the overall "drift" in the price of the stock, but it drops out in the course of the derivation.
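To make the model concrete, here is a minimal simulation of one price path under this discrete lognormal model. The parameter values are illustrative only:

```python
import random

def simulate_path(S0=100.0, a=0.05, sigma=0.2, T=1.0, N=252):
    """One sample path of dS_j = a*S*dt + sigma*S*xi_j*sqrt(dt),
    with the xi_j independent, mean 0, variance 1."""
    dt = T / N
    S = S0
    for _ in range(N):
        xi = random.gauss(0.0, 1.0)
        S += a * S * dt + sigma * S * xi * dt ** 0.5
    return S

print(simulate_path())  # final stock price after one year
```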

Given that V is a function of both S and t, we can approximate the change in V for a small time-step δt using a series expansion known as a Taylor series

δV = V_t δt + V_S δS + 1/2 V_{SS} δS^2

where additional higher-order terms are dropped. Given an interest rate of r for the assets held as cash, the corresponding change in the value of the replicating portfolio Π = DS + C of D stocks and C in cash is

δΠ = D δS + rC δt

The last two equations allow us to represent the change in the value of a difference portfolio that buys the option and sells the replicating portfolio. The change in value is

δ(V - Π) = (V_t - rC)δt + (V_S - D)δS + 1/2 V_{SS} δS^2

The δS term reflects the random fluctuations of the stock price, and if it could not be dealt with we could not derive a useful equation for V. Fortunately, the δS term can be eliminated if we assume that at each time-step the investor can adjust the number of shares held so that

D = V_S

Then we get

δ(V - Π) = (V_t - rC)δt + 1/2 V_{SS} δS^2

The δS^2 term remains problematic for a given time-step, but we can find its sum over all the time-steps using our lognormal model. This permits us to simplify the equation so that, over the whole time interval Δt,

Δ(V - Π) = (V_t - rC + 1/2 σ^2 S^2 V_{SS})Δt

Strictly speaking, we are here applying a result known as Itô's Lemma.
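The key fact used in this step is that the sum of the δS^2 terms concentrates around σ^2 S^2 Δt. Here is a quick numerical check of that claim, holding S fixed as in the local analysis of a single step (all values illustrative):

```python
import random

# The sum of (sigma*S*xi*sqrt(dt))**2 over many time-steps should be close
# to sigma**2 * S**2 * Delta_t, since the xi_j have variance 1.
sigma, S, T, N = 0.2, 100.0, 1.0, 100_000
dt = T / N
total = sum((sigma * S * random.gauss(0, 1) * dt**0.5) ** 2 for _ in range(N))
print(total, sigma**2 * S**2 * T)  # the two numbers should nearly agree
```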

What is somewhat surprising is that we have found the net change in the value of the difference portfolio in a way that drops any reference to the random fluctuations of the stock price S. This allows us to deploy the efficient market hypothesis again and assume that Δ(V - Π) is identical to the result of investing V - Π in a risk-free bank account with interest rate r. That is,

Δ(V - Π) = r(V - Π)Δt

But given that V - Π = V - DS - C and D = V_S, we can simplify the right-hand side of this equation to

(rV - rS V_S - rC)Δt

Given our previous equation for the left-hand side, we get

(*) V_t + 1/2 σ^2 S^2 V_{SS} + rSV_S - rV = 0

after all terms are brought to the left-hand side.
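Equation (*) has a well-known closed-form solution for the call option, which the post does not derive. As a sanity check, here is a sketch that evaluates that standard formula and verifies by finite differences that it satisfies (*); the strike, rate and volatility values are illustrative:

```python
from math import log, sqrt, exp, erf

def norm_cdf(x):
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def call_price(S, t, K=90.0, T=1.0, r=0.05, sigma=0.2):
    """Standard closed-form solution of (*) for a call with strike K,
    exercisable at time T."""
    tau = T - t
    d1 = (log(S / K) + (r + 0.5 * sigma**2) * tau) / (sigma * sqrt(tau))
    d2 = d1 - sigma * sqrt(tau)
    return S * norm_cdf(d1) - K * exp(-r * tau) * norm_cdf(d2)

# Finite-difference check of V_t + 1/2 sigma^2 S^2 V_SS + r S V_S - r V = 0
S, t, r, sigma, h = 100.0, 0.0, 0.05, 0.2, 1e-4
V = call_price(S, t)
V_t = (call_price(S, t + h) - call_price(S, t - h)) / (2 * h)
V_S = (call_price(S + h, t) - call_price(S - h, t)) / (2 * h)
V_SS = (call_price(S + h, t) - 2 * V + call_price(S - h, t)) / h**2
print(V, V_t + 0.5 * sigma**2 * S**2 * V_SS + r * S * V_S - r * V)
# the second number (the PDE residual) should be ~0
```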

Wednesday, September 30, 2009

Critical Notice of Mark Wilson's Wandering Significance

I have posted a long critical notice of Mark Wilson's amazing book Wandering Significance: An Essay on Conceptual Behavior. It will eventually appear in Philosophia Mathematica. My impression is that even though the book came out in 2006 and is now available in paperback, it has not really had the impact it should in debates about models and idealization. I think this is partly because the book addresses broad questions about concepts that don't often arise in philosophy of science or philosophy of mathematics. But if you start to read the book, it becomes immediately clear how important examples from science and mathematics are to Wilson's views of conceptual evaluation. So, I hope my review will help philosophers of science and mathematics see the importance of the book and the challenges it raises.

Tuesday, September 15, 2009

Cole's Practice-Dependent Realism and Creativity in Mathematics

Julian Cole's "Creativity, Freedom and Authority: A New Perspective on the Metaphysics of Mathematics" is now available via the Australasian Journal of Philosophy. Cole develops what seems to me to be the most careful version of a social constructivist metaphysics for mathematics. Basically the idea is that the activities of mathematics constitute the mathematical entities as abstract entities. This makes it coherent for Cole to insist that the entities have many of the traditional features of abstract objects such as being outside space and time and lacking causal relations. Crucially for the causal point, even though the mathematicians constitute the mathematical entities, they do not cause them to exist.

One consideration in favor of his view that Cole emphasizes is the creativity that mathematicians have to posit new entities. Qua mathematician, he notes "the freedom I felt I had to introduce a new mathematical theory whose variables ranged over any mathematical entities I wished, provided it served a legitimate mathematical purpose" (p. 589). Other mathematicians have of course said similar things, from Cantor's claim that "the essence of mathematics lies precisely in its freedom" (noted by Linnebo in his essay in this volume) to Hilbert's conception of axioms in his debate with Frege.

I have two worries with this starting point. First, is it so clear that mathematicians really have this freedom? The history of mathematics seems filled with controversies about new objects or new mathematical techniques that seem to presuppose the existence of problematic objects. Second, even if mathematicians have a certain kind of freedom to posit new objects, how do we determine that this freedom is independent of prior metaphysical commitments? One option for the traditional platonist or the ante rem structuralist is to insist that mathematicians are now free to posit new objects only because it is highly likely that these new objects can find a place in their background set theory or theory of structures. This of course would not settle the issue against practice-dependent realism, but it gives the realist a strategy to accommodate the same data.

Wednesday, September 9, 2009

Call for Papers: Mathematical and Scientific Philosophy

Readers of this blog should check out the fall meeting of the Indiana Philosophical Association:


Call for Papers

Mathematical and Scientific Philosophy

with a special session on the Darwin Bicentenary

Indiana Philosophical Association Fall Meeting
Invited Speakers: Colin Allen, Elisabeth Lloyd, Larry Moss

Saturday AND Sunday, 5-6 December 2009
Indiana Memorial Union, IU Bloomington

We invite submissions—from philosophers, logicians, and historians and philosophers of science—on topics that fall under the theme of the meeting. Papers should be 35 minutes reading time, i.e., no more than 17 double-spaced pages. Papers will be blind reviewed; the author’s name and affiliation should therefore appear only on the cover sheet.

Send one copy of your paper and a short, one–paragraph abstract to one of the following.

Peter Murphy
Department of Philosophy
Esch Hall 044U
University of Indianapolis
Indianapolis, IN 46227
murphyp at uindy.edu

Bernd Buldt
Department of Philosophy
CM 21 026
IPFW
Fort Wayne, IN 46805
buldtb at ipfw.edu

Charles McCarty
The Logic Program
Sycamore Hall
Indiana University
Bloomington, IN 47405
dmccarty at indiana.edu

Electronic submissions of papers and abstracts in MSWord or pdf formats are encouraged.
Deadline for Submissions: 15 October 2009
Deadline for Notifications: 9 November 2009
For further information, please email Charles McCarty at dmccarty at indiana.edu

Tuesday, September 8, 2009

Krugman on Mathematics and the Failure of Economics

Probably anyone who is interested in this article has already seen it, but Paul Krugman put out an article in Sunday's New York Times Magazine called "How Did Economics Get It So Wrong?". The article is very well-written, but a bit unsatisfying as it combines Krugman's more standard worries about macroeconomics with a short attack on financial economics. I am trying to write something right now about the ways in which mathematics can lead scientists astray, and one of my case studies is the celebrated Black-Scholes model for option pricing. Hopefully I can post more on that soon, but here is what Krugman says about it and similar models which are used to price financial derivatives and devise hedging strategies.

My favorite part is where Krugman says "the economics profession went astray because economists, as a group, mistook beauty, clad in impressive-looking mathematics, for truth". But he never really follows this up with much discussion of the mathematics or why it might have proven so seductive. Section III attacks "Panglossian Finance", but this is presented as if it assumes "The price of a company's stock, for example, always accurately reflects the company's value given the information available". But, at least as I understand it, this is not the "efficient market hypothesis" which underlies models like Black-Scholes. Instead, this hypothesis makes the much weaker assumption that "successive price changes may be considered as uncorrelated random variables" (Almgren 2002, p. 1). This is the view that prices over time amount to a "random walk". It has serious problems as well, but I wish Krugman had spent an extra paragraph attacking his real target.

Almgren, R. (2002). "Financial derivatives and partial differential equations." American Mathematical Monthly 109: 1-12.

Monday, August 31, 2009

New Book: Mathematics and Philosophy

I have just completed a review of the relatively new collection, edited by Bonnie Gold and Roger Simons, called Proof and Other Dilemmas: Mathematics and Philosophy. The review will appear eventually in SIGACT News.

I think everyone who is interested in the interaction between mathematics and philosophy should be encouraged by the volume. The editors have brought together philosophers and mathematicians to try to increase interest in philosophy on the mathematics side. This is a difficult task, and I still have the impression that a philosophy-mathematics collaboration is more difficult than other kinds of interdisciplinary work, e.g. philosophy-physics or philosophy-cognitive science.

From the review:
Hopefully these brief summaries suggest how the editors have sought to link philosophy of mathematics more closely with the interests of mathematicians. There is certainly a need for more engagement between mathematics and the philosophy of mathematics and I believe that this volume marks a productive first step in this direction. It is worth briefly asking, though, what barriers there are to philosophy-mathematics interaction and whether this volume will do much to overcome them. As I have already emphasized, philosophers and mathematicians tend to approach a philosophical topic with different priorities. The mathematicians in this volume often emphasize examples and exciting developments within mathematics, while the philosophers spend most of their energy clarifying concepts and criticizing the arguments of other philosophers. When taken to extremes either approach can frustrate the members of another discipline. Philosophers rightly ask mathematicians to clarify and argue for their positions, while a mathematician may become impatient with endless reflection and debate. A related barrier is the different backgrounds that most philosophers and mathematicians have. Philosophers are typically trained through the careful study of their predecessors and are taught to seek out objections and counterexamples. While most philosophers of mathematics have an excellent understanding of foundational areas of mathematics like logic and set theory, for obvious reasons few have reached a level of specialization in any other area of mathematics. By contrast, most mathematicians will not have much of a background in philosophy and will be tempted to appeal to the most interesting examples from their own mathematics even if they are not accessible to philosophers, let alone many other mathematicians. I am happy to report that most of the philosophical and mathematical discussion in this volume should be fairly accessible to everyone, but this probably happened only because the editors were looking out for complexities that might put off the average reader. Finally, it would be a bit naive to ignore the substantial professional barriers that stand in the way of any substantial philosophy-mathematics collaboration. To put it bluntly, nobody should try to get tenure by publishing for a community outside their home discipline. That said, it is encouraging to see philosophers and mathematicians at least trying to engage each other's interests and I hope these efforts will be continued and expanded in the coming years.

Friday, August 14, 2009

Computer Simulations Support Some New Mathematical Theorems

The current issue of Nature contains an exciting case of the productive interaction of mathematics and physics. As Cohn summarizes here, Torquato and Jiao use computer simulations and theoretical arguments to determine the densest way to pack different sorts of polyhedra together in three-dimensional space:
To find their packings, Torquato and Jiao use a powerful simulation technique. Starting with an initial guess at a dense packing, they gradually modify it in an attempt to increase its density. In addition to trying to rotate or move individual particles, they also perform random collective particle motions by means of deformation and compression or expansion of the lattice's fundamental cell. With time, the simulation becomes increasingly biased towards compression rather than expansion. Allowing the possibility of expansion means that the particles are initially given considerable freedom to explore different possible arrangements, but are eventually squeezed together into a dense packing.
A central kind of case considered is the densest packing of the Platonic solids. These are the five polyhedra formed using only regular polygons of a single sort, where the same number of polygons meet at each vertex: the tetrahedron, icosahedron and octahedron (all using triangles), the cube (using squares) and the dodecahedron (using pentagons). Setting aside the trivial case of the cube, Torquato and Jiao argue that the densest packings for the icosahedron, octahedron and dodecahedron all share a similar feature: they result from a simple lattice structure known as a Bravais lattice. Again, using Cohn's summary:
In such arrangements, all the particles are perfectly aligned with each other, and the packing is made up of lattice cells that each contain only one particle. The densest Bravais lattice packings had been determined previously, but it had seemed implausible that they were truly the densest packings, as Torquato and Jiao's simulations and theoretical analysis now suggest.
The outlier here is the tetrahedron, where the densest packing remains unknown.

Needless to say, there are many intriguing philosophical questions raised by this argument and its prominent placement in a leading scientific journal. To start, how do these arguments using computer simulations compare to other sorts of computer-assisted proofs, such as those of the four color theorem or the more recent Kepler Conjecture? More to the point, does the physical application of these results have any bearing on the acceptability of using computer simulations in this way?

Monday, July 27, 2009

Michael Murray and Jan Cover (Purdue) Take on Evil

My colleague Jan Cover appears in the latest edition of Percontations, a Bloggingheads series which has in the past tackled other philosophical topics like the nature of time. This time the nature of evil is discussed, with special reference to God and Leibniz.

Saturday, July 25, 2009

The Honeycomb Conjecture (Cont.)

Following up my earlier post, and in line with Kenny’s perceptive comment, I wanted to raise two sorts of objections to the explanatory power of the Honeycomb Conjecture. I call them the problem of weaker alternatives and the bad company problem (in line with similar objections to neo-Fregeanism).

(i) Weaker alternatives: When a mathematical result is used to explain, there will often be a weaker mathematical result that seems to explain just as well. Often this weaker result will only contribute to the explanation if the non-mathematical assumptions are adjusted as well, but it is hard to know what is wrong with this. If this weaker alternative can be articulated, then it complicates the claim that a given mathematical explanation is the best explanation.

This is not just a vague possibility for the Honeycomb Conjecture case. As Hales relates:
It was known to the Pythagoreans that only three regular polygons tile the plane: the triangle, the square, and the hexagon. Pappus states that if the same quantity of material is used for the constructions of these figures, it is the hexagon that will be able to hold more honey (Hales 2000, 448).
This suggests the following explanation of the hexagonal structure of the honeycomb:
(1) Biological constraints require that the bees tile their honeycomb with regular polygons without leaving gaps so that a given area is covered using the least perimeter.

(2) Pappus’ theorem: Any partition of the plane into regions of equal area using regular polygons has perimeter at least that of the regular hexagonal honeycomb tiling.
This theorem is much easier to prove and was known for a long time.
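For the three regular tilings in play, the comparison behind Pappus' theorem is elementary. Here is a quick sketch computing the perimeter of a unit-area regular n-gon (in an actual tiling each edge is shared between two cells, but that rescales all three numbers alike):

```python
from math import pi, sqrt, tan

def perimeter_unit_area(n):
    """Perimeter of a regular n-gon with area 1.
    Area = n * s**2 / (4 * tan(pi/n)) gives side s = sqrt(4*tan(pi/n)/n)."""
    return n * sqrt(4 * tan(pi / n) / n)

for n, name in [(3, "triangle"), (4, "square"), (6, "hexagon")]:
    print(name, perimeter_unit_area(n))
# triangle ~4.56, square 4.0, hexagon ~3.72: the hexagon uses least perimeter
```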

If this is a genuine problem, then it suggests an even weaker alternative which arguably deprives the explanation of its mathematical content:
(1) Biological constraints require that the bees tile their honeycomb with regular polygons without leaving gaps so that a given area is covered using the least perimeter.

(2’) Any honeycomb built using regular polygons has perimeter at least that of the regular hexagonal honeycomb tiling.
We could imagine supporting this claim using experiments with bees and careful measurements.

(ii) Bad company: If we accept the explanatory power of the Honeycomb Conjecture despite our uncertainty about its truth, then we should also accept the following explanation of the three-dimensional structure of the honeycomb. The honeycomb is built on the two-dimensional hexagonal pattern by placing the polyhedron given on the left of the picture both above and below the hexagon. The resulting polyhedron is called a rhombic dodecahedron.

[Image: the polyhedron placed above and below each hexagonal cell (left) and Tóth's counterexample (right).]

So it seems like we can explain this by a parallel argument to the explanation of the two-dimensional case:
(1*) Biological constraints require that the bees build their honeycomb with polyhedra without leaving gaps so that a given volume is covered using the least surface area.

(2*) Claim: Any partition of a three-dimensional volume into regions of equal volume using polyhedra has surface area at least that of the rhombic dodecahedron pattern.
The problem is that claim (2*) is false. Hales points out that Tóth showed that the figure on the right above is a counterexample, although "The most economical form has never been determined" (Hales 2000, 447).

This poses a serious problem for anyone who thinks that the explanatory power of the Honeycomb Conjecture is evidence for its truth. For in the closely analogous three-dimensional case, (2*) plays the same role, and yet is false.

My tentative conclusion is that both problems show that the bar should be set quite high before we either accept the explanatory power of a particular mathematical theorem or take this explanatory power to be evidence for its mathematical truth.

Friday, July 24, 2009

Schupbach Crushes Pincock!

Over at Choice and Inference, Jonah Schupbach has initiated a discussion of my PSA 2008 paper on mathematics, science and confirmation theory. Readers of this blog may be interested in how it is going ...

Thursday, July 23, 2009

What Follows From the Explanatory Power of the Honeycomb Conjecture?

Following up the intense discussion of an earlier post on Colyvan and mathematical explanation, I would like to discuss in more detail another example that has cropped up in two recent papers (Lyon and Colyvan 2008, Baker 2009). This is the Honeycomb Conjecture:
Any partition of the plane into regions of equal area has perimeter at least that of the regular hexagonal honeycomb tiling (Hales 2000, 449).
The tiling in question is just (Hales 2001, 1)

[Image: the regular hexagonal tiling.]

The Honeycomb Conjecture can be used to explain the way in which bees construct the honeycombs that they use to store honey. The basic idea of this explanation is that bees which waste the least material on the perimeters of the cells covering a given surface area will be favored by natural selection. As Lyon and Colyvan put it:
Start with the question of why hive-bee honeycomb has a hexagonal structure. What needs explaining here is why the honeycomb is always divided up into hexagons and not some other polygon (such as triangles or squares), or any combination of different (concave or convex) polygons. Biologists assume that hivebees minimise the amount of wax they use to build their combs, since there is an evolutionary advantage in doing so. ... the biological part of the explanation is that those bees which minimise the amount of wax they use to build their combs tend to be selected over bees that waste energy by building combs with excessive amounts of wax. The mathematical part of the explanation then comes from what is known as the honeycomb conjecture: a hexagonal grid represents the best way to divide a surface into regions of equal area with the least total perimeter. … So the honeycomb conjecture (now the honeycomb theorem), coupled with the evolutionary part of the explanation, explains why the hive-bee divides the honeycomb up into hexagons rather than some other shape, and it is arguably our best explanation for this phenomenon (Lyon and Colyvan 2008, 228-229).
Lyon and Colyvan do not offer an account of how this conjecture explains, but we can see its explanatory power as deriving from its ability to link the biological goal of minimizing the use of wax with the mathematical feature of tiling a given surface area. It is thus very similar to Baker's periodic cicada case where the biological goal of minimizing encounters with predators and competing species is linked to the mathematical feature of being prime.

Baker uses the example to undermine Steiner’s account of mathematical explanation. For Steiner, a mathematical explanation of a physical phenomenon must become a mathematical explanation of a mathematical theorem when the physical interpretation is removed. But Baker notes that the Honeycomb Conjecture wasn’t proven until 1999, and this failed to undermine the explanation of the structure of the bees’ hive (Baker 2009, 14).

So far, so good. But there are two interpretations of this case, only one of which fits with the use of this case in the service of an explanatory indispensability argument for mathematical platonism.
Scenario A: the biologists believe that the Honeycomb Conjecture is true and this is why it can appear as part of a biological explanation.
Scenario B: the biologists are uncertain if the Honeycomb Conjecture is true, but they nevertheless deploy it as part of a biological explanation.
It seems to me that advocates of explanatory indispensability arguments must settle on Scenario B. To see why, suppose that Scenario A is true. Then the truth of the Conjecture is presupposed when we give the explanation, and so the explanation cannot give us a reason to believe that the Conjecture is true. A related point concerns the evidence that the existence of the explanation is supposed to confer on the Conjecture according to Scenario B. Does anybody really think that the place of this conjecture in this explanation gave biologists or mathematicians a new reason to believe that the Conjecture is true? The worry seems even more pressing if we put the issue in terms of the existence of entities: who would conclude from the existence of this explanation that hexagons exist?

Hales, T. C. (2000). "Cannonballs and Honeycombs." Notices Amer. Math. Soc. 47: 440-449.

Hales, T. C. (2001). "The Honeycomb Conjecture." Disc. Comp. Geom. 25: 1-22.

Sunday, July 19, 2009

Two New Drafts: Surveys on "Philosophy of Mathematics" and "The Applicability of Mathematics"

I have posted preliminary drafts of two survey articles that are hopefully of interest to readers of this blog. The first is for the Continuum Companion to the Philosophy of Science, edited by French and Saatsi, on "Philosophy of Mathematics":
In this introductory survey I aim to equip the interested philosopher of science with a roadmap that can guide her through the often intimidating terrain of contemporary philosophy of mathematics. I hope that such a survey will make clear how fruitful a more sustained interaction between philosophy of science and philosophy of mathematics could be.
The second is for the Internet Encyclopedia of Philosophy on "The Applicability of Mathematics":
In section 1 I consider one version of the problem of applicability tied to what is often called "Frege's Constraint". This is the view that an adequate account of a mathematical domain must explain the applicability of this domain outside of mathematics. Then, in section 2, I turn to the role of mathematics in the formulation and discovery of new theories. This leaves out several different potential contributions that mathematics might make to science such as unification, explanation and confirmation. These are discussed in section 3 where I suggest that a piecemeal approach to understanding the applicability of mathematics is the most promising strategy for philosophers to pursue.
In line with the aims of the IEP, my article is more introductory, but hopefully points students to the best current literature.

Both surveys are of course somewhat selective, but comments and suggestions are more than welcome!

Saturday, July 18, 2009

Leitgeb Offers an "Untimely Review" of the Aufbau

Topoi has a fun series of "untimely reviews" of classic works in philosophy commissioned with the following aim: "We take a classic of philosophy and ask an outstanding scholar in the same field to review it as if it had just been published. This implies that the classical work must be contrasted with both past and current literature and must be framed in the wider cultural context of the present day."

Hannes Leitgeb has carried this off with great panache with the Aufbau. The opening paragraph sets the tone:
Philosophy is facing a serious crisis, but no one cares. When German Idealism, Existentialism, and Marxism allied with Sociology, Psychoanalysis, Cultural History, and Literature Studies in the early 20th century, all attempts at conducting philosophy in a style similar to that of the scientists got expelled from the High Church of Philosophy. The creation of the Wykeham Professorship in Hermeneutics (formerly: Logic) at Oxford and the Stanford Chair of Textual Non-Presence (formerly: Methodology of Science) are well-known indicators of these, by now, historical developments. The best philosophical work since then is to be found in the history of philosophy—if one is lucky. One cannot help but wondering what turn philosophy would have taken if someone had picked up the revolutionary developments in logic and mathematics in the 1920s and directed them towards philosophy. Maybe there would still be logic courses in philosophy departments? Who knows?
Here's hoping that some more classics of analytic philosophy get similar treatments soon!

Thursday, July 16, 2009

El Niño Has Arrived. But What is El Niño?

According to Nature the latest El Niño has begun in the Pacific. I got interested in this meteorological phenomenon back when I was living in California and coincidentally read Mike Davis' polemic Late Victorian Holocausts: El Niño Famines and the Making of the Third World. While a bit over the top, it contains a great section on the history of large-scale meteorology, including the discovery of El Niño. As I discuss in this article, El Niño is a multi-year cyclical phenomenon over the Pacific that affects sea-surface temperature and pressure from India to Argentina. What I think is so interesting about it from a philosophy of science perspective is that scientists can predict its evolution once a given cycle has formed, but a detailed causal understanding of what triggers a cycle or what ends it remains a subject of intense debate. See, for example, this page for an introduction to the science and here for a 2002 article by Kessler which asks if El Niño is even a cycle. This provides one more case where causal ignorance is overcome by sophisticated science and mathematics.

Thursday, July 9, 2009

Scientists Wonder If Philosophy Makes You a Better Scientist

Over at Cosmic Variance Sean Carroll has initiated an ongoing discussion of the following passage from Feyerabend:
The withdrawal of philosophy into a “professional” shell of its own has had disastrous consequences. The younger generation of physicists, the Feynmans, the Schwingers, etc., may be very bright; they may be more intelligent than their predecessors, than Bohr, Einstein, Schrodinger, Boltzmann, Mach and so on. But they are uncivilized savages, they lack in philosophical depth — and this is the fault of the very same idea of professionalism which you are now defending.
With some hesitation Carroll concludes that "I tend to think that knowing something about philosophy — or for that matter literature or music or history — will make someone a more interesting person, but not necessarily a better physicist." (See comment 56 by Lee Smolin and comment 64 by Craig Callender for some useful replies.)

Beyond that debate, it's worth wondering how knowing some science and mathematics helps the philosopher of science and mathematics. Pretty much everyone in these areas of philosophy would agree that it does help, but exactly how is probably a controversial issue.

Colyvan Blocks the "Easy Road" to Nominalism

In a paper posted on his webpage and listed as forthcoming in Mind, Mark Colyvan launches a new offensive against fictionalists like Azzouni, Melia and Yablo. They present a non-platonist interpretation of the language of mathematics and science that, they argue, does not require the "hard road" that Field took. Recall that Field tried to present non-mathematical versions of our best scientific theories. As Colyvan describes the current situation, though, "There are substantial technical obstacles facing Field's project and these obstacles have prompted some to explore other, easier options" (p. 2). Colyvan goes on to argue that, in fact, these fictionalists do require the success of Field's project if their interpretations are to be successful.

I like this conclusion a lot, and it is actually superficially similar to what I argued for in my 2007 paper "A Role for Mathematics in the Physical Sciences". But what I argued is that Field's project is needed to specify a determinate content to mixed mathematical statements (p. 269). Colyvan takes a different and perhaps more promising route. He argues that without Field's project in hand, the fictionalist is unable to convincingly argue that apparent reference to mathematical entities is ontologically innocent. This is especially difficult given the prima facie role of mathematics in scientific explanation:
The response [by Melia] under consideration depends on mathematics playing no explanatory role in science, for it is hard to see how non-existent entities can legitimately enter into explanations (p. 11, see also p. 14 for Yablo).
I have noted this explanatory turn in debates about indispensability before, but here we see Colyvan moving things forward in a new and interesting direction. Still, I continue to worry that we need a better positive proposal for the source of the explanatory contributions from mathematics, especially if it is to bear the weight of defending platonism.

Tuesday, July 7, 2009

Mancosu on Mathematical Style

Paolo Mancosu continues his innovative work in the philosophy of mathematics with a thought-provoking survey article on Mathematical Style for the Stanford Encyclopedia of Philosophy. From the introductory paragraph:
The essay begins with a taxonomy of the major contexts in which the notion of ‘style’ in mathematics has been appealed to since the early twentieth century. These include the use of the notion of style in comparative cultural histories of mathematics, in characterizing national styles, and in describing mathematical practice. These developments are then related to the more familiar treatment of style in history and philosophy of the natural sciences where one distinguishes ‘local’ and ‘methodological’ styles. It is argued that the natural locus of ‘style’ in mathematics falls between the ‘local’ and the ‘methodological’ styles described by historians and philosophers of science. Finally, the last part of the essay reviews some of the major accounts of style in mathematics, due to Hacking and Granger, and probes their epistemological and ontological implications.
As Mancosu says later in the article "this entry is the first attempt to encompass in a single paper the multifarious contributions to this topic". So it is wide-open for further philosophical investigation!

Saturday, June 20, 2009

New Draft: Mathematics, Science and Confirmation Theory

Here is the latest version of my paper from the PSA. As noted earlier, the goal of the session was to establish some links between philosophy of mathematics and philosophy of science. My aim was to make the connection through confirmation, although all I have done so far in this paper is raise the issue in what is hopefully a useful and novel way. This is part of an ongoing project, so comments are certainly welcome!

Mathematics, Science and Confirmation
Abstract: This paper begins by distinguishing intrinsic and extrinsic contributions of mathematics to scientific representation. This leads to two investigations into how these different sorts of contributions relate to confirmation. I present a way of accommodating both contributions that complicates the traditional assumptions of confirmation theory. In particular, I argue that subjective Bayesianism does best accounting for extrinsic contributions, while objective Bayesianism is more promising for intrinsic contributions.

Wednesday, June 17, 2009

Richardson on Carus on Carnap

Richardson has a review in NDPR of Carus' recent book on Carnap. It is fairly sympathetic, but I think it strikes the right note of skepticism about Carus' attempts to extract an Enlightenment project from Carnap's work that will not only rescue some notion of explication in the service of clarifying scientific knowledge, but will also relate scientific knowledge to normative disputes in ethics and politics. As I read the book, Carus does an excellent job clarifying Carnap's moves towards a defensible picture of explication, but the link to values is still hard to make out. As Richardson puts it,
Carus's book leaves, that is to say, more to be done to specify and implement the project he announces. One can only hope that he continues to work in this vein and to inspire others to do so also. I am not convinced that what is at stake in interpreting Carnap's philosophy is ultimately our Western way of life, but, given the well-known social projects of the Vienna Circle, it would not be surprising if some aspects of interpreting Carnap's project aided in our philosophical understanding of our own social projects. I hope this review has given some indication of the multiple levels on which Carus's book is worth engaging philosophically. The book will be central to the continuing detailed scholarly discussions of Carnap's philosophy. More than this, it will, I hope, help raise to consciousness several larger issues regarding the social import of key projects within analytic philosophy.

Tuesday, June 16, 2009

Nahin on Torricelli’s Funnel

On the beach I finally got a chance to start (if not finish) Nahin’s When Least is Best: How Mathematicians Discovered Many Clever Ways to Make Things as Small (or as Large) as Possible. So far it is an engrossing survey of work on maximum/minimum problems that leads one gently into the mathematical intricacies of the history without being overly technical.

One of the first examples that Nahin discusses is Torricelli’s Funnel (also known as Gabriel’s Horn). One can think of it as the result of rotating the "first quadrant branch" (positive x, positive y) of the hyperbola xy = 1 around the x-axis. If we consider its surface, then we can show that the total surface area is greater than any finite number. But if we consider its volume, then we can show that it is finite: the volume is π. While I had come across this example before in Mancosu’s book, Nahin raises a paradox which I think should be more widely known among those working on mereology and "intuitions" in philosophy. As Nahin puts it, if the surface is infinite, then we cannot paint the funnel with a bucket of paint, no matter how large the bucket is. But if the volume is finite, we can paint the inside of the funnel by filling the funnel with paint! Do we have a proof that this figure is impossible?

No! Nahin points out that we are tacitly working with two different notions of paint: "mathematical paint" and "real paint". If we consider real paint, it is composed of molecules of some finite size. So, if we pour the real paint into the funnel, then it will not coat the inside of the funnel because at some distance its molecules will no longer fit. But if we consider "mathematical paint", then a finite amount can coat the outside of the funnel, or the inside of the funnel, because a finite volume can be spread out to any degree over a surface if we place no limit on the thickness of the coating. This elegant example shows just how liable we are to make mistakes when we consider these sorts of mereological questions if we fail to pay attention to the mathematical subtleties.
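For anyone who wants to check the two computations, here is a minimal sketch using sympy. The surface integral itself is awkward to evaluate symbolically, so the code checks the standard lower bound 2π/x, which already diverges:

```python
import sympy as sp

x = sp.symbols('x', positive=True)

# Volume of revolution of y = 1/x about the x-axis, for x >= 1
V = sp.integrate(sp.pi / x**2, (x, 1, sp.oo))
print(V)  # pi: a finite amount of "mathematical paint" fills the funnel

# The surface area integrand 2*pi*(1/x)*sqrt(1 + 1/x**4) is at least 2*pi/x,
# and already this lower bound diverges:
A_lower = sp.integrate(2 * sp.pi / x, (x, 1, sp.oo))
print(A_lower)  # oo: the surface area is greater than any finite number
```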

Thursday, June 4, 2009

A Course in the History of Analytic Philosophy

Following up the previous post, here is the list of lectures that I gave here in Taiwan, with the readings for each lecture. I had 22 sessions, with an hour and a half per session, but pressed into four weeks. I ended up with only 18 lectures, with some sessions having more reading than others. An introductory course in deductive logic was presupposed.

1 What is analytic philosophy? What is the history of analytic philosophy?
Glock, What is Analytic Philosophy?, pp. 21-48.

2 Kant and Mill
(i) Kant, Prolegomena to Any Future Metaphysics (1783), Preamble and First Part. (ii) Mill, A System of Logic (1843), Book I, Ch. 3, sections 6-9, Ch. 5 & Book II, Ch. 6.

3 Frege, Foundations: Project & Critical Phase
Foundations, Introduction, sections 1-44

4 Frege, Foundations: Constructive Phase
Foundations, sections 45-69

5 Frege, Foundations: Implications
Foundations, sections 70-109

6 Frege, Two later papers
"Sense and Reference", "The Thought"

7 Moore and Russell
Moore, "Refutation of Idealism"

8 Russell on Denoting
Russell, "On Denoting"

9 Russell, Problems: Perception
Problems of Philosophy, ch. 1-4

10 Russell, Problems: Universals
Problems of Philosophy, ch. 5-10

11 Russell, Problems: Judgment
Problems of Philosophy, ch. 11-15

12 Wittgenstein, Tractatus: Metaphysics
1-2.063

13 Wittgenstein, Tractatus: Picturing
2.1-4.28

14 Wittgenstein, Tractatus: Logic
4.3-5.5571

15 Wittgenstein, Tractatus: Nonsense
5.6-7

16 The Vienna Circle
Neurath, Carnap, Hahn, "The Scientific World Conception: The Vienna Circle", Schlick, "The Turning Point in Philosophy", Carnap, "Elimination of Metaphysics"

17 Protocol Sentences
Neurath, "Physicalism", "Protocol Sentences", Carnap, "Protocol
Sentences"

18 Carnap & Quine
Carnap, "Empiricism, Semantics and Ontology", Quine, "Two Dog-
mas of Empiricism"

Ideally there would be two more lectures: (i) one after 17 filling out the second phase of the protocol sentence debate with Schlick's "Foundation of Knowledge" and some later Neurath papers, "Radical Physicalism and the 'Real World'" and "Unity of Science as a Task", and (ii) a final lecture bringing together some of the lessons for the history of analytic philosophy and noting some later developments with Quine and post-Quine. While this is a lot for one semester, over fifteen weeks I think it strikes a good balance between coverage of material and detailed discussion.

Wednesday, June 3, 2009

Teaching the History of Analytic Philosophy

Rather than worrying about the nature of analytic philosophy or taking a poll on who the most important philosophers are, I wanted to raise the issue of how we should structure an introductory course on the history of analytic philosophy. It seems to me that the history of analytic philosophy has reached the kind of maturity that we associate with other areas of the history of philosophy, like the modern period or Kant. With these topics, there are a few standard ways to organize an introductory course. There are also some companion introductory books that can be used to supplement readings of the classic primary texts.

But in the history of analytic philosophy, we don't really have either a standard syllabus or adequate companion books. Over the years several people have asked me how I teach my history of analytic classes. Unfortunately, I have tried out several different ways of organizing a Frege-Russell-Wittgenstein course and have always had trouble with the Russell part. My latest experience in Taiwan has convinced me that we should have a few basic goals when organizing such a course.

First, as with any history of philosophy course, the readings should be mostly drawn from the primary texts of the philosophers themselves. These readings should cover several different philosophers and span various areas of philosophy. It is not that useful, I would argue, to just focus on philosophy of language, for example, or just philosophy of mathematics. Picking just one area of philosophy would give a misleading impression of early analytic philosophy.

Second, there should be some attempt to relate the readings together into some kind of sustained narrative. One problem with some introductory books out there right now is that they cover just one philosopher. So, they miss the important interactions and disputes between philosophers that are crucial to the development of analytic philosophy. The narrative need not be some kind of continuous advancement of understanding, but could just as easily include several false starts or strange innovations on a given issue.

Third, students should learn not only the material covered, but also come to appreciate its remoteness from many of our contemporary ways of doing philosophy. As I put this point in my most recent course, early analytic philosophy was done in a different context, where a context includes a choice of problems, methods and standard positions. So, we should acquaint our students with this different context and help them to see how radical a transformation was effected by people like Frege, Russell and Wittgenstein.

On this last point, I can imagine the reply
Aren't we analytic philosophers, after all? If we are analytic philosophers, then surely we share more or less the same philosophical context of earlier analytic philosophers. So, the sorts of misunderstandings that result from variation in philosophical context simply cannot arise when we look back at these writings.
I want to suggest that this sort of response is misguided. What it ignores is that in its early stages analytic philosophy involved a radical change in the way philosophy was done. If this is right, then these early analytic philosophers were operating in a very different philosophical context from the one we enjoy now. This is for the simple reason that their philosophical contributions changed the philosophical context in many respects. These include all the features of a context: what problems are important, which methods are appropriate and which answers are viable. Many more traditional aspects of philosophy were thrown out and many new problems and techniques were imported into philosophy. The revolutionary character of early analytic philosophy, then, means that we must be careful in our approach to these writings, perhaps even more careful than when reading Aristotle or Aquinas where the difference in philosophical context is obvious and uncontested.

In future posts I hope to flesh out more how this sort of course would proceed, but for now I would be interested in hearing from teachers and students what their thoughts are on this sort of issue.

Tuesday, June 2, 2009

New Book: Glock, What is Analytic Philosophy?

As part of my course here in Taiwan I have been trying to read Glock's 2008 book What is Analytic Philosophy? (reviewed here). I am only about half-way through, but it is already one of the most encouraging contributions to the field in recent years. To start, Glock shows an in-depth knowledge of the many different aspects of analytic philosophy, from its earliest stages to its contemporary manifestations in Europe. Another positive is the level at which the book is written. While some exposure to analytic philosophy is necessary to appreciate his main points, most of the discussion would be accessible to a determined undergraduate or a beginning graduate student.

Glock's main claim is set out in his "Introduction":
According to the [historical conception], analytic philosophy is first and foremost a historical sequence of individuals and schools that influenced, and engaged in debate with, each other, without sharing any single doctrine, problem, method or style ... [But] a purely historical conception ignores the fact that philosophers can be more or less analytic on grounds other than historical ties. These worries can be laid to rest if we acknowledge that analytic philosophy is a tradition held together not just by relations of influence, but also by overlapping similarities (pp. 19-20).
So, Glock offers a hybrid account of what analytic philosophy is, combining historical influence with similarities in philosophical commitments.

As I continue to read the book I will be interested to see if Glock addresses a tension that I see in many attempts to characterize analytic philosophy. On the one hand, we want to understand why analytic philosophy developed at the time and place that it did. On the other hand, if we are sympathetic to analytic philosophy, we also want to explain what is good or best about it compared to other developments in philosophy. Both desires can be easily combined if there are conclusive philosophical arguments for certain distinctive views of analytic philosophy, and these arguments were presented by the early analytic philosophers. But, in my experience at least, these things are very hard to find. As a result, many historians feel forced to choose either a purely causal reconstruction of historical developments or else a timeless reconstruction of philosophical arguments. Some third alternative is clearly needed.

Monday, June 1, 2009

Back to Blogging!

Apologies to the few regular readers of this blog for my lack of posts through May. I finished up my sabbatical visit at the Center for the Philosophy of Science in Pittsburgh at the beginning of May. I highly recommend it for anybody at any rank whose work intersects with the philosophy of science. Among other things, John Norton organized a reading group for the visiting fellows where we read each other's work in progress. While this caused some disturbing flashbacks to graduate school for me, the whole group was great and I certainly learned a lot about different areas of philosophy of science. PhilSci Archive now has a page where some of the work done by fellows this year can be seen. Everyone should consider applying for 2010-2011 -- the deadline for applications is in December.

After Pittsburgh I travelled to Taiwan where I am teaching an intensive four-week summer course on the history of analytic philosophy at Soochow University. This has been a very enlightening experience, and also very time consuming as Michael Mi, my host, asked me to circulate my lecture notes to the students to help them follow along. We are now in the last week of the class, so some of my energies can turn back to blogging! I plan on posting more about this class and the issues it raises for teaching the history of analytic soon.

Wednesday, April 29, 2009

The Deep Blue of Jeopardy

According to this note from Scientific American, IBM scientists are aiming to unveil a computer that can compete against humans on the game-show Jeopardy. As a follow-up to the success with chess, and the claimed success with poker, this seems to me a bit of a stretch. A "final showdown" is planned for some time in 2010.

Monday, April 27, 2009

End Universities as We Know Them or Just End Universities?

Columbia Professor of Religion Mark C. Taylor offers a fairly bizarre series of recommendations for reforming universities in today's New York Times. He starts by making the well-known point that many graduate programs are larger than they should be because graduate student teaching saves universities money. This is true, but unrelated to his "reforms", which include abolishing traditional departments and tenure.

An example:
Abolish permanent departments, even for undergraduate education, and create problem-focused programs. These constantly evolving programs would have sunset clauses, and every seven years each one should be evaluated and either abolished, continued or significantly changed. It is possible to imagine a broad range of topics around which such zones of inquiry could be organized: Mind, Body, Law, Information, Networks, Language, Space, Time, Media, Money, Life and Water.
It makes perfect sense to have interdisciplinary centers of research or even graduate programs. But how is someone supposed to invest the time and energy needed to gain specialized knowledge in any given field if they have to worry that their entire program might be abolished in seven years!?

Saturday, April 11, 2009

New Draft: Abstract Representations and Confirmation

Here is a recent draft of a paper I have been working on throughout my year at the Pittsburgh Center for the Philosophy of Science. It corresponds roughly to chapters III and IV of my book project where I go into more detail with examples and the significance for confirmation. I hope to post a more comprehensive overview of the project soon, but for now this may interest those working on both modeling and indispensability arguments.

Abstract: Many philosophers would concede that mathematics contributes to the abstractness of some of our most successful scientific representations. Still, it is hard to know what this abstractness really comes to or how to make a link between abstractness and success. I start by explaining how mathematics can increase the abstractness of our representations by distinguishing two kinds of abstractness. First, there is an abstract representation that eschews causal content. Second, there are families of representations with a common mathematical core that is variously interpreted. The second part of the paper makes a connection between both kinds of abstractness and success by emphasizing confirmation. That is, I will argue that the mathematics contributes to the confirmation of these abstract scientific representations. This can happen in two ways which I label "direct" and "indirect". The contribution is direct when the mathematics facilitates the confirmation of an accurate representation, while the contribution is indirect when it helps the process of disconfirming an inaccurate representation. Establishing this conclusion helps to explain why mathematics is prevalent in some of our successful scientific theories, but I should emphasize that this is just one piece of a fairly daunting puzzle.

Update (July 23, 2009): I have now linked to a new version of the paper.

Update (Sept. 30, 2010): This paper has been removed.

Thursday, April 9, 2009

Dupré & Griffiths Reject "An Unproductive Controversy"

In a letter to Nature, John Dupré and Paul Griffiths argue that Harry Collins' recent note in Nature is a mischaracterization of the current state of science studies.

One amusing feature of Collins' note is that he only cites work by himself. So, the second and third waves of science studies that he discusses coincide with the change in his own focus. This reinforces Dupré and Griffiths' point that the philosophy of science is already engaged in the kind of productive collaboration between the humanities and the sciences that Collins is calling for. More importantly, this collaboration is not based on some sacrosanct sociological model of science as just another social institution, but rather on an engagement with the content of the scientific views themselves and their scientific justification.

Tuesday, April 7, 2009

NYT Columnist Declares the End of Philosophy

What grade would you give this David Brooks essay in a freshman philosophy course? Perhaps a C for effort:
Think of what happens when you put a new food into your mouth. You don’t have to decide if it’s disgusting. You just know. You don’t have to decide if a landscape is beautiful. You just know.
Moral judgments are like that. They are rapid intuitive decisions and involve the emotion-processing parts of the brain. Most of us make snap moral judgments about what feels fair or not, or what feels good or not. We start doing this when we are babies, before we have language. And even as adults, we often can’t explain to ourselves why something feels wrong.

Thursday, April 2, 2009

Mathematical Laws or Trivial Patterns?

Philip Ball offers an entertaining summary of a recent paper in Science on how data can be analyzed to propose laws. The kinds of examples discussed, from physics through biology, show the need for some philosophical clarification:
As Schmidt and Lipson point out, some of the invariants embedded in natural laws aren't at all intuitive because they don't actually relate to observable quantities. Newtonian mechanics deals with quantities such as mass, velocity and acceleration, whereas its more fundamental formulation by Joseph Louis Lagrange invokes the principle of minimal action — yet 'action' is an abstract mathematical quantity that can be calculated but not really 'measured'.
And many of the seemingly fundamental constructs of natural law — the concept of force, say, or the Schrödinger equation in quantum theory — turn out to be mathematical conveniences or arbitrary (if well motivated) guesses that merely work well. Whether any physical reality should be ascribed to such things, or whether they should just be used as theoretical conveniences, remains unresolved in many of these constructs.
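The core idea behind this sort of law-mining is easy to sketch, even though Schmidt and Lipson's actual system is far more sophisticated (as I understand it, it searches a space of symbolic expressions with an evolutionary algorithm rather than scoring a fixed list of candidates). Here is a toy version in Python; the simulated oscillator and the candidate quantities are made up purely for illustration:

import math

def simulate(steps=2000, dt=0.001, x=1.0, v=0.0, k=4.0, m=1.0):
    """Generate (x, v) samples from a harmonic oscillator via leapfrog steps."""
    data = []
    for _ in range(steps):
        v += 0.5 * dt * (-k * x / m)
        x += dt * v
        v += 0.5 * dt * (-k * x / m)
        data.append((x, v))
    return data

# Candidate "laws": functions of the state. Only one is a true invariant
# of the k=4, m=1 oscillator, namely the total energy 2*x^2 + 0.5*v^2.
candidates = {
    "x + v":           lambda x, v: x + v,
    "x * v":           lambda x, v: x * v,
    "x^2 + v^2":       lambda x, v: x**2 + v**2,
    "2*x^2 + 0.5*v^2": lambda x, v: 2 * x**2 + 0.5 * v**2,
}

def variability(f, data):
    """Coefficient of variation of f along the trajectory: low means invariant."""
    vals = [f(x, v) for x, v in data]
    mean = sum(vals) / len(vals)
    var = sum((u - mean) ** 2 for u in vals) / len(vals)
    return math.sqrt(var) / (abs(mean) + 1e-12)

data = simulate()
for name, f in sorted(candidates.items(), key=lambda kv: variability(kv[1], data)):
    print(f"{name:18s} variability = {variability(f, data):.6f}")

Running this ranks the energy expression as by far the most stable candidate, which is the sense in which the data "propose" a law. Whether the winning expression deserves to be called a law, or is just a convenient pattern, is exactly the question Ball's examples press.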

Tuesday, March 24, 2009

It's Official: Math Can Do Anything

At least according to this IBM ad.

Wednesday, March 18, 2009

Under the ruler, faster than the ruler?

One of the highlights of the recent workshop on Models and Fiction hosted by the Institute of Philosophy in London was Deena Skolnick Weisberg's presentation on her recent psychological study of how the imagination is deployed in fiction. One fairly robust finding for adults is that we tend to 'import' claims that we believe into the fiction even when they are not mentioned in a story. For example, when asked "is 2+2=4 true in the story you just heard?" most adults said "yes". Skolnick also observed a slightly reduced tendency to import beliefs that we recognize as contingent such as "is Obama President in the story you just heard?".

For me an equally interesting phenomenon is one that Skolnick just had time to mention: known ways in which our imaginative powers fail to track what will happen in the world. A dramatic example of this is found in this YouTube video. It is worthwhile just trying to predict what will happen! Apparently it is not clear exactly what systematic error we tend to make, but I take this example to complicate some attempts to identify scientific modelling with fiction.

Sunday, February 22, 2009

Post-blogging the Central: Plantinga and Dennett

The Central APA in Chicago this past weekend seemed fairly empty, although I heard from one of the organizers that registrations this year were about the same as last year. One of the more interesting sessions was in the very last time slot, and had Dennett commenting on Plantinga's "Science and Religion: Where the Conflict Really Lies". Partly what was remarkable about the session was how many people were there. The room was changed at the last minute to accommodate the additional interest, but even so, it was still standing room only. With at least 200 people packed into a small conference room, it was certainly one of the better attended APA events that I have been to.

I had to leave early to make my flight, so I only heard Plantinga's talk. Here I didn't hear much that was new. In the first half Plantinga argued that a committed theist could accept evolution because evolution per se is compatible with theism. This is mainly because the process of natural selection with random variation was said to be consistent with a divine plan which guided what we see as random. I am not sure how much this point of view depends on Plantinga's view that the warrant for theism is basic, but granting that point, I can see the coherence of his position.

The second half of the presentation argued that there is a quasi-religious "naturalism" which is in fact in tension with belief in evolution. Here Plantinga rehearsed his notorious argument that the combination of naturalism and evolution is self-defeating because it undermines the belief in the reliability of our cognitive faculties, and so provides a defeater for these beliefs.

Hearing the argument again drew my attention to one of the steps that seems very problematic. Plantinga's first premise is that P(R/N&E) is low. Here R is the belief that our cognitive faculties are reliable, N is naturalism and E is evolutionary theory. My concern is that even if this probability is low, that is irrelevant to the existence of defeaters. For a basic point about conditionalization is that we should only conditionalize on our total evidence. Often this is captured by some kind of K meant to encapsulate all our background knowledge. So, even if P(R/N&E) is low, P(R/N&E&K) may be higher, and may actually end up being high enough to avoid a defeater for R.

If we ignore total evidence, we can come up with easy defeaters for theism T. For example, P(T/S) is low, where S is suffering. But of course theists are not forced to conditionalize on S alone, and can also include other beliefs from their store of background knowledge.
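Here is a toy numerical illustration of the total evidence point, with all of the probabilities stipulated by me for the example (they are not meant to reflect anyone's actual credences):

# All probabilities are conditional on N&E. The joint distribution over
# R (faculties reliable) and K (background knowledge) is stipulated so
# that R is improbable on N&E alone but probable once K is added.
probs = {
    # (R, K): probability
    (True,  True):  0.18,
    (True,  False): 0.02,
    (False, True):  0.02,
    (False, False): 0.78,
}

def p(event):
    """Probability of the set of (R, K) outcomes satisfying `event`."""
    return sum(pr for outcome, pr in probs.items() if event(*outcome))

p_R = p(lambda r, k: r)                                    # P(R/N&E)
p_R_given_K = p(lambda r, k: r and k) / p(lambda r, k: k)  # P(R/N&E&K)

print(f"P(R/N&E)   = {p_R:.2f}")          # 0.20 -- low
print(f"P(R/N&E&K) = {p_R_given_K:.2f}")  # 0.90 -- high

A low P(R/N&E) is thus perfectly compatible with a high P(R/N&E&K), so no defeater follows from the first premise alone.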

I am not an expert on the discussion of this argument, so maybe someone has made this objection before. Any comments are welcome, especially by those who saw the rest of the session!

Update: there is now an extended description of the session here.

Tuesday, February 17, 2009

"Idealization in Science" APA Session

Readers of this blog might be interested in checking out a session this Thursday night, 7:30-10:30pm, at the Central APA on the topic of idealization. The lineup:

Robert Batterman (University of Western Ontario), “Explanatorily Essential Idealizations”

Otávio A. Bueno (University of Miami), “Idealization in Science: An Inferential Conception”

Christopher Pincock (Purdue University), "How to Avoid Inconsistent Idealizations" (formerly “Idealization and Mathematical Tractability”)

Michael Weisberg (University of Pennsylvania), “Deploying Highly Idealized Models”

I will be defending the contentious claim that idealizations of ocean waves do not involve the assumption that the ocean is infinitely deep ...

Thanks to Omar Mirza (St. Cloud State) for chairing!

Sunday, February 8, 2009

Wilson on the Missing Physics

In “Determinism and the Mystery of the Missing Physics” (BJPS Advance Access) Mark Wilson uses the debate about determinism and classical physics to make the more general point about “the unstable gappiness that represents the natural price that classical mechanics must pay to achieve the extraordinary success it achieves on the macroscopic level” (3). Wilson focuses mostly on Norton’s “dome” example and Norton’s conclusion that it shows that classical mechanics is not deterministic. The main objection to this conclusion is that Norton relies on one particular fragment of classical mechanics, and only finds a counterexample to determinism by mistreating what are really “descriptive holes” (10). By contrast, Wilson argues that there are different fragments to classical mechanics: (MP) mass point particle mechanics, (PC) the physics of rigid bodies with perfect constraints (analytic mechanics) and (CM) continuum mechanics. Norton's example naturally lies in (PC). Each fragment has its own descriptive holes which become manifest when we seek to understand the motivation for this or that mathematical technique or assumption at the basis of a treatment of a given system. Typically, a hole in one fragment can be fixed by moving to another fragment, but then that fragment itself has its own holes that prevent a comprehensive treatment. As a result, Wilson concludes that there is no single way the world has to be for “classical mechanics” to be true, and, in particular, there is no answer to the question of whether or not classical mechanics is deterministic.
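For readers who have not seen Norton's example, here is the usual quick presentation, with units chosen so the constants drop out (my summary; see Norton's paper for the details). For a unit mass at radial distance r from the apex of the dome, Newton's second law takes the form

\[
  \ddot{r} = \sqrt{r}, \qquad r(0) = 0, \quad \dot{r}(0) = 0,
\]

which has the quiescent solution r(t) = 0 for all t, but also, for every time T,

\[
  r(t) =
  \begin{cases}
    0 & t \le T, \\
    \tfrac{1}{144}\,(t - T)^4 & t \ge T.
  \end{cases}
\]

Since nothing in the dynamics fixes T, the particle may sit at the apex forever or slide off at any moment whatsoever. This is the alleged indeterminism that Wilson wants to diagnose as a descriptive hole of (PC) rather than a discovery about "classical mechanics" as such.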

I think Wilson has noticed something very important about the tendencies of philosophers of science: philosophical positions are typically phrased in terms of how things are quite generally or universally, but our scientific theories, when examined, are often not up to the task of answering such general questions. It seems to me that Wilson opts to resolve this situation by rejecting the philosophical positions as poorly motivated. But another route would be to try to recast the philosophical positions in more specific terms. For example, if, as Wilson argues, descriptive holes are more or less inevitable in these sorts of cases, then a suitably qualified kind of indeterminism cashed out in terms of the existence of these holes can be vindicated. Other debates, like the debate about scientific realism, seem to me to be in need of similar reform, rather than outright rejection.

Wednesday, February 4, 2009

An Introduction to Carnap's Aufbau

As promised earlier, here is a draft of a survey article on Carnap's Logical Structure of the World or Aufbau. The article will eventually be submitted to Philosophy Compass. Comments welcome, although please bear in mind that it is hard to summarize 80 years of discussion in 6000 words! Update (May 2012): A comment has drawn my attention to the broken link -- the published version is now online here.

Sunday, February 1, 2009

Weisberg on Models of Cognitive Labor in Science

Anyone who enjoyed the old cellular-automaton game of Life will have fun with the Java application that Michael Weisberg has made available on his webpage. Here Weisberg gives a small piece of the broader research project that he has been carrying out with Ryan Muldoon concerning how to understand the division of labor in successful scientific communities. As explained here, the project adopts a landscape approach where regions correspond to research strategies and different regions have different levels of epistemic significance. In the model, agents then explore these landscapes according to different strategies. A preliminary result is that populations of "mavericks" who deliberately avoid regions explored by others do best. While the significance of this for the original question is not entirely clear, this is certainly an exciting way to investigate the issue!
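To give a flavor of the setup, here is a deliberately crude one-dimensional caricature in Python. It is my own toy, not Weisberg and Muldoon's model (theirs, as I understand it, uses a two-dimensional landscape and more refined strategies), but it shows how "maverick" avoidance of trodden ground can increase the total epistemic significance a community uncovers:

import random

random.seed(1)
SIZE = 200
# A bumpy landscape of epistemic significance: three triangular peaks.
peaks = [(40, 0.9), (110, 1.0), (170, 0.6)]
landscape = [max(h * max(0.0, 1 - abs(i - c) / 25) for c, h in peaks)
             for i in range(SIZE)]

def run(population, steps=500):
    """population is a list of strategies: 'maverick' or 'follower'."""
    visited = set()
    agents = [random.randrange(SIZE) for _ in population]
    for _ in range(steps):
        for idx, strategy in enumerate(population):
            pos = agents[idx]
            moves = [p for p in (pos - 1, pos + 1) if 0 <= p < SIZE]
            if strategy == "maverick":
                fresh = [p for p in moves if p not in visited]
                moves = fresh or moves      # avoid trodden ground if possible
            else:
                trodden = [p for p in moves if p in visited]
                moves = trodden or moves    # prefer trodden ground
            # Move to the locally more significant of the allowed options.
            agents[idx] = max(moves, key=lambda p: landscape[p])
            visited.add(agents[idx])
    # Epistemic progress: total significance of everything anyone visited.
    return sum(landscape[p] for p in visited)

print("all followers:", round(run(["follower"] * 10), 1))
print("all mavericks:", round(run(["maverick"] * 10), 1))

On runs of this toy the mavericks cover more of the landscape and so uncover more significance, which at least gestures at why maverick populations do best in the real model.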

Saturday, January 31, 2009

Models and Fiction

In a forthcoming paper "Models and Fiction", Roman Frigg gives an argument for the view that scientific models are best understood as fictional entities whose metaphysical commitments are “none” (17). I think this argument is a new and important one, but I don’t agree with it. Frigg first considers the view that models are abstract structures. He points out that an abstract mathematical structure, by itself, is not a model because there is nothing about it that ties it to any purported target system. But "in order for it to be true that a target system possesses a particular structure, a more concrete description must be true of the system as well" (5). The problem is that this more concrete description is not a true description of the abstract structure, and it is not a true description of the target system either in the case of idealization. So, for these descriptions to do their job of linking the abstract structures to their targets, they must be descriptions of "hypothetical systems", and it is these systems that Frigg argues are the models after all.

My objection to this argument is that there are things besides Frigg’s descriptions that can do the job of linking abstract structures to target systems. A weaker link is a relation of denotation between some parts of the abstract structure and features of the target system. This, of course, requires some explanation, but a denotation or reference relation, emphasized by Hughes among others, need not involve a concrete description of any hypothetical system.

(Cross-posted with It's Only a Theory.)

Thursday, January 29, 2009

New Philosophy of Science Blog

Gabriele Contessa has come up with a great idea: a group blog for general philosophy of science. Check it out at It's Only A Theory. So far the other bloggers are Marc Lange, Otavio Bueno and myself. But I expect the list and the range of topics addressed there to grow quickly!

Friday, January 23, 2009

Sullivan on Macbeth on Frege's Logic

In the long tradition of negative book reviews, Peter Sullivan launches a fairly sustained attack on Macbeth's book on Frege's logic. Some highlights or lowlights, depending on your taste for such things:
In the short term, this book will probably make quite a stir; one hopes that in the longer term, it will be seen to have done no lasting damage to Frege studies. It is an extraordinary work, whose central contentions are remarkable chiefly for their perversity.
Because she so undervalues the achievement of Begriffsschrift, Macbeth is content to have no account at all of what Frege might have been thinking when he wrote it.
In this book, in her accounts both of the development of Frege’s own thought and of its relation to the tradition it founded, Macbeth does the history of logic backwards. She portrays Frege as reacting against a background of doctrine that the works of Carnap, Tarski, Quine et alii have somehow already magically set in place; and she portrays him as reacting against that background for reasons which he has yet to discover. This does not make for a plausible story.

Wednesday, January 14, 2009

Bleg: Aufbau Literature Since 1990

Here is a preliminary bibliography of places to look for focused discussions of the Aufbau since 1990. I have not always given the titles of papers in a collection if that collection has several different papers.

Suggestions welcome! Please also let me know if you have a view about the most important issues for our understanding and interpretation of the Aufbau. This is for a Philosophy Compass article that I am currently working on.

Tuesday, January 13, 2009

Scientific American Profile of Penelope Maddy

A rare glimpse of how someone became a philosopher of mathematics: Maddy describes the route from Westinghouse finalist to philosopher. Strangely, the profile does not mention her most recent book.

New Book: Grounding Concepts: An Empirical Basis for Arithmetical Knowledge

C. S. Jenkins' relatively new book looks like an exciting contribution to the epistemology of mathematics that aims to relate debates in the philosophy of mathematics to some more recent work on concepts and the a priori. Based on the title and on her earlier paper, I had expected that Jenkins aimed to defend some kind of neo-Millian view of arithmetic, in line with Kitcher. But this seems to have been a mistake. Her preface lays out a clear desire to defend the a priority of arithmetic:
(1) that arithmetical truths are known through an examination of our arithmetical concepts;
(2) that (at least our basic) arithmetical concepts map the arithmetical structure of the independent world;
(3) that this mapping relationship obtains in virtue of the normal functioning of our sensory apparatus. (x)
It is (1) and (3) that might not initially seem to sit well together, so I look forward to seeing how Jenkins reconciles some kind of apriorism with some kind of empiricism.

Sunday, January 4, 2009

Models and Simulations 3 Program

After a bit of a delay, the program for the Models and Simulations 3 conference is now online. The conference will be held at the University of Virginia and will run all day on March 6th and 7th and the morning of March 8th.

Obvious highlights of the program are the two keynote speakers: Mark Bedau and Patrick Suppes. But there are about 50 other speakers, making this one of the larger special-topic conferences. Judging from the titles, it looks like a good mix of papers focused on the different sciences as well as some of the main conceptual issues in modeling and simulation.

My paper is grandly titled "Methods of Multiscale Modeling" and will be my attempt to integrate issues about scaling and the topic of an earlier post into broader epistemological issues about modeling and scientific reasoning. Hopefully a draft will appear here soon!