Bacteria and Boltzmann

(A more technical post than most, in which we look at bacterial population densities, and watch a movie.)

This term I taught a graduate biophysics course, about which I’ll write a recap later. Bacteria came up a lot in the course, both because they’re important and because they offer many nice illustrations of how physical principles of diffusion, hydrodynamics, and stochasticity govern how life works. A benefit of teaching a graduate course is that I learn things, too! I’ll describe here something about chemotaxis that I wasn’t previously aware of.

Motile bacteria will migrate towards higher concentrations of attractants (e.g. nutrients). Bacteria like E. coli move in a sequence of roughly straight-line, constant velocity “runs”…

… separated by “tumbles” at which they randomize their direction:

If the bacterium senses that the nutrient concentration is increasing with time, the probability per unit time of tumbling decreases (and so, on average, the run length increases). In other words, if the bacterium is moving in a “good” direction, it’s more likely to continue in that direction. All this is well known, nicely described, for example, in Howard Berg’s classic book, which I first read at least a decade ago (2000?).

The new (to me) part: Given a bunch of bacteria, it’s clear that the process described above will increase the bacterial concentration at regions of high attractant concentration. But how much? If I double the attractant concentration (c) in one area compared to another, does the bacterial population density (P) double? What is the relationship between c and P?

Remarkably, in retrospect, I had never thought about this question, and had never stumbled upon a discussion of it, until seeing it as an exercise in Bill Bialek’s recent book. As Bialek sketches, one can write a simple model of this process in which we state that the tumble rate (r) is a linear function of ∂c/∂t, the perceived rate of change of the concentration. Just given this, we can derive, analytically, the form of the steady-state bacterial concentration P(c(x)). (We consider just one dimension, x; I expect that the answer generalizes easily to higher dimensions.) I elaborated slightly and assigned this as a homework problem for my class. (If you’re interested, it’s #5 of this: Homework5; the original by Bialek is #49 of this: fromBialek_page154.)

The answer is amazing:

P(x) = (1/Z) exp(-β c(x)), where Z is a normalization constant. In other words, the bacterial density follows a Boltzmann distribution, analogous to P = exp(-Energy/Temperature), with the local concentration playing the role of energy. The analog of the temperature in statistical mechanics is set by β = ∂r/∂ċ, the proportionality between the tumble rate and the perceived rate of change of concentration.

It’s such a nice, succinct result! And it highlights how strong the response is: because P depends exponentially on c, doubling the nutrient concentration doesn’t simply double the bacterial density; it multiplies it by a factor e^(|β|Δc), where Δc is the change in concentration. For |β|Δc = 2, that’s a factor of e^2 ≈ 7.4.
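For the curious, the result can be derived in a few lines from a one-dimensional “telegraph” picture (a sketch; I’m assuming, as in the model, that a tumble fully randomizes direction, so half of all tumbles reverse the swimmer):

```latex
% Right- and left-movers (densities P_+, P_-) run at speed v and tumble at
% rates r_\pm = r_0 \pm \beta v \, \partial c/\partial x, since a run
% perceives \dot c = \pm v \, \partial c/\partial x.
\frac{\partial P_\pm}{\partial t} = \mp v\,\frac{\partial P_\pm}{\partial x}
  - \frac{r_\pm}{2}\,P_\pm + \frac{r_\mp}{2}\,P_\mp
% In steady state with zero net flux, P_+ = P_- = P/2; subtracting the two
% equations then gives
0 = -v\,\frac{\partial P}{\partial x} - \beta v\,\frac{\partial c}{\partial x}\,P
\quad\Longrightarrow\quad
P(x) \propto e^{-\beta c(x)}.
```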

In retrospect, I should perhaps not have been surprised, since:

  • the concentration gradient changes the likelihood that a right-moving bacterium becomes a left-moving one, or vice versa, and so is reminiscent of forward and backward reaction rates in chemical reactions, the equilibrium concentrations of which are given by a Boltzmann expression dependent on the free energy.
  • it seems that every distribution in nature is exponential, Gaussian, or Poisson, so we had a 1/3 chance of finding a Boltzmann distribution!

One can derive the above result analytically with some careful thinking and a bit of time. One can also simulate this chemotaxis model and validate the result with less thinking and more time. I didn’t assign the simulation to my students (though they had several other programming assignments), but I’ve done it myself for kicks.

Consider, for example, this parabolic concentration profile:

Let’s watch 200 bacteria moving (in 1D), responding to this c(x) as in the Bialek model, i.e., with a tumble rate that depends linearly on the perceived rate of change of c:

The movie is fun, but it’s clearer to look at the histogram of bacterial number, i.e. P(x), which the theory predicts is a Gaussian. It is, with the width and shape predicted by the model. The plot on the left shows the final (steady-state) P(x) together with exp(-β c(x)), where β = ∂r/∂ċ; the plot on the right shows the evolution of P(x) with time.

We can make weirder concentration profiles, and again find that the simulated P(x) agrees with the analytic P(x) ∝ exp(-β c(x)):

My MATLAB code for the simulation, which contains descriptions of the parameters, is here. (Note: there’s considerable room for improvement — take this as a sketch!)
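For readers who’d rather tinker than download, here’s a stripped-down Python analogue of such a simulation (this is my sketch, not the MATLAB code linked above; the run speed, baseline tumble rate, sensitivity β, and parabolic c(x) are all assumed, illustrative values):

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed, illustrative parameters (not fit to real E. coli)
v = 5.0        # run speed
r0 = 1.0       # baseline tumble rate
beta = 0.1     # beta = dr/d(cdot): tumble-rate sensitivity to perceived dc/dt
a = 0.005      # concentration profile c(x) = a * x**2
L = 200.0      # reflecting walls at x = +/- L
dt = 0.01
n_bacteria, n_steps = 200, 40000

x = rng.uniform(-L, L, n_bacteria)          # positions
d = rng.choice([-1.0, 1.0], n_bacteria)     # run directions

for _ in range(n_steps):
    cdot = v * d * (2 * a * x)                      # perceived dc/dt on a run
    rate = np.clip(r0 + beta * cdot, 0.0, None)     # tumble rate, kept >= 0
    tumbled = rng.random(n_bacteria) < rate * dt
    # a tumble picks a fresh random direction (half the time the same one)
    d = np.where(tumbled, rng.choice([-1.0, 1.0], n_bacteria), d)
    x += v * d * dt
    d = np.where(np.abs(x) > L, -d, d)              # reflect at the walls
    x = np.clip(x, -L, L)

# Boltzmann prediction: P(x) ~ exp(-beta * c(x)), a Gaussian with standard
# deviation 1/sqrt(2*beta*a) ~ 31.6 for these parameters
print(np.mean(x), np.std(x))
```

Histogramming the final positions should reproduce the predicted Gaussian; with these parameters the sample standard deviation comes out near 32.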

What’s the moral of this story? Yet again, it shows us that random processes can be described by simple rules. It also gives us a simple way to potentially link observed bacterial spatial distributions with underlying chemical gradients, which we’re presently applying in the lab to some of our bacterial imaging experiments.

And the funders said, “Let there be funds,” and there were funds

The University of Oregon’s Research Development Services (RDS) sends a weekly email listing funding opportunities — grants, fellowships, and awards that we might be interested in. It’s a good thing to do, and each week there’s a nice variety of programs from many federal and private funding agencies. RDS might be a bit too permissive in its assessment of funding sources, though, since this week’s email includes:

Yes, the “Creation Research Society” is just what you think it is.

Perhaps I should be appalled, or perhaps I should broaden my funding horizons. After all, how better to test parables of feeding the multitudes than with my lab’s three-dimensional imaging of lots of zebrafish? And can’t the formation of membranes from lipid molecules, which we study extensively, be seen as driven by divine forces rather than secular “self-assembly”? The possibilities are endless.

How much is that physics major in the window?

A friend pointed me to a recent study looking at the earnings (salaries) of people with degrees in different majors — for example, what’s the average salary of a physics major? Or a history major? The authors of the study are economists, and in my opinion their exposition puts forth an overly pecuniary view of the factors that go into choosing a major. Regardless, the data are interesting. The web site is here, and includes links to a summary and a 214-page full report. In case you’re curious, the median salary of 25-59 year-olds with undergraduate degrees in physics is $81,000, ranking #15 out of the 137 majors listed. I’m surprised this is so high. Nine of the top ten majors are various flavors of engineering, with the other (#2) being pharmacy. Chemistry is #50 ($64k) and Biology is #74 ($56k). Depressingly, at the bottom of the list is Early Childhood Education ($39k) — something of immense importance, but that doesn’t pay well.

In addition to salary, the report looks at the popularity of various majors. I was curious whether these two are correlated — whether, for example, there’s a “supply curve” (thinking like an economist) such that the majors for which students are abundant are those for which the pay is less. (This would assume that a lot of other factors are equal across majors.) I can’t extract the number of people with each major from the report — at least not unless I put a lot of work into this, which won’t happen. However, I can easily extract the rankings for each (salary and popularity) and can plot these:
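Eyeballing a plot is one thing; quantifying the correlation is another. Since all I have are rankings, Spearman’s rank correlation is the natural statistic. A minimal sketch (the four-item example rankings are made up for illustration):

```python
import numpy as np

def spearman(rank_a, rank_b):
    """Spearman rank correlation for two rankings without ties:
    rho = 1 - 6 * sum(d^2) / (n * (n^2 - 1))."""
    d = np.asarray(rank_a, dtype=float) - np.asarray(rank_b, dtype=float)
    n = len(d)
    return 1 - 6 * np.sum(d**2) / (n * (n**2 - 1))

# Made-up example: salary rank vs. popularity rank for four majors
print(spearman([1, 2, 3, 4], [1, 2, 3, 4]))   # identical rankings -> 1.0
print(spearman([1, 2, 3, 4], [4, 3, 2, 1]))   # reversed rankings -> -1.0
```

A shapeless cloud of points, like the one in the plot, should give a rho near zero.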

(The colors indicate categories, detailed in the legend of the next graph.)

As you can see, it’s a cloud! There are popular, lucrative majors; unpopular, low-paying majors; and every other combination.

The 137 majors in the study are grouped into categories. If we plot the median salary (the actual amount, not the ranking) versus popularity (percentage of majors), we get the following:

Again, there’s no trend evident. Stretching the horizontal axis with a semilogarithmic scale gives us a nice triangle of points (with one outlier):

Is there a lesson here, other than that there are a lot of business majors? I don’t know. I’ll leave that to you, dear reader.

Slammed! (Why do bacteria care about physics?)

I was a participant (contestant?) in our local “Physics Slam” two weeks ago, in which half a dozen physics faculty gave 10-minute talks to the general public, with a “winner” chosen by about five judges selected from the audience. About 500 people came, filling an auditorium:

My talk, Why do bacteria care about physics?, was mostly about the surreal physics of small-scale fluid flows, which microbes have to deal with and which dictate that bacteria can’t swim by, for example, waving appendages back and forth — a strategy that works well for fish, whales, and other large things. I did a live demonstration of the classic illustration of reversible flows using a big vat of corn syrup — this is one of my favorite demonstrations, and the crowd loved it, spontaneously cheering at the end. The whole talk went remarkably well — you can watch for yourself; my part is at 1:05 or so. I waved my arms a lot. Eric Corwin’s talk, on sand grains and sphere packing, is particularly good (0:38). (If you’ve never seen reversible flows, also check out this YouTube video by fluid dynamics giant G. I. Taylor.)
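To put a number on “small-scale”: the Reynolds number Re = ρvL/μ compares inertial to viscous forces in a flow. A quick estimate (the speeds and sizes below are rough, assumed values, not measurements):

```python
def reynolds(speed, size, density=1000.0, viscosity=1e-3):
    """Re = rho * v * L / mu, using water's density (kg/m^3)
    and viscosity (Pa s) by default."""
    return density * speed * size / viscosity

# Rough, assumed numbers: a ~1 micron bacterium swimming ~30 microns/s,
# versus a ~10 cm fish swimming ~1 m/s
re_bacterium = reynolds(speed=30e-6, size=1e-6)   # viscosity-dominated
re_fish = reynolds(speed=1.0, size=0.1)           # inertia-dominated
print(re_bacterium, re_fish)
```

That’s roughly 3×10⁻⁵ for the bacterium versus 10⁵ for the fish: ten orders of magnitude apart. At Re ≪ 1, flows are reversible, so waving an appendage back and forth gets you nowhere, which is exactly what the corn-syrup demonstration shows.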

My graduate biophysics class has also been exploring the microscopic-scale physics of diffusion and flow. This week, we’ll get to the “low Reynolds number” issues reflected in the Physics Slam talk. Last week, we explored, among other things, the very non-intuitive ways in which diffusion-to-capture works. For example: Imagine a bacterium covered with “sticky” patches (receptors for nutrients, for example). The nutrients dissolved in the bacterium’s surroundings diffuse and, by chance, hit the patches, where they’re absorbed and “eaten.” Each patch is small — say, 1 nm in radius, like a typical protein — compared to the roughly 1 micron radius of the bacterium. The bacterium could cover itself entirely with absorbing patches and maximize its food uptake, but it wouldn’t have any surface left for any other tasks — motility, secretion, sensing, etc. — so this would not be a good strategy to adopt. We, or the microbe, can ask: What fraction of the surface needs to be covered in absorbing patches for the total food uptake to be half as large as it would be if the entire surface were “sticky”? Naively, one might expect the answer to be 0.5 — half the surface should give half the food uptake. This is, however, totally wrong. Remarkably, only about 0.3% of the surface needs to be “sticky” to provide 50% efficiency for diffusive food capture.

A graph of the capture rate versus the area fraction covered by “sticky” patches looks like this (green curve):

or more clearly, on a semilog scale:

This remarkable result follows from the properties of diffusion. A simple derivation can be found in Howard Berg’s classic “Random Walks in Biology.” We can get a rough intuitive sense of how it arises by realizing that diffusive flows are driven by gradients of concentration. If we halve the radius of an absorbing patch, its area drops by a factor of four. However, the average flow lines of the diffusing particles, which must end at the patches, are squeezed into a tighter space, increasing the concentration gradient and thereby giving a greater flow rate that partially counteracts the large drop in area.
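The curve has a simple closed form, the Berg–Purcell result: N absorbing disks of radius s on a sphere of radius a capture at a rate J = J_max · Ns/(Ns + πa), where J_max is the rate for a fully absorbing sphere. Rewritten in terms of the covered area fraction f, a short sketch (the patch and cell radii are assumed, illustrative values; the exact half-maximal coverage depends on them):

```python
import math

def relative_capture(f, s=1e-9, a=1e-6):
    """Berg-Purcell capture rate, relative to a fully absorbing sphere,
    for area fraction f covered by disk-like patches of radius s on a
    cell of radius a (s = 1 nm, a = 1 micron are assumed example values).
    Follows from J/J_max = N*s / (N*s + pi*a) with f = N*s**2 / (4*a**2)."""
    return f / (f + math.pi * s / (4 * a))

f_half = math.pi * 1e-9 / (4 * 1e-6)   # coverage giving half-maximal uptake
print(f_half, relative_capture(f_half))
```

With these radii the half-maximal coverage comes out to a small fraction of a percent; the punchline, that a tiny covered fraction suffices, is robust to the exact numbers assumed.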

This is both amazing and, for bacteria, extremely useful. A handful of receptors is sufficient to efficiently capture molecules from the surroundings, so there’s plenty of room for many types of receptors for many types of molecules (various nutrients, attractants, repellents, etc.).

The physics of the microscopic world are endlessly fascinating. Returning to the Physics Slam: as five of us predicted with high certainty beforehand, our astronomer Scott Fisher won — he started off with an astronomically modified version of the intro to Prince’s Let’s Go Crazy, complete with music. Plus there were photos of colorful space things. The rest of us didn’t stand a chance.

You should appreciate the infrequency of my blog posts

Today’s illustration doesn’t have anything to do with the topic below. I made it for a ten-minute talk I’ll give tomorrow at the local “Physics Slam.” You can see the program here. Short version: six physics faculty will have ten minutes each to explain something. The audience votes on their favorite presentation. Apparently, when it was last done a few years ago, several hundred people came. We’ll see what happens this time! My title:

Why do bacteria care about physics?

At some point, I should practice…

Now on to today’s topic:

Everyone agrees that it’s impossible to keep up with the ever-expanding scientific literature. An interesting recent paper* takes a look at this phenomenon, verifying that the number of papers published every year is, indeed, growing exponentially:

* “Attention decay in science,” Pietro Della Briotta Parolo et al.

The authors look at what this means for scientific “memory.” In general, the rate at which a paper is cited by later papers decays over time (after an initial peak), as it is forgotten or as it gives rise to other works that are cited instead. One might guess that a growth in publication rate would correlate with a larger decay rate for citations — we spend less time with the past as we’re swamped by new stuff. This is indeed what Parolo et al. find: a decay rate that has steadily grown over decades. This is unfortunate: by not considering papers of the more distant past, we risk needlessly re-discovering insights, and we disconnect ourselves from our fields’ pioneering perspectives.

Returning to the overall number of papers: I wonder if this terrifying growth is driven primarily by an increase in the number of scientists or by an increase in papers written per person. I suspect the former. Even within the US, there are a lot more scientists than there used to be [e.g. this graph]. In the developing world this increase is far more dramatic (see e.g. here), as (presumably) it should be.

Unfortunately, I can’t find any data on the total number of scientists worldwide — at least not with just a few minutes of searching — or even the total number of Ph.D.’s awarded each year.

Looking around for any data that might help illuminate trends of population and paper production, I stumbled upon historical data for the American Physical Society (APS), namely the number of members in each year since 1905. It’s not hard to tabulate the total number of papers published each year in the Physical Review journals — the publications of the APS. Looking at how each of these changes with time might give a rough sense of whether one tracks the other. Of course, there are a lot of problems with interpreting any correlation between these two things: APS members (like me) publish in all sorts of journals, not just APS ones; non-APS members publish in APS journals; etc. Still, let’s see what these two look like:

Considering APS journals alone, the number of papers published each year is 10 times what it was a few decades ago! Within the microcosm of the APS, the number of papers has been growing at a far faster rate than the membership.
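For scale, a back-of-the-envelope conversion of that tenfold growth into an annual rate, taking “a few decades” to mean roughly 30 years (an assumption):

```python
import math

growth_factor = 10.0   # tenfold more papers per year...
years = 30.0           # ...over an assumed ~30-year span

rate = math.log(growth_factor) / years   # continuous annual growth rate
doubling_time = math.log(2) / rate       # years for output to double

print(f"{rate:.1%} per year, doubling every {doubling_time:.0f} years")
```

That works out to about 7.7% per year, a doubling time of about nine years.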

What does all this mean? I don’t really know. It’s impossible to do something about the general complaint that there are too many papers to read unless we have some deeper understanding of why we’re in this state. Lacking that, I suppose we’re just stuck reading papers as best we can, or feeling guilty for not reading…

T-minus 9 days for my graduate biophysics course

Next term, I’ll be teaching a brand-new graduate biophysics course. (It’s the first time I’ve taught a graduate course in my eight years as a professor!) I’ve spent quite a while thinking about what should be in it and how the course should be structured. Here, I’ll just note my list of topics (below, with a few comments) and provide a link to the syllabus (here). Hopefully in weeks to come I’ll comment on how the course is going.


Introduction; Physics, statistics, and sight

What are the fundamental limits on vision, and how close does biology come to reaching them? (A brief look.)

Components of biological systems

What are the components of biological systems? What are the length, time, and energy scales that we’ll care about? How can we organize a large list of “parts?”

Probability and heredity (a quick look)

We’ll review concepts in probability and statistical mechanics. We’ll discuss a classic example of how a quantitative understanding of probability revealed how inheritance and mutation are related.

Random Walks

We can make sense of a remarkable array of biophysical processes, from the diffusion of molecules to the swimming strategies of bacteria to the conformations of biomolecules, by understanding the properties of random walks.

Life at Low Reynolds Number

We’ll figure out why bacteria swim, and why they don’t swim like whales.

Entropy, Energy, and Electrostatics

We’ll see how entropy governs electrostatics in water, the “melting” of DNA, phase transitions in membranes, and more.

Mechanics in the Cell

We’ll look more at the mechanical properties of DNA, membranes, and other cellular components, and also learn how we can measure them.

Circuits in the Cell

Cells sense their environment and perform computations using the data they collect. How can cells build switches, memory elements, and oscillators? What physical principles govern these circuits?

Multicellular organization and pattern formation

How does a collection of cells, in a developing embryo, for example, organize itself into a robust three-dimensional structure? We’re beginning to understand how multicellular organisms harness small-scale physical processes, such as diffusion, and large-scale processes, such as folding and buckling, to generate form. We’ll take a brief look at this.

Cool things everyone should be aware of

We live in an age in which we can shine a laser at particular neurons in a live animal to stimulate them, paste genes into a wide array of organisms, and sequence a genome given only a single cell. It would be tragic to be ignorant of these sorts of almost magical things, and they contain some nice physics as well!


As you’ve probably concluded, this is too much for a ten-week course! I will cull things as we go along, based on student input. I definitely want to spend some time on biological circuits, though, which I’m increasingly interested in. I also want to dip into the final topic of “cool things” — I find it remarkable and sad that so many physicists are unaware of fantastic developments like optogenetics, CRISPR, and high-throughput sequencing. Students: prepare to be amazed.

My sea urchin illustration above has nothing to do with the course, but if you’d like a puzzle: figure out what’s severely wrong with this picture.


I’m at a conference at Biosphere 2, the large ecological research facility in the Arizona desert that was originally launched as an attempt at creating a sealed, self-contained ecosystem.

It’s a surreal place — a collection of glass pyramids and domes housing miniature rain forests, deserts, an “ocean,” and a few other biomes — that’s now used for more “normal” research and education. I’m here not to join some futuristic commune (at least not yet), but rather as a participant in a fascinating conference organized by Research Corporation called “Molecules Come to Life” — basically, it gets a lot of people who are interested in complex living systems together to discuss big questions, think of new research directions, and launch new projects. It’s a fascinating and very impressive group that’s here. Interestingly, a huge fraction are physicists, either physicists in physics departments (like me) or people trained as physicists who are now in systems biology, bioengineering, microbiology, and similar departments.

Do the conference topic and the venue have anything to do with one another? Explicitly, no. But in an indirect sense, both touch on issues of scale. A key issue in the study of all sorts of complex systems is how to relate phenomena across different extents of space and time. How can we connect the properties of molecules to the operation of a biological circuit? A circuit to a cell? A cell to an organism? Are there general principles — like those that tie the individually chaotic behaviors of atoms in a gas into robust many-particle properties like pressure and density — that lead to a deeper understanding? Would a piece of a complex system have the same behavior as the whole, or are collective properties scale-dependent?

The initial goal with Biosphere 2 was that these small-scale ecosystems under glass could function sustainably. This failed quite badly (at least at first — see Wikipedia for more details). As we learned on an excellent tour this afternoon, nearly all animals in the enclosure died, the food grown was so minimal that everyone was hungry all the time, and oxygen levels dropped from about 20% to 14% (at which point oxygen had to be pumped in). Walking around, the issue that kept coming to mind was: what is the scale of an ecosystem? Biosphere 2 is really not very big — it’s a few football fields in total area. Are the webs of interaction that can exist in an area this size sufficient to mimic a “real” rainforest, savannah, or other environment? Are they large enough to be stable, and not fluctuate wildly?

Perhaps these questions couldn’t have been answered without building the structure and trying the experiment. (Or perhaps they could.) It would be great to talk to the people behind the project — they were commune dwellers, not scientists — and see what thoughts, assessments, dreams, and predictions went into the planning of this impressive, but odd, place.

Some more photos: