E-Mail carlip@physics.ucdavis.edu
"Quantum gravity is notoriously a subject
where problems vastly outnumber results."
-- Sidney Coleman
But the study of quantum gravity is difficult, and the main thing we have learned in these years of research is that the obvious approaches don't work. The difficulties are partly technical -- general relativity is a complicated, nonlinear theory -- but we face deep conceptual problems as well. According to general relativity, gravity is a consequence of the geometry of spacetime. That means that when we talk about quantizing gravity, we really mean "quantizing space and time themselves." We don't know what a completed quantum theory of gravity will look like, but we will surely end up with a picture of the Universe quite unlike anything we now imagine.
In the past few years, two promising new approaches to these problems have emerged. The first is string theory, a model in which elementary particles are not treated as pointlike objects, but instead as extended one-dimensional "strings." (Here is a nice nontechnical introduction.) The second is a reformulation of general relativity in terms of new variables -- "self-dual connections" or "Ashtekar variables" -- that behave more like those of conventional quantum field theories. This approach is now often called "quantum geometry." More recently, a new method, "causal dynamical triangulations," has also shown promise. Gary Au has written a nice nontechnical paper based on interviews with physicists working on string theory and quantum geometry, and I have written a more technical review of a variety of approaches to the problem.
An alternative general strategy for research is to explore simpler models that share the underlying conceptual features of quantum gravity while avoiding the technical difficulties. For example, general relativity in 2+1 dimensions -- two spatial dimensions plus time -- has the same basic structure as the full (3+1)-dimensional theory, but it is technically much simpler, and the implications of quantum gravity can be examined in detail. Similarly, quantum black holes may be simple enough to allow us to learn concrete lessons about the full theory.
For the past few years, I have concentrated on four areas of research:
In all, roughly 15 different approaches to quantizing (2+1)-dimensional gravity have been developed. Most of these are discussed in a book I wrote in 1998 for Cambridge University Press. The model has offered insight into such issues as the nature of time in quantum gravity, the source of black hole entropy, and the question of whether the topology of space can change. Here is a paper I wrote with Jeanette Nelson comparing two interesting approaches.
A recent review article I wrote on general relativity in 2+1 dimensions for the on-line journal Living Reviews in Relativity can be found here. Another review is here, this one discussing what we know about the microscopic "statistical mechanics" that presumably underlies (2+1)-dimensional black hole entropy. An older and more general review I wrote on (2+1)-dimensional black holes is here. For some research papers on the statistical mechanics of the (2+1)-dimensional black hole, look here and here.
In some ways, ordinary (2+1)-dimensional gravity may be too simple. Recently, a number of physicists have become interested in a slightly more complicated version, topologically massive gravity, which has a new propagating "graviton." I have been involved in this work; two papers are here and here.
An important focus of my recent work here has been an attempt to understand how much of the statistical mechanics of black holes can be determined purely from general symmetries, independent of the details of quantum gravity. I have shown that a symmetry mechanism is at least plausible. (Here and here are some papers, and here is a review.) This idea would help explain one of the mysteries of this field, sometimes called the problem of universality: the fact that very different approaches to quantum gravity, with different starting points and different underlying degrees of freedom, all seem to give the same answer. This overview won the 2007 Gravity Research Foundation essay prize; here is a less technical summary.
Another interesting issue is whether "quasinormal modes" -- the damped oscillations of a disturbed black hole -- can tell us anything about black hole quantum mechanics. I've written two papers on this subject, one on (2+1)-dimensional black holes and another on the higher-dimensional black holes that are understood in string theory.
Lattice quantum gravity may be another such window. The basic idea of putting a continuous theory on a lattice, approximating it by a simpler discrete theory, has had considerable success in quantum chromodynamics (QCD). The gravitational version is similar, but unlike QCD, where fields live on a fixed lattice, gravity is the lattice: just as the flat triangles in a geodesic dome approximate a sphere, varying edge lengths or patterns of connectivity in higher dimensions can approximate varying curved spacetimes.
In particular, several of my students and I have begun to work on a promising lattice approach known as causal dynamical triangulations, in which a causal structure -- a "direction of time" -- is put in from the start. We have found the first independent confirmation of the pioneering results of Ambjorn, Jurkiewicz, and Loll, who showed that the method gives a sensible semiclassical limit that really looks like a four-dimensional spacetime. With the code now running stably, we are starting to look at new questions, such as the renormalization group flow of the cosmological constant and the predicted patterns of quantum fluctuations in the early Universe.
One intriguing prediction of the causal dynamical triangulations method is that while spacetime appears four-dimensional at large scales, it undergoes a "dimensional reduction" to two dimensions at very small scales. If this is a general feature of quantum gravity, and not just a peculiarity of this particular approximation, it could be telling us something very important. A lecture of mine on this topic may be found here.
A few less technical areas I've worked in are the following:
Michael Ashworth worked on coherent states in quantum gravity, and on conformal field theory from (2+1)-dimensional gravity.
Sayan Basu worked on covariant canonical quantization and, with a postdoc, on observational searches for violations of Lorentz invariance of a sort that might be produced by quantum gravity.
Yujun Chen made major progress in quantizing Liouville theory, particularly in the sector that is probably relevant to black hole entropy.
Russell Cosgrove studied the problem of time in quantum gravity. (See the discussion here of such conceptual problems.)
Eric Minassian worked on the question of singularities in (2+1)-dimensional quantum gravity.
Peter Salzman and I wrote a paper on a possible experimental test of whether gravity must be quantized, or whether it can be treated as a fundamentally classical theory. If we are right -- we're awaiting independent confirmation -- the need for quantum gravity could be tested in the next generation of molecular interferometry experiments.
Jim Van Meter didn't finish his paper on approximation methods for the Einstein field equations until he graduated, but he went on to work on one of the pioneering efforts to numerically simulate black hole mergers.
My "Careers in Physics" seminar for graduate students brings in a variety of speakers with physics degrees who have jobs outside academia, to describe their work and also to give some nuts-and-bolts advice about how to get a job. Past speakers have ranged from stock analysts to high school teachers to microwave engineers to radiation safety experts. We also invite occasional "skill" speakers, to address such issues as how to give a good talk and what to expect at a job interview.
For statistics and information about jobs in physics, look at the American Institute of Physics Career Services and Statistical Research pages, and at Joanne Cohn's very nice Physics and Astronomy Job Hunting Resources site. Job-hunting help is available at UC Davis from the Internship and Career Center.
If you're interested in learning more about general relativity and quantum gravity, here are some good starting places:
Memberships
Here are some slightly longer explanations of a few of the ideas I've mentioned elsewhere. Be warned -- the explanations here are, for the most part, drastic oversimplifications, and shouldn't be taken too literally. This section is still under construction (and will be for a long time...).
This argument was at first dismissed because it was believed that black holes were truly "black," that is, that they emitted no radiation. This would mean their temperature was absolute zero -- anything with a finite temperature radiates -- and an object with zero temperature cannot have a changing entropy. All this changed, though, when Hawking discovered that black holes actually do radiate, with a black body spectrum. (See "Hawking radiation.") The laws of thermodynamics were quickly extended to include black holes.
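The scale of this radiation is easy to estimate. For a Schwarzschild black hole of mass M, the Hawking temperature is T = ħc³/(8πGMk_B). The quick back-of-the-envelope sketch below (standard SI constants, solar mass as a representative value) shows why the effect is unobservably faint for astrophysical black holes:

```python
import math

# Physical constants (SI units)
hbar = 1.054571817e-34   # reduced Planck constant, J s
c = 2.99792458e8         # speed of light, m/s
G = 6.67430e-11          # Newton's constant, m^3 kg^-1 s^-2
k_B = 1.380649e-23       # Boltzmann constant, J/K

def hawking_temperature(M):
    """Hawking temperature T = hbar c^3 / (8 pi G M k_B) of a
    Schwarzschild black hole of mass M (in kg), in kelvin."""
    return hbar * c**3 / (8 * math.pi * G * M * k_B)

M_sun = 1.989e30  # one solar mass, kg
print(hawking_temperature(M_sun))  # ~6e-8 K, far colder than the 2.7 K CMB
```

A solar-mass black hole is some eight orders of magnitude colder than the cosmic microwave background, so it currently absorbs far more radiation than it emits.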
In ordinary thermodynamic systems, though, thermal properties appear as a consequence of statistical mechanics, that is, as a collective result of the behavior of large numbers of "microscopic" degrees of freedom. The temperature of a box of gas, for instance, reflects the kinetic energy of the molecules of gas, and the entropy measures the number of accessible physical states. There have been many attempts to formulate a "statistical mechanical" explanation of black hole thermodynamics, but it's safe to say that a complete picture has not yet been found.
For a recent review, see my Lectures at the 2007 Aegean Summer School on Black Holes.
We don't know how to carry out this sum, though, and the usual approximations used in quantum field theory apparently fail -- that's what it means to say that general relativity is nonrenormalizable. An alternative is to approximate the path integral by putting the theory on a lattice, essentially breaking an infinite sum into a discrete, finite collection of "paths." For ordinary field theory on a lattice -- quantum chromodynamics, for example -- the lattice is held fixed, with fields placed at vertices or along edges. For general relativity, on the other hand, the lattice itself provides the dynamics. Just as flat triangles can be put together to form the curved surface of a geodesic dome, so flat simplices can be put together to form a curved spacetime.
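The way curvature emerges from flat building blocks can be made concrete in two dimensions. In Regge-style discretizations, curvature is concentrated at "hinges" (vertices, in 2D) and is measured by the deficit angle: 2π minus the sum of the triangle angles meeting at the vertex. A minimal sketch:

```python
import math

def deficit_angle(angles_at_vertex):
    """Deficit angle at a vertex where flat triangles meet:
    2*pi minus the sum of the triangle angles there.
    Zero deficit means the neighborhood is flat; a positive
    deficit signals concentrated positive curvature."""
    return 2 * math.pi - sum(angles_at_vertex)

# Six equilateral triangles tile the flat plane: zero deficit.
flat = deficit_angle([math.pi / 3] * 6)

# Five equilateral triangles, as at a vertex of an icosahedron
# (a crude geodesic dome): positive deficit, positive curvature.
dome = deficit_angle([math.pi / 3] * 5)

print(flat, dome)  # approximately 0 and pi/3
```

In four-dimensional gravity the hinges are triangles and the angles are dihedral angles, but the principle is the same: the geometry is encoded entirely in how flat simplices are glued together.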
The idea of this kind of discrete approach to quantum gravity is not at all new. Until recently, though, the resulting path integrals didn't seem to give anything close to a good approximation of a classical spacetime. Instead of a smooth, nearly flat spacetime, the results looked like either "crumpled" or "branched polymer" geometries, not at all like the world we live in. Recently, though, a new approach to the problem has been proposed. Called "causal dynamical triangulations" (or "Lorentzian dynamical triangulations"), this program treats time in a new way, choosing new "gluing" rules that guarantee a well-behaved direction of time and, in the process, rule out certain kinds of quantum fluctuation of topology. While it is too soon to tell whether this approach will work, there are some encouraging signs -- at least we seem to be able to obtain a good "nearly classical" four-dimensional spacetime from the simulations.
I should stress that people who work on quantum gravity don't, for the most part, spend much time thinking about such problems in the abstract. Rather, we try various approaches that work elsewhere, run into some difficulty that's rooted in these conceptual problems, and try to solve it in a concrete instance; or we look for ways to reformulate general relativity or quantum mechanics to make these issues less important; or we look for simpler models in which some of these problems occur but may be solvable. There are lots of vague ideas about what quantum gravity might look like; the hard part is in getting any of them to actually work in detail.
(For a nice review paper by Chris Isham on some of the conceptual issues in quantum gravity, go here.)
These assumptions lead almost uniquely to a set of field equations with two undetermined constants. One of these is Newton's constant, which determines the strength of the gravitational interaction. The other is the cosmological constant, Lambda.
In modern terms, the cosmological constant looks like a very peculiar kind of gravitating matter, one that pervades the Universe with a constant density and an extremely high, negative pressure. Einstein originally introduced this term in order to allow static cosmological solutions to the field equations, and he later called it his biggest blunder. But it's not so easy to put this genie back in the bottle. The cosmological constant can be interpreted as the energy density of the vacuum, and can arise for at least two reasons:
We can measure Lambda by looking at its effect on cosmology. A value significantly different from zero in "particle physics" units would lead, depending on sign, to exponential expansion and a cold, empty universe, or to a universe that would have recollapsed long before its present age. A nonzero value of Lambda is not ruled out observationally, and in fact there is some evidence for a small positive cosmological constant. But the value of the vacuum energy density must be, at most, comparable in magnitude to the density of ordinary matter in the Universe, and this is tiny compared to the values one expects from particle physics -- it's some 120 orders of magnitude smaller than one would expect from simple dimensional analysis.
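The mismatch can be reproduced with simple dimensional analysis: the "natural" vacuum energy density built from ħ, c, and G is the Planck density, ρ ~ c⁵/(ħG²), while the observed dark-energy density is of order the critical density of the Universe. A rough sketch (the observed value used here is approximate):

```python
import math

hbar = 1.054571817e-34   # reduced Planck constant, J s
c = 2.99792458e8         # speed of light, m/s
G = 6.67430e-11          # Newton's constant, m^3 kg^-1 s^-2

# Naive particle-physics estimate: the Planck mass density.
rho_planck = c**5 / (hbar * G**2)   # ~5e96 kg/m^3

# Observed vacuum (dark energy) density: roughly 0.7 times the
# critical density of the Universe (approximate value assumed here).
rho_obs = 6e-27                     # kg/m^3

orders = math.log10(rho_planck / rho_obs)
print(orders)  # roughly 120-some orders of magnitude
```

The precise number of "orders of magnitude" depends on the cutoff one assumes, but by any accounting the discrepancy is enormous.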
While a number of more or less exotic suggestions have been floated, no one really knows why Lambda should be so small. This "cosmological constant problem" is one of the biggest mysteries in modern physics.
See Ned Wright's cosmology tutorial and Eli Michael's cosmological constant page for more about Lambda.
Popular books describe this proposal with varying degrees of accuracy. (See PBS's "Stephen Hawking's Universe" for a fairly good example.) By forgetting that the "no boundary proposal" is a quantum mechanical description, though, these popularizations can sometimes be misleading.
In particular, it's worth remembering that a quantum mechanical object does not have a unique, well-defined "history." For a particle, for instance, such a history would be a trajectory -- position as a function of time -- and would determine both the particle's position and its momentum at all times. But by the Heisenberg uncertainty relations, this cannot be done: we can never simultaneously exactly specify a particle's position and momentum.
The Hartle-Hawking "no boundary" proposal is based on the path integral, or "sum over histories," approach to quantum mechanics, in which a probability amplitude is computed by taking a weighted sum over all possible histories that lead from an initial condition (in this case, "nothing") to a final state. In a certain approximation, this sum is dominated by a "history" in which the Universe initially has a positive-definite metric -- thus the frequent references to "imaginary time." But neither this nor any other single history represents "the way the Universe really evolved."
The vacuum in quantum field theory is not really empty; it's filled with "virtual pairs" of particles and antiparticles that pop in and out of existence, with lifetimes determined by the Heisenberg uncertainty principle. When such a pair forms near the event horizon of a black hole, though, the two particles are pulled apart by the tidal forces of gravity. Sometimes one member of a pair crosses the horizon, and can no longer recombine with its partner. The partner can then escape to infinity, and since it carries off positive energy, the energy (and thus the mass) of the black hole must decrease.
There is something a bit mysterious about this explanation: it requires that the particle that falls into the black hole have negative energy. Here's one way to understand what's going on. (This argument is based roughly on section 11.4 of Schutz's book, A First Course in General Relativity.)
To start, since we're talking about quantum field theory, let's understand what "energy" means in this context. The basic answer is that energy is determined by Planck's relation, E=hf, where f is frequency. Of course, a classical configuration of a field typically does not have a single frequency, but it can be Fourier decomposed into modes with fixed frequencies. In quantum field theory, modes with positive frequencies correspond to particles, and those with negative frequencies correspond to antiparticles.
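The Planck relation itself is simple to evaluate. As an illustration (the frequency chosen below is just a representative value for visible light, not anything specific to black holes):

```python
h = 6.62607015e-34      # Planck's constant, J s
eV = 1.602176634e-19    # joules per electron volt

def mode_energy(f):
    """Energy E = h f of a field mode of frequency f (in Hz), in joules."""
    return h * f

E = mode_energy(5.0e14)   # a visible-light frequency, ~green
print(E, E / eV)          # ~3.3e-19 J, about 2 eV
```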
Now, here's the key observation: frequency depends on time, and in particular on the choice of a time coordinate. We know this from special relativity, of course -- two observers in relative motion will see different frequencies for the same source. In special relativity, though, while Lorentz transformations can change the magnitude of frequency, they can't change the sign, so observers moving relative to each other with constant velocities will at least agree on the difference between particles and antiparticles.
For accelerated motion this is no longer true, even in a flat spacetime. A state that looks like a vacuum to an unaccelerated observer will be seen by an accelerated observer as a thermal bath of particle-antiparticle pairs. This predicted effect, the Unruh effect, is unfortunately too small to see with presently achievable accelerations, though some physicists, most notably Schwinger, have speculated that it might have something to do with sonoluminescence. (Most physicists are unconvinced.)
The next ingredient in the mix is the observation that, as it is sometimes put, "space and time change roles inside a black hole horizon." That is, the timelike direction inside the horizon is the radial direction; motion "forward in time" is motion "radially inward" toward the singularity, and has nothing to do with what happens relative to the Schwarzschild time coordinate t.
The final ingredient is a description of vacuum fluctuations. One useful way to look at these is to say that when a virtual particle-antiparticle pair is created in the vacuum, the total energy remains zero, but one of the particles has positive energy while the other has negative energy. (For clarity: either the particle or the antiparticle can have negative energy; there's no preference for one over the other.) Now, negative-energy particles are classically forbidden, but as long as the virtual pair annihilates in a time less than h/E, the uncertainty principle allows such fluctuations.
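The time scale involved is easy to estimate. Following the h/E convention used above, a virtual electron-positron pair, with total energy E = 2mc², can exist for only about (a rough order-of-magnitude sketch):

```python
h = 6.62607015e-34        # Planck's constant, J s
c = 2.99792458e8          # speed of light, m/s
m_e = 9.1093837015e-31    # electron mass, kg

E_pair = 2 * m_e * c**2   # rest energy of an electron-positron pair, J
t_max = h / E_pair        # maximum lifetime allowed by the uncertainty principle
print(t_max)              # a few times 1e-21 seconds
```

So these fluctuations live for only a few zeptoseconds; Hawking radiation requires the horizon to intervene within that window.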
Now, finally, here's a way to understand Hawking radiation. Picture a virtual pair created outside a black hole event horizon. One of the particles will have a positive energy E, the other a negative energy -E, with energy defined in terms of a time coordinate outside the horizon. As long as both particles stay outside the horizon, they have to recombine in a time less than h/E. Suppose, though, that in this time the negative-energy particle crosses the horizon. The criterion for it to continue to exist as a real particle is now that it must have positive energy relative to the timelike coordinate inside the horizon, i.e., that it must be moving radially inward. This can occur regardless of its energy relative to an external time coordinate.
So the black hole can absorb the negative-energy particle from a vacuum fluctuation without violating the uncertainty principle, leaving its positive-energy partner free to escape to infinity. The effect on the energy of the black hole, as seen from the outside (that is, relative to an external timelike coordinate) is that it decreases by an amount equal to the energy carried off to infinity by the positive-energy particle. Total energy is conserved, because it always was, throughout the process -- the net energy of the particle-antiparticle pair was zero.
Note that this doesn't work in the other direction -- you can't have the positive-energy particle cross the horizon and leave the negative-energy particle stranded outside, since a negative-energy particle can't continue to exist outside the horizon for a time longer than h/E. So the black hole can lose energy to vacuum fluctuations, but it can't gain energy.
(See the relativity FAQs, here, for a related but slightly different description of black hole thermodynamics and Hawking radiation.)
One way to understand the simplification is the following. In n dimensions, the phase space of general relativity -- the space of generalized positions and momenta, or equivalently the space of initial data -- is characterized by a spatial metric on a constant-time hypersurface, which has n(n-1)/2 independent components, and its time derivative (or conjugate momentum), which adds another n(n-1)/2 degrees of freedom per spacetime point. It is a standard result of general relativity, however, that n of the Einstein field equations are constraints on initial conditions rather than dynamical equations. These constraints eliminate n degrees of freedom per point. Another n degrees of freedom per point can be removed by using the freedom to choose n coordinates. We are thus left with n(n-1)-2n = n(n-3) physical degrees of freedom per spacetime point.
In four spacetime dimensions, this gives the four phase space degrees of freedom of ordinary general relativity, two gravitational wave polarizations and their time derivatives. If n=3, on the other hand, there are no field degrees of freedom: up to a finite number of possible global degrees of freedom, the geometry is completely determined by the constraints.
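The counting argument above fits in one line: n(n-1) phase space components, minus n constraints, minus n coordinate choices. A sketch:

```python
def field_dof(n):
    """Physical field degrees of freedom of general relativity per
    spacetime point in n spacetime dimensions: n(n-1) phase space
    components, minus n constraints, minus n coordinate choices,
    giving n(n-3)."""
    return n * (n - 1) - 2 * n   # = n(n-3)

for n in (2, 3, 4):
    print(n, field_dof(n))   # 2 -> -2, 3 -> 0, 4 -> 4
```

The n=4 value counts the two gravitational wave polarizations and their momenta; the negative value at n=2 is the sign, discussed below, that ordinary general relativity doesn't make sense in two spacetime dimensions.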
Equivalently, it can be shown from basic Riemannian geometry that the full Riemann curvature tensor in 2+1 dimensions is algebraically determined by the Einstein tensor. The Einstein tensor, in turn, is fixed uniquely, through the Einstein field equations, by the distribution of matter. As a result, there are no propagating gravitational degrees of freedom -- the geometry of spacetime at a point is (almost) entirely determined by the amount and type of matter at that point. In particular, if there is no matter, the Einstein field equations in 2+1 dimensions imply that spacetime is flat.
At first sight, this is too strong a restriction. It's not much of a test to be able to quantize a theory with no degrees of freedom, and general relativity is supposed to be about curved spacetimes, not flat ones. But this sort of counting argument can miss a finite number of "global" degrees of freedom. The simplest example of such degrees of freedom is the following:
Consider a flat, square piece of paper, with the following "gluing" rule: a point at any edge is to be considered the same as the corresponding point at the opposite edge. Such a space is topologically a torus, and is sometimes called the "video game model" of the torus. (When you reach one edge, you automatically pop back in at the opposite edge, as in many video games.) Geometrically, this space is flat -- all of the rules of Euclidean geometry hold in any small finite region -- because, after all, any small region looks just like a region of the piece of paper you started with. It's a fun exercise to convince yourself that this is true even for regions that contain an "edge."
Now change this model a little bit by starting with a parallelogram rather than a square. This gives you a different manifold for each different choice of parallelogram, up to some subtle symmetries (the "mapping class group"). Each of these manifolds is flat, but they are geometrically distinguishable. In fact, this construction gives you a three-parameter family of flat spaces with torus topology: there's one parameter for the length of each side of the parallelogram, plus one for the angle between two adjacent sides.
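The three parameters can be read off directly from the flat metric the parallelogram induces. Taking side vectors of lengths a and b with angle theta between them, the metric is the matrix of their dot products, a symmetric 2x2 matrix with exactly three independent entries. A sketch:

```python
import math

def torus_metric(a, b, theta):
    """Flat metric on the torus built from a parallelogram with side
    lengths a, b and interior angle theta: the Gram matrix of the
    side vectors e1 = (a, 0) and e2 = (b cos theta, b sin theta)."""
    e1 = (a, 0.0)
    e2 = (b * math.cos(theta), b * math.sin(theta))
    dot = lambda u, v: u[0] * v[0] + u[1] * v[1]
    return [[dot(e1, e1), dot(e1, e2)],
            [dot(e2, e1), dot(e2, e2)]]

g = torus_metric(1.0, 2.0, math.pi / 3)
# Symmetric 2x2 matrix: three independent components,
# matching the three parameters (a, b, theta).
print(g)
```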
One of the simplest nontrivial solutions to (2+1)-dimensional general relativity is precisely such a torus universe. The constraints fix the overall scale in terms of two parameters (say, one side length and one angle), but these two parameters have an interesting and nontrivial evolution. More complicated topologies give more parameters. So do point particles, which can be represented as conical "defects" in space. With a negative cosmological constant, (2+1)-dimensional general relativity even admits black hole solutions, which behave almost like ordinary (3+1)-dimensional black holes.
Note that this does not contradict the earlier counting argument. There are still only finitely many total degrees of freedom, rather than one or more degrees of freedom per point. But this finite number of degrees of freedom still has a very interesting dynamics, and the theory is rich enough to test many standard approaches to quantum gravity.
Two-dimensional spacetimes also provide an interesting testing ground for quantum gravity. As you might guess from the earlier counting argument, ordinary general relativity does not make sense in two spacetime dimensions -- the count gives "-2 degrees of freedom per point." But there are simple modifications that lead to interesting models of what is called "dilaton gravity."
For a beautiful nontechnical introduction to topologies like the "video game torus," see Jeff Weeks' book, The Shape of Space.
"Quantum cosmology" is the effort to use quantum gravity to predict some of the properties of the very early Universe -- its topology, for instance, and its initial distribution of matter and energy. This task is rather difficult, since we don't yet have a quantum theory of gravity. But there may be reasonable approximations that can be used to obtain partial information. Among the popular approaches are various saddle point approximations to the path integral (including approximations of the no boundary proposal) and "minisuperspace models," models in which all but a finite number of degrees of freedom of the gravitational field are "frozen out" and held fixed. The quantum geometry program has recently made some interesting progress in such minisuperspace cosmology -- see, for example, this review by Bojowald.
The states of loop quantum gravity are described by "spin networks," graphs whose edges are labeled by spins and whose vertices are labeled by "intertwiners" (think Clebsch-Gordan coefficients) that tell how to combine spins. Geometric operators like area, constructed from the spacetime metric, act on these states, changing the network. The dynamics of general relativity comes in through the Hamiltonian constraint, a not-fully-understood operator condition that determines the admissible spin networks. An alternative view describes three-dimensional spin networks as tracing out spin foams in (3+1)-dimensional spacetime, providing a setting for a path integral approach. The (2+1)-dimensional version of this spin foam picture, the Turaev-Viro model, is a well-understood quantization of (2+1)-dimensional gravity.
Loop quantum gravity has had important successes in black hole physics and in quantum cosmology. Furthermore, unlike many past attempts to quantize general relativity, the loop approach is known to be mathematically well-defined, and it is "background-free," avoiding some of the conceptual problems of other methods. But although it is a quantum theory based on general relativity, it is not entirely clear that it is really a "quantum theory of gravity" -- it is not yet understood how to recover a good classical limit that looks like classical general relativity.
The couplings in the Lagrangian, however, are not the same as the couplings we actually measure. The physical interactions receive quantum corrections. The charge of an electron, for example, is screened by "vacuum polarization." This effect occurs because the vacuum is full of virtual pairs of particles and antiparticles (see the entry for Hawking radiation), which act, roughly, as a sort of dielectric medium. An electron in the vacuum attracts the positively charged members of virtual particle pairs and repels the negatively charged members, so the effective charge of the electron, as seen from a distance, is reduced. The amount of screening depends on distance (or energy) -- the closer you can get to an electron, the fewer virtual pairs lie between you and the electron, and the less screening occurs. Other kinds of interactions can lead to "antiscreening," which occurs in quantum chromodynamics.
This means that the effective value of a coupling constant will normally depend on the energy at which you probe the interaction. At short distances/high energies, the charge of an electron, for example, becomes higher -- it is less screened by virtual particle-antiparticle pairs. The color charge of a quark, in contrast, becomes lower, approaching zero at infinite energy in a phenomenon called "asymptotic freedom." This variation of coupling constants with energy or scale is called the "renormalization group flow."
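For quantum electrodynamics, the one-loop running of the fine-structure constant can be written down explicitly. The sketch below keeps only the electron loop, so the numbers are illustrative rather than realistic (the measured value at the Z mass, about 1/128, includes contributions from all charged species):

```python
import math

alpha_0 = 1 / 137.035999   # fine-structure constant at the electron scale
m_e = 0.000511             # electron mass, GeV

def alpha(Q):
    """One-loop QED running coupling with a single charged fermion
    (the electron): 1/alpha(Q) = 1/alpha_0 - (2/(3 pi)) ln(Q/m_e)."""
    return alpha_0 / (1 - (2 * alpha_0 / (3 * math.pi)) * math.log(Q / m_e))

m_Z = 91.1876              # Z boson mass, GeV
print(1 / alpha(m_e), 1 / alpha(m_Z))  # 137.0 at low energy, smaller at m_Z
```

The inverse coupling decreases with energy: at shorter distances the electron's charge is less screened, so the effective interaction grows, just as described above.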
The important thing to keep in mind is that the observed coupling constants are not the same as the bare ones that occur in the Lagrangian. In fact, even if a coupling constant is zero in the Lagrangian, its effective value can be nonzero once quantum corrections are accounted for. In general relativity, for example, a cosmological constant will be induced by quantum fluctuations even if the "bare" cosmological constant is set to zero.
A renormalizable theory is one in which the number of undetermined coupling constants is finite, even after quantum corrections are taken into account. A nonrenormalizable theory is one in which the number of undetermined effective coupling constants is infinite. General relativity, when treated as an ordinary quantum field theory, is nonrenormalizable.
Now, even a nonrenormalizable theory can have some predictive power -- for general relativity, see for example, work by Donoghue on "effective field theory." But given its infinite number of free parameters, a nonrenormalizable theory probably cannot serve as a complete description of physics.
(One possible loophole, suggested by Steven Weinberg, might make certain nonrenormalizable theories more acceptable. As I noted earlier, the coupling constants in any theory "flow" with energy. It could be that even if a theory has infinitely many coupling constants, they flow to a finite-dimensional surface at high energies. In such an "asymptotically safe" theory, the coupling constants, although infinite in number, would be determined by a finite number of parameters of the high-energy theory. It is an open question whether general relativity is asymptotically safe; here is a sympathetic review.)
For a much more detailed, but entertaining and not-too-technical, description, see John Baez's Renormalization Made Easy.
The basic idea of string theory is to replace point particles with "strings," one-dimensional objects that can come as loops (closed strings) or segments (open strings). This gives a very rich structure -- string theories typically involve more than four spacetime dimensions, and strings can both vibrate and "wrap around" extra compact dimensions, leading to an enormous number of possible quantum states. The hope is that these states can unify gravity and elementary particle physics into a single framework, and that, if we are lucky, only one self-consistent theory will exist.
Gravity comes into string theory in two closely related ways. First, one of the states of a closed string is a "graviton," a massless, self-interacting spin two particle. There are general results that any theory describing such a particle has to look like general relativity at low energies. Second, if one looks at a string propagating in a curved spacetime background, one finds that a consistent description is only possible if the background obeys certain restrictions, which again look like the Einstein field equations at low energies. These two results seem independent, but they are actually linked -- the consistent background in which strings can propagate can be described as a quantum state (technically, a "coherent state") of string excitations, including gravitons.
For many years, it looked as if there were several distinct self-consistent string theories. But the "duality revolution" of the last decade showed that they all seem to be related by a set of duality transformations, which typically relate the strongly coupled behavior of one string theory to the weakly coupled behavior of another. This development has given us a glimpse of a larger landscape, in which the string theories we know are only small pieces. The hypothetical big picture is called M theory (M for "mystery," "matrix," "membrane," and a number of other possibilities). We don't know much about M theory as a whole, though we do know that it is not only a theory of strings -- higher-dimensional objects, membranes or "branes" for short, also play essential roles. A further revolution has come with the AdS/CFT correspondence, a remarkable relation between string theory, including gravity, in certain spacetimes ("asymptotically anti-de Sitter spaces") and lower-dimensional conformal field theories that don't include gravity.
String theory has shown enough striking coincidences -- surprising internal consistencies that have even led to deep new results in mathematics -- that most practitioners are convinced that a deep underlying structure exists. But we don't yet know what that structure is. Ed Witten, for example, has said that one of the most fundamental questions in string theory is to understand "what is the new kind of geometry that generalises what Einstein used." We also don't know, in a practical sense, how unique string theory is: even if there is only one consistent theory, there seem to be an extraordinarily large number of "ground states," each of which gives not only a different spacetime geometry, but a different particle content and a different gauge group for elementary particles. Whether one of these states describes the real Universe, and, if so, whether there is any way to pick it out as being special, remains to be seen.
String theory is also just now beginning to address some of the general conceptual problems of quantum gravity: the problem of finding local observables to describe spacetime, for example, and the question of what quantum gravity does about singularities. Most of what we can actually do in string theory involves perturbation theory around a fixed classical background, a process that postpones these deep issues but cannot entirely remove them.
Typical field theories have an infinite number of degrees of freedom -- while they involve only a finite number of fields, each field has one or more degrees of freedom per point. In certain cases, though, symmetries are strong enough to reduce the physical degrees of freedom to a finite number. One example of such a reduction occurs in (2+1)-dimensional gravity.
Usage varies, but a "topological field theory" is usually defined as one with both of these properties, metric independence and a finite number of physical degrees of freedom. The archetypical topological field theory is Chern-Simons theory, a type of gauge theory in three dimensions. In the past few years, especially because of the work of Ed Witten, such theories have become increasingly important, both in physics and in mathematics.
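The metric independence of Chern-Simons theory can be seen directly from its action, which is built entirely from a gauge field $A$ and exterior derivatives, with no metric anywhere:

```latex
% The Chern-Simons action for a gauge field A on a three-manifold M,
% with integer "level" k:
S_{CS} = \frac{k}{4\pi} \int_M \mathrm{Tr}\left( A \wedge dA
         + \tfrac{2}{3}\, A \wedge A \wedge A \right)
% No metric appears: the action depends only on the differential and
% topological structure of M, making the theory "topological."
```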
One step more complicated is a model called "topologically massive gravity," proposed by Deser, Jackiw, and Templeton in 1982. This model modifies the field equations of general relativity by adding a new term with three derivatives. This is normally a dangerous thing to do -- "higher derivative" theories in physics usually have negative energies and no stable solutions -- but in this special case it is consistent. By itself, the extra term is "topological," that is, it depends only on the topology of spacetime and not on its particular geometry; hence the somewhat confusing name.
The addition of a higher-derivative term in the field equations changes the counting of degrees of freedom of a theory. For topologically massive gravity, the effect is to add a new, propagating degree of freedom, a sort of massive gravitational wave, or, in the quantum theory, a massive graviton. Recently, the model has been the subject of renewed attention because of its interesting properties in anti-de Sitter space, where it has become a testing ground for the AdS/CFT correspondence of string theory.
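In equations, the modification described above takes the following form (in vacuum, with the conventions that are standard in the literature):

```latex
% Field equations of topologically massive gravity (vacuum):
G_{\mu\nu} + \frac{1}{\mu}\, C_{\mu\nu} = 0
% G_{\mu\nu} is the Einstein tensor; C_{\mu\nu} is the Cotton tensor,
% which contains three derivatives of the metric -- the "higher
% derivative" term. The parameter \mu sets the mass of the graviton.
```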
The field equations of general relativity determine the geometry of spacetime in terms of the matter content. They do not, in general, determine the topology. The two aren't completely independent; choices of geometry and topology must be compatible, and this places some restrictions on possible spacetimes. As a simple example, the average curvature of a two-dimensional manifold with the topology of a torus (that is, the surface of a donut) must be zero, while the average curvature of a two-dimensional sphere must be positive. But the restrictions are weak, and many topologies are consistent with the same geometry. In an earlier section of this glossary, for example, I described a geometrically flat torus. But there are also nonflat geometries on a torus (picture a donut again), and there are other two-dimensional spaces -- the plane, for instance -- with zero curvature.
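The curvature restrictions in the two-dimensional example come from the Gauss-Bonnet theorem, which ties the total curvature of a closed surface to a purely topological quantity:

```latex
% Gauss-Bonnet theorem for a closed two-dimensional surface M with
% Gaussian curvature K and Euler characteristic \chi(M):
\int_M K \, dA = 2\pi\, \chi(M)
% For a torus, \chi = 0, so the average curvature must vanish;
% for a sphere, \chi = 2, so the average curvature must be positive.
```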
One of the interesting problems in modern cosmology is that of determining the topology of space in the real Universe. A good discussion of the problem is here. The September 1998 issue of Classical and Quantum Gravity has the proceedings of a conference on the topology of the Universe.
Another interesting question is whether quantum fluctuations can cause the topology of space to change in time. John Wheeler has proposed a picture of "spacetime foam," in which the topology of the Universe at the smallest scales is undergoing complicated, random fluctuations. Whether this picture is correct or not remains an open question.
For a nice nontechnical introduction to topology, see Jeff Weeks' book, The Shape of Space. For a little more detail, a good elementary introduction is Klaus Jänich's book, Topology. UC Davis has an extremely strong topology group in the math department.