From Eternity to Here: The Quest for the Ultimate Theory of Time Paperback – Illustrated, October 26, 2010
Twenty years ago, Stephen Hawking tried to explain time by understanding the Big Bang. Now, Sean Carroll says we need to be more ambitious. One of the leading theoretical physicists of his generation, Carroll delivers a dazzling and paradigm-shifting theory of time's arrow that embraces subjects from entropy to quantum mechanics to time travel to information theory and the meaning of life.
From Eternity to Here is no less than the next step toward understanding how we came to exist, and a fantastically approachable read that will appeal to a broad audience of armchair physicists, and anyone who ponders the nature of our world.
- Print length: 464 pages
- Language: English
- Publisher: Dutton
- Publication date: October 26, 2010
- Dimensions: 5.5 x 1 x 8.5 inches
- ISBN-10: 0452296544
- ISBN-13: 978-0452296541
Editorial Reviews
Review
"Carroll...takes his readers on a fascinating and refreshing trek through every known back alley and cul de sac of quantum mechanics, relativity, cosmology and theoretical physics. The best way to grasp the rich mysteries of our universe is by constantly rereading the best and clearest explanations. Mr. Carroll's From Eternity to Here is certainly one of them."
-Wall Street Journal
"For anyone who ever wondered about the nature of time and how it influences our universe, this book is a must read. It is beautifully written, lucid, and deep."
-Kip Thorne, Feynman Professor of Theoretical Physics at Caltech, author of Black Holes and Time Warps
"Sean Carroll's From Eternity to Here provides a wonderfully accessible account of some of the most profound mysteries of modern physics. While you may not agree with all his conclusions, you will find the discussion fascinating, and taken to much deeper levels than is normal in a work of popular science."
-Sir Roger Penrose, University of Oxford, author of The Road to Reality and The Emperor's New Mind
Excerpt. © Reprinted by permission. All rights reserved.
Contents
Title Page
Copyright Page
Dedication
PART ONE - TIME, EXPERIENCE, AND THE UNIVERSE
Chapter 1 - THE PAST IS PRESENT MEMORY
Chapter 2 - THE HEAVY HAND OF ENTROPY
Chapter 3 - THE BEGINNING AND END OF TIME
PART TWO - TIME IN EINSTEIN’S UNIVERSE
Chapter 4 - TIME IS PERSONAL
Chapter 5 - TIME IS FLEXIBLE
Chapter 6 - LOOPING THROUGH TIME
PART THREE - ENTROPY AND TIME’S ARROW
Chapter 7 - RUNNING TIME BACKWARD
Chapter 8 - ENTROPY AND DISORDER
Chapter 9 - INFORMATION AND LIFE
Chapter 10 - RECURRENT NIGHTMARES
Chapter 11 - QUANTUM TIME
PART FOUR - FROM THE KITCHEN TO THE MULTIVERSE
Chapter 12 - BLACK HOLES: THE ENDS OF TIME
Chapter 13 - THE LIFE OF THE UNIVERSE
Chapter 14 - INFLATION AND THE MULTIVERSE
Chapter 15 - THE PAST THROUGH TOMORROW
Chapter 16 - EPILOGUE
APPENDIX: MATH
NOTES
BIBLIOGRAPHY
Acknowledgements
INDEX
DUTTON
Published by Penguin Group (USA) Inc.
375 Hudson Street, New York, New York 10014, U.S.A.
Penguin Group (Canada), 90 Eglinton Avenue East, Suite 700, Toronto, Ontario M4P 2Y3, Canada (a division of Pearson Penguin Canada Inc.)
Penguin Books Ltd, 80 Strand, London WC2R 0RL, England
Penguin Ireland, 25 St Stephen’s Green, Dublin 2, Ireland (a division of Penguin Books Ltd)
Penguin Group (Australia), 250 Camberwell Road, Camberwell, Victoria 3124, Australia (a division of Pearson Australia Group Pty Ltd)
Penguin Books India Pvt Ltd, 11 Community Centre, Panchsheel Park, New Delhi—110 017, India
Penguin Group (NZ), 67 Apollo Drive, Rosedale, North Shore 0632, New Zealand (a division of Pearson New Zealand Ltd)
Penguin Books (South Africa) (Pty) Ltd, 24 Sturdee Avenue, Rosebank, Johannesburg 2196, South Africa
Penguin Books Ltd, Registered Offices: 80 Strand, London WC2R 0RL, England
Published by Dutton, a member of Penguin Group (USA) Inc.
First printing, January 2010
All rights reserved
Photograph on page 37 by Martin Röll, licensed under the Creative Commons Attribution ShareAlike 2.0 License, from Wikimedia Commons. Photograph on page 47 courtesy of the Huntington Library. Image on page 53 by the NASA/WMAP Science Team. Photograph on page 67 courtesy of Corbis Images. Image on page 119 courtesy of Getty Images. Figures on pages 147, 153, 177, 213, 270, 379, and 382 by Sean Carroll. Photograph on page 204 courtesy of the Smithsonian Institution. Photograph on page 259 courtesy of Professor Stephen Hawking. Photograph on page 267 courtesy of Professor Jacob Bekenstein. Photograph on page 295 by Jerry Bauer, from Wikimedia Commons. Photograph on page 315 courtesy of the Massachusetts Institute of Technology. All other images courtesy of Jason Torchinsky.
REGISTERED TRADEMARK—MARCA REGISTRADA
LIBRARY OF CONGRESS CATALOGING-IN-PUBLICATION DATA
Carroll, Sean M., 1966-
From eternity to here : the quest for the ultimate theory of time / Sean Carroll.
p. cm.
Includes bibliographical references and index.
ISBN: 9781101152157
1. Space and time. I. Title.
QC173.59.S65C37 2009
530.11—dc22
2009023828
Without limiting the rights under copyright reserved above, no part of this publication may be reproduced, stored in or introduced into a retrieval system, or transmitted, in any form, or by any means (electronic, mechanical, photocopying, recording, or otherwise), without the prior written permission of both the copyright owner and the above publisher of this book.
The scanning, uploading, and distribution of this book via the Internet or via any other means without the permission of the publisher is illegal and punishable by law. Please purchase only authorized electronic editions, and do not participate in or encourage electronic piracy of copyrighted materials. Your support of the author’s rights is appreciated.
While the author has made every effort to provide accurate telephone numbers and Internet addresses at the time of publication, neither the publisher nor the author assumes any responsibility for errors, or for changes that occur after publication. Further, the publisher does not have any control over and does not assume any responsibility for author or third-party Web sites or their content.
To Jennifer
For all time
PROLOGUE
Does anybody really know what time it is?
—Chicago, “Does Anybody Really Know What Time It Is?”
This book is about the nature of time, the beginning of the universe, and the underlying structure of physical reality. We’re not thinking small here. The questions we’re tackling are ancient and honorable ones: Where did time and space come from? Is the universe we see all there is, or are there other “universes” beyond what we can observe? How is the future different from the past?
According to researchers at the Oxford English Dictionary, time is the most used noun in the English language. We live through time, keep track of it obsessively, and race against it every day—yet, surprisingly, few people would be able to give a simple explanation of what time actually is.
In the age of the Internet, we might turn to Wikipedia for guidance. As of this writing, the entry on “Time” begins as follows:
Time is a component of a measuring system used to sequence events, to compare the durations of events and the intervals between them, and to quantify the motions of objects. Time has been a major subject of religion, philosophy, and science, but defining time in a non-controversial manner applicable to all fields of study has consistently eluded the greatest scholars.1
Oh, it’s on. By the end of this book, we will have defined time very precisely, in ways applicable to all fields. Less clear, unfortunately, will be why time has the properties that it does—although we’ll examine some intriguing ideas.

Cosmology, the study of the whole universe, has made extraordinary strides over the past hundred years. Fourteen billion years ago, our universe (or at least the part of it we can observe) was in an unimaginably hot, dense state that we call “the Big Bang.” Ever since, it has been expanding and cooling, and it looks like that’s going to continue for the foreseeable future, and possibly forever.
A century ago, we didn’t know any of that—scientists understood basically nothing about the structure of the universe beyond the Milky Way galaxy. Now we have taken the measure of the observable universe and are able to describe in detail its size and shape, as well as its constituents and the outline of its history. But there are important questions we cannot answer, especially concerning the early moments of the Big Bang. As we will see, those questions play a crucial role in our understanding of time—not just in the far-flung reaches of the cosmos, but in our laboratories on Earth and even in our everyday lives.
TIME SINCE THE BIG BANG
It’s clear that the universe evolves as time passes—the early universe was hot and dense; the current universe is cold and dilute. But I am going to be drawing a much deeper connection. The most mysterious thing about time is that it has a direction: the past is different from the future. That’s the arrow of time—unlike directions in space, all of which are created pretty much equal, the universe indisputably has a preferred orientation in time. A major theme of this book is that the arrow of time exists because the universe evolves in a certain way.
The reason why time has a direction is because the universe is full of irreversible processes—things that happen in one direction of time, but never the other. You can turn an egg into an omelet, as the classic example goes, but you can’t turn an omelet into an egg. Milk disperses into coffee; fuels undergo combustion and turn into exhaust; people are born, grow older, and die. Everywhere in Nature we find sequences of events where one kind of event always happens before, and another kind after; together, these define the arrow of time.
Remarkably, a single concept underlies our understanding of irreversible processes: something called entropy, which measures the “disorderliness” of an object or conglomeration of objects. Entropy has a stubborn tendency to increase, or at least stay constant, as time passes—that’s the famous Second Law of Thermodynamics.2 And the reason why entropy wants to increase is deceptively simple: There are more ways to be disorderly than to be orderly, so (all else being equal) an orderly arrangement will naturally tend toward increasing disorder. It’s not that hard to scramble the egg molecules into the form of an omelet, but delicately putting them back into the arrangement of an egg is beyond our capabilities.
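The counting behind "more ways to be disorderly than to be orderly" can be made concrete with a toy model of my own (not the book's): N gas molecules, each of which sits in either the left or the right half of a box. A quick Python sketch counts the microscopic arrangements behind the "ordered" macrostate (everything on the left) versus the "disordered" one (an even split).

```python
from math import comb

# Toy model for the counting argument (an illustration, not from the book):
# N molecules, each independently in the left or right half of a box.
# A microstate records which half each molecule occupies; a macrostate only
# records how many sit on the left. Microstates per macrostate = C(N, k).
N = 50

ordered = comb(N, 0)          # all N molecules on the left: exactly 1 arrangement
disordered = comb(N, N // 2)  # an even 25/25 split: vastly more arrangements

print(ordered)      # 1
print(disordered)   # 126410606437752, roughly 1.3e14
# With ~10^14 ways to be "disordered" for every way to be "ordered",
# random rearrangement overwhelmingly drives the system toward disorder.
```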
The traditional story that physicists tell themselves usually stops there. But there is one absolutely crucial ingredient that hasn’t received enough attention: If everything in the universe evolves toward increasing disorder, it must have started out in an exquisitely ordered arrangement. This whole chain of logic, purporting to explain why you can’t turn an omelet into an egg, apparently rests on a deep assumption about the very beginning of the universe: It was in a state of very low entropy, very high order.
The arrow of time connects the early universe to something we experience literally every moment of our lives. It’s not just breaking eggs, or other irreversible processes like mixing milk into coffee or how an untended room tends to get messier over time. The arrow of time is the reason why time seems to flow around us, or why (if you prefer) we seem to move through time. It’s why we remember the past, but not the future. It’s why we evolve and metabolize and eventually die. It’s why we believe in cause and effect, and is crucial to our notions of free will.
And it’s all because of the Big Bang.
WHAT WE SEE ISN’T ALL THERE IS
The mystery of the arrow of time comes down to this: Why were conditions in the early universe set up in a very particular way, in a configuration of low entropy that enabled all of the interesting and irreversible processes to come? That’s the question this book sets out to address. Unfortunately, no one yet knows the right answer. But we’ve reached a point in the development of modern science where we have the tools to tackle the question in a serious way.
Scientists and prescientific thinkers have always tried to understand time. In ancient Greece, the pre-Socratic philosophers Heraclitus and Parmenides staked out different positions on the nature of time: Heraclitus stressed the primacy of change, while Parmenides denied the reality of change altogether. The nineteenth century was the heroic era of statistical mechanics—deriving the behavior of macroscopic objects from their microscopic constituents—in which figures like Ludwig Boltzmann, James Clerk Maxwell, and Josiah Willard Gibbs worked out the meaning of entropy and its role in irreversible processes. But they didn’t know about Einstein’s general relativity, or about quantum mechanics, and certainly not about modern cosmology. For the first time in the history of science, we at least have a chance of putting together a sensible theory of time and the evolution of the universe.
I’m going to suggest the following way out: The Big Bang was not the beginning of the universe. Cosmologists sometimes say that the Big Bang represents a true boundary to space and time, before which nothing existed—indeed, time itself did not exist, so the concept of “before” isn’t strictly applicable. But we don’t know enough about the ultimate laws of physics to make a statement like that with confidence. Increasingly, scientists are taking seriously the possibility that the Big Bang is not really a beginning—it’s just a phase through which the universe goes, or at least our part of the universe. If that’s true, the question of our low-entropy beginnings takes on a different cast: not “Why did the universe start out with such a low entropy?” but rather “Why did our part of the universe pass through a period of such low entropy?”
That might not sound like an easier question, but it’s a different one, and it opens up a new set of possible answers. Perhaps the universe we see is only part of a much larger multiverse, which doesn’t start in a low-entropy configuration at all. I’ll argue that the most sensible model for the multiverse is one in which entropy increases because entropy can always increase—there is no state of maximum entropy. As a bonus, the multiverse can be completely symmetric in time: From some moment in the middle where entropy is high, it evolves in the past and future to states where the entropy is even higher. The universe we see is a tiny sliver of an enormously larger ensemble, and our particular journey from a dense Big Bang to an everlasting emptiness is all part of the wider multiverse’s quest to increase its entropy.
That’s one possibility, anyway. I’m putting it out there as an example of the kind of scenarios cosmologists need to be contemplating, if they want to take seriously the problems raised by the arrow of time. But whether or not this particular idea is on the right track, the problems themselves are fascinating and real. Through most of this book, we’ll be examining the problems of time from a variety of angles—time travel, information, quantum mechanics, the nature of eternity. When we aren’t sure of the final answer, it behooves us to ask the question in as many ways as possible.
THERE WILL ALWAYS BE SKEPTICS
Not everyone agrees that cosmology should play a prominent role in our understanding of the arrow of time. I once gave a colloquium on the subject to a large audience at a major physics department. One of the older professors in the department didn’t find my talk very convincing and made sure that everyone in the room knew of his unhappiness. The next day he sent an e-mail around to the department faculty, which he was considerate enough to copy to me:
Finally, the magnitude of the entropy of the universe as a function of time is a very interesting problem for cosmology, but to suggest that a law of physics depends on it is sheer nonsense. Carroll’s statement that the second law owes its existence to cosmology is one of the dummest [sic] remarks I heard in any of our physics colloquia, apart from [redacted]’s earlier remarks about consciousness in quantum mechanics. I am astounded that physicists in the audience always listen politely to such nonsense. Afterwards, I had dinner with some graduate students who readily understood my objections, but Carroll remained adamant.
I hope he reads this book. Many dramatic-sounding statements are contained herein, but I’m going to be as careful as possible to distinguish among three different types: (1) remarkable features of modern physics that sound astonishing but are nevertheless universally accepted as true; (2) sweeping claims that are not necessarily accepted by many working physicists but that should be, as there is no question they are correct; and (3) speculative ideas beyond the comfort zone of contemporary scientific state of the art. We certainly won’t shy away from speculation, but it will always be clearly labeled. When all is said and done, you’ll be equipped to judge for yourself which parts of the story make sense.
The subject of time involves a large number of ideas, from the everyday to the mind-blowing. We’ll be looking at thermodynamics, quantum mechanics, special and general relativity, information theory, cosmology, particle physics, and quantum gravity. Part One of the book can be thought of as a lightning tour of the terrain—entropy and the arrow of time, the evolution of the universe, and different conceptions of the idea of “time” itself. Then we will get a bit more systematic; in Part Two we will think deeply about spacetime and relativity, including the possibility of travel backward in time. In Part Three we will think deeply about entropy, exploring its role in multiple contexts, from the evolution of life to the mysteries of quantum mechanics.
In Part Four we will put it all together to confront head-on the mysteries that entropy presents to the modern cosmologist: What should the universe look like, and how does that compare to what it actually does look like? I’ll argue that the universe doesn’t look anything like it “should,” after being careful about what that is supposed to mean—at least, not if the universe we see is all there is. If our universe began at the Big Bang, it is burdened with a finely tuned boundary condition for which we have no good explanation. But if the observed universe is part of a bigger ensemble—the multiverse—then we might be able to explain why a tiny part of that ensemble witnesses such a dramatic change in entropy from one end of time to the other.
All of which is unapologetically speculative but worth taking seriously. The stakes are big—time, space, the universe—and the mistakes we are likely to make along the way will doubtless be pretty big as well. It’s sometimes helpful to let our imaginations roam, even if our ultimate goal is to come back down to Earth and explain what’s going on in the kitchen.
PART ONE
TIME, EXPERIENCE, AND THE UNIVERSE
1
THE PAST IS PRESENT MEMORY
What is time? If no one asks me, I know. If I wish to explain it to one that asketh, I know not.
—St. Augustine, Confessions
The next time you find yourself in a bar, or on an airplane, or standing in line at the Department of Motor Vehicles, you can pass the time by asking the strangers around you how they would define the word time. That’s what I started doing, anyway, as part of my research for this book. You’ll probably hear interesting answers: “Time is what moves us along through life,” “Time is what separates the past from the future,” “Time is part of the universe,” and more along those lines. My favorite was “Time is how we know when things happen.”
All of these concepts capture some part of the truth. We might struggle to put the meaning of “time” into words, but like St. Augustine we nevertheless manage to deal with time pretty effectively in our everyday lives. Most people know how to read a clock, how to estimate the time it will take to drive to work or make a cup of coffee, and how to manage to meet their friends for dinner at roughly the appointed hour. Even if we can’t easily articulate what exactly it is we mean by “time,” its basic workings make sense at an intuitive level.
Like a Supreme Court justice confronted with obscenity, we know time when we see it, and for most purposes that’s good enough. But certain aspects of time remain deeply mysterious. Do we really know what the word means?
WHAT WE MEAN BY TIME
The world does not present us with abstract concepts wrapped up with pretty bows, which we then must work to understand and reconcile with other concepts. Rather, the world presents us with phenomena, things that we observe and make note of, from which we must then work to derive concepts that help us understand how those phenomena relate to the rest of our experience. For subtle concepts such as entropy, this is pretty clear. You don’t walk down the street and bump into some entropy; you have to observe a variety of phenomena in nature and discern a pattern that is best thought of in terms of a new concept you label “entropy.” Armed with this helpful new concept, you observe even more phenomena, and you are inspired to refine and improve upon your original notion of what entropy really is.
For an idea as primitive and indispensable as “time,” the fact that we invent the concept rather than having it handed to us by the universe is less obvious—time is something we literally don’t know how to live without. Nevertheless, part of the task of science (and philosophy) is to take our intuitive notion of a basic concept such as “time” and turn it into something rigorous. What we find along the way is that we haven’t been using this word in a single unambiguous fashion; it has a few different meanings, each of which merits its own careful elucidation.
Time comes in three different aspects, all of which are going to be important to us.
1. Time labels moments in the universe. Time is a coordinate; it helps us locate things.
2. Time measures the duration elapsed between events. Time is what clocks measure.
3. Time is a medium through which we move. Time is the agent of change. We move through it, or—equivalently—time flows past us, from the past, through the present, toward the future.
At first glance, these all sound somewhat similar. Time labels moments, it measures duration, and it moves from past to future—sure, nothing controversial about any of that. But as we dig more deeply, we’ll see how these ideas don’t need to be related to one another—they represent logically independent concepts that happen to be tightly intertwined in our actual world. Why is that so? The answer matters more than scientists have tended to think.
1. Time labels moments in the universe
John Archibald Wheeler, an influential American physicist who coined the term black hole, was once asked how he would define “time.” After thinking for a while, he came up with this: “Time is Nature’s way of keeping everything from happening at once.”
There is a lot of truth there, and more than a little wisdom. When we ordinarily think about the world, not as scientists or philosophers but as people getting through life, we tend to identify “the world” as a collection of things, located in various places. Physicists combine all of the places together and label the whole collection “space,” and they have different ways of thinking about the kinds of things that exist in space—atoms, elementary particles, quantum fields, depending on the context. But the underlying idea is the same. You’re sitting in a room, there are various pieces of furniture, some books, perhaps food or other people, certainly some air molecules—the collection of all those things, everywhere from nearby to the far reaches of intergalactic space, is “the world.”
And the world changes. We find objects in some particular arrangement, and we also find them in some other arrangement. (It’s very hard to craft a sensible sentence along those lines without referring to the concept of time.) But we don’t see the different configurations “simultaneously,” or “at once.” We see one configuration—here you are on the sofa, and the cat is in your lap—and then we see another configuration—the cat has jumped off your lap, annoyed at the lack of attention while you are engrossed in your book. So the world appears to us again and again, in various configurations, but these configurations are somehow distinct. Happily, we can label the various configurations to keep straight which is which—Miss Kitty is walking away “now”; she was on your lap “then.” That label is time.
So the world exists, and what is more, the world happens, again and again. In that sense, the world is like the different frames of a film reel—a film whose camera view includes the entire universe. (There are also, as far as we can tell, an infinite number of frames, infinitesimally separated.) But of course, a film is much more than a pile of individual frames. Those frames better be in the right order, which is crucial for making sense of the movie. Time is the same way. We can say much more than “that happened,” and “that also happened,” and “that happened, too.” We can say that this happened before that happened, and the other thing is going to happen after. Time isn’t just a label on each instance of the world; it provides a sequence that puts the different instances in order.
A real film, of course, doesn’t include the entire universe within its field of view. Because of that, movie editing typically involves “cuts”—abrupt jumps from one scene or camera angle to another. Imagine a movie in which every single transition between two frames was a cut to a completely different scene. When shown through a projector, it would be incomprehensible—on the screen it would look like random static. Presumably there is some French avant-garde film that has already used this technique.
The real universe is not an avant-garde film. We experience a degree of continuity through time—if the cat is on your lap now, there might be some danger that she will stalk off, but there is little worry that she will simply dematerialize into nothingness one moment later. This continuity is not absolute, at the microscopic level; particles can appear and disappear, or at least transform under the right conditions into different kinds of particles. But there is not a wholesale rearrangement of reality from moment to moment.
This phenomenon of persistence allows us to think about “the world” in a different way. Instead of a collection of things distributed through space that keep changing into different configurations, we can think of the entire history of the world, or any particular thing in it, in one fell swoop. Rather than thinking of Miss Kitty as a particular arrangement of cells and fluids, we can think of her entire life stretching through time, from birth to death. The history of an object (a cat, a planet, an electron) through time defines its world line—the trajectory the object takes through space as time passes.3 The world line of an object is just the complete set of positions the object has in the world, labeled by the particular time it was in each position.
Figure 1: The world, ordered into different moments of time. Objects (including people and cats) persist from moment to moment, defining world lines that stretch through time.
Finding ourselves
Thinking of the entire history of the universe all at once, rather than thinking of the universe as a set of things that are constantly moving around, is the first step toward thinking of time as “kind of like space,” which we will examine further in the chapters to come. We use both time and space to help us pinpoint things that happen in the universe. When you want to meet someone for coffee, or see a certain showing of a movie, or show up for work along with everyone else, you need to specify a time: “Let’s meet at the coffee shop at 6:00 P.M. this Thursday.”
If you want to meet someone, of course, it’s not sufficient just to specify a time; you also need to specify a place. (Which coffee shop are we talking about here?) Physicists say that space is “three-dimensional.” What that means is that we require three numbers to uniquely pick out a particular location. If the location is near the Earth, a physicist might give the latitude, longitude, and height above ground. If the location is somewhere far away, astronomically speaking, we might give its direction in the sky (two numbers, analogous to latitude and longitude), plus the distance from Earth. It doesn’t matter how we choose to specify those three numbers; the crucial point is that you will always need exactly three. Those three numbers are the coordinates of that location in space. We can think of a little label attached to each point, telling us precisely what the coordinates of that point are.
Figure 2: Coordinates attached to each point in space.
In everyday life, we can often shortcut the need to specify all three coordinates of space. If you say “the coffee shop at Eighth and Main Street,” you’re implicitly giving two coordinates—“Eighth” and “Main Street”—and you’re assuming that we all agree the coffee shop is likely to be at ground level, rather than in the air or underground. That’s a convenience granted to us by the fact that much of the space we use to locate things in our daily lives is effectively two-dimensional, confined near the surface of the Earth. But in principle, all three coordinates are needed to specify a point in space.
Each point in space occurs once at each moment of time. If we specify a certain location in space at one definite moment in time, physicists call that an event. (This is not meant to imply that it’s an especially exciting event; any random point in empty space at any particular moment of time would qualify, so long as it’s uniquely specified.) What we call the “universe” is just the set of all events—every point in space, at every moment of time. So we need four numbers—three coordinates of space, and one of time—to uniquely pick out an event. That’s why we say that the universe is four-dimensional. This is such a useful concept that we will often treat the whole collection, every point in space at every moment of time, as a single entity called spacetime.
This is a big conceptual leap, so it’s worth pausing to take it in. It’s natural to think of the world as a three-dimensional conglomeration that keeps changing (“happening over and over again, slightly differently each time”). We’re now suggesting that we can think of the whole shebang, the entire history of the world, as a single four-dimensional thing, where the additional dimension is time. In this sense, time serves to slice up the four-dimensional universe into copies of space at each moment in time—the whole universe at 10:00 A.M. on January 20, 2010; the whole universe at 10:01 A.M. on January 20, 2010; and so on. There are an infinite number of such slices, together making up the universe.
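Since we will lean on this picture repeatedly, here is a minimal sketch (my own bookkeeping, not anything from the book) of the two notions in code: an event is just four numbers, and a world line is an ordered collection of events traced out by a single object.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Event:
    """One point in spacetime: a time coordinate plus three space coordinates."""
    t: float  # time (say, seconds)
    x: float  # space coordinates (say, meters)
    y: float
    z: float

# A world line is the ordered set of events occupied by one object; here,
# a cat sitting still at the spatial origin while five moments tick by.
cat_world_line = [Event(t=float(t), x=0.0, y=0.0, z=0.0) for t in range(5)]

# A "moment of time" is a slice of spacetime: all events sharing one value of t.
moment_two = [e for e in cat_world_line if e.t == 2.0]
print(moment_two)  # [Event(t=2.0, x=0.0, y=0.0, z=0.0)]
```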
2. Time measures the duration elapsed between events
The second aspect of time is that it measures the duration elapsed between events. That sounds pretty similar to the “labels moments in the universe” aspect already discussed, but there is a difference. Time not only labels and orders different moments; it also measures the distance between them.
When taking up the mantle of philosopher or scientist and trying to make sense of a subtle concept, it’s helpful to look at things operationally—how do we actually use this idea in our experience? When we use time, we refer to the measurements that we get by reading clocks. If you watch a TV show that is supposed to last one hour, the reading on your clock at the end of the show will be one hour later than what it read when the show began. That’s what it means to say that one hour elapsed during the broadcast of that show: Your clock read an hour later when it ended than when it began.
But what makes a good clock? The primary criterion is that it should be consistent—it wouldn’t do any good to have a clock that ticked really fast sometimes and really slowly at others. Fast or slow compared to what? The answer is: other clocks. As a matter of empirical fact (rather than logical necessity), there are some objects in the universe that are consistently periodic—they do the same thing over and over again, and when we put them next to one another we find them repeating in predictable patterns.
Think of planets in the Solar System. The Earth orbits around the Sun, returning to the same position relative to the distant stars once every year. By itself, that’s not so meaningful—it’s just the definition of a “year.” But Mars, as it turns out, returns to the same position once every 1.88 years. That kind of statement is extremely meaningful—without recourse to the concept of a “year,” we can say that Earth moves around the Sun 1.88 times every time Mars orbits just once.4 Likewise, Venus moves around the Sun 1.63 times every time Earth orbits just once.
The key to measuring time is synchronized repetition—a wide variety of processes occur over and over again, and the number of times that one process repeats itself while another process returns to its original state is reliably predictable. The Earth spins on its axis, and it’s going to do so 365.25 times every time the Earth moves around the Sun. The tiny crystal in a quartz watch vibrates 2,831,155,200 times every time the Earth spins on its axis. (That’s 32,768 vibrations per second, 3,600 seconds in an hour, 24 hours in a day.5) The reason why quartz watches are reliable is that quartz crystal has extremely regular vibrations; even as the temperature or pressure changes, the crystal will vibrate the same number of times for every one rotation of the Earth.
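The figures in the last two paragraphs are nothing more exotic than counting, and they are easy to check; the orbital periods below are approximate values supplied here for the check rather than taken from the book.

```python
# Synchronized repetition as simple counting.
EARTH_YEAR_DAYS = 365.25   # approximate
MARS_YEAR_DAYS = 687.0     # approximate
VENUS_YEAR_DAYS = 224.7    # approximate

print(round(MARS_YEAR_DAYS / EARTH_YEAR_DAYS, 2))   # 1.88 Earth orbits per Mars orbit
print(round(EARTH_YEAR_DAYS / VENUS_YEAR_DAYS, 2))  # 1.63 Venus orbits per Earth orbit

# A quartz watch crystal vibrates 32,768 times per second.
QUARTZ_HZ = 32_768
print(QUARTZ_HZ * 3_600)       # 117,964,800 vibrations per hour
print(QUARTZ_HZ * 3_600 * 24)  # 2,831,155,200 vibrations per day (one rotation of the Earth)
```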
So when we say that something is a good clock, we mean that it repeats itself in a predictable way relative to other good clocks. It is a fact about the universe that such clocks exist, and thank goodness. In particular, at the microscopic level where all that matters are the rules of quantum mechanics and the properties (masses, electric charges) of individual elementary particles, we find atoms and molecules that vibrate with absolutely predictable frequencies, forming a widespread array of excellent clocks marching in cheerful synchrony. A universe without good clocks—in which no processes repeated themselves a predictable number of times relative to other repeating processes—would be a scary universe indeed.6
Still, good clocks are not easy to come by. Traditional methods of timekeeping often referred to celestial objects—the positions of the Sun or stars in the sky—because things down here on Earth tend to be messy and unpredictable. In 1581, a young Galileo Galilei reportedly made a breakthrough discovery while he sat bored during a church service in Pisa. The chandelier overhead would swing gently back and forth, but it seemed to move more quickly when it was swinging widely (after a gust of wind, for example) and more slowly when it wasn’t moving as far. Intrigued, Galileo decided to measure how much time it took for each swing, using the only approximately periodic event to which he had ready access: the beating of his own pulse. He found something interesting: The number of heartbeats between swings of the chandelier was roughly the same, regardless of whether the swings were wide or narrow. The size of the oscillations—how far the pendulum swung back and forth—didn’t affect the frequency of those oscillations. That’s not unique to chandeliers in Pisan churches; it’s a robust property of the kind of pendulum physicists call a “simple harmonic oscillator.” And that’s why pendulums form the centerpiece of grandfather clocks and other timekeeping devices: Their oscillations are extremely reliable. The craft of clock making involves the search for ever-more-reliable forms of oscillations, from vibrations in quartz to atomic resonances.
Our interest here is not really in the intricacies of clock construction, but in the meaning of time. We live in a world that contains all sorts of periodic processes, which repeat a predictable number of times in comparison to certain other periodic processes. And that’s how we measure duration: by the number of repetitions of such a process. When we say that our TV program lasts one hour, we mean that the quartz crystal in our watch will oscillate 117,964,800 times between the start and end of the show (32,768 oscillations per second, 3,600 seconds in an hour).
Notice that, by being careful about defining time, we seem to have eradicated the concept entirely. That’s just what any decent definition should do—you don’t want to define something in terms of itself. The passage of time can be completely recast in terms of certain things happening together, in synchrony. “The program lasts one hour” is equivalent to “there will be 117,964,800 oscillations of the quartz crystal in my watch between the beginning and end of the program” (give or take a few commercials). If you really wanted to, you could reinvent the entire superstructure of physics in a way that completely eliminated the concept of “time,” by replacing it with elaborate specifications of how certain things happen in coincidence with certain other things.7 But why would we want to? Thinking in terms of time is convenient, and more than that, it reflects a simple underlying order in the way the universe works.
Figure 3: Good clocks exhibit synchronized repetition. Every time one day passes, the Earth rotates once about its axis, a pendulum with a period of 1 second oscillates 86,400 times, and a quartz watch crystal vibrates 2,831,155,200 times.
Slowing, stopping, bending time
Armed with this finely honed understanding of what we mean by the passage of time, at least one big question can be answered: What would happen if time were to slow down throughout the universe? The answer is: That’s not a sensible question to ask. Slow down relative to what? If time is what clocks measure, and every clock were to “slow down” by the same amount, it would have absolutely no effect at all. Telling time is about synchronized repetition, and as long as the rate of one oscillation is the same relative to some other oscillation, all is well.
As human beings we feel the passage of time. That’s because there are periodic processes occurring within our own metabolism—breaths, heartbeats, electrical pulses, digestion, rhythms of the central nervous system. We are a complicated, interconnected collection of clocks. Our internal rhythms are not as reliable as a pendulum or a quartz crystal; they can be affected by external conditions or our emotional states, leading to the impression that time is passing more quickly or more slowly. But the truly reliable clocks ticking away inside our bodies—vibrating molecules, individual chemical reactions—aren’t moving any faster or slower than usual.8
What could happen, on the other hand, is that certain physical processes that we thought were “good clocks” would somehow go out of synchronization—one clock slows down, or speeds up, compared to all the rest. A sensible response in that case would be to blame that particular clock, rather than casting aspersions on time itself. But if we stretch a bit, we can imagine a particular collection of clocks (including molecular vibrations and other periodic processes) that all change in concert with one another, but apart from the rest of the world. Then we would have to ask whether it was appropriate to say that the rate at which time passes had really changed within that collection.
Consider an extreme example. Nicholson Baker’s novel The Fermata tells the story of a man, Arno Strine, with the ability to “stop time.” (Mostly he uses this miraculous power to go around undressing women.) It wouldn’t mean anything if time stopped everywhere; the point is that Arno keeps moving through time, while everything around him stops. We all know this is unrealistic, but it’s instructive to reflect upon the way in which it flouts the laws of physics. What this approach to stopping time entails is that every kind of motion and rhythm in Arno’s body continues as usual, while every kind of motion and rhythm in the outside world freezes absolutely still. Of course we have to imagine that time continues for all of the air and fluids within Arno, otherwise he would instantly die. But if the air in the rest of the room has truly stopped experiencing time, each molecule must remain suspended precisely in its location; consequently, Arno would be unable to move, trapped in a prison of rigidly stationary air molecules. Okay, let’s be generous and assume that time would proceed normally for any air molecules that came sufficiently close to Arno’s skin. (The book alludes to something of the sort.) But everything outside, by assumption, is not changing in any way. In particular, no sound or light would be able to travel to him from the outside world; Arno would be completely deaf and blind. It turns out not to be a promising environment for a Peeping Tom.9
What if, despite all the physical and narrative obstacles, something like this really could happen? Even if we can’t stop time around us, presumably we can imagine speeding up the motion of some local clocks. If we truly measure time by synchronized repetition, and we arranged an ensemble of clocks that were all running fast compared to the outside world while they remained in synchrony with one another, wouldn’t that be something like “time running faster” within that arrangement?
It depends. We’ve wandered far afield from what might actually happen in the real world, so let’s establish some rules. We’re fortunate enough to live in a universe that features very reliable clocks. Without such clocks, we can’t use time to measure the duration between events. In the world of The Fermata, we could say that time slowed down for the universe outside Arno Strine—or, equivalently and perhaps more usefully, that time for him sped up, while the rest of the world went on as usual. But just as well, we could say that “time” was completely unaffected, and what changed were the laws of particle physics (masses, charges on different particles) within Arno’s sphere of influence. Concepts like “time” are not handed to us unambiguously by the outside world but are invented by human beings trying to make sense of the universe. If the universe were very different, we might have to make sense of it in a different way.
Meanwhile, there is a very real way for one collection of clocks to measure time differently than another: have them move along different paths through spacetime. That’s completely compatible with our claim that “good clocks” should measure time in the same way, because we can’t readily compare clocks unless they’re next to one another in space. The total amount of time elapsed on two different trajectories can be different without leading to any inconsistencies. But it does lead to something important—the theory of relativity.
Twisty paths through spacetime
Through the miracle of synchronized repetition, time doesn’t simply put different moments in the history of the universe into order; it also tells us “how far apart” they are (in time). We can say more than “1776 happened before 2010”; we can say “1776 happened 234 years before 2010.”
I should emphasize a crucial distinction between “dividing the universe into different moments” and “measuring the elapsed time between events,” a distinction that will become enormously important when we get to relativity. Let’s imagine you are an ambitious temporal10 engineer, and you’re not satisfied to just have your wristwatch keep accurate time; you want to be able to know what time it is at every other event in spacetime as well. You might be tempted to wonder: Couldn’t we (hypothetically) construct a time coordinate all throughout the universe, just by building an infinite number of clocks, synchronizing them to the same time, and scattering them throughout space? Then, wherever we went in spacetime, there would be a clock sitting at each point telling us what time it was, once and for all.
The real world, as we will see, doesn’t let us construct an absolute universal time coordinate. For a long time people thought it did, under no less an authority than that of Sir Isaac Newton. In Newton’s view of the universe, there was one particular right way to slice up the universe into slices of “space at a particular moment of time.” And we could indeed, at least in a thought-experiment kind of way, send clocks all throughout the universe to set up a time coordinate that would uniquely specify when a certain event was taking place.
But in 1905, along comes Einstein with his special theory of relativity.11 The central conceptual breakthrough of special relativity is that our two aspects of time, “time labels different moments” and “time is what clocks measure,” are not equivalent, or even interchangeable. In particular, the scheme of setting up a time coordinate by sending clocks throughout the universe would not work: two clocks, leaving the same event and arriving at the same event but taking different paths to get there, will generally experience different durations along the journey, slipping out of synchronization. That’s not because we haven’t been careful enough to pick “good clocks,” as defined above. It’s because the duration elapsed along two trajectories connecting two events in spacetime need not be the same.
This idea isn’t surprising, once we start thinking that “time is kind of like space.” Consider an analogous statement, but for space instead of time: The distance traveled along two paths connecting two points in space need not be the same. Doesn’t sound so surprising at all, does it? Of course we can connect two points in space by paths with different lengths; one could be straight and one could be curved, and we would always find that the distance along the curved path was greater. But the difference in coordinates between the same two points is always the same, regardless of how we get from one point to another. That’s because, to drive home the obvious, the distance you travel is not the same as your change in coordinates. Consider a running back in football who zips back and forth while evading tacklers, and ends up advancing from the 30-yard line to the 80-yard line. (It should really be “the opponent’s 20-yard line,” but the point is clearer this way.) The change in coordinates is 50 yards, no matter how long or short was the total distance he ran.
Figure 4: Yard lines serve as coordinates on a football field. A running back who advances the ball from the 30-yard line to the 80-yard line has changed coordinates by 50 yards, even though the distance traveled may have been much greater.
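To put numbers on the running-back analogy, here is a short sketch; the zigzag route is invented for illustration, and only the first coordinate plays the role of the yard line.

```python
import math

# Waypoints as (yard_line, sideline_offset) pairs; the zigzag is invented.
waypoints = [(30, 0), (35, 20), (40, -15), (60, 10), (80, 0)]

# Change in coordinate: final yard line minus initial yard line.
coordinate_change = waypoints[-1][0] - waypoints[0][0]

# Distance traveled: sum of the straight segment lengths along the route.
distance_run = sum(math.dist(a, b) for a, b in zip(waypoints, waypoints[1:]))

print(coordinate_change)       # 50 yards, no matter which route is taken
print(round(distance_run, 1))  # about 110.3 yards actually covered on this route
```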
The centerpiece of special relativity is the realization that time is like that. Our second definition, time is duration as measured by clocks, is analogous to the total length of a path through space; the clock itself is analogous to an odometer or some other instrument that measures the total distance traveled. This definition is simply not the same as the concept of a coordinate labeling different slices of spacetime (analogous to the yard lines on a football field). And this is not some kind of technical problem that we can “fix” by building better clocks or making better choices about how we travel through spacetime; it’s a feature of how the universe works, and we need to learn to live with it.
As fascinating and profound as it is that time works in many ways similar to space, it will come as no surprise that there are crucial differences as well. Two of them are central elements of the theory of relativity. First, while there are three dimensions of space, there is only one of time; that brute fact has important consequences for physics, as you might guess. And second, while a straight line between two points in space describes the shortest distance, a straight trajectory between two events in spacetime describes the longest elapsed duration.
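The time-side version of that last statement can be previewed with the special-relativistic proper-time formula, Δτ = Δt·√(1 − v²/c²), which the book develops in Part Two; the scenario and the numbers below are my own illustration, not the author's.

```python
import math

C = 299_792_458.0  # speed of light in meters per second

def proper_time(coordinate_time_s: float, speed_m_s: float) -> float:
    """Duration ticked off by a clock moving at constant speed:
    tau = t * sqrt(1 - v^2 / c^2)."""
    return coordinate_time_s * math.sqrt(1.0 - (speed_m_s / C) ** 2)

# Two clocks separate at one event and reunite one year of coordinate time later.
one_year = 365.25 * 24 * 3600.0

stay_at_home = proper_time(one_year, 0.0)  # straight trajectory through spacetime
traveler = proper_time(one_year, 0.6 * C)  # out-and-back detour at 0.6c

print(stay_at_home / one_year)  # 1.0 -> the straight path elapses the most time
print(traveler / one_year)      # 0.8 -> the detour through spacetime elapses less
```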
But the most obvious, blatant, unmistakable difference between time and space is that time has a direction, and space doesn’t. Time points from the past toward the future, while (out there in space, far away from local disturbances like the Earth) all directions of space are created equal. We can invert directions in space without doing damage to how physics works, but all sorts of real processes can happen in one direction of time but not the other. It’s to this crucial difference that we now turn.
3. Time is a medium through which we move
The sociology experiment suggested at the beginning of this chapter, in which you ask strangers how they would define “time,” also serves as a useful tool for distinguishing physicists from non-physicists. Nine times out of ten, a physicist will say something related to one of the first two notions above—time is a coordinate, or time is a measure of duration. Equally often, a non-physicist will say something related to the third aspect we mentioned—time is something that flows from past to future. Time whooshes along, from “back then” to “now” and on toward “later.”
Or, conversely, someone might say that we move through time, as if time were a substance through which we could pass. In the Afterword to his classic Zen and the Art of Motorcycle Maintenance, Robert Pirsig relates a particular twist on this metaphor. The ancient Greeks, according to Pirsig, “saw the future as something that came upon them from behind their backs, with the past receding away before their eyes.”12 When you think about it, that seems a bit more honest than the conventional view that we march toward the future and away from the past. We know something about the past, from experience, while the future is more conjectural.
Common to these perspectives is the idea that time is a thing, and it’s a thing that can change—flow around us, or pass by as we move through it. But conceptualizing time as some sort of substance with its own dynamics, perhaps even the ability to change at different rates depending on circumstances, raises one crucially important question.
What in the world is that supposed to mean?
Consider something that actually does flow, such as a river. We can think about the river from a passive or an active perspective: Either we are standing still as the water rushes by, or perhaps we are on a boat moving along with the river as the banks on either side move relative to us.
The river flows, no doubt about that. And what that means is that the location of some particular drop of river water changes with time—here it is at some moment, there it is just a bit later. And we can talk sensibly about the rate at which the river flows, which is just the velocity of the water—in other words, the distance that the water travels in a given amount of time. We could measure it in miles per hour, or meters per second, or whatever units of “distance traveled per interval of time” you prefer. The velocity may very well change from place to place or moment to moment—sometimes the river flows faster; sometimes it flows more slowly. When we are talking about the real flow of actual rivers, all this language makes perfect sense.
But when we examine carefully the notion that time itself somehow “flows,” we hit a snag. The flow of the river was a change with time—but what is it supposed to mean to say that time changes with time? A literal flow is a change of location over time—but time doesn’t have a “location.” So what is it supposed to be changing with respect to?
Think of it this way: If time does flow, how would we describe its speed? It would have to be something like “x hours per hour”—an interval of time per unit time. And I can tell you what x is going to be—it’s 1, all the time. The speed of time is 1 hour per hour, no matter what else might be going on in the universe.
The lesson to draw from all this is that it’s not quite right to think of time as something that flows. It’s a seductive metaphor, but not one that holds up under closer scrutiny. To extract ourselves from that way of thinking, it’s helpful to stop picturing ourselves as positioned within the universe, with time flowing around us. Instead, let’s think of the universe—all of the four-dimensional spacetime around us—as somehow a distinct entity, as if we were observing it from an external perspective. Only then can we appreciate time for what it truly is, rather than privileging our position right here in the middle of it.
The view from nowhen
We can’t literally stand outside the universe. The universe is not some object that sits embedded in a larger space (as far as we know); it’s the collection of everything that exists, space and time included. So we’re not wondering what the universe would really look like from the point of view of someone outside it; no such being could possibly exist. Rather, we’re trying to grasp the entirety of space and time as a single entity. Philosopher Huw Price calls this “the view from nowhen,” a perspective separate from any particular moment in time.13 We are all overly familiar with time, having dealt with it every day of our lives. But we can’t help but situate ourselves within time, and it’s useful to contemplate all of space and time in a single picture.
And what do we see, when looking down from nowhen? We don’t see anything changing with time, because we are outside of time ourselves. Instead, we see all of history at once—past, present, and future. It’s like thinking of space and time as a book, which we could in principle open to any passage, or even cut apart and spread out all the pages before us, rather than as a movie, where we are forced to watch events in sequence at specific times. We could also call this the Tralfamadorian perspective, after the aliens in Kurt Vonnegut’s Slaughterhouse-Five. According to protagonist Billy Pilgrim,
The Tralfamadorians can look at all the different moments just the way we can look at a stretch of the Rocky Mountains, for instance. They can see how permanent all the moments are, and they can look at any moment that interests them. It is just an illusion we have here on earth that one moment follows another like beads on a string, and that once a moment is gone it is gone forever.14
How do we reconstruct our conventional understanding of flowing time from this lofty timeless Tralfamadorian perch? What we see are correlated events, arranged in a sequence. There is a clock reading 6:45, and a person standing in their kitchen with a glass of water in one hand and an ice cube in the other. In another scene, the clock reads 6:46 and the person is again holding the glass of water, now with the ice cube inside. In yet another one, the clock reads 6:50 and the person is holding a slightly colder glass of water, now with the ice cube somewhat melted.
In the philosophical literature, this is sometimes called the “block time” or “block universe” perspective, thinking of all space and time as a single existing block of spacetime. For our present purposes, the important point is that we can think about time in this way. Rather than carrying a picture in the back of our minds in which time is a substance that flows around us or through which we move, we can think of an ordered sequence of correlated events, together constituting the entire universe. Time is then something we reconstruct from the correlations in these events. “This ice cube melted over the course of ten minutes” is equivalent to “the clock reads ten minutes later when the ice cube has melted than it does when the ice cube is put into the glass.” We’re not committing ourselves to some dramatic conceptual stance to the effect that it’s wrong to think of ourselves as embedded within time; it just turns out to be more useful, when we get around to asking why time and the universe are the way they are, to be able to step outside and view the whole ball of wax from the perspective of nowhen.
Opinions differ, of course. The struggle to understand time is a puzzle of long standing, and what is “real” and what is “useful” have been very much up for debate. One of the most influential thinkers on the nature of time was St. Augustine, the fifth-century North African theologian and Father of the Church. Augustine is perhaps best known for developing the doctrine of original sin, but he was interdisciplinary enough to occasionally turn his hand to metaphysical issues. In Book XI of his Confessions, he discusses the nature of time.
What is by now evident and clear is that neither future nor past exists, and it is inexact language to speak of three times—past, present, and future. Perhaps it would be exact to say: there are three times, a present of things past, a present of things present, a present of things to come. In the soul there are these three aspects of time, and I do not see them anywhere else. The present considering the past is memory, the present considering the present is immediate awareness, the present considering the future is expectation.15
Augustine doesn’t like this block-universe business. He is what is known as a “presentist,” someone who thinks that only the present moment is real—the past and future are things that we here in the present simply try to reconstruct, given the data and knowledge available to us. The viewpoint we’ve been describing, on the other hand, is (sensibly enough) known as “eternalism,” which holds that past, present, and future are all equally real.16
Concerning the debate between eternalism and presentism, a typical physicist would say: “Who cares?” Perhaps surprisingly, physicists are not overly concerned with adjudicating which particular concepts are “real” or not. They care very much about how the real world works, but to them it’s a matter of constructing comprehensive theoretical models and comparing them with empirical data. It’s not the individual concepts characteristic of each model (“past,” “future,” “time”) that matter; it’s the structure as a whole. Indeed, it often turns out that one specific model can be described in two completely different ways, using entirely different sets of concepts.17
So, as scientists, our goal is to construct a model of reality that successfully accounts for all of these different notions of time—time as measured by clocks, time as a coordinate on spacetime, and our subjective feeling that time flows. The first two are actually very well understood in terms of Einstein’s theory of relativity, as we will cover in Part Two of the book. But the third remains a bit mysterious. I am belaboring the notion of standing outside time to behold the entire universe as a single entity because we need to distinguish the notion of time in and of itself from the perception of time as experienced from our parochial view within the present moment. The challenge before us is to reconcile these two perspectives.
2
THE HEAVY HAND OF ENTROPY
Eating is unattractive too . . . Various items get gulped into my mouth, and after skillful massage with tongue and teeth I transfer them to the plate for additional sculpture with knife and fork and spoon. That bit’s quite therapeutic at least, unless you’re having soup or something, which can be a real sentence. Next you face the laborious business of cooling, of reassembly, of storage, before the return of these foodstuffs to the Superette, where, admittedly, I am promptly and generously reimbursed for my pains. Then you tool down the aisles, with trolley or basket, returning each can and packet to its rightful place.
—Martin Amis, Time’s Arrow18
Forget about spaceships, rocket guns, clashes with extraterrestrial civilizations. If you want to tell a story that powerfully evokes the feeling of being in an alien environment, you have to reverse the direction of time.
You could, of course, simply take an ordinary story and tell it backward, from the conclusion to the beginning. This literary device, known as “reverse chronology,” appears at least as early as Virgil’s Aeneid. But to really jar readers out of their temporal complacency, you want to have some of your characters experience time backward. The reason it’s jarring, of course, is that all of us nonfictional characters experience time in the same way; that’s due to the consistent increase of entropy throughout the universe, which defines the arrow of time.
THROUGH THE LOOKING GLASS
F. Scott Fitzgerald’s short story “The Curious Case of Benjamin Button”—more recently made into a film starring Brad Pitt—features a protagonist who is born as an old man and gradually grows younger as time passes. The nurses of the hospital at which Benjamin is born are, understandably, somewhat at a loss.
Wrapped in a voluminous white blanket, and partly crammed into one of the cribs, there sat an old man apparently about seventy years of age. His sparse hair was almost white, and from his chin dripped a long smoke-coloured beard, which waved absurdly back and forth, fanned by the breeze coming in at the window. He looked up at Mr. Button with dim, faded eyes in which lurked a puzzled question.
“Am I mad?” thundered Mr. Button, his terror resolving into rage. “Is this some ghastly hospital joke?”
“It doesn’t seem like a joke to us,” replied the nurse severely. “And I don’t know whether you’re mad or not—but that is most certainly your child.”
The cool perspiration redoubled on Mr. Button’s forehead. He closed his eyes, and then, opening them, looked again. There was no mistake—he was gazing at a man of threescore and ten—a baby of threescore and ten, a baby whose feet hung over the sides of the crib in which it was reposing.19
No mention is made in the story of what poor Mrs. Button must have been feeling around this time. (In the movie version, at least the newborn Benjamin is baby-sized, albeit old and wrinkled.)
Because it is so bizarre, having time run backward for some characters in a story is often played for comic effect. In Lewis Carroll’s Through the Looking-Glass, Alice is astonished upon first meeting the White Queen, who lives in both directions of time. The Queen is shouting and shaking her finger in pain:
“What IS the matter?” [Alice] said, as soon as there was a chance of making herself heard. “Have you pricked your finger?”
“I haven’t pricked it YET,” the Queen said, “but I soon shall—oh, oh, oh!”
“When do you expect to do it?” Alice asked, feeling very much inclined to laugh.
“When I fasten my shawl again,” the poor Queen groaned out: “the brooch will come undone directly. Oh, oh!” As she said the words the brooch flew open, and the Queen clutched wildly at it, and tried to clasp it again.
“Take care!” cried Alice. “You’re holding it all crooked!” And she caught at the brooch; but it was too late: the pin had slipped, and the Queen had pricked her finger.20
Carroll (no relation21) is playing with a deep feature of the nature of time—the fact that causes precede effects. The scene makes us smile, while serving as a reminder of how central the arrow of time is to the way we experience the world.
Time can be reversed in the service of tragedy, as well as comedy. Martin Amis’s novel Time’s Arrow is a classic of the reversing-time genre, even accounting for the fact that it’s a pretty small genre.22 Its narrator is a disembodied consciousness who lives inside another person, Odilo Unverdorben. The host lives life in the ordinary sense, forward in time, but the homunculus narrator experiences everything backward—his first memory is Unverdorben’s death. He has no control over Unverdorben’s actions, nor access to his memories, but passively travels through life in reverse order. At first Unverdorben appears to us as a doctor, which strikes the narrator as quite a morbid occupation—patients shuffle into the emergency room, where staff suck medicines out of their bodies and rip off their bandages, sending them out into the night bleeding and screaming. But near the end of the book, we learn that Unverdorben was an assistant at Auschwitz, where he created life where none had been before—turning chemicals and electricity and corpses into living persons. Only now, thinks the narrator, does the world finally make sense.
THE ARROW OF TIME
There is a good reason why reversing the relative direction of time is an effective tool of the imagination: In the actual, non-imaginary world, it never happens. Time has a direction, and it has the same direction for everybody. None of us has met a character like the White Queen, who remembers what we think of as “the future” rather than (or in addition to) “the past.”
What does it mean to say that time has a direction, an arrow pointing from the past to the future? Think about watching a movie played in reverse. Generally, it’s pretty clear if we are seeing something running the “wrong way” in time. A classic example is a diver and a pool. If the diver dives, and then there is a big splash, followed by waves bouncing around in the water, all is normal. But if we see a pool that starts with waves, which collect into a big splash, in the process lifting a diver up onto the board and becoming perfectly calm, we know something is up: The movie is being played backward.
Certain events in the real world always happen in the same order. It’s dive, splash, waves; never waves, splash, spit out a diver. Take milk and mix it into a cup of black coffee; never take coffee with milk and separate the two liquids. Sequences of this sort are called irreversible processes. We are free to imagine that kind of sequence playing out in reverse, but if we actually see it happen, we suspect cinematic trickery rather than a faithful reproduction of reality.
Irreversible processes are at the heart of the arrow of time. Events happen in some sequences, and not in others. Furthermore, this ordering is perfectly consistent, as far as we know, throughout the observable universe. Someday we might find a planet in a distant solar system that contains intelligent life, but nobody suspects that we will find a planet on which the aliens regularly separate (the indigenous equivalents of) milk and coffee with a few casual swirls of a spoon. Why isn’t that surprising? It’s a big universe out there; things might very well happen in all sorts of sequences. But they don’t. For certain kinds of processes—roughly speaking, complicated actions with lots of individual moving parts—there seems to be an allowed order that is somehow built into the very fabric of the world.
Tom Stoppard’s play Arcadia uses the arrow of time as a central organizing metaphor. Here’s how Thomasina, a young prodigy who was well ahead of her time, explains the concept to her tutor:
THOMASINA: When you stir your rice pudding, Septimus, the spoonful of jam spreads itself round making red trails like the picture of a meteor in my astronomical atlas. But if you stir backward, the jam will not come together again. Indeed, the pudding does not notice and continues to turn pink just as before. Do you think this odd?
SEPTIMUS: No.
THOMASINA: Well, I do. You cannot stir things apart.
SEPTIMUS: No more you can, time must needs run backward, and since it will not, we must stir our way onward mixing as we go, disorder out of disorder into disorder until pink is complete, unchanging and unchangeable, and we are done with it for ever. This is known as free will or self-determination.23
The arrow of time, then, is a brute fact about our universe. Arguably the brute fact about our universe; the fact that things happen in one order and not in the reverse order is deeply ingrained in how we live in the world. Why is it like that? Why do we live in a universe where X is often followed by Y, but Y is never followed by X?
The answer lies in the concept of “entropy” that I mentioned above. Like energy or temperature, entropy tells us something about the particular state of a physical system; specifically, it measures how disorderly the system is. A collection of papers stacked neatly on top of one another has a low entropy; the same collection, scattered haphazardly on a desktop, has a high entropy. The entropy of a cup of coffee along with a separate teaspoon of milk is low, because there is a particular orderly segregation of the molecules into “milk” and “coffee,” while the entropy of the two mixed together is comparatively large. All of the irreversible processes that reflect time’s arrow—we can turn eggs into omelets but not omelets into eggs, perfume disperses through a room but never collects back into the bottle, ice cubes in water melt but glasses of warm water don’t spontaneously form ice cubes—share a common feature: Entropy increases throughout, as the system progresses from order to disorder. Whenever we disturb the universe, we tend to increase its entropy.
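A short numerical sketch (my own toy model, not anything from the book) makes the milk-and-coffee example concrete: start with the "milk" molecules segregated on one side, let pairs of molecules swap at random, and a crude disorder count goes up and stays up:

import random

# Toy model of milk mixing into coffee: 1 = milk, 0 = coffee, arranged in a row.
# This is an illustrative sketch, not a real thermodynamic calculation.
random.seed(0)
cup = [1] * 20 + [0] * 20          # start segregated: all milk on the left (low entropy)

def mixedness(cup):
    """Count neighboring pairs that differ - a crude stand-in for disorder."""
    return sum(1 for a, b in zip(cup, cup[1:]) if a != b)

print("start:", mixedness(cup))
for step in range(10000):          # randomly swap two molecules, over and over
    i, j = random.randrange(len(cup)), random.randrange(len(cup))
    cup[i], cup[j] = cup[j], cup[i]
print("after mixing:", mixedness(cup))
# The count starts at 1, climbs quickly, and then fluctuates near its maximum;
# a spontaneous return to the segregated arrangement is absurdly improbable,
# which is the arrow of time in miniature.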
A big part of our task in this book will be to explain how the single idea of entropy ties together such a disparate set of phenomena, and then to dig more deeply into what exactly this stuff called “entropy” really is, and why it tends to increase. The final task—still a profound open question in contemporary physics—is to ask why the entropy was so low in the past, so that it could be increasing ever since.
FUTURE AND PAST VS. UP AND DOWN
But first, we need to contemplate a prior question: Should we really be surprised that certain things happen in one direction of time, but not in the other? Who ever said that everything should be reversible, anyway?
Think of time as a label on events as they happen. That’s one of the ways in which time is like space—they both help us locate things in the universe. But from that point of view, there is also a crucial difference between time and space—directions in space are created equal, while directions in time (namely, “the past” and “the future”) are very different. Here on Earth, directions in space are easily distinguished—a compass tells us whether we are moving north, south, east, or west, and nobody is in any danger of confusing up with down. But that’s not a reflection of deep underlying laws of nature—it’s just because we live on a giant planet, with respect to which we can define different directions. If you were floating in a space suit far away from any planets, all directions in space would truly be indistinguishable—there would be no preferred notion of “up” or “down.”
The technical way to say this is that there is a symmetry in the laws of nature—every direction in space is as good as every other. It’s easy enough to “reverse the direction of space”—take a photograph and print it backward, or for that matter just look in a mirror. For the most part, the view in a mirror appears pretty unremarkable. The obvious counterexample is writing, for which it’s easy to tell that we are looking at a reversed image; that’s because writing, like the Earth, does pick out a preferred direction (you’re reading this book from left to right). But the images of most scenes not full of human creations look equally “natural” to us whether we see them directly or we see them through a mirror.
Contrast that with time. The equivalent of “looking at an image through a mirror” (reversing the direction of space) is simply “playing a movie backward” (reversing the direction of time). And in that case, it’s easy to tell when time has been inverted—the irreversible processes that define the arrow of time are suddenly occurring in the wrong order. What is the origin of this profound difference between space and time?
While it’s true that the presence of the Earth beneath our feet picks out an “arrow of space” by distinguishing up from down, it’s pretty clear that this is a local, parochial phenomenon, rather than a reflection of the underlying laws of nature. We can easily imagine ourselves out in space where there is no preferred direction. But the underlying laws of nature do not pick out a preferred direction of time, any more than they pick out a preferred direction in space. If we confine our attention to very simple systems with just a few moving parts, whose motion reflects the basic laws of physics rather than our messy local conditions, there is no arrow of time—we can’t tell when a movie is being run backward. Think about Galileo’s chandelier, rocking peacefully back and forth. If someone showed you a movie of the chandelier, you wouldn’t be able to tell whether it was being shown forward or backward—its motion is sufficiently simple that it works equally well in either direction of time.
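The chandelier example can be stated as an equation. For small swings, an idealized pendulum of length \ell in gravity g obeys (this is the standard textbook equation of motion, not a formula displayed in this book):

\frac{d^2\theta}{dt^2} = -\frac{g}{\ell}\,\theta

Replacing t with -t leaves the equation untouched, because the second derivative picks up two canceling factors of -1; if \theta(t) is an allowed motion, so is \theta(-t), which is why the reversed movie of the chandelier looks just as lawful as the original.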
Figure 5: The Earth defines a preferred direction in space, while the Big Bang defines a preferred direction in time.
The arrow of time, therefore, is not a feature of the underlying laws of physics, at least as far as we know. Rather, like the up/down orientation in space picked out by the Earth, the preferred direction of time is also a consequence of features of our environment. In the case of time, it’s not that we live in the spatial vicinity of an influential object; it’s that we live in the temporal vicinity of an influential event: the birth of the universe. The beginning of our observable universe, the hot dense state known as the Big Bang, had a very low entropy. The influence of that event orients us in time, just as the presence of the Earth orients us in space.
NATURE’S MOST RELIABLE LAW
The principle underlying irreversible processes is summed up in the Second Law of Thermodynamics:
The entropy of an isolated system either remains constant or increases with time.
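In the conventional symbolic shorthand (the notation is standard, though the book states the law in words), with S the entropy of an isolated system:

\Delta S \geq 0,

with equality only once the system has settled into equilibrium.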
(The First Law states that energy is conserved.24) The Second Law is arguably the most dependable law in all of physics. If you were asked to predict what currently accepted principles of physics would still be considered inviolate a thousand years from now, the Second Law would be a good bet. Sir Arthur Eddington, a leading astrophysicist of the early twentieth century, put it emphatically:
If someone points out to you that your pet theory of the universe is in disagreement with Maxwell’s equations [the laws of electricity and magnetism]—then so much the worse for Maxwell’s equations. If it is found to be contradicted by observation—well, these experimentalists do bungle things sometimes. But if your theory is found to be against the Second Law of Thermodynamics I can give you no hope; there is nothing for it but to collapse in deepest humiliation.25
C. P. Snow—British intellectual, physicist, and novelist—is perhaps best known for his insistence that the “Two Cultures” of the sciences and the humanities had grown apart and should both be a part of our common civilization. When he came to suggest the most basic item of scientific knowledge that every educated person should understand, he chose the Second Law:
A good many times I have been present at gatherings of people who, by the standards of the traditional culture, are thought highly educated and who have with considerable gusto been expressing their incredulity at the illiteracy of scientists. Once or twice I have been provoked and have asked the company how many of them could describe the Second Law of Thermodynamics, the law of entropy. The response was cold: it was also negative. Yet I was asking something which is about the scientific equivalent of: “Have you read a work of Shakespeare’s?”26
I’m sure Baron Snow was quite the hit at Cambridge cocktail parties. (To be fair, he did later admit that even physicists didn’t really understand the Second Law.)
Our modern definition of entropy was proposed by Austrian physicist Ludwig Boltzmann in 1877. But the concept of entropy, and its use in the Second Law of Thermodynamics, dates back to German physicist Rudolf Clausius in 1865. And the Second Law itself goes back even earlier—to French military engineer Nicolas Léonard Sadi Carnot in 1824. How in the world did Clausius use entropy in the Second Law without knowing its definition, and how did Carnot manage to formulate the Second Law without even using the concept of entropy at all?
The nineteenth century was the heroic age of thermodynamics—the study of heat and its properties. The pioneers of thermodynamics studied the interplay between temperature, pressure, volume, and energy. Their interest was by no means abstract—this was the dawn of the industrial age, and much of their work was motivated by the desire to build better steam engines.
Today physicists understand that heat is a form of energy and that the temperature of an object is simply a measure of the average kinetic energy (energy of motion) of the atoms in the object. But in 1800, scientists didn’t believe in atoms, and they didn’t understand energy very well. Carnot, whose pride was wounded by the fact that the English were ahead of the French in steam engine technology, set himself the task of understanding how efficient such an engine could possibly be—how much useful work could you do by burning a certain amount of fuel? He showed that there is a fundamental limit to such extraction. By taking an intellectual leap from real machines to idealized “heat engines,” Carnot demonstrated there was a best possible engine, which got the most work out of a given amount of fuel operating at a given temperature. The trick, unsurprisingly, was to minimize the production of waste heat. We might think of heat as useful in warming our houses during the winter, but it doesn’t help in doing what physicists think of as “work”—getting something like a piston or a flywheel to move from place to place. What Carnot realized was that even the most efficient engine possible is not perfect; some energy is lost along the way. In other words, the operation of a steam engine is an irreversible process.
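The text doesn't display Carnot's limit as a formula, but the standard result is worth recording: an ideal engine running between a hot reservoir at absolute temperature T_h and a cold one at T_c can convert at most the fraction

\eta_{\max} = 1 - \frac{T_c}{T_h}

of the heat it absorbs into useful work. Since T_c is never zero, some heat is always discarded; that unavoidable waste is the irreversibility Carnot identified.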
So Carnot appreciated that engines did something that could not be undone. It was Clausius, in 1850, who understood that this reflected a law of nature. He formulated his law as “heat does not spontaneously flow from cold bodies to warm ones.” Fill a balloon with hot water and immerse it in cold water. Everyone knows that the temperatures will tend to average out: The water in the balloon will cool down as the surrounding liquid warms up. The opposite never happens. Physical systems evolve toward a state of equilibrium—a quiescent configuration that is as uniform as possible, with equal temperatures in all components. From this insight, Clausius was able to re-derive Carnot’s results concerning steam engines.
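Here is a minimal numerical sketch of Clausius's observation (the numbers and the equal heat capacities are my own simplifying assumptions, not the book's):

# Hot and cold bodies exchanging heat in the direction Clausius described.
# Equal, constant heat capacities are assumed purely to keep the sketch short.
T_hot, T_cold = 360.0, 280.0      # temperatures in kelvin (invented values)
heat_capacity = 1000.0            # joules per kelvin, the same for both bodies
dQ = 100.0                        # parcel of heat passed per step, in joules

while T_hot - T_cold > 0.1:
    T_hot -= dQ / heat_capacity   # the hot body gives up a parcel of heat...
    T_cold += dQ / heat_capacity  # ...and the cold body absorbs it
print(round(T_hot, 1), round(T_cold, 1))  # both end near 320 K: equilibrium

Run forward, the temperatures always converge; a history in which they spontaneously diverge would require heat to flow from cold to hot, which is exactly what Clausius's law forbids.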
So what does Clausius’ law (heat never flows spontaneously from colder bodies to hotter ones) have to do with the Second Law (entropy never spontaneously decreases)? The answer is, they are the same law. In 1865 Clausius managed to reformulate his original maxim in terms of a new quantity, which he called the “entropy.” Take an object that is gradually cooling down—emitting heat into its surroundings. As this process happens, consider at every moment the amount of heat being lost, and divide it by the temperature of the object. The entropy is then the accumulated amount of this quantity (the heat lost divided by the temperature) over the course of the entire process. Clausius showed that the tendency of heat to flow from hot objects to cold ones was precisely equivalent to the claim that the entropy of a closed system would only ever go up, never go down. An equilibrium configuration is simply one in which the entropy has reached its maximum value, and has nowhere else to go; all the objects in contact are at the same temperature.
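In symbols, the definition the paragraph describes is the standard one: the entropy change of a body that absorbs small parcels of heat \delta Q at temperature T is

\Delta S = \int \frac{\delta Q}{T},

and the equivalence Clausius proved can be seen in one line: when a parcel of heat \delta Q leaves a hot body at temperature T_h and enters a cold body at T_c < T_h, the total entropy changes by

\Delta S_{\text{total}} = \frac{\delta Q}{T_c} - \frac{\delta Q}{T_h} > 0,

so "heat flows from hot to cold" and "total entropy never decreases" are the same statement in different clothes.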
If that seems a bit abstract, there is a simple way of summing up this view of entropy: It measures the uselessness of a certain amount of energy.27 There is energy in a gallon of gasoline, and it’s useful—we can put it to work. The process of burning that gasoline to run an engine doesn’t change the total amount of energy; as long as we keep careful track of what happens, energy is always conserved.28 But along the way, that energy becomes increasingly useless. It turns into heat and noise, as well as the motion of the vehicle powered by that engine, but even that motion eventually slows down due to friction. And as energy transforms from useful to useless, its entropy increases all the while.
The Second Law doesn’t imply that the entropy of a system can never decrease. We could invent a machine that separated out the milk from a cup of coffee, for example. The trick, though, is that we can only decrease the entropy of one thing by creating more entropy elsewhere. We human beings, and the machines that we might use to rearrange the milk and coffee, and the food and fuel each consume—all of these also have entropy, which will inevitably increase along the way. Physicists draw a distinction between open systems—objects that interact significantly with the outside world, exchanging entropy and energy—and closed systems—objects that are essentially isolated from external influences. In an open system, like the coffee and milk we put into our machine, entropy can certainly decrease. But in a closed system—say, the total system of coffee plus milk plus machine plus human operators plus fuel and so on—the entropy will always increase, or at best stay constant.
THE RISE OF ATOMS
The great insights into thermodynamics of Carnot, Clausius, and their colleagues all took place within a “phenomenological” framework. They knew the big picture but not the underlying mechanisms. In particular, they didn’t know about atoms, so they didn’t think of temperature and energy and entropy as properties of some microscopic substrate; they thought of each of them as real things, in and of themselves. It was common in those days to think of energy in particular as a form of fluid, which could flow from one body to another. The energy-fluid even had a name: “caloric.” And this level of understanding was perfectly adequate to formulating the laws of thermodynamics.
But over the course of the nineteenth century, physicists gradually became convinced that the many substances we find in the world can all be understood as different arrangements of a fixed number of elementary constituents, known as “atoms.” (The physicists actually lagged behind the chemists in their acceptance of atomic theory.) It’s an old idea, dating back to Democritus and other ancient Greeks, but it began to catch on in the nineteenth century for a simple reason: The existence of atoms could explain many observed properties of chemical reactions, which otherwise were simply asserted. Scientists like it when a single simple idea can explain a wide variety of observed phenomena.
These days it is elementary particles such as quarks and leptons that play the role of Democritus’s atoms, but the idea is the same. What a modern scientist calls an “atom” is the smallest possible unit of matter that still counts as a distinct chemical element, such as carbon or nitrogen. But we now understand that such atoms are not indivisible; they consist of electrons orbiting the atomic nucleus, and the nucleus is made of protons and neutrons, which in turn are made of different combinations of quarks. The search for rules obeyed by these elementary building blocks of matter is often called “fundamental” physics, although “elementary” physics would be more accurate (and arguably less self-aggrandizing). Henceforth, I’ll use atoms in the established nineteenth-century sense of chemical elements, not the ancient Greek sense of elementary particles.
The fundamental laws of physics have a fascinating feature: Despite the fact that they govern the behavior of all the matter in the universe, you don’t need to know them to get through your everyday life. Indeed, you would be hard-pressed to discover them, merely on the basis of your immediate experiences. That’s because very large collections of particles obey distinct, autonomous rules of behavior, which don’t really depend on the smaller structures underneath. The underlying rules are referred to as “microscopic” or simply “fundamental,” while the separate rules that apply only to large systems are referred to as “macroscopic” or “emergent.” The behavior of temperature and heat and so forth can certainly be understood in terms of atoms: That’s the subject known as “statistical mechanics.” But it can equally well be understood without knowing anything whatsoever about atoms: That’s the phenomenological approach we’ve been discussing, known as “thermodynamics.” It is a common occurrence in physics that in complex, macroscopic systems, regular patterns emerge dynamically from underlying microscopic rules. Despite the way it is sometimes portrayed, there is no competition between fundamental physics and the study of emergent phenomena; both are fascinating and crucially important to our understanding of nature.
One of the first physicists to advocate atomic theory was a Scotsman, James Clerk Maxwell, who was also responsible for the final formulation of the modern theory of electricity and magnetism. Maxwell, along with Boltzmann in Austria (and following in the footsteps of numerous others), used the idea of atoms to explain the behavior of gases, according to what was known as “kinetic theory.” Maxwell and Boltzmann were able to figure out that the atoms in a gas in a container, fixed at some temperature, should have a certain distribution of velocities—this many would be moving fast, that many would be moving slowly, and so on. These atoms would naturally keep banging against the walls of the container, exerting a tiny force each time they did so. And the accumulated impact of those tiny forces has a name: It is simply the pressure of the gas. In this way, kinetic theory explained features of gases in terms of simpler rules.
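The two standard results of that kinetic theory can be written down explicitly (these are textbook formulas rather than anything displayed in the text): at temperature T, the fraction of molecules of mass m with speed near v follows the Maxwell-Boltzmann distribution, and the accumulated wall impacts give the pressure,

f(v) = 4\pi \left(\frac{m}{2\pi k_B T}\right)^{3/2} v^2\, e^{-m v^2 / 2 k_B T},
\qquad
P = \tfrac{1}{3}\, n\, m\, \langle v^2 \rangle,

where n is the number of molecules per unit volume, k_B is Boltzmann's constant, and \langle v^2 \rangle is the average squared speed.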
ENTROPY AND DISORDER
But the great triumph of kinetic theory was its use by Boltzmann in formulating a microscopic understanding of entropy. Boltzmann realized that when we look at some macroscopic system, we certainly don’t keep track of the exact properties of every single atom. If we have a glass of water in front of us, and someone sneaks in and (say) switches some of the water molecules around without changing the overall temperature and density and so on, we would never notice. There are many different arrangements of particular atoms that are indistinguishable from our macroscopic perspective. And then he noticed that low-entropy objects are more delicate with respect to such rearrangements. If you have an egg, and start exchanging bits of the yolk with bits of the egg white, pretty soon you will notice. The situations that we characterize as “low-entropy” seem to be easily disturbed by rearranging the atoms within them, while “high-entropy” ones are more robust.
Figure 6: Ludwig Boltzmann’s grave in the Zentralfriedhof, Vienna. The inscribed equation, S = k log W, is his formula for entropy in terms of the number of ways you can rearrange microscopic components of a system without changing its macroscopic appearance. (See Chapter Eight for details.)
So Boltzmann took the concept of entropy, which had been defined by Clausius and others as a measure of the uselessness of energy, and redefined it in terms of atoms:
Entropy is a measure of the number of particular microscopic arrangements of atoms that appear indistinguishable from a macroscopic perspective.29
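This is the formula carved on Boltzmann's grave (Figure 6), written today with Boltzmann's constant k_B \approx 1.38 \times 10^{-23} joules per kelvin:

S = k_B \log W,

where W counts the microscopic arrangements that look macroscopically the same. A toy illustration (mine, not the book's): for 100 coins, the "all heads" macrostate can be realized in exactly W = 1 way, so its entropy is zero, while the "50 heads, 50 tails" macrostate can be realized in W = \binom{100}{50} \approx 10^{29} ways and so carries vastly more entropy.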
It would be difficult to overemphasize the importance of this insight. Before Boltzmann, entropy was a phenomenological thermodynamic concept, which followed its own rules (such as the Second Law). After Boltzmann, the behavior of entropy could be derived from deeper underlying principles. In particular, it suddenly makes perfect sense why entropy tends to increase:
In an isolated system entropy tends to increase, because there are more ways to be high entropy than to be low entropy.
At least, that formulation sounds like it makes perfect sense. In fact, it sneaks in a crucial assumption: that we start with a system that has a low entropy. If we start with a system that has a high entropy, we’ll be in equilibrium—nothing will happen at all. That word start sneaks in an asymmetry in time, by privileging earlier times over later ones. And this line of reasoning takes us all the way back to the low entropy of the Big Bang. For whatever reason, of the many ways we could arrange the constituents of the universe, at early times they were in a very special, low-entropy configuration.
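A few lines of counting (an illustrative sketch of my own, using the standard toy model of gas molecules that can sit in either half of a box) show just how lopsided "more ways to be high entropy" really is:

from math import comb, log

N = 60                          # number of molecules; small enough to count exactly
total = 2 ** N                  # every molecule independently sits left or right

# W(n) = number of microstates with exactly n molecules in the left half
W_all_left = comb(N, N)         # = 1: the maximally ordered, low-entropy macrostate
W_half_half = comb(N, N // 2)   # the evenly mixed, high-entropy macrostate

print("W(all left)  =", W_all_left)
print("W(half/half) =", W_half_half)
print("fraction of all microstates that are exactly half/half:",
      W_half_half / total)
print("entropy difference, log W (in units where k_B = 1):",
      round(log(W_half_half) - log(W_all_left), 1))
# With only 60 molecules the half/half macrostate already has ~10^17 microstates;
# a system prepared "all left" and shuffled at random overwhelmingly drifts toward
# the high-entropy macrostates and essentially never finds its way back.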
Product details
- Publisher : Dutton; Reprint edition (October 26, 2010)
- Language : English
- Paperback : 464 pages
- ISBN-10 : 0452296544
- ISBN-13 : 978-0452296541
- Item Weight : 2.31 pounds
- Dimensions : 5.5 x 1 x 8.5 inches
- Best Sellers Rank: #103,356 in Books (See Top 100 in Books)
- #21 in Physics of Time (Books)
- #116 in Quantum Theory (Books)
- #149 in Cosmology (Books)
About the author
Sean Carroll is Homewood Professor of Natural Philosophy at Johns Hopkins University and Fractal Faculty at the Santa Fe Institute. His research focuses on fundamental issues in quantum mechanics, gravitation, statistical mechanics, and cosmology. He has wide-ranging interests, including in philosophy, complexity theory, and information.
Carroll is an active science communicator, and has been blogging regularly since 2004. His textbook "Spacetime and Geometry" has been adopted by a number of universities for their graduate courses in general relativity. He is a frequent public speaker, and has appeared on TV shows such as The Colbert Report and Through The Wormhole with Morgan Freeman. He has produced a set of lectures for The Teaching Company on dark matter and dark energy, and another on the nature of time. He has served as a science consultant for films such as Thor and TRON: Legacy, as well as for TV shows such as Fringe and Bones.
His 2010 popular book, "From Eternity to Here," explained the arrow of time and connected it with the origin of our universe. "The Particle at the End of the Universe," about the Large Hadron Collider and the quest to discover the Higgs boson, was released November 2012, "The Big Picture: On the Origins of Life, Meaning, and the Universe Itself" in May 2016, and "Something Deeply Hidden: Quantum Worlds and the Emergence of Spacetime" in 2019. His next book project is "The Biggest Ideas in the Universe," which will consist of three books. The first, "Space, Time, and Motion," appears in September 2022.
More information at http://preposterousuniverse.com/
Customer reviews
Customer Reviews, including Product Star Ratings help customers to learn more about the product and decide whether it is the right product for them.
To calculate the overall star rating and percentage breakdown by star, we don’t use a simple average. Instead, our system considers things like how recent a review is and whether the reviewer bought the item on Amazon. It also analyzes reviews to verify trustworthiness.
Learn more about how customer reviews work on Amazon.
Customers say
Customers find the book accessible and a good introduction to modern physics. They find it interesting and worth reading, with an understandable style and illustrations. The content is complex, but presented in laymen's terms. However, some readers feel the book is too long, with endless footnotes. Opinions differ on the language, with some finding it easy to understand, while others consider it awkward. There are mixed reviews regarding the time theory, with some finding it complete and well-written, while others say it lacks depth.
AI-generated from the text of customer reviews
Customers find the book accessible and well-written. It provides an excellent introduction to modern physics concepts for lay readers. The book covers many topics in detail and draws together all the research on the topic of time. However, it is a bold introduction to modern ideas that requires attention as you read it.
"...the first book I have read by Sean Carroll, and I found in it an elegant discussion on the concept of the arrow of time...." Read more
"...there will likely never be a final say, this is a bold introduction to a lot of modern ideas, but dont read it lightly as its value is in the depth..." Read more
"...The book takes a tour through modern theories and speculations by starting with a few fundamental questions "what is time and why is it moving..." Read more
"...Carroll's passion, candor, and eccentric prose kept me through it...." Read more
Customers find the book engaging and worth reading. They describe it as a fun read that provides quality modern physics explanations in a clear and thought-provoking way. The author is described as brilliant and the subject matter seems interesting.
"...done an excellent job of presenting us with an in-depth and provocative introduction to this subject...." Read more
"...A note to Sean: Keep up the excellent work. If you're reading this, I would happily purchase other books for you and support your academic endeavors...." Read more
"...but what a wonderful book, too) and then it's up to you. Ah, by the way: the answers to the questions in my first phrase?..." Read more
"...To me the book was quite interesting. A few equations are displayed, but there is no actual use of mathematics...." Read more
Customers find the book's style engaging and reasonable. They appreciate the illustrations that help explain concepts. The book looks like a comic book but reads like a science thriller, according to customers.
"...Buy the book. It's great. The new science is wonderful and elegant...." Read more
"...He goes deeply into the issues, but with a reasdable style that can give at least partial enlightenment on one of the most abstract subjects science..." Read more
"...by the layman as many practical examples are given with many great illustrations to help make the ideas more concrete...." Read more
"Book is excellent. Totally new and beautiful inside/outside...." Read more
Customers like the content. They find the story interesting and engaging, with each chapter leading into the next. The book presents complex subjects in laymen's terms while going deep into issues.
"...I found chapter eleven interesting...." Read more
"...It's not an easy read, though, even if it does present super complex subjects in laymen's terms...." Read more
"...I have to say that I specifically liked the final chapter that offers a how the prediction of multiverses may just be a solution that addresses the..." Read more
"...He goes deeply into the issues, but with a reasdable style that can give at least partial enlightenment on one of the most abstract subjects science..." Read more
Customers have mixed opinions about the language. Some find it easy to understand, with practical examples and clear thought processes. Others describe it as a difficult read with awkward writing and incomplete ideas. There are also typographical errors.
"...Don't worry, the book is not heavy on math...." Read more
"...Content-wise, he does a phenomenal job of summarizing and relating the history of theories of time and their real-world application...." Read more
"...This is not a light read, if you make it such you probably will miss a lot of what the author is trying to communicate, im sure I missed a lot of..." Read more
"...The explanations are good, clear and enlightening - we learn a great deal - if one can stick with it...." Read more
Customers have different views on the time theory. Some find it comprehensive and interesting, challenging traditional ideas about time and space. Others feel the universe is complicated and the book oversimplifies concepts. The hypotheticals are also criticized as obscure and difficult to understand.
"...Chapter eight involves an interesting discussion of Boltzmann's formula, which is a calculation of entropy based on the number of microscopic..." Read more
"This book is an overview of the time symmetry of most physics and the reality we live in where time seems to evolve in 1 direction...." Read more
"...Right now the universe is very complicated: There are lots of galaxies, stars, planets, black holes, people, etc...." Read more
"...difficult concepts and the book culminates with some truly fascinating modern theories on time, the universe, and (seemingly) everything in between...." Read more
Customers have different views on the gravitational energy. Some find it well-presented at a high level, covering general relativity, quantum mechanics, and cosmology. Others feel the concept still applies and quantum field theory is not well developed.
"...Quantum gravity is discussed at a high level and is presented as the theory which eventually will illuminate the subject though the huge fuzziness..." Read more
"...Then the gravitational energy is always negative. Assuming a finite universe, you can add up all the positive energy of mass, kinetic energy, etc...." Read more
"...It covers entropy, general relativity, quantum machanics and cosmology. So quite a bit of ground is covered...." Read more
"...Mind blowing how quantum physics, time, space, gravity are all consistent with ancient mystics and current Buddhism, Hinduism, and Yoga..." Read more
Customers find the book too long at 438 pages. They also mention it's filled with endless and pointless footnotes that add extra length.
"...requires a fair amount of concentration without stop, and the book is rather long...." Read more
"...At 438 pages, it is also a fairly lengthy challenge, and I expect most readers who are not scientists or have no prior background in the subject..." Read more
"...It is a long read, and 3/4 through the book, as most physics books without formulas are, it can get a bit ethereal, but it's a satisfying read,..." Read more
"...Nearly half of this very long book (the specs say 448 pages, but my Kindle version seemed much longer) is taken up with endless, and often pointless..." Read more
Top reviews from the United States
- Reviewed in the United States on October 16, 2012. This is the first book I have read by Sean Carroll, and I found in it an elegant discussion of the concept of the arrow of time. I think he has done an excellent job of presenting us with an in-depth and provocative introduction to this subject. Some parts, I found, required clear, sharp thinking as I read the material; it can sometimes be a bit confusing. Nevertheless, Carroll did quite well in explaining the material in as clear and comprehensive a manner as possible. I should mention that this book packs a tremendous amount of information between its covers. Often I would read only so many pages before having to stop and digest the material.
He divides the book into four sections. In section one, we get into some talk about the concepts of the past (events near the Big Bang), the present, the future, and an introduction to the laws of thermodynamics, especially the second law which is about entropy - an important topic in our understanding of the arrow of time. We also learn about vacuum energy, time symmetry, and what is maximum entropy as he lays the foundation for what is to come.
Section two delves into concepts involving relativity, such as the speed of light and light cones, curved spacetime along with a discussion of white and black holes. Here we learn that black holes provide the strongest connection between gravitation and entropy - the two crucial ingredients in an ultimate explanation of the arrow of time according to Carroll.
Section three introduces us to something called closed time-like curves, a closed flatland universe, and something called a space of states. Microstates and macrostates play an important role in the discussion. Chapter eight involves an interesting discussion of Boltzmann's formula, which is a calculation of entropy based on the number of microscopic arrangements of a system that are macroscopically indistinguishable. For those rusty on exponentials and logarithms, Carroll provides an appendix covering the basics. Don't worry, the book is not heavy on math. We get into a number of concepts involving entropy: Liouville's Theorem, Gibb's formula, Loschmidt's reversibility objection, and the past hypothesis (referring to a boundary condition at the beginning of the universe). I also need to mention Maxwell's demon (illustrating a connection between entropy and information) and Laplace's all-knowing demon. I found chapter eleven interesting. The material delves into quantum mechanics involving such topics as the "quantum cat" and the collapse of the wave function, entanglement, and decoherence. All of the material in this section is actually quite important to building a knowledge foundation for understanding the arrow of time.
In the last section, there is a more in depth coverage of black holes, which, as I said, provides an important connection between entropy and gravity. The question of why the universe had such a low entropy at the beginning is explored in more depth, and the future state of the universe is hypothesized - possibly something called de Sitter space. Inflation and the multiverse are discussed. In this section, the concept of bubble universes is presented as a possible solution to the arrow of time. I found myself concurring with Carroll on this. It sounds plausible, if not testable.
Don't worry about all of the concepts introduced in this review. Carroll thoroughly explains and elaborates on these topics in the process of making them understandable.
If you want a good summary of the contents of the book, I suggest you use Amazon's "Look Inside" feature, and scroll down to the table of contents. Under each chapter heading, you will find a brief description of the chapter. This gives you a pretty good idea of what is being discussed.
- Reviewed in the United States on February 16, 2010. This book is an overview of the time symmetry of most physics and of the reality we live in, where time seems to evolve in one direction. Sean Carroll is a world-renowned physicist, so the approach is defined by the implications of our physical laws themselves rather than by a philosophical perspective based on our subjective interpretation of time. Most of the book focuses on time from the perspective of thermodynamics and the second law in particular (entropy is expected to increase through time), though relativistic time and its similarity to space are discussed, as are modern theories of the origin of the universe that try to avoid assuming the problem away.
Let me try to talk briefly about the topics the author explores. The arrow of time is not specifically a part of classical physics (Newtonian mechanics and electromagnetism), and this is confusing because, to us, time clearly only moves forward, not back. The relativistic aspects of closed timelike curves and wormholes are addressed briefly as ideas in relativity that bear on time's direction, but this isn't the focus. The author approaches the direction of time as a correspondence between entropy's strict march higher and our experience of the irreversibility of time. The ideas justifying an increase in entropy are well discussed, including the exponential growth in the number of states in configuration space. This is against the backdrop of a static universe. Poincaré's recurrence theorem in dynamical systems is brought up to describe things like the eventual recurrence of low-entropy states over time, and Boltzmann's retorts, which amount to assuming away the issues, are then included. The book then discusses the change from a static universe, in which time has no beginning or end, to one which has a beginning, and how this avoids recurrence by selecting preferred initial boundary conditions of low entropy; the author then gets into how this too is unsatisfying, as it assumes the problem away again. Quantum ideas are presented; the asymmetry of the collapse of the wave function is brought up but not taken anywhere. On a side note, I still have no clarity on how a closed timelike curve can exist in a world with quantum mechanics (excluding a multiverse scenario), as I would think that implies there is no probability which can change an event in spacetime's trajectory, and the author doesn't discuss that at all. Quantum gravity is discussed at a high level and is presented as the theory which eventually will illuminate the subject, though the huge fuzziness of the subject isn't really very encouraging. The book concludes with some modern theories and directions in physics which might give consistent frameworks for worlds with strictly increasing entropy that evolve into our visible universe, though it is careful to admit that this is all really speculation.
This is a complicated book. One can probably gloss over a lot of the content and still get something out of it, but most of the contents of this book are based on a lot of deep thinking by academic minds over centuries. I for one definitely have not come through this book with any stronger feeling about the nature of time, though I now have a better understanding of entropy and information theory. I also think the clearest writing on relativistic time is in this book, and it takes up only a small portion of the space. This is not a light read; if you make it such, you probably will miss a lot of what the author is trying to communicate. I'm sure I missed a lot of the subtleties even though I was trying to concentrate while reading. I did not come out any clearer on: why do we remember the past? The author often makes statements about having addressed it as a result of entropy, but I really don't find a rigorous argument in this book that convinces. The state of entropy and its direction affects the distribution of events in a probabilistic world; it doesn't imply determinism, whereas the arrow of time gives us what looks like a deterministic past. The entropy of the universe now being higher than it was 100 years ago is not, by itself, a reason why we have a memory of the past and literature from the past. The specific reason why we have a filtration of measurable sets, bounded by time, is not convincingly shown to be a result of increasing entropy. If it were, then I wish the author had spent more time on the arguments. This book is mainly about physics, how time fits in, what time's implications are for physics, and then the interpretation of that physics. It is a subject for which there will likely never be a final say. This is a bold introduction to a lot of modern ideas, but don't read it lightly, as its value is in the depth of the ideas presented.
Top reviews from other countries
- Juan L. Gomez-Perales, Reviewed in Canada on December 14, 2021
5.0 out of 5 stars Excellent read
Not for the beginner as it is written at a fairly high level. For me it is perfect and one of the best I have read in a while. This was my second book from this author and both were exceptional.
- André Gargoura, Reviewed in France on December 16, 2022
5.0 out of 5 stars In search of lost time... And entropy!
A thrilling excursion to some of the most fascinating -- though yet unsolved -- issues in modern physics and cosmology, already hinted at in Kant's antinomies, among others...
The road to full understanding of those perplexing themes is still long and difficult, but Carroll's mastery, enthusiasm, and optimism constantly drive the reader toward hope, all along.
So, embark safely!
- Cliente Kindle, Reviewed in Italy on November 6, 2019
5.0 out of 5 stars The importance and the mystery of entropy
The best book I have ever read on the problem of time (and I have read several!). It focuses above all on the law of entropy, explaining the concept and its different meanings very well. A must-read for anyone passionate about the biggest and deepest problems of the cosmos and of life.
- Zlatko smole, Reviewed in Germany on August 28, 2019
3.0 out of 5 stars Font is way too small
The book itself is OK, but the font - I have perfect sight, tested this year, and this is insane. I need a loupe. It's basically unreadable for me. The photo attached is of the appendix part (really relevant), but the main font is not much bigger. For what - to save 10 more pages that a bigger font would add?
- Midhun Jose, Reviewed in India on August 10, 2017
5.0 out of 5 stars A good book for those who want to understand what time is
This book is a great start if you are trying to understand the cosmology of time. I must agree that one read is not enough to understand the concepts discussed in this book. However, the author is successful in explaining rather complicated scientific concepts in layman's terms.