In 1894 Pierre Curie announced what has come to be known as Curie's Principle: the asymmetry of effects must be found in their causes. In the same publication Curie discussed a key feature of what later came to be known as spontaneous symmetry breaking: the phenomena generally do not exhibit the symmetries of the laws that govern them. Philosophers have long been interested in the meaning and status of Curie's Principle. Only comparatively recently have they begun to delve into the mysteries of spontaneous symmetry breaking. The features of spontaneous symmetry breaking that are peculiar to quantum field theory have received scant attention in the philosophical literature. The present paper aims to advance the discussion of both of these twin topics by tracing their interaction in classical physics, ordinary quantum mechanics, and quantum field theory.

The present paper attempts to show that a 1915 article by Erich Kretschmann must be credited not only for being the source of Einstein's point-coincidence remark, but also for having anticipated the main lines of the logical-empiricist interpretation of general relativity. Whereas Kretschmann was inspired by the work of Mach and Poincaré, Einstein inserted Kretschmann's point-coincidence parlance into the context of Ricci and Levi-Civita's absolute differential calculus. While Einstein had taken nothing from Kretschmann but the expression "point-coincidences", the logical empiricists instinctively dragged along with it the entire apparatus of Kretschmann's conventionalism. Kretschmann himself realized this and turned the point-coincidence argument against Einstein in his second paper. Disappointingly, in their interpretation of general relativity, the logical empiricists unwittingly replicated some epistemological remarks Kretschmann had written before general relativity even existed.
In a number of publications, John Earman has advocated a tertium quid to the usual dichotomy between substantivalism and relationism concerning the nature of spacetime. The idea is that the structure common to the members of an equivalence class of substantival models is captured by a Leibniz algebra, which can then be taken to directly characterize the intrinsic reality only indirectly represented by the substantival models. An alleged virtue of this is that, while a substantival interpretation of spacetime theories falls prey to radical local indeterminism, the Leibniz algebras do not. I argue that the program of Leibniz algebras is subject to radical local indeterminism to the same extent as substantivalism. In fact, for the category of topological spaces of interest in spacetime physics, the program is equivalent to the original spacetime approach. Moreover, the motivation for the program, that isomorphic substantival models should be regarded as representing the same physical situation, is misguided.

A sizeable literature is based on the claim that Maxwell's demon must fail to produce violations of the second law of thermodynamics because of an inevitable entropy cost associated with certain types of information processing. In the second edition of their standard compilation of work on Maxwell's demon, Leff and Rex (2003, p. xii) note that more references have been generated in the 13 years since the volume's first edition than in all years prior to it, extending back over the demon's 120 years of life. Landauer's principle is the loosely formulated notion that the erasure of n bits of information must always incur a cost of k ln n in thermodynamic entropy. It can be formulated as a precise result in statistical mechanics, but only for a restricted class of erasure processes that use a thermodynamically irreversible phase space expansion, which is the real origin of the law's entropy cost and whose necessity has not been demonstrated. General arguments that purport to establish the unconditional validity of the law (erasure maps many physical states to one; erasure compresses the phase space) fail. They turn out to depend on the illicit formation of a canonical ensemble from memory devices holding random data. To exorcise Maxwell's demon, one must show that all candidate devices, the ordinary and the extraordinary, must fail to reverse the second law of thermodynamics. The theorizing surrounding Landauer's principle is too fragile and too tied to a few specific examples to support such a general exorcism. Charles Bennett's recent extension of Landauer's principle to the merging of computational paths fails for the same reasons that trouble the original principle.
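For reference, the textbook statement of Landauer's bound (standard physics background, not part of the abstract above) assigns k ln 2 of thermodynamic entropy per erased bit, so that erasing n bits in an environment at temperature T gives:

```latex
% Landauer bound: erasing n bits dissipates at least n k ln 2 of entropy
% into the environment, equivalently at least n k T ln 2 of heat.
\Delta S_{\mathrm{env}} \;\geq\; n\, k \ln 2,
\qquad
Q_{\mathrm{diss}} \;\geq\; n\, k\, T \ln 2 .
```

It is this quantitative claim whose unconditional validity the paper disputes, arguing that the entropy cost arises only in erasure protocols that employ an irreversible phase space expansion.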