Studies in History and Philosophy of Modern Physics 43 (2012) 195–214

The case of the composite Higgs: The model as a "Rosetta stone" in contemporary high-energy physics

Arianna Borrelli
IZWT, Bergische Universität Wuppertal, Gaussstraße 20, D-42097 Wuppertal, Germany
E-mail address: [email protected]

Article history: Received 18 July 2011; Received in revised form 15 January 2012; Accepted 30 April 2012; Available online 20 July 2012. http://dx.doi.org/10.1016/j.shpsb.2012.04.003

Abstract: This paper analyses the practice of model-building "beyond the Standard Model" in contemporary high-energy physics and argues that its epistemic function can be grasped by regarding models as mediating between the phenomenology of the Standard Model and a number of "theoretical cores" of hybrid character, in which mathematical structures are combined with verbal narratives ("stories") and analogies referring back to empirical results in other fields ("empirical references"). Borrowing a metaphor from a physics research paper, model-building is likened to the search for a Rosetta stone, whose significance does not lie in its immediate content, but rather in the chance it offers to glimpse at and manipulate the components of hybrid theoretical constructs. I shall argue that the rise of hybrid theoretical constructs was prompted by the increasing use of nonrigorous mathematical heuristics in high-energy physics. Support for my theses will be offered in the form of a historical–philosophical analysis of the emergence and development of the theoretical core centring on the notion that the Higgs boson is a composite particle. I will follow the heterogeneous elements which would eventually come to form this core from their individual emergence in the 1960s and 1970s, through their collective life as a theoretical core from 1979 until the present day.

Keywords: High-energy physics; Model-building; Composite Higgs; Nonrigorous mathematics

1. Introduction: Model-building "beyond the Standard Model"

The Standard Model of particle physics is a theoretical framework which emerged in the 1970s and has been eminently successful at explaining and predicting a wide range of experimental observations in the field of particle physics. Despite its empirical success, though, the Standard Model is considered by many physicists to be valid only for phenomena involving energies up to the order of a few TeV, beyond which a still unknown "new physics" rules. Even though no indisputable evidence of failures of the Standard Model exists, theorists have been developing alternatives to it for many decades, and the rate of creation of models of physics "beyond the Standard Model" (BSM-physics) is steadily increasing. Theoretical physicists working on alternatives to the Standard Model are often collectively referred to as "model-builders", and indeed their main efforts seem to be directed at creating new, often quite sketchy models rather than at focussing on one of them to develop it further into a full-fledged theory.
The complex, fragmented landscape of models of BSM-physics and the enthusiasm with which model-builders work at expanding it appear somewhat confusing to the external observer. The present paper shall offer an assessment of the practice of model-building in today's high-energy physics and an analysis of the epistemic dynamics of the development of BSM-models. Particular attention shall be devoted to the question of whether and how it may be possible in this context to draw a distinction between "models" and "theories". I shall argue that such a differentiation is not only possible, but even necessary, and that the development of BSM-models can only be understood as a multi-centered process in which a number of long-lived, physically meaningful, but only scantily outlined "theoretical cores" are tentatively instantiated in fully expendable and easily manipulable models. This happens with diverse aims: exploring the phenomenological potential of the theoretical cores; combining two or more of them with each other; developing and/or testing new mathematical tools to extract predictions from the cores; understanding better the formal or physical implications of theoretical hypotheses. In these and other cases, succeeding in building a model with certain desired features provides a nonrigorous proof ("demonstration") of the physical–mathematical validity of the core, even if the model itself is patently unrealistic. I will argue that theoretical cores cannot be understood purely in terms of mathematics because they are constituted by a mixture of (1) mathematical structures, (2) verbal, narrative components ("stories") and (3) analogies referring back to accepted theoretical explanations of experimental results ("empirical references")—a mixture whose elements have come to resonate with each other, giving rise to a stable, resilient and flexible core which guides model-building and is in turn constantly reshaped by it. I will also attempt to show that the combination of mathematical and nonmathematical elements into hybrid theoretical structures is a feature which has characterized theorizing in high-energy physics at least since the 1960s, although it has attained a dominant role only in the post-Standard-Model era. I will argue that today, although the ideal of a "final theory" as a coherent, rigorous mathematical construct still features prominently in the rhetoric of high-energy physics research, theoretical practitioners engaged in speculating about BSM-physics are more interested in exploring the properties and potentials of hybrid theoretical cores than in trying to construct such a final theory. In this sense, I will suggest that today's practices of model-building in high-energy physics have to be understood as a sign that the notion of "theory" as a coherent mathematical construct expressing natural order de facto faces erosion. The rise in the epistemic significance of hybrid theoretical practices can be connected with the increasing use of nonrigorous mathematical methods in theoretical physics in general, and in high-energy physics in particular. The general theses stated above will be illustrated by an analysis of the development of the class of models having at their core the notion of a "composite Higgs boson".
In the next section, I offer a brief overview of the different approaches to BSM-physics developed in contemporary research papers and introduce the questions and tentative answers at the centre of this paper. In Section 3, I will connect these reflections with a first-hand account of the aims and methods of model-building given by a practising model-builder, Lisa Randall. In Section 4, I will then go on to expound in more detail the key notions of the paper ("theoretical cores," "stories," "empirical references"), using as reference points ideas introduced by Margaret Morrison, Mary Morgan, Stephan Hartmann, Isabelle Peschard and Andrew Pickering. Section 5 is devoted to discussing the use and epistemic significance of nonrigorous mathematical heuristics in today's high-energy physics, as these issues play an important role in understanding how the practice of model-building may in some cases be regarded as a nonrigorous form of proof. After having introduced my main theses I will provide some evidence in their support by reconstructing the development of the theoretical core centring on the idea of a "composite Higgs boson". This theoretical core emerged at the end of the 1970s from the combination of a number of elements which had been present in the high-energy physics discourse for a decade or longer. Sections 6–10 will deal with events pre-dating the emergence of the core, with particular attention to the work of Yoichiro Nambu and Giovanni Jona-Lasinio on what would later be termed "spontaneous symmetry breaking". This discussion will also provide evidence of the presence of hybrid theoretical practices in the 1960s and 1970s, and I shall argue that the notion of theoretical core can find application in that context as well. Sections 11–14 will then sketch the rise and further development of the composite Higgs approach from the late 1970s until the present day. Section 15 presents a summary of the results, as well as some tentative speculations on the factors which might be linked to the transformations of theoretical practices discussed in the paper.

2. The "new physics" in contemporary research papers

Models of new physics can be tentatively collected into a number of classes corresponding to different approaches. An exhaustive discussion of these approaches is beyond the aims of this paper, but a brief sketch of some of the most popular ones is necessary to formulate the questions which this study will address. More extensive reviews with bibliographical references can be found in Grojean (2009), Altarelli (2010), and Bustamante, Cieri, and Ellis (2009). Among the most popular avenues to new physics we find:

1. Extending the Standard Model by additional particles. Without changing the essential elements of the Standard Model, it is possible to add a new symmetry or new particles to it. Of course, to explain the fact that these additional particles have not yet been detected, they must be assumed either to have high masses or to be very weakly coupled to known particles, so that the probability of creating them in a high-energy collision is very small.

2. Supersymmetry. In quantum field theory, each particle falls into one of two categories of quantum-statistical behaviour: fermions or bosons. In supersymmetric models the Standard Model is extended so as to make it symmetrical with respect to a transformation changing bosons into fermions and vice versa.
Since, however, no such symmetry is actually observed in nature, it must be assumed that supersymmetry is broken and that all "supersymmetric partners" of known particles have such a high mass that they have not yet been produced in high-energy experiments.

3. Composite Higgs. In the Standard Model, the Higgs boson is an elementary particle, but theorists have been producing models in which it is a composite state. Later on, we shall take a closer look at this class of models.

4. Extra dimensions. Despite clear observational evidence that our space–time is 4-dimensional, physicists have long been experimenting with the idea that one or more as-yet-unremarked dimensions exist. While initially confined to extensions of general relativity and to string theory, this notion was taken up by mainstream high-energy theorists at the end of the 1990s and has since been a valuable model-building technique. There are very many subcategories to this approach, depending for example on the number of extra dimensions or on the way in which they are made "invisible" under everyday conditions.

These four broad classes by no means exhaust the panorama, but will suffice for our present purposes. The first thing to note is that, as already remarked in the introduction, there are no experimental results contradicting the predictions of the Standard Model and favouring one or more of the models of new physics. In fact, a main feature of these models is that, at low energies, they all mimic the phenomenology of the Standard Model. It is only at higher, as-yet-unexplored energies that the "new physics" is expected to appear. In short, from the empirical point of view all BSM-models are at present equally speculative. The second point worth noting is that none of the classes listed above can be associated with some coherent mathematical construct—a "theory" in a traditional sense—from which the various models are derived. For example, there is no "theory of supersymmetry", but only a number of different supersymmetric models (see for example Drees, Godbole, & Roy, 2004). The same applies to the other approaches, and perhaps the most fruitful way of looking at the various "classes" of BSM-models is to consider them as loose clusters whose elements may gravitate around one or more central ideas such as supersymmetry, extra space–time dimensions or the notion of a composite Higgs boson. Before trying to characterize these "central ideas" in more detail it is important to note that, for the most part, they are not mutually exclusive. In fact, recent years have seen an increasing tendency to combine two (or even more) of the overarching approaches in the same model, building for example supersymmetric models with extra space–time dimensions. Moreover, if one looks at the work produced recently by some prominent model-builders, one notices that most of them have not focussed on developing one model, but have rather proposed a large number of variations on one or more themes. The start of the Large Hadron Collider at CERN, which is expected to produce evidence of new physics, has apparently increased the pace of model-building, as jokingly remarked in an overview paper: "The prospect of imminent Higgs discovery concentrates wonderfully the minds of theorists, and many theorists with cold feet are generating alternative models, as prolifically as monkeys on their laptops.
These serve the invaluable purpose of providing benchmarks that can be compared and contrasted with the SM Higgs" (Bustamante et al., 2009, p. 21). Models are here characterized as "benchmarks", a term often used to indicate constructs which are not regarded as serious candidates for a theory, but which allow quantitative predictions giving at least an approximate idea of what phenomena in that general direction of new physics might look like. A further feature that strikes a first-time reader of papers on BSM-models is how the authors attempt to underscore the novelty and originality of their models by giving them inventive, sometimes funny names. In an overview paper on "New Theories at the Fermi Scale" (2009) the theorist Christoph Grojean comments on the number of attributes which have been invented to describe hypothetical alternative means of breaking the electroweak symmetry: "Theorists have always been very good at giving names to things they do not understand. And clearly the EWSB [electroweak symmetry breaking] sector has been an inspirational source of creativity to them, as it is evident by collecting the attributes that have been associated to the Higgs boson over the last few years […]: buried, charming, composite, fat, fermiophobic, gauge, gaugephobic, holographic, intermediate, invisible, leptophilic, little, littlest, lone, phantom, portal, private, slim, simplest, strangephilic, twin, un-, unusual, …" (Grojean, 2009, p. 3; see there also for the references of the research papers where the various terms appear). In general, the attitude of model-builders towards their creations suggests that, in their eyes, what is being built are "only" models to be later modified or discarded without much regard. In short, these models seem to enjoy a lower epistemic status than would in general parlance be associated with a "theory". For comparison, we may look at those high-energy theorists who are today usually regarded as a category distinct from model-builders: string theorists. The majority of string theorists have a very strong commitment to the ontology of strings, and their most prominent representatives do not hesitate to propound strings as the "theory of everything" in the strongest possible terms (Susskind, 2005; Hawking & Mlodinow, 2010). Yet, if in string theory there are only "theories", in model-building there seem to be only "models", and therefore one might be justified in regarding the latter as nothing but theories under another name—perhaps theories which are still incomplete or lack empirical backing, but theories nonetheless. In both cases, the distinction between theories and models would seem to fade. Is it so? I shall argue that this is not the case, and that it is possible to draw a very clear distinction between models and theories of new physics, provided one is ready to give up the idea that theoretical constructs must have a purely mathematical character. Let us now go back to the question of how "central ideas" like supersymmetry or the composite Higgs approach may be characterized. Although they are not coherent mathematical structures, they do contain mathematical elements, such as the supersymmetric transformations turning bosons into fermions and vice versa, or the methods used to make extra dimensions "disappear". However, these elements are only bound into a (more or less) coherent mathematical whole at the level of specific, individual models.
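To give these mathematical elements a concrete face, one may sketch the algebraic fragment at the heart of supersymmetry (a standard textbook relation, reproduced here for illustration and not tied to any specific model): a generator Q maps bosonic states to fermionic ones and vice versa, and two successive supersymmetry transformations compose into a space–time translation,

\[ Q\,|\mathrm{boson}\rangle = |\mathrm{fermion}\rangle, \qquad \{Q_\alpha, \bar{Q}_{\dot\beta}\} = 2\,\sigma^\mu_{\alpha\dot\beta}\,P_\mu . \]

It is this fragment, rather than any complete Lagrangian, that all supersymmetric models share; everything else (the particle content, the breaking mechanism, the mass spectrum) is fixed only at the level of individual models.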
The central ideas also contain elements which are purely qualitative and expressed only in verbal terms, such as the generic notion that supersymmetry must be broken (as opposed to the many specific methods of actually breaking it) or the concept of extra space–time dimensions (as opposed to specifying how many of them and with what kind of metric). Mathematical and nonmathematical elements are only loosely bound together, so that it is possible to build models which combine ideas of different origin and therefore belong to two or more clusters. Despite their very vague character, the central ideas implemented in the models command more respect and commitment from physicists than the models themselves. For example, supersymmetry became widely accepted as one of the most promising approaches to BSM-physics in the 1980s and has since remained one of the prime candidates, despite the fact that long-awaited experimental evidence in its favour has until now failed to turn up (Drees et al., 2004, p. 4). The situation in the case of the composite Higgs approach is similar, as we shall see later on: Many models have come and gone, but the central idea survived. In the end, one cannot help but conclude that although ideas like supersymmetry, a composite Higgs or extra dimensions are not coherent mathematical structures, their elements are seen by physicists as containing valuable clues to the workings of nature. It is to characterize these central ideas and their elements that I shall introduce the notion of a "theoretical core" and its mathematical and non-mathematical components, reserving the term "theory" to indicate coherent mathematical structures which lay claim to explaining and predicting a broad, possibly infinite range of phenomena. Such a clear-cut distinction is valid only to a first approximation, but is necessary to avoid ambiguities in the following discussion. When looking at specific examples we shall however see how the border between "models," "theories," and "theoretical cores" is permeable. For example, I will argue that the Weinberg–Salam model of electroweak interactions emerged as a model implementing a theoretical core, but eventually ended up as a theory in its own right. According to the terminology chosen, though, the Standard Model, the BCS theory of superconductivity and general relativity are today examples of existing theories, while the "theory of everything" of which some theoretical physicists speak is a (still) nonexisting one ('t Hooft et al., 2005). BSM-models, on the other hand, although they are usually coherent, if sketchy, mathematical constructs, are not regarded by physicists as starting points for a theory to explain and predict phenomena, but rather as expendable tools to explore the potentialities of theoretical cores. For example, the minimal supersymmetric standard model (MSSM) is a quite complex structure employed by theorists as a convenient means to formulate predictions and test hypotheses about supersymmetry (a "benchmark"), but is today hardly regarded as a candidate for the final theory (Drees et al., 2004, pp. 251–252).

3. Model-building as a search for "nouns and phrases"

In the previous section, I made some claims about the practice of model-building in the search for BSM-physics, and these
reflections will now be compared with an account of this practice as described by one of its most prominent contemporary practitioners: Lisa Randall, a theoretical physicist from Harvard University best known for the BSM-model she proposed together with Raman Sundrum, a model in which space–time is assumed to be 5-dimensional and to possess a "warped" metric (Randall & Sundrum, 1999). In her book "Warped Passages", an account of her research addressed to a broad audience, Randall endeavours to convey to the reader both her enthusiasm for and open questions about the notion of extra dimensions (Randall, 2006). She also offers an open and demystifying description of the practice of model-building which can provide us with some valuable insights for the following discussion: "Particle physics models are guesses at alternative physical theories that might underlie the Standard Model. If you think of a unified theory as the summit of a mountain, model builders are trailblazers who are trying to find the path that connects the solid ground below, consisting of well-established physical theories, to the peak—the path that will ultimately tie new ideas together. […] Models that go beyond the Standard Model incorporate its ingredients and mimic its consequences at energies that have already been explored, but they also contain new forces, new particles and new interactions that can be seen only at shorter distances. […] Model builders look at the unresolved aspects of the Standard Model and try to use known theoretical ingredients to address its inadequacies. […] Model builders try to see the big picture so they can find the pieces that could be relevant to our world. […] Only some models will prove correct, but models are the best way to investigate possibilities and build up a reservoir of compelling ingredients" (Randall, 2006, pp. 70–73). In this passage the search for an ideal "unified theory" features prominently as the ultimate motivation for model-builders' work, yet model-builders are not directly looking for such a theory, but rather search for a "path" to it and for "theoretical ingredients", "pieces that could be relevant to our world" to be put in a "reservoir". I would like to argue that this intermediate level between the (many) models and the (one) theory corresponds to the "theoretical cores" introduced in Section 2. Randall's distinction between a "unified theory" on the one side and the collection of ingredients in a "reservoir" on the other may be seen as corresponding to the difference between a mathematically coherent proposal for a final theory and the central idea(s) shared by the models in a cluster. As I have shown in the previous section, supersymmetry and other approaches to BSM-physics are loose collections of elements of different nature and are indeed similar to "reservoirs" of promising ingredients for a future theory. At the same time, as Randall notes, "only some models will prove correct, but models are the best way to investigate possibilities and build up a reservoir of compelling ingredients," so that the relation between models and "reservoirs" appears to be one of mutual feedback: Models are built from reservoir ingredients, but also serve to enlarge the reservoir. For example, supersymmetric transformations can be implemented in many different models in order to explore new methods of supersymmetry breaking to be collected.
The models' primary aim is not to become the seed of a unified theory, and in this point Randall sees the main difference between model-builders and string theorists: The latter, she claims, are actually aiming directly at building a unified theory. To express the difference Randall makes use of a metaphor based on the age-old notion of the "language of nature": "You might say that we are all searching for the language of the universe. But whereas string theorists focus on the inner logic of the grammar, model-builders focus on the nouns and phrases that they think are most useful. If particle physicists were in Florence learning Italian, the model builders would know how to ask for lodging and acquire the vocabulary that would be essential to finding their way around, but they might talk funny and never fully comprehend the 'Inferno'" (Randall, 2006, p. 73). Randall is here using the language analogy to indicate mathematical structures of different complexity and various degrees of interconnectedness: While the "grammar" of string theory imposes strong constraints on its practitioners, model-builders are free to collect scattered mathematical "nouns and phrases" to be later variously linked with each other—for example supersymmetric transformations or extra-dimensional metrics. As I have argued above, though, theoretical cores also contain nonmathematical elements and these, too, should in my opinion be regarded as "nouns and phrases" in Randall's sense—although this was probably not what she had in mind when using this expression. I shall presently say more about the characteristics of the nonmathematical elements of theoretical cores, and I will argue that among them verbal narratives constitute an essential component, so that it may not be pure chance that Randall chose a linguistic metaphor to express her thoughts. Indeed, as we shall see, deciphering the book of nature has also been likened by theorists to the search for a "Rosetta stone." Randall's exposition of the results of BSM-research has the same rhapsodic character as model-building practices: After having summarized the development of physics in the twentieth century up to the creation of the Standard Model, she devotes the remaining chapters of the book to the various "nouns and phrases" which were collected in the last decades: supersymmetry (chapter 13), strings (chapter 14), "branes" (i.e., multidimensional membranes in extra-dimensional space–time, chapter 15), and finally a number of alternative "proposals for extra-dimensional universes" like "sequestering" (the localization of some particles on a brane, chapter 17), "Kaluza–Klein particles" (4-dimensional manifestations of particles existing in extra dimensions, chapter 18), "large" extra dimensions (i.e., of the order of a millimeter, chapter 19), and of course the two different models of "warped" extra dimensions proposed by herself together with Sundrum (chapters 20 and 22).
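For orientation, two of these "nouns and phrases" can be written down in standard textbook form (my illustration, not taken from Randall's book): a field living in one extra dimension compactified on a circle of radius R appears in four dimensions as a tower of Kaluza–Klein particles with masses

\[ m_n \approx \frac{n}{R}, \qquad n = 1, 2, 3, \ldots \]

while the Randall–Sundrum proposal replaces flat space by the warped 5-dimensional metric

\[ ds^2 = e^{-2k|y|}\,\eta_{\mu\nu}\,dx^\mu dx^\nu + dy^2 , \]

where y is the coordinate of the extra dimension and k the curvature scale. Both formulas are exactly the kind of detachable mathematical fragment that can be carried from one model to another.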
Beginning each section with a short science-fiction-like narrative inspired by the notion to be discussed, Randall expounds the pros and cons of the various approaches and, although she makes her preferences clear, she does not attempt to present any one of them as the best candidate for a "unified theory," leaving the decision up to experiment and appropriately naming the last chapter of the book "(In) Conclusions." Randall clearly sees a contrast between the practices of string theory and those of model-building and seems to imply that string theorists have more interest in mathematically coherent formulations, as opposed to the fragmentary creations of model-builders. Indeed, string theorists' interest in complex mathematics—and their lack of interest in experiment—has often been noted in the physics community. An analysis of the practices and values of string theorists has been given by Galison (2004), who has convincingly argued that in the last decades string theory has effected a "shift in values in physical research away from the constant interplay between theory and experiment" and a "realignment of theory towards mathematics" (Galison, 2004, pp. 24–25). However, Galison's study has also shown that the string theorists' approach to mathematics radically differs from that of mathematicians, since string theorists—and physicists in general—make use of nonrigorous methods, following their "hunches" and "intuitions" rather than accepted standards of mathematical proof (Galison, 2004, pp. 35, 53). In short, what Randall calls the "grammar" of string theory is certainly more refined than the "nouns and phrases" of model-building, but it is usually no rigorous mathematical structure either. Significantly, Galison has characterized the practices of string theorists as a "hybrid" of physics and mathematics (Galison, 2004, pp. 25, 53, 60) and, as we shall see in Section 5, the use of nonrigorous mathematical heuristics on the part of theoretical physicists is a key element not only in string theory but also in model-building. In this sense, some of the reflections of the present paper may also apply to string theory, yet the present investigation is concerned only with model-building, and I will attempt no assessment of whether the notion of theoretical core may also be applied in the context of string theory.

4. Theoretical cores and their mathematical and nonmathematical components

To offer a more detailed characterization of models, theoretical cores, and their mutual relationship, I will take as a starting point the analysis of Margaret Morrison and Mary Morgan, where "models" are seen as "both a means and a source of knowledge", fulfilling a mediating function between theory and experiment and representing an autonomous method of production of scientific knowledge (Morrison & Morgan, 1999, quote on p. 35). I shall argue that the two levels between which BSM-models mediate are, on the experiment side, particle phenomena as represented by the Standard Model and, on the theory side, theoretical cores, i.e. composite constructs which only display partial mathematical character.
Although the modus operandi of model-builders may formally still be motivated by the idea of a final theory, so little actual commitment to the feasibility of that aim is present in the practice of model-building that one might argue that these practices represent an implicit shift in the attitude to what "scientific knowledge" is taken to be—a shift in which the notion of theory faces erosion, at least in the traditional form paradigmatically represented by classical mechanics, electromagnetism, relativity or quantum mechanics. In other words, I would like to claim that—de facto if not de jure—in the practice of model-building the notion of theoretical core has replaced the idea of theory. However, this transformation does not in any way mean a shift toward a lesser commitment to some form of realism on the part of (theoretical) physicists. As remarked in the previous sections, theoretical cores contain elements of nonmathematical character, and these are of two kinds: on the one side narratives ("stories") and, on the other, analogical references to empirical results which are not in themselves relevant for the core, but provide evidence that similar theoretical approaches have been successful in other fields ("empirical references"). While stories are expressed verbally, empirical references may be formulated using not only words, but also images, diagrams or mathematical formulas. Let us first turn to the narrative components: I employ the term "story" to indicate them, as I believe that they can indeed be so described according to Stephan Hartmann's terminology (Hartmann, 1999). In a study on phenomenological models, Hartmann argued that rival models may enhance their appeal by means of a good story: "A story is a narrative told around the formalism of the model. It is neither a deductive consequence of the model nor of the underlying theory. It is, however, inspired by the underlying theory (if there is one). This is because the story takes advantage of the vocabulary of the theory (such as 'gluon') and refers to some of its features (such as its complicated vacuum structure). Using more general terms, the story fits the model in a larger framework (a 'world picture') in a non-deductive way. A story is, therefore, an integral part of a model; it complements the formalism. To put it in a slogan: a model is an (interpreted) formalism + a story" (Hartmann, 1999, p. 344, italics in the original). Although it was developed for phenomenological models, the notion of "story" can be very fruitful also when applied at the "theory" level of Morgan and Morrison's scheme: We may conceive a theoretical construct as composed both of mathematical elements and of stories complementing them. Out of the interplay of these two factors, a theoretical construct emerges which is not a full-fledged mathematical theory, but rather a theoretical core. The term "theoretical core" was suggested in 2007 by Margaret Morrison, in a paper asking "Where have all the theories gone?" and arguing that a more flexible notion of theory should be adopted when studying the reality of scientific practice (Morrison, 2007). Morrison stated that, after focussing mostly on the epistemological role of models, "The time has come to bring theories back into the picture and attempt a reconstruction of the relation between models and theories that emphasizes a distinct role for each" (Morrison, 2007, p. 196).
Morrison proposed to differentiate between models and theories using "the notion of a theoretical core: a set of fundamental assumptions that constitute the basic content of the theory, as in the case of Newton's three laws and universal gravitation. This core constrains not only the behaviour but the representation of objects governed by the theory as well as the construction of models of the theory" (Morrison, 2007, pp. 197–198). According to her view, this characterization avoids the problem of having to include in a theory all the mathematical techniques which may be required to expound or apply it, and "nothing about this way of identifying theories requires that they be formalized or axiomatized" (Morrison, 2007, p. 205). Using the theory of superconductivity as an example, Morrison stated that "it refers to a well-defined core that involves the notion of pairing and a connection with the broader theoretical principles of spontaneous symmetry breaking, both of which constrain the way superconducting phenomena can be represented" (Morrison, 2007, p. 226). While Morrison only suggested as an aside that stories may provide a relevant element of theoretical cores (Morrison, 2007, pp. 220, 224), Isabelle Peschard focussed precisely on this aspect and introduced "the notion of a theoretical story as a resource and source of constraint for the construction of models of phenomena," aiming to "show the relevance of this notion for a better understanding of the role and nature of values in scientific activity" (Peschard, 2007, p. 151). Peschard argued that conceiving of theories (also) as stories allows one to characterize the theoretical coherence of the models connected to them without depending on the existence of mathematical principles, and also to account for "the way in which a theoretical framework guides and constrains the construction of models" (Peschard, 2007, p. 156). Her conclusion is that theoretical stories represent cognitive values: "The theoretical story guides and constrains the construction of models by prescribing what should be accounted for, what should be taken into account, what are the things that are relevant to the understanding of a domain of phenomena, what features they have, what kind of behaviour these things can have or kind of relations there can be between the features of the different things that are involved" (Peschard, 2007, p. 166). The cases studied by Peschard are very different from those which are the subject of the present work, and my study is no rigorous application of her approach. However, I find her ideas highly inspiring, and in the following pages I shall underscore how the stories which came to be associated with the composite Higgs approach played a key role in defining what features the models of a class should possess, and how stories contribute to unify the many different models despite the lack of a rigorously defined common mathematical element. Besides the mathematical and narrative components there is a third element which is essential for the coming-to-be of a theoretical core in today's high-energy physics: the reference to observed phenomena whose generally accepted explanation can be interpreted in analogy to the ideas characterizing the core in question.
I will refer to this kind of element as an "empirical reference." To introduce this notion, let us again consider the case of supersymmetry: Its theoretical core contains the idea that particle interactions are approximately invariant with respect to supersymmetric transformations and that therefore the "superpartners" of all known particles await discovery at higher energies. Since the mechanism of supersymmetry-breaking is unknown, the masses of these hypothetical particles cannot be exactly predicted, yet their interaction properties are for the most part determined by those of Standard Model particles. This picture combines the mathematics of supersymmetry with the story of its breakdown and of the existence of a supersymmetric world awaiting discovery. Its plausibility is supported by the knowledge that, since the 1960s at the latest, identifying groups of particles as quasi-multiplets of a broken symmetry has led to successful predictions of "missing" members of such multiplets and eventually to a deeper understanding of elementary interactions. The phenomenological successes of the "broken symmetry" idea do not imply that this approach will work also in the case of supersymmetry, but they help such a scenario appear more plausible. In this sense they constitute an important "empirical reference" for the theoretical core of supersymmetry. The importance of analogical references to previous empirical successes had already been discussed by Pickering (1981), who used Thomas Kuhn's remarks on "exemplars" as a starting point: "Kuhn used the term 'exemplar' as a shorthand for 'shared example'. He emphasized that such a shared example derives from the concrete demonstration in some practical situation of the utility of a cultural product—a new experimental technique, a new theoretical model, or whatever—and that it is precisely through such demonstrations that new concepts are related to the natural world and acquire meaning […] For the purposes of the present work, an exemplar can be regarded as an analogy drawn between some new aspect of the subject matter of the field and some other field of discourse" (Pickering, 1981, p. 108, italics in the original). Pickering expounded his idea using an example from the high-energy physics of the 1970s: the choice between "colour" and "charm" as an explanation for an observed phenomenon, i.e., the J/psi resonance. It would take us too far to summarize Pickering's historical case study here, but it is worth mentioning his conclusions: Around 1975, neither the "colour" nor the "charm" hypothesis was clearly favoured by experimental evidence, but the latter idea was supported in the eyes of most physicists by the fact that the J/psi could be represented as a bound state of a charmed quark with its antiquark by using a simple mathematical model ("charmonium") analogous to the one employed in nonrelativistic quantum mechanics: "In particular, the charmonium model depended upon an image of a charmed quark non-relativistically orbiting its antiparticle under the influence of a central potential, in exact analogy with the atomic system of an electron orbiting a positron (the antiparticle of the electron). The latter system is known as positronium, and is one of the textbook applications of quantum mechanics to atomic physics, being almost identical to the hydrogen atom in its formal treatment.
The name charmonium was coined to make explicit the parallel between the envisaged structure of the new particles and the simplest and best understood atomic structure […] The charmonium model illustrates perfectly the role of an exemplar as the concrete embodiment of an analogy relating a new field of research back to an established body of practice. The model was intuitively transparent to any trained physicist, and this had two important consequences. Firstly, whenever the model encountered a mismatch with reality the resources were available to essentially anyone to attempt to fix it up […] Another source was less direct but more fundamental: namely, the charmonium model made quarks themselves 'real'" (Pickering, 1981, pp. 124–125). Although I do not wish to fully identify "empirical references" with Pickering's "exemplars," which have a broader scope, I believe that in both cases analogies "referring back" to established knowledge play a key role in promoting new theoretical notions in the physics community, as the analogies suggest "old" ways of dealing with "new" concepts and make the latter appear more intelligible. Moreover, in the case of charmonium we also find another element which, as we shall see, has an important function in supporting theoretical cores: the suggestive choice of the names of their elements. Finally, like charmonium, empirical references often involve both mathematical structures and stories, but their function within the core is primarily that of offering a physically certified, analogical template for the physical significance of the speculative ideas of the core. Therefore, an empirical reference has to be regarded as a whole, and as such it plays a role in supporting the cognitive values of stories. As we shall see in detail later on, the explanation of the pion as a bound state of a quark and an antiquark provided a very important template for conceiving the Higgs boson as a composite state of two hypothetical elementary fermions. To sum up, I would like to suggest that the various approaches to new physics should be regarded as theoretical cores characterized by a combination of mathematics, empirical references and stories. The notion of a theoretical core shall allow us not only to draw a distinction between the level of theories and that of models, but also to understand how the practice of model-building contributes to the development of theoretical structures in high-energy physics. Models function as mediating instances between theoretical cores and the Standard Model, which in turn represents well-established experimental results. Models are built as explorative tools working at different levels: to extract from a core quantitative estimates about observable parameters; to understand whether a mathematical method leads to physically relevant results when employed under the guidelines of a given story; or to test whether and how elements from one core can be implemented or combined with those of another. A word of caution has to be added at this point: The distinction between "mathematical", "verbal" and "empirical" elements of a theoretical core is a powerful heuristic tool, but it is not to be understood in the sense that a core can be taken apart into three groups of components. Quite the contrary: The theoretical core only emerges from the interplay of all three elements. Despite their hybrid character, theoretical cores are coherent constructs commanding respect and commitment on the part of physicists.
This fact hardly comes as a surprise to those familiar with today's theoretical practices in high-energy physics and cosmology, where the use of nonrigorous mathematical methods is the rule rather than the exception. The next section shall be devoted to discussing some testimonies by theoretical physicists and mathematicians on this point, as well as on the various non-mathematical elements which are constitutive ingredients of physical theorizing. Understanding the kind of practice in which model-building is embedded makes it possible to grasp how a hybrid theoretical core may play a role in many ways equivalent to that of a mathematically well-defined theory. Moreover, it is only by taking into account nonrigorous practices that the epistemic function of the models being built can be understood: Their value lies primarily in the fact that they can deliver nonrigorous, but convincing "proofs" of the validity of hypotheses concerning theoretical cores. This point will be the subject of the next section.

5. Nonrigorous mathematical methods and the probative value of models

According to what we have discussed in the previous sections, models of BSM-physics could in a way be thought of as toy models, to be set aside soon after they have served their immediate aim. However, it would be wrong to believe that model-builders consider their creations as "toys" in the sense of constructions helpful only for an approximate assessment of a situation, or to be used for didactic purposes: As far as their specific explorative task is concerned, models represent the most serious instance of proof available to researchers. To understand better how models may come to be regarded as having probative value with respect to hybrid theoretical cores, we have to take a closer look at the employment of nonrigorous mathematical methods in theoretical physics. In the construction of physical knowledge there is a long tradition of using mathematical methods which are considered nonrigorous by the standards of the time, both in defining expressions and in proving their validity (Grosholz, 2007). Prominent examples can be found in celestial mechanics (George William Hill's variational methods), electromagnetism (Oliver Heaviside's operational calculus) and quantum theory (infinite matrices, the δ-function) (Bernkopf, 1968, pp. 318–320; Borrelli, 2010; Daley, 2003; Lützen, 1979; Peters, 2011). In many—yet not all—cases physically significant results which were obtained non-rigorously could eventually be reformulated in mathematically acceptable terms, and these successes have bolstered the physicists' rather lax attitude to mathematical proof: "Mathematicians are far more interested in finding rigorous proofs, whereas physicists, who use mathematics as a tool, are usually happy with a convincing argument for the truth of a mathematical statement, even if that argument is not actually a proof. The result is that physicists, operating under less stringent constraints, often discover fascinating mathematical phenomena long before mathematicians do" (Gowers, 2008, p. 7). In fact, one may add that theorists often have little choice but to accept mathematical risks if they want to pursue their physics interests.
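The δ-function offers perhaps the simplest illustration of this point (my example, chosen for brevity): Dirac introduced an object defined solely by its behaviour under the integral sign,

\[ \int_{-\infty}^{+\infty} \delta(x-a)\, f(x)\, dx = f(a), \qquad \delta(x) = 0 \ \text{for} \ x \neq a , \]

a pair of requirements which no ordinary function can satisfy. Physicists nonetheless used the δ-function productively for decades before Laurent Schwartz's theory of distributions supplied a rigorous formulation, a paradigmatic case of a nonrigorous practice that was eventually rewarded by rigorization.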
While it is not my intention to discuss here (or elsewhere) normative issues regarding the use of mathematics in physics, I wish to note how physicists, although they do not in general disregard mathematical rigor and apply it whenever possible, are also at times ready to employ ill-defined expressions or to base their work on "folk theorems," which in the words of Steven Weinberg are "things that have never been proven, but are known to be true" (Weinberg, 1985, p. 125; Schweber, 1993, p. 162). Weinberg attributes this expression to Arthur Wightman, the mathematical physicist who devoted himself to the development of a rigorous version of quantum field theory, yet mathematicians use this term rather to indicate theorems whose proof has not yet been published, and it was in this sense that Wightman for example spoke of a "folk lemma" (Wightman, 1973, p. 473). Weinberg was instead largely responsible for the diffusion of the term among theoretical physicists to indicate statements that the community regards as true, even though no proofs are known for them: "These theorems can probably be formulated in precise terms, and proved, though I haven't done it", wrote Weinberg of two folk theorems (Weinberg, 1985, p. 125; see also Weinberg, 1997). This attitude is ambiguous, since the existence of a proof for folk theorems is not doubted, and is indeed regarded as a necessary requirement for the validity of the physical theories built upon them, while at the same time the grounds on which the conviction of such existence rests remain obscure. The use of folk theorems and of nonrigorous heuristics has become increasingly common among high-energy physicists, and is generally regarded by them as an unproblematic issue to be either glossed over or underpinned with unconditional statements of belief in the truth-content of science such as the one quoted above. However, by the 1990s at the latest, such attitudes to mathematical rigor had started spreading also among mathematicians: As discussed by Galison (2004), one of the main factors in this development was the rise of string theory, whose practitioners combined the nonrigorous methods of theoretical physics (e.g., path integrals, Feynman diagrams) with physical arguments in the search for new mathematical structures which they believed would provide a key to understanding nature. As we saw in Section 3, Galison has underscored how the development of string theory created a new "border region" between physics and mathematics characterized by hybrid research objects: "With the new sense of theoretical physicist and geometer came also a new object of inquiry: in its present form not quite mathematical and not quite physical, either. One day pieces of such entities may be folded back into physics or into geometry, but at the century's end they were conceptual objects, hugely productive and yet seen with discomfort by purists in both camps." (Galison, 2004, p. 60, italics in the original). Galison explained at length how the new hybrid conceptual objects were perceived by "purists" as differing both from mathematical structures and from empirically-based physical notions, but he did not discuss in detail whether they were expressed only through nonrigorous mathematical formalisms, or also in fully nonmathematical (e.g., verbal or visual) terms. I would like to suggest that these string-theory hybrids are closely related to—although not identical with—theoretical cores of BSM-physics.
Both kinds of theoretical constructs are born out of the humus of nonrigorous mathematical practices which has come to play an increasingly central role in the whole of high-energy physics. Under these premises, to shed some light on the practice of model-building it is important to learn more about the epistemic status of nonrigorous heuristics among theorists and mathematicians. This issue is rarely if ever explicitly addressed within the context of model-building, but we can learn something about it thanks to a dispute between mathematicians and theoretical physicists whose starting point was a paper written in 1993 by Arthur Jaffe, a former PhD student of Wightman who, like his advisor, had devoted much effort to the development of a rigorous version of quantum field theory, and Frank Quinn, a topologist (Jaffe & Quinn, 1993). The paper was published in the Bulletin of the American Mathematical Society with the title "Theoretical mathematics: towards a cultural synthesis of mathematics and theoretical physics". As the place of publication shows, the article was directed at mathematicians and mathematical physicists, and was prompted by the spread in the mathematical community of the non-rigorous practices of "proof" common among theoretical physicists and computer scientists. The authors stated their views clearly in the abstract: "Is speculative mathematics dangerous? Recent interactions between physics and mathematics pose the question with some force: Traditional mathematical norms discourage speculation, but it is the fabric of theoretical physics. In practice there can be benefits, but there can also be unpleasant and destructive consequences. Serious caution is required, and the issue should be considered before, rather than after, obvious damage occurs. With the hazards carefully in mind, we propose a framework that should allow a healthy and positive role for speculation" (Jaffe & Quinn, 1993, p. 1). Jaffe and Quinn advocated a clear-cut, explicit separation between "rigorous" and "theoretical" mathematics, where the latter was to be understood as encompassing both the conjectures and folk theorems of theoretical physicists and the numerical demonstrations of hypotheses practised by computer scientists. They shared Weinberg's (and indeed most physicists') view that the ultimate aim of theoretical physics research is the construction of theories of rigorous mathematical character. However, unlike Weinberg, they were not ready to belittle or even ignore the gap between rigorous ideals and nonrigorous research practices. The editors of the Bulletin invited a large number of mathematicians and physicists to respond to the paper, and their answers were published in a subsequent issue of the journal, together with a reply by Jaffe and Quinn. Those texts addressed all possible facets of the matter: mathematical, physical, historical, epistemological, sociological and institutional (Atiyah et al., 1994; Jaffe & Quinn, 1994; Thurston, 1994). Galison has discussed some of these texts with particular attention to the interplay of different sets of "values"; here I will instead focus on some passages shedding light on the hybrid character of the constructs with which theorists and some mathematicians work, and on the ambiguous attitude of physicists with respect to rigorous proofs and their nonrigorous surrogates. William Thurston—a topologist and computer scientist whose conjectural "geometrization theorem" Jaffe and Quinn had used as a "cautionary tale" against the use of nonrigorous methods—wrote not just a letter, but a whole paper "On proof and progress in mathematics," where he discussed in depth and detail the issue of communication strategies within mathematics (e.g., verbal, visual, kinaesthetic, logical and more) and their decisive role in shaping the notion of validity of mathematical statements. He related his own experience of difficulties in communicating results to different mathematical communities and recounted how he had come to recognize that "mathematical knowledge and understanding were embedded in the minds and in the social fabric of the community of people thinking about a particular topic. This knowledge was supported by written documents, but the written documents were not really primary" (Thurston, 1994, p. 169). He also claimed that "Most mathematicians adhere to foundational principles that are known to be polite fictions," quoting the well-ordering of the real numbers as an example, and problematized the notion of "proof," opposing "formal proofs" to human understanding of mathematical validity: "For the present, formal proofs are out of reach and mostly irrelevant: we have good human processes for checking mathematical validity" (Thurston, 1994, pp. 170–171). Thurston characterized formal proofs as "mostly irrelevant," but did not in principle doubt their existence (they were just "out of reach"): This is the same ambiguous attitude we already encountered in Weinberg's statements because, as underscored by Jaffe and Quinn, one can never be sure that a formal proof exists until it has been found. Morris Hirsch, a topologist who had been Thurston's PhD advisor, also made some remarks which are very relevant for our subject, stating: "The nonrigorous use of mathematics by scientists, engineers, applied mathematicians and others, out of which rigorous mathematics sometimes develops, is in fact more complex than simple speculation. While sloppy proofs are all too common, deliberate presentation of unproven results as correct is fortunately rare. Much more frequent is the use of mathematics for narrative purposes [italics in the original]. An author with a story to tell feels it can be expressed most clearly in mathematical language. In order to tell it coherently without the possibly infinite delay rigor might require, the author introduces certain assumptions, speculations and leaps of faith […] In such cases it is often irrelevant whether the mathematics can be rigorized, because the author's goal is to persuade the reader of the plausibility or relevance of a certain view about how some real world system behaves" (Atiyah et al., 1994, p. 186). This passage can be read as describing those aspects of the theoretical practices which are the subject of the present study: In the theoretical cores of BSM-physics, mathematical fragments such as supersymmetric transformations or extra-dimensional metrics are embedded in "narratives" containing also elements of nonmathematical character, such as stories and empirical references. The aim is to persuade the audience of the physical relevance of a given theoretical core and, as we shall see, the method is indeed fruitful.
Thus we see how a connection exists between the employment of nonrigorous mathematical heuristics and the increased importance of theoretical constructs involving nonmathematical elements. There is a further detail in Hirsch’s statement which is worth noting: Unlike Thurston and Weinberg, he states explicitly that it can be ‘‘irrelevant whether the mathematics can be rigorized.’’ From this point of view, nonrigorous demonstrations are not only qualitatively different, but also epistemically independent from the rigorous proofs which may or may not exist. In my opinion Hirsch’s words offer an accurate characterization of the actual practices of theoretical physicists, where nonrigorous proofs are perceived as valid independently of the existence of formal ones, despite the fact that most physicists, when asked, stick to the idea that their constructs can be transformed into rigorous ones. To underscore this distinction, in the following pages I shall make use of the terms demonstration/demonstrate—as opposed to proof/prove—to indicate arguments which have probative value in the context of model-building, although they do not constitute a proof in the rigorous mathematical sense. The perceived epistemic independence of demonstrations from proofs is of central importance for our subject, because it is within this framework that the practice of model-building becomes endowed with its own specific kind of validating power. The act of building a model which implements elements from a core and possesses some desired features (e.g., a Higgs boson of a given mass) ‘‘demonstrates’’ that a theory with those features is possible, even though the model itself may have other characteristics that are clearly unrealistic. In the following sections I shall discuss a case study spanning the period from the 1960s until today, and we shall see how already in the early years model-building was a powerful means to demonstrate the validity of theoretical hypotheses and of theoretical cores. 6. The ‘‘composite Higgs’’ approach to BSM-physics as a theoretical core: A case study The reflections and notions introduced in the previous sections will now be illustrated and supported by a case study focussing on the theoretical core linked to the notion of a composite Higgs boson. The best way to grasp how the unity of a theoretical core emerges from the interplay of various elements is to reconstruct the historical path of its development. In this way, it will become clear how—under specific conditions—words, formulas and experimental results may resonate with each other in the perception of the physics community, leading scientists to regard the network of mutual references between them as a theoretical unity—a core—endowed with cognitive values and worthy of being used as a guideline for explorative model-building. Although experiments play an important role, the emergence of a theoretical core of BSM-physics cannot be attributed to positive experimental indications in its favour. At the same time, the success of a core is not purely due to abstract considerations of mathematical beauty or to choices of career opportunism: Although these factors certainly have their weight, the process through which a theoretical core establishes itself is an epistemic phenomenon involving all facets of the scientific enterprise and characterized by a peculiar kind of interplay between abstract and empirical elements.
In my case study, I will focus on reconstructing these aspects, leaving aside for the moment the broader social and scientific context in which they are embedded. This restriction is necessary in the context of the present, exploratory study. Before discussing the composite Higgs approach, a few words about the main features and the history of the Standard Model are in order. The Standard Model is made up of two parts, one dealing with electroweak interactions (the ‘‘Weinberg–Salam model’’), the other with strong forces (‘‘quantum chromodynamics’’, short: QCD) (Hoddeson, Brown, Riordan, & Dresden, 1997; Zee, 2010). In both cases the theory is formally defined by assigning a function (‘‘Lagrangian’’) which—at least in principle—uniquely determines the elementary particles contained in it, as well as their properties (masses, mutual coupling constants, transformation properties with respect to various symmetries). Despite the definiteness of the Lagrangian, physical predictions can be extracted from it only in a limited number of cases and, even then, usually only with the help of approximations and of additional assumptions—and without a rigorous mathematical proof that the procedures involved are correct. For example, most Standard Model predictions rely on perturbative expansions whose convergence is not ensured (Cao & Schweber, 1993, pp. 33–34). Moreover, in QCD perturbation theory has a very limited application and theorists are forced onto even less safe ground when attempting to compute the consequences of nonperturbative effects, such as the properties of observed hadrons as bound states of elementary quarks. These difficulties have to be kept in mind, since nonperturbative effects play a central role in the composite Higgs approach. In the Standard Model there is one particle to which special significance is attached: an elementary scalar (i.e., spinless) particle usually referred to as ‘‘the Higgs boson’’. We need not discuss here the technical details making the Higgs boson so important, since for our purposes it is sufficient to quote the verbal statements explaining its role: The Higgs boson ‘‘breaks spontaneously’’ the local gauge symmetry of electroweak interactions through the ‘‘Higgs mechanism’’ and, in doing so, allows itself and all other particles of the Standard Model to have a non-zero mass without spoiling the ‘‘renormalizability’’ of the whole theory. The notion of ‘‘spontaneous symmetry breaking’’ will be discussed later on. As to ‘‘renormalizability’’, it is a property of a theory which ensures that the divergent terms appearing in perturbative computations of observable quantities can be formally subtracted by imposing a finite number of empirically determined conditions, such as requiring that the observable electron charge be equal to its measured value. Renormalization procedures are a prominent example of nonrigorous mathematical methods in quantum field theory. In 1971, Gerard ‘t Hooft proved the renormalizability of the class of models to which both the Weinberg–Salam model and QCD belong and, by the end of the 1970s, the Standard Model had established itself as a phenomenologically successful theory. However, the Higgs boson has not yet been observed and, indeed, the ‘‘Higgs mechanism’’ is the feature of the Standard Model which has most often been called into question.
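To make the verbal characterization of renormalizability given above slightly more concrete, a schematic, textbook-style illustration may help (the formula below is my own sketch and is not drawn from the historical sources under discussion). In perturbative quantum electrodynamics, the coupling measured at a scale \mu comes out depending on an arbitrary regularizing cutoff \Lambda:

\[ \alpha_{\mathrm{obs}}(\mu) \;=\; \alpha_{0} \;+\; c\,\alpha_{0}^{2}\,\ln\frac{\Lambda^{2}}{\mu^{2}} \;+\; O(\alpha_{0}^{3}), \]

with c a computable number. The divergence for \Lambda \to \infty is absorbed by letting the bare parameter \alpha_{0} depend on \Lambda and fixing \alpha_{\mathrm{obs}} once and for all against a measured value. A theory is called renormalizable when a finite number of such empirically fixed conditions suffices to remove all divergences; that the subtraction procedure is physically legitimate is, as noted above, assumed rather than rigorously proven.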
The theoretical core which has the notion of a ‘‘composite Higgs’’ at its centre shares with the Standard Model the idea that particle masses are linked to the spontaneous breakdown of electroweak symmetry. However, the mathematical structures and the story associated with this general notion diverge from those of the Standard Model. From the mathematical point of view, the main distinction is the fact that the Lagrangian of composite Higgs models does not contain any elementary scalar particles like the Higgs boson. The symmetry breakdown is attributed to nonperturbative effects in the interaction of as-yet-unobserved fermions, which eventually form a bound state having the same properties (mass, charge etc.) as the Higgs boson. Although there are no mathematical tools which allow one to accurately describe nonperturbative dynamics, it is nonetheless possible to convincingly argue that they can lead to the spontaneous breakdown of electroweak symmetry. 7. Yoichiro Nambu’s story of symmetry lost and recovered Historically, spontaneous symmetry breaking both through an elementary Higgs boson and by way of nonperturbative effects can be traced back to a paper by Yoichiro Nambu devoted to the phenomenon of superconductivity and its relation to symmetry (Nambu, 1960).1 In Nambu’s paper the formal methods developed in quantum electrodynamics were used to open up new physical perspectives in the study of superconductivity. The starting point of his reflections was the Bardeen–Cooper–Schrieffer (BCS) theory of superconductivity, in which nonperturbative effects of the electromagnetic interactions were shown to give rise in certain materials to a ground state which was separated from excited states by a finite energy gap. It was thanks to the presence of this energy gap that the material could exist in a stable superconducting state. The BCS theory had been very successful from the phenomenological point of view, but had initially raised eyebrows in the solid-state community because of its apparent lack of invariance with respect to the local gauge symmetry of electromagnetism—a fact which seemed to imply a violation of electric charge conservation. Indeed, such a violation appeared explicitly in the formalism developed by Bogoliubov, Tolmachev, & Shirkov (1959). This violation of one of the most sacred principles of physics had however been shown to be only apparent because, again thanks to nonperturbative effects, quasi-particles combined to form ‘‘collective excitations’’, which in turn ensured that all observable quantities would be gauge-invariant and charge-conserving. These results had been known already in the late 1950s, but in 1960 Nambu used the techniques developed in quantum electrodynamics to pursue a more detailed investigation of the relationship between quasi-particles, collective excitations and gauge invariance: ‘‘If such collective states are essential to the gauge-invariant character of the theory—he wrote—then one might argue that the former is a necessary consequence of the latter. But this point has not been clear so far’’ (Nambu, 1960, p. 649). 1 I will summarize only those historical–philosophical aspects of spontaneous symmetry breaking which are relevant to the present subject. Those who wish to learn more about Nambu’s work are referred to the detailed discussion in (Brown & Cao, 1991), where the authors however attribute to the notion of ‘‘spontaneous symmetry breaking’’ a universal character which is not to be found in historical sources.
An exhaustive historical–philosophical discussion of spontaneous symmetry breaking in the Standard Model, with references, can be found in (Brown, Brout, Cao, Higgs, & Nambu, 1997; Karaca, forthcoming; Borrelli, forthcoming). Thus, Nambu’s central aim was not to derive new observable consequences of the BCS theory, but rather to use new mathematical tools to investigate the physical implications of the already existing formalism. The new mathematical presentation of an already known subject lent itself to a particularly appealing narrative interpretation (a story) in terms of a symmetry apparently lost and recovered: Since the underlying theory was symmetric and the superconducting state was not—so Nambu argued—collective excitations had to appear to compensate. In other words, the combined existence of the underlying symmetry and of the energy gap was verbally interpreted as the cause of the appearance of collective excitations. Despite Nambu’s refined and highly innovative methods, no compelling mathematical argument corresponded to the generality of this narrative, but his results were enough to provide a nonrigorous demonstration of the story of apparent loss and recovery of symmetry through nonperturbative effects. Here we see an early example of how fragmentary, nonrigorous mathematical arguments can be combined with a story to build a hybrid theoretical construct, whose validity cannot be rigorously proven, but only demonstrated in special cases. It is worth noting here that, although the work of Nambu would later be described as a case of ‘‘spontaneous symmetry breaking’’, the author never spoke of a ‘‘breaking’’ of symmetry, but rather of its preservation. One year later, in a paper written together with Giovanni Jona-Lasinio, Nambu applied the construct developed in solid state physics to quantum field theory (Nambu & Jona-Lasinio, 1961a). The authors established an analogy between the phenomenon of superconductivity and the masses of strongly interacting particles. Their idea was that the mass of observed hadrons (protons, neutrons, pions etc.) resulted from nonperturbative effects in the interactions of some as yet undetected, massless elementary fields. The unknown fundamental dynamics was assumed to be invariant with respect to a symmetry which, as in the case of superconductivity, was (apparently) lost and (equally apparently) recovered due to nonperturbative effects. More precisely, nucleons (i.e., protons and neutrons) were regarded as analogous to quasi-particles, with their finite masses playing the role of the superconductivity energy gap. In the same way in which quasi-particles violated electromagnetic gauge invariance (and with it electric charge conservation), nucleons broke a fundamental symmetry (‘‘chiral symmetry’’) which would have required them to be massless. Overall invariance was however recovered thanks to the contribution of ‘‘collective excitations,’’ which in strong interactions took the form of pions (and eventually also other hadrons). Once again, the whole construct was not a rigorous mathematical theory, but rather a composite of rigorous and nonrigorous mathematics, as well as verbal narratives. At the beginning of the paper, the authors explained that their analogy was primarily a qualitative one, which they ‘‘would like to pursue mathematically’’
(Nambu & Jona-Lasinio, 1961a, p. 346). They were able to deliver some mathematical arguments in favour of their thesis, but the chosen form for the underlying interaction remained purely speculative, and they could only offer a demonstration of their thesis through model-building: ‘‘Our model Hamiltonian, though very simple, has been found to produce results which strongly simulate the general characteristics of real nucleons and mesons’’ (Nambu & Jona-Lasinio, 1961a, p. 357). Thus, the hybrid theoretical construct proposed by Nambu to explain superconductivity in terms of a hidden symmetry was now being extended to the field of strong interactions thanks to a mathematical analogy based on the formalism of quantum field theory. Model-building was used to ‘‘demonstrate’’ the validity of the extension. The fact that Nambu’s construct arguably explained (nonsymmetric) superconductivity in terms of (symmetric) electromagnetism provided an ‘‘empirical reference’’ further supporting the idea that nonperturbative effects linked to a hidden symmetry could explain observed variety (differences in hadron masses) in terms of hidden simplicity (a small number of massless particles). As we see, Nambu and Jona-Lasinio’s explanation of some features of strongly interacting particles can be interpreted as a theoretical core like those which dominate today’s BSM-model-building. In proposing their theory Nambu and Jona-Lasinio were partly building upon previous work by Julian Schwinger and Werner Heisenberg, yet Schwinger and Heisenberg had developed their arguments only at the formal, quantum-field-theoretical level (Schwinger, 1957; Heisenberg, 1957; Dürr, Heisenberg, Mitter, Schlieder, & Yamazaki, 1959). Nambu and Jona-Lasinio instead brought a new, decisive element into play: The interdisciplinary analogy linking speculative hypotheses in particle physics to well-established solid state theories and phenomena. This empirical reference was essential in supporting the validity of the core. However, Nambu and Jona-Lasinio soon had more to offer than analogies: In the same year, they published a follow-up paper in which they tentatively developed some phenomenological consequences of their ideas (Nambu & Jona-Lasinio, 1961b). In particular, they showed how one could derive from their model the so-called ‘‘Goldberger–Treiman relation’’ linking the pion decay constant to the nucleon mass and the strength of the pion–nucleon coupling. The formula was known to be phenomenologically successful and had later found a justification on the basis of a mathematical assumption on the form of weak interactions (the ‘‘partially conserved axial current’’ (PCAC) hypothesis) whose physical significance remained however unclear (Pickering, 1984, pp. 112–113). The derivation of that formula by Nambu and Jona-Lasinio was based on a number of additional assumptions, but nonetheless delivered a phenomenological underpinning for the conjectured analogy between superconductivity and hadron masses. The derivation of the Goldberger–Treiman relation was for Nambu and Jona-Lasinio’s theoretical core much more than an analogical empirical reference, as it provided directly relevant evidence of the predictive power of the construct. In this sense, their theoretical core was much less speculative than those of today’s BSM-physics. 8. Enter ‘‘spontaneous symmetry breaking’’ Responses to the work of Nambu and Jona-Lasinio were prompt and positive.
In 1961, Jeffrey Goldstone took up the story of the existence of ‘‘superconductor solutions’’ in particle physics and implemented it anew in a simple model in which the collective excitations were represented by an elementary scalar field put in by hand (Goldstone, 1961). Goldstone explicitly acknowledged the explorative character of the model he was building and the nonbinding, yet interesting nature of his results: ‘‘The present work merely considers models and has no direct physical applications, but the nature of these theories seems worthwhile exploring. The models considered here all have a boson field in them from the beginning. It would be more desirable to construct bosons out of fermions and this type of theory does contain that possibility. The theories of this paper have the dubious advantage of being renormalizable, which at least allows one to find simple conditions in finite terms for the existence of ‘superconducting solutions’’’ (Goldstone, 1961, pp. 154–155). For Goldstone, the scalar boson effectively represented the nonperturbative effects of the unknown underlying interactions and allowed him to ‘‘black-box’’ them, investigating the consequences of the assumed existence of ‘‘superconducting solutions’’. The new models implementing the theoretical core proposed by Nambu and Jona-Lasinio were explicitly presented as exploratory tools with ‘‘no direct physical application,’’ but this approach to model-building proved heuristically fruitful and was subsequently employed by a number of authors to study the spontaneous breakdown of a special kind of symmetry (local gauge symmetry), leading to the formulation of what is today referred to as the ‘‘Higgs mechanism’’: a spontaneous breakdown of local gauge symmetry linked to an elementary scalar boson (the Higgs boson). While Goldstone’s introduction of an elementary scalar was very important as a mathematical tool for model-building in connection with the theoretical core proposed by Nambu and Jona-Lasinio, another relevant step was taken in 1962 by Marshall Baker and Sheldon Glashow, who introduced the term ‘‘spontaneous symmetry breaking’’ to characterize Nambu’s story of symmetry loss and recovery (Baker & Glashow, 1962; Brown et al., 1997, p. 512). The question of naming might at first appear as a secondary issue, but, as far as stories and hybrid constructs are concerned, it may weigh heavily in favour or disfavour of their success in the community, and physicists are well aware of this fact. In Section 2 we have seen the creative efforts to ‘‘name’’ alternative Higgs models, and in the next sections I will underscore how physicists often strove to find suggestive, catchy nicknames for their creations or, if possible, to choose names suggesting an empirical reference, as in the case of ‘‘charmonium’’. In 1967, spontaneous symmetry breaking through the Higgs mechanism was used by Weinberg and Salam in (independently) formulating their model of electroweak unification (Weinberg, 1967; Salam, 1968). Both authors expressed the hope that the spontaneous nature of the breakdown of electroweak symmetry would lead to the renormalizability of the theory, but had no proof to offer for their conjecture. Once again, a story had been introduced into particle physics, and it would later prove to be a very successful one.
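For reference, the standard textbook implementation of this story in modern notation (my own sketch, not a quotation from Weinberg’s or Salam’s papers) assigns to an elementary scalar doublet \phi the potential

\[ V(\phi) \;=\; \mu^{2}\,\phi^{\dagger}\phi \;+\; \lambda\,(\phi^{\dagger}\phi)^{2}, \qquad \mu^{2} < 0, \;\lambda > 0 . \]

The Lagrangian is invariant under the electroweak gauge symmetry, but the state of lowest energy is not: the field settles at the non-zero value v = \sqrt{-\mu^{2}/\lambda} (numerically about 246 GeV), singling out a direction in field space. This is the precise content of the ‘‘spontaneity’’ of the breakdown, and it is through their couplings to v that the vector bosons acquire masses of order m_{W} = gv/2.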
Weinberg and Salam’s model could be seen as the implementation of a theoretical core combining mathematical elements (Higgs mechanism, local gauge invariance, the methods of renormalization) with stories (the ‘‘spontaneity’’ of the electroweak symmetry breaking and the idea that it would guarantee renormalizability) and empirical references (superconductivity, Nambu and Jona-Lasinio’s explanation of the Goldberger–Treiman relation). In 1967, there was no indication that the connection proposed by Weinberg and Salam between spontaneous symmetry breaking and renormalizability might be real, as they could not offer any mathematical arguments in support of this hypothesis. Their papers were practically ignored by the scientific community, showing how, in this period, a theoretical core without any direct phenomenological or rigorous mathematical support did not necessarily appear attractive to theorists. In 1971, though, ‘t Hooft delivered a solid mathematical underpinning for the story connecting the ‘‘Higgs mechanism’’ to renormalizability (‘t Hooft, 1971a, 1971b; Koester, Sullivan, & White, 1982; Veltman, 1997), and this development suddenly made the Weinberg–Salam model appealing. Within a few years, experimental evidence in its favour had accumulated. Yet, while experimental physicists were busy finding evidence in favour of the Weinberg–Salam model, many theorists had already started building models which went beyond it (Pickering, 1984, pp. 180–187). I shall briefly discuss this trend in the next section. 9. Model-building beyond the Weinberg–Salam model in the early 1970s In a review article on effective field theory (1993), Howard Georgi wrote: ‘‘In the early 70’s, after Gerhard ‘t Hooft’s explanation of the renormalizability of spontaneously broken gauge theories, but before the ascendancy of the standard model, many physicists engaged in model building, exploring the huge new space of renormalizable models that ‘t Hooft opened up to us. This now seems a little naive, but a tremendous amount of effort was expended understanding the range of possibilities for spontaneous symmetry breaking with elementary scalar field. Much of this effort was devoted to two goals: 1. to determine the precise form of the gauge structure of the partially unified theory of electroweak interactions; 2. to further unify the electroweak interactions by incorporating its gauge structure into something more comprehensive and simpler. The first goal, as it turned out, was bootless. Somehow, Glashow, Weinberg and Salam had written down the right electroweak gauge group the first time’’ (Georgi, 1993, pp. 217–218). Looking at the number of proposals made in those years for alternatives to the Weinberg–Salam model, it becomes clear that that specific model was not regarded with the respect due to a promising candidate for describing fundamental interactions, but was rather seen as one of the countless expendable models which had come and gone in the previous years. Only the general principles it embodied (renormalizability, gauge invariance, spontaneous symmetry breaking) were expected to eventually survive. In other words, the hybrid theoretical core proposed by Weinberg and Salam was regarded as being phenomenologically successful, while the specific model they had formulated was expected to be only a short-lived preliminary proposal. As Georgi noted, things turned out differently, but of course model-builders of that period could not know that.
Even Weinberg conceived his own model as one ‘‘of a general class’’ and stated: ‘‘There are many possible presumably renormalizable models, based on different underlying symmetries and different patterns of symmetry breaking’’ (Weinberg, 1972, pp. 1962–1963). He, too, regarded the renormalizability of his own model as a positive test not of that specific mathematical form, but rather of the theoretical core embodied in it. By 1974, experimental evidence in favour of the Weinberg–Salam model had led to the demise of its early rivals, as extensively documented in (Koester et al., 1982; Sullivan, Koester, White, & Kern, 1980). However, these developments by no means meant that model-builders had given up their trade. Quite the contrary: The second half of the 1970s saw the rise both of the Standard Model and of some of the most significant approaches to physics beyond it: grand unified theories, supersymmetry and, last but not least, the idea of a composite Higgs boson. A feeling for the attitude of model-builders already in this early period may be gleaned from a paper published in 1974 by Abraham Pais and Howard Georgi, in which they tried to develop some general guidelines for model-building (Georgi & Pais, 1974). I shall not discuss here the contents of the paper, as they are not directly relevant for the present discussion, but I would like to offer from it two quotes which well represent the spirit of model-building, then as now. The first one is from the beginning of the work: ‘‘Many gauge models of weak and electromagnetic interactions have been devised in the last few years. The basic strategy for their construction consists in a reconciliation of field-theoretical and phenomenological requirements. From the side of field theory one insists on the renormalizability of the scheme as the principal predictive theoretical tool. From the side of phenomenology one attempts to incorporate all the known regularities of the weak interactions. What is known here almost entirely concerns the rather low-energy and low-momentum-transfer domain. Indeed, it is our ignorance of high-energy weak phenomena which allows, at this stage, for so much play in model building.’’ (Georgi & Pais, 1974, pp. 539–540). The second one is from the conclusions: ‘‘We could not foresee the labyrinth of technical difficulties into which this problem has led us. […] We feel there must be general strategies for the construction of natural models and further that such general results would be a very important advance in the study of the structure of gauge theories’’ (Georgi & Pais, 1974, p. 557). The ‘‘general strategies’’ of which Georgi and Pais spoke might be seen as similar to Randall’s ‘‘nouns and phrases’’: not proper theories, but possible elements of a theoretical core to put in a ‘‘reservoir.’’ As in Randall’s account, the primary function of model-building was not to construct a theory, but rather to find ‘‘general strategies’’ for further model-building. 10. Dynamical symmetry breaking from the late 1960s to 1978 As we saw in the previous section, by the early 1970s spontaneous symmetry breaking had become a hugely successful hybrid notion in the study of particle interactions, but it had come to be associated primarily with an elementary scalar boson. Yet the original idea of Nambu had not been forgotten and, since the late 1960s, it had been receiving new impetus from the emergence of another new idea: the quark model.
In the 1960s, theorists had found a way of at least formally making sense of the plethora of observed strongly interacting particles by regarding them as composites of elementary fermions, the ‘‘quarks’’ (Pickering, 1984, pp. 85–124). Although quarks were unobserved and their dynamics remained unspecified, they provided a new means for implementing Nambu and Jona-Lasinio’s theoretical core. In 1968, Gell-Mann, Robert J. Oakes and B. Renner had shown how the reasoning which had led Nambu and Jona-Lasinio to derive the Goldberger–Treiman relation from a hidden chiral symmetry could also be instantiated using the quark model as the fundamental interaction, and a similar suggestion was also made by Glashow and Weinberg, although they did not mention quarks explicitly (Gell-Mann, Oakes, & Renner, 1968; Glashow & Weinberg, 1968). However, at this time quarks were being regarded by the majority of physicists as purely formal devices, and so the idea that some unknown underlying symmetry of an equally unknown quark dynamics might be the origin of phenomenological relations was regarded as a useful mathematical construction rather than as a physical reality. For example, Roger Dashen and Marvin Weinstein wrote: ‘‘The reader who finds it difficult to believe that this symmetry [i.e. the symmetry of quark dynamics] really exists in a physically meaningful sense can take comfort in the fact that PCAC and current algebra lead to exactly the same formulas for soft-mesons theorems. The language of approximate symmetry, nevertheless, is quite useful and allows us to give a precise meaning to PCAC in a natural way’’ (Dashen & Weinstein, 1969, p. 1261). This passage illustrates the interplay between formal and empirical elements in attributing physical significance to mathematical methods: The spontaneously broken symmetry was regarded as a mathematical trick with no physical significance, yet at the same time it was recognized that it did provide a useful ‘‘language’’ (a story) which might help bridge the gap between mathematics and nature (‘‘give a precise meaning to PCAC in a natural way’’). Later on, when quarks came to be regarded as physical entities on a par with electrons and protons, the ‘‘symmetry story’’ would become the main carrier of physical meaning, and the derivation of pion properties would come to provide an empirical reference supporting the notion of a composite Higgs. Yet this development still lay in the future. Let us now go back to model-building in the early 1970s: In Section 9 we have seen how model-builders like Glashow and Georgi were searching for alternative implementations of the theoretical core proposed by Weinberg and Salam. In this context some of them attempted to build models where the spontaneous breakdown of electroweak symmetry was implemented using the nonperturbative approach of Nambu and Jona-Lasinio. An early suggestion in this sense was made in 1972 by Heinrich Saller, who was working in the context of Heisenberg’s (non-mainstream) approach to quantum field theory which had been among the inspirations for the work of Nambu and Jona-Lasinio (Saller, 1972). Yet Saller’s work was practically ignored and the idea of a dynamical breakdown of electroweak symmetry was effectively launched in 1973 independently by John Cornwall and Richard Norton (‘‘Spontaneous Symmetry Breaking Without Scalar Mesons’’) and Kenneth Johnson and Roman Jackiw (‘‘Dynamical Model of Spontaneously Broken Symmetry’’).
Cornwall and Norton stated: ‘‘In this paper we present a simple model field theory in which the spontaneous symmetry breaking and the consequent massiveness of a vector meson occur in a manner similar to the violation of electric current conservation and the consequent Meissner effect in the theory of superconductivity. Indeed, our efforts to construct a theory of this kind were inspired by the work of Nambu concerning the gauge invariance of the BCS theory of superconductivity’’ (Cornwall & Norton, 1973, p. 3338). Like Nambu and Jona-Lasinio, they did not offer a realistic theory, but used model-building to demonstrate that such a theory was possible: ‘‘It will be evident that this model is not intended as a realistic theory of weak or electromagnetic interactions. Rather, it is only an example of what we feel is probably a large class of theories in which the spontaneous symmetry breaking derives from general features of an apparently symmetric interaction’’ (Cornwall & Norton, 1973, p. 3338). As stated already in the title, the authors did not suggest the existence of a composite Higgs boson, but rather claimed that one might do without any scalars at all. Jackiw and Johnson, too, acknowledged that they were building upon the work of Nambu, Jona-Lasinio and Goldstone and used model-building to demonstrate their claim that ‘‘a theory consisting of massless Fermi and vector-meson fields can lead to an excitation spectrum of solely massive particles’’ (Jackiw & Johnson, 1973, p. 2386). However, they concluded: ‘‘The physical relevance of this mechanism is not apparent at the present time’’ (Jackiw & Johnson, 1973, p. 2395). Their model did not explicitly include a massive, composite scalar, but they commented that the presence of such a bound state was a possibility and that such a theory would be ‘‘an example of the conventional Higgs mechanism’’ (Jackiw & Johnson, 1973, p. 2395). The earliest explicit suggestion of a ‘‘composite Higgs’’ in the Weinberg–Salam model seems to be in a paper published in 1974 by Terrance Goldman and Patrizio Vinciarelli (‘‘Composite Higgs Fields and Finite Symmetry Breaking in Gauge Theories’’), to which the Web of Knowledge attributes only two citations (Goldman & Vinciarelli, 1974). In the following years, a number of authors engaged in investigating dynamical symmetry breaking, but they focussed on the general mathematical aspects of the problem and did not speculate on their physical relevance or on the nature of the nonperturbative interactions underlying the symmetry-breaking dynamics. In short, no theoretical core focussed on the idea of a composite Higgs emerged, although many of its future elements were already there (the mathematical structures used by Nambu and Jona-Lasinio, the stories of a dynamical spontaneous breakdown of symmetry and of a composite Higgs boson, the empirical reference to superconductivity). Let us pause to summarize the situation: In the middle of the 1970s, the Weinberg–Salam model of electroweak interactions with an elementary Higgs field had established itself as a coherent, phenomenologically successful theory, but had at the same time become the starting point of an increasing activity of model-building.
The idea of a dynamically generated, composite Higgs had been explicitly formulated, but had failed to attract much attention and, in particular, no one had addressed the question of the underlying nonperturbative interactions responsible for the generation of the bound state. As already anticipated, the decisive contribution which would give rise to a new, self-sustaining theoretical core was precisely a template for conceiving such nonperturbative interactions, and this element came in the form of Quantum Chromodynamics. In the years between 1976 and 1978 QCD established itself as a viable model for the strong interactions from which all hadrons emerged as composite states (Pickering, 1984, pp. 207–230, 309–346). Once the quarks had come to be regarded as more than formal devices, the idea of considering the pion as a composite state associated to the dynamical breaking of a symmetry of quark interactions provided the empirical reference needed to support the story of dynamical symmetry breaking of electroweak interactions. In 1976, Steven Weinberg explored the ‘‘Implications of dynamical symmetry breaking’’ and explicitly focussed on its possible physical significance (Weinberg, 1976). He did not introduce composite scalar fields, but he did explicitly ask what physical interactions might be at the origin of a dynamical breakdown of electroweak symmetry. He showed that, due to a difference in energy scales, it was not possible to assume that the breakdown was due to the strong interactions of QCD, but he argued that QCD provided a viable template for the new, unknown forces: ‘‘One way to approach these problems is to suppose that in addition to the color SU(3) associated with the observed strong interactions, there is another gauge group […] associated with a new class of ‘‘extra strong’’ interactions, which act on leptons as well as other fermions’’ (Weinberg, 1976, p. 992). However, Weinberg did not find the phenomenology of such models very attractive (Weinberg, 1976, p. 992). In the years immediately following its publication Weinberg’s work did not attract much attention: According to the Web of Knowledge it was quoted three times in 1976, seven times in 1977, and not a single time in 1978. From 1979 onward, however, the paper achieved popularity because it came to be associated with a work by Leonard Susskind to be discussed in the next section. 11. Leonard Susskind and the emergence of the composite Higgs theoretical core In 1979, Leonard Susskind published a paper whose impact on model-building in high-energy physics can hardly be overestimated (Susskind, 1979). The paper bore the title ‘‘Dynamics of Spontaneous Symmetry Breaking in the Weinberg-Salam Theory’’, yet its focus was not dynamical symmetry breaking in general, but rather the idea that the Higgs boson should be regarded as a bound state of more fundamental fermions whose interactions caused a dynamical breakdown of electroweak symmetry. In this paper, Susskind combined a number of pre-existing elements into a new theoretical core. He began by introducing the idea of a composite Higgs by claiming that elementary scalar fields were not ‘‘natural.’’ I shall not discuss his argument here, except for two remarks. The first one is that, as all theorists are ready to recognize, the argument had no compelling character, but was rather a hybrid construct combining some mathematical elements with a story according to which the fine-tuning of parameters is an aesthetic flaw of the Higgs mechanism.
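The technical kernel of the naturalness story, in the schematic form in which it is usually retold today (my paraphrase, not Susskind’s own equations), is that quantum corrections drag the mass of an elementary scalar up to the highest energy scale \Lambda present in the theory:

\[ m_{H,\mathrm{phys}}^{2} \;=\; m_{H,0}^{2} \;+\; \frac{c\,\Lambda^{2}}{16\pi^{2}} , \]

with c a coupling-dependent coefficient. If \Lambda is of the order of a grand-unification or gravitational scale (10^{15}–10^{19} GeV) while m_{H,\mathrm{phys}} must remain near the electroweak scale (10^{2} GeV), the bare parameter m_{H,0}^{2} has to be tuned against the correction to dozens of decimal places. Nothing in this reasoning is mathematically inconsistent; the objection is aesthetic, which is precisely what makes it a story rather than a proof.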
The second remark is that, despite its non-compelling nature, the ‘‘naturalness problem’’ soon became a main point of criticism against the Standard Model (Galison, 2004, p. 31; Drees et al., 2004, pp. 8–10). While the ‘‘naturalness’’ argument gave generic support for the thesis that the Higgs boson was a composite particle, it provided no indication as to what a composite Higgs model would look like. The author therefore went on to tentatively fill his story with mathematical meaning, and it is worth following his argument step-by-step, as it provides a clear example of how the different elements of a theoretical core were combined in an act of model-building demonstrating the core’s validity. Susskind’s first step was to implement his composite Higgs story in a simple model (‘‘a warm up example’’) referring back to the interpretation of the pion as a composite state of quarks which acquired mass thanks to the spontaneous breakdown of a symmetry of the underlying interactions (Susskind, 1979, pp. 2620–2622). The stated aim of the warm-up example was this: ‘‘We would like to know if the strong interactions can somehow replace the Higgs scalars and provide masses for the intermediate vector bosons’’ (Susskind, 1979, p. 2621). Susskind considered the pion as a quark-composite resulting from the dynamical breakdown of the chiral symmetry of quark interactions, and asked whether it might ‘‘replace the Higgs scalars’’ in giving masses to electroweak vector bosons. The answer was yes, although the model was not a realistic one. As we saw in the previous sections, this result was nothing new: Following Nambu and Jona-Lasinio—whom Susskind mentioned, but did not cite (Susskind, 1979, p. 2621)—the pion had been regarded since the late 1960s as a quark composite linked to the dynamical breakdown of chiral symmetry. The idea that QCD might be behind electroweak symmetry breaking, on the other hand, had already been explored by Weinberg (1976). Yet Susskind had shrewdly combined the two elements in his warm-up example, whose aim was not to explore a realistic possibility, but to help build up a hybrid narrative in which not QCD itself, but rather a new interaction would be behind the dynamical breakdown of electroweak symmetry. What was fundamental for his rhetorical strategy was to establish a close connection between the (purely fictional) composite Higgs and the pion, which by that time was unanimously regarded as a quark-composite whose mass could indeed be linked to a story of spontaneous symmetry breaking. This connection would be of the greatest importance for the emerging theoretical core. Having thus warmed up, Susskind proceeded to formulate ‘‘a more realistic example’’, in which he postulated the existence of a ‘‘new undiscovered strongly interacting sector, similar to ordinary strong interactions’’ (Susskind, 1979, p. 2622). The new QCD-like interaction caused a new type of elementary fermions to bind into a symmetry-breaking scalar particle and eventually reproduce the particle content of the Weinberg–Salam model. Susskind used a language maximally underscoring the analogy with QCD: Since the strong charge of quarks was ‘‘color’’, he called his new charge ‘‘heavy-color’’ and spoke of ‘‘heavy-color quarks’’ and ‘‘heavy-color pions’’ (Susskind, 1979, p. 2623).
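A rough numerical rendering of this reasoning (a standard back-of-envelope estimate, not Susskind’s own notation) shows both why the warm-up example fails quantitatively and what the new sector must look like. If the chiral condensate of ordinary QCD were the sole source of electroweak symmetry breaking, the W boson mass would come out as

\[ m_{W} \;\simeq\; \tfrac{1}{2}\, g\, f_{\pi} \;\approx\; \tfrac{1}{2}\,(0.65)(93\ \mathrm{MeV}) \;\approx\; 30\ \mathrm{MeV}, \]

roughly three orders of magnitude below the electroweak value of about 80 GeV. The ‘‘heavy-color’’ interaction must therefore behave like a scaled-up copy of QCD whose analogue of the pion decay constant is of order 246 GeV, that is, some 2600 times larger than f_{\pi}.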
After having shown how the composite scalar would be able to give mass to the electroweak vector bosons like the elementary Higgs did, the author was however forced to admit that his construction was not fully realistic: ‘‘The model described here is certainly incomplete. As it stands it cannot account for the masses of leptons and quarks’’ (Susskind, 1979, p. 2623). In the last section of the paper he addressed in more detail the main problem plaguing his construction: If the composite Higgs boson gave mass also to fermions like leptons and quarks, then the relevant Lagrangian had to contain terms (four-fermion interactions) making it nonrenormalizable (Susskind, 1979, p. 2624). Still, this result was in no way seen as a failure, but rather as a first insight into the potential and problems of a new theoretical approach to BSM-physics: a theoretical core having at its centre the notion of a composite Higgs. Susskind’s paper was an immediate success, both for his naturalness argument and for the idea of the composite Higgs: The Web of Science attributes to it 27 citations for 1980, 84 for 1981 and 85 for 1982. Taking into account the delay between a preprint and its publication, this citation record indicates an almost immediate reception, much different from what we saw for Weinberg’s (1976) paper. However, in an addendum to his own earlier paper Weinberg referred to Susskind’s work—cited as a preprint—and pointed out how he himself had already anticipated its ideas, thus ensuring himself a role as co-creator of the composite Higgs approach (Weinberg, 1979, p. 1279). Susskind’s paper can be identified as the origin of the core associated with the idea of a composite Higgs boson and comprising a mixture of (1) mathematical statements on the renormalizability of the Weinberg–Salam model and the nonperturbative behaviour of quantum-field-theoretical expressions, (2) stories connecting these elements to qualitative notions such as the compositeness of a particle or the spontaneity of a symmetry breakdown and its relationship to renormalizability, and (3) empirical references to how similar arguments had proved phenomenologically successful in other cases (superconductivity, Weinberg–Salam model, Goldberger–Treiman relation, compositeness of the pion and of hadrons in general). None of the individual elements was in itself a novelty, but Susskind combined them for the first time into a physically suggestive and rhetorically unitary theoretical core, using model-building to demonstrate both the viability and the potential usefulness of the new construct. He admitted openly that none of the models he wrote down could be the starting point for a future theory, but this fact did not detract from the success of his approach in the physics community. It is also important to note that the theoretical core did not allow any new explanations or predictions, but only solved a noncompelling problem which Susskind himself had formulated for the first time: the lack of ‘‘naturalness’’ of the Weinberg–Salam model—an issue which in the following decades would become a main motivation for BSM-model-building.
12. Technicolor, supercolor, ultracolor: The many avatars of the composite Higgs Susskind’s paper was the starting point of model-building activities in which he himself played a prominent role: Together with Savas Dimopoulos he introduced an extension of heavy-color called ‘‘technicolor’’: ‘‘We attempt to show that fundamental scalar fields can be eliminated from the theory of weak and electromagnetic interactions. We do this by constructing an explicit example in which the scalar field sectors are replaced by strongly interacting gauge systems’’ (Dimopoulos & Susskind, 1979, p. 237). In this paper the authors addressed the problem of fermion masses, but also explicitly underscored the limited scope of the models they built and their purely exploratory function: ‘‘We will present some examples which we argue are capable of generating a set of mass scales with realistic orders of magnitude for both fermions and bosons. The examples in their present form are too simple to take completely seriously. We offer them as an ‘‘existence proof’’ for a family of theories which can produce reasonable scales without unnatural adjustments. We also feel that the examples offer valuable clues to the various mechanisms which may be needed’’ (Dimopoulos & Susskind, 1979, p. 237). These words again illustrate the modus operandi of model-builders and fit with Randall’s analogy of looking for ‘‘nouns’’ and ‘‘phrases’’, i.e., for some ‘‘mechanism’’ which might be needed in constructing theories. The models are not to be ‘‘taken seriously,’’ but provide an ‘‘existence proof’’ for a ‘‘family of theories’’ of a given kind which may eventually arise out of the elements of the core. Model-building around the composite Higgs core continued in this vein, for example with the development by Susskind, Dimopoulos and Stuart Raby of models for ‘‘tumbling gauge theories’’, in which dynamical symmetry breakdown did not occur all at once, but rather in a series of steps (Raby, Dimopoulos, & Susskind, 1980). In this paper, too, the authors made a conscious effort to associate their quite complex and rather sketchy mathematical constructions with a good verbal formulation and a catchy name: ‘‘In this paper we will describe a class of gauge theories in which a wide variety of length scales may naturally occur. These scales are associated with the sequential spontaneous symmetry breakdown of the gauge symmetry in a series of steps. The breakdowns occur without the aid of fundamental scalars. […] We refer to such behaviour as ‘tumbling’.’’ (Raby et al., 1980, p. 373). Another version of the composite Higgs model was ‘‘supercolor’’, where the idea of technicolor was combined with elements of another emerging theoretical core: supersymmetry (Dimopoulos & Raby, 1981). Later on, models named ‘‘hypercolor’’, ‘‘ultracolor’’ and ‘‘metacolor’’ were also proposed, but, eventually, the term ‘‘technicolor’’ came to be used to describe all variations on the theme (Kaplan, Georgi, & Dimopoulos, 1984; Kaplan & Georgi, 1984). We have now arrived in the early 1980s, a period of particular importance for the construction of today’s landscape of ‘‘new physics,’’ because it was in the years between 1979 and 1985 that some of the most important concepts and approaches in the discourse on ‘‘new physics’’ established themselves: not only technicolor and naturalness, but also the ‘‘hierarchy problem,’’ supersymmetry, grand unified theories, strings, as well as issues connecting particle physics and cosmology, such as dark matter or inflation.
Like dynamical symmetry breaking, all of these notions had been present in the scientific discourse already in the 1970s, but it was only in the early 1980s that they attained centre stage in the community. In all cases, and not only in string theory, nonrigorous mathematical heuristics and hybrid theoretical constructs played a key role, although in string theory this development was particularly evident. This constellation would deserve a detailed historical, philosophical and sociological analysis which cannot be carried out in the context of the present explorative investigation, but the broad picture has to be kept in mind as the framework within which the events discussed in the following pages were embedded. In the early 1980s supersymmetry overtook the composite Higgs as a theoretical core worth exploring in model-building, and remains today the most prominent candidate for BSM-physics. However, the composite Higgs did not die out: In the Proceedings of the 18th Solvay Conference on Physics Susskind admitted that technicolor ‘‘runs into grave difficulties’’ when dealing with quark and lepton masses, but listed tentative ways out, among them the possibility of considering quarks and leptons, too, as composite states (Susskind, 1984, pp. 185–187). In the same conference Haim Harari offered a review of such ‘‘Composite models for quarks and leptons’’ and concluded: ‘‘There is no satisfactory theoretical model of composite quarks and leptons. However, the models proposed so far contain many interesting new ideas. Each of these ideas should be investigated on its own merit, regardless of the detailed model which may have led to it. Several correct ingredients of the correct theory may already be with us now’’ (Harari, 1984, p. 175). Although no viable models for a composite Higgs approach which could treat quarks and leptons had been found, the theoretical core was still regarded as worthy of attention, as it might contain ‘‘correct ingredients of the correct theory,’’ i.e., ‘‘nouns and phrases’’ useful for further model-building. Beyond the specific issue of the composite Higgs, Susskind’s and Harari’s texts are interesting because, given the prominent occasion they were prepared for, they may be seen as expressing contemporary attitudes in the physics community: High-energy theorists were ready to commit themselves to investigate ‘‘ideas […] each in its own merit regardless of the model that may have led to it.’’ This is the same kind of commitment to fragmentary theoretical constructs found in Randall’s description of the practices of model-building: The belief that collecting ‘‘nouns and phrases’’ without bothering with ‘‘grammar’’ was the immediate aim in the search for new physics. Among the most active contributors to technicolor model-building in the 1980s was Howard Georgi, who published ‘‘A Tool Kit for Builders of Composite Models’’ (Georgi, 1986) and, together with Andrew Cohen, played an important role in adding a new ingredient to the tool kit: ‘‘walking technicolor’’ (Cohen & Georgi, 1989).
The idea behind this new class of models had been developed by a number of authors in the years 1985–1987, but in their paper Cohen and Georgi combined the previous approaches into a powerful story: To solve the by now well-known problem of the nonrenormalizability of four-fermion interactions present in composite Higgs models, the relevant coupling constant had to vary very slowly with energy—a story supported by an empirical reference to the explanation of lepton-hadron deep inelastic scattering given by QCD in the early 1970s (Pickering, 1984, pp. 125–158, 207–215). The authors demonstrated the validity of their proposal by building a simple but significant model, and by supplying a catchy name: walking technicolor—a play on words based on the fact that the variation of coupling constants with energy is usually referred to as ‘‘running’’ (Cohen & Georgi, 1989, p. 7). As had been the case in the works discussed before, the paper did not offer a model which could be used as a starting point to construct a full-fledged theory: Model-building was aimed at supporting the validity of the general idea by showing that it worked in a special case. Cohen and Georgi constructed a ‘‘gap equation’’ whose solutions they studied by making use of a series of approximations heavily relying on perturbative methods. Their conclusions offer additional evidence of the hybrid nature of the theoretical constructs with which they were working: ‘‘The really important question about all of this is how much of the story is an artefact of the brutal approximations that we have made to the gap equation. Should we believe it at all in a strongly interacting theory in which, in general, perturbation theory is not a reliable guide to the physics? Obviously, such details as the value of the critical coupling constant will not survive. However, we think that the most important part of the walking technicolor idea is, in fact, very likely to survive beyond perturbation theory’’ (Cohen & Georgi, 1989, p. 23). Indeed, the idea of ‘‘walking technicolor’’ survived to become a new element for the theoretical core of the composite Higgs. 13. Technicolor walks beyond LEP results With the start of the LEP experiment in 1989 precision data on electroweak interactions became available, and they were not particularly favourable to technicolor, as they ruled out many of the simple models, while signatures of more complex variations seemed hard to detect (Peskin & Takeuchi, 1990). Yet the technicolor approach was not abandoned: Already in 1991 Kenneth Lane and M. V. Ramana published a paper on ‘‘Walking technicolor signatures at hadron colliders’’, in which they looked forward to data forthcoming from the Tevatron, a proton–antiproton collider at Fermilab which had been active since 1983, from the Superconducting Supercollider, which at the time was planned, but was eventually cancelled in 1993, and from the LHC, whose planning was already under way (Lane & Ramana, 1991). Lane and Ramana argued that walking technicolor offered better chances than older models and, indeed, this direction of model-building would later become prominent within the composite Higgs approach. Although new experimental results were partly problematic for the composite Higgs core, they also had a positive impact on it, and this was linked to developments in the search for a particle which had long been assumed to exist, but had not yet been detected: the top quark.
For theoretical reasons, the existence of the top quark had not been doubted since 1977, when evidence for its partner, the bottom quark, had been found (Liss & Tipton, 1997; Staley, 2004). The mass of the top quark was initially assumed to be of the same order of magnitude as that of the bottom quark (ca. 4.2 GeV), but high-energy experiments failed to find it and, by the end of the 1980s, the lower limit for the top quark mass had reached the value of 77 GeV. This result led some theorists, among them Nambu, to speculate along a new path: until then, the relatively low values of quark masses had prevented theorists from regarding them as the particles whose dynamics broke electroweak symmetry and gave rise to the composite Higgs. Now the situation had changed and it seemed possible that the breakdown of electroweak symmetry might be due to a top-quark condensate, and not to some ‘‘new’’ elementary fermions (Miransky, Tanabashi, & Yamawaki, 1989; Nambu, 1989; Bardeen, Hill, & Lindner, 1990). As usual, model-building was the tool for testing the hybrid combination of new experimental results on the top quark and the story of a top-quark condensate which spontaneously broke electroweak symmetry: V. A. Miransky, Masaharu Tanabashi and Koichi Yamawaki proposed a ‘‘new class of models of dynamical electroweak symmetry breaking in which the t quark is responsible for the formation of the condensate generating masses of the W and Z boson’’ (Miransky et al., 1989, quote from p. 180). Once again, not a single model, but a whole class was being built at the same time to demonstrate the validity of the new element added to the core. Along similar lines William Bardeen, Christopher Hill and Manfred Lindner spoke of a ‘‘minimal dynamical symmetry breaking’’ of electroweak symmetry through a top quark condensate (Bardeen et al., 1990). They noted how ‘‘several authors, most notably Nambu, have recently experimented with this idea,’’ proceeded to build models to ‘‘make precise the definition of the minimal dynamical-symmetry-breaking scheme’’ and then went on to argue that their simple model ‘‘may be generalized in several directions’’ (Bardeen et al., 1990, quotes from pp. 1647 and 1651). Finally, they derived some predictions for the values of the Higgs and top masses. At the end of the paper, though, they pointed out how their models suffered from naturalness problems similar to those attributed by Susskind to the Standard Model with an elementary Higgs. The idea that a top-quark condensate might be responsible for electroweak symmetry breaking became known as ‘‘topcolor’’ and Christopher Hill eventually combined it with standard technicolor to obtain ‘‘topcolor-assisted technicolor’’ (Hill, 1995). After expounding his new idea Hill noted that the combination of topcolor and technicolor opened up a number of new avenues for model-building: ‘‘We note that a number of new models is suggested by this approach. In model building we have several options: (I) technicolor breaks both the electroweak interactions and the topcolor interactions; (II) technicolor breaks electroweak, and something else breaks topcolor; (III) technicolor breaks only topcolor and something else drives electroweak symmetry breaking […] We believe these models offer new insights into the dynamical origin of fermion masses and electroweak symmetry breaking, and merit further study’’ (Hill, 1995, pp. 487–488).
The idea that a top-quark condensate might be responsible for electroweak symmetry breaking became known as "topcolor", and Christopher Hill eventually combined it with standard technicolor to obtain "topcolor assisted technicolor" (Hill, 1995). After expounding his new idea, Hill noted that the combination of topcolor and technicolor opened up a number of new avenues for model-building:

"We note that a number of new models is suggested by this approach. In model building we have several options: (I) technicolor breaks both the electroweak interactions and the topcolor interactions; (II) technicolor breaks electroweak, and something else breaks topcolor; (III) technicolor breaks only topcolor and something else drives electroweak symmetry breaking […] We believe these models offer new insights into the dynamical origin of fermion masses and electroweak symmetry breaking, and merit further study" (Hill, 1995, pp. 487–488).

This example illustrates well both the flexibility of a theoretical core whose different elements could be variously combined, and its capability to embed new elements of experimental origin. It also shows how the commitment of physicists to a theoretical core (the composite Higgs) prompted them to use model-building to tentatively modify it in different, at times mutually incompatible, directions, adding new elements to it. When the top quark was finally observed (1995), no clear evidence for or against topcolor emerged. Although the top quark had failed to provide direct experimental evidence in favour of the composite Higgs core, its existence could still be used as an empirical reference supporting that approach, since it showed that heavy, strongly interacting fermions similar to the hypothetical techni-quarks were indeed possible.

By the end of the 1990s, although the composite Higgs core had failed to produce a concrete candidate for a viable theory, it had not been abandoned: models were disposable tools, theoretical cores were not. In his review of "Avenues for Dynamical Symmetry Breaking" (1999), Sekhar Chivukula hoped that forthcoming experimental results would give a boost to the enterprise:

"Technicolor, topcolor, and related models provide an avenue for constructing theories in which electroweak symmetry breaking is natural and has dynamical origin. Unfortunately, no complete and consistent model of this type exist. […] If electroweak symmetry breaking is due to strong dynamics at energy scales of order a TeV, experimental directions will be crucial to construct the correct theory. With luck, the necessary clues will begin to appear at the Tevatron in Run II!" (Chivukula, 1999, p. 8).

Thus, twenty years after its emergence the composite Higgs was still thriving, despite the fact that no experimental evidence in its favour had been found. The same applied to supersymmetry and, somewhat later, to the theoretical cores linked to the various extra-dimensional approaches. This development stands in contrast to the cases of Nambu and Jona-Lasinio's ideas on hidden symmetries and hadron masses and of Weinberg and Salam's electroweak unification: in the first case, the authors had been able to provide at least tentative phenomenological support for their ideas in the derivation of the Goldberger–Treiman relation; in the second case, the physics community had only taken notice of the theoretical core proposed by Weinberg and Salam after 't Hooft had provided a rigorous proof underpinning the story connecting renormalizability and spontaneous symmetry breaking. Instead, Susskind's proposal of a composite Higgs boson had been immediately taken up and expanded upon by other authors, even without direct experimental support or rigorous mathematical proof. In short, it is remarkable how long-lived this and other BSM-physics approaches from the late 1970s have turned out to be. As I shall discuss in more detail in Section 15, part of the explanation for this longevity certainly lies in the lack of new, decisive experimental results in particle physics, which has prompted—if not outright forced—theorists to rely increasingly on arguments of "mathematical aesthetics" such as Susskind's naturalness problem. However, I would like to suggest that the shift in theoretical practices may also be linked to a long-term transformation of epistemic values implicit in the trend toward nonrigorous heuristics discussed in Section 5.
Theoretical high-energy physicists are accustomed to assuming in principle that rigorous mathematical proofs exist, while in practice substituting folk theorems for them, and they usually deny the epistemic gap between the two. In a similar way, theories as coherent mathematical constructs remain the ideal aim of research, but the practical focus of that research lies on hybrid constructs, and in this case, too, the gap between the two is left unspoken.

14. The composite Higgs in the new millennium: The model as a "Rosetta stone"

The new millennium has seen many more composite Higgs models being built, while the elements of this theoretical core have increasingly often been combined with those of other ones (e.g., Andersen et al., 2011; Dietrich, Sannino, & Tuominen, 2005; Piai, 2010). In October 2010, in a paper entitled "Technicolor and Beyond: Unification in Theory Space", Francesco Sannino proposed to "marry" technicolor to supersymmetry to "provide a unification of different extensions of the Standard Model. For example, this means that one can recover, according to the parameters and spectrum of the theory, distinct extensions of the standard model, from supersymmetry to technicolor and unparticle physics" (Sannino, 2010, p. 1).2

2 "Unparticle physics" is yet another approach to BSM-physics; it was introduced by Georgi (2007).

Today, model-builders await new experimental results not simply to test the products of their work against observation, but also to obtain new material to enrich their speculations. To quote one example, in April 2011 Estia Eichten, Kenneth Lane and Adam Martin published a preprint on "Technicolor at the Tevatron" in which they proposed to regard a deviation from Standard Model predictions observed at the Tevatron by the CDF experiment as an indication in favour of technicolor, pointing the way to further theoretical developments in the field (Eichten, Lane, & Martin, 2011). The authors stated that the new experimental data might give indications in favour of a theory of the walking technicolor variety, but devoted their paper to the construction of a different kind of model ("low-scale technicolor") which was supposed to represent the low-energy phenomenology of the "real" theory. The authors stressed both the transitional nature of the new model and its key role in connecting experiment and the higher level of theoretical cores:

"If experiments at the Tevatron and LHC reveal a spectrum resembling these predictions [i.e. those of low-scale technicolor], it could well be that low-scale technicolor is the 'Rosetta stone' of electroweak symmetry breaking. For it will then be possible to know its dynamical origin and discern the character of its basic constituents, the technifermions. The masses and quantum numbers of their bound states will provide stringent experimental benchmarks for the theoretical studies of the strong dynamics of walking technicolor just now getting started" (Eichten et al., 2011, p. 4).

Here we see a scenario very similar to that found in Randall's text: a "real" theory is identified in principle as the aim of research, but in practice the efforts of model-builders concentrate on formulating expendable models which will provide "benchmarks" for further theoretical and experimental studies and allow them to collect new "ingredients" for theoretical cores.
To characterize their model, the authors, like Randall, use a striking linguistic analogy, suggesting that it might turn out to be a "Rosetta stone". This image expresses well the role of models as tools to explore and expand hybrid theoretical cores, tentatively linking them with phenomena and individuating those "nouns and phrases" out of which, in a (far) future, a "real" theory might be constructed.

15. Conclusions and outlook: Hybrid theorising and the epistemic significance of Rosetta-stone models

In the last decades no model of BSM-physics has managed to rise to the role of a clear-cut, phenomenologically viable alternative to the Standard Model—yet no model-builder had ever expected one to. Models of BSM-physics are built to serve as expendable implementations of one or more non-expendable theoretical notions of mathematical and nonmathematical character, which have to be tested and explored as potential ingredients of future theories. I have argued that in today's BSM-model-building, and to some extent in all of high-energy physics, theoretical constructs have come, increasingly often, to display a hybrid character: they are not "theories" in the sense of coherent mathematical structures, but rather "theoretical cores" comprising a closely knit network of mathematical notions, verbal narratives ("stories") and analogies referring back to established explanations of phenomena from other fields ("empirical references"). The models which are continuously built and discarded are neither failed attempts at a final theory nor toy models derived from a more complex overarching theoretical structure: they are the constantly refreshed links in a network connecting, on the one side, the elements of one or more theoretical cores and, on the other, experimental data. Following the approach of Morgan and Morrison, we can regard model-building as mediating between the theoretical and the empirical level, yet the theoretical level can no longer be conceived purely in mathematical terms. Unlike theoretical cores, models are usually mathematically coherent constructs; yet they are not theories, as physicists do not regard them as in themselves describing or predicting phenomena, but only as giving an approximate estimate of what the predictions of a hypothetical real theory would look like. Using the suggestive metaphor employed by Eichten, Lane and Martin, models can be seen as "Rosetta stones", which are important not because of their immediate content, but because they provide a chance of adding new mathematical and nonmathematical "nouns and phrases" to theoretical cores. These considerations apply to the approaches to BSM-physics listed in Section 2, as well as to others.

The case of the composite Higgs has been used as an example to clarify the notion of a theoretical core and the function of models. In reconstructing the emergence and development of this theoretical core, particular attention has been devoted to showing how theorists were fully aware of the importance of fragmentary theoretical constructs in their work already in the 1960s. When they built a mathematical model, they did not try to pass it off as a candidate for a theory, but were instead ready to point out that models were not to be "taken seriously", as they were rather exploratory tools to individuate "strategies" and "ingredients" for a future theory.
Although such epistemic practices had their roots in pre-Standard-Model high-energy physics, I have argued that they gained a prominent role only in the last two or three decades. Since the 1980s, theoretical cores such as supersymmetry, extra dimensions or the composite Higgs have kept on being developed despite the fact that no experimental evidence of their validity has been found. Yet it would be incorrect to state that empirical elements play no role in model-building: as we have seen in many cases, analogical empirical references are of great importance in supporting the validity of theoretical cores. For example, the original idea of a dynamical symmetry breaking involving no elementary scalar particles scarcely played a role in electroweak physics until QCD became available as an empirical reference supporting the story of a composite Higgs boson. Leonard Susskind was the first to make extensive use of this argument, and it is because of his work that the theoretical core centring on the notion of a composite Higgs emerged, to evolve over the following three decades thanks to an uninterrupted activity of creative model-building.

If one accepts this case study as representative of the dynamics of model-building in high-energy physics, one may wonder whether this situation should be seen as a consequence of the increasingly different time-scales of theoretical and experimental practice in this field. Since the early 1990s, theorists have been starved of experimental input—except for what came from astrophysics experiments—and have tried to make the best out of the few data available. As Georgi and Pais wrote in 1974: "It is indeed our ignorance of high-energy weak phenomena which allows, at this stage, for so much play in model building" (Georgi & Pais, 1974, p. 540). Indeed, in the last decades hardly any new experimental results have come about to guide theorists in developing theoretical cores. However, as we have seen, the main features of today's practice of model-building were already in place in the early 1970s, when the experimental results on which the phenomenological success of the Standard Model would be built still lay in the future. Those successes notably failed to establish the Standard Model as "the" final theory of particle physics. Indeed, theorists since the 1970s seem to have been more preoccupied with doubting the Standard Model than with committing to it. To quote the CERN theorist Guido Altarelli: "The Standard Model is a low energy effective theory (nobody can believe it is the ultimate theory)" (Altarelli, 2010, p. 1). It is therefore legitimate to speculate that the removal into the far distance (for example, to the top of a mountain) of a "theory" conceived as a coherent, complete mathematical framework explaining a large range of phenomena may be due to factors other than the paucity of experimental input. To answer this question conclusively, a more detailed study of the sociology of knowledge in high-energy physics is needed, yet in concluding this essay I would like to tentatively address the issue.

In 1993, Silvan Schweber noted how, thanks to the method of the "renormalization group", the renormalizability of a theory had come to be linked to the idea that phenomena at different energy scales are described by different "effective theories", all having the same right to claim to represent the laws of nature as any "final theory" (Schweber, 1993). Moreover, the techniques of the renormalization group make it possible, under certain circumstances, to extract from a given quantum field theory some information about what the effective field theories at higher or lower energies should look like.
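The underlying picture can be given a minimal formal sketch, offered here purely as an illustration in standard renormalization-group notation: the couplings g_i(μ) characterizing the effective theory valid at the energy scale μ change with that scale according to

\[ \mu \, \frac{\mathrm{d}g_i(\mu)}{\mathrm{d}\mu} = \beta_i\bigl(g_1(\mu), g_2(\mu), \dots\bigr) , \]

so that, knowing the couplings of one "layer", one can in favourable cases compute how the neighbouring layers should look, at least as long as the flow remains under computational control.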
On this basis, Schweber concluded:

"The successes of the standard model of electroweak and strong interactions has been very impressive. These theories, when interpreted as effective field theories, have lent support to a positivistic and antireductionist viewpoint. The physical world could be considered as layered into quasiautonomous domains, each layer having its ontology and associated laws. A considerable number of high-energy physicists have adopted such a stance" (Schweber, 1993, p. 155).

Schweber's conclusion might be seen as providing at least a partial explanation for the practices of today's model-building as reconstructed in the present paper: if no theory is the final one, one may be content with learning only a few ingredients from it, using model-building to try and advance from one theoretical level to the next. Like Schweber, other authors, too, have pointed out the possible epistemological relevance of the renormalization group (Cao, 1993; Fraser, 2011; Hartmann, 2001; Huggett & Weingard, 1995; Wallace, 2011). However, Elena Castellani has recently noted that the use of renormalization group methods does not in itself compel one to give up the idea of a final theory, since they can be interpreted in different ways (Castellani, 2002). In particular, under an "extreme interpretation" of the renormalization group, one may understand the layers of which Schweber spoke as "related in a precise way" (Castellani, 2002, p. 262). From this point of view, the "final theory" can be regarded as existing on top of the layered tower of effective theories. Indeed, as we have seen, Randall was very positive about the existence of a "unified theory" as the ultimate aim of physics research, an attitude widely shared in the physics community; the renormalization group can thus hardly be said to have undermined the idea that a final theory exists. In conclusion, the methods of the renormalization group cannot in themselves be regarded as the decisive factor promoting the employment of hybrid constructs in theoretical practice, although they certainly play a role in letting such practices appear as epistemically equivalent to the search for a theory of everything.

I would like instead to suggest that the rise of hybrid theoretical cores is primarily linked to the increasing use of nonrigorous heuristics in high-energy physics. As already discussed in Section 5, the use by theoretical physicists of methods which contemporary mathematicians regard as nonrigorous is nothing new, and has often been considered by physicists, philosophers and historians as a transient phenomenon to be straightened out in the course of theory development. However, quantum field theory has a special status in this respect, as it has at its core singularly long-lived examples of phenomenologically successful methods whose mathematical foundations have until now escaped a rigorous reformulation: renormalization techniques, which involve mathematically questionable manipulations of "infinities", but have allowed astonishingly precise predictions of observed phenomena (Brown, 1993).3
The necessity of relying on such nonrigorous methods may be regarded as an important factor promoting among theorists the belief that demonstrations are in some sense as good as proofs. In turn, as discussed in Section 5, demonstrations may involve not only nonrigorous mathematics, but also thoroughly nonmathematical elements, which are not only acceptable, but even necessary, when theorists want to extend a mathematical "narrative" by other means. Indeed, renormalization techniques are also associated with those tools (path integrals, Feynman integrals) which provided the basis for the hybrids of mathematics and physics used by string theorists and analysed by Galison (Galison, 2004, pp. 54–55). In conclusion, the modus operandi of model-builders which has been the subject of the present study could be regarded as a symptom not so much of the abandonment of the ideal of a "final theory", but rather of a crisis of the notion that rigorous mathematical structures stand alone as the ultimate goal of theoretical research in physics. In principle, the foundational myth of theories as pure and rigorous mathematical constructs continues to thrive in today's high-energy physics community, but in practice mathematical theories are usually replaced by hybrid constructs such as theoretical cores. Using Yehuda Elkana's terminology, one might suggest regarding this phenomenon as a case of "two-tier thinking" linked to the gradual rise of an "image of knowledge" (i.e., a socially determined view of knowledge) in which mathematical proof no longer alone represents the highest level in the hierarchy of methods of knowledge legitimization (Elkana, 1981). It is in this context that the building and reshaping of Rosetta-stone models becomes an epistemic practice of central importance, as it appears to be the appropriate tool for grasping and combining elements of the hybrid theoretical level and confronting them with empirical claims demonstrating their validity (or lack thereof). Physicists often admit to the importance of verbal components, non-compelling empirical analogies and numerical methods in their everyday practice, but at the same time they play down the epistemic role of these factors, and are ready to assume that all such elements will eventually be filled up with rigorous mathematical meaning. For philosophers and historians it is, however, imperative to go beyond the study of mathematical theories and take into account the interplay of hybrid strategies of communication and representation in contemporary science.

3 Scharf (2001) has been able to offer an infinity-free formalization of the renormalization procedures, but his results are still based on a perturbative expansion whose convergence is not ensured.

Acknowledgements

The roots of this paper lie in my collaboration with Michael Stöltzner, to whom I am deeply indebted. My reflections on the epistemic implications of non-rigorous mathematical heuristics were greatly enriched by comments from Erhard Scholz, whom I warmly thank also for making me aware of the dispute around the Jaffe & Quinn paper. Finally, I would like to warmly thank the anonymous reviewer, whose acute comments much helped me clarify the arguments in this paper. The present study has been developed in the context of the DFG project "Epistemic dynamics of model-development at the LHC: an empirical investigation" at the University of Wuppertal (Germany).

References
Altarelli, G. (2010). Particle physics in the LHC era and beyond. arXiv:1002.4957v1 [hep-ph].
Andersen, J. R., Antipin, O., Azuelos, G., Del Debbio, L., Del Nobile, E., Di Chiara, S., et al. (2011). Discovering technicolor. arXiv:1104.1255v1 [hep-ph] (CP3-Origins-2011-13).
Atiyah, M., Borel, A., Chaitin, G. J., Friedan, D., Glimm, J., Gray, J. J., et al. (1994). Responses to "Theoretical mathematics: towards a cultural synthesis of mathematics and theoretical physics", by A. Jaffe and F. Quinn. Bulletin (New Series) of the American Mathematical Society, 30, 178–207.
Baker, M., & Glashow, S. (1962). Spontaneous breakdown of elementary particle symmetries. Physical Review, 128, 2462–2471.
Bardeen, W. A., Hill, C. T., & Lindner, M. (1990). Minimal dynamical symmetry breaking of the standard model. Physical Review D, 41, 1647–1660.
Bernkopf, M. (1968). A history of infinite matrices. A study of denumerably infinite linear systems as the first step in the history of operators defined on function spaces. Archive for History of Exact Sciences, 4, 308–358.
Bogoliubov, N., Tolmachev, V. V., & Shirkov, D. V. (1959). A New Method in the Theory of Superconductivity (translated from Russian). London: Chapman & Hall.
Borrelli, A. (2010). Dirac's bra-ket notation and the notion of a quantum state. In H. Hunger, F. Seebacher, & G. Holzer (Eds.), Styles of Thinking in Science and Technology. Proceedings of the 3rd International Conference of the European Society for the History of Science (Vienna 2008) (pp. 361–371). Vienna: Austrian Academy of Sciences.
Borrelli, A. (forthcoming). Who broke electroweak symmetry? In Recent Progress in Philosophy of Science: Perspectives and Foundational Problems. New York: Springer (invited submission).
Brown, L. M. (Ed.). (1993). Renormalization. From Lorentz to Landau (and Beyond). New York: Springer.
Brown, L. M., & Cao, T. Y. (1991). Spontaneous breakdown of symmetry: its rediscovery and integration into quantum field theory. Historical Studies in the Physical and Biological Sciences, 21, 211–235.
Brown, L. M., Brout, R., Cao, T. Y., Higgs, P., & Nambu, Y. (1997). Panel session: spontaneous breaking of symmetry. In L. Hoddeson, L. Brown, M. Riordan, & M. Dresden (Eds.), The Rise of the Standard Model. Particle Physics in the 1960s and 1970s (pp. 478–522). Cambridge: Cambridge University Press.
Bustamante, M., Cieri, L., & Ellis, J. (2009). Beyond the Standard Model for Montañeros. arXiv:0911.4409v2 [hep-ph].
Cao, T. Y. (1993). New philosophy of renormalization: from the renormalization group equations to effective field theories. In L. M. Brown (Ed.), Renormalization. From Lorentz to Landau (and Beyond) (pp. 87–133). New York: Springer.
Cao, T. Y., & Schweber, S. (1993). The conceptual foundations and the philosophical aspects of renormalization theory. Synthese, 97, 33–108.
Castellani, E. (2002). Reductionism, emergence, and effective field theories. Studies in History and Philosophy of Modern Physics, 33, 251–267.
Chivukula, R. S. (1999). Avenues for dynamical symmetry breaking. arXiv:hep-ph/9903500v1.
Cohen, A., & Georgi, H. (1989). Walking beyond the rainbow. Nuclear Physics B, 314, 7–24.
Cornwall, J. M., & Norton, R. E. (1973). Spontaneous symmetry breaking without scalar mesons. Physical Review D, 8, 3338–3346.
Daley, K. (2003). Is mathematical rigour necessary in physics? British Journal for the Philosophy of Science, 54, 439–463.
Dashen, R., & Weinstein, M. (1969). Soft pions, chiral symmetry and phenomenological Lagrangians. Physical Review, 183, 1261–1291.
Dietrich, D. D., Sannino, F., & Tuominen, K. (2005). Light composite Higgs boson from higher representations versus electroweak precision measurements: predictions for CERN LHC. Physical Review D, 72, 055001.
Dimopoulos, S., & Susskind, L. (1979). Mass without scalars. Nuclear Physics B, 155, 237–252.
Dimopoulos, S., & Raby, S. (1981). Supercolor. Nuclear Physics B, 192, 353–368.
Drees, M., Godbole, R. M., & Roy, P. (2004). Theory and Phenomenology of Sparticles. An Account of Four-Dimensional N = 1 Supersymmetry in High-Energy Physics. Singapore: World Scientific.
Dürr, H. P., Heisenberg, W., Mitter, H., Schlieder, S., & Yamazaki, K. (1959). Zur Theorie der Elementarteilchen. Zeitschrift für Naturforschung, 14, 441–485.
Eichten, E. J., Lane, K., & Martin, A. (2011). Technicolor at the Tevatron. arXiv:1104.0976v1 [hep-ph].
Elkana, Y. (1981). A programmatic attempt at an anthropology of knowledge. In E. Mendelsohn, & Y. Elkana (Eds.), Sciences and Cultures. Anthropological and Historical Studies of the Sciences (pp. 1–76). Berlin: Springer.
Fraser, D. (2011). How to take particle physics seriously: a further defence of axiomatic quantum field theory. Studies in History and Philosophy of Modern Physics, 42, 126–135.
Galison, P. (2004). Mirror symmetry: persons, values, and objects. In M. N. Wise (Ed.), Growing Explanations. Historical Perspectives on Recent Science (pp. 23–63). Durham: Duke University Press.
Gell-Mann, M., Oakes, R. J., & Renner, B. (1968). Behavior of current divergences under SU(3) x SU(3). Physical Review, 175, 2195–2199.
Georgi, H. (1986). A tool kit for builders of composite models. Nuclear Physics B, 255, 274–284.
Georgi, H. (1993). Effective field theory. Annual Review of Nuclear and Particle Science, 43, 209–252.
Georgi, H. (2007). Unparticle physics. Physical Review Letters, 98, 221601.
Georgi, H., & Pais, A. (1974). Calculability and naturalness in gauge theories. Physical Review D, 10, 539–558.
Glashow, S., & Weinberg, S. (1968). Breaking chiral symmetry. Physical Review Letters, 20, 224–227.
Goldman, T., & Vinciarelli, P. (1974). Composite Higgs field and finite symmetry breaking in gauge theories. Physical Review D, 10, 3431–3434.
Goldstone, J. (1961). Field theories with "superconducting" solutions. Il Nuovo Cimento, 19, 154–164.
Gowers, T. (2008). Introduction. In T. Gowers, J. Barrow-Green, & I. Leader (Eds.), The Princeton Companion to Mathematics (pp. 1–76). Princeton: Princeton University Press.
Grojean, C. (2009). New theories for the Fermi scale. arXiv:0910.4976v1 [hep-ph].
Grosholz, E. R. (2007). Representation and Productive Ambiguity in Mathematics and the Sciences. Oxford: Oxford University Press.
Harari, H. (1984). Composite models for quarks and leptons. Physics Reports, 104, 159–179.
Hartmann, S. (1999). Models and stories in hadron physics. In M. S. Morgan, & M. Morrison (Eds.), Models as Mediators. Perspectives on Natural and Social Science (pp. 326–346). Cambridge: Cambridge University Press.
Hartmann, S. (2001). Effective field theories, reductionism and scientific explanation. Studies in History and Philosophy of Modern Physics, 32, 267–304.
Hawking, S., & Mlodinow, L. (2010). The Grand Design. New York: Bantam Books.
Heisenberg, W. (1957). Quantum theory of fields and elementary particles. Reviews of Modern Physics, 29, 269–278.
Hill, C. T. (1995). Topcolor assisted technicolor. Physics Letters B, 345, 483–489.
Hoddeson, L., Brown, L., Riordan, M., & Dresden, M. (Eds.). (1997). The Rise of the Standard Model. Particle Physics in the 1960s and 1970s. Cambridge: Cambridge University Press.
Huggett, N., & Weingard, R. (1995). The renormalization group and effective field theories. Synthese, 102, 171–194.
Jackiw, R., & Johnson, K. (1973). Dynamical model of spontaneously broken gauge symmetries. Physical Review D, 8, 2386–2398.
Jaffe, A., & Quinn, F. (1993). "Theoretical mathematics": towards a cultural synthesis of mathematics and theoretical physics. Bulletin (New Series) of the American Mathematical Society, 29, 1–13.
Jaffe, A., & Quinn, F. (1994). Response to comments on "Theoretical mathematics". Bulletin (New Series) of the American Mathematical Society, 30, 208–211.
Kaplan, D. B., & Georgi, H. (1984). SU(2) x U(1) breaking by vacuum misalignment. Physics Letters B, 136, 183–186.
Kaplan, D. B., Georgi, H., & Dimopoulos, S. (1984). Composite Higgs scalars. Physics Letters B, 136, 187–190.
Karaca, K. (forthcoming). The construction of the Higgs mechanism and the emergence of the electroweak theory. Studies in History and Philosophy of Modern Physics.
Koester, D., Sullivan, D., & White, D. H. (1982). Theory selection in particle physics: a quantitative case study of the evolution of weak-electromagnetic unification theory. Social Studies of Science, 12, 73–100.
Lane, K., & Ramana, M. V. (1991). Walking technicolor signatures at hadron colliders. Physical Review D, 44, 2678–2700.
Liss, T. M., & Tipton, P. L. (1997). The discovery of the top quark. Scientific American, September 1997, 54–59.
Lützen, J. (1979). Heaviside's operational calculus and the attempts to rigorise it. Archive for History of Exact Sciences, 21, 161–200.
Miransky, V. A., Tanabashi, M., & Yamawaki, K. (1989). Dynamical electroweak symmetry breaking with large anomalous dimension and t quark condensate. Physics Letters B, 221, 177–183.
Morrison, M. (2007). Where have all the theories gone? Philosophy of Science, 74, 195–228.
Morrison, M., & Morgan, M. S. (1999). Models as mediating instruments. In M. S. Morgan, & M. Morrison (Eds.), Models as Mediators. Perspectives on Natural and Social Science (pp. 10–37). Cambridge: Cambridge University Press.
Nambu, Y. (1960). Quasi-particles and gauge invariance in the theory of superconductivity. Physical Review, 117, 648–663.
Nambu, Y., & Jona-Lasinio, G. (1961a). Dynamical model of elementary particles based on an analogy with superconductivity I. Physical Review, 122, 345–358.
Nambu, Y., & Jona-Lasinio, G. (1961b). Dynamical model of elementary particles based on an analogy with superconductivity II. Physical Review, 124, 246–254.
Nambu, Y. (1989). BCS mechanism, quasi-supersymmetry, and fermion masses. In Z. Ajduk, S. Pokorski, & A. Trautman (Eds.), New Theories in Physics. Proceedings of the XI Symposium on Elementary Particle Physics (pp. 1–9). Singapore: World Scientific.
Peschard, I. (2007). The value(s) of a story: theories, models and cognitive values. Principia, 11, 151–169.
Peskin, M. E., & Takeuchi, T. (1990). New constraint on a strongly interacting Higgs sector. Physical Review Letters, 65, 964–968.
Peters, K.-H. (2011). Mathematische und phänomenologische Strenge: Distributionen in der Quantenmechanik und -feldtheorie. In K. H. Schlote, & M. Schneider (Eds.), Mathematics Meets Physics. A Contribution to Their Interaction in the 19th and the First Half of the 20th Century (pp. 373–393). Frankfurt a. M.: Verlag Harri Deutsch.
Piai, M. (2010). Lectures on walking technicolor, holography, and gauge/gravity dualities. Advances in High Energy Physics, 2010, 464302.
Pickering, A. (1981). The role of interest in high-energy physics: the choice between charm and colour. In K. Knorr, R. Krohn, & R. Whitley (Eds.), The Social Process of Scientific Investigation (pp. 107–138). Dordrecht: D. Reidel.
Pickering, A. (1984). Constructing Quarks. A Sociological History of Particle Physics. Chicago: University of Chicago Press.
Raby, S., Dimopoulos, S., & Susskind, L. (1980). Tumbling gauge theories. Nuclear Physics B, 169, 373–383.
Randall, L. (2006). Warped Passages. Unravelling the Universe's Hidden Dimensions. London: Penguin.
Randall, L., & Sundrum, R. (1999). Large mass hierarchy from a small extra dimension. Physical Review Letters, 83, 3370–3373.
Salam, A. (1968). Weak and electromagnetic interactions. In N. Svartholm (Ed.), Elementary Particle Theory: Relativistic Groups and Analyticity (Nobel Symposium 8) (pp. 367–377). Stockholm: Almqvist and Wiksell.
Saller, H. (1972). A compact version of Weinberg's lepton model. Il Nuovo Cimento, 12A, 349–364.
Sannino, F. (2010). Technicolor and beyond: unification in theory space. Journal of Physics: Conference Series, 259, 012003.
Scharf, G. (2001). Quantum Gauge Theories: A True Ghost Story. New York: Wiley.
Schweber, S. (1993). Changing conceptions of renormalization theory. In L. M. Brown (Ed.), Renormalization. From Lorentz to Landau (and Beyond) (pp. 135–166). New York: Springer.
Schwinger, J. (1957). A theory of the fundamental interactions. Annals of Physics, 2, 407–434.
Staley, K. W. (2004). The Evidence for the Top Quark: Objectivity and Bias in Collaborative Experimentation. Cambridge: Cambridge University Press.
Sullivan, D., Koester, D., White, D. H., & Kern, R. (1980). Understanding rapid theoretical change in particle physics: a month-by-month co-citation analysis. Scientometrics, 2, 309–319.
Susskind, L. (1979). Dynamics of spontaneous symmetry breaking in the Weinberg–Salam model. Physical Review D, 20, 2619–2625.
Susskind, L. (1984). The gauge hierarchy problem, technicolor, supersymmetry, and all that. Physics Reports, 104, 181–193.
Susskind, L. (2005). The Cosmic Landscape. String Theory and the Illusion of Intelligent Design. New York: Little, Brown.
't Hooft, G. (1971a). Renormalization of massless Yang–Mills fields. Nuclear Physics B, 33, 173–199.
't Hooft, G. (1971b). Renormalizable Lagrangians for massive Yang–Mills fields. Nuclear Physics B, 35, 167–188.
't Hooft, G., Susskind, L., Witten, E., Fukugita, M., Randall, L., Smolin, L., et al. (2005). A theory of everything? Nature, 433, 257–259.
Thurston, W. P. (1994). On proofs and progress in mathematics. Bulletin (New Series) of the American Mathematical Society, 30, 161–177.
Veltman, M. (1997). The path to renormalizability. In L. Hoddeson, L. Brown, M. Riordan, & M. Dresden (Eds.), The Rise of the Standard Model. Particle Physics in the 1960s and 1970s (pp. 145–178). Cambridge: Cambridge University Press.
Wallace, D. (2011). Taking particle physics seriously: a critique of the algebraic approach to quantum field theory. Studies in History and Philosophy of Modern Physics, 42, 116–125.
Weinberg, S. (1967). A model of leptons. Physical Review Letters, 19, 1264–1266.
Weinberg, S. (1972). Mixing angle in renormalizable theories of weak and electromagnetic interactions. Physical Review D, 5, 1962–1967.
Weinberg, S. (1976). Implications of dynamical symmetry breaking. Physical Review D, 13, 974–996.
Weinberg, S. (1979). Implications of dynamical symmetry breaking: an addendum. Physical Review D, 19, 1277–1280.
Weinberg, S. (1985). The ultimate structure of matter. In C. DeTar, J. Finkelstein, & C.-I. Tan (Eds.), A Passion for Physics. Essays in Honor of Geoffrey Chew (pp. 114–127). Singapore: World Scientific.
Weinberg, S. (1997). What is quantum field theory, and what did we think it is? arXiv:hep-th/9702027v1.
Wightman, A. S. (1973). Relativistic wave equations as singular hyperbolic systems. In D. C. Spencer (Ed.), Partial Differential Equations. Proceedings of Symposia in Pure Mathematics, vol. 23 (pp. 441–477). Providence: American Mathematical Society.
Zee, A. (2010). Quantum Field Theory in a Nutshell (2nd ed.). Princeton: Princeton University Press.