of the three domains. It also examines research innovations and true paradigm shifts from the point of view of the VNS, and provides a summary of the strengths and weaknesses of the system. I did not find this chapter to be as exciting as the others.

This book is strongly recommended to veteran researchers and to doctoral students. It would be a valuable addition to doctoral courses in research methodology or marketing theory. Because one of the main contributions of the book is to structure and integrate disparate validity concepts, it is more powerful the more one already knows about research methods and the more familiar one is with classics like Campbell and Fiske (1959), Campbell and Stanley (1963), Cook and Campbell (1979), Cronbach and Meehl (1955), Garner, Hake, and Eriksen (1956), Kuhn (1962), and Platt (1964). Validity and the Research Process deserves a place on the practicing researcher's shelf alongside these works. Like them, it can be read and reread, with new insights becoming apparent each time.

JOHN G. LYNCH, JR.
University of Florida

REFERENCES

Campbell, Donald T. and Donald W. Fiske (1959), "Convergent and Discriminant Validation by the Multitrait-Multimethod Matrix," Psychological Bulletin, 56, 81-105.

Campbell, Donald T. and Julian Stanley (1963), Experimental and Quasi-Experimental Designs for Research. Chicago: Rand McNally, Inc.

Cook, Thomas D. and Donald T. Campbell (1979), Quasi-Experimentation: Design and Analysis Issues for Field Settings. Chicago: Rand McNally, Inc.

Cronbach, Lee J. and Paul E. Meehl (1955), "Construct Validity in Psychological Tests," Psychological Bulletin, 52, 281-302.

Garner, Wendell R., Harold W. Hake, and Charles W. Eriksen (1956), "Operationism and the Concept of Perception," Psychological Review, 63, 149-59.

Kuhn, Thomas S. (1962), The Structure of Scientific Revolutions. Chicago: University of Chicago Press.

Platt, John R. (1964), "Strong Inference," Science, 146, 347-53.

EXPLORING CLINICAL METHODS FOR SOCIAL RESEARCH, edited by David N. Berg and Kenwyn K. Smith. Beverly Hills, CA: Sage Publications, Inc., 1985, 400 pp.

The concept of clinical methods of social research conveys a combination of meanings that many marketing researchers may find new. Essentially, clinical methods in this sense combine psychoanalytic techniques with participant observation techniques to emphasize involvement, flexibility, depth, and advocacy, as opposed to detachment, rigor, generalizability, and objectivity. If the clinical side of this characterization forces a contrast that resembles the distinction between art and science, it is no accident. The editors want methodology that blurs the art-science dichotomy: "When we no longer can distinguish between art and science as separate processes but experience their holism we will have tapped the real meaning of the term clinical—knowledge through encounter, valid because it fits with experience, real because it touches the core, the universal and particular simultaneously" (p. 34).

One dominant message of this book is that social scientists are also people. We have goals, biases, ideals, and agendas—all of which may influence our work. We ought not drive methodological vehicles with blind spots where our own humanity may obstruct clear vision of the correct inference, leading to potentially tragic conclusions from time to time. Rather, we ought to recognize the blind spots and use mirror methodologies that will help us see clearly. The danger of zealous advocacy of one methodology, such as the one in this book, is that we may be tempted to ignore the windshield and look only at the blind spot. However, the editors and common sense both dictate that clinical methods alone are no better than traditional methods alone. The best inferences emerge from methodological heterogeneity.

Essentially this book broadly attacks the establishment of social science—the editors of major journals and the corporate research directors—who insist on detached, rigorous, generalizable, and objective data for publication or decision making. Ritual beatification of methodological diversity rarely translates into alternative modes of investigation. However, the book fails to address squarely the question of why this change does not happen. The answer is that few scholars see superior knowledge accumulating in fields where the clinical method reigns, such as psychiatry and clinical psychology.

This book was written for organizational behavior scholars, not for marketers. Many examples will require translation. The editors, for example, would probably like focus group research. Some advice may not be attractive after translation. Clinical research takes time, often generations, to complete. If one's goal is to increase sales next quarter or to earn tenure next year, the confidence that one is supplying one's grandchildren may not compensate sufficiently for other costs.

Like many edited volumes, this book has extreme stylistic variety, ranging from work by an author who pretends to be a playwright to passages drier than Death Valley in drought. The lack of an index in this age of word processing undermines the utility of the book for those of us who have only a passing interest in organizational behavior, but the section commentaries and introductions do provide helpful guides.

LYNN R. KAHLE
University of Oregon

COMPETITOR INTELLIGENCE: HOW TO GET IT, HOW TO USE IT, Leonard M. Fuld. New York: John Wiley & Sons, Inc., 1985, 470 pp.

Recent focus on strategic market planning, combined with the contemporary phenomena of maturing markets, extensive market segmentation, and intensive competitive warfare, has spawned a burgeoning interest in the competitive intelligence system. The literature on this