Theses - History and Philosophy of Science

Recent Submissions

Now showing 1 - 20 of 72
  • Open Access
    Building Black Holes: Analogue experiments and analogical reasoning
    Field, Grace [0000-0003-2629-7191]
    Analogue experiments investigate empirically accessible ‘source’ systems that are meant to mirror the behaviour of less accessible ‘target’ systems. This dissertation aims to answer two questions about analogue experiments. First, what are they useful for? Second, how do analogical reasoning and the reasoning underlying analogue experimentation compare to other forms of inductive reasoning? The first question has been controversial ever since experiments in analogue gravity began to reveal effects which are in principle undetectable by conventional means. The second question connects analogue experimentation with the broader landscape of inductive reasoning, including the ‘problem of induction’ (e.g. Norton (2021)), analogical reasoning more broadly construed (e.g. Hesse (1966), Bartha (2010)), and simulation and modelling (e.g. Parke (2014), Morgan (2005)). Chapter 1 provides background on both analogical reasoning and analogue experimentation. Chapter 2 expands on a Bayesian framework introduced by Dardashti et al. (2019) to argue that analogue experiments can in principle provide significant confirmation for claims about their target systems, but only when supplemented with an independently plausible claim which is *significantly positively relevant* to both the source and target systems. Chapter 3 situates analogical reasoning and analogue experimentation within a new categorisation of inductive reasoning, arguing that inferences which we would typically place into the same class often provide very different kinds of inductive support for very different conclusions. Chapter 4 represents these categories as variations of Chapter 2’s Bayesian framework, lending generality to the arguments presented in Chapter 2 and providing a quantitative perspective on the arguments presented in Chapter 3. Finally, Chapter 5 uses recent developments in analogue gravity to show that analogue experiments can be useful for reasons beyond confirmation of hypotheses about their target systems: they can directly detect generalised phenomena, and they can be used as exploratory tools.
  • Controlled Access
    The Clock and the Hand: Taking the Pulse in English Medicine, 1650-1710
    Huang, Yijie
    This thesis examines the knowledge and practice of pulse diagnosis in early modern England. An essential sign of the human body and its disease, the pulse had nonetheless been difficult to interpret since the earliest days of medicine. In the first decade of the eighteenth century, the Lichfield physician Sir John Floyer (1649-1734) introduced the method of measuring the pulse assisted by a “pulse watch” in his sphygmological treatise, *The physician’s pulse-watch* (2 vols, 1707 and 1710). Floyer’s new method is often explained within prevalent discourses on the monumental transformation of science and medicine from qualitative to quantitative and subjective to objective. Prompted by Floyer’s writings, this thesis instead argues that pulse diagnosis ought to be understood as part of the history of the sense of touch and as an artefact of multiple experiences of the body. Taking Floyer’s argumentation and equivocation as clues, it traces how the pulse was explained against natural philosophers’ conception of the clocklike human body, how it was expressed under various medical practitioners’ fingers, and how it was manipulated according to lay people’s repertoire of cure concerning the wrist. In doing so, this thesis traces how the pulse was variously perceived in the entangled, vibrant early modern medical world Floyer inhabited. Drawing on a wide range of sources from translations of Chinese medicine to exegeses of Greek and Latin classics, from anatomical experiments to medical manuals and recipe collections, it sits at the intersection of the histories of medicine, science, and the senses. Bringing the pulse watch back into the contexts of its emergence, it seeks to contribute to the historical reflections on the fruition and tension of knowing through sensory experience.
  • Embargo
    Tempering the Ambition of Social Science Genomics: Causation, Explanation, and Evidence for Policy
    Bondarenko, Olesya
    This thesis offers a philosophical account of social science genomics (sociogenomics) – an integrative field of study which brings together behavioural genetics and the social sciences. Sociogenomic integration consists in the use of genome-wide association studies (GWAS) and related tools (particularly the so-called polygenic scores) within psychology, sociology, economics, and evidence-based social policy. I propose that the anticipated epistemic and non-epistemic payoffs from this type of integration can be understood in terms of two overarching promises or ambitions: credibility and trustworthiness. For the social sciences, sociogenomic integration is associated with the promise of greater credibility, as some hope that polygenic score-informed research designs will strengthen causal inference about psychological or environmental influences on individual outcomes such as educational attainment, socioeconomic status or well-being. In turn, the infusion of social science frameworks into genetic research on human behavioural differences is meant to increase the latter’s trustworthiness by lending it a more ethically responsible and socially attuned character. Having characterised the ambitions of the integration in this way, I proceed to examine whether sociogenomic research has been able to realise them. I argue that the payoffs from this type of inquiry have been more modest than its proponents claim. In particular, applications of polygenic scores in the social sciences have suffered from methodological and theoretical shortcomings which cast doubt on their ability to revolutionise causal inference in the relevant domains. Moreover, social science genomics does not fully address the longstanding challenges associated with the causal interpretations of behavioural genetic findings, even though it seeks to turn these challenges into new lines of cross-disciplinary investigation. I also argue that significant progress is still needed to improve the ethical profile of behavioural genetics, and that the involvement of the social sciences should not be seen as a panacea against genetic determinism, reductionism, and fatalism. In fact, as the thesis demonstrates, these problematic views often persist in sociogenomic research in less overt or obvious forms. This suggests that conceptual and theoretical resources of the social sciences have a more limited ability to counteract such views than is typically recognised.
  • Open Access
    Chiffchaffs chirp and cherries are real: a scientifically informed defence of wholehearted emergentism
    Tabatabaei Ghomi, Hamed
    In this thesis, I provide a scientifically informed defence of metaphysical emergence, the idea that some systems are metaphysically distinct from their constituent parts and possess properties that are not reducible to their parts’ properties. Moreover, I argue that metaphysical emergence is acceptable only if embraced wholeheartedly, along with all its ontological corollaries. Finally, I offer an ontology that suits metaphysical emergence. I begin, in chapter 1, by rejecting a computational class of theories of emergence that I take to be the most important rivals to metaphysical emergence. These theories try to explain the irreducible properties of emergent phenomena by reference to their computational irreducibility. I show that computational irreducibility fails to fulfil its philosophical roles in these theories. In chapter 2, I offer a practical argument in support of metaphysical emergence. The main message is that the growing reliance on so-called irrational scientific methods provides evidence that objects of science are indecomposable and as such, are better described by metaphysical emergence as opposed to an alternative reductionistic metaphysics. In chapter 3, I analyse the emergentist research programme in linguistics as an example of the scientific application of the concept of emergence. The underlying theme of this chapter is that half-hearted emergentism is hopeless. I show that if one adopts some weak understandings of the concept of language emergence, the emergentist programme is not fundamentally different from the other non-emergentist research programmes in linguistics. On the other hand, if one adopts some stronger understandings of emergence then the programme would have a unique character, but at the cost of some philosophical corollaries that demand a fundamental revision of the emergentist programme in its present shape. In chapter 4, I suggest that if one accepts metaphysical emergence, then one needs to revise one’s ontological views as well. I compile the minimum set of ontological commitments necessary for maintaining the metaphysical and causal claims of structuralist accounts of metaphysical emergence. The set has three elements: (A) structural realism, (B) structural causation, and (C) the condition of downward percolation.
  • Open Access
    A Hybrid Theory of Induction
    Segarra Torné, Adrià
    In this thesis I motivate and develop a Hybrid Theory of Induction (HTI), and I explore some of its virtues and implications. The HTI is a hybrid second-order model of inductive support. It is a hybrid model of inductive support because it holds that two ingredients play a necessary role in understanding inductive support: rules and facts. It is a second-order model of inductive support because it is a model within which first-order models of inductive support (i.e. logics of induction) can fit. In chapter 1 I argue that we need both rules and facts to play a role in a successful account of inductive support. Rules of induction accurately describe relations of inductive support when they are warranted; facts do the warranting work. I call this type of warrant "factual warrant". The resulting account is both functional and accurate: it helps us make sense of how different rules of induction can coexist, and it allows us to resolve some current debates in induction. For the purposes of chapter 1 I adopt an existing binary account of factual warrant. In chapter 2 I develop a Graded account of Factual Warrant (GFW), according to which factual warrant comes in degrees. I integrate the GFW into the HTI. I then show that the GFW illuminates the connection between factual warrant and inductive support, and it can successfully account for the role of idealisations and theory in our understanding of inductive support. In chapter 3 I argue that the HTI is also useful for agents, since it can provide methodological guidance to ensure strong inferences and conceptual guidance to assess the strength of our inferences. Finally, in chapter 4, I explore Bayesian inductive logics from the perspective of the HTI. This analysis brings to light the central role that probability models play in Bayesian inductive logics, offering a logical underpinning for some recent suggestions in Bayesian epistemology. Furthermore, throughout this thesis I analyse in detail three rules of induction from the perspective of the HTI: enumerative induction in chapter 2, causal inference in chapter 3, and Bayesian inductive logics in chapter 4. These analyses illustrate how the HTI can help us think more clearly about rules of induction, offering new tools to tackle existing challenges.
  • Embargo
    Making Sense of Pain: A Pluralist Remedy for Pain Eliminativism
    Ott, Daniel
    In this dissertation, I argue that pain is a sense. This argument is made against pain eliminativism, a position which argues that pain is no longer a meaningful scientific concept and that it should be removed from scientific and philosophical investigations. I make this sensorial argument by focusing on the methodology needed to answer the question: given what we now know about pain, how can we increase our understanding of it? In Part I, I focus on pain and sense concepts, by first developing three intractable problems for which current theories cannot account, these being: Perceptual Objectivity, Mechanistic Disparity, and Phenomenal Heterogeneity. Stemming from the requirements these three problems methodologically impose, I develop a novel internal logic for the dissertation, formed of two premises. The first, the Veridical Criterion, states that for a perceptual theory to be successful, it must account for the possibility of hallucinations and illusions and thereby differentiate veridical perceptual states from misperceptions. I argue that for perceptual theories to address the Veridical Criterion, they must methodologically proceed first from a place of public consensus, a position termed here the Priority Thesis, which forms the second premise. Using these premises, I evaluate and then conclude that the eliminativist methodology is not a viable philosophical argument for scientists or philosophers to adopt for pain concepts. I then contrast the understanding of pain concepts with that of sense concepts, and argue that the three preceding problems of pain equally apply to the traditional sense categories, such that pain eliminativism, if accepted, necessitates sense eliminativism. This realisation creates an impasse for the dissertation, having demonstrated the prevailing concepts’ deficiencies while simultaneously rejecting their removal by means of eliminativism. In Part II, I respond to this impasse by developing an original definition of sense, termed afferent action, and argue that this definition is inclusive of pain. I reject the prevailing notion of the Veridical Criterion and posit the Standardisation Criterion as a replacement. I show how, if this definition of pain as a sense is accepted, paining must be taken as equivalent to seeing, hearing, smelling, tasting, and touching. This novel verbialist theory of pain is put in contrast to prevailing adverbial theories, and is shown to successfully resolve the original three problems of pain. I conclude that this theory allows scientists and philosophers alike to make sense of pain.
  • Embargo
    Bowu and the Natural World in the Formation of Modern China
    Yu, Jia
    My PhD thesis looks closely at a Chinese word, *bowu* 博物, interpreted by historians of science and technology in China as ‘broad learnings’, which refers to knowledge with a great diversity of origins across Chinese history. Arguing for its enduring relevance in knowledge-making of the natural world in China, this thesis presents a first step in investigating the long-term history of *bowu* by tracing various manifestations of *bowu* as an intellectual and cultural category of knowledge. Chapter One provides a comprehensive review of some exemplary uses of *bowu* in pre-modern eras, ranging from association with a kind of polymath (*bowu junzi* 博物君子) which emerged between the fifth and second centuries BC to the term’s continuous occurrence in major reference books, such as the Qing imperial encyclopaedia *Qinding Gujin tushu jicheng* 欽定古今圖書集成. The following three chapters focus on the transitional period, starting from the 1850s, during which *bowu* experienced profound and rapid changes in meaning. Chapter Two examines one of the most significant moments in the history of *bowu*, when American Baptist medical missionary Daniel Jerome Macgowan (1815-1893) interpreted *bowu* to refer to natural philosophy and general sciences. This chapter centres on Macgowan’s 1851 compilation of a popular scientific work in Chinese, titled *Bowu tongshu* 博物通書 (Philosophical Almanac), which communicated basic elements and general uses of novel technologies of electric telegraphy. Chapter Three presents a different approach to *bowu* taking place around the same period. The renowned British medical missionary Benjamin Hobson (1816-1873), whose Chinese works introducing Western medical knowledge had been widely circulated and read by Chinese elites and doctors, published a three-volume Chinese book in 1855 called *Bowu xinbian* 博物新編. The book sought to ‘diffuse’ useful knowledge that helped its readers to take a new look at the natural world. These two chapters aim to show how Western natural philosophy and natural history made their way into late imperial China through their association with *bowu*. The last chapter studies the modern developments of traditional *bowu* and local operations of *bowu xue* (‘*bowu* learning’) from the perspective of native practitioners of the early Republican era, when Chinese educators and reform-minded members of the gentry established learned institutes, launched specialized journals, and organized field practices of *bowu xue* across the new nation. With these four chapters, the goal of my thesis is to show that long-term historical studies of local knowledge categories like *bowu* provide us with a clearer understanding of the importance of non-Western knowledge systems in shaping and re-shaping our visions of the natural world prior to the arrival and standardization of modern disciplinary sciences, yet these knowledge systems did not become visible parts of the historiography of modernity in most places around the globe.
  • Embargo
    Visualising the Aurora: Embodied and Instrumental Sensing throughout the International Polar and Geophysical Years (1880-1960)
    Amery, Fiona
    This thesis traces the various ways in which the aurora was imaged, visualised and understood during the International Polar and Geophysical Years of 1882-1883, 1932-1933 and 1957-1958. I explore the depiction of the phenomenon, from hand-drawings to radio echoes, while paying heed to what was occluded from portrayals, the imaginative and aesthetic considerations involved in rendering the aurora and the epistemological problems of capturing a transient, unpredictable and intangible atmospheric object. Photography, spectroscopy, radio imaging and the introduction of the all-sky perspective were integral technological developments, influencing the ways in which the aurora was presented and viewed. Nevertheless, experiential knowledge of the phenomenon, gained through watching the affective light displays and occasionally listening for its potentially illusive sounds, remained crucial to each of the endeavours. With a focus on the practices of Polar research, I trace the shifting balance between reliance on embodied and instrumental registration of the phenomenon. This perspective reveals the significance of amateur participation to the Polar Years and the centrality of outdoor, situated practices of knowledge creation, complicating our understanding of the spaces of the nineteenth and twentieth century physical and geophysical sciences. The project to perfectly reproduce the aurora, and thus come to know it, was from the outset an impossible task. This thesis is, therefore, a story of incremental learning, of the calibration and standardisation of the phenomenon across vast distances, of bringing together fragments of the aurora’s ontology to create a fuller, more complete picture of the phenomenon, and of both fallibility and success.
  • Open Access
    On the Challenges of Measurement in the Human Sciences
    Larroulet Philippi, Cristian [0000-0001-5793-4670]
    Measurement practices are central to most sciences. In the human sciences, however, it remains controversial whether the measurement of human attributes—depression, happiness, intelligence, etc.—has been successful. Are, say, widely used depression questionnaires valid measuring instruments? Can we trust self-reported happiness scales to deliver quantitative measurements, as is sometimes claimed? These and related questions remain hotly disputed to this day. There are two main frameworks under which human measurements are studied and criticized. One is the so-called construct validity framework. Here, criticisms of human measurements are typically of the form “this instrument is not valid: it does not actually measure the attribute we set out to measure”. The second framework is the standard typology of measurement scales (nominal, ordinal, interval, and ratio). Human measurements are commonly challenged for being merely ordinal—not quantitative: interval or ratio—despite the fact that many researchers use measurement results as if they were quantitative. The first part of the dissertation studies how adequate these frameworks are for evaluating measuring instruments and the inferences they afford. Regarding the concept of validity, it is commonly understood unconditionally, that is, without restricting validity judgments about measuring instruments to context-specific situations. Instruments are said to be (in)valid *simpliciter*. In Chapter 2, I argue against this conception of validity and in favor of a contextual one. Regarding the second main framework, the standard typology of scales is typically linked to a set of prescriptions regarding (un)justified measurement inferences. I call this classification-cum-prescriptions the “received view” on measurement scales. In chapters 3 and 4, I question the idea that the received view is an impeccable guide for clarifying the kinds of inferences we are licensed to make from measurement results in human science contexts. I motivate general doubts about the adequacy of the received view in Chapter 3, and I articulate these doubts in detail for the specific case of average group comparisons in Chapter 4. The upshot of this first part of the dissertation is a deeper awareness of the complexity surrounding which inferences can legitimately be made from measuring instruments. The first part of the dissertation is largely framed under the assumption that some human attributes are indeed quantitative, even if they are not currently being measured quantitatively. The second part addresses two important issues raised by that assumption. Chapter 5 tackles the so-called “quantity objection”: that human science attributes are themselves not quantitative, and thus cannot be quantitatively measured. This objection has been deployed to argue against optimistic positions regarding human quantification. I argue that the quantity objection is not successful in this sense—it begs the question against those optimistic human scientists who, just as their colleagues in the physical sciences have done, postulate theoretical quantitative attributes as *working hypotheses*. But what does treating human attributes quantitatively as working hypotheses amount to? And what are, or can be, “amounts” of depression, happiness, etc.? Chapter 6 argues that there is not one but various approaches for quantitatively conceiving of human attributes, each with its own success conditions. This chapter offers a conceptual framework for making progress on debates about controversial human measurements.
  • Embargo
    Globalising China: Jesuits, Eurasian Exchanges, and the Early Modern Sciences
    Giovannetti-Singh, Gianamar [0000-0003-3752-6359]
    This dissertation argues that the Manchu conquest of China in the mid-seventeenth century transformed several ostensibly “European” sciences in the early modern period. The “Tartar war” between the weakened Ming dynasty (1368-1644), peasant rebels, and the Manchus—a semi-nomadic population from northeast Asia—was experienced first-hand by several Jesuit missionaries proselytising in China. During the unstable interregnum, Jesuits sought patronage from disparate warring factions, offering their astronomical expertise to help various pretenders secure the “Mandate of Heaven” to rule legitimately over China, hoping to ensure their mission’s survival. By engaging with Chinese and Manchu astronomical labourers, reading Chinese treatises on cosmology, agriculture, cartography, history, and moral philosophy, and interacting with scholar-officials and military commanders, Jesuits learned extensively from local technoscientific discourses and practices. Between 1653 and 1658, the Tridentine missionary Martino Martini (1614-1661) served as a “procurator”—responsible for promoting the China mission in Europe—and a representative of the new, Manchu-led Qing dynasty (1636/44-1912). In Europe, Martini published accounts of the Ming-Qing War (1654), China’s geography (1655), and its history (1658) with commercial printers, reaching a wide, interconfessional readership. He courted patronage from powerful Habsburg rulers and defended the Jesuits’ involvement in Chinese sciences and politics at an audience with Pope Alexander VII. As this dissertation contends, Martini’s successful mobilisation of disparate political, religious, commercial, and scholarly networks across a turbulent Eurasia enabled his laudatory accounts of Chinese sciences to convince an extraordinarily wide audience. In turn, during the long eighteenth century, European writers drew—often polemically—on Martini’s accounts of Chinese agriculture, astrology, cartography, chronology, cosmology, ethnography, military cultures, and moral philosophy to articulate new solutions to contemporary technoscientific, social, and political crises. As such, the dissertation argues that Manchu and Chinese cultures of knowledge, mediated by Jesuits, occupied an important and underappreciated role in Enlightenment sciences.
  • Embargo
    The Logical Structure of Scientific Knowledge-Systems
    Vos, Bob
    This thesis seeks to assess and develop the use of formal methods in philosophy of science. More specifically, I argue that the particular strand of philosophy of science that concerns itself with the formal structure of scientific knowledge has focused excessively on the structure of scientific theories. In response, I consider the prospects for the formal study of other, supra-theoretical aspects of science, culminating in the proposal for a research programme centred around the formal study of what I shall call scientific knowledge-systems. This line of argument is laid out over the course of two parts, along with an extended introduction. In my extended introduction, I will situate myself with respect to the wider philosophy of science. This is necessary because the body of work I seek to build on in this thesis is emblematic of a style of philosophy of science, referred to here as the architectonic style, which today has largely fallen out of favour. Accordingly, I will first offer some general arguments for the desirability and viability of this style of metascientific enquiry. In Part I of the thesis, I offer a critical appraisal of extant work on the formal structure of scientific knowledge, referred to there as architectonic metascience. Following an extensive survey of architectonic metascience (Chapter 1), I argue that it suffers from the problem of theory-centrism (Chapter 2). The upshot of this observation, I argue, is that frameworks for the formal study of scientific knowledge should adopt a supra-theoretical unit of analysis (or macro-unit, for short). I conclude my appraisal by discussing the few extant—but seldom acknowledged—examples of the formal study of macro-units (Chapter 3). Finally, in Part II, I seek to make a contribution to the formal study of macro-units. A recurring theme throughout the history of analytic philosophy is the idea that we may draw on the metatheoretical study of logic to inform the metatheoretical study of science. In line with this tradition, I first survey the movement of logical abstractivism, within which we find various frameworks for the systematic study of different systems of logic (Chapter 4). Following this is an intermezzo which presents an existing application of logical abstractivism to philosophy of science (Chapter 5), and a brief discussion on the explication of target systems in formal analyses of science (Chapter 6). Building on these reflections, I set out the programme of metascientific abstractivism for the study of scientific knowledge-systems (Chapter 7).
  • Open Access
    A history of Ohm's Law: Investigating the flow of electrical ideas through the instruments of their production
    Connelly, Charlotte [0000-0001-9350-044X]
    This thesis takes a deep dive into the electrical work of Georg Simon Ohm. It is tightly focused on the period 1825-1827 when he developed and published the famous law we now think of as “Ohm’s law”. This work differs from previous studies of Ohm’s investigations by putting material culture at its heart. Using Ohm’s research as a case study, this project asks: how can the material culture of the physical sciences contribute to contemporary historiography in the history of science? What can we uncover from a material-led investigation that would remain inaccessible in a text-led investigation? And, in this particular case, how does looking closely at Ohm’s experimental apparatus and practice help us to understand the development of his law? As part of the material-led study, this project incorporates a reconstruction and reworking of Ohm’s 1826 experimental apparatus. As something of an outsider, Ohm has defied ready categorisation by historians. This project looks at why that might be, and what his theoretical and methodological influences were. Looking at the way Ohm’s instrument was designed and used, we can see the influence of natural philosophers including Coulomb, Ørsted, Seebeck, Ampère and others, while looking at Ohm’s mathematical treatise reveals the strong influence of Fourier. Perhaps most notable from Ohm’s work in this period is how readily he changed his conceptual approach. Over the course of three years, three distinct phases of work can be identified: in the first phase Ohm used a voltaic, or hydroelectric, pile and described his observations in terms of “loss of force”; in the second he replaced the unstable voltaic pile with a thermocouple and described his observations in terms of “exciting force”; and in the third he set aside his experimental work and focused instead on making mathematical arguments, shifting his language again to describe what was happening in an electrical circuit in terms of “electroscopic force”. The shift between Ohm’s first two phases of work can, this project argues, be made sense of by understanding changes in Ohm’s experimental setup and the way they affected his interaction with the phenomena he was studying. This thesis presents objects and instruments as integrated parts of the thought process of an experimenter, in which the apparatus shapes the thinking of the experimenter as much as the experimenter’s thoughts shape the design of the apparatus. This framework is then applied to Ohm’s work as a case study, leading to the suggestion that Ohm’s material interactions with his experiment shaped his conceptualisation of what was happening in an electrical circuit. It is possible to draw clear links between the conceptual tools and the material tools that Ohm used in his different phases of work; and, in understanding the practical experience Ohm had of using his apparatus, to make sense of his shifting use of language as he moved through different phases of work. Through its material-led approach, this project brings together history, philosophy and material culture of science in a single case study. It presents novel findings about Ohm’s work, uncovered through material engagement with his experimental setup, and offers a set of tools for material-led studies in the history of science.
  • Controlled Access
    Using mathematics in physics: A pragmatic approach
    Tomczyk, Hannah
    In this thesis, I address the philosophical problem of why mathematics is useful for physics. I discuss the contemporary philosophical debate on the topic, which is focused on the representational role of mathematics. In contrast to this, I then defend a pragmatic view in which representation is only one of the uses of mathematics in physics, and which stresses the use of mathematics in experimental and technological contexts. I argue that physicists can make mathematics useful in concrete situations because they have a language that enables them to coordinate activities in mathematical and concrete contexts. In particular, there are concepts that are meaningful in both contexts of work. I call these “hybrid” concepts, and I analyse them with a view that takes the meaning of a term to be determined by its use. For physicists to successfully combine work in mathematical and concrete contexts, they need to have mastered the accepted uses of hybrid terms in both contexts. In the historical development of science, many concepts underwent changes so that today, they are particularly apt to connect the two contexts of work. I will use two case studies to show how this approach can illuminate the usefulness of mathematics in concrete situations: the development of the concept of “velocity” from its mathematisation in the Middle Ages to its first technological usefulness in the context of gunnery in the eighteenth century, and the development of the term “spin” from its introduction into atomic physics in the 1920s to its use in modern MRI. I claim that if we trace the historical development of hybrid terms with a use-focused view of meaning, it becomes clear why mathematics became more and more useful in experimental and technological contexts.
  • Open Access
    Colour and Colour Vision in Late Nineteenth Century British Sciences
    Bridgman, Gregory
    This thesis employs a close reading of archival and published sources to explore the origins of colour vision science in 19th-century Britain. By drawing attention to wide-ranging dialogues and disagreements between diverse figures with different visions of colour, including physicists David Brewster, James Clerk Maxwell, and Lord Rayleigh, politician and philologist William Gladstone, ophthalmologists Frederick Edridge-Green and Robert Carter, and anthropologist W.H.R. Rivers, I show that 19th-century colour vision science was not narrowly confined to the quantification, measurement, and classification of colours. It was instead shaped by deeper metaphysical questions and wider political concerns. These questions and concerns included the implications of natural law, free will, and materialism for scientific understandings of reality, the limits of Darwinian understandings of humanity, the legitimacy of scientific experts and institutions in determining public policy, and the history, future, and advancement of civilization. I argue that the widespread use of spinning discs as an experimental research technology, promoted by Brewster and Maxwell, combined with the mainstream acceptance of Maxwell’s theoretical model of ‘coterminal response curves’, generated conflicts between competing understandings of perception, vision, and colour in the second half of the 19th century. These conflicts stemmed from the establishment of new conventions, inspired by Maxwell’s work, which held that scientists should maintain a practical and analytical distance from their own visual experiences, that the visual experiences of test subjects should be treated as untrustworthy phenomena in need of further analysis, and that the meaning of subjective experiences is contained within, and revealed by, mathematical models that accord with a rational understanding of the physical world. These practical and metaphysical approaches to the meaning of human experience did not end with the conflicts they generated in the second half of the 19th century but continue to bear on broader contemporary understandings of truth and illusion in scientific practice and popular imagination.
  • Open Access
    Bridging the gap between populations and individuals in the philosophy of medicine
    Scholl, Raphael
    The thesis addresses the relationship between populations and individuals in the philosophy of medicine. There is a long-standing tension in the fact that randomised, controlled trials in clinical populations count among the best evidence in medicine, while the goal of medical practice is to help individual patients who may differ from population averages. To what extent is evidence from populations a sufficient guide to the treatment of individuals, and how can it perhaps be improved upon? The thesis consists of five chapters that consider this problem from several complementary perspectives. The first chapter returns to the origins of population studies in medicine: The 1835 debate on medical statistics at the Académie Royale de Médecine in Paris. It argues that the existing literature has neglected core epistemological arguments on which the debate turned, and which made population studies a much more challenging methodological development than we today appreciate. The second chapter moves to a present-day version of the debate. It asks whether physicians ought more frequently to conduct so-called n-of-1 or single-case trials with individual patients. The conclusion is that while such studies are epistemologically sound, they are less useful than they may appear at first glance. The third chapter focuses on so-called molecular network reconstruction, a type of mechanistic discovery strategy that leverages large datasets. While it is often difficult to find strong associations between genetic variants and individual health outcomes, this literature suggests that a higher level of organisation -- the state of entire molecular networks -- can often be associated with individual outcomes. The fourth chapter presents an extended case study of periodontal disease, a common affliction that is well understood in some respects but also presents inter-individual heterogeneity that has been recalcitrant to explanation for decades. The puzzles of the case study lead into the fifth and final chapter, which locates the search for inter-individual variation in disease susceptibility and therapy response in an evolutionary context. It argues that evolutionary models of ultimate disease causation can serve as a heuristic tool for the study of the proximate causes of variation in health outcomes.
  • Open Access
    Experts & AI systems, explanation & trust: A comparative investigation into the formation of epistemically justified belief in expert testimony and in the outputs of AI-enabled expert systems
    Seger, Elizabeth [0000-0001-8942-4130]
    The relationships between human experts and those who seek their advice (novices), and between AI-enabled expert systems and users, are epistemically imbalanced. An epistemically imbalanced relationship is one in which the information source (expert/AI) occupies an epistemically privileged position relative to the novice/user; she/it can utilize capacities, resources, and reasoning techniques to draw conclusions that the novice/user would be unable to access, reproduce, or, in some cases, comprehend on her own. The interesting and problematic thing about epistemically imbalanced relationships is that when the epistemically disadvantaged party seeks out expert/AI aid, then in virtue of the novice’s epistemically disadvantaged position, she is not well-equipped to independently confirm the expert/AI’s response. Consider, for example, a physician who outlines a cancer treatment regime to a patient. If the physician were then to try to explain to the patient how she decided on that specific regime (including drug doses, timings, etc.), it is not clear how the explanation would help the patient justify her belief in the physician’s claims. If an expert outlines her reasoning in such detail that it provides strong evidence in support of her claim – for instance, such that a series of true premises logically leads to a conclusion – then the novice is unlikely to have the expertise necessary to recognize the evidence as supporting the claim. Accordingly, the question stands: how can the novice, while remaining a novice, acquire justification for her belief in an expert claim? A similar question can be asked of user-AI interactions: How can an AI user, without becoming an expert in the domain in which the AI system is applied, justify her belief in AI outputs? If an answer can be provided in the expert-novice case, then it would seem that we are at least on our way to acquiring an answer for the AI-user case. This dissertation serves a dual purpose as it responds to the above questions. The primary purpose is as an investigation into how AI users can acquire a degree of justification for their belief in AI outputs. I pursue this objective by using the epistemically imbalanced novice-expert relationship as a model to help identify key challenges to user appraisal of AI systems. In so doing, the primary objective is achieved while pursuing the dissertation’s secondary purpose of addressing standing questions about the justification of novice belief in human expert claims. The discussions that follow are framed against an overarching conceptual concern about preserving epistemic security in technologically advanced societies. As my colleagues and I have defined it (Seger et al., 2020), an epistemically secure society is one in which information recipients can reliably identify true information or epistemically trustworthy information sources (human or technological). An investigation into how novices and users might make epistemically well-informed decisions about believing experts and AI systems is therefore an investigation into how we might address challenges to epistemic security posed by epistemically imbalanced relationships.
  • Open Access
    Why we need to talk about preferences: A federalist proposal
    Beck, Lukas
    This doctoral thesis argues that, contrary to the appearances of unity, economists are highly disunified in their understanding of central concepts of choice theory and game theory, namely, preferences and beliefs. Even though 'preference' is arguably the most central concept in economics and the discipline is very explicit about the structural assumptions that preferences are supposed to satisfy, there is neither an explicit definition of the concept in economic textbooks nor much of a current debate in the discipline. Nevertheless, the last few years have seen the emergence of several views by philosophers of economics concerning what preferences in economics really are (e.g., judgmentalism, various strands of revealed preferences theory). Trying to defend one such story is, in my view, a mistake. Instead, I propose that we should acknowledge that there is significant disunity about concepts like preferences and beliefs in economics, and that explicating this disunity can not only help us account for important controversies at the forefront of economic research, but also point us towards potential resolutions. As a first step towards my aim, I demonstrate that the various grand narratives about ‘what preferences in economics really are’ fail to account for substantial contributions and practices in economics. I then argue that this is to be expected as only a minimal conception of preferences holds the federation of economics together. This minimal conception is usually enriched with further implicit assumptions that differ across various research programs and are tailored to the specific agendas of the research programs in which they are employed. One of my central claims is that explicating and appraising these more local assumptions will — in contrast to how the debate currently proceeds — allow philosophers of science to contribute substantially to the progress of economics. The thesis supports this claim by looking in detail at i) the disagreements concerning what kind of experiments microeconomics needs and ii) the recent controversy about preference purification in behavioral welfare economics. Concerning the first debate, I argue that proponents of the heuristics-and-bias program usually put internalist restrictions on the constituents of preferences, while proponents of experimental economics in the tradition of Vernon Smith permit agents' environments to play a crucial role in the constitution of their preferences. Regarding the second debate, I argue that disentangling different substantial notions of rationality, which go beyond its technical meaning in economics, can help us account for the vastly different assessments of the plausibility of preference purification in behavioral welfare economics. Turning back to the big picture, my discussion of i) and ii) highlights that economics has more to gain from an explication of the implicit assumptions about choice- and game-theoretic concepts made by different research programs than from overreaching narratives about ‘what preferences really are.’
  • Embargo
    The Prospects of Personalising Medicine
    Mncube, Zinhle
    Personalised Medicine (PM) is touted as a medical revolution in which medical treatment and diagnosis are tailored to the individual patient so that they are optimal, safe, and exactly appropriate. Each chapter in this dissertation deals with the conceptual, methodological, epistemic, or ethical issues of personalising medicine that influence our ability to reliably predict, diagnose, and treat disease for individual patients. I argue that PM in its current form is insufficient in several ways. My dissertation contributes to the literature by providing arguments for why and how we should reconceive of PM. In Chapter 1, I describe and analyse two conceptions of PM: broad-based and biological. I reject the popular claim that biological conceptions of PM are not truly personalised because they do not capture holistic aspects of personhood. Instead, I argue that the real problem for genome-based conceptions is that genomics is frequently imprecise and not explanatorily robust about the underlying causes of disease. In Chapter 2, I illustrate that some theorists combine their defence of the use of race in medicine with an appeal to the value-free ideal. Against this view, I contend that contextual value judgements are important in assessing the use of race in medicine because evidence on the epistemic usefulness of racial categories in medicine can be transiently underdetermined and inductively risky. In Chapter 3, I interrogate the controversial use of racial categories in equations to predict kidney function and show that equations that include race in their estimations are inadequate at accurately predicting kidney function. In Chapter 4, I assess the underexplored reliability of what I call the ‘stratification strategy’. This strategy requires that clinicians make therapeutic predictions about individual patients based on evidence of commonly shared molecular biomarker status among patient subgroups. I argue that in many instances of its use, this strategy has low reliability because biomarker evidence lacks credibility. Lastly, in Chapter 5, I draw attention to African Traditional Medicine as a broad-based form of PM. I consider the rationale to decolonise medicine as it applies to African Traditional Medicine in South Africa. I argue that interpreting this rationale as many do—that African Traditional Medicine is epistemically equivalent to Western mainstream medicine—implies an untenable wholesale medical relativism.
  • Open Access
    Gauging a State: Excise Taxation, Practical Mathematics, and Cask Measuring in Seventeenth Century England
    Sechrist, Guy
    This thesis investigates the mathematical developments and the continued use of gauging instruments designed to calculate the contents of wooden barrels for the purposes of collecting excise revenue for the English state. This research situates the work of mathematical practitioners alongside the practices of excise officers, or gaugers, to examine how the calculation of the contents of barrels became a fundamental part of the English state’s financial development throughout the seventeenth century. By championing the ubiquity of wooden barrels and examining them within the context of England’s financial developments, the thesis provides remarkable insight into the connections underpinning the practical advancements that were made in the mathematical sciences, commercial activities, and state knowledge of the brewing trade and excise system. While the thesis argues that the theoretical knowledge of mathematics served to influence and develop the very practical methods excise officers used to gauge barrels, it was ultimately the state’s interest in taxation, via the ability to gauge said barrels, which drove efforts to develop and refine such practices throughout the century. Previous scholarship on the history of excise generally begins with the Excise Ordinance of 1643. The first three chapters of this thesis are case studies which trace the origins of barrel gauging in relation to the development of early mathematics, and the expanding brewing and wine trade prior to 1643. This research is necessary to highlight the steps that were needed before the English state could enact such an ordinance in the first place.
  • Open Access
    Framed and Framing Inquiry: Development and Defence of John Dewey's Theory of Knowledge
    Henne, Celine
    This thesis develops Dewey’s theory of inquiry and provides a novel perspective on what realists consider to be Dewey’s most controversial claims: his rejection of the view that inquiry aims at providing an accurate representation of reality, his claim that the object of knowledge is constructed, and his definition of truth in terms of warranted assertibility or fulfilment of the requirements of a problem. My strategy is to draw a gradual and relative distinction between what I call “framed” and “framing” inquiry. While the distinction is not explicitly present in Dewey’s works, it rests on Dewey’s functional distinction between existential and universal propositions. In a framed inquiry, the problem is covered by an existing conceptual framework, which is used to resolve the problem, without being revised. In a framing inquiry, the problem is underdetermined by existing conceptual frameworks, which are created, revised, or expanded. My general argument is that Dewey’s main contribution and controversial claims should be understood in the context of framing inquiry. In Chapter 1, I set the stage for the thesis. I paint the portrait of Dewey as the archetypal anti-realist; I present pragmatism as moving above this debate; and I present the specifically Deweyan brand of pragmatism. In Chapter 2, I introduce the distinction between framed and framing inquiry. In Chapter 3, I argue that realist notions of knowledge as representation, existence as independent facts, and truth as correspondence can be cast in terms of framed inquiry, while most realists mistakenly interpret these notions as “unframed.” In the next three chapters, I defend and develop Dewey’s views for framing inquiry. In Chapter 4, I argue that framing inquiry should be construed as articulating rather than representing reality. In Chapter 5, I maintain that Dewey’s view avoids idealism. In Chapter 6, I defend the pragmatist theory of truth as a standard for framing inquiry, by contrast with representational standards.