We are happy to announce the next PROGRAMme spring workshop:
Future visions on computing and programming then and now.
It will be held on June 12 in Lille, at the Maison de la recherche, Salle F0.13, Campus Pont de Bois. The campus is easily reached from the Gare de Lille Flandres by taking metro line 2 in the direction “Quatre Cantons” and getting off at “Pont de Bois”.
The workshop is the second one organized in 2023 within the Lille-Paris Séminaire Histoire et Philosophie de l’Informatique et du Calcul (HEPIC), run by Alberto Naibo and Liesbeth De Mol and supported by UMR 8590 IHPST and UMR 8163 Savoirs, Textes, Langage.
More details about registration and the program can be found here.
The PROGRAMme autumn meeting was again a very intense and special week, with a workshop around programming languages and notations (Cluster 3 of the project) and a set of work meetings to shape and discuss not just a book but a collaborative work which I hope will continue after PROGRAMme ends. The dynamic of the group of researchers that has gathered for the second time in Bertinoro is quite special, and I would like to thank all participants for contributing to and shaping this a-disciplinary endeavor.
Here are slides of some of the talks from the workshop:
Here are some pictures! Thanks to Marie-José Durand-Richard, Baptiste Mélès, Elisabetta Mori, Dale Miller and Pierre Mounier-Kuhn for having shared some of their pictures.
How can philosophy of science contribute to basic issues in computing and programming, not just on the conceptual level but also on a more concrete level? Giuseppe Primiero, one of the PROGRAMme members, is one of the researchers moving across the different disciplinary divides to show how philosophy can really matter. In two recent papers, he considers two important issues.
In the first paper, titled Infringing Software Property Rights: Ontological, Methodological, and Ethical Questions, he considers the very basic ethical issue of software property rights infringement and argues that any discussion of property rights (legal or other) should carefully distinguish between three types of questions, viz. ontological, methodological and ethical ones. The paper is rooted in previous work by Primiero and his co-author on the ontological problem of software copies. Based on that, they now identify what they consider the most appropriate level of abstraction at which software should be legally protected. From that perspective, the current work connects insights from (Floridi’s) philosophy of information, and the central role played by so-called levels of abstraction (LoAs), with the fundamental problem of software property right infringement. The paper is available here:
In the second paper, titled A theory of change for prioritised resilient and evolvable software systems, the problem of maintaining and improving software systems so as to guarantee their reliability in view of change is attacked by providing a full formalization of three basic processes: the completion, correction and prioritization of specifications. This formalization is achieved by using an existing paradigm from the philosophy of science, viz. the Alchourrón-Gärdenfors-Makinson (AGM) paradigm for the belief revision of human epistemic states. The paper is available here.
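For readers less familiar with the AGM paradigm, here is a brief reminder of its three standard operations on a belief set K (this is the general textbook formulation, not the formalization used in the paper):

% The three standard AGM operations on a belief set K and an input sentence \varphi
% (general textbook formulation, not the paper's own formalization).
\begin{align*}
K + \varphi &\quad \text{(expansion: add } \varphi \text{ and close under logical consequence)}\\
K \div \varphi &\quad \text{(contraction: retract } \varphi \text{ while changing } K \text{ as little as possible)}\\
K \ast \varphi &= (K \div \lnot\varphi) + \varphi \quad \text{(revision, obtained from contraction via the Levi identity)}
\end{align*}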
The PROGRAMme spring meeting in Lille (5–7 June) was an exciting but intense meeting; we thank everyone who contributed to this!
Below you can find the slides for some of the talks as well as a link to the LilleTV channel where you can relive most of the talks and the discussions that followed:
It is my pleasure to announce that the next event on the history and philosophy of computer programs within the framework of the PROGRAMme project (https://programme.hypotheses.org/) will be held from June 5-7 2019 at the MESHS, 2 Rue des Canonniers, 59000 Lille, espace Baïetto and Salle 002.
On June 5 we will have the last session of the academic year of the Lille-Paris séminaire HEPIC (Histoire et Philosophie de l’informatique et du calcul) around the topic: Computing – a human activity? Speakers are:
13h-14h15: David Aubin (Université de Paris 6), Astronomical tables as work, 16th-18th centuries
14h15-14h30: pause
14h30-15h45: Stephen Kell (University of Kent), Software against humanity? An Illichian perspective on the industrial era of software
15h45-16h15: pause
16h15-17h30: Christiane Floyd (Technical University of Vienna), The move to activity-centered views of software development
From June 6-7 the Spring workshop 2019 of the PROGRAMme project is organized. It is the second in a series of four workshops, each focusing on one of the PROGRAMme project’s main clusters in connection to the other three. The second workshop will focus on the cluster Machines and so offers reflections on a set of historical, philosophical and epistemological questions on Computing Machines, broadly interpreted, in relation to the other three clusters (Logic; Programming languages and notations; Systems).
In order to register, please send an email with your affiliation to: liesbeth.demol@univ-lille3.fr *before May 20 2019*. Registration is free but required in order to attend.
What kind of epistemological assumptions are being made in the context of computer simulation, and to what extent are those epistemologies determined by domain-specific features? This question is tackled in the recently published paper:
Giuseppe Primiero, A Minimalist Epistemology for Agent‑Based Simulations in the Artificial Sciences, Minds and Machines 2019, https://doi.org/10.1007/s11023-019-09489-4
It is shown that within the specific context of the artificial sciences, where extensive use is made of agent-based and multi-agent system simulation, specific epistemological constraints are assumed, which means that those sciences rely on a domain-specific epistemology. Primiero proposes a minimally committed epistemology based on the philosophy of information and, building on that, a new definition of simulation for this specific domain.
I am very happy to announce the recruitment of Nick Wiggershaus as a PhD student who will work on the PROGRAMme project. Nick has a background in physics (B.Sc. 2015) and then transitioned to History and Philosophy of Science in Utrecht. Mainly engaging with the foundations of physics, he completed his research on the ontological status of information measures used in contemporary physics (M.Sc. 2018). Having investigated foundational issues in information theory, physics and computer science, his main focus in the project is now a historico-philosophical analysis of the ontology of computer programs. More specifically, he will explore how the seemingly abstract aspects of computer programs are related to concrete machines. We are of course very much looking forward to his research results and I am happy to welcome him to the PROGRAMme team.
How can one (efficiently) differentiate between malicious and non-malicious software, and how can this be formalized and automated? This is one of many fundamental problems within software engineering which is becoming more acute with society’s increasing reliance on cyber-physical systems. In fact, it is one of many problems in dire need of a deep conceptual and so philosophical analysis, not just for the mere pleasure of defining and classifying, but because such reflections help improve our software. One basic challenge within the security context, then, is to understand what malicious software is today (indeed, malware too can and should be seen as a historical object) and, so, whether one can identify different types of so-called malware. Existing proposals for taxonomies are too focused on specific details and hence system-specific; a generalizing approach is needed. This is the task that Primiero, Solheim and Spring set out for themselves in a recent paper:
“On malfunction, mechanism and malware classification.” Philosophy and Technology, 2018, https://doi.org/10.1007/s13347-018-0334-2, available here.
Starting out from an existing taxonomy, they propose a generalized framework, extending it to intended disruptions of normal software functionality. In a first analysis, the understanding of malware is functionalist, viz. in terms of the damage done (e.g. is the intent of the attack merely to monitor and possibly steal data, or is it disruptive?). This functionalist understanding is then connected with the use of mechanistic modeling strategies to clarify and, ultimately, facilitate malicious software classifications. Of course, the authors’ ambition is to propose this analysis as a first stepping stone towards provable and verifiable reasoning about malware. We are looking forward to the next steps in this endeavour.
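Purely as a hypothetical sketch of what a functionalist classification by observed effect could look like (the categories and rules below are invented for illustration and are not the taxonomy proposed in the paper), one might write something like:

# Hypothetical sketch of a functionalist classification: a sample is classified
# by the damage it does (its observed effects), not by its internal structure.
# The categories and rules are invented here and are not those of the paper.
def classify_by_effect(observed_effects):
    if "exfiltrates_data" in observed_effects:
        return "monitoring/stealing (spyware-like behaviour)"
    if "encrypts_user_files" in observed_effects or "halts_services" in observed_effects:
        return "disruptive (damages normal functionality)"
    return "no malicious effect observed"

print(classify_by_effect({"exfiltrates_data"}))      # monitoring/stealing ...
print(classify_by_effect({"encrypts_user_files"}))   # disruptive ...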
I am very happy to announce the publication of two books from two PROGRAMme members.
The first book is the long-awaited basic book on the philosophy of computer science by Ray Turner. Ray is one of the founders of the philosophy of computer science, both through his efforts within the IACAP community to establish a track on the philosophy of computer science and through his well-known SEP entry on the philosophy of computer science, which is now co-authored by Nicola Angius and still available here: https://plato.stanford.edu/entries/computer-science/.
Ray’s book is a great example of how insights from different fields are needed in order to develop a deeper reflection on the computer science field, one of value not just to philosophers but also to the computer science community itself. The book’s title is “Computational Artifacts: Towards a Philosophy of Computer Science”. It refers to Ray’s effort to connect the fundamental ontological question of computer science (what “are” the “things” of computer science?), which in the past has often given rise to a variety of fierce debates on the nature of the field, with insights from (analytical) philosophy of technology. This allows him to explain and argue for the dual nature of computational objects as having both structural and functional properties: they are technical artefacts. But the analytical framework offered by the philosophy of technology does not suffice to fully capture computational artefacts; insights from the philosophy of mathematics are also required. More specifically, whereas programs and computations are computational artefacts from the perspective of being physically implemented, the process of constructing a symbolic program from its specification is a rule-governed mathematical activity. Bridging the mathematical aspects of computational “things” with their technological side through the relevant philosophies is one of the great achievements of this book. A must-read not just for philosophers but, perhaps first of all, for computer scientists. A more detailed review will follow later.
The second book that appeared is by philosopher, software engineer and historian Mark Priestley, who also wrote the first serious book on the history of computer programs. His latest book, titled Routines of Substitution: John von Neumann’s Work on Software Development 1945-1948, engages deeply with the early years of computer programming and, more specifically, with the evolution of von Neumann’s work in this context from 1945 to 1948.
It starts out from a detailed analysis of the earliest known example of a routine written by von Neumann, viz. a routine to ‘mesh’ two sequences of data, intended to be part of the larger program known as mergesort. It then broadens its perspective, ultimately engaging with the fundamental and controversial topic of subroutines as discussed by von Neumann and Goldstine in the famous but often not well-read report on flow diagrams and subroutines, which is considered one of the first texts on computer programming. Rather than following popular myths, both positive and negative, Mark made the effort to reread well-known texts and to connect them with previously unknown or hardly known texts in order to offer a historical reading of von Neumann’s work placed in its proper context. This allows him to develop a more coherent and accurate account of von Neumann’s views on the planning and coding of problems. One basic outcome is the central role played by substitution, which directly connects von Neumann’s work on programming to a tradition in mathematical logic where the notion of substitution played a central role.
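As a purely illustrative aside (a modern counterpart, not a transcription of von Neumann’s 1945 code), ‘meshing’ two already sorted sequences corresponds to what we would nowadays write as a merge routine, the inner step of a mergesort:

def mesh(left, right):
    """Merge ('mesh') two already sorted sequences into one sorted list.

    A modern, illustrative counterpart to the routine discussed in the book;
    it is not von Neumann's code.
    """
    result = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            result.append(left[i])
            i += 1
        else:
            result.append(right[j])
            j += 1
    result.extend(left[i:])
    result.extend(right[j:])
    return result

# Example: meshing two sorted runs, as mergesort does repeatedly.
print(mesh([1, 4, 7], [2, 3, 9]))  # [1, 2, 3, 4, 7, 9]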
The autumn meeting in Bertinoro was a great and intense meeting. After the workshop, the members of the PROGRAMme group started their collaborations around the different PROGRAMme clusters and also initiated work on a PROGRAMme wiki. Below you can access the slides of some of the talks at the workshop “Formalisms at the interface with machines, languages and systems”.
Thanks to all participants for turning this into a great event! Thanks are also due to Simone Martini who not only proposed Bertinoro as a possible venue but also co-organized the event.
What is the actual effect of advances in mathematical logic from the 1920s and 1930s on the field of computing? A lot has been written, and even more has been said, about the supposed theoretical foundations of computing and its roots in the logical work of people like Church and Turing. While it is clear that there are important connections, both historical and epistemological, between these developments, it has been shown before that this does not mean that there is a clear one-directional line going from theory into practice (nor conversely), and that the relation between logic, computing machinery and programming practices is a complicated one, both historically (e.g. what is the significance of “Turing machines” for the development of the modern computer?) and philosophically (e.g. what connects computation with finite combinatory processes and how is this reflected in the control mechanisms for both types of “processes”?). This becomes even more significant in the light of the fragmented nature of the computing field today, which seems to lack a more coherent foundational framework.
One specific historical aspect of this issue is investigated in the paper:
L. De Mol, M. Bullynck and E.G. Daylight, Less is more in the fifties: encounters between logical minimalism and computer design during the fifties, IEEE Annals of the History of Computing, vol. 40, issue 1, pp. 19-45, 2018, a version is available here: http://hal.univ-lille3.fr/hal-01345592v3/document
It studies how certain developments in computer design from the 1950s can be embedded into what are called minimalist traditions in mathematical logic and engineering. Both traditions could be termed logical minimalism, meaning the systematic use of (mathematical) logic in designing minimal systems and devices. These forms of logical minimalism were recast into a diversity of computing practices in the 40s and 50s. The logical tradition is part of the more general research programme into the foundations of mathematics and logic that was carried out at the beginning of the 20th century. The engineering tradition emerged during the 1930s for the design of relay circuits and is part of a more general trend of using mathematical techniques in engineering. In the 1940s and 1950s, however, these traditions were redefined and appropriated when computer engineers, logicians and mathematicians started searching for the small(est) and/or simple(st) machines with an eye to engineering a small and relatively cheap digital computer. Of course, minimalism on one level does not imply overall simplicity, and nearly always these logically small machines came with tradeoffs, mostly more involved and complex programming and a need for more memory for efficient operation. The paper studies the search for small machines, both physically and logically, and ties it to these older traditions of logical minimalism. The focus is on how the transition of symbolic machines into real computers integrates minimalist philosophies as parts of more complex computer design strategies. It also shows how the name of Alan Turing became affiliated with the idea that solving computational problems with machines does not require a series of computing machines of ever higher complexity, but that a few operations suffice.
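To illustrate the closing idea that “a few operations suffice” (an illustration of my own, not taken from the paper): a machine with a single “subtract and branch if not positive” instruction is, in principle, enough to compute whatever a conventional machine can compute, but its programs become longer and more convoluted, precisely the kind of tradeoff mentioned above. A minimal interpreter for such a one-instruction machine might look as follows:

def run_subleq(mem, pc=0, max_steps=10_000):
    """Interpreter for a one-instruction (SUBLEQ) machine.

    Each instruction is three memory cells (a, b, c):
    mem[b] -= mem[a]; if the result is <= 0, jump to c, else fall through.
    A negative address halts the machine. Purely illustrative.
    """
    for _ in range(max_steps):
        a, b, c = mem[pc], mem[pc + 1], mem[pc + 2]
        if a < 0 or b < 0:
            break
        mem[b] -= mem[a]
        pc = c if mem[b] <= 0 else pc + 3
        if pc < 0:
            break
    return mem

# Tiny program: clear cell 9 (subtract it from itself), then halt.
memory = [9, 9, 3, -1, -1, -1, 0, 0, 0, 42]
print(run_subleq(memory)[9])  # 0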
What does it mean for two programs to be identical, and in what sense can one computational artefact be said to be a copy of another? This is the fundamental question tackled in the following paper, which is now on-line:
The philosophical problem of identity has a long history, which can be traced back to the work of people like Quine, Frege and Martin-Löf. In computing, the problem of identity concerns the question of when two programs are identical. The approach of this paper is to revisit the computational problem by (re-)connecting it to the more philosophical literature, studying to what extent the different identity criteria from philosophy can be re-applied to the problem in the computational setting. The result is the determination, and formal definition using process algebras, of several types of identity relations. The issues raised in this paper connect quite naturally to some other fundamental problems that came to the fore during the prelaunch roundtable of PROGRAMme at CNAM, such as questions of copyright, the relative difference between programs and algorithms, and issues of simulation (see the report). Looking forward to more results in this direction!
An earlier version of the paper is available here.
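To make the question concrete with a toy example (my own illustration, not the paper’s process-algebraic treatment): the two routines below are extensionally identical, they compute the same mathematical function, yet they are written differently and proceed differently, and once installed they would also exist as distinct physical copies. Which of these levels counts as “being the same program” is exactly what is at stake:

def factorial_recursive(n):
    """Compute n! by recursion."""
    return 1 if n == 0 else n * factorial_recursive(n - 1)

def factorial_iterative(n):
    """Compute n! by iteration."""
    result = 1
    for k in range(2, n + 1):
        result *= k
    return result

# Extensionally the same (same input-output behaviour) ...
assert all(factorial_recursive(n) == factorial_iterative(n) for n in range(10))
# ... but intensionally different: different source text, different procedure,
# different resource usage. Are they "the same program"?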
The launch of the PROGRAMme project was an intense first meeting with some great discussions on a variety of topics. The slides of the talks on Wednesday February 7 and Thursday February 8 are now available here:
PROGRAMme will have a regular column in the Dissemination Corner of The Reasoner, written by L. De Mol and G. Primiero. The first column is available here.
Welcome to the site of the ANR research project “What is a program? Historical and philosophical perspectives”. The aim of this project is to develop a coherent analysis and pluralistic understanding of “computer program” and its implications for theory and practice. On this site you can find all relevant information related to the project as well as other news, events and publications that are relevant to the project.
The site is currently under construction, given that the project will only start on February 1, 2018; a pre-launch event is already planned for October 20, 2017.