Research

Most of my work at the moment involves looking at how artificial neural networks represent things. I’m also interested in understanding the ways that generative linguistics does or doesn’t integrate with the rest of cognitive science. And occasionally, I think about applications of sub-regular grammars across cognitive science.

Philosophy of ML/NLP

Mallory, F. (forthcoming) Language Models are Stochastic Measuring Devices, in Communicating with AI (eds. Rachel Sterken & Herman Cappelen), Oxford University Press

Mallory, F. (2026). Formats of representation in large language models. Philosophy and the Mind Sciences, 7. https://doi.org/10.33735/phimisci.2026.12091 

Mallory, F. (2023) Fictionalism about Chatbots, Ergo, vol. 10, no. 39 https://doi.org/10.3998/ergo.4668

Mallory, F. (2023) Wittgenstein, The Other, and Large Language Models, Filosofisk Supplement, 2(3), 79-87

Mallory, F. (2020) In Defence of the Reciprocal Turing Test, Minds and Machines, 30:659–680 https://doi.org/10.1007/s11023-020-09552-5

Philosophy of linguistics/language

Mallory, F. (forthcoming) The Philosophy of Online Speech, in Nefdt, R., Dupre, G., & Stanton, K. (eds.) The Oxford Handbook of the Philosophy of Linguistics, OUP

Mallory, F. (2025) Inquisitive injustice. Philosophical Studies 5 https://doi.org/10.1007/s11098-025-02428-3 Preprint

Mallory, F. (2024) Generative Linguistics and the Computational Level, Croatian Journal of Philosophy, Vol. XXIV, No. 71, 195-218 https://doi.org/10.52685/cjp.24.71.5

Mallory, F. (2023) Why is Generative Grammar Recursive? Erkenntnis 88, 3097–3111 https://doi.org/10.1007/s10670-021-00492-9

Mallory, F. (2021) The Case Against Linguistic Palaeontology, Topoi 40, 273–284 https://doi.org/10.1007/s11245-020-09691-5

Mallory, F. (2020) Linguistic types are capacity-individuated action-types, Inquiry, 63:9-10, 1123-1148 DOI: 10.1080/0020174X.2020.1772864

Misc philosophy

Mallory, F. (2021). Critical Notice: A Spirit of Trust: A Reading of Hegel’s Phenomenology by Robert Brandom (Harvard University Press, 2019). Philosophy, 96(4), 675-682 doi:10.1017/S0031819121000206

Collections edited

(With Eliot Michaelson) Special issue of Philosophy on online speech: Online Communication: Problems and Prospects, Volume 99, Special Issue 3, July 2024

Drafts under review/revision

Email me for drafts of papers on information theory and compression in natural language, and on dialectical materialism.

Outreach and other writings

Discussion on the European Day of Languages

Daily Nous Philosophers On discussion of ChatGPT

My Stanford Encyclopedia of Philosophy map; the code used to generate it is on my GitHub account

Zellig Harris: blog post on Harris’s alleged antirealism

Some ways to think about linguistic structure (for philosophers)

Some notes on different varieties of merge (made in grad school)

Some clunky code

This is a little program that goes through all the PDFs in a folder, extracts any highlighted text, and compiles the highlights into a single PDF, with the PDF names used as headings for each section. If, like me, you read a lot of PDFs, highlight what you take to be the important bits, and then promptly forget everything you have read, it might be helpful. It’s still a little buggy, though, and I intend to improve it.
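To give a sense of how such a script can work, here is a minimal sketch of the same idea (an illustration, not the program itself), assuming PyMuPDF for reading the highlight annotations and fpdf2 for writing the combined PDF; the folder name at the bottom is just an example:

# Sketch only: read highlight annotations with PyMuPDF, write a summary PDF with fpdf2.
from pathlib import Path

import fitz  # PyMuPDF
from fpdf import FPDF


def highlights_in(pdf_path):
    """Yield the text under every highlight annotation in one PDF."""
    doc = fitz.open(pdf_path)
    for page in doc:
        for annot in page.annots():
            if annot.type[0] != 8:  # 8 = highlight annotation
                continue
            pts = annot.vertices or []
            # A highlight stores one quad (4 points) per highlighted line,
            # so recover the text quad by quad and stitch it back together.
            pieces = []
            for i in range(0, len(pts), 4):
                rect = fitz.Quad(pts[i:i + 4]).rect
                pieces.append(page.get_textbox(rect).strip())
            yield " ".join(p for p in pieces if p)
    doc.close()


def compile_highlights(folder, out_path="highlights.pdf"):
    out = FPDF()
    out.set_auto_page_break(True, margin=15)
    out.add_page()
    for pdf_path in sorted(Path(folder).glob("*.pdf")):
        # Use the file name as a section heading.
        out.set_font("Helvetica", "B", 14)
        out.multi_cell(0, 8, pdf_path.stem)
        out.set_font("Helvetica", "", 11)
        for chunk in highlights_in(pdf_path):
            # The core fonts are Latin-1 only, so replace anything exotic.
            safe = chunk.encode("latin-1", "replace").decode("latin-1")
            out.multi_cell(0, 6, safe)
            out.ln(2)
    out.output(out_path)


if __name__ == "__main__":
    compile_highlights("papers")  # example folder name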

Here is a Jupyter notebook which should allow anyone (including people with no programming ability at all) to train a word2vec model on a bunch of text and then probe that model for a semantic subspace like this. I mean it when I say it shouldn’t take any programming ability: you just plug in the name of your document where indicated, the antonyms you want to use, and the list of words you want to inspect, then hit enter a few times.
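If you would rather run it as a plain script, here is a rough sketch of what the notebook does, using gensim; the file name, antonym pair, and word list below are placeholders, and the “semantic subspace” here is just the one-dimensional axis between the two antonym vectors:

# Sketch only: train word2vec with gensim and project words onto an antonym axis.
import numpy as np
from gensim.models import Word2Vec
from gensim.utils import simple_preprocess

TEXT_FILE = "my_document.txt"            # plug the name of your document in here
ANTONYMS = ("good", "bad")               # the pair that defines the axis
WORDS_TO_INSPECT = ["war", "peace", "money", "love"]

# Tokenise the text one line at a time.
with open(TEXT_FILE, encoding="utf-8") as f:
    sentences = [simple_preprocess(line) for line in f if line.strip()]

# Train a small word2vec model (hyperparameters are just sensible defaults).
model = Word2Vec(sentences, vector_size=100, window=5, min_count=2, workers=4)

# The axis runs from the second antonym to the first.
pos, neg = ANTONYMS
axis = model.wv[pos] - model.wv[neg]
axis /= np.linalg.norm(axis)

# Project each inspection word onto the axis: positive scores lean towards
# the first antonym, negative scores towards the second.
for word in WORDS_TO_INSPECT:
    if word not in model.wv:
        print(f"{word}: not in vocabulary")
        continue
    vec = model.wv[word]
    score = float(np.dot(vec / np.linalg.norm(vec), axis))
    print(f"{word}: {score:+.3f}")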

Dissertation (2019): a pragmatist interpretation of generative grammar

Theories of generative grammar describe a device for recursively enumerating syntactic structures. Generative theorists deny that such a device operates in real-time. Instead, it is claimed that the device characterises ‘competence’, not ‘performance’. On the surface, it looks as though a device that doesn’t run — a function that isn’t computed — is not empirically interesting. This thesis is an attempt to make sense of the idea that generative grammar describes computations that don’t actually happen in any temporal sense. I argue that, despite claims to the contrary, generative grammar does not characterise a function in the sense of Marr’s computational level of theorising. Instead, I argue that the function characterised is more like the transition function of a Turing machine than like any function implemented by the machine. In the process, the thesis discusses the philosophical context in which generative grammar developed, provides analyses of the roles played by concepts like recursion, computation, and function-in-intension, and discusses the impact of the trend toward the lexicalisation of syntactic information on how computation should be understood in theoretical syntax.
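(Not from the thesis, just a toy illustration of that last distinction: delta below is a machine’s transition function, a finite table over configurations, while the function the machine as a whole computes, unary successor in this case, is a different thing entirely.)

# Purely illustrative toy machine: delta maps (state, symbol) to (state, symbol, move).
delta = {
    ("scan", "1"): ("scan", "1", +1),   # move right over the input strokes
    ("scan", "_"): ("done", "1", 0),    # append one more stroke, then halt
}

def run(tape):
    """Run the machine on a unary input like '111' (three)."""
    tape = list(tape) + ["_"]
    state, head = "scan", 0
    while state != "done":
        state, symbol, move = delta[(state, tape[head])]
        tape[head] = symbol
        head += move
    return "".join(tape).rstrip("_")

# delta relates configurations to configurations; run(...) computes n + 1 in unary.
print(run("111"))  # '1111'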

Supervisors: Mark Textor, David Adger, Alex Clark