Mallory, F. (2021). Why is Generative Grammar Recursive? Erkenntnis, 1–15. doi:10.1007/s10670-021-00492-9
Mallory, F. (2021). A Spirit of Trust: A Reading of Hegel’s Phenomenology by Robert Brandom (Harvard University Press, 2019). Philosophy, 96(4), 675–682. doi:10.1017/S0031819121000206
Mallory, F. (2020). Linguistic types are capacity-individuated action-types. Inquiry, 63(9–10), 1123–1148.
Mallory, F. (2020). The Case Against Linguistic Palaeontology. Topoi, 40, 273–284. doi:10.1007/s11245-020-09691-5
Mallory, F. (2020). In Defence of the Reciprocal Turing Test. Minds and Machines, 30, 659–680. doi:10.1007/s11023-020-09552-5
Work in Progress
Mallory, F. (2023) What do large language models model? In Communicating with AI (eds. Rachel Sterken & Herman Cappelen) [forthcoming]
Mallory, F. (2022) Paper on communicating with chatbots [currently at R&R stage]
Mallory, F. (2022) Paper on Wittgenstein and language modelling [under review]
Mallory, F. What is the distributional hypothesis? What should it be? [Draft]
Mallory, F. Erotetic Injustice [Draft]
I also have some papers under review or revision on chatbots, semantic competence, and generative grammar. Email me for a draft of any of these and receive a complimentary picture of my cat.
At the moment, I am working on applying teleosemantics to neural language models (e.g., Word2Vec, ELMo) and examining how it can constrain methods of model interpretation and probing. Feel free to get in contact if this is the kind of thing you’re into. There’s an extremely rough draft of some of this work here. Here are some slides from a recent talk about this. This is connected to a longer-term project developing an appropriate epistemology for stochastic measuring devices, a category which includes deep neural networks and other computational systems often labelled ‘artificial intelligences’.
Bibliography for Computation in Generative Linguistics talk, 2022 [Here]
Zellig Harris: blog post on Harris’s alleged antirealism [Here]
Some ways to think about linguistic structure (for philosophers) [Here]
Some notes on different varieties of merge (made in grad school) [Here]
Dissertation (2019): A Pragmatist Interpretation of Generative Grammar
Generative grammar describes a device for recursively enumerating syntactic structures. Generative theorists deny that such a device operates in real time. Instead, it is claimed that the device characterises ‘competence’, not ‘performance’. On the surface, it looks as if a device that doesn’t run, a function that isn’t computed, isn’t empirically interesting. This thesis is an attempt to make sense of the idea that generative grammar describes computations that don’t actually happen in any temporal sense. I argue that, despite claims to the contrary, generative grammar does not characterise a function in the sense of Marr’s computational level of theorising. Instead, I argue that the function characterised is like the transition function of a Turing machine rather than any function which a Turing machine realises. In the process, the thesis discusses the philosophical context in which generative grammar developed and provides analyses of the roles played by concepts like recursion, computation, and function-in-intension.
Supervisors: Mark Textor, David Adger, Alex Clark