My school Extended Project Qualification (EPQ): a paper on machine unlearning that does not answer its own title question.

Is There A Place For Approximate Unlearning In Production-Headed Neural Networks?

This is my first research paper, written as a school project. As an answer to its title question it falls short, since it does not cover ethics or experimental testing in anywhere near enough depth. It works somewhat better, though still vaguely, as a review of the kinds of machine unlearning available in 2025. For a stronger treatment of the technical side, please see Nguyen et al., 2025 (CC-BY-4.0, open access), whose preprint this paper cites regularly. I do think the ethics need more coverage, and I do not have an answer for that right now.

Bearing in mind the notice above, you can read the paper via the .pdf file in this repository, the .tex source also in this repository, or the webpage version.