All platforms conduct content governance, through both humans and algorithms. Especially in times of growing danger from Corona-related disinformation and a polarised US election campaign, platforms have become more active in governing speech. The number of platform transparency reports and content governance disclosures – whether voluntary (as in the US) or legally mandated (as in Germany) – is growing. Platforms, however, do not usually include concrete examples in their reports. It is therefore very difficult for researchers to replicate deletion/non-deletion decisions and to check deleted content systematically against national law and platform rules. Both the data (the deleted content) and a taxonomy of the norms under which deletions occur have been unavailable. This paper, for the first time, provides a legal and terms-of-service taxonomy that allows the quality of individual content governance decisions in German and Austrian online forums to be evaluated. It enables other researchers, when they have access to deleted platform content, to code it systematically and thereby to better assess the extent to which content governance matches the legal and terms-of-service categories platforms claim to be implementing.