000K utf8
1100 2022$c2022-02-15
1500 eng
2051 10.3390/e24020278
3000 Mohseni, Mahdi
3010 Gast, Volker
3010 Redies, Christoph
4000 Approximate Entropy in Canonical and Non-Canonical Fiction$hMDPI [Mohseni, Mahdi]
4030 Basel$nMDPI
4060 16 pages
4209 : Computational textual aesthetics aims at studying observable differences between aesthetic categories of text. We use Approximate Entropy to measure the (un)predictability in two aesthetic text categories, i.e., canonical fiction (‘classics’) and non-canonical fiction (with lower prestige). Approximate Entropy is determined for series derived from sentence-length values and the distribution of part-of-speech-tags in windows of texts. For comparison, we also include a sample of non-fictional texts. Moreover, we use Shannon Entropy to estimate degrees of (un)predictability due to frequency distributions in the entire text. Our results show that the Approximate Entropy values can better differentiate canonical from non-canonical texts compared with Shannon Entropy, which is not true for the classification of fictional vs. expository prose. Canonical and non-canonical texts thus differ in sequential structure, while inter-genre differences are a matter of the overall distribution of local frequencies. We conclude that canonical fictional texts exhibit a higher degree of (sequential) unpredictability compared with non-canonical texts, corresponding to the popular assumption that they are more ‘demanding’ and ‘richer’. In using Approximate Entropy, we propose a new method for text classification in the context of computational textual aesthetics.
4950 https://doi.org/10.3390/e24020278$xR$3Volltext$534
4961 https://www.db-thueringen.de/receive/dbt_mods_00051544
5051 400
5550 Approximate Entropy
5550 canonical texts
5550 fictional texts
5550 non-canonical texts
5550 POS-tags
5550 Shannon Entropy
5550 text classification