{"id":22441,"date":"2023-08-12T18:08:48","date_gmt":"2023-08-12T16:08:48","guid":{"rendered":"https:\/\/www.wjst.de\/blog\/?p=22441"},"modified":"2023-08-12T18:10:28","modified_gmt":"2023-08-12T16:10:28","slug":"ai-perpetuating-nonsense-the-mad-disorder","status":"publish","type":"post","link":"https:\/\/www.wjst.de\/blog\/sciencesurf\/2023\/08\/ai-perpetuating-nonsense-the-mad-disorder\/","title":{"rendered":"AI perpetuating nonsense &#8211; the MAD disorder"},"content":{"rendered":"<p>Petapixel had an interesting <a href=\"https:\/\/petapixel.com\/2023\/08\/11\/ai-trained-on-ai-images-produces-terrible-results-study-finds\/\">news item<\/a> <a href=\"https:\/\/www.cl.cam.ac.uk\/~is410\/Papers\/dementia_arxiv.pdf\">leading to a paper<\/a> that shows what happens when AI models are trained on AI-generated images.<\/p>\n<blockquote><p>The research team named this AI condition Model Autophagy Disorder, or MAD for short. Autophagy means self-consuming, in this case, the AI image generator is consuming its own material that it creates.<\/p><\/blockquote>\n<p>More seriously, in more technical terms:<\/p>\n<blockquote><p>What happens as we train new generative models on data that is in part generated by previous models? 
We show that generative models lose information about the true distribution, with the model collapsing to the mean representation of data<\/p><\/blockquote>\n<p>Since the training data will soon also include AI-generated content &#8211; simply because nobody can tell human and AI content apart anymore &#8211; we will soon see MAD results everywhere.<\/p>","protected":false},"excerpt":{"rendered":"<p>Petapixel had an interesting news feed leading to a paper that shows what happens when AI models are trained on AI generated images The research team named this AI condition Model Autophagy Disorder, or MAD for short. Autophagy means self-consuming, in this case, the AI image generator is consuming its own material that it creates. &hellip; <a href=\"https:\/\/www.wjst.de\/blog\/sciencesurf\/2023\/08\/ai-perpetuating-nonsense-the-mad-disorder\/\" class=\"more-link\">Continue reading <span class=\"screen-reader-text\">AI perpetuating nonsense &#8211; the MAD disorder<\/span> <span 
class=\"meta-nav\">&rarr;<\/span><\/a><\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"closed","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[9],"tags":[3358],"class_list":["post-22441","post","type-post","status-publish","format-standard","hentry","category-computer-software","tag-ai"],"_links":{"self":[{"href":"https:\/\/www.wjst.de\/blog\/wp-json\/wp\/v2\/posts\/22441","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.wjst.de\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.wjst.de\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.wjst.de\/blog\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/www.wjst.de\/blog\/wp-json\/wp\/v2\/comments?post=22441"}],"version-history":[{"count":3,"href":"https:\/\/www.wjst.de\/blog\/wp-json\/wp\/v2\/posts\/22441\/revisions"}],"predecessor-version":[{"id":22445,"href":"https:\/\/www.wjst.de\/blog\/wp-json\/wp\/v2\/posts\/22441\/revisions\/22445"}],"wp:attachment":[{"href":"https:\/\/www.wjst.de\/blog\/wp-json\/wp\/v2\/media?parent=22441"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.wjst.de\/blog\/wp-json\/wp\/v2\/categories?post=22441"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.wjst.de\/blog\/wp-json\/wp\/v2\/tags?post=22441"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}