---
layout: single
title: "Self-Replication in NNs"
categories: research audio deep-learning anomaly-detection
excerpt: "Elaboration and journal article of the initial paper"
header:
  teaser: assets/figures/15_sr_journal_teaser.jpg
---
![Children Evolution](/assets/figures/15_sr_journal_children.jpg){:style="display:block; margin-left:auto; margin-right:auto"}
A key element of biological structures is self-replication. Neural networks are the prime structure used for the emergent construction of complex behavior in computers. We analyze how various network types lend themselves to self-replication. Backpropagation turns out to be the natural way to navigate the space of network weights and allows non-trivial self-replicators to arise naturally. We perform an in-depth analysis to show the self-replicators' robustness to noise. We then introduce artificial chemistry environments consisting of several neural networks and examine their emergent behavior. Extending the previous version of this work (Gabor et al., 2019), we provide an extensive analysis of the occurrence of fixpoint weight configurations within the weight space and an approximation of their respective attractor basins.
{% cite gabor2022self %}
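
To give a flavor of what self-application means here, the following is a minimal, hypothetical sketch (assuming PyTorch): a tiny network is trained by backpropagation to predict its own weight values from per-weight indices, so that it moves toward a fixpoint of self-application. The class name `SelfReplicator`, the scalar index encoding, and all hyperparameters are illustrative choices; the architecture and weight encoding used in the paper differ.

```python
# Sketch: backpropagation driving a network toward a self-replicating
# fixpoint, i.e. a weight configuration the network reproduces itself.
import torch
import torch.nn as nn

class SelfReplicator(nn.Module):  # hypothetical name, simplified architecture
    def __init__(self, hidden=8):
        super().__init__()
        # Input: a normalized index identifying one of the network's own weights.
        # Output: the predicted value of that weight.
        self.net = nn.Sequential(nn.Linear(1, hidden), nn.Tanh(), nn.Linear(hidden, 1))

    def forward(self, x):
        return self.net(x)

    def flat_weights(self):
        # Current own weights, detached so they act as fixed targets this step.
        return torch.cat([p.detach().flatten() for p in self.parameters()])

model = SelfReplicator()
opt = torch.optim.SGD(model.parameters(), lr=0.01)

for step in range(5000):
    targets = model.flat_weights()
    idx = torch.linspace(-1, 1, len(targets)).unsqueeze(1)  # one index per weight
    preds = model(idx).squeeze(1)
    loss = ((preds - targets) ** 2).mean()  # self-application error
    opt.zero_grad()
    loss.backward()
    opt.step()

print(float(loss))  # near zero once a fixpoint weight configuration is approached
```

At a fixpoint the self-application error vanishes, i.e. the network reproduces its own weight configuration; the article then asks how such fixpoints behave under weight noise and in populations of interacting networks.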
![Noise Levels](/assets/figures/15_noise_levels.jpg){:style="display:block; margin-left:auto; margin-right:auto"}