Non-Euclidean Therapy for AI Trauma 〘 Analog Archives 〙 | SoME3 | somepi
Video link: http://youtube.com/watch?v=FQ9l4v7zB3I
PATIENT ALICE: An artificial intelligence suffering from hallucinations of a lost puppet show. These hallucinations need to be erased.
GENERATIVE MODEL TYPE: Diffusion-based.
PRESCRIBED TREATMENT: A latent space editing method that involves the pullback, the Jacobian matrix, eigenfaces, and SVD.
-------------------------
2. NOTE: read the Errata (section 4) below or in the pinned comment for a few important corrections to the explanation.
Subtitles (CC) available.
Math-only version (suitable for all):
[MATH ONLY] Non-Euclidean Therapy for...
This video experiments with a different approach, combining a science-fiction mystery story told through analog VHS effects with an explanation of a very recent paper about latent space editing in diffusion models. This is my only entry to #some3.
Link to the paper this was based on:
Unsupervised Discovery of Semantic Latent Directions in Diffusion Models
https://arxiv.org/abs/2302.12469
-------------------------
3. The following are chapter timestamps. They can be used to skip to math-only or non-math sections.
Timestamps: [spoilers ahead]
00:00 - Patient Introduction -- [No Math]
02:08 - Manifolds and Pushforwards
06:38 - The Three Functions
09:55 - The Lost Show -- [No Math]
12:32 - Diffusion Models and the U-Net
14:13 - Matrix Multiplication and the Change of Basis Neurons
20:01 - The Jacobian Matrix
26:17 - The Pullback and the Dot Product
28:42 - A Treat Before Treatment -- [No Math]
31:09 - The Treatment -- [No Math]
36:13 - Finding the Error
39:31 - Correlations in Matrices
42:13 - Superposition
45:23 - W^T W
47:24 - Eigenvectors of W^T W
49:38 - The Trauma -- [No Math]
51:43 - Singular Value Decomposition
54:30 - Reunion -- [No Math]
Feel free to watch just the sections you want to learn more about.
-------------------------
4. ERRATA
4.1.
This video refers to needing to find singular vectors to compute the Jacobian, but that's poorly worded; it actually means calculations involving the Jacobian, such as forming J^T * J to compute the SVD and get editable directions. The Jacobian matrix, as stated in the paper, is computed via a sum-pooled feature map of the bottleneck representation H, to reduce the number of parameters to compute.
4.2. This video states that the SVD is used to compute both the Jacobian and the semantic directions. In the paper, however, the SVD is only used to obtain the semantic directions. To compute the SVD, section 3.2 of a subsequent paper uses the Power Method, which finds the singular vectors without ever computing (M^T)(M). An updated version will fix this by stating that dimensionality reduction should be performed on J before multiplying with it; this was not done in the paper, so the video takes a different approach for the sake of explaining dimensionality reduction. This resolves the circular issue of needing the Jacobian J to compute J^T J to get the eigenvectors used to compute J.
Link to this paper, Understanding the Latent Space of Diffusion Models through the Lens of Riemannian Geometry:
https://arxiv.org/abs/2307.12868
4.3. Other fixes (for the future update):
37m35s: This could be misleading; singular vectors are neglected, not dropped, so the matrix keeps the same dimensions but has reduced rank (see: https://stats.stackexchange.com/quest...)
46m: this transpose should swap the elements to row a c and row b d
53m20s: this should say "square root of the eigenvalues"
55m55s: this is an audio glitch; it should be "you weren't such a"
4.4.
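To make the corrections in 4.2 and 4.3 concrete, here is a small numpy sketch (my own illustration, not code from either paper) using a random matrix as a stand-in for the Jacobian J: the right singular vectors of J are exactly the eigenvectors of the pullback metric J^T J, and the singular values are the square roots of its eigenvalues.

```python
import numpy as np

# Random stand-in for the Jacobian J (the real J would come from the
# U-Net bottleneck map; the shape here is arbitrary).
rng = np.random.default_rng(0)
J = rng.standard_normal((8, 5))

# SVD of J directly: the rows of Vt are the right singular vectors,
# i.e. the candidate editable directions in latent space.
U, S, Vt = np.linalg.svd(J, full_matrices=False)

# Eigendecomposition of the pullback metric J^T J.
eigvals, eigvecs = np.linalg.eigh(J.T @ J)
eigvals, eigvecs = eigvals[::-1], eigvecs[:, ::-1]  # eigh sorts ascending

# Erratum 4.3 (53m20s): singular values are the *square roots*
# of the eigenvalues of J^T J.
print(np.allclose(np.sqrt(eigvals), S))  # True
# Same directions, up to sign.
print(np.allclose(np.abs(np.sum(eigvecs * Vt.T, axis=0)), 1.0))  # True
```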
Subtitle additions:
37m: added - "it's too many for them to calculate" (for finding J^T * J for the pullback metric)
54m15s: added - "There's algorithms I still need" (to get the eigenvectors without calculating M^T M)
In the future, an updated version with improved narration and visuals (such as the patient voice during the latter half) may be uploaded. This video was not re-uploaded with the fixes because 8/18 was the deadline for #SoME3.
---
A behind-the-scenes video going into more depth about the paper this was based on may be made soon. It will address some issues not explained (due to time) in the video.
-------------------------
5. Helpful Resources:
SVD Visualized, Singular Value Decomposition explained
12.4.2 The Power Method
--
Prequels / supplementary videos:
Why do Neural Networks use Linear Algebra? || The Visual Intuition of Cat Mathematics
THE AI AMONG US in your Non-Euclidean Mind 〘 Analog VHS Infomercial 〙
---
More References:
https://transformer-circuits.pub/2022...
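As a companion to the Power Method resource above (and the fix in erratum 4.2), a minimal numpy sketch on a random test matrix: the leading singular pair is estimated by alternating products with M and M^T, so M^T M is never formed explicitly.

```python
import numpy as np

def top_singular_pair(M, iters=500, seed=0):
    """Power iteration on M^T M using two matrix-vector products
    per step, without ever materializing M^T M."""
    rng = np.random.default_rng(seed)
    v = rng.standard_normal(M.shape[1])
    v /= np.linalg.norm(v)
    for _ in range(iters):
        v = M.T @ (M @ v)        # only matvecs with M and M^T
        v /= np.linalg.norm(v)
    sigma = np.linalg.norm(M @ v)  # leading singular value
    return sigma, v

# Check against a full SVD on a small random matrix.
M = np.random.default_rng(1).standard_normal((6, 4))
sigma, v = top_singular_pair(M)
_, S, Vt = np.linalg.svd(M)
print(np.isclose(sigma, S[0]), np.isclose(abs(v @ Vt[0]), 1.0))
```

Because only matvecs are needed, the same structure works when M is a Jacobian that is only accessible through forward/backward (jvp/vjp) products rather than as an explicit matrix.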