This post examines the relationship between deep learning and topology, noting that while topology is often invoked in theoretical discussions, practical advances in deep learning have come predominantly from empirical methods rather than topological principles. Many commenters argue that geometry, not topology, is the more relevant lens, since the geometric properties of data strongly influence neural network training and architecture design. The discussion also touches on how network representations evolve during training, and several commenters express hope for a deeper understanding of what algebraic topology can reveal about neural network efficiency and performance.