Arnheim from DeepMind produces images with grammatical brushstrokes
Have you heard of Rudolf Arnheim? Born in Germany in 1904, Arnheim was an art and film theorist. In 1968 he was invited to join Harvard University as a professor of the psychology of art. His works, such as “Art and Visual Perception: A Psychology of the Creative Eye” (1954) and “Visual Thinking” (1969), study art through the prism of science and sensory perception.
As DeepMind explains in its blog, “Our computational creativity work uses computers as tools to generate visual art in a way that draws on Arnheim’s formalism.” The research lab took to Twitter to present Arnheim, a generative algorithm that produces images made with grammatical brushstrokes.
To this end, DeepMind offers two Colabs that let users easily invent their own generative architectures and paintings:
- Arnheim 1: The original algorithm from the paper Generative Art Using Neural Visual Grammars and Dual Encoders. Running on a single GPU, it can optimize any image using a genetic algorithm. It is much more general, but much slower, than Arnheim 2, which uses gradients.
- Arnheim 2: A reimplementation of the Arnheim 1 generative architecture in the CLIPDraw framework, which optimizes its parameters using gradients. It is much more efficient than Arnheim 1 above, but requires the image-generation process itself to be differentiable.
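To make the distinction concrete, the loop below is a minimal, illustrative sketch of the kind of gradient-free evolutionary optimization Arnheim 1 relies on. It is not DeepMind's code: the real system evolves neural visual grammars and scores rendered images against a text prompt with a CLIP dual encoder, whereas here the "genome" is a flat parameter vector and the fitness function is a hypothetical stand-in for that image–text similarity score.

```python
import random

TARGET = [0.2, -0.5, 0.9, 0.1]  # hypothetical "prompt embedding" to match

def fitness(genome):
    # Negative squared distance to the target: higher is better.
    # Stands in for the CLIP image-text similarity used by Arnheim.
    return -sum((g - t) ** 2 for g, t in zip(genome, TARGET))

def mutate(genome, rate=0.1, scale=0.2):
    # Perturb each gene with a small Gaussian nudge, with probability `rate`.
    return [g + random.gauss(0, scale) if random.random() < rate else g
            for g in genome]

def evolve(pop_size=50, generations=200, seed=0):
    random.seed(seed)
    population = [[random.uniform(-1, 1) for _ in TARGET]
                  for _ in range(pop_size)]
    for _ in range(generations):
        # Truncation selection: keep the fitter half, refill with
        # mutated copies of survivors (elitism keeps the best intact).
        population.sort(key=fitness, reverse=True)
        survivors = population[: pop_size // 2]
        population = survivors + [mutate(random.choice(survivors))
                                  for _ in range(pop_size - len(survivors))]
    return max(population, key=fitness)

best = evolve()
```

Because only fitness evaluations are needed, nothing in the rendering pipeline has to be differentiable; that generality is what the blurb above trades away in Arnheim 2, which replaces this search with direct gradient descent for speed.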
Those who want to play around with the code and change the way brushstrokes are generated can go here.
These two Colabs are examples of how AI can be used to augment human creativity by suggesting possible ways to form a representation.
Alva Noë defines art as the process of reorganizing experience, effectively a kind of visual philosophy. While we are still a long way from an algorithmic understanding of this deeply human process, these collaborations show that a minor aspect can, to some extent, be understood synthetically: namely, how decisions are made about which marks to order to effectively produce a representation. The team hopes others will modify these algorithms in fascinating ways.