Generative Art Tools

Sep 6, 2020

By Yura Miron

Yura Miron is a visionary artist exploring the inner and outer universes through new media art technologies such as VR, GANs, generative art, and blockchain. He is inspired by his visionary psychedelic experiences, lucid dreaming, and a constant awareness of his own presence. He works with themes such as visionary mystical experiences, the fusion of nature and technology, solarpunk, eco-speculation, science fiction, quantum physics and astrophysics, and micro- and molecular biology.

In this article I’d like to talk about some fascinating generative tools that any artist can download and experiment with. But first, I’d like to give you some of my art background.

I’ve been drawing all my life, and I decided to dedicate my life to art around 9 years ago, when I was 20. For the first 4 years I did nothing but draw, all day long, producing 3-4 finished drawings every day. I was learning fast, developing my own unique style and playing with compositions, colors, and techniques. 6 years ago I discovered a whole new fantastic world: new media art. I started 3D sculpting, VJing, and animating my drawings. I was very productive, posting new artworks on my social media blogs every day, and gained a lot of followers (100K+ on Tumblr alone). And still there was no way for me to make money from my digital art.

In 2015 I tried painting in VR for the first time, and it changed my life as an artist forever. Together with a friend I created a mixed reality lab where artists could explore VR and AR tech in their art practices, and between 2016 and 2018 I taught more than 1000 people how to create art, sculpture, animation, and design in VR. In 2016 Serge Synthkey and I formed a musical duo called SYNXRON. We played many live shows, mostly electronic dance music (techno/house/acid/modular/ambient/IDM), performed at the openings of some cool art exhibitions, and collaborated with dancers, choreographers, media artists, laserjays, and designers. I also worked as a laserjay with VJ Yarkus on many shows; laser light is something of absolutely magnificent nature. The main point of all those collaborative live shows was the real-time connection with the audience. With my whole setup working properly, I could generate something unexpected at any given moment and keep shaping the output the entire time. The mission of our performances was to create immersive audio-visual experiences for different people in different locations. And everybody watching my visuals or listening to my music was co-creating with me, because the audience influences my output in real time as well. So it’s more a controlled chaotic feedback loop than a predictable order.

In late 2018 I felt really burned out and decided to stop making music with Serge, stop teaching, and stop laserjaying. Between running laser shows, teaching VR, making an album, rehearsing for the next live shows over and over, and ongoing troubles in my personal relationship, I decided to leave it all behind, grateful for the experience, and move on. I needed to get back to my core, back to zero. I felt I needed to focus on my own art again, and I started creating “Pure Abstractions.” Then I suddenly discovered cryptoart. Blown away by its possibilities, I started searching for the main platforms where I could tokenize some of my best artworks. Luckily, I was accepted as an artist on all of them. I remember the day I saw ‘Latent Space of Landscape Paintings #1’ by @videodrome.

Latent Space of Landscape Paintings #1
Edition 1 of 1
Tracing the perimeter of a high dimensional sphere through the latent space of AI generated landscape paintings.

I was amazed by it! I started learning about GANs and how they work. I found Ganbreeder (now called Artbreeder, https://www.artbreeder.com/), and it was exactly what I’d been looking for. I would ganbreed for hours, just as I used to draw for hours. I felt this was something truly surreal and beautiful! I’ve generated thousands of artworks with this tool and created around 200 different animated loops (half of them already tokenized). I would breed static artworks, crossbreed between them, and save the results frame by frame. Later I learned to automate the process with a bit of Python (copy-pasted from instructions), before the animation editor was finally built into Artbreeder; I sketch the idea below. I remember putting up my very first GAN animation. It was ‘Agfom Potent-Shot’:

Agfom Potent-Shot
Edition 1 of 1

It completely blew my mind how close this artwork was to some of the visions I’ve had on magic mushrooms, and I kept exploring this (for me) unexplored territory.
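For the curious, here’s a minimal sketch of what that frame-by-frame automation amounts to: walk a straight line between two latent vectors and save a rendered frame at each step. The generate function here is a hypothetical stand-in for whatever model or API turns a latent vector into an image; Artbreeder’s actual interface is different.

```python
# A minimal sketch of latent-space interpolation for GAN animation.
# NOTE: generate(z) is a hypothetical stand-in for a GAN generator that
# maps a latent vector to a PIL image; Artbreeder's real interface differs.
import numpy as np

def render_transition(z_start, z_end, n_frames, generate):
    """Save frames along a straight line between two latent vectors."""
    for i in range(n_frames):
        t = i / (n_frames - 1)             # interpolation weight, 0.0 to 1.0
        z = (1 - t) * z_start + t * z_end  # linear blend of the two latents
        generate(z).save(f"frame_{i:04d}.png")  # frames for a video loop
```

Stitching the saved frames together with any video tool then gives the kind of morphing loop described above.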

I love being able to choose where the evolution of an artwork goes by selecting the best results, and I’ve been crossbreeding a lot between different pictures. It’s a wonderful tool for an artist; the artist becomes a curator. I imagine a factory of hundreds of artists all painting at once, with my role being to select the best pieces. Only a few super-wealthy artists can really hire hundreds of other artists in the real world, but with AI tools any artist can now do that virtually.

Now I’d like to talk about some other fascinating generative tools that I’ve been exploring lately.

The first one is the Physarum simulation by Sage Jenson. A true cosmic generative experience. A god-like feeling :)

Here’s how he describes it himself: “This February I spent a bit of time simulating slime mold (Physarum polycephalum). I saw some incredible posts by Georgios Cherouvim with reference to a 2010 paper by Jeff Jones, “Characteristics of pattern formation and evolution in approximations of Physarum transport networks.” When I read the paper, I was excited to learn that the model combined continuum and agent-based simulation systems in a way that I hadn’t seen before. In this post I try to describe some of the concepts behind the system, and give an overview of how it functions.

There are two requirements of efficiency in foraging behavior: 1.) to search a maximal area and 2.) to optimize transport distance. Physarum polycephalum is a unicellular multinucleate organism that excels at these two competing tasks through the mechanisms of growth, movement, and area reduction. When the organism can choose to travel through two different paths to a destination, the emergent behavior allows it to effectively find shortest paths. This allows Physarum to navigate mazes, develop optimal road-like systems and solve other path-finding problems.

The model postulated by Jones employs both an agent-based layer (the data map) and a continuum-based layer (the trail map). The data map consists of many particles, while the trail map consists of a 2D grid of intensities (similar to a pixel-based image). The data and trail map in turn affect each other; the particles of the data map deposit material onto the trail map, while those same particles sense values from the trail map in order to determine aspects of their locomotion.

Each particle in the simulation has a heading angle, a location, and three sensors (front left, front, front right). The sensor readings affect the heading of the particle, causing it to rotate left or right (or stay facing the same direction). The trail map undergoes a diffusion and decay process every simulation step. A simple 3-by-3 mean filter is applied to simulate diffusion of the particle trail, and then a multiplicative decay factor is applied to simulate trail dissipation over time. The diagram below describes the six sub-steps of a simulation tick.

Many of the parameters of this simulation are configurable, including sensor distance, sensor size, sensor angle, step size, rotation angle, deposition amount, decay factor, deposit size, diffuse size, etc. For a more detailed description check out the original paper.”
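Sage’s implementation isn’t public, but the quoted description maps quite directly onto code. Below is my own toy NumPy sketch of the Jones model as described above; the steering rule is simplified, and every parameter value is an illustrative guess, not Sage’s or Jones’ actual numbers.

```python
# A toy reimplementation of the Jones Physarum model described above.
# All parameter values are illustrative guesses.
import numpy as np
from scipy.ndimage import uniform_filter

W = H = 256                        # trail map size
N = 10_000                         # number of particles
SENSOR_ANGLE = np.deg2rad(22.5)    # offset of the left/right sensors
SENSOR_DIST = 9.0                  # how far ahead the sensors sample
TURN_ANGLE = np.deg2rad(45.0)      # rotation per step
STEP = 1.0                         # move distance per step
DEPOSIT = 5.0                      # trail deposited by each particle
DECAY = 0.9                        # multiplicative trail decay per tick

trail = np.zeros((H, W), dtype=np.float32)       # continuum layer (trail map)
pos = np.random.rand(N, 2) * np.array([W, H])    # agent layer: positions (x, y)
heading = np.random.rand(N) * 2 * np.pi          # agent layer: heading angles

def sense(pos, angle):
    """Sample the trail map at a sensor offset from each particle."""
    sx = ((pos[:, 0] + SENSOR_DIST * np.cos(angle)) % W).astype(int)
    sy = ((pos[:, 1] + SENSOR_DIST * np.sin(angle)) % H).astype(int)
    return trail[sy, sx]

for tick in range(1000):
    # Sense: front-left, front, and front-right readings steer each particle.
    left = sense(pos, heading - SENSOR_ANGLE)
    front = sense(pos, heading)
    right = sense(pos, heading + SENSOR_ANGLE)
    # Rotate (simplified rule): turn toward the strongest reading.
    turn_left = (left > front) & (left > right)
    turn_right = (right > front) & (right > left)
    heading = heading - TURN_ANGLE * turn_left + TURN_ANGLE * turn_right
    # Move each particle one step along its heading (toroidal wrap-around).
    pos[:, 0] = (pos[:, 0] + STEP * np.cos(heading)) % W
    pos[:, 1] = (pos[:, 1] + STEP * np.sin(heading)) % H
    # Deposit trail at each particle's new position.
    np.add.at(trail, (pos[:, 1].astype(int), pos[:, 0].astype(int)), DEPOSIT)
    # Diffuse (3x3 mean filter) and decay the trail map.
    trail = uniform_filter(trail, size=3) * DECAY
```

Rendering the trail map as a grayscale image every few ticks already shows the vein-like transport networks; the feedback between deposit and sensing is what makes the patterns emerge.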

He implemented the model in C++ and GLSL using openFrameworks. All of the computation and rendering happen on the GPU, which stores both the particle information and the trail map. The simulation runs in real time on a GTX 1070, with most of the examples comprising between 5 and 10 million particles.

https://sagejenson.com/physarum

I asked Sage if I could experiment with his amazing tool but never received an answer, so I searched for other software that could generate Physarum simulations and found one created by nicoptere on GitHub: https://github.com/nicoptere/physarum

Here’s what I’ve generated with it:

Another piece of software I’ve been enjoying lately is Egregore. “Egregore – source” is an adaptation of the software used by chdh for the performance egregore in 2011-2014. It is built on five different audiovisual instruments made of chaotic and physical-modeling algorithms that you can control directly. It is available as a download or on a USB stick. Download the software here: http://www.chdh.net/egregore_source.php

Here are my experiments with it from today:

The next tool I’d like to discuss is ‘Cosmic Sugar’, a simple but elegant GPU-driven simulation space. The VR controllers become attractors or repulsors, which lets you craft nebulae. A very beautiful VR tool: I’d imagined that one day I’d be able to generate millions of particles with my hands in VR, in real time, and here’s this tool, exactly as I wanted it to be. Highly recommended.
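As an aside, here’s my rough guess at the kind of update rule behind attractor/repulsor particle tools like this one. It is not Cosmic Sugar’s actual code, and all constants are made up: each particle is pulled toward (or pushed away from) a control point with a force that falls off with distance.

```python
# A toy 2D attractor/repulsor particle step: a guess at the general
# mechanism behind tools like Cosmic Sugar, not its actual code.
import numpy as np

N = 100_000
pos = np.random.randn(N, 2).astype(np.float32)   # particle positions
vel = np.zeros((N, 2), dtype=np.float32)         # particle velocities

def step(control_point, strength, dt=0.016):
    """Pull (strength > 0) or push (strength < 0) particles around a point."""
    global pos, vel
    delta = control_point - pos
    dist2 = np.sum(delta * delta, axis=1, keepdims=True) + 1e-4
    vel += strength * delta / dist2 * dt         # force falls off with distance
    vel *= 0.99                                  # damping keeps the swarm stable
    pos += vel * dt

step(np.array([0.0, 0.0], dtype=np.float32), strength=5.0)  # attract to origin
```

In VR the control point would simply track the controller, with the trigger flipping the sign of the strength.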

Here’s me playing with it for the first time yesterday:

http://cosmicsugarvr.com/

 

The next tool is called Vsynth, a modular virtual video synthesizer and image-processor package for the Cycling ’74 Max environment. Vsynth was created by Kevin Kripper, a new media artist, indie developer, and teacher based in Buenos Aires.

I’ve created 3 music videos with it for my musician friends Luna-9. Here’s one of them:

Here’s a link to his Patreon, where you can download the software and many cool patches by subscribing for only $5/mo: https://www.patreon.com/vsynth

The next tool is perfect for VJing (my passion for the last 6 years; I’ve VJed at many parties, festivals, and raves): Microdose VR, a truly fantastic piece of software for live performances and music videos.

 

Microdose VR combines art, music, and dance into a real-time generated creative virtual reality experience. It is created by Vision Agency, a VR-centric studio based in Colorado, USA, founded by Android Jones, Anson Phong, Scott Hedstrom, and Evan Bluetech.

Microdose VR is currently in beta development and only in use by a small circle of testers. The main way to try it is to find the team at an event they’re attending. If you have an HTC Vive or Oculus Rift with Touch controllers and would like to obtain Microdose VR, you can get on the beta-test list by sending a private message to their Facebook page (https://www.facebook.com/microdoseVR/) requesting to join the beta test group, or by doing the same on Discord (that’s how I got my beta-tester key): https://discord.gg/t43TsSp

This is my upcoming artwork for Cryptograph (https://cryptograph.co). All of my proceeds (70%) will be donated to the “Many Hopes” charity (https://www.manyhopes.org). Created using Microdose VR:

I’m going to tokenize many new artworks created with these amazing generative tools.

I hope you’ve enjoyed my article and learned something new and valuable. Thank you!
