GlitchesAreLikeWildAnimalsInLatentSpace! BOVINE! — Karin + Shane Denson (2024)

BOVINE! (2024)
Karin & Shane Denson

Bovine! is a part of the GlitchesAreLikeWildAnimalsInLatentSpace! series of AI, generative video, and painting works. Inspired in equal parts by glitch-art vernaculars, the chronophotography of Eadweard Muybridge and Étienne-Jules Marey, the cut-up methods of Brion Gysin and William Burroughs, and generative practices from Oulipo to Brian Eno and beyond, our ongoing series GlitchesAreLikeWildAnimalsInLatentSpace! stages an encounter between human imagination and automated image-making.

The above video is a screen recording of a real-time, generative/combinatory video. There are currently two versions:

Bovine.app displays generative text over combinatory video, all composited in real time. It is mathematically possible, but vanishingly unlikely, that the same combination of image, sound, and text will ever be repeated.

Bovine-Video-Only.app removes the text and text-to-speech elements and features only the generative audio and video, assembled at random from five cut-up versions of a single video and composited together in real time.

The underlying video was generated in part with RunwayML (https://runwayml.com). Karin’s glitch paintings (https://karindenson.com) were used to train a model for image generation.

Karin Denson, Training Data (C-print, 36 x 24 in., 2024)

Prompting the model with terms like “Glitches are like wild animals” (a phrase she has been working with for years, originally found in an online glitch tutorial, now offline), and trying to avoid the usual suspects (lions, tigers, zebras), produced a glitchy cow, which Karin painted with acrylic on canvas:

Karin Denson, Bovine Form (acrylic on canvas, 36 x 24 in., 2024)

The painting was fed back into RunwayML as the seed for a video clip (using Gen-2 in spring/summer 2024), which was extended a number of times. The resulting video was glitched with databending methods (in Audacity). The soundtrack was produced by feeding a JPEG of the original cow painting into Audacity as raw data, interpreted with the GSM codec. After audio and video were assembled, the glitchy video was played back and captured with VLC and QuickTime, each of which interpreted the video differently. The two versions were composited together, revealing delays, hesitations, and a lack of synchronization.
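The databending itself was done by hand in Audacity (raw-data import, GSM codec), but the basic idea of treating image bytes as sound can be sketched in code. Below is a minimal, hypothetical analogue in TypeScript (Node), not the actual pipeline: it wraps a JPEG's bytes in a simple WAV header so any player reads them as 8-bit mono PCM; the file names and the 44100 Hz sample rate are placeholder assumptions.

```typescript
// databend.ts -- illustrative only: treat a JPEG's bytes as raw 8-bit PCM audio.
// The original piece used Audacity's raw-data import with the GSM codec;
// this sketch just demonstrates the general "image bytes as sound" idea.
import { readFileSync, writeFileSync } from "node:fs";

const SAMPLE_RATE = 44100; // assumed sample rate, not taken from the piece

function wavFromBytes(data: Buffer, sampleRate: number): Buffer {
  // Minimal 44-byte header for 8-bit, mono, unsigned PCM WAV.
  const header = Buffer.alloc(44);
  header.write("RIFF", 0);
  header.writeUInt32LE(36 + data.length, 4);
  header.write("WAVE", 8);
  header.write("fmt ", 12);
  header.writeUInt32LE(16, 16);        // fmt chunk size
  header.writeUInt16LE(1, 20);         // PCM format
  header.writeUInt16LE(1, 22);         // mono
  header.writeUInt32LE(sampleRate, 24);
  header.writeUInt32LE(sampleRate, 28); // byte rate: 1 byte per sample, mono
  header.writeUInt16LE(1, 32);          // block align
  header.writeUInt16LE(8, 34);          // bits per sample
  header.write("data", 36);
  header.writeUInt32LE(data.length, 40);
  return Buffer.concat([header, data]);
}

const jpeg = readFileSync("cow-painting.jpg"); // hypothetical input file
writeFileSync("cow-painting-databent.wav", wavFromBytes(jpeg, SAMPLE_RATE));
console.log(`Wrote ${jpeg.length} image bytes as ${(jpeg.length / SAMPLE_RATE).toFixed(1)}s of audio`);
```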

The full video was then cropped to produce five different strips. The audio of each strip was positioned in stereo space according to the strip's horizontal position (i.e., the leftmost strip has its audio panned all the way to the left, the next one over sits halfway between left and center, the middle one sits dead center, and so on). The Max app chooses each strip's playback start point at random from a set of predetermined points, keeping the overall image more or less in sync.
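The compositing itself happens in a Max 8 patch; purely to illustrate the logic just described, the sketch below spreads five strips evenly across the stereo field and picks each strip's start point at random from a shared set of predetermined values. The specific start times here are invented placeholders, not the patch's actual settings.

```typescript
// strips.ts -- illustrative sketch only; the real piece is a Max 8 patch and
// these numbers are assumptions, not the patch's actual values.

const NUM_STRIPS = 5;

// Spread the strips evenly across the stereo field, from hard left (-1)
// to hard right (+1): -1.0, -0.5, 0.0, +0.5, +1.0.
function panForStrip(index: number, count: number = NUM_STRIPS): number {
  return -1 + (2 * index) / (count - 1);
}

// Hypothetical set of predetermined start points (in seconds).
const START_POINTS = [0, 12.5, 25, 37.5, 50];

function randomStart(): number {
  return START_POINTS[Math.floor(Math.random() * START_POINTS.length)];
}

// Each strip draws its start point from the same small set, so the composite
// image stays more or less (but not exactly) in sync.
for (let i = 0; i < NUM_STRIPS; i++) {
  console.log(`strip ${i}: start ${randomStart()}s, pan ${panForStrip(i).toFixed(2)}`);
}
```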

Onscreen and spoken text is generated by a Markov model trained on Shane’s book Discorrelated Images (https://www.dukeupress.edu/discorrelated-images), the cover of which featured Karin’s original GlitchesAreLikeWildAnimals! painting.
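In the piece this is handled by RiTa's Markov tools running in p5js (see the setup notes below); purely as a stand-alone illustration of the underlying technique, here is a minimal word-level Markov generator in TypeScript. The corpus file name and the chain order are placeholder assumptions.

```typescript
// markov.ts -- minimal word-level Markov text generator, for illustration only.
// The piece itself uses RiTa (https://rednoise.org/rita/) inside p5js via Max.
import { readFileSync } from "node:fs";

type Chain = Map<string, string[]>;

// Build a chain of order n: each n-word prefix maps to the words observed after it.
function buildChain(text: string, n: number): Chain {
  const words = text.split(/\s+/).filter(Boolean);
  const chain: Chain = new Map();
  for (let i = 0; i + n < words.length; i++) {
    const key = words.slice(i, i + n).join(" ");
    const followers = chain.get(key) ?? [];
    followers.push(words[i + n]);
    chain.set(key, followers);
  }
  return chain;
}

function generate(chain: Chain, length: number): string {
  const keys = [...chain.keys()];
  let current = keys[Math.floor(Math.random() * keys.length)].split(" ");
  const out = [...current];
  for (let i = 0; i < length; i++) {
    const followers = chain.get(current.join(" "));
    if (!followers) break; // dead end: no observed continuation
    const next = followers[Math.floor(Math.random() * followers.length)];
    out.push(next);
    current = [...current.slice(1), next];
  }
  return out.join(" ");
}

// Placeholder corpus file; the piece trains on the text of Discorrelated Images.
const corpus = readFileSync("corpus.txt", "utf8");
const chain = buildChain(corpus, 2);
console.log(generate(chain, 60));
```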

Made with Max 8 (https://cycling74.com/products/max) on a 2023 Mac Studio (Mac14,14, 24-core Apple M2 Ultra, 64 GB RAM) running macOS Sonoma (14.6.1). Generative text is produced with Paweł Janicki's MaxAndP5js Bridge (https://www.paweljanicki.jp/projects_maxandp5js_en.html), which interfaces Max with the p5js (https://p5js.org) version of the RiTa tools for natural language and generative writing (https://rednoise.org/rita/). Jeremy Bernstein's shell external for Max, version 1.0b3 (https://github.com/jeremybernstein/shell/releases/tag/1.0b3), passes the text to the OS for text-to-speech.
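On macOS, the text-to-speech handoff performed by the shell external amounts to invoking the system's built-in say command; a rough script-level equivalent (not the Max patch itself) might look like this:

```typescript
// speak.ts -- rough equivalent of the patch's text-to-speech handoff:
// pass a line of generated text to macOS's built-in `say` command.
import { execFile } from "node:child_process";

function speak(text: string): void {
  execFile("say", [text], (err) => {
    if (err) console.error("text-to-speech failed:", err);
  });
}

speak("Glitches are like wild animals in latent space.");
```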

Karin Denson, Bovine Space (pentaptych, acrylic on canvas, each panel 12 x 36 in., total hanging size 64 x 36 in., 2024)

DEMO Video: Post-Cinema: 24fps@44100Hz

As Karin posted yesterday (and as I reblogged this morning), our collaborative artwork Post-Cinema: 24fps@44100Hz will be on display (and on sale) from January 15-23 at The Carrack Modern Art gallery in Durham, NC, as part of their annual Winter Community Show.

Exhibiting augmented reality pieces always brings a variety of challenges, including technical ones and, above all, the need to inform viewers about how to use the work. So, for this occasion, I've put together this brief demo video explaining the piece and how to view it. The video will be displayed on a digital picture frame mounted on the wall below the painting. Hopefully it will be eye-catching enough to attract passersby and will effectively communicate the essential information about the process and use of the work.

Making Mining Networking: Video Documentation

Above, some video documentation of the pieces included in Making Mining Networking, the exhibition that Karin and I have going on until September at Duke University. As I posted recently, the augmented reality platform we used to make the interactive components (Metaio) has been sold to Apple and will be going offline at the end of the year. All the more reason to document everything now — but until December 15 you can still try out the pieces yourself, either in person at the exhibition or on your own computer screen with a smart device (see the images here)!

The (generative, network-driven) music is from the project “Listen to Wikipedia,” by Hatnote — which seemed a perfect match for the theme of Making Mining Networking!

Making Mining Networking: Exhibition Extended, But Not For Long…


Making Mining Networking went on display on April 20, 2015 at The Edge, Duke University's digital workspace. The show was originally scheduled to run until mid-May, but it was extended several times, ultimately through September 2015. After that, a few of the pieces are slated to be shown in Fall 2015 in an exhibit organized by the online journal Hyperrhiz: New Media Cultures and the Digital Studies Center at Rutgers University-Camden.

In the meantime, however, it was reported in late May that the augmented reality platform we used to build the interactive components of our pieces had been sold – effectively putting an expiration date on our artworks. Metaio GmbH, makers of the popular Junaio AR browser and the underlying engine that allowed us to augment our QR-based paintings with videos, 3D objects, and HTML hyperlinks, were acquired by one of the biggest corporations in the business: Apple.

Even before the buyout was confirmed, rumors had started circulating after Metaio abruptly closed their community forums, cancelled their annual developers’ convention, and stopped selling their software and services. An ominous message went up on the Metaio website (metaio.com):

Metaio products and subscriptions are no longer available for purchase.

Downloads of your previous purchases will be available until December 15th, 2015, and active cloud subscriptions will be continued until expiration. Email support will continue until June 30th, 2015.

Thank you.

(No, thank you!) Lacking any explanation, users of the Metaio/Junaio AR platform were left to speculate about the future of their advertising campaigns, educational applications, and (as in our case) artworks.

Shortly thereafter, once the acquisition by Apple came to light, our worst fears were confirmed in the FAQ section on junaio.com. There we read:

Channel publishing to Junaio is no longer available. All existing channels will continue to be available until December 15th, 2015.

In other words, the pieces included in Making Mining Networking will no longer be functional at the end of the year. The QR codes painted on these canvases will no longer work; the pieces will then be flattened from the interactive physical/virtual assemblages they were designed to be and rendered into … paintings. Or worse, they will retain an executable dimension, albeit a non-operational one, and it will be supplemented by a weirdly representational dimension: effectively, these will then be paintings of 404 error codes.


Like all of the pieces, “The Magical Marx-Markov Manifesto Maker” is therefore destined to lose its magic; the QR code, when scanned with a smartphone, will lead the user’s browser into virtual nothingness. On the other hand, pieces like “The Gold Standard” and “Gnomecrafting” might still have something to tell us – precisely because their non-operationality will render visible the inevitable entanglement of proprietary platforms and obsolescent objects that is the material heart and soul of digital capital.


Making Mining Networking is (or was?) about probing the borders between the virtual and the physical – boundaries that are inscribed in stone (e.g. rare earths) as much as they are written in code. With our works, we have sought to invite users to experiment with this interface, opting for a playful approach to a space that we know is about deadly serious transactions (in the realms of capital and of the environment, to begin with). We installed our data gnomes at the physical/virtual border where they stood as talismans to ward off the bad spirits of digital capital – but we were never so naïve as to believe that they could really protect us for long. We still believe that we regained something of personal value by reclaiming our data from corporate mining and making something weird and inscrutable with it, but now a corporate transaction is about to render our productions invisible.


Again, however, it is the seeming totality of such corporate power to make things invisible – to make all that’s solid melt into zeroes and ones – that is paradoxically made visible at this juncture, where links are inoperative, QR codes are non-functional, and paintings are not just paintings but paintings of such failure.



Following our initial disappointment, then, we now eagerly await the appointed date, our “doomsday” of December 15, 2015 – when the truth of Making Mining Networking will be revealed. Will it be a simple 404 message, or can we hope for something else to make manifest the physical/virtual interface as it exists in our era of climate change, high-speed finance, and the biopolitical mining of all that breathes and lives? Only time will tell…

In the meantime, these works exist as reminders of the expiration date that is implicitly inscribed in all of our devices – and, potentially, our very planet – at the hands of global digital capital. Play with them, think with them, experiment with them – and await with us their obsolescence…
