Hello

I’m an artist making zines and video. Maybe games one day. Maybe all that together. Currently obsessed with creating an immersive film for Apple Vision Pro. I’m documenting it all here. I also publish a zine about what I’m making.

  • Virtual Set v1

    I spent a little time over the weekend trying to bring The Phantom Moon to life. I created this in Blender, rendered out a still image, and then made a video out of it. This was recorded in my Apple Vision Pro.

  • Testing Spatial Photo and Video Embeds

    In visionOS 2.2 (I’m on the developer beta), you can now “tap to view spatial photos and videos embedded on web pages” in Safari.

    So if you’re viewing this in Safari on visionOS 2.2, you can select and hold the image, which brings up a menu where you can select "View Spatial Photo." To view the spatial video, tap it to start it playing and then tap the fullscreen button in the upper-left corner.

    The image and video are from a trip to The Morton Arboretum outside of Chicago.

  • I’ve got a Vision Pro app in beta

    A couple of weeks ago Anthony Maës released an open source immersive video player based on Mike Swanson’s work. It’s also published as an app that you can download on the Vision Pro. It’s now my go-to app for playing immersive MV-HEVC videos. I was hopeful I could use this to make an app for my work.

    For my purposes, I want an app that you can download and then just hit play to watch the thing I’ve made. Today that would be as close as I can get to the experience of watching Apple’s immersive videos in the TV app. So this weekend I set out to see if I could modify this player with a custom menu that only plays a video that I include with the app. Spoiler alert: I did it! Here’s roughly how:

    • I forked the OpenImmersive GitHub repository.
    • I opened the forked project in Xcode.
      • I spent some time trying to make things work by hard-coding a stream URL into that section of the app. That worked, but streaming a 115 Mbit/sec video file from my shared hosting account sucks even with fiber internet. Not to mention this could get expensive.
      • So I copied the file-picking code into ChatGPT and asked it to modify it to play a video file bundled with the app. It did, but it used AVKit, which is Apple’s standard video-playback framework. The issue with immersive video is that Apple doesn’t provide a framework for it, while this open source project is built on one (OpenImmersiveLib). So fixing this was a one-line change: I swapped out “import AVKit” for “import OpenImmersive” and it worked like a charm. (There’s a rough sketch of the bundled-video setup after this list.)
      • I deleted the menu parts I didn’t need.
      • I created a new app icon.
      • I learned how to exclude my giant video file from the GitHub repository.
    • I tested my app on my Vision Pro.
    • I committed all my changes to my forked repository on GitHub.
    • I followed these instructions for setting up my app on TestFlight.
    • And voilà – you can test it via TestFlight!
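
    If you want to do something similar, here’s a rough sketch of the bundled-video piece. It’s a sketch under assumptions, not the app’s actual code: “MyFilm.mov” is a made-up filename, and OpenImmersive’s player view and initializer vary by version, so the line where the player would go is just a placeholder. The only API I’m relying on for real is Bundle.main.url, which is how you locate a file you’ve added to the Xcode target.

      import SwiftUI
      import OpenImmersive // the open source library the player app is built on

      struct BundledPlayerView: View {
          // Look up the MV-HEVC file that ships inside the app bundle.
          // "MyFilm" / "mov" are placeholder names for whatever file you add to the target.
          private let bundledVideoURL = Bundle.main.url(forResource: "MyFilm", withExtension: "mov")

          var body: some View {
              if let url = bundledVideoURL {
                  // Hand this local file URL to OpenImmersive's player view here,
                  // in place of the URL the original file-picker menu produced.
                  // (Check the library's README for the exact view name and initializer.)
                  Text("Ready to play \(url.lastPathComponent)") // placeholder for the player view
              } else {
                  Text("Bundled video not found")
              }
          }
      }

    Keeping the giant video file out of the GitHub repository just takes a .gitignore entry matching the file’s name (or Git LFS if you want it versioned).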

  • CineD Podcast on Immersive Filmmaking

    Super informative interview. This covers all the stuff I’ve been learning on my own over the last 8 months.

  • Spatial Photo to Black & White Anaglyph

    The Camera app in iOS 18 has a “Spatial” mode on the iPhone 16 and 16 Pro, where you can switch between photos and video.

    I’ve always wanted to make a B&W anaglyph photo zine, and while running this morning I realized that I knew how to make this work for images taken with my new iPhone.

    • Select your images in the Photos desktop app and use Command+Option+E to export them all at once.
    • Open the Terminal and use Mike Swanson’s Spatial tool to split all the images into left and right views:

      for f in *.HEIC; do spatial export -i "$f" -o "Left_$f.png" -o "Right_$f.png"; done

    • Use Stereo Photo Maker to open sets of left and right images, and then use option+a to auto-align the images.
    • Then export your image from Stereo Photo Maker.
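
    For the curious, here’s roughly what that red/cyan combination amounts to, as a hypothetical Core Image sketch (it assumes the two eye images are already aligned; the alignment is the part Stereo Photo Maker actually handles for you): the left eye supplies the red channel and the right eye supplies green and blue.

      import CoreImage
      import CoreImage.CIFilterBuiltins

      // Hypothetical sketch: build a B&W red/cyan anaglyph from two
      // already-aligned eye images. Left eye -> red, right eye -> green + blue.
      func makeAnaglyph(left: CIImage, right: CIImage) -> CIImage? {
          // Desaturate an eye so we're working with luminance only.
          func grayscale(_ image: CIImage) -> CIImage {
              let f = CIFilter.colorControls()
              f.inputImage = image
              f.saturation = 0
              return f.outputImage ?? image
          }

          // Keep only the red channel of the left eye.
          let redOnly = CIFilter.colorMatrix()
          redOnly.inputImage = grayscale(left)
          redOnly.rVector = CIVector(x: 1, y: 0, z: 0, w: 0)
          redOnly.gVector = CIVector(x: 0, y: 0, z: 0, w: 0)
          redOnly.bVector = CIVector(x: 0, y: 0, z: 0, w: 0)

          // Keep only the green and blue channels of the right eye.
          let cyanOnly = CIFilter.colorMatrix()
          cyanOnly.inputImage = grayscale(right)
          cyanOnly.rVector = CIVector(x: 0, y: 0, z: 0, w: 0)
          cyanOnly.gVector = CIVector(x: 0, y: 1, z: 0, w: 0)
          cyanOnly.bVector = CIVector(x: 0, y: 0, z: 1, w: 0)

          // Add the two single-eye images back together.
          let combine = CIFilter.additionCompositing()
          combine.inputImage = redOnly.outputImage
          combine.backgroundImage = cyanOnly.outputImage
          return combine.outputImage
      }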

    Here’s one shot at my desk. I cropped this in Photoshop to see what it might look like in a 5.5″ × 8.5″ zine.

  • I love this idea

    It’s not a new idea but it’s new to me. I’d love to steal this and do something with film and/or performance.

  • 16K Upscale

    I’ve seen a few VR180 creators on Reddit talk about turning their 8K footage from the Canon R5C into 16K. The examples I’ve seen do look better, so I thought I’d try it out. I used Topaz Labs Video AI to upscale a 30-second clip that I shot while we were on vacation in Guerneville, CA a couple of weeks ago.

    It took about 26 hours (!) to upscale the video on my Mac Studio (M1 Max processor). I just used the default settings. I imagine I can get better results in the future. Here are crops from the original and upscaled video.

    Original 8K video
    Upscaled to 16K


  • Rough Prototype

    I spent the Labor Day weekend learning how to add text and 2D video to an immersive video scene. I’m doing this because I have a vision for a film/app for the Apple Vision Pro, and I figured I could (1) prototype a large part of that experience in video and (2) make immersive videos like this and learn how to tell stories in this medium.

    So to begin with, I came up with the requirements for my project.

    • I had to use existing video. I wasn’t going to waste time shooting new things or trying to have this video mean anything or stand on its own. I’m just working on learning the technical part.
    • I want to include text in the 3D space.
    • I want to include 2D video in the 3D space.

    There’s more I want to learn — immersive audio & 3D video in the 3D space — but I’ll save that for another time. Knowing how to add text and 2D video is most of what I need and it allows me to work on storytelling in parallel with the other technical skills. 


  • Progress

    I spent the weekend learning how to composite text and 2D video into a 180-degree 3D video. I mostly have it all working. Once I figure out a couple of issues I’ll write something up.

  • Photo Novels

    I got an email from someone who saw my La Jetée zine in Quimby’s in Chicago asking if I knew more about the genre. I don’t, but a little bit of searching led me to this presentation by SAW (Sequential Artists Workshop), which makes me really want to revisit this format.

Get my Newsletter and Free zine

I send out an occasional email newsletter and a few times a year I publish a printed zine. Be sure to add your mailing address when you sign up to get the zine.