After many iterations I'm currently working mainly in 2D video processing environments, Resolume Avenue and TouchDesigner. The links here are inspiring, thanks for posting.
Every time I've plugged a modern projector into a laptop at a presentation it's been stressful, like rolling the dice on whether the screen will ever come up. What kind of projector, calibration, and preparation did it take to project live hi-res SGI video onto the screen above the band?
https://en.wikipedia.org/wiki/Talaria_projector
>RGB color separation and processing is obtained using vertical wobbulation of the electron beam on the oil film to modulate the green channel and sawtooth modulation is added to the horizontal sweep to separate and modulate Red and Blue channels. The optical system used in the Talaria line is a Schlieren optic like an Eidophor, but the color extraction is much more complex.
https://en.wikipedia.org/wiki/Wobbulator
>A wobbulator is an electronic device primarily used for the alignment of receiver or transmitter intermediate frequency strips. It is usually used in conjunction with an oscilloscope, to enable a visual representation of a receiver's passband to be seen, hence simplifying alignment; it was used to tune early consumer AM radios. The term "wobbulator" is a portmanteau of wobble and oscillator. A "wobbulator" (without capitalization) is a generic term for the swept-output RF oscillator described above, a frequency-modulated oscillator, also called a "sweep generator" by most professional electronics engineers and technicians.[1] A wobbulator was used in some old microwave signal generators to create what amounted to frequency modulation. It physically altered the size of the klystron cavity, therefore changing the frequency.
Samwell & Hutton Ruggedized CT501 Wobbulator (1968) NSN: 6625-99-620-2403
https://www.ebay.com/itm/267012403603?_skw=wobbulator&itmmet...
https://en.wikipedia.org/wiki/Infrared_Roses
Infrared Roses is a live compilation album by the Grateful Dead. It is a conglomeration of their famous improvisational segments "Drums" and "Space". The ElectroPaint stuff begins around 11:00, but the Raster Masters did all kinds of different stuff in parallel and mixed it all together in real time. I remember them describing some "recursive texture map" feedback too, which only ran on high-end SGI workstations.
https://www.youtube.com/watch?v=gkhr23asO-M
Wired: Raster Masters: Enough with virtual reality -- virtual hallucinations?
https://www.wired.com/1994/06/raster-masters/
Electropaint on SGI Indy: A capture of the great screensaver electropaint on an SGI Indy. There is no sound (originally I had Mahavishnu Orchestra's "Miles Beyond" but youtube flagged me for it), but feel free to blast your own music while watching:
https://www.youtube.com/watch?v=StA81MNuqB8
6 minutes of ElectroPaint:
https://www.youtube.com/watch?v=ObdtoLuSaWM
SGI IRIX ElectroPaint Screen Saver:
https://www.youtube.com/watch?v=gbWpsrNYfaQ
Panel Library and ElectroPaint source code:
http://66.111.2.18/pub/The_Unix_Archive/Unix_Usenet/comp.sys...
Some of David's more recent stuff:
https://www.facebook.com/groups/106720642819222/posts/222074...
>In the spirit of J-Walt's intro message, I'm David Tristram, somewhat of a pioneer in the use of real-time graphics for live performance. Author of Electropaint and Electroslate live performance instruments, and founding member of Raster Masters. Toured with Grateful Dead, developed performance system for Graham Nash and Herbie Hancock.
>I'm just playing with things these days, most recently making music with a small modular system and experimenting with very simple looping visuals in an investigation into the perception of visual rhythms. Here is my most recent test.
The UI has been greatly improved since I recorded the original demo on the site; the real thing is MUCH better now. Same base idea: chain together shaders, videos, or webcams and then drive their parameters via an audio signal, BPM, oscillator, MIDI board, or manual sliders.
The beta link on the site isn't really worth trying yet - if you're interested in getting on the TestFlight just shoot me a message at joe@nottawa.app. Would love some HN feedback :)
May I ask where the sources are? Looks great. Any plans for a Windows or Linux (Docker) version?
`noise().thresh(()=>a.fft[0]*2).out()`
Is it possible to grab from the default audio output device instead of the mic? Probably not, as it's browser-based. I suppose the mic can be faked at the OS level somehow.
https://github.com/milkdrop2077/MilkDrop3/releases/tag/MilkD...
Thank you, that's exactly what I'm looking for.
A very interesting process displaces the texture coordinates by advecting them along a flow field. Use any 2D vector field and apply displacement to each coordinate iteratively. Even inaccurate explicit methods give good results.
After the coordinates have been distorted to a far distance, the image becomes unrecognizable. A simple hack is to have a "restore" force applied to the coordinates, and they spring back to their original position, like flattening a piece of mirroring foil.
Just now I am using feedback along with these displacement effects. Very small displacements applied iteratively result in motion that looks quite a bit like fluid flow.
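The displace-and-restore loop described above can be sketched in a few lines of numpy. Everything here is illustrative: the swirling field, the step size, and the restore strength are made-up values, and the function names are mine.

```python
import numpy as np

# A hypothetical swirling flow field (any 2D vector field will do).
def swirl(c):
    dx = c[..., 0] - 0.5
    dy = c[..., 1] - 0.5
    return np.stack([-dy, dx], axis=-1)

def advect_coords(coords, rest, dt=0.02, restore=0.1):
    """One inaccurate explicit (Euler) step: push each texture coordinate
    along the flow, then spring it back a little toward its rest position."""
    pushed = coords + dt * swirl(coords)
    return pushed + restore * (rest - pushed)

n = 64
ys, xs = np.mgrid[0:n, 0:n] / (n - 1)
rest = np.stack([xs, ys], axis=-1)   # undistorted UV grid in [0,1]^2

coords = rest.copy()
for _ in range(100):                 # sample your image at coords each frame
    coords = advect_coords(coords, rest)
```

With the restore force, the coordinates settle into a bounded distortion instead of drifting off to infinity; drop `restore` to zero and the image eventually becomes unrecognizable, as described.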
https://www.donhopkins.com/home/archive/news-tape/fun/melt/m...
%!
%
% Date: Tue, 26 Jul 88 21:25:03 EDT
% To: NeWS-makers@brillig.umd.edu
% Subject: NeWS meltdown
% From: eagle!icdoc!Ist!jh@ucbvax.Berkeley.EDU (Jeremy Huxtable)
%
% I thought it was time one of these appeared as well....
% NeWS screen meltdown
%
% Jeremy Huxtable
%
% Mon Jul 25 17:36:06 BST 1988
% The procedure "melt" implements the ever-popular screen meltdown feature.
/melt {
    3 dict begin
    /c framebuffer newcanvas def
    framebuffer setcanvas clippath c reshapecanvas
    clippath pathbbox /height exch def /width exch def pop pop
    c /Transparent true put
    c /Mapped true put
    c setcanvas
    1 1 1000 {
        pop
        random 800 mul
        random 600 mul
        random width 3 index sub mul
        random height 2 index sub mul
        4 2 roll
        rectpath
        0
        random -5 mul
        copyarea
        pause
    } for
    framebuffer setcanvas
    c /Mapped false put
    /c null def
    end
} def
melt
Here's Jeremy's original "Big Brother" eye.ps, which was the quintessential demo of round NeWS Eyeball windows: https://www.donhopkins.com/home/archive/news-tape/fun/eye/ey...
I tried naïvely using `ps2pdf` (Ghostscript), but got errors on both of them. I guess they're meant to be consumed by some other sort of system?
https://en.wikipedia.org/wiki/NeWS
For example, here's a heavily commented demo application called PizzaTool:
https://donhopkins.medium.com/the-story-of-sun-microsystems-...
Source code:
https://www.donhopkins.com/home/archive/NeWS/pizzatool.txt
It uses an iterated feedback pixel-warping technique kind of like melt.ps, but instead of melting the screen by blitting random rectangles downward, it spins the pizza around its center, which "melts" the cheese and toppings. Note how the rotation angle is randomized to "dither" the rotation and smooth out the artifacts you'd get by always rotating it by exactly the same amount:
% Spin the pizza around a bit.
%
/Spin { % - => -
    gsave
        /size self send             % w h
        2 div exch 2 div exch       % w/2 h/2
        2 copy translate
        SpinAngle random add rotate
        neg exch neg exch translate %
        self imagecanvas
    grestore
} def
It animates rotating a bitmap around its center again and again, as fast as you "spin" it with the mouse, plus a little jitter, so the jaggies of the rotation (not anti-aliased, 8-bit pixels, nearest-neighbor sampling) give it a "cooked" effect! It measures the size of the pizza canvas, translates to the center, rotates around the middle, translates back to the corner of the image, then blits it with rotation and clipping into the round pizza window.
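The iterated rotate-and-resample feedback is easy to caricature in numpy. This is a sketch, not the pizzatool code: the function names, image size, base angle, and jitter range are all invented for illustration.

```python
import numpy as np

def rotate_nn(img, angle):
    """Rotate an image about its center with nearest-neighbor sampling
    (no anti-aliasing), like blitting an 8-bit canvas through a rotation."""
    h, w = img.shape
    ys, xs = np.mgrid[0:h, 0:w]
    cy, cx = (h - 1) / 2, (w - 1) / 2
    c, s = np.cos(angle), np.sin(angle)
    # Inverse-map each destination pixel back to its source pixel.
    sx = np.clip(np.rint(c * (xs - cx) + s * (ys - cy) + cx).astype(int), 0, w - 1)
    sy = np.clip(np.rint(-s * (xs - cx) + c * (ys - cy) + cy).astype(int), 0, h - 1)
    return img[sy, sx]

rng = np.random.default_rng(0)
img = rng.random((64, 64))
spin = 0.2                            # base rotation per frame
for _ in range(30):
    # Jitter the angle each frame to dither the nearest-neighbor jaggies;
    # feeding the result back in repeatedly is what "cooks" the pixels.
    img = rotate_nn(img, spin + rng.uniform(-0.05, 0.05))
```

Because each frame resamples the previous frame's already-resampled pixels, the quantization errors accumulate and smear, which is exactly the "cooked" look described above.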
LGR: Kai's Power Goo – Classic 90s Funware for PC!
Some years ago I did a similar project to smoothly crossfade (with "interesting effects") between images using some of the same techniques. My writeup (and a demo):
https://sheep.horse/2017/9/crossfading_photos_with_webgl_-_b...
Specifically about halfway through the process and applying:
uv.x = uv.x + sin(time + uv.x * 30.0) * 0.02;
uv.y = uv.y + sin(time + uv.y * 30.0) * 0.02;
to the static image. Having had a range of psychedelic experiences in my life, this appears to be the closest visual match to the real thing, at least at low, non-heroic doses. Maybe slow the waves down and lessen the range of motion a bit.
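For anyone who wants to play with those knobs outside a shader, here is a numpy version of that UV displacement, applied with nearest-neighbor resampling. The function name and defaults are mine; lowering `freq` and `amp` and advancing `t` more slowly gives the slower, subtler motion suggested above.

```python
import numpy as np

def breathe(img, t, freq=30.0, amp=0.02):
    """Displace normalized UV coordinates with the sine terms from the
    shader snippet, then resample the image at the displaced positions."""
    h, w = img.shape
    ys, xs = np.mgrid[0:h, 0:w]
    u = xs / (w - 1)
    v = ys / (h - 1)
    u = u + np.sin(t + u * freq) * amp
    v = v + np.sin(t + v * freq) * amp
    sx = np.clip(np.rint(u * (w - 1)).astype(int), 0, w - 1)
    sy = np.clip(np.rint(v * (h - 1)).astype(int), 0, h - 1)
    return img[sy, sx]

rng = np.random.default_rng(0)
frame = rng.random((64, 64))
# Render one "breathing" cycle of the static image.
frames = [breathe(frame, t) for t in np.linspace(0.0, 6.28, 8)]
```

Unlike the feedback effects elsewhere in this thread, this always warps the original static image, so the picture stays recognizable while it ripples.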
Note: I am far more interested in replicating the visual hallucinations induced by psychedelic compounds than by making cool visuals for concerts and shows, utmost respect for both sets of artists though.
There is an artist (and I'm sure many more) who does a fantastic job with psychedelic visuals using fully modern editing stacks; unfortunately their account name entirely escapes me. I'll comment below if I find it.
The comparison that I would make with this portion of the Rolling Hills article would be the mushroom tea scene from Midsommar, specifically with the tree bark. The effect of objects “breathing” and flowing is such a unique visual and I love to see artists accomplishing it in different ways.
[1] https://grokware.com/ [2] https://m.youtube.com/watch?v=3Z4X4FmIhIw
It was a time of screensavers and palette animation.
The most spot-on visual depictions of psychedelic artifacts I've witnessed.
Saw them together last year and it's the no. 1 artistic experience of my life. The richness and complexity of Fractaled Vision's visuals are almost unbelievable.
Even knowing a lot about shader programming, etc. some of the effects I was like “wtf how did he do that”.
Here's the set; it doesn't fully capture the experience, but it gives a feel. Seeing this in 4K at 60fps was next level.
Sure, there might be limited use cases for it visually, but playing with the models we've built up around how computer graphics work is a great way to learn about each of these systems: not just graphics, but fundamental math in programming, how GPUs work and their connection to memory and CPUs, how our eyes work, how to handle animation and time, and so on.
I am truly fascinated by people who attempt to reproduce the actual physiological vision effects of psychedelic drugs.
Psychoactive drugs can be probes into the inner workings of our minds - in some scientific sense - and exploring the vision effects seems likely to suggest interesting things about how our visual system works.
Mostly, I am just impressed when anyone is able to capture the visual experience in graphical effects, with any level of realism.
I have to say that the cliche of super bright, super saturated, geometric or melty shapes like in the article are not a great reproduction of the typical visual effects of psychedelics. Apart from very high doses, the visual effects are much more subtle.
The /r/replications subreddit has GIFs and short videos with a much higher degree of realism https://www.reddit.com/r/replications/top/?t=year
They were originally made to debug neural networks for image recognition. The idea is to run the neural network in reverse while amplifying certain aspects, to get an idea of what it "sees". So if you are trying to recognize dogs, running the network in reverse will increase the "dogginess" of the image, revealing an image full of dog features. Depending on the layer you work at, you may get some very recognizable dog faces, or something more abstract.
The result is very psychedelic. It may not be the most faithful representation of an acid trip, but it is close. The interesting part is that it wasn't intended to simulate an acid trip. The neural network is loosely modeled after human vision, and messing with the artificial neurons has an effect similar to how some drugs mess with our natural neurons.
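The "run it in reverse" trick is gradient ascent on the input image. Here's a deliberately tiny numpy caricature, with a single random linear template standing in for a trained network's dog-detecting layer; all names and numbers are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# A toy one-layer "recognizer": one 8x8 template acting as a feature
# detector (a stand-in for a real conv net's learned features).
template = rng.standard_normal((8, 8))

def activation(img):
    return float(np.sum(img * template))

def activation_grad(img):
    # d/d(img) of sum(img * template) is simply the template.
    return template

# DeepDream-style loop: nudge the *input* uphill on the activation,
# so whatever the detector responds to gets amplified in the picture.
img = rng.standard_normal((8, 8)) * 0.1
before = activation(img)
for _ in range(50):
    img += 0.1 * activation_grad(img)
after = activation(img)
```

After a few dozen steps the image is dominated by the template's pattern: the "dogginess" (here, template-ness) has been painted into the input, which is the whole idea, just without the depth and nonlinearity that make real DeepDream images so strange.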
> uv.x = uv.x + sin(time + uv.x * 50.0) * 0.01;
> uv.y = uv.y + sin(time + uv.y * 50.0) * 0.01;
I thought, wow, what on Earth is going on here? But no, it turns out that it's not that psychedelic. They could have used p, q or any other variable pair, but it's still quite interesting geometrically [2].
Actually Blender has an abstract base "Node" set of Python classes and user interfaces that you can subclass and tailor for different domains, to create all kinds of domain-of-application-specific visual programming languages.
So you can visually program 2D video filters, GPU shaders, 3D geometry, animations, constraints, state machines, simulations, procedural city generators, etc., and each can have its own compilation/execution model, tailored user interface, node libraries, and connection types. Geometry Nodes have the visual programming language equivalent of lambdas: functions you can pass to other functions that parameterize and apply them repeatedly, iterating over 3D geometry, texture pixels, etc.
Blender extensions can add nodes to the existing languages and even define their own new visual programming languages. So you can use a bunch of integrated tightly focused domain specific visual programming languages together, instead of trying to use one giant general purpose but huge incoherent "uber" language (cough cough Max/MSP/Jitter cough).
https://docs.blender.org/manual/nb/2.79/render/blender_rende...
What are Geometry Nodes:
https://www.youtube.com/watch?v=kMDB7c0ZiKA
Geometry Nodes From Scratch:
https://studio.blender.org/training/geometry-nodes-from-scra...
Free blender City Generator Addon:
https://www.youtube.com/watch?v=9nLsew8I7KM
Here's a paid product, an incredibly detailed and customizable city generator (and traffic simulator!) that shows off what you can do with Geometry Nodes -- well worth the price just to play with as a video game while learning Geometry Nodes:
Using The City Generator 2.0 in Blender | Tutorial:
https://www.youtube.com/watch?v=kRHkGoTQKM8
How to Create Procedural Buildings | Blender Geometry Nodes | Procedural City:
https://github.com/emberian/blender-graphify
Maybe this can inspire some dabbling.
DonHopkins 11 months ago | parent | context | favorite | on: John Walker, founder of Autodesk, has died
Jim Crutchfield is DOCTOR CHAOS -- he's got a PhD in Complexity Science!
https://www.youtube.com/watch?v=B4Kn3djJMCE
Space-Time Dynamics in Video Feedback
A film by Jim Crutchfield, Entropy Productions, Santa Cruz (1984). Original U-matic video transferred to digital video. 16 minutes.
James P. Crutchfield. Center for Nonlinear Studies, Los Alamos National Laboratories, Los Alamos, NM 87545, USA.
ABSTRACT: Video feedback provides a readily available experimental system to study complex spatial and temporal dynamics. This article outlines the use and modeling of video feedback systems. It includes a discussion of video physics and proposes two models for video feedback, based on a discrete-time iterated functional equation and on a reaction-diffusion partial differential equation. Color photographs illustrate results from actual video experiments. Digital computer simulations of the models reproduce the basic spatio-temporal dynamics found in the experiments.
1. In the beginning there was feedback ...
James P. Crutchfield. "Space-Time Dynamics in Video Feedback." Physica 10D 1984: 229-245.
[pdf] https://csc.ucdavis.edu/~cmg/papers/Crutchfield.PhysicaD1984...
[Plates 1-4] https://csc.ucdavis.edu/~cmg/papers/Crutchfield.PhysicaD1984...
[Plates 5-7] https://csc.ucdavis.edu/~cmg/papers/Crutchfield.PhysicaD1984...
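Crutchfield's first model, the discrete-time iterated functional equation, can be caricatured in a few lines of numpy: each frame is a spatially transformed copy of the last frame pushed through an intensity nonlinearity. The zoom factor, gain, and function name here are invented, not taken from the paper.

```python
import numpy as np

def feedback_step(img, zoom=1.05, gain=1.2):
    """One step of the camera-pointed-at-monitor map: rescale the frame
    about its center (the camera's zoom), then apply a saturating intensity
    nonlinearity (the video chain's gain and clipping)."""
    h, w = img.shape
    ys, xs = np.mgrid[0:h, 0:w]
    cy, cx = (h - 1) / 2, (w - 1) / 2
    # Inverse-map destination pixels back through the zoom (nearest neighbor).
    sx = np.clip(np.rint((xs - cx) / zoom + cx).astype(int), 0, w - 1)
    sy = np.clip(np.rint((ys - cy) / zoom + cy).astype(int), 0, h - 1)
    return np.clip(gain * img[sy, sx], 0.0, 1.0)

rng = np.random.default_rng(3)
frame = rng.random((64, 64))
for _ in range(20):
    frame = feedback_step(frame)
```

Adding a rotation to the spatial transform, or a blur before the gain stage, reproduces the spirals and diffusive patterns the paper's plates show; the point of the model is that all the richness comes from iterating one simple map.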
Appreciate you taking the time!
https://benpence.com/blog/post/psychedelic-graphics-1
This gets more into how to introduce motion and new visuals instead of the building blocks. The rolling hills graphic was really interesting.
Many, but not all. No finite set of real primary colours can produce every perceivable colour; some will always be out of gamut.
https://www.youtube.com/watch?v=lyZUzakG3bE
At 24:28 he shows a running Belousov–Zhabotinsky reaction mapped onto a 3d model's texture:
https://youtu.be/lyZUzakG3bE?t=1468
I wrote about it in the discussion of John Walker passing away, and Josh Gordon, who worked on Chaos at Autodesk, joined the discussion:
https://news.ycombinator.com/item?id=39300605
>DonHopkins 11 months ago | parent | context | favorite | on: John Walker, founder of Autodesk, has died
>I really love and was deeply inspired by the great work that John Walker did with Rudy Rucker on cellular automata, starting with Autodesk's product CelLab, then James Gleick's CHAOS -- The Software, Rudy's Artificial Life Lab, John's Home Planet, then later the JavaScript version WebCA, and lots of extensive documentation and historical information on his web page. CelLab:
https://www.fourmilab.ch/cellab/
https://www.fourmilab.ch/cellab/classic/
https://www.fourmilab.ch/homeplanet/
https://www.rudyrucker.com/oldhomepage/cellab.htm
[...]
>josh_gordon 11 months ago | prev [–]
>I'm amazed that my beloved CHAOS still runs beautifully on emulators like DOSbox. It was the last programming project where I could completely roll my own interface - and maybe my last really fun one.
Here's some stuff I did that was inspired by Rudy Rucker and John Walker's work, as well as Tommaso Toffoli and Norm Margolus's wonderful book, "Cellular Automata Machines: A New Environment for Modeling":
https://news.ycombinator.com/item?id=37035627
by DonHopkins on Aug 7, 2023 | parent | context | favorite | on: My history with Forth and stack machines (2010)
>"Cellular Automata Machines: A New Environment for Modeling" is one of my favorite books of all time! It shows lots of peculiarly indented Forth code. https://donhopkins.com/home/cam-book.pdf
>CAM6 Simulator Demo:
https://www.youtube.com/watch?v=LyLMHxRNuck
>Forth source code for CAM-6 hardware:
https://donhopkins.com/home/code/tomt-cam-forth-scr.txt
https://donhopkins.com/home/code/tomt-users-forth-scr.txt
And a couple more recent videos to music using the SimCity/Micropolis tile set and WebGL tile engine to display cells:
SimCity Tile Sets Space Inventory Cellular Automata Chill Resolve 1
https://www.youtube.com/watch?v=319i7slXcbI
I performed it in real time in response to the music (see the demo below to try it yourself), and there's a particularly vivid excursion that starts here:
https://youtu.be/319i7slXcbI?t=314
The following longer demo starts out with an homage to "Powers of Ten", and is focused on SimCity, but it shows how you can switch between simulators with different rules and parameters: setting rings of fire with the heat-diffusion cellular automaton, then switching to the city simulator to watch it all burn as the fires spread out and leave ashes behind, then switching back to another CA rule to zap it into another totally different pattern. (You can see a trail of destruction left by not-Godzilla at 0:50 while the city simulator is running.)
I had to fix some bugs in the original SimCity code so it didn't crash when presented with the arbitrarily scrambled tile arrangements that the CA handed it -- think of it as fuzz testing. Due to the sequential groups of 9 tiles for 3x3 zones, and the consecutive arrangements of different zone types and growth states, the smoothing heat diffusion creates all these smeared-out concentric rings of zones for the city simulator to animate and simulate: rings of water, looping animations of fire, permutations of roads and traffic density, rippling smokestacks, spinning radars, burbling fountains, an explosion animation that ends in ash, etc.
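The reason diffusion produces those concentric rings is that SimCity tile *indices* are being averaged as if they were temperatures, and related tiles occupy consecutive index ranges. A minimal numpy sketch of that smoothing CA (not the actual Micropolis rule; kernel, grid size, and step count are made up):

```python
import numpy as np

def diffuse(tiles):
    """One step of a smoothing 'heat diffusion' CA: each cell becomes the
    mean of its 3x3 neighborhood (toroidal wrap), truncated back into a
    byte-sized tile index."""
    t = tiles.astype(np.int32)
    acc = np.zeros_like(t)
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            acc += np.roll(np.roll(t, dy, axis=0), dx, axis=1)
    return (acc // 9).astype(np.uint8)

rng = np.random.default_rng(2)
# Random byte values standing in for SimCity tile indices.
tiles = rng.integers(0, 256, size=(32, 32), dtype=np.uint8)
for _ in range(10):
    tiles = diffuse(tiles)
```

Averaging sweeps each cell smoothly through the intermediate index values, so a hot spot next to a cold spot passes through every tile in between -- which, fed back into the city simulator, reads as rings of water, fire animations, road permutations, and so on.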
Chaim Gingold's "SimCity Reverse Diagrams" visually describes the SimCity tiles, simulator, data models, etc:
https://smalltalkzoo.thechm.org/users/Dan/uploads/SimCityRev...
Micropolis Web Space Inventory Cellular Automata Music 1:
https://www.youtube.com/watch?v=BBVyCpmVQew
You can play with it here. Click the "X" in the upper left corner to get rid of the about box, use the space bar to toggle between SimCity and Cellular Automata mode, the letters to switch between cities, + and - to switch between tile sets (the original SimCity monochrome tiles are especially nice for cleansing the palette between blasts of psychedelic Skittles rainbows, and the medieval theme includes an animated retro lo-res 8-bit pixel art knight on a horse), the digits to control the speed, and 0 to toggle pause. (It's nice to slow down and watch close up, actually!):
As you can see, it's really fun to play with to music and cannabis, but if you're going to use any harder stuff, I recommend you get used to it first and have a babysitter with you. Actually, the whole point of my working on this for decades is so that you don't need the harder stuff, and you can put it on pause when your mom calls in the middle of your trip and you have to snap back to coherency, and close the tab when you've had enough.
>deepnet on Oct 6, 2022 | parent | context | favorite | on: Recording the Grateful Dead: The Culture of Tapers
>The overlap between early nerd culture and The Grateful Dead was very significant.
>Taping and sharing culture and its benefits were very apparent in many net forums.
>As were democratisation of the new tools, public terminals with BBS access and the Deadheads community spirit exemplified on Usenet and Arpanet.
>Look no further than John Perry Barlow, EFF co-founder and his Manifesto of Cyberspace - he was a Grateful Dead Lyricist !
https://www.wired.com/2016/02/its-been-20-years-since-this-m...
>Barlow's paradigm seems cheeky without awareness of the Net's public roots; how it came up through BBS and Fidonet culture is forgotten by those who only saw the Net as a gift from the ivory towers of academia and the military, rather than bedroom Z80 & 6502 modem culture.
q.v. Fidonet BBS documentary
https://m.youtube.com/watch?v=Dddbe9OuJLU
DonHopkins on Oct 6, 2022 [–]
>In another comment reply to Gumby, I mentioned how I often accidentally call them "Grateful Dead Conferences", because so many tech people I knew and worked with in Silicon Valley and the Free Software community and regularly saw at computer conferences and trade shows would show up at Dead shows.
>The Raster Masters would lug enormous million dollar high end SGI workstations across North Shoreline Boulevard from SGI headquarters to Shoreline Amphitheater, and actually pack them into trucks and travel on tour with the Dead, performing live improvisational psychedelic graphics on the screen behind the band in real time to their live music, using an ensemble of custom software they wrote themselves, mixing together and feeding back the video of several SGI workstations in real time.
>At one concert, some hippie came up to me, pointed at the graphics on the screen behind the stage in awe, and said, "I took all these shrooms, I'm tripping my balls off, and you would not fucking believe what they're making me see on the screen up there!!!" I explained to him that I hadn't taken any shrooms, but I could see the exact same thing!
>The Raster Masters wrote and performed their own software, which reflected the taping and sharing culture of the Dead scene, including ElectroPaint and the Panel Library from NASA, whose source code and recorded live performances were distributed with SGI's demo software and free source code library.
>The improvisational software was like a musical instrument performed in real time along with the music.
[...Lots more stuff with links and videos at the link:...]