internship update

I’m back! As much as I’d like to use this blog for general-purpose writing, for the time being it’s exclusively for uni work. One day I’d like to change that, as I have some ideas for things to write about. Anyway, on to the main purpose of this post.

Week 1

I’m quite excited about working at the Capitol Theatre again, after spending some time with the system last year creating my own original work. This time around, I’ve been invited to create lighting designs for Honor Eastly’s show No Feeling Is Final, part of the Big Anxiety festival. The work involves not only articulating lighting designs to accompany audio and stage cues, but also learning and researching the internal systems, writing guidelines and generally being involved with the processes behind Capitol’s LED lighting installation. As a natural problem solver, this kind of work is very relevant to my interests. Additionally, I’m delighted that this is a way to get my foot in the door in terms of potential future employment at RMIT.

The initial meeting with Honor via Zoom was a great introduction. We seem to be on the same wavelength in terms of ideas and working style, and coupled with Honor’s demonstrated skills in project management—from the early drafts and schedule shared in our second meeting with Erik—I think this work will be a success.

Week 2

It was great to visit the venue again to show Honor the initial draft designs I’d sketched out, as well as some of the previous work from the Heightened Multisensory Experiences studio from last year (see earlier entries in this blog for examples). I also spent some time with Ben and Simon running through some essentials of the system: how to back up the main session, how to add new sessions, how to run scenes, and so on. There’s still plenty to learn in terms of more detailed operation, so I’m looking forward to going deeper. Already I’ve discovered that it’s possible to manipulate the lighting in a somewhat live manner, which is useful for quick iteration. I’ll definitely be enquiring about booking solo time in the space to document colour values, intensities, and the behaviour of the system’s preset patterns, all elements sadly lacking from the experience of programming the lighting remotely in the studio last year.

On a slight tangent, I have some rather ambitious plans to research the file format used for the lighting designs, and potentially create a 3D environment which can interpret such files and display the programming virtually. Coupled with a VR headset, this would make the design process a lot easier, with the possibility of pushing things further and adding design capability to the environment itself. I am of course getting way ahead of myself here, but it’d be incredible to be able to “paint” lighting states in VR and have them exported as files compatible with the real system. Something to think about…

week 13: reflection

I’m glad the semester (and study year) is over, but I’m also grateful that it’s been one of my most creatively rich stretches in recent years.

 

Capitol

The “final” iteration is looking about as good as it can without me being present in the venue to note any further tweaks I’d like to implement. I was happy to receive some positive feedback, mostly because people understood what I was aiming for. In particular, one comment was pretty much spot on: that the ending felt like an out-of-body experience, with everything coming together after the slight blurring of the staccato patterns in the buildup, and that the sudden cut was a somewhat alarming jolt back to reality. Other comments, such as it feeling like something one could get lost in, also matched what I was aiming for. If I end up extending this, or developing something similar, I may go in the direction of building something designed to be experienced with eyes closed. I think that’d work incredibly well at Capitol. I know it’s been done for a Dark Mofo work, but it’d be interesting to do it for something with a more positive emotion.

 

What didn’t work

Pretty early on, I decided to move away from the idea, suggested in class, of giving the piece movements or a very dynamic arc. I did attempt this in one early version, but it would have felt disjointed.

Also early on, the decision to move away from metal guitars, as well as clean, Raster Noton-style minimalist sound design, was a good one, mostly in terms of keeping things simple. I’m still convinced that I can create a hypnotising metal work, but that’s for a future project. I also decided against any tempo automation, since it would have been hard to keep track of with multiple layers and the methods I used to generate sounds; however, towards the end of the piece there are some brief polyrhythmic elements, which add to the “blurring” I mentioned earlier.

My experience with Pharos Designer definitely didn’t work out the way I expected it to. I had an idea to play with colour in a way similar to chromatic aberration, but that was scrapped before I even started putting things together in Pharos, as I don’t think the effect would have worked in the space. My initial experiments with manually controlling lights were quite tedious to program, and I largely abandoned manual control in favour of slightly less precise, preset-based control.

 

What worked

From the beginning of my sound experiments, I knew that loading recordings into Emission Control 2 would be the way to go for the work. It was essentially the reason why I quickly decided against using guitars or more minimal tones, as neither sounded good when processed with the software. The piece is made up of many different Korg Wavestation presets, all playing the same very slow chord progression, processed through Emission Control, which was sometimes set to oscillate between different recordings on the fly. I spent almost an entire week recording various versions, using different repetition intervals, varying levels of randomness (well, not really randomness; more like several LFOs cascading over each other to make odd patterns), and different sound sources. There are over 20 layers in the project, all fading in and out slowly.
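The cascading-LFO idea can be sketched in a few lines. This is an illustrative toy, not the actual Emission Control patch; the rates are made up:

```javascript
// Illustrative sketch (not the real patch): several sine LFOs at
// incommensurate rates, cascaded so each one's output sets the depth
// of the next. The result drifts in odd, quasi-random patterns
// without using a true random source.
function cascadedLFO(t, rates = [0.11, 0.047, 0.0213]) {
  let depth = 1;
  let value = 0;
  for (const rate of rates) {
    value = depth * Math.sin(2 * Math.PI * rate * t);
    // Each LFO's output (rescaled to 0..1) becomes the next one's depth.
    depth = 0.5 * (value + 1);
  }
  return value; // always within [-1, 1]
}

// Sample the modulation curve once per second for a minute.
const curve = Array.from({ length: 60 }, (_, t) => cascadedLFO(t));
```

Because the rates never line up, the curve wanders for a very long time before repeating, which is the kind of "not really random" behaviour described above.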

Once I started getting used to the idiosyncrasies of Pharos Designer, it became a little easier to manage. As mentioned above, I quickly learned that it’s OK to use the presets rather than trying to create everything from scratch myself; it meant I could adjust things like timing much more easily.

The idea shown in my initial presentation, of ambiguous movement, worked out quite well. It can be seen on the ceiling from around 6:30 onwards, where each strip of lights flashes at a different rate. At certain points, some lights appear to move towards the centre and others away from it, or even both at once, which I think is one of the key hypnotic moments of the design.
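The ambiguous-movement effect can be illustrated with a toy model (strip count and rates here are hypothetical, not the actual Capitol values): each strip flashes at a slightly different rate, so the phase offset between neighbouring strips drifts continuously, and with it the apparent direction of travel.

```javascript
// Toy model of "ambiguous movement": strips flashing at slightly
// different rates. As the offset between neighbours drifts through
// the cycle, the pattern reads as travelling one way, then the other.
const baseRate = 1.0; // flashes per second for the first strip

function phase(strip, t) {
  const rate = baseRate + strip * 0.05; // each strip slightly faster
  return (rate * t) % 1; // position within the flash cycle, 0..1
}

// Offset between two neighbouring strips at a few points in time:
const offsets = [0, 5, 10, 15].map((t) => phase(1, t) - phase(0, t));
```

With these numbers the offset creeps from 0 towards a full cycle, so no single motion direction ever settles in, which is roughly what makes the ceiling pattern hypnotic.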

 

Future plans

I would like to create a long-form work for the Capitol, perhaps something approaching an hour in duration, but I’ll need to think about how to structure it to keep the audience interested. Perhaps something with long stretches in a similar style to my hypnotising work, but with some movements where things get a bit more complex or even brutal. I’d also love to use the screen, even for simple colour washes, to activate the architecture in a way that the lighting doesn’t; using flashes between the screen and lighting to give the illusion of the architecture moving would be highly effective.

 

Phase

There’s not much more to write about this, because it’s been in a finished state for a couple of weeks now, but it has been an interesting exercise in restraint.

 

What didn’t work

I’m still a little conflicted about how it should be presented. Part of me enjoys the ambiguity of the circles being invisible until they hit the base of each square, then silhouetted by the flash, but the other part agrees with comments that this idea diminishes the hypnotic experience by removing or reducing the sense of it being a kinetic sculpture captured in a browser.

Aside from stylistic decisions, there are still some issues I’d like to address in terms of movement and sound. Sound-wise, there are occasional glitches which may be fixed by limiting the number of active squares to something smaller, like 8. The movement issues mostly arise when switching browser tabs; I don’t know if there’s any way to fix this, so I’ll have to run some tests.

 

What worked

The decision to cut down from a set of three browser-based works to two was very good for my mental health, and allowed me to focus on one while still having time to work on other projects (and have some downtime as well). I’ve still got the other two works in various stages of completeness, and will continue to work on them in the future.

I’m pretty happy that Phase ended up being very close to what I proposed in my initial presentation, which means I’m getting a lot better at coming up with ideas that I can actually realise.

 

Future plans

My browser-based works are an ongoing project, as they’re often quite good coding exercises for me, and have allowed me to develop a somewhat consistent style. At some point, I’d love to consolidate them all as levels or mini-games within a larger game, possibly a platformer, where each individual piece is a puzzle that must be solved by playing a certain melody or interacting in a certain way before the player can progress.

Another possibility is translating these works into physical space, using projection mapping, gestural control, audio reactivity, and multichannel sound. I can imagine a set of installations in subdivided spaces which can be accessed individually.

 

Collaboration – Capitol

I took in the minor changes I mentioned last week, as well as removing the ending (sadly, as I thought it provided some nice closure to the piece; however, it’s true that it was a step down in intensity compared to the previous section). While there were a million small things I thought about adjusting, I had to make the call to consider it finished.

 

What didn’t work

There were so many iterations of the main chord progression, all of which just sounded too cheesy, possibly only to me, but cheesy nonetheless. This was actually the part I spent the most time on.

I still don’t think I quite nailed the Iglooghost / Woulg vibes I was going for; I need to study their work a lot more to figure out what they’re doing and why it resonates with me, and then figure out how to write music inspired by them without wearing my influences on my sleeve.

 

What worked

Pat’s positive reaction to my experimental ideas allowed me to push this piece pretty far. I was a little cautious of having four sections that seemed to have no relation to each other, but Pat was really into the idea and inspired me to work on getting them to transition into each other a bit more smoothly.

 

Collaboration – Lightning

I wrote a lot about this last week, but since then I’ve added the elements I mentioned (scrunching paper through FFT effects, and guitar lead buzzing), which ended up working a lot better than I expected. There’s a definite arc to the piece, highlighting each element, sometimes solo; when the buzzing and scrunching come in, everything else is greatly reduced in volume, giving a feeling of the system somewhat breaking down, before everything comes back in stronger than before for the bright ending.

I’m actually incredibly happy with how this turned out, and there were no ideas that I tried that didn’t work; in fact, the whole piece was, as mentioned, built on a “happy accident” that emerged through viewing the initial sculpture video while working on another piece of audio (for a different project). From there, the project just kept building as I went back and forth with Pat, and at one point I was so excited about it that I made three iterations in one day.

 

Overall reflection

It’s a shame I didn’t get to work on some of the more ambitious ideas that I had, including the Grand Organ, but that may not have worked out the way I expected anyway. I’ll have to see if I can hook it up in the future though.

One other thing I wanted to explore was multichannel audio, but again, due to restrictions I don’t think I would have been able to deliver a work that I would have been satisfied with. Again, it’s something I’d like to explore in the near future.

I acknowledge that my research has been a little less in-depth than it could have been, and if I do want to continue into further HE study I’ll have to step it up. From this course, however, I’ve collected a huge list of research opportunities I’m keen to get into, covering everything from the psychological effects of rhythm to further study in mathematics and physics, both of which I can apply to my browser-based (and eventually standalone/installation-based) works.

week 12: the art of finishing

The weekend was a tangle of patching in TouchDesigner, which was a good lesson in what I should and shouldn’t build in a visual programming environment. In other words, what would usually be a few simple lines of code can be rather long-winded to construct visually. It was definitely a learning experience.

Capitol piece

I’ve been making micro-iterations of this for the past few weeks. There are still a couple of annoying things that I can’t seem to fix (the most notable being an unexplained colour fade when the ceiling colours change—I don’t recall having any fades in that part), but I think I’m nearly there.

Phase

I’ve brought this back to how it was in the first iteration (with a few bug fixes) and it’s done, at least for the assignment submission. After a discussion with Shaun last week, the idea of bringing this (and my other browser-based works) to a physical space is sounding a bit more interesting. Shaun’s idea to convert the interface from requiring a physical device (e.g. mouse, or my other idea to have a 4×4 grid of buttons on a podium/plinth) to something that allows the viewer to gesture with their eyes or hands, is very intriguing. It does raise a question of accessibility, but that’s something that can be addressed in the future.

Collaborations

Lightning sculpture

I’m loving the sound process for this; it’s nearly done, but I have a few more ideas on how to vary the sound over time so it has more of an arc*. I’m amazed that the few moments of fuller-frequency sound have retained the airy, ethereal feel of the original idea, while still feeling exhilarating thanks to the combination of unexpected bursts, sub bass, and extended chord voicings.

I’m also pretty interested in the background noise (originally unintentional, a byproduct of the synth patch I used needing an extreme volume boost) and the part it plays in the ethereal/mesmerising feel. My initial thought is that it makes it seem like the sculpture itself is producing the sounds, recorded by a camera mic with the gain up, the autogain ducking the volume when the big chords hit. But it also ties in with the name of the sculpture (lightning) and could represent rain. There’s also the nano-augmentation aesthetic, which I’ve touched on in previous posts; the game Deus Ex has a similar aesthetic and is an underlying inspiration for my sound design in this project.

I’ve had two new ideas for this in the past week. The first is to run some close-mic’d paper scrunching through an FFT filter like DtBlkFx, to provide an alternative layer to the high-frequency twinkles. The other is more experimental: using the sound of a finger on an audio cable plug to create a buzz whenever the lights are off, as though the lighting is being unplugged at that point.

* – We’re working on converting it into a live, generative performance/installation too. Obviously in that context, there won’t be as much opportunity for a tightly tailored sound experience, but it may be possible to automate a lot of it, and have controls in the TouchDesigner patch which influence not only the light, but also the sound.

Capitol piece

The latest version is sounding a lot more complete, with only a few things I need to work on:

  • Increase volume of distorted drums
  • Fade in the intro pad; there’s a weird sound right at the beginning
  • (low priority) Add a melody / arpeggio in the foley drums section

Other than that though, I think it’s definitely hitting the exhilarating feel, partly through the use of distinct, unpredictable sections; something that I struggled with initially, but now that I’m tightening up the transitions, they’re working more effectively.

On a technical level, exporting the stems from Renoise and cutting things up / timestretching in Reaper was a fun process, and something I’d like to do more in my personal work. I’ve used techniques like that in the past, though mostly with my hardware synth work, where it’s almost necessary to cut things together after recording. Partially composing in one software environment and then finishing up in another is an interesting new experience that lets me take advantage of the best features of both.

week 11: distractions

It’s been a busy week. I haven’t had a great deal of time to dedicate to HME work, due to having to prepare for a job interview, as well as some work for another course that took up a lot of my time… but I still made some progress.

Capitol piece

After last week’s uncertainty, it was good to receive some feedback that I’m making something that feels effective. I’m particularly appreciative of the comment that leaving the middle section of the ceiling completely unlit for the first 6 minutes was a good idea. As the weeks go by, the iterations become smaller and smaller; this week, I basically only made the changes I wrote about last week, but also made the ending abrupt, rather than a slow fadeout.

I think the decision to move from a wave on the ceiling at every bass note change, to a wave that lasts the entire duration of the non-tonic bass notes, was a good one. Having it slowly fade to a strobe works well for me. However, the walls using a pulse pattern instead of ramp down may be a bit too extreme; I think the ripple effect of the previous iterations is more subtle and works better as a hypnotic pattern.

The build to the abrupt end, as well as the new chord sweeps and arpeggios, make this go from hypnotic to somewhat exhilarating. I think I might aim for that as a secondary experience.

Phase

I haven’t done any more work on this in the past week, however after feedback during the week, I think it can go back to how it was before, without the flashing, and with visible circles that drop from the middle on click. I’ll still include the parameters, as “secret” features, or for a future version of the work, but it’s interesting to iterate on a project and settle on a previous version. That’s a first for me.

Collaborations

Sculpture

The latest iteration of Pat’s sculpture is as follows:

Since this upload, I’ve received a somewhat final edit of the video, as well as some feedback: notably, to reduce the high-register sounds, but also to give the piece a kind of arc that progresses from mesmerising to exhilarating, and to include occasional moments of full-frequency sound. I’m on board with all of those ideas, and think this will be a great project.

Capitol piece

I’ve been thinking about what I can do with the audio for Pat’s Capitol piece, and over the weekend, a friend sent me a link to Woulg’s latest album Bubblegum.

It’s pretty similar in sound palette to Iglooghost’s work, which I used as inspiration for my piece. However, it’s making me think of doing some post-processing on my already sequenced audio, which perhaps could be the last piece of the puzzle in terms of making something cohesive. I had an idea to create several timestretched versions of the sections that include drums, and crossfade/cut between them; then, in the ending section with the heavy beat, replace some drum hits with enveloped versions of the trance chords from the previous section. I’ll be experimenting with that later today.

Research

My research in the past week has mostly been for my Emerging Digital Cultures assignment, where I’m researching glitch art and presets.

One work in particular that is standing out for me is Cory Arcangel’s Data Diaries, which the artist talks about at the 19-minute point in this lecture:

The work involves providing QuickTime with a file containing only header information (resolution, duration and colour mode); an interesting bug/feature in QuickTime means that, given such a file, it will use the computer’s RAM to fill in the missing data, creating glitched patterns and sounds. It relates quite heavily to the work I’m creating for the assignment: a collection of Nord Lead presets generated by inserting a new battery into a PCMCIA SRAM memory card, creating randomised data that the synth tries to interpret as presets.

Researching presets for the assignment, I went down a rabbit hole of reading about algorithmic listening, a relatively new form that Kobel (2019) compares to Schaeffer’s (2017, p. 212) theory of reduced listening, with the distinction that the listening and categorisation are performed by a machine learning algorithm. I find this quite fascinating as someone going deeper and deeper into the realms of generative music, and in turn getting closer to working with AI and machine learning in my music. Relinquishing control to the computer for several tasks (most importantly the generation of the presets themselves, but also within the automated process of recording and separating the sounds) is a key component of my assignment’s concept.

So how does this relate to my HME works? One thing I’ve noticed with the EDC assignment is that the resulting work can be seen as a catalogue of sounds, a few of which I’ve already used as launching points for Pat’s works. It raises an interesting idea about my now-monolithic “album” of Nord patches being simultaneously presented as useless and functional art.

References

Woulg 2021, Submission, sound recording, Yuku Music, Prague.

Arcangel, C 2009, Digital Media Arts, YouTube, 25 March, Columbia University, viewed 5 October 2021, <https://www.youtube.com/watch?v=ZzHq7PzQWEE>.

Kobel, M 2019, ‘The drum machine’s ear: XLN Audio’s drum sequencer XO and algorithmic listening’, Sound Studies, vol. 5, no. 2, pp. 201–204.

Schaeffer, P 2017, Treatise on Musical Objects: An Essay Across Disciplines, translated by Christine North and John Dack, University of California Press, Oakland, California.

week 10: rustin man

Capitol piece

It’s a bit difficult to tell if I’m on the right track with this, but I’ll continue to iterate with the hope of being able to at least submit something that looks effective filmed; then tighten it up if/when we’re allowed to go in and view it in person.

The new iteration generally works better than the previous one, with some notes:

  • The wave forward along the ceiling whenever the bass note changes doesn’t quite work. I think I might change this to a wave that lasts for the entire length of the shorter bass note changes, perhaps somehow with a repeating pulse that slowly decreases in pulsewidth until it stops at the next note change.
  • The animation on the walls still feels a little loose. I think I’ll change the wave to a square/pulse instead of the current ramp.
  • On the first sub bass drop around 6:30, the lighting doesn’t cut out as with the other changes. I must have missed a fadeout time or something.

Overall I think it’s progressing well. It’s definitely an exercise in restraint. I would love to make something completely ridiculous for the Capitol someday, but in the meantime I think it’s good to learn the system by staying very controlled.

Phase

I’ve started to implement some of the changes suggested in class; notably, a flash whenever a ball hits the base of the box, and the ball starting at the base rather than middle upon clicking. I’ve implemented these as URL parameters, so it’s possible to allow for variations without having to duplicate the files.

Another thing I discovered is a potentially hypnotising effect from making the ball colour the same as the square background. Here’s how it looks with the standard ball colour:

Ignore the weird comet-like trails coming off the balls; that’s a GIF artefact.

Changing the ball colour to the same as the activated square background, however, results in this effect:

It could go either way, so I’ve parameterised that feature as well. In a way, I think it looks more hypnotic, as it’s not clear when the ball will bounce again, though it is somewhat predictable due to the regular timing.

Using URL parameters like that has really opened up the possibilities for my other browser works, in that I’m now able to save basic states without requiring any local or server-based data. As usual, I’ve been digging deep into the p5js reference site (Ye, Masso & McCarthy 2021) to find my way through these additional functions.
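The state-in-the-URL approach can be sketched like this (the parameter names below are hypothetical, not the actual ones in Phase; p5.js also offers getURLParams() for the same job). Each variation of the work becomes a plain shareable link, with no local or server-side storage:

```javascript
// Hypothetical parameters for a browser work: defaults are used
// whenever the query string omits a value.
const defaults = { flash: false, ballColor: "white", dropFrom: "middle" };

function readParams(queryString) {
  const params = new URLSearchParams(queryString);
  return {
    flash: params.get("flash") === "1",
    ballColor: params.get("ballColor") ?? defaults.ballColor,
    dropFrom: params.get("dropFrom") ?? defaults.dropFrom,
  };
}

// e.g. a link like phase.html?flash=1&dropFrom=base
const state = readParams("?flash=1&dropFrom=base");
```

In the browser, the same call would just read `window.location.search` instead of a hard-coded string.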

I’m still looking for an effective way to allow for some degree of interaction between users, even a way to record “scores” or messages on the server for other users to access, which will allow me to transform these works into browser-based games. I may have to move away from p5js and into more specific game programming environments in order to achieve this.

Collaborative projects

The latest iteration of my music for Pat’s Capitol work is getting closer to having an actual structure:

I have a few notes about what to change in the next iteration, mostly chord progressions and adding more effective transitions between sections, but it’s really coming along.

Pat’s other project is progressing nicely:

It’s an opportunity to go a lot more abstract than many of the other projects I’ve created for the program, which excites me greatly. Yesterday, I happened to view Pat’s latest video while getting sound together for an unrelated assignment. The audio I was recording at the time seemed to fit really well with the visuals, so I’ll be refining that discovery in the next few days. Here’s what it sounds like:

There’s something really interesting about how it sounds with mostly high-register tones, especially coupled with the limited medium of the fibre optic strands. It feels ethereal, but futuristic at the same time, and reminds me of some of the themes of Deus Ex (Ion Storm, 2000).

Other research

Just as I loaded up my YouTube history I saw that Mal had another video:

I really need to talk to him about polyrhythms sometime. I’ve been very interested in the idea of ambiguous, or multi-purpose rhythms for a long time, and it’s good to see someone put them into practice.

~

The experiments with high register tones for Pat’s fibre optic sculpture have once again brought up the idea of using combination tones:

While being conscious of potential hearing damage or other unpleasant effects of high pitched sound, I’d like to try embedding some other sound using the above technique—perhaps a secondary melody that complements the melodic content of the existing audio.
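The embedding idea relies on combination tones: when two loud tones sound together, the ear can perceive extra tones at simple sums and differences of their frequencies, most audibly the difference tone and the cubic difference tone. A quick sketch of the arithmetic (the frequencies are hypothetical, chosen only for illustration):

```javascript
// For two tones f1 and f2, the most audible combination tones are the
// difference tone (f_hi - f_lo, the classic Tartini tone) and the
// cubic difference tone (2*f_lo - f_hi).
function combinationTones(f1, f2) {
  const [hi, lo] = f1 > f2 ? [f1, f2] : [f2, f1];
  return {
    difference: hi - lo,
    cubic: 2 * lo - hi,
  };
}

// Two high tones 200 Hz apart could embed a 200 Hz "ghost" note:
const tones = combinationTones(3200, 3000);
```

Choosing the spacing between the high tones would then let a secondary melody ride along as ghost notes beneath the twinkles.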

~

I reached out to fellow generative artist William Fields, as I noticed he is also creating synchronised audio-visual works (though, much farther along in the process than I), and it was nice to hear back from him. While he hasn’t written anything on his creative practice, I found several useful techniques and concepts in this video he linked:

His points on creating a system and iterating, rather than starting from scratch, resonated with me, because I have a tendency to reinvent the wheel every time I start a new project. As I’m getting more into programming, this makes less sense, and I think I’ll start developing my own libraries for things, for example, a synthesis engine, or sequencer, that can be saved as a separate file and referenced in future projects.

The segment on limiting controls via macros was quite inspiring as well. I’ve done a small amount of this in my synth programming, in systems where I don’t have access to a large amount of individual physical controls, but I think I can apply this to my software interfaces as well, once my works necessitate on-screen controls.

He also mentioned tuned randomness, something which I’ve been trying to play with for my generative sequencers, but never pushed too far. His ideas on using non-linear functions to weight randomness towards a certain point make a lot of sense.
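As a sketch of what that weighting might look like (the function and its parameters are my own guess at the idea, not Fields’ actual code), a uniform random value can be pushed through a power curve so results cluster around a chosen centre:

```javascript
// "Tuned randomness": a non-linear function weights uniform random
// values towards a chosen point. An exponent > 1 clusters deviations
// near zero, so most results land close to the centre.
function tunedRandom(centre, spread, exponent, rng = Math.random) {
  const biased = Math.pow(rng(), exponent); // 0..1, clustered near 0
  const sign = rng() < 0.5 ? -1 : 1; // deviate either side of centre
  return centre + sign * biased * spread;
}

// e.g. pitches clustered around MIDI note 60, rarely straying far:
const notes = Array.from({ length: 100 }, () => tunedRandom(60, 12, 3));
```

Raising the exponent tightens the cluster; an exponent of 1 recovers plain uniform randomness.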

References

Ye, Q, Masso, E, & McCarthy, L 2021, p5js reference, p5js.org, viewed 1 October 2021, <https://p5js.org/reference/>.

Ion Storm 2000, Deus Ex, PC game, Eidos Interactive Ltd/Square Enix, London.

Webb, M 2021, 8 against 13 Blurred PolyPolyRhythm… a work in progress, YouTube, 4 October, viewed 4 October 2021, <https://www.youtube.com/watch?v=SKC8mUt8iWY>.

Neely, A 2017, Combination Tones, YouTube, 31 October, viewed 3 October 2021, <https://www.youtube.com/watch?v=73_CiAYX00k>.

Fields, W 2020, Augmented Creativity (as Illustrated Through My Creative Journey), YouTube, 2 May, viewed 1 October 2021, <https://www.youtube.com/watch?v=LmGiV_ht5ak>.

 

week 9: momentum

Capitol piece

In the past week, I’ve made significant changes to this piece, based on my thoughts in last week’s blog post, but also on some feedback given in class. One suggestion I’ve implemented is to use some kind of change in lighting alongside the changes of bass note. In the first six minutes this is shown using a slow “burst” preset along the ceiling; in fact, this is the only lighting that appears on the ceiling in the first six minutes, with the exception of some strobing at the rear of the venue at the beginning.

From ~6:20 onwards, the bass note changes are accentuated using a long-decay 808 bass drum, which is used as a sidechain input for a compressor affecting all other elements*. To mirror this, the ceiling lighting cuts out and fades back in over a few seconds, each time in a different colour. There’s a chance the bass drum sound will snap the viewer out of a potential hypnotic state, but that remains to be seen when it’s tested in the venue. Perhaps removing the white noise/screen lighting buildup before each “drop” could remedy this.

I’ve slowed down the transitions significantly as well, so hopefully it’s a smoother experience in the venue. Holding back on using all of the lighting until near the end was also a new experience for me. I’m very much used to making minimalist music by now, but as I’m new to lighting design, I have to fight the urge to throw everything in at once. I think I chose the right adjective for this work, as it’s made for a nice challenge.

* – A technique commonly used by EDM producers, often to accentuate or influence the listener’s perception of rhythm (Brøvig-Hanssen, Sandvik & Aareskjold-Drecker 2020), but also found in more experimental styles of electronic music. In experimental and minimal styles, the technique is less concerned with rhythm; however, there is some sense of disruption in the way the sidechain source causes the other tracks to “dip”, giving an almost psychedelic sensation. This is by no means a new technique; in an article on sidechain compression, Abravanel (2019) refers to The Beatles’ recording Tomorrow Never Knows (1966) as an early example of overcompression achieving a similar effect, with the cymbals “ducked” by the bass drum and snare.
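The ducking behaviour can be sketched as a toy gain computation (this is an illustration of the general technique, not any particular compressor’s algorithm):

```javascript
// Toy sidechain ducking: the kick's envelope (0..1) drives gain
// reduction on everything else, producing the rhythmic "dip".
function duck(signal, kickEnvelope, depth = 0.8) {
  // On a full kick hit, gain drops to (1 - depth); between hits it
  // recovers towards unity as the envelope decays.
  return signal.map((sample, i) => sample * (1 - depth * kickEnvelope[i]));
}

// Four samples of a steady pad, ducked by a decaying kick envelope:
const pad = [1, 1, 1, 1];
const kick = [1, 0.5, 0.25, 0];
const ducked = duck(pad, kick, 0.8);
```

The pad dips hard on the hit and swells back as the kick decays, which is the “dip” sensation described above.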

Exactly why this contributes to, or represents, a psychedelic effect is something I’ve attempted to research, so far without success; however, I’ve discovered a few interesting articles on the mechanics of psychedelic music. In particular, Melting Clocks and the Hallways of Always: Time in Psychedelic Music (Reising 2009) touches on examples of repetition and constant rhythms in early psychedelic music, which connects back to my original ideas and use of repetitive rhythms. As such, it made sense to implement sidechain compression in my Capitol piece, to further enhance the psychedelic and hypnotic experience.

Phase browser work

I haven’t had the chance to do more work on this piece, as a few other things have taken priority, but as mentioned in my presentation last week, I have some good ideas about how to iterate on it, based on the reactions when it was presented in class.

One main reaction that made me consider some modification was that the hypnotic effect is diminished when all of the squares are active. I can remedy this by grouping the squares (perhaps by row, or even in pairs) and allowing only one square in each group to be active at a time. This would also let the user “perform” the work more effectively, as well as preventing notes only a semitone apart from clashing.
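To make the grouping idea concrete, here’s a rough sketch of the logic, with squares stored in a flat array and grouped by consecutive index (the class and method names are just placeholders, not from the actual piece):

```javascript
// Squares live in a flat array; each group of `groupSize` consecutive
// squares allows only one active member at a time. Names are placeholders.
class SquareGrid {
  constructor(count, groupSize) {
    this.groupSize = groupSize;
    this.active = new Array(count).fill(false);
  }
  groupOf(i) {
    return Math.floor(i / this.groupSize);
  }
  toggle(i) {
    if (this.active[i]) {       // clicking an active square turns it off
      this.active[i] = false;
      return;
    }
    // silence any other active square in the same group first
    for (let j = 0; j < this.active.length; j++) {
      if (this.groupOf(j) === this.groupOf(i)) this.active[j] = false;
    }
    this.active[i] = true;
  }
}
```

Grouping by row rather than in pairs would just mean a different `groupSize`; in the p5js sketch, `toggle()` would be called from `mousePressed()` after hit-testing each square.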

Another notable suggestion was to create different versions with varying degrees of synchronisation—one unsynchronised (as per the current version), one where clicking on a square causes the ball to appear at the bottom of the cycle (thus bouncing immediately), and one where each ball is strictly locked to a certain point in the sequence, for example a 16th-note grid. Ultimately, I think I’m going to keep it unsynchronised, but it will be a good coding exercise, if nothing else.
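As a coding exercise, the three versions reduce to how a ball’s starting phase is chosen. A quick sketch (the mode names and the [0, 1) phase convention are my own assumptions):

```javascript
// Starting phase of a ball's cycle under three synchronisation schemes.
// Phase is in [0, 1), with 0 = bottom of the cycle (the bounce moment).
function initialPhase(mode, nowSeconds, cycleSeconds, gridDiv = 16) {
  const raw = (nowSeconds / cycleSeconds) % 1;
  switch (mode) {
    case "free":    // unsynchronised: start wherever the clock happens to be
      return raw;
    case "restart": // appear at the bottom of the cycle, bouncing immediately
      return 0;
    case "grid":    // snap the start offset to a 1/gridDiv grid of the cycle
      return (Math.round(raw * gridDiv) / gridDiv) % 1;
    default:
      throw new Error("unknown mode: " + mode);
  }
}
```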

Further research opportunities

Adam Neely’s recent video on rhythm perception has inspired me to continue research into rhythm, beyond the end of semester.

Using an example of a musical piece performed by his band, he demonstrates the edges of perception of both slow and fast rhythms. He had discussed this in a previous video, where he referred to Paul Fraisse’s writing on rhythm; this motivated me to seek out Fraisse’s other work. In particular, I’m interested in reading Repp’s translation/interpretation The legacy of Paul Fraisse: II. Two further studies of sensorimotor synchronization (2012), as it investigates human response to rhythm at varying tempi. I’m finding that as I continue with this program, my works are becoming more and more focused on rhythm, rather than melodic content, so it’s appropriate that I continue researching interesting rhythmic phenomena to apply to my work.

References

Abravanel, D 2019, ‘Sidechain Compression’, Canadian Musician, vol. 41, no. 3, p. 31.

Beatles, The 1966, Tomorrow Never Knows, sound recording, Parlophone Records, UK.

Brøvig-Hanssen, R, Sandvik, B & Aareskjold-Drecker, JM 2020, ‘Dynamic Range Processing and Its Influence on Perceived Timing in Electronic Dance Music’, Music Theory Online, vol. 26, no. 2.

Neely, A 2021, The Psychology of Extreme Rhythms, YouTube, 25 September, viewed 26 September 2021, <https://www.youtube.com/watch?v=DRLTjESyuQk>.

Reising, R 2009, ‘Melting Clocks and the Hallways of Always: Time in Psychedelic Music’, Popular Music and Society, vol. 32, no. 4, pp. 523–547.

Repp, B 2012, ‘The legacy of Paul Fraisse: II. Two further studies of sensorimotor synchronization’, Psychomusicology, vol. 22, no. 1, pp. 74–76.

week 8: getting there

Pharos Designer / Capitol

The footage of the first version of my piece for the Capitol was interesting. A few things worked well, but a lot didn’t. I’ll be re-making a bunch of it this/next week based on what worked. I think some of the transitions could be even slower, with some areas, such as the random flickering on the ceiling, removed altogether. An idea I had was to start with some slow, fading lighting, slowly transitioning into strobing/fast downward ramp patterns towards the end. Potentially also using the screen to display some colour washes in the most intense points. More experimentation is needed.

Browser-based work

I’ve decided to cut the browser-based works down to one piece, Phase, after taking on a second collaboration with Pat. More on that later. I still plan to complete the other works, but will keep them as a personal project rather than submitting them for the assignment.

Phase is coming along nicely; I developed an early, yet silent, prototype on Saturday, which works well and looks great. I’ll be adding sound hopefully before the presentations on Thursday, where I’ll include a video of it in action.

It’s also requiring me to develop my p5js skills a bit more, mainly in terms of creating a more refined user interface, but also in thinking more efficiently about using classes/objects, which I touched on in Bounce. I’m working towards this method for the buttons as well: a single defined button class that has different uses depending on conditions set when initiated. For this, I’ve mostly been referring to the p5js reference documentation, but I’m also going further into generative art, aiming to extend the visual content of my browser works in general, studying texts such as Generative Design Revised: Visualize, Program, and Create with JavaScript in P5.js (Gross et al. 2018). Such texts are not only technically useful, but also provide inspiration by showing how techniques can be combined to create artistic results, something I sometimes struggle with when learning techniques in isolation.
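As a sketch of the “one button class, many uses” idea: the behaviour can be injected as a callback when the button is created, rather than written into subclasses. In the p5js version, drawing and hit-testing would use `rect()` and `mouseX`/`mouseY`; this is just the logic:

```javascript
// One Button class, many uses: behaviour is a callback supplied at
// construction, so each button's role is set by its initial conditions.
class Button {
  constructor(x, y, w, h, onPress) {
    Object.assign(this, { x, y, w, h, onPress });
  }
  contains(px, py) {
    return px >= this.x && px < this.x + this.w &&
           py >= this.y && py < this.y + this.h;
  }
  press(px, py) {               // call from mousePressed(); true if handled
    if (!this.contains(px, py)) return false;
    this.onPress();
    return true;
  }
}

// e.g. a toggle and a one-shot trigger share the same class:
let muted = false;
const muteButton = new Button(10, 10, 40, 20, () => { muted = !muted; });
```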

Collaborations

I’ve taken on another of Pat’s projects; her optical fibre sculpture looks amazing and I definitely want to create some abstract sound for it. My first pass was received well:

However, I’ve realised that what I initially took to be a bright pulse fading to a dim, constant light is actually the camera automatically adjusting its exposure, so some of the bursts in my sound won’t match up when viewing the work in a physical space. As the light has been controlled by a relay, it’s currently limited to simple on/off states, which makes it a little trickier to create a heightened experience. Pat has experimented with using a projector as the light source instead, though, which would make things easier to sync and allow a more dynamic experience.

I’ve also been iterating on Pat’s Capitol Theatre work, which may go in a few directions at once, changing rapidly to maintain an exhilarating experience. Here’s the latest clip, just of drums for now, but an exercise in complexity:

It’s heavily inspired by Iglooghost, who, in my opinion, creates incredibly ecstatic and exhilarating music, even when the melodic content is on the darker side.

I tried unsuccessfully to create similar music a few years ago in another musical project, but have always wanted to revisit his work, so I’ve been spending some time analysing what makes it such an ecstatic experience for me, and I’ve narrowed it down to a few things:

  • Lack of repetition: There is always something changing in these pieces, mostly the intricate drum programming, but also the melodic content popping in and out among the more constant bassline progressions.
  • Contrasting sounds: Iglooghost plays with pairing delicate, natural sounds with intense, synthetic sounds a lot in his music. This is a fairly standard trope of IDM, but in his music, it’s taken to more of an extreme, including field recordings, quiet vocal samples, and quick foley sounds alongside drum machine, breakbeat and synth sounds more commonly associated with genres like footwork and breakcore. This also relates to the above point, where occasional melodic runs add a burst of energy, in contrast to the quieter sounds.
  • Fast tempo: This hardly needs an explanation, but the use of fast tempos, combined with a lot of half-time elements (e.g. a half-time snare/clap), allows some of the more intricate drum programming to evoke a feeling of “what the hell is going on?”, of being overwhelmed and even mesmerised by the level of detail. I tried to replicate this at a slower tempo, but it doesn’t quite work unless faster note divisions are used.

References

Gross, B, Bohnacker, H, Laub, J & Lazzeroni, C 2018, Generative Design Revised: Visualize, Program, and Create with JavaScript in P5.js, Princeton Architectural Press, New York.

week 7: first passes

Pharos Designer / Capitol

I’m getting the hang of it. Between referencing the manual and simply messing around in the software I’m slowly building a workflow. I made a quick sync test so the difference between manually placing events and using the presets can be identified:

Since making the above, I rearranged the groups a little so they make a bit more sense for what I’m trying to do, and corrected a couple of errors in the LED configuration (most notably, one of the infill lights on the right was not lighting—it had its maximum intensity set to 0). It’s feeling a bit easier to use now, and I’m starting to dig deeper into how to optimally adjust the parameters in the presets. Rearranging—and adding to—the groups has made it so I can strictly customise the order in which lights are activated in presets such as Wave and Burst, and as such the sweeps look a lot cleaner now. Whether or not that matters in the actual venue remains to be discovered when we can get some footage of the tests.

Over the weekend I put together a new version of the audio, with much tighter synchronisation and a rough progression, from static 16th notes, to a pseudo-random pattern (actually the result of two LFOs out of sync with the main timing), to a straight 4-note “loop” at the end. I think it could potentially do with more layers, or variation in the bass tone/decay, but it’s a good base to start putting the main work together. I’ve created a new timeline in the Designer project where I’ll be extending the test events, gradually building it up piece by piece. As I was creating the new audio, I was reminded of something a friend said after a Sunn O))) show back in 2007; his interpretation of it was that it represented the progression of sunrise -> daytime -> sunset, or something to that effect. With that in mind, I think it’d be interesting to use a similar progression. There’s a definite peak intensity in the audio which could represent afternoon daylight. It’s not so relevant to the hypnotising adjective, and as such, not really something I want to lean on conceptually, but certainly interesting as a rough guide to the intensity and colour temperature.

Collaborative project

I finally started work on something for Pat’s Capitol piece, here’s an initial rough idea:

Taking heavy inspiration from the PC Music / hyper-pop style I got heavily into a few years back:

Once again, Pat has given me great feedback, and suggested I push it in a weirder, less repetitive direction. Of course, I’m no stranger to weird, so I’m pretty keen to get a bit wild with it. For inspiration, I’m revisiting Iglooghost’s album Neo Wax Bloom, which is a masterpiece in complexity:

There’s room to throw in a lot of different sounds; in particular I’ve been thinking about “foley percussion”, combining found sounds with electronic percussion, all combined with bursts of huge synth sounds as per my initial draft. It’s an interesting contrast to my solo projects this semester, which are basically the opposite in style.

Browser-based work

Slight existential crisis with this work, in that I don’t really know if I’ll be able to hit hypnotising without just making non-interactive, optical-illusion-based work, which I worry won’t feel satisfying in terms of my own internal goals. I’m thinking of a shift towards, or combination with, the adjectives serene or meditative, particularly applicable to some in-progress works such as Wander or Dungeon, where the experience is something more akin to a zen garden or a jigsaw puzzle. I’m wondering if putting it aside for this semester (and thus back to being a side project), so I can concentrate on the Capitol piece and my collaboration, would be viable. I imagine it would allow me to take on Pat’s other work, which looks really exciting.

edit: Shortly after posting this, I went back to some of the tabs I had open from last week, and explored autostereograms. I think this may be the solution. Some of my initial ideas (e.g. Phase) could be really interesting if adapted to be viewed as autostereograms. Also, I have some ideas about how to create random-dot autostereograms in p5js now that I’ve properly digested the Wikipedia article. Time for some experimentation.
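For reference, the core of the random-dot algorithm as I understand it from the Wikipedia article, written as a pure function (the p5js layer would just paint the returned 0/1 values as black/white pixels; parameter names are my own, and depth values are assumed to be non-negative integers smaller than eyeSep):

```javascript
// Random-dot autostereogram, per the classic algorithm: the leftmost strip
// is pure random dots, and every other pixel copies the pixel one
// "separation" to its left, where nearer depths shrink the separation.
// rng is injectable so output can be made repeatable.
function autostereogram(width, height, depth, eyeSep, rng = Math.random) {
  const img = [];
  for (let y = 0; y < height; y++) {
    const row = new Array(width);
    for (let x = 0; x < width; x++) {
      const sep = eyeSep - depth[y][x]; // nearer point = shorter repeat
      row[x] = x < sep ? (rng() < 0.5 ? 0 : 1) : row[x - sep];
    }
    img.push(row);
  }
  return img;
}
```

Raising `depth` over a region shortens the repeat interval there, which is what makes that region appear to float in front of the background when free-viewed.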

break week

I forced myself to have a break this week, so while I did some work on the Capitol Theatre project, I didn’t have a chance to keep up with research, beyond looking at some reference images for my browser-based works.

Happy accidents

I’m working in Emission Control 2 again for my Capitol Theatre piece. It’s a rather temperamental piece of software, especially on PC, where occasional glitching occurs (possibly due to low priority in its audio/file processing, or the fact that it writes 32-bit WAV files). For the piece I’m developing, which involves a lot of rhythmic pulsing, those glitches throw the rhythm off, and since I’m layering several recordings from the program in a DAW, this has some unintended Reich-y consequences, as layers shift randomly one by one.

For example, 0:07 in this excerpt:

The third note in the four-note pattern shifts out for a moment and then back again. I edited it slightly to remove the glitch sound, and to align the rhythmic shift to a 32nd-note grid, but it’s an interesting unintentional feature for now. I’m not sure if I’ll keep it like that, as it throws the hypnotic feeling off somewhat, but it was a nice little happy accident that was only mildly frustrating after a whole day of capturing similarly glitchy takes from Emission Control. I’ve subsequently switched back to my iMac, as the PC was frustrating me in other ways.

Pharos Designer

I’ve started digging into Pharos Designer using audio from the above experiment, mostly getting a feel for the timing and some of the automation options available. My conclusion is that I should not feel like using the presets (e.g. strobe, wave, etc) is cheating—after spending quite some time trying to replicate what the presets do by individually accessing each light and cycling through them, I’ve found that it’s not quite worth the trouble. I’ve also realised that it’s likely possible to manipulate the order of lights in the cycling presets by creating new groups with the lights moved into their required positions. I’ll study this over the coming week and see how I go.

Another thing I’m realising is that quick strobing isn’t represented very well in the simulation. I’m hoping that this isn’t the case in the actual space, but I’m prepared to use ramped strobing instead, i.e. a quick cut at the beginning or end, either followed or preceded by a fade. The ramp modes work really nicely with the wave preset, resulting in a nice pulsing chase sequence that I can easily time to my music (which is specifically set to 120 bpm in order to work best with Pharos).

Browser-based works

As I’ve had a deliberately quiet week, I haven’t had a chance to get into the browser-based work, but I have been collecting more examples, mostly through various Reddit communities, but also in interesting areas such as my partner’s vision therapy exercises.

One exercise in particular uses 3D glasses to make the wearer’s eyes fuse a red circle and a blue circle into a single object, while the two circles slowly move apart onscreen. The viewer is unaware of this happening, as they are told to find a specific feature in the circle and press a key when they see it. Each key press moves the images further apart, and the distraction of expecting to lose focus and searching for the feature again lets the circles separate further while the viewer’s eyes follow. It’s a very interesting effect and has made me wonder about similar eye-focus techniques, such as autostereograms. Much like some previous ideas, it may border on being annoying or headache-inducing, but creating a visual experience that is viewed as an autostereogram could be an interesting way to create a hypnotic experience.

I’m also thinking about kaleidoscopes and the possibility of connecting their mechanics to an audio experience. One possibility is to use a wave file of a chord progression, much like in my Emission Control experiments, with a small looping window that adjusts with the kaleidoscope controls: mirror position would move the loop points over the wave file, and changing the number of mirrors would expand and contract the loop. It could even be done with pattern data, which would make it easier to generate the audio and allow it to be represented as visuals, or vice versa.
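A rough sketch of how those mappings could work, reduced to a single function (all names, ranges and the sample-based units are my own assumptions for the sake of the sketch):

```javascript
// Map kaleidoscope controls onto a loop window over a wave file:
// mirrorAngle in [0, 1) scans the loop start across the file, and a higher
// mirror count contracts the loop.
function loopWindow(totalSamples, mirrorAngle, mirrorCount,
                    minLoop = 1000, maxLoop = 44100) {
  // more mirrors = shorter loop, clamped to a minimum length
  const len = Math.max(minLoop, Math.floor(maxLoop / mirrorCount));
  // mirror position scans the loop start point over the file
  const start = Math.floor(mirrorAngle * (totalSamples - len));
  return { start, end: start + len }; // loop points in samples
}
```

The same mapping would apply to pattern data, with steps in place of samples.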

week 6: subdivisions

Polyrhythms and rhythmic ambiguity

Over the coming week I’m intending to experiment with polyrhythms; specifically, using polyrhythms to give the impression of multiple time signatures or tempi existing at once, or subtle shifts between the two.

An example of this is shown starting around 2:34 in this video by the excellent Mal Webb:

Shifting perception of the faster pulses between triplets, eighth notes, and other divisions, while ensuring the patterns fit in the larger structure really gives a sense of perspective shifts that I could exploit in my work.

This Tim Exile track uses similar shifts:

Notable times: 1:17, 2:40, 4:23.

Obviously I’ll be using such shifts in more subtle ways, by gradually fading or filtering each different perspective in and out, or having a certain “anchor” element gradually shift between two emphasis states, much like in this Autechre track:

In this case it’s the hihats that mostly dictate how the rhythm is perceived, fading between a fast 16th note pattern and a slower pattern which can either be perceived as dotted eighth notes, or quarter notes accentuating a triplet pattern.

Going further down the rabbit hole, and back to Mal Webb’s work, is this incredibly mathematical yet subtle exploration of rhythmic shifting:

This is less of a polyrhythm and more akin to polymeter, but has a similar mind-bending quality. The moments where the echoes mathematically align and turn into a rhythm are definitely something I want to explore in the coming weeks. In fact, I’ve explored similar timing-related “cascades” in the past:

.. though not as refined. I think I still have the project file for that track somewhere though, so it may be worth revisiting.

Pharos Designer

I’m trying to dig into the specifics of the Pharos Designer project file format, in the hope that I can generate some lighting patterns myself, but so far I’ve been unable to decode it. I’ve tried a hex editor, but my guess is that it’s some form of compressed file.

I will continue to look into it over the next few days; however, I’ll need to be prepared to do the hard work myself and manually create the lighting sequences.
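In the meantime, one quick check on the “compressed file” guess is to compare the file’s first bytes against well-known magic numbers (zip: 50 4B 03 04, gzip: 1F 8B, zlib streams often 78 xx). A small Node sketch, with the file path as a placeholder:

```javascript
// Identify common compressed-container formats from a file's first bytes.
function sniffMagic(buf) {
  if (buf[0] === 0x50 && buf[1] === 0x4b &&
      buf[2] === 0x03 && buf[3] === 0x04) return "zip";
  if (buf[0] === 0x1f && buf[1] === 0x8b) return "gzip";
  if (buf[0] === 0x78 && [0x01, 0x9c, 0xda].includes(buf[1])) return "zlib";
  return "unknown";
}

// Usage (the path is a placeholder for the actual Designer project file):
// const fs = require("fs");
// console.log(sniffMagic(fs.readFileSync("path/to/project-file")));
```

A “zip” result would be good news, since the project could then be unpacked and inspected with standard tools.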

Browser-based works

Something I’ve considered for my browser-based works is to make them either not interactive at all, or interactive in a non-direct way, i.e. the user influences the work in an abstract way in order to push it towards a certain condition where the visuals and audio are generated. This is partly going back to my Bounce work from last year, where the interaction is much less direct than in my more recent pieces Dungeon and Field.

My interpretation of Darrin’s comments about Bounce being a kinetic/haptic experience, observing the results of a physics engine, is that of the user watching the parameters they set (i.e. the position of the barriers, speed and density of the bouncing balls) play out in an unknown yet consistent way. It’s similar to why I enjoy using generative algorithms to make music: the sense of giving up some degree of control, yet still being able to influence the system.

Perhaps the three individual pieces I create for this work can represent three different levels of interactivity: from something like Wander, which is simply for observation; to Bounce / Field / Phase, where the user observes the result of set parameters; to the near fully manual Window / Dungeon / Repeat works, which are closer to traditional sequencers or loopers.

That’s a moiré

I’ve touched on it in previous blog posts, but something to explore for my browser-based works is the idea of moiré patterns. This could be an interesting way to represent any polyrhythmic relationships in the audio; not necessarily directly representing them, but some kind of display where a correlation could be drawn between the two. As with polyrhythmic audio, moiré patterns are something I’ve explored briefly in the past, so revisiting them and contextualising/pairing them with audio would be an interesting idea.

Even the overlaying of two simple dot matrices of slightly different sizes and rotations (Hermann 2012, p. 11) provides some interesting results. A simple browser-based work where the user can rotate/scale one dot matrix overlaid on another, static matrix, with polyrhythmic sequencing or tonal relationships represented alongside, could be a worthwhile concept to explore. It’s a very minimal concept, but in my opinion the best hypnotic artworks are minimal. It could include various different dot configurations, which the user could switch between using mouse clicks or key presses.
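As a starting point, the overlay could be computed as two dot lattices, one static and one rotated/scaled about the canvas centre. A helper like this (all names are my own) keeps the p5js draw loop down to plotting the returned points:

```javascript
// Dot positions for one lattice, rotated and scaled about the canvas centre.
// Overlaying latticePoints(w, h, spacing, 0, 1) with a second call using a
// small angle/scale offset produces the moiré.
function latticePoints(width, height, spacing, angle, scale) {
  const pts = [];
  const cx = width / 2, cy = height / 2;
  const cos = Math.cos(angle), sin = Math.sin(angle);
  // enough rows/columns to keep the canvas covered after rotation
  const n = Math.ceil(Math.max(width, height) / (spacing * scale)) + 2;
  for (let i = -n; i <= n; i++) {
    for (let j = -n; j <= n; j++) {
      const x = i * spacing * scale, y = j * spacing * scale;
      pts.push({ x: cx + x * cos - y * sin, y: cy + x * sin + y * cos });
    }
  }
  return pts;
}
```

In p5js, `draw()` would call this twice per frame, mapping `mouseX`/`mouseY` (or sequencer state) to the second lattice’s angle and scale.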

Optical illusions

I found a very extensive website of optical illusions (Kitaoka 2021), which also contains links to the author’s papers on various illusions and their mechanics (e.g. Kitaoka & Ashida 2003). This will be a valuable resource for my browser-based works.

Particularly interesting is the Ouchi illusion (Ouchi 1977), which is quite an extreme example, considering the very basic nature of the image:

Researching the mechanics behind this image reveals that random eye movements independently stimulate neurons depending on the horizontal and vertical placement, and thus the “disk” appears to jitter or move independently to the background (Barile & Weisstein n.d.).

Part of my experimentation in the coming week will be attempting to re-create some of the examples presented on Kitaoka’s site in p5js, and adjusting certain parameters with the attempt to find the “breaking point” of the illusions. Something like the following (Kitaoka 1998) would be fun to parameterise in p5js:
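As a first attempt at that kind of parameterisation, an Ouchi-style pattern can be reduced to a function deciding each cell’s colour, with the rectangle ratio and disk radius exposed as the parameters to push toward the illusion’s breaking point (this is my own reduction of the image, not Kitaoka’s construction):

```javascript
// Cell colour (0 or 1) for an Ouchi-style pattern: a checkerboard of
// ratio:1 rectangles, horizontal in the background and vertical inside a
// central disk at (cx, cy). Assumes non-negative coordinates.
function ouchiCell(x, y, cx, cy, radius, cell = 8, ratio = 4) {
  const dx = x - cx, dy = y - cy;
  const inDisk = dx * dx + dy * dy < radius * radius;
  // swapping the axes inside the disk rotates the rectangles 90 degrees
  const [u, v] = inDisk ? [y, x] : [x, y];
  return (Math.floor(u / (cell * ratio)) + Math.floor(v / cell)) % 2;
}
```

A p5js sketch would loop over cell corners and fill black/white from this function, with sliders for `cell`, `ratio` and `radius` to hunt for where the jitter effect collapses.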

References

Barile, M & Weisstein, E n.d., Ouchi illusion, Wolfram MathWorld, viewed 30 August 2021, <https://mathworld.wolfram.com/OuchiIllusion.html>.

Hermann, K 2012, ‘Periodic overlayers and moiré patterns: theoretical studies of geometric properties’, Journal of Physics: Condensed Matter, vol. 24, no. 31.

Kitaoka, A 1998, A bulge, digital image, Ritsumeikan University, viewed 30 August 2021, <http://www.ritsumei.ac.jp/~akitaoka/Bulge02c.jpg>.

Kitaoka, A 2021, Akiyoshi’s illusion pages, Ritsumeikan University, viewed 25 August 2021, <http://www.ritsumei.ac.jp/~akitaoka/index-e.html>.

Kitaoka, A & Ashida, H 2003, ‘Phenomenal Characteristics of the Peripheral Drift Illusion’, VISION, vol. 15, no. 4, pp. 261–262.

Ouchi, H 1977, Japanese Optical and Geometrical Art, 1st edn, Dover Publications, Mineola, NY.