Close to goals. I spent a bit of last night working with Octave, an open-source MATLAB workalike, getting various tools running so I could create and work with cochleagrams / correlograms. I got over an obstacle this morning, then several others. I created what I had been working towards — and after all that…
.. I realized I ALMOST had what I wanted already.
I just had to rotate it. I’ll probably have to stretch it too. Here I combined ffmpeg’s waveform/acolor output with its showcqt, after rotating both of them.
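For reference, here is a minimal sketch of the kind of ffmpeg invocation involved: render the CQT spectrogram with showcqt, then rotate it with the transpose filter (and stretch it with scale). The filenames and sizes are placeholders, not my actual command:

```shell
# Render a CQT spectrogram, rotate it 90 degrees so frequency runs
# horizontally ("sound writing"), then stretch it to the target size.
# input.wav, out.mp4, and the sizes are placeholder values.
ffmpeg -i input.wav -filter_complex \
  "[0:a]showcqt=s=1920x1080[cqt]; \
   [cqt]transpose=2,scale=1080:1920[v]" \
  -map "[v]" -map 0:a out.mp4
```

transpose=2 rotates counterclockwise; transpose=1 would rotate clockwise instead.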
It’s still not exactly it, but this view is a decent compromise for the moment.
On the right is my attempt at turning 432 individual 30 ms frames of autocorrelated wavelets into a video, and on the left is what ffmpeg could already do with its showcqt — except that I rotated it, which made a big difference to me.
On an individual frame basis, the cochleagram / autocorrelation is informative and says a lot about that “moment” or “instant”.
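To make that concrete, here is a small sketch of the per-frame autocorrelation idea (plain numpy rather than my Octave code, with an assumed 8 kHz sample rate and a 440 Hz test tone): for a single 30 ms frame, the lag of the first autocorrelation peak recovers the pitch period of that instant.

```python
import numpy as np

SR = 8000                       # assumed sample rate (Hz)
FRAME = int(0.030 * SR)         # one 30 ms frame = 240 samples
F0 = 440.0                      # assumed test tone (Hz)

t = np.arange(FRAME) / SR
frame = np.sin(2 * np.pi * F0 * t)

# Full autocorrelation of the frame; keep only non-negative lags.
ac = np.correlate(frame, frame, mode="full")[FRAME - 1:]
ac /= ac[0]                     # normalize so lag 0 == 1.0

# Skip the trivial lag-0 peak and find the first periodicity peak.
search = ac[5:FRAME // 2]
lag = 5 + int(np.argmax(search))
print(f"peak lag = {lag} samples -> ~{SR / lag:.1f} Hz")
```

The peak lands near the true period of SR / F0 ≈ 18.2 samples, so the estimated frequency comes out close to 440 Hz.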
But putting it into a movie just smushed it all together. I’m glad anyway, because I’ve learned a lot about Octave / MATLAB, which I didn’t really know how to use before, and I’ll try some more experiments. But rotating the ffmpeg graph achieved a lot of what I was looking for.
All of a sudden, I realized I needed to rotate the showcqt graph. That’s all. It turns it into sound writing. I never thought of it before now. But I did get to learn some MATLAB / Octave while messing around with cochleagrams — chasing this aesthetic without knowing it was right under my nose, just a rotation away.