A bit of a lot of things – lots of trial and error. But I settled on a process for now: Paint.NET is my home base.

a) Have a color image.
b) Convert it to 1-bit using a little thing I found: http://blog.roguecode.co.za/update-for-my-agent-1-bit-image-converter-1bitter/
c) Open in Inkscape.
d) Ctrl-A (Select All).
e) Path: Trace Bitmap, "Grays"; in Options, turn off Smooth Corners.
f) Path: Simplify (Ctrl-L) about three times.
g) File: Export Bitmap.

To do:
1) Automate b) – maybe ImageMagick "convert" can do it?
2) Maybe shrinking the image to something like 64×64 will make thresholding work better? I don't know.
3) Use potrace instead of Inkscape for tracing. I can, but since I haven't mastered potrace at all, I have to learn its options (same problem I have with ImageMagick – I don't know it well).
4) Find another way to "simplify". I think potrace can, but I don't know for sure.


It was the help of an app called Fourier Anime that got me started. It created a rubber-band animation from a standard image. I took one of the frames and simplified it further using Inkscape's "Simplify" to get here. The app page for Fourier Anime says it uses this process, which I'd like to figure out how to do by hand:

1. Edge Detect: Differentiate the luminosity values of the monochrome input image; where the luminosity changes significantly, we get the edges.
2. Find Contours: Extract contour curves from the edge image one by one.
3. Fourier Transform: Contours are sequences of points. Convert these complex sequences into Fourier coefficients via the Fourier transform.
4. Low-pass Filter: Keep only the low-frequency coefficients.
5. Inverse Fourier Transform: Reconstruct the complex sequence of points from the low-pass-filtered coefficients, and draw the curves on the display (parametric representation).
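Steps 3-5 are the part that fits in a few lines of plain Python, so here's my own sketch of them – a naive O(n²) DFT via cmath, not Fourier Anime's actual code. (Steps 1-2, edge detection and contour finding, are the parts that really want an image library.)

```python
# Steps 3-5 above, sketched with a naive DFT in plain Python.
# A contour is a sequence of points encoded as complex numbers
# (x + iy); keeping only low-frequency coefficients smooths it.
import cmath

def dft(seq):
    """Forward transform: complex points -> Fourier coefficients."""
    n = len(seq)
    return [sum(seq[t] * cmath.exp(-2j * cmath.pi * k * t / n)
                for t in range(n)) / n
            for k in range(n)]

def idft(coeffs):
    """Inverse transform: coefficients -> complex points."""
    n = len(coeffs)
    return [sum(coeffs[k] * cmath.exp(2j * cmath.pi * k * t / n)
                for k in range(n))
            for t in range(n)]

def low_pass(coeffs, keep):
    # Frequencies wrap around: index k and n-k are +/- the same
    # frequency, so keep `keep` bins from each end (k=0 is the DC
    # term, the contour's centroid).
    n = len(coeffs)
    return [c if (k <= keep or k >= n - keep) else 0
            for k, c in enumerate(coeffs)]

if __name__ == "__main__":
    # A jagged, roughly square contour as complex points.
    contour = [complex(x, y) for x, y in
               [(0, 0), (2, 1), (4, 0), (5, 2), (4, 4), (2, 5), (0, 4), (-1, 2)]]
    smooth = idft(low_pass(dft(contour), keep=2))
    print([(round(p.real, 2), round(p.imag, 2)) for p in smooth])
```

Keeping all coefficients reproduces the contour exactly; the smaller `keep` gets, the more "rubber-band" the curve becomes, which matches what the app's animation looks like.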


I had to do this in my life. The projects I want to do can take hours of concentration, but reality gives me 5-15 minute increments throughout the day (except at night, which is why I'm up right now). So I learned to have logical stopping points at about the 15-minute mark with ANYTHING I do. I can usually stall an extra 10 minutes, cut off, and move on to whatever chaotic thing is SOO DARN IMPORTANT that I have to pack up and leave for hours. If it's a super quick thing, I can do it right away and return to my task, but usually that's followed by another interruption within 10 minutes, so sometimes even that 15-minute logical stopping point stutters along for hours :)


I've had many, many nephews and nieces through the years, so I've taken my tendency for excessive accuracy – the "too much information", "too literal" side – and put it to good use. Say what you're going to do. Do what you said you were going to do. Only explain what you did if there are any questions or quizzical looks. If there are, improve so that next time there are fewer and fewer "question mark faces".


Try as I might, I keep going back to Processing (now at 3). I made a brief stop back at GW-BASIC (looking at PC-BASIC, which is a 64-bit implementation). I keep thinking: I want to get good at Julia, at R, at Python, at Lisp, at Clojure, at Erlang, at Scheme… but what is it that I'm REALLY wanting to do? Interactive, but not a game. Drawing, but not just a static plot of a function (to embed in a PDF, EVEN IF it's in PS and auto-documents as a scientific paper). Simple, logical syntax without code overhead. Easy access to ports, graphics, sound. A toy. A toy to test quick ideas in, not to run a 2021 banking-system server or a massive multiplayer game. Ad hoc.


Been fascinated by it since I was a kid watching science shows and devouring library books (I didn't know who Hawking was, as he hadn't written his book yet). Encyclopedias, mostly. Later, I rooted for Loop Quantum Gravity and was disappointed that String Theory took over for so long, until M-branes, when I liked String Theory again. Now I've been back at discrete for a bit: first off, "quantum" is _COUNTING_ – not continuous, by definition – and recent work by DARPA on asymmetrical fractional and integer Quantum Hall effect materials really resurrected my interest, which never died really. But now I can look forward to a new way to do computing, with memory and logic happening on the same chips, NO MORE LOST DATA… all coming soon. Real soon. I'm happy.
