a) Created a plain-text version of the 2051 Wikipedia pages I had in my history + bookmarks. b) Ran them through topic modeling, which happily ate the single 42 MB text file I gave it, and asked it for 12 topics of 20 words each. c) Googled those word sets and matched them to articles like these (showing 8):

Surface tension: surface high flow change water due energy force stress material small low body rate large conditions temperature current size phase

Programming paradigm: theory language information term analysis model logic terms type object data based structure objects mathematical systems word common programming related

Embedded System: computer needed system systems design game software citation based video version memory control games technology support code data source power

Vector space: set function space number called group point functions defined linear numbers case elements vector values real sets finite points open

Introduction to the Field of Psychology: life social human world knowledge work view nature thought scientific study people person mind individual events philosophy psychology experience natural

Quantum mechanics: time system process state field theory physics physical mass light quantum earth called energy mechanics processes free input model single

Evolution of the neocortex: Perspective from developmental biology: doi pmid cid cell cells bibcode development pmc journal brain biology al form species acid neural activity evolution molecular protein

Classic Antiquity: century early history modern education greek age found art years period world part ancient bc church great ii middle culture


serial (long strips) / AND / chunks / associations / general – versus parallel (random, “at will”), OR / detail / specific… yeah, the machinery wouldn’t work well for that purpose. Perhaps that explains the power of cloud computation: even though network speeds are far slower than those of any particular hard drive or two, the ability to distribute work more or less in parallel increases the “space” available for computation; being connected, of course, to multiple computers, each with its own RAM. The protein folding experiments of a few years back were a fantastic example of using the branching/parallel _nature_ of the web itself (and the connected “computer brains” at the ends of the branches) to perform tasks difficult for any single supercomputer.
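The distribute-and-recombine pattern described above can be sketched in a few lines. Here `multiprocessing` on one machine stands in for the many networked “computer brains”; the chunk size and the squared-sum workload are arbitrary illustrations, not anything from the actual protein folding projects:

```python
from multiprocessing import Pool

def work(chunk):
    # Stand-in for an expensive, independent computation (think: one
    # protein-folding work unit). Any pure function of its chunk works.
    return sum(x * x for x in chunk)

if __name__ == "__main__":
    data = list(range(100_000))
    # Cut the serial "long strip" into independent chunks.
    chunks = [data[i:i + 10_000] for i in range(0, len(data), 10_000)]
    with Pool() as pool:                   # one worker per CPU core by default
        partials = pool.map(work, chunks)  # chunks computed in parallel
    total = sum(partials)                  # recombine the partial results
    assert total == sum(x * x for x in data)  # parallel result matches serial
```

The key property that makes the distribution worthwhile is that each chunk needs no data from any other chunk, so slow links between machines only cost you at the scatter and gather steps.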


Yes, indeed. A lot of my exposure was from pop psychology when I was young. The scripts people live by, the “inner child” stuff, all very common material floating around, along with multiple intelligences, trait development, etc. Some of it was scammy (a lot of it was, and still is; people will gladly pay money for promises of happiness and life-fulfillment), but there’s a lot of good in the material regardless of the dressing.
