It’s a big claim as it puts randomness as the ultimate answer whose complexity is so great we cannot possibly fathom its depths. Basically, it’s occupying a god-like position, possessing what would normally be considered super-natural abilities or abilities “so far beyond” us mere humans that it is omniscient. It’s a bold claim indeed and I would place it in the realm of religion.

So by that logic, if I’m understanding properly, that would place sets as superior to randomness, yes?

Yes, sounds like statistics or in the context of entropy, statistical physics.

—-

Set-theoretical logic will lead you to the Axiom of Choice, of course, but this is ultimately in the context of the field of mathematics, or perhaps more properly number theory, which crosses into the philosophy of mathematics.


 

—-

[my head is always in Taxonomy of concepts 🙂 ]

—-

So in your worldview (see what I did there? 😛 ), the concept of choice is equivalent to the “internal inertia”: the governing force, the replacement for Newtonian inertia?

===

Probabilities are a practical substitute for direct human observation and measurement. I believe they’re based upon the theoretical behavior of identical ideal gas particles in a thermodynamically enclosed system.

I know they have a basis prior to that, but I *think* the modern development of statistics was empowered by it.

===

 

Yes, because humans tend to follow learned patterns, including those of grammar, which we learn so young (prior to the age of 4) that by the time we reach the first “forgetting age” (somewhere around 7–8 years old), the processes are functionally automated for most of us.
—-

You provide the initial context based upon your prior assumptions, and should I mention them again (which I did), your assumptions are challenged by the narrative and refined, hopefully providing context for the cats at some point for the reader.

—-

Rules of language are modeled by mathematical proofs.

====

Evolution is a concept you are borrowing from biology, although evolution has been modelled by mathematics, after the fact, to great success.

—-

Indeed. As recently as 2015, Gödel’s theorem was proven again, formally and rigorously.

So far, it holds up strongly.

http://link.springer.com/article/10.1007/s10817-015-9322-8

====

 

That said, there are always ways to work around it, less rigorously, with some leaps of faith here and there. The pragmatic value of axiomatic systems, to me, “proves” (pun intended) their usefulness. I wouldn’t base a Universe entirely on them, but for human purposes they’re great.

====

 

Specialization has led to an increase in vocabulary. Most of this vocabulary is not in common use.
 
http://www.economist.com/blogs/johnson/2013/05/vocabulary-size
 
This estimate of 20,000-35,000 words known per adult person sounds about right.
 
The Bible having 10,000 distinct words doesn’t equate to humans knowing only 10,000 words.
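The distinction here is between a text’s word count and its vocabulary: a text can repeat a small set of word forms many times, so the number of distinct words in a book is only a floor on what its readers know, not an estimate of it. A minimal sketch of that distinction (the sample sentence and both helper functions are my own, purely illustrative):

```python
import re

def tokens(text):
    """All word tokens in the text, lowercased."""
    return re.findall(r"[a-z']+", text.lower())

def vocab_size(text):
    """Number of *distinct* word forms in the text."""
    return len(set(tokens(text)))

sample = "In the beginning was the word, and the word was with the word."
print(len(tokens(sample)))  # 13 tokens in total...
print(vocab_size(sample))   # ...but only 7 distinct words
```

On this sample, 13 running words collapse to a 7-word vocabulary; the same compression is why a 10,000-distinct-word Bible says little about a 20,000–35,000-word adult vocabulary.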
====
I love the setup for this experiment (the adversarial neural network): I hope to see more of this in use, as it may lead to a better ability for us to tell the difference between what’s from an analog source and what’s from the neural net, a kind of self-reflection process: “is this my idea (me being the NN), or did I get this from outside?”


We propose a new framework for estimating generative models via an adversarial process, in which we simultaneously train two models: a generative model G that captures the data distribution, and a discriminative model D that estimates the probability that a sample came from the training data rather than G. The training procedure for G is to maximize the probability of D making a mistake. This framework corresponds to a minimax two-player game. In the space of arbitrary functions G and D, a unique solution exists, with G recovering the training data distribution and D equal to 1/2 everywhere.
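The abstract’s claim about the unique solution can be sanity-checked numerically. The game’s value is V(D, G) = E[log D(x)] + E[log(1 − D(G(z)))]; at the optimum the abstract describes, G matches the data distribution, so D can do no better than outputting 1/2 for every sample, and the value is −log 4. A minimal sketch (the `value` helper is my own simplification for a discriminator with constant outputs, not code from the paper):

```python
import math

def value(d_real, d_fake):
    """Value of the minimax game when D outputs the constant
    d_real on real samples and d_fake on generated ones."""
    return math.log(d_real) + math.log(1.0 - d_fake)

# At the unique optimum: real and generated samples are
# indistinguishable, so D outputs 1/2 on both.
v_opt = value(0.5, 0.5)
print(v_opt)  # -log 4 ≈ -1.3863
```

Any constant other than 1/2 lets one of the two log terms be exploited by the other player, which is why the equilibrium sits exactly there.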

=====
