Abstraction of abstractions – in great detail.

By abstractions I mean “levels removed from each other”. Or maps of maps. Or maps making connections between different maps on a third map, and so on up the dimensions.

For example, in the mapping experiment I did:

A) Here’s me. I’m sitting in a chair on a porch with a roof. This is now.
B) At some point in the past, in the same location relative to our current coordinate system, a flying machine shoots laser beams down over the roof of this screen room. The bounce-backs are turned into numbers, abstractions themselves, and stored on another abstracted thing, a hard drive, likely as tiny magnetized regions on spinning disks.
C) It is at this point that the 3D was converted into 1D (the number line) and 2D (the pattern on the disk) – although all of it *really* is 3D+Time; I call them the other dimensions for convenience’s sake.
D) That pattern was read at some point in the future, after the LIDAR scanning, converted to [whatever], and sent up to a server, where it was stored in a similar fashion.
E) At a point further in the future, I downloaded the numbers, which were read, written, and stored in a similar fashion as in C.
F) I converted these numbers into a “2D” image, with shades of grey representing heights. The numbers have been abstracted into colors on a 2D image file (a rough sketch of this step follows the list).
G) I take this image (a height map) and put it into a program called Worldmapper, which converts these colors into data structures readable by Minecraft, another abstraction.
H) I open Minecraft.
I) I now see a representation (abstraction) of the relative heights of everything around me, including the porch I’m sitting in, as it was when the scan was taken.
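
If I were to sketch that heights-to-image step (F) in code, it would look roughly like the Python below. It’s only an illustration of the idea, not what I actually ran: it assumes the LIDAR returns were already resampled onto a regular grid of heights, and the file names are invented.

```python
import numpy as np
from PIL import Image

# Hypothetical input: a regular 2D grid of heights in metres,
# one value per cell, resampled from the LIDAR point cloud.
heights = np.loadtxt("elevations.txt")

# Stretch the height range onto 0-255 so every height becomes a shade of grey.
lo, hi = heights.min(), heights.max()
grey = (heights - lo) / (hi - lo) * 255

# Save as an 8-bit grayscale image: heights abstracted into colors.
Image.fromarray(grey.astype(np.uint8), mode="L").save("heightmap.png")
```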

I can go to the coordinates I’m sitting at here, duck under a “grass block” representing the height of the roof above me, and stand there.

A virtual me is standing in a virtual recreation of the place I’m at, as it was at the time it was scanned, which thankfully _happens_ to be the same as it appears now.
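
Finding “my” block in the game is nothing more mysterious than an offset and a scale. Here’s a hypothetical sketch (the origin, scale, and coordinates are all invented for illustration; the real import used whatever Worldmapper chose):

```python
# Hypothetical: the imported map and the real world share an origin and a scale,
# so converting a real position to a Minecraft block is just offset-and-round.
SCALE = 1.0                                  # metres per block (assuming a 1:1 import)
ORIGIN_X, ORIGIN_Z = 431000.0, 3601000.0     # made-up world coordinates of the map corner

def to_block(x_world: float, z_world: float) -> tuple[int, int]:
    """Real-world easting/northing -> Minecraft block coordinates."""
    return (round((x_world - ORIGIN_X) / SCALE),
            round((z_world - ORIGIN_Z) / SCALE))

print(to_block(431012.3, 3601044.7))         # -> (12, 45): walk there and look up
```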

With a few changes, I can make the area look a lot like what I see around me.

If the scan were more detailed, and if Minecraft were more detailed (which it, or one of its successors, likely will be one day), it would feel “as if” I were walking around the same place I am in now.

Yet, I’m still sitting in my chair, looking at a piece of glass.

I’m analogizing [connecting ‘points’ on the screen and mapping them to my visual map of the area I see when I look *away* from the screen].

What’s the real real? If someone unplugs the power, I find out quickly enough. Yet, if I remain in there long enough *without* input from the world that’s _not_ virtual, my suspension of disbelief will prevail and I may believe that I am in reality when I am actually in the game.

But where is the real of it all? My eyes scan around me in 2mm segments at a time. The LIDAR scanner works similarly to the eyes+brain, just with far less detail. The LIDAR’s method of estimating distance is loosely based upon how we see, how it might be *if* our eyes shot out gauges of distance, which they effectively do by CHOOSING to look at certain areas and RECEIVING. Shooting the laser and timing its bounce back is equivalent to our INTENTION of pointing at a particular spot (unconscious or conscious).
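
The arithmetic behind that bounce-back is simple: the scanner times the round trip of the light pulse and halves it. A toy sketch (the timing value is made up purely to show the idea):

```python
# Time-of-flight: distance = speed of light * round-trip time / 2.
SPEED_OF_LIGHT = 299_792_458.0   # metres per second

def distance_from_echo(round_trip_seconds: float) -> float:
    """How far away the surface was that bounced the pulse back."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2

# A pulse that returns after about 33 nanoseconds hit something roughly 5 metres away.
print(distance_from_echo(33e-9))   # ~4.95
```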

The map of the area is created in my mind as my eyes scan, compressed and stored deep in memory. The hard drive works similarly to the brain, but my brain does not work similarly to the hard drive, because brains designed the hard drive as an abstraction of human long-term memory as we understand it, utilizing the technologies available to us.

So, that’s my abstraction of abstractions thinking, as well as I can put it without editing.
