Believe it or not, this was the complete text of a single message: truth is wet with time

The following is a message to a friend, an excellent scholar and correspondent on lots of cool things.

He is more concerned with searching for truth than for life (or, more accurately, what it is to be alive and have your truths shattered). Or at least that is what I would worry about were I to critique Josh. But I mean this as a compliment as well as an eyebrow raise (critique can be positive or neutrally interesting). Josh just understands so much that the only things that even approach scholarly issues for him are actually pretty well defined. It's a case of discretitus.

It's a weak computational bias at first, but it can accelerate rapidly if you aren't careful.

But if there are to be truth statements about things that happen in the world, they must be temporally defined. I would argue this holds even if we allow for approximate simultaneity, so long as just-noticeable differences work out to be stable. But they are not stable. In fact, the order of stimulus presentation matters in and of itself. (Ackn: thanks to Thom Morgan for helping me think about this stuff.) So it changes over time, and the endpoint is affected by the slope over time, meaning there's also some notion of nearness (and therefore of farness), since there is some dimension somewhere along which you can imagine yourself having had a thought… were you to believe the information were delivered uniformly at random (i.e., on a Poisson schedule… not Erlang. Ackn: Josh Abbott, for getting me to think about random variations in vehicular transfer time (the time to get on and off the bus, plus the time it takes to slow to a stop, open the door, close the door, and then speed back up to what the max speed would be) to avoid collisions (or, more accurately… caravans).)

And then you could reasonably calculate the range of possible data structures that could result from the data you have seen, had it been delivered in the opposite order. Well, what does this amount to? Slippery truths that suggest a wide range of mutually exclusive but model-consistent ideas; particular instances of truth are only the knots that we happen to see, or rocks in the stream of thought/experience.
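
To make the order-sensitivity point concrete, here's a toy sketch (mine, not from the message): an online learner whose estimate depends on presentation order, such as an exponential moving average, ends up at a different endpoint when the very same stimuli arrive in reverse.

```python
import random

def online_estimate(stream, rate=0.3):
    """Exponential moving average: later stimuli weigh more than earlier ones."""
    estimate = 0.0
    for x in stream:
        estimate += rate * (x - estimate)
    return estimate

random.seed(1)
# The same fixed set of "stimuli"; only the delivery order differs.
data = [random.gauss(0, 1) for _ in range(50)]

print(f"forward:  {online_estimate(data):+.3f}")
print(f"backward: {online_estimate(list(reversed(data))):+.3f}")  # a different "truth"
```

Same data, mutually exclusive endpoints; which one you call the truth depends on the schedule it arrived on.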

Grasping at nothing. Nothing there to grasp.

Truths are slippery. Get w i m . s m g i n

Cheers, :€

[four image attachments]

•bzzzzrrrt•

[image attachment]

For another example of how this idea of the same input under different filters can mean very different things…

[image attachment]

Error sourcing, causal inference, abduction, high-level languages

Suppose your program fails.

As evidence for the reasonableness of this assumption I point to: Casey Liss (@caseyliss) described an instance of this occurring in episode 54, "goto fail;", around 29:00 minutes into the episode.

If there is a hidden fault (a memory leak) produced by the program, then presumably, if you were able to watch the activity in memory in relation to the commands being executed that cause that activity, and if the two clocks that govern both were appropriately synchronized, you might be able to identify the particular cause of all the memory leaking by considering how much of the allocated memory was actually due to a particular sequence of code (appropriately built into a functional equivalence class, where for any given input there is a set of other inputs that would produce the same output as well; i.e., a many-to-many mapping).

So in this case, suppose that every allocation was automatically analyzed with respect to the code that generated it. Could you then not use that as an error signal? I.e., you have identified what the problem is (memory leaking), have allocated appropriate portions of responsibility to all of the pieces of code, and can then analyze those to see if they trace back to a common source point.
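
As a minimal sketch of "every allocation analyzed with respect to the code that generated it": Python's standard-library tracemalloc records a traceback for each allocation, so you can group live memory by the code path that produced it. The leaky workload below is a made-up stand-in.

```python
import tracemalloc

tracemalloc.start(25)  # keep up to 25 stack frames per allocation

# Made-up stand-in for the suspect workload: a cache that never evicts.
cache = []
def leaky_step():
    cache.append(bytes(64 * 1024))  # 64 KiB that is never freed

for _ in range(100):
    leaky_step()

# Group all still-live allocations by the code path that produced them.
snapshot = tracemalloc.take_snapshot()
for stat in snapshot.statistics("traceback")[:3]:
    print(f"{stat.size / 1024:.0f} KiB across {stat.count} blocks, allocated at:")
    for line in stat.traceback.format():
        print("   ", line)
```

If one code path owns a wildly disproportionate share of the live memory, that's exactly the kind of error signal described above.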

I.e., if you find the root of a many-tiered, many-faced superorganism of causal relations (like oaks, or aspens?, or strawberries? See the Peter Godfrey-Smith post for details about this, or I think I also linked to one of the chapters that includes this information), and you excise the root, how much will wither?

It is in this way that bugs, diseases, and viruses infect our systems and have effects that are far-reaching for organisms much larger than themselves. But if we can identify these things in the real world, where we don't have access to literally all of the details, how can we not identify them automatically in our virtual worlds (simulated logic spaces), where we do have access to literally all of the details? This is a Turing-no-worries task, because in order for something to be sent it needs to have already been encoded.

This is also the case for the History of EVE Online book, which you should check out (today is the last Kickstarter day for it), and it'll be a text that I will probably be pulling a lot from in the time following its release (and possibly before, if they open-source their data).

That will also be the goal there: in a world where literally every interaction that could possibly have mattered could, by definition, have been recorded perfectly, to undertake the task of automatic abduction in a complex space. I hope to be able to build a model that could in theory be applied to this dataset and output the relevant root causes in an actually mathematically defined universe.
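
A minimal sketch of what "output the relevant root causes" could look like, assuming the history has been recorded as a causal graph (all the event names here are invented):

```python
from collections import deque

# Hypothetical recorded causal graph: effect -> set of direct causes.
causes = {
    "market_crash": {"fleet_loss", "panic_sell"},
    "fleet_loss":   {"ambush"},
    "panic_sell":   {"rumor", "ambush"},
    "rumor":        {"ambush"},
    "ambush":       set(),
}

def ancestors(event):
    """All events upstream of `event` in the recorded history."""
    seen, frontier = set(), deque([event])
    while frontier:
        for cause in causes.get(frontier.popleft(), ()):
            if cause not in seen:
                seen.add(cause)
                frontier.append(cause)
    return seen

observed_failures = ["market_crash", "panic_sell"]
candidates = set.intersection(*(ancestors(f) | {f} for f in observed_failures))
# Prefer the deepest explanations: candidates with no recorded causes of their own.
roots = {c for c in candidates if not causes.get(c)}
print(roots)  # {'ambush'}
```

Excise "ambush" from that toy history and, per the superorganism metaphor above, everything downstream of it withers.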

But a history of any series of computations is possible. And that history may be able to identify patterns that suggest fixing. Such abductive strength could even help create automatically improving code. If you want to get to really the highest possible level of programming, it is to be able to automatically change the particular implementation of some code on the fly (like field medicine, but field enhancement), to avoid creating obvious errors and to notify the developer when the code is so injured and confused that it doesn't know how to fix itself with super-high confidence.
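
Here's a toy take on that "field medicine" idea (my sketch, not a real system): try the primary implementation, swap in a safer one on failure, and only page the developer when the code can't patch around the problem itself.

```python
import logging

def with_field_medicine(primary, fallback):
    """Run the primary implementation; self-repair with the fallback on failure."""
    def patched(*args, **kwargs):
        try:
            return primary(*args, **kwargs)
        except Exception as first_error:
            logging.warning("primary failed (%s); swapping implementation", first_error)
            try:
                return fallback(*args, **kwargs)
            except Exception:
                logging.error("too injured to self-repair; developer needed")
                raise
    return patched

def fast_parse(s):   # brittle primary implementation
    return int(s)

def safe_parse(s):   # slower but more forgiving fallback
    return int(float(s.strip()))

parse = with_field_medicine(fast_parse, safe_parse)
print(parse(" 42.0 "))  # primary raises ValueError; the fallback repairs the call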

Code that would have caught Casey's bug, and fixed it for him: that is about as high-level as you can get. If you had a programming language that did that, it would be substantially easier for novices to begin coding.

For better or worse.

Cheers, :€

John Siracusa (@siracusa) would probably enjoy Oquonie

This week ATP (episode 65) discussed Monument Valley and, to a lesser extent, Journey.

I've played neither. I have played Oquonie. It was really good and is exactly what John Siracusa defined as a non-linear game (paraphrased):
You have to go back and do a tonne of things in a particular order, along topologically and temporally tortuous paths (a distinction I wouldn't have made were it not for the Monument Valley discussion).

Oquonie is that.

Also, holy crap, I am so happy that Squarespace figured out how to let you select text and navigate the cursor's position without activating the block-movement mode in their iPhone blogging app. Like seriously, kickass!

I wrote down how Casey Liss (@caseyliss; congrats btw, though I don't know why you want to have kids… not that you need to explain anything to me)

[image attachment]

a sidebar on Beats and Apple & Nest and IFTTT and Hue

John gets it. Regarding Beats: this was an investment in fashion. I said the other day that this was a day when Beats headphone users just got huge dividends on their investment in fashion. That is, now they are going to have the geeks around to help them progress into the brave new world of an automated life.

I really like the Incase Sonic headphones. But they are more subtly bold in their fashion style, so it'll take a while before the two-tone-chunk headphones (and other things, see picture) take off, but I think they will eventually. Just give it time.

I started buying brightly colored socks that I purposefully did not match in spring 2011. Then came bright shoes with different shoelaces in the left and right shoes, and shoes of the same style but different colors, so one can be worn on each foot. Only now are these getting big. It's the obvious reaction to the upcoming wave of intense monotony.

It's why I wish Apple would make a rose-gold-colored iPhone (like TWSBI's 580).

It'd twist the current path into new and interesting directions.

In any case, the reason this should pay off so well for Beats users is that, through their choice in fashion, they are now allied with the nerds of Apple fandom. To show their authenticity the nerds will kinda rebel and only grudgingly accept them, but in doing so the two groups will interact and offer massive improvements to each other's lives.

This generation, once combined with the backlash against hipsterdom, will finally spawn the bitNeaks (spoonerism: beatnik; pun: bit; jargon formatting reference: camelCase).

But that is a tale for another day (particularly after it's more obvious to more people that it's occurring).

The point is that there will be a stylish technorati who are able to improve people's lives. I want to share the most important parts with anyone who will hear them.

I want to say a few things that I think could dramatically improve people's lives.

[image attachment]

Nest and IFTTT and Hue: this automated life

Nest is a smart thermostat and fire alarm company that was bought by Google. The thermostat figures out when you won't be home and thereby saves you money by turning down the air when you leave. The fire alarm knows the different kinds of signals in smoke, so that it can detect what's likelier to be a fire versus a seared steak. I think. If that isn't the case, work on it, cause it should be.

IFTTT is short for If This Then That, which is an excellent way to automate various parts of your world by hooking them together with nice if-clauses, which behave rather nicely as event catchers for event streams in real (continuous) time (see the little sketch after the Hue examples below).

Hue are lights by Philips that will turn off when I leave home, and on when I return.

Off at night. Purple to signal last call to go to the store. Gradually on in the morning.

All of them turned off at once to go to bed, cause it's too late already.
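
For fun, here's roughly what those rules could look like if you rolled them yourself instead of using IFTTT (all the trigger and action names here are made up):

```python
# Hypothetical home-automation rules in the If-This-Then-That spirit:
# each rule pairs a trigger predicate with an action, applied to an event stream.
rules = [
    (lambda e: e == "left_home",        "lights: all off"),
    (lambda e: e == "arrived_home",     "lights: on"),
    (lambda e: e == "store_last_call",  "lights: purple"),
    (lambda e: e == "sunrise",          "lights: fade up gradually"),
    (lambda e: e == "way_past_bedtime", "lights: all off at once"),
]

def run(events):
    for event in events:
        for matches, action in rules:
            if matches(event):
                print(f"{event} -> {action}")

run(["left_home", "arrived_home", "store_last_call", "way_past_bedtime"])
```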

Don't some of the pieces look like they're jumping out at you?

Cheers, :€