
Communications & Composition


Poor Wizards Almanac

Poems

  • A Verse and Reply - Go to

  • If Alice Never - Go to

  • Lonely House - Go to

  • Mellaphant - Go to

  • Pi and the Speed of Light - Go to

  • Red Hues & Sterling Blues - Go to

  • The Man on TV - Go to

  • The Most Private Thing - Go to

  • Willie Tennis - Go to

Info. Theory

1.      Information Theories

a)      This is one model for measuring the number of facts a person knows over a given time, K(t).  It counts facts numerically and considers neither the accuracy of the facts it counts nor thoughts that are not facts.  For example, if a person states, “I know bananas are blue”, that counts as something they know whether or not it is correct, while “I wonder if I will ever visit outer space” does not count at all.  However, considering those examples, the model for this person would still include and count the facts that they “don’t know that bananas are commonly labeled yellow” and that they “know they wonder if they will ever visit outer space”.  Hopefully this will be clearer after the next paragraph.

(1)         Typically, a human is not aware of the numerical value of the number of facts they know, nor of the number of facts they do not know.  What, then, can be said about the relations between these values, logically and mathematically?  It seems that what we do or do not know can, in one model, be divided into 6 groups (A through F).  Each group breaks down further into past, present, and future when a sense of time is considered (the number subscript).  Also consider these other six quantities at time t, each assigned a positive magnitude: initial info (KIn), final info (KF), info gained or learned between the past and present (K1), info gained or learned between the present and future (K2), info forgotten or lost between the past and present (K3), and info to be forgotten or lost between the present and future (K4).  Therefore K1 + K2 - K3 - K4 = KNet, where KIn + KNet = KF.  This represents the net info between the past and future over the course of one’s life.  Of course, in many belief systems K2 and K4 disappear at a life’s end, and the equation becomes K1 - K3 = KNet.  Finally, there is also KT, knowledge total, which is all there is to know, and KU, facts unknown, which is the difference between KT and KF (KT - KF = KU).  The present-tense case is used to label each group heading, as described after the next paragraph.  (A small numerical sketch of these relations follows the outline below.)

The sequence of these values in time must also be considered.  This is set such that time event 0 corresponds to KIn.  The letter name of a variable (A, B, C, etc.) signifies its group.  The subscript after a variable distinguishes between members of the same group.

(2)   The Ks

  • K1(t) + K2(t) - K3(t) - K4(t) = KNet(t)

  • KIn + KNet(t) = KF(t)

  • KT(t) - KF(t) = KU(t)

(3)   What you know (A)
(a)    What you knew (A1)
  • A1(0) = 0

  • A1(t) = A2(tn)  for any tn < t

  • A1(t) = A3(tn)  for any tn < t

(b)   What you know (A2)
  • A2(0) = KIn = KT(0) - KU(0) - KNet(0)

  • A2(t) = KIn + K1(t) – K3(t)

  • A2(t) + K1(tn) - K1(t) + K3(t) - K3(tn) = A2(tn) for tn > t


(c)    What you will know (A3)
  • A3(0) = KIn + K2(0) - K4(0)

  • A3(t) = A2(t) + K2(t) - K4(t) = KF(t)


(4)   What you know you know (B)
(a)    What you knew you knew (B1)
(b)   What you knew you know (B2)
(c)    What you knew you will know (B3)
(d)   What you know you knew (B4)
(e)    What you know you know (B5)
  • B5(t) = A2(t)

(f)    What you know you will know (B6)
(g)   What you will know you knew (B7)
(h)   What you will know you know (B8)
(i)     What you will know you will know (B9)
(5)   What you don’t know (C)
(a)    What you didn’t know (C1)
(b)   What you don’t know (C2)
  • C2(t) = KU(t)

(c)    What you will not know (C3)
(6)   What you don’t know you don’t know (D)
(a)    What you didn’t know you didn’t know (D1)
(b)   What you didn’t know you don’t know (D2)
(c)    What you didn’t know you will not know (D3)
(d)   What you don’t know you didn’t know (D4)
(e)    What you don’t know you don’t know (D5)
(f)    What you don’t know you will not know (D6)
(g)   What you will not know you didn’t know (D7)
(h)   What you will not know you don’t know (D8)
(i)     What you will not know you will not know (D9)
(7)   What you know you don’t know (E)
(a)    What you knew you didn’t know (E1)
(b)   What you knew you don’t know (E2)
(c)    What you knew you will not know (E3)
(d)   What you know you didn’t know (E4)
(e)    What you know you don’t know (E5)
(f)    What you know you will not know (E6)
(g)   What you will know you didn’t know (E7)
(h)   What you will know you don’t know (E8)
(i)     What you will know you will not know (E9)
(8)   What you don’t know you know (F)
(a)    What you didn’t know you knew (F1)
(b)   What you didn’t know you know (F2)
(c)    What you didn’t know you will know (F3)
(d)   What you don’t know you knew (F4)
(e)    What you don’t know you know (F5)
(f)    What you don’t know you will know (F6)
(g)   What you will not know you knew (F7)
(h)   What you will not know you know (F8)
(i)     What you will not know you will know (F9)
(9)   Comments on third-order combinations – By third-order combinations I mean forming questions like, “What you knew you will know you knew”.  All the examples I have studied seem to condense logically into second-order questions.  However, I can’t prove that all of them do, because I don’t know enough about Boolean logic. 
(10)           Other comments at this time:  I can’t keep up with all the variables, comparing the mathematical equations to their verbal equivalents and then trying to verify their validity logically.  If anyone skilled gets hold of this, it would be great to see what a more complete model would look like. 
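
For anyone who wants to poke at the arithmetic, here is a minimal sketch in Python of the relations in the outline.  The numbers and the simple linear growth rates are invented purely for illustration, and the function names (k1, k_net, a2, and so on) are just shorthand for the K and A variables; only the identities themselves come from the model above.

# Toy numerical check of the K identities; every value here is made up.

K_IN = 100        # K_In: facts known at time 0
K_TOTAL = 10000   # K_T: all there is to know (assumed constant here)

def k1(t): return 5 * t   # learned between past and present
def k2(t): return 3 * t   # to be learned between present and future
def k3(t): return 2 * t   # forgotten between past and present
def k4(t): return 1 * t   # to be forgotten between present and future

def k_net(t):
    return k1(t) + k2(t) - k3(t) - k4(t)   # K1 + K2 - K3 - K4 = KNet

def k_f(t):
    return K_IN + k_net(t)                 # KIn + KNet = KF

def k_u(t):
    return K_TOTAL - k_f(t)                # KT - KF = KU

def a2(t):
    return K_IN + k1(t) - k3(t)            # A2(t) = KIn + K1(t) - K3(t)

def a3(t):
    return a2(t) + k2(t) - k4(t)           # A3(t) = A2(t) + K2(t) - K4(t)

for t in (0, 5, 10):
    assert a3(t) == k_f(t)                 # checks that A3(t) = KF(t)
    print(t, a2(t), a3(t), k_net(t), k_u(t))

Run as written, it prints A2, A3, KNet, and KU at a few sample times and confirms that A3(t) and KF(t) stay equal.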

Recursive Semantics

I've studied and written briefly, on a cursory level, about this topic. It's interesting indeed, and it can spiral out of control very quickly.

First, I made the connection between the nature of this problem and its relation to mathematics. The thought flow was something like: Words are functions -> Words are composite functions -> Some words are recursive functions.

Then it went on to: well-defined words are not recursive -> poorly defined words have recursions that can be isolated and resolved -> improperly defined words have recursions that cannot be resolved, which by default means the word is defined not objectively but subjectively -> subjectively defined words whose unresolvable recursion forms a loop are at least self-consistent -> and lastly, the absolute worst failures of words: subjectively defined words with unresolvable recursion that contradicts another word, which are therefore broken and paradoxical, with internal conflict.

Using that system, you can start to rate words as well or poorly defined and work to fix or better define the bad ones. You also eventually realize that the best-defined words are ones whose definitions hit a termination word or words: an independent variable, if you will, a self-contained word, a word with no further definition. But what would such a word be? It turns out that those words rely on fewer and fewer other words, until you reach the boundary between the letters that make up a word and the word itself. This in turn leads into topics like glyphs, sounds, syntax, and lexicography.
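
To make that concrete, here is a rough sketch of how the rating could be mechanized. It treats a dictionary as a directed graph from each word to the words used in its definition; the toy_dictionary entries are invented for illustration, and collapsing the tiers above into just "termination word", "recursive", and "well defined" is a simplification of the idea rather than a full implementation of it.

# Toy model: a dictionary maps each word to the words used in its definition.
# A termination word maps to an empty list (no further definition).
# The entries are invented purely to show the mechanics.

toy_dictionary = {
    "red":   ["color", "blood"],
    "color": ["light"],
    "blood": ["red", "fluid"],   # "blood" leads back to "red": a loop
    "light": [],                 # treated as a termination word
    "fluid": [],                 # treated as a termination word
}

def is_termination_word(word):
    # A self-contained word: nothing further defines it.
    return not toy_dictionary.get(word)

def has_unresolved_recursion(word, path=frozenset()):
    # Walk the definition chain depth-first; meeting a word that is already
    # on the current path means the definition eventually refers to itself.
    if word in path:
        return True
    return any(has_unresolved_recursion(w, path | {word})
               for w in toy_dictionary.get(word, []))

def classify(word):
    if is_termination_word(word):
        return "termination word"
    if has_unresolved_recursion(word):
        return "recursive definition (resolve it, or accept it as subjective)"
    return "well defined (bottoms out in termination words)"

for w in toy_dictionary:
    print(w, "->", classify(w))

On a real lexicon nearly every word would presumably come back flagged as recursive, which is the spiral described above; the interesting work is deciding which of those loops can be resolved and which cannot.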
