The Problem of Information II

May 19, 2016

1

In Part I, we established the data processing inequality and used it to conclude that no analysis of data can increase the amount of information we have about the world, beyond the information provided by the data itself:

$$I(\theta; g(X)) \leq I(\theta; X)$$

We're not quite done. The fundamental problem is not simply learning about the world, but rather *human* learning about the world. The full model might look something like this:

$$\text{world} \rightarrow \text{data} \rightarrow \text{mind}$$

Incorporating the human element requires a larger model and additional tools.

2

A channel is the medium by which information travels from one point to another: at one end, we have information, encoded as some sort of representation. We send this representation through the channel. A receiver at the other end receives some signal, which they reconstruct into some sort of representation. Hopefully, this reconstruction is close to the original representation.

No (known) channel is perfect. There is too much uncertainty in the underlying physics and in the mechanics of their very construction. Mistakes are made. Bits are flipped. We say the amount of information that a channel can reliably transmit is that channel's capacity. For a given channel, capacity is denoted $C$, and for input variable $X$ and output variable $Y$ it is defined like this ($\triangleq$ means "defined as"):

$$C \triangleq \max_{p(x)} I(X; Y)$$

This means that the capacity is equal to the maximum mutual information between $X$ and $Y$, over all distributions $p(x)$ on $X$. Using a well-known identity, we can rewrite this equation as follows:

$$C = \max_{p(x)} \big[ H(Y) - H(Y \mid X) \big]$$

This shows us that capacity is a function of both the entropy of $Y$ and the conditional entropy of $Y$ given $X$. The conditional entropy $H(Y \mid X)$ represents the uncertainty in $Y$ given $X$ – in other words, the quality of the channel (for a perfect channel, this value would be $0$). $H(Y)$ is a function of $p(x)$ and is what we try to maximize when determining capacity.

Observe that capacity is a function of both the channel and the randomness of the input. For a fixed channel, capacity is a function of the input. For a fixed input, capacity is a function of the channel (here it is known as "distortion").

Below is an example of what is known as a "Binary Symmetric Channel": a channel with two inputs and two outputs (hence binary), and a symmetric probability of error $\epsilon$.

[Diagram: binary symmetric channel, with each input flipped to the opposite output with probability $\epsilon$]

This diagram should be interpretable given what we've discussed above.

The major result in information theory concerning channel capacity goes like this:

$$P_e^{(n)} \rightarrow 0 \implies H(X) \leq C$$

What this says is that for any transmission scheme where the probability of error ($P_e^{(n)}$) goes to zero (as block length $n$ increases), the capacity of the channel is greater than or equal to the entropy of the input. This is true even for perfect channels (with no error) – meaning that $H(X)$, the uncertainty inherent in the source, is a fundamental limit in communication. More plainly, we observe that successful transmission of information requires a channel that is less uncertain than the source you're trying to transmit.

This should be intuitively satisfying. If the channel is noisier than what you're trying to communicate, the output will be more a result of that randomness than of whatever message you wanted to send. The flip interpretation, which is less intuitive, is that the more random the source, the more tolerant you can be of noisy channels. Finally, the converse tells us that any attempt to send a high-entropy source through a low-capacity channel is guaranteed to result in high error.
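To make the definition concrete, here is a quick sketch (in Python, with helper names of my own choosing) that computes the capacity of a binary symmetric channel by maximizing $I(X; Y) = H(Y) - H(Y \mid X)$ over input distributions, and checks the result against the known closed form $C = 1 - H_2(\epsilon)$:

```python
import numpy as np

def binary_entropy(p):
    """H_2(p) in bits, with 0 * log2(0) taken as 0."""
    ps = np.array([p, 1 - p])
    ps = ps[ps > 0]
    return -np.sum(ps * np.log2(ps))

def bsc_mutual_information(px1, eps):
    """I(X; Y) for a binary symmetric channel, with P(X=1) = px1."""
    py1 = px1 * (1 - eps) + (1 - px1) * eps  # P(Y=1) after channel noise
    h_y = binary_entropy(py1)                # H(Y): entropy of the output
    h_y_given_x = binary_entropy(eps)        # H(Y|X) = H_2(eps), independent of p(x)
    return h_y - h_y_given_x

eps = 0.1
grid = np.linspace(0.001, 0.999, 999)  # candidate input distributions p(x)
capacity = max(bsc_mutual_information(px1, eps) for px1 in grid)

print(f"capacity by grid search: {capacity:.4f} bits/use")        # ~0.5310
print(f"closed form 1 - H2(eps): {1 - binary_entropy(eps):.4f}")  # ~0.5310
```

The maximum lands at the uniform input $P(X{=}1) = 0.5$, matching the symmetry of the channel; for $\epsilon = 0.1$ the capacity is about $0.531$ bits per use.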
3

With that established, we can now consider the question of human communication. Let's consider the metaphor and see if it holds.

We want to say that the process of communication is exposing those around us to stimulus (ourselves, media, etc.), having that stimulus transmitted through the channels of perception, and ultimately represented in the mind as some sort of impression (such as an understanding or feeling). On first impression, this seems reasonable and general.

What is not present here is the concept of intention. In our communication, we may at various points be trying to teach, persuade, seduce, amuse, mislead, or learn. What is also absent is the concept of "creativity", or receiving an impression somehow greater than the stimulus. We will return to these questions later and see if we can address them.

Let's consider a simple case: the teacher trying to teach. We can assume good intention and an emphasis on the transfer of information. We model as follows:

$$\text{subject} \rightarrow \text{teaching} \rightarrow \text{perception} \rightarrow \text{understanding}$$

The "capacity" of human perception is then:

$$C = \max_{p(x)} I(X; Y)$$

where $X$ is the teaching as presented and $Y$ is the impression formed by the student. This allows us to consider both the randomness of the source (the teaching), and the uncertainty in the transmission (perception). We seem justified in proposing the following:

1. The challenge of teaching is in maximizing the information the student has about the subject.
2. A subject is "harder" if there is more complexity in the subject matter.
3. A subject is also "harder" if it is difficult to convey the material in an understandable way.
4. A "good" teacher is one who can present the material in a way that is appropriate for the students.
5. A "good" student is one who can make the most sense of the material that was presented.

Let's begin with (4), the idea of material being tailored to the student, or the input being tailored to the channel. Intuitively, we would like to say that a good teacher can change the teaching (the stimulus) they present to the student in order to maximize the student's learning.

First, consider that material may be too advanced for some students. We would like to say then that the capacity of that student was insufficient for the complexity of the material. To say this, we must first consider the relationship between randomness and complexity.

4

The language of information theory is the language of randomness and uncertainty. In teaching, it is more comfortable to speak in the language of complexity, difficulty, or challenge. Can these be equivalent?

Entropy is a measure of randomness, and entropy is a function of both 1) the number of possible outcomes of a random process, and 2) the likelihood of the various outcomes. A 100-sided fair die is more random than a 10-sided fair die, while a 1000-sided die that always comes up 7 is not really random at all.
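Here is a quick sketch (again Python, names mine) that makes those die comparisons numeric by computing the Shannon entropy $H = -\sum_i p_i \log_2 p_i$ of each distribution:

```python
import numpy as np

def entropy(probs):
    """Shannon entropy in bits, ignoring zero-probability outcomes."""
    probs = np.asarray(probs, dtype=float)
    probs = probs[probs > 0]
    return -np.sum(probs * np.log2(probs))

fair_10 = np.full(10, 1 / 10)      # 10-sided fair die
fair_100 = np.full(100, 1 / 100)   # 100-sided fair die
always_7 = np.zeros(1000)          # 1000-sided die that always comes up 7
always_7[6] = 1.0

print(f"H(10-sided fair die):  {entropy(fair_10):.3f} bits")   # ~3.322
print(f"H(100-sided fair die): {entropy(fair_100):.3f} bits")  # ~6.644
print(f"H(always comes up 7):  {entropy(always_7):.3f} bits")  # 0.000
```

More outcomes, spread more evenly, means more entropy; concentrating all the mass on a single face drives it to zero, no matter how many sides the die has.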
Complexity, on the other hand, can be understood as the number and nature of the relationships among the various parts of a system. We can perhaps formalize this as the number of pathways by which a change in one part of the system can affect the overall state of the system.

To argue equivalence, we assert that there is always some degree of uncertainty in any system, or in any field of study. In math, these are formalized as variables. In history, these can be the motivations of various actors. The more complex a system, the larger the number of outcomes and relationships between components. In the language of probability, we say there are more possible outcomes, and that due to the complex relationships between parts, there are significant odds of many different outcomes.

Consider the example of teaching math. Arithmetic is simpler than geometry, in that an arithmetic expression contains fewer conceptual "moving pieces" than a geometric one. Understanding arithmetic requires the student to keep track of the concept of "magnitude" and be able to relate magnitudes via relations of joining (addition and subtraction) and scaling (multiplication and division). It requires the abstract concept of negative numbers.

Understanding geometry requires more tools. It requires students to be able to deal with points in space, and to understand how to use the Cartesian plane to represent the relationship between points and numbers. It introduces the idea of "angle" as a new kind of relationship, on top of arithmetic's "bigger" and "smaller". Put another way, arithmetic requires only a line, while geometry requires a plane. More concepts means more possible relationships between objects, which means more possible dimensions of uncertainty, which means more complexity.

We conclude at least a rough equivalence between complexity and uncertainty.

5

Returning to the teaching example, we can now speak in terms of the complexity of the material instead of the randomness of the source. If material is too complex for the student ($H > C$), then the material cannot be taught to that student (yet). Observe that the channel (the student) is not fixed, but is able to handle more complex subjects over time.

… to be continued?