Wednesday, 28 December 2016

μεταφορά III

1. Bias towards difference... where innovating means, rather, learning through comparison / differentiation (online learning of classification, cf. Brown)

2. Recall μεταφορά:

Analogy ↔ functor

Now, on this point Cat does not help: analogy is your guide, but this operation is anything but automatic...
[Reminder: initially, the primary stake of Cat lies in natural transformations...]
Cf. Spivak's remarks in (ProMat) http://journals.plos.org/plosone/article?id=10.1371/journal.pone.0023911
"It is important to note that ologs can be constructed on modeling and simulation, experimental studies, or theoretical considerations that essentially result in the understanding necessary to formulate the olog. This has been done for the proteins considered here on the basis of the results from earlier work which provided sufficient information to arrive at the formulation of the problem as shown in Figure 3"

3. In ProMat Spivak emphasizes hierarchical and functional aspects.
(Functors) such as Cat ~ Sch (cf. Spivak 5.4), or of type PhysSpGr (cf. "learning as categorification III"), or of the type of those in "learning as categorification", or word2vec (word → linear spaces).
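The word2vec case (word → linear spaces) is the easiest of these to illustrate. Below is a toy sketch with hand-made two-dimensional vectors (not a trained word2vec model): once words land in a linear space, an analogy becomes vector arithmetic.

```python
import numpy as np

# Hand-picked toy vectors standing in for trained embeddings.
vec = {
    "king":  np.array([1.0, 1.0]),
    "queen": np.array([1.0, 0.0]),
    "man":   np.array([0.0, 1.0]),
    "woman": np.array([0.0, 0.0]),
}

def closest(v, vocab):
    # Nearest word to vector v by Euclidean distance.
    return min(vocab, key=lambda w: np.linalg.norm(vocab[w] - v))

# "king is to man as queen is to woman": king - man + woman ≈ queen.
target = vec["king"] - vec["man"] + vec["woman"]
print(closest(target, vec))  # queen
```

The structure-transport is the interesting part: relations among words (the analogy) become relations among vectors (differences), which is what makes the map functor-like.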

4. I nevertheless suspect that the most useful / deep functors are symmetries, in the sense that:
Symmetry ↔ structure

Structure, taken in its mathematical sense. There are rather few of these...: linear spaces, groups... The difficulty is to see one of these structures in the domain under study. Most of the time this is not obvious.
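A small sketch of the symmetry ↔ structure idea (my own toy example): detecting that a quantity is invariant under a group action reveals structure in the domain. Here the group is the symmetric group S₃ permuting the inputs of f; the invariance tells us that input order carries no information for f.

```python
from itertools import permutations

def f(x, y, z):
    # A symmetric polynomial: unchanged under any permutation of its inputs.
    return x * y + y * z + z * x

point = (2, 3, 5)

# Collect f's value over the whole S_3 orbit of the input tuple.
values = {f(*p) for p in permutations(point)}
print(len(values) == 1)  # True: f is invariant under all of S_3
```

This is the cheap direction (verifying a suspected symmetry). The hard direction, as noted above, is discovering which group acts on a domain where no symmetry is visible at first.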


5. Forcing the comparison thus remains the objective:
a. ProMat
b. http://web.mit.edu/mbuehler/www/papers/BioNanoScience_2011_3.pdf
c. "The term 'log' (like a scientist's log book) alludes to the fact that such a study is never really complete, and that a study is only as valuable as it is connected in the network of human understanding. In this paper, we present the results of this study." — Spivak
http://journals.plos.org/plosone/article?id=10.1371/journal.pone.0023911

But how to do this precisely is anything but obvious...

6. An emerging paradigm greatly pushes back the traditional boundaries of AI:
DeepMind: "I would like to see a science where an AI would be a research assistant, doing all the tedious work of finding interesting articles and identifying a structure in a vast amount of data, reporting back to the human experts and scientists, who could then move faster"
This paradigm fits precisely into Spivak's vision in 5.c. Note the presence of the word 'structure'. My best guess would be to take the term in its strong, mathematical sense (and not in the sense in which DeepMind may understand it, its 'statistical' meaning: pattern).


7. In an inescapable race towards abstraction (cf. 4 in "learning as categorification III"), one can see that AI, after having long been engaged in recognizing material forms, might seek to (learn to) recognize abstract forms: structures / categories. It is thus necessary to understand the evolution which leads from linear regression (linear spaces), to trees, then to NNs, then to DNNs, then to combinations of DNNs (cf. "Speech Recognition with Deep Recurrent Neural Networks", Graves)
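The first rung of that ladder makes the "structure" point explicit: linear regression is learning inside a fixed mathematical structure, a linear space. A minimal sketch with toy data generated from y = 2x + 1 (assumed values, for illustration), solved by ordinary least squares:

```python
import numpy as np

# Toy data lying exactly on the line y = 2x + 1.
x = np.array([0.0, 1.0, 2.0, 3.0])
y = 2.0 * x + 1.0

# Design matrix [x, 1]: the model lives in a 2-dimensional linear space
# spanned by the columns; learning = projecting y onto that subspace.
A = np.column_stack([x, np.ones_like(x)])
(slope, intercept), *_ = np.linalg.lstsq(A, y, rcond=None)
print(slope, intercept)  # recovers 2.0 and 1.0
```

Each later rung of the progression (trees, NNs, DNNs) can be read as replacing this fixed linear structure with a richer, learned one, which is what makes the evolution worth understanding.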
