Tuesday, July 3, 2018

AI is fundamentally concerned with the automatic creation, by a machine, of higher and more abstract representations of the world from simpler ones.  Ideally, such representations should come with statistical guarantees of their correctness.

Previous attempts to this end identified homomorphisms of algebraic structures as a fundamental tool for abstraction.  Early AI work applied them to solve simple board games by abstracting the board states.  In addition, more recent advances in image processing suggest that group symmetries are a good way to capture abstraction by ignoring unimportant changes to the image (https://www.microsoft.com/en-us/research/video/symmetry-based-learning/).  More concretely, we say s is a symmetry of f if f(s(x)) = f(x).  These two notions together suggest focusing on groups augmented with a probability measure to study the question of automatic abstraction.
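To make the definition concrete, here is a minimal Python sketch (a toy example of mine, not taken from the talk): f measures squared distance from the origin, and a 90-degree rotation s leaves it unchanged.

import numpy as np

# Toy check of the definition: f measures squared distance from the
# origin, and s rotates the plane by 90 degrees.  Rotation is a
# symmetry of f because f(s(x)) = f(x) for every x.
def f(x):
    return x[0] ** 2 + x[1] ** 2

def s(x):
    # 90-degree rotation: (a, b) -> (-b, a)
    return np.array([-x[1], x[0]])

x = np.array([3.0, 4.0])
assert np.isclose(f(s(x)), f(x))  # both equal 25.0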

We thus focus next on representations, symmetries, and homomorphisms of groups.
https://m.youtube.com/watch?v=qpGDNKgfHHg# is a nice introduction to the concept of group representations with examples. 

For any set X, the set of all 1-1, onto functions f : X -> X (bijections) with the composition operation forms a group.  As mentioned above, a symmetry of f is an s : X -> X such that f(s(x)) = f(x).  The first half of https://m.youtube.com/watch?v=MVoxtgVCo5g# by Alex Flournoy (up to ~32 minutes) motivates symmetries of transformations f : X -> X and introduces some relevant language, such as continuous, discrete, infinite, compact, local, and global symmetries.  The associated lecture notes are here: https://inside.mines.edu/~aflourno/Particle/Lecture2Groups%20and%20Representations.pdf.
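Here is a minimal Python sketch of this fact for a three-element set, checking closure and inverses by brute force (associativity is inherited from function composition):

from itertools import permutations

# The bijections of X = {0, 1, 2}, each encoded as a tuple p with
# p[i] the image of i, form a group under composition.
X = range(3)
bijections = set(permutations(X))

def compose(p, q):
    # (p o q)(i) = p(q(i))
    return tuple(p[q[i]] for i in X)

identity = tuple(X)

# Closure: composing two bijections yields another bijection.
assert all(compose(p, q) in bijections for p in bijections for q in bijections)
# Inverses: every bijection has an inverse under composition.
assert all(any(compose(p, q) == identity for q in bijections) for p in bijections)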

Some highlights from the symmetry-based learning work by Pedro Domingos et al. (https://www.microsoft.com/en-us/research/video/symmetry-based-learning/):
1. Symmetries are changes in the data obtained by group operations, such as rotations of a chair, under which you want the classifier to be invariant (a sketch of such invariance follows this list).
2. Symmetries may reduce the number of features, so we can learn from less data while keeping a healthy ratio between the number of features and the size of the training set.
3. Symmetries may reduce a search space.
4. The approach does not depend on the ML method being used.
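As a minimal Python sketch of point 1 (my own toy construction, not the paper's implementation), any feature function can be made rotation-invariant by averaging it over the orbit of the input under a finite group, here the four planar rotations of an image:

import numpy as np

# Invariance by construction: average any feature function over the
# orbit of the input under the group of 90-degree rotations.
def rotations(image):
    return [np.rot90(image, k) for k in range(4)]

def invariant_features(feature_fn, image):
    # The average over the orbit is unchanged when the input is
    # rotated, since rotating only permutes the orbit's elements.
    return np.mean([feature_fn(g) for g in rotations(image)], axis=0)

image = np.arange(16.0).reshape(4, 4)
flatten = lambda im: im.flatten()
assert np.allclose(invariant_features(flatten, image),
                   invariant_features(flatten, np.rot90(image)))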

Study group meeting slides -
https://drive.google.com/file/d/1SSDcrvE5uCM6J9xsRI2lvwhtVb61S7XP/view?usp=sharing.
YouTube playlist (in Hebrew) of the ML study group meeting -
https://www.youtube.com/playlist?list=PLRPue8gCw66-8mizHl7s0ZQzATzdL8FZJ

Some related papers follow.

1. An algebraic abstraction approach to reinforcement learning is given here: http://www.cse.iitm.ac.in/~ravi/papers/WALS03.pdf
2. Here is an approximate homomorphism approach: https://web.eecs.umich.edu/~baveja/Papers/fp543-jiang.pdf
3. Symmetry-based semantic parsing uses the concept of an orbit in a group to represent a set of paraphrases that implicitly defines the semantics of a sentence (a toy illustration of the orbit idea follows this list): https://homes.cs.washington.edu/~pedrod/papers/sp14.pdf
4. Work on deep symmetry networks: https://homes.cs.washington.edu/~pedrod/papers/nips14.pdf
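As a toy Python illustration of the orbit idea in item 3 (with a hypothetical rewrite rule, not the paper's actual paraphrase operations), the orbit of a sentence is the closure of the sentence under meaning-preserving rewrites:

# Compute the orbit of a sentence under a set of rewrite operations
# by repeatedly applying them until no new sentences appear.
def orbit(sentence, rewrites):
    seen = {sentence}
    frontier = [sentence]
    while frontier:
        current = frontier.pop()
        for rewrite in rewrites:
            candidate = rewrite(current)
            if candidate not in seen:
                seen.add(candidate)
                frontier.append(candidate)
    return seen

# One toy invertible rewrite: swap the two clauses around "and".
swap_clauses = lambda s: " and ".join(reversed(s.split(" and ")))
print(orbit("Alice sings and Bob dances", [swap_clauses]))
# -> {'Alice sings and Bob dances', 'Bob dances and Alice sings'}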

