
Posted on 05-Jan-2016


2

• Ongoing software project, not “theory”

• Encapsulated internals & interfaces

• Today:
  – Details of module internals
  – Details of architecture & signaling/feedback
  – Single, clean, simple inputs
  – (26 slides)

• Not yet:
  – time
  – noise
  – robustness
  – multiple/partial hypotheses

3

One “compressor”:
• Generic memory unit
• Learns about low-dim structure in high-dim data
• Converts live data between low-dim ↔ high-dim

Hierarchy of compressors:
• Each learns from compressed & combined output of those below
• Bi-directional (feedback)
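The compressor interface described above can be sketched as a small class. The names and the internal mapping here are invented for illustration — a fixed orthonormal projection stands in for the learned point-cloud machinery detailed on the following slides:

```python
import numpy as np

class Compressor:
    """Generic memory unit sketch: converts live data between high-dim
    and low-dim. A fixed random orthonormal projection is an assumption
    standing in for the learned mapping described on later slides."""
    def __init__(self, high_dim, low_dim, seed=0):
        rng = np.random.default_rng(seed)
        A = rng.normal(size=(high_dim, low_dim))
        # QR gives an orthonormal high_dim x low_dim basis.
        self.Q, _ = np.linalg.qr(A)

    def compress(self, x):
        """High-dim -> low-dim forward output (exposed upward)."""
        return x @ self.Q

    def decompress(self, y):
        """Low-dim -> high-dim feedback (exposed downward)."""
        return y @ self.Q.T
```

With an orthonormal basis, any input that already lies on the learned subspace survives a compress/decompress round trip unchanged, which is the sense in which the unit "converts live data" in both directions.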

4

Compressor Internals

[Diagram: pipeline of internal stages]
• Quantizing & representing high-dim input
• Compressing
• Matching to previous compression
• Bi-directional mapping
• Probability estimation (P = p1 + p2 + …)

5

Quantizing & representing high-dim input

“Point” = position, weight, radius

Two point-clouds: mapping vs. learning (sync occasionally)

Online updates:
1. Find 3 closest cloud-points
2. Choose the lightest
3. Move it to absorb new point, preserving center of mass
4. Increase weight
5. Update radius
6. (prune lightweight points)

Result: Point-cloud approximates input cloud, with roughly equal weight per point
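The online update above can be sketched as follows. The spawn threshold, the radius smoothing factor, and the prune policy are assumptions for illustration — the slides do not specify them:

```python
import numpy as np

class PointCloud:
    """Online quantizer: each point has a position, weight, and radius.
    Hypothetical sketch of the slide's update rule."""
    def __init__(self, dim, max_points=100):
        self.pos = np.empty((0, dim))
        self.weight = np.empty(0)
        self.radius = np.empty(0)
        self.max_points = max_points

    def _add(self, x):
        self.pos = np.vstack([self.pos, x])
        self.weight = np.append(self.weight, 1.0)
        self.radius = np.append(self.radius, 1.0)

    def update(self, x):
        if len(self.weight) < 3:
            self._add(x)                      # bootstrap the cloud
            return
        d = np.linalg.norm(self.pos - x, axis=1)
        nearest = np.argsort(d)[:3]           # 1. find 3 closest cloud-points
        # Assumption: spawn a new point when x falls far outside all radii.
        if d[nearest[0]] > 2.0 * self.radius[nearest[0]]:
            self._add(x)
        else:
            i = nearest[np.argmin(self.weight[nearest])]  # 2. the lightest
            w = self.weight[i]
            # 3. move it to absorb the new point, preserving center of mass
            self.pos[i] = (w * self.pos[i] + x) / (w + 1.0)
            self.weight[i] = w + 1.0                      # 4. increase weight
            # 5. update radius (running average of absorbed distances)
            self.radius[i] = 0.9 * self.radius[i] + 0.1 * d[i]
        # 6. prune the lightest point if the cloud has grown too large
        if len(self.weight) > self.max_points:
            keep = np.argsort(self.weight)[1:]
            self.pos, self.weight, self.radius = (
                self.pos[keep], self.weight[keep], self.radius[keep])
```

Because every input either spawns a unit-weight point or adds one unit to an existing point, total weight tracks the number of inputs seen, and merging into the *lightest* neighbor pushes the cloud toward roughly equal weight per point.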

6

Compressing high to low (ISOMAP)

1. Find local distances in high-dim space

2. Create long-range distances from shortest piecewise path (“geodesic”)

3. Link “islands” until all Dij defined

4. Diagonalize F(Dij) to get low-dim cloud (arbitrary coordinates)
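Steps 1, 2, and 4 can be sketched with standard tools (step 3, island-linking, is omitted here — the sketch assumes the neighbor graph is already connected). F(Dij) is realized as the classical-MDS double-centering of the squared geodesic distances:

```python
import numpy as np
from scipy.sparse.csgraph import shortest_path
from scipy.spatial.distance import pdist, squareform

def isomap(X, n_neighbors=5, out_dim=2):
    """Sketch of the ISOMAP steps: local distances -> geodesics -> MDS."""
    n = len(X)
    D = squareform(pdist(X))
    # 1. Keep only local distances: each point's n_neighbors nearest links.
    G = np.full((n, n), np.inf)
    for i in range(n):
        nbrs = np.argsort(D[i])[1:n_neighbors + 1]
        G[i, nbrs] = D[i, nbrs]
        G[nbrs, i] = D[i, nbrs]
    # 2. Geodesics: shortest piecewise paths through the neighbor graph.
    Dg = shortest_path(G, method="D", directed=False)
    # (3. Island-linking skipped; assumes the graph is connected.)
    # 4. Diagonalize the double-centered squared distances (classical MDS).
    J = np.eye(n) - np.ones((n, n)) / n
    B = -0.5 * J @ (Dg ** 2) @ J
    vals, vecs = np.linalg.eigh(B)
    idx = np.argsort(vals)[::-1][:out_dim]
    return vecs[:, idx] * np.sqrt(np.maximum(vals[idx], 0.0))
```

The returned coordinates are only defined up to rotation and reflection — exactly the "arbitrary coordinates" problem the next slide addresses.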

7

Keeping new maps consistent with old ones

The low-dim mapping is not always unique…

…so rotate & stretch new cloud to minimize distance from old one (SVD)

[Figure: old cloud, new cloud, rotated new cloud]
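The rotate-and-stretch step is orthogonal Procrustes alignment, which has a closed-form SVD solution. A minimal sketch (function name invented):

```python
import numpy as np

def align_to_old(new, old):
    """Rotate and uniformly scale the new low-dim cloud onto the old one,
    minimizing squared distance (Procrustes alignment via SVD)."""
    nc = new - new.mean(axis=0)           # center both clouds
    oc = old - old.mean(axis=0)
    # SVD of the cross-covariance gives the optimal rotation.
    U, S, Vt = np.linalg.svd(nc.T @ oc)
    R = U @ Vt
    scale = S.sum() / (nc ** 2).sum()     # optimal uniform "stretch"
    return scale * nc @ R + old.mean(axis=0)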

8

Mapping new points using point-clouds

1. Find new point’s closest 4-5 neighbors

2. Express it as their center-of-mass (SVD)

3. Construct low-dim output from corresponding neighbors & weights

4. Also works mapping low → high

[Figure: new point written as the weighted combination W1…W4 of its neighbors, in both high-dim and low-dim spaces]
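The steps above can be sketched as a single function. The weights are solved by least squares through the pseudoinverse (which uses an SVD internally, matching step 2's parenthetical); the function name and the sum-to-one constraint construction are assumptions:

```python
import numpy as np

def map_point(x, cloud_hi, cloud_lo, k=5):
    """Map a new high-dim point to low-dim via its nearest cloud-points:
    find center-of-mass weights among high-dim neighbors, then apply the
    same weights to their low-dim counterparts."""
    d = np.linalg.norm(cloud_hi - x, axis=1)
    nbrs = np.argsort(d)[:k]              # 1. closest k neighbors
    N = cloud_hi[nbrs]                    # k x D neighbor positions
    # 2. Solve w @ N ~ x with sum(w) = 1: append a row of ones to
    #    enforce the center-of-mass constraint, solve via pinv (SVD).
    A = np.hstack([N, np.ones((k, 1))]).T # (D+1) x k system
    b = np.append(x, 1.0)
    w = np.linalg.pinv(A) @ b
    # 3. Construct the output from the corresponding low-dim neighbors.
    return w @ cloud_lo[nbrs]
```

Step 4 on the slide follows immediately: swapping the `cloud_hi` and `cloud_lo` arguments maps low → high with the same machinery.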

9

Prob. Estimation

Each point i is the center of a gaussian with radius Ri:

pi = exp(−0.5 ri^2 / Ri^2) / (Ri^D Ptot)

“Probability” of a test point is the sum over the local gaussians:

P = p1 + p2 + …

Probability = “Closeness” to manifold = how much to trust this point

… use it later in mixing estimates.
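The sum-over-local-gaussians estimate can be sketched directly from the formula. The slide does not define Ptot precisely, so normalizing by the number of cloud-points is an assumption here:

```python
import numpy as np

def manifold_probability(x, pos, radius):
    """'Probability' of a test point x: sum of gaussian bumps centered
    on the cloud-points. pos is (n, D), radius is (n,)."""
    D = pos.shape[1]
    r2 = ((pos - x) ** 2).sum(axis=1)     # squared distance to each point
    # Each point contributes exp(-0.5 r^2 / R^2) / R^D.
    p = np.exp(-0.5 * r2 / radius ** 2) / radius ** D
    # Assumption: Ptot taken as the number of points.
    return p.sum() / len(pos)             # P = p1 + p2 + ...
```

Points close to the cloud (on the manifold) score high; points far from every gaussian score near zero — exactly the "how much to trust this point" signal used in mixing below.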

10

Compressors interacting:

• Creating forward output

• Feedback mixed back in

• Settling

11

Creating output

• Map from high to low dim

• Expose result to all Compressors above

• Re-map output backwards to high dim

• Expose as feedback to Compressors below

12

Mix feedback into output

1. Average feedback from above

2. Get probabilities of feedback and own output

3. Create weighted mixture of them
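The three steps above can be sketched as one function; `prob_fn` stands in for the probability estimate of the previous slide, and the fallback when both probabilities are zero is an assumption:

```python
import numpy as np

def mix_feedback(own, feedbacks, prob_fn):
    """Blend a compressor's own output with feedback from above,
    weighting each candidate by its estimated probability ('trust')."""
    fb = np.mean(feedbacks, axis=0)       # 1. average feedback from above
    p_own = prob_fn(own)                  # 2. probability of each candidate
    p_fb = prob_fn(fb)
    total = p_own + p_fb
    if total == 0:
        return own                        # assumption: keep own output
    return (p_own * own + p_fb * fb) / total   # 3. weighted mixture
```

A low-trust candidate (far from the learned manifold) is thus largely ignored, whichever direction it came from.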

13

Updating and settling

1. Expose mixture as updated output, and map downward as updated feedback

2. Iterate a few times to settle
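The settling loop can be sketched for a single compressor driven by a fixed top-down target. The orthonormal projection `Q` and the constant trust weight `alpha` are assumptions standing in for the learned mapping and the probability-based mixing:

```python
import numpy as np

def settle(x, Q, feedback_target, alpha=0.5, iters=3):
    """Sketch of settling: repeatedly mix top-down feedback into the
    output, re-map downward, and re-expose the updated output.
    Q is an orthonormal high->low projection (an assumption)."""
    out = x @ Q                                   # initial forward output
    for _ in range(iters):
        out = (1 - alpha) * out + alpha * feedback_target  # mix feedback in
        x = out @ Q.T                             # map downward as feedback
        out = x @ Q                               # expose updated output
    return out, x
```

Each iteration pulls the output a fixed fraction toward the feedback, so a few rounds suffice for the state to settle near a compromise between bottom-up evidence and top-down expectation.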

--- done with description of system ---

14

General simulation results

• 3-layer hierarchy with 2-1 convergence

• Input is 9x6 “pixel” space with random illumination

• Display low-dim output in 2-D color

15

Simple 1-dim illumination

How does each module map the input space?


16

17

Toroidal 1-dim illumination

How does each module map the circular input space?


18

19

2-dim spot illumination

How does each module map the 2-D input space?


20

21

“Hallucinating” spots driven from above

1. Force activity at a single location in top module

2. Let feedback move down

3. Look at what lower modules think input ought to be

22

23

2-dim clustered spots (left & right)

How does each module map the 2-D input space?


24

25

Next steps

Architecture
– Time
– Reference problem
– Reference platform
– Integration method
– Separate streams for transforms vs. objects
– Get people involved!

Algorithms
– Noise
– Multiple hypotheses
– Distributed representation – “neurons”
– Better quantization, mapping, robustness