Try here:
http://lenslet.com/technology.asp

Nice. As far as I can tell, it's an advantage in parallelization more than an advantage explicitly tied to optical frequencies... it just happens that it's a lot easier to parallelize with optics than with transistors?
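
Roughly what I mean by the parallelization win, as a toy sketch (sizes and the numpy stand-in are my own invention, nothing to do with their actual part):

    # Sketch of the serial-vs-parallel point, not Lenslet's actual architecture.
    # A vector-matrix product is N*M multiply-accumulates; electronics grinds
    # through them a few per clock at best, while an optical vector-matrix
    # multiplier can in principle form every row's sum simultaneously --
    # one light pass, one detector per output.
    import numpy as np

    N = 256                      # input vector length (made-up size)
    M = 256                      # number of outputs   (made-up size)
    A = np.random.rand(M, N)     # "matrix" -- think modulator transmittances
    x = np.random.rand(N)        # input vector -- think source intensities

    # Electronic/serial view: N*M sequential multiply-accumulates.
    y_serial = np.zeros(M)
    for i in range(M):
        for j in range(N):
            y_serial[i] += A[i, j] * x[j]

    # Optical view (conceptually): all M dot products happen in one shot.
    # numpy's matvec just stands in for "everything at once".
    y_parallel = A @ x

    assert np.allclose(y_serial, y_parallel)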
Now call me stupid... I've looked at it back and forth three ways, and this *is* analog computation, isn't it? The 'result' is directly level-dependent and gets pulled through an ADC? So what we have here is one of the world's coolest digitally-programmable analog computers?
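
To spell out what I mean by 'level-dependent and pulled through an ADC', here's a toy model (the noise figure, full-scale range, and ADC width are all made up):

    # Toy model of the analog-then-ADC idea: the dot product shows up as a
    # light level on a detector, some noise rides on it, and an ADC turns
    # the level back into a number.
    import numpy as np

    rng = np.random.default_rng(0)

    a = rng.random(64)           # one "row" of weights (modulator settings)
    x = rng.random(64)           # input intensities

    ideal = a @ x                # what the math says the level should be

    detector_noise = rng.normal(0.0, 0.05)   # assumed noise floor
    analog_level = ideal + detector_noise    # what the detector actually sees

    # Assumed 8-bit ADC over an assumed full-scale range:
    full_scale = 64.0
    code = int(round(analog_level / full_scale * 255))
    readout = code / 255 * full_scale

    print(f"ideal={ideal:.3f}  analog={analog_level:.3f}  readout={readout:.3f}")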
(Hint: There are 'slightly less than as-close-to-purely-digital-as-we-can-get' technologies in action everywhere; Intel's StrataFlash is a good example.)
But it does seem like they'll want some noise reduction (use of digital-style quanta, AKA 'bits') if they want to *really* guarantee the results... while at the same time, the properties of the light mean they do get a StrataFlash-grade guarantee. (You know how much light is coming through each cell, which is presumably a Much Larger Number than the noise floor of the detectors.) Reworking it to approach digital perfection would probably kill their speed advantage, but then again... isn't there some way we could do the same 'para-analog' manipulation in silicon?
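
The 'digital-style quanta' fix I'm hand-waving at would look something like this (level count and noise figure invented for illustration): only allow a handful of levels, spaced well above the noise floor, so snapping to the nearest level recovers the intended value exactly... which is basically the multi-level-cell trick StrataFlash plays.

    # Sketch of the quantized-levels idea: restrict values to a few discrete
    # levels spaced far apart relative to the noise, and rounding a noisy
    # reading to the nearest allowed level recovers the intended value.
    import numpy as np

    rng = np.random.default_rng(1)

    levels = np.linspace(0.0, 1.0, 4)       # 4 allowed levels ("2 bits per cell")
    spacing = levels[1] - levels[0]         # ~0.333 between levels
    sigma = 0.03                            # assumed noise floor, << spacing/2

    sent = rng.choice(levels, size=10_000)                  # intended values
    received = sent + rng.normal(0.0, sigma, sent.shape)    # noisy readings

    # Snap each noisy reading back to the nearest allowed level.
    recovered = levels[np.abs(received[:, None] - levels[None, :]).argmin(axis=1)]

    print("error rate:", np.mean(recovered != sent))  # ~0 while noise << spacing/2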
...which in turn is the sort of trick Sun was/is looking to implement in their attempt at asynchronous architecture? ('Instructions' get implemented by magic blocks of circuitry... oof, now I need to dig up the name for that project.)