I wasn’t happy with my previous WebGL PhotoSynth Viewer: my JavaScript parser knew the positions of the cameras, but I wasn’t doing anything useful with them. Furthermore, I was relying solely on XB-PointStream for the rendering (which displays a dense point cloud perfectly well), and I definitely want to do fancier stuff. So goodbye XB-PointStream, hello Three.js!
The only trouble with this demo is that it uses a PHP proxy page to work around the AJAX cross-domain security restriction. So I won’t put it online (it would blow through my bandwidth limit), but I’ll try to update my Google Chrome extension so that you can give it a try too (it will take time, and I’m going on holiday… sorry).
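To illustrate the workaround: the page only ever issues same-origin requests, and a small server-side relay fetches the remote Photosynth resource on its behalf. A minimal sketch of the client side, assuming a hypothetical `proxy.php` endpoint that takes the real URL as a query parameter:

```javascript
// Build a same-origin URL that asks the server-side relay (hypothetical
// proxy.php) to fetch a cross-domain resource for us. The browser only
// sees a same-origin request, so the AJAX restriction never triggers.
function proxyUrl(target) {
  return 'proxy.php?url=' + encodeURIComponent(target);
}

// The XHR then targets proxyUrl('http://remote.host/points.bin')
// instead of the remote host directly.
```

The downside is exactly the one mentioned above: every byte of the remote data now flows through your own server, so your bandwidth bill scales with your visitors’ usage.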
On the tech side, this demo is using:
- Three.js: for the rendering
- Downloadify: to let the user download the PLY file generated on the client side
- jDataView: to parse the binary XHR response
- ImageFlow: coverflow system
- Accordion (except the CSS) + Tween + SynthParser: created by me
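The binary-parsing and PLY-export steps above can be sketched together. This is only an illustration, not the demo’s actual code: it uses the standard `DataView` (jDataView wraps the same idea cross-browser), and the buffer layout (consecutive little-endian float32 x, y, z triples) is an assumption, not Photosynth’s real point format.

```javascript
// Decode XYZ float triples from a binary ArrayBuffer, as one would do
// with the response of a binary XHR. Assumed layout: little-endian
// float32 x, y, z per point (12 bytes each).
function decodePoints(buffer) {
  const view = new DataView(buffer);
  const points = [];
  for (let offset = 0; offset + 12 <= buffer.byteLength; offset += 12) {
    points.push([
      view.getFloat32(offset, true),     // true = little-endian
      view.getFloat32(offset + 4, true),
      view.getFloat32(offset + 8, true),
    ]);
  }
  return points;
}

// Serialize the decoded points as a minimal ASCII PLY file; the
// resulting string is what gets handed to Downloadify for saving
// on the client side.
function toPly(points) {
  const header = [
    'ply',
    'format ascii 1.0',
    'element vertex ' + points.length,
    'property float x',
    'property float y',
    'property float z',
    'end_header',
  ].join('\n');
  const body = points.map(p => p.join(' ')).join('\n');
  return header + '\n' + body + '\n';
}
```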
Other news
On another note, I’ve been busy:
- compiling a version of Bundler with PBA support (with bad results so far: the BA is fast but produces an erroneous point cloud)
- creating a new framework for fast matching: it is already 3x faster than OpenSynther (which is itself really fast). The new framework supports multi-GPU and multi-CPU setups, and you can put constraints on its memory usage!
- creating a hybrid AR solution for my current work, based on my panorama-tracking demo and using gyroscope information to compensate for the computation time of the vision part