Andrea Colaço | https://andreacolaco.info | Technology tinkerer + entrepreneur

Mime: Compact Low-Power 3D Gesture Sensing
https://andreacolaco.info/mime-compact-low-power-3d-gesture-sensing/ (Mon, 26 Sep 2016)

Abstract

Mime is a compact, low-power 3D sensor for unencumbered, free-form, single-handed gestural interaction with head-mounted displays (HMDs). It introduces a real-time signal processing framework that combines a novel three-pixel time-of-flight (TOF) module with a standard RGB camera. The TOF module achieves accurate 3D hand localization and tracking, enabling motion-controlled gestural input; joint processing of the 3D data with the RGB images enables finer, shape-based gestural interaction.
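A toy sketch of the localization step, assuming each TOF pixel yields a direct hand-to-pixel range estimate (the pixel layout and numbers below are illustrative, not Mime's actual optics or algorithm): three such ranges place the hand at the intersection of three spheres, which reduces to a small linear solve when the pixels are coplanar.

```python
import numpy as np

# Hypothetical layout: three TOF pixels on the z = 0 plane of the sensor
# module (positions in metres are invented for illustration).
PIXELS = np.array([[0.00, 0.00, 0.0],
                   [0.03, 0.00, 0.0],
                   [0.00, 0.03, 0.0]])

def localize_hand(distances, pixels=PIXELS):
    """Trilaterate a single 3D point from per-pixel range estimates.

    Subtracting the first sphere equation |p - r_i|^2 = d_i^2 from the
    others linearizes the problem; with coplanar pixels the resulting
    2x2 system gives x and y, and z follows from one sphere equation
    (the positive root, since the hand is in front of the module).
    """
    r1, d1 = pixels[0], distances[0]
    A = 2.0 * (pixels[1:] - r1)[:, :2]          # z-columns vanish: pixels coplanar
    b = (d1**2 - distances[1:]**2
         + np.sum(pixels[1:]**2, axis=1) - np.sum(r1**2))
    xy = np.linalg.solve(A, b)
    z2 = d1**2 - np.sum((xy - r1[:2])**2)
    z = np.sqrt(max(z2, 0.0))                   # hand is in front of the device
    return np.array([xy[0], xy[1], z])
```

With noise-free ranges the reconstruction is exact; in practice the per-pixel range estimates would come from the measured TOF waveforms and carry noise, which a least-squares or tracking filter would smooth.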

[Image: mime-compact-low-power]

Videos

Collaborators

Ahmed Kirmani, Hye Soo Yang, Nan-Wei Gong, Chris Schmandt, Vivek K. Goyal

Files

  • A Colaço, A Kirmani, HS Yang, NW Gong, C Schmandt, VK Goyal, "Mime: Compact, Low-Power 3D Gesture Sensing for Interaction with Head-Mounted Displays," UIST 2013.
    Paper
CoDAC: Compressive Depth Acquisition Camera
https://andreacolaco.info/codac/ (Mon, 26 Sep 2016)

Abstract

CoDAC is a new time-of-flight-based range measurement system for acquiring depth maps of piecewise-planar scenes with high spatial resolution using a single, omnidirectional, time-resolved photodetector and no scanning components. In contrast with the 2D laser scanning used in LIDAR systems and the low-resolution 2D sensor arrays used in TOF cameras, CoDAC demonstrates that it is possible to build a non-scanning range acquisition system with high spatial resolution using only a standard, commercially available, low-cost photodetector and a spatial light modulator.
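A minimal sketch of the compressive measurement idea, with illustrative sizes and a generic sparse solver standing in for CoDAC's actual parametric reconstruction: each SLM pattern mixes the scene pixels into a single photodetector sample, and the set of pixels occupying one arrival-time bin can be recovered from fewer samples than pixels when that set is sparse.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 64                                           # 8x8 pixel scene, flattened
M = 48                                           # compressive measurements (< N)
Phi = rng.standard_normal((M, N)) / np.sqrt(M)   # stand-in for random SLM patterns

# Toy scene: a small planar patch occupies one TOF arrival-time bin.
patch = [9, 10, 11, 17, 18, 19, 25, 26, 27]      # 3x3 patch, flattened indices
x_true = np.zeros(N)
x_true[patch] = 1.0

y = Phi @ x_true                                 # detector signal at that time bin

def ista(Phi, y, lam=0.02, iters=800):
    """Iterative soft-thresholding (ISTA) for the lasso: a generic
    sparse-recovery routine, not CoDAC's parametric algorithm."""
    L = np.linalg.norm(Phi, 2) ** 2              # Lipschitz constant of gradient
    x = np.zeros(Phi.shape[1])
    for _ in range(iters):
        g = x + Phi.T @ (y - Phi @ x) / L        # gradient step on the data term
        x = np.sign(g) * np.maximum(np.abs(g) - lam / L, 0.0)  # shrinkage
    return x

x_hat = ista(Phi, y)
support = np.flatnonzero(x_hat > 0.5)            # recovered pixels at this depth
```

Repeating the recovery per arrival-time bin assembles a depth map; the real framework instead exploits the piecewise-planar scene structure directly, which is what allows high spatial resolution from a single photodetector.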

[Image: codac1]

[Image: codac2]

Videos

 

Collaborators

Ahmed Kirmani, Franco N. C. Wong, Vivek K. Goyal

Files

  • A Kirmani, A Colaço, FNC Wong, VK Goyal, "CoDAC: A Compressive Depth Acquisition Camera Framework," ICASSP.
    Paper
First Photon Imaging
https://andreacolaco.info/first-photon-imaging/ (Mon, 26 Sep 2016)

Abstract

Imagers that use their own illumination can capture 3D structure and reflectivity information. With photon-counting detectors, images can be acquired at extremely low photon fluxes, but to suppress the Poisson noise inherent in low-flux operation, such imagers typically require hundreds of detected photons per pixel for accurate range and reflectivity determination. We introduce a low-flux computational imaging technique, called first-photon imaging, that exploits both the spatial correlations found in real-world scenes and the physics of low-flux measurements. Our technique recovers 3D structure and reflectivity from only the first detected photon at each pixel. We demonstrate simultaneous acquisition of sub-pulse-duration range and 4-bit reflectivity information in the presence of high background noise. First-photon imaging may be of considerable value to both microscopy and remote sensing.
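A toy model of the first-photon measurement, with made-up efficiency and jitter values, and simple patch pooling standing in for the paper's spatial regularization: the number of pulses fired before the first detection is geometrically distributed with success probability proportional to reflectivity, and the detection time encodes range.

```python
import numpy as np

rng = np.random.default_rng(1)
C = 3e8                                    # speed of light (m/s)

def first_photon_pixel(reflectivity, depth, q=0.01, jitter=1e-10):
    """Simulate one pixel: pulses fired until the first detected photon.

    Per pulse, a detection is Bernoulli with probability q * reflectivity,
    where q lumps source power and detector efficiency (an assumption of
    this toy model, not a parameter from the paper).
    """
    n = rng.geometric(q * reflectivity)    # pulse count until first photon
    t = 2.0 * depth / C + rng.normal(0.0, jitter)  # round trip plus jitter
    return n, t

def patch_estimates(ns, ts, q=0.01):
    """Pooled ML estimates over a patch of like pixels.

    Pooling plays the role of spatial regularization in this sketch: for
    i.i.d. geometric counts, the MLE of the detection probability is
    K / sum(n_i), so reflectivity follows after dividing out q, and range
    follows from the mean round-trip time.
    """
    refl = len(ns) / (q * np.sum(ns))
    depth = C * np.mean(ts) / 2.0
    return refl, depth
```

Note that the naive per-pixel estimate 1/(q·n) is heavily biased at one photon per pixel; exploiting correlations across neighboring pixels is what makes single-photon range and reflectivity estimates usable.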

[Image: First-photon imaging research project]

 


Videos

Collaborators

Ahmed Kirmani, Dheera Venkatraman, Dongeek Shin, Franco N. C. Wong, Jeffrey H. Shapiro, Vivek K. Goyal

Files

  • A Kirmani, D Venkatraman, D Shin, A Colaço, FNC Wong, JH Shapiro, VK Goyal, "First-Photon Imaging," Science 343 (6166), 58-61.
    Paper
Back to the desktop
https://andreacolaco.info/back-to-the-desktop/ (Sat, 10 Sep 2016)

Abstract

In this project, we construct a virtual desktop centered on the smartphone display, with the surface around the phone opportunistically used for input. We use a three-pixel optical time-of-flight sensor, Mime, to capture hand motion. The sensor on the phone allows the table surface next to it to be mapped to conventional desktop windows, with the phone's display acting as a small viewport onto this desktop. Moving the hand works like moving a mouse, and as the user shifts to another part of the desktop, the phone's viewport follows. We demonstrate that, instead of writing new applications to use smart surfaces, existing applications can be readily controlled with the hands.
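The hand-to-viewport mapping can be sketched as a clamped pan; the gain and screen dimensions below are invented for illustration, not the project's actual values.

```python
def pan_viewport(viewport_xy, hand_delta,
                 desktop=(1920, 1080), view=(360, 640), gain=2.0):
    """Map a hand displacement (from the Mime-style sensor) to viewport motion.

    gain converts sensed hand travel into desktop pixels; the viewport's
    top-left corner is clamped so the view never leaves the virtual desktop.
    """
    x = min(max(viewport_xy[0] + gain * hand_delta[0], 0), desktop[0] - view[0])
    y = min(max(viewport_xy[1] + gain * hand_delta[1], 0), desktop[1] - view[1])
    return (x, y)
```

For example, a hand movement of (100, 50) sensor units from the desktop origin pans the viewport to (200, 100), while a large sweep simply pins the viewport at the desktop edge.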

[Image: back-to-desktop]

Videos

Scrolling:


 

Annotation:

Map Navigation:

Collaborator

Hye Soo Yang

Live Trace
https://andreacolaco.info/live-trace/ (Sat, 10 Sep 2016)

Abstract

In this interactive experience, we were interested in enabling quick input actions on Google Glass. The application allows users to trace an object or region of interest in their live view, and we use that trace as the foundation for indicating interest in a visual region. Once a region is selected, the user can apply filters to it, annotate the selection through speech input, or capture text through optical character recognition. These selection and processing tools integrate naturally with quick note-taking applications, where the limited touchpad would otherwise make such input impractical. The Live Trace app demonstrates the effectiveness of gestural control for head-mounted displays.
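The trace-to-selection step can be sketched as polygon rasterization, using a generic even-odd fill rather than the app's actual implementation: the traced outline is closed into a polygon, and every pixel inside it joins the selection mask that filters or OCR would operate on.

```python
def point_in_polygon(x, y, poly):
    """Even-odd ray-casting test for one point against a closed polygon."""
    inside = False
    n = len(poly)
    for i in range(n):
        x1, y1 = poly[i]
        x2, y2 = poly[(i + 1) % n]
        # Count edge crossings of a horizontal ray extending to the right.
        if (y1 > y) != (y2 > y):
            if x < x1 + (y - y1) * (x2 - x1) / (y2 - y1):
                inside = not inside
    return inside

def trace_mask(trace, width, height):
    """Rasterize a traced outline into a boolean selection mask.

    The trace (a list of (x, y) sample points from the touchpad or gesture
    input) is treated as a closed polygon; pixel centers falling inside it
    are marked True.
    """
    return [[point_in_polygon(x + 0.5, y + 0.5, trace) for x in range(width)]
            for y in range(height)]
```

A real pipeline would also smooth and close a noisy hand trace before filling, but the mask produced here is already enough to restrict a filter or OCR pass to the selected region.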

[Image: livetrace2]

Videos

Collaborator

Hye Soo Yang
