meteor studio
mobile experiential technology through embedded optimization research

School of Arts, Media & Engineering | School of Electrical, Computer & Energy Engineering, Arizona State University

Directed by Dr. Robert LiKamWa, Meteor Studio is an engineering research group focused on software and hardware systems for sensing and actuation on mobile devices, targeting computer vision, augmented reality, and other current and future use cases. Although devices already incorporate diverse sensors, efficiently sensing, processing, and acting on their data remains challenging. Drawing on Operating Systems, Computer Architecture, and Machine Learning research, we create low-level system designs to address these challenges.

Meteor Studio sits across the School of Arts, Media & Engineering and the School of Electrical, Computer & Energy Engineering at Arizona State University. Our lab is in the Stauffer building on ASU's Tempe campus.

We're always recruiting talented students for M.S./Ph.D. study in Electrical Engineering, Computer Engineering, and Media Arts and Sciences. Send an email to likamwa (at) asu.edu with your CV if you're interested in joining the lab!

News

2017-10: Samsung Mobile Processor Innovation Laboratory gift awarded.

2017-08: HoloLens + Android + Max/MSP multi-platform data visualization framework, led by undergrad Alexander Shearer, accepted to Immersive Analytics @ IEEE VIS '17.

2017-02: Best Poster Award at HotMobile '17 for early work on temperature-driven task migration to balance energy efficiency and thermal noise of sensor processing workloads, presented by Venkatesh Kodukula.

2017-02: NSF CISE: CRII: CSR grant awarded.

2017 ASU Digital Culture Speaker Series Talk: Realizing Augmented Reality Augmentations [Vimeo Link]

2016 ASU Digital Culture Speaker Series Talk: Sensing our way into a future of ubiquitous computing systems [Vimeo Link]

Projects

Characterizing Bottlenecks towards a Hybrid Integration of Holographic, Mobile, and Screen-based Data Visualization

Alexander Shearer, Lei Guo, Junshu Liu, Ashley Megumi Satkowski, Robert LiKamWa
Immersive Analytics @ IEEE VIS 2017

Holostage is a prototype framework to investigate the hybrid integration of head-mounted and handheld mixed-reality devices with immersive screen-based environments. Our platform integrates three devices:

  • A Microsoft HoloLens head-mounted mixed-reality device, using depth cameras and other sensors to position virtual objects in a real environment. The HoloLens employs an Intel Atom processor and a specialized holographic processing unit to render visualizations.
  • An NVIDIA Shield Tablet K1, running Android 7.0 on a Tegra K1 System-on-Chip. The Shield Tablet features powerful graphics performance on top of a mobile sensor package, including an inertial measurement unit, a front-facing camera, and a rear-facing camera.
  • The iStage (Intelligence Stage) at Arizona State University, consisting of motion capture, controllable lighting, and immersive projection covering a 10m x 10m floor and a 10m x 8m screen. Data management and projection run on Max 7 software on Mac Pro computers.

We develop software to render particle visualizations for each device, using the Unity Game Engine for the HoloLens and C++ bindings of the Vulkan Graphics API for the NVIDIA Shield Tablet. Our software uses the PTC Vuforia SDK to geometrically register the devices, creating a uniform coordinate system for the virtual environments. In addition to providing the ability to visualize scientific and creative data, our multi-device platform serves as a testbed to explore system limitations.
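
As an illustration of the registration step, here is a minimal Python sketch (not the Holostage codebase; the poses and numbers are hypothetical) of how a shared fiducial marker lets two devices express points in a common coordinate frame:

    # Minimal sketch: registering two devices to a shared frame via a common
    # marker, using 4x4 rigid-body poses (values here are made up).
    import numpy as np

    def pose(rotation, translation):
        """Build a 4x4 rigid transform from a 3x3 rotation and a 3-vector."""
        T = np.eye(4)
        T[:3, :3] = rotation
        T[:3, 3] = translation
        return T

    # Hypothetical marker poses as each device's tracker might report them:
    # marker_in_device maps marker coordinates into that device's local frame.
    marker_in_hololens = pose(np.eye(3), np.array([0.0, -0.2, 1.5]))
    marker_in_tablet = pose(np.eye(3), np.array([0.1, 0.0, 0.8]))

    def device_to_device(point_local, marker_in_src, marker_in_dst):
        """Re-express a point from the source device's frame in the destination's."""
        p = np.append(point_local, 1.0)              # homogeneous coordinates
        p_marker = np.linalg.inv(marker_in_src) @ p  # source frame -> marker frame
        return (marker_in_dst @ p_marker)[:3]        # marker frame -> destination frame

    # A particle 1 m in front of the HoloLens, expressed in tablet coordinates.
    print(device_to_device(np.array([0.0, 0.0, 1.0]), marker_in_hololens, marker_in_tablet))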


[Paper PDF]
[Framework repository]

RedEye: Analog ConvNet Image Sensor Architecture for Continuous Mobile Vision

Robert LiKamWa, Yunhui Hou, Julian Gao, Mia Polansky, Lin Zhong
ACM/IEEE International Symposium on Computer Architecture (ISCA) 2016
Seoul, Korea

The RedEye vision sensor architecture extracts ConvNet features in the analog domain to reduce the overhead of analog-to-digital sensor readout. The architecture promotes focal-plane scalability by localizing design complexity and promotes energy efficiency by admitting analog noise.
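
The intuition can be sketched in a few lines of Python (a toy model, not the RedEye circuit or its simulation framework): computing a convolution layer before readout means far fewer values must cross the analog-digital boundary, at the cost of some injected analog noise.

    # Toy model (not the RedEye design): convolve before readout, inject noise
    # to stand in for analog imperfection, and count values that get digitized.
    import numpy as np

    def conv2d_valid(image, kernel):
        """Naive valid-mode 2D convolution."""
        kh, kw = kernel.shape
        out = np.empty((image.shape[0] - kh + 1, image.shape[1] - kw + 1))
        for i in range(out.shape[0]):
            for j in range(out.shape[1]):
                out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
        return out

    rng = np.random.default_rng(0)
    image = rng.random((64, 64))          # raw pixel values on the focal plane
    kernel = rng.standard_normal((5, 5))  # one ConvNet filter

    features = conv2d_valid(image, kernel)
    noisy = features + rng.normal(0.0, 0.05 * features.std(), features.shape)
    strided = noisy[::4, ::4]             # strided feature-map readout

    print("values digitized with raw readout:    ", image.size)    # 4096
    print("values digitized with in-sensor conv: ", strided.size)  # 225
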
[Paper PDF]
[Talk Slides]
[Simulation framework repository]

Starfish: Efficient Concurrency Support for Computer Vision

Robert LiKamWa, Lin Zhong
ACM Conf. on Mobile Systems, Applications, and Services (MobiSys) 2015
Florence, Italy

Starfish retrofits vision libraries for split-process execution. This allows multiple background applications to transparently share computation results through library function caching.
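
To illustrate the sharing idea, here is a toy Python sketch (not the Starfish runtime, which operates transparently at the vision-library boundary across processes): caching an expensive library call on its input lets concurrent callers reuse a single result.

    # Toy sketch (not the Starfish runtime): cache an expensive vision-library
    # call on its input so concurrent callers share a single computed result.
    import hashlib
    import numpy as np

    _cache = {}  # (function name, frame digest) -> cached result

    def cached_call(name, fn, frame):
        key = (name, hashlib.sha1(frame.tobytes()).hexdigest())
        if key not in _cache:
            _cache[key] = fn(frame)   # compute once ...
        return _cache[key]            # ... reuse for every later caller

    def extract_edges(frame):
        """Stand-in for an expensive library function (crude gradient threshold)."""
        gx = np.abs(np.diff(frame.astype(float), axis=1))
        return gx > gx.mean()

    frame = np.random.default_rng(1).integers(0, 256, (480, 640), dtype=np.uint8)
    result_a = cached_call("extract_edges", extract_edges, frame)  # computes
    result_b = cached_call("extract_edges", extract_edges, frame)  # cache hit
    print(result_a is result_b)  # True: both "applications" share one result
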
[Paper PDF]
[Conference Talk]
[Conference Slides PDF]

Energy Characterization and Optimization of Image Sensing Toward Continuous Mobile Vision

Robert LiKamWa, Bodhi Priyantha, Matthai Philipose, Lin Zhong, Paramvir (Victor) Bahl
ACM Conf. on Mobile Systems, Applications, and Services (MobiSys) 2013
Taipei, Taiwan
Best Paper Award

We characterize the energy consumption of image sensors to reveal opportunities for energy proportionality, trading off image quality for power consumption. We describe simple techniques to enable such proportionality.
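
The flavor of the tradeoff can be sketched as follows (the power numbers below are made up for illustration, not measurements from the paper):

    # Illustrative sketch with made-up power numbers: pick the highest-quality
    # sensor configuration that still fits a given power budget.
    configs = [
        # (label, width, height, fps, hypothetical power in mW)
        ("full-res / 30 fps",    1920, 1080, 30, 330.0),
        ("full-res / 10 fps",    1920, 1080, 10, 150.0),
        ("quarter-res / 30 fps",  960,  540, 30, 120.0),
        ("quarter-res / 10 fps",  960,  540, 10,  60.0),
    ]

    def pick_config(power_budget_mw):
        """Return the feasible configuration delivering the most pixels per second."""
        feasible = [c for c in configs if c[4] <= power_budget_mw]
        if not feasible:
            return None
        return max(feasible, key=lambda c: c[1] * c[2] * c[3])

    print(pick_config(200.0))  # -> full-res / 10 fps under a 200 mW budget
    print(pick_config(100.0))  # -> quarter-res / 10 fps under a 100 mW budget
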
[Paper PDF]
[Conference Talk + 5 Minute Rundown]
[Conference Slides PDF]
[MIT Tech Review Coverage]
[Patent Application]

Building a Mood Sensor from Smartphone Usage Patterns

Robert LiKamWa, Yunxin Liu, Nicholas D. Lane, Lin Zhong
ACM Conf. on Mobile Systems, Applications, and Services (MobiSys) 2013
Taipei, Taiwan

The MoodSense/Scope project studies the use of supervised machine learning to mine inferences from smartphone usage patterns. We analyze text, call, email, location, app usage, and website browsing patterns against mood activeness and valence.
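
A toy version of this learning setup (synthetic data, not the MoodScope dataset or its model) might look like:

    # Toy sketch with synthetic data: regress a mood-valence score from daily
    # smartphone-usage features with ordinary least squares.
    import numpy as np

    rng = np.random.default_rng(2)
    # Hypothetical daily features: [texts sent, calls made, emails, distinct apps used]
    X = rng.poisson([20, 3, 10, 12], size=(60, 4)).astype(float)
    true_w = np.array([0.02, 0.10, -0.01, 0.03])
    y = X @ true_w + rng.normal(0.0, 0.1, 60)  # synthetic valence labels

    # Train on the first 40 days, evaluate on the remaining 20.
    w, *_ = np.linalg.lstsq(X[:40], y[:40], rcond=None)
    pred = X[40:] @ w
    print("mean absolute error on held-out days:", np.abs(pred - y[40:]).mean())
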
[Paper PDF]
[Conference Slides PDF]
[Conference Talk]
[Jimmy Kimmel Late Night Comedy Sketch]
[Communications of the ACM Coverage]

Reflex: Using Low-Power Processors in Smartphones Without Knowing Them

Felix Xiaozhu Lin, Zhen Wang, Robert LiKamWa, Lin Zhong
ACM Conf. on Architectural Support for Programming Languages and Operating Systems (ASPLOS) 2012
London, England

Reflex is a suite of compiler and runtime support tools for energy-efficient smartphone sensing on heterogeneous architectures. Reflex not only manages the deployment and execution of code across heterogeneous resources, but also creates a software shared memory among the distributed code.
[Paper PDF]
[Xiaozhu Lin's Project Site]

Other Work

(Invited Talk) Rethinking the Imaging Pipeline for Privacy-Preserving Energy-Efficient Continuous Mobile Vision

Society for Information Display (SID) Display Week 2015, San Jose, California

(Workshop Paper) Draining our Glass: An Energy and Heat Characterization of Google Glass

APSys: Asia-Pacific Workshop on Systems 2014, Beijing, China

We tear down Google Glass and analyze its power consumption to motivate and inspire system-efficiency research directions.
[Paper PDF]
[Conference Talk]

(Workshop Paper) Styrofoam: A Tightly Packed Coding Scheme for Camera-based Visible Light Communication

Visible Light Communication Systems Workshop @ MobiCom 2014, Maui, Hawaii

[Paper PDF]
[Authorship Note]

Styrofoam was authored by three Rice University Ph.D. students without their advisors, which may seem unusual to an outside observer.

Jason Holloway, David Ramirez, and I (Robert) regularly take short coffee breaks in the afternoons to clear our minds. On one such coffee break, having read one too many inspirational stories about the early days of Xerox PARC and Bell Labs, we embarked on a project that was driven more out of pure amusement and curiosity than usual. Perhaps we wanted to prove to ourselves that we could do research without external pressure. Perhaps we wanted to author a paper together while we were in the same place. Perhaps we just wanted to go to this workshop in Hawaii. I don't really remember. But in any case, we puzzled over ideas on napkins and whiteboards, and attacked a fundamental impediment of screen-camera visible light communication links: inter-symbol interference. The result was this work, Styrofoam.

We submitted without telling our advisors; if it got rejected, we were going to keep silent. However, upon acceptance, our advisors and department were very supportive of our endeavor and funded our travel, supplemented by ACM travel grants. I presented Styrofoam at the Visible Light Communication Systems workshop, while David presented the work at the ACM Student Research Competition at the main MobiCom conference, where he earned 4th place. We had a great time chatting with researchers about Styrofoam and its positioning in the Screen-Camera Link research world.

(Ph.D. Forum Talk) Efficient Image Processing for Continuous Mobile Vision

MobiSys 2014 Ph.D. Forum, Bretton Woods, New Hampshire
Best Presentation Award

(Workshop Paper) MoodSense: Can Your Smartphone Infer Your Mood?

PhoneSense: Workshop on Sensing Applications on Mobile Phones @ SenSys 2011, Seattle, Washington
Best Paper Award

(Demo Paper) SUAVE: Sensor-based User Aware Viewing Enhancement

UIST: ACM Symposium on User Interface Software and Technology 2011, Santa Barbara, California

Our Sensor-based User-Aware Viewing Enhancement package accounts for the ambient light and viewing-angle effects that typically impair the quality of mobile display content in many scenarios. SUAVE uses brightness and contrast enhancements to improve mobile displays in these contextual situations, making mobile content easier to see, read, and use.
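
For illustration only (this mapping is invented, not the SUAVE algorithm), a context-driven adjustment of this kind could look like:

    # Illustrative sketch (not the SUAVE algorithm): derive brightness and
    # contrast gains from sensed ambient light and viewing angle.
    import math

    def viewing_adjustment(ambient_lux, viewing_angle_deg):
        """Return (brightness_gain, contrast_gain) from sensed context."""
        # Brighter surroundings wash out the screen, so raise brightness with lux.
        brightness = 1.0 + min(ambient_lux / 10000.0, 1.0)
        # Off-axis viewing reduces perceived contrast, so compensate with angle.
        angle = math.radians(min(viewing_angle_deg, 60.0))
        contrast = 1.0 + (1.0 - math.cos(angle))
        return brightness, contrast

    print(viewing_adjustment(ambient_lux=5000.0, viewing_angle_deg=30.0))
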
[Paper PDF]

Personnel

Lab Director

  • Robert LiKamWa

Ph.D. Student

  • Jinhan Hu

M.S. Thesis Students

  • Venkatesh Kodukula
  • Siddhant Prakash
  • Sridhar Gunnam

M.S. Students

  • Vasudha Viswamurthy
  • Vraj Delhivala

Undergraduate Students

  • Ashley Megumi Satkowski, Computer Science, minor in Digital Culture
  • Alex Shearer, Computer Science
  • Aashiq Shaikh, Computer Science
  • Geoffrey Wong, Computer Science
  • Britton Jones, Electrical Engineering

Courses

Spring 2018: AME 394: Designing and Implementing Mixed Reality Experiences

Fall 2017: EEE 598: Mobile Systems Architecture

Spring 2017: AME 112: Computational Thinking

Fall 2016: EEE 598: Mobile Systems Architecture

Support

Meteor Studio is grateful for support from the National Science Foundation and the Samsung Mobile Processor Innovation Lab. We also thank Microsemi and NVIDIA for graciously donating devices and software licenses.