Saliency Prediction for Mobile User Interfaces

Research Intern, Adobe Bigdata Experience Lab (BEL), 2017

Designed a system that predicts the saliency of user-interface elements in smartphone applications, helping UI designers run quick A/B tests of their designs.

  • Implemented a pipeline for collecting eye-gaze data from app users via the smartphone's front camera, and deployed it on Amazon Mechanical Turk.
  • Trained neural networks to predict element-wise saliency values for smartphone UI images, using features extracted from both the screenshot and the underlying XML layout code (a minimal model sketch is shown after this list).
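
To illustrate the element-wise prediction setup, here is a minimal sketch of a two-branch scoring model; it is not the internship's actual architecture. It combines a CNN embedding of the element's image crop with features parsed from the view-hierarchy XML. All layer sizes, feature choices, and names (e.g. `ElementSaliencyNet`) are illustrative assumptions.

```python
# Hypothetical sketch (not the internship code): scores one UI element at a time.
# The image branch embeds a crop of the element from the screenshot; the structural
# branch embeds features parsed from the XML view hierarchy (element type,
# normalized bounding box, depth, etc.). Dimensions and features are assumptions.
import torch
import torch.nn as nn


class ElementSaliencyNet(nn.Module):
    def __init__(self, num_element_types: int = 20, xml_feat_dim: int = 8):
        super().__init__()
        # Image branch: a small CNN over a fixed-size crop of the element.
        self.cnn = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),          # -> (B, 32, 1, 1)
        )
        # Structural branch: element-type embedding + numeric XML features.
        self.type_emb = nn.Embedding(num_element_types, 16)
        self.head = nn.Sequential(
            nn.Linear(32 + 16 + xml_feat_dim, 64), nn.ReLU(),
            nn.Linear(64, 1),                 # scalar saliency score
        )

    def forward(self, crop, elem_type, xml_feats):
        img = self.cnn(crop).flatten(1)       # (B, 32)
        typ = self.type_emb(elem_type)        # (B, 16)
        x = torch.cat([img, typ, xml_feats], dim=1)
        return torch.sigmoid(self.head(x)).squeeze(1)  # saliency in [0, 1]


# Toy forward pass: a batch of 4 element crops with their XML-derived features.
model = ElementSaliencyNet()
crops = torch.randn(4, 3, 64, 64)             # RGB crops of UI elements
types = torch.randint(0, 20, (4,))            # e.g. Button, TextView, ...
feats = torch.rand(4, 8)                      # e.g. normalized bbox, tree depth
print(model(crops, types, feats).shape)       # torch.Size([4])
```

Fusing the visual crop with structural XML features lets the model use cues such as element type and position that are hard to recover from pixels alone.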

Paper