Wenyuan Yin, Tao Mei, and Chang Wen Chen
Automatic photo quality assessment has emerged as a hot topic in recent years due to its potential in numerous applications. Most existing approaches to photo quality assessment have predominantly focused on image content itself, while ignoring various contexts such as the associated geo-location and timestamp. However, such a universal aesthetic assessment model may not work well across significantly different contexts, since photography rules are always scene and context dependent. In practice, professional photographers apply different photography knowledge when shooting various scenes in different places. Motivated by this observation, we leverage the geo-context information associated with photos for visual quality assessment. Specifically, we propose in this paper a Scene-Dependent Aesthetic Model (SDAM) to assess photo quality by jointly leveraging geo-context and visual content. A geo-context-leveraged search is performed to obtain relevant images with similar content, from which scene-dependent photography principles are discovered for accurate photo quality assessment. To overcome the problem that in many cases the number of contextually retrieved images is insufficient for learning the SDAM, we adopt transfer learning to exploit auxiliary photos of the same scene category from other locations when learning photography rules. Extensive experiments show that the proposed SDAM scheme indeed improves photo quality assessment accuracy by leveraging photo geo-contexts, compared with traditional universal aesthetic models.
In Proceedings of VCIP (Best Student Paper Award)