As media-based social networking services (SNS) such as Instagram and Snapchat have grown in popularity, a growing body of research has analyzed SNS images for emotion analysis and classification model development. However, these prior studies relied on relatively small datasets in which image emotions were labeled from viewers' perspectives rather than posters' perspectives. We instead analyze 120K images labeled with the posters' own emotions. We develop color- and content-based classification models that account for: (1) the dynamics of SNS, in terms of the volume and variety of images shared, and (2) the fact that people express their emotions through colors and objects. We demonstrate that our models perform comparably to those proposed in prior studies and discuss potential applications.