With the widespread use of mobile devices, it has become increasingly important to analyze distributed data collected from many devices. Federated learning is a distributed learning framework that leverages the training data and computational resources of scattered mobile devices to learn prediction models, while multi-task learning infers personalized yet related models across devices. Recent work has integrated federated and multi-task learning, but such approaches can be impractical and inefficient in the online scenario, e.g., when new mobile devices keep joining the mobile computing system. To address this challenge, we propose OFMTL, an online federated multi-task learning algorithm that learns model parameters for a newly joined device without revisiting the data of existing devices. The parameters are derived by effectively combining information inferred from the new device's local data with information borrowed from existing models. Through extensive experiments on three real datasets, we show that OFMTL achieves accuracy comparable to that of existing algorithms at much lower computation, transmission, and storage costs.
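To make the high-level idea concrete, the following is a minimal sketch (not the paper's actual OFMTL algorithm) of one way a new device could combine information from its local data with information borrowed from existing models: fit a ridge-style local model whose regularizer pulls the solution toward the average of the existing devices' models. The function name, loss, and regularizer here are illustrative assumptions.

```python
import numpy as np

def learn_new_device(X, y, existing_models, lam=1.0):
    """Hypothetical sketch: learn a new device's model from local data (X, y),
    regularized toward the mean of existing device models.

    Minimizes ||X w - y||^2 + lam * ||w - w_ref||^2, where w_ref is the
    average of existing models ("information borrowed from existing models").
    """
    w_ref = np.mean(existing_models, axis=0)  # borrowed information
    d = X.shape[1]
    # Normal equations: (X^T X + lam I) w = X^T y + lam * w_ref
    A = X.T @ X + lam * np.eye(d)
    b = X.T @ y + lam * w_ref
    return np.linalg.solve(A, b)

# Toy usage: two existing device models, one new device with local data.
rng = np.random.default_rng(0)
existing = [np.array([1.0, -1.0]), np.array([0.8, -1.2])]
X = rng.normal(size=(20, 2))
y = X @ np.array([0.9, -1.1]) + 0.05 * rng.normal(size=20)
w_new = learn_new_device(X, y, existing, lam=0.5)
```

Note that the new device's parameters are computed without touching any existing device's raw data, only their model parameters, which matches the online setting described above; the trade-off between local fit and borrowed knowledge is controlled by `lam`.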