Mid-air gestures have become an important interaction technique in natural user interfaces, especially in augmented and virtual reality. However, supporting a sequence of continuous gesture-based commands, such as selecting, moving, and then placing an object, remains a challenge in mid-air gesture interaction systems. This is largely because intentional command gestures are connected by transitional, meaningless movements, which often mislead gesture recognition systems. The inability to separate these unintentional movements from intentional command gestures, also known as the Midas touch problem, limits the application of mid-air gestures. This paper addresses the Midas touch problem through a physiological computing approach. Using sensors that capture physiological signals, we present Pactolus, a novel method for segmenting mid-air gestures based on arm electromyography. User studies demonstrate that our approach achieves high accuracy in segmenting mid-air gestures interleaved with transitional hand or finger movements.