When comparing software programs on certain qualities, there is usually more than one metric that can be used. These metrics often contradict one another, and there may be no standard acceptance thresholds. In this work we demonstrate how the Analytic Hierarchy Process (AHP) can be used to mitigate these deficiencies in metrics-based software decision making. We illustrate the procedure by incorporating value judgments from a group of experts into an existing metrics data set to rank the design complexity of three imaging software packages. In this case, injecting expert opinion within a formalized framework minimizes the problems associated with conflicting metrics. The contribution of this work is to demonstrate how a combination of expert opinion and tool-collected measures can be used to reason about software programs. The methodology can be easily modified to include different metrics, applications, and weights, thus providing a practical assessment tool for decision making about software.
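The core of the AHP step described above is deriving priority weights from a pairwise-comparison matrix of expert judgments. The sketch below, a minimal illustration and not the paper's implementation, uses the common geometric-mean approximation to the principal-eigenvector method; the `pairwise` matrix values are hypothetical judgments comparing three packages on a single complexity criterion.

```python
import math

def ahp_priorities(matrix):
    """Approximate AHP priority weights from a pairwise-comparison
    matrix using the geometric-mean (row geometric mean) method."""
    n = len(matrix)
    # Geometric mean of each row captures that row's relative dominance.
    geo_means = [math.prod(row) ** (1.0 / n) for row in matrix]
    total = sum(geo_means)
    # Normalize so the weights sum to 1.
    return [g / total for g in geo_means]

# Hypothetical judgments on Saaty's 1-9 scale: package A is moderately
# preferred (3) over B and strongly preferred (5) over C, and so on.
# Reciprocals below the diagonal keep the matrix consistent in form.
pairwise = [
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 3.0],
    [1/5, 1/3, 1.0],
]

weights = ahp_priorities(pairwise)
print([round(w, 3) for w in weights])
```

In a full application, weights like these for each criterion would be combined with criterion-level weights to produce an overall ranking of the alternatives.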