TY - JOUR

T1 - An interior boundary pivotal solution algorithm for linear programmes with the optimal solution-based sensitivity region

AU - Arsham, Hossein

AU - Gunasekaran, Angappa

PY - 2013

Y1 - 2013

N2 - We have developed a full gradient method that consists of three phases. The initialisation phase provides the initial tableau, which may not have a full set of basic variables. The push phase uses the full gradient vector of the objective function to obtain a feasible vertex. This is followed by a series of pivotal steps using the sub-gradient, which leads to an optimal solution (if one exists) in the final iteration phase. At each of these iterations, the sub-gradient provides the desired direction of motion within the feasible region. The algorithm hits and/or moves on the constraint hyperplanes and their intersections to reach an optimal vertex (if one exists). The algorithm works in the space of the original decision variables and slack/surplus variables; therefore, there is no need to introduce any extra variables such as artificial variables. The simplex solution algorithm can be considered as a sub-algorithm of the proposed method. Given a linear programme with a known unique non-degenerate primal/dual solution, we develop the largest sensitivity region for linear programming models based only on the optimal solution rather than the final tableau. It allows for simultaneous, dependent/independent changes in the cost coefficients and the right-hand side of the constraints. Numerical illustrative examples are given.

AB - We have developed a full gradient method that consists of three phases. The initialisation phase provides the initial tableau, which may not have a full set of basic variables. The push phase uses the full gradient vector of the objective function to obtain a feasible vertex. This is followed by a series of pivotal steps using the sub-gradient, which leads to an optimal solution (if one exists) in the final iteration phase. At each of these iterations, the sub-gradient provides the desired direction of motion within the feasible region. The algorithm hits and/or moves on the constraint hyperplanes and their intersections to reach an optimal vertex (if one exists). The algorithm works in the space of the original decision variables and slack/surplus variables; therefore, there is no need to introduce any extra variables such as artificial variables. The simplex solution algorithm can be considered as a sub-algorithm of the proposed method. Given a linear programme with a known unique non-degenerate primal/dual solution, we develop the largest sensitivity region for linear programming models based only on the optimal solution rather than the final tableau. It allows for simultaneous, dependent/independent changes in the cost coefficients and the right-hand side of the constraints. Numerical illustrative examples are given.

UR - http://www.scopus.com/inward/record.url?scp=84891929972&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=84891929972&partnerID=8YFLogxK

U2 - 10.1504/IJMOR.2013.057489

DO - 10.1504/IJMOR.2013.057489

M3 - Article

AN - SCOPUS:84891929972

VL - 5

SP - 663

EP - 692

JO - International Journal of Mathematics in Operational Research

JF - International Journal of Mathematics in Operational Research

SN - 1757-5850

IS - 6

ER -