
    Understanding the Optimization Process Pipeline in Linear Programming | by Himalaya Bir Shrestha | Dec, 2024

By Team_AIBS News · December 28, 2024 · 7 Mins Read


Problem Statement

The problem statement is given below. x and y are the two decision variables. The objective is to maximize profit subject to three constraints. Both x and y have lower and upper bounds.

Profit = 90x + 75y
Objective: maximize Profit subject to:
3x + 2y ≤ 66
9x + 4y ≤ 180
2x + 10y ≤ 200

Bounds:
2 ≤ x ≤ 8
10 ≤ y ≤ 40
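Before handing the model to a solver, it can be checked by hand. The helper below is a minimal pure-Python sketch (my own illustration, not part of the article's code) that tests whether a candidate point satisfies all constraints and bounds, and computes its profit:

```python
def profit(x, y):
    """Objective function: Profit = 90x + 75y."""
    return 90 * x + 75 * y

def is_feasible(x, y):
    """Check the three constraints and both variable bounds."""
    return (
        3 * x + 2 * y <= 66
        and 9 * x + 4 * y <= 180
        and 2 * x + 10 * y <= 200
        and 2 <= x <= 8
        and 10 <= y <= 40
    )

# An arbitrary interior point: all constraints hold, profit is 1575
print(is_feasible(5, 15), profit(5, 15))
```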

Optimization using highspy

In the code below, I initiate the model as h. Then I introduce the decision variables x and y together with their lower and upper bounds, and assign their names. Next, I add the three constraint inequalities, which I have named c0, c1 and c2. Each constraint has a coefficient for x and y, and an RHS value. Finally, I maximize 90x + 75y, the objective function. The model is solved on this last line.

import highspy
import numpy as np

# initiate the model
h = highspy.Highs()

# define decision variables
x = h.addVariable(lb=2, ub=8, name="x")
y = h.addVariable(lb=10, ub=40, name="y")

# h.setOptionValue("solver", "ipm")

# define constraints
h.addConstr(3*x + 2*y <= 66)    # c0
h.addConstr(9*x + 4*y <= 180)   # c1
h.addConstr(2*x + 10*y <= 200)  # c2

# objective
h.maximize(90*x + 75*y)

What happens in the backend during the optimization process?

When the model runs, the following progress can be observed in the terminal window. But what exactly is happening here? I describe it below:

Problem size

The constraints of a linear problem can be represented in matrix form as Ax ≤ b, where A is the matrix of constraint coefficients, x is the vector of decision variables, and b is the vector of RHS values. For the given problem, the constraints in matrix format are shown below:

Representing the constraints in matrix form. Illustration by Author.

The problem matrix size is characterized by rows, columns and non-zero elements. Rows refer to the number of constraints (here 3), columns to the number of decision variables (here 2), and elements/non-zeros to the coefficients that are not zero. None of the three constraints has a zero coefficient, hence the total number of non-zero elements is six.

This is an example of a very simple problem. In reality, there can be problems where the number of rows, columns and non-zero elements is in the order of thousands or millions. An increase in problem size increases the complexity of the model and the time taken to solve it.
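For this small problem, the matrix form can be written out explicitly. The sketch below (my own plain-Python illustration, independent of the solver) builds A and b and counts the rows, columns and non-zeros that the HiGHS log reports:

```python
# Constraint matrix A (rows = constraints, columns = decision variables x, y)
A = [
    [3, 2],    # c0: 3x + 2y  <= 66
    [9, 4],    # c1: 9x + 4y  <= 180
    [2, 10],   # c2: 2x + 10y <= 200
]
b = [66, 180, 200]  # RHS vector

rows = len(A)        # number of constraints
cols = len(A[0])     # number of decision variables
nonzeros = sum(coef != 0 for row in A for coef in row)

print(rows, cols, nonzeros)  # 3 2 6
```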

Coefficient ranges
The coefficients of x and y in the constraints range from 2 to 10. Hence, the matrix coefficient range is displayed as [2e+00, 1e+01].

Cost refers to the objective function here. Its coefficient is 90 for x and 75 for y. Accordingly, Cost has a coefficient range of [8e+01, 9e+01].

The bounds on x and y range between 2 and 40. Hence, Bound has a coefficient range of [2e+00, 4e+01].

The RHS coefficients range between 66 and 200. Hence, RHS has a coefficient range of [7e+01, 2e+02].
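These ranges are simply the minimum and maximum absolute coefficients in each group, printed in scientific notation with one significant digit. A small sketch (mine, not the solver's actual logging code) reproduces the four ranges above:

```python
# Coefficient groups of the problem, as HiGHS categorizes them
matrix_coefs = [3, 2, 9, 4, 2, 10]   # constraint matrix entries
cost_coefs = [90, 75]                # objective ("Cost") coefficients
bound_coefs = [2, 8, 10, 40]         # variable bounds
rhs_coefs = [66, 180, 200]           # right-hand sides

# Print each range as [min, max] in one-significant-digit scientific notation
for name, coefs in [("Matrix", matrix_coefs), ("Cost", cost_coefs),
                    ("Bound", bound_coefs), ("RHS", rhs_coefs)]:
    print(f"{name:6s} [{min(coefs):.0e}, {max(coefs):.0e}]")
```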

Presolving
Presolve is the initial step when a solver tries to solve an optimization problem: it first tries to simplify the model. For example, it might treat a coefficient beyond a certain value as infinity. The purpose of presolve is to create a smaller version of the problem matrix, with an identical objective function and with a feasible space that can be mapped to the feasible space of the original problem. The reduced problem matrix is simpler, easier and faster to solve than the original one.

In this case, the presolve step was completed in just two iterations, resulting in an empty matrix. This also means that the solution was already obtained and no further optimization was required. The objective value it returned was 2100, and the run time of the HiGHS solver was just 0.01 seconds. After the solution is obtained from the optimization, the solver can use the postsolve/unpresolve step, in which the solution is mapped back to the feasible space of the original problem.
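The reported optimum can be verified by hand. Working through the vertices of the feasible region, the solution turns out to be x = 8 (at its upper bound) and y = 18.4 (constraint c2 binding); the quick check below, plain Python rather than the solver, confirms feasibility and the objective value of 2100:

```python
x_opt, y_opt = 8, 18.4  # optimum derived by hand: x at its upper bound, c2 binding

# Feasibility: all three constraints and both bound pairs hold
assert 3 * x_opt + 2 * y_opt <= 66
assert 9 * x_opt + 4 * y_opt <= 180
assert abs(2 * x_opt + 10 * y_opt - 200) < 1e-9   # c2 holds with equality
assert 2 <= x_opt <= 8 and 10 <= y_opt <= 40

objective = 90 * x_opt + 75 * y_opt
print(round(objective, 6))  # 2100.0
```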

    Mathematical Programming System (MPS) format

Mathematical Programming System (MPS) is a file format for representing linear and mixed-integer linear programming problems. It is a relatively old format, but it is accepted by all commercial linear programming solvers. Linear problems can also be written in other formats such as LP, AMPL, and GAMS.

One can use highspy to write an MPS file simply with h.writeModel("foo.mps"), and reading the MPS file back is as simple as h.readModel("foo.mps").

MPS format of the given LP problem. Illustration by Author.

The structure of the MPS file for the given optimization problem is shown above. It starts with the NAME of the LP problem. OBJSENSE indicates whether the problem is a minimization (MIN) or maximization (MAX), here the latter. The ROWS section lists the objective, the names of all constraints, and their types in terms of equality/inequality: E stands for equality rows, G for greater-than-or-equal rows, L for less-than-or-equal rows, and N for no-restriction rows. Here, the three constraints are given as __c0, __c1, and __c2, while Obj is the abbreviation for the objective.

In the COLUMNS section, the names of the decision variables (here x and y) are given on the left, and their coefficients in the objective or in the constraint inequalities are provided on the right. The RHS section contains the right-hand-side vectors of the model constraints. The lower and upper bounds of the decision variables are defined in the BOUNDS section. The MPS file closes with ENDATA.
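Putting those sections together, a hand-written sketch of roughly what such a file looks like for this problem is shown below. This is my own reconstruction from the description above, not a copy of the actual file: the RHS and BOUNDS labels (RHS_V, BND) and the exact column spacing are illustrative.

```
NAME
OBJSENSE
    MAX
ROWS
 N  Obj
 L  __c0
 L  __c1
 L  __c2
COLUMNS
    x         Obj            90   __c0            3
    x         __c1            9   __c2            2
    y         Obj            75   __c0            2
    y         __c1            4   __c2           10
RHS
    RHS_V     __c0           66   __c1          180
    RHS_V     __c2          200
BOUNDS
 LO BND       x               2
 UP BND       x               8
 LO BND       y              10
 UP BND       y              40
ENDATA
```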

Optimization Process and Getting Results

HiGHS uses algorithms such as the simplex method or the interior point method for the optimization process. Explaining these algorithms deserves a separate post of its own; I hope to touch upon them in the future.

The code used to extract the results is given below. The model status is optimal. I extract the objective function value and the solution values of the decision variables. Additionally, I print the number of iterations, the status of the primal and dual solutions, and the basis validity.

solution = h.getSolution()
basis = h.getBasis()
info = h.getInfo()

model_status = h.getModelStatus()
print("Model status = ", h.modelStatusToString(model_status))
print()

# get solution objective value, and optimal values for x and y
print("Optimal objective = ", info.objective_function_value)
print("Optimal value of x:", solution.col_value[0])
print("Optimal value of y:", solution.col_value[1])

# get model run characteristics
print("Iteration count = ", info.simplex_iteration_count)
print("Primal solution status = ", h.solutionStatusToString(info.primal_solution_status))
print("Dual solution status = ", h.solutionStatusToString(info.dual_solution_status))
print("Basis validity = ", h.basisValidityToString(info.basis_validity))

Printed results of the code above. Illustration by Author.

Solution Files

After the optimization process, HiGHS allows writing the solution to a solution file with a .sol extension. Further, the solution can be written in different styles, as documented here: 1 stands for the HiGHS pretty format, and 3 stands for the Glpsol pretty format.

Solution file styles available in HiGHS. Illustration based on the HiGHS documentation.

To get the solution in style 3, I used h.writeSolution("mysolution.sol", 3). The problem statistics are provided at the top. The optimal solution values are provided in the Activity column. The St column specifies the status of the solution: for example, B stands for Basic, meaning the variable or constraint is part of the basic (optimal) solution, while NU indicates that the solution is non-basic and sits at its upper bound. The value in the Marginal column (often referred to as the shadow price or dual value) indicates how much the objective function would change with a unit change in the corresponding non-basic variable or constraint. For more information on the GLPK solution file, one can refer here.

Structure of the solution file in Glpsol pretty style. Illustration by Author.
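The marginal value can also be sanity-checked numerically without the solver. The sketch below (my own pure-Python illustration, not from the article) solves the LP by brute force, using the fact that the optimum of a bounded LP lies at a vertex of the feasible polytope, and then perturbs the RHS of c2 by one unit. The objective rises by 7.5 per unit, which is how the shadow price of a binding constraint behaves:

```python
from itertools import combinations

def solve_by_vertex_enumeration(c2_rhs=200.0):
    """Brute-force LP solve: intersect every pair of boundary lines,
    keep the feasible intersection points, and return the best profit."""
    # Boundary lines as (a1, a2, rhs) meaning a1*x + a2*y = rhs
    lines = [
        (3, 2, 66), (9, 4, 180), (2, 10, c2_rhs),  # constraint boundaries
        (1, 0, 2), (1, 0, 8),                      # x bounds
        (0, 1, 10), (0, 1, 40),                    # y bounds
    ]
    best = None
    for (a1, a2, r1), (b1, b2, r2) in combinations(lines, 2):
        det = a1 * b2 - a2 * b1
        if abs(det) < 1e-12:
            continue  # parallel lines: no intersection point
        x = (r1 * b2 - a2 * r2) / det   # Cramer's rule
        y = (a1 * r2 - r1 * b1) / det
        feasible = (3*x + 2*y <= 66 + 1e-9 and 9*x + 4*y <= 180 + 1e-9
                    and 2*x + 10*y <= c2_rhs + 1e-9
                    and 2 - 1e-9 <= x <= 8 + 1e-9
                    and 10 - 1e-9 <= y <= 40 + 1e-9)
        if feasible:
            p = 90 * x + 75 * y
            if best is None or p > best:
                best = p
    return best

base = solve_by_vertex_enumeration(200.0)       # ~2100
shadow = solve_by_vertex_enumeration(201.0) - base
print(round(base, 6), round(shadow, 6))
```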

Conclusion

In this post, I presented an example of solving a simple linear optimization problem using an open-source solver called HiGHS with the highspy package in Python. Next, I explained how the optimization problem size can be inferred from the coefficient matrix, the decision variable vector and the RHS vector. I then introduced and explained the different components of Mathematical Programming System (MPS) files for representing optimization problems. Finally, I demonstrated the optimization process of a solver, the steps for extracting results, and how to analyze the solution file.

The notebook and associated files for this post are available in this GitHub repository. Thanks for reading!


