Initial commit

dev
viulong-kong 2015-09-25 17:00:41 +02:00
commit 99cec7ef13
12 changed files with 1846 additions and 0 deletions

202
LICENSE.txt 100644
View File

@@ -0,0 +1,202 @@
Apache License
Version 2.0, January 2004
http://www.apache.org/licenses/
TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
1. Definitions.
"License" shall mean the terms and conditions for use, reproduction,
and distribution as defined by Sections 1 through 9 of this document.
"Licensor" shall mean the copyright owner or entity authorized by
the copyright owner that is granting the License.
"Legal Entity" shall mean the union of the acting entity and all
other entities that control, are controlled by, or are under common
control with that entity. For the purposes of this definition,
"control" means (i) the power, direct or indirect, to cause the
direction or management of such entity, whether by contract or
otherwise, or (ii) ownership of fifty percent (50%) or more of the
outstanding shares, or (iii) beneficial ownership of such entity.
"You" (or "Your") shall mean an individual or Legal Entity
exercising permissions granted by this License.
"Source" form shall mean the preferred form for making modifications,
including but not limited to software source code, documentation
source, and configuration files.
"Object" form shall mean any form resulting from mechanical
transformation or translation of a Source form, including but
not limited to compiled object code, generated documentation,
and conversions to other media types.
"Work" shall mean the work of authorship, whether in Source or
Object form, made available under the License, as indicated by a
copyright notice that is included in or attached to the work
(an example is provided in the Appendix below).
"Derivative Works" shall mean any work, whether in Source or Object
form, that is based on (or derived from) the Work and for which the
editorial revisions, annotations, elaborations, or other modifications
represent, as a whole, an original work of authorship. For the purposes
of this License, Derivative Works shall not include works that remain
separable from, or merely link (or bind by name) to the interfaces of,
the Work and Derivative Works thereof.
"Contribution" shall mean any work of authorship, including
the original version of the Work and any modifications or additions
to that Work or Derivative Works thereof, that is intentionally
submitted to Licensor for inclusion in the Work by the copyright owner
or by an individual or Legal Entity authorized to submit on behalf of
the copyright owner. For the purposes of this definition, "submitted"
means any form of electronic, verbal, or written communication sent
to the Licensor or its representatives, including but not limited to
communication on electronic mailing lists, source code control systems,
and issue tracking systems that are managed by, or on behalf of, the
Licensor for the purpose of discussing and improving the Work, but
excluding communication that is conspicuously marked or otherwise
designated in writing by the copyright owner as "Not a Contribution."
"Contributor" shall mean Licensor and any individual or Legal Entity
on behalf of whom a Contribution has been received by Licensor and
subsequently incorporated within the Work.
2. Grant of Copyright License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
copyright license to reproduce, prepare Derivative Works of,
publicly display, publicly perform, sublicense, and distribute the
Work and such Derivative Works in Source or Object form.
3. Grant of Patent License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
(except as stated in this section) patent license to make, have made,
use, offer to sell, sell, import, and otherwise transfer the Work,
where such license applies only to those patent claims licensable
by such Contributor that are necessarily infringed by their
Contribution(s) alone or by combination of their Contribution(s)
with the Work to which such Contribution(s) was submitted. If You
institute patent litigation against any entity (including a
cross-claim or counterclaim in a lawsuit) alleging that the Work
or a Contribution incorporated within the Work constitutes direct
or contributory patent infringement, then any patent licenses
granted to You under this License for that Work shall terminate
as of the date such litigation is filed.
4. Redistribution. You may reproduce and distribute copies of the
Work or Derivative Works thereof in any medium, with or without
modifications, and in Source or Object form, provided that You
meet the following conditions:
(a) You must give any other recipients of the Work or
Derivative Works a copy of this License; and
(b) You must cause any modified files to carry prominent notices
stating that You changed the files; and
(c) You must retain, in the Source form of any Derivative Works
that You distribute, all copyright, patent, trademark, and
attribution notices from the Source form of the Work,
excluding those notices that do not pertain to any part of
the Derivative Works; and
(d) If the Work includes a "NOTICE" text file as part of its
distribution, then any Derivative Works that You distribute must
include a readable copy of the attribution notices contained
within such NOTICE file, excluding those notices that do not
pertain to any part of the Derivative Works, in at least one
of the following places: within a NOTICE text file distributed
as part of the Derivative Works; within the Source form or
documentation, if provided along with the Derivative Works; or,
within a display generated by the Derivative Works, if and
wherever such third-party notices normally appear. The contents
of the NOTICE file are for informational purposes only and
do not modify the License. You may add Your own attribution
notices within Derivative Works that You distribute, alongside
or as an addendum to the NOTICE text from the Work, provided
that such additional attribution notices cannot be construed
as modifying the License.
You may add Your own copyright statement to Your modifications and
may provide additional or different license terms and conditions
for use, reproduction, or distribution of Your modifications, or
for any such Derivative Works as a whole, provided Your use,
reproduction, and distribution of the Work otherwise complies with
the conditions stated in this License.
5. Submission of Contributions. Unless You explicitly state otherwise,
any Contribution intentionally submitted for inclusion in the Work
by You to the Licensor shall be under the terms and conditions of
this License, without any additional terms or conditions.
Notwithstanding the above, nothing herein shall supersede or modify
the terms of any separate license agreement you may have executed
with Licensor regarding such Contributions.
6. Trademarks. This License does not grant permission to use the trade
names, trademarks, service marks, or product names of the Licensor,
except as required for reasonable and customary use in describing the
origin of the Work and reproducing the content of the NOTICE file.
7. Disclaimer of Warranty. Unless required by applicable law or
agreed to in writing, Licensor provides the Work (and each
Contributor provides its Contributions) on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
implied, including, without limitation, any warranties or conditions
of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
PARTICULAR PURPOSE. You are solely responsible for determining the
appropriateness of using or redistributing the Work and assume any
risks associated with Your exercise of permissions under this License.
8. Limitation of Liability. In no event and under no legal theory,
whether in tort (including negligence), contract, or otherwise,
unless required by applicable law (such as deliberate and grossly
negligent acts) or agreed to in writing, shall any Contributor be
liable to You for damages, including any direct, indirect, special,
incidental, or consequential damages of any character arising as a
result of this License or out of the use or inability to use the
Work (including but not limited to damages for loss of goodwill,
work stoppage, computer failure or malfunction, or any and all
other commercial damages or losses), even if such Contributor
has been advised of the possibility of such damages.
9. Accepting Warranty or Additional Liability. While redistributing
the Work or Derivative Works thereof, You may choose to offer,
and charge a fee for, acceptance of support, warranty, indemnity,
or other liability obligations and/or rights consistent with this
License. However, in accepting such obligations, You may act only
on Your own behalf and on Your sole responsibility, not on behalf
of any other Contributor, and only if You agree to indemnify,
defend, and hold each Contributor harmless for any liability
incurred by, or claims asserted against, such Contributor by reason
of your accepting any such warranty or additional liability.
END OF TERMS AND CONDITIONS
APPENDIX: How to apply the Apache License to your work.
To apply the Apache License to your work, attach the following
boilerplate notice, with the fields enclosed by brackets "[]"
replaced with your own identifying information. (Don't include
the brackets!) The text should be enclosed in the appropriate
comment syntax for the file format. We also recommend that a
file or class name and description of purpose be included on the
same "printed page" as the copyright notice for easier
identification within third-party archives.
Copyright [2015] [IBM Corporation]
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.

108
README.md 100644
View File

@@ -0,0 +1,108 @@
# IBM Decision Optimization CPLEX Modeling for Python (DOcplex) Samples - Technology Preview
Welcome to IBM Decision Optimization CPLEX Modeling for Python.
Licensed under the Apache License, Version 2.0.
These are the IBM Decision Optimization CPLEX Modeling for Python samples.
Solving with CPLEX Optimizers locally requires that IBM ILOG CPLEX Optimization Studio V12.6.2 is installed on your machine.
Solving with the IBM Decision Optimization on Cloud service requires that you register for an account and get the API key.
## Requirements
This library requires Python version 2.7.9 (or later), or 3.4 (or later).
* One of the following options:
* An **IBM Decision Optimization on Cloud** Service account and API key. You can
register for a 30-day free trial or buy a subscription
[here](https://developer.ibm.com/docloud/try-docloud-free).
* **IBM ILOG CPLEX Optimization Studio V12.6.2** Development or Deployment edition for solving with no engine limit or
the Community Edition with engine limits. You can download the Community Edition
[here](http://www-01.ibm.com/software/websphere/products/optimization/cplex-studio-community-edition).
## Install the library
```
pip install docplex
```
## Get the source code and examples
* [Documentation](https://github.com/IBMDecisionOptimization/docplex-doc)
* [Source Code](https://github.com/IBMDecisionOptimization/docplex)
## Using the IBM Decision Optimization on Cloud service
1. Register for a trial account.
Register for the DOcloud free trial and use it free for 30 days. See
[Free trial](https://developer.ibm.com/docloud/try-docloud-free).
2. Get your API key.
With your free trial, you can generate a key to access the DOcloud API.
Go to the
[Get API key & base URL](http://developer.ibm.com/docloud/docs/api-key/)
page to generate the key after you register. This page also contains
the base URL you must use for DOcloud.
3. The examples rely on you specifying the api_key either in the sample
``.py`` file or in a resource file in your HOME directory.
a. Create a ``.docplexrc`` file in your HOME directory and insert the following
lines:
url: YOUR_DOCLOUD_URL
api_key: YOUR_API_KEY_HERE
b. Edit each sample ``.py`` file. Look for:
"""DOcloud credentials can be specified here with url and api_key.
Alternatively, if api_key is None, DOcloudContext.make_default_context()
looks for a .docplexrc file in your home directory. That file contains the
credential and other properties.
"""
url = "YOUR_URL_HERE"
api_key = None
Edit your url and api_key.
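For reference, here is a minimal sketch of how the samples pick up these credentials. It mirrors the ``__main__`` blocks of the samples in this commit; the URL is a placeholder that you must replace with your own.
```python
from docplex.mp.context import DOcloudContext

# With an explicit url/api_key the context uses them directly; with api_key=None
# it falls back to the .docplexrc file in your HOME directory.
url = "YOUR_DOCLOUD_URL"
api_key = None
ctx = DOcloudContext.make_default_context(url, api_key)
ctx.print_information()
```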
## Using IBM ILOG CPLEX V12.6.2 on your computer
If you have IBM ILOG CPLEX Optimization Studio V12.6.2 installed, you need to add
``<cplexdir>/python/<python_version>/<platform>`` to your PYTHONPATH.
* ``<cplexdir>`` is your CPLEX installation directory.
* ``<python_version>`` is:
* 2.7 if your python version is 2.7
* 3.4 if your python version is 3.4
* ``<platform>`` is:
* ``x64_win64`` if your operating system is Windows
* ``x86-64_linux`` if your operating system is Linux
Note that if CPLEX is in the PYTHONPATH, then it overrides the DOcloud credentials and solves locally, unless you use
``solve_cloud`` instead of standard methods.
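As a quick sanity check that the local engine is visible, you can try importing the CPLEX Python API (a minimal sketch, assuming the directory above has been added to your PYTHONPATH):
```python
# Check whether the local CPLEX Python API can be imported.
try:
    import cplex
    print("Local CPLEX found, version", cplex.Cplex().get_version())
except ImportError:
    print("No local CPLEX on the PYTHONPATH; solving will use the DOcloud credentials")
```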
## Dependencies
These third-party dependencies are installed with ``pip``:
- [docloud](https://pypi.python.org/pypi/docloud)
- [enum34](https://pypi.python.org/pypi/enum34)
- [futures](https://pypi.python.org/pypi/futures)
- [requests](https://pypi.python.org/pypi/requests)
- [six](https://pypi.python.org/pypi/six)
## License
This library is delivered under the Apache License Version 2.0, January 2004 (see LICENSE.txt).

View File

@@ -0,0 +1,4 @@
# gendoc: ignore
"""
This is the examples package.
"""

View File

@@ -0,0 +1,4 @@
# gendoc: ignore
"""
This is the modeling examples package.
"""

View File

@@ -0,0 +1,104 @@
# The goal of the diet problem is to select a set of foods that satisfies
# a set of daily nutritional requirements at minimal cost.
# Source of data: http://www.neos-guide.org/content/diet-problem-solver
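# Formulation, as built in build_diet_model below:
#   minimize     sum over foods f of unit_cost[f] * qty[f]
#   subject to   nutrient qmin[n] <= sum over foods f of qty[f] * food_nutrients[f, n] <= nutrient qmax[n]
#   with bounds  food qmin[f] <= qty[f] <= food qmax[f]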
from collections import namedtuple
from docplex.mp.model import Model
from docplex.mp.context import DOcloudContext
FOODS = [
("Roasted Chicken", 0.84, 0, 10),
("Spaghetti W/ Sauce", 0.78, 0, 10),
("Tomato,Red,Ripe,Raw", 0.27, 0, 10),
("Apple,Raw,W/Skin", .24, 0, 10),
("Grapes", 0.32, 0, 10),
("Chocolate Chip Cookies", 0.03, 0, 10),
("Lowfat Milk", 0.23, 0, 10),
("Raisin Brn", 0.34, 0, 10),
("Hotdog", 0.31, 0, 10)
]
NUTRIENTS = [
("Calories", 2000, 2500),
("Calcium", 800, 1600),
("Iron", 10, 30),
("Vit_A", 5000, 50000),
("Dietary_Fiber", 25, 100),
("Carbohydrates", 0, 300),
("Protein", 50, 100)
]
FOOD_NUTRIENTS = [
("Roasted Chicken", 277.4, 21.9, 1.8, 77.4, 0, 0, 42.2),
("Spaghetti W/ Sauce", 358.2, 80.2, 2.3, 3055.2, 11.6, 58.3, 8.2),
("Tomato,Red,Ripe,Raw", 25.8, 6.2, 0.6, 766.3, 1.4, 5.7, 1),
("Apple,Raw,W/Skin", 81.4, 9.7, 0.2, 73.1, 3.7, 21, 0.3),
("Grapes", 15.1, 3.4, 0.1, 24, 0.2, 4.1, 0.2),
("Chocolate Chip Cookies", 78.1, 6.2, 0.4, 101.8, 0, 9.3, 0.9),
("Lowfat Milk", 121.2, 296.7, 0.1, 500.2, 0, 11.7, 8.1),
("Raisin Brn", 115.1, 12.9, 16.8, 1250.2, 4, 27.9, 4),
("Hotdog", 242.1, 23.5, 2.3, 0, 0, 18, 10.4)
]
def build_diet_model(docloud_context=None):
# Create tuples with named fields for foods and nutrients
Food = namedtuple("Food", ["name", "unit_cost", "qmin", "qmax"])
food = [Food(*f) for f in FOODS]
Nutrient = namedtuple("Nutrient", ["name", "qmin", "qmax"])
nutrients = [Nutrient(*row) for row in NUTRIENTS]
food_nutrients = {(fn[0], nutrients[n].name):
fn[1 + n] for fn in FOOD_NUTRIENTS for n in range(len(NUTRIENTS))}
# Model
m = Model("diet", docloud_context=docloud_context)
# Decision variables, limited to be >= Food.qmin and <= Food.qmax
qty = dict((f, m.continuous_var(f.qmin, f.qmax, f.name)) for f in food)
# Limit range of nutrients, and mark them as KPIs
for n in nutrients:
amount = m.sum(qty[f] * food_nutrients[f.name, n.name] for f in food)
m.add_range(n.qmin, amount, n.qmax)
m.add_kpi(amount, publish_name="Total %s" % n.name)
# Minimize cost
m.minimize(m.sum(qty[f] * f.unit_cost for f in food))
m.print_information()
return m
if __name__ == '__main__':
"""DOcloud credentials can be specified here with url and api_key in the code block below.
Alternatively, if api_key is None, DOcloudContext.make_default_context()
looks for a .docplexrc file in your home directory on unix ($HOME) or
user profile directory on windows (%UserProfile%). That file contains the
credential and other properties. For example, something similar to::
url = "https://docloud.service.com/job_manager/rest/v1"
api_key = "example api_key"
"""
url = "YOUR_URL_HERE"
api_key = None
ctx = DOcloudContext.make_default_context(url, api_key)
ctx.print_information()
from docplex.mp.environment import Environment
env = Environment()
env.print_information()
mdl = build_diet_model(ctx)
if not mdl.solve():
print("*** Problem has no solution")
else:
mdl.float_precision = 3
print("* model solved as function with objective: {:g}".format(mdl.objective_value))
mdl.print_solution()
mdl.report_kpis()

View File

@@ -0,0 +1,494 @@
from collections import namedtuple
from enum import Enum
from docplex.mp.model import Model
from docplex.mp.context import DOcloudContext
class Weekday(Enum):
(Monday, Tuesday, Wednesday, Thursday, Friday, Saturday, Sunday) = range(1, 8) # [1..7]
TWorkRules = namedtuple("TWorkRules", ["work_time_max"])
TShift1 = namedtuple("TShift", ["department", "day", "start_time", "end_time", "min_requirement", "max_requirement"])
TVacation = namedtuple("TVacation", ["nurse", "day"])
TNursePair = namedtuple("TNursePair", ["firstNurse", "secondNurse"])
TSkillRequirement = namedtuple("TSkillRequirement", ["department", "skill", "required"])
# subclass the namedtuple to refine the str() method as the nurse's name
class TNurse(namedtuple("TNurse1", ["name", "seniority", "qualification", "payRate"])):
def __str__(self):
return self.name
# specialized namedtuple to redefine its str() method
class TShift(TShift1):
def __str__(self):
# keep the first four characters of the department name, uppercased
dept2 = self.department[0:4].upper()
# keep the first three letters of the weekday name
dayname = self.day.name[0:3]
return '{}_{}_{:02d}'.format(dept2, dayname, self.start_time)
class ShiftActivity(object):
@staticmethod
def to_abstime(day_index, time_of_day):
""" Convert a pair (day_index, time) into a number of hours since Monday 00:00
:param day_index: The index of the day from 1 to 7 (Monday is 1).
:param time_of_day: An integer number of hours.
:return: The total number of hours elapsed since Monday 00:00.
"""
ONE_DAY = 24
time = ONE_DAY * (day_index - 1)
time += time_of_day
return time
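# Example: to_abstime(1, 8) == 8 (Monday 08:00) and to_abstime(2, 8) == 32,
# since Tuesday 08:00 is 32 hours after Monday 00:00.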
def __init__(self, weekday, start_time_of_day, end_time_of_day):
assert (isinstance(weekday, Weekday))
assert (start_time_of_day >= 0)
assert (start_time_of_day <= 24)
assert (end_time_of_day >= 0)
assert (end_time_of_day <= 24)
self._weekday = weekday
self._start_time_of_day = start_time_of_day
self._end_time_of_day = end_time_of_day
# conversion to absolute time.
start_day_index = weekday.value
self.start_time = self.to_abstime(start_day_index, start_time_of_day)
end_day_index = start_day_index if end_time_of_day > start_time_of_day else start_day_index + 1
self.end_time = self.to_abstime(end_day_index, end_time_of_day)
assert (self.end_time > self.start_time)
@property
def duration(self):
return self.end_time - self.start_time
def overlaps(self, other_shift):
if not isinstance(other_shift, ShiftActivity):
return False
else:
return other_shift.end_time > self.start_time and other_shift.start_time < self.end_time
def solve(model):
if model.solve():
print("solution for a cost of {}".format(model.objective_value))
print_information(model)
print_solution(model)
return model.objective_value
else:
print("* model is infeasible")
return None
def load_data(model, *args):
""" Usage: load_data(departments, skills, shifts, nurses, nurse_skills, vacations) """
model.number_of_overlaps = 0
model.work_rules = DEFAULT_WORK_RULES
number_of_args = len(args)
model.departments = args[0]
model.skills = args[1]
model.shifts = [TShift(*shift_row) for shift_row in args[2]]
model.nurses = [TNurse(*nurse_row) for nurse_row in args[3]]
model.nurse_skills = args[4]
model.skill_requirements = SKILL_REQUIREMENTS
# transactional data
if number_of_args >= 6:
model.vacations = [TVacation._make(vacation_row) for vacation_row in args[5]]
else:
model.vacations = []
if number_of_args >= 7:
model.nurse_associations = [TNursePair._make(npr) for npr in args[6]]
else:
model.nurse_associations = []
if number_of_args >= 8:
model.nurse_incompatibilities = [TNursePair._make(npr) for npr in args[7]]
else:
model.nurse_incompatibilities = []
def setup_data(model):
""" compute internal data """
all_nurses = model.nurses
model.vacations_by_nurse = {n: [vac_day for (vac_nurse_id, vac_day) in model.vacations if vac_nurse_id == n.name]
for n in model.nurses}
# compute shift activities (start, end duration)
model.shift_activities = {s: ShiftActivity(s.day, s.start_time, s.end_time) for s in model.shifts}
model.nurses_by_id = {n.name: n for n in all_nurses}
def setup_variables(model):
all_nurses, all_shifts = model.nurses, model.shifts
model.nurse_assignment_vars = model.binary_var_matrix(all_nurses, all_shifts, 'NurseAssigned')
model.nurse_work_time_vars = model.continuous_var_dict(all_nurses, lb=0, name='NurseWorkTime')
model.nurse_over_average_time_vars = model.continuous_var_dict(all_nurses, lb=0,
name='NurseOverAverageWorkTime')
model.nurse_under_average_time_vars = model.continuous_var_dict(all_nurses, lb=0,
name='NurseUnderAverageWorkTime')
model.average_nurse_work_time = model.continuous_var(lb=0, name='AverageNurseWorkTime')
def setup_constraints(model):
all_nurses = model.nurses
all_shifts = model.shifts
nurse_assigned_vars = model.nurse_assignment_vars
nurse_work_time_vars = model.nurse_work_time_vars
shift_activities = model.shift_activities
nurses_by_id = model.nurses_by_id
max_work_time = model.work_rules.work_time_max
# define average
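# The average is written as len(nurses) * average == sum(work times), which is
# equivalent to dividing the sum by the number of nurses but avoids fractional
# coefficients in the constraint.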
model.add_constraint(
len(all_nurses) * model.average_nurse_work_time == model.sum(
model.nurse_work_time_vars[n] for n in model.nurses),
"average")
# compute each nurse's work time and its deviation over/under the average
for n in all_nurses:
work_time_var = nurse_work_time_vars[n]
model.add_constraint(
work_time_var == model.sum(nurse_assigned_vars[n, s] * shift_activities[s].duration for s in model.shifts),
"work_time_%s" % str(n))
model.add_constraint(work_time_var == model.average_nurse_work_time + model.nurse_over_average_time_vars[n] -
model.nurse_under_average_time_vars[n], "averag_work_time_%s" % str(n))
model.add_constraint(work_time_var <= max_work_time, "max_time_%s" % str(n))
# vacations
for n in all_nurses:
for vac_day in model.vacations_by_nurse[n]:
for shift in (s for s in all_shifts if s.day == vac_day):
model.add_constraint(nurse_assigned_vars[n, shift] == 0,
"medium_vacations_%s_%s_%s" % (str(n), vac_day, str(shift)))
# a nurse cannot be assigned overlapping shifts
model.number_of_overlaps = 0
for s1 in all_shifts:
for s2 in all_shifts:
if s1 != s2 and shift_activities[s1].overlaps(shift_activities[s2]):
model.number_of_overlaps += 1
for n in all_nurses:
model.add_constraint(nurse_assigned_vars[n, s1] + nurse_assigned_vars[n, s2] <= 1,
"medium_overlapping_%s_%s_%s" % (str(s1), str(s2), str(n)))
for s in all_shifts:
demand_min = s.min_requirement
demand_max = s.max_requirement
model.add_range(demand_min, model.sum([nurse_assigned_vars[n, s] for n in model.nurses]), demand_max,
"medium_shift_%s" % str(s))
for (dept, skill, required) in model.skill_requirements:
if required > 0:
for dsh in (s for s in all_shifts if dept == s.department):
model.add_constraint(model.sum(nurse_assigned_vars[skilled_nurse, dsh] for skilled_nurse in
(n for n in all_nurses if
n.name in model.nurse_skills.keys() and skill in model.nurse_skills[
n.name])) >= required,
"high_required_%s_%s_%s_%s" % (str(dept), str(skill), str(required), str(dsh)))
# nurse-nurse associations
c = 0
for (first_nurse_id, second_nurse_id) in model.nurse_associations:
if first_nurse_id in nurses_by_id and second_nurse_id in nurses_by_id:
first_nurse = nurses_by_id[first_nurse_id]
second_nurse = nurses_by_id[second_nurse_id]
for s in all_shifts:
c += 1
ct_name = 'medium_ct_nurse_assoc%s_%s_%d' % (first_nurse_id, second_nurse_id, c)
model.add_constraint(nurse_assigned_vars[first_nurse, s] == nurse_assigned_vars[second_nurse, s],
ct_name)
# nurse-nurse incompatibilities
c = 0
for (first_nurse_id, second_nurse_id) in model.nurse_incompatibilities:
if first_nurse_id in nurses_by_id and second_nurse_id in nurses_by_id:
first_nurse = nurses_by_id[first_nurse_id]
second_nurse = nurses_by_id[second_nurse_id]
for s in all_shifts:
c += 1
ct_name = 'medium_ct_nurse_incompat_%s_%s_%d' % (first_nurse_id, second_nurse_id, c)
model.add_constraint(nurse_assigned_vars[first_nurse, s] + nurse_assigned_vars[second_nurse, s] <= 1,
ct_name)
model.nurse_costs = [model.nurse_assignment_vars[n, s] * n.payRate * model.shift_activities[s].duration for n in
model.nurses
for s in model.shifts]
model.total_number_of_assignments = model.sum(
model.nurse_assignment_vars[n, s] for n in model.nurses for s in model.shifts)
model.total_salary_cost = model.sum(model.nurse_costs)
def setup_objective(model):
model.add_kpi(model.total_salary_cost, "Total salary cost")
model.add_kpi(model.total_number_of_assignments, "Total number of assignments")
model.add_kpi(model.average_nurse_work_time)
total_fairness = model.sum(model.nurse_over_average_time_vars[n] for n in model.nurses) + model.sum(
model.nurse_under_average_time_vars[n] for n in model.nurses)
model.add_kpi(total_fairness, "Total fairness")
model.minimize(model.total_salary_cost + model.total_number_of_assignments + total_fairness)
def print_information(model):
print("#departments=%d" % len(model.departments))
print("#skills=%d" % len(model.skills))
print("#shifts=%d" % len(model.shifts))
print("#nurses=%d" % len(model.nurses))
print("#vacations=%d" % len(model.vacations))
print("#nurse associations=%d" % len(model.nurse_associations))
print("#incompatibilities=%d" % len(model.nurse_incompatibilities))
model.print_information()
model.report_kpis()
def print_solution(model):
print("*************************** Solution ***************************")
print("Allocation By Department:")
for d in model.departments:
print ("\t{}: {}".format(d, sum(
model.nurse_assignment_vars[n, s].solution_value for n in model.nurses for s in model.shifts if
s.department == d)))
print("Cost By Department:")
for d in model.departments:
cost = sum(
model.nurse_assignment_vars[n, s].solution_value * n.payRate * model.shift_activities[s].duration for n in
model.nurses for s in model.shifts if s.department == d)
print("\t{}: {}".format(d, cost))
print("Nurses Assignments")
for n in sorted(model.nurses):
total_hours = sum(
model.nurse_assignment_vars[n, s].solution_value * model.shift_activities[s].duration for s in model.shifts)
print("\t{}: total hours:{}".format(n.name, total_hours))
for s in model.shifts:
if model.nurse_assignment_vars[n, s].solution_value == 1:
print ("\t\t{}: {} {}-{}".format(s.day.name, s.department, s.start_time, s.end_time))
SKILLS = ["Anaesthesiology",
"Cardiac Care",
"Geriatrics",
"Oncology",
"Pediatrics"
]
DEPTS = ["Consultation",
"Emergency"
]
NURSES = [("Anne", 11, 1, 25),
("Bethanie", 4, 5, 28),
("Betsy", 2, 2, 17),
("Cathy", 2, 2, 17),
("Cecilia", 9, 5, 38),
("Chris", 11, 4, 38),
("Cindy", 5, 2, 21),
("David", 1, 2, 15),
("Debbie", 7, 2, 24),
("Dee", 3, 3, 21),
("Gloria", 8, 2, 25),
("Isabelle", 3, 1, 16),
("Jane", 3, 4, 23),
("Janelle", 4, 3, 22),
("Janice", 2, 2, 17),
("Jemma", 2, 4, 22),
("Joan", 5, 3, 24),
("Joyce", 8, 3, 29),
("Jude", 4, 3, 22),
("Julie", 6, 2, 22),
("Juliet", 7, 4, 31),
("Kate", 5, 3, 24),
("Nancy", 8, 4, 32),
("Nathalie", 9, 5, 38),
("Nicole", 0, 2, 14),
("Patricia", 1, 1, 13),
("Patrick", 6, 1, 19),
("Roberta", 3, 5, 26),
("Suzanne", 5, 1, 18),
("Vickie", 7, 1, 20),
("Wendie", 5, 2, 21),
("Zoe", 8, 3, 29)
]
SHIFTS = [("Emergency", Weekday.Monday, 2, 8, 3, 5),
("Emergency", Weekday.Monday, 8, 12, 4, 7),
("Emergency", Weekday.Monday, 12, 18, 2, 5),
("Emergency", Weekday.Monday, 18, 2, 3, 7),
("Consultation", Weekday.Monday, 8, 12, 10, 13),
("Consultation", Weekday.Monday, 12, 18, 8, 12),
("Cardiac Care", Weekday.Monday, 8, 12, 10, 13),
("Cardiac Care", Weekday.Monday, 12, 18, 8, 12),
("Emergency", Weekday.Tuesday, 8, 12, 4, 7),
("Emergency", Weekday.Tuesday, 12, 18, 2, 5),
("Emergency", Weekday.Tuesday, 18, 2, 3, 7),
("Consultation", Weekday.Tuesday, 8, 12, 10, 13),
("Consultation", Weekday.Tuesday, 12, 18, 8, 12),
("Cardiac Care", Weekday.Tuesday, 8, 12, 4, 7),
("Cardiac Care", Weekday.Tuesday, 12, 18, 2, 5),
("Cardiac Care", Weekday.Tuesday, 18, 2, 3, 7),
("Emergency", Weekday.Wednesday, 2, 8, 3, 5),
("Emergency", Weekday.Wednesday, 8, 12, 4, 7),
("Emergency", Weekday.Wednesday, 12, 18, 2, 5),
("Emergency", Weekday.Wednesday, 18, 2, 3, 7),
("Consultation", Weekday.Wednesday, 8, 12, 10, 13),
("Consultation", Weekday.Wednesday, 12, 18, 8, 12),
("Emergency", Weekday.Thursday, 2, 8, 3, 5),
("Emergency", Weekday.Thursday, 8, 12, 4, 7),
("Emergency", Weekday.Thursday, 12, 18, 2, 5),
("Emergency", Weekday.Thursday, 18, 2, 3, 7),
("Consultation", Weekday.Thursday, 8, 12, 10, 13),
("Consultation", Weekday.Thursday, 12, 18, 8, 12),
("Emergency", Weekday.Friday, 2, 8, 3, 5),
("Emergency", Weekday.Friday, 8, 12, 4, 7),
("Emergency", Weekday.Friday, 12, 18, 2, 5),
("Emergency", Weekday.Friday, 18, 2, 3, 7),
("Consultation", Weekday.Friday, 8, 12, 10, 13),
("Consultation", Weekday.Friday, 12, 18, 8, 12),
("Emergency", Weekday.Saturday, 2, 12, 5, 7),
("Emergency", Weekday.Saturday, 12, 20, 7, 9),
("Emergency", Weekday.Saturday, 20, 2, 12, 12),
("Emergency", Weekday.Sunday, 2, 12, 5, 7),
("Emergency", Weekday.Sunday, 12, 20, 7, 9),
("Emergency", Weekday.Sunday, 20, 2, 12, 12),
("Geriatrics", Weekday.Sunday, 8, 10, 2, 5)]
NURSE_SKILLS = {"Anne": ["Anaesthesiology", "Oncology", "Pediatrics"],
"Betsy": ["Cardiac Care"],
"Cathy": ["Anaesthesiology"],
"Cecilia": ["Anaesthesiology", "Oncology", "Pediatrics"],
"Chris": ["Cardiac Care", "Oncology", "Geriatrics"],
"Gloria": ["Pediatrics"], "Jemma": ["Cardiac Care"],
"Joyce": ["Anaesthesiology", "Pediatrics"],
"Julie": ["Geriatrics"], "Juliet": ["Pediatrics"],
"Kate": ["Pediatrics"], "Nancy": ["Cardiac Care"],
"Nathalie": ["Anaesthesiology", "Geriatrics"],
"Patrick": ["Oncology"], "Suzanne": ["Pediatrics"],
"Wendie": ["Geriatrics"],
"Zoe": ["Cardiac Care"]
}
VACATIONS = [("Anne", Weekday.Friday),
("Anne", Weekday.Sunday),
("Cathy", Weekday.Thursday),
("Cathy", Weekday.Tuesday),
("Joan", Weekday.Thursday),
("Joan", Weekday.Saturday),
("Juliet", Weekday.Monday),
("Juliet", Weekday.Tuesday),
("Juliet", Weekday.Thursday),
("Nathalie", Weekday.Sunday),
("Nathalie", Weekday.Thursday),
("Isabelle", Weekday.Monday),
("Isabelle", Weekday.Thursday),
("Patricia", Weekday.Saturday),
("Patricia", Weekday.Wednesday),
("Nicole", Weekday.Friday),
("Nicole", Weekday.Wednesday),
("Jude", Weekday.Tuesday),
("Jude", Weekday.Friday),
("Debbie", Weekday.Saturday),
("Debbie", Weekday.Wednesday),
("Joyce", Weekday.Sunday),
("Joyce", Weekday.Thursday),
("Chris", Weekday.Thursday),
("Chris", Weekday.Tuesday),
("Cecilia", Weekday.Friday),
("Cecilia", Weekday.Wednesday),
("Patrick", Weekday.Saturday),
("Patrick", Weekday.Sunday),
("Cindy", Weekday.Sunday),
("Dee", Weekday.Tuesday),
("Dee", Weekday.Friday),
("Jemma", Weekday.Friday),
("Jemma", Weekday.Wednesday),
("Bethanie", Weekday.Wednesday),
("Bethanie", Weekday.Tuesday),
("Betsy", Weekday.Monday),
("Betsy", Weekday.Thursday),
("David", Weekday.Monday),
("Gloria", Weekday.Monday),
("Jane", Weekday.Saturday),
("Jane", Weekday.Sunday),
("Janelle", Weekday.Wednesday),
("Janelle", Weekday.Friday),
("Julie", Weekday.Sunday),
("Kate", Weekday.Tuesday),
("Kate", Weekday.Monday),
("Nancy", Weekday.Sunday),
("Roberta", Weekday.Friday),
("Roberta", Weekday.Saturday),
("Janice", Weekday.Tuesday),
("Janice", Weekday.Friday),
("Suzanne", Weekday.Monday),
("Vickie", Weekday.Wednesday),
("Vickie", Weekday.Friday),
("Wendie", Weekday.Thursday),
("Wendie", Weekday.Saturday),
("Zoe", Weekday.Saturday),
("Zoe", Weekday.Sunday)]
NURSE_ASSOCIATIONS = [("Isabelle", "Dee"),
("Anne", "Patrick")]
NURSE_INCOMPATIBILITIES = [("Patricia", "Patrick"),
("Janice", "Wendie"),
("Suzanne", "Betsy"),
("Janelle", "Jane"),
("Gloria", "David"),
("Dee", "Jemma"),
("Bethanie", "Dee"),
("Roberta", "Zoe"),
("Nicole", "Patricia"),
("Vickie", "Dee"),
("Joan", "Anne")
]
SKILL_REQUIREMENTS = [("Emergency", "Cardiac Care", 1)]
DEFAULT_WORK_RULES = TWorkRules(40)
def build(docloud_context=None):
model = Model("Nurses", docloud_context=docloud_context)
load_data(model, DEPTS, SKILLS, SHIFTS, NURSES, NURSE_SKILLS, VACATIONS, NURSE_ASSOCIATIONS,
NURSE_INCOMPATIBILITIES)
setup_data(model)
setup_variables(model)
setup_constraints(model)
setup_objective(model)
return model
def run(docloud_context=None):
model = build(docloud_context=docloud_context)
status = solve(model)
return status
if __name__ == '__main__':
"""DOcloud credentials can be specified here with url and api_key in the code block below.
Alternatively, if api_key is None, DOcloudContext.make_default_context()
looks for a .docplexrc file in your home directory on unix ($HOME)
or user profile directory on windows (%UserProfile%). That file contains the
credential and other properties. For example, something similar to::
url = "https://docloud.service.com/job_manager/rest/v1"
api_key = "example api_key"
"""
url = "YOUR_URL_HERE"
api_key = None
ctx = DOcloudContext.make_default_context(url, api_key)
ctx.print_information()
from docplex.mp.environment import Environment
env = Environment()
env.print_information()
run(ctx)

View File

@@ -0,0 +1,97 @@
"""The model aims at minimizing the production cost for a number of products while satisfying customer demand.
Each product can be produced either inside the company or outside, at a higher cost.
The inside production is constrained by the company's resources, while outside production is considered unlimited.
The model first declares the products and the resources.
The data consists of the description of the products (the demand, the inside and outside costs,
and the resource consumption) and the capacity of the various resources.
The variables for this problem are the inside and outside production for each product.
"""
from docplex.mp.model import Model
from docplex.mp.context import DOcloudContext
def build_production_problem(products, resources, consumptions, docloud_context=None):
""" Takes as input:
- a list of product tuples (name, demand, inside, outside)
- a list of resource tuples (name, capacity)
- a list of consumption tuples (product_name, resource_named, consumed)
"""
mdl = Model('production', docloud_context=docloud_context)
# --- decision variables ---
mdl.inside_vars = mdl.continuous_var_dict(products, name='inside')
mdl.outside_vars = mdl.continuous_var_dict(products, name='outside')
# --- constraints ---
# demand satisfaction
for prod in products:
mdl.add_constraint(mdl.inside_vars[prod] + mdl.outside_vars[prod] >= prod[1])
# --- resource capacity ---
for res in resources:
mdl.add_constraint(mdl.sum([mdl.inside_vars[p] * consumptions[p[0], res[0]] for p in products]) <= res[1])
# --- objective ---
mdl.total_inside_cost = mdl.sum(mdl.inside_vars[p] * p[2] for p in products)
mdl.total_outside_cost = mdl.sum(mdl.outside_vars[p] * p[3] for p in products)
mdl.minimize(mdl.total_inside_cost + mdl.total_outside_cost)
return mdl
def solve_production_problem(products, resources, consumptions, docloud_context=None):
mdl = build_production_problem(products, resources, consumptions, docloud_context)
# --- solve ---
mdl.print_information()
if not mdl.solve():
print("Problem has no solution")
return -1
obj = mdl.objective_value
print("* Production model solved with objective: {:g}".format(obj))
print("* Total inside cost=%g" % mdl.total_inside_cost.solution_value)
for p in products:
print("Inside production of {product}: {ins_var}".format
(product=p[0], ins_var=mdl.inside_vars[p].solution_value))
print("* Total outside cost=%g" % mdl.total_outside_cost.solution_value)
for p in products:
print("Outside production of {product}: {out_var}".format
(product=p[0], out_var=mdl.outside_vars[p].solution_value))
return obj
PRODUCTS = [("kluski", 100, 0.6, 0.8),
("capellini", 200, 0.8, 0.9),
("fettucine", 300, 0.3, 0.4)]
# resources are a list of simple tuples (name, capacity)
RESOURCES = [("flour", 20),
("eggs", 40)]
CONSUMPTIONS = {("kluski", "flour"): 0.5,
("kluski", "eggs"): 0.2,
("capellini", "flour"): 0.4,
("capellini", "eggs"): 0.4,
("fettucine", "flour"): 0.3,
("fettucine", "eggs"): 0.6}
if __name__ == '__main__':
"""DOcloud credentials can be specified here with url and api_key in the code block below.
Alternatively, if api_key is None, DOcloudContext.make_default_context()
looks for a .docplexrc file in your home directory on unix ($HOME)
or user profile directory on windows (%UserProfile%). That file contains the
credential and other properties. For example, something similar to::
url = "https://docloud.service.com/job_manager/rest/v1"
api_key = "example api_key"
"""
url = "YOUR_URL_HERE"
api_key = None
ctx = DOcloudContext.make_default_context(url, api_key)
ctx.print_information()
EXPECTED_COST = 372
print("* Running production model as a function")
fobj = solve_production_problem(PRODUCTS, RESOURCES, CONSUMPTIONS, docloud_context=ctx)
assert fobj == EXPECTED_COST

View File

@@ -0,0 +1,140 @@
from collections import namedtuple
from docplex.mp.model import Model
from docplex.mp.context import DOcloudContext
nbs = (8, 1, 1, 16)
team_div1 = {"Baltimore Ravens", "Cincinnati Bengals", "Cleveland Browns",
"Pittsburgh Steelers", "Houston Texans", "Indianapolis Colts",
"Jacksonville Jaguars", "Tennessee Titans", "Buffalo Bills",
"Miami Dolphins", "New England Patriots", "New York Jets",
"Denver Broncos", "Kansas City Chiefs", "Oakland Raiders",
"San Diego Chargers"}
team_div2 = {"Chicago Bears", "Detroit Lions", "Green Bay Packers",
"Minnesota Vikings", "Atlanta Falcons", "Carolina Panthers",
"New Orleans Saints", "Tampa Bay Buccaneers", "Dallas Cowboys",
"New York Giants", "Philadelphia Eagles", "Washington Redskins",
"Arizona Cardinals", "San Francisco 49ers", "Seattle Seahawks",
"St. Louis Rams"}
Match = namedtuple("Matches", ["team1", "team2", "is_divisional"])
def build_sports(docloud_context=None):
print("* building sport scheduling model instance")
model = Model('sportSchedCPLEX', docloud_context=docloud_context)
nb_teams_in_division = nbs[0]
nb_intra_divisional = nbs[1]
nb_inter_divisional = nbs[2]
max_teams_in_division = nbs[3]
assert len(team_div1) == len(team_div2)
teams = list(team_div1 | team_div2)
model.teams = teams
teams = range(1, 2 * nb_teams_in_division + 1)
# Calculate the number of weeks necessary.
nb_weeks = (nb_teams_in_division - 1) * nb_intra_divisional + nb_teams_in_division * nb_inter_divisional
weeks = range(1, nb_weeks + 1)
model.weeks = weeks
print("{0} games, {1} intradivisional, {2} interdivisional"
.format(nb_weeks, (nb_teams_in_division - 1) * nb_intra_divisional,
nb_teams_in_division * nb_inter_divisional))
# Season is split into two halves.
first_half_weeks = range(1, nb_weeks // 2 + 1)
nb_first_half_games = nb_weeks // 3
# All possible matches (pairings) and whether or not each is intradivisional.
matches = sorted(
{Match(t1, t2, 1 if (t2 <= nb_teams_in_division or t1 > nb_teams_in_division) else 0) for t1 in teams for t2 in
teams
if t1 < t2})
model.matches = matches
# Number of games to play between pairs depends on
# whether the pairing is intradivisional or not.
nb_play = {m: nb_intra_divisional if m.is_divisional == 1 else nb_inter_divisional for m in matches}
plays = model.binary_var_matrix(keys1=matches, keys2=weeks,
name=lambda mw: "play_%d_%d_w%d" % (mw[0].team1, mw[0].team2, mw[1]))
model.plays = plays
for m in matches:
model.add_constraint(model.sum(plays[m, w] for w in weeks) == nb_play[m],
"correct_nb_games_%d_%d" % (m.team1, m.team2))
for w in weeks:
# Each team must play exactly once in a week.
for t in teams:
plays_of_team_this_week = (plays[m, w] for m in matches if m.team1 == t or m.team2 == t)
model.add_constraint(model.sum(plays_of_team_this_week) == 1,
"plays_exactly_once_%d_%s" % (w, t))
# Games between the same teams cannot be on successive weeks.
for m in matches:
if w < nb_weeks:
model.add_constraint(plays[m, w] + plays[m, w + 1] <= 1)
# Some intradivisional games should be in the first half.
for t in teams:
first_half_divisional_plays = [plays[m, w] for w in first_half_weeks for m in matches if
m.is_divisional == 1 and (m.team1 == t or m.team2 == t)]
model.add_constraint(model.sum(first_half_divisional_plays) >= nb_first_half_games,
"in_division_first_half_%s" % t)
# postpone divisional matches as much as possible
# we weight each play variable with the square of w.
model.maximize(model.sum(plays[m, w] * w * w for w in weeks for m in matches if m.is_divisional))
return model
def solve_sports(docloud_context=None):
model = build_sports(docloud_context=docloud_context)
model.print_information()
model.export_as_lp()
model.solve()
model.report()
TSolution = namedtuple("TSolution", ["week", "is_divisional", "team1", "team2"])
solution = sorted(
{TSolution(w, m.is_divisional, model.teams[m.team1], model.teams[m.team2]) for m in model.matches for w in
model.weeks if
model.plays[m, w].get_value() == 1})
currweek = 0
print("Intradivisional games are marked with a *")
for s in solution:
if s.week != currweek:
currweek = s.week
print(" == == == == == == == == == == == == == == == == ")
print("On week %d" % currweek)
print(" {0:s}{1} will meet the {2}".format("*" if s.is_divisional else "", s.team1, s.team2))
return model.objective_value
if __name__ == '__main__':
"""DOcloud credentials can be specified here with url and api_key in the code block below.
Alternatively, if api_key is None, DOcloudContext.make_default_context()
looks for a .docplexrc file in your home directory on unix ($HOME)
or user profile directory on windows (%UserProfile%). That file contains the
credential and other properties. For example, something similar to::
url = "https://docloud.service.com/job_manager/rest/v1"
api_key = "example api_key"
"""
print("This example hits the limits of CPLEX Optimization Studio and will work only the DOcloud solve.")
url = "YOUR_URL_HERE"
api_key = None
ctx = DOcloudContext.make_default_context(url, api_key)
ctx.print_information()
from docplex.mp.environment import Environment
env = Environment()
env.print_information()
solve_sports(docloud_context=ctx)

View File

@@ -0,0 +1,4 @@
# gendoc: ignore
"""
This is the workflow examples package.
"""

View File

@@ -0,0 +1,283 @@
from collections import namedtuple
from docplex.mp.model import AbstractModel
from docplex.mp.utils import is_iterable
from docplex.mp.context import DOcloudContext
# ------------------------------
DEFAULT_ROLL_WIDTH = 110
DEFAULT_ITEMS = [(1, 20, 48), (2, 45, 35), (3, 50, 24), (4, 55, 10), (5, 75, 8)]
DEFAULT_PATTERNS = [(i, 1) for i in range(1, 6)] # (1, 1), (2, 1) etc
DEFAULT_PATTERN_ITEM_FILLED = [(p, p, 1) for p in range(1, 6)] # pattern1 for item1, pattern2 for item2, etc.
FIRST_GENERATION_DUALS = [1, 1, 1, 1, 0]
class TItem(object):
def __init__(self, item_id, item_size, demand):
self.id = item_id
self.size = item_size
self.demand = demand
self.dual_value = -1
@classmethod
def make(cls, args):
arg_id = args[0]
arg_size = args[1]
arg_demand = args[2]
return cls(arg_id, arg_size, arg_demand)
def __str__(self):
return 'item%d' % self.id
class TPattern(namedtuple("TPattern", ["id", "cost"])):
def __str__(self):
return 'pattern%d' % self.id
class CuttingStockPatternGeneratorModel(AbstractModel):
""" The cutting stock pattern-generation model."""
def __init__(self, master_items, roll_width, output_level=1, docloud_context=None):
AbstractModel.__init__(self, 'CuttingStock_PatternGeneratorModel',
output_level=output_level,
docloud_context=docloud_context)
self.items = master_items
# default values
self.duals = [1] * len(master_items)
self.use_vars = {}
self.roll_width = roll_width
def setup_variables(self):
self.use_vars = self.integer_var_list(self.items, ub=999999, name='Use')
def load_data(self, *args):
self.items = [TItem.make(it_row) for it_row in args[0]]
self.duals = args[1][:]
self.roll_width = args[2]
def update_duals(self, new_duals):
""" Update the duals array"""
self.duals = new_duals
# duals are not used in the constraints, so only the objective has to be updated
self.setup_objective()
def clear(self):
self.use_vars = {}
AbstractModel.clear(self)
def setup_constraints(self):
self.add_constraint(self.scal_prod(self.use_vars, (it.size for it in self.items)) <= self.roll_width)
def setup_objective(self):
""" NOTE: this method is called at each loop"""
self.minimize(1 - self.scal_prod(self.use_vars, self.duals))
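# This objective is the reduced cost of a candidate pattern: its unit cost (1)
# minus the dual value collected for each item it covers; a negative optimum
# means the pattern can improve the current master solution.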
def get_use_values(self):
assert self.has_solution()
return [use_var.solution_value for use_var in self.use_vars]
class FirstPatternGeneratorModel(CuttingStockPatternGeneratorModel):
""" a specialized generator model to check the first iteration of pattern generation."""
def __init__(self):
CuttingStockPatternGeneratorModel.__init__(self, DEFAULT_ITEMS, DEFAULT_ROLL_WIDTH)
self.update_duals(FIRST_GENERATION_DUALS)
class CutStockMasterModel(AbstractModel):
""" The cutting stock master model. """
def __init__(self, output_level=1, docloud_context=None):
AbstractModel.__init__(self, 'Cutting Stock Master', output_level, docloud_context=docloud_context)
self.items = []
self.patterns = []
self.pattern_item_filled = {}
self.max_pattern_id = -1
self.items_by_id = {}
self.patterns_by_id = {}
# results
self.best_cost = -1
self.nb_iters = -1
self.item_fill_cts = []
self.cut_vars = {}
self.roll_width = 99999
self.MAX_CUT = 9999
def clear(self):
AbstractModel.clear(self)
self.item_fill_cts = []
self.cut_vars = {}
def load_data(self, *args):
self._check_data_args(args, 4)
item_table = args[0]
pattern_table = args[1]
fill_table = args[2]
self.items = [TItem.make(it_row) for it_row in item_table]
self.items_by_id = {it.id: it for it in self.items}
self.patterns = [TPattern(*pattern_row) for pattern_row in pattern_table]
self.patterns_by_id = {pat.id: pat for pat in self.patterns}
self.max_pattern_id = max(pt.id for pt in self.patterns)
# build the dictionary storing how much each pattern fills each item.
self.pattern_item_filled = {(self.patterns_by_id[p], self.items_by_id[i]): f for (p, i, f) in fill_table}
self.roll_width = args[3]
def add_new_pattern(self, item_usages):
""" makes a new pattern from a sequence of usages (one per item)"""
assert is_iterable(item_usages)
new_pattern_id = self.max_pattern_id + 1
new_pattern = TPattern(new_pattern_id, 1)
self.patterns.append(new_pattern)
self.max_pattern_id = new_pattern_id
for i in range(len(item_usages)):
used = item_usages[i]
item = self.items[i]
self.pattern_item_filled[new_pattern, item] = used
def setup_variables(self):
# how much to cut?
self.cut_vars = self.continuous_var_dict(self.patterns, lb=0, ub=self.MAX_CUT, name='Cut')
def setup_constraints(self):
all_items = self.items
all_patterns = self.patterns
def pattern_item_filled(pattern, item):
return self.pattern_item_filled[pattern, item] if (pattern, item) in self.pattern_item_filled else 0
self.item_fill_cts = []
for item in all_items:
item_fill_ct = self.sum(
self.cut_vars[p] * pattern_item_filled(p, item) for p in all_patterns) >= item.demand
self.item_fill_cts.append(item_fill_ct)
self.add_constraint(item_fill_ct, 'ct_fill_{0!s}'.format(item))
def setup_objective(self):
total_cutting_cost = self.sum(self.cut_vars[p] * p.cost for p in self.patterns)
self.add_kpi(total_cutting_cost, 'Total cutting cost')
self.minimize(total_cutting_cost)
def print_information(self):
print('#items={}'.format(len(self.items)))
print('#patterns={}'.format(len(self.patterns)))
AbstractModel.print_information(self)
def print_solution(self, do_filter_zeros=True):
print("| Nb of cuts \t| Pattern \t\t | Detail of pattern (nb of item1, nb of item2, ..., nb of item5) |")
print("| ----------------------------------------------------------------------------------------------- |")
for p in self.patterns:
if self.cut_vars[p].solution_value >= 1e-3:
pattern_detail = {b.id: self.pattern_item_filled[(a, b)] for (a, b) in self.pattern_item_filled if
a == p}
print(
"| {:g} \t \t \t| {} \t | {} \t\t\t\t\t\t\t\t |".format(self.cut_vars[p].solution_value, p,
pattern_detail))
print("| ----------------------------------------------------------------------------------------------- |")
def run(self, context=None):
master_model = self
master_model.ensure_setup()
gen_model = CuttingStockPatternGeneratorModel(master_items=self.items,
roll_width=self.roll_width,
output_level=self.output_level,
docloud_context=self.docloud_context
)
gen_model.setup()
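# Column-generation loop: solve the master over the current patterns, pass the
# duals of the item-fill constraints to the pattern-generator model, and add any
# pattern whose reduced cost is negative enough; stop when the master objective
# stabilizes or no improving pattern is found.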
rc_eps = 1e-6
obj_eps = 1e-4
loop_count = 0
best = 0
curr = self.infinity
status = False
while loop_count < 100 and abs(best - curr) >= obj_eps:
print('\n#items={},#patterns={},#vars={}'.format(len(self.items), len(self.patterns), self.variable_stats))
if loop_count > 0:
self.refresh_model()
status = master_model.solve()
loop_count += 1
best = curr
if not status:
print('{}> master model fails, stop'.format(loop_count))
break
else:
assert master_model.has_solution()
curr = self.objective_value
print('{}> new column generation iteration, best={:g}, curr={:g}'.format(loop_count, best, curr))
duals = self.get_fill_dual_values()
print('{0}> moving duals from master to sub model: {1!s}'.format(loop_count, duals))
gen_model.update_duals(duals)
status = gen_model.solve()
if not status:
print('{}> slave model fails, stop'.format(loop_count))
break
rc_cost = gen_model.objective_value
if rc_cost <= -rc_eps:
print('{}> slave model runs with obj={:g}'.format(loop_count, rc_cost))
else:
print('{}> pattern-generator model stops, obj={:g}'.format(loop_count, rc_cost))
break
use_values = gen_model.get_use_values()
print('{}> add new pattern to master data: {}'.format(loop_count, str(use_values)))
# make a new pattern with use values
if not (loop_count < 100 and abs(best - curr) >= obj_eps):
print('* terminating: best-curr={:g}'.format(abs(best - curr)))
break
self.add_new_pattern(use_values)
if status:
print('Cutting-stock column generation terminates, best={:g}, #loops={}'.format(curr, loop_count))
self.best_cost = curr
self.nb_iters = loop_count
else:
print("Cutting-stock column generation fails")
return status
def get_fill_dual_values(self):
return self.dual_values(self.item_fill_cts)
class DefaultCutStockMasterModel(CutStockMasterModel):
def __init__(self, output_level=1, docloud_context=None):
CutStockMasterModel.__init__(self, output_level=output_level, docloud_context=docloud_context)
self.load_data(DEFAULT_ITEMS, DEFAULT_PATTERNS, DEFAULT_PATTERN_ITEM_FILLED, DEFAULT_ROLL_WIDTH)
if __name__ == '__main__':
"""DOcloud credentials can be specified here with url and api_key in the code block below.
Alternatively, if api_key is None, DOcloudContext.make_default_context()
looks for a .docplexrc file in your home directory on unix ($HOME)
or user profile directory on windows (%UserProfile%). That file contains the
credential and other properties. For example, something similar to::
url = "https://docloud.service.com/job_manager/rest/v1"
api_key = "example api_key"
"""
url = "YOUR_URL_HERE"
api_key = None
ctx = DOcloudContext.make_default_context(url, api_key)
from docplex.mp.environment import Environment
env = Environment()
env.print_information()
cutstock_model = DefaultCutStockMasterModel(docloud_context=ctx)
ok = cutstock_model.run(ctx)
assert ok
assert cutstock_model.best_cost == 46.25
cutstock_model.print_solution()

View File

@@ -0,0 +1,136 @@
from docplex.mp.model import Model
from docplex.mp.context import DOcloudContext
B = [15, 15, 15]
C = [
[6, 10, 1],
[12, 12, 5],
[15, 4, 3],
[10, 3, 9],
[8, 9, 5]
]
A = [
[5, 7, 2],
[14, 8, 7],
[10, 6, 12],
[8, 4, 15],
[6, 12, 5]
]
def run_GAP_model(As, Bs, Cs, docloud_context=None):
mdl = Model('GAP per Wolsey -without- Lagrangian Relaxation', 0, docloud_context=docloud_context)
print("#As={}, #Bs={}, #Cs={}".format(len(As), len(Bs), len(Cs)))
number_of_cs = len(Cs)
# variables
x_vars = [mdl.binary_var_list(Cs[i], name=None) for i in range(number_of_cs)]
# constraints
for i in range(number_of_cs):
mdl.add_constraint(mdl.sum(x_vars[i]) <= 1)
# sum i: a_ij * x_ij <= b[j] for all j
for j in range(len(B)):
mdl.add_constraint(mdl.sum(x_vars[i][j] * As[i][j] for i in range(number_of_cs)) <= Bs[j])
# objective
total_profit = mdl.sum(mdl.sum(c_ij * x_ij for c_ij, x_ij in zip(c_i, x_i))
for c_i, x_i in zip(Cs, x_vars))
mdl.maximize(total_profit)
mdl.print_information()
assert mdl.solve()
obj = mdl.objective_value
mdl.print_information()
print("* GAP with no relaxation run OK, best objective is: {:g}".format(obj))
mdl.end()
return obj
def run_GAP_model_with_Lagrangian_relaxation(As, Bs, Cs, max_iters=101, docloud_context=None):
mdl = Model('GAP per Wolsey -with- Lagrangian Relaxation', 0, docloud_context=docloud_context)
print("#As={}, #Bs={}, #Cs={}".format(len(As), len(Bs), len(Cs)))
c_range = range(len(Cs))
# variables
x_vars = [mdl.binary_var_list(Cs[i], name=None) for i in c_range]
p_vars = [mdl.continuous_var(lb=0) for _ in c_range] # new for relaxation
# constraints
for i in c_range:
# was mdl.add_constraint(mdl.sum(xVars[i]) <= 1)
mdl.add_constraint(mdl.sum(x_vars[i]) == 1 - p_vars[i])
# sum i: a_ij * x_ij <= b[j] for all j
for j in range(len(Bs)):
mdl.add_constraint(mdl.sum(x_vars[i][j] * As[i][j] for i in c_range) <= Bs[j])
# lagrangian relaxation loop
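# The assignment equalities above carry slack variables p_i; the loop below prices
# them into the objective with multipliers, then decreases each multiplier by
# scale_factor * penalty (a subgradient-style update) until every penalized
# violation falls below eps.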
eps = 1e-6
loop_count = 0
best = 0
initial_multiplier = 1
multipliers = [initial_multiplier] * len(Cs)
total_profit = mdl.sum(mdl.sum(c_ij * x_ij for c_ij, x_ij in zip(c_i, x_i)) for c_i, x_i in zip(Cs, x_vars))
mdl.add_kpi(total_profit, "Total profit")
while loop_count <= max_iters:
loop_count += 1
# rebuilt at each loop iteration
total_penalty = mdl.sum(p_vars[i] * multipliers[i] for i in c_range)
mdl.maximize(total_profit + total_penalty)
ok = mdl.solve()
if not ok:
print("*** solve fails, stopping at iteration: %d" % loop_count)
break
best = mdl.objective_value
penalties = [pv.get_value() for pv in p_vars]
print('%d> new lagrangian iteration, obj=%g, m=%s, p=%s' % (loop_count, best, str(multipliers), str(penalties)))
do_stop = True
justifier = 0
for k in c_range:
penalized_violation = penalties[k] * multipliers[k]
if penalized_violation >= eps:
do_stop = False
justifier = penalized_violation
break
if do_stop:
print("* Lagrangian relaxation succeeds, best={:g}, penalty={:g}, #iterations={}"
.format(best, total_penalty.get_value(), loop_count))
break
else:
# update multipliers and start loop again.
scale_factor = 1.0 / float(loop_count)
multipliers = [max(multipliers[i] - scale_factor * penalties[i], 0.) for i in c_range]
print('{}> -- loop continues, m={}, justifier={:g}'.format(loop_count, str(multipliers), justifier))
return best
def run_default_GAP_model_with_lagrangian_relaxation(docloud_context):
return run_GAP_model_with_Lagrangian_relaxation(As=A, Bs=B, Cs=C, docloud_context=docloud_context)
if __name__ == '__main__':
"""DOcloud credentials can be specified here with url and api_key in the code block below.
Alternatively, if api_key is None, DOcloudContext.make_default_context()
looks for a .docplexrc file in your home directory on unix ($HOME)
or user profile directory on windows (%UserProfile%). That file contains the
credential and other properties. For example, something similar to::
url = "https://docloud.service.com/job_manager/rest/v1"
api_key = "example api_key"
"""
url = "YOUR_URL_HERE"
api_key = None
ctx = DOcloudContext.make_default_context(url, api_key)
from docplex.mp.environment import Environment
env = Environment()
env.print_information()
gap_best_obj = run_GAP_model(A, B, C, docloud_context=ctx)
assert (46 == gap_best_obj)
relaxed_best = run_GAP_model_with_Lagrangian_relaxation(A, B, C, docloud_context=ctx)
assert (46 == relaxed_best)

View File

@@ -0,0 +1,270 @@
# Source: http://blog.yhathq.com/posts/how-yhat-does-cloud-balancing.html
from collections import namedtuple
from docplex.mp.model import AbstractModel
from docplex.mp.utils import is_int
from docplex.mp.context import DOcloudContext
class TUser(namedtuple("TUser1", ["id", "running", "sleeping", "current_server"])):
def __str__(self):
return self.id
DEFAULT_MAX_PROCESSES_PER_SERVER = 50
class LoadBalancingModel(AbstractModel):
def __init__(self, output_level=0, docloud_context=None):
AbstractModel.__init__(self, 'load_balancing', output_level=output_level, docloud_context=docloud_context)
# raw data
self.max_processes_per_server = DEFAULT_MAX_PROCESSES_PER_SERVER
self.servers = []
self.users = []
# decision objects
self.active_var_by_server = {}
self.assign_user_to_server_vars = {}
self.number_of_active_servers = None
self.number_of_migrations = None
self.max_sleeping_workload = None
def load_data(self, *args):
self._check_data_args(args, 2)
self.servers = args[0]
self.users = [TUser(*user_row) for user_row in args[1]]
self.max_processes_per_server = DEFAULT_MAX_PROCESSES_PER_SERVER
if len(args) >= 3:
arg2 = args[2]
if is_int(arg2):
self.max_processes_per_server = arg2
else:
print('* unexpected max process/server arg, not an int: {}'.format(arg2))
return self.is_valid()
def is_valid(self):
if len(self.servers) < 2:
print("At least two servers are required")
return False
if len(self.users) < 2:
print("At least two users are required")
return False
if self.max_processes_per_server <= 1:
print("incorrect max #process/server, got: {}".format(self.max_processes_per_server))
return False
return True
def clear(self):
AbstractModel.clear(self)
self.active_var_by_server = {}
self.assign_user_to_server_vars = {}
self.number_of_active_servers = None
def setup_variables(self):
all_servers = self.servers
all_users = self.users
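# Two families of binaries: one per server (1 if the server stays active) and
# one per (user, server) pair (1 if the user is assigned to that server).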
self.active_var_by_server = self.binary_var_dict(all_servers, 'isActive')
def user_server_pair_namer(u_s):
u, s = u_s
return '%s_to_%s' % (u.id, s)
self.assign_user_to_server_vars = self.binary_var_matrix(all_users, all_servers, user_server_pair_namer)
@staticmethod
def _is_migration(user, server):
""" Returns True if server is not the user's current
Used in setup of constraints.
"""
return server != user.current_server
def setup_constraints(self):
m = self
all_servers = self.servers
all_users = self.users
max_proc_per_server = self.max_processes_per_server
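# Capacity: the running processes of all users assigned to a server must fit
# within max_proc_per_server.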
for s in all_servers:
m.add_constraint(
m.sum(self.assign_user_to_server_vars[u, s] * u.running for u in all_users) <= max_proc_per_server)
# each assignment var <u, s> is <= active_server(s)
for s in all_servers:
for u in all_users:
ct_name = 'ct_assign_to_active_{0!s}_{1!s}'.format(u, s)
m.add_constraint(self.assign_user_to_server_vars[u, s] <= self.active_var_by_server[s], ct_name)
# sum of assignment vars for (u, all s in servers) == 1
for u in all_users:
ct_name = 'ct_unique_server_%s' % u.id
m.add_constraint(m.sum((self.assign_user_to_server_vars[u, s] for s in all_servers)) == 1.0, ct_name)
def setup_objective(self):
m = self
self.number_of_active_servers = m.sum((self.active_var_by_server[svr] for svr in self.servers))
self.add_kpi(self.number_of_active_servers, "Number of active servers")
self.number_of_migrations = m.sum(
self.assign_user_to_server_vars[u, s] for u in self.users for s in self.servers if
self._is_migration(u, s))
m.add_kpi(self.number_of_migrations, "Total number of migrations")
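# Min-max linearization: max_sleeping_workload is an auxiliary variable bounded
# below by every server's total sleeping load, so minimizing it (as the third
# lexicographic goal in run()) minimizes the largest such load.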
max_sleeping_workload = m.integer_var(name="max_sleeping_processes")
for s in self.servers:
ct_name = 'ct_define_max_sleeping_%s' % s
m.add_constraint(
m.sum(self.assign_user_to_server_vars[u, s] * u.sleeping for u in self.users) <= max_sleeping_workload,
ct_name)
m.add_kpi(max_sleeping_workload, "Max sleeping workload")
self.max_sleeping_workload = max_sleeping_workload
# Set objective function
m.minimize(self.number_of_active_servers)
def run(self, docloud_context=None):
m = self
m.ensure_setup()
m.print_information()
# build an ordered sequence of goals
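# The three KPIs registered in setup_objective are retrieved by keyword and
# optimized lexicographically: fewest active servers first, then fewest
# migrations, then the smallest maximum sleeping workload, each earlier goal
# being held at its optimal value while the next one is solved.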
ordered_kpi_keywords = ["servers", "migrations", "sleeping"]
ordered_goals = [m.kpi_by_name(k) for k in ordered_kpi_keywords]
return m.solve_lexicographic(ordered_goals)
def print_solution(self, do_filter_zeros=True):
m = self
active_servers = sorted([s for s in m.servers if m.active_var_by_server[s].solution_value == 1])
print ("Active Servers: {}".format(active_servers))
print ("*** User assignment ***")
for (u, s) in sorted(m.assign_user_to_server_vars):
if m.assign_user_to_server_vars[(u, s)].solution_value == 1:
print ("{} uses {}, migration: {}".format(u, s, "yes" if m._is_migration(u, s) else "no"))
print ("*** Servers sleeping processes ***")
for s in active_servers:
sleeping = sum(self.assign_user_to_server_vars[u, s].solution_value * u.sleeping for u in self.users)
print ("Server: {} #sleeping={}".format(s, sleeping))
SERVERS = ["server002", "server003", "server001", "server006", "server007", "server004", "server005"]
USERS = [("user013", 2, 1, "server002"),
("user014)", 0, 2, "server002"),
("user015", 0, 4, "server002"),
("user016", 1, 4, "server002"),
("user017", 0, 3, "server002"),
("user018", 0, 2, "server002"),
("user019", 0, 2, "server002"),
("user020", 0, 1, "server002"),
("user021", 4, 4, "server002"),
("user022", 0, 1, "server002"),
("user023", 0, 3, "server002"),
("user024", 1, 2, "server002"),
("user025", 0, 1, "server003"),
("user026", 0, 1, "server003"),
("user027", 1, 1, "server003"),
("user028", 0, 1, "server003"),
("user029", 2, 1, "server003"),
("user030", 0, 5, "server003"),
("user031", 0, 2, "server003"),
("user032", 0, 3, "server003"),
("user033", 1, 1, "server003"),
("user034", 0, 1, "server003"),
("user035", 0, 1, "server003"),
("user036", 4, 1, "server003"),
("user037", 7, 1, "server003"),
("user038", 2, 1, "server003"),
("user039", 0, 3, "server003"),
("user040", 1, 2, "server003"),
("user001", 0, 2, "server001"),
("user002", 0, 3, "server001"),
("user003", 5, 4, "server001"),
("user004", 0, 1, "server001"),
("user005", 0, 1, "server001"),
("user006", 0, 2, "server001"),
("user007", 0, 4, "server001"),
("user008", 0, 1, "server001"),
("user009", 5, 1, "server001"),
("user010", 7, 1, "server001"),
("user011", 4, 5, "server001"),
("user012", 0, 4, "server001"),
("user062", 0, 1, "server006"),
("user063", 3, 5, "server006"),
("user064", 0, 1, "server006"),
("user065", 0, 3, "server006"),
("user066", 3, 1, "server006"),
("user067", 0, 1, "server006"),
("user068", 0, 1, "server006"),
("user069", 0, 2, "server006"),
("user070", 3, 2, "server006"),
("user071", 0, 1, "server006"),
("user072", 5, 3, "server006"),
("user073", 0, 1, "server006"),
("user074", 0, 1, "server006"),
("user075", 0, 2, "server007"),
("user076", 1, 1, "server007"),
("user077", 1, 1, "server007"),
("user078", 0, 1, "server007"),
("user079", 0, 3, "server007"),
("user080", 0, 1, "server007"),
("user081", 4, 1, "server007"),
("user082", 1, 1, "server007"),
("user041", 0, 1, "server004"),
("user042", 2, 1, "server004"),
("user043", 5, 2, "server004"),
("user044", 5, 2, "server004"),
("user045", 0, 2, "server004"),
("user046", 1, 5, "server004"),
("user047", 0, 1, "server004"),
("user048", 0, 3, "server004"),
("user049", 5, 1, "server004"),
("user050", 0, 2, "server004"),
("user051", 0, 3, "server004"),
("user052", 0, 3, "server004"),
("user053", 0, 1, "server004"),
("user054", 0, 2, "server004"),
("user055", 0, 3, "server005"),
("user056", 3, 1, "server005"),
("user057", 0, 3, "server005"),
("user058", 0, 2, "server005"),
("user059", 0, 1, "server005"),
("user060", 0, 5, "server005"),
("user061", 0, 2, "server005")
]
class DefaultLoadBalancingModel(LoadBalancingModel):
def __init__(self, output_level=0, docloud_context=None):
LoadBalancingModel.__init__(self, output_level=output_level, docloud_context=docloud_context)
self.load_data(SERVERS, USERS)
if __name__ == '__main__':
"""DOcloud credentials can be specified here with url and api_key in the code block below.
Alternatively, if api_key is None, DOcloudContext.make_default_context()
looks for a .docplexrc file in your home directory on unix ($HOME)
or user profile directory on windows (%UserProfile%). That file contains the
credential and other properties. For example, something similar to::
url = "https://docloud.service.com/job_manager/rest/v1"
api_key = "example api_key"
"""
url = "YOUR_URL_HERE"
api_key = None
ctx = DOcloudContext.make_default_context(url, api_key)
from docplex.mp.environment import Environment
env = Environment()
env.print_information()
lbm = DefaultLoadBalancingModel(docloud_context=ctx)
ok = lbm.run()
assert ok
lbm.print_solution()
import math
assert math.fabs(82.0 - lbm.objective_value) <= 8e-3