Final exam: Thursday, December 19, at 7:30pm, DC 1351.
Practice final. For some questions, several variants are given for additional practice (for example, Question 3 has variants 3.1, 3.2, 3.3, and 3.4). Aside from that, the final exam will have the same length and structure as the practice final.
Midterm: Tuesday, October 29, at 10am, AL 113: solutions.
# | Date | Slides | Notes | Topics and models
---|---|---|---|---
1 | 09/05 | lect-01 | notes-01 | I. Linear programming. Example 1.
2 | 09/10 | lect-02 | notes-02 | Examples: blending, multiperiod investment.
3 | 09/12 | | notes-03 | Examples: max s-t flow problem.
4 | 09/17 | | notes-04 | Examples: matching in a bipartite graph, min-cost flow problem, bus scheduling problem.
5 | 09/19 | | notes-05 | II. Integer programming. Examples: knapsack, bin packing.
6 | 09/24 | | notes-06 | IP modeling tricks (1).
7 | 09/26 | lect-07 | notes-07 | Examples: sports scheduling 1, sports scheduling 2.
8 | 10/01 | | notes-08 | IP modeling tricks (2): piecewise-linear functions, union of polyhedra.
9 | 10/03 | | notes-09 | III. Solving LPs.
10 | 10/08 | | notes-10 | Simplex method, duality.
11 | 10/10 | | notes-11 | Weak duality, strong duality, constructing duals, complementary slackness, economic interpretation.
| 10/15 | | | Reading week (no lecture).
| 10/17 | | | Reading week (no lecture).
| 10/22 | | | No lecture.
12 | 10/24 | | notes-12 | Bases and duality, sensitivity analysis (changes to rhs).
| 10/29 | | | Midterm (10am, AL 113).
13 | 10/31 | | notes-13 | Sensitivity analysis (changes to objective).
14 | 11/05 | | notes-14, errata | Sensitivity analysis (adding a variable), dual simplex method.
15 | 11/07 | | notes-15 | Dual simplex method. IV. Solving IPs.
16 | 11/12 | | notes-16 | Integer and continuous knapsack problems.
17 | 11/14 | | notes-17 | Totally unimodular matrices.
18 | 11/19 | | notes-18, addendum | Constructing TU matrices, application to flows.
19 | 11/21 | | notes-19 | V. Other problems. Satisfiability.
20 | 11/26 | | notes-20 | Nonlinear optimization, gradient descent. Classification problem, neural networks.
21 | 11/28 | | notes-21 | Training neural networks.
22 | 12/03 | | | Review.
Here are a few pointers to help you get started with Julia.
First, you need to download and install Julia. I use the current stable release (v1.2.0). Preferably get the 64-bit version for your operating system (Windows, macOS, or Linux).
Once Julia is installed, you need the JuMP package, which provides the mathematical programming facilities in the Julia language, together with the Cbc package, which supplies an open-source solver. Start Julia and, from the Julia command line, run the following commands:
import Pkg
Pkg.add("JuMP")
Pkg.add("Cbc")
Note that these commands download packages from the internet and compile them, which can take several minutes. Julia will also be slow the first time you use the JuMP package, but it should get faster after that.
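For example, you can check that everything is installed (and trigger the one-time compilation up front) by loading the packages once:

using JuMP
using Cbc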
See the JuMP documentation for more details on the installation process.
In most cases, it is easier to write models in a file rather than typing them line by line at the Julia command line. In class, I am using the Atom editor with the Juno development environment. You can find directions on how to install them here. However, any text editor will do just fine.
In class, we start most implementations from an empty model, which you can download here, and then fill in the gaps. When it is ready, we try the model by running:
include("model-empty.jl")
where “model-empty.jl” is replaced by the name of our current model file. For most problems, the examples covered in class should be enough to show you how to write your own models.
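For reference, here is a minimal sketch of what a complete model file might contain. The problem data and variable names are made up for illustration, and the way the solver is attached depends on your JuMP version (recent releases use Model(Cbc.Optimizer); older ones use with_optimizer(Cbc.Optimizer)):

using JuMP, Cbc

# Build a model attached to the Cbc solver (recent JuMP syntax).
model = Model(Cbc.Optimizer)

# Decision variables (hypothetical data, for illustration only).
@variable(model, x >= 0)
@variable(model, y >= 0)

# Constraints.
@constraint(model, x + 2y <= 14)
@constraint(model, 3x - y >= 0)

# Objective function.
@objective(model, Max, 3x + 4y)

# Solve and report the results.
optimize!(model)
println("objective value = ", objective_value(model))
println("x = ", value(x), ", y = ", value(y))

Saving this as, say, model.jl and running include("model.jl") from the Julia command line solves the problem and prints an optimal solution.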
However, if needed, you are encouraged to consult the JuMP documentation. If you are using more advanced features of the Julia language itself, its documentation can be found here.