kangwonlee / nmisp

Numerical Methods in SciPy

More about optimization #168

Open kangwonlee opened 2 years ago

kangwonlee commented 2 years ago

https://darkpgmr.tistory.com/149

kangwonlee commented 1 year ago

AskUp

  1. Concepts and Categories
  2. Algorithms: genetic, search, gradient-based
  3. Math modeling of the problems: linear, nonlinear, equality & inequality constraints, ...
  4. Tools
  5. Applications

As of 258d40e, I guess we are covering only parts of 4, 3, and 2?

kangwonlee commented 1 year ago

chat.openai.com

  1. Introduction to optimization
    • Motivation and applications of optimization
    • Basic terminology and notation (e.g., objective function, decision variables, constraints)
    • Formulation of optimization problems (e.g., linear programming, quadratic programming, nonlinear programming)
  2. Unconstrained optimization
    • Definition of local and global optima
    • Newton's method and gradient descent
    • Convexity and convex optimization
    • Convex optimization algorithms (e.g., gradient descent, subgradient methods, projected gradient descent)
  3. Constrained optimization
    • Linear programming and simplex method
    • Nonlinear programming and Karush-Kuhn-Tucker conditions
    • Lagrange multipliers and duality
    • Constrained optimization algorithms (e.g., interior-point methods, augmented Lagrangian methods)
  4. Applications of optimization
    • Applications of optimization in various fields (e.g., engineering, economics, finance, machine learning)
    • Examples of real-world optimization problems (e.g., portfolio optimization, transportation planning, supply chain management)
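The constrained-optimization topics in item 3 above (Lagrange multipliers, KKT conditions) can be sketched with `scipy.optimize.minimize` and the SLSQP method; the objective and constraint below are invented for illustration, not taken from the course material:

```python
import numpy as np
from scipy.optimize import minimize

# Minimize f(x, y) = (x - 1)^2 + (y - 2)^2
# subject to the inequality constraint x + y <= 2.
def objective(v):
    x, y = v
    return (x - 1.0) ** 2 + (y - 2.0) ** 2

# SLSQP expects inequality constraints in the form g(v) >= 0.
constraints = ({'type': 'ineq', 'fun': lambda v: 2.0 - v[0] - v[1]},)

res = minimize(objective, x0=[0.0, 0.0], method='SLSQP',
               constraints=constraints)
print(res.x)  # optimum on the constraint boundary, near (0.5, 1.5)
```

The unconstrained minimum (1, 2) violates x + y <= 2, so the constraint is active at the solution; solving the KKT conditions by hand gives (0.5, 1.5), which matches the solver's answer.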

kangwonlee commented 1 year ago

ChatGPT

The learning objectives for optimization lectures spanning two to three weeks of a numerical analysis course for undergraduates may include the following:

  1. Understanding the fundamental concepts of optimization: The students should be able to understand the fundamental concepts of optimization, including optimization problems, objective functions, constraints, and optimization algorithms.

  2. Knowledge of optimization algorithms: The students should be able to understand the basic properties of optimization algorithms such as gradient descent, Newton's method, and Quasi-Newton methods. They should be able to implement these algorithms and analyze their convergence properties.

  3. Convex optimization: The students should be able to understand the concept of convex optimization and be able to identify whether an optimization problem is convex or not. They should be able to understand the properties of convex functions, convex sets, and convex optimization problems.

  4. Linear programming: The students should be able to understand linear programming and its applications. They should be able to formulate linear programming problems, identify feasible solutions, and solve these problems using the simplex method.

  5. Nonlinear programming: The students should be able to understand nonlinear programming and its applications. They should be able to formulate nonlinear programming problems, identify feasible solutions, and solve these problems using optimization algorithms.

  6. Applications: The students should be able to understand the application of optimization techniques to real-world problems in fields such as engineering, finance, and computer science. They should be able to identify optimization problems in these fields, formulate them mathematically, and solve them using appropriate optimization techniques.

  7. Numerical implementation: The students should be able to implement optimization algorithms using a high-level programming language such as Python. They should be able to apply these algorithms to solve optimization problems and analyze the results.

Overall, the goal of the optimization lectures should be to provide the students with a solid understanding of the fundamental concepts of optimization and their applications. The students should be able to identify optimization problems in real-world applications, formulate them mathematically, and solve them using appropriate optimization techniques. They should also be able to analyze the results and communicate their findings effectively.
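The gradient descent mentioned in objective 2 can be illustrated with a minimal sketch (the quadratic objective, step size, and iteration count here are arbitrary choices for demonstration):

```python
# Gradient descent on f(x) = (x - 3)^2 + 1, whose gradient is 2 * (x - 3).
def grad(x):
    return 2.0 * (x - 3.0)

x = 0.0        # initial guess
alpha = 0.1    # fixed step size
for _ in range(100):
    x -= alpha * grad(x)

print(x)  # converges toward the minimizer x = 3
```

Each update is x_new = x - alpha * 2 * (x - 3) = 0.8 * x + 0.6, a contraction toward the fixed point 3, so convergence here is geometric; students could vary alpha to see divergence when the step size is too large (e.g. alpha >= 1).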


The order of topics depends on the specific learning objectives of the course and the background knowledge of the students. However, a general suggestion could be:

  1. Pandas: Introduce pandas as a data manipulation library and cover basic concepts such as data frames, series, indexing, and filtering. This will serve as a foundation for the data analysis parts of scikit-learn and tensorflow.

  2. scipy.optimize: Cover optimization basics such as gradient descent, convexity, and unconstrained optimization. Also cover constrained optimization using linear programming, quadratic programming, or similar methods. This will give the students the necessary tools to understand the optimization-based algorithms used in machine learning.

  3. scikit-learn: Cover basic machine learning concepts such as supervised and unsupervised learning, cross-validation, and model selection. Then cover some specific algorithms such as linear regression, logistic regression, decision trees, k-nearest neighbors, and support vector machines.

  4. TensorFlow: Cover the basics of building and training neural networks. Cover the different types of layers and activation functions, and introduce common neural network architectures such as feedforward neural networks, convolutional neural networks, and recurrent neural networks.

Overall, it is important to balance theory and practice throughout the course. Also, consider assigning assignments and projects that require the students to apply the concepts they have learned to real-world problems.
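The linear-programming part of the `scipy.optimize` item above could be demonstrated with `scipy.optimize.linprog`; the toy problem below is made up for illustration:

```python
from scipy.optimize import linprog

# Maximize 3x + 2y subject to x + y <= 4, x <= 3, x >= 0, y >= 0.
# linprog minimizes, so the objective coefficients are negated.
res = linprog(c=[-3.0, -2.0],
              A_ub=[[1.0, 1.0],
                    [1.0, 0.0]],
              b_ub=[4.0, 3.0],
              bounds=[(0, None), (0, None)])
print(res.x)  # optimum at the vertex x = 3, y = 1
```

Checking the feasible vertices by hand, (3, 1) gives the largest objective value 3*3 + 2*1 = 11, which agrees with the solver.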

kangwonlee commented 1 year ago

ChatGPT

1-D optimization visualization example

import numpy as np
import matplotlib.pyplot as plt
from scipy.optimize import minimize_scalar

# Define the objective function
def objective(x):
    return (x - 3) ** 2 + 1

# Create a range of x values
x = np.linspace(-5, 10, 100)

# Plot the objective function
plt.plot(x, objective(x))
plt.xlabel('x')
plt.ylabel('f(x)')
plt.title('Objective Function')

# Find the minimum using minimize_scalar (Brent's method by default)
res = minimize_scalar(objective)

# Plot the minimum
plt.scatter(res.x, objective(res.x), marker='x', color='red')
plt.annotate('Minimum', xy=(res.x, objective(res.x)), xytext=(-3, 50),
             textcoords='offset points', arrowprops=dict(arrowstyle='->'))
plt.show()
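For comparison, the same minimum could be bracketed explicitly with the bounded variant of `minimize_scalar`; the interval below is chosen to match the plotted range, an arbitrary choice for this sketch:

```python
from scipy.optimize import minimize_scalar

# Same objective as the plotting example above
def objective(x):
    return (x - 3) ** 2 + 1

# Restrict the search to the plotted interval [-5, 10]
res = minimize_scalar(objective, bounds=(-5, 10), method='bounded')
print(res.x, res.fun)  # minimum near x = 3, f(x) = 1
```

The bounded method is useful when the objective is only defined (or only of interest) on a known interval; without bounds, Brent's method may wander outside it while bracketing.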