Selected Reviews of Excellent Foreign Mathematics Textbooks (Chapter 12: Computational Mathematics)

Published: 2012-03-19

 

The contents of this book are copyrighted by the original authors. Downloading and use without authorization by agreement are prohibited.

Editor-in-Chief: 杨劲根
Associate Editors: 楼红卫 李振钱 郝群

Contributors (ordered by Chinese pinyin):
陈超群 陈猛 东瑜昕 高威 郝群 刘东弟 吕志 童裕孙 王巨平 王泽军 徐晓津 杨劲根 应坚刚 张锦豪 张永前 周子翔 朱胜林


 

1. Preface
2. Mathematics Textbooks for Non-Mathematics Majors
3. Mathematical Analysis and Functional Analysis
4. Functions of One Complex Variable
5. Functions of Several Complex Variables
6. Algebra
7. Number Theory
8. Algebraic Geometry
9. Topology and Differential Geometry
10. Partial Differential Equations
11. Probability Theory
12. Computational Mathematics
13. Others
14. Appendix


12. Computational Mathematics

Title: Numerical Optimization
Authors: J. Nocedal & S. Wright
Publisher: Science Press (科学出版社), 2006
Pages: 636
Intended audience: textbook for upper-level undergraduate or graduate courses in computational mathematics, operations research, industrial engineering, and computer science
Prerequisites: mathematical analysis, advanced algebra, numerical linear algebra
Number of exercises: moderate
Difficulty of exercises: moderate
Recommendation strength: 10
Review: This textbook on numerical optimization was co-authored by Nocedal, professor in the Department of Electrical Engineering and Computer Science at Northwestern University and one of the most prominent experts in numerical optimization, and Wright, professor in the Computer Sciences Department at the University of Wisconsin-Madison (UW-Madison) and a well-known expert in optimization. It has been very well received, and many departments abroad use it as the textbook for courses on optimization or nonlinear programming. Its coverage is extensive: the steepest descent, Newton, quasi-Newton, conjugate gradient, and trust-region methods for unconstrained problems; nonlinear least-squares problems and numerical methods for systems of nonlinear equations; the simplex and interior-point methods for linear programming, and quadratic programming; and, for constrained problems, penalty methods, multiplier methods, augmented Lagrangian methods, and sequential quadratic programming. For a comparison of this book with other nonlinear programming monographs, see the excellent commentary by Michael C. Ferris in the selected foreign reviews below. (杨卫红)
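To give a concrete flavor of the simplest class of algorithms listed above, the following is a minimal Python sketch of steepest descent with a backtracking (Armijo) line search, applied to the Rosenbrock test function. The function, parameter values, and names here are illustrative choices made for this review, not code or notation taken from the book.

    import numpy as np

    def steepest_descent(f, grad, x0, tol=1e-8, max_iter=1000,
                         alpha0=1.0, rho=0.5, c=1e-4):
        """Steepest descent with a backtracking (Armijo) line search."""
        x = np.asarray(x0, dtype=float)
        for _ in range(max_iter):
            g = grad(x)
            if np.linalg.norm(g) < tol:   # stop when the gradient is small
                break
            p = -g                        # steepest-descent direction
            alpha = alpha0
            # Shrink the step until the Armijo sufficient-decrease condition holds
            while f(x + alpha * p) > f(x) + c * alpha * g.dot(p):
                alpha *= rho
            x = x + alpha * p
        return x

    # Example: minimize the Rosenbrock function, a standard test problem
    f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
    grad = lambda x: np.array([
        -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
        200 * (x[1] - x[0]**2),
    ])
    print(steepest_descent(f, grad, [-1.2, 1.0], max_iter=50000))

On this ill-conditioned test problem steepest descent converges very slowly, which is exactly the kind of behavior that motivates the quasi-Newton, conjugate gradient, and trust-region methods treated later in the book.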

Selected Foreign Reviews
I find the book under review (hereafter N-W) to be a well-written treatment of continuous nonlinear optimization and I would recommend its use for upper level undergraduate or graduate level courses in nonlinear optimization. The book is sufficiently detailed to be useful to researchers, but its real merit is as an educational resource.
In reaching this conclusion, I asked the following questions of the text: What is the competition for this text? Will this text replace the one I use? Does it cover all the topics I cover, or enough of them? Is it readable?
I believe there are five books that are possible competitors to (N-W), namely (1) D. Bertsekas, Nonlinear programming, Athena Scientific, Belmont, MA, 1995; (2) J. E. Dennis, Jr. and R. B. Schnabel, Numerical methods for unconstrained optimization and nonlinear equations, Corrected reprint of the 1983 original, SIAM, Philadelphia, PA, 1996; (3) R. Fletcher, Practical methods of optimization, Second edition, Wiley, Chichester, 1987; (4) P. E. Gill, W. Murray and M. H. Wright, Practical optimization, Academic Press, London, 1981; (5) S. Nash and A. Sofer, Linear and nonlinear programming, McGraw-Hill, New York, 1996.
The book by Fletcher is the classical reference; I believe (N-W) will supplant Fletcher's for two reasons. Firstly, the new book uses more modern typesetting and is much easier to read and understand. Such clarity enables students to learn the material more quickly, and, furthermore, to read the relevant literature more easily. Secondly, the topics covered in the new book are more current and balanced (the one exception being nonsmooth optimization). In particular, the section on automatic derivatives is an important and welcome addition. Clearly, the (N-W) text benefits from being written a good decade later.
Similar comments can be made comparing (N-W) to the Dennis and Schnabel book. While the latter is much more specialised than Fletcher's text, it is still the book of choice for many who are interested in unconstrained and nonlinear equation software.
I believe (N-W) is much more up-to-date and will be a much better choice for engineers and practitioners interested in knowing some mathematical background for the algorithms they need to use. The treatment of the same topics is more thorough but eminently readable in (N-W); furthermore, (N-W) covers linear and constrained optimization as well, without becoming too long.
While the treatment of line search methods in the book by Gill, Murray and Wright is very good, (N-W) covers these in adequate detail, and furthermore gives an up-to-date view of trust region methods as well. The treatment of interior point methods in the description given in (N-W) also benefits from the research carried out over the past decade, and is preferable to that of Gill et al.
The Nash and Sofer book is somewhat similar to (N-W) and may be preferable for lecturers who wish to teach nonlinear optimization appended to a course on linear programming. I believe, however, that the approach given in (N-W) is better, and enables a clearer understanding of how linear problems fit within the big picture. Again, the treatment of trust region methods is much more complete in (N-W).
The Bertsekas book is very different from (N-W). It is a much more theoretical text, and covers aspects of convex analysis and nonsmooth optimization in much greater detail than (N-W). For more theoretical courses, I would prefer to use the Bertsekas text. However, for courses that stress algorithms for smooth problems, the (N-W) text covers all the necessary ground concisely, clearly and in an eminently readable fashion. These two books complement each other nicely.
Is the book well organised? For the most part, I believe it is. The only caveat to this is that the trust region and conjugate gradient chapters seem to be the wrong way around, with many forward references in Chapter 4. What's missing? Several things are missing from the book, in my opinion. Some treatment of nonsmooth optimization would have been useful in this text, as many of the ideas from that literature can be used to supplement our understanding of algorithms and enhance the problem classes that can be treated. (N-W) also does not cover derivative free methods at all, unless one counts finite differences. Given the huge variety of applications that use such methods in practice, I believe some comments on their applicability, strengths and weaknesses would strengthen the book. Finally, I would be remiss if I did not complain about the omission of complementarity. The unifying viewpoint and newly emerging problem classes that could benefit from overlap would draw these two fields more closely together again. (by Michael C. Ferris)