Browse Results

Showing 23,476 through 23,500 of 23,789 results

A Primer on PDEs: Models, Methods, Simulations (UNITEXT)

by Anna Zaretti, Federico Vegni, Paolo Zunino, and Sandro Salsa

This book is designed for an advanced undergraduate or first-year graduate course for students from various disciplines such as applied mathematics, physics, and engineering. It has evolved from the teaching of courses on partial differential equations during the last decade at the Politecnico di Milano. The main purpose of these courses was twofold: on the one hand, to train students to appreciate the interplay between theory and modelling in problems arising in the applied sciences, and on the other hand, to give them a solid background for numerical methods such as finite differences and finite elements.

Analytics for Managers: With Excel

by Gregory S. Zaric and Peter C. Bell

Analytics is one of a number of terms used to describe a data-driven, more scientific approach to management. Ability in analytics is an essential management skill: knowledge of data and analytics helps the manager analyze decision situations, prevent problem situations from arising, and identify new opportunities, and it often enables many millions of dollars to be added to the organization's bottom line. The objective of this book is to introduce analytics from the perspective of the general manager of a corporation. Rather than examine the details or attempt an encyclopaedic review of the field, this text emphasizes the strategic role that analytics is playing in globally competitive corporations today. The chapters of this book are organized in two main parts. The first part introduces a problem area and presents some basic analytical concepts that have been successfully used to address it. The objective of this material is to provide the student, the manager of the future, with a general understanding of the tools and techniques used by the analyst.

Nonlinear Dynamics: Exploration Through Normal Forms (Dover Books on Physics #5)

by Yair Zarmi and Peter B. Kahn

Geared toward advanced undergraduates and graduate students, this exposition covers the method of normal forms and its application to ordinary differential equations through perturbation analysis. In addition to its emphasis on the freedom inherent in the normal form expansion, the text features numerous examples of equations of the kind encountered in many areas of science and engineering. The treatment begins with an introduction to the basic concepts underlying normal forms. Coverage then shifts to an investigation of systems with one degree of freedom that model oscillations, in which the force has a dominant linear term and a small nonlinear one. The text considers a variety of nonautonomous systems that arise during the study of forced oscillatory motion. Topics include boundary value problems, connections to the method of the center manifold, linear and nonlinear Mathieu equations, pendula, nuclear magnetic resonance, coupled oscillator systems, and other subjects. 1998 edition.

Algorithms, Probability, Networks, and Games: Scientific Papers and Essays Dedicated to Paul G. Spirakis on the Occasion of His 60th Birthday (Lecture Notes in Computer Science #9295)

by Christos Zaroliagis, Grammati Pantziou, and Spyros Kontogiannis

This Festschrift volume is published in honor of Professor Paul G. Spirakis on the occasion of his 60th birthday. It celebrates his significant contributions to computer science as an eminent, talented, and influential researcher and a most visionary thought leader, with a great talent for inspiring and guiding young researchers. The book reflects his main research activities in the fields of algorithms, probability, networks, and games, and contains a biographical sketch as well as essays and research contributions from close collaborators and former PhD students.

Algorithms for Solving Common Fixed Point Problems (Springer Optimization And Its Applications #132)

by Alexander J. Zaslavski

This book details approximate solutions to common fixed point problems and convex feasibility problems in the presence of perturbations. Convex feasibility problems search for a common point of a finite collection of subsets in a Hilbert space; common fixed point problems pursue a common fixed point of a finite collection of self-mappings in a Hilbert space. A variety of algorithms are considered in this book for solving both types of problems, the study of which has fueled a rapidly growing area of research. This monograph is timely and highlights the numerous applications to engineering, computed tomography, and radiation therapy planning. Totaling eight chapters, this book begins with an introduction to foundational material and moves on to examine iterative methods in metric spaces. The dynamic string-averaging methods for common fixed point problems in normed spaces are analyzed in Chapter 3. Dynamic string methods for common fixed point problems in a metric space are introduced and discussed in Chapter 4. Chapter 5 is devoted to the convergence of an abstract version of the algorithm known as component-averaged row projections (CARP). Chapter 6 studies a proximal algorithm for finding a common zero of a family of maximal monotone operators. Chapter 7 extends the results of Chapter 6 to a dynamic string-averaging version of the proximal algorithm. In Chapter 8, subgradient projection algorithms for convex feasibility problems are examined in infinite-dimensional Hilbert spaces.

Approximate Solutions of Common Fixed-Point Problems (Springer Optimization and Its Applications #112)

by Alexander J. Zaslavski

This book presents results on the convergence behavior of algorithms which are known as vital tools for solving convex feasibility problems and common fixed point problems. The main goal for us in dealing with a known computational error is to find what approximate solution can be obtained and how many iterates one needs to find it. According to known results, these algorithms should converge to a solution. In this exposition, these algorithms are studied taking into account computational errors, which are always present in practice. In this case the convergence to a solution does not take place. We show that our algorithms generate a good approximate solution if computational errors are bounded from above by a small positive constant. Beginning with an introduction, this monograph moves on to study: dynamic string-averaging methods for common fixed point problems in a Hilbert space; dynamic string methods for common fixed point problems in a metric space; a dynamic string-averaging version of the proximal algorithm; common fixed point problems in metric spaces; common fixed point problems in spaces with distances of the Bregman type; a proximal algorithm for finding a common zero of a family of maximal monotone operators; and subgradient projection algorithms for convex feasibility problems in Hilbert spaces.

Convex Optimization with Computational Errors (Springer Optimization and Its Applications #155)

by Alexander J. Zaslavski

The book is devoted to the study of approximate solutions of optimization problems in the presence of computational errors. It contains a number of results on the convergence behavior of algorithms in a Hilbert space, which are known as important tools for solving optimization problems. The research presented in the book is the continuation and further development of the author's 2016 book Numerical Optimization with Computational Errors (Springer, 2016). Both books study algorithms taking into account computational errors, which are always present in practice. The main goal is, for a known computational error, to find out what approximate solution can be obtained and how many iterates one needs for this. The main difference between this new book and the 2016 book is that the present book takes into consideration the fact that, for every algorithm, an iteration consists of several steps and that the computational errors for different steps are, in general, different. This fact, which was not taken into account in the previous book, is indeed important in practice. For example, the subgradient projection algorithm consists of two steps. The first step is a calculation of a subgradient of the objective function, while in the second one we calculate a projection on the feasible set. In each of these two steps there is a computational error, and these two computational errors are different in general. It may happen that the feasible set is simple and the objective function is complicated. As a result, the computational error made when one calculates the projection is essentially smaller than the computational error of the calculation of the subgradient. Clearly, the opposite case is possible too. Another feature of this book is a study of a number of important algorithms which appeared recently in the literature and which are not discussed in the previous book.

This monograph contains 12 chapters. Chapter 1 is an introduction. In Chapter 2 we study the subgradient projection algorithm for minimization of convex and nonsmooth functions; we generalize the results of [NOCE] and establish results which have no prototype in [NOCE]. In Chapter 3 we analyze the mirror descent algorithm for minimization of convex and nonsmooth functions in the presence of computational errors. For this algorithm each iteration consists of two steps: the first step is a calculation of a subgradient of the objective function, while in the second one we solve an auxiliary minimization problem on the set of feasible points. In each of these two steps there is a computational error. We generalize the results of [NOCE] and establish results which have no prototype in [NOCE]. In Chapter 4 we analyze the projected gradient algorithm with a smooth objective function in the presence of computational errors. In Chapter 5 we consider an algorithm which is an extension of the projected gradient algorithm used for solving linear inverse problems arising in signal/image processing. In Chapter 6 we study the continuous subgradient method and the continuous subgradient projection algorithm for minimization of convex nonsmooth functions and for computing the saddle points of convex-concave functions, in the presence of computational errors. All the results of this chapter have no prototype in [NOCE]. In Chapters 7-12 we analyze several algorithms, not considered in [NOCE], in the presence of computational errors. Again, each step of an iteration has a computational error, and we take into account that these errors are, in general, different. An optimization problem with a composite objective function is studied in Chapter 7. A zero-sum game with two players is considered in Chapter 8. A predicted decrease approximation-based method is used in Chapter 9 for constrained convex optimization. Chapter 10 is devoted to minimization of quasiconvex functions. Minimization of sharp weakly convex functions is also discussed.
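The two-step iteration described in this blurb, an inexact subgradient evaluation followed by an inexact projection, each with its own error budget, can be illustrated with a small sketch. The sketch below is not taken from the book: the function names, the Euclidean setting, and the bounded-noise error model are illustrative assumptions only.

```python
import numpy as np

def subgradient_projection(subgrad, project, x0, step_sizes,
                           subgrad_err=0.0, proj_err=0.0, seed=0):
    """Illustrative subgradient projection iteration with two per-step errors.

    subgrad(x)  -- a subgradient of the objective at x (step 1)
    project(y)  -- projection of y onto the feasible set (step 2)
    subgrad_err -- bound on the componentwise error of the subgradient step
    proj_err    -- bound on the componentwise error of the projection step
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    for alpha in step_sizes:
        # Step 1: inexact subgradient (perturbation of magnitude <= subgrad_err).
        g = subgrad(x) + subgrad_err * rng.uniform(-1.0, 1.0, size=x.shape)
        # Step 2: inexact projection (perturbation of magnitude <= proj_err).
        x = project(x - alpha * g) + proj_err * rng.uniform(-1.0, 1.0, size=x.shape)
    return x

# Toy example: minimize ||x - c||_1 over the Euclidean unit ball, with a cheap,
# accurate projection and a noisier subgradient evaluation -- the situation the
# blurb mentions, where the two per-step errors differ.
c = np.array([2.0, -1.0])
f_subgrad = lambda x: np.sign(x - c)                   # a subgradient of the l1 objective
ball_proj = lambda y: y / max(1.0, np.linalg.norm(y))  # projection onto the unit ball
steps = [1.0 / np.sqrt(t + 1) for t in range(200)]
x_approx = subgradient_projection(f_subgrad, ball_proj, np.zeros(2), steps,
                                  subgrad_err=1e-2, proj_err=1e-4)
print(x_approx)
```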

Nonconvex Optimal Control and Variational Problems

by Alexander J. Zaslavski

Nonconvex Optimal Control and Variational Problems is an important contribution to the existing literature in the field and is devoted to the presentation of progress made in the last 15 years of research in the area of optimal control and the calculus of variations. This volume contains a number of results concerning well-posedness of optimal control and variational problems, nonoccurrence of the Lavrentiev phenomenon for optimal control and variational problems, and turnpike properties of approximate solutions of variational problems. Chapter 1 contains an introduction as well as examples of select topics. Chapters 2-5 consider the well-posedness condition using fine tools of general topology and porosity. Chapters 6-8 are devoted to the nonoccurrence of the Lavrentiev phenomenon and contain original results. Chapter 9 focuses on infinite-dimensional linear control problems, and Chapter 10 deals with "good" functions and explores new understanding of questions of optimality in variational problems. Finally, Chapters 11-12 are centered around the turnpike property, a particular area of expertise for the author. This volume is intended for mathematicians, engineers, and scientists interested in the calculus of variations, optimal control, optimization, and applied functional analysis, as well as for both undergraduate and graduate students specializing in those areas. The text devoted to turnpike properties may be of particular interest to the economics community.

Numerical Optimization with Computational Errors

by Alexander J. Zaslavski

This book studies the approximate solutions of optimization problems in the presence of computational errors. A number of results are presented on the convergence behavior of algorithms in a Hilbert space; these algorithms are examined taking into account computational errors. The author illustrates that algorithms generate a good approximate solution if computational errors are bounded from above by a small positive constant. Known computational errors are examined with the aim of determining an approximate solution. Researchers and students interested in optimization theory and its applications will find this book instructive and informative. This monograph contains 16 chapters, including chapters devoted to the subgradient projection algorithm, the mirror descent algorithm, the gradient projection algorithm, Weiszfeld's method, constrained convex minimization problems, the convergence of a proximal point method in a Hilbert space, the continuous subgradient method, penalty methods, and Newton's method.

Optimal Control Problems Arising in Forest Management (SpringerBriefs in Optimization)

by Alexander J. Zaslavski

This book is devoted to the study of optimal control problems arising in forest management, an important and fascinating topic in mathematical economics studied by many researchers over the years. The volume studies the forest management problem by analyzing a class of optimal control problems that contains it and showing the existence of optimal solutions over an infinite horizon. It also studies the structure of approximate solutions on finite intervals and their turnpike properties, as well as the stability of the turnpike phenomenon and the structure of approximate solutions on finite intervals in the regions close to the endpoints. The book is intended for mathematicians interested in optimization theory, optimal control, and their applications to economic theory.

Optimal Control Problems Related to the Robinson–Solow–Srinivasan Model (Monographs in Mathematical Economics #4)

by Alexander J. Zaslavski

This book is devoted to the study of classes of optimal control problems arising in economic growth theory, related to the Robinson–Solow–Srinivasan (RSS) model. The model was introduced in the 1960s by economists Joan Robinson, Robert Solow, and Thirukodikaval Nilakanta Srinivasan and was further studied by Robinson, Nobuo Okishio, and Joseph Stiglitz. Since then, the study of the RSS model has become an important element of economic dynamics. In this book, two large general classes of optimal control problems, both of them containing the RSS model as a particular case, are presented for study. For these two classes, a turnpike theory is developed and the existence of solutions to the corresponding infinite horizon optimal control problems is established. The book contains 9 chapters. Chapter 1 discusses turnpike properties for some optimal control problems that are known in the literature, including problems corresponding to the RSS model. The first class of optimal control problems is studied in Chaps. 2–6. In Chap. 2, infinite horizon optimal control problems with nonautonomous optimality criteria are considered. The utility functions, which determine the optimality criterion, are nonconcave. This class of models contains the RSS model as a particular case. The stability of the turnpike phenomenon of the one-dimensional nonautonomous concave RSS model is analyzed in Chap. 3. The following chapter takes up the study of a class of autonomous nonconcave optimal control problems, a subclass of problems considered in Chap. 2. The equivalence of the turnpike property and the asymptotic turnpike property, as well as the stability of the turnpike phenomenon, is established. Turnpike conditions and the stability of the turnpike phenomenon for nonautonomous problems are examined in Chap. 5, with Chap. 6 devoted to the study of the turnpike properties for the one-dimensional nonautonomous nonconcave RSS model. The utility functions, which determine the optimality criterion, are nonconcave. The class of RSS models is identified with a complete metric space of utility functions. Using the Baire category approach, the turnpike phenomenon is shown to hold for most of the models. Chapter 7 begins the study of the second large class of autonomous optimal control problems, and turnpike conditions are established. The stability of the turnpike phenomenon for this class of problems is investigated further in Chaps. 8 and 9.

Optimization in Banach Spaces (SpringerBriefs in Optimization)

by Alexander J. Zaslavski

The book is devoted to the study of constrained minimization problems on closed and convex sets in Banach spaces with a Fréchet differentiable objective function. Such problems are well studied in a finite-dimensional space and in an infinite-dimensional Hilbert space. When the space is a Hilbert space there are many algorithms for solving optimization problems, including the gradient projection algorithm, which is one of the most important tools in optimization theory, nonlinear analysis, and their applications. An optimization problem is described by an objective function and a set of feasible points. For the gradient projection algorithm each iteration consists of two steps. The first step is a calculation of a gradient of the objective function, while in the second one we calculate a projection on the feasible set. In each of these two steps there is a computational error. In our recent research we show that the gradient projection algorithm generates a good approximate solution if all the computational errors are bounded from above by a small positive constant. It should be mentioned that the properties of a Hilbert space play an important role. When we consider an optimization problem in a general Banach space the situation becomes more difficult and less understood. On the other hand, such problems arise in approximation theory. The book is of interest for mathematicians working in optimization. It also can be useful in preparation courses for graduate students. The main feature of the book which appeals specifically to this audience is the study of algorithms for convex and nonconvex minimization problems in a general Banach space. The book is also of interest for experts in applications of optimization to approximation theory. In this book the goal is to obtain a good approximate solution of the constrained optimization problem in a general Banach space in the presence of computational errors. It is shown that the algorithm generates a good approximate solution if the sequence of computational errors is bounded from above by a small constant. The book consists of four chapters. In the first we discuss several algorithms which are studied in the book and prove a convergence result for an unconstrained problem which is a prototype of our results for the constrained problem. In Chapter 2 we analyze convex optimization problems. Nonconvex optimization problems are studied in Chapter 3. In Chapter 4 we study continuous algorithms for minimization problems in the presence of computational errors.

Optimization on Solution Sets of Common Fixed Point Problems (Springer Optimization and Its Applications #178)

by Alexander J. Zaslavski

This book is devoted to a detailed study of the subgradient projection method and its variants for convex optimization problems over the solution sets of common fixed point problems and convex feasibility problems. These optimization problems are investigated to determine the good solutions obtained by different versions of the subgradient projection algorithm in the presence of sufficiently small computational errors. The use of selected algorithms is highlighted, including the Cimmino-type subgradient, the iterative subgradient, and the dynamic string-averaging subgradient. All results presented are new. Optimization problems in which the underlying constraints are the solution sets of other problems frequently occur in applied mathematics. The reader should not miss the section in Chapter 1 which considers some examples arising in real-world applications. The problems discussed also have an important impact on optimization theory. The book will be useful for researchers interested in optimization theory and its applications.

The Projected Subgradient Algorithm in Convex Optimization (SpringerBriefs in Optimization)

by Alexander J. Zaslavski

This focused monograph presents a study of subgradient algorithms for constrained minimization problems in a Hilbert space. The book is of interest for experts in applications of optimization to engineering and economics. The goal is to obtain a good approximate solution of the problem in the presence of computational errors. The discussion takes into consideration the fact that for every algorithm its iteration consists of several steps and that computational errors for different steps are, in general, different. The book is especially useful for the reader because it contains solutions to a number of difficult and interesting problems in numerical optimization. The subgradient projection algorithm is one of the most important tools in optimization theory and its applications. An optimization problem is described by an objective function and a set of feasible points. For this algorithm each iteration consists of two steps. The first step requires a calculation of a subgradient of the objective function; the second requires a calculation of a projection on the feasible set. The computational errors in each of these two steps are different. This book shows that the algorithm discussed generates a good approximate solution if all the computational errors are bounded from above by a small positive constant. Moreover, if the computational errors for the two steps of the algorithm are known, one can determine what approximate solution can be obtained and how many iterations one needs for this. In addition to their mathematical interest, the generalizations considered in this book have a significant practical meaning.

Solutions of Fixed Point Problems with Computational Errors (Springer Optimization and Its Applications #210)

by Alexander J. Zaslavski

The book is devoted to the study of approximate solutions of fixed point problems in the presence of computational errors. It begins with a study of approximate solutions of star-shaped feasibility problems in the presence of perturbations. The goal is to show the convergence of algorithms which are known as important tools for solving convex feasibility problems and common fixed point problems. The text also presents studies of algorithms based on unions of nonexpansive maps, inconsistent convex feasibility problems, and split common fixed point problems. A number of algorithms are considered for solving convex feasibility problems and common fixed point problems. The book will be of interest for researchers and engineers working in optimization, numerical analysis, and fixed point theory. It also can be useful in preparation courses for graduate students. The main feature of the book which appeals specifically to this audience is the study of the influence of computational errors for several important algorithms used for nonconvex feasibility problems.

Stability of the Turnpike Phenomenon in Discrete-Time Optimal Control Problems

by Alexander J. Zaslavski

The structure of approximate solutions of autonomous discrete-time optimal control problems and individual turnpike results for optimal control problems without convexity (concavity) assumptions are examined in this book. In particular, the book focuses on the properties of approximate solutions which are independent of the length of the interval, for all sufficiently large intervals; these results relate to the so-called turnpike property of optimal control problems. Under the turnpike property, the approximate solutions of the problems are determined primarily by the objective function and are essentially independent of the choice of interval and endpoint conditions, except in regions close to the endpoints. This book also explores the turnpike phenomenon for two large classes of autonomous optimal control problems. It is shown that the turnpike phenomenon is stable for an optimal control problem if the corresponding infinite horizon optimal control problem possesses an asymptotic turnpike property. If an optimal control problem belonging to the first class possesses the turnpike property, then the turnpike is a singleton (a one-point set). The stability of the turnpike property under small perturbations of the objective function and of the constraint map is established. For the second class of problems, where the turnpike is not necessarily a singleton, the stability of the turnpike property under small perturbations of the objective function is established. Containing solutions of difficult problems in optimal control and presenting new approaches, techniques, and methods, this book is of interest for mathematicians working in optimal control and the calculus of variations. It also can be useful in preparation courses for graduate students.
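As a rough, schematic illustration (not a statement from the book, whose precise hypotheses and constants differ), the discrete-time turnpike property described above can be written as follows.

```latex
% Schematic turnpike property (illustrative only). Here \bar{x} denotes the
% turnpike and \{x_t\}_{t=0}^{T} an approximate solution on the interval [0,T].
% For every \varepsilon > 0 there is an integer L \ge 1, independent of T and
% of the endpoint conditions, such that
\[
  \| x_t - \bar{x} \| \le \varepsilon
  \qquad \text{for all } t \in \{ L, L+1, \dots, T - L \},
\]
% i.e. the trajectory stays within \varepsilon of the turnpike except on the
% two end segments of length L, however large T is.
```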

Structure of Approximate Solutions of Optimal Control Problems

by Alexander J. Zaslavski

This title examines the structure of approximate solutions of optimal control problems considered on subintervals of a real line, and specifically the properties of approximate solutions which are independent of the length of the interval. The results presented in this book concern the so-called turnpike property of optimal control problems. The author generalizes the results on the turnpike property by considering a class of optimal control problems which is identified with the corresponding complete metric space of objective functions. The turnpike property is established for any element of a set that is a countable intersection of open, everywhere dense sets in the space of integrands; this means that the turnpike property holds for most optimal control problems. Mathematicians working in optimal control and the calculus of variations, as well as graduate students, will find this book useful and valuable due to its presentation of solutions to a number of difficult problems in optimal control and its presentation of new approaches, techniques, and methods.

Structure of Solutions of Variational Problems

by Alexander J. Zaslavski

Structure of Solutions of Variational Problems is devoted to recent progress made in the study of the structure of approximate solutions of variational problems considered on subintervals of a real line. Results on properties of approximate solutions which are independent of the length of the interval, for all sufficiently large intervals, are presented in a clear manner. Solutions, new approaches, techniques, and methods for a number of difficult problems in the calculus of variations are illustrated throughout this book. This book also contains significant results and information about the turnpike property of variational problems. This well-known property is a general phenomenon which holds for large classes of variational problems. The author examines the turnpike property in relation to individual (non-generic) turnpike results, sufficient and necessary conditions for the turnpike phenomenon, and the non-intersection property for extremals of variational problems. This book appeals to mathematicians working in optimal control and the calculus of variations, as well as to graduate students.

Turnpike Conditions in Infinite Dimensional Optimal Control (Springer Optimization and Its Applications #148)

by Alexander J. Zaslavski

This book provides a comprehensive study of the turnpike phenomenon arising in optimal control theory. The focus is on individual (non-generic) turnpike results which are both mathematically significant and have numerous applications in engineering and economic theory. All results obtained in the book are new. New approaches, techniques, and methods are rigorously presented and utilize research from finite-dimensional variational problems and discrete-time optimal control problems to find the necessary conditions for the turnpike phenomenon in infinite dimensional spaces. The semigroup approach is employed in the discussion, as well as PDE descriptions of continuous-time dynamics. The main results on sufficient and necessary conditions for the turnpike property are completely proved, and numerous illustrative examples make the material accessible to a broad spectrum of experts. Mathematicians interested in the calculus of variations, optimal control, and applied functional analysis will find this book a useful guide to the turnpike phenomenon in infinite dimensional spaces. Experts in economic and engineering modeling as well as graduate students will also benefit from the developed techniques and obtained results.

Turnpike Phenomenon and Infinite Horizon Optimal Control

by Alexander J. Zaslavski

This book is devoted to the study of the turnpike phenomenon and describes the existence of solutions for a large variety of classes of infinite horizon optimal control problems. Chapter 1 provides introductory material on turnpike properties. Chapter 2 studies the turnpike phenomenon for discrete-time optimal control problems. The turnpike properties of autonomous problems with extended-value integrands are studied in Chapter 3. Chapter 4 focuses on large classes of infinite horizon optimal control problems without convexity (concavity) assumptions. In Chapter 5, turnpike results for a class of dynamic discrete-time two-player zero-sum games are proven. This thorough exposition will be very useful for mathematicians working in the fields of optimal control, the calculus of variations, applied functional analysis, and infinite horizon optimization. It may also be used as a primary text in a graduate course in optimal control or as supplementary text for a variety of courses in other disciplines. Researchers in other fields such as economics and game theory, where turnpike properties are well known, will also find this work valuable.

Turnpike Phenomenon in Metric Spaces (Springer Optimization and Its Applications #201)

by Alexander J. Zaslavski

This book is devoted to the study of the turnpike phenomenon arising in optimal control theory. Special focus is placed on turnpike results, on sufficient and necessary conditions for the turnpike phenomenon, and on its stability under small perturbations of objective functions. The most important feature of this book is that it develops a large, general class of optimal control problems in metric spaces. Additional value is in the provision of solutions to a number of difficult and interesting problems in optimal control theory in metric spaces. Mathematicians working in optimal control and optimization, and experts in applications of optimal control to economics and engineering, will find this book particularly useful. All main results obtained in the book are new. The monograph contains nine chapters. Chapter 1 is an introduction. Chapter 2 discusses Banach space valued functions, set-valued mappings in infinite dimensional spaces, and related continuous-time dynamical systems, and some convergence results are obtained. In Chapter 3, a discrete-time dynamical system with a Lyapunov function in a metric space induced by a set-valued mapping is studied. Chapter 4 is devoted to the study of a class of continuous-time dynamical systems, an analog of the class of discrete-time dynamical systems considered in Chapter 3. Chapter 5 develops a turnpike theory for a class of general dynamical systems in a metric space with a Lyapunov function. Chapter 6 contains a study of the turnpike phenomenon for discrete-time nonautonomous problems on subintervals of the half-axis in metric spaces, which are not necessarily compact. Chapter 7 contains preliminaries which are needed in order to study turnpike properties of infinite-dimensional optimal control problems. In Chapter 8, sufficient and necessary conditions for the turnpike phenomenon are established for continuous-time optimal control problems on subintervals of the half-axis in metric spaces. Chapter 9 continues the examination of the turnpike phenomenon for the continuous-time optimal control problems discussed in Chapter 8.

Turnpike Theory for the Robinson–Solow–Srinivasan Model (Springer Optimization and Its Applications #166)

by Alexander J. Zaslavski

This book is devoted to the study of a class of optimal control problems arising in mathematical economics, related to the Robinson–Solow–Srinivasan (RSS) model. It will be useful for researchers interested in turnpike theory, infinite horizon optimal control and their applications, and for mathematical economists. The RSS model is a well-known model of economic dynamics that was introduced in the 1960s. Like many other models of economic dynamics, it is determined by an objective function (a utility function) and a set-valued mapping (a technology map). The set-valued map generates a dynamical system whose trajectories are under consideration, and the objective function determines an optimality criterion. The goal is to find optimal trajectories of the dynamical system, using the optimality criterion. Chapter 1 discusses turnpike properties for some classes of discrete-time optimal control problems. Chapter 2 presents the description of the RSS model and discusses its basic properties. Infinite horizon optimal control problems related to the RSS model are studied in Chapter 3. Turnpike properties for the RSS model are analyzed in Chapter 4. Chapter 5 studies infinite horizon optimal control problems related to the RSS model with a nonconcave utility function. Chapter 6 focuses on infinite horizon optimal control problems with nonautonomous optimality criteria. Chapter 7 contains turnpike results for a class of discrete-time optimal control problems. Chapter 8 discusses the RSS model and compares different optimality criteria. Chapter 9 is devoted to the study of the turnpike properties for the RSS model. In Chapter 10 the one-dimensional autonomous RSS model is considered, and the continuous-time RSS model is studied in Chapter 11.

Turnpike Theory of Continuous-Time Linear Optimal Control Problems

by Alexander J. Zaslavski

Individual turnpike results are of great interest due to their numerous applications in engineering and in economic theory; in this book the study is focused on new results on the turnpike phenomenon in linear optimal control problems. The book is intended for engineers as well as for mathematicians interested in the calculus of variations, optimal control, and applied functional analysis. Two large classes of problems are studied in more depth. The first class, studied in Chapter 2, consists of linear control problems with periodic nonsmooth convex integrands; the second, studied in Chapters 3-5, consists of linear control problems with autonomous convex smooth integrands. Chapter 6 discusses a turnpike property for dynamic zero-sum games with linear constraints. Chapter 7 examines genericity results. In Chapter 8, a description of the structure of variational problems with extended-valued integrands is obtained. Chapter 9 ends the exposition with a study of the turnpike phenomenon for dynamic games with extended-valued integrands.

Math Games & Activities from Around the World

by Claudia Zaslavsky

More than 70 math games, puzzles, and projects from all over the world are included in this delightful book for kids.

More Math Games & Activities from Around the World

by Claudia Zaslavsky

Math, history, art, and world cultures come together in this delightful book for kids, even for those who find traditional math lessons boring. More than 70 games, puzzles, and projects encourage kids to hone their math skills as they calculate, measure, and solve problems. The games span the globe, and many have been played for thousands of years, such as three-in-a-row games like Achi from Ghana and the forbidden game of Jirig from Mongolia. Also included are imaginative board games like Lambs and Tigers from India and the Little Goat Game from Sudan, as well as bead and string puzzles from China and Möbius strip puzzles from Germany. Through compelling math play, children will gain confidence and have fun as they learn about the different ways people around the world measure, count, and use patterns and symmetry in their everyday lives.
