Browse Results

Showing 43,401 through 43,425 of 64,255 results

Optimal Automated Process Fault Analysis

by Daniel L. Chester and Richard J. Fickelscherer

Automated fault analysis is not widely used within chemical processing industries due to problems of cost and performance as well as the difficulty of modeling process behavior at needed levels of detail. In response, this book presents the method of minimal evidence (MOME), a model-based diagnostic strategy that facilitates the development and implementation of optimal automated process fault analyzers. With this book as their guide, readers have a powerful new tool for ensuring the safety and reliability of any chemical processing system.

Optimal Auxiliary Functions Method for Nonlinear Dynamical Systems

by Vasile Marinca, Nicolae Herisanu, and Bogdan Marinca

This book presents the optimal auxiliary functions method and applies it to various engineering problems, in particular boundary layer problems. The cornerstone of the procedure is the concept of “optimal auxiliary functions”, which are needed to obtain accurate results in an efficient way. Unlike other known analytic approaches, this procedure provides a simple but rigorous way to control and adjust the convergence of the solutions of nonlinear dynamical systems. The optimal auxiliary functions depend on convergence-control parameters whose optimal values are rigorously determined from a mathematical point of view. The principal strength of the procedure is its fast convergence: after only one iteration, very accurate analytical solutions are obtained that are easy to verify. Moreover, no simplifying hypotheses or assumptions are made. The book contains a large number of practical models from various fields of engineering, such as classical and fluid mechanics, thermodynamics, nonlinear oscillations, electrical machines, and many more. It is a continuation of the authors' previous books “Nonlinear Dynamical Systems in Engineering: Some Approximate Approaches” (Springer, 2011) and “The Optimal Homotopy Asymptotic Method: Engineering Applications” (Springer, 2015).

Optimal Control: An Introduction to the Theory and Its Applications

by Peter L. Falb and Michael Athans

Geared toward advanced undergraduate and graduate engineering students, this text introduces the theory and applications of optimal control. It serves as a bridge to the technical literature, enabling students to evaluate the implications of theoretical control work and to judge the merits of papers on the subject. Rather than presenting an exhaustive treatise, Optimal Control offers a detailed introduction that fosters careful thinking and disciplined intuition. It develops the basic mathematical background, with a coherent formulation of the control problem and discussions of the necessary conditions for optimality based on the maximum principle of Pontryagin. In-depth examinations cover applications of the theory to minimum time, minimum fuel, and quadratic criteria problems. The structure, properties, and engineering realizations of several optimal feedback control systems also receive attention. Special features include numerous specific problems, carried through to engineering realization in block diagram form. The text treats almost all current examples of control problems that permit analytic solutions, and its unified approach makes frequent use of geometric ideas to encourage students' intuition.

Optimal Control: Weakly Coupled Systems and Applications (Automation and Control Engineering #206)

by Zoran Gajic, Myo-Taeg Lim, Dobrila Skataric, Wu-Chung Su, and Vojislav Kecman

Unique in scope, Optimal Control: Weakly Coupled Systems and Applications provides complete coverage of modern linear, bilinear, and nonlinear optimal control algorithms for both continuous-time and discrete-time weakly coupled systems, using deterministic as well as stochastic formulations. This book presents numerous applications to real world systems from various industries, including aerospace, and discusses the design of subsystem-level optimal filters. Organized into independent chapters for easy access to the material, this text also contains several case studies, examples, exercises, computer assignments, and formulations of research problems to help instructors and students.

Optimal Control

by Frank L. Lewis, Draguna Vrabie, and Vassilis L. Syrmos

A new edition of the classic text on optimal control theory. As a superb introductory text and an indispensable reference, this new edition of Optimal Control will serve the needs of both the professional engineer and the advanced student in mechanical, electrical, and aerospace engineering. Its coverage encompasses all the fundamental topics as well as the major changes that have occurred in recent years. An abundance of computer simulations using MATLAB and relevant toolboxes is included to give the reader the actual experience of applying the theory to real-world situations. Major topics covered include: static optimization; optimal control of discrete-time systems; optimal control of continuous-time systems; the tracking problem and other LQR extensions; final-time-free and constrained input control; dynamic programming; optimal control for polynomial systems; output feedback and structured control; robustness and multivariable frequency-domain techniques; differential games; and reinforcement learning and optimal adaptive control.

Optimal Control: Linear Quadratic Methods (Dover Books On Engineering Ser.)

by John B. Moore and Brian D. Anderson

This augmented edition of a respected text teaches the reader how to use linear quadratic Gaussian methods effectively for the design of control systems. It explores linear optimal control theory from an engineering viewpoint, with step-by-step explanations that show clearly how to make practical use of the material. The three-part treatment begins with the basic theory of the linear regulator/tracker for time-invariant and time-varying systems. The Hamilton-Jacobi equation is introduced using the Principle of Optimality, and the infinite-time problem is considered. The second part outlines the engineering properties of the regulator. Topics include degree of stability, phase and gain margin, tolerance of time delay, effect of nonlinearities, asymptotic properties, and various sensitivity problems. The third part explores state estimation and robust controller design using state-estimate feedback. Numerous examples emphasize the issues related to consistent and accurate system design. Key topics include loop-recovery techniques, frequency shaping, and controller reduction, for both scalar and multivariable systems. Self-contained appendixes cover matrix theory, linear systems, the Pontryagin minimum principle, Lyapunov stability, and the Riccati equation. Newly added to this Dover edition is a complete solutions manual for the problems appearing at the conclusion of each section.

Optimal Control and Optimization of Stochastic Supply Chain Systems

by Dong-Ping Song

Optimal Control and Optimization of Stochastic Supply Chain Systems examines its subject in the context of a variety of uncertainties. Numerous examples with intuitive illustrations and tables are provided to demonstrate the structural characteristics of the optimal control policies in various stochastic supply chains, and to show how to make use of these characteristics to construct easy-to-operate sub-optimal policies. In Part I, a general introduction to stochastic supply chain systems is provided. Analytical models for various stochastic supply chain systems are formulated and analysed in Part II. In Part III the structural knowledge of the optimal control policies obtained in Part II is utilized to construct easy-to-operate sub-optimal control policies for various stochastic supply chain systems. Finally, Part IV discusses the optimisation of threshold-type control policies and their robustness. A key feature of the book is its tying together of the complex analytical models produced by the requirements of operational practice and the simple solutions needed for implementation. The analytical models and theoretical analysis propounded in this monograph will benefit academic researchers and graduate students looking at logistics and supply chain management from standpoints in operations research or industrial, manufacturing, or control engineering. The practical tools and solutions, and the qualitative insights into the ideas underlying functional supply chain systems, will be of similar use to readers from more industrially-based backgrounds.

Optimal Control for Chemical Engineers

by Simant Ranjan Upreti

This self-contained book gives a detailed treatment of optimal control theory that enables readers to formulate and solve optimal control problems. With a strong emphasis on problem solving, it provides all the necessary mathematical analyses and derivations of important results, including multiplier theorems and Pontryagin's principle. The text presents various examples and basic concepts of optimal control and describes important numerical methods and computational algorithms for solving a wide range of optimal control problems, including periodic processes.

Optimal Control of Hybrid Vehicles

by Thijs Van Keulen, John Kessels, and Bram De Jager

Optimal Control of Hybrid Vehicles provides a description of power train control for hybrid vehicles. The background, environmental motivation and control challenges associated with hybrid vehicles are introduced. The text includes mathematical models for all relevant components in the hybrid power train. The power split problem in hybrid power trains is formally described and several numerical solutions detailed, including dynamic programming and a novel solution for state-constrained optimal control problems based on the maximum principle. Real-time-implementable strategies that can approximate the optimal solution closely are dealt with in depth. Several approaches are discussed and compared, including a state-of-the-art strategy which is adaptive for vehicle conditions like velocity and mass. Three case studies are included in the book:
* a control strategy for a micro-hybrid power train;
* experimental results obtained with a real-time strategy implemented in a hybrid electric truck; and
* an analysis of the optimal component sizes for a hybrid power train.
Optimal Control of Hybrid Vehicles will appeal to academic researchers and graduate students interested in hybrid vehicle control or in the applications of optimal control. Practitioners working in the design of control systems for the automotive industry will also find the ideas propounded in this book of interest.

Optimal Control of Hydrosystems

by Larry W. Mays

"Combines the hydraulic simulation of physical processes with mathematical programming and differential dynamic programming techniques to ensure the optimization of hydrosystems. Presents the principles and methodologies for systems and optimal control concepts; features differential dynamic programming in developing models and solution algorithms for groundwater, real-time flood and sediment control of river-reservoir systems, and water distribution systems operations, as well as bay and estuary freshwater inflow reservoir operations; and more."

Optimal Control of PDEs under Uncertainty: An Introduction with Application to Optimal Shape Design of Structures (SpringerBriefs in Mathematics)

by Jesús Martínez-Frutos and Francisco Periago Esparza

This book provides a direct and comprehensive introduction to theoretical and numerical concepts in the emerging field of optimal control of partial differential equations (PDEs) under uncertainty. The main objective of the book is to offer graduate students and researchers a smooth transition from optimal control of deterministic PDEs to optimal control of random PDEs. Coverage includes uncertainty modelling in control problems, variational formulation of PDEs with random inputs, robust and risk-averse formulations of optimal control problems, existence theory and numerical resolution methods. The exposition focusses on the entire path, starting from uncertainty modelling and ending in the practical implementation of numerical schemes for the numerical approximation of the considered problems. To this end, a selected number of illustrative examples are analysed in detail throughout the book. Computer codes, written in MATLAB, are provided for all these examples. This book is addressed to graduate students and researchers in Engineering, Physics and Mathematics who are interested in optimal control and optimal design for random partial differential equations.

Optimal Control Of Singularly Perturbed Linear Systems And Applications (Automation And Control Engineering Ser.)

by Zoran Gajic

Highlights the Hamiltonian approach to singularly perturbed linear optimal control systems. Develops parallel algorithms in independent slow and fast time scales for solving various optimal linear control and filtering problems in standard and nonstandard singularly perturbed systems, continuous- and discrete-time, deterministic and stochastic, mul

Optimal Control Systems (Electrical Engineering Series)

by D. Subbaram Naidu

The theory of optimal control systems has grown and flourished since the 1960s. Many texts, written on varying levels of sophistication, have been published on the subject. Yet even those purportedly designed for beginners in the field are often riddled with complex theorems, and many treatments fail to include topics that are essential to a thorough grounding in the various aspects of and approaches to optimal control. Optimal Control Systems provides a comprehensive but accessible treatment of the subject with just the right degree of mathematical rigor to be complete but practical. It provides a solid bridge between "traditional" optimization using the calculus of variations and what is called "modern" optimal control. It also treats both continuous-time and discrete-time optimal control systems, giving students a firm grasp on both methods. Among this book's most outstanding features is a summary table that accompanies each topic or problem and includes a statement of the problem with a step-by-step solution. Students will also gain valuable experience in using industry-standard MATLAB and SIMULINK software, including the Control System and Symbolic Math Toolboxes. Diverse applications across fields from power engineering to medicine make a foundation in optimal control systems an essential part of an engineer's background. This clear, streamlined presentation is ideal for a graduate-level course on control systems and as a quick reference for working engineers.

Optimal Control Theory: An Introduction

by Donald E. Kirk

Optimal control theory is the science of maximizing the returns from and minimizing the costs of the operation of physical, social, and economic processes. Geared toward upper-level undergraduates, this text introduces three aspects of optimal control theory: dynamic programming, Pontryagin's minimum principle, and numerical techniques for trajectory optimization. Chapters 1 and 2 focus on describing systems and evaluating their performances. Chapter 3 deals with dynamic programming. The calculus of variations and Pontryagin's minimum principle are the subjects of chapters 4 and 5, and chapter 6 examines iterative numerical techniques for finding optimal controls and trajectories. Numerous problems, intended to introduce additional topics as well as to illustrate basic concepts, appear throughout the text.

Optimal Control Theory: The Variational Method

by Zhongjing Ma and Suli Zou

This book focuses on how to solve optimal control problems via the variational method. It studies how to find the extrema of functionals by applying the variational method, covering the extrema of functionals with different boundary conditions, involving multiple functions, and subject to certain constraints. It gives the necessary and sufficient conditions for the (continuous-time) optimal control solution via the variational method, solves optimal control problems with different boundary conditions, analyzes the linear quadratic regulator and tracking problems in detail, and provides the solution of optimal control problems with state constraints by applying Pontryagin's minimum principle, which is developed from the calculus of variations. The results are then applied to several classes of popular optimal control problems, such as minimum-time, minimum-fuel, and minimum-energy problems. As another key branch of optimal control methods, the book also presents how to solve optimal control problems via dynamic programming and discusses the relationship between the variational method and dynamic programming for comparison. For systems involving individual agents, it also studies how to obtain decentralized solutions to the underlying optimal control problems in the framework of differential games; the equilibrium is found by applying both Pontryagin's minimum principle and dynamic programming. The book also covers the discrete-time versions of all the above material, since discrete-time optimal control problems are very popular in many fields.

Optimal Control Theory: Applications to Management Science and Economics

by Suresh P. Sethi

This fully revised 3rd edition offers an introduction to optimal control theory and its diverse applications in management science and economics. It brings to students the concept of the maximum principle in continuous, as well as discrete, time by using dynamic programming and Kuhn-Tucker theory. While some mathematical background is needed, the emphasis of the book is not on mathematical rigor, but on modeling realistic situations faced in business and economics. The book applies optimal control theory to the functional areas of management, including finance, production and marketing, and to the economics of growth and of natural resources. In addition, this new edition features material on stochastic Nash and Stackelberg differential games and an adverse selection model in the principal-agent framework. The book provides exercises for each chapter and answers to selected exercises to help deepen the understanding of the material presented. Also included are appendices comprised of supplementary material on the solution of differential equations, the calculus of variations and its relationship to the maximum principle, and special topics including the Kalman filter, certainty equivalence, singular control, a global saddle point theorem, Sethi-Skiba points, and distributed parameter systems. Optimal control methods are used to determine optimal ways to control a dynamic system. The theoretical work in this field serves as a foundation for the book, which the author has applied to business management problems developed from his research and classroom instruction. The new edition has been completely refined and brought up to date. Ultimately this should continue to be a valuable resource for graduate courses on applied optimal control theory, but also for financial and industrial engineers, economists, and operational researchers concerned with the application of dynamic optimization in their fields.

Optimal Control Theory: Applications to Management Science and Economics (Springer Texts in Business and Economics)

by Suresh P. Sethi

This new 4th edition offers an introduction to optimal control theory and its diverse applications in management science and economics. It introduces students to the concept of the maximum principle in continuous (as well as discrete) time by combining dynamic programming and Kuhn-Tucker theory. While some mathematical background is needed, the emphasis of the book is not on mathematical rigor, but on modeling realistic situations encountered in business and economics. It applies optimal control theory to the functional areas of management including finance, production and marketing, as well as the economics of growth and of natural resources. In addition, it features material on stochastic Nash and Stackelberg differential games and an adverse selection model in the principal-agent framework. Exercises are included in each chapter, while the answers to selected exercises help deepen readers’ understanding of the material covered. Also included are appendices of supplementary material on the solution of differential equations, the calculus of variations and its ties to the maximum principle, and special topics including the Kalman filter, certainty equivalence, singular control, a global saddle point theorem, Sethi-Skiba points, and distributed parameter systems. Optimal control methods are used to determine optimal ways to control a dynamic system. The theoretical work in this field serves as the foundation for the book, in which the author applies it to business management problems developed from his own research and classroom instruction. The new edition has been refined and updated, making it a valuable resource for graduate courses on applied optimal control theory, but also for financial and industrial engineers, economists, and operational researchers interested in applying dynamic optimization in their fields.

Optimal Control with Aerospace Applications

by James M. Longuski, José J. Guzmán, and John E. Prussing

Want to know not just what makes rockets go up but how to do it optimally? Optimal control theory has become such an important field in aerospace engineering that no graduate student or practicing engineer can afford to be without a working knowledge of it. This is the first book that begins from scratch to teach the reader the basic principles of the calculus of variations, develop the necessary conditions step-by-step, and introduce the elementary computational techniques of optimal control. This book, with problems and an online solution manual, provides the graduate-level reader with enough introductory knowledge so that he or she can not only read the literature and study the next-level textbook but can also apply the theory to find optimal solutions in practice. No more is needed than the usual background of an undergraduate engineering, science, or mathematics program: namely calculus, differential equations, and numerical integration. Although finding optimal solutions for these problems is a complex process involving the calculus of variations, the authors carefully lay out step-by-step the most important theorems and concepts. Numerous examples are worked to demonstrate how to apply the theories to everything from classical problems (e.g., crossing a river in minimum time) to engineering problems (e.g., minimum-fuel launch of a satellite). Throughout the book use is made of the time-optimal launch of a satellite into orbit as an important case study with detailed analysis of two examples: launch from the Moon and launch from Earth. For launching into the field of optimal solutions, look no further!

Optimal Design and Control of Multibody Systems: Proceedings of the IUTAM Symposium (IUTAM Bookseries #42)

by Karin Nachbagauer and Alexander Held

This book presents the proceedings of the IUTAM Symposium on Optimal Design and Control of Multibody Systems 2022, covering research papers in the realm of optimal structural and control design for both rigid and flexible multibody systems. It delves into the application of the adjoint approach, enabling the undertaking of extensive topology optimizations to unearth body designs that excel under time- and design-dependent loads. Encompassing presentations on (adjoint) sensitivity analysis, structural optimization, optimal control, robust optimization, artificial intelligence, machine learning, and computational methods and software development, the IUTAM Symposium 2022 showcased the latest breakthroughs and innovative methodologies. This book presents 14 meticulously peer-reviewed proceedings papers from the event, evenly split between the Optimal Design and Optimal Control panels.

Optimal Design of Control Systems: Stochastic and Deterministic Problems (Pure and Applied Mathematics: A Series of Monographs and Textbooks/221)

by Gennadii E. Kolosov

"Covers design methods for optimal (or quasioptimal) control algorithms in the form of synthesis for deterministic and stochastic dynamical systems-with applications in aerospace, robotic, and servomechanical technologies. Providing new results on exact and approximate solutions of optimal control problems."

Optimal Design of Distributed Control and Embedded Systems

by Arben Çela, Mongi Ben Gaid, Xu-Guang Li, and Silviu-Iulian Niculescu

Optimal Design of Distributed Control and Embedded Systems focuses on the design of special control and scheduling algorithms based on system structural properties, as well as on analysis of the influence of induced time-delays on system performance. It treats the optimal design of distributed and embedded control systems (DCES) with respect to communication and calculation-resource constraints, quantization aspects, and potential time-delays induced by the associated communication and calculation model. Particular emphasis is put on optimal control signal scheduling based on the system state. In order to render this complex optimization problem feasible in real time, a time decomposition based on the periodicity induced by static scheduling is applied. The authors present a co-design approach which subsumes the synthesis of the optimal control laws and the generation of an optimal schedule of control signals on real-time networks, as well as the execution of control tasks on a single processor. The authors also apply a control structure modification or a control switching based on a thorough analysis of the influence of the induced time-delay on stability and system performance, in order to optimize DCES performance in case of calculation and communication resource limitations. Although the richness and variety of classes of DCES preclude a completely comprehensive treatment or a single "best" method of approaching them all, this co-design approach has the best chance of rendering this problem feasible and finding the optimal or some sub-optimal solution. The text is rounded out with references to such applications as car suspension and unmanned vehicles.
Optimal Design of Distributed Control and Embedded Systems will be of most interest to academic researchers working on the mathematical theory of DCES but the wide range of environments in which they are used also promotes the relevance of the text for control practitioners working in the avionics, automotive, energy-production, space exploration and many other industries.

Optimal Design of Switching Power Supply

by Zhanyou Sha, Xiaojun Wang, Yanpeng Wang, and Hongtao Ma

A contemporary evaluation of switching power design methods with real-world applications:
• Written by a leading author renowned in his field
• Focuses on switching power supply design, manufacture and debugging
• Switching power supplies have relevance for contemporary applications including mobile phone chargers, laptops and PCs
• Based on the authors' successful "Switching Power Optimized Design, 2nd Edition" (in Chinese)
• Highly illustrated with design examples of real world applications

Optimal Energie sparen beim Bauen, Sanieren und Wohnen: Ein vergleichbarer Index aller Maßnahmen

by Jürgen Eiselt

There are numerous options for energy-saving measures in existing housing stock. But which ones are economically sensible? Which measures pay off for owners, and are there also ways for tenants to save energy? This book answers these questions and aims to show strategies for achieving intelligent and economical energy-saving solutions. Energy for heating demand and household electricity consumption are considered together, which distinguishes the proposals presented in the book from the usual approach taken so far. Energy consultants will find new ideas, and owners as well as tenants will be put in a position to tackle energy-saving measures on their own and to critically question proposed projects.

Optimal Estimation of Dynamic Systems (Chapman & Hall/CRC Applied Mathematics & Nonlinear Science)

by John L. Crassidis and John L. Junkins

An ideal self-study guide for practicing engineers as well as senior undergraduate and beginning graduate students, this book highlights the importance of both physical and numerical modeling in solving dynamics-based estimation problems found in engineering systems, such as spacecraft attitude determination, GPS navigation, orbit determination, and aircraft tracking. With more than 100 pages of new material, this reorganized and expanded edition incorporates new theoretical results, a new chapter on advanced sequential state estimation, and additional examples and exercises. MATLAB codes are available on the book's website.

Optimal Event-Triggered Control Using Adaptive Dynamic Programming (ISSN)

by Sarangapani Jagannathan, Vignesh Narayanan, and Avimanyu Sahoo

Optimal Event-Triggered Control Using Adaptive Dynamic Programming discusses event-triggered controller design, which includes optimal control and event-sampling design for linear and nonlinear dynamic systems, including networked control systems (NCS), when the system dynamics are both known and uncertain. NCS are a first step toward realizing the cyber-physical systems (CPS) or Industry 4.0 vision. The authors apply several powerful modern control techniques to the design of event-triggered controllers, derive event-trigger conditions, and demonstrate closed-loop stability. Detailed derivations, rigorous stability proofs, computer simulation examples, and downloadable MATLAB® codes are included for each case. The book begins by providing background on linear and nonlinear systems, NCS, networked imperfections, distributed systems, adaptive dynamic programming and optimal control, stability theory, and optimal adaptive event-triggered controller design in continuous time and discrete time for linear, nonlinear and distributed systems. It lays the foundation for the use of reinforcement learning-based optimal adaptive controllers over infinite horizons.
The text then:
• Introduces event-triggered control of linear and nonlinear systems, describing the design of adaptive controllers for them
• Presents neural network-based optimal adaptive control and a game-theoretic formulation of linear and nonlinear systems enclosed by a communication network
• Addresses the stochastic optimal control of linear and nonlinear NCS by using neuro-dynamic programming
• Explores optimal adaptive design for nonlinear two-player zero-sum games under communication constraints to solve for the optimal policy and event-trigger condition
• Treats event-sampled distributed linear and nonlinear systems to minimize transmission of state and control signals within the feedback loop via the communication network
• Covers several examples along the way and provides applications to event-triggered control of robot manipulators, UAVs, and distributed joint optimal network scheduling and control design for wireless NCS/CPS in order to realize the Industry 4.0 vision
An ideal textbook for senior undergraduate students, graduate students, university researchers, and practicing engineers, Optimal Event-Triggered Control Using Adaptive Dynamic Programming instills a solid understanding of neural network-based optimal controllers under event sampling and how to build them so as to attain the CPS or Industry 4.0 vision.
