Browse Results

Showing 15,051 through 15,075 of 28,140 results

Marketing Database Analytics: Transforming Data for Competitive Advantage

by Andrew D. Banasiewicz

Marketing Database Analytics presents a step-by-step process for understanding and interpreting data in order to gain insights that drive business decisions. A core element of measuring marketing effectiveness is the collection of appropriate data, but that data is nothing but numbers unless it is analyzed meaningfully. Focusing specifically on quantitative marketing metrics, the book:

• Covers the full spectrum of marketing analytics, from initial data setup and exploration to segmentation, behavioral predictions, and impact quantification
• Establishes the importance of database analytics, integrating both business and marketing practice
• Provides a theoretical framework that explains the concepts and delivers techniques for analyzing data
• Includes cases and exercises to guide students’ learning

Banasiewicz integrates knowledge from both his academic training and professional experience, providing a thorough, comprehensive approach that will serve graduate students of marketing research and analytics well.

Marketing and Smart Technologies: Proceedings of ICMarkTech 2021, Volume 1 (Smart Innovation, Systems and Technologies #279)

by Luiz Moutinho, José Luís Reis, Eduardo Parra López, and José Paulo Marques dos Santos

This book includes selected papers presented at the International Conference on Marketing and Technologies (ICMarkTech 2021), held at University of La Laguna, Tenerife, Spain, during December 2–4, 2021. It covers up-to-date cutting-edge research on artificial intelligence applied in marketing, virtual and augmented reality in marketing, business intelligence databases and marketing, data mining and big data, marketing data science, web marketing, e-commerce and v-commerce, social media and networking, geomarketing and IoT, marketing automation and inbound marketing, machine learning applied to marketing, customer data management and CRM, and neuromarketing technologies.

Marketing and Smart Technologies: Proceedings of ICMarkTech 2022, Volume 1 (Smart Innovation, Systems and Technologies #344)

by Luís Paulo Reis, José Luís Reis, José Paulo Marques dos Santos, and Marisa Del Rio Araujo

This book includes selected papers presented at the International Conference on Marketing and Technologies (ICMarkTech 2022), held at Universidade de Santiago de Compostela, Spain, during December 1–3, 2022. It covers up-to-date cutting-edge research on artificial intelligence applied in marketing, virtual and augmented reality in marketing, business intelligence databases and marketing, data mining and big data, marketing data science, web marketing, e-commerce and v-commerce, social media and networking, geomarketing and IoT, marketing automation and inbound marketing, machine learning applied to marketing, customer data management and CRM, and neuromarketing technologies.

Marketing and Smart Technologies: Proceedings of ICMarkTech 2022, Volume 2 (Smart Innovation, Systems and Technologies #337)

by José Luís Reis, Marc K. Peter, Zorica Bogdanović, and José Antonio Varela González

This book includes selected papers presented at the International Conference on Marketing and Technologies (ICMarkTech 2022), held at Universidade de Santiago de Compostela, Spain, during December 1–3, 2022. It covers up-to-date cutting-edge research on artificial intelligence applied in marketing, virtual and augmented reality in marketing, business intelligence databases and marketing, data mining and big data, marketing data science, web marketing, e-commerce and v-commerce, social media and networking, geomarketing and IoT, marketing automation and inbound marketing, machine learning applied to marketing, customer data management and CRM, and neuromarketing technologies.

Marketing and Smart Technologies: Proceedings of ICMarkTech 2023, Volume 1 (Smart Innovation, Systems and Technologies #386)

by José Luís Reis, José Paulo Marques dos Santos, Jiří Zelený, and Beáta Gavurová

This book includes selected papers presented at the International Conference on Marketing and Technologies (ICMarkTech 2023), held at Faculty of Economics and Management (FEM), Czech University of Life Sciences Prague (CZU), in partnership with University College Prague (UCP), in Prague, Czech Republic, between 30 November and 2 December 2023. It covers up-to-date cutting-edge research on artificial intelligence applied in marketing, virtual and augmented reality in marketing, business intelligence databases and marketing, data mining and big data, marketing data science, web marketing, e-commerce and v-commerce, social media and networking, geomarketing and IoT, marketing automation and inbound marketing, machine learning applied to marketing, customer data management and CRM, and neuromarketing technologies.

Marketing and Smart Technologies: Proceedings of ICMarkTech 2023, Volume 2 (Smart Innovation, Systems and Technologies #393)

by Luís Paulo Reis, José Luís Reis, Marc K. Peter, and Zorica Bogdanović

This book includes selected papers presented at the International Conference on Marketing and Technologies (ICMarkTech 2023), held at Faculty of Economics and Management (FEM), Czech University of Life Sciences Prague (CZU), in partnership with University College Prague (UCP), in Prague, Czech Republic, between 30 November and 2 December 2023. It covers up-to-date cutting-edge research on artificial intelligence applied in marketing, virtual and augmented reality in marketing, business intelligence databases and marketing, data mining and big data, marketing data science, web marketing, e-commerce and v-commerce, social media and networking, geomarketing and IoT, marketing automation and inbound marketing, machine learning applied to marketing, customer data management and CRM, and neuromarketing technologies.

Marketing to the Aging Population: Strategies and Tools for Companies in Various Industries (Management for Professionals)

by George P. Moschis

This book coaches marketing practitioners and students on how best to satisfy the needs of the older consumer population. It first highlights the heterogeneity of the older consumer market, then examines the specific needs of the older consumer. Lastly, it describes the most effective ways of reaching and serving older consumer segments for products and services such as financial services, food and beverages, healthcare and pharmaceuticals, and travel, among others. It presents segment-to-industry-specific strategies that help marketers develop more refined and targeted micro-marketing strategies and customer relationship management (CRM) systems for building and retaining a large base of older customers. These strategies also demonstrate how companies can make decisions that increase profitability not only by satisfying consumer needs and wants, but also by creating positive change and improvement in consumer well-being.

Markov Chain Aggregation for Agent-Based Models

by Sven Banisch

This self-contained text develops a Markov chain approach that makes possible the rigorous analysis of a class of microscopic models specifying the dynamics of complex systems at the individual level. It presents a general framework of aggregation in agent-based and related computational models, one which makes use of lumpability and information theory in order to link the micro and macro levels of observation. The starting point is a microscopic Markov chain description of the dynamical process in complete correspondence with the dynamical behavior of the agent-based model (ABM), obtained by considering the set of all possible agent configurations as the state space of a huge Markov chain. An explicit formal representation of the resulting "micro-chain", including microscopic transition rates, is derived for a class of models by using the random mapping representation of a Markov process. The type of probability distribution used to implement the stochastic part of the model, which defines the updating rule and governs the dynamics at the Markovian level, plays a crucial part in the analysis of "voter-like" models used in population genetics, evolutionary game theory and social dynamics. The book demonstrates that the problem of aggregation in ABMs, and the lumpability conditions in particular, can be embedded into a more general framework that employs information theory in order to identify different levels and relevant scales in complex dynamical systems.

Markov Chain Monte Carlo Methods in Quantum Field Theories: A Modern Primer (SpringerBriefs in Physics)

by Anosh Joseph

This primer is a comprehensive collection of analytical and numerical techniques that can be used to extract the non-perturbative physics of quantum field theories. The intriguing connection between Euclidean Quantum Field Theories (QFTs) and statistical mechanics can be used to apply Markov Chain Monte Carlo (MCMC) methods to investigate strongly coupled QFTs. The overwhelming amount of reliable results coming from the field of lattice quantum chromodynamics stands out as an excellent example of MCMC methods in QFTs in action. MCMC methods have revealed the non-perturbative phase structures, symmetry breaking, and bound states of particles in QFTs. The applications also resulted in new outcomes due to cross-fertilization with research areas such as AdS/CFT correspondence in string theory and condensed matter physics. The book is aimed at advanced undergraduate students and graduate students in physics and applied mathematics, and researchers in MCMC simulations and QFTs. At the end of this book the reader will be able to apply the techniques learned to produce more independent and novel research in the field.
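The Markov Chain Monte Carlo machinery this primer builds on can be illustrated with a minimal random-walk Metropolis sampler. The target density, step size, and sample count below are illustrative choices, not taken from the book:

```python
import math
import random

random.seed(0)

def metropolis(log_prob, x0, n_samples, step=1.0):
    """Random-walk Metropolis: propose x' = x + U(-step, step) and accept
    with probability min(1, p(x') / p(x)), computed in log space."""
    x = x0
    samples = []
    for _ in range(n_samples):
        proposal = x + random.uniform(-step, step)
        # Accept or reject based on the log-density ratio.
        if math.log(random.random()) < log_prob(proposal) - log_prob(x):
            x = proposal
        samples.append(x)
    return samples

# Target: unnormalized standard normal, log p(x) = -x^2 / 2.
samples = metropolis(lambda x: -0.5 * x * x, x0=0.0, n_samples=50_000)
mean = sum(samples) / len(samples)
var = sum((s - mean) ** 2 for s in samples) / len(samples)
```

The sample mean and variance approach 0 and 1, the moments of the target; lattice field theory applications replace the scalar state with a field configuration, but the accept/reject core is the same.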

Markov Chain Monte Carlo in Practice (Chapman & Hall/CRC Interdisciplinary Statistics)

by W. R. Gilks, S. Richardson, and D. J. Spiegelhalter

In a family study of breast cancer, epidemiologists in Southern California increase the power for detecting a gene-environment interaction. In Gambia, a study helps a vaccination program reduce the incidence of Hepatitis B carriage. Archaeologists in Austria place a Bronze Age site in its true temporal location on the calendar scale. And in France,

Markov Chain Monte Carlo: Stochastic Simulation for Bayesian Inference, Second Edition (Chapman & Hall/CRC Texts in Statistical Science)

by Dani Gamerman and Hedibert F. Lopes

While there have been few theoretical contributions on the Markov Chain Monte Carlo (MCMC) methods in the past decade, current understanding and application of MCMC to the solution of inference problems has increased by leaps and bounds. Incorporating changes in theory and highlighting new applications, Markov Chain Monte Carlo: Stochastic Simul

Markov Chains

by J. R. Norris

Markov chains are central to the understanding of random processes. This is not only because they pervade the applications of random processes, but also because one can calculate explicitly many quantities of interest. This textbook, aimed at advanced undergraduate or MSc students with some background in basic probability theory, focuses on Markov chains and quickly develops a coherent and rigorous theory whilst showing also how actually to apply it. Both discrete-time and continuous-time chains are studied. A distinguishing feature is an introduction to more advanced topics such as martingales and potentials in the established context of Markov chains. There are applications to simulation, economics, optimal control, genetics, queues and many other topics, and exercises and examples drawn both from theory and practice. It will therefore be an ideal text either for elementary courses on random processes or those that are more oriented towards applications.
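One of the "quantities of interest" that the blurb says can be calculated explicitly is a hitting probability, found by first-step analysis. The following sketch uses made-up data (a fair gambler's-ruin chain on four states); the function and its fixed-point iteration are illustrative, not taken from the text:

```python
def hitting_probs(P, target, n_iter=200):
    """First-step analysis: h(s) = P(ever hit `target` from s) satisfies
    h(target) = 1 and h(s) = sum_t P[s][t] * h(t) elsewhere.
    Starting from 0 and iterating yields the minimal non-negative solution."""
    n = len(P)
    h = [1.0 if s == target else 0.0 for s in range(n)]
    for _ in range(n_iter):
        h = [1.0 if s == target else sum(P[s][t] * h[t] for t in range(n))
             for s in range(n)]
    return h

# Gambler's ruin on {0, 1, 2, 3}: states 0 and 3 absorb, fair coin between.
P = [
    [1.0, 0.0, 0.0, 0.0],
    [0.5, 0.0, 0.5, 0.0],
    [0.0, 0.5, 0.0, 0.5],
    [0.0, 0.0, 0.0, 1.0],
]
h = hitting_probs(P, target=3)  # h[i] -> i/3 for the fair walk
```

For the fair walk the exact answer is h(i) = i/3, which the iteration recovers to machine precision.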

Markov Chains

by Michael K. Ng, Wai-Ki Ching, Tak-Kuen Siu, and Ximin Huang

This new edition of Markov Chains: Models, Algorithms and Applications has been completely reformatted as a text, complete with end-of-chapter exercises, a new focus on management science, new applications of the models, and new examples with applications in financial risk management and modeling of financial data. The book consists of eight chapters. Chapter 1 gives a brief introduction to the classical theory on both discrete- and continuous-time Markov chains. The relationship between Markov chains of finite states and matrix theory is also highlighted, and some classical iterative methods for solving linear systems are introduced for finding the stationary distribution of a Markov chain. The chapter then covers the basic theory and algorithms for hidden Markov models (HMMs) and Markov decision processes (MDPs). Chapter 2 discusses applications of continuous-time Markov chains to model queueing systems and of discrete-time Markov chains to compute PageRank, the ranking of websites on the Internet. Chapter 3 studies Markovian models for manufacturing and re-manufacturing systems and presents closed-form solutions and fast numerical algorithms for solving the captured systems. In Chapter 4, the authors present a simple hidden Markov model with fast numerical algorithms for estimating the model parameters, together with an application of the HMM to customer classification. Chapter 5 discusses Markov decision processes for customer lifetime value (CLV), an important concept and quantity in marketing management; the authors present an approach based on Markov decision processes for the calculation of CLV using real data. Chapter 6 considers higher-order Markov chain models, particularly a class of parsimonious higher-order Markov chain models, with efficient estimation methods for model parameters based on linear programming, and presents contemporary research results on applications to demand prediction, inventory control and financial risk measurement. In Chapter 7, a class of parsimonious multivariate Markov models is introduced; again, efficient estimation methods based on linear programming are presented, and applications to demand prediction, inventory control policy and modeling credit ratings data are discussed. Finally, Chapter 8 revisits hidden Markov models, and the authors present a new class of hidden Markov models with efficient algorithms for estimating the model parameters; applications to modeling interest rates, credit ratings and default data are discussed. This book is aimed at senior undergraduate students, postgraduate students, professionals, practitioners, and researchers in applied mathematics, computational science, operational research, management science and finance who are interested in the formulation and computation of queueing networks, Markov chain models and related topics. Readers are expected to have some basic knowledge of probability theory, Markov processes and matrix theory.
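The PageRank application mentioned in Chapter 2 can be sketched as a power iteration on a tiny link graph. The graph, damping factor, and dangling-node handling below are illustrative assumptions, not the book's implementation:

```python
def pagerank(links, d=0.85, tol=1e-10):
    """Power iteration for PageRank: the stationary distribution of a
    random surfer who follows a link with probability d and otherwise
    teleports to a uniformly random page."""
    nodes = list(links)
    n = len(nodes)
    rank = {u: 1.0 / n for u in nodes}
    while True:
        new = {u: (1 - d) / n for u in nodes}
        for u in nodes:
            out = links[u]
            if out:
                share = d * rank[u] / len(out)
                for v in out:
                    new[v] += share
            else:  # dangling node: spread its rank uniformly
                for v in nodes:
                    new[v] += d * rank[u] / n
        if max(abs(new[u] - rank[u]) for u in nodes) < tol:
            return new
        rank = new

# Tiny made-up web: A links to B and C, B links to C, C links back to A.
links = {"A": ["B", "C"], "B": ["C"], "C": ["A"]}
ranks = pagerank(links)
```

Real implementations work with sparse matrices over billions of pages, but the fixed point is the same object: the stationary distribution of the damped random-surfer chain, so the ranks sum to 1.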

Markov Chains and Decision Processes for Engineers and Managers

by Theodore J. Sheskin

Recognized as a powerful tool for dealing with uncertainty, Markov modeling can enhance your ability to analyze complex production and service systems. However, most books on Markov chains or decision processes are often either highly theoretical, with few examples, or highly prescriptive, with little justification for the steps of the algorithms u

Markov Chains and Dependability Theory

by Gerardo Rubino and Bruno Sericola

Dependability metrics are omnipresent in every engineering field, from simple ones through to more complex measures combining performance and dependability aspects of systems. This book presents the mathematical basis of the analysis of these metrics in the most used framework, Markov models, describing both basic results and specialized techniques. The authors first present both discrete and continuous time Markov chains before focusing on dependability measures, which necessitate the study of Markov chains on a subset of states representing different user satisfaction levels for the modelled system. Topics covered include Markovian state lumping, analysis of sojourns on subset of states of Markov chains, analysis of most dependability metrics, fundamentals of performability analysis, and bounding and simulation techniques designed to evaluate dependability measures. The book is of interest to graduate students and researchers in all areas of engineering where the concepts of life-time, repair duration, availability, reliability and risk are important.

Markov Chains on Metric Spaces: A Short Course (Universitext)

by Michel Benaïm and Tobias Hurth

This book gives an introduction to discrete-time Markov chains which evolve on a separable metric space. The focus is on the ergodic properties of such chains, i.e., on their long-term statistical behaviour. Among the main topics are existence and uniqueness of invariant probability measures, irreducibility, recurrence, regularizing properties for Markov kernels, and convergence to equilibrium. These concepts are investigated with tools such as Lyapunov functions, petite and small sets, Doeblin and accessible points, coupling, as well as key notions from classical ergodic theory. The theory is illustrated through several recurring classes of examples, e.g., random contractions, randomly switched vector fields, and stochastic differential equations, the latter providing a bridge to continuous-time Markov processes. The book can serve as the core for a semester- or year-long graduate course in probability theory with an emphasis on Markov chains or random dynamics. Some of the material is also well suited for an ergodic theory course. Readers should have taken an introductory course on probability theory, based on measure theory. While there is a chapter devoted to chains on a countable state space, a certain familiarity with Markov chains on a finite state space is also recommended.

Markov Chains: Analytic and Monte Carlo Computations (Wiley Series in Probability and Statistics #593)

by Carl Graham

Markov Chains: Analytic and Monte Carlo Computations introduces the main notions related to Markov chains and provides explanations on how to characterize, simulate, and recognize them. Starting with basic notions, this book leads progressively to advanced and recent topics in the field, allowing the reader to master the main aspects of the classical theory. This book also features:

• Numerous exercises with solutions as well as extended case studies
• A detailed and rigorous presentation of Markov chains with discrete time and state space
• An appendix presenting probabilistic notions that are necessary to the reader, as well as more advanced measure-theoretic notions

Markov Chains: From Theory to Implementation and Experimentation

by Paul A. Gagniuc

A fascinating and instructive guide to Markov chains for experienced users and newcomers alike This unique guide to Markov chains approaches the subject along the four convergent lines of mathematics, implementation, simulation, and experimentation. It introduces readers to the art of stochastic modeling, shows how to design computer implementations, and provides extensive worked examples with case studies. Markov Chains: From Theory to Implementation and Experimentation begins with a general introduction to the history of probability theory in which the author uses quantifiable examples to illustrate how probability theory arrived at the concept of discrete-time and the Markov model from experiments involving independent variables. An introduction to simple stochastic matrices and transition probabilities is followed by a simulation of a two-state Markov chain. The notion of steady state is explored in connection with the long-run distribution behavior of the Markov chain. Predictions based on Markov chains with more than two states are examined, followed by a discussion of the notion of absorbing Markov chains. Also covered in detail are topics relating to the average time spent in a state, various chain configurations, and n-state Markov chain simulations used for verifying experiments involving various diagram configurations. 
• Fascinating historical notes shed light on the key ideas that led to the development of the Markov model and its variants
• Various configurations of Markov chains and their limitations are explored at length
• Numerous examples, from basic to complex, are presented in a comparative manner using a variety of color graphics
• All algorithms presented can be analyzed in Visual Basic, JavaScript, or PHP
• Designed to be useful to professional statisticians as well as readers without extensive knowledge of probability theory

Covering both the theory underlying the Markov model and an array of Markov chain implementations within a common conceptual framework, Markov Chains: From Theory to Implementation and Experimentation is a stimulating introduction to and a valuable reference for those wishing to deepen their understanding of this extremely valuable statistical tool. Paul A. Gagniuc, PhD, is Associate Professor at the Polytechnic University of Bucharest, Romania. He obtained his MS and PhD in genetics at the University of Bucharest. Dr. Gagniuc’s work has been published in numerous high-profile scientific journals, ranging from the Public Library of Science to BioMed Central and Nature journals. He is the recipient of several awards for exceptional scientific results and a highly active figure in the review process for different scientific areas.
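The two-state simulation and steady-state ideas this description mentions can be sketched in a few lines of Python (an illustrative sketch, not the book's Visual Basic, JavaScript, or PHP code; the transition matrix is made up):

```python
import random

random.seed(42)

# Illustrative transition matrix: P[i][j] = P(next state j | current state i).
P = [[0.7, 0.3],
     [0.4, 0.6]]

def simulate(P, steps, state=0):
    """Run the chain and return the fraction of time spent in each state."""
    counts = [0, 0]
    for _ in range(steps):
        state = 0 if random.random() < P[state][0] else 1
        counts[state] += 1
    return [c / steps for c in counts]

# Closed-form steady state for P = [[1-a, a], [b, 1-b]] is (b/(a+b), a/(a+b)).
a, b = P[0][1], P[1][0]
exact = [b / (a + b), a / (a + b)]    # [4/7, 3/7] for this matrix
empirical = simulate(P, 200_000)      # long-run fractions approach `exact`
```

The long-run occupation fractions from the simulation match the closed-form stationary distribution, which is the "steady state" the book explores.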

Markov Chains: Solved Exercises And Elements Of Theory (Springer Series in Operations Research and Financial Engineering)

by Randal Douc, Eric Moulines, Pierre Priouret, and Philippe Soulier

This book covers the classical theory of Markov chains on general state spaces as well as many recent developments. The theoretical results are illustrated by simple examples, many of which are taken from Markov chain Monte Carlo methods. The book is self-contained, and all the results are carefully and concisely proven. Bibliographical notes are added at the end of each chapter to provide an overview of the literature. Part I lays the foundations of the theory of Markov chains on general state spaces. Part II covers the basic theory of irreducible Markov chains on general state spaces, relying heavily on regeneration techniques. These two parts can serve as a text on applied Markov chain theory for general state spaces. Although the choice of topics is quite different from what is usually covered, where most of the emphasis is put on countable state spaces, a graduate student should be able to read almost all these developments without any mathematical background deeper than that needed to study countable state spaces (very little measure theory is required). Part III covers advanced topics on the theory of irreducible Markov chains, with emphasis on geometric and subgeometric convergence rates and on computable bounds; some results appear here in book form for the first time, and others are original. Part IV covers selected topics on Markov chains, mostly recent developments.

Markov Chains: Theory and Applications

by Bruno Sericola

Markov chains are a fundamental class of stochastic processes. They are widely used to solve problems in a large number of domains such as operational research, computer science, communication networks and manufacturing systems. The success of Markov chains is mainly due to their simplicity of use, the large number of available theoretical results and the quality of algorithms developed for the numerical evaluation of many metrics of interest. The author presents the theory of both discrete-time and continuous-time homogeneous Markov chains. He carefully examines the explosion phenomenon, the Kolmogorov equations, the convergence to equilibrium and the passage time distributions to a state and to a subset of states. These results are applied to birth-and-death processes. He then proposes a detailed study of the uniformization technique by means of Banach algebra. This technique is used for the transient analysis of several queueing systems.

Contents
1. Discrete-Time Markov Chains
2. Continuous-Time Markov Chains
3. Birth-and-Death Processes
4. Uniformization
5. Queues

About the Author
Bruno Sericola is a Senior Research Scientist at Inria Rennes – Bretagne Atlantique in France. His main research activity is in performance evaluation of computer and communication systems, dependability analysis of fault-tolerant systems and stochastic models.

Markov Decision Processes

by Martin L. Puterman

The Wiley-Interscience Paperback Series consists of selected books that have been made more accessible to consumers in an effort to increase global appeal and general circulation. With these new unabridged softcover volumes, Wiley hopes to extend the lives of these works by making them available to future generations of statisticians, mathematicians, and scientists.

"This text is unique in bringing together so many results hitherto found only in part in other texts and papers. ... The text is fairly self-contained, inclusive of some basic mathematical results needed, and provides a rich diet of examples, applications, and exercises. The bibliographical material at the end of each chapter is excellent, not only from a historical perspective, but because it is valuable for researchers in acquiring a good perspective of the MDP research potential." (Zentralblatt für Mathematik)

"... it is of great value to advanced-level students, researchers, and professional practitioners of this field to have now a complete volume (with more than 600 pages) devoted to this topic. ... Markov Decision Processes: Discrete Stochastic Dynamic Programming represents an up-to-date, unified, and rigorous treatment of theoretical and computational aspects of discrete-time Markov decision processes." (Journal of the American Statistical Association)

Markov Decision Processes and Stochastic Positional Games: Optimal Control on Complex Networks (International Series in Operations Research & Management Science #349)

by Dmitrii Lozovanu and Stefan Wolfgang Pickl

This book presents recent findings and results concerning the solutions of especially finite state-space Markov decision problems and determining Nash equilibria for related stochastic games with average and total expected discounted reward payoffs. In addition, it focuses on a new class of stochastic games: stochastic positional games that extend and generalize the classic deterministic positional games. It presents new algorithmic results on the suitable implementation of quasi-monotonic programming techniques. Moreover, the book presents applications of positional games within a class of multi-objective discrete control problems and hierarchical control problems on networks. Given its scope, the book will benefit all researchers and graduate students who are interested in Markov theory, control theory, optimization and games.

Markov Decision Processes in Practice

by Richard J. Boucherie and Nico M. van Dijk

This book utilizes classical Markov decision processes (MDPs) for real-life applications and optimization. MDPs allow users to develop and formally support approximate and simple decision rules, and this book showcases state-of-the-art applications in which MDPs were key to the solution approach. The book is divided into six parts. Part 1 is devoted to the state-of-the-art theoretical foundation of MDPs, including approximate methods such as policy improvement, successive approximation and infinite state spaces, as well as an instructive chapter on approximate dynamic programming. It then continues with five parts of specific and non-exhaustive application areas. Part 2 covers MDP healthcare applications, which include different screening procedures, appointment scheduling, ambulance scheduling and blood management. Part 3 explores MDP modeling within transportation, ranging from public to private transportation, from airports and traffic lights to car parking or charging your electric car. Part 4 contains three chapters that illustrate the structure of approximate policies for production or manufacturing structures. In Part 5, communications is highlighted as an important application area for MDPs; it includes Gittins indices, down-to-earth call centers and wireless sensor networks. Finally, Part 6 is dedicated to financial modeling, offering an instructive review of how to account for financial portfolios and derivatives under proportional transaction costs. The MDP applications in this book illustrate a variety of both standard and non-standard aspects of MDP modeling and its practical use. The book should appeal to practitioners, academic researchers, and educators with a background in, among others, operations research, mathematics, computer science, and industrial engineering.
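The successive-approximation method mentioned for Part 1 can be sketched as value iteration on a toy MDP. The two-state machine-maintenance model below is hypothetical, invented for illustration; none of its states, rewards, or probabilities come from the book:

```python
def value_iteration(states, actions, P, R, gamma=0.9, tol=1e-9):
    """Successive approximation: apply the Bellman optimality update
    V(s) <- max_a [ R(s, a) + gamma * sum_s' P(s' | s, a) * V(s') ]
    until the value function stops changing, then read off a greedy policy."""
    V = {s: 0.0 for s in states}
    while True:
        V_new = {
            s: max(R[s][a] + gamma * sum(p * V[t] for t, p in P[s][a].items())
                   for a in actions)
            for s in states
        }
        if max(abs(V_new[s] - V[s]) for s in states) < tol:
            break
        V = V_new
    policy = {
        s: max(actions, key=lambda a: R[s][a]
               + gamma * sum(p * V_new[t] for t, p in P[s][a].items()))
        for s in states
    }
    return V_new, policy

# Hypothetical machine-maintenance MDP: running a good machine earns a high
# reward but risks wear; repairing costs reward now but restores the machine.
states = ["good", "worn"]
actions = ["run", "repair"]
P = {
    "good": {"run": {"good": 0.8, "worn": 0.2}, "repair": {"good": 1.0}},
    "worn": {"run": {"worn": 1.0}, "repair": {"good": 1.0}},
}
R = {
    "good": {"run": 10.0, "repair": 2.0},
    "worn": {"run": 4.0, "repair": 2.0},
}
V, policy = value_iteration(states, actions, P, R)
```

The greedy policy that emerges (run while the machine is good, repair once it is worn) is the kind of simple, formally supported decision rule the book is about.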

Markov Models & Optimization (Chapman & Hall/CRC Monographs on Statistics and Applied Probability #49)

by M.H.A. Davis

This book presents a radically new approach to problems of evaluating and optimizing the performance of continuous-time stochastic systems. This approach is based on the use of a family of Markov processes called Piecewise-Deterministic Processes (PDPs) as a general class of stochastic system models. A PDP is a Markov process that follows deterministic trajectories between random jumps, the latter occurring either spontaneously, in a Poisson-like fashion, or when the process hits the boundary of its state space. This formulation includes an enormous variety of applied problems in engineering, operations research, management science and economics as special cases; examples include queueing systems, stochastic scheduling, inventory control, resource allocation problems, optimal planning of production or exploitation of renewable or non-renewable resources, insurance analysis, fault detection in process systems, and tracking of maneuvering targets, among many others. The first part of the book shows how these applications lead to the PDP as a system model, and the main properties of PDPs are derived. There is particular emphasis on the so-called extended generator of the process, which gives a general method for calculating expectations and distributions of system performance functions. The second half of the book is devoted to control theory for PDPs, with a view to controlling PDP models for optimal performance: characterizations are obtained of optimal strategies both for continuously-acting controllers and for control by intervention (impulse control). Throughout the book, modern methods of stochastic analysis are used, but all the necessary theory is developed from scratch and presented in a self-contained way. The book will be useful to engineers and scientists in the application areas as well as to mathematicians interested in applications of stochastic analysis.

Markov Models for Pattern Recognition

by Gernot A. Fink

This thoroughly revised and expanded new edition now includes a more detailed treatment of the EM algorithm, a description of an efficient approximate Viterbi-training procedure, a theoretical derivation of the perplexity measure and coverage of multi-pass decoding based on n-best search. Supporting the discussion of the theoretical foundations of Markov modeling, special emphasis is also placed on practical algorithmic solutions. Features:

• introduces the formal framework for Markov models
• covers the robust handling of probability quantities
• presents methods for the configuration of hidden Markov models for specific application areas
• describes important methods for efficient processing of Markov models, and the adaptation of the models to different tasks
• examines algorithms for searching within the complex solution spaces that result from the joint application of Markov chain and hidden Markov models
• reviews key applications of Markov models
