Statistical Reinforcement Learning: Modern Machine Learning Approaches (Chapman & Hall/CRC Machine Learning & Pattern Recognition)
by Masashi Sugiyama
Reinforcement learning (RL) is a framework for decision making in unknown environments based on a large amount of data. Several practical RL applications for business intelligence, plant control, and gaming have been successfully explored in recent years. Providing an accessible introduction to the field, this book covers model-based and model-free approaches, policy iteration, and policy search methods. It presents illustrative examples and state-of-the-art results, including dimensionality reduction in RL and risk-sensitive RL. The book provides a bridge between RL and data mining and machine learning research.
Statistical Reliability Engineering: Methods, Models and Applications (Springer Series in Reliability Engineering)
by Hoang Pham
This book presents the state-of-the-art methodology and detailed analytical models and methods used to assess the reliability of complex systems and related applications in statistical reliability engineering. It is a textbook based mainly on the author's recent research and publications, as well as over 30 years of experience in this field. The book covers a wide range of methods and models in reliability and their applications, including: statistical methods and model selection for machine learning; models for maintenance and software reliability; statistical reliability estimation of complex systems; and statistical reliability analysis of k-out-of-n systems, standby systems, and repairable systems. Offering numerous examples and solved problems within each chapter, this comprehensive text provides an introduction for reliability engineering graduate students, a reference for data scientists and reliability engineers, and a thorough guide for researchers and instructors in the field.
Statistical Remedies for Medical Researchers (Springer Series in Pharmaceutical Statistics)
by Peter F. Thall
This book illustrates numerous statistical practices that are commonly used by medical researchers but have severe flaws that may not be obvious. For each example, it provides one or more alternative statistical methods that avoid misleading or incorrect inferences. The technical level is kept to a minimum to make the book accessible to non-statisticians. At the same time, since many of the examples describe methods used routinely by medical statisticians with formal statistical training, the book appeals to a broad readership in the medical research community.
Statistical Research Methods
by Roy Sabo and Edward Boone
This textbook will help graduate students in non-statistics disciplines, advanced undergraduate researchers, and research faculty in the health sciences to learn, use, and communicate results from many commonly used statistical methods. The material covered, and the manner in which it is presented, describe the entire data analysis process from hypothesis generation to writing the results in a manuscript. Chapters cover, among other topics: one- and two-sample proportions, multi-category data, one- and two-sample means, analysis of variance, and regression. Throughout the text, the authors explain statistical procedures and concepts using non-statistical language. This accessible approach is complete with real-world examples and sample write-ups for the Methods and Results sections of scholarly papers. The text also allows for the concurrent use of the programming language R, an open-source program created, maintained, and updated by the statistical community. R is freely available and easy to download.
Statistical Rethinking: A Bayesian Course with Examples in R and Stan (Chapman & Hall/CRC Texts in Statistical Science #122)
by Richard McElreath
Statistical Rethinking: A Bayesian Course with Examples in R and Stan builds readers' knowledge of and confidence in statistical modeling. Reflecting the need for even minor programming in today's model-based statistics, the book pushes readers to perform step-by-step calculations that are usually automated. This unique computational approach ensures that readers understand enough of the details to make reasonable choices and interpretations in their own modeling work.
The text presents generalized linear multilevel models from a Bayesian perspective, relying on a simple logical interpretation of Bayesian probability and maximum entropy. It covers everything from the basics of regression to multilevel models. The author also discusses measurement error, missing data, and Gaussian process models for spatial and network autocorrelation.
By using complete R code examples throughout, this book provides a practical foundation for performing statistical inference. Designed for both PhD students and seasoned professionals in the natural and social sciences, it prepares them for more advanced or specialized statistical modeling.
Statistical Rethinking: A Bayesian Course with Examples in R and STAN (Chapman & Hall/CRC Texts in Statistical Science)
by Richard McElreath
Statistical Rethinking: A Bayesian Course with Examples in R and Stan builds your knowledge of and confidence in making inferences from data. Reflecting the need for scripting in today's model-based statistics, the book pushes you to perform step-by-step calculations that are usually automated. This unique computational approach ensures that you understand enough of the details to make reasonable choices and interpretations in your own modeling work. The text presents causal inference and generalized linear multilevel models from a simple Bayesian perspective that builds on information theory and maximum entropy. The core material ranges from the basics of regression to advanced multilevel models. It also presents measurement error, missing data, and Gaussian process models for spatial and phylogenetic confounding. The second edition emphasizes the directed acyclic graph (DAG) approach to causal inference, integrating DAGs into many examples. The new edition also contains new material on the design of prior distributions, splines, ordered categorical predictors, social relations models, cross-validation, importance sampling, instrumental variables, and Hamiltonian Monte Carlo. It ends with an entirely new chapter that goes beyond generalized linear modeling, showing how domain-specific scientific models can be built into statistical analyses.
Features:
- Integrates working code into the main text
- Illustrates concepts through worked data analysis examples
- Emphasizes understanding assumptions and how assumptions are reflected in code
- Offers more detailed explanations of the mathematics in optional sections
- Presents examples of using the dagitty R package to analyze causal graphs
- Provides the rethinking R package on the author's website and on GitHub
Statistical Robust Design
by Magnus Arner
A uniquely practical approach to robust design from a statistical and engineering perspective. Variation in environment, usage conditions, and the manufacturing process has long presented a challenge in product engineering, and reducing variation is universally recognized as a key to improving reliability and productivity. One key and cost-effective way to achieve this is robust design: making the product as insensitive as possible to variation. With Design for Six Sigma training programs primarily in mind, the author offers practical examples that will help to guide product engineers through every stage of experimental design: formulating problems, planning experiments, and analysing data. He discusses both physical and virtual techniques, and includes numerous exercises and solutions that make the book an ideal resource for teaching or self-study.
- Presents a practical approach to robust design through design of experiments.
- Offers a balance between statistical and industrial aspects of robust design.
- Includes practical exercises, making the book useful for teaching.
- Covers both physical and virtual approaches to robust design.
- Supported by an accompanying website (www.wiley.com/go/robust) featuring MATLAB® scripts and solutions to exercises.
- Written by an experienced industrial design practitioner.
This book's state-of-the-art perspective will be of benefit to practitioners of robust design in industry, consultants providing training in Design for Six Sigma, and quality engineers. It will also be a valuable resource for specialized university courses in statistics or quality engineering.
Statistical Rules of Thumb
by Gerald Van Belle
Sensibly organized for quick reference, Statistical Rules of Thumb, Second Edition compiles simple rules that are widely applicable, robust, and elegant, and each captures key statistical concepts. This unique guide to the use of statistics for designing, conducting, and analyzing research studies illustrates real-world statistical applications through examples from fields such as public health and environmental studies. Along with an insightful discussion of the reasoning behind every technique, this easy-to-use handbook also conveys the various possibilities statisticians must think of when designing and conducting a study or analyzing its data. Each chapter presents clearly defined rules related to inference, covariation, experimental design, consultation, and data representation, and each rule is organized and discussed under five succinct headings: introduction; statement and illustration of the rule; the derivation of the rule; a concluding discussion; and exploration of the concept's extensions. The author also introduces new rules of thumb for topics such as sample size for ratio analysis, absolute and relative risk, ANCOVA cautions, and dichotomization of continuous variables. Additional features of the Second Edition include:
- Additional rules on Bayesian topics
- New chapters on observational studies and Evidence-Based Medicine (EBM)
- Additional emphasis on variation and causation
- Updated material with new references, examples, and sources
A related website, www.vanbelle.org, provides a rich learning environment with additional rules, presentations by the author, and a message board where readers can share their own strategies and discoveries. Statistical Rules of Thumb, Second Edition is an ideal supplementary book for courses in experimental design and survey research methods at the upper-undergraduate and graduate levels. It also serves as an indispensable reference for statisticians, researchers, consultants, and scientists who would like to develop an understanding of the statistical foundations of their research efforts.
Statistical Shape Analysis: with applications in R
by Ian L. Dryden and Kanti V. Mardia
A thoroughly revised and updated edition of this introduction to modern statistical methods for shape analysis. Shape analysis is an important tool in the many disciplines where objects are compared using geometrical features. Examples include comparing brain shape in schizophrenia, investigating protein molecules in bioinformatics, and describing growth of organisms in biology. This book is a significant update of the highly regarded 'Statistical Shape Analysis' by the same authors. The new edition lays the foundations of landmark shape analysis, including geometrical concepts and statistical techniques, and extends to include analysis of curves, surfaces, images and other types of object data. Key definitions and concepts are discussed throughout, and the relative merits of different approaches are presented. The authors have included substantial new material on recent statistical developments and offer numerous examples throughout the text. Concepts are introduced in an accessible manner, while retaining sufficient detail for more specialist statisticians to appreciate the challenges and opportunities of this new field. Computer code has been included for instructional use, along with exercises to enable readers to implement the applications themselves in R and to follow the key ideas by hands-on analysis. Statistical Shape Analysis: with Applications in R will offer a valuable introduction to this fast-moving research area for statisticians and other applied scientists working in diverse areas, including archaeology, bioinformatics, biology, chemistry, computer science, medicine, morphometrics and image analysis.
Statistical Signal Processing
by Debasis Kundu and Swagata Nandi
Signal processing may broadly be considered to involve the recovery of information from physical observations. The received signal is usually disturbed by thermal, electrical, atmospheric or intentional interferences. Due to the random nature of the signal, statistical techniques play an important role in analyzing it. Statistics is also used in the formulation of appropriate models to describe the behavior of the system, the development of appropriate techniques for estimation of model parameters, and the assessment of model performance. Statistical signal processing basically refers to the analysis of random signals using appropriate statistical techniques. The main aim of this book is to introduce different signal processing models which have been used in analyzing periodic data, and the different statistical and computational issues involved in solving them. We discuss in detail the sinusoidal frequency model, which has been used extensively in analyzing periodic data occurring in various fields. We also introduce related models and higher-dimensional statistical signal processing models that have been discussed further in the literature. Different real data sets have been analyzed to illustrate how the different models can be used in practice. Several open problems are indicated for future research.
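To make the sinusoidal frequency model named above concrete, here is a minimal sketch, not drawn from the book: the one-component model y(t) = A cos(wt) + B sin(wt) + noise, with the unknown frequency estimated by maximizing the periodogram over a grid and the amplitudes then obtained by least squares. All data values, parameter choices, and variable names are illustrative assumptions, not the authors' code.

```python
# Hedged illustration (not from the book): one-component sinusoidal frequency model
#   y(t) = A*cos(w*t) + B*sin(w*t) + noise,
# with the frequency w estimated by maximizing the periodogram over a grid.
# All values below are made up for demonstration purposes.
import numpy as np

rng = np.random.default_rng(1)

n = 200
t = np.arange(n)
true_w, true_a, true_b = 0.7, 2.0, 1.0            # hypothetical "true" parameters
y = true_a * np.cos(true_w * t) + true_b * np.sin(true_w * t) + rng.normal(0.0, 1.0, n)

def periodogram(w, y, t):
    """I(w) = |sum_t y(t) * exp(-i*w*t)|^2 / n for a candidate frequency w."""
    return np.abs(np.sum(y * np.exp(-1j * w * t))) ** 2 / len(y)

# Grid search over (0, pi); the maximizer is the periodogram estimator of w.
grid = np.linspace(0.01, np.pi, 5000)
w_hat = grid[np.argmax([periodogram(w, y, t) for w in grid])]

# Given w_hat, the amplitudes A and B follow from ordinary least squares.
X = np.column_stack([np.cos(w_hat * t), np.sin(w_hat * t)])
a_hat, b_hat = np.linalg.lstsq(X, y, rcond=None)[0]

print(f"estimated frequency: {w_hat:.4f} (true value {true_w})")
print(f"estimated amplitudes: A = {a_hat:.2f}, B = {b_hat:.2f}")
```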
Statistical Signal Processing: Frequency Estimation (Springerbriefs In Statistics Ser.)
by Swagata Nandi and Debasis Kundu
This book introduces readers to various signal processing models that have been used in analyzing periodic data, and discusses the statistical and computational methods involved. Signal processing can broadly be considered to be the recovery of information from physical observations. The received signals are usually disturbed by thermal, electrical, atmospheric or intentional interferences, and due to their random nature, statistical techniques play an important role in their analysis. Statistics is also used in the formulation of appropriate models to describe the behavior of systems, the development of appropriate techniques for estimation of model parameters and the assessment of the model performances. Analyzing different real-world data sets to illustrate how different models can be used in practice, and highlighting open problems for future research, the book is a valuable resource for senior undergraduate and graduate students specializing in mathematics or statistics.
Statistical Significance and the PHC Curve
by Hideki Toyoda
This book explains the importance of using the probability that the hypothesis is correct (PHC), an intuitive measure that anyone can understand, as an alternative to the p-value. To help overcome the "reproducibility crisis" caused by the misuse of significance tests, the book provides a detailed explanation of the mechanism of p-hacking with significance tests and concretely shows the merits of PHC as an alternative to p-values. In March 2019, two impactful papers on statistics were published. One paper, "Moving to a World Beyond 'p < 0.05'", appeared in The American Statistician, the journal overseen by the American Statistical Association. The title of the first chapter is "Don't Say 'Statistically Significant'", and it uses the imperative form to clearly forbid the use of significance testing. Another paper, "Retire statistical significance", was published in the prestigious scientific journal Nature. This commentary was endorsed by more than 800 scientists, advocating for the statement, "We agree, and call for the entire concept of statistical significance to be abandoned." Consider a study comparing the duration of hospital stays between treatments A and B. Previously, research conclusions were typically stated as: "There was a statistically significant difference at the 5% level in the average duration of hospital stays." This phrasing is quite abstract. Instead, conclusions can be stated as, for example:
(1) The average duration of hospital stays for Group A is at least half a day shorter than for Group B.
(2) 71% of patients in Group A have shorter hospital stays than the average for Group B.
(3) Group A's average hospital stay is no more than 94% of Group B's.
Then, the probability that each such statement is correct is shown; that is the PHC curve.
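As a rough illustration of the kind of statement PHC replaces a p-value with, here is a minimal sketch, not taken from the book: it approximates the posteriors of two group means under a simple normal model and reports the probability that Group A's average stay is at least half a day shorter than Group B's, then sweeps the threshold to trace out a PHC-style curve. The summary statistics, the normal approximation, and the 0.5-day threshold are all assumptions made purely for illustration.

```python
# Hedged illustration (not the book's method or data): approximating PHC-style
# statements by posterior simulation under a simple normal model.
# All numbers below are hypothetical stand-ins for the hospital-stay example.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical summary data: sample mean, standard deviation, and size per group.
mean_a, sd_a, n_a = 6.0, 2.0, 80    # treatment A (days)
mean_b, sd_b, n_b = 6.7, 2.1, 80    # treatment B (days)

# Approximate posterior draws of each group mean (normal approximation, vague prior).
draws = 100_000
mu_a = rng.normal(mean_a, sd_a / np.sqrt(n_a), draws)
mu_b = rng.normal(mean_b, sd_b / np.sqrt(n_b), draws)

# PHC for "Group A's average stay is at least half a day shorter than Group B's".
phc_half_day = np.mean(mu_b - mu_a >= 0.5)
print(f"PHC(difference >= 0.5 days) = {phc_half_day:.2f}")

# Sweeping the threshold c traces out a PHC curve: PHC(c) = P(mu_B - mu_A >= c | data).
for c in np.linspace(0.0, 1.5, 7):
    print(f"  c = {c:4.2f} days  PHC = {np.mean(mu_b - mu_a >= c):.2f}")
```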
The Statistical Sleuth: A Course in Methods of Data Analysis, Third Edition
by Fred Ramsey and Daniel Schafer
The Statistical Sleuth: A Course in Methods of Data Analysis, Third Edition offers an appealing treatment of general statistical methods that takes full advantage of the computer, both as a computational and an analytical tool. The material is independent of any specific software package, and prominently treats modeling and interpretation in a way that goes beyond routine patterns. The book focuses on a serious analysis of real case studies, strategies and tools of modern statistical data analysis, the interplay of statistics and scientific learning, and the communication of results. With interesting examples, real data, and a variety of exercise types (conceptual, computational, and data problems), the authors get readers excited about statistics.
The Statistical System of Communist China
by Choh-Ming Li
One of the most baffling problems in contemporary Chinese economic studies concerns the validity of official statistics. In the continuing discussion of claims and counter-claims, appeals to common sense are unconvincing. Because of the pressing need for substantial evidence on which to base a judgment, the present inquiry is an important contribution to the literature on Communist China. The book provides a quizzical but objective look at the statistical system of the country, and attempts to appraise the quality of official statistics by analyzing the development and inner workings of the system. Its approach is broadly historical, beginning with the pre-Communist period (before 1949) and dividing the next dozen years into phases: the foundation of the state statistical system (1952-57), the period of decentralization (1958-59), and subsequent efforts at reorganization. Li's study of the development of a national statistical system in China is particularly instructive in delineating both the obstacles to such development that may be expected in a densely populated, largely agricultural country and the measures that have been adopted to overcome them. His hard-headed conclusions concerning the Chinese experience should therefore be of lively interest to those underdeveloped countries that are now planning or executing development programs. This title is part of UC Press's Voices Revived program, which commemorates University of California Press's mission to seek out and cultivate the brightest minds and give them voice, reach, and impact. Drawing on a backlist dating to 1893, Voices Revived makes high-quality, peer-reviewed scholarship accessible once again using print-on-demand technology. This title was originally published in 1962.
Statistical Tableau: How To Use Statistical Models And Decision Science In Tableau
by Ethan Lang
In today's data-driven world, understanding statistical models is crucial for effective analysis and decision making. Whether you're a beginner or an experienced user, this book equips you with the foundational knowledge to grasp and implement statistical models within Tableau. Gain the confidence to speak fluently about the models you employ, driving adoption of your insights and analysis across your organization. As AI continues to revolutionize industries, possessing the skills to leverage statistical models is no longer optional: it's a necessity. Stay ahead of the curve and harness the full potential of your data by mastering the ability to interpret and utilize the insights generated by these models. Whether you're a data enthusiast, analyst, or business professional, this book empowers you to navigate the ever-evolving landscape of data analytics with confidence and proficiency. Start your journey toward data mastery today.
In this book, you will learn:
- The basics of foundational statistical modeling with Tableau
- How to prove your analysis is statistically significant
- How to calculate and interpret confidence intervals
- Best practices for incorporating statistics into data visualizations
- How to connect external analytics resources from Tableau using R and Python
Statistical Techniques for Data Analysis
by John K. Taylor and Cheryl Cihon
Since the first edition of this book appeared, computers have come to the aid of modern experimenters and data analysts, bringing with them data analysis techniques that were once beyond the calculational reach of even professional statisticians. Today, scientists in every field have access to the techniques and technology they need to analyze statistical data.
Statistical Techniques for Neuroscientists (Foundations and Innovations in Neurobiology)
by Young K. Truong and Mechelle M. Lewis
Statistical Techniques for Neuroscientists introduces new and useful methods for data analysis involving simultaneous recording of neuron or large cluster (brain region) neuron activity. The statistical estimation and tests of hypotheses are based on the likelihood principle derived from stationary point processes and time series. Algorithms and software development are given in each chapter to reproduce the computer simulated results described therein. The book examines current statistical methods for solving emerging problems in neuroscience. These methods have been applied to data involving multichannel neural spike trains, spike sorting, blind source separation, functional and effective neural connectivity, spatiotemporal modeling, and multimodal neuroimaging techniques. The author provides an overview of various methods being applied to specific research areas of neuroscience, emphasizing statistical principles and their software. The book includes examples and experimental data so that readers can understand the principles and master the methods. The first part of the book deals with the traditional multivariate time series analysis applied to the context of multichannel spike trains and fMRI, using respectively the probability structures or likelihood associated with time-to-fire and discrete Fourier transforms (DFT) of point processes. The second part introduces a relatively new form of statistical spatiotemporal modeling for fMRI and EEG data analysis. In addition to neural scientists and statisticians, anyone wishing to employ intense computing methods to extract important features and information directly from data rather than relying heavily on models built on leading cases such as linear regression or Gaussian processes will find this book extremely helpful.
Statistical Testing Strategies in the Health Sciences (Chapman & Hall/CRC Biostatistics Series)
by Albert Vexler, Alan D. Hutson, and Xiwei Chen
Statistical Testing Strategies in the Health Sciences provides a compendium of statistical approaches for decision making, ranging from graphical methods and classical procedures through computationally intensive bootstrap strategies to advanced empirical likelihood techniques. It bridges the gap between theoretical statistical methods and practical procedures applied to the planning and analysis of health-related experiments. The book is organized primarily based on the type of questions to be answered by inference procedures or according to the general type of mathematical derivation. It establishes the theoretical framework for each method, with a substantial amount of chapter notes included for additional reference. It then focuses on the practical application for each concept, providing real-world examples that can be easily implemented using corresponding statistical software code in R and SAS. The book also explains the basic elements and methods for constructing correct and powerful statistical decision-making processes to be adapted for complex statistical applications. With techniques spanning robust statistical methods to more computationally intensive approaches, this book shows how to apply correct and efficient testing mechanisms to various problems encountered in medical and epidemiological studies, including clinical trials. Theoretical statisticians, medical researchers, and other practitioners in epidemiology and clinical research will appreciate the book's novel theoretical and applied results. The book is also suitable for graduate students in biostatistics, epidemiology, health-related sciences, and areas pertaining to formal decision-making mechanisms.
Statistical Theory: A Concise Introduction (Chapman & Hall/CRC Texts in Statistical Science #100)
by Felix Abramovich and Ya'acov Ritov
Designed for a one-semester advanced undergraduate or graduate statistical theory course, Statistical Theory: A Concise Introduction, Second Edition clearly explains the underlying ideas, mathematics, and principles of major statistical concepts, including parameter estimation, confidence intervals, hypothesis testing, asymptotic analysis, Bayesian inference, linear models, nonparametric statistics, and elements of decision theory. It introduces these topics on a clear intuitive level, using illustrative examples in addition to the formal definitions, theorems, and proofs. Based on the authors' lecture notes, the book is self-contained and maintains a proper balance between clarity and rigor of exposition. In a few cases, the authors present a "sketched" version of a proof, explaining its main ideas rather than giving detailed technical mathematical and probabilistic arguments.
Features:
- The second edition has been updated with a new chapter on Nonparametric Estimation, a significant update to the chapter on Statistical Decision Theory, and other updates throughout
- No requirement for heavy calculus; simple questions throughout the text help students check their understanding of the material
- Each chapter includes a set of exercises that range in level of difficulty
- Self-contained, so it can be used by students to understand the theory
- Chapters and sections marked by asterisks contain more advanced topics and may be omitted
- Special chapters on linear models and nonparametric statistics show how the main theoretical concepts can be applied to well-known and frequently used statistical tools
The primary audience for the book is students who want to understand the theoretical basis of mathematical statistics, whether advanced undergraduate or graduate students. It will also be an excellent reference for researchers from statistics and other quantitative disciplines.
Statistical Theory (Chapman And Hall/crc Texts In Statistical Science Ser. #22)
by Bernard Lindgren
This classic textbook is suitable for a first course in the theory of statistics for students with a background in calculus, multivariate calculus, and the elements of matrix algebra.
Statistical Thermodynamics and Stochastic Kinetics
by Yiannis N. Kaznessis
Presenting the key principles of thermodynamics from a microscopic point of view, this book provides engineers with the knowledge they need to apply thermodynamics and solve engineering challenges at the molecular level. It clearly explains the concepts of entropy and free energy, emphasizing key ideas used in equilibrium applications, whilst stochastic processes, such as stochastic reaction kinetics, are also covered. It provides a classical microscopic interpretation of thermodynamic properties, which is key for engineers, rather than focusing on more esoteric concepts of statistical mechanics and quantum mechanics. Coverage of molecular dynamics and Monte Carlo simulations as natural extensions of the theoretical treatment of statistical thermodynamics is also included, teaching readers how to use computer simulations and thus enabling them to understand and engineer the microcosm. Featuring many worked examples and over 100 end-of-chapter exercises, it is ideal for use in the classroom as well as for self-study.
Statistical Thermodynamics for Pure and Applied Sciences: Statistical Thermodynamics
by Frederick Richard McCourt
This textbook concerns thermal properties of bulk matter and is aimed at advanced undergraduate or first-year graduate students in a range of programs in science or engineering. It provides an intermediate-level presentation of statistical thermodynamics for students in the physical sciences (chemistry, nanosciences, physics) or related areas of applied science/engineering (chemical engineering, materials science, nanotechnology engineering), as these are areas in which statistical mechanical concepts play important roles. The book enables students to utilize microscopic concepts to achieve a better understanding of macroscopic phenomena and to apply these concepts to the types of sub-macroscopic systems encountered in areas of nanoscience and nanotechnology.
Statistical Thinking: Improving Business Performance (Wiley and SAS Business Series #58)
by Roger W. Hoerl and Ronald D. Snee
Apply statistics in business to achieve performance improvement. Statistical Thinking: Improving Business Performance, 3rd Edition helps managers understand the role of statistics in implementing business improvements. It guides professionals who are learning statistics in order to improve performance in business and industry. It also helps graduate and undergraduate students understand the strategic value of data and statistics in arriving at real business solutions. Instruction in the book is based on principles of effective learning, established by educational and behavioral research. The authors cover both practical examples and underlying theory, both the big picture and necessary details. Readers gain a conceptual understanding and the ability to perform actionable analyses. They are introduced to data skills to improve business processes, including collecting the appropriate data, identifying existing data limitations, and analyzing data graphically. The authors also provide an in-depth look at JMP software, including its purpose, capabilities, and techniques for use.
Updates to this edition include:
- A new chapter on data, assessing data pedigree (quality), and acquisition tools
- Discussion of the relationship between statistical thinking and data science
- Explanation of the proper role and interpretation of p-values (understanding of the dangers of "p-hacking")
- Differentiation between practical and statistical significance
- Introduction of the emerging discipline of statistical engineering
- Explanation of the proper role of subject matter theory in order to identify causal relationships
- A holistic framework for variation that includes outliers, in addition to systematic and random variation
- Revised chapters based on significant teaching experience
- Content enhancements based on student input
This book helps readers understand the role of statistics in business before they embark on learning statistical techniques.