Browse Results

Showing 22,301 through 22,325 of 25,291 results

Statistical Rules of Thumb

by Gerald Van Belle

Sensibly organized for quick reference, Statistical Rules of Thumb, Second Edition compiles simple rules that are widely applicable, robust, and elegant, each capturing a key statistical concept. This unique guide to the use of statistics for designing, conducting, and analyzing research studies illustrates real-world statistical applications through examples from fields such as public health and environmental studies. Along with an insightful discussion of the reasoning behind every technique, this easy-to-use handbook also conveys the various possibilities statisticians must consider when designing and conducting a study or analyzing its data. Each chapter presents clearly defined rules related to inference, covariation, experimental design, consultation, and data representation, and each rule is organized and discussed under five succinct headings: introduction; statement and illustration of the rule; derivation of the rule; a concluding discussion; and exploration of the concept's extensions. The author also introduces new rules of thumb for topics such as sample size for ratio analysis, absolute and relative risk, ANCOVA cautions, and dichotomization of continuous variables. Additional features of the Second Edition include: additional rules on Bayesian topics; new chapters on observational studies and Evidence-Based Medicine (EBM); additional emphasis on variation and causation; and updated material with new references, examples, and sources. A related Web site provides a rich learning environment and contains additional rules, presentations by the author, and a message board where readers can share their own strategies and discoveries. Statistical Rules of Thumb, Second Edition is an ideal supplementary book for courses in experimental design and survey research methods at the upper-undergraduate and graduate levels.
It also serves as an indispensable reference for statisticians, researchers, consultants, and scientists who would like to develop an understanding of the statistical foundations of their research efforts. A related website, www.vanbelle.org, provides additional rules, author presentations, and more.

Statistical Shape Analysis: with applications in R

by Ian L. Dryden and Kanti V. Mardia

A thoroughly revised and updated edition of this introduction to modern statistical methods for shape analysis. Shape analysis is an important tool in the many disciplines where objects are compared using geometrical features. Examples include comparing brain shape in schizophrenia; investigating protein molecules in bioinformatics; and describing growth of organisms in biology. This book is a significant update of the highly regarded ‘Statistical Shape Analysis’ by the same authors. The new edition lays the foundations of landmark shape analysis, including geometrical concepts and statistical techniques, and extends to include analysis of curves, surfaces, images and other types of object data. Key definitions and concepts are discussed throughout, and the relative merits of different approaches are presented. The authors have included substantial new material on recent statistical developments and offer numerous examples throughout the text. Concepts are introduced in an accessible manner, while retaining sufficient detail for more specialist statisticians to appreciate the challenges and opportunities of this new field. Computer code has been included for instructional use, along with exercises to enable readers to implement the applications themselves in R and to follow the key ideas by hands-on analysis. Statistical Shape Analysis: with Applications in R will offer a valuable introduction to this fast-moving research area for statisticians and other applied scientists working in diverse areas, including archaeology, bioinformatics, biology, chemistry, computer science, medicine, morphometrics and image analysis.

Statistical Signal Processing

by Debasis Kundu and Swagata Nandi

Signal processing may broadly be considered to involve the recovery of information from physical observations. The received signal is usually disturbed by thermal, electrical, atmospheric or intentional interferences. Due to the random nature of the signal, statistical techniques play an important role in analyzing the signal. Statistics is also used in the formulation of appropriate models to describe the behavior of the system, the development of appropriate techniques for estimation of model parameters and the assessment of model performance. Statistical signal processing basically refers to the analysis of random signals using appropriate statistical techniques. The main aim of this book is to introduce different signal processing models which have been used in analyzing periodic data, and the different statistical and computational issues involved in solving them. We discuss in detail the sinusoidal frequency model, which has been used extensively in analyzing periodic data occurring in various fields. We have tried to introduce different associated models and higher dimensional statistical signal processing models which have been further discussed in the literature. Different real data sets have been analyzed to illustrate how different models can be used in practice. Several open problems have been indicated for future research.

Statistical Signal Processing: Frequency Estimation (Springerbriefs In Statistics Ser.)

by Swagata Nandi and Debasis Kundu

This book introduces readers to various signal processing models that have been used in analyzing periodic data, and discusses the statistical and computational methods involved. Signal processing can broadly be considered to be the recovery of information from physical observations. The received signals are usually disturbed by thermal, electrical, atmospheric or intentional interferences, and due to their random nature, statistical techniques play an important role in their analysis. Statistics is also used in the formulation of appropriate models to describe the behavior of systems, the development of appropriate techniques for estimation of model parameters and the assessment of the model performances. Analyzing different real-world data sets to illustrate how different models can be used in practice, and highlighting open problems for future research, the book is a valuable resource for senior undergraduate and graduate students specializing in mathematics or statistics.

Statistical Significance and the PHC Curve

by Hideki Toyoda

This book explains the importance of using the probability that the hypothesis is correct (PHC), an intuitive measure that anyone can understand, as an alternative to the p-value. In order to overcome the “reproducibility crisis” caused by the misuse of significance tests, this book provides a detailed explanation of the mechanism of p-hacking using significance tests, and concretely shows the merits of PHC as an alternative to p-values. In March 2019, two impactful papers on statistics were published. One paper, “Moving to a World Beyond ‘p < 0.05’”, appeared in The American Statistician, overseen by the American Statistical Association. The title of its first chapter is “Don't Say ‘Statistically Significant’”, and it uses the imperative form to clearly forbid the use of significance testing. Another paper, “Retire statistical significance”, was published in the prestigious scientific journal Nature. This commentary was endorsed by more than 800 scientists, advocating for the statement, “We agree, and call for the entire concept of statistical significance to be abandoned.” Consider a study comparing the duration of hospital stays between treatments A and B. Previously, research conclusions were typically stated as: “There was a statistically significant difference at the 5% level in the average duration of hospital stays.” This phrasing is quite abstract. Instead, we present conclusions such as the following: (1) the average duration of hospital stays for Group A is at least half a day shorter than for Group B; (2) 71% of patients in Group A have shorter hospital stays than the average for Group B; (3) Group A's average hospital stay is no more than 94% of that of Group B. The probability that each such statement is correct is then shown. That is the PHC curve.

The Statistical Sleuth: A Course in Methods of Data Analysis, Third Edition

by Fred Ramsey Daniel Schafer

THE STATISTICAL SLEUTH: A COURSE IN METHODS OF DATA ANALYSIS, Third Edition offers an appealing treatment of general statistical methods that takes full advantage of the computer, both as a computational and an analytical tool. The material is independent of any specific software package, and prominently treats modeling and interpretation in a way that goes beyond routine patterns. The book focuses on a serious analysis of real case studies, strategies and tools of modern statistical data analysis, the interplay of statistics and scientific learning, and the communication of results. With interesting examples, real data, and a variety of exercise types (conceptual, computational, and data problems), the authors get readers excited about statistics.

The Statistical System of Communist China

by Choh-Ming Li

One of the most baffling problems in contemporary Chinese economic studies concerns the validity of official statistics. In the continuing discussion of claims and counter-claims, appeals to common sense are unconvincing. Because of the pressing need for substantial evidence on which to base a judgment, the present inquiry is an important contribution to the literature on Communist China. The book provides a quizzical but objective look at the statistical system of the country, and attempts to appraise the quality of official statistics by analyzing the development and inner working of the system. Its approach is broadly historical, beginning with the pre-Communist period (before 1949) and dividing the next dozen years into phases: the foundation of the state statistical system (1952 - 57), the period of decentralization (1958 - 59), and subsequent efforts at reorganization. Li's study of the development of a national statistical system in China is particularly instructive in delineating both the obstacles to such development that may be expected in a densely populated, largely agricultural country and the measures that have been adopted to overcome them. His hard-headed conclusions concerning the Chinese experience should therefore be of lively interest to those underdeveloped countries that are now planning or executing development programs. This title is part of UC Press's Voices Revived program, which commemorates University of California Press's mission to seek out and cultivate the brightest minds and give them voice, reach, and impact. Drawing on a backlist dating to 1893, Voices Revived makes high-quality, peer-reviewed scholarship accessible once again using print-on-demand technology. This title was originally published in 1962.

Statistical Tableau: How To Use Statistical Models And Decision Science In Tableau

by Ethan Lang

In today's data-driven world, understanding statistical models is crucial for effective analysis and decision making. Whether you're a beginner or an experienced user, this book equips you with the foundational knowledge to grasp and implement statistical models within Tableau. Gain the confidence to speak fluently about the models you employ, driving adoption of your insights and analysis across your organization. As AI continues to revolutionize industries, possessing the skills to leverage statistical models is no longer optional—it's a necessity. Stay ahead of the curve and harness the full potential of your data by mastering the ability to interpret and utilize the insights generated by these models. Whether you're a data enthusiast, analyst, or business professional, this book empowers you to navigate the ever-evolving landscape of data analytics with confidence and proficiency. Start your journey toward data mastery today. In this book, you will learn: the basics of foundational statistical modeling with Tableau; how to prove your analysis is statistically significant; how to calculate and interpret confidence intervals; best practices for incorporating statistics into data visualizations; and how to connect external analytics resources from Tableau using R and Python.

Statistical Techniques for Data Analysis

by John K. Taylor and Cheryl Cihon

Since the first edition of this book appeared, computers have come to the aid of modern experimenters and data analysts, bringing with them data analysis techniques that were once beyond the calculational reach of even professional statisticians. Today, scientists in every field have access to the techniques and technology they need to analyze stat

Statistical Techniques for Neuroscientists (Foundations and Innovations in Neurobiology)

by Young K. Truong and Mechelle M. Lewis

Statistical Techniques for Neuroscientists introduces new and useful methods for data analysis involving simultaneous recording of neuron or large cluster (brain region) neuron activity. The statistical estimation and tests of hypotheses are based on the likelihood principle derived from stationary point processes and time series. Algorithms and software development are given in each chapter to reproduce the computer simulated results described therein. The book examines current statistical methods for solving emerging problems in neuroscience. These methods have been applied to data involving multichannel neural spike train, spike sorting, blind source separation, functional and effective neural connectivity, spatiotemporal modeling, and multimodal neuroimaging techniques. The author provides an overview of various methods being applied to specific research areas of neuroscience, emphasizing statistical principles and their software. The book includes examples and experimental data so that readers can understand the principles and master the methods. The first part of the book deals with the traditional multivariate time series analysis applied to the context of multichannel spike trains and fMRI using respectively the probability structures or likelihood associated with time-to-fire and discrete Fourier transforms (DFT) of point processes. The second part introduces a relatively new form of statistical spatiotemporal modeling for fMRI and EEG data analysis. In addition to neural scientists and statisticians, anyone wishing to employ intense computing methods to extract important features and information directly from data rather than relying heavily on models built on leading cases such as linear regression or Gaussian processes will find this book extremely helpful.

Statistical Testing Strategies in the Health Sciences (Chapman & Hall/CRC Biostatistics Series)

by Albert Vexler, Alan D. Hutson, and Xiwei Chen

Statistical Testing Strategies in the Health Sciences provides a compendium of statistical approaches for decision making, ranging from graphical methods and classical procedures through computationally intensive bootstrap strategies to advanced empirical likelihood techniques. It bridges the gap between theoretical statistical methods and practical procedures applied to the planning and analysis of health-related experiments. The book is organized primarily based on the type of questions to be answered by inference procedures or according to the general type of mathematical derivation. It establishes the theoretical framework for each method, with a substantial amount of chapter notes included for additional reference. It then focuses on the practical application for each concept, providing real-world examples that can be easily implemented using corresponding statistical software code in R and SAS. The book also explains the basic elements and methods for constructing correct and powerful statistical decision-making processes to be adapted for complex statistical applications. With techniques spanning robust statistical methods to more computationally intensive approaches, this book shows how to apply correct and efficient testing mechanisms to various problems encountered in medical and epidemiological studies, including clinical trials. Theoretical statisticians, medical researchers, and other practitioners in epidemiology and clinical research will appreciate the book’s novel theoretical and applied results. The book is also suitable for graduate students in biostatistics, epidemiology, health-related sciences, and areas pertaining to formal decision-making mechanisms.

Statistical Theory: A Concise Introduction (Chapman & Hall/CRC Texts in Statistical Science #100)

by Felix Abramovich and Ya'acov Ritov

Designed for a one-semester advanced undergraduate or graduate statistical theory course, Statistical Theory: A Concise Introduction, Second Edition clearly explains the underlying ideas, mathematics, and principles of major statistical concepts, including parameter estimation, confidence intervals, hypothesis testing, asymptotic analysis, Bayesian inference, linear models, nonparametric statistics, and elements of decision theory. It introduces these topics on a clear intuitive level using illustrative examples in addition to the formal definitions, theorems, and proofs. Based on the authors’ lecture notes, the book is self-contained and maintains a proper balance between clarity and rigor of exposition. In a few cases, the authors present a "sketched" version of a proof, explaining its main ideas rather than giving detailed technical mathematical and probabilistic arguments. Features: the second edition has been updated with a new chapter on Nonparametric Estimation, a significant update to the chapter on Statistical Decision Theory, and other updates throughout; no heavy calculus is required, and simple questions throughout the text help students check their understanding of the material; each chapter includes a set of exercises that range in level of difficulty; the book is self-contained and can be used by students to understand the theory; chapters and sections marked by asterisks contain more advanced topics and may be omitted; and special chapters on linear models and nonparametric statistics show how the main theoretical concepts can be applied to well-known and frequently used statistical tools. The primary audience for the book is students who want to understand the theoretical basis of mathematical statistics—either advanced undergraduate or graduate students. It will also be an excellent reference for researchers from statistics and other quantitative disciplines.

Statistical Theory (Chapman And Hall/crc Texts In Statistical Science Ser. #22)

by Bernard Lindgren

This classic textbook is suitable for a first course in the theory of statistics for students with a background in calculus, multivariate calculus, and the elements of matrix algebra.

Statistical Thermodynamics and Stochastic Kinetics

by Yiannis N. Kaznessis

Presenting the key principles of thermodynamics from a microscopic point of view, this book provides engineers with the knowledge they need to apply thermodynamics and solve engineering challenges at the molecular level. It clearly explains the concepts of entropy and free energy, emphasizing key ideas used in equilibrium applications, whilst stochastic processes, such as stochastic reaction kinetics, are also covered. It provides a classical microscopic interpretation of thermodynamic properties, which is key for engineers, rather than focusing on more esoteric concepts of statistical mechanics and quantum mechanics. Coverage of molecular dynamics and Monte Carlo simulations as natural extensions of the theoretical treatment of statistical thermodynamics is also included, teaching readers how to use computer simulations and thus enabling them to understand and engineer the microcosm. Featuring many worked examples and over 100 end-of-chapter exercises, it is ideal for use in the classroom as well as for self-study.

Statistical Thermodynamics for Pure and Applied Sciences: Statistical Thermodynamics

by Frederick Richard McCourt

This textbook concerns thermal properties of bulk matter and is aimed at advanced undergraduate or first-year graduate students in a range of programs in science or engineering. It provides an intermediate level presentation of statistical thermodynamics for students in the physical sciences (chemistry, nanosciences, physics) or related areas of applied science/engineering (chemical engineering, materials science, nanotechnology engineering), as they are areas in which statistical mechanical concepts play important roles. The book enables students to utilize microscopic concepts to achieve a better understanding of macroscopic phenomena and to be able to apply these concepts to the types of sub-macroscopic systems encountered in areas of nanoscience and nanotechnology.

Statistical Thinking: Improving Business Performance (Wiley and SAS Business Series #58)

by Roger W. Hoerl and Ronald D. Snee

Apply statistics in business to achieve performance improvement. Statistical Thinking: Improving Business Performance, 3rd Edition helps managers understand the role of statistics in implementing business improvements. It guides professionals who are learning statistics in order to improve performance in business and industry. It also helps graduate and undergraduate students understand the strategic value of data and statistics in arriving at real business solutions. Instruction in the book is based on principles of effective learning, established by educational and behavioral research. The authors cover both practical examples and underlying theory, both the big picture and necessary details. Readers gain a conceptual understanding and the ability to perform actionable analyses. They are introduced to data skills to improve business processes, including collecting the appropriate data, identifying existing data limitations, and analyzing data graphically. The authors also provide an in-depth look at JMP software, including its purpose, capabilities, and techniques for use. Updates to this edition include: a new chapter on data, assessing data pedigree (quality), and acquisition tools; discussion of the relationship between statistical thinking and data science; explanation of the proper role and interpretation of p-values (and the dangers of “p-hacking”); differentiation between practical and statistical significance; introduction of the emerging discipline of statistical engineering; explanation of the proper role of subject matter theory in identifying causal relationships; a holistic framework for variation that includes outliers, in addition to systematic and random variation; revised chapters based on significant teaching experience; and content enhancements based on student input. This book helps readers understand the role of statistics in business before they embark on learning statistical techniques.

Statistical Thinking

by Ron D. Snee and Roger Hoerl

How statistical thinking and methodology can help you make crucial business decisions. Straightforward and insightful, Statistical Thinking: Improving Business Performance, Second Edition, prepares you for business leadership by developing your capacity to apply statistical thinking to improve business processes. Unique and compelling, this book shows you how to derive actionable conclusions from data analysis, solve real problems, and improve real processes. Here, you'll discover how to implement statistical thinking and methodology in your work to improve business performance. The book explores why statistical thinking is necessary and helpful; provides case studies that illustrate how to integrate several statistical tools into the decision-making process; and facilitates and encourages an experiential learning environment to enable you to apply the material to actual problems. With an in-depth discussion of JMP® software, the new edition of this important book focuses on skills to improve business processes, including collecting data appropriate for a specified purpose, recognizing limitations in existing data, and understanding the limitations of statistical analyses.

Statistical Thinking for Non-Statisticians in Drug Regulation

by Richard Kay

Statistical methods in the pharmaceutical industry are accepted as a key element in the design and analysis of clinical studies. Increasingly, the medical and scientific community is aligning with the regulatory authorities and recognizing that correct statistical methodology is essential as the basis for valid conclusions. In order for those correct and robust methods to be successfully employed, there needs to be effective communication across disciplines at all stages of the planning, conducting, analyzing and reporting of clinical studies associated with the development and evaluation of new drugs and devices. Statistical Thinking for Non-Statisticians in Drug Regulation provides a comprehensive in-depth guide to statistical methodology for pharmaceutical industry professionals, including physicians, investigators, medical science liaisons, clinical research scientists, medical writers, regulatory personnel, statistical programmers, senior data managers and those working in pharmacovigilance. The author’s years of experience and up-to-date familiarity with pharmaceutical regulations and statistical practice within the wider clinical community make this an essential guide for those working in and with the industry.
The third edition of Statistical Thinking for Non-Statisticians in Drug Regulation includes: a detailed new chapter on estimands, in line with the 2019 Addendum to ICH E9; major new sections on topics including combining hierarchical testing and alpha adjustment, biosimilars, restricted mean survival time, composite endpoints and cumulative incidence functions, adjusting for cross-over in oncology, inverse propensity score weighting, and network meta-analysis; and updated coverage of many existing topics to reflect new and revised guidance from regulatory authorities and author experience. Statistical Thinking for Non-Statisticians in Drug Regulation is a valuable guide for pharmaceutical and medical device industry professionals, as well as statisticians joining the pharmaceutical industry and students and teachers of drug development.

Statistical Thinking in Clinical Trials (Chapman & Hall/CRC Biostatistics Series)

by Michael A. Proschan

Statistical Thinking in Clinical Trials combines a relatively small number of key statistical principles and several instructive clinical trials to gently guide the reader through the statistical thinking needed in clinical trials. Randomization is the cornerstone of clinical trials, and randomization-based inference is the cornerstone of this book. Read this book to learn the elegance and simplicity of re-randomization tests as the basis for statistical inference (the analyze-as-you-randomize principle) and see how re-randomization tests can save a trial that required an unplanned, mid-course design change. Other principles enable the reader to quickly and confidently check calculations without relying on computer programs. The ‘EZ’ principle says that a single sample size formula can be applied to a multitude of statistical tests. The ‘O minus E except after V’ principle provides a simple estimator of the log odds ratio that is ideally suited for stratified analysis with a binary outcome. The same principle can be used to estimate the log hazard ratio and facilitate stratified analysis in a survival setting. Learn these and other simple techniques that will make you an invaluable clinical trial statistician.

Statistical Tools for Measuring Agreement

by Wenting Wu, Lawrence Lin, and A. S. Hedayat

Agreement assessment techniques are widely used in examining the acceptability of a new or generic process, methodology and/or formulation in areas of lab performance, instrument/assay validation or method comparisons, statistical process control, goodness-of-fit, and individual bioequivalence. Successful applications in these situations require a sound understanding of both the underlying theory and methodological advances in handling real-life problems. This book seeks to effectively blend theory and applications while presenting readers with many practical examples. For instance, in the medical device environment, it is important to know if a newly established lab can reproduce the instrument/assay results of the established but outdated lab. When there is a disagreement, it is important to differentiate the sources of disagreement. In addition to agreement coefficients, accuracy and precision coefficients are introduced and utilized to characterize these sources. This book will appeal to a broad range of statisticians, researchers, practitioners and students, in areas such as biomedical devices, psychology, and medical research, in which agreement assessment is needed. Many practical illustrative examples are presented throughout the book in a wide variety of situations for continuous and categorical data.

Statistical Tools for Program Evaluation: Methods and Applications to Economic Policy, Public Health, and Education

by Jean-Michel Josselin and Benoît Le Maux

This book provides a self-contained presentation of the statistical tools required for evaluating public programs, as advocated by many governments, the World Bank, the European Union, and the Organization for Economic Cooperation and Development. After introducing the methodological framework of program evaluation, the first chapters are devoted to the collection, elementary description and multivariate analysis of data as well as the estimation of welfare changes. The book then successively presents the tools of ex-ante methods (financial analysis, budget planning, cost-benefit, cost-effectiveness and multi-criteria evaluation) and ex-post methods (benchmarking, experimental and quasi-experimental evaluation). The step-by-step approach and the systematic use of numerical illustrations equip readers to handle the statistics of program evaluation. It not only offers practitioners from public administrations, consultancy firms and nongovernmental organizations the basic tools and advanced techniques used in program assessment, it is also suitable for executive management training, upper undergraduate and graduate courses, as well as for self-study.

Statistical Topics and Stochastic Models for Dependent Data with Applications

by Vlad Stefan Barbu and Nicolas Vergne

This book is a collective volume authored by leading scientists in the field of stochastic modelling, associated statistical topics and corresponding applications. The main classes of stochastic processes for dependent data investigated throughout this book are Markov, semi-Markov, autoregressive and piecewise deterministic Markov models. The material is divided into three parts corresponding to: (i) Markov and semi-Markov processes, (ii) autoregressive processes and (iii) techniques based on divergence measures and entropies. Special attention is paid to applications in reliability, survival analysis and related fields.

Statistical Topics in Health Economics and Outcomes Research (Chapman & Hall/CRC Biostatistics Series)

by Demissie Alemayehu, Joseph C. Cappelleri, Birol Emir, and Kelly H. Zou

With ever-rising healthcare costs, evidence generation through Health Economics and Outcomes Research (HEOR) plays an increasingly important role in decision-making about the allocation of resources. Accordingly, it is now customary for health technology assessment and reimbursement agencies to request HEOR evidence, in addition to data from clinical trials, to inform decisions about patient access to new treatment options. While there is a great deal of literature on HEOR, there is a need for a volume that presents a coherent and unified review of the major issues that arise in application, especially from a statistical perspective. Statistical Topics in Health Economics and Outcomes Research fulfils that need by presenting an overview of the key analytical issues and best practice. Special attention is paid to key assumptions and other salient features of statistical methods customarily used in the area, and appropriate and relatively comprehensive references are made to emerging trends. The content of the book is purposefully designed to be accessible to readers with basic quantitative backgrounds, while providing in-depth coverage of relatively complex statistical issues. The book will make a very useful reference for researchers in the pharmaceutical industry, academia, and research institutions involved with HEOR studies. The targeted readers may include statisticians, data scientists, epidemiologists, outcomes researchers, health economists, and healthcare policy and decision-makers.

Statistical Treatment of Experimental Data

by Hugh D. Young

Dealing with statistical treatment of experimental data, this text covers topics such as errors, probability, the binomial distribution, the Poisson distribution, the Gauss distribution, method of least squares and standard deviation of the mean.
