Browse Results

Showing 23,176 through 23,200 of 23,809 results

Using R for Statistics

by Sarah Stowell

Using R for Statistics will get you the answers to most of the problems you are likely to encounter when using a variety of statistics. This book is a problem-solution primer for using R to set up your data, pose your problems, and get answers using a wide array of statistical tests. The book walks you through R basics and how to use R to accomplish a wide variety of statistical operations. You'll be able to navigate the R system, enter and import data, manipulate datasets, calculate summary statistics, create statistical plots and customize their appearance, perform hypothesis tests such as t-tests and analyses of variance, and build regression models. Examples are built around actual datasets to simulate real-world solutions, and programming basics are explained to assist those who do not have a development background. After reading and using this guide, you'll be comfortable using and applying R to your specific statistical analyses or hypothesis tests. No prior knowledge of R or of programming is assumed, though you should have some experience with statistics.

What you'll learn:
- How to apply statistical concepts using R and some R programming
- How to work with data files, prepare and manipulate data, and combine and restructure datasets
- How to summarize continuous and categorical variables
- What a probability distribution is
- How to create and customize plots
- How to do hypothesis testing
- How to build and use regression and linear models

Who this book is for: No prior knowledge of R or of programming is assumed, making this book ideal if you are more accustomed to using point-and-click style statistical packages. You should have some prior experience with statistics, however.

Table of Contents: 1. R Fundamentals; 2. Working with Data Files; 3. Preparing and Manipulating Data; 4. Combining and Restructuring Data Sets; 5. Continuous Variables; 6. Tabular Data; 7. Probability Distribution; 8. Creating Plots; 9. Customizing Plots; 10. Hypothesis Tests; 11. Regression and Linear Models; 12. Appendix A: Basic Programming with R; 13. Appendix B: Add-on Packages; 14. Appendix C: Data Sets.
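
As a quick illustration of the kind of hypothesis test the book covers (this is the standard textbook formula, not an excerpt from the book), the one-sample t statistic for testing $H_0\colon \mu = \mu_0$ from data $x_1, \ldots, x_n$ is

\[
t = \frac{\bar{x} - \mu_0}{s/\sqrt{n}}, \qquad
\bar{x} = \frac{1}{n}\sum_{i=1}^{n} x_i, \qquad
s^2 = \frac{1}{n-1}\sum_{i=1}^{n} (x_i - \bar{x})^2,
\]

which is compared against a t distribution with $n - 1$ degrees of freedom.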

Using R for Trade Policy Analysis: R Codes for the UNCTAD and WTO Practical Guide (SpringerBriefs in Economics)

by Massimiliano Porto

This book explains the best practices of the UNCTAD & WTO for trade analysis to the R user community. It shows how to replicate the UNCTAD & WTO's Stata codes in the Practical Guide to Trade Policy Analysis by using R. Applications and exercises are chosen from the Practical Guide to Trade Policy Analysis and explain how to implement the codes in R. This book targets readers with a basic knowledge of R. It is particularly suitable for Stata users.

Using R for Trade Policy Analysis: R Codes for the UNCTAD and WTO Practical Guide

by Massimiliano Porto

This book explains the best practices of the UNCTAD & WTO for trade analysis to the R user community. It shows how to replicate the UNCTAD & WTO's Stata codes in the Practical Guide to Trade Policy Analysis by using R. Applications and exercises are chosen from the Practical Guide to Trade Policy Analysis and explain how to implement the codes in R. This book targets readers with a basic knowledge of R. It is particularly suitable for Stata users. This edition has been revised and expanded to include updated R code and visualization tools.

Using SPSS for Windows and Macintosh: Analyzing and Understanding Data

by Neil Salkind and Samuel Green

The development of easy-to-use statistical software like SPSS has changed the way statistics is being taught and learned. Even with these advancements, however, students sometimes still find statistics a tough nut to crack. Using SPSS for Windows and Macintosh, 7/e, guides students through basic SPSS techniques using step-by-step descriptions and explaining in detail how to avoid common pitfalls in the study of statistics.

Using SPSS Syntax: A Beginner's Guide

by Jacqueline Collier

SPSS syntax is the command language used by SPSS to carry out all of its commands and functions. In this book, Jacqueline Collier introduces the use of syntax to those who have not used it before, or who are taking their first steps in using syntax. Without requiring any knowledge of programming, the text outlines:
- how to become familiar with the syntax commands;
- how to create and manage the SPSS journal and syntax files;
- and how to use them throughout the data entry, management, and analysis process.
Collier covers all aspects of data management from data entry through to data analysis, including managing the errors and the error messages created by SPSS. Syntax commands are clearly explained and the value of syntax is demonstrated through examples. This book also supports the use of SPSS syntax alongside the usual button- and menu-driven graphical user interface (GUI), using the two methods together in a complementary way. The book is written in such a way as to enable you to pick and choose how much you rely on one method over the other, encouraging you to use them side by side, with a gradual increase in the use of syntax as your knowledge, skills, and confidence develop. This book is ideal for all those carrying out quantitative research in the health and social sciences who can benefit from SPSS syntax's capacity to save time, reduce errors, and allow a data audit trail.

Using Statistical Methods in Social Science Research: With a Complete SPSS Guide

by Soleman H. Abu-Bader

Using Statistical Methods in Social Science Research, Third Edition is the user-friendly text every student needs for analyzing and making sense of quantitative data. With over 20 years of experience teaching statistics, Soleman H. Abu-Bader provides an accessible, step-by-step description of the process needed to organize data, choose a test or statistical technique, analyze, interpret, and report research findings. The book begins with an overview of research and statistical terms, followed by an explanation of basic descriptive statistics. It then focuses on the purpose, rationale, and assumptions made by each test, such as Pearson's correlation, Student's t-tests, analysis of variance, and simple linear regression, among others. The book also provides a wealth of research examples that clearly display the applicability and function of these tests in real-world practice. In a separate appendix, the author provides a step-by-step process for calculating each test for those who want to understand the mathematical formulas behind these processes.

Using Statistics in Small-Scale Language Education Research: Focus on Non-Parametric Data (ESL & Applied Linguistics Professional Series)

by Jean L. Turner

Assuming no familiarity with statistical methods, this text for language education research methods and statistics courses provides detailed guidance and instruction on principles of designing, conducting, interpreting, reading, and evaluating statistical research done in classroom settings or with a small number of participants. While three different types of statistics are addressed (descriptive, parametric, non-parametric), the emphasis is on non-parametric statistics because they are appropriate when the number of participants is small and the conditions for the use of parametric statistics are not satisfied. The emphasis on non-parametric statistics is unique and complements the growing interest among second and foreign language educators in doing statistical research in classrooms. Designed to help students and other language education researchers to identify and use analyses that are appropriate for their studies, taking into account the number of participants and the shape of the data distribution, the text includes sample studies to illustrate the important points in each chapter and exercises to promote understanding of the concepts and the development of practical research skills. Mathematical operations are explained in detail, and step-by-step illustrations of the use of R (a very powerful, freely available, open-source program) to perform all calculations are provided. A Companion Website extends and enhances the text with PowerPoint presentations illustrating how to carry out calculations and use R; practice exercises with answer keys; data sets in Excel MS-DOS format; and quiz, midterm, and final problems with answer keys.

Using Statistics in Social Research

by Scott M. Lynch

This book covers applied statistics for the social sciences with upper-level undergraduate students in mind. The chapters are based on lecture notes from an introductory statistics course the author has taught for a number of years. The book integrates statistics into the research process, with early chapters covering basic philosophical issues underpinning the process of scientific research. These include the concepts of deductive reasoning and the falsifiability of hypotheses, the development of a research question and hypotheses, and the process of data collection and measurement. Probability theory is then covered extensively with a focus on its role in laying the foundation for statistical reasoning and inference. After illustrating the Central Limit Theorem, later chapters address the key, basic statistical methods used in social science research, including various z and t tests and confidence intervals, nonparametric chi square tests, one-way analysis of variance, correlation, simple regression, and multiple regression, with a discussion of the key issues involved in thinking about causal processes. Concepts and topics are illustrated using both real and simulated data. The penultimate chapter presents rules and suggestions for the successful presentation of statistics in tabular and graphic formats, and the final chapter offers suggestions for subsequent reading and study.
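
For orientation, the large-sample confidence interval for a mean that follows from the Central Limit Theorem discussed here is the standard one (a textbook formula, not a quotation from the book):

\[
\bar{x} \pm z_{\alpha/2}\,\frac{s}{\sqrt{n}},
\]

where $\bar{x}$ is the sample mean, $s$ the sample standard deviation, $n$ the sample size, and $z_{\alpha/2}$ the standard normal critical value (for example, 1.96 for 95% confidence).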

Using Statistics in the Social and Health Sciences with SPSS and Excel

by Martin Lee Abbott

Provides a step-by-step approach to statistical procedures to analyze data and conduct research, with detailed sections in each chapter explaining SPSS® and Excel® applications. This book identifies connections between statistical applications and research design using cases, examples, and discussion of specific topics from the social and health sciences. Researched and class-tested to ensure an accessible presentation, the book combines clear, step-by-step explanations for both the novice and professional alike to understand the fundamental statistical practices for organizing, analyzing, and drawing conclusions from research data in their field. The book begins with an introduction to descriptive and inferential statistics and then acquaints readers with important features of statistical applications (SPSS and Excel) that support statistical analysis and decision making. Subsequent chapters treat the procedures commonly employed when working with data across various fields of social science research. Individual chapters are devoted to specific statistical procedures, each ending with lab application exercises that pose research questions, examine the questions through their application in SPSS and Excel, and conclude with a brief research report that outlines key findings drawn from the results. Real-world examples and data from social and health sciences research are used throughout the book, allowing readers to reinforce their comprehension of the material. Using Statistics in the Social and Health Sciences with SPSS® and Excel® includes:
* Use of straightforward procedures and examples that help students focus on understanding of analysis and interpretation of findings
* Inclusion of a data lab section in each chapter that provides relevant, clear examples
* Introduction to advanced statistical procedures in chapter sections (e.g., regression diagnostics) and separate chapters (e.g., multiple linear regression) for greater relevance to real-world research needs
Emphasizing applied statistical analyses, this book can serve as the primary text in undergraduate and graduate university courses within departments of sociology, psychology, urban studies, health sciences, and public health, as well as other related departments. It will also be useful to statistics practitioners through extended sections using SPSS® and Excel® for analyzing data. Martin Lee Abbott, PhD, is Professor of Sociology at Seattle Pacific University, where he has served as Executive Director of the Washington School Research Center, an independent research and data analysis center funded by the Bill & Melinda Gates Foundation. Dr. Abbott has held positions in both academia and industry, focusing his consulting and teaching in the areas of statistical procedures, program evaluation, applied sociology, and research methods. He is the author of Understanding Educational Statistics Using Microsoft Excel® and SPSS®, The Program Evaluation Prism: Using Statistical Methods to Discover Patterns, and Understanding and Applying Research Design, also from Wiley.

Using Technology to Enhance Clinical Supervision

by Tony Rousmaniere and Edina Renfro-Michel

This is the first comprehensive research and practice-based guide for understanding and assessing supervision technology and for using it to improve the breadth and depth of services offered to supervisees and clients. Written by supervisors, for supervisors, it examines the technology that is currently available and how and when to use it. Part I provides a thorough review of the technological, legal, ethical, cultural, accessibility, and security competencies that are the foundation for effectively integrating technology into clinical supervision. Part II presents applications of the most prominent and innovative uses of technology across the major domains in counseling, along with best practices for delivery. Each chapter in this section contains a literature review, concrete examples for use, case examples, and lessons learned.

Using the American Community Survey for the National Science Foundation's Science and Engineering Workforce Statistics Programs

by National Research Council of the National Academies

The National Science Foundation (NSF) has long collected information on the number and characteristics of individuals with education or employment in science and engineering and related fields in the United States. An important motivation for this effort is to fulfill a congressional mandate to monitor the status of women and minorities in the science and engineering workforce. Consequently, many statistics are calculated by race or ethnicity, gender, and disability status. For more than 25 years, NSF obtained a sample frame for identifying the target population for information it gathered from the list of respondents to the decennial census long form who indicated that they had earned a bachelor's or higher degree. The probability that an individual was sampled from this list was dependent on both demographic and employment characteristics. But the source for the sample frame will no longer be available because the census long form is being replaced as of the 2010 census with the continuous collection of detailed demographic and other information in the new American Community Survey (ACS). At the request of NSF's Science Resources Statistics Division, the Committee on National Statistics of the National Research Council formed a panel to conduct a workshop and study the issues involved in replacing the decennial census long-form sample with a sample from the ACS to serve as the frame for the information the NSF gathers. The workshop had the specific objective of identifying issues for the collection of field of degree information on the ACS with regard to goals, content, statistical methodology, data quality, and data products.

Using the C++ Standard Template Libraries

by Ivor Horton

Using the C++ Standard Template Libraries is a contemporary treatment that teaches the generic programming capabilities that the C++14 Standard Library provides. In this book, author Ivor Horton explains what the class and function templates available with C++14 do, and how to use them in a practical context. You'll learn how to create containers, and how iterators are used with them to access, modify, and extend the data elements they contain. You'll also learn about stream iterators that can transfer data between containers and streams, including file streams. The function templates that define algorithms are explained in detail, and you'll learn how to pass function objects or lambda expressions to them to customize their behavior. Many working examples are included to demonstrate how to apply the algorithms with different types of containers. After reading this book, you will understand the scope and power of the templates that the C++14 Standard Library includes and how these can greatly reduce the coding and development time for many applications. You'll be able to combine the class and function templates to great effect in dealing with real-world problems. The templates in the Standard Library provide you as a C++ programmer with a comprehensive set of efficiently implemented generic programming tools that you can use for most types of application. What you'll learn:
- How to use Standard Library templates with your C++ applications
- The different types of containers that are available and what they are used for
- How to define your own class types to meet the requirements of use with containers
- What iterators are, the characteristics of the various types of iterators, and how they allow algorithms to be applied to the data in different types of container
- How you can define your own iterator types
- What the templates that define algorithms do, and how you apply them to data stored in containers and arrays
- How to access hardware clocks and use them for timing execution
- How to use the templates available for compute-intensive numerical data processing
- How to create and use pseudo-random number generators with distribution objects
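
A minimal sketch of the container/iterator/algorithm style the book teaches (this example is ours, not taken from the text, and assumes a C++14 compiler):

#include <algorithm>
#include <iostream>
#include <numeric>
#include <vector>

int main() {
    // A sequence container holding some sample values.
    std::vector<int> values{4, 1, 7, 3, 9, 2};

    // Sort in descending order by passing a lambda expression as the comparison.
    std::sort(values.begin(), values.end(),
              [](int a, int b) { return a > b; });

    // accumulate sums the elements over the iterator range.
    int total = std::accumulate(values.begin(), values.end(), 0);

    // count_if applies a predicate lambda to each element in the range.
    auto n_large = std::count_if(values.begin(), values.end(),
                                 [](int v) { return v > 3; });

    std::cout << "sum = " << total << ", elements > 3: " << n_large << '\n';
    return 0;
}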

Using the Common Core State Standards for Mathematics With Gifted and Advanced Learners

by National Assoc For Gifted Children and Linda J. Sheffield

Using the Common Core State Standards for Mathematics With Gifted and Advanced Learners provides teachers and administrators examples and strategies to implement the new Common Core State Standards (CCSS) with advanced learners at all stages of development in K-12 schools. The book describes—and demonstrates with specific examples from the CCSS—what effective differentiated activities in mathematics look like for top learners. It shares how educators can provide rigor within the new standards to allow students to demonstrate higher level thinking, reasoning, problem solving, passion, and inventiveness in mathematics. By doing so, students will develop the skills, habits of mind, and attitudes toward learning needed to reach high levels of competency and creative production in mathematics fields.

Using the R Commander: A Point-and-Click Interface for R (Chapman & Hall/CRC The R Series #35)

by John Fox

This book provides a general introduction to the R Commander graphical user interface (GUI) to R for readers who are unfamiliar with R. It is suitable for use as a supplementary text in a basic or intermediate-level statistics course. It is not intended to replace a basic or other statistics text but rather to complement it, although it does promote sound statistical practice in the examples. The book should also be useful to individual casual or occasional users of R for whom the standard command-line interface is an obstacle. The companion site, tinyurl.com/RcmdrBook, includes data files used in the book and an errata list.

Using the Schoolwide Enrichment Model in Mathematics: A How-To Guide for Developing Student Mathematicians

by M. Katherine Gavin and Joseph S. Renzulli

Using the Schoolwide Enrichment Model in Mathematics: A How-to Guide for Developing Student Mathematicians applies the teaching and learning strategies of the Schoolwide Enrichment Model (SEM) to the math classroom. Based on more than 40 years of research and development and used in schools around the world, the SEM approach focuses on promoting higher level thinking skills and creative productivity. Using this approach in mathematics, this new guidebook promotes the use of the Mathematical Practices outlined in the Common Core State Standards as the underlying processes and proficiencies that should be developed in students. Teachers learn how to create a culture of enjoyment, engagement, and enthusiasm for all students, and in particular gifted students, while developing students who think and act like mathematicians. Easy to read and use, the book incorporates many practical suggestions, including views from the classroom and sample activities from NAGC award-winning curriculum, to motivate and challenge students.

Using the Weibull Distribution

by John I. McCool

Understand and utilize the latest developments in Weibull inferential methods. While the Weibull distribution is widely used in science and engineering, most engineers do not have the necessary statistical training to implement the methodology effectively. Using the Weibull Distribution: Reliability, Modeling, and Inference fills a gap in the current literature on the topic, introducing a self-contained presentation of the probabilistic basis for the methodology while providing powerful techniques for extracting information from data. The author explains the use of the Weibull distribution and its statistical and probabilistic basis, providing a wealth of material that is not available in the current literature. The book begins by outlining the fundamental probability and statistical concepts that serve as a foundation for subsequent topics of coverage, including:
* Optimum burn-in, age and block replacement, warranties, and renewal theory
* Exact inference in Weibull regression
* Goodness-of-fit testing and distinguishing the Weibull from the lognormal
* Inference for the three-parameter Weibull
Throughout the book, a wealth of real-world examples showcases the discussed topics, and each chapter concludes with a set of exercises, allowing readers to test their understanding of the presented material. In addition, a related website features the author's own software for implementing the discussed analyses, along with a set of modules written in Mathcad®, and additional graphical interface software for performing simulations. With its numerous hands-on examples, exercises, and software applications, Using the Weibull Distribution is an excellent book for courses on quality control and reliability engineering at the upper-undergraduate and graduate levels. The book also serves as a valuable reference for engineers, scientists, and business analysts who gather and interpret data that follows the Weibull distribution.
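
For orientation, the two-parameter Weibull density and reliability (survival) function that this methodology builds on are the standard ones (not reproduced from the book):

\[
f(t) = \frac{\beta}{\eta}\left(\frac{t}{\eta}\right)^{\beta-1}
       \exp\!\left[-\left(\frac{t}{\eta}\right)^{\beta}\right], \qquad
R(t) = \exp\!\left[-\left(\frac{t}{\eta}\right)^{\beta}\right], \qquad t \ge 0,
\]

with shape parameter $\beta > 0$ and scale parameter $\eta > 0$; the three-parameter form adds a location (threshold) parameter $\gamma$ by replacing $t$ with $t - \gamma$.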

Using Time Series to Analyze Long-Range Fractal Patterns (Quantitative Applications in the Social Sciences #185)

by Matthijs Koopmans

Using Time Series to Analyze Long-Range Fractal Patterns presents methods for describing and analyzing dependency and irregularity in long time series. Irregularity refers to cycles that are similar in appearance but, unlike the seasonal patterns more familiar to social scientists, are repeated over a time scale that is not fixed. Until now, the application of these methods has mainly involved analysis of dynamical systems outside of the social sciences, but this volume makes it possible for social scientists to explore and document fractal patterns in dynamical social systems. Author Matthijs Koopmans concentrates on two general approaches to irregularity in long time series: autoregressive fractionally integrated moving average (ARFIMA) models and power spectral density analysis. He demonstrates the methods through two kinds of examples: simulations that illustrate the patterns that might be encountered and serve as a benchmark for interpreting patterns in real data, and social science examples such as long-range data on monthly unemployment figures, daily school attendance rates, daily numbers of births to teens, and weekly survey data on political orientation. Data and R scripts to replicate the analyses are available on an accompanying website.
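
As a pointer to the two approaches named above (our summary of standard definitions, not the author's wording), an ARFIMA(p, d, q) model allows the differencing order d to be fractional, and long-range dependence appears in the power spectrum as an approximate power law at low frequencies:

\[
\Phi(B)\,(1 - B)^{d}\,x_t = \Theta(B)\,\varepsilon_t, \qquad
S(f) \propto f^{-\beta} \ \text{as } f \to 0,
\]

where $B$ is the backshift operator, $\Phi$ and $\Theta$ are the autoregressive and moving-average polynomials, $-0.5 < d < 0.5$ for a stationary, invertible process, and $\beta \approx 2d$ links the two descriptions.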

The USSR Olympiad Problem Book: Selected Problems and Theorems of Elementary Mathematics (Dover Books on Mathematics)

by I. M. Yaglom, D. O. Shklarsky, and N. N. Chentzov

This book contains 320 unconventional problems in algebra, arithmetic, elementary number theory, and trigonometry. Most of the problems first appeared in competitive examinations sponsored by the School Mathematical Society of the Moscow State University and the Mathematical Olympiads held in Moscow. Although most of the problems presuppose only high school mathematics, they are not easy; some are of uncommon difficulty and will challenge the ingenuity of any research mathematician. Nevertheless, many are well within the reach of motivated high school students and even advanced seventh and eighth graders. The problems are grouped into twelve separate sections. Among these are: the divisibility of integers, equations having integer solutions, evaluating sums and products, miscellaneous algebraic problems, the algebra of polynomials, complex numbers, problems of number theory, distinctive inequalities, difference sequences and sums, and more. Complete solutions to all problems are given; in many cases, alternate solutions are detailed from different points of view. Solutions to more advanced problems are given in considerable detail. Moreover, when advanced concepts are employed, they are discussed in the section preceding the problems. Useful in a variety of ways in high school and college curriculums, this challenging volume will be of particular interest to teachers dealing with gifted and advanced classes.

USA Through the Lens of Mathematics

by Natali Hritonenko and Yuri Yatsenko

The main purpose of this captivating book is to help instructors in popularizing mathematics and other subjects by considering them in a unique multidisciplinary way. This integrative technique contributes to innovative teaching strategies that improve students' critical-thinking and problem-solving skills and broaden their scientific vision and interdisciplinary knowledge. The authors motivate the simultaneous learning of mathematics and social studies by telling the story of the United States of America in an original, mathematically oriented way. The readers will discover practical reasoning behind mathematical concepts. This fascinating book exposes students to a novel educational strategy that aims to overcome fear of mathematics, reduce mathematical anxiety, and show the applicability of mathematics to everyday life and events. It is unique among mathematical books in its devotion to presenting facts and stories from the country's heritage. The collection of 325 informative problems is designed to fit any ability, background, and taste. Their solution requires only basic knowledge of algebra.

Utility-Based Learning from Data

by Craig Friedman and Sven Sandow

Utility-Based Learning from Data provides a pedagogical, self-contained discussion of probability estimation methods via a coherent approach from the viewpoint of a decision maker who acts in an uncertain environment. This approach is motivated by the idea that probabilistic models are usually not learned for their own sake; rather, they are used t

Utility Maximization in Nonconvex Wireless Systems

by Johannes Brehmer

This monograph develops a framework for modeling and solving utility maximization problems in nonconvex wireless systems. The first part develops a model for utility optimization in wireless systems. The model is general enough to encompass a wide array of system configurations and performance objectives. Based on the general model, a set of methods for solving utility maximization problems is developed in the second part of the book. The development is based on a careful examination of the properties that are required for the application of each method. This part focuses on problems whose initial formulation does not allow for a solution by standard methods and discusses alternative approaches. The last part presents two case studies to demonstrate the application of the proposed framework. In both cases, utility maximization in multi-antenna broadcast channels is investigated.
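
A schematic statement of the kind of problem such a framework addresses (a sketch of the general form, not the monograph's exact model):

\[
\max_{x \in \mathcal{X}} \; U\big(r_1(x), \ldots, r_K(x)\big),
\]

where $x$ collects the transmit parameters (for example, powers or beamformers), $\mathcal{X}$ is the feasible set defined by the system's resource constraints, $r_k(x)$ is the performance of user $k$, and $U$ is the utility function; nonconvexity of the $r_k$ or of $\mathcal{X}$ is what rules out standard convex optimization methods.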

Utilization of Geospatial Information in Daily Life: Expression and Analysis of Dynamic Life Activity (New Frontiers in Regional Science: Asian Perspectives #65)

by Yoshihide Sekimoto and Yasuhiro Kawahara

This book focuses on geospatial information in living spaces, providing many examples of its collection and use as well as discussing the problems of how it is used and its future prospects. Geospatial information science is in the process of evolving and being systematized, with the technical and usage aspects of the real world stimulating each other. This book systematizes the technical aspects of positioning; of geography, which manages and represents what is measured in units of earth coordinates; and of data science, which aims to efficiently express and process geographic information, all by introducing contemporary examples that are systematized with regard to their use in our living spaces. Examples of geospatial information used in almost all aspects of our lives, including urban areas, transportation, disaster prevention, health and medical care, agriculture, forestry and fisheries, culture, ecology, and topography, are presented, along with examples of their use in each area. One of the major features of this book is that it describes the use of data from earthquake disasters that is unique to Japan, as well as the use of open data and personal data in Japan, which is a trend that is gaining attention in many countries. In this way the book systematically describes events and circumstances in living spaces that are revealed by the expression and analysis of geospatial data, presents case studies, and discusses their use in the IoT era.

Utilization of Renormalized Mean-Field Theory upon Novel Quantum Materials (Springer Theses)

by Wei-Lin Tu

This book offers a new approach to the long-standing problem of high-Tc copper-oxide superconductors. It has been demonstrated that starting from a strongly correlated Hamiltonian, even within the mean-field regime, the “competing orders” revealed by experiments can be captured by numerical calculations. In the introduction, readers will find a brief review of the high-Tc problem and the unique challenges it poses, as well as a comparatively simple numerical approach, the renormalized mean-field theory (RMFT), which provides rich results detailed in the following chapters. With an additional phase picked up by the original Hamiltonian, some behaviors of interacting fermions under an external magnetic field, which have since been experimentally observed using cold atom techniques, are also highlighted.

V.A. Fock - Selected Works: Quantum Mechanics and Quantum Field Theory

by L. D. Faddeev, L. A. Khalfin, and I. V. Komarov

In the period between the birth of quantum mechanics and the late 1950s, V.A. Fock wrote papers that are now deemed classics. In his works on theoretical physics, Fock not only skillfully applied advanced analytical and algebraic methods, but also systematically created new mathematical tools when existing approaches proved insufficient. This co
