The word statistics is believed to be derived from the Latin word "status", meaning a political state. In early days it was used only for the collection of information about the population and military strength of the state, but in modern times it is used in almost all aspects of human activity. Statistics can be defined in two senses, singular and plural. In the singular sense it may be defined as the body of methods and techniques for obtaining and analysing numerical information. Different scholars hold different views about statistics; according to Boddington, statistics is "the science of estimates and probabilities". The methods and techniques referred to include the collection, organisation, presentation, analysis and interpretation of numerical data. In the plural sense, statistics means aggregates of numerical facts collected systematically. The most popular and widely accepted definition is that of Horace Secrist: "Statistics means the aggregate of facts affected to a marked extent by multiplicity of causes, numerically expressed, enumerated or estimated according to reasonable standards of accuracy, collected in a systematic manner for a pre-determined purpose and placed in relation to each other." The importance of statistics can be seen in many areas, such as planning, economics and business, because statistical methods are used in every economy-related field. The functions of statistics can be summed up in the following points: statistics simplifies complexity, expresses facts in definite form, facilitates comparison, helps in formulating policies and helps in forecasting. Statistics is extremely useful in the field of economics, but it has some limitations, which are as follows: statistics does not deal with individuals, it does not study qualitative phenomena, statistical laws are not exact, statistics is liable to be misused, and statistics is only a means to an end.
Statistics is a comparatively new subject, which branched off from probability theory and is widely used in areas such as economics, astronomy, agriculture and medicine. The history of statistics can be traced back to the 1600s, and John Graunt (1620-1674) may be considered the pioneer of statistics and the author of the first book on the subject. He published Natural and Political Observations on the Bills of Mortality in 1662, in which he studied the plague outbreaks in London at the request of the King. Graunt was asked to devise a system that would allow threats of further outbreaks to be detected, by keeping records of mortality and causes of death and by estimating the population. By forming a life table, Graunt discovered that, statistically, the ratio of males to females was almost equal. Then, in 1666, he collected data and began to examine life expectancies. All of this was fundamental, as he was arguably the first to condense a large body of data into a life table and to carry out analysis on it; such tables are widely used in life insurance today, showing the importance and significance of Graunt's work (Stigler, 1986; Verduin, 2009). In 1693, Edmond Halley extended Graunt's ideas and formed the first mortality table that statistically related age to death rates. Abraham De Moivre (1667-1754) was another contributor: he was the first to identify the properties of the normal curve and, in 1711, introduced the notion of statistical independence (Verduin, 2009). In 1724, motivated by the work of Halley, De Moivre studied mortality statistics and laid the foundations of the theory of annuities, widely used in the finance industry today. De Moivre also showed that the normal distribution can be used to approximate the binomial distribution (O'Connor and Robertson, 2004).
William Playfair (1759-1823) invented statistical graphics; he believed that charts were a better way to represent data, and introduced the line graph and bar chart in 1786 and the pie chart in 1801. This was a milestone, as these graphical representations are used everywhere today, the most notable being the time-series graph, a graph of many data points measured at successive uniform intervals over a period of time. Such graphs can be used to predict future data (Robyn, 1978). The application of statistical methods to the social sciences was discussed by Adolphe Quetelet (1796-1874) in 1835. He was interested in studying human characteristics and suggested that the law of errors, commonly used in astronomy, could be applied to the study of people, so that predictions could be made regarding the physical and intellectual features of a person. Through his studies, Quetelet discovered that when he plotted the distribution of certain characteristics, it took the shape of a bell curve. He is also well known for devising the Quetelet Index, more commonly known as the Body Mass Index, a measure of obesity. Others who made smaller but significant contributions to statistics are Carl Gauss and Florence Nightingale. Gauss was the first to experiment with the least-squares estimation method when, interested in astronomy, he attempted to predict the position of a planet; he later justified the method by assuming that the errors are normally distributed. Nightingale, the first female member of the Royal Statistical Society, was inspired by Quetelet's work on statistical graphics and produced a chart detailing the deaths of soldiers where she worked. She later went on to analyse the state and care of medical facilities in India.
This was significant, as Nightingale applied statistics to health problems, which led to the improvement of medical healthcare. Another great contributor was Francis Galton (1822-1911), who helped create a statistical revolution that laid the foundations for later statisticians such as Karl Pearson and Charles Spearman (Stigler, 1986). He came up with a number of vital concepts, including regression, standard deviation and correlation, which emerged when Galton was studying sweet peas. He discovered that successive generations of sweet peas were of different sizes but regressed towards the mean size of their parents' distribution (Gavan Tredoux, 2007). He later went on to develop the idea of correlation when studying the heights of parents and of their children on reaching adulthood: he plotted his findings and found an obvious correlation between the two.
“Statistics is life and even Allah loves it by keeping record of Duniya”
Sir Ronald Aylmer Fisher (1890-1962), one of the greatest scientists of his time, was a British statistician and biologist known for his contributions to experimental design and population genetics; he is regarded as the father of modern statistics and experimental design. Fisher was born into a wealthy family in London, England, on 17 February 1890. He studied at Harrow School and was among its brightest students. While he was still a schoolboy, his father's business went bankrupt and the family had to move to Streatham. Fisher suffered from extreme short-sightedness and was not even allowed to study under an electric lamp, as it strained his eyes. This proved to be a blessing in disguise: he learned to visualise mathematical problems in his head and solve them mentally, and he did not let the condition hold him back in any way. At the age of 19, Fisher won a scholarship to the University of Cambridge, where he studied mathematics and graduated in 1912 with first-class honours. After graduating, he stayed at Cambridge to study postgraduate-level physics, including the theory of errors, which increased his interest in statistics. Fisher was always interested in evolution and genetics, and he maintained a strong interest in eugenics, the science of improving the human species by selective breeding. In 1911, he founded a Eugenics Society at Cambridge University, which attracted a number of prominent members. He started working as a statistician at an insurance company in 1913; after a brief stint there, he taught high-school mathematics and physics from 1914 until 1919 while continuing his research in statistics and genetics. In 1919, he joined the Rothamsted Experimental Station to work on agricultural research, where access to huge amounts of agricultural data helped him devise new theories of experimentation.
Fisher's early experiences at the University of Cambridge shaped his interest in the field of population genetics. He had shown a keen interest in evolutionary theory during his student days, as a founder of the Cambridge University Eugenics Society, and he combined his training in statistics with his avocation for genetics. In particular, he published an important paper in 1918 in which he used powerful statistical tools to reconcile what had been apparent inconsistencies between Charles Darwin's ideas of natural selection and the recently rediscovered experiments of the Austrian botanist Gregor Mendel.
At Rothamsted, Fisher designed plant-breeding experiments that provided more information for a smaller investment of time, effort and money. One major problem he encountered was the biased selection of experimental material, which results in inaccurate or misleading experimental data. To avoid such bias, Fisher introduced the principle of randomization. This principle states that, before an effect in an experiment can be ascribed to a given cause or treatment independently of other causes or treatments, the experiment must be repeated on a number of control units of the material, and all the units of material used in the experiment must be randomly selected samples from the whole population they are intended to represent. In this way, random selection is used to diminish the effects of variability in the experimental material. An even more important achievement was Fisher's origination of the concept of analysis of variance, or ANOVA. Variance, in statistics, is the square of the standard deviation of a sample or set of data, used procedurally to analyse the factors that may influence the distribution or spread of the data under consideration. The ANOVA procedure enabled experiments to answer several questions at once. Fisher's principal idea was to arrange an experiment as a set of partitioned sub-experiments that differ from each other in one or more of the factors or treatments applied to them. By permitting the differences in their outcomes to be attributed to the different factors or combinations of factors by means of statistical analysis, these sub-experiments constituted a notable advance over the prevailing procedure of varying only one factor at a time. It was later found that the problems of bias and multivariate analysis that Fisher had solved in his plant-breeding research are encountered in many other scientific fields as well. Fisher summed up his statistical work in Statistical Methods and Scientific Inference (1956).
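The partitioning of variation that Fisher introduced can be sketched in a few lines of code. The following is a minimal one-way ANOVA in plain Python; the crop-yield figures are made up purely for illustration and stand in for a real field trial. It splits the total variation into a between-treatment part and a within-treatment part and reports their ratio, the F statistic.

```python
# A minimal sketch of Fisher's one-way analysis of variance (ANOVA),
# written in plain Python. The yield data below are hypothetical.

def one_way_anova(groups):
    """Partition total variation into between-group and within-group
    parts and return the F statistic."""
    n = sum(len(g) for g in groups)   # total number of observations
    k = len(groups)                   # number of treatments/groups
    grand_mean = sum(x for g in groups for x in g) / n

    # Between-group sum of squares: variation of group means about the grand mean
    ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2 for g in groups)
    # Within-group sum of squares: variation of observations about their group mean
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)

    ms_between = ss_between / (k - 1)  # mean square between (df = k - 1)
    ms_within = ss_within / (n - k)    # mean square within  (df = n - k)
    return ms_between / ms_within      # F ratio

# Hypothetical plot yields under three fertiliser treatments
yields = [[20.1, 21.3, 19.8], [24.6, 25.0, 23.9], [20.5, 19.9, 21.0]]
f_stat = one_way_anova(yields)
```

A large F ratio suggests that the differences between treatment means are too big to be explained by within-treatment variability alone, which is precisely the question Fisher's plant-breeding experiments were designed to answer.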
He was knighted in 1952 and spent the last years of his life conducting research in Australia. He has been described as the greatest biologist since Charles Darwin, and he published 7 books and almost 400 academic research papers in the fields of statistics and genetics. Fisher's contributions to statistics began in 1912, when he published his first paper on the method of maximum likelihood estimation, which estimates the parameters of a statistical model from its observations by choosing the parameter values that maximise the likelihood of those observations. Around the same time he stressed the distinction between the sample mean and the population mean. In 1918, while researching his paper on quantitative genetics, he introduced the concept of variance. In 1919, Fisher became the statistician at the Rothamsted Experimental Station near Harpenden, Hertfordshire, where, drawing on the data available there, he invented the tools of modern experimental design and did the statistical work associated with the plant-breeding experiments conducted at the station. His Statistical Methods for Research Workers (1925) remained in print for more than 50 years. His breeding experiments led to theories about gene dominance and fitness, published in The Genetical Theory of Natural Selection (1930). In 1933, Fisher became Galton Professor of Eugenics at University College London, and from 1943 to 1957 he was Balfour Professor of Genetics at Cambridge. He investigated the linkage of genes for different traits and developed methods of multivariate analysis to deal with such questions. Fisher, popularly known as the father of statistics, suffered from colon cancer and died aged 72 on 29 July 1962 in Adelaide, Australia, following surgery for the disease; his ashes rest in St. Peter's Cathedral, Adelaide. It is concluded that statistics keeps us informed about what is happening in the world around us.
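The maximum-likelihood idea credited to Fisher above can be illustrated with a small sketch. Assuming some hypothetical coin-flip data, the method searches among candidate values of the success probability p for the one under which the observed data are most probable; for such data the answer coincides with the sample proportion of heads.

```python
# A small illustration of Fisher's maximum-likelihood method,
# using made-up coin-flip data (1 = heads, 0 = tails).
import math

observations = [1, 0, 1, 1, 0, 1, 1, 1, 0, 1]  # hypothetical data

def log_likelihood(p, data):
    """Log-likelihood of independent Bernoulli(p) observations."""
    return sum(math.log(p) if x == 1 else math.log(1 - p) for x in data)

# Grid search: pick the candidate p that maximises the log-likelihood
candidates = [i / 1000 for i in range(1, 1000)]
p_hat = max(candidates, key=lambda p: log_likelihood(p, observations))
```

For this data the estimate lands on 0.7, the observed fraction of heads, matching the closed-form Bernoulli maximum-likelihood estimate (the sample mean).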
It helps us to understand the world a little better through numbers and other quantitative information. Statistics is important because we live today in the information age, and much of this information is determined mathematically with the help of statistics. It plays a vital role in every field of human activity. Statistics helps in determining the existing position of per capita income, unemployment, population growth rates, housing, schooling, medical facilities and so on in a country. Statistics shapes our lives without our knowledge, as it is used in weather forecasts, emergency preparedness, disease prediction, medical studies, genetics, political campaigns, insurance, consumer goods, quality testing, the stock market and much else.
(Dr. Bilal A. Bhat is Associate Professor (Statistics) at Sher-e-Kashmir University of Agricultural Sciences & Technology of Kashmir (SKUAST-K), Shalimar, Srinagar; Dr. Parmil Kumar is Associate Professor (Statistics) at the University of Jammu. Views are their own.)