The importance of contextual factors on the accuracy of estimates in project management. An emergence of a framework for more realistic estimation process
Supervisor: Hussain, Zahid I.
Keywords: Project management; Estimation process; Estimation framework; Estimation technique; Estimation-related risk; Knowledge-based estimation; Project scheduling; Critical chain; Buffer management
The University of Bradford theses are licensed under a Creative Commons Licence.
Institution: University of Bradford
Department: School of Management
Abstract: Successful projects are characterized by the quality of their planning. Good planning that better takes contextual factors into account allows more accurate estimates to be achieved. As an outcome of this research, a new framework composed of best practices has emerged. It comprises an open platform that project experts and practitioners can work with efficiently, and that researchers can develop further as required. The research investigation commenced in the autumn of 2008 with a pilot study and then proceeded through an inductive research process involving a series of eleven interviews: four with well-recognized experts in the field, four with different practitioners, and three group interviews. In addition, a long-running observation of forty-five days was undertaken, together with other data sources, before culminating in the proposal of a new framework for improving the accuracy of estimates. Furthermore, the emerging framework, and a description of the know-how required to apply it, were systematically reviewed over the course of four hundred and twenty-five days of meetings, dedicated for the most part to improving the use of a wide range of specific project management tools and techniques and to deepening the understanding of planning and the estimation process associated with it. This approach constituted an ongoing verification of the research's findings against project management practice and also served as an invaluable resource for the researcher's professional and practice-oriented development. The results obtained offered fresh insights into the importance of knowledge management in the estimation process, including the "value of not knowing", the oft-overlooked phenomenon of underestimation and its potential to co-exist with overestimation, and the use of negative buffer management in the critical chain concept to secure project deadlines.
The project also highlighted areas for improvement in future research practice that wishes to use an inductive approach to achieve a socially agreed framework, rather than a theory alone. In addition, improvements were suggested to the various qualitative tools employed in the customized data analysis process.
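The critical chain buffer management mentioned in the abstract can be illustrated with a minimal sketch. This is a generic illustration of the standard technique, not the thesis's framework: the sizing rule (root of the sum of squared safety margins) and all numbers are assumptions chosen for demonstration.

```python
from math import sqrt

def project_buffer(safe, aggressive):
    """Size the project buffer with a common root-square-error rule:
    take each task's safety margin (safe minus aggressive estimate)
    and combine the margins as the root of the sum of their squares."""
    return sqrt(sum((s - a) ** 2 for s, a in zip(safe, aggressive)))

def buffer_penetration(buffer_size, delays):
    """Fraction of the buffer already consumed by accumulated delays.
    A value above 1.0 means the buffer has gone 'negative': the slack
    is exhausted and the project deadline itself is now at risk."""
    return sum(delays) / buffer_size

# Hypothetical three-task chain: padded vs. aggressive duration estimates
size = project_buffer([10, 8, 12], [6, 5, 8])
status = buffer_penetration(size, [2, 3])
```

Tracking penetration rather than individual task slippage is the point of the technique: the buffer absorbs local delays, and only its overall consumption signals whether the deadline is threatened.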
Related items (matched by title, author, creator and subject):
Analogy-based software project effort estimation. Contributions to projects similarity measurement, attribute selection and attribute weighting algorithms for analogy-based effort estimation. Neagu, Daniel; Cowling, Peter I.; Azzeh, Mohammad Y.A. (University of Bradford, Department of Computing, School of Computing, Informatics & Media, 2010-10-01). Software effort estimation by analogy is a viable alternative to other estimation techniques, and in many cases researchers have found that it outperforms other methods in terms of accuracy and practitioners' acceptance. However, the overall performance of analogy-based estimation depends on two major factors: the similarity measure, and attribute selection and weighting. Current similarity measures, such as nearest-neighbourhood techniques, have been criticized for inadequacies relating to attribute relevancy, noise and uncertainty, in addition to the problem of handling categorical attributes. This research focuses on improving the efficiency and flexibility of analogy-based estimation to overcome these inadequacies. In particular, this thesis proposes two new approaches to model and handle uncertainty in the similarity measurement method and, most importantly, to reflect the structure of the dataset in similarity measurement using fuzzy modelling based on the Fuzzy C-means algorithm. The first proposed approach, the Fuzzy Grey Relational Analysis method, combines fuzzy set theory and Grey Relational Analysis to improve local and global similarity measures and to tolerate the imprecision associated with using different data types (continuous and categorical). The second proposed approach uses fuzzy numbers and their related concepts to develop a practical yet efficient approach to support analogy-based systems, especially at the early phases of software development. Specifically, we propose a new similarity measure and adaptation technique based on fuzzy numbers.
We also propose a new attribute subset selection algorithm and an attribute weighting technique based on the hypothesis underlying analogy-based estimation: that projects that are similar in terms of attribute values are also similar in terms of effort values. The technique uses row-wise Kendall rank correlation between the similarity matrix based on project effort values and the similarity matrix based on project attribute values. A literature review of related software engineering studies revealed that existing attribute selection techniques (such as brute-force and heuristic algorithms) are tied to the choice of performance indicators (such as the Mean Magnitude of Relative Error and the Prediction Performance Indicator) and are computationally far more intensive. The proposed algorithms provide a sound statistical basis and justification for their procedures. The performance of the proposed approaches has been evaluated using real industrial datasets. Results and conclusions from a series of comparative studies against the conventional estimation-by-analogy approach, using the available datasets, are presented. Studies were also carried out to statistically investigate the significant differences between predictions generated by our approaches and those generated by the most popular techniques, namely conventional analogy estimation, neural networks and stepwise regression. The results indicate that the two proposed approaches have the potential to deliver comparable, if not better, accuracy than the compared techniques. They also show that Grey Relational Analysis tolerates the uncertainty associated with using different data types. As well as the original contributions within the thesis, a number of directions for further research are presented. Most chapters of this thesis have been disseminated in international journals and highly refereed conference proceedings.
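The attribute-weighting hypothesis described above (projects similar in attribute values should also be similar in effort) can be sketched as follows. This is a simplified illustration, not the thesis's algorithm: the similarity measure (negative absolute difference), the use of the upper triangle of the pairwise matrix, and the clipping of negative correlations are all assumptions made for the example.

```python
import numpy as np
from scipy.stats import kendalltau

def attribute_weights(X, effort):
    """Weight each attribute by the Kendall rank correlation between
    pairwise attribute similarity and pairwise effort similarity:
    an attribute earns a high weight when projects close on that
    attribute are also close in effort."""
    n, m = X.shape
    iu = np.triu_indices(n, k=1)                    # each project pair once
    eff_sim = -np.abs(effort[:, None] - effort[None, :])[iu]
    weights = []
    for j in range(m):
        col = X[:, j]
        att_sim = -np.abs(col[:, None] - col[None, :])[iu]
        tau, _ = kendalltau(att_sim, eff_sim)
        weights.append(max(tau, 0.0))               # ignore negative links
    return np.array(weights)

# Toy dataset: attribute 0 tracks effort perfectly, attribute 1 does not
X = np.array([[1.0, 4.0], [2.0, 1.0], [3.0, 3.0], [4.0, 2.0]])
effort = np.array([1.0, 2.0, 3.0, 4.0])
w = attribute_weights(X, effort)
```

An attribute that perfectly mirrors effort receives weight 1.0, so the weights can feed directly into a weighted distance when retrieving analogues.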
International stock market returns and systematic risk factors. An empirical investigation into the APT using macroeconomic factors and multivariate estimation. Wheeler, Frederick; Al-Saiaari, Mohsen N.K. (University of Bradford, The Management Centre, 2010-02-08). This thesis examines the relationship between stock market returns and systematic risk factors in twelve industrial countries. Using the APT framework, the thesis investigates the notion of international stock market integration versus segmentation in terms of pricing risk; international stock market efficiency in terms of eliminating arbitrage opportunities across domestic markets; and the validity of the international version of the APT (IAPT) according to a model that specifies purely domestic factors. Starting with ordinary least squares estimation, the thesis investigates the responses of investors in their national stock markets to systematic shocks. By employing iterative non-linear multivariate seemingly unrelated regression estimation, this work avoids the statistical problems encountered in the second-pass test of the two-stage procedure. The study found that the international stock market was neither integrated nor efficient, and that the IAPT was not supported by the results for the period investigated. It was demonstrated, however, that partial and regional integration, regional efficiency and regional IAPT validity cannot be ruled out. Moreover, the alternative model proved to be practically valid.
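The first-pass ordinary least squares step described above can be sketched as a time-series regression of each market's returns on the macroeconomic factor series. The factor data and loadings below are illustrative assumptions; the thesis's iterative non-linear seemingly unrelated regression estimation is not reproduced here.

```python
import numpy as np

def factor_betas(returns, factors):
    """First-pass time-series OLS: regress each market's return series
    on the macro-factor series (plus an intercept) and keep the slopes."""
    T = factors.shape[0]
    X = np.column_stack([np.ones(T), factors])          # add intercept
    coef, *_ = np.linalg.lstsq(X, returns, rcond=None)  # (1+k, n_markets)
    return coef[1:].T                                   # (n_markets, k)

# Illustrative data: two hypothetical macro factors, one market whose
# returns load on them with betas 0.5 and -0.2 plus a constant.
rng = np.random.default_rng(42)
factors = rng.standard_normal((60, 2))
returns = 0.01 + factors @ np.array([0.5, -0.2])
betas = factor_betas(returns[:, None], factors)
```

In a two-stage APT test, these estimated betas would then be carried into a second-pass cross-sectional regression; the thesis's SUR approach exists precisely to avoid the errors-in-variables problem that this second pass introduces.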
Risk analysis in management planning and project control (probabilistic techniques are applied to the estimation, planning, forecasting and control of large capital projects to ascertain and reduce the degree of inherent risk and uncertainty). Keller, A.Z.; Ashrafi, Rafi A. (University of Bradford, Industrial Technology, 2009-10-02). Effective estimation, planning and control of the functions, operations and resources of a project are among the most challenging tasks faced by the management of today's engineering and construction organisations. The increase in the size and complexity of modern projects demands a sound organisational structure and a rational approach. The main objectives of the present study are two-fold: firstly, to report and critically review theoretical and practical developments in different aspects of the management of engineering and construction projects; secondly, to develop further conceptual and practical techniques and processes, and to provide guidelines for making more effective use of resources and systems. To achieve these objectives, the present research was carried out in close collaboration with various industrial organisations. The current literature on project management is critically examined from the point of view of project cost estimation, planning and control. Various existing and recommended procedures, approaches and techniques are reviewed, with particular emphasis on probabilistic techniques. As the problems of scale increase, progressively more industries are adopting systems and project management approaches. Problems, deficiencies and gaps in the existing systems are identified. An analysis of a questionnaire survey on Systems-Caps is carried out and the results are reported. S-curves (or progress curves) are widely used in the planning and control of cost, time and resources, and a mathematical model for the S-curve is adopted for this purpose.
Expenditure data on a number of recent projects is analysed and fitted to the two S-curve models suggested by Keller-Singh and the Department of Health and Social Security (D.H.S.S.). A comparative study of the models is carried out. A set of standard parameters for the models is obtained, and the predictive accuracy of these models in forecasting expenditure for future similar projects is investigated. Quantification of the risk associated with the completion time of a project is studied. A number of stochastic distributions are fitted for this purpose to the programmed and actual durations of the different activities of a housing project; the maximum likelihood method is used to estimate the parameters of the fitted distributions. Given the increasing use of indices in the construction industry, building cost and tender price indices, together with their application, limitations and methods of formation, are discussed. Box-Jenkins models are employed to study past behaviour and to forecast future trends in labour, materials and building cost indices. Finally, general conclusions derived from the present research are summarised and areas requiring further research are proposed.
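Fitting an S-curve model to cumulative expenditure data, as described above, can be sketched with a generic logistic form. The Keller-Singh and D.H.S.S. formulations are not reproduced here; the logistic curve and the synthetic monthly data are assumptions used purely to illustrate the fitting-and-forecasting step.

```python
import numpy as np
from scipy.optimize import curve_fit

def s_curve(t, total, rate, midpoint):
    """Logistic cumulative-expenditure curve: spending starts slowly,
    accelerates mid-project and flattens out towards the total cost."""
    return total / (1.0 + np.exp(-rate * (t - midpoint)))

# Synthetic monthly cumulative expenditure for a twelve-month project
t = np.arange(1, 13, dtype=float)
spend = 100.0 / (1.0 + np.exp(-0.8 * (t - 6.0)))

# Fit the model; the recovered parameters (total cost, spend rate,
# mid-project month) can then forecast expenditure on similar projects.
params, _ = curve_fit(s_curve, t, spend, p0=[90.0, 0.5, 5.0])
```

Once a set of standard parameters has been established from completed projects, the same curve can be evaluated at future time points to predict the expenditure profile of a comparable new project.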