Undersampling Aware Learning based Fetal Health Prediction using Cardiotocographic Data


M. Shyamala Devi, S. Sridevi, D. Umanandhini, A. Peter Soosai Anandaraj, Sudheer Kumar Gupta, Bhumireddy Sidhartha


With recent advances in medicine, various ultrasound modalities are available for assessing fetal well-being, which is evaluated through diverse clinical parameters using 2-D imaging and other tests. However, predicting the health of the fetal heart remains an open problem owing to spontaneous fetal movements, the small size of the fetal heart and the scarcity of data in fetal echocardiography. Machine learning techniques can determine fetal heart rate classes that can be used for earlier screening. With this background, we use the cardiotocographic fetal heart rate dataset obtained from the UCI Machine Learning Repository to predict fetal heart rate health classes. The prediction of fetal health is carried out in six steps. First, the dataset is preprocessed with feature scaling and missing-value handling. Second, exploratory data analysis is performed and the distribution of the target feature is visualized. Third, the raw dataset is fitted to all the classifiers and their performance is analysed before and after feature scaling. Fourth, the raw dataset is subjected to undersampling methods, namely ClusterCentroids, RepeatedENN, AllKNN, CondensedNearestNeighbour, EditedNearestNeighbours, InstanceHardnessThreshold and NearMiss. Fifth, the datasets undersampled by the above methods are fitted to all the classifiers and their performance is analysed before and after feature scaling. Sixth, performance analysis is done using metrics such as precision, recall, F-score, accuracy and running time. The implementation is done in Python on the Spyder platform with Anaconda Navigator. Experimental results show that the Decision Tree classifier retains 98% accuracy both before and after feature scaling when undersampling with the EditedNearestNeighbours, RepeatedENN and AllKNN methods.
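The undersampling-then-classify pipeline described above can be sketched in a few lines of Python. The methods named in the abstract (EditedNearestNeighbours, AllKNN, NearMiss, etc.) are typically taken from the imbalanced-learn library; as a self-contained illustration, the sketch below hand-rolls a simple majority-vote variant of EditedNearestNeighbours with scikit-learn and runs it on synthetic stand-in data, since the actual UCI cardiotocography records are not reproduced here. The dataset shape, class weights and all parameter values are illustrative assumptions, not the paper's settings.

```python
import numpy as np
from scipy import stats
from sklearn.datasets import make_classification
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split
from sklearn.neighbors import NearestNeighbors
from sklearn.preprocessing import StandardScaler
from sklearn.tree import DecisionTreeClassifier

def edited_nearest_neighbours(X, y, k=3):
    """Drop samples whose label disagrees with the majority of their
    k nearest neighbours (a simplified ENN-style undersampler)."""
    nn = NearestNeighbors(n_neighbors=k + 1).fit(X)
    _, idx = nn.kneighbors(X)            # column 0 is the sample itself
    neigh_labels = y[idx[:, 1:]]         # labels of the k true neighbours
    mode = stats.mode(neigh_labels, axis=1).mode.ravel()
    keep = mode == y
    return X[keep], y[keep]

# Synthetic imbalanced stand-in for the cardiotocography data
# (sizes and weights are illustrative, not the UCI dataset's).
X, y = make_classification(n_samples=2126, n_features=21, n_informative=10,
                           n_classes=3, weights=[0.78, 0.14, 0.08],
                           random_state=0)

X = StandardScaler().fit_transform(X)            # feature scaling step
X_res, y_res = edited_nearest_neighbours(X, y)   # undersampling step

X_tr, X_te, y_tr, y_te = train_test_split(X_res, y_res, random_state=0)
clf = DecisionTreeClassifier(random_state=0).fit(X_tr, y_tr)
print(classification_report(y_te, clf.predict(X_te)))
```

The same loop would simply be repeated for each undersampler and each classifier, recording precision, recall, F-score, accuracy and running time before and after scaling, as the six-step procedure describes.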
