Enhancing Graduate Academic Performance Prediction and Classification: An Analysis Using the Enhanced Correlated Feature Set Model

Kandula Neha, Ram Kumar*

Department of Computer Science and Engineering, Lovely Professional University, Punjab 144402, India

School of Computing Science and Engineering VIT Bhopal University, Madhya Pradesh 466114, Bhopal, India

Corresponding Author Email: ramkumar@vitbhopal.ac.in

Pages: 1605-1612 | DOI: https://doi.org/10.18280/isi.280617

Received: 7 May 2023 | Revised: 28 August 2023 | Accepted: 9 October 2023 | Available online: 23 December 2023

© 2023 IIETA. This article is published by IIETA and is licensed under the CC BY 4.0 license (http://creativecommons.org/licenses/by/4.0/).

OPEN ACCESS

Abstract: 

Complex assessment processes and limited improvement opportunities contribute to the challenges currently confronting higher education institutions. Recent academic research has sought to leverage the unique datasets generated within these institutions, aiming to refine student identification and performance metrics. This study primarily investigates the role that data analysis and mining play in augmenting the assessment capabilities of these educational institutions. The uncertainties surrounding student progression and retention rates necessitate the systematic evaluation of the extensive data amassed within institutional databases and Learning Supervision Method (LSM) datasets. The intent is to provide academic professionals with detailed analytics on student activities and progression, enabling support to be tailored to individual students and thus potentially increasing the efficiency of their coursework completion. In response to the growing urgency to evaluate and improve graduation outcomes without compromising educational quality, this study employs classification and data mining techniques to analyze graduate students' academic performance. Educational institutions, striving to optimize limited resources, stand to benefit from the recognized decision-making power of deep learning models. A comparative analysis of elementary school students' performance versus college students' performance revealed superior performance by the elementary school cohort. The proposed Graduate Interlinked Precedent Academic-based Performance Analysis using Enhanced Correlated Feature Set Model (GIPA-PA-ECFSM) revealed a significant correlation between third-year and final-year performances. The proposed model aids in the identification of student performance across semesters, facilitating more effective student monitoring. Compared with traditional models, the proposed model demonstrates superior performance in analyzing graduate performance levels.

Keywords: 

graduate performance, interlinked precedent academic performance, feature set, deep learning, feature correlation

1. Introduction

In the face of escalating pressures from globalization and domestic economies, educational institutions are being compelled to implement innovative changes aimed at enhancing the quality of their academic programs and increasing the graduation rates of students with specialized degrees [1]. Students, being the primary investors in these institutions, substantially contribute to a country's economic and social development through the production of creative graduates. Pedagogical institutions hold a vested interest in maintaining operability and managing large student databases for efficient administration. With the proliferation of web technology, its adoption has become a standard practice in many educational institutions, facilitating the sharing of vital data about students, faculty, and their engagement with educational systems [2]. Supporting the development and expansion of society, higher education provides a community rich in resources, including directories of universities, faculty, staff, and listings of available training and development opportunities [3].

Academic success is a shared interest among educators, institutions, and businesses, and as such, it is crucial that students strive to excel academically to meet these stakeholders' expectations [4]. The reputation of a university grows in tandem with the quality of its graduates, making academic success a pivotal factor [5]. While the economic ramifications and extracurricular activities of a student may influence their academic success, most research posits that the culmination of academic success is marked by graduation [6].

The transformation of large data volumes into useful information for individuals, organizations, and governments necessitated the integration of previous methodologies. Data scientists rely heavily on insights derived from accounting records [7], empirical studies, and predictive models. The focus has now shifted towards predictive and correlative analytics, thanks to the development of efficient data storage, retrieval, and management systems [8]. Information science can be leveraged by educators to evaluate student progress in comparison to their peers, enhancing the systems of tutoring and monitoring employed by administrators and leaders [9].

Students embarking on their college journey at an introductory level often encounter multiple obstacles that may impede their academic progress. Not all students who commence higher education at a freshman level successfully graduate with the required coursework. Many university professors struggle to provide adequate support to their students. This work focuses on advanced organizational strategies for providing relevant conceptual support and meaningful verbal material, defining feedforward teaching methodologies to enhance learning. However, there is a paucity of research on feedforward strategies to improve students' learning outcomes. The aim of this study is to lay the groundwork for improved academic support and student autonomy. The primary contributions of this paper are the identification of critical feedforward properties and the proposal of effective feedforward strategies, with the objective of systematically adopting these strategies to aid student populations at the introductory level.

The prediction of academic performance is a central problem in the field of education data mining, as it is an essential precursor to achieving personalized learning [10]. Multiple studies have demonstrated that the following factors significantly influence academic performance:

  • Personality traits of the students, such as neuroticism, extroversion, and agreeableness.
  • Personal situations.
  • Motivating factors in daily life.
  • Learning-promoting behaviors.

Institutional managers can glean insights into a student's likelihood of graduation by analyzing academic performance data [11]. Traditionally, this analysis is conducted by professors, who, through class activities and midterm assessments, identify students at risk and implement timely corrective measures [12]. However, in contemporary higher education, identifying such students has become increasingly challenging due to escalating class sizes and diminished opportunities for individual student-faculty interaction. This is exacerbated by the increased availability of online educational resources and the growing number of students who are concurrently employed full-time [13].

Data mining and deep learning analyses provide more precise forecasting of student outcomes. With the advent of widespread computerization, many educational institutions amass vast amounts of data, much of which remains unutilized [14]. Extracting value from such data necessitates robust tools [15], and deep learning algorithms fit this role.

Feature selection is a crucial component of developing a student performance prediction system [16]. It can enhance the accuracy of academic performance prediction and identify the key factors influencing student performance. A novel score and ranking system is employed to develop a set of candidate qualities, followed by using a heuristic approach for the final result [17]. Predicting a student's academic performance can assist universities in identifying individuals who require financial aid, enhancing the quality of incoming students, facilitating students’ future planning, and helping struggling students identify solutions. The model used for performance prediction is driven by the attributes selected from the dataset [18]. Teachers can choose the most useful features using a feature selection technique, improving the outcomes of predictive analyses. Feature selection algorithms are particularly beneficial in educational institutions as they excel at extracting key features and avoiding redundancy without the risk of data loss [19]. Feature selection methods are employed to increase prediction accuracy while minimizing computational complexity, potentially enhancing the accuracy of models used to predict student grades [20].
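As an illustration of the correlation-based feature ranking described above, the sketch below scores candidate attributes by their absolute Pearson correlation with the target grade and keeps the top k. It is a minimal sketch, not the study's actual procedure; the column names and values are hypothetical placeholders.

```python
# Minimal sketch of correlation-based feature ranking: score each candidate
# attribute by its absolute Pearson correlation with the target grade and
# keep the top k. Column names and values are hypothetical placeholders.
import pandas as pd

def select_top_k_features(df: pd.DataFrame, target: str, k: int = 5) -> list:
    """Return the k feature columns most correlated with `target`."""
    scores = df.corr()[target].drop(target).abs()
    return scores.sort_values(ascending=False).head(k).index.tolist()

df = pd.DataFrame({
    "ssc_pct":    [78, 85, 62, 91, 70],
    "inter_pct":  [74, 88, 60, 93, 68],
    "sem5_gpa":   [7.1, 8.4, 5.9, 9.0, 6.5],
    "attendance": [88, 92, 70, 95, 75],
    "final_gpa":  [7.0, 8.6, 6.1, 9.1, 6.7],
})
print(select_top_k_features(df, target="final_gpa", k=2))
```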

The fields of Deep Learning and data mining in education have received substantial attention in recent years. This research proposes a Deep Neural Network (DNN) model to assist students in choosing appropriate class labels, aiding schools in identifying and supporting students at risk of failing [21]. Machine learning programs, utilizing students' historical data and term exam results, have proven effective in predicting future course performance. With these techniques, educators can swiftly identify students at risk of failing and intervene to enhance their performance. Moreover, it can identify top-performing students for scholarship opportunities.
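The paper does not publish its network code; the fragment below is only an assumption-laden sketch of how such a feed-forward classifier could be wired up with scikit-learn's MLPClassifier on synthetic prior-semester features, with the class labels ("high", "average", "low") derived from a toy rule.

```python
# Illustrative stand-in for a DNN mapping prior-semester features to a
# class label ("high" / "average" / "low"); the data and architecture are
# assumptions for demonstration, not the study's configuration.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = rng.uniform(4.0, 10.0, size=(200, 6))          # six prior-semester GPAs
avg = X.mean(axis=1)
y = np.where(avg > 8.0, "high", np.where(avg > 6.0, "average", "low"))

clf = make_pipeline(
    StandardScaler(),
    MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=2000, random_state=0),
)
clf.fit(X, y)
print(clf.predict(X[:3]))                          # predicted performance labels
```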

Students' academic standing and progress are reflected in grades, rankings, and passing standards [22]. Predicting student performance benefits academic monitoring and early warning systems, enabling students to make timely adjustments to their study habits and progress reports to avert setbacks [23]. Teachers can rapidly adapt their lesson plans to suit the needs of their classes and improve passing rates. Increased attention from counselors to students' needs can lead to improved academic outcomes. The future success of college and university graduates may hinge on differentiated teaching [24]. A strong correlation between third-year and final-year performance has been observed using the proposed Graduate Interlinked Precedent Academic-based Performance Analysis using Enhanced Correlated Feature Set Model (GIPA-PA-ECFSM).

2. Literature Survey

Assessing students' progress vis-à-vis course objectives is a fundamental component in the enhancement of both instructional quality and the overall learning experience. Contemporary education increasingly emphasizes outcomes, with each course delineated by a set of intended learning outcomes, providing both students and educators with a clear focus. Each learning outcome is generally constituted by an action verb and content matter, encapsulating what the students should have acquired and be able to demonstrate by the course's end. The action verb delineates the activity associated with the learning outcome, while the content embodies the resultant knowledge and skills.

In 1956, Benjamin Bloom introduced a taxonomy of learning objectives, segmenting educational goals into cognitive, affective, and psychomotor domains. Performance indicators (PIs), developed to facilitate the attainment of these educational objectives, serve not only as metrics of individual student needs but also as foundational elements for an evaluation framework. Upon establishing the course's learning outcomes, corresponding PIs that exemplify mastery of these outcomes are introduced. This subsequently informs the selection of student activities that comprise the course's assessment strategy.

Liu et al. [1] proposed a pedagogical approach incorporating a teaching strategy-based learning tool, designed to enhance students' skills concurrently with new material absorption. Research has demonstrated that continuous progress monitoring towards goals significantly enhances achievement. However, these studies did not consider the additional effort required to implement such strategies. Their model examines the time and effort committed to the implementation of continuous evaluation in a first-year Computer Fundamentals course within the Computer Engineering degree program at the Technical University of Valencia, Spain.

Almayan et al. [2] proposed an innovative performance prediction tool that utilizes matrix factorization and multi-regression. Initially developed for the study of e-commerce software, it has since been effectively repurposed to evaluate student performance. A degree planner component is integrated to identify students anticipated to struggle with specific subjects, along with forecasting future course outcomes.

A web-based analytic system, designed to enhance student performance and counseling, was introduced by Chen et al. [5]. This technology, grounded in principles similar to recommendation engines, groups students according to shared attributes. Each incoming student is allocated to a cohort based on these common features, and subsequently provided with a curated list of suggested courses. The k-means method is employed for data classification.

A performance analysis model utilizing a data mining approach was proposed by Faught et al. [8]. This model generates semantic rules, derived via a decision tree-based methodology, to enable a more in-depth analysis of student performance in a given course. The employment of semantic web and ontology methods in this system aims to enhance the quality of learning resources.

Fortier et al. [9] suggested the provision of a self-guided training system for nursing students. This system facilitates the learning of correct techniques for patient transfer from a bed to a wheelchair, demonstrated through video and checklist mechanisms. The training system incorporates two Kinect sensors for the evaluation of both the trainer's and patient's postures.

In another study, Pu et al. [12] employed campus card data to extract student behavior. Activities such as dining in the cafeteria, borrowing library books, obtaining study room water, and dormitory showers were categorized into two groups: study diligence and study order. Rigorous quantitative benchmarks were developed, revealing a correlation between students' actions and their academic performance.

Hooshyar et al. [14] amalgamated data from 683 university students, collected from the student information system, campus cards, and wireless internet access points. The authors quantified criteria to investigate the impact of students' study habits and adherence to guidelines on their academic grades. Despite their apparent lack of subjectivity, these studies highlight a connection between academic success and behavioral traits.

Jaber et al. [16] classified 2,459 undergraduates into five groups based on their grades. Utilizing data mining and machine learning algorithms, the authors projected a student's performance over four years based on data gathered at the end of the first year. Caveats regarding the wide variation in students' academic achievement and the challenges in accurately forecasting future performance were implicit throughout this research. Meanwhile, a statistical model was developed by Jie [17] to identify and support at-risk students, following the discovery of an association between college student alcohol consumption and academic performance.

Li et al. [18] conducted a study involving 323 samples, with the objective of determining the correlation between grade point average, dynamic grades in four requisite courses, and dynamic midterm grades. Subsequently, twenty-four mathematical models were developed. Despite the extensive efforts involved, it was found that success in one area does not necessarily guarantee success in another. It was observed that students from China's Joint School Cooperation Program generally exhibited poor performance, with some unable to graduate on time due to inadequate academic achievement.

An early warning system for students' academic performance was developed by Mengash [19], utilizing convolutional neural networks. Shandong University of Science and Technology's data served as the case study for this development. However, the full implementation of these methods remains restricted to a select number of academic institutions.

Miguéis et al. [20] recently explored the use of data mining and machine learning techniques to forecast academic achievement in the classroom. Using a sample of 2,039 students from 2016-2019, the study aimed to predict post-admission grades based on pre-enrollment academic performance. This could potentially provide universities with a theoretical framework for making admission decisions, as the research suggested that students' high school academic performance could serve as a predictor for their university academic success.

Poropat et al. [21] conducted an investigation into the relationship between self-directed learning, optimism, psychological well-being, and academic performance among students at Wuhan University. While optimism and psychological well-being were found to indirectly influence academic performance [22], self-directed learning was directly linked to academic performance. However, the generalizability of this study is limited due to the selective nature of high academic performance among college students.

The influence of personality on academic performance was examined by Quan et al. [22] using the Big Five Inventory (BFI) scale. The results indicated a strong correlation between students' personality traits and their academic performance. However, as this study was conducted among adult medical students, its findings may not be applicable to younger student populations.

A method for faculty performance evaluation was proposed by Scanlon and Smeaton [23], positing that teaching staff quality significantly influences student outcomes. This was achieved using a semantic web rule language-based ontology system to develop the required semantic rules. The reliability of this approach was verified using a representative dataset from a public university in Pakistan.

3. Proposed Model

Recently, a new educational philosophy known as outcome-based education (OBE) has gained widespread support and popularity. When it comes to education, this new paradigm puts the emphasis on what students actually learn rather than on what teachers originally set out to do. The phrase student outcome is commonly used to describe the expected levels of student mastery of content and transferable skills and values by the end of a course or program [25]. Course outcomes and program outcomes are two possible levels at which the outcomes are defined and evaluated. Curriculum mapping is an essential activity because it ensures that courses and programs are aligned in a way that allows students to successfully complete their intended learning outcomes [26].

Digital tools have been developed to record educational evaluation activities more effectively and to promote the accomplishment of OBE goals. The value of these tools could be greatly enhanced with the addition of intelligent models that can anticipate whether students will achieve the learning outcomes over the course of their education. The value of tracking student progress in higher education is difficult to overstate. Its benefits include, but are not limited to, defining the expected outcomes of a program, outlining expectations for students and teachers, and evaluating the efficacy of individual courses and entire programs. Several quality evaluation instruments and quality management frameworks have been presented to facilitate the implementation of the outcome-based education paradigm and the accreditation of programs.

Predicting students' performance provides other benefits, such as the opportunity to apply remedial interventions in the teaching and learning processes. Nonetheless, there is a lack of publications that investigate intelligent prediction of educational outcomes in depth. Furthermore, there is a lack of information about the factors and traits that affect academic achievement. Existing studies show that both academic factors, such as the quality of education and the amount of time students spend online, and extracurricular factors, such as parental participation and students' level of internal drive, are important. The dataset used in this study comprises information collected from universities about their students' academic performance.

Labeled data is data for which the desired outcome has already been determined. When performing feature extraction, users are essentially paring down a set of attributes. As opposed to feature selection, which merely ranks the existing attributes in order of their predictive significance, feature extraction actively transforms the attributes: the resulting features are linear combinations of the original features. The models are evaluated using the remaining labeled data. Before further processing, deep learning methods were applied to categorize the raw data.
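To make the distinction concrete, the sketch below uses PCA, one standard feature-extraction technique, to derive new features that are linear combinations of the originals. The raw attribute matrix is synthetic, and the choice of PCA is an assumption, since the paper does not name its transformation.

```python
# Feature extraction as linear combinations of the original attributes,
# illustrated with PCA on a synthetic matrix of raw academic attributes.
import numpy as np
from sklearn.decomposition import PCA

X_raw = np.random.default_rng(1).normal(size=(120, 8))   # 8 raw attributes
pca = PCA(n_components=3)
X_extracted = pca.fit_transform(X_raw)                    # 3 derived features

print(X_extracted.shape)                  # (120, 3)
print(pca.explained_variance_ratio_)      # variance captured by each component
```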

Students' attitudes, knowledge, and abilities must be assessed in an authentic and comprehensive manner based on student activities; hence, evaluation cannot be confined to the cognitive domain. Competence, attitudes, knowledge, and skills are all assessed in a balanced manner so that each student's progress toward predetermined goals can be determined. As a result, the old system of evaluating students' performance has given way to a more authentic evaluation system, with assessment of students' psychomotor abilities becoming a key component. Performance evaluation is one type of authentic assessment that can offer comprehensive data on students' progress toward, and actual attainment of, their Physics learning outcomes. Students' abilities in many domains can be evaluated through performance appraisal by having them demonstrate their proficiency in those domains. Performance-based assessment is an alternative approach to gauging students' knowledge and abilities in the real world. Because such assessments measure learning not only through results but also through processes and varied methods, performance appraisal is an element of authentic evaluation that is considered better able to quantify the overall results of student learning. For the psychomotor domain of physics learning, performance evaluation is one of the most highly recommended assessment methods; performance review is another. As a result, this method of evaluating students' progress can contribute to their learning.

A key aspect of any model-building effort is evaluation of the working model. It aids in determining which model most accurately depicts the available data and how effectively the selected model will perform in the long run. In data science, it is not acceptable to evaluate a model's efficacy using the same data that was used to train it, as doing so can easily lead to overoptimistic and overfitted models. Each classification model's performance is estimated by averaging its results, and graphs are used to display the previously categorized information. Accuracy is the proportion of test data that is predicted correctly. A strong correlation between third-year and final-year performance has been found using the proposed Graduate Interlinked Precedent Academic-based Performance Analysis using Enhanced Correlated Feature Set Model (GIPA-PA-ECFSM).
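A minimal sketch of that evaluation protocol, under the assumption of a synthetic dataset and a simple stand-in classifier, is shown below: the labelled data is split, the model is trained on one part, and accuracy is reported only on the part it never saw.

```python
# Evaluation on held-out labelled data rather than on the training set.
# The data and the decision-tree stand-in are illustrative assumptions.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(2)
X = rng.uniform(4.0, 10.0, size=(300, 5))          # five semester GPAs
y = (X[:, :4].mean(axis=1) > 7.0).astype(int)      # 1 = strong final outcome

X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.2, random_state=0, stratify=y)
model = DecisionTreeClassifier(max_depth=4, random_state=0).fit(X_tr, y_tr)
print("held-out accuracy:", accuracy_score(y_te, model.predict(X_te)))
```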

The student set $\{S_1, S_2, \ldots, S_M\}$ and the academic parameters $\{P_1, P_2, \ldots, P_N\}$ are considered for the academic performance analysis. The average academic parameter is computed as Eq. (1):

$P_{\text{analysis}}=\frac{\sum_{s=1}^M\left(\operatorname{maxgrade}\left(P_s\right)+\operatorname{maxgrade}\left(P_{s+1}\right)\right)}{\sum_{s=1}^M \operatorname{maxgrade}\left(P_{s+N}\right)}$        (1)
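One plausible literal reading of Eq. (1) is sketched below: the maximum grades of consecutive parameters are summed and normalised by the maximum grade of the last parameter accumulated over the same index range. The grade lists are toy values, and the indexing choices are assumptions where the notation is ambiguous.

```python
# Literal sketch of one reading of Eq. (1); indexing choices are assumptions
# where the notation is ambiguous, and the grades are toy data.
def p_analysis(parameter_grades):
    """parameter_grades[s] = all students' grades for academic parameter P_s."""
    max_grade = [max(g) for g in parameter_grades]
    numerator = sum(max_grade[s] + max_grade[s + 1]
                    for s in range(len(max_grade) - 1))
    denominator = sum(max_grade[-1] for _ in range(len(max_grade) - 1))
    return numerator / denominator

grades = [[72, 88, 65], [70, 91, 60], [75, 85, 68]]   # P_1 .. P_3
print(p_analysis(grades))
```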

The features considered from the dataset are used for training the model, and the extracted features are used for the academic performance analysis. Feature extraction is performed as Eq. (2):

$\operatorname{Fset}(SG[M])=\sum_{s=1}^{M} \frac{\operatorname{avg}\left(\max \left(P_s, P_{s+1}\right)+\operatorname{getattr}\left(P_s, P_{s+1}\right)\right)}{\operatorname{size}(\lambda)}+\frac{\min \left(P_s, P_{s+1}\right)}{\operatorname{size}(\lambda)}$        (2)

Here, λ denotes the set of records in the dataset considered. The extracted features are used to verify the correlation factor, and the maximally correlated values are considered for analyzing the performance levels. The enhanced correlated feature set is generated as Eq. (3):

$\operatorname{CFset}(S(i))=\left[\sum_{s=1}^M \frac{\left(\max (\operatorname{semester}(P(s+1)))-\max (\operatorname{semester}(s))\right)^N}{\min (\operatorname{semester}(s)) / \min (\operatorname{semester}(s-1))}\right]$           (3)
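The sketch below gives one plausible, literal reading of Eq. (3) for a single student: the gain in best marks between consecutive semesters, scaled by the ratio of the corresponding worst marks, accumulated across semesters. The marks matrix and the exponent N = 1 are assumptions for illustration.

```python
# One literal reading of Eq. (3): accumulate, over semesters, the change in
# best marks scaled by the ratio of worst marks. The marks are toy data and
# the exponent n = 1 is an assumption.
import numpy as np

def correlated_feature_set(sem_marks: np.ndarray, n: int = 1) -> float:
    """sem_marks has shape (num_semesters, num_subjects) for one student."""
    best = sem_marks.max(axis=1)
    worst = sem_marks.min(axis=1)
    score = 0.0
    for s in range(1, len(sem_marks) - 1):
        score += (best[s + 1] - best[s]) ** n / (worst[s] / worst[s - 1])
    return score

sem_marks = np.array([[65, 70, 72],
                      [68, 74, 71],
                      [72, 78, 80],
                      [75, 82, 84]])
print(correlated_feature_set(sem_marks))
```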

A deep learning model is applied with a deep layered structure in which the performance analysis parameters are represented as layers, improving the accuracy rate and level ordering. Estimating the appearance of a permutation layer allows two academic parameter sections separated by it to be joined. Different convolution kernels can be detected in extracted feature mapping sets, such as map P and map Q, if they are obtained by summing their convolutions. Following common practice, the features are used as input, and the academic performance levels in each semester are then aggregated and analyzed for the final prediction. The hidden layer processing is performed as Eq. (4):

$\operatorname{Hidd\_Layer}(i, o, P)=\sum_{p=1}^M \max (\operatorname{Fset}(p))+\frac{\max (\operatorname{CFset}(p))}{\min (\operatorname{CFset}(p))} \cdot \operatorname{len}(\operatorname{CFset})+\frac{\min (P(i, i+1))+\lim _{s \rightarrow p}\left(\max (p)+\frac{\min (p)}{M}\right)^2}{\operatorname{sizeof}(p)}$            (4)

The performance levels of the undergraduates are analyzed based on the features considered. These are the most useful features for identifying student academic performance. The feature subset interlinking is performed as Eq. (5):

$\operatorname{Flink}(\operatorname{CFset}[M])=\frac{\sum_{s \in \operatorname{CFset}} \operatorname{mean}\left(P_s, P_{s+1}\right)+\operatorname{Hidd\_Layer}(s)-\sum_{s=1}^{M} \min \left(P_{s-1}, P_s\right)}{\operatorname{size}(\operatorname{CFset})}\left\{\begin{array}{ll}\text{Improved} & \text{if } \operatorname{mean}\left(P_s, P_{s+1}\right)>Th \\ \text{Poor} & \text{otherwise}\end{array}\right.$            (5)
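The decision rule attached to Eq. (5) can be sketched as a simple threshold check, as below; the threshold value used here is a placeholder assumption rather than a value reported in the paper.

```python
# Thresholding step of Eq. (5): a student whose mean performance over two
# consecutive parameters exceeds the threshold `th` is labelled "Improved",
# otherwise "Poor". The threshold 7.0 is a placeholder assumption.
def label_progress(p_curr: float, p_next: float, th: float = 7.0) -> str:
    return "Improved" if (p_curr + p_next) / 2 > th else "Poor"

print(label_progress(6.8, 7.6))   # Improved (mean 7.2 > 7.0)
print(label_progress(5.9, 6.3))   # Poor (mean 6.1 <= 7.0)
```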

The proposed model performs the student performance analysis based on the interlinked features to obtain accurate academic performance levels. The academic performance analysis is performed as Eq. (6):

$\operatorname{Perf\_Pred}[M]=\frac{\sum_{s=1}^M \max (\operatorname{Flink}(i))+\operatorname{diff}(\max (P(s+1), P(s)))}{\max (\operatorname{Flink}(s))}+\frac{\sum_{s=1}^M \operatorname{getattr}(P(s-1), P(s))}{\max (\operatorname{Fset}(s))}$            (6)

4. Results

The field of education has been making consistent progress for years. Data growth and technical development are positive trends that point to the sector's continued advancement. Furthermore, the number of students enrolled in all types of universities has increased dramatically. While enrolment has increased, that does not necessarily mean that students are staying in university longer or making more academic progress toward their eventual diploma. Persistently high rates of attrition have overwhelmed the higher education sector, with some estimates placing the figure at over 75% in some regions of the developing world. Considering all these factors, it is evident that institutional databases alone cannot provide nearly enough data to produce an accurate prediction of a student's success. Equally, the bulk of the work in ensuring a student's success and performance lies not with the supervisor or the university administration, but with the student themselves.

The evaluation and assessment process in higher education is used to gauge students' progress toward achieving program goals. The purpose of learning outcomes is to specify what knowledge and abilities students should have acquired at the end of a course. Assessment is defined as "all methods used to gather information about student learning", and it is a crucial part of an efficient teaching and learning process. The process begins with the establishment of desired learning outcomes and concludes with an evaluation of the degree to which those objectives have been met. The importance of a course or program can be clearly communicated to students through the learning outcomes and evaluation methods utilized by teachers. As a result, assessment has consequential effects on a student's development and performance.

Assessment is also used to monitor how well students are performing in class and acquiring new abilities through study, since learning, execution, and experience shape an individual's behavior and performance. During the evaluation process, institutions take into account a wide range of factors, including students' abilities in areas such as problem-solving, communication, teamwork, behavioral outlook, laboratory performance, and project analysis. Mathematical and statistical approaches based on approximate data have been utilized to evaluate these requirements; this vagueness originates in teachers' subjective evaluations of students' work.

The failure to actively involve students in establishing their own success projections is a fundamental shortcoming of earlier models. While those models have generated some promising ideas, they are inadequate to address the most serious concerns in today's classrooms. The proposed model is implemented in Python and executed in Google Colab. The dataset is the result of questioning and interviewing 2,497 students. A strong correlation between third-year and final-year performance has been found using the proposed Graduate Interlinked Precedent Academic-based Performance Analysis using Enhanced Correlated Feature Set Model (GIPA-PA-ECFSM). Sample records from the dataset are shown in Figure 1.
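The third-year versus final-year relationship reported here can be checked with a one-line Pearson correlation, sketched below on hypothetical column names and values rather than the actual dataset.

```python
# Pearson correlation between third-year and final-year marks, computed
# with pandas. Column names and values are hypothetical placeholders.
import pandas as pd

df = pd.DataFrame({
    "third_year_pct": [72, 81, 64, 90, 58, 77],
    "final_year_pct": [74, 83, 66, 92, 61, 79],
})
print(df["third_year_pct"].corr(df["final_year_pct"]))   # Pearson by default
```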

The students' SSC (Secondary School Certificate) percentage is considered to analyze individual performance. The majority of students perform well, and the performance levels are shown in Figure 2. The undergraduate students' performance analysis starts from the SSC level to enhance the accuracy of the analysis.
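The kind of distribution view behind Figure 2 can be produced with a simple histogram, sketched below; the column name and values are hypothetical placeholders, not the study's data.

```python
# Sketch of a distribution plot of the kind shown in Figure 2: a histogram
# of students' SSC percentages. Column name and values are hypothetical.
import matplotlib.pyplot as plt
import pandas as pd

df = pd.DataFrame({"ssc_pct": [78, 85, 62, 91, 70, 74, 88, 66, 93, 81]})
plt.hist(df["ssc_pct"], bins=5, edgecolor="black")
plt.xlabel("SSC percentage")
plt.ylabel("Number of students")
plt.title("SSC performance levels of students")
plt.show()
```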

After analyzing SSC performance, Intermediate performance is considered; the performance levels are represented in Figure 3. Students who performed well in SSC also performed well at the Intermediate level, and the number of students showing improvement at this level is satisfactory.

Figure 1. Dataset records

Figure 2. SSC performance levels of students

Figure 3. Inter performance levels of students

Figure 4. Graduates first semester performance levels

Figure 5. Performance level comparison of SSC & intermediate

Figure 6. Second year performance levels

Figure 7. Performance comparison of recent semesters

The analysis identifies students' adaptation to engineering education. The first semester of the undergraduate program is considered and analyzed; the first-semester performance levels are shown in Figure 4.

A comparison of SSC and Intermediate marks shows that students who performed poorly in SSC also performed poorly at the Intermediate level. Figure 5 presents the performance level comparison of SSC and Intermediate marks.

The second-year performance levels of multiple students are considered. Inconsistencies are observed in most cases of second-year performance. The second-year performance levels are shown in Figure 6.

The graduates' two most recent semester performance levels are also analyzed, and the results are shown in Figure 7. Students whose performance was low in the previous semesters show poor improvement from semester to semester.

5. Conclusion

Deep learning relies on the ongoing utilization of datasets and deep learning technologies to effectively predict student performance. Choosing the right method of study to guarantee precise questions is crucial to students' success; even so, the system alone cannot provide the most accurate forecasts. Feature engineering, the act of modifying datasets for deep learning, is a crucial aspect of producing reliable prediction findings. The purpose of this study was to investigate feature engineering and the selected technique to determine how they might enhance prediction outcomes. Feature engineering was essential when choosing which features to use. The present standard for classification and regression trees was achieved through the development of both custom features and further feature engineering, leveraging the dataset's autonomous and human-driven capabilities. Clearly, the process of examining and analyzing student performance is aided by the proposed fuzzy expert system. Colleges and universities must prioritize this activity if they want their students to be adequately prepared to succeed in today's highly competitive job market. Teachers who can identify where their pupils are lacking will have the greatest impact on their students' development of personal accountability. In any case, knowing how their students are doing academically can help teachers improve their lessons, group work, games, and overall assessment of progress. After all, students can get ahead in life by getting an education that caters to the demands of the workforce and the general public.

This research demonstrates that the proposed model is highly adaptable, and one of its benefits is that it allows the attributes to be weighted differently depending on an institution's specific requirements. The proposed Graduate Interlinked Precedent Academic-based Performance Analysis utilizing Enhanced Correlated Feature Set Model uncovered a robust relationship between third-year and final-year performance. No attempt was made to convert the dependent attributes into binary-valued attributes in the original research. The purpose of this study is to employ a supervised learning algorithm using a deep learning approach to accurately categorize student performance as high, average, or low. A user interaction model may be derived to dynamically supply student records and alert teachers about underachieving students. In future work, a Neural Network might be utilized to make the prediction, which, if accurate, would improve the outcome. Criteria outside the classroom may also be taken into account when making forecasts.

References

[1] Liu, D., Zhang, Y., Zhang, J., Li, Q., Zhang, C., Yin, Y.U. (2020). Multiple features fusion attention mechanism enhanced deep knowledge tracing for student performance prediction. IEEE Access, 8: 194894-194903. https://doi.org/10.1109/ACCESS.2020.3033200

[2] Almayan, H., Al Mayyan, W. (2016). Improving accuracy of students' final grade prediction model using PSO. In 2016 6th International Conference on Information Communication and Management (ICICM), IEEE, Hatfield, UK, pp. 35-39. https://doi.org/10.1109/INFOCOMAN.2016.7784211

[3] Cao, Y., Gao, J., Lian, D., Rong, Z., Shi, J., Wang, Q., Wu, Y., Yao, H., Zhou, T. (2018). Orderliness predicts academic performance: Behavioural analysis on campus lifestyle. Journal of The Royal Society Interface, 15(146): 20180210. https://doi.org/10.1098/rsif.2018.0210

[4] Chamorro-Premuzic, T., Furnham, A. (2003). Personality predicts academic performance: Evidence from two longitudinal university samples. Journal of Research in Personality, 37(4): 319-338. https://doi.org/10.1016/S0092-6566(02)00578-0

[5] Chen, C.H., Yang, S.J., Weng, J.X., Ogata, H., Su, C.Y. (2021). Predicting at-risk university students based on their e-book reading behaviours by using machine learning classifiers. Australasian Journal of Educational Technology, 37(4): 130-144. https://doi.org/10.14742/ajet.6116

[6] Wang, D., Lian, D., Xing, Y., Dong, S., Sun, X., Yu, J. (2022). Analysis and prediction of influencing factors of college student achievement based on machine learning. Frontiers in Psychology, 13: 881859. https://doi.org/10.3389/fpsyg.2022.881859

[7] Entwistle, N.J., Nisbet, J., Entwistle, D., Cowell, M.D. (1971). The academic performance of students: 1-prediction from scales of motivation and study methods. British Journal of Educational Psychology, 41(3): 258-267. https://doi.org/10.1111/j.2044-8279.1971.tb00670.x

[8] Faught, E.L., Ekwaru, J.P., Gleddie, D., Storey, K.E., Asbridge, M., Veugelers, P.J. (2017). The combined impact of diet, physical activity, sleep and screen time on academic achievement: A prospective study of elementary school students in Nova Scotia, Canada. International Journal of Behavioral Nutrition and Physical Activity, 14: 1-13. https://doi.org/10.1186/s12966-017-0476-0

[9] Fortier, M.S., Vallerand, R.J., Guay, F. (1995). Academic motivation and school performance: Toward a structural model. Contemporary Educational Psychology, 20(3): 257-274. https://doi.org/10.1006/ceps.1995.1017

[10] Fu, J.H., Chang, J.H., Huang, Y.M., Chao, H.C. (2012). A support vector regression-based prediction of students' school performance. In 2012 International Symposium on Computer, Consumer and Control, IEEE, Taichung, Taiwan, pp. 84-87. https://doi.org/10.1109/IS3C.2012.31

[11] Gilbert, S.P., Weaver, C.C. (2010). Sleep quality and academic performance in university students: A wake-up call for college psychologists. Journal of College Student Psychotherapy, 24(4): 295-306. https://doi.org/10.1080/87568225.2010.509245

[12] Pu, H.T., Fan, M.Q., Zhang, H.B., You, B.Z., Lin, J.J., Liu, C.F., Zhao, Y.Z., Song, R. (2021). Predicting academic performance of students in Chinese-foreign cooperation in running schools with graph convolutional network. Neural Computing and Applications, 33: 637-645. https://doi.org/10.1007/s00521-020-05045-9

[13] Hallinan, M.T. (Ed.). (2006). Handbook of the sociology of education. Springer Science & Business Media.

[14] Hooshyar, D., Pedaste, M., Yang, Y. (2019). Mining educational data to predict students’ performance through procrastination behavior. Entropy, 22(1): 12. https://doi.org/10.3390/e22010012

[15] Huang, S., Fang, N. (2013). Predicting student academic performance in an engineering dynamics course: A comparison of four types of predictive mathematical models. Computers & Education, 61: 133-145. https://doi.org/10.1016/j.compedu.2012.08.015

[16] Jaber, M., Al-Samarrai, B., Salah, A., Varma, S.R., Karobari, M.I., Marya, A. (2022). Does general and specific traits of personality predict students’ academic performance? BioMed Research International, 2022. https://doi.org/10.1155/2022/9422299

[17] Jie, S. (2020). Analysis of learning behavior and prediction of learning achievement based on campus big data. Master's thesis, Central China Normal University, Wuhan.

[18] Li, J., Yang, D., Hu, Z. (2022). Wuhan college students’ self-directed learning and academic performance: Chain-mediating roles of optimism and mental health. Frontiers in Psychology, 12: 757496. https://doi.org/10.3389/fpsyg.2021.757496

[19] Mengash, H.A. (2020). Using data mining techniques to predict student performance to support decision making in university admission systems. IEEE Access, 8: 55462-55470. https://doi.org/10.1109/ACCESS.2020.2981905

[20] Miguéis, V.L., Freitas, A., Garcia, P.J., Silva, A. (2018). Early segmentation of students according to their academic performance: A predictive modelling approach. Decision Support Systems, 115: 36-51. https://doi.org/10.1016/j.dss.2018.09.001

[21] Poropat, A.E. (2009). A meta-analysis of the five-factor model of personality and academic performance. Psychological Bulletin, 135(2): 322-338. https://doi.org/10.1037/a0014996

[22] Quan, W., Zhou, Q., Zhong, Y., Wang, P. (2019). Predicting at-risk students using campus meal consumption records. The International Journal of Engineering Education, 35(2): 563-571.

[23] Scanlon, P., Smeaton, A.F. (2017). Using wifi technology to identify student activity within a bounded environment. In European Conference on Technology Enhanced Learning. https://doi.org/10.1007/978-3-319-66610-5_45

[24] Shuyuan Z., Taisheng C., Zhijian C. (2012). A research on achievement emotions of college students and its relationship to school-work achievement. Chinese Journal of Clinical Psychology, 20(3): 398-400. https://doi.org/10.16128/j.cnki.1005-3611.2012.03.006

[25] Neha, K., Sidiq, J., Zaman, M. (2021). Deep neural network model for identification of predictive variables and evaluation of student's academic performance. Revue d'Intelligence Artificielle, 35(5). https://doi.org/10.18280/ria.350507

[26] Neha, K., Kumar, R., Jahangeer Sidiq, S., Zaman, M. (2023). Deep neural networks predicting student performance. In Proceedings of International Conference on Data Science and Applications: ICDSA 2022. Singapore: Springer Nature Singapore, 2: 71-79. https://doi.org/10.1007/978-981-19-6634-7_6