Volume 23, Nº 1 (2024)
Artificial intelligence, knowledge and data engineering
Analytical Review of Methods for Automatic Analysis of Extra-Linguistic Units in Spontaneous Speech
Abstract



Sentiment Analysis Framework for Telugu Text Based on Novel Contrived Passive Aggressive with Fuzzy Weighting Classifier (CPSC-FWC)
Abstract
Natural language processing (NLP) is a subset of artificial intelligence that enables algorithms to interact with people in their own languages. Within NLP, sentiment analysis serves numerous applications, including evaluating sentiment in Telugu. Several unsupervised machine-learning algorithms, such as k-means clustering with cuckoo search, have been used to classify Telugu text. However, these techniques struggle to cluster data with variable cluster sizes and densities, and they suffer from slow search speeds and poor convergence accuracy. To address these shortcomings, this study develops a unique ML-based sentiment analysis system for Telugu text. Initially, in the pre-processing stage, the proposed Linear Pursuit Algorithm (LPA) removes white space, punctuation, and stop words. For POS tagging, this research proposes a Conditional Random Field with lexicon weighting; following that, a Contrived Passive Aggressive with Fuzzy Weighting Classifier (CPSC-FWC) is proposed to classify the sentiments in Telugu text. The method we propose produces efficient outcomes in terms of accuracy, precision, recall, and F1-score.
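The LPA pre-processor, lexicon-weighted CRF, and CPSC-FWC classifier are specific to this work and not public, so the following Python sketch only approximates the described pipeline under stated assumptions: regex cleanup stands in for LPA, TF-IDF features replace the POS-tagging stage, and a hand-rolled passive-aggressive (PA-I) update damped by fuzzy membership weights mimics the fuzzy-weighted classifier. The corpus, labels, and weights are hypothetical.

import re
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer

STOP_WORDS = {"మరియు", "ఒక"}  # tiny illustrative Telugu stop list

def preprocess(text):
    text = re.sub(r"[^\w\s]", " ", text)               # strip punctuation
    return " ".join(t for t in text.split() if t not in STOP_WORDS)

def pa_fit(X, y, fuzzy_w, C=1.0, epochs=10):
    # PA-I updates; each step is damped by the sample's fuzzy weight,
    # a stand-in for the fuzzy weighting stage of CPSC-FWC.
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        for xi, yi, mi in zip(X, y, fuzzy_w):
            loss = max(0.0, 1.0 - yi * (xi @ w))       # hinge loss
            if loss > 0:
                tau = min(C, loss / (xi @ xi + 1e-12)) * mi
                w += tau * yi * xi
    return w

docs = ["చిత్రం చాలా బాగుంది", "సినిమా నచ్చలేదు"]      # hypothetical reviews
y = np.array([1, -1])                                  # +1 positive, -1 negative
fuzzy_w = [0.9, 0.7]                                   # stand-in membership degrees

X = TfidfVectorizer().fit_transform(preprocess(d) for d in docs).toarray()
w = pa_fit(X, y, fuzzy_w)
print(np.sign(X @ w))                                  # recovers the labels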



Evaluation of the Informativeness of Features in Datasets for Continuous Verification
Abstract



Building an Online Learning Model Through a Dance Recognition Video Based on Deep Learning
Abstract
Jumping motion recognition from video is a significant contribution because it considerably impacts intelligent applications and will be widely adopted in everyday life. The method can be used to train future dancers with innovative technology: challenging poses are repeated and improved over time, reducing the strain on the instructor of demonstrating them many times, and dancers' movements can be reconstructed by extracting features from their images. Our model recognizes dancers' moves, checks and corrects their poses, and extracts cognitive features for efficient evaluation and classification; deep learning is currently one of the best approaches for this kind of short-form video feature extraction. In addition, evaluating the quality of a performance video and the accuracy of each dance step is a difficult problem when the judges' eyes cannot focus completely on the dance on stage. Dance on video is also of great interest to researchers today, as the technology develops rapidly and becomes increasingly useful as a substitute for human judgment. Motivated by actual conditions and needs in Vietnam, this paper proposes a method to replace manual evaluation: our approach evaluates dance through short videos. We analyze dance in short-form videos, applying deep learning techniques to assess performance and collect data from which accurate conclusions can be drawn. Experiments show that our assessment is relatively accurate, reaching more than 92.38% accuracy and a 91.18% F1-score. This demonstrates that our method performs well in dance evaluation analysis.
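The abstract does not specify the network architecture, so the following is only a generic Python (PyTorch) sketch of short-video action classification of the kind described: a small per-frame CNN feeds an LSTM that aggregates frames into a single step/pose prediction. All shapes and the two example classes ("correct step" vs. "faulty step") are assumptions.

import torch
import torch.nn as nn

class DanceStepClassifier(nn.Module):
    def __init__(self, n_classes=2, feat_dim=64):
        super().__init__()
        self.cnn = nn.Sequential(                  # per-frame feature extractor
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(32, feat_dim),
        )
        self.lstm = nn.LSTM(feat_dim, 64, batch_first=True)
        self.head = nn.Linear(64, n_classes)

    def forward(self, clip):                       # clip: (B, T, 3, H, W)
        b, t = clip.shape[:2]
        feats = self.cnn(clip.flatten(0, 1))       # (B*T, feat_dim)
        feats = feats.view(b, t, -1)               # back to (B, T, feat_dim)
        _, (h, _) = self.lstm(feats)               # last hidden state
        return self.head(h[-1])                    # (B, n_classes)

clip = torch.randn(2, 16, 3, 112, 112)             # 2 clips of 16 frames each
print(DanceStepClassifier()(clip).shape)           # torch.Size([2, 2])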



Iterative Tuning of Tree-Ensemble-Based Models' Parameters Using Bayesian Optimization for Breast Cancer Prediction
Abstract
The study presents a method for iterative parameter tuning of tree-ensemble-based models using Bayesian hyperparameter optimization for state prediction, using breast cancer as an example. The proposed method utilizes three different datasets, namely the Wisconsin Diagnostic Breast Cancer (WDBC) dataset, the Surveillance, Epidemiology, and End Results (SEER) breast cancer dataset, and the Breast Cancer Coimbra dataset (BCCD), and implements tree-ensemble-based models, specifically AdaBoost, GentleBoost, LogitBoost, Bag, and RUSBoost, for breast cancer prediction. Bayesian optimization was used to tune the hyperparameters of the models iteratively, and the performance of the models was evaluated using several metrics, including accuracy, precision, recall, and F1-score. Our results show that the proposed method significantly improves the performance of tree-ensemble-based models, resulting in higher accuracy, precision, recall, and F1-score. Compared to other state-of-the-art models, the proposed method is more efficient. It achieved perfect scores of 100% for accuracy, precision, recall, and F1-score on the WDBC dataset. On the SEER BC dataset, the method achieved an accuracy of 95.9%, a precision of 97.6%, a recall of 94.2%, and an F1-score of 95.9%. For the BCCD dataset, the method achieved an accuracy of 94.7%, a precision of 90%, a recall of 100%, and an F1-score of 94.7%. The outcomes of this study have important implications for medical professionals, as early detection of breast cancer can significantly increase the chances of survival. Overall, this study provides a valuable contribution to the field of breast cancer prediction using machine learning.
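As a hedged illustration of the tuning loop, the following Python sketch uses scikit-optimize's BayesSearchCV as the Bayesian optimizer and AdaBoost as one of the five ensembles; scikit-learn's bundled copy of the WDBC data stands in for the three datasets, and the search space is illustrative rather than the one used in the study.

from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split
from skopt import BayesSearchCV
from skopt.space import Integer, Real

X, y = load_breast_cancer(return_X_y=True)         # WDBC features and labels
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

search = BayesSearchCV(
    AdaBoostClassifier(random_state=0),
    {                                              # illustrative search space
        "n_estimators": Integer(50, 500),
        "learning_rate": Real(1e-3, 1.0, prior="log-uniform"),
    },
    n_iter=25,                                     # Bayesian iterations
    cv=5,
    random_state=0,
)
search.fit(X_tr, y_tr)                             # iterative Bayesian tuning
print(search.best_params_, search.score(X_te, y_te))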



Competence Coefficients Calculation Method of Participants in Group Decision-Making for Selecting the Best Alternative with the Multivariate of the Result
Abstract



Digital information telecommunication technologies
Model of Satellite Communication Channel Functioning under Conditions of Episodic Synchronization with Pulse Interference Flows
Abstract



A Method for Ensuring the Functional Stability of a Communication System by Detecting Conflicts
Abstract



Graph Attention Network Enhanced Power Allocation for Wireless Cellular System
Abstract
The importance of an efficient network resource allocation strategy has grown significantly with the rapid advancement of cellular network technology and the widespread use of mobile devices. Efficient resource allocation is crucial for enhancing user services and optimizing network performance. The primary objective is to optimize the power allocation method so as to maximize the sum rate over all users in the network. In recent years, graph-based deep learning approaches have shown great promise in addressing the challenge of network resource allocation. Graph neural networks (GNNs) have particularly excelled at handling graph-structured data, benefiting from the inherent topological characteristics of mobile networks. However, many of these methodologies focus predominantly on node features during the learning phase, overlooking or oversimplifying edge attributes, which are as vital as nodes in network modeling. To tackle this limitation, we introduce a novel framework, the Heterogeneous Edge Feature Enhanced Graph Attention Network (HEGAT), which establishes a direct connection between the evolving network topology and the optimal power allocation strategy throughout the learning process. Extensive simulation results show that the proposed HEGAT approach achieves improved performance and significant generalization capability.
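HEGAT itself is not reproduced here; the following Python (PyTorch) sketch only illustrates the core idea the abstract describes, namely letting edge features (e.g., interference-channel gains) enter the attention logits alongside node features before decoding node embeddings into per-transmitter power levels. All dimensions and the toy inputs are assumptions.

import torch
import torch.nn as nn
import torch.nn.functional as F

class EdgeAwareGATLayer(nn.Module):
    def __init__(self, node_dim, edge_dim, out_dim):
        super().__init__()
        self.proj = nn.Linear(node_dim, out_dim)
        self.att = nn.Linear(2 * out_dim + edge_dim, 1)   # edge-aware scores
        self.power = nn.Sequential(nn.Linear(out_dim, 1), nn.Sigmoid())

    def forward(self, x, e, p_max=1.0):
        # x: (N, node_dim) node features; e: (N, N, edge_dim) edge features
        h = self.proj(x)                                  # (N, out_dim)
        n = h.size(0)
        pairs = torch.cat(                                # all (i, j) pairs
            [h.unsqueeze(1).expand(n, n, -1),
             h.unsqueeze(0).expand(n, n, -1), e], dim=-1)
        alpha = F.softmax(self.att(pairs).squeeze(-1), dim=1)  # attention (N, N)
        h_new = F.elu(alpha @ h)                          # aggregate neighbors
        return p_max * self.power(h_new).squeeze(-1)      # powers in (0, p_max)

x = torch.randn(4, 8)        # 4 transmitter nodes, 8 features each
e = torch.randn(4, 4, 3)     # pairwise channel/edge features
print(EdgeAwareGATLayer(8, 3, 16)(x, e))                  # 4 power levels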



Latency-Aware Intelligent Task Offloading Scheme for Edge-Fog-Cloud Computing – A Review
Abstract
The huge volume of data produced by IoT devices requires the processing power and storage provided by cloud, edge, and fog computing systems. Each of these computing paradigms has benefits as well as drawbacks. Cloud computing improves information storage and computational capability but increases connection delay. Edge and fog computing offer similar advantages with decreased latency, but they have restricted storage, capacity, and coverage. Initially, optimization was employed to overcome the traffic offloading problem. However, conventional optimization cannot keep up with the tight latency requirements of decision-making in complex systems, which range from milliseconds to sub-seconds. As a result, ML algorithms, particularly reinforcement learning, are gaining popularity since they can swiftly handle offloading issues in dynamic situations involving partially unknown data. We analyze the literature to examine the different techniques used to tackle latency-aware intelligent task offloading schemes for cloud, edge, and fog computing. The lessons learned from these surveys are then presented in this review. Lastly, we identify additional avenues for study and problems that must be overcome to attain the lowest latency in the task offloading system.
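As a minimal illustration of the reinforcement-learning offloading approach the review surveys, the following Python sketch trains a tabular Q-learner to pick local, edge, fog, or cloud execution per task class, using negative latency as the reward; the delay model and all constants are invented for illustration.

import random

ACTIONS = ["local", "edge", "fog", "cloud"]
SIZES = [0, 1, 2]                 # small / medium / large task classes

def latency(size, action):        # hypothetical delay model (ms)
    compute = {"local": 50, "edge": 15, "fog": 10, "cloud": 5}[action]
    network = {"local": 0, "edge": 5, "fog": 10, "cloud": 40}[action]
    return compute * (size + 1) + network

Q = {(s, a): 0.0 for s in SIZES for a in ACTIONS}
alpha, eps = 0.1, 0.2             # learning rate, exploration rate
random.seed(0)

for _ in range(5000):             # episodes of single offload decisions
    s = random.choice(SIZES)
    a = (random.choice(ACTIONS) if random.random() < eps
         else max(ACTIONS, key=lambda a: Q[(s, a)]))
    r = -latency(s, a)            # lower latency -> higher reward
    Q[(s, a)] += alpha * (r - Q[(s, a)])

for s in SIZES:                   # learned offload target per task class
    print(s, max(ACTIONS, key=lambda a: Q[(s, a)]))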


