• Title/Summary/Keyword: MachineLearning

Search Results: 5,657 (processing time 0.03 seconds)

Small Cell Communication Analysis based on Machine Learning in 5G Mobile Communication

  • Kim, Yoon-Hwan
    • Journal of Integrative Natural Science
    • /
    • v.14 no.2
    • /
    • pp.50-56
    • /
    • 2021
  • Due to the recent growth of the mobile streaming market, mobile traffic is increasing exponentially. IMT-2020, designated by the ITU as the next-generation mobile communication standard, is known as 5th-generation mobile communication (5G) and satisfies far higher requirements for data traffic capacity, low latency, energy efficiency, and cost efficiency than the existing LTE (Long Term Evolution) system. 5G achieves this by utilizing high frequency bands, but high-band transmission suffers from severe path loss, which strongly degrades system performance. In this paper, small cell technology is presented as a solution for the high-frequency operation of 5G mobile communication systems, and system performance is further improved by applying machine learning to the decision between macro-cell and small-cell communication. The results show that both the small cell deployment and the machine learning technique improve system performance.
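As a minimal sketch of the macro/small-cell decision idea (not the paper's actual model), a classifier can be trained on labeled association examples; here a one-feature decision stump learns a distance threshold from hypothetical samples:

```python
# Illustrative sketch only: learn a distance threshold that decides
# macro-cell vs small-cell association. All training data is hypothetical.

def train_threshold(samples):
    """samples: list of (distance_m, label), label 'macro' or 'small'.
    Returns the distance threshold minimising misclassifications."""
    best_t, best_err = None, float("inf")
    for t in sorted(d for d, _ in samples):
        # rule: distance >= t -> 'macro' (high-band path loss too severe)
        err = sum((("macro" if d >= t else "small") != y) for d, y in samples)
        if err < best_err:
            best_t, best_err = t, err
    return best_t

data = [(30, "small"), (45, "small"), (60, "small"),
        (120, "macro"), (150, "macro"), (200, "macro")]
threshold = train_threshold(data)
decide = lambda d: "macro" if d >= threshold else "small"
```

A real system would use richer features (SNR, load, mobility) and a stronger learner, but the decision structure is the same.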

Machine Learning-based SOH Estimation Algorithm Using a Linear Regression Analysis (선형 회귀 분석법을 이용한 머신 러닝 기반의 SOH 추정 알고리즘)

  • Kang, Seung-Hyun;Noh, Tae-Won;Lee, Byoung-Kuk
    • The Transactions of the Korean Institute of Power Electronics
    • /
    • v.26 no.4
    • /
    • pp.241-248
    • /
    • 2021
  • A battery state-of-health (SOH) estimation algorithm using a machine learning-based linear regression method is proposed for estimating battery aging. The proposed algorithm analyzes the change trend of the open-circuit voltage (OCV) curve, a parameter related to SOH. A section in which the SOH-OCV relationship is highly linear is selected and used for SOH estimation. The SOH of the aged battery is then estimated over the selected interval using machine learning-based linear regression. The performance of the proposed SOH estimation algorithm is verified through experiments and simulations using battery packs for electric vehicles.
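The core idea, fitting a line between an OCV-derived feature and SOH over the high-linearity section, can be sketched as follows; the OCV feature and data points below are made up for illustration:

```python
# Illustrative sketch: ordinary least squares between a hypothetical OCV
# feature (voltage at a fixed SOC inside the linear section) and SOH.

def fit_line(xs, ys):
    """One-feature least squares for y = a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return a, my - a * mx

# hypothetical training data: (OCV feature [V], measured SOH [%])
ocv = [3.70, 3.68, 3.66, 3.64, 3.62]
soh = [100.0, 95.0, 90.0, 85.0, 80.0]
a, b = fit_line(ocv, soh)
estimate_soh = lambda v: a * v + b  # SOH estimate for a new OCV reading
```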

A Residual Power Estimation Scheme Using Machine Learning in Wireless Sensor Networks (센서 네트워크에서 기계학습을 사용한 잔류 전력 추정 방안)

  • Bae, Shi-Kyu
    • Journal of Korea Multimedia Society
    • /
    • v.24 no.1
    • /
    • pp.67-74
    • /
    • 2021
  • As IoT (Internet of Things) devices such as smart sensors have constrained power sources, a power management strategy is critical in WSNs (Wireless Sensor Networks). It is therefore necessary to know the residual power of each sensor node, which, however, normally requires additional data transmission and thus more power consumption. In this paper, a residual power estimation method is proposed that consumes a negligibly small amount of power in resource-constrained wireless networks such as WSNs. Using a machine learning method trained on a small amount of training data, residual power can be predicted with minimal data transmission. The performance of the proposed scheme was evaluated through machine learning experiments, simulation, and analysis.
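One way to read this idea, sketched below with invented numbers rather than the paper's scheme: a sink node fits a simple discharge model to a few sparse residual-power reports, then estimates a node's power between reports without further transmissions:

```python
# Illustrative sketch: fit residual_power = a*t + b from a handful of
# reported samples, then predict without querying the node again.
# Report values are hypothetical.

def fit_discharge(reports):
    """reports: list of (time_s, residual_mWh)."""
    n = len(reports)
    mt = sum(t for t, _ in reports) / n
    mp = sum(p for _, p in reports) / n
    a = sum((t - mt) * (p - mp) for t, p in reports) / \
        sum((t - mt) ** 2 for t, _ in reports)
    return a, mp - a * mt

reports = [(0, 1000.0), (600, 940.0), (1200, 880.0)]  # sparse training data
a, b = fit_discharge(reports)
predict_power = lambda t: a * t + b  # estimate without extra transmission
```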

Machine learning-based Multi-modal Sensing IoT Platform Resource Management (머신러닝 기반 멀티모달 센싱 IoT 플랫폼 리소스 관리 지원)

  • Lee, Seongchan;Sung, Nakmyoung;Lee, Seokjun;Jun, Jaeseok
    • IEMEK Journal of Embedded Systems and Applications
    • /
    • v.17 no.2
    • /
    • pp.93-100
    • /
    • 2022
  • In this paper, we propose a machine learning-based method for supporting resource management of IoT software platforms in a multi-modal sensing scenario. We assume that an IoT device running a oneM2M-compatible software platform is connected to various sensors, such as PIR, sound, dust, ambient light, ultrasonic, and accelerometer sensors, through different embedded system interfaces such as general-purpose input/output (GPIO), I2C, SPI, and USB. Based on a collected dataset including CPU usage and user-defined priority, a machine learning model is trained to estimate the nice value that should be applied according to the resource usage patterns. The proposed method is validated by comparison with a rule-based control strategy, demonstrating its practical capability in a multi-modal sensing scenario on IoT devices.
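A minimal sketch of the mapping being learned, with an assumed toy dataset rather than the paper's collected one: a 1-nearest-neighbour model that maps (CPU usage, user-defined priority) to a nice value for a sensing process:

```python
# Illustrative sketch (hypothetical data, not the paper's dataset):
# 1-NN from (cpu_pct, priority) to the nice value to apply.

def predict_nice(train, cpu, prio):
    """train: list of ((cpu_pct, priority), nice). Returns nearest nice."""
    dist = lambda s: (s[0][0] - cpu) ** 2 + (s[0][1] - prio) ** 2
    return min(train, key=dist)[1]

train = [((90, 1), -5),   # busy system, high-priority sensor -> boost
         ((90, 5), 10),   # busy system, low-priority sensor  -> deprioritise
         ((10, 1), 0),    # idle system -> leave default
         ((10, 5), 0)]

nice = predict_nice(train, cpu=85, prio=4)
# the estimated nice value would then be applied with os.setpriority / renice
```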

Machine Learning-based UWB Error Correction Experiment in an Indoor Environment

  • Moon, Jiseon;Kim, Sunwoo
    • Journal of Positioning, Navigation, and Timing
    • /
    • v.11 no.1
    • /
    • pp.45-49
    • /
    • 2022
  • In this paper, we propose a method for estimating the error of Ultra-Wideband (UWB) distance measurements from the channel impulse response (CIR) of the UWB signal using machine learning. Due to the recent demand for indoor location-based services, wireless-signal-based localization technologies such as UWB, Wi-Fi, and Bluetooth are being studied. The structural obstacles that make up an indoor environment render UWB distance measurements inaccurate, which lowers indoor localization accuracy. Therefore, we apply machine learning to learn the characteristics of UWB signals and estimate the error of UWB distance measurements. In addition, the performance of the proposed algorithm is analyzed through experiments in an indoor environment composed of various walls.
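As a hedged sketch of the pipeline (synthetic CIRs and errors, not the paper's data or model): derive one simple CIR feature, the ratio of first-path energy to total energy, which tends to drop under non-line-of-sight conditions, and fit a linear map from it to the ranging error:

```python
# Illustrative sketch: CIR feature extraction + one-feature least squares
# from feature to ranging error. All values below are synthetic.

def cir_feature(cir, first_path_idx):
    """Ratio of first-path energy to total CIR energy."""
    total = sum(a * a for a in cir)
    return cir[first_path_idx] ** 2 / total

def fit(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return a, my - a * mx

cirs = [([5.0, 1.0, 0.5], 0),   # strong first path (LOS-like)
        ([3.0, 3.0, 2.0], 0),
        ([1.0, 4.0, 4.0], 0)]   # weak first path (NLOS-like)
errors_m = [0.1, 0.5, 0.9]      # synthetic ranging errors [m]
feats = [cir_feature(c, i) for c, i in cirs]
a, b = fit(feats, errors_m)
# corrected range: subtract the estimated error from the raw measurement
correct = lambda rng, cir: rng - (a * cir_feature(cir, 0) + b)
```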

A Strategy for Constructing the Thesaurus of Traditional East Asian Medicine (TEAM) Terms With Machine Learning (기계 학습을 이용한 한의학 용어 유의어 사전 구축 방안)

  • Oh, Junho
    • Journal of Korean Medical classics
    • /
    • v.35 no.1
    • /
    • pp.93-102
    • /
    • 2022
  • Objectives: We propose a method for constructing a thesaurus of Traditional East Asian Medicine terminology using machine learning. Methods: We present a method combining an 'automatic step', which uses machine learning, and a 'manual step', in which an operator reviews the results. By applying this method to sample data, we constructed a simple thesaurus and examined the results. Results: Out of 17,874 sample records, a thesaurus was constructed covering 749 terms. 200 candidate groups were derived in the automatic step, from which 79 synonym groups were derived in the manual step. Conclusions: The proposed method is likely to save the resources required to construct a thesaurus.
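A minimal sketch of what an automatic step might look like (our own illustration, with hypothetical English terms and threshold, not the paper's actual algorithm): group terms whose string similarity exceeds a threshold as synonym candidates, leaving the groups for human review in the manual step:

```python
# Illustrative sketch: string-similarity candidate grouping as a stand-in
# for the automatic step. Terms and threshold are hypothetical.
import difflib

def candidate_groups(terms, threshold=0.7):
    groups, used = [], set()
    for i, t in enumerate(terms):
        if i in used:
            continue
        group = [t]
        used.add(i)
        for j in range(i + 1, len(terms)):
            if j in used:
                continue
            if difflib.SequenceMatcher(None, t, terms[j]).ratio() >= threshold:
                group.append(terms[j])
                used.add(j)
        if len(group) > 1:          # only multi-term groups are candidates
            groups.append(group)
    return groups

terms = ["ginseng root", "ginseng roots", "licorice", "liquorice"]
groups = candidate_groups(terms)    # candidates for the manual review step
```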

An Approach to Applying Multiple Linear Regression Models by Interlacing Data in Classifying Similar Software

  • Lim, Hyun-il
    • Journal of Information Processing Systems
    • /
    • v.18 no.2
    • /
    • pp.268-281
    • /
    • 2022
  • The development of information technology is bringing many changes to everyday life, and machine learning can be used to solve a wide range of real-world problems. Analysis and utilization of data are essential processes in applying machine learning to real-world problems. As a method of processing data in machine learning, we propose an approach based on applying multiple linear regression models by interlacing data to the task of classifying similar software. Linear regression is widely used in estimation problems to model the relationship between input and output data. In our approach, multiple linear regression models are generated by training on interlaced feature data. A combination of these multiple models is then used as the prediction model for classifying similar software. Experiments are performed to evaluate the proposed approach against conventional linear regression, and the results show that the proposed method classifies similar software more accurately than the conventional model. We anticipate that the proposed approach can be applied to various kinds of classification problems to improve the accuracy of conventional linear regression.
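One possible reading of the interlacing idea, sketched with synthetic data (this is our simplified illustration, not the paper's exact algorithm): split each feature vector into interlaced subsets, fit one simple regression per subset, and average the models' predictions:

```python
# Illustrative sketch: interlaced feature subsets -> one model each ->
# averaged prediction. Each model sees features k, k+n, k+2n, ...

def fit_1d(xs, ys):
    """One-feature least squares for y = a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return a, my - a * mx

def train_interlaced(samples, n_models=2):
    """samples: list of (features, y). Model k is trained on the sum of
    the k-th interlaced feature subset."""
    ys = [y for _, y in samples]
    return [fit_1d([sum(f[k::n_models]) for f, _ in samples], ys)
            for k in range(n_models)]

def predict(models, features):
    n = len(models)
    preds = [a * sum(features[k::n]) + b for k, (a, b) in enumerate(models)]
    return sum(preds) / n           # combine the interlaced models

samples = [([1, 1, 1, 1], 4.0), ([2, 2, 2, 2], 8.0), ([3, 3, 3, 3], 12.0)]
models = train_interlaced(samples)
```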

A Sweet Persimmon Grading Algorithm using Object Detection Techniques and Machine Learning Libraries (객체 탐지 기법과 기계학습 라이브러리를 활용한 단감 등급 선별 알고리즘)

  • Roh, SeungHee;Kang, EunYoung;Park, DongGyu;Kang, Young-Min
    • Journal of Korea Multimedia Society
    • /
    • v.25 no.6
    • /
    • pp.769-782
    • /
    • 2022
  • Research on agricultural automation has become increasingly important. In Korea, sweet persimmon farmers spend a great deal of time and effort grading marketable persimmons. In this paper, we propose and implement an efficient grading algorithm for persimmons before shipment. We gathered more than 1,750 images of persimmons, and the images were graded and labeled for classification purposes. Our main algorithm is based on the EfficientDet object detection model, but we implemented a more refined method for better classification performance. To improve classification precision, we adopted a machine learning algorithm selected through PyCaret, a machine learning workflow automation library. Finally, we obtained an improved classification model with an accuracy of 81%.
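The two-stage structure described above can be sketched as a skeleton; the feature names, grade centroids, and classifier below are hypothetical stand-ins, not the paper's EfficientDet/PyCaret models:

```python
# Illustrative pipeline skeleton: stage 1 (a detector, stubbed here) yields
# per-fruit feature tuples; stage 2 grades each with a nearest-centroid
# classifier. Features (diameter_mm, blemish_score) are hypothetical.

CENTROIDS = {
    "premium":  (85.0, 0.05),
    "standard": (75.0, 0.20),
    "reject":   (60.0, 0.60),
}

def grade(features):
    """Assign the grade whose centroid is closest to the feature tuple."""
    dist = lambda g: sum((a - b) ** 2 for a, b in zip(features, CENTROIDS[g]))
    return min(CENTROIDS, key=dist)

def grade_image(detections):
    """detections: per-fruit feature tuples from the detection stage."""
    return [grade(f) for f in detections]
```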

Optimizing Artificial Neural Network-Based Models to Predict Rice Blast Epidemics in Korea

  • Lee, Kyung-Tae;Han, Juhyeong;Kim, Kwang-Hyung
    • The Plant Pathology Journal
    • /
    • v.38 no.4
    • /
    • pp.395-402
    • /
    • 2022
  • To predict rice blast, many machine learning methods have been proposed. As the quality and quantity of input data are essential for machine learning techniques, this study develops three artificial neural network (ANN)-based rice blast prediction models by combining two ANN architectures, the feed-forward neural network (FFNN) and long short-term memory (LSTM), with diverse input datasets, and compares their performance. The Blast_Weather_FFNN model had the highest recall score (66.3%) for rice blast prediction. This model requires two types of input data: blast occurrence data for the last 3 years and weather data (daily maximum temperature, relative humidity, and precipitation) between January and July of the prediction year. This study showed that the performance of an ANN-based disease prediction model can be improved by applying suitable machine learning techniques together with hyperparameter tuning and optimization of the input data. Moreover, we highlight the importance of the systematic collection of long-term disease data.
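The models above are ranked by recall, the fraction of actual outbreak years the model catches; a minimal sketch of that metric on hypothetical yearly predictions (1 = outbreak):

```python
# Illustrative sketch of the evaluation metric only, not the paper's models.

def recall(y_true, y_pred):
    """Recall = true positives / (true positives + false negatives)."""
    tp = sum(t == 1 and p == 1 for t, p in zip(y_true, y_pred))
    fn = sum(t == 1 and p == 0 for t, p in zip(y_true, y_pred))
    return tp / (tp + fn)

y_true = [1, 0, 1, 1, 0, 1]   # hypothetical observed blast years
y_pred = [1, 0, 0, 1, 1, 1]   # hypothetical model predictions
```

Recall is a natural choice here because missing a real epidemic (a false negative) is costlier than a false alarm.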

Trend of Edge Machine Learning as-a-Service (서비스형 엣지 머신러닝 기술 동향)

  • Na, J.C.;Jeon, S.H.
    • Electronics and Telecommunications Trends
    • /
    • v.37 no.5
    • /
    • pp.44-53
    • /
    • 2022
  • The Internet of Things (IoT) is growing exponentially, with the number of IoT devices multiplying annually. Accordingly, to reduce latency and cost, the paradigm is shifting from cloud computing to edge computing and even tiny edge computing, and machine learning is likewise moving from the cloud to the edge or tiny edge. However, the fragmented and resource-constrained nature of IoT devices has limited the development of artificial intelligence applications. Edge MLaaS (Machine Learning as-a-Service) has been studied as a way to apply machine learning to products easily and quickly and to overcome these device limitations. This paper briefly summarizes what Edge MLaaS is and which research elements it requires.