• Title/Summary/Keyword: Machine learning in communications


Analysis on Trends of No-Code Machine Learning Tools

  • Yo-Seob, Lee;Phil-Joo, Moon
    • International Journal of Advanced Culture Technology / v.10 no.4 / pp.412-419 / 2022
  • The amount of digital text data is growing exponentially, and many machine learning solutions are used to monitor and manage this data. Artificial intelligence and machine learning appear in many areas of daily life, but the underlying processes and concepts are not easy for most people to understand. Because running a conventional machine learning solution requires many experts, no-code machine learning tools are an attractive alternative: platforms that allow machine learning tasks to be performed without engineers or developers. The latest no-code machine learning tools run in the browser, so no additional software needs to be installed, and their simple GUIs make them easy to use. These platforms can save considerable money and time because they require less expertise and less code. No-code machine learning tools also make artificial intelligence and machine learning easier to understand. In this paper, we examine no-code machine learning tools and compare their features.

A Sweet Persimmon Grading Algorithm using Object Detection Techniques and Machine Learning Libraries (객체 탐지 기법과 기계학습 라이브러리를 활용한 단감 등급 선별 알고리즘)

  • Roh, SeungHee;Kang, EunYoung;Park, DongGyu;Kang, Young-Min
    • Journal of Korea Multimedia Society / v.25 no.6 / pp.769-782 / 2022
  • Research on agricultural automation has become increasingly important. In Korea, sweet persimmon farmers spend considerable time and effort classifying marketable persimmons. In this paper, we propose and implement an efficient grading algorithm for persimmons before shipment. We gathered more than 1,750 images of persimmons, which were graded and labeled for classification purposes. Our main algorithm is based on the EfficientDet object detection model, but we implemented a more refined method for better classification performance. To improve classification precision, we adopted a machine learning algorithm proposed by PyCaret, a machine learning workflow automation library. The resulting classification model achieved an accuracy of 81%.
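
The two-stage idea above, an object detector producing per-fruit features and a separate classifier assigning a grade, can be sketched with a deliberately simple second stage. The feature vectors, grade labels, and nearest-centroid classifier below are illustrative assumptions, not the paper's EfficientDet/PyCaret pipeline.

```python
import numpy as np

# Hypothetical detector-derived features per fruit: [box_area, mean_red, blemish_score]
# Grades: 0 = premium, 1 = standard, 2 = cull (illustrative labels, not the paper's)

def fit_centroids(X, y):
    """Compute one centroid per grade from labeled feature vectors."""
    classes = np.unique(y)
    return classes, np.stack([X[y == c].mean(axis=0) for c in classes])

def predict(X, classes, centroids):
    """Assign each sample the grade of its nearest centroid."""
    d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
    return classes[np.argmin(d, axis=1)]

# Toy training data
X_train = np.array([[0.9, 0.8, 0.1], [0.85, 0.75, 0.15],   # premium
                    [0.6, 0.6, 0.4], [0.55, 0.65, 0.35],   # standard
                    [0.3, 0.4, 0.9], [0.35, 0.45, 0.85]])  # cull
y_train = np.array([0, 0, 1, 1, 2, 2])

classes, centroids = fit_centroids(X_train, y_train)
print(predict(np.array([[0.88, 0.79, 0.12]]), classes, centroids))  # → [0] (premium)
```

A production grader would replace the centroid step with a learned model selected over many candidates, which is the role PyCaret plays in the paper.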

Predicting Crop Production for Agricultural Consultation Service

  • Lee, Soong-Hee;Bae, Jae-Yong
    • Journal of Information and Communication Convergence Engineering / v.17 no.1 / pp.8-13 / 2019
  • Smart farming is regarded as an important application of information and communications technology (ICT). Selecting crops for cultivation at the pre-production stage is critical to agricultural producers' final profits, because over-production and under-production can result in substantial losses; predicting crop production is therefore necessary to prevent them. The ITU-T Recommendation for smart farming (Y.4450/Y.2238) defines a plan/production consultation service at the pre-production stage, and this type of service must estimate crop production predictively. Several studies have shown that machine learning can predict crop production once the relevant data are learned, but these techniques have had little connection to standardized ICT services. This paper clarifies the relationship between agricultural consultation services and crop production prediction. A prediction scheme is proposed, and the results confirm the usability and superiority of machine learning for predicting crop production.
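
As a minimal sketch of the kind of predictive crop-production model such a consultation service needs, the following fits ordinary least squares to toy pre-production features. The features, data, and model choice are assumptions for illustration, not the paper's scheme.

```python
import numpy as np

def fit_linear(X, y):
    """Least-squares fit with an intercept column."""
    A = np.column_stack([np.ones(len(X)), X])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef

def predict(coef, X):
    A = np.column_stack([np.ones(len(X)), X])
    return A @ coef

# Toy history of past seasons: [planted area (ha), rainfall index] -> production (tons),
# generated here as exactly production = 2*area + 3*rainfall
X_hist = np.array([[10.0, 1.0], [12.0, 0.8], [8.0, 1.2], [15.0, 1.1]])
y_hist = 2 * X_hist[:, 0] + 3 * X_hist[:, 1]

coef = fit_linear(X_hist, y_hist)
print(predict(coef, np.array([[11.0, 1.0]])))  # ≈ 25.0 tons for the planned season
```

A real service would swap the linear model for a stronger learner and feed it weather, soil, and market data, but the interface (past seasons in, predicted production out) is the same.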

Recent deep learning methods for tabular data

  • Yejin Hwang;Jongwoo Song
    • Communications for Statistical Applications and Methods / v.30 no.2 / pp.215-226 / 2023
  • Deep learning has made great strides with unstructured data such as text, images, and audio. For tabular data analysis, however, machine learning algorithms such as ensemble methods still outperform deep learning. To keep pace with machine learning algorithms of strong predictive power, several deep learning methods for tabular data have recently been proposed. In this paper, we review the latest deep learning models for tabular data and compare their performance on several datasets. We also compare the latest boosting methods against these deep learning methods and suggest guidelines for users analyzing tabular datasets. For regression, machine learning methods are better than deep learning methods; for classification problems, deep learning methods perform better in some cases.
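
To make the boosting side of the comparison concrete, here is a minimal gradient boosting regressor built from depth-1 trees (stumps), the core mechanism behind the boosting libraries such papers benchmark. The data and hyperparameters are illustrative assumptions.

```python
import numpy as np

def best_stump(X, residual):
    """Find the single feature/threshold split minimizing squared error on the residual."""
    best = None
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j]):
            left = X[:, j] <= t
            if left.all() or (~left).all():
                continue
            lv, rv = residual[left].mean(), residual[~left].mean()
            err = ((residual[left] - lv) ** 2).sum() + ((residual[~left] - rv) ** 2).sum()
            if best is None or err < best[0]:
                best = (err, j, t, lv, rv)
    return best[1:]

def fit_gbm(X, y, n_rounds=50, lr=0.1):
    """Each round fits a stump to the current residual and adds it with shrinkage."""
    base = y.mean()
    pred = np.full(len(y), base)
    stumps = []
    for _ in range(n_rounds):
        j, t, lv, rv = best_stump(X, y - pred)
        pred = pred + lr * np.where(X[:, j] <= t, lv, rv)
        stumps.append((j, t, lv, rv))
    return base, stumps

def predict_gbm(model, X, lr=0.1):
    base, stumps = model
    pred = np.full(len(X), base)
    for j, t, lv, rv in stumps:
        pred = pred + lr * np.where(X[:, j] <= t, lv, rv)
    return pred

rng = np.random.default_rng(0)
X = rng.uniform(0, 1, (200, 3))
y = np.where(X[:, 0] > 0.5, 3.0, 1.0) + 0.5 * X[:, 1]  # step plus linear term

model = fit_gbm(X, y)
mse = ((predict_gbm(model, X) - y) ** 2).mean()
print(mse)  # far below the variance of y
```

The additive trees handle the step in `X[:, 0]` natively, which is exactly the kind of structure where tree ensembles beat plain neural networks on tabular data.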

Multiple Discriminative DNNs for I-Vector Based Open-Set Language Recognition (I-벡터 기반 오픈세트 언어 인식을 위한 다중 판별 DNN)

  • Kang, Woo Hyun;Cho, Won Ik;Kang, Tae Gyoon;Kim, Nam Soo
    • The Journal of Korean Institute of Communications and Information Sciences / v.41 no.8 / pp.958-964 / 2016
  • In this paper, we propose an i-vector based language recognition system that identifies the spoken language of a speaker using multiple discriminative deep neural network (DNN) models, analogous to a multi-class support vector machine (SVM) classification system. The proposed model was trained and tested using the i-vectors in the NIST 2015 i-vector Machine Learning Challenge database, and was shown to outperform conventional language recognition methods such as cosine distance, SVM, and softmax NN classifiers in open-set experiments.
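
The cosine-distance baseline mentioned above can be sketched directly: score a test i-vector against per-language mean i-vectors and reject low-scoring inputs as out-of-set. The dimensionality, data, and threshold below are illustrative assumptions, not the challenge setup.

```python
import numpy as np

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def recognize(test_ivec, lang_means, threshold=0.5):
    """Return the best-scoring language, or None for out-of-set inputs."""
    scores = {lang: cosine(test_ivec, mean) for lang, mean in lang_means.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] >= threshold else None

rng = np.random.default_rng(1)
# Stand-in language models: mean i-vectors per language (400-dim, random for the sketch)
lang_means = {"eng": rng.normal(size=400), "kor": rng.normal(size=400)}

test = lang_means["kor"] + 0.1 * rng.normal(size=400)  # utterance close to the kor model
print(recognize(test, lang_means))
```

The paper's contribution replaces this fixed scoring rule with multiple discriminative DNNs, but the open-set decision (accept the top score only above a threshold) has the same shape.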

Priority-based learning automata in Q-learning random access scheme for cellular M2M communications

  • Shinkafi, Nasir A.;Bello, Lawal M.;Shu'aibu, Dahiru S.;Mitchell, Paul D.
    • ETRI Journal / v.43 no.5 / pp.787-798 / 2021
  • This paper applies learning automata to improve the performance of a Q-learning based random access channel (QL-RACH) scheme in a cellular machine-to-machine (M2M) communication system. A prioritized learning automata QL-RACH (PLA-QL-RACH) access scheme is proposed. The scheme employs a prioritized learning automata technique to improve the throughput performance by minimizing the level of interaction and collision of M2M devices with human-to-human devices sharing the RACH of a cellular system. In addition, this scheme eliminates the excessive punishment suffered by the M2M devices by controlling the administration of a penalty. Simulation results show that the proposed PLA-QL-RACH scheme improves the RACH throughput by approximately 82% and reduces access delay by 79% with faster learning convergence when compared with QL-RACH.
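
The Q-learning ingredient of QL-RACH can be sketched as a stateless multi-agent loop in which each M2M device learns a preferred RACH slot, rewarded for collision-free transmissions. The frame size, learning rate, rewards, and exploration schedule below are illustrative assumptions, not the paper's PLA-QL-RACH parameters.

```python
import random

random.seed(0)
N_SLOTS, N_DEVICES = 8, 8
ALPHA, EPISODES = 0.1, 2000
Q = [[0.0] * N_SLOTS for _ in range(N_DEVICES)]  # one Q-row of slot values per device

def choose(q_row, eps):
    """Epsilon-greedy slot selection."""
    if random.random() < eps:
        return random.randrange(N_SLOTS)
    return max(range(N_SLOTS), key=q_row.__getitem__)

for ep in range(EPISODES):
    eps = max(0.01, 1.0 - ep / 1000)                    # decaying exploration
    picks = [choose(Q[d], eps) for d in range(N_DEVICES)]
    for d, s in enumerate(picks):
        reward = 1.0 if picks.count(s) == 1 else -1.0   # success vs collision penalty
        Q[d][s] += ALPHA * (reward - Q[d][s])           # stateless Q-value update

final = [choose(Q[d], 0.0) for d in range(N_DEVICES)]
print(sorted(final))  # with successful convergence, all 8 slots are distinct
```

The paper's learning-automata layer sits on top of updates like these, prioritizing actions and softening the penalty so M2M devices do not over-punish themselves when sharing the RACH with human-to-human traffic.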

Review of Korean Speech Act Classification: Machine Learning Methods

  • Kim, Hark-Soo;Seon, Choong-Nyoung;Seo, Jung-Yun
    • Journal of Computing Science and Engineering / v.5 no.4 / pp.288-293 / 2011
  • To resolve ambiguities in speech act classification, various machine learning models have been proposed over the past 10 years. In this paper, we review these machine learning models and present the results of experimental comparison of three representative models, namely the decision tree, the support vector machine (SVM), and the maximum entropy model (MEM). In experiments with a goal-oriented dialogue corpus in the schedule management domain, we found that the MEM has lighter hardware requirements, whereas the SVM has better performance characteristics.
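
At inference time, the maximum entropy model in this comparison is a softmax over weighted features. The cue features and hand-set weights below are invented for illustration; a real system would learn the weights from a labeled dialogue corpus such as the one used in the paper.

```python
import math

ACTS = ["request", "inform", "confirm"]
# Hypothetical binary cue features extracted from an utterance:
# [has_question_mark, has_please, has_date_word]
WEIGHTS = {
    "request": [0.5, 2.0, 0.2],
    "inform":  [-1.0, -0.5, 1.5],
    "confirm": [2.0, 0.0, 0.5],
}

def classify(feature_vec):
    """Softmax over linear scores; returns (best_act, probability)."""
    scores = {a: sum(w * f for w, f in zip(WEIGHTS[a], feature_vec))
              for a in ACTS}
    z = sum(math.exp(s) for s in scores.values())
    probs = {a: math.exp(s) / z for a, s in scores.items()}
    best = max(probs, key=probs.get)
    return best, probs[best]

act, p = classify([1, 0, 0])   # question mark only
print(act, round(p, 3))        # prints: confirm 0.786
```

The SVM alternative in the comparison scores the same feature vectors but with margin-based decision functions instead of calibrated probabilities.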

Introduction to convolutional neural network using Keras; an understanding from a statistician

  • Lee, Hagyeong;Song, Jongwoo
    • Communications for Statistical Applications and Methods / v.26 no.6 / pp.591-610 / 2019
  • Deep learning is a family of machine learning methods that finds features in large datasets through non-linear transformations, and it is now commonly used for supervised learning in many fields. In particular, the convolutional neural network (CNN) has been the leading technique for image classification since 2012. For users considering deep learning models for real-world applications, Keras is a popular neural network API written in Python that can also be used from R. We examine the parameter estimation procedures of deep neural networks and the structure of CNN models, from basics to advanced techniques, and identify crucial steps in CNNs that can improve image classification performance on the CIFAR10 dataset using Keras. We found that several stacks of convolutional layers and batch normalization improve prediction performance. We also compared image classification performance with other machine learning methods, including K-Nearest Neighbors (K-NN), Random Forest, and XGBoost, on both the MNIST and CIFAR10 datasets.
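
The two ingredients the paper credits with improved performance, stacked convolution and batch normalization, can be sketched as plain numpy forward passes. The toy input and single-channel kernel are assumptions for illustration; training and the Keras API itself are omitted.

```python
import numpy as np

def conv2d(x, kernel):
    """Valid 2-D convolution (cross-correlation, as in CNN layers)."""
    kh, kw = kernel.shape
    H, W = x.shape
    out = np.zeros((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = (x[i:i + kh, j:j + kw] * kernel).sum()
    return out

def batch_norm(x, eps=1e-5):
    """Normalize activations to zero mean / unit variance (gamma=1, beta=0)."""
    return (x - x.mean()) / np.sqrt(x.var() + eps)

rng = np.random.default_rng(0)
img = rng.normal(size=(6, 6))            # toy single-channel "image"
edge = np.array([[1.0, -1.0]])           # horizontal-gradient kernel

feat = batch_norm(np.maximum(conv2d(img, edge), 0.0))  # conv -> ReLU -> BN
print(feat.shape)  # (6, 5): valid convolution shrinks width by kw - 1
```

In Keras, the same chain is a `Conv2D` layer followed by an activation and a `BatchNormalization` layer; stacking several such blocks is what the paper found to help on CIFAR10.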

Performance of Real-time Image Recognition Algorithm Based on Machine Learning (기계학습 기반의 실시간 이미지 인식 알고리즘의 성능)

  • Sun, Young Ghyu;Hwang, Yu Min;Hong, Seung Gwan;Kim, Jin Young
    • Journal of Satellite, Information and Communications / v.12 no.3 / pp.69-73 / 2017
  • In this paper, we develop a real-time image recognition algorithm based on machine learning and evaluate its performance. The algorithm recognizes input images in real time using machine-learned image data. To test its performance, we applied the algorithm to an autonomous vehicle and demonstrated its effectiveness through this application.

Research Trends on Physical Layers in Wireless Communications Using Machine Learning (무선 통신 물리 계층의 기계학습 활용 동향)

  • Choi, Y.H.;Kang, H.D.;Kim, D.Y.;Lee, J.H.;Park, Y.O.
    • Electronics and Telecommunications Trends / v.33 no.2 / pp.39-47 / 2018
  • The fundamental problem of communication is transmitting a message from a source to a destination over a channel using a transmitter and receiver. To derive a theoretically optimal solution, the transmitter and receiver can be divided into several processing blocks, with each component analyzed and optimized separately. The idea of machine learning (or deep learning) communication systems returns to this original formulation of the communication problem and optimizes the transmitter and receiver jointly. Although today's systems have been optimized over decades and seem difficult to outperform, deep learning based communication is attractive owing to its simplicity and its ability to learn to communicate over any type of channel without mathematical modeling or analysis.
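
The autoencoder view described above treats transmitter, channel, and receiver as one chain and optimizes the end mappings jointly. The sketch below shows only the pipeline being optimized (message to codeword, noisy channel, nearest-codeword decoding), with a fixed random codebook standing in for the learned transmitter; the message count, block length, and noise level are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
M, n = 16, 8                      # 16 possible messages, 8 channel uses per message
codebook = rng.normal(size=(M, n))
codebook /= np.linalg.norm(codebook, axis=1, keepdims=True)  # unit-energy codewords

def transmit(msg):
    """Transmitter: map a message index to its codeword."""
    return codebook[msg]

def channel(x, sigma=0.05):
    """AWGN channel: add Gaussian noise to the transmitted signal."""
    return x + sigma * rng.normal(size=x.shape)

def receive(y):
    """Receiver: minimum-distance (nearest-codeword) decoding."""
    return int(np.argmin(np.linalg.norm(codebook - y, axis=1)))

errors = sum(receive(channel(transmit(m))) != m for m in range(M))
print(errors)
```

In a learned system, both `transmit` and `receive` become neural networks and the whole chain is trained end to end to minimize reconstruction error, which is why no explicit channel model is required.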