• Title/Summary/Keyword: machine learning applications


A Study on Development Environments for Machine Learning (머신러닝 자동화를 위한 개발 환경에 관한 연구)

  • Kim, Dong Gil;Park, Yong-Soon;Park, Lae-Jeong;Chung, Tae-Yun
    • IEMEK Journal of Embedded Systems and Applications, v.15 no.6, pp.307-316, 2020
  • The performance of a machine learning model is strongly affected by its data, so preprocessing is needed to enable analysis of various types of data, such as letters, numbers, and special characters. This paper proposes a development environment that processes categorical and continuous data according to the type of missing values in stage 1, selects the best-performing algorithm in stage 2, and automates the process of checking model performance in stage 3. With this environment, machine learning models can be created without prior knowledge of data preprocessing.
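
As an illustration of the staged flow described in this abstract, the sketch below wires missing-value handling for categorical and continuous columns (stage 1), algorithm selection by cross-validation (stage 2), and an automated performance check (stage 3) into one scikit-learn pipeline. The data file, column names, and candidate algorithms are assumptions for demonstration, not the paper's implementation.

```python
# Illustrative sketch only: a three-stage flow like the one described above,
# with hypothetical column names and candidate models the paper does not specify.
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.ensemble import RandomForestClassifier
from sklearn.impute import SimpleImputer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score, train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import OneHotEncoder, StandardScaler

df = pd.read_csv("data.csv")                       # assumed input file
X, y = df.drop(columns=["target"]), df["target"]   # assumed label column

# Stage 1: handle missing values separately for categorical and continuous columns.
categorical = X.select_dtypes(include="object").columns
continuous = X.select_dtypes(exclude="object").columns
preprocess = ColumnTransformer([
    ("cat", Pipeline([("impute", SimpleImputer(strategy="most_frequent")),
                      ("encode", OneHotEncoder(handle_unknown="ignore"))]), categorical),
    ("num", Pipeline([("impute", SimpleImputer(strategy="median")),
                      ("scale", StandardScaler())]), continuous),
])

# Stage 2: pick the best-performing algorithm by cross-validation.
candidates = {"logreg": LogisticRegression(max_iter=1000),
              "forest": RandomForestClassifier(n_estimators=200)}
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
best_name, best_model, best_score = None, None, -1.0
for name, model in candidates.items():
    pipe = Pipeline([("prep", preprocess), ("model", model)])
    score = cross_val_score(pipe, X_train, y_train, cv=5).mean()
    if score > best_score:
        best_name, best_model, best_score = name, pipe, score

# Stage 3: check the final model's performance automatically on held-out data.
best_model.fit(X_train, y_train)
print(best_name, best_model.score(X_test, y_test))
```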

Effective E-Learning Practices by Machine Learning and Artificial Intelligence

  • Arshi Naim;Sahar Mohammed Alshawaf
    • International Journal of Computer Science & Network Security, v.24 no.1, pp.209-214, 2024
  • This is an extended research paper focusing on the applications of Machine Learning and Artificial Intelligence in a virtual learning environment. The world is moving at a fast pace with the application of Machine Learning (ML) and Artificial Intelligence (AI) in all major disciplines, and the educational sector has not been untouched by their impact, especially in online learning environments. This paper elaborates on the benefits of ML and AI in E-Learning (EL) in general and explains how the King Khalid University (KKU) EL Deanship is making the best of ML and AI in its practices. The researchers also consider the future of ML and AI in any academic program. This research is descriptive in nature; results are based on qualitative analysis done through the tools and techniques of EL applied at KKU as an example, but the same modus operandi can be implemented by any institution on its EL platform. KKU uses a Learning Management System (LMS) for providing online learning practices and Blackboard (BB) for sharing online learning resources, so these tools are considered by the researchers for explaining the results of ML and AI.

Lane Detection Based on Inverse Perspective Transformation and Machine Learning in Lightweight Embedded System (경량화된 임베디드 시스템에서 역 원근 변환 및 머신 러닝 기반 차선 검출)

  • Hong, Sunghoon;Park, Daejin
    • IEMEK Journal of Embedded Systems and Applications, v.17 no.1, pp.41-49, 2022
  • This paper proposes a novel lane detection algorithm based on inverse perspective transformation and machine learning in a lightweight embedded system. The inverse perspective transformation obtains a bird's-eye view of the scene from a perspective image to remove perspective effects. This method requires only the internal and external parameters of the camera, without a homography matrix with 8 degrees of freedom (DoF) that maps the points in one image to the corresponding points in the other image. To improve the accuracy and speed of lane detection in complex road environments, machine learning is applied only to regions that have passed a first classifier. The first classifier is applied to the bird's-eye-view image to determine candidate lane regions and improve the detection speed; a lane region that passes the first classifier is then detected more accurately through machine learning. The system has been tested on driving video of a vehicle in an embedded system. The experimental results show that the proposed method works well in various road environments and meets the real-time requirements. Its lane detection is about 3.85 times faster than edge-based lane detection, and its detection accuracy is better than that of edge-based lane detection.
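
The sketch below illustrates the kind of inverse perspective mapping the abstract describes: a ground-plane homography is built from camera intrinsics and extrinsics only, then used to warp a perspective frame into a bird's-eye view with OpenCV. All numeric parameters (intrinsics, pitch, height, scale) are assumed example values, not the paper's calibration.

```python
# Minimal sketch of inverse perspective mapping (IPM) from camera intrinsics and
# extrinsics only; the parameter values below are assumptions for demonstration.
import cv2
import numpy as np

K = np.array([[700.0, 0.0, 640.0],       # assumed intrinsics (fx, fy, cx, cy)
              [0.0, 700.0, 360.0],
              [0.0, 0.0, 1.0]])
pitch = np.deg2rad(10.0)                  # assumed downward pitch of the camera
height = 1.4                              # assumed camera height above the road [m]

# World frame on the road plane: X right, Y forward, Z up; world z = 0 is the road.
# R, t are the world-to-camera extrinsics for a camera pitched down by `pitch`.
R = np.array([[1.0, 0.0, 0.0],
              [0.0, -np.sin(pitch), -np.cos(pitch)],
              [0.0,  np.cos(pitch), -np.sin(pitch)]])
t = np.array([[0.0], [height * np.cos(pitch)], [height * np.sin(pitch)]])

# Homography from ground-plane coordinates (X, Y, 1) to image pixels: H = K [r1 r2 t].
H_world_to_img = K @ np.hstack([R[:, 0:1], R[:, 1:2], t])

# Map metric ground coordinates to bird's-eye-view pixels (scale and offsets assumed).
px_per_m, bev_w, bev_h = 20.0, 400, 600
S = np.array([[px_per_m, 0.0, bev_w / 2.0],
              [0.0, -px_per_m, float(bev_h)],
              [0.0, 0.0, 1.0]])

# Warp the perspective frame into the bird's-eye view.
H_img_to_bev = S @ np.linalg.inv(H_world_to_img)
frame = cv2.imread("road.png")            # assumed input frame
bev = cv2.warpPerspective(frame, H_img_to_bev, (bev_w, bev_h))
cv2.imwrite("bev.png", bev)
```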

IoT Security and Machine Learning

  • Almalki, Sarah;Alsuwat, Hatim;Alsuwat, Emad
    • International Journal of Computer Science & Network Security, v.22 no.5, pp.103-114, 2022
  • The Internet of Things (IoT) is one of the fastest-growing technologies and is used in various applications and fields. The concept of IoT will not be limited to scientific and technical life but will gradually spread to become an essential part of our daily life and routine. IoT was once a complex term unknown to many, but it is quickly becoming commonplace: smart devices and sensors are connected, wirelessly or wired, over the Internet to exchange and process data. Despite all the benefits and advantages offered by the IoT, it faces many security and privacy challenges because current traditional security protocols are not suitable for IoT technologies. In this paper, we present a comprehensive survey of studies from 2018 to 2021 related to IoT security and the use of machine learning (ML) and deep learning in addressing security and privacy in the IoT. An introduction is given first, followed by a comprehensive overview of the IoT, its applications, and the basic safety requirements of confidentiality, integrity, and availability as they apply to the IoT. We then review the attacks and challenges facing the IoT and focus on ML and its applications in addressing IoT security problems.

Developing a New TensorFlow Tutorial Model on Machine Learning: Focusing on the Kaggle Titanic Dataset (텐서플로우 튜토리얼 방식의 머신러닝 신규 모델 개발 : 캐글 타이타닉 데이터 셋을 중심으로)

  • Kim, Dong Gil;Park, Yong-Soon;Park, Lae-Jeong;Chung, Tae-Yun
    • IEMEK Journal of Embedded Systems and Applications, v.14 no.4, pp.207-218, 2019
  • The purpose of this study is to develop a model with which the whole machine learning process can be studied systematically. While the existing model describes the learning process with minimal coding, the new model allows the progress of machine learning to be followed sequentially and each step to be visualized using TensorFlow. The new model uses all of the existing model's algorithms and confirms the importance of the variables that affect the target variable, survival. The labelled data were split into training and validation sets, and the performance of the model was evaluated with test data. In the final analysis, the ensemble technique showed the highest performance among the tutorial models, and the maximum performance of the model improved by up to 5.2% compared with the existing model. Future research should construct an environment in which machine learning can be studied regardless of the data preprocessing method and operating system, and in which models that outperform the existing ones can be learned.
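
In the tutorial spirit of the abstract, the sketch below walks through a minimal TensorFlow model on the Kaggle Titanic training file, with an explicit training/validation split and a final evaluation. The chosen features and network are assumptions for illustration, not the model developed in the paper.

```python
# Illustrative sketch only: a small TensorFlow walk-through on the Kaggle Titanic
# dataset; feature choices and the network are assumptions, not the paper's model.
import pandas as pd
import tensorflow as tf
from sklearn.model_selection import train_test_split

df = pd.read_csv("train.csv")                      # Kaggle Titanic training file
df["Sex"] = (df["Sex"] == "female").astype("float32")
df["Age"] = df["Age"].fillna(df["Age"].median())
features = ["Pclass", "Sex", "Age", "SibSp", "Parch", "Fare"]
X = df[features].fillna(0.0).astype("float32").values
y = df["Survived"].astype("float32").values

# Split the labelled data into training and validation sets.
X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.2, random_state=0)

model = tf.keras.Sequential([
    tf.keras.Input(shape=(len(features),)),
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(X_tr, y_tr, validation_data=(X_val, y_val), epochs=30, batch_size=32, verbose=0)
print(model.evaluate(X_val, y_val, verbose=0))     # [loss, accuracy] on validation data
```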

Sentiment Analysis to Classify Scams in Crowdfunding

  • Shafqat, Wafa;Byun, Yung-cheol
    • Soft Computing and Machine Intelligence, v.1 no.1, pp.24-30, 2021
  • The accelerated growth of the internet and the enormous amount of available data have become the primary reasons for machine learning applications in data analysis and, more specifically, pattern recognition and decision making. In this paper, we focus on the crowdfunding site Kickstarter and collect comments in order to apply neural networks to classify projects based on the sentiments of backers. The power of customer reviews and sentiment analysis motivated us to apply this technique in crowdfunding to find timely indications, identify suspicious activities, and mitigate the risk of losing money.
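
A minimal sketch of the idea above: comments are vectorized and a small neural network classifies them as legitimate or suspicious. The toy comments, labels, and pipeline are assumptions for illustration, not the paper's Kickstarter dataset or model.

```python
# Illustrative sketch only: sentiment-style classification of crowdfunding comments
# with a small neural network; data and labels below are assumed toy examples.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline

comments = [
    "Great project, the rewards arrived on time!",
    "Still no updates after a year, this looks like a scam.",
    "Creator answers every question, very trustworthy.",
    "They took the money and disappeared.",
]
labels = ["legitimate", "suspicious", "legitimate", "suspicious"]  # assumed labels

# TF-IDF features feeding a small multilayer perceptron classifier.
clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)),
                    MLPClassifier(hidden_layer_sizes=(32,), max_iter=500, random_state=0))
clf.fit(comments, labels)
print(clf.predict(["No replies from the creator and no shipping information."]))
```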

Study on Derivation and Implementation of Quantized Gradient for Machine Learning (기계학습을 위한 양자화 경사도함수 유도 및 구현에 관한 연구)

  • Seok, Jinwuk
    • IEMEK Journal of Embedded Systems and Applications, v.15 no.1, pp.1-8, 2020
  • In this paper, a method for deriving a quantized gradient for machine learning on an embedded system is proposed. The proposed differentiation method applies the quantized gradient vector to an objective function and validates it as a directional derivative. Moreover, mathematical analysis shows that the sequence yielded by the learning equation based on the proposed quantization converges to the optimal point of the quantized objective function when the quantization parameter is sufficiently large. The simulation results show that an optimization solver based on the proposed quantization method achieves sufficient performance in comparison to the conventional method based on floating-point arithmetic.
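
For intuition, the sketch below runs gradient descent with a uniformly quantized gradient on a simple quadratic objective; finer quantization (larger Q) brings the iterates closer to the optimum. The rounding rule and the role of Q here are illustrative assumptions, not the derivation given in the paper.

```python
# Illustrative sketch only: gradient descent with a uniformly quantized gradient on a
# simple quadratic objective; the quantization rule is an assumption for demonstration.
import numpy as np

target = np.array([1.0, -2.0, 0.5])

def grad(x):                      # gradient of f(x) = 0.5 * ||x - target||^2
    return x - target

def quantize(g, Q):               # round each component to a grid with step 1/Q
    return np.round(g * Q) / Q

x = np.zeros(3)
Q, lr = 256.0, 0.1                # larger Q -> finer grid -> closer to the true gradient

for _ in range(200):
    x = x - lr * quantize(grad(x), Q)

print(x)                          # approaches target up to the quantization resolution
```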

Deep Neural Net Machine Learning and Manufacturing (제조업의 심층신경망 기계학습(딥러닝))

  • CHO, Mann;Lee, Mingook
    • Journal of Energy Engineering, v.26 no.3, pp.11-29, 2017
  • In recent years, the use of artificial intelligence technology such as deep neural network machine learning (deep learning) has become an effective and practical option in industrial manufacturing processes. This study focuses on recent deep learning development environments and their applications in the manufacturing field.

Machine learning-based Multi-modal Sensing IoT Platform Resource Management (머신러닝 기반 멀티모달 센싱 IoT 플랫폼 리소스 관리 지원)

  • Lee, Seongchan;Sung, Nakmyoung;Lee, Seokjun;Jun, Jaeseok
    • IEMEK Journal of Embedded Systems and Applications, v.17 no.2, pp.93-100, 2022
  • In this paper, we propose a machine learning-based method for supporting resource management of IoT software platforms in a multi-modal sensing scenario. We assume that an IoT device running a oneM2M-compatible software platform is connected to various sensors, such as PIR, sound, dust, ambient light, ultrasonic, and accelerometer sensors, through different embedded system interfaces such as general-purpose input/output (GPIO), I2C, SPI, and USB. Based on a collected dataset including CPU usage and user-defined priority, a machine learning model is trained to estimate the nice value required to adjust process priority according to resource usage patterns. The proposed method is validated by comparison with a rule-based control strategy, showing its practical capability in a multi-modal sensing scenario for IoT devices.
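
The sketch below illustrates the general idea: a regressor trained on CPU usage and a user-defined priority estimates a nice value, which is then applied to a process via psutil. The toy training data, features, and model are assumptions, not the paper's dataset or method.

```python
# Illustrative sketch only: estimate a Unix nice value from CPU usage and a
# user-defined priority; training samples and the model are assumed toy values.
import numpy as np
import psutil
from sklearn.tree import DecisionTreeRegressor

# Assumed training samples: (cpu_usage_percent, user_priority) -> nice value to set.
X_train = np.array([[5, 1], [20, 1], [60, 1], [5, 3], [40, 3], [90, 3]])
y_train = np.array([0, 5, 10, -5, 0, 5])

model = DecisionTreeRegressor(max_depth=3).fit(X_train, y_train)

def adjust_priority(pid, user_priority):
    """Observe a process's CPU usage and apply the estimated nice value."""
    proc = psutil.Process(pid)
    cpu = proc.cpu_percent(interval=0.5)                 # observed CPU usage pattern
    nice_value = int(round(model.predict([[cpu, user_priority]])[0]))
    proc.nice(nice_value)                                # set Unix niceness (may need privileges)
    return cpu, nice_value

# Example call with a hypothetical PID of a sensor-handling process:
# print(adjust_priority(1234, user_priority=2))
```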

Trend of Edge Machine Learning as-a-Service (서비스형 엣지 머신러닝 기술 동향)

  • Na, J.C.;Jeon, S.H.
    • Electronics and Telecommunications Trends, v.37 no.5, pp.44-53, 2022
  • The Internet of Things (IoT) is growing exponentially, with the number of IoT devices multiplying annually. Accordingly, the paradigm is changing from cloud computing to edge computing, and even tiny edge computing, because of the lower latency and cost reduction. Machine learning is also shifting its role from the cloud to the edge or tiny edge in line with this paradigm shift. However, the fragmented and resource-constrained features of IoT devices have limited the development of artificial intelligence applications. Edge MLaaS (Machine Learning as a Service) has been studied as a way to adopt machine learning in products easily and quickly and to overcome device limitations. This paper briefly summarizes what Edge MLaaS is and what elements of research it requires.