

A Deep Learning Based Over-Sampling Scheme for Imbalanced Data Classification (불균형 데이터 분류를 위한 딥러닝 기반 오버샘플링 기법)

  • Son, Min Jae;Jung, Seung Won;Hwang, Een Jun
    • KIPS Transactions on Software and Data Engineering / v.8 no.7 / pp.311-316 / 2019
  • The classification problem is to predict the class to which input data belong. One of the most popular approaches is to train a machine learning algorithm on a given dataset, in which case the dataset should have a well-balanced class distribution for the best performance. When the dataset has an imbalanced class distribution, however, classification performance can be very poor. To overcome this problem, we propose an over-sampling scheme that balances the number of data points per class using Conditional Generative Adversarial Networks (CGAN). CGAN is a generative model derived from Generative Adversarial Networks (GAN) that can learn data characteristics and generate data similar to real data. CGAN can therefore generate data for a class that has few samples, so the problems induced by an imbalanced class distribution can be mitigated and classification performance improved. Experiments using actual collected data show that the over-sampling technique using CGAN is effective and superior to existing over-sampling techniques.
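The class-balancing loop described above can be sketched as follows. The paper's generator is a trained CGAN; since its architecture is not given here, this sketch substitutes a hypothetical per-class Gaussian sampler for the generator and keeps only the over-sampling logic.

```python
import random
from collections import Counter

def fit_class_samplers(X, y):
    """Stand-in for a trained CGAN generator: per-class, per-feature mean/stddev.
    (The paper trains a CGAN; a class-conditional Gaussian is used here only
    to keep the sketch self-contained.)"""
    stats = {}
    for c in set(y):
        rows = [x for x, label in zip(X, y) if label == c]
        per_feature = []
        for col in zip(*rows):
            mean = sum(col) / len(col)
            sd = (sum((v - mean) ** 2 for v in col) / len(col)) ** 0.5
            per_feature.append((mean, sd))
        stats[c] = per_feature
    return stats

def oversample(X, y, stats, rng=None):
    """Generate synthetic samples for minority classes until all classes
    reach the size of the largest class."""
    rng = rng or random.Random(0)
    counts = Counter(y)
    target = max(counts.values())
    X_new, y_new = list(X), list(y)
    for c, n in counts.items():
        for _ in range(target - n):
            X_new.append([rng.gauss(mu, sd) for mu, sd in stats[c]])
            y_new.append(c)
    return X_new, y_new

# Toy imbalanced dataset: class 1 has 2 samples, class 0 has 4.
X = [[0.0, 1.0], [0.2, 0.8], [1.0, 0.0], [1.1, 0.1], [0.9, -0.1], [1.2, 0.2]]
y = [1, 1, 0, 0, 0, 0]
X_bal, y_bal = oversample(X, y, fit_class_samplers(X, y))
```

After oversampling, both classes have the same number of samples, so a downstream classifier sees a balanced training set.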

Deep Learning Based Short-Term Electric Load Forecasting Models using One-Hot Encoding (원-핫 인코딩을 이용한 딥러닝 단기 전력수요 예측모델)

  • Kim, Kwang Ho;Chang, Byunghoon;Choi, Hwang Kyu
    • Journal of IKEEE / v.23 no.3 / pp.852-857 / 2019
  • In the virtual power plant's power trading platform, it is very important to forecast each participant's next-day demand and the overall system's electricity demand, in order to manage the demand resources of project participants and to provide appropriate strategies to consumers or operators who want to participate in the distributed-resource collective trading market. This paper develops a next-day power demand forecasting model. Considering the time-series characteristics of power demand data, we used the LSTM deep learning algorithm, and we applied a new scheme that one-hot encodes input/output values such as power demand. In a performance evaluation comparing a general DNN with our LSTM forecasting model, the two models showed root mean square errors of 4.50 and 1.89, respectively, so our LSTM model achieved the higher prediction accuracy.
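The one-hot encoding step can be illustrated with a minimal sketch. The exact feature layout below (a 24-dim hour one-hot, a 7-dim weekday one-hot, plus the raw load value) is an assumption for illustration, not the paper's specification.

```python
def one_hot(index, size):
    """Encode a categorical value (e.g. hour of day, day of week) as a one-hot vector."""
    vec = [0.0] * size
    vec[index] = 1.0
    return vec

def encode_timestep(hour, weekday, load):
    """Hypothetical input layout for one time step: calendar categories are
    one-hot encoded so the network does not treat them as ordinal magnitudes."""
    return one_hot(hour, 24) + one_hot(weekday, 7) + [load]

# Example: 13:00 on a Wednesday (weekday index 2) with a normalized load of 0.75.
features = encode_timestep(13, 2, 0.75)
```

Each time step becomes a 32-dimensional vector that an LSTM can consume in sequence.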

Resolution Estimation Technique in Gaze Tracking System for HCI (HCI를 위한 시선추적 시스템에서 분해능의 추정기법)

  • Kim, Ki-Bong;Choi, Hyun-Ho
    • Journal of Convergence for Information Technology / v.11 no.1 / pp.20-27 / 2021
  • Eye tracking is one of the NUI technologies; it finds out where the user is gazing. This technology allows users to enter text or control a GUI, and analysis of the user's gaze can further be applied to commercial advertisements. In an eye tracking system, the allowable range varies depending on the quality of the image and the user's freedom of movement, so a method of estimating the accuracy of eye tracking in advance is needed. That accuracy is greatly affected by how the eye tracking algorithm is implemented, in addition to hardware variables. Accordingly, in this paper we propose a method to estimate how many degrees the gaze direction changes when the pupil center moves by one pixel, by estimating the maximum possible movement distance of the pupil center in the image.
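Assuming an approximately linear mapping between pupil-center position and gaze angle over the calibrated range, the resolution estimate reduces to a ratio. The function below is an illustrative sketch of that idea, not the paper's formula.

```python
def gaze_resolution_deg_per_px(gaze_range_deg, max_pupil_travel_px):
    """Estimated degrees of gaze change per one-pixel movement of the pupil
    center, given the angular extent of the calibrated gaze range and the
    maximum possible pupil-center travel (in pixels) across that range."""
    return gaze_range_deg / max_pupil_travel_px

# Hypothetical numbers: a 40-degree horizontal gaze range mapped onto
# an 80-pixel maximum pupil-center travel in the eye image.
res = gaze_resolution_deg_per_px(40.0, 80)
```

A smaller value means finer resolution; higher image quality (more pixels of pupil travel) improves it.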

Method of Extracting the Topic Sentence Considering Sentence Importance based on ELMo Embedding (ELMo 임베딩 기반 문장 중요도를 고려한 중심 문장 추출 방법)

  • Kim, Eun Hee;Lim, Myung Jin;Shin, Ju Hyun
    • Smart Media Journal / v.10 no.1 / pp.39-46 / 2021
  • This study concerns a method of extracting a summary from a news article in consideration of the importance of each sentence constituting the article. We propose calculating sentence importance from features that affect it: the probability of being the topic sentence, similarity with the article title and with other sentences, and sentence position. We hypothesize that the topic sentence has characteristics distinct from general sentences, and train a deep learning-based classification model to obtain a topic sentence probability for each input sentence. In addition, using the pre-trained ELMo language model, the similarity between sentences is computed from sentence vectors that reflect context information and extracted as a sentence feature. The topic sentence classification performance of the LSTM and BERT models was high, with 93% accuracy, 96.22% recall, and 89.5% precision. Combining the extracted sentence features to compute the importance of each sentence, we confirmed that topic sentence extraction performance improved by about 10% over the existing TextRank algorithm.
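A minimal sketch of combining the four sentence features into one importance score. The weights are illustrative assumptions (the paper's actual weighting is not given here), and sentence similarity is the usual cosine over ELMo-style sentence vectors.

```python
import math

def cosine(u, v):
    """Cosine similarity between two sentence vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def sentence_importance(topic_prob, title_sim, avg_sent_sim, position, n_sentences,
                        weights=(0.4, 0.3, 0.2, 0.1)):
    """Weighted combination of the four features described in the abstract:
    topic-sentence probability, title similarity, average similarity to other
    sentences, and position. The weights here are hypothetical."""
    position_score = 1.0 - position / max(n_sentences - 1, 1)  # earlier sentences score higher
    w1, w2, w3, w4 = weights
    return w1 * topic_prob + w2 * title_sim + w3 * avg_sent_sim + w4 * position_score

# First sentence of a 10-sentence article, strong topic-sentence signals.
score = sentence_importance(0.9, 0.8, 0.5, 0, 10)
```

The highest-scoring sentence is then extracted as the article's topic sentence.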

A Proposal for Partial Automation Preparation System of BIM-based Energy Conservation Plan - Case Study on Automation Process Using BIM Software and Excel VBA - (BIM기반 에너지절약계획서 건축부문 부분자동화 작성 시스템 제안 - BIM 소프트웨어와 EXCEL VBA를 이용한 자동화과정을 중심으로 -)

  • Ryu, Jea-Ho;Hwang, Jong-Min;Kim, Sol-Yee;Seo, Hwa-Yeong;Lee, Ji-Hyun
    • Journal of KIBIM / v.12 no.2 / pp.49-59 / 2022
  • The main idea of this study is to propose a BIM-based automation system for drawing up the architecture-division report of an energy conservation plan. Under current law, to obtain a building permit an energy conservation plan must be prepared for buildings with a total floor area of 500m2 or more. The common method today is to complete the report by obtaining the data and drawings needed for the energy conservation plan through manual work and entering them directly into the verification system. This takes considerable effort and time in the design phase, which ultimately increases the initial cost of the project, including the services of companies specializing in the environmental field. In preparation for a mandatory BIM work process in the future, however, it is necessary to introduce a BIM-based automatic creation system whose advantage is shortening the whole process, enabling rapid permitting of energy-saving building designs. There may be many ways to automate, but this study shows how to build an application using Revit's Dynamo, in terms of utilizing BIM, and how to write an energy conservation plan by automatic completion of the report through Dynamo and Excel VBA, which can save time and cost in preparing the report compared with the manual process. We also argue that digital transformation of the architectural process is necessary for efficient use of our automation system in the current energy conservation plan workflow.
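The report-autofill idea can be sketched in Python as a stand-in for the Dynamo-to-Excel-VBA pipeline the paper describes: quantities extracted from the BIM model are mapped onto report rows automatically. The element names, fields, and values below are hypothetical examples.

```python
import csv
import io

# Hypothetical BIM take-off, as a Dynamo script might export it:
# element name -> (area in m2, U-value in W/m2K).
elements = {
    "ExteriorWall-W1": (120.5, 0.24),
    "Roof-R1": (300.0, 0.15),
}

def write_report_rows(elements):
    """Auto-fill the architecture-division rows of an energy conservation
    report from extracted element data (CSV here stands in for the Excel
    worksheet that VBA would populate)."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(["Element", "Area (m2)", "U-value (W/m2K)"])
    for name, (area, u_value) in sorted(elements.items()):
        writer.writerow([name, area, u_value])
    return buf.getvalue()

report = write_report_rows(elements)
```

The same mapping logic, expressed in VBA against worksheet cells, is what removes the manual re-entry step from the permitting workflow.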

Automated Verification of Livestock Manure Transfer Management System Handover Document using Gradient Boosting (Gradient Boosting을 이용한 가축분뇨 인계관리시스템 인계서 자동 검증)

  • Jonghwi Hwang;Hwakyung Kim;Jaehak Ryu;Taeho Kim;Yongtae Shin
    • Journal of Information Technology Services / v.22 no.4 / pp.97-110 / 2023
  • In this study, we propose a technique to automatically generate and verify transfer documents using sensor data from livestock manure transfer systems. The research involves analyzing sensor data and applying machine learning techniques, specifically the Gradient Boosting algorithm, to derive optimized outcomes for livestock manure transfer documents; by comparing the results with existing documents, we present a method for automatic document generation. Currently, stakeholders including producers, transporters, and processors manually enter data into the livestock manure transfer management system during the disposal of manure and liquid byproducts. This manual process consumes additional labor, leads to data inconsistency, and complicates the management of distribution and treatment. The aim of this study is therefore to leverage sensor data from livestock manure and liquid byproduct transport vehicles, together with machine learning algorithms, to establish a system that automates the validation of transfer documents, increasing the efficiency of livestock manure and liquid byproduct management and reducing the burden on producers, transporters, and processors. This efficient management system is anticipated to create a transparent environment for the distribution and treatment of livestock manure and liquid byproducts.
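Gradient boosting itself can be sketched with decision stumps fit to residuals. This is a generic, self-contained illustration of the algorithm, not the paper's feature set or model configuration.

```python
def fit_stump(X, y):
    """Fit one boosting stage: the single-feature threshold split that
    minimizes squared error, returned as a prediction function."""
    best = None
    for j in range(len(X[0])):
        for t in sorted({x[j] for x in X}):
            left = [yi for x, yi in zip(X, y) if x[j] <= t]
            right = [yi for x, yi in zip(X, y) if x[j] > t]
            if not left or not right:
                continue
            lm, rm = sum(left) / len(left), sum(right) / len(right)
            err = (sum((yi - lm) ** 2 for yi in left)
                   + sum((yi - rm) ** 2 for yi in right))
            if best is None or err < best[0]:
                best = (err, j, t, lm, rm)
    _, j, t, lm, rm = best
    return lambda x: lm if x[j] <= t else rm

def gradient_boost(X, y, n_stages=20, lr=0.3):
    """Repeatedly fit a stump to the current residuals; the sum of scaled
    stumps approximates y (squared-error gradient boosting)."""
    pred = [0.0] * len(y)
    stumps = []
    for _ in range(n_stages):
        residuals = [yi - pi for yi, pi in zip(y, pred)]
        stump = fit_stump(X, residuals)
        stumps.append(stump)
        pred = [pi + lr * stump(x) for pi, x in zip(pred, X)]
    return lambda x: sum(lr * s(x) for s in stumps)

# Toy demo: a step function learned from four points (illustrative only).
model = gradient_boost([[0.0], [1.0], [2.0], [3.0]], [0.0, 0.0, 1.0, 1.0])
```

In the paper's setting, the features would come from vehicle sensor readings and the target would be the validity of a transfer-document field.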

Privacy Preserving Techniques for Deep Learning in Multi-Party System (멀티 파티 시스템에서 딥러닝을 위한 프라이버시 보존 기술)

  • Hye-Kyeong Ko
    • The Journal of the Convergence on Culture Technology / v.9 no.3 / pp.647-654 / 2023
  • Deep learning is a useful method for classifying and recognizing complex data such as images and text, and the accuracy of deep learning methods is the basis for making artificial intelligence-based services on the Internet useful. However, the vast amount of user data used for training in deep learning has led to privacy violation problems, and there is concern that companies that have collected personal and sensitive user data, such as photographs and voices, own the data indefinitely: users cannot delete their data and cannot limit the purpose of its use. For example, data owners such as medical institutions that want to apply deep learning to patients' medical records cannot share patient data because of privacy and confidentiality issues, making it difficult to benefit from deep learning technology. In this paper, we design a privacy-preserving deep learning technique that allows multiple workers to train a neural network model jointly, without sharing input datasets, in a multi-party system. We propose a method that can selectively share small subsets of parameters using an optimization algorithm based on modified stochastic gradient descent, confirming that it facilitates training with increased learning accuracy while protecting private information.
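The selective-sharing idea (each worker applies only a small subset of its gradient to the shared model, keeping the rest private) can be sketched as follows. The share fraction, learning rate, and flat parameter layout are illustrative assumptions, not the paper's configuration.

```python
def top_k_indices(grad, fraction):
    """Indices of the largest-magnitude gradient coordinates; only these
    will be shared with the common model."""
    k = max(1, int(len(grad) * fraction))
    return sorted(range(len(grad)), key=lambda i: abs(grad[i]), reverse=True)[:k]

def selective_update(params, local_grad, lr=0.1, share_fraction=0.1):
    """One worker's contribution to the shared model: apply only the top
    fraction of its local gradient, keeping the remaining coordinates
    (and the training data that produced them) private."""
    shared = dict(params)  # params: coordinate index -> value
    for i in top_k_indices(local_grad, share_fraction):
        shared[i] = shared[i] - lr * local_grad[i]
    return shared

# Toy demo: a 10-parameter model; the worker's gradient is significant
# only at coordinate 9, so only that coordinate is shared.
params = {i: 0.0 for i in range(10)}
grad = [0.0] * 9 + [1.0]
new_params = selective_update(params, grad)
```

Each worker repeats this against the evolving shared model, so all benefit from joint training without exchanging datasets.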

Joint Reasoning of Real-time Visual Risk Zone Identification and Numeric Checking for Construction Safety Management

  • Ali, Ahmed Khairadeen;Khan, Numan;Lee, Do Yeop;Park, Chansik
    • International conference on construction engineering and project management / 2020.12a / pp.313-322 / 2020
  • Recognizing risk hazards is a vital step toward effectively preventing accidents on a construction site. Advanced developments in computer vision systems and the availability of large visual databases of construction sites have made it possible to act quickly on human error and disaster situations that may occur during management supervision. It is therefore necessary to analyze the risk factors that need to be managed at the construction site and review appropriate, effective technical methods for each risk factor. This research focuses on analyzing Occupational Safety and Health Administration (OSHA) rules related to risk zone identification that can be adopted by image recognition technology, and on classifying their risk factors by the applicable technical method; it develops a pattern-oriented classification of OSHA rules that can support large-scale safety hazard recognition. The research uses joint reasoning over risk zone identification and numeric input, utilizing a stereo camera integrated with an object detection algorithm such as YOLOv3 and the Pyramid Stereo Matching Network (PSMNet). The resulting system identifies risk zones and raises an alarm if a target object enters such a zone; it also determines numeric information about a target, recognizing its length, spacing, and angle. Applying joint detection logic may improve the speed and accuracy of hazard detection, since more than one factor is merged to prevent accidents on the job site.
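The joint reasoning step — risk zone membership plus a numeric check from stereo depth — can be sketched as follows, assuming rectangular zones in image coordinates, YOLO-style (x, y, w, h) boxes, and the standard rectified-stereo depth relation Z = fB/d. The detector and stereo network themselves are outside this sketch.

```python
def in_risk_zone(cx, cy, zone):
    """zone = (xmin, ymin, xmax, ymax) in image coordinates."""
    xmin, ymin, xmax, ymax = zone
    return xmin <= cx <= xmax and ymin <= cy <= ymax

def stereo_depth(focal_px, baseline_m, disparity_px):
    """Depth of a point from a rectified stereo pair: Z = f * B / d,
    where disparity d would come from a network such as PSMNet."""
    return focal_px * baseline_m / disparity_px

def check_detection(bbox, zone):
    """bbox = (x, y, w, h), as an object detector such as YOLOv3 outputs.
    Raise an alarm when the box center lies inside the risk zone."""
    x, y, w, h = bbox
    cx, cy = x + w / 2, y + h / 2
    return "ALARM" if in_risk_zone(cx, cy, zone) else "OK"

# Toy demo: one detection inside a hypothetical risk zone, one outside.
status_inside = check_detection((10, 10, 20, 20), (0, 0, 100, 100))
status_outside = check_detection((200, 200, 10, 10), (0, 0, 100, 100))
```

The depth value gives the numeric side of the joint check (length, spacing, angle follow from depths at multiple image points).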


Power analysis attacks against NTRU and their countermeasures (NTRU 암호에 대한 전력 분석 공격 및 대응 방법)

  • Song, Jeong-Eun;Han, Dong-Guk;Lee, Mun-Kyu;Choi, Doo-Ho
    • Journal of the Korea Institute of Information Security & Cryptology / v.19 no.2 / pp.11-21 / 2009
  • The NTRU cryptosystem, proposed by Hoffstein et al. in the 1990s, is a public key cryptosystem based on hard lattice problems. NTRU has many advantages over other public key cryptosystems such as RSA and elliptic curve cryptosystems: for example, it guarantees high-speed encryption and decryption at the same level of security, and there is no known quantum computing algorithm for speeding up attacks against NTRU. In this paper, we analyze the security of NTRU against the simple power analysis (SPA) attack and statistical power analysis (STPA) attacks such as the correlation power analysis (CPA) attack. First, we implement NTRU operations using NesC on a Telos mote, and we show how to apply CPA to recover a private key from collected power traces. We also suggest countermeasures against these attacks. To prevent SPA, we propose initializing the array that will store the result of a convolution operation with a nonzero value. To prevent STPA, we propose two techniques to randomize the power traces associated with the same input: random ordering of the computation sequence in a convolution operation, and data randomization in the convolution operation.
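The random-ordering countermeasure can be illustrated on the cyclic convolution at the heart of NTRU: shuffling the order in which partial products are accumulated leaves the result unchanged while decorrelating the power trace across runs. The values below are toy parameters, not NTRU parameters.

```python
import random

def ntru_convolution(a, b, q, rng=None):
    """Cyclic convolution of polynomials a and b in Z_q[x]/(x^N - 1).
    When rng is given, the accumulation order of partial products is
    randomized (the STPA countermeasure); the result is identical,
    but the power consumption sequence differs between runs."""
    N = len(a)
    c = [0] * N
    pairs = [(i, j) for i in range(N) for j in range(N)]
    if rng is not None:
        rng.shuffle(pairs)  # randomized computation sequence
    for i, j in pairs:
        k = (i + j) % N
        c[k] = (c[k] + a[i] * b[j]) % q
    return c

# Toy demo: fixed order and two shuffled orders give the same polynomial.
fixed = ntru_convolution([1, 2, 3], [4, 5, 6], 7)
shuffled = ntru_convolution([1, 2, 3], [4, 5, 6], 7, random.Random(42))
```

The paper's SPA countermeasure (initializing the result array to a nonzero value) would additionally change the starting contents of `c`.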

Direct Reconstruction of Displaced Subdivision Mesh from Unorganized 3D Points (연결정보가 없는 3차원 점으로부터 차이분할메쉬 직접 복원)

  • Jung, Won-Ki;Kim, Chang-Heon
    • Journal of KIISE: Computer Systems and Theory / v.29 no.6 / pp.307-317 / 2002
  • In this paper we propose a new mesh reconstruction scheme that produces a displaced subdivision surface directly from unorganized points. The displaced subdivision surface is a mesh representation that defines a detailed mesh with a displacement map over a smooth domain surface, but the original displaced subdivision surface algorithm needs an explicit polygonal mesh, since it is not a mesh reconstruction algorithm but a mesh conversion (remeshing) algorithm. The main idea of our approach is to sample surface detail from unorganized points without any topological information. For this, we predict a virtual triangular face from the unorganized points for each sampling ray from a parametric domain surface. Direct displaced subdivision surface reconstruction from unorganized points is important because the output has several useful properties: it is a compact mesh representation, since most vertices can be represented by a single scalar value; its underlying structure is piecewise regular, so it can easily be transformed into a multiresolution mesh; and smoothness after mesh deformation is automatically preserved. We avoid time-consuming global energy optimization by employing input-data-dependent mesh smoothing, so we can obtain a good quality displaced subdivision surface quickly.
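The scalar-displacement idea can be sketched as follows. The paper predicts a virtual triangle from the points for each sampling ray; this simplified stand-in instead projects the nearest input point onto the ray, keeping only the "one scalar per vertex" property that makes the representation compact.

```python
def displacement(base_point, normal, points):
    """Scalar displacement of a smooth-domain sample: find the input point
    nearest to the base point and return the signed distance of its offset
    along the sampling ray (the domain-surface normal). Simplified stand-in
    for the paper's virtual-triangle intersection."""
    best, best_d2 = 0.0, float("inf")
    for p in points:
        offset = [pi - bi for pi, bi in zip(p, base_point)]
        d2 = sum(o * o for o in offset)
        if d2 < best_d2:
            best_d2 = d2
            best = sum(o * n for o, n in zip(offset, normal))  # signed distance along normal
    return best

# Toy demo: a domain sample at the origin with an upward normal; the nearest
# unorganized point sits 0.5 units along that normal.
d = displacement((0.0, 0.0, 0.0), (0.0, 0.0, 1.0), [(0.0, 0.0, 0.5), (5.0, 5.0, 5.0)])
```

Storing one such scalar per subdivision vertex, instead of a full 3D position, is what yields the compact, piecewise-regular representation described above.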