• Title/Summary/Keyword: MachineLearning

Search Results: 5,657

Motor Imagery Brain Signal Analysis for EEG-based Mouse Control (뇌전도 기반 마우스 제어를 위한 동작 상상 뇌 신호 분석)

  • Lee, Kyeong-Yeon;Lee, Tae-Hoon;Lee, Sang-Yoon
    • Korean Journal of Cognitive Science / v.21 no.2 / pp.309-338 / 2010
  • In this paper, we study the brain-computer interface (BCI). BCIs help severely disabled people control external devices by analyzing the brain signals evoked by motor imagery. Neurophysiological findings show that the power of the $\beta$ (14-26 Hz) and $\mu$ (8-12 Hz) rhythms decreases or increases with the synchrony of the underlying neuronal populations in the sensorimotor cortex when people imagine moving parts of their body; these phenomena are called Event-Related Desynchronization and Synchronization (ERD/ERS), respectively. We implemented a BCI-based mouse interface system that enables subjects to move a computer mouse cursor in four directions (up, down, left, and right) by analyzing brain signal patterns online. Tongue, foot, left-hand, and right-hand motor imagery were used to elicit the corresponding brain activity. We used non-invasive EEG, which records the brain's spontaneous electrical activity over short periods of time through electrodes placed on the scalp. Because EEG signals have low amplitude and are vulnerable to artifacts and noise, they are hard to analyze and classify directly. To overcome these obstacles, we applied statistical machine-learning techniques. We achieved high classification performance over the four motor imageries by employing Common Spatial Patterns (CSP) and Linear Discriminant Analysis (LDA), which transform the input EEG signals into a new coordinate system that maximizes the variance differences between motor imagery classes and thereby eases classification. Inspection of the resulting topographies also confirmed that ERD/ERS appeared over different brain areas for different motor imageries, in agreement with anatomical and neurophysiological knowledge. (A code sketch of such a CSP + LDA pipeline follows this entry.)

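A minimal sketch of the CSP + LDA pipeline described in the abstract above, written with MNE-Python and scikit-learn (an assumption; the paper does not name its tooling); the epoched EEG array, labels, and all parameter values are placeholders rather than the authors' data.

```python
# Hedged sketch: 4-class motor imagery classification with CSP features
# and an LDA classifier. X holds epoched EEG (trials x channels x samples),
# y holds imagery labels (0=tongue, 1=foot, 2=left hand, 3=right hand).
import numpy as np
from mne.decoding import CSP
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.pipeline import Pipeline
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 32, 500))       # placeholder EEG epochs
y = rng.integers(0, 4, size=200)              # placeholder imagery labels

clf = Pipeline([
    ("csp", CSP(n_components=6, log=True)),   # variance-maximizing spatial filters
    ("lda", LinearDiscriminantAnalysis()),    # linear classifier on log-variance features
])
scores = cross_val_score(clf, X, y, cv=5)
print(f"mean CV accuracy: {scores.mean():.3f}")
```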

Computational estimation of the earthquake response for fibre reinforced concrete rectangular columns

  • Liu, Chanjuan;Wu, Xinling;Wakil, Karzan;Jermsittiparsert, Kittisak;Ho, Lanh Si;Alabduljabbar, Hisham;Alaskar, Abdulaziz;Alrshoudi, Fahed;Alyousef, Rayed;Mohamed, Abdeliazim Mustafa
    • Steel and Composite Structures / v.34 no.5 / pp.743-767 / 2020
  • Owing to its impressive flexural performance, enhanced compressive strength, and more constrained crack propagation, fibre-reinforced concrete (FRC) has been widely employed in construction applications, and the majority of experimental studies have focused on the seismic behavior of FRC columns. Based on valid experimental data obtained from previous studies, the current study evaluates the seismic response and compressive strength of FRC rectangular columns using hybrid metaheuristic techniques. Because of the non-linearity of seismic data, an adaptive neuro-fuzzy inference system (ANFIS) has been combined with metaheuristic algorithms. 317 datasets from FRC column tests were merged into a single database in order to determine the most influential factors on the ultimate strengths of FRC rectangular columns subjected to simulated seismic loading. ANFIS was used in combination with Particle Swarm Optimization (PSO) and a Genetic Algorithm (GA), and an Extreme Learning Machine (ELM) was used concurrently as a reference prediction method for comparison. A variable selection procedure was applied to choose the most dominant parameters affecting the ultimate strengths of the columns. The results show that ANFIS-PSO successfully predicted the seismic lateral load with R² = 0.857 and 0.902 for the test and training phases, respectively, and was therefore nominated as the lateral load estimator. For compressive strength prediction, ELM achieved R² = 0.657 and 0.862 for the test and training phases, respectively. Hence the seismic lateral force trend is more predictable than the compressive strength of FRC rectangular columns, with the best results belonging to the lateral force prediction; the compressive strength predictions showed significant deviation above 40 MPa, which could be related to the considerable non-linearity and possible empirical shortcomings. Overall, employing the ANFIS-GA and ANFIS-PSO techniques to evaluate the seismic response of FRC is a promising and reliable approach that can replace costly and time-consuming experimental tests. (A sketch of the train/test R² evaluation protocol follows this entry.)
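
The ANFIS-PSO, ANFIS-GA, and ELM models themselves are not reproduced here; the short sketch below only illustrates the kind of train/test R² evaluation reported in the abstract above, with a generic scikit-learn regressor standing in for the paper's models. All feature columns, target formulas, and values are invented for illustration.

```python
# Hedged sketch of the evaluation protocol only: split a synthetic stand-in
# column-test database, fit a generic regressor, and report train/test R^2
# for each target. GradientBoostingRegressor is a placeholder, not ANFIS or ELM.
import numpy as np
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)
n = 317                                              # size of the paper's database
df = pd.DataFrame({
    "fibre_content": rng.uniform(0, 2, n),           # invented feature columns
    "axial_load_ratio": rng.uniform(0, 0.6, n),
    "concrete_strength": rng.uniform(20, 80, n),
})
df["lateral_load"] = 50 * df["axial_load_ratio"] + df["concrete_strength"] + rng.normal(0, 5, n)
df["compressive_strength"] = df["concrete_strength"] + 10 * df["fibre_content"] + rng.normal(0, 8, n)

X = df[["fibre_content", "axial_load_ratio", "concrete_strength"]]
for target in ["lateral_load", "compressive_strength"]:
    X_tr, X_te, y_tr, y_te = train_test_split(X, df[target], test_size=0.3, random_state=0)
    model = GradientBoostingRegressor().fit(X_tr, y_tr)   # stand-in model
    print(target,
          "train R^2:", round(r2_score(y_tr, model.predict(X_tr)), 3),
          "test R^2:", round(r2_score(y_te, model.predict(X_te)), 3))
```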

Principles and Current Trends of Neural Decoding (뉴럴 디코딩의 원리와 최신 연구 동향 소개)

  • Kim, Kwangsoo;Ahn, Jungryul;Cha, Seongkwang;Koo, Kyo-in;Goo, Yong Sook
    • Journal of Biomedical Engineering Research / v.38 no.6 / pp.342-351 / 2017
  • Neural decoding is a procedure that uses the spike trains fired by neurons to estimate features of the original stimulus. This is a fundamental step toward understanding how neurons talk to each other and, ultimately, how brains manage information. In this paper, neural decoding strategies are classified into three methodologies, each of which is explained: rate decoding, temporal decoding, and population decoding. Rate decoding is the earliest and simplest decoding method, in which the stimulus is reconstructed from the number of spikes in a given time window (i.e., the spike rate). Since the spike count is a discrete number, the spike rate itself is quantized rather than continuous, so if the stimulus is not static and simple, rate decoding may not provide a good estimate of the stimulus. Temporal decoding reconstructs the stimulus from the timing of the spikes. It can be useful even for rapidly changing stimuli, and our sensory systems are believed to employ a temporal rather than a rate decoding strategy. Since the use of large numbers of neurons is one of the operating principles of most nervous systems, population decoding has advantages such as reducing the uncertainty due to neuronal variability and being able to represent multiple stimulus attributes simultaneously. The paper introduces these three decoding methods, describes how information theory can be used in neural decoding, and finally presents machine-learning-based algorithms for neural decoding. (A sketch of a simple rate-based population decoder follows this entry.)
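
As an illustration of the rate and population decoding ideas summarized above, the sketch below decodes a scalar stimulus from the spike counts of a synthetic neural population with ordinary linear regression; the tuning model, window length, and all numbers are assumptions for demonstration only, not from the paper.

```python
# Hedged sketch: rate-based population decoding. A stimulus value is estimated
# from the spike counts of many neurons in a fixed window via linear regression.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_trials, n_neurons = 500, 40
stimulus = rng.uniform(-1.0, 1.0, size=n_trials)            # e.g. a motion variable
tuning = rng.standard_normal(n_neurons)                      # per-neuron tuning slope
rates = np.clip(5.0 + np.outer(stimulus, tuning), 0, None)   # mean firing rates (Hz)
counts = rng.poisson(rates * 0.2)                            # spike counts in a 200 ms window

X_tr, X_te, s_tr, s_te = train_test_split(counts, stimulus, random_state=0)
decoder = LinearRegression().fit(X_tr, s_tr)                 # population decoder
print("decoding R^2 on held-out trials:", round(decoder.score(X_te, s_te), 3))
```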

Building an Analytical Platform of Big Data for Quality Inspection in the Dairy Industry: A Machine Learning Approach (유제품 산업의 품질검사를 위한 빅데이터 플랫폼 개발: 머신러닝 접근법)

  • Hwang, Hyunseok;Lee, Sangil;Kim, Sunghyun;Lee, Sangwon
    • Journal of Intelligence and Information Systems / v.24 no.1 / pp.125-140 / 2018
  • As one of the processes in the manufacturing industry, quality inspection examines intermediate or final products to separate good-quality goods that meet the quality management standard from defective goods that do not. Manual quality inspection in a mass production system tends to suffer from low consistency and efficiency, so the quality inspection of mass-produced products is automated in many processes, with machines checking and classifying the products. Although there are many preceding studies on improving or optimizing processes using the data generated in production, actual implementation has often been constrained by the technical limitations of processing a large volume of data in real time. Recent research on big data has improved data processing technology and made it possible to collect, process, and analyze process data in real time. This paper proposes a procedure and the details of applying big data to quality inspection and examines the applicability of the proposed method to the dairy industry. We review previous studies and propose a big data analysis procedure applicable to the manufacturing sector. To assess its feasibility, we applied two methods, a convolutional neural network and a random forest, to one of the quality inspection processes in the dairy industry: we collected, processed, and analyzed images of caps and straws in real time and then determined whether the products were defective. The results confirmed a drastic increase in classification accuracy compared with the quality inspection performed in the past. (A sketch of a small CNN defect classifier follows this entry.)
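
A minimal sketch of a convolutional neural network for binary good/defective image classification, in the spirit of the CNN method mentioned above. It uses Keras (an assumption, not necessarily the authors' framework), and the image size, architecture, and random placeholder data are illustrative only.

```python
# Hedged sketch: small CNN that outputs P(defective) for a product image
# (e.g. caps and straws). Real labelled images would replace the random arrays.
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers

model = tf.keras.Sequential([
    tf.keras.Input(shape=(64, 64, 3)),            # assumed image size
    layers.Conv2D(16, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Conv2D(32, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(64, activation="relu"),
    layers.Dense(1, activation="sigmoid"),        # probability of "defective"
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# Placeholder data standing in for the real cap/straw images and labels.
X = np.random.rand(128, 64, 64, 3).astype("float32")
y = np.random.randint(0, 2, size=128)
model.fit(X, y, epochs=2, batch_size=32, verbose=0)
print(model.evaluate(X, y, verbose=0))            # [loss, accuracy]
```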

A Hybrid Collaborative Filtering-based Product Recommender System using Search Keywords (검색 키워드를 활용한 하이브리드 협업필터링 기반 상품 추천 시스템)

  • Lee, Yunju;Won, Haram;Shim, Jaeseung;Ahn, Hyunchul
    • Journal of Intelligence and Information Systems / v.26 no.1 / pp.151-166 / 2020
  • A recommender system recommends products or services that best match each customer's preferences using statistical or machine learning techniques. Collaborative filtering (CF) is the most commonly used algorithm for implementing recommender systems. However, in most cases it uses only purchase history or customer ratings, even though customers provide numerous other data. E-commerce customers frequently use a search function to find the products they are interested in among the vast array of products offered, so search keyword data can be a very useful source of information for modeling customer preferences; nevertheless, they are rarely used in recommender systems. In this paper, we propose a novel hybrid CF model based on the Doc2Vec algorithm that uses the search keywords and purchase history of online shopping mall customers. To validate the applicability of the proposed model, we empirically tested its performance using real-world online shopping mall data from Korea. As the number of recommended products increased, the recommendation performance of the proposed CF (a hybrid CF based on customers' search keywords) improved, whereas the performance of conventional CF gradually decreased. We therefore find that search keyword data effectively represent customer preferences and can contribute to improving conventional CF recommender systems. (A sketch of the Doc2Vec keyword-embedding step follows this entry.)
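
A minimal sketch of the keyword-embedding step suggested by the abstract above: each customer's search-keyword history is embedded with gensim's Doc2Vec, and similar customers are retrieved by cosine similarity, which a user-based CF step could then exploit. The customer histories and all parameters are hypothetical, not the study's data.

```python
# Hedged sketch: embed per-customer search-keyword histories with Doc2Vec
# and look up the nearest neighbours of a customer in embedding space.
from gensim.models.doc2vec import Doc2Vec, TaggedDocument

# Hypothetical search-keyword histories, one list of keywords per customer.
histories = {
    "user_0": ["running shoes", "socks", "insole"],
    "user_1": ["trail shoes", "socks", "water bottle"],
    "user_2": ["lipstick", "foundation", "mascara"],
}
docs = [TaggedDocument(words=kw, tags=[uid]) for uid, kw in histories.items()]

model = Doc2Vec(docs, vector_size=32, min_count=1, epochs=50, seed=1)
# Nearest neighbours of user_0 in the keyword-embedding space (cosine similarity).
print(model.dv.most_similar("user_0", topn=2))
```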

Construction of Test Collection for Evaluation of Scientific Relation Extraction System (과학기술분야 용어 간 관계추출 시스템의 평가를 위한 테스트컬렉션 구축)

  • Choi, Yun-Soo;Choi, Sung-Pil;Jeong, Chang-Hoo;Yoon, Hwa-Mook;You, Beom-Jong
    • Proceedings of the Korea Contents Association Conference / 2009.05a / pp.754-758 / 2009
  • Extracting information from large-scale document collections is very useful not only for information retrieval but also for question answering and summarization. Even though relation extraction is a very important area, it is difficult to develop and evaluate a machine-learning-based system without a test collection. This study shows how to build a test collection (KREC2008) for relation extraction systems. We extracted technology terms from journal abstracts and selected candidate relations between them using WordNet. Judges who were well trained in the evaluation process then assigned a relation from among the candidates. This process provides a method by which even non-experts can easily build a test collection. KREC2008 is open to the public for researchers and developers and will be used for the development and evaluation of relation extraction systems. (A sketch of WordNet-based candidate relation selection follows this entry.)

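A small sketch of how relation candidates between two extracted terms might be proposed with WordNet (via NLTK), to be confirmed by human judges. This is an assumed illustration of the general idea, not the actual KREC2008 tooling, and the relation labels are invented.

```python
# Hedged sketch: propose coarse WordNet-based relation candidates
# (synonym / hypernym / hyponym) between a pair of extracted terms.
import nltk
nltk.download("wordnet", quiet=True)
from nltk.corpus import wordnet as wn

def candidate_relations(term_a: str, term_b: str) -> set[str]:
    """Return coarse WordNet-based relation candidates between two terms."""
    rels = set()
    for s1 in wn.synsets(term_a):
        for s2 in wn.synsets(term_b):
            if s1 == s2 or term_b in s1.lemma_names():
                rels.add("synonym")
            if s2 in s1.closure(lambda s: s.hypernyms()):
                rels.add("hyponym-of")      # term_a is a kind of term_b
            if s1 in s2.closure(lambda s: s.hypernyms()):
                rels.add("hypernym-of")     # term_b is a kind of term_a
    return rels

print(candidate_relations("laptop", "computer"))   # expected: {'hyponym-of'}
```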

The 4th industrial revolution and Korean university's role change (4차산업혁명과 한국대학의 역할 변화)

  • Park, Sang-Kyu
    • Journal of Convergence for Information Technology / v.8 no.1 / pp.235-242 / 2018
  • Interest in the 4th Industrial Revolution has increased impressively in newspapers, industry, government, and academia. In particular, AI, whose effects many people can now feel for themselves, has already surpassed human ability even in some creative areas, so many people are starting to feel that the effects of the revolution are right in front of them. Several issues arise from this trend: machines' capacity for deep learning, the identity of humans, changes in the job environment, and concerns about social change. Recently, many studies of the 4th Industrial Revolution have been conducted in fields such as AI (artificial intelligence), CRISPR, big data, and driverless cars. Since positive and negative effects exist at the same time and many preventive actions have recently been suggested, these opinions are compared and analyzed here in order to find better solutions. Several educational, political, scientific, social, and ethical effects and solutions are studied and suggested in this study. A clear implication of the study is that the social, industrial, political, and educational environment we will live in from now on is changing faster than ever. If a society (nation or government) reforms its social systems according to those changes, it will grasp the chance for development or take-off; otherwise, it will consume its resources ineffectively and lose the competition as a whole society. However, the method of that reform is not obvious in many respects while the revolution is still in progress, and it must be defined in either industrial or scientific terms; the person or nation who defines it will have the advantage of leading the future of that business or society.

A Study on the Effects of Online Word-of-Mouth on Game Consumers Based on Sentimental Analysis (감성분석 기반의 게임 소비자 온라인 구전효과 연구)

  • Jung, Keun-Woong;Kim, Jong Uk
    • Journal of Digital Convergence / v.16 no.3 / pp.145-156 / 2018
  • Unlike in the past, when distributors sold games through retail stores, they now sell digital content through online distribution channels. This study analyzes the effects of eWOM (electronic word of mouth) on the sales volume of games sold on Steam, an online digital content distribution channel. Recently, data mining techniques based on big data have been widely studied. In this study, a sentiment index of eWOM is derived through sentiment analysis, a text mining technique that can analyze the sentiment of each review, among the factors of eWOM. The sentiment analysis uses Naive Bayes and SVM classifiers, and the sentiment index is calculated with the SVM classifier, which showed the higher accuracy. Regression analysis is then performed on the dependent variable, sales variation, using the sentiment index, the number of reviews of each game (the size of eWOM), and the user score of each game (the rating of eWOM). The regression analysis revealed that the size of eWOM and the sentiment index of eWOM significantly influence sales variation. This study identifies the eWOM factors that affect sales volume when Korean game companies enter overseas markets through Steam. (A sketch of the sentiment-classification step follows this entry.)
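
A minimal sketch of the sentiment-classification step described above: a linear SVM is trained on labelled reviews (TF-IDF features), and a game's reviews are then aggregated into a simple sentiment index as the share of positive predictions. The training reviews, labels, and aggregation rule are placeholders, not the study's data or its exact index definition.

```python
# Hedged sketch: SVM sentiment classifier over review text, aggregated into a
# per-game sentiment index (fraction of reviews predicted positive).
from sklearn.pipeline import make_pipeline
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.svm import LinearSVC

train_reviews = ["great game, very fun", "boring and buggy",
                 "fantastic story", "terrible controls"]
train_labels = [1, 0, 1, 0]                       # 1 = positive, 0 = negative

clf = make_pipeline(TfidfVectorizer(), LinearSVC())
clf.fit(train_reviews, train_labels)

game_reviews = ["fun with friends", "crashes constantly", "great visuals"]
preds = clf.predict(game_reviews)
sentiment_index = preds.mean()                    # share of positive reviews
print("sentiment index:", sentiment_index)
```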

A Method for Correcting Air-Pressure Data Collected by Mini-AWS (소형 자동기상관측장비(Mini-AWS) 기압자료 보정 기법)

  • Ha, Ji-Hun;Kim, Yong-Hyuk;Im, Hyo-Hyuc;Choi, Deokwhan;Lee, Yong Hee
    • Journal of the Korean Institute of Intelligent Systems / v.26 no.3 / pp.182-189 / 2016
  • For high forecast accuracy with numerical weather prediction models, we need large, densely distributed weather observation data. The Korea Meteorological Administration (KMA) maintains Automatic Weather Stations (AWSs) to obtain weather observations, but their installation and maintenance costs are high. The Mini-AWS is a very compact automatic weather station that can measure and record temperature, humidity, and pressure. In contrast to an AWS, the installation and maintenance costs of a Mini-AWS are low, and it imposes few space constraints, so it is easier to install wherever observations are needed. However, the data observed by Mini-AWSs cannot be used directly, because they can be affected by the surroundings. In this paper, we propose a correction method that makes the pressure data observed by Mini-AWSs usable as weather observation data. We first preprocessed the Mini-AWS pressure data and then corrected them using machine learning methods, with the aim of matching the pressure data of the nearest AWS. Our experimental results show that the corrected pressure data meet the required standard and that our correction method using SVR performs very well. (A sketch of such an SVR correction step follows this entry.)
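
A minimal sketch of the SVR correction step described above: Mini-AWS pressure, together with temperature and humidity, is regressed onto the pressure of the nearest reference AWS. The synthetic data, feature set, and hyperparameters are assumptions for illustration, not the paper's configuration.

```python
# Hedged sketch: correct Mini-AWS pressure readings toward the nearest
# reference AWS pressure with Support Vector Regression.
import numpy as np
from sklearn.svm import SVR
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 2000
mini_p = rng.uniform(990, 1025, n)                 # Mini-AWS pressure (hPa)
temp = rng.uniform(-10, 35, n)                     # temperature (deg C)
humid = rng.uniform(20, 100, n)                    # relative humidity (%)
# Synthetic "true" AWS pressure: a biased, weather-dependent distortion.
aws_p = mini_p + 1.5 + 0.02 * temp - 0.01 * humid + rng.normal(0, 0.3, n)

X = np.column_stack([mini_p, temp, humid])
X_tr, X_te, y_tr, y_te = train_test_split(X, aws_p, random_state=0)

model = make_pipeline(StandardScaler(), SVR(C=10.0, epsilon=0.1))
model.fit(X_tr, y_tr)
err = np.abs(model.predict(X_te) - y_te)
print("mean abs. correction error (hPa):", round(err.mean(), 3))
```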

Study on High-speed Cyber Penetration Attack Analysis Technology based on Static Feature Base Applicable to Endpoints (Endpoint에 적용 가능한 정적 feature 기반 고속의 사이버 침투공격 분석기술 연구)

  • Hwang, Jun-ho;Hwang, Seon-bin;Kim, Su-jeong;Lee, Tae-jin
    • Journal of Internet Computing and Services / v.19 no.5 / pp.21-31 / 2018
  • Cyber penetration attacks can not only damage cyberspace but can also strike entire infrastructures such as electricity, gas, water, and nuclear power, causing enormous damage to people's lives. Cyberspace has already been defined as the fifth battlefield, and strategic responses are very important. Most recent cyber attacks are carried out with malicious code, and since more than 1.6 million samples appear per day, automated analysis technology that can cope with this volume is essential. However, static analysis has difficulty with encrypted, obfuscated, and packed malicious code, while dynamic analysis is limited not only by its performance requirements but also by virtual-environment evasion techniques. In this paper, we propose a machine-learning-based malicious code analysis technique that improves on the detection performance of existing analysis technologies while remaining light and fast enough for commercial endpoints. On 71,000 normal and malicious files in a commercial environment, the proposed method achieved 99.13% accuracy, 99.26% precision, and 99.09% recall, and in a PC environment it can analyze more than five files per second. It can operate independently in the endpoint environment and is expected to work in a complementary fashion with existing antivirus technology and static and dynamic analysis technologies. It is also expected to serve as a core element of EDR technology and malware variant analysis. (A sketch of a static-feature classifier of this kind follows this entry.)
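
A small sketch of a static-feature malware classifier in the general spirit of the abstract above (the paper's actual feature set is not specified here): cheap features are derived from a file's raw bytes (size, byte entropy, byte histogram) and fed to a random forest. The placeholder byte blobs stand in for real benign and malicious samples.

```python
# Hedged sketch: static byte-level features + random forest classification.
import math
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def static_features(data: bytes) -> np.ndarray:
    """Size, byte entropy, and byte histogram of a file's raw contents."""
    counts = np.bincount(np.frombuffer(data, dtype=np.uint8), minlength=256)
    probs = counts / max(len(data), 1)
    entropy = -sum(p * math.log2(p) for p in probs if p > 0)
    return np.concatenate(([len(data), entropy], probs))    # 258-dim feature vector

# Placeholder byte blobs standing in for real benign / malicious samples.
rng = np.random.default_rng(0)
benign = [bytes(rng.integers(0, 64, 4096, dtype=np.uint8)) for _ in range(50)]
malicious = [bytes(rng.integers(0, 256, 4096, dtype=np.uint8)) for _ in range(50)]

X = np.array([static_features(b) for b in benign + malicious])
y = np.array([0] * len(benign) + [1] * len(malicious))       # 0 = benign, 1 = malicious
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
print("resubstitution accuracy:", clf.score(X, y))
```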