• Title/Summary/Keyword: trend algorithm


An Analysis of Convergence Relation on Economic Activity with Credit Cards in Korea and China & A Development of the Algorithm on Economic Trend Estimation (한중 신용카드가 경제활동에 미치는 융합적 영향 및 경제추이 예측을 위한 알고리즘 개발 연구)

  • Baik, Ran; Ryu, Jae Hee
    • Journal of the Korea Convergence Society / v.7 no.4 / pp.9-17 / 2016
  • This study analyzes the Korean and Chinese credit card markets and predicts future economic activity by developing an algorithm for economic trend estimation. The results show that in Korea there is no significant correlation between personal income growth and credit card usage, but a significant correlation between cards per capita and credit card usage. In China, credit card usage is significantly correlated both with personal income growth and with cards per capita. Assuming no constraints in the external environment, the algorithm predicts that the Chinese credit card market will expand over the next five years at a gradually increasing rate.
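
As an illustration of the correlation-plus-extrapolation approach the abstract describes, the Python sketch below computes Pearson correlations between hypothetical yearly series and projects usage five years ahead. The figures and variable names are placeholders, not the paper's data or algorithm.

```python
# Illustrative sketch: correlation checks plus a simple trend extrapolation.
import numpy as np
from scipy import stats

# Hypothetical yearly series (placeholder values, not real statistics).
years = np.arange(2006, 2016)
income_growth = np.array([5.1, 4.8, 3.9, 1.2, 6.0, 4.1, 3.5, 3.7, 3.9, 3.6])   # %
cards_per_capita = np.array([3.2, 3.6, 3.9, 4.1, 4.4, 4.6, 4.5, 4.4, 4.3, 4.2])
usage_amount = np.array([360, 400, 450, 470, 520, 560, 575, 590, 610, 635])    # trillion KRW

# Pearson correlation, as used to test the income/usage and cards/usage relations.
for name, series in [("income growth", income_growth),
                     ("cards per capita", cards_per_capita)]:
    r, p = stats.pearsonr(series, usage_amount)
    print(f"{name} vs usage: r = {r:.3f}, p = {p:.3f}")

# Naive 5-year projection: a quadratic trend lets the growth rate itself increase.
coef = np.polyfit(years, usage_amount, deg=2)
future = np.arange(years[-1] + 1, years[-1] + 6)
print(dict(zip(future.tolist(), np.polyval(coef, future).round(1).tolist())))
```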

The Application Method of Machine Learning for Analyzing User Transaction Tendency in Big Data environments (빅데이터 환경에서 사용자 거래 성향분석을 위한 머신러닝 응용 기법)

  • Choi, Do-hyeon; Park, Jung-oh
    • Journal of the Korea Institute of Information and Communication Engineering / v.19 no.10 / pp.2232-2240 / 2015
  • In the Big Data field, there is a recent trend of collecting and reprocessing existing data, such as products of high customer interest and past purchase details, to analyze user transaction propensity (product recommendation, sales forecasting, etc.). Previous studies of user propensity are limited in their range of subjects and timing of investigation, struggle to make predictions for detailed products, and lack real-time capability, which makes it difficult to deploy an appropriate and timely sales strategy against the trend. This paper applies machine learning algorithms to analyze user transaction propensity. The results demonstrate that various indicators that can be inferred per detailed product can be extracted.
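
The following is a minimal sketch, on synthetic data, of the general propensity-scoring approach the abstract describes: a classifier trained on transaction features to score purchase likelihood. It is not the authors' pipeline; the features, labels, and use of scikit-learn are assumptions for illustration.

```python
# Sketch: score purchase propensity from past transaction features.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 2000
# Hypothetical features: product views, prior category purchases, recency, session length.
X = np.column_stack([
    rng.poisson(3, n),            # product page views
    rng.poisson(1, n),            # prior purchases in category
    rng.integers(1, 90, n),       # days since last purchase
    rng.exponential(5, n),        # session length (minutes)
])
# Synthetic label: purchase more likely with more views/purchases and recent activity.
logits = 0.4 * X[:, 0] + 0.8 * X[:, 1] - 0.02 * X[:, 2] + 0.05 * X[:, 3] - 2.0
y = (rng.random(n) < 1 / (1 + np.exp(-logits))).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print("AUC:", round(roc_auc_score(y_te, model.predict_proba(X_te)[:, 1]), 3))
print("feature importances:", model.feature_importances_.round(3))
```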

Hash-Based Signature Scheme Technical Trend and Prospect (해시 기반 서명 기법 최신 기술 동향 및 전망)

  • Park, Tae-hwan; Bae, Bong-jin; Kim, Ho-won
    • Journal of the Korea Institute of Information Security & Cryptology / v.26 no.6 / pp.1413-1419 / 2016
  • With the development of quantum computing technology and the announcement of NIST's Post-Quantum Cryptography standardization project, research results on post-quantum cryptography have multiplied. The key sizes of existing symmetric-key block ciphers must be increased because of Grover's algorithm, and the security of discrete-logarithm-based public-key cryptography can be broken by Shor's algorithm. For this reason, many cryptologists and mathematicians are studying cryptography that remains secure against quantum computers, known as post-quantum cryptography. In this paper, we survey recent technical trends in hash-based signature schemes, one family of post-quantum cryptography, and suggest a prospect for their development.
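
For readers unfamiliar with the idea, a toy Lamport one-time signature illustrates the hash-based principle the survey covers: the private key is a set of random secrets, the public key their hashes, and signing reveals one secret per message-digest bit. Production schemes such as XMSS or SPHINCS+ add Merkle trees and many-time capability; this sketch is only conceptual.

```python
# Toy Lamport one-time signature over SHA-256 (educational sketch only).
import hashlib, os

def H(b: bytes) -> bytes:
    return hashlib.sha256(b).digest()

def keygen():
    # 256 pairs of random secrets; the public key is their hashes.
    sk = [(os.urandom(32), os.urandom(32)) for _ in range(256)]
    pk = [(H(a), H(b)) for a, b in sk]
    return sk, pk

def digest_bits(msg: bytes):
    d = H(msg)
    return [(d[i // 8] >> (7 - i % 8)) & 1 for i in range(256)]

def sign(msg: bytes, sk):
    # Reveal one secret per bit of the message digest.
    return [sk[i][bit] for i, bit in enumerate(digest_bits(msg))]

def verify(msg: bytes, sig, pk) -> bool:
    return all(H(sig[i]) == pk[i][bit] for i, bit in enumerate(digest_bits(msg)))

sk, pk = keygen()
sig = sign(b"hello", sk)
print(verify(b"hello", sig, pk), verify(b"hellp", sig, pk))  # True False
```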

Analysis of intraday price momentum effect based on patterns using dynamic time warping (DTW를 이용한 패턴 기반 일중 price momentum 효과 분석)

  • Lee, Chunju; Ahn, Wonbin; Oh, Kyong Joo
    • Journal of the Korean Data and Information Science Society / v.28 no.4 / pp.819-829 / 2017
  • The aim of this study is to analyze intraday price momentum: when a price trend forms, future prices tend to follow it. A U-shaped trading volume pattern, with volume concentrated at the market open and close, was observed. In this paper, price momentum is defined as the case where the trend of the first 10 minutes after the open is maintained until the close. The strategy is to go long or short according to the price change in the initial 10 minutes and to liquidate at the closing price. An empirical analysis using minute-level data showed that the strategy is effective, indicating the presence of intraday price momentum. A pattern in which returns increase at an early stage is called a J-shaped pattern; when a J-shaped pattern occurs, the price momentum effect tends to be stronger than otherwise. The DTW algorithm, well known in the field of pattern recognition, was used to detect J-shaped patterns and proved effective in predicting intraday price movements. This study shows that intraday price momentum exists in the KOSPI200 futures market.
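
A compact implementation of the DTW distance used for this kind of pattern matching is sketched below; the J-shaped template and the intraday return series are invented for illustration, not taken from the paper.

```python
# Dynamic time warping distance between a return path and a J-shaped template.
import numpy as np

def dtw_distance(a, b):
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

# Hypothetical cumulative returns over the first 10 one-minute bars.
j_template = np.array([0.0, -0.02, -0.03, -0.02, 0.0, 0.05, 0.12, 0.20, 0.30, 0.42])
observed   = np.array([0.0, -0.01, -0.03, -0.01, 0.02, 0.06, 0.10, 0.22, 0.28, 0.40])
flat       = np.array([0.0,  0.01, -0.01, 0.00, 0.01, -0.01, 0.00, 0.02, 0.01, 0.00])

print("distance to J pattern:", round(dtw_distance(observed, j_template), 3))
print("distance of a flat day:", round(dtw_distance(flat, j_template), 3))
```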

Depth-based Correction of Side Scan Sonar Image Data and Segmentation for Seafloor Classification (수심을 고려한 사이드 스캔 소나 자료의 보정 및 해저면 분류를 위한 영상분할)

  • 서상일; 김학일; 이광훈; 김대철
    • Korean Journal of Remote Sensing / v.13 no.2 / pp.133-150 / 1997
  • The purpose of this paper is to develop an algorithm for classifying and interpreting the seafloor from side scan sonar data. The algorithm consists of mosaicking the sonar data using navigation data, correcting and compensating the acoustic amplitude data according to the characteristics of the side scan sonar system, and segmenting the seafloor using digital image processing techniques. Correction and compensation are essential because amplitudes recorded at the same distance on the port and starboard sides usually differ and amplitudes attenuate as distance increases. This paper proposes an algorithm for compensating side scan sonar data and compares its result with a mosaic produced without compensation. The algorithm accounts for the amplitude characteristics associated with the tow-fish depth as well as the attenuation trend of the side scan sonar across beam positions. The paper also proposes a texture-based image segmentation algorithm whose criterion is the maximum occurrence of gray levels. A preliminary experiment was carried out with side scan sonar data and its results are demonstrated.
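
The kind of across-track amplitude compensation the paper addresses can be sketched as follows: estimate the average falloff per range bin over many pings and divide it out. This is a generic median-based flattening on synthetic data, not the authors' algorithm, which additionally accounts for the tow-fish depth.

```python
# Flatten the range-dependent amplitude falloff of one swath side.
import numpy as np

rng = np.random.default_rng(1)
n_pings, n_bins = 500, 256
true_range_gain = np.exp(-np.arange(n_bins) / 120.0)        # attenuation with range
seafloor = rng.gamma(shape=2.0, scale=1.0, size=(n_pings, n_bins))
raw = seafloor * true_range_gain                             # recorded amplitudes

# Empirical beam/range pattern: the median over pings is robust to bright targets.
pattern = np.median(raw, axis=0)
pattern /= pattern.mean()                                    # preserve overall brightness
compensated = raw / np.maximum(pattern, 1e-6)

print("column-mean spread before:", round(float(raw.mean(axis=0).std()), 3))
print("column-mean spread after: ", round(float(compensated.mean(axis=0).std()), 3))
```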

The Comparative Study based on Gompertz Software Reliability Model of Shape Parameter (곰페르츠형 형상모수에 근거한 소프트웨어 신뢰성모형에 대한 비교연구)

  • Shin, Hyun Cheul; Kim, Hee Cheul
    • Journal of Korea Society of Digital Industry and Information Management / v.10 no.2 / pp.29-36 / 2014
  • Finite-failure NHPP software reliability models presented in the literature exhibit constant, monotonically increasing, or monotonically decreasing failure occurrence rates per fault. This paper proposes a Gompertz-distribution reliability model as an efficient application for software reliability. The parameters are estimated by maximum likelihood using the bisection method, and model selection is based on the mean square error (MSE) and the coefficient of determination ($R^2$). Real failure data sets are analyzed to compare Gompertz models with fixed shape parameters. To confirm the reliability of the data, the Laplace trend test was employed. In this study the proposed Gompertz model proves more efficient in terms of reliability, so it can also be used as an alternative model. The results can help software developers choose a growth model, based on prior knowledge of the software, to identify failure modes.
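
The Laplace trend test mentioned in the abstract can be sketched as below, using the time-truncated form of the statistic on invented failure times; the paper applies it to its real data set.

```python
# Laplace trend test: u < 0 suggests reliability growth, u > 0 reliability decay.
import math

def laplace_factor(failure_times, T):
    """Time-truncated Laplace statistic for cumulative failure times observed up to T."""
    n = len(failure_times)
    mean_t = sum(failure_times) / n
    return (mean_t - T / 2) / (T * math.sqrt(1.0 / (12 * n)))

times = [30, 113, 181, 222, 300, 500, 680, 850, 1100, 1400]  # hypothetical (hours)
u = laplace_factor(times, T=1500)
print(f"Laplace factor u = {u:.2f}",
      "(u < -1.96 indicates significant reliability growth at the 5% level)")
```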

A study on the trend and selection for scheduling technology of embedded operating system (임베디드 운영체제의 스케줄링 기술 동향 및 선정에 관한 연구)

  • Min, Jae-Hong; Cho, Pyung-Dong; Hahm, Jin-Ho
    • Proceedings of the Korean Institute of Information and Communication Sciences Conference / 2011.05a / pp.629-632 / 2011
  • Embedded operating systems are a core software technology for realizing the ubiquitous environment and a basic technology with a large ripple effect; accordingly, they have advanced and changed greatly in recent years. In particular, much research has been done on real-time support in embedded operating systems. In this paper, we analyze the characteristics of real-time embedded systems and the trend of scheduling techniques that support them. Based on this analysis, we suggest a selection method for choosing a scheduling algorithm that takes the characteristics of the embedded system into account.
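
As a rough illustration of the selection question, the sketch below applies the classical Liu & Layland utilization bound for rate-monotonic (RM) scheduling and the EDF bound U <= 1 to a hypothetical periodic task set; it is a textbook check, not the paper's own selection technique.

```python
# Schedulability-based hint for choosing between RM and EDF scheduling.

def rm_bound(n: int) -> float:
    """Liu & Layland least upper bound for n periodic tasks under RM."""
    return n * (2 ** (1.0 / n) - 1)

def suggest_scheduler(tasks):
    """tasks: list of (worst-case execution time, period) pairs."""
    utilization = sum(c / t for c, t in tasks)
    if utilization <= rm_bound(len(tasks)):
        return utilization, "RM: schedulable by the utilization bound"
    if utilization <= 1.0:
        return utilization, "RM bound inconclusive; EDF can schedule it (U <= 1)"
    return utilization, "overloaded: no algorithm can meet all deadlines"

tasks = [(1, 4), (2, 6), (1, 10)]   # (C, T) in milliseconds, hypothetical task set
u, verdict = suggest_scheduler(tasks)
print(f"U = {u:.3f} -> {verdict}")
```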


A Comparative Study of Software Finite Fault NHPP Model Considering Inverse Rayleigh and Rayleigh Distribution Property (역-레일리와 레일리 분포 특성을 이용한 유한고장 NHPP모형에 근거한 소프트웨어 신뢰성장 모형에 관한 비교연구)

  • Shin, Hyun Cheul; Kim, Hee Cheul
    • Journal of Korea Society of Digital Industry and Information Management / v.10 no.3 / pp.1-9 / 2014
  • The inverse Rayleigh and Rayleigh distributions are widely used in the field of reliability. This paper applies them within finite-failure NHPP growth models; in practice, software is modified extensively during development and the occurrence of defects is almost inevitable. Finite-failure NHPP software reliability models in the literature exhibit constant, monotonically increasing, or monotonically decreasing failure occurrence rates per fault. This paper proposes inverse Rayleigh and Rayleigh software reliability growth models as efficient applications for software reliability. The parameters are estimated by maximum likelihood using the bisection method, and model selection is based on the mean square error (MSE) and the coefficient of determination ($R^2$). To confirm the reliability of the data, the Laplace trend test was employed. In many respects the Rayleigh distribution model proved more efficient than the inverse Rayleigh distribution model. The results can help software developers choose a growth model, based on prior knowledge of the software, to identify failure modes.
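
A simplified sketch of the comparison follows: the finite-failure NHPP mean value functions m(t) = a*F(t) for the Rayleigh and inverse Rayleigh distributions are fitted to invented cumulative failure counts and compared by MSE and R^2. The paper estimates parameters by maximum likelihood with bisection; a least-squares fit stands in here.

```python
# Fit Rayleigh and inverse Rayleigh NHPP mean value functions and compare fit quality.
import numpy as np
from scipy.optimize import curve_fit

def m_rayleigh(t, a, s):
    return a * (1 - np.exp(-t**2 / (2 * s**2)))

def m_inv_rayleigh(t, a, s):
    return a * np.exp(-(s / t)**2)

t = np.arange(1, 21, dtype=float)                       # test weeks
failures = np.array([1, 3, 6, 10, 14, 19, 23, 27, 30, 33,
                     35, 37, 38, 40, 41, 42, 42, 43, 43, 44], dtype=float)

for name, f, p0 in [("Rayleigh", m_rayleigh, (50, 8)),
                    ("inverse Rayleigh", m_inv_rayleigh, (50, 8))]:
    params, _ = curve_fit(f, t, failures, p0=p0, maxfev=10000)
    pred = f(t, *params)
    mse = np.mean((failures - pred) ** 2)
    r2 = 1 - np.sum((failures - pred) ** 2) / np.sum((failures - failures.mean()) ** 2)
    print(f"{name}: MSE = {mse:.2f}, R^2 = {r2:.3f}")
```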

Comparison of Three Optimization Methods Using Korean Population Data

  • Oh, Deok-Kyo
    • Korean System Dynamics Review / v.13 no.2 / pp.47-71 / 2012
  • The purpose of this research is to examine the validity of the data as well as of the simulation model, i.e., to reproduce real data in an SD model with the least error by applying adjustments so that the simulation faithfully reflects the real data. In general, SD programs (e.g., VENSIM) use the Euler or Runge-Kutta method as the integration algorithm. These two methods can reflect the trend of real data, but because they carry inherent numerical error they can cause validity problems when a simulation requires high accuracy. In this article, the population projected by the Korea National Statistical Office (KNSO) to 2050 is simulated with an aging chain model that divides the population into three age cohorts (0-14, 15-64, 65 and over) and applies adjustments to them. The adjustments are computed by optimization with three different methods: optimization in EXCEL, manual optimization with iterative calculation, and optimization in VENSIM DSS. The results are compared and the adjustment set with the least error is selected. The simulation results with this optimal adjustment set are validated using the methods proposed by Barlas (1996) and other alternative methods. It is concluded that the simulation model in this research shows no significant difference from the real data and reflects the real trend faithfully.
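
The point about numerical error in Euler versus Runge-Kutta integration can be illustrated with a one-stock model; the rate, initial value, and step size below are arbitrary and unrelated to the KNSO data.

```python
# Compare Euler and 4th-order Runge-Kutta against the exact solution of dP/dt = r*P.
import math

def simulate(r=-0.02, p0=48.0, dt=1.0, years=40):
    exact = p0 * math.exp(r * years)
    # Euler: one slope evaluation per step.
    p = p0
    for _ in range(int(years / dt)):
        p += dt * r * p
    euler = p
    # RK4: four slope evaluations per step.
    p = p0
    f = lambda x: r * x
    for _ in range(int(years / dt)):
        k1 = f(p); k2 = f(p + dt * k1 / 2); k3 = f(p + dt * k2 / 2); k4 = f(p + dt * k3)
        p += dt * (k1 + 2 * k2 + 2 * k3 + k4) / 6
    return exact, euler, p

exact, euler, rk4 = simulate()
print(f"exact {exact:.4f}  euler error {abs(euler - exact):.4f}  rk4 error {abs(rk4 - exact):.6f}")
```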


Development of a New Lunar Regolith Simulant using an Automated Program Framework

  • GyeongRok Kwon; Kyeong Ja Kim; Eungseok Yi
    • Journal of Astronomy and Space Sciences / v.41 no.2 / pp.79-85 / 2024
  • The trend in lunar exploration missions is shifting from prospecting the lunar surface to utilizing in-situ resources and establishing a sustainable bridgehead. In the past, experiments focused mainly on rover maneuvers and equipment operation, but the current shift requires more complex experiments, including preparation for resource extraction, space construction, and even space agriculture. Such experiments demand a sophisticated simulation of the lunar environment for which we are not yet prepared. In particular, lunar regolith simulants must reproduce precise physical and chemical compositions, with a development speed that allows different terrains to be simulated. Existing lunar regolith simulants, designed for 20th-century exploration paradigms, do not meet the requirements of modern space exploration, so the methodology for producing simulants must be innovated. In this study, a basic framework for lunar regolith simulant development was established toward this goal. The framework contains a sample database and a database of potential simulation target compositions, and it includes a built-in function that automatically calculates the optimal material mixing ratio through a particle swarm optimization algorithm to reproduce the target composition, enabling fast and accurate simulant development. Using this framework, we anticipate a more agile response to the evolving needs for space exploration simulants.
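
A conceptual sketch of the mixing-ratio step follows: a small particle swarm optimization searches for non-negative component weights, summing to one, whose blended oxide composition best matches a target. The component and target compositions are placeholders, not values from the framework's database.

```python
# Particle swarm optimization of material mixing ratios toward a target composition.
import numpy as np

rng = np.random.default_rng(7)

# Rows: candidate feedstock materials; columns: wt% of SiO2, Al2O3, FeO, CaO (made up).
components = np.array([[55.0, 15.0, 10.0,  8.0],
                       [44.0, 28.0,  4.0, 16.0],
                       [40.0,  8.0, 22.0, 10.0]])
target = np.array([46.0, 18.0, 12.0, 11.0])

def cost(w):
    w = np.clip(w, 0, None)
    w = w / w.sum()                      # enforce a valid mixing ratio
    return np.sum((w @ components - target) ** 2)

n_particles, dim, iters = 30, 3, 200
pos = rng.random((n_particles, dim))
vel = np.zeros_like(pos)
pbest, pbest_cost = pos.copy(), np.array([cost(p) for p in pos])
gbest = pbest[pbest_cost.argmin()].copy()

for _ in range(iters):
    r1, r2 = rng.random((2, n_particles, dim))
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = pos + vel
    costs = np.array([cost(p) for p in pos])
    improved = costs < pbest_cost
    pbest[improved], pbest_cost[improved] = pos[improved], costs[improved]
    gbest = pbest[pbest_cost.argmin()].copy()

best_w = np.clip(gbest, 0, None); best_w /= best_w.sum()
print("mixing ratio:", best_w.round(3), "residual:", round(cost(gbest), 4))
```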