• Title/Summary/Keyword: time complexity analysis

WHAT CAN WE SAY ABOUT THE TIME COMPLEXITY OF ALGORITHMS?

  • Park, Chin-Hong
    • Journal of applied mathematics & informatics
    • /
    • v.8 no.3
    • /
    • pp.959-973
    • /
    • 2001
  • We shall discuss one of the techniques needed to analyze algorithms, called the big-O function technique. The efficiency of an algorithm is measured in two ways. One is the time used by a computer to solve the problem with the algorithm when the input values are of a specified size. The other is the amount of computer memory required to implement the algorithm when the input values are of a specified size. We will mainly restrict our attention to time complexity. Determining the time complexity of nonlinear problems in numerical analysis seems to be almost impossible.
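
As a minimal illustration of the big-O technique described in this abstract (not taken from the paper; the functions and inputs are invented), the sketch below counts the dominant operations of a linear scan and a nested-loop pairwise comparison and relates the counts to their O(n) and O(n²) bounds.

```python
# A minimal sketch of big-O style operation counting (illustrative only).

def linear_scan(values):
    """Find the maximum: the loop body runs n - 1 times, so time is O(n)."""
    ops = 0
    best = values[0]
    for v in values[1:]:
        ops += 1                  # one comparison per element
        if v > best:
            best = v
    return best, ops

def all_pairs(values):
    """Compare every pair: n*(n-1)/2 comparisons, so time is O(n^2)."""
    ops = 0
    n = len(values)
    for i in range(n):
        for j in range(i + 1, n):
            ops += 1              # one comparison per pair
    return ops

for n in (10, 100, 1000):
    data = list(range(n))
    _, lin = linear_scan(data)
    quad = all_pairs(data)
    print(f"n={n:5d}  linear ops={lin:7d}  pairwise ops={quad:9d}")
# The linear count grows proportionally to n and the pairwise count to n^2,
# which is exactly what the O(n) and O(n^2) bounds predict.
```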

Time Complexity Analysis of MSP Term Grouping Algorithm for Binary Neural Networks (이진신경회로망에서 MSP Term Grouping 알고리즘의 Time Complexity 분석)

  • 박병준;이정훈
    • Proceedings of the Korean Institute of Intelligent Systems Conference
    • /
    • 2000.11a
    • /
    • pp.85-88
    • /
    • 2000
  • This paper analyzes the time complexity of the MSP Term Grouping (MTG) algorithm, a method for synthesizing a minimized binary neural network using Threshold Logic Units (TLUs) as basic neurons. By comparing it with binary neural network synthesis based on exhaustive pattern search, we show the effectiveness of the MTG algorithm.
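
As a hedged sketch of why the exhaustive-pattern-search baseline mentioned above is costly (this is not the MTG algorithm itself, and the function is invented for illustration): enumerating all input patterns of an n-input binary unit takes 2^n steps, so any synthesis procedure that visits every pattern is exponential in the input dimension.

```python
# Illustrative only: exhaustive enumeration of binary input patterns,
# the baseline against which the MTG algorithm's complexity is compared.
from itertools import product

def exhaustive_pattern_count(n_inputs):
    """Visit every binary input pattern of an n-input TLU: 2**n patterns."""
    count = 0
    for pattern in product((0, 1), repeat=n_inputs):
        count += 1                # one synthesis step per pattern
    return count

for n in range(2, 12, 3):
    print(f"{n:2d} inputs -> {exhaustive_pattern_count(n):5d} patterns (2^{n})")
# The pattern count doubles with every added input, which is why a term
# grouping heuristic that avoids full enumeration can pay off.
```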

Complexity Analysis of Internet Video Coding (IVC) Decoding

  • Park, Sang-hyo;Dong, Tianyu;Jang, Euee S.
    • Journal of Multimedia Information System
    • /
    • v.4 no.4
    • /
    • pp.179-188
    • /
    • 2017
  • The Internet Video Coding (IVC) standard is due to be published by the Moving Picture Experts Group (MPEG) for various Internet applications such as Internet broadcast streaming. IVC fundamentally aims at three things: 1) establishing IVC patents under a free-of-charge license, 2) reaching compression performance comparable to the AVC/H.264 constrained Baseline Profile (cBP), and 3) keeping computational complexity low enough for feasible real-time encoding and decoding. MPEG experts have worked diligently on the intellectual property rights issues for IVC, and they reported that IVC has already achieved the second goal (compression performance), even showing performance comparable to the AVC/H.264 High Profile (HP). On the complexity issue, however, there has been no thorough analysis of the IVC decoder. In this paper, we analyze the time complexity of the IVC decoder by evaluating its running time. The experimental results show that IVC is 3.6 times and 3.1 times more complex than AVC/H.264 cBP under constrained set (CS) 1 and CS2, respectively. Compared to AVC/H.264 HP, IVC is 2.8 times and 2.9 times slower in decoding time under CS1 and CS2, respectively. The most critical tool to improve for a lightweight IVC decoder is the motion compensation process, which contains a resolution-adaptive interpolation filtering process.
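
A minimal sketch of the kind of running-time comparison the paper performs (the decoder functions here are hypothetical stand-ins, not the IVC or AVC/H.264 reference software): each decoder is timed over the same input and the ratio of mean decoding times yields the "X times slower" figures quoted above.

```python
# Illustrative timing harness for comparing two decoders' running time.
# decode_ivc / decode_avc are hypothetical placeholders, not real codecs.
import time

def decode_ivc(bitstream):
    return sum(b * b for b in bitstream)      # heavier stand-in workload

def decode_avc(bitstream):
    return sum(bitstream)                     # lighter stand-in workload

def mean_decode_time(decoder, bitstream, runs=5):
    times = []
    for _ in range(runs):
        start = time.perf_counter()
        decoder(bitstream)
        times.append(time.perf_counter() - start)
    return sum(times) / len(times)

stream = list(range(1_000_000))
t_ivc = mean_decode_time(decode_ivc, stream)
t_avc = mean_decode_time(decode_avc, stream)
print(f"relative decoding complexity: {t_ivc / t_avc:.2f}x")
```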

A Study on the Type and the Facilities in Compositeness of the Domestic Discount Store (국내 대형할인점의 복합화에 따른 유형과 시설에 관한 연구)

  • 문선욱;양정필
    • Korean Institute of Interior Design Journal
    • /
    • no.41
    • /
    • pp.137-145
    • /
    • 2003
  • This research analyzed space planning in connection with compositeness, one of the new changes in discount stores, with the goal of predicting the direction of space planning in the coming era of composite facilities. The research was conducted as follows. First, the changes required of the overall distribution industry, socially and economically, were identified. Second, the characteristics and situation of discount stores were scrutinized. Third, the compositeness status of domestic stores was classified and its types were derived. Fourth, time-series changes and use were analyzed. The analysis reveals that the types of compositeness can be divided by location and by adjustment to environmental changes. The time-series analysis shows that the total operating area, the number of parked cars, and the tenant ratio increased dramatically between 2000 and 2003, and the correlation analysis between factors shows that the tenant ratio is strongly correlated with the other two factors. Self-compositeness takes the basic form of living facilities, while compositeness with other facilities combines cultural, sales, educational, and administrative ones; mass-compositeness is merged with stadiums, parks, or station sites. In sum, the concept of the complex shopping mall, realizing one-stop shopping and convenience, will continue in the days to come, and it is desirable that studies on large-scale shopping spaces be conducted continually in preparation for future lifestyles.

Korean Maintainability Prediction Methodology Reflecting System Complexity (시스템 복잡도를 반영한 한국형 정비도 예측 방법론)

  • Kwon, Jae-Eon;Hur, Jang-Wook
    • Journal of the Korean Society of Manufacturing Process Engineers
    • /
    • v.20 no.4
    • /
    • pp.119-126
    • /
    • 2021
  • During the development of a weapon system, the concept of maintainability is used to quantitatively predict and analyze maintenance time. However, owing to the complexity of a weapon system, the standard maintenance time predicted during development differs significantly from the time measured during operation of the equipment after development. According to the analysis presented in this paper, maintenance time can be predicted by considering the system's complexity on the basis of the military specifications, using Part B of Procedure II and Method B of Procedure V. The maintenance work elements affected by system complexity were identified with the analytic hierarchy process (AHP) technique, and the complexity-reflecting weights of the maintenance work elements were calculated by the Delphi method, which involves expert surveys. Based on MIL-HDBK-470A and MIL-HDBK-472, this paper presents a Korean-style maintainability prediction method that reflects the system complexity of weapon systems.
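
A hedged sketch of the weighted prediction idea described above, assuming the classical MIL-HDBK-472 form in which mean corrective maintenance time is a failure-rate-weighted average of task times; the per-element complexity multipliers (standing in for the AHP/Delphi-derived weights) and all numbers are invented for illustration.

```python
# Illustrative only: failure-rate-weighted mean maintenance time with
# per-element complexity multipliers (values are invented, not from the paper).

# element: (failure rate per 10^6 h, baseline task time in h, complexity weight)
elements = {
    "fault isolation": (120.0, 0.50, 1.30),
    "disassembly":     ( 80.0, 0.35, 1.10),
    "repair":          ( 60.0, 0.90, 1.25),
    "reassembly":      ( 80.0, 0.40, 1.10),
    "checkout":        (100.0, 0.25, 1.05),
}

def mean_corrective_time(elements):
    """Mct = sum(lambda_i * w_i * t_i) / sum(lambda_i)."""
    num = sum(lam * w * t for lam, t, w in elements.values())
    den = sum(lam for lam, _, _ in elements.values())
    return num / den

print(f"predicted mean corrective maintenance time: "
      f"{mean_corrective_time(elements):.3f} h")
```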

An Analysis of Effective Throughput in Distributed Wireless Scheduling

  • Radwan, Amr
    • Journal of Korea Multimedia Society
    • /
    • v.19 no.2
    • /
    • pp.155-162
    • /
    • 2016
  • Several distributed scheduling policies have been proposed with the objective of attaining the maximum throughput region or a guaranteed fraction of it. These policies consider only the theoretical throughput and do not account for the throughput lost to the time complexity of implementing an algorithm in practice. Therefore, we propose a novel concept called effective throughput to characterize the actual throughput by taking the time complexity into account. Effective throughput can be viewed as the actual transmitted data excluding the control message overhead. Numerical results demonstrate that in practical scheduling, time complexity significantly affects throughput: throughput degrades when the time complexity is high.
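
A minimal sketch of the effective-throughput idea, under the assumption (mine, for illustration; not the paper's exact model) that in each scheduling slot the time spent computing the schedule is lost to data transmission, so effective throughput scales the theoretical value by the fraction of the slot left for data.

```python
# Illustrative model: throughput lost to scheduling computation time.
def effective_throughput(theoretical_bps, slot_s, compute_s):
    """Scale theoretical throughput by the slot fraction left for data."""
    if compute_s >= slot_s:
        return 0.0                      # schedule never finishes in time
    return theoretical_bps * (slot_s - compute_s) / slot_s

slot = 1e-3                              # 1 ms scheduling slot (assumed)
for compute in (1e-5, 1e-4, 5e-4):
    thr = effective_throughput(10e6, slot, compute)
    print(f"compute={compute * 1e3:.2f} ms -> effective {thr / 1e6:.2f} Mbps")
# Higher algorithmic time complexity -> longer compute time -> lower
# effective throughput, matching the paper's observation.
```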

Low Complexity Decoder for Space-Time Turbo Codes

  • Lee Chang-Woo
    • The Journal of Korean Institute of Communications and Information Sciences
    • /
    • v.31 no.4C
    • /
    • pp.303-309
    • /
    • 2006
  • By combining the space-time diversity technique and iterative turbo codes, space-time turbo codes (STTCs) are able to provide powerful error correction capability. However, the multi-path transmission and iterative decoding structure of STTCs make the decoder very complex. In this paper, we propose a low-complexity decoder, which can be used to decode STTCs as well as general iterative codes such as turbo codes. The efficient implementation of the backward recursion and the log-likelihood ratio (LLR) update in the proposed algorithm improves the computational efficiency. In addition, if the calculation of the joint LLR is approximated using the approximate ratio (AR) algorithm, the computational complexity can be reduced even further. A complexity analysis and computer simulations over the Rayleigh fading channel show that the proposed algorithm requires less than 40% of the additions needed by the conventional Max-Log-MAP algorithm, while providing the same overall performance.
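
For context on why Max-Log-MAP reduces computation (a standard identity, not code from the paper): the exact max* operation used in MAP decoding, max*(a, b) = log(e^a + e^b) = max(a, b) + log(1 + e^(-|a - b|)), is approximated in Max-Log-MAP by dropping the correction term. The sketch below compares the two.

```python
# Standard max* (Jacobian logarithm) vs. the Max-Log approximation.
import math

def max_star(a, b):
    """Exact: log(e^a + e^b) = max(a, b) + log(1 + e^(-|a - b|))."""
    return max(a, b) + math.log1p(math.exp(-abs(a - b)))

def max_log(a, b):
    """Max-Log-MAP approximation: drop the correction term."""
    return max(a, b)

for a, b in [(2.0, 1.5), (3.0, -1.0), (0.1, 0.0)]:
    print(f"a={a:4.1f} b={b:4.1f}  max*={max_star(a, b):.4f}  "
          f"max-log={max_log(a, b):.4f}")
# The approximation error is at most log(2) and shrinks as |a - b| grows,
# which is why Max-Log-MAP trades a small accuracy loss for fewer operations.
```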

A Study on the Propriety of the Medical Insurance Fee Schedule of Surgical Operations - In Regard to the Relative Price System and the Classification of the Price Unit of Insurance Fee Schedule - (수술수가의 적정성에 관한 연구 - 상대가격체계와 항목분류를 중심으로 -)

  • Oh Jin Joo
    • Journal of Korean Public Health Nursing
    • /
    • v.2 no.2
    • /
    • pp.21-44
    • /
    • 1988
  • In Korea, fee-for-service reimbursement has been adopted since the beginning of the medical insurance system in 1977, and the importance of the relative value unit is currently being investigated. The purpose of this study was to find out the level of propriety of the differences in fees for different surgical services, and the appropriateness of the classification of the insurance fee schedule. The specific subjects and methodology were as follows. 1. The propriety of the Relative Price System (RPS). 1) Choice of sample operations: sample operations were selected and classified by specialists in general surgery into 32 items; for the same group of operations, the Insurance Fee Schedule (IFS) classified the operations into 24 separate items. To investigate the propriety of the RPS, the 24 items classified by the IFS were examined. 2) Evaluation of the complexity of surgery: the data used in this study were collected from 94 specialists in general surgery by a mail survey from November 1 to 15, 1986. Several independent variables (age, location, number of beds, university hospital status, and whether the medical institution accepts residents) were also investigated for analysis of the characteristics of surgical complexity. 3) Complexity and time calculations: time data were collected from the records of Seoul National University Hospital, and the cost per operation was calculated through cost-finding methods. 4) Analysis of the propriety of the RPS of the Insurance Fee Schedule: the RPS of the sample operations was regressed on the cost, time, and complexity relative value systems (RVS) separately. The coefficient of determination indicates the degree of variation in the RPS of the Insurance Fee Schedule explained by each of the cost, time, and complexity RVS. 2. The appropriateness of the classification of the Insurance Fee Schedule. 1) Choice of sample operations: the items that differed between the classification of the specialists and the classification of the medical Insurance Fee Schedule were chosen. 2) Comparisons of cost, time, and complexity between the items were made to evaluate which classification was more appropriate. The findings of the study can be summarized as follows. 1. The coefficient of determination of the regression of the RPS on the cost RVS was 0.58, on the time RVS 0.65, and on the complexity RVS 0.72. This means that the RPS of the Insurance Fee Schedule is improper with respect to cost, time, and complexity separately, and indicates that the RPS must be reshaped according to a standard element. In this study, the correlation coefficients among the cost, time, and complexity relative value systems were very high, which suggests that the RPS could be reshaped according to any one standard element; considering ease of measurement, time was thought to be the most appropriate. 2. The classifications of the specialists and of the Insurance Fee Schedule were compared with respect to cost, time, and complexity separately. For complexity, ANOVA was performed, and the other elements were compared using the values of the different classifications. The result was that the classification of the specialists was more reasonable, and that the classification of the Insurance Fee Schedule inappropriately grouped several operations into one price unit.
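
A minimal sketch of the coefficient-of-determination computation underlying the 0.58/0.65/0.72 figures (the data here are synthetic; the study's actual RPS and RVS values are not reproduced): regress the fee-schedule relative prices on a relative value system and report R².

```python
# Illustrative only: R^2 of the simple linear regression of RPS on one RVS.
# The numbers are synthetic; they are not the study's data.

def r_squared(x, y):
    """R^2 of the simple linear regression of y on x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    syy = sum((yi - my) ** 2 for yi in y)
    return (sxy * sxy) / (sxx * syy)

rvs = [1.0, 1.4, 2.1, 2.8, 3.5, 4.9, 6.0, 7.2]      # e.g. a time RVS
rps = [1.0, 1.2, 2.5, 2.4, 3.9, 4.1, 6.8, 6.5]      # fee-schedule RPS
print(f"R^2 = {r_squared(rvs, rps):.2f}")
# A low R^2 means the fee schedule's relative prices are poorly explained
# by the standard element, which is the study's argument for reshaping RPS.
```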

Complexity Analysis of the JVT (Joint Video Team) Compression/Decompression Scheme (JVT(Joint Video Team) 압축/복원방식의 복잡도 분석)

  • 이영렬
    • Broadcasting and Media Magazine
    • /
    • v.7 no.3
    • /
    • pp.75-82
    • /
    • 2002
  • In this report, a complexity analysis of the JVT (Joint Video Team) codec, the next video coding standard being jointly developed, is performed. Three configurations are defined in terms of coding efficiency, and the memory bandwidth and computation time of each configuration are analyzed. The ATOMIUM complexity analysis tool is used for both the memory access statistics and the computation time measurements of the JVT codec. The complexity of each video coding tool in the encoder and decoder is also reported as relative complexity.

Time Complexity Analysis of SPIHT(Set Partitioning in Hierarchy Trees) Image Coding Algorithm (SPIHT 영상코딩 알고리즘의 시간복잡도 해석)

  • 박영석
    • Journal of the Institute of Convergence Signal Processing
    • /
    • v.4 no.1
    • /
    • pp.36-40
    • /
    • 2003
  • A number of embedded wavelet image coding methods have been proposed since the introduction of the EZW (Embedded Zerotree Wavelet) algorithm. A common characteristic of these methods is that they use fundamental ideas found in the EZW algorithm. One of these methods is the SPIHT (Set Partitioning in Hierarchical Trees) algorithm, which became very popular since it achieved equal or better performance than EZW without having to use an arithmetic encoder. The SPIHT algorithm is computationally very simple, yet it provides excellent numerical and visual results. Until now, however, the evaluation of its time complexity has been no more than the relative result of experimental comparisons, and no strict time complexity analysis has been given. In this paper, we strictly analyze the processing time complexity of the SPIHT algorithm and prove that the time complexity for processing one bit-plane is O(n log₂ n) in the worst case.
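
To make the O(n log₂ n) bound concrete (an illustration of the growth rate, not the paper's proof): the sketch below shows how an n·log₂ n per-bit-plane cost grows when the number of coefficients n doubles, i.e., slightly faster than linearly.

```python
# Illustrative growth of an n*log2(n) per-bit-plane cost as n doubles.
import math

prev = None
for k in range(10, 16):             # n = 2^10 ... 2^15 coefficients
    n = 2 ** k
    cost = n * math.log2(n)
    ratio = cost / prev if prev else float("nan")
    print(f"n=2^{k:2d}  n*log2(n)={cost:10.0f}  growth vs previous={ratio:5.2f}x")
    prev = cost
# Doubling n multiplies the cost by 2 * log2(2n) / log2(n), slightly more
# than 2x: the signature of O(n log n) rather than linear growth.
```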
