Title/Summary/Keyword: compression decision mechanism (8 search results)

On-the-fly Data Compression for Efficient TCP Transmission

  • Wang, Min;Wang, Junfeng;Mou, Xuan;Han, Sunyoung
    • KSII Transactions on Internet and Information Systems (TIIS), v.7 no.3, pp.471-489, 2013
  • Data compression at the transport layer can both reduce the bytes transmitted over network links and increase the application data (TCP PDU) delivered in one RTT under the same network conditions. It can therefore improve transmission efficiency on the Internet, especially over bandwidth-limited or long-delay links. In this paper, we propose an on-the-fly TCP data compression scheme, TCPComp, to enhance TCP performance. The scheme consists primarily of a compression decision mechanism and a compression ratio estimation algorithm. When application data arrives at the transport layer, the compression decision mechanism determines which data blocks should be compressed. The compression ratio estimation algorithm predicts the compression ratio of upcoming application data in order to determine the proper size of the next data block and thus maximize compression efficiency. Furthermore, assessment criteria for TCP data compression schemes are systematically developed. Experimental results show that the scheme effectively reduces the number of transmitted TCP segments and bytes, yielding greater transmission efficiency than standard TCP and other TCP compression schemes.
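
Such a compression decision can be sketched as follows, assuming zlib as the compressor; the `CompressionDecider` class, its EWMA-based ratio predictor, the 128-byte minimum, and the thresholds are illustrative assumptions, not the paper's actual TCPComp algorithm:

```python
import zlib

class CompressionDecider:
    """Sketch of a transport-layer compression decision: compress a
    data block only when the predicted compression ratio suggests a
    net saving. The EWMA predictor and thresholds are assumptions,
    not TCPComp's published algorithm."""

    def __init__(self, min_ratio: float = 0.9, alpha: float = 0.25):
        self.min_ratio = min_ratio    # compress only if predicted output/input < min_ratio
        self.alpha = alpha            # EWMA smoothing factor
        self.predicted_ratio = 0.5    # optimistic initial estimate

    def should_compress(self, block: bytes) -> bool:
        # Skip tiny blocks: compression header overhead would outweigh any saving.
        if len(block) < 128:
            return False
        return self.predicted_ratio < self.min_ratio

    def process(self, block: bytes) -> bytes:
        if not self.should_compress(block):
            return block
        compressed = zlib.compress(block)
        ratio = len(compressed) / len(block)
        # Update the running estimate used to judge and size the next block.
        self.predicted_ratio = self.alpha * ratio + (1 - self.alpha) * self.predicted_ratio
        return compressed if ratio < 1.0 else block
```

In any real deployment the receiver must also be able to tell compressed segments from uncompressed ones, for example via a header flag.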

Compression Methods for Time Series Data using Discrete Cosine Transform with Varying Sample Size

  • Moon, Byeongsun;Choi, Myungwhan
    • KIISE Transactions on Computing Practices, v.22 no.5, pp.201-208, 2016
  • Collecting and storing multiple time series in real time requires large memory space. To address this problem, a varying sample size is introduced into a compression scheme based on the discrete cosine transform (DCT). Time series data exhibit the property that a higher compression ratio can be achieved when values change by small amounts and change infrequently. The coefficient of variation and the variability of the differences between adjacent data elements (VDAD) are presumed to be good measures of these characteristics and are used as the key parameters for determining the varying sample size. Test results showed that both the VDAD-based and the coefficient-of-variation-based schemes produce excellent compression ratios; however, the former uses a much simpler sample size decision mechanism and achieves better compression performance than the latter.
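
A minimal sketch of the idea, assuming SciPy's DCT; the VDAD formula, the block-size thresholds, and the coefficient-keeping rule below are illustrative guesses, since the paper's exact definitions are not reproduced here:

```python
import numpy as np
from scipy.fft import dct, idct

def vdad(x: np.ndarray) -> float:
    """Variability of the differences between adjacent data elements,
    taken here as the std of first differences scaled by the mean
    absolute level; the paper's exact definition may differ."""
    d = np.diff(x)
    return float(np.std(d) / (np.abs(x).mean() + 1e-12))

def choose_block_size(x: np.ndarray, sizes=(32, 64, 128, 256)) -> int:
    """Hypothetical mapping: smoother series (low VDAD) get larger
    blocks, since their energy packs into a few low-frequency DCT
    coefficients."""
    v = vdad(x)
    if v < 0.01:
        return sizes[3]
    if v < 0.05:
        return sizes[2]
    if v < 0.2:
        return sizes[1]
    return sizes[0]

def compress_block(block: np.ndarray, keep_fraction: float = 0.25):
    """Keep only the largest-magnitude fraction of DCT coefficients."""
    c = dct(block, norm='ortho')
    k = max(1, int(len(c) * keep_fraction))
    idx = np.argsort(np.abs(c))[::-1][:k]
    return idx, c[idx], len(block)

def decompress_block(idx, coeffs, n):
    c = np.zeros(n)
    c[idx] = coeffs
    return idct(c, norm='ortho')
```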

RDO-based joint bit allocation for MPEG G-PCC

  • Ye, Xiangyu;Cui, Li;Chang, Eun-Young;Cha, Jihun;Ahn, Jae Young;Jang, Euee S.
    • Proceedings of the Korean Society of Broadcast Engineers Conference, 2021.06a, pp.81-84, 2021
  • In this paper, a rate-distortion optimization (RDO) model is proposed to find the joint bit allocation between geometry data and color data in geometry-based point cloud compression (G-PCC) of the Moving Picture Experts Group (MPEG). The method constructs rate-distortion (RD) models for the geometry and color data through a training process; the two RD models are then combined, and the parameter λ is determined, to obtain the final RDO model. Experimental results show that the proposed method decreases the geometry Bjøntegaard delta bit rate by 20% while increasing the color Bjøntegaard delta bit rate by 37% compared with the MPEG G-PCC TMC13v12.0 software.
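
The joint allocation can be illustrated with a small numeric search, assuming power-law RD models fitted offline; the model forms, coefficients, and function names below are placeholders, not values from the G-PCC experiments:

```python
import numpy as np

# Assumed power-law RD models D(R) = a * R**(-b), fit from training
# runs; the coefficients are placeholders for illustration only.
def d_geom(r):
    return 12.0 * r ** -0.8

def d_color(r):
    return 20.0 * r ** -0.6

def joint_allocation(r_total: float, steps: int = 1000):
    """Pick the geometry/color bit split minimizing total distortion.
    At the optimum the marginal slopes match, dD_g/dR_g = dD_c/dR_c,
    and that common slope is the Lagrange multiplier lambda in the
    RDO formulation."""
    best = None
    for rg in np.linspace(r_total * 0.05, r_total * 0.95, steps):
        rc = r_total - rg
        cost = d_geom(rg) + d_color(rc)
        if best is None or cost < best[0]:
            best = (cost, rg, rc)
    return best

cost, rg, rc = joint_allocation(100.0)
print(f"geometry bits: {rg:.1f}, color bits: {rc:.1f}, distortion: {cost:.3f}")
```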


Development of a Structural Safety Evaluation System for Stone Voussoir Arch Bridges

  • Kim, Nam-Hee;Koh, Hyun-Moo;Hong, Sung-Gul
    • Journal of the Computational Structural Engineering Institute of Korea, v.22 no.1, pp.15-23, 2009
  • Masonry structures, which are very strong in compression, fail through instability of the structural geometry rather than by reaching the material stress limit. Given this behavior, the limit theorem, which focuses on structural collapse mechanisms, is more appropriate for evaluating the structural safety of stone voussoir arch bridges. This paper investigates the structural performance of Korean stone arch bridges built by the dry construction method based on the limit theorem, and exploits the results to develop a system for evaluating the structural safety margin. This study is expected to aid understanding of the structural behavior of stone voussoir arch bridges in Korea and to provide a guideline for engineering decisions in the maintenance of cultural heritage.
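
A minimal sketch of the hinge criterion behind such limit analysis: a hinge forms at a voussoir joint when the line of thrust reaches the section edge, i.e. when the eccentricity e = M/N equals half the arch thickness. The function names and the margin definition below are illustrative; the paper's evaluation system may formulate the check differently:

```python
def joint_is_stable(normal_force: float, moment: float, thickness: float) -> bool:
    """Classical limit-analysis check at one voussoir joint: stable as
    long as the thrust-line eccentricity e = M/N stays strictly inside
    half the arch thickness; a hinge forms when e reaches t/2."""
    e = abs(moment / normal_force)
    return e < thickness / 2.0

def hinge_margin(eccentricities, thickness: float) -> float:
    """Smallest margin over all joints: values near 1.0 mean a hinge,
    and hence a collapse mechanism, is about to form."""
    return min((thickness / 2.0) / max(abs(e), 1e-9) for e in eccentricities)
```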

Correlation between Young and Burgess Classification and Transcatheter Angiographic Embolization in Severe Trauma Patients

  • Cha, Yong Han;Sul, Young Hoon;Kim, Ha Yong;Choy, Won Sik
    • Journal of Trauma and Injury, v.28 no.3, pp.144-148, 2015
  • Purpose: Immediate identification of vascular injury requiring embolization in patients with pelvic bone fracture is not an easy task, and there have been many attempts to find indicators of embolization for such patients. Although the Young and Burgess classification is useful for treatment decision making, it is reported to have little value as an indicator of embolization in major trauma patients. The aim of this study is to assess the value of the Young and Burgess classification in predicting vessel injury by analyzing pelvic radiographs taken from major trauma patients with pelvic bone fracture. Methods: Among major trauma patients with an injury severity score (ISS) higher than 15 who visited our emergency room from January 2011 to June 2014, 200 patients had a pelvic bone fracture on the trauma series, and pelvic CT angiography was therefore performed. After applying the exclusion criteria, 153 patients were enrolled in this study for analysis by the Young and Burgess classification. Results: The most common mechanism of injury was lateral compression in both groups. There was no statistically significant difference in the Young and Burgess classification (p=0.397). The obturator artery was the most commonly injured artery in both groups. Six patients had more than one site of bleeding. Conclusion: Predicting the need for transcatheter angiographic embolization using the Young and Burgess classification in severe trauma patients is difficult and requires additional study.


Assessing Infinite Failure Software Reliability Model Using SPC (Statistical Process Control)

  • Kim, Hee Cheul;Shin, Hyun Cheul
    • Convergence Security Journal, v.12 no.6, pp.85-92, 2012
  • Many software reliability models are based on the times of occurrence of errors during software debugging. It is shown that asymptotic likelihood inference is possible for software reliability models based on infinite failure models and non-homogeneous Poisson processes (NHPP). For someone deciding when to market software, the conditional failure rate is an important variable. Such failure models are used in a wide variety of practical situations; many studies have applied them to characterization problems, detection of outliers, linear estimation, system reliability, life-testing, survival analysis, data compression, and other fields. Statistical process control (SPC) can monitor the forecasting of software failures and thereby contribute significantly to the improvement of software reliability, and control charts are widely used for software process control in the software industry. In this paper, we propose a control mechanism based on NHPP using the mean value functions of the log Poisson, log-linear, and Pareto distributions.
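
A sketch of how such mean value functions feed an SPC chart, assuming a Shewhart-style 3-sigma chart on the Poisson count N(t) with mean m(t); the parameter values are placeholders, and the paper's actual chart statistics may differ:

```python
import numpy as np

# Mean value functions of two of the infinite-failure NHPP models named
# above; a Pareto-based MVF plugs into the same scheme. Parameter
# values are illustrative, not fitted estimates.
def m_log_poisson(t, lam=2.0, theta=0.1):
    """Musa-Okumoto logarithmic Poisson: m(t) = ln(1 + lam*theta*t) / theta."""
    return np.log1p(lam * theta * t) / theta

def m_log_linear(t, alpha=0.0, beta=0.05):
    """Log-linear (Cox-Lewis) intensity exp(alpha + beta*t):
    m(t) = exp(alpha) * (exp(beta*t) - 1) / beta."""
    return np.exp(alpha) * np.expm1(beta * t) / beta

def poisson_control_limits(m, k=3.0):
    """Shewhart-style k-sigma limits for a Poisson count with mean m(t);
    observed cumulative failure counts outside (LCL, UCL) signal an
    out-of-control software process."""
    sd = np.sqrt(m)
    return np.maximum(m - k * sd, 0.0), m + k * sd

t = np.arange(1.0, 51.0)
lcl, ucl = poisson_control_limits(m_log_poisson(t))
```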

The Assessing Comparative Study for Statistical Process Control of Software Reliability Model Based on polynomial hazard function

  • Kim, Hee-Cheul;Shin, Hyun-Cheul
    • The Journal of Korea Institute of Information, Electronics, and Communication Technology, v.8 no.5, pp.345-353, 2015
  • Many software reliability models are based on the times of occurrence of errors during software debugging. It is shown that parameter inference is possible for software reliability models based on finite failure models and non-homogeneous Poisson processes (NHPP). For someone deciding when to market software, the conditional failure rate is an important variable. Finite failure models are used in a wide variety of practical situations; many studies have applied them to characterization problems, detection of outliers, linear estimation, system reliability, life-testing, survival analysis, data compression, and other fields. Statistical process control (SPC) can monitor the forecasting of software failures and thereby contribute significantly to the improvement of software reliability, and control charts are widely used for software process control in the software industry. In this paper, we propose a control mechanism based on NHPP using the mean value function of a polynomial hazard function.
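
For a polynomial hazard h(t) = a + bt + ct², the finite-failure NHPP mean value function follows from the cumulative hazard H(t) = at + bt²/2 + ct³/3. A minimal sketch with illustrative parameters; the same Poisson control-limit construction as in the previous sketch applies:

```python
import numpy as np

def m_poly_hazard(t, N=100.0, a=0.02, b=0.001, c=0.0):
    """Finite-failure NHPP with polynomial hazard h(t) = a + b*t + c*t**2:
    H(t) = a*t + b*t**2/2 + c*t**3/3 and m(t) = N * (1 - exp(-H(t))).
    N is the expected total number of faults; all values are illustrative."""
    H = a * t + b * t**2 / 2 + c * t**3 / 3
    return -N * np.expm1(-H)
```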

The Assessing Comparative Study for Statistical Process Control of Software Reliability Model Based on Musa-Okumoto and Power-law Type

  • Kim, Hee-Cheul
    • The Journal of Korea Institute of Information, Electronics, and Communication Technology, v.8 no.6, pp.483-490, 2015
  • Many software reliability models are based on the times of occurrence of errors during software debugging. It is shown that likelihood inference is possible for software reliability models based on finite failure models and non-homogeneous Poisson processes (NHPP). For someone deciding when to market software, the conditional failure rate is an important variable. Infinite failure models are used in a wide variety of practical situations; many studies have applied them to characterization problems, detection of outliers, linear estimation, system reliability, life-testing, survival analysis, data compression, and other fields. Statistical process control (SPC) can monitor the forecasting of software failures and thereby contribute significantly to the improvement of software reliability, and control charts are widely used for software process control in the software industry. In this paper, we propose a control mechanism based on NHPP using the mean value functions of the Musa-Okumoto and Power-law type models.
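
The two mean value functions named here have closed forms: Musa-Okumoto m(t) = (1/θ)ln(1 + λθt) and power-law (Duane/Crow) m(t) = αt^β. A minimal sketch with illustrative parameters, pluggable into the same Poisson control limits sketched earlier:

```python
import numpy as np

def m_musa_okumoto(t, lam=2.0, theta=0.1):
    """Musa-Okumoto logarithmic Poisson: m(t) = ln(1 + lam*theta*t) / theta."""
    return np.log1p(lam * theta * t) / theta

def m_power_law(t, alpha=1.5, beta=0.7):
    """Power-law (Duane/Crow) NHPP: m(t) = alpha * t**beta."""
    return alpha * np.power(t, beta)
```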