• Title/Summary/Keyword: Code Compression


Numerical Analysis of the Flow Characteristics of Intake-Port and Piston-Head Configurations in a Gasoline Direct-Injection Engine (가솔린직접분사기관에서 흡기포트 및 피스톤의 형상에 따른 유동해석)

  • Park Chan-Guk;Park Hyung-Koo;Lim Myung-Taeck
    • Journal of Computational Fluids Engineering / v.4 no.3 / pp.21-27 / 1999
  • In this paper, the characteristics of the flow resulting from the piston-head and intake-port configurations of the cylinder in a gasoline direct-injection engine are investigated numerically. Calculations are carried out from the intake process to the end of compression, using the GTT code, which includes the third-order upwind Chakravarthy-Osher TVD scheme and the κ-ε turbulence model with the law of the wall as a boundary condition. The results show that a piston head with a smaller radius of curvature and a larger radius gives stronger reverse tumble. It is also shown that as the maximum tumble ratio is increased by the intake-port configuration, the tumble ratio at the end of the compression stroke increases. It is concluded that the flow at the end of the compression stroke can be controlled by optimum design of the intake port and piston head.

  • PDF

Image Compression of Color-Tone Images by Transformed Q-factor (Q-factor변형에 의한 색조영상 압축에 관한 연구)

  • Choi, Kum-Su;Moon, Young-Deck
    • Proceedings of the KIEE Conference / 1999.11c / pp.781-783 / 1999
  • Storing or transmitting images is difficult without compression because the volume of generated or reproduced image data is very large. For random signals, compression efficiency without loss of image information is low, but compression using JPEG performs better. We used the Huffman code of JPEG, which assigns short codewords to frequently occurring data and long codewords to rarely occurring data. This paper improves image compression efficiency by transforming the Q-factor and verifies the results with the compressed images. The proposed method is very efficient for continuous color-tone images.

  • PDF
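The Huffman principle described in the abstract above, short codewords for frequent symbols and long ones for rare symbols, can be sketched in a few lines of Python. This is a generic illustration, not the paper's JPEG implementation:

```python
import heapq
from collections import Counter

def huffman_code(data):
    """Build a Huffman code table: frequent symbols get short
    codewords, rare symbols get long ones."""
    freq = Counter(data)
    # Heap entries: (frequency, unique tiebreak, {symbol: codeword})
    heap = [(f, i, {s: ""}) for i, (s, f) in enumerate(freq.items())]
    heapq.heapify(heap)
    if len(heap) == 1:  # degenerate single-symbol input
        return {s: "0" for s in heap[0][2]}
    tiebreak = len(heap)
    while len(heap) > 1:
        f1, _, t1 = heapq.heappop(heap)  # two least-frequent subtrees
        f2, _, t2 = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in t1.items()}
        merged.update({s: "1" + c for s, c in t2.items()})
        heapq.heappush(heap, (f1 + f2, tiebreak, merged))
        tiebreak += 1
    return heap[0][2]

codes = huffman_code("aaaabbc")
# 'a' (most frequent) never gets a longer codeword than 'b' or 'c'
assert len(codes["a"]) <= len(codes["b"]) <= len(codes["c"])
```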

Improving JPEG-LS Performance Using Location Information

  • Woo, Jae Hyeon;Kim, Hyoung Joong
    • KSII Transactions on Internet and Information Systems (TIIS) / v.10 no.11 / pp.5547-5562 / 2016
  • JPEG-LS is an international standard for lossless and near-lossless image compression. In this paper, a simple method is proposed to improve the performance of the lossless JPEG-LS algorithm. In JPEG-LS and its supplementary explanation, Golomb-Rice (GR) coding is mainly used for entropy coding, but it is not used for long codewords. The proposed method replaces a set of long codewords with a set of shorter location-map information. This paper shows how the location map guarantees reversibility and enhances the compression rate. Experiments have also been conducted to verify the efficiency of the proposed method.
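The Golomb-Rice coding mentioned above can be illustrated with a minimal sketch (generic GR coding with parameter k, not the limited-length variant used in JPEG-LS). Note how the unary prefix grows linearly with the quotient, which is why long codewords become a problem:

```python
def gr_encode(n, k):
    """Golomb-Rice encode a non-negative integer n with parameter k:
    unary quotient (n >> k), a terminating '0', then k remainder bits."""
    q = n >> k
    r = n & ((1 << k) - 1)
    return "1" * q + "0" + format(r, f"0{k}b") if k else "1" * q + "0"

def gr_decode(bits, k):
    """Invert gr_encode: count the unary '1's, then read k remainder bits."""
    q = 0
    i = 0
    while bits[i] == "1":
        q += 1
        i += 1
    i += 1  # skip the terminating '0'
    r = int(bits[i:i + k], 2) if k else 0
    return (q << k) | r

assert gr_encode(9, 2) == "11001"        # q=2 -> "110", r=1 -> "01"
assert gr_decode(gr_encode(9, 2), 2) == 9
```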

Wavelet-based Biomedical Signal Compression Using a Multi-stage Vector Quantization (다단계 벡터 양자화를 이용한 웨이브렛 기반 생체 신호 압축)

  • Park, Seo-Young;Kim, Young-Ju;Lee, In-Sung
    • Proceedings of the IEEK Conference / 2006.06a / pp.343-344 / 2006
  • In this paper, a biomedical signal compression method using multi-stage vector quantization is proposed. It exploits the characteristic of wavelet coefficients that the energy is concentrated in the approximation coefficients. The transmitted codebook index consists of code vectors obtained from the wavelet coefficients of the ECG and error signals, each taken from blocks of length 1024. The proposed compression method achieved an average PRD of 2.1298% and a CDR of 1.8 kbit/s.

  • PDF

Long-term deflection of high-strength fiber reinforced concrete beams

  • Ashour, Samir A.;Mahmood, Khalid;Wafa, Faisal F.
    • Structural Engineering and Mechanics / v.8 no.6 / pp.531-546 / 1999
  • The paper presents an experimental and theoretical study of the influence of steel fibers and longitudinal tension and compression reinforcement on the immediate and long-term deflections of high-strength concrete beams of 85 MPa (12,300 psi) compressive strength. Test results of eighteen beams subjected to sustained load for 180 days show that the deflection behavior depends on the longitudinal tension and compression reinforcement ratios and the fiber content; excessive amounts of compression reinforcement and fibers may have an unfavorable effect on long-term deflections. The beams having the ACI Code's minimum longitudinal tension reinforcement showed a much higher ratio of time-dependent to immediate deflection than the beams having about 50 percent of the balanced tension reinforcement. The results of a theoretical analysis of the tested beams and of a parametric study show that the influence of steel fibers in increasing the moment of inertia of cracked transformed sections is most pronounced in beams having a small amount of longitudinal tension reinforcement.

Implementation of the modified compression field theory in a tangent stiffness-based finite element formulation

  • Aquino, Wilkins;Erdem, Ibrahim
    • Steel and Composite Structures / v.7 no.4 / pp.263-278 / 2007
  • A finite element implementation of the modified compression field theory (MCFT) using a tangential formulation is presented in this work. Previously reported implementations of MCFT have concentrated mainly on secant formulations. This work describes details of the implementation of a modular algorithmic structure of a reinforced concrete constitutive model in nonlinear finite element schemes that use a Jacobian matrix in the solution of the nonlinear system of algebraic equations. The implementation was verified and validated using experimental and analytical data reported in the literature. The developed algorithm, which converges accurately and quickly, can be easily implemented in any finite element code.

An Efficient Medical Image Compression Considering Brain CT Images with Bilateral Symmetry (뇌 CT 영상의 대칭성을 고려한 관심영역 중심의 효율적인 의료영상 압축)

  • Jung, Jae-Sung;Lee, Chang-Hun
    • The Journal of the Institute of Internet, Broadcasting and Communication / v.12 no.5 / pp.39-54 / 2012
  • The Picture Archiving and Communication System (PACS) has become one of the key infrastructures with the overall improvement in medical informatization and the recent trend toward digital hospitals. The variety and volume of digital medical image data are also increasing rapidly. This trend underlines the importance of medical image compression for storing large-scale medical image data. Digital Imaging and Communications in Medicine (DICOM), the de facto standard for digital medical imagery, specifies Run-Length Encoding (RLE), a typical lossless compression technique, for medical image compression. However, RLE is not an appropriate approach for medical image data with the bilateral symmetry of the human body. We suggest two preprocessing algorithms, one that detects the region of interest (the minimum bounding rectangle) in a medical image to enhance compression efficiency, and one that re-codes image pixel values to reduce data size according to the symmetry characteristics of the region of interest, and we present an improved compression technique for brain CT imagery with high bilateral symmetry. Experimental results show that the suggested approach achieves a higher compression ratio than the RLE compression of the DICOM standard, even without detecting the region of interest in the images.
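The Run-Length Encoding that DICOM specifies can be illustrated generically; this sketch encodes any sequence as (value, run length) pairs, whereas the actual DICOM RLE codec is byte-oriented and more involved:

```python
def rle_encode(pixels):
    """Run-length encode a sequence as (value, run_length) pairs.
    Long runs of identical values compress well; noisy data does not."""
    runs = []
    for p in pixels:
        if runs and runs[-1][0] == p:
            runs[-1][1] += 1          # extend the current run
        else:
            runs.append([p, 1])       # start a new run
    return [tuple(r) for r in runs]

def rle_decode(runs):
    """Expand (value, run_length) pairs back to the original sequence."""
    out = []
    for value, length in runs:
        out.extend([value] * length)
    return out

row = [0, 0, 0, 255, 255, 0]
assert rle_encode(row) == [(0, 3), (255, 2), (0, 1)]
assert rle_decode(rle_encode(row)) == row
```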

Region-Growing Segmentation Algorithm for Lossless Compression of High-Resolution Medical Images (영역 성장 분할 기법을 이용한 무손실 영상 압축)

  • 박정선;김길중;전계록
    • Journal of the Institute of Convergence Signal Processing / v.3 no.1 / pp.33-40 / 2002
  • In this paper, we propose a lossless compression algorithm for medical images, an essential technique in picture archiving and communication systems. Mammographic and magnetic resonance images were used in this study, and a region-growing segmentation algorithm is proposed for their compression. The proposed algorithm partitions the original image into three sub-regions: an error image, a discontinuity index map, and high-order bit data. The discontinuity index data and the error image generated by the region-growing algorithm are compressed using the JBIG (Joint Bi-level Image Experts Group) algorithm, an international bi-level image compression standard and a compression technique well suited to gray-coded digital images. The proposed lossless compression method achieved, on average, compression to about 73.14% on a database of high-resolution digital mammography images. Compared with direct coding by the JBIG, JPEG, and Lempel-Ziv methods, the proposed method performed better by 3.7%, 7.9%, and 23.6% on the database used.

  • PDF
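The gray-code representation that bi-level coders such as JBIG benefit from can be sketched as a generic binary-to-Gray conversion (this is an illustration, not the paper's pipeline). Adjacent intensity values differ in only one bit, which makes the individual bit planes smoother and easier to compress:

```python
def to_gray(n):
    """Convert a non-negative integer to its Gray-code representation:
    consecutive integers differ in exactly one bit."""
    return n ^ (n >> 1)

def from_gray(g):
    """Invert to_gray by cascading XORs of right shifts."""
    n = 0
    while g:
        n ^= g
        g >>= 1
    return n

# Neighboring pixel values differ in exactly one bit plane
for v in range(255):
    assert bin(to_gray(v) ^ to_gray(v + 1)).count("1") == 1
assert all(from_gray(to_gray(v)) == v for v in range(256))
```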

An Analysis of High Speed Forming Using the Explicit Time Integration Finite Element Method (I) -Effects of Friction and Inertia Force- (엑스플리시트 시간 적분 유한요소법을 이용한 고속 성형 해석 (I) -마찰 및 관성 효과-)

  • 유요한;정동택
    • Transactions of the Korean Society of Mechanical Engineers / v.15 no.1 / pp.1-10 / 1991
  • A two-dimensional explicit finite element code was developed. This transient dynamics code can analyze large deformations of nonlinear materials subjected to extremely high strain rates. The Lagrangian finite element program uses an explicit time-integration operator to integrate the equations of motion, so no stiffness matrix is introduced. Cylinder-upsetting and ring-compression problems are simulated to examine the effects of friction and inertia force. It is shown that (1) the calculated results agree very well with experimental results, (2) the constant-shear-friction method overestimates the decrease of the inner ring radius early on and underestimates it later, in comparison with the Coulomb friction method, and (3) the effect of increasing the initial strain rate is similar to the effect of a higher friction coefficient.
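The explicit time-integration idea described above, integrating the equations of motion without assembling or inverting a stiffness matrix, can be illustrated with a one-degree-of-freedom central-difference sketch (a generic example, not the developed two-dimensional code):

```python
def explicit_central_difference(m, k, u0, v0, dt, steps):
    """Explicit central-difference integration of m*u'' + k*u = 0.
    Each step uses only the current internal force; no stiffness
    matrix is inverted, unlike implicit schemes.
    Conditionally stable: requires dt < 2/omega, omega = sqrt(k/m)."""
    a0 = -k / m * u0
    u_prev = u0 - dt * v0 + 0.5 * dt * dt * a0  # second-order start-up
    u = u0
    history = [u0]
    for _ in range(steps):
        a = -k / m * u                           # acceleration from force
        u_next = 2.0 * u - u_prev + dt * dt * a  # central-difference update
        u_prev, u = u, u_next
        history.append(u)
    return history

# Undamped oscillator: the amplitude stays bounded near |u0| = 1
hist = explicit_central_difference(1.0, 4.0, 1.0, 0.0, 0.01, 1000)
assert max(abs(u) for u in hist) < 1.01
```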

A parameter calibration method for PFC simulation: Development and a case study of limestone

  • Xu, Z.H.;Wang, W.Y.;Lin, P.;Xiong, Y.;Liu, Z.Y.;He, S.J.
    • Geomechanics and Engineering / v.22 no.1 / pp.97-108 / 2020
  • Long calibration times and limited objectivity are the main problems of the conventional micromechanical parameter calibration method for Particle Flow Code simulations. This study therefore aims to address these two limitations of the conventional "trial-and-error" method. A new calibration method for the linear parallel bond model (CM-LPBM) is proposed. First, numerical simulations are conducted based on the results of uniaxial compression tests on limestone. The macroscopic response of the numerical model agrees well with the test results. To reduce the number of independent micromechanical parameters, further numerical simulations are carried out. Based on orthogonal experiments and a multi-factor variance analysis, the main micromechanical parameters affecting the macro parameters of rocks are identified. Macro-micro parameter functions are then established using multiple linear regression, and iterative correction formulas for the micromechanical parameters are obtained. To further verify the validity of the proposed method, a case study is carried out. The error between the macro mechanical response and the numerical results is less than 5%. Hence the calibration method, i.e., the CM-LPBM, is reliable for obtaining micromechanical parameters quickly and accurately, providing a reference for the calibration of micromechanical parameters.