• Title/Summary/Keyword: compression performance


Efficient Image Size Selection for MPEG Video-based Point Cloud Compression

  • Jia, Qiong;Lee, M.K.;Dong, Tianyu;Kim, Kyu Tae;Jang, Euee S.
    • Proceedings of the Korean Society of Broadcast Engineers Conference / 2022.06a / pp.825-828 / 2022
  • In this paper, we propose an efficient image size selection method for video-based point cloud compression. The current MPEG video-based point cloud compression reference encoding process sets a threshold on the size of the images generated while converting point cloud data into images. Because the converted images are compressed and restored by a legacy video codec, the image size is one of the main factors influencing compression efficiency. If the image size can be made smaller than the size determined by the threshold, compression efficiency can be improved. Here, we study how to improve compression efficiency by selecting the best-fit image size generated during video-based point cloud compression. Experimental results show that the proposed method reduces encoding time by 6 percent without loss of coding performance compared to the test model 15.0 version of the video-based point cloud encoder. An illustrative code sketch of the size-selection idea follows this entry.

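The abstract above turns on picking a best-fit image size rather than a fixed threshold-derived size. Below is a minimal sketch of that idea only, assuming a simplified patch-packing result: the PackedPatch structure, the 64-pixel alignment, and the example layout are illustrative and do not reflect the actual MPEG V-PCC test-model data structures or encoding process.

```python
# Sketch: choose the smallest aligned atlas (image) size that covers the packed
# patches, instead of a fixed threshold-derived size. PackedPatch, the alignment
# of 64 pixels, and the example layout are illustrative assumptions only.
from dataclasses import dataclass

@dataclass
class PackedPatch:
    u0: int       # top-left x of the patch in the atlas
    v0: int       # top-left y of the patch in the atlas
    width: int
    height: int

def best_fit_image_size(patches, align=64, min_size=64):
    """Return (width, height) of the smallest aligned image covering all patches."""
    max_u = max(p.u0 + p.width for p in patches)
    max_v = max(p.v0 + p.height for p in patches)

    def round_up(x):
        return max(min_size, ((x + align - 1) // align) * align)

    return round_up(max_u), round_up(max_v)

if __name__ == "__main__":
    packed = [PackedPatch(0, 0, 300, 200), PackedPatch(310, 0, 150, 400)]
    print(best_fit_image_size(packed))    # (512, 448) for this example layout
```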

Effects of JPEG Compression on Joint Transform Correlator

  • Widjaja, Joewono;Suripon, Ubon
    • Institute of Control, Robotics and Systems: Conference Proceedings / 2004.08a / pp.1662-1665 / 2004
  • A real-time joint transform correlator using JPEG-compressed reference images is proposed as a practical solution to the storage problem and to improve the processing time of automatic target recognition systems [1]. The effects of compression on the recognition performance of the joint transform correlator are quantitatively investigated in situations where the target suffers from noise and differs in contrast from the reference. Two images with different spatial-frequency content and contrast were used as the test scenes. The simulation results show that the recognition performance of the joint transform correlator is more sensitive to noise and contrast difference when the compressed reference image contains high spatial-frequency components than when it contains mainly low spatial-frequency components. An illustrative simulation sketch follows this entry.

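As a rough companion to the abstract above, the sketch below round-trips a reference image through JPEG at several quality settings and measures the cross-correlation peak of a classical joint transform correlator. The random test image, its size, and the quality values are illustrative assumptions; the paper's test scenes, noise, and contrast conditions are not reproduced.

```python
# Sketch of a joint transform correlator (JTC) with a JPEG-compressed reference.
# Requires numpy and Pillow; the test image and quality settings are illustrative.
import io
import numpy as np
from PIL import Image

def jpeg_round_trip(img, quality):
    """Round-trip an 8-bit grayscale array through JPEG at the given quality."""
    buf = io.BytesIO()
    Image.fromarray(img.astype(np.uint8)).save(buf, format="JPEG", quality=quality)
    buf.seek(0)
    return np.asarray(Image.open(buf), dtype=np.float64)

def jtc_cross_peak(target, reference):
    """Classic JTC: joint input -> joint power spectrum -> correlation plane.
    Returns the height of the cross-correlation peak."""
    h, w = target.shape
    joint = np.zeros((h, 3 * w))
    joint[:, :w] = target                   # target on the left third
    joint[:, 2 * w:] = reference            # reference on the right third
    jps = np.abs(np.fft.fft2(joint)) ** 2   # joint power spectrum
    corr = np.abs(np.fft.fft2(jps))         # second transform yields correlation terms
    # Autocorrelation terms sit near column 0; cross terms peak at columns w and 2w.
    return corr[:, w:2 * w + 1].max()

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    target = rng.random((64, 64)) * 255.0
    for q in (90, 50, 10):
        peak = jtc_cross_peak(target, jpeg_round_trip(target, q))
        print(f"JPEG quality {q}: cross-correlation peak {peak:.3e}")
```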

A Study on the Performance Improvement in Sidelobe Suppression for Pulse Compression of LFM Signal (LFM 신호의 펄스압축에 대한 부엽억제 성능향상 기법연구)

  • Shin, Jeong-Hoon
    • Journal of the Korea Institute of Military Science and Technology / v.9 no.3 / pp.95-100 / 2006
  • The pulse compression technique using a linear FM (LFM) signal is commonly used to improve both the detection range and the range resolution of a radar system. In general, the compressed LFM waveform has a relatively high sidelobe level, which may prevent a target from being detected when a strong jammer or clutter signal lies near the target signal. In this paper, we propose a new weighting method that uses a square-root weight to suppress the sidelobe level. Typical applications are missile seekers and tracking radar systems, where the target tracking range is available prior to signal processing. Computer simulations show that the proposed method outperforms conventional weighting methods in terms of sidelobe suppression. An illustrative pulse-compression sketch follows this entry.
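
To make the setting above concrete, the sketch below performs matched-filter pulse compression of an LFM chirp with and without amplitude weighting and reports the peak sidelobe level. The square root of a Hamming window stands in for the square-root weighting idea; the paper's exact weighting function and radar parameters are not reproduced.

```python
# Sketch: LFM pulse compression with an optional amplitude taper on the replica.
# sqrt(Hamming) is only an illustrative stand-in for a square-root weight.
import numpy as np

def lfm_chirp(bandwidth, pulse_width, fs):
    """Complex baseband linear-FM pulse."""
    t = np.arange(int(pulse_width * fs)) / fs
    k = bandwidth / pulse_width                       # chirp rate (Hz/s)
    return np.exp(1j * np.pi * k * (t - pulse_width / 2) ** 2)

def compress(rx, replica, window=None):
    """Matched-filter pulse compression; the taper approximates frequency weighting."""
    h = np.conj(replica[::-1])
    if window is not None:
        h = h * window[::-1]
    return np.convolve(rx, h)

def peak_sidelobe_level_db(y):
    """Highest sidelobe relative to the mainlobe, walking out to the first nulls."""
    mag = np.abs(y)
    k = int(np.argmax(mag))
    left, right = k, k
    while left > 0 and mag[left - 1] < mag[left]:
        left -= 1
    while right < len(mag) - 1 and mag[right + 1] < mag[right]:
        right += 1
    sidelobes = np.concatenate([mag[:left], mag[right + 1:]])
    return 20 * np.log10(sidelobes.max() / mag[k])

if __name__ == "__main__":
    fs, bandwidth, pulse_width = 20e6, 5e6, 20e-6     # illustrative parameters
    chirp = lfm_chirp(bandwidth, pulse_width, fs)
    n = len(chirp)
    for name, w in [("uniform", None),
                    ("hamming", np.hamming(n)),
                    ("sqrt(hamming)", np.sqrt(np.hamming(n)))]:
        psl = peak_sidelobe_level_db(compress(chirp, chirp, w))
        print(f"{name:14s} peak sidelobe level: {psl:6.1f} dB")
```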

A Study on Evaluation of Jamming Performance on SAR Satellite (SAR 위성에 대한 재밍 효과 분석)

  • Lee, Young-Joong;Kim, In-Seon;Park, Joo-Rae;Kwak, Hyun-Kyu;Shin, Wook-Hyun
    • Journal of the Korea Institute of Military Science and Technology / v.13 no.2 / pp.252-257 / 2010
  • SAR obtains pulse compression gain through both range and azimuth processing. Efficient jammers against the SAR are evaluated with simulated elements in terms of power and the resulting SAR image. In this paper, J/S is first analyzed for the SAR using the RF propagation equation. Several jamming signals applied to the SAR signal are then formed into SAR images through the pulse compression process. Jamming performance is evaluated objectively using the Euclidean distance. An illustrative J/S calculation follows this entry.
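
As a back-of-the-envelope companion to the abstract above, the sketch below estimates J/S from the textbook one-way (jammer) and two-way (radar) link equations and then subtracts the SAR range and azimuth processing gains, which suppress noise-like jamming relative to the target echo. All powers, antenna gains, ranges, and processing gains are illustrative assumptions, not values from the paper.

```python
# Sketch: J/S for a SAR from textbook link equations plus processing gains.
# Every numerical value below is an illustrative assumption.
import math

def db(x):
    return 10 * math.log10(x)

def echo_power(pt, gt, sigma, wavelength, r):
    """Two-way radar equation: target echo power at the SAR receiver."""
    return pt * gt**2 * wavelength**2 * sigma / ((4 * math.pi)**3 * r**4)

def jammer_power(pj, gj, gr, wavelength, rj):
    """One-way link: jammer power received through the SAR antenna."""
    return pj * gj * gr * wavelength**2 / ((4 * math.pi * rj)**2)

if __name__ == "__main__":
    lam = 0.03                                   # ~10 GHz (X band)
    g_sar = 10 ** (35 / 10)                      # SAR antenna gain, 35 dB
    s = echo_power(pt=5e3, gt=g_sar, sigma=10.0, wavelength=lam, r=600e3)
    j = jammer_power(pj=100.0, gj=10 ** (10 / 10), gr=g_sar, wavelength=lam, rj=600e3)
    jsr = db(j / s)
    range_gain_db = db(100e6 * 10e-6)            # chirp time-bandwidth product
    azimuth_gain_db = db(1500)                   # coherently integrated pulses
    print(f"J/S before processing:              {jsr:6.1f} dB")
    print(f"J/S after range+azimuth processing: {jsr - range_gain_db - azimuth_gain_db:6.1f} dB")
```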

Study on Performance Improvement of Video in the H.264 Codec (H.264 코덱에서 동영상 성능개선 연구)

  • Bong, Jeong-Sik;Jeon, Joon-Hyeon
    • Proceedings of the KIEE Conference / 2005.10b / pp.532-535 / 2005
  • Many image processing techniques have been studied for effective image compression. Among them, 2D image filtering is widely used; it can be implemented by performing 1D linear filtering separately in the horizontal and vertical directions, and the efficiency of image compression depends on which filtering method is used. Circular convolution is commonly used for 2D image filtering, but it does not consider correlations at the image boundary, so filtering near the boundary is not performed effectively. To solve this problem, we propose a new convolution technique based on symmetric-mirroring convolution that satisfies the 'alias-free' and 'error-free' requirements in the reconstructed image. This method performs better than previous approaches because it uses highly correlated data at the boundary region. In this paper, pre-processing filtering in the H.264 codec is adopted to analyze the efficiency of the proposed filtering technique, and a simulator developed in MATLAB is used to examine its performance. An illustrative boundary-handling sketch follows this entry.

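The sketch below contrasts circular (wrap-around) boundary handling with symmetric-mirror extension for separable 2D filtering, which is the boundary issue discussed above. The 5-tap binomial kernel, the gradient test image, and the simple boundary-error measure are illustrative assumptions; this is not the paper's H.264 pre-processing filter.

```python
# Sketch: separable 2D filtering with circular vs. symmetric-mirror padding.
# Kernel, test image, and error measure are illustrative assumptions.
import numpy as np

def filter_2d(image, kernel_1d, boundary):
    """Pad, filter rows, then filter columns, then crop back to the input size."""
    pad = len(kernel_1d) // 2
    mode = {"circular": "wrap", "mirror": "symmetric"}[boundary]
    padded = np.pad(image, pad, mode=mode)
    rows = np.apply_along_axis(lambda r: np.convolve(r, kernel_1d, mode="same"), 1, padded)
    cols = np.apply_along_axis(lambda c: np.convolve(c, kernel_1d, mode="same"), 0, rows)
    return cols[pad:-pad, pad:-pad]

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    # Strong left-to-right gradient so the two boundary policies differ clearly.
    img = np.tile(np.linspace(0, 255, 64), (64, 1)) + rng.normal(0, 2, (64, 64))
    k = np.array([1, 4, 6, 4, 1], dtype=float) / 16.0
    for name in ("circular", "mirror"):
        out = filter_2d(img, k, name)
        err = np.abs(out[:, 0] - img[:, 0]).mean()    # error along the left edge
        print(f"{name:8s} mean boundary error: {err:6.2f}")
```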

Improving JPEG-LS Performance Using Location Information

  • Woo, Jae Hyeon;Kim, Hyoung Joong
    • KSII Transactions on Internet and Information Systems (TIIS) / v.10 no.11 / pp.5547-5562 / 2016
  • JPEG-LS is an international standard for lossless or near-lossless image compression. In this paper, a simple method is proposed to improve the performance of the lossless JPEG-LS algorithm. In JPEG-LS and its supplementary explanation, Golomb-Rice (GR) coding is mainly used for entropy coding, but it is not used for long codewords. The proposed method replaces a set of long codewords with a set of shorter location-map information. This paper shows how the location map guarantees reversibility and enhances the compression rate. Experiments were conducted to verify the efficiency of the proposed method. An illustrative location-map sketch follows this entry.
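
The sketch below illustrates the general idea of replacing overly long Golomb-Rice codewords with a location map plus verbatim samples; only codeword lengths are counted, no bitstream is produced. The code parameter k, the length limit, and the one-bit-per-sample map are illustrative assumptions and do not follow the JPEG-LS specification or the paper's exact design.

```python
# Sketch: count the bit cost of Golomb-Rice coding with and without a location
# map that flags samples whose codeword would exceed a length limit.
def gr_code_length(value, k):
    """Golomb-Rice codeword length: unary quotient + terminator + k remainder bits."""
    return (value >> k) + 1 + k

def encode_with_location_map(residuals, k=2, limit=16, raw_bits=8):
    coded_bits, escapes = 0, []
    for i, v in enumerate(residuals):
        n = gr_code_length(v, k)
        if n <= limit:
            coded_bits += n
        else:
            escapes.append(i)              # recorded in the location map
            coded_bits += raw_bits         # sample value stored verbatim
    map_bits = len(residuals)              # one flag bit per sample (simplest map)
    return coded_bits + map_bits, escapes

if __name__ == "__main__":
    data = [0, 1, 3, 2, 120, 0, 1, 200, 4, 2]
    with_map, escapes = encode_with_location_map(data)
    plain = sum(gr_code_length(v, 2) for v in data)
    print(f"plain GR: {plain} bits, with location map: {with_map} bits, escapes at {escapes}")
```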

Multi-Symbol Binary Arithmetic Coding Algorithm for Improving Throughput in Hardware Implementation

  • Kim, Jin-Sung;Kim, Eung Sup;Lee, Kyujoong
    • Journal of Multimedia Information System / v.5 no.4 / pp.273-276 / 2018
  • In video compression standards, entropy coding is essential for high-performance compression because it removes the redundancy of data symbols. Binary arithmetic coding is one of the high-performance entropy coding methods. However, the dependency between consecutive binary symbols limits throughput improvement. For throughput enhancement, a new probability model is proposed for encoding multiple symbols at one time. In the proposed method, the multi-symbol encoder is implemented with only adders and shifters, and the multiplication table used for interval subdivision in binary arithmetic coding is removed. Compared with the compression ratio of CABAC in H.264/AVC, the average performance degradation is only 1.4%, which is negligible. An illustrative interval-subdivision sketch follows this entry.
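
As a small illustration of shift/add interval subdivision, the sketch below updates an arithmetic-coding interval for a pair of binary symbols using only shifts, additions, and subtractions, with the LPS probability restricted to a power of two. Renormalization, carry handling, context modeling, and the bitstream writer are omitted, and this is not the coder proposed in the paper; it only shows how a multiplication-free subdivision can look.

```python
# Sketch: multiplication-free interval subdivision for a pair of binary symbols.
# P(LPS) = 2**-lps_shift is an illustrative restriction, not the paper's model.
def subdivide_pair(low, rng, bins, lps_shift=3):
    """Update (low, range) for two bins using shifts and adds only; 0 is the MPS."""
    for b in bins:
        r_lps = rng >> lps_shift           # LPS sub-interval width
        r_mps = rng - r_lps
        if b == 0:
            rng = r_mps                    # MPS: keep the lower part
        else:
            low += r_mps                   # LPS: skip over the MPS part
            rng = r_lps
    return low, rng

if __name__ == "__main__":
    low, rng = 0, 1 << 16
    for pair in [(0, 0), (0, 1), (0, 0), (1, 0)]:
        low, rng = subdivide_pair(low, rng, pair)
        print(f"bins {pair} -> low={low}, range={rng}")
```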

Effects of Compression Materials on Hand Dexterity in the 40's Healthy Subjects: A Preliminary Study

  • Rhee, Hyeon-Sook;Yu, Jae-Ho;Kim, Sung-Joong
    • The Journal of Korean Physical Therapy / v.23 no.6 / pp.43-47 / 2011
  • Purpose: The aim of this preliminary study was to use hand function tests to assess the hand dexterity levels obtained with a compression garment and with compression bandages in asymptomatic subjects, and to collect baseline data for comparison with hand function in patients with chronic arm lymphedema. Methods: The subjects were 32 healthy female volunteers with a mean age of 45.8 years. Grip strength and hand function were tested in three conditions (no compression, compression garment, and compression bandage) using the nine-hole peg test (NHPT), the box and block test (B&BT), the Minnesota Manual Dexterity Test (MMDT), and a hand-held Jamar dynamometer. Results: Grip strength was significantly lower in the bandage condition (p<0.05). Performance in both compression conditions (bandage and compression garment) decreased as the thickness of the compression material increased (p<0.05). Conclusion: The findings suggest that grip strength and hand function scores are influenced by the characteristics of the compression applied. Future studies are needed to compare hand function between patients with chronic arm lymphedema and healthy individuals.

A Study on the Wavelet Based Algorithm for Lossless and Lossy Image Compression (무손실.손실 영상 압축을 위한 웨이브릿 기반 알고리즘에 관한 연구)

  • An, Chong-Koo;Chu, Hyung-Suk
    • The Transactions of the Korean Institute of Electrical Engineers D / v.55 no.3 / pp.124-130 / 2006
  • A wavelet-based image compression system allowing both lossless and lossy compression is proposed in this paper. The proposed algorithm consists of two stages. The first stage uses the wavelet packet transform and a quad-tree coding scheme for lossy compression. In the second stage, the residual image between the original image and the lossy reconstruction is coded for lossless compression using the integer wavelet transform and a context-based predictive technique with error feedback. The proposed wavelet-based algorithm, which allows an optional lossless reconstruction of a given image, transmits image data progressively and chooses an appropriate wavelet filter in each stage. For lossy compression, the proposed algorithm improves PSNR by up to 1 dB on high-frequency images compared with the JPEG-2000 and S+P algorithms. For lossless compression, it improves the compression rate by up to 0.39 on high-frequency images compared with the existing algorithm. An illustrative two-stage sketch follows this entry.
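
The sketch below mimics the two-stage structure described above: a lossy wavelet stage (transform plus uniform quantization) followed by an exactly recoverable residual that a lossless stage would code. PyWavelets' bior4.4 filter, the quantization step, and the first-order entropy estimate are illustrative assumptions; the paper's wavelet packets, quad-tree coding, integer wavelet transform, and context-based prediction are not implemented.

```python
# Sketch: lossy wavelet stage + integer residual for a follow-up lossless stage.
# Wavelet choice, quantization step, and test image are illustrative assumptions.
import numpy as np
import pywt

def lossy_wavelet(image, wavelet="bior4.4", level=3, step=16.0):
    """Stage 1: wavelet transform, uniform quantization of all subbands, reconstruction."""
    coeffs = pywt.wavedec2(image, wavelet, level=level)
    arr, slices = pywt.coeffs_to_array(coeffs)
    arr = np.round(arr / step) * step
    rec = pywt.waverec2(pywt.array_to_coeffs(arr, slices, output_format="wavedec2"), wavelet)
    return rec[:image.shape[0], :image.shape[1]]

def entropy_bits_per_sample(x):
    """First-order entropy, a rough proxy for what a lossless coder would spend."""
    _, counts = np.unique(x, return_counts=True)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    img = rng.integers(0, 256, (128, 128)).astype(np.float64)   # stand-in test image
    # Round the lossy reconstruction so the residual is integer-valued. A real
    # codec needs a deterministic (e.g. integer) transform so the decoder can
    # reproduce exactly the same lossy image; that caveat is glossed over here.
    lossy = np.round(lossy_wavelet(img))
    residual = (img - lossy).astype(np.int32)                    # stage-2 input
    psnr = 10 * np.log10(255.0 ** 2 / np.mean((img - lossy) ** 2))
    print(f"stage-1 PSNR: {psnr:.2f} dB")
    print(f"residual entropy: {entropy_bits_per_sample(residual):.2f} bits/sample")
    assert np.array_equal(lossy + residual, img)                 # lossless recovery
```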