• Title/Summary/Keyword: lossless data compression

Search results: 73

LOSSLESS DATA COMPRESSION ON SAR DISPLAY IMAGES (SAR 디스플레이 영상을 위한 무손실 압축)

  • Lee, Tae-hee;Song, Woo-jin;Do, Dae-won;Kwon, Jun-chan;Yoon, Byung-woo
    • Proceedings of the IEEK Conference / 2001.09a / pp.117-120 / 2001
  • Synthetic aperture radar (SAR) is a promising active remote-sensing technique for obtaining large-scale terrain information about the earth in all weather conditions. SAR is useful in many applications that rely on SAR display images, including terrain mapping and geographic information systems (GIS). These applications usually need enormous data storage because they deal with wide, high-resolution terrain images, so compression is a practical way to handle SAR display images under limited storage. Because some data loss is unavoidable in the conversion of a complex SAR image to a display image, applications that need high-resolution images cannot tolerate further loss during compression; lossless compression is therefore appropriate for them. In this paper, we propose a novel lossless compression technique for SAR display images using a one-step predictor and block arithmetic coding.

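The one-step prediction stage described in the abstract above can be sketched as follows. This is a minimal illustration of why previous-pixel prediction helps (the residuals it yields have lower zeroth-order entropy, which a block arithmetic coder can then exploit), not the authors' implementation; the sample row is invented and the arithmetic coder itself is omitted.

```python
import math
from collections import Counter

def one_step_residuals(row):
    """Predict each pixel from its left neighbor; keep the first pixel as-is.
    The mapping is invertible, so the original row is exactly recoverable."""
    return [row[0]] + [row[i] - row[i - 1] for i in range(1, len(row))]

def reconstruct(res):
    row = [res[0]]
    for r in res[1:]:
        row.append(row[-1] + r)
    return row

def entropy(symbols):
    """Zeroth-order entropy in bits/symbol (lower = more compressible)."""
    counts = Counter(symbols)
    n = len(symbols)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

row = [100, 101, 103, 102, 102, 104, 108, 107]
res = one_step_residuals(row)
assert reconstruct(res) == row        # the predictor is lossless
assert entropy(res) < entropy(row)    # residuals are cheaper to entropy-code
```

Smooth imagery makes neighboring pixels correlated, so most residuals cluster near zero; that concentration is what the entropy coder converts into a smaller bitstream.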

Analysis of the Usefulness of Lossless Compression when Uploading Medical Images across Different PACS Operating Systems: A Comparative Analysis of SNR, CNR, and Histograms (PACS운영 시스템 차이에 따른 의료 영상 업로드 시 무손실 압축 방식의 유용성 분석: SNR, CNR, Histogram 비교 분석을 중심으로)

  • Choi, Ji-An;Hwang, Jun-Ho;Lee, Kyung-Bae
    • The Journal of the Korea Contents Association / v.18 no.3 / pp.299-308 / 2018
  • This study focused on the fact that medical images issued by different hospitals may suffer quality changes on PACS when different software is used. An image from university hospital A was copied as a DICOM file and registered on the PACS of university hospital B, and the file size and image quality under the software used at each hospital were evaluated by SNR, CNR, and histogram. As the compression ratio increased, SNR and CNR tended to decrease. Notably, lossless compression halved the data size compared to no compression, while SNR and CNR did not change. Histogram analysis showed conspicuous information loss due to underflow. When images are moved to another hospital, either no compression or lossless compression should be used; considering waiting time and economic efficiency, lossless compression is the useful choice for uploading.
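
The SNR and CNR measures compared in the study above can be sketched with their common region-of-interest (ROI) forms. The exact ROI definitions used by the authors are not given in the abstract, so the functions and sample values below are illustrative assumptions.

```python
import statistics

def snr(signal_roi, noise_roi):
    """Signal-to-noise ratio: mean signal intensity over the noise
    standard deviation (a common ROI-based definition)."""
    return statistics.mean(signal_roi) / statistics.pstdev(noise_roi)

def cnr(roi_a, roi_b, noise_roi):
    """Contrast-to-noise ratio: absolute difference of two region means
    over the noise standard deviation."""
    return abs(statistics.mean(roi_a) - statistics.mean(roi_b)) / statistics.pstdev(noise_roi)

tissue = [120, 122, 119, 121]       # bright ROI (invented pixel samples)
background = [30, 32, 31, 29]       # dark ROI
noise = [10, 12, 9, 11, 10, 8]      # uniform-noise ROI
print(snr(tissue, noise), cnr(tissue, background, noise))
```

Because lossless compression reconstructs every pixel exactly, both measures are unchanged after decompression, which is the behavior the study reports.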

A Pattern Matching Extended Compression Algorithm for DNA Sequences

  • Murugan, A.;Punitha, K.
    • International Journal of Computer Science & Network Security / v.21 no.8 / pp.196-202 / 2021
  • DNA sequencing provides fundamental data for genomics, bioinformatics, biology, and many other research areas. With the rapid evolution of DNA sequencing technology, a massive amount of genomic data, mainly DNA sequences, is produced every day, demanding ever more storage and bandwidth. Managing, analyzing, and especially storing these large amounts of data has become a major scientific challenge for bioinformatics. Such large volumes of data also require fast transmission, effective storage, rich functionality, and quick access to any record. Storage costs account for a considerable proportion of the total cost of producing and analyzing DNA sequences, so the disk footprint of DNA sequences must be tightly controlled; yet standard compression techniques fail to compress these sequences well, and several specialized techniques have been introduced for the purpose. To overcome these challenges, lossless compression techniques have become necessary. This paper describes a new DNA compression mechanism, a pattern-matching extended compression algorithm, which reads the input sequence as segments, finds matching patterns, and stores them in a permanent or temporary table according to the number of bases. The remaining unmatched sequence is converted into binary form, grouped into seven-bit blocks, and converted back into ASCII characters. Finally, the proposed algorithm dynamically calculates the compression ratio. The results show that the pattern-matching extended compression algorithm outperforms cutting-edge compressors in compression ratio regardless of file size.
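
The binary-to-ASCII packing step described for unmatched bases can be sketched as below. The paper's exact base-to-bit mapping and table handling are not specified in the abstract, so the two-bit code table here is a conventional assumption, and only the seven-bit grouping step is illustrated.

```python
# 2-bit codes for the four bases (a common choice; the paper's exact
# mapping is not given in the abstract).
CODE = {"A": "00", "C": "01", "G": "10", "T": "11"}
INV = {v: k for k, v in CODE.items()}

def pack(seq):
    """Convert a base string to bits, group into 7-bit chunks, map each
    chunk to an ASCII character. Returns the packed string and the pad
    length needed for exact decoding."""
    bits = "".join(CODE[b] for b in seq)
    pad = (-len(bits)) % 7
    bits += "0" * pad
    chars = "".join(chr(int(bits[i:i + 7], 2)) for i in range(0, len(bits), 7))
    return chars, pad

def unpack(chars, pad):
    bits = "".join(format(ord(c), "07b") for c in chars)
    if pad:
        bits = bits[:-pad]
    return "".join(INV[bits[i:i + 2]] for i in range(0, len(bits), 2))

seq = "ACGTACGTTGCA"
packed, pad = pack(seq)
assert unpack(packed, pad) == seq   # round-trip is lossless
assert len(packed) < len(seq)       # ~2 bits/base instead of 8 bits/char
```

Twelve bases become four characters here; recording the pad length keeps the transform exactly invertible, as a lossless scheme requires.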

Region-Growing Segmentation Algorithm for Lossless Image Compression of High-Resolution Medical Images (영역 성장 분할 기법을 이용한 무손실 영상 압축)

  • 박정선;김길중;전계록
    • Journal of the Institute of Convergence Signal Processing / v.3 no.1 / pp.33-40 / 2002
  • In this paper, we propose a lossless compression algorithm for medical images, an essential technique in picture archiving and communication systems (PACS). Mammographic and magnetic resonance images were used in this study, and a region-growing segmentation algorithm is proposed for their compression. The algorithm partitions the original image into three sub-regions: an error image, a discontinuity index map, and high-order bit data. The discontinuity index data and the error image produced by region growing are compressed with JBIG (Joint Bi-level Image experts Group), the international bi-level image compression standard and a compression technique well suited to gray-coded digital images. The proposed method achieved, on average, lossless compression to about 73.14% on a database of high-resolution digital mammography images. Compared with direct coding by JBIG, JPEG, and Lempel-Ziv methods, it performed better by 3.7%, 7.9%, and 23.6%, respectively, on the database used.

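The abstract above notes that JBIG works well on gray-coded images. The reason can be sketched with the standard binary-reflected Gray code: adjacent intensities differ in exactly one bit, so bit-planes stay spatially coherent for a bi-level coder. This is a general illustration of Gray coding, not the paper's pipeline.

```python
def to_gray(v):
    """Binary-reflected Gray code: adjacent intensity values differ in a
    single bit, which keeps bit-planes coherent for bi-level coders like JBIG."""
    return v ^ (v >> 1)

def from_gray(g):
    """Invert the Gray code by folding the bits back down."""
    v = 0
    while g:
        v ^= g
        g >>= 1
    return v

# Neighboring intensities map to codewords that differ in exactly one bit:
for v in range(255):
    diff = to_gray(v) ^ to_gray(v + 1)
    assert diff != 0 and diff & (diff - 1) == 0   # a single set bit
# The transform is exactly invertible, so it preserves losslessness:
assert all(from_gray(to_gray(v)) == v for v in range(256))
```

In plain binary, stepping from 127 to 128 flips all eight bits and disrupts every bit-plane at once; the Gray code avoids that, which is why it pairs well with bit-plane compression.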

Improving the Lifetime of NAND Flash-based Storages by Min-hash Assisted Delta Compression Engine (MADE (Minhash-Assisted Delta Compression Engine) : 델타 압축 기반의 낸드 플래시 저장장치 내구성 향상 기법)

  • Kwon, Hyoukjun;Kim, Dohyun;Park, Jisung;Kim, Jihong
    • Journal of KIISE / v.42 no.9 / pp.1078-1089 / 2015
  • In this paper, we propose the Min-hash Assisted Delta-compression Engine (MADE) to improve the lifetime of NAND flash-based storage at the device level. MADE effectively reduces write traffic to NAND flash through a novel delta compression scheme. Delta compression performance was optimized by introducing min-hash-based locality-sensitive hashing (LSH) and combining it efficiently with our delta compression method. We also developed a delta encoding technique with functionality equivalent to deduplication and lossless compression. Our experimental results show that MADE reduces the amount of data written to NAND flash by up to 90%, outperforming a simple combination of deduplication and lossless compression schemes by 12% on average.
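
The min-hash LSH idea above can be sketched as follows: pages with many shared byte shingles share many per-hash minima, so comparing short signatures is enough to locate good delta-compression candidates. MADE's actual signature construction is not given in the abstract; the hash choice, shingle size, and sample pages below are assumptions.

```python
import hashlib

def minhash(data, num_hashes=16, shingle=4):
    """Min-hash signature over byte shingles. Similar pages share many
    per-hash minimum values, so signature agreement approximates
    Jaccard similarity of their shingle sets."""
    shingles = {data[i:i + shingle] for i in range(len(data) - shingle + 1)}
    sig = []
    for seed in range(num_hashes):
        salt = seed.to_bytes(8, "big")        # one salted hash per row
        sig.append(min(
            int.from_bytes(hashlib.blake2b(s, digest_size=8, salt=salt).digest(), "big")
            for s in shingles))
    return sig

def similarity(sig_a, sig_b):
    return sum(a == b for a, b in zip(sig_a, sig_b)) / len(sig_a)

page_a = b"The quick brown fox jumps over the lazy dog." * 4
page_b = b"The quick brown fox jumps over the lazy cat." * 4   # near-duplicate
page_c = b"Completely different content with no overlap here!" * 4
sa, sb, sc = minhash(page_a), minhash(page_b), minhash(page_c)
assert similarity(sa, sb) > similarity(sa, sc)
```

A device-level engine would then delta-encode an incoming page against the stored page whose signature matches best, writing only the difference to flash.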

Improved CABAC Method for Lossless Image Compression (무손실 영상 압축을 위한 향상된 CABAC 방법)

  • Heo, Jin;Ho, Yo-Sung
    • The Journal of Korean Institute of Communications and Information Sciences / v.36 no.6C / pp.355-360 / 2011
  • In this paper, we propose a new context-based adaptive binary arithmetic coding (CABAC) method for lossless image compression. Since the conventional CABAC in H.264/AVC was originally designed for lossy coding, it does not perform adequately in lossless coding. We therefore propose an improved CABAC method for lossless intra coding that accounts for the statistical characteristics of residual data in lossless intra coding. Experimental results show that the proposed method reduces the bit rate by 18.2% compared to the conventional CABAC for lossless intra coding.
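
Full CABAC couples binarization, context modeling, and a range coder, and the abstract does not detail the proposed changes, so the sketch below illustrates only the core context-modeling idea: conditioning an adaptive bit model on context lowers the achievable code length. It uses ideal (entropy) code lengths instead of a real arithmetic coder, and the bit sequence is invented.

```python
import math

class AdaptiveBinaryModel:
    """Laplace-smoothed adaptive bit model, one instance per context.
    cost() returns the ideal arithmetic-code length of a bit, then updates."""
    def __init__(self):
        self.counts = [1, 1]   # add-one smoothing avoids zero probabilities

    def cost(self, bit):
        p = self.counts[bit] / sum(self.counts)
        self.counts[bit] += 1
        return -math.log2(p)

def code_length(bits, num_contexts, context_fn):
    """Total ideal code length when each bit is coded by the model
    selected by context_fn(previous_bit)."""
    models = [AdaptiveBinaryModel() for _ in range(num_contexts)]
    total, prev = 0.0, 0
    for b in bits:
        ctx = context_fn(prev) if num_contexts > 1 else 0
        total += models[ctx].cost(b)
        prev = b
    return total

# Bits whose distribution depends on the previous bit (long runs):
bits = ([0] * 20 + [1] * 20) * 10
single = code_length(bits, 1, lambda p: 0)
contextual = code_length(bits, 2, lambda p: p)   # context = previous bit
assert contextual < single
```

Globally the bits are 50/50, so a single model needs about one bit per symbol; split by context, each model sees a heavily skewed stream and the total drops sharply, which is the effect a residual-statistics-aware context design exploits.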

Edge Adaptive Hierarchical Interpolation for Lossless and Progressive Image Transmission

  • Biadgie, Yenewondim;Wee, Young-Chul;Choi, Jung-Ju
    • KSII Transactions on Internet and Information Systems (TIIS) / v.5 no.11 / pp.2068-2086 / 2011
  • Based on the quincunx sub-sampling grid, the New Interleaved Hierarchical INTerpolation (NIHINT) method is recognized as a superior pyramid data structure for the lossless and progressive coding of natural images. In this paper, we propose a new image interpolation algorithm, Edge Adaptive Hierarchical INTerpolation (EAHINT), for a further reduction in the entropy of interpolation errors. We compute the local variance of the causal context to model the strength of a local edge around a target pixel and then apply three statistical decision rules to classify the local edge into a strong edge, a weak edge, or a medium edge. According to these local edge types, we apply an interpolation method to the target pixel using a one-directional interpolator for a strong edge, a multi-directional adaptive weighting interpolator for a medium edge, or a non-directional static weighting linear interpolator for a weak edge. Experimental results show that the proposed algorithm achieves a better compression bit rate than the NIHINT method for lossless image coding. It is shown that the compression bit rate is much better for images that are rich in directional edges and textures. Our algorithm also shows better rate-distortion performance and visual quality for progressive image transmission.
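
The variance-based edge classification and interpolator switching described above can be sketched as follows. The paper derives its three decision rules statistically; the fixed thresholds, the causal-context layout, and the weighting forms here are illustrative assumptions.

```python
import statistics

def classify_edge(context, weak_t=4.0, strong_t=100.0):
    """Classify the local edge from the variance of the causal context.
    The two thresholds are illustrative, not the paper's derived rules."""
    var = statistics.pvariance(context)
    if var < weak_t:
        return "weak"
    if var > strong_t:
        return "strong"
    return "medium"

def interpolate(left, right, up, down, context):
    """Switch interpolators by local edge strength, as EAHINT does."""
    edge = classify_edge(context)
    if edge == "weak":
        # Non-directional static weighting: plain average of the neighbors.
        return (left + right + up + down) / 4
    if edge == "medium":
        # Multi-directional adaptive weighting: favor the flatter axis.
        wh = 1 / (abs(left - right) + 1e-9)
        wv = 1 / (abs(up - down) + 1e-9)
        return (wh * (left + right) / 2 + wv * (up + down) / 2) / (wh + wv)
    # Strong edge: one-directional, along the axis with the smaller gradient.
    return (left + right) / 2 if abs(left - right) < abs(up - down) else (up + down) / 2

# Across a sharp vertical feature, interpolating along it avoids blurring:
assert classify_edge([10, 10, 200, 200, 10, 200]) == "strong"
assert interpolate(10, 12, 200, 0, [10, 10, 200, 200, 10, 200]) == 11.0
```

Interpolating along, rather than across, a strong edge keeps the prediction error small exactly where naive averaging fails, which lowers the entropy of the interpolation errors that are then entropy-coded.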

Prediction by Edge Detection Technique for Lossless Multi-resolution Image Compression (경계선 정보를 이용한 다중 해상도 무손실 영상 압축을 위한 예측기법)

  • Kim, Tae-Hwa;Lee, Yun-Jin;Wei, Young-Chul
    • Journal of KIISE: Software and Applications / v.37 no.3 / pp.170-176 / 2010
  • Prediction is an important step in high-performance lossless data compression. In this paper, we propose a novel lossless image coding algorithm that increases prediction accuracy and, through a multi-resolution image technique, can display low-resolution images quickly. At each resolution, pixels of the previous-resolution image are used to estimate current pixel values. For each pixel, the estimate is determined by considering horizontal, vertical, and diagonal edge information together with the average and weighted average of its neighboring pixels. Experiments show that our method obtains better prediction than JPEG-LS or HINT.
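
The per-pixel choice among directional and averaging predictors described above can be sketched as below. The abstract does not give the decision rule, so selecting the direction whose opposite neighbors agree best, with a plain-average fallback and a fixed threshold, is an illustrative assumption.

```python
def predict(nw, n, ne, w, e, sw, s, se):
    """Estimate a pixel from its eight neighbors: prefer the direction whose
    two opposite neighbors agree best; fall back to the plain average when
    no single direction clearly dominates. The threshold is illustrative."""
    candidates = {
        "horizontal": (abs(w - e), (w + e) / 2),
        "vertical": (abs(n - s), (n + s) / 2),
        "diagonal": (abs(nw - se), (nw + se) / 2),
        "anti-diagonal": (abs(ne - sw), (ne + sw) / 2),
    }
    ranked = sorted(candidates.values())          # by gradient, ascending
    best_grad, best_pred = ranked[0]
    if ranked[1][0] - best_grad > 8:              # one direction clearly flattest
        return best_pred
    return (nw + n + ne + w + e + sw + s + se) / 8

# On a horizontal boundary (bright row above, dark row at the pixel),
# the horizontal pair agrees and is used:
assert predict(200, 200, 200, 50, 50, 50, 50, 50) == 50.0
```

Using pixels already decoded at the coarser resolution as the neighborhood makes the same estimate available to the decoder, so only the prediction residual needs to be transmitted.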

Lossless Compression and Rendering of Multiple Layer Displacement Map (다층 변위 맵의 비손실 압축과 렌더링)

  • Chun, Young-Jae;Kim, Hae-Dong;Cho, Sung-Hyun
    • Journal of Korea Game Society / v.9 no.6 / pp.171-178 / 2009
  • Multiple-layer displacement mapping can represent more complex and general geometry that cannot be represented by single-layer displacement mapping, and it lends realism to digital content such as 3D games and movies at relatively low cost. However, as more layers are used for detail, more data space is wasted, because some layers hold far less displacement data than others. In this paper, we suggest a lossless compression and rendering method for multiple-layer displacement maps. Since the map is compressed without data loss, the proposed method renders with the same quality as the original multiple-layer displacement map.

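One simple lossless way to avoid paying for sparsely filled layers, in the spirit of the abstract above, is to replace the fixed layers-by-texels array with a per-texel layer count plus a flat value list. The paper's actual encoding is not described in the abstract; this sketch also assumes, for illustration, that each texel's displacements fill contiguously from the base layer upward.

```python
def pack_layers(layer_maps):
    """Pack a multi-layer displacement map (None = no displacement) into a
    per-texel count array plus a flat value list: empty upper-layer slots
    cost nothing, and unpacking restores the map exactly."""
    counts, values = [], []
    texels = len(layer_maps[0])
    for t in range(texels):
        col = [lm[t] for lm in layer_maps if lm[t] is not None]
        counts.append(len(col))
        values.extend(col)
    return counts, values

def unpack_layers(counts, values, num_layers):
    texels = len(counts)
    maps = [[None] * texels for _ in range(num_layers)]
    it = iter(values)
    for t, c in enumerate(counts):
        for layer in range(c):
            maps[layer][t] = next(it)
    return maps

layers = [
    [0.1, 0.2, 0.3, 0.4],     # base layer: every texel has a displacement
    [0.5, None, 0.6, None],   # sparser layer
    [None, None, 0.7, None],  # sparsest layer
]
counts, values = pack_layers(layers)
assert unpack_layers(counts, values, 3) == layers  # exact reconstruction
assert len(values) == 7                            # 7 values vs 12 fixed slots
```

Because reconstruction is exact, a renderer reading the packed form sees the same displacements as with the original map, matching the abstract's equal-quality claim.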

A Comparative Study of Compression Methods and the Development of CODEC Program of Biological Signal for Emergency Telemedicine Service (응급 원격 진료 서비스를 위한 생체신호 압축 방법 비교 연구 및 압축/복원 프로그램 개발)

  • Yoon Tae-Sung;Lim Young-Ho;Kim Jung-Sang;Yoo Sun-Kook
    • The Transactions of the Korean Institute of Electrical Engineers D / v.52 no.5 / pp.311-321 / 2003
  • In an emergency telemedicine system such as the High-quality Multimedia based Real-time Emergency Telemedicine (HMRET) service, it is very important to monitor the patient's status continuously using multimedia data, including the patient's biological signals (ECG, BP, respiration, SpO2). To transmit these data in real time over communication channels of limited capacity, the biological data must be compressed along with the other multimedia data. For this purpose, we investigate and compare ECG compression techniques in the time domain and in the wavelet-transform domain, and present an effective lossless compression method for biological signals using the JPEG Huffman table for an emergency telemedicine system. For the HMRET service, we developed a lossless compression and reconstruction program for the biological signals in MSVC++ 6.0 using the DPCM method and the JPEG Huffman table, and tested it in an Internet environment.
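
The DPCM-plus-Huffman pipeline named above can be sketched as follows. This builds an ordinary Huffman code from the residual statistics rather than using the fixed JPEG Huffman table the authors employed, and the waveform is invented, so it illustrates the pipeline's shape only.

```python
import heapq
from collections import Counter

def dpcm(signal):
    """First-order DPCM: keep the first sample, then successive differences."""
    return [signal[0]] + [signal[i] - signal[i - 1] for i in range(1, len(signal))]

def huffman_lengths(symbols):
    """Code length per symbol from a Huffman tree over symbol counts."""
    counts = Counter(symbols)
    if len(counts) == 1:
        return {next(iter(counts)): 1}
    heap = [(c, i, {s: 0}) for i, (s, c) in enumerate(counts.items())]
    heapq.heapify(heap)
    tiebreak = len(heap)
    while len(heap) > 1:
        ca, _, la = heapq.heappop(heap)
        cb, _, lb = heapq.heappop(heap)
        merged = {s: d + 1 for s, d in {**la, **lb}.items()}
        heapq.heappush(heap, (ca + cb, tiebreak, merged))
        tiebreak += 1
    return heap[0][2]

# A slowly varying waveform: DPCM concentrates values near zero, so the
# entropy coder assigns short codes to the common small differences.
ecg = [100, 102, 104, 105, 105, 104, 102, 100, 99, 99, 100, 102]
res = dpcm(ecg)
lengths = huffman_lengths(res)
total_bits = sum(lengths[s] for s in res)
assert total_bits < 8 * len(ecg)   # beats raw 8-bit samples
```

Decoding reverses both stages exactly (Huffman decode, then cumulative sums), so the waveform is reconstructed bit-for-bit, which is essential for diagnostic signals like ECG.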