• Title/Summary/Keyword: Code size

Search Result 1,085, Processing Time 0.023 seconds

Parallel Processing of Pre-conditioned Navier-Stokes Code on the Myrinet and Fast-Ethernet PC Cluster (Myrinet과 Fast-Ethernet PC Cluster에서 예조건화 Navier-Stokes코드의 병렬처리)

  • Lee, G.S.;Kim, M.H.;Choi, J.Y.;Kim, K.S.;Kim, S.L.;Jeung, I.S.
    • Journal of the Korean Society for Aeronautical & Space Sciences
    • /
    • v.30 no.6
    • /
    • pp.21-30
    • /
    • 2002
  • A preconditioned Navier-Stokes code was parallelized by the domain decomposition technique, and the accuracy of the parallelized code was verified through a comparison with the results of a sequential code and with experimental data. Parallel performance of the code was examined on a Myrinet-based PC cluster and a Fast-Ethernet system. The speed-up ratio was examined as the major performance parameter, depending on the number of processors and the network communication topology. In this test, the Myrinet system showed superior parallel performance to the Fast-Ethernet system, as expected. A test of the dependency on problem size also showed that network communication speed is a crucial factor for parallel performance, and that the Myrinet-based PC cluster is a plausible candidate for a high-performance parallel computing system.
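
The domain-decomposition parallelization described above can be illustrated with a minimal halo-exchange loop. The sketch below uses Python with mpi4py rather than the authors' solver; the grid size, iteration count, and three-point update are placeholders standing in for the Navier-Stokes sweep, and the speed-up ratio is obtained by timing the same loop with different numbers of ranks.

```python
# Minimal sketch of 1-D domain decomposition with halo exchange (assumed setup,
# not the paper's code): each rank owns a slice of the grid plus two ghost cells.
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

N = 1200                           # hypothetical global grid size
local_n = N // size                # interior cells owned by this rank
u = np.zeros(local_n + 2)          # +2 ghost (halo) cells
u[1:-1] = rank                     # dummy initial data

left  = rank - 1 if rank > 0 else MPI.PROC_NULL
right = rank + 1 if rank < size - 1 else MPI.PROC_NULL

t0 = MPI.Wtime()
for _ in range(100):               # dummy iteration loop standing in for the solver
    # exchange halo cells with neighbouring subdomains
    comm.Sendrecv(sendbuf=u[1:2],   dest=left,  recvbuf=u[-1:], source=right)
    comm.Sendrecv(sendbuf=u[-2:-1], dest=right, recvbuf=u[0:1], source=left)
    # simple 3-point update standing in for the flow-field sweep
    u[1:-1] = 0.5 * (u[:-2] + u[2:])
elapsed = comm.allreduce(MPI.Wtime() - t0, op=MPI.MAX)

if rank == 0:
    # speed-up ratio = T(1 rank) / T(p ranks), with T(1) measured in a separate run
    print(f"{size} ranks: {elapsed:.4f} s per 100 iterations")
```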

Performance evaluation of hybrid acquisition in CDMA systems (DS/CDMA 시스템에서 하이브리드 동기 획득의 성능 분석)

  • 강법주;강창언
    • The Journal of Korean Institute of Communications and Information Sciences
    • /
    • v.23 no.4
    • /
    • pp.914-925
    • /
    • 1998
  • This paper considers the evaluation of hybrid acquisition performance for the pilot signal in the direct-sequence code-division multiple-access (DS/CDMA) forward link. The hybrid acquisition combines two schemes, parallel and serial acquisition. The mean acquisition time of the proposed scheme is derived considering both the best case (the correct code-phase offsets are included in one subset) and the worst case (the correct code-phase offsets lie at the boundary of two subsets), which arise from the distribution of the correct code-phase offsets over the subsets. Expressions for the detection, false-alarm, and miss probabilities are derived for the case of multiple correct code-phase offsets and a multipath Rayleigh fading channel. Numerical results present the hybrid acquisition performance with respect to design parameters such as the postdetection integration length in the search and verification modes, the subset size, and the number of I/Q noncoherent correlators, and compare the hybrid acquisition with parallel acquisition in terms of the minimum acquisition time under the same hardware complexity.
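
As a rough illustration of the mean-acquisition-time trade-offs discussed above, the following Monte Carlo sketch simulates a plain serial search over q code-phase cells with fixed detection and false-alarm probabilities and a fixed false-alarm penalty. It is a simplified stand-in, not the paper's analytical derivation for the hybrid scheme, and all parameter values are assumptions.

```python
# Monte Carlo sketch of mean acquisition time for an idealised serial search
# (single correct cell, fixed Pd/Pfa, fixed dwell and false-alarm penalty times).
import random

def serial_acq_time(q=256, Pd=0.9, Pfa=0.01, tau=1.0, penalty=100.0, trials=2000):
    total = 0.0
    for _ in range(trials):
        correct = random.randrange(q)       # unknown correct code-phase cell
        t, cell, found = 0.0, 0, False
        while not found:
            t += tau                        # one dwell per tested cell
            if cell == correct:
                found = random.random() < Pd        # detection with probability Pd
            elif random.random() < Pfa:
                t += penalty                # false alarm: verification-mode penalty
            cell = (cell + 1) % q           # advance serially, wrapping around
        total += t
    return total / trials                  # mean acquisition time in units of tau

print(serial_acq_time())
```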

A Development of GUI Full-Energy Absorption Peak Analysis Program for Educational Purpose (전 에너지 흡수 피크 분석용 GUI 기반 교육용 프로그램 개발)

  • Sohn, Jong-Wan;Shin, Myung-Suk;Lee, Hye-Jung;Jung, Kyung-Su;Jeong, Min-Su;Kim, Sang-Nyeon
    • Journal of Radiation Protection and Research
    • /
    • v.34 no.2
    • /
    • pp.69-75
    • /
    • 2009
  • To obtain precise information about the response characteristics of a gamma-ray detector system, we developed a new GUI program for analyzing the full-energy absorption peak, written in Delphi for educational purposes. Using four well-known nonlinear peak-shaping functions, peaks are fitted in this code by the least-squares method. In this paper, we describe in detail how the code searches for the 12 coefficients of these four nonlinear peak-shaping functions. The code was tested on the 661 keV gamma-ray peak of a 1 $\mu$Ci $^{137}$Cs source measured with a 25% relative efficiency HPGe detector of 5.35 cm (D) $\times$ 5.5 cm (L) size.
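
The least-squares peak fitting described above can be sketched as follows. This is not the authors' Delphi code but a minimal Python example assuming a single Gaussian photopeak on a linear background, with hypothetical channel data standing in for the $^{137}$Cs spectrum.

```python
# Minimal sketch: fit a full-energy absorption peak with a Gaussian plus linear
# background using non-linear least squares (channel window and data are hypothetical).
import numpy as np
from scipy.optimize import curve_fit

def peak_model(ch, area, centroid, sigma, a, b):
    """Gaussian photopeak on a linear background."""
    gauss = area / (sigma * np.sqrt(2 * np.pi)) * np.exp(-0.5 * ((ch - centroid) / sigma) ** 2)
    return gauss + a + b * ch

channels = np.arange(640, 690)                      # hypothetical window around the 661 keV peak
counts = peak_model(channels, 5e4, 661.7, 1.2, 50.0, -0.02)
counts = np.random.poisson(counts)                  # simulated counting statistics

p0 = [counts.sum(), channels[np.argmax(counts)], 1.0, counts.min(), 0.0]
popt, pcov = curve_fit(peak_model, channels, counts, p0=p0,
                       sigma=np.sqrt(np.maximum(counts, 1)))
print("centroid = %.2f, FWHM = %.2f channels" % (popt[1], 2.355 * abs(popt[2])))
```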

Application Development and Performance Analysis of Smartphone-based QR Code Interpreter (스마트폰 기반의 QR코드 해석기 성능분석 및 응용개발)

  • Park, Chan-Jung;Hyun, Jung-Suk
    • Journal of the Korea Institute of Information and Communication Engineering
    • /
    • v.13 no.10
    • /
    • pp.2242-2250
    • /
    • 2009
  • Recently, with the advent of the ubiquitous era, the types of services have become diverse. In particular, owing to the rapid development of mobile technology, new functions are being added to mobile phones and new applications are being actively developed. Among these, two-dimensional barcode applications are increasing. Two-dimensional barcodes have mostly been used for the management of past records; however, by combining them with mobile phones, their application areas have expanded to publicity for education, tourism, and festivals. In this paper, we develop a QR code decoder running on smartphones, which connects the on-line and off-line worlds. In addition, we improve our decoder by identifying points for performance enhancement based on TRIZ. We compare our decoder with an open-source decoder in terms of decoder code size and decoding speed, to show that our decoder performs better. Finally, we introduce two QR code applications: u-map and u-pamphlet.
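
A decoding-speed comparison of the kind reported above can be reproduced with publicly available decoders. The sketch below times OpenCV's QRCodeDetector against pyzbar on a single image; "qr_sample.png" is a placeholder file name, and neither decoder is the smartphone decoder developed in the paper.

```python
# Sketch: compare two off-the-shelf QR decoders on one image and time each.
import time
import cv2
from pyzbar.pyzbar import decode as zbar_decode

img = cv2.imread("qr_sample.png")          # placeholder test image containing a QR code

t0 = time.perf_counter()
data, points, _ = cv2.QRCodeDetector().detectAndDecode(img)
t_cv = time.perf_counter() - t0

t0 = time.perf_counter()
results = zbar_decode(img)                 # pyzbar accepts OpenCV (numpy) images
t_zbar = time.perf_counter() - t0

print(f"OpenCV : {data!r} in {t_cv * 1e3:.2f} ms")
print(f"pyzbar : {results[0].data!r} in {t_zbar * 1e3:.2f} ms" if results
      else "pyzbar : no code found")
```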

Novel construction of quasi-cyclic low-density parity-check codes with variable code rates for cloud data storage systems

  • Vairaperumal Bhuvaneshwari;Chandrapragasam Tharini
    • ETRI Journal
    • /
    • v.45 no.3
    • /
    • pp.404-417
    • /
    • 2023
  • This paper proposes a novel method for constructing quasi-cyclic low-density parity-check (QC-LDPC) codes of medium to high code rates that can be applied in cloud data storage systems requiring better error-correction capabilities. The novelty of this method lies in the construction of sparse base matrices with girth greater than 4, which can then be expanded with a lift factor to produce high-code-rate QC-LDPC codes. Investigations revealed that the proposed large-sized, high-rate QC-LDPC codes have low encoding complexity and provide a low bit error rate (BER) of $10^{-10}$ at 3.5 dB $E_b/N_0$, whereas conventional LDPC codes show a BER of $10^{-7}$ at 3 dB $E_b/N_0$. Subsequently, the proposed QC-LDPC code was implemented in a software-defined radio using the NI USRP 2920 hardware platform, achieving a BER of $10^{-6}$ at 4.2 dB $E_b/N_0$. The performance of the proposed codes in terms of encoding and decoding speed and storage overhead was then investigated when applied to cloud data storage (GCP). Our results revealed that the proposed codes required much less time for encoding and decoding (of data files of 10 MB size) and produced less storage overhead than the conventional LDPC and Reed-Solomon codes.
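
The base-matrix expansion step common to QC-LDPC constructions can be sketched as below. The base matrix and lift factor here are illustrative toys, not the paper's girth-optimized construction: each non-negative entry becomes a cyclically shifted Z-by-Z identity block, and each -1 entry an all-zero block.

```python
# Sketch: expand a small base matrix into a QC-LDPC parity-check matrix
# using circulant permutation blocks (base matrix and lift factor are assumed).
import numpy as np

def expand_qc_ldpc(base, Z):
    rows, cols = base.shape
    H = np.zeros((rows * Z, cols * Z), dtype=np.uint8)
    I = np.eye(Z, dtype=np.uint8)
    for i in range(rows):
        for j in range(cols):
            s = base[i, j]
            if s >= 0:
                # identity matrix cyclically shifted right by s columns
                H[i * Z:(i + 1) * Z, j * Z:(j + 1) * Z] = np.roll(I, s, axis=1)
    return H

# hypothetical 2x4 base matrix (rate ~1/2) and lift factor Z = 8
base = np.array([[0,  1, -1, 3],
                 [2, -1,  5, 0]])
H = expand_qc_ldpc(base, Z=8)
print(H.shape)   # (16, 32): code rate ~ (32 - 16) / 32 = 0.5 if H is full rank
```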

Lattice based Microstructure Evolution Model for Monte Carlo Finite Element Analysis of Polycrystalline Materials (격자식 미세구조 성장 모델을 이용한 다결정 박막 소재의 유한 요소 해석)

  • 최재환;김한성;이준기;나경환
    • Transactions of Materials Processing
    • /
    • v.13 no.3
    • /
    • pp.248-252
    • /
    • 2004
  • The mechanical properties of polycrystalline thin films, critical for Micro-Electro-Mechanical Systems (MEMS) components, are known from numerous experimental and simulation studies to exhibit a size effect and scatter at length scales of microns, so consideration of the microstructure is essential to capture these length-scale effects. A lattice-based stochastic model of microstructure evolution is used to simulate the actual microstructure, and a fast and reliable algorithm is described in this paper. The kinetics parameters, which are the key parameters for microstructure evolution based on the nucleation-and-growth mechanism, are extracted from a given micrograph of a polycrystalline material by an inverse method. The method is verified by comparing quantitative measures, namely the number of grains and the grain size distribution, between the actual and simulated microstructures. A finite element mesh is then generated on this lattice-based microstructure by the developed code, and a statistical finite element analysis is carried out for the selected microstructures.
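
A lattice-based nucleation-and-growth model of the kind referred to above can be prototyped in a few lines. The sketch below is a toy cellular model with assumed nucleation and growth probabilities and periodic boundaries, not the kinetics parameters extracted by the paper's inverse method.

```python
# Toy lattice nucleation-and-growth sketch: empty sites nucleate new grains,
# and grains grow by claiming neighbouring empty sites (all rates are assumed).
import numpy as np

rng = np.random.default_rng(0)
N = 200                              # lattice size
nucleation_rate = 5e-4               # new nuclei per empty site per step (assumed)
grid = np.zeros((N, N), dtype=int)   # 0 = untransformed, >0 = grain id
next_id = 1

for step in range(400):
    # nucleation: seed new grains on empty sites
    seeds = (grid == 0) & (rng.random((N, N)) < nucleation_rate)
    n_new = int(seeds.sum())
    grid[seeds] = np.arange(next_id, next_id + n_new)
    next_id += n_new
    # growth: empty sites adopt the grain id of an occupied neighbour
    for shift in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        neighbour = np.roll(grid, shift, axis=(0, 1))
        grow = (grid == 0) & (neighbour > 0) & (rng.random((N, N)) < 0.5)
        grid[grow] = neighbour[grow]

print("number of grains:", len(np.unique(grid[grid > 0])))
```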

The Effects of Pressure, Wind Velocity, and Diameter of Wet Element on the Measurement of Relative Humidity by a Psychrometer (압력, 풍속 및 습구온도계의 크기가 건습구습도계를 이용한 상대습도 측정에 미치는 영향)

  • Chi, D.S.;Kim, S.T.;Park, C.B.
    • Korean Journal of Air-Conditioning and Refrigeration Engineering
    • /
    • v.2 no.2
    • /
    • pp.137-141
    • /
    • 1990
  • When relative humidity is measured with an aspirated psychrometer, three factors affect the measurement: atmospheric pressure, the size of the wet element, and the wind velocity. This paper investigates the effects of these three factors, and a computer code was developed to enhance the accuracy of relative humidity measurement. The results show that the relative humidity decreases by 6%RH as the atmospheric pressure increases from 650 mbar to 1100 mbar. The relative humidity also drops as the size of the wet element increases, although this effect is not significant. Finally, the relative humidity increases with increasing wind velocity. The maximum difference between the psychrometric table in the current KS and the present results is about 2%RH. In conclusion, the three factors mentioned above should be considered to secure an accurate measurement of relative humidity.
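
The pressure dependence reported above follows directly from the psychrometric equation. The sketch below combines the Magnus saturation-pressure formula with an assumed ventilated-psychrometer coefficient A; the paper's own code and coefficient values may differ.

```python
# Sketch of the psychrometric relation: RH from dry/wet-bulb temperatures and pressure.
import math

def e_sat(t_c):
    """Saturation vapour pressure [hPa], Magnus formula."""
    return 6.112 * math.exp(17.62 * t_c / (243.12 + t_c))

def relative_humidity(t_dry, t_wet, p_hpa, A=6.62e-4):
    """RH [%] from dry/wet-bulb temperatures [C] and pressure [hPa]; A is an
    assumed coefficient for a ventilated (aspirated) psychrometer."""
    e = e_sat(t_wet) - A * p_hpa * (t_dry - t_wet)   # psychrometric equation
    return 100.0 * e / e_sat(t_dry)

# the pressure dependence noted in the abstract: same readings at two pressures
print(relative_humidity(25.0, 18.0, 650.0))    # lower pressure -> higher RH
print(relative_humidity(25.0, 18.0, 1100.0))   # higher pressure -> lower RH
```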

The Experimental Evaluation and Verification of a 300kW Small Engine Cogeneration System (300kW급 가스엔진 열병합발전시스템 성능평가 및 실증)

  • Choi, Jae-Joon;Park, Hwa-Choon
    • Proceedings of the SAREK Conference
    • /
    • 2009.06a
    • /
    • pp.332-337
    • /
    • 2009
  • The importance of evaluating and verifying small-size cogeneration systems has been emphasized because there is no KS code for such systems. An evaluation method for a small engine cogeneration system, based on the Japanese standard JIS B-8122 and the international standard ISO-8528, was applied to the system. The evaluation procedures, including the start test, rapid load-up, and rapid load-down, were carried out on the system, and reasonable results were obtained. The electrical and thermal efficiencies were measured and analyzed at various load conditions, and the NOx emission at various load conditions was also measured. Finally, the gas engine cogeneration system was installed at a site for actual use and was operated continuously for more than 6 months under site conditions.
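
The electrical and thermal efficiencies evaluated above reduce to simple energy ratios against the fuel input. The sketch below shows that bookkeeping with placeholder numbers; the power output, recovered heat, fuel flow, and lower heating value are all assumptions, not measurements from the test.

```python
# Back-of-envelope sketch of cogeneration efficiency bookkeeping (placeholder values).
def cogeneration_efficiencies(p_elec_kw, q_heat_kw, fuel_flow_nm3_h, lhv_kwh_nm3=9.6):
    fuel_input_kw = fuel_flow_nm3_h * lhv_kwh_nm3      # fuel energy input (LHV basis)
    eta_e = p_elec_kw / fuel_input_kw                  # electrical efficiency
    eta_t = q_heat_kw / fuel_input_kw                  # heat-recovery (thermal) efficiency
    return eta_e, eta_t, eta_e + eta_t                 # total CHP efficiency

# hypothetical 300 kW-class operating point
print(cogeneration_efficiencies(p_elec_kw=300.0, q_heat_kw=420.0, fuel_flow_nm3_h=90.0))
```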

Performance Evaluation of Different Factors According to ROI Coding Methods in JPEG2000

  • Kim, Ho-Yong;Shim, Jong-Chae;Seo, Yeong-Geon
    • Journal of Digital Contents Society
    • /
    • v.7 no.3
    • /
    • pp.183-191
    • /
    • 2006
  • Currently, in various applications, and in mobile applications in particular, processing a user-centered ROI (Region of Interest) or a specific region of an image is preferred to transmitting and decompressing the full image. We therefore need to study how different factors affect ROI coding methods, so that an application can select an ROI coding method and parameters suitable for its environment. The ROI coding methods used in this study are Maxshift and Implicit, and the parameters are the tile size, image size, code block size, ROI importance, and the number of lowest resolution levels. This study presents experimental results relating these parameters to the two ROI coding methods.
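
The Maxshift method compared in this study can be illustrated conceptually: ROI coefficients are scaled up by s bit-planes so that they are coded before any background coefficient, and the decoder recovers the ROI from magnitude alone. The numpy toy below sketches only that scaling step, not a JPEG2000 codec; the coefficient block and ROI mask are hypothetical.

```python
# Conceptual sketch of the Maxshift ROI scaling step (toy coefficients, no codec).
import numpy as np

coeffs = np.random.randint(1, 256, size=(8, 8))      # stand-in wavelet coefficients (nonzero)
roi_mask = np.zeros_like(coeffs, dtype=bool)
roi_mask[2:6, 2:6] = True                            # hypothetical ROI

# choose s so that 2**s exceeds every background magnitude
s = int(np.ceil(np.log2(coeffs[~roi_mask].max() + 1)))
shifted = np.where(roi_mask, coeffs.astype(np.int64) << s, coeffs)

# decoder side: any coefficient >= 2**s must belong to the ROI
recovered_roi = shifted >= (1 << s)
assert (recovered_roi == roi_mask).all()
print("shift s =", s)
```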

Image Steganography to Hide Unlimited Secret Text Size

  • Almazaydeh, Wa'el Ibrahim A.
    • International Journal of Computer Science & Network Security
    • /
    • v.22 no.4
    • /
    • pp.73-82
    • /
    • 2022
  • This paper presents the hiding of a secret text of unlimited size in an image using three methods: the first is the traditional steganographic method based on concealing the binary value of the text with the least-significant-bit (LSB) method; the second is a new method that hides the data in an image based on an exclusive-OR (XOR) process; and the third is a new method for hiding the binary data of the text in an image (grayscale or RGB) using XOR and Huffman coding. The new methods demonstrate the hiding of a text (data) of unlimited size in an image. The peak signal-to-noise ratio (PSNR) is applied in the research to evaluate the results.
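
The first (LSB) method described above is standard and can be sketched in a few lines; the XOR-based and XOR-plus-Huffman variants are the paper's own contributions and are not reproduced here. The cover image and message below are placeholders.

```python
# Minimal LSB steganography sketch: embed the message bits in the least
# significant bit of each pixel, then read them back out.
import numpy as np

def embed_lsb(cover, text):
    bits = np.unpackbits(np.frombuffer(text.encode("utf-8"), dtype=np.uint8))
    flat = cover.reshape(-1).copy()
    if bits.size > flat.size:
        raise ValueError("cover image too small for this message")
    flat[:bits.size] = (flat[:bits.size] & 0xFE) | bits   # overwrite least significant bit
    return flat.reshape(cover.shape)

def extract_lsb(stego, n_chars):
    bits = stego.reshape(-1)[: n_chars * 8] & 1
    return np.packbits(bits).tobytes().decode("utf-8")

cover = np.random.randint(0, 256, size=(64, 64), dtype=np.uint8)   # toy grayscale cover
stego = embed_lsb(cover, "secret")
print(extract_lsb(stego, len("secret")))
```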