• Title/Summary/Keyword: data processing technique

Search Results: 1,981

Improvement and Verification of the Wear Volume Calculation

  • Kim, Hyung-Kyu;Lee, Young-Ho
    • KSTLE International Journal
    • /
    • v.6 no.1
    • /
    • pp.21-27
    • /
    • 2005
  • A technique for wear volume calculation is improved and verified in this research. Wear profile data measured by a surface roughness tester are used. The present technique uses data flattening, the FFT, and a windowing procedure, as used in general signal processing. The measured average roughness of an unworn surface is used as the baseline of the integration for the volume calculation. The improvements over the previous technique are the data-flattening procedure and the determination of the baseline. It is found that the flattening procedure efficiently manipulates the raw data when their levels are not horizontal, which enables the volume to be calculated reasonably well and readily. Compared with the weight loss method using artificial dents, the present method reveals around 3~10% more volume. This is attributed to the protruded region of the specimen and to inaccuracy and data averaging during the weight loss measurement. From a thorough investigation, it is concluded that the present technique can provide an accurate wear volume.
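As a rough illustration of the flattening-and-baseline steps described above (the FFT/windowing filter is omitted, and all function names, units, and the uniform scar width are illustrative assumptions, not the paper's code):

```python
import numpy as np

def wear_volume(profile, x_step, scar_width, ra_unworn):
    """Toy estimate of wear volume from a 1-D roughness profile.

    profile    : height samples from the roughness tester (um)
    x_step     : sampling interval along the scan (mm)
    scar_width : assumed uniform wear-scar width (mm)
    ra_unworn  : average roughness of the unworn surface, the baseline (um)
    """
    # Flattening: subtract a least-squares line so that tilted raw data
    # become horizontal before integration.
    x = np.arange(len(profile)) * x_step
    slope, intercept = np.polyfit(x, profile, 1)
    flat = profile - (slope * x + intercept)

    # Integrate only the material removed below the -Ra baseline.
    depth = np.clip(-flat - ra_unworn, 0.0, None)  # worn depth (um)
    cross_section = depth.sum() * x_step           # um * mm
    return cross_section * scar_width              # um * mm^2
```

A profile with a dent yields a positive volume, while an unworn (flat) profile integrates to zero because nothing falls below the Ra baseline.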

Genealogy-based Indexing Technique for XML Documents (XML문서를 위한 족보 기반 인덱싱 기법)

  • 이월영;용환승
    • Journal of KIISE:Databases
    • /
    • v.31 no.1
    • /
    • pp.72-81
    • /
    • 2004
  • These days, much of the data on the Internet is represented in XML because of its advantages. In proportion to the increase of XML data, query processing techniques are required that quickly and efficiently support the diverse queries used to search for useful information in XML documents. Up to now, however, research on query processing for XML data has focused on how to process regular path expressions. We have therefore developed a new genealogy-based indexing technique that handles various queries: not only regular path expressions but also simple path expressions, path expressions referencing other elements, and so on. We have also applied this technique to an object-relational model and evaluated its performance for many documents and various query types. The results show improved performance in comparison with other storage techniques.
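To make the idea of path-based XML indexing concrete, here is a minimal standard-library sketch (this is a generic root-to-element path index, not the paper's genealogy scheme or its object-relational storage):

```python
import xml.etree.ElementTree as ET
from collections import defaultdict

def build_path_index(xml_text):
    """Map each root-to-element tag path to the elements reached by it."""
    root = ET.fromstring(xml_text)
    index = defaultdict(list)

    def walk(elem, path):
        path = path + "/" + elem.tag
        index[path].append(elem)
        for child in elem:
            walk(child, path)

    walk(root, "")
    return index

index = build_path_index(
    "<lib><book><title>XML</title></book><title>x</title></lib>")

# Simple path expression /lib/book: an exact lookup.
books = index["/lib/book"]
# Regular path expression //title: a suffix match over indexed paths.
titles = [e for p, es in index.items() if p.endswith("/title") for e in es]
```

With such an index, both query classes mentioned in the abstract reduce to dictionary operations instead of document traversals.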

Performance Comparison of MLE Technique with POF (Pencil of Functions) Method for SEM Parameter Estimation (SEM 파라메타 측정에 대한 MLE 기법과 POF 기법의 성능비교)

  • Kim, Deok-Nyeon
    • The Transactions of the Korea Information Processing Society
    • /
    • v.1 no.4
    • /
    • pp.511-516
    • /
    • 1994
  • Parameter estimation techniques are discussed for the complex-frequency analysis of an electromagnetic scatterer. The paper shows how the Maximum Likelihood estimation technique can be applied for this purpose. Experiments on hypothetical data sets demonstrate that the Maximum Likelihood technique outperforms the Pencil of Functions technique. Although several techniques including MLE have been suggested for parameter estimation, the proposed method has strong advantages in noise-contaminated sample data environments because it uses a system matrix of minimal dimension that is entirely independent of the length of the extracted data set.


Digital Elevation Map Generation using SAR Stereo Technique with Radarsat Images over Seoul Area

  • Ka, Min-Ho;Kim, Man-Jo
    • Korean Journal of Remote Sensing
    • /
    • v.17 no.2
    • /
    • pp.155-164
    • /
    • 2001
  • In this study, we describe a technique for deriving a digital elevation model (DEM) from a synthetic aperture radar (SAR) stereo image pair and apply it to an image pair over "Kwanak-san" in Seoul, Korea. The paper briefly discusses the use of stereo SAR to derive topographic data; describes the overall structure of the stereo SAR processing system; describes the site, the SAR data used for the evaluation, and the source of validation data; presents the results of the stereo SAR processing together with an analysis and evaluation of their accuracy against map data; and finally summarizes the main highlights of the method, with comments and recommendations on its future implementation.

DWT-PCA Combination for Noise Detection in Wireless Sensor Networks (무선 센서 네트워크에서 노이즈 감지를 위한 DWT-PCA 조합)

  • Dang, Thien-Binh;Le, Duc-Tai;Kim, Moonseong;Choo, Hyunseung
    • Proceedings of the Korea Information Processing Society Conference
    • /
    • 2020.11a
    • /
    • pp.144-146
    • /
    • 2020
  • The Discrete Wavelet Transform (DWT) is an effective technique commonly used for detecting noise in the collected data of an individual sensor. In addition, detection accuracy can be significantly improved by exploiting the correlation in the data of neighboring sensors in Wireless Sensor Networks (WSNs). Principal component analysis (PCA) is a powerful technique for analyzing correlation in multivariate data. In this paper, we propose a DWT-PCA combination scheme for noise detection (DWT-PCA-ND). Experimental results on a real dataset show remarkably higher performance of DWT-PCA-ND compared to the conventional PCA scheme in detecting noise, a common anomaly in the collected data of a WSN.
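A minimal numpy sketch of the two ingredients named above: one-level Haar detail coefficients per sensor, and a cross-sensor PCA residual. The abstract does not specify how DWT-PCA-ND combines them, so only the building blocks are shown, with made-up data:

```python
import numpy as np

def haar_detail(signal):
    """One-level Haar DWT: detail coefficients highlight abrupt noise."""
    signal = signal[: len(signal) // 2 * 2]
    return (signal[0::2] - signal[1::2]) / np.sqrt(2.0)

def pca_residual(X, k=1):
    """Distance of each time step from the top-k principal subspace of
    the (time x sensor) matrix X; because neighboring sensors are
    correlated, an isolated spike falls outside that subspace."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    proj = Xc @ Vt[:k].T @ Vt[:k]
    return np.linalg.norm(Xc - proj, axis=1)

# Example: four correlated sensors, one noise spike injected at step 30.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 64)
X = np.stack([np.sin(2 * np.pi * t) + 0.01 * rng.standard_normal(64)
              for _ in range(4)], axis=1)
X[30, 2] += 5.0

noisy_step = int(np.argmax(pca_residual(X)))        # cross-sensor view
noisy_pair = int(np.argmax(np.abs(haar_detail(X[:, 2]))))  # per-sensor view
```

The PCA residual localizes the spike in time, and the Haar detail of sensor 2 flags the corresponding sample pair (each detail coefficient covers two samples).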

An Efficient Processing Method of Top-k(g) Skyline Group Queries for Incomplete Data (불완전 데이터를 위한 효율적 Top-k(g) 스카이라인 그룹 질의 처리 기법)

  • Park, Mi-Ra;Min, Jun-Ki
    • The KIPS Transactions:PartD
    • /
    • v.17D no.1
    • /
    • pp.17-24
    • /
    • 2010
  • Recently, there has been growing interest in skyline queries. Most work on skyline queries assumes that the data contain no null values. However, when data are input through the Web or with other tools, incomplete data with null values arise. As a result, several skyline processing techniques for incomplete data have been proposed. However, the available skyline query techniques for incomplete data do not consider environments where complete and incomplete data coexist, since they deal with incomplete data only. In this paper, we propose a novel skyline group processing technique that evaluates skyline queries in environments where complete and incomplete data coexist. To do this, we introduce the top-k(g) skyline group query, which searches for g skyline groups with respect to the user's dimensional preference. Our experimental study shows the efficiency of the proposed technique.
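A sketch of the dominance test typically used for incomplete data: compare only on dimensions known in both tuples, assuming smaller-is-better. This shows the base skyline over mixed complete/incomplete tuples; the paper's top-k(g) grouping layer is not reproduced here:

```python
def dominates(a, b):
    """a dominates b over the dimensions where both values are known
    (smaller is better); at least one strict improvement is required."""
    common = [(x, y) for x, y in zip(a, b)
              if x is not None and y is not None]
    if not common:
        return False
    return (all(x <= y for x, y in common)
            and any(x < y for x, y in common))

def skyline(points):
    """Tuples not dominated by any other tuple; None marks a null value."""
    return [p for p in points
            if not any(dominates(q, p) for q in points if q is not p)]

points = [(1, 2), (2, 1), (3, 3), (None, 1), (2, None)]
result = skyline(points)
```

Note that with this convention dominance is not transitive, which is exactly why mixed complete/incomplete environments need the special handling the paper proposes.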

GPU-Based ECC Decode Unit for Efficient Massive Data Reception Acceleration

  • Kwon, Jisu;Seok, Moon Gi;Park, Daejin
    • Journal of Information Processing Systems
    • /
    • v.16 no.6
    • /
    • pp.1359-1371
    • /
    • 2020
  • In transmitting and receiving such a large amount of data, reliable data communication is crucial for the normal operation of a device and to prevent abnormal operations caused by errors. Therefore, in this paper, it is assumed that an error correction code (ECC) that can detect and correct errors by itself is used in an environment where massive data are received sequentially. Because an embedded system has limited resources, such as a low-performance processor or a small memory, it requires efficient operation of applications. In this paper, we propose an accelerated ECC-decoding technique using a graphics processing unit (GPU) built into the embedded system when receiving a large amount of data. In the matrix-vector multiplication that forms the Hamming code used as a function of the ECC operation, the matrix is expressed in compressed sparse row (CSR) format, and a sparse matrix-vector product is used. The multiplication is performed in the kernel of the GPU, and the Hamming code computation is also accelerated so that the ECC operation can be performed in parallel. The proposed technique is implemented with CUDA on a GPU-embedded target board, the NVIDIA Jetson TX2, and compared with the execution time on the CPU.
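A CPU-side sketch of the CSR layout and the GF(2) sparse matrix-vector product for a standard Hamming(7,4) parity-check matrix. The paper performs this product inside a CUDA kernel; the helper names below are illustrative:

```python
import numpy as np

def csr_from_dense(H):
    """Compress a dense 0/1 parity-check matrix into CSR arrays:
    indptr delimits each row's slice of the column-index array."""
    indptr, indices = [0], []
    for row in H:
        indices.extend(np.nonzero(row)[0].tolist())
        indptr.append(len(indices))
    return np.array(indptr), np.array(indices)

def syndrome(indptr, indices, r):
    """Sparse matrix-vector product H @ r over GF(2): each syndrome bit
    is the XOR (sum mod 2) of the received bits one CSR row selects."""
    s = np.empty(len(indptr) - 1, dtype=np.uint8)
    for i in range(len(s)):
        s[i] = r[indices[indptr[i]:indptr[i + 1]]].sum() % 2
    return s

# Hamming(7,4): column j holds the binary representation of j+1,
# so a nonzero syndrome directly encodes the 1-based error position.
H = np.array([[1, 0, 1, 0, 1, 0, 1],
              [0, 1, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]], dtype=np.uint8)
indptr, indices = csr_from_dense(H)

r = np.zeros(7, dtype=np.uint8)
r[2] = 1                                      # single-bit error, position 3
s = syndrome(indptr, indices, r)
error_pos = int(s[0] + 2 * s[1] + 4 * s[2])   # -> 3
```

On a GPU, each syndrome row (or each received block) maps naturally to one thread, which is what makes the CSR form attractive for this workload.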

Development of Receiving and Image Processing System of GMS/WEFAX Using PC(II) - Software for Receiving and Image Processing - (PC를 이용한 GMS/WEFAX 수신 및 영상처리 시스템 개발(II) - 수신 및 영상처리 소프트 웨어 -)

  • ;;Yun, Gi-Joon;Park, Jong-Hyun
    • Korean Journal of Remote Sensing
    • /
    • v.9 no.1
    • /
    • pp.37-49
    • /
    • 1993
  • In this research, WEFAX and APT (Automatic Picture Transmission) data receiving and image processing software for the PC/AT, called WADIPS (WEFAX and APT Data Integrated Processing System), has been developed. The main functions of the WADIPS software are as follows: 1) real-time receiving and saving of WEFAX and APT data to hard disk, 2) B/W (black and white) and false color display, 3) image enhancement using histogram stretching and color control, 4) 2-4x zooming, 5) hard copy of data using dithering and patterning, 6) animation, 7) file management, and 8) on-line help. WADIPS can be used by offices or individuals that need real-time meteorological information, and by educational institutions to teach image processing techniques and the general characteristics of meteorological satellites.

Visual Inspection of Surface-Mount Boards Using Image Processing (화상처리를 이용한 표면 실장 기판 외관 검사)

  • 백갑환;김현곤;김기현;유건희
    • Proceedings of the Korean Society of Precision Engineering Conference
    • /
    • 1992.04a
    • /
    • pp.343-348
    • /
    • 1992
  • Using a real-time image processing technique, we have developed an automatic visual inspection system that detects defects of the surface-mounted components on a PCB (missing components, mislocation, mismounts, reverse polarity, etc.) and collects quality control and production management data. An image processing system based on a commercial parallel processor, the TRANSPUTER, by which the image processing time can be largely reduced, was designed. By analyzing the collected data, the proposed inspection system contributes to productivity improvement through the reduction of the defect rate.
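A toy version of the reference-comparison idea behind such inspection systems (illustrative only; the actual system runs a TRANSPUTER-based parallel pipeline with far more elaborate defect checks):

```python
import numpy as np

def detect_defects(reference, board, threshold=30):
    """Flag pixels whose gray-level difference from a known-good
    reference image exceeds a threshold."""
    diff = np.abs(reference.astype(np.int16) - board.astype(np.int16))
    return diff > threshold

reference = np.full((4, 4), 100, dtype=np.uint8)   # golden board image
board = reference.copy()
board[1, 1] = 200          # e.g. a missing component changes brightness
mask = detect_defects(reference, board)
```

The defect mask can then be grouped into regions and classified (missing part, mislocation, reversed polarity), which is where the real analysis work lies.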

Segment Join Technique for Fast XML Query Processing (빠른 XML질의 처리를 위한 세그먼트 조인 기법)

  • ;Moon Bongki;Lee Sukho
    • Journal of KIISE:Databases
    • /
    • v.32 no.3
    • /
    • pp.334-343
    • /
    • 2005
  • Complex queries such as path and twig patterns have been the focus of much research on processing XML data. Structural join algorithms use a form of encoded structural information for the elements in an XML document to facilitate join processing. Recently, structural join algorithms such as TwigStack and TSGeneric- have been developed to process such complex queries, and the processing costs of these algorithms have been shown to be linearly proportional to the sum of the input data sizes. However, the algorithms have a shortcoming in that their processing costs increase with the length of the query. To overcome this shortcoming, we propose the segment join technique, which augments the structural join with structural indexes such as the 1-Index. The SegmentTwig algorithm based on the segment join technique performs joins between pairs of segments, where a segment is a series of query nodes, rather than between pairs of query nodes. Consequently, a query can be processed by reading only one query node per segment. Our experimental study shows that segment join algorithms outperform the structural join methods consistently and considerably for various data sets.
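The containment predicate underlying structural (and hence segment) joins can be sketched with interval-encoded elements. This is the textbook region-numbering idea, not the SegmentTwig algorithm itself, and the element numbers are made up:

```python
def structural_join(ancestors, descendants):
    """Pair each ancestor candidate with the descendant candidates it
    contains. Elements carry (start, end) numbers from a depth-first
    scan of the document, so a contains d iff
    a.start < d.start and d.end < a.end.
    (Naive nested loop for clarity; the published algorithms merge the
    two start-sorted lists in a single pass.)"""
    return [((a_start, a_end), (d_start, d_end))
            for (a_start, a_end) in ancestors
            for (d_start, d_end) in descendants
            if a_start < d_start and d_end < a_end]

# Two <book> elements and three <title> elements, by scan position:
books = [(1, 10), (12, 20)]
titles = [(2, 3), (13, 14), (25, 26)]
pairs = structural_join(books, titles)
```

A segment join then replaces chains of such per-node joins with one join per segment of the query path, which is why its cost grows with the number of segments rather than the query length.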