• Title/Summary/Keyword: Post-Correlation Processing Software


A Study on Development of Commercial PIV Utilizing Multimedia (멀티미디어 대응 상용 PIV의 국산화개발에 관한 연구)

  • 최장운
    • Journal of Advanced Marine Engineering and Technology / v.22 no.5 / pp.652-659 / 1998
  • The present study aims to develop new PIV operating software by optimizing vector tracking identification, including versatile pre-processing and post-processing techniques. The result is an improved version that supports various multimedia input and output formats, compared with previous commercial software developed by other makers. An upgraded identification method, the grey-level cross-correlation coefficient method by direct calculation, is proposed, and the related user-friendly pop-up menus are also presented. Post-processing functions, including turbulence statistics, are introduced together with graphic output functions.

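
The core idea named in the abstract, matching an interrogation window by a directly calculated grey-level cross-correlation coefficient, can be sketched as follows. This is a minimal 1-D illustration of the general PIV technique, not the authors' software; the frame arrays are fabricated grey levels.

```python
# Direct grey-level cross-correlation for PIV (illustrative sketch):
# a window from frame A is compared against every candidate shift in
# frame B; the shift with the peak coefficient is the displacement.

def ncc(a, b):
    """Normalized cross-correlation coefficient of two equal-length windows."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    da = sum((x - ma) ** 2 for x in a) ** 0.5
    db = sum((y - mb) ** 2 for y in b) ** 0.5
    return num / (da * db) if da and db else 0.0

def best_shift(frame_a, frame_b, max_shift):
    """Find the 1-D particle displacement that best matches frame_a in frame_b."""
    scores = {}
    for s in range(-max_shift, max_shift + 1):
        pairs = [(frame_a[i], frame_b[i + s])
                 for i in range(len(frame_a)) if 0 <= i + s < len(frame_b)]
        a, b = zip(*pairs)
        scores[s] = ncc(a, b)
    return max(scores, key=scores.get)

# Frame B is frame A shifted right by 2 pixels (a 2-pixel displacement).
frame_a = [0, 0, 5, 9, 5, 0, 0, 0, 0, 0]
frame_b = [0, 0, 0, 0, 5, 9, 5, 0, 0, 0]
shift = best_shift(frame_a, frame_b, 3)
print(shift)  # → 2
```

A production PIV code does this in 2-D over many interrogation windows, yielding one vector per window.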

The Development of Modularized Post Processing GPS Software Receiving Platform using MATLAB Simulink

  • Kim, Ghang-Ho;So, Hyoung-Min;Jeon, Sang-Hoon;Kee, Chang-Don;Cho, Young-Su;Choi, Wansik
    • International Journal of Aeronautical and Space Sciences / v.9 no.2 / pp.121-128 / 2008
  • A modularized GPS software-defined radio (SDR) offers many advantages for applying and modifying algorithms. A hardware-based GPS receiver uses many hardware parts (such as the RF front-end, correlators, CPU, and other peripherals) to process the tracked signal and navigation data and calculate the user position, while an SDR uses software modules running on a general-purpose CPU platform or an embedded DSP. An SDR does not require hardware changes and is not limited by hardware capability when a new processing algorithm is applied. Its weakness is that software correlation takes considerable processing time; however, the continuing growth in MPU and DSP processing power now makes the SDR competitive with hardware GPS receivers. This paper presents a study on modularizing a GPS software platform and describes the development of a GNSS software platform using MATLAB Simulink™. We focus on a post-processing SDR platform, which is commonly adopted in research. The main functions of the SDR are GPS signal acquisition, signal tracking, navigation-data decoding, and stand-alone user position calculation from stored data, i.e., down-converted and sampled intermediate frequency (IF) data. Each module of the SDR platform is categorized by function so that it can easily be applied to other frequencies and GPS signals. The developed software platform was tested using a stored, down-converted and sampled IF data file, and the test results show that it calculates the user position properly.
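
The acquisition module described above is classically implemented as a circular correlation of the sampled IF data with a local code replica via FFT. The sketch below shows that step in Python rather than Simulink, purely for illustration: the spreading code is a random placeholder, not a real C/A sequence, and carrier wipe-off is omitted.

```python
# FFT-based acquisition sketch for a post-processing GPS SDR:
# circular correlation of received samples with a local code replica.
import numpy as np

rng = np.random.default_rng(0)
code = rng.choice([-1.0, 1.0], size=1023)   # placeholder spreading code
delay = 200                                  # true code phase, in samples
received = np.roll(code, delay) + 0.5 * rng.standard_normal(1023)

# Circular correlation via FFT: IFFT(FFT(received) * conj(FFT(code)))
corr = np.fft.ifft(np.fft.fft(received) * np.conj(np.fft.fft(code)))
estimated_delay = int(np.argmax(np.abs(corr)))
print(estimated_delay)  # peak falls at the code phase → 200
```

In a real receiver this search is repeated over a grid of Doppler bins; the peak's bin and lag seed the tracking loops.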

Multiple cracking analysis of HTPP-ECC by digital image correlation method

  • Felekoglu, Burak;Keskinates, Muhammer
    • Computers and Concrete / v.17 no.6 / pp.831-848 / 2016
  • This study aims to characterize the multiple cracking behavior of HTPP-ECC (high-tenacity polypropylene fiber reinforced engineered cementitious composites) by the Digital Image Correlation (DIC) method. Digital images were captured from a dogbone-shaped HTPP-ECC specimen exhibiting 3.1% tensile ductility under loading. Images were analyzed with VIC-2D software and εxx strain maps were obtained. Crack widths were computed from the εxx strain maps and crack width distributions were determined throughout the specimen. The strain values from real LVDTs were also compared with virtual LVDTs digitally attached to the images. The results confirmed that it is possible to accurately monitor the initiation and propagation of any single crack or multiple cracks by DIC over the whole testing interval. Although the analysis requires some post-processing operations, the DIC-based crack analysis methodology can be used as a promising and versatile tool for quality control of HTPP-ECC and other strain-hardening composites.
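
One common way to turn an εxx strain map into crack widths, as the abstract describes, is to integrate the strain over each localized band along a line across the specimen. The sketch below shows that post-processing idea only; the pixel size, threshold, and strain values are fabricated, and the paper's actual procedure may differ.

```python
# DIC crack-width sketch: along one row of the exx strain map, localized
# strain peaks mark cracks; each crack width is the strain integrated
# over its band, times the physical pixel size.

PIXEL_MM = 0.05    # assumed pixel size in mm (illustrative)
THRESHOLD = 0.01   # strain above this is treated as a crack band

# exx sampled along one row of the strain map (two localized cracks)
exx = [0.001, 0.001, 0.050, 0.080, 0.050, 0.001,
       0.001, 0.001, 0.030, 0.060, 0.001, 0.001]

def crack_widths(strains, threshold, pixel_mm):
    """Integrate strain over each contiguous above-threshold band (mm)."""
    widths, current = [], 0.0
    for e in strains + [0.0]:       # trailing sentinel closes the last band
        if e > threshold:
            current += e * pixel_mm
        elif current > 0.0:
            widths.append(current)
            current = 0.0
    return widths

widths = crack_widths(exx, THRESHOLD, PIXEL_MM)
print(widths)  # one width per detected crack, in mm
```

Repeating this over every row gives the crack-width distribution throughout the specimen.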

A Study on Correlation Processing Method of Multi-Polarization Observation Data by Daejeon Correlator (대전상관기의 다중편파 관측데이터 상관처리 방법에 관한 연구)

  • Oh, Se-Jin;Yeom, Jae-Hwan;Roh, Duk-Gyoo;Jung, Dong-Kyu;Hwang, Ju-Yeon;Oh, Chungsik;Kim, Hyo-Ryoung
    • Journal of the Institute of Convergence Signal Processing / v.19 no.2 / pp.68-76 / 2018
  • In this paper, we describe the correlation processing method for multi-polarization observation data with the Daejeon correlator. VLBI observations include single- or multi-polarization observations depending on the type of target, and polarization observations are performed to study the characteristics of the object; during observations of a celestial object, polarization measurements are also made to determine delay values and the causes of changes in the object. For the Daejeon correlator, polarization correlation processing is proposed using OCTAVIA, a synchronous playback system that outputs the data fed to each antenna unit: its output bit-selection function converts the bits and reorders the data streams, and the input of the Daejeon correlator is configured so that the existing stream numbers match, allowing correlation processing across polarizations. Correlation processing was conducted on test data observed for polarization correlation, and experiments verified that the proposed polarization correlation processing method for the Daejeon correlator is effective.
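
For orientation, full-polarization VLBI correlation forms four cross-products from the dual-polarization streams of each antenna pair. The toy sketch below shows only that generic product structure, not the Daejeon correlator's stream-reordering pipeline; the data are random placeholders.

```python
# Generic dual-polarization correlation sketch: with R and L streams
# from two antennas, the four products RR, RL, LR, LL are formed as
# time-averaged cross-multiplications.
import numpy as np

rng = np.random.default_rng(1)
n = 1024
ant1 = {"R": rng.standard_normal(n), "L": rng.standard_normal(n)}
ant2 = {"R": rng.standard_normal(n), "L": rng.standard_normal(n)}

products = {p1 + p2: float(np.mean(ant1[p1] * ant2[p2]))
            for p1 in "RL" for p2 in "RL"}
print(sorted(products))  # → ['LL', 'LR', 'RL', 'RR']
```

In practice each product is accumulated per spectral channel after the FFT stage, but the pairing of streams is the same.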

Development of Software Correlator for KJJVC (한일공동VLBI상관기를 위한 소프트웨어 상관기의 개발)

  • Yeom, J.H.;Oh, S.J.;Roh, D.G.;Kang, Y.W.;Park, S.Y.;Lee, C.H.;Chung, H.S.
    • Journal of Astronomy and Space Sciences / v.26 no.4 / pp.567-588 / 2009
  • The Korea-Japan Joint VLBI Correlator (KJJVC) is being developed in collaboration between KASI (Korea Astronomy and Space Science Institute) and NAOJ (National Astronomical Observatory of Japan), and is expected to begin normal operation in early 2010. In this study, we developed a software correlator, based on the VCS (VLBI Correlation Subsystem) hardware specification, as the core component of KJJVC. The main specifications of the software correlator, the same as those of the VCS, are an 8 Gbps data rate, 8,192 output channels, and a 262,144-point FFT (Fast Fourier Transform). The functional algorithm and arithmetic registers matching the VCS specification were also adopted. To verify the performance of the developed software correlator, correlation experiments were carried out using spectral-line and continuum sources observed by VERA (VLBI Exploration of Radio Astrometry) of NAOJ, and the results were compared with the output of the Mitaka FX correlator in terms of spectrum shape, phase rate, fringe detection, and so on. The experiments confirmed that the correlation results of the software correlator are the same as those of the Mitaka FX correlator and verified its effectiveness. In the future, we expect the developed software correlator to serve as a software correlator for the KVN (Korean VLBI Network) together with KJJVC, once correlation post-processing is introduced and the user interface is extended, for example with a GUI (graphical user interface).
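
An FX correlator of the kind compared above works in two steps: F, a per-station FFT, then X, a cross-multiplication of the spectra. The sketch below shows that structure at toy scale (a 256-point FFT rather than the 262,144-point production size); the signals are fabricated.

```python
# Minimal FX-correlation sketch: per-station FFT (F step), then
# cross-multiplication of spectra (X step). The lag of the fringe
# peak recovers the inter-station delay.
import numpy as np

rng = np.random.default_rng(2)
fft_len = 256
common = rng.standard_normal(fft_len)                  # shared sky signal
station1 = common + 0.1 * rng.standard_normal(fft_len)
station2 = np.roll(common, 3) + 0.1 * rng.standard_normal(fft_len)  # 3-sample delay

spec1 = np.fft.fft(station1)                 # F step
spec2 = np.fft.fft(station2)
cross = np.conj(spec1) * spec2               # X step: visibility spectrum
lags = np.fft.ifft(cross)                    # lag domain: fringe search
peak_lag = int(np.argmax(np.abs(lags)))
print(peak_lag)  # → 3
```

A real correlator accumulates `cross` over many FFT frames per baseline and applies delay/phase models before the X step; the sign of the recovered lag depends on which spectrum is conjugated.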

Quality Assessment of Beef Using Computer Vision Technology

  • Rahman, Md. Faizur;Iqbal, Abdullah;Hashem, Md. Abul;Adedeji, Akinbode A.
    • Food Science of Animal Resources / v.40 no.6 / pp.896-907 / 2020
  • Imaging, or computer vision (CV), technology has received huge attention worldwide as a rapid and non-destructive technique for measuring the quality attributes of agricultural products, including meat and meat products. This study was conducted to test the ability of CV technology to predict the quality attributes of beef. Images were captured from the longissimus dorsi muscle of beef at 24 h post-mortem. The traits evaluated were color values (L*, a*, b*), pH, drip loss, cooking loss, dry matter, moisture, crude protein, fat, ash, thiobarbituric acid reactive substances (TBARS), peroxide value (POV), free fatty acids (FFA), total coliform count (TCC), total viable count (TVC), and total yeast-mould count (TYMC). Images were analyzed using Matlab software (R2015a). The reference values were determined by physicochemical, proximate, biochemical, and microbiological tests. All determinations were done in triplicate and mean values are reported. Data analysis was carried out using Statgraphics Centurion XVI, and calibration and validation models were fitted using Unscrambler X version 9.7. The highest correlations with the a* value obtained from image analysis were found for a* (r=0.65) and moisture (r=0.56), and the highest calibration and prediction accuracy was found for lightness (r2c=0.73, r2p=0.69). These results show that CV technology may be a useful tool for predicting meat quality traits in the laboratory and in the meat processing industry.
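
The calibration step mentioned above, fitting a model that maps an image-derived feature to a reference measurement, can be sketched with a simple least-squares line. All data points below are fabricated for illustration; the study used multivariate models in Unscrambler, not this toy fit.

```python
# Toy calibration sketch: ordinary least-squares fit mapping an
# image-derived feature (mean a* of the region of interest) to the
# reference instrument value, then prediction for a new sample.

def fit_line(xs, ys):
    """Return (slope, intercept) of the ordinary least-squares line."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    return slope, my - slope * mx

image_a_star = [10.0, 12.0, 14.0, 16.0, 18.0]      # from image analysis
reference_a_star = [11.0, 13.0, 15.0, 17.0, 19.0]  # colorimeter readings

slope, intercept = fit_line(image_a_star, reference_a_star)
predicted = slope * 13.0 + intercept  # predict reference a* for a new image
print(predicted)
```

Validation then compares such predictions against held-out reference values, yielding the r2p figures quoted in the abstract.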

Evaluation of Hippocampal Volume Based on Various Inversion Time in Normal Adults by Manual Tracing and Automated Segmentation Methods

  • Kim, Ju Ho;Choi, Dae Seob;Kim, Seong-hu;Shin, Hwa Seon;Seo, Hyemin;Choi, Ho Cheol;Son, Seungnam;Tae, Woo Suk;Kim, Sam Soo
    • Investigative Magnetic Resonance Imaging / v.19 no.2 / pp.67-75 / 2015
  • Purpose: To investigate the value of image post-processing software (FreeSurfer, IBASPM [individual brain atlases using statistical parametric mapping software]) and inversion time (TI) in volumetric analyses of the hippocampus, and to identify differences in comparison with manual tracing. Materials and Methods: Brain images from 12 normal adults were acquired using magnetization-prepared rapid acquisition gradient echo (MPRAGE) with a slice thickness of 1.3 mm and TIs of 800, 900, 1000, and 1100 ms. Hippocampal volumes were measured using FreeSurfer, IBASPM, and manual tracing. Statistical differences were examined using correlation analyses and spatial agreement measures: percent volume overlap and percent volume difference. Results: FreeSurfer showed the maximum percent volume overlap at TI = 800 ms (77.1 ± 2.9%) and the maximum percent volume difference at TI = 1100 ms (13.1 ± 2.1%). The respective values for IBASPM were TI = 1100 ms (55.3 ± 9.1%) and TI = 800 ms (43.1 ± 10.7%). FreeSurfer presented a higher correlation than IBASPM, but the difference was not statistically significant. Conclusion: FreeSurfer performed better in volumetric determination than IBASPM. Given the subjective nature of manual tracing, automated image acquisition and analysis is accurate and preferable.
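
The two agreement metrics above are conventionally defined as a Dice-style overlap and a normalized volume difference; the abstract does not spell out its formulas, so the sketch below assumes the standard definitions and uses toy voxel sets in place of real segmentations.

```python
# Agreement metrics between an automated segmentation and a manual
# tracing, each represented as a set of voxel coordinates.

def percent_overlap(a, b):
    """Dice-style percent volume overlap: 2*|A∩B| / (|A|+|B|) * 100."""
    return 200.0 * len(a & b) / (len(a) + len(b))

def percent_difference(a, b):
    """Percent volume difference: |V_A - V_B| / mean(V_A, V_B) * 100."""
    return 200.0 * abs(len(a) - len(b)) / (len(a) + len(b))

manual = {(x, y, 0) for x in range(10) for y in range(10)}   # 100 voxels
auto = {(x, y, 0) for x in range(1, 10) for y in range(10)}  # 90 voxels

print(round(percent_overlap(manual, auto), 1))    # → 94.7
print(round(percent_difference(manual, auto), 1)) # → 10.5
```

High overlap with low difference, as FreeSurfer achieved at TI = 800 ms, indicates both correct placement and correct size of the segmented structure.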

Analyzing Machine Learning Techniques for Fault Prediction Using Web Applications

  • Malhotra, Ruchika;Sharma, Anjali
    • Journal of Information Processing Systems / v.14 no.3 / pp.751-770 / 2018
  • Web applications are indispensable in the software industry and continuously evolve, either to meet new criteria and/or to include new functionalities. However, despite quality assurance via testing, the presence of defects hinders straightforward development. Several factors contribute to defects, and minimizing them is often expensive in man-hours, so detecting fault proneness in the early phases of software development is important. A fault prediction model for identifying fault-prone classes in a web application is therefore highly desirable. In this work, we compare 14 machine learning techniques to analyze the relationship between object-oriented metrics and fault prediction in web applications. The study is carried out using various releases of the Apache Click and Apache Rave datasets. Ahead of the predictive analysis, the input basis set for each release is first optimized using the filter-based correlation feature selection (CFS) method. We find that the LCOM3, WMC, NPM, and DAM metrics are the most significant predictors. The statistical analysis of these metrics also conforms well with the CFS evaluation and affirms their role in the defect prediction of web applications. The overall predictive ability of the different fault prediction models is first ranked using the Friedman technique and then statistically compared using Nemenyi post-hoc analysis. The results not only uphold the predictive capability of machine learning models for faulty classes in web applications, but also show that ensemble algorithms are the most appropriate for defect prediction in the Apache datasets. Further, we derive a consensus between the metrics selected by the CFS technique and the statistical analysis of the datasets.
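
The Friedman ranking step mentioned above assigns each model a rank per dataset and averages the ranks. The sketch below shows that computation on fabricated scores (the dataset and model names are placeholders, and ties are not handled); the Nemenyi test then compares these average ranks against a critical difference.

```python
# Friedman-style average ranking of models across datasets:
# per dataset, the best-scoring model gets rank 1, and ranks are
# averaged over datasets. Ties are ignored for simplicity.

def friedman_ranks(scores_by_dataset):
    """Average rank of each model across datasets (rank 1 = best)."""
    models = list(next(iter(scores_by_dataset.values())))
    totals = {m: 0.0 for m in models}
    for scores in scores_by_dataset.values():
        ordered = sorted(models, key=lambda m: -scores[m])
        for rank, m in enumerate(ordered, start=1):
            totals[m] += rank
    n = len(scores_by_dataset)
    return {m: totals[m] / n for m in models}

scores = {  # fabricated AUC values, illustrative names
    "click-release-1": {"bagging": 0.81, "logistic": 0.74, "j48": 0.70},
    "click-release-2": {"bagging": 0.79, "logistic": 0.77, "j48": 0.69},
    "rave-release-1":  {"bagging": 0.83, "logistic": 0.72, "j48": 0.75},
}
ranks = friedman_ranks(scores)
print(ranks)  # lower average rank = better overall model
```

A consistently low average rank for an ensemble method like bagging is the pattern the paper reports for its Apache datasets.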

Cerebral Activation Area Following Oxygen Administration using a 3 Tesla Functional MR Imaging (고 자장 기능적 MR 영상을 이용한 뇌 운동 영역에서 산소 주입에 따른 활성화 영역에 관한 연구)

  • Goo, Eun-Hoe;Kweon, Dae-Cheol
    • Journal of the Ergonomics Society of Korea / v.24 no.4 / pp.47-53 / 2005
  • This study aimed to investigate whether oxygen supply enhances cerebral activation, using a 3 Tesla fMRI system. Five volunteers (right-handed, average age 21.3) were selected as subjects. Oxygen supply equipment providing 30% oxygen at a constant rate of 15 L/min was applied via a face mask. A 3 Tesla fMRI system with the EPI BOLD technique and a three-pulse sequence scanned brain images in true axial planes, and perfusion images of the brain under oxygen inhalation were obtained with a susceptibility-contrast EPI sequence. The complex movement consisted of a finger task in which subjects repeatedly flexed and extended all fingers in unison, without the fingers touching each other. Each task consisted of 96 phases comprising 6 activation and rest blocks. Post-processing was done with the MRDx software program using the cross-correlation method. The results show an improvement in performance and increased activation in several areas under the oxygen condition. These findings demonstrate that, while performing cognitive tasks, oxygen administration increased cerebral activation.
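
Cross-correlation activation mapping, the general method the abstract names, correlates each voxel's time series with the boxcar task reference and marks voxels above a threshold as active. The sketch below illustrates this with fabricated time series, not the MRDx implementation or the study's data.

```python
# Cross-correlation activation mapping sketch: a voxel is "active" if
# its time series correlates with the boxcar task reference above a
# chosen threshold.

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    dx = sum((a - mx) ** 2 for a in x) ** 0.5
    dy = sum((b - my) ** 2 for b in y) ** 0.5
    return num / (dx * dy)

# Boxcar reference: alternating rest (0) and task (1) blocks
reference = [0, 0, 0, 1, 1, 1, 0, 0, 0, 1, 1, 1]
active_voxel = [10, 11, 10, 15, 16, 15, 10, 10, 11, 16, 15, 16]
quiet_voxel = [10, 11, 10, 10, 11, 10, 11, 10, 10, 11, 10, 11]

THRESHOLD = 0.5
for name, series in [("active", active_voxel), ("quiet", quiet_voxel)]:
    r = pearson(series, reference)
    print(name, round(r, 2), r > THRESHOLD)
```

Applying this voxel-by-voxel yields the activation map; comparing maps between oxygen and control conditions reveals condition-dependent activation.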

Evaluation of the correlation between the muscle fat ratio of pork belly and pork shoulder butt using computed tomography scan

  • Sheena Kim;Jeongin Choi;Eun Sol Kim;Gi Beom Keum;Hyunok Doo;Jinok Kwak;Sumin Ryu;Yejin Choi;Sriniwas Pandey;Na Rae Lee;Juyoun Kang;Yujung Lee;Dongjun Kim;Kuk-Hwan Seol;Sun Moon Kang;In-Seon Bae;Soo-Hyun Cho;Hyo Jung Kwon;Samooel Jung;Youngwon Lee;Hyeun Bum Kim
    • Korean Journal of Agricultural Science / v.50 no.4 / pp.809-815 / 2023
  • This study was conducted to determine the correlation between meat quality and the muscle-fat ratio in pork cuts (pork belly and shoulder butt) using the CT (computed tomography) imaging technique. At 24 hours after slaughter, pork belly and shoulder butt were individually prepared from the left half-carcasses of 26 pigs for CT measurement. The images obtained from the CT scans were checked through the picture archiving and communication system (PACS). The volumes of muscle and fat in the pork belly and shoulder butt were estimated from cross-sectional CT images using Vitrea workstation version 7, and further processed with the Vitrea post-processing software to calculate the volumes automatically (Fig. 1). The volumes were measured in milliliters (mL), and a three-dimensional reconstruction of the region under consideration was also generated. Pearson's correlation coefficient was used to evaluate the relationship by region (pork belly, pork shoulder butt), and statistical processing was performed using GraphPad Prism 8. The muscle-fat ratio of pork belly measured by CT was 1 : 0.86, while that of pork shoulder butt was 1 : 0.37. The correlation coefficient between the muscle-fat ratios of pork belly and shoulder butt was 0.5679 (R2 = 0.3295, p < 0.01). CT imaging provided very good estimates of muscle content in the cuts and in the whole carcass.
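
The correlation analysis above is a Pearson's r computed across animals between the fat:muscle ratios of the two cuts. The sketch below shows that calculation on fabricated ratios, not the study's CT-derived data.

```python
# Pearson's r between belly and shoulder-butt fat:muscle ratios,
# one pair of values per animal. R^2 follows as r squared.

def pearson_r(x, y):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x) *
           sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den

belly_fat_ratio = [0.70, 0.80, 0.86, 0.90, 1.00]     # fat per unit muscle
shoulder_fat_ratio = [0.30, 0.33, 0.37, 0.40, 0.45]  # fabricated values

r = pearson_r(belly_fat_ratio, shoulder_fat_ratio)
print(round(r, 3), round(r * r, 3))  # r and R^2
```

With the study's 26 animals, an r of 0.5679 corresponds to the reported R2 of 0.3295 (0.5679² ≈ 0.3225; the small gap reflects rounding in the abstract).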