• Title/Summary/Keyword: Analytical Tool


Parameters Estimation of Clark Model based on Width Function (폭 함수를 기반으로 한 Clark 모형의 매개변수 추정)

  • Park, Sang Hyun; Kim, Joo-Cheol; Jung, Kwansue
    • Journal of Korea Water Resources Association / v.46 no.6 / pp.597-611 / 2013
  • This paper presents a methodology for constructing a time-area curve from the width function and thereby rationally estimating the time of concentration and storage coefficient of the Clark model within the framework of the method of moments. To this end, the time-area curve is built by rescaling the grid-based width function under the assumption of pure translation, and analytical expressions for the two Clark model parameters are then proposed in terms of the method of moments. The moment-based methodology is compared, in terms of the peak discharge, time to peak and efficiency coefficient of the simulated direct runoff hydrographs relative to the observed ones, with (1) the traditional optimization method of the Clark model provided by HEC-1, in which the symmetric time-area curve is used and the difference between observed and simulated hydrographs is minimized, and (2) the same optimization method with the time-area curve replaced by the rescaled width function. The following points are worth emphasizing: (1) the HEC-1 optimization with the rescaled width function yields parameters that reproduce the observed runoff hydrograph well with respect to the peak discharge coordinates and the coefficient of efficiency; (2) for better application of the Clark model, a time-area curve capable of accounting for the irregular drainage structure of a river basin, such as the rescaled width function, is recommended instead of the symmetric time-area curve of HEC-1; (3) the moment-based methodology with the rescaled width function developed in this study also gives satisfactory simulation results in terms of peak discharge coordinates and coefficient of efficiency; in particular, the mean velocities estimated by this method, which characterize the translation effect of the time-area curve, are consistent with the field-survey results at the points of interest; (4) the moment-based methodology is confirmed to be an effective tool for quantitative assessment of the translation and storage effects of a natural river basin; (5) the runoff hydrographs simulated by the moment-based methodology tend to be more right-skewed than the observed ones and have lower peaks, which is attributed to the use of only a single mean velocity in the parameter estimation. Further research is required to incorporate the hydrodynamic heterogeneity between hillslope and channel network into the construction of the time-area curve.
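
A minimal sketch of moment-based estimation of the two Clark parameters from a width-function-based time-area curve. It relies only on the standard result that convolving a translation-only time-area curve with a linear-reservoir kernel adds K to the mean and K² to the variance; the paper's own analytical expressions may differ in detail, and the choice of the curve's base time as the time of concentration is an assumption here.

```python
# Sketch: estimate time of concentration tc and storage coefficient K by moments,
# assuming mean_IUH = mean_TA + K and var_IUH = var_TA + K^2 (linear reservoir).
import numpy as np

def clark_parameters(t, time_area, t_obs, iuh_obs):
    """t, time_area: ordinates of the (width-function-based) time-area curve.
    t_obs, iuh_obs: ordinates of an observed/target IUH. Uniform spacing assumed."""
    def moments(x, w):
        dx = x[1] - x[0]
        w = w / (w.sum() * dx)                 # normalise to unit area
        m1 = (x * w).sum() * dx                # first moment (mean)
        mu2 = ((x - m1) ** 2 * w).sum() * dx   # second central moment (variance)
        return m1, mu2

    m1_ta, mu2_ta = moments(t, time_area)
    m1_q, mu2_q = moments(t_obs, iuh_obs)

    K = np.sqrt(max(mu2_q - mu2_ta, 0.0))      # variance excess attributed to storage
    tc = t[np.nonzero(time_area)[0][-1]]       # base time of the time-area curve
    return tc, K
```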

A Study on Utilizing DEA in Efficiency Evaluation of Social Welfare Agencies (자료포락분석(DEA)을 이용한 사회복지관의 효율성 평가에 관한 연구 : 부산지역 사례를 중심으로)

  • Son, Kwang-Hoon
    • Korean Journal of Social Welfare / v.52 / pp.117-141 / 2003
  • This study identifies the efficiency of social welfare agencies in Busan in terms of input and output factors. For this purpose, the 2001 service reports of the study agencies were gathered. Four different models were used: model 1 compares the input factors (number of social workers and labor cost) against the output factor (total number of program users); model 2 compares the input factors (total number of agency staff and total working expenses) against the same output factor; model 3 compares the input factors (total number of volunteers, agency staff and period of operation) against the same output factor; and model 4 compares the input factors (total number of volunteers, agency staff, period of operation and total working expenses) against the same output factor. Charnes's study (1978) provided an analytical tool for evaluating the efficiency of service outputs of non-profit organizations, and DEA (Data Envelopment Analysis) serves as an analytical framework for evaluating social service outcomes. The findings are as follows: (1) Comparing the four models on the same standard, 35-55% (16-25) of the 45 social welfare agencies are efficient. (2) For all DMUs to reach an efficiency of 1 with respect to the output factor: in model 1, 33 agencies would need to increase the number of social workers ($\Delta$0.8 persons) and 10 agencies the labor cost of social workers ($\Delta$1,189,000 Won); in model 2, 30 agencies would need to increase the total number of agency staff ($\Delta$1.25 persons) and 14 agencies the total working expenses ($\Delta$1,447,000 Won); in model 3, 8 agencies would need to increase the total number of agency staff ($\Delta$2.26 persons), 14 agencies the total number of volunteers ($\Delta$52 persons), and 10 agencies the period of operation ($\Delta$13 months); in model 4, 24 agencies would need to increase the total number of agency staff ($\Delta$1.8 persons), 12 agencies the total working expenses ($\Delta$5,017,000 Won), 12 agencies the total number of volunteers ($\Delta$43.2 persons), and 23 agencies the period of operation ($\Delta$16 months).
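
A minimal sketch of the classic input-oriented CCR (Charnes-Cooper-Rhodes, constant returns to scale) envelopment model that underlies DEA studies of this kind, solved per DMU as a linear program. The input/output data below are placeholders, not the paper's agency data.

```python
# Sketch: input-oriented CCR efficiency score for one DMU via a linear program.
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, o):
    """X: (m inputs x n DMUs), Y: (s outputs x n DMUs), o: index of the evaluated DMU."""
    m, n = X.shape
    s, _ = Y.shape
    # decision vector: [theta, lambda_1, ..., lambda_n]; minimise theta
    c = np.r_[1.0, np.zeros(n)]
    # inputs:  sum_j lambda_j * x_ij - theta * x_io <= 0
    A_in = np.hstack([-X[:, [o]], X])
    # outputs: outputs of the composite unit must reach at least y_ro
    A_out = np.hstack([np.zeros((s, 1)), -Y])
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.r_[np.zeros(m), -Y[:, o]]
    bounds = [(None, None)] + [(0, None)] * n          # theta free, lambdas >= 0
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    return res.x[0] if res.success else float("nan")

# Toy example: three agencies, two inputs (e.g. staff, expenses), one output (users).
X = np.array([[4.0, 6.0, 5.0], [100.0, 150.0, 120.0]])
Y = np.array([[800.0, 900.0, 1000.0]])
scores = [ccr_efficiency(X, Y, o) for o in range(X.shape[1])]
```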


Determination of plasma C16-C24 globotriaosylceramide (Gb3) isoforms by tandem mass spectrometry for diagnosis of Fabry disease (패브리병(Fabry) 진단을 위한 혈장 중 Globotriaosylceramide (Gb3)의 탠덤매스 분석법 개발과 임상 응용)

  • Yoon, Hye-Ran; Cho, Kyung-Hee; Yoo, Han-Wook; Choi, Jin-Ho; Lee, Dong-Hwan; Zhang, Kate; Keutzer, Joan
    • Journal of Genetic Medicine / v.4 no.1 / pp.45-52 / 2007
  • Purpose: A simple, rapid, and highly sensitive analytical method for Gb3 in plasma was developed by electrospray ionization tandem mass spectrometry (ESI-MS/MS) without labor-intensive pre-treatment. Measurement of globotriaosylceramide (Gb3, ceramide trihexoside) in plasma is clinically important for monitoring Fabry disease patients after enzyme replacement therapy. The disease is an X-linked lipid storage disorder that results from a deficiency of the enzyme ${\alpha}$-galactosidase A (${\alpha}$-Gal A). The lack of ${\alpha}$-Gal A causes an intracellular accumulation of glycosphingolipids, mainly Gb3. Methods: Only a simple 50-fold dilution of plasma is necessary for the extraction and isolation of Gb3. Gb3 in diluted plasma was dissolved in dioxane containing C17:0 Gb3 as an internal standard. After centrifugation, the sample was injected directly and analyzed through a guard column in combination with the multiple reaction monitoring mode of ESI-MS/MS. Results: Eight isoforms of Gb3 were completely resolved from the plasma matrix. C16:0 Gb3 was the major component, accounting for 50% of total Gb3 in plasma. A linear relationship for the Gb3 isoforms was found in the range of 0.001-1.0 ${\mu}g$/mL. The limit of detection (S/N=3) was 0.001 ${\mu}g$/mL and the limit of quantification was 0.01 ${\mu}g$/mL for C16:0 Gb3, with acceptable precision and accuracy. The correlation coefficients of the calibration curves for the 8 Gb3 isoforms ranged from 0.9678 to 0.9982. Conclusion: The quantitative method developed here could be useful as a rapid and sensitive first-line screening, monitoring and/or diagnostic tool for Fabry disease.
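
A minimal sketch of the internal-standard calibration step typical of MRM-based quantification like that described above: fit the analyte/internal-standard peak-area ratio against known concentrations, then read unknowns off the linear curve. The numbers are hypothetical, and whether the 50-fold dilution needs correcting depends on how the calibrators are prepared.

```python
# Sketch: linear calibration with a C17:0 Gb3 internal standard (placeholder data).
import numpy as np

conc = np.array([0.001, 0.01, 0.05, 0.1, 0.5, 1.0])       # standards, ug/mL
ratio = np.array([0.002, 0.021, 0.104, 0.212, 1.03, 2.05])  # analyte / IS peak area

slope, intercept = np.polyfit(conc, ratio, 1)   # linear calibration curve
r = np.corrcoef(conc, ratio)[0, 1]              # correlation coefficient

def quantify(sample_ratio, dilution_factor=50):
    """Plasma Gb3 (ug/mL); drop the dilution correction if calibrators are in diluted matrix."""
    return (sample_ratio - intercept) / slope * dilution_factor
```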


AN ORBIT PROPAGATION SOFTWARE FOR MARS ORBITING SPACECRAFT (화성 근접 탐사를 위한 우주선의 궤도전파 소프트웨어)

  • Song, Young-Joo; Park, Eun-Seo; Yoo, Sung-Moon; Park, Sang-Young; Choi, Kyu-Hong; Yoon, Jae-Cheol; Yim, Jo-Ryeong; Kim, Han-Dol; Choi, Jun-Min; Kim, Hak-Jung; Kim, Byung-Kyo
    • Journal of Astronomy and Space Sciences / v.21 no.4 / pp.351-360 / 2004
  • An orbit propagation software for Mars orbiting spacecraft has been developed and verified in preparation for future Korean Mars missions. A dynamic model for Mars orbiting spacecraft has been studied, and Mars-centered coordinate systems are used to express the spacecraft state vectors. Corrections to the Mars-centered coordinate system have been made to account for the effects of Mars precession and nutation. After a spacecraft enters the sphere of influence (SOI) of Mars, it experiences various perturbation effects as it approaches the planet. Every possible perturbation effect is considered during the integration of the spacecraft state vectors. The Mars50c gravity field model and the Mars-GRAM 2001 model are used to compute perturbations due to the Mars gravity field and Mars atmospheric drag, respectively. To compute the exact locations of other planets, JPL's DE405 ephemerides are used. The ephemerides of Phobos and Deimos are computed with an analytical method because their data are not included in DE405. Mars Global Surveyor mapping orbit data are used to verify the performance of the developed propagator. After one Martian day of propagation (12 orbital periods), the results show maximum errors of about ${\pm}5$ meters in every position component (radial, cross-track and along-track) when compared with those from the Astrogator propagator in the Satellite Tool Kit. This result shows the high reliability of the developed software, which can be used to design near-Mars missions for Korea in the future.
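
A minimal sketch of the skeleton of such a propagator: Mars-centered two-body gravity numerically integrated, with a hook where the perturbation accelerations (Mars50c gravity harmonics, Mars-GRAM drag, third bodies) would be added. This is not the authors' implementation; the initial state is illustrative only.

```python
# Sketch: Mars-centered orbit propagation with a placeholder perturbation term.
import numpy as np
from scipy.integrate import solve_ivp

GM_MARS = 4.282837e13          # m^3/s^2, Mars gravitational parameter

def perturbations(t, r, v):
    """Placeholder for gravity-field harmonics, atmospheric drag, third-body terms."""
    return np.zeros(3)

def dynamics(t, y):
    r, v = y[:3], y[3:]
    a = -GM_MARS * r / np.linalg.norm(r) ** 3 + perturbations(t, r, v)
    return np.r_[v, a]

# Roughly MGS-like low circular orbit (values illustrative only).
r0 = np.array([3_800_000.0, 0.0, 0.0])                                # m
v0 = np.array([0.0, np.sqrt(GM_MARS / np.linalg.norm(r0)), 0.0])      # m/s
sol = solve_ivp(dynamics, (0.0, 88_775.0), np.r_[r0, v0],             # one Martian day
                rtol=1e-10, atol=1e-9, dense_output=True)
```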

Ex vivo Morphometric Analysis of Coronary Stent using Micro-Computed Tomography (미세단층촬영기법을 이용한 관상동맥 스텐트의 동물 모델 분석)

  • Bae, In-Ho; Koh, Jeong-Tae; Lim, Kyung-Seob; Park, Dae-Sung; Kim, Jong-Min; Jeong, Myung-Ho
    • Journal of the Korean Society of Radiology / v.6 no.2 / pp.93-98 / 2012
  • Micro-computed tomography (microCT) is an important tool for preclinical vascular imaging, with micron-level resolution. This non-destructive imaging method allows rapid collection of 2D and 3D reconstructions to visualize specimens prior to destructive procedures such as pathological analysis. The aim of this study was to suggest a method for ex vivo, postmortem examination of stented arterial segments with microCT, and to evaluate stents such as bare metal and drug eluting stents with respect to in-stent restenosis (ISR) in a rabbit model. A bare metal stent (BMS) and a drug eluting stent (DES, paclitaxel) were implanted alternately in the left or right iliac arteries of eight New Zealand white rabbits. Four weeks after implantation, the segments of the iliac arteries surrounding the stents were carefully removed and processed for microCT. Prior to microCT analysis, a contrast medium was loaded into the stent lumen. All samples were scanned with an X-ray source operating at 50 kV and 200 ${\mu}A$ using 3D isotropic resolution. The region of interest was traced and measured with the CTAN analytical software. The scanned materials showed distinct attenuation values: approximately 1.2 in the stent area, 0.12~0.17 in the contrast medium, and 0~0.06 outside the stent. Further analyses were performed on this basis. As a result, the differences in length and volume between the expanded stents, which may relate to the injury score in pathological analysis, were not significant. The ISR area of the BMS ($1.52{\pm}0.48mm^2$) was about 1.6 times higher than that of the DES ($0.94{\pm}0.42mm^2$), indicating that paclitaxel inhibits cell proliferation and prevents infiltration of restenosis into the stent lumen. Although not statistically significant, the cross-sectional 2D images showed that the extent of neointima in the mid-region of the stents was relatively greater than in the anterior and posterior regions of the BMS. These results suggest that microCT can be utilized as an accessory tool for pathological analysis.
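
A minimal sketch of how the reported attenuation windows (stent ~1.2, contrast-filled lumen ~0.12-0.17, outer tissue ~0-0.06) could be used to segment a cross-sectional slice and estimate the ISR area. This is a crude thresholding proxy with hypothetical arrays, not the CTAN workflow used in the study.

```python
# Sketch: ISR area = area enclosed by the stent cross-section minus contrast-filled lumen.
import numpy as np

def isr_area(slice_vals, voxel_area_mm2):
    """slice_vals: 2D array of attenuation values for one slice containing the stent."""
    stent = slice_vals > 0.6                              # metallic struts
    contrast = (slice_vals > 0.10) & (slice_vals < 0.20)  # patent, contrast-filled lumen
    # crude proxy for the region enclosed by the struts: their bounding box
    rows, cols = np.nonzero(stent)                        # assumes struts are visible
    enclosed = np.zeros_like(stent)
    enclosed[rows.min():rows.max() + 1, cols.min():cols.max() + 1] = True
    lumen_area = contrast.sum() * voxel_area_mm2
    stent_area = enclosed.sum() * voxel_area_mm2
    return stent_area - lumen_area                        # neointimal (ISR) area, mm^2
```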

Re-Analysis of Clark Model Based on Drainage Structure of Basin (배수구조를 기반으로 한 Clark 모형의 재해석)

  • Park, Sang Hyun; Kim, Joo Cheol; Jeong, Dong Kug; Jung, Kwan Sue
    • KSCE Journal of Civil and Environmental Engineering Research / v.33 no.6 / pp.2255-2265 / 2013
  • This study presents a width function-based Clark model. To this end, a width function rescaled with distinct hillslope and channel velocities is used as the time-area curve and is then routed through linear storage, using an analytical expression for linear storage routing rather than the finite difference scheme of the original Clark model. Three parameters are considered: the storage coefficient, the hillslope velocity and the channel velocity. SCE-UA, a popular global optimization method, is applied to estimate them. The shapes of the resulting IUHs are evaluated in terms of three statistical moments of the hydrologic response function: the mean, the variance and the third moment about the center of the IUH. The correlation coefficients between the three statistical moments simulated in this study and those of the observed hydrographs were 0.995 for the mean, 0.993 for the variance and 0.983 for the third moment about the center of the IUH. The resulting IUHs give satisfactory simulation results in terms of the mean and variance, but the third moment about the center of the IUH tends to be overestimated. The Clark model proposed in this study is superior to one that accounts only for the mean and variance of the IUH with respect to the skewness, peak discharge and peak time of the runoff hydrograph. This confirms that the suggested method is a useful tool for reflecting the heterogeneity of drainage paths and hydrodynamic parameters. The variation of the statistical moments of the IUH is mainly influenced by the storage coefficient, and the effect of the channel velocity is in turn greater than that of the hillslope velocity. Therefore, the storage coefficient and the channel velocity are the crucial factors shaping the IUH and should be considered carefully when applying the Clark model proposed in this study.
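
A minimal sketch of the ingredients named in the abstract: flow-path travel times from separate hillslope and channel velocities, analytical linear-storage routing (convolution with the exponential kernel (1/K)·exp(-t/K)), and the first three central moments of the resulting IUH. The routing step is standard; the rescaling details here are illustrative only.

```python
# Sketch: width-function-based Clark IUH via analytical linear-reservoir routing.
import numpy as np

def travel_times(hill_len, chan_len, v_hill, v_chan):
    """Travel time of each flow path: hillslope length / v_hill + channel length / v_chan."""
    return hill_len / v_hill + chan_len / v_chan

def clark_iuh(t, time_area, K):
    """IUH(t) = (1/K) * integral_0^t TA(tau) * exp(-(t - tau)/K) dtau (uniform step)."""
    dt = t[1] - t[0]
    kernel = np.exp(-t / K) / K
    iuh = np.convolve(time_area, kernel)[: len(t)] * dt
    return iuh / (iuh.sum() * dt)                 # normalise to unit volume

def central_moments(t, u):
    """Mean, variance and third central moment of a unit-area IUH."""
    dt = t[1] - t[0]
    m1 = (t * u).sum() * dt
    mu2 = ((t - m1) ** 2 * u).sum() * dt
    mu3 = ((t - m1) ** 3 * u).sum() * dt
    return m1, mu2, mu3
```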

Optimization of Ingredients for the Preparation of Chinese Quince (Chaenomelis sinensis) Jam by Mixture Design (모과잼 제조시 혼합물 실험계획법에 의한 재료 혼합비율의 최적화)

  • Lee, Eun-Young; Jang, Myung-Sook
    • Journal of the Korean Society of Food Science and Nutrition / v.38 no.7 / pp.935-945 / 2009
  • This study was performed to find the optimum ratio of ingredients in Chinese quince jam. The experiment was designed according to the D-optimal mixture design, which included 14 experimental points with 4 replicates for three independent variables (Chinese quince paste $45{\sim}60%$, pectin $1.5{\sim}4.5%$, sugar $45.5{\sim}63.5%$). A mathematical analytical tool was employed for the optimization of the typical ingredients. The canonical form and trace plot showed the influence of each ingredient in the mixture on the final product. Based on F-tests, sweetness, pH, L, b, ${\Delta}E$, and firmness were described by a linear model, while the spreadmeter value, a, and the sensory characteristics (appearance, color, smell, taste, and overall acceptability) were described by a quadratic model. The optimum formulations obtained by the numerical and graphical methods were similar: Chinese quince paste 54.48%, pectin 2.45%, and sugar 53.07%. The optimized ingredient formulation is expected to improve the use of Chinese quince and contribute to the commercialization of high-quality Chinese quince jam.
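
A minimal sketch of fitting a Scheffé quadratic mixture model, the kind of model a D-optimal mixture design is analysed with, to a response such as overall acceptability. The proportions and scores below are placeholders, not the paper's measurements.

```python
# Sketch: least-squares fit of a Scheffe quadratic mixture model (no intercept).
import numpy as np

def scheffe_quadratic_terms(x1, x2, x3):
    # b1*x1 + b2*x2 + b3*x3 + b12*x1*x2 + b13*x1*x3 + b23*x2*x3
    return np.column_stack([x1, x2, x3, x1 * x2, x1 * x3, x2 * x3])

# pseudo-component proportions of quince paste, pectin, sugar and sensory scores
x = np.array([[0.50, 0.020, 0.480], [0.55, 0.030, 0.420], [0.45, 0.040, 0.510],
              [0.60, 0.020, 0.380], [0.52, 0.030, 0.450], [0.48, 0.040, 0.480],
              [0.57, 0.025, 0.405], [0.46, 0.015, 0.525]])
y = np.array([5.8, 6.4, 5.2, 6.0, 6.6, 5.5, 6.2, 5.0])   # e.g. overall acceptability

X = scheffe_quadratic_terms(x[:, 0], x[:, 1], x[:, 2])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)

def predict(x1, x2, x3):
    """Predicted response for a candidate mixture."""
    return scheffe_quadratic_terms(np.atleast_1d(x1), np.atleast_1d(x2),
                                   np.atleast_1d(x3)) @ coef
```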

A Study on the Plan for Creating a Youth Entrepreneurship Education Environment (청소년 기업가정신 교육 환경 조성을 위한 방안 연구)

  • Kang, Kyoung-Kyoon
    • 대한공업교육학회지 / v.42 no.2 / pp.67-88 / 2017
  • The purpose of this research was to identify the educational needs of experts for revitalizing youth entrepreneurship education and creating effective conditions for such education. The survey subjects were 100 teachers who had participated in entrepreneurship-related professional training for teachers. A total of 100 questionnaires were collected, of which 92 (92.00%) were used for the analysis; eight were excluded because they were not properly answered. As for the survey tool, a total of 8 areas and 30 items were derived from the literature review, and the validity of the contents was examined through expert meetings. The data were analyzed using the SPSS (ver. 20.0) statistical program in terms of the required competency level, the perceived competency level and the educational needs. The averages of the required and perceived competency levels were calculated, the educational needs were calculated using Borich's formula, and the averages were then compared through paired t-tests. The results turned out to be statistically significant (p<.000). The details are as follows: the educational needs in all areas turned out to be very high, with an average of 4.94 points, which indicates that the teachers strongly feel the need for educational strengthening in relation to entrepreneurship. These results show that all the educational conditions, such as the entrepreneurship-related curriculum, teacher professionalism, educational environment, educational support and the perception among school community members, are insufficient in current school settings. For improvement, the educational conditions in the following areas should be strengthened: cooperation from school community members including principals, teacher support such as an exclusive responsibility teacher system, the development of an entrepreneurship curriculum, the securing of teacher professionalism through the implementation of the curriculum, teacher training support for the enhancement of their professionalism, and the provision of an educational environment and facilities. To enhance the perception of parents and society regarding entrepreneurship, it is necessary to establish a precise concept of entrepreneurship and promote it on that basis.
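
A minimal sketch of Borich's needs-assessment formula referenced in the abstract: for each item, the need score is the sum over respondents of (required level minus perceived level) multiplied by the mean required level, divided by the number of respondents. The scores below are placeholders on a 5-point scale.

```python
# Sketch: Borich educational needs score for one survey item.
import numpy as np

def borich_need(required, perceived):
    required = np.asarray(required, dtype=float)
    perceived = np.asarray(perceived, dtype=float)
    return np.sum(required - perceived) * required.mean() / len(required)

# e.g. one item answered by five teachers (placeholder responses)
required_level = [5, 5, 4, 5, 4]
perceived_level = [2, 3, 2, 3, 2]
print(borich_need(required_level, perceived_level))   # higher value = stronger need
```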

Inexpensive Visual Motion Data Glove for Human-Computer Interface Via Hand Gesture Recognition (손 동작 인식을 통한 인간 - 컴퓨터 인터페이스용 저가형 비주얼 모션 데이터 글러브)

  • Han, Young-Mo
    • The KIPS Transactions: Part B / v.16B no.5 / pp.341-346 / 2009
  • The motion data glove is a representative human-computer interaction tool that inputs human hand gestures to computers by measuring their motions. It is essential equipment for new computer technologies including home automation, virtual reality, biometrics and motion capture. To promote its popular use, this paper develops an inexpensive visual-type motion data glove that can be used without any special equipment. The proposed approach has a special feature: it can be produced at low cost because it does not use the high-cost motion-sensing fibers of conventional approaches, which makes easy production and popular use possible. It adopts a visual method, obtained by improving conventional optical motion capture technology, instead of a mechanical method using motion-sensing fibers. Compared to conventional visual methods, the proposed method has the following advantages and original features. Firstly, conventional visual methods use many cameras and much equipment to reconstruct 3D pose while eliminating occlusions, but the proposed method adopts a mono vision approach, which makes simple and low-cost equipment possible. Secondly, conventional mono vision methods have difficulty reconstructing the 3D pose of occluded parts in images because they are weak against occlusions, but the proposed approach can reconstruct occluded parts by using originally designed thin-bar-shaped optic indicators. Thirdly, many conventional methods use nonlinear numerical image analysis algorithms, which are inconvenient with respect to initialization and computation time, but the proposed method avoids these inconveniences by using a closed-form image analysis algorithm obtained from an original formulation. Fourthly, many conventional closed-form algorithms use approximations in their formulation, so they suffer from low accuracy and confined applicability due to singularities, but the proposed method improves on these drawbacks through original formulation techniques in which a closed-form algorithm is derived using exponential-form twist coordinates instead of approximations or local parameterizations such as Euler angles.
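
A minimal sketch of the exponential map on twist coordinates mentioned in the abstract: converting a twist (omega, v) with a unit rotation axis into a 4x4 rigid-body transform via Rodrigues' formula. This is only the standard building block, not the paper's full closed-form image-analysis algorithm.

```python
# Sketch: exponential of a twist (omega, v) scaled by theta, for unit ||omega||.
import numpy as np

def hat(w):
    """Skew-symmetric (cross-product) matrix of a 3-vector."""
    return np.array([[0.0, -w[2], w[1]],
                     [w[2], 0.0, -w[0]],
                     [-w[1], w[0], 0.0]])

def twist_exp(omega, v, theta):
    """4x4 homogeneous transform exp(theta * [xi]) for twist xi = (omega, v)."""
    W = hat(omega)
    R = np.eye(3) + np.sin(theta) * W + (1 - np.cos(theta)) * W @ W          # Rodrigues
    G = np.eye(3) * theta + (1 - np.cos(theta)) * W + (theta - np.sin(theta)) * W @ W
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = G @ np.asarray(v, dtype=float)
    return T
```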

Development of Manual Multi-Leaf Collimator for Proton Therapy in National Cancer Center (국립암센터의 양성자 치료를 위한 수동형 다엽 콜리메이터 개발)

  • Lee, Nuri; Kim, Tae Yoon; Kang, Dong Yun; Choi, Jae Hyock; Jeong, Jong Hwi; Shin, Dongho; Lim, Young Kyung; Park, Jeonghoon; Kim, Tae Hyun; Lee, Se Byeong
    • Progress in Medical Physics / v.26 no.4 / pp.250-257 / 2015
  • Multi-leaf collimator (MLC) systems are frequently used to deliver photon-based radiation and allow conformal shaping of treatment beams. Many proton beam centers currently use aperture and snout systems, which involve a snout to shape and focus the proton beam, a brass aperture to modify the field shape, and an acrylic compensator to modulate depth. However, preparing such treatments requires considerable time and cost, so we developed a manual MLC to solve this problem. This study was carried out with the intent of designing an MLC system as an alternative to an aperture block system. Radio-activation and the dose due to primary proton beam leakage and secondary neutrons were taken into account during the design iterations, and analytical calculations were used to study the effect of the leaf material on activation. We fabricated a tray model for use with a wobbling snout ($30{\times}40cm^2$) system that uses a uniform scanning beam, and designed the manual MLC and tray to reduce the cost and time of treatment. After a leakage test of the new tray, we upgraded the tray with brass and made a safety tool. First, we tested the radio-activation of the conventional brass and the new brass for the manual MLC; they showed similar behavior and decay trends. In addition, we measured the leakage of the gantry with the new tray and the MLC tray while delivering a high-energy beam with a full modulation process onto film dosimetry; the radiation leakage was less than 1%. From these results, we refined the tray design and upgraded it for safety. The radio-activation behavior shows that the proton beam leakage, including secondary particles such as neutrons, is at a safe level. With the new tray design, the time and cost of proton treatment can be reduced. Finally, we performed a clinical test with the original brass aperture and the manual MLC and calculated a gamma index agreement of 99.74% between them.
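
A minimal sketch of the gamma-index comparison behind a pass rate like the reported 99.74%: each measured point passes if some reference point satisfies the combined dose-difference / distance-to-agreement criterion. The 3%/3 mm criterion, global normalisation and dose grids below are assumptions, not the clinical film-analysis pipeline used in the study.

```python
# Sketch: brute-force 2D gamma-index pass rate between two dose distributions.
import numpy as np

def gamma_pass_rate(ref, meas, pixel_mm=1.0, dd=0.03, dta_mm=3.0):
    """ref, meas: 2D dose arrays on the same grid; dd: dose criterion (fraction of max)."""
    ny, nx = ref.shape
    yy, xx = np.mgrid[0:ny, 0:nx].astype(float) * pixel_mm
    norm = ref.max()                              # global normalisation
    passed = 0
    for iy in range(ny):
        for ix in range(nx):
            dist2 = (yy - iy * pixel_mm) ** 2 + (xx - ix * pixel_mm) ** 2
            dose2 = ((ref - meas[iy, ix]) / (dd * norm)) ** 2
            gamma = np.sqrt(dist2 / dta_mm ** 2 + dose2).min()
            passed += gamma <= 1.0
    return passed / (nx * ny)
```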