Title/Summary/Keyword: Quantitative Estimation


Development of the Model for Total Quality Management and Cost of Quality using Activity Based Costing in the Hospital (병원의 활동기준원가를 이용한 총체적 질관리 모형 및 질비용 산출 모형 개발)

  • 조우현;전기홍;이해종;박은철;김병조;김보경;이상규
    • Health Policy and Management, v.11 no.2, pp.141-168, 2001
  • Healthcare service organizations can apply the cost of quality (COQ) model as a method to evaluate a service quality improvement project such as Total Quality Management (TQM). The COQ model has been used to quantify and evaluate the efficiency and effectiveness of a TQM project by comparing the costs and benefits of quality-improvement interventions intended to provide satisfying services to customers, and to identify non-value-added processes. For estimating cost of quality, we used activities and activity costs based on an Activity Based Costing (ABC) system. These procedures let the researchers determine whether each activity in a process adds value and identify processes requiring improvement in the TQM project. Through this series of procedures, health care organizations, as service organizations, can identify problems in their quality improvement programs, solve them, and improve the quality of care for their customers at optimized cost. The study subject was a quality improvement program of the department of diagnostic radiology in a hospital with n beds in a Metropolitan Statistical Area (MSA). The principal source of data for developing the COQ model was all cases of retaken shots for diagnosis in the department during the five-month period from December 1998 to April 1999. As the first procedure, to estimate the activity-based costs of the department of diagnostic radiology, the researchers analyzed the department's total health insurance claims over the one-year period from September 1998 to August 1999 to identify activities and activity costs. The COQ model in this study applied Simpson & Multher's COQ (SM's COQ) model, which divides cost of quality into failure cost, comprising external and internal failure costs, and evaluation/prevention cost. The researchers identified the contents of cost of quality, defined activities and activity costs for each content according to the SM's COQ model, and finally constructed the formula for estimating activity costs related to implementing the service quality improvement program. The results from the formula for estimating cost of quality were as follows: 1. The reasons for retaking shots were broadly classified into technique, appliances, patients, quality management, non-appliances, doctors, and unclassified, and these classifications were allocated to each office performing retakes; with total retakes categorized by reason and office, the researchers identified internal and external failure costs. 2. The researchers developed the cost of quality (COQ) model, identified activities by content for cost of quality, assessed activity driving factors and activity contribution rates, and calculated the total cost for each content of cost of quality, except for activity cost. 3. According to the estimation of cost of quality for retaking shots in the department of diagnostic radiology, the failure cost was ₩35,880 and the evaluation/prevention cost was ₩72,521, twice the failure cost. The proportions of internal and external failure cost within failure cost were similar. Because the study employed a cross-sectional design, it cannot identify trends over time in input costs and quality improvement within cost of quality. Even with this limitation, the results of this study are meaningful. This study shows the possibility of evaluating the value of TQM processes using activities and activity costs from an ABC system, and it enables objective evaluation of a quality improvement program through quantitative comparison of input costs with the marginal benefits of quality improvement.
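As an illustration of the estimation logic described above, the following is a minimal sketch assuming hypothetical activity costs and retake counts (the paper's actual formula and figures are not reproduced here). It allocates retake events by reason to internal or external failure cost and totals the SM-style COQ categories.

```python
# Minimal COQ sketch: hypothetical activity costs and retake counts.
# Categories follow the SM-style split: internal/external failure cost vs.
# evaluation/prevention cost. All numbers are illustrative, not the paper's.

activity_cost = {            # cost per retake event by reason (KRW, hypothetical)
    "technique": 1200, "appliances": 1500, "patients": 900,
    "quality_management": 800, "doctors": 1100,
}
retakes = {"technique": 10, "appliances": 4, "patients": 6,
           "quality_management": 3, "doctors": 2}

# Reasons assumed to surface outside the department count as external failure
# (an illustrative allocation rule, not the study's).
external_reasons = {"patients", "doctors"}

internal_failure = sum(activity_cost[r] * n for r, n in retakes.items()
                       if r not in external_reasons)
external_failure = sum(activity_cost[r] * n for r, n in retakes.items()
                       if r in external_reasons)
eval_prevention = 800 * 30   # e.g., routine QC check activities (hypothetical)

print(f"failure cost: {internal_failure + external_failure} KRW, "
      f"evaluation/prevention cost: {eval_prevention} KRW")
```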


Pressure-load Calibration of Multi-anvil Press at Ambient Temperature through Structural Change in Cold Compressed Amorphous Pyrope (비정질 파이로프의 저온 압축에 따른 구조 변화를 이용한 멀티 앤빌 프레스의 상온 압력-부하 보정)

  • Lhee, Juho;Kim, Yong-Hyun;Lee, A Chim;Kim, Eun Jeong;Lee, Seoyoung;Lee, Sung Keun
    • Korean Journal of Mineralogy and Petrology, v.35 no.1, pp.65-73, 2022
  • Properly estimating the physical and chemical properties of Earth materials and their structures at high-pressure, high-temperature conditions is key to fully understanding diverse geological processes in Earth and planetary interiors. The multi-anvil press, a high-pressure generating device, provides unique information on Earth materials under compression, mainly relevant to Earth's upper mantle. To infer planetary processes, the quantitative relationship between the oil load applied by the press and the actual pressure within the sample needs to be established. Such pressure-load calibration has often been based on phase transitions of crystalline Earth materials with known pressure conditions; however, unlike at high temperature, phase transitions at low (or room) temperature can be sluggish, making calibration at such conditions challenging. In this study, we explored the changes in Al coordination environments of permanently densified pyrope glasses upon cold compression using high-resolution 27Al MAS and 3QMAS NMR. The fractions of highly coordinated Al in the cold-compressed pyrope glasses increase with increasing oil load, and thus with peak pressure. Based on the known relationship between peak pressure and the Al coordination environment in pyrope glasses compressed at room temperature, we established a room-temperature pressure-load calibration of the 14/8 HT assembly in an 1,100-ton multi-anvil press. The current results represent the first pressure-load calibration of any high-pressure device using high-resolution NMR. The irreversible structural densification upon cold compression observed for the pyrope glasses provides insights into the deformation and densification mechanisms of amorphous Earth materials at the low-temperature, high-pressure conditions within subducting slabs.
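The calibration step amounts to fitting a curve through (oil load, inferred peak pressure) points, where each pressure is read off the previously established relation between the fraction of highly coordinated Al and pressure. A minimal sketch of that fit, with entirely hypothetical data points, follows.

```python
import numpy as np

# Hypothetical (oil load [ton], inferred peak pressure [GPa]) pairs; each
# pressure would be read off the known fraction-of-highly-coordinated-Al vs.
# pressure relation for the compressed glass. Values are illustrative only.
load = np.array([50.0, 100.0, 200.0, 300.0, 400.0])
pressure = np.array([2.1, 3.8, 6.5, 8.4, 9.9])

# Quadratic pressure-load calibration: pressure generation typically
# saturates at high load, so a straight line is often inadequate.
coeffs = np.polyfit(load, pressure, deg=2)
calib = np.poly1d(coeffs)

print(f"estimated sample pressure at 250 t: {calib(250.0):.2f} GPa")
```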

An Expert System for the Estimation of the Growth Curve Parameters of New Markets (신규시장 성장모형의 모수 추정을 위한 전문가 시스템)

  • Lee, Dongwon;Jung, Yeojin;Jung, Jaekwon;Park, Dohyung
    • Journal of Intelligence and Information Systems, v.21 no.4, pp.17-35, 2015
  • Demand forecasting is the activity of estimating the quantity of a product or service that consumers will purchase over a certain period of time. Developing precise forecasting models is considered important because corporations can make strategic decisions on new markets based on the future demand estimated by such models. Many studies have developed market growth curve models, such as the Bass, Logistic, and Gompertz models, which estimate future demand when a market is in its early stage. Among them, the Bass model, which explains demand through two types of adopters, innovators and imitators, has been widely used in forecasting. Such models require sufficient demand observations to ensure qualified results. In the beginning of a new market, however, observations are not sufficient for the models to estimate the market's future demand precisely. For this reason, demand inferred from the most adjacent markets is often used as a reference in such cases. Reference markets can be those whose products are developed with the same categorical technologies. A market's demand may be expected to follow a pattern similar to that of a reference market when the adoption pattern of a product in the market is determined mainly by the technology related to the product. However, this process does not always yield satisfactory results, because the similarity between markets depends on intuition and/or experience. There are two major drawbacks that human experts cannot effectively handle in this approach: the abundance of candidate reference markets to consider, and the difficulty of calculating the similarity between markets. First, there can be too many markets to consider when selecting reference markets. Markets in the same category of an industrial hierarchy are the usual candidates because they are typically based on similar technologies; however, markets can fall into different categories even when they are based on the same generic technologies, so markets in other categories also need to be considered as potential candidates. Second, even domain experts cannot consistently calculate the similarity between markets with their own qualitative standards. This inconsistency implies missing adjacent reference markets, which may lead to imprecise estimation of future demand. Even when no reference markets are missing, the new market's parameters can hardly be estimated from the reference markets without quantitative standards. For these reasons, this study proposes a case-based expert system that helps experts overcome these drawbacks in discovering reference markets. First, the study proposes the use of a Euclidean distance measure to calculate the similarity between markets. Based on their similarities, markets are grouped into clusters, and missing markets with the characteristics of each cluster are searched for. Potential candidate reference markets are extracted and recommended to users. After iterating these steps, definite reference markets are determined according to the user's selection among the candidates, and finally the new market's parameters are estimated from the reference markets. Two techniques are used in the model: a clustering data mining technique, and content-based filtering from recommender systems. The proposed system, implemented with these techniques, can determine the most adjacent markets based on whether a user accepts candidate markets. Experiments were conducted to validate the usefulness of the system with five ICT experts. The experts were given a list of 16 ICT markets whose parameters were to be estimated. For each market, the experts first estimated its growth curve parameters by intuition alone, and then with the system. A comparison of the results shows that the estimated parameters are closer to the actual values when the experts used the system than when they guessed without it.
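Once reference markets are chosen, the growth curve parameters still have to be fitted to whatever observations exist. Below is a minimal sketch of estimating the Bass model parameters (coefficient of innovation p, coefficient of imitation q, market potential m) from early cumulative adoption data via nonlinear least squares; the observations, starting values, and bounds are hypothetical.

```python
import numpy as np
from scipy.optimize import curve_fit

def bass_cumulative(t, p, q, m):
    """Closed-form cumulative adopters of the Bass diffusion model."""
    e = np.exp(-(p + q) * t)
    return m * (1.0 - e) / (1.0 + (q / p) * e)

# Hypothetical early-stage observations: cumulative adopters per period.
t = np.arange(1, 9)
adopters = np.array([120, 410, 900, 1700, 2800, 4100, 5400, 6500])

# With so few observations, starting values and bounds matter; in the paper's
# setting these would come from the selected reference markets.
p0 = [0.01, 0.3, 10_000]
(p, q, m), _ = curve_fit(bass_cumulative, t, adopters, p0=p0,
                         bounds=([1e-4, 1e-3, 1e3], [0.5, 1.0, 1e6]))
print(f"p={p:.4f}, q={q:.4f}, m={m:.0f}")
```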

A Study on a Quantitative Method in Estimating Forest Effects for Streamflow Regulation (II) - Mainly Dealing with Application of Coefficient for Slope Roughness - (삼림이수기능(森林理水機能)의 정량적(定量的) 평가방법(平價方法)에 관한 연구(硏究)(II) - 조도계수(粗度係數)의 응용(應用)을 중심(中心)으로 -)

  • Lee, Heon Ho
    • Journal of Korean Society of Forest Science, v.81 no.4, pp.337-345, 1992
  • In this research, a kinematic wave model was applied to runoff analysis, and the streamflow regulation function was estimated by calibrating the roughness coefficient as a parameter. The data analyzed were obtained from the Ananomiya and Shirasaka experimental basins at the Tokyo University Forest in Aichi. The estimation methods and characteristics of the roughness coefficient as an evaluation method for the hydrological function of forests are summarized as follows: 1. The roughness coefficient ($N_s$) indicates the resistance of the hillslope to flowing surface runoff. There is a hypothesis that the resistance of a hillslope to flowing water increases with the growth of the forest and the development of the $A_o$ layer. 2. The roughness coefficient ($N_s$) was estimated as the parameter obtained when direct stream runoff was calculated with the kinematic wave model. 3. The secular change of $N_s$ in Ananomiya follows a curve that has an upper limit and increases exponentially toward it. The curve rose quickly from 1935 to 1945, when the afforestation works for erosion control are thought to have taken effect. In contrast, the slight increase of $N_s$ in Shirasaka indicates that there was no comparably large change in the soil surface layer. 4. The increase of $N_s$ was related to a decrease in direct runoff and an increase in base flow. It was recognized that the rate of direct runoff decreased and the rate of base flow increased with the improvement of the forest physiognomy, though the absolute volume of water runoff per storm decreased over time.
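A minimal sketch of the calibration idea, under simplified assumptions (a single rectangular hillslope plane, a Manning-type relation $q = \sqrt{S}/N_s \, h^{5/3}$, and an explicit upwind scheme), is given below; the rainfall series, slope geometry, and candidate $N_s$ values are hypothetical.

```python
import numpy as np

def kinematic_wave_runoff(rain, Ns, slope=0.2, length=100.0, nx=20, dt=10.0):
    """Direct runoff from one hillslope plane via the kinematic wave
    approximation: dh/dt + dq/dx = r, with q = sqrt(S)/Ns * h**(5/3)."""
    dx = length / nx
    h = np.zeros(nx)                      # flow depth per cell [m]
    alpha = np.sqrt(slope) / Ns
    outflow = []
    for r in rain:                        # rainfall rate this step [m/s]
        q = alpha * h ** (5.0 / 3.0)      # unit-width discharge [m^2/s]
        inflow = np.concatenate(([0.0], q[:-1]))
        h = np.maximum(h + dt * (r + (inflow - q) / dx), 0.0)
        outflow.append(q[-1])
    return np.array(outflow)              # hydrograph at the slope base

def calibrate_Ns(rain, observed, candidates):
    """Grid search: the roughness coefficient minimizing RMSE vs. observation."""
    rmse = [np.sqrt(np.mean((kinematic_wave_runoff(rain, n) - observed) ** 2))
            for n in candidates]
    return candidates[int(np.argmin(rmse))]

# Demo with synthetic data: 1 h of 10 mm/h rain followed by 1 h recession;
# the "observed" hydrograph is generated with Ns = 1.5 and then recovered.
rain = np.concatenate((np.full(360, 10e-3 / 3600.0), np.zeros(360)))
observed = kinematic_wave_runoff(rain, Ns=1.5)
print(calibrate_Ns(rain, observed, np.linspace(0.5, 3.0, 26)))
```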


Linearity Estimation of PET/CT Scanner in List Mode Acquisition (List Mode에서 PET/CT Scanner의 직선성 평가)

  • Choi, Hyun-Jun;Kim, Byung-Jin;Ito, Mikiko;Lee, Hong-Jae;Kim, Jin-Ui;Kim, Hyun-Joo;Lee, Jae-Sung;Lee, Dong-Soo
    • The Korean Journal of Nuclear Medicine Technology, v.16 no.1, pp.86-90, 2012
  • Purpose: Quantification of myocardial blood flow (MBF) using dynamic PET imaging has the potential to assess coronary artery disease. Rb-82 plays a key role in the clinical assessment of myocardial perfusion using PET. However, MBF can be overestimated owing to underestimation of the left ventricular input function at the beginning of the acquisition when the scanner exhibits non-linearity between count rate and activity concentration due to scanner dead time. Therefore, in this study, we evaluated count-rate linearity as a function of activity concentration in PET data acquired in list mode. Materials & methods: A cylindrical phantom (diameter 12 cm, length 10.5 cm) filled with 296 MBq of F-18 solution and 800 mL of water was used to estimate the linearity of the Biograph 40 True Point PET/CT scanner. PET data were acquired in list mode at 10 min per frame for one bed position, at activity concentration levels spanning 7 half-lives. The images were reconstructed with OSEM and FBP algorithms. Prompt, net true, and random counts were measured as functions of activity concentration. Total and background counts were measured by drawing ROIs on the phantom images, and linearity was assessed after background correction. Results: The prompt count rates in list mode increased linearly in proportion to the activity concentration. At low activity concentrations (<30 kBq/mL), the prompt net true and random count rates increased with activity concentration. At high activity concentrations (>30 kBq/mL), the rate of increase of the prompt net trues declined slightly while that of the randoms increased. There was no difference in image intensity linearity between the OSEM and FBP algorithms. Conclusion: The Biograph 40 True Point PET/CT scanner showed good count-rate linearity even at high activity concentrations (~370 kBq/mL). The result indicates that the scanner is suitable for quantitative analysis of cardiac dynamic studies using Rb-82, N-13, O-15, and F-18.
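A minimal sketch of such a linearity check, with synthetic data standing in for the measured frame counts (the paralyzable dead-time model and all constants below are assumptions, not the scanner's actual behavior), might look like this.

```python
import numpy as np

# Decaying F-18 phantom: activity concentration [kBq/mL] per frame over ~7 half-lives.
conc = 370.0 * 0.5 ** np.arange(0.0, 7.0, 0.5)

# Synthetic prompt count rates [kcps]: ideal response k*conc, attenuated by a
# paralyzable dead-time term exp(-k*conc*tau). k and tau are hypothetical.
k, tau = 1.2, 2.0e-4
rate = k * conc * np.exp(-k * conc * tau)

# Fit the response in the region assumed free of dead-time loss (<30 kBq/mL),
# then report the departure from linearity everywhere else.
low = conc < 30.0
slope = np.sum(rate[low] * conc[low]) / np.sum(conc[low] ** 2)  # zero-intercept LS fit
deviation_pct = 100.0 * (rate / (slope * conc) - 1.0)

for c, d in zip(conc, deviation_pct):
    print(f"{c:8.2f} kBq/mL  deviation from linearity: {d:6.2f} %")
```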


Development of Robotic Inspection System over Bridge Superstructure (교량 상판 하부 안전점검 로봇개발)

  • Nam Soon-Sung;Jang Jung-Whan;Yang Kyung-Taek
    • Proceedings of the Korean Institute Of Construction Engineering and Management, autumn, pp.180-185, 2003
  • The growth of traffic over bridges has emerged as one of the most severe problems for bridge maintenance, since the load effects caused by vehicle passage inflict long-term damage on the bridge structure, and it is nearly impossible to keep a bridge operationally serviceable at a level satisfactory to users if maintenance is neglected once construction is complete. Bridge maintenance should therefore include regular inspection to prevent structural malfunction or unexpected accidents, by monitoring cracks and deformations in service. A technical breakthrough in this neglected field of bridge maintenance, one that changes public recognition of it, is urgently needed. This study aims to develop an automated inspection system for the lower surface of bridge superstructures, replacing the conventional approach of inspection with the naked eye in which monitoring staff ride directly on articulated-boom (refractive) or other types of maintenance vehicles. With the proposed system, it is expected that we can essentially remove the variation of inspection results with the subjective judgment of monitoring staff, improve safety during inspection, and contribute to building a database by providing objective and quantitative data through image processing of the camera-captured imagery. The system is also expected to support objective estimation of the right time for maintenance and reinforcement work, leading to a large decrease in maintenance cost.
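The abstract leaves the image processing step unspecified; as one plausible, purely hypothetical realization, a crack-highlighting pass over a captured frame could be sketched with OpenCV as follows (the pipeline and threshold values are illustrative assumptions, not the authors' method).

```python
import cv2
import numpy as np

def crack_mask(image_path: str) -> np.ndarray:
    """Return a binary mask of dark, thin features (crack candidates)."""
    gray = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
    if gray is None:
        raise FileNotFoundError(image_path)
    # Adaptive thresholding copes with uneven lighting under the deck.
    binary = cv2.adaptiveThreshold(gray, 255, cv2.ADAPTIVE_THRESH_GAUSSIAN_C,
                                   cv2.THRESH_BINARY_INV, blockSize=31, C=10)
    # Opening removes speckle; closing reconnects broken crack segments.
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (3, 3))
    mask = cv2.morphologyEx(binary, cv2.MORPH_OPEN, kernel)
    mask = cv2.morphologyEx(mask, cv2.MORPH_CLOSE, kernel, iterations=2)
    return mask

def crack_area_ratio(mask: np.ndarray) -> float:
    """Fraction of pixels flagged as crack: a crude quantitative indicator."""
    return float(np.count_nonzero(mask)) / mask.size
```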


Acoustic Scattering Characteristics of the Individual Fish (어체의 초음파 산란특성에 관한 연구)

  • 신형일
    • Journal of the Korean Society of Fisheries and Ocean Technology, v.27 no.1, pp.21-30, 1991
  • Estimating fish biomass density or fish size with acoustic equipment is an important part of the quantitative assessment of fisheries resources. The precision of such estimates depends on the target strength of the fish and the accuracy to which the acoustic equipment has been calibrated. This paper examines the accuracy of a digital measurement system built as a prototype to measure the target strength of fish; the system was calibrated with an ogive and an ellipsoid made of aluminum and epoxy, respectively. Furthermore, target strength measurements for eight species of fish were made at 25, 50, and 100 kHz. The accuracy of the digital measurement system was assessed by comparing theory with measurements on the ogive and the ellipsoid, and the agreement was reasonable. The target strength to fish length and target strength to fish weight regressions obtained from the measurements provide design guidance for interpreting acoustic measurements of fish abundance for the eight species examined.
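Target strength is conventionally regressed on the logarithm of fish length; a minimal sketch of fitting such a TS-length relation, $TS = m \log_{10} L + b$, with hypothetical measurements is shown below.

```python
import numpy as np

# Hypothetical single-fish measurements at one frequency:
# fork length [cm] and measured target strength [dB].
length = np.array([12.0, 15.0, 18.0, 22.0, 27.0, 33.0, 40.0])
ts = np.array([-52.1, -50.3, -48.6, -46.9, -45.2, -43.0, -41.4])

# Conventional form: TS = m * log10(L) + b. The slope m is often fixed
# near 20 for swimbladdered fish; here both parameters are fitted freely.
m, b = np.polyfit(np.log10(length), ts, deg=1)
print(f"TS = {m:.1f} log10(L) {b:+.1f} dB")

# With the fitted relation, echo-integration outputs can be scaled to
# fish abundance once the expected length distribution is known.
```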


GIS-based Disaster Management System for a Private Insurance Company in Case of Typhoons(I) (지리정보기반의 재해 관리시스템 구축(I) -민간 보험사의 사례, 태풍의 경우-)

  • Chang Eun-Mi
    • Journal of the Korean Geographical Society, v.41 no.1 s.112, pp.106-120, 2006
  • Natural and man-made disasters have been expected to be one of the potential themes that can integrate human geography and physical geography. Typhoons such as Rusa and Maemi caused great losses to insurance companies as well as to the public sector. We implemented a natural disaster management system for a private insurance company to produce better estimates of high-wind hazard and to calculate vulnerability to damage. Climatic gauge sites and the addresses of insured objects under contract were geo-coded, and the pressure values along all typhoon tracks were vectorized into line objects. National GIS topographic maps at a scale of 1:5,000 were updated as base maps, and a digital elevation model with 30 m spacing and land cover maps were used to reflect land roughness in the wind velocity calculation. All data were converted to a grid coverage of $1 km \times 1 km$. The vulnerability curve of Munich Re was adopted, and a preprocessor and postprocessor for the wind velocity model were implemented. Overlaying the locations of contracts on the grid coverage can show the relative risk under a given scenario. The wind velocities calculated by the model were compared with observed values (average $R^2 = 0.68$); calibrating the wind speed model by dropping two climatic gauges enhanced the $R^2$ values. The comparison of calculated losses with the insurance company's actual historical losses showed both underestimation and overestimation. This system enables the company to obtain quantitative data for optimizing its reinsurance ratio, to plan the allocation of enterprise resources, and to upgrade its international creditability. A flood model, a storm surge model, and a flash flood model are being added; finally, a combined disaster vulnerability will be calculated for a total disaster management system.
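A minimal sketch of the validation and loss-estimation steps, assuming hypothetical gauge data and a schematic (not Munich Re's actual) vulnerability curve, could look like this.

```python
import numpy as np

# Hypothetical per-gauge wind speeds [m/s]: model output vs. observation.
modeled = np.array([18.2, 22.5, 30.1, 26.4, 35.0, 20.8])
observed = np.array([17.0, 24.1, 28.5, 27.9, 33.2, 19.5])

# Coefficient of determination for the model-vs-observation comparison.
ss_res = np.sum((observed - modeled) ** 2)
ss_tot = np.sum((observed - observed.mean()) ** 2)
print(f"R^2 = {1.0 - ss_res / ss_tot:.2f}")

def damage_ratio(v, v0=25.0, scale=15.0):
    """Schematic sigmoidal vulnerability curve: mean damage ratio vs. wind
    speed. v0 and scale are illustrative shape parameters only."""
    return 1.0 / (1.0 + np.exp(-(v - v0) / scale))

# Expected loss = sum over grid cells of insured value times the damage
# ratio at the cell's modeled wind speed (all values hypothetical).
insured_value = np.array([1.0e9, 2.5e9, 0.8e9])   # KRW per cell
cell_wind = np.array([24.0, 31.0, 38.0])          # m/s per cell
print(f"expected loss: {np.sum(insured_value * damage_ratio(cell_wind)):,.0f} KRW")
```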

A Study on the Construction Cost Risk through Analyzing the Actual Cost of Public Apartment (공공주택 실적공사비 분석을 통한 공사비 리스크에 관한 연구)

  • Yoon, Woo-Sung;Go, Seong-Seok
    • Korean Journal of Construction Engineering and Management, v.12 no.6, pp.65-78, 2011
  • Construction, a complex and long-term business, requires accurate estimation and verification of construction costs and payment procedures from project planning through the completion of construction. More importantly, it is necessary to investigate and determine the risk factors related to construction costs across the entire process, including design planning, construction drawings, and quantity take-off. Currently, however, the screening and estimation of bidding costs for public apartments do not seem adequate to cope with the risk of construction costs increasing beyond the operational budget in terms of actual costs. Therefore, this study selected and analyzed 40 sites' construction completion account reports from 2004 to 2010, focusing on the adequacy of contract modifications and design planning and on the compilation of the budget at the beginning of each project. The study derived various risk causes and results by analyzing actual costs by year, architectural area, region, construction cost, and sale/lease classification. We identified construction risk arising from annual changes in government policy and the economy, and derived risk items by construction characteristics such as region and architectural area. As a result, we first identified problems with the lowest-price award system in relation to construction costs. The analysis showed that the weight of the cost-increase risk is very high for subcontract and material costs: among direct construction costs, roof and tile work carried high subcontract cost risk, while reinforcing bar and cement carried high material cost risk. Finally, the results of this study can be used to compare the categories of construction costs incurred in specific construction processes with the operational budget established at the beginning of the project, enabling unpredictable cost risks to be grasped and analyzed quantitatively through the range and variation of fluctuations in actual construction costs.
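The kind of variance breakdown described here could be sketched, for illustration, as a grouped overrun analysis on hypothetical completion-account records (the column names and figures below are assumptions, not the study's data).

```python
import pandas as pd

# Hypothetical completion-account records: budgeted vs. actual cost (KRW 1M).
records = pd.DataFrame({
    "year":   [2005, 2005, 2007, 2007, 2009, 2009],
    "region": ["Seoul", "Gwangju", "Seoul", "Busan", "Gwangju", "Busan"],
    "work":   ["roof", "tile", "rebar", "cement", "roof", "rebar"],
    "budget": [420, 310, 550, 280, 400, 610],
    "actual": [455, 350, 600, 275, 452, 668],
})
records["overrun_pct"] = 100.0 * (records["actual"] - records["budget"]) / records["budget"]

# Average overrun by work type and by year: a crude proxy for the
# subcontract/material cost risk items discussed in the study.
print(records.groupby("work")["overrun_pct"].mean().sort_values(ascending=False))
print(records.groupby("year")["overrun_pct"].mean())
```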

Value of Information Technology Outsourcing: An Empirical Analysis of Korean Industries (IT 아웃소싱의 가치에 관한 연구: 한국 산업에 대한 실증분석)

  • Han, Kun-Soo;Lee, Kang-Bae
    • Asia pacific journal of information systems, v.20 no.3, pp.115-137, 2010
  • Information technology (IT) outsourcing, the use of a third-party vendor to provide IT services, started in the late 1980s and early 1990s in Korea and has increased rapidly since 2000. Recently, firms have increased their efforts to capture greater value from IT outsourcing. To date, there have been a large number of studies on IT outsourcing. Most prior studies have focused on outsourcing practices and decisions, and little attention has been paid to objectively measuring the value of IT outsourcing. In addition, studies that examined the performance of IT outsourcing have relied mainly on anecdotal evidence or practitioners' perceptions. Our study examines the contribution of IT outsourcing to economic growth in Korean industries over the 1990 to 2007 period, using a production function framework and a panel data set for 54 industries constructed from input-output tables, fixed-capital formation tables, and employment tables. Based on the framework and estimation procedures that Han, Kauffman and Nault (2010) used to examine the economic impact of IT outsourcing in U.S. industries, we evaluate the impact of IT outsourcing on output and productivity in Korean industries. Because IT outsourcing started to grow at a significantly more rapid pace in 2000, we compare its impact in the pre- and post-2000 periods. Our industry-level panel data cover a large proportion of the Korean economy (54 out of 58 Korean industries), which gives us greater opportunity to assess the impact of IT outsourcing on objective performance measures such as output and productivity. Using IT outsourcing and IT capital as our primary independent variables, we employ an extended Cobb-Douglas production function in which both variables are treated as factor inputs. We also derive and estimate a labor productivity equation to assess the impact of our IT variables on labor productivity. We use data from seven years (1990, 1993, 2000, 2003, 2005, 2006, and 2007) for which both input-output tables and fixed-capital formation tables are available; combining the two sets of tables resulted in 54 industries. IT outsourcing is measured as the value of computer-related services purchased by each industry in a given year. All variables have been converted to 2000 Korean Won using GDP deflators. To calculate labor hours, we use the average work hours for each sector provided by the OECD. To control effectively for the heteroskedasticity and autocorrelation present in our dataset, we use feasible generalized least squares (FGLS) procedures. Because the AR(1) process may be industry-specific (i.e., panel-specific), we consider both common AR(1) and panel-specific AR(1) (PSAR1) processes in our estimations. We also include year dummies to control for year-specific effects common across industries, and sector dummies (as defined in the GDP deflator) to control for time-invariant sector-specific effects. Based on the full sample of 378 observations, we find that a 1% increase in IT outsourcing is associated with a 0.012~0.014% increase in gross output, and a 1% increase in IT capital is associated with a 0.024~0.027% increase in gross output. To compare the contribution of IT outsourcing relative to that of IT capital, we examined the gross marginal product (GMP). The average GMP of IT outsourcing was 6.423, substantially greater than that of IT capital at 2.093. This indicates that, on average, if an industry invests KRW 1 million more in IT outsourcing, it can increase its output by KRW 6.4 million. In terms of the contribution to labor productivity, we find that a 1% increase in IT outsourcing is associated with a 0.009~0.01% increase in labor productivity, while a 1% increase in IT capital is associated with a 0.024~0.025% increase. Overall, our results indicate that IT outsourcing has made positive and economically meaningful contributions to output and productivity in Korean industries over the 1990 to 2007 period. The average GMP of IT outsourcing we report for Korean industries is 1.44 times greater than that reported for U.S. industries in Han et al. (2010). Further, we find that the contribution of IT outsourcing was significantly greater in the 2000~2007 period, during which the growth of IT outsourcing accelerated. Our study provides implications for policymakers and managers. First, our results suggest that Korean industries can capture further benefits by increasing investment in IT outsourcing. Second, our analyses and results provide a basis for managers to assess the impact of investments in IT outsourcing and IT capital in an objective and quantitative manner. Building on our study, future research should examine the impact of IT outsourcing at a more detailed industry level and at the firm level.
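As a simplified sketch of the estimation framework (plain OLS with year and sector dummies standing in for the paper's FGLS with panel-specific AR(1); all data are synthetic and the elasticity-to-GMP conversion is one common convention, not necessarily the paper's exact computation), the log-linear Cobb-Douglas regression could look like this.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic industry-year panel standing in for the study's 378 observations:
# gross output, labor, non-IT capital, IT capital, IT outsourcing spend.
rng = np.random.default_rng(0)
n = 378
df = pd.DataFrame({
    "year": rng.choice([1990, 1993, 2000, 2003, 2005, 2006, 2007], size=n),
    "sector": rng.choice(list("ABCDE"), size=n),
    "labor": rng.lognormal(10, 0.5, n),
    "capital": rng.lognormal(12, 0.6, n),
    "it_capital": rng.lognormal(9, 0.7, n),
    "it_outsourcing": rng.lognormal(7, 0.8, n),
})
df["output"] = (df["labor"] ** 0.6 * df["capital"] ** 0.3
                * df["it_capital"] ** 0.025 * df["it_outsourcing"] ** 0.013
                * rng.lognormal(0, 0.1, n))
for col in ["output", "labor", "capital", "it_capital", "it_outsourcing"]:
    df[f"ln_{col}"] = np.log(df[col])

# Extended Cobb-Douglas in logs; C(year) and C(sector) absorb year-specific
# and time-invariant sector-specific effects.
model = smf.ols("ln_output ~ ln_labor + ln_capital + ln_it_capital"
                " + ln_it_outsourcing + C(year) + C(sector)", data=df).fit()
elasticity = model.params["ln_it_outsourcing"]

# Gross marginal product of IT outsourcing: elasticity times the average
# output-to-input ratio (one common way to convert elasticity to GMP).
gmp = elasticity * (df["output"] / df["it_outsourcing"]).mean()
print(f"elasticity = {elasticity:.4f}, GMP = {gmp:.2f}")
```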