• Title/Summary/Keyword: System Reliability Performance


A Study on the Effect of Product Design and the Primary Factors of Quality Competitiveness (제품 디자인의 파급효과와 품질경쟁력의 결정요인에 관한 연구)

  • Lim, Chae-Suk;Yoon, Jong-Young
    • Archives of design research
    • /
    • v.18 no.4 s.62
    • /
    • pp.95-104
    • /
    • 2005
  • The purpose of this study is to estimate the determinants of product design and analyze the impacts of product design on quality competitiveness, product reliability, and consumer satisfaction in an attempt to provide a foundation for the theory of design management. For this empirical analysis, the study derived the relevant measurement variables from a survey of 400 Korean manufacturing firms conducted during August–October 2003. The empirical findings are summarized as follows. First, the determinants of product design are estimated, very significantly (at p<0.001), to be R&D capability, the level of R&D expenditure, and the level of innovative activities (5S, TQM, Six Sigma, QC, etc.). This result supports Pawar and Driva's (1999) two principles by which the performance of product design and product development can be evaluated simultaneously in the context of concurrent engineering (CE) within new product development (NPD) activities. Second, the hypothesized causal chain product design → quality competitiveness → customer satisfaction → customer loyalty is accepted, again very significantly (at p<0.001). This implies that product design affects consumer satisfaction positively, not directly but indirectly, by influencing quality competitiveness. This result also supports studies such as Flynn et al. (1994), Ahire et al. (1996), and Ahire and Dreyfus (2000), which conclude that design management is a significant determinant of product quality. These empirical results are important in the following sense: the finding that quality competitiveness plays a bridging role between product design and consumer satisfaction can reconcile the traditional debate between the QFD (quality function deployment) approach asserted by product developers and the conjoint analysis maintained by marketers. The first empirical result relates to the QFD approach, whereas the second relates to conjoint analysis. At the same time, the results support the rationale of design integration (DI) in Ettlie (1997), i.e., the coordination of the timing and substance of product development activities performed by the various disciplines and organizational functions across a product's life cycle. Finally, the policy implication at the corporate level is that successful design management (DM) requires not only the support of top management but also the removal of communication barriers (i.e., the adoption of cross-functional teams), so that concurrent engineering, the simultaneous development of product and process designs, can assure product development speed, design quality, and market success.


Managing Duplicate Memberships of Websites : An Approach of Social Network Analysis (웹사이트 중복회원 관리 : 소셜 네트워크 분석 접근)

  • Kang, Eun-Young;Kwahk, Kee-Young
    • Journal of Intelligence and Information Systems
    • /
    • v.17 no.1
    • /
    • pp.153-169
    • /
    • 2011
  • Today, the Internet environment is considered essential for establishing a corporate marketing strategy. Companies promote their products and services through various on-line marketing activities, such as providing gifts and points to customers in exchange for participating in events, and these activities are based on customers' membership data. Since companies can use these membership data to enhance their marketing efforts through various data analyses, appropriate website membership management can play an important role in increasing the effectiveness of on-line marketing campaigns. Despite growing interest in proper membership management, however, it has been difficult to identify inappropriate members who can weaken on-line marketing effectiveness. In an on-line environment, customers tend not to reveal themselves as clearly as in the off-line market. Customers with malicious intent can create duplicate IDs by illegally using others' names or faking login information when joining a membership. Because duplicate members are likely to intercept gifts and points that should go to the customers who deserve them, they can render marketing efforts ineffective. Considering that the number of website members and the related marketing costs are increasing significantly, companies need efficient ways to screen out such duplicate members. With this motivation, this study proposes an approach for managing duplicate memberships based on social network analysis and verifies its effectiveness using membership data gathered from real websites. A social network is a social structure made up of actors, called nodes, which are tied by one or more specific types of interdependency. Social networks represent the relationships between nodes and show the direction and strength of those relationships. Various analytical techniques have been proposed based on these social relationships, such as centrality analysis, structural holes analysis, and structural equivalence analysis. Component analysis, one of the social network analysis techniques, deals with the sub-networks that form meaningful information within the group connections. We propose a method for managing duplicate memberships using component analysis. The procedure is as follows. The first step is to identify the membership attributes that will be used for analyzing relationship patterns among memberships; these include ID, telephone number, address, posting time, IP address, and so on. The second step is to compose social matrices based on the identified membership attributes and aggregate the values of each social matrix into a combined social matrix. The combined social matrix represents how strongly pairs of nodes are connected; when a pair of nodes is strongly connected, those nodes are likely to be duplicate memberships. The combined social matrix is transformed into a binary matrix with cell values of '0' or '1' using a relationship criterion that determines whether a membership is duplicate or not. The third step is to conduct a component analysis on the combined social matrix in order to identify component nodes and isolated nodes. The fourth step is to identify the number of real memberships and calculate the reliability of the website membership based on the component analysis results. The proposed procedure was applied to three real websites operated by a pharmaceutical company. The empirical results showed that the proposed method was superior to the traditional database approach using simple address comparison. In conclusion, this study is expected to shed some light on how social network analysis can enhance reliable on-line marketing performance by efficiently and effectively identifying duplicate website memberships.
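A minimal sketch of the procedure described above, assuming hypothetical member records, attribute weights, and a relationship threshold (none of these values come from the paper): shared attributes are summed into a combined social matrix, binarized by the threshold, and connected components are treated as single real members.

```python
# Sketch only: duplicate-membership detection via component analysis.
# Member data, attribute weights, and the threshold are illustrative assumptions.
import networkx as nx

members = [
    {"id": "m1", "phone": "010-111", "ip": "1.2.3.4", "address": "Seoul A"},
    {"id": "m2", "phone": "010-111", "ip": "1.2.3.4", "address": "Seoul B"},
    {"id": "m3", "phone": "010-222", "ip": "5.6.7.8", "address": "Busan C"},
]
weights = {"phone": 2, "ip": 1, "address": 1}   # assumed attribute weights
threshold = 2                                   # assumed relationship criterion

# Combined social matrix: sum attribute-level ties, then binarize by threshold.
graph = nx.Graph()
graph.add_nodes_from(m["id"] for m in members)
for i, a in enumerate(members):
    for b in members[i + 1:]:
        strength = sum(w for attr, w in weights.items() if a[attr] == b[attr])
        if strength >= threshold:               # a '1' cell in the binary matrix
            graph.add_edge(a["id"], b["id"])

# Component analysis: each connected component counts as one real member.
components = list(nx.connected_components(graph))
real_members = len(components)
reliability = real_members / len(members)        # share of non-duplicate IDs
print(components, real_members, round(reliability, 2))
```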

Development of relative radiometric calibration system for in-situ measurement spectroradiometers (현장관측용 분광 광도계의 상대 검교정 시스템 개발)

  • Oh, Eunsong;Ahn, Ki-Beom;Kang, Hyukmo;Cho, Seong-Ick;Park, Young-Je
    • Korean Journal of Remote Sensing
    • /
    • v.30 no.4
    • /
    • pp.455-464
    • /
    • 2014
  • After the launch of the Geostationary Ocean Color Imager (GOCI) in June 2010, field campaigns were performed routinely around the Korean Peninsula to collect in-situ data for calibration and validation. The key measurements in these campaigns are radiometric measurements made with field radiometers such as the Analytical Spectral Devices FieldSpec3 or the TriOS RAMSES, and these field radiometers must be calibrated regularly. In this paper, we introduce the optical laboratory built at KOSC and the relative calibration method for in-situ measurement spectroradiometers. The laboratory is equipped with a 20-inch integrating sphere (USS-2000S, LabSphere) with 98% uniformity, a reference spectrometer (MCPD9800, Photal) covering wavelengths from 360 nm to 1100 nm with 1.6 nm spectral resolution, and an optical table (3600 × 1500 × 800 mm) with a flatness of ±0.1 mm. With constant temperature and humidity maintained in the room, the reference spectrometer and the in-situ measurement instrument are checked against the same light source at the same distance. From the test of the FieldSpec3, we found a slight difference among in-situ instruments in the blue band range and confirmed that the sensor's spectral performance changed by about 4.41% over one year. These results show that regular calibrations are needed to maintain field measurement accuracy and thus GOCI data reliability.
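A minimal sketch of the kind of relative-calibration comparison the abstract describes, assuming both instruments view the same integrating-sphere source; the wavelength grids, radiances, and correction formula below are illustrative assumptions, not KOSC's processing code.

```python
# Sketch only: relative calibration coefficients from a reference spectrometer
# and a field spectroradiometer viewing the same source at the same distance.
import numpy as np

# Hypothetical measurements; the two instruments use different wavelength grids.
ref_wl = np.arange(360, 1101, 1.6)                # reference spectrometer grid (nm)
ref_rad = 100 + 0.05 * (ref_wl - 360)             # sphere radiance seen by reference
dev_wl = np.arange(350, 1051, 3.0)                # field instrument grid (nm)
dev_out = 95 + 0.048 * (dev_wl - 360)             # field instrument response

# Interpolate the reference onto the field instrument's wavelengths (overlap only)
# and take the ratio as the relative correction to apply to field data.
mask = (dev_wl >= ref_wl.min()) & (dev_wl <= ref_wl.max())
ref_on_dev = np.interp(dev_wl[mask], ref_wl, ref_rad)
coeff = ref_on_dev / dev_out[mask]

# Comparing coefficients from two calibration dates reveals sensor drift,
# e.g. a ~4% change over a year would call for recalibration.
print(coeff[:5])
```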

Survey of Physicochemical Methods and Economic Analysis of Domestic Wastewater Treatment Plant for Advanced Treatment of Phosphorus Removal (총인 수질기준강화를 위한 국내 하수종말처리장의 물리화학적처리 특성조사 및 경제성 분석)

  • Park, Hye-Young;Park, Sang-Min;Lee, Ki-Cheol;Kwon, Oh-Sang;Yu, Soon-Ju;Kim, Shin-Jo
    • Journal of Korean Society of Environmental Engineers
    • /
    • v.33 no.3
    • /
    • pp.212-221
    • /
    • 2011
  • Wastewater treatment plants (WWTPs) are required to meet reinforced discharge standards, differentiated as 0.2, 0.3, and 0.5 mg-TP/L for districts I, II, and III, respectively. Although most WWTPs operate advanced biological phosphorus removal systems, a supplementary phosphorus treatment facility using chemical addition is required at almost all WWTPs. Therefore, water quality data from several exemplary full-scale plants operating a phosphorus treatment process were analyzed to evaluate the reliability of removal performance. Additionally, a series of jar tests was conducted to find the optimal coagulant dose for phosphorus removal by chemical precipitation and to describe the characteristics of the reaction and the sludge production. Chemical costs and the increase in sludge volume in the physicochemical phosphorus removal process were estimated based on the jar test results. The minimum coagulant (aluminium sulfate and poly aluminium chloride) doses needed to keep the TP concentration below 0.5 and 0.2 mg/L were around 25 and 30 mg/L (as Al2O3), respectively, in the mixed liquor of activated sludge. In the tertiary treatment facility, relatively lower coagulant doses, 1/12~1/3 of the minimum doses for activated sludge, were required to achieve the same TP concentrations of 0.2~0.5 mg/L. The increase in suspended solids concentration due to chemical precipitates in the mixed liquor was estimated at 10~11% compared to the concentration without chemical addition. When coagulant was added to the mixed liquor, the chemical (aluminium sulfate) cost was estimated to be 4~10 times higher than in the secondary effluent coagulation/separation process, and the sludge production to be wasted was also 4~10 times higher.
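A small illustrative calculation of how a jar-test dose translates into daily coagulant demand; the plant flow and the mid-range tertiary dose ratio below are assumptions for the arithmetic only, not figures from the study.

```python
# Sketch only: daily coagulant demand from a dose (mg/L as Al2O3) and plant flow.
flow_m3_per_day = 10_000                      # hypothetical plant flow
dose_mixed_liquor = 30.0                      # mg/L as Al2O3 for TP <= 0.2 mg/L (jar test)
dose_tertiary = dose_mixed_liquor / 6.0       # abstract reports 1/12~1/3; 1/6 used here

def daily_coagulant_kg(dose_mg_per_l, flow_m3):
    # mg/L x m3/day = g/day; divide by 1000 to get kg/day
    return dose_mg_per_l * flow_m3 / 1000.0

print(daily_coagulant_kg(dose_mixed_liquor, flow_m3_per_day))  # kg/day, mixed-liquor dosing
print(daily_coagulant_kg(dose_tertiary, flow_m3_per_day))      # kg/day, tertiary dosing
```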

A Study on the Moisturizing Effect and Preparation of Liquid Crystal Structures Using Sucrose Distearate Emulsifier (슈크로오스디스테아레이트를 사용한 액정구조의 생성과 보습효과에 관한 연구)

  • Kwak, Myeong-Heon;Kim, In-Young;Lee, Hwan-Myung;Park, Joo-Hoon
    • Journal of the Korean Applied Science and Technology
    • /
    • v.33 no.1
    • /
    • pp.1-12
    • /
    • 2016
  • This study prepared a liquid crystalline structure using a sucrose distearate (Sucro-DS) emulsifier to create a hydrophilic oil-in-water (O/W) emulsion whose droplets have a multi-lamellar structure. We studied the physicochemical properties of Sucro-DS and its performance in emulsions. A mixture of 3 wt% Sucro-DS, 5 wt% glycerin, 5 wt% squalane, 5 wt% capric/caprylic triglyceride, 3 wt% cetostearyl alcohol, 1 wt% glyceryl monostearate, and 78 wt% purified water was found to form a stable multi-layer lamellar structure. Using this system, we describe how to create a cream that encapsulates unstable active materials. A moisturizing cream was also prepared with this technique, and its skin-improvement effect was evaluated in human clinical trials. The pH range that produced a stable liquid crystal phase with Sucro-DS was 5.2~7.5. The stability of the liquid crystal increased when 3 wt% behenyl alcohol was added, at which point the hardness was 13 kg/mm and the viscosity was 25,000 mPa·s. Testing the emulsions showed that a Sucro-DS concentration of 6 wt% was appropriate, with a liquid crystal particle size of 4~6 mm observed by microscope analysis, and the liquid crystal remained stable over 3 months at 4°C, 25°C, and 45°C. In the clinical trial, the skin moisture level before application was 13.4±7%; with a cream in which the liquid crystal was not formed it was 14.5±5%, an improvement of about 8.2%, whereas with the liquid crystal cream it was 19.2±7%, an improvement of 43.3% over the baseline. In terms of applications, the Sucro-DS emulsifier can be used to develop liquid crystal creams, lotions, eye creams, and a variety of other formulations; it is expected to find wide use not only in the cosmetics industry but also in skin-emulsion technology for external skin preparations in the pharmaceutical industry.

Study on Operating Strategy for Recreation Forests through Comparing the Level of User Satisfaction according to Clusters (군집별 만족도 비교를 통한 자연휴양림의 효율적 운영 방안 연구)

  • Gang, Kee-Rae;Lee, Kee-Cheol
    • Journal of the Korean Institute of Landscape Architecture
    • /
    • v.38 no.1
    • /
    • pp.39-48
    • /
    • 2010
  • Recreation forests are in the spotlight as places for personality development, mental and physical comfort, companionship, and environmental education in forests and valleys. Visitors to recreation forests have been increasing along with the boom in recreation forest building since 1988, and recreation forests are categorized according to features such as regional and environmental conditions. Recreation forests, however, have not met the expectations of some visitors who want to rest in calm surroundings, because the five-day work week and the growing interest in rest, leisure, and well-being concentrate users during weekends, summer, and the tourist season. In order to improve visitor satisfaction efficiently, this study surveyed the level of satisfaction in each cluster, based on a preceding study that had classified 85 national or public recreation forests in Korea into clusters. Questionnaires were distributed appropriately to each cluster and, of the 1,132 questionnaires collected, 1,015 were valid and used for analysis. The reliability of the questionnaires and the statistical validity of the model were verified. As a result, there are meaningful differences in the ranking of the independent variables that affect the level of satisfaction according to cluster. Variables related to rest and fatigue recovery have the strongest influence on satisfaction in the potential factor, internal activation factor, and mixed potential capacity factor clusters. In the use performance and visiting condition factor cluster, the appropriateness of the visit cost is most influential, and in the education cluster, connectivity with nearby tourist attractions is most influential. These results can set priorities for the services and maintenance of recreation forests to improve satisfaction and can differentiate the distribution of resources according to cluster.

The Impact of Service Level Management(SLM) Process Maturity on Information Systems Success in Total Outsourcing: An Analytical Case Study (토털 아웃소싱 환경 하에서 IT서비스 수준관리(Service Level Management) 프로세스 성숙도가 정보시스템 성공에 미치는 영향에 관한 분석적 사례연구)

  • Cho, Geun Su;An, Joon Mo;Min, Hyoung Jin
    • Asia pacific journal of information systems
    • /
    • v.23 no.2
    • /
    • pp.21-39
    • /
    • 2013
  • As the utilization of information technology and the turbulence of technological change increase in organizations, the adoption of IT outsourcing also grows as a way to manage IT resources more effectively and efficiently. Under this way of managing IT, the service level management (SLM) process becomes critical to deriving success from outsourcing from the viewpoint of end users in the organization. Even though much research on service level management and agreements has been done during the last decades, the performance of the SLM process has not been evaluated in terms of the final objectives of the management effort, that is, success from the view of end users. This study explores the relationship between SLM maturity and IT outsourcing success from the users' point of view through an analytical case study of four client organizations served by one IT outsourcing vendor, a member company of a major Korean conglomerate. To set up a model for the analysis, previous research on service level management process maturity and information systems success is reviewed. In particular, information systems success from the users' point of view is reviewed based on DeLone and McLean's study, which is currently accepted as a comprehensively tested model of information systems success. The model proposed in this study argues that SLM process maturity influences information systems success, which is evaluated in terms of the information quality, system quality, service quality, and net effect proposed by DeLone and McLean. SLM process maturity is measured for the planning process, the implementation process, and the operation and evaluation process. Instruments for measuring the factors in the proposed constructs of information systems success and SLM process maturity were collected from previous research and evaluated for reliability and validity using appropriate statistical methods and pilot tests before the case study. Four cases from four different companies under one vendor company were used for the analysis. All of the cases had SLA (Service Level Agreement) contracts and had implemented ITIL (IT Infrastructure Library), Six Sigma, and BSC (Balanced Scorecard) methods for the last several years, which means that all the client organizations had made concerted efforts to acquire quality services from IT outsourcing from the organizational and users' points of view. To compare the differences among the four organizations in IT outsourcing success, t-tests and non-parametric analyses were applied to the data set collected from the organizations using survey instruments. The process maturities of the planning and implementation phases of SLM were found not to influence any dimension of information systems success from the users' point of view, whereas SLM maturity in the operation and evaluation phase was found to influence only system quality. This result seems to run against common arguments in IT outsourcing practice, which usually emphasize the importance of the planning and implementation processes upfront in IT outsourcing projects.
According to an after-the-fact observation by an expert in one of the participating organizations, the clients' needs and motivations for the outsourcing contracts had already been quite familiar to the vendor, a long-term partner under the same conglomerate, so maturity in the planning and implementation phases does not seem to be a differentiating factor for the success of IT outsourcing. This study lays a foundation for future research in the area of IT outsourcing management and success, in particular in service level management. It can also guide managers in IT outsourcing practice to focus on the service level management process in the operation and evaluation stage, especially for long-term outsourcing contracts in a context as distinctive as Korean IT outsourcing projects. This study has some limitations in generalization because the sample size is small and the context itself is confined to a unique environment. For future exploration, survey-based research could be designed and implemented.
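A minimal sketch of the group comparison the case analysis describes (a t-test plus a non-parametric test on user-rated success scores); the survey scores below are made up for illustration and are not the study's data.

```python
# Sketch only: comparing two client organizations' user-rated system quality.
from scipy.stats import ttest_ind, mannwhitneyu

org_a = [5.1, 4.8, 5.3, 4.9, 5.0, 5.2]   # assumed 7-point survey scores
org_b = [4.2, 4.5, 4.1, 4.6, 4.3, 4.4]

t_stat, t_p = ttest_ind(org_a, org_b, equal_var=False)            # Welch t-test
u_stat, u_p = mannwhitneyu(org_a, org_b, alternative="two-sided")  # non-parametric
print(f"t = {t_stat:.2f}, p = {t_p:.3f}; U = {u_stat:.1f}, p = {u_p:.3f}")
```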


K-DEV: A Borehole Deviation Logging Probe Applicable to Steel-cased Holes (철재 케이싱이 설치된 시추공에서도 적용가능한 공곡검층기 K-DEV)

  • Yoonho, Song;Yeonguk, Jo;Seungdo, Kim;Tae Jong, Lee;Myungsun, Kim;In-Hwa, Park;Heuisoon, Lee
    • Geophysics and Geophysical Exploration
    • /
    • v.25 no.4
    • /
    • pp.167-176
    • /
    • 2022
  • We designed a borehole deviation survey tool applicable to steel-cased holes, K-DEV, and developed a prototype for a depth of 500 m, aiming to develop our own equipment for securing deep subsurface characterization technologies. K-DEV is equipped with sensors that provide digital output with verified high performance, and it is compatible with the logging winch systems used in Korea. The K-DEV prototype has a nonmagnetic stainless steel housing with an outer diameter of 48.3 mm, which has been tested in the laboratory for water resistance up to 20 MPa and for durability by running it into a 1-km-deep borehole. We confirmed the operational stability and data repeatability of the prototype by continuously logging up and down to a depth of 600 m. A high-precision micro-electro-mechanical system (MEMS) gyroscope was used in the K-DEV prototype as the gyro sensor, which is crucial for azimuth determination in cased holes. Additionally, we devised an accurate trajectory survey algorithm employing unscented Kalman filtering and data fusion for optimization. A borehole test with K-DEV and a commercial logging tool produced sufficiently similar results. Furthermore, the error accumulation caused by the MEMS gyro's drift over time was successfully overcome by compensating with stationary measurements taken in the same attitude at the wellhead before and after logging, as demonstrated by results nearly identical to those from the open hole. We believe these test applications confirm the soundness of the K-DEV development methodology as well as the operational stability and data reliability of the prototype.
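A minimal sketch of the drift-compensation idea mentioned in the abstract, assuming the gyro drift grows linearly between the pre-run and post-run stationary checks at the wellhead; the function, values, and the linear-drift assumption are illustrative and not the K-DEV processing code.

```python
# Sketch only: removing MEMS-gyro azimuth drift using stationary wellhead
# measurements taken in the same attitude before and after the logging run.
import numpy as np

def correct_gyro_drift(times, azimuths, t0, az0, t1, az1):
    """Subtract a drift assumed to grow linearly between the pre-run (t0, az0)
    and post-run (t1, az1) stationary checks."""
    drift_rate = (az1 - az0) / (t1 - t0)      # apparent rotation, deg per second
    return azimuths - drift_rate * (times - t0)

# Hypothetical run: 0.5 deg of apparent azimuth drift accumulated over 3600 s
# while the true azimuth stays at 120 deg.
t = np.linspace(0.0, 3600.0, 7)
raw_az = 120.0 + 0.5 * t / 3600.0
print(correct_gyro_drift(t, raw_az, 0.0, 120.0, 3600.0, 120.5))
```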

A Study on Market Size Estimation Method by Product Group Using Word2Vec Algorithm (Word2Vec을 활용한 제품군별 시장규모 추정 방법에 관한 연구)

  • Jung, Ye Lim;Kim, Ji Hui;Yoo, Hyoung Sun
    • Journal of Intelligence and Information Systems
    • /
    • v.26 no.1
    • /
    • pp.1-21
    • /
    • 2020
  • With the rapid development of artificial intelligence technology, various techniques have been developed to extract meaningful information from unstructured text data, which constitutes a large portion of big data. Over the past decades, text mining technologies have been utilized in various industries for practical applications. In the field of business intelligence, text mining has been employed to discover new market and/or technology opportunities and to support rational decision making by business participants. Market information such as market size, market growth rate, and market share is essential for setting companies' business strategies, and there has been continuous demand in various fields for market information at the specific product level. However, such information has generally been provided at the industry level or in broad categories based on classification standards, making it difficult to obtain specific and appropriate information. In this regard, we propose a new methodology that can estimate the market sizes of product groups at more detailed levels than those previously offered. We applied the Word2Vec algorithm, a neural-network-based semantic word embedding model, to enable automatic market size estimation from individual companies' product information in a bottom-up manner. The overall process is as follows. First, data related to product information are collected, refined, and restructured into a form suitable for the Word2Vec model. Next, the preprocessed data are embedded into a vector space by Word2Vec, and product groups are derived by extracting similar product names based on cosine similarity. Finally, the sales data for the extracted products are summed to estimate the market size of each product group. As experimental data, text data of product names from Statistics Korea's microdata (345,103 cases) were mapped into a multidimensional vector space by Word2Vec training. We performed parameter optimization for training and then applied a vector dimension of 300 and a window size of 15 as the optimized parameters for further experiments. We employed the index words of the Korean Standard Industry Classification (KSIC) as a product name dataset to cluster product groups more efficiently. Product names similar to the KSIC indexes were extracted based on cosine similarity, and the market size of the extracted products, treated as one product category, was calculated from individual companies' sales data. The market sizes of 11,654 specific product lines were automatically estimated by the proposed model. For performance verification, the results were compared with the actual market sizes of some items; the Pearson correlation coefficient was 0.513. Our approach has several advantages over previous studies. First, text mining and machine learning techniques were applied for the first time to market size estimation, overcoming the limitations of traditional methods that rely on sampling or multiple assumptions. In addition, the level of market category can be adjusted easily and efficiently according to the purpose of the information by changing the cosine similarity threshold. Furthermore, the approach has high potential for practical application since it can resolve unmet needs for detailed market size information in the public and private sectors.
Specifically, it can be utilized in technology evaluation and technology commercialization support programs conducted by governmental institutions, as well as in business strategy consulting and market analysis reports published by private firms. The limitation of our study is that the presented model needs to be improved in terms of accuracy and reliability. The semantics-based word embedding module could be advanced by imposing a proper ordering on the preprocessed dataset or by combining another measure, such as Jaccard similarity, with Word2Vec. Also, the product group clustering could be replaced with other types of unsupervised machine learning algorithms. Our group is currently working on subsequent studies, and we expect them to further improve the performance of the basic model conceptually proposed in this study.
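A minimal sketch of the bottom-up estimation flow described above, using a toy corpus and made-up sales figures instead of the Statistics Korea microdata; the seed word, similarity threshold, and vector size here are illustrative assumptions (the study itself used a vector dimension of 300 and a window size of 15).

```python
# Sketch only: group similar product names with Word2Vec + cosine similarity,
# then sum company sales over the group to estimate a product-group market size.
from gensim.models import Word2Vec

# Each "sentence" is a tokenized product description from one company (toy data).
sentences = [
    ["stainless", "kitchen", "sink"], ["kitchen", "sink", "bowl"],
    ["industrial", "valve"], ["ball", "valve", "steel"],
]
sales = {"sink": 120.0, "bowl": 30.0, "valve": 500.0, "steel": 80.0}  # assumed sales

# Small vector size for the toy corpus; window=15 mirrors the study's setting.
model = Word2Vec(sentences, vector_size=50, window=15, min_count=1, seed=1)

seed_word = "sink"        # stands in for a KSIC index word
threshold = 0.0           # cosine-similarity cut-off; tuned to the desired granularity
group = {seed_word} | {w for w, sim in model.wv.most_similar(seed_word, topn=10)
                       if sim >= threshold}
market_size = sum(sales.get(w, 0.0) for w in group)
print(group, market_size)
```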

A Study on the Establishment of Acceptable Range for Internal Quality Control of Radioimmunoassay (핵의학 검체검사 내부정도관리 허용범위 설정에 관한 고찰)

  • Young Ji, LEE;So Young, LEE;Sun Ho, LEE
    • The Korean Journal of Nuclear Medicine Technology
    • /
    • v.26 no.2
    • /
    • pp.43-47
    • /
    • 2022
  • Purpose: Radioimmunoassay laboratories implement quality control by systematizing an internal quality control system to assure the quality of test results. This study aims to contribute to the quality assurance of radioimmunoassay results and to the implementation of systematic quality control by measuring the average CV of internal and external quality control across a number of institutions, for reference when each laboratory sets its own acceptable range. Materials and Methods: We measured the average CV of internal quality control and the rate of CVs exceeding 10.0% for a total of 42 items from October 2020 to December 2021. According to the CV results, we classified and compared an upper group (5.0% or less), a middle group (5.0~10.0%), and a lower group (10.0% or more). The rate of CVs exceeding 10.0% was compared by classifying the items tested by five or more institutions into tumor markers, thyroid hormones, and other hormones. The average CV was also measured from the overall average and standard deviation of the external quality control results for 28 items from the first to the fourth quarter of 2021, and from the overall average and standard deviation of the inter-institutional proficiency results for 13 items in the first and second halves of 2021. The average CVs of internal and external quality control were compared by item, so that well-controlled items and items requiring attention could be compared and analyzed. Results: Measuring the average internal quality control precision for 42 items across six institutions, the top group (5.0% or less) included ferritin, HGH, SHBG, and 25-OH-VitD, while the bottom group (10.0% or more) included cortisol, ATA, AMA, renin, and estradiol. Comparing the rate of CVs exceeding 10.0% for tumor markers, CA-125 (6.7%) and CA-19-9 (9.8%) were well controlled, while SCC-Ag (24.3%) and CA-15-3 (26.7%) were among the items requiring attention. For thyroid hormone tests, free T4 (2.1%) and T3 (9.3%) showed excellent performance, while AMA (39.6%) and ATA (51.6%) required attention. For other hormones, IGF-1 (8.8%), FSH (9.1%), and prolactin (9.2%) showed excellent performance, whereas estradiol (37.3%), testosterone (37.7%), and cortisol (44.4%) required attention. Measuring the average CV of all institutions participating in external quality control for 28 items, HGH and SCC-Ag were in the top group (10.0% or less), whereas ATA, estradiol, TSI, and thyroglobulin were in the bottom group (30.0% or more). Conclusion: Evaluating 42 items across six institutions, the average CV was 3.7~12.2%, a 3.3-fold difference between the upper and lower groups. Cortisol, ATA, AMA, renin, and estradiol tests with high CVs will require continuous improvement activities to improve precision. In addition, we measured and compared the overall average CVs of internal quality control, external quality control, and inter-institutional proficiency for 41 items, excluding HBs-Ab, across the six participating institutions. As a result, ATA, AMA, renin, and estradiol fell into the same low-performing subgroup in every comparison, so they require attention in quality control and warrant consideration of a higher acceptable range.
It is recommended that each laboratory set and manage its own acceptable range for the internal quality control CV in consideration of its particular conditions, since reagents and instruments differ and results vary with the tester's proficiency and the quality control materials. If systematic quality control is implemented based on the established acceptable range, the accuracy and reliability of radioimmunoassay results can be improved.
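A minimal sketch of the coefficient-of-variation check underlying the comparisons above (CV = standard deviation / mean × 100), flagging items that exceed the 10.0% criterion; the control values are made up for illustration.

```python
# Sketch only: compute the CV of internal QC results and flag items over 10.0%.
import statistics

qc_results = {                      # hypothetical monthly QC measurements
    "Ferritin": [101, 103, 99, 102, 100],
    "Cortisol": [9.1, 11.8, 8.2, 12.5, 9.9],
}

for item, values in qc_results.items():
    cv = statistics.stdev(values) / statistics.mean(values) * 100.0
    flag = "review" if cv >= 10.0 else "ok"
    print(f"{item}: CV = {cv:.1f}% ({flag})")
```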