
Survival of Campylobacter jejuni under Aerobic Condition (인체장염유발 Campylobacter jejuni의 호기적 조건 하에서의 잔존 양상)

  • Shin, Soon-Young; Kim, Kwang-Yup; Park, Jong-Hyun
    • Korean Journal of Food Science and Technology / v.30 no.4 / pp.916-923 / 1998
  • To provide more information on the enteric pathogen Campylobacter jejuni from the viewpoint of food sanitation, the survival characteristics of two strains of C. jejuni under different conditions were investigated. When 10^7 or 10^3 cells per ml of C. jejuni were inoculated into supplemented Brucella broth and kept at 42°C, 25°C, and 5°C under static aerobic conditions for 7 days, the organisms proliferated exponentially to >10^8, even in the sample inoculated at 10^3 per ml and kept at 42°C for 1~2 days, and a considerable level of viability was maintained throughout the 7 days. At 5°C, most of the initial population survived the early period, and only a <0.5-log10 decrease in cells was observed over the 7 days. At 25°C, a remarkable number of C. jejuni declined within 1~2 days, and cells fell to an undetectable level after 4 days. When sterile milk and minced chicken meat were artificially inoculated with 10^7 per ml of C. jejuni, a 1- to 2-log10 count decrease generally occurred at 42°C and 5°C, while a >3-log10 decrease occurred at 25°C over the 7 days. Unexpectedly, no colonies appeared on plates inoculated from the minced chicken meat sample kept at 42°C after 4 days. The results suggest that C. jejuni contaminating food can survive at refrigeration temperature, whereas it is sensitive to room temperature. It is also shown that the growth of C. jejuni at its optimal temperature may vary with the food source.
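The log10 count decreases reported above can be computed directly from viable counts; a minimal sketch (the function name and example counts are illustrative, not data from the study):

```python
import math

def log10_reduction(n_initial: float, n_final: float) -> float:
    """Decimal (log10) reduction between two viable counts (CFU/ml)."""
    return math.log10(n_initial) - math.log10(n_final)

# A drop from 10^7 to 10^4 CFU/ml is a 3-log10 reduction,
# comparable in scale to the >3-log10 decrease observed at 25°C.
print(log10_reduction(1e7, 1e4))  # → 3.0
```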


The Impact of the Internet Channel Introduction Depending on the Ownership of the Internet Channel (도입주체에 따른 인터넷경로의 도입효과)

  • Yoo, Weon-Sang
    • Journal of Global Scholars of Marketing Science / v.19 no.1 / pp.37-46 / 2009
  • The Census Bureau of the Department of Commerce announced in May 2008 that U.S. retail e-commerce sales for 2006 reached $107 billion, up from $87 billion in 2005 - an increase of 22 percent. From 2001 to 2006, retail e-sales increased at an average annual growth rate of 25.4 percent. The explosive growth of e-commerce has caused profound changes in marketing channel relationships and structures in many industries. Despite the great potential implications for both academicians and practitioners, there still exists a great deal of uncertainty about the impact of the Internet channel introduction on distribution channel management. The purpose of this study is to investigate how the ownership of the new Internet channel affects the existing channel members and consumers. To explore these research questions, this study conducts well-controlled mathematical experiments to isolate the impact of the Internet channel by comparing conditions before and after the Internet channel entry. The model consists of a monopolist manufacturer selling its product through a channel system including one independent physical store before the entry of an Internet store. The addition of the Internet store to this channel system results in a mixed channel comprised of two different types of channels. The new Internet store can be launched by the existing physical store, such as Bestbuy. In this case, the physical retailer coordinates the two types of stores to maximize the joint profits from the two stores. The Internet store can also be introduced by an independent Internet retailer such as Amazon. In this case, retail-level competition occurs between the two types of stores. Although the manufacturer sells only one product, consumers view each product-outlet pair as a unique offering. Thus, the introduction of the Internet channel provides two product offerings for consumers. The channel structures analyzed in this study are illustrated in Fig. 1.
It is assumed that the manufacturer acts as a Stackelberg leader maximizing its own profits with foresight of the independent retailer's optimal responses, as typically assumed in previous analytical channel studies. As a Stackelberg follower, the independent physical retailer or independent Internet retailer maximizes its own profits, conditional on the manufacturer's wholesale price. The price competition between the two independent retailers is assumed to be a Bertrand-Nash game. For simplicity, the marginal cost is set at zero, as typically assumed in this type of study. In order to explore the research questions above, this study develops a game-theoretic model that possesses the following three key characteristics. First, the model explicitly captures the fact that an Internet channel and a physical store exist in two independent dimensions (one in physical space and the other in cyberspace). This enables the model to demonstrate that the effect of adding an Internet store is different from that of adding another physical store. Second, the model reflects the fact that consumers are heterogeneous in their preferences for using a physical store and for using an Internet channel. Third, the model captures the vertical strategic interactions between an upstream manufacturer and a downstream retailer, making it possible to analyze the channel structure issues discussed in this paper. Although numerous previous models capture this vertical dimension of marketing channels, none simultaneously incorporates the three characteristics reflected in this model. The analysis results are summarized in Table 1. When the new Internet channel is introduced by the existing physical retailer, and the retailer coordinates both types of stores to maximize the joint profits from both stores, retail prices increase due to a combination of the coordination of the retail prices and the wider market coverage.
The quantity sold does not significantly increase despite the wider market coverage, because the excessively high retail prices offset the market coverage effect to a degree. Interestingly, the coordinated total retail profits are lower than the combined retail profits of two competing independent retailers. This implies that when a physical retailer opens an Internet channel, the retailers could be better off managing the two channels separately rather than coordinating them, unless they have foresight of the manufacturer's pricing behavior. It is also found that the introduction of an Internet channel affects the power balance of the channel. Retail competition is strong when an independent Internet store joins a channel with an independent physical retailer. This implies that each retailer in this structure has weak channel power. Due to intense retail competition, the manufacturer uses its channel power to increase its wholesale price and extract more profits from the total channel profit. However, the retailers cannot increase retail prices accordingly because of the intense retail-level competition, leaving them with lower channel power. In this case, consumer welfare increases due to the wider market coverage and the lower retail prices caused by the retail competition. The model employed in this study is not designed to capture all the characteristics of the Internet channel. The theoretical model in this study can also be applied to any store that is not geographically constrained, such as TV home shopping or catalog sales via mail. The reasons the model in this study is named an "Internet" model are as follows: first, the most representative example of a store that is not geographically constrained is the Internet. Second, catalog sales usually determine their target markets using pre-specified mailing lists. In this respect, the model used in this study is closer to the Internet than to catalog sales.
However, it would be a desirable future research direction to mathematically and theoretically distinguish the core differences among the various stores that are not geographically constrained. The model is simplified by a set of assumptions to obtain mathematical tractability. First, this study assumes that price is the only strategic tool for competition. In the real world, however, various marketing variables can be used for competition. Therefore, a more realistic model could be designed by incorporating other marketing variables such as service levels or operation costs. Second, this study assumes a market with a single monopoly manufacturer. Therefore, the results from this study should be carefully interpreted in light of this limitation. Future research could relax this limitation by introducing manufacturer-level competition. Finally, some of the results are drawn from the assumption that the monopoly manufacturer is the Stackelberg leader. Although this is a standard assumption among game-theoretic studies of this kind, we could gain deeper understanding and generalize our findings beyond this assumption if the model were analyzed under different game rules.
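The backward-induction logic described above (a Stackelberg-leading manufacturer anticipating a Bertrand-Nash retail response) can be sketched numerically. The linear demand system and parameter values below are illustrative assumptions for exposition, not the paper's actual model:

```python
def bertrand_price(w: float, b: float) -> float:
    # Symmetric Bertrand-Nash retail price for demand q_i = 1 - p_i + b*p_j
    # and retail margin (p_i - w); the first-order condition gives
    # p = (1 + w) / (2 - b).
    return (1.0 + w) / (2.0 - b)

def manufacturer_profit(w: float, b: float) -> float:
    p = bertrand_price(w, b)
    q = 1.0 - p + b * p            # quantity sold at each of the two stores
    return w * 2.0 * max(q, 0.0)   # zero marginal cost, as assumed in the paper

# Stackelberg leader: choose the wholesale price that maximizes profit,
# anticipating the retailers' Bertrand response (grid search for transparency).
b = 0.5  # illustrative cross-price substitutability between the two stores
w_star = max((i / 1000.0 for i in range(1, 2000)),
             key=lambda w: manufacturer_profit(w, b))
# For this demand system the closed form is w* = 1 / (2*(1 - b)) = 1.0 at b = 0.5.
print(w_star)
```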


Performance Evaluation of Siemens CTI ECAT EXACT 47 Scanner Using NEMA NU2-2001 (NEMA NU2-2001을 이용한 Siemens CTI ECAT EXACT 47 스캐너의 표준 성능 평가)

  • Kim, Jin-Su; Lee, Jae-Sung; Lee, Dong-Soo; Chung, June-Key; Lee, Myung-Chul
    • The Korean Journal of Nuclear Medicine / v.38 no.3 / pp.259-267 / 2004
  • Purpose: NEMA NU2-2001 was proposed as a new standard for the performance evaluation of whole-body PET scanners. In this study, the system performance of the Siemens CTI ECAT EXACT 47 PET scanner, including spatial resolution, sensitivity, scatter fraction, and count rate performance in 2D and 3D mode, was evaluated using this new standard method. Methods: The ECAT EXACT 47 is a BGO crystal-based PET scanner covering an axial field of view (FOV) of 16.2 cm. Retractable septa allow 2D and 3D data acquisition. All PET data were acquired according to the NEMA NU2-2001 protocols (coincidence window: 12 ns, energy window: 250~650 keV). For the spatial resolution measurement, an F-18 point source was placed at the center of the axial FOV ((a) x=0, y=1; (b) x=0, y=10; (c) x=10, y=0 cm) and at a position one fourth of the axial FOV from the center ((a) x=0, y=1; (b) x=0, y=10; (c) x=10, y=0 cm). Here, x and y are the transaxial horizontal and vertical directions, and z is the scanner's axial direction. Images were reconstructed using FBP with a ramp filter without any post-processing. To measure the system sensitivity, the NEMA sensitivity phantom, filled with F-18 solution and surrounded by 1~5 aluminum sleeves, was scanned at the center of the transaxial FOV and at 10 cm offset from the center. Attenuation-free sensitivity values were estimated by extrapolating the data to zero wall thickness. The NEMA scatter phantom, 70 cm in length, was filled with F-18 or C-11 solution (2D: 2,900 MBq; 3D: 407 MBq), and coincidence count rates were measured over 7 half-lives to obtain the noise equivalent count rate (NECR) and scatter fraction. We confirmed that the dead time loss of the last frame was below 1%. The scatter fraction was estimated by averaging the true-to-background (scatter+random) ratios of the last 3 frames, in which the fractions of the random rate are negligibly small.
Results: Axial and transverse resolutions at 1 cm offset from the center were 0.62 and 0.66 cm (FBP in 2D and 3D), and 0.67 and 0.69 cm (FBP in 2D and 3D). Axial, transverse radial, and transverse tangential resolutions at 10 cm offset from the center were 0.72 and 0.68 cm (FBP in 2D and 3D), 0.63 and 0.66 cm (FBP in 2D and 3D), and 0.72 and 0.66 cm (FBP in 2D and 3D). Sensitivity values were 708.6 (2D) and 2931.3 (3D) counts/sec/MBq at the center, and 728.7 (2D) and 3398.2 (3D) counts/sec/MBq at 10 cm offset from the center. Scatter fractions were 0.19 (2D) and 0.49 (3D). Peak true count rate and NECR were 64.0 kcps at 40.1 kBq/mL and 49.6 kcps at 40.1 kBq/mL in 2D, and 53.7 kcps at 4.76 kBq/mL and 26.4 kcps at 4.47 kBq/mL in 3D. Conclusion: The performance data for the CTI ECAT EXACT 47 PET scanner reported in this study will be useful for the quantitative analysis of data and for determining optimal image acquisition protocols on this widely used scanner for clinical and research purposes.
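The noise equivalent count rate and scatter fraction reported above are commonly computed from the true, scatter, and random rates; a minimal sketch (the numeric rates in the example are illustrative, not the scanner's measured values):

```python
def necr(trues: float, scatter: float, randoms: float, k: float = 1.0) -> float:
    """Noise equivalent count rate, NECR = T^2 / (T + S + k*R).
    k = 1 assumes noiseless randoms estimation; k = 2 is often used
    for delayed-window randoms subtraction."""
    return trues ** 2 / (trues + scatter + k * randoms)

def scatter_fraction(scatter: float, trues: float) -> float:
    """SF = S / (S + T), with randoms already removed."""
    return scatter / (scatter + trues)

# Illustrative rates in kcps: T = 64.0, S = 10.0, R = 6.0
print(necr(64.0, 10.0, 6.0))         # → 51.2
print(scatter_fraction(10.0, 64.0))  # ≈ 0.135
```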

Pareto Ratio and Inequality Level of Knowledge Sharing in Virtual Knowledge Collaboration: Analysis of Behaviors on Wikipedia (지식 공유의 파레토 비율 및 불평등 정도와 가상 지식 협업: 위키피디아 행위 데이터 분석)

  • Park, Hyun-Jung; Shin, Kyung-Shik
    • Journal of Intelligence and Information Systems / v.20 no.3 / pp.19-43 / 2014
  • The Pareto principle, also known as the 80-20 rule, states that roughly 80% of the effects come from 20% of the causes for many events, including natural phenomena. It has been recognized as a golden rule in business, with wide application of findings such as 20 percent of customers accounting for 80 percent of total sales. On the other hand, the Long Tail theory, pointing out that "the trivial many" produces more value than "the vital few," has gained popularity in recent times thanks to a tremendous reduction of distribution and inventory costs through the development of ICT (Information and Communication Technology). This study started with a view to illuminating how these two primary business paradigms - the Pareto principle and the Long Tail theory - relate to the success of virtual knowledge collaboration. The importance of virtual knowledge collaboration is soaring in this era of globalization and virtualization, transcending geographical and temporal constraints. Many previous studies on knowledge sharing have focused on the factors that affect knowledge sharing, seeking to boost individual knowledge sharing and to resolve the social dilemma caused by the fact that rational individuals are likely to consume rather than contribute knowledge. Knowledge collaboration can be defined as the creation of knowledge not only by sharing knowledge, but also by transforming and integrating such knowledge. From this perspective of knowledge collaboration, the relative distribution of knowledge sharing among participants can count as much as the absolute amounts of individual knowledge sharing. In particular, whether a greater contribution by the upper 20 percent of participants in knowledge sharing will enhance the efficiency of overall knowledge collaboration is an issue of interest. This study deals with the effect of this sort of knowledge sharing distribution on the efficiency of knowledge collaboration and extends the analysis to reflect work characteristics.
All analyses were conducted on actual data instead of self-reported questionnaire surveys. More specifically, we analyzed the collaborative behaviors of the editors of 2,978 English Wikipedia featured articles, which represent the highest quality grade of articles in English Wikipedia. We adopted the Pareto ratio, the ratio of the number of knowledge contributions made by the upper 20 percent of participants to the total number of knowledge contributions made by all participants of an article group, to examine the effect of the Pareto principle. In addition, the Gini coefficient, which represents the inequality of income among a group of people, was applied to reveal the effect of inequality of knowledge contribution. Hypotheses were set up based on the assumption that a higher ratio of knowledge contribution by more highly motivated participants will lead to higher collaboration efficiency, but that if the ratio gets too high, the collaboration efficiency will deteriorate because overall informational diversity is threatened and the knowledge contribution of less motivated participants is discouraged. Cox regression models were formulated for each of the focal variables - Pareto ratio and Gini coefficient - with seven control variables, such as the number of editors involved in an article, the average time between successive edits of an article, and the number of sections a featured article has. The dependent variable of the Cox models is the time from article initiation to promotion to the featured article level, indicating the efficiency of knowledge collaboration. To examine whether the effects of the focal variables vary depending on the characteristics of a group task, we classified the 2,978 featured articles into two categories: academic and non-academic. Academic articles reference at least one paper published in an SCI, SSCI, A&HCI, or SCIE journal.
We assumed that academic articles are more complex, entail more information processing and problem solving, and thus require more skill variety and expertise. The analysis results indicate the following. First, the Pareto ratio and the inequality of knowledge sharing relate in a curvilinear fashion to the collaboration efficiency in an online community, promoting it up to an optimal point and undermining it thereafter. Second, the curvilinear effect of the Pareto ratio and the inequality of knowledge sharing on collaboration efficiency is more pronounced for more academic tasks in an online community.
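The two focal variables described above can be computed directly from per-editor contribution counts; a minimal sketch (the example edit counts are hypothetical, not from the Wikipedia dataset):

```python
def pareto_ratio(contributions: list) -> float:
    """Share of all contributions made by the top 20% of contributors."""
    counts = sorted(contributions, reverse=True)
    top_n = max(1, round(0.2 * len(counts)))
    return sum(counts[:top_n]) / sum(counts)

def gini(contributions: list) -> float:
    """Gini coefficient (0 = perfect equality, approaching 1 = maximal inequality)."""
    x = sorted(contributions)
    n = len(x)
    weighted = sum(i * v for i, v in enumerate(x, start=1))
    return 2.0 * weighted / (n * sum(x)) - (n + 1.0) / n

edits = [1, 1, 2, 3, 5, 8, 13, 40, 90, 160]  # hypothetical per-editor edit counts
print(round(pareto_ratio(edits), 3))  # → 0.774
print(round(gini(edits), 3))          # → 0.705
```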

Information Privacy Concern in Context-Aware Personalized Services: Results of a Delphi Study

  • Lee, Yon-Nim; Kwon, Oh-Byung
    • Asia Pacific Journal of Information Systems / v.20 no.2 / pp.63-86 / 2010
  • Personalized services directly and indirectly acquire personal data, in part, to provide customers with higher-value services that are specifically context-relevant (such as place and time). Information technologies continue to mature and develop, providing greatly improved performance. Sensory networks and intelligent software can now obtain context data, and that is the cornerstone for providing personalized, context-specific services. Yet, the danger of personal information exposure is increasing because the data retrieved by the sensors usually contain private information. Various technical characteristics of context-aware applications have troubling implications for information privacy. In parallel with the increasing use of context for service personalization, information privacy concerns have also increased, such as concerns over the unrestricted availability of context information. Those privacy concerns are consistently regarded as a critical issue facing the success of context-aware personalized services. The entire field of information privacy is growing as an important area of research, with many new definitions and terminologies, because of the need for a better understanding of information privacy concepts. In particular, the factors of information privacy need to be revised according to the characteristics of new technologies. However, previous information privacy factors for context-aware applications have at least two shortcomings. First, there has been little overview of the technology characteristics of context-aware computing. Existing studies have focused on only a small subset of the technical characteristics of context-aware computing. Therefore, there has not been a mutually exclusive set of factors that uniquely and completely describes information privacy in context-aware applications.
Second, user surveys have been widely used to identify factors of information privacy in most studies, despite the limitations of users' knowledge of and experience with context-aware computing technology. Since context-aware services have not yet been widely deployed on a commercial scale, only very few people have prior experience with context-aware personalized services. It is difficult to build users' knowledge about context-aware technology even by increasing their understanding in various ways: scenarios, pictures, flash animations, etc. Therefore, conducting a survey on the assumption that the participants have sufficient experience with or understanding of the technologies shown in the survey may not be valid. Moreover, some surveys are based on simplifying and hence unrealistic assumptions (e.g., they consider only location information as context data). A better understanding of information privacy concern in context-aware personalized services is therefore needed. Hence, the purpose of this paper is to identify a generic set of factors for elemental information privacy concern in context-aware personalized services and to develop a rank-ordered list of information privacy concern factors. We consider the overall technology characteristics to establish a mutually exclusive set of factors. A Delphi survey, a rigorous data collection method, was deployed to obtain reliable opinions from experts and to produce a rank-ordered list. It therefore lends itself well to obtaining a set of universal factors of information privacy concern and their priority. An international panel of researchers and practitioners with expertise in privacy and context-aware systems was involved in our research. The Delphi rounds faithfully followed the procedure for the Delphi study proposed by Okoli and Pawlowski.
This involved three general rounds: (1) brainstorming for important factors; (2) narrowing down the original list to the most important ones; and (3) ranking the list of important factors. For this round only, experts were treated as individuals, not panels. Adapting Okoli and Pawlowski, we outlined the process of administering the study. We performed three rounds. In the first and second rounds of the Delphi questionnaire, we gathered a set of exclusive factors for information privacy concern in context-aware personalized services. In the first round, the respondents were asked to provide at least five main factors for the most appropriate understanding of information privacy concern. To do so, some of the main factors found in the literature were presented to the participants. The second round of the questionnaire discussed the main factors provided in the first round, fleshed out with relevant sub-factors. Respondents were then requested to evaluate each sub-factor's suitability against the corresponding main factor to determine the final sub-factors from the candidates. The sub-factors were drawn from the literature survey. The final factors were those selected by over 50% of the experts. In the third round, a list of factors with corresponding questions was provided, and the respondents were requested to assess the importance of each main factor and its corresponding sub-factors. Finally, we calculated the mean rank of each item to produce the final result. While analyzing the data, we focused on group consensus rather than individual insistence. To do so, a concordance analysis, which measures the consistency of the experts' responses over successive rounds of the Delphi, was adopted during the survey process. As a result, the experts reported that context data collection and the highly identifiable level of identical data are the most important main factor and sub-factor, respectively.
Additional important sub-factors included the diverse types of context data collected, tracking and recording functionalities, and embedded and disappearing sensor devices. The average score of each factor is very useful for future context-aware personalized service development from the viewpoint of information privacy. The final factors differ as follows from those proposed in other studies. First, the concern factors differ from those of existing studies, which are based on privacy issues that may occur during the lifecycle of acquired user information. Our study helped to clarify these sometimes vague issues by determining which privacy concern issues are viable based on the specific technical characteristics of context-aware personalized services. Since a context-aware service differs in its technical characteristics from other services, we selected specific characteristics with a higher potential to increase users' privacy concerns. Secondly, this study considered privacy issues in terms of service delivery and display, which were almost entirely overlooked in existing studies, by introducing IPOS as the factor division. Lastly, for each factor, the level of importance was correlated with professionals' opinions on the extent to which users have privacy concerns. The traditional questionnaire method was not selected because of users' absolute lack of understanding of and experience with the new technology of context-aware personalized services. For understanding users' privacy concerns, the professionals in the Delphi questionnaire process selected context data collection, tracking and recording, and sensory networks as the most important factors among the technological characteristics of context-aware personalized services.
In the creation of context-aware personalized services, this study demonstrates the importance and relevance of determining an optimal methodology: which technologies are needed, and in what sequence, to acquire which types of users' context information. Most studies, alongside the development of context-aware technology, focus on which services and systems should be provided and developed by utilizing context information. However, the results of this study show that, in terms of users' privacy, it is necessary to pay greater attention to the activities that acquire context information. Following the evaluation of the sub-factors, additional studies would be necessary on approaches to reducing users' privacy concerns about technological characteristics such as the highly identifiable level of identical data, the diverse types of context data collected, tracking and recording functionality, and embedded and disappearing sensor devices. The factor ranked at the next highest level of importance after input is context-aware service delivery, which is related to output. The results show that delivery and display, which present services to users in context-aware personalized services moving toward the anywhere-anytime-any-device concept, have come to be regarded as even more important than in previous computing environments. Considering these concern factors when developing context-aware personalized services will help to increase the service success rate and, hopefully, user acceptance of those services. Our future work will be to adopt these factors for qualifying context-aware service development projects, such as u-city development projects, in terms of service quality and hence user acceptance.
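The concordance analysis mentioned in the abstract typically uses Kendall's coefficient of concordance (W); a minimal sketch, assuming complete rankings with no ties (the example rankings are hypothetical, not the study's data):

```python
def kendalls_w(rankings: list) -> float:
    """Kendall's W for m raters each ranking the same n items (no ties).
    W = 12*S / (m^2 * (n^3 - n)), where S is the sum of squared deviations
    of the per-item rank totals from their mean; W = 1 is perfect agreement."""
    m, n = len(rankings), len(rankings[0])
    totals = [sum(r[i] for r in rankings) for i in range(n)]
    mean = sum(totals) / n
    s = sum((t - mean) ** 2 for t in totals)
    return 12.0 * s / (m * m * (n ** 3 - n))

# Three experts ranking four privacy-concern factors (1 = most important)
rounds = [[1, 2, 3, 4], [1, 3, 2, 4], [2, 1, 3, 4]]
print(round(kendalls_w(rounds), 3))  # → 0.778
```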