• Title/Abstract/Keyword: Cost information

10,989 search results (processing time: 0.036 seconds)

A Cost-Effective Pigsty Monitoring System Based on a Video Sensor

  • Chung, Yongwha;Kim, Haelyeon;Lee, Hansung;Park, Daihee;Jeon, Taewoong;Chang, Hong-Hee
    • KSII Transactions on Internet and Information Systems (TIIS) / Vol. 8 No. 4 / pp.1481-1498 / 2014
  • Automated activity monitoring has become important in many applications. In particular, it is an important issue in the large-scale management of group-housed livestock, because it can save a significant part of farm workers' time and minimize the damage caused by livestock problems. In this paper, we propose an automated solution for measuring the daily-life activities of pigs from video data in order to manage group-housed pigs. In particular, we focus on the circadian rhythm of group-housed pigs under windowless, 24-hour light-on conditions. We also derive a cost-effective solution within an acceptable range of quality for the activity monitoring application. Based on experimental results with video monitoring data obtained from two pig farms, we believe our circadian-rhythm-based method can be applied to detect management problems of group-housed pigs in a cost-effective way.
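
For illustration, a minimal Python sketch of the kind of hourly activity index a frame-differencing video monitor could produce, which could then be compared against a circadian baseline. The OpenCV pipeline, pixel-change threshold, and file name are assumptions for the sketch; the abstract does not specify the paper's actual feature extraction.

```python
# Hedged sketch: a frame-differencing activity index for pen-level video,
# binned by hour so it can be compared against a circadian baseline.
# The file name, threshold, and binning are illustrative assumptions,
# not the paper's actual pipeline.
import cv2
import numpy as np
from collections import defaultdict

def hourly_activity(video_path: str, fps: float = 30.0, diff_thresh: int = 25):
    cap = cv2.VideoCapture(video_path)
    prev = None
    hourly = defaultdict(list)
    frame_idx = 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        if prev is not None:
            # Fraction of pixels that changed noticeably between frames.
            motion = float(np.mean(cv2.absdiff(prev, gray) > diff_thresh))
            hour = int(frame_idx / (fps * 3600))
            hourly[hour].append(motion)
        prev = gray
        frame_idx += 1
    cap.release()
    # Mean activity per hour; deviations from a reference circadian curve
    # could then flag potential management problems.
    return {h: float(np.mean(v)) for h, v in sorted(hourly.items())}

if __name__ == "__main__":
    print(hourly_activity("pigsty_cam01.mp4"))  # hypothetical file name
```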

INTEGRATION MODEL OF COST AND SCHEDULE IN STEEL BOX GIRDER BRIDGE PRODUCTION PROCESS

  • Seok Kim;Kyoungmin Kim;Seung-Ho Ha;Kyong Ju Kim
    • 국제학술발표논문집 (International Conference Proceedings) / The 1st International Conference on Construction Engineering and Project Management / pp.1262-1267 / 2005
  • It is still difficult to share and utilize the information generated at each phase of the steel box girder production process, owing to the spatial gap between sites and the differing levels of management information. The physical distance leads to inefficient information transmission and to accidental omissions of, and errors in, the related information, while the varying levels of management information make it difficult to build a new management system. Ultimately, these factors cause cost and schedule losses and hinder the development of such a system. This paper analyzes the current process and presents a graphical process flow using IDEF0. Based on this analysis, a new production process and work breakdown structure (WBS) are investigated, and the conceptual design of the system is suggested. The model proposed in this study is expected to improve the management process in steel box girder production, and the improved process should reduce the redundant cost and schedule information, and its transmission and storage, currently handled through paper-based manual work.
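
As a rough illustration of integrating cost and schedule data over a common work breakdown structure, here is a hedged Python sketch; the WBS codes, fields, and sample figures are hypothetical and not taken from the paper.

```python
# Hedged sketch: one way to link cost and schedule records to a common
# work breakdown structure (WBS) so both can be queried together.
# Field names and sample items are illustrative assumptions, not the
# paper's actual WBS or data model.
from dataclasses import dataclass, field
from datetime import date
from typing import List, Optional

@dataclass
class WBSItem:
    code: str                  # e.g. "SBG.FAB" for a fabrication work package
    description: str
    planned_cost: float
    actual_cost: float = 0.0
    start: Optional[date] = None
    finish: Optional[date] = None
    children: List["WBSItem"] = field(default_factory=list)

    def total_planned_cost(self) -> float:
        # Roll up cost through the WBS hierarchy.
        return self.planned_cost + sum(c.total_planned_cost() for c in self.children)

girder = WBSItem("SBG", "Steel box girder production", planned_cost=0.0)
girder.children.append(WBSItem("SBG.FAB", "Shop fabrication", 120_000.0,
                               start=date(2005, 3, 1), finish=date(2005, 5, 31)))
girder.children.append(WBSItem("SBG.ERC", "Field erection", 80_000.0,
                               start=date(2005, 6, 1), finish=date(2005, 7, 15)))
print(girder.total_planned_cost())  # 200000.0
```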

A Study on an Estimation of Adjusted Coefficient for the Maintenance of Information Security Software in Korea Industry (정보보안 소프트웨어 유지보수 대가기준을 위한 보정계수 산정에 관한 연구)

  • 박유진;박은주
    • 한국전자거래학회지 / Vol. 16 No. 4 / pp.109-123 / 2011
  • As serious information security incidents have occurred recently, social interest in information security has grown considerably, and strengthening cyber security has become essential to protecting national and corporate infrastructure and remaining competitive. However, the current Korean cost standard for the maintenance of information security software is calculated using the standard for general software maintenance, without accounting for the special characteristics of information security. An appropriate, realistic cost standard for information security software maintenance is therefore needed. This study proposes a reasonable and realistic method for estimating an appropriate cost standard for information security software, so as to establish proper payment standards for users and suppliers and, further, to improve the competitiveness of information security software companies.
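
For context, a minimal sketch of how an adjustment coefficient could enter a maintenance-fee calculation (annual fee = base price x maintenance rate x coefficient). The 12% rate and the 1.3 coefficient are purely illustrative assumptions; the study's actual coefficient values are not given in the abstract.

```python
# Hedged sketch: how an adjustment coefficient for security software could
# enter a maintenance-fee calculation. The maintenance rate and the example
# coefficient are illustrative assumptions, not the study's results.
def maintenance_fee(base_price: float,
                    maintenance_rate: float = 0.12,
                    security_adjustment: float = 1.0) -> float:
    """Annual maintenance fee = base price x rate x adjustment coefficient."""
    return base_price * maintenance_rate * security_adjustment

# General software vs. security software with a hypothetical 1.3 coefficient.
print(maintenance_fee(100_000_000))                        # plain rate
print(maintenance_fee(100_000_000, security_adjustment=1.3))
```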

A Fault Tolerant Data Management Scheme for Healthcare Internet of Things in Fog Computing

  • Saeed, Waqar;Ahmad, Zulfiqar;Jehangiri, Ali Imran;Mohamed, Nader;Umar, Arif Iqbal;Ahmad, Jamil
    • KSII Transactions on Internet and Information Systems (TIIS) / Vol. 15 No. 1 / pp.35-57 / 2021
  • Fog computing aims to solve the bandwidth, network latency, and energy consumption problems of cloud computing, and management of the data generated by healthcare IoT devices is one of its significant applications. Huge amounts of data are generated by healthcare IoT devices, and such data must be managed efficiently, with low latency, without failure, and with minimum energy consumption and cost. Task or node failures cause higher latency, energy consumption, and cost. Thus, a failure-free, cost-efficient, and energy-aware management and scheduling scheme for data generated by healthcare IoT devices not only improves the performance of the system but can also save patients' lives, thanks to minimum latency and the provision of fault tolerance. To address these data management and fault-tolerance challenges, we present a Fault Tolerant Data Management (FTDM) scheme for healthcare IoT in fog computing. In FTDM, the data generated by healthcare IoT devices is organized and managed through well-defined components and steps. A two-way fault-tolerance mechanism, i.e., task-based and node-based fault tolerance, handles failures of tasks and nodes. The paper considers energy consumption, execution cost, network usage, latency, and execution time as performance evaluation parameters. Simulations performed in iFogSim show significant improvements: compared with the existing Greedy Knapsack Scheduling (GKS) strategy, the proposed FTDM strategy reduces energy consumption by 3.97%, execution cost by 5.09%, network usage by 25.88%, latency by 44.15%, and execution time by 48.89%. Moreover, patients sometimes must be treated remotely because facilities are unavailable or because of infectious diseases such as COVID-19; in such circumstances, the proposed strategy is particularly efficient.
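
A minimal sketch of the two-way fault-tolerance idea described above, i.e., retrying a failed task (task-based) and failing over to another fog node (node-based). Node names, failure probabilities, and retry limits are assumptions for the sketch, not part of FTDM.

```python
# Hedged sketch: retry a failed task on the same node, then fail over to
# another fog node when the node keeps failing. Parameters are illustrative.
import random

class FogNode:
    def __init__(self, name: str, fail_prob: float = 0.2):
        self.name, self.fail_prob = name, fail_prob

    def execute(self, task: str) -> str:
        if random.random() < self.fail_prob:
            raise RuntimeError(f"{self.name} failed while running {task}")
        return f"{task} done on {self.name}"

def run_with_fault_tolerance(task: str, nodes: list, retries_per_node: int = 2) -> str:
    for node in nodes:                      # node-based fault tolerance: fail over
        for _ in range(retries_per_node):   # task-based fault tolerance: retry
            try:
                return node.execute(task)
            except RuntimeError:
                continue
    raise RuntimeError(f"{task} could not be completed on any node")

nodes = [FogNode("fog-1"), FogNode("fog-2"), FogNode("cloud-backup", fail_prob=0.01)]
print(run_with_fault_tolerance("ecg-batch-17", nodes))
```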

Information Risk and Cost of Equity: The Role of Stock Price Crash Risk

  • SALEEM, Sana;USMAN, Muhammad
    • The Journal of Asian Finance, Economics and Business / Vol. 8 No. 1 / pp.623-635 / 2021
  • The purpose of this research is to examine the impact of information risk on the cost of equity (COE) and whether stock price crash risk mediates the relation between information risk and the COE. To test the dynamic nature of the proposed model, two-step system GMM dynamic panel estimators are applied to all non-financial firms listed on the Pakistan Stock Exchange (PSX) from 2007 to 2018. The results show that all three types of information risk, as well as share price crash risk, increase the COE, and that crash risk strengthens the impact of information risk on the COE. Moreover, the three information risks are correlated with each other: an increase in information quality reduces the effect of asymmetric information and improves investors' ability to interpret it, while an increase in private information decreases transparency. The findings are crucial for asset pricing, portfolio management, and information disclosure. This study contributes to the literature by providing novel evidence on how three different types of information risk, i.e., private information, quality of information, and transparency of information, affect the COE, and on whether crash risk mediates the relationship.
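
As a simplified illustration of the mediation structure (information risk, crash risk, cost of equity), here is a sketch on synthetic data using plain OLS from statsmodels. The paper itself uses two-step system GMM dynamic panel estimators, which this does not reproduce, and all coefficients below are made up.

```python
# Hedged sketch of the mediation pattern on synthetic data, using OLS purely
# for illustration; this is not the paper's dynamic system GMM estimation.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500
info_risk = rng.normal(size=n)                       # e.g. a private-information proxy
crash_risk = 0.5 * info_risk + rng.normal(size=n)    # mediator
coe = 0.3 * info_risk + 0.4 * crash_risk + rng.normal(size=n)

# Step 1: information risk -> crash risk.
m1 = sm.OLS(crash_risk, sm.add_constant(info_risk)).fit()
# Step 2: COE on both; a shrunken direct effect alongside a significant
# crash-risk coefficient is the usual mediation pattern.
X = sm.add_constant(np.column_stack([info_risk, crash_risk]))
m2 = sm.OLS(coe, X).fit()
print(m1.params, m2.params)
```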

Data Structure Modeling for the LCC Analysis of the Plate Girder Bridge Considering Corrosion (부식을 고려한 판형교의 LCC 분석 데이터구조 설계)

  • 김동현;김봉근;이상호
    • 한국방재학회 학술대회논문집 / 한국방재학회 2007년도 정기총회 및 학술발표대회 / pp.497-500 / 2007
  • A data structure was designed not only to estimate the LCC but also to analyze the time-variant reliability index of plate girder bridges. The information model for the data structure was categorized into cost information, cost variable information, user cost information, and reliability analysis information according to the characteristics of the data. The EXPRESS language of STEP was adopted to describe the data structure for the electronic representation of LCC information. The suitability of the developed data structure was verified by estimating the LCC and analyzing the time-variant reliability index of a plate girder bridge subject to corrosion, on the basis of a constructed test database.
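
A rough Python sketch of the four information categories named above, expressed as dataclasses. The actual model is defined in the EXPRESS language of STEP; the field names and sample values here are assumptions for illustration.

```python
# Hedged sketch: the four information categories from the abstract
# (cost, cost variable, user cost, reliability analysis) as plain dataclasses.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class CostInfo:
    item: str
    unit_cost: float
    quantity: float

@dataclass
class CostVariableInfo:
    discount_rate: float
    analysis_period_years: int

@dataclass
class UserCostInfo:
    traffic_volume_per_day: float
    detour_cost_per_vehicle: float

@dataclass
class ReliabilityAnalysisInfo:
    corrosion_rate_mm_per_year: float
    target_reliability_index: float

@dataclass
class BridgeLCCModel:
    costs: List[CostInfo] = field(default_factory=list)
    variables: Optional[CostVariableInfo] = None
    user_costs: Optional[UserCostInfo] = None
    reliability: Optional[ReliabilityAnalysisInfo] = None

model = BridgeLCCModel(
    costs=[CostInfo("deck repainting", unit_cost=35.0, quantity=1200.0)],
    variables=CostVariableInfo(discount_rate=0.04, analysis_period_years=75),
)
print(model)
```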

An Integrated Sequential Inference Approach for the Normal Mean

  • Almahmeed, M.A.;Hamdy, H.I.;Alzalzalah, Y.H.;Son, M.S.
    • Journal of the Korean Statistical Society / Vol. 31 No. 4 / pp.415-431 / 2002
  • A unified framework for statistical inference on the mean of the normal distribution is proposed, from which point estimates, confidence intervals, and statistical tests are derived. The optimal design is justified after investigating which basic information and requirements can and cannot be controlled when specifying practical and statistical requirements. Point estimation is only credible when viewed in the larger context of interval estimation, since the information required for optimal point estimation cannot be specified. Triple sampling is proposed and justified as a reasonable sampling vehicle for achieving the specifiable requirements within the unified framework.
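
A hedged sketch of a three-stage (triple) sampling rule for a fixed-width confidence interval for a normal mean, in the spirit described above: a pilot stage, a second stage taking a design fraction of the estimated optimal size, and a final top-up stage. The pilot size, design fraction, and interval half-width are illustrative assumptions, not the paper's specific design.

```python
# Hedged sketch: three-stage sampling for a fixed-width interval for a
# normal mean. All tuning constants are illustrative assumptions.
import math
import numpy as np
from scipy import stats

def triple_sample_ci(draw, d=0.5, alpha=0.05, m=10, frac=0.5, rng=None):
    """draw(k, rng) returns k new observations; d is the half-width target."""
    rng = rng or np.random.default_rng()
    z = stats.norm.ppf(1 - alpha / 2)
    n_opt = lambda s: math.ceil((z * s / d) ** 2)    # required n for half-width d
    x = draw(m, rng)                                                  # stage 1: pilot
    t = max(m, math.ceil(frac * n_opt(x.std(ddof=1))))
    x = np.concatenate([x, draw(t - len(x), rng)]) if t > len(x) else x   # stage 2
    n = max(t, n_opt(x.std(ddof=1)))
    x = np.concatenate([x, draw(n - len(x), rng)]) if n > len(x) else x   # stage 3
    mean = x.mean()
    return len(x), (mean - d, mean + d)

normal_draw = lambda k, rng: rng.normal(loc=2.0, scale=1.5, size=k)
print(triple_sample_ci(normal_draw))
```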

A characteristic study on the software development cost model based on the lifetime distribution following the shape parameter of Type-2 Gumbel and Erlang distribution (Type-2 Gumbel과 Erlang 분포의 형상모수를 따르는 수명분포에 근거한 소프트웨어 개발 비용모형에 관한 특성 연구)

  • 양태진
    • 한국정보전자통신기술학회논문지 / Vol. 11 No. 4 / pp.460-466 / 2018
  • With the development of information technology, the scale of computer software systems is continuously expanding, and the reliability and cost of software development strongly influence software quality. In this study, based on software failure interval-time data, we compared and analyzed the characteristics of software development cost models built on NHPP lifetime distributions that follow the shape parameters of the Type-2 Gumbel and Erlang distributions. The cost curves of the Goel-Okumoto model and of the proposed Erlang and Type-2 Gumbel models all decreased in the early stage and then gradually increased as failure time went on. Comparing the Erlang and Type-2 Gumbel models, the Erlang model gave an earlier software release time and a more economical cost at the release point. Based on this study, software operators should remove defects in the testing stage rather than the operational stage so that defects decrease after release, and the study is expected to provide the prior information needed to understand the cost characteristics of software development.
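
To illustrate the release-time trade-off described above, a small sketch using the Goel-Okumoto mean value function m(t) = a(1 - e^(-bt)) and a standard-form cost model C(t) = c1*m(t) + c2*(m(T_lc) - m(t)) + c3*t. All parameter values are illustrative assumptions, not the paper's fitted values.

```python
# Hedged sketch of the "cost falls, then rises" behavior and the implied
# optimal release time under a standard NHPP software cost model.
import numpy as np

a, b = 100.0, 0.05            # expected total faults, fault detection rate
c1, c2, c3 = 1.0, 5.0, 0.5    # test-fix cost, field-fix cost, testing cost per time
T_lc = 200.0                  # software life cycle length

m = lambda t: a * (1.0 - np.exp(-b * t))   # Goel-Okumoto mean value function

def cost(t):
    # faults fixed in testing + faults left for operation + testing effort
    return c1 * m(t) + c2 * (m(T_lc) - m(t)) + c3 * t

t = np.linspace(0.0, T_lc, 2001)
c = cost(t)
t_star = t[np.argmin(c)]
print(f"cost falls then rises; approximate optimal release time: {t_star:.1f}")
```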

Tolerance Computation for Process Parameter Considering Loss Cost : In Case of the Larger is better Characteristics (손실 비용을 고려한 공정 파라미터 허용차 산출 : 망대 특성치의 경우)

  • 김용준;김근식;박형근
    • 산업경영시스템학회지 / Vol. 40 No. 2 / pp.129-136 / 2017
  • With the information technology and automation that have rapidly spread through manufacturing industries, tens of thousands of quality variables are measured and stored in databases every day. Existing statistical methods, and variable selection and interpretation by experts, limit proper judgment. Accordingly, various data mining methods, including decision tree analysis, have been developed in recent years. CART and C5.0 are representative decision tree algorithms, but they have limits in defining the tolerance of continuous explanatory variables, and their target variables are restricted to information that indicates only product quality, such as the defect rate. It is therefore essential to develop an algorithm that improves upon CART and C5.0 and can use new quality information such as loss cost. In this study, a new algorithm was developed not only to find the major variables that minimize the target variable, loss cost, but also to overcome the limits of CART and C5.0. The new algorithm defines the tolerance of variables systematically by dividing the continuous explanatory variables into three categories. A larger-the-better characteristic was assumed in the R programming environment to compare the performance of the new algorithm with existing ones, and 10 simulations were performed with 1,000 data sets for each variable. The performance of the new algorithm was verified through a mean test of loss cost. The verification showed that, for larger-the-better characteristics, the tolerances of continuous explanatory variables found by the new algorithm lowered the loss cost more than those found by the existing algorithms. In conclusion, the new algorithm can be used to find tolerances of continuous explanatory variables that minimize process loss, taking the loss cost of the products into account.
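
A minimal sketch of the core idea: divide a continuous explanatory variable into three categories and report the category (tolerance interval) with the lowest mean loss cost. The tertile split and the synthetic data are assumptions for illustration; this is not the paper's algorithm.

```python
# Hedged sketch: three-category tolerance selection by mean loss cost.
import numpy as np

def best_tolerance(x: np.ndarray, loss_cost: np.ndarray):
    lo, hi = np.quantile(x, [1 / 3, 2 / 3])            # three equal-frequency bins
    bins = [(x.min(), lo), (lo, hi), (hi, x.max())]
    means = []
    for left, right in bins:
        mask = (x >= left) & (x <= right)
        means.append(loss_cost[mask].mean())
    k = int(np.argmin(means))
    return bins[k], means[k]

rng = np.random.default_rng(1)
x = rng.uniform(0.0, 10.0, size=1000)                      # a process parameter
loss = (8.0 - x) ** 2 + rng.normal(scale=2.0, size=1000)   # synthetic loss cost
tol, mean_loss = best_tolerance(x, loss)
print(f"suggested tolerance {tol[0]:.2f}-{tol[1]:.2f}, mean loss {mean_loss:.2f}")
```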

Advances in Nonlinear Predictive Control: A Survey on Stability and Optimality

  • Kwon, Wook-Hyun;Han, Soo-Hee;Ahn, Choon-Ki
    • International Journal of Control, Automation, and Systems / Vol. 2 No. 1 / pp.15-22 / 2004
  • Some recent advances in the stability and optimality of nonlinear receding horizon control (NRHC), also known as nonlinear model predictive control (NMPC), are assessed. NRHCs with terminal conditions are surveyed in terms of a terminal state equality constraint, a terminal cost, and a terminal constraint set. NRHCs without terminal conditions are surveyed in terms of a control Lyapunov function (CLF) and cost monotonicity. Additional approaches, such as output feedback, fuzzy, and neural network based schemes, are introduced. The paper excludes results for linear receding horizon controls and concentrates on analytical results for NRHCs rather than their applications. The focus is on stability and optimality rather than robustness.
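
For readers new to the topic, a minimal receding-horizon loop with a quadratic stage cost and a terminal cost, one of the terminal conditions the survey discusses. The toy scalar dynamics, weights, and horizon length are illustrative assumptions.

```python
# Hedged sketch: receding-horizon (NMPC) control of a toy nonlinear scalar
# system with a terminal cost; only the first input of each solution is applied.
import numpy as np
from scipy.optimize import minimize

dt, N = 0.1, 10                      # sample time, prediction horizon
Q, R, P = 1.0, 0.1, 10.0             # state, input, terminal weights

def step(x, u):                      # simple nonlinear plant model
    return x + dt * (np.sin(x) + u)

def horizon_cost(u_seq, x0):
    x, cost = x0, 0.0
    for u in u_seq:
        cost += Q * x**2 + R * u**2
        x = step(x, u)
    return cost + P * x**2           # terminal cost

x = 1.5
for k in range(30):                  # closed-loop receding-horizon iteration
    res = minimize(horizon_cost, np.zeros(N), args=(x,), method="L-BFGS-B")
    u0 = res.x[0]                    # apply only the first input
    x = step(x, u0)
print(f"state after 30 steps: {x:.4f}")
```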