• Title/Summary/Keyword: ORACLE

Search results: 389

Study on Patient Outcomes through the Construction of Korean Nursing Minimum Data Set (NMDS) (한국형 Nursing Minimum Data Set(NMDS)구축을 통한 환자결과에 대한 연구)

  • Lee, Eun-Joo
    • Journal of Korean Academy of Nursing Administration, v.12 no.1, pp.14-22, 2006
  • Purpose: The purpose of this study was to develop a nursing information system that contains the core elements of nursing practice, the Nursing Minimum Data Set (NMDS), which should be collected and documented in all settings in which nursing care is provided. Method: The program was developed within the hospital information system over the TCP/IP protocol, and NANDA, the Nursing Interventions Classification (NIC), and the Nursing Outcomes Classification (NOC) were used to fill out the elements of the NMDS. Oracle was used as the DBMS under Windows 98, and PowerBuilder 5.0 was used as the development language. Results: This study developed linkages among NANDA, NOC, and NIC to facilitate choosing the correct nursing diagnoses, interventions, and outcomes and to stimulate nurses' critical thinking (a minimal sketch of such a linkage table is given after this entry). The system also includes nursing-care-sensitive patient outcomes, so nurses can actively engage in nursing-effectiveness research by analyzing the data stored in the database or by relating it to other health-care databases. Conclusion: The program developed in this study can ultimately be used for nursing research, policy development, reimbursement of nursing care, and calculation of staffing and nursing skill mix, by providing a tool to describe and organize nursing practice and to measure the effectiveness of nursing care.

  • PDF
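
The NANDA-NOC-NIC linkage described above is, at its core, a set of relational tables mapping a nursing diagnosis to suggested outcomes and interventions. The abstract does not give the actual schema, so the following is only a minimal sketch with hypothetical table and column names of how such a linkage could be stored and queried; the original system used Oracle and PowerBuilder rather than Python/SQLite.

```python
# Minimal sketch of a NANDA-NOC-NIC linkage table (illustrative only;
# table/column names are hypothetical, not the schema from the paper).
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

cur.executescript("""
CREATE TABLE nanda_diagnosis  (code TEXT PRIMARY KEY, label TEXT);
CREATE TABLE noc_outcome      (code TEXT PRIMARY KEY, label TEXT);
CREATE TABLE nic_intervention (code TEXT PRIMARY KEY, label TEXT);

-- Linkage table: for each diagnosis, the suggested outcomes and interventions.
CREATE TABLE nnn_linkage (
    nanda_code TEXT REFERENCES nanda_diagnosis(code),
    noc_code   TEXT REFERENCES noc_outcome(code),
    nic_code   TEXT REFERENCES nic_intervention(code)
);
""")

# Example codes for illustration only.
cur.execute("INSERT INTO nanda_diagnosis  VALUES ('00132', 'Acute pain')")
cur.execute("INSERT INTO noc_outcome      VALUES ('1605',  'Pain control')")
cur.execute("INSERT INTO nic_intervention VALUES ('1400',  'Pain management')")
cur.execute("INSERT INTO nnn_linkage      VALUES ('00132', '1605', '1400')")

# Given a diagnosis, list the linked outcomes and interventions
# the nurse can choose from at the point of documentation.
for row in cur.execute("""
    SELECT d.label, o.label, i.label
    FROM nnn_linkage l
    JOIN nanda_diagnosis  d ON d.code = l.nanda_code
    JOIN noc_outcome      o ON o.code = l.noc_code
    JOIN nic_intervention i ON i.code = l.nic_code
    WHERE d.code = '00132'
"""):
    print(row)
conn.close()
```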

Implementation of Education Network using Switching System, Wired System and Radio Communication System (교환시스템과 유·무선통신 시스템을 이용한 교육용 통신망 구현)

  • Park, Jin-Taek;Gahng, Hae-Dong;Park, Young-Goo;Hong, Jin-Keun
    • Journal of the Institute of Electronics Engineers of Korea TE, v.37 no.4, pp.1-10, 2000
  • In this paper, an educational tree-type telecommunication network is implemented using a switching system and wired and radio transmission systems. Training methods for the network, such as digital full-electronic switching training, networking experiments, and optical transmission experiments, are also developed to strengthen trainees' ability to adapt to the varied environments of actual telecommunication networks. Technical information such as technical manuals, user manuals, and maintenance reports is collected and stored in a database built on the Oracle DB system for effective educational use.

  • PDF

TinyIBAK: Design and Prototype Implementation of An Identity-based Authenticated Key Agreement Scheme for Large Scale Sensor Networks

  • Yang, Lijun;Ding, Chao;Wu, Meng
    • KSII Transactions on Internet and Information Systems (TIIS), v.7 no.11, pp.2769-2792, 2013
  • In this paper, we propose an authenticated key agreement scheme, TinyIBAK, based on identity-based cryptography and bilinear pairing, for large-scale sensor networks. We prove the security of our proposal in the random oracle model. According to formal security validation using AVISPA, the proposed scheme is strongly secure against passive and active attacks, such as replay, man-in-the-middle, and node compromise attacks. We implemented our proposal for TinyOS 2.1, analyzed its memory occupation, and evaluated its time and energy performance on MICAz motes using the Avrora toolkit. Moreover, we deployed our proposal within the TOSSIM simulation framework and investigated the effect of node density on the performance of our scheme. Experimental results indicate that our proposal consumes an acceptable amount of resources and is feasible for infrequent key distribution and rekeying in large-scale sensor networks. Compared with other ID-based key agreement approaches, TinyIBAK is much more efficient or comparable in performance while additionally providing rekeying. Compared with traditional key pre-distribution schemes, TinyIBAK achieves significant improvements in security strength, key connectivity, scalability, and communication and storage overhead, and enables efficient secure rekeying.
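
As background on the pairing machinery such schemes rely on (the abstract does not spell out TinyIBAK's construction, so this is the standard identity-based setting rather than the paper's protocol): with a symmetric bilinear pairing $e: G \times G \rightarrow G_T$, a KGC holding a master secret $s$ derives each node's public key as $Q_{ID} = H(ID)$ and issues the private key $S_{ID} = sQ_{ID}$; in security proofs the hash $H$ is modeled as a random oracle. Bilinearity, $e(aP, bQ) = e(P, Q)^{ab}$, is what lets two nodes combine identity-derived keys into a common value, as in the classic Sakai-Ohgishi-Kasahara shared key:

$$
K_{AB} \;=\; e(S_A, Q_B) \;=\; e(Q_A, Q_B)^{s} \;=\; e(Q_A, S_B).
$$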

New Public Key Encryption with Equality Test Based on non-Abelian Factorization Problems

  • Zhu, Huijun;Wang, Licheng;Qiu, Shuming;Niu, Xinxin
    • KSII Transactions on Internet and Information Systems (TIIS), v.12 no.2, pp.764-785, 2018
  • In this paper, we present a new public key encryption scheme with equality test (PKEwET). Compared to other PKEwET schemes, the security of the proposed scheme can be improved because it is based on non-Abelian factorization problems. To our knowledge, it is the first equality-test scheme that can resist quantum algorithm attacks. We show that our scheme is one-way against chosen-ciphertext attacks for a Type-I adversary when the computational Diffie-Hellman problem is hard, and indistinguishable against chosen-ciphertext attacks in the random oracle model for a Type-II adversary when the decisional Diffie-Hellman problem is hard. To conclude, we demonstrate that our scheme is more efficient.
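
For readers unfamiliar with the primitive, the equality-test functionality referred to above is the generic PKEwET notion (not the specifics of the non-Abelian construction): a tester can check whether two ciphertexts, possibly under different public keys, hide the same plaintext without decrypting either one,

$$
\mathrm{Test}(C_1, C_2) = 1 \iff m_1 = m_2, \qquad \text{where } C_i = \mathrm{Enc}(pk_i, m_i).
$$

In authorized variants the tester additionally needs per-user trapdoors; this is typically why a Type-I adversary (holding the trapdoor, so only one-wayness is achievable) is distinguished from a Type-II adversary (without the trapdoor, so indistinguishability is achievable).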

The Development of Forest Fire Statistical Management System using Web GIS Technology

  • Jo, Myung-Hee;Kim, Joon-Bum;Kim, Hyun-Sik;Jo, Yun-Won
    • Proceedings of the KSRS Conference, 2002.10a, pp.183-190, 2002
  • In this paper, a forest fire statistical information management system is constructed in a web environment using web-based GIS (Geographic Information System) technology. Through this system, general users with a web browser can easily access forest fire statistical information and obtain it in visual forms such as maps, graphs, and text. Moreover, forest fire officials can easily control and manage all domestic information by accessing the input, retrieval, and output interfaces. To implement the system, Microsoft IIS 5.0 is used as the web server, and Oracle 8i and ASP (Active Server Pages) are used for database construction and dynamic web page operation, respectively. ESRI ArcIMS is used to serve map data, with Java and HTML as the system development languages. Through this system, general users can obtain all forest fire related information visually in real time and become more aware of forest fire prevention. In addition, forest officials can manage domestic forest resources and control forest fire danger areas efficiently and scientifically by retrieving and analyzing large volumes of forest data, saving the manpower, time, and cost required to collect and manage the data.

  • PDF

Certificateless Proxy Re-Encryption Scheme and Its Extension to Multiple KGC Environment (무인증서기반 프락시 재암호화 기법 및 다중 KGC 환경으로의 확장)

  • Sur, Chul;Jung, Chae-Duk;Park, Young-Ho;Rhee, Kyung-Hyune
    • Journal of Korea Multimedia Society, v.12 no.4, pp.530-539, 2009
  • In this paper we introduce the notion of certificateless proxy re-encryption, which enjoys the advantages of certificateless cryptography while providing the functionality of proxy re-encryption (the generic notion is recalled after this entry). We give precise definitions for secure certificateless proxy re-encryption schemes and also present a concrete scheme from bilinear pairing. Our scheme is unidirectional and compatible with current certificateless encryption deployments. In addition, we show that our scheme achieves chosen-ciphertext security in the random oracle model. Finally, we extend the proposed scheme to a multiple-KGC environment.

  • PDF
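
As a reminder of the underlying primitive (the generic notion, not this paper's pairing-based construction): in proxy re-encryption, a proxy holding a re-encryption key $rk_{A \to B}$ can transform a ciphertext for Alice into a ciphertext of the same message for Bob without learning the plaintext,

$$
\mathrm{Dec}\bigl(sk_B,\; \mathrm{ReEnc}(rk_{A \to B},\, \mathrm{Enc}(pk_A, m))\bigr) = m,
$$

and "unidirectional" means $rk_{A \to B}$ gives no ability to transform ciphertexts in the opposite direction. In the certificateless setting, each user's full private key combines a KGC-issued partial key with a user-chosen secret value, which removes the need for certificates while avoiding the key escrow of fully identity-based schemes.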

A Diffie-Hellman Key Exchange Protocol in the Standard Model (표준 모델에서 안전한 Diffie-Hellman 키 교환 프로토콜)

  • Jeong, Ik-Rae;Kwon, Jeong-Ok;Lee, Dong-Hoon;Hong, Do-Won
    • Journal of KIISE: Information Networking, v.35 no.6, pp.465-473, 2008
  • The MQV protocol has been regarded as the most efficient authenticated Diffie-Hellman key exchange protocol and has been standardized by many organizations, including the US NSA. At Crypto 2005, Hugo Krawczyk showed that MQV is vulnerable to several attacks and suggested a hashed variant of MQV, called HMQV, which provides the same superb performance as MQV together with provable security in the random oracle model. In this paper we suggest an efficient authenticated Diffie-Hellman key exchange protocol providing the same functionality and security as HMQV without random oracles. So far, no authenticated Diffie-Hellman protocol has been proven secure without random oracles while efficiently achieving the same security goals as HMQV.
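
HMQV's full details are beyond a short snippet, but as background here is a minimal sketch of the plain, unauthenticated Diffie-Hellman exchange that MQV/HMQV build on. The parameters are toy-sized and insecure, and the final hashing step is only loosely analogous to HMQV's use of hashing; this is illustration, not the paper's protocol.

```python
# Minimal sketch of textbook (unauthenticated) Diffie-Hellman key exchange.
# MQV/HMQV additionally bind the parties' long-term keys into the session
# key to provide authentication; that step is not shown here.
import hashlib
import secrets

p = 0xFFFFFFFB  # toy prime (2^32 - 5); real deployments use e.g. 2048-bit MODP groups
g = 5           # toy generator

# Each party picks an ephemeral secret and sends g^secret mod p.
a = secrets.randbelow(p - 2) + 1
b = secrets.randbelow(p - 2) + 1
A = pow(g, a, p)   # Alice -> Bob
B = pow(g, b, p)   # Bob -> Alice

# Both sides derive the same shared secret g^(ab) mod p.
k_alice = pow(B, a, p)
k_bob   = pow(A, b, p)
assert k_alice == k_bob

# Hash the shared secret into a session key (HMQV-style protocols also hash,
# but over a richer input that includes the long-term keys).
session_key = hashlib.sha256(k_alice.to_bytes(16, "big")).hexdigest()
print(session_key)
```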

On-Line/Off-Line Signature Schemes with Tight Security Reduction to the RSA Problem (RSA 문제와 동등한 안전성을 갖는 온라인/오프라인 서명 기법)

  • Choi, Kyung-yong;Park, Jong Hwan
    • Journal of the Korea Institute of Information Security & Cryptology, v.28 no.2, pp.327-338, 2018
  • An on-line/off-line signature is a technique in which the heavy computations required for signature generation are performed in the off-line stage and the final signature is completed by a simple operation in the on-line stage. This is suitable for application environments that require immediate signing responses to multiple users. In this paper, we propose two new on-line/off-line signature schemes based on the RSA problem. The first scheme can generate a signature with a fixed-base exponentiation in the on-line stage, and the second scheme can complete an on-line signature with a very simple calculation such as a hash operation. The security of both signatures is based on the RSA problem, and both are proven tightly secure, without security loss, in the random oracle model.
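
The abstract says the first scheme's on-line stage reduces to a fixed-base exponentiation. The concrete schemes are not given here, but the generic speed-up behind that phrase can be illustrated: when the base is known in advance, the powers base^(2^i) can be precomputed off-line, so the on-line exponentiation becomes a short product of precomputed values. The sketch below shows only this generic trick, with hypothetical toy parameters, not the paper's signature schemes.

```python
# Minimal sketch of fixed-base exponentiation with an off-line precomputed
# table: off-line, store base^(2^i) mod n for each bit position; on-line,
# computing base^e mod n is just multiplications over the set bits of e.
# Parameters are hypothetical; this is not the signature scheme itself.

def offline_precompute(base: int, n: int, max_bits: int) -> list[int]:
    """Heavy stage: table[i] = base^(2^i) mod n."""
    table = [base % n]
    for _ in range(max_bits - 1):
        table.append(pow(table[-1], 2, n))
    return table

def online_fixed_base_exp(table: list[int], e: int, n: int) -> int:
    """Fast stage: multiply the precomputed powers selected by e's bits."""
    result = 1
    i = 0
    while e:
        if e & 1:
            result = (result * table[i]) % n
        e >>= 1
        i += 1
    return result

# Toy check against Python's built-in modular exponentiation.
n, base, e = 3233, 7, 1789          # toy modulus/base/exponent
table = offline_precompute(base, n, e.bit_length())
assert online_fixed_base_exp(table, e, n) == pow(base, e, n)
print(online_fixed_base_exp(table, e, n))
```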

Methodology for customizing data mining system (데이터마이닝 시스템의 커스터마이징 방법론)

  • Kim Keun-Hyung;Oh Kyung-Hoon;Kim Min-Cheol
    • Journal of the Korea Institute of Information and Communication Engineering, v.8 no.8, pp.1835-1842, 2004
  • Most commercialized data mining products are delivered as finished products that support some or all of the standard data mining functions. Such products have the advantage of being usable as soon as they are purchased, but they are limited in meeting users' needs and companies' requirements and in adapting to different settings. This study focuses on how to introduce a data mining system and implement it through customization rather than as a finished product. It also presents several cases in which customization with Oracle products was actually applied, as well as other cases in which a purpose-built data mining system was used to analyze sales data.

A Comparison of Data Extraction Techniques and an Implementation of Data Extraction Technique using Index DB -S Bank Case- (원천 시스템 환경을 고려한 데이터 추출 방식의 비교 및 Index DB를 이용한 추출 방식의 구현 -ㅅ 은행 사례를 중심으로-)

  • 김기운
    • Korean Management Science Review, v.20 no.2, pp.1-16, 2003
  • Previous research on data extraction and integration for data warehousing has concentrated mainly on relational DBMSs, or partly on object-oriented DBMSs. It mostly addresses issues related to change-data (delta) capture and incremental update using the triggering capability of active database systems. Little attention has been paid to data extraction from other types of source systems, such as hierarchical DBMSs, or from source systems without triggering capability. This paper argues, from a practical point of view, that in order to find appropriate data extraction techniques for different source systems we need to consider not only the types of information sources and the capabilities of ETT tools but also other characteristics of the source systems, such as operational characteristics (e.g., whether they maintain a DBMS log, a user log, or no log, and whether they provide timestamps) and DBMS characteristics (e.g., whether they have triggering capability). Having applied several different data extraction techniques (e.g., DBMS log, user log, triggering, timestamp-based extraction, and file comparison) to S bank's source systems (e.g., IMS, DB2, ORACLE, and SAM files), we discovered that the data extraction techniques available in a commercial ETT tool do not fully support data extraction from the DBMS log of the IMS system. For such IMS systems, a new data extraction technique is proposed that first creates an Index database and then updates the data warehouse using that Index database. We illustrate the technique with an example application.
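
Among the extraction techniques compared above, timestamp-based extraction is the simplest to illustrate: the extractor remembers the time of its last run and pulls only rows whose modification timestamp is newer. The sketch below shows that pattern with SQLite and hypothetical table/column names; it is not the Index-DB technique the paper proposes for IMS, which instead maintains a separate index database to detect changes.

```python
# Minimal sketch of timestamp-based change-data capture: extract only rows
# modified since the previous extraction run. Table and column names are
# hypothetical; the paper's Index-DB technique for IMS is not shown here.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE account (
    acct_no       TEXT PRIMARY KEY,
    balance       INTEGER,
    last_modified TEXT          -- ISO-8601 timestamp maintained by the source
);
INSERT INTO account VALUES ('A-001', 1000, '2003-05-01T09:00:00');
INSERT INTO account VALUES ('A-002', 2500, '2003-05-02T14:30:00');
INSERT INTO account VALUES ('A-003',  700, '2003-05-03T08:15:00');
""")

def extract_changes(cursor, last_extract_time: str):
    """Return only the rows changed after the previous extraction run."""
    return cursor.execute(
        "SELECT acct_no, balance, last_modified "
        "FROM account WHERE last_modified > ? ORDER BY last_modified",
        (last_extract_time,),
    ).fetchall()

# The extractor persists a high-water mark between runs; here it is a variable.
last_run = '2003-05-02T00:00:00'
deltas = extract_changes(cur, last_run)
print(deltas)                                        # only A-002 and A-003 are extracted
last_run = deltas[-1][-1] if deltas else last_run    # advance the high-water mark
conn.close()
```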