• Title/Summary/Keyword: Paper like electronic paper

Investigation on EO Characteristics of SiNx Thin Film Irradiated by Ion-beam (이온 빔 조사된 SiNx 박막의 전기 광학적 특성에 관한 연구)

  • Lee, Sang-Keuk;Oh, Byeong-Yun;Kim, Byoung-Yong;Han, Jin-Woo;Kim, Young-Hwan;Ok, Chul-Ho;Kim, Jong-Hwan;Han, Jeong-Min;Seo, Dae-Shik
    • Proceedings of the Korean Institute of Electrical and Electronic Material Engineers Conference
    • /
    • 2007.11a
    • /
    • pp.429-429
    • /
    • 2007
  • For various applications of liquid crystal displays (LCDs), the uniform alignment of liquid crystal (LC) molecules on treated surfaces is significantly important. Generally, a rubbing method has been widely used to align the LC molecules on polyimide (PI) surfaces. Rubbed PI surfaces have suitable characteristics, such as uniform alignment. However, the rubbing method has some drawbacks, such as the generation of electrostatic charges and the creation of contaminating particles. Thus, we strongly recommend a non-contact alignment technique for future generations of large high-resolution LCDs. Most recently, the LC aligning capabilities achieved by ultraviolet and ion-beam exposures, which are non-contact methods, on diamond-like carbon (DLC) inorganic thin film layers have been successfully studied because DLC thin films have a high mechanical hardness, a high electrical resistivity, optical transparency, and chemical inertness. In addition, nitrogen-doped DLC (NDLC) thin films exhibit properties similar to those of DLC thin films and a higher thermal stability, because C:N bonding in the NDLC thin films is stronger against thermal stress than C:H bonding in the DLC thin films. Our research group has already studied NDLC thin films by an ion-beam alignment method. The $SiN_x$ thin films deposited by plasma-enhanced chemical vapor deposition are widely used as an insulation layer for thin film transistors and have characteristics similar to those of DLC inorganic thin films. Therefore, in this paper, we report on LC alignment effects and pretilt angle generation on a $SiN_x$ thin film treated by ion-beam irradiation for various N ratios.

Development of Series Connectable Wheeled Robot Module (직렬연결이 가능한 소형 바퀴 로봇 모듈의 개발)

  • Kim, Na-Bin;Kim, Ye-Ji;Kim, Ji-Min;Hwang, Yun Mi;Bong, Jae-Hwan
    • The Journal of the Korea institute of electronic communication sciences
    • /
    • v.17 no.5
    • /
    • pp.941-948
    • /
    • 2022
  • Disaster response robots are deployed to disaster sites where human access is difficult and dangerous. They explore the disaster sites to prevent structural collapse and perform lifesaving to minimize damage. It is difficult to operate robots in disaster sites due to rough terrain where various obstacles are scattered, communication failures, and invisible environments. In this paper, we developed a series-connectable wheeled robot module. The module was developed in two types: an actively driven robot module and a passively driven robot module. A wheeled robot was built by connecting two active-type robot modules and one passive-type robot module. Adjacent robot modules are connected by a one-DoF rotating joint, allowing the wheeled robot to avoid obstacles in the vertical direction. The wheeled robot performed driving and obstacle avoidance using only pressure sensors, which allows it to operate in invisible environments. An obstacle avoidance experiment was conducted to evaluate the performance of the wheeled robot consisting of two active modules and one passive module. The wheeled robot successfully avoided step-shaped obstacles with a maximum height of 80 mm in a time of 24.5 seconds using only pressure sensors, which confirms that it can perform driving and obstacle avoidance in invisible environments.
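
The abstract does not spell out the control logic, but as a rough illustration of how pressure-sensor-only obstacle avoidance for a two-module robot with a one-DoF joint might be structured, the following Python sketch is offered; the sensor names, threshold, and command interface are hypothetical, not the authors' implementation.

```python
# Hypothetical sketch of pressure-sensor-only obstacle avoidance for a
# two-module wheeled robot joined by a one-DoF rotating joint.
# Sensor names, threshold, and actuator commands are assumptions.

from dataclasses import dataclass

CONTACT_THRESHOLD = 0.5   # normalized pressure reading that counts as contact

@dataclass
class PressureReadings:
    front: float    # sensor on the leading module
    bottom: float   # sensor under the leading module

def avoidance_command(p: PressureReadings) -> dict:
    """Map raw pressure readings to drive/joint commands."""
    if p.front > CONTACT_THRESHOLD:
        # Leading module touches a step: lift it by rotating the joint
        # while keeping some forward drive so the wheels can climb.
        return {"drive": 0.3, "joint_rate": +0.5}
    if p.bottom > CONTACT_THRESHOLD:
        # Leading module rests on top of the obstacle: level the joint
        # and resume normal forward speed.
        return {"drive": 1.0, "joint_rate": -0.2}
    # No contact detected: drive straight with the joint held still.
    return {"drive": 1.0, "joint_rate": 0.0}

if __name__ == "__main__":
    print(avoidance_command(PressureReadings(front=0.8, bottom=0.1)))
```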

A Semantic Classification Model for e-Catalogs (전자 카탈로그를 위한 의미적 분류 모형)

  • Kim Dongkyu;Lee Sang-goo;Chun Jonghoon;Choi Dong-Hoon
    • Journal of KIISE:Databases
    • /
    • v.33 no.1
    • /
    • pp.102-116
    • /
    • 2006
  • Electronic catalogs (or e-catalogs) hold information about the goods and services offered or requested by the participants and consequently form the basis of an e-commerce transaction. Catalog management is complicated by a number of factors, and product classification is at the core of these issues. The classification hierarchy is used for spend analysis, customs regulation, and product identification. Classification is the foundation on which product databases are designed and plays a central role in almost all aspects of the management and use of product information. However, product classification has received little formal treatment in terms of its underlying model, operations, and semantics. We believe that the lack of a logical model for classification introduces a number of problems not only for the classification itself but also for the product database in general. A classification needs to meet diverse user views to support efficient and convenient use of product information. It needs to change and evolve frequently without breaking consistency when new products are introduced, existing products become extinct, classes are reorganized, or classes are specialized. It also needs to be merged and mapped with other classification schemes without information loss when B2B transactions occur. To meet these requirements, a classification scheme should be dynamic enough to absorb such changes at the right time and cost. The existing classification schemes widely used today, such as UNSPSC and eClass, however, have many limitations with respect to these dynamic features. In this paper, we try to understand what it means to classify products and present how best to represent classification schemes so as to capture the semantics behind the classifications and facilitate mappings between them. Product information carries a wealth of semantics, such as class attributes like material, time, and place, as well as integrity constraints. We analyze the dynamic features of product databases and the limitations of existing code-based classification schemes, and then describe the semantic classification model, which satisfies the requirements for the dynamic features of product databases. It provides a means to express more semantics for product classes explicitly and formally, and organizes class relationships into a graph. We believe the model proposed in this paper satisfies the requirements and challenges raised by previous works.
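
As a loose illustration of the kind of structure such a model implies (not the paper's exact formalism), the Python sketch below records explicit class attributes and typed relationships such as subclass-of and maps-to in a small graph; all names and the example code value are illustrative.

```python
# Minimal sketch of a semantic product-class graph: classes carry explicit
# attributes and are linked by typed relationships (e.g. subclass-of,
# maps-to), so mappings between classification schemes can be recorded.
# The data model here is illustrative, not the paper's formal model.

from collections import defaultdict

class ClassificationGraph:
    def __init__(self):
        self.attributes = {}             # class name -> {attribute: value}
        self.edges = defaultdict(list)   # class name -> [(relation, target)]

    def add_class(self, name, **attrs):
        self.attributes[name] = attrs

    def relate(self, source, relation, target):
        self.edges[source].append((relation, target))

    def related(self, name, relation):
        return [t for r, t in self.edges[name] if r == relation]

g = ClassificationGraph()
g.add_class("SteelBolt", material="steel", use="fastening")
g.add_class("Fastener")
g.relate("SteelBolt", "subclass-of", "Fastener")
# Record a mapping to a class in another scheme (code value is illustrative).
g.relate("SteelBolt", "maps-to", "UNSPSC:31161500")

print(g.related("SteelBolt", "maps-to"))   # ['UNSPSC:31161500']
```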

Permanent Preservation and Use of Historical Archives : Preservation Issues Digitization of Historical Collection (역사기록물(Archives)의 항구적인 보존화 이용 : 보존전략과 디지털정보화)

  • Lee, Sang-min
    • The Korean Journal of Archival Studies
    • /
    • no.1
    • /
    • pp.23-76
    • /
    • 2000
  • In this paper, I examined what has been researched and determined about preservation strategy and the selection of preservation media in the western archival community. Archivists have primarily been concerned with the 'preservation' and 'use' of archival materials worthy of being preserved permanently. In the new information era, the preservation and use of archival materials face new challenges. The life expectancy of paper records has been shortened due to the acidification and brittleness of modern papers. The emergence of information technology also affects the traditional ways of preserving and using archival materials. User expectations are becoming so technology-oriented and complicated that archivists must act like information managers using computer technology rather than practicing traditional archival handicraft. Preservation strategy plays an important role in archival management as well as information management. For cost-effective management of archives and archival institutions, a preservation strategy is a must. The preservation strategy encompasses all aspects of the archival preservation process and its practices, from selection of archives, appraisal, inventorying, arrangement, description, conservation, microfilming or digitization, and archival buildings, to access service. These archival functions should be considered in relation to each other to ensure the proper preservation of archival materials. In an integrated preservation strategy, 'preservation' and 'use' should be combined and fulfilled without sacrificing either. Preservation strategy planning is essential to determine the policies by which archives keep their holdings safe and provide people with maximum access in the most effective ways. Preservation microfilming ensures the permanent preservation of information held in important archival materials. To this end, detailed standards have been developed to guarantee the permanence of microfilm as well as its product quality. Silver gelatin film can last up to 500 years in an optimum storage environment and is the most viable option for a permanent preservation medium. ISO and ANSI developed such standards for the quality of microfilms and microfilming technology. Preservation microfilming guidelines were also developed to ensure effective archival management and the picture quality of microfilms. It is essential to assess the need for preservation microfilming. Limited resources always put a restraint on preservation management, so appraisal (and selection) of what is to be preserved is the most important part of preservation microfilming. In addition, microfilms of standard quality can be scanned to produce quality digital images for instant use through the internet. As information technology develops, archivists have begun to utilize it to make preservation easier and more economical, and to promote the use of archival materials through computer communication networks. Digitization was introduced to provide easy and universal access to unique archives, and its large capacity for preserving archival data seems very promising. However, digitization, i.e., transferring images of records to electronic codes, still needs to be standardized. Digitized data are electronic records, and at present electronic records are very unstable and cannot be preserved permanently. Digital media, including optical disk materials, have not been proved to be reliable media for permanent preservation. Due to their chemical coating and physical character using light, they are not stable and can be preserved for at best 100 years in an optimum storage environment; most CD-Rs can last only 20 years. Furthermore, the obsolescence of hardware and software makes it hard to reproduce digital images made with earlier versions. Even when reformatting is possible, the cost of refreshing or upgrading digital images is very high, and the process has to be repeated at least every five to ten years. No standard for this obsolescence of hardware and software has come into being yet. In short, digital permanence is not a fact but remains an uncertain possibility. Archivists must consider in their preservation planning both the risk of introducing new technology and its promising possibilities at the same time. In planning the digitization of historical materials, archivists should incorporate planning for maintaining the digitized images and reformatting them for coming generations of new applications. Without such comprehensive planning, future use of the expensive digital images will become unavailable, which is a loss of information and a final failure of both the 'preservation' and the 'use' of archival materials. As Peter Adelstein said, it is wise to be conservative when considerations of conservation are involved.

Design and implementation of smart card-based multi-authentication mechanism for digital contents delivery (디지털콘텐츠 유통을 위한 스마트카드기반의 다중인증처리방법설계 및 구현)

  • Kim, Yong;Lee, Tae-Young
    • Journal of the Korean Society for information Management
    • /
    • v.19 no.1
    • /
    • pp.23-46
    • /
    • 2002
  • With explosively increasing digital contents, libraries and information centers should take on a new role between knowledge providers and knowledge users as information brokering organizations. An electronic transaction system is required for performing this brokering service, since economic value is added to information and knowledge in the information society. The developments and changes around libraries are keeping pace with the increasing construction of digital libraries and the digitization of printed sources. Amid these rapidly changing circumstances, the Internet is witnessing explosive growth. By serving as a virtual information resource, the Internet can dramatically change the way business is conducted and information is provided. However, because of features of the Internet such as openness and information sharing, it has fundamental vulnerabilities in security. For instance, disclosure of private information and line eavesdropping on passwords, banking accounts, transaction data on the network, and so on are primary factors obstructing the activation of digital content delivery over networks. For high network security and authentication, this paper looks at smart card technologies, proposes a multi-authentication protocol based on smart cards for open networks, and implements and analyzes it.
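
As background for the kind of exchange such a protocol builds on (not a reproduction of the paper's multi-authentication design), the following Python sketch shows a generic challenge-response step in which the card proves possession of a shared key without sending it over the network.

```python
# Illustrative challenge-response exchange of the kind a smart card-based
# authentication protocol builds on; this is a generic sketch, not the
# protocol designed in the paper.

import hashlib
import hmac
import secrets

SHARED_KEY = secrets.token_bytes(32)   # provisioned onto the card in advance

def card_response(challenge: bytes, key: bytes = SHARED_KEY) -> bytes:
    """What the card computes: an HMAC over the server's challenge."""
    return hmac.new(key, challenge, hashlib.sha256).digest()

def server_verify(challenge: bytes, response: bytes, key: bytes = SHARED_KEY) -> bool:
    """Server recomputes the MAC and compares in constant time."""
    expected = hmac.new(key, challenge, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)

challenge = secrets.token_bytes(16)          # fresh nonce, prevents replay
response = card_response(challenge)          # computed on the card
print(server_verify(challenge, response))    # True
```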

Device Performances Related to Gate Leakage Current in Al2O3/AlGaN/GaN MISHFETs

  • Kim, Do-Kywn;Sindhuri, V.;Kim, Dong-Seok;Jo, Young-Woo;Kang, Hee-Sung;Jang, Young-In;Kang, In Man;Bae, Youngho;Hahm, Sung-Ho;Lee, Jung-Hee
    • JSTS:Journal of Semiconductor Technology and Science
    • /
    • v.14 no.5
    • /
    • pp.601-608
    • /
    • 2014
  • In this paper, we have characterized the electrical properties related to gate leakage current in AlGaN/GaN MISHFETs while varying the thickness (0 to 10 nm) of the $Al_2O_3$ gate insulator, which also serves as a surface protection layer during high-temperature RTP. The sheet resistance of the unprotected TLM pattern after RTP increased rapidly to $1323{\Omega}/{\square}$ from the as-grown value of $400{\Omega}/{\square}$ due to thermal damage during the high-temperature RTP. On the other hand, the sheet resistances of the TLM patterns protected with a thin $Al_2O_3$ layer (when its thickness is larger than 5 nm) decreased slightly after the high-temperature RTP, since the deposited $Al_2O_3$ layer effectively neutralizes the acceptor-like states on the surface of the AlGaN layer, which in turn increases the 2DEG density. The AlGaN/GaN MISHFET with an 8 nm-thick $Al_2O_3$ gate insulator exhibited an extremely low gate leakage current of $10^{-9}A/mm$, which led to superior device performance such as a very low subthreshold swing (SS) of 80 mV/dec and a high $I_{on}/I_{off}$ ratio of ${\sim}10^{10}$. The PF emission and FN tunneling models were used to characterize the gate leakage currents of the devices. The device with a 5 nm-thick $Al_2O_3$ layer exhibited both PF emission and FN tunneling at relatively lower gate voltages compared to that with an 8 nm-thick $Al_2O_3$ layer, as expected from the thinner insulator. The device with a 10 nm-thick $Al_2O_3$ layer, however, showed a very high gate leakage current of $5.5{\times}10^{-4}A/mm$ due to poly-crystallization of the $Al_2O_3$ layer during the high-temperature RTP, which led to very poor performance.
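
For reference, the commonly cited textbook forms of the two conduction models named in the abstract are (symbols: $E$ the electric field across the insulator, $\phi_B$ the barrier height, $\varepsilon_i$ the insulator permittivity, $m^*$ the electron effective mass):

$$J_{PF} \propto E\,\exp\!\left[-\frac{q\left(\phi_B - \sqrt{qE/\pi\varepsilon_i}\right)}{kT}\right], \qquad J_{FN} \propto E^{2}\exp\!\left(-\frac{8\pi\sqrt{2m^{*}}\,\phi_B^{3/2}}{3qhE}\right)$$

A linear $\ln(J/E)$ versus $\sqrt{E}$ plot indicates PF emission, while a linear $\ln(J/E^{2})$ versus $1/E$ plot indicates FN tunneling, which is presumably how the models were fitted to the measured leakage currents.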

A design of Optimized Vehicle Routing System(OVRS) based on RSU communication and deep learning (RSU 통신 및 딥러닝 기반 최적화 차량 라우팅 시스템 설계)

  • Son, Su-Rak;Lee, Byung-Kwan;Sim, Son-Kweon;Jeong, Yi-Na
    • The Journal of Korea Institute of Information, Electronics, and Communication Technology
    • /
    • v.13 no.2
    • /
    • pp.129-137
    • /
    • 2020
  • Currently, the autonomous vehicle market is researching and developing level 4 autonomous vehicles beyond the commercialization of level 3 autonomous vehicles. Because, unlike level 3, a level 4 autonomous vehicle has to deal with emergencies directly, its most important aspect is stability. In this paper, we propose an Optimized Vehicle Routing System (OVRS) that determines the route with the lowest probability of an accident on the way to the vehicle's destination, rather than relying on an immediate response in an emergency. The OVRS analyzes road and surrounding-vehicle information collected through RSU communication to predict road hazards and sets the route toward the safer and faster road. The OVRS can improve the stability of the vehicle by executing route guidance according to the road situation through the RSUs on the road, much like a network routing method. As a result, the RPNN of the ASICM, one of the OVRS modules, performed about 17% better than the CNN and 40% better than the LSTM. However, because the study was conducted in a virtual environment using a PC, the accident prediction of the VPDM was not verified on actual roads. Therefore, in the future, experiments to raise the accuracy of the VPDM by collecting accident data on actual roads should be conducted with real vehicles and RSUs.
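
The abstract describes route selection by predicted accident probability and speed; a minimal, hypothetical Python sketch of such a ranking (the hazard probabilities would come from the OVRS prediction modules, which are not reproduced here) might look like this:

```python
# Hypothetical sketch of route selection that prefers the safest, then the
# fastest, candidate route. The candidate list and its fields are illustrative.

def select_route(candidates):
    """candidates: list of dicts with predicted 'accident_prob' (0..1) and
    'travel_time' in seconds; returns the chosen route."""
    # Rank primarily by predicted accident probability, then by travel time.
    return min(candidates, key=lambda r: (r["accident_prob"], r["travel_time"]))

routes = [
    {"name": "A", "accident_prob": 0.12, "travel_time": 540},
    {"name": "B", "accident_prob": 0.05, "travel_time": 610},
    {"name": "C", "accident_prob": 0.05, "travel_time": 580},
]
print(select_route(routes)["name"])   # "C": as safe as B but faster
```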

Efficient Management of Statistical Information of Keywords on E-Catalogs (전자 카탈로그에 대한 효율적인 색인어 통계 정보 관리 방법)

  • Lee, Dong-Joo;Hwang, In-Beom;Lee, Sang-Goo
    • The Journal of Society for e-Business Studies
    • /
    • v.14 no.4
    • /
    • pp.1-17
    • /
    • 2009
  • E-Catalogs, which describe products or services, are among the most important data for electronic commerce. E-Catalogs are created, updated, and removed in order to keep information up to date in the e-Catalog database. However, as the number of catalogs increases, information integrity is violated for several reasons, such as catalog duplication and incorrect classification. Catalog search, duplication checking, and automatic classification are important functions for utilizing e-Catalogs and keeping the integrity of the e-Catalog database. To implement these functions, probabilistic models that use statistics of index words extracted from e-Catalogs have been suggested, and the feasibility of these methods has been shown in several papers. However, even though these functions are used together in an e-Catalog management system, there has not been enough consideration of how to share the common data used by each function and how to manage the statistics of index words effectively. In this paper, we suggest a method to implement these three functions by using simple SQL supported by a relational database management system. In addition, we use materialized views to reduce the load of implementing an application that manages the statistics of index words. This improves the efficiency of managing index word statistics by letting the database management system optimize the statistics updates. Through empirical evaluation, we show that our method is feasible for implementing the three functions and effective for managing the statistics of index words.
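
As a self-contained illustration of keeping index word statistics with plain SQL (SQLite is used here, which lacks materialized views, so a summary table refreshed in one statement stands in for one; table and column names are illustrative, not the paper's schema):

```python
# Rough sketch of keeping keyword (index word) statistics in a relational
# database with plain SQL. A summary table emulates a materialized view.

import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE catalog_keyword (catalog_id INTEGER, keyword TEXT);
CREATE TABLE keyword_stats   (keyword TEXT PRIMARY KEY, doc_freq INTEGER);
""")

conn.executemany(
    "INSERT INTO catalog_keyword VALUES (?, ?)",
    [(1, "notebook"), (1, "battery"), (2, "notebook"), (3, "charger")],
)

# Refresh the summary table the way a materialized view would be refreshed.
conn.executescript("""
DELETE FROM keyword_stats;
INSERT INTO keyword_stats
SELECT keyword, COUNT(DISTINCT catalog_id) FROM catalog_keyword GROUP BY keyword;
""")

print(conn.execute("SELECT * FROM keyword_stats ORDER BY keyword").fetchall())
# [('battery', 1), ('charger', 1), ('notebook', 2)]
```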

A Study on Treatment Target Position Verification by using Electronic Portal Imaging Device & Fractionated Stereotatic Radiotherapy (EPID와 FSRT를 이용한 치료표적위치 검증에 관한 연구)

  • Lee, Dong-Hoon;Kwon, Jang-Woo;Park, Seung-Woo;Kim, Yoon-Jong;Lee, Dong-Han;Ji, Young-Hoon
    • Journal of the Institute of Electronics Engineers of Korea SC
    • /
    • v.46 no.3
    • /
    • pp.44-51
    • /
    • 2009
  • It is very important to verify the setup errors generated in cancer therapy using high-energy radiation and to perform precise radiation therapy. In particular, verification of the treatment position is crucial in special therapies such as fractionated stereotactic radiotherapy (FSRT). FSRT normally uses a high dose and a small field size for treating small intracranial lesions. To evaluate the developed FSRT system, the isocenter accuracy of the gantry, couch, and collimator was measured, and the total inaccuracy was less than ${\pm}1mm$. Precise beam targeting is crucial when using high-dose, small-field FSRT for treating small intracranial lesions. An EPID image of a 3 mm lead ball mounted at the isocenter with a 25 mm collimator cone was acquired, and after processing the EPID image, the difference between the center of the 25 mm collimator cone and the center of the 3 mm ball was detected to within one pixel (0.76 mm). In this paper, we show that radiation treatment efficiency can be improved by performing precise radiation therapy with the developed video-based EPID and FSRT in near real time.
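
As a toy illustration of how a pixel offset between the cone field center and the ball centroid converts to millimetres at the 0.76 mm/pixel scale quoted above (the authors' image processing pipeline is not reproduced), consider the following Python sketch:

```python
# Illustrative check of converting a pixel offset between the collimator-cone
# field center and the lead-ball centroid into millimetres, using the
# 0.76 mm/pixel scale quoted in the abstract. The thresholded toy image and
# simple centroiding are assumptions for the sake of a runnable example.

import numpy as np

MM_PER_PIXEL = 0.76   # EPID pixel pitch quoted in the abstract

def centroid(mask: np.ndarray) -> np.ndarray:
    """Intensity-weighted centroid (row, col) of a binary/float image."""
    rows, cols = np.nonzero(mask)
    weights = mask[rows, cols]
    return np.array([np.average(rows, weights=weights),
                     np.average(cols, weights=weights)])

# Toy image: a small ball shadow near the centre of a circular cone field.
yy, xx = np.mgrid[:64, :64]
field = ((yy - 32) ** 2 + (xx - 32) ** 2 < 20 ** 2).astype(float)   # cone aperture
ball = ((yy - 33) ** 2 + (xx - 32) ** 2 < 3 ** 2).astype(float)     # lead ball shadow

offset_px = np.linalg.norm(centroid(ball) - centroid(field))
print(f"offset = {offset_px:.2f} px = {offset_px * MM_PER_PIXEL:.2f} mm")
```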

Predicting Stock Liquidity by Using Ensemble Data Mining Methods

  • Bae, Eun Chan;Lee, Kun Chang
    • Journal of the Korea Society of Computer and Information
    • /
    • v.21 no.6
    • /
    • pp.9-19
    • /
    • 2016
  • In the finance literature, stock liquidity, which shows how easily stocks can be cashed out in the market, has received rich attention from both academicians and practitioners. The reasons are plentiful. First, it is known that stock liquidity significantly affects asset pricing. Second, macroeconomic announcements influence liquidity in the stock market. Therefore, stock liquidity affects investors' decisions as well as managers' decisions. Though there exists a great deal of literature about stock liquidity in finance, it is quite clear that there are no studies attempting to investigate the stock liquidity issue as a decision-making problem. Most stock liquidity studies in the finance literature have dealt with limited views, such as how much it influences stock prices or which variables are significantly associated with describing stock liquidity. However, this paper posits that the stock liquidity issue may be treated as a serious decision-making problem and handled by using data mining techniques to estimate its future extent with statistical validity. In this sense, we collected a financial data set from a number of manufacturing companies listed on KRX (Korea Exchange) during the period 2010 to 2013. The reason we started the data set in 2010 was to avoid the after-shocks of the financial crisis that occurred in 2008. We used the Fn-GuidPro system to gather a total of 5,700 financial data records. The stock liquidity measure was computed by the procedure proposed by Amihud (2002), which is known to be among the best metrics for showing the relationship with daily returns. We applied five data mining techniques (or classifiers): Bayesian networks, support vector machine (SVM), decision tree, neural network, and an ensemble method. The Bayesian networks include GBN (General Bayesian Network), NBN (Naive BN), and TAN (Tree Augmented NBN). The decision tree uses CART and C4.5. The regression result was used as a benchmark performance. The ensemble method uses two types, integration of two classifiers and integration of three classifiers, and is based on voting to combine the classifiers. Among the single classifiers, CART showed the best performance with 48.2%, compared with 37.18% by regression. Among the ensemble methods, the result from integrating TAN, CART, and SVM was the best with 49.25%. Through additional analysis of individual industries, relatively stable industries such as electronic appliances, wholesale & retailing, wood, and leather-bags-shoes showed better performance, over 50%.
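
For concreteness, here is a minimal Python sketch of the Amihud (2002) illiquidity measure referenced in the abstract, i.e. the average ratio of absolute daily return to daily trading value; the scaling factor and sample values are illustrative choices, not the paper's.

```python
# Sketch of the Amihud (2002) illiquidity measure: the average ratio of
# absolute daily return to daily trading value. Scaling (here x10^6) and the
# example data are illustrative.

import numpy as np

def amihud_illiquidity(returns, trading_values, scale=1e6):
    """returns: daily returns; trading_values: daily traded value in currency
    units. Higher values mean a less liquid stock (price moves more per unit
    of value traded)."""
    returns = np.asarray(returns, dtype=float)
    trading_values = np.asarray(trading_values, dtype=float)
    return scale * np.mean(np.abs(returns) / trading_values)

daily_returns = [0.012, -0.004, 0.020, -0.015, 0.003]
daily_values  = [2.1e9, 1.8e9, 2.5e9, 1.9e9, 2.2e9]   # traded value per day
print(f"ILLIQ = {amihud_illiquidity(daily_returns, daily_values):.4f}")
```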