• Title/Summary/Keyword: Implementation Phase


A Study on the Development of a Competency-Based Intervention Course Curriculum of the Korean Academy of Sensory Integration (대한감각통합치료학회 역량기반 중재과정 교육커리큘럼 개발연구)

  • Namkung, Young;Kim, Kyeong-Mi;Kim, Misun;Lee, Jiyoung
    • The Journal of Korean Academy of Sensory Integration
    • /
    • v.17 no.3
    • /
    • pp.26-45
    • /
    • 2019
  • Objective : The purpose of this study is to develop educational goals, training content, and training methods for the intervention course of the Korean Academy of Sensory Integration (KASI), and to conduct competency-based intervention courses based on the competency model for sensory integration intervention. Methods : This study was conducted on occupational therapists who participated in the 2019 intervention course of KASI. In the first phase, educational needs were analyzed to set goals for the intervention course. In the second phase, a meeting of researchers drafted the intervention course education program and the methods of education, and the intervention course was conducted. In the third phase, changes in educational satisfaction and in performance level for each competency index were investigated before and after the intervention course. Results : Reflecting the results of the educational needs analysis, the educational goals were set as "learning and applying the clinical reasoning process of sensory integration intervention" and "intervening by applying the principles of sensory integration intervention". The competency-based intervention course was 42 hours long. The average education satisfaction of participants in the intervention course was 4.48±0.73, and the average education satisfaction of the supervisors was 3.92±0.71. In both groups, the most satisfying curriculum items were the data-driven decision-making process and the intervention goal-setting lecture; the satisfaction level of … was the lowest. Before and after the intervention course, there were significant changes in the performance of two behavioral indicators of the analytic skills in the expertise competency cluster of the competency model. Conclusion : This study is meaningful in that it carried out the systematic course of steps necessary for education development: a survey of educational needs, the development and implementation of an educational curriculum, and an education satisfaction survey.

A hybrid algorithm for the synthesis of computer-generated holograms

  • Nguyen The Anh;An Jun Won;Choe Jae Gwang;Kim Nam
    • Proceedings of the Optical Society of Korea Conference
    • /
    • 2003.07a
    • /
    • pp.60-61
    • /
    • 2003
  • A new approach to reduce the computation time of the genetic algorithm (GA) for making binary phase holograms is described. Synthesized holograms having a diffraction efficiency of 75.8% and a uniformity of 5.8% are proven in computer simulation and demonstrated experimentally. Recently, computer-generated holograms (CGHs) having high diffraction efficiency and flexibility of design have been widely developed for many applications such as optical information processing, optical computing, optical interconnection, etc. Among the proposed optimization methods, the GA has become popular due to its capability of reaching a nearly global optimum. However, there exists a drawback to consider when we use the genetic algorithm: the large amount of computation time needed to construct the desired holograms. One of the major reasons the GA's operation may be time-intensive is the expense of computing the cost function, which must Fourier transform the parameters encoded on the hologram into the fitness value. In trying to remedy this drawback, the Artificial Neural Network (ANN) has been put forward, allowing CGHs to be created easily and quickly [1], but the quality of the reconstructed images is not high enough for applications requiring high precision. For that reason, we attempt to find a new approach combining the good properties and performance of both the GA and the ANN to make CGHs of high diffraction efficiency in a short time. The optimization of a CGH using the genetic algorithm is a process of iteration, including selection, crossover, and mutation operators [2]. It is worth noting that the evaluation of the cost function, with the aim of selecting better holograms, plays an important role in the implementation of the GA. However, this evaluation process spends much time Fourier transforming the parameters encoded on the hologram into the value to be evaluated; depending on the speed of the computer, this process can last up to ten minutes. It is more effective if, instead of merely generating random holograms in the initial step, a set of approximately desired holograms is employed. By doing so, the initial population contains fewer trial holograms, which is equivalent to reducing the GA's computation time. Accordingly, a hybrid algorithm that utilizes a trained neural network to initiate the GA's procedure is proposed: the initial population contains fewer random holograms and is compensated by approximately desired ones. Figure 1 is the flowchart of the hybrid algorithm in comparison with the classical GA. The procedure of synthesizing a hologram on a computer is divided into two steps. First, the simulation of holograms based on the ANN method [1] is carried out to acquire approximately desired holograms. With a teaching data set of 9 characters obtained from the classical GA, 3 layers, 100 hidden nodes, a learning rate of 0.3, and a momentum of 0.5, the trained artificial neural network enables us to attain the approximately desired holograms, which are in fairly good agreement with what the theory suggests. In the second step, the effect of several parameters on the operation of the hybrid algorithm is investigated. In principle, the operation of the hybrid algorithm and the GA are the same except for the modification of the initial step. Hence, the verified results in Ref. [2] for parameters such as the probability of crossover and mutation, the tournament size, and the crossover block size remain unchanged, aside from the reduced population size. A reconstructed image of 76.4% diffraction efficiency and 5.4% uniformity is achieved when the population size is 30, the iteration number is 2000, the probability of crossover is 0.75, and the probability of mutation is 0.001. A comparison between the hybrid algorithm and the GA in terms of diffraction efficiency and computation time is also shown in Fig. 2. With a 66.7% reduction in computation time and a 2% increase in diffraction efficiency compared to the GA method, the hybrid algorithm demonstrates its efficient performance. In the optical experiment, the phase holograms were displayed on a programmable phase modulator (model XGA). Figure 3 shows pictures of diffracted patterns of the letter "0" from the holograms generated using the hybrid algorithm. A diffraction efficiency of 75.8% and a uniformity of 5.8% are measured. We see that the simulation and experiment results are in fairly good agreement with each other. In this paper, the Genetic Algorithm and the Neural Network have been successfully combined in designing CGHs. This method gives a significant reduction in computation time compared to the GA method while still allowing holograms of high diffraction efficiency and uniformity to be achieved. This work was supported by grant No. mOl-2001-000-00324-0 (2002) from the Korea Science & Engineering Foundation.
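The core idea above is that seeding the GA's initial population with approximately desired holograms shortens convergence. The sketch below illustrates that hybrid initialization on a toy binary phase mask problem; it is not the authors' implementation: the trained network is replaced by a hypothetical seed generator, and the target pattern, mask size, generation count, and fitness (a simple FFT-based comparison with the target) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 16                      # hologram: N x N binary phase mask (0 or pi)
POP, GENS = 30, 200         # population size / generations (paper: 30 / 2000)
P_CROSS, P_MUT = 0.75, 0.001

# Hypothetical target far-field intensity: a single bright spot.
target = np.zeros((N, N)); target[N // 2, N // 2] = 1.0

def fitness(mask):
    """Cost evaluation: Fourier-transform the phase mask and compare the
    reconstructed intensity with the target (this is the expensive step
    the hybrid initialization is meant to amortize)."""
    field = np.exp(1j * np.pi * mask)             # bits 0/1 -> phase 0/pi
    recon = np.abs(np.fft.fft2(field)) ** 2
    return -np.sum((recon / recon.sum() - target) ** 2)

def ann_seeds(n):
    """Stand-in for the trained neural network: small perturbations of one
    'approximately desired' hologram (here just a fixed random mask)."""
    approx = (rng.random((N, N)) < 0.5).astype(int)
    return [np.where(rng.random((N, N)) < 0.05, 1 - approx, approx)
            for _ in range(n)]

# Hybrid initial population: ANN-style seeds plus some random holograms.
pop = ann_seeds(POP // 2) + [(rng.random((N, N)) < 0.5).astype(int)
                             for _ in range(POP - POP // 2)]

for _ in range(GENS):
    scores = [fitness(m) for m in pop]

    def pick():                                   # size-2 tournament selection
        i, j = rng.integers(0, POP, 2)
        return pop[i] if scores[i] > scores[j] else pop[j]

    nxt = []
    while len(nxt) < POP:
        a, b = pick().copy(), pick().copy()
        if rng.random() < P_CROSS:                # single-point row crossover
            r = rng.integers(1, N)
            a[:r], b[:r] = b[:r].copy(), a[:r].copy()
        for m in (a, b):                          # bit-flip mutation
            m[rng.random((N, N)) < P_MUT] ^= 1
        nxt += [a, b]
    pop = nxt[:POP]

print("best fitness:", max(fitness(m) for m in pop))
```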


An Ontology Model for Public Service Export Platform (공공 서비스 수출 플랫폼을 위한 온톨로지 모형)

  • Lee, Gang-Won;Park, Sei-Kwon;Ryu, Seung-Wan;Shin, Dong-Cheon
    • Journal of Intelligence and Information Systems
    • /
    • v.20 no.1
    • /
    • pp.149-161
    • /
    • 2014
  • The export of domestic public services to overseas markets contains many potential obstacles, stemming from differences in export procedures, target services, and socio-economic environments. In order to alleviate these problems, a business incubation platform as an open business ecosystem can be a powerful instrument to support the decisions taken by participants and stakeholders. In this paper, we propose an ontology model and its implementation processes for a business incubation platform with an open and pervasive architecture to support public service exports. For the conceptual model of the platform ontology, export case studies are used for requirements analysis. The conceptual model shows the basic structure, with vocabulary and its meaning, the relationships between ontologies, and key attributes. For the implementation and test of the ontology model, the logical structure is edited using the Protégé editor. The core engine of the business incubation platform is the simulator module, where the various contexts of export businesses should be captured, defined, and shared with other modules through ontologies. It is well known that an ontology, in which concepts and their relationships are represented using a shared vocabulary, is an efficient and effective tool for organizing meta-information to develop structural frameworks in a particular domain. The proposed model consists of five ontologies derived from a requirements survey of major stakeholders and their operational scenarios: service, requirements, environment, enterprise, and country. The service ontology contains several components that can find and categorize public services through a case analysis of public service exports. Key attributes of the service ontology are composed of categories including objective, requirements, activity, and service. The objective category, which has sub-attributes including operational body (organization) and user, acts as a reference to search and classify public services. The requirements category relates to the functional needs at a particular phase of system (service) design or operation; its sub-attributes are user, application, platform, architecture, and social overhead. The activity category represents business processes during the operation and maintenance phase, with sub-attributes including facility, software, and project unit. The service category, with sub-attributes such as target, time, and place, acts as a reference to sort and classify the public services. The requirements ontology is derived from the basic and common components of public services and target countries. The key attributes of the requirements ontology are business, technology, and constraints: business requirements represent the needs of processes and activities for public service export; technology represents the technological requirements for the operation of public services; and constraints represent the business law, regulations, or cultural characteristics of the target country. The environment ontology is derived from case studies of target countries for public service operation. Key attributes of the environment ontology are user, requirements, and activity. A user includes stakeholders in public services, from citizens to operators and managers; the requirements attribute represents the managerial and physical needs during operation; the activity attribute represents business processes in detail. The enterprise ontology is introduced from a previous study, and its attributes are activity, organization, strategy, marketing, and time. The country ontology is derived from the demographic and geopolitical analysis of the target country, and its key attributes are economy, social infrastructure, law, regulation, customs, population, location, and development strategies. The priority list of target services for a certain country and/or the priority list of target countries for a certain public service are generated by a matching algorithm. These lists are used as input seeds to simulate consortium partners and the government's policies and programs. In the simulation, the environmental differences between Korea and the target country can be customized through a gap analysis and work-flow optimization process. When the process gap between Korea and the target country is too large for a single corporation to cover, a consortium is considered an alternative choice, and various alternatives are derived from the capability index of enterprises. For financial packages, a mix of various foreign aid funds can be simulated during this stage. It is expected that the proposed ontology model and the business incubation platform can be used by various participants in the public service export market. It could be especially beneficial to small and medium businesses that have relatively few resources and little experience with public service export. We also expect that the open and pervasive service architecture in a digital business ecosystem will help stakeholders find new opportunities through information sharing and collaboration on business processes.
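As a rough illustration of how the five ontologies and the matching algorithm could fit together, the following sketch models two of them (service and country) as plain Python dataclasses and ranks target countries for a service. The class fields follow the attributes named in the abstract, but the scoring weights, numeric indices, and sample data are hypothetical, not taken from the paper or its Protégé model.

```python
from dataclasses import dataclass

# Two of the five ontologies (service and country) as plain dataclasses; the
# attribute names follow the abstract, the data and weights are hypothetical.

@dataclass
class ServiceOntology:
    name: str
    objective: dict        # e.g. {"organization": ..., "user": ...}
    requirements: list     # user, application, platform, architecture, ...
    activity: list         # business processes during operation/maintenance

@dataclass
class CountryOntology:
    name: str
    economy: float         # simplified 0..1 indices standing in for the
    infrastructure: float  # demographic/geopolitical attributes
    regulation_fit: float

def match_score(svc: ServiceOntology, c: CountryOntology) -> float:
    """Toy matching algorithm: weight country indices against how demanding
    the service is. The weights are illustrative only."""
    demand = len(svc.requirements) / 10.0
    return 0.4 * c.economy + 0.4 * c.infrastructure + 0.2 * c.regulation_fit - demand

def priority_list(svc, countries):
    """Priority list of target countries for a given public service."""
    return sorted(countries, key=lambda c: match_score(svc, c), reverse=True)

svc = ServiceOntology("e-procurement",
                      {"organization": "ministry", "user": "citizen"},
                      ["user", "application", "platform"],
                      ["operation", "maintenance"])
countries = [CountryOntology("A", 0.8, 0.6, 0.7),
             CountryOntology("B", 0.5, 0.9, 0.4)]
for c in priority_list(svc, countries):
    print(c.name, round(match_score(svc, c), 3))
```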

Performance of SE-MMA Blind Adaptive Equalization Algorithm in QAM System (QAM 시스템에서 SE-MMA 블라인드 적응 등화 알고리즘의 성능)

  • Lim, Seung-Gag;Kang, Dae-Soo
    • The Journal of the Institute of Internet, Broadcasting and Communication
    • /
    • v.13 no.3
    • /
    • pp.63-69
    • /
    • 2013
  • This paper considers the performance of SE-MMA (Signed-Error MMA), which requires fewer computational operations than the MMA blind equalization algorithm; both can eliminate the intersymbol interference that arises in band-limited, time-dispersive nonlinear communication channels. The MMA algorithm, which can reduce the amplitude distortion and phase rotation caused by intersymbol interference in the channel without using a training sequence, uses as its error signal the difference between the equalizer output and the constant modulus, a statistical characteristic of the transmitted signal. SE-MMA, by contrast, uses only the polarity of the error signal, which simplifies the tap-coefficient update and the hardware implementation. Computer simulations were performed in order to compare the performance of SE-MMA and the conventional MMA algorithm. For this, the recovered signal constellation at the equalizer output, the convergence performance measured by the MSE, MD (maximum distortion), and residual ISI learning curves, and the SER were used. The simulations show that SE-MMA converges faster than MMA, but after reaching the steady state it gives worse values in the other indices.
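A minimal sketch of the tap-update difference the abstract describes: MMA updates the equalizer taps with a multi-modulus error computed separately on the real and imaginary parts, while SE-MMA keeps only the polarity of that error, trading steady-state accuracy for cheaper updates. The 16-QAM source, channel taps, step size, and filter length below are illustrative assumptions, not the paper's simulation setup.

```python
import numpy as np

rng = np.random.default_rng(1)

# 16-QAM source, unit average power.
M = np.array([-3, -1, 1, 3])
sym = (rng.choice(M, 5000) + 1j * rng.choice(M, 5000)) / np.sqrt(10)
R = np.mean(np.real(sym) ** 4) / np.mean(np.real(sym) ** 2)  # MMA modulus

h = np.array([0.05, 1.0, 0.3, 0.1])            # hypothetical dispersive channel
x = np.convolve(sym, h)[:len(sym)]
x += 0.01 * (rng.standard_normal(len(x)) + 1j * rng.standard_normal(len(x)))

def equalize(signed, taps=11, mu=5e-4):
    w = np.zeros(taps, complex); w[taps // 2] = 1.0   # center-spike init
    out = []
    for n in range(taps, len(x)):
        u = x[n - taps:n][::-1]                        # regressor vector
        y = w @ u
        # MMA error: multi-modulus terms on real/imaginary parts separately
        e = (np.real(y) * (np.real(y) ** 2 - R)
             + 1j * np.imag(y) * (np.imag(y) ** 2 - R))
        if signed:                                     # SE-MMA: polarity only
            e = np.sign(np.real(e)) + 1j * np.sign(np.imag(e))
        w -= mu * e * np.conj(u)                       # tap update
        out.append(y)
    return np.array(out)

grid = ((M[:, None] + 1j * M[None, :]) / np.sqrt(10)).ravel()
def residual(y):   # mean squared distance to the nearest constellation point
    return np.mean(np.min(np.abs(y[-500:, None] - grid[None, :]) ** 2, axis=1))

print("MMA    residual:", residual(equalize(False)))
print("SE-MMA residual:", residual(equalize(True)))
```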

Development of Measurement and Evaluation Process for Risk-based Configuration Factors in Mixed Used Development in Urban Regeneration Projects (복합용도 도시재생사업에서의 리스크 기반 변화요인 측정 및 평가 프로세스 개발)

  • Son, Myung-Jin;Hyun, Chang-Taek
    • Korean Journal of Construction Engineering and Management
    • /
    • v.13 no.6
    • /
    • pp.94-106
    • /
    • 2012
  • In recent years, the risks and uncertainties associated with mixed-use development in urban regeneration projects, which have been actively implemented at home and abroad, have been on the rise due to uncertainties in the initial business plan, difficulty of financing, increases in total cost, and schedule delays. To cope with rapid social and economic changes and to optimize benefits, a risk-based configuration management process that considers the whole life cycle is required, along with accurate planning in the early stage of the business. In addition, it is necessary to prepare measures for the evaluation and measurement of configuration factors in relation to the business process. However, previous studies on configuration management in the field of construction focused mainly on humanistic and sociological aspects such as organization, leadership, and ideology. There has been limited research on processes and on measurement and evaluation methods for the configuration factors required in decision-making on the risks and changes that can occur in the actual project implementation phase. Accordingly, in this study we defined risk-based configuration factors and developed a process and a MECA/3DAM/CII methodology to measure and evaluate these factors, so as to carry out systematic configuration management of mixed-use development in urban regeneration projects.

Design of a CMOS Dual-Modulus Prescaler Using New High-Speed Low-Power TSPC D-Flip Flops (새로운 고속 저전력 TSPC D-플립플롭을 사용한 CMOS Dual-Modulus 프리스케일러 설계)

  • Oh, Kun-Chang;Lee, Jae-Kyong;Kang, Ki-Sub;Park, Jong-Tae;Yu, Chong-Gun
    • Journal of IKEEE
    • /
    • v.9 no.2 s.17
    • /
    • pp.152-160
    • /
    • 2005
  • A prescaler is an essential building block for PLL-based frequency synthesizers and must satisfy high-speed and low-power requirements. The design of the D-flip flops used in the prescaler implementation is thus critical. Conventional TSPC D-flip flops suffer from glitches, unbalanced propagation delay, and unnecessary charge/discharge at internal nodes in the precharge phase, which results in increased power consumption. In this paper a new dynamic D-flip flop is proposed to overcome these problems. Glitches are minimized using a discharge-suppression scheme, speed is improved by balancing the propagation delay, and low power consumption is achieved by removing unnecessary discharge. The proposed D-flip flop is employed in designing a 128/129 dual-modulus prescaler using 0.18 μm CMOS process parameters. The designed prescaler operates up to 5 GHz while the conventional one operates up to 4.5 GHz under the same conditions. It consumes 0.394 mW at 4 GHz, a 34% improvement compared with the conventional design.
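For readers unfamiliar with the block being designed, the following behavioral sketch shows what a 128/129 dual-modulus prescaler does at the counter level: divide the input clock by 128 or 129 depending on the modulus-control input. It models only the counting behavior, not the transistor-level TSPC flip-flops, timing, or power that the paper is actually about; the generator interface is our own illustration.

```python
# Counter-level behavior only: divide the input clock by 128 or 129 depending
# on the modulus-control input. This is an illustration, not the paper's circuit.

def prescaler(clk_edges, mod_control):
    """Emit one output pulse per 128 (mod_control=0) or 129 (mod_control=1)
    input clock edges."""
    count, modulus = 0, 129 if mod_control else 128
    for _ in range(clk_edges):
        count += 1
        if count == modulus:
            count = 0
            yield 1

print(sum(prescaler(128 * 10, 0)))   # -> 10 pulses in divide-by-128 mode
print(sum(prescaler(129 * 10, 1)))   # -> 10 pulses in divide-by-129 mode
```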


Novel synthesis of nanocrystalline thin films by design and control of deposition energy and plasma

  • Han, Jeon G.
    • Proceedings of the Korean Vacuum Society Conference
    • /
    • 2016.02a
    • /
    • pp.77-77
    • /
    • 2016
  • Thin films synthesized by plasma processes have been widely applied in a variety of industrial sectors, and the structure control of thin films is one of the prime factors in most of these applications. It is well known that film structure is closely associated with the plasma parameters and with the species in the plasma, namely electrons, ions, radicals, and neutrals. However, the precise control of structure by plasma processing is still limited due to inherent complexity, reproducibility, and control problems in practical implementation. Therefore, the study of the fundamental physical properties that govern the plasmas becomes more crucial for molecular-scale control of film structure and the corresponding properties, for the development and application of new-generation nanoscale film materials. Thin films are formed through nucleation and growth stages during deposition. Such stages involve adsorption, surface diffusion, chemical binding, and other atomic processes at surfaces. This requires identification, determination, and quantification of the surface activity of the species in the plasma. Specifically, the ions and neutrals have kinetic energies ranging from thermal up to tens of eV, and are generated by electron impact of the polyatomic precursor, gas-phase reactions, and interactions with the substrate and reactor walls. The present work highlights these aspects for the controlled, low-temperature plasma-enhanced chemical vapour deposition (PECVD) of Si-based films such as crystalline Si (c-Si) and Si quantum dots, and of sputtered crystalline C, by the design and control of radicals, plasmas, and the deposition energy. Additionally, there is growing demand for low-temperature deposition processes with low hydrogen content by PECVD. The deposition temperature can be reduced significantly by utilizing alternative plasma concepts to lower the reaction activation energy. Evolution in this area continues and has recently produced solutions by increasing the plasma excitation frequency from radio frequency to ultra-high frequency (UHF) and into the microwave range. In this sense, the necessity of dedicated experimental studies, diagnostics, and computer modelling of process plasmas to quantify the effect of the unique chemistry and structure of the growing film under radical and plasma control is realized. Different low-temperature PECVD processes using RF, UHF, and RF/UHF hybrid plasmas, along with magnetron sputtering plasmas, are investigated using numerous diagnostics and film-analysis tools. The broad outlook of this work also outlines some of the 'Grand Scientific Challenges' to which significant contributions from plasma nanoscience-related research can be foreseen.


Construction for the Design Project Management System(DPMS) (디자인 프로젝트 관리 시스템(DPMS)의 구성)

  • 우흥룡
    • Archives of design research
    • /
    • v.12 no.3
    • /
    • pp.227-234
    • /
    • 1999
  • We paid attention to the fact that a project will always tend to increase in size even if its scope narrows. The complexities and multidisciplinary aspects of projects require that the many parts be put together so that the prime objectives (performance, time, and cost) are met. These aspects lead to the use of teams to solve problems that used to be solved by individuals. Firstly, we surveyed design companies and their clients about their design projects, and categorized the design task into five phases: marketing, planning, idea development, presentation, and follow-up. Among these phases, presentation has the most difficult tasks, the longest processing time, and the highest cost, whereas idea development has a relatively low cost, a longer processing time, and more difficult tasks. Most of the companies faced several bottlenecks in their design projects: time control, budget control, and resource control. Secondly, to improve the project-managing process, we adopted the view that dividing and analyzing the sub-critical paths may help in effective managing (Badiru, Adedeji B., 1995). Some sub-critical paths require almost as much attention as the critical path, since they have a high potential of becoming critical when changes occur in the network. From this we suggest the total task weight Gt as a management formula for design project management: Gt = T × λ × 1/100 (Gt = total task weight, T = task weight, λ = criticality). Thirdly, in order to support the management of design projects, we set up an application system for graphically planning and implementing a complex undertaking, which helps make the control of a project easy. The DPMS (Design Project Management System) has two subsystems: the Project Screening System (PSS) and the Project Managing System (PMS). In the PMS, we divided the design project into three modules: Project Planning, Project Implementation, and Project Evaluation. As a result, the DPMS will contribute to making the control of a project easy and effective, and teams can use the DPMS for making decisions and taking action. Further studies are needed on the relationships between the whole project and its tasks.
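A small worked example of the suggested formula, under the assumption that criticality λ is expressed on a 0-100 scale (the abstract does not state the scale); the task weights and criticalities below are hypothetical.

```python
# Worked example of the total-task-weight formula Gt = T * lambda / 100,
# assuming criticality is on a 0-100 scale; all task data are hypothetical.

tasks = [
    ("marketing",        3, 40),   # (phase, task weight T, criticality)
    ("planning",         5, 55),
    ("idea development", 7, 70),
    ("presentation",     9, 90),
    ("follow-up",        2, 30),
]

def total_task_weight(T, criticality):
    return T * criticality / 100.0       # Gt = T * lambda * 1/100

# Rank phases so that near-critical work gets management attention first.
for name, T, crit in sorted(tasks, key=lambda t: -total_task_weight(t[1], t[2])):
    print(f"{name:16s} Gt = {total_task_weight(T, crit):.2f}")
```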


Design and Implementation of a Metadata Structure for Large-Scale Shared-Disk File System (대용량 공유디스크 파일 시스템에 적합한 메타 데이타 구조의 설계 및 구현)

  • 이용주;김경배;신범주
    • Journal of KIISE:Computer Systems and Theory
    • /
    • v.30 no.1
    • /
    • pp.33-49
    • /
    • 2003
  • Recently, there have been large storage demands for manipulating multimedia data. To meet these demands, one major line of research is the SAN (Storage Area Network), which serves local file requests directly from shared-disk storage and eliminates the server bottlenecks that limit performance and availability. A SAN also improves network latency and bandwidth through new channel interfaces like FC (Fibre Channel). But to operate an efficient storage network like a SAN, traditional local file systems and distributed file systems are not suitable, and research is lacking on metadata structures for large numbers of inode objects such as files and directories. In this paper, we describe the architecture and design issues of our shared-disk file system and provide an efficient bitmap for well-formed block allocation in each host, an extent-based semi-flat structure for storing large-scale file data, and a two-phase directory structure using extendible hashing. We also describe a detailed algorithm for implementing the file system's device driver in the Linux kernel, and we compare our file system with a general file system (EXT2) and a shared-disk file system (GFS) in terms of file creation, directory creation, and I/O rate.
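The two-phase directory structure above is built on extendible hashing; the sketch below shows the textbook form of that technique (a directory of bucket pointers indexed by the low bits of the name hash, with directory doubling and bucket splits on overflow). Bucket size, the hash function, and the in-memory layout are illustrative and say nothing about the paper's on-disk format.

```python
# Textbook extendible hashing: directory of bucket pointers indexed by the
# low `global_depth` bits of the hash; buckets split on overflow. The bucket
# size and hash function are illustrative only.

BUCKET_SIZE = 4

class Bucket:
    def __init__(self, depth):
        self.depth = depth            # local depth
        self.items = {}               # name -> inode number

class ExtendibleDir:
    def __init__(self):
        self.global_depth = 1
        self.dir = [Bucket(1), Bucket(1)]

    def _index(self, name):
        return hash(name) & ((1 << self.global_depth) - 1)

    def insert(self, name, ino):
        b = self.dir[self._index(name)]
        if len(b.items) < BUCKET_SIZE or name in b.items:
            b.items[name] = ino
            return
        if b.depth == self.global_depth:          # double the directory first
            self.dir = self.dir + self.dir
            self.global_depth += 1
        b.depth += 1                              # split the overflowing bucket
        new = Bucket(b.depth)
        mask = 1 << (b.depth - 1)
        for i, p in enumerate(self.dir):          # repoint half the entries
            if p is b and i & mask:
                self.dir[i] = new
        for k in list(b.items):                   # redistribute entries
            if hash(k) & mask:
                new.items[k] = b.items.pop(k)
        self.insert(name, ino)                    # retry after the split

    def lookup(self, name):
        return self.dir[self._index(name)].items.get(name)

d = ExtendibleDir()
for i in range(40):
    d.insert(f"file{i}", 100 + i)
print(d.lookup("file7"), d.global_depth)
```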

On Flexibility Analysis of Real-Time Control System Using Processor Utilization Function (프로세서 활용도 함수를 이용한 실시간 제어시스템 유연성 분석)

  • Chae Jung-Wha;Yoo Cheol-Jung
    • The KIPS Transactions:PartA
    • /
    • v.12A no.1 s.91
    • /
    • pp.53-58
    • /
    • 2005
  • The use of computers for the control and monitoring of industrial processes has expanded greatly in recent years. The computer used in such applications is shared between a number of time-critical control and monitoring functions and a non-time-critical batch-processing job stream. Embedded systems encompass a variety of hardware and software components which perform specific functions in a host computer. Many embedded systems must respond to external events under certain timing constraints; failure to respond to certain events on time may either seriously degrade system performance or even result in a catastrophe. In the design of real-time embedded systems, decisions made at the architectural design phase greatly affect the final implementation and performance of the system. Flexibility indicates how much perturbation a particular system architecture can tolerate while still satisfying its real-time requirements: the degree of flexibility of a real-time system architecture indicates the capability of the system to tolerate perturbations in timing-related specifications. Given the degree of flexibility, one may compare and rank different implementations; a system with a higher degree of flexibility is more desirable. Flexibility is also an important factor in trade-off studies between cost and performance. In this paper, we identify the need for a flexibility function and show that existing real-time analysis results can be used effectively. This paper motivates the need for a flexibility measure for the efficient analysis of potential design candidates in the architectural design exploration of real-time embedded systems.
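The paper's flexibility function itself is not reproduced in the abstract, so as a labeled assumption the sketch below illustrates the underlying idea with the classical rate-monotonic utilization test: a task set sitting further below the Liu & Layland bound can tolerate larger perturbations in execution times, i.e. is more flexible. The (C_i, T_i) values are hypothetical.

```python
# Classical rate-monotonic utilization test (Liu & Layland, 1973) used here as
# a stand-in flexibility measure; the (C_i, T_i) task parameters are hypothetical.

tasks = [(1.0, 10.0), (2.0, 25.0), (4.0, 50.0)]   # (execution time, period)

U = sum(c / t for c, t in tasks)                  # processor utilization
n = len(tasks)
bound = n * (2 ** (1 / n) - 1)                    # RM schedulability bound

margin = bound - U                                # slack before the test fails
print(f"U = {U:.3f}, bound = {bound:.3f}, margin = {margin:.3f}")

# A larger margin tolerates, e.g., uniform scaling of all C_i by up to
# bound/U before this sufficient test fails.
print(f"tolerable uniform C-scaling: x{bound / U:.2f}")
```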