• Title/Summary/Keyword: Machine Simulation (머신시뮬레이션)


Simulation-Based Analysis of C System in C3 System of Systems Via Machine-Learning Based Abstraction of C2 System (머신러닝 기반의 C2 시스템 추상화를 통한 C3 복합체계에서의 시뮬레이션 기반 통신 시스템 분석)

  • Kang, Bong Gu;Seo, Kyung Min;Kim, Byeong Soo;Kim, Tag Gon
    • Journal of the Korea Society for Simulation / v.27 no.1 / pp.61-73 / 2018
  • In defense modeling and simulation, detailed analysis of a communication system has typically been carried out within a C3 SoS (system of systems) consisting of C2 (command and control) and C (communication) systems. However, this subjects the analysis to the time and space constraints of the C2 system. To solve this problem, this paper proposes a communication analysis method in a standalone environment in which the C system is combined with an abstraction of the C2 system. In the abstraction process, we hypothesize a traffic model and a mobility model for C system analysis and learn the parameters of these models using machine learning. The proposed method makes it possible to construct traffic and mobility models whose outputs differ according to the battlefield situation. A case study shows how the process can be applied to a C3 SoS and demonstrates improved accuracy over the existing method. We expect that efficient communication analysis can be carried out across many experimental scenarios with various communication parameters.
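The abstraction step described above — hypothesizing a traffic model and learning its parameters from observed C2 behavior — can be sketched as a simple regression. The linear rate model, the feature names, and the data below are illustrative assumptions, not the paper's actual models:

```python
import numpy as np

# Hypothetical logged samples from the C2 system: each row is a battlefield
# feature vector (e.g. unit count, engagement intensity), each y the observed
# message rate the C system received in that interval. All values invented.
X = np.array([[10, 0.2], [20, 0.5], [30, 0.9], [40, 0.4], [50, 1.0]], float)
y = np.array([3.1, 7.2, 12.8, 11.0, 17.9])

# Abstract the C2 system as a linear traffic model rate = w . x + b,
# fitted by ordinary least squares (a stand-in for the paper's ML step).
A = np.hstack([X, np.ones((len(X), 1))])      # append a bias column
w, *_ = np.linalg.lstsq(A, y, rcond=None)

def traffic_rate(features):
    """Predicted message rate for a new battlefield state."""
    return float(np.append(features, 1.0) @ w)

print(round(traffic_rate([25, 0.6]), 2))
```

The abstracted model can then drive the standalone C-system simulation without the full C2 simulator in the loop.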

Development of Machine Learning-Based Platform for Distillation Column (증류탑을 위한 머신러닝 기반 플랫폼 개발)

  • Oh, Kwang Cheol;Kwon, Hyukwon;Roh, Jiwon;Choi, Yeongryeol;Park, Hyundo;Cho, Hyungtae;Kim, Junghwan
    • Korean Chemical Engineering Research / v.58 no.4 / pp.565-572 / 2020
  • This study developed a software platform that uses machine learning to optimize the distillation column system. The distillation column is a representative, core process in the petrochemical industry. Process stabilization is difficult due to varied operating conditions and continuous process characteristics, and process efficiency differs depending on operator skill. Process control based on theoretical simulation has been used to overcome this problem, but it cannot be applied to complex processes or real-time systems. This study aims to develop an empirical simulation model based on machine learning and to suggest an optimal process operation method. Developing the empirical simulation involves collecting big data from the actual process, extracting features through data mining, and selecting a representative algorithm for the chemical process. Finally, the platform for the distillation column was developed and verified through the developed model and field tests. Through the platform, it is possible to predict operating parameters and provide optimal operating conditions for efficient process control. This study is a basic study applying machine learning to the chemical process; after application to a wide variety of processes, it can serve as a cornerstone of the Industry 4.0 smart factory.
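The empirical-surrogate idea — fit a model to plant data, then search it for good operating conditions — can be sketched as follows. The purity model, the variable names, and the synthetic data are assumptions for illustration only, not the platform's actual model:

```python
import numpy as np

# Synthetic "plant data": operating variables (reflux ratio, reboiler duty)
# and a top-product purity generated from an assumed noisy relationship.
rng = np.random.default_rng(0)
reflux = rng.uniform(1.0, 4.0, 200)
duty = rng.uniform(0.5, 2.0, 200)
purity = 0.90 + 0.02 * reflux + 0.01 * duty + rng.normal(0, 0.002, 200)

# Fit a linear surrogate by least squares (standing in for the ML step).
A = np.column_stack([reflux, duty, np.ones_like(reflux)])
coef, *_ = np.linalg.lstsq(A, purity, rcond=None)

def predict_purity(r, q):
    return float(coef @ [r, q, 1.0])

# Grid-search the surrogate for the "cheapest" operating point meeting a
# purity spec; r + q is an invented cost proxy for illustration.
best = min(((r, q) for r in np.linspace(1, 4, 31)
            for q in np.linspace(0.5, 2, 16)
            if predict_purity(r, q) >= 0.97),
           key=lambda p: p[0] + p[1], default=None)
print(best)
```

In a real platform the linear fit would be replaced by the algorithm chosen for the process, but the collect-fit-optimize loop is the same.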

A Technique for Provisioning Virtual Clusters in Real-time and Improving I/O Performance on Computational-Science Simulation Environments (계산과학 시뮬레이션을 위한 실시간 가상 클러스터 생성 및 I/O 성능 향상 기법)

  • Choi, Chanho;Lee, Jongsuk Ruth;Kim, Hangi;Jin, DuSeok;Yu, Jung-lok
    • KIISE Transactions on Computing Practices / v.21 no.1 / pp.13-18 / 2015
  • Computational science simulations have enabled discovery in a broad spectrum of application areas, and they place irregular, fluctuating demands on computing resources. The adoption of virtualized high-performance clouds, rather than CPU-centric computing platforms such as supercomputers, is gaining interest mainly due to their ease of use, multi-tenancy, and flexibility. Provisioning a virtual cluster, which consists of many virtual machines, in real time has a critical impact on the successful deployment of virtualized HPC clouds for computational science simulations. However, the cost of concurrently creating many virtual machines when constructing a virtual cluster can be as much as two orders of magnitude worse than expected, and one of the main factors in this bottleneck is the time spent creating the virtual images for the virtual machines. In this paper, we propose a novel technique to minimize the creation time of virtual machine images and improve the I/O performance of the provisioned virtual clusters. We also confirm that our proposed technique outperforms conventional ones across various sets of experiments.
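One common way to cut image-creation time — copy-on-write overlays over a shared base image, as qcow2 backing files provide — can be illustrated with a toy model. This is an assumed mechanism for illustration, not necessarily the paper's actual technique:

```python
import json
import tempfile
import time
from pathlib import Path

# Toy model: instead of duplicating a multi-GB base image per VM, each VM
# gets a thin overlay that records only its own writes and points at the
# shared, read-only base (this is how qcow2 backing files work).
root = Path(tempfile.mkdtemp())
base = root / "base.img"
base.write_bytes(b"\x00" * 1_000_000)          # stand-in for a big base image

def provision_full_copy(name):
    # Conventional approach: full duplication of the base image.
    (root / f"{name}.img").write_bytes(base.read_bytes())

def provision_overlay(name):
    # Copy-on-write approach: a tiny metadata file referencing the base.
    (root / f"{name}.ovl").write_text(
        json.dumps({"backing": str(base), "writes": {}}))

t0 = time.perf_counter()
for i in range(8):
    provision_full_copy(f"vm{i}")
t1 = time.perf_counter()
for i in range(8):
    provision_overlay(f"vm{i}")
t2 = time.perf_counter()
print(f"full copies: {t1 - t0:.4f}s, overlays: {t2 - t1:.4f}s")
```

The overlay files are orders of magnitude smaller than full copies, which is where the provisioning-time savings come from.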

A Study on the Application of Machine Simulation and Angle Milling Head of a 6-Axis Parallel Kinematic Machine (6축 병렬기구 공작기계의 머신 시뮬레이션과 앵글밀링헤드 적용에 관한 연구)

  • Lee, In-Su;Kim, Hae-Ji;Kim, Nam-Kyung
    • Journal of the Korean Society of Manufacturing Process Engineers / v.16 no.5 / pp.47-54 / 2017
  • This study examines the use of a parallel kinematic machine tool to evaluate interference and collision during 5-axis machining of airplane wing ribs, particularly for a large-size model airplane. We develop a machine simulation model of the parallel kinematic machine tool that operates in a virtual space equivalent to the actual conditions in the field. Investigation of the simulation function elements indicates the necessity of 6-axis machining, realized by attaching an angle head to the main spindle of the machine. Using an NC program for the wing ribs, we verify the correspondence and conformity between the machine simulation model and the actual equipment.
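An interference check of the kind such a machine simulation performs can be sketched with axis-aligned bounding boxes; the geometry and toolpath values below are invented for illustration:

```python
# Represent the angle head and the fixture as axis-aligned bounding boxes
# (AABBs) and flag any overlap along a programmed toolpath, before the NC
# program ever runs on the real machine.
def boxes_collide(a, b):
    """a, b: ((xmin,ymin,zmin), (xmax,ymax,zmax)); True if the AABBs overlap."""
    return all(a[0][i] <= b[1][i] and b[0][i] <= a[1][i] for i in range(3))

fixture = ((0.0, 0.0, 0.0), (100.0, 50.0, 30.0))   # made-up fixture extents

def head_box(x, y, z, half=15.0):
    # Crude cube around the tool-head center; a real simulation would use
    # the actual head geometry.
    return ((x - half, y - half, z - half), (x + half, y + half, z + half))

# Check a toolpath point by point; collect the points that would collide.
path = [(150, 25, 60), (120, 25, 50), (90, 25, 40), (60, 25, 35)]
hits = [p for p in path if boxes_collide(head_box(*p), fixture)]
print(hits)   # points where the head box overlaps the fixture box
```

Real machine-simulation packages refine this with exact solid geometry and the machine's kinematic chain, but AABB overlap is the standard first-pass filter.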

Simulation Analysis of Flexible Track Drilling Machines Based on ADAMS (ADAMS 기반의 플렉시블 트랙 드릴링 머신의 시뮬레이션 분석)

  • Zhu, Zhong-gang;Zhang, Qi;Lv, Jian-Hua;Qin, Zhen;Lyu, Sung-Ki
    • Journal of the Korean Society of Manufacturing Process Engineers / v.17 no.5 / pp.1-7 / 2018
  • Flexible track drilling machines have important applications in aircraft manufacturing because of their portability, quick installation, and high efficiency. However, their structures are unusual, and their constitution principles and motion characteristics are difficult to master, which increases development costs and lengthens the research cycle under the technology blockade of foreign companies. Simulation analysis of flexible track drilling machines can be conducted with virtual prototypes, shortening the development cycle and reducing cost. In this paper, a model of such a machine is built in SolidWorks and imported into ADAMS for kinematic and dynamic simulation analysis. During the analysis, the feasibility of the configuration is checked, a reasonable driving motion is chosen, potential deficiencies are found, and improvements are proposed.

Design and Implementation of a Simulation Language for Robot Simulation (로봇 시뮬레이션을 위한 시뮬레이션 언어 설계 및 구현)

  • Kim, Jong-Chul;Kim, Jae-Wook;Ryu, Ki-Yeol;Lee, Jung-Tae;Borm, Jin-Hwan
    • Proceedings of the Korean Information Science Society Conference / 2006.10b / pp.595-599 / 2006
  • As robots are introduced across industrial sites to achieve rapid responsiveness in production, the cost of installing and reconfiguring robots has become a problem. To address this, OLP (Off-Line Programming), in which a virtual workcell is built on a computer and various verifications are performed through simulation, has recently come into wide use. In this study, we design a simulation language that plays a key role in building such an OLP system and implement a compiler for the language. We also implement a virtual machine that executes the compiler's output.
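The compile-then-execute pipeline described above can be sketched as a tiny stack-based virtual machine; the instruction set (PUSH/ADD/MOVE) is invented for illustration and is not the paper's actual language:

```python
# A toy VM of the kind a robot-simulation compiler might target: a stack
# machine whose MOVE instruction consumes (x, y) coordinates and records a
# robot motion target, producing a trace the simulator can replay.
def run(bytecode):
    stack, trace = [], []
    for op, *args in bytecode:
        if op == "PUSH":
            stack.append(args[0])
        elif op == "ADD":
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        elif op == "MOVE":                # consume y then x as a motion target
            y, x = stack.pop(), stack.pop()
            trace.append((x, y))
    return trace

# "Compiled" output for: MOVE TO (10, 20); MOVE TO (10 + 5, 20)
program = [("PUSH", 10), ("PUSH", 20), ("MOVE",),
           ("PUSH", 10), ("PUSH", 5), ("ADD",), ("PUSH", 20), ("MOVE",)]
print(run(program))   # -> [(10, 20), (15, 20)]
```

Separating the language from the VM this way lets the same compiled program drive either the virtual workcell or, eventually, the real controller.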


A Basic Research for the Development of Generalized Shape Guided Automatic Deburring Machine (형상안내형 범용형상자동면취기의 개발을 위한 기초연구)

  • Kim, Sang-Myng;Jung, Yoon-Gyo;Cho, Sung-Leem
    • Journal of the Korean Society of Manufacturing Process Engineers / v.11 no.3 / pp.104-109 / 2012
  • The deburring process, the last step of manufacturing, is one of the important processes for a complete product. Manual deburring can cause not only a higher error rate but also irregular product shape and quality. A Shape Guided Automatic Deburring Machine was therefore developed to resolve these problems, but it has been applied only to circular products and is difficult to apply to products of various shapes. To solve this, we aim to develop a Generalized Shape Guided Automatic Deburring Machine applicable to various shapes. To this end, we carried out modeling and design using the CATIA program and performed machine simulation.

Conceptual Design of the Vector Machine Attachable to Scalar Machine (스칼라 컴퓨터에 부착 사용가능한 벡터 머신 설계)

  • Cho, Jin-Pyo;Ko, Young-Woong;Cho, Young-Il
    • Proceedings of the Korea Information Processing Society Conference / 2005.05a / pp.1473-1476 / 2005
  • In a von Neumann (SISD) computer, which has no hardware for counting data addresses, data addressing is performed in software. Addressing the elements of vector data therefore requires creating as many index variables as there are elements, using an indexing technique. This is because only an instruction counter, the PC (program counter), is implemented in hardware, with no data counter. This study proposes the design of a hardware unit, external to the CPU and with its own structure and size, that accesses the elements of a unit vector. The proposed method was verified through simulation, and experimental results confirmed performance 12-30% better than a vector machine architecture with the same processing units.
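The contrast between software indexing and a hardware data counter can be illustrated in Python; the generator below is only an analogy for an auto-incrementing address register, not a model of the proposed hardware:

```python
# Software indexing (the SISD case): the address base + i is recomputed by
# instructions for every element access.
def software_indexed_sum(mem, base, n):
    total = 0
    for i in range(n):            # per-element address arithmetic in software
        total += mem[base + i]
    return total

# Data-counter analogy: an address register that auto-increments, streaming
# the vector's elements without per-element index instructions.
def data_counter(mem, base, n):
    addr = base                   # auto-incrementing address register
    for _ in range(n):
        yield mem[addr]
        addr += 1

mem = list(range(100))
assert software_indexed_sum(mem, 10, 5) == sum(data_counter(mem, 10, 5))
print(sum(data_counter(mem, 10, 5)))
```

Both paths read the same elements; the hardware proposal removes the index-arithmetic instructions from the critical path, which is where the reported speedup comes from.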


Research Trends in Quantum Machine Learning (양자컴퓨팅 & 양자머신러닝 연구의 현재와 미래)

  • J.H. Bang
    • Electronics and Telecommunications Trends / v.38 no.5 / pp.51-60 / 2023
  • Quantum machine learning (QML) is an area of quantum computing that leverages its principles to develop machine learning algorithms and techniques. QML aims to combine traditional machine learning with the capabilities of quantum computing to devise approaches for problem solving and (big) data processing. Nevertheless, QML is in an early stage of research and development, and more theoretical studies are needed to understand whether a significant quantum speedup can be achieved compared with classical machine learning; if so, the underlying physical principles should be explained. First, the fundamental concepts and elements of QML must be established. We describe the inception and development of QML, highlighting essential quantum computing algorithms that are integral to it. The advent of the noisy intermediate-scale quantum era and Google's demonstration of quantum supremacy are then addressed. Finally, we briefly discuss research prospects for QML.