• Title/Summary/Keyword: Agile Computing

Search Results: 31

An intelligent planner of processing equipment for CSCW-based shop floor control in agile manufacturing

  • Kim, Hwajin;Cho, Hyunbo;Jung, Mooyoung
    • Proceedings of the Korean Operations and Management Science Society Conference / 1995.04a / pp.185-192 / 1995
  • A common control model used to implement computer integrated manufacturing (CIM) is based on the hierarchical decomposition of shop floor activities, in which supervisory controllers are responsible for all interactions among subordinates. Although the hierarchical control philosophy provides for easy understanding of complex systems, an emerging manufacturing paradigm, agile manufacturing, requires a new control structure to accommodate the rapid development of a shop floor controller. This is what is called CSCW (computer supported cooperative work)-based control, or component-based heterarchical control. As computing resources and communication networks on the shop floor become increasingly intelligent and powerful, the new control architecture is about to become a reality in modern CIM systems. In this paper, CSCW-based control is adopted and investigated, in which a controller for a unit of device performs three main functions: planning, scheduling, and execution. Attention is paid to the planning function, and all the detailed planning activities for CSCW-based shop floor control are identified. Interactions with other functions are also addressed. Generally speaking, planning determines the tasks to be scheduled in the future. In other words, planning analyzes process plans and transforms them into detailed plans adequate for shop floor control. Planning is also responsible for updating the process plan and for identifying/resolving replanning activities, whether they come from scheduling or execution.


Proposal: Improvement of Testing Frontier Capability Assessment Model through Comparing International Standards in Software Product and Software Testing Process Perspective (소프트웨어 제품과 프로세스 관점에서 국제표준과 비교를 통한 테스팅 프론티어 역량평가 모델 개선 방안)

  • Yoon, Hyung-Jin;Choi, Jin-Young
    • KIISE Transactions on Computing Practices / v.21 no.2 / pp.115-120 / 2015
  • The Testing Frontier Capability Assessment Model (TCAM) is based on ISO/IEC 9126, TMMi, and TPI. Since ISO/IEC 9126, TMMi, and TPI were created over 10 years ago, TCAM cannot assess and analyze the capability of small businesses that employ new software development methods or processes, for example Agile, TDD (Test Driven Development), app software, and web software. In this paper, a method to address this problem is proposed. The paper is composed of the following sections: 1) a review of ISO/IEC 9126, ISO/IEC 25010, and ISO/IEC/IEEE 29119 part 2; 2) a review of TCAM; 3) a comparison and analysis of ISO/IEC 9126, ISO/IEC 25010, and TCAM from the software product quality perspective; 4) a comparison and analysis of ISO/IEC/IEEE 29119 part 2 and TCAM; and 5) a proposal for the improvement of TCAM.

Design and Implementation of File Cloud Server by Using JAVA SDK (Java SDK를 이용한 파일 클라우드 시스템의 설계 및 구현)

  • Lee, Samuel Sangkon
    • The Journal of Korea Institute of Information, Electronics, and Communication Technology / v.8 no.2 / pp.86-100 / 2015
  • Cloud computing is a computing term that evolved in the late 2000s, based on utility and consumption of computer resources. Google says that "Cloud computing involves deploying groups of remote servers and software networks that allow different kinds of data sources to be uploaded for real-time processing to generate computing results without the need to store processed data on the cloud. Cloud computing relies on sharing of resources to achieve coherence and economies of scale, similar to a utility (like the electricity grid) over a network. At the foundation of cloud computing is the broader concept of converged infrastructure and shared services. Cloud computing, or in simpler shorthand just "the cloud", also focuses on maximizing the effectiveness of the shared resources." The cloud service is a smart and/or intelligent service for saving private files on any device, anytime, anywhere. Services such as Dropbox, OAuth, and PAClous require that the accumulated user data be archived with a cloud service. In this paper, we suggest an implementation technique for processing many tasks on the cloud server with thread pooling. Thread pooling is an efficient implementation technique for client and server environments. To present the implementation technique, we provide three diagrams from a software engineering perspective.
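The thread-pooling technique mentioned in this abstract can be sketched with the standard java.util.concurrent API. This is a minimal illustration only; the class name and the "store a file" task are hypothetical stand-ins, not taken from the paper.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

// Minimal sketch: a fixed-size thread pool that processes many
// upload tasks concurrently, as a cloud file server might.
public class UploadPoolSketch {
    public static List<String> processAll(List<String> files) throws Exception {
        // Reuse a fixed set of worker threads instead of spawning one per task.
        ExecutorService pool = Executors.newFixedThreadPool(4);
        List<Future<String>> futures = new ArrayList<>();
        for (String f : files) {
            // Each submitted task simulates storing one file on the server.
            futures.add(pool.submit(() -> "stored:" + f));
        }
        List<String> results = new ArrayList<>();
        for (Future<String> fut : futures) {
            results.add(fut.get()); // collect results in submission order
        }
        pool.shutdown();
        return results;
    }

    public static void main(String[] args) throws Exception {
        System.out.println(processAll(List.of("a.txt", "b.txt")));
    }
}
```

The pool bounds the number of concurrent workers, which is the property that makes this pattern attractive for a server handling many client requests.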

Development of a planner of processing equipments for heterarchical SFCS (Heterarchical SFCS 를 위한 가공기계의 Planner 모듈 개발)

  • Kim, Hwa-Jin;Cho, Hyun-Bo;Jung, Moo-Young
    • Journal of Korean Institute of Industrial Engineers / v.22 no.4 / pp.719-739 / 1996
  • A common control model used to implement computer integrated manufacturing (CIM) is based on the hierarchical decomposition of shop floor activities, in which supervisory controllers are responsible for all interactions among subordinates. Although the hierarchical control philosophy provides for easy understanding of complex systems, an emerging manufacturing paradigm, agile manufacturing, requires a new control structure to accommodate the rapid development of a shop floor controller. This is what is called autonomous agent-based heterarchical control. As computing resources and communication networks on the shop floor become increasingly intelligent and powerful, the new control architecture is about to become a reality in modern CIM systems. In this paper, heterarchical control is adopted and investigated, in which a controller for a unit of device performs three main functions: planning, scheduling, and execution. Attention is paid to the planning function, and all the detailed planning activities for heterarchical shop floor control are identified. Interactions with other functions are also addressed. In general, planning determines the tasks to be scheduled in the future. In other words, planning analyzes process plans and transforms them into detailed plans adequate for shop floor control. Planning is also responsible for updating a process plan and for identifying/resolving replanning activities, whether they come from scheduling or execution.


Effectiveness Analysis of Computer Science Textbooks focusing on Digital Therapeutics

  • Eunsun Choi;Namje Park
    • Journal of Internet Computing and Services / v.25 no.3 / pp.9-18 / 2024
  • Digital therapy has emerged as a novel treatment modality, propelled by advancements in information and communication technology. In the last five years, there has been a substantial surge in research publications addressing digital therapeutics (DTx) interventions, signaling a sustained upward trajectory in this field. The dynamic nature of computer science, marked by continuous innovation and development, underscores the need for agile adaptation to rapid change. Consequently, computer science education is compelled to offer students insights into the latest trends. This research endeavors to contribute to the evolving landscape by developing textbooks that impart knowledge about DTx, an integration of information technology. The study focuses on the application of these textbooks to elementary and middle school students in South Korea. The instructional materials have been carefully organized to enable students to learn about the principles of DTx for Attention Deficit Hyperactivity Disorder (ADHD) at the elementary level, and about DTx that can prevent and address digital drama at the middle school level. Students who received instruction using the textbooks showed statistically significant improvements in all subcategories of creative problem-solving ability, including idea modification, visualization, task focus, analogy, idea generation, and elaboration (p<.01). Additionally, there were statistically significant changes in students' self-efficacy before and after using the textbooks, with negative efficacy decreasing, and positive and social efficacy increasing (p<.001).

Guidelines for Implementing Configuration Management in Extreme Programming based on CMMI (CMMI 기반의 XP를 위한 형상 관리 프로세스 구축 지침)

  • Han, Dong-Joon;Han, Hyuk-Soo
    • Journal of Internet Computing and Services / v.9 no.2 / pp.107-118 / 2008
  • XP, the representative methodology of Agile software development, maximizes the effectiveness of development by focusing on development itself and using a primitive, basic process definition that can easily be implemented in the field. However, most of XP's practices come from engineering, and the management practices for work products tend to be overlooked; research on implementing those management practices has not been performed sufficiently. Because XP lacks processes that guide change control over major baselines of work products and that describe proper continuous integration and refactoring, the integrity of those work products is difficult to guarantee. To achieve this work product integrity, CM (configuration management) should be adopted, and CMMI (Capability Maturity Model Integration) is considered the best reference for that purpose. CMMI defines the required practices of CM and leaves implementation details to the organization, so that the organization can customize those practices based on the characteristics of its development methods. CM process implementation guidelines based on CMMI can provide work product integrity while keeping XP's agility, which includes continuous integration, refactoring, and small releases. In this research, we selected the CM process factors applicable to XP from among CMMI's CM practices and, based on them, developed the CM implementation guidelines.


Proposal of Standardization Plan for Defense Unstructured Datasets based on Unstructured Dataset Standard Format (비정형 데이터셋 표준포맷 기반 국방 비정형 데이터셋 표준화 방안 제안)

  • Yun-Young Hwang;Jiseong Son
    • Journal of Internet Computing and Services / v.25 no.1 / pp.189-198 / 2024
  • AI is accepted not only in the private sector but also in the defense sector as a cutting-edge technology that must be introduced for the development of national defense. In particular, artificial intelligence has been selected as a key task in defense science and technology innovation, and the importance of data is increasing. As the national defense department shifts from a closed data policy to data sharing and activation, efforts are being made to secure the high-quality data necessary for the development of national defense. In particular, a review of the business budget system is being promoted so that related procedures can be improved to reflect the unique characteristics of AI and big data, and so that research and development can begin with sufficiently large quantities of high-quality data. However, while standardization and quality criteria need to be established for both structured and unstructured data at the national defense level, the defense department has so far proposed standardization and quality criteria only for structured data, and this gap needs to be filled. In this paper, we propose an unstructured dataset standard format for defense unstructured datasets, which are most needed in defense artificial intelligence, and, based on this, we propose a standardization method for defense unstructured datasets.

Yet Another BGP Archive Forensic Analysis Tool Using Hadoop and Hive (하둡과 하이브를 이용한 BGP 아카이브 데이터의 포렌직 분석 툴)

  • Lee, Yeonhee;Lee, YoungSeok
    • Journal of KIISE / v.42 no.4 / pp.541-549 / 2015
  • A large volume of continuously growing BGP data files raises two technical challenges regarding scalability and manageability. Due to the recent development of the open-source distributed computing infrastructure Hadoop, it has become feasible to handle a large amount of data in a scalable manner. In this paper, we present a new Hadoop-based BGP tool (BGPdoop) that provides scale-out performance as well as extensible and agile analysis capability. In particular, BGPdoop realizes a query-based BGP record exploration function using Hive on a partitioned BGP data structure, which enables flexible and versatile analytics of BGP archive files. From scalability experiments with a Hadoop cluster of 20 nodes, we demonstrate that BGPdoop achieves 5 times higher performance, as well as user-defined analysis capability, by expressing diverse BGP routing analytics in Hive queries.
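The kind of partition-aware Hive query that such an analysis relies on can be illustrated with a small sketch. The table and column names below (`bgp_updates`, `prefix`, `year`, `month`) are hypothetical examples, not taken from the paper; the helper only builds a HiveQL string whose WHERE clause names partition columns, so Hive would scan just the matching slice of the archive.

```java
// Sketch of constructing a partition-pruned HiveQL query for a
// partitioned BGP archive table. All identifiers are illustrative.
public class BgpQuerySketch {
    // Build a query restricted to one (year, month) partition that
    // ranks the ten most frequently updated prefixes in that slice.
    public static String prefixCountQuery(int year, int month) {
        return "SELECT prefix, COUNT(*) AS updates " +
               "FROM bgp_updates " +
               "WHERE year=" + year + " AND month=" + month + " " +
               "GROUP BY prefix ORDER BY updates DESC LIMIT 10";
    }

    public static void main(String[] args) {
        System.out.println(prefixCountQuery(2015, 4));
    }
}
```

Because `year` and `month` are assumed to be partition columns, Hive can skip every other partition's files entirely, which is what makes query-based exploration of a large archive tractable.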

A Hybrid Modeling Tool for Human Error Control in Collaborative Workflow (협업 워크플로우에서의 인적오류 제어를 위한 하이브리드 모델링 도구)

  • 이상영;유철중;장옥배
    • Journal of KIISE: Computing Practices and Letters / v.10 no.2 / pp.156-173 / 2004
  • From the collaborative workflow view, business processes should support the execution of collaboration processes with agility and flexibility, through the integration of applications inside and outside the enterprise with human resources. Although enterprise activities increasingly depend on automated systems, the human role is as important as ever. Workflow modeling should emphasize this human role, and a structure is needed to control human error by analyzing decision-making itself. Also, agile and effective communication should be established through the collaboration of activities, and product quality should ultimately be improved by combining and coordinating activities into the intended process. This paper classifies the human errors that can occur in collaborative workflow by applying GEMS (Generic Error Modelling System) to control them, and suggests a human error control method through hybrid-based modeling. On this basis, a collaborative workflow modeling tool is designed and implemented. Using this modeling methodology, workflow modeling can be supported in a way that accounts for the human tendency toward error, so that such errors can be controlled.

Introduction and Analysis of Open Source Software Development Methodology (오픈소스 SW 개발 방법론 소개 및 분석)

  • Son, Kyung A;Yun, Young-Sun
    • Journal of Software Assessment and Valuation / v.16 no.2 / pp.163-172 / 2020
  • Recently, concepts of Fourth Industrial Revolution technologies such as artificial intelligence, big data, and cloud computing have been introduced, and the limits of individual or team development policies are being reviewed. Also, much of the latest technology's source code has been opened to the public, and related studies are being conducted based on it. Meanwhile, companies are applying the strengths of the open source software development methodology to proprietary software development and publicly announcing support for the open source development methodology. In this paper, we introduce several software development methodologies that have been actively discussed recently, such as the open source model, the inner source model, and the similar DevOps model, and compare their characteristics and components. Rather than claiming the excellence of a specific model, we argue that if the software development policy of an individual or organization is established according to the benefits of each, software quality improvement can be achieved while satisfying customer requirements.