• Title/Summary/Keyword: Task Component


Ensemble-based deep learning for autonomous bridge component and damage segmentation leveraging Nested Reg-UNet

  • Abhishek Subedi;Wen Tang;Tarutal Ghosh Mondal;Rih-Teng Wu;Mohammad R. Jahanshahi
    • Smart Structures and Systems / v.31 no.4 / pp.335-349 / 2023
  • Bridges constantly undergo deterioration and damage, the most common ones being concrete damage and exposed rebar. Periodic inspection of bridges to identify damages can aid in their quick remediation. Likewise, identifying components can provide context for damage assessment and help gauge a bridge's state of interaction with its surroundings. Current inspection techniques rely on manual site visits, which can be time-consuming and costly. More recently, robotic inspection assisted by autonomous data analytics based on Computer Vision (CV) and Artificial Intelligence (AI) has been viewed as a suitable alternative to manual inspection because of its efficiency and accuracy. To aid research in this avenue, this study performs a comparative assessment of different architectures, loss functions, and ensembling strategies for the autonomous segmentation of bridge components and damages. The experiments lead to several interesting discoveries. The Nested Reg-UNet architecture is found to outperform five other state-of-the-art architectures in both damage and component segmentation tasks. The architecture is built by combining a Nested UNet-style dense configuration with a pretrained RegNet encoder. In terms of the mean Intersection over Union (mIoU) metric, the Nested Reg-UNet architecture provides an improvement of 2.86% on the damage segmentation task and 1.66% on the component segmentation task compared to the state-of-the-art UNet architecture. Furthermore, it is demonstrated that incorporating the Lovasz-Softmax loss function to counter class imbalance can boost performance by 3.44% in the component segmentation task over the most commonly employed alternative, weighted Cross Entropy (wCE). Finally, weighted softmax ensembling is found to be quite effective when used synchronously with the Nested Reg-UNet architecture, providing an mIoU improvement of 0.74% in the component segmentation task and 1.14% in the damage segmentation task over a single-architecture baseline. Overall, the best mIoU of 92.50% for the component segmentation task and 84.19% for the damage segmentation task validate the feasibility of these techniques for autonomous bridge component and damage segmentation using RGB images.
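
As an illustration of the ensembling and evaluation steps mentioned in this abstract, the following NumPy sketch combines per-model softmax maps with fixed weights and scores the result with mIoU. The weights, class count, and array shapes are assumed for the example and are not taken from the paper.

```python
import numpy as np

def weighted_softmax_ensemble(prob_maps, weights):
    """Combine per-model softmax maps, each (C, H, W), with scalar weights."""
    weights = np.asarray(weights, dtype=np.float64)
    weights = weights / weights.sum()                  # normalize ensemble weights
    stacked = np.stack(prob_maps, axis=0)              # (M, C, H, W)
    fused = np.tensordot(weights, stacked, axes=1)     # weighted sum -> (C, H, W)
    return fused.argmax(axis=0)                        # per-pixel class labels

def mean_iou(pred, target, num_classes):
    """Mean Intersection over Union over classes present in the images."""
    ious = []
    for c in range(num_classes):
        inter = np.logical_and(pred == c, target == c).sum()
        union = np.logical_or(pred == c, target == c).sum()
        if union > 0:
            ious.append(inter / union)
    return float(np.mean(ious))

# Toy usage with random "softmax" maps from three hypothetical models.
rng = np.random.default_rng(0)
maps = [rng.dirichlet(np.ones(4), size=(32, 32)).transpose(2, 0, 1) for _ in range(3)]
labels = rng.integers(0, 4, size=(32, 32))
pred = weighted_softmax_ensemble(maps, weights=[0.5, 0.3, 0.2])
print("mIoU:", mean_iou(pred, labels, num_classes=4))
```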

A Case Study of Active Workflow Component Architecture on Constraints Based (제약식 기반의 능동적 워크플로우 컴포넌트 아키텍쳐 사례 연구)

  • Seo, Jang-Hoon;Shim, Sang-Yong;Lee, Kun-Hyuk;Park, Myeong-Kyu
    • Proceedings of the Safety Management and Science Conference / 2006.11a / pp.415-426 / 2006
  • Many technical and nontechnical issues hinder enterprise-wide workflow management. The most significant technical issue is the inability to deal with the heterogeneity among users, workflow types, and WFMSs. Not all users demand the same workflow functionality, so user interfaces of different levels of sophistication are required. Because workflow types cannot always be fully predefined, they often need to be adjusted or extended during execution. Unlike relational database management systems, however, each WFMS often has a differing workflow metamodel. This leads to incompatibility between WFMSs, making integration into an environment comprising many heterogeneous WFMSs a troublesome and sometimes impossible task. Current workflow systems are built mainly on database systems and suffer from problems such as the inability to properly express the integration relationships among system processes. This research focuses on two phases that should be considered in a workflow system. The first is the analysis phase, one role of which is to identify independently executable task units (workflow components). The second is the design phase, which provides a framework for executing these task units actively. In the analysis phase, workflow components are extracted using an analysis method called C-C Net; in the design phase, an architecture that allows these workflow components to be executed actively is provided. Through this research, each process is divided into task units, and a more effective workflow system can be formed by executing these units actively. Whereas the current system layer calls task units directly, the workflow system implemented in this research provides an architecture that places a layer between them which controls the task units actively.
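
The abstract gives no code, so the following Python sketch only illustrates the general idea of independently executable task units driven by an intermediate control layer that activates them once their dependency constraints are satisfied; all class and method names are hypothetical.

```python
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class TaskUnit:
    """An independently executable workflow component (hypothetical interface)."""
    name: str
    depends_on: List[str]
    run: Callable[[], None]

class ActiveControlLayer:
    """Intermediate layer that actively dispatches task units when their
    dependency constraints are met, instead of being called by the system layer."""
    def __init__(self):
        self.units: Dict[str, TaskUnit] = {}
        self.done: set = set()

    def register(self, unit: TaskUnit):
        self.units[unit.name] = unit

    def execute_all(self):
        pending = set(self.units)
        while pending:
            ready = [n for n in pending
                     if all(d in self.done for d in self.units[n].depends_on)]
            if not ready:
                raise RuntimeError("unsatisfiable dependency constraints")
            for name in ready:
                self.units[name].run()
                self.done.add(name)
                pending.remove(name)

# Toy usage: a two-step workflow executed by the control layer.
layer = ActiveControlLayer()
layer.register(TaskUnit("extract", [], lambda: print("extract order data")))
layer.register(TaskUnit("approve", ["extract"], lambda: print("approve order")))
layer.execute_all()
```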


Automatic assessment of post-earthquake buildings based on multi-task deep learning with auxiliary tasks

  • Zhihang Li;Huamei Zhu;Mengqi Huang;Pengxuan Ji;Hongyu Huang;Qianbing Zhang
    • Smart Structures and Systems / v.31 no.4 / pp.383-392 / 2023
  • Post-earthquake building condition assessment is crucial for subsequent rescue and remediation and can be automated by emerging computer vision and deep learning technologies. This study is based on an endeavour for the 2nd International Competition of Structural Health Monitoring (IC-SHM 2021). The task package includes five image segmentation objectives - defects (crack/spall/rebar exposure), structural component, and damage state. The structural component and damage state tasks are identified as the priorities that can form actionable decisions. A multi-task Convolutional Neural Network (CNN) is proposed to conduct the two major tasks simultaneously. The remaining three sub-tasks (spall/crack/rebar exposure) are incorporated as auxiliary tasks. By synchronously learning defect information (spall/crack/rebar exposure), the multi-task CNN model outperforms the counterpart single-task models in recognizing structural components and estimating damage states. In particular, the pixel-level damage state estimation sees an mIoU (mean intersection over union) improvement from 0.5855 to 0.6374. For the defect detection tasks, rebar exposure is omitted due to its extremely biased sample distribution. The segmentation of crack and spall is automated by single-task U-Net, but with extra effort to resample the provided data. The segmentation of small objects (spall and crack) benefits from the resampling method, with a substantial IoU increment of nearly 10%.
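
The multi-task setup described above (a shared encoder, two main segmentation heads, and auxiliary defect heads) can be sketched roughly as below in PyTorch. The layer sizes, class counts, and auxiliary loss weight are assumptions for illustration, not the competition entry's actual network.

```python
import torch.nn as nn

class MultiTaskSegNet(nn.Module):
    """Schematic shared-encoder network with one head per main task (component,
    damage state) plus binary auxiliary defect heads; sizes are illustrative."""
    def __init__(self, n_component=5, n_damage=4, n_aux=3):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
        )
        self.component_head = nn.Conv2d(32, n_component, 1)
        self.damage_head = nn.Conv2d(32, n_damage, 1)
        # one binary auxiliary head per defect (crack / spall / rebar exposure)
        self.aux_heads = nn.ModuleList(nn.Conv2d(32, 2, 1) for _ in range(n_aux))

    def forward(self, x):
        feats = self.encoder(x)
        return (self.component_head(feats),
                self.damage_head(feats),
                [h(feats) for h in self.aux_heads])

def total_loss(outputs, targets, aux_weight=0.3):
    """Joint loss: the two main tasks plus down-weighted auxiliary tasks
    (the weighting scheme here is assumed, not taken from the paper)."""
    ce = nn.CrossEntropyLoss()
    comp, dmg, aux = outputs
    loss = ce(comp, targets["component"]) + ce(dmg, targets["damage"])
    for out, tgt in zip(aux, targets["aux"]):
        loss = loss + aux_weight * ce(out, tgt)
    return loss
```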

Independent Component Analysis of the Event-Related Potential during Visual Oddball Tasks with Multiple Difficulty Levels (다중 난이도를 갖는 시각적 Oddball 작업 수행 시 사상관련전위의 독립요소분석)

  • Kim, Ja-Hyun;Yoon, Jin;Kim, Kyung-Hwan
    • Journal of Biomedical Engineering Research / v.29 no.1 / pp.73-81 / 2008
  • The purpose of this study is to observe brain activity patterns during visual oddball tasks with two difficulty levels through the analysis of high-density event-related potentials (ERP). Along with conventional statistical analysis of averaged ERP waveforms, we applied independent component analysis (ICA) for individual, single-trial analysis and verified its effectiveness. We identified multiple ERP components, such as early visual components (P1, N1), and two components that appear to be important task-related components and showed difficulty-dependent variability (P2, P300). The P2 was found around the central region at 180~220 ms, and the P300 was found globally at 300~500 ms post-stimulus. As the task became more difficult, the P2 amplitude increased and the P300 amplitude decreased. After single-trial ERPs were decomposed into multiple independent components (ICs), several ICs originating from P2 and P300 sources were identified. These ICs were projected onto the scalp electrodes, and the projected ICs were statistically compared across the two task difficulties. For most subjects, the single-trial/individual analysis using ICA showed amplitude-change tendencies similar to those of the averaged ERP analysis. The temporal pattern and number of ICs corresponding to the μ rhythm were not dependent on task difficulty, which suggests that the motor response was not affected by the task difficulty.
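
A rough sketch of the single-trial ICA workflow the abstract describes (decompose epochs into independent components, then back-project a component onto the scalp electrodes), using scikit-learn's FastICA on synthetic data; the channel, trial, and component counts are arbitrary stand-ins.

```python
import numpy as np
from sklearn.decomposition import FastICA

# Synthetic single-trial ERP data: (trials, channels, samples); real data would
# come from a high-density EEG recording.
rng = np.random.default_rng(0)
n_trials, n_channels, n_samples = 40, 32, 256
epochs = rng.standard_normal((n_trials, n_channels, n_samples))

# Concatenate trials along time and decompose into independent components.
X = epochs.transpose(1, 0, 2).reshape(n_channels, -1).T   # (time, channels)
ica = FastICA(n_components=16, random_state=0)
sources = ica.fit_transform(X)                            # (time, components)

# Back-project a single IC (e.g., a putative P300 source) onto the scalp
# electrodes; its column of the mixing matrix is the scalp topography.
k = 0
topography = ica.mixing_[:, k]                            # (channels,)
projected = np.outer(sources[:, k], topography)           # (time, channels)
print(projected.shape)
```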

Automatic P300 Detection using ICA with Reference (Reference를 갖는 ICA를 이용한 자동적 P300 검출)

  • Park, Heeyoul;Park, Seungjin
    • Proceedings of the Korean Information Science Society Conference / 2003.04c / pp.193-195 / 2003
  • The analysis of EEG data is an important task in the domain of Brain Computer Interface (BCI). In general, this task is extremely difficult because EEG data are very noisy, contain many artifacts, and consist of mixtures of several brain waves. The P300 component of the evoked potential is a relatively evident signal: a large positive wave that occurs around 300 msec after a task-relevant stimulus. Thus, automatic detection of P300 is useful in BCI. To this end, in this paper we employ a method of reference-based independent component analysis (ICA), which overcomes the ordering ambiguity of conventional ICA. We show here that ICA incorporating prior knowledge is useful in the task of automatic P300 detection.
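
The constrained "ICA with reference" method itself is not reproduced here; the sketch below only approximates the idea by running ordinary FastICA and then using a P300-like reference template to resolve the ordering ambiguity, i.e., to select the component most correlated with the reference. All signal parameters are synthetic.

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(1)
fs, n_channels, n_samples = 256, 16, 256                  # one 1-s epoch
t = np.arange(n_samples) / fs

# Reference: a positive Gaussian bump centred near 300 ms post-stimulus.
reference = np.exp(-0.5 * ((t - 0.3) / 0.05) ** 2)

# Synthetic EEG: a P300-like source mixed into noisy channels.
source = reference + 0.3 * rng.standard_normal(n_samples)
mixing = rng.standard_normal((n_channels, 1))
eeg = mixing @ source[None, :] + 0.5 * rng.standard_normal((n_channels, n_samples))

ica = FastICA(n_components=8, random_state=0)
components = ica.fit_transform(eeg.T).T                   # (components, time)

# Use the reference to pick the P300 candidate among the unordered components.
corr = [abs(np.corrcoef(c, reference)[0, 1]) for c in components]
best = int(np.argmax(corr))
print(f"P300 candidate: component {best}, |r| = {corr[best]:.2f}")
```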


CompGenX: Component Code Generation System based on GenVoca and XML (CompGenX: GenVoca와 XML 기반의 컴포넌트 코드 생성 시스템)

  • Choi Seung-Hoon
    • Journal of Internet Computing and Services / v.4 no.3 / pp.57-67 / 2003
  • Software product lines aim to attain the rapid development of quality applications by concretizing the general components populated in software assets and assembling them according to predefined architectures. To support the construction of software product lines, this paper proposes a component code generation technique based on the GenVoca architecture and XML/XSLT technologies. In addition, CompGenX (Component Generator using XML), a component code generation system, is proposed on the basis of this technique. By providing reconfigurability of components at code generation time, CompGenX allows reusers to create component source code appropriate to their purpose. In this system, the process of component development is divided into two tasks: the component family construction task and the component reuse task. For component family construction, CompGenX provides a feature modeling tool for domain analysis and a domain architecture definition tool. It also provides tools for building the component configuration knowledge specification and the code templates. For the component reuse task, it offers a component family search tool, a component customizing tool, and a component code generator. The component code generation technique and system presented in this paper should be applicable as a basic technology for building component-based software product lines.
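
A toy Python sketch of the generation step: an XML component configuration drives substitution into a code template. The paper uses XSLT templates; a string.Template stands in here only to illustrate the idea, and the element names and generated class are hypothetical.

```python
import xml.etree.ElementTree as ET
from string import Template

# Hypothetical component configuration, standing in for the XML-based
# configuration knowledge a reuser would supply.
config_xml = """
<component name="OrderQueue">
  <feature>persistence</feature>
  <feature>logging</feature>
</component>
"""

# Code template with placeholders that the configuration fills in.
template = Template(
    "class ${name}:\n"
    "    features = ${features}\n"
    "    def describe(self):\n"
    "        return '${name} with ' + ', '.join(self.features)\n"
)

root = ET.fromstring(config_xml)
code = template.substitute(
    name=root.get("name"),
    features=[f.text for f in root.findall("feature")],
)
print(code)   # generated component source code
```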


Analysis Task Scheduling Models based on Hierarchical Timed Marked Graph

  • Ro, Cheul-Woo;Cao, Yang
    • International Journal of Contents / v.6 no.3 / pp.19-24 / 2010
  • Task scheduling has become an integral component of computing with the emergence of grid computing. In this paper, we address two different task scheduling models, the static Round-Robin (RR) and the dynamic Fastest Site First (FSF) methods, using extended timed marked graphs, a special case of Stochastic Petri Nets (SPN). Stochastic Reward Nets (SRN) are an extension of SPN and provide compact modeling facilities for system analysis. We build hierarchical SRN models to compare the two task scheduling methods. The upper-level model simulates task scheduling, and the lower-level model implements the task serving process for different sites with multiple servers. We compare these two models and analyze their performance by defining reward measures in the SRN.
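
A plain discrete simulation (not an SRN model) contrasting the two scheduling policies: Round-Robin cycles through the sites in a fixed order, while Fastest Site First dispatches each task to the site with the earliest projected completion. Site service times and the task count are assumed values.

```python
import itertools

# Assumed per-task service times at three hypothetical sites.
service_time = {"siteA": 1.0, "siteB": 2.0, "siteC": 4.0}
tasks = range(12)

def round_robin(tasks, service_time):
    """Static policy: hand tasks to sites in a fixed cyclic order."""
    finish = {s: 0.0 for s in service_time}
    for _, site in zip(tasks, itertools.cycle(service_time)):
        finish[site] += service_time[site]
    return max(finish.values())   # makespan

def fastest_site_first(tasks, service_time):
    """Dynamic policy: each task goes to the site that would finish it earliest."""
    finish = {s: 0.0 for s in service_time}
    for _ in tasks:
        site = min(finish, key=lambda s: finish[s] + service_time[s])
        finish[site] += service_time[site]
    return max(finish.values())

print("RR  makespan:", round_robin(tasks, service_time))
print("FSF makespan:", fastest_site_first(tasks, service_time))
```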

Component Software Architecture for Embedded Controller (내장형 제어기를 위한 컴포넌트 소프트웨어 아키텍처)

  • 송오석;김동영;전윤호;이윤수;홍선호;신성훈;최종호
    • Institute of Control, Robotics and Systems Conference Proceedings / 2000.10a / pp.8-8 / 2000
  • PICARD (Port-Interface Component Architecture for Real-time system Design) is a software architecture and environment aimed at reducing the development time and cost of real-time control systems. With PICARD, a control engineer can construct control system software by assembling pre-built software components using an interactive graphical development environment. PICARD consists of PVM (Picard Virtual Machine), a component library, and PICE (PIcard Configuration Editor). PVM is the real-time engine of the PICARD system, which runs control tasks on a real-time operating system. The component library is composed of components called task blocks. PICE is a visual editor that can configure control tasks by creating data-flow diagrams of task blocks or Ladder diagrams for sequential logic. For communication between PVM on a target system and PICE on a host computer, a simple protocol and tools for stub generation were developed, because RPC or CORBA is difficult to apply to embedded systems. New features such as a byte-code-based runtime system and a simple and easy MMI builder are also introduced.
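
A schematic Python sketch of a port-interface task block wired into a data-flow executor, loosely analogous to the task blocks and PVM described above; none of the names correspond to the actual PICARD API.

```python
from typing import Callable, Dict, List

class TaskBlock:
    """A port-interface component: named input/output ports plus an update
    function (hypothetical analogue of a PICARD task block)."""
    def __init__(self, name: str, inputs: List[str], outputs: List[str],
                 update: Callable[[Dict[str, float]], Dict[str, float]]):
        self.name = name
        self.inputs = inputs
        self.outputs = outputs
        self.update = update

class DataflowVM:
    """Executes task blocks in wiring order once per control cycle."""
    def __init__(self):
        self.blocks: List[TaskBlock] = []
        self.signals: Dict[str, float] = {}

    def add(self, block: TaskBlock):
        self.blocks.append(block)

    def step(self):
        for b in self.blocks:
            ins = {k: self.signals.get(k, 0.0) for k in b.inputs}
            self.signals.update(b.update(ins))

# Toy control loop: sensor -> proportional gain -> actuator command.
vm = DataflowVM()
vm.add(TaskBlock("sensor", [], ["y"], lambda _: {"y": 0.8}))
vm.add(TaskBlock("p_gain", ["y"], ["u"], lambda i: {"u": 2.0 * (1.0 - i["y"])}))
for _ in range(3):
    vm.step()
print(vm.signals)
```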


A study on the variables affecting on human performance in information processing tasks and its application to job placement (정보처리작업에서의 인간수행도 관련 변수와 직무배치에의 활용)

  • 이상도;손일문
    • Journal of the Ergonomics Society of Korea / v.14 no.1 / pp.25-35 / 1995
  • For information processing tasks, it is an important cognitive skill to manipulate and store information, which is known as information intake. One of the tasks that heavily involves this skill is a spreadsheet calculation task. In this study, a spreadsheet calculation task is analyzed by cognitive task analysis based on the cognitive factors that have been used in models of human information processing. Based on the results of the cognitive task analysis, the spreadsheet calculation tasks to be used in the experiments are designed and the test battery for cognitive abilities assessment (CCAB; Complex Cognitive Assessment Battery) is selected. Then, the features of cognitive demands and a human performance model of the spreadsheet calculation task are derived by means of correlation analysis, principal component factor analysis, and regression analysis of the results of the experiments on task performance and the assessment of cognitive abilities. The application of the results to job placement and further research issues are also described.
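
A small scikit-learn sketch of the analysis pipeline named in the abstract (principal component factor analysis of ability scores followed by regression of task performance on the retained factors), run on synthetic stand-in data rather than the study's CCAB measurements.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

# Synthetic stand-in data: 30 subjects x 6 cognitive subtest scores and one
# spreadsheet-task performance score (assumed values, not the study's data).
rng = np.random.default_rng(0)
abilities = rng.standard_normal((30, 6))
loadings = np.array([0.6, 0.4, 0.2, 0.0, 0.0, 0.1])        # assumed weights
performance = abilities @ loadings + 0.3 * rng.standard_normal(30)

# Principal component factor analysis of the ability scores ...
pca = PCA(n_components=2)
factors = pca.fit_transform(abilities)
print("explained variance ratios:", pca.explained_variance_ratio_)

# ... followed by regression of task performance on the retained factors.
model = LinearRegression().fit(factors, performance)
print("R^2 of the performance model:", model.score(factors, performance))
```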


The Influence of Gender Schema on Children's Memory and Preference for Gender Related Tasks (아동의 성 도식과 성관련 과제의 기억 및 선호)

  • Chung, Soon Hwa;Chung, Ock Boon
    • Korean Journal of Child Studies / v.15 no.1 / pp.37-53 / 1994
  • The primary purpose of this study was to investigate the validity of a component model of gender role and differences in children's gender concepts with age and sex. The secondary purpose was to investigate the relationship between children's gender schema and memory as well as preference for gender related tasks. A total of 181 children were interviewed about gender concepts and gender related tasks. Results indicated that three dimensions of the component model (i.e., gender label-component links, within-component links, between-component links) were significantly related to each other. The mean scores of gender role knowledge and attitude were different with age but not with sex. The results of the regression analysis showed that children's age, sex, and gender role attitude explained both memory and preference for gender related tasks. The component model had better explanatory power than the simple model. The findings of the present study suggest that children's gender concepts are better described in terms of the component model than the simple model and may contribute to a theoretical rationale for gender schema theory.
