• Title/Summary/Keyword: data complexity

Search results: 2,379 items (processing time: 0.034 seconds)

The Effects of Emergent Leader on Team Cognitive Complexity and Team Performance

  • Choi, Kyoosang
    • Journal of the Korean Data Analysis Society / Vol. 20, No. 6 / pp.2781-2792 / 2018
  • From a cognitive perspective, this study investigates the role of emergent leaders in developing team cognition and affecting team performance. Applying cognitive complexity theory, this study hypothesizes that emergent leaders' cognitive complexity will be positively associated with team cognitive complexity, and that team cognitive complexity will be positively associated with team performance. In addition, team cognitive complexity is hypothesized to mediate the effect of the cognitive complexity of emergent leaders on team performance. To test the research hypotheses, data were obtained from 100 teams comprising a total of 339 undergraduate students who participated in a business simulation game. The findings of this study suggest that the cognitive complexity of emergent leaders is a significant predictor of team cognitive complexity, and that team cognitive complexity is positively related to team performance. Moreover, team cognitive complexity significantly mediates the effect of emergent leaders' cognitive complexity on team performance.
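
The mediation hypothesis above (leader cognitive complexity → team cognitive complexity → team performance) is commonly tested with regression-based mediation analysis. A minimal sketch on synthetic data follows; the variable names, effect sizes, and the simple OLS approach are illustrative assumptions, not the study's actual procedure:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100                                      # e.g. 100 teams, as in the study
# Synthetic stand-ins for the three constructs (names are illustrative):
leader_cc = rng.normal(size=n)                               # predictor
team_cc = 0.6 * leader_cc + rng.normal(scale=0.5, size=n)    # mediator
perf = 0.7 * team_cc + rng.normal(scale=0.5, size=n)         # outcome

def ols(predictors, y):
    """OLS slope estimates for y ~ predictors (intercept dropped from output)."""
    X = np.column_stack([np.ones(len(y))] + list(predictors))
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta[1:]

a = ols([leader_cc], team_cc)[0]          # path a: leader CC -> team CC
b = ols([team_cc, leader_cc], perf)[0]    # path b: team CC -> perf, leader CC controlled
total = ols([leader_cc], perf)[0]         # total effect of leader CC on perf
indirect = a * b                          # mediated (indirect) effect
print(f"a={a:.2f}  b={b:.2f}  total={total:.2f}  indirect={indirect:.2f}")
```

A significance test for the indirect effect (e.g. bootstrapping `a * b`) would follow the same structure.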

슬라이스 복잡도 측정을 위한 VFG의 사용 (The Use of VFG for Measuring the Slice Complexity)

  • 문유미;최완규;이성주
    • 한국정보통신학회논문지 / Vol. 5, No. 1 / pp.183-191 / 2001
  • This paper develops a new representation of data slices, called the data Value Flow Graph (VFG), for modeling the information flow within a data slice. A slice complexity measure is then defined on top of an existing flow complexity measure to quantify the complexity of the information flow in the VFG. The study shows the relationship between the complexity of each individual slice and the total slice complexity, and proves the scale factors of the slice complexity measure by means of atomic modifications and a concatenation operator on the VFG.
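
The paper's exact VFG definitions are not reproduced in the abstract, but the general idea of scoring a slice by the information flow through its value-flow edges can be sketched as follows. The adjacency-list encoding and the fan-in × fan-out flow score (in the spirit of classic information-flow measures) are assumptions for illustration, not the paper's definitions:

```python
# A data slice's value flows as adjacency lists: variable -> variables that
# receive its value (a toy slice of a summation loop).
slice_sum = {"n": ["sum"], "i": ["sum", "i"], "sum": ["sum", "print"]}

def flow_complexity(vfg):
    """Information-flow score of one slice: sum over nodes of fan_in * fan_out."""
    fan_in = {}
    for src, dsts in vfg.items():
        for d in dsts:
            fan_in[d] = fan_in.get(d, 0) + 1
    return sum(fan_in.get(v, 0) * len(dsts) for v, dsts in vfg.items())

def total_complexity(slices):
    """Total slice complexity as the sum over the individual slices."""
    return sum(flow_complexity(s) for s in slices)
```

Under this toy measure, concatenating two slices sums their scores, which is the kind of scale behavior the paper proves for its own measure.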


비선형 동역학적 방법을 통한 뇌파 복잡도와 임피던스 심장기록법(ICG) 지표와의 상관성 연구 (A Study on the Correlationship between EEG Complexity by Nonlinear Dynamics Analysis and Impedance Cardiography)

  • 유재민;박영배;박영재
    • 대한한의진단학회지 / Vol. 11, No. 2 / pp.128-140 / 2007
  • Purpose: We performed this study to examine the correlation between EEG complexity and impedance cardiography (ICG) data using correlation analysis. Method: The study was performed on 30 healthy subjects (16 males, 14 females). ICG data were recorded before and after natural respiration, and raw EEG data were measured with moving windows over 15 minutes. The correlation dimension (D2) was calculated from the 15-minute data. The 8 channels of EEG data were analyzed against 9 ICG indices by correlation analysis. Result: 1. ACI from impedance cardiography had a significant correlation with ch.4 EEG complexity (p=0.03). 2. VI from impedance cardiography had significant correlations with ch.3 (p=0.034) and ch.4 (p=0.017) EEG complexity. 3. HR, TFC, PEP, LVET and STR from impedance cardiography had no significant correlation with any of the 8 EEG channels. Conclusions: These results suggest that nonlinear analysis of the EEG and impedance cardiography show some significant correlations, which may help relate the brain system to the cardiovascular system. Further studies in this field are therefore necessary.
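
The channel-by-index correlation analysis described above can be sketched on synthetic data as follows; the D2 values, the ICG indices, and the effect structure are simulated stand-ins, not the study's measurements:

```python
import numpy as np

rng = np.random.default_rng(1)
n_subjects = 30                     # 30 healthy subjects, as in the study
# Synthetic stand-ins: correlation dimension (D2) per EEG channel...
eeg_d2 = {f"ch{c}": rng.normal(5.0, 0.5, n_subjects) for c in range(1, 9)}
# ...and two ICG indices, with VI made to covary with ch.4 for illustration.
icg = {
    "ACI": rng.normal(size=n_subjects),
    "VI": 0.5 * eeg_d2["ch4"] + rng.normal(scale=0.5, size=n_subjects),
}

def pearson_r(x, y):
    """Pearson correlation coefficient between two samples."""
    return float(np.corrcoef(x, y)[0, 1])

# Channel-by-index correlation table, mirroring the paper's analysis design.
table = {(ch, idx): pearson_r(d2, vals)
         for ch, d2 in eeg_d2.items() for idx, vals in icg.items()}
```

In practice each r would be accompanied by a p-value (e.g. via `scipy.stats.pearsonr`) to judge significance as the paper does.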


슬라이스 기반 복잡도 척도 (A Slice-based Complexity Measure)

  • 문유미;최완규;이성주
    • 정보처리학회논문지D / Vol. 8D, No. 3 / pp.257-264 / 2001
  • This paper develops the SIFG (Slice-based Information Flow Graph), which models the information flow in a program based on the flow of data tokens in its data slices. The SCM (Slice-based Complexity Measure) is then defined to measure program complexity through the complexity of the information flow in the SIFG. SCM satisfies the properties that Briand proposes for complexity metrics and, unlike existing measures, reflects not only the control and data flow within a program but also its physical size.


An XPDL-Based Workflow Control-Structure and Data-Sequence Analyzer

  • Kim, Kwanghoon Pio
    • KSII Transactions on Internet and Information Systems (TIIS) / Vol. 13, No. 3 / pp.1702-1721 / 2019
  • A workflow (or business) process management system helps to define, execute, monitor and manage workflow models deployed in a workflow-supported enterprise, and is generally compartmentalized into a modeling subsystem and an enacting subsystem. The modeling subsystem discovers and analyzes workflow models via a theoretical modeling methodology such as ICN, defines them graphically via a representation notation such as BPMN, and deploys the graphically defined models onto the enacting subsystem by transforming them into textual models represented in a standardized workflow process definition language such as XPDL. Before deploying such workflow models, it is very important to inspect their syntactical correctness as well as their structural properness, so as to minimize the loss of effectiveness and efficiency in managing them. This paper is particularly concerned with verifying very large-scale and massively parallel workflow models, which call for a sophisticated analyzer that can handle these specialized and complex workflow models automatically. The analyzer devised in this paper analyzes not only structural complexity but also data-sequence complexity. Structural complexity is based on the combinational usage of control-structure constructs such as subprocess, exclusive-OR, parallel-AND and iterative-LOOP primitives, with the matched-pairing and proper-nesting properties preserved, whereas data-sequence complexity is based on the combinational usage of the relevant data repositories, namely data-definition sequences and data-use sequences. Through the analyzer devised and implemented in this paper, we can systematically verify the syntactical correctness and effectively validate the structural properness of such complicated, large-scale workflow models. As an experimental study, we apply the implemented analyzer to an exemplary large-scale and massively parallel workflow process model, the Large Bank Transaction Workflow Process Model, and show the structural complexity analysis results via a series of operational screens captured from the implemented analyzer.
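
The matched-pairing and proper-nesting properties mentioned above can be checked with a standard stack-based scan over a model's control-structure tokens; the token names below are illustrative assumptions for this sketch, not XPDL syntax:

```python
# Every split primitive must be closed by a join of the same kind, in LIFO
# order — the matched-pairing and proper-nesting properties the analyzer checks.
PAIRS = {"xor-split": "xor-join", "and-split": "and-join", "loop-begin": "loop-end"}

def properly_nested(tokens):
    """Return True iff splits and joins pair up with proper nesting."""
    stack = []
    for t in tokens:
        if t in PAIRS:                    # an opening (split) construct
            stack.append(PAIRS[t])        # remember which join must close it
        elif t in PAIRS.values():         # a closing (join) construct
            if not stack or stack.pop() != t:
                return False              # mismatched or crossing pair
    return not stack                      # every opened construct was closed

properly_nested(["and-split", "xor-split", "xor-join", "and-join"])  # → True
properly_nested(["and-split", "xor-split", "and-join", "xor-join"])  # → False
```

The second call fails because the XOR and AND blocks cross rather than nest, which is exactly the structural improperness the analyzer is meant to flag.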

Low-Complexity MPEG-4 Shape Encoding towards Realtime Object-Based Applications

  • Jang, Euee-Seon
    • ETRI Journal / Vol. 26, No. 2 / pp.122-135 / 2004
  • Although frame-based MPEG-4 video services have been successfully deployed since 2000, MPEG-4 video coding now faces great competition in becoming a dominant player in the market. Object-based coding is one of the key functionalities of MPEG-4 video coding, and real-time object-based video encoding is also important for multimedia broadcasting in the near future. Object-based video services using MPEG-4 have not yet made a successful debut for several reasons, one critical problem being the additional coding complexity of object-based over frame-based video coding. Since a video object is described with an arbitrary shape, the bitstream contains not only motion and texture data but also shape data, which introduces additional complexity on the decoder side as well as the encoder side. In this paper, we analyze the current MPEG-4 video encoding tools and propose efficient coding technologies that reduce the complexity of the encoder. Using the proposed coding schemes, we obtain a 56 percent reduction in shape-coding complexity over the MPEG-4 video reference software (Microsoft version, 2000 edition).


Software Defined Networking and Network Function Virtualization for improved data privacy using the emergent blockchain in banking systems

  • ALRUWAILI, Anfal;Hendaoui, Saloua
    • International Journal of Computer Science & Network Security / Vol. 21, No. 8 / pp.111-118 / 2021
  • Banking systems are sensitive to data privacy, since users' data, if not well protected, may be used to perform fake transactions. Blockchains, public and private, are frequently used in such systems thanks to their efficiency and high security. Public blockchains fail to fully protect users' data, despite their strength in the accuracy of transactions. Private blockchains are better suited to protecting the privacy of sensitive data: they are not open, and they require authorization to log in. However, they offer lower security than public blockchains. In this paper we propose a hybrid public-private architecture that profits from network virtualization. The main novelty of this proposal is the use of network virtualization, which helps to reduce the complexity of the computations and improve their efficiency. Simulations have been conducted to evaluate the performance of the proposed solution. The findings prove the efficiency of the scheme in reducing complexity and enhancing data privacy while guaranteeing high security. A further contribution is that the results are verified by a centralized controller, which ensures correct validation of the resulting blockchains. In addition, computation complexity is reduced by profiting from the cooperation performed by the virtual agents.

Predicting Learning Achievements with Indicators of Perceived Affordances Based on Different Levels of Content Complexity in Video-based Learning

  • Dasom KIM;Gyeoun JEONG
    • Educational Technology International / Vol. 25, No. 1 / pp.27-65 / 2024
  • The purpose of this study was to identify differences in learning patterns according to content complexity in video-based learning environments and to derive variables that have an important effect on learning achievement within particular learning contexts. To achieve our aims, we observed and collected data on learners' cognitive processes through perceived affordances, using behavioral logs and eye movements as specific indicators. These two types of reaction data were collected from 67 male and female university students who watched two learning videos classified by task complexity through the video learning player. The results showed that when content complexity was low, learners tended to navigate using other learners' digital logs, whereas when it was high, they tended to control the learning process and generate their own logs directly. In addition, using prediction models derived for each level of content complexity, we identified the variables related to video playback and annotation as important influences on learning achievement in the low-complexity group, and the variables related to active navigation of the learning video as important in the high-complexity group. This study attempts not only to apply novel variables in the field of educational technology, but also to provide qualitative observations on the learning process based on a quantitative approach.
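
A prediction model of the kind described, relating log-based features to achievement, can be sketched with ordinary least squares on synthetic data; the feature names, effect sizes, and the OLS choice are illustrative assumptions, not the study's variables or modeling method:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 67                                     # 67 participants, as in the study
# Synthetic stand-ins for log-based features (names are illustrative):
X = np.column_stack([
    rng.poisson(10, n),                    # video playback events
    rng.poisson(4, n),                     # annotations written
    rng.poisson(6, n),                     # navigation jumps
])
# Simulated achievement driven mainly by annotation and playback behavior:
achievement = 2.0 * X[:, 1] + 0.5 * X[:, 0] + rng.normal(scale=2.0, size=n)

# Least-squares fit; coefficient magnitudes hint at variable importance.
A = np.column_stack([np.ones(n), X])
coef, *_ = np.linalg.lstsq(A, achievement, rcond=None)
pred = A @ coef
r2 = 1 - np.sum((achievement - pred) ** 2) / np.sum((achievement - achievement.mean()) ** 2)
```

Fitting such a model separately for the low- and high-complexity groups, then comparing which coefficients dominate, mirrors the study's group-wise comparison of important variables.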

Impossible Differential Cryptanalysis on DVB-CSA

  • Zhang, Kai;Guan, Jie;Hu, Bin
    • KSII Transactions on Internet and Information Systems (TIIS) / Vol. 10, No. 4 / pp.1944-1956 / 2016
  • The Digital Video Broadcasting-Common Scrambling Algorithm (DVB-CSA) is an ETSI-designated algorithm designed for protecting MPEG-2 signal streams, and it is universally used. Its structure is a typical hybrid symmetric cipher that combines a stream cipher part and a block cipher part. Although the key entropy is only 64 bits, no effective cryptanalytic results have been published so far. This paper studies the security of CSA against impossible differential cryptanalysis: a 20-round impossible differential for the block cipher part is proposed, and a flaw in the cipher structure is revealed. When the block cipher part (CSA-BC) reduced to 21 rounds is attacked alone, 16 bits of the initial key can be recovered with data complexity O(2^44.5), computational complexity O(2^22.7) and memory complexity O(2^10.5). Based on the structural flaw, an attack on CSA with the block cipher part reduced to 21 rounds is proposed with computational complexity O(2^21.7), data complexity O(2^43.5) and memory complexity O(2^10.5), recovering 8 bits of the key. Taking both the block cipher part and the stream cipher part of CSA into consideration, this is, as far as we know, currently the best accessible result on CSA.
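
The key-elimination principle behind impossible differential cryptanalysis can be illustrated with a toy one-round cipher; this is not the CSA attack itself, and the S-box, round function, and "impossible" difference below are all toy assumptions:

```python
import random

rng = random.Random(42)
SBOX = list(range(256))
rng.shuffle(SBOX)                 # a fixed random 8-bit bijection (toy S-box)
INV = [0] * 256
for i, s in enumerate(SBOX):
    INV[s] = i

def last_round(x, k):             # forward: substitute, then XOR the round key
    return SBOX[x] ^ k

def peel(c, k):                   # partial decryption under a guessed key
    return INV[c ^ k]

IMPOSSIBLE = 0x01                 # toy premise: this pre-round difference never occurs

def surviving_keys(ct_pairs):
    """Discard every key guess under which some pair yields the impossible difference."""
    keys = set(range(256))
    for c1, c2 in ct_pairs:
        keys -= {k for k in keys if peel(c1, k) ^ peel(c2, k) == IMPOSSIBLE}
    return keys

true_k = 0x5A
# Pairs whose real pre-round difference is 0x80, so never the impossible 0x01:
pairs = [(last_round(x, true_k), last_round(x ^ 0x80, true_k)) for x in range(128)]
survivors = surviving_keys(pairs)
```

The true key can never be eliminated, while each wrong guess is discarded with some probability per pair; in the real attack this filtering, applied over O(2^43.5) data, is what drives the stated complexities.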

직업정보제공방식의 차이에 따른 청소년의 직업인지복잡성의 증대효과 (The Effect of Occupational Information on the Cognitive Complexity of Adolescents)

  • 이옥
    • 아동학회지 / Vol. 12, No. 2 / pp.67-77 / 1991
  • An investigation of the effect of occupational information on vocational cognitive complexity was conducted with 331 male and female adolescents in ninth grade. There were 2 experimental groups and 1 control group. Experimental group I was given only occupational information sheets (written form information) while group II was given occupational information through verbal instruction in addition to the occupational information sheets. A modified form of the cognitive complexity grid originally developed by Bodden (1970) was utilized to collect data on the subjects' vocational cognitive complexity. ANOVA and Scheffé tests revealed that there were significant differences between experimental group II and the other groups in vocational cognitive complexity. The cognitive complexity level of experimental group I and the control group for the most aspired occupation was significantly lower than for the least aspired occupation. However, the cognitive complexity level of experimental group II for the most aspired occupation was higher than for the least aspired occupation. The results suggest that just giving occupational information to adolescents may not be effective and giving occupational information may be effective only when the method of giving occupational information is active enough to induce adolescents' self-confirming cognitive process.
