• Title/Summary/Keyword: data complexity


A Model Comparison for Spatiotemporal Data in Ubiquitous Environments: A Case Study

  • Noh, Seo-Young;Gadia, Shashi K.
    • Journal of Information Processing Systems / v.7 no.4 / pp.635-652 / 2011
  • In ubiquitous environments, many applications need to process data with time and space dimensions. Because of this, there is growing attention not only to gathering spatiotemporal data in ubiquitous environments, but also to processing such data in databases. To obtain the full benefit of spatiotemporal data, we need a data model that naturally expresses its properties. In this paper, we introduce three spatiotemporal data models extended from temporal data models. The main goal of this paper is to determine which data model is less complex in the spatiotemporal context. To this end, we compare their query languages with respect to complexity, because the complexity of a query language is tightly coupled with its underlying data model. Through our investigation, we show that it is important to intertwine the space and time dimensions and to keep a one-to-one correspondence between an object in the real world and a tuple in the database in order to naturally express queries in ubiquitous applications.

An Analysis of Effective Throughput in Distributed Wireless Scheduling

  • Radwan, Amr
    • Journal of Korea Multimedia Society / v.19 no.2 / pp.155-162 / 2016
  • Several distributed scheduling policies have been proposed with the objective of attaining the maximum throughput region or a guaranteed fraction of the throughput region. These policies consider only theoretical throughput and do not account for the throughput lost to the time complexity of implementing an algorithm in practice. We therefore propose a novel concept, effective throughput, to characterize the actual throughput by taking time complexity into account. Effective throughput can be viewed as the actual transmitted data excluding the control message overhead. Numerical results demonstrate that in practical scheduling, time complexity significantly affects throughput: performance degrades when the time complexity is high.
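The effective-throughput idea in the abstract above can be sketched with a toy model. This is a minimal illustration under an assumed model (scheduler computation consumes part of each transmission slot), not the paper's actual formulation; the function name and parameters are hypothetical.

```python
def effective_throughput(theoretical_rate, slot_time, compute_time):
    """Illustrative effective throughput: the fraction of each slot left for
    data after the scheduler's computation overhead. This simple proportional
    model is an assumption for illustration, not the paper's definition."""
    if compute_time >= slot_time:
        return 0.0  # the scheduler never finishes in time; nothing is sent
    return theoretical_rate * (slot_time - compute_time) / slot_time

# A higher-complexity scheduler consumes more of the slot, degrading throughput.
print(effective_throughput(10.0, 1.0, 0.1))  # low complexity  -> 9.0
print(effective_throughput(10.0, 1.0, 0.6))  # high complexity -> 4.0
```

The toy model captures the abstract's qualitative claim: as the algorithm's running time grows relative to the slot, the achievable throughput falls below the theoretical region.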

Scalable Video Coding with Low Complex Wavelet Transform (공간 웨이블릿 변환의 복잡도를 줄인 스케일러블 비디오 코딩에 관한 연구)

  • Park, Seong-Ho;Kim, Won-Ha;Jeong, Se-Yoon
    • Proceedings of the KIEE Conference / 2004.11c / pp.298-300 / 2004
  • In the decoding process of interframe wavelet coding, the inverse wavelet transform requires huge computational complexity. However, the decoder may need to run on devices as varied as PDAs, notebooks, PCs, and set-top boxes, so the decoder's complexity should be adapted to each processor's computational power. A decoder designed in accordance with the processor's computational power would provide optimal service on such devices. It is therefore natural that complexity scalability and a low-complexity codec are listed among the requirements for scalable video coding. In this contribution, we develop a method of controlling and lowering the complexity of the spatial wavelet transform while sustaining almost the same coding efficiency as the conventional spatial wavelet transform. In addition, the proposed method may alleviate the ringing effect for certain video data.


Do the Technostress Creators Predict Job Satisfaction and Teacher Efficacy of Primary School Teachers in Korea?

  • LEE, Mignon;LIM, Kyu Yon
    • Educational Technology International / v.21 no.1 / pp.69-95 / 2020
  • The purpose of this research is to analyze the predictive power of the five technostress creators - techno-overload, techno-invasion, techno-complexity, techno-insecurity, and techno-uncertainty - for the job satisfaction and teacher efficacy of primary school teachers in Korea who incorporated mobile technology into teaching. A questionnaire was designed to measure teachers' level of technostress, job satisfaction, and teacher efficacy. Data were collected from 164 teachers, and multiple regression analysis was conducted to determine which areas of technostress explained varying degrees of job satisfaction and teacher efficacy. The results showed that techno-complexity alone predicted both job satisfaction and teacher efficacy. Techno-complexity is likely the only predictor because teachers would first have needed to understand how to incorporate mobile technology into teaching before feeling overloaded, invaded, insecure, or uncertain about it; that is, techno-complexity precedes the other constructs. The only stress factor that affected them was therefore understanding the complexity of mobile technology. This calls for adequate training and support from schools and governments so that teachers can fully incorporate technology into teaching.

Digital Signage User Satisfaction Model: The Dual Effect of Technological Complexity

  • Lee, Mi-ah;Lee, Sooyeon;Ko, Eunju
    • Asia Marketing Journal / v.23 no.1 / pp.5-27 / 2021
  • This paper proposes a user satisfaction model for digital signage to show how new in-store technology can effectively lead to customers' shopping satisfaction in fashion retail. The authors focus in particular on technological complexity, which is expected to play a subtle role in the use of digital signage. The study employed a scenario-based online survey, with interactive digital signage offering virtual try-on and video-capture functions as the stimulus. Data were collected from 320 respondents, and 307 usable responses were analyzed to examine the proposed model. The research model compares dual motivational paths: an extrinsic route leading from usefulness to shopping outcome satisfaction, and an intrinsic route leading from enjoyment to shopping process satisfaction. The technological complexity of digital signage indirectly and negatively influences both shopping outcome and process satisfaction, mediated by usefulness and enjoyment, but directly and positively affects shopping process satisfaction. In omni-channel environments, these findings have implications for fashion retail managers using digital signage to maximize customer satisfaction and to counterbalance the advantages and disadvantages of technological complexity.

The Operators' Non-compliance Behavior to Conduct Emergency Operating Procedures - Comparing with the Complexity of the Procedural Steps

  • Park Jinkyun;Jung Wondea
    • Nuclear Engineering and Technology / v.35 no.5 / pp.412-425 / 2003
  • According to related studies, one of the typical factors behind procedure-related human error is the complexity of procedures. Comparing changes in operators' behavior with respect to procedural complexity may therefore help clarify the reasons for operators' non-compliance. In this study, to obtain data on non-compliance behavior, emergency training records were collected using a full-scope simulator, and three types of operator behavior observed in those records (strict adherence, skipping redundant actions, and modifying action sequences) were compared with the complexity of the procedural steps. Two remarkable relationships emerged: 1) operators seem to frequently adopt non-compliant behavior when conducting procedural steps of intermediate complexity, and 2) operators seem to adapt their non-compliant behavior to the complexity of the procedural steps. These relationships are expected to serve as meaningful clues, not only for scrutinizing the reasons for non-compliance but also for suggesting appropriate remedies to reduce non-compliance that can result in procedure-related human error.

The factors influencing consumers' perceived complexity of online apparel mass customization service usage

  • Moon, Heekang;Lee, Hyun-Hwa;Chang, Eunyoung
    • The Research Journal of the Costume Culture / v.21 no.2 / pp.272-286 / 2013
  • Mass customization is a marketing strategy to meet consumer needs for variation and uniqueness of products. Although quite a few studies have quantitatively investigated the options provided by the mass customization process, scholarly work on mass customization has produced mixed results on consumers' perception of complexity and their responses. The purpose of this study is to derive the factors that influence consumers' perception of complexity in the online apparel mass customization process, as well as consumers' needs for enhanced mass customization services. Data were collected through focus group interviews with 29 participants in 4 groups. The results suggest that consumers perceived complexity in the mass customization process due to too many choice options. However, the effect of the number of options on respondents' complexity perception differed depending on consumer characteristics, such as consumer expertise and fashion involvement, and on how consumer preferences develop. Shopping context, such as shopping purpose, is another moderating factor. The study also suggests that a variety of marketing strategies that can enhance mass customization services affect the relationship between the number of options and consumers' complexity perception. The findings provide academic and managerial implications.

STRUCTURED CODEWORD SEARCH FOR VECTOR QUANTIZATION (백터양자화가의 구조적 코더 찾기)

  • 우홍체
    • Proceedings of the Korean Institute of Intelligent Systems Conference / 2000.11a / pp.467-470 / 2000
  • Vector quantization (VQ) is widely used in many high-quality and high-rate data compression applications such as speech coding, audio coding, image coding and video coding. When the size of a VQ codebook is large, the computational complexity for the full codeword search method is a significant problem for many applications. A number of complexity reduction algorithms have been proposed and investigated using such properties of the codebook as the triangle inequality. This paper proposes a new structured VQ search algorithm that is based on a multi-stage structure for searching for the best codeword. Even using only two stages, a significant complexity reduction can be obtained without any loss of quality.

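As a concrete, deliberately simplified illustration of staging a codeword search, the sketch below partitions a codebook around a few coarse centroids and then searches only the winning partition. This is an assumed toy construction, not the paper's algorithm: unlike the method described in the abstract, this naive partition can occasionally miss the globally nearest codeword.

```python
import random


def dist2(a, b):
    """Squared Euclidean distance between two vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b))


def full_search(codebook, v):
    """Baseline: compare v against every codeword (O(N) distance computations)."""
    return min(codebook, key=lambda c: dist2(c, v))


def build_two_stage(codebook, n_groups):
    """Hypothetical two-stage structure: pick a few codewords as coarse
    centroids and assign every codeword to its nearest centroid's group."""
    random.seed(0)  # deterministic centroid choice for the illustration
    centroids = random.sample(codebook, n_groups)
    groups = [[] for _ in centroids]
    for c in codebook:
        i = min(range(n_groups), key=lambda k: dist2(c, centroids[k]))
        groups[i].append(c)
    return centroids, groups


def two_stage_search(centroids, groups, v):
    """Stage 1: find the nearest coarse centroid; stage 2: full search only
    inside that group, roughly O(G + N/G) distances instead of O(N)."""
    i = min(range(len(centroids)), key=lambda k: dist2(centroids[k], v))
    return min(groups[i], key=lambda c: dist2(c, v))
```

With a codebook of size N split into G groups, the two-stage search touches about G + N/G codewords, which is the complexity saving the multi-stage idea aims at.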

Regression Analysis of the Relationships between Complexity Metrics and Faults on the Telecommunication Program (통신 소프트웨어의 프로그램 결함과 복잡도의 관련성 분석을 위한 회귀분석 모델)

  • Lee, Gyeong-Hwan;Jeong, Chang-Sin;Hwang, Seon-Myeong;Jo, Byeong-Gyu;Park, Ji-Hun;Kim, Gang-Tae
    • Journal of KIISE: Software and Applications / v.26 no.11 / pp.1282-1287 / 1999
  • Switching software requires high reliability, functionality, extendability, and maintainability, and should be verified and tested against its requirements with automated analysis tools. To evaluate the relationship between program faults and complexity, we built a regression model from program test results and McCabe complexity measurements and analyzed its reliability. The switching software under study consists of more than 500 blocks, each composed of about 10 files, implementing 59 switching functions; excluding blocks whose extreme complexity would bias the statistics, 394 blocks were selected and analyzed for the 59 functions with testing tools and the SAS package. We developed a regression analysis model based on McCabe's cyclomatic complexity, block design complexity, design complexity, and integration complexity, and tested the significance of the regression equations with the t distribution. The results of our experimental study show that the number of faults is most strongly influenced by McCabe's cyclomatic complexity and the design complexity, and that program fault density depends on the complexity and size of the function unit. These results provide useful design and maintenance information.
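The kind of single-predictor fit used in such a study can be sketched as follows. The data points below are invented for illustration; they are not the paper's 394-block measurements, and the function is a generic least-squares helper, not the authors' SAS model.

```python
def linear_fit(x, y):
    """Ordinary least-squares fit y ~ a + b*x (simple linear regression),
    the kind of model used to relate a complexity metric to fault counts."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    b = sxy / sxx    # slope: additional faults per unit of complexity
    a = my - b * mx  # intercept
    return a, b

# Hypothetical (McCabe complexity, fault count) pairs, invented for illustration.
mccabe = [2, 5, 8, 12, 15, 20]
faults = [1, 2, 4, 6, 7, 10]
a, b = linear_fit(mccabe, faults)
# A positive slope b indicates fault counts increasing with complexity.
```

The fitted slope is the quantity whose significance the paper tests with the t distribution; a slope significantly greater than zero supports the claim that faults grow with the complexity metric.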

An Analysis of Location Management Cost by Predictive Location Update Policy in Mobile Cellular Networks (이동통신망에서 예측 위치 등록 정책을 통한 위치관리 비용 감소 효과 분석)

  • Ko, Han-Seong;Hong, Jung-Sik;Chang, In-Kap;Lie, Chang-Hoon
    • Journal of Korean Institute of Industrial Engineers / v.34 no.2 / pp.160-171 / 2008
  • A mobile user's (MU's) mobility patterns can be found from movement history data. Prediction accuracy and model complexity depend on how much history data is applied: the more data we use, the more accurate the prediction, which reduces the location management cost, but the complexity of the model increases. In this paper, we classify MUs' mobility patterns into four types. For each type, we find the optimal amount of history data to apply, and the predictive location area, by simulation. The optimal amounts for the four types are shown to differ. When more than three units of history data are used, the simulation time and data storage are shown to increase very steeply.
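The trade-off described above (more history gives more accurate predictions but a costlier model) can be illustrated with a toy order-k predictor. This is a generic sketch of history-based location prediction, not the paper's policy; the cell labels and the function are hypothetical.

```python
from collections import Counter


def predict_next(history, k):
    """Predict the next cell from the last k cells of the movement history:
    find past occurrences of that length-k context and return the most
    frequent successor. Larger k is more specific (fewer matches, better
    targeting) but costlier to store and search."""
    if len(history) <= k:
        return None  # not enough history to form a context
    context = tuple(history[-k:])
    followers = Counter(
        history[i + k]
        for i in range(len(history) - k)
        if tuple(history[i:i + k]) == context
    )
    return followers.most_common(1)[0][0] if followers else None


history = ["A", "B", "C", "A", "B", "C", "A", "B"]
print(predict_next(history, 2))  # context ("A", "B") was always followed by "C"
```

Raising k here mirrors the paper's observation: a longer context can sharpen the prediction, but the cost of matching and storing history grows with it.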