• Title/Summary/Keyword: granularity

196 search results

A Framework for Concurrency Control and Writing Authority Control in Collaborative Writing Systems (공동저작 시스템에서의 동시성 제어와 쓰기 권한 제어)

  • Yoo, Jae-Hong;Sung, Mee-Young
    • The Transactions of the Korea Information Processing Society
    • /
    • v.7 no.2
    • /
    • pp.347-354
    • /
    • 2000
  • This paper presents efficient mechanisms for concurrency control and writing authority control in a collaborative writing system. Shared documents are represented by tree structures consisting of logical objects, with content objects attached to the terminal objects of the trees. For concurrency control, we adopted an approach that extends the multiple-granularity locking scheme, which allows any object at any level of the hierarchy to be locked. We also defined a locking compatibility table by analysing the operations applicable to objects at each level of the hierarchy. Finally, we suggest an extended multiple-granularity locking mechanism that consults this compatibility table when deciding whether to lock an object. This scheme maximizes the possibility of concurrent access to shared objects. In addition, we suggest a mechanism for writing authority control that, based on the concept of Group/Non-Group, prevents Non-Group users from modifying shared objects. The proposed mechanism allows copyright to be protected in a reasonable way.

  • PDF
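
The extended multiple-granularity locking idea above lends itself to a small sketch: intention locks are set on ancestors in the document tree, and a compatibility table decides whether a requested lock can coexist with locks already held. The IS/IX/S/X modes, the compatibility matrix, and the two-section document below are standard textbook assumptions used for illustration, not the paper's own extended table.

```python
# Illustrative sketch of multiple-granularity locking on a document tree.
# Lock modes and the compatibility matrix are the classical IS/IX/S/X set,
# assumed here for illustration; the paper defines its own extended table.

COMPATIBLE = {
    ("IS", "IS"): True,  ("IS", "IX"): True,  ("IS", "S"): True,  ("IS", "X"): False,
    ("IX", "IS"): True,  ("IX", "IX"): True,  ("IX", "S"): False, ("IX", "X"): False,
    ("S",  "IS"): True,  ("S",  "IX"): False, ("S",  "S"): True,  ("S",  "X"): False,
    ("X",  "IS"): False, ("X",  "IX"): False, ("X",  "S"): False, ("X",  "X"): False,
}

class Node:
    """A logical object in the document tree; leaves would hold content objects."""
    def __init__(self, name, parent=None):
        self.name = name
        self.parent = parent
        self.locks = []                    # (user, mode) pairs currently held

    def can_lock(self, mode):
        return all(COMPATIBLE[(held, mode)] for _, held in self.locks)

    def lock(self, user, mode):
        """Lock this node after setting intention locks on all its ancestors.
        For brevity a failed request does not roll back the intention locks."""
        ancestors = []
        node = self.parent
        while node is not None:
            ancestors.append(node)
            node = node.parent
        intent = "IS" if mode in ("IS", "S") else "IX"
        for ancestor in reversed(ancestors):           # root downwards
            if not ancestor.can_lock(intent):
                return False
            ancestor.locks.append((user, intent))
        if not self.can_lock(mode):
            return False
        self.locks.append((user, mode))
        return True

# Two users editing different sections of the same shared document.
doc = Node("document")
sec1 = Node("section-1", parent=doc)
sec2 = Node("section-2", parent=doc)
print(sec1.lock("alice", "X"))    # True:  exclusive lock on section 1
print(sec2.lock("bob", "X"))      # True:  IX + IX on the document are compatible
print(doc.lock("carol", "S"))     # False: S conflicts with the IX locks held above
```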

Extended Forecasts of a Stock Index using Learning Techniques : A Study of Predictive Granularity and Input Diversity

  • Kim, Steven H.;Lee, Dong-Yun
    • Asia pacific journal of information systems
    • /
    • v.7 no.1
    • /
    • pp.67-83
    • /
    • 1997
  • The utility of learning techniques in investment analysis has been demonstrated in many areas, ranging from forecasting individual stocks to entire market indexes. To date, however, the application of artificial intelligence to financial forecasting has focused largely on short predictive horizons. Usually the forecast window is a single period ahead: if the input data involve daily observations, the forecast is for one day ahead; if monthly observations, then a month ahead; and so on. Thus far, little work has been done on the efficacy of long-term, multiperiod forecasting. This paper examines the impact of alternative procedures for extended prediction using knowledge discovery techniques. One dimension of the study involves temporal granularity: a single jump from the present period to the end of the forecast window versus a chain of short-term forecasts involving a sequence of single-period predictions. Another parameter relates to the numerosity of input variables: a technical approach involving only lagged observations of the target variable versus a fundamental approach involving multiple variables. The dual possibilities along the granularity and numerosity dimensions yield a total of four models. These models are first evaluated using neural networks and then compared against a multi-input jump model using case-based reasoning. The computational models are examined in the context of forecasting the S&P 500 index.

  • PDF
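
The temporal-granularity dimension above contrasts a single direct jump to the end of the forecast window with a chain of single-period forecasts fed back into themselves. A minimal sketch of the two options follows; the simple least-squares lag-one predictor and the synthetic index series are assumptions standing in for the paper's neural-network and case-based-reasoning models.

```python
# Sketch of the two granularity options: one direct multi-period "jump"
# versus an iterated chain of one-step-ahead forecasts.
# The linear autoregressive fit is a stand-in for the paper's learning models.

import numpy as np

def fit_ar1(series):
    """Fit y[t+1] = a*y[t] + b by least squares (illustrative predictor)."""
    x, y = series[:-1], series[1:]
    a, b = np.polyfit(x, y, 1)
    return a, b

def iterated_forecast(series, horizon):
    """Chain of single-period predictions, each fed back as the next input."""
    a, b = fit_ar1(series)
    value = series[-1]
    for _ in range(horizon):
        value = a * value + b
    return value

def jump_forecast(series, horizon):
    """Single direct prediction of y[t+horizon] from y[t]."""
    x, y = series[:-horizon], series[horizon:]
    a, b = np.polyfit(x, y, 1)
    return a * series[-1] + b

rng = np.random.default_rng(0)
index = 100 + np.cumsum(rng.normal(0.1, 1.0, 500))   # synthetic "index" walk
print("iterated:", round(iterated_forecast(index, 20), 2))
print("jump:    ", round(jump_forecast(index, 20), 2))
```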

A Time-Segmented Storage Structure and Migration Strategies for Temporal Data (시간지원 데이터를 위한 분리 저장 구조와 데이터 이동 방법)

  • Yun, Hong-Won;Kim, Gyeong-Seok
    • The Transactions of the Korea Information Processing Society
    • /
    • v.6 no.4
    • /
    • pp.851-867
    • /
    • 1999
  • Numerous proposals for extending the relational data model, as well as conceptual and object-oriented data models, have been suggested. However, there has been relatively little research on defining segmented storage structures and data migration strategies for temporal data. This paper presents a segmented storage structure designed to improve search performance, together with two data migration strategies for it: migration by time granularity and migration by LST-GET. In the migration strategy by time granularity, the dividing time point that assigns entity versions to the past, current, and future segments is defined, and the searching and moving process for data validity at a chosen granularity level is described. In the migration strategy by LST-GET, we describe how the value of the dividing criterion is computed, describe the searching and moving processes for migration on the future and current segments, and define the entity versions to be assigned to each segment. We simulate the search performance of the segmented storage structure to compare it with a conventional storage structure in a relational database system, and perform extensive simulation studies to compare the search performance of the two migration strategies under the segmented storage structure.

  • PDF
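
A rough sketch of the segment assignment underlying the time-granularity migration strategy follows: each entity version is placed in the past, current, or future segment by comparing its valid-time interval with a dividing time point. The record layout, the sentinel for open-ended versions, and the assignment rule are illustrative assumptions, not the structures defined in the paper.

```python
# Sketch of assigning entity versions to past / current / future segments
# around a dividing time point, as in a time-granularity migration pass.

from dataclasses import dataclass

@dataclass
class Version:
    key: str
    valid_from: int      # e.g. day number at the chosen granularity
    valid_to: int        # open-ended ("until now") versions use a large sentinel

NOW_SENTINEL = 10**9

def assign_segment(version, now, dividing_point):
    """Place one entity version into the past, current, or future segment."""
    if version.valid_to < dividing_point:
        return "past"          # version expired before the dividing point
    if version.valid_from > now:
        return "future"        # version only becomes valid later
    return "current"

versions = [
    Version("emp-1", valid_from=100, valid_to=150),
    Version("emp-1", valid_from=150, valid_to=NOW_SENTINEL),
    Version("emp-2", valid_from=400, valid_to=NOW_SENTINEL),
]
for v in versions:
    print(v.key, v.valid_from, assign_segment(v, now=300, dividing_point=200))
```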

Batch Resizing Policies and Techniques for Fine-Grain Grid Tasks: The Nuts and Bolts

  • Muthuvelu, Nithiapidary;Chai, Ian;Chikkannan, Eswaran;Buyya, Rajkumar
    • Journal of Information Processing Systems
    • /
    • v.7 no.2
    • /
    • pp.299-320
    • /
    • 2011
  • The overhead of processing fine-grain tasks on a grid induces the need for batch processing or task group deployment in order to minimise overall application turnaround time. When deciding the granularity of a batch, the processing requirements of each task should be considered as well as the utilisation constraints of the interconnecting network and the designated resources. However, the dynamic nature of a grid requires the batch size to be adaptable to the latest grid status. In this paper, we describe the policies and the specific techniques involved in the batch resizing process. We explain the nuts and bolts of these techniques in order to maximise the resulting benefits of batch processing. We conduct experiments to determine the nature of the policies and techniques in response to a real grid environment. The techniques are further investigated to highlight the important parameters for obtaining the appropriate task granularity for a grid resource.
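
One way to picture batch resizing is as a feedback loop that grows or shrinks the task-group size according to how the previous deployment performed against a target turnaround. The proportional update rule and the constants in the sketch below are assumptions for illustration; the paper's policies take the network and resource utilisation constraints into account in more detail.

```python
# Sketch of adaptive batch (task-group) resizing for fine-grain grid tasks.
# The next batch size is scaled toward a target turnaround per batch, within
# limits standing in for the resource's utilisation constraints. All constants
# are illustrative, not taken from the paper.

def resize_batch(current_size, last_batch_time, target_time,
                 min_size=1, max_size=500):
    """Scale the next batch size by how far the last batch missed the target."""
    if last_batch_time <= 0:
        return current_size
    ratio = target_time / last_batch_time
    new_size = int(current_size * ratio)
    return max(min_size, min(max_size, new_size))

size = 20
for observed in [12.0, 9.0, 5.5, 6.2]:          # seconds per deployed batch
    size = resize_batch(size, observed, target_time=6.0)
    print("next batch size:", size)
```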

A Study on the Minimum Paste Volume in the Design of Concrete Mixture

  • Fowler, David W.;Hahn, Michael De Moya;Rached, Marc;Choi, Doo-Sun;Choi, Jae-Jin
    • International Journal of Concrete Structures and Materials
    • /
    • v.2 no.2
    • /
    • pp.161-167
    • /
    • 2008
  • Optimization of the concrete mixture design is very important for producing a quality concrete mix and requires complicated, specialized knowledge, as a variety of variables influence the result. One way to optimize the mixture is to minimize the volume of cement paste, which in turn means maximizing the volume of aggregate. The purpose of this study is to determine the minimum volume of cement paste to be used in the design of a concrete mixture and to design the optimum mixture based on the fluidity of the mortar and concrete. In determining the minimum paste volume, mortar and concrete experiments were performed based on workability, material segregation, and bleeding. The type of aggregate, its granularity (size) distribution, and the sand percentage were used as test parameters, and measurements were taken of the granularity distribution, HRWRA usage, minimum paste volume, drying shrinkage, and compressive strength of the concrete.
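
The volumetric bookkeeping behind a minimum paste volume is simple: in a unit volume of concrete, whatever space the aggregate and entrapped air do not occupy must be filled with cement paste, so maximizing the aggregate fraction minimizes the paste. The fractions in the sketch below are illustrative numbers, not the study's measured minima.

```python
# Sketch of the volumetric relationship behind "minimum paste volume":
# in a unit volume of concrete, paste fills whatever the aggregate and
# entrapped air do not. All figures are illustrative only.

def paste_volume_fraction(aggregate_fraction, air_fraction=0.02):
    """Paste fraction of a unit concrete volume (all fractions of 1.0 m^3)."""
    paste = 1.0 - aggregate_fraction - air_fraction
    if paste < 0:
        raise ValueError("aggregate and air exceed the unit volume")
    return paste

# Increasing the aggregate fraction directly reduces the paste required.
for agg in (0.65, 0.70, 0.75):
    print(f"aggregate {agg:.2f} -> paste {paste_volume_fraction(agg):.2f}")
```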

Flow Cytometric Analysis of Bovine Granulosa Cells : Changes of Cell Cycle During Follicular Maturation (Flow Cytometer를 이용한 소 과립막세포의 분석 : 난포성숙에 따른 세포주기의 변화)

  • 김해정;김동훈;이훈택;정길생
    • Korean Journal of Animal Reproduction
    • /
    • v.17 no.4
    • /
    • pp.279-285
    • /
    • 1994
  • The objective of the present study was to characterize the cell cycle of granulosa cell populations during follicular maturation in cattle using a flow cytometer. Granulosa cells were isolated from bovine preovulatory antral follicles of F1 (>10 mm), F2 (5~20 mm), F3 (3~4 mm), and F4 (1~2 mm) diameter, then fixed and stained with fluorochromes that selectively bind to DNA. A flow cytometer equipped with a laser excitation system was used to analyze the fluorescence intensity of the stained cells. Forward-angle light-scatter (FSC) and 90° light-scatter (SSC) signals were used to measure the size and granularity of the granulosa cells. FSC/SSC analysis showed that the granulosa cell populations (G1 phase of the cell cycle) from each follicle were relatively uniform in size and granularity regardless of follicular size, although their distribution in granularity was broader than that in size. Most of the granulosa cells collected from each follicle were distributed across the G0/G1, S, and G2/M phases. As the follicles approached ovulation, the percentage of cells in the proliferative phases of the cell cycle (S and G2/M) decreased significantly, with a concomitant increase in the percentage of granulosa cells in the G1 phase. These data therefore indicate that the main granulosa cell populations may shift from the proliferative phases toward the G1 phase during follicular maturation in cattle.

  • PDF
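
The cell-cycle percentages reported above come from gating a DNA-content histogram measured by the flow cytometer. The sketch below shows a crude version of that classification; the fluorescence thresholds and the simulated intensity distributions are assumptions for illustration, not the study's gating strategy.

```python
# Sketch of classifying cells into G0/G1, S, and G2/M from DNA-bound
# fluorescence intensity, as read out by a flow cytometer. Thresholds and
# the simulated data are illustrative; real gating uses calibrated peaks.

import numpy as np

def classify_phase(intensity, g1_peak):
    """Assign a cell-cycle phase from DNA fluorescence relative to the G1 peak."""
    if intensity < 1.25 * g1_peak:
        return "G0/G1"          # ~2N DNA content
    if intensity < 1.75 * g1_peak:
        return "S"              # intermediate DNA content
    return "G2/M"               # ~4N DNA content

rng = np.random.default_rng(1)
g1 = rng.normal(100, 5, 700)            # most cells around the 2N peak
s = rng.uniform(130, 170, 200)          # cells replicating DNA
g2m = rng.normal(200, 8, 100)           # cells around the 4N peak
cells = np.concatenate([g1, s, g2m])

counts = {"G0/G1": 0, "S": 0, "G2/M": 0}
for value in cells:
    counts[classify_phase(value, g1_peak=100)] += 1
for phase, n in counts.items():
    print(f"{phase}: {100 * n / len(cells):.1f}%")
```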

Concepts and Design Aspects of Granular Models of Type-1 and Type-2

  • Pedrycz, Witold
    • International Journal of Fuzzy Logic and Intelligent Systems
    • /
    • v.15 no.2
    • /
    • pp.87-95
    • /
    • 2015
  • In this study, we pursue a new direction for system modeling by introducing the concept of granular models, which produce results in the form of information granules (such as intervals, fuzzy sets, and rough sets). We present a rationale and several key motivating arguments behind the use of granular models and discuss their underlying design processes. The development of the granular model includes optimal allocation of information granularity through optimizing the criteria of coverage and specificity. The emergence and construction of granular models of type-2 and type-n (in general) is discussed. It is shown that achieving a suitable coverage-specificity tradeoff (compromise) is essential for developing granular models.
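
Coverage and specificity, the two criteria traded off when allocating information granularity, can be sketched for the simplest granule, an interval placed around a numeric output. The formulations used below (coverage as the fraction of targets falling inside the interval, specificity as one minus the relative interval length) are common in the granular-computing literature and are assumed here for illustration.

```python
# Sketch of the coverage / specificity tradeoff for an interval granule.
# Coverage: fraction of observed targets falling inside the interval.
# Specificity: 1 - (interval length / range of the data), so narrower is better.
# These formulations are assumed here for illustration.

def coverage(interval, targets):
    lo, hi = interval
    return sum(lo <= t <= hi for t in targets) / len(targets)

def specificity(interval, targets):
    lo, hi = interval
    data_range = max(targets) - min(targets)
    return max(0.0, 1.0 - (hi - lo) / data_range)

targets = [2.1, 2.4, 2.6, 2.9, 3.0, 3.3, 3.8, 4.1]
center = 3.0
for width in (0.5, 1.0, 2.0):          # widening the granule raises coverage,
    interval = (center - width / 2, center + width / 2)   # lowers specificity
    print(f"width {width}: coverage={coverage(interval, targets):.2f}, "
          f"specificity={specificity(interval, targets):.2f}")
```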

Adaptive Scanning Method for Fine Granularity Scalable Video Coding

  • Park, Gwang-Hoon;Kim, Kyu-Heon
    • ETRI Journal
    • /
    • v.26 no.4
    • /
    • pp.332-343
    • /
    • 2004
  • One of the most significant recent technical trends can be described as "digital convergence," which is helping lead the technical paradigm toward a ubiquitous environment. As an initial step toward realizing such an environment, the convergence of the broadcasting and telecommunication fields is under way, and it requires a scalable video coding scheme for one-source, multi-use media. Traditional scalable video coding schemes, however, have limitations in providing stable picture quality, especially in the region of interest. This paper therefore introduces an adaptive scanning method designed for higher, regionally stable picture quality in a ubiquitous video coding environment; it improves the subjective quality of the decoded video by encoding, transmitting, and decoding the top-priority image information of the region of interest first, so that the video is more clearly visible to users. Simulation results show that the proposed scanning method achieves subjective picture quality far better than the raster scan order widely used in conventional video coding schemes, especially in the region of interest, without a significant loss of quality in the remaining region.

  • PDF
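
The heart of the ROI-prioritised scan can be sketched as a reordering of block coordinates so that blocks inside the region of interest are coded and transmitted before everything else. The rectangular ROI and the plain raster baseline below are assumptions for illustration; the paper applies the idea to fine-granularity-scalable bitplane data.

```python
# Sketch of an ROI-first scan order versus plain raster order over the
# blocks of a frame. The rectangular ROI and block grid are illustrative.

def raster_scan(width, height):
    """Conventional left-to-right, top-to-bottom block order."""
    return [(x, y) for y in range(height) for x in range(width)]

def roi_first_scan(width, height, roi):
    """Visit ROI blocks first (in raster order), then the remaining blocks."""
    x0, y0, x1, y1 = roi
    inside = lambda x, y: x0 <= x <= x1 and y0 <= y <= y1
    order = raster_scan(width, height)
    return [b for b in order if inside(*b)] + [b for b in order if not inside(*b)]

# An 8x6 grid of blocks with a centred 4x2 region of interest.
scan = roi_first_scan(8, 6, roi=(2, 2, 5, 3))
print(scan[:8])   # the first blocks sent all lie inside the ROI
```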

A Study on the Contour-Preserving Image Filtering for Noise Removal (잡음 제거를 위한 윤곽선 보존 기법에 관한 연구)

  • Yoo, Choong-Woong;Ryu, Dae-Hyun;Bae, Kang-Yeul
    • Journal of the Korean Institute of Telematics and Electronics T
    • /
    • v.36T no.4
    • /
    • pp.24-29
    • /
    • 1999
  • In this paper, a simple contour-preserving filtering algorithm is proposed. The goal of contour-preserving filtering is to remove noise and granularity as a preprocessing step for the image segmentation procedure. Our method finds an edge map and uses it to separate the image into an edge region and a non-edge region. For the non-edge region, typical smoothing filters can be used to remove noise and small areas before segmentation. Simulation results show that our method is slightly better than typical methods such as median filtering and gradient-inverse-weighted filtering from the point of view of analysis of variance (ANOVA).

  • PDF
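
The edge-map-driven smoothing described above can be sketched with standard image operators: detect contours, then median-filter only the non-edge region so the contours are left untouched. The Sobel-based edge map, the threshold, and the synthetic test image below are illustrative choices, not the paper's exact pipeline.

```python
# Sketch of contour-preserving filtering: build an edge map, then apply a
# median filter only where there are no edges, leaving contours intact.
# The Sobel edge detector and the threshold value are illustrative choices.

import numpy as np
from scipy import ndimage

def contour_preserving_filter(image, edge_threshold=40.0, size=3):
    gx = ndimage.sobel(image.astype(float), axis=0)
    gy = ndimage.sobel(image.astype(float), axis=1)
    edge_map = np.hypot(gx, gy) > edge_threshold            # True on contours
    smoothed = ndimage.median_filter(image, size=size)      # smooth everywhere
    return np.where(edge_map, image, smoothed)              # keep edge pixels as-is

# Noisy test image: a bright square on a dark background.
rng = np.random.default_rng(2)
image = np.zeros((64, 64)) + 20
image[16:48, 16:48] = 200
noisy = image + rng.normal(0, 10, image.shape)
filtered = contour_preserving_filter(noisy)
print("background noise std before:", round(noisy[:10, :10].std(), 1),
      "after:", round(filtered[:10, :10].std(), 1))
```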

Multi-granularity Switching Structure Based on Lambda-Group Model

  • Wang, Yiyun;Zeng, Qingji;Jiang, Chun;Xiao, Shilin;Lu, Lihua
    • ETRI Journal
    • /
    • v.28 no.1
    • /
    • pp.119-122
    • /
    • 2006
  • We present an intelligent optical switching structure based on our lambda-group model, along with a working scheme that provides a distinctive approach for dividing complicated traffic into specific tunnels for better optical performance and grooming efficiency. Results and figures from our experiments show that this channel partition not only reduces the number of ports significantly, but also improves the average signal-to-noise ratio of the wavelength channels and the blocking performance for dynamic connection requests.

  • PDF
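
The port saving claimed for the lambda-group partition can be illustrated with a back-of-the-envelope count: wavelengths that travel through the same tunnel are switched as one group through a single port pair instead of one pair per wavelength. The group size and the traffic demands in the sketch are assumptions for illustration.

```python
# Sketch of the port-count saving from multi-granularity (lambda-group)
# switching: wavelengths sharing a destination are switched as one group
# through a single pair of ports. Group size and demands are illustrative.

from collections import defaultdict

def ports_per_wavelength(demands):
    """Each wavelength switched individually: one input and one output port."""
    return 2 * len(demands)

def ports_with_lambda_groups(demands, group_size):
    """Wavelengths grouped by destination and switched group by group."""
    by_destination = defaultdict(int)
    for dest in demands:
        by_destination[dest] += 1
    groups = sum(-(-count // group_size) for count in by_destination.values())
    return 2 * groups          # ceiling division: one port pair per group

demands = ["nodeA"] * 16 + ["nodeB"] * 12 + ["nodeC"] * 4   # 32 wavelengths
print("per-wavelength ports:", ports_per_wavelength(demands))
print("lambda-group ports:  ", ports_with_lambda_groups(demands, group_size=8))
```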