• Title/Summary/Keyword: granularity


A Formal Approach of Defining Measure of Component Granularity (컴포넌트 Granularity를 정의하는 수학적인 방법에 대한 연구)

  • 이종국;김수동
    • Proceedings of the Korean Information Science Society Conference / 2000.10a / pp.483-485 / 2000
  • The software industry expects that component-based development will improve software productivity and quality. Recent research has shown that the components of a system each have their own granularity, but the methodologies proposed so far do not derive component granularity accurately or in detail. This study presents a mathematical method for deriving component granularity. Through a theoretical study, we also show that business-level components and common components are separated, and a case study demonstrates that the proposed derivation method is useful and that a boundary separating the components exists.

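The abstract above does not state the measure itself, so the following is only a hypothetical sketch of the kind of computation it suggests: assign each component a numeric granularity (here assumed to be the number of fine-grained elements it groups) and separate business components from common components at an assumed threshold.

```python
# Hypothetical illustration only: the paper's actual granularity measure is not
# given in the abstract. Granularity is assumed here to be the number of
# fine-grained elements (e.g., classes/operations) a component groups, and a
# fixed threshold is assumed to separate business from common components.

components = {
    "OrderManagement": 14,   # assumed element counts
    "CustomerBilling": 11,
    "Logging": 3,
    "DateUtilities": 2,
}

THRESHOLD = 5  # assumed boundary between coarse business and fine common components

def classify(granularity: int, threshold: int = THRESHOLD) -> str:
    """Coarse-grained components are treated as business components,
    fine-grained ones as common (reusable utility) components."""
    return "business" if granularity >= threshold else "common"

for name, g in components.items():
    print(f"{name:16s} granularity={g:2d} -> {classify(g)}")
```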

The Principle of Justifiable Granularity and an Optimization of Information Granularity Allocation as Fundamentals of Granular Computing

  • Pedrycz, Witold
    • Journal of Information Processing Systems / v.7 no.3 / pp.397-412 / 2011
  • Granular Computing has emerged as a unified and coherent framework for designing, processing, and interpreting information granules. Information granules are formalized within various frameworks such as sets (interval mathematics), fuzzy sets, rough sets, shadowed sets, and probabilities (probability density functions), to name several of the most visible approaches. In spite of the apparent diversity of the existing formalisms, there are some underlying commonalities articulated in terms of the fundamentals, algorithmic developments, and ensuing application domains. In this study, we introduce two pivotal concepts: a principle of justifiable granularity and a method of optimal information granularity allocation, in which information granularity is regarded as an important design asset. We show that these two concepts are relevant to various formal setups of information granularity and offer constructs supporting the design of information granules and their processing. A suite of applied studies focuses on knowledge management, for which we identify several key categories of schemes.
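
Pedrycz's principle of justifiable granularity builds an information granule around a numeric representative so that it covers as much experimental evidence as possible while staying as specific (narrow) as possible; the two goals are traded off by maximizing their product for each bound of the granule. A minimal one-dimensional sketch follows; the exponential specificity function and its alpha parameter are convenient assumptions, not taken from the paper.

```python
import numpy as np

def justifiable_interval(data, alpha=1.0):
    """Build an interval granule [a, b] around the median of 1-D data by
    maximizing coverage * specificity for each bound independently.
    coverage    : fraction of the data lying between the median and the bound
    specificity : exp(-alpha * normalized distance of the bound from the median)
    (The exponential form and alpha are assumptions; the principle only asks
    for some monotonically decreasing specificity function.)"""
    data = np.asarray(data, dtype=float)
    med = np.median(data)
    spread = data.std() or 1.0

    def best_bound(candidates, side):
        best, best_score = med, -1.0
        for c in candidates:
            if side == "upper":
                cov = np.mean((data >= med) & (data <= c))
            else:
                cov = np.mean((data >= c) & (data <= med))
            spec = np.exp(-alpha * abs(c - med) / spread)
            score = cov * spec
            if score > best_score:
                best, best_score = c, score
        return best

    lower = best_bound(np.linspace(data.min(), med, 200), "lower")
    upper = best_bound(np.linspace(med, data.max(), 200), "upper")
    return lower, upper

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    sample = rng.normal(10.0, 2.0, 500)
    print(justifiable_interval(sample))
```

A larger alpha rewards specificity more heavily and therefore shrinks the resulting granule around the median.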

Refined fixed granularity algorithm on Networks of Workstations (NOW 환경에서 개선된 고정 분할 단위 알고리즘)

  • Gu, Bon-Geun
    • The KIPS Transactions: Part A / v.8A no.2 / pp.117-124 / 2001
  • On a NOW (Network of Workstations), load sharing plays an important role in improving performance. The known load-sharing strategies are fixed granularity, variable granularity, and adaptive granularity. The variable-granularity algorithm is sensitive to various parameters, whereas the Send algorithm, which implements the fixed-granularity strategy, is robust to task granularity, and the performance difference between Send and the variable-granularity algorithm is not substantial. In the Send algorithm, however, computation and communication are not overlapped, so long network latency affects the execution time of the parallel program. In this paper, we propose the preSend algorithm, in which the master node can send data to the slave nodes in advance without waiting for partial results from the slaves. Because the master sends the next data in advance, the slave nodes can process it without idle time; thus the preSend algorithm overlaps computation with communication and reduces the effect of long network latency on the execution time of the parallel program on the NOW. To compare the execution times of the two algorithms, we use a 320×320 matrix multiplication. The results show that the preSend algorithm has a shorter execution time than the Send algorithm.

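The contrast the abstract draws between Send and preSend comes down to whether the master waits for a partial result before shipping the next block. The sketch below (not the paper's code; the block sizes and delays are made-up stand-ins for the 320×320 matrix multiplication) simulates both behaviours with a single slave thread so that pre-sending visibly overlaps transfer with computation.

```python
import threading, queue, time

COMPUTE = 0.05   # stand-in for multiplying one row block (assumed)
TRANSFER = 0.05  # stand-in for network latency/transfer (assumed)

def slave(tasks: queue.Queue, results: queue.Queue) -> None:
    while True:
        block = tasks.get()
        if block is None:           # poison pill: no more work
            break
        time.sleep(COMPUTE)         # "compute" on the received block
        results.put(sum(block))     # stand-in for a partial product

def run(blocks, presend: bool) -> float:
    tasks, results = queue.Queue(), queue.Queue()
    worker = threading.Thread(target=slave, args=(tasks, results))
    worker.start()
    start = time.time()
    for block in blocks:
        time.sleep(TRANSFER)        # "send" the block to the slave
        tasks.put(block)
        if not presend:             # Send: wait for the partial result
            results.get()           # before transferring the next block
    tasks.put(None)
    worker.join()
    if presend:                     # preSend: collect all results at the end
        for _ in blocks:
            results.get()
    return time.time() - start

if __name__ == "__main__":
    blocks = [[i, i + 1, i + 2] for i in range(0, 30, 3)]
    print(f"Send    : {run(blocks, presend=False):.2f}s")
    print(f"preSend : {run(blocks, presend=True):.2f}s")
```

With the equal delays assumed above, the Send variant takes roughly twice as long as the preSend variant, mirroring the overlap argument in the abstract.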

A study of the relations between the Silver halide Grain structure in Emulsion and the Granularity (유제중의 AgX grain의 형태와 입상도에 관한 연구)

  • JeWungOh
    • Journal of the Korean Graphic Arts Communication Society / v.8 no.1 / pp.21-53 / 1990
  • In analysing image quality, one of the most important factors to consider is the granularity at a given emulsion speed. To enhance image quality, granularity should be lowered by suitable means, such as controlling the emulsion design, the grain size and structure, and the distribution of grains in the emulsion. In this paper, the relations between AgX grain structure and granularity are studied as a way of lowering granularity. The results show that grain structure is a very important factor in determining granularity characteristics.

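For reference, the quantity usually reported as granularity in photographic work is the RMS granularity: the standard deviation of micro-density readings taken with a small scanning aperture over a uniformly exposed patch, commonly quoted ×1000. The sketch below computes it for a simulated microdensitometer trace; the noise level and scale factor are illustrative assumptions, not values from the paper.

```python
import numpy as np

def rms_granularity(density_samples, scale=1000.0):
    """Conventional RMS granularity: the standard deviation of density
    readings taken through a small scanning aperture, usually reported x1000.
    (This is the standard photographic definition, not necessarily the exact
    procedure used in the paper above.)"""
    d = np.asarray(density_samples, dtype=float)
    return scale * d.std(ddof=1)

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    # Simulated microdensitometer trace of a uniformly exposed patch:
    # mean density 1.0 with small random fluctuations caused by the grain.
    trace = 1.0 + rng.normal(0.0, 0.006, 2000)
    print(f"RMS granularity ~ {rms_granularity(trace):.1f}")
```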

Optimal SVM learning method based on adaptive sparse sampling and granularity shift factor

  • Wen, Hui;Jia, Dongshun;Liu, Zhiqiang;Xu, Hang;Hao, Guangtao
    • KSII Transactions on Internet and Information Systems (TIIS) / v.16 no.4 / pp.1110-1127 / 2022
  • To improve the training efficiency and generalization performance of a support vector machine (SVM) in a large-scale set, an optimal SVM learning method based on adaptive sparse sampling and the granularity shift factor is presented. The proposed method combines sampling optimization with learner optimization. First, an adaptive sparse sampling method based on the potential function density clustering is designed to adaptively obtain sparse sampling samples, which can achieve a reduction in the training sample set and effectively approximate the spatial structure distribution of the original sample set. A granularity shift factor method is then constructed to optimize the SVM decision hyperplane, which fully considers the neighborhood information of each granularity region in the sparse sampling set. Experiments on an artificial dataset and three benchmark datasets show that the proposed method can achieve a relatively higher training efficiency, as well as ensure a good generalization performance of the learner. Finally, the effectiveness of the proposed method is verified.
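
A rough sketch of the first stage described above, under stated assumptions: the "potential function" density is approximated with an RBF kernel sum, the densest points of each class are kept as the sparse sample, and a standard scikit-learn SVC is trained on the reduced set. The paper's granularity shift factor is not defined in the abstract, so that step is omitted here.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

def potential(X, sigma=1.0):
    """RBF 'potential function' density estimate at every point
    (a common stand-in; the paper's exact clustering procedure is not
    described in the abstract)."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2)).sum(axis=1)

def sparse_sample(X, y, keep_ratio=0.3, sigma=1.0):
    """Keep the highest-potential (densest) points of each class so the
    reduced set still roughly approximates the spatial structure of the data."""
    idx = []
    for c in np.unique(y):
        cls = np.flatnonzero(y == c)
        p = potential(X[cls], sigma)
        k = max(1, int(keep_ratio * len(cls)))
        idx.extend(cls[np.argsort(p)[-k:]])
    return X[idx], y[idx]

if __name__ == "__main__":
    X, y = make_classification(n_samples=1500, n_features=10, random_state=0)
    Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)
    Xs, ys = sparse_sample(Xtr, ytr, keep_ratio=0.3)
    print("reduced training set:", len(ys), "of", len(ytr))
    clf = SVC(kernel="rbf", gamma="scale").fit(Xs, ys)   # granularity-shift step omitted
    print("test accuracy:", round(clf.score(Xte, yte), 3))
```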

A Study on ICD-11 through Mapping to KCD-8 - Focusing on the Circulatory and Respiratory System -

  • Hyun-Kyung LEE;Yoo-Kyung BOO
    • Journal of Wellbeing Management and Applied Psychology / v.6 no.3 / pp.1-11 / 2023
  • Purpose: This research aims to facilitate a smooth transition from KCD-8 to ICD-11 through the study of ICD-11. Research design, data and methodology: Skilled Health Information Managers (HIMs) in Korea performed manual mapping and studied the code structure of ICD-11 chapters 11 and 12. Results: When the granularity of ICD-11 and KCD-8 was compared, 58.1% of ICD-11 codes showed higher granularity and 38.6% showed similar granularity. The granularity of the circulatory system was higher than that of the respiratory system. When the KCD-8 codes mapped to ICD-11 were compared with the total of 924 KCD-8 codes, about 50% of KCD-8 codes were found not to be mapped to ICD-11; that is, about half of the diseases coded individually in KCD-8 do not have individual codes in ICD-11. Conclusions: ICD-11 demonstrated high granularity, indicating its effectiveness in describing the cutting-edge medical technology of modern society. However, we also observed that some diseases present in KCD-8 were removed while others were newly added in ICD-11. To ensure a smooth transition of statistics from KCD-8 to ICD-11, especially for the leading domestic diseases, integrated management will be necessary, including analysis of the newly added and removed codes, preparation of a KCD-9 that reflects ICD-11, and ICD-11 training.

Reuse of KBS components

  • Oussalah, M.;Messaadia, K.
    • Proceedings of the Korea Intelligent Information System Society Conference / 2001.01a / pp.385-392 / 2001
  • This paper proposes a meta-modeling technique that describes a KBS along three axes: the object-of-reuse axis, the levels-of-granularity axis, and the reuse-process axis. The object-of-reuse axis allows a KBS to be viewed as a set of inter-related components for reuse purposes. The levels-of-granularity axis describes the KBS components at different levels of granularity for clarity and reuse. The reuse-process axis views the KBS components as (re)usable components.

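One way to picture the three description axes is as fields of a component record. The sketch below is a hypothetical rendering only; the level names, process values, and attribute names are assumptions, not the authors' notation.

```python
# Hypothetical rendering of the paper's three description axes as a data
# structure; enum values and attribute names are illustrative assumptions.
from dataclasses import dataclass, field
from enum import Enum

class GranularityLevel(Enum):
    TASK = 1          # coarse: whole problem-solving task
    METHOD = 2        # intermediate: problem-solving method
    PRIMITIVE = 3     # fine: individual inference step / domain element

class ReuseProcess(Enum):
    FOR_REUSE = "developed for reuse"
    BY_REUSE = "developed by reusing existing components"

@dataclass
class KBSComponent:
    name: str
    level: GranularityLevel            # levels-of-granularity axis
    process: ReuseProcess              # reuse-process axis
    related_to: list = field(default_factory=list)   # object-of-reuse axis: inter-related components

diagnosis = KBSComponent("DiagnosisTask", GranularityLevel.TASK, ReuseProcess.BY_REUSE)
cover = KBSComponent("CoverAndDifferentiate", GranularityLevel.METHOD, ReuseProcess.FOR_REUSE)
diagnosis.related_to.append(cover)
print(diagnosis.name, "->", [c.name for c in diagnosis.related_to])
```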

Design and Evaluation of Function-granularity kernel update in dynamic manner (함수 단위 동적 커널 업데이트 시스템의 설계와 평가)

  • Park, Hyun-Chan;Kim, Se-Won;Yoo, Chuck
    • IEMEK Journal of Embedded Systems and Applications / v.2 no.3 / pp.145-154 / 2007
  • Dynamic kernel update can change kernel functionality and fix bugs at runtime. Dynamic update is important because it improves the availability, reliability, and flexibility of the kernel. Instruction-granularity update techniques have been used for dynamic update, but they are difficult to apply to a commodity operating system kernel because the update code must be developed and maintained in assembly language. To overcome this difficulty, we design a function-granularity dynamic update system that uses a high-level language such as C. The proposed system makes developing and applying updates convenient by providing the same development environment for update code as for the kernel itself. We implement the system for Linux and demonstrate an update of the do_coredump() function, which has a reported security vulnerability. The update was executed successfully.

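The actual mechanism works inside the Linux kernel and is written in C; purely as an illustration of what "function-granularity" update means, the sketch below swaps a single function of a running object for a patched version without stopping anything else. The class and function bodies are hypothetical stand-ins, with do_coredump echoing the example named in the abstract.

```python
# Language-neutral illustration of function-granularity dynamic update:
# replace one function with a patched version while the "system" keeps
# running. The real mechanism in the paper operates inside the Linux kernel
# in C; this sketch only mirrors the granularity of the update, not the
# kernel-level redirection machinery.
import types

class RunningSystem:
    """Stand-in for a running kernel that keeps calling one of its functions."""
    def do_coredump(self, pid: int) -> str:
        return f"[v1] dumping core of process {pid} (vulnerable version)"

def do_coredump_patched(self, pid: int) -> str:
    return f"[v2] dumping core of process {pid} (patched version)"

system = RunningSystem()
print(system.do_coredump(1234))                     # old behaviour

# Function-granularity update: rebind just this one function, nothing else.
system.do_coredump = types.MethodType(do_coredump_patched, system)
print(system.do_coredump(1234))                     # new behaviour, no restart
```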

Six color separation using the color difference and granularity (색차와 낟알 무늬 값을 이용한 6색 분리 방법)

  • 손창환;김윤태;조양호;하영호
    • Proceedings of the IEEK Conference / 2003.11a / pp.245-248 / 2003
  • This paper proposes a six-color separation method using the color difference and granularity. The conventional method based only on the color difference increases graininess in bright regions because of the use of cyan or magenta. To reduce the graininess in bright regions, we propose a six-color separation that minimizes graininess within a color-difference tolerance. Granularity is first calculated from the standard deviation of the lightness and chrominance values in S-CIELAB space and is then applied to the color-difference-based six-color separation. The proposed six-color separation using the color difference and granularity reduces graininess in bright regions and yields smooth tones.

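The selection rule the abstract describes can be summarized as: among ink combinations whose color difference from the target is within tolerance, pick the least grainy one. The sketch below encodes only that rule; the candidate values are invented, and in the paper the color difference would come from CIELAB and the granularity from the standard deviation of lightness/chrominance in S-CIELAB space.

```python
# Minimal sketch of the selection rule: within the colour-difference
# tolerance, choose the ink combination with the smallest granularity.
# All candidate numbers below are made up for illustration.
from dataclasses import dataclass

@dataclass
class Candidate:
    name: str
    delta_e: float       # colour difference from the target colour
    granularity: float   # std of lightness/chrominance (halftone graininess)

def separate(candidates, tolerance=2.0):
    within = [c for c in candidates if c.delta_e <= tolerance]
    if not within:                                         # nothing inside the tolerance:
        return min(candidates, key=lambda c: c.delta_e)    # fall back to the closest colour
    return min(within, key=lambda c: c.granularity)        # least grainy reproduction

candidates = [
    Candidate("cyan-heavy",       delta_e=0.8, granularity=4.1),
    Candidate("light-cyan-heavy", delta_e=1.6, granularity=1.3),
    Candidate("mixed",            delta_e=2.4, granularity=0.9),
]
print(separate(candidates).name)   # -> light-cyan-heavy
```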

Analysis of marine sediments between fishing area and non-fishing area in the shrimp beam trawl (새우조망 조업구역과 비조업구역의 해저퇴적물 분석)

  • Cho, Sam-Kwang;Yang, Yong-Soo;Cha, Bong-Jin;Seo, Young-Kyo
    • Journal of the Korean Society of Fisheries and Ocean Technology / v.48 no.3 / pp.208-216 / 2012
  • The properties of sediment collected from the seabed surface to a depth of 6 cm at four positions were analyzed to investigate the disturbance of marine sediments by the shrimp beam trawl. The sediment types in the investigation area were (g)mS (slightly gravelly muddy sand) and gmS (gravelly muddy sand), which have high sand content, as well as (g)sM (slightly gravelly sandy mud) and gsM (gravelly sandy mud). Position appears to be a more important factor than seasonal difference for the granularity variation of the sediment in each investigation area. The positional characteristics of sediment granularity were difficult to identify before shells and organic matter were removed; once they were removed, the average granularity increased from the inland sea toward the open sea. The granularity distribution of the marine sediment in the shrimp beam trawl fishing area became narrower after this processing, whereas in the non-fishing area there was no large difference before and after processing. This may be attributed to crushed shell particles being resuspended and redeposited on the surface in the fishing area. To confirm this hypothesis, the sediments stirred up by the shrimp beam trawl need to be collected and analyzed.