• Title/Summary/Keyword: Page Region


Effect of ASLR on Memory Duplicate Ratio in Cache-based Virtual Machine Live Migration

  • Piao, Guangyong;Oh, Youngsup;Sung, Baegjae;Park, Chanik
    • IEMEK Journal of Embedded Systems and Applications
    • /
    • v.9 no.4
    • /
    • pp.205-210
    • /
    • 2014
  • The cache-based live migration method uses a cache accessible to both sides (remote and local) to reduce virtual machine migration time by transferring only data not already present in the cache. However, address space layout randomization (ASLR) has been shown to reduce the memory duplicate ratio between the memory to be migrated and the migration cache. In this paper, we analyze the behavior of ASLR to find out how it changes the physical memory contents of virtual machines. We found that, among the six virtual memory regions, only the modification to the stack influences the page-level memory duplicate ratio. Experiments showed that: (1) ASLR does not shift the heap region at the sub-page level; (2) with ASLR enabled, the stack reduces the size of pages duplicated among VMs that performed input replay by about 40 MB; and (3) the size of memory pages that can be reconstructed from the freshly booted state is also reduced by about 60 MB under ASLR. Based on these observations, the stack region can be omitted when applying the cache-based migration method, while for the other five regions even a coarse page-level redundancy detection method can identify most of the duplicate memory contents.
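The page-level redundancy detection the abstract refers to can be pictured with a small sketch. This is a minimal illustration, not code from the paper; the 4 KiB page size and SHA-256 hashing are assumptions chosen for demonstration.

```python
# Minimal sketch: page-level duplicate detection between a VM memory image
# and a migration cache using content hashes (page size and hash are assumed).
import hashlib

PAGE_SIZE = 4096  # assume 4 KiB pages

def page_hashes(image: bytes):
    """Hash every page of a memory image; the last page is zero-padded."""
    hashes = set()
    for off in range(0, len(image), PAGE_SIZE):
        page = image[off:off + PAGE_SIZE].ljust(PAGE_SIZE, b"\x00")
        hashes.add(hashlib.sha256(page).digest())
    return hashes

def duplicate_ratio(vm_image: bytes, cache_image: bytes) -> float:
    """Fraction of distinct VM page contents already present in the cache."""
    vm, cache = page_hashes(vm_image), page_hashes(cache_image)
    return len(vm & cache) / max(len(vm), 1)

# Example: two images that share half of their distinct page contents.
a = bytes(PAGE_SIZE) + b"\x01" * PAGE_SIZE
b = bytes(PAGE_SIZE) + b"\x02" * PAGE_SIZE
print(duplicate_ratio(a, b))  # 0.5
```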

Caret Unit Generation Method from PC Web for Mobile Device (캐럿 단위를 이용한 PC 웹 컨텐츠를 모바일 단말기에 서비스 하는 방법)

  • Park, Dae-Hyuck;Kang, Eui-Sun;Lim, Young-Hwan
    • The KIPS Transactions:PartD
    • /
    • v.14D no.3 s.113
    • /
    • pp.339-346
    • /
    • 2007
  • The objective of this study is to satisfy the requirements of a variety of terminals for playing wired web page contents in a ubiquitous environment constantly connected to the network. In other words, this study aims to automatically transcode wired web pages into mobile web pages so that the contents of Internet web pages can be served to mobile devices. To achieve this, we suggest a method in which the URL of a web page is entered directly on the mobile device and the contents of the current web page are checked. The web page is then converted into an image and configured into a mobile web page suitable for personal terminals. Users obtain the effect of desktop web browsing through interfaces for enlarging, reducing, and panning the page as desired. This is a caret-unit play method, in which the contents of a web page are transcoded and played to suit each user. With the proposed method, the contents of a wired web page can be played on a mobile device, and this study confirms that a single content item can be served to suit users of various terminals. As a result, numerous wired web contents can be reused as mobile web contents.

Analyzing errors in selectivity estimation using the multilevel grid file (계층 그리드 화일을 이용한 선택률 추정에서 발생되는 오차 분석)

  • 김상욱;황환규;황규영
    • Journal of the Korean Institute of Telematics and Electronics B
    • /
    • v.33B no.9
    • /
    • pp.24-36
    • /
    • 1996
  • In this paper, we discuss the errors in selectivity estimation using the multilevel grid file (MLGF). We first demonstrate that the estimation errors stem from the uniformity assumption that records are uniformly distributed within the region represented by an entry in a level of an MLGF directory. Based on this demonstration, we then investigate five factors affecting the accuracy of estimation: (1) the data distribution in a region, (2) the number of records stored in an MLGF, (3) the page size, (4) the query region size, and (5) the level of the MLGF directory. Next, we present the tendency of the estimation errors as the value of each factor changes, through experiments. The results show that the errors decrease when (1) the distribution of records in a region becomes closer to uniform, (2) the number of records in an MLGF increases, (3) the page size decreases, (4) the query region size increases, and (5) the level of the MLGF directory employed as data distribution information becomes lower. After defining the granule ratio, the core quantity capturing the basic relationship between the estimation errors and the above five factors, we finally examine the change of the estimation errors as the granule ratio varies, through experiments. The results indicate that the errors are similar for similar values of the granule ratio, regardless of the various changes in the values of the five factors.
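The uniformity assumption behind these estimation errors can be made concrete with a small sketch. This is a minimal illustration, not the MLGF implementation; the rectangular directory regions and record counts below are made-up values.

```python
# Minimal sketch: selectivity estimation under the uniformity assumption.
# Each directory entry covers a rectangular region with a known record count;
# records are assumed to be spread uniformly inside that region.
def overlap_fraction(region, query):
    """Fraction of a region's area covered by an axis-aligned query rectangle."""
    frac = 1.0
    for (rlo, rhi), (qlo, qhi) in zip(region, query):
        inter = max(0.0, min(rhi, qhi) - max(rlo, qlo))
        frac *= inter / (rhi - rlo)
    return frac

def estimate_selectivity(entries, query, total_records):
    """entries: list of (region, record_count); region = [(xlo, xhi), (ylo, yhi)]."""
    est = sum(count * overlap_fraction(region, query) for region, count in entries)
    return est / total_records

# Two equal-sized regions with skewed counts; the query covers the left half.
entries = [([(0, 1), (0, 1)], 90), ([(1, 2), (0, 1)], 10)]
print(estimate_selectivity(entries, [(0, 0.5), (0, 1)], 100))  # 0.45
```

If the 90 records of the first region were actually clustered outside the query rectangle, the true selectivity could be far from 0.45; that gap is exactly the error source the abstract attributes to the uniformity assumption.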


2R++: Enhancing 2R FTL to Identify Warm Pages (2R++: Warm Page 식별을 통한 2R FTL 개선)

  • An, Hyojun;Lee, Sangwon
    • KIPS Transactions on Computer and Communication Systems
    • /
    • v.11 no.12
    • /
    • pp.419-428
    • /
    • 2022
  • Since in-place updates of pages are not allowed in flash memory, every new page write must be performed out of place, and the old, overwritten pages are invalidated. These invalidated pages eventually trigger the costly garbage collection process, which causes numerous read and write operations and is one of flash memory's major performance issues. 2R modified the garbage collection algorithm to exploit the I/O characteristics of on-line transaction processing (OLTP) workloads and improve the write amplification factor, but it suffers from a region pollution problem. In this paper, we therefore develop 2R++, which additionally separates pages with long access intervals to solve the region pollution problem. 2R++ introduces an extra bit per block to separate warm pages based on a second-chance mechanism, preventing warm pages from being misidentified as cold pages. We conducted experiments on TPC-C and Linkbench for performance comparison, and 2R++ improved the write amplification factor over 2R by 57.8% and 13.8% on the two workloads, respectively.
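The second-chance idea mentioned in the abstract can be sketched roughly as follows. This is a loose illustration only, not the paper's 2R++ logic: 2R++ keeps the extra bit per block, whereas this toy version tracks individual page identifiers for readability.

```python
# Loose sketch: a second-chance bit that keeps recently re-accessed ("warm")
# pages from being classified as cold at garbage-collection time.
class SecondChanceClassifier:
    def __init__(self):
        self.referenced = {}  # page id -> second-chance bit

    def on_access(self, page):
        # Any access sets the page's second-chance bit.
        self.referenced[page] = True

    def classify_on_gc(self, page):
        """At GC time: a set bit means the page is warm and the bit is consumed;
        a clear bit means the page is treated as cold."""
        if self.referenced.get(page, False):
            self.referenced[page] = False  # consume the second chance
            return "warm"
        return "cold"

clf = SecondChanceClassifier()
clf.on_access(7)
print(clf.classify_on_gc(7))  # warm (bit consumed)
print(clf.classify_on_gc(7))  # cold (no access since last GC)
```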

PMU (Performance Monitoring Unit)-Based Dynamic XIP (eXecute In Place) Technique for Embedded Systems (내장형 시스템을 위한 PMU (Performance Monitoring Unit) 기반 동적 XIP (eXecute In Place) 기법)

  • Kim, Dohun;Park, Chanik
    • IEMEK Journal of Embedded Systems and Applications
    • /
    • v.3 no.3
    • /
    • pp.158-166
    • /
    • 2008
  • Mobile embedded systems increasingly adopt flash memory capable of XIP because it can reduce memory usage, power consumption, and software load time. XIP gives the processor direct access to ROM and flash memory. However, using XIP incurs unnecessary degradation of application performance, because direct access to ROM and flash memory has longer latency than access to main memory. In this paper, we propose a memory management framework, dynamic XIP, which resolves this performance degradation. Using a constrained RAM cache, dynamic XIP dynamically changes the XIP region according to the page access pattern in order to reduce the execution-time and energy-consumption penalties inherent to XIP. The proposed framework consists of a page profiler that gathers applications' memory access patterns using the PMU and an XIP manager that decides whether a page is accessed in main memory or in flash memory. The framework is implemented and evaluated in the Linux kernel. Our evaluation shows that it can reduce execution time by up to 25% and energy consumption by up to 22% compared with the XIP-only configuration adopted in typical mobile embedded systems. Moreover, our modified LRU algorithm with code page filters reduces execution time and energy consumption by up to 90% and 80%, respectively, compared with applying the existing LRU algorithm to dynamic XIP.
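The placement decision such an XIP manager makes can be pictured with a small sketch. This is a minimal illustration, not the paper's kernel implementation; the per-page access counts and the RAM cache size stand in for PMU-profiled data and are assumptions.

```python
# Minimal sketch: promote the most frequently accessed code pages into a
# constrained RAM cache; all other pages stay execute-in-place in flash.
def choose_ram_pages(access_counts, ram_cache_pages):
    """access_counts: page id -> profiled access count (e.g., from a PMU)."""
    ranked = sorted(access_counts, key=access_counts.get, reverse=True)
    in_ram = set(ranked[:ram_cache_pages])
    return {page: ("RAM" if page in in_ram else "flash-XIP")
            for page in access_counts}

counts = {"pageA": 120, "pageB": 3, "pageC": 57}
print(choose_ram_pages(counts, ram_cache_pages=2))
# {'pageA': 'RAM', 'pageB': 'flash-XIP', 'pageC': 'RAM'}
```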


Automatic Reconstruction of Web Pages for Mobile Devices (무선 단말기를 위한 웹 페이지의 자동 재구성)

  • Song, Dong-Rhee;Hwang, Een-Jun
    • The KIPS Transactions:PartB
    • /
    • v.9B no.5
    • /
    • pp.523-532
    • /
    • 2002
  • Recently, with the wide spread of the Internet and the development of wireless network technology, it has become possible to access web pages anytime, anywhere through devices with small displays such as PDAs. But since most existing web pages are optimized for desktop computers, browsing them on a small screen over a wireless network requires more scrolling and longer loading times. In this paper, we propose a page reconstruction scheme called PageMap that makes it feasible to navigate existing web pages on small-screen devices even over a wireless connection. Reconstructed pages reduce the file and page size and thus the resource requirements. We have implemented a prototype system and performed several experiments on typical web sites; we report some of the results.

Characteristics of IEF Patterns and SDS-PAGE Results of Korean EPO Biosimilars

  • Kang, Min-Jung;Shin, Sang-Mi;Yoo, Hey-Hyun;Kwon, Oh-Seung;Jin, Chang-Bae
    • Bulletin of the Korean Chemical Society
    • /
    • v.31 no.9
    • /
    • pp.2493-2496
    • /
    • 2010
  • Erythropoietin (EPO) is mainly produced in the kidney and stimulates erythropoiesis. The use of recombinant EPO for doping is prohibited because of its performance-enhancing effect. This study investigated whether biosimilar EPOs can be differentiated from the endogenous one by isoelectric focusing (IEF) plus double blotting and by SDS-PAGE for anti-doping analysis. The established method was validated with positive control urine; the band patterns were reproducible and met the criteria set by the World Anti-Doping Agency (WADA). Isoelectric focusing was conducted in the pH range 2 to 6. Recormon (La Roche), Aropotin (Kunwha), Epokine (CJ Pharm Co.), Eporon (Dong-A), Espogen (LG Life Sciences), and Dynepo (Shire Pharmaceuticals) were detected in the basic region. All biosimilars showed isoelectric profiles distinguishable from the endogenous EPO profile, but their band patterns differed from the reference, except for Epokine (CJ Pharm Co.). SDS-PAGE of the biosimilar EPOs showed molecular weight distributions higher than that of endogenous EPO. A commercial immunoassay kit used as an immunoaffinity purification tool and antibody-coated magnetic beads were tested for the purification and concentration of EPO from the urinary matrix; the antibody-coated magnetic beads gave the better purification yield. The established IEF plus double blotting and SDS-PAGE method with immunoaffinity purification can be used to discriminate biosimilar EPOs from endogenous EPO.

Document Image Segmentation and Classification using Texture Features and Structural Information (텍스쳐 특징과 구조적인 정보를 이용한 문서 영상의 분할 및 분류)

  • Park, Kun-Hye;Kim, Bo-Ram;Kim, Wook-Hyun
    • Journal of the Institute of Convergence Signal Processing
    • /
    • v.11 no.3
    • /
    • pp.215-220
    • /
    • 2010
  • In this paper, we propose a new texture-based page segmentation and classification method in which the table, background, image, and text regions of a given document image are identified automatically. The proposed method consists of two stages: document segmentation and content classification. In the first stage we segment the document image, and in the second stage we classify the contents of the document. The classification is based on texture analysis: each content type in the document is treated as a region with a distinct texture, so classifying document contents can be posed as a texture segmentation and analysis problem. Two-dimensional Gabor filters are used to extract texture features for each of these regions. Our method does not assume any a priori knowledge about the content or language of the document. As the experimental results show, the method performs well in document segmentation and content classification, and the proposed system is expected to be applicable to areas such as multimedia data retrieval and real-time image processing.
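The Gabor-filter feature extraction step can be pictured with a short sketch. This is a minimal illustration, not the paper's pipeline; the choice of scikit-image and the frequency/orientation parameters are assumptions.

```python
# Minimal sketch: texture features from a small bank of 2-D Gabor filters,
# as one might compute before classifying text / image / table / background.
import numpy as np
from skimage.filters import gabor

def gabor_features(gray_image, frequencies=(0.1, 0.3), thetas=(0, np.pi / 2)):
    """Mean filter-response magnitude for each (frequency, orientation) pair."""
    feats = []
    for f in frequencies:
        for t in thetas:
            real, imag = gabor(gray_image, frequency=f, theta=t)
            feats.append(np.sqrt(real ** 2 + imag ** 2).mean())
    return np.array(feats)

# A toy "document patch": vertical stripes respond differently per orientation.
patch = np.tile(np.array([0.0, 1.0] * 16), (32, 1))
print(gabor_features(patch))  # 4-dimensional texture feature vector
```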

A Physical Storage Design Method for Access Structures of Image Information Systems

  • Lee, Jung-A;Lee, Jong-Hak
    • Journal of Information Processing Systems
    • /
    • v.14 no.5
    • /
    • pp.1150-1166
    • /
    • 2018
  • This paper presents a physical storage design method for image access structures using transformation techniques of multidimensional file organizations in image information systems. Physical storage design is the process of determining the access structures that provide optimal query processing performance for a given set of queries; so far, there has been no such attempt for image information systems. We first show that the number of pages to be accessed decreases as the shape of the given retrieval query region and that of the data page region become more similar in the transformed domain space. Using this property, we propose a method for finding an optimal image access structure by controlling the shapes of the page regions. For the performance evaluation, we performed extensive experiments with a multidimensional file organization using transformation techniques. The results indicate that, within the scope of the experiments, our proposed method is between one and five times faster than the conventional method depending on the query pattern, confirming that the proposed physical storage design method is useful in practice.
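The observation that fewer pages are accessed when page regions are shaped like the query region can be illustrated under simplified assumptions (a uniform grid of equal-area page regions; all values below are made up), rather than the paper's transformed-domain analysis.

```python
# Minimal sketch: count how many equal-area page regions an axis-aligned
# rectangular query touches for two different page-region shapes.
import math

def pages_touched(page_shape, query):
    """Number of grid cells (page regions) overlapping the query rectangle."""
    count = 1
    for cell, (qlo, qhi) in zip(page_shape, query):
        first, last = math.floor(qlo / cell), math.ceil(qhi / cell) - 1
        count *= (last - first + 1)
    return count

query = [(10.0, 74.0), (10.0, 14.0)]       # elongated query: 64 x 4
print(pages_touched((16, 16), query))      # square 16x16 pages  -> 5
print(pages_touched((32, 8), query))       # query-shaped 32x8 pages -> 3
```

Both page shapes cover the same area (256 units), but the pages whose shape resembles the query are touched fewer times, which is the effect the proposed design method exploits.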

A Study of Image Enhancement Processing for Letter Extraction of Image Using Terahertz Signal (테라헤르츠 신호를 이용한 영상의 글자 추출을 위한 화질 개선처리에 대한 연구)

  • Kim, Seongyoon;Choi, Hyunkeun;Park, Inho;Kim, Youngseop;Lee, Yonghwan
    • Journal of the Semiconductor & Display Technology
    • /
    • v.16 no.3
    • /
    • pp.111-115
    • /
    • 2017
  • Terahertz waves outperform conventional X-ray or magnetic resonance imaging (MRI) in several respects, and the amount of information they can convey can be thousands of times larger. In addition, terahertz waves perform well in analyzing objects with layered structures. Exploiting this advantage, the letters on a page of a closed book can be extracted by irradiating the book with pulses at various frequencies within the terahertz range and analyzing information such as the absorbed and reflected amounts. However, the image obtained for each page with terahertz waves may contain various kinds of noise as well as regions where characters from other pages cause occlusion, so the noise and occlusion regions must be removed before letters can be extracted. We have been working to enhance the image quality in various ways and continue to study de-noising for improved image quality and higher resolution. Finally, we are also studying OCR (optical character recognition) technology based on pattern matching to read the letters.
