• Title/Summary/Keyword: Block Based Information

A Block based 3D Map for Recognizing Three Dimensional Spaces (3차원 공간의 인식을 위한 블록기반 3D맵)

  • Yi, Jong-Su;Kim, Jun-Seong
    • Journal of the Institute of Electronics Engineers of Korea CI / v.49 no.4 / pp.89-96 / 2012
  • A 3D map provides useful information for intelligent services. Traditional 3D maps, however, consist of raw image data and are not suitable for real-time applications. In this paper, we propose the Block-based 3D map, which represents three-dimensional spaces as a collection of square blocks. The Block-based 3D map has two major variables: an object ratio and a block size. The object ratio is defined as the proportion of object pixels to space pixels in a block and determines the type of the block. The block size is defined as the number of pixels along the side of a block and determines the size of the block. Experiments show the advantage of the Block-based 3D map in reducing noise and in reducing the amount of data to be processed. With a block size of $40{\times}40$ and an object ratio of 30% to 50%, we obtain the best-matched Block-based 3D map for a $320{\times}240$ depth map. The Block-based 3D map provides useful information that can enable a variety of new high-value services in intelligent environments.
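
The abstract does not spell out how a block is typed from the depth map, so the following Python sketch fills the gap with assumptions: a pixel counts as an "object pixel" when its depth falls below a hypothetical `depth_threshold`, and a block becomes an object block when its object ratio reaches the chosen threshold. It is a minimal illustration of the block-classification idea, not the paper's implementation.

```python
import numpy as np

def block_based_3d_map(depth_map, block_size=40, object_ratio=0.4, depth_threshold=1000):
    """Classify square blocks of a depth map as object or free space.

    Assumption: a pixel is an 'object pixel' when its depth is closer than
    depth_threshold; a block is an object block when the proportion of
    object pixels in it reaches object_ratio.
    """
    h, w = depth_map.shape
    rows, cols = h // block_size, w // block_size
    block_map = np.zeros((rows, cols), dtype=bool)
    for r in range(rows):
        for c in range(cols):
            block = depth_map[r * block_size:(r + 1) * block_size,
                              c * block_size:(c + 1) * block_size]
            ratio = np.count_nonzero(block < depth_threshold) / block.size
            block_map[r, c] = ratio >= object_ratio
    return block_map

# Example: a 320x240 depth map reduces to a 6x8 grid of 40x40 blocks.
depth = np.random.randint(0, 4096, size=(240, 320))
print(block_based_3d_map(depth).shape)   # (6, 8)
```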

Neural Network Model-based Algorithm for Identifying Job Status in Block Assembly Shop for Shipbuilding (신경망 모델 기반 조선소 조립공장 작업상태 판별 알고리즘)

  • Hong, Seung-Taek;Choi, Jin-Young;Park, Sang-Chul
    • IE interfaces / v.24 no.3 / pp.267-273 / 2011
  • In the shipbuilding industry, production processes are so complicated that data collection for decision making cannot be fully automated, so most production planning and control is based on information provided by field workers. Without sufficient information, it is therefore very difficult to manage the whole production process efficiently. Job status is one of the most important pieces of information used for estimating the remaining processing time in production control, specifically in the block assembly shop. Currently it is checked manually by a production manager, and the production plan is modified based on that information, which can delay production control and degrade performance. Motivated by these remarks, in this paper we propose an efficient algorithm for identifying job status in a block assembly shop for shipbuilding. The algorithm is based on a multi-layer perceptron neural network model using two key factors as input parameters. We show the superiority of the algorithm through a numerical experiment based on real data collected from a block assembly shop.
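
As a rough illustration of this classification setup, the sketch below trains a small multi-layer perceptron on two input factors using scikit-learn's MLPClassifier as a stand-in. The two key factors, the training data, the labels, and the network size are all placeholders; the abstract does not name them.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

# Hypothetical training data: two input factors per assembly job and a
# binary job-status label. Real factors and labels come from shop-floor data.
rng = np.random.default_rng(0)
X = rng.random((200, 2))                       # two key input factors (placeholder)
y = (X[:, 0] + X[:, 1] > 1.0).astype(int)      # placeholder status labels

model = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0)
model.fit(X, y)
print(model.predict([[0.7, 0.6]]))             # predicted job status for a new job
```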

Analysis of the Efficiency for Some Selected Double-Block-Length Hash Functions Based on AES/LEA (AES/LEA 기반 이중블록길이 해쉬함수에 대한 효율성 분석)

  • Kim, Dowon;Kim, Jongsung
    • Journal of the Korea Institute of Information Security & Cryptology / v.26 no.6 / pp.1353-1360 / 2016
  • We analyze the efficiency of the double-block-length hash functions Abreast-DM, HIROSE, MDC-2, MJH, and MJH-Double based on AES or LEA. We use optimized open-source code for AES and our own implementation for LEA. As a result, the hash functions based on LEA are generally more efficient than those based on AES. In terms of speed, the hash functions with LEA are 6% to 19% faster than those with AES, except for Abreast-DM. In terms of memory, the hash functions with LEA are 20 to 30 times more efficient than those with AES.

Reversible Data Hiding Scheme Based on Maximum Histogram Gap of Image Blocks

  • Arabzadeh, Mohammad;Rahimi, Mohammad Reza
    • KSII Transactions on Internet and Information Systems (TIIS) / v.6 no.8 / pp.1964-1981 / 2012
  • In this paper, a reversible data hiding scheme based on histogram shifting of host image blocks is presented. The method attempts to use the full available capacity for data embedding by dividing the image into non-overlapping blocks. Applying histogram shifting to each block requires extra information to be saved as overhead data for each block. This extra information (overhead or bookkeeping information) is used to extract the payload and to recover the block to its original state. A method to eliminate the need for this extra information is also introduced: it uses the maximum gap between histogram bins to find the pixel value that was used for embedding at the sender side. Experimental results show that the proposed method provides higher embedding capacity than the original reversible data hiding scheme based on histogram shifting and its improved versions in the current literature, while maintaining the quality of the marked image at an acceptable level.
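
For context, the sketch below shows basic per-block histogram-shifting embedding (the classic baseline this paper builds on). It is a minimal illustration and does not implement the paper's maximum-gap trick for eliminating the per-block overhead; the peak/zero-bin choice and the bit order are assumptions.

```python
import numpy as np

def hs_embed(block, bits):
    """Embed bits into one 8-bit grayscale block by basic histogram shifting.

    Finds the peak bin and the first empty bin to its right, shifts the
    histogram between them by one to free the bin next to the peak, then
    maps peak-valued pixels to peak (bit 0) or peak+1 (bit 1).
    Returns the marked block plus the (peak, zero) side information.
    """
    hist = np.bincount(block.ravel(), minlength=256)
    peak = int(np.argmax(hist))
    zeros = np.where(hist[peak + 1:] == 0)[0]
    if zeros.size == 0:
        raise ValueError("no empty bin to the right of the peak; skip this block")
    zero = peak + 1 + int(zeros[0])

    marked = block.copy()
    shift = (marked > peak) & (marked < zero)   # open interval (peak, zero)
    marked[shift] += 1                          # frees the bin at peak+1
    flat = marked.ravel()
    embedded = 0
    for i in range(flat.size):
        if embedded == len(bits):
            break
        if flat[i] == peak:
            flat[i] += bits[embedded]           # 0 keeps peak, 1 -> peak+1
            embedded += 1
    return marked, peak, zero, embedded

# Usage: embed three bits into a random 8x8 block.
blk = np.random.randint(0, 256, size=(8, 8), dtype=np.uint8)
print(hs_embed(blk, [1, 0, 1])[1:])
```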

A High-Quality Image Authentication Scheme for AMBTC-compressed Images

  • Lin, Chia-Chen;Huang, Yuehong;Tai, Wei-Liang
    • KSII Transactions on Internet and Information Systems (TIIS) / v.8 no.12 / pp.4588-4603 / 2014
  • In this paper, we present a high-quality image authentication scheme based on absolute moment block truncation coding (AMBTC). In the proposed scheme, we use the parity of the bitmap (BM) to generate the authentication code for each compressed image block. Data hiding is used to authenticate whether the content has been altered. For image authentication, we embed the authentication code into the quantization levels of each AMBTC-compressed image block, which will be altered when the host image is manipulated. The embedding position is generated by a pseudo-random number generator for security reasons. In addition, to improve the detection ability, we use a hierarchical structure to ensure the accuracy of tamper localization. A watermarked image can be precisely inspected for intentional or inadvertent tampering by checking the extracted watermark. Experimental results demonstrate that the proposed scheme achieves high-quality embedded images and good detection accuracy, with stable performance and high expansibility. Performance comparisons with other block-based data hiding schemes are provided to demonstrate the superiority of the proposed scheme.
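
A simplified sketch of the two building blocks described above: AMBTC compression of one block, and a one-bit authentication code derived from the bitmap parity. The PRNG-chosen embedding position and the hierarchical tamper-localization structure are omitted, and hiding the code in the LSB of the high quantization level is an assumption, not the paper's exact rule.

```python
import numpy as np

def ambtc(block):
    """AMBTC-compress one image block into (low_mean, high_mean, bitmap)."""
    mean = block.mean()
    bitmap = (block >= mean).astype(np.uint8)
    high = int(round(block[bitmap == 1].mean()))
    low = int(round(block[bitmap == 0].mean())) if (bitmap == 0).any() else high
    return low, high, bitmap

def protect(low, high, bitmap):
    """Derive a 1-bit code from the bitmap parity and hide it in the LSB
    of the high quantization level (simplified stand-in for the scheme)."""
    code = int(bitmap.sum()) & 1
    return low, (high & ~1) | code

def verify(low, high_marked, bitmap):
    """Check the stored code against the bitmap parity of the received block."""
    return (high_marked & 1) == (int(bitmap.sum()) & 1)

# Usage on one 4x4 block.
blk = np.random.randint(0, 256, size=(4, 4))
lo, hi, bm = ambtc(blk)
print(verify(*protect(lo, hi, bm)[:2], bm))   # True for an untampered block
```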

A linear systolic array based architecture for full-search block matching motion estimator (선형 시스토릭 어레이를 이용한 완전탐색 블럭정합 이동 예측기의 구조)

  • 김기현;이기철
    • The Journal of Korean Institute of Communications and Information Sciences / v.21 no.2 / pp.313-325 / 1996
  • This paper presents a new architecture for full-search block-matching motion estimation. The architecture is based on linear systolic arrays. High-speed operation is obtained by feeding reference data, search data, and control signals into the linear systolic array in a pipelined fashion. Input data are fed into the linear systolic array at half the processor speed, reducing the required data bandwidth by half. The proposed architecture scales well with respect to the number of processors and the input bandwidth when the size of the reference block and the search range change.
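
To make the underlying computation concrete, the sketch below is plain full-search block matching with the sum of absolute differences (SAD) criterion. It shows what the systolic array parallelizes, not the array architecture itself.

```python
import numpy as np

def full_search(ref_block, search_area):
    """Exhaustive block matching: return the displacement that minimizes
    the SAD between the reference block and a candidate inside the search area."""
    n = ref_block.shape[0]
    best_sad, best_mv = None, (0, 0)
    for dy in range(search_area.shape[0] - n + 1):
        for dx in range(search_area.shape[1] - n + 1):
            cand = search_area[dy:dy + n, dx:dx + n]
            sad = np.abs(ref_block.astype(int) - cand.astype(int)).sum()
            if best_sad is None or sad < best_sad:
                best_sad, best_mv = sad, (dy, dx)
    return best_mv, best_sad

# Usage: 8x8 reference block, 24x24 search window.
ref = np.random.randint(0, 256, (8, 8), dtype=np.uint8)
win = np.random.randint(0, 256, (24, 24), dtype=np.uint8)
print(full_search(ref, win))
```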

A deblocking filter for block-based compressed video sequences (블럭 기반으로 압축된 동영상을 위한 블럭화 현상 제거 기법)

  • 김성덕;이재연;라종범
    • Journal of the Korean Institute of Telematics and Electronics S / v.35S no.2 / pp.89-96 / 1998
  • Conventional block-based video coders introduce annoying blocking artifacts in very low bitrate coding. We propose a deblocking filter that is suitable for real-time operation in a conventional video decoder. The proposed algorithm applies one-dimensional filtering across block boundaries, horizontally and vertically, with two separate filtering modes. The mode decision is quite simple but is fully based on the characteristics of the human visual system and of video sequences. In flat regions, a strong smoothing filter is applied; in the other regions, a more sophisticated smoothing filter, based on the frequency information around block boundaries, is used to reduce blocking artifacts without introducing undesired blur. Even though the proposed deblocking filter is quite simple, simulation results show that it improves both subjective and objective image quality for various image features.
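
A toy illustration of the two-mode idea: filter one line of pixels across a vertical block boundary, with strong smoothing in flat regions and a milder correction elsewhere. The flatness test and the filter weights are arbitrary stand-ins for the paper's HVS-based mode decision and frequency-based filter.

```python
import numpy as np

def deblock_line(left, right, flat_threshold=2, strength=0.5):
    """Filter one horizontal line of pixels spanning a vertical block boundary.

    `left`/`right` are 1-D arrays of the last/first few pixels of the two
    adjacent blocks. Flat lines get strong smoothing; otherwise only the two
    pixels next to the boundary are pulled toward their average.
    """
    line = np.concatenate([left, right]).astype(float)
    if line.max() - line.min() <= flat_threshold:
        return np.full_like(line, line.mean())      # strong mode: flatten the step
    b = len(left)
    avg = (line[b - 1] + line[b]) / 2
    line[b - 1] += strength * (avg - line[b - 1])   # default mode: soften the edge
    line[b] += strength * (avg - line[b])
    return line

# Usage: a visible step of 8 gray levels at the boundary.
print(deblock_line(np.array([100, 101, 102, 103]), np.array([111, 112, 113, 114])))
```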

Synthesizing Intermediate Images Using Stereoscopic Images

  • Kwak, Ji-Hyun;Komar, V.S.V.;Kim, Kyung-Tae
    • Journal of the Optical Society of Korea / v.6 no.4 / pp.143-149 / 2002
  • In this paper, we present an algorithm for synthesizing intermediate views from a stereoscopic pair of images. Synthesis of intermediate images allows one to realize a more comfortable 3D display system. The proposed method is based on block matching, but not in its ordinary form: contour information is used for the block decision. In order to find an equivalent (or corresponding) block, there are two steps: "matching of contour-to-original image" and "matching of contour-to-contour image". "Matching of contour-to-contour image" uses both the left and right contour images. This block matching method allows us to find the corresponding block in spite of different block sizes. Experimental results illustrate the performance of the proposed technique, and we obtained a high-quality image of more than 31 dB PSNR.
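
As a rough sketch of the final synthesis step only, the code below forward-warps the left image toward the midpoint view given a per-pixel horizontal disparity map (for example, one obtained from the contour-guided block matching described above). The warping direction, the hole handling, and the disparity source are assumptions, not the paper's procedure.

```python
import numpy as np

def synthesize_intermediate(left_img, disparity, alpha=0.5):
    """Forward-warp the left view toward an intermediate viewpoint.

    `disparity` is the per-pixel horizontal shift between the left and right
    views; alpha=0.5 gives the halfway view. Unfilled pixels stay zero.
    """
    h, w = left_img.shape
    out = np.zeros_like(left_img)
    for y in range(h):
        for x in range(w):
            xi = int(round(x - alpha * disparity[y, x]))
            if 0 <= xi < w:
                out[y, xi] = left_img[y, x]
    return out
```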

A Semi-fragile Image Watermarking Scheme Exploiting BTC Quantization Data

  • Zhao, Dongning;Xie, Weixin
    • KSII Transactions on Internet and Information Systems (TIIS) / v.8 no.4 / pp.1499-1513 / 2014
  • This paper proposes a novel blind image watermarking scheme exploiting Block Truncation Coding (BTC). Most existing BTC-based watermarking or data hiding methods embed information in BTC-compressed images by modifying the BTC encoding stage or the BTC-compressed data, resulting in watermarked images of poor quality. Unlike existing BTC-based watermarking schemes, our scheme does not actually perform BTC compression on images during the embedding process but uses the parity of the BTC quantization data to guide the watermark embedding and extraction processes. In our scheme, we use a binary image as the original watermark. During the embedding process, the original cover image is first partitioned into non-overlapping $4{\times}4$ blocks. Then, BTC is performed on each block to obtain its BTC-quantized high mean and low mean. According to the parity of the high mean and the parity of the low mean, two watermark bits are embedded in each block by modifying the pixel values in the block so that the parity of the high mean and the parity of the low mean in the modified block are equal to the two watermark bits. During the extraction process, BTC is first performed on each block to obtain its high mean and low mean. By checking the parity of the high mean and the parity of the low mean, we can extract the two watermark bits in each block. The experimental results show that the proposed watermarking method is fragile to most image processing operations and various kinds of attacks while preserving the invisibility very well; thus the proposed scheme can be used for image authentication.
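
The embedding rule is described quite concretely above; the sketch below is a simplified, iterative version of it: pixels in the high (or low) group are nudged up (or down) by one until the parities of the re-computed BTC means match the two watermark bits. The iteration strategy and its failure handling are simplifications, not the paper's exact procedure.

```python
import numpy as np

def btc_means(block):
    """High and low quantization means of one block under standard BTC."""
    m = block.mean()
    hi_mask = block >= m
    hi = int(round(block[hi_mask].mean()))
    lo = int(round(block[~hi_mask].mean())) if (~hi_mask).any() else hi
    return hi, lo, hi_mask

def embed_two_bits(block, bit_hi, bit_lo, max_iter=8):
    """Adjust pixel values until parity(high mean) == bit_hi and
    parity(low mean) == bit_lo, as verified by re-running BTC."""
    b = block.astype(int).copy()
    for _ in range(max_iter):
        hi, lo, hi_mask = btc_means(b)
        if (hi & 1) == bit_hi and (lo & 1) == bit_lo:
            return np.clip(b, 0, 255).astype(np.uint8)
        if (hi & 1) != bit_hi:
            b[hi_mask] += 1          # raise the high mean by one, flipping its parity
        if (lo & 1) != bit_lo:
            b[~hi_mask] -= 1         # lower the low mean by one, flipping its parity
    raise RuntimeError("block could not be adjusted within the iteration limit")

def extract_two_bits(block):
    """Read the two watermark bits back from the BTC mean parities."""
    hi, lo, _ = btc_means(block.astype(int))
    return hi & 1, lo & 1
```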

A Garbage Collection Method for Flash Memory Based on Block-level Buffer Management Policy

  • Li, Liangbo;Shin, Song-Sun;Li, Yan;Baek, Sung-Ha;Bae, Hae-Young
    • Journal of Korea Multimedia Society / v.12 no.12 / pp.1710-1717 / 2009
  • Flash memory has become the most important storage medium in mobile devices thanks to its attractive features such as low power consumption, small size, light weight, and shock resistance. However, flash memory cannot be written before being erased because of its erase-before-write characteristic, which leads to garbage collection when there is not enough free space. In this paper, we propose a novel garbage collection scheme, called block-level buffer garbage collection. When a merge operation is needed during garbage collection, the proposed scheme does not simply merge the data block with its corresponding log block; it also searches the block-level buffer for the corresponding block that will be written to flash memory in the near future, and then decides whether to merge it in advance or not. Our experimental results show that the proposed technique improves flash performance by up to 4.6% by reducing unnecessary block erase and page copy counts.
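
A conceptual sketch of the merge decision described above, with illustrative names only: before merging a victim data block with its log block, the collector first checks whether an updated copy of the same logical block is already waiting in the block-level write buffer, in which case the merge can be skipped.

```python
def decide_merge(victim_logical_block, block_buffer):
    """Return the GC action for one victim block.

    If the block-level buffer already holds a newer copy of the logical
    block, writing that copy directly supersedes the data/log merge and
    saves erases and page copies; otherwise fall back to a normal merge.
    (Names and return values are illustrative, not the paper's API.)
    """
    if victim_logical_block in block_buffer:
        return "write_buffered_block"
    return "merge_data_and_log_blocks"

# Usage: the collector asks what to do with logical block 42.
print(decide_merge(42, block_buffer={17, 42, 93}))   # write_buffered_block
```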
