• Title/Summary/Keyword: defects of critical size (임계 크기의 결함)


A Study on Building Identification from the Three-dimensional Point Cloud by using Monte Carlo Integration Method (몬테카를로 적분을 통한 3차원 점군의 건물 식별기법 연구)

  • YI, Chaeyeon;AN, Seung-Man
    • Journal of the Korean Association of Geographic Information Studies
    • /
    • v.23 no.4
    • /
    • pp.16-41
    • /
    • 2020
  • Geospatial input settings that represent the real spatial distribution or quantitative properties within a model have become a major interest in earth system simulation. Many studies have shown that varying the grid resolution can drastically change spatial model results because of insufficient surface property estimation. Hence, in this paper, the authors propose Monte Carlo Integration (MCI) to apply spatial probability (SP) in a spatial-sampling framework using a three-dimensional point cloud (3DPC), so as to preserve the spatial distribution and the area/volume properties of buildings in urban areas. Building identification results based on three different decision rules were compared: SP threshold, cell size, and 3DPC density. Results show that the identified building area tends to increase as the spatial sampling grid is enlarged. Hence, controlling the areal building property in the sampling framework through decision rules is strongly recommended to increase the reliability of geospatial modeling and analysis results. The proposed method supports modeling needs by preserving quantitative building properties in both finer and coarser grids.
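The cell-level spatial probability behind the SP-threshold decision rule can be illustrated with a minimal Monte Carlo integration sketch. This is an assumption-laden toy (a rectangular footprint stands in for the 3DPC-derived building surface, and the sample count and 0.5 threshold are illustrative, not the paper's settings):

```python
import random

def mc_building_fraction(cell, building, n_samples=10_000, seed=42):
    """Estimate the fraction of a grid cell covered by a building footprint
    via Monte Carlo integration. `cell` and `building` are axis-aligned
    rectangles (xmin, ymin, xmax, ymax); a hypothetical stand-in for the
    paper's point-cloud-based spatial probability."""
    rng = random.Random(seed)
    cx0, cy0, cx1, cy1 = cell
    bx0, by0, bx1, by1 = building
    hits = 0
    for _ in range(n_samples):
        x = rng.uniform(cx0, cx1)
        y = rng.uniform(cy0, cy1)
        if bx0 <= x <= bx1 and by0 <= y <= by1:
            hits += 1
    return hits / n_samples

# Decision rule: label the cell "building" when the estimated spatial
# probability exceeds a chosen SP threshold.
sp = mc_building_fraction((0, 0, 10, 10), (0, 0, 6, 10))
is_building = sp >= 0.5
```

Coarsening the grid (enlarging `cell` while keeping `building` fixed) dilutes the fraction, which is one way to see why the decision rule must be tuned per cell size.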

Improvement in Inefficient Repetition of Gauss Sieve (Gauss Sieve 반복 동작에서의 비효율성 개선)

  • Byeongho Cheon;Changwon Lee;Chanho Jeon;Seokhie Hong;Suhri Kim
    • Journal of the Korea Institute of Information Security & Cryptology
    • /
    • v.33 no.2
    • /
    • pp.223-233
    • /
    • 2023
  • Gauss Sieve is an algorithm for solving the SVP and requires exponential time and space complexity. The termination condition of the sieve is determined by the size of the constructed list and the number of collisions, which is related to the space complexity. The term 'collision' refers to the state in which a sampled vector is reduced to a vector that is already in the list; if collisions occur more than a certain number of times, the algorithm terminates. When executing previous algorithms, we noticed that unnecessary operations continued even after the shortest vector was found, which means the existing termination condition is set larger than necessary. In this paper, after identifying the point at which unnecessary operations are repeated, we optimize the number of required operations. Tests are conducted by adjusting the collision threshold that serves as the termination condition and the distribution from which the sample vectors are generated. According to the experiments, the operation that occupies the largest proportion decreased by 62.6%, and the space and time complexity decreased by 4.3% and 1.6%, respectively.
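The collision-based termination described above can be sketched with a toy two-dimensional sieve. This is a simplified illustration, not the paper's implementation: real Gauss Sieve also reduces the list against each new vector, and the sampling range, collision threshold, and basis here are illustrative.

```python
import random

def norm2(v):
    return sum(x * x for x in v)

def reduce_pair(u, v):
    """Lagrange-style reduction: subtract integer multiples of v from u
    while that shortens u."""
    while True:
        d = norm2(v)
        if d == 0:
            return u
        m = round(sum(a * b for a, b in zip(u, v)) / d)
        if m == 0:
            return u
        u = tuple(a - m * b for a, b in zip(u, v))

def gauss_sieve(basis, max_collisions=30, seed=1):
    """Toy sieve: sample lattice vectors, reduce each against the list,
    and stop once `max_collisions` samples collapse to a known vector
    (the termination condition discussed above)."""
    rng = random.Random(seed)
    L, collisions = [], 0
    while collisions < max_collisions:
        # Sample a random integer combination of the basis vectors.
        coeffs = [rng.randint(-5, 5) for _ in basis]
        v = tuple(sum(c * b[i] for c, b in zip(coeffs, basis))
                  for i in range(len(basis)))
        for w in L:
            v = reduce_pair(v, w)
        if norm2(v) == 0 or v in L or tuple(-x for x in v) in L:
            collisions += 1   # sampled vector reduced to a known one
        else:
            L.append(v)
    return min(L, key=norm2)
```

Raising `max_collisions` past the point where the shortest vector is already in the list only adds the kind of unnecessary repetition the paper removes.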

Experimental study on release of plastic particles from coastal sediments to fluid body (해안 유사에서 수체로의 플라스틱 입자 방출에 관한 실험적 연구)

  • Hwang, Dongwook;Park, Yong Sung
    • Journal of Korea Water Resources Association
    • /
    • v.56 no.2
    • /
    • pp.125-137
    • /
    • 2023
  • In marine environments, plastics have become more abundant due to increasing plastic use. In coastal regions especially, particles may remain for a long time and interact with flows, wind, sand, and human activities. This study thus aimed to observe how plastic debris interacts with and escapes from sediments. A series of experiments was conducted to gain a better understanding of particle release from coastal sediments into the water body. An oscillating water tunnel was built for the experiments and used to generate oscillatory flows of relatively high Reynolds number and to induce sediment transport. Spherical plastic particles of three different sizes were used in lieu of environmental plastic debris. It was observed that release of the particles is directly related to changes of the bedform, which is in turn determined by the flow condition; smaller particles also tend to escape the sediment more readily. Critical values for the relevant dimensionless parameters are proposed.

A Study on the Length of Deceleration Lane at Freeway Diverging Areas (고속도로 분기부에서의 감속차로 길이에 관한 연구)

  • Kim, Dong Nyong
    • KSCE Journal of Civil and Environmental Engineering Research
    • /
    • v.29 no.2D
    • /
    • pp.227-234
    • /
    • 2009
  • At present, the length of the deceleration lane at freeway diverging areas is designed based on the design speeds of the main line and the ramp. This is valid on the assumption that diverging vehicles decelerate in the deceleration section after moving to the shoulder lane in advance. With a high diverging volume, however, several vehicles will try to change to the exit lane at the same time. This can disturb main-lane flows, and some vehicles may face an insufficient deceleration length because they miss the proper time to change lanes. The purpose of this study is to establish a design guideline for the length of the deceleration section that considers the diverging traffic volume. Analysis with the FRESIM simulation model also shows improvements in the delays, speeds, and speed deviations of the mainline and the deceleration lane.
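The design-speed-based sizing that the paper takes as its starting point reduces to constant-deceleration kinematics. A minimal sketch (the 2.0 m/s² comfortable deceleration rate is a textbook assumption, not a value from this study, which adjusts the length for diverging volume):

```python
def decel_length_m(v_main_kph, v_ramp_kph, decel_mps2=2.0):
    """Deceleration length (m) needed to slow from the main-line design
    speed to the ramp design speed at a constant rate; a first
    approximation, not the volume-aware guideline derived in the paper."""
    v1 = v_main_kph / 3.6   # km/h -> m/s
    v2 = v_ramp_kph / 3.6
    return (v1 ** 2 - v2 ** 2) / (2.0 * decel_mps2)

# e.g. 100 km/h mainline, 50 km/h ramp -> roughly 145 m
length = decel_length_m(100, 50)
```

A vehicle that misses its lane-change window effectively enters this formula with a shorter available length, which is the failure mode the study addresses.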

Water body extraction using block-based image partitioning and extension of water body boundaries (블록 기반의 영상 분할과 수계 경계의 확장을 이용한 수계 검출)

  • Ye, Chul-Soo
    • Korean Journal of Remote Sensing
    • /
    • v.32 no.5
    • /
    • pp.471-482
    • /
    • 2016
  • This paper presents a water body extraction method that uses block-based image partitioning and extension of water body boundaries to improve the performance of supervised classification for water body extraction. A Mahalanobis distance image is created by computing the spectral information of the Normalized Difference Water Index (NDWI) and Near Infrared (NIR) band images over a training site within the water body in order to extract an initial water body area. To reduce the effect of noise in the Mahalanobis distance image, we apply mean curvature diffusion, which controls the diffusion coefficients based on the connectivity strength between adjacent pixels, and then extract the initial water body area. After partitioning the extracted water body image into non-overlapping blocks of equal size, we update the water body area using the information of the water body belonging to the water body boundaries. The update is repeated as long as the statistical distance between the water body area along the boundaries and the training site does not exceed a threshold value. The accuracy of the proposed algorithm was assessed using KOMPSAT-2 images for block sizes between 11×11 and 19×19. The overall accuracy and Kappa coefficient varied from 99.47% to 99.53% and from 95.07% to 95.80%, respectively.
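The per-pixel distance underlying the initial water mask can be sketched as a two-band Mahalanobis distance from training statistics. A minimal pure-Python version (the band values and any threshold are illustrative; the paper additionally smooths the distance image with mean curvature diffusion before thresholding):

```python
def mahalanobis_2d(samples, pixel):
    """Mahalanobis distance of a (NDWI, NIR) pixel from the mean and
    covariance of water training samples; a 2-band sketch of the
    distance image described above."""
    n = len(samples)
    mx = sum(s[0] for s in samples) / n
    my = sum(s[1] for s in samples) / n
    sxx = sum((s[0] - mx) ** 2 for s in samples) / (n - 1)
    syy = sum((s[1] - my) ** 2 for s in samples) / (n - 1)
    sxy = sum((s[0] - mx) * (s[1] - my) for s in samples) / (n - 1)
    det = sxx * syy - sxy * sxy
    dx, dy = pixel[0] - mx, pixel[1] - my
    # quadratic form with the inverse 2x2 covariance, expanded directly
    d2 = (syy * dx * dx - 2 * sxy * dx * dy + sxx * dy * dy) / det
    return d2 ** 0.5

train = [(0.6, 0.1), (0.8, 0.1), (0.6, 0.2), (0.8, 0.2)]  # illustrative
d = mahalanobis_2d(train, (0.8, 0.15))
```

Pixels with small distance are labeled water; the block-wise boundary extension then revisits this decision near the mask edges.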

Polymeric Micelle Using Poly((R)-3-hydroxybutyric acid)/Poly(ethylene glycol) Amphiphilic Block Copolymer for Drug Delivery System (Poly((R)-3-hydroxybutyric acid)/Poly(ethylene glycol) 양친성 블록 공중합체를 이용한 약물전달체용 고분자 미셀)

  • Jeong, Kwan-Ho;Kim, Young-Jin
    • Polymer(Korea)
    • /
    • v.30 no.6
    • /
    • pp.512-518
    • /
    • 2006
  • The biodegradable polymer poly((R)-3-hydroxybutyric acid) (PHB) was conjugated with the hydrophilic polymer poly(ethylene glycol) (PEG) by a transesterification reaction to form an amphiphilic block copolymer. PHB of low molecular weight (3000~30000) was appropriate for drug delivery materials, so high molecular weight PHB was hydrolyzed with an acid catalyst to produce the low molecular weight polymer. The amphiphilic block copolymer formed a self-assembled polymeric micelle system in aqueous solution, in which the hydrophilic PEG wraps the hydrophobic PHB. Polymeric micelles generally form small particles between 10~200 nm. Such micelle systems have been widely used for drug delivery because they are biodegradable, biocompatible, non-toxic, and patient compliant. The hydroxyl group of PEG was substituted with a carboxyl group, which is reactive toward the ester group of PHB. The amphiphilic block copolymer was conjugated from PHB and the modified PEG at 176 °C, above the melting point of PHB. The transesterification reaction was verified with DSC, FTIR, and ¹H-NMR. In aqueous solution, the critical micelle concentration (CMC) of the mPEG-co-PHB copolymer, measured by fluorescence scanning spectrometry, was 5×10⁻⁵ g/L. The shape and size of the nanoparticles were determined by dynamic light scattering and atomic force microscopy: the nanoparticles were spherical and about 130 nm in size. This polymeric micelle system can be used as a passive targeting drug delivery system.

Implementation of High-Throughput SHA-1 Hash Algorithm using Multiple Unfolding Technique (다중 언폴딩 기법을 이용한 SHA-1 해쉬 알고리즘 고속 구현)

  • Lee, Eun-Hee;Lee, Je-Hoon;Jang, Young-Jo;Cho, Kyoung-Rok
    • Journal of the Institute of Electronics Engineers of Korea SD
    • /
    • v.47 no.4
    • /
    • pp.41-49
    • /
    • 2010
  • This paper proposes a new high-speed SHA-1 architecture using multiple unfolding and pre-computation techniques. We unfold the iterative hash operation into two consecutive hash stages and reschedule the computation timing: part of the critical path is computed in the previous hash round and the rest is performed in the present round. These techniques reduce the critical path from 3 additions to 2. The result is a maximum clock frequency of 118 MHz, providing a throughput of 5.9 Gbps. The proposed architecture shows 26% higher throughput with a 32% smaller hardware size compared to its counterparts. This paper also introduces an analytical model of a multiple-SHA-1 architecture at the system level that maps large input data onto SHA-1 blocks in parallel. The model gives the number of SHA-1 blocks required for large multimedia data processing, which helps in deciding the hardware configuration. The high-speed SHA-1 is useful for generating condensed messages and may strengthen the security of mobile communication and internet services.
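The 2x unfolding can be mirrored in software: each iteration of the compression loop performs two SHA-1 rounds, as the two-stage datapath does per clock. This sketch models only the scheduling, not the gate-level pre-computation or the adder rebalancing on the critical path:

```python
import hashlib
import struct

def _rotl(x, n):
    return ((x << n) | (x >> (32 - n))) & 0xFFFFFFFF

def sha1_unfold2(data: bytes) -> bytes:
    """SHA-1 with the 80-round compression loop unfolded by a factor of
    two (two rounds per iteration), matching hashlib's output."""
    h = [0x67452301, 0xEFCDAB89, 0x98BADCFE, 0x10325476, 0xC3D2E1F0]
    ml = len(data) * 8                       # standard SHA-1 padding
    data += b"\x80" + b"\x00" * ((56 - (len(data) + 1) % 64) % 64)
    data += struct.pack(">Q", ml)
    for off in range(0, len(data), 64):
        w = list(struct.unpack(">16I", data[off:off + 64]))
        for t in range(16, 80):
            w.append(_rotl(w[t-3] ^ w[t-8] ^ w[t-14] ^ w[t-16], 1))
        a, b, c, d, e = h
        for t in range(0, 80, 2):            # two rounds per iteration
            for u in (t, t + 1):
                if u < 20:
                    f, k = (b & c) | (~b & d), 0x5A827999
                elif u < 40:
                    f, k = b ^ c ^ d, 0x6ED9EBA1
                elif u < 60:
                    f, k = (b & c) | (b & d) | (c & d), 0x8F1BBCDC
                else:
                    f, k = b ^ c ^ d, 0xCA62C1D6
                a, b, c, d, e = ((_rotl(a, 5) + f + e + k + w[u]) & 0xFFFFFFFF,
                                 a, _rotl(b, 30), c, d)
        h = [(x + y) & 0xFFFFFFFF for x, y in zip(h, (a, b, c, d, e))]
    return b"".join(struct.pack(">I", x) for x in h)
```

In hardware the gain comes from retiming across the two fused rounds; in software the unrolling is functionally transparent, which is exactly what makes it checkable against `hashlib`.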

Intermediate Principal Stress Dependency in Strength of Transversely Isotropic Mohr-Coulomb Rock (평면이방성 Mohr-Coulomb 암석 강도의 중간주응력 의존성)

  • Lee, Youn-Kyou
    • Tunnel and Underground Space
    • /
    • v.23 no.5
    • /
    • pp.383-391
    • /
    • 2013
  • A number of true triaxial tests on rock samples have been conducted since the late 1960s, and their results strongly suggest that the intermediate principal stress has a considerable effect on rock strength. Based on this experimental evidence, various 3-D rock failure criteria accounting for the effect of the intermediate principal stress have been proposed. Most 3-D failure criteria, however, focus on a phenomenological description of the rock strength measured in true triaxial tests, so the associated strength parameters have little physical meaning. To examine the possibility that the intermediate principal stress dependency of rock strength is related to the presence of weak planes distributed around a preferred orientation, true triaxial tests are simulated with a transversely isotropic rock model. The conventional Mohr-Coulomb criterion is extended to an anisotropic version by incorporating the concept of a microstructure tensor. With this anisotropic Mohr-Coulomb criterion, the critical plane approach is applied to calculate the strength of the transversely isotropic rock model and the orientation of the fracture plane. The investigation suggests that the spatial distribution of microstructural planes with respect to the principal stress triad is closely related to the intermediate principal stress dependency of rock strength.
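The critical plane approach can be sketched as a scan over candidate plane normals in the principal stress frame, with the cohesion weakened as a plane aligns with the bedding orientation. The anisotropy model and all parameter values below are illustrative assumptions, not the paper's microstructure-tensor formulation:

```python
import math

def critical_plane_margin(sig, bedding_normal, c_iso=10.0, c_weak=4.0,
                          phi_deg=30.0, n_steps=60):
    """Scan plane normals, compute normal/shear stress on each plane from
    the principal stresses `sig`, and return the worst Mohr-Coulomb margin
    (tau - (c + sn*tan(phi)); >= 0 means failure) and its normal.
    Cohesion drops from c_iso to c_weak as the plane aligns with the weak
    bedding plane."""
    tan_phi = math.tan(math.radians(phi_deg))
    worst, worst_n = -1e18, None
    for i in range(n_steps):
        for j in range(n_steps):
            th = math.pi * i / n_steps            # polar angle
            ph = 2 * math.pi * j / n_steps        # azimuth
            n = (math.sin(th) * math.cos(ph),
                 math.sin(th) * math.sin(ph),
                 math.cos(th))
            t = [s * x for s, x in zip(sig, n)]   # traction, principal frame
            sn = sum(a * b for a, b in zip(t, n)) # normal stress
            tau = math.sqrt(max(sum(a * a for a in t) - sn * sn, 0.0))
            # cohesion weakens toward the bedding orientation
            align = abs(sum(a * b for a, b in zip(n, bedding_normal)))
            c = c_weak + (c_iso - c_weak) * (1.0 - align)
            margin = tau - (c + sn * tan_phi)
            if margin > worst:
                worst, worst_n = margin, n
    return worst, worst_n
```

With isotropic cohesion (`c_iso == c_weak`) the worst plane contains the intermediate principal stress axis, so sigma-2 drops out, as in classical Mohr-Coulomb; making the cohesion orientation-dependent is what lets sigma-2 re-enter.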

Performance Analysis of Handoff Channel Assignment Scheme in CDMA Cellular System (CDMA 셀룰러시스템에서의 핸드오프 채널할당기법 성능분석)

  • Lee, Dong-Myung;Lee, Chul-Hee
    • Journal of the Korean Institute of Telematics and Electronics S
    • /
    • v.36S no.6
    • /
    • pp.17-29
    • /
    • 1999
  • In this paper, a prioritized queueing handoff scheme for the CDMA (Code Division Multiple Access) cellular system is proposed. An analytical survey of the proposed scheme is carried out, and its performance is compared with that of a non-prioritized scheme and a FIFO (First In First Out) queue scheme by computer simulation. The handoff region is defined as the time between the handoff threshold and the receiver threshold, and it is used as the maximum queue waiting time in the proposed scheme. The handoff and receiver thresholds are defined, respectively, as: 1) the time at which the Pilot Strength Measurement Message for the neighbor cell, triggered by the T_ADD threshold, is received at the BS (Base Station); and 2) the time at which the T_DROP timer expires and the Pilot Strength Measurement Message for the current cell, triggered by the T_DROP threshold, is received at the BS. The performance metrics for analyzing the proposed scheme are: 1) probability of forced termination; 2) probability of call blocking; 3) ratio of carried traffic to total offered load; 4) average queue size; and 5) average handoff delay time in queue. The simulation results show that the proposed scheme maintains high performance for handoff requests at a small penalty in total system capacity.


A Robust Pattern Watermarking Method by Invisibility and Similarity Improvement (비가시성과 유사도 증가를 통한 강인한 패턴 워터마킹 방법)

  • 이경훈;김용훈;이태홍
    • Journal of KIISE:Software and Applications
    • /
    • v.30 no.10
    • /
    • pp.938-943
    • /
    • 2003
  • In this paper, we propose a method using the Tikhonov-Miller process to improve the robustness of watermarking under various attacks. A visually recognizable pattern watermark is embedded in the LH2, HL2, and HH2 subbands of the wavelet transform domain using a threshold, and the embedding additionally exploits HVS (Human Visual System) features. The pattern watermark is interlaced after a random permutation for security and a better extraction rate. To demonstrate the improved robustness and similarity of the proposed method, we applied basic image processing attacks such as scaling, filtering, cropping, histogram equalization, and lossy compression (JPEG, GIF). The experimental results show that the proposed method embeds a robust watermark invisibly and extracts it with excellent normalized correlation under various attacks.
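The permute-then-embed idea can be sketched at its smallest scale: a one-level Haar transform per 2×2 block, the bit pattern shuffled with a seeded RNG (the "interlacing" step), and ±alpha added to each HH coefficient. The subband choice, alpha, and block size here are illustrative; the paper works on level-2 subbands of a full wavelet decomposition with HVS-based thresholds:

```python
import random

def haar2d(block):
    """One-level 2D Haar transform of a 2x2 block -> (LL, LH, HL, HH)."""
    a, b, c, d = block
    return ((a+b+c+d)/4, (a-b+c-d)/4, (a+b-c-d)/4, (a-b-c+d)/4)

def ihaar2d(ll, lh, hl, hh):
    """Inverse of haar2d."""
    return (ll+lh+hl+hh, ll-lh+hl-hh, ll+lh-hl-hh, ll-lh-hl+hh)

def embed_extract(blocks, bits, alpha=4.0, seed=7):
    """Permute the watermark bits with a seeded RNG, add +/-alpha to each
    block's HH coefficient, then extract by the coefficient's sign and
    undo the permutation. Returns (marked_blocks, extracted_bits)."""
    order = list(range(len(bits)))
    random.Random(seed).shuffle(order)       # keyed interlacing
    marked = []
    for blk, idx in zip(blocks, order):
        ll, lh, hl, hh = haar2d(blk)
        hh += alpha if bits[idx] else -alpha
        marked.append(ihaar2d(ll, lh, hl, hh))
    extracted = [0] * len(bits)
    for blk, idx in zip(marked, order):
        extracted[idx] = 1 if haar2d(blk)[3] > 0 else 0
    return marked, extracted
```

Because the permutation is keyed by the seed, an attacker without it sees only scattered coefficient perturbations, while the legitimate extractor reassembles the pattern and can score it with normalized correlation.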