• Title/Summary/Keyword: 성능감소 (performance degradation)

Search Results: 7,586

An Analytical Study on the Seismic Behavior and Safety of Vertical Hydrogen Storage Vessels Under the Earthquakes (지진 시 수직형 수소 저장용기의 거동 특성 분석 및 안전성에 관한 해석적 연구)

  • Sang-Moon Lee;Young-Jun Bae;Woo-Young Jung
    • Journal of the Korea institute for structural maintenance and inspection
    • /
    • v.27 no.6
    • /
    • pp.152-161
    • /
    • 2023
  • In general, large-capacity hydrogen storage vessels, typically vertical cylindrical vessels, are constructed from steel. These vessels are anchored to foundation slabs that are specially designed to suit the environmental conditions, using anchors pre-installed on top of the concrete foundation slab. Such a design, however, can concentrate stresses at the anchoring points when external forces such as seismic events act on the vessel, which may lead to structural damage through failure of the anchors and the concrete. For this reason, this study selected a vertical hydrogen storage vessel based on site observations and created a 3D finite element model. Artificial seismic motions generated following the procedures specified in ICC-ES AC 156, as well as domestic recorded earthquakes with magnitudes greater than 5.0, were applied to analyze the structural behavior and performance of the target structure. Conducting experiments on a full-scale structure would be ideal, but practical constraints made this difficult to execute, so an analytical approach was adopted to assess the safety of the target structure. Regarding the structural response characteristics, the acceleration induced by the seismic motion was amplified by approximately ten times relative to the input motions, and the amplification tended to decrease as the response acceleration was transmitted toward the point where the center of gravity is located. For the vulnerable components, specifically the sub-system (support columns and anchorages), the stress levels satisfied the allowable stress criteria. However, the concrete's tensile strength exhibited only about a 5% margin of safety relative to the allowable stress, indicating the need for mitigation strategies. Based on the findings presented in this paper, it is anticipated that predictable load information for the design of storage vessels required for future shaking table tests will be provided.
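
As a small illustration of the two quantities the abstract reports, the sketch below computes a response-amplification factor and an allowable-stress margin; the function names and the numbers are illustrative assumptions, not values from the analysis.

```python
# Minimal sketch (hypothetical values): response amplification and allowable-stress
# margin checks of the kind described in the abstract. Numbers are illustrative only.

def amplification_factor(peak_response_accel, peak_input_accel):
    """Ratio of the response acceleration at a point on the vessel to the input motion."""
    return peak_response_accel / peak_input_accel

def safety_margin(allowable_stress, demand_stress):
    """Fractional margin of the allowable stress over the computed demand."""
    return (allowable_stress - demand_stress) / allowable_stress

# Illustrative values, not results from the paper
print(amplification_factor(9.8, 0.98))    # ~10x amplification near the vessel top
print(f"{safety_margin(2.0, 1.9):.0%}")   # ~5% margin, as for the concrete tensile check
```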

Development of Seasonal Habitat Suitability Indices for the Todarodes Pacificus around South Korea Based on GOCI Data (GOCI 자료를 활용한 한국 연근해 살오징어의 계절별 서식적합지수 모델 개발)

  • Seonju Lee;Jong-Kuk Choi;Myung-Sook Park;Sang Woo Kim
    • Korean Journal of Remote Sensing
    • /
    • v.39 no.6_2
    • /
    • pp.1635-1650
    • /
    • 2023
  • Under global warming, the steadily increasing sea surface temperature (SST) severely impacts marine ecosystems, for example through decreased productivity and changes in the distribution of marine species. Recently, the catch of Todarodes pacificus, one of South Korea's primary marine resources, has decreased dramatically. In this study, we analyze the marine environment that affects the formation of Todarodes pacificus fishing grounds and develop seasonal habitat suitability index (HSI) models based on various satellite data, including Geostationary Ocean Color Imager (GOCI) data, to support continuous management of fisheries resources over the Korean exclusive economic zone. About 83% of catches are found within an SST range of 14.11-26.16℃, a sea level height of 0.56-0.82 m, a chlorophyll-a concentration of 0.31-1.52 mg m⁻³, and a primary production of 580.96-1574.13 mg C m⁻² day⁻¹. The seasonal HSI models are developed using the Arithmetic Mean Model, which showed the best performance. Comparing the developed HSI values with the 2019 catch data confirms that the HSI model is valid: fishing grounds form in different seas by season (East Sea in winter and Yellow Sea in summer), and high HSI (> 0.6) coincides with areas of high catch. In addition, we identified a significant increasing trend in SST over the study regions, which is closely related to the formation of Todarodes pacificus fishing grounds, so the fishing grounds can be expected to shift as ocean warming accelerates. Continuous HSI monitoring is therefore necessary to manage the spatial and temporal distribution of the fishery.
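
A minimal sketch of an Arithmetic Mean Model HSI of the kind described above, assuming each environmental variable has already been reduced to a suitability index (SI) in [0, 1]; the triangular SI function and the sample values are illustrative assumptions, not the paper's fitted curves.

```python
import numpy as np

def si_from_range(value, lo, hi):
    """Crude triangular suitability: 1 at the range midpoint, 0 outside the range."""
    mid = (lo + hi) / 2.0
    half = (hi - lo) / 2.0
    return float(np.clip(1.0 - abs(value - mid) / half, 0.0, 1.0))

def hsi_arithmetic_mean(si_values):
    """Arithmetic Mean Model: HSI = mean of the individual suitability indices."""
    return float(np.mean(si_values))

# Example using the preferred ranges reported in the abstract; input values are made up
si = [
    si_from_range(20.0, 14.11, 26.16),       # SST (degrees C)
    si_from_range(0.70, 0.56, 0.82),         # sea level height (m)
    si_from_range(0.9, 0.31, 1.52),          # chlorophyll-a (mg m^-3)
    si_from_range(1000.0, 580.96, 1574.13),  # primary production (mg C m^-2 day^-1)
]
print(hsi_arithmetic_mean(si))  # > 0.6 would indicate a likely fishing ground
```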

Comparative study of flood detection methodologies using Sentinel-1 satellite imagery (Sentinel-1 위성 영상을 활용한 침수 탐지 기법 방법론 비교 연구)

  • Lee, Sungwoo;Kim, Wanyub;Lee, Seulchan;Jeong, Hagyu;Park, Jongsoo;Choi, Minha
    • Journal of Korea Water Resources Association
    • /
    • v.57 no.3
    • /
    • pp.181-193
    • /
    • 2024
  • The growing atmospheric imbalance caused by climate change increases precipitation and, with it, the frequency of flooding, so there is a growing need for technology to detect and monitor these events. To minimize flood damage, continuous monitoring is essential, and flood areas can be detected from Synthetic Aperture Radar (SAR) imagery, which is not affected by weather conditions. The observed data undergo a preprocessing step in which a median filter is applied to reduce noise. Classification techniques were then employed to separate water bodies from non-water bodies, with the aim of evaluating the effectiveness of each method for flood detection. In this study, the Otsu method and the Support Vector Machine (SVM) technique were used for this classification, and the overall performance of the models was assessed using a confusion matrix. The suitability of each method for flood detection was evaluated by comparing the Otsu method, an optimal threshold-based classifier, with SVM, a machine learning technique that minimizes misclassifications through training. The Otsu method delineated boundaries between water and non-water bodies well but exhibited a higher rate of misclassification due to the influence of mixed substances. Conversely, SVM produced a lower false positive rate and was less sensitive to mixed substances, so it achieved higher accuracy under non-flood conditions. While the Otsu method showed slightly higher accuracy than SVM under flood conditions, the difference was less than 5% (Otsu: 0.93, SVM: 0.90); in pre-flood and post-flood conditions, however, the accuracy difference exceeded 15% (Otsu: 0.77, SVM: 0.92), indicating that SVM is more suitable for water body and flood detection. Based on these findings, more accurate detection of water bodies and floods is expected to contribute to minimizing flood-related damages and losses.
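
A minimal sketch of the two classifiers being compared, run on synthetic SAR-like backscatter rather than real Sentinel-1 data; the intensity distributions, the 3x3 median filter size, and the RBF-kernel SVM settings are assumptions for illustration, not the paper's processing chain.

```python
import numpy as np
from scipy.ndimage import median_filter
from skimage.filters import threshold_otsu
from sklearn.svm import SVC
from sklearn.metrics import confusion_matrix

# Synthetic backscatter (dB): water is dark, land is brighter. Real Sentinel-1
# preprocessing (calibration, terrain correction) is omitted in this sketch.
rng = np.random.default_rng(0)
water = rng.normal(-20.0, 1.5, (64, 64))
land = rng.normal(-8.0, 2.0, (64, 64))
image = np.hstack([water, land])
labels = np.hstack([np.ones((64, 64)), np.zeros((64, 64))])  # 1 = water

image = median_filter(image, size=3)          # noise reduction, as in the paper

# Otsu: global optimal threshold between the two intensity modes
t = threshold_otsu(image)
otsu_pred = (image < t).astype(int)

# SVM: supervised classification on pixel intensity (subsampled for speed)
x = image.reshape(-1, 1)
y = labels.ravel()
svm = SVC(kernel="rbf").fit(x[::10], y[::10])
svm_pred = svm.predict(x)

print(confusion_matrix(y, otsu_pred.ravel()))
print(confusion_matrix(y, svm_pred))
```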

A Comparative Study of Vegetation Phenology Using High-resolution Sentinel-2 Imagery and Topographically Corrected Vegetation Index (고해상도 Sentinel-2 위성 자료와 지형효과를 고려한 식생지수 기반의 산림 식생 생장패턴 비교)

  • Seungheon Yoo;Sungchan Jeong
    • Korean Journal of Agricultural and Forest Meteorology
    • /
    • v.26 no.2
    • /
    • pp.89-102
    • /
    • 2024
  • Land Surface Phenology (LSP) plays a crucial role in understanding vegetation dynamics. The near-infrared reflectance of vegetation (NIRv) has been increasingly adopted in LSP studies and is recognized as a robust proxy for gross primary production (GPP). However, NIRv is sensitive to terrain effects in mountainous areas because artifacts in NIR reflectance cannot be canceled out. As a result, estimating phenological metrics in mountainous regions carries substantial uncertainty, especially for the end of season (EOS). The topographically corrected NIRv (TCNIRv) employs the path length correction (PLC) method, derived from a simplification of the radiative transfer equation, to alleviate these terrain-related limitations. TCNIRv has been demonstrated to estimate phenology metrics more accurately than NIRv, with especially improved estimation of EOS. Because the topographic effect is strongly influenced by terrain properties such as slope and aspect, our study compared phenology metrics estimated on south-facing slopes (SFS) and north-facing slopes (NFS) using NIRv and TCNIRv in two distinct mountainous regions: Gwangneung Forest (GF) and Odaesan National Park (ONP), representing relatively flat and rugged areas, respectively. The results indicated that TCNIRv-derived EOS at NFS occurred later than at SFS for both study sites (GF: DOY 266.8/268.3 at SFS/NFS; ONP: DOY 262.0/264.8 at SFS/NFS), in contrast to the results obtained with NIRv (GF: DOY 270.3/265.5 at SFS/NFS; ONP: DOY 265.0/261.8 at SFS/NFS). Additionally, the gap between SFS and NFS diminished after topographic correction. We conclude that TCNIRv differs from NIRv in EOS detection when slope orientation is considered. Our findings underscore the necessity of topographic correction, accounting for slope orientation, when estimating photosynthetic phenology, especially in diverse terrain conditions.
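
A minimal Python sketch of the index itself: NIRv as the product of NDVI and NIR reflectance, with a placeholder illumination-ratio correction standing in for the paper's path length correction (PLC). The reflectance values, angles, and the correction formula are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def nirv(nir, red):
    """NIRv = NDVI x NIR reflectance, a common proxy for GPP."""
    ndvi = (nir - red) / (nir + red)
    return ndvi * nir

def terrain_corrected_reflectance(rho, solar_zenith, slope_illumination):
    """Placeholder terrain correction: rescale reflectance by the ratio of flat-terrain
    to sloped-terrain illumination (both cosines). Not the paper's full PLC formulation."""
    return rho * np.cos(solar_zenith) / np.maximum(slope_illumination, 1e-6)

nir, red = 0.35, 0.05                  # illustrative Sentinel-2 B8 / B4 reflectances
print(nirv(nir, red))                  # NIRv on uncorrected reflectance
nir_c = terrain_corrected_reflectance(nir, np.deg2rad(40), np.cos(np.deg2rad(55)))
red_c = terrain_corrected_reflectance(red, np.deg2rad(40), np.cos(np.deg2rad(55)))
print(nirv(nir_c, red_c))              # TCNIRv-style value under the placeholder correction
```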

Clustering Method based on Genre Interest for Cold-Start Problem in Movie Recommendation (영화 추천 시스템의 초기 사용자 문제를 위한 장르 선호 기반의 클러스터링 기법)

  • You, Tithrottanak;Rosli, Ahmad Nurzid;Ha, Inay;Jo, Geun-Sik
    • Journal of Intelligence and Information Systems
    • /
    • v.19 no.1
    • /
    • pp.57-77
    • /
    • 2013
  • Social media has become one of the most popular media in web and mobile application. In 2011, social networks and blogs are still the top destination of online users, according to a study from Nielsen Company. In their studies, nearly 4 in 5active users visit social network and blog. Social Networks and Blogs sites rule Americans' Internet time, accounting to 23 percent of time spent online. Facebook is the main social network that the U.S internet users spend time more than the other social network services such as Yahoo, Google, AOL Media Network, Twitter, Linked In and so on. In recent trend, most of the companies promote their products in the Facebook by creating the "Facebook Page" that refers to specific product. The "Like" option allows user to subscribed and received updates their interested on from the page. The film makers which produce a lot of films around the world also take part to market and promote their films by exploiting the advantages of using the "Facebook Page". In addition, a great number of streaming service providers allows users to subscribe their service to watch and enjoy movies and TV program. They can instantly watch movies and TV program over the internet to PCs, Macs and TVs. Netflix alone as the world's leading subscription service have more than 30 million streaming members in the United States, Latin America, the United Kingdom and the Nordics. As the matter of facts, a million of movies and TV program with different of genres are offered to the subscriber. In contrast, users need spend a lot time to find the right movies which are related to their interest genre. Recent years there are many researchers who have been propose a method to improve prediction the rating or preference that would give the most related items such as books, music or movies to the garget user or the group of users that have the same interest in the particular items. One of the most popular methods to build recommendation system is traditional Collaborative Filtering (CF). The method compute the similarity of the target user and other users, which then are cluster in the same interest on items according which items that users have been rated. The method then predicts other items from the same group of users to recommend to a group of users. Moreover, There are many items that need to study for suggesting to users such as books, music, movies, news, videos and so on. However, in this paper we only focus on movie as item to recommend to users. In addition, there are many challenges for CF task. Firstly, the "sparsity problem"; it occurs when user information preference is not enough. The recommendation accuracies result is lower compared to the neighbor who composed with a large amount of ratings. The second problem is "cold-start problem"; it occurs whenever new users or items are added into the system, which each has norating or a few rating. For instance, no personalized predictions can be made for a new user without any ratings on the record. In this research we propose a clustering method according to the users' genre interest extracted from social network service (SNS) and user's movies rating information system to solve the "cold-start problem." Our proposed method will clusters the target user together with the other users by combining the user genre interest and the rating information. It is important to realize a huge amount of interesting and useful user's information from Facebook Graph, we can extract information from the "Facebook Page" which "Like" by them. 
Moreover, we use the Internet Movie Database(IMDb) as the main dataset. The IMDbis online databases that consist of a large amount of information related to movies, TV programs and including actors. This dataset not only used to provide movie information in our Movie Rating Systems, but also as resources to provide movie genre information which extracted from the "Facebook Page". Formerly, the user must login with their Facebook account to login to the Movie Rating System, at the same time our system will collect the genre interest from the "Facebook Page". We conduct many experiments with other methods to see how our method performs and we also compare to the other methods. First, we compared our proposed method in the case of the normal recommendation to see how our system improves the recommendation result. Then we experiment method in case of cold-start problem. Our experiment show that our method is outperform than the other methods. In these two cases of our experimentation, we see that our proposed method produces better result in case both cases.
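
A minimal sketch of the clustering idea: each user is represented by a vector that concatenates genre interest (e.g., derived from "Liked" Facebook Pages) with normalized ratings, so a cold-start user with no ratings can still be grouped by genre. The toy data, the 0.5/0.5 weighting, and the use of k-means are illustrative assumptions rather than the paper's exact method.

```python
import numpy as np
from sklearn.cluster import KMeans

genre_interest = np.array([    # users x genres, 1 = user "Liked" pages of that genre
    [1, 0, 1, 0],
    [1, 0, 1, 0],
    [0, 1, 0, 1],
    [0, 1, 0, 0],
], dtype=float)

ratings = np.array([           # users x movies, 0 = unrated
    [5, 0, 4, 0],
    [4, 0, 5, 0],
    [0, 3, 0, 5],
    [0, 0, 0, 0],              # cold-start user: no ratings, genre interest still available
], dtype=float)

# Combine both signals into one feature vector per user (weights are assumptions)
features = np.hstack([0.5 * genre_interest, 0.5 * ratings / 5.0])
clusters = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(features)
print(clusters)                # the cold-start user is grouped via genre interest alone
```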

Lipopolysaccharide-induced Synthesis of IL-1beta, IL-6, TNF-alpha and TGF-beta by Peripheral Blood Mononuclear Cells (내독소에 의한 말초혈액 단핵구의 IL-1beta, IL-6, TNF-alpha와 TGF-beta 생성에 관한 연구)

  • Jung, Sung-Hwan;Park, Choon-Sik;Kim, Mi-Ho;Kim, Eun-Young;Chang, Hun-Soo;Ki, Shin-Young;Uh, Soo-Taek;Moon, Seung-Hyuk;Kim, Yang-Hoon;Lee, Hi-Bal
    • Tuberculosis and Respiratory Diseases
    • /
    • v.45 no.4
    • /
    • pp.846-860
    • /
    • 1998
  • Background: Endotoxin (LPS, lipopolysaccharide), a potent activator of the immune system, can induce acute and chronic inflammation through the production of cytokines by a variety of cells, such as monocytes, endothelial cells, lymphocytes, eosinophils, neutrophils, and fibroblasts. LPS stimulates mononuclear cells through two different pathways, CD14-dependent and CD14-independent; the former has been well documented, but not the latter. LPS binds to LPS-binding protein (LBP) in serum to form the LPS-LBP complex, which interacts with CD14 molecules on the mononuclear cell surface in peripheral blood or is transported to the tissues. At high concentrations, LPS can stimulate macrophages directly without LBP. We investigated the generation of the proinflammatory cytokines interleukin 1 (IL-1), IL-6, and TNF-α, and the fibrogenic cytokine TGF-β, by peripheral blood mononuclear cells (PBMC) after LPS stimulation under serum-free conditions, which lack LBP. Methods: PBMC were obtained by centrifugation of peripheral venous blood from healthy normal subjects on Ficoll-Hypaque solution, then stimulated with LPS (0.1 μg/mL to 100 μg/mL). The activities of IL-1, IL-6, TNF, and TGF-β were measured by bioassays using cytokine-dependent proliferating or inhibited cell lines. The cellular sources of the cytokines were investigated by immunohistochemical staining and in situ hybridization. Results: PBMC began to produce IL-6, TNF-α, and TGF-β at 1 hr, 4 hrs, and 8 hrs, respectively, after LPS stimulation, and production continued to increase through 96 hrs after stimulation. The amounts produced were 19.8 ng/mL of IL-6 by 10⁵ PBMC, 4.1 ng/mL of TNF by 10⁶ PBMC, and 34.4 pg/mL of TGF-β by 2×10⁶ PBMC. Immunoreactivity to IL-6, TNF-α, and TGF-β was detected on monocytes in LPS-stimulated PBMC, and some lymphocytes showed positive immunoreactivity to TGF-β. Double immunohistochemical staining showed that IL-1β, IL-6, and TNF-α expression was not associated with CD14 positivity on monocytes. IL-1β, IL-6, TNF-α, and TGF-β mRNA expression matched the immunoreactivity observed for each cytokine. Conclusion: When monocytes are stimulated with LPS under serum-free conditions, IL-6 and TNF-α are secreted in the early stage of inflammation. In contrast, TGF-β secretion arises in the later stages and is maintained after 96 hrs. The main cells releasing IL-1β, IL-6, TNF-α, and TGF-β are monocytes, but lymphocytes can also secrete TGF-β.


End to End Model and Delay Performance for V2X in 5G (5G에서 V2X를 위한 End to End 모델 및 지연 성능 평가)

  • Bae, Kyoung Yul;Lee, Hong Woo
    • Journal of Intelligence and Information Systems
    • /
    • v.22 no.1
    • /
    • pp.107-118
    • /
    • 2016
  • The advent of 5G mobile communications, expected in 2020, will enable services such as the Internet of Things (IoT) and vehicle-to-infra/vehicle/nomadic (V2X) communication. Realizing these services imposes many requirements: reduced latency, high data rate and reliability, and real-time service. In particular, a high level of reliability and delay sensitivity together with an increased data rate are very important for M2M, IoT, and Factory 4.0. Around the world, 5G standardization organizations have considered these services and grouped them to derive the technical requirements and service scenarios. The first scenario covers broadcast services that use a high data rate, for example for sporting events or emergencies; the second covers support for e-Health, car reliability, and the like; the third relates to VR games with delay sensitivity and real-time requirements. These groups have recently been reaching agreement on the requirements for such scenarios and the target levels. Various techniques are being studied to satisfy these requirements and are being discussed in the context of software-defined networking (SDN) as the next-generation network architecture. SDN, which is being standardized by the ONF, basically refers to a structure that separates the control-plane signaling from the data-plane packets. One of the best examples requiring low latency and high reliability is an intelligent traffic system (ITS) using V2X. Because a car passes through a small cell of the 5G network very rapidly, messages to be delivered in the event of an emergency must be transported in a very short time; this is a typical example of high delay sensitivity. 5G has to support high-reliability and delay-sensitivity requirements for V2X in the field of traffic control, and for this reason V2X is a major delay-critical application. V2X (vehicle-to-infra/vehicle/nomadic) covers all types of communication applicable to roads and vehicles and refers to a connected or networked vehicle. V2X can be divided into three kinds of communication: between a vehicle and infrastructure (vehicle-to-infrastructure, V2I), between a vehicle and another vehicle (vehicle-to-vehicle, V2V), and between a vehicle and mobile equipment (vehicle-to-nomadic devices, V2N); further types may be added in various fields in the future. Because the SDN structure is under consideration as the next-generation network architecture, its architecture is significant here. However, the centralized architecture of SDN can be unfavorable for delay-sensitive services, because a centralized controller must communicate with many nodes and provide substantial processing power. Therefore, for emergency V2X communications, delay-related control functions require a tree-like supporting structure, and the architecture of the network that processes the vehicle information becomes a major variable affecting delay. Because it is difficult to meet the desired level of delay sensitivity with a fully centralized SDN structure, research on the optimal size of an SDN domain for processing this information is needed. This study examined the SDN architecture considering the V2X emergency delay requirements of a 5G network in the worst-case scenario and performed a system-level simulation over vehicle speed, cell radius, and cell tier to derive the range of cells suitable for information transfer in the SDN network.
In the simulation, because 5G provides a sufficiently high data rate, the information supporting neighboring vehicles was assumed to be delivered to the car without errors. Furthermore, the 5G small cell was assumed to have a radius of 50-100 m, and vehicle speeds of 30-200 km/h were considered in order to examine the network architecture that minimizes the delay.
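
As a small illustration of the kind of quantity such a simulation varies, the sketch below computes how long a vehicle remains inside one small cell for the assumed radii and speeds, which bounds the time available to deliver an emergency message before handover; the formula is a simple geometric assumption, not the paper's system-level model.

```python
# Dwell time of a vehicle in one 5G small cell, assuming it crosses the full
# cell diameter at constant speed. Radii and speeds follow the ranges assumed
# in the abstract; the geometry is a simplification for illustration.

def dwell_time_s(cell_radius_m, speed_kmh):
    speed_ms = speed_kmh / 3.6
    return 2.0 * cell_radius_m / speed_ms

for radius in (50, 100):            # small-cell radii (m)
    for speed in (30, 100, 200):    # vehicle speeds (km/h)
        print(f"radius={radius} m, speed={speed} km/h -> dwell {dwell_time_s(radius, speed):.2f} s")
```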

A Comparative Study of Subset Construction Methods in OSEM Algorithms using Simulated Projection Data of Compton Camera (모사된 컴프턴 카메라 투사데이터의 재구성을 위한 OSEM 알고리즘의 부분집합 구성법 비교 연구)

  • Kim, Soo-Mee;Lee, Jae-Sung;Lee, Mi-No;Lee, Ju-Hahn;Kim, Joong-Hyun;Kim, Chan-Hyeong;Lee, Chun-Sik;Lee, Dong-Soo;Lee, Soo-Jin
    • Nuclear Medicine and Molecular Imaging
    • /
    • v.41 no.3
    • /
    • pp.234-240
    • /
    • 2007
  • Purpose: In this study we propose a block-iterative method for reconstructing Compton-scattered data. The study shows that the well-known expectation maximization (EM) approach, along with its accelerated version based on the ordered-subsets principle, can be applied to the problem of image reconstruction for a Compton camera. It also compares several methods of constructing subsets for optimal performance of the algorithms. Materials and Methods: Three reconstruction algorithms were implemented: simple backprojection (SBP), EM, and ordered-subset EM (OSEM). For OSEM, the projection data were grouped into subsets in a predefined order. Three different schemes for choosing nonoverlapping subsets were considered: scatter angle-based subsets, detector position-based subsets, and subsets based on both scatter angle and detector position. EM and OSEM with 16 subsets were run for 64 and 4 iterations, respectively. The performance of each algorithm was evaluated in terms of computation time and normalized mean-squared error. Results: Both EM and OSEM clearly outperformed SBP in all aspects of accuracy. OSEM with 16 subsets and 4 iterations, which is equivalent to the standard EM with 64 iterations, was approximately 14 times faster in computation time than the standard EM. In OSEM, all three subset-selection schemes yielded similar results in computation time as well as normalized mean-squared error. Conclusion: Our results show that the OSEM algorithm, which has proven useful in emission tomography, can also be applied to image reconstruction for the Compton camera. With properly chosen subset construction methods and a moderate number of subsets, our OSEM algorithm significantly improves computational efficiency while preserving the quality of the standard EM reconstruction. The OSEM algorithm with subsets based on both scatter angle and detector position is the most useful.
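
As a rough illustration of the ordered-subsets update the abstract describes (not the authors' Compton-camera implementation, whose conical projection model is omitted), the sketch below runs an OSEM loop on a toy system matrix, partitioning the projection rows into 16 ordered subsets and performing 4 passes, mirroring the 16-subset/4-iteration configuration reported.

```python
import numpy as np

def osem(A, y, n_subsets=16, n_iters=4, eps=1e-12):
    """OSEM: one multiplicative EM-like update per ordered subset of projection rows."""
    n_bins, n_voxels = A.shape
    x = np.ones(n_voxels)
    subsets = np.array_split(np.arange(n_bins), n_subsets)  # ordered, nonoverlapping
    for _ in range(n_iters):
        for idx in subsets:
            As = A[idx]
            ratio = y[idx] / np.maximum(As @ x, eps)
            x *= (As.T @ ratio) / np.maximum(As.sum(axis=0), eps)
    return x

# Toy example: random system matrix and noiseless projections of a known image
rng = np.random.default_rng(1)
A = rng.random((256, 32))
x_true = rng.random(32)
y = A @ x_true
x_rec = osem(A, y)                   # 16 subsets x 4 iterations ~ 64 plain EM iterations
print(np.mean((x_rec - x_true) ** 2))
```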

Tc-99m ECD Brain SPECT in MELAS Syndrome and Mitochondrial Myopathy: Comparison with MR findings (MELAS 증후군과 미토콘드리아 근육병에서의 Tc-99m ECD 뇌단일 광전자방출 전산화단층촬영 소견: 자기공명영상과의 비교)

  • Park, Sang-Joon;Ryu, Young-Hoon;Jeon, Tae-Joo;Kim, Jai-Keun;Nam, Ji-Eun;Yoon, Pyeong-Ho;Yoon, Choon-Sik;Lee, Jong-Doo
    • The Korean Journal of Nuclear Medicine
    • /
    • v.32 no.6
    • /
    • pp.490-496
    • /
    • 1998
  • Purpose: We evaluated brain perfusion SPECT findings of MELAS syndrome and mitochondrial myopathy in correlation with MR imaging, in search of specific imaging features. Materials and Methods: Subjects were five patients (four females and one male; age range, 1 to 25 years) who presented with repeated stroke-like episodes, seizures, or developmental delay, or who were asymptomatic but had elevated lactic acid in CSF and serum. Conventional non-contrast MR imaging and Tc-99m ethyl cysteinate dimer (ECD) brain perfusion SPECT were performed and the imaging features were analyzed. Results: MRI demonstrated increased T2 signal intensities in affected areas of gray and white matter, mainly in the parietal (4/5) and occipital lobes (4/5) and in the basal ganglia (1/5), which were not restricted to a specific vascular territory. SPECT demonstrated decreased perfusion in the regions corresponding to the MRI lesions. In addition, there were perfusion defects in the parietal (1 patient), temporal (2), and frontal (1) lobes, the basal ganglia (1), and the thalami (2). In a patient with mitochondrial myopathy who had normal MRI, decreased perfusion was noted in the left parietal area and both thalami. Conclusion: Tc-99m ECD SPECT imaging in patients with MELAS syndrome and mitochondrial myopathy showed hypoperfusion of the parieto-occipital cortex, basal ganglia, thalamus, and temporal cortex, which was not restricted to a specific vascular territory. There were no specific imaging features on SPECT. The significance of abnormal perfusion on SPECT without corresponding MR abnormalities needs to be evaluated further in a larger number of patients.


Methods for Integration of Documents using Hierarchical Structure based on the Formal Concept Analysis (FCA 기반 계층적 구조를 이용한 문서 통합 기법)

  • Kim, Tae-Hwan;Jeon, Ho-Cheol;Choi, Joong-Min
    • Journal of Intelligence and Information Systems
    • /
    • v.17 no.3
    • /
    • pp.63-77
    • /
    • 2011
  • The World Wide Web is a very large distributed digital information space. From its origins in 1991, the web has grown to encompass diverse information resources such as personal home pages, online digital libraries, and virtual museums. Some estimates suggest that the web currently includes over 500 billion pages in the deep web. The ability to search and retrieve information from the web efficiently and effectively is an enabling technology for realizing its full potential. With powerful workstations and parallel processing technology, efficiency is not a bottleneck; in fact, some existing search tools sift through gigabyte-size precompiled web indexes in a fraction of a second. But retrieval effectiveness is a different matter. Current search tools retrieve too many documents, of which only a small fraction are relevant to the user query, and the most relevant documents do not necessarily appear at the top of the query output. Moreover, current search tools cannot retrieve, from the gigantic volume of documents, the documents related to a retrieved document. The most important problem for many current search systems is to increase the quality of search: to provide related documents and to keep the number of unrelated documents in the results as low as possible. To address this problem, CiteSeer proposed ACI (Autonomous Citation Indexing) of articles on the World Wide Web. A "citation index" indexes the links between articles that researchers make when they cite other articles. Citation indexes are very useful for a number of purposes, including literature search and analysis of the academic literature. In this approach, references contained in academic articles are used to give credit to previous work in the literature and provide a link between the "citing" and "cited" articles; the citation index indexes the citations that an article makes, linking the articles with the cited works. Citation indexes were originally designed mainly for information retrieval. The citation links allow navigating the literature in unique ways: papers can be located independently of language and of the words in the title, keywords, or document, and a citation index allows navigation backward in time (the list of cited articles) and forward in time (which subsequent articles cite the current article). However, CiteSeer cannot index links between articles that researchers do not explicitly make, because it indexes only the citation links that authors create when they cite other articles; for the same reason, CiteSeer is also difficult to scale. All these problems motivate the design of a more effective search system. This paper presents a method that extracts a subject and predicate from each sentence in a document. Each document is converted into a tabular form in which the extracted predicates are checked against possible subjects and objects. We build a hierarchical graph of each document using this table and then integrate the graphs of the documents. From the graph of all documents, the area of each document is computed relative to the integrated documents, and relations among the documents are marked by comparing these areas. We also propose a method for structural integration of documents that retrieves documents from the graph, making it easier for the user to find information. We compared the performance of the proposed approaches with the Lucene search engine using standard ranking formulas.
As a result, the F-measure is about 60%, roughly 15% better than the baseline.
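
The following Python sketch, written for illustration rather than taken from the paper, mimics two pieces of this pipeline: relating documents through the overlap of their extracted (subject, predicate) pairs as a crude stand-in for the area comparison on the integrated graph, and computing the F-measure used in the evaluation. The toy documents and the overlap formula are assumptions.

```python
from itertools import combinations

# Each document reduced to a set of (subject, predicate) pairs (toy data).
docs = {
    "d1": {("web", "grow"), ("search", "retrieve"), ("index", "build")},
    "d2": {("search", "retrieve"), ("index", "build"), ("citation", "link")},
    "d3": {("citation", "link"), ("article", "cite")},
}

# Relation strength between documents: shared pairs over the smaller document "area".
for a, b in combinations(docs, 2):
    overlap = len(docs[a] & docs[b]) / min(len(docs[a]), len(docs[b]))
    print(a, b, f"{overlap:.2f}")

def f_measure(retrieved, relevant):
    """Standard F1 score over sets of retrieved and relevant document ids."""
    tp = len(retrieved & relevant)
    precision = tp / len(retrieved) if retrieved else 0.0
    recall = tp / len(relevant) if relevant else 0.0
    return 2 * precision * recall / (precision + recall) if (precision + recall) else 0.0

print(f_measure({"d1", "d2"}, {"d2", "d3"}))   # the paper reports an F-measure of about 60%
```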