• Title/Summary/Keyword: 레이어 기능 (layer function)


International Standardization and Domestic Application Methods according to Interagency Operations Advisory Group (IOAG) Service Catalog (Interagency Operations Advisory Group (IOAG) Service Catalog 에 따른 국제 표준화 및 국내 적용 방안)

  • Lee, Junghyun; Park, Durkjong; Ahn, Sangil
    • Journal of Space Technology and Applications / v.2 no.2 / pp.137-145 / 2022
  • Space development has historically been carried out independently by each country's space agency. This leads to redundant development of individual functions and a waste of space resources. Accordingly, the Interagency Operations Advisory Group (IOAG) was established to standardize services by mutual agreement among international organizations, so that space resources can be used efficiently between space agencies through cross-support. The IOAG defines Service Catalogs #1, #2, and #3 according to the network layer. This technical paper discusses the background and main contents of the IOAG Service Catalog and a plan for its application to domestic space development.

An Assessment of Coastal Area Using Geographic Information Systems and Multi-Criteria Analysis (지리정보시스템(GIS)과 다기준 분석법(MCA)을 적용한 연안지역 평가)

  • Choi, Hee-Jung; Park, Jung-Jae; Hwang, Chul-Sue
    • Journal of the Korean Association of Regional Geographers / v.13 no.2 / pp.143-155 / 2007
  • There are many conflicts of interest among the various stakeholders in the development of coastal areas. An integrated methodology that reflects physical conditions, socio-economic circumstances, and people's values is therefore needed to resolve these problems. In this study, geographic information systems (GIS) and the analytic hierarchy process (AHP), one of the multi-criteria analysis methodologies, are loosely coupled to develop better analytic procedures for coastal assessment. Socio-economic and environmental parameters of the study area, the Hampyung Bay area, are converted to a GIS-applicable format, while AHP is used to assess the relative importance of each parameter by calculating weighting factors. After spatial data from various sources are standardized and rasterized, the weighting factors are applied to produce a layer for each parameter. Map algebra and overlay analyses are then used to create the final layer according to the decision-making logic proposed here; the cell values of that layer can be regarded as spatial alternatives. In addition, the flexibility of the weighting factors enables decision-makers to understand the procedures and alternatives in relation to selected strategies for coastal management.
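The AHP weighting and weighted-overlay steps described in this abstract can be sketched as follows; the pairwise comparison matrix, the three criteria, and the raster size are hypothetical illustrations, not values from the paper.

```python
import numpy as np

# Hypothetical AHP pairwise comparison matrix for three criteria
# (values are illustrative, not from the study).
A = np.array([
    [1.0,   3.0,   5.0],
    [1/3.0, 1.0,   2.0],
    [1/5.0, 1/2.0, 1.0],
])

# AHP weights: principal eigenvector of the comparison matrix, normalized.
eigvals, eigvecs = np.linalg.eig(A)
w = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
weights = w / w.sum()

# Weighted overlay (map algebra): each criterion is a standardized raster
# layer; the suitability surface is the weighted sum of the layers.
rng = np.random.default_rng(0)
layers = rng.random((3, 4, 4))                 # 3 criteria on a toy 4x4 grid
suitability = np.tensordot(weights, layers, axes=1)
```

Cells with the highest values in `suitability` would correspond to the "spatial alternatives" the abstract mentions.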


Pedestrian Classification using CNN's Deep Features and Transfer Learning (CNN의 깊은 특징과 전이학습을 사용한 보행자 분류)

  • Chung, Soyoung; Chung, Min Gyo
    • Journal of Internet Computing and Services / v.20 no.4 / pp.91-102 / 2019
  • In autonomous driving systems, the ability to classify pedestrians in images captured by cameras is very important for pedestrian safety. In the past, features of pedestrians were extracted with HOG (Histogram of Oriented Gradients) or SIFT (Scale-Invariant Feature Transform) and then classified using an SVM (Support Vector Machine). However, extracting pedestrian characteristics in such a handcrafted manner has many limitations. This paper therefore proposes a method to classify pedestrians reliably and effectively using the deep features of a CNN (Convolutional Neural Network) and transfer learning. We experimented with both the fixed feature extractor and fine-tuning methods, the two representative transfer learning techniques. In particular, for the fine-tuning method we added a new scheme, called M-Fine (Modified Fine-tuning), which divides layers into transferred and non-transferred parts in three different sizes and adjusts weights only for the layers belonging to the non-transferred parts. Experiments on the INRIA Person data set with five CNN models (VGGNet, DenseNet, Inception V3, Xception, and MobileNet) showed that CNN deep features outperform handcrafted features such as HOG and SIFT, and that the accuracy of Xception (threshold = 0.5) is the highest at 99.61%. MobileNet, which achieved performance similar to Xception with 80% fewer parameters, was the best in terms of efficiency. Among the three transfer learning schemes tested, the fine-tuning method performed best. The performance of the M-Fine method was comparable to or slightly lower than that of the fine-tuning method, but higher than that of the fixed feature extractor method.
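The split between transferred (frozen) and non-transferred (updated) parts that M-Fine builds on can be illustrated with a minimal numpy sketch; the layer sizes, data, and learning rate are toy values, and the paper itself works with full CNNs such as Xception rather than this two-layer model.

```python
import numpy as np

# Toy two-layer classifier: the first layer plays the "transferred" part
# (frozen), the second the "non-transferred" part (trained).
rng = np.random.default_rng(42)

W_frozen = rng.standard_normal((8, 4)) * 0.1   # transferred layer (kept fixed)
W_train = rng.standard_normal((4, 2)) * 0.1    # non-transferred layer (updated)

x = rng.standard_normal((16, 8))               # toy batch of 16 samples
y = np.eye(2)[rng.integers(0, 2, size=16)]     # one-hot toy labels

W_frozen_before = W_frozen.copy()
losses, lr = [], 0.5
for _ in range(20):
    h = np.maximum(x @ W_frozen, 0.0)          # frozen feature extractor (ReLU)
    logits = h @ W_train
    p = np.exp(logits - logits.max(axis=1, keepdims=True))
    p /= p.sum(axis=1, keepdims=True)          # softmax probabilities
    losses.append(-np.sum(y * np.log(p + 1e-12)) / len(x))
    W_train -= lr * (h.T @ ((p - y) / len(x))) # update ONLY the trainable part
```

In a deep-learning framework the same effect is obtained by disabling gradient updates for the transferred layers.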

Functions and Roles of Digital Landscape Architectural Drawing (조경 설계에서 디지털 드로잉의 기능과 역할)

  • Lee, Myeong-Jun
    • Journal of the Korean Institute of Landscape Architecture / v.46 no.2 / pp.1-13 / 2018
  • This work discusses the major roles of digital technologies in the history of landscape architectural drawing and offers a critique of the dominant trend toward realism in recent digital landscape representations. During the transition from conventional drawing tools, computer technologies generally functioned as mechanical tools imitating prior manual techniques: GIS served as a mechanical tool for efficiently processing the manual layer-cake method; CAD software mainly translated physical models into two-dimensional construction documents; and graphic software performed processes similar to manual collage and montage. Recent digital landscape drawings tend to adopt realistic, painting-like depictions of landscape appearance. In such representations, discernible traces of cutting and assembling are removed with graphic software, so the finished images are perceived as if they were copies of an actual landscape. Realistic images are an easy way to communicate with the public, but they can hardly embody all the multisensory characteristics of a landscape. They often deceive viewers by visualizing idealized conditions of not-yet-actualized landscapes, and producing the final images takes up a large portion of the overall design process. Alternatively, 3D digital modeling of landscape performance, creative uses of digital technologies throughout the design process, and hybrid techniques combining different drawing techniques and technologies provide opportunities to explore various aspects of a landscape.

Development of PDA-Based Software for Forest Geographic Information (PDA기반의 산림지리정보 소프트웨어 개발에 관한 연구)

  • Suk, Sooil; Lee, Heonho; Lee, Dohyung
    • Journal of Korean Society of Forest Science / v.96 no.1 / pp.7-13 / 2007
  • This study developed a PDA-based application system for forest geographic information with GPS. The major results were as follows. A PDA-based application program was developed to run on the Microsoft™ PocketPC 2002 and 2003 operating systems. The PDA screen displays a 1:25,000 digital topographic map in DXF format converted on a PC, and the map can be zoomed in or out across five levels from 1:2,500 to 1:30,000. The current position and the navigated path received from GPS can be displayed on the screen and saved on the PDA. Information selected from the layers of the DXF digital topographic map can be converted into binary files usable by the forest geographic information software; this conversion compresses the DXF files by 90% in size and improves the processing speed of the PDA. The forest geographic information management system can be used to manage sample plots for forest inventory with the help of sub-menus and grid index values tied to GPS position information. In the field, forest workers can query forest geographic information such as forest type, location, forest roads, and soil erosion control dams. The system can also provide current position and route information to people who enjoy forest-related activities such as mountain climbing, sightseeing, and visits to historic spots.
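The DXF-to-binary conversion described above can be sketched as follows; the record layout (a vertex count followed by packed 32-bit floats) is a hypothetical illustration of why a binary encoding is much smaller than ASCII DXF group codes, not the paper's actual format.

```python
import struct

# Toy polyline, roughly as it would appear as a sequence of DXF
# coordinate group codes (10 = X, 20 = Y) in an ASCII file.
polyline = [(127.1234, 36.5678), (127.2000, 36.6000), (127.2500, 36.6500)]
text_form = "".join(f"10\n{x:.4f}\n20\n{y:.4f}\n" for x, y in polyline)

# Hypothetical binary record: little-endian vertex count, then packed
# 32-bit float coordinate pairs.
binary_form = struct.pack("<I", len(polyline))
for x, y in polyline:
    binary_form += struct.pack("<2f", x, y)

def read_binary(buf):
    """Decode the packed polyline back into coordinate pairs."""
    (n,) = struct.unpack_from("<I", buf, 0)
    return [struct.unpack_from("<2f", buf, 4 + 8 * i) for i in range(n)]
```

Fixed-size records like this also allow random access by index, which helps on memory-constrained PDA hardware.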

The Integrated Information Systems for Frequently Flooded Area Using Internet GIS (Internet GIS를 이용한 상습침수지역 종합정보화 시스템)

  • Yeo, Woon-Ki; Jang, Kyung-Soo; Jun, Ji-Young; Jee, Hong-Kee; Lee, Soon-Tak
    • Proceedings of the Korea Water Resources Association Conference / 2006.05a / pp.1116-1120 / 2006
  • Flood damage is increasing because of the concentration of houses, factories, and other facilities in low-lying areas along rivers, the development of mountainous areas, and the growth of urbanized areas driven by population concentration. In particular, residents of frequently flooded districts live under a constant threat of disaster, so a new disaster management framework with firm countermeasures is needed. With the recent growth of Internet use, many users share diverse data over the Web, and this trend will only increase. The Internet delivers a wide range of information to users in real time, and its content and delivery methods are diversifying with advances in communication technology and network integration. The GIS field is likewise shifting toward providing functions and services over the Internet to many distributed organizations and users. Internet GIS is a specialized GIS tool that uses the Internet as the means of accessing, transmitting, analyzing, and presenting remote geographic data. It can include most of the functions of traditional GIS software, plus additional capabilities that exploit the Internet and its associated WWW and FTP protocols: exchanging remote data and applications, performing GIS analysis without a GIS application on the local computer, and presenting interactive maps and data on the Internet. Internet GIS is characterized as object-oriented, interoperable, and distributed: each item of GIS data and functionality resides on a different server as an object and is combined or integrated as needed. To build an information site for frequently flooded districts with Internet GIS, an Internet-ready GIS base map of those districts must first be constructed. Such a base map is in effect another kind of thematic map, and building it requires data conversion and processing. The 1:5,000 digital topographic maps produced by the National Geography Institute (국립지리원) are available, but they are voluminous and contain many layers unnecessary for flood-prone districts. Serving the data over the Internet therefore requires deleting unnecessary layers, generalizing the data for service speed, handling zoom-dependent presentation, and designing symbols and colors for legibility; a new Internet GIS base map reflecting these requirements was produced. A phased construction strategy is needed to efficiently manage, via Internet GIS, the various GIS data related to frequently flooded districts together with the public information held by each agency that must be linked to spatial data. Accordingly, this paper builds an integrated information system for frequently flooded areas that can search, process, and analyze information on such areas using Internet GIS.
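The generalization step mentioned in the abstract is commonly done with line simplification; one standard approach is Douglas-Peucker, sketched below with a hypothetical tolerance (the paper does not name the algorithm it used).

```python
def point_line_dist(p, a, b):
    """Perpendicular distance from point p to the line through a and b."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    if dx == 0 and dy == 0:
        return ((px - ax) ** 2 + (py - ay) ** 2) ** 0.5
    return abs(dy * (px - ax) - dx * (py - ay)) / (dx * dx + dy * dy) ** 0.5

def simplify(points, tol):
    """Douglas-Peucker generalization: drop vertices closer than tol
    to the chord between the segment endpoints."""
    if len(points) < 3:
        return list(points)
    dists = [point_line_dist(p, points[0], points[-1]) for p in points[1:-1]]
    i = max(range(len(dists)), key=dists.__getitem__) + 1
    if dists[i - 1] <= tol:
        return [points[0], points[-1]]     # all interior vertices droppable
    left = simplify(points[:i + 1], tol)   # recurse on both halves around
    right = simplify(points[i:], tol)      # the farthest vertex
    return left[:-1] + right

original = [(0, 0), (1, 0.05), (2, 0), (3, 1), (4, 0)]
generalized = simplify(original, tol=0.1)  # drops the near-collinear vertex
```

Larger tolerances remove more vertices, trading map fidelity for download size and rendering speed.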


Analysis of Georeferencing Accuracy in 3D Building Modeling Using CAD Plans (CAD 도면을 활용한 3차원 건축물 모델링의 Georeferencing 정확도 분석)

  • Kim, Ji-Seon; Yom, Jae-Hong; Lee, Dong-Cheon
    • Journal of the Korean Society of Surveying, Geodesy, Photogrammetry and Cartography / v.25 no.2 / pp.117-131 / 2007
  • Representation of building internal space is an active research area as the need for geometrically accurate and visually realistic models increases. Three-dimensional representation is common ground for disciplines such as computer graphics, architectural design and engineering, and geographic information systems (GIS). In many cases, CAD plans are the starting point for reconstructing 3D building models. The main objectives of building reconstruction in GIS applications are visualization and spatial analysis; hence, CAD plans must be preprocessed and edited to fit the data models of GIS software and then georeferenced to enable spatial analysis. This study automated the preprocessing of CAD data using AutoCAD VBA (Visual Basic for Applications), and the processed data was topologically restructured for further analysis in a GIS environment. The accuracy of georeferencing the CAD data was also examined by comparing the results of coordinate transformations using digital maps and GPS measurements as sources of ground control points. The reconstructed buildings were then applied to visualization and network modeling.
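The georeferencing step described above amounts to estimating a coordinate transformation from ground control points. A common formulation is a 2D affine transformation fitted by least squares, sketched here with synthetic control points (the coordinates and the affine model choice are illustrative, not from the paper):

```python
import numpy as np

# Synthetic CAD-plan corner coordinates (local drawing units)
cad = np.array([[0.0, 0.0], [10.0, 0.0], [10.0, 5.0], [0.0, 5.0]])

# Synthetic "ground" coordinates: rotate, scale, and translate the CAD
# points, simulating GCPs measured on a digital map or by GPS.
theta, scale, t = np.deg2rad(30.0), 2.0, np.array([1000.0, 2000.0])
R = scale * np.array([[np.cos(theta), -np.sin(theta)],
                      [np.sin(theta),  np.cos(theta)]])
ground = cad @ R.T + t

# Estimate the 6 affine parameters [x', y'] = A @ [x, y] + b by least
# squares over the control-point pairs.
X = np.hstack([cad, np.ones((len(cad), 1))])   # design matrix [x y 1]
params, *_ = np.linalg.lstsq(X, ground, rcond=None)
A_est, b_est = params[:2].T, params[2]

transformed = cad @ A_est.T + b_est
rmse = np.sqrt(np.mean(np.sum((transformed - ground) ** 2, axis=1)))
```

With real GCPs the residual `rmse` is exactly the kind of accuracy figure the study compares between digital-map and GPS control sources.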

Effects of Strawberry Powders on the Quality Characteristics of Yellow Layer Cake (딸기 분말의 첨가가 옐로우 레이어 케이크의 품질특성에 미치는 영향)

  • Kim, Yeoung-Ae
    • Korean Journal of Food and Cookery Science / v.24 no.4 / pp.536-541 / 2008
  • To determine the effects of strawberry powder on the baking quality of cakes, yellow layer cakes were prepared with freeze-dried strawberry powder substituted at levels of 0%, 1%, 2%, 3%, 4%, and 5%. The physical properties, baking properties, and sensory characteristics of the finished cakes were then assessed. The cakes were stored for 6 days at 22°C, and the change in hardness during storage was evaluated. The viscosity and specific gravity of the doughs increased as the amount of strawberry powder in the flour increased. The volume indices of the strawberry powder cakes were lower than that of the control, but the other indices did not differ from the controls. The crust color of the strawberry cakes showed reduced L, a, and b values; the crumb color also showed reduced L and b values, but the a values increased. Sensory characteristics were evaluated by 60 students from the Dept. of Food and Biotechnology: crust color, crumb color, moistness, softness, taste, and overall acceptance were measured with a 5-point acceptance test. The crust color of cakes containing more than 3% strawberry powder and the crumb color of all strawberry cakes scored lower than the controls, and cakes containing 5% strawberry powder were least acceptable overall. Although cakes prepared with flour containing up to 4% strawberry powder were less acceptable than the controls, general sensory scores fell in the average-to-like range. Incorporating strawberry powder into the cakes was shown to increase their overall hardness.

Clustering Performance Analysis of Autoencoder with Skip Connection (스킵연결이 적용된 오토인코더 모델의 클러스터링 성능 분석)

  • Jo, In-su; Kang, Yunhee; Choi, Dong-bin; Park, Young B.
    • KIPS Transactions on Software and Data Engineering / v.9 no.12 / pp.403-410 / 2020
  • In addition to research on noise removal and super-resolution using the data restoration (output) function of autoencoders, research on improving clustering performance using the dimension-reduction function of autoencoders is being actively conducted. The clustering function and the data restoration function of an autoencoder have in common that both improve through the same training. Based on this, this study experimented with whether an autoencoder model designed for excellent data restoration performance is also superior in clustering performance. A skip connection was used to design an autoencoder with excellent restoration performance. The restoration performance and clustering performance of autoencoder models with and without the skip connection were presented as graphs and visualizations. Restoration performance increased, but clustering performance decreased. This result indicates that for neural network models such as autoencoders, a good output does not guarantee that each layer has learned the characteristics of the data well. Finally, the degradation in clustering performance was compensated for by using both the latent code and the skip connection. This study is a preliminary step toward solving the Hanja Unicode problem by clustering.
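The architecture being compared can be sketched as a forward pass in numpy; the layer sizes are illustrative and the weights untrained, so this only shows where the skip connection enters relative to the latent code used for clustering.

```python
import numpy as np

rng = np.random.default_rng(1)

def relu(z):
    return np.maximum(z, 0.0)

W_e1 = rng.standard_normal((32, 16)) * 0.1   # encoder layer 1
W_e2 = rng.standard_normal((16, 4)) * 0.1    # encoder layer 2 -> latent code
W_d1 = rng.standard_normal((4, 16)) * 0.1    # decoder layer 1
W_d2 = rng.standard_normal((16, 32)) * 0.1   # decoder layer 2 -> output

def forward(x, skip=True):
    h1 = relu(x @ W_e1)          # encoder feature map
    z = relu(h1 @ W_e2)          # latent code (used for clustering)
    d1 = relu(z @ W_d1)
    if skip:
        d1 = d1 + h1             # skip connection bypasses the bottleneck
    return z, d1 @ W_d2          # reconstruction

x = rng.standard_normal((8, 32))
z, recon = forward(x)
```

Because the skip path routes information around the bottleneck, reconstruction can improve while the latent code `z` carries less of the data's structure, which matches the clustering degradation the study reports; feeding k-means both `z` and the skip features is one way to compensate.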

An Efficient Cache Mechanism for Improving Response Times in Integrated RFID Middleware (통합 RFID 미들웨어의 응답시간 개선을 위한 효과적인 캐쉬 구조 설계)

  • Kim, Cheong-Ghil; Lee, Jun-Hwan; Park, Kyung-Lang; Kim, Shin-Dug
    • The KIPS Transactions: Part A / v.15A no.1 / pp.17-26 / 2008
  • This paper proposes an efficient caching mechanism for integrated RFID middleware that can combine wireless sensor networks (WSNs) and RFID (radio frequency identification) systems. The operating environment of the integrated RFID middleware is expected to involve a significant amount of data read from RFID readers, constant stream data from large numbers of autonomous sensor nodes, and queries from various applications against previously sensed history data stored in distributed storage. Consequently, a middleware layer equipped with a caching mechanism is necessary for low request-response latency while processing both data streams from sensor networks and history data from distributed databases. For this purpose, the proposed caching mechanism includes two optimization methods, based on classical cache implementation policies, to reduce data-processing overhead in the RFID middleware: a data stream cache (DSC) and a history data cache (HDC), chosen according to the structure of the data request. A number of simulation experiments under different parameters show that the proposed caching mechanism contributes considerably to fast request-response times.
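The DSC/HDC split can be sketched as two caches routed by request type; the LRU policy, capacities, and class names below are illustrative assumptions, since the abstract only says the caches follow classical implementation policies.

```python
from collections import OrderedDict

class LRUCache:
    """A classical LRU cache built on an insertion-ordered dict."""
    def __init__(self, capacity):
        self.capacity = capacity
        self.store = OrderedDict()

    def get(self, key):
        if key not in self.store:
            return None
        self.store.move_to_end(key)          # mark as most recently used
        return self.store[key]

    def put(self, key, value):
        self.store[key] = value
        self.store.move_to_end(key)
        if len(self.store) > self.capacity:
            self.store.popitem(last=False)   # evict least recently used

class MiddlewareCache:
    """Route lookups to the DSC (stream data) or HDC (history queries)."""
    def __init__(self):
        self.dsc = LRUCache(capacity=4)      # small, hot sensor readings
        self.hdc = LRUCache(capacity=8)      # larger, query-result cache

    def lookup(self, request_type, key):
        cache = self.dsc if request_type == "stream" else self.hdc
        return cache.get(key)
```

Keeping the two workloads in separate caches prevents a burst of stream readings from evicting history query results, which is the latency problem the paper targets.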