• Title/Summary/Keyword: Network Resource

Research Trend Analysis Using Bibliographic Information and Citations of Cloud Computing Articles: Application of Social Network Analysis (클라우드 컴퓨팅 관련 논문의 서지정보 및 인용정보를 활용한 연구 동향 분석: 사회 네트워크 분석의 활용)

  • Kim, Dongsung;Kim, Jongwoo
    • Journal of Intelligence and Information Systems
    • /
    • v.20 no.1
    • /
    • pp.195-211
    • /
    • 2014
  • Cloud computing services provide IT resources as services on demand. This is considered a key concept that will lead to a shift from an ownership-based paradigm to a new pay-for-use paradigm, which can reduce the fixed cost of IT resources and improve flexibility and scalability. As IT services, cloud services have evolved from earlier, similar computing concepts such as network computing, utility computing, server-based computing, and grid computing, so research into cloud computing is highly related to and combined with various relevant computing research areas. To identify promising research issues and topics in cloud computing, it is necessary to understand the research trends in cloud computing more comprehensively. In this study, we collect bibliographic and citation information for cloud computing related research papers published in major international journals from 1994 to 2012, and analyze macroscopic trends and network changes in the citation relationships among papers and the co-occurrence relationships of keywords using social network analysis measures. Through the analysis, we can identify the relationships and connections among research topics in cloud computing related areas, and highlight new potential research topics. In addition, we visualize dynamic changes of research topics relating to cloud computing using a proposed cloud computing "research trend map," which positions research topics in a two-dimensional space. The frequencies of keywords (X-axis) and the rates of increase in the degree centrality of keywords (Y-axis) are used as the two dimensions of the research trend map. Based on the values of these two dimensions, the space of a research map is divided into four areas: maturation, growth, promising, and decline. An area with high keyword frequency but a low rate of increase in degree centrality is defined as a mature technology area; the area where both keyword frequency and the rate of increase in degree centrality are high is defined as a growth technology area; the area where keyword frequency is low but the rate of increase in degree centrality is high is defined as a promising technology area; and the area where both are low is defined as a declining technology area. Based on this method, cloud computing research trend maps make it possible to easily grasp the main research trends in cloud computing and to explain the evolution of research topics. According to the results of the citation-relationship analysis, research papers on security, distributed processing, and optical networking for cloud computing rank highest on the PageRank measure. From the analysis of keywords in research papers, cloud computing and grid computing showed high centrality in 2009, and keywords dealing with main elemental technologies such as data outsourcing, error detection methods, and infrastructure construction showed high centrality in 2010-2011. In 2012, security, virtualization, and resource management showed high centrality. Moreover, interest in the technical issues of cloud computing was found to increase gradually. The annual cloud computing research trend maps verify that security is located in the promising area, virtualization has moved from the promising area to the growth area, and grid computing and distributed systems have moved to the declining area.
The study results indicate that distributed systems and grid computing received much attention as similar computing paradigms in the early stage of cloud computing research. That early stage focused on understanding and investigating cloud computing as an emergent technology, linking it to relevant established computing concepts. After the early stage, security and virtualization technologies became the main issues in cloud computing, which is reflected in the movement of security and virtualization technologies from the promising area to the growth area in the cloud computing research trend maps. Moreover, this study revealed that current research in cloud computing has rapidly shifted from a focus on technical issues to a focus on application issues, such as SLAs (Service Level Agreements).
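
To make the trend-map construction concrete, here is a minimal Python sketch of the four-quadrant classification described above; the threshold values and keyword statistics are hypothetical placeholders, not values from the paper.

```python
# A minimal sketch of the "research trend map" classification described above.
# Keyword frequencies, degree-centrality growth rates, and the thresholds that
# split the two axes are hypothetical placeholders, not values from the paper.

def classify(frequency, centrality_growth, freq_threshold, growth_threshold):
    """Map a keyword onto one of the four trend-map areas."""
    if frequency >= freq_threshold:
        return "growth" if centrality_growth >= growth_threshold else "maturation"
    return "promising" if centrality_growth >= growth_threshold else "decline"

# Hypothetical keyword statistics: (frequency, rate of increase in degree centrality)
keywords = {
    "security":       (40, 0.9),
    "virtualization": (35, 0.7),
    "grid computing": (12, 0.1),
}

for kw, (freq, growth) in keywords.items():
    print(kw, "->", classify(freq, growth, freq_threshold=20, growth_threshold=0.5))
```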

Development of a complex failure prediction system using Hierarchical Attention Network (Hierarchical Attention Network를 이용한 복합 장애 발생 예측 시스템 개발)

  • Park, Youngchan;An, Sangjun;Kim, Mintae;Kim, Wooju
    • Journal of Intelligence and Information Systems
    • /
    • v.26 no.4
    • /
    • pp.127-148
    • /
    • 2020
  • The data center is a physical facility for accommodating computer systems and related components, and is an essential foundation for next-generation core industries such as big data, smart factories, wearables, and smart homes. In particular, with the growth of cloud computing, proportional expansion of data center infrastructure is inevitable. Monitoring the health of data center facilities is a way to maintain and manage the system and prevent failure. If a failure occurs in some element of the facility, it may affect not only the relevant equipment but also other connected equipment, and may cause enormous damage. In particular, IT facility failures are irregular because of interdependence, and their causes are difficult to determine. Previous studies predicting failure in data centers treated each server as a single, isolated state, without assuming that devices interact. In this study, therefore, data center failures were classified into failures occurring inside the server (Outage A) and failures occurring outside the server (Outage B), and the analysis focused on complex failures occurring within servers. Server-external failures include power, cooling, and user errors; since such failures can be prevented in the early stages of data center facility construction, various solutions are being developed. On the other hand, the cause of failures occurring within a server is difficult to determine, and adequate prevention has not yet been achieved. In particular, this is because server failures do not occur in isolation: they can cause failures in other servers or be triggered by failures propagating from other servers. In other words, while existing studies analyzed failures on the assumption that servers do not affect one another, this study assumes that failures have effects between servers. To define the complex failure situation in the data center, failure history data for each piece of equipment in the data center was used. Four major failures are considered in this study: Network Node Down, Server Down, Windows Activation Services Down, and Database Management System Service Down. The failures occurring on each device are sorted in chronological order, and when a failure occurs on a specific piece of equipment, any failure occurring on another piece of equipment within 5 minutes is defined as occurring simultaneously. After constructing sequences for the devices that failed at the same time, five devices that frequently failed simultaneously within the constructed sequences were selected, and the cases where the selected devices failed at the same time were confirmed through visualization. Since the server resource information collected for failure analysis is a time series with temporal flow, we used Long Short-Term Memory (LSTM), a deep learning algorithm that can predict the next state from the previous state. In addition, unlike the single-server case, the Hierarchical Attention Network deep learning model structure was used, considering that the impact of multiple failures differs for each server. This algorithm improves prediction accuracy by giving greater weight to servers with greater impact on the failure. The study began by defining the failure types and selecting the analysis targets.
In the first experiment, the same collected data was analyzed under two assumptions, a single-server state and a multiple-server state, and the results were compared. The second experiment improved prediction accuracy in the complex-server case by optimizing each server's threshold. In the first experiment, under the single-server assumption, three of the five servers were predicted not to have failed even though failures actually occurred; under the multiple-server assumption, all five servers were correctly predicted to have failed. The experimental results thus support the hypothesis that servers affect one another. This study confirmed that prediction performance is superior when multiple servers are assumed rather than a single server. In particular, applying the Hierarchical Attention Network algorithm, on the assumption that each server's effect differs, improved the analysis, and applying a different threshold for each server further improved prediction accuracy. This study showed that failures whose causes are difficult to determine can be predicted from historical data, and presents a model that can predict failures occurring on servers in data centers. It is expected that failures can be prevented in advance using the results of this study.
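
The 5-minute co-occurrence rule that defines a complex failure can be sketched as follows; the event format, device names, and timestamps are illustrative assumptions, and the paper itself may anchor the window differently.

```python
# A minimal sketch of the 5-minute co-occurrence rule described above: failure
# events are sorted chronologically, and a failure on another device within
# 5 minutes is grouped as "simultaneous". All data here is hypothetical.
from datetime import datetime, timedelta

# Hypothetical failure history: (device_id, failure_time)
events = [
    ("server-01", datetime(2020, 3, 1, 10, 0)),
    ("server-02", datetime(2020, 3, 1, 10, 3)),   # within 5 min -> same group
    ("server-03", datetime(2020, 3, 1, 10, 20)),  # beyond 5 min -> new group
]

events.sort(key=lambda e: e[1])  # chronological order

groups, window = [], timedelta(minutes=5)
for device, ts in events:
    if groups and ts - groups[-1][-1][1] <= window:
        groups[-1].append((device, ts))   # simultaneous complex failure
    else:
        groups.append([(device, ts)])     # start a new failure group

for g in groups:
    print([d for d, _ in g])
# -> ['server-01', 'server-02'] then ['server-03']
```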

Performance Analysis and Comparison of Stream Ciphers for Secure Sensor Networks (안전한 센서 네트워크를 위한 스트림 암호의 성능 비교 분석)

  • Yun, Min;Na, Hyoung-Jun;Lee, Mun-Kyu;Park, Kun-Soo
    • Journal of the Korea Institute of Information Security & Cryptology
    • /
    • v.18 no.5
    • /
    • pp.3-16
    • /
    • 2008
  • A Wireless Sensor Network (WSN) is a wireless network consisting of distributed small devices called sensor nodes or motes. Recently, there has been extensive research on WSNs and on their security. For secure storage and secure transmission of the sensed information, sensor nodes should be equipped with cryptographic algorithms, and these algorithms should be implemented efficiently since sensor nodes are highly resource-constrained devices. Some algorithms applicable to sensor nodes already exist, including public key ciphers such as TinyECC and standard block ciphers such as AES. Stream ciphers, however, have yet to be analyzed in this setting, since they were only recently standardized in the eSTREAM project. In this paper, we implement on the MicaZ platform nine of the ten software-based stream ciphers in the second and final phases of the eSTREAM project, and we evaluate their performance. In particular, we apply several optimization techniques to six ciphers, including SOSEMANUK, Salsa20, and Rabbit, which survived the final phase of the eSTREAM project. We also present implementation results for hardware-oriented stream ciphers and AES-CFB for reference. According to our experiments, the encryption speeds of these software-based stream ciphers are in the range of 31-406 Kbps, so most of these ciphers are acceptable for sensor nodes. In particular, the survivors, SOSEMANUK, Salsa20, and Rabbit, show throughputs of 406 Kbps, 176 Kbps, and 121 Kbps using 70 KB, 14 KB, and 22 KB of ROM and 2811 B, 799 B, and 755 B of RAM, respectively. From the viewpoint of encryption speed, these ciphers perform much better than software-based AES, which achieves 106 Kbps.
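
For quick side-by-side reference, the sketch below simply tabulates the figures quoted above for the three eSTREAM finalists; no values beyond those quoted are assumed.

```python
# The measured figures quoted in the abstract, collected for comparison.
# Throughput in Kbps on MicaZ, ROM in KB, RAM in bytes.
# (ROM/RAM for software AES are not quoted in the abstract.)
results = {
    "SOSEMANUK": {"kbps": 406, "rom_kb": 70, "ram_b": 2811},
    "Salsa20":   {"kbps": 176, "rom_kb": 14, "ram_b": 799},
    "Rabbit":    {"kbps": 121, "rom_kb": 22, "ram_b": 755},
}

for name, r in sorted(results.items(), key=lambda kv: -kv[1]["kbps"]):
    print(f"{name:10s} {r['kbps']:4d} Kbps  ROM {r['rom_kb']:3d} KB  RAM {r['ram_b']:5d} B")
print("AES (software, reference): 106 Kbps")
```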

An Artificial Intelligence Approach to Waterbody Detection of the Agricultural Reservoirs in South Korea Using Sentinel-1 SAR Images (Sentinel-1 SAR 영상과 AI 기법을 이용한 국내 중소규모 농업저수지의 수표면적 산출)

  • Choi, Soyeon;Youn, Youjeong;Kang, Jonggu;Park, Ganghyun;Kim, Geunah;Lee, Seulchan;Choi, Minha;Jeong, Hagyu;Lee, Yangwon
    • Korean Journal of Remote Sensing
    • /
    • v.38 no.5_3
    • /
    • pp.925-938
    • /
    • 2022
  • Agricultural reservoirs are an important water resource nationwide and are vulnerable to abnormal climate effects such as drought caused by climate change; enhanced management is therefore required for appropriate operation. Although continuous water-level monitoring is necessary, on-site measurement and observation are challenging for practical reasons. This study presents an objective comparison of multiple AI models for water-body extraction using radar images, which have the advantages of wide coverage and frequent revisit time. The proposed methods use Sentinel-1 Synthetic Aperture Radar (SAR) images and, unlike common water-extraction methods based on optical images, are suitable for long-term monitoring because they are less affected by weather conditions. We built four AI models, Support Vector Machine (SVM), Random Forest (RF), Artificial Neural Network (ANN), and Automated Machine Learning (AutoML), using drone images, Sentinel-1 SAR, and DSM data. A total of 22 reservoirs of less than 1 million tons were studied, including small and medium-sized reservoirs with an effective storage capacity of less than 300,000 tons. Forty-five images from the 22 reservoirs were used for model training and validation. The results show that the AutoML model was 0.01 to 0.03 better in water Intersection over Union (IoU) than the other three models, with Accuracy = 0.92 and mIoU = 0.81 on the test set. In conclusion, AutoML performed as well as the classical machine learning methods, and the AutoML-based water-body extraction technique is expected to be applicable to automatic reservoir monitoring.
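
Intersection over Union, the evaluation metric used above, is a standard measure; here is a minimal NumPy sketch for binary water masks (the example masks are hypothetical).

```python
# A minimal sketch of the Intersection over Union (IoU) metric used above,
# for binary water/non-water masks. The example masks are hypothetical.
import numpy as np

def iou(pred, truth):
    """IoU = |pred AND truth| / |pred OR truth| for boolean masks."""
    inter = np.logical_and(pred, truth).sum()
    union = np.logical_or(pred, truth).sum()
    return inter / union if union else 1.0

pred  = np.array([[1, 1, 0], [0, 1, 0]], dtype=bool)   # predicted water mask
truth = np.array([[1, 0, 0], [0, 1, 1]], dtype=bool)   # reference water mask

water_iou = iou(pred, truth)                        # IoU of the water class
miou = (iou(pred, truth) + iou(~pred, ~truth)) / 2  # mean over water/non-water
print(f"water IoU = {water_iou:.2f}, mIoU = {miou:.2f}")
```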

A Study on Case for Localization of Korean Enterprises in India (인도 진출 한국기업의 현지화에 관한 사례 연구)

  • Seo, Min-Kyo;Kim, Hee-Jun
    • International Commerce and Information Review
    • /
    • v.16 no.4
    • /
    • pp.409-437
    • /
    • 2014
  • The purpose of this study is to present specific ways to achieve successful localization by analyzing success and failure cases within the framework of strategic models of localization, building on a theoretical background. The strategic models of localization are divided by management aspect: localization of product and sourcing, localization of human resources, localization of marketing, localization of R&D, harmony with the local community, and delegation of authority between headquarters and local subsidiaries. The results of comparing and analyzing the success and failure cases of individual companies operating in India indicate that, in terms of localization of product and sourcing, successful companies procure components locally and produce models suited to local consumers' preferences, while failed companies cannot meet local consumers' needs. In the localization of human resources, most companies recognize the importance of this aspect and aggressively make use of superior human resources through related education. In the localization of marketing, successful companies perform pre-market research and management, build effective marketing capabilities and an after-service network, and select local business partners with technical skills, carrying out business activities, customer support, and complaint handling with their own organizations. In the localization of R&D, successful major companies establish and operate R&D centers to develop models suited to local customers. Regarding harmony with the local community, companies that achieved successful localization understand the cultural environment and contribute to the community through CSR. Regarding the delegation of authority between headquarters and local subsidiaries, most Korean companies are very weak in this area, with a tendency for decisions to be made by the head office rather than by local subsidiaries. The implication of this study is that Korean enterprises in India should carry forward the localization of products and components; foster local human resources who understand the company's management and systems and take part in voluntary market strategy decisions; operate wholly owned subsidiaries; establish and operate R&D centers; understand the local culture and system; fulfill corporate social responsibility; and grant autonomy in management.

Legal status of Private Transactions Regarding the Geostationary Satellite Orbit (지구정지궤도의 사적 거래의 국제법상 지위에 관한 연구)

  • Shin, Hong Kyun
    • The Korean Journal of Air & Space Law and Policy
    • /
    • v.29 no.2
    • /
    • pp.239-272
    • /
    • 2014
  • The rights and obligations of the Member States of the ITU in the domain of international frequency management of the spectrum/orbit resource are incorporated in the Constitution and Convention of the ITU and in the Radio Regulations that complement them. These instruments contain the main principles and lay down the specific regulations governing the major elements, such as the rights and obligations of member administrations in obtaining access to the spectrum/orbit resource, as well as international recognition of these rights through the recording of frequency assignments and, as appropriate, any associated orbits, including geostationary-satellite orbits used or intended to be used, in the Master International Frequency Register (MIFR). Coordination is a further step in the process leading up to notification of frequency assignments for recording in the MIFR. This procedure is a formal regulatory obligation both for an administration seeking to assign a frequency in its network and for an administration whose existing or planned services may be affected by that assignment. The regulatory problem lies in allowing an administration to fulfill its "bringing into use" duty, and thereby preserve its filing, simply by putting any satellite, whatever its nationality or technical specification, into the filed orbit. This regulatory gap may result in the emergence of a secondary market for satellite orbits. Within the satellite orbit secondary market, the object of a transaction may be the satellite itself, the regulatory rights in rem, or the orbit registered in the MIFR. The recent sale of the Koreasat is a typical example of an orbit transaction between private companies, whose legality remains controversial from the perspective of international space law as well as international transaction law. It must be noted, however, that the fact remains that Koreasat 3 and its filed orbit are for sale.

The physical geography in general: yesterday and tomorrow (자연지리학 일반: 회고와 전망)

  • Son, Ill
    • Journal of the Korean Geographical Society
    • /
    • v.31 no.2
    • /
    • pp.138-159
    • /
    • 1996
  • There has been a tendency for Geomorphology and Climatology to dominate Physical Geography in Korea for the past 50 years. Physical Geography is concerned with the study of the totality of the natural environment through integrated approaches, but no overall direction or paradigm has emerged, because the major sub-divisions of Physical Geography have been studied individually and the subjects and approaches in Physical Geography are enormously diverse. There is also no consensus on which sub-divisions should be included in physical geography in general or how they should be summarized. Furthermore, it would be imprudent to survey studies of Physical Geography beyond Geomorphology and Climatology, given the small number of researchers. Taking the remaining physical geographical studies, excluding Geomorphology and Climatology, as physical geography in general, these studies are summarized and several aspects are drawn out as follows. First, the description of all possible factors of the natural environment was the pattern of early studies of Physical Geography, and this tendency is maintained in various kinds of research and project reports. Recently, physical geographers have published several introductory textbooks and research monographs; in these books, however, integrated approaches to Physical Geography are not suggested, and the relationship between man and nature is dealt with at an elementary level. Second, authentic soil studies by physical geographers are insignificant, because soil studies in Physical Geography have mostly been treated as a subsidiary means of Geomorphology. Summarizing the studies of Soil Geography by physical geographers and other pedologists, the subjects can be classified as soil-forming processes, soil erosion, soils of tidal flats and reclaimed land, and soil pollution. Physical geographers have focused on soil-forming processes in order to elucidate geomorphic processes and past climatic environments; the results on the other subjects are trifling. Third, biogeographers and their studies are extremely few in number, and Biogeography in Korea is at its starting point. Yet Biogeography could be a unifying theme for the physical-human geography interface, and it can be expected to play an active part in environmental conservation and resource management. Fourth, the studies of Hydrogeography (geographical hydrology) in Korea have run through studies of water balance and morphometric studies such as drainage network analysis and the relations among various morphometric elements of rivers. Recently, hydrological models have been introduced and developed to predict sediment flow, discharge, and groundwater; the growth of groundwater studies is worthy of close attention. Finally, until the 1970s, studies on environmental problems were no more than general descriptions of environmental destruction, resource development, environmental conservation, and so on. Ecological perspectives on the relationship between man and nature were suggested in some studies of natural hazards. Since the new environmentalism was introduced in the 1980s, human geographers have led the studies of environmental perception, environmental ethics, environmental sociology, and environmental policy.
Physical geographers have stayed out of step with the climate of the times and concentrated on publishing introductory textbooks. Recently, several studies on human interference with and modification of natural environments have been attempted in the fields of Geomorphology and Climatology. Summarizing 50 years of Physical Geography in Korea, the integrated approaches inherent in Physical Geography have disappeared little by little, while the major sub-divisions of Physical Geography have developed in connection with the nearby earth sciences such as Geology, Meteorology, Pedology, Biology, and Hydrology, and have been rediscovered by non-geographers under the guise of environmental science. It is expected that Physical Geography will revive as the dominant subject for coping with environmental problems by rearming itself with its innate integrated approaches.

A Study on Information Literacy Education through the University Library Webpages (원격 정보이용교육에 관한 연구 : 대학도서관 웹페이지에 수록된 내용을 중심으로)

  • Park On-Za
    • Journal of the Korean Society for Library and Information Science
    • /
    • v.32 no.4
    • /
    • pp.31-52
    • /
    • 1998
  • User education is one of the main activities among traditional library services. Information technology has brought drastic changes to libraries, especially in the types of information sources, users' information behavior, and user education programs. Users need new information skills to retrieve the information they need, and libraries consequently should develop new instruction programs to meet user needs, in line with the radical changes in information technology and the steadily increasing electronic information sources, including internet resources. This paper examines how library websites are used for information instruction at university libraries in Korea, the USA, and Canada, through the literature and by visiting the libraries' websites. It was found that Korean university libraries focus on providing physical access to information, while American and Canadian university libraries focus on providing intellectual access as well as physical access. Most Korean university libraries include entries about library history, library services and collections, and outside networked information sources on their homepage menus, while the websites of the western university libraries offer guides on how to use information sources such as subject bibliographies, reference tools, and network resource tools, along with guides on how to write a paper and information for user instruction. It is very promising to make full use of library webpages for user education, since they now provide a powerful communication interface that users find appealing and accessible.

Adaptive Power Control Dynamic Range Algorithm in WCDMA Downlink Systems (WCDMA 하향 링크 시스템에서의 적응적 PCDR 알고리즘)

  • 정수성;박형원;임재성
    • The Journal of Korean Institute of Communications and Information Sciences
    • /
    • v.29 no.8A
    • /
    • pp.918-927
    • /
    • 2004
  • WCDMA is a third-generation wireless mobile system specified by 3GPP. In the WCDMA downlink, two power control schemes operate: inner-loop power control, performed every slot, and outer-loop power control, performed once per frame. The base station (BS) can estimate the proper transmission power through these two schemes. However, because each MS's transmission power severely affects the BS's performance, the BS cannot allocate excessive transmission power to a specific user. 3GPP defined the Power Control Dynamic Range (PCDR) to guarantee proper BS performance. In this paper, we propose the Adaptive PCDR (APCDR) algorithm, by which the Radio Network Controller (RNC) estimates each MS's current state from the received signal-to-interference ratio (SIR) and adjusts the MS's maximum code channel power on a per-frame basis. With the proposed scheme, each MS can mitigate wireless channel effects and endure outages at the cell edge, and therefore obtain better QoS. Simulation results indicate that the APCDR algorithm outperforms the fixed PCDR algorithm.
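
The abstract does not give the exact update rule, so the following is only a speculative sketch of per-frame adaptation of the maximum code channel power from measured SIR; the target SIR, step size, and PCDR bounds are assumptions, not values from the paper or the 3GPP specifications.

```python
# A speculative sketch of per-frame adaptation of the maximum code channel
# power, in the spirit of the APCDR idea described above. The target SIR,
# step size, and dynamic-range bounds are illustrative assumptions only.

def apcdr_update(max_power_db, measured_sir_db, target_sir_db,
                 step_db=0.5, lower_db=-10.0, upper_db=0.0):
    """Raise the per-MS power cap when SIR falls short of target, lower it
    otherwise, keeping the cap inside the allowed power control dynamic range."""
    if measured_sir_db < target_sir_db:
        max_power_db += step_db   # MS likely at cell edge: allow more power
    else:
        max_power_db -= step_db   # good channel: tighten the cap
    return min(upper_db, max(lower_db, max_power_db))

cap = -6.0  # dB relative to maximum transmit power (hypothetical)
for sir in [4.0, 3.0, 8.0, 2.5]:  # per-frame measured SIR values (dB)
    cap = apcdr_update(cap, sir, target_sir_db=5.0)
    print(f"measured SIR {sir:4.1f} dB -> power cap {cap:5.1f} dB")
```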

Design of Web based Simulation Provenance Data Sharing Service (웹 기반 시뮬레이션 이력출처 데이터 공유 서비스 설계)

  • Jung, Youngjin;Nam, Dukyun;Yu, Jinseung;Lee, JongSuk Ruth;Cho, Kumwon
    • Journal of the Korea Institute of Information and Communication Engineering
    • /
    • v.18 no.5
    • /
    • pp.1128-1134
    • /
    • 2014
  • Web-based simulation services are actively used to computationally analyze various kinds of real-world phenomena, following the progress of computing technology and the spread of networks. However, it is hard to share data and information among users of these services, because most web-based simulation services do not open or share simulation processing information and results. In this paper, we design a simulation provenance data sharing service on EDISON_CFD (EDucation-research Integration Simulation On the Net for Computational Fluid Dynamics) to share information about executed simulations. To store and share the simulation processing information, we define the simulation processing steps as "Problem → Plan, Design → Mesh → Simulation performance → Result → Report." Users can understand a problem-solving method based on computer simulation by searching the simulation performance information with the Search/Share API of the store. In addition, this opened simulation information can reduce wasted computing resources by avoiding re-running identical simulation jobs.
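
As a rough illustration, a provenance record covering the pipeline above might be represented and searched as follows; the field names and the in-memory store are assumptions standing in for the actual EDISON_CFD Search/Share API, which the abstract names but does not specify.

```python
# A minimal sketch of a provenance record for the pipeline defined above
# (Problem -> Plan, Design -> Mesh -> Simulation performance -> Result -> Report).
# Field names, example values, and the in-memory store are illustrative; the
# real Search/Share API of EDISON_CFD is named in the abstract but not specified.
from dataclasses import dataclass, field

@dataclass
class ProvenanceRecord:
    problem: str
    plan_design: str
    mesh: str
    simulation: str      # solver settings used for the run
    result: str
    report: str
    tags: list = field(default_factory=list)

store = [
    ProvenanceRecord("airfoil drag", "2D steady RANS", "C-grid 300x80",
                     "SA turbulence model", "Cd=0.0081", "report-042.pdf",
                     tags=["cfd", "airfoil"]),
]

def search(keyword):
    """Return records whose problem description or tags match the keyword."""
    return [r for r in store if keyword in r.problem or keyword in r.tags]

print(search("airfoil"))
```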