• Title/Summary/Keyword: network-selection

An Optimization Method of Neural Networks using Adaptive Regularization, Pruning, and BIC (적응적 정규화, 프루닝 및 BIC를 이용한 신경망 최적화 방법)

  • 이현진;박혜영
    • Journal of Korea Multimedia Society
    • /
    • v.6 no.1
    • /
    • pp.136-147
    • /
    • 2003
  • To achieve optimal performance on a given problem, we need an integrative process that combines parameter optimization via learning with structure optimization via model selection. In this paper, we propose an efficient optimization method that improves generalization performance by considering the properties of each sub-method and combining them on common theoretical grounds. First, weight parameters are optimized by natural gradient learning with adaptive regularization, which uses a diverse error function. Second, the network structure is optimized by eliminating unnecessary parameters with natural pruning. By iterating these processes, candidate models are constructed and evaluated based on the Bayesian Information Criterion (BIC), and an optimal one is finally selected. Computational experiments on benchmark problems confirm the weight parameter and structure optimization performance of the proposed method.

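As a rough illustration of the BIC-based model selection step described in the abstract above (a sketch, not the authors' implementation), candidate networks produced by successive pruning rounds can be scored and the minimum-BIC model kept. The log-likelihoods, parameter counts, and sample size below are hypothetical:

```python
import math

def bic(log_likelihood, num_params, num_samples):
    # Bayesian Information Criterion: k * ln(n) - 2 * ln(L); lower is better
    return num_params * math.log(num_samples) - 2.0 * log_likelihood

# Hypothetical candidates from successive pruning iterations:
# (training log-likelihood, remaining weight count)
candidates = [(-120.0, 50), (-125.0, 30), (-140.0, 15)]
n = 200  # number of training samples

scores = [bic(ll, k, n) for ll, k in candidates]
best = min(range(len(candidates)), key=lambda i: scores[i])
```

With these numbers the most heavily pruned network wins, because its modest drop in likelihood is outweighed by the parameter penalty; with other numbers BIC would favor a larger network.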

Context-aware Connectivity Analysis Method using Context Data Prediction Model in Delay Tolerant Networks (Delay Tolerant Networks에서 속성정보 예측 모델을 이용한 상황인식 연결성 분석 기법)

  • Jeong, Rae-Jin;Oh, Young-Jun;Lee, Kang-Whan
    • Journal of the Korea Institute of Information and Communication Engineering
    • /
    • v.19 no.4
    • /
    • pp.1009-1016
    • /
    • 2015
  • In this paper, we propose the EPCM (Efficient Prediction-based Context-awareness Matrix) algorithm, which analyzes connectivity by predicting a cluster's context data such as velocity and direction. In existing DTNs, unrestricted relay node selection increases delay and packet loss, and the limited storage and capability of nodes incur overhead. The EPCM algorithm therefore analyzes predicted context data using a context matrix and an adaptive revision weight, and selects a relay node by considering the connectivity between the cluster and the base station. The proposed algorithm stores context data in the context matrix, analyzes the context according to its variation, and predicts future context data after revision by the adaptive revision weight. Simulation results show that the EPCM algorithm achieves a high packet delivery ratio by selecting relay nodes according to the predicted context data matrix.
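The abstract does not give the exact form of the adaptive revision weight, so the following is only a stand-in sketch: exponentially weighted smoothing of a cluster's observed velocities, where `alpha` plays the role of a revision weight favoring recent observations:

```python
def predict_next(history, alpha=0.6):
    """Exponentially weighted prediction of the next context value
    (e.g. a cluster's velocity). alpha stands in for the adaptive
    revision weight; here it is a fixed, assumed constant."""
    estimate = history[0]
    for value in history[1:]:
        estimate = alpha * value + (1 - alpha) * estimate
    return estimate

velocities = [10.0, 12.0, 11.0, 13.0]  # hypothetical observations
predicted = predict_next(velocities)
```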

A Method of Coupling Expected Patch Log Likelihood and Guided Filtering for Image De-noising

  • Wang, Shunfeng;Xie, Jiacen;Zheng, Yuhui;Wang, Jin;Jiang, Tao
    • Journal of Information Processing Systems
    • /
    • v.14 no.2
    • /
    • pp.552-562
    • /
    • 2018
  • With the advent of the information society, image restoration technology has aroused considerable interest. Guided image filtering is effective at suppressing noise in homogeneous regions, but its edge-preserving property is poor; the critical part of guided filtering therefore lies in the selection of the guide image. The result of the Expected Patch Log Likelihood (EPLL) method maintains good structure, but it easily produces a ladder effect in homogeneous areas. Exploiting this complementarity, we propose a method that couples EPLL and guided filtering for image de-noising. The EPLL model is adopted to construct the guide image for the guided filtering, providing it with better structural information. Meanwhile, the secondary smoothing of guided filtering in homogeneous image areas improves noise suppression there while reducing the ladder effect introduced by EPLL. Experimental results show that the proposed method not only retains the excellent performance of EPLL but also produces better visual effects and a higher peak signal-to-noise ratio.
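To make the coupling concrete, here is a toy one-dimensional guided filter in plain Python. The paper operates on 2-D images and would pass the EPLL output as `guide`; the radius and `eps` values here are arbitrary assumptions:

```python
def box_mean(x, r):
    # Mean over a sliding window of radius r, shrunk at the borders
    n = len(x)
    out = []
    for i in range(n):
        lo, hi = max(0, i - r), min(n, i + r + 1)
        out.append(sum(x[lo:hi]) / (hi - lo))
    return out

def guided_filter_1d(guide, src, r=2, eps=1e-2):
    """1-D guided filter: locally fit q = a * guide + b, then average
    the coefficients. eps controls how strongly flat regions smooth."""
    mean_I = box_mean(guide, r)
    mean_p = box_mean(src, r)
    mean_Ip = box_mean([i * p for i, p in zip(guide, src)], r)
    mean_II = box_mean([i * i for i in guide], r)
    a = [(ip - mi * mp) / (ii - mi * mi + eps)
         for ip, ii, mi, mp in zip(mean_Ip, mean_II, mean_I, mean_p)]
    b = [mp - ai * mi for mp, ai, mi in zip(mean_p, a, mean_I)]
    mean_a, mean_b = box_mean(a, r), box_mean(b, r)
    return [ma * i + mb for ma, mb, i in zip(mean_a, mean_b, guide)]

noisy = [0.9, 1.1, 1.0, 0.95, 1.05, 1.0, 0.9, 1.1]
smoothed = guided_filter_1d(noisy, noisy, r=2, eps=0.5)  # self-guided smoothing
```

In the paper's scheme the guide would be the EPLL de-noised signal rather than the noisy input itself, which is what supplies the structural information.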

A Location Information-based Gradient Routing Algorithm for Wireless Ad Hoc Networks (무선 애드혹 네트워크를 위한 위치정보 기반 기울기 라우팅 알고리즘)

  • Bang, Min-Young;Lee, Bong-Hwan
    • The KIPS Transactions: Part C
    • /
    • v.17C no.3
    • /
    • pp.259-270
    • /
    • 2010
  • In this paper, a Location Information-based Gradient Routing (LIGR) algorithm is proposed that sets up routing paths based on the physical location information of sensor nodes in wireless ad hoc networks. LIGR reduces unnecessary data transmission time, route search time, and packet propagation delay by determining the transmission direction and search range from the gradient between the source node and the sink node, computed from physical location information. In addition, low-battery nodes are given second or third priority in forwarding node selection, which reduces the chance of selecting them; as a result, a low-battery node functions as a host rather than a router in the wireless sensor network. LIGR outperformed the Logical Grid Routing (LGR) protocol in average receiving rate, delay time, average residual energy, and network processing ratio.
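The forwarding rule described above might be sketched as follows; the progress metric, the battery threshold, and the fallback behavior are assumptions for illustration, not the paper's exact formulation:

```python
import math

def next_hop(current, sink, neighbors, battery, low_threshold=0.2):
    """Pick the neighbor making the most geographic progress toward
    the sink, demoting low-battery nodes (hypothetical LIGR-style rule).
    neighbors: {node_id: (x, y)}, battery: {node_id: level in [0, 1]}."""
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])
    base = dist(current, sink)
    healthy, weak = [], []
    for node, pos in neighbors.items():
        progress = base - dist(pos, sink)
        if progress <= 0:
            continue  # only forward toward the sink
        (healthy if battery[node] > low_threshold else weak).append((progress, node))
    pool = healthy or weak  # low-battery nodes only when no alternative
    return max(pool)[1] if pool else None

chosen = next_hop((0.0, 0.0), (10.0, 0.0),
                  {"a": (4.0, 0.0), "b": (6.0, 0.0), "c": (2.0, 0.0)},
                  {"a": 0.9, "b": 0.1, "c": 0.8})
```

Here node "b" makes the most progress but is skipped for its low battery, matching the priority demotion the abstract describes.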

An Analysis of Eco-corridors in Korea by Case Study of Domestic and Foreign Cases (생태통로 조성 국내외 사례 조사를 통한 개선과제 연구)

  • Kim, Myoung-Soo;Heo, Hag-Young;Cho, Soo-Min;Shin, Su-An;Ahn, Tong-Mahn
    • Journal of the Korean Society of Environmental Restoration Technology
    • /
    • v.8 no.2
    • /
    • pp.41-55
    • /
    • 2005
  • In an effort to preserve biodiversity in increasingly fragmented green patches, Korea has been installing eco-corridors over or under arterial roads and expressways. A survey of 43 such eco-corridors installed up to 2003 identified several problems and issues, and selected overseas eco-corridors were also investigated for implications for future installations in Korea. Major findings are: for most existing eco-corridors, target species are not specified and locations are poorly considered, so it is questionable whether wildlife actually crosses them; most lack supporting facilities such as fences that guide wildlife onto the corridor and keep it off the road; planting on the corridors is insufficient, lacks species diversity, and is designed in the manner of urban parks rather than for wildlife; and where target species are not specified, the location, width, cross section, and other aspects of a corridor cannot be optimized. It is suggested that eco-corridors be planned at early stages of road planning, so that their number and locations are decided as needed and even road alignment and design can accommodate them in advance; that wildlife crossings be monitored to improve future corridor planning and design; and that a nationwide green network plan be established first, with eco-corridors fitted into it.

An Adaptable Destination-Based Dissemination Algorithm Using a Publish/Subscribe Model in Vehicular Networks

  • Morales, Mildred Madai Caballeros;Haw, Rim;Cho, Eung-Jun;Hong, Choong-Seon;Lee, Sung-Won
    • Journal of Computing Science and Engineering
    • /
    • v.6 no.3
    • /
    • pp.227-242
    • /
    • 2012
  • Vehicular Ad Hoc Networks (VANETs) are highly dynamic and unstable due to the heterogeneous nature of the communications, intermittent links, high mobility, and constant changes in network topology. Currently, some of the most important challenges of VANETs are scalability, congestion, unnecessary duplication of data, low delivery rate, communication delay, and temporary fragmentation. Many recent studies have focused on hybrid mechanisms that disseminate information using the store-and-forward technique in sparse vehicular networks, together with clustering techniques to avoid the scalability problem in dense vehicular networks. However, the selection of intermediate nodes in store-and-forward, the stability of the clusters, and the unnecessary duplication of data remain central challenges. We therefore propose an adaptable destination-based dissemination algorithm (DBDA) using the publish/subscribe model. Unlike other proposed solutions, DBDA considers the destination of the vehicles as an important parameter for forming clusters and selecting intermediate nodes. The publish/subscribe model provides a context-aware service that selects intermediate nodes according to the importance of the message and the destination, current location, and speed of the vehicles; as a result, DBDA avoids delay, congestion, and unnecessary duplication while maintaining a high delivery rate.
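A publish/subscribe model of the kind DBDA builds on can be reduced to a minimal topic-based broker; the topic names standing in for vehicle destinations below are invented for illustration:

```python
from collections import defaultdict

class MessageBroker:
    """Minimal topic-based publish/subscribe broker (illustrative only;
    topics here stand in for vehicle destinations in the DBDA scheme)."""
    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        self.subscribers[topic].append(callback)

    def publish(self, topic, message):
        # Deliver only to subscribers of this topic; return delivery count
        delivered = 0
        for callback in self.subscribers[topic]:
            callback(message)
            delivered += 1
        return delivered

broker = MessageBroker()
received = []
broker.subscribe("exit-12", received.append)             # vehicle headed to exit 12
delivered = broker.publish("exit-12", "congestion ahead")  # reaches matching vehicles
broker.publish("exit-30", "road clear")                    # no subscribers, dropped
```

The point of the model is exactly this filtering: a message is duplicated only toward nodes whose declared interest (destination) matches, rather than flooded network-wide.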

The Main Path Analysis of Korean Studies Using Text Mining: Based on SCOPUS Literature Containing 'Korea' as a Keyword (텍스트 마이닝을 활용한 한국학 주경로(Main Path) 분석: '한국'을 키워드로 포함하는 SCOPUS 문헌을 대상으로)

  • Kim, Hea-Jin
    • Journal of the Korean Society for Information Management
    • /
    • v.37 no.3
    • /
    • pp.253-274
    • /
    • 2020
  • In this study, text mining and main path analysis (MPA) were applied to understand the origins and development paths of research areas that make up the mainstream of Korean studies. To this end, a quantitative analysis was attempted based on digital texts rather than the traditional humanities research methodology, and the main paths of Korean studies were extracted by collecting documents related to Korean studies including citation information using a citation database, and establishing a direct citation network. As a result of the main path analysis, two main path clusters (Korean ancient agricultural culture (history, culture, archeology) and Korean acquisition of English (linguistics)) were found in the key-route search for the Humanities field of Korean studies. In the field of Korean Studies Humanities and Social Sciences, four main path clusters were discovered: (1) Korea regional/spatial development, (2) Korean economic development (Economic aid/Soft power), (3) Korean industry (Political economics), and (4) population of Korea (Sex selection) & North Korean economy (Poverty, South-South cooperation).
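Main path analysis typically starts from traversal weights on the citation network, such as the Search Path Count (SPC). Here is a minimal SPC computation on a toy citation DAG; the key-route search used in the study then follows the heaviest edges, and this sketch is not its exact procedure:

```python
from collections import defaultdict

def search_path_counts(edges, sources, sinks):
    """Search Path Count (SPC): the weight of a citation edge (u, v) is
    (# paths from any source to u) * (# paths from v to any sink),
    i.e. the number of source-to-sink paths passing through the edge."""
    succ, pred = defaultdict(list), defaultdict(list)
    for u, v in edges:
        succ[u].append(v)
        pred[v].append(u)

    def count_paths(node, endpoints, nbrs, memo):
        if node not in memo:
            memo[node] = (1 if node in endpoints else 0) + sum(
                count_paths(n, endpoints, nbrs, memo) for n in nbrs[node])
        return memo[node]

    from_src, to_sink = {}, {}
    return {(u, v): count_paths(u, set(sources), pred, from_src) *
                    count_paths(v, set(sinks), succ, to_sink)
            for u, v in edges}

# Toy citation DAG: A is the origin paper, D the most recent
edges = [("A", "B"), ("B", "C"), ("A", "C"), ("C", "D")]
spc = search_path_counts(edges, sources=["A"], sinks=["D"])
```

Edge C→D carries both source-to-sink paths, so the main path runs through it; in the study, such heavy edges trace the development path of a research area.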

R-Trader: An Automatic Stock Trading System based on Reinforcement learning (R-Trader: 강화 학습에 기반한 자동 주식 거래 시스템)

  • 이재원;김성동;이종우;채진석
    • Journal of KIISE:Software and Applications
    • /
    • v.29 no.11
    • /
    • pp.785-794
    • /
    • 2002
  • An automatic stock trading system should be able to solve various optimization problems, such as market trend prediction, stock selection, and trading strategy, in a unified framework. Most previous trading systems based on supervised learning are limited in ultimate performance because they do not address the integration of these subproblems. This paper proposes a stock trading system, called R-Trader, based on reinforcement learning, which regards the process of stock price changes as a Markov decision process (MDP). Reinforcement learning is well suited to the joint optimization of predictions and trading strategies. R-Trader adopts two popular reinforcement learning algorithms, temporal-difference (TD) learning and Q-learning, for selecting stocks and for optimizing other trading parameters, respectively. Technical analysis is used to devise the input features of the system, and value functions are approximated by feedforward neural networks. Experimental results on the Korean stock market show that the proposed system outperforms both the market average and a simple trading system trained by supervised learning, in profit as well as risk management.
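The TD component can be illustrated with a generic TD(0) value update (a textbook sketch, not R-Trader's actual state, feature, or reward design; `alpha` and `gamma` are assumed learning-rate and discount parameters):

```python
def td_update(value, reward, next_value, alpha=0.1, gamma=0.95):
    """One TD(0) step: move the value estimate toward the one-step
    bootstrapped target reward + gamma * V(next_state)."""
    target = reward + gamma * next_value
    return value + alpha * (target - value)

# Toy episode: hypothetical per-step trading rewards, terminal next-value 0
v = 0.0
for reward, next_v in [(1.0, 0.0), (-0.5, 0.0), (2.0, 0.0)]:
    v = td_update(v, reward, next_v)
```

In R-Trader the value function is not a scalar as here but is approximated by a feedforward neural network over technical-analysis features, with the same TD error driving the weight updates.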

Semantics Aware Packet Scheduling for Optimal Quality Scalable Video Streaming (다계층 멀티미디어 스트리밍을 위한 의미기반 패킷 스케줄링)

  • Won, Yo-Jip;Jeon, Yeong-Gyun;Park, Dong-Ju;Jeong, Je-Chang
    • Journal of KIISE:Computer Systems and Theory
    • /
    • v.33 no.10
    • /
    • pp.722-733
    • /
    • 2006
  • In scalable streaming applications, there are two important knobs for effectively exploiting the underlying network resources and maximizing user-perceivable quality of service (QoS): layer selection and packet scheduling. In this work, we propose the Semantics Aware Packet Scheduling (SAPS) algorithm to address these issues. Using a packet dependency graph, SAPS selects layers so as to minimize distortion and thereby maximize QoS. In inter-frame coded video streaming, minimizing packet loss does not imply maximizing QoS: the significance of each packet loss differs depending on the packet's frame type and its position within the group of pictures (GOP). In SAPS, each packet is therefore assigned a weight called the QoS impact factor, and the transmission schedule is derived by weighted smoothing. In simulation experiments, we observed that QoS can actually improve even as packet loss worsens. The results show that SAPS not only maximizes user-perceivable QoS but also minimizes resource requirements.
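The idea of weighting packets by frame type and GOP position can be sketched as follows; the `FRAME_WEIGHT` values, the GOP length, and the linear position penalty are hypothetical stand-ins, not the paper's actual QoS impact factor:

```python
# Hypothetical impact weights: losing an I frame corrupts the whole GOP,
# a P frame corrupts the frames that depend on it, a B frame only itself.
FRAME_WEIGHT = {"I": 3.0, "P": 2.0, "B": 1.0}

def schedule(packets, budget, gop_len=9):
    """Transmit the highest-impact packets first under a packet budget.
    Each packet is (frame_type, position_in_gop); earlier positions
    matter more because more frames depend on them."""
    def impact(pkt):
        ftype, pos = pkt
        return FRAME_WEIGHT[ftype] * (gop_len - pos)
    ranked = sorted(packets, key=impact, reverse=True)
    return ranked[:budget]

gop = [("I", 0), ("B", 1), ("B", 2), ("P", 3), ("B", 4), ("P", 6)]
sent = schedule(gop, budget=3)
```

Under this ranking the I frame and the early P frame survive while late B frames are dropped first, which is the sense in which loss can increase while perceived quality does not suffer.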

Automated Generation Algorithm of the Penetration Scenarios using Association Mining Technique (연관 마이닝 기법을 이용한 침입 시나리오 자동생성 알고리즘)

  • 정경훈;주정은;황현숙;김창수
    • Proceedings of the Korean Institute of Information and Communication Sciences Conference
    • /
    • 1999.05a
    • /
    • pp.203-207
    • /
    • 1999
  • In this paper, we propose an algorithm that automatically generates penetration scenarios using association mining. Known intrusion detection approaches are classified into anomaly detection and misuse detection: the former uses statistical methods, feature selection, and neural networks to decide whether an intrusion has occurred, while the latter uses conditional probability, expert systems, state transition analysis, and pattern matching. In many existing intrusion detection systems, penetration scenarios are created and updated manually by security experts. Our algorithm generates penetration scenarios automatically by applying association mining to state transition analysis; association mining discovers useful, previously unknown information in existing data. Compared with existing approaches, the proposed algorithm can automatically generate the penetration scenarios that security experts used to produce by hand, making it easier to cope with new intrusions while keeping maintenance cost low.

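A single support-counting pass of the kind association mining performs can be sketched as follows; the audit-event names and the minimum-support threshold are hypothetical:

```python
from itertools import combinations
from collections import Counter

def frequent_pairs(transactions, min_support):
    """Count co-occurring event pairs across audit sessions and keep
    those whose support (fraction of sessions containing both events)
    meets the threshold -- one Apriori-style pass at itemset size 2."""
    counts = Counter()
    for events in transactions:
        for pair in combinations(sorted(set(events)), 2):
            counts[pair] += 1
    n = len(transactions)
    return {pair: c / n for pair, c in counts.items() if c / n >= min_support}

# Hypothetical audit sessions, each a bag of observed events
sessions = [
    ["login_fail", "port_scan", "root_shell"],
    ["port_scan", "root_shell"],
    ["login_fail", "port_scan"],
    ["root_shell", "file_tamper"],
]
rules = frequent_pairs(sessions, min_support=0.5)
```

Frequent pairs such as ("port_scan", "root_shell") are the raw material from which state-transition-based penetration scenarios could then be assembled, as the abstract describes.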