• Title/Summary/Keyword: redundant data

442 Search Results

Combining data representation by Sparse Autoencoder and the well-known load balancing algorithm, ProGReGA-KF (A new load-balancing algorithm combining Sparse Autoencoder feature extraction with ProGReGA-KF)

  • Kim, Chayoung;Park, Jung-min;Kim, Hye-young
    • Journal of Korea Game Society
    • /
    • v.17 no.5
    • /
    • pp.103-112
    • /
    • 2017
  • In recent years, the expansion and advance of the Internet of Things (IoT) in distributed MMOG (massively multiplayer online game) architectures have resulted in massive growth of data in terms of server workloads. We propose combining a Sparse Autoencoder with ProGReGA, one of the load-balancing platforms for MMOGs. The Sparse Autoencoder learns a data representation that enhances the relevant features and excludes redundant ones from the data set. During load balancing, the graceful degradation of ProGReGA can exploit the most relevant and least redundant features of this representation. We find that the proposed algorithm is more stable.
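
    The feature-extraction step described above can be illustrated with a minimal sparse autoencoder in NumPy (an illustrative sketch with made-up hyperparameters, not the authors' implementation): one hidden layer is trained to reconstruct the input under an L1 penalty on the hidden activations, and the learned encoder yields the compact representation.

    ```python
    import numpy as np

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    def train_sparse_autoencoder(X, n_hidden=4, lr=0.1, l1=1e-3, epochs=500, seed=0):
        """Train a one-hidden-layer autoencoder whose hidden activations are
        penalized with an L1 term, encouraging a sparse representation."""
        rng = np.random.default_rng(seed)
        n, d = X.shape
        W1 = rng.normal(0.0, 0.1, (d, n_hidden)); b1 = np.zeros(n_hidden)
        W2 = rng.normal(0.0, 0.1, (n_hidden, d)); b2 = np.zeros(d)
        for _ in range(epochs):
            H = sigmoid(X @ W1 + b1)       # sparse hidden code
            Xh = H @ W2 + b2               # linear reconstruction
            err = Xh - X
            # Gradients of mean squared reconstruction error + l1 * |H|
            dW2 = H.T @ err / n
            db2 = err.mean(axis=0)
            dH = err @ W2.T / n + l1 * np.sign(H) / n
            dZ = dH * H * (1.0 - H)
            W1 -= lr * (X.T @ dZ); b1 -= lr * dZ.sum(axis=0)
            W2 -= lr * dW2;        b2 -= lr * db2
        return W1, b1

    def encode(X, W1, b1):
        """Map raw data to its learned (sparse) feature representation."""
        return sigmoid(X @ W1 + b1)
    ```

    The encoder output (not the reconstruction) is what a downstream load balancer would consume as the de-redundified feature set.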

A comparison study of classification methods based on SVM and data depth in microarray data (A comparative study of classification methods using support vector machines and data depth in microarray data)

  • Hwang, Jin-Soo;Kim, Jee-Yun
    • Journal of the Korean Data and Information Science Society
    • /
    • v.20 no.2
    • /
    • pp.311-319
    • /
    • 2009
  • The robust L1 data depth was used for clustering and classification in the so-called DDclus and DDclass methods of Jornsten (2004). SVM-based classification works well in most situations but shows some weakness in the presence of outliers. Proper gene selection is important in classification, since there are many redundant genes. Either selecting appropriate genes or combining gene clustering with a classification method enhances the overall performance of classification. The performance of the depth-based methods is evaluated against several SVM-based classification methods.
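
    The maximum-depth classification rule behind DDclass can be sketched with the spatial (L1) depth (an illustrative reconstruction, not the paper's code): a sample is assigned to the class within which it attains the greatest depth, which makes the rule robust to outliers since depth depends only on directions, not distances.

    ```python
    import numpy as np

    def l1_depth(x, sample):
        """Spatial (L1) depth of point x relative to a sample:
        1 - || mean unit vector from x toward the sample points ||.
        Values near 1 mean x is central; values near 0 mean it is an outlier."""
        diff = np.asarray(sample) - np.asarray(x)
        norms = np.linalg.norm(diff, axis=1)
        units = diff[norms > 0] / norms[norms > 0, None]
        return 1.0 - np.linalg.norm(units.mean(axis=0))

    def depth_classify(x, class_samples):
        """DDclass-style rule: assign x to the class in which it is deepest."""
        return int(np.argmax([l1_depth(x, s) for s in class_samples]))
    ```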


An Energy Efficient Distributed Approach-Based Agent Migration Scheme for Data Aggregation in Wireless Sensor Networks

  • Gupta, Govind P.;Misra, Manoj;Garg, Kumkum
    • Journal of Information Processing Systems
    • /
    • v.11 no.1
    • /
    • pp.148-164
    • /
    • 2015
  • The use of mobile agents for collaborative processing in wireless sensor networks has gained considerable attention, particularly when mobile agents are used for data aggregation to exploit redundant and correlated data. The efficiency of agent-based data aggregation depends on the agent migration scheme. However, most of the proposed schemes are centralized: the sink node determines the migration paths for the agents before dispatching them into the sensor network. The main limitation of such schemes is that they need global network topology information to derive the migration paths of the agents, which incurs additional communication overhead, since each node has a very limited communication range. In addition, a centralized approach does not provide fault-tolerant and adaptive migration paths. To solve these problems, we propose a distributed scheme for determining the migration path of the agents, in which local information is used at each hop to decide the agents' migration. We also propose a local repair mechanism for dealing with faulty nodes. The simulation results show that the proposed scheme performs better than existing schemes in the presence of faulty nodes and reports the aggregated data to the sink faster.
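
    The per-hop local decision can be sketched as follows. The selection criterion shown (most residual energy among unvisited, non-faulty one-hop neighbors) and the node fields are hypothetical illustrations, not the paper's exact rule; the point is that only local neighbor state is consulted, so no global topology is needed.

    ```python
    def next_hop(neighbors, visited):
        """Choose an agent's next hop from purely local information:
        among unvisited, non-faulty one-hop neighbors, pick the one with
        the most residual energy. Returns its id, or None when no
        candidate remains (which would trigger local repair)."""
        candidates = [n for n in neighbors
                      if n["id"] not in visited and not n["faulty"]]
        if not candidates:
            return None
        return max(candidates, key=lambda n: n["energy"])["id"]
    ```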

Image-Centric Integrated Data Model of Medical Information by Diseases: Two Case Studies for AMI and Ischemic Stroke

  • Lee, Meeyeon;Park, Ye-Seul;Lee, Jung-Won
    • Journal of Information Processing Systems
    • /
    • v.12 no.4
    • /
    • pp.741-753
    • /
    • 2016
  • In the medical field, many efforts have been made to develop and improve Hospital Information Systems (HIS), including the Electronic Medical Record (EMR), the Order Communication System (OCS), and the Picture Archiving and Communication System (PACS). However, the materials generated and used in the medical field have various types and forms, and current HISs store and manage them in separate systems, even though they are related to each other and contain redundant data. These systems are not helpful, particularly in emergencies, where medical experts cannot review all clinical materials within the golden hour. Therefore, in this paper, we propose a process for building an integrated data model for the medical information currently stored in various HISs. The proposed data model integrates this vast information by focusing on medical images, since they are the most important materials for diagnosis and treatment. Moreover, the model is disease-specific, reflecting that medical information and clinical materials, including images, differ by disease. Two case studies, on acute myocardial infarction (AMI) and ischemic stroke, show the feasibility and usefulness of the proposed data model.

3D Model Compression For Collaborative Design

  • Liu, Jun;Wang, Qifu;Huang, Zhengdong;Chen, Liping;Liu, Yunhua
    • International Journal of CAD/CAM
    • /
    • v.7 no.1
    • /
    • pp.1-10
    • /
    • 2007
  • The compression of CAD models is a key technology for realizing Internet-based collaborative product development, because large model sizes often prevent rapid transmission of product information. Although some algorithms exist for compressing discrete CAD models, this paper focuses on original precise CAD models. The characteristics of the hierarchical structures in CAD models and the distribution of their redundant data are exploited to develop a novel data encoding method, in which different encoding rules are applied to different types of data. Geometric data is the major concern for reducing model size. For geometric data, the control points of B-spline curves and surfaces are compressed with second-order predictions in a local coordinate system. Based on an analysis of the distortion induced by quantization, an efficient method for computing the distortion is provided. The results indicate that the data size of CAD models can be decreased efficiently after compression with the proposed method.
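
    The second-order prediction of control points can be sketched as closed-loop predictive quantization along one coordinate (an illustrative sketch; the paper's local coordinate frames and entropy coding are omitted, and the step size is an assumption): each value is predicted as 2*p[i-1] - p[i-2] from previously reconstructed values, and only the quantized residual is stored.

    ```python
    import numpy as np

    def second_order_encode(points, step=0.01):
        """Closed-loop second-order predictive coding: predict each value as
        2*p[i-1] - p[i-2] (using reconstructed values, so quantization error
        does not accumulate) and quantize the residual with the given step."""
        pts = np.asarray(points, dtype=float)
        codes = np.empty(len(pts), dtype=int)
        rec = np.empty_like(pts)
        for i in range(len(pts)):
            if i == 0:
                pred = 0.0
            elif i == 1:
                pred = rec[0]
            else:
                pred = 2.0 * rec[i - 1] - rec[i - 2]
            codes[i] = int(np.round((pts[i] - pred) / step))
            rec[i] = pred + codes[i] * step
        return codes

    def second_order_decode(codes, step=0.01):
        """Invert the predictor; reproduces the encoder's reconstruction,
        so every value is within step/2 of the original."""
        res = np.asarray(codes, dtype=float) * step
        rec = np.empty_like(res)
        for i in range(len(res)):
            if i == 0:
                rec[i] = res[i]
            elif i == 1:
                rec[i] = rec[0] + res[i]
            else:
                rec[i] = 2.0 * rec[i - 1] - rec[i - 2] + res[i]
        return rec
    ```

    For smooth control polygons the residuals are small integers, which compress far better than raw coordinates.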

Encryption-based Image Steganography Technique for Secure Medical Image Transmission During the COVID-19 Pandemic

  • Alkhliwi, Sultan
    • International Journal of Computer Science & Network Security
    • /
    • v.21 no.3
    • /
    • pp.83-93
    • /
    • 2021
  • COVID-19 poses a major risk to global health, highlighting the importance of faster and more accurate diagnosis. To handle the rise in the number of patients and eliminate redundant tests, healthcare information and medical data are exchanged between healthcare centres. Medical data sharing helps speed up patient treatment; consequently, exchanging healthcare data is a requirement of the present era. Since healthcare professionals share data over the internet, security remains a critical challenge that needs to be addressed. During the COVID-19 pandemic, computed tomography (CT) and X-ray images play a vital part in the diagnosis process, constituting information that needs to be shared among hospitals. Encryption and image steganography techniques can be employed to achieve secure transmission of COVID-19 images. This study presents a new encryption with image steganography model for secure data transmission (EIS-SDT) for COVID-19 diagnosis. The EIS-SDT model uses a multilevel discrete wavelet transform for image decomposition and the Manta Ray Foraging Optimization algorithm for optimal pixel selection, and employs a double logistic chaotic map (DLCM) for secret image encryption. The DLCM-based encryption procedure provides an additional level of security on top of the image steganography technique. An extensive analysis of the simulation results under several evaluation parameters confirms the effective performance of the EIS-SDT model, which considerably outperforms existing methods.
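
    The chaotic-map encryption idea can be illustrated with a single logistic map driving an XOR keystream (a simplified stand-in for the paper's double logistic chaotic map, not its actual scheme): the map parameters (r, x0) act as the secret key, and because XOR is its own inverse, applying the transform twice recovers the data.

    ```python
    def logistic_keystream(n, r=3.99, x0=0.6):
        """Generate n pseudo-random bytes by iterating the logistic map
        x -> r*x*(1-x); the pair (r, x0) plays the role of the key."""
        x, out = x0, bytearray()
        for _ in range(n):
            x = r * x * (1.0 - x)
            out.append(int(x * 256) & 0xFF)
        return bytes(out)

    def chaotic_xor(data, r=3.99, x0=0.6):
        """XOR the data with the chaotic keystream; running the same
        function again with the same key decrypts."""
        ks = logistic_keystream(len(data), r, x0)
        return bytes(a ^ b for a, b in zip(data, ks))
    ```

    In the full EIS-SDT pipeline this step would be applied to the secret image before it is embedded in the cover image.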

H*-tree: An Improved Data Cube Structure for Multi-dimensional Analysis of Data Streams

  • XiangRui Chen;YuXiang Cheng;Yan Li;Song-Sun Shin;Dong-Wook Lee;Hae-Young Bae
    • Proceedings of the Korea Information Processing Society Conference
    • /
    • 2008.11a
    • /
    • pp.332-335
    • /
    • 2008
  • In this paper, we analyze the H-tree, which was proposed as the basic data cube structure for multi-dimensional data stream analysis. We find that the H-tree contains many redundant nodes and that its build method can be improved to save not only memory but also the time used for inserting tuples. To facilitate faster analysis of larger data streams, which is very important for stream research, the H*-tree is designed and developed. Our performance study compares the proposed H*-tree with the H-tree and shows that the H*-tree saves memory and time when inserting data stream tuples.

Mailing List Characteristic from Electronic Mail

  • Khaitiyakun, N.;Khunkitti, A.
    • Institute of Control, Robotics and Systems (ICROS): Conference Proceedings
    • /
    • 2004.08a
    • /
    • pp.917-921
    • /
    • 2004
  • The principle of a mailing list is to distribute messages to all subscribers at once. However, mailing list operation creates a network traffic problem, because the mailing list manager distributes mail without considering the subscribers' networks: if a network has many subscribers, redundant data flows through the traffic channel. A sub-mailing list aims to reduce this problem. In existing systems, the sub-mailing list is managed by hand by a network administrator, which causes trouble for network traffic when the administrator cannot identify mailing-list characteristics from the e-mails in time. This article presents ideas and a recognition methodology for automating the sub-mailing-list system. Recognition begins with a capture process that traps e-mail information from the transfer channel. The next step prepares the raw data in the recognition format. The third step performs the recognition and computes a confidence factor. The final step decides whether an e-mail has the properties of a mailing list and delivers the result to the sub-mailing list to act upon.
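
    The recognize-and-decide steps can be illustrated with a simple header-based heuristic (an assumed confidence factor for illustration; the paper's actual recognition features are not specified here). Standard mailing-list software adds well-known headers such as List-Id (RFC 2919) and List-Unsubscribe (RFC 2369), so counting them gives a crude confidence score.

    ```python
    import email

    # Header fields commonly added by mailing-list software
    # (List-Id: RFC 2919; List-Unsubscribe: RFC 2369).
    LIST_HEADERS = ("List-Id", "List-Unsubscribe", "Precedence", "Mailing-List")

    def mailing_list_score(raw_message: str) -> float:
        """Confidence factor: fraction of well-known list headers present."""
        msg = email.message_from_string(raw_message)
        return sum(h in msg for h in LIST_HEADERS) / len(LIST_HEADERS)

    def is_mailing_list(raw_message: str, threshold: float = 0.5) -> bool:
        """Decision step: treat the message as mailing-list mail when the
        confidence factor reaches the threshold."""
        return mailing_list_score(raw_message) >= threshold
    ```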


Development of a Machining Simulation System Using an Enhanced Z Map Model

  • 이상규;고성림
    • Proceedings of the Korean Society of Precision Engineering Conference
    • /
    • 2002.05a
    • /
    • pp.551-554
    • /
    • 2002
  • This paper discusses a new approach to machining operation simulation using an enhanced Z-map algorithm. To extract the required geometric information from NC code, the suggested algorithm uses a supersampling method to enhance the efficiency of the simulation process. By executing redundant Boolean operations in each grid cell and averaging down the calculated data, the presented algorithm can accurately represent the material removal volume even when the tool swept volume is negligibly small. Supersampling is the most common form of antialiasing and is usually used with polygon mesh rendering in computer graphics. The key advantage of the enhanced Z-map model is that its data structure is the same as that of the conventional Z-map model, while it achieves higher accuracy and reliability with the same or lower computation time. By simulating machining operations efficiently, this system can be used to improve the reliability and efficiency of the NC machining process as well as the quality of the final product.
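
    The supersampling idea can be sketched for a single Z-map cell cut by a flat-end tool (an illustrative sketch; the function and parameter names are assumptions, and the real system handles many cells and tool shapes): the cell is evaluated at ss x ss sub-points, the cut is applied wherever a sub-point lies under the tool, and the results are averaged down to one cell height, so even a cut much smaller than a cell leaves a proportional trace.

    ```python
    import numpy as np

    def supersampled_cut(height, x0, y0, size, cx, cy, r, z_tool, ss=4):
        """Updated height of one Z-map cell (corner (x0, y0), width `size`)
        after a cut by a flat-end tool of radius r centered at (cx, cy)
        with its bottom at z_tool. The cell is sampled at ss x ss
        sub-points and the per-sub-point heights are averaged."""
        offsets = (np.arange(ss) + 0.5) / ss * size
        total = 0.0
        for dx in offsets:
            for dy in offsets:
                inside = (x0 + dx - cx) ** 2 + (y0 + dy - cy) ** 2 <= r ** 2
                total += min(height, z_tool) if inside else height
        return total / (ss * ss)
    ```

    Note the cell still stores a single height, so the data structure matches a conventional Z map; only the update rule changes.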


The Development of a Highly Portable and Low Cost SPOT Image Receiving System

  • Choi, Wook-Hyun;Shin, Dong-Seok;Kim, Tag-Gon
    • Proceedings of the KSRS Conference
    • /
    • 1999.11a
    • /
    • pp.25-30
    • /
    • 1999
  • This paper covers the development of a highly portable, low-cost SPOT image data receiving system. We followed two design approaches. The first is a software-based approach in which most of the real-time processing is handled by software; with a complete software-based design, it is simple to add support for receiving additional satellite data, and satellite-specific format handlers, including error correction, decompression, and decryption, can easily be accommodated. For the second, we used a general hardware platform, an IBM PC with a low-cost SCSI RAID (Redundant Array of Independent Disks), which allows us to build a low-cost system.
