• Title/Summary/Keyword: combined systems


SVD-LDA: A Combined Model for Text Classification

  • Hai, Nguyen Cao Truong;Kim, Kyung-Im;Park, Hyuk-Ro
    • Journal of Information Processing Systems
    • /
    • v.5 no.1
    • /
    • pp.5-10
    • /
    • 2009
  • Text data has always accounted for a major portion of the world's information, and as the volume of information grows exponentially, the share of text data grows with it. Text classification therefore remains an important area of research. LDA is a widely used probabilistic model that has been applied, with various enhancements, in many fields, including text data. However, little attention has been paid to the input that is given to LDA. In this paper, we suggest a way to map the input space to a reduced space, which can avoid the unreliability, ambiguity, and redundancy of individual terms as descriptors. The purpose of this paper is to show that LDA performs well in such a "clean and clear" space. Experiments are conducted on the 20 Newsgroups data set. The results show that the proposed method can boost classification performance when an appropriate rank for the reduced space is chosen (a minimal sketch of such a reduced-space pipeline follows below).
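
Since the abstract does not spell out the exact SVD-LDA pipeline, the following is only a minimal sketch of classification in an SVD-reduced space: TF-IDF features of 20 Newsgroups are projected with truncated SVD, and a logistic-regression classifier stands in for the paper's LDA stage. The scikit-learn pipeline and the rank value are illustrative assumptions.

```python
# Minimal sketch: classify 20 Newsgroups text in an SVD-reduced space.
# The logistic-regression stage is a stand-in for the paper's LDA step (assumption).
from sklearn.datasets import fetch_20newsgroups
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

train = fetch_20newsgroups(subset="train")
test = fetch_20newsgroups(subset="test")

rank = 200  # rank of the reduced space; the abstract's key tuning parameter
pipeline = make_pipeline(
    TfidfVectorizer(stop_words="english"),  # raw term space
    TruncatedSVD(n_components=rank),        # map terms to a reduced, less noisy space
    LogisticRegression(max_iter=1000),      # classifier operating in the reduced space
)
pipeline.fit(train.data, train.target)
print("test accuracy:", pipeline.score(test.data, test.target))
```

Varying `rank` reproduces the kind of rank-selection experiment the abstract describes.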

Extreme Learning Machine Ensemble Using Bagging for Facial Expression Recognition

  • Ghimire, Deepak;Lee, Joonwhoan
    • Journal of Information Processing Systems
    • /
    • v.10 no.3
    • /
    • pp.443-458
    • /
    • 2014
  • An extreme learning machine (ELM) is a recently proposed learning algorithm for single-hidden-layer feedforward neural networks. In this paper we study an ensemble of ELMs built with a bagging algorithm for facial expression recognition (FER). Facial expression analysis is widely used in the behavioral interpretation of emotions, in cognitive science, and in social interaction. This paper presents an FER method based on histogram of oriented gradients (HOG) features and an ELM ensemble. First, HOG features are extracted from the face image by dividing it into a number of small cells. A bagging algorithm is then used to construct many different bags of training data, each of which trains a separate ELM. To recognize the expression of an input face image, its HOG features are fed to each trained ELM and the results are combined by a majority voting scheme. The ELM ensemble with bagging significantly improves the generalization capability of the network. Two publicly available facial expression datasets (JAFFE and CK+) were used to evaluate the performance of the proposed classification system. Although the performance of an individual ELM was lower, the ELM ensemble with bagging improved recognition performance significantly (a minimal ensemble sketch follows below).
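
As an illustration of the ensemble idea only, not the authors' implementation, the sketch below trains several randomly initialized ELMs on bootstrap bags of precomputed feature vectors (e.g. HOG descriptors) and combines their predictions by majority vote. The class and parameter names are assumptions.

```python
# Minimal sketch of an ELM ensemble with bagging and majority voting.
# X_train/X_test are assumed to hold precomputed feature vectors (e.g. HOG).
import numpy as np

class ELM:
    """Single-hidden-layer extreme learning machine."""
    def __init__(self, n_hidden=500, rng=None):
        self.n_hidden = n_hidden
        self.rng = np.random.default_rng() if rng is None else rng

    def fit(self, X, y, n_classes):
        self.W = self.rng.normal(size=(X.shape[1], self.n_hidden))  # random input weights
        self.b = self.rng.normal(size=self.n_hidden)                # random hidden biases
        H = np.tanh(X @ self.W + self.b)                            # hidden-layer outputs
        T = np.eye(n_classes)[y]                                    # one-hot targets
        self.beta = np.linalg.pinv(H) @ T                           # analytic output weights
        return self

    def predict(self, X):
        return np.argmax(np.tanh(X @ self.W + self.b) @ self.beta, axis=1)

def bagged_elm_predict(X_train, y_train, X_test, n_classes, n_bags=15, seed=0):
    rng = np.random.default_rng(seed)
    votes = []
    for _ in range(n_bags):
        idx = rng.integers(0, len(X_train), size=len(X_train))      # one bootstrap bag
        votes.append(ELM(rng=rng).fit(X_train[idx], y_train[idx], n_classes).predict(X_test))
    votes = np.stack(votes)                                         # shape: (n_bags, n_test)
    # majority vote across the ensemble for every test sample
    return np.apply_along_axis(lambda v: np.bincount(v, minlength=n_classes).argmax(), 0, votes)
```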

Evaluation of Histograms Local Features and Dimensionality Reduction for 3D Face Verification

  • Ammar, Chouchane;Mebarka, Belahcene;Abdelmalik, Ouamane;Salah, Bourennane
    • Journal of Information Processing Systems
    • /
    • v.12 no.3
    • /
    • pp.468-488
    • /
    • 2016
  • The paper proposes a novel framework for 3D face verification using dimensionality reduction based on highly distinctive local features in the presence of illumination and expression variations. Histograms of efficient local descriptors are used to represent the facial images distinctively. For this purpose, different local descriptors are evaluated: Local Binary Patterns (LBP), Three-Patch Local Binary Patterns (TPLBP), Four-Patch Local Binary Patterns (FPLBP), Binarized Statistical Image Features (BSIF), and Local Phase Quantization (LPQ). Furthermore, experiments on feature-level combinations of four of these descriptors using simple histogram concatenation are provided. The performance of the proposed approach is evaluated with different dimensionality reduction algorithms: Principal Component Analysis (PCA), Orthogonal Locality Preserving Projection (OLPP), and combined PCA+EFM (Enhanced Fisher linear discriminant Model). Finally, a multi-class Support Vector Machine (SVM) is used as the classifier to carry out verification between impostors and clients. The proposed method has been tested on the CASIA-3D face database, and the experimental results show that it achieves high verification performance (a minimal descriptor-plus-reduction sketch is given below).
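
A minimal sketch of one such combination is given below: block-wise uniform-LBP histograms are concatenated at the feature level, reduced with PCA, and fed to a linear SVM. The block grid, the number of PCA components, and the helper names are illustrative choices, not the paper's settings.

```python
# Minimal sketch: concatenated block-wise LBP histograms + PCA + linear SVM.
import numpy as np
from skimage.feature import local_binary_pattern
from sklearn.decomposition import PCA
from sklearn.svm import SVC

def lbp_histogram(face, P=8, R=1, grid=(8, 8)):
    """Concatenate uniform-LBP histograms computed over a grid of blocks."""
    lbp = local_binary_pattern(face, P, R, method="uniform")
    n_bins = P + 2                                   # number of uniform pattern labels
    h, w = face.shape
    feats = []
    for i in range(grid[0]):
        for j in range(grid[1]):
            block = lbp[i * h // grid[0]:(i + 1) * h // grid[0],
                        j * w // grid[1]:(j + 1) * w // grid[1]]
            hist, _ = np.histogram(block, bins=n_bins, range=(0, n_bins), density=True)
            feats.append(hist)
    return np.concatenate(feats)                     # feature-level concatenation

def train_verifier(faces, labels, n_components=100):
    """faces: grayscale face images; labels: client identities (assumed given)."""
    X = np.array([lbp_histogram(f) for f in faces])
    pca = PCA(n_components=n_components).fit(X)      # dimensionality reduction step
    svm = SVC(kernel="linear").fit(pca.transform(X), labels)
    return pca, svm
```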

Community Model for Smart TV over the Top Services

  • Pandey, Suman;Won, Young Joon;Choi, Mi-Jung;Gil, Joon-Min
    • Journal of Information Processing Systems
    • /
    • v.12 no.4
    • /
    • pp.577-590
    • /
    • 2016
  • We study the current state of the art of Smart TV, its challenges, and its drawbacks, focusing mainly on the lack of an end-to-end solution. We then illustrate the differences between Smart TV and IPTV from the network service provider's point of view. Unlike IPTV viewers, the viewers of Smart TV's over-the-top (OTT) services can be global, such as foreign nationals in a country or viewers with special viewing preferences, and these viewers are sparsely distributed. Existing TV service deployment models over the Internet are not suitable for such viewers because they are based on content popularity; hence we propose a community-based service deployment methodology with proactive content caching on rendezvous points (RPs). In our proposal, RPs are intermediate nodes responsible for caching, routing, and decision making. Viewer communities are formed based on geographical location and similarity of interests (a minimal sketch of this grouping follows below). The idea of using context information for proactive caching is not itself new, but we combine it with the in-network caching mechanism of the content-centric networking (CCN) architecture. We gauge the performance improvement achieved by the community model. The results show that, for the same total number of requests, our model performs significantly better, especially for sparsely distributed communities.
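
The grouping idea can be illustrated with a short sketch, assuming each viewer is described by a region and a set of interests: communities are grown greedily within a region using Jaccard similarity of interests. The threshold and data layout are invented for illustration.

```python
# Minimal sketch: greedy viewer-community formation by region and interest similarity.

def jaccard(a, b):
    """Similarity of two interest sets."""
    return len(a & b) / len(a | b) if a | b else 0.0

def form_communities(viewers, threshold=0.3):
    """viewers: dict of viewer id -> (region, set of interests)."""
    communities = []
    for vid, (region, interests) in viewers.items():
        for community in communities:
            if (region == community["region"]
                    and jaccard(interests, community["interests"]) >= threshold):
                community["members"].add(vid)
                community["interests"] |= interests      # widen the community profile
                break
        else:                                            # no suitable community found
            communities.append({"region": region,
                                "interests": set(interests),
                                "members": {vid}})
    return communities

viewers = {
    "v1": ("seoul", {"k-drama", "news"}),
    "v2": ("seoul", {"k-drama", "sports"}),
    "v3": ("paris", {"k-drama", "news"}),
}
print(form_communities(viewers))
```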

Security Constrained Optimal Power Flow by Hybrid Algorithms (하이브리드 알고리즘을 응용하여 안전도제약을 만족시키는 최적전력조류)

  • Kim, Gyu-Ho;Lee, Sang-Bong;Lee, Jae-Gyu;Yu, Seok-Gu
    • The Transactions of the Korean Institute of Electrical Engineers A
    • /
    • v.49 no.6
    • /
    • pp.305-311
    • /
    • 2000
  • This paper presents a hybrid algorithm for solving the optimal power flow (OPF) problem in order to enhance a system's capability to cope with outages; it is based on the combined application of evolutionary computation and a local search method. The algorithm, which combines the main advantages of the two methods, works as follows: first, evolutionary computation performs global exploration over a population, providing a good initial point for the conventional method; then, a local method performs local exploitation. The hybrid approach often outperforms either method operating alone and reduces the total computation time. The objective function of the security-constrained OPF is the minimization of generation fuel costs and real power losses. The resulting optimal operating point has to remain feasible after outages such as any single-line outage (with respect to voltage magnitude, reactive power generation, and power flow limits). In the security-constrained OPF, the outages are selected by a contingency ranking method (contingency screening model). The proposed method is applied to the IEEE 30-bus system to show its effectiveness (a minimal sketch of the two-stage hybrid idea is given below).
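
A minimal sketch of the two-stage hybrid idea is given below, with a toy quadratic generation-cost function standing in for the full security-constrained OPF objective; SciPy's differential evolution plays the role of the evolutionary stage and L-BFGS-B the role of the local search. Coefficients, limits, and the power-balance penalty are invented for illustration.

```python
# Minimal sketch: evolutionary global search followed by local refinement.
import numpy as np
from scipy.optimize import differential_evolution, minimize

def cost(p):
    """Toy fuel-cost objective with a soft power-balance penalty (illustrative only)."""
    a = np.array([0.0200, 0.0175, 0.0625])       # quadratic cost coefficients
    b = np.array([2.00, 1.75, 1.00])             # linear cost coefficients
    return float(np.sum(a * p**2 + b * p) + 0.05 * (np.sum(p) - 300.0) ** 2)

bounds = [(50, 200), (20, 150), (15, 120)]       # generator output limits (MW)

# Stage 1: evolutionary computation for global exploration of the search space
global_result = differential_evolution(cost, bounds, maxiter=50, seed=1)

# Stage 2: local search started from the evolutionary solution
local_result = minimize(cost, global_result.x, bounds=bounds, method="L-BFGS-B")

print("cost after global stage:", global_result.fun)
print("cost after local refinement:", local_result.fun)
```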


Formal Representation and Query for Digital Contents Data

  • Khamis, Khamis Abdul-Latif;Song, Huazhu;Zhong, Xian
    • Journal of Information Processing Systems
    • /
    • v.16 no.2
    • /
    • pp.261-276
    • /
    • 2020
  • Digital content services are one of the topics studied intensively in the media industry, where various semantic and ontology techniques are applied. However, query execution for ontology data is still inefficient, extensible definitions for node relationships are lacking, and there is no semantic method specifically suited to media data representation. In order to make machines understand digital contents (DCs) data well, we analyze DCs data, including static and dynamic data, and use an ontology to specify and classify objects and the events of particular objects. A formal representation method is then proposed that not only redefines DCs data on top of OWL/RDF technology but is also combined with media segmentation methods. At the same time, to speed up access to DCs data stored in a persistent database, an ontology-based DCs query solution is proposed, which uses a specified distance vector associated with the surveillance of a semantic label (annotation) to detect and track a moving or static object (a minimal representation-and-query sketch is given below).
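
A minimal sketch of the representation-and-query idea is given below, using rdflib with a made-up digital-contents vocabulary rather than the paper's ontology; the namespace, properties, and SPARQL query are illustrative assumptions.

```python
# Minimal sketch: digital-content objects and events as RDF triples, queried with SPARQL.
from rdflib import Graph, Namespace, Literal, RDF

DC = Namespace("http://example.org/digital-contents#")      # hypothetical vocabulary
g = Graph()

g.add((DC.clip42, RDF.type, DC.VideoSegment))                # a dynamic-content segment
g.add((DC.clip42, DC.containsObject, DC.person1))            # object detected in the segment
g.add((DC.person1, DC.involvedInEvent, DC.entersScene))      # event of that object
g.add((DC.entersScene, DC.atTime, Literal("00:01:23")))      # static annotation data

# Find every segment whose tracked object participates in some event.
query = """
PREFIX dc: <http://example.org/digital-contents#>
SELECT ?segment ?event WHERE {
    ?segment dc:containsObject ?obj .
    ?obj dc:involvedInEvent ?event .
}
"""
for segment, event in g.query(query):
    print(segment, event)
```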

Overload Surge Investigation Using CFD Data

  • Flemming, Felix;Foust, Jason;Koutnik, Jiri;Fisher, Richard K.
    • International Journal of Fluid Machinery and Systems
    • /
    • v.2 no.4
    • /
    • pp.315-323
    • /
    • 2009
  • Pressure oscillations triggered by the unstable interaction of dynamic flow features of the hydraulic turbine with the hydraulic plant system, including the electrical design, can at times reach significant levels and could damage plant components or substantially reduce their lifetime. Such a problem can arise for overload as well as part-load operation of the turbine. This paper discusses an approach to analyzing the overload high-pressure-oscillation problem using computational fluid dynamics (CFD) modeling of the hydraulic machine combined with a network modeling technique for the hydraulic system. The key factor in this analysis is the determination of the volume of the overload vortex rope occurring within the turbine under the runner, which acts as an active element in the system. Two different modeling techniques for computing the flow field downstream of the runner are presented in this paper. As a first approach, single-phase flow simulations are used to evaluate the vortex rope volume, before moving to more sophisticated modeling that incorporates two-phase flow calculations employing cavitation modeling. The influence of these different modeling strategies on the simulated plant behavior is discussed (a minimal rope-volume estimation sketch follows below).
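
One plausible way to estimate the rope volume from exported two-phase CFD results, assuming a per-cell vapor volume fraction is available, is sketched below; the field names, the threshold, and the synthetic data are assumptions rather than details from the paper.

```python
# Minimal sketch: estimate a cavitating vortex rope volume from per-cell CFD data
# by summing vapor-weighted cell volumes where the vapor fraction exceeds a threshold.
import numpy as np

def vortex_rope_volume(cell_volumes, vapor_fraction, threshold=0.1):
    """cell_volumes, vapor_fraction: 1-D arrays over the draft-tube cells."""
    mask = vapor_fraction > threshold                     # cells inside the rope
    return float(np.sum(cell_volumes[mask] * vapor_fraction[mask]))

# Synthetic data standing in for exported CFD cell data
rng = np.random.default_rng(0)
volumes = rng.uniform(1e-6, 5e-6, size=10_000)            # cell volumes [m^3]
alpha_v = np.clip(rng.normal(0.05, 0.10, size=10_000), 0.0, 1.0)  # vapor fraction [-]
print("estimated rope volume [m^3]:", vortex_rope_volume(volumes, alpha_v))
```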

A Design Method for Cascades Consisting of Circular Arc Blades with Constant Thickness

  • Bian, Tao;Han, Qianpeng;Bohle, Martin
    • International Journal of Fluid Machinery and Systems
    • /
    • v.10 no.1
    • /
    • pp.63-75
    • /
    • 2017
  • Many axial fans have circular arc blades with constant thickness. It is still a challenging task to calculate their performance, i.e., to predict how large their pressure rise and pressure losses are, and cascade data are needed for this task. The designer therefore needs a method that works quickly for design purposes. The present contribution describes a design method for such cascades consisting of circular arc blades with constant thickness. It is based on a singularity method combined with a CFD-data-based flow loss model, which uses CFD data to predict the total pressure losses. An interpolation method for the CFD data is applied and described in detail (a minimal interpolation sketch is given below). Measurement data are used to validate the CFD data, and parameter variations are conducted, covering the camber angle, the pitch-chord ratio, and the Reynolds number. Additionally, flow patterns of two-dimensional cascades consisting of circular arc blades with constant thickness are shown.
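
A minimal sketch of the interpolation step is given below, assuming the CFD loss data are tabulated on a regular grid over camber angle, pitch-chord ratio, and Reynolds number; the grid and the loss values are synthetic placeholders, not the authors' data.

```python
# Minimal sketch: interpolate a CFD-derived total-pressure-loss table over design parameters.
import numpy as np
from scipy.interpolate import RegularGridInterpolator

camber = np.array([10.0, 20.0, 30.0, 40.0])        # camber angle [deg]
pitch_chord = np.array([0.50, 0.75, 1.00, 1.25])   # pitch-chord ratio [-]
reynolds = np.array([1e5, 3e5, 5e5])               # Reynolds number [-]

# Placeholder loss-coefficient table with shape (camber, pitch_chord, reynolds)
loss_table = np.random.default_rng(0).uniform(0.02, 0.08, size=(4, 4, 3))

loss_model = RegularGridInterpolator((camber, pitch_chord, reynolds), loss_table)

# Query the loss model at an intermediate design point
print("interpolated loss coefficient:", loss_model([[25.0, 0.9, 2e5]])[0])
```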

The Study of Mutation Spectrum in lac I Gene of Transgenic Big Blue® Cell Line Following Short-Term Exposure to 4-Nitroquinoline N-oxide

  • Youn, Ji-Youn;Kim, Kyung-Ran;Cho, Kyung-Hea;Ryu, Jae-Chun
    • Proceedings of the Korea Society of Environmental Toxicology Conference
    • /
    • 1996.12a
    • /
    • pp.64-64
    • /
    • 1996
  • Transgenic animal and cell line models, recently developed in the toxicology field in combination with molecular biological techniques, are powerful tools for studying mutation in vivo and in vitro, respectively. The Big Blue mutagenesis assay system is one of the most widely used transgenic systems. Especially for the study of direct-acting mutagens, the Big Blue cell line is very useful and powerful for evaluating mutagenicity, because the mutation frequency and mutation spectrum show no distinct differences between the cell line and the animal. The Big Blue cell lines carry stably integrated copies of a lambda shuttle vector containing the lac I gene as a mutational target. These lambda shuttle vectors are useful for various mutagenesis-related studies in eukaryotic systems because they can be exposed to a mutagen and then transfer a suitable target DNA sequence to a convenient organism for analysis. We assessed the mutagenic effect of 4-NQO in the Big Blue cell line. After treatment with 4-NQO, genomic DNA was isolated, the lambda shuttle vector was packaged by in vitro packaging, and the packaged vectors were plated on a bacterial host in the presence of X-gal to screen for mutations in lac I. We determined the mutation frequency (MF) as the ratio of blue plaques to colorless plaques and are now analyzing the mutation spectrum of 4-NQO in the lac I gene sequence.


A Study on Recommendation Method Based on Web 3.0

  • Kim, Sung Rim;Kwon, Joon Hee
    • Journal of Korea Society of Digital Industry and Information Management
    • /
    • v.8 no.4
    • /
    • pp.43-51
    • /
    • 2012
  • Web 3.0 is the next generation of the World Wide Web and comprises two main platforms: semantic technologies and the social computing environment. The basic idea of Web 3.0 is to define structured data and link it to enable more effective discovery, automation, integration, and reuse across various applications. The semantic technologies are open standards that can be applied on top of the Web, while the social computing environment enables human-machine cooperation and the organization of a large number of social web communities. In recent years, recommender systems have been combined with ontologies to further improve recommendations by adding semantics to the context of Web 3.0. In this paper, we review previous research on recommendation methods and propose a recommendation method based on Web 3.0. Our method scores documents based on context tags and social network services: the social scoring model combines a document's own tagging score with the tagging score from tags applied to the document by the user's friends (a minimal sketch of this scoring idea is given below).
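
A minimal sketch of such a social scoring function is shown below; the weights, tag layout, and ranking step are invented for illustration and are not the paper's exact model.

```python
# Minimal sketch: score documents by their own context-tag matches plus
# matches from tags applied by the user's friends, then rank them.
def social_score(doc_tags, context_tags, friend_tags, w_own=1.0, w_friend=0.5):
    """doc_tags: tags on the document; context_tags: the user's current context;
    friend_tags: tags the user's friends attached to this document."""
    own = len(set(doc_tags) & set(context_tags))          # document tagging score
    friends = len(set(friend_tags) & set(context_tags))   # friends' tagging score
    return w_own * own + w_friend * friends

docs = {
    "d1": {"tags": ["travel", "food"], "friend_tags": ["travel"]},
    "d2": {"tags": ["music"], "friend_tags": ["travel", "music"]},
}
context = ["travel", "photo"]
ranked = sorted(docs, key=lambda d: social_score(docs[d]["tags"], context,
                                                 docs[d]["friend_tags"]),
                reverse=True)
print(ranked)   # documents ordered by combined own + social score
```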