• Title/Summary/Keyword: random algorithm


A Query Randomizing Technique for breaking 'Filter Bubble'

  • Joo, Sangdon; Seo, Sukyung; Yoon, Youngmi
    • Journal of the Korea Society of Computer and Information / v.22 no.12 / pp.117-123 / 2017
  • A personalized search algorithm analyzes a user's IP address, cookies, log data, and search history to recommend the information it judges the user wants. As a result, users become isolated within the frame of information the algorithm recommends, a phenomenon known as the 'filter bubble'. Most personalization data can be deleted or changed by the user, but the data stored on the service provider's server is difficult to access. This study suggests a way to neutralize personalization by continuously sending random query words, confusing the data accumulated on the server with search activity on words unrelated to the user. Using a personalized account as the experimental group, we analyzed the rank changes of URLs while performing searches with 500 random query words. To verify the effect, we created a new account as a control group. We then searched the same set of queries with both accounts, stored the URL data, and scored the rank variation, weighting URLs ranked on upper pages more heavily than lower-ranked URLs. At the beginning of the experiment, the difference between the scores of the two accounts was insignificant; as the experiment continued and the number of random query words accumulated on the server grew, the results showed a meaningful difference.
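
The experiment hinges on a position-weighted score of how far URL rankings diverge between the personalized account and the control account. The abstract does not give the exact weighting, so the following is a minimal sketch under assumptions: the function name, the inverse-rank weighting, and the sample data are illustrative, not the authors' implementation.

```python
# Minimal sketch of a position-weighted rank-variation score between two
# accounts' result lists for the same query. The 1/rank weighting and all
# names here are illustrative assumptions, not the paper's code.

def rank_variation_score(personalized: list[str], control: list[str]) -> float:
    """Sum weighted rank differences; higher-ranked URLs contribute more."""
    control_rank = {url: i + 1 for i, url in enumerate(control)}
    score = 0.0
    for i, url in enumerate(personalized, start=1):
        j = control_rank.get(url, len(control) + 1)  # missing URL -> worst rank + 1
        weight = 1.0 / i                              # upper-page URLs weigh more
        score += weight * abs(i - j)
    return score

# Example: identical lists score 0; divergence raises the score.
a = ["u1", "u2", "u3", "u4"]
b = ["u2", "u1", "u5", "u3"]
print(rank_variation_score(a, b))
```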

Blind Equalizer Algorithms using Random Symbols and Decision Feedback (랜덤 심볼열과 결정 궤환을 사용한 자력 등화 알고리듬)

  • Kim, Nam-Yong
    • Journal of the Korea Academia-Industrial cooperation Society / v.13 no.1 / pp.343-347 / 2012
  • Non-linear equalization techniques using a decision feedback structure are in high demand for cancelling the intersymbol interference that arises in severe channel environments. In this paper, a decision feedback structure is applied to a linear blind equalizer algorithm that is based on information-theoretic learning and a randomly generated symbol set. At the decision feedback equalizer (DFE), the random symbols are generated to have the same probability density function (PDF) as that of the transmitted symbols. By minimizing the difference between the PDF of the blind DFE output and that of the randomly generated symbols, the proposed DFE algorithm produces the equalized output signal. Simulation results show that the proposed method achieves better convergence and error performance than its linear counterpart.
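
The cost being minimized is the distance between the Parzen-estimated PDF of the equalizer output and that of a random symbol block drawn with the transmitted-symbol PDF. The sketch below only evaluates that squared-distance criterion with a Gaussian kernel; the kernel width, block sizes, and data are assumptions, and the paper's actual weight-update recursion is not reproduced.

```python
import numpy as np

def gaussian_kernel(u, sigma):
    return np.exp(-u**2 / (2 * sigma**2)) / (np.sqrt(2 * np.pi) * sigma)

def pdf_distance(y, d, sigma=0.5):
    """Squared Euclidean distance between Parzen PDF estimates of the
    equalizer output block y and the random symbol block d:
    V(y,y) - 2*V(y,d) + V(d,d), with V the pairwise kernel average
    (kernel width sqrt(2)*sigma from convolving two Parzen kernels)."""
    def V(a, b):
        diff = a[:, None] - b[None, :]
        return gaussian_kernel(diff, np.sqrt(2) * sigma).mean()
    return V(y, y) - 2 * V(y, d) + V(d, d)

# Toy check: an output block matching the symbol PDF scores lower than a mismatched one.
rng = np.random.default_rng(0)
symbols = rng.choice([-3, -1, 1, 3], size=200)                        # random 4-PAM symbols
good = rng.choice([-3, -1, 1, 3], size=200) + 0.1 * rng.standard_normal(200)
bad = 0.3 * rng.standard_normal(200)                                  # poorly equalized output
print(pdf_distance(good, symbols), pdf_distance(bad, symbols))
```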

Efficient Weighted Random Pattern Generation Using Weight Set Optimization (가중치 집합 최적화를 통한 효율적인 가중 무작위 패턴 생성)

  • 이항규; 김홍식; 강성호
    • Journal of the Korean Institute of Telematics and Electronics C / v.35C no.9 / pp.29-37 / 1998
  • In weighted random pattern testing, it is important to find optimal weight sets that achieve high fault coverage with a small number of weighted random patterns. In this paper, a new weight set optimization algorithm is developed that can generate optimal weight sets efficiently using the sampling probabilities of deterministic test patterns. In addition, a simulation-based method for finding the proper maximum Hamming distance is presented. Experimental results for the ISCAS 85 benchmark circuits demonstrate the effectiveness of the new weight set optimization algorithm and of the method for finding the proper maximum Hamming distance.
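
Given a weight set (the probability of driving each circuit input to logic 1), weighted random patterns are drawn by biased coin flips, and the weights themselves can be estimated from how often each input is 1 across the deterministic test set. The sketch below shows only that generation step with made-up inputs; the paper's optimization of multiple weight sets and the Hamming-distance control are not reproduced.

```python
import numpy as np

def estimate_weights(deterministic_tests: np.ndarray) -> np.ndarray:
    """Per-input probability of logic 1 across the deterministic test set
    (a simple stand-in for the paper's sampling-probability-based weights)."""
    return deterministic_tests.mean(axis=0)

def weighted_random_patterns(weights: np.ndarray, n_patterns: int, seed=0) -> np.ndarray:
    """Draw each input bit independently with its weight as P(bit = 1)."""
    rng = np.random.default_rng(seed)
    return (rng.random((n_patterns, weights.size)) < weights).astype(np.uint8)

# Hypothetical 4-input deterministic tests -> weights -> 8 weighted random patterns.
tests = np.array([[1, 0, 1, 1],
                  [1, 1, 0, 1],
                  [0, 0, 1, 1]], dtype=np.uint8)
w = estimate_weights(tests)          # e.g. [0.67, 0.33, 0.67, 1.0]
print(weighted_random_patterns(w, 8))
```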


Moving object segmentation using Markov Random Field (마코프 랜덤 필드를 이용한 움직이는 객체의 분할에 관한 연구)

  • 정철곤; 김중규
    • The Journal of Korean Institute of Communications and Information Sciences / v.27 no.3A / pp.221-230 / 2002
  • This paper presents a new moving object segmentation algorithm using a Markov random field. The algorithm is based on signal detection theory: the motion of a moving object is decided by a binary decision rule, and false decisions are corrected by a Markov random field model. The procedure toward complete segmentation consists of two steps, motion detection and object segmentation. First, motion detection decides the presence of motion at each velocity vector by a binary decision rule, where the velocity vectors are generated by optical flow. Second, object segmentation removes noise using Bayes' rule. Experimental results demonstrate the efficiency of the presented method.
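
The two-step structure can be illustrated as a threshold decision on optical-flow magnitude followed by an Ising-style MRF cleanup. The sketch below uses iterated conditional modes (ICM) as the MRF optimizer, which is an assumption; flow estimation itself is omitted and a synthetic flow field stands in for real data.

```python
import numpy as np

def mrf_motion_segmentation(flow_mag, tau=1.0, beta=1.5, n_iter=5):
    """Step 1: binary decision on flow magnitude. Step 2: ICM updates that
    trade a quadratic data term against agreement with the 4-neighbourhood."""
    labels = (flow_mag > tau).astype(np.int8)                      # initial binary decision
    data = np.stack([(flow_mag - 0.0)**2, (flow_mag - 2 * tau)**2])  # crude per-label data costs
    for _ in range(n_iter):
        for i in range(1, labels.shape[0] - 1):
            for j in range(1, labels.shape[1] - 1):
                nbr = labels[i-1, j] + labels[i+1, j] + labels[i, j-1] + labels[i, j+1]
                cost0 = data[0, i, j] + beta * nbr          # neighbours labelled 1 disagree with 0
                cost1 = data[1, i, j] + beta * (4 - nbr)    # neighbours labelled 0 disagree with 1
                labels[i, j] = 0 if cost0 <= cost1 else 1
    return labels

# Synthetic flow magnitude: a moving square plus isolated false detections.
rng = np.random.default_rng(1)
mag = 0.2 * rng.random((32, 32)); mag[10:20, 12:22] = 2.0
mag[rng.random((32, 32)) > 0.97] = 2.0
print(mrf_motion_segmentation(mag).sum())   # speckles are smoothed away, the square survives
```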

Dual Sliding Statistics Switching Median Filter for the Removal of Low Level Random-Valued Impulse Noise

  • Suid, Mohd Helmi; Jusof, M. F. M.; Ahmad, Mohd Ashraf
    • Journal of Electrical Engineering and Technology / v.13 no.3 / pp.1383-1391 / 2018
  • A new nonlinear filtering algorithm for effectively denoising images corrupted by random-valued impulse noise, called the dual sliding statistics switching median (DSSSM) filter, is presented in this paper. The proposed DSSSM filter is made up of two subunits: impulse noise detection and noise filtering. The impulse noise detection stage of the DSSSM algorithm begins by processing the statistics of a localized detection window in sorted order and non-sorted order simultaneously. Next, the median of absolute differences (MAD) obtained from both the sorted and non-sorted statistics is further processed in order to classify any possible noise pixels. Subsequently, the filtering stage replaces the detected noise pixels with the estimated median value of the surrounding pixels. In addition, fuzzy-based local information is used in the filtering stage to help the filter preserve edges and details. Extensive simulation results on grayscale images indicate that the DSSSM filter performs significantly better than a number of well-known impulse noise filters in the literature in terms of noise suppression and detail preservation, at impulse noise corruption rates of up to 30%. Finally, the DSSSM filter is algorithmically simple and suitable for implementation in electronic imaging products.
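
A switching median filter first detects likely impulse pixels and only replaces those, leaving clean pixels untouched. The sketch below uses a single absolute-deviation-from-median test per window as the detector, which is a simplification of the dual sorted/non-sorted statistics and fuzzy weighting described in the abstract; the threshold and window size are assumptions.

```python
import numpy as np

def switching_median_filter(img: np.ndarray, threshold: float = 40.0, k: int = 1) -> np.ndarray:
    """Replace a pixel with its window median only if it deviates strongly
    from that median (simplified detection stage + median filtering stage)."""
    out = img.astype(np.float64).copy()
    h, w = img.shape
    for i in range(k, h - k):
        for j in range(k, w - k):
            window = img[i-k:i+k+1, j-k:j+k+1].astype(np.float64)
            med = np.median(window)
            if abs(float(img[i, j]) - med) > threshold:   # detection stage
                out[i, j] = med                           # filtering stage
    return out.astype(img.dtype)

# Toy example: a flat patch corrupted by a few random-valued impulses.
rng = np.random.default_rng(2)
img = np.full((16, 16), 120, dtype=np.uint8)
idx = rng.integers(1, 15, size=(10, 2))
img[idx[:, 0], idx[:, 1]] = rng.integers(0, 256, size=10).astype(np.uint8)
print(np.abs(switching_median_filter(img).astype(int) - 120).max())  # residual deviation stays small
```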

Obesity Level Prediction Based on Data Mining Techniques

  • Alqahtani, Asma; Albuainin, Fatima; Alrayes, Rana; Al muhanna, Noura; Alyahyan, Eyman; Aldahasi, Ezaz
    • International Journal of Computer Science & Network Security / v.21 no.3 / pp.103-111 / 2021
  • Obesity affects individuals of all genders and ages worldwide; consequently, several studies have worked to identify the factors causing it. This study develops an effective method to trace obesity levels based on supervised data mining techniques such as Random Forest and Multi-Layer Perceptron (MLP), so as to tackle this universal epidemic. The dataset came from Mexico, Peru, and Colombia and covers the 14-61 year age group, with varying eating habits and physical conditions. The data comprise 2111 instances and 17 attributes labelled with NObesity, which categorizes records as Insufficient Weight, Normal Weight, Overweight Levels I and II, and Obesity Types I to III. This study found that the Random Forest algorithm achieved higher accuracy than the MLP algorithm, with an overall classification rate of 96.7%.
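
The comparison amounts to training a Random Forest and an MLP on the same labelled table and comparing held-out accuracy. The sketch below shows that workflow with scikit-learn on synthetic stand-in data; the real dataset (2111 instances, 17 attributes, NObesity labels), its preprocessing, and any tuning are not reproduced.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Synthetic stand-in for the 17-attribute, 7-class obesity dataset.
X, y = make_classification(n_samples=2111, n_features=17, n_informative=10,
                           n_classes=7, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
mlp = make_pipeline(StandardScaler(),
                    MLPClassifier(hidden_layer_sizes=(64,), max_iter=500,
                                  random_state=0)).fit(X_tr, y_tr)

print("Random Forest accuracy:", rf.score(X_te, y_te))
print("MLP accuracy:          ", mlp.score(X_te, y_te))
```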

A Predictive Model to identify possible affected Bipolar disorder students using Naive Baye's, Random Forest and SVM machine learning techniques of data mining and Building a Sequential Deep Learning Model using Keras

  • Peerbasha, S.; Surputheen, M. Mohamed
    • International Journal of Computer Science & Network Security / v.21 no.5 / pp.267-274 / 2021
  • Medical care practices include gathering a wide range of data on students with manic episodes and depression, which would assist a specialist in diagnosing the students' health condition correctly. In this way, the instructors of those students can also identify them and take good care of them. The data collected from the students could be straightforward indications observed in them. Machine learning with Naive Bayes classification, the Random Forest classification algorithm, and the SVM algorithm is used to classify the gathered datasets and check whether a student is affected by bipolar illness or not. The performance of the algorithms on the disease data is calculated and compared. In addition, a sequential deep learning model is built using Keras. The simulation results show the efficacy of the classification techniques on the dataset, as well as the nature and complexity of the dataset used.
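
A hedged sketch of the modelling pipeline implied here: three scikit-learn classifiers compared on the same split, plus a small Keras Sequential network for a binary bipolar/not-bipolar target. The feature set, network size, and synthetic data are assumptions, not the authors' configuration.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.svm import SVC
from tensorflow import keras

# Synthetic stand-in for the students' symptom features (binary target).
X, y = make_classification(n_samples=600, n_features=12, random_state=1)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=1)

for name, clf in [("Naive Bayes", GaussianNB()),
                  ("Random Forest", RandomForestClassifier(random_state=1)),
                  ("SVM", SVC())]:
    print(name, clf.fit(X_tr, y_tr).score(X_te, y_te))

# Small sequential deep learning model (architecture is an assumption).
model = keras.Sequential([
    keras.Input(shape=(X.shape[1],)),
    keras.layers.Dense(32, activation="relu"),
    keras.layers.Dense(16, activation="relu"),
    keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(X_tr, y_tr, epochs=10, batch_size=32, verbose=0)
print("Keras test accuracy:", model.evaluate(X_te, y_te, verbose=0)[1])
```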

Usage of coot optimization-based random forests analysis for determining the shallow foundation settlement

  • Yi, Han; Xingliang, Jiang; Ye, Wang; Hui, Wang
    • Geomechanics and Engineering / v.32 no.3 / pp.271-291 / 2023
  • Settlement estimation in cohesive materials is a crucial topic because of the complexity of cohesive soil textures, which otherwise can only be addressed by approximate solutions. The goal of this research was to apply recently developed machine learning methods to predict the settlement (Sm) of shallow foundations from cohesive soil properties. The models include support vector regression (SVR) and random forests (RF) hybridized with the coot optimization algorithm (COM) and the black widow optimization algorithm (BWOA). The results indicate that all of the created systems accurately simulated Sm, with R2 better than 0.979 and 0.9765 for the training and testing phases, respectively, indicating high efficiency and a good correlation between the experimental and simulated Sm. The models' results outperformed those of ANFIS-PSO, and the COM-RF findings were markedly better than those reported in the literature. By analyzing the established designs from several angles, including various error criteria, Taylor diagrams, uncertainty analyses, and error distributions, it was possible to conclude that the recommended COM-RF was the best-performing approach for forecasting the Sm of shallow foundations, although the other techniques were also reliable.
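
The hybrid models tune Random Forest hyperparameters with a population-based metaheuristic and evaluate each candidate by cross-validated error. The coot optimizer's update equations are not reproduced here; the sketch below uses plain random search over a comparable hyperparameter space as a stand-in, so every name, range, and the synthetic data are assumptions.

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

# Synthetic stand-in for the shallow-foundation settlement data.
X, y = make_regression(n_samples=300, n_features=6, noise=0.2, random_state=3)
rng = np.random.default_rng(3)

def fitness(params):
    """Cross-validated R2 for one hyperparameter candidate (higher is better)."""
    model = RandomForestRegressor(n_estimators=int(params[0]),
                                  max_depth=int(params[1]),
                                  min_samples_leaf=int(params[2]),
                                  random_state=3)
    return cross_val_score(model, X, y, cv=5, scoring="r2").mean()

# Stand-in search loop: random candidates instead of the coot update rules.
bounds = np.array([[50, 400], [2, 20], [1, 10]])
best_params, best_score = None, -np.inf
for _ in range(20):
    cand = rng.uniform(bounds[:, 0], bounds[:, 1])
    score = fitness(cand)
    if score > best_score:
        best_params, best_score = cand, score

print("best hyperparameters:", np.round(best_params).astype(int), "CV R2:", round(best_score, 4))
```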

Estimation of frost durability of recycled aggregate concrete by hybridized Random Forests algorithms

  • Rui Liang; Behzad Bayrami
    • Steel and Composite Structures / v.49 no.1 / pp.91-107 / 2023
  • An effective approach to promoting sustainability within the construction industry is the use of recycled aggregate concrete (RAC) as a substitute for natural aggregates. Ensuring the frost resilience of RAC technologies is crucial to facilitate their adoption in regions characterized by cold temperatures. The main aim of this study was to use the Random Forests (RF) approach to forecast the frost durability of RAC in cold locations, expressed by the durability factor (DF) value. Three optimization algorithms, the sine-cosine optimization algorithm (SCA), the black widow optimization algorithm (BWOA), and the equilibrium optimizer (EO), were considered for determining optimal values of the RF hyperparameters. The findings show that all developed systems faithfully represented the DF, with R2 better than 0.9539 and 0.9777 for the training and testing phases, respectively. In both the learning and assessment stages, EO-RF was found to be superior to BWOA-RF and SCA-RF. The performance of the best model (EO-RF) also surpassed that of an ANN from the literature, raising the R2 values and reducing the RMSE values. Considering these justifications, together with the comparisons from the metrics and the Taylor diagram findings, it can be concluded that, although the other RF models were also reliable in predicting the frost durability of RAC based on the DF value in cold climates, the developed EO-RF strategy excelled them all.
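
The model ranking rests on R2 and RMSE for the training and testing phases. The short sketch below just computes those two metrics for hypothetical predictions so the comparison can be reproduced; the DF data and the EO/BWOA/SCA tuning loops are not included, and all values shown are illustrative.

```python
import numpy as np
from sklearn.metrics import mean_squared_error, r2_score

def report(name, y_true, y_pred):
    """Print the R2 and RMSE used to rank the hybrid RF models."""
    rmse = mean_squared_error(y_true, y_pred) ** 0.5
    print(f"{name}: R2 = {r2_score(y_true, y_pred):.4f}, RMSE = {rmse:.4f}")

# Hypothetical durability-factor predictions from two tuned RF variants.
rng = np.random.default_rng(4)
df_true = rng.uniform(40, 100, size=60)                    # durability factor values
report("EO-RF  (train)", df_true, df_true + rng.normal(0, 1.5, 60))
report("SCA-RF (train)", df_true, df_true + rng.normal(0, 4.0, 60))
```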

A new conjugate gradient method for dynamic load identification of airfoil structure with randomness

  • Lin J. Wang; Jia H. Li; You X. Xie
    • Structural Engineering and Mechanics / v.88 no.4 / pp.301-309 / 2023
  • In this paper, a new modified conjugate gradient (MCG) method based on a new gradient regularizer is presented and used to identify the dynamic load on an airfoil structure, both without and with consideration of random structural parameters. First, the newly proposed algorithm is proved to be efficient and convergent through rigorous mathematical theory and the numerical results of deterministic dynamic load identification. Second, using the perturbation method, the uncertain inverse problem of force reconstruction is transformed into a deterministic load identification problem. Lastly, the statistical characteristics of the identified load are evaluated by statistical methods. This newly proposed approach successfully solves both deterministic and uncertain inverse problems of dynamic load identification. Numerical simulations validate that the method developed in this paper is feasible and stable for load identification problems with and without random structural parameters, and that the observation error of the proposed algorithm in dynamic load identification for deterministic and random structures is mostly within 11.13% and 20%, respectively.
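
Dynamic load identification is an ill-posed linear inverse problem of the form G f = y, attacked here with a regularized conjugate gradient iteration. The sketch below applies the standard conjugate gradient method to the Tikhonov-regularized normal equations as a generic stand-in; the paper's particular gradient regularizer and the perturbation treatment of random parameters are not reproduced, and the impulse-response matrix is synthetic.

```python
import numpy as np

def cg_load_identification(G, y, alpha=1e-3, n_iter=200, tol=1e-10):
    """Conjugate gradient on the regularized normal equations
    (G^T G + alpha I) f = G^T y, a generic stand-in for the paper's MCG scheme."""
    A = G.T @ G + alpha * np.eye(G.shape[1])
    b = G.T @ y
    f = np.zeros(G.shape[1])
    r = b - A @ f
    p = r.copy()
    rs = r @ r
    for _ in range(n_iter):
        Ap = A @ p
        step = rs / (p @ Ap)
        f += step * p
        r -= step * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return f

# Toy convolution-type system: recover a load history from noisy responses.
rng = np.random.default_rng(5)
n = 100
G = np.tril(np.exp(-0.05 * np.subtract.outer(np.arange(n), np.arange(n))))  # causal impulse-response matrix
f_true = np.sin(np.linspace(0, 3 * np.pi, n))
y = G @ f_true + 0.01 * rng.standard_normal(n)
f_hat = cg_load_identification(G, y)
print("relative error:", np.linalg.norm(f_hat - f_true) / np.linalg.norm(f_true))
```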