• Title/Summary/Keyword: standardization algorithm


A Study on Scheduling Algorithm of WRR Method in Wireless Network (이동통신망에서 WRR 기법의 스케쥴링 알고리즘 연구)

  • Cho, Hae-Seong
    • Proceedings of the Korea Contents Association Conference / 2006.11a / pp.626-630 / 2006
  • In this paper, I propose an algorithm that implements a WRR-based scheduler in a mobile network and analyze its performance. The need to provide quality of service (QoS) for real-time applications in wireless networks has been driving research activities and standardization efforts for some time. In particular, there has been considerable research on scheduling algorithms for wireless environments. The BSW algorithm, a WRR method suited to wireless environments, was developed as a result of these efforts. However, the performance of the BSW algorithm is degraded by its implementation complexity in wireless environments, where fast scheduling is necessary. To solve this problem, this paper proposes a scheduling algorithm that reduces implementation complexity and improves performance, and analyzes the performance of the proposed algorithm.

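The weighted round-robin (WRR) discipline the paper builds on can be sketched as follows. This is a minimal illustration of plain WRR, not the paper's BSW variant or its complexity-reducing modification; the flow queues and weights are made-up examples:

```python
from collections import deque

def wrr_schedule(queues, weights, rounds):
    """Serve each flow's queue in proportion to its weight, round by round."""
    order = []
    for _ in range(rounds):
        for q, w in zip(queues, weights):
            for _ in range(w):       # flow i gets w_i slots per round
                if q:                # skip empty queues (work-conserving)
                    order.append(q.popleft())
    return order

flows = [deque(["a1", "a2", "a3", "a4"]), deque(["b1", "b2"])]
print(wrr_schedule(flows, [2, 1], 3))
```

With weights 2:1, flow A is served twice for every packet of flow B, which is the basic proportional-sharing property the scheduler variants in the paper refine for wireless channels.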

Database metadata standardization processing model using web dictionary crawling (웹 사전 크롤링을 이용한 데이터베이스 메타데이터 표준화 처리 모델)

  • Jeong, Hana;Park, Koo-Rack;Chung, Young-suk
    • Journal of Digital Convergence / v.19 no.9 / pp.209-215 / 2021
  • Data quality management is an important issue these days, and providing consistent metadata improves data quality. This study presents algorithms that facilitate standard word dictionary management for consistent metadata management. Algorithms are presented that automate the synonym management of database metadata through web dictionary crawling. They also improve data accuracy by resolving homonym ambiguities that can arise during the web dictionary crawling process. The algorithm proposed in this study increases the reliability of metadata quality compared to existing manual management, and reduces the time spent registering and managing synonym data. Further research on a partially automated data standardization model should follow, with a detailed analysis of which tasks in data standardization activities can be automated.
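The core lookup step of such a standard word dictionary can be sketched as below. The synonym table here is a hypothetical stand-in for entries that the paper's crawler would collect from a web dictionary, and the function names are illustrative, not the paper's:

```python
# Hypothetical synonym table, standing in for entries harvested
# by web dictionary crawling: variant term -> standard word.
SYNONYMS = {
    "customer": "CLIENT",
    "client": "CLIENT",
    "purchaser": "CLIENT",
}

def standardize(term, table):
    """Map a metadata term to its standard word; pass unknown terms through."""
    return table.get(term.lower(), term)

print(standardize("Customer", SYNONYMS))
print(standardize("vendor", SYNONYMS))
```

In the paper's setting, the table itself is what the crawler maintains automatically; homonym resolution would decide which standard word a crawled entry maps to before it is inserted.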

A Performance Comparison of Cluster Validity Indices based on K-means Algorithm (K-means 알고리즘 기반 클러스터링 인덱스 비교 연구)

  • Shim, Yo-Sung;Chung, Ji-Won;Choi, In-Chan
    • Asia pacific journal of information systems / v.16 no.1 / pp.127-144 / 2006
  • The K-means algorithm is widely used at the initial stage of data analysis in the data mining process, partly because of its low time complexity and the simplicity of its practical implementation. Cluster validity indices are used along with the algorithm to determine the number of clusters as well as to assess the clustering results of datasets. In this paper, we present a performance comparison of sixteen indices, selected from forty indices in the literature for their applicability to nonhierarchical clustering algorithms. The datasets used in the experiment are generated from multivariate normal distributions. In particular, four error types are considered in the comparison: standardization, outlier generation, error perturbation, and noise dimension addition. Through the experiment, the effects of varying the number of points, attributes, and clusters on performance are analyzed. The simulation results show that the Calinski and Harabasz index performs best across all datasets, and that the Davies and Bouldin index becomes a strong competitor as the number of points in a dataset increases.
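The winning index in that comparison, Calinski-Harabasz, is the ratio of between-cluster to within-cluster dispersion, normalized by degrees of freedom. A minimal one-dimensional sketch (the data and labels are made up; real use would apply it to K-means output across several values of k and pick the k with the highest score):

```python
def calinski_harabasz(points, labels):
    """Calinski-Harabasz index for a 1-D clustering (higher is better)."""
    n = len(points)
    clusters = set(labels)
    k = len(clusters)
    grand_mean = sum(points) / n
    between = within = 0.0
    for c in clusters:
        members = [p for p, l in zip(points, labels) if l == c]
        cm = sum(members) / len(members)
        between += len(members) * (cm - grand_mean) ** 2   # separation term
        within += sum((p - cm) ** 2 for p in members)      # compactness term
    return (between / (k - 1)) / (within / (n - k))

data = [1.0, 1.2, 0.8, 9.0, 9.2, 8.8]
print(calinski_harabasz(data, [0, 0, 0, 1, 1, 1]))
```

Two tight, well-separated clusters yield a large value; merging them into sloppier groups drives the index down, which is why it can select the number of clusters.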

Performance Evaluation of the JPEG DCT-based Progressive and Hierarchical Codings for Medical Image Communication

  • Ahn, C.B.;Lee, J.S.
    • Proceedings of the KOSOMBE Conference / v.1991 no.11 / pp.48-53 / 1991
  • The discrete cosine transform (DCT)-based progressive and hierarchical coding schemes developed by the International Organization for Standardization (ISO) Joint Photographic Experts Group (JPEG) are implemented and evaluated for application to medical image communication. For a series of head-section magnetic resonance images, a compression ratio of about 10 is obtained by the algorithm without noticeable image degradation.

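The transform at the heart of the JPEG schemes evaluated here is the type-II DCT; progressive modes transmit the low-frequency coefficients first. A one-dimensional 8-sample sketch (JPEG itself uses an 8x8 two-dimensional DCT plus quantization and entropy coding, which this omits; the sample values are arbitrary):

```python
import math

def dct_1d(block):
    """Orthonormal type-II DCT of a block (the core JPEG transform, in 1-D)."""
    n = len(block)
    out = []
    for k in range(n):
        s = sum(x * math.cos(math.pi * k * (2 * i + 1) / (2 * n))
                for i, x in enumerate(block))
        scale = math.sqrt(1 / n) if k == 0 else math.sqrt(2 / n)
        out.append(scale * s)
    return out

samples = [52, 55, 61, 66, 70, 61, 64, 73]
coeffs = dct_1d(samples)
# Progressive (spectral-selection) transmission would send the
# low-frequency coefficients of every block first:
print([round(c, 1) for c in coeffs[:3]])
```

Because the transform concentrates energy in the first few coefficients, an early partial scan already yields a recognizable image, which is what makes progressive delivery useful over slow medical-image links.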

The Propose of WAVE System Synchronization Algorithm Based on OFDM (OFDM 기반 WAVE 시스템 동기알고리즘 제안)

  • Oh, Se-Kab;Ryu, Ki-Hee;Kang, Heau-Jo
    • Journal of Advanced Navigation Technology / v.12 no.4 / pp.341-349 / 2008
  • In this paper, a Coefficient Tracking Synchronization Algorithm is proposed to compensate for the fast channel fluctuations of OFDM and to assess its suitability in the WAVE (Wireless Access in Vehicular Environments) system. Considering the IEEE 802.11p physical layer of the ongoing WAVE standardization process, where fading and frequency offset coexist on the OFDM channel, the system performance is evaluated and compared against the proposed Coefficient Tracking Synchronization Algorithm. The simulation results show that the proposed system improves channel estimation and offers additional efficiency compared to the existing method.


Two Version Latch Technique for Metadata Management of Documents in Digital Library (전자 도서관에서 문서의 메타데이타 관리를 위한 2 버전 래치 기법)

  • Jwa, Eun-Hee;Park, Seog
    • Journal of KIISE:Databases / v.29 no.3 / pp.159-167 / 2002
  • Recently, a major issue in metadata research has been the standardization of metadata formats. The extension capability of metadata introduced by standardization requires some changes: dynamic data must be stored and managed consistently. In this paper, we define the characteristics of this new metadata and propose a concurrency control technique called Two Version Latch (2VL). 2VL uses a latch and maintains two versions; doing so minimizes conflicts between read and write operations, and removing unnecessary lock holding minimizes refresh latency. As a result, the algorithm delivers fast response times and retrieval of recent data for read operations. Performance evaluation shows that the 2VL algorithm outperforms other algorithms in a metadata management system.
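The central idea, that readers never block because a committed version is always available while a writer prepares the next one under a short-term latch, can be sketched as follows. This is a simplified illustration of two-version concurrency in general, not the paper's full 2VL protocol:

```python
import threading

class TwoVersionItem:
    """Sketch of a two-version metadata item: reads never block on writes."""

    def __init__(self, value):
        self._committed = value          # version always visible to readers
        self._latch = threading.Lock()   # short-term latch serializing writers

    def read(self):
        # Readers take no lock at all; they see the last committed version.
        return self._committed

    def write(self, new_value):
        with self._latch:                # writers hold the latch only briefly
            shadow = new_value           # build the second (uncommitted) version
            self._committed = shadow     # single reference swap commits it

doc = TwoVersionItem({"title": "old"})
doc.write({"title": "new"})
print(doc.read()["title"])
```

Readers that started before the swap keep a reference to the old version, so a long read and a concurrent update never conflict, which is the source of the fast response times reported in the paper.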

An Automatic LOINC Mapping Framework for Standardization of Laboratory Codes in Medical Informatics (의료 정보 검사코드 표준화를 위한 LOINC 자동 매핑 프레임웍)

  • Ahn, Hoo-Young;Park, Young-Ho
    • Journal of Korea Multimedia Society / v.12 no.8 / pp.1172-1181 / 2009
  • An electronic medical record (EMR) system is a medical system in which all tests are recorded as text data. However, domestic EMR systems use various forms of medical records. There has been much related work on standardizing laboratory codes to LOINC (Logical Observation Identifiers Names and Codes), but the existing research resolves the problem manually, and manual processing does not work when the volume of data is enormous. It also relies on file systems, which are not suitable for large volumes of medical data. This paper proposes a novel automatic LOINC mapping algorithm that uses indexing techniques and semantic similarity analysis of medical information. In contrast to existing research, which only proposes algorithms, we designed and implemented a mapping algorithm for standardizing laboratory codes in medical informatics, making the automatic generation of search words possible. Moreover, we implemented a medical search framework based on a database system designed with large volumes of medical data in mind.

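The matching step of such a mapper can be sketched with a simple word-overlap similarity against a code table. The LOINC entries below are real codes, but the similarity measure and threshold are illustrative stand-ins for the paper's indexing and semantic analysis:

```python
# Hypothetical fragment of a LOINC table: code -> long common name.
LOINC = {
    "2345-7": "Glucose [Mass/volume] in Serum or Plasma",
    "718-7": "Hemoglobin [Mass/volume] in Blood",
}

def similarity(a, b):
    """Word-overlap (Jaccard) similarity between two test descriptions."""
    wa, wb = set(a.lower().split()), set(b.lower().split())
    return len(wa & wb) / len(wa | wb)

def map_to_loinc(local_name, table, cutoff=0.2):
    """Return the LOINC code whose name is most similar to a local lab term."""
    best_code, best_score = None, cutoff
    for code, name in table.items():
        score = similarity(local_name, name)
        if score > best_score:
            best_code, best_score = code, score
    return best_code  # None if nothing clears the cutoff

print(map_to_loinc("serum glucose", LOINC))
```

A production mapper would index the table for speed and weight clinically significant tokens (analyte, specimen, units) rather than treating all words equally.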

Analysis of Attacks and Security Level for Multivariate Quadratic Based Signature Scheme Rainbow (다변수 이차식 기반 서명 기법 Rainbow의 공격 기법 및 보안강도 분석)

  • Cho, Seong-Min;Kim, Jane;Seo, Seung-Hyun
    • Journal of the Korea Institute of Information Security & Cryptology / v.31 no.3 / pp.533-544 / 2021
  • Using Shor's algorithm, factoring and the discrete logarithm problem can be solved efficiently, so public key cryptosystems based on these problems, such as RSA and ECC, can be broken in polynomial time. NIST has been conducting a PQC (Post-Quantum Cryptography) standardization process to select quantum-resistant public key cryptography. Multivariate quadratic based signature schemes, one class of PQC candidates, are suitable for IoT devices with limited resources due to their short signatures and fast signing and verification. We analyze classical and quantum attacks on Rainbow, the only multivariate quadratic based signature scheme to reach round 3 of the process. We also compute the attack complexity for the round 3 Rainbow parameters and analyze the security level of Rainbow, one of the PQC standardization candidates.

Analysis and Design of the Efficient Consolidated Transportation System Model (효율적인 공동 수.배송 시스템 모델의 분석 및 설계)

  • Lee, Myeong-Ho
    • IE interfaces / v.18 no.1 / pp.1-9 / 2005
  • A new logistics concept is needed that maximizes customer service and flexibility by sharing information between suppliers and consumers, and by shifting from a function-oriented to a process-oriented approach. As in many other industries, communication and data-manipulation technologies have driven systematic change in the logistics industry, and one of the biggest changes ahead is consolidated transportation. To improve this poorly structured logistics environment, it is essential to develop an integrated logistics information system encompassing consolidated transportation, a framework, standardization, and data integration. However, no party has yet led a nationwide improvement of logistics or produced the right analysis and design for it, so a successful nationwide logistics model does not yet exist. This paper provides parties that consider efficient consolidated transportation as their business model with guidelines for a logistics information system so that they can be competitive in the market. It also helps companies collect user requirements for efficient consolidated transportation and utilize them in development. Finally, this paper derives the design of an algorithm for efficient consolidated transportation.

An effective error resilience coding of MPEG-4 video stream using DMB system (DMB를 통한 MPEG-4 비디오 스트림의 효율적인 오류 내성부호화 방안)

  • 백선혜;나남웅;홍성훈;이봉호;함영권
    • Proceedings of the IEEK Conference / 2003.07e / pp.2060-2063 / 2003
  • The terrestrial DMB (Digital Multimedia Broadcasting) system, now under standardization in Korea, offers multimedia broadcasting services in mobile environments and uses Eureka-147 DAB (Digital Audio Broadcasting) as its transmission method. DMB also provides error protection through convolutional coding. In this paper, we study effective error-resilience coding of MPEG-4 video streams over the DMB system. In our algorithm, we first partition the data using the MPEG-4 data partitioning method, and then control the convolutional coding rate according to the importance of each partition. Our simulation results show that the algorithm is suitable for terrestrial DMB services.

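The unequal error protection idea, stronger (lower-rate) convolutional coding for the more important MPEG-4 partitions, can be sketched as a simple rate assignment. The partition names and rate values below are illustrative assumptions, not the paper's measured configuration:

```python
# Hypothetical rate assignment: headers and motion vectors matter more
# for decodability than texture, so they get the lower (stronger) rate.
RATES = {"header": "1/2", "motion": "1/2", "texture": "3/4"}

def protect(partitions, rates):
    """Tag each MPEG-4 data partition with its convolutional code rate."""
    return [(name, rates[name]) for name in partitions]

print(protect(["header", "motion", "texture"], RATES))
```

Losing texture data degrades picture quality, while losing headers or motion vectors can make a whole frame undecodable, which is the rationale for spending the channel-coding budget unevenly.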