• Title/Summary/Keyword: BIG4

Search Results: 3,614

Study on Structure and Principle of Linear Block Error Correction Code (선형 블록 오류정정코드의 구조와 원리에 대한 연구)

  • Moon, Hyun-Chan;Kal, Hong-Ju;Lee, Won-Young
    • The Journal of the Korea institute of electronic communication sciences
    • /
    • v.13 no.4
    • /
    • pp.721-728
    • /
    • 2018
  • This paper introduces various linear block error correction codes and compares the performance of their correction circuits. As the risk of bit errors caused by power noise has grown, ECC (Error Correction Code) has been adopted to prevent them. Two representative ECC structures are SEC-DED (Single Error Correction, Double Error Detection) and SEC-DED-DAEC (Single Error Correction, Double Error Detection, Double Adjacent Error Correction). According to the simulation results, the SEC-DED circuit has the advantages of smaller area and shorter delay time than the SEC-DED-DAEC circuits. Between the SEC-DED-DAEC codes, Dutta's and Pedro's show little difference in performance; however, Pedro's code is more efficient than Dutta's because its correction rate is higher.
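
As a rough illustration of the SEC-DED principle (my sketch, not the paper's circuits): an extended Hamming(8,4) code corrects any single-bit error via the syndrome and uses an overall parity bit to detect, but not correct, double errors. DAEC variants such as Dutta's and Pedro's require a differently constructed parity-check matrix and are not shown.

```python
# Extended Hamming(8,4) SEC-DED sketch: 4 data bits -> 8-bit codeword.
def encode(d):
    # Parity bits cover the standard Hamming(7,4) position groups.
    p1 = d[0] ^ d[1] ^ d[3]
    p2 = d[0] ^ d[2] ^ d[3]
    p3 = d[1] ^ d[2] ^ d[3]
    cw = [p1, p2, d[0], p3, d[1], d[2], d[3]]
    cw.append(sum(cw) % 2)        # overall parity for double-error detection
    return cw

def decode(cw):
    c = cw[:7]
    # Syndrome bits recompute each parity group; the syndrome value
    # is the 1-based position of a single-bit error.
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 + 2 * s2 + 4 * s3
    parity_ok = sum(cw) % 2 == 0
    if syndrome == 0 and parity_ok:
        status = "no error"
    elif not parity_ok:           # odd number of flips: assume single, correct it
        if 1 <= syndrome <= 7:
            c[syndrome - 1] ^= 1
        status = "corrected"
    else:                         # nonzero syndrome but even parity: two flips
        status = "double error detected"
    return [c[2], c[4], c[5], c[6]], status

cw = encode([1, 0, 1, 1])
flipped = cw[:]
flipped[4] ^= 1                   # simulate a single-bit upset
data, status = decode(flipped)
```

After the single flip, `decode` recovers the original data bits and reports the correction; flipping two bits instead makes the parity check come out even while the syndrome is nonzero, which is flagged but not corrected.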

Outlier detection using Grubb test and Cochran test in clinical data (그럽 및 코크란 검정을 이용한 임상자료의 이상치 판단)

  • Sohn, Ki-Cheul;Shin, Im-Hee
    • Journal of the Korean Data and Information Science Society
    • /
    • v.23 no.4
    • /
    • pp.657-663
    • /
    • 2012
  • Survey data in various fields often contain very small and/or very large values that fall outside the normal range. Such outliers arise for two reasons: errors made during data entry, and unusual responses from respondents. If the data contain outliers, summary statistics such as the mean and the variance give misleading information, so researchers should be careful to detect outliers in their data. This is a particularly important problem in clinical fields, where the cost of experiments is very high. This article introduces the Grubbs test and the Cochran test for detecting outliers in data and applies these methods to clinical data.
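
The Grubbs statistic itself is simple to compute: G = max|x_i - x̄| / s, compared against a tabulated critical value. A standard-library sketch with hypothetical clinical measurements follows; the critical value 2.29 is the published two-sided 5% value for N = 10.

```python
import statistics

def grubbs_statistic(data):
    """Return (G, index of the most extreme point) for the Grubbs test."""
    m = statistics.mean(data)
    s = statistics.stdev(data)    # sample standard deviation (n - 1)
    idx = max(range(len(data)), key=lambda i: abs(data[i] - m))
    return abs(data[idx] - m) / s, idx

# Hypothetical clinical measurements with one suspicious value.
values = [4.2, 4.5, 3.9, 4.1, 4.3, 4.0, 4.4, 4.2, 4.1, 9.8]
G, i = grubbs_statistic(values)

# Published two-sided critical value for N = 10 at alpha = 0.05 is about 2.29;
# G above it flags values[i] as an outlier.
G_CRIT_N10 = 2.29
is_outlier = G > G_CRIT_N10
```

Here G comes out near 2.83, well above the critical value, so the value 9.8 would be flagged. In practice the critical value is looked up (or computed from the t-distribution) for the actual N and significance level.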

Attitude of Western Medicine, Korean Medicine, and Nursing Students toward the East-West Collaborative Medical Practices (한.양방협진에 대한 의.한의.간호대학생의 태도비교)

  • Jeong, Ihn-Sook;Lim, Byung-Mook;Lee, Won-Chul
    • Journal of Society of Preventive Korean Medicine
    • /
    • v.14 no.1
    • /
    • pp.25-35
    • /
    • 2010
  • Objectives: This study aimed to investigate the attitudes of western medicine (WM), Korean medicine (KM), and nursing students toward east-west collaborative medical practices (EWCMP). Methods: The participants were 185 WM students, 123 KM students, and 230 nursing students from two universities (P and D) in Busan metropolitan city, Korea. Data were collected with self-administered questionnaires and analyzed with descriptive statistics, the chi-square test, the t-test, and ANOVA using SPSS win 14.0. Results: Of the 538 participants, 87.1% had heard of EWCMP. The preferred type of EWCMP differed significantly by the participants' backgrounds: WM students preferred western medical treatment with minor supportive Korean medical care (85.5%), KM students favored EWCMP with equal weight on medical and Korean medical treatment (59.0%), and nursing students fell between the two groups. Overall, 67.4% intended to recommend EWCMP to consumers, but this also differed widely between WM students and the others: 37.3% of WM students, 89.4% of KM students, and 83.9% of nursing students. WM students held more negative opinions on EWCMP than KM and nursing students. Conclusions: The attitudes of WM, KM, and nursing students toward EWCMP closely resembled those of WM doctors, KM doctors, and nurses, respectively, and WM students differed greatly in overall attitude from KM and nursing students. Introducing a joint curriculum or exchange programs between WM and KM schools is recommended.

Study on Usability and Structure of the Mechanical Pencil (샤프펜슬의 사용편의성과 그 구조에 관한 연구)

  • 윤형건
    • Archives of design research
    • /
    • v.15 no.3
    • /
    • pp.63-72
    • /
    • 2002
  • I defined three main evaluation criteria for the usability of the mechanical pencils we use every day: first, the usability expected from visual inspection; second, the feel when grasping; and third, the feel when writing. I then selected the criteria that affect usability and verified the interactions between them through an actual test with 36 samples. The test showed little interaction between overall usability and the intuitive impression formed by visual inspection, because the usability expected from inspection was higher than the actual usability. In contrast, there was a strong interaction between the feel when grasping and the feel when writing. The mechanical pencil with good usability has a long overall length (169 mm), a big thickness ($\Phi$8.5 mm), a slightly heavy weight (21.4 g), a non-circular cross section, transverse grooves on the surface held by the fingers, and a glossy body.


Proposal For Improving Data Processing Performance Using Python (파이썬 활용한 데이터 처리 성능 향상방법 제안)

  • Kim, Hyo-Kwan;Hwang, Won-Yong
    • The Journal of Korea Institute of Information, Electronics, and Communication Technology
    • /
    • v.13 no.4
    • /
    • pp.306-311
    • /
    • 2020
  • This paper deals with how to improve the performance of the Python language, with its various libraries, when developing a model using big data. Python uses the Pandas library for processing spreadsheet-format data such as Excel files. Python processes data on an in-memory basis, so there is no performance issue with small-scale data, but performance problems arise with large-scale data. Therefore, this paper introduces a method for distributing execution tasks over a single cluster or multiple clusters using the Dask library, which can be used together with Pandas. The experiment compares the speed of processing a simple exponential model using Pandas alone with the speed of processing using Dask together with Pandas on hardware of the same specification. This paper presents a method to develop a model that distributes large-scale data across CPU cores for performance while retaining Python's advantage of easy access to various libraries.
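
Dask itself is not needed to see the idea: split the data into partitions and schedule the same computation over them. The standard-library sketch below (with a made-up exponential model) illustrates the partition-and-map pattern that Dask automates; real CPU-bound work would use processes rather than threads, which Dask also handles.

```python
import math
from concurrent.futures import ThreadPoolExecutor

def exp_model(chunk):
    # A simple exponential model applied element-wise, standing in for
    # the per-partition Pandas work described in the paper.
    return [math.exp(0.001 * x) for x in chunk]

def process_partitioned(data, n_partitions=4):
    """Split data into partitions and map the model over them in parallel,
    mimicking how Dask schedules work per partition."""
    size = math.ceil(len(data) / n_partitions)
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    with ThreadPoolExecutor(max_workers=n_partitions) as pool:
        results = pool.map(exp_model, chunks)   # order-preserving
    return [y for part in results for y in part]

out = process_partitioned(list(range(1000)))
```

With Dask, the same shape of code becomes a `dask.dataframe` partitioned across CPU cores or cluster nodes, with the scheduler deciding placement; the partition count plays the role of `n_partitions` here.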

Algorithm for Extract Region of Interest Using Fast Binary Image Processing (고속 이진화 영상처리를 이용한 관심영역 추출 알고리즘)

  • Cho, Young-bok;Woo, Sung-hee
    • Journal of the Korea Institute of Information and Communication Engineering
    • /
    • v.22 no.4
    • /
    • pp.634-640
    • /
    • 2018
  • In this paper, we propose an automatic extraction algorithm for regions of interest (ROI) in medical x-ray images. The proposed algorithm uses segmentation, feature extraction, and reference-image matching to detect lesion sites in the input image. The extracted region is matched against lesion images in a reference DB, and the matched results are refined automatically using Kalman-filter-based fitness feedback. To extract the growth plate, the algorithm extracts the contour of the left hand from the input left-hand x-ray image and creates candidate regions using multi-scale Hessian-matrix-based segmentation. As a result, the proposed algorithm segmented the image rapidly, in 0.02 seconds, during the ROI segmentation phase; extracting the ROI from the segmented image took 0.53 seconds, and the reinforcement phase performed very accurate image segmentation in 0.49 seconds.
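
The paper's full pipeline (Hessian-based candidate regions, Kalman-filter feedback) is beyond a short sketch, but the fast binarize-then-extract-ROI step it builds on can be illustrated in plain Python. The 5x6 "x-ray" below is a hypothetical toy image, not the paper's data.

```python
def binarize(image, threshold):
    """Threshold a grayscale image (list of rows) into a 0/1 mask."""
    return [[1 if px >= threshold else 0 for px in row] for row in image]

def bounding_box(mask):
    """Smallest (top, left, bottom, right) box containing all foreground pixels."""
    rows = [r for r, row in enumerate(mask) if any(row)]
    cols = [c for c in range(len(mask[0])) if any(row[c] for row in mask)]
    if not rows:
        return None
    return rows[0], cols[0], rows[-1], cols[-1]

# Toy 5x6 "x-ray": a bright 2x3 blob on a dark background.
img = [[10] * 6 for _ in range(5)]
for r in (1, 2):
    for c in (2, 3, 4):
        img[r][c] = 200

roi = bounding_box(binarize(img, 128))
```

The returned box `(1, 2, 2, 4)` tightly encloses the bright blob; a real implementation would operate on NumPy arrays and feed such candidate boxes into the matching and feedback stages.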

Methods for Investigating of Edit History about MS PowerPoint Files That Using the OOXML Formats (OOXML형식을 사용하는 MS 파워포인트 파일에 대한 편집 이력 조사 방법)

  • Youn, Ji-Hye;Park, Jung-Heum;Lee, Sang-Jin
    • The KIPS Transactions:PartC
    • /
    • v.19C no.4
    • /
    • pp.215-224
    • /
    • 2012
  • Today, individuals and businesses do much of their paperwork on computers, so many documents are created in digital form, and these files are copied and moved through various media such as USB drives and e-mail. Careful analysis of these digital materials can trace the editing history of a document. Previous research covers the compound document file format, but the new OOXML format has not been studied: how to analyze the linkage between different document files, track their internal order, and find unsaved files to identify the process by which a file was created. The use of OOXML-format digital documents will continue to grow, so the ability to trace document work history would be a big help in digital forensic investigations. Therefore, from a forensic viewpoint, this paper shows how to track the internal order of the new OOXML format and analyze the linkage between files.
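
What makes this kind of inspection possible is that OOXML packages are ordinary ZIP containers, so their internal parts can be listed with the Python standard library alone. A minimal sketch follows; the stand-in package built below is hypothetical and far simpler than a real .pptx.

```python
import io
import zipfile

def list_slides(pptx_bytes):
    """List slide part names inside an OOXML (.pptx) package.

    OOXML files are ZIP containers, so zipfile can enumerate the XML
    parts that forensic analysis inspects for ordering and linkage."""
    with zipfile.ZipFile(io.BytesIO(pptx_bytes)) as z:
        return sorted(n for n in z.namelist()
                      if n.startswith("ppt/slides/") and n.endswith(".xml"))

# Build a minimal stand-in package (a real .pptx has many more parts,
# including the presentation.xml relationships that record slide order).
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as z:
    z.writestr("[Content_Types].xml", "<Types/>")
    z.writestr("ppt/slides/slide1.xml", "<p:sld/>")
    z.writestr("ppt/slides/slide2.xml", "<p:sld/>")

slides = list_slides(buf.getvalue())
```

An investigator would go further and parse the relationship XML to recover the displayed slide order, since part names alone need not match it.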

Observation of morphological change of paddy rice under the condition of deep ploughing and heavy fertilization (심경다비료재배조건하에서 수도의 형태변화에 관한 연구)

  • Young-Chul Chang;Su-Bong Park
    • KOREAN JOURNAL OF CROP SCIENCE
    • /
    • v.3
    • /
    • pp.83-87
    • /
    • 1965
  • This experiment was conducted in 1963 at Seoul to observe the morphological changes of paddy rice plants cultivated with deep ploughing and heavy fertilization. Thirty-five-day-old seedlings were transplanted on June 1st into galvanized iron pots with a bottom 20 cm in diameter, painted white inside and filled with sand mixed with fertilizers. The treatments were a 15 cm soil depth with normal fertilization, 30 cm with double fertilization, and 45 cm with triple fertilization, with three replications. The main stem was observed in the same pot on Aug. 6 before heading, Sep. 12 after heading, and Oct. 17 at harvest. The results are as follows. The length and width of the upper leaf blades on the main stem tended to be bigger and more vigorous with deep ploughing and heavy fertilization (Fig. 1, 2, 3 and 4). The number and size of vascular bundles of the main stem increased when the paddy rice was cultivated with deep ploughing and heavy fertilization (Fig. 5). The number and weight of roots of the main stem also increased with deep ploughing and heavy fertilization (Fig. 6).


An application plan of NSWC-98/LE1 when predicting the reliability of mechanical components of design and development phase (체계 개발 단계별 기계 부품에 대한 신뢰도 예측 시 NSWC-98-LE1 적용 방안)

  • Kwon, Ki Sang;Park, Eun Sim;Cho, Cha Hyun;Lee, Dong Woo;Lee, Su Jung
    • Journal of the Korean Society of Systems Engineering
    • /
    • v.4 no.1
    • /
    • pp.35-43
    • /
    • 2008
  • Generally, in reliability analysis during the design and development phase, the reliability of electrical components is analyzed using standards such as MIL-HDBK-217F and Bellcore Issue 4, 5, 6, based on the architectural stresses of the weapon system (power, voltage, current, and component quality level) and the operational stresses (operational environment, operational temperature, operational profile). The reliability of mechanical components, however, is analyzed from NPRD-95 (Nonelectronic Parts Reliability Data-95), a data book of mechanical-part failure histories, without any such stress analysis. Yet even identical mechanical parts can have different failure rates (fatigue, wear, corrosion) during operation depending on how heavily stressed they are (pressure, vibration, temperature during operation). Analyzing reliability from the data book alone, rather than from the stresses the parts experience, the operational concept, and other factors, can therefore produce large differences in the predicted reliability. Thus, this paper guides reliability prediction by organizing prediction methods per system-development phase and adopting NSWC-98/LE1 (Naval Surface Warfare Center-98/LE1) for mechanical components.
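
NSWC-98/LE1 computes each part's failure rate from a base rate scaled by stress-dependent factors that vary per part class; once per-part rates are known, the system roll-up is simple arithmetic, sketched below. All part names and rates are hypothetical.

```python
def series_failure_rate(rates):
    """Failure rate of a series system: the part rates simply add,
    since any part failing fails the system."""
    return sum(rates)

def mtbf(rate):
    """Mean time between failures for a constant (exponential) failure rate."""
    return 1.0 / rate

# Hypothetical per-part failure rates in failures per million hours,
# as they might come out of a stress-adjusted prediction.
parts = {"bearing": 3.2, "seal": 1.1, "spring": 0.4}

lam = series_failure_rate(parts.values())   # failures per 1e6 hours
hours = mtbf(lam) * 1e6                     # system MTBF in hours
```

The point of the stress-based handbook is that the per-part inputs to this roll-up change with pressure, vibration, and temperature, so the same parts list can yield very different system MTBF figures under different operating profiles.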


Data Transmitting and Storing Scheme based on Bandwidth in Hadoop Cluster (하둡 클러스터의 대역폭을 고려한 압축 데이터 전송 및 저장 기법)

  • Kim, Youngmin;Kim, Heejin;Kim, Younggwan;Hong, Jiman
    • Smart Media Journal
    • /
    • v.8 no.4
    • /
    • pp.46-52
    • /
    • 2019
  • The size of the data generated and collected at industrial sites and in public institutions is growing rapidly. Existing data processing servers often handle the increasing data by scaling up to improve performance, but in the big data era, when the speed of data generation is exploding, there is a limit to what a conventional server can process. To overcome this limitation, distributed cluster computing systems have been introduced that distribute data in a scale-out manner. However, because such systems distribute data, inefficient use of network bandwidth can degrade the performance of the cluster as a whole. In this paper, we propose a scheme that compresses data before transmission in a Hadoop cluster in consideration of the network bandwidth. The proposed scheme considers the network bandwidth and the characteristics of the compression algorithms and selects the optimal compression and transmission scheme before transmitting. Experimental results show that the proposed scheme reduces both the data transfer time and the data size.
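
The selection step can be sketched with the standard library: for each codec, estimate compression time plus transfer time at the measured bandwidth, and pick the cheapest. This is an illustrative stand-in, not the paper's implementation; the codec set and bandwidth figure are assumptions.

```python
import bz2
import gzip
import lzma
import time

def pick_codec(payload, bandwidth_bytes_per_s):
    """Choose the codec minimizing estimated total transfer cost:
    compression time + compressed size / bandwidth."""
    codecs = {"gzip": gzip.compress, "bz2": bz2.compress, "lzma": lzma.compress}
    best, best_cost = None, float("inf")
    for name, compress in codecs.items():
        start = time.perf_counter()
        size = len(compress(payload))
        cost = (time.perf_counter() - start) + size / bandwidth_bytes_per_s
        if cost < best_cost:
            best, best_cost = name, cost
    return best

# Highly repetitive sample payload, as log/CSV data in a cluster often is.
sample = b"sensor,value\n" * 10000
codec = pick_codec(sample, bandwidth_bytes_per_s=1_000_000)
```

On a slow link the slower-but-tighter codecs tend to win, while on a fast link a quick codec (or no compression) does; measuring bandwidth before choosing, as the paper proposes, is what makes the selection adaptive.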