• Title/Summary/Keyword: massive

Computerized Multiple 15-hue tests for Quantifying Color Vision Acuity (색각 능력의 정량적 평가를 위한 전산화된 다중 15-색상 배열 검사법)

  • Ko S.T.;Hong S.C.;Choi M.J.
    • Journal of Biomedical Engineering Research
    • /
    • v.21 no.3 s.61
    • /
    • pp.321-331
    • /
    • 2000
  • Multiple 15-hue tests were designed and implemented on a PC in this study to quickly and quantitatively evaluate color vision acuity. The difficulty of each test was controlled by the value of CDBACC (color difference between adjacent color chips), calculated using a CIELAB formula. The multiple 15-hue tests consist of eight hue tests (tests 3-10) and three basic color (red, green, blue) tests (tests 11-13). The 15 colors used for the hue tests were specified by 15 color coordinates located at a constant distance (d = 2, 3, 5, 7, 10, 20, 30, 40) from the white reference in the CIE chromaticity coordinate system and separated by a constant color difference (CDBACC = 0.75, 1.1, 1.8, 2.5, 3.5, 7.5, 11, 14) from the adjacent chips. The color coordinates of the 15 chips for the basic color tests were those of 15 points spaced equally, by a constant color difference (6.87 for the green color test, 7.27 for the red color test, 7.86 for the blue color test), from the white reference along the red, green, and blue axes. Thirty normal subjects who were not color blind underwent the multiple 15-hue tests. Most of the subjects correctly arranged the color chips for tests with CDBACC greater than 5, whereas no one answered correctly for tests with CDBACC less than 2. The number of subjects who arranged the chips correctly changed rapidly when the CDBACC of the test was between 2 and 4.5. In the basic color tests, the subjects arranged the color chips even less correctly than in hue tests with similar CDBACC values. The JNCD (just noticeable color difference), a measure of color vision acuity, was about 3 on average for the subjects; the JNCD was defined as the CDBACC value of the test at which about 50% of the subjects failed to arrange the color chips successfully. The ERCCA (error rate of color chips arrangement) for the test whose CDBACC equaled the JNCD was about 20%. The multiple 15-hue tests implemented on a PC are expected to be an economical tool for quick, quantitative evaluation of color vision acuity, and can therefore be used for early screening of the massive population of potential patients with diseases (e.g., diabetes, glaucoma) that may induce changes in color vision acuity.
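
The abstract does not say which CIELAB difference formula defines CDBACC. As a minimal sketch, assuming the basic CIE76 metric (Delta E*ab) and hypothetical chip coordinates, the color difference between adjacent chips could be computed like this:

```python
import math

def cielab_delta_e(lab1, lab2):
    """CIE76 color difference (Delta E*ab) between two CIELAB colors.

    Each color is an (L*, a*, b*) tuple. The paper computes CDBACC with
    "a CIELAB formula"; the basic CIE76 metric used here is one plausible
    choice, not necessarily the authors' exact formula.
    """
    dL = lab1[0] - lab2[0]
    da = lab1[1] - lab2[1]
    db = lab1[2] - lab2[2]
    return math.sqrt(dL * dL + da * da + db * db)

# Hypothetical adjacent chips; a Delta E near 2.7 would fall inside the
# 2-4.5 band where the abstract reports arrangement accuracy changes rapidly.
chip_a = (60.0, 10.0, 5.0)
chip_b = (60.0, 12.5, 4.0)
print(cielab_delta_e(chip_a, chip_b))  # ~2.69
```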

  • PDF

X-tree Diff: An Efficient Change Detection Algorithm for Tree-structured Data (X-tree Diff: 트리 기반 데이터를 위한 효율적인 변화 탐지 알고리즘)

  • Lee, Suk-Kyoon;Kim, Dong-Ah
    • The KIPS Transactions:PartC
    • /
    • v.10C no.6
    • /
    • pp.683-694
    • /
    • 2003
  • We present X-tree Diff, a change detection algorithm for tree-structured data. Our work is motivated by the need to monitor a massive volume of web documents and detect suspicious changes, called defacement attacks, on web sites. In this context, our algorithm must be very efficient in both speed and memory use. X-tree Diff uses a special ordered labeled tree, the X-tree, to represent XML/HTML documents. X-tree nodes have a special field, tMD, which stores a 128-bit hash value representing the structure and data of the subtree, so that identical subtrees from the old and new versions can be matched. During this process, X-tree Diff applies the Rule of Delaying Ambiguous Matchings: it performs exact matching only where a node in the old version has a one-to-one correspondence with the corresponding node in the new version, delaying all the others, which drastically reduces the possibility of wrong matchings. X-tree Diff propagates such exact matchings upwards in Step 2, and obtains more matchings downwards from the roots in Step 3. In Step 4, the nodes to be inserted or deleted are decided. We also show that X-tree Diff runs in O(n), where n is the number of nodes in the X-trees, in the worst case as well as in the average case. This result is even better than that of the BULD Diff algorithm, which is O(n log(n)) in the worst case. We experimented with X-tree Diff on real data, about 11,000 home pages from about 20 web sites, instead of synthetic documents manipulated for experimentation. Currently, the X-tree Diff algorithm is being used in a commercial hacking detection system, WIDS (Web-Document Intrusion Detection System), which finds changes occurring in registered web sites and reports suspicious changes to users.
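
As a rough illustration of the tMD field and the Rule of Delaying Ambiguous Matchings, the sketch below uses hypothetical names (XTreeNode, compute_tmd, match_identical_subtrees) and MD5 purely because it yields a 128-bit digest as the abstract specifies; it is not the authors' implementation:

```python
import hashlib

class XTreeNode:
    """Minimal stand-in for an X-tree node (label, data, children, tMD)."""
    def __init__(self, label, data="", children=None):
        self.label = label
        self.data = data
        self.children = children or []
        self.tMD = None  # 128-bit digest of the subtree rooted here

def compute_tmd(node):
    """Bottom-up 128-bit subtree digest, analogous to the tMD field.

    Identical subtrees in the old and new versions yield identical digests,
    so they can be matched with a single hash-table lookup.
    """
    child_digests = b"".join(compute_tmd(c) for c in node.children)
    node.tMD = hashlib.md5(
        node.label.encode() + node.data.encode() + child_digests
    ).digest()
    return node.tMD

def match_identical_subtrees(old_root, new_root):
    """Pair unchanged subtrees between two versions by tMD value."""
    compute_tmd(old_root)
    compute_tmd(new_root)
    old_index = {}
    def collect(n):
        old_index.setdefault(n.tMD, []).append(n)
        for c in n.children:
            collect(c)
    collect(old_root)
    matches = []
    def scan(n):
        candidates = old_index.get(n.tMD, [])
        if len(candidates) == 1:
            matches.append((candidates[0], n))  # unambiguous: match now
        else:
            # Rule of Delaying Ambiguous Matchings: zero or many candidates,
            # so postpone this node and try its children instead.
            for c in n.children:
                scan(c)
    scan(new_root)
    return matches

# A defaced paragraph: only the unchanged subtree is matched immediately.
old = XTreeNode("html", children=[XTreeNode("p", "hello"), XTreeNode("p", "world")])
new = XTreeNode("html", children=[XTreeNode("p", "hello"), XTreeNode("p", "hacked")])
print(len(match_identical_subtrees(old, new)))  # 1
```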

Effect of Ethane 1,2-Dimethane Sulfonate(EDS) on the Apoptosis in the Rat Epididymis (흰쥐 부정소에서의 세포자연사에 미치는 Ethane 1,2-Dimethane Sulfonate(EDS)의 효과)

  • Son, Hyeok-Jun;Lee, Sung-Ho
    • Development and Reproduction
    • /
    • v.10 no.3
    • /
    • pp.203-209
    • /
    • 2006
  • Ethane 1,2-dimethane sulfonate (EDS), a toxin which specifically kills Leydig cells (LC), has been widely used to prepare the reversible testosterone (T) depletion rat model. Previous studies, including our own, clearly demonstrated dramatic weight loss of the T-dependent accessory sex organs such as the epididymis and seminal vesicle in these 'LC knock-out' rats. This weight loss could derive from massive and abrupt death of the cells via an apoptotic process. The present study was performed to test the effect of EDS administration on the expression of some apoptotic genes in the rat epididymis. Adult male Sprague-Dawley rats (300~350 g B.W.) were injected with a single dose of EDS (75 mg/kg, i.p.) and sacrificed on Weeks 0, 1, 2, 3, 4, 5, 6 and 7. Tissue weights and the numbers of epididymal sperm were measured. The transcriptional activities of bcl-2, bax, Fas and Fas ligand (Fas-L) were evaluated by semi-quantitative RT-PCR. As expected, the weights and sperm counts of the epididymis declined progressively after the EDS treatment during Weeks 1 and 2. These decrements ceased, with a gradual return towards normal during Weeks 5~7, although the maximal recoveries of the epididymal weights (71%) and sperm counts (38%) were still subnormal on Week 7. The initial level of bcl-2 transcripts persisted to Week 6, then rose significantly on Week 7. The level of bax transcripts decreased significantly on Week 6, and no remarkable change was found during the rest of the experimental period. The transcripts for Fas in the epididymis were elevated during Weeks 1~2, returned to normal on Week 3, and that level persisted to Week 7. Similarly, the level of Fas-L transcripts was elevated during Weeks 1~3 and returned to normal after Week 4. Our results demonstrate that the transient T depletion caused by EDS administration could induce changes in the expression of apoptotic genes in the rat epididymis. The activation of Fas and Fas-L in the epididymis of EDS-treated rats might be responsible for the initial apoptotic process and, consequently, the tissue damage and sperm loss. Future studies will attempt to determine the precise molecular mechanism(s) of apoptosis in the rat epididymis.

  • PDF

Characteristics of Biological Agents and Relevant Case Study (생물무기 특성과 사례연구)

  • Park, Minwoo;Kim, Hwami;Choi, Yeonhwa;Kim, Jusim
    • Journal of the Society of Disaster Information
    • /
    • v.13 no.4
    • /
    • pp.442-454
    • /
    • 2017
  • Biological weapons are manipulated and produced from microorganisms such as bacteria, viruses, rickettsiae, fungi, etc. They are classified as one of the Weapons of Mass Destruction (WMD), along with chemical and radiological weapons. A biological weapon has a number of operational advantages over the other WMDs, including ease of development and production, low cost, and the possibility of covert dissemination. In this study we analyze the history of biological weapons development and the existing biological threats. We then predict the social impact of a biological attack based on the physical properties of biological agents and their infection mechanisms. By analyzing the recognition and dispersion patterns of the agents and the characteristics of the diseases in historical biological weapon events such as the Sverdlovsk anthrax accident and the 2001 anthrax attacks, we found that a biological attack would likely not be recognized rapidly, would produce a large number of exposed people, and would increase the number of patients suffering from severe respiratory illness, leaving public health and medical service providers struggling under a huge burden. Based on these findings, we suggest the main public health capabilities required to respond efficiently to a bioterrorism event. Syndromic surveillance and other reporting systems need to operate efficiently so that any suspicious event is detected promptly, and the pathogen suspected to have been used should be identified through the laboratory diagnostic system. It is critical for the public health agency to define the potentially exposed population in close cooperation with law enforcement agencies. Lastly, massive prophylaxis should be provided rapidly to the people in need by deploying human and material resources efficiently. If these public health capacities are consistently fortified, we will be able to deal with the threat of bioterrorism successfully.

An Analysis of IT Trends Using Tweet Data (트윗 데이터를 활용한 IT 트렌드 분석)

  • Yi, Jin Baek;Lee, Choong Kwon;Cha, Kyung Jin
    • Journal of Intelligence and Information Systems
    • /
    • v.21 no.1
    • /
    • pp.143-159
    • /
    • 2015
  • Predicting IT trends has long been an important subject for information systems research. IT trend prediction makes it possible to recognize emerging eras of innovation and to allocate budgets in preparation for rapidly changing technological trends. Towards the end of each year, various domestic and global organizations predict and announce IT trends for the following year. For example, Gartner predicts the top 10 IT trends for the next year; these predictions affect IT and industry leaders and organizations' basic assumptions about technology and the future of IT, but the accuracy of such reports is difficult to verify. Social media data can be a useful tool for verifying that accuracy. As social media services have gained popularity, they are used in a variety of ways, from posting about personal daily life to keeping up to date with news and trends. In recent years, rates of social media activity in Korea have reached unprecedented levels. Hundreds of millions of users now participate in online social networks and communicate their opinions and thoughts to colleagues and friends. In particular, Twitter is currently the major microblog service; its core function, the 'tweet', lets users report their current thoughts and actions, comment on news, and engage in discussions. For the analysis of IT trends we chose tweet data, because it not only produces massive unstructured textual data in real time but also serves as an influential channel for opinion leading on technology. Previous studies found that tweet data provides useful information and detects societal trends effectively; these studies also identified that Twitter can track issues faster than other media such as newspapers. Therefore, this study investigates how frequently the predicted IT trends for the following year, as announced by public organizations, are mentioned on social network services like Twitter. IT trend predictions for 2013, announced near the end of 2012 by two domestic organizations, the National IT Industry Promotion Agency (NIPA) and the National Information Society Agency (NIA), were used as the basis for this research. The present study analyzes Twitter data generated in Seoul (Korea) and compares it with the predictions of the two organizations. Twitter data analysis requires various natural language processing techniques, including stop-word removal and noun extraction, to process the many unrefined forms of unstructured data. To overcome these challenges, we used SAS IRS (Information Retrieval Studio), developed by SAS to capture trends by processing big streaming Twitter datasets in real time. The system offers a framework for crawling, normalizing, analyzing, indexing and searching tweet data. As a result, we crawled the entire Twitter sphere in the Seoul area and obtained 21,589 tweets in 2013 to review how frequently the IT trend topics announced by the two organizations were mentioned by people in Seoul. The results show that most IT trends predicted by NIPA and NIA were frequently mentioned on Twitter, except for some topics such as 'new types of security threat', 'green IT' and 'next generation semiconductor'; since these topics are not generalized compound words, they may be mentioned on Twitter in other words.
To answer whether the IT trend tweets from Korea are related to the following year's IT trends in the real world, we compared Twitter's trending topics with those in Nara Market, Korea's online e-Procurement system, a nationwide web-based system dealing with the whole procurement process of all public organizations in Korea. The correlation analysis shows that tweet frequencies for the IT trend topics predicted by NIPA and NIA are significantly correlated with the frequencies of IT topics mentioned in project announcements by Nara Market in 2012 and 2013. The main contributions of our research are the following: i) the IT topic predictions announced by NIPA and NIA can provide an effective guideline for IT professionals and researchers in Korea who are looking for verified IT topic trends for the following year; ii) researchers can use Twitter to get useful ideas for detecting and predicting dynamic trends in technological and social issues.
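
As one way to picture the pipeline described above (counting topic mentions in tweets, then correlating them with Nara Market announcement counts), here is a small sketch; the topic list, tweets, and procurement counts are hypothetical stand-ins, not the study's data, and a plain Pearson correlation is assumed:

```python
import math
from collections import Counter

# Hypothetical topic keywords standing in for the NIPA/NIA predictions.
TOPICS = ["big data", "cloud computing", "internet of things", "green it"]

def topic_frequencies(tweets, topics):
    """Count how many tweets mention each predicted IT-trend topic."""
    counts = Counter()
    for text in tweets:
        lowered = text.lower()
        for topic in topics:
            if topic in lowered:
                counts[topic] += 1
    return [counts[t] for t in topics]

def pearson(xs, ys):
    """Pearson correlation between tweet counts and announcement counts."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

tweets = [
    "big data is changing analytics",
    "our team moved to cloud computing last year",
    "internet of things sensors in every office",
    "big data plus cloud computing pipelines",
]
nara_counts = [9, 7, 4, 1]  # hypothetical procurement-announcement counts
tweet_counts = topic_frequencies(tweets, TOPICS)
print(tweet_counts)                      # [2, 2, 1, 0]
print(pearson(tweet_counts, nara_counts))  # ~0.97
```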

Finding Weighted Sequential Patterns over Data Streams via a Gap-based Weighting Approach (발생 간격 기반 가중치 부여 기법을 활용한 데이터 스트림에서 가중치 순차패턴 탐색)

  • Chang, Joong-Hyuk
    • Journal of Intelligence and Information Systems
    • /
    • v.16 no.3
    • /
    • pp.55-75
    • /
    • 2010
  • Sequential pattern mining aims to discover interesting sequential patterns in a sequence database, and it is one of the essential data mining tasks widely used in application fields such as Web access pattern analysis, customer purchase pattern analysis, and DNA sequence analysis. In general sequential pattern mining, only the generation order of the data elements in a sequence is considered, so it can easily find simple sequential patterns but is limited in finding the more interesting sequential patterns needed in real-world applications. One of the essential research topics that compensates for this limit is weighted sequential pattern mining, in which not only the generation order of the data elements but also their weights are considered in order to obtain more interesting sequential patterns. Recently, data has increasingly taken the form of continuous data streams rather than finite stored data sets in various application fields, and the database research community has begun focusing its attention on processing over data streams. A data stream is a massive unbounded sequence of data elements continuously generated at a rapid rate. In data stream processing, each data element should be examined at most once, and the memory usage for the analysis must remain finitely bounded even though new data elements are continuously generated. Moreover, newly generated data elements should be processed as fast as possible to produce an up-to-date analysis result that can be instantly utilized upon request. To satisfy these requirements, data stream processing sacrifices the correctness of its analysis result by allowing some error. Considering the changes in the form of data generated in real-world application fields, much research has been performed to find the various kinds of knowledge embedded in data streams. It mainly focuses on efficient mining of frequent itemsets and sequential patterns over data streams, which have proven useful in conventional data mining over finite data sets. In addition, mining algorithms have been proposed to efficiently reflect the changes of data streams over time in their mining results. However, these efforts have targeted naively interesting patterns such as frequent patterns and simple sequential patterns, which are found intuitively, taking no interest in mining novel patterns that better express the characteristics of the target data streams. Therefore, defining novel interesting patterns and developing a mining method that finds them is a valuable research topic in the field of mining data streams. This paper proposes a gap-based weighting approach for sequential patterns and a mining method for weighted sequential patterns over sequence data streams based on this approach. A gap-based weight of a sequential pattern can be computed from the gaps between the data elements in the pattern, without any pre-defined weight information. That is, the gaps between the data elements in each sequential pattern, as well as their generation orders, are used to derive the weight of the pattern, which helps to find more interesting and useful sequential patterns. Since most computer application fields now generate data as data streams rather than as finite data sets, the proposed method mainly focuses on sequence data streams; a sketch of the gap-based weight is given below.
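
The abstract does not give the exact gap-to-weight mapping. A minimal sketch, assuming a simple exponential decay per unit of gap (a hypothetical choice, not the paper's formula), shows how an occurrence's weight can be derived from the gaps alone, with no pre-defined weight table:

```python
def gap_weight(positions, decay=0.9):
    """Gap-based weight of one occurrence of a sequential pattern.

    `positions` holds the generation positions of the pattern's elements
    within a stream sequence. The weight is derived only from the gaps
    between adjacent elements (no pre-defined weights), here via an
    assumed exponential decay: occurrences whose elements appear close
    together weigh more than widely scattered ones.
    """
    if len(positions) < 2:
        return 1.0
    weight = 1.0
    for prev, curr in zip(positions, positions[1:]):
        gap = curr - prev - 1  # adjacent positions give a gap of 0
        weight *= decay ** gap
    return weight

# Pattern <a, b, c> observed at positions 3, 4, 9: the 4 -> 9 jump
# dampens the weight, while a tight occurrence keeps full weight.
print(gap_weight([3, 4, 9]))  # 0.9**0 * 0.9**4 = 0.6561
print(gap_weight([3, 4, 5]))  # 1.0
```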

Introduction of Integrated Coastal Management Program and Sustainable Development of Fishing Villages in Cheonsu Bay Region (연안통합관리계획의 도입과 천수만 어촌의 지속가능발전)

  • 김부성
    • Journal of the Korean Geographical Society
    • /
    • v.38 no.2
    • /
    • pp.184-205
    • /
    • 2003
  • Sustainable development (SD) is an important concept for the future of coastal areas and for the development of fishing villages. Since the 1992 UN Conference on Environment and Development in Rio de Janeiro, many governments and local authorities throughout the world have been engaged in preparing and implementing "Agenda 21". Many projects which previously would have been identified as environmental protection are now presented under the banner of sustainable development. Integrated Coastal Management (ICM) is an extension of sustainable development, presented as a framework for the resolution of coastal use conflicts. The aim of the present paper is to assess the sustainable development potential of fishing villages in the Cheonsu Bay region under the implementation of ICM. The Cheonsu Bay region was known as one of the most productive fishing grounds and preserved the unique characteristics of traditional fishing villages, but it is now experiencing many changes through massive reclamation projects like the Seosan A·B Project. After a brief overview of the concepts and history of SD and ICM, the reclamation process and its impacts on both the fishery and the fishing communities of the Cheonsu Bay region are discussed. According to their changing environmental and socio-economic characteristics after the reclamation, about 35 representative coastal villages in the region can be classified into 5 types. Many coastal villages show diversity in their economic activities, as tourism and recreation functions become more and more important in the region. In present-day Cheonsu Bay, it is possible to differentiate fishing village cooperatives (FVOs) with high, medium, and low potential for sustainable fishery development on the basis of 14 selected indicators.

A Study on the Strategic Cooperation of Enterprise Security and Business (기업보안과 비즈니스의 전략적 협력에 관한 연구)

  • Ryu, Hyung-Chang
    • Korean Security Journal
    • /
    • no.28
    • /
    • pp.103-130
    • /
    • 2011
  • This study researches enterprise security and its strategic cooperation with business, in order to raise the profitability and stability of Korean companies in the global business environment. As technology grows more complex by the day and new competitors appear regardless of borders, it frequently happens in the modern business environment that a huge company hands over its market overnight to a newcomer armed with innovative thinking. To survive in such an environment, the answer is a paradigm change in business management methods. With the low level of security risk management found in Korean companies that stick to old habits, security management that helps companies secure profits is out of reach. The global village, home to seven billion people in the 21st century, is facing rapid ecological change: rapid shifts in climate create hundreds of thousands of victims in a moment, and we have watched millions of livestock being buried alive due to new contagious diseases. Such changes force the people of the global village to change their basic way of living, and the business ecosystem, the basic root of economic life, is no exception. To survive the business environment of the 21st century, security risk management at the executive level is required, and companies' reporting lines should be newly established to raise business competitiveness through security risk management. Companies should bear in mind that they can fade into old memories overnight if they are not sensitive to the changing environment. Among the new risks facing modern companies, the one that Korean companies in particular treat too lightly is security risk. Unlike in the past, security risk, whose scale is far more massive and whose propagation velocity is very fast, is one of the important business risks that management should address. Among the security risks that significantly influence modern companies, the companies' brands, the protection of their reputation, the continuity of production and operation, and the keeping of customers' trust take priority over the others. This study offers suggestions regarding enterprise security and its strategic cooperation with business to deal with such security risks effectively.

  • PDF

Aortopulmonary Window (대동맥폐동맥창)

  • Kim Dong-Jin;Min Sun-Kyung;Kim Woong-Han;Lee Jeong-Sang;Kim Yong-Jin;Lee Jeong-Ryul
    • Journal of Chest Surgery
    • /
    • v.39 no.4 s.261
    • /
    • pp.275-280
    • /
    • 2006
  • Background: Aortopulmonary window (APW) is a very rare congenital heart anomaly, often associated with other cardiac anomalies. It causes a significant systemic-to-pulmonary artery shunt, which requires early surgical correction. Accurate diagnosis and surgical correction bring good outcomes. The purpose of this study was to describe our 20-year experience with aortopulmonary window. Material and Method: Between March 1985 and January 2005, 16 patients with APW underwent surgical repair. Mean age at operation was 157.8±245.3 (15.0~994.0) days and mean weight was 4.8±2.5 (1.7~10.7) kg. Associated anomalies were patent ductus arteriosus (8), atrial septal defect (7), interrupted aortic arch (5), ventricular septal defect (4), patent foramen ovale (3), tricuspid valve regurgitation (3), mitral valve regurgitation (2), aortic valve regurgitation (1), coarctation of the aorta (1), left superior vena cava (1), and dextrocardia (1). Repair methods included 1) division of the APW with primary or patch closure of the aorta and primary or patch closure of the pulmonary artery (11), 2) intra-arterial patch closure (3), and 3) division of the window with descending aorta-to-APW anastomosis (2) in the patients with interrupted aortic arch or coarctation. Result: There was one death. The patient had 2.5 cm of severe tracheal stenosis from the carina, with a tracheal bronchus supplying the right upper lobe, and died on the 5th postoperative day due to massive tracheal bleeding. Patients with complex aortopulmonary window had longer intensive care unit and hospital stays and showed more morbidities and higher reoperation rates. Five patients had reoperations due to left pulmonary artery stenosis (4), right pulmonary artery stenosis (2), and main pulmonary artery stenosis (1). The mean follow-up period was 6.8±5.6 years (57.0 days~16.7 years), and all patients belonged to NYHA class 1. Conclusion: With early and prompt correction of APW, excellent surgical outcomes can be expected. However, the optimal surgical method needs to be established to decrease the rate of pulmonary artery stenosis.

Architecture and Depositional Style of Gravelly, Deep-Sea Channels: Lago Sofia Conglomerate, Southern Chile (칠레 남부 라고 소피아 (Lago Sofia) 심해저 하도 역암의 층구조와 퇴적 스타일)

  • Choe Moon Young;Jo Hyung Rae;Sohn Young Kwan;Kim Yeadong
    • The Korean Journal of Petroleum Geology
    • /
    • v.10 no.1_2 s.11
    • /
    • pp.23-33
    • /
    • 2004
  • The Lago Sofia conglomerate in southern Chile is a lenticular unit encased within mudstone-dominated, deep-sea successions (Cerro Toro Formation, Upper Cretaceous), extending from north to south for more than 120 km. The Lago Sofia conglomerate is a unique example of long, gravelly deep-sea channels, which are rare in modern environments. In the northern part (areas of Lago Pehoe and Laguna Goic), the conglomerate unit consists of 3-5 conglomerate bodies intervened by mudstone sequences. Paleocurrent data from these bodies indicate sediment transport to the east, south, and southeast. The conglomerate bodies in the northern part are interpreted as tributary channels that drained down the paleoslope and converged to form N-S-trending trunk channels. In the southern part (Lago Sofia section), the conglomerate unit comprises a thick (>300 m) conglomerate body, which probably formed in the axial trunk channels of the N-S-trending foredeep trough. The well-exposed Lago Sofia section allowed detailed investigation of the sedimentary facies and large-scale architecture of the deep-sea channel conglomerate. The conglomerate in the Lago Sofia section comprises stratified conglomerate, massive-to-graded conglomerate, and diamictite, which represent bedload deposition under turbidity currents, deposition by high-density turbidity currents, and muddy debris flows, respectively. Paleocurrent data suggest that the debris flows originated from the failure of nearby channel banks or of slopes flanking the channel system, whereas the turbidity currents flowed parallel to the orientation of the overall channel system. Architectural elements produced by the turbidity currents record vertical stacking of gravel sheets, lateral accretion of gravel bars, migration of gravel dunes, and filling of channel thalwegs and scoured hollows, similar to those in terrestrial gravel-bed braided rivers. Observations of the large-scale stratal pattern reveal that the channel bodies are offset-stacked toward the east, suggestive of an eastward migration of the axial trunk channel. The eastward channel migration is probably due to tectonic tilting related to the uplift of the Andean protocordillera just west of the Lago Sofia deep-sea channel system.

  • PDF