• Title/Summary/Keyword: Computational Efficiency (연산 효율)

Search Results: 2,610, Processing Time: 0.034 seconds

Extraction of Muscle Areas from Ultrasonographic Images using Information of Fascia (근막 정보를 이용한 초음파 영상에서의 근육 영역 추출)

  • Kim, Kwang-Baek
    • Journal of Korea Multimedia Society
    • /
    • v.11 no.9
    • /
    • pp.1296-1301
    • /
    • 2008
  • Ultrasonography constructs pictures of areas inside the body, needed in diagnosis, by bouncing high-energy sound waves (ultrasound) off internal tissues or organs. In constructing an ultrasonographic image, the weakness of the bounced signals induces noise and subtle differences of brightness, making it difficult to detect and diagnose with the naked eye when analyzing an ultrasonogram. The difficulty is even greater when diagnosing muscle areas from ultrasonographic images in musculoskeletal examinations. In this paper, we propose a novel image processing method that computationally extracts a muscle area from an ultrasonographic image to assist diagnosis. An ultrasonographic image consists of areas corresponding to various tissues and internal organs. Based on features of the intensity distribution, morphology, and size of each area, the proposed method extracts the areas of the fascia, the subcutaneous fat, and other internal organs, and then extracts the muscle area enclosed by the fascia. To extract the fascia areas, a series of image processing steps is applied: histogram stretching, a multiplication operation, binarization, and area connection by labeling. The muscle area is then extracted using features of the relative position and morphology of the fascia and muscle areas. Performance evaluation using real ultrasonographic images and specialists' analysis shows that the proposed method extracts target areas that closely approximate real muscle areas.

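The fascia-extraction pipeline named in the abstract (histogram stretching, binarization, area connection by labeling) can be sketched in a few lines of NumPy. This is a toy illustration on a synthetic image, not the authors' implementation; the threshold and the two bright bands standing in for fascia are assumptions.

```python
import numpy as np

def stretch(img):
    """Linear histogram stretching to the full [0, 255] range."""
    lo, hi = int(img.min()), int(img.max())
    return ((img.astype(float) - lo) * 255.0 / max(hi - lo, 1)).astype(np.uint8)

def binarize(img, thresh=128):
    """Simple fixed-threshold binarization."""
    return (img >= thresh).astype(np.uint8)

def label_components(mask):
    """4-connected component labeling via iterative flood fill."""
    labels = np.zeros(mask.shape, dtype=int)
    current = 0
    for seed in zip(*np.nonzero(mask)):
        if labels[seed]:
            continue
        current += 1
        stack = [seed]
        while stack:
            r, c = stack.pop()
            if not (0 <= r < mask.shape[0] and 0 <= c < mask.shape[1]):
                continue
            if mask[r, c] == 0 or labels[r, c]:
                continue
            labels[r, c] = current
            stack += [(r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)]
    return labels, current

# Toy "ultrasound" image: two bright horizontal bands standing in for fascia.
img = np.full((10, 10), 40, dtype=np.uint8)
img[2, :] = 90    # upper fascia band
img[7, :] = 100   # lower fascia band
mask = binarize(stretch(img))
labels, n = label_components(mask)
print(n)  # two separate fascia candidates
```

In the paper the labeled fascia regions would then be filtered by intensity, morphology, and size before the enclosed muscle area is taken.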

A Depth-map Coding Method using the Adaptive XOR Operation (적응적 배타적 논리합을 이용한 깊이정보 맵 코딩 방법)

  • Kim, Kyung-Yong;Park, Gwang-Hoon
    • Journal of Broadcast Engineering
    • /
    • v.16 no.2
    • /
    • pp.274-292
    • /
    • 2011
  • This paper proposes an efficient coding method for depth maps, which differ from natural images. A depth map is very smooth both in the inner parts of objects and in the background, but it has sharp edges at object boundaries, like a cliff. In addition, when a depth-map block is decomposed into bit planes, perfect matching or inverted matching between bit planes often occurs at object boundaries. Therefore, the proposed depth-map coding scheme combines a bit-plane-level coding method using an adaptive XOR operation, for efficiently coding depth-map images in object-boundary areas, with a conventional DCT-based coding scheme (for example, H.264/AVC) for efficiently coding the inside areas of objects and the background. The experimental results show that the proposed algorithm improves the average bit-rate savings by 11.8% ~ 20.8% and the average PSNR (Peak Signal-to-Noise Ratio) by 0.9 dB ~ 1.5 dB in comparison with the H.264/AVC coding scheme, and improves the average bit-rate savings by 7.7% ~ 12.2% and the average PSNR by 0.5 dB ~ 0.8 dB in comparison with the adaptive block-based depth-map coding scheme. It is confirmed that the proposed method improves the subjective quality of images synthesized from the decoded depth map in comparison with the H.264/AVC coding scheme, while its subjective quality is similar to that of the adaptive block-based depth-map coding scheme.
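The bit-plane matching idea behind the adaptive XOR method can be illustrated on a toy block. This is a sketch of the perfect/inverted matching test only, not the paper's full codec; the 4x4 block and the flag names are assumptions for illustration.

```python
import numpy as np

def bit_planes(block, bits=8):
    """Decompose a depth block into bit planes, MSB first."""
    return [(block >> b) & 1 for b in range(bits - 1, -1, -1)]

def xor_predict(planes):
    """XOR each plane with the previous one.  An all-zero residual means
    a perfect match, an all-one residual an inverted match; either can be
    signaled with a single flag instead of coding the whole plane."""
    flags = []
    for prev, cur in zip(planes, planes[1:]):
        r = prev ^ cur
        if not r.any():
            flags.append("same")
        elif r.all():
            flags.append("inverted")
        else:
            flags.append("code")   # residual must be coded explicitly
    return flags

# Toy boundary block: left half depth 240 (0b11110000), right half 15 (0b00001111).
block = np.zeros((4, 4), dtype=np.uint8)
block[:, :2] = 240
block[:, 2:] = 15
flags = xor_predict(bit_planes(block))
print(flags)
```

On this cliff-like block every plane pair matches perfectly or invertedly, which is exactly the boundary-area regularity the scheme exploits.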

A Load Balancing Method using Partition Tuning for Pipelined Multi-way Hash Join (다중 해시 조인의 파이프라인 처리에서 분할 조율을 통한 부하 균형 유지 방법)

  • Mun, Jin-Gyu;Jin, Seong-Il;Jo, Seong-Hyeon
    • Journal of KIISE:Databases
    • /
    • v.29 no.3
    • /
    • pp.180-192
    • /
    • 2002
  • We investigate the effect of data skew in join attributes on the performance of a pipelined multi-way hash join method, and propose two new hash join methods for the shared-nothing multiprocessor environment. The first proposed method allocates buckets statically in a round-robin fashion, and the second allocates buckets dynamically via a frequency distribution. Using hash-based joins, multiple joins can be pipelined so that the early results from a join, before the whole join is completed, are sent to the next join process without staying on disk. The shared-nothing multiprocessor architecture is known to be more scalable for supporting very large databases. However, this hardware structure is very sensitive to data skew. Unless the pipelined execution of multiple hash joins includes a dynamic load balancing mechanism, the skew effect can severely deteriorate system performance. In this paper, we derive an execution model of the pipeline segment and a cost model, and develop a simulator for the study. As shown by our simulation over a wide range of parameters, join selectivities, and relation sizes, system performance deteriorates as the degree of data skew grows. But the proposed method, using a large number of buckets and a tuning technique, offers substantial robustness against a wide range of skew conditions.
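The contrast between static round-robin and frequency-based bucket allocation under skew can be sketched with a toy load model. The bucket sizes, node count, and the greedy least-loaded heuristic below are assumptions for illustration, not the paper's simulator.

```python
def round_robin(buckets, n_nodes):
    """Static allocation: bucket i goes to node i mod n_nodes."""
    return {b: i % n_nodes for i, b in enumerate(buckets)}

def frequency_based(bucket_sizes, n_nodes):
    """Dynamic allocation: assign the largest buckets first to the
    currently least-loaded node (greedy, guided by the observed
    frequency distribution of join-attribute values)."""
    load = [0] * n_nodes
    assign = {}
    for b, size in sorted(bucket_sizes.items(), key=lambda kv: -kv[1]):
        node = load.index(min(load))
        assign[b] = node
        load[node] += size
    return assign, load

# Skewed bucket sizes: one hot join-attribute bucket dominates.
sizes = {"b0": 90, "b1": 10, "b2": 10, "b3": 10, "b4": 10, "b5": 10}
rr = round_robin(list(sizes), 2)
rr_load = [sum(s for b, s in sizes.items() if rr[b] == n) for n in (0, 1)]
_, fb_load = frequency_based(sizes, 2)
print(rr_load, fb_load)
```

The round-robin node loads end up badly unbalanced (110 vs 30), while frequency-aware placement narrows the gap (90 vs 50); this imbalance is what stalls a pipelined join, since the slowest node gates the segment.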

Cost Efficient Virtual Machine Brokering in Cloud Computing (가격 효율적인 클라우드 가상 자원 중개 기법에 대한 연구)

  • Kang, Dong-Ki;Kim, Seong-Hwan;Youn, Chan-Hyun
    • KIPS Transactions on Computer and Communication Systems
    • /
    • v.3 no.7
    • /
    • pp.219-230
    • /
    • 2014
  • In the cloud computing environment, cloud service users purchase and use virtualized resources from cloud resource providers in a pay-as-you-go manner. Typically, large cloud resource providers such as Amazon, GoGrid, and Microsoft offer two billing plans for computing resource allocation: on-demand and reserved. A reserved Virtual Machine (VM) instance is provided to users for a long allocation period at a cheaper price than an on-demand VM instance, which is allocated for a short period. With a proper mixture of reserved and on-demand VMs corresponding to users' requests, cloud service providers can reduce the resource allocation cost. To this end, prior research on VM allocation schemes has focused on optimization approaches with prediction techniques for users' requests. However, it is difficult to predict the expected demand exactly, because there are various cloud service users and their request patterns fluctuate heavily in reality. Moreover, the previous optimization techniques may require unacceptably long processing times, so they are hard to apply to current cloud computing systems. In this paper, we propose a cloud brokering system with adaptive VM allocation schemes called A3R (Adaptive 3 Resource allocation schemes) that needs neither optimization processes nor prediction techniques. Using A3R, VM instances are allocated to users adaptively in response to their service demands. Our evaluation results demonstrate that the proposed schemes reduce the resource use cost significantly while maintaining an acceptable Quality of Service (QoS) for cloud service users.
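The reserved/on-demand trade-off that brokering schemes like A3R exploit can be sketched as a simple cost model: reserved capacity absorbs the baseline demand, on-demand capacity the spikes. All prices, the demand trace, and the exhaustive search below are hypothetical, not A3R itself.

```python
def allocation_cost(demand_hours, n_reserved, reserved_fee,
                    reserved_rate, ondemand_rate):
    """Total cost when n_reserved instances (each paying an upfront fee
    plus a cheap hourly rate) serve the baseline, and any hourly excess
    is served by pricier on-demand instances."""
    cost = n_reserved * reserved_fee
    for d in demand_hours:
        cost += min(d, n_reserved) * reserved_rate
        cost += max(d - n_reserved, 0) * ondemand_rate
    return cost

# Hypothetical prices and a bursty demand trace (VMs needed per hour).
demand = [3, 4, 3, 8, 3, 4]
costs = {n: allocation_cost(demand, n, reserved_fee=3.0,
                            reserved_rate=0.5, ondemand_rate=2.0)
         for n in range(0, 9)}
best = min(costs, key=costs.get)
print(best, costs[best])
```

The cheapest plan reserves roughly the baseline (4 VMs here) and leaves the single spike to on-demand instances; reserving for the peak or running everything on-demand both cost more.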

Design and Implementation of Service based Virtual Screening System in Grids (그리드에서 서비스 기반 가상 탐색 시스템 설계 및 구현)

  • Lee, Hwa-Min;Chin, Sung-Ho;Lee, Jong-Hyuk;Lee, Dae-Won;Park, Seong-Bin;Yu, Heon-Chang
    • Journal of KIISE:Computer Systems and Theory
    • /
    • v.35 no.6
    • /
    • pp.237-247
    • /
    • 2008
  • Virtual screening is the process of reducing an unmanageable number of compounds to a limited number of compounds for a target of interest by means of computational techniques such as molecular docking. It is a large-scale scientific application that requires large computing power and data storage capability. Previous applications or software packages for molecular docking, such as AutoDock, FlexX, Glide, DOCK, LigandFit, and ViSION, were developed to run on a supercomputer, a workstation, or a cluster. However, virtual screening on a supercomputer has the problem that a supercomputer is very expensive, while virtual screening on a workstation or a cluster requires a long execution time. Thus we propose a service-based virtual screening system using Grid computing technology, which supports large data-intensive operations. We constructed a 3-dimensional chemical molecular database for virtual screening, designed a resource broker and a data broker to support efficient molecular docking services, and proposed various services for virtual screening. We implemented the service-based virtual screening system with DOCK 5.0 and the Globus 3.2 toolkit. Our system can reduce the time and cost of drug or new material design.
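At its core, the screening step is a score-and-filter pass over a compound library. The sketch below uses a plain dictionary as a stand-in for a real docking engine such as DOCK; the compound names, scores, and threshold are invented for illustration, and in the Grid deployment each scoring call would be dispatched to a worker node by the resource broker.

```python
def virtual_screen(compounds, dock, threshold):
    """Keep only compounds whose docking score (lower = stronger binding
    in this toy convention) passes the threshold, sorted by name."""
    hits = [(name, dock(name)) for name in compounds]
    return sorted((n, s) for n, s in hits if s <= threshold)

# Hypothetical scores standing in for a docking engine's output.
scores = {"cpdA": -9.2, "cpdB": -4.1, "cpdC": -7.8, "cpdD": -2.0}
hits = virtual_screen(scores, scores.get, threshold=-7.0)
print(hits)  # only the strong binders survive
```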

Development of Change Detection Technique Using Time Seriate Remotely Sensed Satellite Images with User Friendly GIS Interface (사용자 중심적 GIS 인터페이스를 이용한 시계열적 원격탐사 영상의 변화탐지 기법의 개발)

  • 양인태;한성만;윤희천;김흥규
    • Journal of the Korean Society of Surveying, Geodesy, Photogrammetry and Cartography
    • /
    • v.22 no.2
    • /
    • pp.151-159
    • /
    • 2004
  • The diversity and expansion of human activity and rapid urbanization confront modern society with problems such as damage to nature and the depletion of natural resources. Under these circumstances, rapid and accurate change detection techniques that can detect land-use changes over a wide range are needed for the efficient management and utilization planning of national territory. In this study, spatial analysis techniques contained in a Geographic Information System are applied to perform change detection on remote sensing images. Based on these techniques, software is produced that can execute a new change detection algorithm, queries, inquiries, and analysis. This software is based on a graphical user interface and has many functions, such as format conversion, grid calculation, statistical processing, display, and reference. In this study, change detection for multi-temporal satellite images can be performed simultaneously, and one integrated change image over four different periods was produced. Furthermore, software users can acquire land cover change information for a specific area by querying yearly changes. Finally, by combining all the application modules for change detection into one window-based Visual Basic program, user convenience and automated performance are achieved.
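The grid-calculation core of multi-temporal change detection, comparing classified rasters epoch by epoch and combining the results into one change image, can be sketched as below. The 2x2 scene, class codes, and bit-mask encoding are toy assumptions, not the Visual Basic software itself.

```python
import numpy as np

def change_map(rasters):
    """Pixelwise change detection across a time series of classified
    land-cover rasters: for each period, flag the pixels whose class
    differs from the previous epoch, encoded as one bit per period so
    that four epochs fold into a single integrated change image."""
    change = np.zeros(rasters[0].shape, dtype=np.uint8)
    for i in range(1, len(rasters)):
        change |= (rasters[i] != rasters[i - 1]).astype(np.uint8) << (i - 1)
    return change

# Four epochs of a tiny 2x2 classified scene (1 = urban, 2 = forest).
epochs = [np.array([[1, 2], [2, 2]]),
          np.array([[1, 2], [1, 2]]),   # lower-left converts to urban
          np.array([[1, 1], [1, 2]]),   # upper-right converts to urban
          np.array([[1, 1], [1, 2]])]   # no further change
cmap = change_map(epochs)
print(cmap)
```

Querying "which pixels changed in period 2" is then a bit test on the integrated image, which is essentially what the yearly-change queries in the software expose through the GUI.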

A Seamline Extraction Technique Considering the Characteristic of NDVI for High Resolution Satellite Image Mosaics (고해상도 위성영상 모자이크를 위한 NDVI 특성을 이용한 접합선 추출 기법)

  • Kim, Jiyoung;Chae, Taebyeong;Byun, Younggi
    • Korean Journal of Remote Sensing
    • /
    • v.31 no.5
    • /
    • pp.395-408
    • /
    • 2015
  • High-resolution satellite image mosaicking is becoming increasingly important in the field of remote sensing image analysis as an essential image processing step to create a large image from several smaller images. In this paper, we present an automatic seamline extraction technique and the procedure for generating a mosaic image with it. For more effective seamline extraction in the overlap region of adjacent images, an NDVI-based seamline extraction technique is developed, which saves computational time and memory. The Normalized Difference Vegetation Index (NDVI) is an index of plant greenness, or photosynthetic activity, that is employed to extract the initial seamline; it can separate the overlap region into man-made and natural regions. A cost image is obtained with the Canny edge detector, and a buffering technique is used to restrict the cost image to a search range. The seamline is extracted by applying the Dijkstra algorithm to a cost image generated through a labeling process of the extracted edge information. Histogram matching is also conducted to alleviate radiometric distortion between adjacent images acquired at different times. In experiments using KOMPSAT-2/3 satellite imagery, it is confirmed that the proposed method greatly reduces the visual discontinuity caused by geometric differences between adjacent images, as well as the computation time.
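The final step, running Dijkstra over a cost image so the seam follows low-cost pixels, can be sketched directly. The 3x3 cost grid is a toy stand-in for the edge-derived cost image; the 4-connectivity choice is an assumption.

```python
import heapq

def dijkstra_seamline(cost, start, goal):
    """Minimum-cost 4-connected path through a cost image, as used to
    trace a seamline along low-cost (e.g. vegetated, edge-free) pixels."""
    rows, cols = len(cost), len(cost[0])
    dist = {start: cost[start[0]][start[1]]}
    prev = {}
    heap = [(dist[start], start)]
    while heap:
        d, (r, c) = heapq.heappop(heap)
        if (r, c) == goal:
            break
        if d > dist.get((r, c), float("inf")):
            continue                       # stale heap entry
        for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            if 0 <= nr < rows and 0 <= nc < cols:
                nd = d + cost[nr][nc]
                if nd < dist.get((nr, nc), float("inf")):
                    dist[(nr, nc)] = nd
                    prev[(nr, nc)] = (r, c)
                    heapq.heappush(heap, (nd, (nr, nc)))
    path, node = [], goal
    while node != start:                   # walk predecessors back
        path.append(node)
        node = prev[node]
    return [start] + path[::-1]

# Toy cost image: high cost (edges/buildings) except a cheap middle column.
cost = [[9, 1, 9],
        [9, 1, 9],
        [9, 1, 9]]
seam = dijkstra_seamline(cost, (0, 1), (2, 1))
print(seam)  # the seam follows the low-cost column
```

In the paper the same search runs on a much larger cost image, with the buffering step keeping the searched band (and hence time and memory) small.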

A Study on Link Travel Time Prediction by Short Term Simulation Based on CA (CA모형을 이용한 단기 구간통행시간 예측에 관한 연구)

  • 이승재;장현호
    • Journal of Korean Society of Transportation
    • /
    • v.21 no.1
    • /
    • pp.91-102
    • /
    • 2003
  • There are two goals in this paper. One is the extension of the existing CA (Cellular Automata) model to explain the deceleration process toward a stop more realistically. The other is the application of the updated CA model to a forecasting simulation to predict short-term link travel time, which plays a key role in finding the shortest path for the route guidance system of ITS. The car-following rule of CA models responds not to the leading vehicle's velocity but to the gap, the distance between the leading and following vehicles. So a following vehicle running at free-flow speed must undergo a steep, sudden deceleration to avoid a rear-end collision within an unrealistic braking distance. To tackle this unrealistic deceleration rule, a "slow-to-stop" rule is integrated into the NaSch model. For application to interrupted traffic flow, this paper applies the "slow-to-stop" rule to both normal and random traffic lights, and a vehicle packet method is used to simulate a large-scale network on the desktop. Generally, time series analysis methods such as neural networks, ARIMA, and Kalman filtering are used for short-term link travel time prediction, which is crucial to finding an optimal dynamic shortest path. But those methods have time-lag problems and can hardly capture traffic flow mechanisms such as spill-over and spill-back. To address these problems, the CA model built in this study is used in a forecasting simulation to predict short-term link travel time in the Kangnam district network. It turns out that the short-term prediction simulation method generates novel results, resolving the time-lag problem and accounting for interrupted traffic flow mechanisms.
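The NaSch update cycle (accelerate, brake to the gap, random slowdown, move) is compact enough to sketch. The anticipatory braking clause below is a simplified stand-in for the paper's slow-to-stop rule, an assumption on my part: cars shed one speed unit early when the gap ahead is less than twice their intended speed, rather than braking only when the gap forces it.

```python
import random

def nasch_step(pos, vel, L, vmax=5, p_slow=0.0, rng=random):
    """One synchronous update of a circular NaSch road with a simplified
    slow-to-stop anticipation rule.  pos/vel are parallel lists; gaps are
    computed from the pre-update positions."""
    n = len(pos)
    order = sorted(range(n), key=lambda i: pos[i])
    new_pos, new_vel = pos[:], vel[:]
    for idx, i in enumerate(order):
        ahead = order[(idx + 1) % n]
        gap = (pos[ahead] - pos[i] - 1) % L
        v = min(vel[i] + 1, vmax)          # 1. acceleration
        if gap < v:
            v = gap                        # 2a. hard safety braking
        elif gap < 2 * v:
            v = max(v - 1, 0)              # 2b. anticipatory slow-to-stop
        if rng.random() < p_slow:
            v = max(v - 1, 0)              # 3. random slowdown
        new_vel[i] = v
        new_pos[i] = (pos[i] + v) % L      # 4. movement
    return new_pos, new_vel

# Two cars on a 20-cell ring, deterministic run (p_slow = 0).
pos, vel = [0, 10], [0, 0]
for _ in range(5):
    pos, vel = nasch_step(pos, vel, L=20)
print(pos, vel)
```

With the anticipation clause active, both cars settle at speed 4 instead of free-flow 5 once the gap ahead drops below their braking horizon, i.e. they slow gradually rather than in one unrealistic jump.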

Spherical Slepian Harmonic Expression of the Crustal Magnetic Vector and Its Gradient Components (구면 스레피안 함수로 표현된 지각 자기이상값과 구배 성분)

  • Kim, Hyung Rae
    • Economic and Environmental Geology
    • /
    • v.49 no.4
    • /
    • pp.269-280
    • /
    • 2016
  • I present three vector crustal magnetic anomaly components and six gradients by using spherical Slepian functions over a cap of $20^{\circ}$ radius centered on the South Pole. The Swarm mission, launched by the European Space Agency (ESA) in November 2013, was planned to put three satellites into low-Earth orbits, two side by side in the East-West direction and one crossing over at a higher altitude. This orbit configuration makes gradient measurements possible in the North-South and vertical directions as well as the E-W direction. Gravity satellites such as GRACE and GOCE have already implemented gradient measurements to recover the accurate gravity field of the Earth and its temporal variation due to subsurface mass changes. However, magnetic gradients have scarcely been applied since Swarm launched. A localized magnetic modeling method is useful for a region where data availability is limited or which is of special interest. In particular, computing localized solutions is much more efficient, and it has the advantage of representing high-frequency anomaly features with fewer solution coefficients than global ones. Besides, these localized basis functions, obtained by a linear transformation of the spherical harmonic functions, are orthogonal, so they can be used for power spectrum analysis by transforming the global spherical harmonic coefficients. I anticipate scientific and technical progress in localized modeling with the gradient measurements from Swarm, and here discuss the results of the localized solution representing the three vector and six gradient anomalies over the Antarctic area, from synthetic data derived from a global spherical harmonic solution for the crustal magnetic anomalies of Swarm measurements.
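The key property, that the localized basis is an orthogonal linear transformation of a band-limited global basis, concentrated inside a cap, can be illustrated with a 1-D analogue: cosines on $[-\pi, \pi]$ playing the role of spherical harmonics, and a small interval playing the role of the $20^{\circ}$ cap. This toy (band limit, cap width, discretization) is entirely my construction, not the paper's spherical computation.

```python
import numpy as np

N = 8                      # band limit: cos(0*x) .. cos((N-1)*x)
a = np.pi / 4              # half-width of the localization "cap"
x = np.linspace(-np.pi, np.pi, 4001)
dx = x[1] - x[0]
inside = (np.abs(x) <= a).astype(float)

# Orthonormalized global basis (the 1-D stand-in for spherical harmonics).
B = np.array([np.cos(m * x) for m in range(N)])
B /= np.sqrt((B**2).sum(axis=1, keepdims=True) * dx)

# Concentration matrix D_mn = integral over the cap of b_m * b_n.
# Its eigenvectors define the Slepian-like localized basis, and each
# eigenvalue is the fraction of that basis function's energy in the cap.
D = (B * inside) @ B.T * dx
lam, V = np.linalg.eigh(D)
lam, V = lam[::-1], V[:, ::-1]   # best-concentrated functions first
print(np.round(lam, 3))
```

A handful of eigenvalues sit near 1 (well-concentrated functions, the only ones needed to model the cap) and the rest fall off toward 0, which is why the localized solution needs far fewer coefficients than the global one; the eigenvector matrix V is orthogonal, mirroring the orthogonality used for power spectrum analysis.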

Application of Cyber Physical System (CPS) for Risk Management of a CO2 Storage Site (이산화탄소 저장부지 위해성 관리를 위한 가상물리시스템 적용성 평가)

  • Jeong, Jina;Park, Eungyu;Jun, Seong-Chun;Kim, Hyun-Jun;Yun, Seong-Taek
    • Economic and Environmental Geology
    • /
    • v.50 no.5
    • /
    • pp.363-373
    • /
    • 2017
  • In the present study, the applicability of a cyber-physical system (CPS) to risk management of a $CO_2$ storage site is examined, and the subagging regression (SBR) method is proposed as a key component of the cyber twin to estimate the risk due to potential $CO_2$ leakage. For these purposes, $CO_2$ concentration data monitored during a controlled $CO_2$ release field experiment are employed to validate the potential of the SBR method. From the validation study, it is found that the SBR method has robust estimation capability, showing minimal influence from anomalous measurements, and makes stable and sound predictions of the forthcoming $CO_2$ concentration trend. In addition, the method is found to be well suited as a tool for operational risk assessment based on real-time monitoring data, owing to its computational efficiency. The overall results suggest that the SBR method has the potential to be an important component of the cyber twin of a CPS for risk management of a $CO_2$ storage site.
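Subagging (subsample aggregating) fits many models on random subsamples drawn without replacement and aggregates their predictions, which is what damps the influence of anomalous measurements. The sketch below uses a plain polynomial fit and a median aggregate on synthetic data with one injected spike; the model class, aggregation choice, and all numbers are my assumptions, not the paper's exact SBR configuration.

```python
import numpy as np

def subagging_predict(x, y, x_new, n_models=200, frac=0.5, deg=1, seed=0):
    """Subagging: fit each model on a random subsample (without
    replacement) and aggregate predictions with the median, so models
    whose subsample missed the anomaly dominate the estimate."""
    rng = np.random.default_rng(seed)
    m = max(int(frac * len(x)), deg + 1)
    preds = []
    for _ in range(n_models):
        idx = rng.choice(len(x), size=m, replace=False)
        coef = np.polyfit(x[idx], y[idx], deg)
        preds.append(np.polyval(coef, x_new))
    return np.median(preds, axis=0)

# Synthetic CO2-like trend with one anomalous spike in the record.
x = np.arange(20.0)
y = 400.0 + 0.5 * x          # true trend: 412.5 at x = 25
y[10] = 500.0                # anomalous measurement
plain = np.polyval(np.polyfit(x, y, 1), 25.0)
sbr = subagging_predict(x, y, np.array([25.0]))[0]
print(round(plain, 2), round(sbr, 2))
```

The single-fit extrapolation is dragged well above the true trend by the spike, while the subagged median stays close to it, the robustness property the abstract reports for the field data.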