• Title/Summary/Keyword: standard algorithm


A Prediction of N-value Using Artificial Neural Network (인공신경망을 이용한 N치 예측)

  • Kim, Kwang Myung;Park, Hyoung June;Goo, Tae Hun;Kim, Hyung Chan
    • The Journal of Engineering Geology
    • /
    • v.30 no.4
    • /
    • pp.457-468
    • /
    • 2020
  • Problems arising during pile design for plant construction and civil and architectural works mostly stem from uncertainty in geotechnical characteristics. In particular, the N-value measured through the Standard Penetration Test (SPT) is the most important datum, but it is difficult to obtain N-values by drilling investigation over the entire target area. There are many constraints, such as licensing, time, cost, equipment access, and residents' complaints, and it is impossible to obtain geotechnical characteristics by drilling investigation within the short bidding period of overseas projects. The geotechnical characteristics at points without drilling investigation are usually determined by the engineer's empirical judgment, which can lead to errors in pile design and quantity calculation, causing construction delays and cost increases. This problem could be overcome if N-values could be predicted at points without drilling investigation using a limited, minimal set of drilling investigation data. This study was conducted to predict the N-value using an Artificial Neural Network (ANN), one of the methods of artificial intelligence (AI). An artificial neural network processes a limited amount of geotechnical data through a logic modeled on biological neurons, providing more reliable results for the input variables. The purpose of this study is to predict N-values at points without drilling investigation from patterns learned by a multi-layer perceptron with the error back-propagation algorithm using minimal geotechnical data. We reviewed the reliability of the values predicted by the AI method against the measured values and confirmed high reliability. To resolve geotechnical uncertainty, we will next perform a sensitivity analysis of the input variables to increase the learning effect, and some technical updates of the program may be needed. We hope that our study will be helpful for design work in the future.
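The multi-layer perceptron with error back-propagation named in the abstract can be sketched as below; the layer sizes, learning rate, and toy target function are illustrative assumptions standing in for the authors' actual configuration and geotechnical inputs.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy inputs standing in for borehole features; toy target standing in for N-value.
X = rng.uniform(0, 1, size=(200, 2))
y = (2.0 * X[:, 0] + np.sin(3 * X[:, 1]))[:, None]

# One hidden layer with tanh activation (assumed sizes).
W1 = rng.normal(0, 0.5, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 0.5, (8, 1)); b2 = np.zeros(1)
lr = 0.05

def forward(X):
    h = np.tanh(X @ W1 + b1)
    return h, h @ W2 + b2

_, pred0 = forward(X)
loss0 = np.mean((pred0 - y) ** 2)

for _ in range(2000):
    h, pred = forward(X)
    err = pred - y                      # gradient of squared error w.r.t. prediction
    gW2 = h.T @ err / len(X); gb2 = err.mean(0)
    dh = (err @ W2.T) * (1 - h ** 2)    # back-propagate through tanh
    gW1 = X.T @ dh / len(X); gb1 = dh.mean(0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

_, pred1 = forward(X)
loss1 = np.mean((pred1 - y) ** 2)
```

Trained this way, the network interpolates the target at unseen points, which is the role the abstract assigns it at non-drilled locations.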

Development of Regularized Expectation Maximization Algorithms for Fan-Beam SPECT Data (부채살 SPECT 데이터를 위한 정칙화된 기댓값 최대화 재구성기법 개발)

  • Kim, Soo-Mee;Lee, Jae-Sung;Lee, Soo-Jin;Kim, Kyeong-Min;Lee, Dong-Soo
    • The Korean Journal of Nuclear Medicine
    • /
    • v.39 no.6
    • /
    • pp.464-472
    • /
    • 2005
  • Purpose: SPECT using a fan-beam collimator improves spatial resolution and sensitivity. For reconstruction from fan-beam projections, it is necessary to implement direct fan-beam reconstruction methods without transforming the data into the parallel geometry. In this study, various fan-beam reconstruction algorithms were implemented and their performances were compared. Materials and Methods: The projector for fan-beam SPECT was implemented using a ray-tracing method. The direct reconstruction algorithms implemented for fan-beam projection data were FBP (filtered backprojection), EM (expectation maximization), OS-EM (ordered-subsets EM), and MAP-EM OSL (maximum a posteriori EM using the one-step-late method) with membrane and thin-plate models as priors. For comparison, the fan-beam projection data were also rebinned into parallel data using various interpolation methods, such as nearest-neighbor, bilinear, and bicubic interpolation, and reconstructed using the conventional EM algorithm for parallel data. Noiseless and noisy projection data from the digital Hoffman brain and Shepp-Logan phantoms were reconstructed using the above algorithms. The reconstructed images were compared in terms of a percent error metric. Results: For the fan-beam data with Poisson noise, the MAP-EM OSL algorithm with the thin-plate prior showed the best result in both percent error and stability. Bilinear interpolation was the most effective method for rebinning from the fan-beam to the parallel geometry when accuracy and computational load were considered. Direct fan-beam EM reconstructions were more accurate than the standard EM reconstructions obtained from rebinned parallel data. Conclusion: Direct fan-beam reconstruction algorithms were implemented, which provided significantly improved reconstructions.
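The ML-EM iteration at the core of the algorithms above can be sketched on a toy 1-D problem; the system matrix, image size, and iteration count are illustrative assumptions, not the paper's ray-traced projector.

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.uniform(0.1, 1.0, size=(30, 10))   # toy projector (ray sums)
x_true = rng.uniform(0.5, 2.0, size=10)
y = A @ x_true                              # noiseless projections

x = np.ones(10)                             # nonnegative initial estimate
err0 = np.abs(x - x_true).mean()
sens = A.sum(axis=0)                        # A^T 1, the sensitivity term
for _ in range(200):
    ratio = y / (A @ x)                     # measured / forward-projected
    x *= (A.T @ ratio) / sens               # multiplicative ML-EM update
    # MAP-EM OSL would instead divide by sens + beta * U'(x),
    # with the prior's derivative U' evaluated "one step late" at the current x.

err = np.abs(x - x_true).mean()
```

The multiplicative form keeps the estimate nonnegative at every iteration, which is why EM-family methods suit Poisson count data; OS-EM simply applies the same update over ordered subsets of the projections.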

An Implementation of Lighting Control System using Interpretation of Context Conflict based on Priority (우선순위 기반의 상황충돌 해석 조명제어시스템 구현)

  • Seo, Won-Il;Kwon, Sook-Youn;Lim, Jae-Hyun
    • Journal of Internet Computing and Services
    • /
    • v.17 no.1
    • /
    • pp.23-33
    • /
    • 2016
  • Current smart lighting identifies a user's action and location through sensors and then offers a lighting environment suited to that context. Sensor-based context-awareness technology considers only a single user, and studies that interpret the various context occurrences and conflicts of multiple users are lacking. In existing studies, fuzzy theory and algorithms including ReBa have been used to resolve context conflict. These approaches merely avoid opportunities for context conflict by dividing the space where users are located into several areas and providing services per area; they therefore cannot be regarded as customized services that resolve context conflict based on personal preference. This paper proposes a priority-based LED lighting control system that interprets multiple context conflicts: when a service conflict arises because various contexts occur simultaneously for many users, the system decides which service to provide based on priorities granted according to context type. This study classifies the residential environment into five areas (living room, bedroom, study room, kitchen, and bathroom) and defines 20 contexts that may occur within each area for several users, such as exercising, doing makeup, reading, dining, and entering. The proposed system defines users' various contexts using an ontology-based model and provides a user-oriented lighting environment through standards-based rules and a context-reasoning engine. To resolve context conflicts among users in the same space at the same time, contexts in which user concentration is required are given the highest priority, and among contexts of equal priority, the one offering the best visual comfort is chosen as the alternative. These criteria are used for service selection when a conflict occurs.
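The two selection criteria described above (priority first, visual comfort as the tie-breaker) can be sketched as a single ordering rule; the context names, priority numbers, and comfort scores are illustrative assumptions.

```python
# Each active context carries a priority granted by its type and a
# visual-comfort score for the lighting service it would trigger.

def resolve_lighting(contexts):
    """contexts: list of (context_name, priority, visual_comfort) tuples.
    Higher priority wins; ties are broken by higher visual comfort."""
    return max(contexts, key=lambda c: (c[1], c[2]))[0]

# Reading (concentration required) outranks dining and exercising;
# between the two equal-priority contexts, the more comfortable one would win.
active = [("dining", 2, 0.6), ("reading", 5, 0.4), ("exercising", 2, 0.8)]
print(resolve_lighting(active))  # -> reading
```

A rule engine over an ontology, as the paper uses, would derive the priority and comfort values; the final arbitration step reduces to this kind of lexicographic comparison.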

A Dynamic Management Method for FOAF Using RSS and OLAP cube (RSS와 OLAP 큐브를 이용한 FOAF의 동적 관리 기법)

  • Sohn, Jong-Soo;Chung, In-Jeong
    • Journal of Intelligence and Information Systems
    • /
    • v.17 no.2
    • /
    • pp.39-60
    • /
    • 2011
  • Since the introduction of Web 2.0 technology, social network services have been recognized as a foundation of important future information technology. The advent of Web 2.0 has changed who creates content: in the earlier web, content creators were service providers, whereas in the recent web they are the service users. Users share experiences with other users and improve content quality, which has increased the importance of social networks. As a result, diverse forms of social network service have emerged from the relations and experiences of users. A social network is a network that constructs and expresses social relations among people who share interests and activities. Today's social network services are not confined to showing user interactions; they have developed to a level at which content generation and evaluation interact with each other. As the volume of content generated from social network services and the number of connections between users have drastically increased, social network extraction has become more complicated, and the following problems arise. First, the representational power of objects in the social network is insufficient. Second, the diverse connections among users cannot be fully expressed. Third, it is difficult to reflect dynamic changes in the social network caused by changes in user interests. Lastly, there is no method capable of integrating and processing data efficiently in a heterogeneous distributed computing environment. The first and last problems can be solved using FOAF, a tool for describing ontology-based user profiles for the construction of social networks. Solving the second and third problems, however, requires a novel technique that reflects dynamic changes of user interests and relations.
In this paper, we propose a novel method that overcomes the above problems of existing social network extraction methods by applying FOAF (a vocabulary for describing user profiles) and RSS (a web content syndication format) to an OLAP system, in order to dynamically update and manage FOAF. We exploit data interoperability, an important characteristic of FOAF. We then use RSS to reflect changes such as the flow of time and shifts in user interests; RSS provides a standard vocabulary for syndicating web site content in RDF/XML form. We collect users' personal information and relations using FOAF, and user content using RSS. The collected data is inserted into a database under a star schema. The proposed system generates an OLAP cube from the data in the database, and the 'Dynamic FOAF Management Algorithm' processes the generated cube. The algorithm consists of two functions: find_id_interest() and find_relation(). Find_id_interest() extracts user interests during the input period, and find_relation() extracts users matching those interests. Finally, the proposed system reconstructs FOAF by reflecting the extracted relationships and interests. To justify the suggested idea, we present the implemented result together with its analysis. We used the C# language and an MS-SQL database, with FOAF and RSS data collected from livejournal.com as input. The implemented result shows that users' foaf:interest increased by an average of 19 percent over four weeks, and in proportion to this change, users' foaf:knows grew by an average of 9 percent over four weeks.
Because FOAF and RSS, which we use as basic data, are widely supported in Web 2.0 and social network services, the method has a definite advantage in utilizing user data distributed across diverse web sites and services regardless of language and type of computer. Using the suggested method, better services can be provided that cope with rapid changes of user interests through the automatic updating of FOAF.
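The two functions of the Dynamic FOAF Management Algorithm named above can be sketched over an in-memory table; the (user, week, interest) rows are an illustrative assumption standing in for slices of the OLAP cube built from collected RSS items.

```python
posts = [  # (user, week, interest) rows, as if sliced from the cube
    ("alice", 1, "jazz"), ("alice", 2, "jazz"), ("alice", 2, "hiking"),
    ("bob",   1, "hiking"), ("bob", 2, "chess"), ("carol", 2, "jazz"),
]

def find_id_interest(user, start, end):
    """Interests the user expressed in posts during weeks [start, end]."""
    return {i for u, w, i in posts if u == user and start <= w <= end}

def find_relation(user, start, end):
    """Users sharing at least one interest with `user` in the period."""
    mine = find_id_interest(user, start, end)
    return {u for u, w, i in posts
            if u != user and start <= w <= end and i in mine}

# Reconstructing alice's foaf:interest and foaf:knows for weeks 1-2:
print(sorted(find_id_interest("alice", 1, 2)))  # -> ['hiking', 'jazz']
print(sorted(find_relation("alice", 1, 2)))     # -> ['bob', 'carol']
```

The output of the two functions is what the paper's system writes back into the FOAF profile as updated foaf:interest and foaf:knows entries.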

Development of Intelligent Job Classification System based on Job Posting on Job Sites (구인구직사이트의 구인정보 기반 지능형 직무분류체계의 구축)

  • Lee, Jung Seung
    • Journal of Intelligence and Information Systems
    • /
    • v.25 no.4
    • /
    • pp.123-139
    • /
    • 2019
  • The job classification systems of major job sites differ from site to site and from the job classification system of the SQF (Sectoral Qualifications Framework) proposed for the SW field. Therefore, a new job classification system is needed that SW companies, SW job seekers, and job sites can all understand. The purpose of this study is to establish a standard job classification system that reflects market demand by analyzing the SQF based on the job postings of major job sites and the NCS (National Competency Standards). For this purpose, association analysis between the occupations of the major job sites is conducted, and association rules between the SQF and occupations are derived. Using these association rules, we propose a data-based intelligent job classification system that maps the job classification systems of the major job sites to the SQF. First, major job sites are selected to obtain information on the job classification system of the SW market. We then identify ways to collect job information from each site and collect the data through open APIs. Focusing on the relationships between the data, we keep only the job postings posted on multiple job sites at the same time and delete the rest. Next, we map the job classification systems between the job sites using the association rules derived from the association analysis. We complete the mapping between these market classifications, discuss with experts, further map the SQF, and finally propose a new job classification system. As a result, more than 30,000 job postings were collected in XML format through the open APIs of 'WORKNET', 'JOBKOREA', and 'saramin', the main job sites in Korea. After filtering down to about 900 job postings simultaneously posted on multiple job sites, 800 association rules were derived by applying the Apriori algorithm, a frequent-pattern mining method.
Based on the 800 association rules, the job classification systems of WORKNET, JOBKOREA, and saramin and the SQF job classification system were mapped and organized into first through fourth classification levels. In the new job taxonomy, the first primary class (IT consulting, computer systems, networks, and security-related jobs) consisted of three secondary, five tertiary, and five quaternary classifications. The second primary class (databases and system operation-related jobs) consisted of three secondary, three tertiary, and four quaternary classifications. The third primary class (web planning, web programming, web design, and games) consisted of four secondary, nine tertiary, and two quaternary classifications. The last primary class (ICT management and computer and communication engineering technology-related jobs) consisted of three secondary and six tertiary classifications. In particular, the new job classification system has a relatively flexible depth of classification, unlike other existing systems: WORKNET divides jobs into three levels, JOBKOREA divides jobs into two levels with the subdivided jobs as keywords, and saramin likewise divides jobs into two levels with the subdivided jobs in keyword form. The newly proposed standard job classification system accepts some keyword-based jobs and treats some product names as jobs. In the new system, some jobs stop at the second classification level while others are subdivided down to the fourth level, reflecting the idea that not all jobs can be broken down into the same number of steps. We also combined the rules derived from the collected market data with experts' opinions.
Therefore, the newly proposed job classification system can be regarded as a data-based intelligent job classification system that reflects market demand, unlike existing systems. This study is meaningful in that it suggests a new job classification system reflecting market demand by mapping between occupations based on data through association analysis, rather than on the intuition of a few experts. However, this study has a limitation: because the data were collected at a single point in time, it cannot fully reflect market demand as it changes over time. As market demand changes, including seasonal factors and the timing of major corporate recruitment, continuous data monitoring and repeated experiments are needed to achieve more accurate matching. The results of this study can be used to suggest directions for improving the SQF in the SW industry, and the approach is expected to transfer to other industries following its success in the SW field.
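The frequent-pattern step behind the 800 association rules above can be sketched with a tiny Apriori pass; the job-title transactions and support threshold are illustrative assumptions, not the collected postings.

```python
from itertools import combinations

transactions = [
    {"web programming", "web design"},
    {"web programming", "web design", "game"},
    {"network", "security"},
    {"web programming", "game"},
    {"network", "security", "web programming"},
]
min_support = 2  # absolute count

def support(itemset):
    return sum(itemset <= t for t in transactions)

# Pass 1: frequent single items.
items = {i for t in transactions for i in t}
L1 = {frozenset([i]) for i in items if support({i}) >= min_support}

# Pass 2: candidate pairs built only from frequent items (Apriori pruning).
L2 = {a | b for a, b in combinations(L1, 2)
      if support(a | b) >= min_support}

# A rule X -> Y is kept when its confidence support(X∪Y)/support(X) is high.
pair = frozenset({"network", "security"})
conf = support(pair) / support({"network"})
print(pair in L2, conf)  # -> True 1.0
```

The mapping step in the paper then reads such rules (e.g. a site-A job co-occurring with a site-B job) as evidence that two classification entries denote the same occupation.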

Recognition and Request for Medical Direction by 119 Emergency Medical Technicians (119 구급대원들이 지각하는 의료지도의 필요성 인식과 요구도)

  • Park, Joo-Ho
    • The Korean Journal of Emergency Medical Services
    • /
    • v.15 no.3
    • /
    • pp.31-44
    • /
    • 2011
  • Purpose: The purpose of emergency medical services (EMS) is to save lives and preserve bodily integrity in emergency situations. Because of the urgency of time and the invasiveness inflicted upon the body, such treatment carries risk to human life and the possibility of major physical and mental injury, and so must be performed by those qualified in medical practice. In emergency medical activities, 119 emergency medical technicians mainly perform these tasks, but they are not permitted to perform them independently and are required to receive medical direction. The purpose of this study is to examine the recognition of, and demand for, medical direction by 119 emergency medical technicians, in order to provide basic information for the development of a medical direction program suited to the characteristics of EMS and for studies on the efficient operation of prehospital EMS. Method: A questionnaire was administered via e-mail during July 1-31, 2010 to 675 participants (emergency medical technicians, nurses, and other emergency crews in Gyeongbuk); the 171 effective responses were used for the final analysis. Regarding the emergency medical technicians' scope of responsibilities defined in Attached Form 14 of the Enforcement Regulations of the EMS Act, t-test analysis was conducted using the means and standard deviations of the level of demand for medical direction over the scopes of responsibility of Level 1 and Level 2 emergency medical technicians as the scale of medical direction demand. General characteristics, experience, reasons for necessity, demand levels regarding emergency medical technicians and medical directors, medical direction method, workplace of the medical director, feedback content, and demand level for improvement plans were analyzed through frequencies and percentages. Experience of medical direction and its perceived necessity were analyzed using the χ² test.
Results: Regarding experience of medical direction by qualification, experience was highest among Level 1 emergency medical technicians (53.3%), and 80.3% of respondents said the experience was helpful. As for recognition of the necessity of medical direction, 71.3% responded that it was necessary, highest among nurses at 76.9%. The most common reason given for "necessary" was reducing the risks and side effects of EMS for patients (75.4%), and the most common reason for "not necessary" was the delay of EMS caused by requesting medical direction (71.4%). Regarding the level of demand across the emergency medical technicians' scope of practice, injection of a certain amount of solution during shock was highest (3.10±0.96) for Level 1 emergency medical technicians, endotracheal intubation was highest (3.12±1.03) for nurses, sublingual administration of nitroglycerin (NTG) during chest pain was highest (2.62±1.02) for Level 2 emergency medical technicians, and regulation of heartbeat using an AED was highest (2.76±0.99) for other emergency crews. For revitalizing medical direction, improvement of EMS capability (78.9%) was the most requested item for emergency crews, and the ability to evaluate a patient's medical state was the most requested quality in a medical director (80.1%). Prehospital direct medical direction was the most preferred method (60.8%), the emergency medical facility was the most preferred placement of the medical director (52.0%), evaluation of the appropriateness of EMS was the most requested feedback content (66.1%), and reinforcement of emergency crew (emergency medical technician) personnel was the most requested improvement plan (69.0%).
Conclusion: Medical direction is an important policy in prehospital EMS activity: 119 emergency medical technicians agreed on its necessity, and over 80% of those who had experienced medical direction said it was helpful. In addition, simulation training programs using algorithms and case studies with feedback are needed to enhance the technical capability of ambulance teams in the professional EMS items for which demand is high within the emergency medical technicians' scope of practice, and recognition of medical direction is essential in the EMS field. To revitalize medical direction, the following are needed: improvement of the task-performance capability of 119 emergency medical technicians and medical directors, reinforcement of emergency medical personnel, building trust between emergency medical technicians and emergency physicians, and a professional operating plan for the medical direction center, so that the direct medical direction method can be expanded to allow treatment through the participation of the medical director from the moment an emergency report is received.

Clinical Analysis of Disease Recurrence for the Patients with Secondary Spontaneous Pneumothorax (이차성 자연기흉 환자의 재발양상에 관한 분석)

  • Ryu, Kyoung-Min;Kim, Sam-Hyun;Seo, Pil-Won;Park, Seong-Sik;Ryu, Jae-Wook;Kim, Hyun-Jung
    • Journal of Chest Surgery
    • /
    • v.41 no.5
    • /
    • pp.619-624
    • /
    • 2008
  • Background: Secondary spontaneous pneumothorax is caused by various underlying lung diseases, whereas primary spontaneous pneumothorax is caused by rupture of subpleural blebs. The treatment algorithm for secondary pneumothorax differs from that for primary pneumothorax. We studied the recurrence rate, the characteristics of recurrence, and the treatment outcomes of patients with secondary spontaneous pneumothorax. Material and Method: Between March 2005 and March 2007, 85 patients were treated for their first episodes of secondary spontaneous pneumothorax. We analyzed the characteristics and factors for recurrence of secondary spontaneous pneumothorax by retrospective review of the medical records. Result: The most common underlying lung disease was pulmonary tuberculosis (49.4%), and the second was chronic obstructive lung disease (27.6%). The recurrence rate was 47.1% (40/85); the second and third recurrence rates were 10.9% and 3.5%, respectively. The mean follow-up period was 21.1±6.7 months (range: 0-36 months). Of the recurrences, 70.5% occurred within a year after the first episode. The success rates by treatment modality were thoracostomy 47.6%, chemical pleurodesis 74.4%, bleb resection 71%, and Heimlich valve application 50%. Chemical pleurodesis through the chest tube was the most effective method of treatment. The factor most predictive of recurrence was an air leak of 7 days or more at the first episode (p=0.002). Conclusion: Patients who have a prolonged air leak at the first episode of pneumothorax tend to have a higher incidence of recurrence. Further studies with more patients are necessary to determine the standard treatment protocol for secondary spontaneous pneumothorax.

Restoring Omitted Sentence Constituents in Encyclopedia Documents Using Structural SVM (Structural SVM을 이용한 백과사전 문서 내 생략 문장성분 복원)

  • Hwang, Min-Kook;Kim, Youngtae;Ra, Dongyul;Lim, Soojong;Kim, Hyunki
    • Journal of Intelligence and Information Systems
    • /
    • v.21 no.2
    • /
    • pp.131-150
    • /
    • 2015
  • Omission of noun phrases for obligatory cases is a common phenomenon in Korean and Japanese sentences that is not observed in English. When an argument of a predicate can be filled with a noun phrase co-referential with the title, the argument is particularly likely to be omitted in encyclopedia texts. The omitted noun phrase is called a zero anaphor or zero pronoun. Encyclopedias like Wikipedia are a major source for information extraction by intelligent application systems such as information retrieval and question answering systems; however, the omission of noun phrases degrades the quality of information extraction. This paper deals with the problem of developing a system that can restore omitted noun phrases in encyclopedia documents. This problem is very similar to zero anaphora resolution, one of the important problems in natural language processing. A noun phrase in the text that can be used for restoration is called an antecedent; an antecedent must be co-referential with the zero anaphor. While the candidates for the antecedent are only the noun phrases in the same text in ordinary zero anaphora resolution, the title is also a candidate in our problem. In our system, the first stage detects the zero anaphor. In the second stage, antecedent search is carried out over the candidates. If the antecedent search fails, an attempt is made in the third stage to use the title as the antecedent. The main characteristic of our system is the use of a structural SVM for finding the antecedent. The noun phrases in the text that appear before the position of the zero anaphor comprise the search space. The main technique used in previous research is binary classification over all the noun phrases in the search space, selecting as antecedent the noun phrase classified with the highest confidence.
In this paper, however, we propose that antecedent search be viewed as the problem of assigning antecedent-indicator labels to a sequence of noun phrases; in other words, sequence labeling is employed for antecedent search in the text. We are the first to suggest this idea. To perform sequence labeling, we use a structural SVM that receives a sequence of noun phrases as input and returns a sequence of labels as output. An output label takes one of two values: one indicating that the corresponding noun phrase is the antecedent and the other indicating that it is not. The structural SVM we used is based on the modified Pegasos algorithm, which exploits a subgradient descent methodology for optimization problems. To train and test our system, we selected a set of Wikipedia texts and constructed an annotated corpus providing gold-standard answers such as zero anaphors and their possible antecedents. Training examples prepared from the annotated corpus were used to train the SVMs and test the system. For zero anaphor detection, sentences are parsed by a syntactic analyzer and omitted subject or object cases are identified; thus the performance of our system depends on that of the syntactic analyzer, which is a limitation of our system. When an antecedent is not found in the text, our system tries to use the title to restore the zero anaphor, based on binary classification using a regular SVM. The experiment showed that our system achieves F1 = 68.58%, which means a state-of-the-art system can be developed with our technique. Future work enabling the system to utilize semantic information is expected to lead to a significant performance improvement.
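The Pegasos subgradient step underlying the structural SVM above can be sketched for plain binary classification (the structural case replaces the hinge term with one over label sequences); the toy data and hyperparameters are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)
# Linearly separable toy data: the label is the sign of the first feature.
X = rng.normal(size=(200, 3))
y = np.where(X[:, 0] > 0, 1.0, -1.0)

lam = 0.01
w = np.zeros(3)
for t in range(1, 3001):
    i = rng.integers(len(X))
    eta = 1.0 / (lam * t)                 # Pegasos step size schedule
    if y[i] * (w @ X[i]) < 1:             # margin violated: hinge subgradient active
        w = (1 - eta * lam) * w + eta * y[i] * X[i]
    else:                                 # only the regularizer contributes
        w = (1 - eta * lam) * w

acc = np.mean(np.sign(X @ w) == y)
```

In the structural setting, the margin check compares the score of the gold label sequence against the best-scoring wrong sequence, but the shrink-and-step update has the same form.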

Development of Multimedia Annotation and Retrieval System using MPEG-7 based Semantic Metadata Model (MPEG-7 기반 의미적 메타데이터 모델을 이용한 멀티미디어 주석 및 검색 시스템의 개발)

  • An, Hyoung-Geun;Koh, Jae-Jin
    • The KIPS Transactions:PartD
    • /
    • v.14D no.6
    • /
    • pp.573-584
    • /
    • 2007
  • As multimedia information has recently grown rapidly, various types of retrieval of multimedia data have become issues of great importance. Efficient multimedia data processing requires semantics-based retrieval techniques that can extract the meaning of multimedia content. Existing retrieval methods for multimedia data are annotation-based retrieval, feature-based retrieval, and retrieval integrating annotations and features. These systems demand a great deal of effort and time from the annotator, and feature extraction requires complicated computation. In addition, the created data supports only static search that does not change, and user-friendly, semantic search techniques are not supported. This paper develops S-MARS (Semantic Metadata-based Multimedia Annotation and Retrieval System), which can represent and retrieve multimedia data efficiently using MPEG-7. The system provides a graphical user interface for annotating, searching, and browsing multimedia data. It is implemented on the basis of a semantic metadata model for representing multimedia information. The semantic metadata about multimedia data is organized according to a multimedia description schema defined with XML Schema that complies with the MPEG-7 standard. In conclusion, the proposed scheme can be easily implemented on any multimedia platform supporting XML technology. It can enable efficient sharing of semantic metadata between systems, and it will contribute to improving retrieval correctness and user satisfaction with the embedding-based multimedia retrieval algorithm.
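The core idea above (keeping semantic metadata as standards-based XML so it can be exchanged and queried for retrieval) can be sketched as follows; the element names are a simplified illustration, not the actual MPEG-7 description schema.

```python
import xml.etree.ElementTree as ET

# A toy description document, standing in for MPEG-7-conformant metadata.
doc = """<Description>
  <MultimediaContent>
    <Video id="v1">
      <Title>Harbor at dawn</Title>
      <Annotation><FreeText>boats near the pier</FreeText></Annotation>
    </Video>
  </MultimediaContent>
</Description>"""

root = ET.fromstring(doc)
# Semantic retrieval here reduces to matching query terms against annotations.
hits = [v.findtext("Title") for v in root.iter("Video")
        if "pier" in (v.findtext("Annotation/FreeText") or "")]
print(hits)  # -> ['Harbor at dawn']
```

Because the metadata lives in plain XML, any platform with an XML parser can share and query it, which is the portability argument the abstract makes.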

Development of Gated Myocardial SPECT Analysis Software and Evaluation of Left Ventricular Contraction Function (게이트 심근 SPECT 분석 소프트웨어의 개발과 좌심실 수축 기능 평가)

  • Lee, Byeong-Il;Lee, Dong-Soo;Lee, Jae-Sung;Chung, June-Key;Lee, Myung-Chul;Choi, Heung-Kook
    • The Korean Journal of Nuclear Medicine
    • /
    • v.37 no.2
    • /
    • pp.73-82
    • /
    • 2003
  • Objectives: A new software package (Cardiac SPECT Analyzer: CSA) was developed for quantification of volumes and ejection fraction on gated myocardial SPECT. The volumes and ejection fraction from CSA were validated by comparison with those quantified by the Quantitative Gated SPECT (QGS) software. Materials and Methods: Gated myocardial SPECT was performed in 40 patients with ejection fractions from 15% to 85%. In 26 patients, gated myocardial SPECT was acquired again with the patients in situ. A cylinder model was used to eliminate noise semi-automatically, and profile data were extracted using Gaussian fitting after smoothing. The boundary points of the endo- and epicardium were found using an iterative learning algorithm. End-diastolic (EDV) and end-systolic (ESV) volumes and ejection fraction (EF) were calculated. These values were compared with those calculated by QGS; the same gated SPECT data were also quantified repeatedly by CSA, and the variation of the values across the sequential measurements of the repeated acquisitions in the same patients was examined. Results: From the 40 patient data sets, EF, EDV, and ESV by CSA were correlated with those by QGS with correlation coefficients of 0.97, 0.92, and 0.96. Two standard deviations (SD) of EF on the Bland-Altman plot was 10.1%. Repeated measurements of EF, EDV, and ESV by CSA were correlated with each other with coefficients of 0.96, 0.99, and 0.99, respectively. On repeated acquisition, reproducibility was also excellent, with correlation coefficients of 0.89, 0.97, and 0.98; coefficients of variation of 8.2%, 5.4 mL, and 8.5 mL; and 2SD of 10.6%, 21.2 mL, and 16.4 mL on the Bland-Altman plot for EF, EDV, and ESV. Conclusion: We developed the CSA software for quantification of volumes and ejection fraction on gated myocardial SPECT. The volumes and ejection fraction quantified using this software were found to be valid in terms of correctness and precision.
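The final quantification step above follows directly from the two measured volumes; the sample volumes below are illustrative, not patient data.

```python
def ejection_fraction(edv_ml, esv_ml):
    """EF (%) = (EDV - ESV) / EDV * 100, from end-diastolic and
    end-systolic volumes measured at the detected endocardial boundary."""
    return (edv_ml - esv_ml) / edv_ml * 100.0

print(round(ejection_fraction(120.0, 48.0)))  # -> 60
```

All the algorithmic effort in CSA (noise removal, Gaussian profile fitting, boundary detection) serves to make EDV and ESV accurate, after which EF is this one-line ratio.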