• Title/Summary/Keyword: Static Model


The Dynamics of Organizational Change: Moderated Mediating Effects of NBA Teams' Playoff Berth (조직변화와 성과 간 상호역동에 관한 연구: 미국프로농구팀의 트레이드와 플레이오프 진출 여부에 따른 조절된 매개효과)

  • Philsoo Kim;Tae Sung Jung;Sang Bum Lee;Sang Hyun Lee
    • Asia-Pacific Journal of Business Venturing and Entrepreneurship
    • /
    • v.18 no.4
    • /
    • pp.117-129
    • /
    • 2023
  • Organizations must pursue change in order to adapt to environmental shifts and achieve better performance. Despite this seemingly obvious statement, however, empirical analysis has been almost non-existent because organizational performance and change are difficult to manipulate. In this study, we overcame these limitations and analyzed the causes and effects of organizational change by treating a professional sports team as a venture company, for which organizational change and performance are relatively easy to measure and evaluate objectively. We systematically collected and preprocessed traditional and advanced National Basketball Association (NBA) statistics along with trade data from eight regular seasons (2014-15 through 2021-22) to analyze our research model, estimated with PROCESS macro model 7. The empirical results show that NBA teams with low organizational performance in the previous season are more likely to make organizational changes through player trades in order to improve performance. In addition, player trades mediate the static relationship between the previous season's winning percentage and the current season's winning percentage. However, the indirect effect of a team's previous-season performance through player trades varies depending on the current situation and context of each NBA team. Teams that made the playoffs in the previous season tend to make fewer trades than teams that did not, and their previous-season performance is highly correlated with their current-season performance. In contrast, teams that missed the playoffs in the previous season tend to make relatively more player trades in total, and the mediating effect of trades vanishes in this case.
In other words, even at the same winning percentage, teams that missed the playoffs in the previous season experience a larger change in performance due to trades than teams that made them. This empirical analysis of the inverse relationship between organizational change and the performance of professional sports teams has both theoretical and practical implications for the sports industry and management by examining the fundamentals of organizational change and performance.


A Study on Industries' Leading of the Stock Market in Korea - Gradual Diffusion of Information and Cross-Asset Return Predictability - (산업의 주식시장 선행성에 관한 실증분석 - 자산간 수익률 예측 가능성 -)

  • Kim Jong-Kwon
    • Proceedings of the Safety Management and Science Conference
    • /
    • 2004.11a
    • /
    • pp.355-380
    • /
    • 2004
  • I test the hypothesis that the gradual diffusion of information across asset markets leads to cross-asset return predictability in Korea. Using thirty-six industry portfolios and the broad market index as test assets, I establish several key results. First, a number of industries such as semiconductors, electronics, metals, and petroleum lead the stock market by up to one month. In contrast, the market, which is widely followed, leads only a few industries. Importantly, an industry's ability to lead the market is correlated with its propensity to forecast various indicators of economic activity such as industrial production growth. Consistent with the hypothesis, these findings indicate that the market reacts with a delay to information in industry returns about its fundamentals because information diffuses only gradually across asset markets. Traditional theories of asset pricing assume that investors have unlimited information-processing capacity. However, this assumption does not hold for many traders, even the most sophisticated ones. Many economists recognize that investors are better characterized as only boundedly rational (see Shiller (2000), Sims (2001)). Even from casual observation, few traders can pay attention to all sources of information, much less understand their impact on the prices of the assets they trade. Indeed, a large literature in psychology documents the extent to which even attention is a precious cognitive resource (see, e.g., Kahneman (1973), Nisbett and Ross (1980), Fiske and Taylor (1991)). A number of papers have explored the implications of limited information-processing capacity for asset prices; I review this literature in Section II. For instance, Merton (1987) develops a static model of multiple stocks in which investors only have information about a limited number of stocks and only trade those that they have information about.
Related models of limited market participation include Brennan (1975) and Allen and Gale (1994). As a result, stocks that are less recognized by investors have a smaller investor base (neglected stocks) and trade at a greater discount because of limited risk sharing. More recently, Hong and Stein (1999) develop a dynamic model of a single asset in which information gradually diffuses across the investment public and investors are unable to perform the rational-expectations trick of extracting information from prices. My hypothesis is that the gradual diffusion of information across asset markets leads to cross-asset return predictability. It relies on two key assumptions. The first is that valuable information that originates in one asset reaches investors in other markets only with a lag, i.e., news travels slowly across markets. The second is that, because of limited information-processing capacity, many (though not necessarily all) investors may not pay attention to, or be able to extract, the information in the asset prices of markets that they do not participate in. These two assumptions taken together lead to cross-asset return predictability. The hypothesis appears very plausible for a few reasons. To begin with, as pointed out by Merton (1987) and the subsequent literature on segmented markets and limited market participation, few investors trade all assets. Put another way, limited participation is a pervasive feature of financial markets. Indeed, even among equity money managers, there is specialization along industries, such as sector or market-timing funds. Some reasons for this limited market participation include tax, regulatory, or liquidity constraints. More plausibly, investors have to specialize because they have their hands full trying to understand the markets that they do participate in.


Analysis and Evaluation of Frequent Pattern Mining Technique based on Landmark Window (랜드마크 윈도우 기반의 빈발 패턴 마이닝 기법의 분석 및 성능평가)

  • Pyun, Gwangbum;Yun, Unil
    • Journal of Internet Computing and Services
    • /
    • v.15 no.3
    • /
    • pp.101-107
    • /
    • 2014
  • With the development of online services, recent databases have shifted from static structures to dynamic stream structures. Previous data mining techniques have been used as decision-making tools, for example in establishing marketing strategies and in DNA analyses. However, the capability to analyze real-time data more quickly is needed in areas of recent interest such as sensor networks, robotics, and artificial intelligence. Landmark window-based frequent pattern mining, one of the stream mining approaches, performs mining operations on parts of a database or on each transaction, instead of on all the data. In this paper, we analyze and evaluate two well-known landmark window-based frequent pattern mining algorithms, Lossy counting and hMiner. When Lossy counting mines frequent patterns from a set of new transactions, it performs union operations between the previous and current mining results. hMiner, a state-of-the-art algorithm based on the landmark window model, conducts mining operations whenever a new transaction occurs. Since hMiner extracts frequent patterns as soon as a new transaction is entered, we can obtain the latest mining results reflecting real-time information; for this reason, such algorithms are also called online mining approaches. We evaluate and compare the performance of the primitive algorithm, Lossy counting, and the latest one, hMiner. As criteria for our performance analysis, we first consider the algorithms' total runtime and average processing time per transaction. In addition, to compare the efficiency of their storage structures, maximum memory usage is also evaluated. Lastly, we show how stably the two algorithms conduct their mining work on databases featuring gradually increasing numbers of items.
With respect to mining time and transaction processing, hMiner is faster than Lossy counting. Since hMiner stores candidate frequent patterns in a hash structure, it can access them directly, whereas Lossy counting stores them in a lattice and must search multiple nodes to reach a candidate pattern. On the other hand, hMiner performs worse than Lossy counting in terms of maximum memory usage. hMiner must keep all of the information for each candidate frequent pattern in its hash buckets, while Lossy counting reduces this information by using the lattice, whose storage can share items concurrently included in multiple patterns; its memory usage is therefore more efficient than hMiner's. However, hMiner is more efficient than Lossy counting in the scalability evaluation, for the following reasons. As the number of items increases, the number of shared items decreases, weakening Lossy counting's memory efficiency; furthermore, as the number of transactions grows, its pruning effect worsens. From the experimental results, we conclude that landmark window-based frequent pattern mining algorithms are suitable for real-time systems, although they require a significant amount of memory. Hence, their data structures need to be made more efficient so that they can also be used in resource-constrained environments such as wireless sensor networks (WSNs).
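As a concrete reference point for the Lossy counting side of the comparison, here is a minimal Python sketch of Manku and Motwani's Lossy Counting over a stream of single items (the algorithms in the paper operate on transactions of itemsets; this simplified version only illustrates the bucket-and-prune mechanism the memory discussion refers to):

```python
import math

def lossy_count(stream, epsilon=0.01):
    """Approximate item frequencies over a stream (Lossy Counting).
    Reported counts under-estimate true counts by at most epsilon * N."""
    width = math.ceil(1 / epsilon)        # bucket width
    counts, deltas = {}, {}               # item -> count, max possible error
    bucket = 1
    for n, item in enumerate(stream, start=1):
        if item in counts:
            counts[item] += 1
        else:
            counts[item] = 1
            deltas[item] = bucket - 1     # it may have been seen and pruned
        if n % width == 0:                # end of bucket: prune rare items
            for it in [k for k in counts if counts[k] + deltas[k] <= bucket]:
                del counts[it], deltas[it]
            bucket += 1
    return counts, deltas

# Items with true frequency >= s*N are guaranteed to survive when s > epsilon.
stream = list("aababcabcd" * 50)          # a:200, b:150, c:100, d:50 of N=500
counts, _ = lossy_count(stream, epsilon=0.1)
print(sorted(counts))                     # → ['a', 'b', 'c']
```

The pruning step at each bucket boundary is what keeps memory bounded, and it is also why growing item counts and weaker pruning hurt the algorithm in the scalability experiment described above.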

Analyses of the Efficiency in Hospital Management (병원 단위비용 결정요인에 관한 연구)

  • Ro, Kong-Kyun;Lee, Seon
    • Korea Journal of Hospital Management
    • /
    • v.9 no.1
    • /
    • pp.66-94
    • /
    • 2004
  • The objective of this study is to examine how to maximize the efficiency of hospital management by minimizing the unit cost of hospital operation. For this purpose, this paper develops a profit-maximization model based on the cost-minimization dictum, estimated by maximum likelihood. The preliminary survey data are collected from the annual statistics and analyses published by the Korea Health Industry Development Institute and the Korean Hospital Association. Maximum likelihood analyses are conducted on the cost (function) information of each of 36 hospitals selected by stratified random sampling according to hospital size and location (urban or rural). We believe that, although the sample is relatively small, the sampling method used and the high response rate make the estimates from the statistical analyses of the sample hospitals acceptable. The conceptual framework of the analyses is adopted from the various models of hospital cost determinants used in previous studies. Within this framework, the study postulates that the unit cost of hospital operation is determined by size; scope of service; technology (production function), as measured by capacity utilization, the labor-capital ratio, and labor input-mix variables; and exogenous variables. The variables representing these cost determinants are selected by stepwise regression, so that only statistically significant variables are used in analyzing how they affect the hospital unit cost. The results show that the chosen models of hospital cost determinants fit well: the overall goodness of fit (R2) is significant in every model, regardless of which variables represent the cost determinants.
Specifically, the size and scope of service, however measured (number of admissions per bed, number of ambulatory visits per bed, adjusted inpatient days, or adjusted outpatients), reduce hospital unit costs, whether measured per admission, per inpatient day, or per office visit, implying an economy of scale in hospital operation. Third, the technology used in operating a hospital affects the unit cost in ways similar to those postulated in the static theory of the firm. For example, capacity utilization, represented by inpatient days per employee, turned out to have a statistically significant negative impact on the unit cost of hospital operation, while payroll expenses per inpatient had a positive effect. The input mix, represented by the ratio of doctors, nurses, or medical staff to general employees, supports the known thesis that specialized manpower costs more than general employees. The labor-capital ratio, represented by employees per 100 beds, has the expected positive effect on cost. As for the exogenous variable, when it is represented by the percentage of urban population at the hospital's location, the regression analysis shows that hospitals located in urban areas have higher costs than those in rural areas. Finally, the case study of the sample hospitals gives hospital administrators specific information about how their costs compare with those of other hospitals. For example, if a hospital is small and located in a city, its administrator can compare its various operating costs with those of other similar hospitals,
and may thereby identify which cost determinants make the hospital's operating cost higher or lower than that of similar hospitals.


The influence of occlusal loads on stress distribution of cervical composite resin restorations: A three-dimensional finite element study (교합력이 치경부 복합레진 수복물의 응력분포에 미치는 영향에 관한 3차원 유한요소법적 연구)

  • Park, Chan-Seok;Hur, Bock;Kim, Hyeon-Cheol;Kim, Kwang-Hoon;Son, Kwon;Park, Jeong-Kil
    • Proceedings of the KACD Conference
    • /
    • 2008.05a
    • /
    • pp.246-257
    • /
    • 2008
  • The purpose of this study was to investigate the influence of various occlusal loading sites and directions on the stress distribution of cervical composite resin restorations of the maxillary second premolar, using three-dimensional (3D) finite element (FE) analysis. An extracted maxillary second premolar was scanned serially with Micro-CT (SkyScan1072; SkyScan, Aartselaar, Belgium). The 3D images were processed by 3D-DOCTOR (Able Software Co., Lexington, MA, USA). HyperMesh (Altair Engineering, Inc., Troy, USA) and ANSYS (Swanson Analysis Systems, Inc., Houston, USA) were used to mesh and analyze the 3D FE model. A notch-shaped cavity was filled with hybrid (Z100, 3M Dental Products, St. Paul, MN, USA) or flowable resin (Tetric Flow, Vivadent Ets., FL-9494 Schaan, Liechtenstein), and each restoration was simulated with an adhesive layer thickness of $40{\mu}m$. A static load of 200 N was applied at three points on the buccal incline of the palatal cusp and oriented in $20^{\circ}$ increments, from vertical (the long axis of the tooth) to an oblique $40^{\circ}$ direction towards the buccal. The maximum principal stresses in the occlusal and cervical cavosurface margins and in a vertical section of the buccal surface of the notch-shaped class V cavity were analyzed using ANSYS. As the angle of the loading direction increased, tensile stress increased; the loading site had little effect on it. Under the same loading conditions, Tetric Flow showed relatively lower stress than Z100 overall, except at both point angles. The loading direction and the elastic modulus of the restorative material seem to be important factors in cervical restorations.


The influence of occlusal loads on stress distribution of cervical composite resin restorations: A three-dimensional finite element study (교합력이 치경부 복합레진 수복물의 응력분포에 미치는 영향에 관한 3차원 유한요소법적 연구)

  • Park, Chan-Seok;Hur, Bock;Kim, Hyeon-Cheol;Kim, Kwang-Hoon;Son, Kwon;Park, Jeong-Kil
    • Restorative Dentistry and Endodontics
    • /
    • v.33 no.3
    • /
    • pp.246-257
    • /
    • 2008
  • The purpose of this study was to investigate the influence of various occlusal loading sites and directions on the stress distribution of cervical composite resin restorations of the maxillary second premolar, using three-dimensional (3D) finite element (FE) analysis. An extracted maxillary second premolar was scanned serially with Micro-CT (SkyScan1072; SkyScan, Aartselaar, Belgium). The 3D images were processed by 3D-DOCTOR (Able Software Co., Lexington, MA, USA). HyperMesh (Altair Engineering, Inc., Troy, USA) and ANSYS (Swanson Analysis Systems, Inc., Houston, USA) were used to mesh and analyze the 3D FE model. A notch-shaped cavity was filled with hybrid (Z100, 3M Dental Products, St. Paul, MN, USA) or flowable resin (Tetric Flow, Vivadent Ets., FL-9494 Schaan, Liechtenstein), and each restoration was simulated with an adhesive layer thickness of $40{\mu}m$. A static load of 200 N was applied at three points on the buccal incline of the palatal cusp and oriented in $20^{\circ}$ increments, from vertical (the long axis of the tooth) to an oblique $40^{\circ}$ direction towards the buccal. The maximum principal stresses in the occlusal and cervical cavosurface margins and in a vertical section of the buccal surface of the notch-shaped class V cavity were analyzed using ANSYS. As the angle of the loading direction increased, tensile stress increased; the loading site had little effect on it. Under the same loading conditions, Tetric Flow showed relatively lower stress than Z100 overall, except at both point angles. The loading direction and the elastic modulus of the restorative material seem to be important factors in cervical restorations.

Availability Assessment of Single Frequency Multi-GNSS Real Time Positioning with the RTCM-State Space Representation Parameters (RTCM-SSR 보정요소 기반 1주파 Multi-GNSS 실시간 측위의 효용성 평가)

  • Lee, Yong-Chang;Oh, Seong-Jong
    • Journal of Cadastre & Land InformatiX
    • /
    • v.50 no.1
    • /
    • pp.107-123
    • /
    • 2020
  • With the recent stabilization of the multi-GNSS infrastructure, multi-GNSS has proven effective in improving positioning accuracy in various industrial sectors. In this study, given that single-frequency (SF) GNSS receivers are widely used due to their low cost, we evaluate the effectiveness of SF real-time point positioning (SF-RT-PP) based on four multi-GNSS surveying methods with RTCM-SSR correction streams in static and kinematic modes, and derive the remaining challenges. Among the SSR correction streams applied, the CNES stream gave better 2D coordinate results than the others. Examining the SF-RT-PP surveying results using SF signals from multi-GNSS, we were able to identify the common cause of large deviations in the altitude component, and to confirm the importance of signal bias correction for different combinations of satellite signal types and of an ionospheric delay compensation algorithm using undifferenced and uncombined observations. In addition, we confirmed that the improved multi-GNSS infrastructure allows SF-RT-SPP surveying with only one of the four GNSS constellations. In particular, for code-based SF-RT-SPP measurements using SF signals from GPS satellites only, the difference between broadcast ephemeris and SSR corrections for satellite orbits/clocks was small, but for ionospheric delay compensation, SBAS correction information provided more than twice the accuracy of the Klobuchar model. With GPS and GLONASS already available, both the BDS and GALILEO constellations were to be fully deployed by the end of 2020, and greater benefits from multi-GNSS integration can be expected.
Furthermore, if RT ionospheric correction services reflecting regional characteristics and SSR correction information reflecting atmospheric characteristics are delivered in real time, we expect that the use of multi-GNSS SF-RT-PPP survey technology will spread and that various demands will be created in many industrial sectors.

FINITE ELEMENT ANALYSIS OF MAXILLARY CENTRAL INCISORS RESTORED WITH VARIOUS POST-AND-CORE APPLICATIONS (여러가지 post-and-core로 수복된 상악 중절치의 유한요소법적 연구)

  • Seo, Min-Seock;Shon, Won-Jun;Lee, Woo-Cheol;Yoo, Hyun-Mi;Cho, Byeong-Hoon;Baek, Seung-Ho
    • Restorative Dentistry and Endodontics
    • /
    • v.34 no.4
    • /
    • pp.324-332
    • /
    • 2009
  • The purpose of this study was to investigate the effect of the rigidity of post-and-core systems on stress distribution using the finite element stress analysis method. Three-dimensional finite element models simulating an endodontically treated maxillary central incisor restored with a zirconia ceramic crown were prepared, with a 1.5 mm ferrule height provided. Each model contained cortical bone, trabecular bone, the periodontal ligament, a 4 mm apical root canal filling, and a post-and-core. Six combinations of three parallel-type post materials (zirconia ceramic, glass fiber, and stainless steel) and two core materials (ParaCore and Tetric Ceram) were evaluated. A 50 N static occlusal load was applied to the palatal surface of the crown at a $60^{\circ}$ angle to the long axis of the tooth. The differences in the stress transfer characteristics of the models were analyzed. von Mises stresses were chosen for presentation of the results, and maximum displacement and hydrostatic pressure were also calculated. An increase in the elastic modulus of the post material increased the stress but shifted the maximum stress location from the dentin surface to the post material. The buccal side of the cervical region (the junction of core and crown) of the tooth restored with a glass fiber post was subjected to the highest stress concentration. The maximum von Mises stress in the remaining radicular tooth structure for the low elastic modulus resin core (29.21 MPa) was slightly higher than that for the high elastic modulus resin core (29.14 MPa) in the case of the glass fiber post. The maximum displacement of the tooth restored with a glass fiber post was higher than that of a tooth restored with a zirconia ceramic or stainless steel post.

The influence of composite resin restoration on the stress distribution of notch shaped noncarious cervical lesion: A three dimensional finite element analysis study (복합레진 수복물이 쐐기형 비우식성 치경부 병소의 응력 분포에 미치는 영향에 관한 3차원 유한요소법적 연구)

  • Lee, Chae-Kyung;Park, Jeong-Kil;Kim, Hyeon-Cheol;Woo, Sung-Gwan;Kim, Kwang-Hoon;Son, Kwon;Hur, Bock
    • Restorative Dentistry and Endodontics
    • /
    • v.32 no.1
    • /
    • pp.69-79
    • /
    • 2007
  • The purpose of this study was to investigate the effects of composite resin restorations on the stress distribution of a notch-shaped noncarious cervical lesion using three-dimensional (3D) finite element analysis (FEA). An extracted maxillary second premolar was scanned serially with Micro-CT (SkyScan1072; SkyScan, Aartselaar, Belgium). The 3D images were processed by 3D-DOCTOR (Able Software Co., Lexington, MA, USA). ANSYS (Swanson Analysis Systems, Inc., Houston, USA) was used to mesh and analyze the 3D FE model. The notch-shaped cavity was filled with hybrid or flowable resin, and each restoration was simulated with an adhesive layer thickness of $40{\mu}m$. A static load of 500 N was applied as a point load at the buccal cusp (loading A) and the palatal cusp (loading B). The principal stresses at the lesion apex (the internal line angle of the cavity) and the middle vertical wall were analyzed using ANSYS. The results were as follows. 1. Under loading A, compressive stress is created in the unrestored and restored cavity; under loading B, tensile stress is created. The peak stress concentration is seen near the mesial corner of the cavity under each load condition. 2. Compared to the unrestored cavity, the principal stresses at the cemento-enamel junction (CEJ) and the internal line angle of the cavity were reduced in the restored cavity under both load conditions. 3. In teeth restored with hybrid composite, the principal stresses at the CEJ and the internal line angle of the cavity were reduced more than with flowable resin.

The Need for Paradigm Shift in Semantic Similarity and Semantic Relatedness : From Cognitive Semantics Perspective (의미간의 유사도 연구의 패러다임 변화의 필요성-인지 의미론적 관점에서의 고찰)

  • Choi, Youngseok;Park, Jinsoo
    • Journal of Intelligence and Information Systems
    • /
    • v.19 no.1
    • /
    • pp.111-123
    • /
    • 2013
  • The semantic similarity/relatedness measure between two concepts plays an important role in research on system integration and database integration. Moreover, current research on keyword recommendation and tag clustering strongly depends on this kind of semantic measure. For this reason, many researchers in various fields, including computer science and computational linguistics, have tried to improve methods for calculating semantic similarity/relatedness. The study of similarity between concepts aims to discover how a computational process can model a human's judgment of the relationship between two concepts. Most research on calculating semantic similarity uses ready-made reference knowledge, such as a semantic network or dictionary, to measure concept similarity. The topological method calculates relatedness or similarity between concepts based on some form of semantic network, including a hierarchical taxonomy. This approach assumes that the semantic network reflects human knowledge well. The nodes in a network represent concepts, and ways to measure the similarity between two nodes are also regarded as ways to determine the conceptual similarity of two words (i.e., two nodes in a network). Topological methods can be categorized as node-based or edge-based, also called the information content approach and the conceptual distance approach, respectively. The node-based approach calculates similarity between concepts based on how much information the two concepts share in terms of a semantic network or taxonomy, while the edge-based approach estimates the distance between the nodes that correspond to the concepts being compared. Both approaches have assumed that the semantic network is static; that is, the topological approach has not considered changes in the semantic relations between concepts in the network.
However, as information and communication technologies advance knowledge sharing among people, the semantic relations between concepts in a semantic network may change. To explain such change, we adopt cognitive semantics. The basic assumption of cognitive semantics is that humans judge semantic relations based on their cognition and understanding of concepts, called 'world knowledge.' World knowledge can be categorized as personal knowledge and cultural knowledge. Personal knowledge is knowledge from personal experience; everyone can have different personal knowledge of the same concept. Cultural knowledge is the knowledge shared by people living in the same culture or using the same language; people in the same culture have a common understanding of specific concepts. Cultural knowledge can be the starting point for discussing change in semantic relations: if the culture shared by people changes for some reason, their cultural knowledge may also change. Today's society and culture are changing at a fast pace, and the change of cultural knowledge is not a negligible issue in research on semantic relationships between concepts. In this paper, we propose future directions for research on semantic similarity; in other words, we discuss how such research can reflect changes in semantic relations caused by changes in cultural knowledge. We suggest three directions for future research. First, research should include versioning and update methodologies for semantic networks. Second, dynamically generated semantic networks can be used to calculate semantic similarity between concepts; if researchers can develop a methodology to extract a semantic network from a given knowledge base in real time, this approach can solve many problems related to the change of semantic relations.
Third, a statistical approach based on corpus analysis can be an alternative to methods using a semantic network. We believe that these proposed research directions can be milestones for research on semantic relations.
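The edge-based (conceptual distance) approach contrasted with the node-based one above can be illustrated in a few lines of Python over a toy is-a taxonomy; the hierarchy and the 1/(1 + distance) scoring below are assumptions for illustration, not a method proposed by the paper.

```python
def ancestors(taxonomy, node):
    """Chain of hypernyms from node up to the root, inclusive."""
    chain = [node]
    while node in taxonomy:
        node = taxonomy[node]
        chain.append(node)
    return chain

def path_similarity(taxonomy, a, b):
    """Edge-based similarity: 1 / (1 + length of the shortest path
    between a and b through their lowest common ancestor)."""
    anc_a, anc_b = ancestors(taxonomy, a), ancestors(taxonomy, b)
    common = set(anc_a) & set(anc_b)
    dist = min(anc_a.index(c) + anc_b.index(c) for c in common)
    return 1 / (1 + dist)

# A toy is-a hierarchy (child -> parent); names are illustrative only.
taxonomy = {"dog": "mammal", "cat": "mammal", "mammal": "animal",
            "sparrow": "bird", "bird": "animal"}
print(path_similarity(taxonomy, "dog", "cat"))      # → 0.3333333333333333
print(path_similarity(taxonomy, "dog", "sparrow"))  # → 0.2
```

Because the score is computed entirely from the fixed taxonomy, it cannot register any change in how a culture relates "dog" and "cat" unless the network itself is re-extracted or versioned, which is precisely the limitation the proposed research directions address.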