• Title/Summary/Keyword: Product Complexity


Strategic Issues in Managing Complexity in NPD Projects (신제품개발 과정의 복잡성에 대한 주요 연구과제)

  • Kim, Jongbae
    • Asia Marketing Journal
    • /
    • v.7 no.3
    • /
    • pp.53-76
    • /
    • 2005
  • With rapid technological and market change, new product development (NPD) complexity is a significant issue that organizations continually face in their development projects. Numerous factors cause development projects to become increasingly costly and complex. A product is more likely to be successfully developed and marketed when the complexity inherent in NPD projects is clearly understood and carefully managed. Based on previous studies, this study examines the nature and importance of complexity in developing new products and then identifies several issues in managing it, including the definition of complexity, the consequences of complexity, and methods for managing complexity in NPD projects. To achieve high performance in managing complexity in development projects, issues such as the following need to be addressed. A. Complexity inherent in NPD projects is multi-faceted and multidimensional. What factors need to be considered in defining and/or measuring complexity in a development project? For example, is it sufficient to define complexity only from a technological perspective, or is it more desirable to consider the entire array of complexity sources that NPD teams with different functions (e.g., marketing, R&D, manufacturing) face in the development process? Moreover, is it sufficient to measure complexity only once during a development project, or is it more effective and useful to trace complexity changes over the entire development life cycle? B. Complexity inherent in a project can have negative as well as positive influences on NPD performance. Which complexity impacts are usually considered negative, and which positive? Project complexity can also affect the entire organization, so complexity is better assessed from a broader, longer-term perspective. What are some ways in which the long-term impact of complexity on an organization can be assessed and managed? C. Based on previous studies, several approaches for managing complexity are derived. What are the strengths and weaknesses of each approach? Is there a desirable hierarchy or order among these approaches when more than one is used? Do outcomes differ by industry and product type (incremental or radical)? Answers to these and other questions can help organizations effectively manage the complexity inherent in most development projects. Complexity is worthy of additional attention from researchers and practitioners alike. Large-scale empirical investigations, jointly conducted by researchers and practitioners, will help gain useful insights into understanding and managing complexity. Organizations that can accurately identify, assess, and manage the complexity inherent in projects are likely to gain important competitive advantages.

  • PDF

Low Latency Algorithms for Iterative Codes

  • Choi, Seok-Soon;Jung, Ji-Won;Bae, Jong-Tae;Kim, Min-Hyuk;Choi, Eun-A
    • The Journal of Korean Institute of Communications and Information Sciences
    • /
    • v.32 no.3C
    • /
    • pp.205-215
    • /
    • 2007
  • This paper presents low-latency and/or low-computation algorithms for iterative codes, namely turbo codes, turbo product codes, and low-density parity-check codes, for use in wireless broadband communication systems. Because of the high decoding complexity of iterative codes, this paper focuses on lower-complexity and/or lower-latency algorithms that are easily implementable in hardware and further accelerate the decoding speed.

The Research on the Product's Aesthetic Influential Factors of Simplicity/Complexity and Applying functions to Its Shape, and the Consumers' Preferences Related to Them (심미적 영향요소인 단순/복잡과 제품 형태의 기능 표현 지각 그리고 선호도의 관계)

  • Cho Kwang-Soo
    • Science of Emotion and Sensibility
    • /
    • v.8 no.1
    • /
    • pp.63-74
    • /
    • 2005
  • This research examines the relationship among simplicity/complexity, other product aesthetics, and consumer preference. The goal is to answer two questions: 'How much does simplicity/complexity influence the functional shapes expressed on products?' and 'What is the relationship between simplicity/complexity and consumer preference?' The process is as follows: first, products are categorized by analysis; second, each category group is analyzed for consumers' fading propensity; finally, the findings are verified through a design process.

  • PDF

Low Complexity GF(2^m) Multiplier based on AOP (회로 복잡도를 개선한 AOP 기반의 GF(2^m) 승산기)

  • 변기영;성현경;김흥수
    • Proceedings of the IEEK Conference
    • /
    • 2003.07c
    • /
    • pp.2633-2636
    • /
    • 2003
  • This study focuses on a new hardware design for a fast, low-complexity multiplier over GF(2^m). The proposed multiplier is based on the irreducible all-one polynomial (AOP) of degree m, which reduces the system's complexity. It is composed of cyclic shift, partial product, and modular summation blocks, and consists of (m+1)^2 2-input AND gates and m(m+1) 2-input XOR gates. Our architecture is very regular and modular and is therefore well suited for VLSI implementation.

  • PDF
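The AOP trick summarized in this abstract, using the identity x^(m+1) = 1 modulo the all-one polynomial so that multiplication reduces to partial products, a cyclic fold, and one final reduction, can be sketched in software. This is a hypothetical bit-level model of the arithmetic for illustration, not the paper's gate-level circuit:

```python
def gf_aop_mult(a, b, m):
    """Multiply a, b in GF(2^m) defined by the all-one polynomial
    p(x) = x^m + x^(m-1) + ... + x + 1 (irreducible when m+1 is prime
    and 2 is a primitive root mod m+1, e.g. m = 2, 4, 10, 12).
    Elements are bitmasks: bit i holds the coefficient of x^i."""
    # Carry-less polynomial multiplication (the "partial product" step).
    prod = 0
    for i in range(m):
        if (b >> i) & 1:
            prod ^= a << i
    # Since x^(m+1) = 1 mod p(x), fold every exponent modulo m+1
    # (the role played by the cyclic-shift block in the architecture).
    folded = 0
    for e in range(2 * m - 1):
        if (prod >> e) & 1:
            folded ^= 1 << (e % (m + 1))
    # Reduce any leftover x^m term: x^m = x^(m-1) + ... + x + 1.
    if (folded >> m) & 1:
        folded ^= (1 << (m + 1)) - 1  # clears bit m, flips bits 0..m-1
    return folded
```

For m = 4, for example, x * x^3 folds to x^3 + x^2 + x + 1 and x^2 * x^3 folds to 1, matching the field defined by x^4 + x^3 + x^2 + x + 1.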

Maximum Product Detection Algorithm for Group Testing Frameworks

  • Seong, Jin-Taek
    • The Journal of Korea Institute of Information, Electronics, and Communication Technology
    • /
    • v.13 no.2
    • /
    • pp.95-101
    • /
    • 2020
  • In this paper, we consider a group testing (GT) framework, whose goal is to find a set of defective samples out of a large number of samples. For this framework, we propose a maximum product detection algorithm (MPDA) based on maximum a posteriori (MAP) probability. The key idea of the algorithm is iterative detection that propagates belief to neighboring samples by exchanging marginal probabilities between samples and test outputs. Belief propagation, the conventional approach, has been used to detect defective samples, but it incurs high computational complexity in obtaining the marginal probabilities at the output nodes, which combine the marginal probabilities from the sample nodes. We show that the proposed MPDA reduces computational complexity by up to 12% in runtime while its performance is only slightly degraded compared with belief propagation, and we provide simulations comparing the performance of the two algorithms.
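The GT setting described here can be illustrated with a toy simulation. The decoder below is the simple COMP rule (clear every member of a negative pool), not the paper's MAP-based MPDA, and all parameters are invented for illustration:

```python
import random

def comp_decode(pools, results, n):
    """COMP rule: any sample appearing in a pool that tested negative is
    non-defective; everything never cleared is declared defective.
    COMP may over-report, but it never misses a true defective."""
    defective = set(range(n))
    for pool, res in zip(pools, results):
        if not res:                     # negative pool clears its members
            defective -= set(pool)
    return defective

def simulate(n=100, k=3, tests=40, seed=1):
    """Random design: each test pools n//4 samples chosen at random."""
    rng = random.Random(seed)
    true_def = set(rng.sample(range(n), k))
    pools = [rng.sample(range(n), n // 4) for _ in range(tests)]
    results = [bool(set(p) & true_def) for p in pools]  # noiseless OR
    return true_def, comp_decode(pools, results, n)
```

With enough tests the decoded set shrinks to the true defective set; with too few, COMP returns a superset, which is the false-positive behavior that MAP-style decoders aim to reduce.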

Effect of Product Involvement and Brand Preference on Consumers' Evaluation Effort for Multi-Dimensional Prices (소비자의 다차원가격 평가노력에 대한 제품관여도와 브랜드선호도의 영향)

  • Kim, Jae-Yeong
    • Journal of Distribution Science
    • /
    • v.13 no.4
    • /
    • pp.55-64
    • /
    • 2015
  • Purpose - Multi-dimensional prices comprise multiple components such as monthly payments and a number of payments rather than a single lump-sum amount. According to previous studies, an increase in the number of price dimensions leads to a massive amount of cognitive stress resulting in incorrect calculation, and deterioration in the consistency of the price judgment. However, an increase only in the level of complexity of calculating multi-dimensional prices does not always result in a corresponding decrease in the accuracy of price evaluation. Since diverse variables could affect consumers' purchase-decision-making process, the results of price evaluation would be different. In this study, an empirical analysis was performed to determine how the accuracy of price evaluation varies depending on the extent of the complexity of price dimensions using product involvement and brand preference as moderating variables. Research design, data, and methodology - A survey was conducted on 260 students, and 252 effective responses were used for analysis. The data was analyzed using t-test, one-way ANOVA, and two-way ANOVA. In this study, six hypotheses were developed to examine the effect of product involvement and brand preference on consumers' evaluation effort of multi-dimensional prices. Results - As the number of price dimensions increased, accuracy of price evaluation appeared to be low in high involvement, as expected. However, it showed no differences in price evaluation effort when the level of complexity of calculating multi-dimensional prices is low. When a small number of price dimensions are presented in both cases of high and low involvement, accuracy of price evaluation is much higher in a weak brand preference. On the contrary, a strong brand preference enhances an accuracy of price evaluation only in case of low involvement when the number of price dimensions is increased. 
No interaction effect of product involvement and brand preference on consumers' evaluation of multi-dimensional prices existed, irrespective of whether the level of complexity of calculating prices was high or low. Conclusions - When the number of price dimensions is small, consumers' effort for price evaluation shows almost no difference without the moderating effect of involvement, and a weak brand preference leads to a higher accuracy of price evaluation in an effort to make the best selection. No interaction effect of product involvement and brand preference was found except for a main effect of brand preference. When a price is composed of multiple dimensions, making the final price more difficult to calculate, the effort for price evaluation was expected to decrease only slightly under the combination of high involvement and strong brand preference, because consumers have higher purchase intention toward, and trust in, that particular brand. However, the accuracy of price evaluation was much lower in cases of high involvement, and there was no interaction effect between product involvement and brand preference beyond the respective main effects of involvement and brand preference.

New Simplified Sum-Product Algorithm for Low Complexity LDPC Decoding (복잡도를 줄인 LDPC 복호를 위한 새로운 Simplified Sum-Product 알고리즘)

  • Han, Jae-Hee;SunWoo, Myung-Hoon
    • The Journal of Korean Institute of Communications and Information Sciences
    • /
    • v.34 no.3C
    • /
    • pp.322-328
    • /
    • 2009
  • This paper proposes a new simplified sum-product (SSP) decoding algorithm to improve the BER performance of low-density parity-check codes. The proposed SSP algorithm replaces multiplications and divisions with additions and subtractions without extra computations. In addition, it simplifies both ln[tanh(x)] and tanh^-1[exp(x)] by using two quantization tables, which eliminates tremendous computational complexity. Simulation results show that the proposed SSP algorithm improves BER performance by about 0.3 ~ 0.8 dB compared with existing modified sum-product algorithms.
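The table lookups mentioned in this abstract target the transform phi(x) = -ln(tanh(x/2)) and its inverse, which is the same function (phi is self-inverse on (0, inf)) and turns the check-node product into a sum. A minimal sketch with a hypothetical 64-entry uniform table; the paper's actual table design and precision are not specified here:

```python
import math

def phi(x):
    # phi(x) = -ln(tanh(x/2)); self-inverse, so phi(phi(x)) == x
    return -math.log(math.tanh(x / 2))

# Hypothetical uniform quantization over (0, 8] with 64 entries.
N = 64
STEP = 8.0 / N
TABLE = [phi(STEP * (i + 0.5)) for i in range(N)]  # midpoint values

def phi_q(x):
    """Table-lookup approximation of phi (clamps large inputs)."""
    i = min(int(x / STEP), N - 1)
    return TABLE[i]

def check_node_mag(mags):
    """Check-node magnitude update via the phi transform:
    phi(sum of phi(m)) over the incoming message magnitudes."""
    return phi_q(sum(phi_q(m) for m in mags))
```

The quantized update tracks the exact value phi(phi(m1) + phi(m2)) to within the table's resolution, which is the trade-off between table size and the BER loss discussed in the abstract.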

High-Performance and Low-Complexity Decoding of High-Weight LDPC Codes (높은 무게 LDPC 부호의 저복잡도 고성능 복호 알고리즘)

  • Cho, Jun-Ho;Sung, Won-Yong
    • The Journal of Korean Institute of Communications and Information Sciences
    • /
    • v.34 no.5C
    • /
    • pp.498-504
    • /
    • 2009
  • A high-performance, low-complexity decoding algorithm for LDPC codes is proposed in this paper, combining the advantages of the bit-flipping (BF) algorithm and the sum-product algorithm (SPA). The proposed soft bit-flipping algorithm requires only simple comparison and addition operations for computing the messages between bit and check nodes, and the number of those operations is small. By increasing the utilization ratio of the computed messages and by adopting nonuniform quantization, the signal-to-noise ratio (SNR) gap to the SPA is reduced to 0.4 dB at a frame error rate of 10^-4 with only 5-bit quantization. LDPC codes with high column or row weights, which are unsuitable for SPA decoding because of its complexity, can thus be practically implemented without much degradation of the error performance.
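For contrast with the soft variant proposed in this paper, plain hard-decision bit flipping (Gallager's original rule, not the authors' algorithm) can be sketched as follows; H is given as a list of check rows, each listing the bit indices it covers:

```python
def bit_flip_decode(H, y, max_iter=20):
    """Hard-decision bit flipping: repeatedly flip the bits that sit in
    the most unsatisfied parity checks. The paper's soft variant
    additionally weights flip decisions with channel reliabilities."""
    x = list(y)
    n = len(y)
    for _ in range(max_iter):
        # Syndrome: parity of each check under the current estimate.
        syn = [sum(x[j] for j in row) % 2 for row in H]
        if not any(syn):
            return x                    # all checks satisfied
        # Count unsatisfied checks touching each bit.
        votes = [0] * n
        for s, row in zip(syn, H):
            if s:
                for j in row:
                    votes[j] += 1
        worst = max(votes)
        # Flip every bit tied for the most unsatisfied checks.
        x = [b ^ (1 if v == worst else 0) for b, v in zip(x, votes)]
    return x
```

On a small code such as the (7,4) Hamming code, a single flipped bit is corrected within a few iterations; the algorithm uses only the comparisons and additions the abstract refers to, but without soft information its error performance lags the SPA.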

High Bit-Rates Quantization of the First-Order Markov Process Based on a Codebook-Constrained Sample-Adaptive Product Quantizers (부호책 제한을 가지는 표본 적응 프로덕트 양자기를 이용한 1차 마르코프 과정의 고 전송률 양자화)

  • Kim, Dong-Sik
    • Journal of the Institute of Electronics Engineers of Korea SP
    • /
    • v.49 no.1
    • /
    • pp.19-30
    • /
    • 2012
  • In digital data compression, quantization is the main part of lossy source coding. To improve quantization performance, a vector quantizer (VQ) can be employed; however, the encoding complexity increases exponentially as the vector dimension or bit rate grows. Much research has been conducted to alleviate such problems of VQ. Especially for high bit rates, a constrained VQ called the sample-adaptive product quantizer (SAPQ) has been proposed to reduce the huge encoding complexity of regular VQs. SAPQ has a structure very similar to that of the product VQ (PQ), yet its quantizer performance can be better than the PQ case, and its encoding complexity and codebook memory requirement are lower than those of a regular full-search VQ. Among SAPQs, 1-SAPQ has a simple quantizer structure in which each product codebook is symmetric with respect to the diagonal line in the underlying vector space, and it is known to perform well for i.i.d. sources. This paper studies the design of 1-SAPQ for the first-order Markov process. For an efficient design of 1-SAPQ, an algorithm for the initial codebook is proposed, and numerical analysis shows that 1-SAPQ yields lower quantizer distortion than a VQ of similar encoding complexity and achieves distortions close to those of the DPCM (differential pulse code modulation) scheme with the Lloyd-Max quantizer.
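The complexity argument behind product quantizers, where nearest-neighbor search becomes additive across coordinates instead of multiplicative, shows up even in a toy scalar-per-coordinate sketch. The version below shares one scalar codebook across all coordinates, loosely mirroring 1-SAPQ's diagonal symmetry, and omits SAPQ's sample-adaptive selection among several product codebooks:

```python
def pq_encode(x, codebook):
    """Quantize each coordinate independently with a shared scalar
    codebook: search cost is d*K comparisons for dimension d and
    codebook size K, versus K^d for a full-search VQ at the same rate."""
    return tuple(min(range(len(codebook)),
                     key=lambda i: (xi - codebook[i]) ** 2)
                 for xi in x)

def pq_decode(idx, codebook):
    """Map each index back to its reproduction value."""
    return tuple(codebook[i] for i in idx)
```

For example, with the 3-level codebook [-1.0, 0.0, 1.0] in two dimensions, encoding costs 6 distance evaluations rather than the 9 a full 2-D VQ of the same rate would need; the gap widens quickly as the dimension and rate grow, which is the motivation for SAPQ at high bit rates.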