• Title/Summary/Keyword: Newton

Search results: 1,385 (processing time: 0.027 seconds)

Understanding the Proof of Inverse Square Law of Newton's Principia from a Heuristic Point of View (Newton의 Principia에서 역제곱 법칙 증명에 대한 발견적 관점에서의 이해)

  • Kang, Jeong Gi
    • Communications of Mathematical Education
    • /
    • v.36 no.1
    • /
    • pp.23-38
    • /
    • 2022
  • The study offers a perspective from which readers can view Newton's proof heuristically, in order to overcome the difficulty of the step showing that 'QT²/QR converges to the latus rectum of the ellipse' in the proof of the inverse square law in Newton's Principia. The heuristic perspective is as follows: the starting point of the proof is the belief that if we transform the numerator and denominator of QT²/QR into expressions in terms of segments related to the diameter and conjugate diameter, we may obtain the desired constant via the relationship PV × VG/QV² = PC²/CD² from Apollonius' Conic Sections. The heuristic perspective proposed in this study is meaningful because it can help readers understand Newton's proof more easily by indicating the direction in which QT²/QR should be transformed.
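The limiting step that the Apollonius relation enables can be written out; the following is a minimal sketch using standard conic-section facts, not a reproduction of the paper's full argument:

```latex
% Rearranging the Apollonius relation quoted in the abstract:
\[
\frac{PV \cdot VG}{QV^{2}} = \frac{PC^{2}}{CD^{2}}
\quad\Longrightarrow\quad
\frac{QV^{2}}{PV} = \frac{CD^{2}}{PC^{2}}\, VG .
\]
% As Q approaches P, the point V approaches P and VG approaches the
% full diameter PG = 2PC, so
\[
\frac{QV^{2}}{PV} \;\longrightarrow\; \frac{2\,CD^{2}}{PC},
\]
% the latus rectum belonging to the diameter through P -- a constant
% of the ellipse, which is why transforming QT^2/QR toward these
% segments is a promising direction.
```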

A Study on Channel Equalization for DS-CDMA System in Fast Fading Environment (Fast Fading 환경에서 DS-CDMA 시스템에 대한 채널 등화에 관한 연구)

  • 김원균;박노진;강철호
    • The Journal of Korean Institute of Communications and Information Sciences
    • /
    • v.26 no.7B
    • /
    • pp.937-943
    • /
    • 2001
  • For a DS-CDMA multi-user environment with fast-fading channel characteristics, we propose an equalization method that simultaneously achieves fast convergence and a small mean square error (MSE) by combining the Normalized CMA (Constant Modulus Algorithm) and the Newton-based CMA. The Normalized CMA attains a smaller mean square error than the Newton-based CMA but has the drawback of slow convergence. Conversely, the Newton-based CMA converges faster than the Normalized CMA but suffers from a larger mean square error. We therefore propose a structure that obtains both fast convergence and a small mean square error; unlike approaches that use either algorithm alone, this structure employs the two algorithms simultaneously. Simulation results show that the proposed scheme converges about 320 iterations faster than the Normalized CMA and about 170 iterations faster than the Newton-based CMA, while its mean square error at convergence is about 0.6 dB lower than that of the Newton-based CMA and about 0.4 dB lower than that of the Normalized CMA.

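The power-normalized CMA tap update, one ingredient of the hybrid scheme above, can be sketched in a few lines. This is a toy real-valued (BPSK) version with a hypothetical FIR channel standing in for the paper's DS-CDMA fading setup; filter length, step size, and channel taps are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

# BPSK symbols through a simple FIR channel plus noise (toy stand-in
# for the DS-CDMA fast-fading channel in the paper)
N = 5000
s = rng.choice([-1.0, 1.0], size=N)
channel = np.array([1.0, 0.4, -0.2])
x = np.convolve(s, channel)[:N] + 0.01 * rng.standard_normal(N)

L = 7                               # equalizer length (illustrative)
w = np.zeros(L); w[L // 2] = 1.0    # center-spike initialization
mu, eps = 0.05, 1e-6
R2 = 1.0                            # constant modulus of BPSK

for n in range(L, N):
    u = x[n - L + 1:n + 1][::-1]    # regressor (most recent sample first)
    y = w @ u
    e = y * (y * y - R2)            # CMA error term
    w -= mu * e * u / (u @ u + eps) # power-normalized step (NCMA)

# modulus error of the converged equalizer over the whole record
y_out = np.array([w @ x[n - L + 1:n + 1][::-1] for n in range(L, N)])
print("mean squared modulus error:", np.mean((y_out**2 - R2)**2))
```

The Newton-based CMA replaces the normalized scalar step with a Hessian-type correction, which is what buys its faster convergence at the cost of a larger floor; the paper's point is to run both in tandem.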

Time Variant Parameter Estimation using RLS Algorithm with Adaptive Forgetting Factor Based on Newton-Raphson Method (Newton-Raphson법 기반의 적응 망각율을 갖는 RLS 알고리즘에 의한 원격센서시스템의 시변파라메타 추정)

  • Kim, Kyung-Yup;Lee, Joon-Tark
    • Proceedings of the Korean Institute of Intelligent Systems Conference
    • /
    • 2007.04a
    • /
    • pp.435-439
    • /
    • 2007
  • This paper deals with an RLS algorithm using a Newton-Raphson-based adaptive forgetting factor for a passive telemetry RF sensor system, in order to estimate the time-variant parameter included in the RF sensor model. For this RLS estimation, a phasor-type RF sensor system modeled on the inductive coupling principle is used. Instead of a constant forgetting factor, which must be determined intuitively, an adaptive forgetting factor based on the Newton-Raphson method is applied to the RLS algorithm. Finally, we provide numerical examples to evaluate the feasibility and generality of the proposed method.

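The baseline this paper improves on, exponentially weighted RLS with a hand-picked constant forgetting factor, looks like the following for a scalar drifting parameter. The signal model and the value of lam are illustrative; the paper's contribution, adapting lam online by Newton-Raphson, is not reproduced here:

```python
import numpy as np

rng = np.random.default_rng(1)

# Scalar time-varying parameter theta_n tracked by exponentially
# weighted RLS with a constant forgetting factor lam
N = 2000
theta = 1.0 + 0.5 * np.sin(2 * np.pi * np.arange(N) / N)  # slow drift
x = rng.standard_normal(N)
y = theta * x + 0.05 * rng.standard_normal(N)

lam = 0.95        # hand-tuned; the paper adapts this online instead
P = 1e3           # inverse correlation (scalar case)
th_hat = 0.0
est = np.empty(N)
for n in range(N):
    k = P * x[n] / (lam + x[n] * P * x[n])   # gain
    e = y[n] - th_hat * x[n]                 # a priori error
    th_hat += k * e                          # parameter update
    P = (P - k * x[n] * P) / lam             # covariance update
    est[n] = th_hat

print("tracking RMSE:", np.sqrt(np.mean((est[N//2:] - theta[N//2:])**2)))
```

The tension this exposes is exactly the one the paper targets: a small lam tracks fast variation but amplifies noise, while a large lam does the reverse, so choosing it by hand is fragile.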

AFFINE INVARIANT LOCAL CONVERGENCE THEOREMS FOR INEXACT NEWTON-LIKE METHODS

  • Argyros, Ioannis K.
    • Journal of applied mathematics & informatics
    • /
    • v.6 no.2
    • /
    • pp.393-406
    • /
    • 1999
  • Affine invariant sufficient conditions are given for two local convergence theorems involving inexact Newton-like methods. The first uses conditions on the first Frechet-derivative, whereas the second theorem employs hypotheses on the second. Radius-of-convergence as well as rate-of-convergence results are derived. Results involving superlinear convergence, known to be true for inexact Newton methods, are extended here. Moreover, we show that under hypotheses on the second Frechet-derivative our radius of convergence is larger than the corresponding one in [10]. This allows a wider choice for the initial guess. A numerical example is also provided to show that our radius of convergence is larger than the one in [10].
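An "inexact" Newton method in this sense truncates the inner linear solve once its residual satisfies a forcing condition ||F(x) + J(x)s|| <= eta * ||F(x)||. Below is a minimal sketch on a hypothetical 2x2 system (not the paper's affine-invariant setting), with the inner solve done by conjugate gradients on the normal equations:

```python
import numpy as np

def F(x):   # toy nonlinear system with a root at (1/sqrt(2), 1/sqrt(2))
    return np.array([x[0]**2 + x[1]**2 - 1.0, x[0] - x[1]])

def J(x):   # Frechet derivative (Jacobian) of F
    return np.array([[2*x[0], 2*x[1]], [1.0, -1.0]])

def inexact_newton(x, eta=0.1, tol=1e-10, max_iter=50):
    """Newton iteration whose linear solve is stopped as soon as the
    residual drops below eta * ||F(x)|| (the forcing condition)."""
    for _ in range(max_iter):
        Fx = F(x)
        if np.linalg.norm(Fx) < tol:
            break
        A, b = J(x), -Fx
        # CG on the normal equations, truncated by the forcing condition
        s = np.zeros_like(x)
        r = A.T @ (b - A @ s); p = r.copy()
        while np.linalg.norm(b - A @ s) > eta * np.linalg.norm(Fx):
            Ap = A.T @ (A @ p)
            alpha = (r @ r) / (p @ Ap)
            s += alpha * p
            r_new = r - alpha * Ap
            p = r_new + (r_new @ r_new) / (r @ r) * p
            r = r_new
        x = x + s
    return x

root = inexact_newton(np.array([2.0, 1.0]))
print(root)   # approx (0.70710678, 0.70710678)
```

The radius-of-convergence results in the paper quantify how far the initial guess (here (2, 1)) may sit from the root while such an iteration still converges.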

SOLVING MATRIX POLYNOMIALS BY NEWTON'S METHOD WITH EXACT LINE SEARCHES

  • Seo, Jong-Hyeon;Kim, Hyun-Min
    • Journal of the Korean Society for Industrial and Applied Mathematics
    • /
    • v.12 no.2
    • /
    • pp.55-68
    • /
    • 2008
  • One of the well-known and much-studied nonlinear matrix equations is the matrix polynomial, which has the form $P(X)=A_0X^m+A_1X^{m-1}+{\cdots}+A_m$, where $A_0$, $A_1$, ${\cdots}$, $A_m$ and X are $n{\times}n$ complex matrices. Newton's method was introduced as a useful tool for solving the equation P(X)=0. Here, we suggest an improved approach to solving each Newton step and consider how to incorporate line searches into Newton's method for solving the matrix polynomial. Finally, we give some numerical experiments showing that line searches reduce the number of iterations required for convergence.

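For the quadratic case m = 2, a Newton step with an exact line search can be sketched directly: the Frechet derivative of P at X acts on vec(E) through Kronecker products, and ||P(X + tE)||_F is a quartic in t, minimized here crudely over a grid. The coefficients and the grid search are illustrative, not the authors' algorithm:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 3
I = np.eye(n)

# Quadratic matrix polynomial P(X) = A0 X^2 + A1 X + A2 (the m = 2 case
# of the equation in the abstract); A2 is built so that Xs is a solvent.
A0 = I
A1 = rng.standard_normal((n, n))
Xs = I + 0.1 * rng.standard_normal((n, n))       # known solvent
A2 = -(A0 @ Xs @ Xs + A1 @ Xs)

def P(X):
    return A0 @ X @ X + A1 @ X + A2

X = I.copy()                                     # initial guess near Xs
for _ in range(30):
    R = P(X)
    if np.linalg.norm(R) < 1e-12:
        break
    # Frechet derivative of P at X as an n^2 x n^2 matrix on vec(E):
    # E -> A0 (X E + E X) + A1 E, via vec(A E B) = (B^T kron A) vec(E)
    Lmat = np.kron(I, A0 @ X) + np.kron(X.T, A0) + np.kron(I, A1)
    E = np.linalg.solve(Lmat, -R.reshape(-1, order="F")).reshape(n, n, order="F")
    # exact line search on ||P(X + t E)||_F, a quartic in t for m = 2
    M1 = A0 @ (X @ E + E @ X) + A1 @ E           # equals -R for a Newton step
    M2 = A0 @ E @ E
    ts = np.linspace(0.0, 2.0, 2001)
    vals = [np.linalg.norm(R + t * M1 + t * t * M2) for t in ts]
    X = X + ts[int(np.argmin(vals))] * E

print("final residual:", np.linalg.norm(P(X)))
```

For general m the derivative expands into a longer Kronecker sum and the merit function has degree 2m in t, but the structure of the step is the same.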

ON THE ORDER AND RATE OF CONVERGENCE FOR PSEUDO-SECANT-NEWTON'S METHOD LOCATING A SIMPLE REAL ZERO

  • Kim, Young Ik
    • Journal of the Chungcheong Mathematical Society
    • /
    • v.19 no.2
    • /
    • pp.133-139
    • /
    • 2006
  • By combining the classical Newton's method with the pseudo-secant method, pseudo-secant-Newton's method is constructed and its order and rate of convergence are investigated. Given a function $f:\mathbb{R}{\rightarrow}\mathbb{R}$ that has a simple real zero ${\alpha}$ and is sufficiently smooth in a small neighborhood of ${\alpha}$, the convergence behavior of pseudo-secant-Newton's method is analyzed near ${\alpha}$. The order of convergence is shown to be cubic and the rate of convergence is proven to be $\left(\frac{f^{{\prime}{\prime}}(\alpha)}{2f^{\prime}(\alpha)}\right)^2$. Numerical experiments, carried out with high-precision programming in Mathematica, confirm the validity of the theory presented here.

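The pseudo-secant-Newton iteration itself is not spelled out in the abstract, but the standard diagnostic used in such numerical experiments, estimating the convergence order from three consecutive errors, can be sketched with classical Newton's method (order 2) as a stand-in:

```python
from math import log, sqrt

# Classical Newton on f(x) = x^2 - 2 stands in for a generic iteration;
# the diagnostic estimates the order p from three consecutive errors:
#   p ~ log(e_{k+1}/e_k) / log(e_k/e_{k-1})
f  = lambda x: x * x - 2.0
df = lambda x: 2.0 * x
alpha = sqrt(2.0)

xs = [3.0]
for _ in range(5):
    x = xs[-1]
    xs.append(x - f(x) / df(x))          # Newton update

errs = [abs(x - alpha) for x in xs]
orders = [log(errs[k + 1] / errs[k]) / log(errs[k] / errs[k - 1])
          for k in range(1, len(errs) - 2)]
print(orders)   # the estimates approach 2, Newton's quadratic order
```

Applied to the iterates of pseudo-secant-Newton's method, the same estimator should approach 3, matching the cubic order proved in the paper; the high-precision Mathematica runs the abstract mentions serve exactly this purpose, since double precision exhausts its digits after a few cubic steps.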