• Title/Summary/Keyword: C-embedding


WEAKLY LAGRANGIAN EMBEDDING AND PRODUCT MANIFOLDS

  • Byun, Yang-Hyun;Yi, Seung-Hun
    • Bulletin of the Korean Mathematical Society
    • /
    • v.35 no.4
    • /
    • pp.809-817
    • /
    • 1998
  • We investigate when the product of two smooth manifolds admits a weakly Lagrangian embedding. We prove that, if $M^m$ and $N^n$ are smooth manifolds such that $M$ admits a weakly Lagrangian embedding into $\mathbb{C}^m$ whose normal bundle has a nowhere vanishing section and $N$ admits a weakly Lagrangian immersion into $\mathbb{C}^n$, then $M \times N$ admits a weakly Lagrangian embedding into $\mathbb{C}^{m+n}$. As a corollary, we obtain that $S^m \times S^n$ admits a weakly Lagrangian embedding into $\mathbb{C}^{m+n}$ if $n = 1, 3$. We also investigate the problem of whether $S^m \times S^n$ in general admits a weakly Lagrangian embedding into $\mathbb{C}^{m+n}$.


WEAKLY LAGRANGIAN EMBEDDING $S^m \times S^n$ INTO $\mathbb{C}^{m+n}$

  • Byun, Yang-Hyun;Yi, Seung-Hun
    • Bulletin of the Korean Mathematical Society
    • /
    • v.36 no.4
    • /
    • pp.799-808
    • /
    • 1999
  • We investigate when the product of two smooth manifolds admits a weakly Lagrangian embedding. Assume $M$, $N$ are oriented smooth manifolds of dimension m and n, respectively, which admit weakly Lagrangian immersions into $\mathbb{C}^m$ and $\mathbb{C}^n$. If m and n are odd, then $M \times N$ admits a weakly Lagrangian embedding into $\mathbb{C}^{m+n}$. In the case when m is odd and n is even, we assume further that $\chi(N)$ is an even integer. Then $M \times N$ admits a weakly Lagrangian embedding into $\mathbb{C}^{m+n}$. As a corollary, we obtain the result that $S^{n_1} \times S^{n_2} \times \cdots \times S^{n_k}$, $k > 1$, admits a weakly Lagrangian embedding into $\mathbb{C}^{n_1+n_2+\cdots+n_k}$ if and only if some $n_i$ is odd.


Word Embedding using word position information (단어의 위치정보를 이용한 Word Embedding)

  • Hwang, Hyunsun;Lee, Changki;Jang, HyunKi;Kang, Dongho
    • Annual Conference on Human and Language Technology
    • /
    • 2017.10a
    • /
    • pp.60-63
    • /
    • 2017
  • Word embedding, which is used to apply deep learning to natural language processing, represents words as vectors in a vector space; besides reducing dimensionality, it has the advantage that words with similar meanings receive similar vector values. Because a word embedding must be trained on a large corpus to perform well, the widely used word2vec model simplifies its architecture for large-corpus training and learns mainly from word co-occurrence statistics, with the drawback that it does not use word position information. In this paper, we modify the existing word embedding training model so that it can learn from word position information. Experimental results show that training word embeddings with position information greatly improves syntactic performance on word-analogy tasks, with an especially large effect for Korean, where word order can vary. One way position information can enter a skip-gram-style model is sketched after this entry.

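The authors' exact modification is not described in the abstract. As a hedged illustration of the general idea, the sketch below adds a learned weight vector per relative context position to a skip-gram-style update with a single positive/negative label; the model shape, names, and hyperparameters are assumptions for illustration, not the paper's method.

    # A hedged sketch, not the paper's model: each relative offset p gets its own
    # weight vector pos_w[p], and the context representation becomes
    # pos_w[p] * W_out[context] (elementwise) instead of W_out[context] alone,
    # which makes the learned embeddings sensitive to word order.
    import numpy as np

    rng = np.random.default_rng(0)
    V, D, WINDOW, LR = 1000, 50, 2, 0.05        # vocab size, dimension, window, learning rate

    W_in  = rng.normal(scale=0.1, size=(V, D))  # target (input) word embeddings
    W_out = rng.normal(scale=0.1, size=(V, D))  # context (output) word embeddings
    pos_w = np.ones((2 * WINDOW + 1, D))        # one weight vector per relative position

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    def train_pair(center, context, offset, label):
        """One positive (label=1) or negative (label=0) update for a (center, context) pair."""
        p = offset + WINDOW                      # map offset in [-WINDOW, WINDOW] to a row index
        v_c, v_o, d_p = W_in[center].copy(), W_out[context].copy(), pos_w[p].copy()
        ctx = d_p * v_o                          # position-weighted context vector
        grad = label - sigmoid(v_c @ ctx)        # gradient of the log-likelihood w.r.t. the score
        W_in[center]   += LR * grad * ctx
        W_out[context] += LR * grad * d_p * v_c
        pos_w[p]       += LR * grad * v_o * v_c

    # toy usage: word 3 observed one position to the left of word 7
    train_pair(center=7, context=3, offset=-1, label=1)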

Embedding Algorithm between Folded Hypercube and HFH Network (폴디드 하이퍼큐브와 HFH 네트워크 사이의 임베딩 알고리즘)

  • Kim, Jongseok;Lee, Hyeongok;Kim, Sung Won
    • KIPS Transactions on Computer and Communication Systems
    • /
    • v.2 no.4
    • /
    • pp.151-154
    • /
    • 2013
  • In this paper, we analyze embeddings between the Folded Hypercube and the HFH network. We show that the Folded Hypercube $FQ_{2n}$ can be embedded into HFH($C_n,C_n$) with dilation 4 and expansion $\frac{(C_n)^2}{2^{2n}}$, and that HFH($C_d,C_d$) can be embedded into $FQ_{4d-2}$ with dilation $O(d)$. (The folded hypercube construction is sketched below.)
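
For reference, the folded hypercube $FQ_n$ used above is the hypercube $Q_n$ with an extra edge joining each vertex to its bitwise complement. The sketch below builds that edge set; it illustrates the standard construction only, not the paper's embedding algorithm.

    # Standard construction of the folded hypercube FQ_n (not the paper's algorithm):
    # vertices are 0 .. 2^n - 1; edges flip a single bit (hypercube edges) or all
    # bits at once (the extra "folding" edges).
    def folded_hypercube_edges(n):
        """Return the edge set of FQ_n as pairs of integer vertex labels."""
        edges = set()
        mask = (1 << n) - 1
        for v in range(1 << n):
            for i in range(n):                    # hypercube edges: flip bit i
                u = v ^ (1 << i)
                edges.add((min(u, v), max(u, v)))
            u = v ^ mask                          # folding edge: flip every bit
            edges.add((min(u, v), max(u, v)))
        return edges

    # FQ_3 has 8 vertices of degree 3 + 1 = 4, hence 8 * 4 / 2 = 16 edges
    print(len(folded_hypercube_edges(3)))         # 16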

Embedding algorithms among hypercube and star graph variants (하이퍼큐브와 스타 그래프 종류 사이의 임베딩 알고리즘)

  • Kim, Jongseok;Lee, Hyeongok
    • The Journal of Korean Association of Computer Education
    • /
    • v.17 no.2
    • /
    • pp.115-124
    • /
    • 2014
  • The hypercube and the star graph are widely known interconnection networks. An embedding of an interconnection network is a mapping of a network G into another network H. If G can be embedded into H at low cost, algorithms developed for G can be used efficiently on H. In this paper, we provide embedding algorithms between HCN and HON: HCN(n,n) can be embedded into HON($C_{n+1},C_{n+1}$) with dilation 3, and HON($C_d,C_d$) can be embedded into HCN(2d-1,2d-1) with dilation $O(d)$. Also, the star graph can be embedded into the half pancake with dilation 11, expansion 1, and average dilation 8. These results mean that various algorithms designed for HCN and the star graph can be executed efficiently on HON and the half pancake, respectively; the dilation and expansion of such a node mapping can be computed as in the sketch after this entry.

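As background for the cost measures quoted in these abstracts, the sketch below computes the dilation and expansion of an arbitrary one-to-one node mapping between two graphs given as adjacency lists. It is a generic illustration of the definitions, not any of the papers' embedding algorithms.

    # Generic definitions used above: for a one-to-one mapping phi from the nodes of
    # G to the nodes of H, the dilation is the maximum distance in H between the
    # images of adjacent G-nodes, and the expansion is |V(H)| / |V(G)|.
    from collections import deque

    def bfs_distance(adj, src, dst):
        """Shortest-path length between src and dst in an unweighted graph."""
        dist = {src: 0}
        queue = deque([src])
        while queue:
            v = queue.popleft()
            if v == dst:
                return dist[v]
            for w in adj[v]:
                if w not in dist:
                    dist[w] = dist[v] + 1
                    queue.append(w)
        raise ValueError("dst is not reachable from src")

    def dilation_and_expansion(G, H, phi):
        dilation = max(bfs_distance(H, phi[u], phi[v]) for u in G for v in G[u])
        expansion = len(H) / len(G)
        return dilation, expansion

    # toy usage: map a 4-cycle onto a 4-node path in label order
    G = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [2, 0]}
    H = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
    print(dilation_and_expansion(G, H, {0: 0, 1: 1, 2: 2, 3: 3}))  # (3, 1.0)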

Function Embedding and Projective Measurement of Quantum Gate by Probability Amplitude Switch (확률진폭 스위치에 의한 양자게이트의 함수 임베딩과 투사측정)

  • Park, Dong-Young
    • The Journal of the Korea institute of electronic communication sciences
    • /
    • v.12 no.6
    • /
    • pp.1027-1034
    • /
    • 2017
  • In this paper, we propose a new function embedding method that can measure mathematical projections of the probability amplitude, the probability, the average expectation, and the matrix elements of the stationary-state unit matrix at every control operation point of a quantum gate. The method embeds the orthonormalization condition of the probability amplitudes at each control operating point into a binary scalar operator by using the Dirac notation and the Kronecker delta symbol. Such a function embedding is an effective means of controlling the arithmetic power function of a unitary gate in a unitary transformation that expresses the gate as a tensor product of single quanta. We present the results of the evolution operation and the projective measurement obtained when the proposed method is applied to the ternary 2-qutrit cNOT gate, and compare them with existing methods; a generic sketch of this gate and a projective measurement of its output follows this entry.
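
The binary scalar operator and the function embedding itself are specific to the paper and are not reproduced here. As general background for the gate they are applied to, the sketch below builds the ternary 2-qutrit cNOT unitary (controlled increment modulo 3, a standard definition) and takes a computational-basis projective measurement of its output.

    # Background sketch only, not the paper's function embedding method: the ternary
    # 2-qutrit cNOT maps |c>|t> to |c>|(t + c) mod 3>, and a projective measurement in
    # the computational basis yields probabilities equal to the squared amplitude moduli.
    import numpy as np

    def qutrit_cnot():
        """9x9 unitary of the 2-qutrit cNOT (controlled increment mod 3)."""
        U = np.zeros((9, 9))
        for c in range(3):
            for t in range(3):
                U[3 * c + (t + c) % 3, 3 * c + t] = 1.0
        return U

    def measurement_probabilities(state):
        """Projective measurement in the basis |00>, |01>, ..., |22>."""
        return np.abs(state) ** 2

    U = qutrit_cnot()
    assert np.allclose(U @ U.conj().T, np.eye(9))      # unitarity check
    psi_in = np.zeros(9)
    psi_in[3 * 1 + 0] = 1.0                            # input |1>|0>
    psi_out = U @ psi_in                               # expect |1>|1>
    print(measurement_probabilities(psi_out))          # probability 1 at index 3*1+1 = 4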

Embedding Algorithms of Hierarchical Folded HyperStar Network (계층적 폴디드 하이퍼스타 네트워크의 임베딩 알고리즘)

  • Kim, Jong-Seok;Lee, Hyeong-Ok;Kim, Sung-Won
    • The KIPS Transactions:PartA
    • /
    • v.16A no.4
    • /
    • pp.299-306
    • /
    • 2009
  • The Hierarchical Folded HyperStar network has a lower network cost than HCN(n,n) and HFN(n,n), which are hierarchical networks with the same number of nodes. In this paper, we analyze embeddings between the Hierarchical Folded HyperStar HFH($C_n,C_n$) and the hypercube, HCN(n,n), and HFN(n,n). The results are that HCN(n,n), HFN(n,n), and the hypercube $Q_{2n}$ can be embedded into HFH($C_n,C_n$) with expansion $\frac{(C_n)^2}{2^{2n}}$ and dilation 2, 3, and 4, respectively. Also, HFH($C_n,C_n$) can be embedded into HFN(2n,2n) with dilation 1. These results mean that the many algorithms developed for the hypercube, HCN(n,n), and HFN(n,n) can be used efficiently on HFH($C_n,C_n$).

Reversible Watermarking Based On Advanced Histogram Shifting (개선된 히스토그램 쉬프팅 기법을 이용한 리버서블 워터마킹)

  • Hwang, Jin-Ha;Kim, Jong-Weon;Choi, Jong-Uk
    • The KIPS Transactions:PartC
    • /
    • v.14C no.1 s.111
    • /
    • pp.39-44
    • /
    • 2007
  • In this paper, we propose a reversible watermarking method that recovers the original image after the watermark has been extracted. Most watermarking algorithms degrade the quality of the original digital content in the process of embedding the watermark. In the proposed algorithm, the original image can be obtained by removing that degradation from the watermarked image after the watermark information is extracted. The proposed method utilizes the histogram shifting concept and a location map structure. The location map solves the flip-flop problem, and recursive embedding enlarges the embedding capacity. Experimental results demonstrate that as much as 120k bits of information can be embedded while invisibility as high as 41dB is maintained. A minimal sketch of the basic histogram-shifting step appears below.
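
The location map and the recursive embedding are specific to the proposed method and are not reproduced here. The sketch below shows only the generic histogram-shifting embedding step that the method builds on; bin choices and names are illustrative assumptions, and a truly empty minimum bin is assumed (handling the general case is what the paper's location map addresses).

    # Generic histogram-shifting embedding (the technique the paper builds on, not
    # the proposed location-map/recursive variant): pixels between the peak bin and
    # an empty bin are shifted by one, freeing the bin next to the peak so that each
    # peak pixel can carry one payload bit.
    import numpy as np

    def embed_histogram_shift(image, bits):
        """Embed a list of 0/1 bits into a grayscale image; returns (marked, peak, zero)."""
        img = image.astype(np.int32).copy()
        hist = np.bincount(img.ravel(), minlength=256)
        peak = int(hist.argmax())              # most populated bin: determines capacity
        zero = int(hist.argmin())              # bin assumed empty, absorbs the shift
        if hist[peak] < len(bits):
            raise ValueError("payload exceeds capacity")
        lo, hi = sorted((peak, zero))
        step = 1 if zero > peak else -1
        img[(img > lo) & (img < hi)] += step   # shift bins strictly between peak and zero
        flat = img.ravel()
        it = iter(bits)
        for i, v in enumerate(flat):           # bit 0: peak pixel stays; bit 1: moves one step
            if v == peak:
                try:
                    flat[i] += step * next(it)
                except StopIteration:
                    break
        return flat.reshape(img.shape).astype(np.uint8), peak, zero

    # toy usage on a random "image"; every pixel changes by at most one gray level
    rng = np.random.default_rng(0)
    cover = rng.integers(0, 200, size=(64, 64))
    marked, peak, zero = embed_histogram_shift(cover, [1, 0, 1, 1])
    print(peak, zero, int(np.abs(marked.astype(int) - cover).max()))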