• Title/Summary/Keyword: common source


Open Digital Textbook for Smart Education (스마트교육을 위한 오픈 디지털교과서)

  • Koo, Young-Il;Park, Choong-Shik
    • Journal of Intelligence and Information Systems / v.19 no.2 / pp.177-189 / 2013
  • In smart education, digital textbooks play a very important role as the face-to-face medium for learners. Standardizing digital textbooks will promote their industrialization for content providers and distributors as well as for learners and instructors. In this study, we look for ways to standardize digital textbooks oriented toward the following three objectives. (1) Digital textbooks should serve as media for blended learning that supports both on- and offline classes, should run on a common EPUB viewer without a special dedicated viewer, and should utilize the existing e-learning framework for learning contents and learning management. The reason to adopt EPUB as the standard for digital textbooks is that digital textbooks then do not need another standard for the form of books and can take advantage of the industrial base of EPUB-standard content and its distribution structure. (2) Digital textbooks should provide a low-cost open-market service built on currently available standard open software. (3) To provide appropriate learning feedback to students, digital textbooks should provide a foundation that accumulates and manages all learning activity information according to a standard infrastructure for educational big data processing. In this study, the digital textbook in a smart education environment is referred to as an open digital textbook.
The components of the open digital textbook service framework are (1) digital textbook terminals such as smart pads, smart TVs, smart phones, and PCs; (2) a digital textbook platform that displays and runs digital contents on those terminals; (3) a learning contents repository in the cloud that maintains accredited learning contents; (4) an app store through which learning contents companies provide and distribute secondary learning contents and learning tools; and (5) an LMS as the learning support/management tool that classroom teachers use to create instruction materials. In addition, locating all of the hardware and software that implement a smart education service within the cloud takes advantage of cloud computing for efficient management and reduced expense. The open digital textbook of smart education can be considered an e-book-style interface to the LMS for learners. In an open digital textbook, the representation of text, images, audio, video, equations, etc. is a basic function, but painting, writing, problem solving, and the like are beyond the capabilities of a simple e-book. Teacher-to-student, learner-to-learner, and team-to-team communication is also required through the open digital textbook. To represent student demographics, portfolio information, and class information, the standards already used in e-learning are desirable. To pass learner tracking information about the learner's activities to the LMS (Learning Management System), the open digital textbook must have a recording function and a function for communicating with the LMS. DRM is a function for protecting various copyrights. Currently, the DRM of an e-book is controlled by the corresponding book viewer. If the open digital textbook accommodates the variety of DRM standards used by different e-book viewers, the implementation of redundant features can be avoided.
Security/privacy functions are required to protect information about study or instruction from third parties. UDL (Universal Design for Learning) is a learning support function for those whose disabilities make learning courses difficult. The open digital textbook, based on the e-book standard EPUB 3.0, must (1) record learning activity log information and (2) communicate with the server to support learning activities. While these recording and communication functions, which are not determined by current standards, can be implemented in JavaScript and used in current EPUB 3.0 viewers, a strategy of proposing them for the next generation of the e-book standard, or for a special standard (EPUB 3.0 for education), is needed. Future research will implement an open-source program following the proposed open digital textbook standard and present new educational services including big data analysis.
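As a rough illustration of the recording function discussed above, the sketch below builds a learning-activity record in an xAPI-like shape and queues it for later delivery to an LMS. All field and function names here are our assumptions for illustration, not part of EPUB 3.0 or the proposed standard; a real viewer script would follow whatever schema the education profile finally specifies.

```python
import json
from datetime import datetime, timezone

def make_activity_statement(learner_id, verb, object_id, result=None):
    """Build one learning-activity record in an xAPI-like shape.

    The field names are illustrative, not taken from the proposed
    standard for open digital textbooks.
    """
    statement = {
        "actor": {"id": learner_id},
        "verb": verb,                 # e.g. "answered", "highlighted"
        "object": {"id": object_id},  # e.g. a page or problem URI
        "timestamp": datetime.now(timezone.utc).isoformat(),
    }
    if result is not None:
        statement["result"] = result
    return statement

def queue_for_lms(statement, buffer):
    """Append a serialized statement to a local buffer; a viewer script
    would later flush this buffer to the LMS endpoint."""
    buffer.append(json.dumps(statement))
    return buffer
```

Buffering locally before flushing matters in this setting because a textbook may be read offline between classes, while the LMS still needs a complete activity log.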

Comparison of Deep Learning Frameworks: About Theano, Tensorflow, and Cognitive Toolkit (딥러닝 프레임워크의 비교: 티아노, 텐서플로, CNTK를 중심으로)

  • Chung, Yeojin;Ahn, SungMahn;Yang, Jiheon;Lee, Jaejoon
    • Journal of Intelligence and Information Systems / v.23 no.2 / pp.1-17 / 2017
  • A deep learning framework is software designed to help develop deep learning models. Among its important functions are automatic differentiation and utilization of GPUs. The list of popular deep learning frameworks includes Caffe (BVLC) and Theano (University of Montreal). Recently, Microsoft's deep learning framework, Microsoft Cognitive Toolkit, was released under an open-source license, following Google's Tensorflow a year earlier. The early deep learning frameworks were developed mainly for research at universities. Beginning with the release of Tensorflow, however, companies such as Microsoft and Facebook seem to have joined the competition in framework development. Given this trend, Google and other companies are expected to keep investing in deep learning frameworks to take the initiative in the artificial intelligence business. From this point of view, we think it is a good time to compare deep learning frameworks, so we compare three that can be used as Python libraries: Google's Tensorflow, Microsoft's CNTK, and Theano, which is in a sense a predecessor of the other two. The most common and important function of deep learning frameworks is the ability to perform automatic differentiation. Basically, all the mathematical expressions of deep learning models can be represented as computational graphs, which consist of nodes and edges. Partial derivatives on each edge of a computational graph can then be obtained, and with these partial derivatives the software can compute the derivative of any node with respect to any variable by applying the chain rule of calculus. First of all, convenience of coding is, in decreasing order, CNTK, Tensorflow, and Theano. This criterion is based simply on the lengths of the codes; the learning curve and the ease of coding are not the main concern.
According to this criterion, Theano was the most difficult to implement with, and CNTK and Tensorflow were somewhat easier. With Tensorflow, we need to define weight variables and biases explicitly. The reason that CNTK and Tensorflow are easier to implement with is that those frameworks provide more abstraction than Theano. We should mention, however, that low-level coding is not always bad: it gives us flexibility. With low-level coding such as in Theano, we can implement and test any new deep learning models or search methods we can think of. Our assessment of execution speed is that there is no meaningful difference among the frameworks. According to the experiment, the execution speeds of Theano and Tensorflow are very similar, although the experiment was limited to a CNN model. In the case of CNTK, the experimental environment could not be kept the same: the CNTK code had to be run in a PC environment without a GPU, where code executes as much as 50 times slower than with a GPU. But we concluded that the difference in execution speed was within the range of variation caused by the different hardware setup. In this study, we compared three deep learning frameworks: Theano, Tensorflow, and CNTK. According to Wikipedia, there are 12 available deep learning frameworks, and 15 different attributes differentiate them. Important attributes include the interface language (Python, C++, Java, etc.) and the availability of libraries for various deep learning models such as CNNs, RNNs, and DBNs. For users implementing large-scale deep learning models, support for multiple GPUs or multiple servers is also important, and for those learning deep learning models, the availability of sufficient examples and references matters as well.
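The chain-rule mechanics described in the abstract can be sketched in a few lines: each graph node stores the local partial derivative toward its parents, and a backward pass multiplies and accumulates them. This is an illustrative toy, not the internals of any of the three frameworks compared.

```python
class Node:
    """One node of a computational graph. After backward(), `grad` holds
    the partial derivative of the output with respect to this node."""
    def __init__(self, value, parents=()):
        self.value = value
        self.parents = parents  # list of (parent_node, local_gradient)
        self.grad = 0.0

    def backward(self, seed=1.0):
        # Chain rule: push seed * local gradient down to each parent.
        self.grad += seed
        for parent, local_grad in self.parents:
            parent.backward(seed * local_grad)

def add(a, b):
    # d(a+b)/da = 1, d(a+b)/db = 1
    return Node(a.value + b.value, [(a, 1.0), (b, 1.0)])

def mul(a, b):
    # d(a*b)/da = b, d(a*b)/db = a
    return Node(a.value * b.value, [(a, b.value), (b, a.value)])

# y = x*w + b, the core of a one-unit linear model
x, w, b = Node(3.0), Node(2.0), Node(1.0)
y = add(mul(x, w), b)
y.backward()  # dy/dx = w = 2.0, dy/dw = x = 3.0, dy/db = 1.0
```

Real frameworks differ mainly in when this graph is built (Theano and Tensorflow 1.x compile it ahead of time) and in how the gradient computation is fused and dispatched to the GPU, but the chain-rule bookkeeping is the same.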

The Clinical and Histopathologic Findings of Lymphonodular Hyperplasia of the Colon in Infancy and Childhood (소아에서 대장 림프결절증식의 임상적 및 병리조직학적 소견)

  • Nam, Yoo-Nee;Lee, Seung-Hyeon;Chung, Dong-Hae;Sim, So-Yeon;Eun, Byung-Wook;Choi, Deok-Young;Sun, Yong-Han;Cho, Kang-Ho;Ryoo, Eell;Son, Dong-Woo;Jeon, In-Sang;Tchah, Hann
    • Pediatric Gastroenterology, Hepatology & Nutrition / v.12 no.1 / pp.1-9 / 2009
  • Purpose: Lymphonodular hyperplasia of the colon (LNHC) is a rare finding in children, and its significance as a pathologic finding is unclear. The aim of this study was to investigate the clinical significance of LNHC by analyzing clinical and histopathologic findings in children with LNHC. Methods: We analyzed data from 38 patients who were confirmed to have LNHC by colonoscopy. We checked age, birth history, past history, family history, and clinical symptoms. A hematologic exam, stool exam, and imaging studies were performed, and biopsy specimens were examined by a pathologist. All patients were asked to have short- and long-term follow-up. Results: The mean age of the patients was 12.5±14.4 months. All patients presented with complaints of bloody stool. They appeared healthy, and the hematologic findings were within the normal range with the exception of one case. There was no other identified source of bleeding. On histologic exam, 36 patients (94.7%) had lymphoid follicles and 34 patients (84.5%) fulfilled the criteria of allergic colitis. Regardless of diet modification and the presence of residual symptoms, no patient had a recurrence of bloody stool through long-term follow-up. Conclusion: LNHC is more common in infants who are affected by allergic colitis, but it can appear even after infancy. LNHC should be regarded as the etiology when there is no other identified cause of rectal bleeding, especially in healthy children. We suggest that LNHC has a benign course regardless of diet modification and that it might not require excessive concern.


A Study on the Problem of Organic Image in the 20th Post-paintings (20세기 후기회화에 있어서 유기 이미지의 문제)

  • Park Ji-Sook
    • Journal of Science of Art and Design / v.3 / pp.145-177 / 2001
  • The artist's interest has been captivated by ecological phenomena in nature, and this keen captivation has been focused into plastic art depicting the image of primitive life. The wide sweep of her work encompasses the totality of nature, consisting of the human subconscious power and imagination, which she portrays through organic images. These organic images stand in contrast to scientific, mathematical, and logical inference and consciousness. This research examines the character of organic images in modern art through analysis of representative works by other artists. The image is an essential concept in art that has appeared in very different ways and from different perspectives. Until the early part of the 20th century, the image in artwork appeared as realistic expression. Well into the 20th century, it began to be expressed in various ways, such as combined images formed by imagination, merged into or rejected from the story of the artwork, and transferred images created by changing original conditions. The main purpose of this research is to study the various expressions of organic images in the artwork of the Post-Modernism era. The character and meaning of organic image painting help people approach human instinct more easily in order to find the natural essence. It is also an objective of the organic image to tenderise our human sensibilities, helping us regain vitality and recover our impoverished humanity in the barren wilderness of modern society. 'Life communion with nature' is a meeting point and common ground for Oriental philosophy and organic image painting. Through this research, organic image painting is characterised in the four following ways: 1st) Organic image painting seeks regularity and perfection of outer shapes, in contrast to disordered and deformed nature, resulting in an organic and biotic formalistic mode of plastic art. 2nd) Organic image painting seeks the formative.
3rd) Organic image painting pursues the priceless dignity of life by researching the formatted arrangement and figure, which contains the primitive power of life. 4th) Organic image painting makes crystal clear the power of human and nature as a historic and biological phenomenon; this, in turn, exposes the humanistic view of the world in modern society, best characterised by lost self-understanding, isolation, and materialism. The representative organic image painting artists are Elizabeth Murray, Kusama Yayoi, and Niki de Saint Phalle. Elizabeth Murray used shaped canvases and round constructions of relief works. Kusama Yayoi used automatistic expressionism originating from the realms of the unconscious, represented by the mass and shape of a water drop. Niki de Saint Phalle shows the transcendence of universal life and anti-life to respect the dignity of life and the eco-friendly relationship of human and nature in post-modernism, accomplished through surrealistic, symbolic, fantastic, and humoristic expression. These three artists' works express the spirit of the organic image in contemporary art. It contains the stream of nature and life, seeking not only the state of materialism in reality but also the harmonized world of nature and human, which has almost lost its important meaning in modern times. Finally, this organic image is the plastic language of majestic life. It draws on the romantic idea that intimacy with nature and the universe is the source of truth and spirit, and on Surrealism, which emphasizes the unconscious; it is also influenced by primitive art and abstract art. According to this research, the subject 'Research About Organic Images' is not only an important element in the plastic arts from primitive society to the present, but is also fundamental to a true understanding of Post-Modernism.


Predictive factors for severe infection among febrile infants younger than three months of age (발열을 주소로 내원한 3개월 미만의 영아에서 중증 감염의 예측 인자)

  • Cho, Eun-Young;Song, Hwa;Kim, Ae-Suk;Lee, Sun-Ju;Lee, Dong-Seok;Kim, Doo-Kwun;Choi, Sung-Min;Lee, Kwan;Park, Byoung-Chan
    • Clinical and Experimental Pediatrics / v.52 no.8 / pp.898-903 / 2009
  • Purpose: This study investigated predictive factors for identifying febrile infants younger than three months who are prone to severe infection. Methods: We conducted a retrospective study of 167 infants younger than three months with an axillary temperature >38°C who were hospitalized between 2006 and 2008. Infants were considered at high risk for severe infection if they met any of the following criteria: positive blood culture; CSF WBC ≥11/mm³ or positive CSF culture; urinalysis WBC ≥6/HPF with positive urine culture; WBC ≥6/HPF on microscopic stool examination or positive stool culture. Infants with focal infection, respiratory infection, or antibiotic administration prior to admission were excluded. We compared the symptoms, physical examination findings, laboratory data, and clinical course between the high-risk and low-risk groups. Results: The high-risk group included 77 (46.1%) infants, and the most common diagnosis was urinary tract infection (51.9%). Factors such as male sex, ESR, and CRP differed significantly between the two groups, but in a multilinear regression analysis for severe infection only male sex and ESR remained significant. Conclusion: We did not find distinguishing symptoms and laboratory findings for identifying severe-infection-prone febrile infants younger than three months. However, male sex and elevated ESR predominated in the high-risk group, and these can possibly be used as predictive factors for severe infection.
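The high-risk criteria listed in the Methods can be encoded as a simple predicate. The key names below are our illustrative choices, not the paper's, and treating missing values as normal is our simplification.

```python
def is_high_risk(findings):
    """Return True if any of the study's high-risk criteria is met.

    `findings` maps illustrative keys to lab results; absent keys are
    treated as negative/normal results.
    """
    return any([
        findings.get("blood_culture_positive", False),
        findings.get("csf_wbc_per_mm3", 0) >= 11,
        findings.get("csf_culture_positive", False),
        # urinalysis criterion requires BOTH the WBC count and a
        # positive urine culture, per the stated methods
        (findings.get("urine_wbc_per_hpf", 0) >= 6
         and findings.get("urine_culture_positive", False)),
        # stool criterion is satisfied by EITHER finding
        findings.get("stool_wbc_per_hpf", 0) >= 6,
        findings.get("stool_culture_positive", False),
    ])
```

Note the asymmetry the abstract states: the urine finding needs culture confirmation, while the stool criteria are alternatives.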

Review: Distribution, Lactose Malabsorption, and Alleviation Strategies of Lactose Intolerance (유당불내증(Lactose Intolerance)의 발생 원인과 경감 방안에 대한 고찰)

  • Yoon, Sung-Sik
    • Journal of Dairy Science and Biotechnology / v.27 no.2 / pp.55-62 / 2009
  • Milk is called an almost complete food in terms of nutrition, especially for younger generations, because it contains a number of nutrients required for growth and development. Lactose intolerance is defined as malabsorption of lactose in the intestine, with typical symptoms of abdominal pain and bloating; it occurs in about 75% of the global population and hampers milk consumption worldwide. Lack of milk consumption in underdeveloped countries frequently leads to deficiencies in many nutrients, so diseases including osteoporosis, hypertension, and colon cancer are more prevalent there these days. Lactose in foods needs to be hydrolyzed prior to intestinal absorption. The hydrolytic enzyme responsible for splitting lactose into its monomeric forms, glucose and galactose, is called lactase or β-galactosidase. The former is primarily used as blood sugar and an energy source, and the latter is used in glycolipid synthesis in the brain tissues of infants. Lactose intolerance is clinically diagnosed with the breath hydrogen test as well as intestinal biopsy. Reportedly, symptoms of lactose intolerance are prevalent in 25% of Europeans; 50 to 80% of Hispanics, South Indians, Africans, and Jews; and almost 100% of Asians and Native Americans. In adults, the phenotype of lactase persistence, which allows lactose to be hydrolysed, is more common in northern Europeans, while in other areas lactase non-persistence, or adult-type hypolactasia, is dominant. Genetic analysis of the human lactase gene has shown that lactase persistence is closely related to the C/T-13910 single nucleotide polymorphism upstream of the 5'-end. To alleviate the severity of lactose intolerance symptoms, eating patterns such as drinking a single cup of milk or less at a time, consuming milk along with other foods, choosing whole milk rather than skimmed milk, and drinking milk with live yogurt cultures are highly recommended for lactose maldigesters.
Delaying gastric emptying is also effective for avoiding the symptoms of lactose intolerance. The frequency of lactose intolerance under conventional diagnosis is thought to be overestimated, mainly because subjects are exposed to as much as 50 g of lactose rather than a single serving amount; thus a simple and accurate diagnostic method for lactose intolerance needs to be established. Fermented milk products and low-lactose or lactose-free milks should help improve the currently stagnant milk consumption caused by lactose intolerance, which is a major barrier in milk marketing, especially in Asian countries.


Restoring Omitted Sentence Constituents in Encyclopedia Documents Using Structural SVM (Structural SVM을 이용한 백과사전 문서 내 생략 문장성분 복원)

  • Hwang, Min-Kook;Kim, Youngtae;Ra, Dongyul;Lim, Soojong;Kim, Hyunki
    • Journal of Intelligence and Information Systems / v.21 no.2 / pp.131-150 / 2015
  • Omission of noun phrases for obligatory cases is a common phenomenon in sentences of Korean and Japanese that is not observed in English. When an argument of a predicate can be filled with a noun phrase co-referential with the title, the argument is more easily omitted in encyclopedia texts. The omitted noun phrase is called a zero anaphor or zero pronoun. Encyclopedias like Wikipedia are a major source for information extraction by intelligent applications such as information retrieval and question answering systems. However, the omission of noun phrases makes the quality of information extraction poor. This paper deals with the problem of developing a system that can restore omitted noun phrases in encyclopedia documents. The problem our system deals with is very similar to zero anaphora resolution, one of the important problems in natural language processing. A noun phrase existing in the text that can be used for restoration is called an antecedent; an antecedent must be co-referential with the zero anaphor. While in zero anaphora resolution the candidates for the antecedent are only noun phrases in the same text, in our problem the title is also a candidate. In our system, the first stage is in charge of detecting the zero anaphor. In the second stage, antecedent search is carried out over the candidates. If antecedent search fails, an attempt is made, in the third stage, to use the title as the antecedent. The main characteristic of our system is the use of a structural SVM for finding the antecedent. The noun phrases in the text that appear before the position of the zero anaphor comprise the search space. The main technique used in previous research is to perform binary classification for all the noun phrases in the search space, selecting as antecedent the noun phrase classified with the highest confidence.
In this paper, however, we propose viewing antecedent search as the problem of assigning antecedent-indicator labels to a sequence of noun phrases. In other words, sequence labeling is employed for antecedent search in the text; we are the first to suggest this idea. To perform sequence labeling, we use a structural SVM which receives a sequence of noun phrases as input and returns a sequence of labels as output. An output label takes one of two values: one indicating that the corresponding noun phrase is the antecedent, the other indicating that it is not. The structural SVM we used is based on the modified Pegasos algorithm, which exploits a subgradient descent methodology for optimization problems. To train and test our system, we selected a set of Wikipedia texts and constructed an annotated corpus providing gold-standard answers such as zero anaphors and their possible antecedents. Training examples were prepared using the annotated corpus and used to train the SVMs and test the system. For zero anaphor detection, sentences are parsed by a syntactic analyzer and omitted subject or object cases are identified. The performance of our system is thus dependent on that of the syntactic analyzer, which is a limitation of our system. When an antecedent is not found in the text, our system tries to use the title to restore the zero anaphor, based on binary classification using a regular SVM. The experiment showed that our system's performance is F1 = 68.58%. This means that a state-of-the-art system can be developed with our technique. Future work that enables the system to utilize semantic information is expected to lead to a significant performance improvement.
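The decoding step of the sequence-labeling view described above can be sketched as follows. The trained scoring model (a structural SVM in the paper) is abstracted into per-candidate scores, and the zero threshold triggering the fall-back to the title is our assumption, not the paper's.

```python
def decode_antecedent(candidate_scores):
    """Choose the label sequence with exactly one antecedent label ('1')
    that maximizes the joint score.

    candidate_scores[i] is a model score for noun phrase i being the
    antecedent. Returns (labels, index); an all-zero label sequence with
    index None means no candidate qualified, which corresponds to
    falling back to the document title as the antecedent.
    """
    if not candidate_scores:
        return [], None
    best = max(range(len(candidate_scores)),
               key=lambda i: candidate_scores[i])
    if candidate_scores[best] <= 0.0:  # fall-back threshold (assumption)
        return [0] * len(candidate_scores), None
    labels = [0] * len(candidate_scores)
    labels[best] = 1
    return labels, best
```

The contrast with the earlier binary-classification approach is that here the "exactly one antecedent" constraint is part of the output structure being scored, rather than being imposed afterwards on independent per-phrase decisions.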

Membership Fluidity and Knowledge Collaboration in Virtual Communities: A Multilateral Approach to Membership Fluidity (가상 커뮤니티의 멤버 유동성과 지식 협업: 멤버 유동성에 대한 다각적 접근)

  • Park, Hyun-jung;Shin, Kyung-shik
    • Journal of Intelligence and Information Systems / v.21 no.2 / pp.19-47 / 2015
  • In this era of the knowledge economy, a variety of virtual communities are proliferating for the purpose of knowledge creation and utilization. Since the voluntary contributions of members are the essential source of knowledge, member turnover can have significant implications for the survival and success of virtual communities. However, there is a dearth of research on the effect of membership turnover, and even the method of measuring membership turnover in virtual communities is unclear. In a traditional context, membership turnover is calculated as the ratio of the number of departing members to the average number of members for a given time period. In virtual communities, while the influx of newcomers can be clearly measured, the magnitude of departure is elusive, since explicit withdrawals are seldom executed. In addition, there is no common way to determine the average number of community members, who return and contribute intermittently at will. This study first examines the limitations of applying the traditional concept of turnover to virtual communities and proposes five membership fluidity measures based on a preliminary analysis of the editing behaviors behind 2,978 featured articles in English Wikipedia. It then investigates the relationships between three selected membership fluidity measures and group collaboration performance, reflecting a moderating effect dependent on work characteristic. We obtained the following results. First, membership turnover relates to collaboration efficiency in a right-shortened U-shaped manner, with a moderating effect from work characteristic: given the same turnover rate, the promotion likelihood for a more professional task is lower than that for a less professional task, and the likelihood difference diminishes as the turnover rate increases.
Second, contribution period relates to collaboration efficiency in a left-shortened U-shaped manner, with a moderating effect from work characteristic; the marginal performance change per unit change of contribution period is greater for a less professional task. Third, the number of new participants per month relates to collaboration efficiency in a left-shortened reversed U-shaped manner, for which the moderating effect from work characteristic appears to be insignificant.
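The traditional turnover formula quoted above, and one possible way to infer departures from edit logs when explicit withdrawals are absent, can be sketched as below. The inactivity-gap heuristic is our illustration of the measurement problem, not a reproduction of the paper's five fluidity measures.

```python
def traditional_turnover(departures, members_start, members_end):
    """Classic turnover: departures over the average headcount
    for the period."""
    avg = (members_start + members_end) / 2
    return departures / avg if avg else 0.0

def inferred_turnover(last_edit_by_member, period_end, inactivity_gap):
    """Virtual-community variant (an illustrative assumption): a member
    counts as departed if their last edit is more than `inactivity_gap`
    time units before the end of the period."""
    if not last_edit_by_member:
        return 0.0
    departed = sum(1 for last_edit in last_edit_by_member.values()
                   if period_end - last_edit > inactivity_gap)
    return departed / len(last_edit_by_member)
```

The choice of `inactivity_gap` is exactly the kind of arbitrary decision the abstract points to: members who contribute intermittently at will make any fixed cutoff contestable.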

An alternative way of Animation Industry : Focusing on Avatar sevice's Lock-in Effect (애니메이션 산업의 대안적 연구 - 아바타 서비스의 소비자 고착화(lock-in) 전략을 중심으로)

  • Han, Chang-Wan
    • Cartoon and Animation Studies / s.6 / pp.152-171 / 2002
  • This study analyses the avatar service, which is recognized as an alternative strategy for the animation industry. The research questions are as follows: (1) How have avatar services developed, and what are the present dominant types? (2) Which structural characteristics of the e-business environment are needed for the success of avatar services? (3) What are the economic characteristics of the avatar business model? To answer these questions, the basic conditions and structural characteristics of avatar services were investigated. Two forms of avatar service can be distinguished. One is the internet service site whose primary offering is a chatting service based on avatars. The other is the portal site in which many kinds of products and services are presented as bundles to meet the needs of internet users, the avatar service being one of those bundles. In this study, the five largest internet service sites were selected based on the profits they earned through sales of avatar services. The analysis shows that the pricing strategy of these five sites is very different from that of traditional offline markets: the pricing mechanism is based on the value that internet users place on the avatar items, not on the costs of making them. The avatar is a representative information good. Information goods have a distinctive cost structure, with constant fixed costs and zero marginal costs, so providers of avatar services focus on the subjective values of consumers. The Sayclub, the most successful avatar service site, with average sales of 3 billion won a month, takes the aggressive strategy of pricing avatar items at the highest prices in the industry. The avatar service providers that make large profits plan to differentiate their services by introducing well-known brand items and star-named items.
Nevertheless, the fact that the Sayclub's membership is not decreasing shows how strongly the network effect of the site manifests itself. Moreover, the costs that members have already paid for avatar items are so large that switching from one site to another can be very costly. Such switching costs are endemic in high-technology and digital contents industries. They can be so large that switching suppliers is virtually unthinkable, a situation known as 'lock-in'. When switching costs are substantial, competition to attract new customers can be intense, since, once customers are locked in, they can be a substantial source of profit. Consumers of avatar items face switching costs if they subscribe to a new avatar service site: the subscription costs as well as the cost of giving up the items they have already paid for. One common example of switching costs involves specialized supplies, as with inkjet printer cartridges, where the switching cost is the purchase of a new printer. The market is competitive ex ante, but since cartridges are incompatible it is monopolized ex post, so printer/cartridge providers price printers cheaply and cartridges expensively. By contrast, since the avatar service succeeds through a strong network effect, providers of avatar services have to compete aggressively for new customers, so they allow subscription at a low price (almost marginal cost) in the early market. The network effect is maximized when membership is growing sufficiently. Providers that attain monopoly power with sufficient subscribers then begin to raise prices over the lifetime of the product and make profits.
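The switching-cost logic above reduces to a simple comparison: a user defects only when the rival's extra benefit outweighs the new subscription plus the value of avatar items forfeited at the old site. The function and numbers below are purely illustrative.

```python
def will_switch(rival_extra_benefit, new_subscription_cost, sunk_item_value):
    """A user switches services only if the rival's extra benefit exceeds
    the total switching cost: the new subscription plus the value of
    avatar items forfeited at the old site (illustrative model)."""
    switching_cost = new_subscription_cost + sunk_item_value
    return rival_extra_benefit > switching_cost
```

This is why the abstract notes that providers subsidize entry (near-marginal-cost subscriptions) and then raise item prices: every item purchased raises `sunk_item_value` and deepens the lock-in.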


"Legal Study on Boundary between Airspace and Outer Space" (영공(領空)과 우주공간(宇宙空間)의 한계(限界)에 관한 법적(法的) 고찰(考察))

  • Choi, Wan-Sik
    • The Korean Journal of Air & Space Law and Policy / v.2 / pp.31-67 / 1990
  • One of the first issues to arise in the evolution of air law was the determination of the vertical limits of airspace over private property. In 1959 the UN, in its Ad Hoc Committee on the Peaceful Uses of Outer Space, started to give attention to the question of the meaning of the term "outer space". Discussions in the United Nations regarding the delimitation issue were often divided between those in favour of a functional approach ("functionalists") and those seeking the delineation of a boundary ("spatialists"). The functionalists, backed initially by both major space powers, which viewed any boundary as possibly restricting their access to space (whether for peaceful or military purposes), won the first rounds, starting with the 1959 Report of the Ad Hoc Committee on the Peaceful Uses of Outer Space, which did not consider that the topic called for priority consideration. In 1966, however, the spatialists were able to place the issue on the agenda of the Outer Space Committee pursuant to Resolution 2222 (XXI). Even then, the spatialists were not able to present a common position, since there existed a variety of propositions for the delineation of a boundary. Over the years, the functionalists have seemed to be losing ground. As the element of location is a decisive factor in the choice of the legal regime to be applied, a purely functional approach to the regulation of activities in the space above the Earth does not offer a solution. It is therefore to be welcomed that there is clear evidence of a growing recognition of the defect inherent in such an approach, and that a spatial approach to the problem is gaining support from a growing number of States as well as from publicists.
The search for a solution to the problem of demarcating the two different legal regimes governing the space above the Earth has undoubtedly been facilitated, and a number of countries, among them Argentina, Belgium, France, Italy, and Mexico, have already advocated acceptance of the lower boundary of outer space at a height of 100 km. The adoption of the principle of sovereignty at that height does not mean that States would not be allowed to take protective measures against space activities above that height which constitute a threat to their security. A parallel can be drawn with the defence of the State's security on the high seas: measures taken by States in their own protection on the high seas outside territorial waters, provided that they are proportionate to the danger, are not considered to infringe international law. The most important issue in this context relates to the problem of a right of passage for spacecraft through foreign airspace in order to reach outer space. The reports to former ILA Conferences explained why no customary rule of freedom of passage for aircraft through foreign territorial airspace could yet be said to exist. It was suggested, however, that though the essential elements for the creation of a rule of customary international law allowing such passage were still lacking, developments appeared to point to a steadily growing feeling of the necessity for such a rule. A definitive treaty solution of the demarcation problem would require further study, which should be carried out by the UN Outer Space Committee in close co-operation with other interested international organizations, including ICAO. If a limit between airspace and outer space were established, airspace would automatically come under the regime of the Chicago Convention alone. The use of the word "recognize" in Art.
I of the Chicago Convention is an acknowledgement of sovereignty over airspace as an existing general principle of law, whose binding force exists independently of the Convention. It is further important to note that the Article recognizes this sovereignty as existing for every state, holding it immaterial whether the state is or is not a contracting state. With the functional criteria created by reference either to the nature of the activity or to the nature of the space object, the next hurdle would be to provide methods of verification. With regard to international verification, the establishment of an International Satellite Monitoring Agency is required. The path towards the successful delimitation of outer space from territorial space is doubtless narrow and stony, but it is still believed that the establishment of a precise legal framework, consonant with the basic principles of international law, for the future activities of states in outer space will remove a source of potentially dangerous conflicts between states and, furthermore, afford some safeguard of the rights and interests of non-space powers, which are otherwise likely to be eroded by incipient customs based on the at present almost complete freedom of action of the space powers.
