• Title/Abstract/Keyword: Virtual reality simulation

The effect of restaurant in-store color and music congruency on customer's emotional responses and behavioral intentions (레스토랑 실내의 색채와 배경 음악의 조화가 고객의 감정적 반응 및 행동 의도에 미치는 영향)

  • Jo, Mi-Na
    • Science of Emotion and Sensibility / v.14 no.1 / pp.27-38 / 2011
  • This study investigated the effects of restaurant in-store color and music congruency on consumers' emotional responses and behavioral intentions. A web survey was conducted with 400 customers (aged 20-39) living in Seoul, Kyunggi, and Incheon. To examine the combined effect of color and music, 3D Studio MAX was used to create high-stimulus (exciting) and low-stimulus (calm) conditions, which were presented as 3D virtual reality restaurant simulation stimuli. Statistical analyses were performed with SPSS/WIN 18.0 using reliability analysis, factor analysis, and regression analysis. Factor analysis classified emotional responses into two factors, positive emotion and negative emotion, while satisfaction and loyalty each loaded on a single factor. Cronbach's alpha was calculated to assess the reliability of the survey instrument. The results showed that in-store color and music congruency affected both positive and negative emotion, positive and negative emotion affected satisfaction, and satisfaction affected loyalty. Music congruency had a stronger effect on positive emotion than color congruency, whereas color congruency had a stronger effect on negative emotion than music congruency. These results can serve as a basis for matching color and music within restaurant atmospherics.
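
The analysis pipeline described in this abstract (Cronbach's alpha for reliability, factor analysis, then regressions along congruency → emotion → satisfaction → loyalty) was run in SPSS/WIN 18.0. As a rough illustration only, the following Python sketch reproduces the shape of that pipeline on simulated data; every column name and value is invented, and it is not the authors' analysis.

```python
# Hypothetical sketch of the reliability / factor / regression pipeline described
# in the abstract; the study itself used SPSS/WIN 18.0. Column names and data
# are invented for illustration only.
import numpy as np
import pandas as pd
import statsmodels.api as sm
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)
n = 400  # survey respondents in the study

# Simulated responses: congruency ratings, six emotion items, satisfaction, loyalty.
emotion_items = [f"emo_{i}" for i in range(1, 7)]
df = pd.DataFrame(rng.normal(size=(n, 10)),
                  columns=["color_cong", "music_cong", *emotion_items,
                           "satisfaction", "loyalty"])

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha for a block of scale items."""
    k = items.shape[1]
    return k / (k - 1) * (1 - items.var(ddof=1).sum() / items.sum(axis=1).var(ddof=1))

print("alpha (emotion items):", round(cronbach_alpha(df[emotion_items]), 3))

# Factor analysis: the study extracted two emotion factors (positive, negative).
fa = FactorAnalysis(n_components=2, random_state=0)
scores = fa.fit_transform(df[emotion_items])
df["pos_emotion"], df["neg_emotion"] = scores[:, 0], scores[:, 1]

# Regressions mirroring the reported paths:
# congruency -> emotion, emotion -> satisfaction, satisfaction -> loyalty.
m1 = sm.OLS(df["pos_emotion"], sm.add_constant(df[["color_cong", "music_cong"]])).fit()
m2 = sm.OLS(df["satisfaction"], sm.add_constant(df[["pos_emotion", "neg_emotion"]])).fit()
m3 = sm.OLS(df["loyalty"], sm.add_constant(df[["satisfaction"]])).fit()
print(m1.params, m2.params, m3.params, sep="\n")
```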

GPU-only Terrain Rendering for Walk-through (Walk-through를 지원하는 GPU 기반 지형렌더링)

  • Park, Sun-Yong;Oh, Kyoung-Su;Cho, Sung-Hyun
    • Journal of Korea Game Society / v.7 no.4 / pp.71-80 / 2007
  • In this paper, we introduce an efficient GPU-based real-time rendering technique applicable to any kind of game. Our method represents terrain with only a height map, without extra geometry. It allows free movement in the air or on the surface, so it can be applied directly to computer games as well as virtual reality. Because the method is not built on a geometric structure, it needs no special LOD policy, and the precision of the geometric representation and the visual quality depend entirely on the resolution of the height map and color map. Moreover, a GPU-only technique frees the CPU for other work and thus improves overall system performance. To date, much research has addressed terrain representation, but most of it relies on the CPU or confines its application to flight simulation. By improving existing displacement mapping techniques and applying them to terrain rendering, we eliminate the problems, such as cracking and popping, that arise in polygon-based techniques. The main contributions are efficient handling of an arbitrary LOS (line of sight) and dramatically improved visual quality during walk-through, achieved by reconstructing the height field with curved patches. We also suggest a simple and useful method for calculating ray-patch intersections. We implemented everything entirely on the GPU and obtained frame rates from tens to hundreds with height maps of various resolutions (256×256 to 4096×4096).
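
The paper intersects view rays with curved patches reconstructed from the height field, entirely on the GPU. As a simplified, hypothetical illustration of the underlying idea, the sketch below marches a ray against a bilinearly sampled height map on the CPU; the function names, step size, and refinement strategy are assumptions, not the paper's method.

```python
# Simplified illustration of ray / height-field intersection by stepping along
# the view ray and comparing against bilinearly sampled heights. The paper
# intersects rays with curved patches entirely on the GPU; this CPU sketch only
# conveys the general idea and uses invented names and parameters.
import numpy as np

def sample_height(heightmap: np.ndarray, x: float, z: float) -> float:
    """Bilinear height lookup at fractional grid coordinates (x, z)."""
    h, w = heightmap.shape
    x = np.clip(x, 0, w - 1.001)
    z = np.clip(z, 0, h - 1.001)
    x0, z0 = int(x), int(z)
    fx, fz = x - x0, z - z0
    top = (1 - fx) * heightmap[z0, x0] + fx * heightmap[z0, x0 + 1]
    bot = (1 - fx) * heightmap[z0 + 1, x0] + fx * heightmap[z0 + 1, x0 + 1]
    return (1 - fz) * top + fz * bot

def intersect_heightfield(heightmap, origin, direction, step=0.25, max_t=512.0):
    """March along the ray until it drops below the terrain; return hit point or None."""
    direction = direction / np.linalg.norm(direction)
    t = 0.0
    while t < max_t:
        p = origin + t * direction
        if p[1] <= sample_height(heightmap, p[0], p[2]):
            return p  # a real renderer would refine this hit (e.g. binary search)
        t += step
    return None

heightmap = np.random.default_rng(1).random((256, 256)) * 20.0
hit = intersect_heightfield(heightmap, np.array([10.0, 40.0, 10.0]),
                            np.array([1.0, -0.3, 1.0]))
print(hit)
```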

Analysis of Emerging Geo-technologies and Markets Focusing on Digital Twin and Environmental Monitoring in Response to Digital and Green New Deal (디지털 트윈, 환경 모니터링 등 디지털·그린 뉴딜 정책 관련 지질자원 유망기술·시장 분석)

  • Ahn, Eun-Young;Lee, Jaewook;Bae, Junhee;Kim, Jung-Min
    • Economic and Environmental Geology / v.53 no.5 / pp.609-617 / 2020
  • Following the introduction of its Industry 4.0 policy, the Korean government announced the 'Digital New Deal' and 'Green New Deal' as the 'Korean New Deal' in 2020. We analyzed the Korea Institute of Geoscience and Mineral Resources (KIGAM)'s research projects related to that policy and conducted a market analysis focused on digital twin and environmental monitoring technologies. For the 'Data Dam' policy, we suggested digital geo-contents using Augmented Reality (AR) and Virtual Reality (VR) and a public geo-data collection and sharing system. Smart mining and digital oil field research should be expanded and supported under the policy of converging 5th generation mobile communication (5G) and artificial intelligence (AI) into all industries. The Korean government proposes downtown 3D maps under the 'Digital Twin' policy, and KIGAM can provide 3D geological maps and Internet of Things (IoT) systems for social overhead capital (SOC) management. The 'Green New Deal' calls for developing technologies for green industries, including resource circulation, Carbon Capture Utilization and Storage (CCUS), and electric and hydrogen vehicles. KIGAM has carried out related research projects and currently conducts research on domestic energy storage minerals. Oil and gas are presented as representative applications of the digital twin. Much progress has been made in mining automation and digital mapping, and Digital Twin Earth (DTE) is an emerging research subject. These emerging subjects are deeply related to data analysis, simulation, AI, and the IoT, so KIGAM should collaborate with sensor and computing software and system companies.

A Study on Metaverse Construction Based on 3D Spatial Information of Convergence Sensors using Unreal Engine 5 (언리얼 엔진 5를 활용한 융복합센서의 3D 공간정보기반 메타버스 구축 연구)

  • Oh, Seong-Jong;Kim, Dal-Joo;Lee, Yong-Chang
    • Journal of Cadastre & Land InformatiX / v.52 no.2 / pp.171-187 / 2022
  • Recently, demand for and development of non-face-to-face services have progressed rapidly due to the COVID-19 pandemic, with the metaverse at the center of attention. In the era of the 4th industrial revolution, the metaverse, a world beyond the boundary of virtual and real, combines various sensing and 3D reconstruction technologies to provide information and services to users easily and quickly. In particular, as convergence sensors such as unmanned aerial vehicles (UAV) capable of high-resolution imaging and high-precision LiDAR (Light Detection and Ranging) sensors become smaller and more affordable, research on digital twins that create and simulate real-life counterparts is actively underway. Game engines in the field of computer graphics are also developing into metaverse engines by extending their strong 3D graphics reconstruction and dynamics-based simulation. This study constructed a mirror-world type metaverse that reflects real-world coordinate-based reality using Unreal Engine 5, a recently announced metaverse engine, with accurate 3D spatial information from convergence sensors based on an unmanned aerial system (UAS) and LiDAR. Spatial information contents and simulations for users were then produced from various public data to verify the accuracy of the reconstruction, confirming that a more realistic and highly usable metaverse could be built. In addition, when constructing a metaverse that users can access intuitively and easily through Unreal Engine, the utilization and effectiveness of various contents could be confirmed through coordinate-based 3D spatial information with high reproducibility.
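
One step implied by this workflow is mapping georeferenced sensor data into a local, metric game-engine frame. The sketch below is a hypothetical illustration of that mapping using pyproj; the choice of EPSG:5186 (a Korean projected CRS), the origin values, and the centimeter conversion for Unreal Engine are assumptions for illustration, not details taken from the paper.

```python
# Hypothetical sketch of one step implied by the abstract: placing georeferenced
# survey points (e.g. UAS / LiDAR output) into a local, metric game-engine frame.
# The CRS codes, origin, and unit handling below are illustrative assumptions,
# not the paper's actual workflow.
from pyproj import Transformer

# Example: WGS84 geographic coordinates -> Korea 2000 / Central Belt 2010 (EPSG:5186).
to_projected = Transformer.from_crs("EPSG:4326", "EPSG:5186", always_xy=True)

# Choose a level/world origin so engine coordinates stay small near the site.
ORIGIN_E, ORIGIN_N, ORIGIN_H = 200000.0, 550000.0, 50.0  # invented origin (m)

def to_engine_coords(lon: float, lat: float, height_m: float):
    """Return (x, y, z) in centimeters relative to the chosen origin
    (Unreal Engine uses centimeters as its world unit)."""
    easting, northing = to_projected.transform(lon, lat)
    return ((easting - ORIGIN_E) * 100.0,
            (northing - ORIGIN_N) * 100.0,
            (height_m - ORIGIN_H) * 100.0)

print(to_engine_coords(127.0, 37.5, 62.0))
```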

Development of a Stock Trading System Using M & W Wave Patterns and Genetic Algorithms (M&W 파동 패턴과 유전자 알고리즘을 이용한 주식 매매 시스템 개발)

  • Yang, Hoonseok;Kim, Sunwoong;Choi, Heung Sik
    • Journal of Intelligence and Information Systems / v.25 no.1 / pp.63-83 / 2019
  • Investors prefer to look for trading points based on the shapes shown in a chart rather than on complex analyses such as corporate intrinsic value analysis or technical indicator analysis. However, pattern analysis is difficult and has been computerized less than users need. In recent years, many studies have examined stock price patterns using machine learning techniques, including neural networks, from the field of artificial intelligence (AI). In particular, advances in IT have made it easier to analyze huge volumes of chart data to find patterns that can predict stock prices. Although short-term forecasting power has improved, long-term forecasting power remains limited, so such models are used for short-term trading rather than long-term investment. Other studies have focused on mechanically and accurately identifying patterns that earlier technology could not recognize, but this can be fragile in practice because whether the discovered patterns are suitable for trading is a separate question. Those studies find a point that matches a meaningful pattern and then measure performance n days later, assuming a purchase at that point; because this only calculates virtual revenue, it can diverge considerably from reality. Whereas existing research tries to discover patterns with predictive power, this study proposes to define the patterns first and to trade when a pattern with a high success probability appears. The M&W wave patterns published by Merrill (1980) are simple because each can be distinguished by five turning points. Although some patterns have been reported to have price predictability, no performance reports from actual market use exist. The simplicity of a five-turning-point pattern has the advantage of reducing the cost of improving pattern recognition accuracy. In this study, 16 upward-reversal patterns and 16 downward-reversal patterns are reclassified into ten groups so that they can be easily implemented in a system, and only the one pattern with the highest success rate in each group is selected for trading. Patterns that had a high probability of success in the past are likely to succeed in the future, so we trade when such a pattern occurs; performance is measured assuming that both the buy and the sell are actually executed, which makes the evaluation realistic. We tested three ways to calculate turning points. The first, the minimum change rate zig-zag method, removes price movements below a certain percentage and computes the vertices. In the second, the high-low line zig-zag method, a high price that touches the n-day high line is taken as a peak, and a low price that touches the n-day low line is taken as a valley. In the third, the swing wave method, a central high price that is higher than the n high prices to its left and right is taken as a peak, and a central low price that is lower than the n low prices to its left and right is taken as a valley. The swing wave method outperformed the other methods in our tests, which we interpret to mean that trading after a pattern has completed is more effective than trading while the pattern is still forming.
Genetic algorithms (GA) were the most suitable solution because the number of cases was far too large to find high-success-rate patterns exhaustively in this simulation. We also ran the simulation with walk-forward analysis (WFA), which separates the test period from the application period, allowing the system to respond appropriately to market changes. We optimized at the portfolio level because optimizing the variables for each individual stock risks over-optimization; we therefore set the number of constituent stocks to 20 to gain the benefit of diversification while avoiding over-fitting. We tested the KOSPI market by dividing it into six categories. The small-cap portfolio was the most successful and the high-volatility portfolio was second best, which suggests that patterns need some price volatility to take shape, but that more volatility is not always better.
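
The swing wave rule quoted above (a central high above the n highs on each side is a peak; a central low below the n lows on each side is a valley) translates directly into code. The sketch below is a minimal illustration of that rule on simulated daily highs and lows; the window size n and the data are invented, and it is not the authors' implementation.

```python
# Minimal sketch of the swing-wave turning-point rule described in the abstract:
# a bar whose high exceeds the n highs on both sides is a peak, and a bar whose
# low is below the n lows on both sides is a valley. Illustrative only.
import numpy as np

def swing_turning_points(high: np.ndarray, low: np.ndarray, n: int = 3):
    """Return lists of (index, price) peaks and valleys."""
    peaks, valleys = [], []
    for i in range(n, len(high) - n):
        window_h = np.concatenate([high[i - n:i], high[i + 1:i + n + 1]])
        window_l = np.concatenate([low[i - n:i], low[i + 1:i + n + 1]])
        if high[i] > window_h.max():
            peaks.append((i, high[i]))
        if low[i] < window_l.min():
            valleys.append((i, low[i]))
    return peaks, valleys

# Five consecutive turning points (peaks and valleys merged in time order)
# would then be matched against the predefined M & W patterns.
rng = np.random.default_rng(42)
close = 100 + np.cumsum(rng.normal(0, 1, 250))          # simulated daily closes
high, low = close + rng.random(250), close - rng.random(250)
peaks, valleys = swing_turning_points(high, low, n=5)
turning_points = sorted(peaks + valleys)                # ordered by index
print(turning_points[:5])
```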

Edge to Edge Model and Delay Performance Evaluation for Autonomous Driving (자율 주행을 위한 Edge to Edge 모델 및 지연 성능 평가)

  • Cho, Moon Ki;Bae, Kyoung Yul
    • Journal of Intelligence and Information Systems / v.27 no.1 / pp.191-207 / 2021
  • Mobile communications have evolved rapidly over the decades from 2G to 5G, mainly focusing on higher speeds to meet growing data demands. With the start of the 5G era, efforts are being made to provide customers with a wide range of services, such as IoT, V2X, robots, artificial intelligence, augmented and virtual reality, and smart cities, which are expected to change our lives and industries as a whole. To deliver these services, reduced latency and high reliability are critical for real-time applications, in addition to high data rates. Accordingly, 5G targets a maximum speed of 20 Gbps, a delay of 1 ms, and a connection density of 10⁶ devices/km². In particular, intelligent traffic control systems and services based on vehicle-to-everything (V2X) communication depend heavily on low delay and high reliability for real-time operation. 5G uses high frequencies of 3.5 GHz and 28 GHz. These high-frequency waves support high speeds because of their straight-line propagation, but their short wavelength and small diffraction angle limit their range and prevent them from penetrating walls, restricting indoor use. Existing networks therefore struggle to overcome these constraints. The underlying centralized SDN also has limited capability to offer delay-sensitive services because communicating with many nodes overloads its processing. SDN, an architecture that separates control-plane signaling from data-plane packets, must manage the delay-related tree structure available in the event of an emergency during autonomous driving. In such scenarios, the network architecture that handles in-vehicle information is a major determinant of delay. Since centralized SDN structures have difficulty meeting the required delay level, the optimal size of an SDN for information processing should be studied. SDNs therefore need to be split at a certain scale into a new type of network that can respond efficiently to dynamically changing traffic and provide high-quality, flexible services. The structure of such networks is closely tied to ultra-low latency, high reliability, and hyper-connectivity and should be based on a new form of split SDN rather than the existing centralized SDN structure, even under worst-case conditions. In these split-SDN networks, where vehicles pass through small 5G cells very quickly, the information change cycle, the round-trip delay (RTD), and the SDN data processing time are the factors most correlated with the overall delay. Of these, RTD is not significant because it is fast enough, contributing less than 1 ms, but the information change cycle and the SDN processing time greatly affect the delay. In an emergency in a self-driving environment linked to an ITS (Intelligent Traffic System), which requires low latency and high reliability, information must be transmitted and processed very quickly; this is exactly where delay becomes highly sensitive. In this paper, we study the SDN architecture for emergencies during autonomous driving and use simulation to analyze its correlation with the cell layer from which the vehicle should request the relevant information according to the information flow.
For the simulation, because the 5G data rate is high enough, we assume that information supporting neighboring vehicles reaches the car without errors. We further assume 5G small cells with radii of 50-250 m and vehicle speeds of 30-200 km/h in order to examine the network architecture that minimizes the delay.
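
As a rough, hypothetical illustration of the delay picture sketched in this abstract, the code below combines the stated assumptions (cell radii of 50-250 m, speeds of 30-200 km/h, RTD under 1 ms) with invented values for the information change cycle, SDN processing time, and delay budget to show how cell dwell time and total delay could be tabulated; none of the numeric values beyond the stated ranges come from the paper.

```python
# Hypothetical back-of-the-envelope sketch of the delay picture described in the
# abstract: a vehicle crossing a 5G small cell, with total service delay driven
# mainly by the information change cycle and the SDN processing time (RTD is
# assumed < 1 ms). All parameter values below are illustrative assumptions.
from itertools import product

CELL_RADII_M = [50, 100, 150, 200, 250]      # assumed small-cell radii (m)
SPEEDS_KMH = [30, 60, 100, 150, 200]         # assumed vehicle speeds (km/h)

RTD_MS = 1.0              # round-trip delay, treated as negligible in the paper
INFO_CYCLE_MS = 10.0      # assumed information change cycle
SDN_PROC_MS = 5.0         # assumed SDN data processing time
DELAY_BUDGET_MS = 20.0    # assumed end-to-end budget for an emergency message

for radius_m, speed_kmh in product(CELL_RADII_M, SPEEDS_KMH):
    speed_ms = speed_kmh / 3.6                      # km/h -> m/s
    dwell_s = (2 * radius_m) / speed_ms             # time to cross the cell diameter
    total_delay_ms = INFO_CYCLE_MS + SDN_PROC_MS + RTD_MS
    ok = total_delay_ms <= DELAY_BUDGET_MS
    print(f"r={radius_m:3d} m, v={speed_kmh:3d} km/h: "
          f"dwell={dwell_s:5.1f} s, delay={total_delay_ms:4.1f} ms, "
          f"within budget={ok}")
```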