• Title/Summary/Keyword: Architecture exploration

Search Results: 170

Development and Application of a Source for Crosshole Seismic Method to Determine Body Wave Velocity with Depth at Multi-layered Sites (다층 구성 부지에서의 깊이별 실체파 속도의 결정을 위한 시추공간 탄성파 탐사 발진 장치 개발 및 적용)

  • Sun, Chang-Guk;Mok, Young-Jin
    • Geophysics and Geophysical Exploration / v.9 no.3 / pp.193-206 / 2006
  • Among the various borehole seismic testing techniques for determining body wave velocity, the crosshole seismic method is known as one of the most suitable techniques for reliably evaluating geotechnical dynamic properties. In this study, to perform the crosshole seismic test successfully in rock as well as soil layers regardless of the groundwater level, a multi-purpose spring-loaded source that horizontally impacts the subsurface ground in a vertical borehole was developed and applied at major facility sites in Korea. The geotechnical dynamic properties were evaluated by efficiently determining the body wave velocities, such as shear wave velocity and compressional wave velocity, from the horizontally impacted crosshole seismic tests at the study sites, and were provided as fundamental parameters for the seismic performance evaluation and seismic design of the target facilities.
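
For orientation, the body wave velocities in a crosshole test follow directly from the borehole separation and the picked travel times, and the small-strain dynamic properties follow from those velocities. The sketch below uses hypothetical numbers (the spacing, travel times, and density are assumptions, not values from the study) purely to show the calculation.

```python
# Illustrative crosshole velocity calculation (values are hypothetical).
borehole_spacing_m = 3.0        # assumed horizontal distance between boreholes
tp, ts = 0.0009, 0.0018         # assumed P- and S-wave travel times (s)

vp = borehole_spacing_m / tp    # compressional wave velocity (m/s)
vs = borehole_spacing_m / ts    # shear wave velocity (m/s)

rho = 1900.0                    # assumed soil mass density (kg/m^3)
g_max = rho * vs ** 2           # small-strain shear modulus (Pa)
poisson = (vp**2 - 2*vs**2) / (2 * (vp**2 - vs**2))  # dynamic Poisson's ratio

print(f"Vp={vp:.0f} m/s, Vs={vs:.0f} m/s, "
      f"Gmax={g_max/1e6:.1f} MPa, nu={poisson:.2f}")
```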

A Design and Implementation of a Timing Analysis Simulator for a Design Space Exploration on a Hybrid Embedded System (Hybrid 내장형 시스템의 설계공간탐색을 위한 시간분석 시뮬레이터의 설계 및 구현)

  • Ahn, Seong-Yong;Shim, Jea-Hong;Lee, Jeong-A
    • The KIPS Transactions:PartA / v.9A no.4 / pp.459-466 / 2002
  • Modern embedded systems employ a hybrid architecture that combines a general-purpose microprocessor with reconfigurable devices such as FPGAs to retain flexibility and to meet timing constraints. Exploring and finding the right system configuration, known as design space exploration (DSE), is a hard and important problem for embedded system designers. With DSE, it is possible to predict the final system configuration during the design phase, before physical implementation. In this paper, we implement a timing analysis simulator for DSE on a hybrid embedded system. The simulator, which integrates existing timing analysis tools for hardware and software, is designed by extending the Y-chart approach, which allows quantitative performance analysis by varying design parameters. This timing analysis simulator is expected to reduce design time and costs and to serve as a core module of DSE for a hybrid embedded system.
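
As a rough sketch of the Y-chart idea of mapping an application onto candidate architectures and evaluating each configuration quantitatively, the loop below enumerates hardware/software mappings under an area budget and keeps the lowest-latency one. The task names, cycle counts, and area figures are illustrative assumptions, not outputs of the paper's simulator.

```python
from itertools import product

# Hypothetical application tasks and per-task timing/area models.
tasks = ["filter", "fft", "control"]
sw_cycles = {"filter": 12000, "fft": 50000, "control": 3000}   # on the processor
hw_cycles = {"filter": 1500,  "fft": 4000,  "control": 2500}   # on the FPGA
hw_area   = {"filter": 800,   "fft": 2600,  "control": 400}    # FPGA slices
area_budget = 3000

best = None
for mapping in product(["SW", "HW"], repeat=len(tasks)):
    area = sum(hw_area[t] for t, m in zip(tasks, mapping) if m == "HW")
    if area > area_budget:          # prune configurations that exceed the budget
        continue
    latency = sum(hw_cycles[t] if m == "HW" else sw_cycles[t]
                  for t, m in zip(tasks, mapping))
    if best is None or latency < best[0]:
        best = (latency, mapping, area)

print("best configuration:", dict(zip(tasks, best[1])),
      "latency =", best[0], "cycles, area =", best[2], "slices")
```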

Acceleration of computation speed for elastic wave simulation using a Graphic Processing Unit (그래픽 프로세서를 이용한 탄성파 수치모사의 계산속도 향상)

  • Nakata, Norimitsu;Tsuji, Takeshi;Matsuoka, Toshifumi
    • Geophysics and Geophysical Exploration / v.14 no.1 / pp.98-104 / 2011
  • Numerical simulation in exploration geophysics provides important insights into subsurface wave propagation phenomena. Although elastic wave simulations take longer to compute than acoustic simulations, an elastic simulator can construct more realistic wavefields, including shear components; it is therefore suitable for exploring the responses of elastic bodies. To overcome the long duration of the calculations, we use a Graphic Processing Unit (GPU) to accelerate the elastic wave simulation. Because a GPU has many processors and a wide memory bandwidth, we can use it as a parallel computing architecture. The GPU board used in this study is an NVIDIA Tesla C1060, which has 240 processors and a 102 GB/s memory bandwidth. Even with the parallel computing architecture (CUDA) developed by NVIDIA, we must optimise the usage of the different types of memory on the GPU device, and the sequence of calculations, to obtain a significant speedup of the computation. In this study, we simulate two-dimensional (2D) and three-dimensional (3D) elastic wave propagation using the Finite-Difference Time-Domain (FDTD) method on GPUs. In the wave propagation simulation, we adopt the staggered-grid method, one of the conventional FD schemes, since it achieves sufficient accuracy for numerical modelling in geophysics. Our simulator optimises the usage of memory on the GPU device to reduce data access times and uses faster memory as much as possible, which is a key factor in GPU computing. By using one GPU device and optimising its memory usage, we improved the computation time by more than 14 times in the 2D simulation, and by over six times in the 3D simulation, compared with one CPU. Furthermore, by using three GPUs, we succeeded in accelerating the 3D simulation 10 times.
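
As a rough illustration of the staggered-grid velocity-stress update the abstract refers to, the following NumPy sketch steps a small 2D elastic FDTD grid on the CPU. The grid size, material parameters, and source are assumptions; a GPU version would express the same updates as CUDA kernels with careful placement of arrays in the device's faster memory types.

```python
import numpy as np

# Minimal 2D elastic velocity-stress FDTD sketch on a staggered grid
# (illustrative only; all values below are assumed, not taken from the paper).
nx, nz = 200, 200              # grid points
dx = dz = 5.0                  # grid spacing (m)
dt = 5e-4                      # time step (s), chosen to respect a CFL-type bound
vp, vs, rho = 3000.0, 1730.0, 2500.0
mu = rho * vs**2               # shear modulus
lam = rho * vp**2 - 2 * mu     # Lame's first parameter

vx = np.zeros((nz, nx)); vz = np.zeros((nz, nx))
sxx = np.zeros((nz, nx)); szz = np.zeros((nz, nx)); sxz = np.zeros((nz, nx))

def step(it):
    # update particle velocities from stress derivatives
    vx[1:-1, 1:-1] += dt / rho * ((sxx[1:-1, 2:] - sxx[1:-1, 1:-1]) / dx +
                                  (sxz[1:-1, 1:-1] - sxz[:-2, 1:-1]) / dz)
    vz[1:-1, 1:-1] += dt / rho * ((sxz[1:-1, 1:-1] - sxz[1:-1, :-2]) / dx +
                                  (szz[2:, 1:-1] - szz[1:-1, 1:-1]) / dz)
    # update stresses from velocity derivatives
    dvx_dx = (vx[1:-1, 1:-1] - vx[1:-1, :-2]) / dx
    dvz_dz = (vz[1:-1, 1:-1] - vz[:-2, 1:-1]) / dz
    sxx[1:-1, 1:-1] += dt * ((lam + 2 * mu) * dvx_dx + lam * dvz_dz)
    szz[1:-1, 1:-1] += dt * (lam * dvx_dx + (lam + 2 * mu) * dvz_dz)
    sxz[1:-1, 1:-1] += dt * mu * ((vx[2:, 1:-1] - vx[1:-1, 1:-1]) / dz +
                                  (vz[1:-1, 2:] - vz[1:-1, 1:-1]) / dx)
    # simple explosive-like source injected into the normal stresses
    src = np.exp(-((it * dt - 0.05) / 0.01) ** 2)
    sxx[nz // 2, nx // 2] += src
    szz[nz // 2, nx // 2] += src

for it in range(1000):
    step(it)
```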

The role of geophysics in understanding salinisation in Southwestern Queensland (호주 Queensland 남서부 지역의 염분작용 조사)

  • Wilkinson Kate;Chamberlain Tessa;Grundy Mike
    • Geophysics and Geophysical Exploration / v.8 no.1 / pp.78-85 / 2005
  • This study, combining geophysical and environmental approaches, was undertaken to investigate the causes of secondary salinity in the Goondoola basin, in southwestern Queensland. Airborne radiometric, electromagnetic, and ground electromagnetic datasets were acquired, along with data on soils, subsurface materials, and groundwater. Relationships established between radiometric data, elevation data, and measured material properties allowed us to generate predictive maps of surface materials and recharge potential. The greatest recharge to the groundwater is predicted to occur on the weathered bedrock rises surrounding the basin. Electromagnetic data (airborne, ground, and downhole), used in conjunction with soil and drillhole measurements, were used to quantify the regolith salt store and to define the subsurface architecture. Conductivity measurements reflect soil salt distribution. However, deeper in the regolith, where the salt content is relatively constant, the AEM signal is influenced by changes in porosity or material type. This allowed the lateral distribution of bedrock weathering zones to be mapped. Salinisation in this area occurs because of local- and intermediate-scale processes, controlled strongly by regolith architecture. The present surface outbreak is the result of evaporative concentration above shallow saline groundwater, discharging at the break of slope. The integration of surficial and subsurface datasets allowed the identification of similar landscape settings that are most at risk of developing salinity with groundwater rise. This information is now being used by local land managers to refine management choices that prevent excess recharge and further salt mobilisation.

A Hardware Design Space Exploration toward Low-Area and High-Performance Architecture for the 128-bit Block Cipher Algorithm SEED (128-비트 블록 암호화 알고리즘 SEED의 저면적 고성능 하드웨어 구조를 위한 하드웨어 설계 공간 탐색)

  • Yi, Kang
    • Journal of KIISE:Computing Practices and Letters / v.13 no.4 / pp.231-239 / 2007
  • This paper presents the trade-off relationship between area and performance in the hardware design space exploration for the Korean national standard 128-bit block cipher algorithm SEED. We compare the following four hardware design types for the SEED algorithm: (1) Design 1, a 16-round fully pipelined approach; (2) Design 2, a one-round looping approach; (3) Design 3, a G-function-sharing and looping approach; and (4) Design 4, a one-round approach with an internal 3-stage pipeline. Designs 1, 2, and 3 are existing design approaches, while Design 4 is newly proposed in this paper. Our new design employs a pipeline between the three G-functions and the adders that constitute an F function, which requires less area than Design 2 and achieves higher performance than Designs 2 and 3 owing to its pipelining and module-sharing techniques. We design and implement all four approaches in real hardware, targeting an FPGA, for exact performance and area analysis. The experimental results show that Design 4 has the highest performance except for Design 1, which pursues very aggressive parallelism at the expense of area. Our proposed design (Design 4) shows the best throughput/area ratio among all the alternatives, by a factor of 2.8. Therefore, our new design for SEED is the most efficient of the compared designs.
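
To make the throughput/area comparison concrete, the sketch below computes the ratio for the four design styles from placeholder throughput and gate-count figures. All numbers are hypothetical stand-ins; only the paper's measured values establish the actual 2.8x advantage of Design 4.

```python
# Hypothetical figures purely to illustrate the throughput-per-area metric.
designs = {
    "Design 1 (16-round fully pipelined)": {"throughput_mbps": 3000, "kgates": 300},
    "Design 2 (one-round loop)":           {"throughput_mbps": 320,  "kgates": 30},
    "Design 3 (shared-G loop)":            {"throughput_mbps": 150,  "kgates": 22},
    "Design 4 (one round, 3-stage pipe)":  {"throughput_mbps": 450,  "kgates": 24},
}

for name, d in designs.items():
    ratio = d["throughput_mbps"] / d["kgates"]
    print(f"{name:38s} {ratio:6.1f} Mbit/s per kgate")

best = max(designs, key=lambda n: designs[n]["throughput_mbps"] / designs[n]["kgates"])
print("best throughput/area:", best)
```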

Design Space Exploration of Many-Core Processor for High-Speed Cluster Estimation (고속의 클러스터 추정을 위한 매니코어 프로세서의 디자인 공간 탐색)

  • Seo, Jun-Sang;Kim, Cheol-Hong;Kim, Jong-Myon
    • Journal of the Korea Society of Computer and Information / v.19 no.10 / pp.1-12 / 2014
  • This paper implements the computationally intensive subtractive clustering algorithm and improves its performance using a single instruction, multiple data (SIMD) based many-core processor. In addition, this paper implements five different processing element (PE) architectures (PEs = 16, 64, 256, 1,024, 4,096) to select an optimal PE architecture for the subtractive clustering algorithm by estimating execution time and energy efficiency. Experimental results using two different medical images and three different resolutions (128×128, 256×256, 512×512) show that PEs = 4,096 achieves the highest performance and energy efficiency in all cases.
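
The subtractive clustering algorithm itself is compact; the sketch below is a sequential NumPy reference of the usual potential-and-suppression formulation. The radii and stopping ratio are common defaults assumed here, not the paper's settings, and it is the per-point potential evaluation that a SIMD PE array would parallelise.

```python
import numpy as np

def subtractive_clustering(points, ra=0.5, rb=0.75, stop_ratio=0.15):
    """Sequential reference sketch of subtractive clustering."""
    alpha, beta = 4.0 / ra**2, 4.0 / rb**2
    d2 = np.sum((points[:, None, :] - points[None, :, :]) ** 2, axis=-1)
    potential = np.exp(-alpha * d2).sum(axis=1)      # density potential per point
    centers, first_peak = [], potential.max()
    while potential.max() > stop_ratio * first_peak:
        k = int(np.argmax(potential))                # next cluster centre
        centers.append(points[k])
        # suppress the potential of points near the chosen centre
        potential -= potential[k] * np.exp(-beta * d2[:, k])
    return np.array(centers)

pts = np.random.rand(500, 2)                          # e.g. normalised pixel features
print(subtractive_clustering(pts).shape)
```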

Subsea Responses to the BP Oil Spill in the Gulf of Mexico (멕시코만의 BP사 오일유출 해저 대책에 대한 분석)

  • Choi, Han-Suk;Lee, Seung-Keon;Do, Chang-Ho
    • Journal of Ocean Engineering and Technology / v.25 no.3 / pp.90-95 / 2011
  • On April 20, 2010, a well control event allowed hydrocarbons (oil and gas) to escape from the Macondo well onto the Deepwater Horizon (DWH), resulting in an explosion and fire on the rig. Seventeen people were injured, and 11 others lost their lives. The fire continued for 36 hours until the rig sank. Hydrocarbons continued to flow from the reservoir through the wellbore and blowout preventer (BOP) for 87 days, causing an unprecedented oil spill. BP and the US federal government tried various methods to stop the oil spill and to capture the spilled oil. The corresponding responses were very challenging because of the scale, intensity, and duration of the incident, which occurred under extreme conditions of pressure, temperature, and flow rate. On July 15, a capping stack, which is another BOP on top of the existing BOP, was successfully installed, and the oil spill was stopped. After several tests and subsea responses, the well was permanently sealed by a relief well and a bottom kill on September 19. This paper analyzes the subsea responses and engineering efforts to capture the oil, stop the leaking, and kill the subsea well. During the investigation and analysis of the subsea responses, information was collected and databases were established for future accident prevention and the development of subsea engineering.

Precedent based design foundations for parametric design: The case of navigation and wayfinding

  • Kondyli, Vasiliki;Bhatt, Mehul;Hartmann, Timo
    • Advances in Computational Design / v.3 no.4 / pp.339-366 / 2018
  • Parametric design systems serve as powerful assistive tools in the design process by providing a flexible approach for the generation of a vast number of design alternatives. However, contemporary parametric design systems focus primarily on low-level engineering and structural forms, without an explicit means to also take into account high-level, cognitively motivated, people-centred design goals. We present a precedent-based parametric design method that integrates people-centred design "precedents", rooted in empirical evidence, directly within state-of-the-art parametric design systems. As a use case, we illustrate the general method in the context of an empirical study focusing on the multi-modal analysis of wayfinding behaviour in two large-scale healthcare environments. With this use case, we demonstrate the manner in which: (1) a range of empirically established design precedents (e.g., pertaining to visibility and navigation) may be articulated as design constraints to be embedded directly within state-of-the-art parametric design tools (e.g., Grasshopper); and (2) embedded design precedents lead to the (parametric) generation of a number of morphologies that satisfy people-centred design criteria (in this case, pertaining to wayfinding). Our research presents an exemplar for the integration of cognitively motivated design goals with parametric design-space exploration methods. We posit that this opens up a range of technological challenges for the engineering and development of next-generation computer-aided architecture design systems.
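
A minimal sketch of the general idea, with hypothetical parameters and thresholds standing in for the paper's empirically derived precedents (which are encoded in a tool such as Grasshopper): a parametric generator enumerates alternatives and keeps only those satisfying the people-centred constraints.

```python
from itertools import product

# Hypothetical design parameters and precedent-derived thresholds.
parameters = {
    "corridor_width_m":   [1.8, 2.4, 3.0],
    "signage_interval_m": [10, 20, 30],
    "atrium_opening_deg": [60, 90, 120],   # proxy for visibility at decision points
}

def satisfies_precedents(design):
    # hypothetical wayfinding precedents: wide corridors, frequent signage,
    # and a minimum visual opening at decision points
    return (design["corridor_width_m"] >= 2.4
            and design["signage_interval_m"] <= 20
            and design["atrium_opening_deg"] >= 90)

alternatives = [dict(zip(parameters, values))
                for values in product(*parameters.values())]
feasible = [d for d in alternatives if satisfies_precedents(d)]
print(f"{len(feasible)} of {len(alternatives)} generated morphologies satisfy the precedents")
```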

Retargetable Instruction-Set Simulator for Energy Consumption Monitoring (에너지 소비 모니터링을 위한 재목적 인스트럭션-셋 시뮬레이터)

  • Ko, Kwang-Man
    • Journal of Korea Multimedia Society / v.14 no.3 / pp.462-470 / 2011
  • Retargetability is typically achieved by providing target machine information, written in an architecture description language (ADL), as input. The ADL is used to specify the processor and memory architectures and to generate a software toolkit including a compiler, a simulator, and other tools. Simulators are critical components of the exploration and software design toolkit for the system designer. They can be used to perform diverse tasks, such as verifying the functionality and/or timing behavior of the system and generating quantitative measurements (e.g., power and energy consumption) that aid the design process. In this paper, we generate an energy consumption estimation simulator from an ADL. To this end, we first describe the energy consumption estimation and monitoring information in an ADL based on EXPRESSION. Second, we generate the energy estimation and monitoring simulation library and then construct the simulator, RenergySim. Lastly, we present the energy estimation results for a MIPS R4000 ADL description. Through this work, we contribute to efficient architecture development and prompt SDK generation through programmable experiments in the field of mobile software development.
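
A minimal sketch of how an instruction-set simulator can accumulate energy estimates per instruction class is given below. The instruction classes, sample program, and nanojoule costs are illustrative assumptions; in a retargetable flow the cost table would instead be derived from the ADL description.

```python
# Per-instruction-class energy costs (hypothetical values, in nJ).
ENERGY_NJ = {"alu": 0.8, "mul": 2.4, "load": 3.1, "store": 3.0, "branch": 1.1}

# A tiny hypothetical instruction trace: (class, textual form).
program = [("load", "lw r1, 0(r2)"), ("alu", "add r3, r1, r4"),
           ("mul", "mul r5, r3, r3"), ("store", "sw r5, 4(r2)"),
           ("branch", "bne r5, r0, loop")]

def simulate(instructions):
    """Accumulate total and per-class energy while 'executing' the trace."""
    total_energy_nj, per_class = 0.0, {}
    for opclass, text in instructions:
        # functional execution of `text` would happen here in a real ISS
        cost = ENERGY_NJ[opclass]
        total_energy_nj += cost
        per_class[opclass] = per_class.get(opclass, 0.0) + cost
    return total_energy_nj, per_class

total, breakdown = simulate(program)
print(f"estimated energy: {total:.1f} nJ, breakdown: {breakdown}")
```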

A study on the characteristic of problem solving process in the architectural design process (건축디자인과정에서 문제해결의 특성에 관한 연구)

  • Kim, Yong-Il;Han, Jae-Su
    • Journal of The Korean Digital Architecture Interior Association / v.11 no.3 / pp.53-59 / 2011
  • In creative design, it is necessary to understand the characteristics of architectural design. In the world of design problems, a distinction can be made between those that are well-defined and those that are ill-defined. Well-defined problems are those for which the ends, or goals, are already prescribed and apparent; their solution requires only the provision of appropriate means. For ill-defined problems, on the other hand, both the ends and the means of solution are unknown at the outset of the problem-solving exercise, at least in their entirety. Most design problems are ill-defined: what is required is unknown at the beginning of the problem-solving exercise. To solve a design problem, designers take advantage of problem-space search methods, such as global search methods (depth-first methods, breadth-first methods) and local search methods (generate and test, heuristics, hill climbing, reasoning), as well as visual thinking, which is represented through sketching. Sketching is a real part of design reasoning and works through a special kind of visual imagery; in design problem solving it has been an important means of problem exploration and solution generation. By sketching, designers represent images held in the mind and also make graphic images that help generate mental images of the entity being designed. Problem-space search methods and visual thinking have been considered crucial in architectural design. The purpose of this paper is to explore the properties of design by means of pre-existing experimental data and literature research. The findings will help guide architectural design toward more creative results.
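
To make the search methods mentioned above concrete, the toy sketch below contrasts generate-and-test sampling with hill climbing over a hypothetical three-parameter design; the scoring function is an invented stand-in for the ill-defined evaluation a designer actually performs.

```python
import random

random.seed(0)

def score(design):
    # hypothetical fitness: prefer daylight and compactness, penalise cost
    daylight, compactness, cost = design
    return 2.0 * daylight + 1.5 * compactness - 1.0 * cost

def random_design():
    return tuple(random.random() for _ in range(3))

# generate and test: sample alternatives and keep the best one found
candidates = [random_design() for _ in range(200)]
best_gt = max(candidates, key=score)

# hill climbing: repeatedly accept small perturbations that improve the score
current = random_design()
for _ in range(200):
    neighbour = tuple(min(1.0, max(0.0, x + random.uniform(-0.05, 0.05)))
                      for x in current)
    if score(neighbour) > score(current):
        current = neighbour

print("generate-and-test best score:", round(score(best_gt), 3))
print("hill-climbing best score    :", round(score(current), 3))
```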