• Title/Summary/Keyword: IoT (internet of things)

Search Result 1,916, Processing Time 0.036 seconds

A Study on the Development of Artificial Intelligence Crop Environment Control Framework

  • Guangzhi Zhao
    • International Journal of Internet, Broadcasting and Communication
    • /
    • v.15 no.2
    • /
    • pp.144-156
    • /
    • 2023
  • Smart agriculture is a rapidly growing field that seeks to optimize crop yields and reduce risk through the use of advanced technology. A key challenge in this field is the need to create a comprehensive smart farm system that can effectively monitor and control the growth environment of crops, particularly when cultivating new varieties. This is where fuzzy theory comes in: it enables the collection and analysis of external environmental factors to generate a rule-based system that considers the specific needs of each crop variety. By doing so, the system can easily set the optimal growth environment, reducing trial and error and the user's risk burden. This contrasts with existing systems, where parameters must be changed for each variety and various factors considered separately. Additionally, the type of house used affects the environmental control factors for crops, making it necessary to adapt the system accordingly. While developing such a framework requires a significant investment of labour and time, the benefits are numerous and can lead to increased productivity and profitability in smart agriculture. We developed an AI platform for optimal control of facility houses by integrating data from mushroom crops and environmental factors and analysing the correlation between optimal control conditions and yield. Our experiments demonstrated a significant performance improvement over the existing system.
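As an illustration of the rule-based approach this abstract describes, the following is a minimal sketch of fuzzy rule evaluation for a single control output. It is not the authors' actual framework; the membership breakpoints and the `ventilation_level` rule base are hypothetical.

```python
def tri(x, a, b, c):
    """Triangular membership function: 0 outside [a, c], peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def ventilation_level(temp_c, humidity_pct):
    """Evaluate a tiny rule base:
    IF temp is high OR humidity is high THEN ventilation is high;
    IF temp is comfortable THEN ventilation is low.
    Defuzzified by a weighted average of crisp output levels."""
    temp_high = tri(temp_c, 22, 30, 38)
    hum_high = tri(humidity_pct, 60, 85, 110)
    temp_ok = tri(temp_c, 10, 20, 30)
    # (rule firing strength, crisp output level in 0..1)
    rules = [(max(temp_high, hum_high), 1.0), (temp_ok, 0.2)]
    num = sum(w * out for w, out in rules)
    den = sum(w for w, _ in rules)
    return num / den if den else 0.0
```

Swapping in a new crop variety then only means changing membership breakpoints and rule outputs, which is the trial-and-error reduction the abstract claims.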

Pub/Sub-based Sensor virtualization framework for Cloud environment

  • Ullah, Mohammad Hasmat;Park, Sung-Soon;Nob, Jaechun;Kim, Gyeong Hun
    • International journal of advanced smart convergence
    • /
    • v.4 no.2
    • /
    • pp.109-119
    • /
    • 2015
  • The interaction between wireless sensors, such as Internet of Things (IoT) devices, and the Cloud is a new paradigm of communication virtualization for overcoming resource and efficiency restrictions. Cloud computing provides a virtually unlimited platform, resources, and services, and covers almost every area of computing. Wireless Sensor Networks (WSN), meanwhile, have gained attention for their potential support of attractive solutions such as IoT, environment monitoring, healthcare, military applications, critical infrastructure monitoring, home and industrial automation, transportation, and business. In addition, virtual groups and social networks play a central role in information sharing. However, sensor networks lack resources, storage capacity, and computational power, as well as extensibility, fault tolerance, reliability, and openness, and their data are not yet available to community groups or the cloud environment for general-purpose research or utilization. If we reduce the gap between the real and virtual worlds by adding WSN-driven data to the cloud environment and virtual communities, the approach can attract remarkable attention and provide benefits in various sectors. We propose a Pub/Sub-based sensor virtualization framework for the Cloud environment. This integration provides resources, services, and storage with sensor-driven data to the community. We virtualize physical sensors as virtual sensors on cloud computing, and this middleware and the virtual sensors are provisioned automatically to end users whenever required. Our architecture provides services to end users without exposing implementation details. Furthermore, we propose an efficient content-based event matching algorithm to analyze subscriptions and publish the proper content in a cost-effective manner. Our evaluation shows that the algorithm performs better than previously proposed algorithms.
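The content-based matching idea can be sketched with a toy broker. This is not the paper's algorithm; the attribute-predicate subscription format below is an assumption chosen for illustration.

```python
class PubSubBroker:
    """Minimal content-based pub/sub: subscriptions are attribute
    predicates; publish() delivers an event to every matching subscriber."""

    def __init__(self):
        self.subs = []  # list of (predicates, callback)

    def subscribe(self, predicates, callback):
        # predicates: dict attr -> (op, value), op in {'==', '>', '<'}
        self.subs.append((predicates, callback))

    @staticmethod
    def _match(event, predicates):
        ops = {'==': lambda a, b: a == b,
               '>':  lambda a, b: a > b,
               '<':  lambda a, b: a < b}
        return all(attr in event and ops[op](event[attr], val)
                   for attr, (op, val) in predicates.items())

    def publish(self, event):
        """Deliver the event; return how many subscribers matched."""
        delivered = 0
        for predicates, callback in self.subs:
            if self._match(event, predicates):
                callback(event)
                delivered += 1
        return delivered
```

A real content-based matcher would index predicates to avoid scanning every subscription per event, which is the cost the paper's algorithm targets.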

On Additive Signal Dependent Gaussian Noise Channel Capacity for NOMA in 5G Mobile Communication

  • Chung, Kyuhyuk
    • International Journal of Internet, Broadcasting and Communication
    • /
    • v.12 no.2
    • /
    • pp.37-44
    • /
    • 2020
  • The fifth generation (5G) of mobile communication has been commercialized, and 5G applications such as artificial intelligence (AI) and the Internet of Things (IoT) are deployed all over the world. 5G new radio (NR) wireless networks are characterized by 100 times more traffic, 1000 times higher system capacity, and 1 ms latency. One of the promising 5G technologies is non-orthogonal multiple access (NOMA). To improve NOMA performance, the additive signal-dependent Gaussian noise (ASDGN) channel model is sometimes required. However, calculating the capacity of such channels is so difficult that only lower and upper bounds on the capacity of ASDGN channels have been presented, owing to the specific constraints on the dependency. Herein, we provide the capacity of ASDGN channels by removing all constraints except the dependency. We thus obtain the exact ASDGN channel capacity, rather than lower and upper bounds, so that the impact of ASDGN can be clearly characterized in comparison with additive white Gaussian noise (AWGN). It is shown that the ASDGN channel capacity is greater than the AWGN channel capacity at high signal-to-noise ratio (SNR). We also apply the analytical results to the NOMA scheme to verify the superiority of ASDGN channels.
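The AWGN baseline against which the abstract compares is given by Shannon's formula. The snippet below computes only that standard baseline, assuming a real-valued channel; the paper's ASDGN capacity expression is not reproduced here.

```python
import math

def awgn_capacity(snr_db):
    """Shannon capacity (bits per channel use) of a real-valued AWGN
    channel: C = 0.5 * log2(1 + SNR), with SNR given in dB."""
    snr = 10 ** (snr_db / 10)
    return 0.5 * math.log2(1 + snr)
```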

IoT Enabled Intelligent System for Radiation Monitoring and Warning Approach using Machine Learning

  • Muhammad Saifullah;Imran Sarwar Bajwa;Muhammad Ibrahim;Mutyyba Asgher
    • International Journal of Computer Science & Network Security
    • /
    • v.23 no.5
    • /
    • pp.135-147
    • /
    • 2023
  • The Internet of Things has revolutionized every field of life through the use of artificial intelligence and machine learning. It is successfully being used for radiation monitoring and for the prediction of ultraviolet and electromagnetic rays. However, no particular system is available that can both monitor and detect such waves. Therefore, in the present study, an IoT-enabled intelligent system based on machine learning was developed for predicting radiation and its effects on human beings. Moreover, a sensor-based system was installed to detect harmful radiation in the environment; this system can alert humans within the danger zone with a buzzer so that they can move to a safer place. Alongside this automatic sensor system, a dataset of the recorded sensor values was also created. Furthermore, to study the effects of these rays, Support Vector Machine, Gaussian Naïve Bayes, Decision Trees, Extra Trees, Bagging Classifier, Random Forests, Logistic Regression, and Adaptive Boosting Classifier were used. In summary, the results show high accuracy and demonstrate that the proposed system is reliable and accurate for the detection and monitoring of waves. For the prediction of the outcome, the Adaptive Boosting Classifier showed the best accuracy, 81.77%, compared with the other classifiers.
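The evaluation loop the abstract describes (scoring several classifiers on labelled data and ranking them by accuracy) can be sketched generically. The threshold classifiers and sensor readings below are hypothetical stand-ins, not the paper's models or dataset.

```python
def accuracy(clf, samples):
    """Fraction of (features, label) pairs the classifier gets right."""
    correct = sum(1 for x, y in samples if clf(x) == y)
    return correct / len(samples)

# Hypothetical labelled sensor readings: (uv_index, label),
# label 1 = dangerous radiation level.
data = [(2, 0), (3, 0), (6, 1), (8, 1), (11, 1), (1, 0)]

# Two toy single-feature classifiers standing in for the real models.
classifiers = {
    "threshold>5": lambda x: int(x > 5),
    "threshold>9": lambda x: int(x > 9),
}

scores = {name: accuracy(clf, data) for name, clf in classifiers.items()}
best = max(scores, key=scores.get)
```

The paper's actual comparison would substitute trained scikit-learn models (SVM, AdaBoost, etc.) and a held-out test split for the toy classifiers here.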

Design of Multi-Level Abnormal Detection System Suitable for Time-Series Data (시계열 데이터에 적합한 다단계 비정상 탐지 시스템 설계)

  • Chae, Moon-Chang;Lim, Hyeok;Kang, Namhi
    • The Journal of the Institute of Internet, Broadcasting and Communication
    • /
    • v.16 no.6
    • /
    • pp.1-7
    • /
    • 2016
  • As new information and communication technologies evolve, security threats are becoming increasingly intelligent and advanced. In this paper, we use statistical techniques to analyze time-series data continuously entered over a series of periods from network devices or lightweight IoT (Internet of Things) devices, and we propose a system that detects abnormal device behavior based on the analysis results. The proposed system performs first-level anomaly detection using the previously entered data set, and then performs second-level anomaly detection according to a trust bound configured from stored time-series data based on a time attribute or group attribute. Multi-level analysis improves reliability and also reduces false positives by drawing on a variety of decision data sets.
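A two-stage check of this kind can be sketched as follows. This is a simplified reading of the abstract, not the authors' design: the hard limits and the mean ± k·σ trust bound are illustrative assumptions.

```python
from statistics import mean, stdev

def trust_bound(history, k=3.0):
    """Second-level trust bound derived from stored time-series data:
    mean ± k standard deviations."""
    m, s = mean(history), stdev(history)
    return m - k * s, m + k * s

def detect(value, history, hard_limits=(0.0, 100.0), k=3.0):
    """Two-stage check: level 1 rejects values outside fixed hard limits;
    level 2 flags values outside the statistical trust bound."""
    lo, hi = hard_limits
    if not lo <= value <= hi:
        return "level1-abnormal"
    low, high = trust_bound(history, k)
    return "level2-abnormal" if not low <= value <= high else "normal"
```

The point of the second level is exactly what the abstract claims: a value can pass the coarse first-level check yet still be anomalous relative to the device's own history.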

Evolutionary game theory-based power control for uplink NOMA

  • Riaz, Sidra;Kim, Jihwan;Park, Unsang
    • KSII Transactions on Internet and Information Systems (TIIS)
    • /
    • v.12 no.6
    • /
    • pp.2697-2710
    • /
    • 2018
  • Owing to the development of the Internet of Things (IoT), fifth-generation (5G) wireless communication is expected to see a substantial increase in mobile traffic demand. Energy efficiency and spectral efficiency are key challenges in a 5G network. Non-orthogonal multiple access (NOMA) is a promising technique for increasing system efficiency through adaptive power control (PC) in a 5G network. This paper proposes an efficient PC scheme based on an evolutionary game theory (EGT) model for the uplink power-domain NOMA system. The proposed PC scheme allows users to adaptively adjust their transmit power levels to improve their payoffs, or throughput, which in turn increases system efficiency. To separate the user signals, a successive interference cancellation (SIC) receiver is installed at the base station (BS) site. The simulation results demonstrate that the proposed EGT-based PC scheme outperforms traditional game theory-based PC schemes and orthogonal multiple access (OMA) in terms of energy efficiency and spectral efficiency.
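EGT-based schemes are typically driven by replicator dynamics, in which strategies (here, transmit power levels) whose payoff beats the population average grow in share. A discrete-time sketch follows; the payoff values are illustrative, not the paper's NOMA payoff model.

```python
def replicator_step(shares, payoffs, dt=0.1):
    """One discrete replicator-dynamics update:
    x_i <- x_i + dt * x_i * (f_i - average payoff), then renormalize."""
    avg = sum(x * f for x, f in zip(shares, payoffs))
    new = [x + dt * x * (f - avg) for x, f in zip(shares, payoffs)]
    total = sum(new)
    return [x / total for x in new]
```

Iterating this update shifts the population toward the power level with the higher payoff, which is the adaptation mechanism the abstract describes.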

Improved Meet-in-the-Middle Attacks on Crypton and mCrypton

  • Cui, Jingyi;Guo, Jiansheng;Huang, Yanyan;Liu, Yipeng
    • KSII Transactions on Internet and Information Systems (TIIS)
    • /
    • v.11 no.5
    • /
    • pp.2660-2679
    • /
    • 2017
  • Crypton is an SP-network block cipher that attracts much attention because of its excellent performance in hardware. Based on Crypton, mCrypton was designed as a lightweight block cipher suitable for the Internet of Things (IoT) and Radio Frequency Identification (RFID). The security of Crypton and mCrypton under the meet-in-the-middle attack is analyzed in this paper. By analyzing the differential properties of the cell permutation, several differential characteristics are introduced to construct generalized δ-sets. Using a generalized δ-set and the differential enumeration technique, a 6-round meet-in-the-middle distinguisher is proposed, yielding the first meet-in-the-middle attack on 9-round Crypton-192, and some improvements on the cryptanalysis of 10-round Crypton-256 are given. Combined with the properties of the nibble permutation and substitution, an improved meet-in-the-middle attack on 8-round mCrypton is proposed, along with the first complete attack on 9-round mCrypton-96.
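The generic meet-in-the-middle principle (compute one half of the cipher forward for all keys, then match the other half backward) can be demonstrated on a toy double encryption. The 8-bit "cipher" below is an invented stand-in with no relation to Crypton's actual round structure.

```python
def toy_encrypt(key, block):
    """Toy 8-bit stage: XOR with key, then rotate left by 3."""
    x = (block ^ key) & 0xFF
    return ((x << 3) | (x >> 5)) & 0xFF

def toy_decrypt(key, block):
    """Inverse of toy_encrypt: rotate right by 3, then XOR with key."""
    x = ((block >> 3) | (block << 5)) & 0xFF
    return (x ^ key) & 0xFF

def mitm_double(plain, cipher):
    """Meet-in-the-middle on double encryption: tabulate all forward
    middle values under k1, then match backward middle values under k2.
    Cost is ~2 * 2^8 stage evaluations instead of 2^16."""
    forward = {}
    for k1 in range(256):
        forward.setdefault(toy_encrypt(k1, plain), []).append(k1)
    hits = []
    for k2 in range(256):
        mid = toy_decrypt(k2, cipher)
        for k1 in forward.get(mid, []):
            hits.append((k1, k2))
    return hits
```

The paper's distinguisher-based attacks refine this idea with δ-sets and differential enumeration to filter the candidate key pairs far more aggressively.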

Big Data Management in Structured Storage Based on Fintech Models for IoMT using Machine Learning Techniques (기계학습법을 이용한 IoMT 핀테크 모델을 기반으로 한 구조화 스토리지에서의 빅데이터 관리 연구)

  • Kim, Kyung-Sil
    • Advanced Industrial SCIence
    • /
    • v.1 no.1
    • /
    • pp.7-15
    • /
    • 2022
  • In the medical domain, the IoT has developed toward the processing of large amounts of medical data, an advancement known as the Internet of Medical Things (IoMT). The wide range of collected medical data is stored in the cloud in a structured manner for processing. However, the huge volume of healthcare data is difficult to handle, so it is necessary to develop an appropriate scheme for structured healthcare data. In this paper, a machine learning model for processing the structured healthcare data collected from the IoMT is suggested. To process this wide range of healthcare data, this paper proposes an MTGPLSTM model, which integrates a linear regression model for processing healthcare information. On top of the developed model, an outlier model based on the FinTech model is implemented for the evaluation and prediction of a COVID-19 healthcare dataset collected from the IoMT. The proposed MTGPLSTM model comprises a regression model to predict and evaluate planning schemes for preventing the spread of infection. The model's performance is evaluated against different classifiers, namely LR, SVR, RFR, and LSTM, with data sizes of 1 GB, 2 GB, and 3 GB. The comparative analysis shows that the proposed MTGPLSTM model achieves approximately 4% lower MAPE and RMSE values on the worldwide data; for China, a minimal MAPE value of 0.97 is achieved, approximately 6% lower than that of the best existing classifier.
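The two error metrics reported in this abstract, MAPE and RMSE, are standard and can be computed directly. A minimal implementation follows; the sample values in the usage are illustrative, not the paper's data.

```python
import math

def mape(actual, predicted):
    """Mean absolute percentage error, in percent.
    Assumes all actual values are nonzero."""
    n = len(actual)
    return 100.0 * sum(abs((a - p) / a) for a, p in zip(actual, predicted)) / n

def rmse(actual, predicted):
    """Root mean squared error."""
    n = len(actual)
    return math.sqrt(sum((a - p) ** 2 for a, p in zip(actual, predicted)) / n)
```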

Acoustic Event Detection and Matlab/Simulink Interoperation for Individualized Things-Human Interaction (사물-사람 간 개인화된 상호작용을 위한 음향신호 이벤트 감지 및 Matlab/Simulink 연동환경)

  • Lee, Sanghyun;Kim, Tag Gon;Cho, Jeonghun;Park, Daejin
    • IEMEK Journal of Embedded Systems and Applications
    • /
    • v.10 no.4
    • /
    • pp.189-198
    • /
    • 2015
  • Most IoT-related approaches have tried to establish relationships by networking things together. The proposed research presents how the pervasive interactions of an eco-system formed by humans deliberately touching objects can be recognized. By collecting and sharing the detected patterns among all kinds of things, we can construct an environment that enables individualized interactions with different objects. To this end, we utilize technical procedures such as event-driven signal processing, pattern matching for signal recognition, and hardware-in-the-loop simulation. We also implement a prototype of the sensor processor based on an Arduino MCU, which can be integrated with the system using an Arduino-Matlab/Simulink hybrid interoperation environment. In the experiment, we use a piezo transducer to detect vibration, or to vibrate the surface using an acoustic wave, which has a specific frequency spectrum and an individualized signal shape along the time axis. The signal distortion in the time and frequency domains is recorded into a memory tracer within the sensor processor, and the meaningful pattern is extracted by comparing the stored signal with a lookup table (LUT). In this paper, we contribute initial prototypes of the acoustic touch processor using an off-the-shelf MCU, together with an integrated framework based on the Matlab/Simulink model, to provide individualized, deliberate touch sensing for the user.
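The LUT comparison step can be sketched as a similarity search over stored patterns. This is a simplified reading of the abstract, not the authors' pattern-matching procedure; the traces and labels below are hypothetical.

```python
def correlate(a, b):
    """Normalized dot-product similarity between two equal-length traces."""
    na = sum(x * x for x in a) ** 0.5
    nb = sum(x * x for x in b) ** 0.5
    if na == 0 or nb == 0:
        return 0.0
    return sum(x * y for x, y in zip(a, b)) / (na * nb)

def classify_touch(trace, lut):
    """Compare a captured piezo trace with every stored pattern in the
    lookup table and return the best-matching label."""
    return max(lut, key=lambda label: correlate(trace, lut[label]))
```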

Combining data representation by Sparse Autoencoder and the well-known load balancing algorithm, ProGReGA-KF (Sparse Autoencoder의 데이터 특징 추출과 ProGReGA-KF를 결합한 새로운 부하 분산 알고리즘)

  • Kim, Chayoung;Park, Jung-min;Kim, Hye-young
    • Journal of Korea Game Society
    • /
    • v.17 no.5
    • /
    • pp.103-112
    • /
    • 2017
  • In recent years, the expansion and advance of the Internet of Things (IoT) in distributed MMOG (massively multiplayer online game) architectures have resulted in massive data growth in terms of server workloads. We propose combining a Sparse Autoencoder with ProGReGA, a well-known load-balancing platform for MMOGs. In the Sparse Autoencoder stage, a feature-enhancing data representation is extracted and redundant data are excluded. In the load-balancing stage, the graceful degradation of ProGReGA can exploit the most relevant and least redundant features of the data representation. We find that the proposed algorithm becomes more stable.
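The sparsity constraint that gives the Sparse Autoencoder its name is conventionally a KL-divergence penalty on the mean hidden activations. A sketch of that penalty alone follows; the target sparsity `rho` is a conventional default, not a value taken from this paper.

```python
import math

def kl_sparsity_penalty(mean_activations, rho=0.05):
    """KL-divergence sparsity penalty used when training sparse
    autoencoders: penalizes hidden units whose mean activation rho_hat
    drifts from the target sparsity rho. Zero when rho_hat == rho."""
    penalty = 0.0
    for rho_hat in mean_activations:
        rho_hat = min(max(rho_hat, 1e-8), 1 - 1e-8)  # avoid log(0)
        penalty += (rho * math.log(rho / rho_hat)
                    + (1 - rho) * math.log((1 - rho) / (1 - rho_hat)))
    return penalty
```

Adding this term to the reconstruction loss is what forces the learned representation to keep only the most relevant features, the property the load-balancing stage then exploits.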