• Title/Summary/Keyword: Computer-networks


Medical Information Dynamic Access System in Smart Mobile Environments (스마트 모바일 환경에서 의료정보 동적접근 시스템)

  • Jeong, Chang Won;Kim, Woo Hong;Yoon, Kwon Ha;Joo, Su Chong
    • Journal of Internet Computing and Services
    • /
    • v.16 no.1
    • /
    • pp.47-55
    • /
    • 2015
  • Recently, hospital information system environments have tended to incorporate various smart technologies. Accordingly, smart devices such as smartphones and tablet PCs are increasingly used in medical information systems, and these environments consist of applications running on heterogeneous sensors, devices, systems, and networks. In such hospital information system environments, applying security services through traditional access control methods causes problems. Most existing security systems use an access control list structure, permitting only the accesses defined in an access control matrix of client names and service object method names. The major problem with this static approach is that it cannot adapt quickly to changed situations. Hence, new security mechanisms are needed that are more flexible and can easily be adapted to environments with very different security requirements. In addition, research is needed to address changes in the medical treatment of patients. In this paper, we suggest a dynamic approach to medical information systems in smart mobile environments. We focus on how medical information systems are accessed according to dynamic access control methods based on the hospital's existing information system environments. The physical environment consists of mobile X-ray imaging devices, dedicated and general-purpose smart devices, a PACS, an EMR server, and an authorization server. The synchronization and monitoring services for the mobile X-ray imaging equipment were developed on the .NET Framework under the Windows 7 OS, and the dedicated smart device application implementing the dynamic access services was built with JSP and the Java SDK on the Android OS. Medical image information exchanged between the PACS, the mobile X-ray imaging devices, and the dedicated smart devices follows the DICOM medical image standard, and the EMR information is based on HL7. To provide the dynamic access control service, we classify patient context according to bio-information such as oxygen saturation, heart rate, blood pressure, and body temperature, and we present event trace diagrams for two cases: a general situation and an emergency situation. We then designed dynamic access to the medical care information through an authentication method whose authentication information contains ID/password, roles, position and working hours, and emergency certification codes for emergency patients. In a general situation, the dynamic access control method grants access to medical information according to the values of the authentication information; in an emergency, access is granted by an emergency code without the authentication information. We also constructed an integrated medical information database schema consisting of medical information, patient, medical staff, and medical image information according to the medical information standards. Finally, we show the usefulness of the dynamic access application service on smart devices through execution results of the proposed system under patient contexts such as general and emergency situations. In particular, the proposed system provides effective medical information services with smart devices in emergency situations through its dynamic access control methods. We therefore expect the proposed system to be useful for u-hospital information systems and services.
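To illustrate the kind of rule the abstract describes, the following is a minimal Python sketch of a context-based access decision. The vital-sign thresholds, role names, and the EMERGENCY_CODES store are hypothetical placeholders, not values taken from the paper.

```python
from datetime import datetime

def classify_context(spo2, heart_rate, systolic_bp, temperature):
    """Return 'emergency' if any vital sign falls outside an assumed safe range."""
    if spo2 < 90 or heart_rate > 130 or systolic_bp > 180 or temperature > 39.0:
        return "emergency"
    return "general"

# Hypothetical emergency certification codes issued by an authorization server.
EMERGENCY_CODES = {"ER-2024-001"}

def allow_access(context, credentials=None, emergency_code=None):
    """Grant access by credentials/role/working hours in general situations,
    or by an emergency certification code alone in emergencies."""
    if context == "emergency":
        return emergency_code in EMERGENCY_CODES
    if credentials is None:
        return False
    on_duty = credentials["shift_start"] <= datetime.now().hour < credentials["shift_end"]
    return (credentials["authenticated"]
            and credentials["role"] in {"doctor", "nurse"}
            and on_duty)

# Example: an emergency patient context bypasses normal authentication.
ctx = classify_context(spo2=85, heart_rate=145, systolic_bp=120, temperature=38.2)
print(ctx, allow_access(ctx, emergency_code="ER-2024-001"))
```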

Bankruptcy Forecasting Model using AdaBoost: A Focus on Construction Companies (적응형 부스팅을 이용한 파산 예측 모형: 건설업을 중심으로)

  • Heo, Junyoung;Yang, Jin Yong
    • Journal of Intelligence and Information Systems
    • /
    • v.20 no.1
    • /
    • pp.35-48
    • /
    • 2014
  • According to the 2013 construction market outlook report, liquidations of construction companies are expected to continue due to the ongoing residential construction recession. Bankruptcies of construction companies have a greater social impact than those in other industries, yet because of their different capital structures and debt-to-equity ratios, they are also harder to forecast. The construction industry operates on greater leverage, with high debt-to-equity ratios and project cash flows concentrated in the second half of a project. The economic cycle strongly influences construction companies, so downturns tend to rapidly increase their bankruptcy rates, and high leverage coupled with rising bankruptcy rates can place a heavy burden on the banks that lend to them. Nevertheless, bankruptcy prediction models have concentrated mainly on financial institutions, and construction-specific studies are rare. Bankruptcy prediction models based on corporate financial data have been studied for some time in various ways, but these models are intended for companies in general and may not be appropriate for forecasting the bankruptcies of construction companies, which typically carry high liquidity risks. The construction industry is capital-intensive, operates on long timelines with large-scale investment projects, and has comparatively longer payback periods than other industries. Given this unique capital structure, the criteria used to judge the financial risk of companies in general are difficult to apply to construction firms. The Altman Z-score, first published in 1968, is a commonly used bankruptcy forecasting model. It forecasts the likelihood of a company going bankrupt with a simple formula, classifying the result into three categories and evaluating the corporate status as dangerous, moderate, or safe. A company in the "dangerous" category has a high likelihood of bankruptcy within two years, while those in the "safe" category have a low likelihood of bankruptcy; for companies in the "moderate" category the risk is difficult to forecast. Many of the construction firms in this study fell into the "moderate" category, which made it difficult to forecast their risk. Along with the development of machine learning, recent studies of corporate bankruptcy forecasting have applied this technology. Pattern recognition, a representative application area of machine learning, is applied to forecasting corporate bankruptcy: patterns are analyzed from a company's financial information and then judged as to whether they belong to the bankruptcy-risk group or the safe group.
The machine learning models most commonly used in previous bankruptcy forecasting studies are Artificial Neural Networks, Adaptive Boosting (AdaBoost), and the Support Vector Machine (SVM), along with many hybrid studies combining these models. Existing studies using the traditional Z-score technique or machine learning for bankruptcy prediction focus on companies in non-specific industries, so the industry-specific characteristics of companies are not considered. In this paper, we confirm that adaptive boosting (AdaBoost) is the most appropriate forecasting model for construction companies, analyzed by company size. We classified construction companies into three groups - large, medium, and small - based on their capital, and analyzed the predictive ability of AdaBoost for each group. The experimental results showed that AdaBoost has greater predictive ability than the other models, especially for the group of large companies with capital of more than 50 billion won.
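As a concrete illustration of the model comparison described above, here is a minimal Python sketch that fits an AdaBoost classifier to financial-ratio features and compares it with an SVM baseline. The feature construction and the synthetic data are illustrative assumptions, not the paper's dataset or experimental setup.

```python
import numpy as np
from sklearn.ensemble import AdaBoostClassifier
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

# Synthetic stand-in for financial ratios (e.g. debt-to-equity, liquidity, profitability).
X = rng.normal(size=(500, 5))
y = (1.5 * X[:, 0] - X[:, 1] + rng.normal(scale=0.5, size=500) > 0).astype(int)  # 1 = bankrupt

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

ada = AdaBoostClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)
svm = SVC(kernel="rbf").fit(X_train, y_train)

print("AdaBoost accuracy:", accuracy_score(y_test, ada.predict(X_test)))
print("SVM accuracy:     ", accuracy_score(y_test, svm.predict(X_test)))
```

In the paper's setting the same comparison would be repeated separately for the large, medium, and small capital groups rather than on a pooled sample.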

A Study on the Development Trend of Artificial Intelligence Using Text Mining Technique: Focused on Open Source Software Projects on Github (텍스트 마이닝 기법을 활용한 인공지능 기술개발 동향 분석 연구: 깃허브 상의 오픈 소스 소프트웨어 프로젝트를 대상으로)

  • Chong, JiSeon;Kim, Dongsung;Lee, Hong Joo;Kim, Jong Woo
    • Journal of Intelligence and Information Systems
    • /
    • v.25 no.1
    • /
    • pp.1-19
    • /
    • 2019
  • Artificial intelligence (AI) is one of the main driving forces of the Fourth Industrial Revolution. AI technologies have already shown abilities equal to or better than humans in many fields, including image and speech recognition. Because AI can be utilized in a wide range of fields including medicine, finance, manufacturing, services, and education, many efforts have been made to identify current technology trends and analyze its development directions. Major platforms for developing complex AI algorithms for learning, reasoning, and recognition have been released to the public as open source projects, and technologies and services that utilize them have increased rapidly; this has been confirmed as one of the major reasons for the fast development of AI technologies. The spread of the technology also owes much to open source software developed by major global companies that supports natural language recognition, speech recognition, and image recognition. Therefore, this study aimed to identify the practical trend of AI technology development by analyzing OSS projects associated with AI that have been developed through the online collaboration of many parties. We searched for and collected a list of major AI-related projects generated from 2000 to July 2018 on Github, and examined the development trends of major technologies in detail by applying text mining techniques to the topic information that characterizes the collected projects and their technical fields. The results showed that the number of software development projects per year was fewer than 100 until 2013, but increased to 229 projects in 2014 and 597 projects in 2015. The number of open source projects related to AI increased especially rapidly in 2016 (2,559 OSS projects). The number of projects initiated in 2017 was 14,213, almost four times the total number of projects generated from 2009 to 2016 (3,555 projects), and 8,737 projects were initiated from January to July 2018. The development trend of AI-related technologies was evaluated by dividing the study period into three phases, using the appearance frequency of topics to indicate the technology trends of AI-related OSS projects. Natural language processing remained at the top in all years, implying that such OSS had been developed continuously. Until 2015, the programming languages Python, C++, and Java were among the ten most frequent topics; after 2016, programming languages other than Python disappeared from the top ten and were replaced by platforms supporting the development of AI algorithms, such as TensorFlow and Keras. Reinforcement learning algorithms and convolutional neural networks, which have been used in various fields, were also frequent topics. The topic network analysis showed that the most important topics by degree centrality were similar to those by appearance frequency; the main difference was that visualization and medical imaging topics appeared at the top of the list, although they were not at the top from 2009 to 2012, indicating that OSS was being developed in the medical field in order to utilize AI technology.
Moreover, although computer vision was in the top ten of the appearance frequency list from 2013 to 2015, it was not in the top ten by degree centrality. The topics at the top of the degree centrality list were otherwise similar to those at the top of the appearance frequency list, with only slight changes in the ranks of convolutional neural networks and reinforcement learning. Examining the trend of technology development through appearance frequency and degree centrality, machine learning showed the highest frequency and the highest degree centrality in all years. It is also noteworthy that, although the deep learning topic showed low frequency and low degree centrality between 2009 and 2012, its rank rose abruptly between 2013 and 2015, and in recent years both machine learning and deep learning have shown high appearance frequency and degree centrality. TensorFlow first appeared in the 2013-2015 phase, and its appearance frequency and degree centrality soared between 2016 and 2018, placing it at the top of the lists after deep learning and Python. Computer vision and reinforcement learning did not show abrupt increases or decreases and had relatively low appearance frequency and degree centrality compared with the topics mentioned above. Based on these analysis results, it is possible to identify the fields in which AI technologies are actively being developed, and the results of this study can be used as a baseline dataset for more empirical analysis of future technology trends.
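The frequency and degree-centrality analysis described above can be reproduced in outline with networkx (a choice of library for this sketch, not necessarily the tooling used in the study). The sample topic lists are invented placeholders standing in for the collected Github project metadata.

```python
from collections import Counter
from itertools import combinations
import networkx as nx

# Placeholder project topic lists standing in for Github project metadata.
projects = [
    ["machine-learning", "deep-learning", "tensorflow", "python"],
    ["machine-learning", "natural-language-processing", "python"],
    ["deep-learning", "computer-vision", "keras"],
    ["reinforcement-learning", "deep-learning", "tensorflow"],
]

# Appearance frequency of each topic across projects.
frequency = Counter(topic for topics in projects for topic in topics)

# Co-occurrence network: topics are nodes, topics sharing a project are linked.
G = nx.Graph()
for topics in projects:
    G.add_edges_from(combinations(sorted(set(topics)), 2))

centrality = nx.degree_centrality(G)

print("Top topics by frequency:", frequency.most_common(3))
print("Top topics by degree centrality:",
      sorted(centrality.items(), key=lambda kv: kv[1], reverse=True)[:3])
```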

SANET-CC : Zone IP Allocation Protocol for Offshore Networks (SANET-CC : 해상 네트워크를 위한 구역 IP 할당 프로토콜)

  • Bae, Kyoung Yul;Cho, Moon Ki
    • Journal of Intelligence and Information Systems
    • /
    • v.26 no.4
    • /
    • pp.87-109
    • /
    • 2020
  • Currently, thanks to major strides in wired and wireless communication technology, a variety of IT services are available on land. This trend is leading to increasing demand for IT services on vessels at sea as well, and the demand for services such as two-way digital data transmission, the Web, and apps is expected to grow to the level available on land. However, while a high-speed communication network is easily accessible on land because it is based on fixed infrastructure such as APs and base stations, this is not the case at sea, where a radio-based voice communication service is usually used. To solve this problem, an additional frequency for digital data exchange was allocated, and the ship ad-hoc network (SANET) was proposed to make use of it. Instead of satellite communication, which is costly to install and use, SANET was developed to provide various IP-based IT services to ships at sea. Connectivity between land base stations and ships is important in SANET: to have this connection, a ship must join the network and be assigned an IP address. This paper proposes SANET-CC, a protocol that allows ships to be assigned their own IP addresses. SANET-CC propagates several non-overlapping IP addresses through the entire network, from land base stations to ships, in the form of a tree. Ships obtain their own IP addresses through the exchange of simple request and response messages with land base stations or with M-ships that can allocate IP addresses. SANET-CC can therefore eliminate the IP collision prevention (Duplicate Address Detection) process and the network separation or integration processes caused by ship movement. Various simulations were performed to verify the applicability of this protocol to SANET, with the following results. First, using SANET-CC, about 91% of the ships in the network were able to receive IP addresses under any circumstances, 6% higher than in existing studies, and the results suggest that adjusting the variables to each port's environment may yield further improvement. Second, all vessels received IP addresses in an average of 10 seconds regardless of conditions, a 50% decrease from the average of 20 seconds in the previous study; considering that existing studies covered 50 to 200 vessels while this study covered 100 to 400 vessels, the efficiency gain may be even higher. Third, existing studies have not been able to derive optimal values for their variables because the results showed no consistent pattern, which means optimal variable values could not be set for each port under diverse environments. In this paper, however, the result values exhibit a consistent pattern with respect to the variables, which is significant in that the protocol can be applied to each port by adjusting the variable values. It was also confirmed that, regardless of the number of ships, the IP allocation ratio was most efficient (about 96 percent) when the waiting time after an IP request was 75 ms, and that the tree structure could maintain a stable network configuration when the number of IPs was over 30,000.
Fourth, this study can be used to design a network supporting intelligent maritime control systems and services offshore instead of satellite communication, and if LTE-M is deployed, it can be used for various intelligent services as well.
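As a rough illustration of the tree-shaped, collision-free allocation idea (not the exact SANET-CC message format or zone scheme), the sketch below has each allocator hand out addresses from a disjoint range and split the unused remainder when delegating to a child allocator. The range sizes and class names are hypothetical.

```python
class ZoneAllocator:
    """Toy tree allocator: each node owns a disjoint address range,
    so duplicate address detection is unnecessary by construction."""

    def __init__(self, start, end):
        self.start, self.end = start, end  # half-open range [start, end)
        self.next_free = start

    def assign_address(self):
        """Answer a ship's request with the next unused address in this zone."""
        if self.next_free >= self.end:
            raise RuntimeError("zone exhausted")
        addr = self.next_free
        self.next_free += 1
        return addr

    def delegate(self):
        """Split the unused tail of the range and hand it to a child allocator
        (e.g. an M-ship), keeping the two ranges non-overlapping."""
        mid = (self.next_free + self.end) // 2
        child = ZoneAllocator(mid, self.end)
        self.end = mid
        return child

# A land base station allocates directly and delegates a sub-zone to an M-ship.
base_station = ZoneAllocator(0x0A000001, 0x0A000100)   # 10.0.0.1 .. 10.0.0.255 as integers
m_ship = base_station.delegate()
print(hex(base_station.assign_address()), hex(m_ship.assign_address()))
```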

Analysis of Trading Performance on Intelligent Trading System for Directional Trading (방향성매매를 위한 지능형 매매시스템의 투자성과분석)

  • Choi, Heung-Sik;Kim, Sun-Woong;Park, Sung-Cheol
    • Journal of Intelligence and Information Systems
    • /
    • v.17 no.3
    • /
    • pp.187-201
    • /
    • 2011
  • The KOSPI200 index is the Korean stock price index consisting of 200 actively traded stocks in the Korean stock market; its base value of 100 was set on January 3, 1990. The Korea Exchange (KRX) developed derivatives markets on the KOSPI200 index, and the KOSPI200 index futures market, introduced in 1996, has become one of the most actively traded index futures markets in the world. Traders can profit by entering a long position on a KOSPI200 index futures contract if the index will rise, and by entering a short position if it will decline. KOSPI200 index futures trading is basically a short-term zero-sum game, so most futures traders use technical indicators. Advanced traders make stable profits with system trading, also known as algorithmic trading, which uses computer programs to receive real-time market data, analyze price movements with various technical indicators, and automatically enter trading orders (timing, price, and quantity) without any human intervention. Recent studies have shown the usefulness of artificial intelligence systems in forecasting stock prices and investment risk. KOSPI200 index data is numerical time-series data, a sequence of data points measured at successive uniform time intervals such as a minute, day, week, or month. KOSPI200 index futures traders use technical analysis to find patterns on the time-series chart. Although there are many technical indicators, their results indicate the market state: bull, bear, or flat. Most strategies based on technical analysis are divided into trend-following and non-trend-following strategies, and both decide the market state from the patterns of the KOSPI200 index time series. This fits well with a Markov model (MM): everybody knows that the next price will be higher than, lower than, or similar to the last price, and that the next price is influenced by the last price, but nobody knows the exact state of the next price, whether it will go up, go down, or stay flat. A hidden Markov model (HMM) is therefore better suited than an MM. HMMs are divided into discrete HMMs (DHMM) and continuous HMMs (CHMM); the only difference is their representation of observation probabilities. A DHMM uses a discrete probability distribution and a CHMM uses a continuous probability density function such as a Gaussian mixture model. KOSPI200 index values are real numbers that follow a continuous probability density function, so a CHMM is more appropriate than a DHMM for the KOSPI200 index. In this paper, we present an artificial intelligence trading system based on a CHMM for KOSPI200 index futures system traders. Traders have used technical trading ever since the introduction of the KOSPI200 index futures market and have applied many strategies to profit from it. Some strategies are based on technical indicators such as moving averages or stochastics, and others on candlestick patterns such as three outside up, three outside down, harami, or doji star. We present a trading system for a moving average cross strategy based on a CHMM and compare it to a traditional algorithmic trading system, setting the moving average parameters to common values used by market practitioners.
Empirical results comparing simulated performance with the traditional algorithmic trading system, using more than 20 years of daily KOSPI200 index data, show that the suggested trading system achieves higher trading performance than the naive system trading approach.
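To show the general shape of such a system, here is a minimal Python sketch that fits a continuous (Gaussian) HMM to daily returns with hmmlearn and combines the inferred regime with a moving-average cross signal. The synthetic price series, the window lengths, and the regime-to-position mapping are illustrative assumptions, not the paper's data or parameters.

```python
import numpy as np
import pandas as pd
from hmmlearn.hmm import GaussianHMM

rng = np.random.default_rng(42)

# Synthetic daily closing prices standing in for the KOSPI200 index.
prices = pd.Series(100 * np.exp(np.cumsum(rng.normal(0.0002, 0.01, 1000))))
returns = np.log(prices).diff().dropna().to_numpy().reshape(-1, 1)

# Continuous HMM with three hidden regimes (e.g. bull / flat / bear).
hmm = GaussianHMM(n_components=3, covariance_type="diag", n_iter=200, random_state=0)
hmm.fit(returns)
regimes = hmm.predict(returns)

# Moving-average cross signal: +1 when the short MA is above the long MA.
short_ma = prices.rolling(5).mean()
long_ma = prices.rolling(20).mean()
cross = np.where(short_ma > long_ma, 1, -1)[1:]  # align with returns

# Trade the cross signal only in the regime with the highest mean return.
bull_state = int(np.argmax(hmm.means_.ravel()))
position = np.where(regimes == bull_state, cross, 0)
strategy_return = (position[:-1] * returns.ravel()[1:]).sum()
print("bull regime:", bull_state, "cumulative log return:", round(strategy_return, 4))
```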

A study on the developing and implementation of the Cyber University (가상대학 구현에 관한 연구)

  • Choi, Sung;Yoo, Gab-Sang
    • Proceedings of the Technology Innovation Conference
    • /
    • 1998.06a
    • /
    • pp.116-127
    • /
    • 1998
  • The necessity of a cyber university: within the rapidly changing environment of global economics, the environment of higher education in universities has also been encountering various changes. The popularization of higher education in connection with the lifetime education system, the emphasis on the productivity of education services and the acquisition of competitiveness through an open education market, the breakdown of the ivory tower and the multiversitization of universities, the growing importance of obtaining information within universities, and cooperation between domestic and overseas universities, industry, and the educational system must all be addressed. Therefore, in order to cope adequately with these rapid changes in the education environment, it is necessary to operate a cyber university utilizing various information technologies and applications such as the Internet, e-mail, CD-ROMs, interactive video networks (video conferencing, video on demand), TV, and cable, which have no time or location limitations. The use of information and telecommunication technologies, especially the Internet, is expected to bring about many changes in the social, economic, and educational areas; among the many changes scholars have predicted, the development and establishment of distance learning or the cyber university is the most dominant factor. In the United States, cyber universities have already been established and are operated by the governments of 13 states, and other universities (around 500 have opened so far), with the help of the government and private citizens, have been able to operate cyber universities in part and are planning to enlarge them step by step. This can be seen not only as the U.S. trying to elevate its higher education through its leading information technologies, but also as an effort to bring the culture of education worldwide under its influence; UTRA University in the U.S., for example, is already exporting its class lectures to China and Indonesia. Influenced by the cyber university current in the U.S., universities in Korea are arranging various forms of cyber universities. In line with this, at JUNAM National University an Internet-based cyber university, which began its work in July 1997, is operating about 100 cyber lectures. In the case of Hanam University, distance learning classes are at the final stage of being established as part of a fast-track project, led by the Korean government, to set an example. In addition, the Department of Education has selected five universities, including Seoul Cyber Design University, for experimentation and is in the stage of strategic operation, and over 100 universities in Korea are speeding up their preparations for operating cyber universities. This form of distance learning goes beyond the walls of universities and is spreading into business areas and the various training programs of financial organizations. In the hope that this material will be of some help to other universities preparing for a cyber university, we introduce some general concepts of the components forming the cyber university and open education system established by JUNAM University.
The cyber university system can be seen as a comprehensive solution offered by computer technologies for student management, lectures on demand, real-time and satellite classes, a media production lab for producing multimedia contents, an electronic library, and groupware enabling the exchange of information between students and professors. Arranging the general concepts of these components in terms of the cyber university and open education, they can be expressed as the establishment of the cyber university and the service of open education, as shown in the diagram below.


A Study on Animation Music Video Production for Viral Marketing Purposes: A Case Study of the World Cup Cheering Song <일어나라 대한민국> Project (바이럴 마케팅용 애니메이션 뮤직비디오 제작 연구 : 월드컵 응원가 <일어나라 대한민국> 사례를 중심으로)

  • Han, Sang-Gyun;Kim, Tak-Hoon;Kim, Yu-Mi
    • Cartoon and Animation Studies
    • /
    • s.22
    • /
    • pp.47-63
    • /
    • 2011
  • Recently, contemporary cultural contents have shown diverse changes following the birth of new media platforms and consumers' new needs in the global market, with developments in the Internet and computer networks as the main contributors to the speed of these changes. This study examines how to use these new media platforms effectively through the example of the stop-motion animation music video <일어나라 대한민국>, analyzing its production and marketing process. The production aimed for high quality by organizing the process economically despite the relatively short period (less than one month) from start to deadline. Because the plan called for the main characters to lead the whole story, the creative team reduced production hours by reusing a common mold when making the original clay models, exploiting the similarities in the characters' appearances, and used CG techniques to overcome the visual monotony that these similarities could cause. Likewise, for the repeated rhythm in the music video, similar background scenes were reused by copying the original scene. In directing, the creative team considered both economic and artistic aspects: they divided the scenes into foreground and background and removed unnecessary parts to save production hours and budget while still creating depth of field in the scenes. Beyond its viral marketing purpose, <일어나라 대한민국> also sought ways to recoup the production cost: the characters in the music video wear the same T-shirts bearing the World Cup logo, and these were designed for sale after the music video's release. Although the sales results were not satisfactory, the project is regarded as a meaningful attempt for the domestic animation industry.


Selectively Partial Encryption of Images in Wavelet Domain (웨이블릿 영역에서의 선택적 부분 영상 암호화)

  • ;Dujit Dey
    • The Journal of Korean Institute of Communications and Information Sciences
    • /
    • v.28 no.6C
    • /
    • pp.648-658
    • /
    • 2003
  • As the use of image and video contents increases, security problems arise for paid image data and for data requiring confidentiality. This paper proposes an image encryption methodology to hide image information, targeting the data produced by quantization in the wavelet domain. The method encrypts only part of the image data rather than the whole original image, using three data selection strategies. First, exploiting the fact that the wavelet transform decomposes the original image into frequency sub-bands, only some of the sub-bands are encrypted, which is enough to make the resulting image unrecognizable. Second, within the data representing each pixel, only the MSBs are encrypted. Finally, the pixels to be encrypted in a specific sub-band are selected randomly using an LFSR (Linear Feedback Shift Register); part of the encryption key is used as the seed of the LFSR and to select which parallel output bits of the LFSR drive the random selection, which increases the strength of the encryption algorithm. The proposed methods were implemented in software and tested on about 500 images, and the results showed that encrypting only about 1/1000 of the original image data is enough to make the original image unrecognizable. We therefore conclude that the proposed methods are efficient image encryption methods that achieve a strong encryption effect with a small amount of encryption. This paper also proposes several encryption schemes according to the choice of sub-bands and the number of LFSR output bits used for pixel selection, and shows that there is a trade-off between execution time and encryption effect, so the proposed methods can be chosen selectively according to the application area. Because the proposed methods operate at the application layer, they are also expected to be a good solution for the end-to-end security problem, which is emerging as one of the important problems in networks with both wired and wireless sections.
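To make the selection mechanism concrete, the following is a small Python sketch of an LFSR seeded from part of a key and used to pick which quantized coefficients of a sub-band get their MSBs flipped. The 16-bit tap positions, the sub-band size, and the MSB "encryption" by XOR are simplifying assumptions, not the paper's exact scheme.

```python
import numpy as np

def lfsr_bits(seed, taps=(16, 14, 13, 11), n_bits=16):
    """Fibonacci LFSR over n_bits; yields one pseudo-random bit per step."""
    state = seed & ((1 << n_bits) - 1)
    while True:
        bit = 0
        for t in taps:
            bit ^= (state >> (n_bits - t)) & 1
        state = ((state << 1) | bit) & ((1 << n_bits) - 1)
        yield bit

def encrypt_subband(coeffs, key_seed, select_ratio_bits=3):
    """Flip the MSB of quantized coefficients chosen by the LFSR stream.
    Requiring `select_ratio_bits` successive 1-bits keeps the encrypted
    fraction small, in the spirit of partial encryption."""
    gen = lfsr_bits(key_seed)
    out = coeffs.copy()
    for idx in np.ndindex(coeffs.shape):
        if all(next(gen) for _ in range(select_ratio_bits)):
            out[idx] ^= 0x80  # invert the MSB of an 8-bit quantized value
    return out

# Toy quantized sub-band (e.g. a detail band after wavelet transform + quantization).
subband = np.random.default_rng(1).integers(0, 256, size=(8, 8), dtype=np.uint8)
cipher = encrypt_subband(subband, key_seed=0xACE1)
print("coefficients changed:", int((cipher != subband).sum()), "of", subband.size)
```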

Information technology and changes in firm activities:A case of the service industry in the United States (정보기술과 기업활동의 변화:미국의 서비스산업을 사례로)

  • Lee, Jeong Rock
    • Journal of the Korean Geographical Society
    • /
    • v.29 no.4
    • /
    • pp.402-419
    • /
    • 1994
  • Telecommunications and information technology have been conceived as crucial and revolutionary elements for recent and future social and economic development, and their development has led to a spatial reorganization and locational change of economic activities. Information technology has brought about important changes in the organizational structure and location of firms. This study draws attention to the relationship between the diffusion of information technology and changes in firm activities, with special reference to the service industry of the United States. Information technology has had a significant impact on the growth and transformation of the U.S. service industry through changes in organizational and employment structure, market structure, and location. Its impact on the location of the service industry shows two opposite patterns, concentration and decentralization; in the United States, the decentralization tendency, such as suburbanization and transfer to lower-ranking cities, predominates over concentration. In the case of Korea, however, it is anticipated that the rapid development of information technology may lead to the concentration of the service industry in Seoul and the Capital region.


Electronic Word-of-Mouth in B2C Virtual Communities: An Empirical Study from CTrip.com (B2C허의사구중적전자구비(B2C虚拟社区中的电子口碑): 관우휴정려유망적실증연구(关于携程旅游网的实证研究))

  • Li, Guoxin;Elliot, Statia;Choi, Chris
    • Journal of Global Scholars of Marketing Science
    • /
    • v.20 no.3
    • /
    • pp.262-268
    • /
    • 2010
  • Virtual communities (VCs) have developed rapidly, with more and more people participating in them to exchange information and opinions. A virtual community is a group of people who may or may not meet one another face to face and who exchange words and ideas through the mediation of computer bulletin boards and networks. A business-to-consumer virtual community (B2CVC) is a commercial group that creates a trustworthy environment intended to motivate consumers to be more willing to buy from an online store. B2CVCs create a social atmosphere through information contributions such as recommendations, reviews, and ratings of buyers and sellers. Although the importance of B2CVCs has been recognized, few studies have examined members' word-of-mouth behavior within these communities. This study proposes a model of involvement, satisfaction, trust, "stickiness," and word-of-mouth in a B2CVC and explores the relationships among these elements based on empirical data. The objectives are threefold: (i) to empirically test a B2CVC model that integrates measures of beliefs, attitudes, and behaviors; (ii) to better understand the nature of these relationships, specifically through word-of-mouth as a measure of revenue generation; and (iii) to better understand the role of B2CVC stickiness in CRM marketing. The model incorporates three key elements concerning community members: (i) their beliefs, measured in terms of their involvement assessment; (ii) their attitudes, measured in terms of their satisfaction and trust; and (iii) their behavior, measured in terms of site stickiness and word-of-mouth. Involvement is considered the motivation for consumers to participate in a virtual community; for B2CVC members, information searching and posting have been proposed as the main purposes of involvement. Satisfaction has been reviewed as an important indicator of a member's overall community evaluation and has been conceptualized at different levels of member interaction with the VC. The formation and expansion of a VC depend on the willingness of members to share information and services. Researchers have found that trust is a core component facilitating anonymous interaction in VCs and e-commerce, so trust-building in VCs has been a common research topic. It is clear that the success of a B2CVC depends on the stickiness of its members to enhance purchasing potential. Opinions communicated and information exchanged between members may represent a type of written word-of-mouth; therefore, word-of-mouth is one of the primary factors driving the diffusion of B2CVCs across the Internet. Figure 1 presents the research model and hypotheses. The model was tested through an online survey of CTrip Travel VC members; of 243 collected questionnaires, 204 were usable after data cleaning. The study's hypotheses examined the extent to which involvement, satisfaction, and trust influence B2CVC stickiness and members' word-of-mouth. Structural equation modeling tested the hypotheses, and the structural model fit indices were within accepted thresholds: ${\chi}^2$/df was 2.76, NFI was .904, IFI was .931, CFI was .930, and RMSEA was .017. Results indicated that involvement has a significant influence on satisfaction (p<0.001, ${\beta}$=0.809).
The proportion of variance in satisfaction explained by members' involvement was over half (adjusted $R^2$=0.654), reflecting a strong association. The effect of involvement on trust was also statistically significant (p<0.001, ${\beta}$=0.751), with 57 percent of the variance in trust explained by involvement (adjusted $R^2$=0.563). When stickiness was treated as a dependent variable, the proportion of variance explained by trust and satisfaction was relatively low (adjusted $R^2$=0.331). Satisfaction had a significant influence on stickiness (${\beta}$=0.514); however, unexpectedly, the influence of trust was not significant (p=0.231, t=1.197), so that hypothesis was rejected. Stickiness was the more important construct in the model because of its effect on e-WOM, with ${\beta}$=0.920 (p<0.001); the stickiness measures explain over eighty percent of the variance in e-WOM (adjusted $R^2$=0.846). Overall, the results of the study supported the hypothesized relationships between members' involvement in a B2CVC and their satisfaction with and trust of it; however, trust, a traditional measure in behavioral models, had no significant influence on stickiness in the B2CVC environment. This study contributes to the growing body of literature on B2CVCs, specifically addressing gaps in the academic research by integrating measures of beliefs, attitudes, and behaviors in one model. The results provide additional insights into behavioral factors in a B2CVC environment, helping to sort out relationships between traditional measures and relatively new ones. For practitioners, identifying factors, such as member involvement, that strongly influence B2CVC member satisfaction can help focus technological resources in key areas, and global e-marketers can develop marketing strategies directly targeting B2CVC members. In the global tourism business, they can target Chinese members of a B2CVC by providing special discounts for active community members or by developing early-adopter programs to encourage stickiness in the community. Future studies, with more sophisticated modeling, are called for to expand the measurement of B2CVC member behavior and to conduct experiments across industries, communities, and cultures.
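For reference, the standardized path estimates reported above can be summarized as structural equations; the disturbance terms $\zeta_i$ are implied by the SEM specification rather than reported in the abstract, and the trust-to-stickiness coefficient is shown only as non-significant:

$$
\begin{aligned}
\text{Satisfaction} &= 0.809\,\text{Involvement} + \zeta_1 \\
\text{Trust} &= 0.751\,\text{Involvement} + \zeta_2 \\
\text{Stickiness} &= 0.514\,\text{Satisfaction} + \beta_{T}\,\text{Trust} + \zeta_3 \qquad (\beta_{T}\ \text{n.s.},\ p = 0.231) \\
\text{e-WOM} &= 0.920\,\text{Stickiness} + \zeta_4
\end{aligned}
$$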