• Volume 26, Issue 11, 2017 Table of Contents
    • Component-Based Description Language of Cyber-Physical System

      2017, 26(11):1-10. DOI: 10.15888/j.cnki.csa.006022

      Abstract (2362) HTML (0) PDF 2.16 M (2043) Comment (0) Favorites

      Abstract:This paper presents a component-based language named CDL (CPS Description Language) to describe Cyber-Physical Systems (CPS) according to the actor-oriented model. The sensors, actuators and computing components in a Cyber-Physical System are encapsulated into components under a uniform abstraction, so a system description decomposes into the components contained in the system, the directed relationships between components, and the system constraints. In addition, we design and implement a CDL-based development tool for CPS that provides system description generation, verification and installation. Finally, two example systems are implemented to show that CDL reduces the amount of code developers must write and improves programming efficiency to a certain extent.

    • Multi-Thread Downloading Technology Based on HTML5

      2017, 26(11):11-18. DOI: 10.15888/j.cnki.csa.006091

      Abstract (1835) HTML (0) PDF 1.19 M (2830) Comment (0) Favorites

      Abstract:This paper studies multi-thread downloading in the browser, aiming to solve the low efficiency of single-thread downloading and its over-reliance on the target server. Based on HTML5 Web Workers, we propose and implement a novel multi-thread downloading technique that runs in the browser. By segmenting a file into chunks, we download it from multiple sources, and we use ArrayBuffer and Blob objects to merge the file fragments in the browser. The results show that this method is superior to single-thread downloading for large files and under large network delay or high packet loss rates.
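
      As a rough illustration of the chunked, multi-source download idea, the Python sketch below fetches byte ranges in parallel and merges them in order; it is an analogue of the paper's browser-side Web Workers approach, not its implementation, and the mirror URLs and chunk size are hypothetical.

```python
# Minimal sketch: split a file into byte ranges, fetch the ranges in parallel
# from several mirrors, then merge the chunks in order (like joining Blobs).
# Assumes every mirror supports HTTP Range requests; URLs are placeholders.
from concurrent.futures import ThreadPoolExecutor
import requests

MIRRORS = ["http://mirror-a.example/file.bin",
           "http://mirror-b.example/file.bin"]
CHUNK = 1 << 20  # 1 MiB per segment

def fetch(job):
    idx, start, end = job
    url = MIRRORS[idx % len(MIRRORS)]            # spread segments over sources
    r = requests.get(url, headers={"Range": f"bytes={start}-{end}"}, timeout=30)
    r.raise_for_status()
    return idx, r.content

def download(total_size, workers=4):
    jobs = [(i, off, min(off + CHUNK, total_size) - 1)
            for i, off in enumerate(range(0, total_size, CHUNK))]
    parts = {}
    with ThreadPoolExecutor(max_workers=workers) as pool:
        for idx, data in pool.map(fetch, jobs):
            parts[idx] = data
    return b"".join(parts[i] for i in range(len(jobs)))
```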

    • Research on Mobile Application Development and Middle Controller Based on Web Technology

      2017, 26(11):19-27. DOI: 10.15888/j.cnki.csa.006038

      Abstract (1686) HTML (0) PDF 2.65 M (2452) Comment (0) Favorites

      Abstract:This paper proposes a development mode that builds mobile applications with Web technology and explains its significance. Based on this mode, using MUI and HTML5+, we build a cross-platform mobile sports competition platform. To verify the feasibility and advantages of the mode, we package and deploy the application to run on mobile devices such as iOS and Android phones and tablets. As Web technology has become highly developed, this development mode is expected to receive increasing attention and ever wider use.

    • Implicit Music Recommender Based on Large Scale Word-Embedding

      2017, 26(11):28-35. DOI: 10.15888/j.cnki.csa.006049

      Abstract (1740) HTML (0) PDF 1.96 M (2442) Comment (0) Favorites

      Abstract:A large-scale word-embedding based implicit music recommender is proposed to address the problem that most current recommendation systems cannot work in large-scale implicit-feedback scenarios. The model employs the Word2Vec technique, which has become popular in Natural Language Processing in recent years. By learning song co-occurrences in users' listening histories, we obtain distributed representations of users and songs as low-dimensional dense vectors. From these we derive user-song similarities that can be used for recommendation, and we also analyze why applying Word2Vec to recommendation is sound. The model effectively solves the problem mentioned above while keeping accuracy unchanged; in addition, it converges faster and uses less memory than traditional methods.
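
      The core idea, treating each user's listening history as a "sentence" of song IDs, can be sketched with gensim's Word2Vec (a sketch under assumptions: gensim 4.x, toy playlists, and skip-gram settings chosen for illustration rather than taken from the paper):

```python
# Minimal sketch: learn song embeddings from co-occurrence in listening
# histories, then query nearest songs by cosine similarity. A user vector
# can be formed, e.g., as the mean of the vectors of the songs they played.
from gensim.models import Word2Vec

playlists = [                       # toy listening histories (song IDs)
    ["s101", "s205", "s309", "s205"],
    ["s309", "s410", "s101"],
    ["s205", "s410", "s512"],
]
model = Word2Vec(sentences=playlists, vector_size=64, window=5,
                 min_count=1, sg=1, workers=4, epochs=50)

print(model.wv.most_similar("s205", topn=3))   # candidate recommendations
```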

    • Storage Medium-Oriented Data Secure Deletion

      2017, 26(11):36-44. DOI: 10.15888/j.cnki.csa.006092

      Abstract (1796) HTML (0) PDF 1.06 M (3300) Comment (0) Favorites

      Abstract:With the rapid development of information and communication technology, the amount of information held on storage media is increasing significantly. Simply deleting a file's index entry in the file table does not really delete the information, and long-term retention of such information can easily lead to data leakage. Secure deletion on storage media is therefore an attractive research area. To address this problem, this paper first introduces the basics of storage media, including their structure and the principles of storage and deletion. It then analyzes several common data deletion standards and studies their characteristics. Finally, we design and implement a secure deletion prototype system and test and analyze existing data deletion software. The results demonstrate that the prototype can safely delete data on storage media and effectively protect users' privacy.
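
      As a small illustration of the overwrite-before-unlink idea behind many of the deletion standards discussed above (a minimal sketch only; real secure deletion must also account for the medium, e.g. SSD wear leveling, and this is not the paper's prototype):

```python
# Minimal sketch: overwrite a file's contents several times with random data,
# flush each pass to the device, and only then remove the directory entry.
import os

def secure_delete(path, passes=3):
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        for _ in range(passes):
            f.seek(0)
            f.write(os.urandom(size))   # overwrite contents in place
            f.flush()
            os.fsync(f.fileno())        # force the pass onto the device
    os.remove(path)                     # finally drop the index entry
```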

    • Analysis on the Limitation and Performance of Taint-Based Directed Fuzzing

      2017, 26(11):45-51. DOI: 10.15888/j.cnki.csa.006051

      Abstract (2021) HTML (0) PDF 823.43 K (2216) Comment (0) Favorites

      Abstract:Taint-based directed fuzzing is an important technology for finding bugs in given suspicious vulnerable code areas in black-box scenarios. It sets the program's input as the initial taint and uses dynamic taint tracing to locate the regions of the input related to the suspicious code areas. It then fuzzes only the located input, avoiding a large amount of testing unrelated to those areas. However, existing research has neither analyzed its real-world challenges systematically nor evaluated its performance enhancement mathematically. To fill this gap, this paper uses 14 CVEs as a benchmark for limitation analysis, models fuzzing as a shifted geometric distribution to derive a performance-enhancement equation, and analyzes the performance trend. The analysis shows that taint-based directed fuzzing has limitations when fuzzing bugs that involve metadata relations in the taint propagation, and the experiments verify that the performance-enhancement equation has good reference value.

    • Design of Data Interaction Manner in Service-Oriented Tool Integration

      2017, 26(11):52-59. DOI: 10.15888/j.cnki.csa.006032

      Abstract (1654) HTML (0) PDF 1.36 M (1895) Comment (0) Favorites

      Abstract:In view of the widespread syntactic and semantic heterogeneity in service-oriented tool integration, this paper proposes a new data interaction manner and gives a detailed description of its implementation and related technology. Its syntax is defined on the JSON format, and it takes advantage of a general vocabulary to handle semantic heterogeneity. Finally, we implement the manner in an actual case of integrating common software development tools, encapsulating tool functions as services. The experimental results show that the designed data interaction manner solves the problems above well and thus provides a good data-interaction basis for tool integration.

    • Fault Analysis Method of Interbank Transaction System

      2017, 26(11):60-66. DOI: 10.15888/j.cnki.csa.006047

      Abstract (2207) HTML (0) PDF 1.58 M (2384) Comment (0) Favorites

      Abstract:Current interbank transaction systems are huge and complicated, so quickly locating and analyzing the cause of a transaction failure in the massive logs they generate is very important. This paper proposes an analysis method based on transaction log slicing, which cuts the whole log from the start to the end of a transaction into slices. It then leverages the Hadoop framework to run big-data analysis with predefined scripts, which locates error information and finds solutions more rapidly. A fault analysis system is implemented with this method. Experiments show that the method obviously improves the efficiency of transaction log query and fault analysis, which reduces the operational cost of interbank transaction systems.
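
      The slicing step that precedes the Hadoop analysis can be pictured as grouping raw log lines by transaction ID; the sketch below is illustrative only, and the log format and the "TXN=" field are hypothetical.

```python
# Minimal sketch: cut a raw log into per-transaction slices keyed by a
# transaction ID; each slice can then be scanned by predefined scripts.
import re
from collections import defaultdict

TXN_RE = re.compile(r"TXN=(\w+)")    # hypothetical transaction-ID field

def slice_log(lines):
    slices = defaultdict(list)
    for line in lines:
        m = TXN_RE.search(line)
        if m:
            slices[m.group(1)].append(line)
    return slices
```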

    • Research on Cluster Scheduling Based on CoreOS Oriented Load Integration

      2017, 26(11):67-75. DOI: 10.15888/j.cnki.csa.006034

      Abstract (2592) HTML (0) PDF 1.08 M (1924) Comment (0) Favorites

      Abstract:CoreOS is a new containerized cluster server operating system based on Docker. It is developing rapidly and is supported by mainstream cloud providers and platforms such as OpenStack, Kubernetes, Salesforce and eBay. Load in the cloud is dynamic, so resource requirements are dynamic as well, which challenges the efficient utilization of cluster resources: statically pre-allocating peak resources wastes a huge amount of cloud resources, while idle machines waste a great deal of energy. This paper proposes a load-integrated cluster scheduling system (LICSS) that monitors the load distribution of the cluster in real time. To release idle resources promptly and reduce energy consumption, nodes are allocated with a compact scheduling strategy and load is dynamically consolidated through task migration. LICSS implements node load metrics, task metrics and a load integration algorithm, and computes an adaptive load threshold per node. Experiments show that LICSS effectively consolidates load as the cluster load changes over time, improves average resource utilization by 12.2%, and reduces cluster energy consumption by putting redundant nodes to sleep during low-load periods.
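
      The compact scheduling idea, packing tasks onto as few nodes as possible so the remaining nodes can sleep, can be sketched as first-fit decreasing bin packing; the capacities and threshold below are illustrative, not LICSS's actual parameters.

```python
# Minimal sketch: first-fit decreasing placement of (task, load) pairs onto
# nodes; nodes left unused after packing are candidates for dormancy.
def compact_schedule(tasks, node_capacity=1.0, threshold=0.85):
    tasks = sorted(tasks, key=lambda t: t[1], reverse=True)
    nodes = []                                   # each node: [used_load, task_ids]
    for tid, load in tasks:
        for node in nodes:
            if node[0] + load <= node_capacity * threshold:
                node[0] += load
                node[1].append(tid)
                break
        else:
            nodes.append([load, [tid]])          # open a new node
    return nodes

print(compact_schedule([("a", 0.5), ("b", 0.3), ("c", 0.4), ("d", 0.2)]))
```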

    • Access Control on USB Mass Storage Devices Based on Thin Hypervisor

      2017, 26(11):76-81. DOI: 10.15888/j.cnki.csa.006039

      Abstract (1650) HTML (0) PDF 743.52 K (1964) Comment (0) Favorites

      Abstract:USB mobile storage devices are widely used to transfer and exchange data thanks to their small size and large capacity. The same features make confidential information hard to protect, because thieves can easily carry secrets away on a USB storage device. There are many studies on protecting confidential data on USB storage devices, most of them working at the application or operating-system layer; when malicious code is present in the operating system, such protections can easily be bypassed by attackers. In this paper, we present a USB device access control system implemented on a thin hypervisor. The thin hypervisor is transparent to the OS, which guarantees that the security of the system does not depend on the OS and thus makes the system more secure.

    • Self-Optimizing Mechanism for Dynamic Switch Migration in SDN

      2017, 26(11):82-88. DOI: 10.15888/j.cnki.csa.006140

      Abstract (1810) HTML (0) PDF 954.35 K (2370) Comment (0) Favorites

      Abstract:In order to make full use of SDN controller resources and to improve the load balance among controllers, a dynamic switch migration mechanism is proposed in this paper. The proposed solution is designed around a Self-Optimizing Mechanism (SOM). It divides the network into several domains according to the deployment of SDN controllers; by comparing the relevant parameters of each domain, the mechanism quickly selects appropriate target switches and migration destinations. Multi-controller load balance, flow latency and algorithm complexity are the main factors considered by the algorithm, whose advantage is that it manages the SDN control plane flexibly through local dynamic adjustment. Simulation results verify that the proposed mechanism enhances the balance among controllers and reduces flow setup latency, while the computational complexity of the migration process stays at a reasonable level.

    • Research and Optimization of SIP Server in Information Communications Technology

      2017, 26(11):89-94. DOI: 10.15888/j.cnki.csa.006074

      Abstract (1463) HTML (0) PDF 3.10 M (2378) Comment (0) Favorites

      Abstract:In the mobile Internet era, many instant messaging applications such as WeChat, Ali DingTalk and YiXin have emerged, and serving them effectively in current systems is a problem worth considering. The Session Initiation Protocol (SIP) is an application-layer signaling protocol; it has advantages in handling sessions but is inadequate for handling instant messages. Based on research and analysis of SIP, this paper describes the design and implementation of a new SIP server system that handles instant messages better. In addition, the SIP Broker can be deployed as a cluster, which supports future service expansion.

    • Human Head Detection Based on GPU_CPU Heterogeneous Parallel Acceleration

      2017, 26(11):95-100. DOI: 10.15888/j.cnki.csa.006079

      Abstract (1635) HTML (0) PDF 1.83 M (2289) Comment (0) Favorites

      Abstract:In multi-scale collaborative human head detection, the histogram of oriented gradients (HOG) cannot meet the real-time requirements of video surveillance because of the massive computation involved in high-definition video. This paper proposes a human head detection method based on GPU_CPU heterogeneous parallel acceleration: the GPU handles the computation-intensive, highly parallel blocks of HOG feature extraction, and the CPU handles the remaining modules. Because the traditional parallel reduction algorithm performs poorly in HOG feature extraction, an improved parallel reduction algorithm is proposed that lowers the time complexity through a parallel down-sweep which reduces the number of node computations. Experimental results show that the proposed method is roughly 10 times more efficient than the traditional one.

    • Research on the Optimization of BLAS Level 1 and 2 Functions on Shenwei Many-Core Processor

      2017, 26(11):101-108. DOI: 10.15888/j.cnki.csa.006045

      Abstract (1856) HTML (0) PDF 4.20 M (2752) Comment (0) Favorites

      Abstract:BLAS (Basic Linear Algebra Subprograms) is a specification that prescribes a set of low-level routines for common linear algebra operations such as vector addition, scalar multiplication, dot products, linear combinations, and matrix multiplication. Its functions are divided into three levels, providing basic vector-vector (level 1), matrix-vector (level 2), and matrix-matrix (level 3) operations, respectively. In this paper, we study the parallel implementation of BLAS level 1 and level 2 functions on the Shenwei many-core processor, make full use of the platform's characteristics to optimize their performance, and summarize the parallel implementation and optimization techniques for this platform. The Shenwei 26010 CPU uses a heterogeneous many-core architecture with an obvious advantage in computing speed: its many computing cores provide large-scale parallel processing capability, so the double-precision floating-point performance of a single chip reaches 3 TFLOPS. The experimental results show that the average speedups of BLAS level 1 and level 2 functions are as high as 11.x and 6.x times over the GotoBLAS reference implementations, respectively.

    • Experimental Data Publishing System for NBI Based on B/S Architecture

      2017, 26(11):109-113. DOI: 10.15888/j.cnki.csa.006053

      Abstract (1904) HTML (0) PDF 1.54 M (2321) Comment (0) Favorites

      Abstract:Experimental data of the EAST-NBI (EAST Neutral Beam Injection) system are published by several separate pieces of software, which are difficult to integrate because they are written in different languages. In addition, these programs use a C/S (Client/Server) architecture: the server only serves specific clients in the LAN (Local Area Network), so clients on the Internet cannot access the data. To meet the need for access via the Internet, a data publishing system based on the B/S (Browser/Server) architecture is designed. The browser side is developed with HTML+CSS+JavaScript and the server side with Java. The system provides a latest-data publishing mode and a historical-data query mode, and integrates the existing data items. Data are transmitted over the WebSocket protocol, which enables a full-duplex channel for automatically pushing new data and for request-response queries of historical data. A comment function is also provided. The system has been put into use in EAST-NBI experiments with positive results.

    • Intelligent Agriculture System Based on Mobile Internet and Web Service

      2017, 26(11):114-117. DOI: 10.15888/j.cnki.csa.006093

      Abstract (2344) HTML (0) PDF 2.59 M (2498) Comment (0) Favorites

      Abstract:Based on mobile Internet and Web Service technology, this paper designs and develops an intelligent agriculture APP system consisting of a client and a Web Service server. ZigBee technology builds the underlying wireless sensor network, mobile communication networks carry the information transmission, and the HTTP protocol with the JSON data format is used for data exchange. The system integrates environmental monitoring, historical data query, manual control, system settings and other functions. Displaying and operating agricultural management on an Android mobile phone, according to the environmental parameters required over the crop growth cycle, raises the intelligence level of the agricultural system and reduces its backwardness.

    • Clustering Algorithm Based on Self-Optimizing Center and Boundary of Classes

      2017, 26(11):118-123. DOI: 10.15888/j.cnki.csa.006077

      Abstract (1543) HTML (0) PDF 3.06 M (1834) Comment (0) Favorites

      Abstract:With the deepening development and popularization of the Internet, new data types keep emerging in new application fields, so many classic clustering algorithms no longer adapt well to new situations and data mining has become a thorny issue and research focus. This article therefore proposes a novel clustering algorithm based on self-optimizing the centers and boundaries of classes. The algorithm uses a distance-radius distribution matrix R of the points and a cumulative radius distribution matrix ΣR to characterize the degree of data aggregation; the data points with minimum R and ΣR are found as class centers under a breadth-first search. It also uses the partial derivative matrix R' of the distance-radius distribution to describe the gradient change of looseness between points. Through self-optimization and breadth-first search, the transition point of R', whose partial derivative is the largest among adjacent points, is found as the class boundary, inside which all points belong to the class. Tests on typical clustering data sets such as Aggregation show that the algorithm can effectively cluster data sets of different shapes, sizes and densities, identify isolated points and noise, and achieve good robustness and accuracy.

    • Speech Enhancement Algorithm Using Wiener Filtering Based on Improved Energy to Entropy Ratio

      2017, 26(11):124-131. DOI: 10.15888/j.cnki.csa.006033

      Abstract (1880) HTML (0) PDF 1.73 M (2180) Comment (0) Favorites

      Abstract:To improve the effectiveness of speech enhancement under low-SNR conditions and the robustness of the algorithm, this paper proposes a new speech enhancement algorithm that combines Wiener filtering with a speech endpoint detection algorithm based on frequency-domain features. The endpoint detection uses an improved frequency-domain ratio between the spectral entropy of wavelet-packet ERB sub-bands and the energy entropy. The wavelet-packet ERB sub-band spectral entropy accounts for the masking properties of human hearing and the difference between the frequency distributions of speech and noise, while the frequency-domain energy exploits the energy difference between voiced and non-voiced frames. The Wiener filter processes data in real time and uses the new parameters to distinguish speech from non-speech segments, where the noise spectrum is updated smoothly. Experimental results demonstrate that the endpoint detection algorithm effectively distinguishes speech from non-speech segments, improving speech enhancement at low SNR while preserving the robustness and real-time performance of the algorithm. Compared with two other algorithms, the new approach achieves a better enhancement effect.

    • Passive Multi-Dimensional Host Fingerprint Model in High-Speed Hybrid Network

      2017, 26(11):132-138. DOI: 10.15888/j.cnki.csa.006063

      Abstract (1779) HTML (0) PDF 851.55 K (2475) Comment (0) Favorites

      Abstract:Host identification is very important for computer forensics and for resisting anonymous attacks. In order to accurately identify a target host on the network, the definition and properties of a multi-dimensional host fingerprint model are given and formalized. Then, addressing the reliability and accuracy problems of fingerprint acquisition, this paper proposes a multi-dimensional host fingerprint model for high-speed hybrid network traffic that integrates hardware characteristics, host software environment characteristics and host network behavior characteristics. The experimental results show that the proposed model extracts data flexibly and efficiently in a high-speed hybrid network and identifies hosts with an accuracy of 93.33%, nearly 8 percentage points higher than single-dimension host fingerprint identification, while remaining unaffected by IP address changes. Overall, the multi-dimensional host fingerprint model offers higher reliability and accuracy than single-dimensional identification.

    • Clustering Protocol for Wireless Sensor Networks Based on Inertia Weight Chaos-PSO Optimization

      2017, 26(11):139-144. DOI: 10.15888/j.cnki.csa.006037

      Abstract (1494) HTML (0) PDF 816.02 K (1820) Comment (0) Favorites

      Abstract:To resolve the conflict among multiple factors in cluster head election, a clustering algorithm based on adaptive inertia weight chaotic particle swarm optimization (AWCPSO) is proposed to optimize cluster head election and extend the network lifetime. During cluster head election the algorithm considers the residual energy of a node, its distance to the base station, and its probability of becoming a cluster head. It uses the adaptive inertia weight chaotic particle swarm algorithm to optimize the election and recruits cluster members within the node's communication range, so that the resulting number of cluster heads approaches the optimum, further improving the energy efficiency of the network. Simulation results show that, compared with the SEP and DEEC algorithms, the proposed algorithm saves energy more effectively and improves the stability and lifetime of the network by 62.31% and 16.45%, respectively.

    • Adaptive Level Set Segmentation Algorithm Based on Local Region

      2017, 26(11):145-151. DOI: 10.15888/j.cnki.csa.006042

      Abstract (1289) HTML (0) PDF 3.93 M (2252) Comment (0) Favorites

      Abstract:Intensity inhomogeneity often occurs in natural and medical images, and such images are hard to segment accurately because most popular segmentation models assume homogeneous intensity. In this paper, we propose a novel level-set segmentation model that integrates adaptive gradient weighted information (AGWI) with local region information to handle intensity-inhomogeneous images. By employing AGWI in local regions, we combine edge information and region information, and their complementarity enhances the robustness and effectiveness of the method. Finally, we compare our model with the local Chan-Vese (LCV) model and the local intensity clustering (LIC) model; experiments on synthetic and natural images demonstrate the efficiency and robustness of our method.

    • Automatic Video Object Segmentation Algorithm for Multiple Scenes

      2017, 26(11):152-158. DOI: 10.15888/j.cnki.csa.006044

      Abstract (1504) HTML (0) PDF 2.61 M (1956) Comment (0) Favorites

      Abstract:Aiming at the poor robustness of video object segmentation under complex environments, camera movement and unstable lighting, a segmentation algorithm combining optical flow and graph cuts is proposed. The main idea is to improve segmentation by analyzing the motion information of the foreground object and obtaining prior knowledge of the foreground region on each frame. Firstly, motion information in the video is collected with the optical flow field and prior knowledge of the foreground object is extracted. Then, foreground object segmentation is performed by combining the prior foreground and background regions. Finally, to improve robustness across different scenarios, the traditional geodesic saliency model is improved and a dynamic position model optimization mechanism based on a Gaussian Mixture Model is employed, exploiting the intrinsic temporal smoothness of video. Experimental results on two benchmark datasets show that the proposed algorithm reduces the segmentation error rate compared with other video object segmentation algorithms and effectively improves robustness in many scenarios.

    • Document Classification Method Based on Word2vec

      2017, 26(11):159-164. DOI: 10.15888/j.cnki.csa.006055

      Abstract (1378) HTML (0) PDF 732.73 K (3439) Comment (0) Favorites

      Abstract:Feature extraction and vector representation are the key points in document classification. This paper proposes a word2vec-based classification method that addresses both. The method builds the bag of feature words by Document Frequency (DF) to retain as many important document features as possible, and uses the latent semantic properties of word2vec to effectively reduce the size of the bag of feature words and the dimension of the document vector, replacing semantically related words with a topic word scaled by appropriate parameters. It also assigns each feature word an optimal weight by combining it with the TF-IDF algorithm. Compared with two other document classification methods, the proposed method achieves a clear improvement, and the experimental results prove its effectiveness.
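
      The combination of word2vec vectors with TF-IDF weights can be sketched as a weighted average of word vectors per document (a sketch assuming gensim 4.x and scikit-learn >= 1.0, with a toy corpus; it is not the paper's exact weighting scheme):

```python
# Minimal sketch: TF-IDF supplies per-word weights, word2vec supplies word
# vectors; each document vector is the weighted mean of its word vectors.
import numpy as np
from gensim.models import Word2Vec
from sklearn.feature_extraction.text import TfidfVectorizer

docs = [["cheap", "flight", "ticket"], ["stock", "market", "rally"],
        ["flight", "delay", "airport"]]
w2v = Word2Vec(docs, vector_size=50, min_count=1, epochs=50)
tfidf = TfidfVectorizer(analyzer=lambda d: d).fit(docs)
idf = dict(zip(tfidf.get_feature_names_out(), tfidf.idf_))

def doc_vector(doc):
    vecs = [w2v.wv[w] * idf.get(w, 1.0) for w in doc if w in w2v.wv]
    return np.mean(vecs, axis=0) if vecs else np.zeros(w2v.vector_size)

X = np.vstack([doc_vector(d) for d in docs])   # feed X to any classifier
```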

    • Data Quality Evaluation Method Based on Rule Base

      2017, 26(11):165-169. DOI: 10.15888/j.cnki.csa.006046

      Abstract (2199) HTML (0) PDF 800.98 K (8183) Comment (0) Favorites

      Abstract:In today's era of big data, data quality is the precondition for big data to be meaningful, and its evaluation is an important research topic. This paper puts forward a data quality evaluation method based on a rule base and presents the overall evaluation model, which includes rules, the rule base, data quality evaluation indexes, the evaluation model and the evaluation report. The paper designs a rule evaluation template, combines rules from the rule base, sets rule weights according to the importance of each data quality evaluation index, adopts an evaluation method that combines the simple ratio method with the weighted average method, calculates the evaluation result, determines the data quality grade, and presents the result with data visualization technology. To assess data quality fairly and accurately and to present the results concisely and intuitively, the method considers not only the execution rate of each single rule but also the proportion of each rule in the data quality evaluation template.
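
      The scoring step, combining per-rule pass rates with index weights into one weighted-average grade, can be sketched as follows; the rules, weights and records are illustrative, not the paper's rule base.

```python
# Minimal sketch: each rule is (name, weight, predicate); its pass rate is the
# share of records satisfying the predicate (the "simple ratio" part), and the
# overall grade is the weighted average of the pass rates.
def evaluate(records, rules):
    total_w = sum(w for _, w, _ in rules)
    report, score = {}, 0.0
    for name, weight, pred in rules:
        rate = sum(1 for r in records if pred(r)) / len(records)
        report[name] = rate
        score += weight * rate
    return score / total_w, report

records = [{"age": 30, "email": "a@b.c"}, {"age": -1, "email": ""}]
rules = [("age_valid", 0.6, lambda r: 0 <= r["age"] <= 120),
         ("email_present", 0.4, lambda r: bool(r["email"]))]
print(evaluate(records, rules))
```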

    • Semi-Supervised Clustering Algorithm Based on RFM Model

      2017, 26(11):170-175. DOI: 10.15888/j.cnki.csa.006078

      Abstract (1736) HTML (0) PDF 2.20 M (2676) Comment (0) Favorites

      Abstract:As an important part of customer relationship management (CRM), customer classification is the basis for an enterprise's marketing: classifying customers helps assess customer value accurately and facilitates precise marketing. This paper studies the prior structural information hidden in an RFM-model dataset, marks two groups of customer data with prior category labels, and thereby obtains two initial clustering centers. On top of the traditional K-means algorithm, the value of K and the initial clustering centers are determined adaptively. The category labels are converted into pairwise constraints of the Must-link and Cannot-link types, and, following HMRF-KMeans, constraint penalties and rewards are introduced to improve the guidance and results of clustering. The improved semi-supervised clustering algorithm (RFM-SS-means) is tested on standard data sets, and the Foodmart data set is used to compare the clustering effect of RFM-SS-means with the traditional K-means algorithm and the two-step clustering algorithm. The experimental results show that RFM-SS-means obtains the largest CH coefficient and achieves a good clustering effect without pre-specifying K or the initial clustering centers.
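
      The seeding idea, deriving the initial clustering centers from the a-priori labelled RFM customers, can be sketched with scikit-learn's KMeans (toy RFM values; the pairwise Must-link/Cannot-link constraint handling of RFM-SS-means is not reproduced here):

```python
# Minimal sketch: two pre-labelled customers supply the two initial centers
# for K-means over (Recency, Frequency, Monetary) features.
import numpy as np
from sklearn.cluster import KMeans

X = np.array([[5, 20, 900.0], [7, 18, 750.0], [60, 2, 40.0],
              [75, 1, 25.0], [10, 15, 600.0], [80, 3, 55.0]])
seed_centers = np.array([X[0], X[3]])      # from the labelled samples

km = KMeans(n_clusters=2, init=seed_centers, n_init=1).fit(X)
print(km.labels_)
```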

    • User Behavior Authentication Method Based on SVM Algorithm

      2017, 26(11):176-181. DOI: 10.15888/j.cnki.csa.006056

      Abstract (1714) HTML (0) PDF 1.71 M (2473) Comment (0) Favorites

      Abstract:To enhance mobile phone security, a user operation behavior authentication method based on SVM is proposed. By monitoring the phone's touch screen, raw data such as sliding tracks and contact areas are continuously collected during user operation. A user behavior feature extraction algorithm is designed to build user feature samples, and the SVM algorithm is used to train a user behavior model. The user's access target and historical authentication results are combined with different authentication strategies, so that sensitive data receive strong protection while access to non-sensitive data remains convenient. Experiments in the Android environment show that the method achieves a good authentication effect.
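
      The classification step can be sketched with scikit-learn's SVC trained on features extracted from touch traces; the feature choice (mean pressure, contact area, swipe speed) and values below are illustrative, not the paper's feature set.

```python
# Minimal sketch: train an SVM on touch-behavior features and use the
# predicted probability as the confidence fed into the authentication policy.
import numpy as np
from sklearn.svm import SVC

# rows: [mean_pressure, mean_area, swipe_speed]; label 1 = legitimate owner
X = np.array([[0.42, 0.31, 1.8], [0.45, 0.30, 1.7], [0.40, 0.33, 1.9],
              [0.70, 0.55, 3.2], [0.68, 0.52, 3.0], [0.72, 0.58, 3.4]])
y = np.array([1, 1, 1, 0, 0, 0])

clf = SVC(kernel="rbf", C=1.0, gamma="scale", probability=True).fit(X, y)
print(clf.predict_proba([[0.43, 0.32, 1.85]]))
```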

    • Algorithm of Bus Route Trajectory Based on GIS Road Network

      2017, 26(11):182-186. DOI: 10.15888/j.cnki.csa.006054

      Abstract (1964) HTML (0) PDF 1.01 M (3183) Comment (0) Favorites

      Abstract:To solve the problems that bus route trajectories deviate from the road network and that GIS road network information is missing, especially on rural roads, this paper proposes a bus route trajectory algorithm based on GIS road networks. Firstly, it analyzes bus GPS data in depth and clusters the track points of each line's up and down directions separately. Secondly, it cleans and sorts the track points. Thirdly, it performs map matching against the GIS road network information. Finally, an improved Dijkstra algorithm handles segments where GIS road network information is missing. The algorithm has been applied to 35 bus lines in city A with a successful match rate of 85%; the unsuccessful matches are due to missing samples or wrong road network information. The algorithm therefore shows good accuracy and practicability.

    • Real-Time Trajectory Visual Algorithm of Civil Aviation Aircrafts Based on GM(1, 1) Algorithm

      2017, 26(11):187-192. DOI: 10.15888/j.cnki.csa.006065

      Abstract (1307) HTML (0) PDF 801.59 K (2914) Comment (0) Favorites

      Abstract:Aiming at the lagging and jumping problems in the dynamic flight visualization of civil aircraft, a prediction-based real-time trajectory visualization algorithm for civil aircraft is studied and put forward. The algorithm includes three parts: track point prediction, target traveling and error correction. Track point prediction is improved on the basis of the GM(1,1) algorithm; in each iteration the development coefficient is dynamically adjusted according to all prior points and the search direction. Target traveling is based on the TSUS (Time Slice Uniform Speed) algorithm, which ensures that the target actually reaches the destination within the time period and adjusts the trajectory according to the initial direction. Error correction adopts a segmented strategy to balance accuracy and practicality in different situations. Experiments show that the algorithm can be effectively applied to the visual rendering of civil aircraft trajectories and improves the usability and user experience of the system.
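
      The track point prediction builds on the basic GM(1,1) grey model, which can be sketched as below; the dynamic adjustment of the development coefficient described above is not reproduced, and the input values are toy coordinates.

```python
# Minimal sketch of GM(1,1): accumulate the series, fit the whitened equation
# x0(k) + a*z1(k) = b by least squares, then predict and difference back.
import numpy as np

def gm11_predict(x0, steps=1):
    x0 = np.asarray(x0, dtype=float)
    x1 = np.cumsum(x0)                                   # accumulated series
    B = np.column_stack((-(x1[:-1] + x1[1:]) / 2, np.ones(len(x0) - 1)))
    a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]     # development / grey input
    k = np.arange(len(x0) + steps)
    x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a
    x0_hat = np.concatenate(([x0[0]], np.diff(x1_hat)))  # back to original scale
    return x0_hat[-steps:]

print(gm11_predict([112.1, 112.4, 112.8, 113.3, 113.9], steps=2))
```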

    • Improvement of Energy-Efficient Clustering Multi-Hop Routing Algorithm for WSN

      2017, 26(11):193-198. DOI: 10.15888/j.cnki.csa.006108

      Abstract (1708) HTML (0) PDF 780.73 K (2195) Comment (0) Favorites

      Abstract:The LEACH algorithm is a typical single-hop clustering routing algorithm for wireless sensor networks. To mend its shortcomings, this paper proposes an improved energy-efficient clustering multi-hop routing algorithm that uses the analytic hierarchy process to determine the weight coefficients of four factors: node degree, communication distance between nodes, residual node energy, and distance from the node to the base station. These four factors are introduced into the cluster head election of each round. A genetic algorithm is then used to find the optimal path traversing all cluster head nodes and the base station, so that data are transmitted from cluster heads to the base station by multi-hop communication. The experimental results show that the proposed algorithm outperforms the CECA, LEACH-GA and LEACH algorithms in network lifetime, network energy consumption and energy balance: it balances energy use and prolongs the network life cycle.

    • Influence Maximization on Multi-Social Networks Based on Bridge Users

      2017, 26(11):199-204. DOI: 10.15888/j.cnki.csa.006080

      Abstract (1373) HTML (0) PDF 1006.23 K (2132) Comment (0) Favorites

      Abstract:Influence maximization on a single network has aroused widespread concern and become a research hotspot, but information increasingly flows between multiple social networks. A bridge user (BU), a user who holds accounts on several social networks, can share information from one social network to another, so information spread is not limited to a single network. In this paper, we study influence maximization on multiple social networks. We analyze the role of bridge users in cross-network information spread, propose a multi-social-network aggregation algorithm based on bridge users, and then solve the influence maximization problem on the aggregate graph. Experiments solve the influence maximization problem on multiple social networks and confirm the role of bridge users in cross-network information spread.
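
      The aggregation step, fusing the accounts of each bridge user so that the two networks become one graph, can be sketched with networkx; the graphs and the bridge mapping below are toy data.

```python
# Minimal sketch: relabel bridge-user accounts in network 2 to their network 1
# identity, then compose the graphs; influence maximization then runs on the
# aggregate graph.
import networkx as nx

g1 = nx.Graph([("a1", "b1"), ("b1", "c1")])    # network 1
g2 = nx.Graph([("a2", "d2"), ("d2", "e2")])    # network 2
bridges = {"a2": "a1"}                         # a1 and a2 are the same person

merged = nx.compose(g1, nx.relabel_nodes(g2, bridges))
print(sorted(merged.edges()))                  # a1 now links both networks
```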

    • Energy Consumption Improvement of LEACH in WSNs

      2017, 26(11):205-212. DOI: 10.15888/j.cnki.csa.006095

      Abstract (1661) HTML (0) PDF 1.25 M (2309) Comment (0) Favorites

      Abstract:The LEACH (Low Energy Adaptive Clustering Hierarchy) protocol has several shortcomings, such as the strong randomness of cluster head selection and the failure to consider nodes' residual energy when selecting cluster heads. These defects increase the energy consumption of the network and reduce its lifetime. Based on research and analysis of LEACH, this paper proposes an improved protocol named LEACH-CR (Low Energy Adaptive Clustering Hierarchy-Consumption Reduction). LEACH-CR takes many factors into consideration, including the number and distribution of cluster heads, their residual energy and their distance to the base station. MATLAB simulation shows that the new protocol effectively prolongs the lifetime of the network.

    • Application and Improvement of BTM in Short Text Classification Algorithm of the Same Topic

      2017, 26(11):213-219. DOI: 10.15888/j.cnki.csa.006071

      Abstract (1811) HTML (0) PDF 993.47 K (4599) Comment (0) Favorites

      Abstract:To address the problem that the topic number K becomes very large when topic models are applied to large-scale short-text corpora, the FBTM model is proposed, which reduces the sampling complexity from O(K) to O(1). Aiming at the shortness and weak descriptive ability of short texts, this paper further proposes a short-text classification algorithm that combines same-topic biterms with FBTM. Firstly, FBTM is used to model the text, and same-topic biterms within a sliding window are expanded as features of the original text; then the FBTM topic distribution is used as another part of the text features. The results show that this method significantly improves classification performance on a Weibo corpus.

    • Mining Clinical Pathways Algorithm Based on Prefix Constraints

      2017, 26(11):220-225. DOI: 10.15888/j.cnki.csa.006073

      Abstract (1769) HTML (0) PDF 836.64 K (1955) Comment (0) Favorites

      Abstract:As much research has discussed, clinical pathways provide an effective way to improve the efficiency of hospitals, but finding useful clinical pathways conveniently remains a problem. With the rapid development of networking, data storage and data collection capacity, hospitals have accumulated large amounts of clinical data. In this paper, we formulate clinical pathway mining as a sequential pattern mining problem. We propose the concept of a clinical pathway prefix and integrate the prefix set into our algorithm CPM-PC (Clinical Pathways Mining with Prefix Constraints). The algorithm is better suited to mining clinical pathways and does not search sequences that have no medical significance. The method has been applied to a real-world data set to find clinical pathways and performs well.

    • Land Surface Phenology Remote Sensing Recognition Method Based on Segmented Morlet Wavelet Transform

      2017, 26(11):226-232. DOI: 10.15888/j.cnki.csa.006103

      Abstract (1330) HTML (0) PDF 2.18 M (1979) Comment (0) Favorites

      Abstract:This paper proposes to identify Land Surface Phenology (LSP) from remote sensing data using a segmented Morlet wavelet transform. Land Surface Phenology is a necessary parameter for understanding the Earth's ecological system and an essential basis for protecting animals and plants, farming, and other activities. Existing methods have defects such as inaccurate phenology identification and poor noise removal, whereas the Morlet wavelet performs very well in cycle identification and noise removal. The Morlet wavelet transform is therefore applied to the NDVI of the Qinghai Lake Basin from 2003 to 2014. It is found that the transformed curve sometimes does not fit the original NDVI or shifts the phenological period, so an improvement is proposed: a segmented Morlet wavelet transform, which divides each NDVI cycle into two sections at the NDVI maximum, applies the Morlet wavelet transform to the two segments separately, and selects appropriate parameters automatically. With this method, phenology identification becomes more reasonable and accurate. The LSP parameters of the Qinghai Lake Basin are extracted with the segmented Morlet wavelet transform and the maximum-slope method, analyzed on temporal, spatial and special-year scales, and used to reveal the characteristics of Land Surface Phenology in the basin. The results also show that the segmented-Morlet-wavelet-based LSP remote sensing recognition method improves both accuracy and efficiency.

    • Three Security Solutions of Android Mobile Application

      2017, 26(11):233-237. DOI: 10.15888/j.cnki.csa.006107

      Abstract (1388) HTML (0) PDF 1.74 M (2331) Comment (0) Favorites

      Abstract:With the rapid upgrading of intelligent mobile terminal devices, Android applications are growing in variety and are updated ever faster. The rapid development of Android smartphones has made the Android security mechanism a hot issue for developers and users. After a brief analysis of the current situation and of the Android security mechanism, this paper proposes corresponding solutions to three security problems and implements the related security mechanisms with practical examples. Practice shows that these methods improve the security of the application.

    • Progressive Mesh Generating Method Based on Half-Edge Structure and √3 Subdivision

      2017, 26(11):238-242. DOI: 10.15888/j.cnki.csa.005470

      Abstract (1618) HTML (0) PDF 3.26 M (1907) Comment (0) Favorites

      Abstract:Progressive meshes meet the requirement of generating multi-resolution meshes of a 3D model. In existing methods, more than four adjacent vertices are involved in simplifying a vertex, and meshes are represented with a vertex-face list structure, which performs poorly when searching neighbor information. In this paper, the √3 subdivision method is introduced to predict the vertex to be simplified, so only three adjacent vertices need to be considered. Meanwhile, a half-edge map is constructed to replace the vertex-face list and speed up neighbor searches. Experimental results show that the proposed method improves both the time efficiency and the space efficiency of generating progressive meshes.

    • Simulation Software for Hot Metal Movement in Blast Furnace Based on Fluent

      2017, 26(11):243-248. DOI: 10.15888/j.cnki.csa.006061

      Abstract (1531) HTML (0) PDF 2.70 M (2923) Comment (0) Favorites

      Abstract:Fluent calculation is an important means of simulating the flow of molten iron in a blast furnace hearth, but Fluent suffers from complicated operation and a high entry barrier. This work extends Fluent with Microsoft Visual Studio (VS), using a journal (log) file written in the TUI (Text User Interface) language as the data interface between VS and Fluent. A geometric model of hot metal flow in the blast furnace hearth is established from the main parameters, such as hearth diameter, hearth radius, taphole depth, taper angle and dead-man state, and the parameter transfer process is optimized. Parameters in the journal file are modified and replaced by variable substitution, which drives Fluent to build the hot metal flow model and simulate the flow field. The resulting application software matches practical engineering, has a simple and friendly interface, lowers the requirements for using Fluent, improves work efficiency, and enhances the versatility and speed of Fluent in the blast furnace industry.

    • Data Driven Test and Evaluation Method for Intelligent Vehicle Object Detection Capability

      2017, 26(11):249-253. DOI: 10.15888/j.cnki.csa.006043

      Abstract (1994) HTML (0) PDF 1.17 M (2736) Comment (0) Favorites

      Abstract:To address problems in the test and evaluation of intelligent vehicle object detection capability, such as an incomplete index system, a low degree of quantification and a lack of real-time evaluation, we put forward a quantitative evaluation index system for object classification and object recognition and conduct a comprehensive evaluation with TOPSIS. Based on this index system, we build a data-driven platform for evaluating intelligent vehicle object detection ability, which also meets the real-time requirements of the evaluation. Finally, several groups of vehicle detection algorithms are used to verify the index system.
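
      The TOPSIS step of the comprehensive evaluation can be sketched as below; the index values, weights and benefit/cost directions are illustrative, not the platform's actual index system.

```python
# Minimal sketch of TOPSIS: normalise, weight, measure distances to the ideal
# and anti-ideal solutions, and rank alternatives by relative closeness.
import numpy as np

def topsis(matrix, weights, benefit):
    M = np.asarray(matrix, dtype=float)
    M = M / np.linalg.norm(M, axis=0)                 # vector normalisation
    V = M * np.asarray(weights, dtype=float)
    ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
    anti  = np.where(benefit, V.min(axis=0), V.max(axis=0))
    d_pos = np.linalg.norm(V - ideal, axis=1)
    d_neg = np.linalg.norm(V - anti,  axis=1)
    return d_neg / (d_pos + d_neg)                    # closeness: higher is better

# three detection algorithms scored on precision, recall, latency (ms)
closeness = topsis([[0.91, 0.88, 45], [0.93, 0.84, 60], [0.89, 0.90, 38]],
                   weights=[0.4, 0.4, 0.2], benefit=[True, True, False])
print(closeness.argsort()[::-1])                      # ranking of the algorithms
```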

    • Objects Extraction of Comment Based on Conditional Random Field

      2017, 26(11):254-259. DOI: 10.15888/j.cnki.csa.006050

      Abstract (1802) HTML (0) PDF 870.75 K (1880) Comment (0) Favorites

      Abstract:Extracting the objects of comments is an important part of sentiment analysis. In view of the irregular language and Internet-specific characteristics of Chinese online comments, this paper presents an object extraction method based on syntactic analysis and conditional random fields. It experimentally analyzes the effect of different templates and different feature combinations on the F value. The system is implemented with the open interface of the Harbin Institute of Technology language technology platform and open-source CRF tools, and is trained and tested on comment data sets. The F values on the two data sets reach 82.98% and 83.50%, respectively.

    • Influence of Opening Traffic for Closed Residential Districts on Urban Traffic

      2017, 26(11):260-265. DOI: 10.15888/j.cnki.csa.006094

      Abstract (1527) HTML (0) PDF 676.46 K (3610) Comment (0) Favorites

      Abstract:With growing traffic demand, urban congestion is becoming more serious, and whether opening the roads of closed residential districts can relieve urban traffic pressure deserves study. In this paper, a linear-programming network model based on shortest-path traffic diversion is built. Three evaluation indexes are put forward: the traffic improvement factor, the average traffic saturation, and the standard deviation of traffic saturation. They are used to evaluate and compare traffic flow and road usage and to reflect the specific influence of opening the roads of closed residential districts on urban traffic. Two sets of data, the traffic network and the traffic flows, are simulated and solved with the model. The results show that although road network density increases, which may reduce the total traffic burden, opening all residential districts cannot relieve urban traffic pressure. The size, location, and external and internal road conditions of a residential district, among many other factors, affect the traffic state of the road network, so no general conclusion can be drawn: the specific conditions of a district and its surrounding roads must be evaluated before deciding whether its roads should be opened.

    • Link State Detection Protocol Based on ICMP Extension

      2017, 26(11):266-270. DOI: 10.15888/j.cnki.csa.006072

      Abstract (1346) HTML (0) PDF 876.68 K (1989) Comment (0) Favorites

      Abstract:Conventional ICMP is widely used for network information acquisition such as host liveness detection, port scanning and network topology discovery, but problems such as limited detection information, inflexible methods and network limitations remain prominent. In this paper, we propose a connectivity detection method, based on the ICMP protocol, that carries link information. On top of the original ICMP protocol, we add a variable-length link-state field that stores interface device identification and bandwidth load. The paper focuses on how the ICMP Echo Reply is used to carry this interface information, collected at intermediate nodes, back to the source. The method helps operators understand the entire network topology and its bandwidth and delay, filling the gap left by traditional connectivity detection methods, which lack network link state information.

    • Monitoring Framework Based on OpenFlow in Clouds

      2017, 26(11):271-276. DOI: 10.15888/j.cnki.csa.006099

      Abstract (1259) HTML (0) PDF 1.10 M (1941) Comment (0) Favorites

      Abstract:Monitoring networks in the cloud is important and complex. In cloud computing, network security equipment must not only monitor external traffic but also inspect traffic inside the network to resist attacks. Some studies introduce SDN and OpenFlow to route traffic to security devices, but these methods are limited by the number of devices of the same type, which affects their performance. In this paper, we propose a new, flexible OpenFlow-based monitoring framework for cloud computing, together with a corresponding algorithm to improve performance and efficiency. With it, network traffic can be monitored efficiently and flexibly.

    • Multiple Attribute Decision Making Method of Intuitionistic Fuzzy TOPSIS Based on Rough Sets

      2017, 26(11):277-281. DOI: 10.15888/j.cnki.csa.006076

      Abstract (1238) HTML (0) PDF 739.72 K (3042) Comment (0) Favorites

      Abstract:Aiming at multiple attribute decision making problems in which the attribute values are intuitionistic fuzzy information and the attribute weights are unknown, a rough-set-based intuitionistic fuzzy TOPSIS method is proposed. Firstly, the positive and negative ideal points of the intuitionistic fuzzy information are given, and the discernibility matrix is obtained from a given threshold and the similarity between attribute values and the ideal points. Then attribute reduction is performed and the attribute weights are determined from the discernibility matrix. Finally, following the TOPSIS idea, the weighted similarity between each alternative and the ideal point is calculated and the alternatives are ranked. The validity of the method is verified with an example.

    • Industrial Lithium Battery Remaining Useful Life Prediction Based on the ARIMA Model

      2017, 26(11):282-287. DOI: 10.15888/j.cnki.csa.006067

      Abstract (1478) HTML (0) PDF 2.20 M (3358) Comment (0) Favorites

      Abstract:This study focuses on applying stochastic modeling techniques to predicting the remaining useful life of lithium batteries. The Box-Jenkins ARIMA model is used to simulate the lithium battery degradation process, with the battery dataset collected from NASA PCoE. The ADF unit root test and differencing are used to make the original battery capacity series stationary, and the model parameters are estimated by analyzing the autocorrelation and partial autocorrelation functions. Several ARIMA models are generated and validated by assessing various estimation parameters, and the optimal prediction model is selected according to the AIC, SC criteria and normalized BIC. After rigorous evaluation of the candidate models, ARIMA(2,1,2) is identified as the best fit. The selected ARIMA model gives satisfactory results, indicating that it is highly accurate and feasible for short-term prediction.
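
      The modelling pipeline, stationarity check, ARIMA(2,1,2) fit and short-term forecast, can be sketched with statsmodels; the capacity values below merely stand in for the NASA PCoE series, and statsmodels >= 0.12 is assumed.

```python
# Minimal sketch: check stationarity of the differenced capacity series,
# fit the selected ARIMA(2,1,2), inspect information criteria, and forecast.
import numpy as np
from statsmodels.tsa.stattools import adfuller
from statsmodels.tsa.arima.model import ARIMA

capacity = np.array([1.86, 1.85, 1.84, 1.83, 1.83, 1.82, 1.81, 1.80,
                     1.79, 1.78, 1.77, 1.76, 1.75, 1.74, 1.72, 1.71,
                     1.70, 1.69, 1.68, 1.66, 1.65, 1.64, 1.62, 1.61])

print("ADF p-value:", adfuller(np.diff(capacity), maxlag=4)[1])

model = ARIMA(capacity, order=(2, 1, 2)).fit()
print(model.aic, model.bic)              # model-selection criteria
print(model.forecast(steps=5))           # short-term capacity forecast
```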

    • Cloud Service Selection Based on ELECTRE Method

      2017, 26(11):288-291. DOI: 10.15888/j.cnki.csa.005897

      Abstract (1711) HTML (0) PDF 584.43 K (2047) Comment (0) Favorites

      Abstract:The growing number of cloud services offers cloud service consumers a wide range of choices and makes service selection a challenging decision-making problem. This necessitates appropriate decision-making methodologies to assist a decision maker in selecting the service that best fulfills the user's requirements. In this paper, we present a quality-of-service based cloud service description form and a cloud service selection methodology. The ELECTRE method is introduced, which builds the concordance and discordance matrices to obtain the outranking relations among services. Finally, the algorithm is verified with an example.
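
      The concordance / discordance construction at the heart of ELECTRE can be sketched as below; the QoS values and weights are illustrative, and all criteria are treated as benefit criteria for simplicity.

```python
# Minimal sketch: for each ordered pair of services, the concordance index
# sums the weights of criteria on which i is at least as good as j, and the
# discordance index measures the worst normalised shortfall of i against j.
import numpy as np

def electre_matrices(M, w):
    M, w = np.asarray(M, float), np.asarray(w, float)
    n = len(M)
    rng = M.max(axis=0) - M.min(axis=0)
    C, D = np.zeros((n, n)), np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            C[i, j] = w[M[i] >= M[j]].sum() / w.sum()
            D[i, j] = max(((M[j] - M[i]) / rng).max(), 0.0)
    return C, D

# services scored on availability, throughput, reputation
C, D = electre_matrices([[0.99, 120, 4.2], [0.97, 150, 4.5], [0.995, 90, 3.9]],
                        [0.5, 0.3, 0.2])
print(C.round(2)); print(D.round(2))
```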

    • New Algorithm for Segmentation of Cavity Region Based on CT Cardiac Image

      2017, 26(11):292-295. DOI: 10.15888/j.cnki.csa.005838

      Abstract (2075) HTML (0) PDF 3.40 M (2541) Comment (0) Favorites

      Abstract:To solve the over-segmentation of heart cavities in image processing, a new algorithm is proposed that combines a new histogram-based multi-threshold segmentation with marker-based watershed segmentation. Because the gray values of the heart cavity regions in CT images are very similar and the heart tissues are connected to each other, traditional threshold segmentation cannot extract the target area. In the new multi-threshold algorithm, appropriate thresholds are selected from the histogram of the CT image, and the segmentation results serve as the input images of the marker-based watershed algorithm; the cavity region is finally obtained through three-dimensional reconstruction. Compared with other methods, the segmentation result is more accurate and the boundaries are clear thanks to the histogram-based multi-thresholding, which effectively reduces the over-segmentation that occurs when only the original watershed algorithm is used.
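
      The multi-threshold plus marker-based watershed pipeline can be sketched with scikit-image (>= 0.19 assumed); the paper selects thresholds from the CT histogram directly, so the multi-Otsu call below is only a stand-in, and `ct_slice` is assumed to be a 2-D grayscale CT slice loaded elsewhere.

```python
# Minimal sketch: multi-threshold the slice, seed markers from the brightest
# class, and run marker-based watershed on the gradient image.
import numpy as np
from scipy import ndimage as ndi
from skimage.filters import threshold_multiotsu, sobel
from skimage.segmentation import watershed

def segment_cavity(ct_slice):
    thresholds = threshold_multiotsu(ct_slice, classes=3)
    regions = np.digitize(ct_slice, bins=thresholds)      # multi-threshold labels
    markers, _ = ndi.label(regions == regions.max())      # seeds from brightest class
    gradient = sobel(ct_slice.astype(float))
    return watershed(gradient, markers, mask=ct_slice > thresholds[0])
```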
