Title: Sensor web and geospatial cloud computing modeling and its application in real-time collaborative earth observation data processing
Authors: Xiao, Fei
Degree: Ph.D.
Issue Date: 2016
Abstract: Geospatial science is a data-intensive domain in which research and development typically produce and analyze large volumes of distributed, heterogeneous geospatial data sets. Recent advances in sensor network and computing technologies have resulted in an explosion of geospatial data. In addition, scientific workflows and Web Services have been widely employed in geospatial data infrastructures. These technologies allow distributed data and model resources to be accessed and chained together to solve complex scientific problems. The emergence of cloud computing provides a new way to process big geoscience data by dynamically scheduling computing and storage resources over the Internet. Although the geospatial community tends to deploy Earth Observation and geospatial model resources in the cloud, challenges remain in effectively applying the cloud computing paradigm to manage and analyze big geoscience data. First, the service-oriented cloud computing paradigm is transforming the traditional geoscientific workflow management system from a closed, centralized control system into a worldwide dynamic business process, one that typically involves complex interactions among a large set of geographically distributed processing resources deployed and maintained by various organizations. Out of necessity, these complex applications must make use of large volumes of heterogeneous data and be executed in distributed computing environments. Furthermore, current web-based GIS and RS applications generally rely on a centralized structure, which has inherent drawbacks such as a single point of failure, network congestion, and data inconsistency. These inherent disadvantages of traditional GISs need to be overcome for new applications on the Internet and the Web.
To address these challenges, this research presents the Hypercube Geospatial Service Framework (HyperCGSF), an agent-based framework comprising a scalable architecture and a set of distributed algorithms for the decentralized construction and execution of geospatial processing workflows in a cloud computing environment. Using the Integrated Dust storm Detection Model (IDDM) as a case study, this research investigates how geospatial cloud computing and Earth Observation Sensor Web technologies can be utilized to realize standards-compliant geospatial web services, service composition, model input integration, and output utilization. This research also explores how a scalable hypercube Peer-to-Peer (P2P) topology can be applied to organize an arbitrary number of geospatial service agents, which then collaborate in the decentralized execution and monitoring of geospatial workflows. In contrast to traditional centralized approaches (e.g., BPEL), no single service agent takes full charge of executing the whole workflow; instead, the processes in a workflow are evenly distributed among the participating nodes in a fine-grained manner. An experimental evaluation of HyperCGSF and a comparison with a traditional centralized BPEL engine architecture demonstrate that HyperCGSF can dramatically decrease the execution time of complex workflows and increase the stability of the whole system.
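The hypercube overlay described in the abstract can be illustrated with a minimal sketch. This is not the thesis implementation: the integer node-ID scheme and the round-robin task assignment below are assumptions for illustration only. The sketch shows the two properties the abstract relies on: in a d-dimensional hypercube each agent has a d-bit ID and is linked to the d agents whose IDs differ in exactly one bit, and workflow processes can be spread evenly across all 2^d participating nodes rather than handled by a single coordinator.

```python
def hypercube_neighbors(node_id: int, dim: int) -> list[int]:
    """Return the IDs of the dim neighbors of node_id in a dim-cube.

    Each neighbor's ID differs from node_id in exactly one bit,
    obtained by XOR-ing node_id with each single-bit mask.
    """
    return [node_id ^ (1 << bit) for bit in range(dim)]


def distribute_tasks(tasks: list[str], dim: int) -> dict[int, list[str]]:
    """Spread workflow tasks evenly over all 2**dim nodes (round-robin).

    A stand-in for fine-grained, even distribution of workflow
    processes among participating nodes -- no single node owns
    the whole workflow.
    """
    nodes = 2 ** dim
    assignment: dict[int, list[str]] = {n: [] for n in range(nodes)}
    for i, task in enumerate(tasks):
        assignment[i % nodes].append(task)
    return assignment


# In a 3-cube, node 0b000 is linked to 0b001, 0b010, and 0b100.
print(hypercube_neighbors(0b000, 3))
```

Note that each node only needs its own ID and dimension to find its neighbors, which is what makes the topology scale to an arbitrary number of service agents without central coordination.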
Subjects:
Geospatial data -- Data processing.
Geographic information systems.
Hong Kong Polytechnic University -- Dissertations.
Pages: xv, 182 pages : color illustrations
Appears in Collections: Thesis
View full-text via https://theses.lib.polyu.edu.hk/handle/200/8755
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.