Big data : techniques and technologies in geoinformatics / edited by Hassan A. Karimi.
Publisher: Boca Raton : CRC Press, Taylor & Francis Group
Copyright date: 2014
Description: xiv, 298 pages : illustrations, maps ; 24 cm
Content type: text
Media type: unmediated
Carrier type: volume
ISBN: 9781466586512; 1466586516
Subject(s): Geography -- Data processing | Big data | Geographic information systems | Geospatial data | High performance computing
LOC classification: G70.2 .B54 2014
|Item type|Current library|Call number|Status|Date due|Barcode|Item holds|
|BOOK|NCAR Library Mesa Lab|G70.2 .B54 2014|Available| |50583020005983| |
Includes bibliographical references and index.
Contents:
Chapter 1. Distributed and parallel computing / Monir H. Sharker and Hassan A. Karimi
Chapter 2. GEOSS Clearinghouse : integrating geospatial resources to support the global earth observation system of systems / Chaowei Yang, Kai Liu, Zhenlong Li, Wenwen Li, Huayi Wu, Jizhe Xia, Qunying Huang, Jing Li, Min Sun, Lizhi Miao, Nanyin Zhou, and Doug Nebert
Chapter 3. Using a cloud computing environment to process large 3D spatial datasets / Ramanathan Sugumaran, Jeffrey Burnett, and Marc P. Armstrong
Chapter 4. Building open environments to meet big data challenges in earth sciences / Meixia Deng and Liping Di
Chapter 5. Developing online visualization and analysis services for NASA satellite-derived global precipitation products during the big geospatial data era / Zhong Liu, Dana Ostrenga, William Teng, and Steven Kempler
Chapter 6. Algorithmic design considerations for geospatial and/or temporal big data / Terence van Zyl
Chapter 7. Machine learning on geospatial big data / Terence van Zyl
Chapter 8. Spatial big data : case studies on volume, velocity, and variety / Michael R. Evans, Dev Oliver, Xun Zhou, and Shashi Shekhar
Chapter 9. Exploiting big VGI to improve routing and navigation services / Mohamed Bakillah, Johannes Lauer, Steve H.L. Liang, Alexander Zipf, Jamal Jokar Arsanjani, Amin Mobasheri, and Lukas Loos
Chapter 10. Efficient frequent sequence mining on taxi trip records using road network shortcuts / Jianting Zhang
Chapter 11. Geoinformatics and social media : new big data challenge / Arie Croitoru, Andrew Crooks, Jacek Radzikowski, Anthony Stefanidis, Ranga R. Vatsavai, and Nicole Wayant
Chapter 12. Insights and knowledge discovery from big geospatial data using TMC-pattern / Roland Assam and Thomas Seidl
Chapter 13. Geospatial cyberinfrastructure for addressing the big data challenges on the worldwide sensor web / Steve H.L. Liang and Chih-Yuan Huang
Chapter 14. OGC standards and geospatial big data / Carl Reed.
"Preface: What is big data? Due to increased interest in this phenomenon, many recent papers and reports have focused on defining and discussing this subject. A review of these publications would point to a consensus about how big data is perceived and explained. It is widely agreed that big data has three specific characteristics: volume, in terms of large-scale data storage and processing; variety, or the availability of data in different types and formats; and velocity, which refers to the fast rate of new data acquisition. These characteristics are widely referred to as the three Vs of big data, and while projects involving datasets that only feature one of these Vs are considered to be big, most datasets from such fields as science, engineering, and social media feature all three Vs.

To better understand the recent spurt of interest in big data, I provide here a new and different perspective on it. I argue that the answer to the question of "What is big data?" depends on when the question is asked, what application is involved, and what computing resources are available. In other words, understanding what big data is requires an analysis of time, applications, and resources.

In light of this, I categorize the time element into three groups: past (since the introduction of computing several decades ago), near-past (within the last few years), and present (now). One way of looking at the time element is that, in general, big data in the past meant dealing with gigabyte-sized datasets, in the near-past, terabyte-sized datasets, and in the present, petabyte-sized datasets. I also categorize the application element into three groups: scientific (data used for complex modeling, analysis, and simulation), business (data used for business analysis and modeling), and general"-- Provided by publisher.