Tuesday, 24 May 2011
Spatial Energy Spins Off European Subsidiary
Monday, May 23, 2011
Spatial Energy
Spatial Energy announced the opening of a European subsidiary. The company made the announcement at the 2011 EAGE (European Association of Geoscientists and Engineers) Conference & Exhibition in Vienna, Austria.
Chris Carlston, co-founder and Vice President of Sales for Spatial Energy in Boulder, Colorado, USA, has accepted the new position of Managing Director, Spatial Energy GmbH, which will be based in Vienna. Spatial Energy GmbH is the fourth global office opened since the parent company's founding in 2005.
With increased interest in oil and gas exploration and production, including activity in Africa as well as in shale gas plays in Eastern and Western Europe, Spatial Energy made the strategic decision to focus on building and strengthening relationships with key energy companies in Europe, Africa and the Middle East (EAME). By establishing an office in Vienna led by one of its principal executives, the company can provide dedicated sales and support for customers and partners in key markets throughout the EAME region.
"Long term customer relations and service are a critical part of our success. As our customers expand their operations globally, a subsidiary in Europe is our logical next step," said Bud Pope, President, Spatial Energy. "We see Europe and Africa as strong growth markets where the concept of enterprise imagery hosting and management is catching on fast. We're dedicated to ensuring that our current and future clients can rely on our remote sensing expertise no matter where their business takes them."
Carlston stated, "More and more, oil and gas companies are beginning to understand the value of integrating disparate spatial data sets and making them more accessible throughout their organizations. They also place a high value on service and support, which is the hallmark of our company. I look forward to serving the EAME market with the leading enterprise imagery and data management services to the Energy sector. With our global remote sensing applications and analytical offerings, and now with the opening of the Vienna office, I'll be able to provide our global customers with the consistent, personal attention they deserve."
DOCOMO Demonstrates the Effectiveness of Mobile Spatial Statistics for Disaster-Prevention and Urban Planning
Mobile spatial statistics are aggregate data about mobile phone locations and user attributes. In the use of such statistics, individual users are never identified.
A joint research project with Kogakuin University studied how DOCOMO's mobile spatial statistics could support disaster-prevention planning. The project was conducted across Tokyo from November 22, 2010 to March 31, 2011. The results of the study showed that spatial statistics can be used to estimate the distribution of people who would have difficulties returning home if a major earthquake centered on Tokyo were ever to occur.
The study focusing on urban planning, carried out with the University of Tokyo from November 1, 2010 to March 31, 2011, showed that mobile spatial statistics provided by DOCOMO can be used to evaluate suburban communities that have relatively large populations of daily commuters compared to the current scale of their public bus services.
The University of Tokyo and Kogakuin University will release the details of their respective studies with DOCOMO at future academic conferences.
The studies were two of DOCOMO's latest initiatives to contribute proactively to society through its mobile business.
Disaster-Prevention Planning Joint Research Project: http://www.nttdocomo.com/pr/files/20110524_attachment01.pdf
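For readers curious about the mechanics, here is a minimal sketch of the kind of privacy-preserving aggregation that mobile spatial statistics imply: location pings are snapped to grid cells, distinct users are counted per cell, and only cells whose counts clear an anonymity threshold are published. The data, cell size and threshold below are illustrative assumptions, not DOCOMO's actual method.

```python
from collections import Counter

# Hypothetical input: (user_id, lat, lon) pings; ids are discarded after counting.
pings = [
    ("u1", 35.6895, 139.6917),
    ("u2", 35.6890, 139.6920),
    ("u3", 35.7000, 139.7000),
]

CELL = 0.01          # grid cell size in degrees (roughly 1 km); illustrative choice
K_THRESHOLD = 2      # suppress cells with fewer distinct users than this

def cell_of(lat, lon):
    """Snap a coordinate to its containing grid cell (approximate, for illustration)."""
    return (round(lat // CELL * CELL, 4), round(lon // CELL * CELL, 4))

# Count distinct users per cell, then drop user ids entirely.
users_per_cell = Counter()
seen = set()
for user_id, lat, lon in pings:
    key = (user_id, cell_of(lat, lon))
    if key not in seen:
        seen.add(key)
        users_per_cell[cell_of(lat, lon)] += 1

# Publish only aggregate counts that clear the anonymity threshold.
stats = {cell: n for cell, n in users_per_cell.items() if n >= K_THRESHOLD}
print(stats)   # cells with only one user are suppressed, so no individual is identified
```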
Monday, 23 May 2011
Transforming Location Intelligence Into Profit
More businesses are realizing that technology is only as good as the content and quality of the data being managed. Location intelligence provides the capability to organize and understand complex processes through geographic relationships. At an enterprise level, it can optimize business processes to improve profitability and competitiveness, helping businesses use the principles of location to organize, reason, plan, and problem solve. Information becomes more valuable when it is successfully integrated into a process that results in better business decision making. Transforming location intelligence into profit is about transforming business processes to create more enterprise opportunities. The business use of location intelligence can be divided into the following three sub-categories, each designed to improve profitability.
Consumer Applications: These are enterprise applications that build loyalty among customers and influence purchasing behaviors. For example, retailers can execute store-specific promotions with more accuracy or use location intelligence to enhance loyalty program services.
Customer Service: Applications that facilitate customer service and self-service to improve the overall customer experience. For example, a government agency can more efficiently measure service levels or plan for the distribution of services that are in many cases dependent on variables that change over space, such as household income or number of children.
Enterprise Decision Support: Enterprise applications that help create optimal business strategies where the outcome is improved profitability. For example, identifying common customers and determining how to offer services to achieve the greatest value.
To access all possible revenue opportunities through location intelligence, a location intelligence provider should be able to provide the following:
All Validated Addresses in a Market: Creating the capability to determine the exact households in a serviceable market.
Mapping to the Property: Adding geo-targeted precision through geographic references to the property parcel or roof-top levels for contemporary location intelligence applications.
Residential, Business and Unit Information: Illuminating residential and office buildings with access to validated unit and floor information.
Address Management: Address validation, data cleansing, and data maintenance. The enterprise benefits from clean, current, and standardized address elements with geographically explicit references. Validation verifies the existence of individual addresses by matching them against a master address database of functional addresses.
Mapping Addresses: Transforming addresses to embed explicit references of location. This allows the enterprise to map customers and to understand the opportunities among business assets, programs and competitive threats.
Standardizing Addresses: Ensures a common address structure, syntax, and nomenclature. A corporate-wide level of standardization allows for more predictable levels of data quality and system performance.
Merging customer information from a master address database provides complete viewing of all addressed dwellings. This allows for product penetration views, even into Multiple Dwelling Unit (MDU) buildings. Changes to customer product and service levels by building can now be measured to assess marketing, sales, and operational activities. Using location intelligence helps a business understand fundamental characteristics about customers, prospects, and their potential relationships to an enterprise's revenue-generating operations. Location Intelligence has the capacity to optimize business processes to improve profitability and gain a competitive edge.
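To make the address-management steps above concrete, here is a minimal Python sketch of standardization plus validation against a master address database. The database, abbreviation table and addresses are invented for illustration; real solutions use far richer reference data.

```python
import re

# Hypothetical master address database: standardized address -> (lat, lon).
MASTER_DB = {
    "123 MAIN ST": (40.7128, -74.0060),
    "456 OAK AVE": (40.7200, -74.0100),
}

ABBREVIATIONS = {"STREET": "ST", "AVENUE": "AVE", "ROAD": "RD"}

def standardize(address: str) -> str:
    """Enforce a common address structure, syntax, and nomenclature."""
    tokens = re.sub(r"[^\w\s]", "", address).upper().split()
    return " ".join(ABBREVIATIONS.get(t, t) for t in tokens)

def validate_and_geocode(address: str):
    """Validate an address against the master database and attach coordinates."""
    std = standardize(address)
    coords = MASTER_DB.get(std)
    return {"input": address, "standardized": std,
            "valid": coords is not None, "location": coords}

print(validate_and_geocode("123 Main Street"))
# {'input': '123 Main Street', 'standardized': '123 MAIN ST',
#  'valid': True, 'location': (40.7128, -74.006)}
```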
Sunday, 22 May 2011
DoT wins award for GIS implementation
DUBAI - The Department of Transport, or DoT, in Abu Dhabi has won the Excellence Award in Geographic Information System (GIS) Implementation from GISTEC for the second successive time. The award was announced at GISWORX ’11, the 2011 GIS Workshops and Exhibition for Esri Users.
This excellence award was given to the DoT for its work in Geographic Information Systems in the Transportation Category and also for its implementation across the Emirate. At the GISWORX ’11, DoT showcased its activities and initiatives and provided important information on its GIS Web Applications. The DoT team also answered inquiries related to different topics such as the Intranet Portal, Corporate GIS Database and the developed spatial tools.
Abu Dhabi Systems and Information Centre (ADSIC), the GISWORX ’11 government partner and the organisation tasked with making the Abu Dhabi Government more effective and efficient in delivering modern, efficient and constituent-centric e-Government services, praised DoT’s efforts and initiatives through its participation as a successful and long-standing member of the spatial data community. ADSIC and the Department of Transport cooperate on Abu Dhabi’s spatial data programme, an ADSIC-administered e-government program to facilitate the sharing of geospatial data among government agencies and other stakeholders. GISWORX ’11 provided a unique in-depth learning environment on different aspects of GIS to both beginners and advanced users through several technical workshops on focused topics. The workshops and exhibitions help organisations take full advantage of their software investment by enhancing the knowledge and skills of their GIS professionals. They also address key issues related to GIS implementation in specific industries.
Saturday, 21 May 2011
Location Intelligence: Why is it Useful?
Posted by Kirsty in Blog, Business Intelligence
What is location intelligence?
Wikipedia:
Location Intelligence is the capacity to organize and understand complex phenomena through the use of geographic relationships inherent in all information. By combining geographic- and location-related data with other business data, organizations can gain critical insights, make better decisions and optimize important processes and applications. Location Intelligence offers organizations opportunities to streamline their business processes and customer relationships to improve performance and results.
LI enables business analysts to apply geographic contexts to business data.
It’s not surprising that organizations want to combine geographic and location data with traditional business data – employees, customers, facilities, inventory, vendors and suppliers, and other assets all have a location component. By combining geographic data with traditional business data, users are provided with the insights and context to make better business decisions.
What about online businesses? Well, Location Intelligence is still important for them too. Knowing where your customers come from allows you to tailor effective marketing campaigns. Knowing their location might also help you choose which medium or media channel to use to get your message out.
Context: how is Location Intelligence being used today?
- In Retail, location services like Foursquare provide the geospatial information that a customer is at or near a retailer’s store. Combining this with customer data such as preferences and purchase history allows retailers to make timely and relevant offers to consumers that can result in additional sales.
- Location is very important in the insurance industry, where customers and natural disasters are both tied to a location. When a natural disaster occurs, insurance companies have the ability to instantly understand their claims exposure by visually plotting their customer data and the affected area on a map. This also allows them to more accurately estimate the resources they will need to process claims in an affected area.
- Site selection, the decision about where to locate a new store or facility, is probably the most common application of Location Intelligence today. When location data is combined with available real estate data, demographic data, data on current customers, and information on prospective customers, the resulting Location Intelligence can help identify a site location with maximum revenue potential.
A recent research report, Location Intelligence: What can be Expected as BI embraces Location and Cloud, released in March this year by Saugatuck Technology, defines the benefit of LI in the following way: “Integration of Location (GIS) and standard BI platforms brings LI to greater usefulness by making it available as an option to anyone who is familiar with the more readily-available BI solutions, and without the need to master new concepts or a new user interface. Spatial relationships also greatly enhance many of the details commonly reported by BI systems, providing an added level of analysis that is useful in viewing and assessing trends (and existing data types).”
In summary, common applications of Location Intelligence include:
- Site selection
- Geographic impacts and factors for current and future developments
- Optimizing transit routes (e.g. the fastest transportation routes and enhancing on-the-move communications by mapping cell phone towers)
- Enabling effective forecasting (matching store locations with the size of surrounding populations can be used as a guide to determine potential profitability)
- Optimizing warehousing processes and stock flows based on the consumption rates of particular products by locality
- Customer clustering (a small worked example follows this list)
- Revenues / sales per country, state, region
- Postal addresses of target customers
- Marketplace gaps, opportunities, threats and level of penetration (by adding a time element, it is also possible to track and predict growth)
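As a concrete illustration of the customer-clustering item above, here is a minimal k-means sketch in pure Python that groups customer coordinates into geographic clusters. The coordinates are invented, and plain Euclidean distance on lat/lon degrees is only a reasonable approximation over small areas.

```python
import math
import random

# Hypothetical customer coordinates (lat, lon) in two distinct cities.
customers = [(51.50, -0.12), (51.51, -0.13), (51.52, -0.11),
             (53.48, -2.24), (53.47, -2.25), (53.49, -2.23)]

def kmeans(points, k, iters=20, seed=0):
    """Tiny k-means: assign each point to its nearest centroid, then recompute."""
    random.seed(seed)
    centroids = random.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            i = min(range(k), key=lambda c: math.dist(p, centroids[c]))
            clusters[i].append(p)
        centroids = [
            tuple(sum(dim) / len(cl) for dim in zip(*cl)) if cl else centroids[i]
            for i, cl in enumerate(clusters)
        ]
    return centroids, clusters

centroids, clusters = kmeans(customers, k=2)
for c, members in zip(centroids, clusters):
    print(f"cluster centre {c}: {len(members)} customers")
```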
Usefulness by business function
Here are some examples of how LI can be useful in different departments of a business:
Planning and construction
Thursday, 19 May 2011
U.S. Will Not Pay $25 Million Osama Bin Laden Reward, Say Officials
No one will receive the $25 million reward for the capture of Osama bin Laden, say U.S. officials, because the raid that killed the al Qaeda leader in Pakistan on May 2 was the result of electronic intelligence, not human informants.
"We do not expect a reward to be paid," said a senior U.S. official familiar with the bin Laden hunt, meaning that the $25 million bounty offered by the U.S. under the Rewards for Justice program after the 9/11 terror attacks will probably remain uncollected.
The reason is simple, say officials involved in or knowledgeable about the hunt for the world's most wanted man: the CIA and the military never had an al Qaeda operative as an informer willing to give him up. Instead, what killed bin Laden was electronic surveillance, and an operational mistake by one of his closest associates. After a slow drip of intelligence year after year, a final flurry of data collection and analysis brought a team of SEALs to bin Laden's Abbottabad compound on a moonless night.
In previous manhunts, such as the capture of 9/11 mastermind Khalid Sheikh Mohammed and former Iraqi dictator Saddam Hussein or the killing of his two sons, U.S. intelligence and military commandos had the help of insiders and human sources. In each case, someone received millions of dollars in reward money for their efforts.
The long and sometimes circular path that led to bin Laden was paved by satellites, drones, phone surveillance and luck. The CIA declined to comment on specific intelligence methods, but U.S. officials have said the intelligence was a "mosaic" and "multi-streamed," meaning from every avenue in the government's arsenal, the strongest of which is still the technological wizardry of the CIA, the National Security Agency, the National Geospatial-Intelligence Agency and the Department of Defense.
Bin Laden's survival for nearly ten years was the result of the limits of American power and intelligence--the inability to recruit sources inside al Qaeda or its support networks in Pakistan--and his death was the result of the overwhelming superiority of American electronic, signals, and technological capabilities.
By the summer of 2009, the trail for bin Laden had gone cold. The CIA simply had no tangible evidence of any place he'd been since he'd slipped away from U.S. air attacks in his redoubt in Afghanistan's Tora Bora mountains. Marty Martin, a former top CIA official who led the hunt for bin Laden from 2002 to 2004, said that for years his colleagues were baffled as to where the fugitive had hidden.
"We could see from his videos what his circumstances were," Martin said of bin Laden's video messages that were released in the years after his Tora Bora escape. "In the immediate years afterward he looked battle fatigued and on the run. He didn't look healthy. We knew he was moving. But where? We simply didn't know. Then, he gained weight and looked healthy. I told my analysts, 'He's gone urban, moved somewhere stable and safe.' "
During all the years the trail went cold, the CIA had been unable to develop a human source inside al Qaeda or inside their support network. Several former intelligence officials involved in the hunt for bin Laden said developing a spy inside bin Laden's inner circle was never very likely because of the level of commitment his followers possessed. The man who turned in Khalid Sheikh Mohammed was an Afghan informer who provided low-level support to the al Qaeda chief of operations, not a fellow operative. Beyond that, the CIA tried to monitor those who facilitated communications and operations for al Qaeda, while learning as much as they could from detainees.
But in 2009, the CIA caught a break. The Pakistani intelligence service, known as the ISI, delivered a gift: a cellphone number they gathered when they recorded a call made from Pakistan to the Middle East. The number belonged to an al Qaeda courier that the CIA had long been searching for, Abu Ahmed al Kuwaiti.
Osama Bin Laden's Courier Makes A Mistake
After the 2009 phone call that the Pakistanis tapped, however, al Kuwaiti's number went dark. But the courier had exposed himself, and the CIA suspected that if they could find where al Kuwaiti lived, they might be about to find bin Laden.
The call had located al Kuwaiti in northwest Pakistan and gave the CIA a starting point for a renewed hunt. A year later, in the summer of 2010, despite fastidious operational security by al Kuwaiti -- he normally drove 90 minutes from the compound before inserting the battery in his cellphone, preventing signals intelligence from pinpointing his starting point -- he made a twofold mistake. For the first time in almost a year, he used the cellphone SIM card that U.S. intelligence had linked to him, and he made a call with that SIM card close to bin Laden's compound.
The National Security Agency, the world's most powerful signals intelligence organization, had been waiting to pounce on any calls made from that SIM card since 2009. The NSA picked up the call and located al Kuwaiti in Abbottabad, Pakistan. They were even able to pinpoint the neighborhood the call had probably come from. From there, the CIA and National Geospatial-Intelligence Agency (NGA) began searching aerial satellite photographs to deduce which house would likely be bin Laden's.
When they discovered a newer building with high perimeter walls, custom construction and a third floor terrace wall of seven feet--the CIA knew they had their target. The search was almost over.
By August of 2010, CIA director Leon Panetta briefed President Obama and had a new stealth drone begin flights over the compound, undetected by Pakistani air defenses. The CIA was sure a high-value target lived in the compound, and given al Kuwaiti's relationship to bin Laden--learned in bits and pieces from interrogations of captured detainees since 2002 -- was "60 to 80 percent" sure bin Laden was hiding in the compound, according to Panetta.
The CIA, the NGA and the Pentagon studied reams of signals intelligence, electronic emissions, infrared technology, almost all from drones and satellites, in order to learn the compound's construction and the number of people living inside. Intelligence analysts even studied the water tables underneath the Abbottabad valley to determine whether it was likely bin Laden had built an escape tunnel underneath the house.
"We were pretty sure it was too wet to build a tunnel," one US official familiar with the CIA's intelligence said.
Martin, the retired CIA official, said bin Laden also undoubtedly learned from his al Qaeda operatives' mistakes.
"He was not stupid. If you see your men killed by drones or captured, you learn from experience what kind of entourage to have and how to change your profile."
Bin Laden had taken away all the signs of his importance that for years the CIA had searched for from the sky: armed guards, rings of protection, transportation convoys -- he left it all behind and hid behind an 18-foot wall for five years. When the Navy SEALs eventually stormed the compound, only a few rifles and handguns were seized. He had dropped virtually all his protection so that spy satellites and drone surveillance would be unable to differentiate his compound from any other in the area. The SEALs also found, and killed, the courier whose single errant phone call, snapped up in a web of electronic surveillance, had led them to Abbottabad.
What Is Location Intelligence Anyway?
by Steve Benner
What is Location Intelligence anyway?
How is it different from:
- Location-based Services (LBS)
- Location-based whatever (LBx - you supply the "x")
- Geographic Business Intelligence (GBI)
- Geospatial Business Intelligence (GBI again?)
- Geo-Business Intelligence (GeoBI)
- Geographic Information System (GIS)
- Spatial Information Management (SIM)
A simple Google search on any of these terms will likely lead to more confusion than clarity on the topic. The terms are used by a wide variety of vendors to differentiate themselves, at least in marketecture. The list will likely grow as the popularity of mapping within BI and other business applications grows. Just playing with the words spatial, geospatial, geographic, geo-, location, location-based, business, and intelligence got me a dozen combinations that had a nice ring to them. So we still have a ways to go. There are at least five new categories waiting for some marketing genius to exploit!
The ingredients of location intelligence are debated even by those in the industry. How confused must the BI community be about it? Maybe they don't think about it, blissfully (dangerously?) unaware of what it takes to deliver a LI solution? Dots on maps is what they know and they are happy with it. To them, this is LI, or GBI, or GeoBI, or LBS, or ... whatever.
To help this topic along, let's start with some basic ingredients that we can collectively modify and blend in our own measure to see what we can bake into a definition of LI. Here is a basic list of ingredients:
- Standard or custom geometries such as points, lines, or polygons (states, my territories).
- Information about each geometric feature (demographics for Maine, territory 3 sales).
- Visualization of the geometries and their associated information as maps, both paper and electronic.
- Analysis of spatial relationships, both intuitive and computerized (see the sketch after this list).
- And the most important: a problem whose solution can be improved by blending the right amounts of the above.
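As a toy example of blending a few of these ingredients -- a custom geometry, information attached to features, and computerized spatial analysis -- here is a sketch using the shapely Python library (pip install shapely). The territory and customer locations are invented.

```python
from shapely.geometry import Point, Polygon

# Ingredient 1: a custom geometry -- a sales territory as a polygon.
territory = Polygon([(0, 0), (4, 0), (4, 4), (0, 4)])

# Ingredient 2: information attached to geometric features.
customers = {
    "acme":    Point(1, 1),
    "globex":  Point(5, 5),
    "initech": Point(3, 2),
}

# Ingredient 4: computerized analysis of a spatial relationship.
inside = [name for name, pt in customers.items() if territory.contains(pt)]
print("customers in territory:", inside)   # ['acme', 'initech']
```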
If you are in the LI business, share your favorite recipe! Give us the chef's secrets about each ingredient, how to prepare it, and how to bake it all into a solution. If you are new to LI, tell us what you're hungry for whether it's a big juicy problem or just an appetizer. If we're successful here, we'll end up with quite a cookbook, one full of problems and recipes to solve them.
If you don't care to comment here but would like to share some thoughts or longer comments privately, send them to inquiries@tlii.org.
We look forward to your feedback!
steve
Wednesday, 18 May 2011
Pacific Hydro taps the power of GIS
With at least $A1.6B in new initiatives underway across the world, Pacific Hydro has sought to marry the collaborative qualities of SharePoint with its geographic information system (GIS) server to make geospatial data more broadly available in offices in Australia, Chile, and Brazil.
The clean energy future starts now with Australia’s Pacific Hydro, a global developer and operator of clean energy projects that include solar, geothermal, hydro and wind.
Maja Barnett, GIS Co-ordinator at Pacific Hydro, estimates that over 95% of documents across the organisation have some sort of geospatial reference and has cited this as the reason for developing a system to spatially enable documents stored within SharePoint.
SharePoint 2007 and an ESRI Geographic Information System (GIS) server have been in place at the organisation since 2009; however, initially there were no links between the two platforms.
Limitations of a centralised approach to GIS became evident as different departments failed to receive updates to geospatial data, meaning their maps could be out of date.
SharePoint documents could not be searched spatially or by individual asset; instead, multiple documents for an individual asset or geographic location had to be searched individually, often within different document libraries.
Spatial relationships between documents could not be viewed, and spatial information stored within documents could not be analysed.
A small team of GIS professionals located in Melbourne and Santiago, Chile, undertook a project in 2010 to correct this by spatially enabling existing data for analysis.
Rather than have staff rely on maps and plans that must be generated by a small number of dedicated GIS personnel, the company also wanted to open it up to more widespread access.
“We treat our GIS as a centralised repository of information; it's not just environmental data - planning, legal and construction all feed information into the GIS,” said Barnett.
The first objective was to open up GIS data stored within geodatabases and provide a dynamic link to the mapping interface within SharePoint.
Visual Fusion from IDV Solutions was selected to link SharePoint and ESRI to enable this. Melbourne-based IT integrated solutions company Geomatic Technologies (GT) undertook the project to integrate the data sources, which ensures that any updates to the GIS data are now automatically reflected in the mapping interface.
Staff can now view the location of assets, for instance within a wind turbine power generator, and when selecting assets view additional information stored within the geodatabase.
Unlike other Internet mapping solutions, any data updated within the geodatabase is immediately reflected in the mapping interface.
“Making spatial data more widely available has allowed individual departments to be better informed and resulted in improved decision-making,” said Barnett.
Providing the ability for documents to be “spatially searched” was another objective.
Many documents that are specific to individual assets or a particular geographic location now have a metadata field that links them to assets stored within the GIS geodatabase. Geospatial data is manually added to documents as they are uploaded to SharePoint.
This provides for improved document search functionality as well as better efficiency and workflows.
“It also provides a user friendly interface for document searches,” said Barnett.
The move to spatially enable data has provided major benefits, for instance being able to analyse occupational health & safety incidents by location. Events are traditionally recorded in an Excel spreadsheet; when this is migrated to a SharePoint List, the location co-ordinates (latitude/longitude) are added to enable this.
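A minimal sketch of the kind of step described above: turning spreadsheet incident records with latitude/longitude columns into a GeoJSON layer that a mapping interface can display. The incident data is invented, and this is not IDV's Visual Fusion pipeline, just the general idea.

```python
import csv
import io
import json

# Hypothetical incident export; in practice this would come from the
# spreadsheet / SharePoint list described above.
raw = """incident,latitude,longitude
Slip on access road,-38.32,142.94
Turbine blade strike,-38.31,142.95
"""

features = []
for row in csv.DictReader(io.StringIO(raw)):
    features.append({
        "type": "Feature",
        "geometry": {
            "type": "Point",
            # GeoJSON coordinate order is [longitude, latitude]
            "coordinates": [float(row["longitude"]), float(row["latitude"])],
        },
        "properties": {"incident": row["incident"]},
    })

print(json.dumps({"type": "FeatureCollection", "features": features}, indent=2))
```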
The success of the system comes from empowering non-GIS users in the business to access and contribute toward the physical location of assets, documents and incidents.
The Visual Fusion solution deployed by GT on the SharePoint platform has provided the tools to effectively extend the mapping and visualisation of business data without the need for all users to be trained in GIS.
All of Pacific Hydro’s operational wind farm projects are currently held in the system, and the company hopes to extend this in the future to include all projects in both development and operation. Another item on the agenda is the addition of reporting tools that can, for instance, raise alerts for health and safety incidents.
Geographic profiling as a novel spatial tool for targeting infectious disease control
Geographic profiling is a statistical tool originally developed in criminology to prioritise large lists of suspects in cases of serial crime. Here, we use two data sets - one historical and one modern - to show how it can be used to locate the sources of infectious disease.
Results: First, we re-analyse data from a classic epidemiological study, the 1854 London cholera outbreak.
Using 321 disease sites as input, we evaluate the locations of 13 neighbourhood water pumps. The Broad Street pump - the outbreak's source - ranks first, situated in the top 0.2% of the geoprofile.
We extend our study with an analysis of reported malaria cases in Cairo, Egypt, using 139 disease case locations to rank 59 mosquitogenic local water sources, seven of which tested positive for the vector Anopheles sergentii. Geographic profiling ranks six of these seven sites in positions 1-6, all in the top 2% of the geoprofile.
In both analyses the method outperformed other measures of spatial central tendency.
Conclusions: We suggest that geographic profiling could form a useful component of integrated control strategies relating to a wide variety of infectious diseases, since evidence-based targeting of interventions is more efficient, environmentally friendly and cost-effective than untargeted intervention.
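The paper's method builds on criminal geographic targeting models such as Rossmo's; as a rough illustration of the underlying idea, here is a simplified distance-decay scoring sketch in Python that ranks candidate sources by their proximity to known case sites. The coordinates, decay exponent and the omission of the buffer-zone term are all simplifying assumptions, not the authors' actual model.

```python
import math

# Known disease sites (e.g., case locations), as (x, y) in km.
cases = [(1.0, 1.2), (1.4, 0.9), (0.8, 1.1), (1.1, 1.5)]

# Candidate sources to rank (e.g., water pumps).
candidates = {"pump_A": (1.1, 1.2), "pump_B": (3.0, 3.0), "pump_C": (0.2, 2.5)}

def score(source, cases, f=1.2):
    """Simple distance-decay score: nearer case sites contribute more.
    Full geographic profiling models (e.g., Rossmo's CGT formula) also
    include a 'buffer zone' term; this sketch keeps only the decay part."""
    total = 0.0
    for cx, cy in cases:
        d = math.hypot(source[0] - cx, source[1] - cy)
        total += 1.0 / (max(d, 0.05) ** f)   # clamp to avoid division by zero
    return total

ranked = sorted(candidates, key=lambda k: score(candidates[k], cases), reverse=True)
print(ranked)   # highest-scoring candidate first, e.g. ['pump_A', ...]
```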
Authors: Steven Le Comber, D. Rossmo, Ali Hassan, Doug Fuller, John Beier
Credits/Source: International Journal of Health Geographics 2011, 10:35
Geospatial-Intelligence Agency helps eyeball Mississippi floods
Bob Brewin 05/17/2011
The National Geospatial-Intelligence Agency, which helped pinpoint the Pakistan hideout of Osama bin Laden, this week is using its tools, systems and analysts to help assess potential impact of Mississippi River floods on bridges, roads and other critical infrastructure in the South, a senior agency analytical manager told Nextgov.
The manager, who declined to be identified for security reasons, said NGA is also helping the Federal Emergency Management Agency eyeball and map debris fields in Alabama in the wake of the hundreds of tornadoes that battered that state last month.
NGA maintains detailed information on the country's critical infrastructure in its Homeland Security Infrastructure Program, which was set up in 2001 to serve as a clearinghouse of mission-critical geospatial and remote sensing information needed to reduce response and recovery times in the event of a natural or terrorist-caused disaster within the United States.
An internal NGA presentation shows that this database includes imagery data at a resolution of 1 foot or better, along with elevation data and vector data on critical infrastructure.
The NGA manager said the agency has tapped this database in support of FEMA to provide precise information, for example, on bridges that could be overwhelmed by the floods. This database, the manager said, details the height and width of those bridges, which becomes important as the Mississippi flood crest moves downstream and wide swaths of rural Louisiana in the Atchafalaya River basin become inundated following this week's deliberate opening of floodgates on the Morganza Spillway, 40 miles north of Baton Rouge.
NGA said in a press release that it is supporting FEMA, the Homeland Security Department and the Army Corps of Engineers by producing models to predict the effects of releases from the spillway.
NGA's analyses include predicted and actual effects on critical infrastructure including roads, railways, airports, hospitals, Red Cross and other emergency facilities, power plants, piers and port facilities, petroleum refineries, and other industrial facilities, schools, water supplies and more.
"NGA provides a common operating picture that enables FEMA and emergency responders to work together more effectively and efficiently," said Philip J. Plack, NGA liaison to FEMA.
NGA also uses unclassified imagery data from commercial sources and the National Oceanic and Atmospheric Administration, as well as classified imagery data, to produce for FEMA what the manager called tailored products on the impact of the floods on bridges, roads, railroads, power plants and other key infrastructure, with all imagery information provided at the unclassified level.
These products are not delivered to FEMA as maps, the manager said, but rather as geographical information system data overlays to FEMA GIS systems that, if called upon, can output paper maps.
The manager said two NGA employees in Alabama are providing FEMA with data on the precise path of the tornadoes based in part on analysis of weather radar imagery. NGA also has two employees stationed at the FEMA regional headquarters in Denton, Texas, which is managing the larger response to the Mississippi floods.
Monday, 16 May 2011
How A Location Intelligence Solution can Help Mortgage Lenders Navigate Through Turbulent Times
Download the Entire White Paper
Due to the 2008 “shake-out” of the banking industry, along with the general decline in the housing market, mortgage lending has become an even more competitive business today. During the peak of the housing market, many lending institutions provided subprime loans. As a result, these same mortgage lenders are now dealing with excessive foreclosures, or the threat of future foreclosures, as these bad loans begin to “reset” to unaffordable interest rates. Now, more than ever, mortgage lenders need a scalable process to find the right prospect or customer. If they cannot, they will either be acquired or forced to shut down their branches.
Given that location comprises approximately 80% of a mortgage lender’s data warehouse, deploying a lightweight geospatial mapping solution provides the most economical method of targeting the right borrowers. Unlike tabular data or Excel spreadsheets, maps provide an intuitive interface where users of all types can gain immediate business insights:
- Which competitors are originating the most loans within a 1 mile radius from my local branch?
- Can I prove to a compliance officer that my local branch is servicing low-income zip codes in good faith?
- Which branches can be closed due to a lack of demand?
- Where are my best customers to up-sell better HELOCs?
- Within a 2 mile radius of my local branch, where are the target customers for first-time mortgages? (e.g., Renters with good credit scores, starting families, etc.)
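As an illustration of the first question above, here is a minimal Python sketch that answers a radius query with the haversine formula. The branch and loan locations are invented, and a production system would use a spatial index rather than a linear scan.

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in miles."""
    r = 3958.8  # Earth radius in miles
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

branch = (41.8781, -87.6298)           # hypothetical branch location
loans = [                              # hypothetical competitor originations
    {"lender": "Bank A", "lat": 41.8800, "lon": -87.6300},
    {"lender": "Bank B", "lat": 41.9500, "lon": -87.7000},
]

within_1_mile = [
    l for l in loans
    if haversine_miles(branch[0], branch[1], l["lat"], l["lon"]) <= 1.0
]
print(within_1_mile)   # only Bank A's origination falls inside the radius
```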
Real-Time Space Shuttle in Google Earth
You can now track the space shuttle during launch and landing in Google Earth using real-time data from Mission Control.
You will need Google Earth to use this file. Don’t have Google Earth? Download it here.
After you have installed Google Earth, download the live groundtrack file here.
You will be prompted to save/open the file. Select the "Open" option, and the file will automatically open in Google Earth, if you have it installed.
Using live shuttle data, a 3D model of NASA’s space shuttle is plotted in Google Earth to show its current position and trajectory.
Google Earth displays the world in 3D with satellite imagery, and you can also visualize geospatial data. The space shuttle trajectory - the path that the vehicle flies - is shown as a yellow line representing the path the shuttle has flown so far. Mission events are shown as colored dots on the yellow line when and where they occurred. Examples of mission events include:
- Solid Rocket Booster Separation
- Main Engine Cut-Off (MECO)
- External Tank Separation
- Speed milestones
- Altitude milestones
- Landing events
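Under the hood, files like this are KML documents. As a rough sketch of the format (not NASA's actual file), here is a Python snippet that writes a tiny KML groundtrack with one mission-event placemark; all coordinates are invented.

```python
# Minimal KML: a LineString for the flight path plus a Placemark for an event.
KML = """<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://www.opengis.net/kml/2.2">
  <Document>
    <Placemark>
      <name>Groundtrack</name>
      <LineString>
        <!-- lon,lat,alt triples along the flight path -->
        <coordinates>
          -80.60,28.60,0 -80.20,28.90,20000 -79.50,29.40,45000
        </coordinates>
      </LineString>
    </Placemark>
    <Placemark>
      <name>Solid Rocket Booster Separation</name>
      <Point><coordinates>-80.20,28.90,20000</coordinates></Point>
    </Placemark>
  </Document>
</kml>
"""

with open("groundtrack.kml", "w") as f:
    f.write(KML)   # open this file in Google Earth to see the track
```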
Auto-Follow automatically moves the view with the space shuttle as it launches or lands. During the dynamic launch and entry phases of flight, the view is normally set to a Chase View or Low View. This is the default option and is recommended for passively monitoring the launch or landing ("set it and forget it") or for users not yet familiar with navigating in Google Earth.
Manual Control lets the user control the view within Google Earth; it will not automatically follow the space shuttle. This enables users who are already comfortable with Google Earth's navigation controls to pan and zoom to whatever view they desire.
If you click on the colored dots along the trajectory, a balloon will open to display information about the shuttle's flight at that point in the trajectory.
Sunday, 15 May 2011
Do Maps Have Morals?
from TechnologyReview.com
By Daniel Charles June 2005
On a snowy morning in early March, looking for the frontiers of digital mapmaking, I hopped into the back seat of an SUV sporting a Global Positioning System receiver on the roof. In front sat two representatives from Navteq, one of the companies that builds the street maps that you see on MapQuest. Phil Satlof, senior geographic analyst, operated a laptop computer hooked up to the GPS receiver.
Looking at the computer's screen, I felt we'd stepped inside a video game. A flashing arrow showed our progress through the familiar grid of Washington's streets. I watched, fascinated, as it marched down a line marked "River Road" into Montgomery County, Maryland.
Half an hour later, in the wealthy suburb of Potomac, the arrow reached the limits of its knowledge. On the screen, the road ended. But our vehicle kept moving, around three new cul-de-sacs in a barren landscape of newly graded dirt, monstrous half-built houses, and yawning holes waiting for foundations. As we drove, the flashing arrow traced our route, expanding Navteq's map of the navigable universe. Satlof added, by hand, what the GPS receiver couldn't see: house numbers, one-way streets, and anything else that pizza delivery drivers may need to know. "We're really a routing company, and as a by-product, we make a map," he explained.
A few hours later and a few miles away, Apollo Teng, manager of Montgomery County's office for geographic information systems, sat down at his computer and retrieved his own map of the area we'd just visited. I saw, once again, those three cul-de-sacs, stretching like fingers toward the Potomac River. Another click of the mouse, and we saw an aerial photograph of the scene, precisely aligned with the street map. Then property lines appeared on the map, as if by magic. When Teng clicked on a property, up came information about its owner, its tax assessment, and its most recent sale. Teng added other features: paths of sewer pipes; areas of protected wetlands; a boundary between watersheds.
Teng's database contains more than a hundred different "layers" of information that he can add to the digital map, each layer showing different aspects of the landscape. Any piece of information that comes attached to a street address or latitude-longitude coördinates can slide effortlessly into this visualization: a neighborhood's median income, its history of robberies, even its residents' contributions to political campaigns.
Welcome to the astonishing world of modern mapmaking--what insiders call geographic information systems, or GIS. In fact, it's more than mapmaking. It's a way of organizing information about anything that happens at a particular geographic location. That includes real-estate development, military operations, logging, farming, oil drilling--the list goes on and on. GIS lets companies use mailing addresses to build maps of their customer bases, environmentalists study the effects of climate change on vegetation and glacier movement, and medical researchers investigate links between contaminated drinking water and cancer incidence. Some devotees of the technology say that it's more than just a useful tool. They call it a new language, one that allows us to understand, and improve, our planet.
The Power of Pictures
The most vocal advocate of the benefits of GIS is also the world's leading seller of GIS software and services: Jack Dangermond, founder and coöwner of a Redlands, CA, company called the Environmental Systems Research Institute (ESRI). (Its motto: "Better decisions through modeling and mapping our world.") Every summer, Dangermond presides over the ESRI User Conference, an international geography jamboree that brings together thousands of digital mapmakers from around the world. ESRI publishes some of the finest work displayed at the conference in a glossy "map book," released annually. In recent years, these books have borne such high-minded titles as Sustaining Our World and Serving Our World.
This grandiose vision comes from Dangermond himself. "Our science is making a better world for human existence and economic development and arguably could be something that counterbalances everything negative about globalization," he says.
The power of GIS, Dangermond argues, is that it lets us witness the world, from deforestation in the Amazon to crime in local neighborhoods. And having seen what's happening, we can imagine changing it. GIS, Dangermond believes, "will allow us to create a better future."
According to many, digital mapmakers have had an idealistic streak from the beginning. The most comprehensive volume on GIS, The History of Geographic Information Systems, opens with an essay by landscape architect Ian McHarg, who advocated "transparent-overlay maps" in the 1960s as a way for planners to see more clearly the aspects of nature--forests, wildlife, and marshes--that new roads and buildings would obliterate.
Those physical overlay maps inspired a generation of environmentalists, including Dangermond, who studied landscape architecture at Harvard University. The idea of creating maps from layers of data became the heart of GIS, and it's the secret of its power. Using software like Dangermond's, people could combine census information, satellite photos, and many other types of data to reveal relationships that were never obvious before.
At first glance, the value of GIS seems self-evident. It's hard to imagine a more innocent and enlightening technology than a map. Maps reveal the truth about our world, and the truth, as the saying goes, will set us free. Or will it?
Nazis, Soviets, and Software
I slide a copy of The History of Geographic Information Systems across the table toward
historian John Cloud, and he recoils as if it were something toxic. "The enemy," he mutters with
a twisted smile, only partly joking.
Cloud considers this book a "cover story," a misleading history that gives university-based
scientists more credit for GIS than they deserve. The real roots of digital mapping, he says, reach
back to the Cold War and to the U.S. Defense Department's secret campaign to assemble
accurate maps of nuclear targets in the Soviet Union.
Before taking his current job as a historian for the National Oceanic and Atmospheric
Administration, Cloud spent more than a decade assembling an alternative genealogy of GIS,
showing military planners, not idealistic landscape architects, to be its fathers. In the 1950s, the
Defense Department recruited scientists to determine the exact distances between the earth's
continents--essential for aiming intercontinental ballistic missiles. Later, Pentagon officials sent
the first remote-sensing satellites aloft to photograph "denied territories" inside the Soviet Union.
In the 1960s, the Pentagon converted those images into digital data, and in the 1980s, the U.S.
Air Force launched the Global Positioning System, the essential tool for today's mapmakers.
These military projects were the pillars on which geographic information systems were built,
Cloud says. In scale and sophistication, they dwarfed anything accomplished in the civilian
world at that time. And the world imagined in these maps was not one of environmental
sustainability but one of nuclear war.
As for Ian McHarg's transparent-overlay maps, intended to help preserve nature and facilitate
more-livable cities--well, that, too, is a nice-sounding cover story, says Cloud. There were other
3
forerunners of layered digital maps, he says, including some that were used for less uplifting
purposes than McHarg's.
Searching through archives and old cartography publications, Cloud found several overlay maps
from the 1930s and 1940s. They were, he says, "the most complex and accomplished uses of
overlays yet found." One set, prepared by federal officials during the New Deal, depicted
American cities and showed, with different translucent layers, data about problems such as high
concentrations of decrepit buildings. Later maps, concealed for many years from public view,
carried fateful red lines that enclosed blocks occupied mainly "by any distinct racial, national, or
income group that would be considered an undesirable element if introduced into other parts of
the city," in the words of a 1936 document cited by Cloud. Thus was born the term "redlining"
(say, charging residents of targeted areas more for loans or insurance). Yet Cloud has found no
evidence that others adopted these innovative mapmaking techniques and applied them more
widely. Apparently, they were used and then abandoned.
While interviewing Lawrence Ayers, former deputy director of the U.S. Defense Mapping
Agency, Cloud learned of another set of overlay maps that may have fallen on more-fertile
ground. The maps were created by the German military during World War II and captured by
American forces near the end of the war. They were composed of transparent sheets--sometimes
20 or more--showing such things as vegetation, soil, and road surfaces. According to Ayers, the
Defense Department's own mapmakers quickly saw the value of this technique and adopted it
themselves, first applying it to physical maps, then to digital sets of data. "The concept of the
overlay is what the software writers picked up and used to take advantage of digital technology,"
says Ayers. "It goes back to the Germans."
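In modern software terms, those transparent sheets become co-registered grid layers combined cell by cell. A minimal sketch of the same overlay idea, with hypothetical layers and a hypothetical suitability rule:

```python
# A minimal sketch of the overlay concept in digital form; the layer grids and
# the combination rule are hypothetical, not any historical system.
import numpy as np

# Three co-registered thematic layers over the same small grid of terrain.
vegetation = np.array([[1, 0],
                       [0, 1]])  # 1 = vegetated
soil       = np.array([[1, 1],
                       [1, 1]])  # 1 = stable soil
roads      = np.array([[0, 0],
                       [1, 0]])  # 1 = existing road surface

# Stacking "transparent sheets" becomes a cell-by-cell boolean combination.
suitable = (soil == 1) & (vegetation == 0) & (roads == 0)
print(suitable)  # True only where every layer permits development
```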
Ayers and Cloud make an odd pair of allies. But the two of them--one a retired defense official
and corporate executive, the other a sandal-wearing former academic and environmental activist--agree the Defense Department laid the foundations for what's now called GIS. It created the
earliest digital maps, and its contracts "pumped money," as Ayers puts it, into several companies
that now play leading roles in the GIS industry.
Showing the Way
Some historians and GIS pioneers, however, dismiss this version of GIS history with a mixture
of irritation and disdain. Nicholas Chrisman, from Laval University in Quebec City, says the
Pentagon produced little, apart from the Global Positioning System, that the commercial world
ever found useful. And none of the early developers of GIS software even knew about the
physical overlay maps of the New Deal or the Nazi era, he says, while they certainly did know
about McHarg's. ESRI's Dangermond, for his part, says his company knew little about the
military's work and profited even less from it.
The Pentagon didn't invent the entire field of GIS, as Cloud implies. Yet his search for the dark,
hidden ancestors of modern mapmaking illustrates something simple and true: maps--like
technological progress itself--are not inherently benevolent.
Even Dangermond, when pressed, concedes the point. "I'm not political about how technology
gets used. It gets used," he says. "My own interest was obviously in the area of environmental
things. But it gets used by everybody."
The consequences of those uses vary. Six months ago, relief workers used digital maps to find
their way through areas devastated by the Indian Ocean tsunami. The U.S. Air Force relies on
such maps in Iraq. Aerial photographs and digital mapmaking tools are allowing the
governments of Uruguay and Brazil to survey and sell off vast tracts of land. "Sitting there in
Arlington, Virginia, you can buy land in Brazil," says Christopher Simpson, a professor of
communications at American University in Washington, DC, who's been studying current uses
of remote sensing in Latin America. In theory, Brazilian peasants can buy the land they currently
till. But in practice, Simpson says, the best properties will be snapped up by "those with the most
resources, who are best organized, with the best overview." In other words, those with access to
digital maps of millions of unclaimed acres.
Geographic information systems extend the reach of the human imagination, but in the end, they
mainly help people do what they wanted to do in the first place. They're tools for preserving
nature or destroying it, for defending human communities or obliterating them, for empowering
or impoverishing. Maps can show us the way, wherever we choose to go.
Daniel Charles reported on technology for National Public Radio and wrote Master Mind: The
Rise and Fall of Fritz Haber, the Nobel Laureate Who Launched the Age of Chemical Warfare.
Crisis mappers: Mobile technology helps disaster victims worldwide
There are now 6.8 billion people on the planet. And about 5 billion cell phones.
This extraordinary ability to connect has turned a modern convenience into a lifeline through a system called crisis mapping. It first gained prominence after the earthquake in Haiti, when people used their cell phones to send text messages to a centralized response team. Since then, crisis mapping has been used to help victims in emergency zones following the tornadoes in the Midwest, the earthquake in Japan and the unrest in the Middle East.
Today, there are hundreds of volunteers in more than 50 countries creating maps of crises around the world, using a system that incorporates the lessons learned in Haiti.
National Geospatial-Intelligence Agency Aids Flood, Tornado Disasters
The National Geospatial-Intelligence Agency is supporting the Federal Emergency Management Agency and other agencies in responding to the flooding and tornadoes that have caused catastrophic damage in the United States.
In the aftermath of the tornadoes, NGA has provided damage assessments and other geospatial intelligence products crucial in disaster response and debris cleanup in Alabama, Arkansas, Georgia, Mississippi, North Carolina, Tennessee and Virginia. NGA is providing daily updates in support of FEMA requirements.
In connection with the Mississippi River flooding, NGA has been producing both predictive analyses and damage assessments as water levels rise and diversion efforts proceed along the river.
NGA supported FEMA, the Department of Homeland Security and the U.S. Army Corps of Engineers by producing models predicting the effects of releases from the Morganza Spillway in Louisiana.
NGA's analyses include predicted and actual effects on critical infrastructure including roads, railways, airports, hospitals, Red Cross and other emergency facilities, power plants, piers and port facilities, petroleum refineries and other industrial facilities, schools, water supplies, and more.
"NGA provides a common operating picture that enables FEMA and emergency responders to work together more effectively and efficiently," said Philip J. Plack, NGA liaison to FEMA.
To provide quick-turnaround support to FEMA and other agencies, NGA has deployed geospatial analysts to the Joint Field Office in Birmingham, Ala., and to FEMA's Regional Response Coordination Centers in Denton, Texas, and Kansas City, Mo.
The National Geospatial-Intelligence Agency is the nation's premier source of geospatial intelligence. As a Department of Defense combat support agency and a member of the U.S. Intelligence Community, NGA provides imagery, geospatial and targeting analysis, along with image sciences and modeling for U.S. national defense, disaster relief and safety of navigation. NGA seeks to know the Earth, show the way, and understand the world.
US geospatial centre aims to set modern surveying standards
Doug Cashman
A group of technical education institutions in the US has come together to form the National Geospatial Technology Centre (NGTC), with the aim of becoming “the voice for geospatial programs nationwide.”
The NGTC has confirmed annual funding of $1.25 million for four years from the National Science Foundation and is a partnership of seven community colleges, a community and technical college system, and Penn State and San Diego State Universities.
The Centre’s director, Phillip Davis, said that advanced geospatial technology and surveying equipment now affected almost every aspect of normal life, and that it was their aim to define the industry standards and establish professional models for the development of skills.
“The world is so interconnected today, and everything is based on spatial relationships,” Davis said. “It is one of our nation’s essential core tools.”
The NGTC will work with schools to develop faculty training and curriculum skills development in the field. It recently drafted a ‘competency model’ for the Department of Labor, to set the national skills standards for those who work in the industry. The model provides career guidance, curriculum development and evaluation, and outreach efforts to promote geospatial technology careers, according to the Department of Labor.
Marine atlas maps out Shetland's energy potential
A marine atlas has been created to highlight Shetland's potential as a source of wave and tidal power.
The atlas is based on a major study aimed at promoting the islands as an ideal location for the development of marine renewable energy.
It features resource maps allowing would-be developers to identify the best sites for generating electricity.
The study is to be presented at the UK's largest renewable energy event in Aberdeen next week.
It was commissioned last year by Shetland Islands Council and Highlands and Islands Enterprise (HIE), supported by European funding.
Data and mapping contained in the study will become part of the marine atlas of the Shetland Islands marine spatial plan.
The resource maps cover major tidal energy sites at Bluemull and Yell Sound and wave energy resources available up to 500m off the Shetland coastline.
David Priest of HIE said: "This is a really useful piece of work and fills in a missing gap of information available on the seas around Shetland making it easier for developers to plan.
"It clearly demonstrates where the best wave and tidal energy is and shows how good resources are in and around Shetland waters."
HIE said the atlas would help inform planning decisions in the seas around Shetland and more accurately demonstrate the links and interactions between different economic activities and the priorities of marine users.
The atlas also outlines existing constraints, designations and issues that might arise when applying for planning permission or a marine works licence.
Shetland Islands wave and tidal resource assessment map of Bluemull Sound (courtesy of British Crown and Seazone Solutions Ltd; derived in part from material obtained from the UK Hydrographic Office with the permission of HM Stationery Office and the UK Hydrographic Office).
The atlas is designed to help marine energy developers identify the most appropriate locations for future marine energy developments and to reduce duplication of work by providing quality information.
The Shetland Islands marine spatial plan is one of four pilot projects under the umbrella of the Scottish Sustainable Marine Environment Initiative (SSMEI) initiated by the Scottish government to inform future marine policy.
The study, which was undertaken by specialist technical consultancy Natural Power, will be promoted next week at the All-Energy Exhibition and Conference in Aberdeen.
The event will feature more than 570 exhibitors from 20 countries, who specialise in all forms of clean and renewable energy.
Josie Simpson, chairman of the Shetland Islands development committee, commented: "Shetland has a huge untapped marine energy resource.
"Finding ways to exploit this resource sustainably is very important for Shetland's future prosperity."
Would-be developers are being asked to contact HIE or Shetland Islands Council for copies of the marine atlas.
Thursday, 12 May 2011
geoVue's Location Intelligence Software Brings Analytical Muscle to Gold's Gym
geoVue, a leading provider of site selection, market optimization and other location intelligence software solutions, has been selected by leading fitness authority Gold's Gym for its comprehensive suite of site selection, market planning, direct marketing and consulting services. Gold's Gym, the largest co-ed gym chain in the world, will use geoVue's solutions to better understand its customer base and trade areas; select and rank prospective new markets (professional services); map potential sites and generate demographic reports (iSITE); proactively carve out franchise territories and seek favorable corporate acquisitions (iPLAN); and optimize direct mail campaigns (targetVue). The software will be launched as both a desktop version (iSITE, iPLAN and targetVue) and an online platform (marketVue Portal), enabling Gold's Gym to publish territory optimization results and share information across franchise and corporate divisions.
"geoVue gives us an instant competitive advantage when entering or expanding into a marketplace," said Eduardo Afonso, Director of Franchise Administration. "The software will significantly enhance our ability to evaluate real estate and analyze both the dynamics and the potential of a local market, right down to the zip code. We plan to extend the benefits to our franchisees, as they use targetVue to optimize their direct mail campaigns."
"Our site selection and market optimization software has enabled many of the best-known and fastest-growing retailers and franchisors to maximize the potential of their real estate operations," said geoVue president and founder James A. Stone. "From site selection, territory and store network planning, to cannibalization analysis, sales forecasting, and direct marketing optimization, geoVue has become established as the leading enterprise real estate system for retailers and franchisors. We look forward to adding Gold's Gym to our growing list of satisfied customers."
About Gold's Gym:
Established in Venice, Calif., in 1965, Gold's Gym is the largest co-ed gym chain in the world, with more than 600 locations in 42 states and 26 countries. Gold's Gym offers the latest equipment and services, including group exercise, personal training, cardiovascular equipment, spinning, Pilates and yoga, while maintaining its core weight lifting tradition. With nearly 3 million members worldwide, Gold's Gym continues to change lives by helping people achieve their individual potential. For more information about Gold's Gym, please visit www.goldsgym.com or call 1-800-99-GOLDS.
About geoVue:
geoVue is a leading provider of location-based decision support systems for the real estate, retail, and restaurant industries, with clients including The Limited, United Parcel Service, Albertsons, Dunkin Donuts, Radio Shack, REI and Prudential Real Estate. This unique combination of analytical software and human expertise provides solutions for market planning, site selection, direct marketing and merchandise planning that maximize the effectiveness of chain stores nationwide.
Insurance Data Structure: The Next Wave
By Keith Peterson
An insurer’s data is much more than bits and bytes to be assembled for required regulatory reporting. It is the lifeblood of the insurance organization, providing the raw material for the insight that drives action to improve operational results.
Three convergent technology trends are opening dramatic new opportunities for insurers seeking competitive and operational advantage—and creating a potential crisis for their IT leaders. Predictive analytics, social networks, and location intelligence are on a rapid adoption curve in the insurance industry, but few companies are fully prepared to handle the crashing wave of exponential data growth that accompanies these new tools. How does the modern insurer ensure that it doesn’t sink—or, worse, drown?
Insurers vary widely in the maturity of their database and data warehouse implementations. However, these innovation trends make good database performance a business imperative. As data gets bigger and its uses become more complex, demands on data structure grow exponentially. It is imperative that businesses maintain focus on data quality, relevance, and security.
Data Quality
Data quality is determined by how data is acquired and managed. As data availability explodes in the organization, quality initiatives must focus particularly on accuracy, completeness and cost-benefit. Master data management initiatives are essential to organize and sustain the most important enterprise data in a formalized way that ensures compliance with corporate-wide requirements. These initiatives should include careful consideration of new analytical and social technology trends to anticipate their ultimate uses and the role of the data they require and create.
Data Relevance
New technologies are extremely data intensive, but few companies can practically manage all the data potentially available for collection. The first key to managing big data is to prioritize it, starting with the data that explain relationships among customers, their products and policies, providers, and claims.
Understanding these relationships helps IT design database schemas that capture them meaningfully. Second, user-generated content, external data streams, and digital content fundamentally swamp the capacity of traditional databases. Now, more than ever, insurers need a judicious strategy for deciding which data are important and what levels of detail are required to meet business needs. And these strategies need to encompass both business requirements for meaningful information and IT requirements for manageable costs, storage and performance.
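As a rough sketch of the relationship-first priority described above (all entity and field names are hypothetical, not a reference schema):

```python
# A minimal, hypothetical sketch of relationship-first modeling: explicit keys
# linking the core insurance entities before any peripheral data is added.
from dataclasses import dataclass

@dataclass
class Customer:
    customer_id: str

@dataclass
class Policy:
    policy_id: str
    customer_id: str   # link: policy -> customer

@dataclass
class Provider:
    provider_id: str

@dataclass
class Claim:
    claim_id: str
    policy_id: str     # link: claim -> policy
    provider_id: str   # link: claim -> treating provider

# Walking these explicit keys answers relationship questions directly,
# e.g. "which customers has this provider touched?", without peripheral tables.
```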
Security and Trust
Expanding access to data creates new risks around securing data for appropriate use by the appropriate audience. As data becomes more complex, data authenticity and credentials become more vital. Data streams must be managed within a systematized structure encompassing security, quality and compliance.
Unless an organization centralizes its enterprise data to create a “single view of the truth,” it requires truth rules: a means of enforcing shared rules for managing missing and dirty data, tracking document authority, and maintaining cardinality for shared information like budgets and forecasts.
These issues impose a degree of schema complexity beyond what many insurers’ current environments can realistically support.
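As a minimal sketch of what enforcing such shared truth rules might look like at load time (the field names and rules below are hypothetical illustrations, not a reference implementation):

```python
# Hypothetical "truth rules" applied uniformly to every inbound record.
MISSING = {None, "", "N/A"}

def apply_truth_rules(record: dict) -> list:
    """Return the list of rule violations for one inbound record."""
    issues = []
    if record.get("policy_id") in MISSING:
        issues.append("missing data: policy_id is required")
    if record.get("source_authority") in MISSING:
        issues.append("document authority untracked")
    if record.get("forecast_version") in MISSING:
        issues.append("cardinality: forecast must carry a version")
    return issues

print(apply_truth_rules({"policy_id": "", "source_authority": "claims_feed"}))
# -> ['missing data: policy_id is required',
#     'cardinality: forecast must carry a version']
```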
Technology Impact
The importance of data quality, relevance and security is magnified considerably when we turn our attention to new technology trends. Consider the data structure issues associated with three innovations.
Predictive Analytics
While analytical software makes adjustments for missing or skewed data, the best predictive models are built from high quality data. One quality dimension is granularity. Highly granular transactional data are usually expensive and cumbersome for analytics. Summarize the data too much, however, and data lose predictive power.
Pre-categorizing continuous quantitative data saves storage and processing costs, but modeling efforts derive predictive power from grouping or “binning” data variables in the segments most relevant to the analysis. For example, using age to predict automotive accident probabilities might show that fine groupings of the youngest and oldest drivers are useful but drivers in the middle years can be lumped together.
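A minimal sketch of that example in Python, using pandas with hypothetical band edges (in practice, the segments would come out of the modeling work itself):

```python
# Hypothetical age bands: fine at the risky extremes, one broad middle band.
import pandas as pd

ages = pd.Series([17, 19, 22, 31, 45, 52, 63, 71, 78, 85])

edges  = [16, 18, 21, 25, 65, 70, 75, 80, 120]
labels = ["16-17", "18-20", "21-24", "25-64", "65-69", "70-74", "75-79", "80+"]

# right=False makes each bin [left, right), matching the labels above.
age_band = pd.cut(ages, bins=edges, labels=labels, right=False)
print(age_band.value_counts().sort_index())
```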
IT leaders should proactively engage the full range of business analysts, statisticians, and business subject matter experts to identify key modeling variables and explore the required granularity. Without insight into the nature of these predictor variables, the data value may—ultimately—be quite limited.
Social Networks
Understanding relationships among constituents is critical to creating meaningful relationships in the database environment. The advent of digital social networks requires linking a much broader array of participants in the insurance ecosystem, drawn from a variety of public, commercial and proprietary sources.
For example, claims organizations are finding the ability to link and track relationships among claimants, providers, corporations and attorneys especially useful for identifying fraud and excessive treatment. Creating and maintaining these linkages, however, is a complex data undertaking.
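As a rough illustration of the linkage idea, the sketch below uses hypothetical claim records and a hypothetical review threshold; production systems model far richer networks of entities and relationships:

```python
# Flag attorneys linked to unusually many claims (hypothetical data).
from collections import defaultdict

claims = [
    {"claim_id": "C1", "claimant": "A. Smith", "attorney": "Firm X"},
    {"claim_id": "C2", "claimant": "B. Jones", "attorney": "Firm X"},
    {"claim_id": "C3", "claimant": "C. Brown", "attorney": "Firm X"},
    {"claim_id": "C4", "claimant": "D. White", "attorney": "Firm Y"},
]

# Invert the claim records into a relationship: attorney -> linked claims.
claims_by_attorney = defaultdict(list)
for claim in claims:
    claims_by_attorney[claim["attorney"]].append(claim["claim_id"])

REVIEW_THRESHOLD = 3  # hypothetical trigger for manual review
for attorney, linked in sorted(claims_by_attorney.items()):
    if len(linked) >= REVIEW_THRESHOLD:
        print(f"Review: {attorney} appears on claims {linked}")
```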
Both the constituents in the network and their relationships must be managed as data entities and continuously refreshed. In most cases, insurers are likely to find that social networks linking data outside their organization are best licensed from specialized commercial providers.
Data inside their organization, however, will likely prove a significant competitive differentiator and insurers should manage these as proprietary intellectual property.
Location Intelligence
Insurers have long known that geo-spatial relationships can help explain a range of outcomes. Provider treatment patterns, for example, can vary based on regional medical training and the presence of specialized facilities.
Fraud tends to cluster geographically. While measuring these relationships has historically been cumbersome, innovations in location intelligence technology make it far easier to collect and analyze geographic data.
These technologies include free geographic information system (GIS) tools, digital data and imagery as well as mashups (combinations of data from two or more sources to create new data). For instance, digital images from accident scenes may soon be instantly transmitted to an insurer, their contents digitized and codified as predictive attributes and immediately used to improve severity scoring and claims assignment at first notice of loss.
To prepare for location data use, IT leaders must understand the unique nature of geo-spatial data and its retrieval. Users of mapping and GIS tools from search engines must be aware of differences in geocoding accuracy. When the precise location of a house or building is required, for example, parcel geocoding delivers more accurate results than street-level geocoding.
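A minimal sketch of why that distinction matters, using hypothetical coordinates for the same address; the haversine formula gives the great-circle distance between the two geocodes:

```python
# Measure the offset between a street-interpolated geocode and a parcel-level
# geocode for the same hypothetical address.
from math import asin, cos, radians, sin, sqrt

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + \
        cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371000 * asin(sqrt(a))

street_level = (39.0995, -94.5786)  # hypothetical street-interpolated point
parcel_level = (39.1001, -94.5771)  # hypothetical parcel centroid

offset = haversine_m(*street_level, *parcel_level)
print(f"Geocode offset: {offset:.0f} m")  # an offset like this can change per-building risk rating
```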
Conclusion
Taken together, the rising tide of data availability and new technology innovations create a potential boon for insurance business leaders and new challenges for those tasked with managing the data. Insurers can profitably ride this wave by developing a clearer sense of what data has meaning, creating the flexibility to add new types of data into existing environments, and imposing simplicity wherever possible in the structure of their data schemas and management practices.
“Where” is the Location Intelligence Officer? Location Data Requires Strategic Attention
By Marc Gill, President, Lightship Consulting Group LLC
Executive-level officers within an organization—CEO, CFO, CMO, CTO, CIO, COO—all bring specific skills to operating a business. These skills include, at a minimum, organizational management, budgeting and finance, technology, sales and marketing, and industry knowledge, as well as the ability to leverage operational, customer, and marketing data. Most successful executives are good at knowing how to extract, analyze and act upon that data; after all, they learned it either in business school or on the job.
But the new data kid on the block is location data, which presents some interesting opportunities and challenges for executives unfamiliar with it. Location information is generally not part of any business’s strategic plan, and is also generally not a core competency of an organization. Therefore, as location information becomes more readily available, and as organizations embark on incorporating location applications, who within the organization and within the C-suite is responsible for ensuring that location is being used to meet the company’s strategic goals?
For my first assignment in the Coast Guard, on the day I reported to the engine room, I was handed a pencil and paper and told to trace every pipe in the engine room. It was my first exercise in understanding the importance of “where.” I didn’t know how any of the systems worked, but after I finished crawling through the bilges and every level of the engine room for days, I did know where everything was and how everything was connected. This exercise was critical to understanding all the systems on the ship—water, electrical, steam, fuel, ballast and pneumatic systems. (I was on a WWII vintage ship that had no electronics.) My job was boilerman and my main responsibility was the steam generation side of the engine room, but as part of that function I needed to understand how all the systems on the ship interacted and affected each other. I have maintained this practice of “tracing” the business throughout my career.
Most white-collar workers and newly minted MBAs come in at a managerial level in a business, and they get, at best, a high-level or very specific departmental view of it. To become more effective managers of the business, they need to understand where things are, how they are connected, how all the systems function and how the team works together in order to get the “ship” to go in the right direction.
As location information and location-based technologies become more pervasive, businesses that are impacted by location are forced to rethink how they will accomplish their mission, along with what kind of supporting infrastructure will be required to support the mission. Businesses that need to market or operate across vast geographic footprints are inherently location dependent. If a company has a need to incorporate location information to achieve its goals, then it needs that expertise at the top of the organization to spread that experience across the organization and enable the integration of location information across the company.
Editor’s Note
For the Top 10 Location Intelligence mistakes resulting from lack of technology, data, and people experience see the article by Arthur Berrill, Vice President, Technology, DMTI Spatial, and Natasha Leger.
Integrating location information and location-based technologies is not just about the data and the technology. We are not talking about a market research project or a technology project. Integration of location intelligence into an organization is not just operational, or about implementing the “right” system or applications; it’s a mindset and a game changer impacting the culture of the business. Integrating location intelligence is about leadership and organizational culture. This requires a Location Intelligence “Officer” (LIO).
The LIO must bring a skillset similar to that of the other CxOs, along with location information experience, including how location information is developed, managed, provided, maintained, used (and how it shouldn’t be used), and integrated into the business process. The LIO needs to understand the culture of the business and how it operates.
What to look for in an LIO candidate
Whether recruiting at the CxO, VP or Director level, the candidate profile is someone who understands every component of the business. Similar to when CIOs and CTOs were introduced to align technology with the mission and goals of the company, the LIO will have to show that he or she understands how the business operates end to end and has the ability to work across the company to align location information with the mission and goals of the business. A person such as a general manager or line-of-business manager with P&L responsibilities and specific industry knowledge most adeptly reflects this converged skill set.
If you haven’t picked up the “where” component or location experience, it is difficult to understand how to integrate it into the business. Just because you can develop a location-based application, plot data on a Google Map or Bing, or manage a geospatial application doesn’t mean that you have location experience. These are all superficial skill sets (relative to the experience required of a Location Intelligence Officer).
Location experience is about understanding the content and context of location, the way you use it, how it’s integrated into a business and, most importantly, its impact on the business. Location experience has, at a minimum, two required components: the influence of location information on business processes, and knowledge and understanding of location data and location technologies.
Location influence on business processes: Understanding the business is not about walking the halls; it’s about getting out into the field. The candidate will need to demonstrate his or her knowledge of business processes or the ability and willingness to learn them. This will differ by industry, but for example, in my past experience in telecommunications, getting out into the field was a critical component of understanding the customer, the network infrastructure, where and how things were connected and the processes driving the business.
Location data and technologies: Location data and supporting technology experience will come from engaging and working with data and technology providers, such as companies offering satellite and aerial imagery, demographics, econometrics, geospatial software, and value-added location applications. If you know the business but don’t know location data and its supporting technologies, go out and educate yourself through the various organizations, conferences, courses, and companies that do business in the location space. They are readily available, and the industry as a whole encourages and supports location education through these options.
Profile for a Location Intelligence Officer
When integrating the “where” component, or location information, into the business, it is important to remember that the person or organization using the location data or developing the geospatial system is, in most cases, not the provider of the source location data. That data will come from somewhere else. It is therefore important to know what type of location data you need, where to get it, and what you should be paying for it—whether internally (what resources are required?) or for third-party data—and, finally, what the business case is. Location-based applications hold a seductive appeal for companies. The most important question to ask and answer for any location project is, “What is the benefit to the business beyond business as usual?”
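That question often reduces to simple arithmetic. A minimal sketch, with entirely hypothetical figures, of the kind of back-of-the-envelope calculation a LIO should be able to defend:

```python
# Hypothetical figures for a location-based routing project; every number is illustrative.
annual_data_license = 50_000    # third-party street and address data
implementation_cost = 120_000   # one-time integration work
annual_maintenance = 30_000     # ongoing data refresh and support
annual_benefit = 140_000        # e.g., fuel and labor saved through better routing

three_year_cost = implementation_cost + 3 * (annual_data_license + annual_maintenance)
three_year_benefit = 3 * annual_benefit
roi = (three_year_benefit - three_year_cost) / three_year_cost
print(f"Three-year ROI: {roi:.0%}")  # positive only if the project beats business as usual
```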
Location Intelligence Officer Profile
Experience: Organizational management, budgeting and finance, technology, sales and marketing, and industry knowledge
If the background is in small companies, look for people who have moved between companies
If the background is in large companies, look for people who have moved between departments or functions
Skills: Budgeting, Project Management, Engineering/Product Development, Operations, Sales & Marketing, and the ability to understand the impact of operational, customer, and marketing data
Location Intelligence Experience: Location Data and its procurement, use, and maintenance; Geospatial Technology; Technical Infrastructure and Architecture; and Enterprise Data Integration (a small illustration of such integration follows this profile)
Development and execution of business case and ROI for location-based projects: Demonstrated ability to sell the project internally and, once it is sold, to deliver, monitor, and maintain it. Location information management is an ongoing, evolving practice that requires regular attention. (This is a recurring problem with the traditional approach to GIS projects: each is viewed as a one-time project that simply ends.)
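As a small illustration of the enterprise data integration item above, consider assigning customer records to a sales territory. A minimal sketch, assuming the open-source shapely library; the territory and the customer records are hypothetical:

```python
# Assign hypothetical customer records to a sales territory (requires shapely).
from shapely.geometry import Point, Polygon

# A sales territory sketched as a simple rectangle of (longitude, latitude) pairs.
territory = Polygon([(-105.5, 39.5), (-104.5, 39.5), (-104.5, 40.5), (-105.5, 40.5)])

# Customer records as (name, longitude, latitude); values are illustrative.
customers = [
    ("Acme Energy", -105.27, 40.01),
    ("Globex Utilities", -97.74, 30.27),
]

for name, lon, lat in customers:
    status = "in" if territory.contains(Point(lon, lat)) else "outside"
    print(f"{name}: {status} territory")
```

In a real deployment the same join would run against addresses geocoded from a CRM and territories maintained in a geospatial database—which is why the profile pairs data procurement with integration experience.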
Companies are at different stages of their lifecycles, from start-up to mature; they are at different levels of financial health, with different strategic and tactical imperatives; and they range from innovative to conservative in culture. Despite the hype around location intelligence and the competitive advantages claimed for it, each of these characteristics plays a role in whether a company is ready for a Location Intelligence Officer.
To date, companies in the telecommunications, utilities, real estate, retail, natural resources planning, oil & gas, and mining industries have established geospatial departments because mapping and knowing the location of the business’s assets is a mission-critical requirement. Today, given the nature of the extended enterprise (think outsourced services and manufacturing), the global supply chain, and location-based market drivers (who are your customers, where are they located, what do they want or need, and what other choices do they have?), asset management is not just about knowing where your plants and facilities are located. It is about the other assets that were, in some cases, traditionally managed as accounts receivable. Today the customer asset is about getting the right product or service to the right customer at the right time. If the growth of your business depends on effectively managing an extended enterprise and a global supply chain, and on effectively identifying, creating, and maintaining customers, then your organization requires a Location Intelligence Officer.
It’s not in the title
The reality is that location-based services, applications, and mobile devices are changing the way people do business. When individuals embrace a game changer, the organization has to take notice. Location intelligence simply needs to be treated strategically; the Location Intelligence Officer doesn’t need to sit at the CxO level. He or she can be a Vice President or Director, but the C-suite DOES have to champion the role of location intelligence in the organization.
Executive-level officers within an organization—CEO, CFO, CMO, CTO, CIO, COO—all bring specific skills to operating a business. These skills include, at a minimum, organizational management, budgeting and finance, technology, sales and marketing, and industry knowledge, as well as the ability to leverage operational, customer, and marketing data. Most successful executives know how to extract, analyze, and act upon that data; after all, they learned to do so either in business school or on the job.
But the new data kid on the block is location data, which presents interesting opportunities and challenges for executives unfamiliar with it. Location information is generally not part of a business’s strategic plan, nor is it typically a core competency of the organization. So, as location information becomes more readily available and organizations embark on incorporating location applications, who within the organization—and within the C-suite—is responsible for ensuring that location is being used to meet the company’s strategic goals?
On the first day of my first assignment in the Coast Guard, reporting to the engine room, I was handed a pencil and paper and told to trace every pipe in the engine room. It was my first exercise in understanding the importance of “where.” I didn’t know how any of the systems worked, but after days of crawling through the bilges and every level of the engine room, I knew where everything was and how everything was connected. This exercise was critical to understanding all the systems on the ship—water, electrical, steam, fuel, ballast, and pneumatic. (I was on a WWII-vintage ship that had no electronics.) My job was boilerman, and my main responsibility was the steam-generation side of the engine room, but as part of that function I needed to understand how all the systems on the ship interacted and affected each other. I have maintained this practice of “tracing” the business throughout my career.
Most white-collar workers and newly minted MBAs enter a business at a managerial level and get, at best, a high-level or narrowly departmental view of it. To become more effective managers of the business, they need to understand where things are, how they are connected, how all the systems function, and how the team works together to get the “ship” going in the right direction.
As location information and location-based technologies become more pervasive, businesses affected by location are forced to rethink how they will accomplish their mission and what kind of infrastructure will be required to support it. Businesses that need to market or operate across vast geographic footprints are inherently location dependent. If a company needs to incorporate location information to achieve its goals, then it needs that expertise at the top of the organization, where it can spread and enable the integration of location information across the company.
Editor’s Note
For the Top 10 Location Intelligence mistakes resulting from lack of technology, data, and people experience see the article by Arthur Berrill, Vice President, Technology, DMTI Spatial, and Natasha Leger.
Geospatial Technology as a Core Tool: Impacts everything from navigation to law enforcement
By Marlene Cimons, National Science Foundation
Geospatial technology affects almost every aspect of life, from navigating an unfamiliar neighborhood to locating the world’s most wanted terrorist.
“They couldn’t have found Osama bin Laden without it,” says Phillip Davis, director of the National Geospatial Technology Center, referring to the recent U.S. Navy SEALs raid on bin Laden’s compound hideout in Pakistan, where he was killed. “The world is so interconnected today, and everything is based on spatial relationships. It is one of our nation’s essential core tools.”
Geospatial technology refers to equipment used in visualization, measurement, and analysis of earth’s features, typically involving such systems as GPS (global positioning systems), GIS (geographical information systems), and RS (remote sensing). Its use is well-known and widespread in the military and in homeland security, but its influence is pervasive everywhere, even in areas with a lower public profile, such as land use, flood plain mapping and environmental protection.
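All three of those systems ultimately rest on computing spatial relationships from coordinates. As a minimal sketch, assuming nothing beyond the Python standard library, here is the classic haversine great-circle distance between two GPS fixes (the coordinates below are approximate and purely illustrative):

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two GPS coordinates, in kilometers."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))  # 6371 km: Earth's mean radius

# Approximate distance from Corpus Christi, Texas to Washington, D.C.
print(round(haversine_km(27.80, -97.40, 38.90, -77.04)), "km")
```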
“You have people who work in surveying, who map out where a shopping center or street is going to be, and those involved in your local county property appraisals,” Davis says. “It’s also used in law enforcement to locate crimes and for fire response and in disaster management—before, during and after. It is used to locate water resources, or in public health to track the spread of disease. It’s used by the guys who drive around for Google Earth. It’s very high impact.”
The U.S. Department of Labor considers the field a high growth industry, particularly within the public sector—federal, state and local governments—as well as in regulated industries, such as telecommunications, utilities and transportation. The private sector also has begun to embrace the technology; moreover, its market has been growing at an annual rate of almost 35 percent, according to the department.
There are about 600,000 U.S. workers in geospatial technology today, a number expected to reach more than 850,000 by 2018, according to Davis, professor of computer science at Del Mar College in Corpus Christi, Texas, where the center is based.
The National Geospatial Technology (NGT) Center, one of 40 Advanced Technological Education program centers of the National Science Foundation, wants to become “the voice for geospatial programs nationwide,” Davis says.
The NGT Center, which NSF is funding with $1.25 million annually for four years, is a partnership of seven community colleges, a community and technical college system, as well as two four-year universities, Penn State and San Diego State, in collaboration with industry and state and local governments representing all regions of the country. About 500 of the nation’s estimated 1,200 community colleges have geospatial skills programs, according to Davis.
While the center works with schools to develop faculty training and curriculum skills development in the field, it most recently drafted a “competency model” for the Department of Labor’s Employment and Training Administration to set nationwide skills standards for those who work in the industry. “It’s a national model of what the occupation requires, just as you would expect for any profession,” Davis says. “We’ve defined a consistent national standard of the skills they need. We have never had this before.”
The model, released by the department last July, is a resource for career guidance, curriculum development and evaluation, career pathway development, recruitment and hiring, continuing professional development, certification and assessment development, apprenticeship program development and outreach efforts to promote geospatial technology careers, the department said.
It includes the broad range of services, technical and manufacturing professions, and products within the fields of geography, surveying and mapping, computer science, information science and other specialized areas of application that comprise geospatial technology.
“Workers need to know about cartography and geography,” Davis says. “They need to have certain computer programming skills, and scientific knowledge.”
Training often starts as early as high school, with skills emphasis at the community college level, Davis says. Interestingly, even students who hold degrees from four-year colleges are returning to community colleges for skills training—and certificates—in order to get jobs. “They come to get the technical skills,” he says. “We’re having a lot of reverse transfer phenomena.”
The center’s partners serve a diverse student population, including Del Mar College, where Davis teaches, and Southwestern College in Chula Vista, Calif., which serve mostly Hispanic students, and two in the Southeast with significant African American enrollment, Gainesville State College in Gainesville, Ga., and Edgecombe College, with campuses in Tarboro and Rocky Mount, N.C. The center also is working with various disability agencies to attract disabled veterans into the field.
“Learning to think spatially is something that society needs to do,” Davis says. “It’s something we need to encourage in our youth and K-12 education. We’re not just talking about geography, or drawing maps with crayons, but learning about spatial relationships—cause and effect. When you build too many homes along the coast, or near a fault susceptible to earthquakes, everything is spatially related.
“This needs to become as fundamental to our education system as reading, writing and arithmetic,” he adds. “I like to say, ‘geospatial technology: you’re lost without it.’”