Wednesday, 11 May 2011

Why Location-Based Services Will be the Killer App of the 2012 Elections

The last presidential election was only three years ago, but that seems like a generation in the social media age.

Just think: back then, Twitter was a novelty. Facebook was popular but spent the first half of the year as the second-biggest social network behind MySpace. U.S. smartphone penetration was just 20% by the close of 2008 vs. a projected 50% by the end of this year. The biggest political marketing innovator in 2008 was Barack Obama, who employed Facebook, Twitter and other social networks to great effect, and took Howard Dean’s online micro-donation idea to the next level.

Obviously, 2012 will be a different landscape for several reasons, but the biggest is the ascendance of mobile and, in particular, location-based services. Up until now, such services have been a curiosity driven by early adopters who have no qualms about broadcasting their whereabouts to the whole world. In a nation of 310 million people, this is still a niche market. The biggest of the services, Foursquare, has about 8 million users, which is respectable but far from a mass audience.

Part of the issue is a lack of purpose for the activity. Let’s face it, being mayor of your local Chinese food joint has limited appeal.

All that could change in the coming election. In this deeply polarized nation, a fair number of people care about who wins in November 2012, which gives them a reason to make sure their like-minded friends hit the polls. Until now, the only way you could do that was to drive them there yourself.

Imagine, however, a grassroots organization that depended, in part, on committed volunteers charged with getting as many people as possible in their Facebook and Twitter networks to commit to vote. Then, when Election Day rolls around, they can prove that they at least got those people to go to their local polling places. Finally, a scientific way to prove political marketing efficacy.

The infrastructure is already in place for such a plan. Thanks to a nonpartisan, get-out-the-vote effort in 2010, Foursquare made it possible to check in at any of the country’s polling places and then broadcast it. Of course, Foursquare’s not the only game in town; Facebook Places lets you check in at the polls and tell all your Facebook friends about it.

That ability — to check in to a physical location and thus bridge the offline and online worlds — didn’t really exist in 2008 and could be a game changer in the coming election. The major challenge for political marketing has always been getting voters to actually go to the polls. In the past, get-out-the-vote campaigns have consisted of a combination of door-to-door, telephone and snail mail reminders. For a portion of the population (say, those over 65 or so) this will still have to be the case, but for younger voters, an email plus a pledge to vote, verified with a location-based check-in, will do.

Some politicians are already seeing the potential of LBS. It’s unclear at this point whether Obama’s campaign will lean heavily on such services, but likely Republican candidate Tim Pawlenty’s campaign is already awarding points and badges to supporters a la Foursquare.

Will it work? Don’t underestimate the value of peer pressure. On November 6, 2012, everyone will know if you actually voted. Not doing so will effectively disqualify you from kvetching about either political party. At that point, being a voter may trump being a mayor.

Indoor WiFi Mapping

Location determined by WiFi-only iOS devices.

With the advent of iPhoneTracker, mobile device location has been brought back to the forefront of our minds. A Skidmore faculty member compiled several versions of iPhoneTracker from GitHub, and one of these pulled the locations determined from WiFi positioning out of the consolidated.db database. This meant that locations determined from WiFi proximity/triangulation were displayed along with the data from the A-GPS chip on his iPhone. When he studied the new results, there were a large number of points in Japan and the south of France, both places he had not been (at least in the last 10 months). In the GIS Center, this reminded us that we had first discovered the Japan location fix when we began working with mobile device locations on the Skidmore College network. The iPhone and iPad 3G do not have any issue, since the location from their A-GPS chip trumps that from WiFi positioning, but the iPod Touch and WiFi-only iPad get bad locations on the Skidmore College networks.
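For readers curious how iPhoneTracker-style tools pull these points out, here is a minimal sketch in Python. It assumes the commonly reported consolidated.db layout (a WifiLocation table with MAC, Timestamp, Latitude, and Longitude columns, with timestamps counted from 2001-01-01); the exact schema varies between iOS builds, so treat the table and column names as assumptions.

    # Minimal sketch: read WiFi-derived position fixes from a copy of an iOS
    # consolidated.db file. Table and column names follow the commonly reported
    # schema and may differ on other iOS builds.
    import sqlite3

    def wifi_fixes(db_path):
        conn = sqlite3.connect(db_path)
        try:
            rows = conn.execute(
                "SELECT MAC, Timestamp, Latitude, Longitude FROM WifiLocation"
            )
            for mac, ts, lat, lon in rows:
                # Timestamps are reportedly seconds since 2001-01-01 (Mac absolute time).
                yield {"mac": mac, "timestamp": ts, "lat": lat, "lon": lon}
        finally:
            conn.close()

    if __name__ == "__main__":
        for fix in wifi_fixes("consolidated.db"):
            print(fix)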





To the above-right is the location returned by iOS devices, no matter where they are across campus. For comparison, the actual location is below. Both of these locations were taken from the Dana Atrium. How do we address the issue of location when WiFi is the only source for location information?
Indoor WiFi Mapping—iPhone 3GS—Skidmore

Location determined by A-GPS assisted iOS devices.

To answer this, we must first explore how mobile devices determine their location.

With the current version of iOS (all 4.x releases), Apple has switched to using its own location database. This database stores position reports from iOS devices, along with the nearby cell towers and WiFi hotspots each device can see. MacRumors writes:

[T]he iOS location database does not record exact GPS data, instead seeking to pinpoint the locations of Wi-Fi access points and cell towers that the device comes within range of, although the database does offer a clear general track of a user’s movements.

The goal of maintaining such a database is to speed up determining a user’s location: with WiFi hotspots already mapped to rough physical locations, a device that can see a known hotspot gets a position estimate without waiting for a GPS fix.

For iOS versions before 4.0, Apple used Skyhook’s WiFi location database. Skyhook collects location information for WiFi hotspots and makes it available for determining location. The iPod Touch used for testing is running iOS 3.1.3, so it is pulling its location from the Skyhook database. Skyhook has a handy feature that lets users add their own WiFi hotspots to its database. Could we add all of the campus hotspots to Skyhook’s database, and then have a reliable solution for determining position indoors on campus using accurately mapped hotspots and WiFi triangulation?
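The positioning idea itself is simple. As a rough illustration (not Skyhook’s actual algorithm, which is proprietary), a weighted-centroid estimate over the access points a device can hear looks something like the sketch below; the MAC addresses, coordinates, and signal strengths are made up for the example.

    # Rough illustration of WiFi positioning by weighted centroid: average the
    # known coordinates of the access points a device can hear, weighting
    # stronger signals more heavily. Not Skyhook's actual (proprietary) method.

    # Hypothetical AP database: MAC address -> (lat, lon) from georeferencing.
    AP_LOCATIONS = {
        "00:11:22:33:44:55": (43.0970, -73.7820),
        "00:11:22:33:44:66": (43.0972, -73.7818),
        "00:11:22:33:44:77": (43.0969, -73.7816),
    }

    def estimate_position(scan):
        """scan: list of (mac, rssi_dbm) pairs from a WiFi scan."""
        total_w = lat_sum = lon_sum = 0.0
        for mac, rssi in scan:
            if mac not in AP_LOCATIONS:
                continue  # unknown AP contributes nothing
            weight = max(1.0, 100.0 + rssi)  # e.g. -45 dBm -> 55, -85 dBm -> 15
            lat, lon = AP_LOCATIONS[mac]
            lat_sum += weight * lat
            lon_sum += weight * lon
            total_w += weight
        if total_w == 0:
            return None  # no known APs heard
        return (lat_sum / total_w, lon_sum / total_w)

    print(estimate_position([("00:11:22:33:44:55", -45),
                             ("00:11:22:33:44:66", -70)]))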

We were able to get blueprints of the Harder and Dana academic buildings from IT with all of the WiFi hotspots marked. We then took each floor of the blueprints, georeferenced them in ArcMap, and got lat/lon coordinates for each of the hotspots. From IT we were also able to get a table with the MAC address of each hotspot. The MAC address is a unique identifier, which allows the device to know which physical hotspot it is connected to. Once this information is entered in Skyhook’s database, a mobile device will send the MAC addresses of the hotspots it is connected to and/or near, and Skyhook should return a triangulated position based on the locations we submitted. We also created the following model, allowing you to see all of the hotspots over the floor plans in three dimensions.

Harder & Dana WiFi Hotspots from Skidmore College GIS Center on Vimeo.


Now that the hotspots have been mapped, we need to add them to Skyhook’s database and see if that improves the quality of the location information for pre-iOS 4 devices.

We tweeted @SkyhookWireless and they were kind enough to set us up with a bulk submit form. This let us compile a single file listing many WiFi access points along with their precise locations. Skyhook also offers a web form that allows you to enter a single AP on a Google Map; this is great for one point in a residential setting, but for mapping 20+ APs in a tight area it is not precise enough to yield highly accurate results indoors. Using the locations we got from georeferencing the hotspots, we created a tab-delimited file with latitude/longitude, MAC address, date, and text describing each location. We submitted this file to Skyhook, and in the coming weeks they will add the APs to their database.
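Assembling that file is straightforward to script. The sketch below mirrors the columns described above (latitude/longitude, MAC address, date, description); the exact column order and format Skyhook’s bulk form expects is an assumption here, so check their instructions before submitting anything.

    # Minimal sketch: write a tab-delimited AP submission file from georeferenced
    # hotspot records. Column order mirrors the description in the post; the exact
    # layout Skyhook's bulk form expects is an assumption, so verify before use.
    import csv

    # Hypothetical records pulled from the georeferenced floor plans.
    hotspots = [
        {"lat": 43.09701, "lon": -73.78204, "mac": "00:11:22:33:44:55",
         "date": "2011-05-11", "desc": "Dana, floor 1"},
        {"lat": 43.09712, "lon": -73.78188, "mac": "00:11:22:33:44:66",
         "date": "2011-05-11", "desc": "Harder, floor 2"},
    ]

    with open("skyhook_bulk_submit.txt", "w", newline="") as f:
        writer = csv.writer(f, delimiter="\t")
        for ap in hotspots:
            writer.writerow([ap["lat"], ap["lon"], ap["mac"], ap["date"], ap["desc"]])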

The end result of this work is that once these points have been added, we should be able to determine position indoors with reasonable accuracy. Once Skyhook has added the APs to their location databases, we will compare the accuracy of the iPod Touch to that of the iPhone.

Ford aims to use Google's Prediction API to bolster analytics

Ford on Tuesday said that it will use Google’s new Prediction API, which uses cloud computing, storage and external data for analytics, and combine it with its own research and development on driver predictive behavior. The goal: Create cars that can optimize fuel economy and efficiency based on driver habits.




The announcement, made at Google’s I/O conference in San Francisco, is an interesting tag-team. Ford already has strong auto technology partnerships with Microsoft and Nuance, to name a few. The tie-up with Google highlights how algorithms that connect public and private data will be needed to enhance the driver experience.

Ford presented a case study on how Google’s Prediction API could be used to enhance the performance of a plug-in hybrid electric vehicle. For instance, an electric vehicle driver could have access to a route that would take into account battery usage, optimize power and work out details to stay in all-electric mode in a city.

Among the key points:

* Ford will add Google’s Prediction API to its research and analysis.
* Google’s API can convert historical driving data into real-time predictions.
* The outcome from the algorithm combination will be used to predict routes.

Today, the combination of Google and Ford data is largely conceptual—as are the outcomes. Ford envisions something like this happening:

* The vehicle owner opts into the predictive service, and driver data is collected and encrypted for privacy and security. The system learns over time.
* When the car is started, Google Prediction will use that history to optimize routes and performance based on location and time of day.
* The car, via voice recognition, would confirm its predictions based on input from the driver.
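As a toy illustration of the learn-from-history, predict-at-startup idea (this is not Google’s Prediction API, which is a hosted service; it is just a stand-in frequency model with made-up trip data):

    # Toy stand-in for "learn driving history, predict the next trip": pick the
    # most frequent past destination for a given day-of-week/hour slot.
    # Not Google's Prediction API; the trip data below is made up.
    from collections import Counter, defaultdict

    history = defaultdict(Counter)  # (weekday, hour) -> Counter of destinations

    def record_trip(weekday, hour, destination):
        history[(weekday, hour)][destination] += 1

    def predict_destination(weekday, hour):
        slot = history.get((weekday, hour))
        if not slot:
            return None  # no history for this time slot yet
        return slot.most_common(1)[0][0]

    # Hypothetical driving history.
    record_trip("Mon", 8, "office")
    record_trip("Mon", 8, "office")
    record_trip("Mon", 8, "daycare")

    print(predict_destination("Mon", 8))  # -> "office"

A real system would of course fold in location, traffic, and battery state, but the basic shape is the same: historical observations in, a ranked prediction out.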

The next step for Ford is to conduct feasibility studies on using Google’s Prediction API. In the big picture, Ford’s Google I/O announcement signals that in-car technology is moving beyond infotainment to more complicated tasks.

Johannes Kristinsson, system architect, Vehicle Controls Architecture and Algorithm Design, at Ford said:

“Ford already offers cloud-based services through Ford SYNC, but those services thus far have been used for infotainment, navigation and real-time traffic purposes to empower the driver. This technology has the potential to empower our vehicles to anticipate the driver’s needs.”

Is Spatial Intelligence Essential for Innovation and Can We Increase it Through Training?

Do males and females differ in spatial intelligence?


What is Spatial Intelligence or Spatial Ability?

According to David Lohman of the University of Iowa, spatial ability can best be defined as the ability to "generate, retain, retrieve, and transform well-structured visual images." An example of a great inventor who used his high level of spatial ability to innovate was Nikola Tesla, who provided the basis for alternating current (AC) power systems. Tesla is said (or fabled) to have been able to visualize an entire working engine in his mind and to test each part over time to see what would break first. Rather than a great feat of mental math, one could consider this a great feat of mental imagery.




Why Don't the SAT, ACT, and GRE Include a Spatial Ability Measure?

The Scholastic Assessment Test (SAT), the American College Test (ACT), and the Graduate Record Examination (GRE) traditionally have included mathematical and verbal measures. In some of my research, along with work by my colleagues Rose Mary Webb of Appalachian State University and David Lubinski and Camilla Benbow of Vanderbilt University, we essentially found that over a half century of research on spatial ability has uncovered the significance it plays, in addition to math and verbal ability, in educational and occupational settings where it is essential, such as engineering, physics, math, and computer science. This clearly raises the question: why is there no spatial measure included in these standardized tests? Do we miss identifying a group of individuals who might be spatially talented but less mathematically and verbally talented?

Do Males and Females Differ on Average in Spatial Ability?

Diane Halpern of Claremont McKenna College, in her excellent book Sex Differences in Cognitive Abilities, has documented the evidence that males and females show a robust sex difference in spatial ability, favoring males.

Do Males and Females Differ in Math, Science Reasoning, and Spatial Ability in the Extreme Right Tail?

[Figure: male-female ratios among top scorers on the SAT math and ACT math and science reasoning measures, 1981-2010]

In a recent study I conducted along with my colleagues Megan Cacchio, Martha Putallaz, and Matthew C. Makel of the Duke University Talent Identification Program, we examined data from 1981 to 2010 on the SAT and ACT among over 1.6 million students who were 12 years of age. We found that the male-female ratio among students in the extreme right tail who scored 700 or higher on the SAT-M (the top 0.01% in ability, a group with an average IQ level of about 180) was about 13 to 1 in the early 1980s, but that it decreased rapidly over the following decade and has been stable at roughly 4 to 1 for the past 20 years (see the figure above). In addition to data from the SAT, we also had an independent sample who took the ACT, which includes not only a mathematics measure (ACT-M) but also a science reasoning measure (ACT-S). For the last 20 years, both the mathematics and science reasoning measures have shown a male-female ratio of about 3 to 1.

Why Might This Matter For Male-Female Representation In High Level Math and Science Careers?

As I mentioned in an earlier post, in some of my research, even within the top 1% of mathematical ability for students who took the SAT-Math at age 12, when comparing the top quartile to the bottom quartile, there were significant differences between these groups about twenty years later in the math and science outcomes they earned, including Ph.D.s, publications, patents, and even tenure at a top university. Therefore, because we still find a sex difference on the SAT-Math, ACT-Math, and ACT-Science, math and science reasoning are likely still part of the explanation for the underrepresentation of women in high-level math and science careers. Keep in mind that it is the individual's ability, not their sex, which matters in predicting these long-term outcomes.

However, because neither the SAT nor the ACT includes a spatial ability measure, we were not able to uncover whether there is a male-female difference in spatial ability in the extreme right tail. Because average spatial ability differences between males and females are quite robust, it would make sense that there would also be a male advantage on spatial ability among the highest scorers; small average differences usually translate into large differences in the tails. However, this has not yet been empirically demonstrated and requires future research.

Can We Increase Spatial Ability, Perhaps Through Training?

The fact that we find these differences is intriguing, but perhaps a more important question is what we can do about it. Shouldn't our goal be to help all men and women who have the ability and interest, and who are passionate about math and science, pursue a high-level career in this area? In particular, what could we do to increase the number of women in high-level math and science careers?

Now David Miller, a graduate student at the University of California, Berkeley, and Diane Halpern of Claremont McKenna College have conducted a fascinating study that examines whether spatial ability can be increased through training. The authors examined highly gifted STEM (science, technology, engineering, and mathematics) undergraduates who completed twelve hours of spatial training and compared them to undergraduates who did not complete the training.

According to Mr. Miller, these were the critical findings of the study:

1. "Compared to students in the control group, students in the training group showed larger improvements in spatial skills despite extremely high spatial skills prior to training."

2. "We found large gender differences in spatial skills prior to training, as many other researchers have. However, these gender differences were narrowed after training."

3. "Students in the training group had one-third of a letter grade higher GPA in a challenging calculus-based physics course."

4. "None of these training improvements lasted over eight to ten months."

Mr. Miller told me that "These results demonstrate that even highly gifted STEM undergraduates can benefit from spatial instruction, although twelve hours of instruction could have limited longitudinal effects."

Monday, 9 May 2011

NASA concludes Gravity Probe B space-time experiment, proves Einstein really was a genius!


Well, it looks like Einstein knew what he was talking about, after all. Earlier this week, researchers at NASA and Stanford released the findings from their six-year Gravity Probe B (GP-B) mission, launched to test Einstein's general theory of relativity. To do so, engineers equipped the GP-B satellite with four ultra-precise gyroscopes to measure two pillars of the theory: the geodetic effect (the bending of space and time around a gravitational body) and frame dragging (the extent to which rotating bodies drag space and time with them as they spin on their axes). As they circled the Earth in polar orbit, the GP-B's gyroscopes were pointed squarely at the guide star IM Pegasi while engineers observed their behavior. In the universe outlined by Einstein's theories, space and time are interwoven to create a four-dimensional web, atop which the Earth and other planetary bodies sit. The Earth's mass, he argued, creates a vortex in this web, implying that all objects orbiting the planet would follow the general curvature of this dimple. If the Earth's gravity had no effect on space and time, the orientation of the gyroscopes would have remained unchanged throughout the orbit. Ultimately, though, researchers noticed small but quantifiable changes in their spin axes as they made their way around the globe -- changes that corroborated Einstein's theory. Francis Everitt, a Stanford physicist and principal investigator for the mission, poetically explained the significance of the findings in a statement:

"Imagine the Earth as if it were immersed in honey. As the planet rotated on its axis and orbited the Sun, the honey around it would warp and swirl, and it's the same with space and time. GP-B confirmed two of the most profound predictions of Einstein's universe, having far-reaching implications across astrophysics research. Likewise, the decades of technological innovation behind the mission will have a lasting legacy on Earth and in space."


The GP-B mission was originally conceived more than 50 years ago, when the technology required to realize the experiment did not yet exist. In fact, the experiment didn't actually get off the ground until 2004, when the satellite was launched into orbit 400 miles above Earth. After spending just one year collecting data (and an impressive five years analyzing the information), NASA has finally confirmed something we always quietly suspected: Einstein was smart.
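For readers who want the quantitative statement behind the two effects, the standard textbook expressions for the precession of a gyroscope orbiting a spinning mass are, as a sketch of the predicted drift rates (not GP-B's actual data analysis):

    \boldsymbol{\Omega}_{\text{geodetic}} = \frac{3GM}{2c^{2}r^{3}}\,(\mathbf{r}\times\mathbf{v}), \qquad
    \boldsymbol{\Omega}_{\text{frame dragging}} = \frac{G}{c^{2}r^{3}}\left[\frac{3(\mathbf{J}\cdot\mathbf{r})\,\mathbf{r}}{r^{2}} - \mathbf{J}\right]

where M and J are the Earth's mass and angular momentum, and r and v are the gyroscope's orbital position and velocity. For a polar orbit a few hundred miles up, the first term works out to a drift of a few arcseconds per year and the second to only a few tens of milliarcseconds per year, which is why the gyroscopes had to be so extraordinarily precise.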

Sunday, 8 May 2011

UCLA Geographers Found Bin Laden's Hideout Long Before the CIA

Two years ago, a class of UCLA undergrads pretty accurately predicted the location where Osama Bin Laden was hiding out. The students, working under UCLA geography professors Thomas Gillespie and John Agnew, used geographical theories and GIS software to home in on the world's most wanted fugitive.




Science Insider explains:

According to a probabilistic model they created, there was an 89.9% chance that bin Laden was hiding out in a city less than 300 km from his last known location in Tora Bora: a region that included Abbottabad, Pakistan, where he was killed last night.

On top of this, they identified 26 "city islands" that they considered to be the highest-probability hideouts. To be clear: the class identified the nearby city of Parachinar, not Abbottabad, as the most likely hideout.

Here's the kicker: Gillespie's focus isn't national security or terrorism or intelligence or any sort of political geography. He works on ecosystems. Said Gillespie:

It’s not my thing to do this type of [terrorism] stuff. But the same theories we use to study endangered birds can be used to do this.

Gillespie's class focuses on using remote sensing from satellites to study ecosystems, and one common challenge is figuring out where endangered species are likely to be located within an ecosystem. As a class exercise, Gillespie introduced the Bin Laden search. The students used a geographical theory called "island biogeography" to home in on what turned out to be Bin Laden's real hideout. Gillespie was so impressed by his students' work that they published the findings in the MIT International Review (PDF).
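To give a flavor of the distance-decay reasoning behind that kind of model (the decay rate, candidate cities, and distances below are made up for illustration and are not the figures from the class's paper):

    # Rough illustration of distance-decay: the likelihood that a fugitive (or an
    # endangered species) turns up at a site falls off with distance from the
    # last known location. Decay rate and distances are made up for illustration;
    # this is not the class's actual model.
    import math

    # Hypothetical candidate "city islands" and straight-line distances (km)
    # from the last known location at Tora Bora.
    candidates = {
        "Parachinar": 20,
        "Abbottabad": 260,
        "Quetta": 650,
    }

    def decay_weight(distance_km, decay_per_km=0.005):
        """Exponential distance-decay weight; farther away -> less likely."""
        return math.exp(-decay_per_km * distance_km)

    weights = {city: decay_weight(d) for city, d in candidates.items()}
    total = sum(weights.values())
    for city, w in sorted(weights.items(), key=lambda kv: -kv[1]):
        print(f"{city}: {w / total:.1%}")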

Now that his work is being celebrated by the intelligence community, will Gillespie be working with the FBI to track down more of our most wanted?

Nope. Says Gillespie:

Right now, I’m working on the dry forests of Hawaii where 45% of the trees are on the endangered species list. I’m far more interested in getting trees off the endangered species list.

The Little-Known Agency That Helped Kill Bin Laden

The National Geospatial-Intelligence Agency mapped bin Laden's compound, analyzed drone data, and helped the SEALs simulate their mission

President Obama's first brush with the National Geospatial-Intelligence Agency was ignominious. Out for lunch in May 2009, at a Five Guys burger franchise in Washington, the new President started to shake the hands of other customers, TV cameras in tow. Then he turned to men with government ID badges.

"So what do you do?" the president asked. "I work at NGA, the National Geospatial-Intelligence Agency," one said.

"Outstanding. How long have you been doing that?" Obama wondered. "Six years." Obama then asked: "So, explain to me exactly what this National Geospatial..." His voice trailed off. "Uh, we work with, uh, satellite imagery." Obama: "Sounds like good work." The response is obscured by the audio.

Suffice it to say: Obama knows what the NGA does today.

Any number of officials and agencies have been in the limelight since the raid on Osama bin Laden, including the CIA and the Defense Department. But the little-known and little-heralded work of the National Geospatial-Intelligence Agency, often called the NGA, was central to the demise of the terrorist leader.

The NGA integrates several core intelligence functions. It makes maps and interprets imagery from satellites and drones; it also exploits the electromagnetic spectrum to track terrorists and decipher signatures from enemy radar. And notably, the NGA is the first intel agency to be headed by a woman: Letitia Long, a longtime intelligence veteran.

The NGA's contributions to the bin Laden mission are substantial. As described to National Journal by senior U.S. policymakers who do not work for the agency, they include:

* Creating three-dimensional renderings of the Abbottabad compound using imagery and laser-based sensing devices (laser radar, or ladar).

* Analyzing data from a sophisticated next-generation drone that kept watch on the compound before, during, and after the raid. The drone was an RQ-170 built by Lockheed Martin.

* Helping the Joint Special Operations Command create mission simulators for the pilots who flew the helicopters into the breach. (This was first reported by Washingtonian magazine.)

* Providing the CIA and other policymakers with assessments of the number of people living inside the compound, as well as their heights and genders.

NGA Director Long issued a statement on May 2 saying, "I am extremely proud of the work that NGA men and women have done that led directly to this outcome. Their GEOINT was critical to helping the intelligence community pinpoint bin Ladin's compound."

Long is being modest. The NGA also helped the CIA find the compound itself. Based on tips from human sources and intercepted cell phone conversations, intelligence analysts had a basic description of the type of place where the courier trusted by bin Laden lived. But it was NGA analysts who pored over detailed maps and crunched data, coming up with several places in Pakistan that fit the model. One of them was Abbottabad. The bin Laden compound itself was easily noticeable.

"We found a location, and it eventually, as we got more and more fidelity, we were able to render it visually. You can build a story and an understanding that you can take to the senior analysts," a senior NGA analyst said in an interview.

The analyst, who participated in the bin Laden hunt and who has been forward deployed to Afghanistan, was permitted to speak only on the condition of anonymity. His account was confirmed by intelligence officials who don't work for the agency.

Not surprisingly, NGA's technological capacities are very secret, because if the terrorists can ascertain what the agency can do with remote sensing data, they can alter their plans accordingly. But the NGA analyst, who had several tours of duty alongside warfighters in Afghanistan, described some of them.

* NGA can determine, from quite a distance, what an object or a building is made out of.

* It conducts sophisticated pattern analysis of human characteristics, like gait and body size.

* It possesses some of the most sophisticated facial recognition software on earth.

* It has mastered "all-weather" imagery analysis: hyperspectral and multispectral sensors on satellites and drones can see through thick clouds.

Professional defense and aviation journals suggest that NGA, the National Reconnaissance Office, and the Air Force have developed sensors that can penetrate foliage on the ground, peek deep under water and even into the ground. Officials wouldn't comment.

James Clapper, the director of national intelligence, ran NGA during the first part of George W. Bush's administration. Though the agency is best known for its maps, he shifted its focus toward real-time, dynamic, three-dimensional support using all parts of the spectrum.

When Clapper said in a statement after the bin Laden raid that the intelligence integration he observed in the operation was "the best" he's seen in 50 years of intelligence service, part of his pride came from the knowledge that his former agency was a significant contributor to the mission's success.

NGA's work is expensive, but it has escaped much of the budget pressure faced by other national security agencies, in part because the ability to exploit geospatial intelligence is so essential to the war on terror and to dealing with states with extensive nuclear ambitions, such as Iran and North Korea.

"We are very proud of the role we played, but there are problems to scaling this," the analyst said. By scaling, the analyst means that demand exceeds capacity. The agency describes itself as serving customers (other departments), and those customers are ravenous. They include not only the military but also, say, the Federal Emergency Management Agency when it needs help with hurricanes and wildfires. If it involves analyzing aerial data, the NGA is on it. In a world of climate change, terrorists and rogue states, the demand for the agency's analysis isn't likely to abate anytime soon, even with the death of Osama bin Laden.