Pitney Bowes Business Insight (PBBI), formerly Group1 and MapInfo, provides location intelligence and predictive analytics to business and government customers. V1 Editor Matt Ball spoke with Sean Richards, Director, Product Management at Pitney Bowes Business Insight, Asia-Pac, about the evolving focus of the company, and about the new approaches they’re taking to expand the market for location intelligence.
V1: Under Pitney Bowes ownership of MapInfo there has been an emphasis on enterprise data. What kinds of benefits does spatial technology bring to a large enterprise customer, and where does the MapInfo toolset fall within the larger business of Pitney Bowes?
Richards: Enterprise data management isn’t really a spatial story; it’s an enterprise data story. Within Pitney Bowes, the software business is $500 million of a $6.5 billion company, so we’re a little small relative to the overall company, but we’re definitely part of the overall growth strategy.
Enterprise data management is a capability of ours that began in 2007, and we’ve been investing in it for a number of years. Principally, the story starts with data quality, data governance, trusted data programs, cleaning and verifying data, and not analysing your data until you can trust it.
It doesn’t sound very spatial, but what we’ve done is to put all the gains that you get from spatial under the heading of enrichment. In the spatial world there are some long-established concepts such as proximity analysis, point-in-polygon analysis, spatial relationships, drive-time catchments, point-to-point routing and geocoding. These are all fairly fundamental spatial concepts. But if you pull them into a mainstream data environment where you’re passing the data through validation procedures and verifying and matching against authoritative data, it then provides a unique means of enriching that data with more information – a richer perspective of what your data represents. I think that is the key point here – fundamental spatial concepts applied to mainstream data enrichment requirements. When it comes to data analytics, the potential for drawing insight is significantly enhanced.
We talk to insurance companies all the time about ways to score risk quickly for new policies before they underwrite the risk. If you can verify the address, geocode the address, and then run proximity analysis on the address for its relationship to flood hazards, bushfires, and propensity models for theft, break-and-enter, etc., then you can better measure and score these risks.
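The verify-geocode-proximity workflow described above can be sketched in a few lines. This is a minimal illustration, not PBBI’s actual product API: the coordinates, hazard centroids, and the linear scoring formula are all invented for the example, and a real system would use a proper geocoder and polygon hazard layers rather than centroid distances.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two points."""
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# Hypothetical geocoded policy address and flood-zone centroids
# (illustrative data only).
policy = {"address": "12 Example St", "lat": -33.870, "lon": 151.210}
flood_zones = [(-33.875, 151.215), (-33.950, 151.100)]

def flood_risk_score(lat, lon, zones, cutoff_km=2.0):
    """Score 0-100: the closer to a known flood zone, the higher the risk.

    Beyond cutoff_km the score is zero; inside it, the score rises
    linearly as distance to the nearest zone shrinks.
    """
    nearest = min(haversine_km(lat, lon, zlat, zlon) for zlat, zlon in zones)
    if nearest >= cutoff_km:
        return 0.0
    return round(100 * (1 - nearest / cutoff_km), 1)

score = flood_risk_score(policy["lat"], policy["lon"], flood_zones)
```

In a production setting the same pattern repeats per hazard: geocode once, then run each proximity test against its own authoritative layer, and feed the resulting scores into the underwriting decision.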
It’s traditional spatial capability that has been re-purposed as enrichment, and that story is unique and highly valuable in the enterprise data management space.
V1: Does this then tie into the location intelligence marketing and approach that you’ve taken for some time?
Richards: This is probably a more robust orchestration of location intelligence. We use the term location intelligence primarily because the term GIS is a little scary to users in some industries. In government agencies and some commercial organisations you have GIS professionals. But there are plenty of organisations that do not want to invest in a specialist expertise like GIS when really all they’re looking for is where to put their next retail outlet or identify where their vehicles are right now. The GIS capability is critical, but it’s not a profession in which such industries necessarily want to specialise.
Academia doesn’t like the sound of that, and they try to promote a position that GIS should be in every IT portfolio. Our position is more pragmatic, where we respond to what the market needs, which essentially is to present location intelligence in a form that is readily digestible and actionable for mainstream enterprise data management and business process workflows.
We have property analysts in a commercial property business, and they want to map out their expansion over the next five years. They need technology that gives them the answers, where location is just one set of variables that factors in with a whole range of other variables like parking spaces, town planning, what the land costs, and how long it will take for development to require more retail outlets. Putting location in there is elegant; it’s not the be-all and end-all, but it adds significant value to the overall trusted data program, delivering insights they didn’t have before but critically rely on for competitive advantage.
It’s almost like fifteen years ago, when you put up a thematic map and people were wowed by it. You’re just reaching another tier of user base that doesn’t yet appreciate the gains of combining their data with proximity to other data. By ensuring better quality data and better enriched data we facilitate more successful business intelligence (BI) and customer relationship management (CRM).
V1: How is the data enrichment delivered? Is this an enterprise-level database tool, a web-based service, or what?
Richards: We begin with an organisation that has a need to perform analytics on their business data, but identifies a data hygiene problem that hinders the required analytics. It’s a project in itself to lift the data to a level of established trust, and technology can be deployed to run the data through a whole sequence of processes until it reaches a standard that complies with what they need. Then it’s about ongoing policing of data integrity.
Once data integrity has been established, the same technology needs to be deployed in a dynamic fashion to ensure ongoing data management adheres to quality practice. This is best delivered through a Web Services infrastructure that can deal with an organisation’s numerous business systems, some of which may be ageing or legacy in nature. As before, enrichment through spatial operations is performed on demand to satisfy enrichment requirements.
It’s important that these processes aren’t buried in a bespoke, compiled program. An organisation’s need for data management and data enrichment evolves over time. Assumptions are tested and requirements change. The procedural workflows need to be readily available for tweaking and refining over time without the need for cumbersome application recompilations. We empower data stewards to guide the processes for ongoing data integrity and enrichment with configurable workflow management controls that manage these essential Web services.
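The idea of workflows that data stewards can tweak without recompiling an application can be sketched as a configuration-driven pipeline. Everything here is illustrative, not the PBBI product: the step names, record fields, and the stubbed geocoder and proximity test are invented; the point is that the workflow order lives in data, so changing it requires no rebuild.

```python
# Each enrichment step is a small function from record to record.
def validate_address(rec):
    rec = dict(rec)
    rec["address"] = rec["address"].strip().upper()
    return rec

def geocode(rec):
    rec = dict(rec)
    rec["lat"], rec["lon"] = -33.87, 151.21  # stubbed geocoder result
    return rec

def flag_flood_zone(rec):
    rec = dict(rec)
    rec["in_flood_zone"] = rec["lat"] < -33.8  # stubbed proximity test
    return rec

STEPS = {"validate": validate_address, "geocode": geocode, "flood": flag_flood_zone}

# The workflow itself is just configuration: a steward can reorder,
# add, or drop steps here without touching compiled code.
workflow = ["validate", "geocode", "flood"]

def run(record, workflow):
    for name in workflow:
        record = STEPS[name](record)
    return record

enriched = run({"address": "  12 example st  "}, workflow)
```

Exposing each step as a web service, as Richards describes, follows the same shape: the orchestration layer reads the configured sequence and calls the services in order.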
V1: I’ve noticed the recent launch of MapInfo Manager. Where does that tool fit in terms of data quality management?
Richards: MapInfo Manager is less about data quality than it is about spatial data infrastructure. It’s about metadata creation, management, discovery, and using metadata as a means to collaborate and share spatial data as well as to make fit-for-purpose decisions. It’s not the same everywhere in the world, but generally the spatial industry has done a poor job of adhering to good data management practices through the diligent employment of metadata and maintenance of quality spatial data infrastructure.
I’ve seen umpteen programs where an organisation indicated that they will capture their metadata, and then they pop their heads up a year later with people looking very exhausted and in pain. Then they make a special website for searching the metadata, but these agencies don’t market that site well and people forget about it. Then, the whole ecosystem of what makes a metadata management program work falls over because it isn’t readily consumable. Painful to administer, impractical to exploit.
There could be agencies around the world that have been very good about this and diligent in working this out, but many agencies that we see just don’t handle this well. The result is that it takes a long time for a user to find the data they are looking for, and it’s hard to understand if the data is suitable for their analysis. GIS managers struggle with the issue of having 40 different versions of the same data set, because people keep copying and editing the data but forgetting to upload it again to the server. Data custodians suffer because they can’t control the master version and ensure people are using it.
MapInfo Manager was developed to try to change that whole ROI equation for metadata, a shift triggered in part by the EU’s INSPIRE program. Metadata feels like a chore that isn’t really worthwhile, because it is hard to build and maintain, and hard to make accessible and valuable. Metadata discovery needs to be injected into the workflow of the user.
We have invested heavily in the MapInfo suite to facilitate a simple story: data custodians can quickly and easily create and maintain metadata, faster than ever before; users can access catalogs of this metadata when needed, as part of their existing workflow using existing technology; then they can determine appropriateness of use and then access that data. Simple administration and simple access changes the return on investment equation for SDI massively.
V1: How does MapInfo Manager go about aiding and facilitating better metadata management?
Richards: We learned a lot of lessons from our acquisition of Encom in 2008. They are market leaders in natural resource spatial analytics, empowering geologists for mineral exploration. Along the way they built up impressive expertise around spatial data infrastructure and metadata management. They built some tools that made the harvest of metadata fast, the uploading very easy, and the consumption very simple. We took all those capabilities and inserted them into the MapInfo suite.
MapInfo Manager offers data access orchestration where people register all their spatial assets; it doesn’t replicate the data, it simply records where each asset lives and tells the user. The user interrogates catalogs to discover data. You know, it is the little things that make a big difference, like the automatic harvesting of some metadata values and the ability to bulk update other variables. I have seen metadata tools that require a data custodian to type in their contact details for every dataset they are responsible for. Such lunacy belongs in the dark ages.
Our extensive standards conformance is great too. When a user chooses to publish a catalog it is published to the OGC standard so that anyone can search and discover that catalog, and access it with the right permissions. Spatial metadata standards are also adhered to, like ISO 19115 and ISO 19139 to ensure organisations can facilitate optimum collaboration and clarity.
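Because the catalog is published to an OGC standard, any standards-aware client can query it; OGC’s Catalogue Service for the Web (CSW) is the usual interface for this. The sketch below builds a CSW 2.0.2 GetRecords keyword query using only standard KVP parameters. The endpoint URL is hypothetical, and whether a given MapInfo Manager deployment exposes exactly this interface is an assumption here.

```python
from urllib.parse import urlencode

# Hypothetical catalog endpoint (illustrative URL only).
CSW_ENDPOINT = "https://catalog.example.gov/csw"

def build_getrecords_url(keyword, max_records=10):
    """Build a CSW 2.0.2 GetRecords KVP request filtered by keyword.

    Uses a CQL text constraint so any record whose text mentions the
    keyword is returned as a summary record.
    """
    params = {
        "service": "CSW",
        "version": "2.0.2",
        "request": "GetRecords",
        "typeNames": "csw:Record",
        "resultType": "results",
        "elementSetName": "summary",
        "maxRecords": max_records,
        "constraintLanguage": "CQL_TEXT",
        "constraint_language_version": "1.1.0",
        "constraint": f"AnyText LIKE '%{keyword}%'",
    }
    return CSW_ENDPOINT + "?" + urlencode(params)

url = build_getrecords_url("flood")
```

The response from such a request carries ISO 19115 metadata serialised as ISO 19139 XML, which is what lets a user judge fitness for purpose before ever fetching the data itself.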
V1: Are there synergies between the data management and spatial data infrastructure efforts?
Richards: Absolutely. We’re working to bring these two closer together, because we think there’s a lot of value that comes from enterprise data management workflow procedures that haven’t really been introduced to the spatial arena. We think by introducing workflow management capabilities that the spatial community can better penetrate mainstream IT. To date, we have just scratched the surface.
The MapInfo suite will be further expanded this year to bring enterprise data management practices and technology to the spatial arena. For me, I am most excited by the enterprise designer capabilities that enable a spatial professional to control the use of location intelligence as configurable web services. Think simple web mapping like Google Maps, with a full breadth of spatial analytics available behind the scenes to exploit in a discrete fashion. Access to capability will be modular and we have plans in the roadmap for this capability to be cloud based in horizon 2.
V1: Both the data management and the SDI-focused offerings are at a rather large scale. Is the bulk of your market and focus on large enterprise customers?
Richards: It’s a key strategy of ours to offer enterprise-grade capability. The MapInfo brand has evolved over time to be more than powerful GIS for the analyst. Organisations need the output of those analysts injected into enterprise workflow and analytics. We are building out the MapInfo technology suite to facilitate this: exploiting location intelligence for the masses, not just the few.
We still invest heavily for those analysts though. We maintain dedicated analyst offerings, some application specific. MapInfo Crime Profiler has specific algorithms for crime incident hot spotting, while MapInfo Discover has everything a geologist needs to help a mining company perform exploration research and analysis.
V1: What is the big-picture vision in relation to the traditional GIS community?
Richards: The over-arching vision behind Pitney Bowes Business Insight is to enable lifetime customer relationships. The commercial organisations that we deal with need to have happy customers, because it means return business, growth, less churn in their customer base, and targeted upsell to the right customers. From a corporate perspective the opportunity these days to connect and communicate with your customer is limited. You need to make every interaction count. Behind that is a myriad of data analytics and decision support processes. Behind that is trusted data, rich data – a platform to facilitate the effective management of the customer lifecycle. Location intelligence is woven throughout the process.
From a public sector perspective we believe this agenda to engage the ‘customer’ is not lost on agency leaders. There is a groundswell of interest around the world to provide more rapid, more dynamic, more pertinent information to a highly mobile and demanding community. Call it eGov or Gov 2.0, or whatever, the need to make every interaction count remains the same.
Coming back to our vision of enabling lifetime customer relationships, when you dig a little deeper you can see that location is a critical element of that. Our journey includes helping the GIS community appreciate that. We want to be more than a company that can process spatial data faster than the competition; instead we want to take the story to another level and make location intelligence an essential capability used across an organisation in a mission-critical fashion.