I don’t have to remind you that technology is marching on at an incredible pace. Software and hardware become obsolete in roughly the time it takes FedEx to deliver the boxes.
Only a few years ago, an entire sales force armed with palm computers and wireless real-time connections to shipping, inventory and customer service was an expensive dream, or should I say a technological nightmare. Whether the goal is to implement BI, CRM, SFA, PRM or ERM, every initiative to better the business, enable the employee or know the customer depends on one thing: the sharing of data.
In order to build customer intelligence, companies must strive to maintain centralized repositories of information, regardless of their storage capabilities and current operational tools. While each department in a company has its own needs and systems, these islands of data must be shared for the common good of the customer.
On paper, data integration may seem simple; after all, how hard can it be to unify data from multiple sources? All one needs to do is extract data from the diverse sources, transform the feeds to a single, common format and load the result into a central repository such as a database or data warehouse. This process is referred to as ETL: extract, transform and load.
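The three steps can be sketched in miniature. This is a minimal illustration, not any vendor's implementation: the two departmental feeds, their field names and their formats are invented, and an in-memory SQLite database stands in for the central warehouse.

```python
import sqlite3

# Two hypothetical departmental feeds with incompatible shapes;
# the field names and formats are illustrative assumptions.
sales_feed = [{"cust": "Acme Corp", "ordered": "2001-03-05", "total": "1,250.00"}]
billing_feed = [{"CUSTOMER": "ACME CORP", "INV_DATE": "03/09/2001", "AMT": 300.0}]

# Transform: map each feed onto one common (customer, date, amount) shape.
def from_sales(r):
    return (r["cust"].upper(), r["ordered"], float(r["total"].replace(",", "")))

def from_billing(r):
    m, d, y = r["INV_DATE"].split("/")
    return (r["CUSTOMER"].upper(), "%s-%s-%s" % (y, m, d), r["AMT"])

# Load: a central repository (in-memory SQLite standing in for a warehouse).
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE activity (customer TEXT, event_date TEXT, amount REAL)")
rows = [from_sales(r) for r in sales_feed] + [from_billing(r) for r in billing_feed]
db.executemany("INSERT INTO activity VALUES (?, ?, ?)", rows)

for row in db.execute("SELECT * FROM activity ORDER BY event_date"):
    print(row)  # one unified row per event, regardless of origin
```

The point of the transform step is visible in the two small functions: each source keeps its native format, and only the mapping code needs to know about it.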
There are entire companies dedicated to the first step of the process. This step must occur before all else, as this is the point at which the roadmap for the entire effort is planned. Defining and mapping the data, its repositories, formats, values, transformations, frequency of update and so on, can be viewed as charting the veins and arteries of the company's lifeblood: data. The result is data that describes data, known as metadata. This is one step that should not be rushed, as errors can easily remain hidden and have disastrous effects down the road. New tools are being created every day to assist in mapping data, sources, frequencies and variables. One such product is AMB Dataminers' eCartography.
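A fragment of such a data map might look like the following. This is a hedged sketch only: the field names, system names and frequencies are invented, and real mapping tools keep far richer catalogs than a dictionary.

```python
# A hypothetical fragment of a data map: metadata describing where a
# field lives, where it lands, its format, and how often it refreshes.
data_map = {
    "customer_name": {
        "source": "CRM.contacts.full_name",
        "target": "warehouse.customer.name",
        "format": "text, uppercased on load",
        "update_frequency": "nightly batch",
    },
    "order_total": {
        "source": "ERP.orders.amount",
        "target": "warehouse.activity.amount",
        "format": "decimal, 2 places, USD",
        "update_frequency": "real time",
    },
}

def fields_updated(frequency):
    # Metadata lets us ask questions about the data itself,
    # e.g. which fields refresh on the nightly batch cycle.
    return [f for f, meta in data_map.items() if meta["update_frequency"] == frequency]

print(fields_updated("nightly batch"))  # ['customer_name']
```

Even this toy map shows why the step rewards care: a wrong entry here silently misroutes every downstream load.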
Once the data sources have been mapped, pipelines must be established to carry data to the database or warehouse, or to set up two-way communication between multiple systems. Defining "views" of the data is also required to meet the needs of users and applications. This step includes establishing security rules, such as defining or limiting access to data fields, types, values or controls.
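A view-based access rule can be sketched in a few lines. The table, columns and the idea that sales staff should not see credit limits are all hypothetical; the mechanism, exposing a restricted view instead of the base table, is the standard one.

```python
import sqlite3

# The underlying table holds a sensitive field (credit_limit);
# the view exposes only what one audience is permitted to see.
# Table and column names here are invented for illustration.
db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE customer (
    id INTEGER, name TEXT, phone TEXT, credit_limit REAL)""")
db.execute("INSERT INTO customer VALUES (1, 'Acme Corp', '555-0100', 50000.0)")

# The sales view omits credit_limit, enforcing a field-level access rule.
db.execute("CREATE VIEW sales_view AS SELECT id, name, phone FROM customer")

cols = [c[0] for c in db.execute("SELECT * FROM sales_view").description]
print(cols)  # ['id', 'name', 'phone'] -- credit_limit is not visible
```

In a production database the same idea is paired with grants, so that the audience can query the view but not the base table.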
The tools and functionality used to connect data also grow, change and develop at an astronomical rate. There are many established protocols for common exchange, such as EDI (Electronic Data Interchange), ODBC (Open Database Connectivity), JDBC (Java Database Connectivity, modeled after ODBC) and UDA (Universal Data Access, a Microsoft specification).
Additionally, there are technologies for managing the exchange of metadata and of the data itself. DBMS (Database Management Systems), OLE (Object Linking and Embedding, a Microsoft specification), ADO (ActiveX Data Objects, built on OLE DB), XML (Extensible Markup Language, often used to express metadata) and CORBA (Common Object Request Broker Architecture) are just a few.
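To make the XML-as-metadata idea concrete, here is a minimal descriptor for a single exchanged field, built and parsed with Python's standard library. The element and attribute names are invented for illustration; real metadata interchange formats define much richer schemas.

```python
import xml.etree.ElementTree as ET

# A hypothetical XML descriptor for one mapped field: data about data.
field = ET.Element("field", name="customer_name")
ET.SubElement(field, "source").text = "CRM.contacts.full_name"
ET.SubElement(field, "type").text = "string"
ET.SubElement(field, "updated").text = "nightly"

xml_text = ET.tostring(field, encoding="unicode")
print(xml_text)

# Because the descriptor is plain XML, any XML-aware partner or tool
# can parse it back without knowing anything about the producer.
parsed = ET.fromstring(xml_text)
print(parsed.get("name"), parsed.find("type").text)
```

This self-describing quality is precisely what makes XML useful as a neutral carrier between otherwise incompatible systems.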
Among the entities exchanged in the integration flow are DAO (Data Access Objects, which work with Visual Basic and the Jet database engine), BLOBs (Binary Large Objects, collections of binary data in a DBMS) and RDO (Remote Data Objects).
Software usually functions in real time, in batch mode or in a combination of the two. Conversion tools such as Group1 List Convert allow batch conversion of rental lists for a prospect mailing, while DQ Plus functions in real time to perform on-line data validation and correction. Enterprise application integration (EAI) tools often function not only as "middleware", a gatekeeper and pipeline between applications and legacy systems, but also as a hygiene tool, data warehouse, performance monitor and analytical tool.
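Real-time validation and correction can be sketched as a function applied at the point of entry. This is a toy in the spirit of such tools, not any vendor's logic; the correction table and the rules are invented for illustration.

```python
# A hypothetical lookup of known fixes for a single field (US state).
STATE_FIXES = {"N.Y.": "NY", "new york": "NY", "Calif.": "CA"}

def validate_record(rec):
    """Correct what can be corrected automatically; flag what cannot."""
    errors = []
    rec = dict(rec)  # never mutate the caller's record
    state = rec.get("state", "")
    rec["state"] = STATE_FIXES.get(state, state)
    if len(rec["state"]) != 2 or not rec["state"].isalpha():
        errors.append("unrecognized state")
    if "@" not in rec.get("email", ""):
        errors.append("invalid email")
    return rec, errors

clean, errs = validate_record({"state": "N.Y.", "email": "buyer@example.com"})
print(clean["state"], errs)  # NY []
```

The same function could run record-by-record as a rep types (real time) or over a whole rental list at once (batch), which is exactly the dual mode described above.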
Application service providers and service bureaus often offer suites of services combining technology, tools and proprietary data repositories of accumulated or derived data. Providers include companies such as Acxiom, Experian and Polk.
Integrated data provides the ability to run unified query, reporting and update services across multiple, diverse data sources, often independent of each source's location, platform or structure. This kind of initiative can give an organization a clearer, more complete picture of its customer base and sales channel interactions, and can speed accounting or inventory reconciliation.
The integration process helps stabilize and normalize the environment, opens access to legacy systems, increases efficiency and reduces costs. Integration creates a bridge for information flow where none may have existed before, allowing knowledge to be shared for better decision support and reporting.
Now those sales reps armed with state-of-the-art palm computers can log onto the company network via wireless links and middleware and access customer data. A rep can add data on a prospect that is verified in real time, have information appended, such as credit information from a third-party source, set up an order that is replicated in the legacy inventory and shipping systems, and track the shipment. Once data is integrated across platforms, the rep can also check inventory in real time to find slow-moving items and tailor the sales call to move older stock, or access customer service records to verify the status of calls made to the company even moments before walking in the door.
As you can see, building customer intelligence starts with data, the most basic of building blocks. As long as the blocks work together, a strong foundation can be built to support both the customer and the company. If data is maintained as separate islands, there will be plenty of room for customers to fall through the cracks.