Simplifying Your IT Environment: Steps for Successful Multiple Vendor Data Collation

March 16, 2016 By Tony Sprague

Data integration is a complex issue, but a framework of user experience best practice gives you a platform for continuous improvement.

There’s data, data everywhere. And like an insatiable woodpecker, your marketing team is probably tapping away at you and your colleagues in IT for faster and more reliable access.

But in a multivendor environment, with data coming in from multiple channels and being stored in diverse repositories, how can you begin to establish best practice for a high-quality user experience?

The path to successful multi-vendor data integration is certainly fraught with risks. But there are a number of steps you can take to address them, while pursuing a strategy of continuous improvement which moves towards excellence – for your end-users, and for the performance of your systems.

Customer first

The benefits to the business – including a clear, customer-centric approach to data analysis – are increasingly well understood by marketing: one of the reasons for their insistent tapping!

They are well explored in technology writer J. D. Hildebrand’s insightful post looking at How Data Integration Tools can Turbocharge your Marketing, in which he concludes: “Without data integration tools, valuable existing data remains locked up in silos, the flow of actionable data is slower, and a company’s competitive edge can erode”.

Best practice

On the plus side, best practice data integration not only prevents an accumulation of data silos, but also delivers key IT management benefits – cutting out vendor lock-in and reducing total cost of ownership (TCO).

The downside, however, is that a multivendor environment can introduce greater complexity in terms of the sheer number of layers that need to be managed – and the number of consoles required for error-tracking.

And as a food-for-thought slideshow from Information Management suggests, there are pitfalls to avoid when it comes to establishing user experience best practice in such an environment:

  • Don’t misidentify your metadata definitions

  • Pay attention to the data itself, not just the business process

  • Don’t take a common document architecture for granted

  • Don’t let the arrival of the next big thing overtake your core data integration efforts

  • Don’t take a cost-first approach to the nuts and bolts of integration: break the coding and the Extract, Transform and Load (ETL) processes down into discrete stages to see where the priorities lie for your business – see the sketch after this list.
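To make that last point concrete, here is a minimal Python sketch of an ETL pipeline broken into separately timed stages. Everything in it – the data, the stage names, the sources – is purely illustrative, not any particular vendor’s tooling; the point is simply that once each stage is a discrete, measurable step, you can see where the cost actually sits.

```python
# A minimal, hypothetical ETL pipeline split into discrete stages so each
# can be timed (and costed) separately. All names and data are illustrative.
import time

def extract():
    # Stand-in for pulling rows from a vendor repository or channel feed.
    return [{"customer": "A1", "spend": "120.50"},
            {"customer": "B2", "spend": "80.00"}]

def transform(rows):
    # Normalise types so downstream systems agree on a common format.
    return [{"customer": r["customer"], "spend": float(r["spend"])} for r in rows]

def load(rows):
    # Stand-in for writing to the target analytic store.
    print(f"Loaded {len(rows)} rows")

def timed(stage, fn, *args):
    # Time one stage of the pipeline and report it.
    start = time.perf_counter()
    result = fn(*args)
    print(f"{stage}: {time.perf_counter() - start:.4f}s")
    return result

rows = timed("extract", extract)
rows = timed("transform", transform, rows)
timed("load", load, rows)
```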

Whether you are looking at the network or the storage infrastructure, the advice from non-partisan commentators is to capitalise on the benefits of using third-party multi-vendor tools to manage your environment.

Accurate reporting

As ManageEngine marketing analyst Nikhil Premanandan says in a Data Center Knowledge post on the benefits of a multi-vendor storage strategy, success can depend on making best use of the reporting functions in these tools to provide scientific forecasts for resource consumption.

And this is a key area for data integration, which is in itself a resource-hungry strategy – at a time when the demand for business analytics is competing for the same resources.
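As a simple illustration of what such forecasting can look like in practice, the sketch below fits a linear trend to monthly storage figures and projects consumption a few months ahead. The numbers are invented for the example; in reality they would come from your management tool’s reporting export.

```python
# A toy forecast of storage consumption from monthly report figures.
# The usage numbers are invented; a real tool's reporting export would
# supply them. Fits a simple linear trend and projects three months ahead.
months = [1, 2, 3, 4, 5, 6]                      # report periods
used_tb = [40.0, 42.5, 45.1, 47.8, 50.2, 52.9]   # hypothetical capacity used

n = len(months)
mean_x = sum(months) / n
mean_y = sum(used_tb) / n
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(months, used_tb))
         / sum((x - mean_x) ** 2 for x in months))
intercept = mean_y - slope * mean_x

for future in (7, 8, 9):
    print(f"Month {future}: ~{intercept + slope * future:.1f} TB projected")
```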

Time, according to Teradata magazine, to think about tuning, redesigning and expanding your data integration function – and to consider the respective merits of offloading or re-architecting data integration.

The success of either approach depends on having a measurable goal for implementing and optimising user experience best practice – such as how much CPU you can recover from the analytic database for user availability.

Teradata recommends a 2-Step Solution for a Solid Data Ecosystem:

1. Define your metrics and goals

2. Engineer a solution to match.

By taking an analytics-first approach, you are more likely to realise improved SLAs and CPU cycle recovery – and improve the end-user experience with faster search times.
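By way of example, a measurable goal like CPU recovery can be reduced to a very simple check against a target. The figures below are invented for illustration; a real measurement would come from your database’s workload reports.

```python
# A hypothetical check of a measurable optimisation goal: CPU capacity
# recovered from the analytic database after offloading data integration.
# All figures are invented for illustration.
baseline_cpu_pct = 78.0   # CPU consumed by integration jobs before the change
current_cpu_pct = 55.0    # CPU consumed after offloading
target_recovery = 20.0    # goal: recover at least 20 percentage points

recovered = baseline_cpu_pct - current_cpu_pct
status = "met" if recovered >= target_recovery else "not met"
print(f"Recovered {recovered:.1f} points of CPU for user queries; goal {status}")
```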

Managed change

This could involve deeper changes. Data discovery specialist Veritas, for example, recommends best practices for adjusting the settings of your Enterprise Vault, Compliance Accelerator and Discovery Accelerator – as long as you plan your changes to minimise any disruption to the data environment.

Simplifying your IT environment through more streamlined data integration makes perfect sense. But it requires nimble thought and planning. Forming a data integration plan is like entering a web [of connections] with a machete, according to Safe Software’s Tiana Warner. “Data integration is about managing complexity, streamlining these connections, and making it easy to deliver data to any system,” she says in 9 Reasons You Should Have a Data Integration Plan.

Data as an asset

Achieving multi-vendor data integration is probably the biggest challenge for IT when it comes to helping the business turn data into an asset and establish end-user best practice.

As pointed out in Five New Data Integration Requirements, achieving this goal means not only more pervasive BI and analytics, but also more powerful and flexible data integration technologies that can capture real-time data, transform it into a usable form, and make it available both on-premises and in the cloud – uninterrupted, 24/7.
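That capture–transform–deliver pattern can be boiled down to a few lines. Here is an illustrative Python sketch; the event source and both destinations are stand-ins rather than any real product’s API.

```python
# An illustrative sketch of the capture/transform/deliver pattern described
# above. The event source and both sinks are hypothetical stand-ins.
import json

def capture():
    # Stand-in for a real-time feed (message queue, CDC stream, webhook).
    yield from ['{"id": 1, "amount": "9.99"}', '{"id": 2, "amount": "15.00"}']

def transform(raw):
    # Parse and normalise each event into a usable form.
    event = json.loads(raw)
    event["amount"] = float(event["amount"])
    return event

def deliver(event):
    # Fan out to both destinations so on-premises and cloud stay in step.
    print("on-premises store <-", event)
    print("cloud store       <-", event)

for raw in capture():
    deliver(transform(raw))
```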

Takeaways:

  • Plan any changes to your underlying settings to minimise disruption to data services.
  • Establish measurable goals for optimisation that will help to improve database availability and end-user experience.
  • Consider using multi-vendor data integration tools to limit complexity and avoid vendor lock-in.

Discover further roles that IT plays in Marketing Operations by downloading: Optimising the user experience for operational excellence