What can a Vendor do to prepare for Interoperability Challenges?

Aug 14, 2012

This content is taken from a presentation that I made to the Australian Aged Care IT Vendors Association AGM today. Interoperability presents many demands for a vendor, especially a small vendor. What can you do to prepare for them?

Financial

My experience is that the costs of developing and supporting an integration between two systems are generally more than the integration is worth to the purchaser. Yet purchasers still need the integration, and if you don't provide it, that will look really bad and hurt sales. So you have to recover the costs elsewhere. Possible sources:

  • Direct Charging for the development
  • Licensing the developed code for multiple uses of the same integration
  • Selling more software licenses because you did the integration (this is the Ponzi equivalent)
  • Ongoing support charges for some/all customers
  • Government subsidies (I note that government subsidies to consumers are passed onto vendors with great reluctance, btw)

A vendor that is prepared has a very clear picture about how they finance integration development. I’ve seen lots of models that work, but I’ve also seen that if you don’t have a plan for this, you’ll get yourself into trouble.

Staffing

You need skilled staff that understand the relevant standards, that know how to work with other people, that can get things done. These people are rare… expensive… where are you going to get them from?

  • Are you going to train them up? How?
  • Are you going to buy them? Where from?

And once you have them, the most important thing to do is to hang on to them. Perhaps it costs you in the region of $50-$100k to acquire and embed a capable person… you really need to hold on to them. And at least in Australia, you can’t have a person like that without them becoming known, and once they’re known, they’re going to have people calling them offering them jobs. So you need a really good retention policy for them.

Data Policy

Opening up your data for exchange is more challenging than just implementing a closed system. It makes more demands on your data. Here are some principles I've learnt:

  • Normalise your data
  • Don’t hide blobs of data in some syntax that needs code to read it
  • Pay attention to likely applicable standards when you design your data store (if you go different, simpler, or more complex, you’re going to have problems)
  • Keep good metadata and make it available in the code (this becomes more important as formats get less context-specific, e.g. national EHRs)
  • Keep your data store design as consistent as you can
  • Make sure your code is modular and limit the dependencies between logic modules (and no dependencies on UI modules from logic modules!)
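To make the "don't hide blobs" principle concrete, here's a minimal sketch using an in-memory SQLite database. The table names, the LOINC code, and the data are hypothetical; the point is the contrast between an opaque home-grown blob and normalised columns that any consumer can query.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Anti-pattern: the record is hidden in a home-grown delimited blob.
# Nothing outside your own parsing code can interpret "code|value|unit".
cur.execute("CREATE TABLE obs_blob (id INTEGER PRIMARY KEY, payload TEXT)")
cur.execute("INSERT INTO obs_blob (payload) VALUES ('8480-6|120|mm[Hg]')")

# Normalised: each field is its own column, so any consumer (reports,
# an integration engine, a migration script) can query it directly.
cur.execute("""
    CREATE TABLE obs (
        id    INTEGER PRIMARY KEY,
        code  TEXT NOT NULL,   -- e.g. a LOINC code
        value REAL NOT NULL,
        unit  TEXT NOT NULL    -- e.g. a UCUM unit
    )
""")
cur.execute("INSERT INTO obs (code, value, unit) VALUES ('8480-6', 120, 'mm[Hg]')")

# The normalised form answers questions the blob can't without custom parsing:
row = cur.execute("SELECT value, unit FROM obs WHERE code = '8480-6'").fetchone()
print(row)  # (120.0, 'mm[Hg]')
```

The same reasoning applies to exchange: a normalised store maps onto standard formats far more easily than one where the structure lives only in application code.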

External Integration Point

Use an external integration point - a single sub-system that all the external exchange flows through (or as much of it as possible). Mostly these are called interface engines, though sometimes they are called Message Transfer Agents, or Gateways. There are any number of commercial engines (including my own), and Mirth is a pretty good engine. Using something like this is a really good idea because:

  • It isolates your application from variable site-specific requirements and partner application life cycles
  • They provide advanced troubleshooting options and installation robustness
  • You can leverage them to support multiple standards more cheaply
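The isolation idea above can be sketched in a few lines. This is a toy illustration, not a real interface engine: the partner names and wire formats are invented, and a real engine would also queue, retry, and log. The point is that the application always emits one internal shape, and per-partner quirks live in transforms registered at the single integration point.

```python
from typing import Callable, Dict

class IntegrationPoint:
    """A single choke point that all external exchange flows through."""

    def __init__(self) -> None:
        self._outbound: Dict[str, Callable[[dict], str]] = {}

    def register_partner(self, name: str, transform: Callable[[dict], str]) -> None:
        # Site- and partner-specific formatting is configured here,
        # not scattered through the application.
        self._outbound[name] = transform

    def send(self, partner: str, message: dict) -> str:
        # A real engine would also queue, retry and log at this point.
        return self._outbound[partner](message)

gateway = IntegrationPoint()
# Hypothetical partners: one wants pipe-delimited fields, one wants key=value pairs.
gateway.register_partner("lab", lambda m: "|".join(str(v) for v in m.values()))
gateway.register_partner("pharmacy", lambda m: ";".join(f"{k}={v}" for k, v in m.items()))

msg = {"id": 42, "event": "admit"}
print(gateway.send("lab", msg))       # 42|admit
print(gateway.send("pharmacy", msg))  # id=42;event=admit
```

When a partner changes their format, only the registered transform changes; the application code that produced `msg` is untouched. That's the isolation the bullet points describe.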


I still see vendors that don't do some of these things. And we know that there are going to be winners and losers in the never-ending battle between vendors. And the losers… go bust. These are the things that I know you can do to avoid going bust when the government leans on you to provide better integration functionality.

If you know other things - drop me a line in the comments.