
Thursday, September 18, 2008

Top 10 mistakes made when implementing EDC

(last update from admin @ eclinicalopinion)

OK, I am calling for a challenge here. I am making an attempt at identifying the top 10 mistakes that I believe are made when companies attempt to implement EDC. No science to this, just a bit of fun.

I will make edits if anyone posts comments that I believe outdo my own:

 

1. Pick the nastiest, most complex study to implement as a first study

Sponsors may be trying to test the EDC system and vendor to confirm claims of functionality, services and support. It may also be an internal organization's 'sell' when bringing in a new system. In reality, the risk factors are at their highest, and the chances of failure greater, than at any subsequent time in an Enterprise EDC system rollout. Instead of learning and improving with optimized processes and a well designed workflow model, a 'get it out the door quickly' approach is forced and pain is suffered by all parties!

2. Expecting the return on EDC to be immediate - admin @ eclinicalopinion

Many clients are very experienced with paper and have wrung the very last drop of efficiency out of their process. They start with EDC believing that they are entering a new golden era only to be disappointed with the gains (or losses!) on their first study.
As with any new process or technology, it takes time to refine. The potential gains are real but it will take a few trials before a company hits its stride with EDC.

3. Over-emphasis on faster closeout - admin @ eclinicalopinion

Companies new to EDC get excited about the faster closeout of EDC trials while ignoring the longer start-up times that come with EDC. With paper you could print the CRFs and send them out before you had finalized (or even built) the database that would finally store the data.

4. Use all the functionality that was demonstrated

A common problem. When a salesperson demos the product, it looks cool. Almost every feature looks good and could add value... Well, in reality, not always. Many EDC systems developed today offer features as a 'tick in the box', but when a feature is used and combined with other features, sometimes the value falls short. For example, most systems offer some form of data flagging: Reviewed, SDV'd, Frozen, Locked and so on. Do not use all flags on all fields. That will be slower than paper.
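To make the flagging point concrete, here is a minimal sketch of field-level status flags applied selectively. This is not any real EDC system's API; the `Field` class, flag names and field names are invented for illustration only.

```python
# Hypothetical sketch: field-level status flags held as a set per field,
# applied only where the study workflow actually needs them.
# Flag and field names are invented; they mirror no specific product.

FLAGS = {"reviewed", "sdv", "frozen", "locked"}

class Field:
    def __init__(self, name, value):
        self.name, self.value, self.flags = name, value, set()

    def set_flag(self, flag):
        if flag not in FLAGS:
            raise ValueError(f"unknown flag: {flag}")
        self.flags.add(flag)

# Apply SDV only to the critical fields, not to every field on every form.
critical = {"SYSBP", "AETERM"}
form = [Field("SYSBP", 120), Field("WEIGHT", 70), Field("AETERM", "headache")]
for f in form:
    if f.name in critical:
        f.set_flag("sdv")

print([f.name for f in form if "sdv" in f.flags])
# -> ['SYSBP', 'AETERM']
```

The design point is the `critical` set: deciding up front which fields carry which flags is what keeps the workflow faster than paper.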

5. Resource the same way

If you have the same resourcing for Data Management and Monitoring AND you are also resourcing separately for building and testing EDC studies, then you have done something wrong.

With a good EDC product, the rules that would typically be applied manually are applied automatically. The remaining delta should be picked up by a smaller number of 'eyes'. Many CROs have played the 'better safe than sorry' card, charging the same for Monitoring and Data Management as on paper, plus EDC license and deployment costs. This demonstrates an inexperienced CRO.

6. Model the eCRF according to the paper CRF Layout

Trying to make an electronic CRF identical to an original paper CRF will result in tears.  The users will be frustrated with the workflow.  The 'e' nature of the medium will not be utilized and the study will be less effective.

Instead, consider appropriate workflow and dynamic eCRFs. I will stress 'appropriate'. Overdoing the bells and whistles can cause frustration, but with no bells and whistles, many of the advantages of EDC are lost.

7. eCRF Design by committee

The surest way to blown budgets and timelines is to attempt to develop an eCRF based on a committee of individuals. The sponsor should delegate a chosen few (ideally one) to work with the EDC CRF Designer. The study should be built to a largely complete state, and only then should a wider review be carried out.

8. Wait until the end of the study to look at the Data

It is surprising how often this is still the case. EDC means cleaner data faster, but sponsors, and their statistical departments, are often geared towards working with final data-sets. Good EDC systems can deliver clean data on a continuous basis. Whether the interim data reaches a statistically significant sample size is another question, but information is often available for companies that are willing to leverage it.

9. Fail to use the built in communication tools provided

Many EDC systems offer the means for the different parties involved in study execution to communicate. These might take the form of sticky notes, query messages or internal email. Often these facilities are either not used, or not used effectively. This means that the true status of a study is a combination of information in the EDC tool, tasks in Outlook, actions in emails, scribbled notes on a monitor's pad and so on.

10. Do lots of programming for a study

This covers many areas. It could be programming to handle complicated validation rules, or it could be programming to adapt data-sets to meet requirements. If your EDC system requires lots of programming in order to define an EDC study, then I would suspect you have the wrong EDC system. Good EDC systems today are configured from metadata stored in tables. Old systems relied on heavy customization of the core code for each deployment, or on some form of programming in order to complete the study build. If you write code, then you need to test it, and that testing is similar to software validation. This takes time and money.
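To illustrate the metadata-driven idea, here is a minimal sketch of edit checks expressed as plain data rather than hand-written code. The field names, operators and messages are all invented; a real system would store rows like these in tables and validate the metadata itself, not each study's code.

```python
# Hypothetical sketch: edit checks as metadata, so the "study build" is
# configuration rather than programming. All names here are invented.

from datetime import date

edit_checks = [
    {"field": "SYSBP", "op": "range", "min": 60, "max": 250,
     "message": "Systolic BP out of expected range"},
    {"field": "VISITDT", "op": "not_future",
     "message": "Visit date cannot be in the future"},
]

def run_checks(record, checks, today=None):
    """Apply metadata-driven checks to one eCRF record; return queries."""
    today = today or date.today()
    queries = []
    for c in checks:
        value = record.get(c["field"])
        if value is None:
            continue
        if c["op"] == "range" and not (c["min"] <= value <= c["max"]):
            queries.append((c["field"], c["message"]))
        elif c["op"] == "not_future" and value > today:
            queries.append((c["field"], c["message"]))
    return queries

record = {"SYSBP": 300, "VISITDT": date(2008, 9, 1)}
print(run_checks(record, edit_checks, today=date(2008, 9, 18)))
# -> [('SYSBP', 'Systolic BP out of expected range')]
```

The payoff is that the generic engine (`run_checks`) is tested once, as software; each study only contributes rows of metadata, which need review rather than full software validation.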

Most EDC tools can be extended through 'programming'. If you need to do this, try to do the work outside of the critical path of a study.  Develop an interface, test and dry run it, and then, utilize it in a live study. In this way, you will have time to do it right with proper documentation, support and processes.

and relegated below the top 10...

11. Start by developing Library Standards from Day 1

This may sound like an odd one, but let me explain. Implementing EDC effectively, even with a highly experienced vendor, takes time. All parties are learning, and modern EDC systems take a while to adapt. Workflow, system settings and integrations all need to come up to speed and be optimized before standards can really be applied and add value. Start too early, and the library standards are just throwaways once the teams come up to speed. It is best to leverage the knowledge and skills of the CRO or vendor first.

12. Develop your Data Extracts after First Patient In

Sometimes tempting due to tight timelines, but if your EDC tool requires either programming or mapping of data to reach the target format, then the less consideration you give to the outputs when you design the study, the harder it will be to meet those requirements after FPI. This leads to higher costs and the potential for post-deployment changes if you discover something is missing.
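As a sketch of what "designing the extract up front" can mean: the mapping from EDC source fields to the agreed output layout is written down as data at study design time, so gaps show up before FPI rather than after. The source and target names below are invented; the real target would be whatever format the statisticians or regulators require.

```python
# Hypothetical sketch: a data-extract mapping defined at study design
# time, before First Patient In. All dataset and field names are invented.

extract_mapping = {
    "VS": {                      # target dataset (e.g. vital signs)
        "SUBJID": "subject_id",  # target column -> EDC source field
        "VSDTC": "visit_date",
        "SYSBP": "systolic_bp",
    },
}

def extract(records, mapping, domain):
    """Map raw EDC records to the agreed output layout for one domain."""
    cols = mapping[domain]
    return [{target: rec.get(source) for target, source in cols.items()}
            for rec in records]

raw = [{"subject_id": "001", "visit_date": "2008-09-01", "systolic_bp": 120}]
print(extract(raw, extract_mapping, "VS"))
# -> [{'SUBJID': '001', 'VSDTC': '2008-09-01', 'SYSBP': 120}]
```

Because the mapping is just a table, it can be reviewed against the eCRF design while the study is still being built; any EDC field with no home in the outputs, or any output column with no source, is visible before a single patient is enrolled.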

Conclusion

Many thanks to admin @ eclinicalopinion for the additional 2 mistakes made, now coming in at numbers 2 & 3!  More comments welcome...

3 comments:

Eco said...

Hard to fault this list. I have seen your number 1 time and time again often with EDC naive clients. With these clients there are no baby steps, it's straight into running the 400M in the Olympic final.

Madness but it happens over and over.

I think one mistake that is related to that is expecting the return on EDC to be immediate. Many clients are very experienced with paper and have wrung the very last drop of efficiency out of their process. They start with EDC believing that they are entering a new golden era only to be disappointed with the gains (or losses!) on their first study.

As with any new process or technology, it takes time to refine. The potential gains are real but it will take a few trials before a company hits its stride with EDC.

The second issue I see, and again it is related to the first, is companies new to EDC getting excited about the faster closeout of EDC trials but ignoring the issue of longer start-up times with EDC. With paper you could print the CRFs and send them out before you had finalized (or even built) the database that would finally store the data.

EDC requires more setup effort which can be a surprise to teams who have been working with the paper process for years.

Eco said...

Okay, here's another more subtle one. Not quite sure how to categorize it, maybe "Failure to realize potential". It probably doesn't even belong in this top-10 of implementation mistakes.

Sponsors get sold on the key benefits that are talked about for EDC time and again:

1. Real-time access to the data.
2. Remote monitoring (including reduction in site visits)
3. Faster time to database lock.
4. No paper to print & ship.

It doesn't sound bad, does it? But this list really focuses on the differences between an electronic medium and paper. It's what we had before, only faster.

After working with EDC for a while, sponsors should be asking themselves what EDC could allow them to do differently. Companies built their organizations around optimizing for paper but struggle to re-organize around EDC.

Think about the automobile. Yes it was faster/more convenient than travelling by horse but more importantly its adoption allowed entirely new patterns of work and life to emerge.

Adopting EDC is just the first step, the really successful adopters will discover that EDC enables them to change not just the way their organization operates but the organization itself.

Maybe we don't need CRAs to visit sites at all? If monitoring is done remotely, training is provided online and CRAs only visit sites to perform source document verification (an important but clerical task), maybe that work can be outsourced? Maybe we need different kinds of site-visiting staff? Relationship Managers? Maybe increased online collaboration tools can meet that need too?

That's an idea. It may not be a good one but there's a new paradigm to explore.

EDC has been going for twenty years but we haven't even begun to reach its potential to transform the way we do business.

EDC Guru said...

I think the number one mistake made when implementing EDC is not allowing enough startup time. Switching from paper to EDC requires process changes in Clinical and Data Management departments. Ideally Programming and Regulatory are included in the startup discussions, since there are many benefits that EDC can bring to them as well.
For a first EDC study, the sponsor company needs time to understand the system they will be working with and to adapt their current process to the new electronic system. This is often tricky, since many companies are worried about making radical changes to their process while implementing a new system, but in order to benefit from the features of the system this must be done.
Startup activities such as CRF design, edit check specification and output dataset specs must all be completed well before FPI.
The sponsor must also allow enough time for UAT, and ideally factor in time during UAT for a simulated process run-through to ensure that their new processes will flow smoothly. There must also be training on the new processes for everyone involved.
A sponsor should allow several extra weeks during startup for additional discussions, development of training plans and refining of processes to assure a smooth transition into EDC.