Honing submissions resource management

24th May 2018

How do life sciences firms measure the efficiency of regulatory submissions management, and what’s the key to improving productivity?

Regulatory submissions can be a major challenge. Perhaps the most common source of problems is change within the regulatory landscape itself. Growing insight into the risk/benefit profile of therapies often causes regulators to raise their expectations: what was a simple submission one year can become a more complex submission the next, owing to a heavier safety data burden.

Likewise, individual territories still diverge in the formats in which they will accept submissions. One year it may be possible to file with a number of territories in a single format; in subsequent years this could split into two different formats (with the associated extra load on your team), before finally harmonising back towards a single format again.

Finally, as products become more established and better understood, health agencies begin to discover how therapies are received among their specific populations, often requiring tailored submission material. In situations such as these it is vital to understand the effort of tailoring an individual submission versus creating a more global submission and then responding to a number of agency follow-up questions.

So what are the options for refining regulatory workload measurement?

Until recently, the default approach has been to measure workload by the raw number of documents processed: if 1,000 documents were authored one year and 900 the next, the expectation was that the second year should cost less.

However, simple formulae like this do not account for the increasing complexity of submissions. In recent years regulatory volumes have soared as authorities have introduced new requirements and firms have taken products into additional markets. Firms have not always allowed for the associated increase in regulatory administration workload, causing friction when costs and timescales come to be reviewed.
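A minimal sketch of why raw document counts mislead. The submission tiers, counts, and effort weights below are invented for illustration, not real benchmarks:

```python
# Hypothetical illustration: raw document counts vs. complexity-weighted workload.
# Tier names, counts, and weights are assumptions made up for this sketch.

# Relative effort per document, by complexity tier (assumed values)
EFFORT_WEIGHTS = {"simple": 1.0, "standard": 2.5, "complex": 6.0}

def weighted_workload(doc_counts):
    """Sum documents weighted by the effort each complexity tier demands."""
    return sum(EFFORT_WEIGHTS[tier] * n for tier, n in doc_counts.items())

year_1 = {"simple": 600, "standard": 300, "complex": 100}  # 1,000 documents
year_2 = {"simple": 300, "standard": 400, "complex": 200}  #   900 documents

print(sum(year_1.values()), weighted_workload(year_1))  # 1000 docs, 1950.0 units
print(sum(year_2.values()), weighted_workload(year_2))  # 900 docs, 2500.0 units
```

Under these assumed weights, the second year processes fewer documents yet represents considerably more work, which is exactly the gap a raw document count hides.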

The shortcomings of spreadsheets

Accurate calculation and monitoring of regulatory workloads is common practice for a range of life sciences companies that have been tasked with delivering improved productivity.

The most widely deployed approach up to now has been for teams to log every action in a hugely onerous timesheet, which is unwieldy and unreliable as a basis for measuring improvement because the quality of the data varies so much.

Any more refined solution must inevitably start with a more robust and consistent measure of output: a means to benchmark this reliably, and thereby chart any improvement.

Granularity gives rise to richer insights

For life sciences firms, the ability to estimate resources accurately based on global requirements for each submission type, and to break submissions down into constituent tasks, is highly prized. It enables teams to say with considerable accuracy what makes one job more complex than another.

The resulting matrix of resource requirement per submission type will contribute towards the development of a repeatable framework, which is detailed enough to overcome the challenge of variability between submissions.
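Such a matrix might be sketched as follows. The submission types, task names, and hour figures are hypothetical placeholders, assumed purely to show the shape of a task-level estimation framework:

```python
# Hypothetical sketch of a resource-requirement matrix: each submission type is
# broken into constituent tasks, each carrying an assumed effort estimate in
# hours. All names and figures are illustrative, not real benchmarks.

TASK_HOURS = {
    "variation_type_IA": {"authoring": 8, "qc_review": 4, "publishing": 2},
    "variation_type_II": {"authoring": 40, "qc_review": 16, "publishing": 6,
                          "agency_liaison": 10},
}

def estimate_hours(submission_type, multiplier=1.0):
    """Total estimated effort for a submission type; the multiplier can model
    territory-specific tailoring or an extra safety-data burden."""
    return multiplier * sum(TASK_HOURS[submission_type].values())

print(estimate_hours("variation_type_IA"))        # 14.0
print(estimate_hours("variation_type_II", 1.25))  # 90.0 with a tailoring uplift
```

Because the estimate is built from named tasks rather than a single headline figure, teams can point to exactly which tasks make one job more complex than another.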

Such computational frameworks have proved popular with life sciences firms because they remove any grey areas, so budget-holders do not have to worry about being overcharged for processing regulatory changes. There is also the potential to allow for other factors such as time lost to IT outages, or spent in training, which can provide new operational insight for managers.

If the head of R&D can see in black and white the impact of IT downtime on regulatory productivity and cost, it soon focuses the mind on where action needs to be taken. This level of granularity and insight is something that simple spreadsheets can’t deliver.

Analytics & accessibility

Comparative performance reporting at regular intervals, meanwhile, can boost morale, prompt competition between teams, and provide new levels of intelligence about success factors. The longer the period of recording extends, the richer the scope for data mapping and the more persuasive and powerful the trend intelligence. This in turn could focus discussions about bringing batches of work forward to avoid busy periods, or times when staff resources are thinner on the ground.

Advances in technology and the rapidly falling price of data storage and cloud-based analytics are contributing to more companies’ ability to be smarter about resource management. Using cloud-based data collection tools can enable multiple people to add data at the same time, and to do this from different locations, for example.

Upholding good practice

Systems that display everything on one screen and that can be accessed via mobile phones, for instance, can contribute to compliance. More sophisticated systems can also remove the inaccuracies of gut feel. The result is a quantifiable, data-driven system for monitoring resources – the kind of supporting documentation budget-holders value highly.

Compared to other service disciplines, regulatory outsourcing in life sciences has some way to go to catch up with the measurement practices seen elsewhere, but there is a lot to be learnt from best practices in other markets. With regulatory demands set to keep increasing, and pressure on life sciences firms to be smarter in how they manage resources and leverage product lines, the demand for more granular measurement strategies and techniques can only grow.

Adrian Leibert is life science and pharmaceutical program manager for regulatory outsourcing at Kinapse.
