
SAP for Life Sciences


The previous blog outlined UCB’s strategic approach to achieving complete visibility in the pharmaceutical supply chain and complying with serialized track-and-trace regulatory requirements.


In this blog, Stéphane Aubert, Director IT – SAP Logistics Execution and Program Manager Serialization Track&Trace – IT with UCB, shares five principles which were followed to successfully run the implementation project.

 

1. Anchor the business value firmly

 

Defining the business value was vital to get business buy-in, as well as to keep on track with the right focus. In this case, the key value was easy to determine: compliance. Non-compliance would lead to the inability to distribute our products on various markets, meaning numerous patients not receiving their treatments, so the investment was clearly justified.

 

2. Determine requirements early on

 

When the project started in 2012, regulatory requirements for serialized pharmaceutical products were not clearly defined, so UCB needed to base the project on assumptions. The first key decision was to keep technical requirements as simple as possible in order to adapt once regulatory requirements became clearer. Further, the advantages of corporate standardization had to be balanced with local requirements and specificities. Consequently, a top-down and a bottom-up approach had to be combined. To minimize cost and complexity, the SAP standard was applied where possible, and a pilot was implemented to confirm the selected design before rolling out the solution to the production environment.

 

3. Cooperate with stakeholders and staff the project carefully

 

The high level of dependencies across business domains, locations, and suppliers required strong integration to ensure alignment and accountability. Legal and process requirements had to be checked on global and local levels, joint validation of key rules was agreed on early, and close collaboration helped promote acceptance.

 

The project was staffed with executives, talented employees from UCB and top consultants from implementation partners, e.g. Accenture and Advanco. In addition, strategic resources were allocated to build the final solution, coordinate the parties, and manage the transition to support organizations.

 

4. Ensure timely delivery, effective tracking and proper risk management

 

The program was complex: the solution involved new technology and a high level of interdependencies. This meant UCB had to be prepared to deal with unforeseen issues. To ensure on-time delivery, a close pulse was kept on development progress and testing. Corrective measures were taken on a weekly basis, especially on prioritization and parallel phase execution, to ensure maximum adherence to plans.

 

In order to anticipate issues, the team built a tool that mapped processes, scenarios, developments, and test scenarios. This visibility into interdependent components enabled the team to quickly identify how an issue in one area cascaded into others and thus better assess the true impact.
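For illustration only – this is not UCB’s actual tool, and every component name below is hypothetical – a minimal sketch of how such an interdependency map can be queried to see what an issue in one component cascades into:

    from collections import defaultdict, deque

    # Hypothetical interdependency map: component -> components that depend on it.
    dependents = defaultdict(set)

    def link(upstream, downstream):
        """Record that `downstream` (a process, scenario, development, or test) depends on `upstream`."""
        dependents[upstream].add(downstream)

    def impacted_by(component):
        """Breadth-first walk of the map: everything an issue in `component` cascades into."""
        seen, queue = set(), deque([component])
        while queue:
            for nxt in dependents[queue.popleft()]:
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append(nxt)
        return seen

    # Hypothetical wiring: a serialization interface feeds a packing scenario and its test.
    link("serial-number-interface", "packing-scenario-BR")
    link("packing-scenario-BR", "integration-test-07")
    print(impacted_by("serial-number-interface"))  # {'packing-scenario-BR', 'integration-test-07'}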

 

Risk management was run carefully, as UCB was dealing with a validated environment with GxP relevant components and as the project touched upon core business processes, i.e. production, warehousing and distribution. A centralized issue and risk management log was used as of day 1 and placed in the central collaboration site for continuous project control.

 

5. Manage change and achieve operational readiness

 

Lastly, before going live, change management and training played a key role. Expectations were managed proactively, organizational and cultural changes were prepared in detail, and strategies were put in place to overcome possible resistance.

 

Disciplined project management was worth the effort. UCB achieved readiness for existing serialization and track & trace regulations, prepared for upcoming ones, and improved the performance and quality of its manufacturing processes.

 

 

Stéphane Aubert, Director IT – SAP Logistics Execution, Program Manager Serialization Track&Trace – IT, UCB

Susan Rafizadeh, Life Sciences, SAP

Drugs can save lives and relieve illness and pain – people rely on them and trust in them. But high drug quality is not a given. Recent counterfeiting incidents have raised awareness that effort is needed to make sure patients actually get what they think they are buying. Other unexpected events may also require companies to recall products – which should happen quickly and completely.


Here is how UCB, in cooperation with Accenture and Advanco, developed a best practice to secure patient safety – an achievement that won the Gold SAP Quality Award 2014 in Belgium and Luxembourg in the category “Business Transformation”.


Challenges: achieving regulatory compliance and patient safety

 

The question drug manufacturers need to solve is how to secure patient safety. How can drug counterfeits and other unexpected events be avoided proactively, how can product recalls be done quickly and efficiently, and how can drug expiries be detected early enough?


This is a task that UCB, a global pharma and biopharma company headquartered in Brussels, has identified as a priority to solve at the highest quality level. Global track-and-trace regulations for pharma are in place, and the regulatory landscape for pharma serialization has been evolving continuously and will keep doing so. On top of complying with all these existing, upcoming, and changing regulations, UCB wanted to go one step further by leveraging the investment made in compliance to also benefit from a process performance and quality perspective all across the supply chain. Not an easy task, but they did it!

 

Winning strategy: inclusion of all parties through standardized approach


The key to staying compliant with all global serialization requirements now and in the future, to ensuring safe market supply all over the world, and to further improving operational quality was standardization. All sites, product lines, and external partners like Contract Manufacturing Organizations (CMOs) or Third Party Logistics Providers (3PLs) were to adhere to a standardized way of communication.

 

The reasons behind this approach can be explained as follows: the communication chain can easily be disrupted because these parties all work in silos. They have different quality processes and different systems to capture and organize data. This means it is hard to stay up to date at any point in time, and errors can easily occur through multiple data entries. The manufacturer of the product is accountable for compliance and for making sure that all data is accurate, complete, and available on time. The standardized approach not only enabled UCB to gain complete and valid global visibility and cohesion on each product through the entire supply chain; it also saves IT cost and allows best practices to be adopted quickly across the organization to maximize global performance.

 

Of course, local specifics and constraints cannot be ignored if realistic successes are to be achieved, so the IT solution driving the processes needed to allow room for flexibility on top of supporting a standardized concept. Since UCB runs one single instance of SAP as its corporate ERP system, the technology the project was based on was chosen quickly. The solution provides a single source of truth on product data from production to sales and distribution for all involved stakeholders, including packaging and logistics of all plants globally, for CMOs, 3PLs, and wholesalers, as well as for authorities.

 

Creating the overall idea may sound easy, but probably everybody agrees that putting it into practice can be quite challenging. What the path to success looked like is outlined in the next blog, “UCB Awarded for Pharma Serialization – 5 principles leading to their success”.

 

 

 

Stéphane Aubert, Director IT – SAP Logistics Execution, Program Manager Serialization Track&Trace – IT, UCB

Susan Rafizadeh, Life Sciences, SAP

Brazil is one of the markets pushing hard to implement serialization legislation. While anti-counterfeiting is one of the drivers for the initiative, the high risk of tax evasion is definitely an additional issue that the Brazilian government wants to tackle.

 

The Brazilian government body ANVISA published Resolution RDC 54 on December 10, 2013. RDC 54 already covers quite a few details about how track and trace in Brazil should look. It clearly states the applicability to all prescription drugs subject to registration at the National Health Surveillance Agency and explicitly includes free samples!

 

In addition, the following requirements are included:

  • Serialization on unit level is explicitly required.
  • The Registration Holder is also required to track any unit down to the point of dispensing.
  • Imported products have to be serialized before import.
  • Tracking will be based on the Unique Medication Identifier Code (UIM), which includes
    • Medication Record number at Anvisa (13 digits),
    • Serial Number (numeric 13 digits),
    • Expiration Date,
    • Lot Number.
  • Aggregation is required as per Art. 9 §2, which states that the transport packaging must carry an identification code that relates all the IUMs contained in that packaging.

 

Chapter 3, Art. 7 explicitly calls for non-deterministic randomization of serial numbers and for serial numbers that are unique across the products produced by the MAH. In the meantime it was clarified, however, that both requirements will be implemented in a reduced fashion: deterministic randomization will be accepted, and only the serial numbers of products that will be sold in Brazil have to be unique.
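As an illustration only (the field names, validation rules, and example values are assumptions, not taken from the RDC 54 text or from any SAP component), a minimal sketch of how the IUM composition above and a randomized 13-digit serial number could be represented:

    import secrets
    from dataclasses import dataclass
    from datetime import date

    @dataclass(frozen=True)
    class IUM:
        """Hypothetical record for the Unique Medication Identifier described above."""
        anvisa_registration: str   # medication record number at Anvisa, 13 digits
        serial_number: str         # numeric, 13 digits
        expiration_date: date
        lot_number: str

        def __post_init__(self):
            if not (self.anvisa_registration.isdigit() and len(self.anvisa_registration) == 13):
                raise ValueError("Anvisa registration number must be 13 digits")
            if not (self.serial_number.isdigit() and len(self.serial_number) == 13):
                raise ValueError("serial number must be 13 numeric digits")

    def random_serial():
        """Draw a 13-digit serial from a cryptographic RNG (non-deterministic randomization)."""
        return f"{secrets.randbelow(10**13):013d}"

    # Example with made-up values:
    unit = IUM("1234567890123", random_serial(), date(2026, 12, 31), "LOT42A")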

 

As per the original legislation as defined in RDC 54, no central government-operated database was planned. However, in the recently published “Normative Instruction” of August 2014, a government database is explicitly mentioned. In addition, the Normative Instruction covers the following:

  • Necessity to communicate events from Medication Registration Holder (=MAH) to Anvisa (Art. 1, Art. 10)
  • Necessity to communicate events between supply chain participants (Art. 1,§1, Chapter III). This means that every participant in the supply chain has to send logistic events
    • to the MAH (Art. 4),
    • to the receiving chain member (Art. 6) and
    • to the previous supply chain member (Art. 4).

Supply chain members also have to send all communication received from the posterior chain member to the previous chain member and the MAH (Art. 5), and have to report differences between the declaration and the real product to both the previous supply chain member and the MAH (Art. 6, §2).

  • The MAH is responsible for monitoring logistic movements (Art. 1, §4) as defined in Art. 2, Chapter II, including the Creation (medication “emerges”), Change of ownership (“passage between members”), and Decommissioning (“extinction”) of medicinal products.
  • Art. 3 covers the definition of “Logistic Events” and includes “purchase” and “sales”, which in SAP’s view are not logistic events and need further clarification; it also explicitly includes free samples.
  • The Normative Instruction also includes the requirement to report “anomalous events” (Art. 12) like the movement of medication with IDs not generated by the MAH, duplication of IDs, and reporting of decommissioned IDs.

 

So overall, this will result in a huge number of messages being sent back and forth, as laid out in the use cases below.
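Before looking at the scenarios, here is, purely for illustration, a minimal sketch of the fan-out a single logistic event can trigger for one chain member; the participant names and the routing rule are simplified assumptions derived from the articles cited above, not an Anvisa or SAP interface:

    from dataclasses import dataclass
    from typing import List, Optional, Tuple

    @dataclass
    class LogisticEvent:
        """Hypothetical event: which serialized unit, what happened, who reports it."""
        ium_serial: str
        event_type: str          # e.g. "receipt", "dispatch", "decommissioning"
        reporter: str

    def route_event(event: LogisticEvent, mah: str, previous_member: str,
                    next_member: Optional[str] = None) -> List[Tuple[str, LogisticEvent]]:
        """Simplified routing: every event goes to the MAH and the previous chain member;
        a dispatch is additionally declared to the receiving chain member."""
        messages = [(mah, event), (previous_member, event)]
        if event.event_type == "dispatch" and next_member:
            messages.append((next_member, event))
        return messages

    # Example: a wholesaler dispatches a unit it received from the manufacturer.
    evt = LogisticEvent("0000000000042", "dispatch", "wholesaler-A")
    for recipient, e in route_event(evt, mah="MAH-XYZ", previous_member="plant-BR-01", next_member="pharmacy-77"):
        print(f"send {e.event_type} of unit {e.ium_serial} to {recipient}")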

 

[Figures Brazil_Scenario_1.jpg through Brazil_Scenario_4.jpg: message flow use cases for Brazil]

To ease the pain of sending all these messages back and forth, a network model is currently in discussion, and in the past few weeks some more details have emerged through collaboration with the different stakeholder organizations like Interfarma and Sindusfarma. Stay tuned for more news!

 

PS: The original documents and English translations are available and can be shared with SAP customers.

Biotech companies can already differentiate themselves through the ability to process large amounts of data from different sources, i.e. to gather it, consolidate it, represent it in a comprehensible way, and deliver it safely. Some biotech companies already offer this as a core service. Those who can validate data quality and extract the relevant information have a competitive advantage. In-memory technology opens up even more opportunities, as different data formats, including unstructured text, can be consolidated easily and quickly from various data sources.

 

Another way to stand out positively in the market is the ability to use smart algorithms to analyze raw data that biotech companies can obtain by collaborating with hospitals. To do this, a biotech company can pursue two possible strategies:

  1. Providing added value by processing Big Data very rapidly with a comparatively simple algorithm, or
  2. Offering insights from relatively little data from various sources by using a highly complex algorithm.


Both approaches can be supported by technology. Big Data solutions offer the necessary IT capacity to process huge masses of data in extremely short time, as well as the capability to run data mining intelligently and quickly.


Examples indicating the future direction

There are already many success stories about how Big Data technology has helped advance personalized medicine:

 

The National Center for Tumor Diseases (NCT) has gained new insights into fighting cancer. The project “Medical Research Insights”, which is based on the in-memory platform SAP HANA, helps develop new personalized therapies. Employees can now capture enormous amounts of data per patient and analyze them in real time. Whether we are looking at medical reports, MRI results, genetic analyses, or cancer registry data, all the information comes together in one central place. It can be determined very quickly which therapy has the greatest probability of working best for which patient.

 

“ProteomicsDB”, a database focusing on the human proteome that was jointly developed by Technische Universität München (TUM) and SAP, supports scientists in exploring the human proteome and conducting fundamental research. When building up this database, the human proteome was captured to an unprecedented degree of completeness, assessed in a structured way, and pooled – coping with extremely high data volumes and various scattered data sources.

 

Alacris Theranostics has come up with an innovative Virtual Patient Platform that allows the exact type of cancer of a specific patient to be accurately analyzed and the best therapy to be found. In the background, the patient’s molecular data and algorithms are used to derive the behavior of the tumor and to simulate the efficacy of different therapies. The complex mathematical model contains thousands of variables like genes, proteins, and tests for every drug and every dose. With SAP HANA, these simulations could be reduced to only a few minutes.

 

Stanford School of Medicine has provided a database of genetic predispositions that was combined with thousands of genomes from the 1000 Genomes Project in SAP HANA to allow interactive analysis of the data. Some analyses could be accelerated by a factor of 17 to 600, and others were not even feasible with traditional setups. This database opens up many new opportunities, e.g. finding personalized therapies for chronic diseases like diabetes.

 

Mitsui Knowledge Industry (MKI) leveraged SAP HANA not only to dramatically accelerate DNA sequencing, but also to lower the cost of DNA extraction and analysis from 1 million USD to less than 1,000 USD. This made DNA analysis affordable for many more patients.

 

Summing up, new database architectures and new IT infrastructures make it possible to overcome hurdles like complexity as well as large amounts of data and/or data sources, and to find answers to questions that could not be solved before. Further, saving time and computing capacity can enable pharma companies to make sound decisions in earlier R&D stages than before, which means saving cost and time to market. Big Data technology also allows illnesses that have not been investigated much in clinical studies to be understood much better than before. For example, many clinical trials are geographically concentrated in Europe, which means other regions are currently underrepresented. Big Data solutions can help fill the gap by identifying suitable participants worldwide and processing the data. They can also help shift the focus of R&D from dealing with symptoms towards the root causes of chronic diseases, and advance personalized medicine more quickly.

 

This blog was written jointly by Emanuel Ziegler and me. I would like to thank Emanuel for all his great insights and support! The content of this blog was first published in a shorter version in German on goingpublic.de.

Pharma companies need strong partners for R&D

 

Data and scientific insights are the key to innovation for pharma companies. The need to increase R&D productivity has grown due to the patent cliff, intense global competition, and other reasons. As scientific advances progress rapidly, opportunities are certainly there – but currently paired with challenges.

 

Complex collaboration networks in R&D and personalized medicine

 

To be able to bring innovations to market more quickly and to understand earlier whether an R&D project should be stopped or continued, the pharma industry collaborates closely with external partners like biotech companies and Contract Research Organizations from all over the world. From a practical perspective, this means that pharma companies need to be able to analyze data from a large number of partners, and from even more possible data sources – which mostly come in different formats. This becomes a particular challenge when historic data needs to be connected with the most recent studies. If this hurdle cannot be overcome, potentially important new scientific results remain unexploited.

 

Not only do the complex collaboration networks in R&D lead to massive amounts of data; the advances in personalized medicine do so as well. Since in personalized medicine genomic profiles and other individual characteristics are analyzed to develop tailored therapies, genomics, transcriptomics, and proteomics offer huge opportunities for biotech companies. But these opportunities are tied to challenges from a pure data perspective already.

 

Data privacy

 

Owing to data protection laws, data can become available too late, or it can become less meaningful for further investigation of new scientific questions. One example: if you want to find out whether and how traits like gender, region, nutrition, or genetic preconditions impact an illness, and which therapy would have the highest efficacy in each case, you simply need a certain set of information about patients. But data must not be shown in a way that you can draw conclusions about individual patients. The operative word is “can”: it is not allowed that any single non-authorized person is even theoretically able to deduce which patient is behind the information – and this can quickly happen if, for example, you examine persons with a relatively rare illness in a small, defined region.

 

This requires sophisticated setups that protect information from unauthorized access, and still some problems remain tough to tackle.

 

Varying data quality

 

Data quality can vary extremely for various reasons. The biggest quality gaps can be found in the age of data, data maintenance, and data incorrectness, which can arise from measuring errors, for example. Keeping the whole original data set instead of using pre-aggregated data makes it easier to find and correct such inconsistencies.

 

In the future, we can expect data quality to improve more and more. But even if the amount of useful data won’t be the bottleneck for R&D in Life Sciences, the permanently growing volume of data can still be quite challenging.


Growing data volumes

 

Genome sequencing becomes more and more affordable and even proteomes can be analyzed more and more quickly. Consequently, new correlations can be found faster – e.g. the effect of specific therapies for dedicated genome mutations, which again can provide enormous opportunities to explore complex interrelationships of the human metabolism. Genomes, transcriptomes, proteomes, phenotypes – the amount of data for personalized medicine is growing at breathtaking pace.

 

Hospitals have great potential to provide a vast amount of high-quality insights about the causes of diseases and about clinical studies. In addition, data can be generated directly from patients through wearable medical devices or other mobile devices like smartphones. With growing acceptance from the patient side, data volumes could explode. To take full advantage of the scientific power of this data, some hurdles have to be overcome. Not only to comply with data protection laws, but also from a technical point of view, companies and hospitals need support to capture the data in a systematic way. In most cases, data is generated in different departments within clinics that mostly work in silos. If, for example, data from internal medicine, oncology, and rehab are needed to better understand a patient’s situation, data capture alone can be cumbersome.

 

Biotech companies that shine with faster and more precise results for R&D in pharma will be the winners of tomorrow. As outlined above, this is also a Big Data play that can be approached in various ways, which we will describe in the next blog, “Big Data strategies for biotech companies”.

 

This blog was written jointly by Emanuel Ziegler and me. I would like to thank Emanuel for all his great insights and support! The content of this blog was first published in a shorter version in German on goingpublic.de.

Pharmaceutical companies know that the age of blockbuster drugs is over. Billion-dollar drugs go off patent before generating the expected returns or before a company can develop another blockbuster. A study of 150 major products from 15 drug makers between 2007 and 2013 found that 54 products lost sales growth and 26 products lost blockbuster status. Companies are struggling to stay profitable while they figure out what’s next.

 

Meanwhile, healthcare providers are grappling with declining reimbursements from private and public payers. U.S. healthcare utilization was up across the board in 2013, yet Medicare and Medicaid reimbursements to doctors continued to decline. In 2013, more than 200 nonprofit hospitals and health systems grew expenses faster than revenues, according to Moody’s.

 

The healthcare financial crisis is not limited to the United States, either. Across Europe, governments are pushing patients and providers to be more accountable for skyrocketing costs and in some cases are raising patient copays.

 

To get paid in the near future, providers must figure out how to deliver excellent patient outcomes that lower the cost of care. So long, sweet days of profits by volume and fee-for-service income. Drug makers need to abandon one-size-fits-all, blockbuster drug development in favor of developing more drugs for smaller patient populations and developing other lines of business.

 

See the attached link to the full white paper on "How to Save Healthcare: Personalize It"

Ashutosh Tol

Stem Cell Therapy

Posted by Ashutosh Tol Oct 18, 2014

The human body is made up of about 200 different kinds of specialized cells, but all specialized cells are made from one type of cell called a stem cell. Stem cells are also known as “God cells”. Nowadays this is an emerging branch of medicine and a new era in scientific research for untreated diseases. Stem cell therapy, in simple words, is the therapy in which diseases are treated with stem cells. Recent studies indicate that stem cells have the potential to treat a wide range of diseases.

[Figures IMG1.png, IMG2.png: sources of stem cells present in the human body]

Stem cells mainly work on the damaged cells of the body; it is in their nature to migrate towards the damaged part of the body. Stem cells are undifferentiated cells that have the potential to develop into different kinds of cells. They naturally act as a repair system in the human body.

[Figure img3.png: regions/diseases where patients are treated with stem cells]

This therapy also creates an opportunity for a new business area in the life sciences industry: cord blood banking and stem cell banking.

We can see that the future of stem cell therapy is bright, because stem cells are nowadays used in liver failure to replace liver cells that have been destroyed by viruses, drugs, and alcohol.

In Parkinson’s disease, stem cells replace the lost dopamine-releasing cells. In diabetes, they will help replace the lost insulin-producing beta cells.

So, in short, stem cell therapy opens the door to a new world in which we can think about treating diseases that are untreatable in today’s medicine.

I recently attended the American Medical Device Summit hosted by Generis, which was targeted towards senior managers in the main areas of engineering, manufacturing, quality management and regulatory affairs – so the conference covered broad topics ranging from R&D to quality management and compliance to risk management and M&A. I first wondered if this topic selection wouldn’t be too broad. But there was one element that tied everything together: the patient.

 

Best-run medical device companies have probably always thought of the patient first. But now that, in many countries, reimbursements are being paid based on patient outcomes, patient satisfaction becomes even more important. A patient-centric approach to the innovation process was vividly illustrated by one of the speakers. What many consumer products companies do with their consumers, best-run medical device companies apply as well: they go out to patients who are willing to share real-life experiences and observe how products are being used in reality. This way they can see what patients may not even recognize as worth telling – a huge potential for new innovative ideas to improve existing products or optimize the design of new prototypes.


From many presentations, one message came out clearly: patient-centricity is also key when it comes to product quality and compliance. How would you define quality criteria if not from a patient perspective on safety and efficacy? And what happens to quality if this is not perfectly understood at all levels within the medical device manufacturer and its suppliers? To make sure top quality is achieved, all employees need to execute on quality. Similarly to visiting patients to explore their real needs, quality managers need to get a deep understanding of the environment employees are working in before defining measures to ensure process and product quality. This implies going directly to them, asking them, and collaborating with them. Top quality also means applying the same quality standards across the enterprise. I can only stress this, since I have heard from many of our life sciences customers that the FDA wants to see what the company does as a whole, not fragments from various locations. This is why medical device companies should stick to one process, implement one single source of truth, and take advantage of automated workflows, e.g. for document approvals, assigning responsibilities, or guided escalation processes.

 

Talking about escalations: in order to prevent them, medical device manufacturers need a solid risk management system – which is a science in itself. You can always strive to minimize risk and you can benchmark within and across industries, but you will always be left with a residual risk. Risks with the highest priority rating concern patient safety, immediately followed by risks affecting product efficacy. What is the best advice for avoiding quality risks in medical devices? Strive for the highest quality and don’t hesitate to apply the most sophisticated risk models if needed. If you try this and miss some points, you will most probably still land in an acceptable range. If you just aim for the average, you may run the risk of falling below acceptable quality.

 

And how does M&A fit into the concept of patient-centricity? When running post-merger integration, one key piece of advice stated at the event should be kept in mind: don’t lose focus on the value that is pursued through the investment – which in the end should be to serve the patient better.

 

Summing up, the three best practices I took from the summit are: put the patient at the center of all actions, put patient safety and product efficacy first, and strive for best-in-class quality rather than the average level.

Healthcare reforms, scientific progress, and technology advances are pushing towards personalized medicine and leading the life sciences industry to rethink therapies and business models. Some driving forces from the technical side that seem to be shaping the future of medication and health solutions are machine-to-machine technologies, mobile solutions, Big Data, and cloud computing.


Machine-to-machine technology: This sounds pretty powerful: according to a 2014 Vodafone report, 57% of health sciences companies will utilize M2M technologies by 2016. You could imagine applications within homecare where sensors and applications interact directly and either give an alert when health indicators exceed certain limits, or help patients comply with therapy instructions. Of course, you could also apply this technology to prevention. Sensors combined with apps that support healthier living could be built in anywhere in the future, and might not even stop at the bathroom door, as pointed out in this video by Constellation Research – we will see how far patients and consumers will actually be willing to go...


Mobile: A direct link can be made between mobile solutions and machine-to-machine technology – so mobile solutions are probably equally powerful. There are already mobile apps that can support collaborative care by bringing patients, their friends and families, physicians, and clinics closer together. The data they generate also offers opportunities to develop new drugs and therapies tailored to patient populations. Another set of mobile apps lies within sales and services – many of our life sciences customers provide their sales reps with mobile devices to make it easier for them to demonstrate products to their customers, and within the medical devices sector, service technicians use mobile apps to be at the right place at the right time. A third area for mobile apps: helping workers on the shop floor run manufacturing and conduct maintenance of machines more easily.

 

Big Data: This seems to be a strong one as well. R&D is the most obvious space where Big Data comes into play in life sciences organizations which have to deal with genomic and proteomic data to develop new products and services in support of personalized medicine, and also need to consolidate clinical studies as well as scientific data from various sources. Also in other business areas Big Data can support decision-making through what-if scenarios when analyzing markets, risk, operational processes or demand and supply. Big Data tools can not only increase speed and precision, they can also reveal completely new insights, change perspectives, and help find new strategic directions.

 

Cloud: Cloud computing cannot be denied as a facilitator for gaining speed in Life Sciences, either. It helps save cost, operate with more agility, and scale more easily – which is vital during times of change, rapid growth, or after M&A. Also from a business perspective in Life Sciences, cloud solutions can simplify processes, e.g. in Sales and Marketing, HR, and when collaborating with CxOs. A hybrid setting between public and private cloud seems to be the most popular option in Life Sciences to deal safely with sensitive data and also benefit from the high flexibility.

 

No doubt, opportunities are there. What is your favorite? Which technology will give life sciences companies the greatest power to get new competitive edges? And where are possible hurdles?


Please let us know your thoughts as comment here, contact us on twitter @SAP_Healthcare, or fill in the microsurvey “Future of Life Sciences technology”. We very much look forward to hearing from you! Likewise, we will keep you posted. Thank you very much in advance for your feedback and inputs!

Product manufacturing organizations deal with the shelf life of their products every day as part of their business activities. Shelf life determines a product’s suitability for use. Shelf life dates define the date from which a product can be used and the date until which it can be used.

Especially in the pharmaceutical, food, beverage, and chemical industries, these dates are of utmost importance for planning supply chain activities.

Supply chain challenges with Shelf life expiry dates:

  1. A minimum shelf life is required before goods enter an organization’s supply chain. This minimum value helps planners plan and utilize the goods in manufacturing and in deliveries to customers. In addition, for quality compliance these values have to be maintained against each batch of the product.
  2. A minimum remaining shelf life is required when utilizing the product in manufacturing or delivering it to customers. Products from the warehouse should possess the required remaining shelf life before they are issued to production or to customers.
  3. Shelf life information on labels for the same product may vary when supplying customers in different (or the same) geographic locations. For EOU organizations supplying finished goods to various customers, this can become a cumbersome issue, since the product data remains the same and the label information has to be changed only for delivery purposes.
  4. Carrying batch details forward from the API to the finished goods is required to update the finished goods batch. Details like the expiry and manufacturing dates are forwarded to the finished goods batch, since they represent the actual dates of the API/bulk. If multiple API batches are used in the finished goods, the earliest expiry date among the API batches shall be applied to the finished goods batch.
  5. Shelf life rounding rules are sometimes required to round the shelf life expiry date to either the period start or the period end.
  6. Most importantly, shelf life details have to be considered in the planning run. Standard MRP II does not consider expiry dates in the net requirements calculation; it even counts batches that will expire within the planning horizon. This creates stock deficiencies on the actual day of execution and chaos in warehouse management and on the shop floor.
  7. Recalling expired batches from the supply chain is a tedious task. If the primary supply chain is integrated with information systems it is still manageable, but batches lying at pharmacies, hospitals, etc. will be impossible to track without data exchange platforms.

 

While most of these challenges can be solved in standard SAP, shelf-life-aware planning requires enhancing the standard MRP program to consider shelf life details.
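As a simplified illustration only – this is neither the standard SAP MRP logic nor the isapbox solution – a sketch of how a planning run could exclude batches whose remaining shelf life on the requirement date falls below the required minimum:

    from dataclasses import dataclass
    from datetime import date, timedelta

    @dataclass
    class Batch:
        batch_id: str
        quantity: float
        expiry_date: date

    def usable_stock(batches, requirement_date, min_remaining_shelf_life_days):
        """Count only stock that still meets the minimum remaining shelf life on the requirement date."""
        cutoff = requirement_date + timedelta(days=min_remaining_shelf_life_days)
        return sum(b.quantity for b in batches if b.expiry_date >= cutoff)

    def net_requirement(demand, batches, requirement_date, min_rsl_days):
        """Shortfall the planning run would have to cover once expiring batches are excluded."""
        return max(0.0, demand - usable_stock(batches, requirement_date, min_rsl_days))

    stock = [
        Batch("B001", 100, date(2025, 1, 31)),   # expires before the cutoff, so it is ignored
        Batch("B002", 150, date(2025, 12, 31)),
    ]
    print(net_requirement(200, stock, date(2025, 3, 1), min_rsl_days=90))  # 50.0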

www.isapbox.com provides a solution that demonstrates an MRP run including shelf life details.

Written by: Ruud Nieuweboer, SAP Life Science Consultant

With the coming ISO IDMP (2016) and Serialization (> 2017) legislation, many pharmaceutical companies are struggling to comply. Some follow an ostrich tactic, and some are facing the complexity head-on. The latter discover that, hidden in this complexity, there are opportunities for streamlining internal processes (ISO IDMP) and increasing revenues (Serialization).

How, you might ask?

In short, what is ISO IDMP? It stands for Identification of Medicinal Products. ISO(1) has created five new standards. With these standards it will be possible to better track patient safety issues across countries and brands and to analyze the data for root cause analysis. A key element is the unique identification of the product and all of its substances.

"ISO IDMP will break silos and is a catalyst for harmonizing data."

ISO IDMP is all about data; most of the data is scattered and stored in various systems and controlled documents. For globalized companies, updating their data to a global standard will be a huge task. But experience from current projects shows that ISO IDMP will break silos and is a catalyst for harmonizing data. In the end, this will result in a streamlined static master data process whose advantages are numerous: reducing costs in administrative processes, and better insight into shared materials, where for instance substantial supply savings can be achieved. In a harmonized environment, data is also easier to interpret, especially when static and dynamic data are combined. This is where Serialization steps in: it covers the dynamic spectrum of the data.

“With a serialized infrastructure a pharmaceutical company will gain more supply chain insight and become more attractive as a CMO or distributor.”

Serialization is another burning topic within the industry. The key thing about serialization is adding a unique number to the unit of issue (one or more levels beyond the batch level) for prescription drugs. The individual ‘packed product’ needs to be tracked throughout the supply chain to ensure at the point of dispense that the product is genuine and not counterfeit. The legislation will result in many changes to artwork, controlled documents, IT systems, and packaging lines, at high cost and without apparent savings or benefits. Is that really true?

 

The big pharma companies are already in third gear and have the funding and power to make these complex projects work. Most big pharma companies are even participating in the fora that discuss future operating models for serialization. The mid-size market, however, is looking for ways to manage these projects at low cost, for example by sharing a platform or by taking a close look at the physical supply chain to see where changes can be made to minimize impact. Although the mid-market is feeling the pressure, a business case can be made: with a serialized infrastructure, a pharmaceutical company will gain more supply chain insight and become more attractive as a CMO or distributor. Investing in a serialized infrastructure is a strategic choice to stay ahead of the competition.

 

ISO IDMP and Serialization are two main drivers for increasing patient safety, but both also have a big impact on pharmaceutical companies’ processes and IT infrastructure. The deadlines are deceiving, as they appear to be in the far future. The reality, however, is that the lead time of these types of projects is long. If the legislative pressure isn’t enough to start now, why not focus more on the possible benefits and create a business case?

1)www.iso.org/iso/home/store/catalogue_tc/catalogue_detail.htm?csnumber=55034

By Steven de Bruijn


Private and business users benefit from cloud technology every day; however, it is not yet used to its full potential by the average regulated company. We need to understand what the cloud means for these companies, what the perceived obstacles are, and how to overcome these obstacles while fulfilling the regulator’s expectations.

“The Cloud”

Before we start, let’s define what the cloud is and explore what flavors the market has to offer. Cloud computing is a very generic term and suggests the idea of a “black box”. In fact, this is quite accurate, as the cloud is an abstract mixture of IT infrastructure components. Furthermore, various sorts of applications can be deployed in the cloud, such as collaborative tools, ERP systems, procurement platforms, document management systems, and so on.

 

Typically, three models of services are distinguished for cloud computing:

 

  1. Software as a Service (SaaS) - Configured applications, containing all infrastructure and platform components including hosting facilities, are delivered to the regulated company.
  2. Platform as a Service (PaaS) - Middleware, including all infrastructure and hosting facilities, is delivered to the regulated company. Middleware configuration, application installation and configuration are done by the regulated company.
  3. Infrastructure as a Service (IaaS) - Computational and storage resources, including all network components and hosting facilities, are delivered to the regulated company. Depending on contract conditions, the regulated company can install, configure and maintain the OS, middleware and software applications.

 

Another classification is found in the type of cloud, namely Public, Private, Hybrid, or Community. Simply put, in a public cloud the end users do not know who else has jobs running on the same external server, network, or disks, while in a private cloud the infrastructure is designed and delivered for the exclusive use of the regulated company and may be located in-house or externally. For the specifics of each type, visit the National Institute of Standards and Technology (NIST) website at www.nist.gov.

 

Cloud providers offer several clear benefits such as extremely fast and flexible solution delivery, on-demand scalability, business continuity solutions, relatively easy solutions for backup and archiving, and reduced TCO on infrastructure components. This is a strong proposition at considerably lower cost than traditional in-house computing. So why not start immediately?

Obstacles

So aren’t there any drawbacks to cloud computing and its providers at all? Experience shows that many suppliers offer cloud services but lack understanding of the needs of a regulated company. They fail to recognize the most significant GxP risks. Chances are that dropping the term “Annex 11” will not ring too many bells. However, if regulated content is managed in the cloud, the solution should go beyond what is required of most non-regulated business applications. Most importantly, the cloud solution should be validated and auditable.

 

Some companies in the regulated industry seem to lack an understanding of what cloud computing is and, equally important, what the cloud is not. There is still a lot of ground to cover when it comes to agreeing on consistent terms that apply to the whole company, understanding the enabling technologies, and recognizing the interactions between the cloud and other applications. Such insights will convince your Quality department and prevent your quality controllers from misunderstanding the concept of a private or public cloud, overestimating the regulatory needs, and consequently not allowing any cloud functionality at all. Last but not least, many regulated companies struggle to define their methods for validating cloud solutions. How can this struggle be broken down into manageable pieces that focus on the regulator’s areas of interest?

Cloud Compliance

For computer system validation, regulated companies traditionally rely on their IT department, which owns and manages the corporate IT infrastructure. This way the regulated company would set up and qualify its own machines, platforms, and environments for development, acceptance testing, and live use. The software supplier would be audited, and ultimately the implemented system would be validated.

 

In the case of cloud computing, an entirely different approach is required depending on the cloud model used. In any case, the regulated company is accountable to the regulatory authorities for the compliance of the IT infrastructure (IaaS and PaaS) and the (GxP) applications (SaaS) that are used. This accountability cannot be transferred to cloud providers. The central goal of validation for the regulated company is to verify that the cloud provider conducts appropriate control over the cloud solution. This all starts with auditing the supplier to clarify what services will be provided and how they will be implemented, managed, controlled, and maintained.

 

In models 2 & 3 (PaaS and IaaS), the supplier qualifies and controls the infrastructure. It is the responsibility of the regulated company to verify that appropriate control is in place. Applications are owned and controlled by the regulated company in this scenario. Therefore, the validation of the applications will be similar to validating applications the traditional way, apart from some cloud-specific risks or issues.

 

Validation becomes more complex in a model 1 (SaaS) scenario, because the regulated company is not the owner and controller of the application, yet is still responsible for validating the GxP SaaS application. The application is already installed and configured by the supplier and can’t always be reconfigured or customized to meet the regulated company’s requirements. The approach we propose is to assure that the application meets the requirements through verification by formal testing. Furthermore, verify that the split of responsibilities and tasks between the cloud provider and the regulated company is documented, e.g. in a formal SLA, as this is an Annex 11 (§3.1) requirement. Also ensure that appropriate control is conducted by the cloud provider, and establish procedures for the use of the application.
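As a quick reference only – the actual split is always defined by the contract and SLA agreed with the provider – a sketch summarizing the responsibility split described above:

    # Simplified sketch of the split described above; the real split is defined by the
    # contract and SLA between the regulated company and the cloud provider.
    RESPONSIBILITIES = {
        "IaaS": {"provider": ["infrastructure qualification"],
                 "regulated company": ["OS/middleware control", "application validation", "supplier audit"]},
        "PaaS": {"provider": ["infrastructure qualification", "platform/middleware control"],
                 "regulated company": ["application validation", "supplier audit"]},
        "SaaS": {"provider": ["infrastructure qualification", "platform control", "application configuration"],
                 "regulated company": ["requirements verification by formal testing", "SLA review",
                                       "supplier audit", "procedures for use"]},
    }

    def regulated_company_tasks(model):
        """What the regulated company still has to cover itself for a given cloud model."""
        return RESPONSIBILITIES[model]["regulated company"]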

Why go through all this qualification and validation effort?

Besides obvious reasons such as mitigation of your GxP and business risks, another driver should be the fact that the regulators are increasingly sticking their heads in the cloud as well. An auditor will be interested in what risks have been defined and how these are mitigated. Attention will be paid to how the integrity of your regulated data is assured, and what data backup and recovery measures have been taken. Compared to a traditional hosting model, more emphasis will be placed on cyber security for the networked cloud systems and to what extent privacy is safeguarded. Because a system in the cloud is as secure as its host, regulators will examine your supplier audits, assess SLAs and contracts you agreed on with your supplier, and inspect the supplier’s quality system.

 

New approaches for auditing are crucial, requiring cloud-specific IT technology knowledge, awareness of current IT certifications, understanding of legal aspects, and GxP & CSV knowledge. Goldfish ICT is developing compliant strategies and validation best practices for utilization of the cloud in a regulated environment. We enable our clients to adopt this technology while maintaining control of their IT landscape in a consistent manner. We would be very interested to share our findings and current state of knowledge with you. If you have any questions or remarks, please contact Steven de Bruijn, who is more than willing to get in touch on cloud computing in a GxP context.

Wearable health-related devices offer great opportunities. As cardiac electrophysiologist Kevin R. Campbell states in one of his blogs, more data is better. He finds that it enables better decisions based on facts, and that it can give patients at least a bit more control over their illness, which he claims is especially true for people with chronic diseases. I would agree. I also see opportunities for innovating drugs and devices, as the data can be analyzed across different patient populations to develop better and more tailored treatments. If only the devices and apps were already broadly accepted! There are many privacy concerns out there; one writer even calls wearables a “privacy nightmare”.


I think building trust is possible, but some effort is necessary. Some first evident points, most of them inspired by Alison Diana’s commentary in InformationWeek:

  1. All involved parties, e.g. physicians, patients, and researchers, set up common guidelines on which data is generated and how the data is used, and confirm their buy-in by signature
  2. This should be followed by detailed instructions and education on what this concretely means in daily life and execution for everybody who deals with the data
  3. Track, trace, and control data from its generation through its distribution (when, where, and from whom to whom) to who accesses the raw or aggregated anonymized data and when – which allows better control over whether everybody follows the rules
  4. The patient needs full transparency on all the points mentioned above and access to every detail that concerns him or her in order to dispel concerns, and the patient needs to be authorized to intervene when necessary
  5. The solution has to be validated and trustworthy

 

There may be more points to consider. It definitely needs some work and investments to achieve broad acceptance, but I think it will be worthwhile!


Want to read more about medical wearables? Here are some more blogs:

  1. Wearable Medical Devices: Always On for Better Health
  2. Making Medical Wearables Work: Top 3 Traits to Keep in Mind

What do you think about the issues discussed here? Continue the conversation in the comments below and on Twitter @SAP_Healthcare!

As pointed out in my previous blog "Wearable Medical Devices: Always On for Better Health", medical wearables may be the next big thing. Like probably any new trend, there may be some show stoppers out there. Mainly, who will accept a wearable medical device if it’s not trustworthy? It’s all about effectiveness, quality and safety.


1. Effectiveness:

What comes to mind first especially relates to devices that monitor health indicators and lifestyle. Will higher transparency really change behavior in a positive way? Do patients really want this kind of being watched – I mean, do they truly want it? Everybody who has broken off a diet at least once in their life knows the difference between initial motivation and long-lasting will. Of course, when medical wearables relieve severe suffering, patients won’t go without them anymore. Secondly, do physicians have the capacity and tools to handle the additional information intelligently?


2. Safety:

Any kind of side effect should be avoided. Side effects can be related to the body, e.g. Google Glass is suspected of posing some health risks. Side effects may also relate to external actors trying to access your private data and misuse it. This is such a big concern that I will add a separate blog about privacy.


3. Quality:

This seems to be a no-brainer – when it’s about health, any measurements have to be precise and all functions have to work perfectly. Otherwise the medical wearable would just contribute to confusion, false alarms, wrong actions, or even worse consequences. And there are concerns that some medical apps don’t meet the high quality standards they should. Depending on the features of the wearable, quality also comprises sophisticated technical service and maintenance for customers. As one of the main features is to be “always on” for improved health, manufacturers need to prevent devices from breaking down, so they need excellent predictive analytics capabilities and the ability to deal with a huge amount of customer data rapidly in order to respond in a timely manner.

 

Companies that are not experienced with authority approvals for life sciences products sometimes underestimate the effort needed to get approved. However, it makes sense to double-check whether a wearable is classified as a medical device. For example, the U.S. Food and Drug Administration (FDA) has just recently published its guidelines for Mobile Medical Applications, which set out a risk-based approach. If a mobile app poses minimal risk to patients, premarket review applications, registrations, or listings are not necessary, e.g. for apps providing regular educational information.

 

So, when a manufacturer can tick all the boxes mentioned above, technically the medical wearable is ready to take off. There is one more question remaining: what about data privacy regarding wearables used to improve health? As this topic is hot, please find more about it in the next blog, “Will Wearable Medical Devices Break Through? – Talking about Broad Acceptance and Privacy”.


What do you think about the issues discussed here? Continue the conversation in the comments below and on Twitter @SAP_Healthcare!

The wearable market is picking up pace – especially in the healthcare and medical devices space. The opportunities to improve health through “always on” diagnosis, through constant monitoring of health indicators for preventive or therapeutic reasons with the possibility of instantly alerting the patient, doctors, or family and friends, and even through direct therapeutic features are just too promising. Experts predict a multi-billion USD market to become reality in the upcoming years. Medical wearables can basically be grouped into two major categories: devices used in diagnostics and devices used in medical treatments.

 

1. Wearables supporting health monitoring and rehabilitation

 

Wearables supporting health monitoring and rehabilitation are already widespread: bio-sensors are able to track heartbeats (ECG), stress through skin reactions (electrodermal activity), brain activity (EEG), respiration, sleep quality, body temperature, steps, calorie burn, plunges – you name it. This can be done through wristbands, watches, smart glasses, plasters, smart tattoos, or devices you can wear around your waist or torso. Intelligent textiles like smart suits or even smart bras might also become serious business in the near future – who knows?


One big bet the big players are keen to win is around wearables that facilitate glucose monitoring through minimally or non-invasive technology that leverages electricity or ultrasound. Some insiders say Google is investing billions of USD in glucose monitoring research, and Apple as well as the big medical device players are in the game too. The market would be massive not only because of the growing number of diabetes patients, but also because this might lower the barrier to using these devices: as soon as they become affordable, the use case could broaden to diets as well.

 

A bit apart from what you would classically associate with wearables – but too fascinating not to mention here – are self-implantable bio-sensors that work under the skin and smart pills that contain technology to monitor glucose or heart rate with high precision.

 

Opportunities for Collaborative Care

 

All these diagnostic wearables can improve remote monitoring and home healthcare. What is more, they also make it possible to include friends and family in supporting therapies – when the patient wishes it. For example, they can be alerted in real time to health risks before the patient’s status becomes critical, or they can participate in a healthier lifestyle for the patient through gamification elements. As the data can be aggregated and anonymized with in-memory technology, there are new opportunities for collaborative research between life sciences companies, clinicians, and other researchers, enabling multi-disciplinary approaches to personalized as well as evidence-based medicine.

 

2. Wearables supporting therapies and physicians

 

The second big part of medical wearables is around therapies and supporting physicians. This covers many traditional medical devices like pacemakers, voice modulators, and others. More recent examples that are prominently present in the media are smart glasses supporting physicians during appointments with patients as well as during and after surgeries. Where will the future head? I found one interesting blog that lists new ideas like wearables that can deliver acupressure to reduce pain, influence eye pressure, or affect blood circulation, and micro-needle patches for acupuncture or drug delivery. There might also be opportunities to power up such new medical devices with solutions from SAP, e.g. mobile solutions to create tailored apps, or in-memory technology to analyze data generated by the devices.

 

 

Which wearable medical device will be the next big breakthrough? This will heavily depend on their reliability, efficacy and acceptance by authorities, physicians and patients.

 

Would you like to read more about this? In this case, check out the following two blogs as well:

  1. Making Medical Wearables Work: Top 3 Traits to Keep in Mind
  2. Will Wearable Medical Devices Break Through? – Talking about Broad Acceptance and Privacy


What do you think about the issues discussed here? Continue the conversation in the comments below and on Twitter @SAP_Healthcare!
