
SAP for Life Sciences


Medical wearables and mobile health apps are hot in the press: these technologies promise a fast-growing market, given their ability to bring patients the huge advantages of being better informed and getting better treatments. The numbers confirm that there is not just buzz behind this topic but a real trend toward mobile health technology – provided players resolve data security concerns, as you can see in the infographic “Healthcare on the move: Changing outcomes and lives”.




These numbers are the result of a survey by The Economist Intelligence Unit, in which 144 leaders of healthcare organizations and manufacturers of pharmaceuticals, biotech, and medical devices in 23 countries were interviewed. A majority of 65% of the private companies surveyed responded that they expect mobile solutions to help patients achieve better health outcomes through better medical information, and 62% think that increased availability of personal data will help patients and doctors improve decision-making about therapies and health.


Overall, the future looks bright for mobile health apps and most probably also for wearables. Today, however, we are still at the starting point of a whole new market. The first steps have already been taken, for example by Roche Diabetes Care: SAP and Roche created a mobile app based on the SAP HANA Cloud Platform to help physicians and patients collaborate better, sharing data online and gaining a constant view of health parameters, which enables both to intervene before the disease worsens.

To find out more about which trends, opportunities and challenges healthcare organizations and life sciences companies consider key, please read the full Economist report “Power to the patient: How mobile technology is transforming healthcare”.

Imagine multiple GMP critical systems across different plants, departments and across the globe… No wonder this creates a lot of complexity for the people who control access to these systems.


Authorisation checks for access to systems are often cumbersome and, more importantly, not risk-free. In my experience, the processes around Access Control often face these problems:

  • It is manually managed in disparate systems
  • Authorisation on access approvals is poorly documented and managed
  • Segregation of duties is often completely overlooked, unclear, not documented and subject to change during the life cycle of systems
  • Authorisation on access approvals has no relation with training/qualification records
  • Emergency or temporary access is not supported and poorly documented
  • Access Control is not audit-ready at every moment in time
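One of the points above, segregation of duties, lends itself well to automation. As a minimal illustration (the role names and the conflict matrix below are hypothetical examples, not taken from any specific Access Control product), such a check could be sketched as:

```python
# Minimal sketch of an automated segregation-of-duties (SoD) check.
# The role names and the conflict matrix are hypothetical examples,
# not taken from any specific Access Control product.

# Pairs of roles that must never be held by the same user.
SOD_CONFLICTS = {
    ("create_vendor", "approve_payment"),
    ("release_batch", "perform_qc_test"),
}

def sod_violations(user_roles):
    """Return, per user, the conflicting role pairs that user holds."""
    violations = {}
    for user, roles in user_roles.items():
        held = {pair for pair in SOD_CONFLICTS
                if pair[0] in roles and pair[1] in roles}
        if held:
            violations[user] = held
    return violations

assignments = {
    "alice": {"create_vendor", "approve_payment"},  # SoD conflict
    "bob": {"release_batch"},                       # no conflict
}
flagged = sod_violations(assignments)
```

An automated tool would run a check like this continuously against the current role assignments, which is exactly what manual processes in disparate systems fail to do.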

Needless to say, the problems above can lead to inappropriate, unauthorised access, which ultimately can lead to higher risks in the operation, non-compliance with 21 CFR Part 11 paragraph 11.10(d)/(g)/(i), loss of proprietary information and misuse of systems.


The graph below shows quality and effort of authorisation checks without an automated Access Control tool:



Figure 1: Quality and effort without AC tool

It is clear that the process behind this graph is very reactive, and therefore volatile. In a controlled/regulated environment this is of course undesirable.

Creating a business case for automated Access Control

That said, making a case for an automated Access Control system can actually be quite straightforward because you can measure direct impact by recording and analysing:

  • The access request process, with requests, changes and their throughput times and documentation
  • The efficiency of the approval process
  • A list of systems that are (or should be) subject to Access Control
  • The internal and external audit findings
  • Possible risks of SoD
  • Extent of compliance with GMP regulations such as 21 CFR Part 11 (Electronic records; Electronic signatures)

In the following figure, we have mapped the quality and effort of an automated tool onto the same graph. It is obvious that the effort is high when kicking off an implementation project like this, but the return on investment on quality and reduction of effort is achieved relatively quickly.



Figure 2: Quality and effort with AC tool

In my opinion, manually managing Access Control within (large) GMP critical environments is nearly impossible. The risk of non-compliance due to human error increases as new applications are introduced more rapidly and the IT landscape becomes increasingly complex.


Introducing automated tools can help in structuring the requesting and management process. The additional benefit is that you have an extensive check on the current state.


Looking for help to build your business case? Get in touch!

Necessary Evil


As a Computer System Validation (CSV) consultant I've participated in numerous SAP ERP implementation projects. Whereas most SAP SCN blogs are written from a functional or technical point of view, through this blog I'd like to share my experiences from an alternative perspective. CSV is often perceived as documentation overhead. Validation is often seen by project members as a necessary evil that does not make a useful contribution to the actual development and progress of the project. I too have concluded from recent discussions in the LinkedIn CSV and SAP groups I often visit that this perception is very widespread. But is it valid? In my opinion it isn’t. Not only does CSV reduce your regulatory risk, it also increases understanding of your SAP business processes during the implementation project and enforces a structured, well-defined method of project execution, with the potential to save both time and costs. While this is not an easy task, the CSV consultant (or CSV SME) has a major role to play in achieving these benefits.

Acting after the event


Speaking as a CSV consultant, it unfortunately is not unusual to join a project as the validation SME to discover that the implementation of SAP is well underway, the ABAP’ers have completed their work and the users are geared up for acceptance testing. You have been requested to dig up some specification documentation, to take care of traceability retroactively and to please the Quality Assurance (QA) resource so he or she will not pose a risk in terms of meeting the project go-live date. Fine. You get started and give it your best shot, but given the circumstances it will probably not be the most rewarding assignment you have had, nor will the validation be of real value from a regulatory perspective.


Engagement is key


The situation described in the example above could have been very different if there had been a better common understanding of the value of CSV and acknowledgement of the position that QA and the validation role should have in a typical SAP implementation project in a regulated environment. There are many professionals – even in life sciences – who do not realise that Regulatory Affairs (RA) and Quality Assurance (QA) are ultimately running the show in a GMP world, not a Project Manager (PM) or IT. This is why I would advise kicking off SAP projects with a GAMP 5 and/or GxP awareness training. While most life sciences professionals are familiar with GxP, it turns out that a brief refresher helps the project team to better understand the context in which they will be carrying out their project responsibilities. It is a kind of preventative session that helps “sell” CSV to fellow team members and avoids friction at an early stage. It also clarifies to the group that CSV is an extension of RA/QA and should not be confused with an activity carried out on the side by an IT department.

Leverage the things you will do anyway


On a positive note, I’ve worked with team members on many SAP projects who understand that “validation” means documenting the good things that they were doing as part of the implementation anyway. For example, why not leverage your blueprint documentation by converting it into a Functional Design Specification as deliverable and input for the validation package? From a project management perspective, why shouldn’t you integrate the main validation activities in your project planning and combine them with your ASAP milestones?

The CSV consultant’s mission

In addition to the aforementioned enabling factors, the CSV consultant’s influence is key in turning CSV into a beneficial project activity. A professional CSV consultant will always defend good validation practice for the actual value it adds and not hide behind regulators such as the FDA in order to justify its use. Anyone can read a procedure or Part 11/Annex 11, but it is the CSV consultant’s task to understand the underlying quality requirements and their implications for SAP ERP, explain the intent of those requirements and transform them into validation practice.


The mission of every CSV consultant should be to ensure that the design, customizing and ABAP developments are specified and that the SAP system is verified as fit for its intended use. This can’t be done by over-testing, over-documenting, obtaining numerous approvals or pulling the “but what if the FDA…” card. Educating fellow project team members and being an enabler through active participation across all project phases is key in completing your mission. Personally I strive to minimize the documentation effort by working risk-based, leveraging existing and supplier documentation (if possible, making use of SAP Solution Manager!), without compromising on regulatory and procedural compliance. Based on a good rationale, you should dare to say ‘no’, instead of generating huge piles of documentation just for the sake of it. After all, validation isn’t done by the kilogram.


Making CSV work in an SAP implementation


Difficulties in interpreting regulations and applying CSV techniques “from the book” have shaped the perception of CSV as a project burden in SAP implementations. But in the hands of a competent consultant, CSV will likely yield time and cost savings at project start-up and even greater downstream savings by avoiding costly retrospective validation and project delays. CSV must be a risk-based effort that focuses on the regulator’s core concerns of product quality, patient safety and data integrity. By integrating it with your project management and ASAP processes, CSV ultimately delivers an SAP system that is as robust as it is compliant.

The previous blog outlined UCB’s strategic approach to achieving complete visibility in the pharmaceutical supply chain and complying with serialized track and trace regulatory requirements.

In this blog, Stéphane Aubert, Director IT – SAP Logistics Execution and Program Manager Serialization Track&Trace – IT with UCB, shares five principles which were followed to successfully run the implementation project.


1. Anchor the business value firmly


Defining the business value was vital to get business buy-in, as well as to keep on track with the right focus. In this case, the key value was easy to determine: compliance. Non-compliance would lead to the inability to distribute our products on various markets, meaning numerous patients not receiving their treatments, so the investment was clearly justified.


2. Determine requirements early on


When the project started in 2012, regulatory requirements for serialized pharmaceutical products were not clearly defined, so UCB needed to base the project on assumptions. The first key decision was to keep technical requirements as simple as possible in order to adapt once regulatory requirements became clearer. Further, the advantages of corporate standardization had to be balanced against local specificities. Consequently, a top-down and a bottom-up approach had to be combined. To minimize cost and complexity, the SAP standard was applied where possible, and a pilot was implemented to confirm the selected design before rolling out the solution to the production environment.


3. Cooperate with stakeholders and staff the project carefully


The high level of dependencies across business domains, locations and suppliers required strong integration to ensure alignment and accountability. Legal and process requirements had to be checked at global and local levels, joint validation of key rules was agreed on early, and close collaboration helped promote acceptance.


The project was staffed with executives, talented employees from UCB and top consultants from implementation partners, e.g. Accenture and Advanco. In addition, strategic resources were allocated to build the final solution, coordinate the parties, and manage the transition to support organizations.


4. Ensure timely delivery, effective tracking and proper risk management


The program was complex: the solution involved new technology and a high level of interdependencies, so UCB had to be prepared to deal with unforeseen issues. To ensure on-time delivery, the team kept a close pulse on development progress and testing. Corrective measures were taken on a weekly basis, especially regarding prioritization and parallel phase execution, to ensure maximum adherence to plans.


In order to anticipate issues, the team built a tool that mapped processes, scenarios, developments and test scenarios. This visibility of interdependent components enabled the team to quickly identify how an issue in one area cascaded into others and thus better assess the true impact.


Risk management was run carefully, as UCB was dealing with a validated environment with GxP relevant components and as the project touched upon core business processes, i.e. production, warehousing and distribution. A centralized issue and risk management log was used as of day 1 and placed in the central collaboration site for continuous project control.


5. Manage change and achieve operational readiness


Lastly, before going live, change management and training played a key role. Expectations were managed proactively, organizational and cultural changes prepared in detail, and strategies had to be in place to overcome possible resistance.


Disciplined project management was worth the effort. UCB achieved readiness for existing serialization and track & trace regulations, prepared for upcoming ones, and improved the performance and quality of its manufacturing processes.



Stéphane Aubert                                                                                                    Susan Rafizadeh

Director IT – SAP Logistics Execution                                                                   Life Sciences

Program Manager Serialization Track&Trace - IT                                                 SAP


Drugs can save lives and give us relief from illnesses and pain – people rely on and trust in them. But the high quality of drugs is not a natural given. Recent counterfeiting incidents have raised awareness that effort is needed to make sure patients actually get what they believe they are buying, and other unexpected events may require companies to recall products – which should happen quickly and completely.

Here is how UCB, in cooperation with Accenture and Advanco, developed a best practice to secure patient safety – one that won Gold in the category “Business Transformation” of the SAP Quality Award 2014 in Belgium and Luxembourg.

Challenges: achieving regulatory compliance and patient safety


The question drug manufacturers need to solve is how to secure patient safety. How can drug counterfeits and other unexpected events be avoided proactively, how can product recalls be done quickly and efficiently, and how can drug expiries be detected early enough?

This is a task UCB, a global pharma and biopharma company headquartered in Brussels, has identified as a priority to solve at the highest quality level. Global regulations for track and trace in pharma are in place, and the regulatory landscape for pharma serialization has been, and will continue, evolving. On top of complying with all these existing, upcoming, and changing regulations, UCB wanted to go one step further by leveraging the investment made in compliance to also benefit from a process performance and quality perspective across the entire supply chain. Not an easy task, but they did it!


Winning strategy: inclusion of all parties through standardized approach

The key to staying compliant with all global serialization requirements now and in the future, ensuring safe market supply all over the world, and further improving operational quality was standardization. All sites, product lines, and external partners like Contract Manufacturing Organizations (CMOs) or Third Party Logistics Providers (3PLs) should adhere to a standardized way of communication.


The reasons behind this approach can be explained as follows: the communication chain is easily disrupted because all of these parties work in silos. They have different quality processes and different systems to capture and organize data. This makes it hard to stay up to date at any point in time, and errors can easily occur through multiple data entries. The manufacturer of the product is accountable for compliance and for making sure that all data is accurate, complete and available on time. The standardized approach not only enabled UCB to gain complete and valid global visibility and cohesion for each product through the entire supply chain; it also saves IT cost and allows best practices to be adopted quickly across the organization to maximize global performance.


Of course, local specifics and constraints cannot be ignored if realistic successes are to be achieved, so the IT solution driving the processes needed to allow room for flexibility on top of supporting a standardized concept. Since UCB runs one single instance of SAP as its corporate ERP system, the technology the project was based on was chosen quickly. The solution provides a single source of truth on product data from production to sales and distribution for all involved stakeholders, including packaging and logistics at all plants globally, for CMOs, 3PLs and wholesalers, as well as for authorities.


Creating an overall idea may sound easy, but probably everybody agrees that putting it into practice can be quite challenging. What the road to success looked like will be outlined in the next blog, “UCB Awarded for Pharma Serialization: 5 principles leading to their success”.




Stéphane Aubert                                                                                                                                 Susan Rafizadeh

Director IT – SAP Logistics Execution                                                                                                Life Sciences

Program Manager Serialization Track&Trace - IT                                                                              SAP


Brazil is one of the markets pushing hard to implement legislation. While anti-counterfeiting is one of the drivers for the initiative, the high risk of tax evasion is definitely an additional issue the Brazilian government wants to tackle.


The Brazilian government body ANVISA published Resolution RDC 54 on December 10th, 2013. RDC 54 already covers quite a few details about how track and trace in Brazil should look. It clearly states its applicability to all prescription drugs subject to registration at the National Health Surveillance Agency and explicitly includes free samples!


In addition, the following requirements are included:

  • Serialization at unit level is explicitly required.
  • The Registration Holder is also required to track any unit down to the point of dispensing.
  • Imported products have to be serialized before import.
  • Tracking will be based on the Unique Medication Identifier Code (UIM), which includes
    • the Medication Record number at Anvisa (13 digits),
    • the Serial Number (numeric, 13 digits),
    • the Expiration Date,
    • the Lot Number.
  • Aggregation is required per Art. 9 §2, which states that the transport packaging must carry an identification code relating all the UIMs that make up the packaging.
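As a rough illustration of the UIM structure listed above, the digit-count checks could be sketched as follows (the expiration date format and the field names themselves are assumptions made for illustration, not Anvisa's published format):

```python
# Sketch of validating the fields of a Brazilian UIM as listed above.
# The digit counts follow RDC 54; the expiration date format and the
# field names themselves are assumptions made for illustration.
import re
from dataclasses import dataclass

@dataclass
class UIM:
    anvisa_record: str     # Medication Record number at Anvisa, 13 digits
    serial_number: str     # numeric, 13 digits
    expiration_date: str   # e.g. "2025-12" (format assumed)
    lot_number: str

    def validate(self):
        if not re.fullmatch(r"\d{13}", self.anvisa_record):
            raise ValueError("Anvisa record number must be 13 digits")
        if not re.fullmatch(r"\d{13}", self.serial_number):
            raise ValueError("serial number must be 13 numeric digits")
        if not self.lot_number:
            raise ValueError("lot number is required")
        return True
```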


Chapter 3, Art. 7 explicitly calls for non-deterministic randomization of serial numbers and for serial numbers that are unique across all products produced by the MAH. In the meantime it has been clarified, however, that both requirements will be implemented in a reduced fashion: deterministic randomization will be accepted, and only the serial numbers of products that will be sold in Brazil have to be unique.
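A minimal sketch of what randomized, unique serial number generation could look like (using Python's cryptographic `secrets` module for non-deterministic randomization; a real implementation would persist issued serials in a database rather than in memory):

```python
# Sketch of generating randomized, unique 13-digit serial numbers in
# the spirit of Chapter 3, Art. 7. Python's cryptographic `secrets`
# module gives non-deterministic randomization; a real implementation
# would persist issued serials rather than keep them in memory.
import secrets

def generate_serials(count, issued=None):
    """Draw `count` fresh serials, skipping any already issued."""
    issued = set() if issued is None else issued
    serials = []
    while len(serials) < count:
        candidate = str(secrets.randbelow(10**13)).zfill(13)
        if candidate not in issued:  # uniqueness across the MAH's serials
            issued.add(candidate)
            serials.append(candidate)
    return serials

batch = generate_serials(5)
```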


As per the original legislation defined in RDC 54, no central government-operated database was planned. However, in the recently published "Normative Instruction" of August 2014, a government database is explicitly mentioned. In addition, the Normative Instruction covers the following:

  • Necessity to communicate events from Medication Registration Holder (=MAH) to Anvisa (Art. 1, Art. 10)
  • Necessity to communicate events between supply chain participants (Art. 1,§1, Chapter III). This means that every participant in the supply chain has to send logistic events
    • to the MAH (Art. 4),
    • to the receiving chain member (Art. 6) and
    • to the previous supply chain member (Art. 4) .

Supply chain members also have to forward all communication received from the posterior chain member to the previous chain member and the MAH (Art. 5), and have to report differences between the declaration and the real product to both the previous supply chain member and the MAH (Art. 6, §2).

  • The MAH is responsible to monitor logistic movements (Art. 1,§4) as defined in Art 2, Chapter II, including the Creation (medication “emerges”), Change of ownership (“passage between members”) and Decommissioning (“extinction”) of medicinal products.
  • Art. 3 covers the definition of "Logistic Events" and includes "purchase" and "sales", which in SAP's view are not logistic events and need further clarification; it also explicitly includes free samples.
  • The Normative Instruction also includes the requirement to report "anomalous events" (Art. 12), such as movement of medication with IDs not generated by the MAH, duplication of IDs, and reporting of decommissioned IDs.


So overall, this will result in a huge number of messages being sent back and forth, as laid out in the use cases below.
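To give a feel for these messages, here is a hypothetical sketch of one logistic event notification. The event names follow Art. 2 (creation, change of ownership, decommissioning), but the message layout itself is an illustration, not a published Anvisa schema:

```python
# Hypothetical sketch of a logistic event message exchanged between
# supply chain members, the MAH and Anvisa. The event names follow
# Art. 2 (creation, change of ownership, decommissioning); the JSON
# layout is an illustration, not a published Anvisa schema.
import json
from datetime import datetime, timezone

EVENT_TYPES = {"creation", "ownership_change", "decommissioning"}

def build_event(event_type, uim, sender, recipients):
    if event_type not in EVENT_TYPES:
        raise ValueError(f"unknown event type: {event_type}")
    return json.dumps({
        "event_type": event_type,
        "uim": uim,                  # unique medication identifier
        "sender": sender,
        # Art. 4-6: report to the MAH plus the previous and next member
        "recipients": recipients,
        "timestamp": datetime.now(timezone.utc).isoformat(),
    })

message = build_event("ownership_change", "0000000000001",
                      sender="wholesaler-br-01",
                      recipients=["mah", "previous_member", "next_member"])
```

Multiply one such message by every movement of every serialized unit, and the motivation for the network model discussed below becomes clear.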











To ease the pain of sending all these messages back and forth, a network model is currently under discussion, and in the past few weeks some more details have emerged through collaboration with the different stakeholder organizations like Interfarma and Sindusfarma. Stay tuned for more news!


PS: The original documents and English translations are available and can be shared with SAP customers.

Biotech companies can already differentiate themselves through the ability to process large amounts of data from different sources, i.e. to gather it, consolidate it, represent it in a comprehensible way, and deliver it safely. Some biotech companies already offer this as a core service. Those that can validate the quality of data and extract the relevant information have a competitive advantage. In-memory technology opens up even more opportunities, as different data formats, including unstructured texts, can be consolidated easily and quickly from various data sources.


Another way to stand out positively in the market is the ability to analyze, using smart algorithms, the raw data that biotech companies can obtain by collaborating with hospitals. To do this, a biotech company can pursue two possible strategies:

  1. Providing added value by processing Big Data very rapidly through a comparably simple algorithm, or
  2. Offering insights from relatively little data from various sources by using a highly complex algorithm.

Both approaches can be enabled by technology. Big Data solutions offer the necessary IT capacity to process huge data volumes in extremely short periods of time, as well as the capability to run data mining intelligently and quickly.

Examples indicating the future direction

There are already many success stories about how Big Data technology has helped advance personalized medicine:


The National Center for Tumor Diseases (NCT) has gained new insights to fight cancer. The project “Medical Research Insights”, which is based on the in-memory platform SAP HANA, helps develop new personalized therapies. Employees can now capture enormous amounts of data per patient and analyze it in real time. Whether we are looking at medical reports, MRT results, genetic analyses or cancer registry data, all information comes together in one central place. It can be determined very quickly which therapy has the greatest probability of working best for which patient.


“ProteomicsDB”, a database focusing on the human proteome that was jointly developed by SAP and one of Munich’s universities, Technische Universität München (TUM), supports scientists in exploring the human proteome and conducting fundamental research. When building up this database, the human proteome was captured to an unprecedented degree of completeness, assessed in a structured way, and pooled – coping with extremely high data volumes and various scattered data sources.


Alacris Theranostics has come up with an innovative Virtual Patient Platform that makes it possible to accurately analyze the exact type of cancer of a specific patient and to find the best therapy. In the background, molecular data of the patient and algorithms are used to derive the behavior of the tumor and to simulate the efficacy of different therapies. The complex mathematical model contains thousands of variables like genes, proteins and tests for every drug and every dose. With SAP HANA, these simulations could be reduced to only a few minutes.


Stanford School of Medicine has provided a database of genetic predispositions that was combined with thousands of genomes from the 1,000 Genomes Project in SAP HANA to allow the data to be analyzed interactively. Some analyses could be accelerated by a factor of 17 to 600, and others were not even conceivable with traditional setups. This database opens up many new opportunities, e.g. finding personalized therapies for chronic diseases like diabetes.


Mitsui Knowledge Industry (MKI) leveraged SAP HANA not only to dramatically accelerate DNA sequencing analysis, but also to lower the cost of DNA extraction and analysis from USD 1 million to less than USD 1,000, making DNA analysis affordable for many more patients.


Summing up, new database architectures and new IT infrastructures make it possible to overcome hurdles such as complexity, high data volumes, and numerous data sources, and to find answers to questions that could not be solved before. Further, saving time and computing capacity can enable pharma companies to make sound decisions at earlier R&D stages than before, which means saving cost and time to market. Big Data technology also allows a much better understanding of illnesses that have so far not been investigated much through clinical studies. For example, many clinical trials concentrate geographically on Europe, which means other regions are currently underrepresented. Big Data solutions can help fill the gap by identifying suitable participants worldwide and processing the data. They can also help shift the focus of R&D from investigating how to deal with symptoms towards the root causes of a chronic disease, and further advance personalized medicine more quickly.


This blog was written jointly by Emanuel Ziegler and me. I would like to thank Emanuel for all his great insights and support! The content of this blog was first published in a shorter version in German on goingpublic.de.

Pharma companies need strong partners for R&D


Data and scientific insights are the key to innovation for pharma companies. The need to increase R&D productivity has grown due to the patent cliff, intense global competition, and other reasons. As scientific advances progress rapidly, opportunities are certainly there – but they are currently paired with challenges.


Complex collaboration networks in R&D and personalized medicine


To be able to bring innovations to market more quickly and to understand earlier whether an R&D project should be stopped or continued, the pharma industry collaborates closely with external partners like biotech companies and Contract Research Organizations from all over the world. From a practical perspective, this means that pharma companies need to be able to analyze data from a large number of partners, and from even more possible data sources – mostly in different formats. This becomes a particular challenge when historic data needs to be connected with the most recent studies. If this hurdle cannot be overcome, potentially important new scientific results remain unexploited.


It is not only the complex collaboration networks in R&D that lead to massive amounts of data; the advances in personalized medicine do so as well. As genomic profiles and other individual characteristics are analyzed in personalized medicine to develop tailored therapies, genomics, transcriptomics and proteomics offer huge opportunities for biotech companies. But these opportunities come with challenges from a pure data perspective already.


Data privacy


Owing to data protection laws, data can become available too late, or it can become less meaningful for further investigation of new scientific questions. One example: if you want to find out whether and how traits like gender, region, nutrition or genetic preconditions impact an illness, and which therapy would have the highest efficacy in which case, you simply need a certain set of information about patients. But data may not be presented in a way that allows conclusions to be drawn about individual patients. The operative word is “can”: it is not allowed that individual non-authorized persons are even theoretically able to deduce which patient is behind the information – and this can quickly happen if you, for example, examine persons with a relatively rare illness in a small, defined region.
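The re-identification risk described above is often framed as k-anonymity: every combination of quasi-identifiers should be shared by at least k patients. A minimal sketch of such a check (field names and sample records are purely illustrative):

```python
# Minimal k-anonymity check illustrating the re-identification risk
# described above: any combination of quasi-identifiers shared by fewer
# than k patients could in theory be traced back to an individual.
# Field names and the sample records are illustrative.
from collections import Counter

def k_anonymity_violations(records, quasi_identifiers, k=5):
    """Return quasi-identifier combinations held by fewer than k records."""
    groups = Counter(tuple(rec[q] for q in quasi_identifiers)
                     for rec in records)
    return {combo: n for combo, n in groups.items() if n < k}

patients = [
    {"gender": "f", "region": "north", "diagnosis": "rare_x"},
    {"gender": "m", "region": "south", "diagnosis": "common_y"},
    {"gender": "m", "region": "south", "diagnosis": "common_y"},
]
# The single patient with a rare illness in one region is re-identifiable.
risky = k_anonymity_violations(patients, ["gender", "region", "diagnosis"], k=2)
```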


This requires sophisticated setups protecting information from unauthorized access, and still some problems remain tough to tackle.


Varying data quality


Data quality can vary extremely for various reasons. The biggest quality gaps can be found in the age of data, data maintenance, and incorrectness of data, which can arise from measurement errors, for example. Keeping the whole original data set instead of using pre-aggregated data makes it easier to find and correct such inconsistencies.


In the future, we can expect data quality to improve more and more. But even if the amount of useful data won’t be the bottleneck for R&D in life sciences, the permanently growing amount of data can still be quite challenging.

Growing data volumes


Genome sequencing is becoming more and more affordable, and even proteomes can be analyzed more and more quickly. Consequently, new correlations can be found faster – e.g. the effect of specific therapies for particular genome mutations – which in turn provides enormous opportunities to explore complex interrelationships of the human metabolism. Genomes, transcriptomes, proteomes, phenotypes – the amount of data for personalized medicine is growing at a breathtaking pace.


Hospitals have great potential to provide a vast amount of high-quality insights into the causes of diseases and into clinical studies. In addition, data can be generated directly from patients through wearable medical devices or other mobile devices like smartphones. With growing acceptance on the patient side, data volumes could explode. To take full advantage of the scientific power of this data, some hurdles have to be overcome. Companies and hospitals need support to capture the data in a systematic way, not only to comply with data protection laws but also from a technical point of view. In most cases, data is generated in different departments within clinics that largely work in silos. When, for example, data from internal medicine, oncology, and rehabilitation is needed to better understand the situation of a patient, data capture alone can be cumbersome.


Biotech companies that shine with faster and more precise results for R&D in pharma will be the winners of tomorrow. As outlined above, this is also a Big Data play that can be approached in various ways, which we will describe in the next blog, “Big Data strategies for biotech companies”.


This blog was written jointly by Emanuel Ziegler and me. I would like to thank Emanuel for all his great insights and support! The content of this blog was first published in a shorter version in German on goingpublic.de.

Pharmaceutical companies know that the age of blockbuster drugs is dead. Billion-dollar drugs go off patent before generating expected returns or before a company can develop another blockbuster. A study of 150 major products from 15 drug makers between 2007 and 2013 found that 54 products lost sales growth and 26 products lost blockbuster status. Companies are struggling to stay profitable while they figure out what’s next.


Meanwhile, healthcare providers are grappling with declining reimbursements from private and public payers. U.S. healthcare utilization was up across the board in 2013, yet Medicare and Medicaid reimbursements to doctors continued to decline. In 2013, more than 200 nonprofit hospitals and health systems grew expenses faster than revenues, according to Moody’s.


The healthcare financial crisis is not limited to the United States, either. Across Europe, governments are pushing patients and providers to be more accountable for skyrocketing costs and in some cases are raising patient copays.


To get paid in the near future, providers must figure out how to deliver excellent patient outcomes that lower the cost of care. So long, sweet days of profits by volume and fee-for-service income. Drug makers need to abandon one-size-fits-all, blockbuster drug development in favor of developing more drugs for smaller patient populations and developing other lines of business.


See the attached link to the full white paper, “How to Save Healthcare: Personalize It”.


Stem Cell Therapy

Posted by Ashutosh Tol Oct 18, 2014

The human body is made up of about 200 different kinds of specialized cells, but all of these develop from a single type of cell: the stem cell. Stem cells are also known as “god cells”. Stem cell therapy – put simply, therapy in which diseases are treated with stem cells – is an emerging branch of medicine and a new era in scientific research for diseases that are untreatable today. Recent studies suggest that stem cells have the potential to treat a wide range of diseases.



These are the sources of stem cells present in the human body.

Stem cells mainly act on damaged cells of the body; it is in their nature to migrate towards the damaged part of the body. Stem cells are undifferentiated cells that have the potential to develop into different kinds of cells, so they naturally act as a repair system in the human body.



These are the areas in which patients with different diseases are treated with stem cells.


This therapy also creates opportunities for new business areas in the life sciences industry, namely cord blood banking and stem cell banking.

The future of stem cell therapy looks bright: stem cells are already used in liver failure to replace liver cells destroyed by viruses, drugs, and alcohol.

In Parkinson’s disease, they can replace lost dopamine-releasing cells. In diabetes, they could help replace lost insulin-producing beta cells.

In short, stem cell therapy opens the door to a new world in which we can think about treating diseases that are untreatable in today’s medicine.

I recently attended the American Medical Device Summit hosted by Generis, which was targeted towards senior managers in the main areas of engineering, manufacturing, quality management and regulatory affairs – so the conference covered broad topics ranging from R&D to quality management and compliance to risk management and M&A. I first wondered if this topic selection wouldn’t be too broad. But there was one element that tied everything together: the patient.


The best-run medical device companies have probably always thought of the patient first. But now that, in many countries, reimbursements are paid based on patient outcomes, patient satisfaction becomes even more important. A patient-centric approach within the innovation process was vividly illustrated by one of the speakers. What many consumer products companies do with their consumers, best-run medical device companies apply as well: they go out to patients who are willing to share real-life experiences and observe how products are actually being used. This way they can see what patients may not even recognize as worth telling – a huge potential for innovative ideas to improve existing products or to optimize the design of new prototypes.

From many presentations, one message came out clearly: patient-centricity is also key when it comes to product quality and compliance. How would you define quality criteria if not from a patient perspective on safety and efficacy? And what happens to quality if this is not perfectly understood at all levels within the medical device manufacturer and its suppliers? To make sure top quality is achieved, all employees need to execute on quality. Similar to visiting patients to explore their real needs, quality managers need to gain a deep understanding of the environment employees are working in before defining measures to ensure process and product quality. This implies walking directly up to them, asking them, and collaborating with them. Top quality also means applying the same quality standards across the enterprise. I can only stress this, since I have heard from many of our life sciences customers that the FDA wants to see what the company does as a whole, not fragments from various locations. This is why medical device companies should stick to one process, implement one single source of truth, and take advantage of automated workflows, e.g. for document approvals, assigning responsibilities, or guided escalation processes.


Talking about escalations: in order to prevent them, medical device manufacturers need a solid risk management system – which is a science in itself. You can always strive to minimize risk, you can benchmark within and across industries, but you will always be left with a residual risk. Risks with the highest priority rating concern patient safety, immediately followed by risks affecting product efficacy. What is the best advice for avoiding quality risks in medical devices? Strive for the highest quality and don’t hesitate to apply the most sophisticated risk models if needed. If you try this and miss some points, you will most probably still land in an acceptable range. If you just aim for the average, you run the risk of falling below acceptable quality.


And how does M&A fit into the concept of patient-centricity? When running post-merger integration, one key piece of advice stated at the event should be kept in mind: don’t lose focus on the value pursued through the investment – which in the end should be to serve the patient better.


Summing up, the three best practices I took from the summit are: put the patient at the center of all actions, patient safety and product efficacy come first, and strive for best-in-class quality rather than the average level.

Healthcare reforms, scientific progress, and technology advances are pushing towards personalized medicine and leading the life sciences industry to rethink therapies and business models. Some driving forces on the technical side that seem to be shaping the future of medication and health solutions are machine-to-machine technologies, mobile solutions, Big Data, and cloud computing.

Machine-to-machine technology: This sounds pretty powerful: according to a 2014 Vodafone report, 57% of health sciences companies will utilize M2M technologies by 2016. You could imagine applications within homecare where sensors and applications interact directly and either give an alert when health indicators exceed certain limits or help patients comply with therapy instructions. Of course you could also apply this technology to prevention. Sensors combined with apps that support healthier living could be built in anywhere in the future – which might not even stop at the bathroom door, as pointed out in this video by Constellation Research. We will see how far patients and consumers will actually be willing to go...
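The homecare alerting idea described above can be sketched in a few lines. The indicator names and limits below are purely illustrative assumptions, not any real device API:

```python
# Illustrative sketch of threshold-based M2M alerting: a sensor reading is
# checked against configured limits and an alert is raised when exceeded.
# Indicator names and limit values are invented for this example.

LIMITS = {"heart_rate": (40, 120), "systolic_bp": (90, 160)}

def check_reading(indicator, value):
    """Return an alert message if the value is outside its limits, else None."""
    low, high = LIMITS[indicator]
    if value < low or value > high:
        return f"ALERT: {indicator}={value} outside [{low}, {high}]"
    return None

print(check_reading("heart_rate", 135))   # triggers an alert
print(check_reading("systolic_bp", 120))  # None – within limits
```

In a real M2M setup the alert would of course be pushed to a caregiver or application rather than printed.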

Mobile: A direct link can be drawn between mobile solutions and machine-to-machine technology – so mobile solutions are probably equally powerful. There are already mobile apps that support collaborative care by bringing patients, their friends and families, physicians, and clinics closer together. The inherent data also offers opportunities to develop new drugs and therapies tailored to patient populations. Another set of mobile apps lies within sales and services – many of our life sciences customers provide their sales reps with mobile devices to make it easier for them to demonstrate products to their customers, and within the medical devices sector, service technicians use mobile apps to be at the right place at the right time. A third area for mobile apps: helping workers on the shop floor run manufacturing and maintain machines more easily.


Big Data: This seems to be a strong one as well. R&D is the most obvious space where Big Data comes into play: life sciences organizations have to deal with genomic and proteomic data to develop new products and services in support of personalized medicine, and they also need to consolidate clinical studies as well as scientific data from various sources. In other business areas, too, Big Data can support decision-making through what-if scenarios when analyzing markets, risk, operational processes, or demand and supply. Big Data tools can not only increase speed and precision; they can also reveal completely new insights, change perspectives, and help find new strategic directions.


Cloud: Cloud computing cannot be denied as a facilitator for gaining speed in life sciences, either. It helps save cost, operate more agilely, and scale more easily – which is vital during times of change, rapid growth, or after M&A. Also from a business perspective, cloud solutions can simplify life sciences processes, e.g. in sales and marketing, in HR, and when collaborating with CxOs. A hybrid setting between public and private cloud seems to be the most popular option in life sciences for safely dealing with sensitive data while also benefiting from high flexibility.


No doubt, the opportunities are there. What is your favorite? Which technology will give life sciences companies the greatest competitive edge? And where are the possible hurdles?

Please let us know your thoughts as comment here, contact us on twitter @SAP_Healthcare, or fill in the microsurvey “Future of Life Sciences technology”. We very much look forward to hearing from you! Likewise, we will keep you posted. Thank you very much in advance for your feedback and inputs!

Product manufacturing organizations deal with the shelf life of their products every day as part of their business activities. Shelf life determines a product’s suitability for use: shelf life dates specify the date from which a product can be used and until when.

Especially in the pharmaceutical, food, beverage, and chemical industries, these dates are of utmost importance for planning supply chain activities.

Supply chain challenges with Shelf life expiry dates:

  1. A minimum shelf life is required before goods enter an organization’s supply chain. This minimum value helps planners plan and utilize the goods in manufacturing and in delivering to customers. For quality compliance, these values also have to be maintained for each batch of the product.
  2. A minimum remaining shelf life is required when utilizing the product in manufacturing or delivering it to customers. Products from the warehouse must possess the required remaining shelf life before they are issued to production or to customers.
  3. Shelf life information on labels for the same product may vary when supplying customers in different (or even the same) geographic locations. For EOU organizations supplying finished goods to various customers, this becomes a cumbersome issue: the product data remains the same, and only the label information has to be changed for delivery purposes.
  4. Carrying batch details forward from the API to the finished goods batch is required to keep the latter up to date. Details like the dates of expiry and manufacturing are forwarded to the finished goods batch, since these represent the actual dates of an API/bulk. If multiple batches of API are used in the finished goods, the earliest expiry date among the API batches shall be applied to the finished goods batch.
  5. Shelf life rounding rules are sometimes required to round the shelf life expiry date to either the period start or the period end.
  6. Most importantly, shelf life details must be considered in the planning run. Standard MRP II does not consider expiry dates during net requirement calculation; it will even count soon-to-expire batches falling within the planning horizon. This creates stock deficiencies on the actual day of execution and causes chaos in warehouse management and on the shop floor.
  7. Recalling expired batches from the supply chain is a tedious task. If the primary supply chain is integrated with information systems, it is at least manageable, but batches lying at pharmacies, hospitals, etc. will be impossible to track without data exchange platforms.
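Point 4 above – propagating the earliest API expiry date to the finished goods batch – can be sketched as follows; batch numbers and dates are invented for illustration:

```python
# Sketch of expiry-date carry-forward: when several API batches go into one
# finished goods (FG) batch, the earliest API expiry governs the FG batch.
# Batch numbers and dates are hypothetical.
from datetime import date

def finished_goods_expiry(api_batches):
    """api_batches: list of (batch_no, expiry_date) tuples.
    Returns the earliest expiry date among all contributing API batches."""
    return min(expiry for _, expiry in api_batches)

api_batches = [
    ("API-001", date(2026, 6, 30)),
    ("API-002", date(2026, 3, 31)),  # earliest – becomes the FG expiry
    ("API-003", date(2026, 9, 30)),
]
print(finished_goods_expiry(api_batches))  # 2026-03-31
```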


Most of these challenges can be solved in standard SAP; planning, however, requires enhancing the standard MRP program to take shelf life details into account.
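As a rough sketch of such an enhancement (simplified data structures, not actual SAP internals), a shelf-life-aware net-requirement calculation would exclude batches whose expiry falls before the requirement date:

```python
# Hedged sketch only: a net-requirement calculation that ignores stock
# batches expired (or lacking the minimum remaining shelf life) by the
# requirement date – unlike standard MRP, which would count them.
from datetime import date

def net_requirement(demand_qty, requirement_date, batches, min_rem_days=0):
    """batches: list of dicts with 'qty' and 'expiry' (a date).
    Only stock still usable on the requirement date covers the demand."""
    usable = sum(
        b["qty"] for b in batches
        if (b["expiry"] - requirement_date).days >= min_rem_days
    )
    return max(demand_qty - usable, 0)

stock = [
    {"qty": 100, "expiry": date(2025, 1, 15)},  # expires before the need date
    {"qty": 80,  "expiry": date(2025, 6, 30)},
]
# Standard MRP would see 180 on hand; shelf-life-aware planning sees only 80:
print(net_requirement(200, date(2025, 2, 1), stock))  # 120
```

The `min_rem_days` parameter also covers the minimum remaining shelf life requirement from point 2 of the list above.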

www.isapbox.com provides a solution that demonstrates MRP including shelf life details.

Written by: Ruud Nieuweboer, SAP Life Science Consultant

With the coming ISO IDMP (2016) and serialization (> 2017) legislation, many pharmaceutical companies are struggling to comply. Some follow an ostrich tactic, while others face the complexity head-on. The latter discover that hidden in this complexity are opportunities to streamline internal processes (ISO IDMP) and increase revenues (serialization).

How, you might ask?

In short, what is ISO IDMP? It stands for Identification of Medicinal Products. ISO(1) has created five new standards. With these standards it will be possible to better track patient safety issues across countries and brands and to analyze the data for root cause analysis. A key element is the unique identification of the product and all of its substances.

"ISO IDMP will break silos and is a catalyst for harmonizing data."

ISO IDMP is all about data; most of it is scattered and stored in various systems and controlled documents. For globalized companies, updating their data to a global standard will be a huge task. But experience from current projects shows that ISO IDMP will break silos and is a catalyst for harmonizing data. In the end this will result in a streamlined static master data process, whose advantages are numerous: reduced costs in administrative processes and better insight into shared materials, where, for instance, substantial supply savings can be achieved. In a harmonized environment, data is also easier to interpret, especially when static and dynamic data are combined. Here serialization steps in; it covers the dynamic spectrum of the data.

“With a serialized infrastructure a pharmaceutical company will gain more supply chain insight and become more attractive as a CMO or distributor.”

Serialization is another burning topic within the industry. The key element of serialization is adding a unique number to the unit of issue (one or more levels below the batch level) for prescription drugs. The individual ‘packed product’ needs to be tracked throughout the supply chain to ensure at the point of dispense that the product is genuine and not counterfeit. The legislation will require many changes to artwork, controlled documents, IT systems, and packaging lines, at high cost and without apparent savings or benefits. Is that really true?
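The core mechanism – commissioning a unique serial per pack and verifying it at the point of dispense – can be sketched as follows. A real implementation would use GS1 identifiers and a regulated repository; the in-memory set here is purely an illustrative stand-in:

```python
# Illustrative sketch only: assign a unique serial per packed unit at
# packaging time and verify it at dispense. The in-memory set is an
# assumption standing in for a real serialization repository.
import uuid

repository = set()  # serials commissioned by the manufacturer

def commission_pack():
    """Generate and register a unique serial for one unit of issue."""
    serial = uuid.uuid4().hex
    repository.add(serial)
    return serial

def verify_at_dispense(serial):
    """Genuine only if commissioned and not yet dispensed."""
    if serial in repository:
        repository.remove(serial)  # decommission: a copied serial can't be reused
        return True
    return False  # unknown or already dispensed -> suspect counterfeit

s = commission_pack()
print(verify_at_dispense(s))  # True – genuine, first dispense
print(verify_at_dispense(s))  # False – same serial seen twice
```

Decommissioning on dispense is what makes a duplicated (counterfeit) serial detectable the second time it appears.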


The big pharma companies are already in third gear and have the funding and power to make these complex projects work. Most of them are even participating in the fora that discuss future operating models for serialization. The mid-size market, however, is looking for ways to manage these projects at low cost, for example by sharing a platform or by taking a close look at the physical supply chain to see where changes can be made to minimize impact. Although the mid-market is feeling the pressure, a business case can be made: with a serialized infrastructure a pharmaceutical company will gain more supply chain insight and become more attractive as a CMO or distributor. Investing in a serialized infrastructure is a strategic choice to stay ahead of your competition.


ISO IDMP and serialization are two main drivers for increasing patient safety, but both also have a big impact on pharmaceutical companies’ processes and IT infrastructure. The deadlines are deceiving, as they appear to be far in the future; the reality, however, is that the lead time of these types of projects is long. If the legislative pressure isn’t enough to start now, why not focus on the possible benefits and create a business case?


By Steven de Bruijn

Private and business users benefit from cloud technology every day; however, it is not yet used to its full potential by the average regulated company. We need to understand what the cloud means for these companies, what the perceived obstacles are, and how to overcome them while fulfilling the regulator’s expectations.

“The Cloud”

Before we start, let’s define what the cloud is, and explore what flavors the market has to offer. Cloud computing is a very generic term, and suggests the idea of a “black box”. And in fact, this is quite accurate as the cloud is an abstract mixture of IT infrastructural components. Furthermore various sorts of applications can be deployed in the cloud, such as collaborative tools, ERP systems, procurement platforms, document management systems, and so on.


Typically, three models of services are distinguished for cloud computing:


  1. Software as a Service (SaaS) - Configured applications, containing all infrastructure and platform components including hosting facilities, are delivered to the regulated company.
  2. Platform as a Service (PaaS) - Middleware, including all infrastructure and hosting facilities, is delivered to the regulated company. Middleware configuration, application installation and configuration are done by the regulated company.
  3. Infrastructure as a Service (IaaS) - Computational and storage resources, including all network components and hosting facilities, are delivered to the regulated company. Depending on contract conditions, the regulated company can install, configure and maintain the OS, middleware and software applications.


Another classification is found in the type of cloud, namely Public, Private, Hybrid, or Community. Simply put, in a public cloud the end users do not know who else has jobs running on the same external server, network, or disks, while in a private cloud the infrastructure is designed and delivered for the exclusive use of the regulated company and may be located in-house or externally. For the specifics of each type, visit the National Institute of Standards and Technology (NIST) website at www.nist.gov.


Cloud providers offer several clear benefits such as extremely fast and flexible solution delivery, on-demand scalability, business continuity solutions, relatively easy solutions for backup and archiving, and reduced TCO on infrastructure components. This is a strong proposition at considerably lower cost than traditional in-house computing. So why not start immediately?


So aren’t there any drawbacks to cloud computing and its providers at all? Experience shows that many suppliers offer cloud services but lack understanding of the needs of a regulated company. They fail to recognize the most significant GxP risks. Chances are that dropping the term “Annex 11” will not ring too many bells. However, if regulated content is managed in the cloud, the solution should go beyond what is required of most non-regulated business applications. Most importantly, the cloud solution should be validated and auditable.


Some companies in the regulated industry seem to lack an understanding of what cloud computing is and, equally important, what it is not. There is still a lot of ground to cover when it comes to agreeing on consistent terms that apply to the whole company, understanding the enabling technologies, and recognizing the interactions between the cloud and other applications. Such insights will convince your Quality department and prevent your quality controllers from misunderstanding the concept of a private or public cloud and overestimating the regulatory needs, which would cause them not to allow any cloud functionality at all. Last but not least, many regulated companies struggle to define their methods for validating cloud solutions. How can this struggle be broken down into manageable pieces focusing on the regulator’s areas of interest?

Cloud Compliance

For computer system validation, regulated companies traditionally rely on their IT department, which owns and manages the corporate IT infrastructure. This way, the regulated company sets up and qualifies its own machines, platforms, and environments for development, acceptance testing, and live use. The software supplier is audited, and ultimately the implemented system is validated.


In the case of cloud computing, an entirely different approach is required depending on the cloud model used. In any case, the regulated company is accountable to the regulatory authorities for the compliance of the IT infrastructure (IaaS and PaaS) and (GxP) applications (SaaS) that are used. This accountability cannot be transferred to cloud providers. The central goal of validation for the regulated company is to verify that the cloud provider conducts appropriate control over the cloud solution. This all starts with auditing the supplier to clarify what services will be provided and how they will be implemented, managed, controlled, and maintained.


In models 2 & 3 (PaaS and IaaS), the supplier qualifies and controls the infrastructure. It is the responsibility of the regulated company to verify that appropriate control is in place. Applications are owned and controlled by the regulated company in this scenario. Therefore, the validation of the applications will be similar to validating applications the traditional way, apart from some cloud-specific risks or issues.


Validation becomes more complex in a model 1 (SaaS) scenario, because the regulated company is not the owner and controller of the application, yet still responsible for validating the GxP SaaS application. The application is already installed and configured by the supplier and can’t always be reconfigured or customized to meet the regulated company’s requirements. The approach we propose is to assure that the application meets the requirements by verification through formal testing. Furthermore, verify that the split of responsibilities and tasks between the cloud provider and regulated company are documented in e.g. a formal SLA, as this is an Annex 11 (§3.1) requirement. Also ensure that appropriate control is conducted by the cloud provider, and establish procedures for use of the application.

Why go through all this qualification and validation effort?

Besides obvious reasons such as mitigation of your GxP and business risks, another driver should be the fact that the regulators are increasingly sticking their heads in the cloud as well. An auditor will be interested in what risks have been defined and how these are mitigated. Attention will be paid to how the integrity of your regulated data is assured, and what data backup and recovery measures have been taken. Compared to a traditional hosting model, more emphasis will be placed on cyber security for the networked cloud systems and to what extent privacy is safeguarded. Because a system in the cloud is as secure as its host, regulators will examine your supplier audits, assess SLAs and contracts you agreed on with your supplier, and inspect the supplier’s quality system.


New approaches for auditing are crucial, requiring cloud-specific IT technology knowledge, awareness of current IT certifications, understanding of legal aspects, and GxP & CSV knowledge. Goldfish ICT is developing compliance strategies and validation best practices for utilizing the cloud in a regulated environment. We enable our clients to adopt this technology while maintaining control of their IT landscape in a consistent manner. We would be very interested to share our findings and current state of knowledge with you. If you have any questions or remarks, please contact Steven de Bruijn, who is more than willing to get in touch on cloud computing in a GxP context.

