
SAP for Life Sciences


Biotech companies can already differentiate themselves through the ability to process large amounts of data from different sources, i.e. to gather it, consolidate it, represent it in a comprehensible way, and deliver it securely. Some biotech companies already offer this as a core service. Those who can validate the quality of data and extract the relevant information have a competitive advantage. In-memory technology opens up even more opportunities, as different data formats, including unstructured text, can be consolidated easily and quickly from various data sources.


Another way to stand out positively in the market is the ability to analyze raw data, which biotech companies can obtain by collaborating with hospitals, using smart algorithms. To do this, a biotech company can pursue two possible strategies:

  1. Providing added value by processing Big Data very rapidly with a comparatively simple algorithm, or
  2. Offering insights from relatively little data from various sources by using a highly complex algorithm.

Both approaches can be supported by technology. Big Data solutions offer the necessary IT capacity to process huge volumes of data in extremely short time periods, as well as the capability to run data mining intelligently and quickly.

Examples indicating the future direction

There are already many success stories of how Big Data technology has helped advance personalized medicine:


The National Center for Tumor Diseases (NCT) has gained new insights in the fight against cancer. The project “Medical Research Insights”, which is based on the in-memory platform SAP HANA, helps develop new personalized therapies. Employees can now capture enormous amounts of data per patient and analyze them in real time. Whether medical reports, MRI results, genetic analyses, or cancer registry data, all information comes together in one central place. It can be determined very quickly which therapy has the greatest probability of working best for which patient.


“ProteomicsDB”, a database focusing on the human proteome that was jointly developed by one of Munich’s universities, Technische Universität München (TUM), and SAP, supports scientists in exploring the human proteome and conducting fundamental research. In building up this database, the human proteome was captured to an unprecedented degree of completeness, assessed in a structured way, and pooled – coping with extremely high data volumes and various scattered data sources.


Alacris Theranostics has come up with an innovative Virtual Patient Platform that makes it possible to accurately analyze the exact type of cancer of a specific patient and to find the best therapy. In the background, molecular data of the patient and algorithms are used to derive the behavior of the tumor and to simulate the efficacy of different therapies. The complex mathematical model contains thousands of variables such as genes, proteins, and tests for every drug and every dose. With SAP HANA, these simulations could be reduced to only a few minutes.


Stanford School of Medicine has provided a database of genetic predispositions that was combined with thousands of genomes from the 1000 Genomes Project in SAP HANA to allow the data to be analyzed interactively. Some analyses could be accelerated by a factor of 17 to 600, and others were not even feasible with traditional setups. This database opens up many new opportunities, e.g. finding personalized therapies for chronic diseases like diabetes.


Mitsui Knowledge Industry (MKI) leveraged SAP HANA not only to dramatically accelerate DNA sequencing, but also to lower the cost of DNA extraction and analysis from USD 1 million to less than USD 1,000. This made DNA analysis affordable for many more patients.


Summing up, new database architectures and new IT infrastructures make it possible to overcome hurdles such as complexity, high data volumes, and numerous data sources, and to answer questions that could not be solved before. Further, saving time and computing capacity can enable pharma companies to make sound decisions in earlier R&D stages than before, which means reducing cost and time to market. Big Data technology also allows a much better understanding of illnesses that have not been thoroughly investigated through clinical studies. For example, many clinical trials are geographically concentrated in Europe, which means other regions are currently underrepresented. Big Data solutions can help fill the gap by identifying suitable participants worldwide and processing the data. They can also help shift the focus of R&D from treating symptoms towards the root causes of chronic diseases, and further advance personalized medicine more quickly.


This blog was written jointly by Emanuel Ziegler and me. I would like to thank Emanuel for all his great insights and support! The content of this blog was first published in a shorter version in German on goingpublic.de.

Pharma companies need strong partners for R&D


Data and scientific insights are the key to innovation for pharma companies. The need to increase R&D productivity has grown due to the patent cliff, intense global competition, and other reasons. As scientific advances progress rapidly, opportunities are certainly there – but currently paired with challenges.


Complex collaboration networks in R&D and personalized medicine


To be able to bring innovations to market more quickly and to understand earlier whether an R&D project should be stopped or continued, the pharma industry collaborates closely with external partners such as biotech companies and Contract Research Organizations from all over the world. From a practical perspective, this means that pharma companies need to be able to analyze data from a large number of partners, and from even more possible data sources – which mostly come in different formats. This becomes a particular challenge when historic data needs to be connected with the most recent studies. If this hurdle cannot be overcome, potentially important new scientific results remain unexploited.


It is not only the complex collaboration networks in R&D that lead to massive amounts of data; advances in personalized medicine do so as well. Since in personalized medicine genomic profiles and other individual characteristics are analyzed to develop tailored therapies, genomics, transcriptomics, and proteomics offer huge opportunities for biotech companies. But these opportunities come with challenges from a pure data perspective already.


Data privacy


Owing to data protection laws, data can become available too late, or it can become less meaningful for further investigation of new scientific questions. One example: if you want to find out whether and how traits like gender, region, nutrition, or genetic preconditions impact an illness, and in which case which therapy would have the highest efficacy, you simply need a certain set of information about patients. But data must not be presented in a way that allows conclusions to be drawn about individual patients. The operative word is “can”: it is not permitted that single non-authorized persons are even theoretically able to deduce which patient is behind the information – and this can happen quickly, for example, if you examine persons with a relatively rare illness in a small, defined region.


This requires sophisticated setups protecting information from unauthorized access, and still some problems remain tough to tackle.
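The re-identification risk described above is often quantified as k-anonymity: the size of the smallest group of patients sharing the same combination of quasi-identifiers (gender, region, and so on). Below is a minimal sketch in Python, assuming a simple list-of-dictionaries data set with purely illustrative field names:

```python
from collections import Counter

def k_anonymity(records, quasi_identifiers):
    """Return the size of the smallest group sharing the same
    quasi-identifier values. A small k (e.g. k < 5) means individual
    patients may be re-identifiable, so the data would need to be
    generalized or suppressed before release."""
    groups = Counter(
        tuple(r[q] for q in quasi_identifiers) for r in records
    )
    return min(groups.values())

records = [
    {"gender": "f", "region": "Bavaria", "diagnosis": "rare_disease_X"},
    {"gender": "f", "region": "Bavaria", "diagnosis": "diabetes"},
    {"gender": "m", "region": "Bavaria", "diagnosis": "diabetes"},
]
# The single male record forms a group of size 1 -- with gender and
# region known, that patient could be re-identified.
print(k_anonymity(records, ["gender", "region"]))  # 1
```

In practice, anonymization tooling generalizes quasi-identifiers (e.g. region up to country) until k reaches an acceptable threshold.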


Varying data quality


Data quality can vary extremely for various reasons. The biggest quality gaps can be found in the age of data, in data maintenance, and in data errors that can arise, for example, from measurement mistakes. Keeping the whole original data set instead of using pre-aggregated data makes it easier to find and correct such inconsistencies.
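A small, illustrative sketch of why the raw data matters (the numbers are made up): an aggregate computed upstream bakes a measurement error in permanently, while the raw readings still allow it to be detected and excluded.

```python
import statistics

raw = [5.1, 5.3, 4.9, 50.2, 5.0]  # one obvious measurement error (50.2)

# A pre-aggregated mean can no longer be corrected:
pre_aggregated_mean = statistics.mean(raw)          # 14.1 -- distorted

# With the raw readings, a simple median-distance filter catches the outlier:
median = statistics.median(raw)                     # 5.1
cleaned = [x for x in raw if abs(x - median) < 2.0]
corrected_mean = statistics.mean(cleaned)           # 5.075
```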


In the future, we can expect data quality to improve more and more. But even if the amount of useful data won’t be the bottleneck for R&D in Life Sciences, the permanently growing amount of data can still be quite challenging.

Growing data volumes


Genome sequencing is becoming more and more affordable, and even proteomes can be analyzed ever more quickly. Consequently, new correlations can be found faster – e.g. the effect of specific therapies on particular genome mutations – which again can provide enormous opportunities to explore complex interrelationships of the human metabolism. Genomes, transcriptomes, proteomes, phenotypes – the amount of data for personalized medicine is growing at a breathtaking pace.


Hospitals have great potential to provide a vast amount of high-quality insights about the causes of diseases and about clinical studies. In addition, data can be generated directly from patients through wearable medical devices or other mobile devices like smartphones. With growing acceptance on the patient side, data volumes could explode. To take full advantage of the scientific power of this data, some hurdles have to be overcome. Not only to comply with data protection laws, but also from a technical point of view, companies and hospitals need support to capture the data in a systematic way. In most cases, data is generated in different departments within clinics that mostly work in silos. When, for example, data from internal medicine, oncology, and rehabilitation is needed to better understand a patient’s situation, data capture alone can be cumbersome.


Biotech companies that shine with faster and more precise results for R&D in pharma will be the winners of tomorrow. As outlined above, this is also a Big Data play, which can be approached in various ways that we will describe in the next blog, “Big Data strategies for biotech companies”.


This blog was written jointly by Emanuel Ziegler and me. I would like to thank Emanuel for all his great insights and support! The content of this blog was first published in a shorter version in German on goingpublic.de.

Pharmaceutical companies know that the age of blockbuster drugs is dead. Billion-dollar drugs go off patent before generating expected returns or before a company can develop another blockbuster. A study of 150 major products from 15 drug makers between 2007 and 2013 found that 54 products lost sales growth and 26 products lost blockbuster status. Companies are struggling to stay profitable while they figure out what’s next.


Meanwhile, healthcare providers are grappling with declining reimbursements from private and public payers. U.S. healthcare utilization was up across the board in 2013, yet Medicare and Medicaid reimbursements to doctors continued to decline. In 2013, more than 200 nonprofit hospitals and health systems grew expenses faster than revenues, according to Moody’s.


The healthcare financial crisis is not limited to the United States, either. Across Europe, governments are pushing patients and providers to be more accountable for skyrocketing costs and in some cases are raising patient copays.


To get paid in the near future, providers must figure out how to deliver excellent patient outcomes that lower the cost of care. So long, sweet days of profits by volume and fee-for-service income. Drug makers need to abandon one-size-fits-all, blockbuster drug development in favor of developing more drugs for smaller patient populations and developing other lines of business.


See the attached link to the full white paper, “How to Save Healthcare: Personalize It”.


Stem Cell Therapy

Posted by Ashutosh Tol Oct 18, 2014

The human body is made up of about 200 different kinds of specialized cells, but all specialized cells derive from one cell type, called the stem cell. Stem cells are also known as “God cells”. Stem cell therapy is an emerging branch of medicine and a new era in scientific research into untreated diseases. In simple words, it is therapy in which diseases are treated with stem cells. Recent studies suggest that stem cells have the potential to treat a wide range of diseases.



These are the sources of stem cells present in the human body.

Stem cells mainly act on the damaged cells of the body; it is in their nature to migrate towards the damaged part of the body. Stem cells are undifferentiated cells that have the potential to develop into different kinds of cells, and they naturally act as a repair system in the human body.



These are the regions where patients with different diseases are treated with stem cells.


This therapy also creates opportunities for a new business area in the life sciences industry: cord blood banking and stem cell banking.

The future of stem cell therapy looks bright: stem cells are already used in liver failure to replace liver cells destroyed by viruses, drugs, and alcohol.

In Parkinson’s disease, stem cells can replace lost dopamine-releasing cells. In diabetes, they could help replace lost insulin-producing beta cells.

In short, stem cell therapy opens the door to a new world in which we can contemplate treating diseases that are untreatable in today’s medicine.

I recently attended the American Medical Device Summit hosted by Generis, which was targeted towards senior managers in the main areas of engineering, manufacturing, quality management and regulatory affairs – so the conference covered broad topics ranging from R&D to quality management and compliance to risk management and M&A. I first wondered if this topic selection wouldn’t be too broad. But there was one element that tied everything together: the patient.


Best-run medical device companies have probably always thought of the patient first. But now that in many countries reimbursements are paid based on patient outcomes, patient satisfaction becomes even more important. A patient-centric approach within the innovation process was vividly illustrated by one of the speakers. What many consumer products companies do with their consumers, best-run medical device companies apply as well: they go out to patients who are willing to share real-life experiences and observe how products are being used in reality. This way they can see what patients may not even recognize as worth telling – a huge source of potential for new ideas to improve existing products or to optimize the design of new prototypes.

From many presentations, one message came out clearly: patient-centricity is also key when it comes to product quality and compliance. How would you define quality criteria if not from a patient perspective on safety and efficacy? And what happens to quality if this is not perfectly understood at all levels within the medical device manufacturer and its suppliers? To make sure top quality is achieved, all employees need to execute on quality. Similarly to visiting patients to explore their real needs, quality managers need to gain a deep understanding of the environment employees are working in before defining measures to ensure process and product quality. This implies walking directly to them, asking them, and collaborating with them. Top quality also means applying the same quality standards across the enterprise. I can only stress this, since I have heard from many of our life sciences customers that the FDA wants to see what the company does as a whole, not fragments from various locations. This is why medical device companies should stick to one process, implement one single source of truth, and take advantage of automated workflows, e.g. for approving documents, assigning responsibilities, or guiding escalation processes.


Talking about escalations: in order to prevent them, medical device manufacturers need a solid risk management system – which is a science in itself. You can always strive to minimize risk, you can benchmark within and across industries, but you will always be left with a residual risk. Risks with the highest priority rating concern patient safety, immediately followed by risks affecting product efficacy. What is the best advice for avoiding quality risks in medical devices? Strive for the highest quality and don’t hesitate to apply the most sophisticated risk models if needed. If you try this and miss some points, you will most probably still land in an acceptable range. If you just aim for the average, you may run the risk of falling below acceptable quality.


And how does M&A fit into the concept of patient-centricity? When running post-merger integration, one key piece of advice stated at the event should be kept in mind: don’t lose focus on the value that is pursued through the investment – which in the end should be to serve the patient better.


Summing up, the three best practices I took from the summit are: put the patient at the center of all actions, let patient safety and product efficacy come first, and strive for best-in-class quality rather than the average level.

Healthcare reforms, scientific progress, and technology advances are pushing towards personalized medicine and leading the life sciences industry to rethink therapies and business models. Some driving forces from the technical side that seem to be shaping the future of medication and health solutions are machine-to-machine technologies, mobile solutions, Big Data, and cloud computing.

Machine-to-machine technology: This sounds pretty powerful: according to a 2014 Vodafone report, 57% of health sciences companies will utilize M2M technologies by 2016. You could imagine applications within homecare where sensors and applications interact directly and either give an alert when health indicators exceed certain limits, or help patients comply with therapy instructions. Of course, you could also apply this technology to prevention. Sensors combined with apps that can support healthier living could be built in anywhere in the future, which might not even stop at the bathroom door, as pointed out in this video by Constellation Research – we will see how far patients and consumers will actually be willing to go...
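The homecare alerting rule sketched above can be expressed in a few lines. This is a toy illustration only – the indicator names and limits are invented for the example, not clinical guidance:

```python
# Illustrative per-indicator limits: (lower bound, upper bound).
LIMITS = {"heart_rate": (40, 120), "spo2": (92, 100)}

def check_vitals(reading: dict) -> list:
    """Return the indicators in a sensor reading that fall outside
    their configured limits, i.e. those that should raise an alert."""
    alerts = []
    for indicator, value in reading.items():
        low, high = LIMITS.get(indicator, (float("-inf"), float("inf")))
        if not (low <= value <= high):
            alerts.append(indicator)
    return alerts

# A reading pushed from a wearable device:
print(check_vitals({"heart_rate": 130, "spo2": 95}))  # ['heart_rate']
```

In a real M2M setup, the device would push readings over the network and this rule would run on the receiving platform, triggering a notification to the caregiver.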

Mobile: A direct link can be drawn between mobile solutions and machine-to-machine technology – so mobile solutions are probably equally powerful. There are already mobile apps that support collaborative care by bringing patients, their friends and families, physicians, and clinics closer together. The data generated also offers opportunities to develop new drugs and therapies tailored to patient populations. Another set of mobile apps lies within sales and services – many of our life sciences customers provide their sales reps with mobile devices to make it easier for them to demonstrate products to their customers, and within the medical devices sector, service technicians use mobile apps to be at the right place at the right time. A third area for mobile apps: helping workers on the shop floor run manufacturing and conduct machine maintenance more easily.


Big Data: This seems to be a strong one as well. R&D is the most obvious space where Big Data comes into play in life sciences organizations which have to deal with genomic and proteomic data to develop new products and services in support of personalized medicine, and also need to consolidate clinical studies as well as scientific data from various sources. Also in other business areas Big Data can support decision-making through what-if scenarios when analyzing markets, risk, operational processes or demand and supply. Big Data tools can not only increase speed and precision, they can also reveal completely new insights, change perspectives, and help find new strategic directions.


Cloud: Cloud computing cannot be denied as a facilitator for gaining speed in Life Sciences, either. It helps save costs, operate more agilely, and scale more easily – which is vital during times of change, rapid growth, or after M&A. Also from a business perspective in Life Sciences, cloud solutions can simplify processes, e.g. in Sales and Marketing, HR, and when collaborating with CxOs. A hybrid setting between public and private cloud seems to be the most popular option in Life Sciences for dealing safely with sensitive data while also benefiting from high flexibility.


No doubt, opportunities are there. What is your favorite? Which technology will give life sciences companies the greatest power to get new competitive edges? And where are possible hurdles?

Please let us know your thoughts as a comment here, contact us on Twitter @SAP_Healthcare, or fill in the microsurvey “Future of Life Sciences technology”. We very much look forward to hearing from you! Likewise, we will keep you posted. Thank you very much in advance for your feedback and input!

Product manufacturing organizations deal with product shelf life every day as part of their business activities. Shelf life determines a product’s suitability for use: shelf life dates specify the date from which a product can be used and the date until which it can be used.

Especially in the pharmaceutical, food, beverage, and chemical industries, these dates are of utmost importance for planning supply chain activities.

Supply chain challenges with Shelf life expiry dates:

  1. A minimum shelf life is required before goods enter an organization’s supply chain. This minimum value helps planners plan and utilize the goods in manufacturing and in deliveries to customers. For quality compliance, these values also have to be maintained for each batch of the product.
  2. A minimum remaining shelf life is required when the product is used in manufacturing or delivered to customers. Products issued from the warehouse to production or to customers must possess the required remaining shelf life.
  3. Shelf life information on labels for the same product may vary when supplying customers in different or even the same geographic locations. For export-oriented (EOU) organizations supplying finished goods to various customers, this becomes a cumbersome issue, since the product data remains the same and only the label information has to be changed for delivery purposes.
  4. Batch details must be carried forward from the API to the finished goods batch. Details like the dates of expiry and manufacture are forwarded to the finished goods batch, since they represent the actual dates of the API/bulk. If multiple batches of API are used in the finished goods, the earliest expiry date among the API batches is applied to the finished goods batch.
  5. Shelf life rounding rules are sometimes required to round the shelf life expiry date to either the period start or the period end.
  6. Most importantly, shelf life details must be considered in the planning run. Standard MRP II does not consider expiry dates in the net requirements calculation; it even counts batches that will expire within the planning horizon. This creates stock shortages on the actual day of execution and causes chaos in warehouse management and on the shop floor.
  7. Recalling expired batches from the supply chain is a tedious task. If the primary supply chain is integrated with information systems, it is still somewhat difficult to manage, but batches lying at pharmacies, hospitals, etc. will be impossible to track without data exchange platforms.


While most of these challenges can be solved in standard SAP, shelf-life-aware planning requires enhancing the standard MRP program to consider shelf life details.
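To make challenges 4 and 6 above concrete, here is a minimal sketch – in Python rather than ABAP, with an invented data model, not standard SAP logic – of a shelf-life-aware net requirements calculation and the earliest-expiry carry-forward:

```python
from dataclasses import dataclass
from datetime import date, timedelta

@dataclass
class Batch:
    batch_id: str
    quantity: float
    expiry: date

def usable_stock(batches, requirement_date, min_remaining_days=0):
    """Stock that still meets the minimum remaining shelf life on the
    requirement date. Expired or soon-to-expire batches are excluded,
    unlike in a standard MRP II net requirements run."""
    cutoff = requirement_date + timedelta(days=min_remaining_days)
    return sum(b.quantity for b in batches if b.expiry >= cutoff)

def net_requirement(gross, batches, requirement_date, min_remaining_days=0):
    """Quantity still to be procured or produced once only the
    shelf-life-compliant stock is counted."""
    return max(0.0, gross - usable_stock(batches, requirement_date,
                                         min_remaining_days))

def carried_forward_expiry(api_batches):
    """Challenge 4: the finished goods batch inherits the earliest
    expiry date among the API batches consumed."""
    return min(b.expiry for b in api_batches)
```

A batch expiring before the requirement date plus the minimum remaining shelf life drops out of the available stock, so the planning run correctly triggers a replenishment proposal instead of relying on stock that will be unusable on the day of execution.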

www.isapbox.com has provided a solution that demonstrates MRP including shelf life details.

Written by: Ruud Nieuweboer, SAP Life Science Consultant

With the coming ISO IDMP (2016) and serialization (beyond 2017) legislation, many pharmaceutical companies are struggling to comply. Some follow an ostrich tactic, and some are facing the complexity head-on. The latter discover that hidden in this complexity there are opportunities for streamlining internal processes (ISO IDMP) and increasing revenues (serialization).

How, you might ask?

In short, what is ISO IDMP? It stands for Identification of Medicinal Products. ISO has created five new standards. With these standards it will be possible to better track patient safety issues across countries and brands and to analyze the data for root cause analysis. A key element is the unique identification of the product and all of its substances.

"ISO IDMP will break silos and is a catalyst for harmonizing data."

ISO IDMP is all about data, and most of that data is scattered across various systems and controlled documents. For globalized companies, updating their data to a global standard will be a huge task. But experience from current projects shows that ISO IDMP will break silos and is a catalyst for harmonizing data. In the end this will result in a streamlined static master data process, whose advantages are numerous: reduced costs in administrative processes, and better insight into shared materials, where, for instance, substantial supply savings can be achieved. In a harmonized environment, data is also easier to interpret, especially when static and dynamic data are combined. This is where serialization steps in: it covers the dynamic spectrum of the data.

“With a serialized infrastructure a pharmaceutical company will gain more supply chain insight and become more attractive as a CMO or distributor.”

Serialization is another burning topic within the industry. The key idea of serialization is adding a unique number to the unit of issue (one or more levels below the batch level) for prescription drugs. The individual ‘packed product’ needs to be tracked throughout the supply chain to ensure, at the point of dispense, that the product is genuine and not counterfeit. The legislation will result in many changes to artwork, controlled documents, IT systems, and packaging lines, at high cost and without apparent savings or benefits. Is that really true?
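In code terms, the core of such a scheme is small: generate a unique serial per saleable unit, record it, and verify it at the point of dispense. The sketch below is purely illustrative – real implementations follow GS1 standards such as SGTIN (company prefix, item reference, serial number), and the function names here are invented:

```python
import secrets

def issue_serials(gtin: str, count: int, issued: set) -> list:
    """Generate `count` unique serial numbers for a product (GTIN),
    guaranteeing no duplicates against the already-issued set."""
    serials = []
    while len(serials) < count:
        s = f"{gtin}.{secrets.token_hex(8)}"
        if s not in issued:
            issued.add(s)
            serials.append(s)
    return serials

def verify_at_dispense(serial: str, issued: set, dispensed: set) -> str:
    """Classify a scanned serial: genuine and unused ('ok'), scanned
    before ('already_dispensed', a duplicate/counterfeit signal), or
    never issued ('unknown', a counterfeit suspect)."""
    if serial not in issued:
        return "unknown"
    if serial in dispensed:
        return "already_dispensed"
    dispensed.add(serial)
    return "ok"
```

Scanning the same serial a second time returns 'already_dispensed' – exactly the signal a track-and-trace system uses to flag a potential counterfeit or diverted pack.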


The big pharma companies are already in third gear and have the funding and power to make these complex projects work. Most big pharma companies are even participating in the fora that discuss the future operating models for serialization. The mid-size market, however, is looking for ways to manage these projects at low cost, for example by sharing a platform, or by taking a close look at the physical supply chain to see where changes can be made to minimize impact. Although the mid-market is feeling the pressure, a business case can be made. With a serialized infrastructure, a pharmaceutical company will gain more supply chain insight and become more attractive as a CMO or distributor. Investing in a serialized infrastructure is a strategic choice to stay ahead of your competition.


ISO IDMP and serialization are two main drivers for increasing patient safety, but both also have a big impact on pharmaceutical companies’ processes and IT infrastructure. The deadlines are deceiving, as they appear to be in the far future. The reality, however, is that the lead time of these types of projects is long. If the legislative pressure isn’t enough to start now, why not focus more on the possible benefits and create a business case?


By Steven de Bruijn

Private and business users benefit from cloud technology every day; however, it is not yet used to its full potential by the average regulated company. We need to understand what the cloud means for these companies, what the perceived obstacles are, and how to overcome these obstacles while fulfilling the regulators’ expectations.

“The Cloud”

Before we start, let’s define what the cloud is and explore what flavors the market has to offer. Cloud computing is a very generic term and suggests the idea of a “black box”. In fact, this is quite accurate, as the cloud is an abstract mixture of IT infrastructure components. Furthermore, various sorts of applications can be deployed in the cloud, such as collaborative tools, ERP systems, procurement platforms, document management systems, and so on.


Typically, three models of services are distinguished for cloud computing:


  1. Software as a Service (SaaS) - Configured applications, containing all infrastructure and platform components including hosting facilities, are delivered to the regulated company.
  2. Platform as a Service (PaaS) - Middleware, including all infrastructure and hosting facilities, is delivered to the regulated company. Middleware configuration, application installation and configuration are done by the regulated company.
  3. Infrastructure as a Service (IaaS) - Computational and storage resources, including all network components and hosting facilities, are delivered to the regulated company. Depending on contract conditions, the regulated company can install, configure and maintain the OS, middleware and software applications.


Another classification is found in the type of cloud, namely Public, Private, Hybrid, or Community. Simply put, in a public cloud the end users do not know who else has jobs running on the same external server, network or disks. While in a private cloud the infrastructure is designed and delivered for the exclusive use by the regulated company and may be located in-house or externally. For the specifics of each type, visit the National Institute of Standards and Technology (NIST) website at www.nist.gov.


Cloud providers offer several clear benefits such as extremely fast and flexible solution delivery, on-demand scalability, business continuity solutions, relatively easy solutions for backup and archiving, and reduced TCO on infrastructure components. This is a strong proposition at considerably lower cost than traditional in-house computing. So why not start immediately?


So aren’t there any drawbacks to cloud computing and its providers at all? Experience shows that many suppliers offer cloud services but lack understanding of the needs of a regulated company. They fail to recognize the most significant GxP risks. Chances are that dropping the term “Annex 11” will not ring too many bells. However, if regulated content is managed in the cloud, the solution should go beyond what is required of most non-regulated business applications. Most importantly, the cloud solution should be validated and auditable.


Some companies in the regulated industry seem to lack an understanding of what cloud computing is and, equally important, what the cloud is not. There is still a lot of ground to cover when it comes to agreeing on consistent terms that apply to the whole company, understanding the enabling technologies, and recognizing the interactions between the cloud and other applications. Such insights will convince your Quality department and prevent your quality controllers from misunderstanding the concept of a private or public cloud, overestimating the regulatory needs, and consequently disallowing any cloud functionality at all. Last but not least, many regulated companies struggle to define their methods for validating cloud solutions. How can this struggle be broken down into manageable pieces that focus on the regulators’ areas of interest?

Cloud Compliance

For computer system validation, regulated companies traditionally rely on their IT department, which owns and manages the corporate IT infrastructure. In this way, the regulated company would set up and qualify its own machines, platforms, and environments for development, acceptance testing, and live use. The software supplier would be audited, and ultimately the implemented system would be validated.


In the case of cloud computing, an entirely different approach is required, depending on the cloud model used. In any case, the regulated company remains accountable to the regulatory authorities for the compliance of the IT infrastructure (IaaS and PaaS) and the (GxP) applications (SaaS) that are used. This accountability cannot be transferred to cloud providers. The central goal of validation for the regulated company is to verify that the cloud provider exercises appropriate control over the cloud solution. This all starts with auditing the supplier to clarify what services will be provided and how they will be implemented, managed, controlled, and maintained.


In models 2 and 3 (PaaS and IaaS), the supplier qualifies and controls the infrastructure. It is the responsibility of the regulated company to verify that appropriate control is in place. Applications are owned and controlled by the regulated company in this scenario, so validating them will be similar to validating applications the traditional way, apart from some cloud-specific risks and issues.


Validation becomes more complex in a model 1 (SaaS) scenario, because the regulated company is not the owner and controller of the application, yet it is still responsible for validating the GxP SaaS application. The application is already installed and configured by the supplier and cannot always be reconfigured or customized to meet the regulated company’s requirements. The approach we propose is to assure that the application meets the requirements through verification by formal testing. Furthermore, verify that the split of responsibilities and tasks between the cloud provider and the regulated company is documented, for example in a formal SLA, as this is an Annex 11 (§3.1) requirement. Also ensure that appropriate control is exercised by the cloud provider, and establish procedures for use of the application.
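The formal-testing step can be illustrated with a toy traceability check: every documented requirement must be covered by at least one passing verification test. This is a hedged sketch, not an SAP product or a GAMP template; the requirement IDs, test names, and data structures are all hypothetical.

```python
# Illustrative sketch of a requirements traceability check for validating a
# SaaS application against documented user requirements. All IDs are made up.

requirements = {
    "URS-001": "Audit trail records every change to GxP data",
    "URS-002": "Electronic signatures comply with Annex 11",
    "URS-003": "User access is role-based and individually attributable",
}

# Results of formal verification tests, each linked to a requirement.
test_results = [
    {"test": "OQ-TC-01", "requirement": "URS-001", "passed": True},
    {"test": "OQ-TC-02", "requirement": "URS-002", "passed": True},
    {"test": "OQ-TC-03", "requirement": "URS-003", "passed": True},
]

def uncovered_requirements(requirements, test_results):
    """Return requirement IDs that lack at least one passing test."""
    covered = {r["requirement"] for r in test_results if r["passed"]}
    return sorted(set(requirements) - covered)

gaps = uncovered_requirements(requirements, test_results)
print("Traceability gaps:", gaps or "none")
```

A real validation package would of course add risk assessments, test evidence, and supplier controls on top of such a coverage check.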

Why go through all this qualification and validation effort?

Besides obvious reasons such as mitigating your GxP and business risks, another driver should be the fact that the regulators are increasingly sticking their heads in the cloud as well. An auditor will be interested in what risks have been defined and how they are mitigated. Attention will be paid to how the integrity of your regulated data is assured and what data backup and recovery measures have been taken. Compared to a traditional hosting model, more emphasis will be placed on cyber security for networked cloud systems and on the extent to which privacy is safeguarded. Because a system in the cloud is only as secure as its host, regulators will examine your supplier audits, assess the SLAs and contracts you have agreed with your supplier, and inspect the supplier’s quality system.


New approaches to auditing are crucial, requiring cloud-specific IT knowledge, awareness of current IT certifications, understanding of legal aspects, and GxP and CSV expertise. Goldfish ICT is developing compliant strategies and validation best practices for using the cloud in a regulated environment. We enable our clients to adopt this technology while maintaining control of their IT landscape in a consistent manner. We would be very interested in sharing our findings and current state of knowledge with you. If you have any questions or remarks, please contact Steven de Bruijn, who is more than willing to get in touch about cloud computing in a GxP context.

Wearable health-related devices offer great opportunities. As cardiac electrophysiologist Kevin R. Campbell states in one of his blogs, more data is better. He finds that it enables better decisions based on facts and can give patients at least a bit more control over their illness, which he claims is especially true for people with chronic diseases. I would agree. I also see opportunities for innovating drugs and devices, as the data can be analyzed across different patient populations to develop better and more tailored treatments. If only the devices and apps were already broadly accepted! There are many privacy concerns out there; one writer even calls wearables a “privacy nightmare”.

I think building trust is possible, but it takes some effort. Here are some first evident points, most of them inspired by Alison Diana’s commentary in InformationWeek:

  1. All involved parties, e.g. physicians, patients, and researchers, agree on and confirm by signature their buy-in to common guidelines on which data is generated and how the data is used
  2. This should be followed by detailed instructions and education on what this concretely means in daily life for everybody who deals with the data
  3. Track, trace, and control the data from its generation through its distribution (when, where, and from whom to whom) to the point where someone accesses the raw or aggregated anonymized data, which allows better control over whether everybody follows the rules
  4. The patient needs full transparency on all the points above and access to every detail that concerns him or her, in order to dispel concerns, and must be authorized to intervene when necessary
  5. The solution has to be validated and trustworthy
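The track-and-trace idea in point 3 could be sketched in code as an append-only event log in which every access is attributable to a person. This is a purely illustrative model; the class and field names are hypothetical and not taken from any specific product.

```python
# Hypothetical sketch: every generation, transfer, or access of wearable data
# is logged with who, when, and what, so rule violations can be detected.
from datetime import datetime, timezone

class AuditTrail:
    def __init__(self):
        self.events = []

    def record(self, actor, action, dataset, anonymized):
        """Append one immutable event describing a data-handling step."""
        self.events.append({
            "timestamp": datetime.now(timezone.utc).isoformat(),
            "actor": actor,           # who handled the data
            "action": action,         # e.g. "generate", "transfer", "read"
            "dataset": dataset,       # which raw or aggregated dataset
            "anonymized": anonymized, # whether identifiers were removed
        })

    def accesses_by(self, actor):
        """All events attributable to one actor -- the basis for control."""
        return [e for e in self.events if e["actor"] == actor]

trail = AuditTrail()
trail.record("dr.smith", "read", "ecg-raw", anonymized=False)
trail.record("researcher.a", "read", "ecg-aggregated", anonymized=True)
print(len(trail.accesses_by("dr.smith")))  # prints 1
```

In practice the log would live in a tamper-evident store and feed the transparency the patient is promised in point 4.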


There may be more points to consider. It definitely needs some work and investment to achieve broad acceptance, but I think it will be worthwhile!

Want to read more about medical wearables? Here are some more blogs:

  1. Wearable Medical Devices: Always On for Better Health
  2. Making Medical Wearables Work: Top 3 Traits to Keep in Mind

What do you think about the issues discussed here? Continue the conversation in the comments below and on Twitter @SAP_Healthcare!

As pointed out in my previous blog "Wearable Medical Devices: Always On for Better Health", medical wearables may be the next big thing. As with probably any new trend, there may be some showstoppers out there. Above all: who will accept a wearable medical device that is not trustworthy? It’s all about effectiveness, safety, and quality.

1. Effectiveness:

What comes to mind first relates especially to devices that monitor health indicators and lifestyle. Will higher transparency really change behavior in a positive way? Do patients really want this kind of being watched – I mean, do they truly want it? Everybody who has broken off a diet at least once knows the difference between initial motivation and long-lasting will. Of course, when medical wearables relieve severe suffering, patients won’t go without them anymore. Secondly, do physicians have the capacity and tools to handle the additional information intelligently?

2. Safety:

Any kind of side effect should be avoided. Side effects can be related to the body; Google Glass, for example, is suspected of posing some health risks. Side effects may also relate to external actors trying to access your private data and misuse it. This is such a big concern that I will add a separate blog about privacy.

3. Quality:

This seems to be a no-brainer – when it comes to health, all measurements have to be precise and all functions have to work perfectly. Otherwise the medical wearable would just contribute to confusion, false alarms, wrong actions, or even worse consequences. And there are concerns that some medical apps don’t meet the high quality standards they should. Depending on the wearable’s features, quality also comprises sophisticated technical service and maintenance for customers. As one of the main features is being “always on” for improved health, manufacturers need to prevent devices from failing, so they need excellent predictive analytics capabilities and the ability to process huge amounts of customer data rapidly in order to respond in a timely manner.
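The predictive idea in the last sentence can be illustrated with a deliberately simple sketch: flag a device for service when a health indicator trends below a threshold. A real predictive maintenance solution would use far richer models and data; the function, window, and threshold values here are hypothetical.

```python
# Toy predictive-maintenance rule: if the moving average of a device health
# indicator (here, battery voltage) drops below a threshold, request service
# before the device actually fails. All numbers are illustrative.

def needs_service(readings, window=3, threshold=3.5):
    """Flag when the mean of the last `window` readings falls below threshold."""
    if len(readings) < window:
        return False  # not enough data yet to judge a trend
    recent = readings[-window:]
    return sum(recent) / window < threshold

voltages = [4.1, 4.0, 3.9, 3.6, 3.4, 3.3]
print(needs_service(voltages))  # prints True: recent average is below 3.5
```

The point is the principle, not the math: continuously streamed device data plus even a simple rule lets the manufacturer act before the "always on" promise is broken.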


Companies that are not experienced with authority approvals for life sciences products sometimes underestimate the effort required to get approved. It therefore makes sense to double-check whether a wearable is classified as a medical device. For example, the U.S. Food and Drug Administration (FDA) has recently published its guidance on Mobile Medical Applications, which lays out a risk-based approach. If a mobile app poses minimal risk to patients, premarket review applications, registrations, or listings are not necessary – e.g. for apps providing general educational information.


So, when a manufacturer can tick all the boxes mentioned above, technically the medical wearable is ready to take off. One more question remains: what about data privacy for wearables used to improve health? As this topic is hot, you can find more about it in the next blog, “Will Wearable Medical Devices Break Through? – Talking about Broad Acceptance and Privacy”.

What do you think about the issues discussed here? Continue the conversation in the comments below and on Twitter @SAP_Healthcare!

The wearables market is gaining pace – especially in the healthcare and medical devices space. The opportunities to improve health through “always on” diagnosis, constant monitoring of health indicators for preventive or therapeutic reasons with the possibility of instantly alerting the patient, doctors, or family and friends, and even direct therapeutic features are just too promising. Experts predict a multi-billion USD market to become reality in the upcoming years. Medical wearables can basically be grouped into two major categories: devices used in diagnostics and devices used in medical treatments.


1. Wearables supporting health monitoring and rehabilitation


Wearables supporting health monitoring and rehabilitation are already widespread: bio-sensors are able to track heartbeats (ECG), stress through skin reactions (electrodermal activity), brain activity (EEG), respiration, sleep quality, body temperature, steps, calorie burn, falls – you name it. This can be done through wrist bands, watches, smart glasses, plasters, smart tattoos, or devices you can wear around your waist or torso. Intelligent textiles like smart suits or even smart bras might also become serious business in the near future – who knows?

One big bet the big players are keen to win is around minimally or non-invasive wearables that facilitate glucose monitoring by leveraging electricity or ultrasound. Some insiders say Google is investing billions of USD in glucose-monitoring research, and Apple as well as the big medical device players are in the game too. The market would be massive not only because of the growing number of diabetes patients, but also because this might lower the barrier to using these devices once they become affordable, so that the use case could broaden to diets as well.


Bio-sensors a bit apart from what you would classically associate with wearables – but too fascinating not to mention here – are implantable devices that work under the skin and smart pills that contain technology to monitor glucose or heart rate with high precision.


Opportunities for Collaborative Care


All these diagnostic wearables can improve remote monitoring and home healthcare. What is more, they also make it possible to include friends and family in supporting therapies – when the patient wishes it. For example, they can be alerted in real time to health risks before the patient’s status becomes critical. Or they can participate in a healthier lifestyle for the patient through gamification elements. As the data can be aggregated and anonymized with in-memory technology, there are new opportunities for collaborative research between life sciences companies, clinicians, and other researchers, enabling multi-disciplinary approaches for personalized as well as evidence-based medicine.
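The aggregate-and-anonymize step could look roughly like the following sketch, which applies a simple k-anonymity-style rule: only groups containing at least k patients are released. The field names and threshold are illustrative assumptions, not a description of any SAP product.

```python
# Hedged sketch of aggregating patient data for research: group records by a
# coarse attribute and suppress groups smaller than k, so that no individual
# can be singled out. All field names and values are made up.
from collections import defaultdict

def aggregate(records, key, k=5):
    """Return the average heart rate per group, suppressing groups below k."""
    groups = defaultdict(list)
    for rec in records:
        groups[rec[key]].append(rec["heart_rate"])
    return {
        g: round(sum(v) / len(v), 1)
        for g, v in groups.items()
        if len(v) >= k  # small groups could identify patients, so drop them
    }

records = (
    [{"age_band": "40-49", "heart_rate": 70 + i} for i in range(6)]
    + [{"age_band": "50-59", "heart_rate": 80}]  # one record: suppressed
)
print(aggregate(records, "age_band", k=5))  # prints {'40-49': 72.5}
```

Real anonymization needs more than this (quasi-identifier analysis, re-identification risk assessment), but the suppression rule captures the core trade-off between data utility and privacy.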


2. Wearables supporting therapies and physicians


The second big group of medical wearables is around therapies and supporting physicians. This covers many traditional medical devices like pacemakers, voice modulators, and others. More recent examples that are prominently present in the media are smart glasses supporting physicians during appointments with patients as well as during and after surgeries. Where will the future head? I found one interesting blog that lists new ideas like wearables able to deliver acupressure to reduce pain, influence eye pressure, or affect blood circulation, and micro- and needle patches for acupuncture or drug delivery. There might also be opportunities to power up such new medical devices with solutions from SAP, e.g. mobile solutions for creating tailored apps, or in-memory technology for analyzing the data the devices generate.



Which wearable medical device will be the next big breakthrough? That will heavily depend on reliability, efficacy, and acceptance by authorities, physicians, and patients.


Would you like to read more about this? In this case, check out the following two blogs as well:

  1. Making Medical Wearables Work: Top 3 Traits to Keep in Mind
  2. Will Wearable Medical Devices Break Through? – Talking about Broad Acceptance and Privacy

What do you think about the issues discussed here? Continue the conversation in the comments below and on Twitter @SAP_Healthcare!

This blog was written jointly by Chuck Pharris and me. I would like to express to Chuck my appreciation for his inputs, research, and contribution and thank him for his fantastic engagement!




The life sciences industry has long been associated with innovation. This innovation is not only the perceived heart of success in this industry; it is the lifeblood of a vibrant economy, creating jobs, increasing productivity, and raising standards of living.

Organic growth through innovation has not been the only key to success for life sciences companies. The industry also went through a period of high-profile M&A activity as early as the 90s, with a resurgence in the last year and an especially high degree of activity in Q1 this year: 49 deals vs. 35 in the same period of 2013. Headline-grabbing news has included Pfizer’s record-setting $119B bid for AstraZeneca, Valeant’s attempted hostile takeover of Botox maker Allergan, the Novartis and GlaxoSmithKline asset swap, and Merck’s divestiture of its consumer care business to Bayer.


In spite of skepticism about achieving success in these large megamergers, given the challenges of large-scale integration, a McKinsey & Company report analyzed the top 20 pharmaceutical companies by 1995 revenue and their subsequent M&A history, showing that they created shareholder value, generated greater economic profit, and even changed long-term performance expectations.


The challenges these former mega-mergers faced – and those the recent M&A activity will face – are more core to traditional business operations than to any single new drug or technology. In particular, some drivers of mergers this year have centered around:

  • Margin Improvement: Novartis established an internal business services unit targeting $6 billion worth of expenses it oversees. Teva is seeking to cut in half the 75 manufacturing facilities it acquired through acquisitions.
  • Business Focus: Novartis is swapping assets with GlaxoSmithKline in a series of transactions worth $25 billion to better focus on their mutual core competencies. Novartis expects this to improve its core operating profit by 2.5%. The Harvard Business Review highlights a variant of this M&A frenzy in life sciences, “David-Goliath partnerships”, described as an alternative to the all-or-nothing disrupt-or-acquire view of startup competitors. Abbott spun off AbbVie and with it all of its big-name drugs, including Tricor, Niaspan, and Humira, the blockbuster anti-inflammatory with more than $10 billion in sales. Baxter plans to spin off its biotech business in 2015, enabling it to focus on its core medical technology business.
  • Tax and Cash Incentives: Another growing trend is US companies pursuing reverse acquisitions, in which a larger company buys a smaller foreign player and moves its headquarters to the smaller company’s location, seeking more favorable tax treatment or the ability to use foreign cash reserves without repatriating the cash into the US tax structure. Medtronic’s recent $42 billion takeover of Covidien, which is operated from Massachusetts but incorporated in Ireland, is reported by Bloomberg as the “biggest yet to renounce its US tax citizenship”. Covidien gets the majority of its sales from the US, where 19 of its 41 factories are located. Its headquarters originally shifted with its parent Tyco to Bermuda in 1997; it was then spun off as a Bermuda company before switching to Ireland. The deal will also give Medtronic access to its $20.5 billion in cash held overseas without the 35% US corporate income tax it would have to pay if the cash were brought into the US.

The performance bar is rising, given the industry’s maturity and market consolidation. The financial and operational challenges of these increasingly complex global enterprises grow across all aspects of finance, accounting, trade, R&D coordination, manufacturing and supply chain management, and overall innovation-oriented collaboration.


How can SAP help bring margin improvement, better research and collaboration to achieve greater business focus, and manage the complex financial and operational requirements of these global enterprises? Just a few illustrative examples:

  • Faster analytics for better strategic decision-making: Our solutions help companies get the insights they need to understand their performance and its root causes. The ability to run future scenarios quickly enables companies to make fact-based decisions on M&A, product portfolios, global manufacturing plants, HR, customer-related questions, finance, and tax – there are no limits.
  • Improved R&D processes: SAP HANA helps analyze genomic and proteomic data, and with solutions for R&D from SAP and our partners, companies can increase efficiency of Clinical Trial Supply Management and streamline project management.
  • Margin improvement through increased operational efficiencies: With an integrated solution that SAP offers, companies can standardize and consolidate operational processes globally across plants, such as Manufacturing, Supply Chain Management and Serialization, as well as Procurement and Management of CxOs. All this can be especially beneficial for achieving quick RoI after M&A.
  • Flawless post-merger integration processes: Flexible and robust HR solutions from SAP allow companies to manage change, maximize employee loyalty, identify and retain the best talent, and boost productivity.


To learn more about how we can help you with your business challenges, please have a look at the Life Sciences pages on sap.com!

What do you think about the issues discussed here? Continue the conversation in the comments below and on Twitter at SAP Health Sciences!

No matter if it’s about strategy, innovation, regulatory compliance, quality standards, or responding to requests from physicians and other customers – all the different parts within a life sciences company have one thing in common: the race for the highest precision.


Speed and precision may not always go hand in hand, but enterprises that excel in both disciplines will lead the game. It’s the deep insight into small pieces and single processes, as well as the ability to extract the essence of the big picture in real time, that makes the difference.


It’s the ability to identify scientific breakthroughs and market trends earlier than competitors, and to simulate the future in order to identify the right decisions, that creates competitive differentiators. It’s the power to analyze terabytes of data on genomes and proteomes more quickly and to launch new drugs and medical devices faster that turns companies into first movers. It is the capacity to analyze each plant and each batch in real time, to detect the risk of non-compliance, and to define the right preventive actions early on that delivers the highest quality and cost advantages. It is the competency to respond to requests from thousands of patients quickly yet with foresight that eventually builds trust and customer loyalty.


Many life sciences companies ask how SAP can support both: achieving precision in detail and using exactly these little details as a sound basis for a clear view of the complete life sciences universe – without losing time just crunching data. SAPPHIRE NOW offers the opportunity to get the answers through discussions, live demos, and best practices.


Are there other aspects you would like to discuss with us to win the race for the highest precision? We very much look forward to welcoming you at our Life Sciences Expert Table IN 113!

Yes, quality has its price, and yes, the global demand for drugs and medical devices is high. For these reasons, life sciences is an attractive area for some market participants to play a different and dangerous game: offering falsified products.

The drivers of this behavior are sometimes lower prices, sometimes higher margins. Either way, for patients and the companies concerned, this is more than just unpleasant. The recent scandal of PIP breast implants in Europe and an open letter by John J. Castellani, President and CEO of the Pharmaceutical Research and Manufacturers of America, illustrate how severely counterfeit drugs and medical devices can impact health – and unfortunately, there are many more cases out there.


In order to combat the counterfeiting of drugs and medical devices and to increase security, new serialization regulations in life sciences are being established around the world. The unique device identifier (UDI) and the Drug Quality and Security Act are prominent examples, but they are not the only ones. Complying with all existing and upcoming local regulations in the rapidly changing world of serialization is complex.


Preparing for the new set of regulations in a timely manner for each region is challenging. Should life sciences companies start now and try to interpret the rough guidelines that are already available? Or should they wait until more details are clear and then implement very fast at a later point in time? This won’t be easy, as they need to integrate with suppliers, contract manufacturing organizations, warehouses, plant shop floors, wholesale distributors, and other third parties. How will life sciences companies validate and monitor all serialization activities? And how can they efficiently manage the massive data volumes associated with all this?
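To make the serialization idea concrete, here is a minimal sketch of issuing and verifying unique pack serials. The identifier format is a simplified, hypothetical stand-in (real deployments follow standards such as GS1 SGTIN), and the in-memory repository stands in for a real serialization database shared along the supply chain.

```python
# Illustrative serialization sketch: assign each saleable unit a unique
# serial under a product code, and let downstream partners verify that a
# serial was actually issued. Format and product code are hypothetical.
import secrets

class SerialRepository:
    def __init__(self, product_code):
        self.product_code = product_code
        self.issued = set()

    def issue(self):
        """Generate a unique serial for one pack."""
        while True:
            serial = f"{self.product_code}.{secrets.token_hex(6).upper()}"
            if serial not in self.issued:  # guarantee uniqueness
                self.issued.add(serial)
                return serial

    def verify(self, serial):
        """A wholesaler or pharmacy checks a serial against the repository."""
        return serial in self.issued

repo = SerialRepository("00312345678906")  # hypothetical product code
s = repo.issue()
print(repo.verify(s))                                 # prints True
print(repo.verify("00312345678906.FAKE00000000"))     # prints False
```

The hard part in practice is exactly what the questions above point at: keeping such a repository consistent across plants, CMOs, and distributors, and validating every step of it.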


We would like to discuss with you different approaches on how to secure patient safety and how to put serialization into practice. Meet us at SAPPHIRE NOW at the open microforum discussion “Maximize the Quality of Drugs and Medical Devices Efficiently in Real Time”, June 4, 2014 at 3 p.m. at IN137 and at our Life Sciences Expert Table IN 113! We look forward to seeing you.

