
SAP Co-Innovation Lab (COIL)


I haven't checked whether I've made prior mention of a new project we are enabling here in the lab focused on SAP HANA HA/DR. This is not a new topic at all, and there is a lot of good information out there on the subject, but in this case we are building out a physical instance at COIL to demonstrate different SAP HANA HA/DR scenarios. From my view it is a new project in the sense that we are finally getting it started this month, even though in reality we have been working the plan and planning the work for quite some time.

 

The pre-sales engineering team approached us about how they might pursue such a project, as there is an increasing desire to address the topic more thoroughly with customers. Proving out HA/DR scenarios can be challenging, because it may require securing hardware and other resources that a customer does not have, even though the customer wants to see proof points on premise. For an HA/DR scenario representative of a production environment, it can be unfeasible to spin up something as complex as two 3-node clusters in order to simulate site-to-site failovers.

 

What the team asked is whether we could work with some SAP COIL project members and sponsors to build out an HA/DR environment here in the lab, and then run some Data Center Readiness workshops with customers to cover the key HA/DR criteria they have for deployment.

The goal for the workshop is to provide a repeatable and programmatic way to address customers' needs for a deep understanding of HANA operational capabilities without having to conduct custom POCs.

 

The proposed framework for the content and deliverables for this Data Center Readiness workshop is similar to existing SAP HANA Technical Academy content. It will comprise presentation content related to architecture, positioning, and roadmap, plus presenter-led deep-dive demos. So far no hands-on exercises for the attendees are planned. The preliminary outline for the demos is:


    • High Availability – 3-node cluster (2 active, 1 standby)
      • Show work occurring on HANA.
      • Kill one active node to simulate hardware failure.
      • Show the standby node coming online and serving clients.
    • Disaster Recovery – Create active/passive pairs (2 x 3-node clusters; the active cluster is the same cluster used for the HA scenario)
      • Create the replication relationship between two HANA clusters.
      • Show work occurring in the active cluster.
      • Kill the active cluster to simulate a disaster.
      • Show the passive cluster coming online, serving clients, with consistent data.
    • Optional Content
      • Monitoring and Management
        • Studio – this will be used throughout the demo
        • Solman/DBA Cockpit/3rd party (optional components)
      • Backup and Recovery
      • Security (TBD)
      • Sizing
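The HA portion of the outline above can be sketched as a toy simulation. This is purely illustrative (the class and node names are made up, and real HANA host auto-failover works through the name server, not application code), but it captures the behavior the demo walks through: two active nodes, one hot standby, and promotion when an active node dies.

```python
# Toy model of the HA demo flow: a 3-node cluster (2 active, 1 standby);
# when an active node suffers a simulated hardware failure, the standby
# is promoted so clients keep being served.

class Cluster:
    def __init__(self, active, standby):
        self.active = list(active)      # nodes currently serving clients
        self.standby = list(standby)    # hot spares waiting for promotion

    def kill(self, node):
        """Simulate a hardware failure on an active node."""
        self.active.remove(node)
        if self.standby:                # promote a standby if one is available
            self.active.append(self.standby.pop(0))

cluster = Cluster(active=["node1", "node2"], standby=["node3"])
cluster.kill("node1")                   # simulate HW failure on node1
print(cluster.active)                   # -> ['node2', 'node3']
```

The DR scenario is the same idea one level up: an entire active cluster is "killed" and a passive replica cluster takes over serving clients.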

 

For this project we've worked with Cisco and EMC to secure the necessary hardware, and both look forward to working with the SAP team on producing some really great workshops. As the guy at COIL who works to develop and enable all of these cool projects, I am always in a good mood when I see the hardware standing in the lab.

 

Here is a shot of both racks, which are in the process of being cabled up and configured. We will have two three-node clusters, comprising 4 Cisco B440 M2 blades, some Cisco Nexus switches, and EMC VNX 5300 storage (25 x 600 GB SATA drives). The plan also includes some network simulation via Shunra.

 

3nodeClusters_HADR.jpg

There is still a lot of cabling and configuration to get after next, but we hope to have it spun up in early April and look forward to seeing the first workshop scheduled. Many thanks to COIL founding sponsor Cisco and to EMC, which joined COIL this year as one of its newest COIL Project Members.

 

As the project develops further, we will keep the blog updated and the JAM and Twitter updates coming.

In 2011, Cienci Soluções Inteligentes participated in SAP COIL Day.

In this initiative, a prototype was developed to mobilize the goods receipt verification process, complementing the SAP GRC NF-e 10.0 solution (Brazilian localization: inbound Notas Fiscais). The mobile client was built on the Android platform. Read the full article by clicking here.

 

Click here to watch a video with a brief demo of the solution.

 

The main SAP COIL objectives listed below were achieved and enabled strong growth.

  • Create co-innovated solutions
  • Accelerate technology adoption & enablement
  • Provide a global, shared project infrastructure
  • Showcase joint achievements

coil.png

 

Below is a chart showing the key moments of the partnership.

 

coil_cienci.png

 

 

Special thanks to the team that has helped us from the start:

Henrique Pinto, Ronildo Santos, Joachim Von Goetz and Pedro Vidigal

Only a couple of days remain of 2013. I cast my last few glances into life's rear-view mirror only to see the Christmas holiday fall away into the past, but as much as I enjoyed the time off hanging out with family and friends, I endlessly enjoy the view of the future from my perch inside the SAP Co-Innovation Lab, and I am happy to carve out some time at the office today to start prepping for a fast start to 2014.


I did give some thought to my last blog of the year being some sort of predictable recap of all of our amazing projects this year, and yet with the future of the lab steadily embracing all things cloud, I feel more inclined to look back on a couple of COIL cloud projects done in and around the Cloudframe initiative that I think set the stage for the project focus in 2014. In my earlier blog post on the topic, I described COIL's own affinity for cloud and how enabling the Cloudframe project work from COIL has made perfect sense from COIL's own perspective of cloud value. Additionally, we've seen first hand, through this project work and the commitment of the project teams and stakeholders, that SAP can and will continue to exceed customer expectations with an exceptional cloud strategy executed to their benefit.

 

It came as no surprise to COIL when SAP announced (May 7) its HANA Enterprise Cloud (HEC), which the company is confident can address the aspirations and goals of its customers to run not only simple applications but mission-critical systems from the cloud in an open, real-time fashion, securely and without compromise. As the world comes to understand the features and functionality of HEC today, we can tell you that much of what you find there was first explored in COIL.


By gaining a glimpse into the Cloudframe project work, we can consider the value propositions of SAP HEC and why they resonate even more given the time and energy committed to the ongoing Cloudframe project work in COIL, and how the results of this work contribute to the productive, innovative services and overall delivery now possible from the HANA Enterprise Cloud. The richness of HEC comprises countless hours of highly iterative engineering engagements and tacit knowledge exchange among the many SAP colleagues and partners continuously active in co-innovation work. The successful outcomes from this important co-innovation project work demonstrate how SAP and its partners have become so adept at exploring new and optimal ways to deliver SAP platforms and applications from the cloud.


Customers respond to HEC's compelling vision for myriad reasons, but one business driver is certainly the one which most influenced COIL's own need for cloud architecture: the simple desire to address power and cooling challenges while simultaneously seeking to optimize how cloud resources and capability are implemented and managed with a limited pool of skilled technical talent. These are real dependencies today for anyone tasked with meeting the computational needs of a business by discovering how best to reduce the known dependencies and the time to provision.


The COIL Cloudframe Project(s)-

The Cloudframe project work at COIL has become a key enabler of HEC and its automated provisioning of large-scale SAP HANA systems, reducing provisioning time from days to minutes, all done in the cloud using an open, multi-vendor approach. The scope of the Cloudframe project work includes exploring and optimizing the use of high-speed interconnects and fabrics, using lower-cost Intel x86 Xeon computing technology, and leveraging service provider-centric storage. Additionally, there is an exploration of new persistence models for in-memory computing and the development of cloud computing manageability frameworks for elastic, physical provisioning and management of high-performance compute, network, and storage for large-scale in-memory computing.


There are multiple COIL projects which consider SAP's HANA DB as a powerful solution for the analysis and management of Big Data. Necessarily, such a solution requires a highly tuned cluster of compute, storage, and networking nodes. With today's converged-infrastructure focus in the datacenter, there are several frameworks and utilities available to build an SAP HANA compliant cluster.


In an effort to unify the integration of SAP HANA into several vendor datacenter solutions, SAP created a thin provisioning and orchestration framework called SAP Cloudframe. Using Cloudframe, developers have a unified API for creating, expanding, reducing, suspending, and resuming HANA DBs.

Such unification allows applications to point at different SAP HANA service providers and use the same API to provision the SAP HANA service. In this way, applications can use public-cloud HANA service offerings for test and development phases. A customer could then switch to using HANA Enterprise Cloud for qualification and production, or point to an on-premise SAP HANA Cloudframe instance and run production there.

Cloudframe takes advantage of existing software-defined networking and dedicated high-speed communication fabrics, and is visionary in its approach to multi-level certificate-based security. Cloudframe has enabled a software company like SAP to create a vast resource pool of high-performance machines with cloud-like properties.
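To make the idea of a unified provisioning API concrete, here is a minimal sketch of what such an interface could look like from an application's point of view. The class, method names, and provider labels are illustrative assumptions, not SAP's actual Cloudframe interface; the point is that the same calls work regardless of which provider backs the service.

```python
# Hypothetical sketch of a unified HANA provisioning API in the spirit of
# Cloudframe: create/expand/reduce/suspend/resume against any provider.

class HanaService:
    def __init__(self, provider):
        self.provider = provider        # e.g. "public-cloud", "HEC", "on-premise"
        self.nodes = 0
        self.state = "absent"

    def create(self, nodes):
        """Provision a new HANA DB with the requested node count."""
        self.nodes, self.state = nodes, "running"

    def expand(self, extra):            # scale out
        self.nodes += extra

    def reduce(self, fewer):            # scale in, never below one node
        self.nodes = max(1, self.nodes - fewer)

    def suspend(self):
        self.state = "suspended"

    def resume(self):
        self.state = "running"

# The same application code targets different providers unchanged:
dev = HanaService("public-cloud")       # test/dev in a public cloud offering
dev.create(nodes=2)

prod = HanaService("HEC")               # qualification/production in HEC
prod.create(nodes=6)
prod.expand(2)
print(prod.nodes, prod.state)           # -> 8 running
```

Swapping `"HEC"` for `"on-premise"` would be the whole migration story at this API level, which is exactly the portability the paragraph above describes.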


While we sometimes speak of the Cloudframe project at COIL as if it were one project, it's more an initiative comprised of many projects. Here is a list of some of the key Cloudframe projects that have taken place in COIL since early 2011:

HA Storage integration with private HANA SAN
40GbE for private HANA Cluster Network
SDN development for Cloud Frame Provisioning
High Density/Power Saving Cells for Cloud Frame delivery in HEC

For the sake of keeping to a manageable blog posting (and yes, I am aware that brevity and I often part company when I blog), I'll just share some detail from one of these project focus areas: SDN development for Cloudframe provisioning, for no other reason perhaps than the fact that it continues to receive a lot of industry attention, as well as how it relates to the overall expansion of virtualization in the network.

However, one contrast of interest is the fact that the Cloudframe infrastructure is not predicated upon virtualization. Instead, the Cloudframe initiative places an emphasis on a vertical architecture, alleviating the need for virtualization by providing its own management tools that offer similar flexibility while preserving all the performance of bare metal.


We already know SAP HANA as a data platform is game changing for the industry. Its core architecture comprises a columnar DB and a data processing engine, and it presents an entirely new paradigm for data management, app development, and analytics. It furnishes linear scalability, so its special capabilities can be readily delivered from public, private, or hybrid clouds.

 

Both historically and today, databases often get vertically scaled by adding resources to the existing logical unit, most specifically memory. By adding more memory to the SQL server, and more resources to address the identified problem in general, an IT implementer always wants to obtain more effective use of the available infrastructure resources and technology.

 

Some solutions attempt to offer a bit of scale-up and scale-out ability, but this is not necessary in an SAP HANA enterprise deployment or as implemented from the cloud; there is no reason to deploy and manage separate cache servers.

 

With Cloudframe, SAP supports both vertical (scale-up) and horizontal (scale-out) scaling seamlessly. Cloudframe has resulted in an ability to alleviate any interference between co-located deployments by entirely separating the two commonly encountered bottlenecks (networking and storage). The Cloudframe project's focus is on managing the infrastructure via a unified API for creating, expanding, reducing, suspending, and resuming HANA DBs. This lends itself to a much better way of vertically scaling SAP HANA in the cloud without the need to resort to hybrid scaling, which, even when the results are acceptable, adds complexity.

 

The way we currently manage SAP HANA within the cell enables us to scale only horizontally (scale out); this is by virtue of the fact that we only have 1TB nodes, and therefore scale-up/scale-down are not options. This is currently only a limitation of the equipment and not the architecture itself. The progression is to have multiple node sizes within the cell and HEC. The degree of scale-up/scale-down is dependent only upon physical barriers (min/max server size).


Accelerator technologies are being investigated for both storage (flash arrays) and compute (FPGAs), but these have not yet made it into the standard design. We will take a closer look at this dimension of the project work further into the New Year. It seems easy to at least predict that in 2014, many in our industry will be looking to climb over the known I/O barriers, and it will be interesting to see if the breakthroughs remain anchored to hardware or if new concepts take root in SDN.


With respect to the SDN project work the Cloudframe team pursued at COIL, it's interesting to learn a bit about how the team collaborated with engineers from Arista Networks. Working with Arista led to an improved ability to derive intelligence from the fabric itself. This starts with manageability, but there has been ongoing interest in taking this even further.


One very positive outcome stemming from the two companies' collaboration at COIL came during a team effort to develop the SAP HANA provisioning framework, where the team first set out to automate much of the network functionality: ACLs, VLANs, etc. The team had planned on doing this using the previous standard method of scripting ssh command lines and analyzing return values. Through a number of iterative and useful discussions with Arista, it became evident that there was an opportunity to replace this functionality with Arista's eAPI. Following a thorough demonstration of the eAPI, which illustrated its ease of use and power, the Cloudframe team was able to program the switch directly from the Python code used to develop the framework.
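What makes this replacement of ssh scripting so clean is that Arista's eAPI is JSON-RPC over HTTP(S): the switch exposes a `runCmds` method that accepts a list of CLI commands and returns structured results. Below is a minimal sketch of building such a request with only the standard library; the switch address, credentials, and VLAN commands are placeholder assumptions, not details from the actual Cloudframe code.

```python
# Sketch: building the JSON-RPC request body that Arista eAPI expects,
# instead of scripting ssh sessions and parsing text output.

import json

def eapi_payload(cmds, req_id=1):
    """Build a JSON-RPC 'runCmds' request for a list of CLI commands."""
    return json.dumps({
        "jsonrpc": "2.0",
        "method": "runCmds",
        "params": {"version": 1, "cmds": cmds, "format": "json"},
        "id": req_id,
    })

# e.g. creating a VLAN as part of automated cluster provisioning
# (placeholder commands for illustration):
body = eapi_payload(["enable", "configure", "vlan 100", "name hana-cluster"])

# In practice this body would be POSTed with HTTP basic auth to
# https://<switch>/command-api, e.g. via urllib.request or the
# jsonrpclib library Arista documents; the response comes back as
# structured JSON rather than screen-scraped text.
```

Returning structured JSON instead of raw terminal output is precisely what lets the provisioning framework check return values programmatically.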

 

As we dive into the new year, I am very much looking forward to seeing HEC technology continue to progress and evolve from the efforts made by the Cloudframe team. One related project we are eager to watch gain even more momentum focuses directly on security in the cloud for SAP HANA. Natively, SAP HANA is secure, but with respect to cloud there is a real imperative to explore security from the perspective of the entire stack. A COIL project including participants from SAP, Intel, Virtustream and Vormetric directly examines how Virtustream can leverage the Vormetric Data Firewall in its cloud infrastructure as a means of delivering an enhanced security architecture, offering customers the capability to implement, and then fully control, even stronger data encryption with more granular access controls. We will share more details as this project moves deeper into its 2nd and 3rd phases, but I am already appreciating the defense-in-depth aspects being learned and applied.

 

There will be some new things for the COIL team to develop and implement across its own cloud infrastructure in its preparation to not only support these two important projects but to further enable an array of new cloud-focused projects already being proposed and developed. It’s an exciting time for our lab as we anticipate the pursuit of new projects between SAP and its partners spanning the entire stack from applications to provisioning, deployment and infrastructure. We look forward to seeing a spectrum of such projects, all hopefully producing results that will continue to influence cloud services and advance worldwide adoption of SAP HANA.

Beginning in 2014, I’m no longer going to try to personally rank my favorite projects here in the SAP Co-Innovation Lab. It has simply become impossible, with so many projects easily meeting, if not flat-out exceeding, whatever ranking criteria I come up with to call one my favorite.


You can at least know a COIL project has grabbed my undivided attention when I drop what I’m doing and decide to write a blog post about it. This has become the case with the latest work being pursued here in the lab with SAP, Citrix and NVIDIA. I will, however, admit to injecting a wee bit of bias, in that I tend to like projects where the focus or end result relates to data visualization, and this one definitely qualifies.


Back in early summer, the SAP 3D Visual Enterprise team began working in a limited way with Citrix Receiver to make SAP 3D VE accessible and usable on a mobile client. Following some initial discussions with the 3D VE team at SAPPHIRE, Citrix decided to run a PoC as a project in COIL. NVIDIA, one of the newest project members to join COIL this year, has contributed a white box featuring its NVIDIA GRID™ boards. While the project is only now starting to examine the scalability and performance possible, the basic functionality is in place and multi-touch enablement is now in play. This was successfully shown at SAP TechEd earlier in Las Vegas and Amsterdam.


The team is now crafting a number of scalability and performance tests.


This COIL project seeks to orchestrate all of the required elements, spanning 3 different use cases, to showcase how an end user can remotely access and productively use this important SAP application, with performance, functionality, and overall user experience served by leveraging complementary technologies and capabilities provided by select Citrix and NVIDIA components.


Business Challenges


Underscoring nearly all COIL projects is an interest in driving co-innovation results that can fully address one or more known business challenges. In this particular case the question is posed:


How does a remote, mobile client user of SAP 3D Visual Enterprise Software securely access and productively use the software over sometimes unpredictable wide area networks without incurring the costs associated with expensive dedicated Telco-delivered bandwidth and or complex and expensive fat client technologies?


WAN links can vary widely in available, prioritized bandwidth and latency, and as such can contribute to a wide range of customer experiences, not all satisfactory. Nonetheless, businesses press forward every day in pursuit of sustaining and enhancing the new desktop, which is mobile. Today’s end user submitting BI queries that trigger computationally intensive correlation analysis, or trying to view an immensely complex 3D CAD drawing, wants a fast response to the last finger tap, swipe, or mouse click, whether they sit on the local LAN where the target server lives or are thousands of miles away working from a remote site in the middle of a desert or jungle.


COIL Project Objectives


  1. Establish 3 use cases for remote use of SAP 3D Visual Enterprise showcasing effective, productive use of the software by a remote client
  2. Document/Diagram a suitable reference architecture required to deploy such an end to end solution for an SAP 3D Visual Enterprise user
  3. Show, by way of comparison, how this solution improves on legacy, inferior, or cost-prohibitive methods used to achieve the same result
  4. Create a compelling demo to illustrate use of SAP 3D Visual Enterprise features and functions via a Citrix Receiver-enabled client device


So the scope of this project is to leverage existing SAP development-provided capabilities of SAP 3D Visual Enterprise to work effectively with a Citrix XenDesktop/Receiver-powered client.


Many customers need to run SAP 3D VE in a secure environment behind the corporate firewall to avoid problems with IP protection, and yet many firms also want to give employees using mobile devices access to the graphical display/dashboard offered by SAP 3D VE. This is what Citrix brings to the table with its VDI technology. A successful COIL project result will strengthen the business case for Citrix, and give SAP customers added confidence that they can buy SAP 3D VE and implement it against security and performance requirements using Citrix.

 

Once the team has its scalability and performance tests fully formed up (hopefully before the holiday break), we will implement the test environment in COIL and start generating the user loads to begin capturing data and doing the needed performance analysis. Assuming we stay on schedule, we can look for the first whitepaper and demos to become available in early Q1. If you cannot possibly wait until then and this blog has whet your appetite for more right now, then you might want to check out and register for the Citrix Special Event happening this week, Virtualizing 3D Professional Graphics Apps.

 

There is also a nice blog on the subject that Roland Wartenberg wrote while he was at TechEd in Las Vegas that you may want to see too.

 

As I mentioned at the start, this project gets directly at the importance and value of data visualization, not only from what is possible with SAP Visual Enterprise 3D, but also working to ensure that the end user can get access to such visualization from nearly anywhere on the planet with a connection to the Internet.

Heading to Las Vegas today, and I am very excited about the broad spectrum of projects we are going to show you in the SAP Co-Innovation Lab area on the SAP TechEd show floor.

 

My colleague David Cruickshank wrote a thorough blog a few days ago about what we will do this year: http://scn.sap.com/community/coil/blog/2013/10/14/sap-co-innovation-lab-at-sap-teched-2013. David already did a great job providing the overview and the right context; here I just add a few heads-ups about the projects we are showing at our pod on the show floor.

 

So far 2013 has been a very busy and very productive year for COIL. Just in Palo Alto, we have been working on more than 20 projects covering a large spectrum of innovation domains, and many more globally across all COIL locations. Below are a few snapshots of what we will highlight in Las Vegas.

 

HANA and Big Data

bigdata and HANA.JPG

 

HANA for production - security, HA/DR, management,  virtualization, …

production.JPG

 

 

Paving the way for next generation HANA technologies

 

next generation.JPG

 

Mobility & HANA

 

Mobility.JPG

 

Extend the reach

 

extend.JPG

 

Come visit our pod and see a live show our partners put together with us.

It comes as no surprise to me that I am writing my first SCN blog for COIL at TechEd on Sunday, just one week before actually attending SAP TechEd.

 

As I look at the new week now, I can’t seem to find a way to fit blogging in among the larger priorities consuming my time in the lab. So here I am. I may try to post some updates this week but look to this post being the best reference for COIL at TechEd Las Vegas.

 

COIL at SAP TechEd means working with a much larger SAP TechEd team out of the Ecosystems and Channels group, so there is a lot of communication and detail to stay close to in the weeks running up to Oct. 21. It’s interesting being plugged in with the marketing team, because they start talking pretty early on about the social dimensions of this yearly event. They know it comprises a large SCN community, so it’s a very good thing, early on, to encourage everyone to use social media to support all the activities and meta content for whatever an individual or team may bring to a show floor pod, expert session, or TechEd track talk. So the message is loud and clear to get those tweets and likes fired up.

 

For me the reality is that I now have to catch up with blogging and tweeting about COIL rather than having started a few weeks earlier, when the good idea to do so was first suggested. COIL Palo Alto continues being heads down this week getting content ready to exhibit at our pod for the event, where we will showcase more than sixteen co-innovation projects currently being pursued by SAP and various partners. After taking time to learn more about the lab and what we do, visiting attendees can sign up for several feature demos scheduled to be run by different project teams and subject matter experts all week long.

 

Spending time talking to COIL at the pod will also let any attendee playing the Ecosystems and Channels Mobile App game answer three questions, allowing the attendee to scan a QR code into the app. Additionally, our sister team, SAP HANA Partner Engineering, is hosting a nearby pod, so we will be sharing content common to both teams this year.

 

COIL is hosting four Expert Sessions in the Clubhouse Network Lounge, and the content is simply too good to contain in 30 minutes, so we expect attendees who check them out will likely hang around for questions or come back to the pod to learn more. This year we are thrilled to co-host CDNetworks, McAfee, RedHat and VMware. Kevin Liu and I will each host two of the four sessions.

 

I think anyone whose work involves SAP cloud deliverables would be interested to learn more (Session EXP10477) about CDNetworks' project to accelerate an SAP load from one endpoint to another. It’s a rational choice to look at such solutions in situations where you have no control over the endpoint with respect to WAN optimization, performance, or connection quality. If you want to look into this ahead of joining the session, be sure to check out the project whitepaper available from Kevin's blog and the COIL SCN page.

 

I’m also looking forward to co-hosting the session (EXP10478) with McAfee, as we’ve actually worked on a few projects together, a couple of which are being done with the global COIL computing center infrastructure engineering team to implement McAfee MOVE AV (Management for Optimized Virtual Environments, Anti-Virus) in the COIL Cloud. I’m not totally up to speed yet on Kevin’s sessions, but from RedHat’s project, attendees will learn more about SAP ASE and deploying with different cartridges as part of RedHat OpenShift.

 

We have some really super content to share across so many different projects and teams in our project showcase: COIL Palo Alto is home to SAP, Intel, and Cisco exploring HANA and Hadoop; SAP NS2 working to take SAP HANA's geospatial capability to the next level in the context of using big data for real-time situation awareness; and SAP, Intel, Vormetric and Virtustream exploring 3rd-party encryption solutions based upon select cloud-sourced use cases. We plan to offer a closer look at virtual SAP HANA, where the SAP and VMware team has explored a number of different configurations and general tips for deploying SAP HANA to dev and test environments.

 

Our featured demos are still being selected, prioritized and scheduled, but we just reviewed the SAP NS2 demo content with the team last Friday, including Tom Turchioe and his crew from Critigen Labs. They always do cool demos (ask about the Medicare Mapper and Food Truck!) but this one, especially its richness of content, is my favorite so far. Geospatial really enriches information and its analysis, and I’m glad that SAP HANA has such a super capability and that we also have partners who excel at visualizing GIS, as well as at making sure all real-time situation awareness queries and results are encrypted and secure.

 

We are also planning to show a few very interesting demos courtesy of our COIL Tokyo colleague Shuuji Watanabe. If using SAP ERP with SAP HANA is of high interest to you, this is a series of demos you will want to see:

 

 

 

ERP on HANA – confirming stock and delivery dates, ordering at the customer site, real-time sales reporting. BusinessObjects Explorer is used for real-time reporting, and UI5 – a new function of HANA – is used to realize a browser-based graphical interface.

  

The SAP Japan Real Time Data Platform team, through this COIL Tokyo project and demo, seeks to show the advantage of Suite on HANA for customers and partners.
This ERP on HANA real-time demo project was led by the RTDP team, which worked with SAP Japan CSA, Mobility IU-Delivery, and D&T / Analytics Solutions (BusinessObjects) to develop the demo.

  

Real-time reporting utilizing ERP on HANA in a mixed cloud and on-premise environment is the second demo to be shown. With this demo, the SAP Japan Real Time Data Platform team wanted to show yet another advantage of Suite on HANA utilizing a HANA-specific function.

Classmethod – a cloud integrator in Japan – has developed an application on AWS. This application accesses the SoH HANA DB at COIL Tokyo via the internet, executing a SQL statement every 0.5 seconds and then showing a summarized report via a browser. The SD benchmark tool runs in the back end during the demo.
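The polling pattern that demo describes is simple to sketch: issue the query on a fixed interval and refresh the summary each cycle. The table name, column, and query callable below are placeholder assumptions standing in for the real HANA connection, not details from Classmethod's application.

```python
# Sketch of a 0.5-second polling loop that refreshes a summarized report.
# `run_query` stands in for a real database call over a HANA connection.

import time

def summarize(rows):
    """Collapse raw order rows into the totals a report would display."""
    return {"orders": len(rows), "amount": sum(r["amount"] for r in rows)}

def poll(run_query, interval=0.5, iterations=3):
    """Run the query repeatedly, yielding a fresh summary each cycle."""
    for _ in range(iterations):
        yield summarize(run_query("SELECT * FROM sales_orders"))  # placeholder SQL
        time.sleep(interval)

# With a stubbed query that returns two orders:
stub = lambda sql: [{"amount": 100}, {"amount": 250}]
report = next(poll(stub, interval=0))
print(report)                           # -> {'orders': 2, 'amount': 350}
```

In the actual demo the results are rendered in a browser, but the refresh loop itself is no more complicated than this.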

 

Confirming a received goods invoice in one screen utilizing SHAL – a new function of SAP HANA – will be the focus of the 3rd demo. A goods receipt for an order is normally not produced once but produced and distributed several times. Confirming the status of a goods receipt and issuing an invoice can therefore be time consuming for purchasing or accounting staff, because until now such end users have had to check several business screens in order to be properly informed. The SAP Japan consulting team wants to improve this situation utilizing SHAL in SAP HANA.

 

 

So there is plenty of great content and demos in motion with lots of partners, and as of this post, we are still working to include insights into other projects emerging with Vendavo, Violin Memory, Core Mobile, AutoGrid, Citrix and others.

 

While I think of it, let me also mention that if you are planning to attend, or know someone attending, the SAP TechEd pre-event Innovation Summit sponsored by SAP PartnerEdge on October 21, COIL is hosting a breakfast event at 7:30 a.m. Check your itinerary and consider joining me and SAP NS2 CTO David Korn in a short interview exploring a current RDS for Real-time Situation Awareness COIL project. The interview will touch upon the value of pursuing the project through COIL to take full advantage of geospatial in SAP HANA and to build a solution architecture featuring technologies from multiple partners.

 

If you miss this, you can pick up on the same topic again later in the week during an afternoon SAP TechEd Live session (3pm Wed, 10/23) where I will again talk to David Korn, Tom Turchioe from Critigen Labs, and (still to be confirmed) one of our colleagues from Cisco.

 

We’ve got a lot of COIL sponsors and project members exhibiting in the Clubhouse this year – Intel, Cisco, VMware, Citrix, Red Hat, CA, Virtustream, and many more – who will be showing content and demos complementary to what you will find at the COIL Pod, so be sure to visit their exhibits too. Just look for the oval SAP Co-Innovation Lab signage displayed by our sponsors and member participants.

 

 

 

 

 

 

Time and time again, when you read articles in any business magazine or pick up any business book focused on leadership, management, or starting a company, there is always a theme around talent. Some of the biggest lessons learned by leaders, managers, and entrepreneurs are the mistakes they made in managing their talent. At the end of the day, it is the people we work with or employ who can make or break our companies.

There is a wide spectrum of approaches organizations take to building high performing teams. However, many companies fall short because of the managers in the middle who build and lead these teams. Most mistakes in building high performing teams can be boiled down to simple human nature: we all want to see the good in people; for the most part we choose to avoid conflict; and finally, the status quo is always an easier path.

High performing teams require several key components. They start with the quality of empathy: the ability of each member of the team to understand the perspective of other team members. The willingness to explore an opinion is important; you don’t want team members who are dismissive. The second quality is listening. In order to be empathetic, you first have to listen to others’ points of view. In this day and age of sound bites and 140 characters, listening is becoming a lost art. It requires focus, which means putting down your devices and tuning into the conversation. The third quality is a bias for action. Each member of the team must be willing to act, which requires teams to compromise and to test and learn from their actions. The fourth quality is a willingness to fail, and more importantly a willingness to learn from that failure. No team will take perfect actions, but action is always preferred over inaction. One way to drive towards a bias for action is a willingness to fail; learnings from those failures will often bring teams closer and enable better decision making in the future. Now let’s dig into each of these qualities a bit more to understand how they work.

To read the complete blog click here.

Time for a quick blog post to draw attention to a very cool project SAP and Autogrid are pursuing here in the COIL Lab this year.

 

Rather than write about it, I would encourage you all to check out my "EarlyView" podcast to hear firsthand about the project from a couple of the project leads.

 

In essence, this project is all about SAP for Utilities working more seamlessly with AutoGrid’s ability to provide a scalable demand response operations management solution to the utilities industry.

 

Today it is not easy to store energy safely and economically, yet the reality is that energy is consumed as it is generated. When demand spikes, energy and utility companies must be able to react quickly by generating more energy on demand. Investing in buffer infrastructure not only requires significant capital expenditure to deploy but also carries a high cost to manage; when demand lapses, it is simply a sunk cost that does not continually serve revenue generation.

From the podcast, learn how a utility company can offset the costs associated with adding expensive supply capacity.

 

This project is all about IT innovation software that helps utility companies coordinate the energy consumption behavior of consumers while reducing energy consumption in a scalable and sustainable manner by enabling effective demand response programs. In the days to come leading up to SAP TechEd we plan to produce a whitepaper and demo which will be made available from coil.sap.com.

Since I am not attending SapphireNow in Orlando this year, I could spend my blog time this week describing some of the things the SAP Co-Innovation Lab is working on with partners that will be featured at the event this year, but since this week is already overwhelmed with such news, I'm going to instead plow into providing some more content on one of several new projects here in COIL Palo Alto that is keeping us super busy this year.

 

I now have multiple favorite projects among those which have spun up this year in COIL Palo Alto, but let me start by talking about one in particular we are doing with SAP NS2. This was a project first described to us by its requestor, SAP NS2 CTO David Korn as “Big Data Fusion” during last year’s SAP TechEd.  The project was designed from the start to drive deep insight from structured and unstructured Big Data in an on-premise or cloud-based environment that would be of high interest not only to the more popular three-letter agencies but largely to anyone with similar needs in any highly regulated industry.

 

I’m going to touch on some of the different dimensions of the project in this blog post but you can learn more from some of the key project participants first hand by listening to the COIL Early View Podcast.

 

One of the project’s main goals is to prototype some new capability to extend an existing SAP RDS solution calling for an integration of multi-source (social media, sensor, location, image, unstructured, structured, etc.) data to perform geospatial event analysis based on the aggregation of: person of interest, location of interest, activity and semantic analysis. 

 

COIL has enabled other Big Data projects over the past 12-18 months, but what I like about this project is that it features rich partner collaboration: Cisco, Critigen and Encryptics are all working with SAP NS2 to develop a solution meant to address a variety of business challenges like:

  

  • A need to analyze large data repositories to answer the 5 W’s (who, what, when, where and why) in a trusted, secure manner
  • To provide an "off-the-shelf" capability that can be attached to any data source(s): an architecture and set of capabilities that lets
    the client focus on building their analysis instead of first building a complex infrastructure and then creating the analysis
  • Make this capability something that can be provided in a cloud or on-premise delivery model
  • Deliver this capability as an SAP RDS
  • Allow SAP to demonstrate a complete life-cycle of secure, Big Data information processing and analysis
  • Extend a cloud and big data solution set combining integrated hardware and software components to solve activity based intelligence requirements

 

               

This diagram provides a good overview of the architecture underlying the solution meant to address each of these challenges-

 

BigDatafusionArch2013.jpg

The diagram gives us a comprehensive view of the overall architecture. We have a few others that drill into more detail, which I don't have time to share here, but these will certainly be included in some of the future white papers to be published over the next couple of months.

 

This project effort seeks to provide a working capability that implements this reference architecture for secure distributed activity based, geospatial intelligence analysis.

 

What is Geospatial Intelligence (GEOINT) you might ask? GEOINT data sources include imagery, full motion video (FMV) and mapping data, collected by either commercial or government satellite, aircraft (UAVs, reconnaissance,  commercial aircraft or by service subscription), or by other means, such as maps, demographic databases and open source databases, census information, GPS waypoints, utility schematics, or any data about infrastructure or events on earth.

 

Human intelligence (HUMINT) gathering means collecting intelligence through interpersonal contact.  During WWII, HUMINT was factored in with Signals Intelligence (SIGINT), which, as an example, might be an increase in radio communications right before an enemy’s ships left a harbor, possibly indicating fleet movement or some form of supply chain activity. The person listening in to this traffic may not be able to decipher the encrypted transmissions, but the sudden surge in communication alone could suggest an activity of interest. Similarly, Communications Intelligence (COMINT) was woven into the intelligence data gathered, and intelligence communities looked for correlations among all of the different events.

 

Now fast-forward to the 21st century, an age of continuous creation of petabytes of e-mail, text messaging, audio/video surveillance and now social networking: HUMINT data can now be gathered, sifted, collated and analyzed to extract useful information to support a mission need.

 

Why is this so important? Within Homeland Security, the Department of Defense Intelligence Community (DoD IC) and across the DoD, there is a fundamental need for these organizations to securely integrate data for intelligence analysis which can then be securely distributed to the edge where the information is needed most (the “edge” in this instance being personnel like law enforcement and other field agents).

 

Today, most of the data is collected via a variety of methods and typically relayed back to a central processing and analysis center, then distributed for community use. In many cases, unfortunately, where the data collected could be processed locally, there is either no existing community infrastructure or one not capable of supporting such functions. Globally, the DoD and MoDs will spend billions of dollars extending their existing infrastructure capabilities to gradually achieve a truly mobile, distributed capability.

 

The main goal of the solution envisioned by the COIL project team is to answer a set of very basic questions that requires sifting through petabytes of information:

A person of interest (who) is going to perform an activity (what) at a given time (when) at a given location (where) for an unknown reason (why).

 

SAP NS2 has been involved in an R&D activity with one of its customers which led to prototyping a virtual analysis appliance (IQ, BOBJ, Data Services, Text Analysis) integrated with open source applications and driven by social media data from the internet, an approach that can be applied to any data source(s) for intelligence analysis.

 

Through this COIL project, SAP NS2 looks to create an SAP Rapid Deployment Solution (RDS) that addresses activity based intelligence platform needs through a coupling of SAP and partner technologies. At the heart of SAP’s architecture for OLTP and OLAP processing is HANA, an in-memory database appliance that can perform high-speed in-memory transaction processing (i.e. SAP Business Suite) and big-data analytics on the same data, without the need for an ETL process to load a separate data warehouse, and that can scale to petabyte data stores.

 

When such an appliance further incorporates ESRI to create the concept of geospatial data marts to enrich the analysis abilities of SAP Business Intelligence, this begins to provide a true integrated capability of text analysis, geospatial analysis and traditional BI/BA in a single user interface.

 

The next step in the evolution of this new reference architecture, and one this project expects to yield, is a complete secure mobile capability. It incorporates position location and a mobile application factory that can distribute applications to the edge, allowing field analysts to perform intelligence analysis on data that is immediately collected and to process previously captured data locally instead of waiting for some other organization to perform this task.

 

What the SAP NS2 project leads desire as output from this COIL project is to integrate three related solutions into a single offering that can be used across DoD, Federal Intelligence Community, State/Local Intelligence communities and Aerospace/Defense community.

 

The integrated solution comprises the following:

   

  1. Activity Based Analysis solution (previously prototyped by NS2), which consists of IQ, Data Services, Text Analysis and Business Objects integrated with ESRI (ArcGIS) to provide a geospatial information analysis platform that leverages social media as the primary data source to analyze events and determine a course of action to handle these events.

    This is based on the aggregation of: person(s) of interest (POI), or WHO is involved in the event; definition of the event, the WHAT; date of the pending event, the WHEN; the locations involved with the POI(s) and events, the WHERE; and sentiment analysis of data gleaned from social media or other data systems to determine WHY the event will occur.
  2. RealTime Situational Awareness (RTSA) Rapid Deployment Solution (RDS), which is a command/control appliance based on SAP HANA that incorporates the NIEM (National Information Exchange Model) object information model into a relational execution schema operating within the SAP HANA appliance. The NIEM standard is sponsored by DHS, FBI, DOS, state, local, tribal and international emergency response and intelligence communities.
  3. Both of the aforementioned solutions have a desktop and mobile component. SAP NS2 wants to integrate its secure mobile platform into the mobile aspect of the solution architecture. This will allow for secure communication between the server and mobile device and protect data at rest on the mobile device. The secure solution includes mobile device management (MDM) and a mobile enterprise application platform (MEAP).

 

Situational Awareness Use Case Scenario:

 

One of the things about this project which first grabbed my attention was its proposed use case which centered around a capability to assess the activity of food trucks in the Washington DC area based upon classification, frequency of activity, and sentiment of customers with respect to their location(s) over a given period of time.

 

The use case is interesting because it is plausible that a foreign threat to the homeland could somehow leverage a food truck as a clandestine way to trigger an act of terror against US citizens and public or private property. The other fascinating thing about this use case is how easy it is to demonstrate an effective use of public data in which to help glean new actionable insights.

 

From gathering this information, an analyst can look to uncover patterns of behavior of food trucks: where they usually are, and “atypical” locations. By using the predictive analytics functions within HANA, analysts can project which food trucks and food truck classifications will yield higher sentiments and more activity using time series analysis (i.e. double or triple exponential smoothing).

Using an Apriori association algorithm, the analyst can detect correlation of food truck proximity to other food trucks or pre-defined geo-fences, and use the anomaly detection algorithm to detect when a food truck that has stayed within a given geo-fence for a long period of time suddenly moves to a different location within the city. Through the use of these predictive functions, SAP HANA can create a specific analytics view that can drive a native NetWeaver HTML5 application, a BOBJ dashboard and alert, or a geographic overlay to be used by an ArcGIS application.
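To make the time series side concrete, here is a minimal sketch of double (Holt) exponential smoothing over a hypothetical series of hourly check-in counts. The actual scenario uses HANA's built-in predictive functions, not hand-rolled Python; this is only to illustrate what the smoothing produces:

```python
def double_exponential_smoothing(series, alpha=0.5, beta=0.3):
    """Holt's linear (double) exponential smoothing.

    Returns the smoothed level series and a one-step-ahead forecast -- the
    kind of trend projection an analyst could use to anticipate activity.
    """
    level, trend = series[0], series[1] - series[0]
    smoothed = [level]
    for x in series[1:]:
        prev_level = level
        # New level blends the observation with the previous level + trend.
        level = alpha * x + (1 - alpha) * (level + trend)
        # New trend blends the level change with the previous trend.
        trend = beta * (level - prev_level) + (1 - beta) * trend
        smoothed.append(level)
    return smoothed, level + trend

# Hypothetical hourly check-in counts for one food truck classification.
counts = [12, 15, 14, 18, 21]
smoothed, forecast = double_exponential_smoothing(counts)
```

Because the series trends upward, the one-step-ahead forecast extrapolates above the last observation, which is exactly the "higher activity" projection described above.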

 

Given the proposed software architecture and implementation, Data Services is used to capture data by web-crawling publicly available data from both Yelp and Twitter regarding the location of food trucks within Washington DC and the sentiments that food truck patrons post on these social media sites. The use case then demonstrates how to turn this into actionable information:

     a. Using the text analysis functionality within Data Services (in the next release, the HANA text analysis function will be used), sentiment analysis is performed on the data captured from Yelp and Twitter and loaded via ETL to HANA

     b. In the next iteration of the demonstration scenario, HANA will perform continuous time series analysis (exponential smoothing) within a user-defined geo-fence to predict the most popular food truck classifications and the associated food trucks, using the anomaly detection algorithm in tandem with the Apriori algorithm to detect when a specific food truck has left its “normal” geo-location to do business in a neighborhood that is not associated with its classification (cuisine; e.g. an Asian food truck suddenly moves to a neighborhood where the demographics suggest that Mediterranean cuisine is preferred)

     c. SAP HANA interfaces with ArcGIS to provide a geo-overlay

     d. SAP HANA drives a BOBJ dashboard for real-time situational awareness analytics

     e. SAP HANA interfaces with the ArcGIS mobile application on the iPad via web service to allow a field operative to capture and log real-time sentiment events to update the HANA database and BOBJ analytics
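The geo-fence detection underlying these steps boils down to a distance test against a fence boundary. A minimal sketch, assuming a circular fence and great-circle (haversine) distance, with invented coordinates; the real implementation would use HANA's geospatial/ESRI integration rather than this Python stand-in:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (lat, lon) points, in kilometres."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    dlat, dlon = lat2 - lat1, lon2 - lon1
    a = sin(dlat / 2) ** 2 + cos(lat1) * cos(lat2) * sin(dlon / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))

def outside_geofence(ping, center, radius_km):
    """True if a location ping falls outside a circular geo-fence."""
    return haversine_km(ping[0], ping[1], center[0], center[1]) > radius_km

# Invented coordinates: a truck's usual spot in downtown Washington DC.
usual_spot = (38.9016, -77.0390)
nearby = outside_geofence((38.9020, -77.0385), usual_spot, radius_km=1.0)
across_town = outside_geofence((38.9850, -77.0950), usual_spot, radius_km=1.0)
```

A ping a block away stays inside the fence, while one several kilometres north trips the anomaly check described in step (b).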

 

There is so much interest and excitement surrounding this project that it has in fact already spurred follow-on project discussions to further exploit future SAP HANA capabilities and to explore in greater depth how geospatial intelligence can be applied to other industries like retail and healthcare.

 

While there is always the challenge of identifying a valid business case underscoring how this technology can be used, the point being made here is that this team has established a very compelling co-innovated reference architecture that is already proving what’s possible. I know from just this project alone, it is going to be a very interesting year at COIL.

The SAP Co-Innovation Lab (COIL) continues to globally enable a broad constituency of project requestors, originating both inside SAP and from firms within its ecosystem, wishing to engage in co-innovation projects.  This co-innovation enablement comprises a platform of services:

 

  • IT infrastructure – provisioning SAP landscapes for all projects
  • IP management – SOW-based projects
  • Knowledge brokering – sourcing and connecting subject matter and domain experts
  • Project and operations management – resources and capabilities for project execution

       

This platform is of high value to project originators who prefer to stay focused on the collaboration and co-innovation goals of the project itself, rather than spending hours, days or even weeks seeking out the required resources and becoming deeply involved in building out the right physical project environment. COIL provides such a platform; this year we ran over 100 projects like this worldwide.

 

The knowledge brokering dimension of the COIL enablement platform has gradually evolved into a strong capability. This is due to a number of contributing factors, like being among the first to establish SAP platforms like Sybase SUP, SAP Gateway and SAP HANA landscapes to support an array of co-innovation project work, which in turn helped build our global team’s infrastructure expertise needed to stand up these environments and to efficiently provision projects.

knowledgeBrokering.jpg

 

With nearly 5 years of co-innovation project work under our belts, COIL project managers, business leads and subject matter experts have become quite adept at thoroughly assessing project proposals to identify potential gaps relevant to required hardware and software resources as well as to recognize when discrete expertise is necessary for the project to succeed. There are a variety of ways in which COIL applies knowledge brokering for a given project. There are four ways of sourcing expertise that have emerged as somewhat cornerstone to this effort:

 

  • Project Requestor
  • Personal Networks
  • Repeat Project Participation
  • Social Networks

 

Oftentimes the project requestor has already laid sufficient groundwork in obtaining the right people to participate in the project. One observation is that commitments tend to be firmer when participants are formally assigned to the project via the requestor's own executive sponsorship. While this is obviously ideal, it is somewhat rare. At best, we see some or all of the SAP application or platform experience fulfilled by the requestor, and yet general experience across the entire technology stack, or understanding of key aspects of the infrastructure of equal importance to a project, can be lacking and therefore represent a critical gap. The same is true when an SAP technology partner or ecosystem ISV proposes a COIL project: the firm will typically bring vast knowledge of its own technologies and products, and perhaps some broader skill sets in areas like networking, storage or security, but may lack the deeper SAP knowledge critical for things like developing a suitable use case.

 

Nonetheless, from COIL having enabled such a diverse portfolio of projects, our business leads and project managers have built up personal networks of contacts inside SAP and across its ecosystem, to the extent that we are fairly successful at finding the needed project participants.  We often tap talent for a project by connecting prior project participants to new projects where we see a fit, leveraging the fact that the person is already familiar with COIL project processes.

 

There are of course ongoing challenges, like sometimes needing more time to identify a needed expert than a proposed project timeline can tolerate. Another issue arises when we locate the right expert who is willing to participate, only to find that their availability does not coincide with the project timeline or, worse, they get redirected due to their own shifting priorities and need to drop from an active project. There are ways to mitigate this, but the risk exists in any circumstance where the participants are not formally committed to the project through agreements spanning a project’s executive stakeholders.

 

The last and more recent way to source expertise for co-innovation projects is to tap internal social networks by simply broadcasting information about the project and requesting assistance from those who find the proposed project of interest, and where it may even align with the goals of others not yet aware of the project. Tapping social networks is something we’ve only just started to explore; it may not get complete traction until enough people follow the SAP Co-Innovation Lab continuously and there is stronger awareness that it is possible to become involved in co-innovation projects.

 

As someone who pays attention to how co-innovation is best enabled, I am always interested to learn how we can become more efficient at what we do and how our services can be designed to scale. In a previous blog post, I described my prior experience with cross-utilization management as one well-known approach developed years ago, for increasing workforce agility. Since first becoming active with implementing Web 2.0 Technologies for collaboration as early as 2006, I’ve had the sense that there must be scalable ways for project owners to connect to a larger pool of people.  Expanding this reach via social networks may be an effective way to strengthen our overall knowledge brokering capacity but this may or may not prove to be the only way to source project talent. Given the priorities of the day, it is not always possible to nurture the social networks of interest to the degree that it will quickly become a fruitful source for consistently connecting with subject matter experts.

 

Through my own scholarly efforts to discover how other firms and organizations connect people, and to find the right people at the right time for important innovation project work, I was recently introduced to a network of people exploring the topic of Human Capital Management (HCM) and the intersection of management 2.0 and enterprise 2.0. Should you yearn to dive straight into HCM, I can highly recommend a great book on the topic, The New HR Analytics by Jac Fitz-Enz.  As I read this book, I was struck by how prevalent predictive analytics is becoming in HR, and I wondered aloud at what point a firm's HR organization would regularly track innovation project work with respect to how such projects connect with people resources across the firm, and what its impact is not only upon revenue and growth but upon employee motivation, loyalty and job satisfaction.

 

In much the same way that we are fast becoming accustomed to sourcing computing power from the cloud, there are now tools and services emerging in the marketplace empowering the enterprise to optimize knowledge workers. For starters, SAP SuccessFactors offers market-leading solutions with its BizX Suite, delivering core HR functionality, collaboration, and analytics from the cloud to help firms get the right people aligned with company strategy and business goals.

 

Additionally, and what I view as fully complementary to the aforementioned capabilities, are newer and compelling management techniques being developed that rely less on top-down decision making and more on distributed, worker-based decision making. There are a few companies out there looking at this today, but one particular startup, Collabworks, is evolving the notion of the “People Cloud” and describing a new paradigm of Worker-as-a-Service.  It didn’t take long for me to become interested in all of this, as it seems to directly address some of the knowledge brokering challenges previously mentioned.

 

I find most intriguing the idea that the right type of platform or middleware could scale workers as a service to the extent that jobs become more like services.  It suggests there are now opportunities to optimize talent by ensuring a firm can connect the right people to the most important and relevant projects, instead of head count and job functions being tied to a single discrete business unit and fundamentally restricting a free flow of talent within an organization.

 

I’m still very much the newbie to all of this, but am eager to learn more. I get my first full immersion by participating in a workshop and panel discussion hosted by Collabworks at the Computer History Museum in Mountain View, CA on Dec. 11th.  For a few short hours I will be surrounded by a lot of really talented people looking at all this from a variety of perspectives, where I hope to find further insight into how it might supercharge our agility and efficiency in finding and connecting experts to co-innovation projects.  What I value is how this also injects a higher degree of ideation and helps foster the open innovation tenet of leveraging tacit knowledge exchange between SAP and ecosystem partners, both to accelerate the innovation process and to find shared commercial success in bringing innovation to customers.

Recently, our Open Innovation team (including David Cruickshank) sat down with two innovators from SAP Research to discuss their approaches to innovating in the real world. Dr. Axel Saleck, VP of SAP Co-Innovation Lab Global Network, and Denis Browne, Senior Vice President of SAP Imagineering Research are featured in a video interview moderated by our own Rocky Ongkowidjojo. Axel and Denis share their motivations and some examples of exploring opportunities for open innovation as well as exploiting technologies and processes in unique ways.

 

Axel notes that in COIL, “We don’t think that we (SAP) know everything….” This becomes the basis for open collaboration with partners, customers and others, leading to innovative solutions. COIL’s aim is to look not just at SAP landscapes but at the real-world situations where SAP technologies and processes are applied to solve big problems. One example is drawn from a project addressing disaster/recovery across a complex landscape of technologies and processes from many vendors. This resulted in a highly optimized solution that was successfully implemented, prior to the catastrophic 2011 earthquake and tsunami, with a number of Japanese companies. Axel goes on to describe recent collaboration in the mobility space, as well as Smart Meter Analytics enabling support for renewable energy.

 

Denis emphasizes the point that Imagineering is intended to be a positive disruptive force. Their motto is, “Jump and find your wings on the way down.” When collaborating with entrepreneurs, start-ups, partners and customers, the Imagineering team embraces risk and uncertainty, and requires a willingness to find a creative solution to a business problem. Imagineering looks closely at the challenge of understanding when to ramp investments in order to make the transition from exploration to exploitation.

 

We look forward to hearing more from COIL and Imagineering as they achieve additional success in Open Innovation.

The session with Stefan last week was well attended: more than 50 people attended at COIL and even more attended online.

 

Many of you who missed the session have sent me emails about the recording.  Sorry it took a few days to get the files published on SCN.  The good news is that the online session recording is now available here (http://scn.sap.com/docs/DOC-29295).

 

For those who want to watch Stefan talking, the video recording for the onsite part is now available in SCN too. Due to its large size, the recording is split into two parts:

 

A note for SAP colleagues seeking general social media participation guidance

 

Laure Cetin shared with me a social media resource center where you can find our portfolio of social media guidelines, best practice information, tools and access to the social media account registries. The site is only accessible from inside the SAP network.

Do you have SAP BusinessObjects BI products running on VMware? What kind of virtualization overhead have you experienced? Any good lessons you have learned and shared with the world?

 

Long-awaited guidance

 

Let me know about your experiences. Here is what I understood from my conversations with our product folks: when not configured optimally, the performance penalty of running SAP BI in virtual environments can be as high as 20%-40%, whereas with some best practices applied, the overhead can be brought down to the industry-standard expectation of around 5% to 10%.

 

That tells how much we can benefit when we share the best practices.

 

Well, there was no official guidance for running BI in virtual environments, until now. 

 

COIL started a project with our next-door neighbor in Palo Alto (VMware) and our BI colleagues late last year to address this gap, and we have just rolled out our first paper (also available from the VMware site here), focusing on evaluating a subset of Java-specific best practices by applying them to an SAP BusinessObjects BI 4 deployment and analyzing their effects.

 

Check it out, and let us know what you think.

 

If you've got questions about the technical details of the paper and/or general virtualizing BI topics, get in touch with Ashish Morzaria. Ashish should be able to provide you an answer or point you to the right contact within the BI teams.

 

Whom should we thank?

 

It all started with a meeting at COIL with our friends from VMware, Vas Mitra and Justin Murray. Justin co-authored the VMware paper "Enterprise Java Applications on VMware - Best Practices Guide" and wanted to see how it applied to SAP BI applications. Over time, we were joined by Jay Thoden van Velzen, Ashish C. Morzaria, and many others. Jay played a critical role in designing and building the BI testing environment with the help of Roehl Obaldo and Sivagopal Modadugula. Ashish was the one who pushed us through the last mile of the paper. Without Ashish's diligent work and close collaboration with other BI colleagues, we wouldn't have the paper as it is published today.

 

Many colleagues supported the setup of the test environment, the execution of the tests, and the review of this paper. To mention a few: Michael Hesse at VMware; and Peter Aeschlimann, Corey Wilkie, Jacques Buchholz, David Cruickshank, Abhay Kale, Irakli Natsvlishvili, David Pascuzzi, Veronique L'Helguen Smahi, and Andrew Valega at SAP.

 

What next?

 

The resulting recommendations in this document only provide general guidelines related to Java and do not target any specific size or type of BI deployment.

 

Is that enough? Definitely not.

 

We are planning further documents to share validated best practices, covering not only Java but all other aspects of how to best deploy SAP BusinessObjects BI in a virtualized (VMware) environment.

 

We are in the planning phase of the next round of tests, and are open to suggestions. So if you have anything in mind that you would like us to validate, now is the time to tell us. We cannot guarantee we will be able to address everything given the resources and time available to us, but we will certainly try our best.

Many of you who are in or follow the social media and open innovation spaces may already know Stefan Lindegaard.

 

Stefan is an author, speaker, and strategic advisor focusing on the topics of open innovation, social media tools, and intrapreneurship. His sharp work has made him a trusted advisor to many large corporations. Stefan has written two books: Making Open Innovation Work (Oct 2011) and The Open Innovation Revolution (May 2010). His next book, Social Media for Corporate Innovators and Entrepreneurs: Add Power to Your Innovation Efforts, is due fall 2012.

 

Get a taste of his blogs at www.15inno.com.

 

Well, I have some good news here: Stefan will be a special guest at our next eco-innovation forum session, thanks to my colleague David Cruickshank and his connection with Stefan.

 

 

Title: Social Media for Corporate Innovators and Entrepreneurs - Add Power to Your Innovation Efforts

Date: Friday, June 8, 2012

Time: 12:00 PM - 1:00 PM PDT

 

For those who are in Palo Alto, grab your lunch and join us at Building 1, COIL quad. We will have a professional videographer onsite to record this interactive session, so be ready to ask your questions and have your smile taped. If you have a copy of his book and would like to have it signed by Stefan, bring it with you. We will see if something can be arranged.

 

If you are remote, we will broadcast the session online. Reserve your webinar seat to get a dial-in number.

 

See you tomorrow.

I'm thrilled that the SAP Co-Innovation Lab will host its Eco-Innovation Forum with our special guest Stefan Lindegaard this Friday, June 8, from noon to 1 pm Pacific.

 

Stefan and I connected sometime last year after I had written a paper examining the intersections of open innovation, co-innovation, and social networks. Shortly after publishing this paper to SDN, I participated in a one-day panel discussion at Santa Clara University hosted by Dr. Terri Griffith, author of The Plugged-In Manager.

 

As I researched my material prior to the discussion, it did not take long for me to find Stefan's very popular open innovation site, www.15inno.com. My first observation upon visiting his site was that there are many, many people out there passionate about open innovation and trying to figure out what it takes to succeed at it. We subsequently shared ideas and comments across different topics and postings and, after a while, discussed his upcoming third book, which explores open innovation and social media. We exchanged a few more emails, which led to Mark Yolton and me being interviewed for the book and sharing our own views and experiences on such a rich topic. We are looking forward to its publication.

 

This Friday, Stefan will share with us how a company might use social media to bring out better innovation faster. Working on his next book, Social Media for Corporate Innovators and Entrepreneurs: Add Power to Your Innovation Efforts, Stefan has learned that this question is being asked by many corporate innovators and entrepreneurs around the world.

 

This intersection is uncharted territory, yet full of interesting opportunities. At the session this Friday, Stefan will cover:

 

  • an understanding of how social media tools can impact innovation efforts
  • an overview of the most important tools and how they can be used by innovation teams
  • examples of how leading-edge companies use social media tools in their innovation efforts
  • advice on how to get started with using social media for innovation

 

If you are on the SAP Labs Campus this Friday, we hope you can drop in and participate.
