Last week fellow SAP Mentor and friend Ethan Jewett and I did a webinar for University Alliances


This was based on a presentation that I put together for SAP Insider in Singapore and the SAUG Summit in Sydney.


There were lots of live demos and fortunately the demo gods were kind to us.


Key points are:


  • Live benchmark comparing Microsoft SQL Server and Sybase IQ Free Editions
  • Setting up and configuring BW NLS with Sybase IQ
  • Running benchmark queries using NLS with Sybase IQ
  • Setting up Smart Data Access on HANA
  • Running live queries merging data between HANA and Sybase IQ


The key theme of the presentation is helping customers protect their investment in HANA by getting the best out of hot and cold data stores, using both HANA and Sybase IQ as part of the architecture.
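Conceptually, Smart Data Access lets one query join local HANA tables with virtual tables that point at a remote Sybase IQ store. As a rough, hedged sketch of that idea (not HANA syntax), here two SQLite databases stand in for the hot and cold stores; all table and column names are invented for the illustration:

```python
import sqlite3

# Toy illustration only: SQLite's ATTACH plays the role of a virtual table,
# making a second (cold) store addressable from the local (hot) engine.
hot = sqlite3.connect(":memory:")
hot.execute("CREATE TABLE sales_hot (id INTEGER, amount REAL)")
hot.executemany("INSERT INTO sales_hot VALUES (?, ?)", [(1, 100.0), (2, 250.0)])

# Attach a second database under the alias 'cold'.
hot.execute("ATTACH DATABASE ':memory:' AS cold")
hot.execute("CREATE TABLE cold.sales_cold (id INTEGER, amount REAL)")
hot.executemany("INSERT INTO cold.sales_cold VALUES (?, ?)", [(3, 75.0)])

# One query spanning both stores -- the caller never needs to know which
# store holds which rows.
rows = hot.execute(
    "SELECT id, amount FROM sales_hot "
    "UNION ALL SELECT id, amount FROM cold.sales_cold "
    "ORDER BY id"
).fetchall()
```

A real SDA setup would instead use `CREATE REMOTE SOURCE` and `CREATE VIRTUAL TABLE` on the HANA side, but the end-user experience is the same: one query, two stores.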


Enjoy!


As most people who read my blogs know, I am in the process of moving to the APJ region (more in this post).


One of the main reasons I finally decided to move out East was that there was a large BusinessObjects (BOBJ) community and over 70% of those customers were not running SAP.


The agnostic BOBJ shop is the world I have come from, being a long-time user of BusinessObjects, Sybase and various ETL tools over the years. The agnostic world was almost "out of place" in the SAP existence, but now, with SAP becoming a database company and HANA being used in a "side-car" approach, the SAP world is also getting exposed to the hands-on data movement and modelling expertise needed, which was typically handled by BW.


While presenting at both the SAP Insider event in Singapore and the SAUG Summit in Sydney recently, I generally always opened up by asking attendees if they are "Classic BOBJ customers" or are running BW. At both events, about 95% of the people attending my sessions were running BW, so there seems to be a huge disconnect from the original BusinessObjects install base - hence the title of this blog post...


So... if you are reading this and don't associate yourself with SAP, then please read further to see why the acquisition of BusinessObjects almost 6 years ago could be a great thing for either your career or company.


SAP, especially in the APJ region headed up by Kurt Bilafer, are doing all they can to assist the original BusinessObjects customers to move across to SAP "paper", so to speak. Have a look at this link and the video below about the Analytics Plus program.



To me, one of the most compelling reasons to move across to SAP pricing is so that you can take advantage of the BusinessObjects Analytics Edition bundles. I did a long write-up about it in this blog post.


The key is that now, as BOBJ customers, you get access to a world-class ETL tool (no more horrific hand-written SQL ETL jobs) and a best-of-breed analytics database, all for a fraction of the cost of acquiring them stand-alone.


If you are feeling lucky then please also enter the Analytics Makeover competition (only for folks in APJ) where you can win a free BI 4.1 upgrade thanks to Accenture and EV Technologies.


I have not even gone into the many advantages of upgrading to BI4 yet, and there are many a blog post relating to that. With that in mind, here is a great post by Timo Elliott outlining all the new features and resources for BI 4.1.


If you are still not convinced, then why not sign up for the free, yes free, 10-part webinar series called #BetterBOBJAPJ that is sponsored by SAP and EV Technologies. All sessions will be run by SAP Press authors and/or SAP Mentors, and you will be sure to have a good understanding of the new offerings by the end of the series.


My final thought on all of this is that if you are a "Classic BOBJ" customer, then after 6 years it really is time to come and say hi. With SAP pushing the HANA and database message, the Enterprise Data Warehouse world is opening up within SAP, and it would be great to have like-minded people in the community to compare war stories with...


Looking forward to hearing from all the BOBJ stalwarts out there!

If you have not heard about the SAPCC then please click here and make sure you donate to a good cause. This is my blog to track my 2 challenges during the course of Sapphire, and I am starting it off with a quick equipment test.


I will post more video updates along the way.



Safe travels and see everyone in Orlando.


The first 100 burpees... they HURT!



Daniel, your 150 burpees...



Joshua Fletcher, here are the first 150 of the 500 I owe you - done in the hotel room in Orlando last week.



This, as is always the case with me, is a blog post that has been stewing for a long time now. I think it has become mandatory for all my blog posts to be written on planes (maybe it’s the thin air up here?) and this one is no exception. Back in October last year, I was about to board a flight to Singapore to speak at the SAP Insider BI 2012 event, when I received a mail from Kevin Poskitt from SAP.


A few months prior to this I had written the blog post How Real-Time is SAP’s Real-Time data platform.  The gist of the post was that SAP has a great stack of technologies now with the BusinessObjects and Sybase acquisitions, and I was suggesting that SAP “give away” the ETL and Database pieces of the offering to ensure that customers have an end-to-end solution from them.


When the mail came in from Kevin Poskitt, my initial reaction was that I was in trouble for shooting my mouth off (as always), but actually, I was super excited by what I read. Kevin Poskitt introduced me to the  SAP BusinessObjects Analytics Edition bundles from SAP. Instead of me mumbling on about what they are, take a look at the YouTube video below to get an idea of the offering:


  • Classic BusinessObjects bundle information from the 3:24 mark
  • BusinessObjects Analytics Edition information from the 4:53 mark
  • Analytics Edition bundle pricing from the 7:07 mark



Three key points I want to make on this:


  • Unlimited SAP and flat file data sources, plus 2 other DB types. Most historic platforms that I have sourced data from in the past have always come off extracts/dumps, and with this version you get unlimited flat file loads, which is awesome.
  • Data Quality included in the Edge Edition - read my post here on my thoughts about the state of Data Quality today
  • Sybase IQ starts at 16 cores - I have run some SERIOUS implementations with far fewer cores, so there is definitely more than enough horsepower there to keep you going for a while!


To me, that is a great offering, but as far as I was concerned, none of the customers knew about it. Roll on to the end of the year, and I was fortunate to be invited by Jon Reed to do a BI 2012 wrap-up Google hangout, where again I was questioning why customers were not running out and opting for this stack. From where I sit, it is a super compelling offering! As mentioned in the video above, with only an additional 20% add-on to your presentation layer cost, you are getting a world-class ETL and database engine as part of the deal.


What happened more recently was super concerning to me. About four weeks back I was asked by The Eventful Group to facilitate one of their round table discussions ahead of their BI 2013 conference later this year. For anyone who is familiar with these discussions, you will know that the attendees are only customers, and the sharing and thirst for knowledge always make these discussions worthwhile. Halfway through the morning, we got talking about the EDW world and the tools available from SAP, when one customer piped up: “Yes, but all those tools add up and they are not cheap”. I then, as the facilitator, asked the question: “Has anyone here heard about the BusinessObjects Analytics Edition offering from SAP bundling all these products together?” As I love to say… crickets… nothing. In a room of 30 people, 18 of which were individual customers, not one of them had heard about the offering. I quickly explained how it all fits together, and almost all attendees were scribbling down notes to explore things further.


Now to me, that is such a huge pity, as this bundle really is an amazing offer, and for the “classic” BusinessObjects or EDW folk - with Sybase IQ powering your analytics and Data Services controlling your data flow - the world could become a much better place. For those who are wondering... no, I have not drunk the SAP Kool-Aid, but I do believe in the technology stack.


As a well-known Sybase IQ groupie, I know the price point in the SME market was always tough to beat and often a non-starter. With this run-time version of Sybase IQ, I can bet my bottom dollar, well, South African Rand, that there are thousands of BusinessObjects customers out there, with various row-based databases, that are bleeding with performance issues. Often, complex ETL and data mart upon data mart are implemented to solve this problem; however, these problems could almost all go away for only 20% of your presentation layer spend! I am pretty outspoken about the various presentation tools often being the “lipstick on the pig” solution to a problem, but now, with a leading ETL and DB engine, you can get things done properly from the ground up, the way it should be done.


So, what’s my quest this year? Isn’t it obvious? To let as many customers and partners know about this amazing offering. My good friend and fellow SAP Mentor, Joshua Fletcher, and I have put together a presentation for various events this year, based purely on live demos, walking attendees through the technology sets and showing them how they all seamlessly integrate. I’m really hoping that this can help educate customers on what is out there.


My good friends at the DSLayer recently interviewed Jayne Landry from SAP about the bundles that SAP are offering, and you can listen to that podcast here. The key takeaway for me is that this offer is not only for net-new customers, but also for existing customers.


As always, I would love to hear everyone’s thoughts on the topic, and if you need more help/info on the above or want to talk Josh and I into a road trip to do some live demos, then please feel free to reach out to us.


P.S. Jayne Landry also mentions some HANA offerings in the podcast that are going to be released soon which sound exciting.

This blog post is 3 months in the making. I started writing it on the way back from Sapphire in May, and due to the manically hectic schedule that is called “my life”, never got to finish it and am now starting fresh.


In hindsight, I’m glad this happened, as between now and then the “To BW or not to BW” debate has been a hot topic. This blog post is not going to go down that road, as I feel my peers have covered those points very well, and as Ethan Jewett pointed out so eloquently in a tweet two days ago: “People (especially experts) tend to prefer tools that they understand.”


As I sit here on flight QF63 from Sydney to Johannesburg, I am pondering the sentiments and comments from all the attendees (customers, partners and SAP employees) of the Mastering BI with SAP conference in Melbourne over the past few days. The general message, from a Database & Technology (D&T) point of view, is still very much HANA for everything, which is slightly disappointing. I was hoping that SAP would have started to embrace all the products in the Real-Time Data Platform with one integrated view for the customer, instead of pushing the HANA message, which I feel customers are getting a bit tired of.




Those that know me well often ask me if I go to all these conferences to surf or to attend the event. A surfboard is always part of my luggage, and the odd detour from the convention center to the ocean is always on the cards for me. The picture above represents my Nirvana. Warm tropical water and beautiful uncrowded surf. Throw in the wife and kids and this nerd is one happy man.


For my customers, I want them to have this Nirvana when it comes to accessing their data - one entry point, and not by using a BI4 universe, which kills your performance with large data sets. To put things bluntly, I feel that my customers should not really care where the data comes from. The words “hot and cold” data get thrown around loosely, but this always leads to a large amount of ETL (Extract-Transform-Load) and to users needing to be educated on which “entry point” to use based on the date range they are trying to access. One of my customers referred to this solution as “clunky” and I could not agree more!


I guess by now you are wondering what I am after… what is the “silver bullet”? I want my customers to have one entry point (the real time data platform) and by using slick and intelligent archiving mechanisms “under the hood”, they must be oblivious to where the data is stored without moving and duplicating data backwards and forwards between database technologies. In the “old days” we were lucky enough to work in batch driven systems, so reconciliation was pretty simple as there was always an end of day point. Those days are now long gone as many customers load data into their BI solution 24/7, with no end of day, which always makes reconciliation a challenge (and that’s putting it nicely). 
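The "single entry point" idea above can be sketched as a thin routing layer that inspects a query's date range and decides which store(s) must be consulted. This is purely illustrative: the 90-day hot window, the function name and the store labels are my own assumptions, not an SAP mechanism:

```python
from datetime import date

# Assumed hot window for the sketch: anything newer than 90 days stays in
# the in-memory (HANA-like) store; older data lives in the archive
# (Sybase IQ-like) store.
HOT_WINDOW_DAYS = 90

def stores_for_range(start: date, end: date, today: date):
    """Return which stores a query over [start, end] must touch."""
    hot_boundary = date.fromordinal(today.toordinal() - HOT_WINDOW_DAYS)
    stores = []
    if end >= hot_boundary:
        stores.append("hot")   # recent rows are in the in-memory store
    if start < hot_boundary:
        stores.append("cold")  # older rows are in the archive store
    return stores
```

The point is that the routing happens below the user's entry point, so the hot/cold placement can change over time without anyone re-pointing their reports.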


Fun Fact: We were fortunate enough to have 90 minutes with Hasso Plattner and Vishal Sikka in May in Orlando, and it was fascinating for me to hear (from Hasso Plattner) that the hot/cold concept of storing data was in the original design of R/1, but due to the low volumes of data back then, everything was put in the “hot storage”.


So where are we today and how close are we to that Nirvana? As is the case in the current SAP eco-system, there are two distinct scenarios: BW or non-BW customers, so I am going to break down my opinions that way.


Many thanks to fellow SAP Mentor Ethan Jewett for checking my BW facts, as my hands-on experience is limited.


BW World


Things are actually looking very good for BW customers. Right now, there are two third-party companies that I know of that offer Near Line Storage (NLS) solutions, namely:


  1. PBS NLS: I have been exposed to this solution through my long-term love affair with Sybase IQ. The concept is excellent: your read-only data gets archived into a Sybase IQ data store. The benefits to the company are immediate and are as follows:
    1. Shrink the size of your native BW database, which ultimately leads to query performance enhancements
    2. Get the benefit of compression in Sybase IQ, dropping your data footprint
    3. The benefit of column-based storage in Sybase IQ for fast query response times

          It almost sounds too good to be true typing this, though by no means is the PBS solution cheap.


  2. SAND: I am less familiar with this solution, but thanks to fellow SAP Mentor Sascha Wenninger for his input, as they implemented this strategy at one of his customers.


Ethan also let me know about this solution by IBM using DB2
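To illustrate why a column store like Sybase IQ compresses archived data so aggressively, here is a toy sketch with invented data, using generic zlib rather than IQ's actual encodings: laying the same records out column-by-column groups similar bytes together, which compresses far better than row-interleaved storage.

```python
import random
import zlib

# Invented data set: a low-cardinality "region" column alongside a noisy
# "amount" column, as you might find in a sales fact table.
random.seed(42)
regions = [random.choice(["EMEA", "APJ", "AMER"]) for _ in range(20000)]
amounts = [f"{random.uniform(0, 1e6):.2f}" for _ in range(20000)]

# Row-wise layout: region and amount interleaved record by record.
row_wise = "".join(r + "," + a + ";" for r, a in zip(regions, amounts))
# Column-wise layout: each column stored contiguously.
col_wise = "".join(regions) + "".join(amounts)

row_size = len(zlib.compress(row_wise.encode()))
col_size = len(zlib.compress(col_wise.encode()))
# The columnar layout compresses noticeably smaller on this data, because
# the repetitive region column is no longer broken up by random amounts.
```

Real column stores go much further (run-length and dictionary encodings per column), but the direction of the effect is the same.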



The Future


By all accounts, native NLS is meant to be coming in one of the service packs of BW 7.3. This will allow BW customers, without paying additional licenses to third-party vendors, to have their truly hot data in HANA with the cold archived data in Sybase IQ. For me, this is an awesome solution, and as the price of HANA slowly drops over time, customers can increase the amount of hot data in HANA.


One thing that does concern me, from a purely theoretical point of view, is that the BW application layer does most of the processing. I understand why SAP did this - to ensure that they were database agnostic - but now that they are in the database game, there is a huge need to push the processing down to the database engines to really reap the benefits of the technology of HANA and SAP Sybase IQ.



Enterprise Data Warehouse (EDW) / Non BW World


Many thanks to fellow SAP Mentor Josh Fletcher for checking my sanity on this section.


This is the area that is close to my heart and the one that I mainly focus on. In this space, there are currently no options whatsoever (yet again, I am not counting a BI4 multi-source universe). The EDW world is filled with ETL jobs moving data backwards and forwards. The challenges brought in by the constant movement of data are vast, with reconciliation efforts often being quite tedious and time consuming, and the “single version of the truth” is seldom met.


There is SAP Sybase Replication Server for replicating data from SAP Sybase ASE to SAP Sybase IQ. Replication Server has a new “real-time data load” function that queues data up in Replication Server and then bulk loads it into Sybase IQ. It is solid technology and works well, but all you are doing here is creating an operational data store (ODS) in your EDW by replicating the OLTP system sitting in SAP Sybase ASE.
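The "real-time data load" pattern described above - queue changes as they arrive, then flush them into the analytics store as one bulk load - can be sketched as a simple micro-batcher. This is a conceptual toy with SQLite standing in for Sybase IQ; the class and table names are invented and bear no relation to the Replication Server API:

```python
import sqlite3

class MicroBatchLoader:
    """Toy sketch: buffer captured changes and bulk-load them in batches."""

    def __init__(self, conn, batch_size=3):
        self.conn = conn
        self.batch_size = batch_size
        self.queue = []
        conn.execute("CREATE TABLE IF NOT EXISTS ods (id INTEGER, payload TEXT)")

    def capture(self, row):
        """Queue one captured change; flush once the batch is full."""
        self.queue.append(row)
        if len(self.queue) >= self.batch_size:
            self.flush()

    def flush(self):
        if self.queue:
            # One bulk insert instead of many single-row inserts.
            self.conn.executemany("INSERT INTO ods VALUES (?, ?)", self.queue)
            self.queue.clear()

conn = sqlite3.connect(":memory:")
loader = MicroBatchLoader(conn)
for i in range(7):
    loader.capture((i, f"change-{i}"))
loader.flush()  # drain whatever is left in the queue
count = conn.execute("SELECT COUNT(*) FROM ods").fetchone()[0]
```

The trade-off is the batch size (or polling interval): larger batches mean cheaper loads into the column store, at the cost of slightly staler data.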


I have such a clear vision for this space, and to me, it is clean, simple and has the potential for SAP to entrench themselves in many customers, and more importantly set themselves up for future opportunities.


During my conversations with the APJ SAP team over these last few days, the number of SAP BusinessObjects customers (2500) in the region got mentioned. Now, let’s assume that the majority of those are classic BOBJ customers and not BW ones.


If I were running D&T in APJ (doubt they would have me) I would do all I could to make contact with those +-2000 existing BOBJ customers and talk about Sybase IQ. Now, I know a few of you might be sighing and saying: Clint’s on about Sybase IQ again! But wait, there’s more...


Similar to the BW space, I feel SAP need to offer the same NLS-type solution for EDW customers. This way SAP can go out with confidence and start selling Sybase IQ into the EDW/classic BOBJ space immediately, knowing that the HANA sales will come. As soon as SAP get this integration/archiving sorted out then, in my opinion, SAP will be hard to beat.


In my experience, HANA is suffering in the EDW space (note I am based in EMEA), and when it comes down to the TCO argument, with no special content, it all falls completely flat.


But here is the Nirvana for both the customer and SAP - let’s also remember that SAP sell software! SAP need to approach all their EDW customers and get them to look at Sybase IQ. MANY customers out there are dying with the performance of the traditional row-based databases that are often seen under those classic BOBJ systems.


Customer Win


  • Smaller data footprint due to compression
  • Drastic query performance improvements
  • Faster load times of data with lower latency 
  • Not having to archive old data off due to performance constraints
  • Lower TCO


I could go on and on but that will be another blog.




SAP Win


  • They sell software and increase the D&T footprint. More importantly, they sell their customers a tried and tested technology that will add huge and instant value.


Let’s look 12-18 months down the line, when the HANA/Sybase IQ integration in the EDW space is out of beta and is a slick solution. SAP can then approach the same customers with large data volumes - who, I can pretty much assure you, are going to be happy with the SAP database technology (Sybase IQ) - and look to start loading the hot data into HANA with seamless archiving into the already existing Sybase IQ instance.


Customer Win


  • Benefit of speed of in memory technology
  • The ability to buy a small instance of HANA as an entry point and not worry about their users having multiple entry points into the data
  • Continue to leverage their previous investment in Sybase IQ




SAP Win


  • Sell more software
  • A non-disruptive way to offer their customers a cheap entry point to in-memory technology
  • A very scalable solution, with customers able to add on more and more HANA where need be


I know the HANA message is strong right now at SAP; however, I honestly feel that the message above is one that will be more palatable to customers, and seeing that it is based on solid, scalable technology, it will make SAP look good.

SAP is working hard at trying to change their image of long, expensive projects that often break the bank at customers. In my experience, within a few short weeks, you could replace a classic DBMS with SAP Sybase IQ, and amaze your customers with the benefits.


The key word for me, in both my personal and business world, is trust. Customers are tired of the “HANA message” and in most cases it falls out of budget. If SAP look into their toolkit and see that SAP Sybase IQ is a great 2-4 year plan that will give customers huge value at a fraction of the price, it will buy them credibility and, most importantly, trust from their customers.

The investment won’t be a throw away, as their customers can use Sybase IQ as their archiving strategy to assist with long term (10+ years) trend analysis.


Don’t get me wrong; I am not a HANA hater. I think the technology is amazing and truly do feel that in memory is where everything is going to be in the next 5 years. The key is in how we get there!


I have just noticed that this blog post is over 5 pages - the joys of a 15-hour flight and post-conference thoughts...


So to summarize:


BW folks: This is looking OK with a roadmap in place and third party tools for those less patient.

EDW folks: There is a real-time data platform filled with amazing technology, but right now the integration is either replication or ETL.


SAP have been quite bullish in their statement of “being the 2nd largest database vendor in the world by 2015”, and, perhaps walking the road with their customers, from disk based to in memory technology over the next few years, will work better than going straight to the HANA message upfront.

Since the YouTube video below was released last week, I have had many people ask me "what's your secret?" and "how did you spot that in the data?"



As I am sure you are all aware I am certainly no clairvoyant and do not possess the skills that Dustin Hoffman's character had in Rain Man.


What I do every time I see a new data set or a whack of information is try to apply the "I" in Business Intelligence that is truly lacking these days. My thought process usually goes something like this when seeing new information:


  1. How can we represent this data in a different way to change the way people look at this information?
  2. What disparate/external data source can I bring into this report to enrich the value of the information and add another dimension to the decision making?


In the case mentioned in the YouTube video above, it was very much a number 1 situation, as the customer had been staring at the same report for well over a decade and it was clear they had never fully grasped the effect of the data being presented to them. They do say that no presentation is complete without a pie chart, and only when the data that was always buried in long-winded auditors' reports was suddenly presented in our trusty pie chart did their outlook on the data change.


The beauty of BIOnDemand, besides the fact that it is free, is that in a very short space of time you can be slicing and dicing your static data and getting new insights into it. As you can judge by the tone of this post, I LOVE this product, and use it as my "secret weapon" whenever I am going to see a new customer. How do I do that? Simple!


Most customers we deal with are listed companies and have audited financials/reports posted all over the internet. Before the meeting, I always extract some of that information and then, on my iPad during an introductory meeting, show the customer the "art of the possible" with their own data, which really is a powerful message. It really can be as simple as taking their balance sheet and showing them visually where the bulk of their liabilities lie... the permutations are endless!


To me, you can never underestimate the power of customers seeing information that they can relate to - the world has moved past the stage of the "How many green T-shirts were sold in California?" demos.


So what I decided to do - and I limited myself to an hour to prove a point - was to get some random data off the internet, put it in a BIOnDemand instance and see how/where I could add value, to give you an idea of my thought process...


As I know a lot of us consultants spend our time bouncing between airports, I managed to find this site, which has all the data on how and why flights are on time throughout the US.


The graphing looks like this:




I then downloaded their raw data which came in this format:




If you look closely in the "airport_name" column there is a lot of information there, namely:


  1. Town
  2. State
  3. Airport Name


Other enhancements I made to the file were as follows:


  1. Split the above 3 items out (this way we can analyze which is the most reliable airport by State/Town)
  2. Added a column for Season (Winter/Summer/Fall/Spring) as this may affect reliability in certain areas
  3. Added a total column aggregating all the flight information. In my world, it really does not help to know that 100 flights are late: if that was in relation to 110 flights then you are not going to fly out of there, but on the other side of the coin, if it was based on 100,000 flights you would be less concerned.
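The enhancements above can be sketched in a few lines, assuming the combined airport_name field follows the "Town, ST: Airport Name" pattern seen in the downloaded file; the function name and the Northern Hemisphere season mapping are my own additions for the illustration:

```python
# Month -> Season mapping (Northern Hemisphere, assumed for US airports).
SEASONS = {12: "Winter", 1: "Winter", 2: "Winter",
           3: "Spring", 4: "Spring", 5: "Spring",
           6: "Summer", 7: "Summer", 8: "Summer",
           9: "Fall", 10: "Fall", 11: "Fall"}

def enrich(airport_name: str, month: int, delayed: int, on_time: int):
    """Split the combined field and derive the Season and total columns."""
    # Assumed field format: "Town, ST: Airport Name"
    town_state, airport = airport_name.split(": ", 1)
    town, state = town_state.split(", ", 1)
    # A total column: late counts only mean something against a total.
    total = delayed + on_time
    return {"Town": town, "State": state, "Airport": airport,
            "Season": SEASONS[month], "TotalFlights": total}

row = enrich("Las Vegas, NV: McCarran International", 10, 100, 900)
```

Run over every record of the raw file, this gives exactly the extra State/Town/Season/total columns used in the screenshots below.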


Below are a few screenshots that I have taken in BIOnDemand to give an idea of the newly added functionality and the multitude of ways that you can now analyze your information:



Pic 1: The warmer it is the more people fly



Pic 2: Planning for TechEd in Vegas - Southwest have the most flights into Vegas - but also the most delays!



Pic 3: The top 10 most reliable airlines flying into Vegas in October for two years (2010,2011)



Pic 4: Weather delays, which I "think" are people's biggest fear when flying, cause less than 1% of delays


Pretty cool stuff... well, I believe so... I could go on forever - but you will be glad to know I won't.


Yes, there are some negatives about the free BIOnDemand service if you take a closer look, such as:


  1. It does not handle negative values very well
  2. Size limit (but as I said, it's free, so stop being cheap)
  3. Dates are a bit of a nightmare to work with for sorting, etc.


There are 2 main points that I want to stress before people get carried away with BIOnDemand and go replacing their Enterprise Data Warehouse (EDW) with it - there is a product that begins with a Q that already tries to achieve this:


  1. BIOnDemand is an amazing POC tool that can be used to get business buy-in and give them insight/vision into what is possible with their information. When I present this to customers I always show a picture of an iceberg, with this being the very tip of it! Garbage in = garbage out, so the correct infrastructure/architecture and governance need to be in place to ensure quality information. (As a "challenge" to the detail-oriented folk, I have purposely had some "finger trouble" in the Excel sheet which skews numbers a lot and could have cost someone their job - coffee with me at TechEd for the first person who spots it. "Terms and Conditions Apply and Excludes Travel and Jamie Oswald")
  2. Running under BIOnDemand is HANA. I am one of the front runners in making sure my customers buy HANA for the right reason and use case, and do not buy it due to market hype. In this instance HANA really performs amazingly behind the scenes and gives BIOnDemand that real-time feeling for the end user. At the end of the day I want my end users to not care/know about where the information lies, and I have to admit that SAP have it spot on here with HANA powering BIOnDemand. ** UPDATE ** Thanks to Ethan Jewett for pointing out that the free version of BIOnDemand does NOT run on HANA, which really questions why SAP brand the OnDemand site with "powered by HANA". His full write-up on this can be found here.


Please feel free to contact me if you would like to play around with this dataset on BIOnDemand, as I will happily share it with you. If you would like access to the data directly then click here.


I'm really keen on your thoughts on this technology, and I feel that if you pitch this tool at the right level and carefully explain where it fits in, it will become an awesome part of your arsenal.



Thanks to my good friends at DSLayer for recording this podcast, which is mainly non-technical. A 13-minute interview on who Clint Vosloo is... if that interests you at all.


Direct download here and iTunes download here

My trip to Orlando has been a bit last minute. Once I heard the amazing news that I had been chosen to be an SAP Mentor the first thing I did was book my flight to Orlando to be part of Sapphire.

It’s a strange thing really: I have been in the BI space for 15 years now and never once attended one of the big global conferences. They have, however, always been part of my “nerd bucket list”, so to speak. Being based at the southern tip of Africa, these trips don’t come cheap. As with all things in our lives, we always say that we will get there “one day”. It’s a weird thing really, as my motto for life in general is “if you don’t go you’ll never know”, so I guess in a few days I will know.

I am writing this post on a flight from Johannesburg (where SAP Africa's head office is based) down to my home town of Cape Town. I have about 26 hours with family and then it is back to the airport for the gruelling 30-hour commute to Orlando.

The flight details look something like this:

Cape Town – London (12 hours)
4 hour layover
London – Miami (9½ hours)
2 hour layover
Miami – Orlando (1 hour)

If the travel gods are with me then that puts me in Orlando at 5:30pm which should give me enough time to get to the hotel, get freshened up and off to the Mentor induction – I am expecting some surprises here ;-)

Those who know me will know that I come from the Sybase world; as a business we cover the full SAP BI space, but it is in the zeros and ones of the database world where my true passion lies. Since SAP acquired Sybase I have been to a few of the SAP events, and one thing that's for sure is that SAP knows how to do events! As Sapphire is the flagship event of the year (for business users), I am expecting the event to be nothing other than mind-blowing. We are fortunate to be staying across the road from the conference, so I intend to attend all the sessions I can and meet as many people as possible.

The first time I got exposure to the SAP community was here in South Africa, when I got to meet and connect with the great Ingo Hilgefort and new SAP Mentor inductee Josh Fletcher at the Mastering SAP BI conference last year. What never ceases to amaze me about the SAP community - and where I feel the real power lies - is that on various occasions I have reached out to SAP experts and they have always been willing to assist, even with their busy schedules. Looking at the line-up of speakers and content, my inner nerd cannot wait to get there!

I guess the question a lot of you are asking by now is... well, if you are a database guy, then what sessions are you attending? The honest answer: I am not too sure yet. I have a few must-attend sessions on my list, but I am really going to focus my energy on trying to meet and share knowledge with as many people as possible.

When I posed the question of "what to expect" at my first Sapphire to my fellow Mentors, I got a lot of great advice. One piece of advice struck a chord with me, which went something like this: "You can always get access to the presentations after the event, but you can never get that face-to-face time with product owners and members of the community again, and that is invaluable". Thanks Martin!

Post Sapphire and once we have all Jumped to Van Halen I have got a round of golf lined up for the Thursday, afternoon of course, and then I am off down the coast to Miami where I have lined up a surf or two with some locals. After that it is the long trip home again, via London, and I am pretty sure sleeping tablets won’t be needed then.

If you are going to Sapphire then please connect with me and say hi. I will be on twitter (@biitb) and am looking forward to meeting as many people as possible in the global SAP community, making new friends and most importantly helping and advising where I can.

Travel safe for those going, for those not… have some sleep for me please.