
As a continuation of the blog SAP OData Library Contributed to Apache Olingo (Incubator), I wanted to share some further insights into the Apache Olingo Incubator project.

 

About two years ago SAP started to invest in a new OData Library (Java). The goals for this effort were to implement a library that supports the OData Specification Version 2, with nearly the same feature set one can find in SAP NetWeaver Gateway, and to open source the library at Apache in order to build a developer community around OData.

 

In mid-2013 SAP made a software grant of the library and contributed the source code to the newly formed Apache Olingo Incubator project. Shortly after, the project released version 1.0.0 in October 2013 and version 1.1.0 in February 2014. The next version, 1.2.0, is already on its way and currently available as a snapshot on Apache Olingo Incubator, where you can also find the release notes. The releases cover the OData Specification Version 2. The committers of the project work constantly on the documentation for users of the open source library and are happy to answer questions via the dev mailing list or via Jira.

 

In the meantime, OData is evolving into an OASIS standard, so you can watch out for news from the OASIS OData Technical Committee. The community work now focuses on implementing both client and server libraries for the OASIS OData Standard (Version 4). These efforts are supported by new contributions for Java (ODataClient) and JavaScript (datajs), both client libraries for consuming OData services.

 

Apache Olingo is evolving into a project hosting OData implementations in different languages and technologies, which is already a great success, but the community has some more milestones to focus on:

 

  • Graduation, which means that the project leaves the incubator behind and becomes a top level project within the Apache Software Foundation
  • Agreement within the community for a common roadmap of V4 feature development
  • Merge the contributions into a common code base to go forward with the OData OASIS Standard (Version 4) feature development
  • Release a first version of an OData Java Library supporting V4
  • Release a first version of datajs supporting V4

 

Last but not least I also wanted to share some short facts around Apache Olingo (Incubator):

 

  • 2 releases, the third one is on its way
  • 19 initial committers
  • 7 new committers
  • 75 persons active on the mailing list
  • 1025 commits in the git repositories
  • more than 1500 mails via dev mailing list
  • more than 150 Jira Issues closed / resolved
  • about 20 tutorials available

 

With that I think there will be interesting times ahead of us in shaping the future of the Apache Olingo project.

 

We are interested in your thoughts, so please share your comments and feedback by commenting on this post. If you already have more detailed questions or feature requests, you may also use the dev mailing list for Apache Olingo directly. We, that is Christian Amend, Tamara Boehm, Michael Bolz, Jens Huesken, Stephan Klevenz, Sven Kobler-Morris and Chandan V.A. as the main initial committers, are happy to answer your questions.

Introduction

 

Source code for the application is available on GitHub.

 

An application using OpenUI5 at the front end will sooner or later need to connect to back-end services for some business logic processing. In this blog entry we'll show how we can use the popular Spring MVC framework to expose REST-like endpoints for such server-side processing. Spring MVC makes it very simple to set up and configure an interface which will handle requests with Json payloads, converting all domain model objects from Json to Java and back for us.


  • Simple Maven project with embedded Tomcat for testing locally
  • Servlet 3.0, no-XML, set-up for the web application using Spring's annotation based configuration
  • JSR-303, Bean Validation through annotations, used on the model POJOs
  • Spring MVC set up with a web jar for OpenUI5 runtime and automatic serialization of the model to Json


Useful links


Here are some useful links.


 

Application

 

This is a very simple single-page application which has a table of fruit, each entry having a name (String) and a quantity (integer). One can add a new fruit, delete an existing entry from the table, or update an existing fruit using inline editing.

 

[Screenshot: fruit.png — the fruit table view]

 

Taking just the "add" operation as an example, we can see that the home view, home.view.js, calls the controller with a JavaScript object constructed so that it represents a Fruit when it is serialized as part of the Ajax request by the controller.

 

 

// add button
        var oButton = new sap.ui.commons.Button({
            text: "Add",
            press: function () {
                // check if quantity is a number
                if (oInput2.getValueState() !== sap.ui.core.ValueState.Error) {
                    oController.add({
                            // id attribute can be ignored
                            name: oInput1.getValue(),
                            quantity: oInput2.getValue()
                        }
                    );
                }
            }
        });

 

The controller, home.controller.js, then simply sends the serialized Fruit object as the content of a POST request to the appropriate endpoint (/home/add) made available by the Spring MVC controller. Once the Ajax call returns the updated model data, it is simply rebound to the JSONModel associated with the view.

 

add: function (fruit) {
        this.doAjax("/home/add", fruit).done(this.updateModelData)
            .fail(this.handleAjaxError);
    },
updateModelData: function (modelData) {
        console.debug("Ajax response: ", modelData);
        var model = this.getView().getModel();
        if (model == null) {
            // create new JSON model
            this.getView().setModel(new sap.ui.model.json.JSONModel(modelData));
        }
        else {
            // update existing view model
            model.setData(modelData);
            model.refresh();
        }
    }

 

In what follows we'll look in detail at how to implement a REST-like endpoint handling Json payloads using the Spring MVC framework.

 

Spring MVC set-up

 

We are using the Servlet 3.0, no-web.xml approach based on Java annotations to set up a simple Spring MVC web application. For this we need an implementation of org.springframework.web.WebApplicationInitializer, where we specify the class which will be used when constructing an instance of org.springframework.web.context.support.AnnotationConfigWebApplicationContext and where we declare a dispatcher servlet. Here is our implementation, com.github.springui5.conf.WebAppInitializer.

 

 
public class WebAppInitializer implements WebApplicationInitializer {
    private static final Logger logger = LoggerFactory.getLogger(WebAppInitializer.class);
    @Override
    public void onStartup(ServletContext servletContext) throws ServletException {
        logger.info("Initializing web application with context configuration class {}", WebAppConfigurer.class.getCanonicalName());
        // create annotation based web application context
        AnnotationConfigWebApplicationContext webAppContext = new AnnotationConfigWebApplicationContext();
        webAppContext.register(WebAppConfigurer.class);
        // create and register Spring MVC dispatcher servlet
        ServletRegistration.Dynamic dispatcher = servletContext.addServlet("dispatcher",
                new DispatcherServlet(webAppContext));
        dispatcher.setLoadOnStartup(1);
        dispatcher.addMapping("/");
    }
}

The actual configuration is then given by the com.github.springui5.conf.WebAppConfigurer class.

 

@Configuration
@EnableWebMvc
@ComponentScan(basePackages = {"com.github.springui5.web"})
public class WebAppConfigurer extends WebMvcConfigurerAdapter {
    /**
     * Enable default view ("index.html") mapped under "/".
     */
    @Override
    public void configureDefaultServletHandling(DefaultServletHandlerConfigurer configurer) {
        configurer.enable();
    }
    /**
     * Set up the cached resource handling for OpenUI5 runtime served from the webjar in {@code /WEB-INF/lib} directory
     * and local JavaScript files in {@code /resources} directory.
     */
    @Override
    public void addResourceHandlers(ResourceHandlerRegistry registry) {
        registry.addResourceHandler("/resources/**")
                // resource locations are directories, not URL patterns
                .addResourceLocations("classpath:/resources/", "/resources/")
                .setCachePeriod(31556926);
    }
    /**
     * Session-scoped view-model bean for {@code home.view.js} view persisting in between successive Ajax requests.
     */
    @Bean
    @Scope(value = "session", proxyMode = ScopedProxyMode.TARGET_CLASS)
    public HomeViewModel homeModel() {
        return new HomeViewModel();
    }
}

We use the helpful @EnableWebMvc annotation, which configures our application with some useful defaults. For example, Spring will automatically configure an instance of the org.springframework.http.converter.json.MappingJackson2HttpMessageConverter message converter, which uses Jackson to convert between Json and Java and to serialize the model returned by the Ajax handling methods of the controller.

 

Another interesting thing to notice is that we are using Spring's resource handling to serve the static JavaScript (the OpenUI5 runtime) from the web JAR available on the classpath of the application. To create the web JAR, we can simply package the OpenUI5 runtime JavaScript, available for download, into a JAR and add it to the WEB-INF/lib directory of our project.

 

The session-scoped bean, com.github.springui5.model.HomeViewModel, is responsible for maintaining the reference to the model object corresponding to the client's view.

 

public class HomeViewModel {
    private HomeModel homeModel;
    /**
     * Initializes and returns a new model.
     */
    public HomeModel getNewHomeModel() {
        homeModel = new HomeModel();
        return homeModel;
    }
    /**
     * Returns the model for this view-model.
     */
    public HomeModel getHomeModel() {
        if (homeModel == null) {
            throw new RuntimeException("HomeModel has not been initialized yet.");
        }
        return homeModel;
    }
}

The @ComponentScan annotation specifies where to look for the controllers of the application. The single controller for the home view is com.github.springui5.web.HomeController.

 

@Controller
@RequestMapping(value = "/home", method = RequestMethod.POST, consumes = "application/json", produces = "application/json")
public class HomeController {
    private static final Logger logger = LoggerFactory.getLogger(HomeController.class);
    /**
     * Session-scoped view-model bean.
     */
    @Autowired
    private HomeViewModel vm;
    /**
     * Initializes the model for the view.
     */
    @RequestMapping
    public
    @ResponseBody
    HomeModel handleInit() {
        return vm.getNewHomeModel().show();
    }
    /**
     * Adds the {@linkplain com.github.springui5.domain.Fruit} parsed from the request body to the list of fruit in the
     * model.
     */
    @RequestMapping("/add")
    public
    @ResponseBody
    HomeModel handleAdd(@Valid @RequestBody Fruit fruit, BindingResult errors) {
        if (errors.hasErrors()) {
            throw new FruitValidationException(errors);
        }
        return vm.getHomeModel().add(fruit).clearError().show();
    }
    /**
     * Deletes the {@linkplain com.github.springui5.domain.Fruit} with matching {@code id} from the list of fruit in
     * the model.
     */
    @RequestMapping("/delete/{id}")
    public
    @ResponseBody
    HomeModel handleDelete(@PathVariable long id) {
        return vm.getHomeModel().delete(id).clearError().show();
    }
    /**
     * Updates the {@linkplain com.github.springui5.domain.Fruit} with matching {@code id} in the list of fruit in
     * the model.
     */
    @RequestMapping("/update")
    public
    @ResponseBody
    HomeModel handleUpdate(@Valid @RequestBody Fruit fruit, BindingResult errors) {
        if (errors.hasErrors()) {
            throw new FruitValidationException(errors);
        }
        return vm.getHomeModel().update(fruit).clearError().show();
    }
    /**
     * Custom exception handler for {@linkplain FruitValidationException} exceptions which produces a response with the
     * status {@linkplain HttpStatus#BAD_REQUEST} and the body string which contains the reason for the first field
     * error.
     */
    @ExceptionHandler
    @ResponseStatus(HttpStatus.BAD_REQUEST)
    public
    @ResponseBody
    HomeModel handleException(FruitValidationException ex) {
        String error = String.format("%s %s", ex.getRejectedField(), ex.getRejectedMessage());
        logger.debug("Validation error: {}", error);
        return vm.getHomeModel().storeError(error);
    }
}

We are autowiring the view-model bean into the controller. It will be reinitialized by Spring automatically for each new client of the application (a new browser, for example). Ajax request handling is configured on the class and method levels via @RequestMapping annotations specifying the URL paths in the form /home or /home/add. Some methods accept a model object (Fruit) deserialized, or unmarshalled, from the Json in the body of the POST request via the @RequestBody annotation.

 

Each controller method returns an instance of HomeModel which will be automatically serialized, or marshalled, to Json and later bound to the JSONModel on the client side.
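The FruitValidationException referenced by the controller does not appear in the listings above. A minimal sketch consistent with the handler's use of getRejectedField() and getRejectedMessage() might look like the following; the constructor signature here is an assumption chosen to keep the sketch self-contained, while the real class would likely accept Spring's BindingResult and extract its first FieldError itself:

```java
/**
 * Hypothetical sketch of the validation exception thrown by HomeController.
 * The real implementation would probably take a BindingResult and read the
 * field name and default message from its first FieldError; here the
 * already-extracted values are passed in directly.
 */
class FruitValidationException extends RuntimeException {

    private final String rejectedField;
    private final String rejectedMessage;

    public FruitValidationException(String rejectedField, String rejectedMessage) {
        // compose the exception message the same way the @ExceptionHandler does
        super(rejectedField + " " + rejectedMessage);
        this.rejectedField = rejectedField;
        this.rejectedMessage = rejectedMessage;
    }

    public String getRejectedField() {
        return rejectedField;
    }

    public String getRejectedMessage() {
        return rejectedMessage;
    }
}
```

The @ExceptionHandler method then only has to concatenate the two accessors to build the "error" string stored in the model.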

 

Model and validation

 

The domain model used on the server is a couple of simple POJOs annotated with JSR-303 annotations (using Hibernate Validator implementation). Here is the class for com.github.springui5.model.HomeModel.

 

public class HomeModel implements Serializable {
    private static final Logger logger = LoggerFactory.getLogger(HomeModel.class);
    private List<Fruit> listOfFruit;
    private String error;
    public List<Fruit> getListOfFruit() {
        return listOfFruit;
    }
    public void setListOfFruit(List<Fruit> listOfFruit) {
        this.listOfFruit = listOfFruit;
    }
    public String getError() {
        return error;
    }
    public void setError(String error) {
        this.error = error;
    }
    public HomeModel() {
        listOfFruit = new ArrayList<>(Arrays.asList(new Fruit("apple", 1), new Fruit("orange", 2)));
    }
    public HomeModel add(Fruit fruit) {
        // set id, it is 0 after deserializing from Json
        fruit.setId(Fruit.newId());
        listOfFruit.add(fruit);
        return this;
    }
    public HomeModel delete(final long id) {
        CollectionUtils.filter(listOfFruit, new Predicate() {
            @Override
            public boolean evaluate(Object object) {
                return ((Fruit) object).getId() != id;
            }
        });
        return this;
    }
    public HomeModel update(final Fruit fruit) {
        // find the fruit with the same id
        Fruit oldFruit = (Fruit) CollectionUtils.find(listOfFruit, new Predicate() {
            @Override
            public boolean evaluate(Object object) {
                return ((Fruit) object).getId() == fruit.getId();
            }
        });
        // update the fruit
        oldFruit.setName(fruit.getName());
        oldFruit.setQuantity(fruit.getQuantity());
        return this;
    }
    public HomeModel storeError(String error) {
        this.error = error;
        return this;
    }
    public HomeModel clearError() {
        this.error = null;
        return this;
    }
    public HomeModel show() {
        logger.debug(Arrays.toString(listOfFruit.toArray()));
        return this;
    }
}
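The delete and update methods above use Commons Collections' Predicate callbacks, which were the idiomatic choice before Java 8. On Java 8 and later, the same filtering and lookup can be done with the standard library alone. The sketch below uses a simplified stand-in class (Item) for Fruit, since only the id field matters for these two operations:

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import java.util.Optional;

class StreamHomeModelSketch {

    // Simplified stand-in for Fruit; only the id is relevant here.
    static class Item {
        final long id;
        String name;
        Item(long id, String name) { this.id = id; this.name = name; }
    }

    // Equivalent of CollectionUtils.filter(...) in delete(id):
    // removes every element whose id matches the given one.
    static void delete(List<Item> list, long id) {
        list.removeIf(item -> item.id == id);
    }

    // Equivalent of CollectionUtils.find(...) in update(fruit):
    // returns the first element with the given id, if any.
    static Optional<Item> find(List<Item> list, long id) {
        return list.stream().filter(item -> item.id == id).findFirst();
    }

    public static void main(String[] args) {
        List<Item> items = new ArrayList<>(Arrays.asList(new Item(1, "apple"), new Item(2, "orange")));
        delete(items, 1);
        System.out.println(items.size());               // 1
        System.out.println(find(items, 2).isPresent()); // true
    }
}
```

Returning an Optional from find also makes the "fruit not found" case explicit, which the Predicate-based version silently ignores before dereferencing oldFruit.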

And here is the com.github.springui5.domain.Fruit class.

 

public class Fruit implements Serializable {
    private static long offset = 0L;
    private long id;
    @NotNull
    @NotBlank
    private String name;
    @NotNull
    @Min(1)
    private int quantity;
    /**
     * Returns a new value for {@code id} attribute. Uses timestamp adjusted with the static offset. Used only for
     * illustration.
     */
    public static long newId() {
        return System.currentTimeMillis() + offset++;
    }
    public Fruit() {
        // default constructor
    }
    public Fruit(String name, int quantity) {
        this.id = Fruit.newId();
        this.name = name;
        this.quantity = quantity;
    }
    public long getId() {
        return id;
    }
    public void setId(long id) {
        this.id = id;
    }
    public String getName() {
        return name;
    }
    public void setName(String name) {
        this.name = name;
    }
    public int getQuantity() {
        return quantity;
    }
    public void setQuantity(int quantity) {
        this.quantity = quantity;
    }
    @Override
    public boolean equals(Object obj) {
        return obj instanceof Fruit && ((Fruit) obj).getId() == id;
    }
    @Override
    public int hashCode() {
        // keep hashCode consistent with the id-based equals above
        return Long.valueOf(id).hashCode();
    }
    @Override
    public String toString() {
        return "Fruit [id: " +
                id +
                ", name: " +
                name +
                ", quantity: " +
                quantity +
                "]";
    }
}

 

Upon the initial request for the model data (/home) this is what the controller returns. Notice how the list of Fruit domain objects was automatically serialized to Json for us.

 

[Screenshot: init.png — the Json response to the initial /home request]

 

If an invalid value is submitted as part of the request body (for example a quantity of 0 when adding a new fruit), it is automatically picked up by Spring and assigned to the org.springframework.validation.BindingResult parameter of the corresponding request handling method. The application then exposes the validation error message as the value of the model's "error" attribute.

 

Testing the application

 

This is a standard Maven application which needs some mandatory dependencies to compile and run.

 

<!-- all of the necessary Spring MVC libraries will be automatically included -->
<dependency>
  <groupId>org.springframework</groupId>
  <artifactId>spring-webmvc</artifactId>
  <version>4.0.0.RELEASE</version>
</dependency>
<!-- need this for Jackson Json to Java conversion -->
<dependency>
  <groupId>com.fasterxml.jackson.core</groupId>
  <artifactId>jackson-databind</artifactId>
  <version>2.3.0</version>
</dependency>
<!-- need this to use JSR 303 Bean validation -->
<dependency>
  <groupId>org.hibernate</groupId>
  <artifactId>hibernate-validator</artifactId>
  <version>5.0.2.Final</version>
</dependency>

It also uses the Tomcat Maven plugin for running the project in an embedded Tomcat 7 with mvn tomcat7:run from the command line.
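The corresponding plugin declaration in pom.xml would look roughly like this (the version number is an assumption; check the project's actual pom.xml):

```xml
<!-- runs the web application in an embedded Tomcat 7 via "mvn tomcat7:run" -->
<plugin>
  <groupId>org.apache.tomcat.maven</groupId>
  <artifactId>tomcat7-maven-plugin</artifactId>
  <version>2.2</version>
</plugin>
```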

 

Conclusion

 

Using Spring MVC with OpenUI5 in the way we have described here has some advantages. We can easily set up a REST-like endpoint which automatically converts Json payloads to Java domain objects, allowing us to concentrate on manipulating the model in Java without worrying about how the changes will be reflected in the JavaScript on the client side. We can also plug in annotation-based validation of the domain objects (JSR 303) using Spring's validation mechanism. This allows us to process all business logic validation on the server side in a declarative and transparent manner, leaving only checks for formatting errors on the client side.

 

There are some disadvantages to this approach, however, the main one being, of course, that we return the entire model for each request, which results in unnecessarily large data transfers. This should not be a limitation for relatively simple views, but it can become a problem for complicated views with a lot of data.

Open source is changing the way software is being developed and consumed. It is also SAP’s intention to contribute to open source and to integrate open source into its product line. With that intention, the OData JPA Processor Library headed off the open source way a few months back, and yes, we are now open source software, alongside the OData Library (Java), at the Apache Software Foundation (ASF); see the Apache Olingo project for details.

 

 

The OData JPA Processor Library is a Java library for transforming Java Persistence API (JPA) models based on the JPA specification into OData services. It is an extension of the OData Library (Java) that enables Java developers to convert JPA models into OData services. For more details check SAP OData Library Contributed to Apache Olingo (Incubator), which gives you an introduction to why OData and to the features of the OData Library (Java).

 

The artifacts to get started with the OData JPA Processor Library, the documentation, and the code are all available on Apache Olingo. The requirements for building an OData service based on a JPA model are quite low; for a quick start you can refer to the following tutorial. In short, you just have to create a web application project in Eclipse (both the Kepler and Juno versions are supported), implement a factory to link to the JPA model, and register the factory class within the web.xml file; it’s that simple. The OData JPA Processor Library also supports more advanced features, such as the ability to redefine the metadata of the OData services (for example, renaming entity type names and their properties) and to add additional artifacts like function imports to the OData service.

 

So if you are on the lookout for reliable and easy-to-use software for transforming JPA models into OData services, you now know where to go. The libraries are out there in the open; please do explore them, extend them, and let us know the new faces you give them. Use the mailing list available here not just to let us know how you have used the libraries but also to report bugs and ask questions; the team will be glad to hear from you and to answer your queries.

 

With that, we say we have arrived in the open source software world! And this is just the beginning; there is more to come (or at least that is the intent), so keep an eye on the Apache Olingo project.

 

Related Information:

In today’s mobile and agile business environment, it is important to unlock the enterprise data held by applications and other systems, and to enable its consumption from anywhere. The Open Data Protocol (OData), on its way to being standardized by Microsoft, IBM, SAP and many other companies within OASIS (an international standards body for advancing open standards for the information society), provides a solution to simplify data sharing across applications in enterprises, in the cloud, and on mobile devices. OData leverages proven Internet technologies such as REST, ATOM and JSON, and provides a uniform way to access data as well as data models.

 

SAP recently contributed the Java OData Library to Apache Olingo (Incubator). After just a few days in public, we got a lot of interest from other companies. That makes us confident we can build up a community working on evolving this library towards the latest version, the upcoming OData standard resulting from the standardization process at OASIS.

 

Talking about features, there is already a lot to be discovered in the library. Since the Entity Data Model, URI parsing including all system query options, and (de)serialization for ATOM/XML and JSON are already supported, one can build OData services supporting advanced read/write scenarios. Features like $batch are currently being added; conditional handling, advanced client support and detailed documentation are on the roadmap for the upcoming months.

 

The guiding principles during the implementation of the OData Library were to be compliant with the OData 2.0 specification and to have an architecture in place that allows enhancing the library in a compatible manner as much as possible. The clear separation between Core and API, and keeping dependencies down to a minimum, are important. The community should have the option to build extensions for various data sources on top of the library. The JPA Processor, provided as one additional module, is an excellent example of such an extension.

 

Besides the Core and API packages there is also an example provided in the ref and ref.web packages, in order to show the features in an OData service implementation and to allow integrating new features into that service, also for full integration tests (fit).

 

We’ll keep you posted once the first release is available to digest. You can already dig into the code and provide bug reports, feature requests and questions via Jira or the mailing list. All the information is available in the support section of the web site.

 

Further Information:

Hi

 

SAP's software is known for its role running many of the world's largest companies, but not necessarily for its user-friendliness. As part of an ongoing effort to change this perception, SAP unveiled Fiori, a set of 25 lightweight "consumer-friendly" applications that can run on desktops, tablets and mobile devices, on Wednesday at the Sapphire conference in Orlando.

Fiori applications are written in HTML5, which makes multiplatform deployments possible. They also target some of the most common business processes a user might perform, such as creating sales orders or getting their travel expenses approved, according to SAP's announcement.

SAP has grouped the initial Fiori applications into four separate employee types, including manager, sales representative, employee and purchasing agent. Fiori is priced per user and available now, but specific costs weren't disclosed Wednesday. It's possible to deploy Fiori as a single group of applications, as well as separate Web applications and within portals, according to a statement. Some 250 customers helped SAP develop Fiori and make the apps more user-friendly, SAP said.

SAP has basically been compelled to develop something like Fiori, according to one observer. "Customers want enterprise-class apps with consumer-grade experiences," said analyst Ray Wang, CEO of Constellation Research. "Fiori is one of the ways SAP customers can pull the data out of their existing systems, and democratize that information so that everyone can benefit from access to the SAP system."

"For years, the issue was that SAP data was hidden or not easily accessed," Wang added. "This is one small step to make that change."

SAP's App Haus, a startup-like development group within the company, has been working to create more usable and appealing application interfaces. It wasn't immediately clear Wednesday whether the App Haus team is involved with Fiori.

The vendor has also launched a product called Screen Personas, which gives users the ability to rejigger SAP software screens to better fit their job role and personal preferences. There's plenty more to come, SAP co-CEO Jim Hagemann Snabe said during a keynote.

 

Thank You

As some of you might know, SAP is a contributor to the open source project Eclipse. As part of that engagement we also organize so-called "Eclipse DemoCamps" to show what one can do with this great development platform, which is used a lot in the IT industry and is also the IDE of choice for SAP HANA Cloud Platform.

 

This year's Eclipse DemoCamp will be held on the planned release date for Kepler, the release name of Eclipse 4.3.

 

In case you are interested in joining the event, you can register for free or even propose a speaking slot at the Eclipse DemoCamp and join speakers like Mike Milinkovich, the Executive Director of the Eclipse Foundation.

You'll be able to listen to interesting talks, get free drinks and food, and have plenty of opportunities to connect with other developers during the event.

 

So register today and join us in Walldorf for the Eclipse DemoCamp.

 

Best,

Rui

Welcome to the last episode about the SAP Open Source Summit 2012. In the first three episodes, I shared my overall impressions, key parts of my presentation on the corporate open source strategy, and insights from guest keynotes, respectively. Now I want to finish with a few areas that we are focusing on next.

 

Given that we now have an open source strategy in place at SAP, the focus is now much more on execution. Whenever you want to execute a strategy, you may run into a number of challenges, be they of a technical, organizational or procedural nature. Specifically with regard to the much stronger use of open source and contribution to open source projects, we have found four major types of challenges: reuse and versioning, alignment of release schedules, product security, and long-term support, as shown in the slide below. I used this slide also in my keynote last week at ApacheCon Europe 2012 to illustrate key challenges for managing open source from an enterprise perspective.

 

[Slide: ApacheConEurope_Keynote_CvonRiegen.png — key challenges for managing open source from an enterprise perspective]

 

  • Reuse and Versioning
    If you focus on a given open source foundation like the Apache Software Foundation or the Eclipse Foundation, it is very likely that due to the nature of the foundation's development processes there is not much overlap between the various open source technologies. But open source projects today in many cases start somewhere else and don't necessarily follow a defined governance model. This is good since it is more flexible and allows the emergence of many different ideas. For example, GitHub today has almost 4.3 million projects. But the lack of coordination is also a challenge in that it becomes more difficult to find the right technologies and to understand their level of maturity and adoption. And it increases the likelihood of overlapping and similar technologies. This is not necessarily bad, but it can become a management challenge of open source adoption in the enterprise.

    One particular challenge in this regard is that different product teams might choose different technologies for the same purpose. The integration of, for example, four different open source XML parsers into our product line can result in maintenance overhead: four different technologies need to be maintained over the course of the products' support schedule. Another challenge is the selection of the right version of the open source technology. Even if all SAP product teams agree to select the same open source XML parser, they may have done so over the course of a few years and have chosen different versions of that XML parser. It is preferable to use only one version, and actually the most recent stable one, for all products, because this also minimizes the risk of missing important security bug fixes in newer versions. But the upgrade might get complicated, for example, if the XML parser's APIs that were used to integrate it into the SAP product have changed incompatibly.
  • Release Schedules
    The adoption of the most recent stable version of an open source technology is a reasonable goal to pursue on its own. But since the release schedules of the open source technology and the SAP product are not the same, this is not necessarily an easy task and some updates of the embedded open source technology might become necessary after the SAP product has already been shipped. There are some exceptions like the Eclipse Foundation that has decided to deliver one stable release of the Eclipse Platform once per year - this simplifies the planning exercise and allows us to, for example, develop a stable Eclipse release train SAP-internally on a yearly basis. The whole exercise can get more complicated when SAP is an active contributor to the open source project. By no means is there a guarantee that the extensions we developed SAP-internally will be adopted without any changes by the open source project lead and in time before the SAP product is being released to the market. This means that we sometimes need to live with a fork of the open source technology. The operational recommendation is to avoid such forks and to always seek a close alignment between the open source standard version and the version embedded in the SAP product.
  • Security
    Product security and security response management for SAP products clearly needs to include a responsibility for fixing security vulnerabilities in embedded open source technologies. On the one hand - since the source code of the open source technology is openly available and used by many other firms - there are more sources for finding potential security vulnerabilities. On the other hand, this results in an obligation to respond even more quickly to such findings or adopt known resolutions. Enterprise customers need to be able to patch their systems with respective bug fixes before a vulnerability is discussed in public. This may sound like a paradox since further development of the open source technology happens "in the open." But there are resolutions - the Apache Software Foundation, for example, has a process by which security vulnerabilities can be reported on a mailing list that is only accessible to the leads of the respective open source project. Until a resolution is available, the conversation continues in a private environment. The vulnerability is finally reported publicly only once a bug fix has been made available. This significantly limits the risk that IT user organizations continue to run systems that are exploitable due to known vulnerabilities.
  • Long-Term Support
    SAP products are typically supported for at least seven years. Customers expect support for the complete solution, not just for the software components that were developed SAP-internally. From a support perspective, customers actually shouldn't need to know which open source technologies have been embedded. Consequently, we need to provide a means to support the respective open source technologies, including the application of bug fixes as necessary. Fortunately, the various quality checks that are applied when selecting open source technologies at SAP radically reduce the number of necessary bug fixes. But bugs can never be completely avoided, and sometimes a fix is needed for version 3.2 when the open source project has already released version 6.0 ...
    In principle, there are two main options: either to upgrade the embedded open source technology to version 6.0 or to apply a local bug fix to version 3.2. Both options can be rather complicated. An alternative is to join forces with other interested open source developers and users and to share the cost of maintaining older versions of the open source technology. SAP has long advocated such an approach for Eclipse projects. The recent establishment of the Long Term Support Industry Working Group (LTS IWG) is a good step in this direction and fortunately, a number of other Eclipse members do see the same need and have joined the LTS IWG to solve this important dilemma.

 

So I hope that this blog post series about the SAP Open Source Summit 2012 has been useful for you. From my perspective, it was good to see a strong focus on operational excellence, knowledge sharing and applying open source development principles SAP-internally. Let me conclude with Mike Milinkovich's words: "Open source software is really really mainstream." And the journey will continue.

 

Here is the overview of the blog post series again:

  1. Overall Impressions
  2. What we think - SAP Open Source Strategy
  3. Insights from keynotes
  4. What we do next (this post)

In this blog post we'll look at how one can use the Spring Security framework together with the usual way of securing a JEE application on NetWeaver 7.3, and what this can bring us.

 

 

For reference, I've tested this setup with Spring Security 3.1.3.RELEASE (which in turn depends on Spring MVC 3.0.7.RELEASE).

 

Spring Security is a popular and very flexible framework which allows one to configure and manage all aspects of securing a web application: authentication, authorization, and access control to domain objects. One of the many useful things this framework provides is the spring-security-taglibs module, which allows one to protect various pieces of a JSP page using security tags tied to the user's role membership (authorization). For example, we can have a JSP like this:

 

<%@ taglib prefix="sec" uri="http://www.springframework.org/security/tags"%>
<p>Hello, this page is accessible to all users with ROLE_EVERYONE</p>
<sec:authorize access="hasRole('ROLE_SFLIGHT_USER')">
    <p>This text and link below should only be visible to users with
    ROLE_SFLIGHT_USER</p>
    <a href="<%=request.getContextPath() %>/secure">secure page</a>
</sec:authorize>

 

Notice the use of the sec:authorize tag which protects access to part of the page depending on whether or not the current user has the role SFLIGHT_USER. We'll discuss below how we can configure Spring Security to work seamlessly with the security services provided by our JEE container. The idea behind this integration is quite simple: NetWeaver already handles the user authentication for us, and there is also a mechanism to map the UME roles of the portal user to the roles referenced in the web.xml of our application. All we have to do is find a way to make the Spring Security framework recognize these roles as the "granted authorities" associated with the authenticated user.

 

We start with a simple web application setup. Here are the interesting parts of our starting web.xml:

 

<login-config>
    <auth-method>TICKET</auth-method>
</login-config>
<security-role>
    <role-name>EVERYONE</role-name>
</security-role>
<security-role>
    <role-name>SFLIGHT_USER</role-name>
</security-role>
<security-constraint>
    <web-resource-collection>
        <web-resource-name>Spring Security Integration Test Application</web-resource-name>
        <url-pattern>*</url-pattern>
        <http-method>GET</http-method>
        <http-method>POST</http-method>
    </web-resource-collection>
    <auth-constraint>
        <role-name>EVERYONE</role-name>
    </auth-constraint>
    <user-data-constraint>
        <transport-guarantee>NONE</transport-guarantee>
    </user-data-constraint>
</security-constraint>

 

We chose the "TICKET" authentication method for our web application, meaning that the user will be considered authenticated if he or she previously logged in on the portal and the JSESSIONID and MYSAPSSO2 cookies are present in his or her browser session (the usual SAP SSO via SAP Logon Ticket). We declare two security roles, "EVERYONE" and "SFLIGHT_USER", for our example. The first one is a general role assigned to every UME user. The other one is an example role which we can create and assign to some test user. We then protect any access to our web application by setting the url-pattern of the security constraint to "*". The idea is that we let the JEE container manage the authentication of the user, while delegating the finer-grained access protection within the application to Spring Security. For the mapping between the UME roles and the web application security roles we also need this in the web-j2ee-engine.xml file of the web application module:

 

<security-role-map>
    <role-name>EVERYONE</role-name>
    <server-role-name>EVERYONE</server-role-name>
</security-role-map>
<security-role-map>
    <role-name>SFLIGHT_USER</role-name>
    <server-role-name>SFLIGHT_USER</server-role-name>
</security-role-map>

 

Now we need to set up and configure the filter chain used by Spring Security. This is done by specifying a Spring application context file containing all the security configuration in our web.xml file:

 

<listener>
    <listener-class>org.springframework.web.context.ContextLoaderListener</listener-class>
</listener>
<context-param>
    <param-name>contextConfigLocation</param-name>
    <param-value>/WEB-INF/context/security-config.xml</param-value>
</context-param>
<filter>
    <filter-name>springSecurityFilterChain</filter-name>
    <filter-class>org.springframework.web.filter.DelegatingFilterProxy</filter-class>
</filter>
<filter-mapping>
    <filter-name>springSecurityFilterChain</filter-name>
    <url-pattern>/*</url-pattern>
</filter-mapping>

 

The security configuration file, security-config.xml, uses the Spring Security namespace for convenience. It uses the "pre-authentication" security scenario and relies on two classes provided by the framework: J2eePreAuthenticatedProcessingFilter, which integrates with the container authentication process by extracting the user principal from the HttpServletRequest, and J2eeBasedPreAuthenticatedWebAuthenticationDetailsSource, which is responsible for mapping a configured set of the security roles specified in the web application descriptor file to a set of GrantedAuthorities, provided the user's membership in these roles has been established. Here is the security-config.xml file:

 

<beans:beans xmlns:beans="http://www.springframework.org/schema/beans"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns="http://www.springframework.org/schema/security"
    xsi:schemaLocation="http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans-3.1.xsd
        http://www.springframework.org/schema/security http://www.springframework.org/schema/security/spring-security-3.1.xsd">
    <http auto-config="true" use-expressions="true">
        <jee mappable-roles="EVERYONE,SFLIGHT_USER" />
        <intercept-url pattern="/secure/**" access="hasRole('ROLE_SFLIGHT_USER')" />
        <intercept-url pattern="/**" access="hasRole('ROLE_EVERYONE')" />
    </http>
    <authentication-manager>
        <authentication-provider ref="preAuthAuthenticationProvider"></authentication-provider>
    </authentication-manager>
    <beans:bean id="preAuthAuthenticationProvider"
        class="org.springframework.security.web.authentication.preauth.PreAuthenticatedAuthenticationProvider">
        <beans:property name="preAuthenticatedUserDetailsService">
            <beans:bean
                class="org.springframework.security.web.authentication.preauth.PreAuthenticatedGrantedAuthoritiesUserDetailsService"></beans:bean>
        </beans:property>
    </beans:bean>
</beans:beans>

 

Notice the use of the "jee" element in the "http" configuration; it is a shortcut for configuring an instance of the J2eePreAuthenticatedProcessingFilter filter and registering it with the default filter chain. Note that the more specific "/secure/**" pattern must come before the catch-all "/**" pattern, since intercept-url rules are evaluated in order and the first match wins. Also, by default all the JEE security roles specified in the "mappable-roles" attribute will be mapped to GrantedAuthorities with names prefixed by "ROLE_".
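The effect of this default role mapping can be sketched in plain Java. The class and method names below are purely illustrative (they are not Spring Security classes); the sketch merely mimics what the "jee" support does conceptually: check each mappable role against what the container reports for the user and prefix the matching ones with "ROLE_".

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

// Hypothetical illustration of the default JEE role -> GrantedAuthority mapping.
public class J2eeRoleMapper {

    // Default attribute-to-authority conversion: prepend "ROLE_".
    public static String toGrantedAuthority(String j2eeRole) {
        return "ROLE_" + j2eeRole;
    }

    // Given the mappable roles and the subset the container confirmed
    // (conceptually via HttpServletRequest.isUserInRole()), derive the
    // granted authority names for the authenticated user.
    public static List<String> grantedAuthorities(List<String> mappableRoles,
                                                  List<String> userRoles) {
        List<String> authorities = new ArrayList<String>();
        for (String role : mappableRoles) {
            if (userRoles.contains(role)) {
                authorities.add(toGrantedAuthority(role));
            }
        }
        return authorities;
    }

    public static void main(String[] args) {
        List<String> mappable = Arrays.asList("EVERYONE", "SFLIGHT_USER");
        List<String> userHas = Arrays.asList("EVERYONE");
        // A user holding only EVERYONE ends up with ROLE_EVERYONE.
        System.out.println(grantedAuthorities(mappable, userHas));
    }
}
```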

 

Once the pre-authentication mechanism is successfully configured, we can protect URL access using expressions of the kind "hasRole('ROLE_FROM_UME_HERE')" in the global "http" configuration element and in the security tags in our JSPs. For reference, this is the Spring MVC configuration file, mvc-config.xml, which I've used for the web application; it uses the "mvc" namespace for convenience:

 

<beans xmlns="http://www.springframework.org/schema/beans"
    xmlns:mvc="http://www.springframework.org/schema/mvc" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="
        http://www.springframework.org/schema/beans 
        http://www.springframework.org/schema/beans/spring-beans-3.0.xsd
        http://www.springframework.org/schema/mvc 
        http://www.springframework.org/schema/mvc/spring-mvc-3.0.xsd">
    <mvc:annotation-driven />
    <bean id="viewResolver"
        class="org.springframework.web.servlet.view.UrlBasedViewResolver">
        <property name="viewClass"
            value="org.springframework.web.servlet.view.JstlView" />
        <property name="prefix" value="/WEB-INF/jsp/" />
        <property name="suffix" value=".jsp" />
    </bean>
    <mvc:view-controller path="/" view-name="index"/>
    <mvc:view-controller path="/secure" view-name="secure"/>
</beans>

 

It is referenced in the web.xml file in the standard way:

 

<servlet>
    <servlet-name>springMvcServlet</servlet-name>
    <servlet-class>org.springframework.web.servlet.DispatcherServlet</servlet-class>
    <init-param>
        <param-name>contextConfigLocation</param-name>
        <param-value>/WEB-INF/context/mvc-config.xml</param-value>
    </init-param>
    <load-on-startup>1</load-on-startup>
</servlet>
<servlet-mapping>
    <servlet-name>springMvcServlet</servlet-name>
    <url-pattern>/</url-pattern>
</servlet-mapping>

 

Using Spring Security in a standard JEE web application can be very useful. Other than the use of the security tags in JSPs, described in this post, one can think of securing method access in Java beans or using Access Control List (ACL) management, for example.

In the first two episodes of this blog post series about the SAP Open Source Summit 2012, I shared my overall impressions and key parts of my own presentation on the corporate open source strategy, respectively. Now I want to touch upon insights from the keynotes that our guest speakers delivered.

 

Mike Milinkovich, Executive Director, Eclipse Foundation, presented on Foundations 2.0. He offered his slides to be posted on SCN, see here. As already mentioned in my first blog post in this series, Mike stressed the point that it is commonly unknown that SAP is a strong contributor to the Eclipse Foundation. The Eclipse ecosystem is still growing - today, there are more than two million downloads per month. Also, Eclipse is known for its predictable, yearly releases and the level of corporate engagement, which was one of the differentiators when the foundation was created more than 10 years ago.

Recognizing a few trends in the software industry such as "software is everywhere" and "open source is really really mainstream", he characterized the next generation of open source foundations, which is also where Eclipse is going:

  1. Technology-agnostic: Eclipse is already much more than an IDE only and applies to practically all programming languages (including ABAP ...). The foundation will continue to serve additional purposes and welcome other technologies that can make use of the development and IP management principles that it is known for.
  2. Git-based: The adoption of Git as a version control and source code management system for distributed development is still growing, and Eclipse has decided to migrate to a Git-based common build infrastructure, which will presumably simplify the daily life of Eclipse committers.
  3. Long-term support: Particularly in enterprise environments, the requirement of long-term support (or sometimes long long-term support ...) is obvious, and the establishment of the Long Term Support Industry Working Group supports the development of respective development and support models.
  4. User-led: More and more end-user organizations from different industries observe the opportunities (or sometimes the need) to increase the level of collaboration, including the joint work on software projects. Eclipse already has a number of industry working groups such as Polarsys for embedded systems or LocationTech for location-aware software, and Mike believes that the trend of more end-user involvement will continue.

 

Andrew Aitken is SVP, Olliance Group, a Black Duck company and an open source consulting firm. He has consulted numerous software companies on how to embed open source development and licensing approaches in their corporate strategy. From his point of view, open source is already in its fourth generation. In the beginning, it was a rather extreme movement with rather extreme characters like Richard Stallman and Bruce Perens. Then, open source emerged and became a means for commercial success: Red Hat, MySQL and SpringSource are examples of commercially quite successful companies with an open source based business model. Thirdly, the more traditional vendors complement their business model with open source approaches - today, there is hardly any software vendor that does not have a defined approach towards open source. And the fourth generation is - in line with what Mike observes - the involvement of end-users. NYSE, Airbus, BMW and NASA, for example, are all quite actively engaged in open source projects that are specific to their industry vertical.

 

Dirk Riehle, Professor for Open Source Software at the University of Erlangen-Nürnberg, presented on one of his favorite topics: Inner Source. Inner Source is what he calls "open source best practices applied to firm-internal software development." He and his team have interviewed and worked with a number of corporations, both software vendors and end-user organizations, to see if the expected benefits of inner source (better code reuse, more knowledge sharing, improved resource allocation and higher job satisfaction) actually apply. While he generally confirms this, he distinguishes between two forms of inner source - volunteer (i.e., self-managed) and managed. Managed inner source requires a "defined and actively managed governance process." The right choice of model depends on the types of challenges an organization or a project is facing, which varies between developer skills, developer mindset and management mindset. More research from Dirk can be found on his blog.

 

Last but not least, Jono Bacon presented on communities and what makes them strong and vibrant. For him, a sense of belonging and a sense of purpose are the most important characteristics the individual community members should feel to make the community a coherent one. Communities are obviously not restricted to software development (and of course not to open source software), but many principles of successful communities can also be seen in software-related communities. For example, the importance of providing kudos to active members, particularly new ones who start contributing, cannot be overestimated. A simple "thank you" for committing a patch that resolves an annoying bug is key to keeping the contributor's sense of belonging and the overall community active. Jono has written the well-known book "The Art of Community", published by O'Reilly, and continues the dialogue on the Art Of Community Online.

 

Next time, I'll finish this series and write about what SAP is - presumably - going to do next in the context of open source software.

 

In terms of the SAP Open Source Summit 2012 blog series, here is the overview again:

  1. Overall Impressions
  2. What we think - SAP Open Source Strategy
  3. Insights from keynotes (this post)
  4. What we do next

In this blog I'll share with you an alternative to SAPUI5 development on the NetWeaver platform. SAPUI5 is an interesting and innovative effort from SAP on the front of modern UI development. But at the time of writing this post, it is still in an evaluation beta stage and it seems there is still no decision about the licensing scheme that SAP will adopt for this technology. Since NetWeaver has been able to take advantage of Java 5 and JEE 5, and especially since it has become very easy to deploy third-party JARs with a webapp on this platform, we have a multitude of free and open source alternatives available for developing a modern UI in the same way SAPUI5 has been designed. I will describe a way of using a very cool JavaScript framework, Dojo, together with my favorite MVC framework, Spring, to build a simple hello world webapp which demonstrates how these technologies can be used together on NetWeaver 7.3.

 

The sources for this webapp can be found in the Code Exchange project dojoui.

 

Note: I assume that you are already at least somewhat familiar with Spring MVC and Dojo frameworks.

 

Download a Dojo distribution from their site. I used the latest release, 1.8.1. You'll need to create a JAR containing the Dojo packages under the path /META-INF/resources/. The idea is that we will let Spring serve and cache these JavaScript files from this JAR for us, instead of bundling them as is in our WAR.

 

You also need to get all the JARs necessary for the Spring MVC setup. There are many ways to do this. I use Apache Ivy for this purpose. You can create a simple Java project in NWDS, import the single build.xml file from the Ivy site and create a simple target which will retrieve Spring's libraries and their dependencies for us from the Maven repository. Here is the target:

 

<target name="resolve-and-retrieve" depends="install-ivy" description="--> resolves dependencies declared in ivy.xml file">
    <ivy:resolve file="${basedir}/ivy.xml" transitive="true"/>
    <ivy:retrieve/>
</target>

 

It will resolve the dependencies declared in the ivy.xml file, download them and put them in the lib folder. Here are the contents of the ivy.xml file:

 

<info organisation="your.org" module="anything" />
<configurations defaultconfmapping="default->default"></configurations>
<dependencies>
    <dependency org="org.springframework" name="spring-webmvc" rev="3.1.0.RELEASE" />
    <dependency org="cglib" name="cglib" rev="2.2" />
    <dependency org="org.codehaus.jackson" name="jackson-mapper-asl" rev="1.9.10" />
</dependencies>

 

You might need an ivysettings.xml file as well; here it is:

 

<ivysettings>
    <settings defaultResolver="chain" />
    <resolvers>
        <chain name="chain">
            <ibiblio name="central" m2compatible="true"></ibiblio>
            <!-- uncomment if you want to use Spring's milestone libraries -->
            <!-- 
            <ibiblio name="spring-milestone" m2compatible="true"
                root="http://repo.springsource.org/milestone"></ibiblio>
            -->
        </chain>
    </resolvers>
</ivysettings>

 

After retrieving the libraries you should have all the JARs (together with the dojo-1.8.1.jar created above) needed to create a simple webapp. You can see all the JARs needed in the image of the webapp structure at the end of this post.

 

Create a simple webapp project in NWDS (tmp~dojo~web) and assign it to an EAR (tmp~dojo~ear). Copy all the JARs into the /WEB-INF/lib directory of your web project; they will be deployed to the server. Now we need to set up Spring MVC for our webapp. This requires registering Spring's DispatcherServlet in our web.xml file. Here it is:

 

<servlet>
    <servlet-name>SpringMvcDispatcher</servlet-name>
    <servlet-class>org.springframework.web.servlet.DispatcherServlet</servlet-class>
    <init-param>
        <param-name>contextClass</param-name>
        <param-value>org.springframework.web.context.support.AnnotationConfigWebApplicationContext</param-value>
    </init-param>
    <init-param>
        <param-name>contextConfigLocation</param-name>
        <param-value>ch.unil.dojo.web.config.WebConfig</param-value>
    </init-param>
    <load-on-startup>1</load-on-startup>
</servlet>
<servlet-mapping>
    <servlet-name>SpringMvcDispatcher</servlet-name>
    <url-pattern>/</url-pattern>
</servlet-mapping>

 

Spring MVC can be configured using Java annotations; this is how I have done it:

 

/**
 * Web context configuration to be processed by {@code
 * AnnotationConfigWebApplicationContext} and specified as {@code
 * contextConfigLocation} parameter for {@code DispatcherServlet}, see {@code
 * web.xml} file.
 * <p>
 * {@code EnableWebMvc} annotation configures default MVC infrastructure
 * including an instance of {@code MappingJacksonHttpMessageConverter} Json to
 * Java converter for request handlers. {@code ComponentScan} annotation
 * specifies the base package which will be scanned for the {@code Controller}
 * annotated classes.
 * 
 * @see org.springframework.web.servlet.DispatcherServlet
 * @see org.springframework.web.context.support.AnnotationConfigWebApplicationContext
 * @see org.springframework.web.servlet.config.annotation.EnableWebMvc
 * @see org.springframework.http.converter.json.MappingJacksonHttpMessageConverter
 */
@EnableWebMvc
@Configuration
@ComponentScan(basePackages = "ch.unil.dojo.web")
public class WebConfig extends WebMvcConfigurerAdapter {
    // set up the default view resolver, mapping logical view name
    // to a JSP with the same file name
    @Bean
    public ViewResolver viewResolver() {
        InternalResourceViewResolver viewResolver = new InternalResourceViewResolver();
        viewResolver.setPrefix("/WEB-INF/pages/");
        viewResolver.setSuffix(".jsp");
        return viewResolver;
    }
    // add handlers for the static resources (js, css, images),
    // will look up and cache Dojo files from the jar on the classpath
    @Override
    public void addResourceHandlers(ResourceHandlerRegistry registry) {
        registry.addResourceHandler("/resources/**").addResourceLocations("classpath:/META-INF/resources/")
            .setCachePeriod(31556926);
    }
    // set up the redirection to the main view if the root URL is accessed
    @Override
    public void addViewControllers(ViewControllerRegistry registry) {
        registry.addViewController("/").setViewName("index");
    }
}

 

Notice how we set up a special resource handler for all the Dojo JavaScript files, which will be served from the JAR on the classpath and even cached for faster subsequent access.

 

Now we can create our JavaScript front-end UI using Dojo. Create a JSP page called index.jsp in the /WEB-INF/pages/ folder. Spring will automatically use this view for all requests to the root of the web application. Here are the contents of this file:

 

<!DOCTYPE html>
<html>
<head>
<meta charset="UTF-8">
<title>Testing Dojo</title>
<link rel="stylesheet" type="text/css"
    href="<%=request.getContextPath()%>/resources/dijit/themes/claro/claro.css">
<script src="<%=request.getContextPath()%>/resources/dojo/dojo.js"
    data-dojo-config="async: true,
    packages: [{name: 'js', location: '<%=request.getContextPath()%>/resources/js'}],
    gfxRenderer: 'svg'"></script>
</head>
<body class="claro">
    <script type="dojo/require">at: "dojox/mvc/at"</script>
    <div data-dojo-type="dojox/mvc/Group" data-dojo-props="target: model">
        <div data-dojo-type="dijit/form/TextBox"
            data-dojo-props="value: at('rel:', 'name'), placeHolder: 'First Name',
                properCase: true, trim: true"></div>
        <div id="submitBtn" data-dojo-type="dijit/form/Button"
            data-dojo-props="label: 'Submit'"></div>
        <div id="surfaceDiv"></div>
    </div>
    <script>
    require(["js/utils", "dojo/_base/kernel", "dojo/when", "dojo/parser", "dojo/json", "dojo/Stateful", 
             "dojo/on", "dojo/mouse", "dijit/registry", "dojox/gfx", "dojox/gfx/fx", "dojo/colors"],
            function(utils, kernel, when, parser, json, Stateful,
                    on, mouse, registry, gfx, gfxAnim, Color){
        //create a model
        var model = kernel.global.model = new Stateful();
        //parse the document, then connect the submit button
        when(parser.parse(), function() {
            //create SVG surface
            var surface = gfx.createSurface("surfaceDiv", 400, 100);
            registry.byId("submitBtn").on("click", function(evt){
                //make Ajax request
                when(utils.ajaxRequest("<%=request.getContextPath()%>/greet",
                        json.stringify(model)), function(data) {
                     //clear previous shape
                    surface.clear();
                    //create text shape
                    var text = surface
                        .createText({x: 200, y: 50, text: data.greeting, align: "middle"})
                        .setFont({family: "Arial", size: "20pt", weight: "bold"})
                        .setFill(Color.named.skyblue);
                     new gfxAnim.animateTransform({
                         duration: 1500,
                         shape: text,
                         transform: [{
                             name: "rotategAt",
                             start: [0, 200, 50],
                             end: [360, 200, 50]
                         }]
                     }).play();
                });
            });
        });
    });
    </script>
</body>
</html>

 

This is a simple hello world webapp, but with a little Dojo twist. It collects the user's name and sends it to the Spring controller in the form of an Ajax request, with the name serialized as a Json string. On the controller's side, the Json string from the body of the request is automatically converted to a POJO of the corresponding structure by the MappingJacksonHttpMessageConverter instance registered with Spring MVC by default (it's part of the EnableWebMvc configuration). Once the greeting message is generated, it is stored as a value of the POJO's property, and the POJO is serialized to the body of the HTTP response as a Json string, again automatically, by the Jackson mapper. The returned Json response is parsed by Dojo, which displays the greeting message as a rotating SVG text shape.

 

As you can see, we don't actually use a lot of JSP functionality; it is mostly a plain HTML page with some JavaScript. However, we need a reference to the context path of the webapp, which we can obtain by using the scriptlet <%=request.getContextPath()%>. We also use a custom module, "js/utils", which Dojo is instructed to look up under the /resources/js location. I have put this file on the classpath (src folder) under /META-INF/resources/js/utils.js; this way Spring's resource handler can pick it up as well. Actually, this is the only way I found to reference my custom module on NetWeaver if I wanted to have it served by Spring's resource handler. Here are the contents of this file; it's just a simple utility to generate an asynchronous Ajax request to the server using Dojo's handy "dojo/request" module.

 

define(["dojo/request", "dojo/Deferred"], function(request, Deferred){
    return {
        ajaxRequest: function(path, data, sync){
             var def = new Deferred();
            request.post(path, {
                headers: {"Content-Type": "application/json"},
                data: data,
                handleAs: "json",
                sync: (sync || false)
            }).then(function(data){
                def.resolve(data);
            },
            function(error){
                def.reject(error);
            });
            return def;
        }
    };
});

 

The controller class which is responsible for handling the Ajax request is given below:

 

/**
 * Controller which will handle all incoming HTTP requests. Registered in the
 * MVC infrastructure via {@code ComponentScan} annotation used in the web
 * context configuration class.
 */
@Controller
public class AjaxController {
    private static final Logger logger = Logger.getLogger(AjaxController.class);
    /**
     * Request handling method applicable to any POST request with a header
     * {@code Content-type: application/json} and the path {@code /greet}.
     * Spring automatically converts the Json string from the body of the
     * request via {@code MappingJacksonHttpMessageConverter} to an instance of
     * {@code Greeting} form and serializes it back to a Json string set to the
     * body of the response.
     * 
     * @param form person data
     * @return greeting data
     * 
     * @see org.springframework.http.converter.json.MappingJacksonHttpMessageConverter
     */
    @RequestMapping(value = "/greet", method = RequestMethod.POST, consumes = "application/json", produces = "application/json")
    public @ResponseBody
    Greeting greet(@RequestBody Greeting form) {
        logger.debug("Processing Ajax request for /greet with form: " + form);
        form.setGreeting("Hello, " + form.getName() + "!");
        return form;
    } 
}
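The Greeting form bean referenced by the controller is not shown in the post. A minimal, hypothetical version that would work with the default Jackson mapping (a default-constructible POJO whose getter/setter names match the Json field names) could look like this:

```java
// Hypothetical Greeting form bean (not shown in the original post).
// Jackson maps {"name": "..."} to setName(...) and serializes both
// properties back to Json via their getters.
public class Greeting {
    private String name;
    private String greeting;

    public String getName() { return name; }
    public void setName(String name) { this.name = name; }

    public String getGreeting() { return greeting; }
    public void setGreeting(String greeting) { this.greeting = greeting; }

    @Override
    public String toString() {
        return "Greeting [name=" + name + ", greeting=" + greeting + "]";
    }
}
```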

 

As you can see, Spring makes it very easy for us to work with these kinds of requests, and thus it is a particularly good choice for server-side Ajax processing.

 

Here is the layout of the entire webapp project in NWDS for reference:

 

 

We can recap the things that we have used in this showcase and the advantages of this architecture.

 

1) Dojo front-end

 

Dojo is an excellent JavaScript framework: it is rich and constantly growing, uses AMD module loading, has a vibrant community, and it is open source. The learning curve for Dojo is probably not much steeper than the one for SAPUI5, considering that you would have to learn jQuery with SAPUI5 at some point. The advantage of using Dojo is that you don't need NetWeaver at all during development. I actually developed the entire webapp in STS with Tomcat 6. Of course you will not get the code assist for the JavaScript files as you have with the SAPUI5 plugin, but at least it is free.

 

2) Spring MVC

 

The Spring MVC framework makes it very easy to configure a clean separation of concerns and a RESTful architecture for a webapp, especially when it comes to conversion to and from JSON requests. It all happens behind the scenes with the Jackson mapper, allowing one to implement server-side logic with simple POJOs. If you like this setup, you can easily imagine using Spring Security to make your webapp secure, or using Spring CCI with an EJB session facade to connect to an ABAP backend.
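For context, the Jackson-based JSON conversion is typically enabled with very little Spring configuration. The actual configuration of this project is not shown in the post, so the following is only a hypothetical Spring 3.x sketch; the base package and schema versions are assumptions.

```xml
<!-- Hypothetical sketch; the base package and schema versions are
     assumptions, not taken from the actual project. -->
<beans xmlns="http://www.springframework.org/schema/beans"
       xmlns:mvc="http://www.springframework.org/schema/mvc"
       xmlns:context="http://www.springframework.org/schema/context"
       xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
       xsi:schemaLocation="http://www.springframework.org/schema/beans
           http://www.springframework.org/schema/beans/spring-beans-3.0.xsd
           http://www.springframework.org/schema/mvc
           http://www.springframework.org/schema/mvc/spring-mvc-3.0.xsd
           http://www.springframework.org/schema/context
           http://www.springframework.org/schema/context/spring-context-3.0.xsd">

    <!-- Detects @Controller classes such as the AjaxController above -->
    <context:component-scan base-package="com.example.web"/>

    <!-- Registers MappingJacksonHttpMessageConverter (among others), so
         @RequestBody/@ResponseBody work with JSON without further setup -->
    <mvc:annotation-driven/>
</beans>
```

The key element is mvc:annotation-driven, which registers the HTTP message converters (including the Jackson one) that do the POJO-to-JSON work behind the scenes.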

In the first episode, I shared my overall impressions from the SAP Open Source Summit 2012. Now I want to share key parts of my own presentation, which mostly focused on the corporate open source strategy, its reasoning and how we operationalize it.

 

First of all, there is no room for a stand-alone open source strategy. To make sense, it clearly needs to support the overall product strategy. Secondly, it needs to allow us to gain something, for example, to increase development productivity or create more value-added features. Otherwise, we could just live with the not-invented-here syndrome and develop all our products in a completely proprietary manner. And thirdly, it needs to be executable without compromising our product standards and quality goals.

 

Let's now explore the strategy itself. It centers around three dimensions: usage of open source in SAP products, contribution to open source projects, and interoperability with open source technologies. The picture below says it all.

 

OSS_Strategy.png

 

That's it. Pretty straightforward. Essentially, it describes what we have already been doing for quite some time in some areas. No big surprise. But now it is written down, explains the business benefits, and we can focus on its execution. Of course, there are a number of questions. What does "proven, broadly adopted" mean? When exactly should we contribute to open source projects, and which parts of our products? And which interoperability scenarios are important from our customers' perspective?

 

OK, let's focus on the execution and see how we can answer the operational questions. Here are four strategic recommendations providing more guidance on how to execute the strategy.

 

Recommendation1.png

 

Looking a few years back - when open source at SAP was managed as an exception - I can say that this is pretty much a 180° turn in guidance. Of course, there can be many reasons why an open source product is not the right choice. It may not be a good functional fit, or it may be expensive to make it fit. There might be doubts about the future of the project. The open source license may not be compatible with the intended use case. But if such key questions can be answered positively, the use of open source is the better economic decision for the reasons mentioned above.

 

Here is recommendation #2.

 

Recommendation2.png

 

In the majority of cases, the open source software embedded in SAP products doesn't need any modifications. This is due to the fact that we always analyse the maturity of the software and determine the adoption rate in the industry. But sometimes, there is a need to fix a bug or to modify or enhance the software. In those cases, it would be a mistake to keep these changes proprietary. This is where Mike Milinkovich would say "keeping it proprietary is not an asset - it is a liability." It is economically the better choice to contribute the changes back to the open source project so that they can be included in the next version of the original open source software. This significantly reduces the cost of reintegrating the newest version of the open source software. And - through our active participation - it protects the future of the open source project and helps avoid the "tragedy of the FOSS commons" identified by Schweik and English: open source software can only be successful if there is continued contribution from diverse sources.

 

This covers contributions to existing open source projects. But what are the factors to determine when to start a new open source project? This is what recommendation #3 is all about.

 

Recommendation3.png

 

Like recommendation #1, this is a major turn in our approach to open source compared to a few years back. Whereas in the past every single line of proprietary code was considered intellectual property that by all means needed to be kept proprietary, we now do a much more comprehensive cost-benefit analysis and include benefits such as protection of investments through standardization and utilization of external contributions. Software components that are not SAP-specific and non-differentiating can actually benefit a lot from being made available as open source - we can simply avoid competing with a proprietary approach in cases where competition increases cost (instead of increasing revenues). The level playing field created by providing the software as open source invites others to join us in making the software more relevant in the future. The Eclipse Memory Analyzer was the first official open source project created by SAP under this new approach (of course, SAP DB / SAP MaxDB was open source for many years before) and allowed us - in this case - to join forces with IBM and others to make it work on other Java Virtual Machines, for example.

 

Taking contributions to existing projects and creation of new projects together, SAP developers contribute to more than 50 open source projects. A selection of contributions is listed at this SCN page.

 

Finally, recommendation #4 covers interoperability between SAP and open source technologies.

 

Recommendation4.png

 

In many cases, SAP customers already use open source technologies in their system landscape and may even have built up skills in using, administrating, or developing with them. In those cases, it makes a lot of sense to show our customers how to use such open source technologies in combination with SAP technologies to eventually reach a more complete solution portfolio and protect our customers' investments in the open source technologies. Consequently, we work in various areas to reach a higher level of interoperability between SAP and open source technologies, partly combined with contributions to the respective open source projects. Even in the case of SAP HANA (see picture above), a number of open source technologies - like the statistical computing library R or the distributed "big data" computing technology Hadoop - complement our offering.

 

I believe this is enough food for thought for today and look forward to your feedback.

 

In terms of the SAP Open Source Summit 2012 blog series, here is the plan again:

  1. Overall Impressions
  2. What we think - SAP Open Source Strategy (this post)
  3. Insights from keynotes
  4. What we do next

As Matthias Steiner already mentioned in his blog post SAP NetWeaver Cloud & Open Source - A match made in heaven on Monday, we organized the third SAP Open Source Summit last week on Thursday and Friday. While this was an internal event, the topic obviously is not. We also had five external keynote speakers - it was great to get their perspective and to use the opportunity for more in-depth discussions on trends and future opportunities. More on that in the next episodes.

 

A theme that I heard more than once from our guests is that they recognize from the summit sessions and discussions (as well as from earlier interactions) that SAP is a quite active open source consumer and contributor, but only very few people outside of SAP know about that. It seems to be one of the best kept secrets - even though there is no reason to keep it secret. Point taken. This post is an(other) attempt to change that.

 

Let me start with an example, one that Mike Milinkovich mentioned during his keynote about Foundations 2.0: SAP is the third-largest committer to Eclipse projects. As of today, 31 committers from SAP have made 63,867 commits to 26 Eclipse projects, totaling 3,127,480 lines of code. I don't think that we are, or need to be, in a race for second or third place. I think the numbers simply show that open source contributions are a normal course of action for SAP and are becoming increasingly important from a product development perspective.

 

The other example I saw last week is the discussion around best practices for open source contributions. At the past open source summits (2009 and 2010, in those days organized by Erwin Tenhumberg) we talked a lot about how to become a committer on an open source project. This time, we had a number of committers and even project leads on stage sharing their experience and recommendations. It is interesting to note that Stephan Klevenz (committer to odata4j at Google Code), Matthias Sohn (co-lead for EGit and JGit at Eclipse), Krum Tsvetkov (co-lead for Eclipse Memory Analyzer, originally initiated by SAP), and Florian Mueller (chair and VP for Apache Chemistry) all work for SAP. And now the discussion was much more about effective project management, IP management done by open source foundations, or procedural questions like how to report and handle security vulnerabilities in open source code (a topic that is best not discussed in public until a fix is available).

 

Now, instead of overloading this blog post, my plan is to write a series on the Open Source Summit 2012 as follows:

  1. Overall Impressions (this post)
  2. What we think - SAP Open Source Strategy
  3. Insights from keynotes
  4. What we do next

 

I hope this will be interesting for a broader audience and look forward to your feedback.

On June 20, 2012, we celebrated the Eclipse Juno release at our fourth Eclipse DemoCamp in Walldorf. Again, a big crowd showed up. They saw cool demos covering various areas like UI technologies and testing, modeling, SAP variant configuration, and support for cloud development via Git and Gerrit. Many thanks to all presenters!

 

As always, everyone recovered from this serious business during the Happy Hour: barbecue, tons of networking opportunities, and the sound of "East Ring Company" made it easy to relax.

 

We have linked some pictures from the DemoCamp Wiki page.

 

We're looking forward to seeing even more people at the Kepler DemoCamp next year!

When the SAP Java Virtual Machine team decided to join the OpenJDK project last year, they first of all wanted to understand the overall direction and governance structure of the project. After Volker Simonis and other team members contributed a few bug fixes and minor enhancements, they quickly observed that their contributions were not only welcomed, but that there are many similar interests among the different OpenJDK project members - the most important one being to protect the future of Java. What a bold statement ... in practice this means that through active participation and contribution from a broader group of participants, the work can be shared and innovation for the Java platform can jointly be defined by means of one standard implementation. An interesting observation is obviously that the participants include key vendors such as Oracle, Red Hat, Apple, IBM, and SAP that in various regards are competitors in the marketplace. But there is less and less reason to compete on Java Virtual Machine and Java Development Kit features where the same requirements are implemented differently.

 

Based on this experience, SAP today decided to significantly increase its contributions. Please read Volker Simonis' blog post on the OpenJDK mailing list about a new project proposal. We now plan to contribute whole platform ports, in the belief that it is economically better to standardize our own implementation than to keep it proprietary. The first platform port that we are going to contribute is Linux on PowerPC. Based on that, we will work together with IBM and others and also use it as a basis for the AIX on PowerPC port. Depending on the outcome, ports for other platforms might follow. This will also help us to better integrate and align our own platform ports and extensions with current Java developments, and it will let the OpenJDK community profit from SAP's longstanding experience in Java and Java VM technology. Every OpenJDK user will be able to benefit from a broader choice of platforms, an important consideration for many SAP customers. SAP's commitment to deliver a high-class, commercially licensed Java Virtual Machine remains unchanged. SAP currently delivers its commercial Java-based products with SAP JVM, a certified Java Virtual Machine (JVM) and Java Development Kit (JDK) compliant with Java Standard Editions 1.4, 5, 6, and 7 on all 14 platforms supported by SAP.

In this series (part 1, part 2) we have looked at how to combine some leading Java technologies to create a multi-tier application with a JSF 2.0 front-end, an EJB session facade, and a layer for communicating with the ABAP back-end via the JRA adapter. All of the layers take advantage of the Spring framework, which provides utilities such as the Common Client Interface (CCI) abstraction for working with the JCA connection factory of the AS, and the jee:jndi-lookup mechanism for looking up resources and EJBs in the JNDI context. Below are some screenshots of the proof-of-concept application built using the techniques described in this series of posts. It uses PrimeFaces with the cupertino and redmond themes (with the font set to match the SAP look-and-feel).
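As a reminder of how the Spring pieces mentioned above fit together, here is a hypothetical configuration sketch. The JNDI name and bean ids are made up for illustration and will differ in the real project; the actual wiring is in the GitHub repository.

```xml
<!-- Hypothetical sketch; the JNDI name and bean ids are made up for
     illustration and will differ in the real project. -->
<beans xmlns="http://www.springframework.org/schema/beans"
       xmlns:jee="http://www.springframework.org/schema/jee"
       xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
       xsi:schemaLocation="http://www.springframework.org/schema/beans
           http://www.springframework.org/schema/beans/spring-beans-3.0.xsd
           http://www.springframework.org/schema/jee
           http://www.springframework.org/schema/jee/spring-jee-3.0.xsd">

    <!-- Looks up the JRA connection factory deployed on the AS -->
    <jee:jndi-lookup id="connectionFactory"
                     jndi-name="deployedAdapters/SomeJraFactory"/>

    <!-- Spring's CCI template wraps the factory so the back-end layer
         can call the ABAP system without boilerplate connection code -->
    <bean id="cciTemplate"
          class="org.springframework.jca.cci.core.CciTemplate">
        <property name="connectionFactory" ref="connectionFactory"/>
    </bean>
</beans>
```

The CciTemplate bean can then be injected into the EJB session facade layer, keeping the connection handling out of the business logic.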

 

Source code is available on GitHub nw-jsf-showcase.

 

 

 

 

 
