Setup Cluster for Active-Active Deployment

Former Member

Hi all,

I am trying to set up an ESP cluster on two Linux hosts. My ESP version is 5.1 SP08. My main goal is to deploy a project in active-active mode on these two hosts, but from the guides (the admin guide, docs on SCN, etc.) I could not work out the steps needed to set up a cluster with two hosts.

I have a couple of questions that I could not find answers to in the docs:

  • Do I need to run a single cluster database, or run a cluster database on each host?
  • Do I need to start a cluster on one host and then add nodes to it?
  • What should be the hierarchy of the folders shared between the hosts?
  • etc.

If anyone who has set up such a cluster could summarize the steps, I would be grateful.

Thank you very much.

Bulut

Accepted Solutions (0)

Answers (1)

Former Member

Hi Bulut,

There can only be one active cluster database per cluster.  You can set it up for high availability as documented here (in this case there are two cluster databases plus an arbiter but only one is ever active):

  Configuring SAP SQL Anywhere Server for High Availability - Configuration and Administration Guide -...

There is a multi-node cluster example in $ESP_HOME/cluster/examples/cluster_example.xml .  See the readme "cluster_example_readme.txt" in the same directory.

I have not had time to try this yet but I think you would:

1) Start the cluster database on whichever node you choose.  The cluster database always has to be started first.  (See $ESP_HOME/cluster/config/esp1/start_db.sh).

2) Start a cluster node (See $ESP_HOME/cluster/config/esp1/start_node.sh):

   start_node.sh node1 &

3) On one of the other hosts start a second node:

   start_node.sh node2 &
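
Putting those three steps together, a rough end-to-end sketch might look like this (the "hostA"/"hostB" names are placeholders, and the port 19011 and the sybase/sybase credentials come from the example configuration, so adjust them to whatever your cluster XML defines):

   # Host A: start the cluster database first (only one active database per cluster)
   cd $ESP_HOME/cluster/config/esp1
   ./start_db.sh &

   # Host A: start the first node
   ./start_node.sh node1 &

   # Host B: start the second node against the same cluster configuration
   ./start_node.sh node2 &

   # From either host, check that both nodes show up as managers/controllers
   $ESP_HOME/bin/esp_cluster_admin --uri=esp://hostA:19011 --username=sybase --password=sybase
   > get managers
   > get controllers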

Which directories need to be shared?  In the following link, search for "Needs shared drive?".  On the following line it will say yes/no.  The formatting is terrible and I have filed a documentation bug:

File and Directory Infrastructure - Configuration and Administration Guide - SAP Library

I am including the link to the SP04 documentation because the formatting is better, but note that the directory structure changed slightly between SP04 and SP08:

   http://infocenter.sybase.com/help/topic/com.sybase.infocenter.dc01611.0514/doc/html/swa1308776088481...
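
For the directories that are marked as needing a shared drive, one common approach (certainly not the only one) is to export them over NFS from one host and mount them at the same path on the other, so that any shared-path macro in the cluster XML resolves identically on every node. A minimal sketch, assuming NFS, a hypothetical /shared/esp directory, and placeholder host names:

   # On the host that owns the storage, export the shared directory over NFS
   echo "/shared/esp  otherhost(rw,sync,no_root_squash)" >> /etc/exports
   exportfs -a

   # On the other host, mount it at exactly the same path
   mkdir -p /shared/esp
   mount -t nfs storagehost:/shared/esp /shared/esp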

You mention active-active at the beginning of your post.  A word of caution here...  Active-active should only be used with projects that do not have adapters.  In an active-active setup, both projects are live.  The adapters running in these projects have no knowledge of each other.  So for example, if you have a Generic Database Output adapter in the project, there would be two adapters running (one in each project) and they would both be simultaneously writing to the target database.

Cold failover is the better (perhaps the only) choice for projects containing adapters.

Thanks,

Neal

Former Member
0 Kudos

Hi Neal,

Thank you very much for your answer.

As you said, I am running only one database for my cluster; it runs on the "maslak" node. I modified cluster_example.xml. Since I could not attach a file to a reply, I am pasting it as text.

I have two host machines, "maslak" and "sariyer". The cluster database is running on "maslak". I deploy the following XML file to the database, then start node1 and node3 on "maslak" and node2 and node4 on "sariyer" with the $ESP_HOME/cluster/examples/start_node.sh nodeX command. However, when I check the cluster, I see that node1 and node3 run together on "maslak", while node2 and node4 run together on "sariyer".

Accordingly, when I deploy a project to a manager running on "maslak", it runs on both node1 and node3 because of the HA mode. Similarly, when I deploy a project on "sariyer", it runs on both node2 and node4.

------------------------------------------------------------------------------

"maslak":

[sybase@maslak examples]$ $ESP_HOME/bin/esp_cluster_admin --uri=esp://maslak:19011 --username=sybase --password=sybase
> get managers
Manager[0]:     node1@http://maslak:19011
Manager[1]:     node3@http://maslak:19013
> get controllers
Controller[0]:  node3@http://maslak:19013
Controller[1]:  node1@http://maslak:19011

------------------------------------------------------------------------------

"sariyer":

[sybase@sariyer bin]$ ./esp_cluster_admin --uri=esp://sariyer:19012 --username=sybase --password=sybase
> get managers
Manager[0]:     node2@http://sariyer:19012
> get controllers
Controller[0]:  node4@http://sariyer:19014
Controller[1]:  node2@http://sariyer:19012

I don't understand why I cannot see all of the nodes together. Am I missing something? Do you have any suggestions?

------------------------------------------------------------------------------

cluster_example.xml

<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<Cluster>
    <Macros>
        <Macro name="ESP_HOME" type="envar">ESP_HOME</Macro>
        <Macro name="ESP_SHARED" type="value">/shared/Shared</Macro>
        <Macro name="ESP_STORAGE" type="value">${ESP_SHARED}/storage</Macro>
    </Macros>
    <SystemProperties/>
    <Manager/>
    <Controller>
        <ApplicationTypes>
            <ApplicationType enabled="true" name="ha_project">
                <Class>com.sybase.esp.cluster.plugins.apptypes.HaProject</Class>
                <StandardStreamLogging enabled="true"/>
                <Properties>
                    <Property name="base-directory">${ESP_HOME}/cluster/examples/projects</Property>
                    <Property name="esp-home">${ESP_HOME}</Property>
                    <Property name="hostname">${ESP_HOSTNAME}</Property>
                    <Property name="ld-preload">${ESP_HOME}/lib/libjsig.so</Property>
                    <Property name="debug-level">4</Property>
                </Properties>
            </ApplicationType>
            <ApplicationType enabled="true" name="project">
                <Class>com.sybase.esp.cluster.plugins.apptypes.Project</Class>
                <StandardStreamLogging enabled="true"/>
                <Properties>
                    <Property name="base-directory">${ESP_HOME}/cluster/examples/projects</Property>
                    <Property name="esp-home">${ESP_HOME}</Property>
                    <Property name="hostname">${ESP_HOSTNAME}</Property>
                    <Property name="ld-preload">${ESP_HOME}/lib/libjsig.so</Property>
                    <Property name="debug-level">4</Property>
                </Properties>
            </ApplicationType>
            <ApplicationType enabled="true" name="toolkit_adapter">
                <Class>com.sybase.esp.cluster.plugins.apptypes.FrameworkAdapter</Class>
                <StandardStreamLogging enabled="true"/>
                <Properties>
                    <Property name="esp-home">${ESP_HOME}</Property>
                    <Property name="base-directory">${ESP_HOME}/cluster/examples/adapters</Property>
                </Properties>
            </ApplicationType>
        </ApplicationTypes>
    </Controller>
    <ServiceProvider>
        <ServiceTypes>
            <ServiceType name="discovery" enabled="true">
                <Class>com.sybase.esp.cluster.plugins.servicetypes.adapter.DiscoveryServiceImpl</Class>
                <StandardStreamLog enabled="true"/>
                <Properties>
                    <Property name="base-directory">${ESP_HOME}/cluster/examples/discovery</Property>
                    <Property name="cnxml-path">${ESP_HOME}/lib/adapters</Property>
                    <Property name="esp-home">${ESP_HOME}</Property>
                    <Property name="hostname">${ESP_HOSTNAME}</Property>
                </Properties>
            </ServiceType>
        </ServiceTypes>
    </ServiceProvider>
    <Rpc>
        <Ssl enabled="false"/>
    </Rpc>
    <Cache>
        <Persistence enabled="false">
            <Directory>${ESP_STORAGE}</Directory>
            <Limited enabled="true">
                <DataService enabled="true"/>
            </Limited>
        </Persistence>
        <Multicast enabled="false"/>
    </Cache>
    <Security>
        <Authenticators>
            <Authenticator>
                <Provider>com.sybase.security.core.PreConfiguredUserLoginModule</Provider>
                <Options>
                    <Option name="username">sybase</Option>
                    <Option name="password">{SHA-256:96rO7HHy5J0=}3xLgcVQsskbwazcBo097Ggr6sJC9c7oBHqqxkIBT3aQ=</Option>
                </Options>
            </Authenticator>
        </Authenticators>
        <Authorizer enabled="false"/>
        <KeyStore>
            <Type>JKS</Type>
            <File>${ESP_HOME}/cluster/examples/cluster_example.jks</File>
            <Password encrypted="true">yhEtAqBrEY7FM1kO0V590/d2Imq0qVvf1wJoJa3+JwuwF6Ti</Password>
            <KeyPassword encrypted="true">yhEtAqBrEY7FM1kO0V590/d2Imq0qVvf1wJoJa3+JwuwF6Ti</KeyPassword>
            <Algorithm>RSA</Algorithm>
        </KeyStore>
    </Security>
    <Nodes>
        <Node enabled="true" name="node1">
            <Macros>
                <Macro name="ESP_HOSTNAME" type="value">maslak</Macro>
            </Macros>
            <SystemProperties/>
            <Manager enabled="true"/>
            <Controller enabled="true">
                <ApplicationTypes/>
            </Controller>
            <ServiceProvider enabled="true">
                <ServiceTypes/>
            </ServiceProvider>
            <Rpc>
                <Host>${ESP_HOSTNAME}</Host>
                <Port>19011</Port>
            </Rpc>
            <Cache>
                <Host>${ESP_HOSTNAME}</Host>
                <Port>19001</Port>
            </Cache>
        </Node>
        <Node enabled="true" name="node2">
            <Macros>
                <Macro name="ESP_HOSTNAME" type="value">sariyer</Macro>
            </Macros>
            <SystemProperties/>
            <Manager enabled="true"/>
            <Controller enabled="true">
                <ApplicationTypes/>
            </Controller>
            <Rpc>
                <Host>${ESP_HOSTNAME}</Host>
                <Port>19012</Port>
            </Rpc>
            <Cache>
                <Host>${ESP_HOSTNAME}</Host>
                <Port>19002</Port>
            </Cache>
        </Node>
        <Node enabled="true" name="node3">
            <Macros>
                <Macro name="ESP_HOSTNAME" type="value">maslak</Macro>
            </Macros>
            <SystemProperties/>
            <Manager enabled="true"/>
            <Controller enabled="true">
                <ApplicationTypes/>
            </Controller>
            <Rpc>
                <Host>${ESP_HOSTNAME}</Host>
                <Port>19013</Port>
            </Rpc>
            <Cache>
                <Host>${ESP_HOSTNAME}</Host>
                <Port>19003</Port>
            </Cache>
        </Node>
        <Node enabled="true" name="node4">
            <Macros>
                <Macro name="ESP_HOSTNAME" type="value">sariyer</Macro>
            </Macros>
            <SystemProperties/>
            <Manager enabled="false"/>
            <Controller enabled="true">
                <ApplicationTypes/>
            </Controller>
            <Rpc>
                <Host>${ESP_HOSTNAME}</Host>
                <Port>19014</Port>
            </Rpc>
            <Cache>
                <Host>${ESP_HOSTNAME}</Host>
                <Port>19004</Port>
            </Cache>
        </Node>
    </Nodes>
</Cluster>

------------------------------------------------------------------------------

By the way, thank you very much for your comments on "active-active" deployment. I understand now that when I use input adapters in my project and deploy it in active-active mode, both projects connect to the sources separately. I had been thinking that the primary instance connects to the data sources and the streaming data is passed to the secondary instance through the protocol between the instances. In that case, active-active deployment is not so useful. Based on your experience, what kind of projects are good candidates for active-active?

Thanks and regards,

Bulut

Former Member

Hi Bulut,

I don't see anything obviously wrong with your cluster configuration.  I used the same example cluster configuration file and started two nodes.  Since I didn't have a "log4j.properties" file when I started the nodes, all of the logging went to stdout, and in that logging I can see it took a while (almost 2 minutes) for the nodes to discover each other:

Sep 26 2014 07:24:02.253 INFO hz._hzInstance_1_esp_multi.cached.thread-5 com.sybase.esp.cluster.impl.CacheService - CODE_700102 | Membership listener...memberRemoved Member [127.0.0.1]:19001
Sep 26 2014 07:24:02.254 INFO hz._hzInstance_1_esp_multi.cached.thread-5 com.sybase.esp.cluster.impl.CacheService - CODE_700103 | Membership listener cleaning up member Member [127.0.0.1]:19001
Sep 26 2014 07:25:56.546 INFO hz._hzInstance_1_esp_multi.cached.thread-3 com.sybase.esp.cluster.impl.CacheService - CODE_700101 | Membership listener...memberAdded Member [10.7.119.177]:19003 this
Sep 26 2014 07:26:02.557 INFO hz._hzInstance_1_esp_multi.cached.thread-3 com.sybase.esp.cluster.impl.CacheService - CODE_700101 | Membership listener...memberAdded Member [127.0.0.1]:19001

It may have taken two minutes because I am working remotely over a VPN today; I'm not sure.  I'll be traveling for the next week, so I won't be able to investigate further.

If you don't see anything obvious in each node's stdout logging, you may need to modify your start_node.sh/start_node.bat to be more like the one in $ESP_HOME/cluster/config/esp1, where it uses a "log4j.properties"-style file, and change all of the "info" references to "debug":

% grep LOG start_node.sh
ESP_CLUSTER_LOG_PROPERTIES=cluster.log.properties
ESP_CLUSTER_NODE_LOG_FILE=$ESP_CLUSTER_NODE_NAME.log
"$ESP_HOME/bin/esp_cluster_node" -p$ESP_CLUSTER_LOG_PROPERTIES -f$ESP_CLUSTER_NODE_LOG_FILE --config $ESP_CLUSTER_CONFIG --node-name $ESP_CLUSTER_NODE_NAME

Only a wild guess, but you might check your two Linux boxes to see whether there is a proxy between them that would prevent the nodes from communicating:

% env | grep proxy
http_proxy=http://proxy.acme.com:8080
https_proxy=http://proxy.acme.com:8080
ftp_proxy=http://proxy.acme.com:8080
no_proxy=archer.acme.com,localhost,127.0.0.1,archer2.acme.com,archer,10.7.119.177
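
If a proxy does show up there, one thing worth trying (just a sketch, using the host names and ports from your cluster XML) is to exclude the cluster hosts from the proxy and confirm that the RPC and cache ports are reachable from each machine:

   # Bypass the proxy for the cluster hosts before starting the nodes
   export no_proxy=maslak,sariyer,localhost,127.0.0.1

   # From "sariyer", check that maslak's RPC and cache ports are reachable
   nc -vz maslak 19011   # node1 RPC port
   nc -vz maslak 19001   # node1 cache port (used for membership)

   # Repeat from "maslak" against sariyer's ports (19012/19002, 19014/19004)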

As for when to use active-active: maybe you have a project that performs some complex calculations and stores the results in a WINDOW.  The project does not write the data anywhere, but there are clients that subscribe to the WINDOW, and you want them to always be able to subscribe to it.  If one project goes down, the other is still running with an exact copy of the data.

Thanks,

Neal