
Start/Stop HANA Database Services


Hi,

 

Most of you with HANA experience (especially system administrators with the right privileges, i.e. the <sid>adm credentials maintained for the respective systems) will know the services that support the various roles of a complete HANA system.

 

Everyone knows how to start/stop a HANA instance.

 

Let's look at how to start, stop, and restart individual HANA database services. This will help administrators as well as developers in the following scenarios:

 

1) A host in a distributed system failed and a standby host took over. However, the services of the failed host remain inactive even after the host is reachable again. In this case, you need to restart the services manually.

 

2) After an update of SAP HANA extended application services (SAP HANA XS), the xsengine service needs to be restarted.

 

3) HANA developers might have faced this situation when developing applications for web browser clients.

 

So let's start now.

 

1. Open HANA studio and right-click your HANA instance in the Navigator view (Window -> Show View -> Navigator).

2. Click Administration.

3. You should now see an Administration editor similar to this:


hana-services.png

4. In the Administration editor, click the Landscape tab and then select the Services tab.

5. Now right-click any service; you have the following options:

 

Option                     Description
Stop                       The service is stopped normally and then typically restarted.
Kill                       The service is stopped immediately and then typically restarted.
Reconfigure Service...     The service is reconfigured, i.e. any changes made to parameters in the system's configuration files are applied.
Start Missing Services...  Any inactive services are started.
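If you prefer the command line, the same status check can be done at OS level. A minimal sketch (not part of the original steps, so treat the function names as an assumption to verify on your system): log on as <sid>adm and use the standard sapcontrol functions, e.g. for instance number 00:

# list all services of the instance and their current status (GREEN/YELLOW/RED)
/usr/sap/hostctrl/exe/sapcontrol -nr 00 -function GetProcessList

# overall status of all hosts of the system
/usr/sap/hostctrl/exe/sapcontrol -nr 00 -function GetSystemInstanceList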

 


I searched SCN for similar posts and could only find one related to starting/stopping the HANA server. It was clearly answered with screenshots; I hope this helps newbies:

http://scn.sap.com/thread/3291521


Monitoring SAP HANA Systems During Stop and Start


Prerequisite


To be able to open the Administration editor of a system in diagnosis mode, you must be able to log on using the credentials of the operating system administrator (user <sid>adm) that was created when the system was installed.

 

NOTE: The <sid>adm user is not a database user but a user at the operating system level. Also referred to as the operating system administrator, this user has unlimited access to all local resources related to SAP systems.

 

===========================================

Lets Start

===========================================


The SAP HANA studio normally collects information about the system using SQL statements (using stored procedures and views available in the system catalog).

 

However, when the system has not yet started, no SQL connection is available. Therefore, while the system is starting up or is stopped, the SAP HANA studio collects information about the database using the connection of the SAP start service (sapstartsrv).

 

 

In this way, you can analyze any problems that may occur during startup or while the system is stopped. You can also read diagnosis files even when the system is stopped.

 

You can view this information in the Administration editor in diagnosis mode.

 

It is comparable to adding the "-v" (verbose) switch to a Linux command: an optional mode that displays additional status messages while the program is running.

 

 

The Administration editor opens automatically in diagnosis mode in the following situations:


● When you open the Administration editor for a system without an SQL connection

● When you initiate the start, stop, or restart of a system

 

You can manually open a system in diagnosis mode by choosing the (Open Diagnosis Mode) button from the drop-down menu of the (Administration) button in the Navigator view.

 

HANA Studio SPS5

 

Reference: SAP HANA Administration Guide

 

Related Threads

 

Search the community for "hana diagnosis"

Monitoring Overall System Status and Resource Usage


When you open the Administration editor for a particular SAP HANA system, the Overview tab provides you with a summary of the overall status of the system, as well as an overview of resource usage.

 

The following figure shows an example of the system overview of a distributed system.

 

 

Resource usage values are presented in such a way that you can compare the SAP HANA system with the operating system as a whole.

 

 

If the system is distributed, resource usage values are aggregated across all hosts. An additional bar shows the host with the highest (most critical) resource usage.

 

The bars indicating resource usage (memory, CPU, and disk) change color (green, yellow, and red) based on configurable check thresholds.

 

Reference: SAP HANA Administration Guide (see page 66) - search for hana_admin_en.pdf to download the detailed administration guide.

Enforcing password blacklist policy in HANA


When creating new passwords, users are not allowed to use blacklisted words or partial words. Use SQL commands to insert or delete words or partial words from the password blacklist table.

 

 

● For inserting terms into and deleting terms from the password blacklist, you must have INSERT and DELETE privileges on either the table (_SYS_PASSWORD_BLACKLIST) or the entire _SYS_SECURITY schema.

 

● To view the contents of a table only, you must have the SELECT privilege.

 

Password_blacklist.png

 

 

The password blacklist in SAP HANA is implemented with the following table:

 

_SYS_PASSWORD_BLACKLIST.

 

Note: This table is empty when you create a new instance.

 

You can add records to and delete records from the _SYS_PASSWORD_BLACKLIST table using the INSERT and DELETE SQL commands. You must specify values for all three columns described below:

 

The following table describes the columns of the _SYS_PASSWORD_BLACKLIST table.

 

password-black-list.png

 

Let's see one example.

 

In this example, the passwords "SAP", "my_sap_pwd" and "sap_password" would not be allowed,

 

INSERT INTO _SYS_SECURITY._SYS_PASSWORD_BLACKLIST VALUES ('sap', 'TRUE', 'FALSE')


regardless of how the password layout and minimal password length are defined in the corresponding parameters.
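To review or clean up the blacklist afterwards, the table can be queried and maintained with ordinary SQL as well. A minimal sketch using hdbsql from the command line (assumptions: a hdbuserstore key named ADMIN exists for a user with the privileges listed above; the name of the value column is left as a placeholder because it is not shown in this post):

# show the current blacklist entries
hdbsql -U ADMIN "SELECT * FROM _SYS_SECURITY._SYS_PASSWORD_BLACKLIST"

# remove the entry added in the example above (replace <value_column> with the real column name)
hdbsql -U ADMIN "DELETE FROM _SYS_SECURITY._SYS_PASSWORD_BLACKLIST WHERE <value_column> = 'sap'"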

Webcast: The Benefits and Considerations of Real-Time Database Management


Title: The Benefits and Considerations of Real-Time Database Management

Date: Wednesday, June 12, 2013

Time: 11:00AM PDT / 2:00PM EDT

 

New advances in database technologies are allowing companies to leverage vast amounts of data that would have been logistically impossible just 5 years ago. Not only that, organizations can analyze and put their data to work in real time, making critical, analytics-based decisions faster and smarter.

 

Join Ziff Davis and SAP for this live and interactive webinar, The Benefits and Considerations of Real-Time Database Management, to:

 

Learn how real-time database management is changing the way businesses leverage their data.

Discover how to leverage real-time databases into a competitive advantage.

Explore real-world business cases to understand how real-time database management is becoming a foundation for informed business decisions.

 

Featured Speakers:

 

Neil McGovern

Senior Director of Solutions Marketing

SAP

 

Christopher Dawson

Consultant

 

 

REGISTER NOW!

Andy Silvey -> SAP Hana Command Line Tools and SQL Reference Examples for NetWeaver Basis Administrators


I've been planning to create a Hana Command Line and Administrator's Reference
for a long long time.

 

During my studies for HanaTec Certification I noticed that anybody interested in

Hana Administration is missing a one stop shop location and reference for the

Hana Command Line tools and Administrator's SQL queries.

 

There are pieces of documentation on Hana Command Line tools spread across

a wealth of locations including, SCN Discussions, Experience Hana documentation ,
Tech-Ed documentation, Hana Administration guides, OSS Notes, KB Articles, but
there's not one location giving a live list of Hana Command Line tools with references

back to supporting documentation.

 

Selfishly, I've wanted to create that list for my own benefit. But, for the Administrator,

what is more luxurious than one location containing all Hana Command Line tools

known to man (or woman)? And why be selfish, why not share this wealth with the

world :-)

 

And so this blog was born.

 

The Blog's goals are:

 

    • Create a living list of all known Hana command line and SQL tools
    • Invoke discussion between technicians on the different tools and
      their possibilities and uses, tips and tricks
    • Give an intro to the command line tools including example usage


On top of this list of Hana Command Line Tools, and useful SQL's, there are a
wealth of really useful Hana Administration related OSS Notes and KB Articles out
there and the Experience Hana Website .

 

Bill Ramos, has catalogued the OSS Notes nicely

in The Ultimate Set of Hana Related Links from SAP,

and the

SAP NetWeaver Basis Administrator's Toolbox...

contains the longest list of Hana Administration OSS Notes available in one place

on the Internet.

 

Another useful source as Lars Breddemann has pointed out in other blogs is the

help option on the command line, <HanaCommand> -h. Lars has authored a whole

host of excellent Hana Administration blogs; click on Lars's user and then click

on his Content to look through his blogs.

 

Furthermore, a useful read for those new to Linux would be Bill Ramos's blog:
Linux challenged no problem - check out the linux essentials for the windows admin .

 

As ever, have fun and enjoy learning.

 

All the best,

 

Andy.

 

p.s. something interesting would be to produce a similar resource detailing the navigation
steps in the Hana Studio for Basis Administrator's tasks, would somebody else like to lead
that ?

 

 

 

 

Contents: SAP Hana Command Line Tools and SQL Reference Examples for NetWeaver Basis Administrators

 

 

  • 1) HDB stop & StopSystem HDB

  • 2) HDB start & StartSystem HDB

 

  • 3) GetSystemInstanceList

 

  • 4) hdbnameserver

 

  • 5) hdbindexserver

 

  • 6) hdbuserstore

 

  • 7) Install Permanent License Using SQL

 

  • 8) M_VOLUME_IO_STATISTICS

 

    • Total Memory Used

 

    • Code and Stack Size

    • Total Memory Consumption of All Columnar Tables

 

    • Total Memory Consumption of All Row Tables

 

    • Total Memory Consumption of All Columnar Tables by Schema

 

    • List All Columnar Tables of Schema 'SYSTEM'

 

    • Available Physical Memory

 

    • Free Physical Memory

 

  • 9) Activating Emergency User with hdbnameserver & hdbindexserver

 

  • 10) Renaming an SAP HANA System with a GUI using hanaconfig.sh

 

  • 11) Location of Configuration Files

 

  • 12) Trace Files Location

 

  • 13) Testing the ODBC Installation

  • 14) hdbupd & hdbsetup

 

  • 15) ODBO Driver

 

  • 16) Collecting System Information for Support

 

  • 17) HDBSQL

 

  • 18) To run sql queries using hdbsql

 

  • 19) Performing a Database Backup Using SQL Commands

 

  • 20) Setting Up a Data Backup Using Cron

 

  • 21) Canceling a Running Data Backup

 

  • 22) Canceling Other Running Data Backups

 

  • 23) To check the state of the data backup

 

  • 24) hdblogdiag & Recovery

 

  • 25) Restore when log backups are missing

 

  • 26) hdbbackupdiag

 

  • 27) hdbcons

 

  • 28) Sizing Hana Basic Calculations

 

  • 29) HANA SSO with Kerberos and Active Directory

 

  • 30) Row Store Reorganisation

 

  • 31) How to generate a runtime dump on SAP HANA saphostagent/sapdbctrl for Hana

 

  • 32) DBSL hints for SAP HANA

 

  • 33) hdbrename

 

  • 34) hdbnsutil

 

  • 35) hdbsrvutil - Example Usage

 

  • 36) Show Hana Processes HDB proc & HDB info

 

 

 

 

SAP Hana Command Line Tools and SQL Reference Examples for NetWeaver Basis Administrators

 

 

 

 

  • 1) Stop Hana

./HDB stop

 

/usr/sap/hostctrl/exe/sapcontrol -nr <instance-nr> -function StopSystem HDB

 

 

 

 

  • 2) Start Hana

 

./HDB start

 

/usr/sap/hostctrl/exe/sapcontrol -nr <instance-nr> -function StartSystem HDB

 

 

 

 

  • 3) Query the current status of all hosts

 

/usr/sap/hostctrl/exe/sapcontrol -nr <instance-nr> -function GetSystemInstanceList

 

 

 

 

  • 4) Start the name server:

 

/usr/sap/<SID>/HDB<instance>/hdbenv.sh

 

/usr/sap/<SID>/HDB<instance>/exe/hdbnameserver

 

 

 

 

  • 5) Start an index server in a new console:

  

/usr/sap/<SID>/HDB<instance>/hdbenv.sh

 

/usr/sap/<SID>/HDB<instance>/exe/hdbindexserver -console

 

 

 

 

  • 6) hdbuserstore

 

Create a user key in the user store and store the password under this user key:


hdbuserstore SET <user_key> <env> <user_name> <password>

 

For example:

 

hdbuserstore SET millerj localhost:30115 JohnMiller 2wsx$RF

 

 

List all available user keys (passwords are not displayed):

 

hdbuserstore LIST <user_key>

 

For example:

 

hdbuserstore LIST millerj

 

The following information is displayed

 

KEY  : millerj
ENV  : localhost:30115
USER : JohnMiller

 

 

Call hdbsql with the user key:

 

hdbsql -U <user_key>

 

For example:

 

hdbsql -U millerj

 

hdbuserstore Example of User Creation while solving saphostagent/sapdbctrl for Hana issue

 

Ref: SAP Note 1625203 - saphostagent/sapdbctrl for newdb

 

The saphostagent functions for querying the database status and for
starting and stopping the database are now also available for the Hana.

 

If you want to use the connect without a password, the following
prerequisites must be met:

 

a. To query information from the database, you require a database user with
the monitoring role.

 

You can create a database user with the Hana studio. Connect as the SYSTEM
user and execute the following SQL commands:

 

CREATE USER SAPDBCTRL PASSWORD x<password>x

 

GRANT MONITORING TO SAPDBCTRL

 

Then log on to the Hana studio as the user SAPDBCTRL and execute
the following command:

 

ALTER USER SAPDBCTRL PASSWORD <password>

 

b. As the OS user <sid>adm, you must provide the user key <SID>SAPDBCTRL in
hdbuserstore on the database server. You can check the existence of the
user key with the following command:

 

hdbuserstore LIST <SID>SAPDBCTRL

 

If the user key does not exist, you can create it as follows:

 

hdbuserstore SET <SID>SAPDBCTRL <dbhost>:<port> SAPDBCTRL <password>

 

The tool hdbuserstore is installed under the following paths:

 

/usr/sap/hdbclient

 

or

 

/usr/sap/<SID>/hdbclient

 

c. To check whether the sapdbctrl queries without passwords work, execute
the following command as OS user <sid>adm:

 

/usr/sap/hostctrl/exe/saphostctrl -function GetDatabaseStatus -dbname <SID> -dbtype hdb

 

If sapdbctrl responds with the following error text:

 

ERROR: Database user authentication failed: SQLDriverConnect: Connect with userkey failed!

then you must check the connect data in hdbuserstore again.
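For convenience, steps b and c can be run back to back as <sid>adm on the database server. A minimal sketch using only the commands already shown in this note (<SID>, <dbhost>, <port> and <password> are placeholders):

# create the user key for the SAPDBCTRL database user
hdbuserstore SET <SID>SAPDBCTRL <dbhost>:<port> SAPDBCTRL <password>

# verify that the key exists
hdbuserstore LIST <SID>SAPDBCTRL

# test the password-less status query via the host agent
/usr/sap/hostctrl/exe/saphostctrl -function GetDatabaseStatus -dbname <SID> -dbtype hdb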

 

 

 

 

  • 7) Install Permanent License Using SQL

 

Execute the following SQL command:

 

SET SYSTEM LICENSE '<license file content>'


You can delete all installed license keys by executing the following SQL command

 

UNSET SYSTEM LICENSE ALL


To Install the license key

 

SET SYSTEM LICENSE LicenseFile


More info: the Experience Hana Cookbook

 

 

 

 

 

  • 8) The monitoring view M_VOLUME_IO_STATISTICS

 

 

This view shows the total read size and the total write size for each volume since the service in question was last started.

 

The SQL command

 

ALTER SYSTEM RESET MONITORING VIEW SYS.M_VOLUME_IO_STATISTICS_RESET

 

initialises the statistics shown by this view

 

 

The monitoring view

M_VOLUME_IO_STATISTICS_RESET

 

now shows the statistics since the reset time
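To see the read/write figures themselves, you can simply select from the view. A minimal sketch (assumptions: a hdbuserstore key ADMIN exists, and the columns TOTAL_READ_SIZE and TOTAL_WRITE_SIZE are available in your revision; if not, start with SELECT * to check the actual column names):

hdbsql -U ADMIN "SELECT HOST, PORT, VOLUME_ID, TOTAL_READ_SIZE, TOTAL_WRITE_SIZE FROM SYS.M_VOLUME_IO_STATISTICS"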


You can use the M_SERVICE_MEMORY view to explore the amount of SAP HANA
Used Memory as follows:

 

 

Total Memory Used

 

SELECT round(sum(TOTAL_MEMORY_USED_SIZE/1024/1024)) AS "Total Used MB" FROM SYS.M_SERVICE_MEMORY;

 

 

Code and Stack Size

 

SELECT round(sum(CODE_SIZE+STACK_SIZE)/1024/1024) AS "Code+stack MB" FROM SYS.M_SERVICE_MEMORY;

 


Total Memory Consumption of All Columnar Tables

 

SELECT round(sum(MEMORY_SIZE_IN_TOTAL)/1024/1024) AS "Column Tables MB" FROM M_CS_TABLES;

 

Total Memory Consumption of All Row Tables

 

SELECT round(sum(USED_FIXED_PART_SIZE + USED_VARIABLE_PART_SIZE)/1024/1024) AS "Row Tables MB" FROM M_RS_TABLES;

 


Total Memory Consumption of All Columnar Tables by Schema

 

SELECT SCHEMA_NAME AS "Schema", round(sum(MEMORY_SIZE_IN_TOTAL)/1024/1024) AS "MB" FROM M_CS_TABLES GROUP BY SCHEMA_NAME ORDER BY "MB" DESC;

 


List All Columnar Tables of Schema 'SYSTEM'

 

SELECT TABLE_NAME AS "Table", round(MEMORY_SIZE_IN_TOTAL/1024/1024, 2) AS "MB" FROM M_CS_TABLES WHERE SCHEMA_NAME = 'SYSTEM' ORDER BY "MB" DESC;

 


Available Physical Memory

 
Execute the SQL query:

select round((USED_PHYSICAL_MEMORY + FREE_PHYSICAL_MEMORY)/1024/1024/1024, 2) as "Physical Memory GB" from PUBLIC.M_HOST_RESOURCE_UTILIZATION;

or execute the Linux command:

cat /proc/meminfo | grep MemTotal

 

 

Free Physical Memory

 
Execute the SQL query:

 

select round(FREE_PHYSICAL_MEMORY/1024/1024/1024, 2) as "Free Physical GB" from PUBLIC.M_HOST_RESOURCE_UTILIZATION;


 

Execute the Linux command:

 

awk 'BEGIN {sum = 0}; /^(MemFree|Buffers|Cached):/ {sum = sum + $2}; END {print sum}' /proc/meminfo
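To run a quick memory health check from the shell, the queries above can be wrapped in hdbsql calls and compared with the OS figures. A minimal sketch (assumption: a hdbuserstore key ADMIN exists for a user allowed to read these monitoring views):

# HANA used memory and free physical memory as reported by the database
hdbsql -U ADMIN 'SELECT round(sum(TOTAL_MEMORY_USED_SIZE/1024/1024)) AS "Total Used MB" FROM SYS.M_SERVICE_MEMORY'
hdbsql -U ADMIN 'SELECT round(FREE_PHYSICAL_MEMORY/1024/1024/1024, 2) AS "Free Physical GB" FROM PUBLIC.M_HOST_RESOURCE_UTILIZATION'

# free memory in KB as seen by Linux, for comparison
awk 'BEGIN {sum = 0}; /^(MemFree|Buffers|Cached):/ {sum = sum + $2}; END {print sum}' /proc/meminfo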

 

 

 

 

 

  • 9) Activating Emergency User with hdbnameserver & hdbindexserver

 

If the SYSTEM user's password is lost, you can reset it at operating system level
using the procedure below.

 

To recover an SAP HANA instance where the SYSTEM user’s password is lost,

you need to have <sid>adm access to the instance where SAP HANA's master

index server is running.

 

Open a command line interface, and log on to the server on which
the instance of the SAP HANA master index server is running.

 

Shut down the instance.

 

Start the name server:

 

/usr/sap/<SID>/HDB<instance>/hdbenv.sh

 

/usr/sap/<SID>/HDB<instance>/exe/hdbnameserver

 

Start an index server in a new console:

  

/usr/sap/<SID>/HDB<instance>/hdbenv.sh

 

/usr/sap/<SID>/HDB<instance>/exe/hdbindexserver -console

 

You will see the output of a starting index server.

 

When the service has started, you have a console to the SAP HANA

instance where you are logged on as a SYSTEM user.

 

Reset the SYSTEM user's password and store the new password in a
secure location with the following SQL command:

 

ALTER USER SYSTEM password <new password>

 

Note: Because you are logged on as the SYSTEM user in this console,
you do not have to change this password after the next logon, regardless
of your password policy settings.
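Putting the steps together, a minimal sketch of the whole procedure, run as <sid>adm on the host of the master index server and using only the commands listed above (<SID>, <instance> and <new password> are placeholders):

./HDB stop                                                # shut down the instance
/usr/sap/<SID>/HDB<instance>/hdbenv.sh                    # set the environment
/usr/sap/<SID>/HDB<instance>/exe/hdbnameserver            # start the name server

# in a second console:
/usr/sap/<SID>/HDB<instance>/hdbenv.sh
/usr/sap/<SID>/HDB<instance>/exe/hdbindexserver -console  # console logged on as SYSTEM

# in the index server console:
ALTER USER SYSTEM PASSWORD <new password>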

 

 

 

 

 

  • 10) Renaming an SAP HANA System with a GUI using hanaconfig.sh

 

In GUI mode, you will be prompted to enter the required parameters. 

Use the following instructions to complete this task.

 

Note: If you specify the host name, make sure that it is fully qualified, such as
myhost.sap.com (i.e., not just myhost).

 

Connect to the system with an X server client to enable GUI system access.

Open a root shell and go to the directory where you mounted the SAP HANA DVD,
by entering a command such as the following:

 

cd /mnt/<HANA_DVD>/DATA_UNITS/HANA_IM_LINUX__X86_64

 

Call the script in GUI mode to perform the rename:

  

./hanaconfig.sh --gui

 

Select Rename HANA System

 

Select Next

 

Specify the required entries.

 

Leave the root shell after the rename is complete.


hanaconfig.sh in GUI mode can also be used for other tasks, including installing and de-installing
SMD agents.

https://cookbook.experiencesaphana.com/bw/deploying-bw-on-hana/preparation/prepare-hana-hardware/

 

 

 

 

 

  • 11) Location of Configuration Files

 

The configuration files (.ini files) are located by default in the following directories

 

$DIR_INSTANCE/../SYS/global/hdb/custom/config

  global.ini

  indexserver.ini

  nameserver.ini

 


For host-specific configuration settings

 

$SAP_RETRIEVAL_PATH

  

sapprofile.ini

  

daemon.ini

 

 

 

 

  • 12) Trace Files Location

 

trace files can be found here:

 

/usr/sap/<SID>/HDB<system number>/<hostname>/trace

 

 

 

 

 

  • 13) Testing the ODBC Installation

 

You can test the installation of the ODBC driver and your ability to connect
by using the odbcreg tool, which is part of the ODBC installation.

 

Start the odbcreg tool by entering a command in the form:

 

odbcreg -t hdbcodbc (for 64-bit driver)

  

or

 

odbcreg32 -t hdbcodbc32 (for 32-bit driver).

 

If the driver is installed properly, you should get the ODBC login screen.

 

Note, you can also run the command

  

odbcreg -g or odbcreg32 -g to get a list of installed drivers.

  

The SAP HANA driver is called HDBODBC

 

 

 

 

 

  • 14) hdbupd & hdbsetup

 

Automated update of SAP HANA server components

 

The SAP Software Update Manager is a separate software component that must
be started on the SAP HANA server. A good practice is to install this component
as a service.

 

The SAP Software Update Manager does not have a user interface.

It is controlled remotely from the SAP HANA studio.

 

Tip: The Software Update Manager can be configured as a Linux service by
running the following commands:

  

export JAVA_HOME=/usr/sap/<SID>/SUM/jvm/jre
  

/usr/sap/<SID>/SUM/daemon.sh install

 

 

The service can be started using the following command:

  

/etc/init.d/sum_daemon start

 

 

After the release of an SPS of the SAP HANA appliance software, new revisions of the SAP HANA
database are released.

 

To update your SAP HANA database to these new revisions, use the commands

 

hdbupd or hdbsetup which are provided with the SAP HANA database installation.

 

 

 

 

 

  • 15) ODBO Driver

 

Install the SAP HANA ODBO driver on the host where MS EXCEL has been installed and is running.

For more information, see SAP HANA Client Installation and Update Guide.

 

a. Open a command prompt and call the SAP HANA client installation program by entering
the following command:

 

hdbinst -a client [<option_list>]

 

b. Follow the instructions displayed by the installation tool.

 

 

 

 

 

  • 16) Collecting System Information for Support

 

There is a Python script that allows you to collect information from your systems
even when no access to the system via SQL is possible.

 

This information can be added to the support message.

 

Location and Usage

 

The name of the Python script is

 

fullSystemInfoDump.py

 

and is part of a server installation. It runs from a command line and is located
in the directory

 

$DIR_INSTANCE/exe/python_support

(Linux).

 

To execute the script <sid>adm rights are required.

 

To start the script out of its location directory, enter:

 

python fullSystemInfoDump.py

 

By default the script creates a zip file with all of the collected support information
to the directory

 

DIR_TEMP/system_dump where DIR_TEMP

 

is the content of the variable with the same name in sapprofile.ini.

 

This output directory is shown as console output when the script is running, but it can

be looked up by entering:

 

hdbsrvutil -z | grep DIR_TEMP=

 

To change the default directory, an explicit absolute path can be given to the script,
for example:

 

python fullSystemInfoDump.py <output_dir>

 

Usage information can be displayed by entering:

 

python fullSystemInfoDump.py -h


To collect support information you need an SQL user with rights to select the system tables and
views listed in System Tables/Views.

 

For security reasons the user name and password for this SQL user cannot be given as command

line parameters to the script.

 

Instead you are prompted for this information:

 

ts1adm@luvm252058a:/usr/sap/TS1/HDB01/exe/python_support> python fullSystemInfoDump.py

 

User name: SYSTEM

Password:

 

The password is not displayed on the command line.

 

If the system can be reached via SQL and the user name and password information is valid, the script
starts collecting support information.

 

If user name and/or password are invalid, the script aborts.

 

 

 

 

 

  • 17) HDBSQL

SAP HANA HDBSQL is a command line tool for entering and executing SQL statements, executing
database procedures, and querying information about SAP HANA databases.

 

You can use HDBSQL interactively or import commands from a file and execute them in the

background. You can access databases on your local computer and on remote computers.

 

To get a list of all available hdbsql commands, you should use:

 

hdbsql -h

 

Examples:

 

One step logon to the database on the PARMA computer with instance ID 01 as database
user MONA with the password RED.

 

hdbsql -n PARMA -i 1 -u MONA -p RED

 

To log on using a <user_key>, enter the following command:

 

hdbsql [<options>] -U <user_key>

 

 

Two-Step Logon, Procedure

 

Start HDBSQL:

 

hdbsql [<options>]

 

Log on to the database:

  

\c [<options>] -n <database_host> -i <instance_id> -u
   <database_user>,<database_user_password>

 

 

You display general information about the database:

 

\s

 

host : wdfd00245293a:30015

 

database : ORG

 

user : SYSTEM

 

kernel version: 1.50.00.000000

 

SQLDBC version: libSQLDBCHDB 1.50.00 Build 0000000-0120

 

autocommit : ON

 

 

 

 

 

  • 18) To run sql queries using hdbsql:

 

# hdbsql -n <hostname>:<port> -u <database user> -p <password>
Port = 3<##>15 (## is the instance number)

 

For example for system number 01:

 

# hdbsql -n coe-he-40:30115 -u system -p SAPTeched1

 

Welcome to the SAP In-Memory Computing interactive terminal.

 

Type: \h for help with commands

 

\q to quit

 

hdbsql=> select count (*) from EIM360.mara

 

COUNT(*)

 

14927

 

 

 

 

 

  • 19) Performing a Database Backup Using SQL Commands

https://cookbook.experiencesaphana.com/bw/operating-bw-on-hana/data-safety/backup/performing-database-backup/


You can enter SQL commands either by using the SQL editor in SAP HANA studio,

or by using the hdbsql program on the command line.

 

The hdbsql program is located in

 

/usr/sap/<SID>/HDB<instance number>/exe/

 

Note: Backups using SQL commands are only recommended for batch mode

(see the section Performing a Database Backup Using Batch Mode below).

 

The backup can be started with the following SQL command:

 

BACKUP DATA USING FILE ('<backup_file_prefix>')

 

The backup creates the backup files in the default directory.

The name of each backup file starts with <backup_file_prefix>.

 

If you want to change the default location, specify the full

path, for example:

 

BACKUP DATA USING FILE ('/backupDir/data/monday/COMPLETE_DATA_BACKUP')

 

Performing a Database Backup Using Batch Mode

Currently, the main tool for batch mode backup is the command
line interface hdbsql.

 

This is the current recommended mode for executing backups from
operating system level.

 

hdbsql enables you to trigger backups via crontab.

 

 

 

 

 

  • 20) Setting Up a Data Backup Using Cron

 

a. Install the client software.

The client software enables access to the hdbuserstore.

Use the following command:

 

hdbinst -a client (default location: /usr/sap/hdbclient)

 

 

b. Create a user key.

Use the following command:

 

/usr/sap/hdbclient/hdbuserstore set <KEY> <host>:3<instance id>15 <user> <password>

Example: /usr/sap/hdbclient/hdbuserstore set BACKUP vebwtests1:30015 user password

 

 

c. In crontab, call the following at the desired time:

 

/usr/sap/hdbclient/hdbsql -U <KEY> "BACKUP DATA USING FILE ('<target>')"

 

Example: /usr/sap/hdbclient/hdbsql -U BACKUP "BACKUP DATA USING FILE ('MONDAY')"

 

A data backup is then created in the default location. In the above example,
the prefix of all service-related backup files is “MONDAY”.
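For illustration only, the corresponding crontab entry could look like this (assumed schedule: every Monday at 02:00; adjust the schedule, key, and file prefix to your environment):

# minute hour day-of-month month day-of-week  command
0 2 * * 1 /usr/sap/hdbclient/hdbsql -U BACKUP "BACKUP DATA USING FILE ('MONDAY')"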

 

 

 

 

 

  • 21) Canceling a Running Data Backup

You can cancel a data backup that is still running.

To cancel a running data backup, you can use either  SAP HANA studio or an SQL command.

 

Prerequisites

 

A user needs the system privileges CATALOG READ and BACKUP ADMIN.

Canceling a Running Data Backup Using SAP HANA Studio

You can use SAP HANA studio to cancel running data backups that you

started using the backup wizard. To cancel a running data backup, choose Cancel.

The backup is then canceled and you are notified of this.

 

 

 

 

 

  • 22) Canceling Other Running Data Backups

 

To cancel any data backup:

 

a. Find the Backup ID.

 

When a data backup is started, the system assigns a unique ID to the data backup.
To find the backup ID of the running data backup, use the monitoring view

 

M_BACKUP_CATALOG,

 

which provides an overview of information about backup and recovery activities.

 

 

To find the backup ID of the running data backup, use the following SQL command:

 

select BACKUP_ID from "SYS"."M_BACKUP_CATALOG" where entry_type_name =
  'complete data backup' and state_name = 'running' order by sys_start_time desc;

 

 

You can now use the backup ID to cancel the running data backup.

 

 

Cancel the data backup.

 

To cancel the running data backup use the following SQL command:

 

backup cancel <backup_id>

 

 

Check that the data backup was canceled.

 

 

 

 

 

  • 23) To check the state of the data backup, use the following command:

 

select state_name from "SYS"."M_BACKUP_CATALOG" where backup_id = <backup_id>
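The statements from sections 22 and 23 can also be issued from the command line. A minimal sketch (assumption: a hdbuserstore key ADMIN exists for a user with CATALOG READ and BACKUP ADMIN; the identifier quoting is dropped for shell friendliness, which is equivalent for these uppercase names):

# find the ID of the currently running complete data backup
hdbsql -U ADMIN "select BACKUP_ID from SYS.M_BACKUP_CATALOG where entry_type_name = 'complete data backup' and state_name = 'running' order by sys_start_time desc"

# cancel it (replace <backup_id> with the ID returned above)
hdbsql -U ADMIN "backup cancel <backup_id>"

# verify that the state has changed
hdbsql -U ADMIN "select state_name from SYS.M_BACKUP_CATALOG where backup_id = <backup_id>"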

 

 

 

 

 

  • 24) hdblogdiag & Recovery

 

Recover the database until a point in time with a timestamp using a dedicated directory for
data backups and further directories containing automatically written log backups.

 

Use the following command:

 

RECOVER DATABASE UNTIL TIMESTAMP '2011-08-22 15:00:00' CHANGE ALL DATA USING PATH
  ('/backup/MONDAY/') CHANGE ALL LOG USING PATH ('/backup/logs1/','/backup/logs2/')

 

 

 

 

 

  • 25) Restore when log backups are missing

 

ref: SAP Note 1816483 - Restore when log backups are missing

 

A restore fails with the error "Recovery could not be completed Cannot
open file ...".

 

One or more log backups of the SAP HANA database are missing or damaged.

 

All the available log backups exist in the file system of the database server.

 

If not all the required log backups are available, an SAP HANA database
restore is subject to restrictions.

 

In most cases, you will only be able to restore the SAP HANA database to a point
in time that lies before the time when the oldest missing log backup was written.

 

A restore to any point in time (including redo of the log area) is possible
only if the content of the missing log backup still exists in the log area.

 

To determine if this is the case, proceed as follows:

 

Determine the volume IDs and the lowest log item of the missing log backup
from the name of the backup. If a previous restore has failed with the
error message specified above, you can determine the name of the missing
log backup from the error message.

 

The log backups are named after the schema log_backup_<volume ID>_0_<lowest
log item>_<highest log item>, for example,

 

log_backup_2_0_12345_67890.

 

In this example, the volume ID would be 2 and the lowest log item 12345.

 

Use hdblogdiag to determine the lowest log item that still exists in the
log segments of the log area as follows:

 

Call

 

hdblogdiag seglist <directory of the log segments for volume ID>

 

This call issues a list of log segments that looks as follows:

 

LogSegment[0/0:<lowest log item>-<highest log item>(...

 

LogSegment[0/1:<lowest log item>-<highest log item>(......

 

Take into account that the system outputs the log items in hexadecimal
format and that you have to convert them into decimal numbers.
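A quick way to do the conversion on the Linux shell (bash printf understands the 0x prefix):

printf '%d\n' 0x129540    # prints 1217856
printf '%d\n' 0x131c00    # prints 1252352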

 

Example:

 

Assuming the parameter basepath_logvolumes has the value /hana/log/HAN and
the parameter use_mountpoints has the value yes, so that the log segments
lie in the subdirectory mnt0001.

 

The log segments for the volume with the volume ID 2 then lie there in the
subdirectory

 

hdb00002: > hdblogdiag seglist /hana/log/HAN/mnt0001/hdb00002

 

  LogSegment[0/0:0x129540-0x131c00(...
  LogSegment[0/1:0x131c00-0x139540(...

 

Log segment 0 has the lowest log item 1217856 (hexadecimal 0x129540), log
segment 1 has the lowest log item 1252352 (hexadecimal 0x131c00). This
means that the lowest log item that exists in the log segments is 1217856.

 

If the lowest log item of the missing log backup is bigger than the lowest
log item that still exists in a log segment of the log area, a restore to
the latest available point in time is possible.

 

In such a case, proceed as follows:

 

Generate a new backup catalog with
 

hdbbackupdiag --generate (SAP Note 1812057)

 

Start the restore for the point in time that you selected.
If a restore to the latest available point in time is not possible, you can
restore the SAP HANA database to an earlier point in time.

 

You can restore the SAP HANA database only with the log backups that were
written before the missing log backup.

 

You cannot use the log backups that were written after the missing log
backup.

 

In this situation, proceed as follows:

 

Generate a new backup catalog with

 

hdbbackupdiag --generate (SAP Note 1812057).

 

Start the restore for the point in time that you selected. For this, you
must select "Initialize Log Area".

 

 

 

 

 

  • 26) hdbbackupdiag

 

The program hdbbackupdiag provides support for determining the data backup
files and log backup files that are required to restore an SAP HANA
database, with "Recover the database to its most recent state" as the aim.

Depending on the availability of the file BackupCatalog.xml that is saved
under the path $DIR_INSTANCE/../SYS/global/hdb/metadata and that is used to
determine the restoration strategy, two application scenarios arise; they
are both described in the following section.

 

Scenario a: BackupCatalog.xml is available

 

If the current version of the file BackupCatalog.xml exists, the program

 

hdbbackupdiag

 

determines the most suitable data backup and creates a list
of the required log backup files.

 

To do this, you must call the program with the option -c <BackupCatalog>.

Use the option -c <BackupCatalog> to specify the name of the BackupCatalog
file.

 

In normal cases, this is BackupCatalog.xml.

 

If the program is not started from the directory in which the file

 

  BackupCatalog.xml
 
is located, you can use the option -d <Path> to adjust the access path.

 

The following example illustrates the function:

 

hdbbackupdiag -d $DIR_INSTANCE/../SYS/global/hdb/metadata -c BackupCatalog.xml

 

Output:

 

Data backup file: thursday_databackup_0_1

RedoLogPosition: 0

Backup time: 2012-04-12T17:30:03+02:00

 

 

Using hdbbackupdiag to Reconstruct the Backup Catalog

 

A recovery of SAP HANA database fails. The file "backup.log" contains the
error messages:

 

"recovery strategy could not be determined" and

 

"Catalog backup not found".

 

The backup catalog that the database created is not available or defective.

All data backups and log backups that are required for the recovery are
available in the file system of the database server.

 

You can use the program "hdbbackupdiag" to create a new backup catalog.

 

For this, you provide all the data backups and log backups in the file
system of the database server. If you cannot use the standard paths for
data backups and log backups due to space restrictions, you can also save
the files in different directories. Note that all files of the data backup
must be saved to one directory. However, log backups can be distributed
across multiple directories.

 

To do this, start the program hdbbackupdiag with the following options:

 

--generate

 

--dataDir <directory of the data backups>

 

--logDirs <directories of the log backups>

 

-d <target directory of the backup catalog>

 

All directories must be specified as absolute paths.

 

For a standard installation, the call is as follows, for example:

 

hdbbackupdiag --generate --dataDir $DIR_INSTANCE/backup/data
  --logDirs $DIR_INSTANCE/backup/log -d $DIR_INSTANCE/backup/log

 

If you have distributed the log backup across multiple directories, you can
specify them behind the option --logDirs separated by commas. For example:

 

hdbbackupdiag --generate --dataDir $DIR_INSTANCE/backup/data
  --logDirs $DIR_INSTANCE/backup/log,/mnt/disc2/log -d
  $DIR_INSTANCE/backup/log

 

When you call "hdbbackupdiag", the content of the specified directories is
analyzed and a new backup catalog is generated. This backup catalog is
created with a directory for the log back up that is specified with -d.

 

The new backup catalog has the file name "log_backup_0_0_0_0.n", where n is
the newly generated backup ID of the database.

 

To be able to verify the content of the backup catalog, you can output the
recovery strategy that is based on the latest data backup in the generated
backup catalog. For this, use the command

 

"hdbbackupdiag -d <target directory of the backup catalog>"

 

After you generate the backup catalog, a recovery of the database may be
carried out.

 

In case the backup catalog is not created in the standard directory
$DIR_INSTANCE/backup/log, you must specify the directory in which the
backup catalog is located in the Recovery Wizard in the step "Locate Log
Backups".

 

For more information check OSS Note:

SAP Note 1812057 - Reconstruction of the backup catalog with hdbbackupdiag

 

 

 

 

 

  • 27) hdbcons

http://scn.sap.com/community/hana-in-memory/blog/2012/12/28/how-to-use-sap-hana-hdbcons-utility

 

hdbcons is a tool to provide information about the actual used memory, the peak memory,

the throughput of the different allocators, etc.

 

SAP Note 1786918 - Required information to investigate high memory consumption

You are running SAP HANA and you experience high memory consumption / out
of memory situations.

 

You are asked to provide information using hdbcons.

 

SAP HANA Development Support requires information about a suspected memory
issue.

 

Connect to your HANA database server as user sidadm.

 

a. You suspect that a query requires too much memory:

Before running the query start the tracing:

 

hdbcons 'mm flag / -rs as'

 

Run the query.

 

Stop the tracing:

 

hdbcons 'mm flag / -rd as'

 

Write output:

 

hdbcons 'mm callgraph -r -o <filename> /'

 

Please be aware that this trace will slow down the system!

 

You suspect a memory leak:

 

Please provide the output files of the following commands:

 

a. hdbcons 'mm bl -rt Pool' > /<path>/<file_name.txt>

 

This will show us how much memory is allocated by which code line.

 

b. hdbcons 'mm l -s -S -p' > /<path>/<file_name.txt>

 

This will show which allocator allocates how much memory at the
moment and its peak memory allocation.


The memory consumption stays at a high level / the database seems to
hang because of high memory consumption

 

Please provide a runtime dump as described in SAP Note 1813020 and
the content of the monitoring view M_HEAP_MEMORY.

 

To get the threads list with hdbcons:

   

hdbcons "context list -s" > thread_callstack_test.txt


To generate runtime dump (see the Note 1813020 - How to generate a runtime dump on SAP HANA)

 

hdbcons "runtimedump dump -f /test/rte_dump.txt"

 

 

 

 

 

  • 28) Sizing Hana Basic Calculations

 

SAP BW Sizing On Hana Summary

 

RAM = (Source data footprint - 60gb) * 2 / 4 * c1 + 90gb

 

or

 

RAM = ( colstore tables footprint * 2 / 4 + rowstore tables footprint /1.5) * c1   + 50gb

 

Disk persistence = 4 * RAM

 

Disk Log = 1 * RAM

 

c1 = source database specific compression factor (where applicable)


Memory Sizing: Runtime Objects

 

RAM dynamic + RAM static

 

Total RAM is

 

RAM = (Source data footprint - 60gb) * 2 / 4 * c1 + 90gb

 

or

 

RAM = ( colstore tables footprint * 2 / 4 + rowstore tables footprint /1.5) * c1    + 50gb
 
  c1 = source database specific compression factor (where applicable)

 

 

Disk Sizing

 

Disk persistence = 4 * RAM

 

Disk Log = 1 * RAM
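A purely illustrative calculation with assumed numbers (not a real sizing): for a source data footprint of 1,000 GB and c1 = 1, the first formula gives

RAM              = (1000 GB - 60 GB) * 2 / 4 * 1 + 90 GB = 560 GB

Disk persistence = 4 * 560 GB = 2,240 GB

Disk Log         = 1 * 560 GB = 560 GB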

 

Useful OSS Notes:

 

OSS 1736976 - Sizing Report for BW on HANA

 

OSS 1514966 - SAP HANA 1.0: Sizing SAP In-Memory Database

 

SAP BW on HANA Sizing Report - https://websmp209.sap-ag.de/~sapidb/012006153200000051552013E/SAP_BW_on_HANA_Sizing_Report_V1_4.pdf

 

OSS 1799449 - Sizing report ERP on SAP HANA database

 

OSS 1855041 - Sizing Recommendation for Master Node in BW-on-HANA

 

 

 

 

 

 

  • 29) HANA SSO with Kerberos and Active Directory

 

The configuration is a complex task with various sources of error.

 

Main reasons are the cross OS nature of the setup (Linux/ Windows), often resulting

in problems with case (Windows: case insensitive, Linux: case sensitive) and the tight

integration into the network configuration (/etc/hosts, DNS).

 

In particular, there are various sources of error when creating the keytab and exporting it

from Active Directory.

 

The process is highly manual and consists of several steps across different operating

systems and hosts.

 

A python script is provided in SAP Note 1813724 which automates the creation of the
keytab.

 

In addition, the IP configuration at the HANA appliance is validated  (hostname resolution

/ reverse lookup) and the Kerberos configuration is checked for consistency with the remote

Active Directory.

 

Unzip the attached hdbkrbconf.zip and run as <sid>adm

 

$ python hdbkrbconf.py -h

 

for help.

 

There are two main options, "-k" for creating the keytab and "-v" for validating the

configuration including the keytab.

 

Option "-V" will provide verbose output, together with an analysis file for SAP support.

 

This file will also be written in the event of an error.

 

More info, check: SAP Note 1813724 - HANA SSO/Kerberos: create keytab and validate conf

 

 

 

 

 

 

  • 30) Row Store Reorganisation

 

Row store memory size is a lot bigger than the actual data size in row store and shows

high fragmentation ratio.

 

Row store grows by allocating a 64MB memory segment and shrinks by freeing empty

segments.

 

A segment is internally divided into fixed-size pages. When a row store table requires more

memory to store records, the table takes a free page from existing segments. If no segment

has a free page, a new segment is allocated.

 

Deleting a large number of records may result in a number of sparse segments. In such a case,

row store reorganization can be performed for memory compaction. The pages in sparse

segments are moved to other segments and the resultant empty segments are freed.

 

If the prerequisites are not satisfied, row store reorganization should not be executed.

 

The prerequisites, queries, and monitoring views used for the queries are applicable to Rev 50 and may change in future releases.

 

o HANA Database Rev 50 or later

 

- The HANA database has to be upgraded prior to running row store
  reorganization.

 

o Catalog integrity check

 

- The following procedure call should run successfully and
  return an empty result set.

 

CALL CHECK_CATALOG
('CHECK_OBJECT_REFERENTIAL_INTEGRITY','','','');

 

o Row store reorganization is recommended, when allocated row store
size is over 10GB and free page ratio is over 30%.

 

SELECT HOST, PORT, CASE WHEN (((SUM(FREE_SIZE) /
SUM(ALLOCATED_SIZE)) > 0.30) AND SUM(ALLOCATED_SIZE) >
TO_DECIMAL(10)*1024*1024*1024) THEN 'TRUE' ELSE 'FALSE' END
"Row store Reorganization Recommended", TO_DECIMAL(
SUM(FREE_SIZE)*100 / SUM(ALLOCATED_SIZE), 10,2) "Free Space
Ratio in %",TO_DECIMAL( SUM(ALLOCATED_SIZE)/1048576, 10, 2)
"Allocated Size in MB",TO_DECIMAL( SUM(FREE_SIZE)/1048576, 10,
2) "Free Size in MB" FROM M_SHARED_MEMORY WHERE ( CATEGORY =
'TABLE' OR CATEGORY = 'CATALOG' ) AND PORT LIKE '%03' GROUP BY
HOST, PORT;

 

If the result of "Reorganization Recommended" is "TRUE", then row
store memory can be reclaimed after row store reorganization.

 

o Estimation on maximum possible memory space that can be reclaimed:

 

SELECT SUM(NUM_FREE_DATA_PAGES)*16/1024 "Estimated Maximum
Memory Saving in MB" FROM SYS.M_DEV_MEMORY_SEGMENT WHERE
SEGMENT_TYPE = 0;

 

o Check disk space where log files are located.

 

SELECT (SUM(NUM_FREE_DATA_PAGES)*16/1024)*1.5 "Estimated Log
Volume Size in MB" FROM SYS.M_DEV_MEMORY_SEGMENT WHERE
SEGMENT_TYPE = 0;

 


Row store reorganization procedure

 

1. Make a complete database backup

 

2. Upgrade database to Rev50 or later

 

3. Make sure to start up database at least once during upgrade.

 

4. Shutdown HANA database.

 

5. Set up configuration parameter in indexserver.ini via HANA Studio.
  [row_engine] page_compaction_enable = true

 

6. Restart the database. Startup will take longer than normal startup
  when row store reorganization runs.

 

7. Check the memory fragmentation status after reorganization:

  

SELECT HOST, PORT, CASE WHEN (((SUM(FREE_SIZE) /
   SUM(ALLOCATED_SIZE)) > 0.30) AND SUM(ALLOCATED_SIZE) >
   TO_DECIMAL(10)*1024*1024*1024) THEN 'TRUE' ELSE 'FALSE' END
   "Row store Reorganization Recommended", TO_DECIMAL(
   SUM(FREE_SIZE)*100 / SUM(ALLOCATED_SIZE), 10,2) "Free Space
   Ratio",TO_DECIMAL( SUM(ALLOCATED_SIZE)/1048576, 10, 2)
   "Allocated Size in MB",TO_DECIMAL( SUM(FREE_SIZE)/1048576, 10,
   2) "Free Size in MB" FROM M_SHARED_MEMORY WHERE ( CATEGORY =
   'TABLE' OR CATEGORY = 'CATALOG' ) AND PORT LIKE '%03' GROUP BY
   HOST, PORT;

 

8. Check Indexserver trace

 

Please check the indexserver trace files after row store
  reorganization is done.

 

Row store reorganization is performed in 11 phases when successful.
 
  It prints out "[RSReorg] start" message at the beginning,

 

"[RSReorg] success" message followed by "Reorganization statistics"
  at the end. When there are errors during reorganization, row store
  reorganization is stopped in that phase and all the changes are
  rolled back and database starts up to return to its original state

 

Ref: SAP Note 1813245 - SAP HANA DB: Row store reorganization

 

 

 

 

 

  • 31) How to generate a runtime dump on SAP HANA saphostagent/sapdbctrl for Hana  

 

SAP Support asks you to generate a runtime dump during a high CPU/Memory consumption

 

a) Log into the linux HANA host as sidadm user;

 

b) Run command 'hdbcons';

 

c) On the hdbcons console run command below:
 
  > runtimedump dump

 

This will create a file under traces directory called 'indexserver....rtedump.trc'.

 

d) Attach generated trace file to the OSS Message

 

Or alternatively

 

hdbcons "runtimedump dump -f /test/rte_dump.txt"

 

More info check SAP OSS Note: 1813020

 

 

 

 

 

  • 32) DBSL hints for SAP HANA

 

The following DBSL hints are available for SAP HANA.

 

o dbsl_equi_join

 

You can use this hint for a FOR_ALL_ENTRIES in order to inform the
DBSL that an EQUI join can be created from this FAE statement; this
EQUI join can then be resolved for an IN data set.

 

You must then specify this hint together with the hint of the DBI interface
(&prefer_join 1&) in the statement (see Note 48230).

 

ABAP

 

SELECT * INTO <rtab> FROM <table> FOR ALL ENTRIES IN itab
WHERE a = itab-a AND b = itab-b AND c = itab-c AND d = itab-d
%_HINTS HDB '&prefer_join 1&' hdb 'dbsl_equi_join'

 

From this, the system generates the following statement for the
database.

 

SQL for the database

 

select * from <table> where (a, b, c, d) IN
( (?, ?, ?, ?), ... (?, ?, ?, ?) )

 

o dbsl_add_stmt

 

Use: dbsl_add_stmt <SQL enhancement>

 

This hint provides you with the option of adding an additional SQL
enhancement to the SQL statement in the DBSL. This hint is resolved
in the DBSL and the system appends the following text to the end of
the statement that was generated (after the keyword
dbsl_add_stmt).

 

ABAP

 

SELECT node FROM <table> CONNECTION ('HANA') INTO TABLE T_NODE
%_HINTS HDB 'dbsl_add_stmt WITH PARAMETERS (''expression'' =
''LEAVES("Node1")'' )'

 

SQL for the database

 

select node from table
with parameters ( 'expression' = 'leaves("Node1")' )

 

This hint is available in the following releases:

 

720 DBSL as of patch level 103

 

720-EXT DBSL as of patch level 103

 

803 DBSL

 

Note

In Release 7.20, you cannot use the keyword %_HINTS HDB to define the
hints for SAP HANA; instead, you must use the keyword %_HINTS ADABAS.

The database interface ensures that this hint is also generated for an SAP
HANA connection.

 

Ref: Useful OSS SAP Note 1622681 - DBSL Hints for Hana

 

 

 

 

 

 

  • 33) hdbrename

http://help.sap.com/businessobject/product_guides/HAN01SP4/en/hana_sps4_HDB_server_inst_en.pdf

 

You can rename an SAP HANA database system as described in this section.

 

Caution:

 

Renaming a SAP HANA database system that is running with a permanent SAP license invalidates the
license so that the instance is locked down.

 

You need to request a new license key with the new system

 

ID and the new hardware key

(generated by renaming) in order to unlock the renamed system.

This especially impacts renaming in batch mode: since the renamed instance is locked down,

no other activities except license installation are then possible.

 

For more information, see SAP License Key.

 

Before performing the rename, you need to meet these prerequisites:

 

• You identify an existing SAP HANA database system, created by an installation or a file system
copy, that is to be renamed.

 

• The target system ID (SID) must not exist.

 

• The target instance number must not exist.

 

• You are logged in as the root user.

 

• If a mountpoint contains the system ID, you must first rename it manually before performing the
rename of the SAP HANA database.

 

 

To rename the instances of a distributed system, you perform the rename for all hosts with a single call of hdbrename.

 

Renaming a System in Interactive Mode

 

Make sure that you meet the prerequisites in Renaming a SAP HANA Database System .

 

In interactive mode, the program queries all essential parameters that are not defined using

command line options as well as parameters that have invalid values.

 

a. Open a root shell.

 

b. Via the local mount point, change to the following directory on the shared file system:

 

/<sapmnt>/<SID>/global/hdb/install/bin

 

c. Call the hdbrename program by entering this command:

 

./hdbrename

 

d. Specify the required data.

 

The system is now renamed.

 

 

 

 

  • 34) hdbnsutil

 

Example Usage:

 

OSS Note 1738390 - How to update SAP HANA Linux server hardware key

 

When you have installed SAP HANA, you want to update the Linux server hardware key.

 

Prerequisite: You have logged on to Linux as SAP HANA instance administrator.

 

To update the hardware key, follow the procedure below:

 

Stop the SAP HANA DB instance
 

Remove line 'id=<a number>'
  

in /usr/sap/<SID>/SYS/global/hdb/custom/config/nameserver.ini

 

Execute 'hdbnsutil -convertTopology'

 

Start the SAP HANA DB instance

 

NOTE: Reinstallation of SAP HANA will generate a different hardware key, but an upgrade of SAP HANA will not change the hardware key.

 

 

 

 

 

 

  • 35) hdbsrvutil - Example Usage

https://cookbook.experiencesaphana.com/bw/operating-bw-on-hana/hana-database-administration/troubleshooting/sap-support/location-and-usage/

 

Collecting Location and Usage Information

 

There is a Python script which allows you to collect support information for customer systems

even when no access to the system via SQL is possible.

 

This information can be added to the support message.

 

The name of the Python script is fullSystemInfoDump.py and is part of a server installation.

 

It runs from a command line and is located in the directory $DIR_INSTANCE/exe/python_support (Linux).

 

To execute the script <sid>adm rights of the customer system are required.

 

To start the script out of its location directory, enter: python fullSystemInfoDump.py

 

By default the script creates a zip file with all of the collected support information to the directory

 

DIR_TEMP/system_dump where DIR_TEMP is the content of the variable with the same name in sapprofile.ini.

 

This output directory is shown as console output when the script is running, but it can be looked up by entering:

 

hdbsrvutil -z | grep DIR_TEMP=

 

To change the default directory, an explicit absolute path can be given to the script, for example:

 

python fullSystemInfoDump.py <output_dir>

 

Usage information can be displayed by entering:

 

python fullSystemInfoDump.py -h

 

To collect support information you need an SQL user with rights to select the system tables and

views listed in System Tables/Views.

 

For security reasons the user name and password for this SQL user cannot be given as command

line parameters to the script.

 

If the customer system can be reached via SQL and the user name and password information is valid,

the script starts collecting support information. If user name and/or password are invalid, the script aborts.

 

If the customer system cannot be reached via SQL the script only collects information which can be

read without SQL access. The resulting zip file name has the following structure:

 

fullsysteminfodump_<SAPLOCALHOSTFULL>_<SAPSYSTEMNAME>_<YYYY>_<MM>_<DD>_<HH>_<MM>_<SS>.zip

 

SAPLOCALHOSTFULL and SAPSYSTEMNAME are again taken from sapprofile.ini.

 

 

 

 

 

  • 36) Show Hana Processes HDB proc & HDB info

HDB proc 

 

HDB info

 

e.g. /usr/sap/HDB/HDB00/HDB info

SAP HANA Enterprise Cloud - important step to... where?


During SAP Global Press Conference web-cast (Tuesday, May 7th) SAP revealed their vision for SAP cloud-based offering. This announcement was followed by a series of blogs and SAP Sapphire sessions/keynotes/discussions where SAP did a really good job on explaining how individual parts will fit together and what are the differences between individual offerings.

 

More on this subject can be found in the following blogs:

 

Sunshine on a Cloudy Day (by Vishal Sikka)

http://www.saphana.com/community/blogs/blog/2013/05/07/sunshine-on-a-cloudy-day

 

SAP HANA Enterprise Cloud, SAP HANA Cloud Platform, SAP NetWeaver Cloud Platform, NEO, ... (by Björn Goerke)

http://bgoerke.wordpress.com/2013/05/11/sap-hana-enterprise-cloud-sap-hana-cloud-platform-sap-netweaver-cloud-platform-neo

 

Evolution of the SAP HANA Cloud Platform (by Aiaz Kazi)

http://www.saphana.com/community/blogs/blog/2013/05/10/evolution-of-the-sap-hana-cloud-platform

 

Sapphire Now 2013 report card (by Jon Reed)

http://diginomica.com/2013/05/17/sapphire-now-report-card

 

I personally believe that it would be very useful to describe SAP HANA Enterprise Cloud also from technical side (some architectural diagram on how it is built) - but that is not the point of this blog.

 

While SAP did really well on explaining this particular move I still believe that it is not clear enough (at least to me) where exactly this move is leading to in long-term perspective. There is space for different interpretations of where SAP is going and how the vision of SAP future can look like.

 

Before I start I would like to point out that all of this is pure speculation based on lack of information on this subject. I hoped that this would become clearer during SAP Sapphire but I did not find anyone able to provide me with a satisfying answer to this question.

 

I would also like to highlight that content of this blog describes my personal opinion and is not representing in any way the position of my employer.

 

Speculation #1: SAP as the exclusive cloud provider for SAP HANA technology

 

First interpretation of this move (and the most logical) is that SAP is aggressively entering the hosting business with the goal to take over (ideally) all the new SAP HANA based installations as part of their SAP HANA Enterprise Cloud offering.

 

Of course there will be customers that will prefer to stay on the premise and customers preferring other hosting partners and these will continue to be supported by SAP, but all customers will be pushed by SAP to take advantage of SAP HANA Enterprise Cloud.

 

What does it mean? This might have considerable impact on the SAP ecosystem. From the perspective of SAP HANA Enterprise Cloud many SAP partners will not be partners anymore and will rather become suppliers - of hardware, of software (for backups or monitoring), and possibly also of services (in case that SAP will decide to outsource part of managed services). In any way SAP would step between the customer and the partner, taking over the control over the engagement and closing the options for customers.

 

In other words - there will not be any need for basis teams or infrastructure architects (except those working directly for SAP). This can be seen as a serious blow not only to partners themselves, but also to the whole SAP ecosystem, because some roles will become redundant (from the perspective of SAP HANA Enterprise Cloud).

 

Note: Of course nothing will change in the on-premise world, but this world will become smaller and smaller over time.

 

What about non-HANA based applications? SAP announced that the purpose of SAP HANA Enterprise Cloud is to host SAP HANA based applications. But what about the rest of the customer environment? Possibly, this is where partner cloud offerings (that are usually more generic and are able to host various technologies) come into play.

 

During SAP TechEd 2012 in Las Vegas SAP was predicting the time of "interconnected clouds". Is this the vision describing SAP's future?

 

SAP announced that all SAP applications will be able to run on SAP HANA. Therefore this also means that all SAP applications will be supported in SAP HANA Enterprise Cloud.

 

Does this mean that in the future SAP will expect that all SAP software will be hosted in SAP HANA Enterprise Cloud and all non-SAP software in partner clouds?

 

Of course we are speaking about SAP's vision for the future - the real result can be completely different, as it depends on what customers will demand.

 

Speculation #2: SAP HANA Enterprise Cloud as a „prototype“ for other cloud partners (or maybe also SAP customers)

 

The second interpretation can be seen as the opposite extreme of the previous one.

 

SAP published SAP HANA Enterprise Cloud FAQ:

http://www.saphana.com/docs/DOC-3554

 

In this FAQ you can find the following question and a corresponding answer:

 

Q: Is SAP getting back into the hosting business? A few years back SAP said that hosting was a partner’s domain and many invested in that business.

 

A: SAP HANA Enterprise Cloud is not a generic hosting offering, but allows enterprise customers an option to accelerate their SAP HANA, SAP Business Suite, and SAP NetWeaver BW powered by HANA deployments; certified partners are encouraged to provide their own SAP HANA based offerings to customers.

 

Also you might notice in various places that one of the important incentives for SAP behind offering SAP HANA Enterprise Cloud was their goal to enable faster SAP HANA adoption and to remove all roadblocks for potential customers (particularly in the area of HW procurement - you do not need to buy any HW in case of SAP HANA Enterprise Cloud).

 

Does this mean that SAP does not see SAP HANA Enterprise Cloud as a product that it will try to push forward aggressively, but rather as a tool that will help it accelerate adoption of SAP HANA technology?

 

If so, it would be logical to expect that SAP will challenge its hosting partners and cloud providers to build their own SAP HANA Enterprise Clouds according to SAP specifications, because this would increase adoption of SAP HANA much more. In this case I would expect SAP to release a reference architecture for SAP HANA Enterprise Cloud, its Cloud Frame technology and best practices on how to operate such an environment.

 

The next question is whether big customers would be allowed to build their own SAP HANA Enterprise Clouds in their own data centers. There are customers that still run their own data centers, and these customers could benefit from this possibility.

 

What does it mean? This scenario would have a totally different impact on hosting partners, as it would only change the way they provide their hosting services. Partners could continue to provide on-premise hosting and, in parallel, complement their existing generic cloud offerings with a dedicated co-located SAP HANA cloud offering.

 

Other partners (hardware, software or even services) would also be in a better position, because they would be able to form alliances with individual cloud providers. The whole SAP ecosystem would be heavily transformed, but would survive this change.

 

And customers that prefer to run their SAP business in their own data centers could still benefit from SAP HANA Enterprise Cloud without giving up their independence.

 

Summary

 

To me it is not clear where SAP is heading, and I believe that anything between these two extreme options is possible. I totally agree that SAP has made a very important step, but in which direction?

 

I think that SAP should better formulate its long-term vision, as this will (either way) seriously impact the whole SAP ecosystem - all SAP partners, all SAP customers and all practitioners.

 

I would welcome any constructive criticism or explanations - it might be my mistake in not understanding SAP's strategy (and I fully admit that this is possible), but I consider this unclear situation so important that I cannot afford to ignore it.

SAP HANA Administration tables/views


The following are some of the important HANA tables/views required for administration:

M_LICENSE - Contains license information
_SYS_PASSWORD_BLACKLIST - Contains the list of blacklisted passwords
GRANTED_PRIVILEGES - Privileges granted directly to the specified user (or role); privileges contained within granted roles are not shown
GRANTED_ROLES - All roles granted directly to the specified user (or role)
EFFECTIVE_PRIVILEGES - Privileges granted to the specified user both directly and indirectly through roles, listed separately
EFFECTIVE_ROLES - All roles granted to the specified user both directly and indirectly through other roles, listed separately
M_VOLUME_IO_STATISTICS - Aggregated I/O statistics (total read and write size) for each volume since the service in question was last started
M_VOLUME_IO_STATISTICS_RESET - The same I/O statistics since the last reset time
M_LOG_SEGMENTS - Monitoring view for log segments
M_DATA_VOLUME_PAGE_STATISTICS - Displays statistics on the data volume's pages
M_SERVICE_MEMORY - Lists the memory usage information per service
M_HEAP_MEMORY_RESET - Heap memory usage statistics since the last reset
M_CS_TABLES - Provides column store table information
M_RS_TABLES - Provides row store table information
M_CS_COLUMNS - Contains memory-related information about the columns of column store tables
M_HOST_RESOURCE_UTILIZATION - Lists the resource utilization by host
M_DELTA_MERGE_STATISTICS - Information about all delta merge operations since the last system start
HOST_DELTA_MERGE_STATISTICS - Delta merge statistics collected by the statistics server
GLOBAL_MEMORY_STATISTICS - Global memory consumption of the database over the last 30 days
M_EXPORT_BINARY_STATUS - Monitor the progress of a running export
M_IMPORT_BINARY_STATUS - Monitor the progress of a running import
M_PERSISTENCE_ENCRYPTION_STATUS - Check persistence encryption status
M_PERSISTENCE_ENCRYPTION_KEYS - Check how long the current encryption key has been valid
M_BACKUP_CATALOG - Provides an overview of information about backup activities
M_BACKUP_CATALOG_FILES - Provides information about the backups created and the backup destinations used by data and log backups
M_SERVICE_REPLICATION - Contains the basic status of replication between primary and secondary systems
M_CONVERTER_STATISTICS - Used to estimate the size of the next complete data backup
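
All of these views can be queried with plain SQL, for example from the hdbsql command-line client. A minimal sketch (assuming the HANA client is installed and that instance number 00 and the SYSTEM user fit your system - adjust as needed):

hdbsql -i 00 -u SYSTEM -p "<password>" \
  "SELECT HOST, SERVICE_NAME, ROUND(TOTAL_MEMORY_USED_SIZE/1024/1024) AS USED_MB FROM M_SERVICE_MEMORY"

hdbsql -i 00 -u SYSTEM -p "<password>" \
  "SELECT TOP 5 ENTRY_TYPE_NAME, UTC_START_TIME, STATE_NAME FROM M_BACKUP_CATALOG ORDER BY UTC_START_TIME DESC"

The same statements can of course also be run from the SQL console in SAP HANA Studio.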


Introducing this year’s SAP HANA Startup Forum Berlin 2013 participants – Part I


SAP HANA Startup Forum Berlin 2013 is only three weeks ahead of us. We have been able to recruit 30+ exciting startups and would like to introduce you to them. Follow this blog series to find out who this year’s participants are and what they do. Let’s start with the first two:


Apaxo is a platform that helps online merchants to make more business out of subscription-based products in their portfolio. In order to do so, the platform finds the right customers for the products through semantic and web analyses of anonymized customer profiles and makes respective recommendations to the merchants. While browsing the recommendations, analytical data is also being captured to learn more about the customer. Apaxo is operated as a virtualized cloud system and hosted in multiple high-security data centers, supporting best practice privacy policies.


FlowTrack by Blue Cell Networks offers a unique solution for customer journey analysis that can measure and analyze customer behavior at the point of sale automatically and in real-time. The technology behind this solution detects Bluetooth and Wi-Fi signals from mobile devices and provides anonymized calculations of the number of people at the point of sale, their retention period, movements etc. This way, retailers can optimize their product placement, better plan their workforce and increase customer satisfaction by predicting and minimizing waiting time.


Keep checking back to find out about other participants!


Related content:

SAP HANA Startup Forum Berlin 2013 announcement

Phenomenal turnout for the SAP HANA Startup Forum in Berlin 2012

Why the “Corner 3” is the Most Important Shot of the #NBA Finals


Note: This is the latest in my blog series to help fans understand the value SAP will be bringing to the sports and entertainment industry. Today's post was inspired by my sports analytics hero Kirk Goldsberry who writes for Grantland.com. You can read my NBA Sabermetrics 101 blog post as a primer if interested.

 


2013-NBA-Finals-Logo.jpg

With the NBA Finals about to start, our attention has been focused on the superstars for each team: LeBron James & Dwyane Wade for the Miami Heat, and Tony Parker & Tim Duncan for the San Antonio Spurs. However, as NBA.com/stats (powered by SAP HANA) shows us, one unheralded role player (Shane Battier) could hold the key to the entire series. NBA.com’s “Shot Charts” feature shows us that Battier’s ability to make “Corner 3s” will have a huge impact on which team will win the NBA title.

 

What is a “Corner 3” and why is it important?

corner 3 labels.PNG


As the name implies, a “Corner 3” is a 3-point shot taken from either corner facing the basket (see image to the right). Some consider it the most important shot in basketball because it “stretches” a defense if you can put a great 3-point shooter in those corners. It creates space for other players to drive to the basket, it creates wider passing lanes, and it generally unclutters the area closest to the basket.

 

The Heat used this strategy to perfection during the 2012-2013 season. With LeBron James and Dwyane Wade being constant threats to drive to the basket, defenses were forced to collapse inside, thus freeing up shooters like Battier and Ray Allen to make corner 3s. While Allen is arguably the greatest pure shooter in history, Battier actually had the better year, making 46% of his 191 corner 3s (more than anyone in the NBA).

 

What does it look like?


The video below shows how Battier is able to hover in the far corners (essentially unguarded) and wait for passes from LeBron James.

 

 

 

While this worked beautifully during the regular season, Battier has gone ice cold during the playoffs - thanks in large part to the great Indiana Pacers defense. The Shot Charts below compare his field goal percentage during the regular season vs. the playoffs. Notice how the Corner 3 areas have turned from green to red, indicating that he is shooting below league average from those spots during the playoffs.

 


battier charts together.PNG

The game within the game


When Battier (and Allen) are making their corner 3s, the Heat are virtually unbeatable because opposing teams are put into a no-win situation. Either they let James drive to the basket where he is like a runaway freight train, or they give up a 3-pointer. Interestingly, the Spurs also rely heavily on the corner 3 (they ranked 3rd in the NBA in corner 3s made while Miami ranked 1st). When you watch the series, look for this “game within the game” to see how well each team’s 3-point shooters are doing. It could decide your next NBA champion.

Integrate the Right Kind of Drama into your Organisation (reposted from Blue Ocean Systems Pte Ltd)


No one likes a drama, especially in business. Your operation needs to run smoothly, quickly and underpin your profits; otherwise why are you in business? SAP HANA promises drama; but only the best. The best way to curate your data, the best way to gain accurate insights into your organisation, the best kind of analytics your company needs to grow.  No longer do SMEs need to settle for anything but the optimum approach.

Let’s imagine the possibilities!

  • Being able to pull a report in seconds, on all your sales orders entered in the last day, hour or even 10 minutes
  • Generating a customer’s sales history whilst in a sales meeting with them and being able to beat your competitors by offering the client the best deal possible – immediately (while still retaining your profit margin)
  • Providing instant approval on a large-scale purchase your procurement team has found knowing the numbers add up (because you can check) – and all done from your Smartphone while you are on a business trip
  • Reacting immediately to customer trends and demands for products based on up-to-the-minute analytics (imagine the seasonal product peaks and knowing day-by-day, hour-by-hour which products are moving faster)
  • Stocking, restocking and upselling capabilities which mean you never run out of supplies and rarely house surplus, inactive stock because the data allows you to optimise your inventory levels through HANA’s predictive analysis. Sales targets will never be the same again!
  • Running your customer information line (or call centre) as a central hub to the rest of your business and being able to access all necessary customer information through a sophisticated enterprise search function; becoming a customer service star in the process by resolving the majority of queries on the first call
  • Managing your human resources on any scale so, when that urgent big project comes in, your manufacturing team are ready to go and have the tools and materials they need to complete the job

And the best thing? No longer do you need to imagine! These are all real examples of how you can improve your business immediately. At a recent event held in conjunction with SAP and Blue Ocean Systems, SAP delivered some interesting stats they uncovered when surveying SMEs about their data curation and data management challenges:

  • 51% feel it takes too long to extract the information they need to perform their jobs
  • 45% feel data formats are inaccessible and underused in analytics
  • 39% feel data is too siloed
  • 35% feel data volume is growing too rapidly

Big Data is different for everyone; what is manageable to a large-scale multinational might be the breaking point for an SME with a few hundred employees. But whether large or small, the problems are all the same. How can we improve data curation and analytics? SAP HANA can deliver on its promise with statistics like:

  • Search speed improvement – up to 46 times
  • Report speed – up to 960 times (what once took hours can now be ready in seconds)
  • Increase transaction speed by up to 30%
  • Analyse data 3600x faster
  • Ability to curate and analyse both transactional and sentiment data simultaneously

SAP HANA Data Size Examples [Source: Hasso Plattner Institut]. This means:

  • Being able to pull up results from your database within 5 seconds
  • Reducing the time spent on tax declarations, month-end and year-end reporting activities
  • Empowering your employees by equipping them with the right tools to better serve your consumers
  • The ability to see what the world is saying about your product even down to the Twitter hashtags – this is next generation technology for your business!

Big Data is driving competition, innovation and profit across all sectors. By gathering, understanding and then predicting your customer’s behaviours you can gain control of your organisation’s future.

  • Access market leading business intelligence data in real time using ‘in-memory’ technology
  • Effectively store, manage and access all inbound data streams affecting your business
  • Understand customer behaviour and trends better and respond accordingly (before your competitor does)
  • Grow with the evolution of data
  • Implement SAP HANA and use alongside your transactional server
  • Comply with government, industry and risk requirements
  • Equip your sales teams with full visibility so they can react quickly to market demands
  • Enable your operations team to perform preventative (instead of reactive) activities
  • Enrich your customer’s journey and experience with your company (never underestimate this in a fickle marketplace)
  • Column based data storage is compressed & always sorted – makes retrieval for analytics quick

HANA is a real game changer in terms of technology, capabilities and enhancement to your business.

To speak with the HANA experts (and see your raw data appear in the HANA environment) contact us today!

Learn More:

Join our SAP HANA Facebook page community for news and updates

Read our recent blog posts about HANA and the power of analytics;

Big Data brings Bigger Challenges and Even Better Solutions

The Before and After of Analytics

Dynamic Decision Making

Change is a Risky Business

Business Reporting: Is it all Worth it?

Be Ahead. Stay Ahead.

Does Your Organisation have the Remedy for Growing Pains?

 

SAP Gold Partner

About Arun Devan, Managing Director

SAP Operational Process Intelligence on HANA - a boost for process efficiency!


Introduction

 

Core processes of enterprises are often complex and have many participants involved. Just think about an Order-to-Cash process with an incredible number of possible variants, exceptional cases, etc.

 

Currently, both process participants and process owners often have only limited insight into process efficiency and process quality.

 

Maybe you’re making ex-post analyses, e.g. with a data warehouse such as SAP BW. However, this approach is almost always data-driven, NOT process-driven.
In general, SAP BW only provides information about the past, not about the current situation, because such systems are often not real-time and do not contain the most recent data.

 

Sometimes a BPM suite such as SAP NetWeaver Process Orchestration, with a process engine and enterprise integration orchestration, is used to execute, orchestrate and choreograph the processes. But such tools can only execute well-formalized and structured workflows. In most cases they are not able to execute ad-hoc workflows.

 

What about a system offering visibility for your core processes independent from the products they’re implemented with, so you are able to analyze questions like:

  • What are my process steps?
  • How long did it take to complete them?
  • Where is my critical process path?
  • Where is manual interference needed?
  • What are the weak spots?
  • What is the overall performance of the process?
  • etc.

 

Let’s kick it up a notch: when speaking about the performance of processes, you can even think of a new kind of decision support system. If a process participant gains insight into his current workload, e.g. his open sales orders, the system could give him real-time hints as to which orders are forecast to be on time and which are critical to fulfill.

 

You can develop this thought further: if you can recognize patterns of process behaviour in the past, with predictive analytics operational decisions can be made better and faster in the future!

 

This is where SAP Operational Process Intelligence (short: SAP OpInt) comes in, a new product from SAP that offers all features mentioned above and thus can boost your processes’ efficiency.
SAP OpInt Screenshot 1.png
Source: SAP


We at CubeServ are one of only a few SAP partners worldwide attending the ramp-up (thanks to Dr. Harsh and his colleagues for the great workshop in February 2013), and I want to share our first experiences with you.

As the first ramp-up partner worldwide, we managed to develop a smoothly running SAP OpInt dashboard as early as January.

 

But you can test SAP OpInt very easily on your own: SAP provides a great test-drive, just check this link: http://hanademo.testdrivesap.com/sapopint/

 

Dashboards

SAP Operational Process Intelligence generates pre-defined but customizable dashboards based on SAPUI5 for your business process automatically.


The process participant can see all process instances he created. In an Order-to-Cash process he is able to look at all open sales orders created by himself:

SAP OpInt Screenshot 2.png


Based on the status, he can focus on critical sales orders and initiate corrective actions in order to get these orders back on track.

Those responsible for the individual phases of the process can look at all critical sales orders in the enterprise:

SAP OpInt Screenshot 3.png

Finally, the process owner can analyze process performance for completed process instances.

Of course, you can restrict access to certain views or characteristic values via HANA analytic privileges. For example, regional sales managers can be restricted to sales orders from certain sales organizations.

 

 

Approach


SAP Operational Process Intelligence is based on SAP HANA and thus offering all advantages of in-memory computing.

It does not matter where processes are implemented: in ERP where process logs are generated via Process Observer (please have a look at this introductory blog), non-SAP systems, SAP NetWeaver BPM or SAP Business Workflow.
With SAP Operational Process Intelligence, you can import process definitions and their corresponding process logs from the systems involved in the process, i.e. information about the start and end of both the processes and the activities they include.


In SAP HANA Studio, you can create a so-called business scenario for such a process, i.e. a business-oriented view of the process. In this scenario, you define phases that can later be evaluated separately by those responsible for the subprocesses.

You can even combine process definitions from different systems into one business scenario.

Let’s take an Order-to-Cash process. You can divide such a process into phases like Order Entry, Pick&Pack and Invoicing/Payment.

SAP OpInt Screenshot 4.png


After having defined the business scenario with its so-called phases, certain KPIs are predefined automatically such as the cycle time for each phase.

 

Of course, you can also define further KPIs specific to your business process. These KPIs will be visible in the performance view of the generated dashboard.

 

For KPIs like cycle time, you can define targets even by an attribute in the data model, e.g. different cycle times for customers depending on the customer’s country.

 

Now it’s time to let SAP OpInt perform its magic: several objects such as calculation views, REST-based OData services and analytic privileges are generated automatically. After roles have been assigned to the process participants, the users can look at their dashboard without doing anything else!

 

SAP OpInt offers flexibility in that you can modify the generated calculation view by joining any data source to it.
For our Order-to-Cash example, you can join e.g. the SAP ERP order header table (VBAK) and the customer master data table (KNA1) to it. If you define sales organization, customer name and credit limit as output attributes, this information will be included in the dashboard without any programming effort!
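
To give a rough idea of what such a join means in plain SQL terms (this is only an illustration - the schema name ERP_REPL for the replicated ERP tables is a made-up example, and in SAP OpInt the join is modelled graphically in the generated calculation view rather than written by hand):

hdbsql -i 00 -u SYSTEM -p "<password>" \
  "SELECT O.VBELN, O.VKORG AS SALES_ORG, C.NAME1 AS CUSTOMER_NAME
     FROM ERP_REPL.VBAK AS O
     JOIN ERP_REPL.KNA1 AS C ON C.KUNNR = O.KUNNR"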

 

 

Benefits


With SAP OpInt it is possible to establish visibility of end-to-end processes like order-to-cash. The tool provides real-time monitoring of business processes. Especially for processes with customer contact, it is extremely important to have information about the state of each business case (for example, a customer order).

 

Furthermore, the tool can provide not only the current state, but also so-called “look-ahead alerts” with expected completion dates. This means further transparency for the process owner and process participants, and a better ability to provide information to customers.

 

SAP OpInt is a tool "on top" of other systems and not tied to any particular application.

The process definitions from SAP Process Observer, SAP Business Workflow or SAP NetWeaver BPM are supported. This means that the processes or their parts can be executed in SAP as well as in non-SAP systems but monitored and analyzed in one single tool: both orchestrated workflows (SAP NetWeaver BPM and SAP Business Workflow) and built-in processes based on event-driven process definitions (SAP Process Observer).

 

With SAP OpInt it is possible not only to monitor the currently running processes, but also to report on completed instances and to access diagrams to recognize cross-time trends.

 

The data is presented as SAP tables, which allows you to quickly develop the dashboards and to contextualise the instances presented on them. The dashboards and many indicators are pre-defined, so that only a relatively small implementation effort is needed.

 

 

Conclusion

SAP OpInt is a new product that can put more intelligence into the day-to-day running of processes. With this HANA-based approach, which is ideally suited to combining OLTP and OLAP queries in one system, you can use SAP OpInt to gain transparency of processes across system boundaries.

 

The workforce on their own can drive and fine-tune the running of single process instances better than ever before.

 

SAP OpInt enables you to quickly develop dashboards offering both real-time monitoring and “look-ahead alerting” functionality, in particular for critical process instances, as well as indicators and diagrams for completed instances.

 

After the ramp-up experience, we at CubeServ can honestly say that SAP OpInt can boost process efficiency to a new level.

 

End-to-end process visibility is not a buzzword anymore; moreover, insight-to-action is achieved through the ability to create tasks for other process participants or, in the future, to use business rules on HANA.

 

Words cannot describe just how awesome it is - you have to carry out the test-drive mentioned above and form your own opinion.

 

To get started with SAP OpInt, I suggest having a look at Process Observer first, in order to define the activities in the SAP Business Suite needed for implementing SAP Operational Process Intelligence for your core processes.

 

For further questions or if you’re interested in Process Intelligence workshops, please contact me directly: sebastian.zick@cubeserv.com or via LinkedIn.

What makes openSAP highly impressive!


It’s just two weeks into the course that I have been attending on openSAP, and I am so overwhelmed by the way the content is delivered that I am compelled to write this blog. It is highly unlikely that anyone reading this blog does not know about openSAP - it was all over the media channels. Fortunately enough, the submission date for the first week’s assignment was extended, and that must have given laggards like me a second chance! For a moment I will stop comparing the content delivery model of openSAP with any other form of online course that I might have attended and focus on some of the clearly visible points of excellence.

 

The content is highly structured and crisp. The flow of lectures is very well linked and streamlined. When I tried downloading the weekly zipped contents, it prompted that it would take a few hours for the content to download; it is certainly worth the time spent on downloading, and maybe more than that. The content is very well addressed to the target audience - developers. The speaker, Thomas Jung, just uses the language of developers. There are no marketing or flowery words used anywhere. For me that is something very amazing as, lately, I have been reading too much about HANA and those buzzwords just creep in - but not in openSAP. I can’t stop myself from comparing this to one of the TechEd sessions on HANA that I attended last year in Bangalore. I am sure that the majority of attendees were developers there as well. That was a demo session with ABAP on HANA, and the word ‘non-disruptive’ jarred on my ears as it was used many, many times. Considering it was a hands-on session, my expectation was more to do and see things myself than to have that word repeated over and over again. I don’t know if anyone else noticed, but when Thomas said the exact same thing in his session [week 1] about moving existing code to HANA, he said it very eloquently without using the word non-disruptive. It is not just about the words used but the style of delivery that matters. To me, openSAP has certainly raised the bar for the quality in which demo-driven lectures are supposed to be delivered.

 

The lectures are so well delivered that I was excited and completely focussed until the weekly course content was over. I am not trying to exaggerate, but it’s like no one would want to miss the next part of The Lord of the Rings after watching the previous one. It happens very rarely that, after taking the sessions, I am excited to evaluate myself with the questions, but openSAP certainly has the force to do that. It’s always exciting when the speaker uses words like ‘We developers’ rather than ‘You developers’ or just ‘developers’! The content is so well focussed on getting a beginner started with HANA that I can’t stop talking about it. Thomas has really got into the mind of every developer and gives the sessions by explaining each and every term he uses, even one as basic as ‘Schema’. Initially, when I saw one full video session dedicated to ‘AWS’, ‘Cloudshare’ etc., I wondered what it was for! But when I look at the overall motive of the course, which to me is ‘Getting started with HANA’, that session is certainly required. When I did my first POC on HANA, I had to learn and imbibe many terms used in BW terminology, most commonly fact tables and dimension tables. In openSAP the speaker has really done a great job of using familiar terms.

 

The focussed approach of this course clearly speaks of the amount of planning done behind the scenes by the openSAP team. The site offers various other options like discussions, creating groups etc. Rich Heilman and team are doing some exceptional work in handling the forum discussions there and also providing updates on any issues being faced. There are already many generic groups created for anyone to join one or more of them. I was surprised when I found that a group had already been created with the name of my organization. That certainly provides me with some networking options with like-minded people in my own vicinity.

 

On a side note, had there been any option of peer evaluation in this course, I would have loved to poke my virtual classmate Jan Penninkhof to be my reviewer!

In case anyone is wondering what the sessions look like, here is a screenshot from the course below:

 

openSAP3.jpg

 

I feel very good that I did not procrastinate or postpone this online course for any other priority, as so far it has been an absolutely fantastic learning experience. Kudos to the openSAP team!

The big question about openSAP HANA Course

I am one of more than 32 thousand people signed up for the openSAP course Introduction to Software Development on SAP HANA.

The course started on 27th May and runs for 6 weeks - but I am sure you could still register and join in now. The entire course is being delivered online and it is essentially free to take part. I say "essentially" because to fully benefit you will probably want to have access to your own HANA system to develop your code on. SAP have worked with Amazon and Cloudshare for this purpose so you will have to pay fees to one of these providers for your personal HANA system - or share one between a few mates. 

 

I was going to write a bit of a review of the course but really Kumud Singh has already done a great job of this so I would encourage you to read her blog What makes openSAP highly impressive!.

Instead let me share some early observations I have about the course.

It seems to me preparing and delivering a course online is not a lot different to doing one face to face. The same things make the course successful - or unsuccessful. My very rudimentary list is...

Step 1. Preparation. Have good quality content, well structured and audience appropriate.
Step 2. Exercises. Make them appropriate to the content, easy to follow, designed to reinforce new concepts.
Step 3. Delivery. Pick the best presenter. Ideally someone who is a subject matter expert, a good communicator, and empathetic.

For me this course is setting a new benchmark for SAP knowledge transfer. There is clearly a lot of work that has gone into all these aspects but the organization and delivery is all pretty seamless. I imagine that Thomas Jung has played a great part here although there is probably quite a team behind him that should get credit as well. I have been fortunate to assist Thomas with some of his SAP TechEd and Mastering SAP Technologies workshops over the years and his preparation is always meticulous.

An online course does require a good delivery platform, high production values for content (esp. video/audio), and a different mindset about the content. Anyone who has tried to get electronic copies of SAP training materials will know how precious SAP Education are about controlling their distribution. This must change in a world of online delivery.

Because the course is delivered entirely online the students can pick their own time to participate and take their own time to absorb the content. They can download the lectures and slides to consume at their convenience. The great thing about self-paced training is that you can rewind and replay key points until you fully understand them before moving onto more advanced topics - something that can be difficult in the traditional classroom environment. This is one of the things that Sal Khan often mentions as a key aspect of The Khan Academy.

And of course an online training experience like this provides scale. As mentioned there are more than 32 thousand people signed up for this course. If just 10% of them complete the 6 week course that is 3200 more people trained on SAP HANA development than at the beginning of the month. And the course can be run over and over again as demanded.

There is also a very active social component to the training as well. The openSAP platform has community areas and discussion forums so that students and instructors can share experiences, ask questions, work through problems together. A shout out here to the openSAP team who are responding to posts in record time. And of course those of us taking the course are touching base regularly through our usual channels like Twitter and SCN and keeping each other apprised of our relative progress. There is definitely more happening here than just training - there is a community building around this course.

I have no idea if openSAP is officially part of SAP Education. My impression of SAP Education is not that positive, and hasn't been for a long time, so from afar it is hard to see how that organization could have come up with openSAP unless major changes have taken place. If such changes have already happened then I welcome them - and if they haven't then I suspect the SAP Education world is about to be shaken to its core.

But the big question facing us all - is what happens next week? How long can this go on before we see some repetition? Can Thomas maintain the variety?

Here he is in week 1.
Screen Shot 2013-06-10 at 9.35.31 AM.png
A nice opening gambit - safe and conservative.

And week 2.
Screen Shot 2013-06-10 at 9.36.29 AM.png
Wow - that's stepping it up considerably. It's almost like he is about to go out to a disco.

Then week 3.
Screen Shot 2013-06-10 at 9.37.43 AM.png
A bit disappointing really. Perhaps he attracted too much attention at the disco and wants to sneak in quietly and find a booth at the back this time.

Most impressively - no SAP logos anywhere. I don't think we have ever seen Thomas without a SAP logo somewhere since the days of Kimball. What's going on here - has someone moved my cheese?

I think he can do it. Three more weeks to go and I am betting on three new shirts. We haven't seen a blue one yet so that seems pretty obvious. I suspect yellow might be a bit frantic for Thomas so perhaps a green. Then just one more and we are home. I always wondered what colour puce was. Perhaps I will find out.

Introducing this year’s SAP HANA Startup Forum participants – Part II


As announced previously we are continuing our blog series and introducing you to our next three startups:

 

Celonis’ Process Business Intelligence (PBI) solution combines process mining and business intelligence in one application, thus making it possible to reconstruct and connect all data stored in customers’ ERP systems. Since the data is analyzed in real time, all modifications can be recorded and represented immediately. Furthermore, identified vulnerabilities can be analyzed to predict further correlations. This approach allows customers to optimize their workflows easily and efficiently.

 

Conis Risk Consulting enables early predictions about emerging political risks and conflicts. It provides tailored information depending on the customer’s context, thus providing increased safety. Various experiences in the field of political conflicts have been compiled in a comprehensive database. Businesses and their employees can now be protected more effectively in unstable regions, and new market potential can be unlocked.

 

Retention Grid is a retention marketing platform that helps vendors to better understand and take care of their existing customers, who, according to studies, are four times more likely and seven times less costly to purchase new products or services. Retention Grid’s Shopify App offers instantaneous customer segmentation into color-coded grids, allowing companies to track customer loyalty per group, identify trends and changes over time as well as take effective steps to encourage their customers to buy more often.

 

Next up are another three startups and more information about the presenters and keynotes.

 

Related content:

SAP HANA Startup Forum Berlin 2013 announcement

Phenomenal turnout for the SAP HANA Startup Forum in Berlin 2012

Introducing this year’s SAP HANA Startup Forum participants – Part I


The Applification of SAP HANA - Bigger Data, Faster Reporting – What next?


It’s widely acknowledged that HANA is “disruptive technology”. It’s guaranteed that SAP’s Business Intelligence capabilities, underpinned by HANA, will enable the processing and analysis of bigger data-sets, faster. The announcement of Business Suite on HANA earlier this month is highly likely to enable the current processes that a business runs to be driven faster and more often. But HANA is a technology, and therefore it is essentially inert without effective application to business problems. So how will this happen?

Now some people may say that I’ve not considered whether HANA is the correct technology and that there are better, more innovative technologies out there, to which I respond that my clients are looking for business benefits and solutions to issues, not a discussion on technology. I believe that the technology is now available and is being developed at an incredible pace; the challenge now for companies is how to exploit it to beat the competition at their own game or to develop a new game. So let’s consider how we do this.

Let’s take the area of Business Intelligence and Analytics as a worked example, since HANA has operated here for longer. The challenge has always been the dynamic nature of reporting (to use the old term) and how requirements can change quickly based on the information provided. HANA exacerbates this situation because the amount of data that can now be analyzed is increasing, and therefore the volume of questions that arise is greater.

To address this the world of SAP needs to start to adopt a more flexible, agile (with a small or large A) approach to deployment. Firstly a sensible, but pragmatic, architecture must be established. In this context, HANA is not the only SAP game in town and such an architecture could include NLS for data that is not immediately required as well as the in-memory “stuff”.

Then how to deliver the information to the end user. Well, who is the end user? Typically, someone who has a smart device and wants immediacy and access to relevant data. To satisfy these requirements we must adopt the “App” concept and develop tools that are highly targeted to specific user groups. By exploiting managed mobility type offers this becomes a service that can be exploited with a low investment but with great effect.

And finally, it is important to reflect these changes back into how the IT organization operates. Fundamentally a lifecycle approach must be adopted that recognizes the full build, run, host model and optimizes across it. The relevant Centre of Excellence structures must be in place that ensure appropriate governance and control but at the same time ensure that tools and facilities are available for innovation. Also, what are the best development models? How should the balance between in-house and 3rd party models be used? How should Factory based Services be exploited? All important questions!

 

So, now that SAP HANA is with us how do we get the best out of it? At Capgemini we know that this will cover a broad range of topics and the discussion is only starting but the “App Effect” that has happened in other areas of IT is a good starting point for this journey.

Why do we see SAP HANA as Technology instead of ValueDriver


It was very interesting for me to see the different, very passionate discussions regarding HANA, Cloud and Mentors that happened on the SAP Mentor Space, so much so that I thought it might be better to hold back for a few days and see what happens. In my case I simply felt that my voice had got bigger, and I wanted to make sure that I am not abusing this new power.

I have been very passionate about SAP HANA since its very launch and I love to see how every day I am able to learn new things and speak to new people, which makes my life a lot more interesting, business-wise. The journey has never been easy, but this is what it takes to be part of a revolution. Revolutions typically start at some point and get moving, but it is not set in stone that the direction will not change. In fact most recent revolutionary movements have shown us that the final result can be quite different from what one would have expected.

Now the HANA revolution has started and has rapidly found adopters who love this new "Platform" and put a lot of effort into making it bigger. At the beginning the main value attributed to HANA was speed. I guess we all still recall the 100k Club etc. While speed is important, by itself it is probably not enough to drive mass adoption in record time. But then speed can be a value, a very important value, if not the most important value. For a person with a heart attack, every second counts in order to have a chance to recover well. In business it is of very high importance to be able to make the right decisions based on facts. In today's world and with today's systems those facts were often not available on the spot, so management was looking at information which was x days old and had to make assumptions.

I was not able to attend SAPPHIRE 2013 in person, but I saw the video of Hasso Plattner's keynote. I very much liked the demo on shopping basket analysis. To me it was simply a fantastic example of why speed can be important. Let me take the example to the next level and think of myself shopping. I use the personal shopper system at my supermarket (the barcode scanner you carry with you through the store). Supermarkets could take the above example and:

     - Analyze when I last bought razor blades and offer me a special promo on the screen, for example when I am getting closer to the blades

     - Focus much more on my in-store shopping behaviour (is there a certain route I am taking?)

     - Even remind me that I still have promo coupons which I have not cashed in

Yes, it is a very simple example, but simple as it is, I have not seen it implemented - and this is where the value is. Make sure you communicate with your customers while they are still in the store.

This also makes the linkage between HANA and Mobility ever more obvious. Yet again I have seen many cool things on the technology side, but so far I have seen few people talk about real business value.

HANA is also no longer the only kid on the block with in-memory technology. IBM is very actively launching its DB2 BLU, Teradata is launching its own version, and of course there is the popular MongoDB. All have one thing in common: they work with large amounts of RAM and can help accelerate certain processes. It is my guess that, with growing competition, we now need to focus more on the value story than on the bits and bytes of a solution. Or, as I tend to tell my customers: driving a car, I have never worried about how the fuel gets into the cylinders for ignition. I care about the fact that my car is driving.

Installing HANA on Amazon: is it really that simple?


As a technical consultant I have installed HANA on physical servers for a customer.

I could not just play around those HANA servers because they were in productive use.

I wanted a test HANA server for myself and the team I work in, but of course a physical server is too expensive.

Therefore I wanted a HANA server on one of the cloud providers. SAP has relations with several cloud providers to give you a choice of infrastructure platforms for your HANA system.

The ones I know are:

  • Amazon AWS
  • Cloudshare
  • KT ucloud biz
  • SmartCloudPT

I chose Amazon because it is the biggest provider, has the best relationship with SAP, and has the option to shut down the instance so that the billing for the processing power stops.

I went for the developer edition (not for HANA One because that one is for productive usage).

Of course I first checked the costs, but they seem OK:

  • The SAP license costs are free for the developer edition
  • The costs at Amazon for the small server (2 vCPU and 17 GB memory) based on 4 hours per day usage:

HANApricing.png

Via the SAP Netweaver cloud I started:

https://sapsolutionsoncloudsapicl.netweaver.ondemand.com/clickthrough/index.jsp?solution=han&provider=amazon

 

In the first step you fill in your personal data and your Amazon AWS account.

After filling in your SAP user credentials and Amazon credentials, you setup your HANA server.

hana1.png

 

hana2.png

 

So I chose the smallest (and cheapest) HANA instance size.

And I already had a KeyName, because I have a few instances running on Amazon already.

 

It is really next-next-finish…

 

hana3.png

 

After this you jump to the Amazon console and you see that the creation of the server is in progress.

hana4.png

And after half an hour or so, we see it in our list of instances in the Amazon console:

hana5.png

 

This was much easier and faster than I expected.

And just to be clear: I did not have to install HANA myself, it was included in the instance that was created.

 

Of course, if you want to connect to this HANA server, you need to download and install the SAP HANA Studio and SAP HANA client software on your PC or laptop. Of course you all know how to download from SAPnet.

The HANA client you install via: hdbinst -a client

The HANA Studio you install via: hdbinst -a studio
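
For completeness, here is a rough sketch of the install and a first connectivity test from a Linux machine (the extracted folder names, the EC2 host name and the instance number 00 are assumptions - on Windows the installer works slightly differently):

cd hdbclient && ./hdbinst -a client       # SAP HANA client (includes hdbsql and the ODBC/JDBC drivers)
cd ../hdbstudio && ./hdbinst -a studio    # SAP HANA Studio

hdbsql -n <ec2-public-hostname>:30015 -u SYSTEM -p "<password>" "SELECT * FROM M_DATABASE"   # 30015 = SQL port of instance 00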

 

And to really use the HANA server, you need, for example, a BusinessObjects server and probably an SAP backend server connected via an SLT server.

But at least for now, I have a HANA server and I can import test data from a local file and experiment a bit with modeling.

 

Best regards, Marco

SAP HANA Professional Certification ( P_HANAIMP_1)


I have cleared my SAP HANA Professional Certification and would like to share my experiences in this blog with the SAP community.

 

This certification tests your hands-on knowledge of SAP HANA. It is aimed more at solution architects and consultants who have practical experience in all the different areas of HANA. It is based on HANA SPS05.

 

My Experience Background on SAP HANA :

 

I am an SAP HANA Distinguished Engineer (please refer to this link for more details on the program: http://scn.sap.com/docs/DOC-32891). I have been working on SAP HANA for more than 2 years. We are one of the early adopters and have already implemented around 8 data mart projects and 1 BW on HANA implementation to date. I have also cleared the other two Associate certifications on HANA (C_HANAIMP_1 & C_HANATEC_1), but would rate the HANA Professional certification as the best to date. It tests your practical experience by going into very granular detail.

 

Note: This blog gives you information about the certification; it does not guarantee that the questions/topics I have mentioned below will be part of the certification.

 

Please find attached link which gives more details on the certification

 

https://training.sap.com/v2/certification/p_hanaimp_1-sap-certified-application-professional---sap-hana-10-g/

 

Break up of questions by topics

 

Data Modeling - 17

Optimization & performance - 14

Security & Authorization -  14

System Architecture - 13

Data provisioning - 9

Reporting - 6

Life Cycle Management ( LCM) - 4

Scenario based - 3 

 

Resources :

 

saphana.com (there are a lot of documents on data modelling best practices)

help.sap.com/hana ( Official HANA documentation)

TZH200 (  HA 200 is the new name)

TZH300 (  HA 300  is the new name)

 

Some pointers on topics :

 

Data Modeling :

HANA Studio usage, navigation, input parameters in analytic/calculation views, variables, CE functions, procedures, text analytics.

 

Optimization & Perfomance :

What-if scenarios, how to improve performance, scenario-based questions, partitioning, SQL script optimizations, RDS implementations

 

Security & Authorization :

Analytic privileges, BO connectivity, mobile connectivity

 

System Architecture :

Configuration parameters, XS engine components, delta merge, HANA services,
BW vs. data mart scenarios, backup & recovery, scale-out

 

Data provisioning :

Advantages/limitations of using SLT, Data Services, DXC

 

Reporting :

BO tools connectivity to HANA, BO OLAP / Advanced Analysis

 

LCM :

Export/import of content, transports

 

Passing Score :

55% ( this might change )

 

I hope this blog gives some information to folks who are planning to take this certification

 

Thanks,

Kiran .

Introducing this year’s SAP HANA Startup Forum participants – Part III


SAP HANA Startup Forum Berlin 2013 is approaching fast. Here is our next take on this year’s startups:

 

GoEuro is a one-stop travel search website which combines air, rail, bus and car rental options. It allows customers to search to and from any location while providing the best possible option based on price, total travel time and convenience. The customer no longer needs to visit multiple websites to plan his entire trip, thus saving his time and benefiting from more cost transparency.

 

Paymill is an online payment solution that enables companies to accept payments internationally and in any currency. The Paymill payment form can be customized and integrated easily into any website, meaning that customers will not be redirected away to proceed with the payment. All data is highly encrypted and meets security standards.

 

Zellkraftwerk's Chipcytometry is a powerful technology to study and understand cellular structures, functions and mechanisms as well as discover new biomarkers. This iterative chip-based cytometry allows generating cytometric data from up to 30 biomarkers at the same time from the same sample. Biomarkers can be fixed on cells at the site of sample preparation and sent or stored in a controlled format, thus enabling sample re-analysis for nearly 12 months.

 

Check back in a couple of days for more information about the presenters.

 

Related content:

SAP HANA Startup Forum Berlin 2013 announcement

Phenomenal turnout for the SAP HANA Startup Forum in Berlin 2012

Introducing this year’s SAP HANA Startup Forum participants – Part I

Introducing this year’s SAP HANA Startup Forum participants – Part II
