Mindmajix

Hadoop with BODS Integration

Creation of Repository in BODS:-

Creation of User in HANA:-

Log on to SAP HANA Studio.

  1. Go to the HANA system.
  2. Go to Security.
  3. Create a user for the repository.
  4. Assign the following privileges to the user:
     PUBLIC
     MONITORING
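If you prefer to script this step, the same user and grants can be prepared on the command line and run through SAP HANA's hdbsql client. This is a minimal sketch only; the user name BODS_REPO, the initial password, and the host/port in the usage comment are all placeholder assumptions, not values from this document.

```shell
# Sketch: emit the SQL for the new repository user.
# BODS_REPO and the password are placeholders -- change them.
repo_user_sql() {
  user="$1"
  printf 'CREATE USER %s PASSWORD "Initial123";\n' "$user"
  # PUBLIC is granted to every HANA user automatically; MONITORING must
  # be granted explicitly, as in the steps above.
  printf 'GRANT MONITORING TO %s;\n' "$user"
}

# Usage (host, port, and admin credentials are placeholders):
#   repo_user_sql BODS_REPO > create_repo_user.sql
#   hdbsql -n hanahost:30015 -u SYSTEM -p <password> -I create_repo_user.sql
```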

Creation of Repository:-

Log on to the Data Services Repository Manager.

  1. Select the repository type as Local.
  2. Select the database type as SAP HANA and the version as HANA 1.X.
  3. Provide the HANA server name and the newly created user name and password.

Configuring the Job server in Hadoop Environment:-

Log on to the Hadoop machine.

  1. Right-click and open a terminal.
  2. Switch to the BODS installation user (here "object"):

     # su object

  3. Change to the bin directory of the BODS installation, e.g.:

     # cd /home/object/bods/dataservices/bin

  4. Set the environment variables by sourcing al_env.sh:

     # . ./al_env.sh

  5. Launch the BODS Server Manager:

     # ./svrcfg
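Before launching the Server Manager it is worth confirming that the environment script actually exists under the install directory. A small pre-flight check, assuming the install path used above (adjust LINK_DIR to your system):

```shell
# Sanity-check the Data Services environment before running svrcfg.
# The path below is an assumed install location, not a fixed one.
check_ds_env() {
  dir="$1"
  if [ -f "$dir/bin/al_env.sh" ]; then
    echo "found: $dir/bin/al_env.sh"
  else
    echo "al_env.sh not found under $dir/bin" >&2
    return 1
  fi
}

# Usage:
#   LINK_DIR=/home/object/bods/dataservices
#   check_ds_env "$LINK_DIR" && cd "$LINK_DIR/bin" && . ./al_env.sh && ./svrcfg
```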

  1. The Server Manager options open; enter option '2' to configure job servers.
  2. Enter option 'c' to create a new job server and press Enter.
  3. Provide a name for the job server.
  4. Enter the TCP port number for the job server.
  5. Enter 'y' to enable SNMP for the job server, otherwise 'n'.
  6. If you created the repository with an ODBC connection, enter 'y', otherwise 'n'.
  7. Provide the database server name for HANA.
  8. Provide the port number for HANA.
  9. Select the version and provide the user name and password.
  10. Enter 'y' to confirm the information.
  11. The job server is now created.

To start or stop the job service:

  1. After creating the job server, the Server Manager utility is still open.
  2. Enter option '1' to control the job service.
  3. To start the service, enter 's'.
  4. To stop the service, enter the corresponding option.
  5. To leave the menu, enter 'q'.
  6. Enter 'X' to exit the Server Manager.
  7. The server is now started.

Integrating SAP BusinessObjects with Hadoop:-

Universe Design Using IDT:-

Steps involved in configuring SAP BusinessObjects for use with Hadoop:

  1. Configure SAP BusinessObjects with the Hive JDBC drivers if the server version is lower than BO 4.0 SP5; from BO 4.0 SP5 onward, SAP provides Hive connectivity by default.
  2. To configure the JDBC drivers in earlier versions, we have to place a set of JAR files manually.
  3. The data access layer allows the SAP BI platform to connect to Apache Hadoop Hive 0.7.1 and 0.8.0 databases through JDBC on all platforms.
  4. To create a connection to the Hive Thrift server, you first have to place the set of Hive JAR files in the Hive driver directory, available at the path below:
     <connectionserver-install-dir>/connectionServer/jdbc/drivers/hive

Below are the Hive JAR files we have to copy, based on version 0.7.1:

  1. hadoop-0.20.1-core.jar or hadoop-core-0.20.2.jar
  2. hive-exec-0.7.1.jar
  3. hive-jdbc-0.7.1.jar
  4. hive-metastore-0.7.1.jar
  5. hive-service-0.7.1.jar
  6. libfb303.jar
  7. log4j-1.2.16.jar
  8. commons-logging-1.0.4.jar
  9. slf4j-api-1.6.1.jar
  10. slf4j-log4j12-1.6.1.jar
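Copying the ten JARs can be scripted. A sketch, assuming the JARs sit in a Hive lib directory on the same machine; both paths in the usage comment are examples, not values from this document:

```shell
# Sketch: copy the Hive 0.7.1 driver JARs into the Connection Server
# driver directory. Source and destination paths are assumptions.
copy_hive_jars() {
  src="$1"; dest="$2"
  mkdir -p "$dest"
  for jar in hadoop-core-0.20.2.jar hive-exec-0.7.1.jar hive-jdbc-0.7.1.jar \
             hive-metastore-0.7.1.jar hive-service-0.7.1.jar libfb303.jar \
             log4j-1.2.16.jar commons-logging-1.0.4.jar \
             slf4j-api-1.6.1.jar slf4j-log4j12-1.6.1.jar; do
    if [ -f "$src/$jar" ]; then
      cp "$src/$jar" "$dest/"
    else
      # Warn rather than fail, so a partial source tree is visible at once.
      echo "missing: $src/$jar" >&2
    fi
  done
}

# Usage (example paths):
#   copy_hive_jars /opt/hive/lib \
#     "<connectionserver-install-dir>/connectionServer/jdbc/drivers/hive"
```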

Creation of ROLAP Connection for Hadoop system using IDT:-

Log in to the Information Design Tool (IDT).

  1. Create a session with your login credentials.
  2. Under the session, open the Connections folder.
  3. Create a new relational connection.
  4. Provide the relational connection name and click Next.
  5. Under the driver selection menu, select Apache > Hadoop Hive > JDBC Drivers.
  6. Click Next.
  7. Provide the Hadoop Hive host name and port, for example:
     hadoop45.wdf.sap.com:10000
  8. Click Test Connection.
  9. If the test is successful, save the connection by clicking Finish.
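Behind the connection dialog, the host and port resolve to a Hive JDBC URL. A small helper showing the shape of that URL; the jdbc:hive:// scheme is the one used by the HiveServer1 Thrift service that Hive 0.7.x/0.8.x exposes, and the host in the example is the one from this document:

```shell
# Assemble the Hive JDBC URL for a given host and port.
# The database name defaults to "default" (an assumption).
hive_jdbc_url() {
  host="$1"; port="$2"; db="${3:-default}"
  printf 'jdbc:hive://%s:%s/%s' "$host" "$port" "$db"
}

# Example:
#   hive_jdbc_url hadoop45.wdf.sap.com 10000
#   -> jdbc:hive://hadoop45.wdf.sap.com:10000/default
```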

Creation of a universe using Hadoop-Hive data:-

  1. Create a project in IDT.
  2. Create a shortcut for the above connection in the project.
  3. Right-click the relational connection and select Publish Connection to a Repository.
  4. Select your folder and click Finish.
  5. The connection is now secured.
  6. Create a data foundation layer and bind the connection with it.
  7. Provide a name for the data foundation and click Next.
  8. Select the secured relational connection, for example Hive conn.cns.
  9. Click Finish and save the data foundation.
  10. This connection is used by the data foundation layer to import data from the server.
  11. From the data foundation layer, drag and drop the required tables and join them.
  12. Save the data foundation layer.
  13. Create a new business layer and bind the data foundation layer with it.
  14. Provide a name for the business layer and select your data foundation layer.
  15. Click Finish to save the business layer.
  16. Right-click the business layer and select Publish to a Repository.
  17. Run an integrity check before publishing to verify dependencies.
  18. Log on to the CMC and set the universe access policy for users.

Creation of a WEBI report from the Hadoop-Hive universe:-

  1. Log on to the BI Launch Pad.
  2. Click the Web Intelligence application.
  3. Click Create.
  4. Select Universe and click OK.
  5. All the universes in the repository will be displayed.
  6. Select the Hadoop-Hive data universe and click Select.
  7. Drag and drop the fields into the Result Objects workspace.
  8. Run the query to create a WEBI report.

 


 
