Import Data Connection to an SQL Database
You can create a connection that lets you import data from an on-premises SQL database (JDBC + Freehand SQL).
Prerequisites
You have completed ONE of the following setup options:
- Performed setup using the SAP Analytics Cloud Agent Simple Deployment Kit. For more information, see SAP Analytics Cloud Agent Simple Deployment Kit.
Note: JDBC drivers must be installed using the instructions in the Post-Setup Guide included in the kit.
- Performed a manual setup of the following required components:
- The Cloud Connector is installed. For more information, see Installing the Cloud Connector.
- The SAP Analytics Cloud agent is installed. For more information, see Installing SAP Analytics Cloud Agent.
- JDBC drivers are installed.
- The Cloud Connector is configured. For more information, see Configuring the Cloud Connector.
- The SAP Analytics Cloud agent is configured in SAP Analytics Cloud. For more information, see Configuring SAP Analytics Cloud Agent.
- You have installed a JDBC driver. If you haven't already installed a JDBC driver, follow these steps:
- Download JDBC drivers for the databases that you want to connect to, and then place the downloaded jar file(s) in an appropriate folder. The jar files can be in any location on your machine as long as the properties file includes the correct file locations (see the next step). For information about supported databases, see System Requirements and Technical Prerequisites.
Note: You must use a Simba JDBC driver for Amazon Redshift, Amazon EMR, Apache Hadoop Hive, Cloudera Impala, Apache Spark, and Hortonworks Data Platform.
- Create a properties file that specifies the paths to the JDBC drivers you want to use, and place it on the same file system where the SAP Analytics Cloud agent is installed. The properties file can have any name, such as DriverConfig.properties.
Example of a properties file:
#This file is to specify where your JDBC drivers are on the file system of the machine where the SAP BusinessObjects Cloud Agent is installed
#For each database you have a driver for, you can remove the # from before the name of the database and specify the path to the jar file
#If a driver requires multiple jar files you can separate the paths with semicolons
#This list corresponds to BOE 4.2 SP5
#Amazon EMR 5.6 (Hive 2.1)="path_to_JDBCdriver"
#Amazon EMR Hive 0.11="path_to_JDBCdriver"
#Amazon EMR Hive 0.13="path_to_JDBCdriver"
#Amazon Redshift="path_to_JDBCdriver"
#Apache Hadoop HIVE="path_to_JDBCdriver"
#Apache Hadoop Hive 0.10="path_to_JDBCdriver"
#Apache Hadoop Hive 0.12="path_to_JDBCdriver"
#Apache Hadoop Hive 0.13 HiveServer2="path_to_JDBCdriver"
#Apache Hadoop Hive 0.14 HiveServer2="path_to_JDBCdriver"
#Apache Hadoop Hive 0.7="path_to_JDBCdriver"
#Apache Hadoop Hive 0.8="path_to_JDBCdriver"
#Apache Hadoop Hive 0.9="path_to_JDBCdriver"
#Apache Hadoop Hive 0.x HiveServer1="path_to_JDBCdriver"
#Apache Hadoop Hive 0.x HiveServer2="path_to_JDBCdriver"
#Apache Hadoop Hive 1.0 HiveServer2="path_to_JDBCdriver"
#Apache Hadoop Hive 1.x HiveServer2="path_to_JDBCdriver"
#Apache Hadoop Hive 2.x HiveServer2="path_to_JDBCdriver"
#Apache Spark 1.0="path_to_JDBCdriver"
#Apache Spark 2.0="path_to_JDBCdriver"
#BusinessObjects Data Federator Server XI R3="path_to_JDBCdriver"
#BusinessObjects Data Federator Server XI R4="path_to_JDBCdriver"
#Cloudera Impala 1.0="path_to_JDBCdriver"
#Cloudera Impala 2.0="path_to_JDBCdriver"
#DB2 10 for LUW="path_to_JDBCdriver"
#DB2 10 for z/OS="path_to_JDBCdriver"
#DB2 10.5 for LUW="path_to_JDBCdriver"
#DB2 11 for LUW="path_to_JDBCdriver"
#DB2 UDB v5="path_to_JDBCdriver"
#DB2 UDB v6="path_to_JDBCdriver"
#DB2 UDB v7="path_to_JDBCdriver"
#DB2 UDB v8="path_to_JDBCdriver"
#DB2 for z/OS v11="path_to_JDBCdriver"
#DB2 for z/OS v12="path_to_JDBCdriver"
#DB2 v9="path_to_JDBCdriver"
#Data Federator Server="path_to_JDBCdriver"
#Data Federator Server XI R3="path_to_JDBCdriver"
#Data Federator Server XI R4="path_to_JDBCdriver"
#Generic JDBC datasource="path_to_JDBCdriver"
#GreenPlum 3="path_to_JDBCdriver"
#GreenPlum 4="path_to_JDBCdriver"
#HP Vertica 6.1="path_to_JDBCdriver"
#HP Vertica 7.1="path_to_JDBCdriver"
#HP Vertica 8="path_to_JDBCdriver"
#Hortonworks Data Platform 2.3="path_to_JDBCdriver"
#Hortonworks Data Platform 2.5="path_to_JDBCdriver"
#IBM Puredata (Netezza)="path_to_JDBCdriver"
#IBM Puredata (Netezza) Server 7="path_to_JDBCdriver"
#Netezza Server 4="path_to_JDBCdriver"
#Netezza Server 5="path_to_JDBCdriver"
#Netezza Server 6="path_to_JDBCdriver"
#Informix Dynamic Server 10="path_to_JDBCdriver"
#Informix Dynamic Server 11="path_to_JDBCdriver"
#Informix Dynamic Server 12="path_to_JDBCdriver"
#Ingres Database 10="path_to_JDBCdriver"
#Ingres Database 9="path_to_JDBCdriver"
#MS Parallel Data Warehouse="path_to_JDBCdriver"
#MS SQL Server="path_to_JDBCdriver"
#MS SQL Server 2000="path_to_JDBCdriver"
#MS SQL Server 2005="path_to_JDBCdriver"
#MS SQL Server 2008="path_to_JDBCdriver"
#MS SQL Server 2012="path_to_JDBCdriver"
#MS SQL Server 2014="path_to_JDBCdriver"
#MS SQL Server 2016="path_to_JDBCdriver"
#MS SQL Server 2017="path_to_JDBCdriver"
#MS SQL Server 2019="path_to_JDBCdriver"
#MS SQL Server 6.5="path_to_JDBCdriver"
#MS SQL Server 7.x="path_to_JDBCdriver"
#MaxDB 7.7="path_to_JDBCdriver"
#MaxDB 7.9="path_to_JDBCdriver"
#MySQL="path_to_JDBCdriver"
#MySQL 5="path_to_JDBCdriver"
#Oracle 10="path_to_JDBCdriver"
#Oracle 11="path_to_JDBCdriver"
#Oracle 12="path_to_JDBCdriver"
#Oracle 12c Release 2="path_to_JDBCdriver"
#Oracle 7.3="path_to_JDBCdriver"
#Oracle 8="path_to_JDBCdriver"
#Oracle 8.0="path_to_JDBCdriver"
#Oracle 8.1="path_to_JDBCdriver"
#Oracle 9="path_to_JDBCdriver"
#Oracle Exadata="path_to_JDBCdriver"
#Oracle Exadata 11="path_to_JDBCdriver"
#Oracle Exadata 12="path_to_JDBCdriver"
#PostgreSQL 8="path_to_JDBCdriver"
#PostgreSQL 9="path_to_JDBCdriver"
#PostgreSQL 10="path_to_JDBCdriver"
#PostgreSQL 11="path_to_JDBCdriver"
#PostgreSQL 12="path_to_JDBCdriver"
#PostgreSQL 13="path_to_JDBCdriver"
#Progress OpenEdge 10="path_to_JDBCdriver"
#Progress OpenEdge 11="path_to_JDBCdriver"
#Snowflake="path_to_JDBCdriver"
#Sybase ASIQ 12="path_to_JDBCdriver"
#Sybase Adaptive Server 11="path_to_JDBCdriver"
#Sybase Adaptive Server 12="path_to_JDBCdriver"
#Sybase Adaptive Server 15="path_to_JDBCdriver"
#Sybase Adaptive Server Enterprise 15="path_to_JDBCdriver"
#Sybase Adaptive Server Enterprise 15.5="path_to_JDBCdriver"
#Sybase Adaptive Server Enterprise 15.7="path_to_JDBCdriver"
#Sybase Adaptive Server Enterprise 16.0="path_to_JDBCdriver"
#Sybase IQ 15="path_to_JDBCdriver"
#Sybase IQ 16="path_to_JDBCdriver"
#Sybase SQL Anywhere 10="path_to_JDBCdriver"
#Sybase SQL Anywhere 11="path_to_JDBCdriver"
#Sybase SQL Anywhere 12="path_to_JDBCdriver"
#Sybase SQL Anywhere 16="path_to_JDBCdriver"
#Sybase SQL Anywhere 17="path_to_JDBCdriver"
#Sybase SQLServer 11="path_to_JDBCdriver"
#Teradata 12="path_to_JDBCdriver"
#Teradata 13="path_to_JDBCdriver"
#Teradata 14="path_to_JDBCdriver"
#Teradata 15="path_to_JDBCdriver"
#Teradata 16="path_to_JDBCdriver"
#Teradata V2 R="path_to_JDBCdriver"
#Teradata V2 R6="path_to_JDBCdriver"
Note: Any JDBC driver can be added as a Generic JDBC datasource as long as it uses queries that are supported by MySQL.
- Modify your properties file by uncommenting the lines for the databases you want to connect to, and enter the path to the driver as the property value. If a driver requires more than one jar file, separate the paths with semicolons. An edited example is sketched below this item.
Caution: The names of databases in your properties file must EXACTLY match the names shown in this example. If you change the names, the SQL connections will fail.
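As an illustration only, an edited properties file that enables two drivers might look like the lines below. It mirrors the quoting style of the template above; the folder C:\JDBC and the jar file names are assumptions for this sketch, not values shipped with the agent, so substitute the actual paths to the drivers you downloaded.
#This list corresponds to BOE 4.2 SP5
MS SQL Server 2016="C:\JDBC\mssql-jdbc.jar"
Apache Hadoop Hive 2.x HiveServer2="C:\JDBC\HiveJDBC41.jar;C:\JDBC\hive_metastore.jar"
Only the uncommented lines are read as driver locations; every other database remains disabled behind its leading #.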
- You must restart the SAP Analytics Cloud agent, using either the Java option -DSAP_CLOUD_AGENT_PROPERTIES_PATH (if the agent is installed on Linux) or an environment variable SAP_CLOUD_AGENT_PROPERTIES_PATH (if the agent is installed on Windows), to specify the complete path up to and including the properties file. Example path: C:\<full path to file>\DriverConfig.properties.
- If you choose to use a Java option, restart the agent via the command line by navigating to the tomcat/bin directory and doing the following (a Linux sketch of the modified line appears after these sub-steps):
- Run the shutdown.bat or shutdown.sh script.
- Open the catalina.bat or catalina.sh file in the tomcat/bin directory and find the line where Java options are set. It should look similar to this:
set "JAVA_OPTS=%JAVA_OPTS% %LOGGING_CONFIG% -Xms1024m -Xmx10246m -XX:NewSize=256m -XX:MaxNewSize=356m -XX:PermSize=256m -XX:MaxPermSize=4096m"
- Modify this line so that the -DSAP_CLOUD_AGENT_PROPERTIES_PATH option is included and points to your .properties file. Example:
set "JAVA_OPTS=%JAVA_OPTS% %LOGGING_CONFIG% -Xms1024m -Xmx10246m -XX:NewSize=256m -XX:MaxNewSize=356m -XX:PermSize=256m -XX:MaxPermSize=4096m -DSAP_CLOUD_AGENT_PROPERTIES_PATH=C:\<path to driver config file>\DriverConfig.properties"
- Run the startup.bat or startup.sh script.
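On Linux, the corresponding edit in catalina.sh would look roughly like the line below. This is only a sketch: the memory flags are carried over from the Windows example above, and /opt/sap/jdbc/DriverConfig.properties is an assumed location for your properties file, not a path defined by the product.
JAVA_OPTS="$JAVA_OPTS $LOGGING_CONFIG -Xms1024m -Xmx10246m -DSAP_CLOUD_AGENT_PROPERTIES_PATH=/opt/sap/jdbc/DriverConfig.properties"
After saving the change, continue with the startup.sh step described above so the agent picks up the new option.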
- If you choose to use an environment variable, restart the agent as follows (a Windows command-line sketch follows these sub-steps):
- If you're running Tomcat via command line: Shut down the existing Tomcat process, then add the environment variable, and then start Tomcat from a new command line window.
- If you're running Tomcat as a Windows service: Restart the Tomcat service by using the Tomcat configuration manager.
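For the command-line case on Windows, the sequence might look like the sketch below. The Tomcat location (shown as <tomcat>) and the properties-file path C:\JDBC\DriverConfig.properties are assumptions; adjust them to your installation.
rem 1. Shut down the running Tomcat instance that hosts the agent
<tomcat>\bin\shutdown.bat
rem 2. Persist the variable for future sessions (setx does not affect the current window)
setx SAP_CLOUD_AGENT_PROPERTIES_PATH "C:\JDBC\DriverConfig.properties"
rem 3. Open a NEW command prompt so the variable is visible there, then start Tomcat
<tomcat>\bin\startup.bat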