= Welcome to Sqoop

Apache Sqoop is a tool designed for efficiently transferring bulk data between
Apache Hadoop and structured datastores such as relational databases. You can use
Sqoop to import data from external structured datastores into the Hadoop Distributed
File System or related systems such as Hive and HBase. Conversely, Sqoop can be used
to extract data from Hadoop and export it to external structured datastores such
as relational databases and enterprise data warehouses.
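For illustration, a classic Sqoop-style import and export might look like the
following. This is a minimal sketch only: the JDBC URL, table name, and HDFS
directory are placeholder values, and the exact command-line syntax differs
between Sqoop releases (Sqoop 2 in particular uses an interactive client shell
rather than these one-shot commands).

  sqoop import --connect jdbc:mysql://db.example.com/corp \
      --table EMPLOYEES --target-dir /data/employees   # RDBMS -> HDFS

  sqoop export --connect jdbc:mysql://db.example.com/corp \
      --table EMPLOYEES --export-dir /data/employees   # HDFS -> RDBMS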

== Documentation

Sqoop ships with documentation; please check the "docs" module for additional materials.
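The documentation is generated as part of the Maven build (see "Compiling Sqoop"
below). As a sketch, assuming the "docs" module participates in the normal
reactor build, it can be built on its own using Maven's project-selection options:

  mvn package -pl docs -am   # build only the docs module and its dependencies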

More documentation is available online on the Sqoop home page:

http://sqoop.apache.org/

== Compiling Sqoop

Sqoop uses the Maven build system and can be compiled and packaged by running the
following commands:

  mvn compile # Compile project
  mvn package # Build source artifact
  mvn package -Pbinary # Build binary artifact
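For example, to produce the binary distribution without running the unit tests,
the standard Maven skipTests property can be used (a convenience sketch, not the
project's official release procedure):

  mvn clean package -Pbinary -DskipTests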

Sqoop uses the Sphinx plugin to generate documentation, which has higher memory
requirements than the default Maven configuration provides. You might need to
increase the maximum memory allowance to successfully execute the package goal.
This can be done using the following command:

  export MAVEN_OPTS="-Xmx512m -XX:MaxPermSize=512m"
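The same settings can also be scoped to a single invocation by prefixing the
Maven command, for example:

  MAVEN_OPTS="-Xmx512m -XX:MaxPermSize=512m" mvn package -Pbinary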

Sqoop currently supports multiple Hadoop distributions. In order to compile Sqoop
against a specific Hadoop version, please specify the hadoop.profile property in
Maven commands. For example:

  mvn package -Pbinary -Dhadoop.profile=100

Please refer to the Sqoop documentation for a full list of supported Hadoop
distributions and values of the hadoop.profile property.