
Move org.apache.hadoop.sqoop to com.cloudera.sqoop

From: Aaron Kimball <aaron@cloudera.com>

git-svn-id: https://svn.apache.org/repos/asf/incubator/sqoop/trunk@1149906 13f79535-47bb-0310-9956-ffa450edef68
Committed by Andrew Bayer on 2011-07-22 20:03:52 +00:00
parent 9f2d744a24
commit 42875119dd
157 changed files with 548 additions and 548 deletions

View File

@@ -21,4 +21,4 @@ bin=`cd ${bin} && pwd`
 source ${bin}/configure-sqoop
 ${HADOOP_HOME}/bin/hadoop jar ${SQOOP_JAR} \
-org.apache.hadoop.sqoop.Sqoop "$@"
+com.cloudera.sqoop.Sqoop "$@"

View File

@@ -569,7 +569,7 @@
 <arg value="+%Y" />
 </exec>
 <javadoc
-packagenames="org.apache.hadoop.sqoop.*"
+packagenames="com.cloudera.sqoop.*"
 destdir="${build.javadoc}"
 author="true"
 version="true"

View File

@@ -23,7 +23,7 @@
 <property>
 <name>sqoop.connection.factories</name>
-<value>org.apache.hadoop.sqoop.manager.DefaultManagerFactory</value>
+<value>com.cloudera.sqoop.manager.DefaultManagerFactory</value>
 <description>A comma-delimited list of ManagerFactory implementations
 which are consulted, in order, to instantiate ConnManager instances
 used to drive connections to databases.

View File

@@ -48,10 +48,10 @@ convenience methods:
 The full set of methods guaranteed to exist in an auto-generated class
 is specified in the abstract class
-+org.apache.hadoop.sqoop.lib.SqoopRecord+.
++com.cloudera.sqoop.lib.SqoopRecord+.
 Instances of +SqoopRecord+ may depend on Sqoop's public API. This is all classes
-in the +org.apache.hadoop.sqoop.lib+ package. These are briefly described below.
+in the +com.cloudera.sqoop.lib+ package. These are briefly described below.
 Clients of Sqoop should not need to directly interact with any of these classes,
 although classes generated by Sqoop will depend on them. Therefore, these APIs
 are considered public and care will be taken when forward-evolving them.
@@ -81,11 +81,11 @@ While Sqoop uses JDBC and +DataDrivenDBInputFormat+ to
 read from databases, differences in the SQL supported by different vendors as
 well as JDBC metadata necessitates vendor-specific codepaths for most databases.
 Sqoop's solution to this problem is by introducing the +ConnManager+ API
-(+org.apache.hadoop.sqoop.manager.ConnMananger+).
+(+com.cloudera.sqoop.manager.ConnMananger+).
 +ConnManager+ is an abstract class defining all methods that interact with the
 database itself. Most implementations of +ConnManager+ will extend the
-+org.apache.hadoop.sqoop.manager.SqlManager+ abstract class, which uses standard
++com.cloudera.sqoop.manager.SqlManager+ abstract class, which uses standard
 SQL to perform most actions. Subclasses are required to implement the
 +getConnection()+ method which returns the actual JDBC connection to the
 database. Subclasses are free to override all other methods as well. The
@@ -117,7 +117,7 @@ lightweight operation, and is done reasonably infrequently. Therefore,
 class +ManagerFactory+ (See
 http://issues.apache.org/jira/browse/MAPREDUCE-750[]). One
 +ManagerFactory+ implementation currently serves all of Sqoop:
-+org.apache.hadoop.sqoop.manager.DefaultManagerFactory+. Extensions
++com.cloudera.sqoop.manager.DefaultManagerFactory+. Extensions
 should not modify +DefaultManagerFactory+. Instead, an
 extension-specific +ManagerFactory+ implementation should be provided
 with the new +ConnManager+. +ManagerFactory+ has a single method of
@@ -132,7 +132,7 @@ and +ConnManager+(s), and configure +sqoop-site.xml+ to use the new
 +ManagerFactory+. The +DefaultManagerFactory+ principly discriminates between
 databases by parsing the connect string stored in +SqoopOptions+.
-Extension authors may make use of classes in the +org.apache.hadoop.sqoop.io+,
+Extension authors may make use of classes in the +com.cloudera.sqoop.io+,
 +mapreduce+, and +util+ packages to facilitate their implementations.
 These packages and classes are described in more detail in the following
 section.
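The extension mechanism described in the documentation hunks above can be pictured with a short sketch. This is not part of the commit: the class names, the accept(SqoopOptions) signature, and the getConnectString() accessor are assumptions made for illustration, based only on the surrounding text (a factory inspects the connect string held in SqoopOptions and either hands back a ConnManager or declines so the next configured factory is consulted).

// Hypothetical extension-specific ManagerFactory -- a sketch, not Sqoop source.
// Whether ManagerFactory is subclassed or implemented at this revision, and the
// exact accept() signature, are assumptions.
package com.example.sqoop.acme;

import com.cloudera.sqoop.SqoopOptions;
import com.cloudera.sqoop.manager.ConnManager;
import com.cloudera.sqoop.manager.ManagerFactory;

public class AcmeManagerFactory extends ManagerFactory {
  @Override
  public ConnManager accept(SqoopOptions options) {
    String connectStr = options.getConnectString(); // assumed accessor
    if (connectStr != null && connectStr.startsWith("jdbc:acme:")) {
      return new AcmeManager(options); // hypothetical ConnManager subclass
    }
    // Decline: the next factory listed in sqoop.connection.factories is tried.
    return null;
  }
}

Registering such a factory would then amount to prepending its class name to the comma-delimited sqoop.connection.factories value shown in the configuration file earlier in this diff.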
@@ -143,7 +143,7 @@ Sqoop Internals
 This section describes the internal architecture of Sqoop.
-The Sqoop program is driven by the +org.apache.hadoop.sqoop.Sqoop+ main class.
+The Sqoop program is driven by the +com.cloudera.sqoop.Sqoop+ main class.
 A limited number of additional classes are in the same package; +SqoopOptions+
 (described earlier) and +ConnFactory+ (which manipulates +ManagerFactory+
 instances).
@@ -153,7 +153,7 @@ General program flow
 The general program flow is as follows:
-+org.apache.hadoop.sqoop.Sqoop+ is the main class and implements _Tool_. A new
++com.cloudera.sqoop.Sqoop+ is the main class and implements _Tool_. A new
 instance is launched with +ToolRunner+. The first argument to Sqoop is
 a string identifying the name of a +SqoopTool+ to run. The +SqoopTool+
 itself drives the execution of the user's requested operation (e.g.,
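For readers less familiar with the Hadoop Tool/ToolRunner launch pattern that the hunk above relies on, the following self-contained sketch shows the generic flow. It is not Sqoop's own code; the class name and the dispatch-on-first-argument stub are illustrative only.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.conf.Configured;
import org.apache.hadoop.util.Tool;
import org.apache.hadoop.util.ToolRunner;

// Generic sketch of the Tool/ToolRunner launch flow; the dispatch on the
// first argument mirrors how the text says Sqoop picks a SqoopTool by name.
public class ToolLauncherSketch extends Configured implements Tool {

  @Override
  public int run(String[] args) throws Exception {
    if (args.length == 0) {
      System.err.println("usage: <tool-name> [tool-args...]");
      return 1;
    }
    String toolName = args[0]; // e.g. "import", "export", "help"
    System.out.println("Would dispatch to tool: " + toolName);
    // A real driver would look up the named tool, hand it the remaining
    // arguments and its Configuration, and return the tool's exit status.
    return 0;
  }

  public static void main(String[] args) throws Exception {
    // ToolRunner parses generic Hadoop options (-D, -fs, ...) before
    // passing the rest to run().
    int ret = ToolRunner.run(new Configuration(), new ToolLauncherSketch(), args);
    System.exit(ret);
  }
}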
@@ -178,8 +178,8 @@ the +ConnManager.importTable()+ method is left to determine how best
 to run the import. Each main action is actually controlled by the
 +ConnMananger+, except for the generating of code, which is done by
 the +CompilationManager+ and +ClassWriter+. (Both in the
-+org.apache.hadoop.sqoop.orm+ package.) Importing into Hive is also
-taken care of via the +org.apache.hadoop.sqoop.hive.HiveImport+ class
++com.cloudera.sqoop.orm+ package.) Importing into Hive is also
+taken care of via the +com.cloudera.sqoop.hive.HiveImport+ class
 after the +importTable()+ has completed. This is done without concern
 for the +ConnManager+ implementation used.
@@ -195,7 +195,7 @@ related data.
 Subpackages
 ^^^^^^^^^^^
-The following subpackages under +org.apache.hadoop.sqoop+ exist:
+The following subpackages under +com.cloudera.sqoop+ exist:
 * +hive+ - Facilitates importing data to Hive.
 * +io+ - Implementations of +java.io.*+ interfaces (namely, _OutputStream_ and
@@ -252,7 +252,7 @@ more. Consequently, these must both be handled, and preferably asynchronously.
 In Sqoop parlance, an "async sink" is a thread that takes an +InputStream+ and
 reads it to completion. These are realized by +AsyncSink+ implementations. The
-+org.apache.hadoop.sqoop.util.AsyncSink+ abstract class defines the operations
++com.cloudera.sqoop.util.AsyncSink+ abstract class defines the operations
 this factory must perform. +processStream()+ will spawn another thread to
 immediately begin handling the data read from the +InputStream+ argument; it
 must read this stream to completion. The +join()+ method allows external threads
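The async-sink idea described in the hunk above (a dedicated thread that drains an InputStream to completion so an external process cannot block on a full pipe) can be illustrated with a minimal, self-contained sketch. It is not the real AsyncSink class; the name and methods merely mirror the processStream()/join() contract the text describes.

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;

// Minimal sketch of an "async sink": consume a child process's stream on a
// separate thread so the child cannot stall on a full stdout/stderr pipe.
public class StreamDrainerSketch {
  private final Thread worker;

  public StreamDrainerSketch(final InputStream in) {
    this.worker = new Thread(() -> {
      try (BufferedReader r = new BufferedReader(new InputStreamReader(in))) {
        String line;
        while ((line = r.readLine()) != null) {
          System.out.println("[child] " + line); // a real sink might log or parse
        }
      } catch (IOException ioe) {
        // A real implementation would record the failure for join() to report.
      }
    });
  }

  /** Start consuming the stream (the analogue of processStream()). */
  public void start() { worker.start(); }

  /** Block until the stream has been fully consumed (the analogue of join()). */
  public void join() throws InterruptedException { worker.join(); }
}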
@@ -275,7 +275,7 @@ Sqoop schedules MapReduce jobs to effect imports and exports.
 Configuration and execution of MapReduce jobs follows a few common
 steps (configuring the +InputFormat+; configuring the +OutputFormat+;
 setting the +Mapper+ implementation; etc...). These steps are
-formalized in the +org.apache.hadoop.sqoop.mapreduce.JobBase+ class.
+formalized in the +com.cloudera.sqoop.mapreduce.JobBase+ class.
 The +JobBase+ allows a user to specify the +InputFormat+,
 +OutputFormat+, and +Mapper+ to use.
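The common configuration steps that JobBase formalizes correspond to the stock Hadoop sequence sketched below. This is a generic, illustrative job driver, not JobBase itself; every class used here comes from the standard Hadoop MapReduce API and the job it builds is a trivial map-only pass-through.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.input.TextInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import org.apache.hadoop.mapreduce.lib.output.TextOutputFormat;

// Generic sketch of the configure-and-run steps a JobBase-style helper
// centralizes: pick the formats, pick the mapper, submit, and wait.
public class JobConfigSketch {
  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    Job job = new Job(conf, "sketch"); // Job.getInstance() in newer Hadoop

    job.setJarByClass(JobConfigSketch.class);
    job.setInputFormatClass(TextInputFormat.class);   // "configuring the InputFormat"
    job.setOutputFormatClass(TextOutputFormat.class); // "configuring the OutputFormat"
    job.setMapperClass(Mapper.class);                 // "setting the Mapper implementation"
    job.setNumReduceTasks(0);                         // map-only, as Sqoop's jobs typically are

    job.setOutputKeyClass(LongWritable.class);
    job.setOutputValueClass(Text.class);
    FileInputFormat.addInputPath(job, new Path(args[0]));
    FileOutputFormat.setOutputPath(job, new Path(args[1]));

    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}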

View File

@@ -16,12 +16,12 @@
 * limitations under the License.
 */
-package org.apache.hadoop.sqoop;
+package com.cloudera.sqoop;
 import org.apache.hadoop.conf.Configuration;
-import org.apache.hadoop.sqoop.manager.ConnManager;
-import org.apache.hadoop.sqoop.manager.DefaultManagerFactory;
-import org.apache.hadoop.sqoop.manager.ManagerFactory;
+import com.cloudera.sqoop.manager.ConnManager;
+import com.cloudera.sqoop.manager.DefaultManagerFactory;
+import com.cloudera.sqoop.manager.ManagerFactory;
 import org.apache.hadoop.util.ReflectionUtils;
 import java.io.IOException;

View File

@@ -16,7 +16,7 @@
 * limitations under the License.
 */
-package org.apache.hadoop.sqoop;
+package com.cloudera.sqoop;
 import java.util.Arrays;
@@ -28,12 +28,12 @@
 import org.apache.hadoop.util.Tool;
 import org.apache.hadoop.util.ToolRunner;
-import org.apache.hadoop.sqoop.tool.HelpTool;
-import org.apache.hadoop.sqoop.tool.SqoopTool;
+import com.cloudera.sqoop.tool.HelpTool;
+import com.cloudera.sqoop.tool.SqoopTool;
 /**
 * Main entry-point for Sqoop
-* Usage: hadoop jar (this_jar_name) org.apache.hadoop.sqoop.Sqoop (options)
+* Usage: hadoop jar (this_jar_name) com.cloudera.sqoop.Sqoop (options)
 * See the SqoopOptions class for options.
 */
 public class Sqoop extends Configured implements Tool {

View File

@@ -17,7 +17,7 @@
 */
-package org.apache.hadoop.sqoop;
+package com.cloudera.sqoop;
 import java.io.File;
 import java.io.FileInputStream;
@@ -29,7 +29,7 @@
 import org.apache.commons.logging.Log;
 import org.apache.commons.logging.LogFactory;
 import org.apache.hadoop.conf.Configuration;
-import org.apache.hadoop.sqoop.lib.LargeObjectLoader;
+import com.cloudera.sqoop.lib.LargeObjectLoader;
 /**
 * Command-line arguments used by Sqoop.

View File

@@ -16,7 +16,7 @@
 * limitations under the License.
 */
-package org.apache.hadoop.sqoop.cli;
+package com.cloudera.sqoop.cli;
 import org.apache.commons.cli.Options;

View File

@@ -16,7 +16,7 @@
 * limitations under the License.
 */
-package org.apache.hadoop.sqoop.cli;
+package com.cloudera.sqoop.cli;
 import java.util.ListIterator;

View File

@@ -16,7 +16,7 @@
 * limitations under the License.
 */
-package org.apache.hadoop.sqoop.cli;
+package com.cloudera.sqoop.cli;
 import java.io.PrintWriter;
 import java.io.StringWriter;

View File

@@ -16,7 +16,7 @@
 * limitations under the License.
 */
-package org.apache.hadoop.sqoop.hive;
+package com.cloudera.sqoop.hive;
 import java.io.BufferedWriter;
 import java.io.File;
@@ -32,10 +32,10 @@
 import org.apache.hadoop.fs.FileSystem;
 import org.apache.hadoop.fs.Path;
 import org.apache.hadoop.conf.Configuration;
-import org.apache.hadoop.sqoop.SqoopOptions;
-import org.apache.hadoop.sqoop.manager.ConnManager;
-import org.apache.hadoop.sqoop.util.Executor;
-import org.apache.hadoop.sqoop.util.LoggingAsyncSink;
+import com.cloudera.sqoop.SqoopOptions;
+import com.cloudera.sqoop.manager.ConnManager;
+import com.cloudera.sqoop.util.Executor;
+import com.cloudera.sqoop.util.LoggingAsyncSink;
 /**
 * Utility to import a table into the Hive metastore. Manages the connection

View File

@@ -16,7 +16,7 @@
 * limitations under the License.
 */
-package org.apache.hadoop.sqoop.hive;
+package com.cloudera.sqoop.hive;
 import org.apache.commons.logging.Log;
 import org.apache.commons.logging.LogFactory;

View File

@@ -16,14 +16,14 @@
 * limitations under the License.
 */
-package org.apache.hadoop.sqoop.hive;
+package com.cloudera.sqoop.hive;
 import org.apache.hadoop.conf.Configuration;
 import org.apache.hadoop.fs.FileSystem;
 import org.apache.hadoop.fs.Path;
-import org.apache.hadoop.sqoop.SqoopOptions;
-import org.apache.hadoop.sqoop.manager.ConnManager;
+import com.cloudera.sqoop.SqoopOptions;
+import com.cloudera.sqoop.manager.ConnManager;
 import java.io.File;
 import java.io.IOException;

View File

@@ -16,7 +16,7 @@
 * limitations under the License.
 */
-package org.apache.hadoop.sqoop.io;
+package com.cloudera.sqoop.io;
 import java.util.Map;
 import java.util.Set;

View File

@@ -16,7 +16,7 @@
 * limitations under the License.
 */
-package org.apache.hadoop.sqoop.io;
+package com.cloudera.sqoop.io;
 import java.io.InputStream;
 import java.io.IOException;

View File

@@ -16,7 +16,7 @@
 * limitations under the License.
 */
-package org.apache.hadoop.sqoop.io;
+package com.cloudera.sqoop.io;
 import java.io.BufferedOutputStream;
 import java.io.Closeable;

View File

@@ -16,7 +16,7 @@
 * limitations under the License.
 */
-package org.apache.hadoop.sqoop.io;
+package com.cloudera.sqoop.io;
 import java.io.IOException;
 import java.util.Map;

View File

@@ -16,7 +16,7 @@
 * limitations under the License.
 */
-package org.apache.hadoop.sqoop.io;
+package com.cloudera.sqoop.io;
 import java.io.BufferedWriter;
 import java.io.OutputStreamWriter;

View File

@@ -16,7 +16,7 @@
 * limitations under the License.
 */
-package org.apache.hadoop.sqoop.io;
+package com.cloudera.sqoop.io;
 import java.io.OutputStream;
 import java.io.IOException;

View File

@@ -16,7 +16,7 @@
 * limitations under the License.
 */
-package org.apache.hadoop.sqoop.io;
+package com.cloudera.sqoop.io;
 import java.io.IOException;

View File

@@ -16,7 +16,7 @@
 * limitations under the License.
 */
-package org.apache.hadoop.sqoop.lib;
+package com.cloudera.sqoop.lib;
 import java.io.DataInput;
 import java.io.DataOutput;

View File

@@ -16,7 +16,7 @@
 * limitations under the License.
 */
-package org.apache.hadoop.sqoop.lib;
+package com.cloudera.sqoop.lib;
 import java.io.ByteArrayInputStream;
 import java.io.DataInput;
@@ -27,7 +27,7 @@
 import java.util.regex.Matcher;
 import org.apache.hadoop.io.BytesWritable;
-import org.apache.hadoop.sqoop.io.LobFile;
+import com.cloudera.sqoop.io.LobFile;
 import org.apache.commons.logging.Log;
 import org.apache.commons.logging.LogFactory;

View File

@@ -16,7 +16,7 @@
 * limitations under the License.
 */
-package org.apache.hadoop.sqoop.lib;
+package com.cloudera.sqoop.lib;
 import java.io.DataInput;
 import java.io.DataOutput;
@@ -26,7 +26,7 @@
 import java.util.regex.Matcher;
 import org.apache.hadoop.io.Text;
-import org.apache.hadoop.sqoop.io.LobFile;
+import com.cloudera.sqoop.io.LobFile;
 /**
 * ClobRef is a wrapper that holds a CLOB either directly, or a

View File

@@ -16,7 +16,7 @@
 * limitations under the License.
 */
-package org.apache.hadoop.sqoop.lib;
+package com.cloudera.sqoop.lib;
 /**

View File

@@ -16,7 +16,7 @@
 * limitations under the License.
 */
-package org.apache.hadoop.sqoop.lib;
+package com.cloudera.sqoop.lib;
 import org.apache.hadoop.io.BytesWritable;
 import java.math.BigDecimal;

View File

@@ -16,7 +16,7 @@
 * limitations under the License.
 */
-package org.apache.hadoop.sqoop.lib;
+package com.cloudera.sqoop.lib;
 import java.io.Closeable;
 import java.io.File;
@@ -33,7 +33,7 @@
 import org.apache.hadoop.conf.Configuration;
 import org.apache.hadoop.fs.FileSystem;
 import org.apache.hadoop.fs.Path;
-import org.apache.hadoop.sqoop.io.LobFile;
+import com.cloudera.sqoop.io.LobFile;
 /**
 * Contains a set of methods which can read db columns from a ResultSet into

View File

@@ -16,7 +16,7 @@
 * limitations under the License.
 */
-package org.apache.hadoop.sqoop.lib;
+package com.cloudera.sqoop.lib;
 import java.io.Closeable;
 import java.io.DataInput;
@@ -32,8 +32,8 @@
 import org.apache.hadoop.mapreduce.InputSplit;
 import org.apache.hadoop.mapreduce.Mapper;
 import org.apache.hadoop.mapreduce.lib.input.FileSplit;
-import org.apache.hadoop.sqoop.io.LobFile;
-import org.apache.hadoop.sqoop.io.LobReaderCache;
+import com.cloudera.sqoop.io.LobFile;
+import com.cloudera.sqoop.io.LobReaderCache;
 import org.apache.commons.logging.Log;
 import org.apache.commons.logging.LogFactory;

View File

@@ -16,7 +16,7 @@
 * limitations under the License.
 */
-package org.apache.hadoop.sqoop.lib;
+package com.cloudera.sqoop.lib;
 import java.io.DataInput;
 import java.io.DataOutput;

View File

@@ -16,7 +16,7 @@
 * limitations under the License.
 */
-package org.apache.hadoop.sqoop.lib;
+package com.cloudera.sqoop.lib;
 import org.apache.commons.logging.Log;
 import org.apache.commons.logging.LogFactory;

View File

@@ -16,7 +16,7 @@
 * limitations under the License.
 */
-package org.apache.hadoop.sqoop.lib;
+package com.cloudera.sqoop.lib;
 import java.io.IOException;
 import java.nio.ByteBuffer;

View File

@@ -16,7 +16,7 @@
 * limitations under the License.
 */
-package org.apache.hadoop.sqoop.lib;
+package com.cloudera.sqoop.lib;
 import org.apache.hadoop.conf.Configuration;

View File

@@ -16,7 +16,7 @@
 * limitations under the License.
 */
-package org.apache.hadoop.sqoop.manager;
+package com.cloudera.sqoop.manager;
 import java.io.IOException;
 import java.sql.Connection;
@@ -24,8 +24,8 @@
 import java.sql.SQLException;
 import java.util.Map;
-import org.apache.hadoop.sqoop.util.ExportException;
-import org.apache.hadoop.sqoop.util.ImportException;
+import com.cloudera.sqoop.util.ExportException;
+import com.cloudera.sqoop.util.ImportException;
 /**
 * Abstract interface that manages connections to a database.

View File

@@ -16,9 +16,9 @@
 * limitations under the License.
 */
-package org.apache.hadoop.sqoop.manager;
+package com.cloudera.sqoop.manager;
-import org.apache.hadoop.sqoop.SqoopOptions;
+import com.cloudera.sqoop.SqoopOptions;
 import org.apache.commons.logging.Log;
 import org.apache.commons.logging.LogFactory;

View File

@@ -16,17 +16,17 @@
 * limitations under the License.
 */
-package org.apache.hadoop.sqoop.manager;
+package com.cloudera.sqoop.manager;
 import java.io.IOException;
 import org.apache.commons.logging.Log;
 import org.apache.commons.logging.LogFactory;
-import org.apache.hadoop.sqoop.SqoopOptions;
-import org.apache.hadoop.sqoop.mapreduce.MySQLDumpImportJob;
-import org.apache.hadoop.sqoop.mapreduce.MySQLExportJob;
-import org.apache.hadoop.sqoop.util.ImportException;
-import org.apache.hadoop.sqoop.util.ExportException;
+import com.cloudera.sqoop.SqoopOptions;
+import com.cloudera.sqoop.mapreduce.MySQLDumpImportJob;
+import com.cloudera.sqoop.mapreduce.MySQLExportJob;
+import com.cloudera.sqoop.util.ImportException;
+import com.cloudera.sqoop.util.ExportException;
 /**
 * Manages direct connections to MySQL databases

View File

@@ -16,7 +16,7 @@
 * limitations under the License.
 */
-package org.apache.hadoop.sqoop.manager;
+package com.cloudera.sqoop.manager;
 import java.io.BufferedReader;
 import java.io.BufferedWriter;
@@ -32,17 +32,17 @@
 import org.apache.commons.logging.Log;
 import org.apache.commons.logging.LogFactory;
-import org.apache.hadoop.sqoop.SqoopOptions;
-import org.apache.hadoop.sqoop.io.SplittableBufferedWriter;
-import org.apache.hadoop.sqoop.util.AsyncSink;
-import org.apache.hadoop.sqoop.util.DirectImportUtils;
-import org.apache.hadoop.sqoop.util.ErrorableAsyncSink;
-import org.apache.hadoop.sqoop.util.ErrorableThread;
-import org.apache.hadoop.sqoop.util.Executor;
-import org.apache.hadoop.sqoop.util.ImportException;
-import org.apache.hadoop.sqoop.util.JdbcUrl;
-import org.apache.hadoop.sqoop.util.LoggingAsyncSink;
-import org.apache.hadoop.sqoop.util.PerfCounters;
+import com.cloudera.sqoop.SqoopOptions;
+import com.cloudera.sqoop.io.SplittableBufferedWriter;
+import com.cloudera.sqoop.util.AsyncSink;
+import com.cloudera.sqoop.util.DirectImportUtils;
+import com.cloudera.sqoop.util.ErrorableAsyncSink;
+import com.cloudera.sqoop.util.ErrorableThread;
+import com.cloudera.sqoop.util.Executor;
+import com.cloudera.sqoop.util.ImportException;
+import com.cloudera.sqoop.util.JdbcUrl;
+import com.cloudera.sqoop.util.LoggingAsyncSink;
+import com.cloudera.sqoop.util.PerfCounters;
 /**
 * Manages direct dumps from Postgresql databases via psql COPY TO STDOUT

View File

@@ -16,9 +16,9 @@
 * limitations under the License.
 */
-package org.apache.hadoop.sqoop.manager;
+package com.cloudera.sqoop.manager;
-import org.apache.hadoop.sqoop.SqoopOptions;
+import com.cloudera.sqoop.SqoopOptions;
 /**
 * A set of parameters describing an export operation; this is passed to

View File

@@ -16,7 +16,7 @@
 * limitations under the License.
 */
-package org.apache.hadoop.sqoop.manager;
+package com.cloudera.sqoop.manager;
 import java.sql.Connection;
 import java.sql.SQLException;
@@ -24,7 +24,7 @@
 import org.apache.commons.logging.Log;
 import org.apache.commons.logging.LogFactory;
-import org.apache.hadoop.sqoop.SqoopOptions;
+import com.cloudera.sqoop.SqoopOptions;
 /**
 * Database manager that is connects to a generic JDBC-compliant

View File

@@ -16,12 +16,12 @@
 * limitations under the License.
 */
-package org.apache.hadoop.sqoop.manager;
+package com.cloudera.sqoop.manager;
 import org.apache.commons.logging.Log;
 import org.apache.commons.logging.LogFactory;
-import org.apache.hadoop.sqoop.SqoopOptions;
+import com.cloudera.sqoop.SqoopOptions;
 /**
 * Manages connections to hsqldb databases.

View File

@@ -16,11 +16,11 @@
 * limitations under the License.
 */
-package org.apache.hadoop.sqoop.manager;
+package com.cloudera.sqoop.manager;
 import org.apache.hadoop.mapreduce.InputFormat;
 import org.apache.hadoop.mapreduce.lib.db.DataDrivenDBInputFormat;
-import org.apache.hadoop.sqoop.SqoopOptions;
+import com.cloudera.sqoop.SqoopOptions;
 /**
 * A set of parameters describing an import operation; this is passed to

View File

@@ -16,9 +16,9 @@
 * limitations under the License.
 */
-package org.apache.hadoop.sqoop.manager;
+package com.cloudera.sqoop.manager;
-import org.apache.hadoop.sqoop.SqoopOptions;
+import com.cloudera.sqoop.SqoopOptions;
 /**
 * Interface for factory classes for ConnManager implementations.

View File

@@ -16,7 +16,7 @@
 * limitations under the License.
 */
-package org.apache.hadoop.sqoop.manager;
+package com.cloudera.sqoop.manager;
 import java.io.IOException;
 import java.io.PrintWriter;
@@ -32,8 +32,8 @@
 import org.apache.commons.logging.LogFactory;
 import org.apache.hadoop.util.StringUtils;
-import org.apache.hadoop.sqoop.SqoopOptions;
-import org.apache.hadoop.sqoop.util.ImportException;
+import com.cloudera.sqoop.SqoopOptions;
+import com.cloudera.sqoop.util.ImportException;
 /**
 * Manages connections to MySQL databases.

View File

@@ -16,7 +16,7 @@
 * limitations under the License.
 */
-package org.apache.hadoop.sqoop.manager;
+package com.cloudera.sqoop.manager;
 import java.io.BufferedWriter;
 import java.io.File;
@@ -27,8 +27,8 @@
 import org.apache.commons.logging.Log;
 import org.apache.commons.logging.LogFactory;
 import org.apache.hadoop.conf.Configuration;
-import org.apache.hadoop.sqoop.shims.HadoopShim;
-import org.apache.hadoop.sqoop.util.DirectImportUtils;
+import com.cloudera.sqoop.shims.HadoopShim;
+import com.cloudera.sqoop.util.DirectImportUtils;
 /**
 * Helper methods and constants for MySQL imports/exports.

View File

@@ -16,7 +16,7 @@
 * limitations under the License.
 */
-package org.apache.hadoop.sqoop.manager;
+package com.cloudera.sqoop.manager;
 import java.io.IOException;
 import java.sql.Connection;
@@ -32,11 +32,11 @@
 import org.apache.hadoop.mapreduce.OutputFormat;
 import org.apache.hadoop.mapreduce.lib.db.OracleDataDrivenDBInputFormat;
-import org.apache.hadoop.sqoop.SqoopOptions;
-import org.apache.hadoop.sqoop.mapreduce.JdbcExportJob;
-import org.apache.hadoop.sqoop.shims.ShimLoader;
-import org.apache.hadoop.sqoop.util.ExportException;
-import org.apache.hadoop.sqoop.util.ImportException;
+import com.cloudera.sqoop.SqoopOptions;
+import com.cloudera.sqoop.mapreduce.JdbcExportJob;
+import com.cloudera.sqoop.shims.ShimLoader;
+import com.cloudera.sqoop.util.ExportException;
+import com.cloudera.sqoop.util.ImportException;
 /**
 * Manages connections to Oracle databases.
@@ -303,7 +303,7 @@ public void exportTable(ExportJobContext context)
 try {
 JdbcExportJob exportJob = new JdbcExportJob(context, null, null,
 (Class<? extends OutputFormat>) ShimLoader.getShimClass(
-"org.apache.hadoop.sqoop.mapreduce.OracleExportOutputFormat"));
+"com.cloudera.sqoop.mapreduce.OracleExportOutputFormat"));
 exportJob.runExport();
 } catch (ClassNotFoundException cnfe) {
 throw new ExportException("Could not start export; could not find class",

View File

@@ -16,7 +16,7 @@
 * limitations under the License.
 */
-package org.apache.hadoop.sqoop.manager;
+package com.cloudera.sqoop.manager;
 import java.io.IOException;
 import java.sql.PreparedStatement;
@@ -27,8 +27,8 @@
 import org.apache.commons.logging.Log;
 import org.apache.commons.logging.LogFactory;
-import org.apache.hadoop.sqoop.SqoopOptions;
-import org.apache.hadoop.sqoop.util.ImportException;
+import com.cloudera.sqoop.SqoopOptions;
+import com.cloudera.sqoop.util.ImportException;
 /**
 * Manages connections to Postgresql databases.

View File

@@ -16,17 +16,17 @@
 * limitations under the License.
 */
-package org.apache.hadoop.sqoop.manager;
+package com.cloudera.sqoop.manager;
-import org.apache.hadoop.sqoop.SqoopOptions;
-import org.apache.hadoop.sqoop.hive.HiveTypes;
-import org.apache.hadoop.sqoop.lib.BlobRef;
-import org.apache.hadoop.sqoop.lib.ClobRef;
-import org.apache.hadoop.sqoop.mapreduce.DataDrivenImportJob;
-import org.apache.hadoop.sqoop.mapreduce.JdbcExportJob;
-import org.apache.hadoop.sqoop.util.ExportException;
-import org.apache.hadoop.sqoop.util.ImportException;
-import org.apache.hadoop.sqoop.util.ResultSetPrinter;
+import com.cloudera.sqoop.SqoopOptions;
+import com.cloudera.sqoop.hive.HiveTypes;
+import com.cloudera.sqoop.lib.BlobRef;
+import com.cloudera.sqoop.lib.ClobRef;
+import com.cloudera.sqoop.mapreduce.DataDrivenImportJob;
+import com.cloudera.sqoop.mapreduce.JdbcExportJob;
+import com.cloudera.sqoop.util.ExportException;
+import com.cloudera.sqoop.util.ImportException;
+import com.cloudera.sqoop.util.ResultSetPrinter;
 import java.io.IOException;
 import java.io.PrintWriter;

View File

@@ -16,7 +16,7 @@
 * limitations under the License.
 */
-package org.apache.hadoop.sqoop.mapreduce;
+package com.cloudera.sqoop.mapreduce;
 import java.io.IOException;

View File

@@ -16,7 +16,7 @@
 * limitations under the License.
 */
-package org.apache.hadoop.sqoop.mapreduce;
+package com.cloudera.sqoop.mapreduce;
 import java.io.IOException;
 import java.sql.SQLException;
@@ -35,12 +35,12 @@
 import org.apache.hadoop.mapreduce.lib.db.DataDrivenDBInputFormat;
 import org.apache.hadoop.mapreduce.lib.db.DBWritable;
-import org.apache.hadoop.sqoop.ConnFactory;
-import org.apache.hadoop.sqoop.SqoopOptions;
-import org.apache.hadoop.sqoop.manager.ConnManager;
-import org.apache.hadoop.sqoop.lib.LargeObjectLoader;
-import org.apache.hadoop.sqoop.shims.HadoopShim;
-import org.apache.hadoop.sqoop.shims.ShimLoader;
+import com.cloudera.sqoop.ConnFactory;
+import com.cloudera.sqoop.SqoopOptions;
+import com.cloudera.sqoop.manager.ConnManager;
+import com.cloudera.sqoop.lib.LargeObjectLoader;
+import com.cloudera.sqoop.shims.HadoopShim;
+import com.cloudera.sqoop.shims.ShimLoader;
 /**
 * Actually runs a jdbc import job using the ORM files generated by the
@@ -90,7 +90,7 @@ protected Class<? extends OutputFormat> getOutputFormatClass()
 throws ClassNotFoundException {
 if (options.getFileLayout() == SqoopOptions.FileLayout.TextFile) {
 return (Class<? extends OutputFormat>) ShimLoader.getShimClass(
-"org.apache.hadoop.sqoop.mapreduce.RawKeyTextOutputFormat");
+"com.cloudera.sqoop.mapreduce.RawKeyTextOutputFormat");
 } else if (options.getFileLayout() == SqoopOptions.FileLayout.SequenceFile) {
 return SequenceFileOutputFormat.class;
 }

View File

@@ -16,7 +16,7 @@
 * limitations under the License.
 */
-package org.apache.hadoop.sqoop.mapreduce;
+package com.cloudera.sqoop.mapreduce;
 import java.io.FileNotFoundException;
 import java.io.IOException;
@@ -36,14 +36,14 @@
 import org.apache.hadoop.mapreduce.OutputFormat;
 import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
-import org.apache.hadoop.sqoop.SqoopOptions;
-import org.apache.hadoop.sqoop.lib.SqoopRecord;
-import org.apache.hadoop.sqoop.manager.ExportJobContext;
-import org.apache.hadoop.sqoop.orm.TableClassName;
-import org.apache.hadoop.sqoop.shims.HadoopShim;
-import org.apache.hadoop.sqoop.shims.ShimLoader;
-import org.apache.hadoop.sqoop.util.ExportException;
-import org.apache.hadoop.sqoop.util.PerfCounters;
+import com.cloudera.sqoop.SqoopOptions;
+import com.cloudera.sqoop.lib.SqoopRecord;
+import com.cloudera.sqoop.manager.ExportJobContext;
+import com.cloudera.sqoop.orm.TableClassName;
+import com.cloudera.sqoop.shims.HadoopShim;
+import com.cloudera.sqoop.shims.ShimLoader;
+import com.cloudera.sqoop.util.ExportException;
+import com.cloudera.sqoop.util.PerfCounters;
 /**
 * Base class for running an export MapReduce job.
@@ -174,7 +174,7 @@ protected Class<? extends InputFormat> getInputFormatClass()
 Class<? extends InputFormat> configuredIF = super.getInputFormatClass();
 if (null == configuredIF) {
 return (Class<? extends InputFormat>) ShimLoader.getShimClass(
-"org.apache.hadoop.sqoop.mapreduce.ExportInputFormat");
+"com.cloudera.sqoop.mapreduce.ExportInputFormat");
 } else {
 return configuredIF;
 }
@@ -186,7 +186,7 @@ protected Class<? extends OutputFormat> getOutputFormatClass()
 Class<? extends OutputFormat> configuredOF = super.getOutputFormatClass();
 if (null == configuredOF) {
 return (Class<? extends OutputFormat>) ShimLoader.getShimClass(
-"org.apache.hadoop.sqoop.mapreduce.ExportOutputFormat");
+"com.cloudera.sqoop.mapreduce.ExportOutputFormat");
 } else {
 return configuredOF;
 }

View File

@@ -16,7 +16,7 @@
 * limitations under the License.
 */
-package org.apache.hadoop.sqoop.mapreduce;
+package com.cloudera.sqoop.mapreduce;
 import java.io.IOException;
@@ -35,11 +35,11 @@
 import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
 import org.apache.hadoop.mapreduce.lib.output.SequenceFileOutputFormat;
-import org.apache.hadoop.sqoop.SqoopOptions;
-import org.apache.hadoop.sqoop.orm.TableClassName;
-import org.apache.hadoop.sqoop.shims.HadoopShim;
-import org.apache.hadoop.sqoop.util.ImportException;
-import org.apache.hadoop.sqoop.util.PerfCounters;
+import com.cloudera.sqoop.SqoopOptions;
+import com.cloudera.sqoop.orm.TableClassName;
+import com.cloudera.sqoop.shims.HadoopShim;
+import com.cloudera.sqoop.util.ImportException;
+import com.cloudera.sqoop.util.PerfCounters;
 /**
 * Base class for running an import MapReduce job.

View File

@@ -16,7 +16,7 @@
 * limitations under the License.
 */
-package org.apache.hadoop.sqoop.mapreduce;
+package com.cloudera.sqoop.mapreduce;
 import java.io.IOException;
 import java.sql.SQLException;
@@ -32,9 +32,9 @@
 import org.apache.hadoop.mapreduce.lib.db.DBConfiguration;
 import org.apache.hadoop.mapreduce.lib.db.DBOutputFormat;
-import org.apache.hadoop.sqoop.ConnFactory;
-import org.apache.hadoop.sqoop.manager.ConnManager;
-import org.apache.hadoop.sqoop.manager.ExportJobContext;
+import com.cloudera.sqoop.ConnFactory;
+import com.cloudera.sqoop.manager.ConnManager;
+import com.cloudera.sqoop.manager.ExportJobContext;
 /**
 * Run an export using JDBC (JDBC-based ExportOutputFormat).

View File

@@ -16,7 +16,7 @@
 * limitations under the License.
 */
-package org.apache.hadoop.sqoop.mapreduce;
+package com.cloudera.sqoop.mapreduce;
 import java.io.IOException;
@@ -29,9 +29,9 @@
 import org.apache.hadoop.mapreduce.Mapper;
 import org.apache.hadoop.mapreduce.OutputFormat;
-import org.apache.hadoop.sqoop.SqoopOptions;
-import org.apache.hadoop.sqoop.shims.HadoopShim;
-import org.apache.hadoop.sqoop.util.ClassLoaderStack;
+import com.cloudera.sqoop.SqoopOptions;
+import com.cloudera.sqoop.shims.HadoopShim;
+import com.cloudera.sqoop.util.ClassLoaderStack;
 /**
 * Base class for configuring and running a MapReduce job.

View File

@@ -16,7 +16,7 @@
 * limitations under the License.
 */
-package org.apache.hadoop.sqoop.mapreduce;
+package com.cloudera.sqoop.mapreduce;
 import java.io.IOException;
 import java.sql.SQLException;
@@ -33,11 +33,11 @@
 import org.apache.hadoop.mapreduce.lib.db.DataDrivenDBInputFormat;
 import org.apache.hadoop.mapreduce.lib.db.DBWritable;
-import org.apache.hadoop.sqoop.ConnFactory;
-import org.apache.hadoop.sqoop.SqoopOptions;
-import org.apache.hadoop.sqoop.manager.ConnManager;
-import org.apache.hadoop.sqoop.manager.MySQLUtils;
-import org.apache.hadoop.sqoop.shims.ShimLoader;
+import com.cloudera.sqoop.ConnFactory;
+import com.cloudera.sqoop.SqoopOptions;
+import com.cloudera.sqoop.manager.ConnManager;
+import com.cloudera.sqoop.manager.MySQLUtils;
+import com.cloudera.sqoop.shims.ShimLoader;
 /**
 * Class that runs an import job using mysqldump in the mapper.
@@ -51,9 +51,9 @@ public MySQLDumpImportJob(final SqoopOptions opts)
 throws ClassNotFoundException {
 super(opts, MySQLDumpMapper.class,
 (Class<? extends InputFormat>) ShimLoader.getShimClass(
-"org.apache.hadoop.sqoop.mapreduce.MySQLDumpInputFormat"),
+"com.cloudera.sqoop.mapreduce.MySQLDumpInputFormat"),
 (Class<? extends OutputFormat>) ShimLoader.getShimClass(
-"org.apache.hadoop.sqoop.mapreduce.RawKeyTextOutputFormat"));
+"com.cloudera.sqoop.mapreduce.RawKeyTextOutputFormat"));
 }
/** /**

View File

@@ -16,7 +16,7 @@
 * limitations under the License.
 */
-package org.apache.hadoop.sqoop.mapreduce;
+package com.cloudera.sqoop.mapreduce;
 import java.io.BufferedReader;
 import java.io.File;
@@ -32,15 +32,15 @@
 import org.apache.hadoop.conf.Configuration;
 import org.apache.hadoop.io.NullWritable;
 import org.apache.hadoop.mapreduce.Mapper;
-import org.apache.hadoop.sqoop.lib.FieldFormatter;
-import org.apache.hadoop.sqoop.lib.RecordParser;
-import org.apache.hadoop.sqoop.manager.MySQLUtils;
-import org.apache.hadoop.sqoop.util.AsyncSink;
-import org.apache.hadoop.sqoop.util.ErrorableAsyncSink;
-import org.apache.hadoop.sqoop.util.ErrorableThread;
-import org.apache.hadoop.sqoop.util.JdbcUrl;
-import org.apache.hadoop.sqoop.util.LoggingAsyncSink;
-import org.apache.hadoop.sqoop.util.PerfCounters;
+import com.cloudera.sqoop.lib.FieldFormatter;
+import com.cloudera.sqoop.lib.RecordParser;
+import com.cloudera.sqoop.manager.MySQLUtils;
+import com.cloudera.sqoop.util.AsyncSink;
+import com.cloudera.sqoop.util.ErrorableAsyncSink;
+import com.cloudera.sqoop.util.ErrorableThread;
+import com.cloudera.sqoop.util.JdbcUrl;
+import com.cloudera.sqoop.util.LoggingAsyncSink;
+import com.cloudera.sqoop.util.PerfCounters;
 /**
 * Mapper that opens up a pipe to mysqldump and pulls data directly.

View File

@@ -16,7 +16,7 @@
 * limitations under the License.
 */
-package org.apache.hadoop.sqoop.mapreduce;
+package com.cloudera.sqoop.mapreduce;
 import java.io.IOException;
 import java.sql.SQLException;
@@ -32,10 +32,10 @@
 import org.apache.hadoop.mapreduce.lib.db.DataDrivenDBInputFormat;
 import org.apache.hadoop.mapreduce.lib.db.DBWritable;
-import org.apache.hadoop.sqoop.ConnFactory;
-import org.apache.hadoop.sqoop.manager.ConnManager;
-import org.apache.hadoop.sqoop.manager.ExportJobContext;
-import org.apache.hadoop.sqoop.manager.MySQLUtils;
+import com.cloudera.sqoop.ConnFactory;
+import com.cloudera.sqoop.manager.ConnManager;
+import com.cloudera.sqoop.manager.ExportJobContext;
+import com.cloudera.sqoop.manager.MySQLUtils;
 /**
 * Class that runs an export job using mysqlimport in the mapper.

View File

@@ -16,7 +16,7 @@
 * limitations under the License.
 */
-package org.apache.hadoop.sqoop.mapreduce;
+package com.cloudera.sqoop.mapreduce;
 import java.io.BufferedOutputStream;
 import java.io.File;
@@ -32,13 +32,13 @@
 import org.apache.hadoop.util.Shell;
 import org.apache.hadoop.mapreduce.Mapper;
 import org.apache.hadoop.mapreduce.lib.db.DBConfiguration;
-import org.apache.hadoop.sqoop.lib.TaskId;
-import org.apache.hadoop.sqoop.manager.MySQLUtils;
-import org.apache.hadoop.sqoop.shims.HadoopShim;
-import org.apache.hadoop.sqoop.util.AsyncSink;
-import org.apache.hadoop.sqoop.util.JdbcUrl;
-import org.apache.hadoop.sqoop.util.LoggingAsyncSink;
-import org.apache.hadoop.sqoop.util.NullAsyncSink;
+import com.cloudera.sqoop.lib.TaskId;
+import com.cloudera.sqoop.manager.MySQLUtils;
+import com.cloudera.sqoop.shims.HadoopShim;
+import com.cloudera.sqoop.util.AsyncSink;
+import com.cloudera.sqoop.util.JdbcUrl;
+import com.cloudera.sqoop.util.LoggingAsyncSink;
+import com.cloudera.sqoop.util.NullAsyncSink;
 /**
 * Mapper that starts a 'mysqlimport' process and uses that to export rows from

View File

@@ -16,12 +16,12 @@
 * limitations under the License.
 */
-package org.apache.hadoop.sqoop.mapreduce;
+package com.cloudera.sqoop.mapreduce;
 import java.io.IOException;
 import org.apache.hadoop.io.LongWritable;
-import org.apache.hadoop.sqoop.lib.SqoopRecord;
+import com.cloudera.sqoop.lib.SqoopRecord;
 /**
 * mysqlimport-based exporter which accepts SqoopRecords (e.g., from

View File

@@ -16,14 +16,14 @@
 * limitations under the License.
 */
-package org.apache.hadoop.sqoop.mapreduce;
+package com.cloudera.sqoop.mapreduce;
 import java.io.IOException;
 import org.apache.hadoop.io.LongWritable;
 import org.apache.hadoop.io.Text;
-import org.apache.hadoop.sqoop.manager.MySQLUtils;
+import com.cloudera.sqoop.manager.MySQLUtils;
 /**
 * mysqlimport-based exporter which accepts lines of text from files

View File

@@ -16,7 +16,7 @@
 * limitations under the License.
 */
-package org.apache.hadoop.sqoop.mapreduce;
+package com.cloudera.sqoop.mapreduce;
 import java.io.IOException;
@@ -24,7 +24,7 @@
 import org.apache.hadoop.io.NullWritable;
 import org.apache.hadoop.mapreduce.Mapper.Context;
-import org.apache.hadoop.sqoop.lib.SqoopRecord;
+import com.cloudera.sqoop.lib.SqoopRecord;
 /**
 * Reads a SqoopRecord from the SequenceFile in which it's packed and emits

View File

@@ -16,7 +16,7 @@
 * limitations under the License.
 */
-package org.apache.hadoop.sqoop.mapreduce;
+package com.cloudera.sqoop.mapreduce;
 import java.io.IOException;
 import java.sql.SQLException;
@@ -24,8 +24,8 @@
 import org.apache.hadoop.io.LongWritable;
 import org.apache.hadoop.mapreduce.Mapper.Context;
 import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
-import org.apache.hadoop.sqoop.lib.LargeObjectLoader;
-import org.apache.hadoop.sqoop.lib.SqoopRecord;
+import com.cloudera.sqoop.lib.LargeObjectLoader;
+import com.cloudera.sqoop.lib.SqoopRecord;
 /**
 * Imports records by writing them to a SequenceFile.

View File

@@ -16,7 +16,7 @@
 * limitations under the License.
 */
-package org.apache.hadoop.sqoop.mapreduce;
+package com.cloudera.sqoop.mapreduce;
 import java.io.IOException;
@@ -27,8 +27,8 @@
 import org.apache.hadoop.mapreduce.Mapper.Context;
 import org.apache.hadoop.util.ReflectionUtils;
-import org.apache.hadoop.sqoop.lib.RecordParser;
-import org.apache.hadoop.sqoop.lib.SqoopRecord;
+import com.cloudera.sqoop.lib.RecordParser;
+import com.cloudera.sqoop.lib.SqoopRecord;
 /**
 * Converts an input record from a string representation to a parsed Sqoop

View File

@@ -16,7 +16,7 @@
 * limitations under the License.
 */
-package org.apache.hadoop.sqoop.mapreduce;
+package com.cloudera.sqoop.mapreduce;
 import java.io.IOException;
 import java.sql.SQLException;
@@ -26,8 +26,8 @@
 import org.apache.hadoop.io.Text;
 import org.apache.hadoop.mapreduce.Mapper.Context;
 import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
-import org.apache.hadoop.sqoop.lib.LargeObjectLoader;
-import org.apache.hadoop.sqoop.lib.SqoopRecord;
+import com.cloudera.sqoop.lib.LargeObjectLoader;
+import com.cloudera.sqoop.lib.SqoopRecord;
 /**
 * Imports records by transforming them to strings for a plain-text flat file.

View File

@@ -16,20 +16,20 @@
 * limitations under the License.
 */
-package org.apache.hadoop.sqoop.orm;
+package com.cloudera.sqoop.orm;
 import org.apache.hadoop.io.BytesWritable;
-import org.apache.hadoop.sqoop.SqoopOptions;
-import org.apache.hadoop.sqoop.manager.ConnManager;
-import org.apache.hadoop.sqoop.lib.BigDecimalSerializer;
-import org.apache.hadoop.sqoop.lib.FieldFormatter;
-import org.apache.hadoop.sqoop.lib.JdbcWritableBridge;
-import org.apache.hadoop.sqoop.lib.LargeObjectLoader;
-import org.apache.hadoop.sqoop.lib.LobSerializer;
-import org.apache.hadoop.sqoop.lib.RecordParser;
-import org.apache.hadoop.sqoop.lib.BlobRef;
-import org.apache.hadoop.sqoop.lib.ClobRef;
-import org.apache.hadoop.sqoop.lib.SqoopRecord;
+import com.cloudera.sqoop.SqoopOptions;
+import com.cloudera.sqoop.manager.ConnManager;
+import com.cloudera.sqoop.lib.BigDecimalSerializer;
+import com.cloudera.sqoop.lib.FieldFormatter;
+import com.cloudera.sqoop.lib.JdbcWritableBridge;
+import com.cloudera.sqoop.lib.LargeObjectLoader;
+import com.cloudera.sqoop.lib.LobSerializer;
+import com.cloudera.sqoop.lib.RecordParser;
+import com.cloudera.sqoop.lib.BlobRef;
+import com.cloudera.sqoop.lib.ClobRef;
+import com.cloudera.sqoop.lib.SqoopRecord;
 import java.io.File;
 import java.io.FileOutputStream;

View File

@@ -16,7 +16,7 @@
 * limitations under the License.
 */
-package org.apache.hadoop.sqoop.orm;
+package com.cloudera.sqoop.orm;
 import java.io.File;
 import java.io.FileInputStream;
@@ -40,9 +40,9 @@
 import org.apache.commons.logging.LogFactory;
 import org.apache.hadoop.mapred.JobConf;
-import org.apache.hadoop.sqoop.SqoopOptions;
-import org.apache.hadoop.sqoop.util.FileListing;
-import org.apache.hadoop.sqoop.shims.HadoopShim;
+import com.cloudera.sqoop.SqoopOptions;
+import com.cloudera.sqoop.util.FileListing;
+import com.cloudera.sqoop.shims.HadoopShim;
 /**
 * Manages the compilation of a bunch of .java files into .class files

View File

@@ -16,9 +16,9 @@
 * limitations under the License.
 */
-package org.apache.hadoop.sqoop.orm;
+package com.cloudera.sqoop.orm;
-import org.apache.hadoop.sqoop.SqoopOptions;
+import com.cloudera.sqoop.SqoopOptions;
 import org.apache.commons.logging.Log;
 import org.apache.commons.logging.LogFactory;

View File

@@ -15,7 +15,7 @@
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */
-package org.apache.hadoop.sqoop.shims;
+package com.cloudera.sqoop.shims;
 import java.io.IOException;

View File

@ -15,7 +15,7 @@
* See the License for the specific language governing permissions and * See the License for the specific language governing permissions and
* limitations under the License. * limitations under the License.
*/ */
package org.apache.hadoop.sqoop.shims; package com.cloudera.sqoop.shims;
import java.io.File; import java.io.File;
import java.io.IOException; import java.io.IOException;
@ -27,7 +27,7 @@
import org.apache.hadoop.conf.Configuration; import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.util.VersionInfo; import org.apache.hadoop.util.VersionInfo;
import org.apache.hadoop.sqoop.util.ClassLoaderStack; import com.cloudera.sqoop.util.ClassLoaderStack;
/** /**
* Provides a service locator for the appropriate shim, dynamically chosen * Provides a service locator for the appropriate shim, dynamically chosen
@ -70,29 +70,29 @@ public abstract class ShimLoader {
// CDH3 (based on 0.20.2) // CDH3 (based on 0.20.2)
HADOOP_SHIM_MATCHES.add("0.20.2-[cC][dD][hH]3.*"); HADOOP_SHIM_MATCHES.add("0.20.2-[cC][dD][hH]3.*");
HADOOP_SHIM_CLASSES.add("org.apache.hadoop.sqoop.shims.CDH3Shim"); HADOOP_SHIM_CLASSES.add("com.cloudera.sqoop.shims.CDH3Shim");
HADOOP_SHIM_JARS.add("sqoop-.*-cloudera.jar"); HADOOP_SHIM_JARS.add("sqoop-.*-cloudera.jar");
// Apache 0.22 trunk. // Apache 0.22 trunk.
// Version may have the form "0.22-SNAPSHOT" // Version may have the form "0.22-SNAPSHOT"
HADOOP_SHIM_MATCHES.add("0.22-.*"); HADOOP_SHIM_MATCHES.add("0.22-.*");
HADOOP_SHIM_CLASSES.add("org.apache.hadoop.sqoop.shims.Apache22HadoopShim"); HADOOP_SHIM_CLASSES.add("com.cloudera.sqoop.shims.Apache22HadoopShim");
HADOOP_SHIM_JARS.add("sqoop-.*-apache.jar"); HADOOP_SHIM_JARS.add("sqoop-.*-apache.jar");
// ... or "0.22.n-SNAPSHOT" // ... or "0.22.n-SNAPSHOT"
HADOOP_SHIM_MATCHES.add("0.22.\\d+-.*"); HADOOP_SHIM_MATCHES.add("0.22.\\d+-.*");
HADOOP_SHIM_CLASSES.add("org.apache.hadoop.sqoop.shims.Apache22HadoopShim"); HADOOP_SHIM_CLASSES.add("com.cloudera.sqoop.shims.Apache22HadoopShim");
HADOOP_SHIM_JARS.add("sqoop-.*-apache.jar"); HADOOP_SHIM_JARS.add("sqoop-.*-apache.jar");
// Apache 0.22 trunk snapshots often compile with "Unknown" version, // Apache 0.22 trunk snapshots often compile with "Unknown" version,
// so we default to guessing Apache in this case. // so we default to guessing Apache in this case.
HADOOP_SHIM_MATCHES.add("Unknown"); HADOOP_SHIM_MATCHES.add("Unknown");
HADOOP_SHIM_CLASSES.add("org.apache.hadoop.sqoop.shims.Apache22HadoopShim"); HADOOP_SHIM_CLASSES.add("com.cloudera.sqoop.shims.Apache22HadoopShim");
HADOOP_SHIM_JARS.add("sqoop-.*-apache.jar"); HADOOP_SHIM_JARS.add("sqoop-.*-apache.jar");
// Apache 0.21 uses the same shim jars as 0.22 // Apache 0.21 uses the same shim jars as 0.22
HADOOP_SHIM_MATCHES.add("0.21.\\d+(-.*)?"); HADOOP_SHIM_MATCHES.add("0.21.\\d+(-.*)?");
HADOOP_SHIM_CLASSES.add("org.apache.hadoop.sqoop.shims.Apache22HadoopShim"); HADOOP_SHIM_CLASSES.add("com.cloudera.sqoop.shims.Apache22HadoopShim");
HADOOP_SHIM_JARS.add("sqoop-.*-apache.jar"); HADOOP_SHIM_JARS.add("sqoop-.*-apache.jar");
} }
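The hunk above registers, in parallel lists, a regex for each supported Hadoop version string together with the shim class and jar to load when that regex matches. A minimal sketch of that selection step, assuming only the parallel-list layout shown in the diff (the helper name getShimClassName is hypothetical, not part of ShimLoader's real API):

import java.util.ArrayList;
import java.util.List;

public final class ShimSelectorSketch {

  // Parallel lists, populated the same way as ShimLoader's static block above.
  private static final List<String> HADOOP_SHIM_MATCHES = new ArrayList<String>();
  private static final List<String> HADOOP_SHIM_CLASSES = new ArrayList<String>();

  static {
    HADOOP_SHIM_MATCHES.add("0.20.2-[cC][dD][hH]3.*");
    HADOOP_SHIM_CLASSES.add("com.cloudera.sqoop.shims.CDH3Shim");

    HADOOP_SHIM_MATCHES.add("0.22-.*");
    HADOOP_SHIM_CLASSES.add("com.cloudera.sqoop.shims.Apache22HadoopShim");
  }

  // Return the first registered shim class whose pattern matches the
  // running Hadoop version (in Sqoop this comes from VersionInfo.getVersion()).
  static String getShimClassName(String hadoopVersion) {
    for (int i = 0; i < HADOOP_SHIM_MATCHES.size(); i++) {
      if (hadoopVersion.matches(HADOOP_SHIM_MATCHES.get(i))) {
        return HADOOP_SHIM_CLASSES.get(i);
      }
    }
    throw new RuntimeException("No shim registered for Hadoop " + hadoopVersion);
  }

  public static void main(String[] args) {
    // e.g. prints com.cloudera.sqoop.shims.CDH3Shim
    System.out.println(getShimClassName("0.20.2-CDH3B4"));
  }
}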


@ -16,7 +16,7 @@
* limitations under the License. * limitations under the License.
*/ */
package org.apache.hadoop.sqoop.tool; package com.cloudera.sqoop.tool;
import java.sql.SQLException; import java.sql.SQLException;
import java.util.Arrays; import java.util.Arrays;
@ -30,14 +30,14 @@
import org.apache.log4j.Level; import org.apache.log4j.Level;
import org.apache.log4j.Logger; import org.apache.log4j.Logger;
import org.apache.hadoop.sqoop.ConnFactory; import com.cloudera.sqoop.ConnFactory;
import org.apache.hadoop.sqoop.Sqoop; import com.cloudera.sqoop.Sqoop;
import org.apache.hadoop.sqoop.SqoopOptions; import com.cloudera.sqoop.SqoopOptions;
import org.apache.hadoop.sqoop.SqoopOptions.InvalidOptionsException; import com.cloudera.sqoop.SqoopOptions.InvalidOptionsException;
import org.apache.hadoop.sqoop.cli.RelatedOptions; import com.cloudera.sqoop.cli.RelatedOptions;
import org.apache.hadoop.sqoop.cli.ToolOptions; import com.cloudera.sqoop.cli.ToolOptions;
import org.apache.hadoop.sqoop.manager.ConnManager; import com.cloudera.sqoop.manager.ConnManager;
import org.apache.hadoop.sqoop.shims.ShimLoader; import com.cloudera.sqoop.shims.ShimLoader;
/** /**
* Layer on top of SqoopTool that provides some basic common code * Layer on top of SqoopTool that provides some basic common code


@ -16,7 +16,7 @@
* limitations under the License. * limitations under the License.
*/ */
package org.apache.hadoop.sqoop.tool; package com.cloudera.sqoop.tool;
import java.io.IOException; import java.io.IOException;
import java.util.ArrayList; import java.util.ArrayList;
@ -28,14 +28,14 @@
import org.apache.commons.logging.LogFactory; import org.apache.commons.logging.LogFactory;
import org.apache.hadoop.util.StringUtils; import org.apache.hadoop.util.StringUtils;
import org.apache.hadoop.sqoop.Sqoop; import com.cloudera.sqoop.Sqoop;
import org.apache.hadoop.sqoop.SqoopOptions; import com.cloudera.sqoop.SqoopOptions;
import org.apache.hadoop.sqoop.SqoopOptions.InvalidOptionsException; import com.cloudera.sqoop.SqoopOptions.InvalidOptionsException;
import org.apache.hadoop.sqoop.cli.RelatedOptions; import com.cloudera.sqoop.cli.RelatedOptions;
import org.apache.hadoop.sqoop.cli.ToolOptions; import com.cloudera.sqoop.cli.ToolOptions;
import org.apache.hadoop.sqoop.hive.HiveImport; import com.cloudera.sqoop.hive.HiveImport;
import org.apache.hadoop.sqoop.orm.ClassWriter; import com.cloudera.sqoop.orm.ClassWriter;
import org.apache.hadoop.sqoop.orm.CompilationManager; import com.cloudera.sqoop.orm.CompilationManager;
/** /**
* Tool that generates code from a database schema. * Tool that generates code from a database schema.


@ -16,7 +16,7 @@
* limitations under the License. * limitations under the License.
*/ */
package org.apache.hadoop.sqoop.tool; package com.cloudera.sqoop.tool;
import java.io.IOException; import java.io.IOException;
@ -26,12 +26,12 @@
import org.apache.commons.logging.LogFactory; import org.apache.commons.logging.LogFactory;
import org.apache.hadoop.util.StringUtils; import org.apache.hadoop.util.StringUtils;
import org.apache.hadoop.sqoop.Sqoop; import com.cloudera.sqoop.Sqoop;
import org.apache.hadoop.sqoop.SqoopOptions; import com.cloudera.sqoop.SqoopOptions;
import org.apache.hadoop.sqoop.SqoopOptions.InvalidOptionsException; import com.cloudera.sqoop.SqoopOptions.InvalidOptionsException;
import org.apache.hadoop.sqoop.cli.RelatedOptions; import com.cloudera.sqoop.cli.RelatedOptions;
import org.apache.hadoop.sqoop.cli.ToolOptions; import com.cloudera.sqoop.cli.ToolOptions;
import org.apache.hadoop.sqoop.hive.HiveImport; import com.cloudera.sqoop.hive.HiveImport;
/** /**
* Tool that creates a Hive table definition. * Tool that creates a Hive table definition.


@ -16,17 +16,17 @@
* limitations under the License. * limitations under the License.
*/ */
package org.apache.hadoop.sqoop.tool; package com.cloudera.sqoop.tool;
import org.apache.commons.cli.CommandLine; import org.apache.commons.cli.CommandLine;
import org.apache.commons.cli.OptionBuilder; import org.apache.commons.cli.OptionBuilder;
import org.apache.commons.logging.Log; import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory; import org.apache.commons.logging.LogFactory;
import org.apache.hadoop.sqoop.SqoopOptions; import com.cloudera.sqoop.SqoopOptions;
import org.apache.hadoop.sqoop.SqoopOptions.InvalidOptionsException; import com.cloudera.sqoop.SqoopOptions.InvalidOptionsException;
import org.apache.hadoop.sqoop.cli.RelatedOptions; import com.cloudera.sqoop.cli.RelatedOptions;
import org.apache.hadoop.sqoop.cli.ToolOptions; import com.cloudera.sqoop.cli.ToolOptions;
/** /**
* Tool that evaluates a SQL statement and displays the results. * Tool that evaluates a SQL statement and displays the results.


@ -16,7 +16,7 @@
* limitations under the License. * limitations under the License.
*/ */
package org.apache.hadoop.sqoop.tool; package com.cloudera.sqoop.tool;
import java.io.IOException; import java.io.IOException;
import java.util.List; import java.util.List;
@ -26,13 +26,13 @@
import org.apache.commons.logging.Log; import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory; import org.apache.commons.logging.LogFactory;
import org.apache.hadoop.sqoop.Sqoop; import com.cloudera.sqoop.Sqoop;
import org.apache.hadoop.sqoop.SqoopOptions; import com.cloudera.sqoop.SqoopOptions;
import org.apache.hadoop.sqoop.SqoopOptions.InvalidOptionsException; import com.cloudera.sqoop.SqoopOptions.InvalidOptionsException;
import org.apache.hadoop.sqoop.cli.RelatedOptions; import com.cloudera.sqoop.cli.RelatedOptions;
import org.apache.hadoop.sqoop.cli.ToolOptions; import com.cloudera.sqoop.cli.ToolOptions;
import org.apache.hadoop.sqoop.manager.ExportJobContext; import com.cloudera.sqoop.manager.ExportJobContext;
import org.apache.hadoop.sqoop.util.ExportException; import com.cloudera.sqoop.util.ExportException;
/** /**
* Tool that performs HDFS exports to databases. * Tool that performs HDFS exports to databases.


@ -16,12 +16,12 @@
* limitations under the License. * limitations under the License.
*/ */
package org.apache.hadoop.sqoop.tool; package com.cloudera.sqoop.tool;
import java.util.Set; import java.util.Set;
import org.apache.hadoop.sqoop.SqoopOptions; import com.cloudera.sqoop.SqoopOptions;
import org.apache.hadoop.sqoop.cli.ToolOptions; import com.cloudera.sqoop.cli.ToolOptions;
/** /**
* Tool that explains the usage of Sqoop. * Tool that explains the usage of Sqoop.


@ -16,17 +16,17 @@
* limitations under the License. * limitations under the License.
*/ */
package org.apache.hadoop.sqoop.tool; package com.cloudera.sqoop.tool;
import java.io.IOException; import java.io.IOException;
import org.apache.commons.logging.Log; import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory; import org.apache.commons.logging.LogFactory;
import org.apache.hadoop.sqoop.Sqoop; import com.cloudera.sqoop.Sqoop;
import org.apache.hadoop.sqoop.SqoopOptions; import com.cloudera.sqoop.SqoopOptions;
import org.apache.hadoop.sqoop.hive.HiveImport; import com.cloudera.sqoop.hive.HiveImport;
import org.apache.hadoop.sqoop.util.ImportException; import com.cloudera.sqoop.util.ImportException;
/** /**
* Tool that performs database imports of all tables in a database to HDFS. * Tool that performs database imports of all tables in a database to HDFS.


@ -16,7 +16,7 @@
* limitations under the License. * limitations under the License.
*/ */
package org.apache.hadoop.sqoop.tool; package com.cloudera.sqoop.tool;
import java.io.IOException; import java.io.IOException;
import java.util.List; import java.util.List;
@ -26,14 +26,14 @@
import org.apache.commons.logging.Log; import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory; import org.apache.commons.logging.LogFactory;
import org.apache.hadoop.sqoop.Sqoop; import com.cloudera.sqoop.Sqoop;
import org.apache.hadoop.sqoop.SqoopOptions; import com.cloudera.sqoop.SqoopOptions;
import org.apache.hadoop.sqoop.SqoopOptions.InvalidOptionsException; import com.cloudera.sqoop.SqoopOptions.InvalidOptionsException;
import org.apache.hadoop.sqoop.cli.RelatedOptions; import com.cloudera.sqoop.cli.RelatedOptions;
import org.apache.hadoop.sqoop.cli.ToolOptions; import com.cloudera.sqoop.cli.ToolOptions;
import org.apache.hadoop.sqoop.hive.HiveImport; import com.cloudera.sqoop.hive.HiveImport;
import org.apache.hadoop.sqoop.manager.ImportJobContext; import com.cloudera.sqoop.manager.ImportJobContext;
import org.apache.hadoop.sqoop.util.ImportException; import com.cloudera.sqoop.util.ImportException;
/** /**
* Tool that performs database imports to HDFS. * Tool that performs database imports to HDFS.


@ -16,15 +16,15 @@
* limitations under the License. * limitations under the License.
*/ */
package org.apache.hadoop.sqoop.tool; package com.cloudera.sqoop.tool;
import org.apache.commons.cli.CommandLine; import org.apache.commons.cli.CommandLine;
import org.apache.commons.logging.Log; import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory; import org.apache.commons.logging.LogFactory;
import org.apache.hadoop.sqoop.SqoopOptions; import com.cloudera.sqoop.SqoopOptions;
import org.apache.hadoop.sqoop.SqoopOptions.InvalidOptionsException; import com.cloudera.sqoop.SqoopOptions.InvalidOptionsException;
import org.apache.hadoop.sqoop.cli.ToolOptions; import com.cloudera.sqoop.cli.ToolOptions;
/** /**
* Tool that lists available databases on a server. * Tool that lists available databases on a server.


@ -16,15 +16,15 @@
* limitations under the License. * limitations under the License.
*/ */
package org.apache.hadoop.sqoop.tool; package com.cloudera.sqoop.tool;
import org.apache.commons.cli.CommandLine; import org.apache.commons.cli.CommandLine;
import org.apache.commons.logging.Log; import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory; import org.apache.commons.logging.LogFactory;
import org.apache.hadoop.sqoop.SqoopOptions; import com.cloudera.sqoop.SqoopOptions;
import org.apache.hadoop.sqoop.SqoopOptions.InvalidOptionsException; import com.cloudera.sqoop.SqoopOptions.InvalidOptionsException;
import org.apache.hadoop.sqoop.cli.ToolOptions; import com.cloudera.sqoop.cli.ToolOptions;
/** /**
* Tool that lists available tables in a database. * Tool that lists available tables in a database.


@ -16,7 +16,7 @@
* limitations under the License. * limitations under the License.
*/ */
package org.apache.hadoop.sqoop.tool; package com.cloudera.sqoop.tool;
import java.io.IOException; import java.io.IOException;
import java.util.Map; import java.util.Map;
@ -32,11 +32,11 @@
import org.apache.hadoop.util.StringUtils; import org.apache.hadoop.util.StringUtils;
import org.apache.hadoop.util.ToolRunner; import org.apache.hadoop.util.ToolRunner;
import org.apache.hadoop.sqoop.SqoopOptions; import com.cloudera.sqoop.SqoopOptions;
import org.apache.hadoop.sqoop.SqoopOptions.InvalidOptionsException; import com.cloudera.sqoop.SqoopOptions.InvalidOptionsException;
import org.apache.hadoop.sqoop.cli.SqoopParser; import com.cloudera.sqoop.cli.SqoopParser;
import org.apache.hadoop.sqoop.cli.ToolOptions; import com.cloudera.sqoop.cli.ToolOptions;
import org.apache.hadoop.sqoop.shims.ShimLoader; import com.cloudera.sqoop.shims.ShimLoader;
/** /**
* Base class for Sqoop subprograms (e.g., SqoopImport, SqoopExport, etc.) * Base class for Sqoop subprograms (e.g., SqoopImport, SqoopExport, etc.)
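The tool classes changed throughout this diff all extend this base class. As a rough illustration only — the constructor and run(SqoopOptions) signature used here are assumptions about the SqoopTool API at this revision, not taken from the diff:

package com.cloudera.sqoop.tool;

import com.cloudera.sqoop.SqoopOptions;

// Hypothetical subclass; the tool name "hello" and the printed field are illustrative.
public class HelloTool extends SqoopTool {

  public HelloTool() {
    super("hello"); // assumed (String toolName) constructor
  }

  @Override
  public int run(SqoopOptions options) {
    // assumed accessor; other tools in this diff read options similarly
    System.out.println("connect string: " + options.getConnectString());
    return 0; // tools return 0 on success
  }
}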


@ -16,11 +16,11 @@
* limitations under the License. * limitations under the License.
*/ */
package org.apache.hadoop.sqoop.tool; package com.cloudera.sqoop.tool;
import org.apache.hadoop.sqoop.SqoopOptions; import com.cloudera.sqoop.SqoopOptions;
import org.apache.hadoop.sqoop.SqoopVersion; import com.cloudera.sqoop.SqoopVersion;
import org.apache.hadoop.sqoop.cli.ToolOptions; import com.cloudera.sqoop.cli.ToolOptions;
/** /**
* Tool that prints Sqoop's version. * Tool that prints Sqoop's version.


@ -16,7 +16,7 @@
* limitations under the License. * limitations under the License.
*/ */
package org.apache.hadoop.sqoop.util; package com.cloudera.sqoop.util;
import java.io.InputStream; import java.io.InputStream;


@ -16,7 +16,7 @@
* limitations under the License. * limitations under the License.
*/ */
package org.apache.hadoop.sqoop.util; package com.cloudera.sqoop.util;
import java.io.File; import java.io.File;
import java.io.IOException; import java.io.IOException;


@ -16,7 +16,7 @@
* limitations under the License. * limitations under the License.
*/ */
package org.apache.hadoop.sqoop.util; package com.cloudera.sqoop.util;
import java.io.IOException; import java.io.IOException;
import java.io.File; import java.io.File;
@ -29,9 +29,9 @@
import org.apache.hadoop.fs.FileSystem; import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path; import org.apache.hadoop.fs.Path;
import org.apache.hadoop.conf.Configuration; import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.sqoop.SqoopOptions; import com.cloudera.sqoop.SqoopOptions;
import org.apache.hadoop.sqoop.io.SplittingOutputStream; import com.cloudera.sqoop.io.SplittingOutputStream;
import org.apache.hadoop.sqoop.io.SplittableBufferedWriter; import com.cloudera.sqoop.io.SplittableBufferedWriter;
import org.apache.hadoop.util.Shell; import org.apache.hadoop.util.Shell;
/** /**


@ -16,7 +16,7 @@
* limitations under the License. * limitations under the License.
*/ */
package org.apache.hadoop.sqoop.util; package com.cloudera.sqoop.util;
/** /**
* Partial implementation of AsyncSink that relies on ErrorableThread to * Partial implementation of AsyncSink that relies on ErrorableThread to


@ -16,7 +16,7 @@
* limitations under the License. * limitations under the License.
*/ */
package org.apache.hadoop.sqoop.util; package com.cloudera.sqoop.util;
/** /**
* A thread which has an error bit which can be set from within the thread. * A thread which has an error bit which can be set from within the thread.


@ -16,7 +16,7 @@
* limitations under the License. * limitations under the License.
*/ */
package org.apache.hadoop.sqoop.util; package com.cloudera.sqoop.util;
import java.io.IOException; import java.io.IOException;
import java.util.ArrayList; import java.util.ArrayList;


@ -16,7 +16,7 @@
* limitations under the License. * limitations under the License.
*/ */
package org.apache.hadoop.sqoop.util; package com.cloudera.sqoop.util;
/** /**
* General error during export process. * General error during export process.


@ -16,7 +16,7 @@
* limitations under the License. * limitations under the License.
*/ */
package org.apache.hadoop.sqoop.util; package com.cloudera.sqoop.util;
import java.util.Arrays; import java.util.Arrays;
import java.util.ArrayList; import java.util.ArrayList;


@ -16,7 +16,7 @@
* limitations under the License. * limitations under the License.
*/ */
package org.apache.hadoop.sqoop.util; package com.cloudera.sqoop.util;
/** /**
* General error during the import process. * General error during the import process.


@ -16,7 +16,7 @@
* limitations under the License. * limitations under the License.
*/ */
package org.apache.hadoop.sqoop.util; package com.cloudera.sqoop.util;
import java.net.MalformedURLException; import java.net.MalformedURLException;
import java.net.URL; import java.net.URL;


@ -16,7 +16,7 @@
* limitations under the License. * limitations under the License.
*/ */
package org.apache.hadoop.sqoop.util; package com.cloudera.sqoop.util;
import java.io.BufferedReader; import java.io.BufferedReader;
import java.io.InputStream; import java.io.InputStream;


@ -16,7 +16,7 @@
* limitations under the License. * limitations under the License.
*/ */
package org.apache.hadoop.sqoop.util; package com.cloudera.sqoop.util;
import java.io.BufferedReader; import java.io.BufferedReader;
import java.io.InputStream; import java.io.InputStream;


@ -16,7 +16,7 @@
* limitations under the License. * limitations under the License.
*/ */
package org.apache.hadoop.sqoop.util; package com.cloudera.sqoop.util;
import java.text.NumberFormat; import java.text.NumberFormat;


@ -16,7 +16,7 @@
* limitations under the License. * limitations under the License.
*/ */
package org.apache.hadoop.sqoop.util; package com.cloudera.sqoop.util;
import java.io.IOException; import java.io.IOException;
import java.io.PrintWriter; import java.io.PrintWriter;


@ -23,10 +23,10 @@
import org.apache.hadoop.conf.*; import org.apache.hadoop.conf.*;
import org.apache.hadoop.util.*; import org.apache.hadoop.util.*;
import org.apache.hadoop.sqoop.Sqoop; import com.cloudera.sqoop.Sqoop;
import org.apache.hadoop.sqoop.SqoopOptions; import com.cloudera.sqoop.SqoopOptions;
import org.apache.hadoop.sqoop.tool.ExportTool; import com.cloudera.sqoop.tool.ExportTool;
import org.apache.hadoop.sqoop.tool.SqoopTool; import com.cloudera.sqoop.tool.SqoopTool;
/** /**
* Stress test export procedure by running a large-scale export to MySQL. * Stress test export procedure by running a large-scale export to MySQL.


@ -19,7 +19,7 @@
import java.io.*; import java.io.*;
import org.apache.hadoop.fs.*; import org.apache.hadoop.fs.*;
import org.apache.hadoop.conf.*; import org.apache.hadoop.conf.*;
import org.apache.hadoop.sqoop.io.*; import com.cloudera.sqoop.io.*;
/** /**
* A simple benchmark to performance test LobFile reader/writer speed. * A simple benchmark to performance test LobFile reader/writer speed.


@ -20,7 +20,7 @@
import java.util.*; import java.util.*;
import org.apache.hadoop.fs.*; import org.apache.hadoop.fs.*;
import org.apache.hadoop.conf.*; import org.apache.hadoop.conf.*;
import org.apache.hadoop.sqoop.io.*; import com.cloudera.sqoop.io.*;
/** /**
* Stress test LobFiles by writing a bunch of different files and reading * Stress test LobFiles by writing a bunch of different files and reading


@ -27,7 +27,7 @@
buildroot=$1 buildroot=$1
version=$2 version=$2
outputdir=${buildroot}/src/org/apache/hadoop/sqoop outputdir=${buildroot}/src/com/cloudera/sqoop
outputfile=${outputdir}/SqoopVersion.java outputfile=${outputdir}/SqoopVersion.java
signature=`git log -1 --pretty=format:%H` signature=`git log -1 --pretty=format:%H`
@ -37,7 +37,7 @@ compiledate=`date`
mkdir -p ${outputdir} mkdir -p ${outputdir}
cat > ${outputfile} <<EOF cat > ${outputfile} <<EOF
// generated by src/scripts/write-version-info.sh // generated by src/scripts/write-version-info.sh
package org.apache.hadoop.sqoop; package com.cloudera.sqoop;
public final class SqoopVersion { public final class SqoopVersion {
public SqoopVersion() { public SqoopVersion() {
} }
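The heredoc above emits a generated SqoopVersion.java; the hunk is cut off here, but the shell variables captured earlier (version, signature, compiledate) are presumably interpolated into the class as constants. A hedged reconstruction of the generated file — the field names below are assumptions, not copied from the commit:

// generated by src/scripts/write-version-info.sh (sketch only)
package com.cloudera.sqoop;

public final class SqoopVersion {
  public SqoopVersion() {
  }
  // At build time these would carry ${version}, ${signature} and
  // ${compiledate}; the literals here are placeholders.
  public static final String VERSION = "1.0.0-SNAPSHOT";
  public static final String GIT_HASH = "0000000000000000000000000000000000000000";
  public static final String COMPILE_DATE = "unknown";
}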


@ -15,7 +15,7 @@
* See the License for the specific language governing permissions and * See the License for the specific language governing permissions and
* limitations under the License. * limitations under the License.
*/ */
package org.apache.hadoop.sqoop.shims; package com.cloudera.sqoop.shims;
import java.io.IOException; import java.io.IOException;


@ -15,7 +15,7 @@
* See the License for the specific language governing permissions and * See the License for the specific language governing permissions and
* limitations under the License. * limitations under the License.
*/ */
package org.apache.hadoop.sqoop.shims; package com.cloudera.sqoop.shims;
import java.io.IOException; import java.io.IOException;


@ -16,7 +16,7 @@
* limitations under the License. * limitations under the License.
*/ */
package org.apache.hadoop.sqoop.mapreduce; package com.cloudera.sqoop.mapreduce;
import java.io.IOException; import java.io.IOException;


@ -16,7 +16,7 @@
* limitations under the License. * limitations under the License.
*/ */
package org.apache.hadoop.sqoop.mapreduce; package com.cloudera.sqoop.mapreduce;
import java.io.IOException; import java.io.IOException;
import java.util.List; import java.util.List;

Some files were not shown because too many files have changed in this diff.