Apache Sqoop Cookbook: Unlocking Hadoop for Your Relational Database

    by Kathleen Ting, Jarek Jarcec Cecho

    Paperback

    $14.99 

    Overview

    Integrating data from multiple sources is essential in the age of big data, but it can be a challenging and time-consuming task. This handy cookbook provides dozens of ready-to-use recipes for using Apache Sqoop, the command-line interface application that optimizes data transfers between relational databases and Hadoop.

    Sqoop is both powerful and bewildering, but with this cookbook's problem-solution-discussion format, you'll quickly learn how to deploy and then apply Sqoop in your environment. The authors provide MySQL, Oracle, and PostgreSQL database examples on GitHub that you can easily adapt for SQL Server, Netezza, Teradata, or other relational systems.

  • Transfer data from a single database table into your Hadoop ecosystem (see the command sketch following this list)
  • Keep table data and Hadoop in sync by importing data incrementally
  • Import data from more than one database table
  • Customize transferred data by calling various database functions
  • Export generated, processed, or backed-up data from Hadoop to your database
  • Run Sqoop within Oozie, Hadoop's specialized workflow scheduler
  • Load data into Hadoop's data warehouse (Hive) or database (HBase)
  • Handle installation, connection, and syntax issues common to specific database vendors
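
    To give a flavor of the recipes, here is a minimal command-line sketch of the three core operations; the database name (shop), tables, credentials, and HDFS paths are hypothetical placeholders, not examples taken from the book:

        # Import an entire table from MySQL into HDFS
        sqoop import \
          --connect jdbc:mysql://localhost/shop \
          --username sqoop --password sqoop \
          --table cities \
          --target-dir /data/cities

        # Import only rows added since the last run (incremental append)
        sqoop import \
          --connect jdbc:mysql://localhost/shop \
          --username sqoop --password sqoop \
          --table visits \
          --incremental append \
          --check-column id \
          --last-value 0

        # Export processed results from HDFS back into a database table
        sqoop export \
          --connect jdbc:mysql://localhost/shop \
          --username sqoop --password sqoop \
          --table city_stats \
          --export-dir /data/city_stats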

    Product Details

    ISBN-13: 9781449364625
    Publisher: O'Reilly Media, Incorporated
    Publication date: 07/15/2013
    Pages: 94
    Product dimensions: 7.13(w) x 9.25(h) x 0.24(d)

    About the Author

    Kathleen Ting is currently a Customer Operations Engineering Manager at Cloudera where she helps customers deploy and use the Hadoop ecosystem in production. She has spoken on Hadoop, ZooKeeper, and Sqoop at many Big Data conferences including Hadoop World, ApacheCon, and OSCON. She's contributed to several projects in the open source community and is a Committer and PMC Member on Sqoop.

    Jarek Jarcec Cecho is currently a Software Engineer at Cloudera where he develops software to help customers better access and integrate with the Hadoop ecosystem. He has led the Sqoop community in the architecture of the next generation of Sqoop, known as Sqoop 2. He's contributed to several projects in the open source community and is a Committer and PMC Member on Sqoop, Flume, and MRUnit.

    Table of Contents

    Foreword
    Preface (Sqoop 2; Conventions Used in This Book; Using Code Examples; Safari® Books Online; How to Contact Us; Acknowledgments)
    Chapter 1, Getting Started: 1.1 Downloading and Installing Sqoop; 1.2 Installing JDBC Drivers; 1.3 Installing Specialized Connectors; 1.4 Starting Sqoop; 1.5 Getting Help with Sqoop
    Chapter 2, Importing Data: 2.1 Transferring an Entire Table; 2.2 Specifying a Target Directory; 2.3 Importing Only a Subset of Data; 2.4 Protecting Your Password; 2.5 Using a File Format Other Than CSV; 2.6 Compressing Imported Data; 2.7 Speeding Up Transfers; 2.8 Overriding Type Mapping; 2.9 Controlling Parallelism; 2.10 Encoding NULL Values; 2.11 Importing All Your Tables
    Chapter 3, Incremental Import: 3.1 Importing Only New Data; 3.2 Incrementally Importing Mutable Data; 3.3 Preserving the Last Imported Value; 3.4 Storing Passwords in the Metastore; 3.5 Overriding the Arguments to a Saved Job; 3.6 Sharing the Metastore Between Sqoop Clients
    Chapter 4, Free-Form Query Import: 4.1 Importing Data from Two Tables; 4.2 Using Custom Boundary Queries; 4.3 Renaming Sqoop Job Instances; 4.4 Importing Queries with Duplicated Columns
    Chapter 5, Export: 5.1 Transferring Data from Hadoop; 5.2 Inserting Data in Batches; 5.3 Exporting with All-or-Nothing Semantics; 5.4 Updating an Existing Data Set; 5.5 Updating or Inserting at the Same Time; 5.6 Using Stored Procedures; 5.7 Exporting into a Subset of Columns; 5.8 Encoding the NULL Value Differently; 5.9 Exporting Corrupted Data
    Chapter 6, Hadoop Ecosystem Integration: 6.1 Scheduling Sqoop Jobs with Oozie; 6.2 Specifying Commands in Oozie; 6.3 Using Property Parameters in Oozie; 6.4 Installing JDBC Drivers in Oozie; 6.5 Importing Data Directly into Hive; 6.6 Using Partitioned Hive Tables; 6.7 Replacing Special Delimiters During Hive Import; 6.8 Using the Correct NULL String in Hive; 6.9 Importing Data into HBase; 6.10 Importing All Rows into HBase; 6.11 Improving Performance When Importing into HBase
    Chapter 7, Specialized Connectors: 7.1 Overriding Imported boolean Values in PostgreSQL Direct Import; 7.2 Importing a Table Stored in Custom Schema in PostgreSQL; 7.3 Exporting into PostgreSQL Using pg_bulkload; 7.4 Connecting to MySQL; 7.5 Using Direct MySQL Import into Hive; 7.6 Using the upsert Feature When Exporting into MySQL; 7.7 Importing from Oracle; 7.8 Using Synonyms in Oracle; 7.9 Faster Transfers with Oracle; 7.10 Importing into Avro with OraOop; 7.11 Choosing the Proper Connector for Oracle; 7.12 Exporting into Teradata; 7.13 Using the Cloudera Teradata Connector; 7.14 Using Long Column Names in Teradata
    Colophon
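
    The Chapter 3 recipes center on saved jobs kept in the Sqoop metastore, which let incremental imports resume where the previous run left off. A minimal sketch under the same hypothetical names (-P prompts for the password interactively; recipe 3.4 covers storing it in the metastore instead):

        # Save an incremental import as a reusable job in the Sqoop metastore
        sqoop job --create visits-import -- \
          import \
          --connect jdbc:mysql://localhost/shop \
          --username sqoop -P \
          --table visits \
          --incremental append \
          --check-column id \
          --last-value 0

        # Each execution resumes from the stored last value, which Sqoop
        # updates automatically after a successful run
        sqoop job --exec visits-import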