
SQL, NoSQL, and PySparkSQL


Abstract

In this chapter, we look into various Spark SQL recipes that come in handy when you have to apply SQL-like queries to data. One of the specialties of Apache Spark is that it lets the user apply data-wrangling methods both programmatically and as ANSI SQL-like queries. For readers from pure SQL backgrounds with little exposure to programmatic data manipulation, these SQL queries are a one-stop shop. Almost everything the Spark programmatic APIs can do to data can also be done with Spark SQL. Because these queries follow the ANSI SQL standard, anyone currently working with SQL technologies can readily start working on Apache Spark-based Big Data projects.
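
As a minimal sketch of the idea described above (not code from the book; the sample data, column names, and application name are hypothetical), the following registers a DataFrame as a temporary view and queries it with an ANSI-style SQL statement:

# Minimal sketch: querying a DataFrame with Spark SQL.
# Sample data and column names are hypothetical, for illustration only.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("pyspark-sql-sketch").getOrCreate()

# Build a small DataFrame from in-memory rows.
employees = spark.createDataFrame(
    [("Alice", "Engineering", 85000),
     ("Bob", "Engineering", 72000),
     ("Carol", "Marketing", 65000)],
    ["name", "department", "salary"],
)

# Register the DataFrame as a temporary view so it can be queried with SQL.
employees.createOrReplaceTempView("employees")

# Express an aggregation as an ANSI SQL-like query instead of DataFrame calls.
result = spark.sql("""
    SELECT department, AVG(salary) AS avg_salary
    FROM employees
    GROUP BY department
    ORDER BY avg_salary DESC
""")
result.show()

The same result could be produced with the programmatic DataFrame API (for example, groupBy and avg), which is the equivalence between the two styles that the abstract refers to.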



Copyright information

© 2019 Raju Kumar Mishra and Sundar Rajan Raman

About this chapter


Cite this chapter

Mishra, R.K., Raman, S.R. (2019). SQL, NoSQL, and PySparkSQL. In: PySpark SQL Recipes. Apress, Berkeley, CA. https://doi.org/10.1007/978-1-4842-4335-0_6
