This repository has been archived by the owner on May 27, 2020. It is now read-only.

Commit 22ea302: remove duplicate table
pmadrigal committed Feb 3, 2016
2 parents 8ba860a + 058e5cd commit 22ea302
Showing 5 changed files with 30 additions and 15 deletions.
11 changes: 1 addition & 10 deletions README.md
@@ -5,16 +5,6 @@ from/into MongoDB collections.

If you are using this Data Source, feel free to briefly share your experience by opening a Pull Request against this [file](https://github.com/Stratio/spark-mongodb/blob/master/doc/src/site/sphinx/PoweredBy.rst).

-## Latest compatible versions##
-
-| spark-MongoDB | Apache Spark | MongoDB |
-| ------------- | ------------ | ------- |
-| 0.10.x        | 1.5.x        | 3.0.x   |
-| 0.8.2 - 0.9.2 | 1.4.0        | 3.0.x   |
-| 0.8.1         | 1.3.0        | 3.0.x   |
-| 0.8.0         | 1.2.1        | 3.0.x   |
-
-
## Requirements ##

This library requires Apache Spark, Scala 2.10 or Scala 2.11, and Casbah 2.8.X
@@ -38,6 +28,7 @@ There also exists a [First Steps](<https://github.com/Stratio/spark-mongodb/blob/master/doc/src/site/sphinx/First_Steps.rst>) guide:
- [Examples](https://github.com/Stratio/spark-mongodb/blob/master/doc/src/site/sphinx/First_Steps.rst#examples)
- [Scala API](https://github.com/Stratio/spark-mongodb/blob/master/doc/src/site/sphinx/First_Steps.rst#scala-api)
- [Python API](https://github.com/Stratio/spark-mongodb/blob/master/doc/src/site/sphinx/First_Steps.rst#python-api)
+- [Java API](https://github.com/Stratio/spark-mongodb/blob/master/doc/src/site/sphinx/First_Steps.rst#java-api)
- [R API](https://github.com/Stratio/spark-mongodb/blob/master/doc/src/site/sphinx/First_Steps.rst#r-api)
- [Faqs](https://github.com/Stratio/spark-mongodb/blob/master/doc/src/site/sphinx/faqs.rst)

2 changes: 1 addition & 1 deletion doc/pom.xml
@@ -26,7 +26,7 @@
<parent>
<groupId>com.stratio.datasource</groupId>
<artifactId>spark-mongodb-parent</artifactId>
-<version>0.11.0-SNAPSHOT</version>
+<version>0.11.0-SNAPSHOT</version>
</parent>
<properties>
<jacoco.skip>true</jacoco.skip>
23 changes: 23 additions & 0 deletions doc/src/site/sphinx/First_Steps.rst
@@ -243,6 +243,29 @@ Then:
sqlContext.sql("CREATE TEMPORARY TABLE students_table USING com.stratio.datasource.mongodb OPTIONS (host 'host:port', database 'highschool', collection 'students')")
sqlContext.sql("SELECT * FROM students_table").collect()

Java API
--------

You need to add the spark-mongodb and spark-sql dependencies to the Java project; a hedged ``pom.xml`` sketch follows the example below.
::

import java.util.HashMap;
import java.util.Map;

import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.sql.DataFrame;
import org.apache.spark.sql.SQLContext;

public class SparkMongodbJavaExample {

    public static void main(String[] args) {

        // Local Spark context and SQL context for the example.
        JavaSparkContext sc = new JavaSparkContext("local[2]", "test spark-mongodb java");
        SQLContext sqlContext = new SQLContext(sc);

        // Connection options for the MongoDB data source.
        Map<String, String> options = new HashMap<>();
        options.put("host", "localhost:27017");
        options.put("database", "highschoolCredentials");
        options.put("collection", "students");
        options.put("credentials", "user,highschoolCredentials,password");

        // Load the collection as a DataFrame and register it for SQL queries.
        DataFrame df = sqlContext.read().format("com.stratio.datasource.mongodb").options(options).load();
        df.registerTempTable("students");
        sqlContext.sql("SELECT * FROM students");
        df.show();
    }
}
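
A minimal ``pom.xml`` sketch for the dependency note above. Only the ``com.stratio.datasource`` groupId and the ``0.11.0-SNAPSHOT`` version appear elsewhere in this commit (in ``doc/pom.xml``); the artifactIds and the Spark version below are assumptions to be checked against the published artifacts::

    <!-- Sketch only: artifactIds and versions are assumed, not taken from this commit. -->
    <dependencies>
      <dependency>
        <groupId>com.stratio.datasource</groupId>   <!-- groupId as in doc/pom.xml -->
        <artifactId>spark-mongodb</artifactId>      <!-- assumed artifactId -->
        <version>0.11.0-SNAPSHOT</version>          <!-- version from doc/pom.xml -->
      </dependency>
      <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_2.10</artifactId>     <!-- match your Scala binary version -->
        <version>1.5.2</version>                    <!-- a Spark 1.5.x release, per the compatibility table -->
      </dependency>
    </dependencies>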

R API
-----
MongodbRelation.scala
@@ -133,9 +133,10 @@ object MongodbRelation {
* @param requiredColumnsWithIndex Required fields in statement including index within field for random accesses.
* @return A new pruned schema
*/
-  private[this] def pruneSchema(
-    schema: StructType,
-    requiredColumnsWithIndex: Array[(String, Option[Int])]): StructType = {
+  def pruneSchema(
+    schema: StructType,
+    requiredColumnsWithIndex: Array[(String, Option[Int])]): StructType = {
+
val name2sfield: Map[String, StructField] = schema.fields.map(f => f.name -> f).toMap
StructType(
requiredColumnsWithIndex.flatMap {
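
The change above drops `private[this]` so that `pruneSchema` can be called from the test suite below. As a reading aid, here is a self-contained Scala sketch of the pruning idea visible in the diff (map field names to `StructField`s, then keep only the requested columns); the simplified `Array[String]` signature and object name are illustrative, not the library's exact API:

```scala
import org.apache.spark.sql.types._

object PruneSchemaSketch {

  // Simplified take on the pruning shown above: keep only the required
  // columns, in the requested order; unknown names are silently dropped.
  def pruneSchema(schema: StructType, requiredColumns: Array[String]): StructType = {
    val name2sfield: Map[String, StructField] = schema.fields.map(f => f.name -> f).toMap
    StructType(requiredColumns.flatMap(name2sfield.get))
  }

  def main(args: Array[String]): Unit = {
    val schema = StructType(Array(
      StructField("name", StringType),
      StructField("age", IntegerType),
      StructField("enrolled", BooleanType)))

    println(pruneSchema(schema, Array("age", "fakeAtt")))
    // -> StructType(StructField(age,IntegerType,true))
  }
}
```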
@@ -88,7 +88,7 @@ with BeforeAndAfterAll {

it should "prune schema to adapt it to required columns" + scalaBinaryVersion in {

-    MongodbRelation.pruneSchema(schema,Array()) should equal(
+    MongodbRelation.pruneSchema(schema,Array[String]()) should equal(
new StructType(Array()))

MongodbRelation.pruneSchema(schema,Array("fakeAtt")) should equal(
