Closed
Commits
306 commits
401a64c
Fixed CreateTable testcase problem and updated RowKeyParser
sboeschhuawei Oct 2, 2014
7106713
add type conversion function
Oct 2, 2014
93af29f
Removed LogicalPlan and SchemaRDD from PhysicalPlans
sboeschhuawei Oct 2, 2014
5e792ad
Working through HBase Snappy issues and HBaseSQLParser resolution issue
sboeschhuawei Oct 3, 2014
a9c22ff
Add content to test
Oct 3, 2014
4321d7e
Additional work on partitioning
sboeschhuawei Oct 3, 2014
bb8dc68
Removed LogicalPlan and SchemaRDD from PhysicalPlans
sboeschhuawei Oct 3, 2014
b46140a
Modify the workflow of InsertIntoTable
Oct 3, 2014
093e164
Incremental query testing
sboeschhuawei Oct 4, 2014
de26141
Working through issues with Catalog integration
sboeschhuawei Oct 5, 2014
33fe712
Fixed Catalog bugs: namespace mixup (partial fix), RowKey in wrong or…
sboeschhuawei Oct 6, 2014
b768186
Disabled pred pushdown and able to reach ReaderRDD
sboeschhuawei Oct 6, 2014
d54fa22
Change the syntax of CreateTable
Oct 6, 2014
6871100
fix the namespace issues
Oct 6, 2014
ff9714e
add delete table function
Oct 7, 2014
96d0290
Add Drop
Oct 7, 2014
6d58edc
Fixed conn issues in HBaseSQLReaderRDD
sboeschhuawei Oct 9, 2014
d8444f7
Fixed conn issues in HBaseSQLReaderRDD
sboeschhuawei Oct 9, 2014
e959502
use catalyst data type instead of hbase data type
Oct 9, 2014
6b60109
remove hbase data type
Oct 9, 2014
a5fd662
Change the input to catalyst datatype
Oct 9, 2014
aee3401
RowKey and HBaseSQLReaderRDD fixes
sboeschhuawei Oct 9, 2014
bbec48e
fix the code style issue
Oct 10, 2014
c89bf27
Add verification to Hbase CreateTable
Oct 10, 2014
ecdfcb4
Basic query working
sboeschhuawei Oct 13, 2014
a6cbd95
Fixed conn issues in HBaseSQLReaderRDD
sboeschhuawei Oct 13, 2014
4913a8e
add logical table exist check
Oct 13, 2014
37b4387
fix the data type issue
Oct 13, 2014
af30223
Fixed RowKeyParser write path - used by InsertIntoTable
sboeschhuawei Oct 13, 2014
57bf401
Modify the verification and add HBaseAnalyzer for future development
Oct 13, 2014
eadc2b5
Ignore integration tests requiring external hbase access
sboeschhuawei Oct 13, 2014
407e97d
create hbase table required for testing
Oct 13, 2014
d8acba2
add more data type tests
Oct 14, 2014
4d82fe1
code formatting
Oct 14, 2014
59a4414
Fixed select * path but order is incorrect
sboeschhuawei Oct 15, 2014
45f799c
Fixed conn issues in HBaseSQLReaderRDD
sboeschhuawei Oct 15, 2014
823b91c
Small test tweaks for preds
sboeschhuawei Oct 15, 2014
f3afe35
Refactored according to Yan's designs
sboeschhuawei Oct 18, 2014
5d4df1a
Refactored according to Yan's designs
sboeschhuawei Oct 18, 2014
5dae756
Removed unused/unnecessary classes and code
sboeschhuawei Oct 18, 2014
c120105
mv hbase files to the new old dir
Oct 20, 2014
7f2f032
code format and minor fix
scwf Oct 20, 2014
04385a7
update with apache master and fix conflict
scwf Oct 20, 2014
ad798d8
Simplified version of HBaseSQLContext
Oct 20, 2014
3cd96cf
scala style fix
scwf Oct 20, 2014
a10f270
revert some diffs with apache master
scwf Oct 21, 2014
f2916ee
Merge pull request #2 from scwf/hbase
scwf Oct 21, 2014
ecf84d7
Delete HBaseAnalyzer
Oct 20, 2014
0fe8f02
Change the syntax of InsertIntoTable
Oct 20, 2014
e0d1621
Modify the workflow of CreateTable and DropTable
Oct 21, 2014
dbb16bb
Make it compatible with updated Apache Spark Code
Oct 21, 2014
cfdd604
add catalog file
Oct 21, 2014
b59f28e
add to cache
Oct 21, 2014
643da9c
Fix some compilation errors
Oct 21, 2014
238f2b3
Move Parser files to original dir
Oct 21, 2014
89d918e
add delete table
Oct 21, 2014
8a7d47f
adding HBaseOperators and HBaseStrategies
Oct 21, 2014
03f01f3
fix a compilation error
Oct 22, 2014
efeaa9c
addition of package.scala
Oct 22, 2014
2a8eefc
add skeleton of HBaseRelation
Oct 22, 2014
6f53486
fix the test cases and delete table
Oct 22, 2014
2550b49
Clean up the HBaseRelation
Oct 22, 2014
0f75a20
Merge HBaseCatalogTable into HBaseRelation
Oct 23, 2014
6f5c2d0
Fix the compilation errors
Oct 23, 2014
91eccd5
change create table method
Oct 23, 2014
d056c16
Recover the CreateTable workflow
Oct 23, 2014
89ab616
fix the issue based on tests
Oct 24, 2014
c9f1547
persist metadata using object serialization
Oct 24, 2014
61838fd
Change the input order
Oct 24, 2014
2e1b9c9
refactor the Query processing
Oct 25, 2014
af4b958
merged from the remote branch
Oct 25, 2014
9dfb009
remove the old dir
Oct 25, 2014
2cdb1bf
remove unnecessary tolowercase
Oct 25, 2014
a4a810c
new package execution and logical
scwf Oct 27, 2014
1d10ddd
minor fix
scwf Oct 27, 2014
c49e64f
use string formatter
Oct 27, 2014
1302fdb
Fix the issue of getting HBaseRelation attributes
Oct 27, 2014
aa02d90
Merge pull request #5 from scwf/hbase
scwf Oct 27, 2014
c36082f
add row key support
Oct 27, 2014
108123d
file rename according to sparksql convention
Oct 28, 2014
ccf6056
Rename HBaseCommands.scala to hbaseCommands.scala
Oct 28, 2014
ce5cb59
Rename HBaseOperators.scala to hbaseOperators.scala
Oct 28, 2014
949ef99
optimize import
Oct 28, 2014
0c346f3
use arraybuffer to improve performance
Oct 28, 2014
3ebd032
use allColumns as single parameter in the create method
Oct 28, 2014
9ac6578
Change the input parameter of CreateTable to catalog
Oct 28, 2014
944d88d
add close() to the function
Oct 29, 2014
57e3036
simplify HBaseRelation constructor
Oct 29, 2014
8295e54
Merge branch 'hbase' of https://github.com/Huawei-Spark/spark into hbase
Oct 29, 2014
b25addf
fix the test case issues after removing the parameters in the create …
Oct 29, 2014
d2e6e19
comment on HBaseSQLContext
Oct 29, 2014
202ffa5
Add the testcases to test RowKeyParser part and Add the test to inser…
Oct 29, 2014
4affc61
Removal of TestData.scala
Oct 29, 2014
3364342
add "val" to the class definition
Oct 29, 2014
2409ba7
Fix the bugs in doing Select
Oct 29, 2014
6bcfe0e
Simplify HBaseRelation ctor
Oct 30, 2014
59552df
Merge branch 'hbase' of https://github.com/Huawei-Spark/spark into hw…
Oct 30, 2014
46e2df1
refactory
Oct 30, 2014
34d548c
Add testcases for Select and temporarily change some HBaseCatalog int…
Oct 30, 2014
98ec286
fix the compilation errors
Oct 31, 2014
d083e27
fix a NPE when a table does not exist
Oct 31, 2014
3698247
Merge branch 'hbase' of https://github.com/Huawei-Spark/spark into hbase
Oct 31, 2014
ae2d31a
add INT keyword
Nov 3, 2014
567f494
Merge pull request #10 from jackylk/hbase-int
Nov 3, 2014
0f9ccb6
add bytes implementation
Nov 4, 2014
e09b52c
Merge branch 'hbase' of https://github.com/Huawei-Spark/spark into t1
Nov 4, 2014
34533b8
Merge pull request #7 from jackylk/rename
Nov 4, 2014
ba39b6c
initial support bulk loading
scwf Nov 4, 2014
360f483
add interfaces for altering table
Nov 4, 2014
95e6520
reordering keywords according to community convention
Nov 4, 2014
3c10e53
rename SparkImmutableBytesWritable
scwf Nov 4, 2014
e99841b
[WIP] support for creating hbase user table in CREATE TABLE
Nov 4, 2014
efc9348
fix line exceed issue
Nov 4, 2014
386b600
add alter table support
Nov 4, 2014
d512f8c
add alter table support
Nov 4, 2014
2993d80
add hbase profile to Pom, now we can use sbt/sbt -Phbase assembly
scwf Nov 4, 2014
ef05b51
rename the file
Nov 4, 2014
0963d20
add test case for altering table
Nov 4, 2014
119f187
Add InsertIntoTable Support
Nov 4, 2014
153c123
Add one more Bytes conversion
Nov 5, 2014
0d41c67
Add Alter ADD and Alter Drop
Nov 5, 2014
5d95863
Merge pull request #11 from jackylk/reorder
Nov 5, 2014
5dd02ec
fix line length issue
Nov 5, 2014
e5ad775
adding FIELDS TERMINATED BY clause in bulkload
Nov 5, 2014
39ab8e3
Merge pull request #13 from jackylk/create-htable
Nov 6, 2014
53db574
new implementation of bytes utility
Nov 6, 2014
18c96f0
update the test cases
Nov 6, 2014
0306443
reformatting
Nov 6, 2014
adecb45
Merge pull request #16 from jackylk/reformat
Nov 6, 2014
1651d41
change short implementation
Nov 6, 2014
1a3a39d
update test cases
Nov 6, 2014
7a337da
provide double/long implementation
Nov 6, 2014
65d1d91
Add buildFilter
Nov 8, 2014
a0c9299
Fix the issue in Byte conversion
Nov 8, 2014
6f5f4e3
make string2KV more general to use
scwf Nov 10, 2014
f4ec6a4
use new BytesUtil instead
scwf Nov 10, 2014
d712d9d
adding HBaseShuffledRDD
scwf Nov 10, 2014
ff5e5a4
remove unused variable
Nov 10, 2014
5f5ba9b
Merge pull request #20 from jackylk/remove-val
Nov 10, 2014
88955ed
refactory for helper
Nov 10, 2014
efaffa8
draft for optimized bulk loading
scwf Nov 10, 2014
8f99fc6
change according to comment
Nov 10, 2014
048597f
Use of primary key in DDL
Nov 10, 2014
d186701
partial evaluation for partition pruning
Nov 10, 2014
3d2d345
partial evaluation for partition pruning
Nov 10, 2014
204ca57
Remove unnecessary change
Nov 11, 2014
f5b0bb8
Merge pull request #21 from jackylk/refactory-helper
Nov 11, 2014
3189c53
hot fix for compile
scwf Nov 11, 2014
cf63dd2
fix style
scwf Nov 11, 2014
61cebfc
use buffer to speed up encode/decode
Nov 11, 2014
7bfc1a0
resolving conflict
Nov 11, 2014
79a3add
Update the pom
Nov 11, 2014
e9f9d9d
Format the pom file
Nov 11, 2014
d3df256
Merge branch 'hbase' of https://github.com/Huawei-Spark/spark into te…
Nov 11, 2014
56ec13e
modify plan execution
Nov 11, 2014
0b4fbd4
bulk load clean
scwf Nov 11, 2014
0a88e09
adding FIELDS TERMINATED BY clause in bulkload
scwf Nov 11, 2014
cbe8d48
add FIELDS TERMINATED BY to physical plan
scwf Nov 11, 2014
b13efbf
Add HBase CLI
sboeschhuawei Nov 12, 2014
fe96685
Add first version of HBaseIntegrationTestBase
sboeschhuawei Nov 12, 2014
a2d2ad9
Add first version of HBaseIntegrationTestBase (2)
sboeschhuawei Nov 12, 2014
0e0540f
formatting
Nov 12, 2014
8c5187c
use listbuffer instead of arraybuffer
Nov 13, 2014
c474832
fix test for bulk loading
scwf Nov 13, 2014
31ca595
fix list buffer compile error
scwf Nov 13, 2014
783b19d
clean and add a play test for minicluster
scwf Nov 13, 2014
d283e51
add license title
scwf Nov 13, 2014
94b1b7a
fix test: bulk loading test ok locally
scwf Nov 13, 2014
3309aaa
clean debug code
scwf Nov 13, 2014
fec1a69
fix hbasepartitioner bug
scwf Nov 13, 2014
895d6df
range comparison update
Nov 13, 2014
7190353
range comparison update
Nov 13, 2014
83bcea0
refactory bulk load logical plan class name
Nov 14, 2014
831b0be
add comments for range comparison
Nov 14, 2014
8f9435b
replace the default partitioning with range partitioning
Nov 14, 2014
e9186bf
Modify the PartialPredEval and Add testcases
Nov 15, 2014
0163ee4
basic CLI support
Nov 16, 2014
92bbb99
partial implement
Nov 16, 2014
47b5d6f
add todo
Nov 17, 2014
0b6aa6b
Fix the bugs in PartialEval
Nov 17, 2014
3fd88d3
Merge branch 'hbase' of https://github.com/Huawei-Spark/spark into sh…
Nov 17, 2014
88d988a
Add AND, OR and Not in PartialEval and Modify tryCompare
Nov 18, 2014
df19391
basic CLI support
Nov 18, 2014
19750db
Add Equal case to tryCompare and Change to use the correct equiv oper…
Nov 18, 2014
295f144
Small change on test case
Nov 18, 2014
e6dbd8a
fix the deprecated issue
Nov 18, 2014
18844c9
add basic CLI support
Nov 19, 2014
d12c8ab
fix bug in CREATE TABLE using int data type
Nov 19, 2014
cebc0f7
add support for SHOW TABLES
Nov 19, 2014
1235d6e
solve conflict
Nov 19, 2014
1fdbf7c
add support for INSERT VALUES
Nov 19, 2014
b09ee41
solve conflict
Nov 19, 2014
0289fe0
fix the style check errors
Nov 19, 2014
71b342e
initial code for points finder
Nov 19, 2014
a3f2fd5
code cleanup of HBaseRelation
Nov 20, 2014
7cc1b53
fix build failure
scwf Nov 20, 2014
76cf17d
optimize the imports
Nov 20, 2014
39323e5
create HBaseAdmin instance just once
Nov 21, 2014
07bbec1
Interim testcase work
sboeschhuawei Nov 19, 2014
af9194a
Working version of HBaseIntegrationTestBase
sboeschhuawei Nov 21, 2014
e66c809
Working version of HBaseIntegrationTestBase
sboeschhuawei Nov 22, 2014
5214bfe
add support for DESCRIBE
Nov 22, 2014
5dad6ee
let Analyzer resolve the relation in INSERT VALUES
Nov 22, 2014
37c9476
add completion support and help support in CLI
Nov 23, 2014
a017251
update with apache master
scwf Nov 24, 2014
35bd58e
fix hbase pom
scwf Nov 24, 2014
2f969dc
https -> http in pom so that locally agent work
scwf Nov 24, 2014
678e4ff
fix compile error
scwf Nov 24, 2014
84656bd
revert some no need change
scwf Nov 24, 2014
a532044
update with hbase
scwf Nov 24, 2014
d423718
fix scala style
scwf Nov 24, 2014
e4cdcc3
draft
scwf Nov 24, 2014
8eb8cdc
draft for support user defined schema
scwf Nov 24, 2014
b3d35c7
fix comment
scwf Nov 24, 2014
c203ce2
adding test case
scwf Nov 24, 2014
5378c5a
fix the style error
Nov 24, 2014
f21d6ce
Merge branch 'ddl' into hbase
scwf Nov 25, 2014
824f953
adding HBaseRelation2
scwf Nov 25, 2014
647c2bc
use ListBuffer in string2KV
Nov 25, 2014
987095a
refactory to use ListBuffer in string2KV
Nov 25, 2014
a78d573
recover the files from 'git push --force'
Nov 25, 2014
daa49aa
Revert "recover the files from 'git push --force'"
Nov 25, 2014
565ffb9
predicate pushdown: partial reduction, filter predicate classificatio…
Nov 26, 2014
069314a
predicate pushdown: partial reduction, filter predicate classificatio…
Nov 26, 2014
3c86a48
should have used partition-specific filter predicate
Nov 26, 2014
d46a517
temporarily revert a predicate pushdown related change
Nov 26, 2014
f96ebbc
fix the potential errors in the codes
Nov 26, 2014
a1f4acb
construct filter list (initial implementation)
Nov 26, 2014
65372d7
Critical Point Impl
Nov 27, 2014
dc9b574
Merge branch 'hbase' of https://github.com/Huawei-Spark/spark into hbase
Nov 27, 2014
3107e2f
handle non-key column properly using SingleColumnValueFilter
Nov 27, 2014
f79665d
Added CreateTableAndLoadData trait
sboeschhuawei Nov 27, 2014
c3b26d5
hot fix for test, not best solution
scwf Nov 27, 2014
b224a3a
Added workaround for Rowkey order bug and automated row data types/va…
sboeschhuawei Nov 27, 2014
26cad54
CriticalPointFinder enhancement 1
Nov 27, 2014
baa34b3
Merge branch 'hbase' of https://github.com/Huawei-Spark/spark into hbase
Nov 27, 2014
fc482ab
Added filter, order-by, and aggregate queries: created QueriesSuiteBa…
sboeschhuawei Nov 28, 2014
1749e9e
adding HBaseScanBuilder
scwf Nov 29, 2014
d7b6ae0
fix createRelation
scwf Nov 29, 2014
d54d466
adding source test
scwf Nov 29, 2014
33a1bb8
update
scwf Nov 29, 2014
ec8e196
delete no use test files
scwf Nov 30, 2014
f958b22
adding test case
scwf Nov 30, 2014
dfb7309
add license head
scwf Dec 1, 2014
f0a2135
fix DecimalType bug
scwf Dec 1, 2014
dcf8fb5
some comment
scwf Dec 1, 2014
675de86
merge and fix conflict
scwf Dec 3, 2014
3835e5a
revert change by debug
scwf Dec 4, 2014
aa0522d
refactoring code structure
scwf Dec 4, 2014
3d4548e
revert pom
scwf Dec 4, 2014
10 changes: 10 additions & 0 deletions assembly/pom.xml
@@ -209,6 +209,16 @@
</dependency>
</dependencies>
</profile>
<profile>
<id>hbase</id>
<dependencies>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-hbase_${scala.binary.version}</artifactId>
<version>${project.version}</version>
</dependency>
</dependencies>
</profile>
<profile>
Owner Author commented: To remove this.
<id>spark-ganglia-lgpl</id>
<dependencies>
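The new `hbase` profile above gates the `spark-hbase` dependency in the assembly. A later commit message in this PR ("add hbase profile to Pom, now we can use sbt/sbt -Phbase assembly") indicates how it is activated; the Maven equivalent below is an assumption, not stated in the PR:

```shell
# Activate the hbase profile so spark-hbase is bundled into the assembly jar.
sbt/sbt -Phbase assembly
# Hypothetical Maven equivalent (goal and flags are an assumption):
mvn -Phbase -DskipTests package
```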
2 changes: 2 additions & 0 deletions bin/compute-classpath.cmd
@@ -81,6 +81,7 @@ set SPARK_CLASSES=%SPARK_CLASSES%;%FWDIR%tools\target\scala-%SCALA_VERSION%\clas
set SPARK_CLASSES=%SPARK_CLASSES%;%FWDIR%sql\catalyst\target\scala-%SCALA_VERSION%\classes
set SPARK_CLASSES=%SPARK_CLASSES%;%FWDIR%sql\core\target\scala-%SCALA_VERSION%\classes
set SPARK_CLASSES=%SPARK_CLASSES%;%FWDIR%sql\hive\target\scala-%SCALA_VERSION%\classes
set SPARK_CLASSES=%SPARK_CLASSES%;%FWDIR%sql\hbase\target\scala-%SCALA_VERSION%\classes

set SPARK_TEST_CLASSES=%FWDIR%core\target\scala-%SCALA_VERSION%\test-classes
set SPARK_TEST_CLASSES=%SPARK_TEST_CLASSES%;%FWDIR%repl\target\scala-%SCALA_VERSION%\test-classes
@@ -91,6 +92,7 @@ set SPARK_TEST_CLASSES=%SPARK_TEST_CLASSES%;%FWDIR%streaming\target\scala-%SCALA
set SPARK_TEST_CLASSES=%SPARK_TEST_CLASSES%;%FWDIR%sql\catalyst\target\scala-%SCALA_VERSION%\test-classes
set SPARK_TEST_CLASSES=%SPARK_TEST_CLASSES%;%FWDIR%sql\core\target\scala-%SCALA_VERSION%\test-classes
set SPARK_TEST_CLASSES=%SPARK_TEST_CLASSES%;%FWDIR%sql\hive\target\scala-%SCALA_VERSION%\test-classes
set SPARK_TEST_CLASSES=%SPARK_TEST_CLASSES%;%FWDIR%sql\hbase\target\scala-%SCALA_VERSION%\test-classes

Owner Author commented: revert this change

if "x%SPARK_TESTING%"=="x1" (
rem Add test clases to path - note, add SPARK_CLASSES and SPARK_TEST_CLASSES before CLASSPATH
2 changes: 2 additions & 0 deletions bin/compute-classpath.sh
@@ -59,6 +59,7 @@ if [ -n "$SPARK_PREPEND_CLASSES" ]; then
CLASSPATH="$CLASSPATH:$FWDIR/sql/hive/target/scala-$SPARK_SCALA_VERSION/classes"
CLASSPATH="$CLASSPATH:$FWDIR/sql/hive-thriftserver/target/scala-$SPARK_SCALA_VERSION/classes"
CLASSPATH="$CLASSPATH:$FWDIR/yarn/stable/target/scala-$SPARK_SCALA_VERSION/classes"
CLASSPATH="$CLASSPATH:$FWDIR/sql/hbase/target/scala-$SCALA_VERSION/classes"
fi

# Use spark-assembly jar from either RELEASE or assembly directory
@@ -130,6 +131,7 @@ if [[ $SPARK_TESTING == 1 ]]; then
CLASSPATH="$CLASSPATH:$FWDIR/sql/catalyst/target/scala-$SPARK_SCALA_VERSION/test-classes"
CLASSPATH="$CLASSPATH:$FWDIR/sql/core/target/scala-$SPARK_SCALA_VERSION/test-classes"
CLASSPATH="$CLASSPATH:$FWDIR/sql/hive/target/scala-$SPARK_SCALA_VERSION/test-classes"
CLASSPATH="$CLASSPATH:$FWDIR/sql/hbase/target/scala-$SCALA_VERSION/classes"
Owner Author commented: revert this.

fi

# Add hadoop conf dir if given -- otherwise FileSystem.*, etc fail !
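The additions to compute-classpath.sh follow the script's one accumulation pattern: append each module's compiled-classes directory with a `:` separator. (Note that the added lines use `$SCALA_VERSION` where the neighboring lines use `$SPARK_SCALA_VERSION`, which the review comments flag.) A minimal standalone sketch, with hypothetical values in place of the script's environment:

```shell
# Hypothetical stand-ins for the script's environment variables.
FWDIR=/opt/spark
SPARK_SCALA_VERSION=2.10

# Same accumulation pattern as compute-classpath.sh: start from conf/,
# then append each module's classes directory with a ':' separator.
CLASSPATH="$FWDIR/conf"
CLASSPATH="$CLASSPATH:$FWDIR/sql/hbase/target/scala-$SPARK_SCALA_VERSION/classes"
echo "$CLASSPATH"
# prints: /opt/spark/conf:/opt/spark/sql/hbase/target/scala-2.10/classes
```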
55 changes: 55 additions & 0 deletions bin/hbase-sql
@@ -0,0 +1,55 @@
#!/usr/bin/env bash

#
# Licensed to the Apache Software Foundation (ASF) under one or more
# contributor license agreements. See the NOTICE file distributed with
# this work for additional information regarding copyright ownership.
# The ASF licenses this file to You under the Apache License, Version 2.0
# (the "License"); you may not use this file except in compliance with
# the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#

#
# Shell script for starting the Spark SQL for HBase CLI

# Enter posix mode for bash
set -o posix

CLASS="org.apache.spark.sql.hbase.HBaseSQLCLIDriver"

# Figure out where Spark is installed
FWDIR="$(cd "`dirname "$0"`"/..; pwd)"

function usage {
echo "Usage: ./bin/hbase-sql [options] [cli option]"
pattern="usage"
pattern+="\|Spark assembly has been built with hbase"
pattern+="\|NOTE: SPARK_PREPEND_CLASSES is set"
pattern+="\|Spark Command: "
pattern+="\|--help"
pattern+="\|======="

"$FWDIR"/bin/spark-submit --help 2>&1 | grep -v Usage 1>&2
echo
echo "CLI options:"
"$FWDIR"/bin/spark-class $CLASS --help 2>&1 | grep -v "$pattern" 1>&2
}

if [[ "$@" = *--help ]] || [[ "$@" = *-h ]]; then
usage
exit 0
fi

source "$FWDIR"/bin/utils.sh
SUBMIT_USAGE_FUNCTION=usage
gatherSparkSubmitOpts "$@"

exec "$FWDIR"/bin/spark-submit --class $CLASS "${SUBMISSION_OPTS[@]}" spark-internal "${APPLICATION_OPTS[@]}"
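The `usage` function in the script above builds a filter by concatenating fragments with `\|`, which is alternation in grep's basic regular expressions, then pipes the help output through `grep -v` to drop noise lines. The filtering step in isolation, with sample input lines:

```shell
# Build the filter the same way usage() does: any line matching any
# "\|"-separated fragment is removed by grep -v.
pattern="usage"
pattern+="\|--help"
pattern+="\|======="

printf 'usage: foo\nreal option\n--help text\n' | grep -v "$pattern"
# prints only: real option
```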
266 changes: 133 additions & 133 deletions examples/pom.xml
@@ -101,140 +101,140 @@
<groupId>org.eclipse.jetty</groupId>
<artifactId>jetty-server</artifactId>
</dependency>
<dependency>
<groupId>org.apache.hbase</groupId>
<artifactId>hbase-testing-util</artifactId>
<version>${hbase.version}</version>
<exclusions>
<exclusion>
<!-- SPARK-4455 -->
<groupId>org.apache.hbase</groupId>
<artifactId>hbase-annotations</artifactId>
</exclusion>
<exclusion>
<groupId>org.jruby</groupId>
<artifactId>jruby-complete</artifactId>
</exclusion>
</exclusions>
</dependency>
<dependency>
<groupId>org.apache.hbase</groupId>
<artifactId>hbase-protocol</artifactId>
<version>${hbase.version}</version>
</dependency>
<dependency>
<groupId>org.apache.hbase</groupId>
<artifactId>hbase-common</artifactId>
<version>${hbase.version}</version>
<exclusions>
<exclusion>
<!-- SPARK-4455 -->
<groupId>org.apache.hbase</groupId>
<artifactId>hbase-annotations</artifactId>
</exclusion>
</exclusions>
</dependency>
<dependency>
<groupId>org.apache.hbase</groupId>
<artifactId>hbase-client</artifactId>
<version>${hbase.version}</version>
<exclusions>
<exclusion>
<!-- SPARK-4455 -->
<groupId>org.apache.hbase</groupId>
<artifactId>hbase-annotations</artifactId>
</exclusion>
<exclusion>
<dependency>
<groupId>org.apache.hbase</groupId>
<artifactId>hbase-testing-util</artifactId>
<version>${hbase.version}</version>
<exclusions>
<exclusion>
<!-- SPARK-4455 -->
<groupId>org.apache.hbase</groupId>
<artifactId>hbase-annotations</artifactId>
</exclusion>
<exclusion>
<groupId>org.jruby</groupId>
<artifactId>jruby-complete</artifactId>
</exclusion>
</exclusions>
</dependency>
<dependency>
<groupId>org.apache.hbase</groupId>
<artifactId>hbase-protocol</artifactId>
<version>${hbase.version}</version>
</dependency>
<dependency>
<groupId>org.apache.hbase</groupId>
<artifactId>hbase-common</artifactId>
<version>${hbase.version}</version>
<exclusions>
<exclusion>
<!-- SPARK-4455 -->
<groupId>org.apache.hbase</groupId>
<artifactId>hbase-annotations</artifactId>
</exclusion>
</exclusions>
</dependency>
<dependency>
<groupId>org.apache.hbase</groupId>
<artifactId>hbase-client</artifactId>
<version>${hbase.version}</version>
<exclusions>
<exclusion>
<!-- SPARK-4455 -->
<groupId>org.apache.hbase</groupId>
<artifactId>hbase-annotations</artifactId>
</exclusion>
<exclusion>
<groupId>io.netty</groupId>
<artifactId>netty</artifactId>
</exclusion>
</exclusions>
</dependency>
<dependency>
<groupId>org.apache.hbase</groupId>
<artifactId>hbase-server</artifactId>
<version>${hbase.version}</version>
<exclusions>
<exclusion>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-core</artifactId>
</exclusion>
<exclusion>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-client</artifactId>
</exclusion>
<exclusion>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-mapreduce-client-jobclient</artifactId>
</exclusion>
<exclusion>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-mapreduce-client-core</artifactId>
</exclusion>
<exclusion>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-auth</artifactId>
</exclusion>
<exclusion>
<!-- SPARK-4455 -->
<groupId>org.apache.hbase</groupId>
<artifactId>hbase-annotations</artifactId>
</exclusion>
<exclusion>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-annotations</artifactId>
</exclusion>
<exclusion>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-hdfs</artifactId>
</exclusion>
<exclusion>
<groupId>org.apache.hbase</groupId>
<artifactId>hbase-hadoop1-compat</artifactId>
</exclusion>
<exclusion>
<groupId>org.apache.commons</groupId>
<artifactId>commons-math</artifactId>
</exclusion>
<exclusion>
<groupId>com.sun.jersey</groupId>
<artifactId>jersey-core</artifactId>
</exclusion>
<exclusion>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-api</artifactId>
</exclusion>
<exclusion>
<groupId>com.sun.jersey</groupId>
<artifactId>jersey-server</artifactId>
</exclusion>
<exclusion>
<groupId>com.sun.jersey</groupId>
<artifactId>jersey-core</artifactId>
</exclusion>
<exclusion>
<groupId>com.sun.jersey</groupId>
<artifactId>jersey-json</artifactId>
</exclusion>
<exclusion>
<!-- hbase uses v2.4, which is better, but ...-->
<groupId>commons-io</groupId>
<artifactId>commons-io</artifactId>
</exclusion>
</exclusions>
</dependency>
<dependency>
<groupId>org.apache.hbase</groupId>
<artifactId>hbase-hadoop-compat</artifactId>
<version>${hbase.version}</version>
</dependency>
<dependency>
<groupId>org.apache.hbase</groupId>
<artifactId>hbase-hadoop-compat</artifactId>
<version>${hbase.version}</version>
<type>test-jar</type>
<scope>test</scope>
</dependency>
</exclusion>
</exclusions>
</dependency>
<dependency>
<groupId>org.apache.hbase</groupId>
<artifactId>hbase-server</artifactId>
<version>${hbase.version}</version>
<exclusions>
<exclusion>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-core</artifactId>
</exclusion>
<exclusion>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-client</artifactId>
</exclusion>
<exclusion>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-mapreduce-client-jobclient</artifactId>
</exclusion>
<exclusion>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-mapreduce-client-core</artifactId>
</exclusion>
<exclusion>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-auth</artifactId>
</exclusion>
<exclusion>
<!-- SPARK-4455 -->
<groupId>org.apache.hbase</groupId>
<artifactId>hbase-annotations</artifactId>
</exclusion>
<exclusion>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-annotations</artifactId>
</exclusion>
<exclusion>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-hdfs</artifactId>
</exclusion>
<exclusion>
<groupId>org.apache.hbase</groupId>
<artifactId>hbase-hadoop1-compat</artifactId>
</exclusion>
<exclusion>
<groupId>org.apache.commons</groupId>
<artifactId>commons-math</artifactId>
</exclusion>
<exclusion>
<groupId>com.sun.jersey</groupId>
<artifactId>jersey-core</artifactId>
</exclusion>
<exclusion>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-api</artifactId>
</exclusion>
<exclusion>
<groupId>com.sun.jersey</groupId>
<artifactId>jersey-server</artifactId>
</exclusion>
<exclusion>
<groupId>com.sun.jersey</groupId>
<artifactId>jersey-core</artifactId>
</exclusion>
<exclusion>
<groupId>com.sun.jersey</groupId>
<artifactId>jersey-json</artifactId>
</exclusion>
<exclusion>
<!-- hbase uses v2.4, which is better, but ...-->
<groupId>commons-io</groupId>
<artifactId>commons-io</artifactId>
</exclusion>
</exclusions>
</dependency>
<dependency>
<groupId>org.apache.hbase</groupId>
<artifactId>hbase-hadoop-compat</artifactId>
<version>${hbase.version}</version>
</dependency>
<dependency>
<groupId>org.apache.hbase</groupId>
<artifactId>hbase-hadoop-compat</artifactId>
<version>${hbase.version}</version>
<type>test-jar</type>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.apache.commons</groupId>
<artifactId>commons-math3</artifactId>
@@ -416,7 +416,7 @@
</properties>
</profile>
<profile>
<!-- We add a source directory specific to Scala 2.10 since Kafka
<!-- We add a source directory specific to Scala 2.10 since Kafka
only works with it -->
<id>scala-2.10</id>
<activation>