AWS - remove expected error message (duckdb#14633)
This test is continuously failing on Windows CI because the error
message reported there is different:

```
================================================================================

Query failed, but error message did not match expected error message: https://storage.googleapis.com/a/b.csv (D:/a/duckdb/duckdb/build/release/_deps/aws_extension_fc-src/test/sql/aws_secret_gcs.test:25)!

================================================================================

from "gcs://a/b.csv";

Actual result:

================================================================================

IO Error: Unable to connect to URL "gcs://a/b.csv": 400 (Bad Request)

```

This fixes that.
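
For context, in DuckDB's sqllogictest format the text after `----` in a `statement error` block is (roughly) a required substring of the actual error message; removing it leaves the test asserting only that the statement fails. After the patch below, the relevant block of `test/sql/aws_secret_gcs.test` reduces to:

```
statement error
from "gcs://a/b.csv"
----
```

which passes whether the runner reports the rewritten `https://storage.googleapis.com/a/b.csv` URL or, as on Windows CI, the raw `gcs://a/b.csv` URL.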
Mytherin authored Oct 30, 2024
2 parents 4bb0e3e + 4abe44b commit 7fb69a4
Showing 3 changed files with 14 additions and 0 deletions.
1 change: 1 addition & 0 deletions .github/config/out_of_tree_extensions.cmake
@@ -31,6 +31,7 @@ if (NOT MINGW)
         LOAD_TESTS
         GIT_URL https://github.com/duckdb/duckdb_aws
         GIT_TAG e738b4cc07a86d323db8b38220323752cd183a04
+        APPLY_PATCHES
         )
 endif()

10 changes: 10 additions & 0 deletions .github/patches/extensions/aws/test_fix.patch
@@ -0,0 +1,10 @@
diff --git a/test/sql/aws_secret_gcs.test b/test/sql/aws_secret_gcs.test
index cbed048..bcc274e 100644
--- a/test/sql/aws_secret_gcs.test
+++ b/test/sql/aws_secret_gcs.test
@@ -25,5 +25,4 @@ s1
 statement error
 from "gcs://a/b.csv"
 ----
-https://storage.googleapis.com/a/b.csv

3 changes: 3 additions & 0 deletions tools/pythonpkg/tests/fast/spark/test_spark_arrow_table.py
@@ -2,10 +2,13 @@
 
 _ = pytest.importorskip("duckdb.experimental.spark")
 pa = pytest.importorskip("pyarrow")
+from spark_namespace import USE_ACTUAL_SPARK
 
 
 class TestArrowTable:
     def test_spark_to_arrow_table(self, spark):
+        if USE_ACTUAL_SPARK:
+            return
         data = [
             ("firstRowFirstColumn",),
             ("2ndRowFirstColumn",),
