
[BUG] Failed to authenticate in iceberg rest catalog tests #13689

@yinqingh

Description


Describe the bug
Observed an `org.apache.iceberg.exceptions.NotAuthorizedException: Not authorized: Failed to authenticate` error in CI run rapids_it-iceberg-rest-catalog-dev/40.

Error details

[2025-10-28T08:56:15.548Z] FAILED ../../src/main/python/iceberg/iceberg_rtas_test.py::test_rtas_partitioned_table_unsupported_partition_fallback[day-write_distribution_mode=none-format_version=2][DATAGEN_SEED=1761638397, TZ=UTC, INJECT_OOM, IGNORE_ORDER({'local': True}), ALLOW_NON_GPU(AtomicReplaceTableAsSelectExec,AppendDataExec,ShuffleExchangeExec,SortExec,ProjectExec)] - py4j.protocol.Py4JJavaError: An error occurred while calling o101.sql.
[2025-10-28T08:56:15.548Z] : org.apache.iceberg.exceptions.NotAuthorizedException: Not authorized: Failed to authenticate
[2025-10-28T08:56:15.548Z] 	at org.apache.iceberg.rest.ErrorHandlers$DefaultErrorHandler.accept(ErrorHandlers.java:210)
[2025-10-28T08:56:15.548Z] 	at org.apache.iceberg.rest.ErrorHandlers$TableErrorHandler.accept(ErrorHandlers.java:118)
[2025-10-28T08:56:15.548Z] 	at org.apache.iceberg.rest.ErrorHandlers$TableErrorHandler.accept(ErrorHandlers.java:102)
[2025-10-28T08:56:15.548Z] 	at org.apache.iceberg.rest.HTTPClient.throwFailure(HTTPClient.java:211)
[2025-10-28T08:56:15.548Z] 	at org.apache.iceberg.rest.HTTPClient.execute(HTTPClient.java:323)
[2025-10-28T08:56:15.548Z] 	at org.apache.iceberg.rest.HTTPClient.execute(HTTPClient.java:262)
[2025-10-28T08:56:15.548Z] 	at org.apache.iceberg.rest.HTTPClient.get(HTTPClient.java:358)
[2025-10-28T08:56:15.548Z] 	at org.apache.iceberg.rest.RESTClient.get(RESTClient.java:96)
[2025-10-28T08:56:15.548Z] 	at org.apache.iceberg.rest.RESTSessionCatalog.loadInternal(RESTSessionCatalog.java:386)
[2025-10-28T08:56:15.548Z] 	at org.apache.iceberg.rest.RESTSessionCatalog.loadTable(RESTSessionCatalog.java:402)
[2025-10-28T08:56:15.548Z] 	at org.apache.iceberg.catalog.BaseSessionCatalog$AsCatalog.loadTable(BaseSessionCatalog.java:99)
[2025-10-28T08:56:15.548Z] 	at org.apache.iceberg.rest.RESTCatalog.loadTable(RESTCatalog.java:102)
[2025-10-28T08:56:15.548Z] 	at org.apache.iceberg.shaded.com.github.benmanes.caffeine.cache.BoundedLocalCache.lambda$doComputeIfAbsent$14(BoundedLocalCache.java:2406)
[2025-10-28T08:56:15.548Z] 	at java.util.concurrent.ConcurrentHashMap.compute(ConcurrentHashMap.java:1853)
[2025-10-28T08:56:15.548Z] 	at org.apache.iceberg.shaded.com.github.benmanes.caffeine.cache.BoundedLocalCache.doComputeIfAbsent(BoundedLocalCache.java:2404)
[2025-10-28T08:56:15.548Z] 	at org.apache.iceberg.shaded.com.github.benmanes.caffeine.cache.BoundedLocalCache.computeIfAbsent(BoundedLocalCache.java:2387)
[2025-10-28T08:56:15.548Z] 	at org.apache.iceberg.shaded.com.github.benmanes.caffeine.cache.LocalCache.computeIfAbsent(LocalCache.java:108)
[2025-10-28T08:56:15.548Z] 	at org.apache.iceberg.shaded.com.github.benmanes.caffeine.cache.LocalManualCache.get(LocalManualCache.java:62)
[2025-10-28T08:56:15.548Z] 	at org.apache.iceberg.CachingCatalog.loadTable(CachingCatalog.java:167)
[2025-10-28T08:56:15.548Z] 	at org.apache.iceberg.spark.SparkCatalog.load(SparkCatalog.java:845)
[2025-10-28T08:56:15.548Z] 	at org.apache.iceberg.spark.SparkCatalog.loadTable(SparkCatalog.java:170)
[2025-10-28T08:56:15.548Z] 	at org.apache.iceberg.spark.SparkSessionCatalog.loadTable(SparkSessionCatalog.java:139)
[2025-10-28T08:56:15.548Z] 	at org.apache.spark.sql.connector.catalog.TableCatalog.tableExists(TableCatalog.java:185)
[2025-10-28T08:56:15.548Z] 	at org.apache.spark.sql.execution.datasources.v2.CreateTableExec.run(CreateTableExec.scala:42)
[2025-10-28T08:56:15.548Z] 	at org.apache.spark.sql.execution.datasources.v2.V2CommandExec.result$lzycompute(V2CommandExec.scala:43)
[2025-10-28T08:56:15.548Z] 	at org.apache.spark.sql.execution.datasources.v2.V2CommandExec.result(V2CommandExec.scala:43)
[2025-10-28T08:56:15.549Z] 	at org.apache.spark.sql.execution.datasources.v2.V2CommandExec.executeCollect(V2CommandExec.scala:49)
[2025-10-28T08:56:15.549Z] 	at org.apache.spark.sql.execution.QueryExecution$$anonfun$eagerlyExecuteCommands$1.$anonfun$applyOrElse$1(QueryExecution.scala:107)
[2025-10-28T08:56:15.549Z] 	at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$6(SQLExecution.scala:125)
[2025-10-28T08:56:15.549Z] 	at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:201)
[2025-10-28T08:56:15.549Z] 	at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$1(SQLExecution.scala:108)
[2025-10-28T08:56:15.549Z] 	at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:900)
[2025-10-28T08:56:15.549Z] 	at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:66)
[2025-10-28T08:56:15.549Z] 	at org.apache.spark.sql.execution.QueryExecution$$anonfun$eagerlyExecuteCommands$1.applyOrElse(QueryExecution.scala:107)
[2025-10-28T08:56:15.549Z] 	at org.apache.spark.sql.execution.QueryExecution$$anonfun$eagerlyExecuteCommands$1.applyOrElse(QueryExecution.scala:98)
[2025-10-28T08:56:15.549Z] 	at org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$transformDownWithPruning$1(TreeNode.scala:461)
[2025-10-28T08:56:15.549Z] 	at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(origin.scala:76)
[2025-10-28T08:56:15.549Z] 	at org.apache.spark.sql.catalyst.trees.TreeNode.transformDownWithPruning(TreeNode.scala:461)
[2025-10-28T08:56:15.549Z] 	at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.org$apache$spark$sql$catalyst$plans$logical$AnalysisHelper$$super$transformDownWithPruning(LogicalPlan.scala:32)
[2025-10-28T08:56:15.549Z] 	at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning(AnalysisHelper.scala:267)
[2025-10-28T08:56:15.549Z] 	at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning$(AnalysisHelper.scala:263)
[2025-10-28T08:56:15.549Z] 	at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:32)
[2025-10-28T08:56:15.549Z] 	at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:32)
[2025-10-28T08:56:15.549Z] 	at org.apache.spark.sql.catalyst.trees.TreeNode.transformDown(TreeNode.scala:437)
[2025-10-28T08:56:15.549Z] 	at org.apache.spark.sql.execution.QueryExecution.eagerlyExecuteCommands(QueryExecution.scala:98)
[2025-10-28T08:56:15.549Z] 	at org.apache.spark.sql.execution.QueryExecution.commandExecuted$lzycompute(QueryExecution.scala:85)
[2025-10-28T08:56:15.549Z] 	at org.apache.spark.sql.execution.QueryExecution.commandExecuted(QueryExecution.scala:83)
[2025-10-28T08:56:15.549Z] 	at org.apache.spark.sql.Dataset.<init>(Dataset.scala:220)
[2025-10-28T08:56:15.549Z] 	at org.apache.spark.sql.Dataset$.$anonfun$ofRows$2(Dataset.scala:100)
[2025-10-28T08:56:15.549Z] 	at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:900)
[2025-10-28T08:56:15.549Z] 	at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:97)
[2025-10-28T08:56:15.549Z] 	at org.apache.spark.sql.SparkSession.$anonfun$sql$1(SparkSession.scala:638)
[2025-10-28T08:56:15.549Z] 	at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:900)
[2025-10-28T08:56:15.549Z] 	at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:629)
[2025-10-28T08:56:15.549Z] 	at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:659)
[2025-10-28T08:56:15.549Z] 	at sun.reflect.GeneratedMethodAccessor67.invoke(Unknown Source)
[2025-10-28T08:56:15.549Z] 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
[2025-10-28T08:56:15.549Z] 	at java.lang.reflect.Method.invoke(Method.java:498)
[2025-10-28T08:56:15.549Z] 	at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
[2025-10-28T08:56:15.549Z] 	at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:374)
[2025-10-28T08:56:15.549Z] 	at py4j.Gateway.invoke(Gateway.java:282)
[2025-10-28T08:56:15.549Z] 	at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
[2025-10-28T08:56:15.549Z] 	at py4j.commands.CallCommand.execute(CallCommand.java:79)
[2025-10-28T08:56:15.549Z] 	at py4j.ClientServerConnection.waitForCommands(ClientServerConnection.java:182)
[2025-10-28T08:56:15.549Z] 	at py4j.ClientServerConnection.run(ClientServerConnection.java:106)
[2025-10-28T08:56:15.549Z] 	at java.lang.Thread.run(Thread.java:750)
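
As context for the stack trace: `TableCatalog.tableExists` triggers `RESTCatalog.loadTable`, whose HTTP GET the server rejects with an authentication error that `ErrorHandlers` maps to `NotAuthorizedException`. When the catalog is configured with a `credential`, the REST client first exchanges it for a bearer token against the catalog's OAuth2 token endpoint, so an expired or misconfigured credential on the test REST server is one plausible cause. A minimal sketch of that token exchange for manual diagnosis (endpoint path per the Iceberg REST catalog spec; the host and credentials below are hypothetical placeholders, not taken from the CI job):

```python
from urllib.parse import urlencode
from urllib.request import Request

def build_token_request(base_uri, client_id, client_secret):
    """Form the client_credentials token exchange an Iceberg REST
    client performs before making authenticated catalog calls."""
    body = urlencode({
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        "scope": "catalog",
    }).encode()
    return Request(
        f"{base_uri}/v1/oauth2/tokens",
        data=body,
        headers={"Content-Type": "application/x-www-form-urlencoded"},
        method="POST",
    )

# Hypothetical local REST catalog endpoint for illustration only.
req = build_token_request("http://localhost:8181", "id", "secret")
```

Sending this request with `urllib.request.urlopen(req)` against the test's REST catalog (outside the Spark job) would show whether the credential itself is being rejected, independent of the catalog client's token caching.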

Steps/Code to reproduce bug
Please provide a list of steps or a code sample to reproduce the issue.
Avoid posting private or sensitive data.
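
To narrow down reproduction: the failing test runs against a Spark session configured with an Iceberg REST catalog. A sketch of the kind of configuration presumably involved, using the standard Iceberg Spark catalog properties (the catalog name, URI, and credential below are hypothetical placeholders, not the CI job's actual values):

```python
def iceberg_rest_catalog_conf(name, uri, credential):
    """Build Spark conf entries for an Iceberg REST catalog.

    `credential` is the OAuth2 client_id:client_secret pair the REST
    client exchanges for a bearer token; a missing or expired token is
    a common cause of "Not authorized: Failed to authenticate".
    """
    prefix = f"spark.sql.catalog.{name}"
    return {
        prefix: "org.apache.iceberg.spark.SparkCatalog",
        f"{prefix}.type": "rest",
        f"{prefix}.uri": uri,
        f"{prefix}.credential": credential,
    }

# Hypothetical values for illustration only.
conf = iceberg_rest_catalog_conf("demo", "http://localhost:8181", "id:secret")
```

Running any `spark.sql(...)` that touches a table in such a catalog (as `iceberg_rtas_test.py` does) would exercise the same `loadTable` path that fails in the trace above.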

Expected behavior
A clear and concise description of what you expected to happen.

Environment details (please complete the following information)

  • Environment location: [Standalone, YARN, Kubernetes, Cloud(specify cloud provider)]
  • Spark configuration settings related to the issue

Additional context
Add any other context about the problem here.

Metadata

Labels

bot_watch (Slack bot watched issue for LLM analyzer), bug (Something isn't working)
