
Use user token when requesting from HttpsDataVault on same DS #8322

Open
wants to merge 7 commits into base: master

Conversation

@frcroth (Member) commented Jan 13, 2025

URL of deployed dev instance (used for testing):

  • https://___.webknossos.xyz

Steps to test:

  • Explore a dataset with the remote URI being the zarr streaming URI of a layer on the same datastore

This branches off of unified-annotation-versioning, so we wait until #7917 is merged and can then cherry-pick / rebase.

So basically, any method now takes a tokenContext if anything it calls (directly or transitively) requests data, which is a lot of methods.
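
To illustrate the shape of the change (a sketch reconstructed from signatures visible in the diff, not copied verbatim): an explorer method gains the token context as an implicit parameter, so every nested readBytes() call reached from it can authenticate.

// Before: nested HTTP requests had no access to the user's token
def explore(remotePath: VaultPath, credentialId: Option[String]): Fox[List[(DataLayerWithMagLocators, VoxelSize)]]

// After: the TokenContext travels along implicitly
def explore(remotePath: VaultPath, credentialId: Option[String])(
    implicit tc: TokenContext): Fox[List[(DataLayerWithMagLocators, VoxelSize)]]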


@coderabbitai bot (Contributor) commented Jan 13, 2025

📝 Walkthrough

This pull request introduces comprehensive changes to the WEBKNOSSOS datastore, focusing on enhancing token-based access control and context management across various data exploration and retrieval services. The primary modification involves adding an implicit TokenContext parameter to numerous method signatures, enabling more secure and context-aware data access. These changes aim to improve how user tokens are utilized when accessing data layers, particularly in scenarios involving self-streaming and zarr links.

Changes

| File | Change Summary |
| ---- | -------------- |
| CHANGELOG.unreleased.md | Updated to reflect changes in version restoration, zarr link access, and data handling |
| webknossos-datastore/.../*.scala | Added TokenContext parameter to multiple method signatures across various services and explorers |
| test/backend/DataVaultTestSuite.scala | Updated test cases to incorporate TokenContext |

Assessment against linked issues

| Objective | Addressed | Explanation |
| --------- | --------- | ----------- |
| Pass user token when self-streaming from Webknossos [#8255] | ✅ | The changes enable passing user tokens implicitly when accessing zarr links, particularly for self-streaming scenarios. |

Possibly related PRs

  • Support OME-NGFF zarr dataset translation #8311: Related because both PRs touch Zarr dataset processing; that PR updates coordinate-transformation handling and introduces a translation parameter in the NgffCoordinateTransformation class, overlapping with this PR's TokenContext additions in the same area.
  • Enable S3 compliant data vaults using https and http #8167: Related because both PRs modify the S3DataVault class and its HTTP request handling, specifically access control and the use of tokens in data vault operations.
  • Release 24.11.1 #8188: Related because that release also threads TokenContext through method signatures to manage access control and context, overlapping with the signature changes here.

Suggested labels

bug, frontend, automerge

Suggested reviewers

  • fm3

Poem

🐰 Tokens dance through the code's domain,
Secure passages, no access in vain.
Implicit contexts, silently glide,
Protecting data on every side.
A rabbit's whisper of digital might! 🔒


📜 Recent review details

Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 4d9994a and 2be5916.

📒 Files selected for processing (1)
  • webknossos-tracingstore/app/com/scalableminds/webknossos/tracingstore/tracings/editablemapping/EditableMappingService.scala (2 hunks)
🚧 Files skipped from review as they are similar to previous changes (1)
  • webknossos-tracingstore/app/com/scalableminds/webknossos/tracingstore/tracings/editablemapping/EditableMappingService.scala
⏰ Context from checks skipped due to timeout of 90000ms (1)
  • GitHub Check: circleci_build


@frcroth frcroth marked this pull request as draft January 13, 2025 13:37
@frcroth frcroth linked an issue Jan 13, 2025 that may be closed by this pull request
@frcroth frcroth self-assigned this Jan 13, 2025
Base automatically changed from unified-annotation-versioning to master January 27, 2025 15:19
@frcroth frcroth requested a review from fm3 January 29, 2025 12:14
@frcroth frcroth marked this pull request as ready for review January 29, 2025 12:14
@coderabbitai bot left a comment

Actionable comments posted: 3

🔭 Outside diff range comments (1)
webknossos-datastore/app/com/scalableminds/webknossos/datastore/explore/NeuroglancerUriExplorer.scala (1)

Line range hint 52-62: Critical: TokenContext not propagated to layer explorers.

While the method accepts a TokenContext parameter, it's not being passed to any of the explorer instances' explore methods. This breaks the authentication chain and could lead to unauthorized access attempts.

Apply this fix to propagate the TokenContext:

 private def exploreLayer(layerType: String, remotePath: VaultPath, name: String)(
     implicit tc: TokenContext): Fox[List[(DataLayerWithMagLocators, VoxelSize)]] =
   layerType match {
     case "n5" =>
       Fox.firstSuccess(
-        Seq(new N5ArrayExplorer().explore(remotePath, None), new N5MultiscalesExplorer().explore(remotePath, None)))
+        Seq(new N5ArrayExplorer().explore(remotePath, None)(tc), new N5MultiscalesExplorer().explore(remotePath, None)(tc)))
     case "precomputed" => 
-      new PrecomputedExplorer().explore(remotePath, None)
+      new PrecomputedExplorer().explore(remotePath, None)(tc)
     case "zarr" | "zarr2" =>
       Fox.firstSuccess(
         Seq(
-          new NgffV0_4Explorer().explore(remotePath, None),
-          new NgffV0_5Explorer().explore(remotePath, None),
-          new ZarrArrayExplorer(Vec3Int.ones).explore(remotePath, None)
+          new NgffV0_4Explorer().explore(remotePath, None)(tc),
+          new NgffV0_5Explorer().explore(remotePath, None)(tc),
+          new ZarrArrayExplorer(Vec3Int.ones).explore(remotePath, None)(tc)
         ))
     case "zarr3" => 
-      new Zarr3ArrayExplorer().explore(remotePath, None)
+      new Zarr3ArrayExplorer().explore(remotePath, None)(tc)
     case _       => Fox.failure(f"Can not explore layer of $layerType type")
   }
🧹 Nitpick comments (25)
webknossos-datastore/app/com/scalableminds/webknossos/datastore/explore/WebknossosZarrExplorer.scala (2)

Line range hint 49-49: Consider enhancing the error message for unsupported layers.

The current error message could be more specific about what layer types are actually received.

Consider this improvement:

-case layer => Fox.failure(s"Only remote Zarr2 or Zarr3 layers are supported, got ${layer.getClass}.")
+case layer => Fox.failure(s"Only remote Zarr2 (ZarrDataLayer, ZarrSegmentationLayer) or Zarr3 (Zarr3DataLayer, Zarr3SegmentationLayer) layers are supported, got ${layer.getClass.getSimpleName}.")

Line range hint 73-73: Consider enhancing the error message for missing mag path.

The current error message could provide more context about why the path couldn't be found.

Consider this improvement:

-          _ <- magHeaderPath.readBytes() ?~> s"Could not find $magPath"
+          _ <- magHeaderPath.readBytes() ?~> s"Could not find mag header at $magHeaderPath for mag ${mag.mag}"
webknossos-datastore/app/com/scalableminds/webknossos/datastore/services/DSFullMeshService.scala (1)

104-104: Validate TokenContext before use in AdHocMeshRequest.

Consider adding validation to ensure the TokenContext is valid before using it in the request. This could prevent unnecessary processing if the token is invalid or expired.

+      _ <- tc.validateToken ?~> "Invalid or expired token"
       tc,
webknossos-datastore/app/com/scalableminds/webknossos/datastore/services/SegmentIndexFileService.scala (1)

236-236: Consider adding explicit error handling for token-related failures.

While the TokenContext is correctly added, consider enhancing error handling to specifically catch and handle token-related failures (e.g., invalid or expired tokens) to provide better feedback to the caller.

Example enhancement:

   private def getDataForBucketPositions(dataSource: datasource.DataSource,
                                         dataLayer: DataLayer,
                                         mag: Vec3Int,
                                         mag1BucketPositions: Seq[Vec3Int],
                                         mappingName: Option[String])(implicit tc: TokenContext): Fox[Array[Byte]] = {
+    def handleTokenError(err: Throwable): Fox[Array[Byte]] = err match {
+      case e: InvalidTokenException => Fox.failure(s"Invalid token: ${e.getMessage}")
+      case e: ExpiredTokenException => Fox.failure(s"Expired token: ${e.getMessage}")
+      case e => Fox.failure(s"Unexpected error: ${e.getMessage}")
+    }
+
     val dataRequests = mag1BucketPositions.map { position =>
       // ... existing code ...
     }.toList
     for {
-      (data, _) <- binaryDataServiceHolder.binaryDataService.handleDataRequests(dataRequests)
+      (data, _) <- binaryDataServiceHolder.binaryDataService.handleDataRequests(dataRequests).recoverWith(handleTokenError)
     } yield data
   }
webknossos-datastore/app/com/scalableminds/webknossos/datastore/storage/DataVaultService.scala (1)

Line range hint 34-37: Consider token-aware caching strategy.

The current cache implementation uses only RemoteSourceDescriptor as the key. Since we're implementing token-based access, consider whether the cache key should include token context to prevent unauthorized access to cached vaults.
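
A minimal sketch of such a token-aware key (the userTokenOpt accessor, the getOrLoad signature, and createVault are assumptions about the project's TokenContext and AlfuCache APIs, not the repo's actual code):

// Sketch only: widen the cache key from RemoteSourceDescriptor to
// (descriptor, token) so that vaults resolved under different user
// credentials never share a cache entry.
private val vaultCache: AlfuCache[(RemoteSourceDescriptor, Option[String]), DataVault] =
  AlfuCache(maxCapacity = 100)

def getVault(descriptor: RemoteSourceDescriptor)(implicit tc: TokenContext): Fox[DataVault] =
  vaultCache.getOrLoad((descriptor, tc.userTokenOpt), _ => createVault(descriptor))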

webknossos-datastore/app/com/scalableminds/webknossos/datastore/explore/NgffExplorationUtils.scala (1)

Line range hint 186-313: Consider documenting TokenContext usage.

While the TokenContext integration is clean and consistent, consider adding documentation comments to explain:

  • The purpose of TokenContext in NGFF exploration
  • How TokenContext is used in the data access chain
  • Any security implications or requirements
webknossos-datastore/app/com/scalableminds/webknossos/datastore/explore/ExploreLocalLayerService.scala (1)

Line range hint 55-134: Implement consistent token propagation throughout the service.

The current implementation adds TokenContext support but uses None as the token value throughout the service. To properly implement token-based access control as per the PR objectives:

  1. Add implicit TokenContext parameters to all public methods in this service.
  2. Propagate the token through the call chain instead of creating new TokenContext instances with None.
  3. Consider adding validation to ensure tokens are present when required.

This will ensure proper user token usage when requesting from HttpsDataVault as intended.
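
For example, a public entry point could accept the caller's context and let implicit resolution carry it into the explorer, instead of fabricating TokenContext(None) (a sketch; the method name exploreLocalLayer is assumed):

// Sketch: the implicit tc from the caller flows into explore() without
// any explicit plumbing, replacing the hard-coded TokenContext(None).
def exploreLocalLayer(path: VaultPath)(
    implicit ec: ExecutionContext,
    tc: TokenContext): Fox[List[(DataLayerWithMagLocators, VoxelSize)]] =
  new ZarrArrayExplorer(Vec3Int.ones).explore(path, None)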

webknossos-datastore/app/com/scalableminds/webknossos/datastore/datareaders/precomputed/PrecomputedArray.scala (1)

23-30: LGTM: TokenContext properly integrated into open method.

The TokenContext parameter is correctly added as implicit and properly used for secure file operations. Consider adding a scaladoc comment to document the token context requirement.

+  /**
+   * Opens a PrecomputedArray with the given parameters.
+   * @param tc Implicit token context for secure file access
+   */
   def open(magPath: VaultPath,
webknossos-datastore/app/com/scalableminds/webknossos/datastore/datareaders/zarr/ZarrArray.scala (1)

24-25: Consider adding ScalaDoc for TokenContext parameter.

While the change is correct, it would be helpful to document the purpose and requirements of the TokenContext parameter, especially since it's an implicit parameter that affects data access control.

Add ScalaDoc comment:

+  /**
+   * Opens a ZarrArray with the specified parameters.
+   * @param tc Implicit token context for access control during data operations
+   */
   def open(path: VaultPath,
webknossos-datastore/app/com/scalableminds/webknossos/datastore/explore/NeuroglancerUriExplorer.scala (1)

22-23: Consider architectural consistency between credentials and TokenContext.

While the method now accepts a TokenContext for authentication, it still enforces credentialId.isEmpty on line 28. This might create confusion about which authentication mechanism should be used. Consider documenting why both mechanisms exist and their respective purposes.

webknossos-datastore/app/com/scalableminds/webknossos/datastore/datareaders/wkw/WKWArray.scala (1)

81-82: LGTM: TokenContext properly integrated into readAndParseShardIndex.

The implicit TokenContext parameter is correctly added. Consider adding error context that includes the token information for better debugging.

Consider enhancing error reporting by including token context information in error messages:

-      shardIndexBytes <- shardPath.readBytes(Some(rangeInShardFile))
+      shardIndexBytes <- shardPath.readBytes(Some(rangeInShardFile)).mapFailure(err => 
+        s"Failed to read shard index bytes for token ${tc.token}: $err")
webknossos-datastore/app/com/scalableminds/webknossos/datastore/services/FindDataService.scala (2)

134-135: Add ScalaDoc for TokenContext parameter.

Since this is a public method, consider adding ScalaDoc documentation to explain the TokenContext requirement and its usage.


140-140: Document security implications of token-based access.

Since this method processes potentially sensitive data:

  1. Add documentation about access control implications
  2. Consider adding access validation at the start of the method
webknossos-datastore/app/com/scalableminds/webknossos/datastore/explore/N5MultiscalesExplorer.scala (1)

21-22: LGTM! Comprehensive TokenContext propagation.

The TokenContext is correctly propagated through the entire chain of method calls, from the public explore method down to the private helper methods. This ensures consistent token context handling throughout the exploration process.

The propagation pattern used here is a good example of how to handle implicit contexts in a hierarchical method structure. Consider documenting this pattern for future similar implementations.

Also applies to: 29-32, 105-105
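
Condensed, the pattern looks like this (a structural sketch with assumed helper names, not the file's exact code):

// Declare the implicit once on the public method; Scala's implicit
// resolution supplies it to every helper that also declares it.
def explore(remotePath: VaultPath, credentialId: Option[String])(
    implicit tc: TokenContext): Fox[List[(DataLayerWithMagLocators, VoxelSize)]] =
  for {
    metadata <- readMetadata(remotePath) // tc passed implicitly
    layers <- layersFromMetadata(remotePath, metadata) // and here as well
  } yield layers

private def readMetadata(path: VaultPath)(implicit tc: TokenContext): Fox[N5Metadata] =
  (path / "attributes.json").parseAsJson[N5Metadata] // the read that needs the token

private def layersFromMetadata(path: VaultPath, metadata: N5Metadata)(
    implicit tc: TokenContext): Fox[List[(DataLayerWithMagLocators, VoxelSize)]] = ???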

webknossos-datastore/app/com/scalableminds/webknossos/datastore/explore/NgffV0_4Explorer.scala (4)

48-48: Consider documenting the TokenContext usage.

The token context is correctly added as an implicit parameter, but its specific usage and requirements should be documented for maintainability.

Add a comment explaining the token context's role in layer creation:

+  /** Creates a ZarrLayer with the specified parameters.
+   * @param tc TokenContext Used for authenticated access to remote resources
+   */
   protected def createLayer(...)

102-108: Consider enhancing error messages with context information.

The token context integration looks good, but error messages could be more descriptive to aid debugging authentication issues.

Consider enhancing error messages:

-      mag <- magFromTransforms(...) ?~> "Could not extract mag from scale transforms"
+      mag <- magFromTransforms(...) ?~> s"Could not extract mag from scale transforms for path: ${layerPath}"

129-130: Consider enhancing error messages for authentication failures.

The token context integration looks good, but error messages could be more helpful for debugging authentication issues.

Consider enhancing error messages:

-      ngffHeader <- zattrsPath.parseAsJson[NgffMetadata] ?~> s"Failed to read OME NGFF header at $zattrsPath"
+      ngffHeader <- zattrsPath.parseAsJson[NgffMetadata] ?~> 
+        s"Failed to read OME NGFF header at $zattrsPath. Please verify authentication credentials."

Line range hint 1-143: Well-structured integration of token-based access control.

The TokenContext integration is consistent throughout the file and aligns well with the PR objective of implementing user token authentication for HttpsDataVault requests. The changes maintain the existing functionality while adding secure access control.

Consider creating a comprehensive error handling strategy for authentication failures across all methods to improve debugging experience.

webknossos-datastore/app/com/scalableminds/webknossos/datastore/services/BinaryDataService.scala (2)

55-56: Consider improving method signature formatting.

While the TokenContext addition is correct, consider formatting the method signature more idiomatically:

-  def handleDataRequests(requests: List[DataServiceDataRequest])(
-      implicit tc: TokenContext): Fox[(Array[Byte], List[Int])] = {
+  def handleDataRequests(
+      requests: List[DataServiceDataRequest]
+  )(implicit tc: TokenContext): Fox[(Array[Byte], List[Int])] = {

93-94: Consider improving private method signature formatting.

Similar to handleDataRequests, consider formatting the method signature more idiomatically:

-  private def handleBucketRequest(request: DataServiceDataRequest, bucket: BucketPosition)(
-      implicit tc: TokenContext): Fox[Array[Byte]] =
+  private def handleBucketRequest(
+      request: DataServiceDataRequest,
+      bucket: BucketPosition
+  )(implicit tc: TokenContext): Fox[Array[Byte]] =
webknossos-datastore/app/com/scalableminds/webknossos/datastore/datareaders/ChunkReader.scala (1)

3-3: LGTM! Consider adding documentation for TokenContext usage.

The TokenContext integration is clean and properly propagates through all code paths. Consider adding documentation to explain the role of TokenContext in chunk reading operations.

Add documentation above the methods to explain the TokenContext requirement:

+ /** Reads and processes a chunk of data.
+  * @param path The path to the chunk
+  * @param chunkShapeFromMetadata The expected shape from metadata
+  * @param range Optional range specifier
+  * @param useSkipTypingShortcut Whether to use typing shortcuts
+  * @param ec Implicit execution context
+  * @param tc Implicit token context for access control
+  * @return The processed chunk as a MultiArray
+  */
 def read(path: VaultPath,

Also applies to: 22-22, 43-44

webknossos-datastore/app/com/scalableminds/webknossos/datastore/datareaders/n5/N5Array.scala (1)

3-3: LGTM! Consider grouping related parameters.

The TokenContext integration is correct. Consider grouping related parameters for better readability.

Group related parameters:

 def open(
+    // Path and identification
     path: VaultPath,
     dataSourceId: DataSourceId,
     layerName: String,
+    // Configuration
     axisOrderOpt: Option[AxisOrder],
     channelIndex: Option[Int],
     additionalAxes: Option[Seq[AdditionalAxis]],
+    // Resources
     sharedChunkContentsCache: AlfuCache[String, MultiArray]
   )(
+    // Execution context
     implicit ec: ExecutionContext,
     tc: TokenContext
   ): Fox[N5Array]

Also applies to: 25-26

webknossos-datastore/app/com/scalableminds/webknossos/datastore/datavault/VaultPath.scala (1)

22-23: LGTM! TokenContext is consistently propagated.

The addition of the implicit TokenContext parameter is consistent across all methods that perform data access operations. This ensures proper token-based access control throughout the data access chain.

Consider adding ScalaDoc comments to document the token context requirement and its purpose in these methods.

Also applies to: 29-29, 98-98

CHANGELOG.unreleased.md (1)

18-18: Enhance clarity of the changelog entry.

The entry correctly describes the change and follows the changelog format. However, the wording could be slightly improved for better clarity.

Consider this revision for better clarity:

- When using a zarr link to a data layer as another layers source, the user's token is used to access the data. [#8322](https://github.com/scalableminds/webknossos/pull/8322/)
+ When accessing a data layer through a zarr link as another layer's source, the user's token is now used for authentication. [#8322](https://github.com/scalableminds/webknossos/pull/8322/)
webknossos-datastore/app/com/scalableminds/webknossos/datastore/datavault/GoogleCloudDataVault.scala (1)

37-38: Consider integrating TokenContext with Google Cloud Storage authentication.

While the class already handles authentication via GoogleServiceAccountCredential, consider using the TokenContext to:

  1. Validate access permissions before making GCS requests
  2. Potentially use the user's token for audit logging
  3. Implement rate limiting or quota management per user
📜 Review details

Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between f83da10 and d31837e.

📒 Files selected for processing (44)
  • CHANGELOG.unreleased.md (1 hunks)
  • test/backend/DataVaultTestSuite.scala (13 hunks)
  • webknossos-datastore/app/com/scalableminds/webknossos/datastore/controllers/BinaryDataController.scala (6 hunks)
  • webknossos-datastore/app/com/scalableminds/webknossos/datastore/controllers/ZarrStreamingController.scala (1 hunks)
  • webknossos-datastore/app/com/scalableminds/webknossos/datastore/dataformats/BucketProvider.scala (1 hunks)
  • webknossos-datastore/app/com/scalableminds/webknossos/datastore/dataformats/DatasetArrayBucketProvider.scala (5 hunks)
  • webknossos-datastore/app/com/scalableminds/webknossos/datastore/datareaders/ChunkReader.scala (3 hunks)
  • webknossos-datastore/app/com/scalableminds/webknossos/datastore/datareaders/DatasetArray.scala (5 hunks)
  • webknossos-datastore/app/com/scalableminds/webknossos/datastore/datareaders/n5/N5Array.scala (2 hunks)
  • webknossos-datastore/app/com/scalableminds/webknossos/datastore/datareaders/n5/N5ChunkReader.scala (2 hunks)
  • webknossos-datastore/app/com/scalableminds/webknossos/datastore/datareaders/precomputed/PrecomputedArray.scala (5 hunks)
  • webknossos-datastore/app/com/scalableminds/webknossos/datastore/datareaders/wkw/WKWArray.scala (4 hunks)
  • webknossos-datastore/app/com/scalableminds/webknossos/datastore/datareaders/zarr/ZarrArray.scala (2 hunks)
  • webknossos-datastore/app/com/scalableminds/webknossos/datastore/datareaders/zarr3/Zarr3Array.scala (5 hunks)
  • webknossos-datastore/app/com/scalableminds/webknossos/datastore/datareaders/zarr3/Zarr3ChunkReader.scala (2 hunks)
  • webknossos-datastore/app/com/scalableminds/webknossos/datastore/datavault/DataVault.scala (1 hunks)
  • webknossos-datastore/app/com/scalableminds/webknossos/datastore/datavault/FileSystemDataVault.scala (2 hunks)
  • webknossos-datastore/app/com/scalableminds/webknossos/datastore/datavault/GoogleCloudDataVault.scala (2 hunks)
  • webknossos-datastore/app/com/scalableminds/webknossos/datastore/datavault/HttpsDataVault.scala (5 hunks)
  • webknossos-datastore/app/com/scalableminds/webknossos/datastore/datavault/S3DataVault.scala (2 hunks)
  • webknossos-datastore/app/com/scalableminds/webknossos/datastore/datavault/VaultPath.scala (3 hunks)
  • webknossos-datastore/app/com/scalableminds/webknossos/datastore/explore/ExploreLocalLayerService.scala (3 hunks)
  • webknossos-datastore/app/com/scalableminds/webknossos/datastore/explore/ExploreRemoteLayerService.scala (6 hunks)
  • webknossos-datastore/app/com/scalableminds/webknossos/datastore/explore/N5ArrayExplorer.scala (2 hunks)
  • webknossos-datastore/app/com/scalableminds/webknossos/datastore/explore/N5MultiscalesExplorer.scala (3 hunks)
  • webknossos-datastore/app/com/scalableminds/webknossos/datastore/explore/NeuroglancerUriExplorer.scala (4 hunks)
  • webknossos-datastore/app/com/scalableminds/webknossos/datastore/explore/NgffExplorationUtils.scala (6 hunks)
  • webknossos-datastore/app/com/scalableminds/webknossos/datastore/explore/NgffV0_4Explorer.scala (6 hunks)
  • webknossos-datastore/app/com/scalableminds/webknossos/datastore/explore/NgffV0_5Explorer.scala (5 hunks)
  • webknossos-datastore/app/com/scalableminds/webknossos/datastore/explore/PrecomputedExplorer.scala (2 hunks)
  • webknossos-datastore/app/com/scalableminds/webknossos/datastore/explore/RemoteLayerExplorer.scala (2 hunks)
  • webknossos-datastore/app/com/scalableminds/webknossos/datastore/explore/WebknossosZarrExplorer.scala (3 hunks)
  • webknossos-datastore/app/com/scalableminds/webknossos/datastore/explore/Zarr3ArrayExplorer.scala (2 hunks)
  • webknossos-datastore/app/com/scalableminds/webknossos/datastore/explore/ZarrArrayExplorer.scala (2 hunks)
  • webknossos-datastore/app/com/scalableminds/webknossos/datastore/services/AdHocMeshService.scala (5 hunks)
  • webknossos-datastore/app/com/scalableminds/webknossos/datastore/services/BinaryDataService.scala (4 hunks)
  • webknossos-datastore/app/com/scalableminds/webknossos/datastore/services/DSFullMeshService.scala (2 hunks)
  • webknossos-datastore/app/com/scalableminds/webknossos/datastore/services/FindDataService.scala (5 hunks)
  • webknossos-datastore/app/com/scalableminds/webknossos/datastore/services/SegmentIndexFileService.scala (5 hunks)
  • webknossos-datastore/app/com/scalableminds/webknossos/datastore/storage/DataVaultService.scala (3 hunks)
  • webknossos-tracingstore/app/com/scalableminds/webknossos/tracingstore/tracings/editablemapping/EditableMappingLayer.scala (1 hunks)
  • webknossos-tracingstore/app/com/scalableminds/webknossos/tracingstore/tracings/editablemapping/EditableMappingService.scala (2 hunks)
  • webknossos-tracingstore/app/com/scalableminds/webknossos/tracingstore/tracings/volume/VolumeTracingLayer.scala (2 hunks)
  • webknossos-tracingstore/app/com/scalableminds/webknossos/tracingstore/tracings/volume/VolumeTracingService.scala (1 hunks)
👮 Files not reviewed due to content moderation or server errors (9)
  • webknossos-datastore/app/com/scalableminds/webknossos/datastore/datavault/VaultPath.scala
  • webknossos-datastore/app/com/scalableminds/webknossos/datastore/datavault/FileSystemDataVault.scala
  • webknossos-datastore/app/com/scalableminds/webknossos/datastore/datavault/GoogleCloudDataVault.scala
  • webknossos-tracingstore/app/com/scalableminds/webknossos/tracingstore/tracings/editablemapping/EditableMappingLayer.scala
  • webknossos-tracingstore/app/com/scalableminds/webknossos/tracingstore/tracings/volume/VolumeTracingLayer.scala
  • webknossos-datastore/app/com/scalableminds/webknossos/datastore/datavault/HttpsDataVault.scala
  • webknossos-datastore/app/com/scalableminds/webknossos/datastore/dataformats/DatasetArrayBucketProvider.scala
  • webknossos-datastore/app/com/scalableminds/webknossos/datastore/services/AdHocMeshService.scala
  • webknossos-tracingstore/app/com/scalableminds/webknossos/tracingstore/tracings/volume/VolumeTracingService.scala
⏰ Context from checks skipped due to timeout of 90000ms (1)
  • GitHub Check: circleci_build
🔇 Additional comments (119)
webknossos-datastore/app/com/scalableminds/webknossos/datastore/explore/WebknossosZarrExplorer.scala (4)

3-3: LGTM: TokenContext import added appropriately.

The import is correctly placed and necessary for the token context functionality being added.


28-29: LGTM: TokenContext parameter addition is well-implemented.

The implicit TokenContext parameter is correctly added and aligns with the PR's objective of implementing user token-based access control.


58-58: LGTM: Consistent TokenContext propagation in adaptMags.

The implicit TokenContext parameter is correctly added and properly propagated to the fixRemoteMagPath call.


64-65: LGTM: TokenContext properly utilized in fixRemoteMagPath.

The implicit TokenContext parameter is correctly added and used by the readBytes() operation.

webknossos-datastore/app/com/scalableminds/webknossos/datastore/services/DSFullMeshService.scala (2)

65-70: LGTM: TokenContext propagation is properly implemented.

The addition of the implicit TokenContext parameter is consistent with the PR's objective and follows Scala best practices for context propagation.


96-97: Consider token expiration in recursive mesh chunk retrieval.

While the TokenContext propagation is correct, the recursive nature of getAllAdHocChunks could potentially lead to token expiration issues during long-running operations with many chunks.

Consider implementing a token freshness check or renewal mechanism for long-running recursive operations. Would you like me to propose a pattern for handling token expiration in recursive calls?
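
One possible shape for such a guard (entirely hypothetical: tc.isExpired and requestChunkAndRecurse are invented for illustration and do not exist in this PR):

// Check freshness once per recursion step so long-running mesh extraction
// fails fast with a clear message instead of deep inside a data request.
private def getAllAdHocChunks(request: AdHocMeshRequest, position: Vec3Int)(
    implicit tc: TokenContext): Fox[List[Array[Float]]] =
  if (tc.isExpired)
    Fox.failure("User token expired during ad-hoc mesh extraction")
  else
    requestChunkAndRecurse(request, position) // assumed wrapper around the existing body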

webknossos-datastore/app/com/scalableminds/webknossos/datastore/services/SegmentIndexFileService.scala (4)

3-3: LGTM: TokenContext import added correctly.

The import is properly placed and necessary for the token context functionality being added.


127-127: LGTM: TokenContext properly added to method signature.

The implicit TokenContext parameter is correctly added and propagated to downstream methods, maintaining the method's contract while enabling token-based access control.


136-142: LGTM: Method signature properly updated with TokenContext.

The implicit TokenContext parameter is correctly added, and the method signature reformatting improves readability while maintaining functionality.


165-165: LGTM: TokenContext properly added to method signature.

The implicit TokenContext parameter is correctly added and propagated to the underlying data retrieval method.

webknossos-datastore/app/com/scalableminds/webknossos/datastore/storage/DataVaultService.scala (3)

5-5: LGTM! Import added for the new configuration dependency.

The import statement is correctly placed and properly scoped.


32-32: LGTM! Constructor updated to include DataStoreConfig.

The dependency injection is properly maintained and the parameter ordering follows conventions.


53-53: Verify token handling implementation in HttpsDataVault.

The addition of config.Http.uri aligns with the PR objectives for implementing user token handling. However, we should verify the token context implementation in the HttpsDataVault class.

Let's verify the token handling implementation:

✅ Verification successful

Token handling implementation in HttpsDataVault is properly integrated.

The implementation correctly uses TokenContext for request authentication and properly propagates token context through the entire request chain, maintaining security throughout data access operations.

🏁 Scripts executed

The following scripts were executed for the analysis:

Script:

#!/bin/bash
# Search for token-related changes in HttpsDataVault
echo "Checking HttpsDataVault implementation..."
rg -p "class HttpsDataVault" -A 20
rg "TokenContext" --type scala

Length of output: 67152

webknossos-tracingstore/app/com/scalableminds/webknossos/tracingstore/tracings/editablemapping/EditableMappingService.scala (2)

Line range hint 372-384: LGTM! Token context is correctly propagated.

The implementation properly passes the implicit TokenContext to the AdHocMeshRequest, ensuring authentication context is available for mesh data requests.


195-196: Verify token context propagation in handleDataRequests.

The addition of the implicit TokenContext parameter is correct. However, we should verify that the token context is properly propagated through the binaryDataService.handleDataRequests call.

✅ Verification successful

Token context is properly propagated through the call chain

The handleDataRequests method in BinaryDataService correctly accepts an implicit TokenContext parameter, ensuring proper token propagation from EditableMappingService.volumeData through the entire request chain.

🏁 Scripts executed

The following scripts were executed for the analysis:

Script:

#!/bin/bash
# Search for handleDataRequests implementation to verify token context usage
ast-grep --pattern 'def handleDataRequests($_, $_) {
  $$$
}'

Length of output: 63


Script:

#!/bin/bash
# Search for handleDataRequests mentions
rg -l "handleDataRequests"

# Search for BinaryDataService class definition
ast-grep --pattern 'class BinaryDataService {
  $$$
}'

# Broader search for similar method names
rg "def.*[Dd]ata[Rr]equest"

Length of output: 3324


Script:

#!/bin/bash
# Get the implementation of handleDataRequests with surrounding context
rg -A 10 "def handleDataRequests\(requests: List\[DataServiceDataRequest\]\)" webknossos-datastore/app/com/scalableminds/webknossos/datastore/services/BinaryDataService.scala

Length of output: 841

webknossos-datastore/app/com/scalableminds/webknossos/datastore/explore/NgffExplorationUtils.scala (2)

3-3: LGTM!

The TokenContext import is appropriately placed and necessary for the new implicit parameter additions.


186-186: LGTM! Consistent token context integration.

The TokenContext parameter has been consistently added to all methods that interact with data. The ordering of implicit parameters (ExecutionContext before TokenContext) is maintained throughout the trait.

Verify token propagation in calling code.

Ensure that the TokenContext is properly propagated from the calling methods.

Also applies to: 195-196, 214-215, 234-234, 242-243, 308-309, 312-313

✅ Verification successful

Token context is properly propagated throughout the codebase

The verification shows that TokenContext is consistently passed through the call chain in both NgffV0.4 and NgffV0.5 implementations. All methods requiring TokenContext declare it as an implicit parameter and properly propagate it to dependent method calls.

🏁 Scripts executed

The following scripts were executed for the analysis:

Script:

#!/bin/bash
# Description: Check if calling methods provide TokenContext
# Look for calls to these methods without TokenContext

# Search for method calls
rg -A 5 "getShape|getAdditionalAxes|getNgffMultiscaleChannelCount|createLayer|layersFromNgffMultiscale|layersForLabel|exploreLabelLayers"

# Search for implicit TokenContext declarations in calling classes
ast-grep --pattern 'implicit val $tc: TokenContext'

Length of output: 30442

webknossos-datastore/app/com/scalableminds/webknossos/datastore/explore/ExploreLocalLayerService.scala (2)

3-3: LGTM!

The TokenContext import is correctly added and necessary for the token-based access control implementation.


55-55: Verify if passing None as token is intentional.

The change adds token context support but passes None as the token value. This might not align with the PR's objective of using user tokens for HttpsDataVault requests.

Let's verify other token usages in the codebase:

✅ Verification successful

Passing None as token is intentional for local layer exploration

The TokenContext(None) usage is correct here since ExploreLocalLayerService handles local filesystem operations that don't require authentication, unlike the external API endpoints which use tokens for access control.

🏁 Scripts executed

The following scripts were executed for the analysis:

Script:

#!/bin/bash
# Search for other TokenContext usages to understand the pattern
rg "TokenContext\(" -A 3 --type scala

Length of output: 77535

webknossos-datastore/app/com/scalableminds/webknossos/datastore/explore/PrecomputedExplorer.scala (3)

3-3: LGTM! Import added for TokenContext.

The import is correctly placed and aligns with the PR objective of implementing user token support.


23-24: LGTM! Method signature updated correctly to include TokenContext.

The addition of the implicit TokenContext parameter is implemented correctly using Scala's curried parameter pattern and aligns with the PR objective.


23-24: Verify callers and test coverage for TokenContext.

Please ensure that:

  1. All callers of explore have been updated to provide the required TokenContext
  2. Test coverage includes scenarios with different token contexts
✅ Verification successful

TokenContext implementation verified successfully

All callers of explore properly provide TokenContext, and test coverage exists in DataVaultTestSuite.scala with appropriate TokenContext scenarios.

🏁 Scripts executed

The following scripts were executed for the analysis:

Script:

#!/bin/bash
# Search for calls to explore method to verify they provide TokenContext
ast-grep --pattern 'explore($_, $_)' 

# Search for test files that might need updates
rg -l "PrecomputedExplorer.*explore" --type scala --glob "*Test*.scala" --glob "*Spec*.scala"

Length of output: 134


Script:

#!/bin/bash
# Search for classes extending PrecomputedExplorer
rg "extends PrecomputedExplorer" --type scala

# Search for explore method calls with TokenContext
rg "explore\([^)]*\)\s*\([^)]*\)" --type scala

# Search for TokenContext usage in test files
rg "TokenContext" --type scala --glob "*Test*.scala" --glob "*Spec*.scala"

# Search for explore method definition and implementation
ast-grep --pattern 'def explore'

Length of output: 1238

webknossos-datastore/app/com/scalableminds/webknossos/datastore/datareaders/precomputed/PrecomputedArray.scala (3)

3-3: LGTM: TokenContext import added correctly.

The import is properly placed with other utility imports and aligns with the PR objective of implementing token-based access control.


140-144: LGTM: Consistent token context propagation in shard handling methods.

The TokenContext is properly propagated through the shard handling chain, ensuring secure access throughout the data retrieval process. The caching mechanism remains efficient while adding security context.

Also applies to: 231-237


256-257: Verify parent class compatibility.

The override of getShardedChunkPathAndRange includes TokenContext. Please ensure that the parent class DatasetArray has been updated accordingly to maintain the contract.

✅ Verification successful

Parent class compatibility verified ✓

The parent class DatasetArray defines the method with the same signature, including the TokenContext parameter. The override is correct.

🏁 Scripts executed

The following scripts were executed for the analysis:

Script:

#!/bin/bash
# Search for the parent class definition and its method signature
ast-grep --pattern 'class DatasetArray {
  $$$
  def getShardedChunkPathAndRange($$$): $$$
  $$$
}'

Length of output: 101


Script:

#!/bin/bash
# First, find files containing DatasetArray
rg "DatasetArray" -l

# Then look for the actual definition and context
rg "(?:class|trait|abstract class) DatasetArray" -A 10

Length of output: 5054


Script:

#!/bin/bash
# Look for the method definition in DatasetArray.scala
rg "def getShardedChunkPathAndRange" webknossos-datastore/app/com/scalableminds/webknossos/datastore/datareaders/DatasetArray.scala -A 2

Length of output: 333

webknossos-datastore/app/com/scalableminds/webknossos/datastore/datareaders/n5/N5ChunkReader.scala (3)

3-3: LGTM: TokenContext import added.

The import is correctly placed and aligns with the PR's objective of implementing token-based access control.


Line range hint 24-41: Verify consistent token handling across N5-related components.

The changes look good, but we should verify that other N5-related components (like N5DataExtractor) properly handle token context when needed.

✅ Verification successful

Token context is properly handled across all N5 components

The verification confirms that TokenContext is consistently handled:

  • All explorer components properly accept and propagate TokenContext
  • N5Array.open correctly handles token context for file operations
  • The changes in N5ChunkReader align with the existing token handling pattern
🏁 Scripts executed

The following scripts were executed for the analysis:

Script:

#!/bin/bash
# Description: Check for other N5-related components that might need TokenContext

# Look for N5-related classes that might need token context
rg "class.*N5.*" -l | while read -r file; do
    echo "=== Checking $file ==="
    # Look for methods that might need TokenContext
    rg "def.*\(" "$file"
done

# Look for direct data access methods in N5 classes
ast-grep --pattern 'class $N5 {
  $$$
  def $method($params) {
    $$$
    readBytes($$$)
    $$$
  }
  $$$
}'

Length of output: 4895


Script:

#!/bin/bash
# Check token context handling in key N5 components

echo "=== Checking N5Array.open ==="
rg "def open.*VaultPath" -A 10 webknossos-datastore/app/com/scalableminds/webknossos/datastore/datareaders/n5/N5Array.scala

echo -e "\n=== Checking N5MultiscalesExplorer.explore ==="
rg "def explore.*VaultPath" -A 10 webknossos-datastore/app/com/scalableminds/webknossos/datastore/explore/N5MultiscalesExplorer.scala

echo -e "\n=== Checking N5ArrayExplorer.explore ==="
rg "def explore.*VaultPath" -A 10 webknossos-datastore/app/com/scalableminds/webknossos/datastore/explore/N5ArrayExplorer.scala

Length of output: 2656


25-26: Verify token propagation in the method implementation.

The method signature has been updated to include TokenContext as an implicit parameter, which is good. However, we should verify that this token is properly propagated to the readBytes call on VaultPath.

✅ Verification successful

Token propagation is correctly implemented

The TokenContext is properly propagated from the method signature to the VaultPath.readBytes call through Scala's implicit resolution mechanism.

🏁 Scripts executed

The following scripts were executed for the analysis:

Script:

#!/bin/bash
# Description: Verify that VaultPath.readBytes accepts TokenContext
ast-grep --pattern 'def readBytes(range: Option[NumericRange[Long]])(implicit $_: ExecutionContext, $_: TokenContext)'

Length of output: 120


Script:

#!/bin/bash
# Find VaultPath class definition
ast-grep --pattern 'class VaultPath'

# Find readBytes method definition in VaultPath
rg "def readBytes.*TokenContext" -A 5

# Find usage of readBytes in N5ChunkReader
rg "readBytes.*" "app/com/scalableminds/webknossos/datastore/datareaders/n5/N5ChunkReader.scala" -A 3

Length of output: 11353


Script:

#!/bin/bash
# Find N5ChunkReader file
fd "N5ChunkReader.scala"

# Search for readBytes usage in any N5ChunkReader file
rg -g "N5ChunkReader.scala" "readBytes" -B 2 -A 3

Length of output: 1163

webknossos-datastore/app/com/scalableminds/webknossos/datastore/datareaders/zarr/ZarrArray.scala (2)

3-3: LGTM! Import added for TokenContext.

The TokenContext import is correctly added and necessary for the new parameter.


24-25: Verify all callers have TokenContext in scope.

The addition of implicit TokenContext parameter is correct and aligns with the PR objectives. However, this change requires all callers to have a TokenContext in scope.

Let's verify all callers have been updated:

✅ Verification successful

✓ TokenContext is properly available in scope

The caller in DatasetArrayBucketProvider.openDatasetArray has TokenContext properly declared as an implicit parameter, ensuring it's available for the ZarrArray.open call.

🏁 Scripts executed

The following scripts were executed for the analysis:

Script:

#!/bin/bash
# Search for calls to ZarrArray.open to ensure they have TokenContext in scope
ast-grep --pattern 'ZarrArray.open($$$)'

Length of output: 1189


Script:

#!/bin/bash
# Get the context of the method containing ZarrArray.open call
rg -B 10 -A 5 "ZarrArray\.open\(" webknossos-datastore/app/com/scalableminds/webknossos/datastore/dataformats/DatasetArrayBucketProvider.scala

Length of output: 1191


Script:

#!/bin/bash
# Get the full class context including imports
rg -B 50 -A 5 "class DatasetArrayBucketProvider" webknossos-datastore/app/com/scalableminds/webknossos/datastore/dataformats/DatasetArrayBucketProvider.scala

Length of output: 1828


Script:

#!/bin/bash
# Get the method signature containing ZarrArray.open
rg -B 20 "ZarrArray\.open\(" --context-separator='---' webknossos-datastore/app/com/scalableminds/webknossos/datastore/dataformats/DatasetArrayBucketProvider.scala

Length of output: 1351

webknossos-datastore/app/com/scalableminds/webknossos/datastore/explore/NeuroglancerUriExplorer.scala (3)

3-3: LGTM: TokenContext import added appropriately.

The import is correctly placed and necessary for the TokenContext parameter additions.


36-37: LGTM: TokenContext properly propagated through the exploration chain.

The TokenContext parameter is correctly added and propagated to downstream method calls, maintaining the authentication context throughout the layer exploration process.


Line range hint 52-62: Verify TokenContext implementation in all explorer classes.

Let's verify that all explorer classes have properly implemented the TokenContext parameter in their explore methods.

✅ Verification successful

TokenContext implementation verified in all explorer classes

All explorer classes, including the ones referenced in the code snippet (N5ArrayExplorer, N5MultiscalesExplorer, PrecomputedExplorer, NgffV0_4Explorer), correctly implement the explore method with the TokenContext parameter. The implementation is consistent across the entire codebase.

🏁 Scripts executed

The following scripts were executed for the analysis:

Script:

#!/bin/bash
# Description: Check if all explorer classes have the TokenContext parameter in their explore methods

# Search for explorer classes that might be missing TokenContext
ast-grep --pattern 'class $EXPLORER extends RemoteLayerExplorer {
  $$$
  def explore(remotePath: VaultPath, credentialId: Option[String]): Fox[$$$]
  $$$
}'

echo "Classes above might be missing TokenContext parameter. They should match this pattern instead:"

ast-grep --pattern 'class $EXPLORER extends RemoteLayerExplorer {
  $$$
  def explore(remotePath: VaultPath, credentialId: Option[String])(implicit tc: TokenContext): Fox[$$$]
  $$$
}'

Length of output: 545


Script:

#!/bin/bash
# First, find the base class definition
echo "=== Base Explorer Class ==="
rg "trait.*Explorer" --type scala

echo -e "\n=== All Explorer Classes ==="
rg "class.*Explorer" --type scala

echo -e "\n=== Explore Method Implementations ==="
rg "def explore.*remotePath" --type scala -A 1

Length of output: 6516

webknossos-datastore/app/com/scalableminds/webknossos/datastore/datareaders/wkw/WKWArray.scala (3)

4-4: LGTM: TokenContext import added appropriately.

The import is correctly placed and necessary for the new token context functionality.


63-64: LGTM: TokenContext properly propagated in getShardedChunkPathAndRange.

The implicit TokenContext parameter is correctly added and properly propagated to dependent operations.


25-26: LGTM: TokenContext properly integrated into open method.

The implicit TokenContext parameter is correctly added and will be properly propagated to the underlying readBytes operation.

Let's verify that all readBytes calls in the codebase now accept TokenContext:

✅ Verification successful

Verified: TokenContext properly integrated with readBytes across the codebase

The readBytes method in VaultPath.scala correctly accepts TokenContext as an implicit parameter, and this pattern is consistently followed throughout all usages in the codebase, including WKW, Zarr, N5, and other array implementations.

🏁 Scripts executed

The following scripts were executed for the analysis:

Script:

#!/bin/bash
# Search for readBytes calls to ensure they all accept TokenContext
ast-grep --pattern 'readBytes($$$)' 

Length of output: 441


Script:

#!/bin/bash
# Search for readBytes method definitions and calls with more specific patterns
echo "=== Method definitions ==="
ast-grep --pattern 'def readBytes'

echo -e "\n=== Method calls with qualified names ==="
rg "readBytes\(" -A 2

echo -e "\n=== Looking for files containing readBytes ==="
fd -t f -x grep -l "readBytes"

Length of output: 13888

webknossos-datastore/app/com/scalableminds/webknossos/datastore/explore/RemoteLayerExplorer.scala (2)

5-5: LGTM! Import added for TokenContext.

The import is correctly placed and necessary for the new token context functionality.


23-24: Verify implementations of RemoteLayerExplorer trait.

The addition of implicit TokenContext parameter is a good security enhancement that aligns with the PR objectives. However, this change requires updates to all implementing classes.

Let's verify all implementations have been updated:

Additionally, verify all call sites have TokenContext in scope:

✅ Verification successful

All RemoteLayerExplorer implementations properly updated with TokenContext parameter

All eight implementations (ZarrArrayExplorer, WebknossosZarrExplorer, NgffV0_5Explorer, NgffV0_4Explorer, Zarr3ArrayExplorer, N5MultiscalesExplorer, PrecomputedExplorer, N5ArrayExplorer) have been correctly updated to include the implicit TokenContext parameter in their explore method signatures.

🏁 Scripts executed

The following scripts were executed for the analysis:

Script:

#!/bin/bash
# Find all implementations of RemoteLayerExplorer and verify they handle TokenContext
ast-grep --pattern 'class $_ extends $$$RemoteLayerExplorer$$$' 

Length of output: 65


Script:

#!/bin/bash
# Find all calls to explore method
ast-grep --pattern '$_.explore($_, $_)'

Length of output: 1820


Script:

#!/bin/bash
# Try different patterns to find implementations
echo "=== Finding implementations ==="
ast-grep --pattern 'class $_ extends RemoteLayerExplorer'
ast-grep --pattern 'class $_ extends $$$RemoteLayerExplorer'

echo -e "\n=== Checking for implicit TokenContext in calling classes ==="
ast-grep --pattern 'implicit val $tc: TokenContext'
ast-grep --pattern 'implicit def $tc: TokenContext'
ast-grep --pattern '(implicit $tc: TokenContext)'

echo -e "\n=== Checking imports of TokenContext ==="
rg "import.*TokenContext"

Length of output: 10450


Script:

#!/bin/bash
echo "=== Finding implementations in explore directory ==="
# List files in explore directory
fd . "webknossos-datastore/app/com/scalableminds/webknossos/datastore/explore" -t f -x cat {} \; | grep -A 5 "class.*Explorer.*extends"

echo -e "\n=== Finding trait definition ==="
cat webknossos-datastore/app/com/scalableminds/webknossos/datastore/explore/RemoteLayerExplorer.scala

Length of output: 4794

webknossos-datastore/app/com/scalableminds/webknossos/datastore/services/FindDataService.scala (5)

4-4: LGTM: TokenContext import added correctly.

The import is properly placed and necessary for the token context changes.


48-48: LGTM: Consistent token context propagation.

The implicit TokenContext is properly threaded through to the underlying getDataFor calls.


99-100: Verify token context in recursive calls.

While the TokenContext addition looks correct, ensure that the recursive nature of this method properly maintains the token context throughout all iterations.

Consider adding debug logging in development to verify token context preservation across recursive calls.
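
For instance (a sketch; userTokenOpt as the token accessor and the surrounding names are assumptions):

// Log token presence, never the token value itself, on each iteration so a
// dropped context is visible in development logs.
logger.debug(
  s"findPositionWithData iteration=$iteration token=${if (tc.userTokenOpt.isDefined) "present" else "absent"}")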


Line range hint 1-193: Consider comprehensive testing strategy.

The TokenContext changes touch multiple layers of data access. Consider:

  1. Adding integration tests that verify token propagation through the entire chain
  2. Testing error cases with invalid or missing tokens
  3. Testing concurrent access scenarios

Would you like me to help create a test plan or generate test cases?


26-27: Verify token context propagation to binaryDataService.

The TokenContext parameter addition looks correct, but we should verify that binaryDataService.handleDataRequest properly utilizes the implicit token context.

✅ Verification successful

Token context propagation is correctly implemented

The TokenContext parameter is properly propagated from getDataFor to binaryDataService.handleDataRequest through Scala's implicit parameter mechanism. The implementation follows the codebase's context propagation patterns.

🏁 Scripts executed

The following scripts were executed for the analysis:

Script:

#!/bin/bash
# Search for handleDataRequest method definition to verify token context usage
ast-grep --pattern 'def handleDataRequest($params) = {
  $$$
}'

Length of output: 65


Script:

#!/bin/bash
# Find files containing handleDataRequest
rg "handleDataRequest" -A 5

# Find callers of getDataFor
rg "getDataFor" -A 5

Length of output: 26434

webknossos-datastore/app/com/scalableminds/webknossos/datastore/explore/N5ArrayExplorer.scala (2)

3-3: LGTM! Import added for TokenContext.

The import is correctly placed and necessary for the token context implementation.


20-21: LGTM! Method signature updated to include TokenContext.

The explore method now correctly accepts an implicit TokenContext parameter, maintaining backward compatibility while enabling token-based access control.

Let's verify that all callers of this method can provide a TokenContext:

✅ Verification successful

✓ TokenContext is properly available to all callers

The calling method exploreLayer has the required implicit TokenContext parameter, which is automatically passed to N5ArrayExplorer.explore and other explorer implementations. The implementation maintains proper implicit scope.

🏁 Scripts executed

The following scripts were executed for the analysis:

Script:

#!/bin/bash
# Search for calls to explore method in N5ArrayExplorer
rg -l "N5ArrayExplorer.*explore" | while read -r file; do
  echo "=== $file ==="
  rg -A 2 "N5ArrayExplorer.*explore" "$file"
done

Length of output: 662


Script:

#!/bin/bash
# Get the full context of the calling method in NeuroglancerUriExplorer
ast-grep --pattern 'def $$$explore$$$' webknossos-datastore/app/com/scalableminds/webknossos/datastore/explore/NeuroglancerUriExplorer.scala

Length of output: 142


Script:

#!/bin/bash
# Get more context around the explore method call in NeuroglancerUriExplorer
rg -B 10 -A 10 "N5ArrayExplorer.*explore" webknossos-datastore/app/com/scalableminds/webknossos/datastore/explore/NeuroglancerUriExplorer.scala

Length of output: 1249

webknossos-datastore/app/com/scalableminds/webknossos/datastore/explore/Zarr3ArrayExplorer.scala (2)

3-3: LGTM! Import added for TokenContext.

The import is correctly placed and consistent with other explorer implementations.


20-21: LGTM! Method signature updated to include TokenContext.

The explore method signature change is consistent with other explorer implementations, maintaining a uniform approach to token context handling.

webknossos-datastore/app/com/scalableminds/webknossos/datastore/explore/ZarrArrayExplorer.scala (1)

3-3: LGTM! TokenContext integration is consistent.

The changes follow the same pattern as other explorer implementations, maintaining consistency across the codebase.

Also applies to: 20-21

webknossos-datastore/app/com/scalableminds/webknossos/datastore/explore/N5MultiscalesExplorer.scala (1)

3-3: LGTM! Import added for TokenContext.

The import is correctly placed and necessary for the token context implementation.

webknossos-datastore/app/com/scalableminds/webknossos/datastore/explore/NgffV0_4Explorer.scala (4)

3-3: LGTM: TokenContext import added appropriately.

The import is correctly placed and necessary for the token-based access control implementation.


89-89: LGTM: Token context properly integrated into header retrieval.

The token context is correctly added and will be used for authenticated access to Zarr headers.


123-123: LGTM: Token context properly integrated into shape retrieval.

The token context is correctly added and propagated to the underlying header retrieval.


21-22: LGTM: Token context properly integrated into explore method.

The token context is correctly added as an implicit parameter and will be properly propagated to downstream operations.

Let's verify that all callers of this method provide a TokenContext:

✅ Verification successful

TokenContext integration verified across all explorer implementations and usages

The explore method's TokenContext parameter is consistently implemented across all explorer classes and properly handled in both remote and local exploration scenarios. The parameter is correctly threaded through the entire exploration chain, with appropriate handling of authentication context for both remote (authenticated) and local (unauthenticated) access patterns.
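
A condensed sketch of the two call patterns (method names are hypothetical; the TokenContext(None) form matches the ExploreLocalLayerService diff further down):

// Remote exploration: the authenticated context from the HTTP request is in scope.
def exploreRemote(vaultPath: VaultPath, credentialId: Option[String])(
    implicit tc: TokenContext): Fox[List[(DataLayerWithMagLocators, VoxelSize)]] =
  explorer.explore(vaultPath, credentialId) // tc flows in implicitly

// Local exploration: no user request exists, so an empty context is passed explicitly.
def exploreLocal(vaultPath: VaultPath): Fox[List[(DataLayerWithMagLocators, VoxelSize)]] =
  explorer.explore(vaultPath, None)(TokenContext(None))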

🏁 Scripts executed

The following scripts were executed for the analysis:

Script:

#!/bin/bash
# Search for calls to explore method
ast-grep --pattern 'explore($$$remotePath$$$, $$$credentialId$$$)' 

Length of output: 68


Script:

#!/bin/bash
# Search for explore method calls with context
rg -B 2 -A 2 "explore.*VaultPath.*credentialId" 

Length of output: 7863


Script:

#!/bin/bash
# Search for where explorers are used
rg -B 2 -A 2 "new (Ngff|Precomputed|Neuroglancer|Zarr|N5).*Explorer"

Length of output: 8418

webknossos-datastore/app/com/scalableminds/webknossos/datastore/datareaders/zarr3/Zarr3Array.scala (5)

3-3: LGTM: TokenContext import added correctly

The import is appropriately placed and necessary for the token context functionality being added.


121-122: LGTM: Consistent TokenContext propagation

The TokenContext is correctly propagated to the internal method chain, maintaining the security context.


137-137: LGTM: TokenContext added to low-level read operation

The TokenContext is properly added to the method that performs the actual byte reading operation.


172-173: Verify cache behavior with different TokenContexts

The TokenContext is correctly added to the method signature, but consider how the parsedShardIndexCache behaves with different TokenContexts.

Let's verify the cache implementation:

✅ Verification successful

Cache implementation is correct and safe

The parsedShardIndexCache uses VaultPath as its key without incorporating TokenContext, which is appropriate since shard indices are immutable structural data. The TokenContext is correctly used for access control during reading but doesn't need to be part of the cache key.
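
A sketch of that design, assuming an AlfuCache-style getOrLoad API (signature and value type assumed):

// The key is the structural location only: shard index bytes are identical for every
// user, so TokenContext governs whether a read succeeds, not what is read.
private lazy val parsedShardIndexCache: AlfuCache[VaultPath, Array[Byte]] = AlfuCache()

private def getShardIndex(shardPath: VaultPath)(implicit ec: ExecutionContext,
                                                tc: TokenContext): Fox[Array[Byte]] =
  // On a miss, readShardIndex captures the current implicit tc, so access control still applies.
  parsedShardIndexCache.getOrLoad(shardPath, readShardIndex)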

🏁 Scripts executed

The following scripts were executed for the analysis:

Script:

#!/bin/bash
# Search for AlfuCache usage patterns to understand if it considers TokenContext in its key
rg "AlfuCache.*TokenContext" -A 5

Length of output: 35


Script:

#!/bin/bash
# Find parsedShardIndexCache definition and usage
rg "parsedShardIndexCache" -A 5

# Also search for AlfuCache usage patterns more broadly
rg "AlfuCache" -A 5

Length of output: 69578


25-26: LGTM: TokenContext properly added to open method

The TokenContext is correctly added as an implicit parameter, maintaining backward compatibility while enabling token-based access control.

Let's verify that all callers of this method have the TokenContext in scope:

✅ Verification successful

✓ TokenContext is properly threaded through the call chain

The TokenContext is correctly passed through all layers from the top-level load method down to Zarr3Array.open via implicit parameters, ensuring proper token-based access control throughout the execution path.

🏁 Scripts executed

The following scripts were executed for the analysis:

Script:

#!/bin/bash
# Search for calls to Zarr3Array.open to ensure TokenContext is in scope
ast-grep --pattern 'Zarr3Array.open($$$)' 

Length of output: 1204


Script:

#!/bin/bash
# Get context around the Zarr3Array.open call to verify TokenContext availability
rg -B 10 "Zarr3Array\.open" webknossos-datastore/app/com/scalableminds/webknossos/datastore/dataformats/DatasetArrayBucketProvider.scala

Length of output: 700


Script:

#!/bin/bash
# Get the full class definition to verify TokenContext threading
ast-grep --pattern 'class DatasetArrayBucketProvider {
  $$$
}'

Length of output: 65


Script:

#!/bin/bash
# Get the class definition and context using ripgrep
rg -B 5 -A 20 "class DatasetArrayBucketProvider" webknossos-datastore/app/com/scalableminds/webknossos/datastore/dataformats/DatasetArrayBucketProvider.scala

Length of output: 1570


Script:

#!/bin/bash
# Get the openDatasetArrayWithTimeLogging method implementation
rg -B 5 -A 15 "openDatasetArrayWithTimeLogging" webknossos-datastore/app/com/scalableminds/webknossos/datastore/dataformats/DatasetArrayBucketProvider.scala

Length of output: 1867


Script:

#!/bin/bash
# Get the complete openDatasetArray method implementation
rg -A 30 "private def openDatasetArray" webknossos-datastore/app/com/scalableminds/webknossos/datastore/dataformats/DatasetArrayBucketProvider.scala

Length of output: 2472

webknossos-datastore/app/com/scalableminds/webknossos/datastore/services/BinaryDataService.scala (3)

3-3: LGTM: TokenContext import added correctly.

The import is necessary for the token-based access control changes.


Line range hint 93-111: Verify TokenContext usage in bucket operations.

The method correctly accepts TokenContext, but let's verify it's properly used in bucket operations:

✅ Verification successful

TokenContext is properly used in bucket operations

The code correctly handles TokenContext through the entire bucket operation chain:

  • Properly defined in BucketProvider trait
  • Correctly implemented in concrete providers
  • Properly propagated through implicit parameters
🏁 Scripts executed

The following scripts were executed for the analysis:

Script:

#!/bin/bash
# Check if bucketProvider.load uses TokenContext
ast-grep --pattern 'load($$$)(implicit $$$: TokenContext)' 

# Verify BucketProvider interface includes TokenContext
ast-grep --pattern 'trait BucketProvider {
  $$$
  def load($$$)(implicit $$$: TokenContext)
  $$$
}'

Length of output: 164


Script:

#!/bin/bash
# Find BucketProvider trait and its implementations
rg -l "trait BucketProvider" 
rg -l "extends BucketProvider"

# Search for load method definitions with context
rg "def load.*TokenContext" -A 3

# Find TokenContext usage patterns in bucket-related files
rg "TokenContext" --type scala

Length of output: 81541


36-36: LGTM: TokenContext parameter added correctly.

The implicit TokenContext parameter is properly added and propagated to downstream handleBucketRequest calls, enabling token-based access control as intended.

Let's verify that TokenContext is properly propagated throughout the codebase:

✅ Verification successful

TokenContext propagation verified successfully

The TokenContext parameter is properly propagated throughout the entire call chain:

  • Controllers correctly extract and validate tokens using accessTokenService.validateAccessFromTokenContext
  • Service layer methods explicitly declare implicit tc: TokenContext parameters
  • All calls to handleDataRequest occur within contexts where TokenContext is available
🏁 Scripts executed

The following scripts were executed for the analysis:

Script:

#!/bin/bash
# Search for handleDataRequest calls to ensure they have TokenContext in scope
ast-grep --pattern 'handleDataRequest($$$)(implicit $$$: TokenContext)' 

Length of output: 73


Script:

#!/bin/bash
# Find all handleDataRequest calls with context
rg "handleDataRequest" -B 2 -A 2

# Also search for the method definition to understand its full signature
rg "def handleDataRequest" -B 2 -A 2

Length of output: 8753


Script:

#!/bin/bash
# Check if calling methods have TokenContext parameter
rg "def .+\(.*\)\(.*TokenContext" webknossos-datastore/app/com/scalableminds/webknossos/datastore/services/FindDataService.scala webknossos-datastore/app/com/scalableminds/webknossos/datastore/services/AdHocMeshService.scala webknossos-tracingstore/app/com/scalableminds/webknossos/tracingstore/tracings/editablemapping/EditableMappingService.scala webknossos-tracingstore/app/com/scalableminds/webknossos/tracingstore/tracings/volume/VolumeTracingService.scala webknossos-datastore/app/com/scalableminds/webknossos/datastore/controllers/BinaryDataController.scala webknossos-datastore/app/com/scalableminds/webknossos/datastore/controllers/ZarrStreamingController.scala -A 1

Length of output: 1686


Script:

#!/bin/bash
# Check controller endpoints for Action definitions
rg "def .*(Action|action).*" webknossos-datastore/app/com/scalableminds/webknossos/datastore/controllers/BinaryDataController.scala webknossos-datastore/app/com/scalableminds/webknossos/datastore/controllers/ZarrStreamingController.scala -B 1 -A 3

Length of output: 7112

webknossos-datastore/app/com/scalableminds/webknossos/datastore/explore/NgffV0_5Explorer.scala (3)

3-3: LGTM: TokenContext import added appropriately.

The import is correctly placed and necessary for the token-based access control changes.


99-105: Good practice: Proper ordering of multiple implicit parameters.

The method places ExecutionContext before TokenContext in its single implicit parameter list, matching the ordering used elsewhere in the codebase. This pattern should be followed for consistency when adding other implicit parameters.
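
In Scala 2 both values share a single implicit parameter list, so the convention looks like this (minimal sketch):

private def createLayer(remotePath: VaultPath, name: String)(
    implicit ec: ExecutionContext,
    tc: TokenContext): Fox[DataLayerWithMagLocators] = ???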


22-23: LGTM: Consistent TokenContext integration across all methods.

The addition of implicit tc: TokenContext is implemented consistently across all methods, maintaining backward compatibility through implicit parameters.

Verify token propagation in nested method calls.

While the token context is correctly added to all method signatures, ensure it's properly propagated through all nested method calls, especially in:

  • explore() → exploreLabelLayers() and layersFromNgffMultiscale()
  • createLayer() → elementClassFromMags() and parseChannelAttributes()

Also applies to: 49-50, 90-91, 99-105, 120-120, 126-127

✅ Verification successful

Token propagation verified: Implementation is correct and complete

The codebase shows consistent and proper TokenContext propagation:

  • All Fox-returning methods correctly include TokenContext
  • Nested method calls maintain token context through the entire chain
  • Implementation follows the established pattern across all related files
🏁 Scripts executed

The following scripts were executed for the analysis:

Script:

#!/bin/bash
# Search for method calls that might need TokenContext
ast-grep --pattern 'def $method($params) = {
  $$$
  $nestedMethod($args)
  $$$
}'

Length of output: 84


Script:

#!/bin/bash
# Search for method definitions with TokenContext and their implementations
rg -U "def \w+.*TokenContext.*=\s*\{[\s\S]+?\}" app/com/scalableminds/webknossos/datastore/explore/NgffV0_5Explorer.scala

# Also search for method calls that might use TokenContext
rg "implicit tc" app/com/scalableminds/webknossos/datastore/explore/NgffV0_5Explorer.scala

Length of output: 444


Script:

#!/bin/bash
# Find the correct file path
fd -e scala "NgffV0_5Explorer"

# Backup search for any Scala files with similar name
fd -e scala ".*[Nn]gff.*"

Length of output: 762


Script:

#!/bin/bash
# Check method implementations and TokenContext usage
rg -U "def.*TokenContext.*\{[\s\S]+?\}" webknossos-datastore/app/com/scalableminds/webknossos/datastore/explore/NgffV0_5Explorer.scala

# Look for method calls to related classes
rg "NgffExplorationUtils|NgffMetadata" webknossos-datastore/app/com/scalableminds/webknossos/datastore/explore/NgffV0_5Explorer.scala

Length of output: 386


Script:

#!/bin/bash
# Check for Fox operations and method calls
rg -U "Fox\[.*?\].*?=|def.*?=.*?Fox\[" webknossos-datastore/app/com/scalableminds/webknossos/datastore/explore/NgffV0_5Explorer.scala

# Check the trait implementation
cat webknossos-datastore/app/com/scalableminds/webknossos/datastore/explore/NgffExplorationUtils.scala

# Look for method calls with tc parameter
rg "\w+\(.*tc.*\)" webknossos-datastore/app/com/scalableminds/webknossos/datastore/explore/NgffV0_5Explorer.scala

Length of output: 16743

webknossos-datastore/app/com/scalableminds/webknossos/datastore/datavault/DataVault.scala (1)

3-3: LGTM! Verify implementations of DataVault trait.

The addition of TokenContext as an implicit parameter in the trait is a good approach for consistent access control across all vault implementations.

Let's verify that all implementations of DataVault have been updated:

Also applies to: 9-10

✅ Verification successful

✅ All DataVault implementations correctly include TokenContext

All implementations of the DataVault trait have been properly updated to include the TokenContext parameter in their readBytesAndEncoding method:

  • S3DataVault
  • HttpsDataVault
  • FileSystemDataVault
  • GoogleCloudDataVault
🏁 Scripts executed

The following scripts were executed for the analysis:

Script:

#!/bin/bash
# Search for all DataVault implementations and verify they include TokenContext
ast-grep --pattern 'class $_ extends DataVault {
  $$$
  def readBytesAndEncoding($_, $_)($_: ExecutionContext, $_: TokenContext)
  $$$
}'

Length of output: 140


Script:

#!/bin/bash
# Find all files containing "extends DataVault"
rg "extends DataVault" -l

# Show the context around readBytesAndEncoding in these files
echo "=== Method implementations ==="
rg "readBytesAndEncoding" -A 5 -B 2

Length of output: 8909

webknossos-datastore/app/com/scalableminds/webknossos/datastore/datareaders/zarr3/Zarr3ChunkReader.scala (1)

3-3: LGTM! TokenContext properly propagated.

The addition of TokenContext is clean and maintains the existing logic while ensuring proper access control propagation to the underlying readBytes operation.

Also applies to: 14-15

webknossos-datastore/app/com/scalableminds/webknossos/datastore/datavault/GoogleCloudDataVault.scala (2)

37-38: Consider integrating token-based access control with GCS authentication.

While GCS has its own authentication via ServiceAccountCredentials, the TokenContext could be used to implement additional access control layers (e.g., user-specific bucket/object permissions).

Consider implementing a permission check that combines both GCS authentication and token-based access control:

 override def readBytesAndEncoding(path: VaultPath, range: RangeSpecifier)(
     implicit ec: ExecutionContext,
     tc: TokenContext): Fox[(Array[Byte], Encoding.Value)] = {
 
   val objName = path.toUri.getPath.tail
   val blobId = BlobId.of(bucket, objName)
   for {
+    _ <- tc.validateAccess(bucket, objName) ?~> "Unauthorized access to GCS object"
     bytes <- try {
       // ... existing code ...

webknossos-tracingstore/app/com/scalableminds/webknossos/tracingstore/tracings/volume/VolumeTracingLayer.scala (4)

33-34: LGTM! Token context properly integrated.

The token context parameter has been correctly added to both load methods, maintaining consistency with the PR's objective of ensuring token context availability for data requests.

Also applies to: 50-51


webknossos-datastore/app/com/scalableminds/webknossos/datastore/datavault/HttpsDataVault.scala (7)

23-24: LGTM! Proper initialization of datastore authority.

The dataStoreHost parameter and dataStoreAuthority field are correctly implemented to enable same-datastore request detection.

Also applies to: 32-32


114-126: LGTM! Secure implementation of conditional token authentication.

The implementation correctly:

  • Uses token auth for same-datastore requests
  • Falls back to basic auth for external requests
  • Handles cases with no credentials gracefully
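
A condensed sketch of that decision (the header name and authority check are from this diff; identifiers such as credentialOpt, userTokenOpt, and HttpBasicAuthCredential are assumptions):

private def buildAuthenticatedRequest(uri: URI)(implicit tc: TokenContext): WSRequest = {
  val base = ws.url(uri.toString)
  if (uri.getAuthority == dataStoreAuthority)
    // Same datastore: forward the user's token so layer-level permissions apply.
    tc.userTokenOpt.map(token => base.withHttpHeaders("X-Auth-Token" -> token)).getOrElse(base)
  else
    // External host: fall back to stored basic-auth credentials, if any.
    credentialOpt match {
      case Some(c: HttpBasicAuthCredential) => base.withAuth(c.username, c.password, WSAuthScheme.BASIC)
      case _                                => base
    }
}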


114-126: Verify the token-based authentication logic.

The implementation correctly prioritizes token-based authentication for same-datastore requests while maintaining backward compatibility with basic auth. However, let's verify the behavior:

  1. When uri.getAuthority == dataStoreAuthority: Uses token auth with "X-Auth-Token"
  2. Otherwise: Falls back to basic auth if credentials are available
✅ Verification successful

Token-based authentication logic is correctly implemented

The implementation properly prioritizes token auth for same-datastore requests while maintaining basic auth fallback, and is consistently used across the codebase.

🏁 Scripts executed

The following scripts were executed for the analysis:

Script:

#!/bin/bash
# Description: Verify the token-based authentication implementation
# Test: Check for consistent usage of X-Auth-Token header across the codebase
# Expect: Only used for same-datastore requests

# Search for X-Auth-Token header usage
rg -A 5 'X-Auth-Token'

# Search for potential token validation or processing
ast-grep --pattern 'withHttpHeaders($_, $token)'

Length of output: 8175


35-36: LGTM: Token context properly propagated.

The token context is consistently added to all relevant method signatures, ensuring proper authentication context throughout the request chain.

Also applies to: 70-71, 78-79, 86-86

webknossos-datastore/app/com/scalableminds/webknossos/datastore/dataformats/DatasetArrayBucketProvider.scala (3)

37-37: LGTM! Consistent token context propagation.

Token context is properly propagated through the call chain from load through to openDatasetArray, maintaining the authentication context for downstream operations.

Also applies to: 50-51, 64-65


webknossos-datastore/app/com/scalableminds/webknossos/datastore/services/AdHocMeshService.scala (7)

34-34: LGTM! Proper token context handling in request and actor.

The token context is correctly:

  • Added to the request case class
  • Extracted and passed to the service by the actor

Also applies to: 50-50
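
Sketched shape of the model and the actor hand-off (request fields besides tokenContext are placeholders):

import akka.actor.Actor

case class AdHocMeshRequest(dataLayer: DataLayer, cuboid: Cuboid, tokenContext: TokenContext)

class AdHocMeshActor(service: AdHocMeshService) extends Actor {
  def receive: Receive = {
    case request: AdHocMeshRequest =>
      // Actor messages cannot carry implicit parameters, so the context travels as a
      // field and is re-applied explicitly at the service boundary.
      sender() ! service.requestAdHocMesh(request)(request.tokenContext)
  }
}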


74-74: LGTM! Token context properly propagated through mesh generation.

The token context is correctly propagated from requestAdHocMesh through to generateAdHocMeshImpl, ensuring authentication context is maintained throughout the mesh generation process.

Also applies to: 93-93


webknossos-datastore/app/com/scalableminds/webknossos/datastore/explore/ExploreRemoteLayerService.scala (1)

56-57: LGTM! TokenContext is properly threaded through the service.

The TokenContext parameter has been consistently added to all relevant method signatures, ensuring proper token context propagation throughout the service chain.

Also applies to: 80-81, 122-123, 146-147, 187-188

webknossos-datastore/app/com/scalableminds/webknossos/datastore/datavault/S3DataVault.scala (1)

106-107: LGTM! TokenContext properly added to S3DataVault implementation.

The TokenContext parameter has been correctly added to the readBytesAndEncoding method, maintaining consistency with the DataVault trait contract.

test/backend/DataVaultTestSuite.scala (3)

30-30: LGTM! TokenContext properly initialized for tests.

The TokenContext is initialized with None, which is appropriate for test scenarios.


41-45: LGTM! Test cases properly updated with TokenContext.

All test cases have been consistently updated to include the TokenContext parameter in readBytes calls, ensuring proper test coverage of the new token context functionality.

Also applies to: 59-61, 71-71, 79-79, 91-91, 105-107, 123-126, 139-140, 155-155, 167-167, 179-179


219-220: LGTM! MockDataVault properly updated.

The MockDataVault implementation has been updated to include TokenContext in its method signature, maintaining consistency with the real implementation.
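
A sketch of the mock's updated shape (payload abbreviated):

class MockDataVault extends DataVault {
  override def readBytesAndEncoding(path: VaultPath, range: RangeSpecifier)(
      implicit ec: ExecutionContext,
      tc: TokenContext): Fox[(Array[Byte], Encoding.Value)] =
    // The tests pass TokenContext(None); the mock ignores it and serves canned bytes.
    Fox.successful((Array[Byte](1, 2, 3), Encoding.identity))
}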

webknossos-datastore/app/com/scalableminds/webknossos/datastore/datareaders/DatasetArray.scala (1)

68-72: LGTM! TokenContext properly integrated in DatasetArray.

The TokenContext parameter has been consistently added to all data reading methods, ensuring proper token context propagation throughout the data reading chain.

Also applies to: 119-120, 153-154, 191-192, 199-200, 205-206

webknossos-datastore/app/com/scalableminds/webknossos/datastore/controllers/BinaryDataController.scala (8)

4-4: LGTM! Added TokenContext import.

The import is correctly added and will be used throughout the file for token-based access control.


98-98: LGTM! Consistent parameter renaming.

The parameter renaming from implicit request to implicit r is consistent across all action methods. This change aligns with Scala naming conventions for shorter parameter names in action methods.

Also applies to: 144-144, 175-175


256-260: LGTM! Added token context to mesh request.

The tokenContextForRequest is correctly added to the AdHocMeshRequest construction, ensuring proper token propagation for mesh generation.


313-313: LGTM! Enhanced access control in data requests.

The addition of implicit TokenContext to requestData method ensures proper access control when handling data requests.

webknossos-datastore/app/com/scalableminds/webknossos/datastore/controllers/ZarrStreamingController.scala (3)

264-264: LGTM! Added TokenContext to rawZarrCube method.

The TokenContext parameter is correctly added as an implicit parameter, ensuring proper token propagation for Zarr cube requests.


webknossos-tracingstore/app/com/scalableminds/webknossos/tracingstore/tracings/volume/VolumeTracingService.scala (2)

626-626: LGTM! Added token context to mesh request.

The TokenContext is correctly added to the AdHocMeshRequest construction, ensuring proper token propagation for volume tracing mesh requests.


webknossos-datastore/app/com/scalableminds/webknossos/datastore/dataformats/BucketProvider.scala (5)

3-3: LGTM! Added TokenContext import.

The import is correctly added and will be used by the trait's methods.


10-10: LGTM! Added TokenContext to load method.

The TokenContext parameter is correctly added as an implicit parameter to the load method, ensuring proper token propagation for bucket loading operations.


3-10: LGTM! Important interface change for access control.

The addition of TokenContext as an implicit parameter to the load method is a crucial change that ensures proper access control context is available throughout the data loading chain.
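
The trait now reads roughly as follows (a sketch; the parameter name matches the load override quoted further down in this review):

trait BucketProvider {
  def load(readInstruction: DataReadInstruction)(implicit ec: ExecutionContext,
                                                 tc: TokenContext): Fox[Array[Byte]]
}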

webknossos-datastore/app/com/scalableminds/webknossos/datastore/datavault/VaultPath.scala (2)

22-23: LGTM! The TokenContext integration looks good.

The addition of TokenContext to these methods is well-implemented and aligns with the PR's objective of enhancing token-based access control. The context is correctly propagated through the call chain.

Also applies to: 29-29, 98-98


webknossos-datastore/app/com/scalableminds/webknossos/datastore/datavault/FileSystemDataVault.scala (2)

20-21: Verify token-based access control for file system operations.

While the TokenContext parameter has been added, it's not being used to validate access to the file system. This could potentially allow unauthorized access to files.


webknossos-tracingstore/app/com/scalableminds/webknossos/tracingstore/tracings/editablemapping/EditableMappingLayer.scala (2)

31-32: LGTM! Excellent implementation of TokenContext.

The implementation demonstrates proper token context handling:

  • Correctly propagates the context to service calls
  • Uses the layer's token context appropriately
  • Shows good practices in maintaining the authentication context throughout the operation

@@ -130,7 +131,7 @@ class ExploreLocalLayerService @Inject()(dataVaultService: DataVaultService)
       fullPath <- Fox.successful(path.resolve(layer))
       remoteSourceDescriptor <- Fox.successful(RemoteSourceDescriptor(fullPath.toUri, None))
       vaultPath <- dataVaultService.getVaultPath(remoteSourceDescriptor) ?~> "dataVault.setup.failed"
-      layersWithVoxelSizes <- explorer.explore(vaultPath, None)
+      layersWithVoxelSizes <- explorer.explore(vaultPath, None)(TokenContext(None))

🛠️ Refactor suggestion

Consider propagating the user token instead of using None.

Similar to the ZarrArrayExplorer change, this modification passes None as the token value. Consider:

  1. Adding an implicit TokenContext parameter to the containing methods to propagate the actual user token.
  2. Using the propagated token instead of None.

Here's a suggested approach:

  private def exploreLocalLayer(
      makeLayersRelative: List[DataLayerWithMagLocators] => List[DataLayerWithMagLocators],
      explorer: RemoteLayerExplorer)(path: Path, dataSourceId: DataSourceId, layerDirectory: String)(
-     implicit ec: ExecutionContext): Fox[DataSourceWithMagLocators] =
+     implicit ec: ExecutionContext, tokenContext: TokenContext): Fox[DataSourceWithMagLocators] =

Then update the explore call to use the propagated token:

-     layersWithVoxelSizes <- explorer.explore(vaultPath, None)(TokenContext(None))
+     layersWithVoxelSizes <- explorer.explore(vaultPath, None)(tokenContext)

Committable suggestion skipped: line range outside the PR's diff.

Comment on lines +20 to +21
implicit ec: ExecutionContext,
tc: TokenContext): Fox[(Array[Byte], Encoding.Value)] =

💡 Codebase verification

⚠️ Potential issue

The review comment correctly identifies a security concern in FileSystemDataVault.

While TokenContext is properly passed to the FileSystemDataVault methods, it's not being used to validate access permissions before performing file system operations. This is inconsistent with the codebase's access control patterns, where TokenContext is actively used for validation (e.g., in AccessTokenService).

The suggested fix to implement validateAccess is appropriate and aligns with the codebase's security patterns:

  • Validates access before file operations
  • Uses TokenContext consistently with other services
  • Prevents unauthorized access to file system resources
🔗 Analysis chain

Verify token-based access control for file system operations.

While the TokenContext parameter has been added, it's not being utilized to validate access permissions for file system operations. This could potentially allow unauthorized access to files.

Consider implementing access control checks using the token context before performing file system operations. For example:

 override def readBytesAndEncoding(path: VaultPath, range: RangeSpecifier)(
     implicit ec: ExecutionContext,
     tc: TokenContext): Fox[(Array[Byte], Encoding.Value)] =
   for {
+    _ <- tc.validateAccess(path) ?~> "Unauthorized access to file system path"
     localPath <- vaultPathToLocalPath(path)
     bytes <- readBytesLocal(localPath, range)
   } yield (bytes, Encoding.identity)
🏁 Scripts executed

The following scripts were executed for the analysis:

Error: No shell scripts found


Script:

#!/bin/bash
# Find and examine FileSystemDataVault implementation
echo "=== FileSystemDataVault Implementation ==="
ast-grep --pattern 'class FileSystemDataVault {
  $$$
}'

echo -e "\n=== TokenContext Implementation ==="
ast-grep --pattern 'class TokenContext {
  $$$
}'

echo -e "\n=== TokenContext Usage Patterns ==="
rg "TokenContext" -A 5 -B 5

echo -e "\n=== Similar Validation Patterns ==="
rg "validateAccess" -A 3 -B 3

Length of output: 70723


Script:

#!/bin/bash
# Search for FileSystemDataVault implementation
echo "=== FileSystemDataVault Implementation ==="
rg -l "FileSystemDataVault" | xargs cat

echo -e "\n=== Access Control Patterns in Vaults ==="
rg "extends.*Vault" -A 10 -B 10

echo -e "\n=== Security Check Methods ==="
rg "validate.*access|check.*permission|authorize" -A 5 -B 5

Length of output: 46163

Comment on lines +31 to +32
override def load(readInstruction: DataReadInstruction)(implicit ec: ExecutionContext,
tc: TokenContext): Fox[Array[Byte]] = {

💡 Codebase verification

Remove unused implicit TokenContext parameter

The load method declares an implicit tc: TokenContext parameter but never uses it. Instead, the class consistently uses the explicit tokenContext field that's passed through the constructor. Remove the unused implicit parameter to improve code clarity.

🔗 Analysis chain

Verify token context consistency.

The class uses both an implicit TokenContext parameter and an explicit tokenContext field from the layer. Ensure that these contexts are consistent and that the correct context is being used for each operation.

Run the following script to verify token context usage:

🏁 Scripts executed

The following scripts were executed for the analysis:

Script:

#!/bin/bash
# Description: Check for potential token context inconsistencies
# Look for places where both implicit and explicit token contexts are used

ast-grep --pattern 'implicit ec: ExecutionContext, tc: TokenContext'

# Check for explicit token context usage
rg -A 3 'tokenContext\.' 

Length of output: 5166


Script:

#!/bin/bash
# Get the complete class definition and implementation
ast-grep --pattern 'class EditableMappingLayer {
  $$$
}'

# Find all references to tokenContext and tc within the class
rg -A 5 -B 5 '(tokenContext|[^a-zA-Z]tc\.)' webknossos-tracingstore/app/com/scalableminds/webknossos/tracingstore/tracings/editablemapping/EditableMappingLayer.scala

Length of output: 3102


@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 0

🧹 Nitpick comments (1)
CHANGELOG.unreleased.md (1)

21-21: Grammar Correction: Use Proper Genitive Form

The changelog entry currently reads "another layers source," which is grammatically incorrect. For better readability and clarity, consider changing it to "another layer's source" to correctly indicate ownership.

Consider applying this diff:

- When using a zarr link to a data layer as another layers source, the user's token is used to access the data.
+ When using a zarr link to a data layer as another layer's source, the user's token is used to access the data.
🧰 Tools
🪛 LanguageTool

[uncategorized] ~21-~21: It seems likely that a singular genitive (’s) apostrophe is missing.
Context: ... a zarr link to a data layer as another layers source, the user's token is used to acc...

(AI_HYDRA_LEO_APOSTROPHE_S_XS)

📜 Review details

Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between d31837e and 4d9994a.

📒 Files selected for processing (1)
  • CHANGELOG.unreleased.md (1 hunks)

⏰ Context from checks skipped due to timeout of 90000ms (1)
  • GitHub Check: circleci_build

Development

Successfully merging this pull request may close these issues.

Pass user token when self-streaming from Webknossos
1 participant