Create Pipeline Cornerstone and Refactor Store Implementation #9451

@tobiu

Description

Goal

Establish Neo.data.Pipeline as the central orchestrator for data transformation and remote execution, and remove the brittle remote instantiation logic from Neo.data.Store.

Context

Currently, Neo.data.Store attempts to directly manage the cross-worker instantiation of its Normalizer (via afterSetNormalizer). This is an abstraction leak; a Store should not be hardcoded to Neo.worker.Data or manage remote IDs.

We need a dedicated Neo.data.Pipeline class. The Store will aggregate a Pipeline using ClassSystemUtil.beforeSetInstance(). The Pipeline takes over the responsibility of owning the Connection, Parser, and Normalizer, and importantly, orchestrating whether they run locally in the App Worker or remotely in the Data Worker.
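The intended ownership and delegation can be sketched without the framework. This is a minimal, framework-free illustration, not the real implementation: the stub `Connection`, `Parser`, and `Normalizer` classes and the sample payload are stand-ins for the actual Neo.data classes, and the real Pipeline would wire its sub-components through the Neo config system rather than a plain constructor.

```javascript
// Framework-free sketch: the Store only knows its Pipeline; the Pipeline
// owns and chains Connection -> Parser -> Normalizer.
class Connection {
    async fetch(url) {
        // Stubbed transport; a real Connection would perform the actual request
        return '[{"id": 1, "name": " Ada "}]';
    }
}

class Parser {
    parse(raw) {
        return JSON.parse(raw);
    }
}

class Normalizer {
    normalize(records) {
        // Example normalization step: trim string fields
        return records.map(r => ({...r, name: r.name.trim()}));
    }
}

class Pipeline {
    constructor({connection, parser, normalizer}) {
        Object.assign(this, {connection, parser, normalizer});
    }

    // read() chains the three stages, so the Store never touches them directly
    async read(url) {
        const raw = await this.connection.fetch(url);
        return this.normalizer.normalize(this.parser.parse(raw));
    }
}

class Store {
    constructor(pipeline) {
        this.pipeline = pipeline;
    }

    load(url) {
        return this.pipeline.read(url); // pure delegation, no worker logic here
    }
}
```

The key point of the refactor is visible even in this toy version: the Store no longer knows which transport, parser, or normalizer is in play, or where they execute.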

Acceptance Criteria

  • Create src/data/Pipeline.mjs extending Neo.core.Base.
  • Give Pipeline the following reactive configs: workerExecution (default 'app'), connection_, parser_, and normalizer_.
  • Use ClassSystemUtil.beforeSetInstance inside the Pipeline to instantiate these sub-components.
  • If workerExecution: 'data', the Pipeline should use Neo.worker.Data.createInstance to spawn the actual Connection, Parser, and Normalizer instances exclusively inside the Data Worker (meaning the App Worker Pipeline only holds the configs, not the instances), storing the remote ID.
  • Refactor Neo.data.Store: Remove afterSetNormalizer and afterSetParser. Introduce a pipeline_ config that uses ClassSystemUtil.beforeSetInstance to create the Pipeline.
  • The Store's load() method should delegate to this.pipeline.read().
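The `workerExecution` branch in the criteria above can be sketched as follows. This is a hypothetical stand-in, not the real code: `createRemoteInstance()` and `remoteRegistry` simulate `Neo.worker.Data.createInstance()`, which per this issue returns a remote ID while the actual instance lives only inside the Data Worker.

```javascript
// Simulated Data Worker side: instances "created remotely" land here,
// and the caller only ever receives an id.
const remoteRegistry = new Map();
let nextRemoteId = 1;

function createRemoteInstance(className, config) {
    const id = `remote-${nextRemoteId++}`;
    remoteRegistry.set(id, {className, config});
    return id;
}

class Pipeline {
    constructor({workerExecution = 'app', connection, parser, normalizer}) {
        this.workerExecution = workerExecution;

        if (workerExecution === 'data') {
            // App Worker side keeps only configs and remote ids; the
            // instances exist exclusively inside the (simulated) Data Worker.
            this.connectionId = createRemoteInstance('Neo.data.Connection', connection);
            this.parserId     = createRemoteInstance('Neo.data.Parser',     parser);
            this.normalizerId = createRemoteInstance('Neo.data.Normalizer', normalizer);
        } else {
            // Local path: the Pipeline holds the instances itself. A plain
            // copy stands in for ClassSystemUtil.beforeSetInstance() here.
            this.connection = {...connection};
            this.parser     = {...parser};
            this.normalizer = {...normalizer};
        }
    }
}
```

With `workerExecution: 'data'`, inspecting the Pipeline shows only `connectionId` et al., never live instances, which is exactly the invariant the acceptance criteria ask for.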

Metadata

Labels

  • ai
  • architecture (Architecture related issues)
  • core (Core framework functionality)
  • enhancement (New feature or request)
