
Feature/element data classes+id+dataclasses #611

Open
FBumann wants to merge 29 commits into feature/element-data-classes+id from feature/element-data-classes+id+dataclasses

Conversation


FBumann (Member) commented Feb 15, 2026

Description

Brief description of the changes in this PR.

Type of Change

  • Bug fix
  • New feature
  • Documentation update
  • Code refactoring

Related Issues

Closes #(issue number)

Testing

  • I have tested my changes
  • Existing tests still pass

Checklist

  • My code follows the project style
  • I have updated documentation if needed
  • I have added tests for new functionality (if applicable)

Summary by CodeRabbit

  • New Features

    • Added coordinate-based data alignment and effect normalization support for batched operations.
  • Bug Fixes & Improvements

    • Enhanced Flow constructor to use explicit keyword arguments (bus, flow_id) for better clarity and flexibility (see the sketch after this list).
    • Improved serialization infrastructure with standalone utilities and better circular-dependency handling.
    • Streamlined element initialization with modern dataclass patterns.
  • Documentation

    • Updated all examples and notebooks to reflect the new Flow API usage.
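
For readers scanning the summary, a minimal sketch of the keyword-based Flow API, mirroring the updated examples further down this page (component names and sizes are illustrative):

import flixopt as fx

# New keyword-based API: bus is an explicit keyword; flow_id is optional
# and defaults to the bus name when omitted.
boiler = fx.LinearConverter(
    'Boiler',
    inputs=[fx.Flow(bus='Gas', size=500)],
    outputs=[fx.Flow(bus='Heat', size=450)],
    conversion_factors=[{'Gas': 1, 'Heat': 0.9}],
)

# Explicit flow_id is useful when two flows of one component share a bus:
battery = fx.Storage(
    'battery',
    charging=fx.Flow(bus='electricity', flow_id='charge', size=50),
    discharging=fx.Flow(bus='electricity', flow_id='discharge', size=50),
    capacity_in_flow_hours=100,
)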

FBumann and others added 15 commits February 15, 2026 12:10
  - Piece: 2 init fields (start, end) + 1 internal field (has_time_dim with init=False)
  - Piecewise: 1 field (pieces) + property has_time_dim with propagation setter
  - PiecewiseConversion: 1 field (piecewises) + property has_time_dim with propagation setter
  - PiecewiseEffects: 2 fields (piecewise_origin, piecewise_shares) + property has_time_dim with propagation setter
  - InvestParameters: 9 fields with defaults + __post_init__ for None→{} and None→epsilon normalization
  - StatusParameters: 11 fields with defaults + __post_init__ for None→{} normalization

  All classes:
  - Use @dataclass(eq=False) to avoid issues with numpy/DataArray equality
  - Keep Interface inheritance (serialization still works)
  - Keep transform_data() and link_to_flow_system() (to be removed in later phases when *Data classes handle alignment)

⏺ Task #11 complete. 389 test_math tests pass + 136 IO tests pass. The leaf interface classes (Piece, Piecewise, PiecewiseConversion, PiecewiseEffects,
  InvestParameters, StatusParameters) are now @dataclass with auto-generated __init__ and __repr__, storing raw types in their fields.
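
A minimal sketch of the dataclass pattern this commit describes; the field names and epsilon value are illustrative, not the actual InvestParameters definition:

from dataclasses import dataclass


@dataclass(eq=False)  # eq=False sidesteps ambiguous equality on numpy/DataArray fields
class InvestParametersSketch:
    minimum_size: float | None = None
    effects_of_investment_per_size: dict | None = None

    def __post_init__(self):
        # None -> {} so downstream code can iterate without guards
        if self.effects_of_investment_per_size is None:
            self.effects_of_investment_per_size = {}
        # None -> epsilon so the lower bound is strictly positive (epsilon value assumed)
        if self.minimum_size is None:
            self.minimum_size = 1e-5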
StatusData, InvestmentData, and ConvertersData now handle their own
effect alignment via coords/normalize_effects params, removing the need
for transform_data()/link_to_flow_system() calls on StatusParameters,
InvestParameters, and PiecewiseConversion from Element classes.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
…terfaces

Remove now-unused methods from Piece, Piecewise, PiecewiseConversion,
PiecewiseEffects, InvestParameters, and StatusParameters since alignment
is now handled by *Data classes in batched.py. Also remove the
has_time_dim propagation mechanism that was only used by transform_data.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
Flow attributes now stay as raw user data after connect_and_transform().
Alignment to model coordinates happens lazily in FlowsData at model-build
time via a new _align() helper. Also extends align_to_coords to handle
Python lists (for IO roundtrip) and fixes serialization of unnamed DataArrays.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
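
A rough sketch of the lazy-alignment idea from this commit: align_to_coords is the real helper name, but its signature and body here are assumptions for illustration and handle only a single time dimension.

import numpy as np
import pandas as pd
import xarray as xr


def align_to_coords_sketch(data, coords: dict[str, pd.Index], name: str) -> xr.DataArray:
    """Broadcast raw user input (scalar, list, ndarray, or DataArray) onto model coordinates."""
    if isinstance(data, xr.DataArray):
        return data.rename(name)
    if isinstance(data, (list, tuple)):  # plain Python lists, e.g. after an IO roundtrip
        data = np.asarray(data)
    if np.isscalar(data):
        data = np.full(len(coords['time']), data)
    return xr.DataArray(data, coords={'time': coords['time']}, dims=['time'], name=name)


# Element attributes stay raw; alignment happens lazily at model-build time:
coords = {'time': pd.date_range('2026-01-01', periods=4, freq='h')}
bound = align_to_coords_sketch(0.8, coords, name='Boiler(Q_th)|relative_maximum')
print(bound.dims, bound.values)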
Remove Interface inheritance from Piece, Piecewise, PiecewiseConversion,
PiecewiseEffects, InvestParameters, StatusParameters. Add dataclass
serialization support in _extract_dataarrays_recursive for plain
dataclasses. Add standalone _has_value function and __repr__ methods.
Convert Piece values to DataArray in __post_init__.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
Effect attributes are now aligned lazily in EffectsData via
align_to_coords/align_effects_to_coords instead of eagerly in
Effect.transform_data(). Move validate_config checks to
EffectsData.validate(). Effect.transform_data() is now a no-op.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
Bus.imbalance_penalty_per_flow_hour is now aligned lazily in BusesData
via align_to_coords. Move validate_config checks to BusesData.validate().
Bus.transform_data() is now a no-op.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
…rsData

Conversion factors are now aligned lazily in ConvertersData via
align_to_coords. Move validate_config checks to ConvertersData.validate().
LinearConverter.transform_data() now only calls super for status propagation.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
Storage attributes are now aligned lazily in StoragesData via
align_to_coords. Move validate_config checks to StoragesData.validate().
Storage.transform_data() now only calls super for status propagation.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
…nsData

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
Replace the element.transform_data() loop in connect_and_transform()
with explicit _propagate_all_status_parameters(). Remove transform_data()
from all element classes and Interface base. Remove unused _fit_coords
and _fit_effect_coords from Interface. All data alignment now happens
lazily in *Data classes.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
Move all validation checks from element validate_config() methods into
the corresponding *Data.validate() in batched.py. Remove validate_config()
and _plausibility_checks() from Flow, Component, EffectsCollection, and
the abstract _plausibility_checks from Element base class.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
…th-based naming

Replace Interface's ~500 lines of IO infrastructure with standalone functions
(create_reference_structure, resolve_reference_structure, etc.) that use
deterministic path-based DataArray keys (e.g., components.Boiler.size).
Interface methods become thin wrappers. FlowSystemDatasetIO uses standalone
functions via lazy imports. This enables future removal of Interface inheritance.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
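
A rough sketch of the path-based reference idea: create_reference_structure is the real name from the commit, but the body below is an illustration (the ':::' placeholder mirrors the '::: references' mentioned in a later commit), not the actual implementation.

import dataclasses
import xarray as xr


def create_reference_structure_sketch(obj, path: str):
    """Walk a nested object, pull out DataArrays under deterministic path keys
    such as 'components.Boiler.size', and return (json-able structure, {path: DataArray})."""
    arrays: dict[str, xr.DataArray] = {}
    if isinstance(obj, xr.DataArray):
        arrays[path] = obj
        return f':::{path}', arrays  # placeholder reference, resolved on load
    if dataclasses.is_dataclass(obj) and not isinstance(obj, type):
        structure = {'__class__': type(obj).__name__}
        for f in dataclasses.fields(obj):
            value = getattr(obj, f.name)
            if value is None:
                continue
            sub, sub_arrays = create_reference_structure_sketch(value, f'{path}.{f.name}')
            structure[f.name] = sub
            arrays.update(sub_arrays)
        return structure, arrays
    if isinstance(obj, dict):
        out = {}
        for k, v in obj.items():
            sub, sub_arrays = create_reference_structure_sketch(v, f'{path}.{k}')
            out[k] = sub
            arrays.update(sub_arrays)
        return out, arrays
    return obj, arrays  # plain scalars/strings stay inline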
Move handle_deprecated_kwarg() and validate_kwargs() to module-level
standalone functions. Remove dead wrapper methods (_create_reference_structure,
_resolve_dataarray_reference, _resolve_reference_structure, _has_value) from
Interface. Remove Interface inheritance from Carrier since it only needs
@register_class_for_io. Update FlowSystem to build datasets directly via its
own _create_reference_structure instead of super().to_dataset().

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

coderabbitai bot commented Feb 15, 2026

Important

Review skipped

Auto reviews are disabled on base/target branches other than the default branch.

Please check the settings in the CodeRabbit UI or the .coderabbit.yaml file in this repository. To trigger a single review, invoke the @coderabbitai review command.

You can disable this status message by setting the reviews.review_status to false in the CodeRabbit configuration file.

📝 Walkthrough

This PR redesigns the Flow constructor API from positional to keyword-based arguments, converts multiple core classes to dataclasses with post-initialization validation, refactors serialization into standalone utilities, introduces coordinate-based alignment for batched data, and updates all documentation and tests to reflect these breaking changes.

Changes

  • Flow API Redesign (flixopt/elements.py, flixopt/interface.py, flixopt/carrier.py):
    Converted core element classes to dataclasses with post-init validation. Flow now requires explicit bus and flow_id keyword arguments instead of positional naming. Bus, Component, and related classes restructured with IO-exclusion handling and deprecation pathways for legacy parameters.
  • Serialization Infrastructure Overhaul (flixopt/structure.py, flixopt/core.py, flixopt/io.py):
    Introduced standalone serialization utilities (align_to_coords, align_effects_to_coords, create_reference_structure, resolve_reference_structure, valid_id) to handle DataArray alignment and path-based reference extraction. Replaced embedded Interface serialization logic with modular utilities. Added lazy import patterns to avoid circular dependencies.
  • Batched Data Alignment (flixopt/batched.py):
    Extended batched data classes (StatusData, InvestmentData, StoragesData, etc.) with coordinate-aligned initialization and normalization callbacks. Added internal alignment helpers and propagated coordinates throughout batched accessor to enable period/scenario-based weighting and alignment.
  • Flow System Architecture (flixopt/flow_system.py, flixopt/flow_system_status.py, flixopt/effects.py):
    FlowSystem removed Interface inheritance; introduced element ownership tracking via registries. Replaced per-element transform_data calls with centralized status parameter propagation. Effect class converted to dataclass with validated ID and automatic share-dictionary initialization.
  • Notebook & Test Updates (docs/notebooks/*.ipynb, docs/home/quick-start.md, docs/user-guide/*, tests/*):
    Comprehensive updates to all notebooks and tests replacing positional Flow syntax with keyword-based bus/flow_id usage. Updated example systems, fixtures, and assertions to reflect new API. Solution key references updated where flow naming changed (e.g., charge→charging, discharge→discharging).
  • Model Building & Conversion (flixopt/components.py):
    LinearConverter, Storage, Transmission, SourceAndSink converted to dataclass-based declarations with updated constructors. Added _io_exclude sets for IO serialization control and post-init methods for flow wiring and prevention-of-simultaneous-flow-rates logic.

Sequence Diagram(s)

sequenceDiagram
    participant User as User/Tests
    participant FS as FlowSystem
    participant Element as Element (Component/Bus/Flow)
    participant Structure as structure (new utils)
    participant IO as io (serialization)

    User->>FS: build model with elements
    FS->>Element: add components/buses/flows (dataclass init)
    Element->>Element: __post_init__ validation & normalization
    Element->>FS: register in element registries
    
    User->>FS: optimize/connect_and_transform
    FS->>Structure: _propagate_all_status_parameters
    FS->>Structure: _prepare_effects with alignment
    
    Note over Structure: New alignment logic
    Structure->>Structure: align_to_coords (data→DataArray)
    Structure->>Structure: align_effects_to_coords (dict→aligned dict)
    
    User->>IO: to_dataset/from_netcdf
    IO->>Structure: create_reference_structure (extract refs)
    Structure->>IO: return path-based refs + DataArrays
    IO->>IO: serialize/deserialize via paths

Estimated code review effort

🎯 5 (Critical) | ⏱️ ~120 minutes

Possibly related PRs

  • Modifies Flow constructor and core element signatures (elements.py, interface.py) alongside dataclass conversions, affecting downstream component wiring (Feature/simplify structure #464).
  • Updates Flow class signature and adds flow-hours/period-constraint parameters, directly overlapping with this PR's Flow API refactoring (Feature/sums over all periods #475).
  • Modifies element/effect transformation logic and dataclass initialization patterns in flixopt/effects.py and flixopt/elements.py, intersecting with this PR's dataclass and post-init refactoring (Add FlowContainer for inputs/outputs #587).

Suggested labels

v3.0.0, breaking-change, refactoring

Poem

🐰 Hop, hop! The flows now speak in keywords clear,
Bus-based identities emerge from dataclass sphere,
Align and normalize, from time to space,
A grand redesign with elegant grace!

🚥 Pre-merge checks | ❌ 4

❌ Failed checks (3 warnings, 1 inconclusive)

  • Description check (⚠️ Warning)
    Explanation: The PR description is entirely template-based with no concrete information filled in; all sections contain only placeholder text and unchecked boxes, providing no actual details about changes, motivation, or testing performed.
    Resolution: Fill in the PR description with a concise summary of changes (dataclass conversions, IO/alignment refactoring, API changes), type of change (refactoring), related issues, and actual testing status.
  • Docstring Coverage (⚠️ Warning)
    Explanation: Docstring coverage is 79.69%, below the required threshold of 80.00%.
    Resolution: Write docstrings for the functions missing them to satisfy the coverage threshold.
  • Merge Conflict Detection (⚠️ Warning)
    Explanation: Unable to check for merge conflicts: invalid branch name format.
  • Title check (❓ Inconclusive)
    Explanation: The title is vague and uses abbreviated/technical jargon ('element data classes+id+dataclasses') that does not clearly summarize the main change for someone scanning history.
    Resolution: Replace with a clearer, more descriptive title such as 'Convert core elements to dataclasses with unified ID validation' or 'Refactor Element, Flow, Bus to dataclasses and decouple IO/alignment logic'.


FBumann and others added 9 commits February 15, 2026 19:19
The base method was a one-liner and all 6 overrides were either no-ops
(just calling super) or propagating to flows (which _connect_network
already handles). Replace with direct _flow_system assignment at the
4 call sites and in _connect_network for flows.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
- Merge Interface into Element, remove Interface class (~240 lines)
- Convert Effect, Bus, Component to @dataclass(eq=False, repr=False)
- Convert Flow to @dataclass(eq=False, repr=False, init=False), preserving the deprecation bridge
- Convert LinearConverter, Storage, Transmission, Source, Sink, SourceAndSink to @dataclass
- Add _io_exclude class attribute for excluding computed fields from serialization (see the sketch below)
- Remove dead ContainerMixin, FlowContainer, ElementContainer, ResultsContainer (~244 lines)

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
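
The _io_exclude idea from this commit, as a rough sketch: field names and the helper method are illustrative; the real serializer presumably consults this set when building the dataset.

import dataclasses


@dataclasses.dataclass(eq=False, repr=False)
class ComponentSketch:
    id: str
    size: float | None = None
    # Populated during flow wiring, not user input -> keep out of IO
    flows: dict = dataclasses.field(default_factory=dict)

    # Class attribute consulted by the serializer
    _io_exclude = frozenset({'flows'})

    def io_fields(self) -> dict:
        """Fields that should be written to the dataset."""
        return {
            f.name: getattr(self, f.name)
            for f in dataclasses.fields(self)
            if f.name not in self._io_exclude
        }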
Remove Element.__init__(), validate_kwargs(), and handle_deprecated_kwarg()
as dead code now that all element classes are @dataclass. Simplify Flow's
custom __init__ to use a positional-only param with clean bus/flow_id
resolution, dropping all deprecation bridges (label=, id=, *args form).

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
- Extract valid_id() and replace_references_with_stats() to module-level functions
- Remove _short_id indirection from all element classes
- Remove unused IO methods from Element (to_dataset, from_dataset, to_netcdf,
  from_netcdf, to_json, get_structure, copy) — only FlowSystem uses these
- Remove Element.id property/setter, __repr__, _valid_id, _valid_label
- Remove nonsensical Flow.id setter (computed property can't round-trip)
- Add Flow.label override returning flow_id with deprecation warning

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
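
A sketch of the deprecated .label alias described here (illustrative class, not the real Flow implementation):

import warnings


class FlowSketch:
    def __init__(self, bus: str, flow_id: str | None = None):
        self.bus = bus
        self.flow_id = flow_id if flow_id is not None else bus

    @property
    def label(self) -> str:
        """Deprecated alias kept for backwards compatibility; use .flow_id instead."""
        warnings.warn(
            'Flow.label is deprecated, use Flow.flow_id instead.',
            DeprecationWarning,
            stacklevel=2,
        )
        return self.flow_id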
…ersion

Make create_reference_structure() coords-aware so numpy arrays and pandas
objects are converted to DataArrays during serialization. Remove Piece
__post_init__ that eagerly converted start/end to DataArray. Remove unused
obj_to_dataset/obj_from_dataset standalone serialization functions.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
Rename _is_alignable_numeric to _is_numeric_array, remove unnecessary
bool guard, improve docstrings. Only convert arrays when coords is
available (avoids dim_0 dimension conflicts). Minor formatting.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
- Remove dead model references (_flows_model, _buses_model, _storages_model)
- Move runtime state (_flow_system, _variable_names, _constraint_names) to FlowSystem registry
- Convert Flow to keyword-only dataclass init (bus= required kwarg)
- Prefer dataclasses.fields() over inspect.signature() for IO serialization
- Remove Element.solution and Element.flow_system properties

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
…t Storage flows

- Remove unnecessary flow_id= from docs, docstrings, and examples
- Update conversion_factors keys to match bus names where flow_id was removed
- Storage auto-defaults flow_id to 'charging'/'discharging' when not set
- Defer Flow.flow_id defaulting to Component._connect_flows for composability
- Fix effects_per_flow_hour type annotation to Numeric_TPS | dict | None
- Keep meta_data as field(default_factory=dict)

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
…_init__

Avoids Python dataclass ordering issues (required fields after defaults from
parent) by using explicit __init__ that calls super().__init__() directly.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
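
A sketch of the dataclass field-ordering problem this commit works around, with illustrative class names:

from dataclasses import dataclass


@dataclass
class BaseSketch:
    meta_data: dict | None = None  # parent provides defaulted fields

# Naive subclassing fails: a required field cannot follow inherited defaulted fields.
#
#   @dataclass
#   class ChildSketch(BaseSketch):
#       id: str   # TypeError: non-default argument 'id' follows default argument
#
# One workaround in the spirit of the commit message above: write an explicit
# __init__ on the child that forwards to the parent's generated __init__.
class ChildSketch(BaseSketch):
    def __init__(self, id: str, meta_data: dict | None = None):
        super().__init__(meta_data=meta_data)
        self.id = id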

FBumann commented Feb 16, 2026

@coderabbitai review


coderabbitai bot commented Feb 16, 2026

✅ Actions performed

Review triggered.

Note: CodeRabbit is an incremental review system and does not re-review already reviewed commits. This command is applicable only when automatic reviews are paused.

FBumann and others added 2 commits February 16, 2026 12:56
…tputs to list[Flow]

- Remove fake '' defaults from mandatory fields (Component.id, Flow.bus)
- Serialize IdList as list instead of dict (keys are derivable from objects)
- Remove dict-to-list conversion code from Component.__post_init__ and subclasses
- Simplify type annotations: list[Flow] instead of list[Flow] | dict[str, Flow]

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
coderabbitai bot left a comment


Actionable comments posted: 4

Caution

Some comments are outside the diff and can’t be posted inline due to platform limitations.

⚠️ Outside diff range comments (6)
flixopt/elements.py (1)

164-172: ⚠️ Potential issue | 🟠 Major

Fix flow_id uniqueness check order to avoid false duplicates.
With multiple flows lacking flow_id, the uniqueness check runs before defaults are applied and will raise even when bus names are distinct.

✅ Suggested fix (default flow_id before uniqueness check)
         _inputs = self.inputs or []
         _outputs = self.outputs or []
 
+        # Default flow_id to bus name before uniqueness check
+        for flow in _inputs + _outputs:
+            if flow.flow_id is None:
+                flow.flow_id = valid_id(flow.bus if isinstance(flow.bus, str) else str(flow.bus))
+
         # Check uniqueness on raw lists (before connecting)
         all_flow_ids = [flow.flow_id for flow in _inputs + _outputs]
         if len(set(all_flow_ids)) != len(all_flow_ids):
             duplicates = {fid for fid in all_flow_ids if all_flow_ids.count(fid) > 1}
             raise ValueError(f'Flow names must be unique! "{self.id}" got 2 or more of: {duplicates}')

Also applies to: 219-223

docs/home/quick-start.md (1)

54-83: ⚠️ Potential issue | 🟡 Minor

Quick-start example: add explicit flow_id for storage charge/discharge to avoid relying on defaults

The storage example now uses:

  • charging=fx.Flow(bus='electricity', size=50)
  • discharging=fx.Flow(bus='electricity', size=50) (Line 77-78)

If flow_id defaults to the bus name, having two flows on the same bus in the same component can become ambiguous unless Storage assigns defaults internally. To keep the quick-start copy/paste-safe and to illustrate the new API clearly, I’d recommend adding flow_id explicitly.

Suggested doc change
 battery = fx.Storage(
     'battery',
-    charging=fx.Flow(bus='electricity', size=50),
-    discharging=fx.Flow(bus='electricity', size=50),
+    charging=fx.Flow(bus='electricity', flow_id='charge', size=50),
+    discharging=fx.Flow(bus='electricity', flow_id='discharge', size=50),
     capacity_in_flow_hours=100,  # 100 kWh capacity
flixopt/batched.py (1)

577-636: ⚠️ Potential issue | 🟠 Major

StoragesData: fail fast if coords is missing (otherwise _align will crash)

StoragesData.__init__ accepts coords: dict[str, pd.Index] | None = None, but the _align() method unconditionally passes self._coords to align_to_coords(), which requires a non-None dict. If any caller instantiates StoragesData(..., coords=None) and then accesses an aligned property (e.g., eta_charge, capacity_upper), this will raise at runtime with a cryptic error from align_to_coords.

Current instantiations always pass coords=self._fs.indexes, but the signature allows None, creating a footgun for future usage. Add validation in __init__ to fail fast:

 def __init__(
     self,
     storages: list,
     dim_name: str,
     effect_ids: list[str],
     timesteps_extra: pd.DatetimeIndex | None = None,
     coords: dict[str, pd.Index] | None = None,
     normalize_effects: Any = None,
 ):
     """Initialize StoragesData.
     ...
     """
     self._storages = storages
     self._dim_name = dim_name
     self._effect_ids = effect_ids
     self._timesteps_extra = timesteps_extra
+    if coords is None:
+        raise ValueError('StoragesData requires coords for alignment (got None).')
     self._coords = coords
     self._normalize_effects = normalize_effects
     self._by_id = {s.id: s for s in storages}
docs/notebooks/03-investment-optimization.ipynb (1)

157-168: ⚠️ Potential issue | 🔴 Critical

Unmigrated positional argument will cause TypeError at runtime.

Line 158 passes 'Heat' as the first positional argument to Flow. With the new dataclass definition where bus is the first field, Flow('Heat', bus='Heat', ...) assigns 'Heat' to bus positionally and via keyword, resulting in TypeError: __init__() got multiple values for argument 'bus'.

🐛 Proposed fix
         "    fx.Source(\n",
         "        'SolarCollectors',\n",
         "        outputs=[\n",
         "            fx.Flow(\n",
-        "                'Heat',\n",
-        "                bus='Heat',\n",
+        "                bus='Heat',\n",
+        "                flow_id='Heat',\n",
docs/notebooks/10-transmission.ipynb (2)

461-486: ⚠️ Potential issue | 🔴 Critical

Unmigrated positional arguments in Example 3 will cause TypeError at runtime.

Lines 464 and 474 pass 'from_A' and 'from_B' as the first positional argument to Flow. With the new dataclass where bus is the first field, Flow('from_A', bus='Heat_A', ...) assigns 'from_A' to bus positionally and 'Heat_A' via keyword, raising TypeError: __init__() got multiple values for argument 'bus'.

Note: The out1/out2 flows on lines 472 and 482 were correctly migrated with flow_id= keyword, but the in1/in2 flows were missed.

🐛 Proposed fix
     "    fx.Transmission(\n",
     "        'Pipe_AB',\n",
     "        in1=fx.Flow(\n",
-    "            'from_A',\n",
-    "            bus='Heat_A',\n",
+    "            bus='Heat_A',\n",
+    "            flow_id='from_A',\n",
     "            size=fx.InvestParameters(\n",
     "                effects_of_investment_per_size={'costs': PIPE_COST_PER_KW * 7},  # Weekly cost\n",
     "                minimum_size=0,\n",
     "                maximum_size=300,\n",
     "            ),\n",
     "        ),\n",
     "        out1=fx.Flow(bus='Heat_B', flow_id='to_B'),\n",
     "        in2=fx.Flow(\n",
-    "            'from_B',\n",
-    "            bus='Heat_B',\n",
+    "            bus='Heat_B',\n",
+    "            flow_id='from_B',\n",
     "            size=fx.InvestParameters(\n",
     "                effects_of_investment_per_size={'costs': PIPE_COST_PER_KW * 7},\n",
     "                minimum_size=0,\n",
     "                maximum_size=300,\n",
     "            ),\n",
     "        ),\n",

541-587: ⚠️ Potential issue | 🟡 Minor

Documentation code examples in Key Concepts still use old positional Flow API.

The markdown code block in the Investment Optimization example at lines 577–586 shows Flow('from_A', bus='Bus_A', ...) syntax, which is inconsistent with the new Flow(bus=..., flow_id=...) API. While this won't cause runtime errors (it's markdown, not executable), it will confuse readers. The block at lines 543–549 already uses the new keyword API and needs no change.

📝 Proposed fix for lines 577–586

     "```python\n",
     "in1=fx.Flow(\n",
-    "    'from_A', \n",
-    "    bus='Bus_A',\n",
+    "    bus='Bus_A',\n",
+    "    flow_id='from_A',\n",
     "    size=fx.InvestParameters(  # Pass InvestParameters as size\n",
🧹 Nitpick comments (5)
tests/superseded/math/test_linear_converter.py (1)

17-25: Use .flow_id instead of deprecated .label in conversion factor dicts.

Throughout this file, input_flow.label and output_flow.label are used as keys in conversion_factors dicts (Lines 25, 59, 92-94, 127, 173-177, 211, 244, 282). The .label property is deprecated and emits a DeprecationWarning per flixopt/elements.py. Since this PR migrates to the new flow_id API, these should be updated to .flow_id for consistency and to avoid warnings during test runs.

Example fix (apply same pattern throughout)
-        conversion_factors=[{input_flow.label: 0.8, output_flow.label: 1.0}],
+        conversion_factors=[{input_flow.flow_id: 0.8, output_flow.flow_id: 1.0}],
tests/test_clustering/test_cluster_reduce_expand.py (1)

18-31: Flow(bus=..., flow_id=...) migration in clustering tests looks consistent

The updated Flow constructions (e.g., HeatDemand(Q), Boiler(Q_th), Boiler(Q_fu)) match the new component(flow_id) naming that the tests later select from solution/statistics (e.g., sel(flow='Boiler(Q_th)')). Lines 23-30, 252-260, and 756-763 look coherent.

Optional: consider also setting flow_id on flows where it’s currently omitted (e.g., GasSource output, storage charge/discharge flows) to avoid relying on defaults and keep dataset keys stable if defaults change.

Also applies to: 245-262, 391-405, 577-633, 751-765, 946-1003, 1283-1317, 1539-1667

flixopt/batched.py (1)

2272-2317: ConvertersData: consider raising PlausibilityError (not KeyError) for unknown conversion_factors keys

aligned_conversion_factors() currently does converter.flows[flow_label] (Line 2309) which can throw a raw KeyError before validate() has a chance to produce the nicer message (Line 2553-2557). A small guard here would improve UX in misconfigured models.

Proposed tweak (friendlier error)
         for idx, conv_factor in enumerate(converter.conversion_factors):
             aligned_dict = {}
             for flow_label, values in conv_factor.items():
-                flow_id = converter.flows[flow_label].id
+                try:
+                    flow_id = converter.flows[flow_label].id
+                except KeyError as e:
+                    raise PlausibilityError(
+                        f'{converter.id}: Flow "{flow_label}" in conversion_factors is not in inputs/outputs'
+                    ) from e
                 aligned = align_to_coords(values, self._coords, name=f'{flow_id}|conversion_factor{idx}')

Also applies to: 2373-2392

docs/notebooks/04-operational-constraints.ipynb (1)

127-147: Drop the deprecated positional Flow argument for thermal_flow.

The main boiler’s thermal_flow still uses the legacy positional flow name plus bus=.... With the new Flow signature, this is deprecated and can emit warnings. Prefer keyword-only bus (and optional flow_id if needed) for consistency.

♻️ Suggested cleanup
-        thermal_flow=fx.Flow(
-            'Steam',
-            bus='Steam',
-            size=500,
-            relative_minimum=0.3,  # Minimum load: 30% = 150 kW
-        ),
+        thermal_flow=fx.Flow(
+            bus='Steam',
+            size=500,
+            relative_minimum=0.3,  # Minimum load: 30% = 150 kW
+        ),
tests/test_scenarios.py (1)

62-80: Inconsistent positional vs keyword bus usage within test_system fixture.

Lines 62 and 66 pass the bus as a positional argument (Flow(electricity_bus.label_full, ...)), while the rest of this PR consistently uses the keyword form (Flow(bus=..., ...)). Consider updating these untouched lines for consistency:

-    demand = Flow(electricity_bus.label_full, flow_id='Demand', fixed_relative_profile=demand_profiles)
+    demand = Flow(bus=electricity_bus.label_full, flow_id='Demand', fixed_relative_profile=demand_profiles)
...
-    power_gen = Flow(
-        electricity_bus.label_full,
+    power_gen = Flow(
+        bus=electricity_bus.label_full,
         flow_id='Generation',
...
-    storage_charge = Flow(electricity_bus.label_full, size=10)
-    storage_discharge = Flow(electricity_bus.label_full, size=10)
+    storage_charge = Flow(bus=electricity_bus.label_full, size=10)
+    storage_discharge = Flow(bus=electricity_bus.label_full, size=10)

Comment on lines 255 to 266
 flow_system.add_elements(
     fx.Bus('Heat'),
     fx.Effect('costs', '€', is_standard=True, is_objective=True),
-    fx.Source('Gas', outputs=[fx.Flow('gas', bus=None, size=500, effects_per_flow_hour=0.05)]),
+    fx.Source('Gas', outputs=[fx.Flow(bus='Gas', size=500, effects_per_flow_hour=0.05)]),
     fx.LinearConverter(
         'Boiler',
-        inputs=[fx.Flow('gas', bus=None, size=500)],  # Inline source
-        outputs=[fx.Flow('heat', bus='Heat', size=450)],
-        conversion_factors=[{'gas': 1, 'heat': 0.9}],
+        inputs=[fx.Flow(bus='Gas', size=500)],
+        outputs=[fx.Flow(bus='Heat', size=450)],
+        conversion_factors=[{'Gas': 1, 'Heat': 0.9}],
     ),
-    fx.Sink('Demand', inputs=[fx.Flow('heat', bus='Heat', size=1, fixed_relative_profile=demand)]),
+    fx.Sink('Demand', inputs=[fx.Flow(bus='Heat', size=1, fixed_relative_profile=demand)]),
 )

⚠️ Potential issue | 🟡 Minor

Missing Bus('Gas') in Pattern 1 example.

This pattern example adds Bus('Heat') but not Bus('Gas'), even though the Source and LinearConverter reference bus='Gas'. This would fail at runtime if a user copies the snippet directly. This appears to be a pre-existing omission, but worth fixing since the example is now being updated anyway.

Proposed fix
 flow_system.add_elements(
     fx.Bus('Heat'),
+    fx.Bus('Gas'),
     fx.Effect('costs', '€', is_standard=True, is_objective=True),
     fx.Source('Gas', outputs=[fx.Flow(bus='Gas', size=500, effects_per_flow_hour=0.05)]),
📝 Committable suggestion

Suggested change
flow_system.add_elements(
    fx.Bus('Heat'),
    fx.Bus('Gas'),
    fx.Effect('costs', '€', is_standard=True, is_objective=True),
    fx.Source('Gas', outputs=[fx.Flow(bus='Gas', size=500, effects_per_flow_hour=0.05)]),
    fx.LinearConverter(
        'Boiler',
        inputs=[fx.Flow(bus='Gas', size=500)],
        outputs=[fx.Flow(bus='Heat', size=450)],
        conversion_factors=[{'Gas': 1, 'Heat': 0.9}],
    ),
    fx.Sink('Demand', inputs=[fx.Flow(bus='Heat', size=1, fixed_relative_profile=demand)]),
)
🤖 Prompt for AI Agents
In `@docs/user-guide/building-models/index.md` around lines 255 - 266, The example
in flow_system.add_elements is missing a fx.Bus('Gas') declaration while
fx.Source('Gas') and fx.LinearConverter reference bus='Gas'; add fx.Bus('Gas')
alongside fx.Bus('Heat') in the same flow_system.add_elements call so the Gas
bus exists before the Source and LinearConverter (look for fx.Bus('Heat'),
fx.Source('Gas'), and fx.LinearConverter to locate the snippet).

Comment on lines 1720 to 1777
def validate(self) -> None:
    """Validate all flows (config + DataArray checks).

    Performs both:
    - Config validation via Flow.validate_config()
    - DataArray validation (post-transformation checks)

    Raises:
        PlausibilityError: If any validation check fails.
    """
    if not self.elements:
        return

    for flow in self.elements.values():
        flow.validate_config()
        # Size is required when using StatusParameters (for big-M constraints)
        if flow.status_parameters is not None and flow.size is None:
            raise PlausibilityError(
                f'Flow "{flow.id}" has status_parameters but no size defined. '
                f'A size is required when using status_parameters to bound the flow rate.'
            )

        if flow.size is None and flow.fixed_relative_profile is not None:
            raise PlausibilityError(
                f'Flow "{flow.id}" has a fixed_relative_profile but no size defined. '
                f'A size is required because flow_rate = size * fixed_relative_profile.'
            )

        # Size is required for load factor constraints (total_flow_hours / size)
        if flow.size is None and flow.load_factor_min is not None:
            raise PlausibilityError(
                f'Flow "{flow.id}" has load_factor_min but no size defined. '
                f'A size is required because the constraint is total_flow_hours >= size * load_factor_min * hours.'
            )

        if flow.size is None and flow.load_factor_max is not None:
            raise PlausibilityError(
                f'Flow "{flow.id}" has load_factor_max but no size defined. '
                f'A size is required because the constraint is total_flow_hours <= size * load_factor_max * hours.'
            )

        # Validate previous_flow_rate type
        if flow.previous_flow_rate is not None:
            if not any(
                [
                    isinstance(flow.previous_flow_rate, np.ndarray) and flow.previous_flow_rate.ndim == 1,
                    isinstance(flow.previous_flow_rate, (int, float, list)),
                ]
            ):
                raise TypeError(
                    f'previous_flow_rate must be None, a scalar, a list of scalars or a 1D-numpy-array. '
                    f'Got {type(flow.previous_flow_rate)}. '
                    f'Different values in different periods or scenarios are not yet supported.'
                )

        # Warning: fixed_relative_profile + status_parameters is unusual
        if flow.fixed_relative_profile is not None and flow.status_parameters is not None:
            logger.warning(
                f'Flow {flow.id} has both a fixed_relative_profile and status_parameters. '
                f'This will allow the flow to be switched active and inactive, '
                f'effectively differing from the fixed_flow_rate.'
            )


⚠️ Potential issue | 🟠 Major

🧩 Analysis chain

🏁 Script executed:

#!/bin/bash
set -euo pipefail

# First, let's find the structure of the file and confirm the line numbers
wc -l flixopt/batched.py

# Find usages of previous_flow_rate with assignments
echo "=== previous_flow_rate assignments ==="
rg -n "previous_flow_rate\s*=" flixopt/

# Find the validate method and the specific line mentioned
echo "=== Line 1568 context (np.isscalar usage) ==="
sed -n '1560,1575p' flixopt/batched.py

# Find the validation block mentioned (1758-1763)
echo "=== Validation block (1758-1763) ==="
sed -n '1755,1770p' flixopt/batched.py

# Check if np.number is referenced anywhere
echo "=== np.number references ==="
rg -n "np\.number" flixopt/

Repository: flixOpt/flixopt

Length of output: 2617


🏁 Script executed:

#!/bin/bash
set -euo pipefail

# Find test files that might test previous_flow_rate
echo "=== Test files with previous_flow_rate ==="
find . -name "test*.py" -o -name "*test.py" | xargs grep -l "previous_flow_rate" 2>/dev/null || echo "No matches found"

# Check all previous_flow_rate usages in test files
echo "=== previous_flow_rate in tests ==="
rg -n "previous_flow_rate" tests/ --type py 2>/dev/null || echo "No tests directory or no matches"

# Check numpy scalar usage patterns
echo "=== numpy scalar usage patterns ==="
rg -n "np\.float64|np\.int64|np\.number" flixopt/batched.py | head -20

Repository: flixOpt/flixopt

Length of output: 12420


🏁 Script executed:

#!/bin/bash
set -euo pipefail

# Verify np.isscalar behavior with numpy types
python3 << 'EOF'
import numpy as np

# Test np.isscalar with various types
test_values = [
    (10, "int"),
    (10.5, "float"),
    (np.int64(10), "np.int64"),
    (np.float64(10.5), "np.float64"),
    (np.array([10]), "np.array([10])"),
    ([10], "list"),
]

print("=== np.isscalar() behavior ===")
for val, name in test_values:
    print(f"np.isscalar({name:20s}) = {np.isscalar(val)}")

print("\n=== isinstance with np.number ===")
for val, name in test_values:
    is_np_number = isinstance(val, np.number)
    print(f"isinstance({name:20s}, np.number) = {is_np_number}")

print("\n=== values.item() return type ===")
arr = np.array([42.5])
scalar = arr.item()
print(f"type(np.array([42.5]).item()) = {type(scalar)}")
print(f"isinstance(scalar, np.number) = {isinstance(scalar, np.number)}")
print(f"isinstance(scalar, (int, float)) = {isinstance(scalar, (int, float))}")
EOF

Repository: flixOpt/flixopt

Length of output: 176


🏁 Script executed:

#!/bin/bash
set -euo pipefail

# Look at optimize_accessor.py:352 in full context
echo "=== optimize_accessor.py context around line 352 ==="
sed -n '340,365p' flixopt/optimize_accessor.py

# Check the method signature and what values could be
sed -n '300,365p' flixopt/optimize_accessor.py | head -70

Repository: flixOpt/flixopt

Length of output: 4059


🏁 Script executed:

#!/bin/bash
set -euo pipefail

# Let's verify the behavior by looking at numpy documentation and checking
# if the codebase already uses np.number elsewhere

echo "=== Checking if np.number is imported/used in batched.py ==="
head -100 flixopt/batched.py | grep -E "^import|^from"

echo "=== Check if np is imported ==="
grep -n "^import numpy\|^from numpy" flixopt/batched.py | head -5

echo "=== Verify line 1568 again for full context ==="
sed -n '1560,1575p' flixopt/batched.py

echo "=== Full validation method ==="
sed -n '1720,1777p' flixopt/batched.py

Repository: flixOpt/flixopt

Length of output: 4287


🏁 Script executed:

#!/bin/bash
set -euo pipefail

# Check what numpy scalar types are mentioned in project docs
echo "=== Documentation of np.number in codebase ==="
grep -B2 -A2 "np.number" flixopt/core.py

# Check if there are any other uses of np.number in isinstance checks
echo "=== Other isinstance checks with np.number ==="
rg -n "isinstance.*np\.number" flixopt/

Repository: flixOpt/flixopt

Length of output: 671


Fix validate() to accept numpy scalar types for previous_flow_rate

The validation at lines 1758-1763 rejects numpy scalar types (e.g., np.float64) but should accept them. This creates an inconsistency:

  • Line 1568 uses np.isscalar() which returns True for numpy scalars
  • optimize_accessor.py:352 assigns numpy scalars via .item(): target_flow.previous_flow_rate = values.item() if values.size == 1 else values
  • Project documentation in core.py explicitly lists np.number as a supported scalar type

Add np.number to the isinstance check:

Proposed fix
                 if not any(
                     [
                         isinstance(flow.previous_flow_rate, np.ndarray) and flow.previous_flow_rate.ndim == 1,
-                        isinstance(flow.previous_flow_rate, (int, float, list)),
+                        isinstance(flow.previous_flow_rate, (int, float, np.number, list)),
                     ]
                 ):
🤖 Prompt for AI Agents
In `@flixopt/batched.py` around lines 1720 - 1777, The validate() method rejects
numpy scalar types for flow.previous_flow_rate; update the type check inside
validate (the block that currently tests isinstance(flow.previous_flow_rate,
np.ndarray) and isinstance(..., (int, float, list))) to also accept numpy scalar
types by including np.number in the isinstance tuple (so numpy scalars like
np.float64 pass), ensuring consistency with np.isscalar() usage elsewhere and
with assignments in optimize_accessor.py that may set a scalar via .item().

Comment on lines +1694 to +1698
if id(element) in self._registered_elements:
    return  # Already registered to this FlowSystem
# Check if any other FlowSystem has claimed this element — not possible to detect
# with id()-based tracking alone, but duplicates are caught by _check_if_element_is_unique


⚠️ Potential issue | 🟠 Major

Reintroduce cross‑FlowSystem ownership protection.
The new id()-based tracking only prevents re‑adding an element to the same FlowSystem and no longer detects elements already attached to a different FlowSystem. This can lead to shared mutable state across systems and subtle cross‑contamination. Consider a weak‑ref owner registry to enforce exclusivity.

🛠️ Possible fix using a weak owner registry
+import weakref
+
+_ELEMENT_OWNERS = weakref.WeakKeyDictionary()
+
 def _check_if_element_already_assigned(self, element: Element) -> None:
-    if id(element) in self._registered_elements:
-        return  # Already registered to this FlowSystem
-    # Check if any other FlowSystem has claimed this element — not possible to detect
-    # with id()-based tracking alone, but duplicates are caught by _check_if_element_is_unique
+    owner = _ELEMENT_OWNERS.get(element)
+    if owner is not None and owner is not self:
+        raise ValueError(f'Element {element.id} is already assigned to another FlowSystem.')
+    _ELEMENT_OWNERS[element] = self
🤖 Prompt for AI Agents
In `@flixopt/flow_system.py` around lines 1694 - 1698, The id()-only guard in the
FlowSystem register path allows an element to be added to different FlowSystem
instances; switch to a weak-ref owner registry to enforce exclusive ownership:
add a module-level or FlowSystem-class attribute like a
weakref.WeakKeyDictionary mapping element -> FlowSystem owner, update the
register path (where id(element) and _registered_elements are checked) to first
consult the weak owner registry and raise or refuse registration if another
FlowSystem is recorded, set the weak owner entry to self when registration
succeeds, and clear the weak owner entry on element removal/deregistration (or
let it vanish when the element is collected); keep the
_check_if_element_is_unique call but use the weak registry as the authoritative
cross‑FlowSystem protection.

Comment on lines 770 to 780
if dataclasses.is_dataclass(obj) and not isinstance(obj, type):
    structure: dict[str, Any] = {'__class__': obj.__class__.__name__}
    for field in dataclasses.fields(obj):
        value = getattr(obj, field.name)
        if value is None:
            continue
        processed, field_arrays = _extract_recursive(value, f'{path}.{field.name}', coords)
        arrays.update(field_arrays)
        if processed is not None and not _is_empty(processed):
            structure[field.name] = processed
    return structure, arrays

⚠️ Potential issue | 🟠 Major

Prevent round‑trip failures for unregistered dataclasses.
_extract_recursive emits a __class__ marker for any dataclass, but resolve_reference_structure only reconstructs classes in CLASS_REGISTRY. Any dataclass not explicitly registered will serialize and then fail to deserialize. Consider failing fast at serialization time (or serialize as a plain dict without __class__).

🛠️ Proposed guard to fail fast
-    if dataclasses.is_dataclass(obj) and not isinstance(obj, type):
-        structure: dict[str, Any] = {'__class__': obj.__class__.__name__}
+    if dataclasses.is_dataclass(obj) and not isinstance(obj, type):
+        if obj.__class__.__name__ not in CLASS_REGISTRY:
+            raise ValueError(
+                f'Dataclass {obj.__class__.__name__} is not registered for IO. '
+                'Use `@register_class_for_io` or convert to basic types.'
+            )
+        structure: dict[str, Any] = {'__class__': obj.__class__.__name__}
🤖 Prompt for AI Agents
In `@flixopt/structure.py` around lines 770 - 780, The serialization branch in
_extract_recursive currently tags any dataclass with a '__class__' marker but
resolve_reference_structure only reconstructs classes listed in CLASS_REGISTRY,
causing unregistered dataclasses to fail round-trip; update _extract_recursive
to detect when obj.__class__.__name__ is not present in CLASS_REGISTRY and
either (a) remove the '__class__' marker and serialize the dataclass as a plain
dict of its non-empty fields, or (b) raise an explicit serialization error with
a clear message so callers fail fast; reference _extract_recursive,
resolve_reference_structure, and CLASS_REGISTRY when implementing the guard.

FBumann and others added 3 commits February 16, 2026 13:21
…n paths

- Serialize scalars as 0-d DataArrays (no broadcasting) alongside arrays
- Use flow ids instead of numeric indices for IdList serialization paths
- Unwrap 0-d DataArrays back to Python scalars on deserialization
- Move flow_id uniqueness check after _connect_flows() to avoid false
  collisions on unresolved None flow_ids
- Simplify FlowSystem restoration with _resolve helper for ::: references

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
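
A tiny illustration of the 0-d DataArray round-trip described in this commit (the variable name is illustrative):

import xarray as xr

# Serialization: a plain scalar becomes a 0-dimensional DataArray (no broadcasting).
size = 500
da = xr.DataArray(size, name='components.Boiler.size')
assert da.ndim == 0

# Deserialization: 0-d arrays are unwrapped back to Python scalars.
restored = da.item() if da.ndim == 0 else da
assert restored == 500 and isinstance(restored, int)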