@ayush4874
Problem:
Currently, GetHistoryFieldValue in COutput.hpp triggers a fatal SU2_MPI::Error (causing an MPI_Abort) if a requested history field is missing. This behavior is overly aggressive for high-level interfaces (like the Python wrapper) or coupled simulations where a field might be temporarily unavailable or optional.

Solution:
Replaced the fatal error with a warning message (std::cout) and a default return value of 0.0. This ensures the simulation continues running while alerting the user.

Changes:

  • Modified COutput.hpp: Replaced SU2_MPI::Error with a warning print and safe return in GetHistoryFieldValue.

@pcarruscag (Member)

Can you share a config where you've run into this problem?

@ayush4874 (Author)

Hi @pcarruscag,

You can use any standard configuration file (like the quickstart_inviscid.cfg from the tutorial cases) to reproduce this.

The crash is not triggered by the configuration settings themselves, but by the API call: using the Python wrapper (or the C++ driver) to request a history field that doesn't exist.

Steps to Reproduce:

  1. Run any standard case using the Python wrapper.
  2. Request a non-existent history field inside the iteration loop.
from mpi4py import MPI
import pysu2

# Use any valid config, e.g., quickstart_inviscid.cfg
driver = pysu2.CSinglezoneDriver("quickstart_inviscid.cfg", 1, MPI.COMM_WORLD)

# ... initialization ...

# CRASH: this triggers MPI_Abort immediately because "NonExistentVar" is not in the map.
# The user has no way to check for existence or catch the error.
val = driver.GetHistoryOutput("NonExistentVar")

Context:
In coupled simulations (like the ML coupling I am working on), we need to query fields dynamically. If a field is missing (e.g., not yet initialized), the current hard abort kills the entire coupled simulation. A warning allows us to handle it safely.
