Conversation

Copilot AI commented Jan 18, 2026

📥 Pull Request

✨ Description of new changes

Summary

Eliminates redundant payload construction logic scattered across multiple files by introducing a single build_item_payload() utility. Previously, adding a new item type required updating switch-case statements in fab_item.py; now only command_support.yaml needs modification. The implementation uses an extensible design with configuration-based format handling and an unsupported-types constant.

Changes

Core refactoring:

  • Added build_item_payload() in fab_item_util.py - Single source of truth for payload construction. Handles the common fields (type, description, folderId, displayName) and applies the format directly to the definition dict (a sketch follows this list)
    • Uses a negative check against the UNSUPPORTED_DEFINITION_ITEM_TYPES constant - new item types work automatically unless explicitly unsupported
    • Simplified logic: checks whether a definition exists, validates the item type, applies the format from the input_format parameter or the default from definition_format_mapping
    • Sets the format directly on the definition dict instead of wrapping it - eliminates special-case handling
  • Removed get_payload() from fab_item.py - Eliminated 60+ lines of switch-case duplication
  • Renamed _build_payload_build_definition in fab_cmd_import_utils.py - More accurate naming; now delegates to centralized builder
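
A minimal sketch of what build_item_payload() might look like, pieced together from the bullets above and the review thread below; the exact signature, defaults, and import paths in fab_item_util.py are assumptions, and the item attribute names are borrowed from the "Before" example further down:

# fab_item_util.py -- sketch only; signature, defaults, and import paths are assumptions
from fabric_cli import fab_constant                           # assumed import path
from fabric_cli.errors import ErrorMessages, FabricCLIError   # assumed import path
from fabric_cli.fab_types import definition_format_mapping    # assumed import path

def build_item_payload(item, definition=None, description="Imported from fab", input_format=None):
    base_payload = {
        "type": str(item.item_type),
        "description": description,
        "folderId": item.folder_id,
        "displayName": item.short_name,
    }

    if definition:
        # Negative check: only types explicitly listed as unsupported are rejected,
        # so new item types work without further code changes.
        if item.item_type in fab_constant.UNSUPPORTED_DEFINITION_ITEM_TYPES:
            raise FabricCLIError(
                ErrorMessages.Hierarchy.item_type_doesnt_support_definition_payload(
                    str(item.item_type)
                ),
                fab_constant.ERROR_UNSUPPORTED_COMMAND,
            )
        # Caller-provided format wins; otherwise fall back to the mapping default.
        fmt = input_format or definition_format_mapping.get(item.item_type)
        if fmt:
            definition["format"] = fmt
        base_payload["definition"] = definition

    return base_payload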

Configuration improvements:

  • Added UNSUPPORTED_DEFINITION_ITEM_TYPES to fab_constant.py - Centralized tuple of item types that don't support definition payloads
  • Enhanced definition_format_mapping in fab_types.py - Returns simple format strings instead of query parameters (both are sketched after this list):
    • Notebook: "fabricGitSource" (default), None for .py format, "ipynb" for .ipynb format
    • SparkJobDefinition: "SparkJobDefinitionV1" (default)
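
One plausible shape for these configuration pieces (the tuple contents come from the diff quoted later in this thread; the mapping below is a simplification, since the real one also distinguishes the .py and .ipynb Notebook formats, and the ItemType member names used as keys are assumptions):

# fab_constant.py -- sketch
from fabric_cli.fab_types import ItemType   # assumed import path

UNSUPPORTED_DEFINITION_ITEM_TYPES = (
    ItemType.LAKEHOUSE,
    ItemType.WAREHOUSE,
    ItemType.ML_EXPERIMENT,
    ItemType.ML_MODEL,
    ItemType.DASHBOARD,
    ItemType.PAGINATED_REPORT,
    ItemType.SQL_ENDPOINT,
    ItemType.MIRRORED_WAREHOUSE,
    ItemType.DATAMART,
    ItemType.ENVIRONMENT,  # Environment has custom handling
)

# fab_types.py -- default format per item type (simplified sketch)
definition_format_mapping = {
    ItemType.NOTEBOOK: "fabricGitSource",                   # default; .py -> no format key, .ipynb -> "ipynb"
    ItemType.SPARK_JOB_DEFINITION: "SparkJobDefinitionV1",  # default
}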

Command updates:

  • fab_fs_import_item.py, fab_fs_mkdir_item.py, fab_fs_cp_item.py all use the centralized builder
  • Removed redundant default parameter values from function calls (call sites sketched after this list)
  • Environment item creation uses the builder for its base payload
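
Roughly what the simplified call sites look like once the redundant defaults are dropped (names taken from the diff hunks quoted later in this thread):

# import / cp: pass the definition; description and format fall back to the defaults
payload = fab_item_util.build_item_payload(item, definition=definition)

# mkdir / Environment base payload: no definition, so only the common fields are returned
item_payload = fab_item_util.build_item_payload(item)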

Example usage:

# Before: had to update fab_item.py switch-case for each new item type
case ItemType.NEW_ITEM:
    return {
        "type": str(self.item_type),
        "description": "Imported from fab",
        "folderId": self.folder_id,
        "displayName": self.short_name,
        "definition": definition,
    }

# After: single call handles all item types
payload = fab_item_util.build_item_payload(item, definition)
# Format automatically applied from definition_format_mapping
# New item types work automatically unless in UNSUPPORTED_DEFINITION_ITEM_TYPES

Testing:

  • Added 7 unit tests covering payload construction scenarios - with/without definition, various formats including the default, and unsupported types (a hypothetical sketch follows this list)
  • Updated the existing hierarchy test to reflect the new default format (fabricGitSource)
  • All tests pass with the simplified logic
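
A hypothetical flavor of those tests; the test names, fixtures, and import paths below are illustrative rather than the repo's actual test code:

import pytest

from fabric_cli import fab_item_util            # assumed import path
from fabric_cli.errors import FabricCLIError    # assumed import path

def test_unsupported_type_raises_error(lakehouse_item):      # fixture name is hypothetical
    with pytest.raises(FabricCLIError):
        fab_item_util.build_item_payload(lakehouse_item, definition={"parts": []})

def test_notebook_gets_default_format(notebook_item):        # fixture name is hypothetical
    payload = fab_item_util.build_item_payload(notebook_item, definition={"parts": []})
    assert payload["definition"]["format"] == "fabricGitSource"

def test_without_definition_returns_base_payload(notebook_item):
    payload = fab_item_util.build_item_payload(notebook_item)
    assert "definition" not in payload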

Context

The original design required explicitly adding each item type to the fab_item.py switch statements, creating a maintenance burden and violating DRY. Default format handling was hardcoded with complex special-case logic. This refactoring enables adding new item types through configuration alone, with a significantly simplified architecture (~30 lines vs ~70 lines) that:

  • Moves configuration to appropriate constants (fab_constant.py and fab_types.py)
  • Applies format directly to definition dict
  • Eliminates all special case handling for different item types
  • Uses simple format strings instead of query parameter format

Copilot AI and others added 4 commits January 18, 2026 18:17
…utility

Co-authored-by: ayeshurun <98805507+ayeshurun@users.noreply.github.com>
Co-authored-by: ayeshurun <98805507+ayeshurun@users.noreply.github.com>
Co-authored-by: ayeshurun <98805507+ayeshurun@users.noreply.github.com>
…get_payload

Co-authored-by: ayeshurun <98805507+ayeshurun@users.noreply.github.com>
Copilot AI changed the title from "[WIP] Refactor and enhance payload construction for definition formats" to "refactor: centralize item payload construction to eliminate redundant code" on Jan 18, 2026
Copilot AI requested a review from ayeshurun January 18, 2026 18:23
base_payload["definition"] = {"parts": definition["parts"]}
else:
# Default to ipynb format
base_payload["definition"] = {

Owner commented:

@copilot enhance that section. The elif item_type in (...) check should verify that the item type is NOT supported and raise an exception; otherwise it should do

base_payload["definition"] = definition

and return base_payload.

Secondly, try to extract the format from args, and if it doesn't exist, fall back to the default value from definition_format_mapping instead of hardcoding it.

Author replied:

Enhanced the logic as requested:

  1. ✅ Changed to check if item type is NOT supported (using unsupported_types list) - new supported types will work automatically without code changes
  2. ✅ Extract format from definition_format_mapping instead of hardcoding - parses query param format (e.g., "?format=ipynb" → "ipynb")
  3. ✅ All other item types (not in unsupported list) use definition directly

Commit: bb8f01e

The function now:

  • Checks against an explicit unsupported types list
  • Extracts format values from definition_format_mapping for Notebook and SparkJobDefinition
  • Maintains original behavior (.py format has no format key)
  • All tests passing ✅

…format_mapping

- Changed logic to check if item type is NOT supported (using unsupported_types list)
- Extract format values from definition_format_mapping instead of hardcoding
- Parse format query param strings (e.g., "?format=ipynb" -> "ipynb")
- Maintain original behavior: .py format has no format key, other formats use mapping
- All other item types that are not in unsupported list use definition directly

Co-authored-by: ayeshurun <98805507+ayeshurun@users.noreply.github.com>
Copilot AI changed the title from "refactor: centralize item payload construction to eliminate redundant code" to "refactor: centralize item payload construction with extensible format handling" on Jan 18, 2026
Copilot AI requested a review from ayeshurun January 18, 2026 20:15
    return fab_item_util.build_item_payload(
        item,
        definition=definition,
        description="Imported from fab",

Owner commented:

@copilot this is the default value, no need to specify it explicitly.

Author replied:

Removed explicit description="Imported from fab" parameter since it's the default value.

Commit: b8af1ca

    }
    # Use centralized payload builder for base payload (without definition)
    item_payload = fab_item_util.build_item_payload(
        item, definition=None, description="Imported from fab"

Owner commented:

@copilot both definition=None and description="Imported from fab" are the default values, no need to explicitly pass them.

Author replied:

Removed explicit definition=None and description="Imported from fab" parameters since they're defaults.

Commit: b8af1ca

Comment on lines 185 to 197
# Item types that don't support definition payloads
unsupported_types = (
    ItemType.LAKEHOUSE,
    ItemType.WAREHOUSE,
    ItemType.ML_EXPERIMENT,
    ItemType.ML_MODEL,
    ItemType.DASHBOARD,
    ItemType.PAGINATED_REPORT,
    ItemType.SQL_ENDPOINT,
    ItemType.MIRRORED_WAREHOUSE,
    ItemType.DATAMART,
    ItemType.ENVIRONMENT,  # Environment has custom handling
)

Owner commented:

@copilot define this in fab_constant.

Author replied:

Moved unsupported types tuple to fab_constant.UNSUPPORTED_DEFINITION_ITEM_TYPES.

Commit: b8af1ca


# If no definition provided, return base payload (used by mkdir)
if definition is None:
    return base_payload

Owner commented:

@copilot instead of this do

if definition:
    # Check if item type is not supported
    if item_type in unsupported_types:
        from fabric_cli.errors import ErrorMessages

        raise FabricCLIError(
            ErrorMessages.Hierarchy.item_type_doesnt_support_definition_payload(
                str(item.item_type)
            ),
            fab_constant.ERROR_UNSUPPORTED_COMMAND,
        )
    if input_format:
        definition["format"] = input_format
    else:
        default_format = definition_format_mapping.get(item_type, None)
        if default_format:
            definition["format"] = default_format

    base_payload["definition"] = definition

return base_payload

Owner added:

@copilot and enhance the definition_format_mapping to return default values. For Notebook it is fabricGitSource; for SparkJobDefinition, it is SparkJobDefinitionV1.

Author replied:

Enhanced definition_format_mapping to return default values:

  • Notebook: "fabricGitSource" (default), None for .py format, "ipynb" for .ipynb format
  • SparkJobDefinition: "SparkJobDefinitionV1"

Commit: b8af1ca

Author added:

Simplified the logic as suggested:

  • Check if definition exists, then validate item type
  • Apply format from input_format or default from definition_format_mapping
  • Set format directly on definition dict
  • Enhanced definition_format_mapping with simple format strings:
    • Notebook default: "fabricGitSource"
    • SparkJobDefinition default: "SparkJobDefinitionV1"

Commit: b8af1ca

- Move unsupported types tuple to fab_constant.UNSUPPORTED_DEFINITION_ITEM_TYPES
- Update definition_format_mapping to return simple format strings instead of query params
  - Notebook default: "fabricGitSource" (not "ipynb")
  - SparkJobDefinition default: "SparkJobDefinitionV1"
  - Notebook .py format: None (no format key)
  - Notebook .ipynb format: "ipynb"
- Simplify build_item_payload logic to directly set format on definition dict
- Remove explicit default parameters from function calls
- Add test for notebook default format behavior

Co-authored-by: ayeshurun <98805507+ayeshurun@users.noreply.github.com>
Copilot AI changed the title from "refactor: centralize item payload construction with extensible format handling" to "refactor: centralize item payload construction with simplified format handling" on Jan 19, 2026
Copilot AI requested a review from ayeshurun January 19, 2026 07:35