From c827dcd23f1daab401501bd29f0eaf1a0eac5e63 Mon Sep 17 00:00:00 2001
From: Suyash Kshirsagar
Date: Mon, 17 Nov 2025 09:23:46 -0800
Subject: [PATCH 1/4] docs: fix example filename references and swap documentation URLs

- Changed complete_walkthrough.py to walkthrough.py in all references
- Swapped API reference and Product documentation URLs in header
- Cleaned up limitations section for clarity
---
 README.md          | 8 +++-----
 examples/README.md | 6 +++---
 2 files changed, 6 insertions(+), 8 deletions(-)

diff --git a/README.md b/README.md
index 0e22790..2396a0b 100644
--- a/README.md
+++ b/README.md
@@ -6,7 +6,7 @@

A Python client library for Microsoft Dataverse that provides a unified interface for CRUD operations, SQL queries, table metadata management, and file uploads through the Dataverse Web API.

-**[Source code](https://github.com/microsoft/PowerPlatform-DataverseClient-Python)** | **[Package (PyPI)](https://pypi.org/project/PowerPlatform-Dataverse-Client/)** | **[API reference documentation](https://learn.microsoft.com/en-us/power-apps/developer/data-platform/sdk-python/)** | **[Product documentation](https://learn.microsoft.com/en-us/python/api/dataverse-sdk-docs-python/dataverse-overview?view=dataverse-sdk-python-latest/)** | **[Samples](https://github.com/microsoft/PowerPlatform-DataverseClient-Python/tree/main/examples)**
+**[Source code](https://github.com/microsoft/PowerPlatform-DataverseClient-Python)** | **[Package (PyPI)](https://pypi.org/project/PowerPlatform-Dataverse-Client/)** | **[API reference documentation](https://learn.microsoft.com/python/api/dataverse-sdk-docs-python/dataverse-overview?view=dataverse-sdk-python-latest)** | **[Product documentation](https://learn.microsoft.com/power-apps/developer/data-platform/sdk-python/)** | **[Samples](https://github.com/microsoft/PowerPlatform-DataverseClient-Python/tree/main/examples)**

> [!IMPORTANT]
> This library is currently in **preview**. Preview versions are provided for early access to new features and may contain breaking changes.
@@ -261,7 +261,7 @@ Explore our comprehensive examples in the [`examples/`](examples/) directory:
- **[Functional Testing](examples/basic/functional_testing.py)** - Test core functionality in your environment

**🚀 Advanced Usage:**
-- **[Complete Walkthrough](examples/advanced/complete_walkthrough.py)** - Full feature demonstration with production patterns
+- **[Complete Walkthrough](examples/advanced/walkthrough.py)** - Full feature demonstration with production patterns
- **[File Upload](examples/advanced/file_upload.py)** - Upload files to Dataverse file columns

📖 See the [examples README](examples/README.md) for detailed guidance and learning progression.
@@ -323,11 +323,9 @@ For optimal performance in production environments:

### Limitations

- SQL queries are **read-only** and support a limited subset of SQL syntax
-- Create Table supports a limited number of column types.
+- Create Table supports a limited number of column types. Lookup column is not yet supported.
- Creating relationships between tables is not yet supported.
- File uploads are limited by Dataverse file size restrictions (default 128MB per file)
-- Custom table creation requires appropriate security privileges in the target environment
-- Rate limits apply based on your Power Platform license and environment configuration

## Contributing

diff --git a/examples/README.md b/examples/README.md
index dec7427..6373fed 100644
--- a/examples/README.md
+++ b/examples/README.md
@@ -67,15 +67,15 @@ python examples/basic/functional_testing.py

### 🚀 Step 3: Master Advanced Features
```bash
# Comprehensive walkthrough with production patterns
-python examples/advanced/complete_walkthrough.py
+python examples/advanced/walkthrough.py
```

## 🎯 Quick Start Recommendations

- **New to the SDK?** → Start with `examples/basic/installation_example.py`
- **Need to test/validate?** → Use `examples/basic/functional_testing.py`
-- **Want to see all features?** → Run `examples/advanced/complete_walkthrough.py`
-- **Building production apps?** → Study patterns in `examples/advanced/complete_walkthrough.py`
+- **Want to see all features?** → Run `examples/advanced/walkthrough.py`
+- **Building production apps?** → Study patterns in `examples/advanced/walkthrough.py`

## 📋 Prerequisites

From a6c78c61f50c8b3f5b52d40ebbc18a3b5346d44e Mon Sep 17 00:00:00 2001
From: suyask-msft <158708948+suyask-msft@users.noreply.github.com>
Date: Mon, 17 Nov 2025 09:28:36 -0800
Subject: [PATCH 2/4] Update README.md

Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
---
 README.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/README.md b/README.md
index 2396a0b..d1b14f3 100644
--- a/README.md
+++ b/README.md
@@ -323,7 +323,7 @@ For optimal performance in production environments:

### Limitations

- SQL queries are **read-only** and support a limited subset of SQL syntax
-- Create Table supports a limited number of column types. Lookup column is not yet supported.
+- Create Table supports a limited number of column types. Lookup columns are not yet supported.
- Creating relationships between tables is not yet supported.
- File uploads are limited by Dataverse file size restrictions (default 128MB per file) From 0dfc813956de44292ff00106773f2a6615f2c658 Mon Sep 17 00:00:00 2001 From: Suyash Kshirsagar Date: Mon, 17 Nov 2025 09:38:23 -0800 Subject: [PATCH 3/4] style: apply Black code formatting to pass CI checks --- examples/__init__.py | 2 +- examples/advanced/__init__.py | 2 +- examples/advanced/file_upload.py | 252 +++++++++++------- examples/advanced/walkthrough.py | 53 ++-- examples/basic/__init__.py | 2 +- examples/basic/functional_testing.py | 125 ++++----- examples/basic/installation_example.py | 125 +++++---- src/PowerPlatform/Dataverse/client.py | 14 +- src/PowerPlatform/Dataverse/core/__init__.py | 2 +- src/PowerPlatform/Dataverse/core/auth.py | 5 +- src/PowerPlatform/Dataverse/core/config.py | 1 + .../Dataverse/core/error_codes.py | 8 +- src/PowerPlatform/Dataverse/core/errors.py | 15 +- src/PowerPlatform/Dataverse/core/http.py | 12 +- src/PowerPlatform/Dataverse/data/__init__.py | 2 +- src/PowerPlatform/Dataverse/data/odata.py | 111 ++++---- src/PowerPlatform/Dataverse/data/upload.py | 8 +- .../Dataverse/extensions/__init__.py | 2 +- .../Dataverse/models/__init__.py | 2 +- src/PowerPlatform/Dataverse/utils/__init__.py | 2 +- src/PowerPlatform/__init__.py | 2 +- tests/__init__.py | 2 +- tests/conftest.py | 20 +- tests/fixtures/test_data.py | 39 +-- tests/unit/__init__.py | 2 +- tests/unit/core/__init__.py | 2 +- tests/unit/core/test_http_errors.py | 71 +++-- tests/unit/data/__init__.py | 2 +- .../unit/data/test_enum_optionset_payload.py | 22 +- tests/unit/data/test_logical_crud.py | 38 +-- tests/unit/data/test_sql_parse.py | 12 +- 31 files changed, 551 insertions(+), 406 deletions(-) diff --git a/examples/__init__.py b/examples/__init__.py index 87c2ba9..fd88c22 100644 --- a/examples/__init__.py +++ b/examples/__init__.py @@ -1,4 +1,4 @@ # Copyright (c) Microsoft Corporation. # Licensed under the MIT license. -"""Examples package for the Dataverse SDK.""" \ No newline at end of file +"""Examples package for the Dataverse SDK.""" diff --git a/examples/advanced/__init__.py b/examples/advanced/__init__.py index fc6b584..1929f4a 100644 --- a/examples/advanced/__init__.py +++ b/examples/advanced/__init__.py @@ -1,4 +1,4 @@ # Copyright (c) Microsoft Corporation. # Licensed under the MIT license. -"""Advanced examples showcasing complex Dataverse SDK features.""" \ No newline at end of file +"""Advanced examples showcasing complex Dataverse SDK features.""" diff --git a/examples/advanced/file_upload.py b/examples/advanced/file_upload.py index 24a5851..5a90c7e 100644 --- a/examples/advanced/file_upload.py +++ b/examples/advanced/file_upload.py @@ -4,7 +4,7 @@ """ PowerPlatform Dataverse Client - File Upload Example -This example demonstrates file upload capabilities using the +This example demonstrates file upload capabilities using the PowerPlatform-Dataverse-Client SDK with automatic chunking for large files. 
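
For orientation, the core call this script builds up to looks like the
following sketch (the "new_FileSample" table and its file columns are
created further down; the file name here is illustrative):

    client.upload_file("new_FileSample", record_id,
                       "new_chunkdocument", "big_report.pdf", mode="chunk")
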
Prerequisites: @@ -33,25 +33,25 @@ print("No URL entered; exiting.") sys.exit(1) -base_url = entered.rstrip('/') +base_url = entered.rstrip("/") # Mode selection (numeric): # 1 = small (single PATCH <128MB) # 2 = chunk (streaming for any size) # 3 = all (small + chunk) mode_raw = input("Choose mode: 1) small 2) chunk 3) all [default 3]: ").strip() if not mode_raw: - mode_raw = '3' -if mode_raw not in {'1','2','3'}: + mode_raw = "3" +if mode_raw not in {"1", "2", "3"}: print({"invalid_mode": mode_raw, "fallback": 3}) - mode_raw = '3' + mode_raw = "3" mode_int = int(mode_raw) -run_small = mode_int in (1,3) -run_chunk = mode_int in (2,3) +run_small = mode_int in (1, 3) +run_chunk = mode_int in (2, 3) -delete_table_choice = input("Delete the table at end? (y/N): ").strip() or 'n' +delete_table_choice = input("Delete the table at end? (y/N): ").strip() or "n" cleanup_table = delete_table_choice.lower() in ("y", "yes", "true", "1") -delete_record_choice = input("Delete the created record at end? (Y/n): ").strip() or 'y' +delete_record_choice = input("Delete the created record at end? (Y/n): ").strip() or "y" cleanup_record = delete_record_choice.lower() in ("y", "yes", "true", "1") credential = InteractiveBrowserCredential() @@ -59,22 +59,26 @@ # --------------------------- Helpers --------------------------- + def log(call: str): print({"call": call}) + # Simple SHA-256 helper with caching to avoid re-reading large files multiple times. _FILE_HASH_CACHE = {} + def file_sha256(path: Path): # returns (hex_digest, size_bytes) try: m = _FILE_HASH_CACHE.get(path) if m: return m[0], m[1] import hashlib # noqa: WPS433 + h = hashlib.sha256() size = 0 - with path.open('rb') as f: # stream to avoid high memory for large files - for chunk in iter(lambda: f.read(1024 * 1024), b''): + with path.open("rb") as f: # stream to avoid high memory for large files + for chunk in iter(lambda: f.read(1024 * 1024), b""): size += len(chunk) h.update(chunk) digest = h.hexdigest() @@ -83,6 +87,7 @@ def file_sha256(path: Path): # returns (hex_digest, size_bytes) except Exception: # noqa: BLE001 return None, None + def generate_test_pdf(size_mb: int = 10) -> Path: """Generate a dummy PDF file of specified size for testing purposes.""" try: @@ -92,71 +97,73 @@ def generate_test_pdf(size_mb: int = 10) -> Path: # Fallback: generate a simple binary file with PDF headers test_file = Path(__file__).resolve().parent / f"test_dummy_{size_mb}mb.pdf" target_size = size_mb * 1024 * 1024 - + # Minimal PDF structure pdf_header = b"%PDF-1.4\n" pdf_body = b"1 0 obj\n<< /Type /Catalog /Pages 2 0 R >>\nendobj\n" pdf_body += b"2 0 obj\n<< /Type /Pages /Kids [3 0 R] /Count 1 >>\nendobj\n" pdf_body += b"3 0 obj\n<< /Type /Page /Parent 2 0 R /MediaBox [0 0 612 792] >>\nendobj\n" - + # Fill with dummy data to reach target size current_size = len(pdf_header) + len(pdf_body) padding_needed = target_size - current_size - 50 # Reserve space for trailer padding = b"% " + (b"padding " * (padding_needed // 8))[:padding_needed] + b"\n" - + pdf_trailer = b"xref\n0 4\ntrailer\n<< /Size 4 /Root 1 0 R >>\nstartxref\n0\n%%EOF\n" - - with test_file.open('wb') as f: + + with test_file.open("wb") as f: f.write(pdf_header) f.write(pdf_body) f.write(padding) f.write(pdf_trailer) - - print({"test_pdf_generated": str(test_file), "size_mb": test_file.stat().st_size / (1024*1024)}) + + print({"test_pdf_generated": str(test_file), "size_mb": test_file.stat().st_size / (1024 * 1024)}) return test_file - + # ReportLab available - generate proper PDF test_file = 
Path(__file__).resolve().parent / f"test_dummy_{size_mb}mb.pdf" c = canvas.Canvas(str(test_file), pagesize=letter) - + # Add pages with content until we reach target size target_size = size_mb * 1024 * 1024 page_num = 0 - + while test_file.exists() is False or test_file.stat().st_size < target_size: page_num += 1 c.drawString(100, 750, f"Test PDF - Page {page_num}") c.drawString(100, 730, f"Generated for file upload testing") - + # Add some text to increase file size for i in range(50): c.drawString(50, 700 - (i * 12), f"Line {i}: " + "Sample text content " * 20) - + c.showPage() - + # Save periodically to check size if page_num % 10 == 0: c.save() if test_file.stat().st_size >= target_size: break c = canvas.Canvas(str(test_file), pagesize=letter) - + if not test_file.exists() or test_file.stat().st_size < target_size: c.save() - - print({"test_pdf_generated": str(test_file), "size_mb": test_file.stat().st_size / (1024*1024)}) + + print({"test_pdf_generated": str(test_file), "size_mb": test_file.stat().st_size / (1024 * 1024)}) return test_file -def backoff(op, *, delays=(0,2,5,10), retry_status=(400,403,404,409,412,429,500,502,503,504)): + +def backoff(op, *, delays=(0, 2, 5, 10), retry_status=(400, 403, 404, 409, 412, 429, 500, 502, 503, 504)): last = None for d in delays: - if d: time.sleep(d) + if d: + time.sleep(d) try: return op() except Exception as ex: # noqa: BLE001 last = ex - r = getattr(ex, 'response', None) - code = getattr(r, 'status_code', None) + r = getattr(ex, "response", None) + code = getattr(r, "status_code", None) if isinstance(ex, requests.exceptions.HTTPError) and code in retry_status: continue # For non-HTTP errors just retry the schedule @@ -164,9 +171,11 @@ def backoff(op, *, delays=(0,2,5,10), retry_status=(400,403,404,409,412,429,500, if last: raise last + # --------------------------- Table ensure --------------------------- TABLE_SCHEMA_NAME = "new_FileSample" + def ensure_table(): # Check by schema existing = client.get_table_info(TABLE_SCHEMA_NAME) @@ -175,9 +184,10 @@ def ensure_table(): return existing log("client.create_table('new_FileSample', schema={'new_Title': 'string'})") info = client.create_table(TABLE_SCHEMA_NAME, {"new_Title": "string"}) - print({"table": TABLE_SCHEMA_NAME, "existed": False, "metadata_id": info.get('metadata_id')}) + print({"table": TABLE_SCHEMA_NAME, "existed": False, "metadata_id": info.get("metadata_id")}) return info + try: table_info = ensure_table() except Exception as e: # noqa: BLE001 @@ -187,13 +197,14 @@ def ensure_table(): entity_set = table_info.get("entity_set_name") table_schema_name = table_info.get("table_schema_name") -attr_prefix = table_schema_name.split('_',1)[0] if '_' in table_schema_name else table_schema_name +attr_prefix = table_schema_name.split("_", 1)[0] if "_" in table_schema_name else table_schema_name name_attr = f"{attr_prefix}_name" small_file_attr_schema = f"{attr_prefix}_SmallDocument" # second file attribute for small single-request demo small_file_attr_logical = f"{attr_prefix}_smalldocument" # expected logical name (lowercase) chunk_file_attr_schema = f"{attr_prefix}_ChunkDocument" # attribute for streaming chunk upload demo chunk_file_attr_logical = f"{attr_prefix}_chunkdocument" # expected logical name + def ensure_file_attribute_generic(schema_name: str, label: str, key_prefix: str): meta_id = table_info.get("metadata_id") if not meta_id: @@ -223,7 +234,11 @@ def ensure_file_attribute_generic(schema_name: str, label: str, key_prefix: str) "DisplayName": { "@odata.type": 
"Microsoft.Dynamics.CRM.Label", "LocalizedLabels": [ - {"@odata.type": "Microsoft.Dynamics.CRM.LocalizedLabel", "Label": label, "LanguageCode": int(client._config.language_code)} + { + "@odata.type": "Microsoft.Dynamics.CRM.LocalizedLabel", + "Label": label, + "LanguageCode": int(client._config.language_code), + } ], }, "RequiredLevel": {"Value": "None"}, @@ -235,10 +250,10 @@ def ensure_file_attribute_generic(schema_name: str, label: str, key_prefix: str) time.sleep(2) return True except Exception as ex: # noqa: BLE001 - resp = getattr(ex, 'response', None) + resp = getattr(ex, "response", None) body_l = None try: - body_l = resp.text.lower() if getattr(resp, 'text', None) else None + body_l = resp.text.lower() if getattr(resp, "text", None) else None except Exception: # noqa: BLE001 pass if body_l and ("duplicate" in body_l or "exists" in body_l): @@ -247,6 +262,7 @@ def ensure_file_attribute_generic(schema_name: str, label: str, key_prefix: str) print({f"{key_prefix}_file_attribute_created": False, "error": str(ex)}) return False + # Conditionally ensure each attribute only if its mode is selected if run_small: ensure_file_attribute_generic(small_file_attr_schema, "Small Document", "small") @@ -279,67 +295,86 @@ def ensure_file_attribute_generic(schema_name: str, label: str, key_prefix: str) _GENERATED_TEST_FILE = generate_test_pdf(10) # track generated file for cleanup _GENERATED_TEST_FILE_8MB = generate_test_pdf(8) # track 8MB replacement file for cleanup + def get_dataset_info(file_path: Path): if file_path in _DATASET_INFO_CACHE: return _DATASET_INFO_CACHE[file_path] - + sha_hex, size = file_sha256(file_path) info = (file_path, size, sha_hex) _DATASET_INFO_CACHE[file_path] = info return info + # --------------------------- Small single-request file upload demo --------------------------- if run_small: print("Small single-request upload demo:") try: DATASET_FILE, small_file_size, src_hash = get_dataset_info(_GENERATED_TEST_FILE) - backoff(lambda: client.upload_file( - table_schema_name, - record_id, - small_file_attr_logical, - str(DATASET_FILE), - mode="small", - )) + backoff( + lambda: client.upload_file( + table_schema_name, + record_id, + small_file_attr_logical, + str(DATASET_FILE), + mode="small", + ) + ) print({"small_upload_completed": True, "small_source_size": small_file_size}) odata = client._get_odata() - dl_url_single = f"{odata.api}/{entity_set}({record_id})/{small_file_attr_logical}/$value" # raw entity_set URL OK + dl_url_single = ( + f"{odata.api}/{entity_set}({record_id})/{small_file_attr_logical}/$value" # raw entity_set URL OK + ) resp_single = odata._request("get", dl_url_single) content_single = resp_single.content or b"" import hashlib # noqa: WPS433 + downloaded_hash = hashlib.sha256(content_single).hexdigest() if content_single else None hash_match = (downloaded_hash == src_hash) if (downloaded_hash and src_hash) else None - print({ - "small_file_source_size": small_file_size, - "small_file_download_size": len(content_single), - "small_file_size_match": len(content_single) == small_file_size, - "small_file_source_sha256_prefix": src_hash[:16] if src_hash else None, - "small_file_download_sha256_prefix": downloaded_hash[:16] if downloaded_hash else None, - "small_file_hash_match": hash_match, - }) - + print( + { + "small_file_source_size": small_file_size, + "small_file_download_size": len(content_single), + "small_file_size_match": len(content_single) == small_file_size, + "small_file_source_sha256_prefix": src_hash[:16] if src_hash else None, + 
"small_file_download_sha256_prefix": downloaded_hash[:16] if downloaded_hash else None, + "small_file_hash_match": hash_match, + } + ) + # Now test replacing with an 8MB file print("Small single-request upload demo - REPLACE with 8MB file:") replacement_file, replace_size_small, replace_hash_small = get_dataset_info(_GENERATED_TEST_FILE_8MB) - backoff(lambda: client.upload_file( - table_schema_name, - record_id, - small_file_attr_logical, - str(replacement_file), - mode="small", - )) + backoff( + lambda: client.upload_file( + table_schema_name, + record_id, + small_file_attr_logical, + str(replacement_file), + mode="small", + ) + ) print({"small_replace_upload_completed": True, "small_replace_source_size": replace_size_small}) resp_single_replace = odata._request("get", dl_url_single) content_single_replace = resp_single_replace.content or b"" downloaded_hash_replace = hashlib.sha256(content_single_replace).hexdigest() if content_single_replace else None - hash_match_replace = (downloaded_hash_replace == replace_hash_small) if (downloaded_hash_replace and replace_hash_small) else None - print({ - "small_replace_source_size": replace_size_small, - "small_replace_download_size": len(content_single_replace), - "small_replace_size_match": len(content_single_replace) == replace_size_small, - "small_replace_source_sha256_prefix": replace_hash_small[:16] if replace_hash_small else None, - "small_replace_download_sha256_prefix": downloaded_hash_replace[:16] if downloaded_hash_replace else None, - "small_replace_hash_match": hash_match_replace, - }) + hash_match_replace = ( + (downloaded_hash_replace == replace_hash_small) + if (downloaded_hash_replace and replace_hash_small) + else None + ) + print( + { + "small_replace_source_size": replace_size_small, + "small_replace_download_size": len(content_single_replace), + "small_replace_size_match": len(content_single_replace) == replace_size_small, + "small_replace_source_sha256_prefix": replace_hash_small[:16] if replace_hash_small else None, + "small_replace_download_sha256_prefix": ( + downloaded_hash_replace[:16] if downloaded_hash_replace else None + ), + "small_replace_hash_match": hash_match_replace, + } + ) except Exception as ex: # noqa: BLE001 print({"single_upload_failed": str(ex)}) @@ -348,52 +383,65 @@ def get_dataset_info(file_path: Path): print("Streaming chunk upload demo (upload_file_chunk):") try: DATASET_FILE, src_size_chunk, src_hash_chunk = get_dataset_info(_GENERATED_TEST_FILE) - backoff(lambda: client.upload_file( - table_schema_name, - record_id, - chunk_file_attr_logical, - str(DATASET_FILE), - mode="chunk", - )) + backoff( + lambda: client.upload_file( + table_schema_name, + record_id, + chunk_file_attr_logical, + str(DATASET_FILE), + mode="chunk", + ) + ) print({"chunk_upload_completed": True}) odata = client._get_odata() - dl_url_chunk = f"{odata.api}/{entity_set}({record_id})/{chunk_file_attr_logical}/$value" # raw entity_set for download + dl_url_chunk = ( + f"{odata.api}/{entity_set}({record_id})/{chunk_file_attr_logical}/$value" # raw entity_set for download + ) resp_chunk = odata._request("get", dl_url_chunk) content_chunk = resp_chunk.content or b"" import hashlib # noqa: WPS433 + dst_hash_chunk = hashlib.sha256(content_chunk).hexdigest() if content_chunk else None hash_match_chunk = (dst_hash_chunk == src_hash_chunk) if (dst_hash_chunk and src_hash_chunk) else None - print({ - "chunk_source_size": src_size_chunk, - "chunk_download_size": len(content_chunk), - "chunk_size_match": len(content_chunk) == src_size_chunk, - 
"chunk_source_sha256_prefix": src_hash_chunk[:16] if src_hash_chunk else None, - "chunk_download_sha256_prefix": dst_hash_chunk[:16] if dst_hash_chunk else None, - "chunk_hash_match": hash_match_chunk, - }) + print( + { + "chunk_source_size": src_size_chunk, + "chunk_download_size": len(content_chunk), + "chunk_size_match": len(content_chunk) == src_size_chunk, + "chunk_source_sha256_prefix": src_hash_chunk[:16] if src_hash_chunk else None, + "chunk_download_sha256_prefix": dst_hash_chunk[:16] if dst_hash_chunk else None, + "chunk_hash_match": hash_match_chunk, + } + ) # Now test replacing with an 8MB file print("Streaming chunk upload demo - REPLACE with 8MB file:") replacement_file, replace_size_chunk, replace_hash_chunk = get_dataset_info(_GENERATED_TEST_FILE_8MB) - backoff(lambda: client.upload_file( - table_schema_name, - record_id, - chunk_file_attr_logical, - str(replacement_file), - mode="chunk", - )) + backoff( + lambda: client.upload_file( + table_schema_name, + record_id, + chunk_file_attr_logical, + str(replacement_file), + mode="chunk", + ) + ) print({"chunk_replace_upload_completed": True}) resp_chunk_replace = odata._request("get", dl_url_chunk) content_chunk_replace = resp_chunk_replace.content or b"" dst_hash_chunk_replace = hashlib.sha256(content_chunk_replace).hexdigest() if content_chunk_replace else None - hash_match_chunk_replace = (dst_hash_chunk_replace == replace_hash_chunk) if (dst_hash_chunk_replace and replace_hash_chunk) else None - print({ - "chunk_replace_source_size": replace_size_chunk, - "chunk_replace_download_size": len(content_chunk_replace), - "chunk_replace_size_match": len(content_chunk_replace) == replace_size_chunk, - "chunk_replace_source_sha256_prefix": replace_hash_chunk[:16] if replace_hash_chunk else None, - "chunk_replace_download_sha256_prefix": dst_hash_chunk_replace[:16] if dst_hash_chunk_replace else None, - "chunk_replace_hash_match": hash_match_chunk_replace, - }) + hash_match_chunk_replace = ( + (dst_hash_chunk_replace == replace_hash_chunk) if (dst_hash_chunk_replace and replace_hash_chunk) else None + ) + print( + { + "chunk_replace_source_size": replace_size_chunk, + "chunk_replace_download_size": len(content_chunk_replace), + "chunk_replace_size_match": len(content_chunk_replace) == replace_size_chunk, + "chunk_replace_source_sha256_prefix": replace_hash_chunk[:16] if replace_hash_chunk else None, + "chunk_replace_download_sha256_prefix": dst_hash_chunk_replace[:16] if dst_hash_chunk_replace else None, + "chunk_replace_hash_match": hash_match_chunk_replace, + } + ) except Exception as ex: # noqa: BLE001 print({"chunk_upload_failed": str(ex)}) diff --git a/examples/advanced/walkthrough.py b/examples/advanced/walkthrough.py index 5311592..201196c 100644 --- a/examples/advanced/walkthrough.py +++ b/examples/advanced/walkthrough.py @@ -52,12 +52,12 @@ def main(): if not base_url: print("No URL entered; exiting.") sys.exit(1) - - base_url = base_url.rstrip('/') - + + base_url = base_url.rstrip("/") + log_call("InteractiveBrowserCredential()") credential = InteractiveBrowserCredential() - + log_call(f"DataverseClient(base_url='{base_url}', credential=...)") client = DataverseClient(base_url=base_url, credential=credential) print(f"โœ“ Connected to: {base_url}") @@ -70,10 +70,10 @@ def main(): print("=" * 80) table_name = "new_WalkthroughDemo" - + log_call(f"client.get_table_info('{table_name}')") table_info = client.get_table_info(table_name) - + if table_info: print(f"โœ“ Table already exists: {table_info.get('table_schema_name')}") 
print(f" Logical Name: {table_info.get('table_logical_name')}") @@ -85,7 +85,7 @@ def main(): "new_Quantity": "int", "new_Amount": "decimal", "new_Completed": "bool", - "new_Priority": Priority + "new_Priority": Priority, } table_info = client.create_table(table_name, columns) print(f"โœ“ Created table: {table_info.get('table_schema_name')}") @@ -105,7 +105,7 @@ def main(): "new_Quantity": 5, "new_Amount": 1250.50, "new_Completed": False, - "new_Priority": Priority.MEDIUM + "new_Priority": Priority.MEDIUM, } id1 = client.create(table_name, single_record)[0] print(f"โœ“ Created single record: {id1}") @@ -118,22 +118,22 @@ def main(): "new_Quantity": 10, "new_Amount": 500.00, "new_Completed": True, - "new_Priority": Priority.HIGH + "new_Priority": Priority.HIGH, }, { "new_Title": "Update test cases", "new_Quantity": 8, "new_Amount": 750.25, "new_Completed": False, - "new_Priority": Priority.LOW + "new_Priority": Priority.LOW, }, { "new_Title": "Deploy to staging", "new_Quantity": 3, "new_Amount": 2000.00, "new_Completed": False, - "new_Priority": Priority.HIGH - } + "new_Priority": Priority.HIGH, + }, ] ids = client.create(table_name, multiple_records) print(f"โœ“ Created {len(ids)} records: {ids}") @@ -149,15 +149,20 @@ def main(): log_call(f"client.get('{table_name}', '{id1}')") record = client.get(table_name, id1) print("โœ“ Retrieved single record:") - print(json.dumps({ - "new_walkthroughdemoid": record.get("new_walkthroughdemoid"), - "new_title": record.get("new_title"), - "new_quantity": record.get("new_quantity"), - "new_amount": record.get("new_amount"), - "new_completed": record.get("new_completed"), - "new_priority": record.get("new_priority"), - "new_priority@FormattedValue": record.get("new_priority@OData.Community.Display.V1.FormattedValue") - }, indent=2)) + print( + json.dumps( + { + "new_walkthroughdemoid": record.get("new_walkthroughdemoid"), + "new_title": record.get("new_title"), + "new_quantity": record.get("new_quantity"), + "new_amount": record.get("new_amount"), + "new_completed": record.get("new_completed"), + "new_priority": record.get("new_priority"), + "new_priority@FormattedValue": record.get("new_priority@OData.Community.Display.V1.FormattedValue"), + }, + indent=2, + ) + ) # Multiple read with filter log_call(f"client.get('{table_name}', filter='new_quantity gt 5')") @@ -201,7 +206,7 @@ def main(): "new_Quantity": i, "new_Amount": i * 10.0, "new_Completed": False, - "new_Priority": Priority.LOW + "new_Priority": Priority.LOW, } for i in range(1, 21) ] @@ -212,7 +217,7 @@ def main(): log_call(f"client.get('{table_name}', page_size=5)") print("Fetching records with page_size=5...") for page_num, page in enumerate(client.get(table_name, orderby=["new_Quantity"], page_size=5), start=1): - record_ids = [r.get('new_walkthroughdemoid')[:8] + "..." for r in page] + record_ids = [r.get("new_walkthroughdemoid")[:8] + "..." 
for r in page] print(f" Page {page_num}: {len(page)} records - IDs: {record_ids}") # ============================================================================ @@ -245,7 +250,7 @@ def main(): "new_Quantity": 1, "new_Amount": 99.99, "new_Completed": False, - "new_Priority": "High" # String label instead of int + "new_Priority": "High", # String label instead of int } label_id = client.create(table_name, label_record)[0] retrieved = client.get(table_name, label_id) diff --git a/examples/basic/__init__.py b/examples/basic/__init__.py index 5eff67a..fc820f8 100644 --- a/examples/basic/__init__.py +++ b/examples/basic/__init__.py @@ -1,4 +1,4 @@ # Copyright (c) Microsoft Corporation. # Licensed under the MIT license. -"""Basic examples for getting started with the Dataverse SDK.""" \ No newline at end of file +"""Basic examples for getting started with the Dataverse SDK.""" diff --git a/examples/basic/functional_testing.py b/examples/basic/functional_testing.py index ea4b727..d850b7b 100644 --- a/examples/basic/functional_testing.py +++ b/examples/basic/functional_testing.py @@ -7,7 +7,7 @@ This script provides comprehensive functional testing of the PowerPlatform-Dataverse-Client SDK: - Real environment connection testing -- Table creation and metadata operations +- Table creation and metadata operations - Full CRUD operations testing - Query functionality validation - Interactive cleanup options @@ -20,7 +20,7 @@ Usage: python examples/advanced/functional_testing.py -Note: This is an advanced testing script. For basic installation validation, +Note: This is an advanced testing script. For basic installation validation, use examples/basic/installation_example.py instead. """ @@ -39,15 +39,15 @@ def get_dataverse_org_url() -> str: """Get Dataverse org URL from user input.""" print("\n๐ŸŒ Dataverse Environment Setup") print("=" * 50) - + if not sys.stdin.isatty(): print("โŒ Interactive input required. Run this script in a terminal.") sys.exit(1) - + while True: org_url = input("Enter your Dataverse org URL (e.g., https://yourorg.crm.dynamics.com): ").strip() if org_url: - return org_url.rstrip('/') + return org_url.rstrip("/") print("โš ๏ธ Please enter a valid URL.") @@ -55,18 +55,18 @@ def setup_authentication() -> DataverseClient: """Set up authentication and create Dataverse client.""" print("\n๐Ÿ” Authentication Setup") print("=" * 50) - + org_url = get_dataverse_org_url() try: credential = InteractiveBrowserCredential() client = DataverseClient(org_url, credential) - + # Test the connection print("๐Ÿงช Testing connection...") tables = client.list_tables() print(f"โœ… Connection successful! 
Found {len(tables)} tables.") return client - + except Exception as e: print(f"โŒ Authentication failed: {e}") print("๐Ÿ’ก Please check your credentials and permissions.") @@ -77,43 +77,42 @@ def ensure_test_table(client: DataverseClient) -> Dict[str, Any]: """Create or verify test table exists.""" print("\n๐Ÿ“‹ Test Table Setup") print("=" * 50) - + table_schema_name = "test_TestSDKFunctionality" - + try: # Check if table already exists existing_table = client.get_table_info(table_schema_name) if existing_table: print(f"โœ… Test table '{table_schema_name}' already exists") return existing_table - + except Exception: print(f"๐Ÿ“ Table '{table_schema_name}' not found, creating...") - + try: print("๐Ÿ”จ Creating new test table...") # Create the test table with various field types table_info = client.create_table( table_schema_name, primary_column_schema_name="test_name", - columns= - { - "test_description": "string", # Description field - "test_count": "int", # Integer field - "test_amount": "decimal", # Decimal field - "test_is_active": "bool", # Boolean field - "test_created_date": "datetime" # DateTime field - } + columns={ + "test_description": "string", # Description field + "test_count": "int", # Integer field + "test_amount": "decimal", # Decimal field + "test_is_active": "bool", # Boolean field + "test_created_date": "datetime", # DateTime field + }, ) - + print(f"โœ… Created test table: {table_info.get('table_schema_name')}") print(f" Logical name: {table_info.get('table_logical_name')}") print(f" Entity set: {table_info.get('entity_set_name')}") - + # Wait a moment for table to be ready time.sleep(2) return table_info - + except MetadataError as e: print(f"โŒ Failed to create table: {e}") sys.exit(1) @@ -126,7 +125,7 @@ def test_create_record(client: DataverseClient, table_info: Dict[str, Any]) -> s table_schema_name = table_info.get("table_schema_name") attr_prefix = table_schema_name.split("_", 1)[0] if "_" in table_schema_name else table_schema_name - + # Create test record data test_data = { f"{attr_prefix}_name": f"Test Record {datetime.now().strftime('%H:%M:%S')}", @@ -134,13 +133,13 @@ def test_create_record(client: DataverseClient, table_info: Dict[str, Any]) -> s f"{attr_prefix}_count": 42, f"{attr_prefix}_amount": 123.45, f"{attr_prefix}_is_active": True, - f"{attr_prefix}_created_date": datetime.now().isoformat() + f"{attr_prefix}_created_date": datetime.now().isoformat(), } - + try: print("๐Ÿš€ Creating test record...") created_ids = client.create(table_schema_name, test_data) - + if isinstance(created_ids, list) and created_ids: record_id = created_ids[0] print(f"โœ… Record created successfully!") @@ -149,7 +148,7 @@ def test_create_record(client: DataverseClient, table_info: Dict[str, Any]) -> s return record_id else: raise ValueError("Unexpected response from create operation") - + except HttpError as e: print(f"โŒ HTTP error during record creation: {e}") sys.exit(1) @@ -162,29 +161,33 @@ def test_read_record(client: DataverseClient, table_info: Dict[str, Any], record """Test record reading.""" print("\n๐Ÿ“– Record Reading Test") print("=" * 50) - + table_schema_name = table_info.get("table_schema_name") attr_prefix = table_schema_name.split("_", 1)[0] if "_" in table_schema_name else table_schema_name - + try: print(f"๐Ÿ” Reading record: {record_id}") record = client.get(table_schema_name, record_id) - + if record: print("โœ… Record retrieved successfully!") print(" Retrieved data:") - + # Display key fields - for field_name in [f"{attr_prefix}_name", 
f"{attr_prefix}_description", - f"{attr_prefix}_count", f"{attr_prefix}_amount", - f"{attr_prefix}_is_active"]: + for field_name in [ + f"{attr_prefix}_name", + f"{attr_prefix}_description", + f"{attr_prefix}_count", + f"{attr_prefix}_amount", + f"{attr_prefix}_is_active", + ]: if field_name in record: print(f" {field_name}: {record[field_name]}") - + return record else: raise ValueError("Record not found") - + except HttpError as e: print(f"โŒ HTTP error during record reading: {e}") sys.exit(1) @@ -197,22 +200,22 @@ def test_query_records(client: DataverseClient, table_info: Dict[str, Any]) -> N """Test querying multiple records.""" print("\n๐Ÿ” Record Query Test") print("=" * 50) - + table_schema_name = table_info.get("table_schema_name") attr_prefix = table_schema_name.split("_", 1)[0] if "_" in table_schema_name else table_schema_name - + try: print("๐Ÿ” Querying records from test table...") - + # Query with filter and select records_iterator = client.get( table_schema_name, select=[f"{attr_prefix}_name", f"{attr_prefix}_count", f"{attr_prefix}_amount"], filter=f"{attr_prefix}_is_active eq true", top=5, - orderby=[f"{attr_prefix}_name asc"] + orderby=[f"{attr_prefix}_name asc"], ) - + record_count = 0 for batch in records_iterator: for record in batch: @@ -221,9 +224,9 @@ def test_query_records(client: DataverseClient, table_info: Dict[str, Any]) -> N count = record.get(f"{attr_prefix}_count", "N/A") amount = record.get(f"{attr_prefix}_amount", "N/A") print(f" Record {record_count}: {name} (Count: {count}, Amount: {amount})") - + print(f"โœ… Query completed! Found {record_count} active records.") - + except Exception as e: print(f"โš ๏ธ Query test encountered an issue: {e}") print(" This might be expected if the table is very new.") @@ -233,13 +236,13 @@ def cleanup_test_data(client: DataverseClient, table_info: Dict[str, Any], recor """Clean up test data.""" print("\n๐Ÿงน Cleanup") print("=" * 50) - + table_schema_name = table_info.get("table_schema_name") - + # Ask user if they want to clean up cleanup_choice = input("Do you want to delete the test record? (y/N): ").strip().lower() - - if cleanup_choice in ['y', 'yes']: + + if cleanup_choice in ["y", "yes"]: try: client.delete(table_schema_name, record_id) print("โœ… Test record deleted successfully") @@ -247,11 +250,11 @@ def cleanup_test_data(client: DataverseClient, table_info: Dict[str, Any], recor print(f"โš ๏ธ Failed to delete test record: {e}") else: print("โ„น๏ธ Test record kept for inspection") - + # Ask about table cleanup table_cleanup = input("Do you want to delete the test table? 
(y/N): ").strip().lower() - - if table_cleanup in ['y', 'yes']: + + if table_cleanup in ["y", "yes"]: try: client.delete_table(table_info.get("table_schema_name")) print("โœ… Test table deleted successfully") @@ -267,41 +270,41 @@ def main(): print("=" * 70) print("This script tests SDK functionality in a real Dataverse environment:") print(" โ€ข Authentication & Connection") - print(" โ€ข Table Creation & Metadata Operations") + print(" โ€ข Table Creation & Metadata Operations") print(" โ€ข Record CRUD Operations") print(" โ€ข Query Functionality") print(" โ€ข Interactive Cleanup") print("=" * 70) print("๐Ÿ’ก For installation validation, run examples/basic/installation_example.py first") print("=" * 70) - + try: # Setup and authentication client = setup_authentication() - + # Table setup table_info = ensure_test_table(client) - + # Test record operations record_id = test_create_record(client, table_info) retrieved_record = test_read_record(client, table_info, record_id) - + # Test querying test_query_records(client, table_info) - + # Success summary print("\n๐ŸŽ‰ Functional Test Summary") print("=" * 50) print("โœ… Authentication: Success") print("โœ… Table Operations: Success") - print("โœ… Record Creation: Success") + print("โœ… Record Creation: Success") print("โœ… Record Reading: Success") print("โœ… Record Querying: Success") print("\n๐Ÿ’ก Your PowerPlatform Dataverse Client SDK is fully functional!") - + # Cleanup cleanup_test_data(client, table_info, record_id) - + except KeyboardInterrupt: print("\n\nโš ๏ธ Test interrupted by user") sys.exit(1) @@ -312,4 +315,4 @@ def main(): if __name__ == "__main__": - main() \ No newline at end of file + main() diff --git a/examples/basic/installation_example.py b/examples/basic/installation_example.py index ffa803c..eee2ccb 100644 --- a/examples/basic/installation_example.py +++ b/examples/basic/installation_example.py @@ -6,7 +6,7 @@ This comprehensive example demonstrates: - Package installation and validation -- Import verification and troubleshooting +- Import verification and troubleshooting - Basic usage patterns and code examples - Optional interactive testing with real Dataverse environment @@ -60,34 +60,40 @@ from typing import Optional from datetime import datetime + def validate_imports(): """Validate that all key imports work correctly.""" print("๐Ÿ” Validating Package Imports...") print("-" * 50) - + try: # Test main namespace import from PowerPlatform.Dataverse import DataverseClient, __version__ + print(f" โœ… Main namespace: PowerPlatform.Dataverse") print(f" โœ… Package version: {__version__}") print(f" โœ… DataverseClient class: {DataverseClient}") - + # Test submodule imports from PowerPlatform.Dataverse.core.errors import HttpError, MetadataError + print(f" โœ… Core errors: HttpError, MetadataError") - + from PowerPlatform.Dataverse.core.config import DataverseConfig + print(f" โœ… Core config: DataverseConfig") - + from PowerPlatform.Dataverse.data.odata import ODataClient + print(f" โœ… Data layer: ODataClient") - + # Test Azure Identity import from azure.identity import InteractiveBrowserCredential + print(f" โœ… Azure Identity: InteractiveBrowserCredential") - + return True, __version__, DataverseClient - + except ImportError as e: print(f" โŒ Import failed: {e}") print("\n๐Ÿ’ก Troubleshooting:") @@ -111,13 +117,19 @@ def validate_client_methods(DataverseClient): """Validate that DataverseClient has expected methods.""" print("\n๐Ÿ—๏ธ Validating Client Methods...") print("-" * 50) - + expected_methods = [ - 
'create', 'get', 'update', 'delete', - 'create_table', 'get_table_info', 'delete_table', - 'list_tables', 'query_sql' + "create", + "get", + "update", + "delete", + "create_table", + "get_table_info", + "delete_table", + "list_tables", + "query_sql", ] - + missing_methods = [] for method in expected_methods: if hasattr(DataverseClient, method): @@ -125,7 +137,7 @@ def validate_client_methods(DataverseClient): else: print(f" โŒ Method missing: {method}") missing_methods.append(method) - + return len(missing_methods) == 0 @@ -133,22 +145,23 @@ def validate_package_metadata(): """Validate package metadata from pip.""" print("\n๐Ÿ“ฆ Validating Package Metadata...") print("-" * 50) - + try: - result = subprocess.run([sys.executable, '-m', 'pip', 'show', 'PowerPlatform-Dataverse-Client'], - capture_output=True, text=True) - + result = subprocess.run( + [sys.executable, "-m", "pip", "show", "PowerPlatform-Dataverse-Client"], capture_output=True, text=True + ) + if result.returncode == 0: - lines = result.stdout.split('\n') + lines = result.stdout.split("\n") for line in lines: - if any(line.startswith(prefix) for prefix in ['Name:', 'Version:', 'Summary:', 'Location:']): + if any(line.startswith(prefix) for prefix in ["Name:", "Version:", "Summary:", "Location:"]): print(f" โœ… {line}") return True else: print(f" โŒ Package not found in pip list") print(" ๐Ÿ’ก Try: pip install PowerPlatform-Dataverse-Client") return False - + except Exception as e: print(f" โŒ Metadata validation failed: {e}") return False @@ -158,8 +171,9 @@ def show_usage_examples(): """Display comprehensive usage examples.""" print("\n๐Ÿ“š Usage Examples") print("=" * 50) - - print(""" + + print( + """ ๐Ÿ”ง Basic Setup: ```python from PowerPlatform.Dataverse import DataverseClient @@ -229,52 +243,53 @@ def show_usage_examples(): tables = client.list_tables() print(f"Found {len(tables)} tables") ``` -""") +""" + ) def interactive_test(): """Offer optional interactive testing with real Dataverse environment.""" print("\n๐Ÿงช Interactive Testing") print("=" * 50) - + choice = input("Would you like to test with a real Dataverse environment? 
(y/N): ").strip().lower() - - if choice not in ['y', 'yes']: + + if choice not in ["y", "yes"]: print(" โ„น๏ธ Skipping interactive test") return - + print("\n๐ŸŒ Dataverse Environment Setup") print("-" * 50) - + if not sys.stdin.isatty(): print(" โŒ Interactive input required for testing") return - + org_url = input("Enter your Dataverse org URL (e.g., https://yourorg.crm.dynamics.com): ").strip() if not org_url: print(" โš ๏ธ No URL provided, skipping test") return - + try: from PowerPlatform.Dataverse import DataverseClient from azure.identity import InteractiveBrowserCredential - + print(" ๐Ÿ” Setting up authentication...") credential = InteractiveBrowserCredential() - + print(" ๐Ÿš€ Creating client...") - client = DataverseClient(org_url.rstrip('/'), credential) - + client = DataverseClient(org_url.rstrip("/"), credential) + print(" ๐Ÿงช Testing connection...") tables = client.list_tables() - + print(f" โœ… Connection successful!") print(f" ๐Ÿ“‹ Found {len(tables)} tables in environment") print(f" ๐ŸŒ Connected to: {org_url}") - + print("\n ๐Ÿ’ก Your SDK is ready for use!") print(" ๐Ÿ’ก Check the usage examples above for common patterns") - + except Exception as e: print(f" โŒ Interactive test failed: {e}") print(" ๐Ÿ’ก This might be due to authentication, network, or permissions") @@ -287,46 +302,46 @@ def main(): print("=" * 70) print(f"๐Ÿ•’ Validation Time: {datetime.now().strftime('%Y-%m-%d %H:%M:%S')}") print("=" * 70) - + # Step 1: Validate imports imports_success, version, DataverseClient = validate_imports() if not imports_success: print("\nโŒ Import validation failed. Please check installation.") sys.exit(1) - + # Step 2: Validate client methods if DataverseClient: methods_success = validate_client_methods(DataverseClient) if not methods_success: print("\nโš ๏ธ Some client methods are missing, but basic functionality should work.") - + # Step 3: Validate package metadata metadata_success = validate_package_metadata() - + # Step 4: Show usage examples show_usage_examples() - + # Step 5: Optional interactive testing interactive_test() - + # Summary print("\n" + "=" * 70) print("๐Ÿ“Š VALIDATION SUMMARY") print("=" * 70) - + results = [ ("Package Imports", imports_success), - ("Client Methods", methods_success if 'methods_success' in locals() else True), - ("Package Metadata", metadata_success) + ("Client Methods", methods_success if "methods_success" in locals() else True), + ("Package Metadata", metadata_success), ] - + all_passed = True for test_name, success in results: - status = "โœ… PASS" if success else "โŒ FAIL" + status = "โœ… PASS" if success else "โŒ FAIL" print(f"{test_name:<20} {status}") if not success: all_passed = False - + print("=" * 70) if all_passed: print("๐ŸŽ‰ SUCCESS: PowerPlatform-Dataverse-Client is properly installed!") @@ -334,16 +349,16 @@ def main(): print(f"๐Ÿ“ฆ Package Version: {version}") print("\n๐Ÿ’ก What this validates:") print(" โœ… Package installation is correct") - print(" โœ… All namespace imports work") + print(" โœ… All namespace imports work") print(" โœ… Client classes are accessible") print(" โœ… Package metadata is valid") print(" โœ… Ready for development and production use") - + print(f"\n๐ŸŽฏ Next Steps:") print(" โ€ข Review the usage examples above") - print(" โ€ข Configure your Azure Identity credentials") + print(" โ€ข Configure your Azure Identity credentials") print(" โ€ข Start building with PowerPlatform.Dataverse!") - + else: print("โŒ Some validation checks failed!") print("๐Ÿ’ก Review the errors above and 
reinstall if needed:") @@ -355,4 +370,4 @@ def main(): if __name__ == "__main__": print("๐Ÿš€ PowerPlatform-Dataverse-Client SDK Installation Example") print("=" * 60) - main() \ No newline at end of file + main() diff --git a/src/PowerPlatform/Dataverse/client.py b/src/PowerPlatform/Dataverse/client.py index ef06d24..0a0c320 100644 --- a/src/PowerPlatform/Dataverse/client.py +++ b/src/PowerPlatform/Dataverse/client.py @@ -147,7 +147,9 @@ def create(self, table_schema_name: str, records: Union[Dict[str, Any], List[Dic return ids raise TypeError("records must be dict or list[dict]") - def update(self, table_schema_name: str, ids: Union[str, List[str]], changes: Union[Dict[str, Any], List[Dict[str, Any]]]) -> None: + def update( + self, table_schema_name: str, ids: Union[str, List[str]], changes: Union[Dict[str, Any], List[Dict[str, Any]]] + ) -> None: """ Update one or more records. @@ -220,7 +222,7 @@ def delete( :raises TypeError: If ``ids`` is not str or list[str]. :raises HttpError: If the underlying Web API delete request fails. - + :return: BulkDelete job ID when deleting multiple records via BulkDelete; otherwise ``None``. :rtype: ``str`` or ``None`` @@ -298,7 +300,7 @@ def get( Query multiple records with filtering (note: exact logical names in filter):: for batch in client.get( - "account", + "account", filter="statecode eq 0 and name eq 'Contoso'", # Must use exact logical names (lower-case) select=["name", "telephone1"] ): @@ -332,7 +334,7 @@ def get( raise TypeError("record_id must be str") return od._get( table_schema_name, - record_id, + record_id, select=select, ) return od._get_multiple( @@ -514,7 +516,7 @@ def list_tables(self) -> list[str]: print(table) """ return self._get_odata()._list_tables() - + def create_columns( self, table_schema_name: str, @@ -675,5 +677,5 @@ def flush_cache(self, kind) -> int: """ return self._get_odata()._flush_cache(kind) -__all__ = ["DataverseClient"] +__all__ = ["DataverseClient"] diff --git a/src/PowerPlatform/Dataverse/core/__init__.py b/src/PowerPlatform/Dataverse/core/__init__.py index 1a136b3..79454f5 100644 --- a/src/PowerPlatform/Dataverse/core/__init__.py +++ b/src/PowerPlatform/Dataverse/core/__init__.py @@ -8,4 +8,4 @@ configuration, HTTP client, and error handling. """ -__all__ = [] \ No newline at end of file +__all__ = [] diff --git a/src/PowerPlatform/Dataverse/core/auth.py b/src/PowerPlatform/Dataverse/core/auth.py index f5b6973..e5619e4 100644 --- a/src/PowerPlatform/Dataverse/core/auth.py +++ b/src/PowerPlatform/Dataverse/core/auth.py @@ -26,6 +26,7 @@ class TokenPair: :param access_token: The access token string. :type access_token: ``str`` """ + resource: str access_token: str @@ -41,9 +42,7 @@ class AuthManager: def __init__(self, credential: TokenCredential) -> None: if not isinstance(credential, TokenCredential): - raise TypeError( - "credential must implement azure.core.credentials.TokenCredential." - ) + raise TypeError("credential must implement azure.core.credentials.TokenCredential.") self.credential: TokenCredential = credential def acquire_token(self, scope: str) -> TokenPair: diff --git a/src/PowerPlatform/Dataverse/core/config.py b/src/PowerPlatform/Dataverse/core/config.py index acef530..02785e6 100644 --- a/src/PowerPlatform/Dataverse/core/config.py +++ b/src/PowerPlatform/Dataverse/core/config.py @@ -29,6 +29,7 @@ class DataverseConfig: :param http_timeout: Optional request timeout in seconds. Reserved for future use. 
:type http_timeout: ``float`` | ``None`` """ + language_code: int = 1033 # Optional HTTP tuning (not yet wired everywhere; reserved for future use) diff --git a/src/PowerPlatform/Dataverse/core/error_codes.py b/src/PowerPlatform/Dataverse/core/error_codes.py index 0ff36d3..23a99a7 100644 --- a/src/PowerPlatform/Dataverse/core/error_codes.py +++ b/src/PowerPlatform/Dataverse/core/error_codes.py @@ -76,10 +76,11 @@ TRANSIENT_STATUS = {429, 502, 503, 504} + def http_subcode(status: int) -> str: """ Convert HTTP status code to error subcode string. - + :param status: HTTP status code (e.g., 400, 404, 500). :type status: ``int`` :return: Error subcode string (e.g., "http_400", "http_404"). @@ -87,13 +88,14 @@ def http_subcode(status: int) -> str: """ return HTTP_STATUS_TO_SUBCODE.get(status, f"http_{status}") + def is_transient_status(status: int) -> bool: """ Check if an HTTP status code indicates a transient error that may succeed on retry. - + Transient status codes include: 429 (Too Many Requests), 502 (Bad Gateway), 503 (Service Unavailable), and 504 (Gateway Timeout). - + :param status: HTTP status code to check. :type status: ``int`` :return: True if the status code is considered transient. diff --git a/src/PowerPlatform/Dataverse/core/errors.py b/src/PowerPlatform/Dataverse/core/errors.py index ef810a8..9fa2d0c 100644 --- a/src/PowerPlatform/Dataverse/core/errors.py +++ b/src/PowerPlatform/Dataverse/core/errors.py @@ -16,6 +16,7 @@ from typing import Any, Dict, Optional import datetime as _dt + class DataverseError(Exception): """ Base structured exception for the Dataverse SDK. @@ -35,6 +36,7 @@ class DataverseError(Exception): :param is_transient: Whether the error is potentially transient and may succeed on retry. :type is_transient: ``bool`` """ + def __init__( self, message: str, @@ -53,7 +55,7 @@ def __init__( self.details = details or {} self.source = source or "client" self.is_transient = is_transient - self.timestamp = _dt.datetime.now(_dt.timezone.utc).isoformat().replace('+00:00', 'Z') + self.timestamp = _dt.datetime.now(_dt.timezone.utc).isoformat().replace("+00:00", "Z") def to_dict(self) -> Dict[str, Any]: """ @@ -76,6 +78,7 @@ def to_dict(self) -> Dict[str, Any]: def __repr__(self) -> str: # pragma: no cover return f"{self.__class__.__name__}(code={self.code!r}, subcode={self.subcode!r}, message={self.message!r})" + class ValidationError(DataverseError): """ Exception raised for client-side validation failures. @@ -87,9 +90,11 @@ class ValidationError(DataverseError): :param details: Optional dictionary with additional validation context. :type details: ``dict`` | ``None`` """ + def __init__(self, message: str, *, subcode: Optional[str] = None, details: Optional[Dict[str, Any]] = None): super().__init__(message, code="validation_error", subcode=subcode, details=details, source="client") + class MetadataError(DataverseError): """ Exception raised for metadata operation failures. @@ -101,9 +106,11 @@ class MetadataError(DataverseError): :param details: Optional dictionary with additional metadata context. :type details: ``dict`` | ``None`` """ + def __init__(self, message: str, *, subcode: Optional[str] = None, details: Optional[Dict[str, Any]] = None): super().__init__(message, code="metadata_error", subcode=subcode, details=details, source="client") + class SQLParseError(DataverseError): """ Exception raised for SQL query parsing failures. 
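
Taken together, these classes are designed to be caught and inspected rather than string-matched. A minimal consumption sketch, assuming a `client` and `record_id` are already in scope (only members defined in this module are used):

    from PowerPlatform.Dataverse.core.errors import HttpError

    try:
        client.get("account", record_id)
    except HttpError as err:
        print(err.to_dict())      # code, subcode, message, details, source, ...
        if err.is_transient:      # e.g. 429/502/503/504 per core/error_codes.py
            pass                  # caller can retry with backoff
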
@@ -115,9 +122,11 @@ class SQLParseError(DataverseError): :param details: Optional dictionary with SQL query context and parse information. :type details: ``dict`` | ``None`` """ + def __init__(self, message: str, *, subcode: Optional[str] = None, details: Optional[Dict[str, Any]] = None): super().__init__(message, code="sql_parse_error", subcode=subcode, details=details, source="client") + class HttpError(DataverseError): """ Exception raised for HTTP request failures from the Dataverse Web API. @@ -145,6 +154,7 @@ class HttpError(DataverseError): :param details: Optional additional diagnostic details. :type details: ``dict`` | ``None`` """ + def __init__( self, message: str, @@ -157,7 +167,7 @@ def __init__( traceparent: Optional[str] = None, body_excerpt: Optional[str] = None, retry_after: Optional[int] = None, - details: Optional[Dict[str, Any]] = None + details: Optional[Dict[str, Any]] = None, ) -> None: d = details or {} if service_error_code is not None: @@ -182,4 +192,5 @@ def __init__( is_transient=is_transient, ) + __all__ = ["DataverseError", "HttpError", "ValidationError", "MetadataError", "SQLParseError"] diff --git a/src/PowerPlatform/Dataverse/core/http.py b/src/PowerPlatform/Dataverse/core/http.py index f19c068..057b135 100644 --- a/src/PowerPlatform/Dataverse/core/http.py +++ b/src/PowerPlatform/Dataverse/core/http.py @@ -20,10 +20,10 @@ class HttpClient: """ HTTP client with configurable retry logic and timeout handling. - + Provides automatic retry behavior for transient failures and default timeout management for different HTTP methods. - + :param retries: Maximum number of retry attempts for transient errors. Default is 5. :type retries: ``int`` | ``None`` :param backoff: Base delay in seconds between retry attempts. Default is 0.5. @@ -31,7 +31,7 @@ class HttpClient: :param timeout: Default request timeout in seconds. If None, uses per-method defaults. :type timeout: ``float`` | ``None`` """ - + def __init__( self, retries: Optional[int] = None, @@ -45,10 +45,10 @@ def __init__( def request(self, method: str, url: str, **kwargs: Any) -> requests.Response: """ Execute an HTTP request with automatic retry logic and timeout management. - + Applies default timeouts based on HTTP method (120s for POST/DELETE, 10s for others) and retries on network errors with exponential backoff. - + :param method: HTTP method (GET, POST, PUT, DELETE, etc.). :type method: ``str`` :param url: Target URL for the request. @@ -74,6 +74,6 @@ def request(self, method: str, url: str, **kwargs: Any) -> requests.Response: except requests.exceptions.RequestException: if attempt == self.max_attempts - 1: raise - delay = self.base_delay * (2 ** attempt) + delay = self.base_delay * (2**attempt) time.sleep(delay) continue diff --git a/src/PowerPlatform/Dataverse/data/__init__.py b/src/PowerPlatform/Dataverse/data/__init__.py index 86a3659..4a84de9 100644 --- a/src/PowerPlatform/Dataverse/data/__init__.py +++ b/src/PowerPlatform/Dataverse/data/__init__.py @@ -8,4 +8,4 @@ SQL query functionality, and file upload capabilities. 
""" -__all__ = [] \ No newline at end of file +__all__ = [] diff --git a/src/PowerPlatform/Dataverse/data/odata.py b/src/PowerPlatform/Dataverse/data/odata.py index 385497e..aafb85d 100644 --- a/src/PowerPlatform/Dataverse/data/odata.py +++ b/src/PowerPlatform/Dataverse/data/odata.py @@ -18,7 +18,7 @@ from .upload import ODataFileUpload from ..core.errors import * from ..core.error_codes import ( - http_subcode, + http_subcode, is_transient_status, VALIDATION_SQL_NOT_STRING, VALIDATION_SQL_EMPTY, @@ -53,7 +53,7 @@ def _normalize_cache_key(table_schema_name: str) -> str: @staticmethod def _lowercase_keys(record: Dict[str, Any]) -> Dict[str, Any]: """Convert all dictionary keys to lowercase for case-insensitive column names. - + Dataverse LogicalNames for attributes are stored lowercase, but users may provide PascalCase names (matching SchemaName). This normalizes the input. """ @@ -64,7 +64,7 @@ def _lowercase_keys(record: Dict[str, Any]) -> Dict[str, Any]: @staticmethod def _lowercase_list(items: Optional[List[str]]) -> Optional[List[str]]: """Convert all strings in a list to lowercase for case-insensitive column names. - + Used for $select, $orderby, $expand parameters where column names must be lowercase. """ if not items: @@ -94,7 +94,12 @@ def __init__( if not self.base_url: raise ValueError("base_url is required.") self.api = f"{self.base_url}/api/data/v9.2" - self.config = config or __import__("PowerPlatform.Dataverse.core.config", fromlist=["DataverseConfig"]).DataverseConfig.from_env() + self.config = ( + config + or __import__( + "PowerPlatform.Dataverse.core.config", fromlist=["DataverseConfig"] + ).DataverseConfig.from_env() + ) self._http = HttpClient( retries=self.config.http_retries, backoff=self.config.http_backoff, @@ -160,7 +165,9 @@ def _request(self, method: str, url: str, *, expected: tuple[int, ...] = (200, 2 sc = r.status_code subcode = http_subcode(sc) correlation_id = headers.get("x-ms-correlation-request-id") or headers.get("x-ms-correlation-id") - request_id = headers.get("x-ms-client-request-id") or headers.get("request-id") or headers.get("x-ms-request-id") + request_id = ( + headers.get("x-ms-client-request-id") or headers.get("request-id") or headers.get("x-ms-request-id") + ) traceparent = headers.get("traceparent") ra = headers.get("Retry-After") retry_after = None @@ -299,7 +306,9 @@ def _primary_id_attr(self, table_schema_name: str) -> str: f"PrimaryIdAttribute not resolved for table_schema_name '{table_schema_name}'. Metadata did not include PrimaryIdAttribute." ) - def _update_by_ids(self, table_schema_name: str, ids: List[str], changes: Union[Dict[str, Any], List[Dict[str, Any]]]) -> None: + def _update_by_ids( + self, table_schema_name: str, ids: List[str], changes: Union[Dict[str, Any], List[Dict[str, Any]]] + ) -> None: """Update many records by GUID list using the collection-bound ``UpdateMultiple`` action. :param table_schema_name: Schema name of the table. 
@@ -412,9 +421,11 @@ def _format_key(self, key: str) -> str: return k # Escape single quotes in alternate key values if "=" in k and "'" in k: + def esc(match): # match.group(1) is the key, match.group(2) is the value return f"{match.group(1)}='{self._escape_odata_quotes(match.group(2))}'" + k = re.sub(r"(\w+)=\'([^\']*)\'", esc, k) return f"({k})" if len(k) == 36 and "-" in k: @@ -670,7 +681,7 @@ def _entity_set_from_schema_name(self, table_schema_name: str) -> str: """ if not table_schema_name: raise ValueError("table schema name required") - + # Use normalized (lowercase) key for cache lookup cache_key = self._normalize_cache_key(table_schema_name) cached = self._logical_to_entityset_cache.get(cache_key) @@ -691,7 +702,11 @@ def _entity_set_from_schema_name(self, table_schema_name: str) -> str: except ValueError: items = [] if not items: - plural_hint = " (did you pass a plural entity set name instead of the singular table schema name?)" if table_schema_name.endswith("s") and not table_schema_name.endswith("ss") else "" + plural_hint = ( + " (did you pass a plural entity set name instead of the singular table schema name?)" + if table_schema_name.endswith("s") and not table_schema_name.endswith("ss") + else "" + ) raise MetadataError( f"Unable to resolve entity set for table schema name '{table_schema_name}'. Provide the singular table schema name.{plural_hint}", subcode=METADATA_ENTITYSET_NOT_FOUND, @@ -733,7 +748,7 @@ def _get_entity_by_table_schema_name( headers: Optional[Dict[str, str]] = None, ) -> Optional[Dict[str, Any]]: """Get entity metadata by table schema name. Case-insensitive. - + Note: LogicalName is stored lowercase in Dataverse, so we lowercase the input for case-insensitive matching. The response includes SchemaName, LogicalName, EntitySetName, and MetadataId. @@ -783,9 +798,7 @@ def _create_entity( f"Failed to create or retrieve entity '{table_schema_name}' (EntitySetName not available)." ) if not ent.get("MetadataId"): - raise RuntimeError( - f"MetadataId missing after creating entity '{table_schema_name}'." - ) + raise RuntimeError(f"MetadataId missing after creating entity '{table_schema_name}'.") return ent def _get_attribute_metadata( @@ -834,11 +847,13 @@ def _build_localizedlabels_payload(self, translations: Dict[int, str]) -> Dict[s raise ValueError(f"Language code '{lang}' must be int") if not isinstance(text, str) or not text.strip(): raise ValueError(f"Label for lang {lang} must be non-empty string") - locs.append({ - "@odata.type": "Microsoft.Dynamics.CRM.LocalizedLabel", - "Label": text, - "LanguageCode": lang, - }) + locs.append( + { + "@odata.type": "Microsoft.Dynamics.CRM.LocalizedLabel", + "Label": text, + "LanguageCode": lang, + } + ) if not locs: raise ValueError("At least one translation required") return { @@ -846,7 +861,9 @@ def _build_localizedlabels_payload(self, translations: Dict[int, str]) -> Dict[s "LocalizedLabels": locs, } - def _enum_optionset_payload(self, column_schema_name: str, enum_cls: type[Enum], is_primary_name: bool = False) -> Dict[str, Any]: + def _enum_optionset_payload( + self, column_schema_name: str, enum_cls: type[Enum], is_primary_name: bool = False + ) -> Dict[str, Any]: """Create local (IsGlobal=False) PicklistAttributeMetadata from an Enum subclass. 
Supports translation mapping via optional class attribute `__labels__`: @@ -915,11 +932,13 @@ def _enum_optionset_payload(self, column_schema_name: str, enum_cls: type[Enum], for lang in all_langs: label_text = labels_by_lang.get(lang, {}).get(m.name, m.name) per_lang[lang] = label_text - options.append({ - "@odata.type": "Microsoft.Dynamics.CRM.OptionMetadata", - "Value": m.value, - "Label": self._build_localizedlabels_payload(per_lang), - }) + options.append( + { + "@odata.type": "Microsoft.Dynamics.CRM.OptionMetadata", + "Value": m.value, + "Label": self._build_localizedlabels_payload(per_lang), + } + ) attr_label = column_schema_name.split("_")[-1] return { @@ -962,8 +981,8 @@ def _optionset_map(self, table_schema_name: str, attr_logical: str) -> Optional[ cache_key = (self._normalize_cache_key(table_schema_name), self._normalize_cache_key(attr_logical)) now = time.time() entry = self._picklist_label_cache.get(cache_key) - if isinstance(entry, dict) and 'map' in entry and (now - entry.get('ts', 0)) < self._picklist_cache_ttl_seconds: - return entry['map'] + if isinstance(entry, dict) and "map" in entry and (now - entry.get("ts", 0)) < self._picklist_cache_ttl_seconds: + return entry["map"] # LogicalNames in Dataverse are stored in lowercase, so we need to lowercase for filters attr_esc = self._escape_odata_quotes(attr_logical.lower()) @@ -984,7 +1003,7 @@ def _optionset_map(self, table_schema_name: str, attr_logical: str) -> Optional[ if getattr(err, "status_code", None) == 404: if attempt < 2: # Exponential-ish backoff: 0.4s, 0.8s - time.sleep(0.4 * (2 ** attempt)) + time.sleep(0.4 * (2**attempt)) continue raise RuntimeError( f"Picklist attribute metadata not found after retries: entity='{table_schema_name}' attribute='{attr_logical}' (404)" @@ -992,14 +1011,14 @@ def _optionset_map(self, table_schema_name: str, attr_logical: str) -> Optional[ raise if r_type is None: raise RuntimeError("Failed to retrieve attribute metadata due to repeated request failures.") - + body_type = r_type.json() items = body_type.get("value", []) if isinstance(body_type, dict) else [] if not items: return None attr_md = items[0] if attr_md.get("AttributeType") not in ("Picklist", "PickList"): - self._picklist_label_cache[cache_key] = {'map': {}, 'ts': now} + self._picklist_label_cache[cache_key] = {"map": {}, "ts": now} return {} # Step 2: fetch with expand only now that we know it's a picklist @@ -1017,7 +1036,7 @@ def _optionset_map(self, table_schema_name: str, attr_logical: str) -> Optional[ except HttpError as err: if getattr(err, "status_code", None) == 404: if attempt < 2: - time.sleep(0.4 * (2 ** attempt)) # 0.4s, 0.8s + time.sleep(0.4 * (2**attempt)) # 0.4s, 0.8s continue raise RuntimeError( f"Picklist OptionSet metadata not found after retries: entity='{table_schema_name}' attribute='{attr_logical}' (404)" @@ -1025,7 +1044,7 @@ def _optionset_map(self, table_schema_name: str, attr_logical: str) -> Optional[ raise if r_opts is None: raise RuntimeError("Failed to retrieve picklist OptionSet metadata due to repeated request failures.") - + attr_full = {} try: attr_full = r_opts.json() if r_opts.text else {} @@ -1052,10 +1071,10 @@ def _optionset_map(self, table_schema_name: str, attr_logical: str) -> Optional[ normalized = self._normalize_picklist_label(lab) mapping.setdefault(normalized, val) if mapping: - self._picklist_label_cache[cache_key] = {'map': mapping, 'ts': now} + self._picklist_label_cache[cache_key] = {"map": mapping, "ts": now} return mapping # No options available - 
self._picklist_label_cache[cache_key] = {'map': {}, 'ts': now} + self._picklist_label_cache[cache_key] = {"map": {}, "ts": now} return {} def _convert_labels_to_ints(self, table_schema_name: str, record: Dict[str, Any]) -> Dict[str, Any]: @@ -1077,12 +1096,16 @@ def _convert_labels_to_ints(self, table_schema_name: str, record: Dict[str, Any] out[k] = val return out - def _attribute_payload(self, column_schema_name: str, dtype: Any, *, is_primary_name: bool = False) -> Optional[Dict[str, Any]]: + def _attribute_payload( + self, column_schema_name: str, dtype: Any, *, is_primary_name: bool = False + ) -> Optional[Dict[str, Any]]: # Enum-based local option set support if isinstance(dtype, type) and issubclass(dtype, Enum): return self._enum_optionset_payload(column_schema_name, dtype, is_primary_name=is_primary_name) if not isinstance(dtype, str): - raise ValueError(f"Unsupported column spec type for '{column_schema_name}': {type(dtype)} (expected str or Enum subclass)") + raise ValueError( + f"Unsupported column spec type for '{column_schema_name}': {type(dtype)} (expected str or Enum subclass)" + ) dtype_l = dtype.lower().strip() label = column_schema_name.split("_")[-1] if dtype_l in ("string", "text"): @@ -1174,7 +1197,7 @@ def _get_table_info(self, table_schema_name: str) -> Optional[Dict[str, Any]]: "metadata_id": ent.get("MetadataId"), "columns_created": [], } - + def _list_tables(self) -> List[Dict[str, Any]]: """List all non-private tables (``IsPrivate eq false``). @@ -1184,9 +1207,7 @@ def _list_tables(self) -> List[Dict[str, Any]]: :raises HttpError: If the metadata request fails. """ url = f"{self.api}/EntityDefinitions" - params = { - "$filter": "IsPrivate eq false" - } + params = {"$filter": "IsPrivate eq false"} r = self._request("get", url, params=params) return r.json().get("value", []) @@ -1253,7 +1274,9 @@ def _create_table( if primary_column_schema_name: primary_attr_schema = primary_column_schema_name else: - primary_attr_schema = f"{table_schema_name.split('_',1)[0]}_Name" if "_" in table_schema_name else "new_Name" + primary_attr_schema = ( + f"{table_schema_name.split('_',1)[0]}_Name" if "_" in table_schema_name else "new_Name" + ) attributes: List[Dict[str, Any]] = [] attributes.append(self._attribute_payload(primary_attr_schema, "string", is_primary_name=True)) @@ -1307,7 +1330,7 @@ def _create_columns( """ if not isinstance(columns, dict) or not columns: raise TypeError("columns must be a non-empty dict[name -> type]") - + ent = self._get_entity_by_table_schema_name(table_schema_name) if not ent or not ent.get("MetadataId"): raise MetadataError( @@ -1392,9 +1415,7 @@ def _delete_columns( attr_metadata_id = attr_meta.get("MetadataId") if not attr_metadata_id: - raise RuntimeError( - f"Metadata incomplete for column '{column_name}' (missing MetadataId)." 
- ) + raise RuntimeError(f"Metadata incomplete for column '{column_name}' (missing MetadataId).") attr_url = f"{self.api}/EntityDefinitions({metadata_id})/Attributes({attr_metadata_id})" self._request("delete", attr_url, headers={"If-Match": "*"}) @@ -1411,7 +1432,7 @@ def _delete_columns( self._flush_cache("picklist") return deleted - + # ---------------------- Cache maintenance ------------------------- def _flush_cache( self, @@ -1434,4 +1455,4 @@ def _flush_cache( removed = len(self._picklist_label_cache) self._picklist_label_cache.clear() - return removed \ No newline at end of file + return removed diff --git a/src/PowerPlatform/Dataverse/data/upload.py b/src/PowerPlatform/Dataverse/data/upload.py index 874706a..9458164 100644 --- a/src/PowerPlatform/Dataverse/data/upload.py +++ b/src/PowerPlatform/Dataverse/data/upload.py @@ -52,13 +52,11 @@ def upload_file( if mode == "small": return self._upload_file_small( - entity_set, record_id, file_name_attribute, path, - content_type=mime_type, if_none_match=if_none_match + entity_set, record_id, file_name_attribute, path, content_type=mime_type, if_none_match=if_none_match ) if mode == "chunk": return self._upload_file_chunk( - entity_set, record_id, file_name_attribute, path, - if_none_match=if_none_match + entity_set, record_id, file_name_attribute, path, if_none_match=if_none_match ) raise ValueError(f"Invalid mode '{mode}'. Use 'auto', 'small', or 'chunk'.") @@ -73,6 +71,7 @@ def _upload_file_small( ) -> None: """Upload a file (<128MB) via single PATCH.""" import os + if not record_id: raise ValueError("record_id required") if not os.path.isfile(path): @@ -130,6 +129,7 @@ def _upload_file_chunk( """ import os, math from urllib.parse import quote + if not record_id: raise ValueError("record_id required") if not os.path.isfile(path): diff --git a/src/PowerPlatform/Dataverse/extensions/__init__.py b/src/PowerPlatform/Dataverse/extensions/__init__.py index 74eb25a..41bf8bc 100644 --- a/src/PowerPlatform/Dataverse/extensions/__init__.py +++ b/src/PowerPlatform/Dataverse/extensions/__init__.py @@ -6,4 +6,4 @@ """ # Will be populated with extensions as they are created -__all__ = [] \ No newline at end of file +__all__ = [] diff --git a/src/PowerPlatform/Dataverse/models/__init__.py b/src/PowerPlatform/Dataverse/models/__init__.py index 95a12a8..396cd4e 100644 --- a/src/PowerPlatform/Dataverse/models/__init__.py +++ b/src/PowerPlatform/Dataverse/models/__init__.py @@ -6,4 +6,4 @@ """ # Will be populated with models as they are created -__all__ = [] \ No newline at end of file +__all__ = [] diff --git a/src/PowerPlatform/Dataverse/utils/__init__.py b/src/PowerPlatform/Dataverse/utils/__init__.py index e08d110..812583f 100644 --- a/src/PowerPlatform/Dataverse/utils/__init__.py +++ b/src/PowerPlatform/Dataverse/utils/__init__.py @@ -7,4 +7,4 @@ Placeholder module for future utility adapters. """ -__all__ = [] \ No newline at end of file +__all__ = [] diff --git a/src/PowerPlatform/__init__.py b/src/PowerPlatform/__init__.py index a52b03f..1a02353 100644 --- a/src/PowerPlatform/__init__.py +++ b/src/PowerPlatform/__init__.py @@ -1,4 +1,4 @@ # Copyright (c) Microsoft Corporation. # Licensed under the MIT license. 
-__path__ = __import__('pkgutil').extend_path(__path__, __name__) \ No newline at end of file +__path__ = __import__("pkgutil").extend_path(__path__, __name__) diff --git a/tests/__init__.py b/tests/__init__.py index cdfc326..24c2563 100644 --- a/tests/__init__.py +++ b/tests/__init__.py @@ -1,4 +1,4 @@ # Copyright (c) Microsoft Corporation. # Licensed under the MIT license. -"""Test package for the Dataverse SDK.""" \ No newline at end of file +"""Test package for the Dataverse SDK.""" diff --git a/tests/conftest.py b/tests/conftest.py index dcb8000..b223af7 100644 --- a/tests/conftest.py +++ b/tests/conftest.py @@ -16,23 +16,21 @@ @pytest.fixture def dummy_auth(): """Mock authentication object for testing.""" + class DummyAuth: def acquire_token(self, scope): class Token: access_token = "test_token_12345" + return Token() + return DummyAuth() -@pytest.fixture +@pytest.fixture def test_config(): """Test configuration with safe defaults.""" - return DataverseConfig( - language_code=1033, - http_retries=0, - http_backoff=0.1, - http_timeout=5 - ) + return DataverseConfig(language_code=1033, http_retries=0, http_backoff=0.1, http_timeout=5) @pytest.fixture @@ -52,14 +50,10 @@ def sample_base_url(): @pytest.fixture def sample_entity_data(): """Sample entity data for testing.""" - return { - "name": "Test Account", - "telephone1": "555-0100", - "websiteurl": "https://example.com" - } + return {"name": "Test Account", "telephone1": "555-0100", "websiteurl": "https://example.com"} @pytest.fixture def sample_guid(): """Sample GUID for testing.""" - return "11111111-2222-3333-4444-555555555555" \ No newline at end of file + return "11111111-2222-3333-4444-555555555555" diff --git a/tests/fixtures/test_data.py b/tests/fixtures/test_data.py index 3e2fc73..20e2f17 100644 --- a/tests/fixtures/test_data.py +++ b/tests/fixtures/test_data.py @@ -13,16 +13,16 @@ "value": [ { "LogicalName": "account", - "EntitySetName": "accounts", + "EntitySetName": "accounts", "PrimaryIdAttribute": "accountid", - "DisplayName": {"UserLocalizedLabel": {"Label": "Account"}} + "DisplayName": {"UserLocalizedLabel": {"Label": "Account"}}, }, { "LogicalName": "contact", "EntitySetName": "contacts", - "PrimaryIdAttribute": "contactid", - "DisplayName": {"UserLocalizedLabel": {"Label": "Contact"}} - } + "PrimaryIdAttribute": "contactid", + "DisplayName": {"UserLocalizedLabel": {"Label": "Contact"}}, + }, ] } @@ -33,37 +33,22 @@ "accountid": "11111111-2222-3333-4444-555555555555", "name": "Contoso Ltd", "telephone1": "555-0100", - "websiteurl": "https://contoso.com" + "websiteurl": "https://contoso.com", }, { - "accountid": "22222222-3333-4444-5555-666666666666", + "accountid": "22222222-3333-4444-5555-666666666666", "name": "Fabrikam Inc", "telephone1": "555-0200", - "websiteurl": "https://fabrikam.com" - } + "websiteurl": "https://fabrikam.com", + }, ] } # Sample error responses SAMPLE_ERROR_RESPONSES = { - "404": { - "error": { - "code": "0x80040217", - "message": "The requested resource was not found." - } - }, - "429": { - "error": { - "code": "0x80072321", - "message": "Too many requests. Please retry after some time." - } - } + "404": {"error": {"code": "0x80040217", "message": "The requested resource was not found."}}, + "429": {"error": {"code": "0x80072321", "message": "Too many requests. 
Please retry after some time."}}, } # Sample SQL query results -SAMPLE_SQL_RESPONSE = { - "value": [ - {"name": "Account 1", "revenue": 1000000}, - {"name": "Account 2", "revenue": 2000000} - ] -} \ No newline at end of file +SAMPLE_SQL_RESPONSE = {"value": [{"name": "Account 1", "revenue": 1000000}, {"name": "Account 2", "revenue": 2000000}]} diff --git a/tests/unit/__init__.py b/tests/unit/__init__.py index bce68aa..dd1a3ed 100644 --- a/tests/unit/__init__.py +++ b/tests/unit/__init__.py @@ -1,4 +1,4 @@ # Copyright (c) Microsoft Corporation. # Licensed under the MIT license. -"""Unit tests for the Dataverse SDK.""" \ No newline at end of file +"""Unit tests for the Dataverse SDK.""" diff --git a/tests/unit/core/__init__.py b/tests/unit/core/__init__.py index b3b8cd6..deabc01 100644 --- a/tests/unit/core/__init__.py +++ b/tests/unit/core/__init__.py @@ -1,4 +1,4 @@ # Copyright (c) Microsoft Corporation. # Licensed under the MIT license. -"""Unit tests for core infrastructure components.""" \ No newline at end of file +"""Unit tests for core infrastructure components.""" diff --git a/tests/unit/core/test_http_errors.py b/tests/unit/core/test_http_errors.py index f3fd29f..622663a 100644 --- a/tests/unit/core/test_http_errors.py +++ b/tests/unit/core/test_http_errors.py @@ -6,47 +6,66 @@ from PowerPlatform.Dataverse.core.error_codes import HTTP_404, HTTP_429, HTTP_500 from PowerPlatform.Dataverse.data.odata import ODataClient + class DummyAuth: def acquire_token(self, scope): - class T: access_token = "x" + class T: + access_token = "x" + return T() + class DummyHTTP: def __init__(self, responses): self._responses = responses + def request(self, method, url, **kwargs): if not self._responses: raise AssertionError("No more responses") status, headers, body = self._responses.pop(0) + class R: pass + r = R() r.status_code = status r.headers = headers if isinstance(body, dict): import json + r.text = json.dumps(body) - def json_func(): return body + + def json_func(): + return body + r.json = json_func else: r.text = body or "" - def json_fail(): raise ValueError("non-json") + + def json_fail(): + raise ValueError("non-json") + r.json = json_fail return r + class MockClient(ODataClient): def __init__(self, responses): super().__init__(DummyAuth(), "https://org.example", None) self._http = DummyHTTP(responses) + # --- Tests --- + def test_http_404_subcode_and_service_code(): - responses = [( - 404, - {"x-ms-correlation-request-id": "cid1"}, - {"error": {"code": "0x800404", "message": "Not found"}}, - )] + responses = [ + ( + 404, + {"x-ms-correlation-request-id": "cid1"}, + {"error": {"code": "0x800404", "message": "Not found"}}, + ) + ] c = MockClient(responses) with pytest.raises(HttpError) as ei: c._request("get", c.api + "/accounts(abc)") @@ -56,11 +75,13 @@ def test_http_404_subcode_and_service_code(): def test_http_429_transient_and_retry_after(): - responses = [( - 429, - {"Retry-After": "7"}, - {"error": {"message": "Throttle"}}, - )] + responses = [ + ( + 429, + {"Retry-After": "7"}, + {"error": {"message": "Throttle"}}, + ) + ] c = MockClient(responses) with pytest.raises(HttpError) as ei: c._request("get", c.api + "/accounts") @@ -71,11 +92,13 @@ def test_http_429_transient_and_retry_after(): def test_http_500_body_excerpt(): - responses = [( - 500, - {}, - "Internal failure XYZ stack truncated", - )] + responses = [ + ( + 500, + {}, + "Internal failure XYZ stack truncated", + ) + ] c = MockClient(responses) with pytest.raises(HttpError) as ei: c._request("get", c.api + "/accounts") @@ 
-85,11 +108,13 @@ def test_http_500_body_excerpt(): def test_http_non_mapped_status_code_subcode_fallback(): - responses = [( - 418, # I'm a teapot (not in map) - {}, - {"error": {"message": "Teapot"}}, - )] + responses = [ + ( + 418, # I'm a teapot (not in map) + {}, + {"error": {"message": "Teapot"}}, + ) + ] c = MockClient(responses) with pytest.raises(HttpError) as ei: c._request("get", c.api + "/accounts") diff --git a/tests/unit/data/__init__.py b/tests/unit/data/__init__.py index c0c3a00..30ad611 100644 --- a/tests/unit/data/__init__.py +++ b/tests/unit/data/__init__.py @@ -1,4 +1,4 @@ # Copyright (c) Microsoft Corporation. # Licensed under the MIT license. -"""Unit tests for data access components.""" \ No newline at end of file +"""Unit tests for data access components.""" diff --git a/tests/unit/data/test_enum_optionset_payload.py b/tests/unit/data/test_enum_optionset_payload.py index 09d0212..f548943 100644 --- a/tests/unit/data/test_enum_optionset_payload.py +++ b/tests/unit/data/test_enum_optionset_payload.py @@ -6,12 +6,15 @@ from PowerPlatform.Dataverse.data.odata import ODataClient + class DummyAuth: def acquire_token(self, scope): # pragma: no cover - simple stub class T: access_token = "token" + return T() + class DummyConfig: """Minimal config stub providing attributes ODataClient.__init__ expects.""" @@ -22,14 +25,17 @@ def __init__(self, language_code=1033): self.http_backoff = 0 self.http_timeout = 5 + def _make_client(lang=1033): return ODataClient(DummyAuth(), "https://org.example", DummyConfig(language_code=lang)) + def _labels_for(option): label = option.get("Label") or {} locs = label.get("LocalizedLabels") or [] return {l.get("LanguageCode"): l.get("Label") for l in locs if isinstance(l, dict)} + def test_enum_basic_no_labels_uses_member_names(): class Basic(IntEnum): One = 10 @@ -46,6 +52,7 @@ class Basic(IntEnum): assert 1033 in labels assert labels[1033] in ("One", "Two") + def test_enum_with_multilanguage_labels_includes_all(): class ML(IntEnum): Active = 1 @@ -73,6 +80,7 @@ class ML(IntEnum): got_map = value_to_labels[val] assert got_map == exp_map, f"Labels mismatch for value {val}: expected {exp_map}, got {got_map}" + def test_missing_translation_falls_back_to_member_name(): class PartiallyTranslated(IntEnum): Alpha = 1 @@ -93,6 +101,7 @@ class PartiallyTranslated(IntEnum): assert alpha_labels[1033] == "Alpha" assert beta_labels[1033] == "Beta" + def test_labels_accept_member_objects_and_names(): class Mixed(IntEnum): A = 1 @@ -108,32 +117,40 @@ class Mixed(IntEnum): assert labels_map[1][1033] == "LetterA" assert labels_map[2][1033] == "LetterB" + def test_is_primary_name_flag_propagates(): class PN(IntEnum): X = 1 + c = _make_client() payload = c._enum_optionset_payload("new_Status", PN, is_primary_name=True) assert payload["IsPrimaryName"] is True + def test_duplicate_enum_values_raise(): class Dup(IntEnum): A = 1 B = 1 + c = _make_client() with pytest.raises(ValueError): c._enum_optionset_payload("new_Status", Dup) + def test_non_int_enum_values_raise(): class Bad(Enum): A = "x" + c = _make_client() with pytest.raises(ValueError): c._enum_optionset_payload("new_Status", Bad) + def test_enum_labels_not_dict_raises(): class BadLabels(IntEnum): A = 1 __labels__ = ["not", "a", "dict"] + c = _make_client() with pytest.raises(ValueError): c._enum_optionset_payload("new_BadLabels", BadLabels) @@ -145,6 +162,7 @@ class BadLangKey(IntEnum): __labels__ = { "en": {"A": "Alpha"}, } + c = _make_client() with pytest.raises(ValueError): 
c._enum_optionset_payload("new_BadLangKey", BadLangKey) @@ -156,6 +174,7 @@ class BadMapping(IntEnum): __labels__ = { 1033: ["A", "Alpha"], } + c = _make_client() with pytest.raises(ValueError): c._enum_optionset_payload("new_BadMapping", BadMapping) @@ -167,6 +186,7 @@ class EmptyLabel(IntEnum): __labels__ = { 1033: {"A": " "}, } + c = _make_client() with pytest.raises(ValueError): - c._enum_optionset_payload("new_EmptyLabel", EmptyLabel) \ No newline at end of file + c._enum_optionset_payload("new_EmptyLabel", EmptyLabel) diff --git a/tests/unit/data/test_logical_crud.py b/tests/unit/data/test_logical_crud.py index 1bc0b72..5a8cb70 100644 --- a/tests/unit/data/test_logical_crud.py +++ b/tests/unit/data/test_logical_crud.py @@ -6,15 +6,20 @@ from PowerPlatform.Dataverse.data.odata import ODataClient from PowerPlatform.Dataverse.core.errors import MetadataError + class DummyAuth: def acquire_token(self, scope): - class T: access_token = "x" + class T: + access_token = "x" + return T() + class DummyHTTPClient: def __init__(self, responses): self._responses = responses self.calls = [] + def request(self, method, url, **kwargs): self.calls.append((method, url, kwargs)) if not self._responses: @@ -24,44 +29,39 @@ def request(self, method, url, **kwargs): resp.status_code = status resp.headers = headers resp.text = "" if body is None else ("{}" if isinstance(body, dict) else str(body)) + def raise_for_status(): if status >= 400: raise RuntimeError(f"HTTP {status}") return None + def json_func(): return body if isinstance(body, dict) else {} + resp.raise_for_status = raise_for_status resp.json = json_func return resp + class MockableClient(ODataClient): def __init__(self, responses): super().__init__(DummyAuth(), "https://org.example", None) self._http = DummyHTTPClient(responses) + def _convert_labels_to_ints(self, table_schema_name, record): # pragma: no cover - test shim return record + # Helper metadata response for logical name resolution -MD_ACCOUNT = { - "value": [ - { - "LogicalName": "account", - "EntitySetName": "accounts", - "PrimaryIdAttribute": "accountid" - } - ] -} +MD_ACCOUNT = {"value": [{"LogicalName": "account", "EntitySetName": "accounts", "PrimaryIdAttribute": "accountid"}]} MD_SAMPLE = { "value": [ - { - "LogicalName": "new_sampleitem", - "EntitySetName": "new_sampleitems", - "PrimaryIdAttribute": "new_sampleitemid" - } + {"LogicalName": "new_sampleitem", "EntitySetName": "new_sampleitems", "PrimaryIdAttribute": "new_sampleitemid"} ] } + def make_entity_create_headers(entity_set, guid): return {"OData-EntityId": f"https://org.example/api/data/v9.2/{entity_set}({guid})"} @@ -108,7 +108,11 @@ def test_get_multiple_paging(): # metadata, first page, second page responses = [ (200, {}, MD_ACCOUNT), - (200, {}, {"value": [{"accountid": "1"}], "@odata.nextLink": "https://org.example/api/data/v9.2/accounts?$skip=1"}), + ( + 200, + {}, + {"value": [{"accountid": "1"}], "@odata.nextLink": "https://org.example/api/data/v9.2/accounts?$skip=1"}, + ), (200, {}, {"value": [{"accountid": "2"}]}), ] c = MockableClient(responses) @@ -122,4 +126,4 @@ def test_unknown_table_schema_name_raises(): ] c = MockableClient(responses) with pytest.raises(MetadataError): - c._entity_set_from_schema_name("nonexistent") \ No newline at end of file + c._entity_set_from_schema_name("nonexistent") diff --git a/tests/unit/data/test_sql_parse.py b/tests/unit/data/test_sql_parse.py index 1b3fb7d..c0feaa5 100644 --- a/tests/unit/data/test_sql_parse.py +++ b/tests/unit/data/test_sql_parse.py @@ -4,36 +4,46 @@ 
import pytest from PowerPlatform.Dataverse.data.odata import ODataClient + class DummyAuth: def acquire_token(self, scope): - class T: access_token = "x" # no real token needed for parsing tests + class T: + access_token = "x" # no real token needed for parsing tests + return T() + def _client(): return ODataClient(DummyAuth(), "https://org.example", None) + def test_basic_from(): c = _client() assert c._extract_logical_table("SELECT a FROM account") == "account" + def test_underscore_name(): c = _client() assert c._extract_logical_table("select x FROM new_sampleitem where x=1") == "new_sampleitem" + def test_startfrom_identifier(): c = _client() # Ensure we pick the real table 'case', not 'from' portion inside 'startfrom' assert c._extract_logical_table("SELECT col, startfrom FROM case") == "case" + def test_case_insensitive_keyword(): c = _client() assert c._extract_logical_table("SeLeCt 1 FrOm ACCOUNT") == "account" + def test_missing_from_raises(): c = _client() with pytest.raises(ValueError): c._extract_logical_table("SELECT 1") + def test_from_as_value_not_table(): c = _client() # Table should still be 'incident'; word 'from' earlier shouldn't interfere From 49ff69fff28ea3504ece6efb2f763b3920cd201e Mon Sep 17 00:00:00 2001 From: Suyash Kshirsagar Date: Mon, 17 Nov 2025 11:15:37 -0800 Subject: [PATCH 4/4] ci: enforce Black formatting as required check - Remove continue-on-error flag from Black formatting step - Black formatting is now a blocking gate for PRs - All code has been formatted to pass Black checks --- .github/workflows/python-package.yml | 1 - 1 file changed, 1 deletion(-) diff --git a/.github/workflows/python-package.yml b/.github/workflows/python-package.yml index 57fd173..8d83040 100644 --- a/.github/workflows/python-package.yml +++ b/.github/workflows/python-package.yml @@ -31,7 +31,6 @@ jobs: python -m pip install -e .[dev] - name: Check format with black - continue-on-error: true # TODO: fix detected formatting errors and remove this line. run: | black src tests --check
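
Note on the new gate: with `continue-on-error` removed, a non-zero exit from `black --check` now fails the job outright instead of being reported and waved through. The check can be reproduced locally before pushing; a minimal sketch, assuming the same `[dev]` extra the workflow installs:

    # install dev dependencies, then run the identical check the CI job runs
    python -m pip install -e .[dev]
    black src tests --check    # exits 1 if any file would be reformatted
    black src tests            # rewrites files in place to satisfy the gate

Because `black --check` only reports drift and never modifies files, it is the safe form for CI; the in-place form is the local fix.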